Sample records for the continuum code TEMPEST

  1. Implementation of an anomalous radial transport model for continuum kinetic edge codes

    NASA Astrophysics Data System (ADS)

    Bodi, K.; Krasheninnikov, S. I.; Cohen, R. H.; Rognlien, T. D.

    2007-11-01

    Radial plasma transport in magnetic fusion devices is often dominated by plasma turbulence compared to neoclassical collisional transport. Continuum kinetic edge codes [such as the (2d,2v) transport version of TEMPEST and also EGK] compute the collisional transport directly, but there is a need to model the anomalous transport from turbulence for long-time transport simulations. Such a model is presented and results are shown for its implementation in the TEMPEST gyrokinetic edge code. The model includes velocity-dependent convection and diffusion coefficients expressed as Hermite polynomials in velocity. The Hermite coefficients can be set, e.g., by specifying the ratio of particle and energy transport as in fluid transport codes. The anomalous transport terms preserve the property of no particle flux into unphysical regions of velocity space. TEMPEST simulations are presented showing the separate control of particle and energy anomalous transport, and comparisons are made with neoclassical transport also included.
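
    As an illustration of the kind of flux form such a model implies (assumed notation for this sketch; the abstract does not give the exact TEMPEST expressions), the anomalous radial flux of the distribution function f can be written with velocity-dependent coefficients expanded in Hermite polynomials:

    ```latex
    % Illustrative, assumed form of a velocity-dependent anomalous radial flux:
    % D(v) and V(v) are expanded in Hermite polynomials H_k of the normalized velocity.
    \Gamma_{\mathrm{an}}(\psi, v) \;=\; -\,D(v)\,\frac{\partial f}{\partial \psi} \;+\; V(v)\, f ,
    \qquad
    D(v) \;=\; \sum_{k} d_k\, H_k\!\left(\frac{v}{v_{t}}\right),
    \qquad
    V(v) \;=\; \sum_{k} c_k\, H_k\!\left(\frac{v}{v_{t}}\right).
    ```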

  2. Tempest simulations of kinetic GAM mode and neoclassical turbulence

    NASA Astrophysics Data System (ADS)

    Xu, X. Q.; Dimits, A. M.

    2007-11-01

    TEMPEST is a nonlinear five-dimensional (3d2v) gyrokinetic continuum code for studies of H-mode edge plasma neoclassical transport and turbulence in real divertor geometry. The 4D TEMPEST code correctly produces the frequency and collisionless damping of GAMs and zonal flow with fully nonlinear Boltzmann electrons in homogeneous plasmas. For large q (= 4 to 9), the TEMPEST simulations show that a series of resonances at higher harmonics, v∥ = ωG q R0/n with n = 4, becomes effective. The TEMPEST simulations also show that the GAM exists in the edge plasma pedestal for steep density and temperature gradients, and that an initial GAM relaxes to the standard neoclassical residual with neoclassical transport, rather than the Rosenbluth-Hinton residual, due to the presence of ion-ion collisions. The enhanced GAM damping explains experimental BES measurements on the edge q scaling of the GAM amplitude. Our 5D gyrokinetic code is built on the 4D TEMPEST neoclassical code, with extension to a fifth dimension in the toroidal direction and with 3D domain decomposition. Progress on performing 5D neoclassical turbulence simulations will be reported.
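
    For reference, the transit-resonance condition referred to above can be written in the standard form below (an assumption, since the formula in the record was garbled during extraction); here ωG is the GAM frequency, q the safety factor, and R0 the major radius:

    ```latex
    % Landau resonance of the GAM with the n-th transit harmonic (assumed standard form):
    v_{\parallel} \;=\; \frac{\omega_G\, q R_0}{n}, \qquad n = 1, 2, 3, \dots
    ```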

  3. Neoclassical simulation of tokamak plasmas using the continuum gyrokinetic code TEMPEST.

    PubMed

    Xu, X Q

    2008-07-01

    We present gyrokinetic neoclassical simulations of tokamak plasmas with a self-consistent electric field using a fully nonlinear (full-f) continuum code TEMPEST in a circular geometry. A set of gyrokinetic equations is discretized on a five-dimensional computational grid in phase space. The present implementation is a method of lines approach where the phase-space derivatives are discretized with finite differences, and implicit backward differencing formulas are used to advance the system in time. The fully nonlinear Boltzmann model is used for electrons. The neoclassical electric field is obtained by solving the gyrokinetic Poisson equation with self-consistent poloidal variation. With a four-dimensional (ψ, θ, ε, μ) version of the TEMPEST code, we compute the radial particle and heat fluxes, the geodesic-acoustic mode, and the development of the neoclassical electric field, which we compare with neoclassical theory using a Lorentz collision model. The present work provides a numerical scheme for self-consistently studying important dynamical aspects of neoclassical transport and electric field in toroidal magnetic fusion devices.
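
    A minimal method-of-lines sketch in the spirit of the time-advance strategy described above (an illustrative 1D advection-diffusion problem, not the actual TEMPEST gyrokinetic system): spatial derivatives are replaced by finite differences and the resulting ODE system is integrated with an implicit BDF method.

    ```python
    # Method-of-lines sketch (illustrative assumption: a 1D periodic advection-diffusion
    # equation stands in for the gyrokinetic system). Space is discretized with finite
    # differences; time is advanced with an implicit backward-differentiation (BDF) solver.
    import numpy as np
    from scipy.integrate import solve_ivp

    nx = 200
    length = 1.0
    dx = length / nx
    x = np.linspace(0.5 * dx, length - 0.5 * dx, nx)
    diff, vel = 1e-3, 1.0                      # hypothetical diffusion and advection coefficients

    def rhs(t, f):
        fp = np.roll(f, -1)                    # periodic neighbours
        fm = np.roll(f, +1)
        advection = -vel * (f - fm) / dx       # first-order upwind (vel > 0)
        diffusion = diff * (fp - 2.0 * f + fm) / dx**2
        return advection + diffusion

    f0 = np.exp(-((x - 0.5) / 0.05) ** 2)      # initial Gaussian pulse
    sol = solve_ivp(rhs, (0.0, 0.5), f0, method="BDF", rtol=1e-6, atol=1e-9)
    print("final integral:", sol.y[:, -1].sum() * dx)   # approximately conserved
    ```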

  4. Neoclassical simulation of tokamak plasmas using the continuum gyrokinetic code TEMPEST

    NASA Astrophysics Data System (ADS)

    Xu, X. Q.

    2008-07-01

    We present gyrokinetic neoclassical simulations of tokamak plasmas with a self-consistent electric field using a fully nonlinear (full-f) continuum code TEMPEST in a circular geometry. A set of gyrokinetic equations is discretized on a five-dimensional computational grid in phase space. The present implementation is a method of lines approach where the phase-space derivatives are discretized with finite differences, and implicit backward differencing formulas are used to advance the system in time. The fully nonlinear Boltzmann model is used for electrons. The neoclassical electric field is obtained by solving the gyrokinetic Poisson equation with self-consistent poloidal variation. With a four-dimensional (ψ, θ, ε, μ) version of the TEMPEST code, we compute the radial particle and heat fluxes, the geodesic-acoustic mode, and the development of the neoclassical electric field, which we compare with neoclassical theory using a Lorentz collision model. The present work provides a numerical scheme for self-consistently studying important dynamical aspects of neoclassical transport and electric field in toroidal magnetic fusion devices.

  5. Neoclassical Simulation of Tokamak Plasmas using Continuum Gyrokinetic Code TEMPEST

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, X Q

    We present gyrokinetic neoclassical simulations of tokamak plasmas with self-consistent electric field for the first time using a fully nonlinear (full-f) continuum code TEMPEST in a circular geometry. A set of gyrokinetic equations is discretized on a five-dimensional computational grid in phase space. The present implementation is a Method of Lines approach where the phase-space derivatives are discretized with finite differences and implicit backwards differencing formulas are used to advance the system in time. The fully nonlinear Boltzmann model is used for electrons. The neoclassical electric field is obtained by solving the gyrokinetic Poisson equation with self-consistent poloidal variation. With our 4D (ψ, θ, ε, μ) version of the TEMPEST code we compute the radial particle and heat fluxes, the Geodesic-Acoustic Mode (GAM), and the development of the neoclassical electric field, which we compare with neoclassical theory using a Lorentz collision model. The present work provides a numerical scheme and a new capability for self-consistently studying important aspects of neoclassical transport and rotations in toroidal magnetic fusion devices.

  6. 5D Tempest simulations of kinetic edge turbulence

    NASA Astrophysics Data System (ADS)

    Xu, X. Q.; Xiong, Z.; Cohen, B. I.; Cohen, R. H.; Dorr, M. R.; Hittinger, J. A.; Kerbel, G. D.; Nevins, W. M.; Rognlien, T. D.; Umansky, M. V.; Qin, H.

    2006-10-01

    Results are presented from the development and application of TEMPEST, a nonlinear five-dimensional (3d2v) gyrokinetic continuum code. The simulation results and theoretical analysis include studies of H-mode edge plasma neoclassical transport and turbulence in real divertor geometry and their relationship to plasma flow generation with zero external momentum input, including the important orbit-squeezing effect due to the large electric-field flow shear in the edge. In order to extend the code to 5D, we have formulated a set of fully nonlinear electrostatic gyrokinetic equations and a fully nonlinear gyrokinetic Poisson equation, which is valid for both neoclassical and turbulence simulations. Our 5D gyrokinetic code is built on the 4D version of the TEMPEST neoclassical code, with extension to a fifth dimension in the binormal direction. The code is able to simulate either a full torus or a toroidal segment. Progress on performing 5D turbulence simulations will be reported.

  7. Tempest Simulations of Collisionless Damping of the Geodesic-Acoustic Mode in Edge-Plasma Pedestals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, X. Q.; Xiong, Z.; Nevins, W. M.

    The fully nonlinear (full-f) four-dimensional TEMPEST gyrokinetic continuum code correctly produces the frequency and collisionless damping of geodesic-acoustic modes (GAMs) and zonal flow, with fully nonlinear Boltzmann electrons for the inverse aspect ratio ε scan and the tokamak safety factor q scan in homogeneous plasmas. TEMPEST simulations show that the GAMs exist in the edge pedestal for steep density and temperature gradients in the form of outgoing waves. The enhanced GAM damping may explain experimental beam emission spectroscopy measurements on the edge q scaling of the GAM amplitude.

  8. Tempest Simulations of Collisionless Damping of the Geodesic-Acoustic Mode in Edge-Plasma Pedestals

    NASA Astrophysics Data System (ADS)

    Xu, X. Q.; Xiong, Z.; Gao, Z.; Nevins, W. M.; McKee, G. R.

    2008-05-01

    The fully nonlinear (full-f) four-dimensional TEMPEST gyrokinetic continuum code correctly produces the frequency and collisionless damping of geodesic-acoustic modes (GAMs) and zonal flow, with fully nonlinear Boltzmann electrons for the inverse aspect ratio ε scan and the tokamak safety factor q scan in homogeneous plasmas. TEMPEST simulations show that the GAMs exist in the edge pedestal for steep density and temperature gradients in the form of outgoing waves. The enhanced GAM damping may explain experimental beam emission spectroscopy measurements on the edge q scaling of the GAM amplitude.

  9. TEMPEST simulations of collisionless damping of the geodesic-acoustic mode in edge-plasma pedestals.

    PubMed

    Xu, X Q; Xiong, Z; Gao, Z; Nevins, W M; McKee, G R

    2008-05-30

    The fully nonlinear (full-f) four-dimensional TEMPEST gyrokinetic continuum code correctly produces the frequency and collisionless damping of geodesic-acoustic modes (GAMs) and zonal flow, with fully nonlinear Boltzmann electrons for the inverse aspect ratio ε scan and the tokamak safety factor q scan in homogeneous plasmas. TEMPEST simulations show that the GAMs exist in the edge pedestal for steep density and temperature gradients in the form of outgoing waves. The enhanced GAM damping may explain experimental beam emission spectroscopy measurements on the edge q scaling of the GAM amplitude.

  10. TEMPEST Simulations of Collisionless Damping of Geodesic-Acoustic Mode in Edge Plasma Pedestal

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, X Q; Xiong, Z; Nevins, W M

    The fully nonlinear (full-f) 4D TEMPEST gyrokinetic continuum code produces the frequency and collisionless damping of the GAM and zonal flow with fully nonlinear Boltzmann electrons for the inverse aspect ratio ε-scan and the tokamak safety factor q-scan in homogeneous plasmas. The TEMPEST simulation shows that the GAM exists in the edge plasma pedestal for steep density and temperature gradients, and that an initial GAM relaxes to the standard neoclassical residual, rather than the Rosenbluth-Hinton residual, due to the presence of ion-ion collisions. The enhanced GAM damping explains experimental BES measurements on the edge q scaling of the GAM amplitude.

  11. TEMPEST Simulations of Collisionless Damping of Geodesic-Acoustic Mode in Edge Plasma Pedestal

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, X; Xiong, Z; Nevins, W

    The fully nonlinear 4D TEMPEST gyrokinetic continuum code produces the frequency and collisionless damping of the geodesic-acoustic mode (GAM) and zonal flow with fully nonlinear Boltzmann electrons for the inverse aspect ratio ε-scan and the tokamak safety factor q-scan in homogeneous plasmas. The TEMPEST simulation shows that the GAM exists in the edge plasma pedestal for steep density and temperature gradients, and that an initial GAM relaxes to the standard neoclassical residual, rather than the Rosenbluth-Hinton residual, due to the presence of ion-ion collisions. The enhanced GAM damping explains experimental BES measurements on the edge q scaling of the GAM amplitude.

  12. Collisional tests and an extension of the TEMPEST continuum gyrokinetic code

    NASA Astrophysics Data System (ADS)

    Cohen, R. H.; Dorr, M.; Hittinger, J.; Kerbel, G.; Nevins, W. M.; Rognlien, T.; Xiong, Z.; Xu, X. Q.

    2006-04-01

    An important requirement of a kinetic code for edge plasmas is the ability to accurately treat the effect of collisions over a broad range of collisionalities. To test the interaction of collisions and parallel streaming, TEMPEST has been compared with published analytic and numerical (Monte Carlo, bounce-averaged Fokker-Planck) results for endloss of particles confined by combined electrostatic and magnetic wells. Good agreement is found over a wide range of collisionality, confining potential and mirror ratio, and the required velocity-space resolution is modest. We also describe progress toward extension of (4-dimensional) TEMPEST into a "kinetic edge transport code" (a kinetic counterpart of UEDGE). The extension includes averaging of the gyrokinetic equations over fast timescales and approximating the averaged quadratic terms by diffusion terms which respect the boundaries of inaccessible regions in phase space. F. Najmabadi, R.W. Conn and R.H. Cohen, Nucl. Fusion 24, 75 (1984); T.D. Rognlien and T.A. Cutler, Nucl. Fusion 20, 1003 (1980).

  13. Tempest Neoclassical Simulation of Fusion Edge Plasmas

    NASA Astrophysics Data System (ADS)

    Xu, X. Q.; Xiong, Z.; Cohen, B. I.; Cohen, R. H.; Dorr, M.; Hittinger, J.; Kerbel, G. D.; Nevins, W. M.; Rognlien, T. D.

    2006-04-01

    We are developing a continuum gyrokinetic full-f code, TEMPEST, to simulate edge plasmas. The geometry is that of a fully diverted tokamak and so includes boundary conditions for both closed magnetic flux surfaces and open field lines. The code, presently 4-dimensional (2D2V), includes kinetic ions and electrons, a gyrokinetic Poisson solver for the electric field, and the nonlinear Fokker-Planck collision operator. Here we present the simulation results of neoclassical transport with Boltzmann electrons. In a large aspect ratio circular geometry, excellent agreement is found for the neoclassical equilibrium with parallel flows in the banana regime without a temperature gradient. In divertor geometry, it is found that the endloss of particles and energy induces pedestal-like density and temperature profiles inside the magnetic separatrix and parallel flow stronger than the neoclassical predictions in the SOL. The impact of the X-point divertor geometry on the self-consistent electric field and geodesic-acoustic oscillations will be reported. We will also discuss the status of extending TEMPEST into a 5-D code.

  14. Numerical Solution of the Gyrokinetic Poisson Equation in TEMPEST

    NASA Astrophysics Data System (ADS)

    Dorr, Milo; Cohen, Bruce; Cohen, Ronald; Dimits, Andris; Hittinger, Jeffrey; Kerbel, Gary; Nevins, William; Rognlien, Thomas; Umansky, Maxim; Xiong, Andrew; Xu, Xueqiao

    2006-10-01

    The gyrokinetic Poisson (GKP) model in the TEMPEST continuum gyrokinetic edge plasma code yields the electrostatic potential due to the charge density of electrons and an arbitrary number of ion species, including the effects of gyroaveraging in the limit kρ ≪ 1. The TEMPEST equations are integrated as a differential-algebraic system involving a nonlinear system solve via Newton-Krylov iteration. The GKP preconditioner block is inverted using a multigrid-preconditioned conjugate gradient (CG) algorithm. Electrons are treated as kinetic or adiabatic. The Boltzmann relation in the adiabatic option employs flux-surface averaging to maintain neutrality within field lines and is solved self-consistently with the GKP equation. A decomposition procedure circumvents the near singularity of the GKP Jacobian block that otherwise degrades CG convergence.
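
    A hedged sketch of the preconditioned-CG idea mentioned above: a 1D Poisson-type system is solved with SciPy's conjugate gradient, with a simple Jacobi (diagonal) preconditioner standing in for the multigrid preconditioner used in TEMPEST (illustrative assumption only; the matrix and boundary conditions are made up).

    ```python
    # Preconditioned conjugate-gradient sketch (assumed, illustrative problem):
    # a 1D Laplacian with Dirichlet boundaries plays the role of the field-solver block,
    # and a Jacobi preconditioner stands in for the multigrid preconditioner.
    import numpy as np
    import scipy.sparse as sp
    from scipy.sparse.linalg import cg, LinearOperator

    n = 500
    h = 1.0 / (n + 1)
    main = 2.0 * np.ones(n) / h**2
    off = -1.0 * np.ones(n - 1) / h**2
    A = sp.diags([off, main, off], [-1, 0, 1], format="csr")   # SPD 1D Laplacian
    b = np.ones(n)                                             # stand-in source (charge density)

    M = LinearOperator((n, n), matvec=lambda r: r / main)      # Jacobi (diagonal) preconditioner
    phi, info = cg(A, b, M=M)
    print("cg info:", info, "residual:", np.linalg.norm(A @ phi - b))
    ```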

  15. Verification of TEMPEST with neoclassical transport theory

    NASA Astrophysics Data System (ADS)

    Xiong, Z.; Cohen, B. I.; Cohen, R. H.; Dorr, M.; Hittinger, J.; Kerbel, G.; Nevins, W. M.; Rognlien, T.; Umansky, M.; Xu, X.

    2006-10-01

    TEMPEST is an edge gyro-kinetic continuum code developed to study boundary plasma transport over the region extending from the H-mode pedestal across the separatrix to the divertor plates. For benchmark purposes, we present results from the 4D (2r,2v) TEMPEST for both steady-state transport and time-dependent Geodesic Acoustic Modes (GAMs). We focus on an annular region inside the separatrix of a circular cross-section tokamak where analytical and numerical results are available. The parallel flow velocity and radial particle flux are obtained for different collisional regimes and compared with previous neoclassical results. The effect of the radial electric field and the transition to steep edge gradients is emphasized. The dynamical response of GAMs is also shown and compared to recent theory.

  16. Simulations of 4D edge transport and dynamics using the TEMPEST gyro-kinetic code

    NASA Astrophysics Data System (ADS)

    Rognlien, T. D.; Cohen, B. I.; Cohen, R. H.; Dorr, M. R.; Hittinger, J. A. F.; Kerbel, G. D.; Nevins, W. M.; Xiong, Z.; Xu, X. Q.

    2006-10-01

    Simulation results are presented for tokamak edge plasmas with a focus on the 4D (2r,2v) option of the TEMPEST continuum gyro-kinetic code. A detailed description of a variety of kinetic simulations is reported, including neoclassical radial transport from Coulomb collisions, electric field generation, dynamic response to perturbations by geodesic acoustic modes, and parallel transport on open magnetic-field lines. Comparison is made between the characteristics of the plasma solutions on closed and open magnetic-field line regions separated by a magnetic separatrix, and simple physical models are used to qualitatively explain the differences observed in mean flow and electric-field generation. The status of extending the simulations to 5D turbulence will be summarized. The code structure used in this ongoing project is also briefly described, together with future plans.

  17. Edge gyrokinetic theory and continuum simulations

    NASA Astrophysics Data System (ADS)

    Xu, X. Q.; Xiong, Z.; Dorr, M. R.; Hittinger, J. A.; Bodi, K.; Candy, J.; Cohen, B. I.; Cohen, R. H.; Colella, P.; Kerbel, G. D.; Krasheninnikov, S.; Nevins, W. M.; Qin, H.; Rognlien, T. D.; Snyder, P. B.; Umansky, M. V.

    2007-08-01

    The following results are presented from the development and application of TEMPEST, a fully nonlinear (full-f) five-dimensional (3d2v) gyrokinetic continuum edge-plasma code. (1) As a test of the interaction of collisions and parallel streaming, TEMPEST is compared with published analytic and numerical results for endloss of particles confined by combined electrostatic and magnetic wells. Good agreement is found over a wide range of collisionality, confining potential and mirror ratio, and the required velocity space resolution is modest. (2) In a large-aspect-ratio circular geometry, excellent agreement is found for a neoclassical equilibrium with parallel ion flow in the banana regime with zero temperature gradient and radial electric field. (3) The four-dimensional (2d2v) version of the code produces the first self-consistent simulation results of collisionless damping of geodesic acoustic modes and zonal flow (Rosenbluth-Hinton residual) with Boltzmann electrons using a full-f code. The electric field is also found to agree with the standard neoclassical expression for steep density and ion temperature gradients in the plateau regime. In divertor geometry, it is found that the endloss of particles and energy induces parallel flow stronger than the core neoclassical predictions in the SOL.

  18. Simulation of Plasma Transport in a Toroidal Annulus with TEMPEST

    NASA Astrophysics Data System (ADS)

    Xiong, Z.

    2005-10-01

    TEMPEST is an edge gyro-kinetic continuum code currently under development at LLNL to study boundary plasma transport over a region extending from inside the H-mode pedestal across the separatrix to the divertor plates. Here we report simulation results from the 4D (θ, ψ, E, μ) TEMPEST, for benchmark purposes, in an annular region immediately inside the separatrix of a large aspect ratio, circular cross-section tokamak. Besides the normal poloidal trapping regions, there are radially inaccessible regions at a fixed poloidal angle, energy, and magnetic moment due to the radial variation of the B field. To handle such cases, a fifth-order WENO differencing scheme is used in the radial direction. The particle and heat transport coefficients are obtained for different collisional regimes and compared with neoclassical transport theory.
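
    As a hedged illustration of the fifth-order WENO idea mentioned above (the classical Jiang-Shu WENO5 reconstruction on a uniform grid, not the specific TEMPEST implementation), the left-biased interface value is built from three candidate stencils blended by smoothness-dependent weights:

    ```python
    # Classical WENO5 (Jiang-Shu) reconstruction of the left-biased value at the i+1/2
    # interface on a uniform grid (illustrative sketch; not the TEMPEST implementation).
    import numpy as np

    def weno5_left(fm2, fm1, f0, fp1, fp2, eps=1e-6):
        """Reconstruct f at x_{i+1/2} from the five cell averages f_{i-2..i+2}."""
        # Candidate third-order reconstructions
        p0 = (2.0 * fm2 - 7.0 * fm1 + 11.0 * f0) / 6.0
        p1 = (-fm1 + 5.0 * f0 + 2.0 * fp1) / 6.0
        p2 = (2.0 * f0 + 5.0 * fp1 - fp2) / 6.0
        # Smoothness indicators
        b0 = 13.0 / 12.0 * (fm2 - 2.0 * fm1 + f0) ** 2 + 0.25 * (fm2 - 4.0 * fm1 + 3.0 * f0) ** 2
        b1 = 13.0 / 12.0 * (fm1 - 2.0 * f0 + fp1) ** 2 + 0.25 * (fm1 - fp1) ** 2
        b2 = 13.0 / 12.0 * (f0 - 2.0 * fp1 + fp2) ** 2 + 0.25 * (3.0 * f0 - 4.0 * fp1 + fp2) ** 2
        # Nonlinear weights from the linear weights (1/10, 6/10, 3/10)
        a0, a1, a2 = 0.1 / (eps + b0) ** 2, 0.6 / (eps + b1) ** 2, 0.3 / (eps + b2) ** 2
        s = a0 + a1 + a2
        return (a0 * p0 + a1 * p1 + a2 * p2) / s

    # For smooth data the reconstruction lands close to the exact interface value.
    x = np.linspace(0.0, 1.0, 11)          # dx = 0.1, cell centers at x[i]
    f = np.sin(2.0 * np.pi * x)
    print(weno5_left(f[2], f[3], f[4], f[5], f[6]), np.sin(2.0 * np.pi * 0.45))
    ```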

  19. The next-generation ESL continuum gyrokinetic edge code

    NASA Astrophysics Data System (ADS)

    Cohen, R.; Dorr, M.; Hittinger, J.; Rognlien, T.; Colella, P.; Martin, D.

    2009-05-01

    The Edge Simulation Laboratory (ESL) project is developing continuum-based approaches to kinetic simulation of edge plasmas. A new code is being developed, based on a conservative formulation and fourth-order discretization of full-f gyrokinetic equations in parallel-velocity, magnetic-moment coordinates. The code exploits mapped multiblock grids to deal with the geometric complexities of the edge region, and utilizes a new flux limiter [P. Colella and M.D. Sekora, JCP 227, 7069 (2008)] to suppress unphysical oscillations about discontinuities while maintaining high-order accuracy elsewhere. The code is just becoming operational; we will report initial tests for neoclassical orbit calculations in closed-flux surface and limiter (closed plus open flux surfaces) geometry. It is anticipated that the algorithmic refinements in the new code will address the slow numerical instability that was observed in some long simulations with the existing TEMPEST code. We will also discuss the status and plans for physics enhancements to the new code.

  20. Comparisons of anomalous and collisional radial transport with a continuum kinetic edge code

    NASA Astrophysics Data System (ADS)

    Bodi, K.; Krasheninnikov, S.; Cohen, R.; Rognlien, T.

    2009-05-01

    Modeling of anomalous (turbulence-driven) radial transport in controlled-fusion plasmas is necessary for long-time transport simulations. Here the focus is on continuum kinetic edge codes such as the (2-D, 2-V) transport version of TEMPEST, NEO, and the code being developed by the Edge Simulation Laboratory, but the model also has wider application. Our previously developed anomalous diagonal transport-matrix model with velocity-dependent convection and diffusion coefficients allows contact with typical fluid transport models (e.g., UEDGE). Results are presented that combine the anomalous transport model and collisional transport owing to ion drift orbits, utilizing a Krook collision operator that conserves density and energy. Comparison is made of the relative magnitudes and possible synergistic effects of the two processes for typical tokamak device parameters.

  21. Continuum Edge Gyrokinetic Theory and Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, X Q; Xiong, Z; Dorr, M R

    The following results are presented from the development and application of TEMPEST, a fully nonlinear (full-f) five-dimensional (3d2v) gyrokinetic continuum edge-plasma code. (1) As a test of the interaction of collisions and parallel streaming, TEMPEST is compared with published analytic and numerical results for endloss of particles confined by combined electrostatic and magnetic wells. Good agreement is found over a wide range of collisionality, confining potential, and mirror ratio; and the required velocity space resolution is modest. (2) In a large-aspect-ratio circular geometry, excellent agreement is found for a neoclassical equilibrium with parallel ion flow in the banana regime with zero temperature gradient and radial electric field. (3) The four-dimensional (2d2v) version of the code produces the first self-consistent simulation results of collisionless damping of geodesic acoustic modes and zonal flow (Rosenbluth-Hinton residual) with Boltzmann electrons using a full-f code. The electric field is also found to agree with the standard neoclassical expression for steep density and ion temperature gradients in the banana regime. (4) In divertor geometry, it is found that the endloss of particles and energy induces parallel flow stronger than the core neoclassical predictions in the SOL. (5) Our 5D gyrokinetic formulation yields a set of nonlinear electrostatic gyrokinetic equations that are valid for both neoclassical and turbulence simulations.

  22. TEMPEST simulations of the plasma transport in a single-null tokamak geometry

    NASA Astrophysics Data System (ADS)

    Xu, X. Q.; Bodi, K.; Cohen, R. H.; Krasheninnikov, S.; Rognlien, T. D.

    2010-06-01

    We present edge kinetic ion transport simulations of tokamak plasmas in magnetic divertor geometry using the fully nonlinear (full-f) continuum code TEMPEST. Besides neoclassical transport, a term for divergence of anomalous kinetic radial flux is added to mock up the effect of turbulent transport. To study the relative roles of neoclassical and anomalous transport, TEMPEST simulations were carried out for plasma transport and flow dynamics in a single-null tokamak geometry, including the pedestal region that extends across the separatrix into the scrape-off layer and private flux region. A series of TEMPEST simulations was conducted to investigate the transition of midplane pedestal heat flux and flow from the neoclassical to the turbulent limit and the transition of divertor heat flux and flow from the kinetic to the fluid regime via an anomalous transport scan and a density scan. The TEMPEST simulation results demonstrate that turbulent transport (as modelled by large diffusion) plays a similar role to collisional decorrelation of particle orbits and that the large turbulent transport (large diffusion) leads to an apparent Maxwellianization of the particle distribution. We also show the transition of parallel heat flux and flow at the entrance to the divertor plates from the fluid to the kinetic regime. For an absorbing divertor plate boundary condition, a non-half-Maxwellian distribution is found due to the balance between upstream radial anomalous transport and energetic ion endloss.

  23. Four-Dimensional Continuum Gyrokinetic Code: Neoclassical Simulation of Fusion Edge Plasmas

    NASA Astrophysics Data System (ADS)

    Xu, X. Q.

    2005-10-01

    We are developing a continuum gyrokinetic code, TEMPEST, to simulate edge plasmas. Our code represents velocity space via a grid in equilibrium energy and magnetic moment variables, and configuration space via poloidal magnetic flux and poloidal angle. The geometry is that of a fully diverted tokamak (single or double null) and so includes boundary conditions for both closed magnetic flux surfaces and open field lines. The 4-dimensional code includes kinetic electrons and ions and electrostatic field-solver options, and simulates neoclassical transport. The present implementation is a Method of Lines approach where spatial finite differences (higher-order upwinding) and implicit time advancement are used. We present results of initial verification and validation studies: transition from collisional to collisionless limits of parallel end-loss in the scrape-off layer, self-consistent electric field, and the effect of the real X-point geometry and edge plasma conditions on the standard neoclassical theory, including a comparison of our 4D code with other kinetic neoclassical codes and experiments.

  24. Multiphase, multi-electrode Joule heat computations for glass melter and in situ vitrification simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lowery, P.S.; Lessor, D.L.

    Waste glass melter and in situ vitrification (ISV) processes represent the combination of electrical, thermal, and fluid flow phenomena to produce a stable waste-form product. Computational modeling of the thermal and fluid flow aspects of these processes provides a useful tool for assessing the potential performance of proposed system designs. These computations can be performed at a fraction of the cost of experiments. Consequently, computational modeling of vitrification systems can also provide an economical means for assessing the suitability of a proposed process application. The computational model described in this paper employs finite-difference representations of the basic continuum conservation laws governing the thermal, fluid flow, and electrical aspects of the vitrification process, i.e., conservation of mass, momentum, energy, and electrical charge. The resulting code is a member of the TEMPEST family of codes developed at the Pacific Northwest Laboratory (operated by Battelle for the US Department of Energy). This paper provides an overview of the numerical approach employed in TEMPEST. In addition, results from several TEMPEST simulations of sample waste glass melter and ISV processes are provided to illustrate the insights to be gained from computational modeling of these processes. 3 refs., 13 figs.

  25. New Web Server - the Java Version of Tempest - Produced

    NASA Technical Reports Server (NTRS)

    York, David W.; Ponyik, Joseph G.

    2000-01-01

    A new software design and development effort has produced a Java (Sun Microsystems, Inc.) version of the award-winning Tempest software (refs. 1 and 2). In 1999, the Embedded Web Technology (EWT) team received a prestigious R&D 100 Award for Tempest, Java Version. In this article, "Tempest" will refer to the Java version of Tempest, a World Wide Web server for desktop or embedded systems. Tempest was designed at the NASA Glenn Research Center at Lewis Field to run on any platform for which a Java Virtual Machine (JVM, Sun Microsystems, Inc.) exists. The JVM acts as a translator between the native code of the platform and the byte code of Tempest, which is compiled in Java. These byte code files are Java executables with a ".class" extension. Multiple byte code files can be zipped together as a "*.jar" file for more efficient transmission over the Internet. Today's popular browsers, such as Netscape (Netscape Communications Corporation) and Internet Explorer (Microsoft Corporation) have built-in Virtual Machines to display Java applets.

  26. Optimum Vessel Performance in Evolving Nonlinear Wave Fields

    DTIC Science & Technology

    2012-11-01

    TEMPEST is the new, nonlinear, time-domain ship motion code being developed by the Navy. The radiation and diffraction forces in the level 3.0 version of TEMPEST will be computed by body-exact strip theory, and the nonlinear responses of a ship to a seaway are being incorporated into version 3 of TEMPEST.

  27. TEMPEST Simulations of the Plasma Transport in a Single-Null Tokamak Geometry

    DOE PAGES

    X. Q. Xu; Bodi, K.; Cohen, R. H.; ...

    2010-05-28

    We present edge kinetic ion transport simulations of tokamak plasmas in magnetic divertor geometry using the fully nonlinear (full-f) continuum code TEMPEST. Besides neoclassical transport, a term for divergence of anomalous kinetic radial flux is added to mock up the effect of turbulent transport. In order to study the relative roles of neoclassical and anomalous transport, TEMPEST simulations were carried out for plasma transport and flow dynamics in a single-null tokamak geometry, including the pedestal region that extends across the separatrix into the scrape-off layer and private flux region. A series of TEMPEST simulations was conducted to investigate the transition of midplane pedestal heat flux and flow from the neoclassical to the turbulent limit and the transition of divertor heat flux and flow from the kinetic to the fluid regime via an anomalous transport scan and a density scan. The TEMPEST simulation results demonstrate that turbulent transport (as modelled by large diffusion) plays a similar role to collisional decorrelation of particle orbits and that the large turbulent transport (large diffusion) leads to an apparent Maxwellianization of the particle distribution. Moreover, we show the transition of parallel heat flux and flow at the entrance to the divertor plates from the fluid to the kinetic regime. For an absorbing divertor plate boundary condition, a non-half-Maxwellian distribution is found due to the balance between upstream radial anomalous transport and energetic ion endloss.

  28. TEMPEST Simulations of the Plasma Transport in a Single-Null Tokamak Geometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    X. Q. Xu; Bodi, K.; Cohen, R. H.

    We present edge kinetic ion transport simulations of tokamak plasmas in magnetic divertor geometry using the fully nonlinear (full-f) continuum code TEMPEST. Besides neoclassical transport, a term for divergence of anomalous kinetic radial flux is added to mock up the effect of turbulent transport. In order to study the relative roles of neoclassical and anomalous transport, TEMPEST simulations were carried out for plasma transport and flow dynamics in a single-null tokamak geometry, including the pedestal region that extends across the separatrix into the scrape-off layer and private flux region. A series of TEMPEST simulations was conducted to investigate the transition of midplane pedestal heat flux and flow from the neoclassical to the turbulent limit and the transition of divertor heat flux and flow from the kinetic to the fluid regime via an anomalous transport scan and a density scan. The TEMPEST simulation results demonstrate that turbulent transport (as modelled by large diffusion) plays a similar role to collisional decorrelation of particle orbits and that the large turbulent transport (large diffusion) leads to an apparent Maxwellianization of the particle distribution. Moreover, we show the transition of parallel heat flux and flow at the entrance to the divertor plates from the fluid to the kinetic regime. For an absorbing divertor plate boundary condition, a non-half-Maxwellian distribution is found due to the balance between upstream radial anomalous transport and energetic ion endloss.

  29. TEMPEST II--A NEUTRON THERMALIZATION CODE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shudde, R.H.; Dyer, J.

    The TEMPEST II neutron thermalization code in Fortran for IBM 709 or 7090 calculates thermal neutron flux spectra based upon the Wigner-Wilkins equation, the Wilkins equation, or the Maxwellian distribution. When a neutron spectrum is obtained, TEMPEST II provides microscopic and macroscopic cross section averages over that spectrum. Equations used by the code and sample input and output data are given. (auth)

  30. Application of the TEMPEST computer code for simulating hydrogen distribution in model containment structures. [PWR; BWR]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trent, D.S.; Eyler, L.L.

    In this study several aspects of simulating hydrogen distribution in geometric configurations relevant to reactor containment structures were investigated using the TEMPEST computer code. Of particular interest was the performance of the TEMPEST turbulence model in a density-stratified environment. Computed results illustrated that the TEMPEST numerical procedures predicted the measured phenomena with good accuracy under a variety of conditions and that the turbulence model used is a viable approach in complex turbulent flow simulation.

  31. Application of the TEMPEST computer code to canister-filling heat transfer problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Farnsworth, R.K.; Faletti, D.W.; Budden, M.J.

    Pacific Northwest Laboratory (PNL) researchers used the TEMPEST computer code to simulate thermal cooldown behavior of nuclear waste glass after it was poured into steel canisters for long-term storage. The objective of this work was to determine the accuracy and applicability of the TEMPEST code when used to compute canister thermal histories. First, experimental data were obtained to provide the basis for comparing TEMPEST-generated predictions. Five canisters were instrumented with appropriately located radial and axial thermocouples. The canisters were filled using the pilot-scale ceramic melter (PSCM) at PNL. Each canister was filled in either a continuous or a batch filling mode. One of the canisters was also filled within a turntable simulant (a group of cylindrical shells with heat transfer resistances similar to those in an actual melter turntable). This was necessary to provide a basis for assessing the ability of the TEMPEST code to also model the transient cooling of canisters in a melter turntable. The continuous-fill model, Version M, was found to predict temperatures with more accuracy. The turntable simulant experiment demonstrated that TEMPEST can adequately model the asymmetric temperature field caused by the turntable geometry. Further, TEMPEST can acceptably predict the canister cooling history within a turntable, despite code limitations in computing simultaneous radiation and convection heat transfer between shells, along with uncertainty in stainless-steel surface emissivities. Based on the successful performance of TEMPEST Version M, development was initiated to incorporate 1) full viscous glass convection, 2) a dynamically adaptive grid that automatically follows the glass/air interface throughout the transient, and 3) a full enclosure radiation model to allow radiation heat transfer to non-nearest-neighbor cells. 5 refs., 47 figs., 17 tabs.

  32. Overview of Edge Simulation Laboratory (ESL)

    NASA Astrophysics Data System (ADS)

    Cohen, R. H.; Dorr, M.; Hittinger, J.; Rognlien, T.; Umansky, M.; Xiong, A.; Xu, X.; Belli, E.; Candy, J.; Snyder, P.; Colella, P.; Martin, D.; Sternberg, T.; van Straalen, B.; Bodi, K.; Krasheninnikov, S.

    2006-10-01

    The ESL is a new collaboration to build a full-f electromagnetic gyrokinetic code for tokamak edge plasmas using continuum methods. Target applications are edge turbulence and transport (neoclassical and anomalous), and edge-localized modes. Initially the project has three major threads: (i) verification and validation of TEMPEST, the project's initial (electrostatic) edge code, which can be run in 4D (neoclassical and transport-timescale applications) or 5D (turbulence); (ii) design of the next-generation code, which will include more complete physics (electromagnetics, fluid equation option, improved collisions) and advanced numerics (fully conservative, high-order discretization, mapped multiblock grids, adaptivity); and (iii) rapid-prototype codes to explore the issues attached to solving fully nonlinear gyrokinetics with steep radial gradients. We present a brief summary of the status of each of these activities.

  33. TEMPEST code simulations of hydrogen distribution in reactor containment structures. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trent, D.S.; Eyler, L.L.

    The mass transport version of the TEMPEST computer code was used to simulate hydrogen distribution in geometric configurations relevant to reactor containment structures. Predicted results of Battelle-Frankfurt hydrogen distribution tests 1 to 6, and 12 are presented. Agreement between predictions and experimental data is good. Best agreement is obtained using the k-ε turbulence model in TEMPEST in flow cases where turbulent diffusion and stable stratification are dominant mechanisms affecting transport. The code's general analysis capabilities are summarized.
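
    For context, the two-equation k-ε closure referred to in these TEMPEST applications is usually written in the standard textbook form below (the generic model with its customary constants, an assumption; the exact variant implemented in TEMPEST may differ):

    ```latex
    % Standard k-epsilon closure (textbook form, assumed for illustration):
    \frac{\partial k}{\partial t} + \mathbf{u}\cdot\nabla k
      = \nabla\!\cdot\!\left[\Bigl(\nu + \frac{\nu_t}{\sigma_k}\Bigr)\nabla k\right] + P_k - \varepsilon,
    \qquad
    \frac{\partial \varepsilon}{\partial t} + \mathbf{u}\cdot\nabla \varepsilon
      = \nabla\!\cdot\!\left[\Bigl(\nu + \frac{\nu_t}{\sigma_\varepsilon}\Bigr)\nabla \varepsilon\right]
      + C_{1\varepsilon}\,\frac{\varepsilon}{k}\,P_k - C_{2\varepsilon}\,\frac{\varepsilon^2}{k},

    \nu_t = C_\mu \frac{k^2}{\varepsilon}, \qquad
    C_\mu = 0.09,\;\; C_{1\varepsilon} = 1.44,\;\; C_{2\varepsilon} = 1.92,\;\; \sigma_k = 1.0,\;\; \sigma_\varepsilon = 1.3 .
    ```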

  34. Modeling anomalous radial transport in kinetic transport codes

    NASA Astrophysics Data System (ADS)

    Bodi, K.; Krasheninnikov, S. I.; Cohen, R. H.; Rognlien, T. D.

    2009-11-01

    Anomalous transport is typically the dominant component of the radial transport in magnetically confined plasmas, where the physical origin of this transport is believed to be plasma turbulence. A model is presented for anomalous transport that can be used in continuum kinetic edge codes like TEMPEST, NEO and the next-generation code being developed by the Edge Simulation Laboratory. The model can also be adapted to particle-based codes. It is demonstrated that the model, with velocity-dependent diffusion and convection terms, can match a diagonal gradient-driven transport matrix as found in contemporary fluid codes, but can also include off-diagonal effects. The anomalous transport model is also combined with particle drifts and a particle/energy-conserving Krook collision operator to study possible synergistic effects with neoclassical transport. For the latter study, a velocity-independent anomalous diffusion coefficient is used to mimic the effect of long-wavelength E×B turbulence.
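
    A particle- and energy-conserving Krook (BGK-type) operator of the kind mentioned above can be written schematically as follows (an assumed standard form; the abstract does not spell out the exact TEMPEST expression):

    ```latex
    % Schematic density- and energy-conserving Krook/BGK collision operator (assumed form):
    C[f] \;=\; -\,\nu\,\bigl(f - f_M[n, T]\bigr),
    \qquad
    \int C[f]\, d^3v = 0,
    \qquad
    \int \tfrac{1}{2} m v^2\, C[f]\, d^3v = 0,
    ```

    where the Maxwellian f_M is constructed from the density and energy moments of f itself, so that collisions relax the distribution without creating or destroying particles or energy.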

  35. A velocity-dependent anomalous radial transport model for (2-D, 2-V) kinetic transport codes

    NASA Astrophysics Data System (ADS)

    Bodi, Kowsik; Krasheninnikov, Sergei; Cohen, Ron; Rognlien, Tom

    2008-11-01

    Plasma turbulence constitutes a significant part of radial plasma transport in magnetically confined plasmas. This turbulent transport is modeled in the form of anomalous convection and diffusion coefficients in fluid transport codes. There is a need to model the same transport in continuum kinetic edge codes [such as the (2-D, 2-V) transport version of TEMPEST, NEO, and the code being developed by the Edge Simulation Laboratory] with non-Maxwellian distributions. We present an anomalous transport model with velocity-dependent convection and diffusion coefficients leading to a diagonal transport matrix similar to that used in contemporary fluid transport models (e.g., UEDGE). Also presented are results of simulations corresponding to radial transport due to long-wavelength E×B turbulence using a velocity-independent diffusion coefficient. A BGK collision model is used to enable comparison with fluid transport codes.

  36. TEMPEST code modifications and testing for erosion-resisting sludge simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Onishi, Y.; Trent, D.S.

    The TEMPEST computer code has been used to address many waste retrieval operational and safety questions regarding waste mobilization, mixing, and gas retention. Because the amount of sludge retrieved from the tank is directly related to the sludge yield strength and the shear stress acting upon it, it is important to incorporate the sludge yield strength into simulations of erosion-resisting tank waste retrieval operations. This report describes current efforts to modify the TEMPEST code to simulate pump jet mixing of erosion-resisting tank wastes and the models used to test for erosion of waste sludge with yield strength. Test results for solid deposition and diluent/slurry jet injection into sludge layers in simplified tank conditions show that the modified TEMPEST code has a basic ability to simulate both the mobility and immobility of the sludges with yield strength. Further testing, modification, calibration, and verification of the sludge mobilization/immobilization model are planned using erosion data as they apply to waste tank sludges.

  37. Neoclassical orbit calculations with a full-f code for tokamak edge plasmas

    NASA Astrophysics Data System (ADS)

    Rognlien, T. D.; Cohen, R. H.; Dorr, M.; Hittinger, J.; Xu, X. Q.; Colella, P.; Martin, D.

    2008-11-01

    Ion distribution function modifications are considered for the case of neoclassical orbit widths comparable to plasma radial-gradient scale-lengths. Implementation of proper boundary conditions at divertor plates in the continuum TEMPEST code, including the effect of drifts in determining the direction of total flow, enables such calculations in single-null divertor geometry, with and without an electrostatic potential. The resultant poloidal asymmetries in densities, temperatures, and flows are discussed. For long-time simulations, a slow numerical instability develops, even in simplified (circular) geometry with no endloss, which aids identification of the mixed treatment of parallel and radial convection terms as the cause. The new Edge Simulation Laboratory code, expected to be operational, has algorithmic refinements that should address the instability. We will present any available results from the new code on this problem as well as geodesic acoustic mode tests.

  38. Edge Simulation Laboratory Progress and Plans

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cohen, R

    The Edge Simulation Laboratory (ESL) is a project to develop a gyrokinetic code for MFE edge plasmas based on continuum (Eulerian) techniques. ESL is a base-program activity of OFES, with an allied algorithm research activity funded by the OASCR base math program. ESL OFES funds directly support about 0.8 FTE of career staff at LLNL, a postdoc and a small fraction of an FTE at GA, and a graduate student at UCSD. In addition, the allied OASCR program funds about 1/2 FTE each in the computations directorates at LBNL and LLNL. OFES ESL funding for LLNL and UCSD began in fall 2005, while funding for GA and the math team began about a year ago. ESL's continuum approach is a complement to the PIC-based methods of the CPES Project, and was selected (1) because of concerns about noise issues associated with PIC in the high-density-contrast environment of the edge pedestal, (2) to be able to exploit advanced numerical methods developed for fluid codes, and (3) to build upon the successes of core continuum gyrokinetic codes such as GYRO, GS2 and GENE. The ESL project presently has three components: TEMPEST, a full-f, full-geometry (single-null divertor, or arbitrary-shape closed flux surfaces) code in E, μ (energy, magnetic-moment) coordinates; EGK, a simple-geometry rapid-prototype code; and the math component, which is developing and implementing algorithms for a next-generation code. Progress would be accelerated if we could find funding for a fourth, computer science, component, which would develop software infrastructure, provide user support, and address needs for data handling and analysis. We summarize the status and plans for the three funded activities.

  39. TEMPEST: A three-dimensional time-dependent computer program for hydrothermal analysis: Volume 2, Assessment and verification results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eyler, L L; Trent, D S; Budden, M J

    During the course of the TEMPEST computer code development a concurrent effort was conducted to assess the code's performance and the validity of computed results. The results of this work are presented in this document. The principal objective of this effort was to assure the code's computational correctness for a wide range of hydrothermal phenomena typical of fast breeder reactor application. 47 refs., 94 figs., 6 tabs.

  40. Dynamics of kinetic geodesic-acoustic modes and the radial electric field in tokamak neoclassical plasmas

    NASA Astrophysics Data System (ADS)

    Xu, X. Q.; Belli, E.; Bodi, K.; Candy, J.; Chang, C. S.; Cohen, R. H.; Colella, P.; Dimits, A. M.; Dorr, M. R.; Gao, Z.; Hittinger, J. A.; Ko, S.; Krasheninnikov, S.; McKee, G. R.; Nevins, W. M.; Rognlien, T. D.; Snyder, P. B.; Suh, J.; Umansky, M. V.

    2009-06-01

    We present edge gyrokinetic simulations of tokamak plasmas using the fully non-linear (full-f) continuum code TEMPEST. A non-linear Boltzmann model is used for the electrons. The electric field is obtained by solving the 2D gyrokinetic Poisson equation. We demonstrate the following. (1) High harmonic resonances (n > 2) significantly enhance geodesic-acoustic mode (GAM) damping at high q (tokamak safety factor), and are necessary to explain the damping observed in our TEMPEST q-scans, and are consistent with the experimental measurements of the scaling of the GAM amplitude with edge q95 in the absence of obvious evidence that there is a strong q-dependence of the turbulent drive and damping of the GAM. (2) The kinetic GAM exists in the edge for steep density and temperature gradients in the form of outgoing waves, its radial scale is set by the ion temperature profile, and ion temperature inhomogeneity is necessary for GAM radial propagation. (3) The development of the neoclassical electric field evolves through different phases of relaxation, including GAMs, their radial propagation and their long-time collisional decay. (4) Natural consequences of orbits in the pedestal and scrape-off layer region in divertor geometry are substantial non-Maxwellian ion distributions and parallel flow characteristics qualitatively like those observed in experiments.

  41. Fully Nonlinear Edge Gyrokinetic Simulations of Kinetic Geodesic-Acoustic Modes and Boundary Flows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, X Q; Belli, E; Bodi, K

    We present edge gyrokinetic neoclassical simulations of tokamak plasmas using the fully nonlinear (full-f) continuum code TEMPEST. A nonlinear Boltzmann model is used for the electrons. The electric field is obtained by solving the 2D gyrokinetic Poisson equation. We demonstrate the following: (1) High harmonic resonances (n > 2) significantly enhance geodesic-acoustic mode (GAM) damping at high-q (tokamak safety factor), and are necessary to explain both the damping observed in our TEMPEST q-scans and experimental measurements of the scaling of the GAM amplitude with edge q95 in the absence of obvious evidence that there is a strong q dependence of the turbulent drive and damping of the GAM. (2) The kinetic GAM exists in the edge for steep density and temperature gradients in the form of outgoing waves, its radial scale is set by the ion temperature profile, and ion temperature inhomogeneity is necessary for GAM radial propagation. (3) The development of the neoclassical electric field evolves through different phases of relaxation, including GAMs, their radial propagation, and their long-time collisional decay. (4) Natural consequences of orbits in the pedestal and scrape-off layer region in divertor geometry are substantial non-Maxwellian ion distributions and flow characteristics qualitatively like those observed in experiments.

  42. Turbulence-driven Coronal Heating and Improvements to Empirical Forecasting of the Solar Wind

    NASA Astrophysics Data System (ADS)

    Woolsey, Lauren N.; Cranmer, Steven R.

    2014-06-01

    Forecasting models of the solar wind often rely on simple parameterizations of the magnetic field that ignore the effects of the full magnetic field geometry. In this paper, we present the results of two solar wind prediction models that consider the full magnetic field profile and include the effects of Alfvén waves on coronal heating and wind acceleration. The one-dimensional magnetohydrodynamic code ZEPHYR self-consistently finds solar wind solutions without the need for empirical heating functions. Another one-dimensional code, introduced in this paper (The Efficient Modified-Parker-Equation-Solving Tool, TEMPEST), can act as a smaller, stand-alone code for use in forecasting pipelines. TEMPEST is written in Python and will become a publicly available library of functions that is easy to adapt and expand. We discuss important relations between the magnetic field profile and properties of the solar wind that can be used to independently validate prediction models. ZEPHYR provides the foundation and calibration for TEMPEST, and ultimately we will use these models to predict observations and explain space weather created by the bulk solar wind. We are able to reproduce with both models the general anticorrelation seen in comparisons of observed wind speed at 1 AU and the flux tube expansion factor. There is significantly less spread when comparing the results of the two models than between ZEPHYR and a traditional flux tube expansion relation. We suggest that the new code, TEMPEST, will become a valuable tool in the forecasting of space weather.
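
    As a hedged illustration of the "Parker-equation-solving" idea (the textbook isothermal Parker wind, an assumption for this sketch; the actual TEMPEST/ZEPHYR models include Alfvén-wave heating, wave pressure, and non-isothermal temperatures), the wind-branch solution of the isothermal Parker equation can be obtained by root-finding on the classical transcendental relation:

    ```python
    # Isothermal Parker wind sketch (illustrative assumption, not the TEMPEST physics):
    # solve M^2 - ln(M^2) = 4 ln(r/r_c) + 4 r_c/r - 3 for the Mach number M = v/c_s,
    # picking the transonic wind branch (M < 1 inside the critical radius, M > 1 outside).
    import numpy as np
    from scipy.optimize import brentq

    G = 6.674e-11            # m^3 kg^-1 s^-2
    M_sun = 1.989e30         # kg
    c_s = 1.5e5              # m/s, assumed isothermal sound speed (~1 MK corona)
    r_crit = G * M_sun / (2.0 * c_s**2)   # Parker critical radius

    def wind_mach(r):
        """Wind-branch Mach number v/c_s at heliocentric distance r (meters)."""
        rhs = 4.0 * np.log(r / r_crit) + 4.0 * r_crit / r - 3.0
        g = lambda m: m**2 - 2.0 * np.log(m) - rhs
        if r < r_crit:
            return brentq(g, 1e-8, 1.0 - 1e-12)      # subsonic below the critical point
        return brentq(g, 1.0 + 1e-12, 50.0)          # supersonic beyond the critical point

    au = 1.496e11
    print("wind speed at 1 AU:", wind_mach(au) * c_s / 1e3, "km/s")
    ```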

  43. Modeling study of deposition locations in the 291-Z plenum

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mahoney, L.A.; Glissmeyer, J.A.

    The TEMPEST (Trent and Eyler 1991) and PART5 computer codes were used to predict the probable locations of particle deposition in the suction-side plenum of the 291-Z building in the 200 Area of the Hanford Site, the exhaust fan building for the 234-5Z, 236-Z, and 232-Z buildings in the 200 Area of the Hanford Site. The TEMPEST code provided velocity fields for the airflow through the plenum. These velocity fields were then used with TEMPEST to provide modeling of near-floor particle concentrations without particle sticking (100% resuspension). The same velocity fields were also used with PART5 to provide modeling of particle deposition with sticking (0% resuspension). Some of the parameters whose importance was tested were particle size, point of injection and exhaust fan configuration.

  44. TEMPEST: A computer code for three-dimensional analysis of transient fluid dynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fort, J.A.

    TEMPEST (Transient Energy Momentum and Pressure Equations Solutions in Three dimensions) is a powerful tool for solving engineering problems in nuclear energy, waste processing, chemical processing, and environmental restoration because it performs and illustrates 3-D, time-dependent computational fluid dynamics and heat transfer analysis. It is a family of codes with two primary versions, an N-Version (available to the public) and a T-Version (not currently available to the public). This handout discusses its capabilities, applications, numerical algorithms, development status, and availability and assistance.

  45. Solar Wind Acceleration: Modeling Effects of Turbulent Heating in Open Flux Tubes

    NASA Astrophysics Data System (ADS)

    Woolsey, Lauren N.; Cranmer, Steven R.

    2014-06-01

    We present two self-consistent coronal heating models that determine the properties of the solar wind generated and accelerated in magnetic field geometries that are open to the heliosphere. These models require only the radial magnetic field profile as input. The first code, ZEPHYR (Cranmer et al. 2007), is a 1D MHD code that includes the effects of turbulent heating created by counter-propagating Alfvén waves rather than relying on empirical heating functions. We present the analysis of a large grid of modeled flux tubes (> 400) and the resulting solar wind properties. From the models and results, we recreate the observed anti-correlation between wind speed at 1 AU and the so-called expansion factor, a parameterization of the magnetic field profile. We also find that our models follow the same observationally derived relation between temperature at 1 AU and wind speed at 1 AU. We continue our analysis with a newly developed code written in Python called TEMPEST (The Efficient Modified-Parker-Equation-Solving Tool) that runs an order of magnitude faster than ZEPHYR due to a set of simplifying relations between the input magnetic field profile and the temperature and wave reflection coefficient profiles. We present these simplifying relations as a useful result in themselves, as well as the anti-correlation between wind speed and expansion factor also found with TEMPEST. Due to the nature of the algorithm TEMPEST utilizes to find solar wind solutions, we can effectively separate the two primary ways in which Alfvén waves contribute to solar wind acceleration: 1) heating the surrounding gas through a turbulent cascade and 2) providing a separate source of wave pressure. We intend to make TEMPEST easily available to the public and suggest that TEMPEST can be used as a valuable tool in the forecasting of space weather, either as a stand-alone code or within an existing modeling framework.

  46. Performance of a parallel thermal-hydraulics code TEMPEST

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fann, G.I.; Trent, D.S.

    The authors describe the parallelization of the TEMPEST thermal-hydraulics code. The serial version of this code is used for production-quality 3-D thermal-hydraulics simulations. Good speedup was obtained with a parallel, diagonally preconditioned BiCGStab non-symmetric linear solver, using a spatial domain decomposition approach for the semi-iterative pressure-based and mass-conserved algorithm. The test case used here to illustrate the performance of the BiCGStab solver is a 3-D natural convection problem modeled using finite volume discretization in cylindrical coordinates. The BiCGStab solver replaced the LSOR-ADI method for solving the pressure equation in TEMPEST. BiCGStab also solves the coupled thermal energy equation. Scaling performance for three problem sizes (221,220 nodes; 358,120 nodes; and 701,220 nodes) is presented. These problems were run on two different parallel machines: an IBM-SP and an SGI PowerChallenge. The largest problem attains a speedup of 68 on a 128-processor IBM-SP. In real terms, this is over 34 times faster than the fastest serial production time using the LSOR-ADI solver.
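
    A hedged sketch of the solver strategy described above: a diagonally (Jacobi) preconditioned BiCGStab iteration applied to a small nonsymmetric advection-diffusion matrix, standing in for the pressure and energy solves in TEMPEST (illustrative assumption only; the matrix and sizes are invented).

    ```python
    # Diagonally preconditioned BiCGStab sketch on a nonsymmetric tridiagonal system
    # (illustrative 1D advection-diffusion operator; not the TEMPEST discretization).
    import numpy as np
    import scipy.sparse as sp
    from scipy.sparse.linalg import bicgstab, LinearOperator

    n = 400
    h = 1.0 / (n + 1)
    pe = 10.0                                   # hypothetical advection strength
    lower = (-1.0 / h**2 - pe / (2 * h)) * np.ones(n - 1)
    main  = ( 2.0 / h**2) * np.ones(n)
    upper = (-1.0 / h**2 + pe / (2 * h)) * np.ones(n - 1)
    A = sp.diags([lower, main, upper], [-1, 0, 1], format="csr")   # nonsymmetric matrix
    b = np.ones(n)

    M = LinearOperator((n, n), matvec=lambda r: r / main)          # diagonal (Jacobi) preconditioner
    x, info = bicgstab(A, b, M=M)
    print("converged" if info == 0 else f"info={info}", np.linalg.norm(A @ x - b))
    ```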

  7. Nonlinear Full-f Edge Gyrokinetic Turbulence Simulations

    NASA Astrophysics Data System (ADS)

    Xu, X. Q.; Dimits, A. M.; Umansky, M. V.

    2008-11-01

    TEMPEST is a nonlinear full-f 5D electrostatic gyrokinetic code for simulations of neoclassical and turbulent transport for tokamak plasmas. Given an initial density perturbation, 4D TEMPEST simulations show that the kinetic GAM exists in the edge in the form of outgoing waves [1], its radial scale is set by plasma profiles, and the ion temperature inhomogeneity is necessary for GAM radial propagation. From an initial Maxwellian distribution with uniform poloidal profiles on flux surfaces, the 5D TEMPEST simulations in flux coordinates with a Boltzmann electron model in a circular geometry show the development of a neoclassical equilibrium and the generation of the neoclassical electric field due to neoclassical polarization, followed by the growth of an instability driven by the spatial gradients. 5D TEMPEST simulations of kinetic GAM turbulent generation, radial propagation, and its impact on transport will be reported. [1] X. Q. Xu, Phys. Rev. E., 78 (2008).

  8. Preliminary testing of turbulence and radionuclide transport modeling in deep ocean environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Onishi, Y.; Dummuller, D.C.; Trent, D.S.

    Pacific Northwest Laboratory (PNL) performed a study for the US Environmental Protection Agency's Office of Radiation Programs to (1) identify candidate models for regional modeling of low-level waste ocean disposal sites in the mid-Atlantic ocean; (2) evaluate mathematical representations of the models' eddy viscosity/dispersion coefficients; and (3) evaluate, on a preliminary basis, the adequacy of the k-ε turbulence model and the feasibility of applying one of the candidate models, TEMPEST©/FLESCOT©, to deep-ocean problems. PNL identified the TEMPEST©/FLESCOT©, FLOWER, Blumberg's, and RMA 10 models as appropriate candidates for the regional radionuclide modeling. Among these models, TEMPEST/FLESCOT is currently the only one that solves for distributions of flow, turbulence (with the k-ε model), salinity, water temperature, sediment, dissolved contaminants, and sediment-sorbed contaminants. Solving the Navier-Stokes equations using higher-order correlations is not practical for regional modeling because of the prohibitive computational requirements; therefore, turbulence modeling with the k-ε closure is the more practical approach. PNL applied the three-dimensional code TEMPEST©/FLESCOT© with the k-ε model to a very simple, hypothetical, two-dimensional, deep-ocean case, producing at least qualitatively appropriate results. However, more detailed testing should be performed to further validate the code. 46 refs., 39 figs., 6 tabs.
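
    For reference, the standard high-Reynolds-number k-ε closure mentioned above has the form below (Launder-Spalding constants shown); the TEMPEST/FLESCOT implementation may carry additional buoyancy and source terms not written here, so this is only the generic model:

      \frac{\partial k}{\partial t} + U_j \frac{\partial k}{\partial x_j}
        = \frac{\partial}{\partial x_j}\!\left[\left(\nu + \frac{\nu_t}{\sigma_k}\right)\frac{\partial k}{\partial x_j}\right]
          + P_k - \varepsilon,
      \qquad
      \frac{\partial \varepsilon}{\partial t} + U_j \frac{\partial \varepsilon}{\partial x_j}
        = \frac{\partial}{\partial x_j}\!\left[\left(\nu + \frac{\nu_t}{\sigma_\varepsilon}\right)\frac{\partial \varepsilon}{\partial x_j}\right]
          + C_{\varepsilon 1}\frac{\varepsilon}{k}P_k - C_{\varepsilon 2}\frac{\varepsilon^{2}}{k},

      \nu_t = C_\mu \frac{k^{2}}{\varepsilon},
      \qquad C_\mu = 0.09,\; C_{\varepsilon 1} = 1.44,\; C_{\varepsilon 2} = 1.92,\; \sigma_k = 1.0,\; \sigma_\varepsilon = 1.3,

    where P_k is the shear production of turbulent kinetic energy and ν_t the eddy viscosity that supplies the eddy-diffusion coefficients for momentum, heat, salinity, and sediment in a FLESCOT-type transport model.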

  9. Turbulence-driven coronal heating and improvements to empirical forecasting of the solar wind

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Woolsey, Lauren N.; Cranmer, Steven R.

    Forecasting models of the solar wind often rely on simple parameterizations of the magnetic field that ignore the effects of the full magnetic field geometry. In this paper, we present the results of two solar wind prediction models that consider the full magnetic field profile and include the effects of Alfvén waves on coronal heating and wind acceleration. The one-dimensional magnetohydrodynamic code ZEPHYR self-consistently finds solar wind solutions without the need for empirical heating functions. Another one-dimensional code, introduced in this paper (The Efficient Modified-Parker-Equation-Solving Tool, TEMPEST), can act as a smaller, stand-alone code for use in forecasting pipelines. TEMPEST is written in Python and will become a publicly available library of functions that is easy to adapt and expand. We discuss important relations between the magnetic field profile and properties of the solar wind that can be used to independently validate prediction models. ZEPHYR provides the foundation and calibration for TEMPEST, and ultimately we will use these models to predict observations and explain space weather created by the bulk solar wind. We are able to reproduce with both models the general anticorrelation seen in comparisons of observed wind speed at 1 AU and the flux tube expansion factor. There is significantly less spread when comparing the results of the two models than when comparing ZEPHYR with a traditional flux tube expansion relation. We suggest that the new code, TEMPEST, will become a valuable tool in the forecasting of space weather.

  10. A finite volume Fokker-Planck collision operator in constants-of-motion coordinates

    NASA Astrophysics Data System (ADS)

    Xiong, Z.; Xu, X. Q.; Cohen, B. I.; Cohen, R.; Dorr, M. R.; Hittinger, J. A.; Kerbel, G.; Nevins, W. M.; Rognlien, T.

    2006-04-01

    TEMPEST is a 5D gyrokinetic continuum code for edge plasmas. Constants of motion, namely, the total energy E and the magnetic moment μ, are chosen as coordinates because of their advantage in minimizing numerical diffusion in advection operators. Most existing collision operators are written in other coordinates; using them by interpolation is shown to be less satisfactory in maintaining overall numerical accuracy and conservation. Here we develop a Fokker-Planck collision operator directly in (E, μ) space using a finite volume approach. The (E, μ) grid is Cartesian, and the turning-point boundary is represented by a straight line cutting through the grid that separates the physical and non-physical zones. The resulting cut-cells are treated by a cell-merging technique to ensure complete particle conservation. A two-dimensional, fourth-order reconstruction scheme is devised to achieve good numerical accuracy with a modest number of grid points. The new collision operator will be benchmarked with numerical examples.
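
    The conservative structure that the finite volume discretization above relies on can be written generically as a velocity-space divergence of a flux; the drag and diffusion coefficients F and D below are schematic stand-ins for the Rosenbluth-potential coefficients, and the paper's operator in (E, μ) variables follows from a coordinate transformation of this form:

      \left.\frac{\partial f}{\partial t}\right|_{\rm coll}
        = \nabla_{\mathbf v}\cdot\bigl(\mathsf D\cdot\nabla_{\mathbf v} f \;-\; \mathbf F\,f\bigr),
      \qquad
      \frac{d\bar f_i}{dt} = \frac{1}{V_i}\sum_{s\,\in\,\partial\Omega_i}\Gamma_s\,A_s.

    Because each face flux Γ_s enters the two adjacent cells with opposite signs, the sum over all cells (including the merged cut-cells along the turning-point line) telescopes to zero, which is how the scheme maintains complete particle conservation.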

  11. Calculation of ion distribution functions and neoclassical transport in the edge of single-null divertor tokamaks

    NASA Astrophysics Data System (ADS)

    Rognlien, T. D.; Cohen, R. H.; Xu, X. Q.

    2007-11-01

    The ion distribution function in the H-mode pedestal region and outward across the magnetic separatrix is expected to have a substantial non-Maxwellian character owing to the large banana orbits and steep gradients in temperature and density. The 4D (2r,2v) version of the TEMPEST continuum gyrokinetic code is used with a Coulomb collision model to calculate the ion distribution in a single-null tokamak geometry throughout the pedestal/scrape-off-layer regions. The mean density, parallel velocity, and energy radial profiles are shown at various poloidal locations. The collisions cause neoclassical energy transport through the pedestal that is then lost to the divertor plates along the open field lines outside the separatrix. The resulting heat flux profiles at the inner and outer divertor plates are presented and discussed, including asymmetries that depend on the B-field direction. Of particular focus is the effect on ion profiles and fluxes of a radial electric field exhibiting a deep well just inside the separatrix, which reduces the width of the banana orbits by the well-known squeezing effect.

  12. Correlation models for waste tank sludges and slurries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mahoney, L.A.; Trent, D.S.

    This report presents the results of work conducted to support the TEMPEST computer modeling under the Flammable Gas Program (FGP) and to further the comprehension of the physical processes occurring in the Hanford waste tanks. The end products of this task are correlation models (sets of algorithms) that can be added to the TEMPEST computer code to improve the reliability of its simulation of the physical processes that occur in Hanford tanks. The correlation models can be used to augment not only the TEMPEST code but also other computer codes that can simulate sludge motion and flammable gas retention. This report presents the correlation models, also termed submodels, that have been developed to date. The submodel-development process is an ongoing effort designed to increase our understanding of sludge behavior and improve our ability to realistically simulate the sludge fluid characteristics that have an impact on safety analysis. The effort has employed both literature searches and data correlation to provide an encyclopedia of tank waste properties in forms that are relatively easy to use in modeling waste behavior. These property submodels will be used in other tasks to simulate waste behavior in the tanks. Density, viscosity, yield strength, surface tension, heat capacity, thermal conductivity, salt solubility, and ammonia and water vapor pressures were compiled for solutions and suspensions of sodium nitrate and other salts (where data were available), and the data were correlated by linear regression. In addition, data for simulated Hanford waste tank supernatant were correlated to provide density, solubility, surface tension, and vapor pressure submodels for multi-component solutions containing sodium hydroxide, sodium nitrate, sodium nitrite, and sodium aluminate.
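
    A minimal sketch of the linear-regression correlation fitting described above is shown below; the concentrations and densities are hypothetical placeholders (not Hanford data), and the function name density_submodel is invented for illustration:

      # Hedged sketch: fit a property submodel by linear regression so it can be
      # evaluated inside a thermal-hydraulics code (hypothetical data throughout).
      import numpy as np

      conc    = np.array([0.0, 1.0, 2.0, 4.0, 6.0])                  # mol/L NaNO3 (hypothetical)
      density = np.array([998.0, 1050.0, 1101.0, 1203.0, 1305.0])    # kg/m^3 (hypothetical)

      slope, intercept = np.polyfit(conc, density, deg=1)            # least-squares line

      def density_submodel(c):
          """Correlation submodel: density (kg/m^3) as a linear function of concentration."""
          return intercept + slope * c

      print(f"rho(c) ~ {intercept:.1f} + {slope:.1f}*c;  rho(3.0) = {density_submodel(3.0):.1f} kg/m^3")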

  13. Navigation, Guidance and Control For the CICADA Expendable Micro Air Vehicle

    DTIC Science & Technology

    2015-01-01

    aircraft, as shown in Figure 5a. A Tempest UAV mothership was used as the host platform for the CICADA vehicles. Figure 5b shows how two CICADAs were...mounted on wing pylon drop mechanisms located on each wing of the Tempest . The Tempest was needed to carry the CICADAs back within range of the recovery...carried the Tempest and CICADA combination to a maximum altitude of 57,000 feet above sea-level. At that point, Tempest was released from the balloon and

  14. 48 CFR 239.7102-2 - Compromising emanations-TEMPEST or other standard.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...-TEMPEST or other standard. 239.7102-2 Section 239.7102-2 Federal Acquisition Regulations System DEFENSE... INFORMATION TECHNOLOGY Security and Privacy for Computer Systems 239.7102-2 Compromising emanations—TEMPEST or....e., an established National TEMPEST standard (e.g., NACSEM 5100, NACSIM 5100A) or a standard used by...

  15. 48 CFR 239.7102-2 - Compromising emanations-TEMPEST or other standard.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ...-TEMPEST or other standard. 239.7102-2 Section 239.7102-2 Federal Acquisition Regulations System DEFENSE... INFORMATION TECHNOLOGY Security and Privacy for Computer Systems 239.7102-2 Compromising emanations—TEMPEST or....e., an established National TEMPEST standard (e.g., NACSEM 5100, NACSIM 5100A) or a standard used by...

  16. 48 CFR 239.7102-2 - Compromising emanations-TEMPEST or other standard.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...-TEMPEST or other standard. 239.7102-2 Section 239.7102-2 Federal Acquisition Regulations System DEFENSE... INFORMATION TECHNOLOGY Security and Privacy for Computer Systems 239.7102-2 Compromising emanations—TEMPEST or....e., an established National TEMPEST standard (e.g., NACSEM 5100, NACSIM 5100A) or a standard used by...

  17. 48 CFR 239.7102-2 - Compromising emanations-TEMPEST or other standard.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ...-TEMPEST or other standard. 239.7102-2 Section 239.7102-2 Federal Acquisition Regulations System DEFENSE... INFORMATION TECHNOLOGY Security and Privacy for Computer Systems 239.7102-2 Compromising emanations—TEMPEST or....e., an established National TEMPEST standard (e.g., NACSEM 5100, NACSIM 5100A) or a standard used by...

  18. 48 CFR 239.7102-2 - Compromising emanations-TEMPEST or other standard.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ...-TEMPEST or other standard. 239.7102-2 Section 239.7102-2 Federal Acquisition Regulations System DEFENSE... INFORMATION TECHNOLOGY Security and Privacy for Computer Systems 239.7102-2 Compromising emanations—TEMPEST or....e., an established National TEMPEST standard (e.g., NACSEM 5100, NACSIM 5100A) or a standard used by...

  19. Pressurized thermal shock: TEMPEST computer code simulation of thermal mixing in the cold leg and downcomer of a pressurized water reactor. [Creare 61 and 64

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eyler, L.L.; Trent, D.S.

    The TEMPEST computer program was used to simulate fluid and thermal mixing in the cold leg and downcomer of a pressurized water reactor under emergency core cooling high-pressure injection (HPI), which is of concern to the pressurized thermal shock (PTS) problem. Application of the code was made in performing an analysis simulation of a full-scale Westinghouse three-loop plant design cold leg and downcomer. Verification/assessment of the code was performed and analysis procedures developed using data from Creare 1/5-scale experimental tests. Results of three simulations are presented. The first is a no-loop-flow case with high-velocity, low-negative-buoyancy HPI in a 1/5-scale model of a cold leg and downcomer. The second is a no-loop-flow case with low-velocity, high-negative-density (modeled with salt water) injection in a 1/5-scale model. Comparison of TEMPEST code predictions with experimental data for these two cases shows good agreement. The third simulation is a three-dimensional model of one loop of a full-size Westinghouse three-loop plant design. Included in this latter simulation are loop components extending from the steam generator to the reactor vessel and a one-third sector of the vessel downcomer and lower plenum. No data were available for this case. For the Westinghouse plant simulation, thermally coupled conduction heat transfer in structural materials is included. The cold leg pipe and fluid mixing volumes of the primary pump, the stillwell, and the riser to the steam generator are included in the model. In the reactor vessel, the thermal shield, pressure vessel cladding, and pressure vessel wall are thermally coupled to the fluid and thermal mixing in the downcomer. The inlet plenum mixing volume is included in the model. A 10-min (real time) transient beginning at the initiation of HPI is computed to determine temperatures at the beltline of the pressure vessel wall.

  20. Tempest: Tools for Addressing the Needs of Next-Generation Climate Models

    NASA Astrophysics Data System (ADS)

    Ullrich, P. A.; Guerra, J. E.; Pinheiro, M. C.; Fong, J.

    2015-12-01

    Tempest is a comprehensive simulation-to-science infrastructure that tackles the needs of next-generation, high-resolution, data intensive climate modeling activities. This project incorporates three key components: TempestDynamics, a global modeling framework for experimental numerical methods and high-performance computing; TempestRemap, a toolset for arbitrary-order conservative and consistent remapping between unstructured grids; and TempestExtremes, a suite of detection and characterization tools for identifying weather extremes in large climate datasets. In this presentation, the latest advances with the implementation of this framework will be discussed, and a number of projects now utilizing these tools will be featured.

  1. Strategy Plan A Methodology to Predict the Uniformity of Double-Shell Tank Waste Slurries Based on Mixing Pump Operation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    J.A. Bamberger; L.M. Liljegren; P.S. Lowery

    This document presents an analysis of the mechanisms influencing mixing within double-shell slurry tanks. A research program to characterize mixing of slurries within tanks has been proposed. The research program presents a combined experimental and computational approach to produce correlations describing the tank slurry concentration profile (and therefore uniformity) as a function of mixer pump operating conditions. The TEMPEST computer code was used to simulate both a full-scale (prototype) and scaled (model) double-shell waste tank to predict flow patterns resulting from a stationary jet centered in the tank. The simulation results were used to evaluate flow patterns in the tank and to determine whether flow patterns are similar between the full-scale prototype and an existing 1/12-scale model tank. The flow patterns were sufficiently similar to recommend conducting scoping experiments at 1/12 scale. Also, TEMPEST-modeled velocity profiles of the near-floor jet were compared to experimental measurements, with good agreement. Reported values of physical properties of double-shell tank slurries were analyzed to evaluate the range of properties appropriate for conducting scaled experiments. One-twelfth-scale scoping experiments are recommended to confirm the prioritization of the dimensionless groups (gravitational settling, Froude, and Reynolds numbers) that affect slurry suspension in the tank. Two of the proposed 1/12-scale test conditions were modeled using the TEMPEST computer code to observe the anticipated flow fields. This information will be used to guide selection of sampling probe locations. Additional computer modeling is being conducted to model a particulate-laden, rotating jet centered in the tank. The results of this modeling effort will be compared to the scaled experimental data to quantify the agreement between the code and the 1/12-scale experiment. The scoping experiment results will guide selection of parameters to be varied in the follow-on experiments. Data from the follow-on experiments will be used to develop correlations to describe the slurry concentration profile as a function of mixing pump operating conditions. These data will also be used to further evaluate the computer model applications. If the agreement between the experimental data and the code predictions is good, the computer code will be recommended for use in predicting slurry uniformity in the tanks under various operating conditions. If the agreement between the code predictions and experimental results is not good, the experimental data correlations will be used to predict slurry uniformity in the tanks within the range of correlation applicability.
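
    For orientation, commonly used forms of the dimensionless groups named above are written below; the report's exact definitions and reference scales may differ, so these should be read as assumptions for illustration:

      \mathrm{Re} = \frac{U_0 D}{\nu}, \qquad
      \mathrm{Fr} = \frac{U_0}{\sqrt{g' D}}, \qquad
      g' = g\,\frac{\rho_s - \rho_l}{\rho_l}, \qquad
      G_s = \frac{u_s}{U_0},

    where U_0 and D are the jet discharge velocity and nozzle diameter, u_s the particle settling velocity, and ρ_s and ρ_l the solid and liquid densities; matching Re, Fr, and the gravitational-settling parameter G_s between the 1/12-scale model and the full-scale tank is what the proposed scoping experiments are intended to prioritize.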

  2. Enabling Global Observations of Clouds and Precipitation on Fine Spatio-Temporal Scales from CubeSat Constellations: Temporal Experiment for Storms and Tropical Systems Technology Demonstration (TEMPEST-D)

    NASA Astrophysics Data System (ADS)

    Reising, S. C.; Todd, G.; Padmanabhan, S.; Lim, B.; Heneghan, C.; Kummerow, C.; Chandra, C. V.; Berg, W. K.; Brown, S. T.; Pallas, M.; Radhakrishnan, C.

    2017-12-01

    The Temporal Experiment for Storms and Tropical Systems (TEMPEST) mission concept consists of a constellation of 5 identical 6U-Class satellites observing storms at 5 millimeter-wave frequencies with 5-10 minute temporal sampling to observe the time evolution of clouds and their transition to precipitation. Such a small satellite mission would enable the first global measurements of clouds and precipitation on the time scale of tens of minutes and the corresponding spatial scale of a few km. TEMPEST is designed to improve the understanding of cloud processes by providing critical information on temporal signatures of precipitation and helping to constrain one of the largest sources of uncertainty in cloud models. TEMPEST millimeter-wave radiometers are able to perform remote observations of the cloud interior to observe microphysical changes as the cloud begins to precipitate or ice accumulates inside the storm. The TEMPEST technology demonstration (TEMPEST-D) mission is in progress to raise the TRL of the instrument and spacecraft systems from 6 to 9 as well as to demonstrate radiometer measurement and differential drag capabilities required to deploy a constellation of 6U-Class satellites in a single orbital plane. The TEMPEST-D millimeter-wave radiometer instrument provides observations at 89, 165, 176, 180 and 182 GHz using a single compact instrument designed for 6U-Class satellites. The direct-detection topology of the radiometer receiver substantially reduces both its power consumption and design complexity compared to heterodyne receivers. The TEMPEST-D instrument performs precise, end-to-end calibration using a cross-track scanning reflector to view an ambient blackbody calibration target and the cosmic microwave background every scan period. The TEMPEST-D radiometer instrument has been fabricated and successfully tested under the environmental conditions (vibration, thermal cycling and vacuum) expected in low-Earth orbit. TEMPEST-D began in Aug. 2015, with a rapid 2.5-year development to deliver a complete spacecraft with integrated payload by Feb. 2018. TEMPEST-D has been manifested through NASA CSLI and is planned for launch on ELaNa-23 aboard a Cygnus spacecraft on an Antares rocket to the ISS in Mar. 2018. The TEMPEST-D satellite is expected to be deployed into a 400-km orbit at 51.6° inclination a few months after arrival at the ISS.

  3. Electric Propulsion Test & Evaluation Methodologies for Plasma in the Environments of Space and Testing (EP TEMPEST) (Briefing Charts)

    DTIC Science & Technology

    2015-04-01

    in the Environments of Space and Testing (EP TEMPEST ) - Program Review (Briefing Charts) 5a. CONTRACT NUMBER In-House 5b. GRANT NUMBER 5c...of Space and Testing (EP TEMPEST ) AFOSR T&E Program Review 13-17 April 2015 Dr. Daniel L. Brown In-Space Propulsion Branch (RQRS) Aerospace Systems...Statement A: Approved for public release; distribution is unlimited. EP TEMPEST (Lab Task, FY14-FY16) Program Goals and Objectives Title: Electric

  4. 32 CFR 623.7 - Reports.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Tempest Rapid Materiel Report in message form and sent electrically. The message report will be prepared according to Army Regulation 500-60. (2) Daily message reports. Tempest Rapid Daily Materiel Reports of Army... line. (3) Final reports. In addition to the final Tempest Rapid Daily Materiel Report, a final report...

  5. Koinonia: The Requirements and Vision for an Unclassified Information-Sharing System

    DTIC Science & Technology

    2013-06-01

    of an effort to share information with multinational partners in Multinational Planning Augmentation Team (MPAT) ( Tempest Express Fact Sheet 2011... Tempest fact sheet. Global Security.org. May 7, 2011. Accessed May 3, 2013. http://www.globalsecurity.org/military/ops/ tempest -express.htm U.S

  6. 78 FR 25531 - Requested Administrative Waiver of the Coastwise Trade Laws: Vessel TEMPEST; Invitation for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-01

    ... DEPARTMENT OF TRANSPORTATION Maritime Administration [Docket No. MARAD-2013-0046] Requested Administrative Waiver of the Coastwise Trade Laws: Vessel TEMPEST; Invitation for Public Comments AGENCY... TEMPEST is: Intended Commercial Use of Vessel: ``Offshore wreck diving.'' Geographic Region: Maine, New...

  7. 32 CFR 623.7 - Reports.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Tempest Rapid Materiel Report in message form and sent electrically. The message report will be prepared according to Army Regulation 500-60. (2) Daily message reports. Tempest Rapid Daily Materiel Reports of Army... line. (3) Final reports. In addition to the final Tempest Rapid Daily Materiel Report, a final report...

  8. Tempest: Accelerated MS/MS database search software for heterogeneous computing platforms

    PubMed Central

    Adamo, Mark E.; Gerber, Scott A.

    2017-01-01

    MS/MS database search algorithms derive a set of candidate peptide sequences from in-silico digest of a protein sequence database, and compute theoretical fragmentation patterns to match these candidates against observed MS/MS spectra. The original Tempest publication described these operations mapped to a CPU-GPU model, in which the CPU generates peptide candidates that are asynchronously sent to a discrete GPU to be scored against experimental spectra in parallel (Milloy et al., 2012). The current version of Tempest expands this model, incorporating OpenCL to offer seamless parallelization across multicore CPUs, GPUs, integrated graphics chips, and general-purpose coprocessors. Three protocols describe how to configure and run a Tempest search, including discussion of how to leverage Tempest's unique feature set to produce optimal results. PMID:27603022
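
    The CPU-produces/workers-score pattern described above can be illustrated with the generic Python sketch below. This is not the Tempest code or its API: the peptide list, spectrum peaks, residue masses, and shared-peak-count score are all hypothetical placeholders used only to show the parallel scoring stage.

      # Illustrative producer/consumer sketch (not Tempest): candidates generated
      # on the host are scored against an observed spectrum by a worker pool.
      from multiprocessing import Pool

      OBSERVED_MZ = {72.0, 129.1, 257.2, 344.2}   # hypothetical spectrum peaks (m/z)

      def fragment_mz(peptide):
          """Toy b-ion-like fragment masses from approximate residue masses."""
          masses = {"A": 71.037, "G": 57.021, "K": 128.095, "R": 156.101, "S": 87.032}
          total, frags = 1.008, []
          for aa in peptide[:-1]:
              total += masses.get(aa, 100.0)
              frags.append(round(total, 3))
          return frags

      def score(peptide):
          """Shared-peak-count score of one candidate against the observed spectrum."""
          matched = sum(1 for mz in fragment_mz(peptide)
                        if any(abs(mz - obs) < 0.5 for obs in OBSERVED_MZ))
          return peptide, matched

      if __name__ == "__main__":
          candidates = ["AGKSR", "GASKR", "KAGSR", "SAGKR"]   # stand-in for in-silico digest
          with Pool(processes=4) as pool:                     # parallel scoring stage
              results = pool.map(score, candidates)
          print(max(results, key=lambda r: r[1]))             # best-scoring candidate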

  9. Remote Sensing of Precipitation from 6U-Class Small Satellite Constellations: Temporal Experiment for Storms and Tropical Systems Technology Demonstration (TEMPEST-D)

    NASA Astrophysics Data System (ADS)

    Reising, S. C.; Gaier, T.; Kummerow, C. D.; Chandra, C. V.; Padmanabhan, S.; Lim, B.; Heneghan, C.; Berg, W. K.; Olson, J. P.; Brown, S. T.; Carvo, J.; Pallas, M.

    2016-12-01

    The Temporal Experiment for Storms and Tropical Systems (TEMPEST) mission concept consists of a constellation of 5 identical 6U-Class nanosatellites observing at 5 millimeter-wave frequencies with 5-minute temporal sampling to observe the time evolution of clouds and their transition to precipitation. The TEMPEST concept is designed to improve the understanding of cloud processes, by providing critical information on the time evolution of cloud and precipitation microphysics and helping to constrain one of the largest sources of uncertainty in climate models. TEMPEST millimeter-wave radiometers are able to make observations in the cloud to observe changes as the cloud begins to precipitate or ice accumulates inside the storm. Such a constellation deployed near 400 km altitude and 50°-65° inclination is expected to capture more than 3 million observations of precipitation during a one-year mission, including over 100,000 deep convective events. The TEMPEST Technology Demonstration (TEMPEST-D) mission will be deployed to raise the TRL of the instrument and key satellite systems as well as to demonstrate measurement capabilities required for a constellation of 6U-Class nanosatellites to directly observe the temporal development of clouds and study the conditions that control their transition from non-precipitating to precipitating clouds. A partnership among Colorado State University (Lead Institution), NASA/Caltech Jet Propulsion Laboratory and Blue Canyon Technologies, TEMPEST-D will provide observations at five millimeter-wave frequencies from 89 to 183 GHz using a single compact instrument that is well suited for the 6U-Class architecture. The top-level requirements for the 90-day TEMPEST-D mission are to: (1) demonstrate precision inter-satellite calibration between TEMPEST-D and one other orbiting radiometer (e.g. GPM or MHS) measuring at similar frequencies; and (2) demonstrate orbital drag maneuvers to control altitude, as verified by GPS, sufficient to achieve relative positioning in a constellation of 6U-Class nanosatellites. The TEMPEST-D 6U-Class satellite is planned to be delivered in July 2017 for launch through NASA CSLI no later than March 2018.

  10. Shakespeare for the 1990s: A Multicultural Tempest.

    ERIC Educational Resources Information Center

    Carey-Webb, Allen

    1993-01-01

    Argues that William Shakespeare's "The Tempest" is the play best suited for the high school English curriculum of the 1990s. Discusses historical and critical aspects of the play's key themes. Shows ways of using the play in high school classes, and describes 19 works to read alongside "The Tempest." (HB)

  11. Quicksilver: Middleware for Scalable Self-Regenerative Systems

    DTIC Science & Technology

    2006-04-01

    Applications can be coded in any of about 25 programming languages ranging from the obvious ones to some very obscure languages , such as OCaml ...technology. Like Tempest, Quicksilver can support applications written in any of a wide range of programming languages supported by .NET. However, whereas...so that developers can work in standard languages and with standard tools and still exploit those solutions. Vendors need to see some success

  12. [Cardiologic emergencies and natural disaster. Prospective study with Xynthia tempest].

    PubMed

    Trebouet, E; Lipp, D; Dimet, J; Orion, L; Fradin, P

    2011-02-01

    Stress-induced cardiomyopathy and ischemic cardiopathy have been described after natural disasters such as earthquakes. The aim was to count stress-induced cardiomyopathies and ischemic cardiopathies just after the Xynthia tempest, which damaged the Vendean coast in February 2010, in order to study their epidemiology. Included patients were living in a tempest-damaged village, were admitted to the Vendee hospital just after or in the week following the tempest, and presented a suspected acute coronary syndrome or stress-induced cardiomyopathy. Among the 3350 inhabitants of the two damaged Vendean towns, we counted three acute coronary syndromes, two Tako-Tsubo cardiomyopathies, and one coronary spasm, in five women and one man with an average age of 76. Ischemic cardiopathy and stress-induced cardiomyopathy were thus over-represented in this tempest-damaged population, a phenomenon that has been little described. Copyright © 2010 Elsevier Masson SAS. All rights reserved.

  13. Tempest: Accelerated MS/MS Database Search Software for Heterogeneous Computing Platforms.

    PubMed

    Adamo, Mark E; Gerber, Scott A

    2016-09-07

    MS/MS database search algorithms derive a set of candidate peptide sequences from in silico digest of a protein sequence database, and compute theoretical fragmentation patterns to match these candidates against observed MS/MS spectra. The original Tempest publication described these operations mapped to a CPU-GPU model, in which the CPU (central processing unit) generates peptide candidates that are asynchronously sent to a discrete GPU (graphics processing unit) to be scored against experimental spectra in parallel. The current version of Tempest expands this model, incorporating OpenCL to offer seamless parallelization across multicore CPUs, GPUs, integrated graphics chips, and general-purpose coprocessors. Three protocols describe how to configure and run a Tempest search, including discussion of how to leverage Tempest's unique feature set to produce optimal results. © 2016 by John Wiley & Sons, Inc. Copyright © 2016 John Wiley & Sons, Inc.

  14. Recent Advances in the Tempest UAS for In-Situ Measurements in Highly-Dynamic Environments

    NASA Astrophysics Data System (ADS)

    Argrow, B. M.; Frew, E.; Houston, A. L.; Weiss, C.

    2014-12-01

    The spring 2010 deployment of the Tempest UAS during the VORTEX2 field campaign verified that a small UAS, supported by a customized mobile communications, command, and control (C3) architecture, could simultaneously satisfy Federal Aviation Administration (FAA) airspace requirements and make in-situ thermodynamic measurements in supercell thunderstorms. A multi-hole airdata probe was recently integrated into the Tempest UAS airframe, and verification flights were made in spring 2013 to collect in-situ wind measurements behind gust fronts produced by supercell thunderstorms in northeast Colorado. Using instantaneous aircraft attitude estimates from the autopilot, the in-situ measurements were converted to inertial wind estimates, and estimates of the uncertainty in the wind measurements were examined. To date, the limited deployments of the Tempest UAS have primarily focused on addressing the engineering and regulatory requirements to conduct supercell research, and the Tempest UAS team of engineers and meteorologists is preparing for deployments focused on collecting targeted data for meteorological exploration and hypothesis testing. We describe the recent expansion of the operations area and altitude ceiling of the Tempest UAS, engineering issues for accurate inertial wind estimates, new concepts of operation that include the simultaneous deployment of multiple aircraft with mobile ground stations, and a brief description of our current effort to develop a capability for the Tempest UAS to perform autonomous path planning to maximize energy harvesting from the local wind field for increased endurance.
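
    The conversion from probe measurements to inertial wind referred to above is, in its standard wind-triangle form (the team's actual estimator and filtering may differ, so this is only a schematic statement):

      \mathbf v_{\rm wind} \;=\; \mathbf v_{\rm ground} \;-\; R^{\,i}_{\,b}(\phi,\theta,\psi)\,\mathbf v^{\,b}_{\rm air},

    where v_ground comes from GPS/INS, v_air^b is the airspeed vector measured by the multi-hole probe in the body frame, and R^i_b is the body-to-inertial rotation built from the autopilot's roll, pitch, and yaw estimates; errors in the attitude estimate therefore map directly into the wind-estimate uncertainty examined above.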

  15. Numerical simulation of jet mixing concepts in Tank 241-SY-101

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trent, D.S.; Michener, T.E.

    The episodic gas release events (GREs) that have characterized the behavior of Tank 241-SY-101 for the past several years are thought to result from gases generated by the waste material in the tank that become trapped in the layer of settled solids at the bottom. Several concepts for mitigating the GREs have been proposed. One concept involves mobilizing the solid particles with mixing jets. The rationale behind this idea is to prevent formation of a consolidated layer of settled solids at the bottom of the tank, thus inhibiting the accumulation of gas bubbles in this layer. Numerical simulations were conducted using the TEMPEST computer code to assess the viability and effectiveness of the proposed jet discharge concepts and operating parameters. Before these parametric studies were commenced, a series of turbulent jet studies was conducted that established the adequacy of the TEMPEST code for this application. Configurations studied for Tank 241-SY-101 include centrally located downward-discharging jets, draft tubes, and horizontal jets that are either stationary or rotating. Parameter studies included varying the jet discharge velocity, jet diameter, discharge elevation, and material properties. A total of 18 simulations were conducted and are reported in this document. The effect of gas bubbles on the mixing dynamics was not included within the scope of this study.

  16. Thermal modeling of tanks 241-AW-101 and 241-AN-104 with the TEMPEST code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Antoniak, Z.I.; Recknagle, K.P.

    The TEMPEST code was exercised in a preliminary study of the thermal behavior of double-shell Tanks 241-AW-101 and 241-AN-104. The two-dimensional model used is derived from our earlier studies on heat transfer from Tank 241-SY-101. Several changes were made to the model to simulate the waste and conditions in 241-AW-101 and 241-AN-104. The nonconvective waste layer was assumed to be 254 cm (100 in.) thick for Tank 241-AW-101, and 381 cm (150 in.) in Tank 241-AN-104. The remaining waste was assumed, for each tank, to consist of a convective layer with a 7.6-cm (3-inch) crust on top. The waste heat loads for 241-AW-101 and 241-AN-104 were taken to be 10 kW (3.4E4 Btu/hr) and 12 kW (4.0E4 Btu/hr), respectively. Present model predictions of maximum and convecting waste temperatures are within 1.7°C (3°F) of those measured in Tanks 241-AW-101 and 241-AN-104. The difference between the predicted and measured temperature is comparable to the uncertainty of the measurement equipment. These models, therefore, are suitable for estimating the temperatures within the tanks in the event of changing air flows, waste levels, and/or waste configurations.

  17. Turbulent Heating and Wave Pressure in Solar Wind Acceleration Modeling: New Insights to Empirical Forecasting of the Solar Wind

    NASA Astrophysics Data System (ADS)

    Woolsey, L. N.; Cranmer, S. R.

    2013-12-01

    The study of solar wind acceleration has made several important advances recently due to improvements in modeling techniques. Existing code and simulations test the competing theories for coronal heating, which include reconnection/loop-opening (RLO) models and wave/turbulence-driven (WTD) models. In order to compare and contrast the validity of these theories, we need flexible tools that predict the emergent solar wind properties from a wide range of coronal magnetic field structures such as coronal holes, pseudostreamers, and helmet streamers. ZEPHYR (Cranmer et al. 2007) is a one-dimensional magnetohydrodynamics code that includes Alfven wave generation and reflection and the resulting turbulent heating to accelerate solar wind in open flux tubes. We present the ZEPHYR output for a wide range of magnetic field geometries to show the effect of the magnetic field profiles on wind properties. We also investigate the competing acceleration mechanisms found in ZEPHYR to determine the relative importance of increased gas pressure from turbulent heating and the separate pressure source from the Alfven waves. To do so, we developed a code that will become publicly available for solar wind prediction. This code, TEMPEST, provides an outflow solution based on only one input: the magnetic field strength as a function of height above the photosphere. It uses correlations found in ZEPHYR between the magnetic field strength at the source surface and the temperature profile of the outflow solution to compute the wind speed profile based on the increased gas pressure from turbulent heating. With this initial solution, TEMPEST then adds in the Alfven wave pressure term to the modified Parker equation and iterates to find a stable solution for the wind speed. This code, therefore, can make predictions of the wind speeds that will be observed at 1 AU based on extrapolations from magnetogram data, providing a useful tool for empirical forecasting of the solar wind.
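
    Schematically, the equation being iterated is an area-modified Parker momentum equation. The form below is a simplified version in which the gas-pressure (turbulent heating) and Alfven-wave-pressure contributions are folded into a single effective critical speed u_c; the exact terms used in ZEPHYR and TEMPEST include additional reflection-dependent wave factors not shown, so this should be read as an illustrative assumption:

      \left(u - \frac{u_c^{2}}{u}\right)\frac{du}{dr}
        \;=\; -\,\frac{G M_\odot}{r^{2}} \;+\; u_c^{2}\,\frac{d\ln A}{dr},

    where A(r) is the flux-tube cross-sectional area. TEMPEST's two-stage algorithm corresponds to first solving this equation with only the gas-pressure part of u_c^2 and then re-solving with the wave-pressure contribution added, which is what allows the two acceleration channels to be separated.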

  18. Numerical Methods for Nonlinear Fokker-Planck Collision Operator in TEMPEST

    NASA Astrophysics Data System (ADS)

    Kerbel, G.; Xiong, Z.

    2006-10-01

    Early implementations of the Fokker-Planck collision operator and moment computations in TEMPEST used low-order polynomial interpolation schemes to reuse conservative operators developed for speed/pitch-angle (v, θ) coordinates. When this approach proved to be too inaccurate, we developed an alternative higher-order interpolation scheme for the Rosenbluth potentials and a high-order finite volume method in the TEMPEST (E, μ) coordinates. The collision operator is thus generated by using the expansion technique in (v, θ) coordinates for the diffusion coefficients only, and then the fluxes for the conservative differencing are computed directly in the TEMPEST (E, μ) coordinates. Combined with a cut-cell treatment at the turning-point boundary, this new approach is shown to have much better accuracy and conservation properties.

  19. 32 CFR 2001.51 - Technical security.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Surveillance Countermeasures and TEMPEST necessary to detect or deter exploitation of classified information..., TEMPEST Countermeasures for Facilities, and SPB Issuance 6-97, National Policy on Technical Surveillance...

  20. 32 CFR 2001.51 - Technical security.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Surveillance Countermeasures and TEMPEST necessary to detect or deter exploitation of classified information..., TEMPEST Countermeasures for Facilities, and SPB Issuance 6-97, National Policy on Technical Surveillance...

  1. 32 CFR 2001.51 - Technical security.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Surveillance Countermeasures and TEMPEST necessary to detect or deter exploitation of classified information..., TEMPEST Countermeasures for Facilities, and SPB Issuance 6-97, National Policy on Technical Surveillance...

  2. 32 CFR 2001.51 - Technical security.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Surveillance Countermeasures and TEMPEST necessary to detect or deter exploitation of classified information..., TEMPEST Countermeasures for Facilities, and SPB Issuance 6-97, National Policy on Technical Surveillance...

  3. 32 CFR 2001.51 - Technical security.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Surveillance Countermeasures and TEMPEST necessary to detect or deter exploitation of classified information..., TEMPEST Countermeasures for Facilities, and SPB Issuance 6-97, National Policy on Technical Surveillance...

  4. Temporal Experiment for Storms and Tropical Systems Technology Demonstration (TEMPEST-D): Risk Reduction for 6U-Class Nanosatellite Constellations

    NASA Astrophysics Data System (ADS)

    Reising, S. C.; Todd, G.; Kummerow, C. D.; Chandrasekar, V.; Padmanabhan, S.; Lim, B.; Brown, S. T.; van den Heever, S. C.; L'Ecuyer, T.; Ruf, C. S.; Luo, Z. J.; Munchak, S. J.; Haddad, Z. S.; Boukabara, S. A.

    2015-12-01

    The Temporal Experiment for Storms and Tropical Systems Technology Demonstration (TEMPEST-D) is designed to demonstrate required technology to enable a constellation of 6U-Class nanosatellites to directly observe the time evolution of clouds and study the conditions that control the transition of clouds to precipitation using high-temporal resolution observations. TEMPEST millimeter-wave radiometers in the 90-GHz to 183-GHz frequency range penetrate into the cloud to observe key changes as the cloud begins to precipitate or ice accumulates inside the storm. The evolution of ice formation in clouds is important for climate prediction since it largely drives Earth's radiation budget. TEMPEST improves understanding of cloud processes and helps to constrain one of the largest sources of uncertainty in climate models. TEMPEST-D provides observations at five millimeter-wave frequencies from 90 to 183 GHz using a single compact instrument that is well suited for the 6U-Class architecture and fits well within the capabilities of NASA's CubeSat Launch Initiative (CSLI), for which TEMPEST-D was approved in 2015. For a potential future mission of one year of operations, five identical 6U-Class satellites deployed in the same orbital plane with 5-10 minute spacing at ~400 km altitude and 50°-65° inclination are expected to capture 3 million observations of precipitation, including 100,000 deep convective events. TEMPEST is designed to provide critical information on the time evolution of cloud and precipitation microphysics, yielding a first-order understanding of the behavior of assumptions in current cloud-model parameterizations in diverse climate regimes.

  5. TEMPEST-D Spacecraft

    NASA Image and Video Library

    2018-05-17

    The complete TEMPEST-D spacecraft shown with the solar panels deployed. RainCube, CubeRRT and TEMPEST-D are currently integrated aboard Orbital ATK's Cygnus spacecraft and are awaiting launch on an Antares rocket. After the CubeSats have arrived at the station, they will be deployed into low-Earth orbit and will begin their missions to test these new technologies useful for predicting weather, ensuring data quality, and helping researchers better understand storms. https://photojournal.jpl.nasa.gov/catalog/PIA22458

  6. TEMPEST-D MM-Wave Radiometer

    NASA Astrophysics Data System (ADS)

    Padmanabhan, S.; Gaier, T.; Reising, S. C.; Lim, B.; Stachnik, R. A.; Jarnot, R.; Berg, W. K.; Kummerow, C. D.; Chandrasekar, V.

    2016-12-01

    The TEMPEST-D radiometer is a five-frequency millimeter-wave radiometer at 89, 165, 176, 180, and 182 GHz. The direct-detection architecture of the radiometer reduces its power consumption and eliminates the need for a local oscillator, reducing complexity. The instrument includes a blackbody calibrator and a scanning reflector, which enable precision calibration and cross-track scanning. The MMIC-based millimeter-wave radiometers take advantage of the technology developed under extensive investment by the NASA Earth Science Technology Office (ESTO). The five-frequency millimeter-wave radiometer is built by the Jet Propulsion Laboratory (JPL), which has produced a number of state-of-the-art spaceborne microwave radiometers, such as the Microwave Limb Sounder (MLS), the Advanced Microwave Radiometer (AMR) for Jason-2/OSTM and Jason-3, and the Juno Microwave Radiometer (MWR). The TEMPEST-D instrument design is based on a 165 to 182 GHz radiometer design inherited from RACE and an 89 GHz receiver developed under the ESTO ACT-08 and IIP-10 programs at Colorado State University (CSU) and JPL. The TEMPEST reflector scan and calibration methodology is adapted from the Advanced Technology Microwave Sounder (ATMS) and has been validated on the Global Hawk unmanned aerial vehicle (UAV) using the High Altitude MMIC Sounding Radiometer (HAMSR) instrument. This presentation will focus on the design, development, and performance of the TEMPEST-D radiometer instrument. The flow-down of the TEMPEST-D mission objectives to instrument-level requirements will also be discussed.

  7. Crisis Stability and Long-Range Strike: A Comparative Analysis of Fighters, Bombers, and Missiles

    DTIC Science & Technology

    2013-01-01

    [Search snippet, Table C.1 excerpt: Hawker Tempest aircraft are listed among the strike assets available in the 1947–1949 Kashmir crisis, and B-24, B-57, HF-24, Tempest, Mystère IV, and Ouragan aircraft in a later border dispute; in each case the aircraft were present but not brandished or employed.]

  8. Simulation of the wastewater temperature in sewers with TEMPEST.

    PubMed

    Dürrenmatt, David J; Wanner, Oskar

    2008-01-01

    TEMPEST is a new interactive simulation program for the estimation of the wastewater temperature in sewers. Intuitive graphical user interfaces assist the user in managing data, performing calculations and plotting results. The program calculates the dynamics and longitudinal spatial profiles of the wastewater temperature in sewer lines. Interactions between wastewater, sewer air and surrounding soil are modeled in TEMPEST by mass balance equations, rate expressions found in the literature and a new empirical model of the airflow in the sewer. TEMPEST was developed as a tool which can be applied in practice, i.e., it requires as few input data as possible. These data include the upstream wastewater discharge and temperature, geometric and hydraulic parameters of the sewer, material properties of the sewer pipe and surrounding soil, ambient conditions, and estimates of the capacity of openings for air exchange between sewer and environment. Based on a case study it is shown how TEMPEST can be applied to estimate the decrease of the downstream wastewater temperature caused by heat recovery from the sewer. Because the efficiency of nitrification strongly depends on the wastewater temperature, this application is of practical relevance for situations in which the sewer ends at a nitrifying wastewater treatment plant.
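
    The wastewater energy balance that such a model solves along the sewer can be written generically as the 1-D advection equation below. The exchange terms are schematic here (TEMPEST's actual formulation, including the coupled sewer-air balance and the empirical airflow model, contains more detail), so the symbols should be read as assumptions for illustration:

      \rho\,c_p\,A\left(\frac{\partial T_w}{\partial t} + u\,\frac{\partial T_w}{\partial x}\right)
        \;=\; q_{wa}(x,t) \;+\; q_{ws}(x,t),

    where A and u are the wetted cross-section and flow velocity, q_wa is the heat exchanged per unit length with the sewer air, and q_ws the heat exchanged with the surrounding soil through the pipe wall; a heat-recovery installation enters as an additional sink upstream, lowering the downstream temperature profile toward the treatment plant.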

  9. Electronic Warfare Test and Evaluation (Essai et evaluation en matiere de guerre electronique)

    DTIC Science & Technology

    2012-12-01

    Largest known chamber is 80 x 76 x 21 m. Shielding and quiet zones Usually ≥100 dB over at least 0.5 – 18 GHz. TEMPEST grade. Quiet zones: one or...accommodated as an afterthought. The highest level of RF/EO/IR/UV security control is offered by TEMPEST -grade aircraft-sized anechoic chambers. 6.9.7 SUT...aircraft-sized, RF- and laser-shielded anechoic chamber, shielded rooms, and an EW Sub-System Test Laboratory, all TEMPEST grade. It is co-located with the

  10. Anticipating Reader Response: Why I Chose "The Tempest" for English Literature Survey.

    ERIC Educational Resources Information Center

    Jones, Dan C.

    1985-01-01

    Argues in favor of a reader-response approach to the process of selecting the literary works students read in introductory or survey courses. Offers a rationale for using "The Tempest" in such a course. (FL)

  11. Idempotent Methods for Continuous Time Nonlinear Stochastic Control

    DTIC Science & Technology

    2012-09-13

    [Search snippet, report documentation page: performing organization Stochastech Corporation dba Tempest Technologies, 8939 S. Sepulveda Boulevard, Suite 506, Los Angeles, CA 90045; author Ben G. Fitzpatrick; topic: continuous-time nonlinear stochastic control problems.]

  12. Exploring the temporal structure of heterochronous sequences using TempEst (formerly Path-O-Gen).

    PubMed

    Rambaut, Andrew; Lam, Tommy T; Max Carvalho, Luiz; Pybus, Oliver G

    2016-01-01

    Gene sequences sampled at different points in time can be used to infer molecular phylogenies on a natural timescale of months or years, provided that the sequences in question undergo measurable amounts of evolutionary change between sampling times. Data sets with this property are termed heterochronous and have become increasingly common in several fields of biology, most notably the molecular epidemiology of rapidly evolving viruses. Here we introduce the cross-platform software tool, TempEst (formerly known as Path-O-Gen), for the visualization and analysis of temporally sampled sequence data. Given a molecular phylogeny and the dates of sampling for each sequence, TempEst uses an interactive regression approach to explore the association between genetic divergence through time and sampling dates. TempEst can be used to (1) assess whether there is sufficient temporal signal in the data to proceed with phylogenetic molecular clock analysis, and (2) identify sequences whose genetic divergence and sampling date are incongruent. Examination of the latter can help identify data quality problems, including errors in data annotation, sample contamination, sequence recombination, or alignment error. We recommend that all users of the molecular clock models implemented in BEAST first check their data using TempEst prior to analysis.
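
    The regression TempEst performs interactively can be sketched in a few lines of Python. The sampling dates and root-to-tip divergences below are hypothetical, and the real tool computes the divergences from the supplied phylogeny rather than taking them as input:

      # Minimal sketch of root-to-tip regression (not the TempEst implementation).
      import numpy as np
      from scipy import stats

      sampling_dates = np.array([2009.2, 2010.5, 2011.7, 2013.1, 2014.8, 2016.3])  # hypothetical
      root_to_tip    = np.array([0.0021, 0.0034, 0.0047, 0.0062, 0.0080, 0.0095])  # subs/site

      fit = stats.linregress(sampling_dates, root_to_tip)
      residuals = root_to_tip - (fit.intercept + fit.slope * sampling_dates)

      print(f"clock rate ~ {fit.slope:.2e} substitutions/site/year")
      print(f"root date  ~ {-fit.intercept / fit.slope:.1f}")   # x-intercept (root age estimate)
      print(f"R^2        = {fit.rvalue**2:.3f}")
      print("largest residual at date", sampling_dates[np.argmax(np.abs(residuals))])

    A steep, tight regression line indicates sufficient temporal signal for molecular clock analysis, while tips with large residuals are the candidates for annotation errors, contamination, recombination, or alignment problems mentioned above.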

  13. Evolution of plastic anisotropy for high-strain-rate computations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schiferl, S.K.; Maudlin, P.J.

    1994-12-01

    A model for anisotropic material strength, and for changes in the anisotropy due to plastic strain, is described. This model has been developed for use in high-rate, explicit, Lagrangian multidimensional continuum-mechanics codes. The model handles anisotropies in single-phase materials, in particular the anisotropies due to crystallographic texture--preferred orientations of the single-crystal grains. Textural anisotropies, and the changes in these anisotropies, depend overwhelmingly on the crystal structure of the material and on the deformation history. The changes, particularly for complex deformations, are not amenable to simple analytical forms. To handle this problem, the material model described here includes a texture code, or micromechanical calculation, coupled to a continuum code. The texture code updates grain orientations as a function of tensor plastic strain, and calculates the yield strength in different directions. A yield function is fitted to these yield points. For each computational cell in the continuum simulation, the texture code tracks a particular set of grain orientations. The orientations will change due to the tensor strain history, and the yield function will change accordingly. Hence, the continuum code supplies a tensor strain to the texture code, and the texture code supplies an updated yield function to the continuum code. Since significant texture changes require relatively large strains--typically, a few percent or more--the texture code is not called very often, and the increase in computer time is not excessive. The model was implemented using a finite-element continuum code and a texture code specialized for hexagonal-close-packed crystal structures. The results for several uniaxial stress problems and an explosive-forming problem are shown.
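
    The coupling logic described above (the continuum code supplies a tensor strain increment every step; the texture code is called only after a few percent of accumulated plastic strain and returns an updated yield function) can be sketched as follows. Everything here, including the threshold value and the toy texture_update, is a schematic stand-in rather than the implementation described in the report:

      # Schematic continuum-texture coupling loop (illustrative only).
      import numpy as np

      STRAIN_THRESHOLD = 0.02      # re-texture after ~2% accumulated plastic strain (assumed)

      def texture_update(orientations, dstrain):
          """Stand-in for the micromechanical texture code: nudge grain orientations
          with the strain increment and return refitted yield parameters (toy model)."""
          orientations = orientations + 0.1 * float(np.trace(dstrain))
          yield_params = {"sigma_y": 250.0 * (1.0 + 0.5 * float(np.std(orientations)))}
          return orientations, yield_params

      def continuum_step(cell, dstrain):
          """One explicit Lagrangian step for a single cell (heavily simplified)."""
          cell["accumulated_strain"] += abs(float(np.trace(dstrain)))
          if cell["accumulated_strain"] > STRAIN_THRESHOLD:      # infrequent texture call
              cell["orientations"], cell["yield"] = texture_update(cell["orientations"], dstrain)
              cell["accumulated_strain"] = 0.0
          return cell

      cell = {"orientations": np.zeros(100), "yield": {"sigma_y": 250.0}, "accumulated_strain": 0.0}
      for step in range(10):
          dstrain = 0.004 * np.eye(3)                            # hypothetical strain increment
          cell = continuum_step(cell, dstrain)
      print(cell["yield"])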

  14. Dilution physics modeling: Dissolution/precipitation chemistry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Onishi, Y.; Reid, H.C.; Trent, D.S.

    This report documents progress made to date on integrating dilution/precipitation chemistry and new physical models into the TEMPEST thermal-hydraulics computer code. Implementation of dissolution/precipitation chemistry models is necessary for predicting nonhomogeneous, time-dependent, physical/chemical behavior of tank wastes with and without a variety of possible engineered remediation and mitigation activities. Such behavior includes chemical reactions, gas retention, solids resuspension, solids dissolution and generation, solids settling/rising, and convective motion of physical and chemical species. Thus this model development is important from the standpoint of predicting the consequences of various engineered activities, such as mitigation by dilution, retrieval, or pretreatment, that can affect safe operations. The integration of a dissolution/precipitation chemistry module allows the various phase species concentrations to enter into the physical calculations that affect the TEMPEST hydrodynamic flow calculations. The yield strength model of non-Newtonian sludge correlates yield to a power function of solids concentration. Likewise, shear stress is concentration-dependent, and the dissolution/precipitation chemistry calculations develop the species concentration evolution that produces fluid flow resistance changes. Dilution of waste with pure water, molar concentrations of sodium hydroxide, and other chemical streams can be analyzed for the reactive species changes and hydrodynamic flow characteristics.
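
    The yield-strength correlation referred to above has the power-law form below, with coefficients a and b standing in for fitted values that are not quoted in this abstract:

      \tau_y \;=\; a\,C_s^{\,b},

    where C_s is the local solids (undissolved species) concentration; as the dissolution/precipitation module updates the species concentrations, the resulting change in C_s feeds back through this correlation (and the concentration-dependent shear stress) into the flow resistance seen by the TEMPEST hydrodynamics.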

  15. Enhancing Electromagnetic Side-Channel Analysis in an Operational Environment

    DTIC Science & Technology

    2013-09-01

    phenomenon of compromising power and EM emissions has been known and exploited for decades. Declassified TEMPEST documents reveal vulnerabilities of...Components. One technique to detect potentially compromising emissions is to use a wide-band receiver tuned to a specific frequency. High-end TEMPEST

  16. Temporal Experiment for Storms and Tropical Systems (TEMPEST) CubeSat Constellation

    NASA Astrophysics Data System (ADS)

    Reising, S. C.; Todd, G.; Padmanabhan, S.; Brown, S. T.; Lim, B.; Kummerow, C. D.; Chandra, C. V.; van den Heever, S. C.; L'Ecuyer, T. S.; Luo, Z. J.; Haddad, Z. S.; Munchak, S. J.; Ruf, C. S.; Berg, G.; Koch, T.; Boukabara, S. A.

    2014-12-01

    TEMPEST addresses key science needs related to cloud and precipitation processes using a constellation of five CubeSats with identical five-frequency millimeter-wave radiometers spaced 5-10 minutes apart in orbit. The deployment of CubeSat constellations on satellite launches of opportunity allows Earth system observations to be accomplished with greater robustness, shorter repeat times and at a small fraction of the cost of typical Earth Science missions. The current suite of Earth-observing satellites is capable of measuring precipitation parameters using radar or radiometric observations. However, these low Earth-orbiting satellites provide only a snapshot of each storm, due to their repeat-pass times of many hours to days. With typical convective events lasting 1-2 hours, it is highly unlikely that the time evolution of clouds through the onset of precipitation will be observed with current assets. The TEMPEST CubeSat constellation directly observes the time evolution of clouds and identifies changes in time to detect the moment of the onset of precipitation. The TEMPEST millimeter-wave radiometers penetrate into the cloud to directly observe changes as the cloud begins to precipitate or ice accumulates inside the storm. The evolution of ice formation in clouds is important for climate prediction because it largely drives Earth's radiation budget. TEMPEST improves understanding of cloud processes and helps to constrain one of the largest sources of uncertainty in climate models. TEMPEST provides observations at five millimeter-wave frequencies from 90 to 183 GHz using a single compact instrument that is well suited for a 6U CubeSat architecture and fits well within the NASA CubeSat Launch Initiative (CSLI) capabilities. Five identical CubeSats deployed in the same orbital plane with 5-10 minute spacing at 390-450 km altitude and 50-65 degree inclination capture 3 million observations of precipitation, including 100,000 deep convective events in a one-year mission. TEMPEST provides critical information on the time evolution of cloud and precipitation microphysics, thereby yielding a first-order understanding of how assumptions in current cloud-model parameterizations behave in diverse climate regimes.

  17. TEMPEST: A three-dimensional time-dependent computer program for hydrothermal analysis: Volume 1, Numerical methods and input instructions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trent, D.S.; Eyler, L.L.; Budden, M.J.

    This document describes the numerical methods, current capabilities, and the use of the TEMPEST (Version L, MOD 2) computer program. TEMPEST is a transient, three-dimensional, hydrothermal computer program that is designed to analyze a broad range of coupled fluid dynamic and heat transfer systems of particular interest to the Fast Breeder Reactor thermal-hydraulic design community. The full three-dimensional, time-dependent equations of motion, continuity, and heat transport are solved for either laminar or turbulent fluid flow, including heat diffusion and generation in both solid and liquid materials. 10 refs., 22 figs., 2 tabs.

  18. Shakespeare's Poetics of Play-Making and Therapeutic Action in "The Tempest."

    ERIC Educational Resources Information Center

    Reed, Melissa Ann

    2000-01-01

    Practices Kenneth Burke's rhetoric of empathic identification to read and understand six levels of consubstantiality between Shakespeare and his Elizabethan audience blueprinted by the authorized text of "The Tempest." Offers implications for the contemporary practices of poetry and drama therapy with participants capable of…

  19. Novel Algorithm/Hardware Partnerships for Real-Time Nonlinear Control

    DTIC Science & Technology

    2014-02-28

    Investigate Tempest Technologies 28 February 2014 Abstract The real-time implementation of controls in nonlinear systems remains one of the great...button for resetting the FPGA board in Max-Plus MVM FPGA system. We utilize the built-in 32MB BPI flash as storage for the Tempest Max-Plus MVM

  20. Asia-Pacific: A Selected Bibliography

    DTIC Science & Technology

    2013-01-01

    www.rsis.edu.sg/publications/Perspective/RSIS0842009.pdf Kurlantzick, Joshua. "Avoiding a Tempest in the South China Sea." Council on Foreign Relations...September 2, 2010. http://www.cfr.org/china/avoiding-tempest-south-china-sea/p22858 Kurlantzick, Joshua. "Growing U.S. Role in South China Sea

  1. A "Tempest" Project: Shakespeare and Critical Conflicts.

    ERIC Educational Resources Information Center

    McCann, Thomas M.; Flanagan, Joseph M.

    2002-01-01

    Describes a 4-week unit of study that focuses on Shakespeare's "The Tempest," a text that has been especially controversial in today's climate of increased multicultural awareness. Involves students in a larger conversation about the possibilities for reading and interpreting literature and prepares them to write mature analyses of the…

  2. Optimal Elevation and Configuration of Hanford's Double-Shell Tank Waste Mixer Pumps

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Onishi, Yasuo; Yokuda, Satoru T.; Majumder, Catherine A.

    The objective of this study was to compare the mixing performance of the Lawrence pump, which has injection nozzles at the top, with an alternative pump that has injection nozzles at the bottom, and to determine the optimal elevation for the alternative pump. Sixteen cases were evaluated: two sludge thicknesses at eight levels. A two-step evaluation approach was used: Step 1 to evaluate all 16 cases with the non-rotating mixer pump model and Step 2 to further evaluate four of those cases with the more realistic rotating mixer pump model. The TEMPEST code was used.

  3. TEMPEST: A three-dimensional time-dependence computer program for hydrothermal analysis: Volume 1, Numerical methods and input instructions: Revision 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trent, D.S.; Eyler, L.L.

    TEMPEST offers simulation capabilities over a wide range of hydrothermal problems that are definable by input instructions. These capabilities are summarized by categories as follows: modeling capabilities; program control; and I/O control. 10 refs., 22 figs., 2 tabs. (LSP)

  4. "The Tempest": A Negotiable Meta-Panopticon

    ERIC Educational Resources Information Center

    Motlagh, Hanieh Mehr

    2015-01-01

    In "The Tempest", Shakespeare represents a world in which the model of a panopticon within a panopticon reveals how the power relations functions. All the major and minor characters establish panopticons which start from their own bodies and soul and move toward the larger one which belongs to that of Prospero as the higher order who has…

  5. Naval Mine Countermeasures: The Achilles Heel of U.S. Homeland Defense

    DTIC Science & Technology

    2013-05-20

    vii. 5 Ibid, viii. 6 Mark Tempest, “Port Security: Sea Mines, UWIEDS and Other Threats,” EagleSpeak. May 1, 2008, http://observer.guardian.co.uk...2013. http://news.yahoo.com/blogs/lookout/nypd-ray-kelly-boston-marathon-bombings-173922894.html. Tempest, Mark. “Port Security: Sea Mines, UWIEDS

  6. Dramatic Prelude: Using Drama To Introduce Classic Literature to Young Readers.

    ERIC Educational Resources Information Center

    Winstead, Anita

    1997-01-01

    This paper describes the work done by a third-grade class to write and present adaptations of Dickens'"A Christmas Carol" and Shakespeare's "The Tempest." Students explored the authors' lives and collectively wrote their own renditions of the stories; the entire short text of their version of "The Tempest" is included. (PB)

  7. TEMPEST. Transient 3-D Thermal-Hydraulic

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eyler, L.L.

    TEMPEST is a transient, three-dimensional, hydrothermal program that is designed to analyze a range of coupled fluid dynamic and heat transfer systems of particular interest to the Fast Breeder Reactor (FBR) thermal-hydraulic design community. The full three-dimensional, time-dependent equations of motion, continuity, and heat transport are solved for either laminar or turbulent fluid flow, including heat diffusion and generation in both solid and liquid materials. The equations governing mass, momentum, and energy conservation for incompressible flows and small density variations (Boussinesq approximation) are solved using finite-difference techniques. Analyses may be conducted in either cylindrical or Cartesian coordinate systems. Turbulence is treated using a two-equation model. Two auxiliary plotting programs, SEQUEL and MANPLOT, for use with TEMPEST output are included. SEQUEL may be operated in batch or interactive mode; it generates data required for vector plots, contour plots of scalar quantities, line plots, grid and boundary plots, and time-history plots. MANPLOT reads the SEQUEL-generated data and creates the hardcopy plots. TEMPEST can be a valuable hydrothermal design analysis tool in areas outside the intended FBR thermal-hydraulic design community.

  8. Observations of Convective Development from Repeat Pass Radiometry during CalWaters 2015: Outlook for the TEMPEST Mission

    NASA Astrophysics Data System (ADS)

    Brown, S. T.

    2015-12-01

    The Temporal Experiment for Storms and Tropical Systems (TEMPEST), which was recently selected as a NASA Earth Ventures technology demonstration mission, uses a constellation of five CubeSats flying in formation to provide observations of developing precipitation with a temporal resolution of 5 minutes. The observations are made using small mm-wave radiometers with frequencies ranging from 90 to 183 GHz, which are sensitive to the integrated ice water path above the precipitation layer in the storm. This paper describes TEMPEST-like observations that were made with the High Altitude MMIC Sounding Radiometer (HAMSR) on the ER-2 during CalWaters 2015. HAMSR is a mm-wave airborne radiometer with 25 channels in three bands: 50, 118, and 183 GHz. During the campaign, a small isolated area of convection was identified by the ER-2 pilot and 5 overpasses of the area were made with about 5 minutes between each pass. The HAMSR data reveal two convective cells, one diminishing and one developing. The mm-wave channels near the 183 GHz water vapor line clearly show the change in the vertical extent of the storm with time, a proxy for vertical velocity. These data demonstrate the potential for TEMPEST-like observations from an orbital vantage point. This paper will provide an overview of the measurements, an analysis of the observations, and perspectives for the TEMPEST mission.

  9. Temporal Experiment for Storms and Tropical Systems Technology Demonstration (TEMPEST-D): Risk Reduction for 6U-Class Nanosatellite Constellations

    NASA Astrophysics Data System (ADS)

    Reising, Steven C.; Gaier, Todd C.; Kummerow, Christian D.; Padmanabhan, Sharmila; Lim, Boon H.; Brown, Shannon T.; Heneghan, Cate; Chandra, Chandrasekar V.; Olson, Jon; Berg, Wesley

    2016-04-01

    TEMPEST-D will reduce the risk, cost and development time of a future constellation of 6U-Class nanosatellites to directly observe the time evolution of clouds and study the conditions that control the transition from non-precipitating to precipitating clouds using high-temporal resolution observations. TEMPEST-D provides passive millimeter-wave observations using a compact instrument that fits well within the size, weight and power (SWaP) requirements of the 6U-Class satellite architecture. TEMPEST-D is suitable for launch through NASA's CubeSat Launch Initiative (CSLI), for which it was selected in February 2015. By measuring the temporal evolution of clouds from the moment of the onset of precipitation, a TEMPEST constellation mission would improve our understanding of cloud processes and help to constrain one of the largest sources of uncertainty in climate models. Knowledge of clouds, cloud processes and precipitation is essential to our understanding of climate change. Uncertainties in the representation of key processes that govern the formation and dissipation of clouds and, in turn, control the global water and energy budgets lead to substantially different predictions of future climate in current models. TEMPEST millimeter-wave radiometers with five frequencies from 89 GHz to 182 GHz penetrate into the cloud to observe key changes as precipitation begins or ice accumulates inside the storm. The evolution of ice formation in clouds is important for climate prediction and a key factor in Earth's radiation budget. TEMPEST is designed to provide critical information on the time evolution of cloud and precipitation, yielding a first-order understanding of assumptions and uncertainties in current cloud parameterizations in general circulation models in diverse climate regimes. For a potential future one-year operational mission, five identical 6U-Class satellites would be deployed in the same orbital plane with 5- to 10-minute spacing, in an orbit similar to that of International Space Station resupply missions, i.e. at ~400 km altitude and ~51° inclination. A one-year mission would capture 3 million observations of precipitation greater than 1 mm/hour rain rate, including at least 100,000 deep convective events. Passive drag-adjusting maneuvers would separate the five CubeSats in the same orbital plane by 5-10 minutes each, similar to deployment techniques to be used by NASA's Cyclone Global Navigation Satellite Systems (CYGNSS) mission.

  10. Liberation Tigers of Tamil Elam, Aum Shinrikyo, Al Qaeda, and the Syrian Crisis: Nonstate Actors Acquiring WMD

    DTIC Science & Technology

    2013-12-01

    Qaeda’s Tactics and Targets (Alexandria, VA: Tempest Publishing, 2003), 52; Jason Burke, Al-Qa’ida Casting a Shadow of Terror (London: I.B. Tauris...Aimee Ibrahim. The al-Qaeda Threat: An Analytical Guide to al Qaeda’s Tactics and Targets. Alexandria, VA: Tempest Publishing, 2003. Warrick, Joby

  11. Cleaning Up and Maintenance in the Wake of an Urban School Administration Tempest.

    ERIC Educational Resources Information Center

    Murtadha-Watts, Khuala

    2000-01-01

    Describes the context of a city corporation's attempt to initiate educational reform, focusing on two city school administrators, a newly hired Latina superintendent and an African American female assistant superintendent. Uses the metaphor of a tempest to describe the tension between the urge for rapid reform of the new superintendent and the…

  12. Combatting Electoral Traces: The Dutch Tempest Discussion and Beyond

    NASA Astrophysics Data System (ADS)

    Pieters, Wolter

    In the Dutch e-voting debate, the crucial issue leading to the abandonment of all electronic voting machines was compromising radiation, or tempest: it would be possible to eavesdrop on the choice of the voter by capturing the radiation from the machine. Other countries, however, do not seem to be bothered by this risk. In this paper, we use actor-network theory to analyse the socio-technical origins of the Dutch tempest issue in e-voting, and we introduce concepts for discussing its implications for e-voting beyond the Netherlands. We introduce the term electoral traces to denote any physical, digital or social evidence of a voter’s choices in an election. From this perspective, we provide a framework for risk classification as well as an overview of countermeasures against such traces.

  13. Director, Operational Test and Evaluation FY 2015 Annual Report

    DTIC Science & Technology

    2016-01-01

    review. For example, where a wind turbine project was found to have the potential to seriously degrade radar cross section testing at the Naval Air...Assessment Plan U.S. Special Operations Command Tempest Wind 2015 Assessment Plan U.S. Transportation Command Turbo Challenge 2015 Final Assessment...U.S. Air Forces Central Command 2015 May 2015 U.S. Special Operations Command-Pacific Tempest Wind 2014 May 2015 North American Aerospace Defense

  14. Impact of xynthia tempest on viral contamination of shellfish.

    PubMed

    Grodzki, Marco; Ollivier, Joanna; Le Saux, Jean-Claude; Piquet, Jean-Côme; Noyer, Mathilde; Le Guyader, Françoise S

    2012-05-01

    Viral contamination in oyster and mussel samples was evaluated after a massive storm with hurricane-force winds, named the "Xynthia tempest," destroyed a number of sewage treatment plants in an area harboring many shellfish farms. Although up to 90% of samples were found to be contaminated 2 days after the disaster, detected viral concentrations were low. A 1-month follow-up showed a rapid decrease in the number of positive samples, even for norovirus.

  15. The TeMPEST Transit Search: Preliminary Results

    NASA Astrophysics Data System (ADS)

    Baliber, N. R.; Cochran, W. D.

    The Texas, McDonald Photometric Extrasolar Search for Transits, TeMPEST, is a photometric search for transits of extrasolar giant planets orbiting at distances less than approximately 0.1 AU from their parent stars. This survey is being conducted with the McDonald Observatory 0.76 meter Prime Focus Camera (PFC), which provides a 46.2 x 46.2 arcmin field of view. From August through December 2001, we obtained our first full season of data on two fields in the Galactic plane, one in the constellation Cassiopeia and the other in Camelopardus. In these two fields, V-band time-series photometry with a cadence of about 9 minutes has been performed on over 5000 stars with sufficient precision, better than 0.01 mag, to detect transits of close-orbiting Jovian planets. We present representative light curves from variable stars and an eclipsing system from our 2001 data. The TeMPEST project is funded by the NASA Origins program.

  16. Evaluation of Information Leakage via Electromagnetic Emanation and Effectiveness of Tempest

    NASA Astrophysics Data System (ADS)

    Tanaka, Hidema

    It is well known that there is a relationship between electromagnetic emanation and the information being processed in IT devices such as personal computers and smart cards. By analyzing such electromagnetic emanation, an eavesdropper may be able to recover some of that information, so it becomes a real threat to information security. In this paper, we show how to estimate the amount of information that is leaked as electromagnetic emanation. We treat the space between the IT device and the receiver as a communication channel, and we define the amount of information leakage via electromagnetic emanations by its channel capacity. Using experimental Tempest results, we show example estimates of the amount of information leakage. From the channel capacity, we can calculate the amount of information per pixel in the reconstructed image. We also evaluate the effectiveness of Tempest fonts generated by the Gaussian method and their security threshold.
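
    To make the channel-capacity framing above concrete, the following minimal sketch treats the air gap as an additive-noise channel and converts a Shannon capacity estimate into bits per reconstructed pixel. The bandwidth, signal-to-noise ratio, and pixel clock below are hypothetical placeholder values, not numbers from the paper.

        # Illustrative sketch (not the paper's code): treat the air gap between a
        # device and a receiver as an AWGN channel and estimate leaked bits per pixel.
        # Bandwidth, SNR, and pixel clock below are hypothetical placeholder values.
        import math

        def channel_capacity_bps(bandwidth_hz, snr_linear):
            """Shannon capacity C = B * log2(1 + S/N) in bits per second."""
            return bandwidth_hz * math.log2(1.0 + snr_linear)

        bandwidth_hz = 50e6          # assumed receiver bandwidth
        snr_db = 6.0                 # assumed signal-to-noise ratio of the emanation
        pixel_clock_hz = 65e6        # assumed pixel rate of the monitored display

        snr_linear = 10.0 ** (snr_db / 10.0)
        capacity = channel_capacity_bps(bandwidth_hz, snr_linear)
        bits_per_pixel = capacity / pixel_clock_hz
        print(f"capacity ~ {capacity/1e6:.1f} Mbit/s, ~ {bits_per_pixel:.2f} bits/pixel")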

  17. Tempest: Mesoscale test case suite results and the effect of order-of-accuracy on pressure gradient force errors

    NASA Astrophysics Data System (ADS)

    Guerra, J. E.; Ullrich, P. A.

    2014-12-01

    Tempest is a new non-hydrostatic atmospheric modeling framework that allows for investigation and intercomparison of high-order numerical methods. It is composed of a dynamical core based on a finite-element formulation of arbitrary order operating on cubed-sphere and Cartesian meshes with topography. The underlying technology is briefly discussed, including a novel Hybrid Finite Element Method (HFEM) vertical coordinate coupled with high-order Implicit/Explicit (IMEX) time integration to control vertically propagating sound waves. Here, we show results from a suite of mesoscale test cases from the literature that demonstrate the accuracy, performance, and properties of Tempest on regular Cartesian meshes. The test cases include wave propagation behavior, Kelvin-Helmholtz instabilities, and flow interaction with topography. Comparisons are made to existing results, highlighting improvements made in resolving atmospheric dynamics in the vertical direction where many existing methods are deficient.
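
    The IMEX idea mentioned above (treat the stiff, fast part of the dynamics implicitly and the slow part explicitly) can be illustrated with a toy split ODE. This is a generic first-order IMEX Euler sketch under assumed toy operators, not Tempest's actual discretization or time integrator.

        # Minimal sketch of IMEX (implicit/explicit) time stepping: a stiff linear
        # term (standing in for fast vertical acoustics) is taken implicitly, while
        # a slow nonlinear term is taken explicitly. Operators are toy placeholders.
        import numpy as np

        def imex_euler_step(u, dt, A_stiff, f_slow):
            """First-order IMEX Euler: (I - dt*A) u_{n+1} = u_n + dt * f(u_n)."""
            rhs = u + dt * f_slow(u)
            return np.linalg.solve(np.eye(u.size) - dt * A_stiff, rhs)

        # Toy system: fast linear oscillation (stiff) plus weak cubic damping (slow).
        A = np.array([[0.0, 50.0], [-50.0, 0.0]])   # stiff skew-symmetric block
        f = lambda u: -0.1 * u**3                   # slow nonlinear forcing
        u = np.array([1.0, 0.0])
        for _ in range(1000):
            u = imex_euler_step(u, dt=0.01, A_stiff=A, f_slow=f)
        print(u)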

  18. Simulation framework for electromagnetic effects in plasmonics, filter apertures, wafer scattering, grating mirrors, and nano-crystals

    NASA Astrophysics Data System (ADS)

    Ceperley, Daniel Peter

    This thesis presents a Finite-Difference Time-Domain simulation framework as well as both scientific observations and quantitative design data for emerging optical devices. These emerging applications required the development of simulation capabilities to carefully control numerical experimental conditions, isolate and quantify specific scattering processes, and overcome memory and run-time limitations on large device structures. The framework consists of a new version 7 of TEMPEST and auxiliary tools implemented as Matlab scripts. In improving the geometry representation and absorbing boundary conditions in TEMPEST from v6, accuracy has been sustained while key improvements have yielded application-specific gains in speed and accuracy. These extensions include pulsed methods, PML for plasmon termination, and plasmon and scattered field sources. The auxiliary tools include application-specific methods such as signal flow graphs of plasmon couplers, Bloch mode expansions of sub-wavelength grating waves, and back-propagation methods to characterize edge scattering in diffraction masks. Each application posed different numerical hurdles and physical questions for the simulation framework. The Terrestrial Planet Finder Coronagraph required accurate modeling of diffraction mask structures too large for solely FDTD analysis. This analysis was achieved through a combination of targeted TEMPEST simulations and a full-system simulator based on thin mask scalar diffraction models by Ball Aerospace for JPL. TEMPEST simulation showed that vertical sidewalls were the strongest scatterers, adding nearly 2λ of light per mask edge, which could be reduced by 20° undercuts. TEMPEST assessment of coupling in rapid thermal annealing was complicated by extremely sub-wavelength features and fine meshes. Near 100% coupling and low variability were confirmed even in the presence of unidirectional dense metal gates. Accurate analysis of surface plasmon coupling efficiency by small surface features required capabilities to isolate these features and cleanly illuminate them with plasmons and plane-waves. These features were shown to have coupling cross-sections up to and slightly exceeding their physical size. Long run-times for TEMPEST simulations of finite-length gratings were overcome with a signal flow graph method. With these methods, a plasmon coupler with a 100% capture length of over 10λ was demonstrated. Simulation of 3D nano-particle arrays utilized TEMPEST v7's pulsed methods to minimize the number of multi-day simulations. These simulations led to the discovery that interstitial plasmons were responsible for resonant absorption and transmission but not reflection. Simulation of a sub-wavelength grating mirror using pulsed sources to map resonant spectra showed that neither coupled guided waves nor coupled isolated resonators accurately described the operation. However, a new model based on vertical propagation of lateral Bloch modes with zero phase progression efficiently characterized the device and provided principles for designing similar devices at other wavelengths.
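
    As background for the FDTD and pulsed-source methods discussed above, the sketch below is a generic, textbook-style 1D Yee update loop with a Gaussian pulse source. It is not the TEMPEST v7 implementation; grid size, Courant number, and source parameters are arbitrary choices for illustration.

        # Minimal 1D FDTD (Yee scheme) sketch with a pulsed soft source.
        # Generic normalized-units update loop, not the TEMPEST v7 code.
        import numpy as np

        nz, nt = 400, 1000
        ez = np.zeros(nz)          # electric field samples
        hy = np.zeros(nz - 1)      # magnetic field on the staggered half-grid
        courant = 0.5              # normalized dt/dz (<= 1 for stability in 1D)

        for n in range(nt):
            hy += courant * np.diff(ez)                 # update H from curl of E
            ez[1:-1] += courant * np.diff(hy)           # update E from curl of H
            ez[50] += np.exp(-((n - 60) / 20.0) ** 2)   # Gaussian pulse soft source

        print("peak |Ez| after propagation:", np.abs(ez).max())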

  19. Space Weathering Perspectives on Europa Amidst the Tempest of the Jupiter Magnetospheric System

    NASA Technical Reports Server (NTRS)

    Cooper, J. F.; Hartle, R. E.; Lipatov, A. S.; Sittler, E. C.; Cassidy, T. A.; Ip. W.-H.

    2010-01-01

    Europa resides within a "perfect storm" tempest of extreme external field, plasma, and energetic particle interactions with the magnetospheric system of Jupiter. Missions to Europa must survive, functionally operate, make useful measurements, and return critical science data, while also providing full context on this ocean moon's response to the extreme environment. Related general perspectives on space weathering in the solar system are applied to mission and instrument science requirements for Europa.

  20. Tempest: GPU-CPU computing for high-throughput database spectral matching.

    PubMed

    Milloy, Jeffrey A; Faherty, Brendan K; Gerber, Scott A

    2012-07-06

    Modern mass spectrometers are now capable of producing hundreds of thousands of tandem (MS/MS) spectra per experiment, making the translation of these fragmentation spectra into peptide matches a common bottleneck in proteomics research. When coupled with experimental designs that enrich for post-translational modifications such as phosphorylation and/or include isotopically labeled amino acids for quantification, additional burdens are placed on this computational infrastructure by shotgun sequencing. To address this issue, we have developed a new database searching program that utilizes the massively parallel compute capabilities of a graphical processing unit (GPU) to produce peptide spectral matches in a very high throughput fashion. Our program, named Tempest, combines efficient database digestion and MS/MS spectral indexing on a CPU with fast similarity scoring on a GPU. In our implementation, the entire similarity score, including the generation of full theoretical peptide candidate fragmentation spectra and its comparison to experimental spectra, is conducted on the GPU. Although Tempest uses the classical SEQUEST XCorr score as a primary metric for evaluating similarity for spectra collected at unit resolution, we have developed a new "Accelerated Score" for MS/MS spectra collected at high resolution that is based on a computationally inexpensive dot product but exhibits scoring accuracy similar to that of the classical XCorr. In our experience, Tempest provides compute-cluster level performance in an affordable desktop computer.
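
    The dot-product scoring idea described above can be sketched in a few lines: bin the experimental spectrum, build a sparse theoretical spectrum from candidate fragment m/z values, and take their inner product. This is a hedged illustration only; the actual Tempest "Accelerated Score" and the classical XCorr differ in detail, and the bin width, normalization, and example peaks below are assumptions.

        # Hedged sketch of dot-product spectral similarity scoring (CPU version).
        # Bin width, intensity handling, and example peaks are illustrative choices.
        import numpy as np

        def bin_spectrum(mz, intensity, bin_width=0.02, mz_max=2000.0):
            nbins = int(mz_max / bin_width)
            binned = np.zeros(nbins)
            idx = np.clip((np.asarray(mz) / bin_width).astype(int), 0, nbins - 1)
            np.maximum.at(binned, idx, intensity)       # keep max intensity per bin
            norm = np.linalg.norm(binned)
            return binned / norm if norm > 0 else binned

        def dot_product_score(exp_mz, exp_int, theo_mz):
            exp_vec = bin_spectrum(exp_mz, exp_int)
            theo_vec = bin_spectrum(theo_mz, np.ones(len(theo_mz)))
            return float(np.dot(exp_vec, theo_vec))

        score = dot_product_score([175.12, 263.09, 376.20], [1.0, 0.6, 0.8],
                                  [175.119, 263.088, 500.30])
        print(f"similarity score: {score:.3f}")

    In a GPU implementation of the kind the abstract describes, the inner loop over candidate peptides would be the part offloaded to the device; the binning and scoring logic itself stays the same.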

  1. Creating a Simple Single Computational Approach to Modeling Rarefied and Continuum Flow About Aerospace Vehicles

    NASA Technical Reports Server (NTRS)

    Goldstein, David B.; Varghese, Philip L.

    1997-01-01

    We proposed to create a single computational code incorporating methods that can model both rarefied and continuum flow to enable the efficient simulation of flow about space craft and high altitude hypersonic aerospace vehicles. The code was to use a single grid structure that permits a smooth transition between the continuum and rarefied portions of the flow. Developing an appropriate computational boundary between the two regions represented a major challenge. The primary approach chosen involves coupling a four-speed Lattice Boltzmann model for the continuum flow with the DSMC method in the rarefied regime. We also explored the possibility of using a standard finite difference Navier Stokes solver for the continuum flow. With the resulting code we will ultimately investigate three-dimensional plume impingement effects, a subject of critical importance to NASA and related to the work of Drs. Forrest Lumpkin, Steve Fitzgerald and Jay Le Beau at Johnson Space Center. Below is a brief background on the project and a summary of the results as of the end of the grant.

  2. TEMPEST: Twin Electric Magnetospheric Probes Exploring on Spiral Trajectories--A Proposal to the Medium Class Explorer Program

    NASA Technical Reports Server (NTRS)

    1995-01-01

    The objective of the Twin Electric Magnetospheric Probes Exploring on Spiral Trajectories (TEMPEST) mission is to understand the nature and causes of magnetic storm conditions in the magnetosphere whether they be manifested classically in the buildup of the ring current, or (as recently discovered) by storms of relativistic electrons that cause the deep dielectric charging responsible for disabling satellites in synchronous orbit, or by the release of energy into the auroral ionosphere and the plasma sheet during substorms.

  3. Development of a Task-Exposure Matrix (TEM) for Pesticide Use (TEMPEST).

    PubMed

    Dick, F D; Semple, S E; van Tongeren, M; Miller, B G; Ritchie, P; Sherriff, D; Cherrie, J W

    2010-06-01

    Pesticides have been associated with increased risks for a range of conditions including Parkinson's disease, but identifying the agents responsible has proven challenging. Improved pesticide exposure estimates would increase the power of epidemiological studies to detect such an association if one exists. Categories of pesticide use were identified from the tasks reported in a previous community-based case-control study in Scotland. Typical pesticides used in each task in each decade were identified from published scientific and grey literature and from expert interviews, with the number of potential agents collapsed into 10 groups of pesticides. A pesticide usage database was then created, using the task list and the typical pesticide groups employed in those tasks across seven decades spanning the period 1945-2005. Information about the method of application and concentration of pesticides used in these tasks was then incorporated into the database. A list was generated of 81 tasks involving pesticide exposure in Scotland covering seven decades, producing a total of 846 task-by-pesticide-by-decade combinations. A Task-Exposure Matrix for PESTicides (TEMPEST) was produced by two occupational hygienists who quantified the likely probability and intensity of inhalation and dermal exposures for each pesticide group for a given use during each decade. TEMPEST provides a basis for assessing exposures to specific pesticide groups in Scotland covering the period 1945-2005. The methods used to develop TEMPEST could be used in a retrospective assessment of occupational exposure to pesticides for Scottish epidemiological studies or adapted for use in other countries.
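
    A task-exposure matrix of the kind described (task × pesticide group × decade, each cell holding assessed probability and intensity of inhalation and dermal exposure) maps naturally onto a keyed table. The sketch below is purely illustrative; the task names, pesticide groups, and ratings are invented placeholders, not values from TEMPEST.

        # Illustrative data structure for a task-exposure matrix: rows keyed by
        # (task, pesticide_group, decade) with assessed probability and intensity
        # of exposure. All entries below are hypothetical placeholders.
        tempest_like_tem = {
            ("sheep dipping", "organophosphates", 1970): {
                "prob_inhalation": "high", "int_inhalation": "medium",
                "prob_dermal": "high", "int_dermal": "high",
            },
            ("cereal spraying", "phenoxy herbicides", 1980): {
                "prob_inhalation": "medium", "int_inhalation": "low",
                "prob_dermal": "medium", "int_dermal": "medium",
            },
        }

        def lookup(task, group, decade):
            """Return the exposure assessment for one task/group/decade cell."""
            return tempest_like_tem.get((task, group, decade), "no assessment recorded")

        print(lookup("sheep dipping", "organophosphates", 1970))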

  4. Ground based research in microgravity materials processing

    NASA Technical Reports Server (NTRS)

    Workman, Gary L.; Rathz, Tom

    1994-01-01

    The core activities performed during this time period have been concerned with tracking the TEMPEST experiments on the shuttle with drops of Zr, Ni, and Nb alloys. In particular, many Zr drops are being made to better define the recalescence characteristics of that system so that accurate comparisons of the drop tube results with Tempest can be made. A new liner, with minimal reflectivity characteristics, has been inserted into the drop tube in order to improve the recalescence measurements of the falling drops. A first installation has been made in order to take the geometric measurements needed to ensure a proper fit. The stovepipe sections are currently in the shop at MSFC being painted with low-reflectivity black paint. Work has also continued on setting up the MEL apparatus obtained from Oak Ridge in the downstairs laboratory at the Drop Tube Facilities. Ground-based experiments on the same metals being processed on TEMPEST are planned for the MEL. The flight schedules for the KC-135 experiments will be determined in the near future.

  5. Continuum Absorption Coefficient of Atoms and Ions

    NASA Technical Reports Server (NTRS)

    Armaly, B. F.

    1979-01-01

    The rate of heat transfer to the heat shield of a Jupiter probe has been estimated to be one order of magnitude higher than any previously experienced in an outer space exploration program. More than one-third of this heat load is due to an emission of continuum radiation from atoms and ions. The existing computer code for calculating the continuum contribution to the total load utilizes a modified version of Biberman's approximate method. The continuum radiation absorption cross sections of a C - H - O - N ablation system were examined in detail. The present computer code was evaluated and updated by being compared with available exact and approximate calculations and correlations of experimental data. A detailed calculation procedure, which can be applied to other atomic species, is presented. The approximate correlations can be made to agree with the available exact and experimental data.

  6. Simulation of Hanford Tank 241-C-106 Waste Release into Tank 241-Y-102

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    KP Recknagle; Y Onishi

    Waste stored in Hanford single-shell Tank 241-C-106 will be sluiced with a supernatant liquid from double-shell Tank 241-AY-102 (AY-102) at the U.S. Department of Energy's Hanford Site in Eastern Washington. The resulting slurry, containing up to 30 wt% solids, will then be transferred to Tank AY-102. During the sluicing process, it is important to know the mass of the solids being transferred into AY-102. One of the primary instruments used to measure solids transfer is an E+ densitometer located near the periphery of the tank at riser 15S. This study was undertaken to assess how well a densitometer measurement could represent the total mass of solids transferred if a uniform lateral distribution was assumed. The study evaluated the C-106 slurry mixing and accumulation in Tank AY-102 for the following five cases: Case 1, 3 wt% slurry in 6.4-m AY-102 waste; Case 2, 3 wt% slurry in 4.3-m AY-102 waste; Case 3, 30 wt% slurry in 6.4-m AY-102 waste; Case 4, 30 wt% slurry in 4.3-m AY-102 waste; and Case 5, 30 wt% slurry in 5.0-m AY-102 waste. The time-dependent, three-dimensional TEMPEST computer code was used to simulate solid deposition and accumulation during the injection of the C-106 slurry into AY-102 through four injection nozzles. The TEMPEST computer code was applied previously to other Hanford tanks, AP-102, SY-102, AZ-101, SY-101, AY-102, and C-106, to model tank waste mixing with rotating pump jets, gas rollover events, waste transfer from one tank to another, and pump-out retrieval of the sluiced waste. The model results indicate that the solid depth accumulated at the densitometer is within 5% of the average depth accumulation. Thus the reading of the densitometer is expected to represent the total mass of the transferred solids reasonably well.

  7. Critical Care Coding for Neurologists.

    PubMed

    Nuwer, Marc R; Vespa, Paul M

    2015-10-01

    Accurate coding is an important function of neurologic practice. This contribution to Continuum is part of an ongoing series that presents helpful coding information along with examples related to the issue topic. Tips for diagnosis coding, Evaluation and Management coding, procedure coding, or a combination are presented, depending on which is most applicable to the subject area of the issue.

  8. Coding of Neuroinfectious Diseases.

    PubMed

    Barkley, Gregory L

    2015-12-01

    Accurate coding is an important function of neurologic practice. This contribution to Continuum is part of an ongoing series that presents helpful coding information along with examples related to the issue topic. Tips for diagnosis coding, Evaluation and Management coding, procedure coding, or a combination are presented, depending on which is most applicable to the subject area of the issue.

  9. Diagnostic Coding for Epilepsy.

    PubMed

    Williams, Korwyn; Nuwer, Marc R; Buchhalter, Jeffrey R

    2016-02-01

    Accurate coding is an important function of neurologic practice. This contribution to Continuum is part of an ongoing series that presents helpful coding information along with examples related to the issue topic. Tips for diagnosis coding, Evaluation and Management coding, procedure coding, or a combination are presented, depending on which is most applicable to the subject area of the issue.

  10. Mourning in the psychoanalytic situation and in Shakespeare's The Tempest.

    PubMed

    Houlding, Sybil

    2015-01-01

    Recognizing that mourning builds psychic structure, the author highlights the ubiquitous and essential nature of mourning in the psychoanalytic situation. Reality testing is intimately connected to mourning and is the warp on which psychic structure is woven in the analytic situation. Reality testing necessarily involves opportunities for mourning and thus will be present in every analytic hour. The confrontation with reality is the basis for all processes of mourning, or for creating defenses against this painful experience. The author views mourning as fundamentally a transformational process, and Shakespeare's The Tempest is used to illustrate this aspect of mourning. © 2015 The Psychoanalytic Quarterly, Inc.

  11. Suggestions to Gain Deeper Understanding of Magnetic Fields in Astrophysics Classrooms

    NASA Astrophysics Data System (ADS)

    Woolsey, Lauren N.

    2016-01-01

    I present two tools that could be used in an undergraduate or graduate classroom to aid in developing intuition of magnetic fields, how they are measured, and how they affect large scale phenomena like the solar wind. The first tool is a Mathematica widget I developed that simulates observations of magnetic field in the Interstellar Medium (ISM) using the weak Zeeman effect. Woolsey (2015, JAESE) discusses the relevant background information about what structures in the ISM produce a strong enough effect and which molecules are used to make the measurement and why. This widget could be used in an entry level astronomy course as a way to show how astronomers actually make certain types of measurements and allow students to practice inquiry-based learning to understand how different aspects of the ISM environment strengthen or weaken the observed signal. The second tool is a Python model of the solar wind, The Efficient Modified Parker Equation Solving Tool (TEMPEST), that is publicly available on GitHub (https://github.com/lnwoolsey/tempest). I discuss possible short-term projects or investigations that could be done using the programs in the TEMPEST library that are suitable for upper-level undergraduates or in graduate level coursework (Woolsey, 2015, JRAEO).
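
    As a classroom-style companion to the TEMPEST solar wind tool mentioned above, the sketch below solves the classic isothermal Parker wind equation by root finding on the transonic branch. This is not the TEMPEST code from the GitHub repository; it only illustrates the kind of calculation such a tool wraps, and the coronal temperature and sound-speed convention are assumptions.

        # Minimal sketch: Parker's isothermal solar wind speed at a given radius,
        # found by root-finding on the transonic solution branch. Not the TEMPEST
        # code; the coronal temperature below is an assumed value.
        import numpy as np
        from scipy.optimize import brentq

        G, M_sun, k_B, m_p = 6.674e-11, 1.989e30, 1.381e-23, 1.673e-27
        T = 1.5e6                                  # assumed isothermal corona [K]
        cs = np.sqrt(2 * k_B * T / m_p)            # isothermal sound speed (e-p plasma)
        r_c = G * M_sun / (2 * cs**2)              # critical (sonic) radius

        def parker_residual(u, r):
            """Zero when (u/cs)^2 - ln(u/cs)^2 = 4 ln(r/rc) + 4 rc/r - 3."""
            x = (u / cs) ** 2
            return x - np.log(x) - 4 * np.log(r / r_c) - 4 * r_c / r + 3

        r = 1.496e11                               # 1 AU
        u = brentq(parker_residual, cs * 1.0001, 20 * cs, args=(r,))  # supersonic branch
        print(f"sonic radius ~ {r_c/6.957e8:.1f} R_sun, wind speed at 1 AU ~ {u/1e3:.0f} km/s")

    With the assumed temperature this returns a wind speed of several hundred km/s at 1 AU, the kind of result students can compare against in-situ solar wind measurements.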

  12. Analyze and predict VLTI observations: the Role of 2D/3D dust continuum radiative transfer codes

    NASA Astrophysics Data System (ADS)

    Pascucci, I.; Henning, Th; Steinacker, J.; Wolf, S.

    2003-10-01

    Radiative Transfer (RT) codes with image capability are a fundamental tool for preparing interferometric observations and for interpreting visibility data. In view of the upcoming VLTI facilities, we present the first comparison of images/visibilities coming from two 3D codes that use completely different techniques to solve the problem of self-consistent continuum RT. In addition, we focus on the astrophysical case of a disk distorted by tidal interaction with by-passing stars or internal planets and investigate for which parameters the distortion can be best detected in the mid-infrared using the mid-infrared interferometric device MIDI.

  13. 2D/3D Dust Continuum Radiative Transfer Codes to Analyze and Predict VLTI Observations

    NASA Astrophysics Data System (ADS)

    Pascucci, I.; Henning, Th.; Steinacker, J.; Wolf, S.

    Radiative Transfer (RT) codes with image capability are a fundamental tool for preparing interferometric observations and for interpreting visibility data. In view of the upcoming VLTI facilities, we present the first comparison of images/visibilities coming from two 3D codes that use completely different techniques to solve the problem of self-consistent continuum RT. In addition, we focus on the astrophysical case of a disk distorted by tidal interaction with by-passing stars or internal planets and investigate for which parameters the distortion can be best detected in the mid-infrared using the mid-infrared interferometric device MIDI.

  14. Numerical simulation to determine the effects of incident wind shear and turbulence level on the flow around a building

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Y.Q.; Huber, A.H.; Arya, S.P.S.

    The effects of incident shear and turbulence on flow around a cubical building are being investigated by a turbulent kinetic energy/dissipation model (TEMPEST). The numerical simulations demonstrate significant effects due to the differences in the incident flow. The addition of upstream turbulence and shear results in a reduced size of the cavity directly behind the building. The accuracy of numerical simulations is verified by comparing the predicted mean flow fields with the available wind-tunnel measurements of Castro and Robins (1977). Comparing the authors' results with experimental data, the authors show that the TEMPEST model can reasonably simulate the mean flow.

  15. Atmospheric absorption of terahertz radiation and water vapor continuum effects

    NASA Astrophysics Data System (ADS)

    Slocum, David M.; Slingerland, Elizabeth J.; Giles, Robert H.; Goyette, Thomas M.

    2013-09-01

    The water vapor continuum absorption spectrum was investigated using Fourier Transform Spectroscopy. The transmission of broadband terahertz radiation from 0.300 to 1.500 THz was recorded for multiple path lengths and relative humidity levels. The absorption coefficient as a function of frequency was determined and compared with theoretical predictions and available water vapor absorption data. The prediction code is able to separately model the different parts of atmospheric absorption for a range of experimental conditions. A variety of conditions were accurately modeled using this code including both self and foreign gas broadening for low and high water vapor pressures for many different measurement techniques. The intensity and location of the observed absorption lines were also in good agreement with spectral databases. However, there was a discrepancy between the resonant line spectrum simulation and the observed absorption spectrum in the atmospheric transmission windows caused by the continuum absorption. A small discrepancy remained even after using the best available data from the literature to account for the continuum absorption. From the experimental and resonant line simulation spectra the air-broadening continuum parameter was calculated and compared with values available in the literature.
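
    The basic retrieval step implied above follows the Beer-Lambert law: with transmission T = exp(-αL), a fit of -ln(T) against path length L gives the absorption coefficient, and subtracting a resonant-line model leaves the continuum contribution. The sketch below uses synthetic numbers for illustration only; they are not the paper's data.

        # Sketch of extracting an absorption coefficient from transmission measured
        # at several path lengths, then isolating a continuum term. Synthetic data.
        import numpy as np

        path_lengths_m = np.array([2.0, 4.0, 8.0, 16.0])          # assumed path lengths
        transmission = np.array([0.905, 0.818, 0.670, 0.449])     # synthetic measurements

        # Beer-Lambert: -ln(T) = alpha * L  ->  least-squares slope through the origin
        alpha_measured = (np.sum(-np.log(transmission) * path_lengths_m)
                          / np.sum(path_lengths_m**2))

        alpha_resonant_lines = 0.040   # hypothetical line-by-line model value [1/m]
        alpha_continuum = alpha_measured - alpha_resonant_lines
        print(f"measured alpha = {alpha_measured:.3f} 1/m, continuum = {alpha_continuum:.3f} 1/m")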

  16. Continua or Chimera?

    ERIC Educational Resources Information Center

    Booth, Tony

    1994-01-01

    This article looks at two concepts in the British 1993 draft Code of Practice concerning students with special needs: the concepts of a "continuum of needs" and a "continuum of provision." Issues involved in connecting the two continua are addressed, including whether service delivery decisions should be based on severity of…

  17. A psychoanalytic study of Edward de Vere's The Tempest.

    PubMed

    Waugaman, Richard M

    2009-01-01

    There is now abundant evidence that Freud was correct in believing Edward de Vere (1550-1604) wrote under the pseudonym "William Shakespeare." One common reaction is "What difference does it make?" I address that question by examining many significant connections between de Vere's life and The Tempest. Such studies promise to bring our understanding of Shakespeare's works back into line with our usual psychoanalytic approach to literature, which examines how a great writer's imagination weaves a new creation out of the threads of his or her life experiences. One source of the intense controversy about de Vere's authorship is our idealization of the traditional author, about whom we know so little that, as Freud noted, we can imagine his personality was as fine as his works.

  18. Information Security due to Electromagnetic Environments

    NASA Astrophysics Data System (ADS)

    Sekiguchi, Hidenori; Seto, Shinji

    Generally, active electronic devices emit slightly unintentional electromagnetic noise. From long ago, electromagnetic emission levels have been regulated from the aspect of electromagnetic compatibility (EMC). Also, it has been known the electromagnetic emissions have been generated from the ON/OFF of signals in the device. Recently, it becomes a topic of conversation on the information security that the ON/OFF on a desired signal in the device can be reproduced or guessed by receiving the electromagnetic emission. For an example, a display image on a personal computer (PC) can be reconstructed by receiving and analyzing the electromagnetic emission. In sum, this fact makes known information leakage due to electromagnetic emission. “TEMPEST" that has been known as a code name originated in the U. S. Department of Defense is to prevent the information leakage caused by electromagnetic emissions. This paper reports the brief summary of the information security due to electromagnetic emissions from information technology equipments.

  19. Edge-relevant plasma simulations with the continuum code COGENT

    NASA Astrophysics Data System (ADS)

    Dorf, M.; Dorr, M.; Ghosh, D.; Hittinger, J.; Rognlien, T.; Cohen, R.; Lee, W.; Schwartz, P.

    2016-10-01

    We describe recent advances in cross-separatrix and other edge-relevant plasma simulations with COGENT, a continuum gyro-kinetic code being developed by the Edge Simulation Laboratory (ESL) collaboration. The distinguishing feature of the COGENT code is its high-order finite-volume discretization methods, which employ arbitrary mapped multiblock grid technology (nearly field-aligned on blocks) to handle the complexity of tokamak divertor geometry with high accuracy. This paper discusses the 4D (axisymmetric) electrostatic version of the code, and the presented topics include: (a) initial simulations with kinetic electrons and development of reduced fluid models; (b) development and application of implicit-explicit (IMEX) time integration schemes; and (c) conservative modeling of drift-waves and the universal instability. Work performed for USDOE, at LLNL under contract DE-AC52-07NA27344 and at LBNL under contract DE-AC02-05CH11231.

  20. Simulation of neoclassical transport with the continuum gyrokinetic code COGENT

    DOE PAGES

    Dorf, M. A.; Cohen, R. H.; Dorr, M.; ...

    2013-01-25

    The development of the continuum gyrokinetic code COGENT for edge plasma simulations is reported. The present version of the code models a nonlinear axisymmetric 4D (R, v∥, μ) gyrokinetic equation coupled to the long-wavelength limit of the gyro-Poisson equation. Here, R is the particle gyrocenter coordinate in the poloidal plane, and v∥ and μ are the guiding center velocity parallel to the magnetic field and the magnetic moment, respectively. The COGENT code utilizes a fourth-order finite-volume (conservative) discretization combined with arbitrary mapped multiblock grid technology (nearly field-aligned on blocks) to handle the complexity of tokamak divertor geometry with high accuracy. Furthermore, topics presented are the implementation of increasingly detailed model collision operators, and the results of neoclassical transport simulations including the effects of a strong radial electric field characteristic of a tokamak pedestal under H-mode conditions.

  1. Lagrangian continuum dynamics in ALEGRA.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wong, Michael K. W.; Love, Edward

    Alegra is an ALE (Arbitrary Lagrangian-Eulerian) multi-material finite element code that emphasizes large deformations and strong shock physics. The Lagrangian continuum dynamics package in Alegra uses a Galerkin finite element spatial discretization and an explicit central-difference stepping method in time. The goal of this report is to describe in detail the characteristics of this algorithm, including the conservation and stability properties. The details provided should help both researchers and analysts understand the underlying theory and numerical implementation of the Alegra continuum hydrodynamics algorithm.
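
    To illustrate the explicit central-difference (leapfrog) time stepping named above, the sketch below marches a lumped spring-mass system with velocities staggered at half steps. This is a generic illustration of the scheme, not the ALEGRA implementation; the stiffness, mass, and step size are arbitrary.

        # Minimal explicit central-difference (leapfrog) time integrator for a
        # lumped-mass system. Generic illustration only; parameters are arbitrary.
        import numpy as np

        def central_difference(mass, internal_force, u0, v0, dt, nsteps):
            """March u with velocities at half steps: v_{n+1/2} = v_{n-1/2} + dt*a_n."""
            u = u0.copy()
            v_half = v0 + 0.5 * dt * internal_force(u) / mass   # start-up half step
            history = [u.copy()]
            for _ in range(nsteps):
                u = u + dt * v_half                             # displacement update
                v_half = v_half + dt * internal_force(u) / mass # velocity update
                history.append(u.copy())
            return np.array(history)

        k, m = 100.0, 1.0                        # spring stiffness and mass
        force = lambda u: -k * u                 # linear elastic restoring force
        traj = central_difference(m, force, np.array([1.0]), np.array([0.0]),
                                  dt=0.01, nsteps=500)
        print("displacement after 5 s:", traj[-1])

    As in ALEGRA-class codes, the scheme is conditionally stable, so the step size must stay below the stability limit set by the highest frequency in the system.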

  2. Pidgin and English in Melanesia: Is There a Continuum?

    ERIC Educational Resources Information Center

    Siegel, Jeff

    1997-01-01

    Examines the linguistic features of Tok Pisin (the Papua New Guinea variety of Melanesian Pidgin) resulting from decreolization and the linguistic features of Papua New Guinea English. Discusses code-switching and transference between Tok Pisin and English and concludes that an English-to-pidgin continuum does not exist in Papua New Guinea or in…

  3. AEROELASTIC SIMULATION TOOL FOR INFLATABLE BALLUTE AEROCAPTURE

    NASA Technical Reports Server (NTRS)

    Liever, P. A.; Sheta, E. F.; Habchi, S. D.

    2006-01-01

    A multidisciplinary analysis tool is under development for predicting the impact of aeroelastic effects on the functionality of inflatable ballute aeroassist vehicles in both the continuum and rarefied flow regimes. High-fidelity modules for continuum and rarefied aerodynamics, structural dynamics, heat transfer, and computational grid deformation are coupled in an integrated multi-physics, multi-disciplinary computing environment. This flexible and extensible approach allows the integration of state-of-the-art, stand-alone NASA and industry leading continuum and rarefied flow solvers and structural analysis codes into a computing environment in which the modules can run concurrently with synchronized data transfer. Coupled fluid-structure continuum flow demonstrations were conducted on a clamped ballute configuration. The feasibility of implementing a DSMC flow solver in the simulation framework was demonstrated, and loosely coupled rarefied flow aeroelastic demonstrations were performed. A NASA and industry technology survey identified CFD, DSMC and structural analysis codes capable of modeling non-linear shape and material response of thin-film inflated aeroshells. The simulation technology will find direct and immediate applications with NASA and industry in ongoing aerocapture technology development programs.

  4. Calculation of continuum damping of Alfvén eigenmodes in tokamak and stellarator equilibria

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bowden, G. W.; Hole, M. J.; Könies, A.

    2015-09-15

    In an ideal magnetohydrodynamic (MHD) plasma, shear Alfvén eigenmodes may experience dissipationless damping due to resonant interaction with the shear Alfvén continuum. This continuum damping can make a significant contribution to the overall growth/decay rate of shear Alfvén eigenmodes, with consequent implications for fast ion transport. One method for calculating continuum damping is to solve the MHD eigenvalue problem over a suitable contour in the complex plane, thereby satisfying the causality condition. Such an approach can be implemented in three-dimensional ideal MHD codes which use the Galerkin method. Analytic functions can be fitted to numerical data for equilibrium quantities in order to determine the value of these quantities along the complex contour. This approach requires less resolution than the established technique of calculating damping as resistivity vanishes and is thus more computationally efficient. The complex contour method has been applied to the three-dimensional finite element ideal MHD Code for Kinetic Alfvén waves. In this paper, we discuss the application of the complex contour technique to calculate the continuum damping of global modes in tokamak as well as torsatron, W7-X and H-1NF stellarator cases. To the authors' knowledge, these stellarator calculations represent the first calculation of continuum damping for eigenmodes in fully three-dimensional equilibria. The continuum damping of global modes in W7-X and H-1NF stellarator configurations investigated is found to depend sensitively on coupling to numerous poloidal and toroidal harmonics.
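
    The key numerical ingredient described above, fitting analytic functions to real-axis equilibrium data so they can be evaluated on a deformed complex contour, can be illustrated with a simple polynomial fit. The "equilibrium" profile and the contour deformation below are invented examples, not data from the paper or the CKA code.

        # Hedged illustration of the complex-contour idea: fit an analytic
        # (polynomial) function to equilibrium data sampled on the real axis, then
        # evaluate it along a contour deformed into the complex plane.
        import numpy as np

        s = np.linspace(0.0, 1.0, 41)                   # real radial coordinate samples
        q_profile = 1.05 + 1.9 * s**2                   # made-up safety-factor data

        coeffs = np.polyfit(s, q_profile, deg=6)        # analytic (polynomial) fit

        # Deform the path into the complex plane near a continuum resonance
        s_contour = s - 0.05j * np.sin(np.pi * s)       # simple smooth deformation
        q_on_contour = np.polyval(coeffs, s_contour)    # polynomial evaluates off-axis too

        print("q at complex point s = 0.5 - 0.05j:", np.polyval(coeffs, 0.5 - 0.05j))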

  5. Proceduracy: Computer Code Writing in the Continuum of Literacy

    ERIC Educational Resources Information Center

    Vee, Annette

    2010-01-01

    This dissertation looks at computer programming through the lens of literacy studies, building from the concept of code as a written text with expressive and rhetorical power. I focus on the intersecting technological and social factors of computer code writing as a literacy--a practice I call "proceduracy". Like literacy, proceduracy is a human…

  6. Continuities in Reading Acquisition, Reading Skill, and Reading Disability.

    ERIC Educational Resources Information Center

    Perfetti, Charles A.

    1986-01-01

    Learning to read depends on eventual mastery of coding procedures, and even skilled reading depends on coding processes low in cost to processing resources. Reading disability may be understood as a point on an ability continuum or a wide range of coding ability. Instructional goals of word reading skill, including rapid and fluent word…

  7. TEMPEST simulations of the neoclassical transport in a single-null tokamak geometry

    NASA Astrophysics Data System (ADS)

    Xu, X. Q.; Cohen, R. H.; Rognlien, T. D.

    2009-05-01

    TEMPEST simulations were carried out for plasma transport and flow dynamics in a single-null tokamak geometry. The core radial boundary ion distribution is a fixed Maxwellian F_M with N0 = N(ψ0) and Ti0 = Ti(ψ0) = 300 eV, and the exterior radial boundary ion distribution satisfies a Neumann boundary condition, ∂Fi/∂ψ|ψ=ψw = 0, during a simulation. Given these boundary conditions and initial profiles, the interior plasma in the simulations should evolve into a neoclassical steady state. A volume source term in the private flux region is included, representing the ionization in the private flux region, to achieve the neoclassical steady state. A series of TEMPEST simulations is conducted to investigate the scaling characteristics of the neoclassical transport and flow as a function of ν*i via a density scan. Here ν*i is the effective collisionality, defined by ν*i = ε^(-3/2) √2 q R0 νii/vTi, where ε is the inverse aspect ratio, νii is the ion-ion collision frequency, and vTi is the ion thermal velocity. Simulation results show significant poloidal variation of the density and ion temperature profiles due to the end-loss mechanism at the divertor plates. Each region (edge, SOL, and private flux) reaches its dynamical steady state on its own time scale due to the different physical processes involved. The impact of the self-consistent electric field on transport and flow will be presented.
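
    The density scan implied by the collisionality definition above can be sketched numerically. The snippet uses an NRL-formulary-style estimate for the ion-ion collision rate; the safety factor, major radius, inverse aspect ratio, ion species, and thermal-velocity convention are assumptions for illustration, with only Ti = 300 eV taken from the abstract.

        # Hedged sketch of a collisionality scan: nu*_i = eps^(-3/2) sqrt(2) q R0 nu_ii / v_Ti.
        # Geometry and species parameters are assumed; Ti comes from the abstract.
        import numpy as np

        q, R0, eps = 3.0, 1.7, 0.3                 # assumed safety factor, major radius [m], eps
        Ti_eV, ln_lambda, mu = 300.0, 17.0, 2.0    # Ti from the abstract; deuterium assumed

        def nu_star_i(n_i_m3):
            n_i_cm3 = n_i_m3 * 1e-6
            nu_ii = 4.8e-8 * n_i_cm3 * ln_lambda / (np.sqrt(mu) * Ti_eV**1.5)   # [1/s]
            v_Ti = np.sqrt(2.0 * Ti_eV * 1.602e-19 / (mu * 1.673e-27))          # [m/s], assumed convention
            return eps**-1.5 * np.sqrt(2.0) * q * R0 * nu_ii / v_Ti

        for n in [1e19, 3e19, 1e20]:               # density scan [m^-3]
            print(f"n_i = {n:.0e} m^-3  ->  nu*_i ~ {nu_star_i(n):.2f}")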

  8. TeMPEST: the Texas, McDonald Photometric Extrasolar Search for Transits

    NASA Astrophysics Data System (ADS)

    Baliber, N. R.; Cochran, W. D.

    2001-11-01

    The TeMPEST project is a photometric search for transits of extrasolar giant planets orbiting at distances < ~0.1 AU from their parent stars. As is the case with HD 209458, the only known transiting system, measurements of the photometric dimming of stars with transiting planets, along with radial velocity (RV) data, will provide information on physical characteristics (mass, radius, and mean density) of these planets. Further study of HD 209458 b and planets like it might reveal their reflectivity, putting further constraints on their surface temperatures, as well as allow measurement of the composition of their outer atmospheres. To detect these types of systems, we use the McDonald Observatory 0.76m Prime Focus Camera (PFC), which provides a 46.2 arcmin square field. We are currently obtaining our first full season of data, and by early 2002 will have sufficient data to follow approximately 5,000 stars with the precision necessary to detect transits of close-orbiting Jovian planets. We also present data on the detection of the transit of the planet orbiting HD 209458 using the 0.76m PFC. These data are consistent with the partial occultation of the light from the star caused by the transit of an opaque disc of radius 1.4 R_Jup. The TeMPEST project is funded by the NASA Origins program.
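
    The quoted 1.4 R_Jup opaque disc implies a fractional dimming of roughly (Rp/Rs)^2. The quick check below uses a commonly quoted stellar radius for HD 209458 (~1.2 R_sun), which is an assumption, not a number taken from the abstract.

        # Rough consistency check: transit depth of an opaque 1.4 R_Jup disc across
        # a star of assumed radius ~1.2 R_sun (value not stated in the abstract).
        import math

        R_JUP_M, R_SUN_M = 7.149e7, 6.957e8
        r_planet = 1.4 * R_JUP_M
        r_star = 1.2 * R_SUN_M

        depth = (r_planet / r_star) ** 2
        delta_mag = -2.5 * math.log10(1.0 - depth)
        print(f"expected transit depth ~ {depth*100:.2f}% (~{delta_mag:.3f} mag)")

    The resulting ~0.015 mag dip sits comfortably above the better-than-0.01 mag precision quoted in the related 2001-season entry, consistent with the survey's detection goal.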

  9. Tempest gas turbine extends EGT product line

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chellini, R.

    With the introduction of the 7.8 MW (mechanical output) Tempest gas turbine, EGT has extended its line of small industrial turbines. The new Tempest machine, featuring a 7.5 MW electric output and a 33% thermal efficiency, ranks above the company's single-shaft Typhoon gas turbine, rated at 3.2 and 4.9 MW, and the 6.3 MW Tornado gas turbine. All three machines are well-suited for use in combined heat and power (CHP) plants, as demonstrated by the fact that close to 50% of the 150 Typhoon units sold are for CHP applications. This experience has induced EGT, of Lincoln, England, to announce the introduction of the new gas turbine prior to completion of the testing program. The present single-shaft machine is expected to be used mainly for industrial cogeneration. This market segment, covering the needs of paper mills, hospitals, chemical plants, the ceramics industry, etc., is a typically local market. Cogeneration plants are engineered according to local needs and have to be assisted by local organizations. For this reason, to efficiently cover the world market, EGT has selected a number of associates that will receive completely engineered machine packages from Lincoln and will engineer the cogeneration system according to custom requirements. These partners will also assist the customer and hold locally the spares required for maintenance operations.

  10. Solution of the Burnett equations for hypersonic flows near the continuum limit

    NASA Technical Reports Server (NTRS)

    Imlay, Scott T.

    1992-01-01

    The INCA code, a three-dimensional Navier-Stokes code for analysis of hypersonic flowfields, was modified to analyze the lower reaches of the continuum transition regime, where the Navier-Stokes equations become inaccurate and Monte Carlo methods become too computationally expensive. The two-dimensional Burnett equations and the three-dimensional rotational energy transport equation were added to the code, and one- and two-dimensional calculations were performed. For the structure of normal shock waves, the Burnett equations give consistently better results than the Navier-Stokes equations and compare reasonably well with Monte Carlo methods. For two-dimensional flow of nitrogen past a circular cylinder, the Burnett equations predict the total drag reasonably well. Care must be taken, however, not to exceed the range of validity of the Burnett equations.

  11. Efficient Computation of Atmospheric Flows with Tempest: Development of Next-Generation Climate and Weather Prediction Algorithms at Non-Hydrostatic Scales

    NASA Astrophysics Data System (ADS)

    Guerra, J. E.; Ullrich, P. A.

    2015-12-01

    Tempest is a next-generation global climate and weather simulation platform designed to allow experimentation with numerical methods at very high spatial resolutions. The atmospheric fluid equations are discretized by continuous / discontinuous finite elements in the horizontal and by a staggered nodal finite element method (SNFEM) in the vertical, coupled with implicit/explicit time integration. At global horizontal resolutions below 10km, many important questions remain on optimal techniques for solving the fluid equations. We present results from a suite of meso-scale test cases to validate the performance of the SNFEM applied in the vertical. Internal gravity wave, mountain wave, convective, and Cartesian baroclinic instability tests will be shown at various vertical orders of accuracy and compared with known results.

  12. Impact of the Level of State Tax Code Progressivity on Children's Health Outcomes

    ERIC Educational Resources Information Center

    Granruth, Laura Brierton; Shields, Joseph J.

    2011-01-01

    This research study examines the impact of the level of state tax code progressivity on selected children's health outcomes. Specifically, it examines the degree to which a state's tax code ranking along the progressive-regressive continuum relates to percentage of low birthweight babies, infant and child mortality rates, and percentage of…

  13. Continuum Vlasov Simulation in Four Phase-space Dimensions

    NASA Astrophysics Data System (ADS)

    Cohen, B. I.; Banks, J. W.; Berger, R. L.; Hittinger, J. A.; Brunner, S.

    2010-11-01

    In the VALHALLA project, we are developing scalable algorithms for the continuum solution of the Vlasov-Maxwell equations in two spatial and two velocity dimensions. We use fourth-order temporal and spatial discretizations of the conservative form of the equations and a finite-volume representation to enable adaptive mesh refinement and nonlinear oscillation control [1]. The code has been implemented with and without adaptive mesh refinement, and with electromagnetic and electrostatic field solvers. A goal is to study the efficacy of continuum Vlasov simulations in four phase-space dimensions for laser-plasma interactions. We have verified the code in examples such as the two-stream instability, the weak beam-plasma instability, Landau damping, and electron plasma waves with electron trapping and nonlinear frequency shifts [2], extended from 1D to 2D propagation, and light wave propagation. We will report progress on code development, computational methods, and physics applications. This work was performed under the auspices of the U.S. DOE by LLNL under contract no. DE-AC52-07NA27344. This work was funded by the Lab. Dir. Res. and Dev. Prog. at LLNL under project tracking code 08-ERD-031. [1] J.W. Banks and J.A.F. Hittinger, to appear in IEEE Trans. Plas. Sci. (Sept., 2010). [2] G.J. Morales and T.M. O'Neil, Phys. Rev. Lett. 28, 417 (1972); R. L. Dewar, Phys. Fluids 15, 712 (1972).
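
    As a purely illustrative aside (not taken from the VALHALLA code itself), the conservative finite-volume idea mentioned above can be sketched in a few lines of Python: cell averages are advanced by differencing face fluxes built from a fourth-order face reconstruction, so the discrete "mass" is conserved to round-off. The grid size, advection velocity, and time step below are arbitrary choices.

      # A minimal, illustrative sketch (not the VALHALLA code) of a conservative,
      # fourth-order finite-volume update for the x-advection part of a Vlasov
      # equation, f_t + v f_x = 0, on a periodic grid.
      import numpy as np

      def rhs(f, v, dx):
          """Flux-difference form: df_i/dt = -(F_{i+1/2} - F_{i-1/2}) / dx."""
          # Fourth-order face reconstruction of cell averages:
          # f_{i+1/2} = (-f_{i-1} + 7 f_i + 7 f_{i+1} - f_{i+2}) / 12
          f_face = (-np.roll(f, 1) + 7.0 * f + 7.0 * np.roll(f, -1) - np.roll(f, -2)) / 12.0
          flux = v * f_face                       # centered flux; upwinding/limiting omitted
          return -(flux - np.roll(flux, 1)) / dx

      def rk4_step(f, v, dx, dt):
          """Classical fourth-order Runge-Kutta step, matching the spatial order."""
          k1 = rhs(f, v, dx)
          k2 = rhs(f + 0.5 * dt * k1, v, dx)
          k3 = rhs(f + 0.5 * dt * k2, v, dx)
          k4 = rhs(f + dt * k3, v, dx)
          return f + dt / 6.0 * (k1 + 2.0 * k2 + 2.0 * k3 + k4)

      # Example: advect a Gaussian for one period and check that the flux form
      # conserves the discrete "mass" to round-off.
      nx, v = 128, 1.0
      x = np.linspace(0.0, 1.0, nx, endpoint=False)
      dx = x[1] - x[0]
      f0 = np.exp(-200.0 * (x - 0.5) ** 2)
      f = f0.copy()
      dt = 0.4 * dx / abs(v)
      for _ in range(int(round(1.0 / (abs(v) * dt)))):
          f = rk4_step(f, v, dx, dt)
      print("mass drift:", abs(f.sum() - f0.sum()) * dx)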

  14. Scientific Overview of Temporal Experiment for Storms and Tropical Systems (TEMPEST) Program

    NASA Astrophysics Data System (ADS)

    Chandra, C. V.; Reising, S. C.; Kummerow, C. D.; van den Heever, S. C.; Todd, G.; Padmanabhan, S.; Brown, S. T.; Lim, B.; Haddad, Z. S.; Koch, T.; Berg, G.; L'Ecuyer, T.; Munchak, S. J.; Luo, Z. J.; Boukabara, S. A.; Ruf, C. S.

    2014-12-01

    Over the past decade and a half, we have gained a better understanding of the role of clouds and precipitation in Earth's water cycle, energy budget and climate, from focused Earth science observational satellite missions. However, these missions provide only a snapshot at one point in time of the cloud's development. Processes that govern cloud system development occur primarily on time scales of the order of 5-30 minutes that are generally not observable from low Earth orbiting satellites. Geostationary satellites, in contrast, have higher temporal resolution but at present are limited to visible and infrared wavelengths that observe only the tops of clouds. This observing gap was noted by the National Research Council's Earth Science Decadal Survey in 2007. Uncertainties in global climate models are significantly affected by processes that govern the formation and dissipation of clouds that largely control the global water and energy budgets. Current uncertainties in cloud parameterization within climate models lead to drastically different climate outcomes. With all evidence suggesting that the precipitation onset may be governed by factors such as atmospheric stability, it becomes critical to have at least first-order observations globally in diverse climate regimes. Similar arguments are valid for ice processes, where more efficient ice formation and precipitation have a tendency to leave fewer ice clouds behind that have different but equally important impacts on the Earth's energy budget and resulting temperature trends. TEMPEST is a unique program that will provide a small constellation of inexpensive CubeSats with millimeter-wave radiometers to address key science needs related to cloud and precipitation processes. Because these processes are most critical in the development of climate models that will soon run at scales that explicitly resolve clouds, the TEMPEST program will directly focus on examining, validating and improving the parameterizations currently used in cloud scale models. The time evolution of cloud and precipitation microphysics is dependent upon parameterized process rates. The outcome of TEMPEST will provide a first-order understanding of how individual assumptions in current cloud model parameterizations behave in diverse climate regimes.

  15. A Continuum Mechanical Approach to Geodesics in Shape Space

    DTIC Science & Technology

    2010-01-01

    ...the space of shapes, where shapes are implicitly described as boundary contours of objects. The proposed shape metric is derived from a... investigate the close link between abstract geometry on the infinite-dimensional space of shapes and the continuum mechanical view of shapes as boundary... are texture-coded in the bottom row... of multiple components of volumetric objects. The...

  16. Approach to Shakespeare.

    ERIC Educational Resources Information Center

    Bannerman, Andrew

    1969-01-01

    For an introduction to Shakespeare's "Tempest," dramatic interest and tension were created in the classroom through taped interviews with survivors of present-day sea disasters, student improvisations of scenes, music, and historical accounts of shipwrecks. (MF)

  17. Efficient Computation of Atmospheric Flows with Tempest: Validation of Next-Generation Climate and Weather Prediction Algorithms at Non-Hydrostatic Scales

    NASA Astrophysics Data System (ADS)

    Guerra, Jorge; Ullrich, Paul

    2016-04-01

    Tempest is a next-generation global climate and weather simulation platform designed to allow experimentation with numerical methods for a wide range of spatial resolutions. The atmospheric fluid equations are discretized by continuous / discontinuous finite elements in the horizontal and by a staggered nodal finite element method (SNFEM) in the vertical, coupled with implicit/explicit time integration. At horizontal resolutions below 10km, many important questions remain on optimal techniques for solving the fluid equations. We present results from a suite of idealized test cases to validate the performance of the SNFEM applied in the vertical with an emphasis on flow features and dynamic behavior. Internal gravity wave, mountain wave, convective bubble, and Cartesian baroclinic instability tests will be shown at various vertical orders of accuracy and compared with known results.

  18. Tempest - Efficient Computation of Atmospheric Flows Using High-Order Local Discretization Methods

    NASA Astrophysics Data System (ADS)

    Ullrich, P. A.; Guerra, J. E.

    2014-12-01

    The Tempest Framework composes several compact numerical methods to easily facilitate intercomparison of atmospheric flow calculations on the sphere and in rectangular domains. This framework includes the implementations of Spectral Elements, Discontinuous Galerkin, Flux Reconstruction, and Hybrid Finite Element methods with the goal of achieving optimal accuracy in the solution of atmospheric problems. Several advantages of this approach are discussed such as: improved pressure gradient calculation, numerical stability by vertical/horizontal splitting, arbitrary order of accuracy, etc. The local numerical discretization allows for high performance parallel computation and efficient inclusion of parameterizations. These techniques are used in conjunction with a non-conformal, locally refined, cubed-sphere grid for global simulations and standard Cartesian grids for simulations at the mesoscale. A complete implementation of the methods described is demonstrated in a non-hydrostatic setting.

  19. TEMperature Pressure ESTimation of a homogeneous boiling fuel-steel mixture in an LMFBR core. [TEMPEST code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pyun, J.J.; Majumdar, D.

    The paper describes TEMPEST, a simple computer program for the temperature and pressure estimation of a boiling fuel-steel pool in an LMFBR core. The time scale of interest of this program is large, of the order of ten seconds. Further, the vigorous boiling in the pool will generate a large contact area, and hence a large heat transfer, between fuel and steel. The pool is assumed to be a uniform mixture of fuel and steel, and consequently vapor production is also assumed to be uniform throughout the pool. The pool is allowed to expand in volume if there is steel melting at the walls. In this program, the total mass of liquid and vapor fuel is always kept constant, but the total steel mass in the pool may change by steel wall melting. Because of a lack of clear understanding of the physical phenomena associated with the progression of a fuel-steel mixture at high temperature, various input options have been built in to enable one to perform parametric studies. For example, the heat transfer from the pool to the surrounding steel structure may be controlled by input values for the heat transfer coefficients, or the heat transfer may be calculated by a correlation obtained from the literature. Similarly, condensation of vapor on the top wall can be specified by input values of the condensation coefficient; the program can otherwise calculate condensation according to the non-equilibrium model predictions. Melt-through rates of the surrounding steel walls can be specified by a fixed melt rate or can be determined by a fraction of the heat loss that goes to steel melting. The melted steel is raised to the pool temperature before it is joined with the pool material. Several applications of this program to various fuel-steel pools in the FFTF and the CRBR cores are discussed.

  20. Polarized Continuum Radiation from Stellar Atmospheres

    NASA Astrophysics Data System (ADS)

    Harrington, J. Patrick

    2015-10-01

    Continuum scattering by free electrons can be significant in early type stars, while in late type stars Rayleigh scattering by hydrogen atoms or molecules may be important. Computer programs used to construct models of stellar atmospheres generally treat the scattering of the continuum radiation as isotropic and unpolarized, but this scattering has a dipole angular dependence and will produce polarization. We review an accurate method for evaluating the polarization and limb darkening of the radiation from model stellar atmospheres. We use this method to obtain results for: (i) Late type stars, based on the MARCS code models (Gustafsson et al. 2008), and (ii) Early type stars, based on the NLTE code TLUSTY (Lanz and Hubeny 2003). These results are tabulated at http://www.astro.umd.edu/~jph/Stellar_Polarization.html. While the net polarization vanishes for an unresolved spherical star, this symmetry is broken by rapid rotation or by the masking of part of the star by a binary companion or during the transit of an exoplanet. We give some numerical results for these last cases.

  1. Pressure measurements in a low-density nozzle plume for code verification

    NASA Technical Reports Server (NTRS)

    Penko, Paul F.; Boyd, Iain D.; Meissner, Dana L.; Dewitt, Kenneth J.

    1991-01-01

    Measurements of Pitot pressure were made in the exit plane and plume of a low-density, nitrogen nozzle flow. Two numerical computer codes were used to analyze the flow, including one based on continuum theory using the explicit MacCormack method, and the other on kinetic theory using the method of direct-simulation Monte Carlo (DSMC). The continuum analysis was carried to the nozzle exit plane and the results were compared to the measurements. The DSMC analysis was extended into the plume of the nozzle flow and the results were compared with measurements at the exit plane and axial stations 12, 24 and 36 mm into the near-field plume. Two experimental apparatus were used that differed in design and gave slightly different profiles of pressure measurements. The DSMC method compared well with the measurements from each apparatus at all axial stations and provided a more accurate prediction of the flow than the continuum method, verifying the validity of DSMC for such calculations.

  2. Of Tales and Tempests.

    ERIC Educational Resources Information Center

    Bottoms, Janet

    1996-01-01

    Examines the prose versions of Shakespeare plays written for children by Charles and Mary Lamb, Bernard Miles, and Leon Garfield. Suggests that the content ranges far from Shakespeare's originals and promotes values that should be questioned critically. (TB)

  3. Serenity Above, Tempests Below

    NASA Image and Video Library

    2006-01-10

    Whiffs of cloud dance in Saturn's atmosphere, while the dim crescent of Rhea (1,528 kilometers, or 949 miles, across) hangs in the distance. The dark ringplane cuts a diagonal across the top left corner of this view.

  4. Agricultural Spraying

    NASA Technical Reports Server (NTRS)

    1986-01-01

    AGDISP, a computer code written for Langley by Continuum Dynamics, Inc., aids crop dusting airplanes in targeting pesticides. The code is commercially available and can be run on a personal computer by an inexperienced operator. Called SWA+H, it is used by the Forest Service, FAA, DuPont, etc. DuPont uses the code to "test" equipment on the computer using a laser system to measure particle characteristics of various spray compounds.

  5. Neutron Capture Gamma-Ray Libraries for Nuclear Applications

    NASA Astrophysics Data System (ADS)

    Sleaford, B. W.; Firestone, R. B.; Summers, N.; Escher, J.; Hurst, A.; Krticka, M.; Basunia, S.; Molnar, G.; Belgya, T.; Revay, Z.; Choi, H. D.

    2011-06-01

    The neutron capture reaction is useful in identifying and analyzing the gamma-ray spectrum from an unknown assembly as it gives unambiguous information on its composition. This can be done passively or actively where an external neutron source is used to probe an unknown assembly. There are known capture gamma-ray data gaps in the ENDF libraries used by transport codes for various nuclear applications. The Evaluated Gamma-ray Activation file (EGAF) is a new thermal neutron capture database of discrete line spectra and cross sections for over 260 isotopes that was developed as part of an IAEA Coordinated Research Project. EGAF is being used to improve the capture gamma production in ENDF libraries. For medium to heavy nuclei the quasi continuum contribution to the gamma cascades is not experimentally resolved. The continuum contains up to 90% of all the decay energy and is modeled here with the statistical nuclear structure code DICEBOX. This code also provides a consistency check of the level scheme nuclear structure evaluation. The calculated continuum is of sufficient accuracy to include in the ENDF libraries. This analysis also determines new total thermal capture cross sections and provides an improved RIPL database. For higher energy neutron capture there is less experimental data available making benchmarking of the modeling codes more difficult. We are investigating the capture spectra from higher energy neutrons experimentally using surrogate reactions and modeling this with Hauser-Feshbach codes. This can then be used to benchmark CASINO, a version of DICEBOX modified for neutron capture at higher energy. This can be used to simulate spectra from neutron capture at incident neutron energies up to 20 MeV to improve the gamma-ray spectrum in neutron data libraries used for transport modeling of unknown assemblies.

  6. The detailed balance requirement and general empirical formalisms for continuum absorption

    NASA Technical Reports Server (NTRS)

    Ma, Q.; Tipping, R. H.

    1994-01-01

    Two general empirical formalisms are presented for the spectral density which take into account the deviations from the Lorentz line shape in the wing regions of resonance lines. These formalisms satisfy the detailed balance requirement. Empirical line shape functions, which are essential to provide the continuum absorption at different temperatures in various frequency regions for atmospheric transmission codes, can be obtained by fitting to experimental data.

  7. Peridynamics with LAMMPS : a user guide.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lehoucq, Richard B.; Silling, Stewart Andrew; Plimpton, Steven James

    2008-01-01

    Peridynamics is a nonlocal formulation of continuum mechanics. The discrete peridynamic model has the same computational structure as a molecular dynamics model. This document details the implementation of a discrete peridynamic model within the LAMMPS molecular dynamics code. This document provides a brief overview of the peridynamic model of a continuum, then discusses how the peridynamic model is discretized, and overviews the LAMMPS implementation. A nontrivial example problem is also included.
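
    To make the "same computational structure as molecular dynamics" concrete, the following toy sketch implements a bond-based peridynamic force loop for a 1D bar from scratch in Python; it is not the LAMMPS implementation, and the micromodulus, horizon, and end load are illustrative values.

      # A toy bond-based peridynamic bar, written from scratch to show the
      # molecular-dynamics-like structure (a pairwise force loop over "bonds"
      # within a horizon).  This is not the LAMMPS implementation; the
      # micromodulus c, horizon delta, and load are illustrative values.
      import numpy as np

      n, dx = 101, 1.0e-3                    # number of nodes and spacing [m]
      X = np.arange(n) * dx                  # reference positions
      u = np.zeros(n)                        # displacements
      v = np.zeros(n)                        # velocities
      rho, c, delta = 7800.0, 1.0e18, 3.015 * dx   # density, micromodulus, horizon
      vol = dx                               # nodal volume per unit cross-section

      # Bond list: every pair of nodes within the horizon (like an MD neighbor list).
      bonds = [(i, j) for i in range(n) for j in range(n)
               if i != j and abs(X[j] - X[i]) <= delta]

      def accel(u):
          """Sum pairwise bond force densities, exactly like a short-range MD force loop."""
          a = np.zeros(n)
          for i, j in bonds:
              xi = X[j] - X[i]                          # reference bond
              eta = u[j] - u[i]                         # relative displacement
              s = (abs(xi + eta) - abs(xi)) / abs(xi)   # bond stretch
              a[i] += c * s * np.sign(xi + eta) * vol / rho
          a[-1] += 1.0e6 / rho                          # constant body-force density on the end node
          return a

      # Explicit velocity-Verlet time integration, again as in molecular dynamics.
      dt, steps = 1.0e-8, 200
      a = accel(u)
      for _ in range(steps):
          v += 0.5 * dt * a
          u += dt * v
          a = accel(u)
          v += 0.5 * dt * a
      print("end-node displacement after %d steps: %.3e m" % (steps, u[-1]))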

  8. A Continuum Diffusion Model for Viscoelastic Materials

    DTIC Science & Technology

    1988-11-01

    ...these studies, which involved experimental, analytical, and materials science aspects, were conducted by researchers in the fields of physical and... thermodynamics, with irreversibility stemming from the foregoing variables through "growth laws" that correspond to viscous resistance. The physical ageing of...

  9. Investigation on a coupled CFD/DSMC method for continuum-rarefied flows

    NASA Astrophysics Data System (ADS)

    Tang, Zhenyu; He, Bijiao; Cai, Guobiao

    2012-11-01

    The purpose of the present work is to investigate the coupled CFD/DSMC method using the existing CFD and DSMC codes developed by the authors. The interface between the continuum and particle regions is determined by the gradient-length local Knudsen number. A coupling scheme combining both state-based and flux-based coupling methods is proposed in the current study. Overlapping grids are established between the different grid systems of CFD and DSMC codes. A hypersonic flow over a 2D cylinder has been simulated using the present coupled method. Comparison has been made between the results obtained from both methods, which shows that the coupled CFD/DSMC method can achieve the same precision as the pure DSMC method and obtain higher computational efficiency.
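
    The gradient-length local Knudsen number used above to place the CFD/DSMC interface can be illustrated with a short sketch; the mean-free-path model and the 0.05 breakdown threshold below are illustrative assumptions, not values taken from the paper.

      # Minimal sketch of continuum-breakdown flagging: compute a gradient-length
      # local Knudsen number Kn_GLL = lambda * |dQ/dx| / Q for a 1D field Q
      # (e.g. density) and mark cells for the particle (DSMC) solver.
      # The mean-free-path profile and the 0.05 threshold are illustrative.
      import numpy as np

      def kn_gll(q, mfp, dx):
          """Gradient-length local Knudsen number of field q on a uniform 1D grid."""
          dqdx = np.gradient(q, dx)
          return mfp * np.abs(dqdx) / np.abs(q)

      # Example: a smoothed step in density mimicking a shock layer.
      x = np.linspace(0.0, 1.0, 200)
      dx = x[1] - x[0]
      rho = 1.0 + 2.0 * (1.0 + np.tanh((x - 0.5) / 0.02))
      mfp = 0.01 / rho                          # mean free path ~ 1/density (illustrative)

      kn = kn_gll(rho, mfp, dx)
      dsmc_cells = kn > 0.05                    # hand these cells to the DSMC solver
      print("fraction of domain flagged for DSMC:", dsmc_cells.mean())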

  10. A photometric search for transiting planets

    NASA Astrophysics Data System (ADS)

    Baliber, Nairn Reese

    In the decade since the discovery of the first planet orbiting a main-sequence star other than the Sun, more than 160 planets have been detected in orbit around other stars, most of them discovered by measuring the velocity of the reflexive motion of their parent stars caused by the gravitational pull of the planets. These discoveries produced a population of planets much different to the ones in our Solar System and created interest in other methods to detect these planets. One such method is searching for transits, the slight photometric dimming of stars caused by a close-orbiting, Jupiter-sized planet passing between a star and our line of sight once per orbit. We report results from TeMPEST, the Texas, McDonald Photometric Extrasolar Search for Transits, a transit survey conducted with the McDonald Observatory 0.76 m Prime Focus Corrector (PFC). We monitored five fields of stars in the plane of the Milky Way over the course of two and a half years. We created a photometry pipeline to perform high-precision differential photometry on all of the images, and used a software detection algorithm to detect transit signals in the light curves. Although no transits were found, we calculated our detection probability by determining the fraction of the stars monitored by TeMPEST which were suitable to show transits, measuring the probability of detecting transit signals based on the temporal coverage of our fields, and measuring our detection efficiency by inserting false transits into TeMPEST data to see what fraction could be recovered by our automatic detection software. We conclude that in our entire data set, we generated an effective sample of 2660 stars, a sample in which if any star is showing a transit, it would have been detected. We found no convincing transits in our data, but current statistics from radial velocity surveys indicate that only one in about 1300 of these stars should be showing transits. These numbers are consistent with the lack of transits produced by TeMPEST and the small number of transits generated by other surveys. We therefore discuss methods by which a transit survey's effective sample may be increased to make such surveys productive in a reasonable amount of time.

  11. MaMiCo: Software design for parallel molecular-continuum flow simulations

    NASA Astrophysics Data System (ADS)

    Neumann, Philipp; Flohr, Hanno; Arora, Rahul; Jarmatz, Piet; Tchipev, Nikola; Bungartz, Hans-Joachim

    2016-03-01

    The macro-micro-coupling tool (MaMiCo) was developed to ease the development of and modularize molecular-continuum simulations, retaining sequential and parallel performance. We demonstrate the functionality and performance of MaMiCo by coupling the spatially adaptive Lattice Boltzmann framework waLBerla with four molecular dynamics (MD) codes: the light-weight Lennard-Jones-based implementation SimpleMD, the node-level optimized software ls1 mardyn, and the community codes ESPResSo and LAMMPS. We detail interface implementations to connect each solver with MaMiCo. The coupling for each waLBerla-MD setup is validated in three-dimensional channel flow simulations which are solved by means of a state-based coupling method. We provide sequential and strong scaling measurements for the four molecular-continuum simulations. The overhead of MaMiCo is found to come at 10%-20% of the total (MD) runtime. The measurements further show that scalability of the hybrid simulations is reached on up to 500 Intel SandyBridge, and more than 1000 AMD Bulldozer compute cores.

  12. Teaching Modules for Nine Plays by Shakespeare.

    ERIC Educational Resources Information Center

    Smith, Denzell

    The nine modules presented in this paper are designed to guide students in a one-semester Shakespeare Course through the reading of three Shakespearean tragedies ("Hamlet,""Othello," and "Macbeth"), three comedies ("Midsummer Night's Dream,""Merchant of Venice," and "The Tempest"), and…

  13. 2D Implosion Simulations with a Kinetic Particle Code

    NASA Astrophysics Data System (ADS)

    Sagert, Irina; Even, Wesley; Strother, Terrance

    2017-10-01

    Many problems in laboratory and plasma physics are subject to flows that move between the continuum and the kinetic regime. We discuss two-dimensional (2D) implosion simulations that were performed using a Monte Carlo kinetic particle code. The application of kinetic transport theory is motivated, in part, by the occurrence of non-equilibrium effects in inertial confinement fusion (ICF) capsule implosions, which cannot be fully captured by hydrodynamics simulations. Kinetic methods, on the other hand, are able to describe both continuum and rarefied flows. We perform simple 2D disk implosion simulations using one particle species and compare the results to simulations with the hydrodynamics code RAGE. The impact of the particle mean free path on the implosion is also explored. In a second study, we focus on the formation of fluid instabilities from induced perturbations. I.S. acknowledges support through the Director's fellowship from Los Alamos National Laboratory. This research used resources provided by the LANL Institutional Computing Program.

  14. DSMC computations of hypersonic flow separation and re-attachment in the transition to continuum regime

    NASA Astrophysics Data System (ADS)

    Prakash, Ram; Gai, Sudhir L.; O'Byrne, Sean; Brown, Melrose

    2016-11-01

    The flow over a 'tick'-shaped configuration is simulated using two Direct Simulation Monte Carlo codes: the DS2V code of Bird and the code from Sandia National Laboratories called SPARTA. The configuration creates a flow field in which the flow initially expands but is then affected by the adverse pressure gradient induced by a compression surface. The flow field is challenging in the sense that the full flow domain comprises localized areas spanning the continuum and transitional regimes. The present work focuses on the capability of SPARTA to model such flow conditions and on a comparative evaluation against results from DS2V. An extensive grid adaptation study is performed with both codes on a model with a sharp leading edge, and the converged results are then compared. The computational predictions are evaluated in terms of surface parameters such as heat flux, shear stress, pressure, and velocity slip. SPARTA consistently predicts higher values for these surface properties. The skin friction predictions of both codes give no indication of separation, but the velocity slip plots indicate incipient separation behavior at the corner. The differences in the results are attributed to the flow resolution at the leading edge, which dictates the downstream flow characteristics.

  15. Word Magic: Shakespeare's Rhetoric for Gifted Students.

    ERIC Educational Resources Information Center

    Kester, Ellen S.

    Intended for teachers of gifted students in grades 4-12, the curriculum uses six of Shakespeare's comedies ("The Taming of the Shrew,""The Tempest,""Twelfth Night,""The Comedy of Errors,""As You Like It," and "A Midsummer Night's Dream") as materials for nurturing intellectual and…

  16. 'Towers in the Tempest' Computer Animation Submission

    NASA Technical Reports Server (NTRS)

    Shirah, Greg

    2008-01-01

    The following describes a computer animation that has been submitted to the ACM/SIGGRAPH 2008 computer graphics conference: 'Towers in the Tempest' clearly communicates recent scientific research into how hurricanes intensify. This intensification can be caused by a phenomenon called a 'hot tower.' For the first time, research meteorologists have run complex atmospheric simulations at a very fine temporal resolution of 3 minutes. Combining this simulation data with satellite observations enables detailed study of 'hot towers.' The science of 'hot towers' is described using satellite observation data, conceptual illustrations, and volumetric atmospheric simulation data. The movie starts by showing a 'hot tower' observed in the three-dimensional precipitation radar data of Hurricane Bonnie from NASA's Tropical Rainfall Measuring Mission (TRMM) spacecraft. Next, the dynamics of a hurricane and the formation of 'hot towers' are briefly explained using conceptual illustrations. Finally, volumetric cloud, wind, and vorticity data from a supercomputer simulation of Hurricane Bonnie are shown using volume rendering techniques such as ray marching.

  17. Self-consistent continuum solvation for optical absorption of complex molecular systems in solution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Timrov, Iurii; Biancardi, Alessandro; Andreussi, Oliviero

    2015-01-21

    We introduce a new method to compute the optical absorption spectra of complex molecular systems in solution, based on the Liouville approach to time-dependent density-functional perturbation theory and the revised self-consistent continuum solvation model. The former allows one to obtain the absorption spectrum over a wide frequency range, using a recently proposed Lanczos-based technique, or selected excitation energies, using the Casida equation, without having to ever compute any unoccupied molecular orbitals. The latter is conceptually similar to the polarizable continuum model and offers the further advantages of allowing an easy computation of atomic forces via the Hellmann-Feynman theorem and a ready implementation in periodic-boundary conditions. The new method has been implemented using pseudopotentials and plane-wave basis sets, benchmarked against polarizable continuum model calculations on 4-aminophthalimide, alizarin, and cyanin, and made available through the QUANTUM ESPRESSO distribution of open-source codes.

  18. Davies the Manipulator of "The Salterton Trilogy."

    ERIC Educational Resources Information Center

    Tedford, Barbara W.

    Some critics of Robertson Davies' three novels that comprise the Salterton trilogy, "Tempest-Tost" (1951), "Leaven of Malice" (1954), and "A Mixture of Frailties" (1958) complain of their creaky novelistic machinery, suggesting that they merely show an essayist, or journalist, becoming a novelist. These three novels,…

  19. Code CUGEL: A code to unfold Ge(Li) spectrometer polyenergetic gamma photon experimental distributions

    NASA Technical Reports Server (NTRS)

    Steyn, J. J.; Born, U.

    1970-01-01

    A FORTRAN code was developed for the Univac 1108 digital computer to unfold polyenergetic gamma photon experimental distributions from lithium-drifted germanium [Ge(Li)] semiconductor spectrometers. It was designed to analyze the combined continuous and monoenergetic gamma radiation field of radioisotope volumetric sources. The code generates the detector system response matrix function and applies it to monoenergetic spectral components discretely and to the continuum iteratively. It corrects for system drift, source decay, background, and detection efficiency. Results are presented in digital form for differential and integrated photon number and energy distributions, and for exposure dose.
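
    As a generic illustration of iterative unfolding of the kind alluded to above (not CUGEL's actual algorithm), the sketch below applies a damped fixed-point iteration to recover an incident spectrum from a measured one through a toy response matrix; the response model and damping factor are illustrative.

      # A generic, illustrative iterative unfolding sketch (not CUGEL's actual
      # algorithm): a damped fixed-point iteration x <- x + w (y - R x) that
      # estimates the incident spectrum x from a measured spectrum y = R x.
      # The response matrix R is a toy model: 70% of counts land in the
      # full-energy bin and 30% are spread uniformly over the bins below it.
      import numpy as np

      def unfold(R, y, n_iter=300, w=0.5):
          """Iteratively unfold y = R x, keeping the estimate non-negative."""
          x = y.copy()
          for _ in range(n_iter):
              x = np.clip(x + w * (y - R @ x), 0.0, None)
          return x

      nbins = 50
      true_x = np.zeros(nbins)
      true_x[[15, 35]] = [1000.0, 400.0]        # two monoenergetic lines

      R = 0.7 * np.eye(nbins)
      for j in range(1, nbins):
          R[:j, j] += 0.3 / j                   # Compton-like continuum below each line

      measured = R @ true_x
      recovered = unfold(R, measured)
      print("recovered line intensities:", recovered[15].round(1), recovered[35].round(1))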

  20. IUTAM Symposium on Statistical Energy Analysis, 8-11 July 1997, Programme

    DTIC Science & Technology

    1997-01-01

    This was the first international scientific gathering devoted... energy flow, continuum dynamics, vibrational energy, statistical energy analysis (SEA)... When harmonic motion and time-averaging are considered, the following...

  1. Enhanced verification test suite for physics simulation codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kamm, James R.; Brock, Jerry S.; Brandon, Scott T.

    2008-09-01

    This document discusses problems with which to augment, in quantity and in quality, the existing tri-laboratory suite of verification problems used by Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL). The purpose of verification analysis is to demonstrate whether the numerical results of the discretization algorithms in physics and engineering simulation codes provide correct solutions of the corresponding continuum equations.
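
    Verification suites of this kind are typically exercised by checking that the observed order of convergence matches the design order of the discretization; a minimal, generic calculation of that quantity (not taken from the report) is sketched below with illustrative error norms.

      # A generic observed-order-of-convergence check of the kind used in code
      # verification studies: given error norms against an exact solution on grids
      # refined by a factor r, the observed order is p = log(E_coarse/E_fine)/log(r).
      import math

      def observed_order(e_coarse, e_fine, r):
          """Observed convergence order from two error norms and the refinement ratio r."""
          return math.log(e_coarse / e_fine) / math.log(r)

      # Example with illustrative error norms from three grid levels (refinement ratio 2):
      errors = [4.0e-2, 1.1e-2, 2.8e-3]
      for e_c, e_f in zip(errors, errors[1:]):
          print("observed order:", round(observed_order(e_c, e_f, 2.0), 2))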

  2. Performance evaluation of rotating pump jet mixing of radioactive wastes in Hanford Tanks 241-AP-102 and -104

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Onishi, Y.; Recknagle, K.P.

    The purpose of this study was to confirm the adequacy of a single mixer pump to fully mix the wastes that will be stored in Tanks 241-AP-102 and -104. These Hanford double-shell tanks (DSTs) will be used as staging tanks to receive low-activity wastes from other Hanford storage tanks and, in turn, will supply the wastes to private waste vitrification facilities for eventual solidification. The TEMPEST computer code was applied to Tanks AP-102 and -104 to simulate waste mixing generated by the 60-ft/s rotating jets and to determine the effectiveness of the single rotating pump in mixing the waste. TEMPEST simulates flow, mass/heat transport, and chemical reactions (equilibrium and kinetic) coupled together. Section 2 describes the pump jet mixing conditions the authors evaluated, the modeling cases, and their parameters. Section 3 reports model applications and assessment results. The summary and conclusions are presented in Section 4, and cited references are listed in Section 5.

  3. Simulation of Carbon Production from Material Surfaces in Fusion Devices

    NASA Astrophysics Data System (ADS)

    Marian, J.; Verboncoeur, J.

    2005-10-01

    Impurity production at carbon surfaces by plasma bombardment is a key issue for fusion devices as modest amounts can lead to excessive radiative power loss and/or hydrogenic D-T fuel dilution. Here results of molecular dynamics (MD) simulations of physical and chemical sputtering of hydrocarbons are presented for models of graphite and amorphous carbon, the latter formed by continuous D-T impingement in conditions that mimic fusion devices. The results represent more extensive simulations than we reported last year, including incident energies in the 30-300 eV range for a variety of incident angles that yield a number of different hydrocarbon molecules. The calculated low-energy yields clarify the uncertainty in the complex chemical sputtering rate since chemical bonding and hard-core repulsion are both included in the interatomic potential. Also modeled is hydrocarbon break-up by electron-impact collisions and transport near the surface. Finally, edge transport simulations illustrate the sensitivity of the edge plasma properties arising from moderate changes in the carbon content. The models will provide the impurity background for the TEMPEST kinetic edge code.

  4. Possible Pasts: Historiography and Legitimation in "Henry VIII."

    ERIC Educational Resources Information Center

    Kamps, Ivo

    1996-01-01

    Aims to rehabilitate the reputation of Shakespeare's "Henry VIII" and emphasizes its potential usefulness in the classroom by reconsidering it in the context of Renaissance history writing. Shows how "Henry VIII" can be taught as a commentary on or seen as a continuation of incipient themes in "The Tempest" and…

  5. SIMULATING THE EFFECTS OF UPSTREAM TURBULENCE ON DISPERSION AROUND A BUILDING

    EPA Science Inventory

    The effects of high turbulence versus no turbulence in a sheared boundary-layer flow approaching a building are being investigated by a turbulent kinetic energy/dissipation (k-e) model (TEMPEST). The effects on both the mean flow and the concentration field around a cubical build...

  6. Tempest, Arizona: Criminal Epistemologies and the Rhetorical Possibilities of Raza Studies

    ERIC Educational Resources Information Center

    Serna, Elias

    2013-01-01

    This essay looks at Ethnic Studies activism in Arizona through a rhetorical lens in order to highlight epistemological aspects of activities such as a high school Chicano Literature class, Roberto "Dr. Cintli" Rodriguez's journalism, and student activism to defend the Mexican-American Studies Department. Taking rhetoric's premise that…

  7. A Decade of Inquiry: Tempest in a Teacup?

    ERIC Educational Resources Information Center

    Merwin, William

    This review covers research literature concerning the inquiry method reported by "Social Education" from 1960 to 1968. Studies were selected which dealt with either the learning outcomes or adaptability aspects of inquiry. It is cautiously concluded that under certain conditions inquiry is at least as effective as more traditional…

  8. Tinkering Change vs. System Change

    ERIC Educational Resources Information Center

    Hubbard, Russ

    2009-01-01

    In this article, the author makes a distinction between two kinds of change: tinkering change and systemic change. Tinkering change includes reforms intended to address a specific deficiency or practice. Such tinkering change can be contrasted to what Shakespeare termed "sea change" in "The Tempest" ("a sea change into something rich and strange")…

  9. Shakespeare in an Elementary School Setting.

    ERIC Educational Resources Information Center

    Wood, Robin H.

    1997-01-01

    For almost 50 years, the 8th-grade graduating class at a New Jersey private elementary school has presented an expertly produced Shakespeare play, alternating between "The Tempest" and "A Midsummer Night's Dream." The whole school becomes involved, from younger kids reading story versions of the plays, to older kids making…

  10. Automatic generation of user material subroutines for biomechanical growth analysis.

    PubMed

    Young, Jonathan M; Yao, Jiang; Ramasubramanian, Ashok; Taber, Larry A; Perucchio, Renato

    2010-10-01

    The analysis of the biomechanics of growth and remodeling in soft tissues requires the formulation of specialized pseudoelastic constitutive relations. The nonlinear finite element analysis package ABAQUS allows the user to implement such specialized material responses through the coding of a user material subroutine called UMAT. However, hand coding UMAT subroutines is a challenge even for simple pseudoelastic materials and requires substantial time to debug and test the code. To resolve this issue, we develop an automatic UMAT code generation procedure for pseudoelastic materials using the symbolic mathematics package MATHEMATICA and extend the UMAT generator to include continuum growth. The performance of the automatically coded UMAT is tested by simulating the stress-stretch response of a material defined by a Fung-orthotropic strain energy function, subject to uniaxial stretching, equibiaxial stretching, and simple shear in ABAQUS. The MATHEMATICA UMAT generator is then extended to include continuum growth by adding a growth subroutine to the automatically generated UMAT. The MATHEMATICA UMAT generator correctly derives the variables required in the UMAT code, quickly providing a ready-to-use UMAT. In turn, the UMAT accurately simulates the pseudoelastic response. In order to test the growth UMAT, we simulate the growth-based bending of a bilayered bar with differing fiber directions in a nongrowing passive layer. The anisotropic passive layer, being topologically tied to the growing isotropic layer, causes the bending bar to twist laterally. The results of simulations demonstrate the validity of the automatically coded UMAT, used in both standardized tests of hyperelastic materials and for a biomechanical growth analysis.
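
    The symbolic-differentiation idea behind such a UMAT generator can be illustrated in Python with SymPy rather than MATHEMATICA; the sketch below derives the second Piola-Kirchhoff stress from a simple compressible neo-Hookean energy, which stands in for, but is not, the Fung-orthotropic model used in the paper, and is not the authors' generator.

      # A SymPy sketch of the symbolic-differentiation idea behind automatic UMAT
      # generation: derive the second Piola-Kirchhoff stress S = 2 dW/dC from a
      # strain-energy function.  A simple compressible neo-Hookean energy is used
      # purely for illustration; it is not the Fung-orthotropic model of the paper,
      # and this is not the authors' MATHEMATICA generator.
      import sympy as sp

      mu, lam = sp.symbols("mu lam", positive=True)

      # Right Cauchy-Green tensor with all nine entries treated as independent
      # symbols (the usual trick for differentiating scalar invariants).
      C = sp.Matrix(3, 3, lambda i, j: sp.Symbol(f"C{i + 1}{j + 1}"))
      I1 = C.trace()
      J = sp.sqrt(C.det())

      W = mu / 2 * (I1 - 3) - mu * sp.log(J) + lam / 2 * sp.log(J) ** 2

      # S_ij = 2 dW/dC_ij; the same machinery would also give the material tangent
      # dS/dC needed for the UMAT Jacobian.
      S = 2 * sp.Matrix(3, 3, lambda i, j: sp.diff(W, C[i, j]))
      print(sp.simplify(S[0, 0]))   # closed-form S_11 component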

  11. Breakdown and Limit of Continuum Diffusion Velocity for Binary Gas Mixtures from Direct Simulation

    NASA Astrophysics Data System (ADS)

    Martin, Robert Scott; Najmabadi, Farrokh

    2011-05-01

    This work investigates the breakdown of the continuum relations for diffusion velocity in inert binary gas mixtures. Values of the relative diffusion velocities for components of a gas mixture may be calculated using Chapman-Enskog theory and occur not only due to concentration gradients, but also pressure and temperature gradients in the flow, as described by Hirschfelder. Because Chapman-Enskog theory employs a linear perturbation around equilibrium, it is expected to break down when the velocity distribution deviates significantly from equilibrium. This breakdown of the overall flow has long been an area of interest in rarefied gas dynamics. By comparing the continuum values to results from Bird's DS2V Monte Carlo code, we propose a new limit on the continuum approach specific to binary gases. To remove the confounding influence of an inconsistent molecular model, we also present the application of the variable soft sphere (VSS) model used in DS2V to the continuum diffusion velocity calculation. Fitting sample asymptotic curves to the breakdown, a limit, Vmax, that is a fraction of an analytically derived limit resulting from the kinetic temperature of the mixture is proposed. With an expected deviation of only 2% between the physical values and continuum calculations within ±Vmax/4, we suggest this as a conservative estimate on the range of applicability for the continuum theory.

  12. Particle kinetic simulation of high altitude hypervelocity flight

    NASA Technical Reports Server (NTRS)

    Boyd, Iain; Haas, Brian L.

    1994-01-01

    Rarefied flows about hypersonic vehicles entering the upper atmosphere or through nozzles expanding into a near vacuum may only be simulated accurately with a direct simulation Monte Carlo (DSMC) method. Under this grant, researchers enhanced the models employed in the DSMC method and performed simulations in support of existing NASA projects or missions. DSMC models were developed and validated for simulating rotational, vibrational, and chemical relaxation in high-temperature flows, including effects of quantized anharmonic oscillators and temperature-dependent relaxation rates. State-of-the-art advancements were made in simulating coupled vibration-dissociation recombination for post-shock flows. Models were also developed to compute vehicle surface temperatures directly in the code rather than requiring isothermal estimates. These codes were instrumental in simulating aerobraking of NASA's Magellan spacecraft during orbital maneuvers to assess heat transfer and aerodynamic properties of the delicate satellite. NASA also depended upon simulations of entry of the Galileo probe into the atmosphere of Jupiter to provide drag and flow field information essential for accurate interpretation of an onboard experiment. Finally, the codes have been used extensively to simulate expanding nozzle flows in low-power thrusters in support of propulsion activities at NASA-Lewis. Detailed comparisons between continuum calculations and DSMC results helped to quantify the limitations of continuum CFD codes in rarefied applications.

  13. QR Codes in Education and Communication

    ERIC Educational Resources Information Center

    Durak, Gurhan; Ozkeskin, E. Emre; Ataizi, Murat

    2016-01-01

    Technological advances brought applications of innovations to education. Conventional education increasingly flourishes with new technologies accompanied by more learner active environments. In this continuum, there are learners preferring self-learning. Traditional learning materials yield attractive, motivating and technologically enhanced…

  14. RAINIER: A simulation tool for distributions of excited nuclear states and cascade fluctuations

    NASA Astrophysics Data System (ADS)

    Kirsch, L. E.; Bernstein, L. A.

    2018-06-01

    A new code has been developed named RAINIER that simulates the γ-ray decay of discrete and quasi-continuum nuclear levels for a user-specified range of energy, angular momentum, and parity including a realistic treatment of level spacing and transition width fluctuations. A similar program, DICEBOX, uses the Monte Carlo method to simulate level and width fluctuations but is restricted in its initial level population algorithm. On the other hand, modern reaction codes such as TALYS and EMPIRE populate a wide range of states in the residual nucleus prior to γ-ray decay, but do not go beyond the use of deterministic functions and therefore neglect cascade fluctuations. This combination of capabilities allows RAINIER to be used to determine quasi-continuum properties through comparison with experimental data. Several examples are given that demonstrate how cascade fluctuations influence experimental high-resolution γ-ray spectra from reactions that populate a wide range of initial states.

  15. REX3DV1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holm, Elizabeth A.

    2002-03-28

    REX3D is a FORTRAN code for three-dimensional Monte Carlo Potts model (MCPM) recrystallization and grain growth. A continuum grain structure is mapped onto a three-dimensional lattice. The mapping procedure is analogous to color bitmapping the grain structure; grains are clusters of pixels (sites) of the same color (spin). The total system energy is given by the Potts Hamiltonian, and the kinetics of grain growth are determined through a Monte Carlo technique with a nonconserved order parameter (Glauber dynamics). The code can be compiled and run on UNIX/Linux platforms.
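
    A toy Python version of one Monte Carlo sweep of such a Potts-model grain-growth simulation is sketched below; it is not the REX3D FORTRAN code, and the lattice size, number of spin states, and temperature are illustrative.

      # A toy Python version of one Monte Carlo sweep of a 3D Potts-model
      # grain-growth simulation with nonconserved (Glauber-style) spin-flip
      # dynamics.  It is not the REX3D FORTRAN code; lattice size, number of
      # spins, and temperature are illustrative.
      import numpy as np

      rng = np.random.default_rng(0)
      L, q, kT = 24, 32, 0.3                       # lattice edge, spin states, temperature
      spins = rng.integers(0, q, size=(L, L, L))   # initial "color bitmap" of the grain structure

      NEIGHBORS = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]

      def site_energy(s, i, j, k, spin):
          """Potts energy of one site: +1 per unlike nearest neighbor (periodic lattice)."""
          return sum(spin != s[(i + di) % L, (j + dj) % L, (k + dk) % L]
                     for di, dj, dk in NEIGHBORS)

      def mc_sweep(s):
          """One sweep: on average one attempted reorientation per lattice site."""
          for _ in range(L ** 3):
              i, j, k = rng.integers(0, L, size=3)
              di, dj, dk = NEIGHBORS[rng.integers(len(NEIGHBORS))]
              new = s[(i + di) % L, (j + dj) % L, (k + dk) % L]   # adopt a neighbor's orientation
              dE = site_energy(s, i, j, k, new) - site_energy(s, i, j, k, s[i, j, k])
              if rng.random() < 1.0 / (1.0 + np.exp(dE / kT)):    # Glauber acceptance
                  s[i, j, k] = new

      for _ in range(5):
          mc_sweep(spins)
      print("distinct grain orientations remaining:", np.unique(spins).size)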

  16. The water vapour self-continuum absorption in the infrared atmospheric windows: new laser measurements near 3.3 and 2.0 µm

    NASA Astrophysics Data System (ADS)

    Lechevallier, Loic; Vasilchenko, Semen; Grilli, Roberto; Mondelain, Didier; Romanini, Daniele; Campargue, Alain

    2018-04-01

    The amplitude, the temperature dependence, and the physical origin of the water vapour absorption continuum are a long-standing issue in molecular spectroscopy with direct impact in atmospheric and planetary sciences. In recent years, we have determined the self-continuum absorption of water vapour at different spectral points of the atmospheric windows at 4.0, 2.1, 1.6, and 1.25 µm, by highly sensitive cavity-enhanced laser techniques. These accurate experimental constraints have been used to adjust the last version (3.2) of the semi-empirical MT_CKD model (Mlawer-Tobin_Clough-Kneizys-Davies), which is widely incorporated in atmospheric radiative-transfer codes. In the present work, the self-continuum cross-sections, CS, are newly determined at 3.3 µm (3007 cm-1) and 2.0 µm (5000 cm-1) by optical-feedback-cavity enhanced absorption spectroscopy (OFCEAS) and cavity ring-down spectroscopy (CRDS), respectively. These new data allow extending the spectral coverage of the 4.0 and 2.1 µm windows, respectively, and testing the recently released 3.2 version of the MT_CKD continuum. By considering high temperature literature data together with our data, the temperature dependence of the self-continuum is also obtained.

  17. Investigation of Perceptual-Motor Behavior Across the Expert Athlete to Disabled Patient Skill Continuum can Advance Theory and Practical Application.

    PubMed

    Müller, Sean; Vallence, Ann-Maree; Winstein, Carolee

    2017-12-14

    A framework is presented of how theoretical predictions can be tested across the expert athlete to disabled patient skill continuum. Common-coding theory is used as the exemplar to discuss sensory and motor system contributions to perceptual-motor behavior. Behavioral and neural studies investigating expert athletes and patients recovering from cerebral stroke are reviewed. They provide evidence of bi-directional contributions of visual and motor systems to perceptual-motor behavior. The majority of this research is focused on perceptual-motor performance or learning, with less on transfer. The field is ripe for research designed to test theoretical predictions across the expert athlete to disabled patient skill continuum. Our view has implications for theory and practice in sports science, physical education, and rehabilitation.

  18. Experimental verification of a progressive damage model for composite laminates based on continuum damage mechanics. M.S. Thesis Final Report

    NASA Technical Reports Server (NTRS)

    Coats, Timothy William

    1994-01-01

    Progressive failure is a crucial concern when using laminated composites in structural design. Therefore the ability to model damage and predict the life of laminated composites is vital. The purpose of this research was to experimentally verify the application of the continuum damage model, a progressive failure theory utilizing continuum damage mechanics, to a toughened material system. Damage due to tension-tension fatigue was documented for the IM7/5260 composite laminates. Crack density and delamination surface area were used to calculate matrix cracking and delamination internal state variables, respectively, to predict stiffness loss. A damage dependent finite element code qualitatively predicted trends in transverse matrix cracking, axial splits and local stress-strain distributions for notched quasi-isotropic laminates. The predictions were similar to the experimental data and it was concluded that the continuum damage model provided a good prediction of stiffness loss while qualitatively predicting damage growth in notched laminates.

  19. Hard X rays and low-energy gamma rays from the Moon: Dependence of the continuum on the regolith composition and the solar activity

    NASA Astrophysics Data System (ADS)

    Banerjee, D.; Gasnault, O.

    2008-07-01

    The primary aim of the high-energy X-ray spectrometer (HEX) experiment on the Chandrayaan-1 mission to the Moon is to characterize the movement of volatiles on the lunar surface through the detection of the 46.5 keV line from 210Pb, a decay product of 222Rn. An important consideration for design and operation of HEX is to estimate the continuum background signal expected from the lunar surface, as well as its dependence on solar activity and lunar composition. We have developed a Monte Carlo code utilizing Geant4 for simulating the interaction of cosmic rays in the lunar regolith, and we estimated the variation in the continuum background in the energy region of interest for various lunar compositions. Dependence of the continuum background on solar activity was also evaluated considering ferroan anorthositic (FAN) composition. Our results suggest the viability of inferring lithologic characteristics of planetary surfaces based on a study of low-energy gamma ray emission.

  20. pacce: Perl algorithm to compute continuum and equivalent widths

    NASA Astrophysics Data System (ADS)

    Riffel, Rogério; Borges Vale, Tibério

    2011-08-01

    We present the Perl Algorithm to Compute Continuum and Equivalent Widths (pacce). We describe the methods used in the computations and the requirements for its usage. We compare the measurements made with pacce and "manual" ones made using the IRAF splot task. These tests show that for synthetic simple stellar population (SSP) models the equivalent width strengths are very similar (differences ≲0.2 Å) for both measurements. In real stellar spectra, the correlation between both values is still very good, but with differences of up to 0.5 Å. pacce is also able to determine mean continuum and continuum at line center values, which are helpful in stellar population studies. In addition, it is also able to compute the uncertainties in the equivalent widths using photon statistics. The code is made available for the community through the web at http://www.if.ufrgs.br/~riffel/software.html .
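
    The quantities pacce reports can be illustrated with a short Python sketch (pacce itself is written in Perl): a pseudo-continuum interpolated across two side bands, the equivalent width of the feature between them, and the continuum at line center. The wavelength bands, toy spectrum, and simple linear pseudo-continuum below are illustrative, not necessarily pacce's exact recipe.

      # An illustrative Python version of the quantities pacce computes (pacce
      # itself is written in Perl): a pseudo-continuum interpolated across two
      # side bands, the equivalent width of the feature between them, and the
      # continuum at line center.  Bands and spectrum are toy values.
      import numpy as np

      def equivalent_width(wave, flux, blue_band, red_band, feature_band):
          """EW = integral of (1 - F/F_c) dlambda over the feature band [Angstrom]."""
          # Pseudo-continuum: straight line through the mean flux of the two side bands.
          bx, rx = np.mean(blue_band), np.mean(red_band)
          by = flux[(wave >= blue_band[0]) & (wave <= blue_band[1])].mean()
          ry = flux[(wave >= red_band[0]) & (wave <= red_band[1])].mean()
          cont = by + (ry - by) * (wave - bx) / (rx - bx)

          m = (wave >= feature_band[0]) & (wave <= feature_band[1])
          ew = np.trapz(1.0 - flux[m] / cont[m], wave[m])
          cont_center = np.interp(np.mean(feature_band), wave, cont)
          return ew, cont_center

      # Toy absorption feature (Gaussian, depth 0.6, sigma 2 A) on a flat continuum.
      wave = np.linspace(4820.0, 4900.0, 801)
      flux = 1.0 - 0.6 * np.exp(-0.5 * ((wave - 4861.0) / 2.0) ** 2)
      ew, cc = equivalent_width(wave, flux, (4827.0, 4847.0), (4876.0, 4891.0), (4847.0, 4876.0))
      print("EW = %.2f A, continuum at line center = %.3f" % (ew, cc))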

  1. Discipline Issues: Is There a Tempest Brewing in B.C. Schools?

    ERIC Educational Resources Information Center

    Fraser, Stephen R.

    1987-01-01

    Educational policy in British Columbia does not distinguish between special needs and regular class students in relation to discipline practices. Although Canadian courts have generally upheld the rights of school boards rather than the unspecified rights of special needs children, a recent court case suggests the possibility of change. (JW)

  2. NUMERICAL SIMULATION TO DETERMINE THE EFFECTS OF INCIDENT WIND SHEAR AND TURBULENCE LEVEL ON THE FLOW AROUND A BUILDING

    EPA Science Inventory

    The effects of incident shear and turbulence on flow around a cubical building are being investigated by a turbulent kinetic energy dissipation (k-e) model (TEMPEST). The numerical simulations demonstrate significant effects due to the differences in the incident flow. The addition...

  3. Tempests into Rainbows. Managing Turbulence.

    ERIC Educational Resources Information Center

    Fleming, Robben W.

    This autobiography recounts personal experiences as a college professor and administrator (President of the University of Michigan) during the 1960s to the early 1980s. The 17 chapters discuss early years growing up in Illinois; college years; employment with the federal government; enlistment in the Army; the war in Germany; the end of the war,…

  4. A Triple Tropical Tempest Train: Karina, Lowell, Marie

    NASA Image and Video Library

    2014-08-22

    NASA and NOAA satellites are studying the triple tropical tempests that are now romping through the Eastern Pacific Ocean. NOAA's GOES-West satellite captured Tropical Storm Karina, Tropical Storm Lowell and newly formed Tropical Storm Marie on August 22. NOAA's GOES-West satellite captured all three storms in an infrared image at 0900 UTC (5 a.m. EDT), in which Tropical Storm Lowell clearly dwarfs Karina to its west and Marie to the east. The infrared image was created at NASA/NOAA's GOES Project at the NASA Goddard Space Flight Center in Greenbelt, Maryland. For more information about Lowell, visit: www.nasa.gov/content/goddard/12e-eastern-pacific-ocean/ For more information about Karina, visit: www.nasa.gov/content/goddard/karina-eastern-pacific/ Rob Gutro, NASA's Goddard Space Flight Center

  5. Optical observables in stars with non-stationary atmospheres. [fireballs and cepheid models

    NASA Technical Reports Server (NTRS)

    Hillendahl, R. W.

    1980-01-01

    Experience gained by use of Cepheid modeling codes to predict the dimensional and photometric behavior of nuclear fireballs is used as a means of validating various computational techniques used in the Cepheid codes. Predicted results from Cepheid models are compared with observations of the continuum and lines in an effort to demonstrate that the atmospheric phenomena in Cepheids are quite complex but that they can be quantitatively modeled.

  6. Advances in Computational Capabilities for Hypersonic Flows

    NASA Technical Reports Server (NTRS)

    Kumar, Ajay; Gnoffo, Peter A.; Moss, James N.; Drummond, J. Philip

    1997-01-01

    The paper reviews the growth and advances in computational capabilities for hypersonic applications over the period from the mid-1980's to the present day. The current status of the code development issues such as surface and field grid generation, algorithms, physical and chemical modeling, and validation is provided. A brief description of some of the major codes being used at NASA Langley Research Center for hypersonic continuum and rarefied flows is provided, along with their capabilities and deficiencies. A number of application examples are presented, and future areas of research to enhance accuracy, reliability, efficiency, and robustness of computational codes are discussed.

  7. Fatal Amusements: Contemplating the Tempest of Contemporary Media and American Culture

    ERIC Educational Resources Information Center

    Strate, Lance

    2016-01-01

    Our use of the electronic media to conduct serious discourse raises the question of whether "we are amusing ourselves to death," as Neil Postman argued. The approach known as "media ecology," the study of media as environments, which emphasizes the need to understand context and find balance, provides a basis for the analysis…

  8. Transforming Conceptual Space into a Creative Learning Place: Crossing a Threshold

    ERIC Educational Resources Information Center

    Moffat, Kirstine; McKim, Anne

    2016-01-01

    This article describes, discusses and reflects on a teaching and learning experiment in a first year BA course. Students were led out of the lecture room to a different space, the New Place Theatre. While this move out of the usual teaching space was appropriate for the text being studied, William Shakespeare's "The Tempest", the…

  9. Tempest in a Therapeutic Community: Implementation and Evaluation Issues for Faith-Based Programming

    ERIC Educational Resources Information Center

    Scott, Diane L.; Crow, Matthew S.; Thompson, Carla J.

    2010-01-01

    The therapeutic community (TC) is an increasingly utilized intervention model in corrections settings. Rarely do these TCs include faith-based curriculum other than that included in Alcoholics Anonymous or Narcotics Anonymous programs as does the faith-based TC that serves as the basis for this article. Borrowing from the successful TC model, the…

  10. The LTS timing analysis program :

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Armstrong, Darrell Jewell; Schwarz, Jens

    The LTS Timing Analysis program described in this report uses signals from the Tempest Lasers, Pulse Forming Lines, and Laser Spark Detectors to carry out calculations that quantify and monitor the performance of the Z-Accelerator's laser-triggered SF6 switches. The program analyzes Z-shots beginning with Z2457, when Laser Spark Detector data became available for all lines.

  11. Watered by Tempests: Hurricanes in the Cultural Fabric of the United Houma Nation

    ERIC Educational Resources Information Center

    D'Oney, J. Daniel

    2008-01-01

    Hurricanes Katrina and Rita affected hundreds of thousands in southern Louisiana. To say that they touched people of every stripe and color dramatically is a gross understatement. Aside from the loss of life and property damage, families were uprooted, traditions disrupted, and one of the largest migrations in American history forced on a state…

  12. Monstrous No More: How Shakespeare's Caliban and the Community College Student Aspire Together

    ERIC Educational Resources Information Center

    Gold Wright, Jill Y.

    2006-01-01

    Many students enter classes like the Shakespeare character Caliban, knowing books to be powerful but feeling eluded by them, unable to access their knowledge. Author Jill Wright shares new-found inspiration and insight she discovered while co-directing Act III, Scene ii of William Shakespeare's "The Tempest" and suddenly realized a…

  13. "Score Choice": A Tempest in a Teapot?

    ERIC Educational Resources Information Center

    Hoover, Eric

    2009-01-01

    A new option that allows students to choose which of their test scores to send to colleges has generated renewed criticism of the College Board. College Board officials tout the option, called Score Choice, as a way to ease test taker anxiety. Some prominent admissions officials have publicly described Score Choice as a sales tactic that will…

  14. Television and the Crisis in the Humanities.

    ERIC Educational Resources Information Center

    Burns, Gary

    It is indeed a problem, perhaps even a crisis, that many Americans are ignorant of "The Tempest," the Civil War, the location of the Persian Gulf, the Constitution, or the chief justice of the Supreme Court. However, if conservative humanists continue to ostracize, scorn, and ignore both media studies and the media themselves, the result will not…

  15. On the Nature of Off-limb Flare Continuum Sources Detected by SDO /HMI

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heinzel, P.; Kašparová, J.; Kleint, L.

    The Helioseismic and Magnetic Imager on board the Solar Dynamics Observatory has provided unique observations of off-limb flare emission. White-light continuum enhancements were detected in the “continuum” channel of the Fe 6173 Å line during the impulsive phase of the observed flares. In this paper we aim to determine which radiation mechanism is responsible for such enhancement being seen above the limb, at chromospheric heights around or below 1000 km. Using a simple analytical approach, we compare two candidate mechanisms, the hydrogen recombination continuum (Paschen) and the Thomson continuum due to scattering of disk radiation on flare electrons. Both mechanisms depend on the electron density, which is typically enhanced during the impulsive phase of a flare as the result of collisional ionization (both thermal and also non-thermal due to electron beams). We conclude that for electron densities higher than 10¹² cm⁻³, the Paschen recombination continuum significantly dominates the Thomson scattering continuum and there is some contribution from the hydrogen free–free emission. This is further supported by detailed radiation-hydrodynamical (RHD) simulations of the flare chromosphere heated by the electron beams. We use the RHD code FLARIX to compute the temporal evolution of the flare heating in a semi-circular loop. The synthesized continuum structure above the limb resembles the off-limb flare structures detected by HMI, namely their height above the limb, as well as the radiation intensity. These results are consistent with recent findings related to hydrogen Balmer continuum enhancements, which were clearly detected in disk flares by the IRIS near-ultraviolet spectrometer.

  16. CONTINUUM INTENSITY AND [O i] SPECTRAL LINE PROFILES IN SOLAR 3D PHOTOSPHERIC MODELS: THE EFFECT OF MAGNETIC FIELDS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fabbian, D.; Moreno-Insertis, F., E-mail: damian@iac.es, E-mail: fmi@iac.es

    2015-04-01

    The importance of magnetic fields in three-dimensional (3D) magnetoconvection models of the Sun’s photosphere is investigated in terms of their influence on the continuum intensity at different viewing inclination angles and on the intensity profile of two [O i] spectral lines. We use the RH numerical radiative transfer code to perform a posteriori spectral synthesis on the same time series of magnetoconvection models used in our publications on the effect of magnetic fields on abundance determination. We obtain a good match of the synthetic disk-center continuum intensity to the absolute continuum values from the Fourier Transform Spectrometer (FTS) observational spectrum; the match of the center-to-limb variation synthetic data to observations is also good, thanks, in part, to the 3D radiation transfer capabilities of the RH code. The different levels of magnetic flux in the numerical time series do not modify the quality of the match. Concerning the targeted [O i] spectral lines, we find, instead, that magnetic fields lead to nonnegligible changes in the synthetic spectrum, with larger average magnetic flux causing both of the lines to become noticeably weaker. The photospheric oxygen abundance that one would derive if instead using nonmagnetic numerical models would thus be lower by a few to several centidex. The inclusion of magnetic fields is confirmed to be important for improving the current modeling of the Sun, here in particular in terms of spectral line formation and of deriving consistent chemical abundances. These results may shed further light on the still controversial issue regarding the precise value of the solar oxygen abundance.

  17. RAINIER: A simulation tool for distributions of excited nuclear states and cascade fluctuations

    DOE PAGES

    Kirsch, L. E.; Bernstein, L. A.

    2018-03-04

    In this paper, a new code has been developed named RAINIER that simulates the γ-ray decay of discrete and quasi-continuum nuclear levels for a user-specified range of energy, angular momentum, and parity including a realistic treatment of level spacing and transition width fluctuations. A similar program, DICEBOX, uses the Monte Carlo method to simulate level and width fluctuations but is restricted in its initial level population algorithm. On the other hand, modern reaction codes such as TALYS and EMPIRE populate a wide range of states in the residual nucleus prior to γ-ray decay, but do not go beyond the use of deterministic functions and therefore neglect cascade fluctuations. This combination of capabilities allows RAINIER to be used to determine quasi-continuum properties through comparison with experimental data. Finally, several examples are given that demonstrate how cascade fluctuations influence experimental high-resolution γ-ray spectra from reactions that populate a wide range of initial states.
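
    As an aside on the two fluctuation ingredients named above, the following sketch (plain Python, not RAINIER itself; the mean spacing and mean width are arbitrary illustrative values) samples Wigner-distributed nearest-neighbor level spacings and Porter-Thomas (chi-squared with one degree of freedom) transition widths.

      # Illustrative sampling of the two statistical ingredients described above.
      # Not RAINIER code; mean spacing and mean width are arbitrary units.
      import numpy as np

      rng = np.random.default_rng(1)
      n_samples = 10_000
      mean_spacing, mean_width = 1.0, 1.0

      # Wigner surmise for nearest-neighbor spacings, sampled by inverting its CDF:
      #   F(s) = 1 - exp(-pi s^2 / (4 D^2)), with mean spacing D.
      u = rng.uniform(size=n_samples)
      spacings = mean_spacing * (2.0 / np.sqrt(np.pi)) * np.sqrt(-np.log(1.0 - u))

      # Porter-Thomas width fluctuations: widths follow a chi-squared distribution
      # with one degree of freedom about the mean width.
      widths = mean_width * rng.chisquare(df=1, size=n_samples)

      print(f"spacings: mean = {spacings.mean():.3f}, relative variance = "
            f"{spacings.var() / spacings.mean()**2:.3f}")   # ~0.27 for Wigner
      print(f"widths:   mean = {widths.mean():.3f}, relative variance = "
            f"{widths.var() / widths.mean()**2:.3f}")        # ~2 for Porter-Thomas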

  18. RAINIER: A simulation tool for distributions of excited nuclear states and cascade fluctuations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kirsch, L. E.; Bernstein, L. A.

    In this paper, a new code has been developed named RAINIER that simulates the γ-ray decay of discrete and quasi-continuum nuclear levels for a user-specified range of energy, angular momentum, and parity including a realistic treatment of level spacing and transition width fluctuations. A similar program, DICEBOX, uses the Monte Carlo method to simulate level and width fluctuations but is restricted in its initial level population algorithm. On the other hand, modern reaction codes such as TALYS and EMPIRE populate a wide range of states in the residual nucleus prior to γ-ray decay, but do not go beyond the use of deterministic functions and therefore neglect cascade fluctuations. This combination of capabilities allows RAINIER to be used to determine quasi-continuum properties through comparison with experimental data. Finally, several examples are given that demonstrate how cascade fluctuations influence experimental high-resolution γ-ray spectra from reactions that populate a wide range of initial states.

  19. Laser-based volumetric flow visualization by digital color imaging of a spectrally coded volume.

    PubMed

    McGregor, T J; Spence, D J; Coutts, D W

    2008-01-01

    We present the framework for volumetric laser-based flow visualization instrumentation using a spectrally coded volume to achieve three-component three-dimensional particle velocimetry. By delivering light from a frequency doubled Nd:YAG laser with an optical fiber, we exploit stimulated Raman scattering within the fiber to generate a continuum spanning the visible spectrum from 500 to 850 nm. We shape and disperse the continuum light to illuminate a measurement volume of 20 x 10 x 4 mm(3), in which light sheets of differing spectral properties overlap to form an unambiguous color variation along the depth direction. Using a digital color camera we obtain images of particle fields in this volume. We extract the full spatial distribution of particles with depth inferred from particle color. This paper provides a proof of principle of this instrument, examining the spatial distribution of a static field and a spray field of water droplets ejected by the nozzle of an airbrush.
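
    The depth-from-color step described above amounts to mapping each particle's hue onto position along the 4 mm depth direction. A minimal sketch follows; the hue endpoints and the linear hue-to-depth mapping are illustrative assumptions, not the instrument's actual calibration.

      # Map particle color to depth by linear interpolation in hue.
      # Endpoint hues and the linear mapping are placeholders for illustration.
      import colorsys

      import numpy as np

      def depth_from_rgb(rgb, hue_a=0.33, hue_b=0.0, depth_mm=4.0):
          """Convert an (R, G, B) triple in [0, 1] to a depth in mm, assuming hue
          varies monotonically from hue_a at depth 0 to hue_b at depth_mm."""
          h, _, _ = colorsys.rgb_to_hsv(*rgb)
          frac = np.clip((h - hue_a) / (hue_b - hue_a), 0.0, 1.0)
          return float(frac * depth_mm)

      print(f"depth ~ {depth_from_rgb((0.9, 0.3, 0.2)):.2f} mm")  # a reddish particle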

  20. Assessment of Tank 241-S-112 Liquid Waste Mixing in Tank 241-SY-101

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Onishi, Yasuo; Trent, Donald S.; Wells, Beric E.

    The objectives of this study were to evaluate mixing of liquid waste from Tank 241-S-112 with waste in Tank 241-SY-101 and to determine the properties of the resulting waste for the cross-site transfer to avoid potential double-shell tank corrosion and pipeline plugging. We applied the time-varying, three-dimensional computer code TEMPEST to Tank SY-101 as it received the S-112 liquid waste. The model predicts that temperature variations in Tank SY-101 generate a natural convection flow that is very slow, varying from about 7 × 10⁻⁵ to 1 × 10⁻³ ft/sec (0.3 to about 4 ft/hr) in most areas. Thus, natural convection would eventually mix the liquid waste in SY-101 but would be very slow to achieve nearly complete mixing. These simulations indicate that the mixing of S-112 and SY-101 wastes in Tank SY-101 is a very slow process, and the density difference between the two wastes would further limit mixing. It is expected to take days or weeks to achieve relatively complete mixing in Tank SY-101.
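
    The "days or weeks" estimate follows from simple arithmetic on the quoted convection speeds; in the rough sketch below, the 75 ft length scale is an assumed tank-scale distance used only for illustration, not a value taken from the report.

      # Time for a fluid parcel to traverse an assumed tank-scale distance at the
      # predicted natural-convection speeds quoted above (0.3 to about 4 ft/hr).
      length_scale_ft = 75.0        # assumed characteristic distance (illustrative)
      for speed_ft_per_hr in (0.3, 4.0):
          hours = length_scale_ft / speed_ft_per_hr
          print(f"{speed_ft_per_hr:4.1f} ft/hr -> {hours:6.0f} hr (~{hours / 24:.1f} days) per pass")
      # Near-complete mixing needs several such passes, consistent with days to weeks.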

  1. PowderSim: Lagrangian Discrete and Mesh-Free Continuum Simulation Code for Cohesive Soils

    NASA Technical Reports Server (NTRS)

    Johnson, Scott; Walton, Otis; Settgast, Randolph

    2013-01-01

    PowderSim is a calculation tool that combines a discrete-element method (DEM) module, including calibrated interparticle-interaction relationships, with a mesh-free, continuum, SPH (smoothed-particle hydrodynamics) based module that utilizes enhanced, calibrated, constitutive models capable of mimicking both large deformations and the flow behavior of regolith simulants and lunar regolith under conditions anticipated during in situ resource utilization (ISRU) operations. The major innovation introduced in PowderSim is to use a mesh-free method (SPH-based) with a calibrated and slightly modified critical-state soil mechanics constitutive model to extend the ability of the simulation tool to also address full-scale engineering systems in the continuum sense. The PowderSim software maintains the ability to address particle-scale problems, like size segregation, in selected regions with a traditional DEM module, which has improved contact physics and electrostatic interaction models.

  2. Development and application of computational aerothermodynamics flowfield computer codes

    NASA Technical Reports Server (NTRS)

    Venkatapathy, Ethiraj

    1993-01-01

    Computations are presented for one-dimensional, strong shock waves that are typical of those that form in front of a reentering spacecraft. The fluid mechanics and thermochemistry are modeled using two different approaches. The first employs traditional continuum techniques in solving the Navier-Stokes equations. The second approach employs a particle simulation technique (the direct simulation Monte Carlo method, DSMC). The thermochemical models employed in these two techniques are quite different. The present investigation presents an evaluation of thermochemical models for nitrogen under hypersonic flow conditions. Four separate cases are considered. The cases are governed, respectively, by the following: vibrational relaxation; weak dissociation; strong dissociation; and weak ionization. In near-continuum, hypersonic flow, the nonequilibrium thermochemical models employed in continuum and particle simulations produce nearly identical solutions. Further, the two approaches are evaluated successfully against available experimental data for weakly and strongly dissociating flows.

  3. Nebular Continuum and Line Emission in Stellar Population Synthesis Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Byler, Nell; Dalcanton, Julianne J.; Conroy, Charlie

    Accounting for nebular emission when modeling galaxy spectral energy distributions (SEDs) is important, as both line and continuum emissions can contribute significantly to the total observed flux. In this work, we present a new nebular emission model integrated within the Flexible Stellar Population Synthesis code that computes the line and continuum emission for complex stellar populations using the photoionization code Cloudy. The self-consistent coupling of the nebular emission to the matched ionizing spectrum produces emission line intensities that correctly scale with the stellar population as a function of age and metallicity. This more complete model of galaxy SEDs will improve estimates of global gas properties derived with diagnostic diagrams, star formation rates based on Hα, and physical properties derived from broadband photometry. Our models agree well with results from other photoionization models and are able to reproduce observed emission from H ii regions and star-forming galaxies. Our models show improved agreement with the observed H ii regions in the Ne iii/O ii plane and show satisfactory agreement with He ii emission from z = 2 galaxies, when including rotating stellar models. Models including post-asymptotic giant branch stars are able to reproduce line ratios consistent with low-ionization emission regions. The models are integrated into current versions of FSPS and include self-consistent nebular emission predictions for MIST and Padova+Geneva evolutionary tracks.
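
    For readers who want to exercise the kind of coupling described above, the sketch below uses the python-fsps bindings (assuming they are installed; the age, metallicity, and ionization parameter are illustrative choices) to compare a spectrum with and without the nebular component.

      # Minimal sketch with python-fsps: spectrum with and without nebular emission.
      # Parameter values are illustrative only.
      import fsps

      sp = fsps.StellarPopulation(
          zcontinuous=1,          # interpolate spectra to the requested metallicity
          logzsol=0.0,            # roughly solar metallicity
          add_neb_emission=True,  # couple the Cloudy-based line + continuum emission
          gas_logu=-2.5,          # illustrative ionization parameter
      )

      wave, spec_nebular = sp.get_spectrum(tage=0.01)   # 10 Myr population, with nebular emission

      sp.params["add_neb_emission"] = False
      _, spec_stellar = sp.get_spectrum(tage=0.01)      # same population, stellar-only

      # Tabulated emission-line predictions become available once a spectrum is computed.
      print(f"{len(sp.emline_wavelengths)} emission lines tabulated; "
            f"max extra flux from nebular component = {(spec_nebular - spec_stellar).max():.3e}")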

  4. Parallel algorithm for multiscale atomistic/continuum simulations using LAMMPS

    NASA Astrophysics Data System (ADS)

    Pavia, F.; Curtin, W. A.

    2015-07-01

    Deformation and fracture processes in engineering materials often require simultaneous descriptions over a range of length and time scales, with each scale using a different computational technique. Here we present a high-performance parallel 3D computing framework for executing large multiscale studies that couple an atomic domain, modeled using molecular dynamics and a continuum domain, modeled using explicit finite elements. We use the robust Coupled Atomistic/Discrete-Dislocation (CADD) displacement-coupling method, but without the transfer of dislocations between atoms and continuum. The main purpose of the work is to provide a multiscale implementation within an existing large-scale parallel molecular dynamics code (LAMMPS) that enables use of all the tools associated with this popular open-source code, while extending CADD-type coupling to 3D. Validation of the implementation includes the demonstration of (i) stability in finite-temperature dynamics using Langevin dynamics, (ii) elimination of wave reflections due to large dynamic events occurring in the MD region and (iii) the absence of spurious forces acting on dislocations due to the MD/FE coupling, for dislocations further than 10 Å from the coupling boundary. A first non-trivial example application of dislocation glide and bowing around obstacles is shown, for dislocation lengths of ∼50 nm using fewer than 1 000 000 atoms but reproducing results of extremely large atomistic simulations at much lower computational cost.

  5. Improved continuum lowering calculations in screened hydrogenic model with l-splitting for high energy density systems

    NASA Astrophysics Data System (ADS)

    Ali, Amjad; Shabbir Naz, G.; Saleem Shahzad, M.; Kouser, R.; Aman-ur-Rehman; Nasim, M. H.

    2018-03-01

    The energy states of bound electrons in high energy density systems (HEDS) are significantly affected by the electric field of neighboring ions. Because of this effect, bound electrons require less energy to escape into the continuum. This reduction in potential is termed ionization potential depression (IPD) or continuum lowering (CL). The foremost parameter reflecting this change is the average charge state, so accurate modeling of CL is essential when generating atomic data for the computation of radiative and thermodynamic properties of HEDS. In this paper, we present an improved treatment of CL in the screened hydrogenic model with l-splitting (SHML) proposed by G. Faussurier, C. Blancard, and P. Renaudin [High Energy Density Physics 4 (2008) 114] and examine its effect on the average charge state. We propose a level-charge-dependent calculation of the CL potential energy and the inclusion of exchange and correlation energy in SHML, which makes the model more relevant to HEDS and free of empirical CL parameters tied to the plasma environment. We have implemented both the original and the modified SHML models in our code OPASH and benchmarked the results against experiments and other state-of-the-art simulation codes. The computed average charge states for carbon, beryllium, aluminum, iron, and germanium show very reasonable agreement with published values.

  6. Computational strategy for the solution of large strain nonlinear problems using the Wilkins explicit finite-difference approach

    NASA Technical Reports Server (NTRS)

    Hofmann, R.

    1980-01-01

    The STEALTH code system, which solves large strain, nonlinear continuum mechanics problems, was rigorously structured in both overall design and programming standards. The design is based on the theoretical elements of analysis while the programming standards attempt to establish a parallelism between physical theory, programming structure, and documentation. These features have made it easy to maintain, modify, and transport the codes. It has also guaranteed users a high level of quality control and quality assurance.

  7. Drag and Cooling Tests in the 24 ft Wind Tunnel on a Centaurus-Buckingham Wing Nacelle Installation. Part 3. Tests with High Speed Cowl Entry (Tempest Type)

    DTIC Science & Technology

    1946-07-01

    ROYAL AIRCRAFT ESTABLISHMENT, Farnborough, Hants. Drag and cooling tests in the 24 ft wind tunnel on a Centaurus-Buckingham wing nacelle installation, Part 3: tests with high-speed cowl entry (Tempest type). [The remainder of the scanned cover text is largely illegible; the listed illustrations include the Centaurus-Buckingham nacelle installation with the high-entry-velocity cowl (cooling fan not fitted) and the installation of the Centaurus engine.]

  8. A Struggle Well Worth Having: The Uses of Theatre-in-education (TIE) for Learning

    ERIC Educational Resources Information Center

    Cooper, Chris

    2004-01-01

    In this article Chris Cooper conveys something of his passionate belief in the importance of attending to the preconditions of learning. He stresses the crucial role of the imagination in this, bringing, as he puts it, creativity to the process of learning. His account of a drama project based on "The Tempest" provides important insights into the…

  9. Learning and Teaching in the Arts. Research Monograph 4.

    ERIC Educational Resources Information Center

    Aiken, Henry David

    This paper, part of a research monograph series, focuses on a philosophy of education which is humanistic. The author discusses theories of art education, using as an example of visual art Giorgione's "The Tempest". A synopsis of what needs to be known in order to appreciate the various levels of significance in a great work of visual art precedes…

  10. Transforming Pedagogies: Encouraging Pre-Service Teachers to Engage the Power of the Arts in Their Approach to Teaching and Learning

    ERIC Educational Resources Information Center

    McLaren, Mary-Rose; Arnold, Julie

    2016-01-01

    This paper describes and analyses, through the use of case studies, two experiences of transformative learning in an undergraduate arts education unit. Pre-service teachers designed and engaged with arts-based curriculum activities, created their own artwork, participated in a modified production of The Tempest and kept a reflective journal. These…

  11. No Menstrual Cyclicity in Mood and Interpersonal Behaviour in Nine Women with Self-Reported Premenstrual Syndrome.

    PubMed

    Bosman, Renske C; Albers, Casper J; de Jong, Jettie; Batalas, Nikolaos; Aan Het Rot, Marije

    2018-06-06

    Before diagnosing premenstrual dysphoric disorder (PMDD), 2 months of prospective assessment are required to confirm menstrual cyclicity in symptoms. For a diagnosis of premenstrual syndrome (PMS), this is not required. Women with PMDD and PMS often report that their symptoms interfere with mood and social functioning, and are said to show cyclical changes in interpersonal behaviour, but this has not been examined using a prospective approach. We sampled cyclicity in mood and interpersonal behaviour for 2 months in women with self-reported PMS. Participants met the criteria for PMS on the Premenstrual Symptoms Screening Tool (PSST), a retrospective questionnaire. For 2 menstrual cycles, after each social interaction, they used the online software TEMPEST to record on their smartphones how they felt and behaved. We examined within-person variability in negative affect, positive affect, quarrelsomeness, and agreeableness. Participants evaluated TEMPEST as positive. However, we found no evidence for menstrual cyclicity in mood and interpersonal behaviour in any of the individual women (n = 9). Retrospective questionnaires such as the PSST may lead to oversampling of PMS. The diagnosis of PMS, like that of PMDD, might require 2 months of prospective assessment. © 2018 S. Karger AG, Basel.

  12. Tempest in a teapot: utility advertising

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ciscel, D.H.

    Utility sales programs represent a form of organizational slack. It is an expense that can be traded off in times of administrative stress, providing a satisfactory payment to the consumer while maintaining the integrity of the present institutional arrangement. Because it is a trade-off commodity, regulatory control of utility advertising will remain a "tempest in a teapot." Marketing programs are an integral part of the selling process in the modern corporation, and severe restrictions on advertising must be temporary in nature. Court cases have pointed out that utility companies need to inform the consumer about the use of the product and to promote demand for the product. These actions will be considered legally reasonable no matter what the final disposition of current environmental regulations and energy restrictions. In fact, as acceptable social solutions develop for environmental and energy supply problems, the pressure on utility advertising can be expected to fall proportionately. However, the utility still represents the largest industrial concern in most locales. The utility advertising program makes the company even more visible. When there is public dissatisfaction with the more complex parts of the utility delivery system, the raucous voice of outrage will emerge from this tempestuous teapot.

  13. Simulating the effects of upstream turbulence on dispersion around a building

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Y.Q.; Arya, S.P.S.; Huber, A.H.

    The effects of high turbulence versus no turbulence in a sheared boundary-layer flow approaching a building are being investigated by a turbulent kinetic energy/dissipation model (TEMPEST). The effects on both the mean flow and the concentration field around a cubical building are presented. The numerical simulations demonstrate significant effects due to the differences in the incident flow. The addition of upstream turbulence results in a reduced size of the cavity directly behind the building. The velocity deficits in the wake strongly depend on the upstream turbulence intensities. The accuracy of numerical simulations is verified by comparing the predicted mean flow and concentration fields with the wind tunnel measurements of Castro and Robins (1977) and Robins and Castro (1977, 1975). Comparing the results with experimental data, the authors show that the TEMPEST model can reasonably simulate the mean flow. The numerical simulations of the concentration fields due to a source on the roof-top of the building are presented. Both the value and the position of the maximum ground-level concentration are changed dramatically due to the effects of the upstream level of turbulence.

  14. Central ridge of Newfoundland: Little explored, potential large

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Silva, N.R. De

    The Central ridge on the northeastern Grand Banks off Newfoundland represents a large area with known hydrocarbon accumulations and the potential for giant fields. It covers some 17,000 sq km with water less than 400 m deep. The first major hydrocarbon discovery on the Newfoundland Grand Banks is giant Hibernia field in the Jeanne d'Arc basin. Hibernia field, discovered in 1979, has reserves of 666 million bbl and is due onstream in 1997. Since Hibernia, 14 other discoveries have been made on the Grand Banks, with three on the Central ridge. Oil was first discovered on Central Ridge in 1980 with the Mobil et al. South Tempest G-88 well. In 1982 gas was discovered with the Mobil et al. North Dana I-43 well 30 km northeast of the earlier discovery. In 1983 gas and condensate were discovered with the Husky-Bow Valley et al. Trave E-87 well 20 km south of the South Tempest well. These discoveries are held under significant discovery licenses and an additional 2,400 sq km are held under exploration licenses. The paper discusses the history of the basin, the reservoir source traps, and the basin potential.

  15. Hauser-Feshbach calculations in deformed nuclei

    DOE PAGES

    Grimes, S. M.

    2013-08-22

    Hauser-Feshbach calculations for deformed nuclei are typically done with level densities appropriate for deformed nuclei but with Hauser-Feshbach codes which enforce spherical symmetry by not including K as a parameter in the decay sums. A code has been written which does allow the full K dependence to be included. Calculations with the code have been compared with those from a conventional Hauser-Feshbach code. The evaporation portion (continuum) is only slightly affected by this change but the cross sections to individual (resolved) levels are changed substantially. It is found that cross sections to neighboring levels with the same J but differing K are not the same. The predicted consequences of K mixing will also be discussed.

  16. Modeling of ion orbit loss and intrinsic toroidal rotation with the COGENT code

    NASA Astrophysics Data System (ADS)

    Dorf, M.; Dorr, M.; Cohen, R.; Rognlien, T.; Hittinger, J.

    2014-10-01

    We discuss recent advances in cross-separatrix neoclassical transport simulations with COGENT, a continuum gyro-kinetic code being developed by the Edge Simulation Laboratory (ESL) collaboration. The COGENT code models the axisymmetric transport properties of edge plasmas including the effects of nonlinear (Fokker-Planck) collisions and a self-consistent electrostatic potential. Our recent work has focused on studies of ion orbit loss and the associated toroidal rotation driven by this mechanism. The results of the COGENT simulations are discussed and analyzed for the parameters of the DIII-D experiment. Work performed for USDOE at LLNL under Contract DE-AC52-07NA27344.

  17. A new uniformly valid asymptotic integration algorithm for elasto-plastic creep and unified viscoplastic theories including continuum damage

    NASA Technical Reports Server (NTRS)

    Chulya, Abhisak; Walker, Kevin P.

    1991-01-01

    A new scheme to integrate a system of stiff differential equations for both the elasto-plastic creep and the unified viscoplastic theories is presented. The method has high stability, allows large time increments, and is implicit and iterative. It is suitable for use with continuum damage theories. The scheme was incorporated into MARC, a commercial finite element code through a user subroutine called HYPELA. Results from numerical problems under complex loading histories are presented for both small and large scale analysis. To demonstrate the scheme's accuracy and efficiency, comparisons to a self-adaptive forward Euler method are made.

  18. Continuum kinetic modeling of the tokamak plasma edge

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dorf, M. A.; Dorr, M. R.; Hittinger, J. A.

    2016-05-15

    The first 4D (axisymmetric) high-order continuum gyrokinetic transport simulations that span the magnetic separatrix of a tokamak are presented. The modeling is performed with the COGENT code, which is distinguished by fourth-order finite-volume discretization combined with mapped multiblock grid technology to handle the strong anisotropy of plasma transport and the complex X-point divertor geometry with high accuracy. The calculations take into account the effects of fully nonlinear Fokker-Planck collisions, electrostatic potential variations, and anomalous radial transport. Topics discussed include: (a) ion orbit loss and the associated toroidal rotation and (b) edge plasma relaxation in the presence of anomalous radial transport.
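
    The fourth-order finite-volume discretization mentioned above hinges on high-order reconstruction of face values from cell averages. The one-dimensional sketch below shows the standard fourth-order centered face interpolation as a generic illustration; it is not COGENT's mapped-multiblock implementation.

      # Fourth-order face values from 1D cell averages, with a quick convergence check.
      import numpy as np

      def face_values_4th_order(u):
          """Face values u_{i+1/2} from cell averages u_i via the standard stencil
          (-u_{i-1} + 7 u_i + 7 u_{i+1} - u_{i+2}) / 12 (interior faces only)."""
          u = np.asarray(u, dtype=float)
          return (-u[:-3] + 7.0 * u[1:-2] + 7.0 * u[2:-1] - u[3:]) / 12.0

      # On smooth data the error should drop by roughly 16x for each grid doubling.
      for n in (32, 64, 128):
          edges = np.linspace(0.0, 1.0, n + 1)
          cell_avg = -np.diff(np.cos(2 * np.pi * edges)) / (2 * np.pi * np.diff(edges))
          err = np.max(np.abs(face_values_4th_order(cell_avg) - np.sin(2 * np.pi * edges[2:-2])))
          print(f"n = {n:4d}   max face error = {err:.3e}")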

  19. Continuum damage model for ferroelectric materials and its application to multilayer actuators

    NASA Astrophysics Data System (ADS)

    Gellmann, Roman; Ricoeur, Andreas

    2016-05-01

    In this paper a micromechanical continuum damage model for ferroelectric materials is presented. As a constitutive law it is implemented into a finite element (FE) code. The model is based on micromechanical considerations of domain switching and its interaction with microcrack growth and coalescence. A FE analysis of a multilayer actuator is performed, showing the initiation of damage zones at the electrode tips during the poling process. Further, the influence of mechanical pre-stressing on damage evolution and actuating properties is investigated. The results provided in this work give useful information on the damage of advanced piezoelectric devices and their optimization.

  20. A new uniformly valid asymptotic integration algorithm for elasto-plastic-creep and unified viscoplastic theories including continuum damage

    NASA Technical Reports Server (NTRS)

    Chulya, A.; Walker, K. P.

    1989-01-01

    A new scheme to integrate a system of stiff differential equations for both the elasto-plastic creep and the unified viscoplastic theories is presented. The method has high stability, allows large time increments, and is implicit and iterative. It is suitable for use with continuum damage theories. The scheme was incorporated into MARC, a commercial finite element code through a user subroutine called HYPELA. Results from numerical problems under complex loading histories are presented for both small and large scale analysis. To demonstrate the scheme's accuracy and efficiency, comparisons to a self-adaptive forward Euler method are made.

  1. Quantification of the water vapor greenhouse effect: setup and first results of the Zugspitze radiative closure experiment

    NASA Astrophysics Data System (ADS)

    Reichert, Andreas; Sussmann, Ralf; Rettinger, Markus

    2014-05-01

    Uncertainties in the knowledge of atmospheric radiative processes are among the main limiting factors for the accuracy of current climate models. Being the primary greenhouse gas in the Earth's atmosphere, water vapor is of crucial importance in atmospheric radiative transfer. However, water vapor absorption processes, especially the contribution attributed to the water vapor continuum, are currently not sufficiently well quantified. The aim of this study is therefore to obtain a more exact characterization of the water vapor radiative processes throughout the IR by means of a so-called radiative closure study at the Zugspitze/Schneefernerhaus observatory and thereby validate the radiative transfer codes used in current climate models. For that purpose, spectral radiance is measured at the Zugspitze summit observatory using an AERI-ER thermal emission radiometer (covering the far- and mid-infrared) and a solar absorption FTIR spectrometer (covering the near-infrared), respectively. These measurements are then compared to synthetic radiance spectra computed by means of the Line-By-Line Radiative Transfer Model (LBLRTM, Clough et al., 2005), a high resolution model widely used in the atmospheric science community. This line-by-line code provides the foundation of RRTM, a rapid radiation code (Mlawer et al., 1997) used in various weather forecast models or general circulation models like ECHAM. To be able to quantify errors in the description of water vapor radiative processes from spectral residuals, i.e. difference spectra between measured and calculated radiance, the atmospheric state used as an input to LBLRTM has to be constrained precisely. This input comprises water vapor columns, water vapor profiles, and temperature profiles measured by an LHATPRO microwave radiometer along with total column information on further trace gases (e.g. CO2 and O3) measured by the solar FTIR. We will present the setup of the Zugspitze radiative closure experiment. Due to its high-altitude location and the available permanent instrumentation, the Zugspitze observatory meets the necessary requirements to determine highly accurate water vapor continuum absorption parameters in the far- and mid-infrared spectral range from a more extensive set of closure measurements compared to previous campaign-based studies. Furthermore, we will present a novel radiometric calibration strategy for the solar FTIR spectral radiance measurements based on a combination of the Langley method and measurements of a high-temperature blackbody source that allows for the determination of continuum absorption parameters in the near-infrared spectral region, where previously no precise measurements under atmospheric conditions were available. This improved quantification of water vapor continuum absorption parameters allows us to further validate the current standard continuum model MT_CKD (Mlawer et al., 2012). Acknowledgements: Funding by KIT/IMK-IFU, the State Government of Bavaria as well as by the Deutsche Bundesstiftung Umwelt (DBU) is gratefully acknowledged. References: Clough, S. A., Shephard, M. W., Mlawer, E. J., Delamere, J. S., Iacono, M. J., Cady-Pereira, K., Boukabara, S., and Brown, P. D.: Atmospheric radiative transfer modeling: a summary of the AER codes, Short Communication, J. Quant. Spectrosc. Radiat. Transfer, 91, 233-244, 2005. Mlawer, E. J., Taubman, J., Brown, P. D., Iacono, M. J., and Clough, S. A.: RRTM, a validated correlated-k model for the longwave, J. Geophys. Res., 102, 16,663-16,682, 1997. Mlawer, E. J., Payne, V. H., Moncet, J., Delamere, J. S., Alvarado, M. J., and Tobin, D. C.: Development and recent evaluation of the MT_CKD model of continuum absorption, Phil. Trans. R. Soc. A, 370, 2520-2556, 2012.
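
    As a reminder of how the Langley step in that calibration strategy works, the short sketch below fits ln(signal) against airmass for synthetic clear-sky data (the optical depth, intercept, and noise level are invented) and reads the extrapolated top-of-atmosphere signal off the intercept.

      # Langley-plot sketch: ln V = ln V0 - tau * m, so a linear fit versus airmass m
      # yields the optical depth (slope) and the calibration intercept ln V0.
      import numpy as np

      airmass = np.linspace(1.2, 5.0, 40)
      tau_true, ln_v0_true = 0.35, 2.0                      # invented values
      noise = np.random.default_rng(2).normal(0.0, 0.01, airmass.size)
      ln_v = ln_v0_true - tau_true * airmass + noise

      slope, intercept = np.polyfit(airmass, ln_v, 1)
      print(f"fitted tau ~ {-slope:.3f}, extrapolated ln V0 ~ {intercept:.3f}")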

  2. Studies in Inuktitut Grammar

    ERIC Educational Resources Information Center

    Beach, Matthew David

    2012-01-01

    This dissertation addresses a number of issues about the grammar of Eastern Canadian Inuktitut. Inuktitut is a dialect within the Inuit dialect continuum which is a group of languages/dialects within the Eskimo-Aleut language family. (Eastern Canadian Inuktitut has an ISO 639-3 language code of "ike".) Typologically, it is an ergative…

  3. Director, Operational Test and Evaluation FY 2014 Annual Report

    DTIC Science & Technology

    2015-01-01

    Federal Departments and Agencies. Mitigation measures such as curtailment of wind turbine operations during test periods, identification of alternative...impact of wind turbines on ground-based and airborne radars, and this investment may help mitigate interference of wind turbines with test range...Frequency Active (SURTASS CLFA) Test Plan Tactical Unmanned Aircraft System Tactical Common Data Link (Shadow) FOT&E OTA Test Plan Tempest Wind 2014

  4. "The Tempest" in an English Teapot: Colonialism and the Measure of a Man in Zadie Smith's "White Teeth"

    ERIC Educational Resources Information Center

    Gustar, Jennifer J.

    2010-01-01

    Zadie Smith's "White Teeth" argues that we can take responsibility for the future if we refuse to act in thrall to the legacies of the past, which favour one human life over another, and act instead with the conviction that all lives are "lives" (Judith Butler). "White Teeth" examines the colonial legacy of violence…

  5. Reactive transport codes for subsurface environmental simulation

    DOE PAGES

    Steefel, C. I.; Appelo, C. A. J.; Arora, B.; ...

    2014-09-26

    A general description of the mathematical and numerical formulations used in modern numerical reactive transport codes relevant for subsurface environmental simulations is presented. The formulations are followed by short descriptions of commonly used and available subsurface simulators that consider continuum representations of flow, transport, and reactions in porous media. These formulations are applicable to most of the subsurface environmental benchmark problems included in this special issue. The list of codes described briefly here includes PHREEQC, HPx, PHT3D, OpenGeoSys (OGS), HYTEC, ORCHESTRA, TOUGHREACT, eSTOMP, HYDROGEOCHEM, CrunchFlow, MIN3P, and PFLOTRAN. The descriptions include a high-level list of capabilities for each of the codes, along with a selective list of applications that highlight their capabilities and historical development.

  6. Gamma-ray spectroscopy: The diffuse galactic glow

    NASA Technical Reports Server (NTRS)

    Hartmann, Dieter H.

    1991-01-01

    The goal of this project is the development of a numerical code that provides statistical models of the sky distribution of gamma-ray lines due to the production of radioactive isotopes by ongoing Galactic nucleosynthesis. We are particularly interested in quasi-steady emission from novae, supernovae, and stellar winds, but continuum radiation and transient sources must also be considered. We have made significant progress during the first half period of this project and expect the timely completion of a code that can be applied to Oriented Scintillation Spectrometer Experiment (OSSE) Galactic plane survey data.

  7. Porting LAMMPS to GPUs.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, William Michael; Plimpton, Steven James; Wang, Peng

    2010-03-01

    LAMMPS is a classical molecular dynamics code, and an acronym for Large-scale Atomic/Molecular Massively Parallel Simulator. LAMMPS has potentials for soft materials (biomolecules, polymers) and solid-state materials (metals, semiconductors) and coarse-grained or mesoscopic systems. It can be used to model atoms or, more generically, as a parallel particle simulator at the atomic, meso, or continuum scale. LAMMPS runs on single processors or in parallel using message-passing techniques and a spatial-decomposition of the simulation domain. The code is designed to be easy to modify or extend with new functionality.

  8. Adagio 4.20 User’s Guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spencer, Benjamin Whiting; Crane, Nathan K.; Heinstein, Martin W.

    2011-03-01

    Adagio is a Lagrangian, three-dimensional, implicit code for the analysis of solids and structures. It uses a multi-level iterative solver, which enables it to solve problems with large deformations, nonlinear material behavior, and contact. It also has a versatile library of continuum and structural elements, and an extensive library of material models. Adagio is written for parallel computing environments, and its solvers allow for scalable solutions of very large problems. Adagio uses the SIERRA Framework, which allows for coupling with other SIERRA mechanics codes. This document describes the functionality and input structure for Adagio.

  9. Continuum kinetic methods for analyzing wave physics and distribution function dynamics in the turbulence dissipation challenge

    NASA Astrophysics Data System (ADS)

    Juno, J.; Hakim, A.; TenBarge, J.; Dorland, W.

    2015-12-01

    We present for the first time results for the turbulence dissipation challenge, with specific focus on the linear wave portion of the challenge, using a variety of continuum kinetic models: hybrid Vlasov-Maxwell, gyrokinetic, and full Vlasov-Maxwell. As one of the goals of the wave problem as it is outlined is to identify how well various models capture linear physics, we compare our results to linear Vlasov and gyrokinetic theory. Preliminary gyrokinetic results match linear theory extremely well due to the geometry of the problem, which eliminates the dominant nonlinearity. With the non-reduced models, we explore how the subdominant nonlinearities manifest and affect the evolution of the turbulence and the energy budget. We also take advantage of employing continuum methods to study the dynamics of the distribution function, with particular emphasis on the full Vlasov results where a basic collision operator has been implemented. As the community prepares for the next stage of the turbulence dissipation challenge, where we hope to do large 3D simulations to inform the next generation of observational missions such as THOR (Turbulence Heating ObserveR), we argue for the consideration of hybrid Vlasov and full Vlasov as candidate models for these critical simulations. With the use of modern numerical algorithms, we demonstrate the competitiveness of our code with traditional particle-in-cell algorithms, with a clear plan for continued improvements and optimizations to further strengthen the code's viability as an option for the next stage of the challenge.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kowalski, Adam F.; Mathioudakis, Mihalis; Hawley, Suzanne L.

    We present a large data set of high-cadence dMe flare light curves obtained with custom continuum filters on the triple-beam, high-speed camera system ULTRACAM. The measurements provide constraints for models of the near-ultraviolet (NUV) and optical continuum spectral evolution on timescales of ≈1 s. We provide a robust interpretation of the flare emission in the ULTRACAM filters using simultaneously obtained low-resolution spectra during two moderate-sized flares in the dM4.5e star YZ CMi. By avoiding the spectral complexity within the broadband Johnson filters, the ULTRACAM filters are shown to characterize bona fide continuum emission in the NUV, blue, and red wavelength regimes. The NUV/blue flux ratio in flares is equivalent to a Balmer jump ratio, and the blue/red flux ratio provides an estimate for the color temperature of the optical continuum emission. We present a new “color–color” relationship for these continuum flux ratios at the peaks of the flares. Using the RADYN and RH codes, we interpret the ULTRACAM filter emission using the dominant emission processes from a radiative-hydrodynamic flare model with a high nonthermal electron beam flux, which explains a hot, T ≈ 10⁴ K, color temperature at blue-to-red optical wavelengths and a small Balmer jump ratio as observed in moderate-sized and large flares alike. We also discuss the high time resolution, high signal-to-noise continuum color variations observed in YZ CMi during a giant flare, which increased the NUV flux from this star by over a factor of 100.
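
    The blue/red color-temperature estimate referred to above can be illustrated by inverting a Planck-function ratio. In the sketch below the effective wavelengths and the example flux ratio are assumptions for illustration, not the ULTRACAM filter definitions.

      # Solve for the temperature whose Planck ratio B(blue)/B(red) matches an
      # observed continuum flux ratio. Wavelengths and the example ratio are invented.
      import numpy as np
      from scipy.optimize import brentq

      H, C, KB = 6.626e-34, 2.998e8, 1.381e-23   # SI constants

      def planck_lambda(wavelength_m, temperature_k):
          """Blackbody spectral radiance B_lambda(T)."""
          x = H * C / (wavelength_m * KB * temperature_k)
          return (2.0 * H * C**2 / wavelength_m**5) / np.expm1(x)

      def color_temperature(ratio_blue_red, lam_blue=4500e-10, lam_red=6600e-10):
          """Temperature whose Planck ratio at the two wavelengths equals the observed ratio."""
          f = lambda t: planck_lambda(lam_blue, t) / planck_lambda(lam_red, t) - ratio_blue_red
          return brentq(f, 2.0e3, 1.0e5)

      print(f"T_color ~ {color_temperature(2.5):.0f} K")    # example ratio only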

  11. Continuum kinetic modeling of the tokamak plasma edge

    DOE PAGES

    Dorf, M. A.; Dorr, M.; Rognlien, T.; ...

    2016-03-10

    In this study, the first 4D (axisymmetric) high-order continuum gyrokinetic transport simulations that span the magnetic separatrix of a tokamak are presented. The modeling is performed with the COGENT code, which is distinguished by fourth-order finite-volume discretization combined with mapped multiblock grid technology to handle the strong anisotropy of plasma transport and the complex X-point divertor geometry with high accuracy. The calculations take into account the effects of fully nonlinear Fokker-Planck collisions, electrostatic potential variations, and anomalous radial transport. Topics discussed include: (a) ion orbit loss and the associated toroidal rotation and (b) edge plasma relaxation in the presence of anomalous radial transport.

  12. Two-photon absorption of [2.2]paracyclophane derivatives in solution: A theoretical investigation

    NASA Astrophysics Data System (ADS)

    Ferrighi, Lara; Frediani, Luca; Fossgaard, Eirik; Ruud, Kenneth

    2007-12-01

    The two-photon absorption of a class of [2.2]paracyclophane derivatives has been studied using quadratic response and density functional theories. For the molecules investigated, several effects influencing the two-photon absorption spectra have been investigated, such as side-chain elongation, hydrogen bonding, the use of ionic species, and solvent effects, the latter described by the polarizable continuum model. The calculations have been carried out using a recent parallel implementation of the polarizable continuum model in the DALTON code. Special attention is given to those aspects that could explain the large solvent effect on the two-photon absorption cross sections observed experimentally for this class of compounds.

  13. Message Control Intensity: Rationale and Preliminary Findings.

    ERIC Educational Resources Information Center

    Rogers, L. Edna; And Others

    The discussions of four family-related topics by 85 married couples were recorded and analyzed to test the validity of an expanded version of the relational communication coding system developed by L. Edna Rogers and Richard V. Farace. The expanded version of the system is based on the implicit intensity continuum that underlies the communication…

  14. 3D Progressive Damage Modeling for Laminated Composite Based on Crack Band Theory and Continuum Damage Mechanics

    NASA Technical Reports Server (NTRS)

    Wang, John T.; Pineda, Evan J.; Ranatunga, Vipul; Smeltzer, Stanley S.

    2015-01-01

    A simple continuum damage mechanics (CDM) based 3D progressive damage analysis (PDA) tool for laminated composites was developed and implemented as a user defined material subroutine to link with a commercially available explicit finite element code. This PDA tool uses linear lamina properties from standard tests, predicts damage initiation with the easy-to-implement Hashin-Rotem failure criteria, and in the damage evolution phase, evaluates the degradation of material properties based on the crack band theory and traction-separation cohesive laws. It follows Matzenmiller et al.'s formulation to incorporate the degrading material properties into the damaged stiffness matrix. Since nonlinear shear and matrix stress-strain relations are not implemented, correction factors are used for slowing the reduction of the damaged shear stiffness terms to reflect the effect of these nonlinearities on the laminate strength predictions. This CDM based PDA tool is implemented as a user defined material (VUMAT) to link with the Abaqus/Explicit code. Strength predictions obtained, using this VUMAT, are correlated with test data for a set of notched specimens under tension and compression loads.
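
    For orientation, one common textbook form of the Hashin-Rotem initiation criteria for in-plane lamina stresses is sketched below; the exact expressions and degradation logic used in the VUMAT described above may differ.

      # One common form of the Hashin-Rotem initiation criteria (2D lamina stresses).
      # A returned index >= 1 signals damage initiation in that mode.
      def hashin_rotem_indices(s11, s22, t12, xt, xc, yt, yc, s):
          """s11, s22, t12: lamina stresses; xt/xc, yt/yc: longitudinal/transverse
          tensile and compressive strengths; s: in-plane shear strength."""
          fiber = (s11 / xt) ** 2 if s11 >= 0.0 else (s11 / xc) ** 2
          if s22 >= 0.0:
              matrix = (s22 / yt) ** 2 + (t12 / s) ** 2
          else:
              matrix = (s22 / yc) ** 2 + (t12 / s) ** 2
          return fiber, matrix

      # Example: a tension-dominated stress state (units consistent with the strengths).
      print(hashin_rotem_indices(1200.0, 30.0, 40.0, 1500.0, 1200.0, 50.0, 200.0, 70.0))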

  15. Precision measurement of the electromagnetic dipole strengths in ¹¹Be

    NASA Astrophysics Data System (ADS)

    Kwan, E.; Wu, C. Y.; Summers, N. C.; Hackman, G.; Drake, T. E.; Andreoiu, C.; Ashley, R.; Ball, G. C.; Bender, P. C.; Boston, A. J.; Boston, H. C.; Chester, A.; Close, A.; Cline, D.; Cross, D. S.; Dunlop, R.; Finlay, A.; Garnsworthy, A. B.; Hayes, A. B.; Laffoley, A. T.; Nano, T.; Navrátil, P.; Pearson, C. J.; Pore, J.; Quaglioni, S.; Svensson, C. E.; Starosta, K.; Thompson, I. J.; Voss, P.; Williams, S. J.; Wang, Z. M.

    2014-05-01

    The electromagnetic dipole strength in ¹¹Be between the bound states has been measured using low-energy projectile Coulomb excitation at bombarding energies of 1.73 and 2.09 MeV/nucleon on a ¹⁹⁶Pt target. An electric dipole transition probability B(E1; 1/2⁻ → 1/2⁺) = 0.102(2) e² fm² was determined using the semi-classical code Gosia, and a value of 0.098(4) e² fm² was determined using the Extended Continuum Discretized Coupled Channels method with the quantum mechanical code FRESCO. These extracted B(E1) values are consistent with the average value determined by a model-dependent analysis of intermediate-energy Coulomb excitation measurements and are approximately 14% lower than the value determined by a lifetime measurement. The much-improved precisions of 2% and 4% in the measured B(E1) values between the bound states, deduced using Gosia and the Extended Continuum Discretized Coupled Channels method, respectively, compared to the previous accuracy of ~10%, will help improve our understanding of the realistic inter-nucleon interactions.

  16. Gyrokinetic continuum simulations of turbulence in the Texas Helimak

    NASA Astrophysics Data System (ADS)

    Bernard, T. N.; Shi, E. L.; Hammett, G. W.; Hakim, A.; Taylor, E. I.

    2017-10-01

    We have used the Gkeyll code to perform 3x-2v full-f gyrokinetic continuum simulations of electrostatic plasma turbulence in the Texas Helimak. The Helimak is an open field-line experiment with magnetic curvature and shear. It is useful for validating numerical codes due to its extensive diagnostics and simple, helical geometry, which is similar to the scrape-off layer region of tokamaks. Interchange and drift-wave modes are the main turbulence mechanisms in the device, and potential biasing is applied to study the effect of velocity shear on turbulence reduction. With Gkeyll, we varied field-line pitch angle and simulated biased and unbiased cases to study different turbulent regimes and turbulence reduction. These are the first kinetic simulations of the Helimak and resulting plasma profiles agree fairly well with experimental data. This research demonstrates Gkeyll's progress towards 5D simulations of the SOL region of fusion devices. Supported by the U.S. DOE SCGSR program under contract DE-SC0014664, the Max-Planck/Princeton Center for Plasma Physics, the SciDAC Center for the Study of Plasma Microturbulence, and DOE contract DE-AC02-09CH11466.

  17. Orion Aerodynamics for Hypersonic Free Molecular to Continuum Conditions

    NASA Technical Reports Server (NTRS)

    Moss, James N.; Greene, Francis A.; Boyles, Katie A.

    2006-01-01

    Numerical simulations are performed for the Orion Crew Module, previously known as the Crew Exploration Vehicle (CEV) Command Module, to characterize its aerodynamics during the high altitude portion of its reentry into the Earth's atmosphere, that is, from free molecular to continuum hypersonic conditions. The focus is on flow conditions similar to those that the Orion Crew Module would experience during a return from the International Space Station. The bulk of the calculations are performed with two direct simulation Monte Carlo (DSMC) codes, and these data are anchored with results from both free molecular and Navier-Stokes calculations. Results for aerodynamic forces and moments are presented that demonstrate their sensitivity to rarefaction, that is, for free molecular to continuum conditions (Knudsen numbers of 111 to 0.0003). Also included are aerodynamic data as a function of angle of attack for different levels of rarefaction and results that demonstrate the aerodynamic sensitivity of the Orion CM to a range of reentry velocities (7.6 to 15 km/s).
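
    The free-molecular-to-continuum range quoted above is conventionally organized by Knudsen number. The sketch below uses commonly cited, approximate regime boundaries; the cutoffs are conventions and are not taken from the paper.

      # Knudsen number Kn = lambda / L and a rough regime classification.
      def knudsen_number(mean_free_path_m, ref_length_m):
          return mean_free_path_m / ref_length_m

      def flow_regime(kn):
          """Approximate conventional boundaries; exact cutoffs vary by author."""
          if kn < 0.01:
              return "continuum (Navier-Stokes valid)"
          if kn < 0.1:
              return "slip flow"
          if kn < 10.0:
              return "transitional (DSMC territory)"
          return "free molecular"

      for kn in (0.0003, 0.05, 1.0, 111.0):     # spans the range quoted above
          print(f"Kn = {kn:>8}: {flow_regime(kn)}")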

  18. Blunt Body Aerodynamics for Hypersonic Low Density Flows

    NASA Technical Reports Server (NTRS)

    Moss, James N.; Glass, Christopher E.; Greene, Francis A.

    2006-01-01

    Numerical simulations are performed for the Apollo capsule from the hypersonic rarefied to the continuum regimes. The focus is on flow conditions similar to those experienced by the Apollo 6 Command Module during the high altitude portion of its reentry. The present focus is to highlight some of the current activities that serve as a precursor for computational tool assessments that will be used to support the development of aerodynamic data bases for future capsule flight environments, particularly those for the Crew Exploration Vehicle (CEV). Results for aerodynamic forces and moments are presented that demonstrate their sensitivity to rarefaction; that is, free molecular to continuum conditions. Also, aerodynamic data are presented that shows their sensitivity to a range of reentry velocities, encompassing conditions that include reentry from low Earth orbit, lunar return, and Mars return velocities (7.7 to 15 km/s). The rarefied results obtained with direct simulation Monte Carlo (DSMC) codes are anchored in the continuum regime with data from Navier-Stokes simulations.

  19. Photoionization Modeling with TITAN Code, Distance to the Warm Absorber in AGN

    NASA Astrophysics Data System (ADS)

    Różańska, A.

    2012-08-01

    We present a method that allows us to estimate a distance from the source of continuum radiation located in the center of an AGN to the highly ionized gas of the warm absorber (WA). We computed a set of constant total pressure photoionization models compatible with the warm absorber conditions, where a metal-rich gas is irradiated by a continuum in the form of a double power law. The first power law is hard, up to 100 keV, and represents radiation from an X-ray source, while the second extends only up to several eV and represents radiation from an accretion disk. When the ionized continuum is dominated by the soft component, the warm absorber is heated by free-free absorption instead of Comptonization, and the transmitted spectra show different absorption-line characteristics for different values of the hydrogen number density at the cloud illuminated surface. This makes it possible to derive the number density on the cloud illuminated side from observations, and hence the distance to the warm absorber.
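
    The link between a density inferred at the cloud's illuminated side and a distance rests on the definition of the ionization parameter; in one common convention (the normalization adopted in the TITAN modeling may differ):

      % Ionization parameter of a photoionized cloud and the implied distance:
      \[
        \xi \;=\; \frac{L_{\mathrm{ion}}}{n_{\mathrm{H}}\, r^{2}}
        \qquad\Longrightarrow\qquad
        r \;=\; \sqrt{\frac{L_{\mathrm{ion}}}{n_{\mathrm{H}}\,\xi}},
      \]
      % so a measured ionizing luminosity L_ion, a fitted xi, and a density n_H
      % from the absorption-line spectrum together yield the warm-absorber distance r.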

  20. English in "The Tempest": The Value of Metaphor and Re-Imagining Grammar in English

    ERIC Educational Resources Information Center

    Macken-Horarik, Mary

    2013-01-01

    Garth Boomer's thinking influenced many of those working in school English during the time he was alive. The ripple effects of his legacy continue to be felt. For the author, it is Boomer's interests in metaphor and meaning that resonate most. The use of tropes and figure is a distinctive feature of his writing and offers a rich…

  1. The Multicolored Patchwork Portraiture of an Effective Veteran High School Special Education Teacher Amidst the Tempest of the High Stakes Testing Movement

    ERIC Educational Resources Information Center

    Bicehouse, Vaughn L.

    2010-01-01

    This single-subject study used the art and science of portraiture to illuminate a veteran special education teacher who is meeting the needs of her students with disabilities. This qualitative study was not done for the purposes of generalization but rather to show how this remarkable and effective special educator acts as an inspirational…

  2. Trio of Tempests

    NASA Image and Video Library

    2017-10-04

    Three distinct active regions with towering arches above them rotated into view over a three-day period (Sept. 24-26, 2017). In extreme ultraviolet light, charged particles that are spinning along the ever-changing magnetic field lines above the active regions make the lines visible. To give some sense of scale, the largest arches rose up many times the size of Earth. Movies are available at https://photojournal.jpl.nasa.gov/catalog/PIA22038

  3. Photometric Detection of Extra-Solar Planets

    NASA Technical Reports Server (NTRS)

    Hatzes, Artie P.; Cochran, William D.

    2004-01-01

    This NASA Origins Program grant supported the TEMPEST (Texas McDonald Photometric Extrasolar Search for Transits) program at McDonald Observatory, which searches for transits of extrasolar planets across the disks of their parent stars. The basic approach is to use a wide-field ground-based telescope (in our case the McDonald Observatory 0.76m telescope and its Prime Focus Corrector) to search for transits of short-period (1-15 day orbits), close-in hot-Jupiter planets orbiting a large sample of field stars. The next task is to search these data streams for possible transit events. We collected our first set of test data for this program using the 0.76 m PFC in the summer of 1998. From those data, we developed the optimal observing procedures, including tailoring the stellar density, exposure times, and filters to best suit the instrument and project. In the summer of 1999, we obtained the first partial season of data on a dedicated field in the constellation Cygnus. These data were used to develop and refine the reduction and analysis procedures to produce high-precision photometry and search for transits in the resulting light curves. The TEMPEST project subsequently obtained three full seasons of data on six different fields using the McDonald Observatory 0.76m PFC.
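
    The transit search described above is the classic hunt for periodic box-shaped dips. The sketch below runs astropy's BoxLeastSquares on a synthetic light curve; the period, depth, noise level, and sampling are invented, and none of this is the TEMPEST pipeline itself.

      # Box least squares transit search on a synthetic light curve (illustrative only).
      import numpy as np
      from astropy.timeseries import BoxLeastSquares

      rng = np.random.default_rng(0)
      t = np.sort(rng.uniform(0.0, 60.0, 4000))              # 60 days of sampling
      true_period, depth, duration = 3.7, 0.012, 0.12         # invented hot-Jupiter values
      flux = 1.0 - depth * ((t % true_period) < duration) + rng.normal(0.0, 0.004, t.size)

      bls = BoxLeastSquares(t, flux)
      result = bls.autopower(duration)                        # scan a grid of trial periods
      best = np.argmax(result.power)
      print(f"recovered period ~ {result.period[best]:.3f} d, depth ~ {result.depth[best]:.4f}")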

  4. Spring Dust Storm Smothers Beijing

    NASA Technical Reports Server (NTRS)

    2002-01-01

    A few days earlier than usual, a large, dense plume of dust blew southward and eastward from the desert plains of Mongolia, quite smothering to the residents of Beijing. Citizens of northeastern China call this annual event the 'shachenbao,' or 'dust cloud tempest.' However, the tempest normally occurs during the springtime. The dust storm hit Beijing on Friday night, March 15, and began coating everything with a fine, pale brown layer of grit. The region is quite dry, a problem some believe has been exacerbated by decades of deforestation. According to Chinese government estimates, roughly 1 million tons of desert dust and sand blow into Beijing each year. This true-color image was made using two adjacent swaths of data from the Sea-viewing Wide Field-of-view Sensor (SeaWiFS), flying aboard the OrbView-2 satellite, on March 17, 2002. The massive dust storm (brownish pixels) can easily be distinguished from clouds (bright white pixels) as it blows across northern Japan and eastward toward the open Pacific Ocean. The black regions are gaps between SeaWiFS' viewing swaths and represent areas where no data were collected. Image courtesy the SeaWiFS Project, NASA/Goddard Space Flight Center, and ORBIMAGE

  5. The Far Infrared Lines of OH as Molecular Cloud Diagnostics

    NASA Technical Reports Server (NTRS)

    Smith, Howard A.

    2004-01-01

    Future IR missions should give some priority to high resolution spectroscopic observations of the set of far-IR transitions of OH. There are 15 far-IR lines arising between the lowest eight rotational levels of OH, and ISO detected nine of them. Furthermore, ISO found the OH lines, sometimes in emission and sometimes in absorption, in a wide variety of galactic and extragalactic objects ranging from AGB stars to molecular clouds to active galactic nuclei and ultra-luminous IR galaxies. The ISO/LWS Fabry-Perot resolved the 119 μm doublet line in a few of the strong sources. This set of OH lines provides a uniquely important diagnostic for many reasons: the lines span a wide wavelength range (28.9 μm to 163.2 μm); the transitions have fast radiative rates; the abundance of the species is relatively high; the IR continuum plays an important role as a pump; the contribution from shocks is relatively minor; and, not least, the powerful centimeter-wave radiation from OH allows comparison with radio and VLBI datasets. The problem is that the large number of sensitive free parameters, and the large optical depths of the strongest lines, make modeling the full set a difficult job. The SWAS Monte Carlo radiative transfer code has been used to analyze the ISO/LWS spectra of a number of objects with good success, including both the lines and the FIR continuum; the DUSTY radiative transfer code was used to ensure a self-consistent continuum. Other far-IR lines, including those from H2O, CO, and [OI], are also in the code. The OH lines all show features which future FIR spectrometers should be able to resolve, and which will enable further refinements in the details of each cloud's structure. Some examples are given, including the case of S140, for which independent SWAS data found evidence for bulk flows.

  6. Broadband Photometric Reverberation Mapping Analysis on SDSS-RM and Stripe 82 Quasars

    NASA Astrophysics Data System (ADS)

    Zhang, Haowen; Yang, Qian; Wu, Xue-Bing

    2018-02-01

    We modified the broadband photometric reverberation mapping (PRM) code JAVELIN and tested its ability to recover broad-line region time delays consistent with those from the spectroscopic reverberation mapping (SRM) project SDSS-RM. The broadband light curves of SDSS-RM quasars, produced by convolution with the system transmission curves, were used in the test. We found that under similar sampling conditions (evenly and frequently sampled), the key factor determining whether the broadband PRM code can yield lags consistent with the SRM project is the flux ratio of the broad emission line to the reference continuum, which is in line with previous findings. We further found a critical line-to-continuum flux ratio, about 6%, above which the mean of the ratios between the lags from PRM and SRM becomes closer to unity and the scatter is pronouncedly reduced. We also tested our code on a subset of SDSS Stripe 82 quasars, and found that our program tends to give biased lag estimations due to the observation gaps when the R-L relation prior in the Markov Chain Monte Carlo is discarded. The performance of the damped random walk (DRW) model and the power-law (PL) structure function model on broadband PRM were compared. We found that given either SDSS-RM-like or Stripe 82-like light curves, the DRW model performs better in carrying out broadband PRM than the PL model.
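
    A hedged sketch (synthetic spectrum and filter, not SDSS-RM data) of the quantity the abstract identifies as decisive: the broad-line to reference-continuum flux ratio inside a broadband filter, computed by integrating a spectrum against a transmission curve.

        import numpy as np

        def band_flux(wave, flux, trans_wave, trans):
            """Integrate a spectrum through a filter transmission curve (arbitrary units)."""
            t = np.interp(wave, trans_wave, trans, left=0.0, right=0.0)
            return np.trapz(flux * t, wave)

        # Toy spectrum: flat continuum plus a Gaussian broad emission line
        wave = np.linspace(4000.0, 9000.0, 5000)                      # Angstrom
        cont = np.ones_like(wave)
        line = 1.0 * np.exp(-0.5 * ((wave - 6563.0) / 40.0) ** 2)     # broad H-alpha-like line

        # Toy r-band-like transmission curve (top hat, purely illustrative)
        trans_wave = np.array([5500.0, 5600.0, 6900.0, 7000.0])
        trans = np.array([0.0, 1.0, 1.0, 0.0])

        f_line = band_flux(wave, line, trans_wave, trans)
        f_cont = band_flux(wave, cont, trans_wave, trans)
        print("line-to-continuum flux ratio in band: %.1f%%" % (100 * f_line / f_cont))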

  7. Axisymmetric Plume Simulations with NASA's DSMC Analysis Code

    NASA Technical Reports Server (NTRS)

    Stewart, B. D.; Lumpkin, F. E., III

    2012-01-01

    A comparison of axisymmetric Direct Simulation Monte Carlo (DSMC) Analysis Code (DAC) results to analytic and Computational Fluid Dynamics (CFD) solutions in the near continuum regime and to 3D DAC solutions in the rarefied regime for expansion plumes into a vacuum is performed to investigate the validity of the newest DAC axisymmetric implementation. This new implementation, based on the standard DSMC axisymmetric approach where the representative molecules are allowed to move in all three dimensions but are rotated back to the plane of symmetry by the end of the move step, has been fully integrated into the 3D-based DAC code and therefore retains all of DAC's features, such as being able to compute flow over complex geometries and to model chemistry. Axisymmetric DAC results for a spherically symmetric isentropic expansion are in very good agreement with a source flow analytic solution in the continuum regime and show departure from equilibrium downstream of the estimated breakdown location. Axisymmetric density contours also compare favorably against CFD results for the R1E thruster while temperature contours depart from equilibrium very rapidly away from the estimated breakdown surface. Finally, axisymmetric and 3D DAC results are in very good agreement over the entire plume region and, as expected, this new axisymmetric implementation shows a significant reduction in computer resources required to achieve accurate simulations for this problem over the 3D simulations.
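
    A minimal sketch of the standard axisymmetric DSMC bookkeeping described above (illustrative only, not DAC source code): a particle is moved in full 3D Cartesian coordinates and then rotated back into the r-z symmetry plane at the end of the move step, with its velocity rotated by the same angle.

        import numpy as np

        def axisymmetric_move(r, z, vel, dt):
            """Move one particle in 3D, then rotate it back to the y = 0 half-plane.

            r, z : position in the symmetry plane (r >= 0)
            vel  : velocity [vx, vy, vz] with x along the radial direction
            """
            # Step 1: free-flight move in Cartesian coordinates (particle starts on the plane)
            x = r + vel[0] * dt
            y = 0.0 + vel[1] * dt
            z = z + vel[2] * dt

            # Step 2: rotate the particle (and its velocity) back to the symmetry plane
            r_new = np.hypot(x, y)
            if r_new > 0.0:
                cos_t, sin_t = x / r_new, y / r_new
            else:
                cos_t, sin_t = 1.0, 0.0
            vx =  cos_t * vel[0] + sin_t * vel[1]
            vy = -sin_t * vel[0] + cos_t * vel[1]
            return r_new, z, np.array([vx, vy, vel[2]])

        print(axisymmetric_move(1.0, 0.0, np.array([0.0, 100.0, 50.0]), 1e-3))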

  8. XGC developments for a more efficient XGC-GENE code coupling

    NASA Astrophysics Data System (ADS)

    Dominski, Julien; Hager, Robert; Ku, Seung-Hoe; Chang, C. S.

    2017-10-01

    In the Exascale Computing Program, the High-Fidelity Whole Device Modeling project initially aims at delivering a tightly coupled simulation of plasma neoclassical and turbulence dynamics from the core to the edge of the tokamak. To permit such simulations, the gyrokinetic codes GENE and XGC will be coupled together. Numerical efforts are being made to improve the agreement of the numerical schemes in the coupling region. One of the difficulties of coupling these codes is the incompatibility of their grids: GENE is a continuum grid-based code and XGC is a particle-in-cell code using an unstructured triangular mesh. A field-aligned filter has thus been implemented in XGC. Although XGC originally had an approximately field-following mesh, this field-aligned filter permits a perturbation discretization closer to the one solved in the field-aligned code GENE. Additionally, new XGC gyro-averaging matrices are implemented on a velocity grid adapted to the plasma properties, thus ensuring the same accuracy from the core to the edge regions.

  9. A hydrodynamic approach to cosmology - Methodology

    NASA Technical Reports Server (NTRS)

    Cen, Renyue

    1992-01-01

    The present study describes an accurate and efficient hydrodynamic code for evolving self-gravitating cosmological systems. The hydrodynamic code is a flux-based mesh code originally designed for engineering hydrodynamical applications. A variety of checks were performed which indicate that the resolution of the code is a few cells, providing accuracy for integral energy quantities in the present simulations of 1-3 percent over the whole runs. Six species (H I, H II, He I, He II, He III, and free electrons) are tracked separately, and relevant ionization and recombination processes, as well as line and continuum heating and cooling, are computed. The background radiation field is simultaneously determined in the range 1 eV to 100 keV, allowing for absorption, emission, and cosmological effects. It is shown how the inevitable numerical inaccuracies can be estimated and to some extent overcome.
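
    A hedged sketch of the kind of per-cell species update such a code performs (the rate coefficients and the implicit update below are illustrative, not the scheme of the paper): hydrogen ionization balance advanced with collisional ionization and radiative recombination.

        import numpy as np

        def advance_hydrogen(n_HI, n_HII, n_e, T, dt):
            """One backward-Euler step of the H ionization balance (toy rate coefficients).

            dn_HII/dt = C(T) n_e n_HI - alpha(T) n_e n_HII
            """
            C = 5.85e-11 * np.sqrt(T) * np.exp(-157809.1 / T)   # collisional ionization (cm^3/s), toy
            alpha = 2.59e-13 * (T / 1.0e4) ** -0.7              # case-B recombination (cm^3/s), toy
            n_H = n_HI + n_HII
            # Implicit update for n_HII, holding n_e and T fixed over the step
            n_HII_new = (n_HII + dt * C * n_e * n_H) / (1.0 + dt * n_e * (C + alpha))
            return n_H - n_HII_new, n_HII_new

        print(advance_hydrogen(n_HI=1e-3, n_HII=0.0, n_e=1e-3, T=2.0e4, dt=1e10))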

  10. New PANDA Tests to Investigate Effects of Light Gases on Passive Safety Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paladino, D.; Auban, O.; Candreia, P.

    The large-scale thermal-hydraulic PANDA facility (located at PSI in Switzerland) has been used over the last few years for investigating different passive decay-heat removal systems and containment phenomena for the next generation of light water reactors (Simplified Boiling Water Reactor: SBWR; European Simplified Boiling Water Reactor: ESBWR; Siedewasserreaktor: SWR-1000). Currently, as part of the European Commission 5th EURATOM Framework Programme project 'Testing and Enhanced Modelling of Passive Evolutionary Systems Technology for Containment Cooling' (TEMPEST), a new series of tests is being planned in the PANDA facility to experimentally investigate the distribution of non-condensable gases inside the containment and their effect on the performance of the 'Passive Containment Cooling System' (PCCS). Hydrogen release caused by the metal-water reaction in the case of a postulated severe accident will be simulated in PANDA by injecting helium into the reactor pressure vessel. In order to provide suitable data for Computational Fluid Dynamic (CFD) code assessment and improvement, the instrumentation in PANDA has been upgraded for the new tests. In the present paper, a detailed discussion is given of the new PANDA tests to be performed to investigate the effects of light gas on passive safety systems. The tests are scheduled for the first half of the year 2002. (authors)

  11. From High School to University: Students' Competences Recycled

    ERIC Educational Resources Information Center

    Dias, Diana; Sa, Maria Jose

    2012-01-01

    The process of transition from high school to higher education might be viewed as a continuum of learning new codes of conduct that guide the exercise of a (re)new(ed) student craft. This article presents a qualitative analysis of the results of interviews conducted with students, focusing on the need for students to trigger a set of adaptive…

  12. Electric Propulsion Test and Evaluation Methodologies for Plasma in the Environments of Space and Testing (EP TEMPEST)

    DTIC Science & Technology

    2016-04-14

    Swanson, AEDC. Path 1: magnetized electron transport impeded across magnetic field lines; transport via electron-particle collisions. Path 2*: electron... T&E (higher pressure, metallic walls) impacts stability, performance, plume properties, and thruster lifetime. Development of T&E methodologies: current-voltage-magnetic field (I-V-B) mapping; facility interaction studies; background pressure; plasma-wall...

  13. Tempest on the Hudson: The Struggle for "Equal Pay for Equal Work" in the New York City Public Schools, 1907-1911.

    ERIC Educational Resources Information Center

    Doherty, Robert E.

    1979-01-01

    Traces trends in salaries paid to male and female public school teachers in New York City during a four-year period in the early twentieth century. Findings indicate that, in direct opposition to the situation around the turn of the century, there were few school districts that differentiated in the 1970s in salary on the basis of sex. (DB)

  14. The evolution of science literacy: Examining intertextual connections and inquiry behaviors in the classroom

    NASA Astrophysics Data System (ADS)

    Manocchi-Verrino, Carol J.

    A call for a new perspective of science literacy has been marked as the impetus of change in science education, suggesting that a meaning-making approach to literacy and inquiry are central to learning science. This research study explored how science literacy evolved in a classroom where this reconceptualized view of science literacy guided curriculum design and instruction. The teacher/researcher incorporated Interactive Science Notebooks (ISNs) and Interactive Reading Organizers and Comprehension Strategies (IROCS) into instructional materials. In a class consisting of 20 mainstream and special education students, this 7-week study collected data using Likert scales, stimulated recall interviews, a teacher/researcher journal, and students' position papers. A systematic design framework was used for the three-phase analysis. HyperRESEARCH software facilitated the identification of open codes, an axial code, and frequency graphs. In order to develop insight into the relationship between questions, methods, and curriculum design, recent recommendations for quality research in science education were considered in the methodology. The hypothesis formulated from the data suggests that science literacy evolves on a continuum, and the degree to which science literacy evolves on the continuum seems to be contingent upon students' uses of intertextual connections and inquiry behaviors. Several notable insights emerged from the data which were used to guide curriculum, instruction, and assessment that promotes the development of science literacy in the middle school classroom. The study suggests a possible correlation between the use of intertextual connections and inquiry behaviors, and the use of a continuum in measuring the emergence of science literacy.

  15. Numerical investigation of non-perturbative kinetic effects of energetic particles on toroidicity-induced Alfvén eigenmodes in tokamaks and stellarators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Slaby, Christoph; Könies, Axel; Kleiber, Ralf

    2016-09-15

    The resonant interaction of shear Alfvén waves with energetic particles is investigated numerically in tokamak and stellarator geometry using a non-perturbative MHD-kinetic hybrid approach. The focus lies on toroidicity-induced Alfvén eigenmodes (TAEs), which are most easily destabilized by a fast-particle population in fusion plasmas. While the background plasma is treated within the framework of an ideal-MHD theory, the drive of the fast particles, as well as Landau damping of the background plasma, is modelled using the drift-kinetic Vlasov equation without collisions. Building on analytical theory, a fast numerical tool, STAE-K, has been developed to solve the resulting eigenvalue problem using a Riccati shooting method. The code, which can be used for parameter scans, is applied to tokamaks and the stellarator Wendelstein 7-X. High energetic-ion pressure leads to large growth rates of the TAEs and to their conversion into kinetically modified TAEs and kinetic Alfvén waves via continuum interaction. To better understand the physics of this conversion mechanism, the connections between TAEs and the shear Alfvén wave continuum are examined. It is shown that, when energetic particles are present, the continuum deforms substantially and the TAE frequency can leave the continuum gap. The interaction of the TAE with the continuum leads to singularities in the eigenfunctions. To further advance the physical model and also to eliminate the MHD continuum together with the singularities in the eigenfunctions, a fourth-order term connected to radiative damping has been included. The radiative damping term is connected to non-ideal effects of the bulk plasma and introduces higher-order derivatives to the model. Thus, it has the potential to substantially change the nature of the solution. For the first time, the fast-particle drive, Landau damping, continuum damping, and radiative damping have been modelled together in tokamak as well as in stellarator geometry.

  16. Color visualization for fluid flow prediction

    NASA Technical Reports Server (NTRS)

    Smith, R. E.; Speray, D. E.

    1982-01-01

    High-resolution raster scan color graphics allow variables to be presented as a continuum, in a color-coded picture that is referenced to a geometry such as a flow field grid or a boundary surface. Software is used to map a scalar variable such as pressure or temperature, defined on a two-dimensional slice of a flow field. The geometric shape is preserved in the resulting picture, and the relative magnitude of the variable is color-coded onto the geometric shape. The primary numerical process for color coding is an efficient search along a raster scan line to locate the quadrilateral block in the grid that bounds each pixel on the line. Tension spline interpolation is performed relative to the grid for specific values of the scalar variable, which is then color coded. When all pixels for the field of view are color-defined, a picture is played back from a memory device onto a television screen.
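
    A hedged sketch (not the original raster software; the color ramp and field are invented) of the core color-coding idea: normalize the scalar over its range and map each value to an RGB triple, so that the geometry of the slice is preserved while the magnitude is carried by color.

        import numpy as np

        def color_code(values, vmin=None, vmax=None):
            """Map a 2-D scalar field to RGB via a simple blue-to-red ramp."""
            vmin = np.min(values) if vmin is None else vmin
            vmax = np.max(values) if vmax is None else vmax
            t = np.clip((values - vmin) / (vmax - vmin), 0.0, 1.0)
            rgb = np.empty(values.shape + (3,))
            rgb[..., 0] = t                               # red grows with the scalar
            rgb[..., 1] = 1.0 - np.abs(2.0 * t - 1.0)     # green peaks mid-range
            rgb[..., 2] = 1.0 - t                         # blue for low values
            return rgb

        # Example: pressure-like field on a 2-D slice
        x, y = np.meshgrid(np.linspace(0, 1, 256), np.linspace(0, 1, 256))
        pressure = np.exp(-((x - 0.4) ** 2 + (y - 0.6) ** 2) / 0.05)
        image = color_code(pressure)          # 256 x 256 x 3 array ready for display
        print(image.shape, image.min(), image.max())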

  17. Recruitment of Itinerant Teachers of the Deaf and Hard of Hearing in Rural Arizona

    ERIC Educational Resources Information Center

    Thomas, Della W.

    2010-01-01

    Legislative mandate and judicial precedent guaranteeing a free and appropriate public education for students with disabilities can be challenging to uphold in rural areas. Thirteen of Arizona's 15 counties are rural according to the US Department of Agriculture's 2003 Rural-Urban Continuum Codes, making the challenge of filling…

  18. Reacting Chemistry Based Burn Model for Explosive Hydrocodes

    NASA Astrophysics Data System (ADS)

    Schwaab, Matthew; Greendyke, Robert; Steward, Bryan

    2017-06-01

    Currently, in hydrocodes designed to simulate explosive material undergoing shock-induced ignition, the state of the art is to use one of numerous reaction burn rate models. These burn models are designed to estimate the bulk chemical reaction rate. Unfortunately, these models are largely based on empirical data and must be recalibrated for every new material being simulated. We propose that the use of an equilibrium Arrhenius rate reacting chemistry model in place of these empirically derived burn models will improve the accuracy for these computational codes. Such models have been successfully used in codes simulating the flow physics around hypersonic vehicles. A reacting chemistry model of this form was developed for the cyclic nitramine RDX by the Naval Research Laboratory (NRL). Initial implementation of this chemistry based burn model has been conducted on the Air Force Research Laboratory's MPEXS multi-phase continuum hydrocode. In its present form, the burn rate is based on the destruction rate of RDX from NRL's chemistry model. Early results using the chemistry based burn model show promise in capturing deflagration to detonation features more accurately in continuum hydrocodes than previously achieved using empirically derived burn models.

  19. Pore-scale simulation of CO2-water-rock interactions

    NASA Astrophysics Data System (ADS)

    Deng, H.; Molins, S.; Steefel, C. I.; DePaolo, D. J.

    2017-12-01

    In Geologic Carbon Storage (GCS) systems, the migration of scCO2 versus CO2-acidified brine ultimately determines the extent of mineral trapping and caprock integrity, i.e. the long-term storage efficiency and security. While continuum scale multiphase reactive transport models are valuable for large scale investigations, they typically (over-)simplify pore-scale dynamics and cannot capture local heterogeneities that may be important. Therefore, pore-scale models are needed in order to provide mechanistic understanding of how fine scale structural variations and heterogeneous processes influence the transport and geochemistry in the context of multiphase flow, and to inform parameterization of continuum scale modeling. In this study, we investigate the interplay of different processes at pore scale (e.g. diffusion, reactions, and multiphase flow) through the coupling of a well-developed multiphase flow simulator with a sophisticated reactive transport code. The objectives are to understand where brine displaced by scCO2 will reside in a rough pore/fracture, and how the CO2-water-rock interactions may affect the redistribution of different phases. In addition, the coupled code will provide a platform for model testing in pore-scale multiphase reactive transport problems.

  20. Three-dimensional modeling of flow through fractured tuff at Fran Ridge

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eaton, R.R.; Ho, C.K.; Glass, R.J.

    1996-09-01

    Numerical studies have been made of an infiltration experiment at Fran Ridge using the TOUGH2 code to aid in the selection of computational models for performance assessment. The exercise investigates the capabilities of TOUGH2 to model transient flows through highly fractured tuff and provides a possible means of calibration. Two distinctly different conceptual models were used in the TOUGH2 code, the dual permeability model and the equivalent continuum model. The infiltration test modeled involved the infiltration of dyed ponded water for 36 minutes. The 205 gallon infiltration of water observed in the experiment was subsequently modeled using measured Fran Ridge fracture frequencies, and a specified fracture aperture of 285 μm. The dual permeability formulation predicted considerable infiltration along the fracture network, which was in agreement with the experimental observations. As expected, minimal fracture penetration of the infiltrating water was calculated using the equivalent continuum model, thus demonstrating that this model is not appropriate for modeling the highly transient experiment. It is therefore recommended that the dual permeability model be given priority when computing high-flux infiltration for use in performance assessment studies.

  1. Three-dimensional modeling of flow through fractured tuff at Fran Ridge

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eaton, R.R.; Ho, C.K.; Glass, R.J.

    1996-01-01

    Numerical studies have been made of an infiltration experiment at Fran Ridge using the TOUGH2 code to aid in the selection of computational models for performance assessment. The exercise investigates the capabilities of TOUGH2 to model transient flows through highly fractured tuff and provides a possible means of calibration. Two distinctly different conceptual models were used in the TOUGH2 code, the dual permeability model and the equivalent continuum model. The infiltration test modeled involved the infiltration of dyed ponded water for 36 minutes. The 205 gallon infiltration of water observed in the experiment was subsequently modeled using measured Fran Ridge fracture frequencies, and a specified fracture aperture of 285 μm. The dual permeability formulation predicted considerable infiltration along the fracture network, which was in agreement with the experimental observations. As expected, minimal fracture penetration of the infiltrating water was calculated using the equivalent continuum model, thus demonstrating that this model is not appropriate for modeling the highly transient experiment. It is therefore recommended that the dual permeability model be given priority when computing high-flux infiltration for use in performance assessment studies.

  2. All Rural Places Are Not Created Equal: Revisiting the Rural Mortality Penalty in the United States

    PubMed Central

    2014-01-01

    Objectives. I investigated mortality disparities between urban and rural areas by measuring disparities in urban US areas compared with 6 rural classifications, ranging from suburban to remote locales. Methods. Data from the Compressed Mortality File, National Center for Health Statistics, from 1968 to 2007, were used to calculate age-adjusted mortality rates for all rural and urban regions by year. Criteria measuring disparity between regions included excess deaths, annual rate of change in mortality, and proportion of excess deaths by population size. I used multivariable analysis to test for differences in determinants across regions. Results. The rural mortality penalty existed in all rural classifications, but the degree of disparity varied considerably. Rural–urban continuum code 6 was highly disadvantaged, and rural–urban continuum code 9 displayed a favorable mortality profile. Population, socioeconomic, and health care determinants of mortality varied across regions. Conclusions. A 2-decade-long trend in mortality disparities existed in all rural classifications, but the penalty was not distributed evenly. This constitutes an important public health problem. Research should target the slow rates of improvement in mortality in the rural United States as an area of concern. PMID:25211763
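
    A hedged sketch (invented numbers, not the study's data) of direct age standardization, the operation behind the age-adjusted mortality rates compared across the rural-urban continuum codes in this study.

        # Direct age standardization: weight each age-specific rate by a standard population share.
        def age_adjusted_rate(deaths, population, std_weights):
            """deaths, population, std_weights: lists over age groups; weights sum to 1."""
            return sum(w * (d / p) for d, p, w in zip(deaths, population, std_weights)) * 100_000

        # Toy example with three age groups (result is per 100,000 population)
        deaths      = [40, 300, 2500]
        population  = [50_000, 40_000, 10_000]
        std_weights = [0.40, 0.45, 0.15]
        print(round(age_adjusted_rate(deaths, population, std_weights), 1))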

  3. General phase spaces: from discrete variables to rotor and continuum limits

    NASA Astrophysics Data System (ADS)

    Albert, Victor V.; Pascazio, Saverio; Devoret, Michel H.

    2017-12-01

    We provide a basic introduction to discrete-variable, rotor, and continuous-variable quantum phase spaces, explaining how the latter two can be understood as limiting cases of the first. We extend the limit-taking procedures used to travel between phase spaces to a general class of Hamiltonians (including many local stabilizer codes) and provide six examples: the Harper equation, the Baxter parafermionic spin chain, the Rabi model, the Kitaev toric code, the Haah cubic code (which we generalize to qudits), and the Kitaev honeycomb model. We obtain continuous-variable generalizations of all models, some of which are novel. The Baxter model is mapped to a chain of coupled oscillators and the Rabi model to the optomechanical radiation pressure Hamiltonian. The procedures also yield rotor versions of all models, five of which are novel many-body extensions of the almost Mathieu equation. The toric and cubic codes are mapped to lattice models of rotors, with the toric code case related to U(1) lattice gauge theory.

  4. Two-dimensional implosion simulations with a kinetic particle code [2D implosion simulations with a kinetic particle code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sagert, Irina; Even, Wesley Paul; Strother, Terrance Timothy

    Here, we perform two-dimensional implosion simulations using a Monte Carlo kinetic particle code. The application of a kinetic transport code is motivated, in part, by the occurrence of nonequilibrium effects in inertial confinement fusion capsule implosions, which cannot be fully captured by hydrodynamic simulations. Kinetic methods, on the other hand, are able to describe both continuum and rarefied flows. We perform simple two-dimensional disk implosion simulations using one-particle species and compare the results to simulations with the hydrodynamics code rage. The impact of the particle mean free path on the implosion is also explored. In a second study, we focus on the formation of fluid instabilities from induced perturbations. We find good agreement with hydrodynamic studies regarding the location of the shock and the implosion dynamics. Differences are found in the evolution of fluid instabilities, originating from the higher resolution of rage and statistical noise in the kinetic studies.

  5. Two-dimensional implosion simulations with a kinetic particle code [2D implosion simulations with a kinetic particle code

    DOE PAGES

    Sagert, Irina; Even, Wesley Paul; Strother, Terrance Timothy

    2017-05-17

    Here, we perform two-dimensional implosion simulations using a Monte Carlo kinetic particle code. The application of a kinetic transport code is motivated, in part, by the occurrence of nonequilibrium effects in inertial confinement fusion capsule implosions, which cannot be fully captured by hydrodynamic simulations. Kinetic methods, on the other hand, are able to describe both continuum and rarefied flows. We perform simple two-dimensional disk implosion simulations using one-particle species and compare the results to simulations with the hydrodynamics code rage. The impact of the particle mean free path on the implosion is also explored. In a second study, we focus on the formation of fluid instabilities from induced perturbations. We find good agreement with hydrodynamic studies regarding the location of the shock and the implosion dynamics. Differences are found in the evolution of fluid instabilities, originating from the higher resolution of rage and statistical noise in the kinetic studies.

  6. A Tutorial Introduction to Bayesian Models of Cognitive Development

    DTIC Science & Technology

    2011-01-01

    typewriter with an infinite amount of paper. There is a space of documents that it is capable of producing, which includes things like The Tempest and does not include, say, a Vermeer painting or a poem written in Russian. This typewriter represents a means of generating the hypothesis space for a Bayesian learner: each possible document that can be typed on it is a hypothesis, the infinite set of documents producible by the typewriter is the latent

  7. L-band Soil Moisture Mapping using Small UnManned Aerial Systems

    NASA Astrophysics Data System (ADS)

    Dai, E.

    2015-12-01

    Soil moisture is of fundamental importance to many hydrological, biological and biogeochemical processes, plays an important role in the development and evolution of convective weather and precipitation, and impacts water resource management, agriculture, and flood runoff prediction. The launch of NASA's Soil Moisture Active/Passive (SMAP) mission in 2015 promises to provide global measurements of soil moisture and surface freeze/thaw state at fixed crossing times and spatial resolutions as low as 5 km for some products. However, there exists a need for measurements of soil moisture on smaller spatial scales and at arbitrary diurnal times for SMAP validation, precision agriculture, and evaporation and transpiration studies of boundary layer heat transport. The Lobe Differencing Correlation Radiometer (LDCR) provides a means of mapping soil moisture on spatial scales as small as several meters (i.e., the height of the platform). Compared with various other proposed methods of validation based either on in situ measurements [1,2] or on existing airborne sensors suitable for manned aircraft deployment [3], the integrated design of the LDCR on a lightweight small UAS (sUAS) is capable of providing sub-watershed (~km scale) coverage at very high spatial resolution (~15 m) suitable for scaling studies, and at comparatively low operator cost. The LDCR flown on the Tempest unit can supply soil moisture mapping at a spatial resolution of order the Tempest altitude.

  8. Understanding the HMI Pseudocontinuum in White-light Solar Flares

    NASA Astrophysics Data System (ADS)

    Švanda, Michal; Jurčák, Jan; Kašparová, Jana; Kleint, Lucia

    2018-06-01

    We analyze observations of the X9.3 solar flare (SOL2017-09-06T11:53) observed by SDO/HMI and Hinode/Solar Optical Telescope. Our aim is to learn about the nature of the HMI pseudocontinuum I_c used as a proxy for the white-light continuum. From model atmospheres retrieved by an inversion code applied to the Stokes profiles observed by the Hinode satellite, we synthesize profiles of the Fe I 617.3 nm line and compare them to HMI observations. Based on a pixel-by-pixel comparison, we show that the value of I_c represents the continuum level well in quiet-Sun regions only. In magnetized regions, it suffers from a simplistic algorithm that is applied to a complex line shape. During this flare, both instruments also registered emission profiles in the flare ribbons. Such emission profiles are poorly represented by the six spectral points of HMI and the MDI-like algorithm does not account for emission profiles in general; thus, the derived pseudocontinuum intensity does not approximate the continuum value properly.

  9. Center-to-limb variation of intensity and polarization in continuum spectra of FGK stars for spherical atmospheres

    NASA Astrophysics Data System (ADS)

    Kostogryz, N. M.; Milic, I.; Berdyugina, S. V.; Hauschildt, P. H.

    2016-02-01

    Aims: One of the necessary parameters needed for the interpretation of the light curves of transiting exoplanets or eclipsing binary stars (as well as interferometric measurements of a star or microlensing events) is how the intensity and polarization of light change from the center to the limb of a star. Scattering and absorption processes in the stellar atmosphere affect both the center-to-limb variation of intensity (CLVI) and polarization (CLVP). In this paper, we present a study of the CLVI and CLVP in continuum spectra, taking into consideration the different contributions of scattering and absorption opacity for a variety of spectral type stars with spherical atmospheres. Methods: We solve the radiative transfer equation for polarized light in the presence of continuum scattering, taking into consideration a spherical model of the stellar atmosphere. To cross-check our results, we developed two independent codes that are based on Feautrier and short characteristics methods, respectively. Results: We calculate the center-to-limb variation of intensity (CLVI) and polarization (CLVP) in continuum for the Phoenix grid of spherical stellar model atmospheres for a range of effective temperatures (4000-7000 K), gravities (log g = 1.0-5.5), and wavelengths (4000-7000 Å), which are tabulated and available at the CDS. In addition, we present several tests of our codes and compare our calculations for the solar atmosphere with published photometric and polarimetric measurements. We also show that our two codes provide similar results in all considered cases. Conclusions: For sub-giant and dwarf stars (log g = 3.0-4.5), the lower gravity and lower effective temperature of a star lead to higher limb polarization of the star. For giant and supergiant stars (log g = 1.0-2.5), the highest effective temperature yields the largest polarization. By decreasing the effective temperature of a star down to 4500-5500 K (depending on log g), the limb polarization decreases and reaches a local minimum. It increases again with a corresponding decrease in temperature down to 4000 K. For the most compact dwarf stars (log g = 5.0-5.5), the limb polarization degree shows a maximum for models with effective temperatures in the range 4200-4600 K (depending on log g) and decreases toward higher and lower temperatures. The intensity and polarization profiles are only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (ftp://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/586/A87

  10. User's manual for COAST 4: a code for costing and sizing tokamaks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sink, D. A.; Iwinski, E. M.

    1979-09-01

    The purpose of this report is to document the computer program COAST 4 for the user/analyst. COAST, COst And Size Tokamak reactors, provides complete and self-consistent size models for the engineering features of D-T burning tokamak reactors and associated facilities involving a continuum of performance including highly beam driven through ignited plasma devices. TNS (The Next Step) devices with no tritium breeding or electrical power production are handled as well as power producing and fissile producing fusion-fission hybrid reactors. The code has been normalized with a TFTR calculation which is consistent with cost, size, and performance data published in the conceptual design report for that device. Information on code development, computer implementation and detailed user instructions are included in the text.

  11. DYNECHARM++: a toolkit to simulate coherent interactions of high-energy charged particles in complex structures

    NASA Astrophysics Data System (ADS)

    Bagli, Enrico; Guidi, Vincenzo

    2013-08-01

    A toolkit for the simulation of coherent interactions between high-energy charged particles and complex crystal structures, called DYNECHARM++, has been developed. The code is written in C++, taking advantage of the object-oriented programming method. It is capable of evaluating the electrical characteristics of complex atomic structures and of simulating and tracking particle trajectories within them. A calculation method based on the expansion of the electrical characteristics in Fourier series has been adopted. Two different approaches to simulating the interaction have been adopted, relying on the full integration of particle trajectories under the continuum potential approximation and on the definition of cross-sections of coherent processes. Finally, the code has proved able to reproduce experimental results and to simulate the interaction of charged particles with complex structures.
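
    A hedged sketch of the Fourier-series representation mentioned above (the coefficients and interplanar distance are invented for illustration; this is not DYNECHARM++ code): a planar continuum potential reconstructed from a few harmonics of the interplanar period.

        import numpy as np

        def planar_potential(x, period, a0, an, bn):
            """Evaluate U(x) = a0 + sum_n [an cos(2 pi n x / d) + bn sin(2 pi n x / d)]."""
            u = np.full_like(x, a0, dtype=float)
            for n, (a, b) in enumerate(zip(an, bn), start=1):
                u += a * np.cos(2 * np.pi * n * x / period) + b * np.sin(2 * np.pi * n * x / period)
            return u

        d = 1.92        # interplanar distance in Angstrom (silicon-like value, illustrative)
        x = np.linspace(-d / 2, d / 2, 200)
        U = planar_potential(x, d, a0=10.0, an=[-8.0, -2.0, -0.5], bn=[0.0, 0.0, 0.0])  # eV, toy
        print(U.min(), U.max())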

  12. A coupled/uncoupled deformation and fatigue damage algorithm utilizing the finite element method

    NASA Technical Reports Server (NTRS)

    Wilt, Thomas E.; Arnold, Steven M.

    1994-01-01

    A fatigue damage computational algorithm utilizing a multiaxial, isothermal, continuum based fatigue damage model for unidirectional metal matrix composites has been implemented into the commercial finite element code MARC using MARC user subroutines. Damage is introduced into the finite element solution through the concept of effective stress which fully couples the fatigue damage calculations with the finite element deformation solution. An axisymmetric stress analysis was performed on a circumferentially reinforced ring, wherein both the matrix cladding and the composite core were assumed to behave elastic-perfectly plastic. The composite core behavior was represented using Hill's anisotropic continuum based plasticity model, and similarly, the matrix cladding was represented by an isotropic plasticity model. Results are presented in the form of S-N curves and damage distribution plots.
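
    As a hedged illustration of the effective-stress coupling mentioned above (the standard continuum damage mechanics form; the record does not give the exact expression used in the MARC subroutines), the scalar damage variable D degrades the stress that is fed back into the finite element deformation solution:

        \tilde{\sigma}_{ij} = \frac{\sigma_{ij}}{1 - D}, \qquad 0 \le D < 1 .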

  13. Time-Resolved Properties and Global Trends in dMe Flares from Simultaneous Photometry and Spectra

    NASA Astrophysics Data System (ADS)

    Kowalski, Adam F.

    We present a homogeneous survey of near-ultraviolet (NUV)/optical line and continuum emission during twenty M dwarf flares with simultaneous, high cadence photometry and spectra. These data were obtained to study the white-light continuum components to the blue and red of the Balmer jump, to break the degeneracy in fitting emission mechanisms to broadband colors, and to provide constraints for radiative-hydrodynamic flare models that seek to reproduce the white-light flare emission. The main results from the continuum analysis are the following: 1) the detection of Balmer continuum (in emission) that is present during all flares, with a wide range of relative contribution to the continuum flux in the NUV; 2) a blue continuum at the peak of the photometry that is linear with wavelength from λ = 4000-4800 Å, matched by the spectral shape of hot, blackbody emission with typical temperatures of 10 000 - 12 000 K; 3) a redder continuum apparent at wavelengths longer than Hβ; this continuum becomes relatively more important to the energy budget during the late gradual phase. The hot blackbody component and redder continuum component (which we call "the conundruum") have been detected in previous UBVR colorimetry studies of flares. With spectra, one can compare the properties and detailed timings of all three components. Using time-resolved spectra during the rise phase of three flares, we calculate the speed of an expanding flare region assuming a simple geometry; the speeds are found to be ~5-10 km s^-1 and 50-120 km s^-1, which are strikingly consistent with the speeds at which two-ribbon flares develop on the Sun. The main results from the emission line analysis are 1) the presentation of the "time-decrement", a relation between the timescales of the Balmer series; 2) a Neupert-like relation between Ca II K and the blackbody continuum, and 3) the detection of absorption wings in the hydrogen Balmer lines during times of peak continuum emission, indicative of hot-star spectra forming during the flare. A byproduct of this study is a new method for deriving absolute fluxes during M dwarf flare observations obtained from narrow-slit spectra or during variable weather conditions. This technique allows us to analyze the spectra and photometry independently of one another, in order to connect the spectral properties to the rise, peak, and decay phases of broadband light curve morphology. We classify the light curve morphology according to an "impulsiveness index" and find that the fast (impulsive) flares have less Balmer continuum at peak emission than the slow (gradual) flares. In the gradual phase, the energy budget of the flare spectrum during almost all flares has a larger contribution from the hydrogen Balmer component than in the impulsive phase, suggesting that the heating and cooling processes evolve over the course of a flare. We find that, in general, the evolution of the hot blackbody is rapid, and that the blackbody temperature decreases to ~8000 K in the gradual phase. The Balmer continuum evolves more slowly than the blackbody, similar to the higher order Balmer lines but faster than the lower order Balmer lines. The height of the Balmer jump increases during the gradual decay phase. We model the Balmer continuum emission using the RHD F11 model spectrum from Allred et al. (2006), but we discuss several important systematic uncertainties in relating the apparent amount of Balmer continuum to a given RHD beam model.
Good fits to the shape of the RHD F11 model spectrum are not obtained at peak times, in contrast to the gradual phase. We model the blackbody component using model hot star atmospheres from Castelli & Kurucz (2004) in order to account for the effects of flux redistribution in the flare atmosphere. This modeling is motivated by observations during a secondary flare in the decay phase of a megaflare, when the newly formed flare spectrum resembled that of Vega with the Balmer continuum and lines in absorption. We model this continuum phenomenologically with the RH code using hot spots placed at high column mass in the M dwarf quiescent atmosphere; a superposition of hot spot models and the RHD model is used to explain the anti-correlation between the apparent amount of Balmer continuum in emission and the U-band light curve. We attempt to reproduce the blackbody component in self-consistent 1D radiative hydrodynamic flare models using the RADYN code. We simulate the flare using a solar-type nonthermal electron beam heating function with a total energy flux of 10^12 erg cm^-2 s^-1 (F12) for a duration of 5 seconds and a subsequent gradual phase. Although there is a larger amount of NUV backwarming at log m_c/(1 g cm^-2) ~ 0 than in the F11 model, the resulting flare continuum shape is similar to the F11 model spectrum, with a larger Balmer jump and a much redder spectral shape than is seen in the observations. We do not find evidence of white-light emitting chromospheric condensations, in contrast to the previous F12 model of Livshits et al. (1981). We discuss future avenues for RHD modeling in order to produce a hot blackbody component, including the treatment of nonthermal protons in M dwarf flares.

  14. Analysis of Aeroheating Augmentation due to Reaction Control System Jets on Orion Crew Exploration Vehicle

    NASA Technical Reports Server (NTRS)

    Dyakonov, Artem A.; Buck, Gregory M.; Decaro, Anthony D.

    2009-01-01

    The analysis of effects of the reaction control system jet plumes on aftbody heating of Orion entry capsule is presented. The analysis covered hypersonic continuum part of the entry trajectory. Aerothermal environments at flight conditions were evaluated using Langley Aerothermal Upwind Relaxation Algorithm (LAURA) code and Data Parallel Line Relaxation (DPLR) algorithm code. Results show a marked augmentation of aftbody heating due to roll, yaw and aft pitch thrusters. No significant augmentation is expected due to forward pitch thrusters. Of the conditions surveyed the maximum heat rate on the aftshell is expected when firing a pair of roll thrusters at a maximum deceleration condition.

  15. The Atmospheric Response to High Nonthermal Electron Beam Fluxes in Solar Flares. I. Modeling the Brightest NUV Footpoints in the X1 Solar Flare of 2014 March 29

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kowalski, Adam F.; Allred, Joel C.; Daw, Adrian

    2017-02-10

    The 2014 March 29 X1 solar flare (SOL20140329T17:48) produced bright continuum emission in the far- and near-ultraviolet (NUV) and highly asymmetric chromospheric emission lines, providing long-sought constraints on the heating mechanisms of the lower atmosphere in solar flares. We analyze the continuum and emission line data from the Interface Region Imaging Spectrograph (IRIS) of the brightest flaring magnetic footpoints in this flare. We compare the NUV spectra of the brightest pixels to new radiative-hydrodynamic predictions calculated with the RADYN code using constraints on a nonthermal electron beam inferred from the collisional thick-target modeling of hard X-ray data from Reuven Ramaty High Energy Solar Spectroscopic Imager. We show that the atmospheric response to a high beam flux density satisfactorily achieves the observed continuum brightness in the NUV. The NUV continuum emission in this flare is consistent with hydrogen (Balmer) recombination radiation that originates from low optical depth in a dense chromospheric condensation and from the stationary beam-heated layers just below the condensation. A model producing two flaring regions (a condensation and stationary layers) in the lower atmosphere is also consistent with the asymmetric Fe II chromospheric emission line profiles observed in the impulsive phase.

  16. The Atmospheric Response to High Nonthermal Electron Beam Fluxes in Solar Flares. I. Modeling the Brightest NUV Footpoints in the X1 Solar Flare of 2014 March 29

    NASA Technical Reports Server (NTRS)

    Kowalski, Adam F.; Allred, Joel C.; Daw, Adrian N.; Cauzzi, Gianna; Carlsson, Mats

    2017-01-01

    The 2014 March 29 X1 solar flare (SOL20140329T17:48) produced bright continuum emission in the far- and near-ultraviolet (NUV) and highly asymmetric chromospheric emission lines, providing long-sought constraints on the heating mechanisms of the lower atmosphere in solar flares. We analyze the continuum and emission line data from the Interface Region Imaging Spectrograph (IRIS) of the brightest flaring magnetic footpoints in this flare. We compare the NUV spectra of the brightest pixels to new radiative-hydrodynamic predictions calculated with the RADYN code using constraints on a nonthermal electron beam inferred from the collisional thick-target modeling of hard X-ray data from Reuven Ramaty High Energy Solar Spectroscopic Imager. We show that the atmospheric response to a high beam flux density satisfactorily achieves the observed continuum brightness in the NUV. The NUV continuum emission in this flare is consistent with hydrogen (Balmer) recombination radiation that originates from low optical depth in a dense chromospheric condensation and from the stationary beam-heated layers just below the condensation. A model producing two flaring regions (a condensation and stationary layers) in the lower atmosphere is also consistent with the asymmetric Fe II chromospheric emission line profiles observed in the impulsive phase.

  17. ECON-KG: A Code for Computation of Electrical Conductivity Using Density Functional Theory

    DTIC Science & Technology

    2017-10-01

    is presented. Details of the implementation and instructions for execution are presented, and an example calculation of the frequency-dependent ... shown to depend on carbon content [3], and electrical conductivity models have become a requirement for input into continuum-level simulations being... The frequency-dependent electrical conductivity is computed as a weighted sum over k-points: σ(ω) = Σ_k W(k) σ(k, ω) (2), where W(k) is
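
    A hedged sketch of the weighted k-point average in Eq. (2) as reconstructed above (array shapes and weights are illustrative, not the ECON-KG implementation):

        import numpy as np

        def total_conductivity(sigma_k, weights):
            """sigma(omega) = sum_k W(k) * sigma_k(omega).

            sigma_k : array of shape (n_kpoints, n_frequencies)
            weights : array of shape (n_kpoints,), k-point weights over the Brillouin zone sample
            """
            weights = np.asarray(weights) / np.sum(weights)   # normalize defensively
            return np.tensordot(weights, np.asarray(sigma_k), axes=1)

        # Toy example: three k-points, five frequency points
        sigma_k = np.random.default_rng(1).random((3, 5))
        print(total_conductivity(sigma_k, weights=[0.5, 0.25, 0.25]))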

  18. Response to comments by Adam Smiarowski and Shane Mulè on: Christensen, N., and Lawrie, K., 2012. Resolution analyses for selecting an appropriate airborne electromagnetic (AEM) system, Exploration Geophysics, 43, 213-227

    NASA Astrophysics Data System (ADS)

    Christensen, Niels B.; Lawrie, Ken

    2015-06-01

    We analyse and compare the resolution improvement that can be obtained from including x-component data in the inversion of AEM data from the SkyTEM and TEMPEST systems. Except for the resistivity of the bottom layer, the SkyTEM system, even without including x-component data, has the better resolution of the parameters of the analysed models.

  19. TEMPEST/N33.5. Computational Fluid Dynamics Package For Incompressible, 3D, Time Dependent Pro

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trent, Dr.D.S.; Eyler, Dr.L.L.

    TEMPEST/N33.5 provides numerical solutions to general incompressible flow problems with coupled heat transfer in fluids and solids. Turbulence is modeled with a k-ε model, and gas, liquid, or solid constituents may be included with the bulk flow. Problems may be modeled in Cartesian or cylindrical coordinates. Limitations include incompressible flow, the Boussinesq approximation, and passive constituents. No direct steady-state solution is available; steady state is obtained as the limit of a transient.
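
    A hedged reminder of what the Boussinesq limitation noted above implies (standard textbook form, not quoted from the TEMPEST documentation): density is treated as constant everywhere except in the buoyancy term, where it varies linearly with temperature about a reference state,

        \rho \simeq \rho_{0}\left[1 - \beta\,(T - T_{0})\right], \qquad
        \mathbf{f}_{\mathrm{buoy}} = -\rho_{0}\,\beta\,(T - T_{0})\,\mathbf{g} .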

  20. RainCube 6U CubeSat

    NASA Image and Video Library

    2018-05-17

    The RainCube 6U CubeSat with fully-deployed antenna. RainCube, CubeRRT and TEMPEST-D are currently integrated aboard Orbital ATK's Cygnus spacecraft and are awaiting launch on an Antares rocket. After the CubeSats have arrived at the station, they will be deployed into low-Earth orbit and will begin their missions to test these new technologies useful for predicting weather, ensuring data quality, and helping researchers better understand storms. https://photojournal.jpl.nasa.gov/catalog/PIA22457

  1. MaMiCo: Transient multi-instance molecular-continuum flow simulation on supercomputers

    NASA Astrophysics Data System (ADS)

    Neumann, Philipp; Bian, Xin

    2017-11-01

    We present extensions of the macro-micro-coupling tool MaMiCo, which was designed to couple continuum fluid dynamics solvers with discrete particle dynamics. To enable local extraction of smooth flow field quantities especially on rather short time scales, sampling over an ensemble of molecular dynamics simulations is introduced. We provide details on these extensions including the transient coupling algorithm, open boundary forcing, and multi-instance sampling. Furthermore, we validate the coupling in Couette flow using different particle simulation software packages and particle models, i.e. molecular dynamics and dissipative particle dynamics. Finally, we demonstrate the parallel scalability of the molecular-continuum simulations by using up to 65 536 compute cores of the supercomputer Shaheen II located at KAUST.
    Program Files doi: http://dx.doi.org/10.17632/w7rgdrhb85.1
    Licensing provisions: BSD 3-clause
    Programming language: C, C++
    External routines/libraries: For compiling: SCons, MPI (optional)
    Subprograms used: ESPResSo, LAMMPS, ls1 mardyn, waLBerla. For installation procedures of the MaMiCo interfaces, see the README files in the respective code directories located in coupling/interface/impl.
    Journal reference of previous version: P. Neumann, H. Flohr, R. Arora, P. Jarmatz, N. Tchipev, H.-J. Bungartz. MaMiCo: Software design for parallel molecular-continuum flow simulations, Computer Physics Communications 200: 324-335, 2016
    Does the new version supersede the previous version?: Yes. The functionality of the previous version is completely retained in the new version.
    Nature of problem: Coupled molecular-continuum simulation for multi-resolution fluid dynamics: parts of the domain are resolved by molecular dynamics or another particle-based solver whereas large parts are covered by a mesh-based CFD solver, e.g. a lattice Boltzmann automaton.
    Solution method: We couple existing MD and CFD solvers via MaMiCo (macro-micro coupling tool). Data exchange and coupling algorithmics are abstracted and incorporated in MaMiCo. Once an algorithm is set up in MaMiCo, it can be used and extended, even if other solvers are used (as soon as the respective interfaces are implemented/available).
    Reasons for the new version: We have incorporated a new algorithm to simulate transient molecular-continuum systems and to automatically sample data over multiple MD runs that can be executed simultaneously (on, e.g., a compute cluster). MaMiCo has further been extended by an interface to incorporate boundary forcing to account for open molecular dynamics boundaries. Besides support for coupling with various MD and CFD frameworks, the new version contains a test case that allows to run molecular-continuum Couette flow simulations out-of-the-box. No external tools or simulation codes are required anymore. However, the user is free to switch from the included MD simulation package to LAMMPS. For details on how to run the transient Couette problem, see the file README in the folder coupling/tests (remark on MaMiCo V1.1).
    Summary of revisions: Open boundary forcing; multi-instance MD sampling; support for transient molecular-continuum systems
    Restrictions: Currently, only single-centered systems are supported. For access to the LAMMPS-based implementation of DPD boundary forcing, please contact Xin Bian, xin.bian@tum.de.
    Additional comments: Please see file license_mamico.txt for further details regarding distribution and advertising of this software.
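
    A hedged sketch of the multi-instance sampling idea described in the summary (the MaMiCo API is C++ and differs from this; the names here are invented): each coupling-cell quantity is averaged over an ensemble of independently seeded MD instances to smooth the hydrodynamic signal on short time scales.

        import numpy as np

        def ensemble_average(md_instances, sample_fn):
            """Average a per-cell quantity (e.g. cell velocity) over an ensemble of MD instances.

            md_instances : iterable of MD state objects
            sample_fn    : callable returning an array of per-cell values for one instance
            """
            samples = np.stack([sample_fn(md) for md in md_instances])
            return samples.mean(axis=0), samples.std(axis=0) / np.sqrt(len(samples))

        # Toy "MD instances": noisy measurements of the same 4-cell velocity profile
        rng = np.random.default_rng(2)
        true_profile = np.array([0.0, 0.1, 0.2, 0.3])
        instances = [true_profile + 0.05 * rng.standard_normal(4) for _ in range(64)]
        mean, err = ensemble_average(instances, lambda m: m)
        print(mean, err)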

  2. M Dwarf Flare Continuum Variations on One-second Timescales: Calibrating and Modeling of ULTRACAM Flare Color Indices

    NASA Astrophysics Data System (ADS)

    Kowalski, Adam F.; Mathioudakis, Mihalis; Hawley, Suzanne L.; Wisniewski, John P.; Dhillon, Vik S.; Marsh, Tom R.; Hilton, Eric J.; Brown, Benjamin P.

    2016-04-01

    We present a large data set of high-cadence dMe flare light curves obtained with custom continuum filters on the triple-beam, high-speed camera system ULTRACAM. The measurements provide constraints for models of the near-ultraviolet (NUV) and optical continuum spectral evolution on timescales of ≈1 s. We provide a robust interpretation of the flare emission in the ULTRACAM filters using simultaneously obtained low-resolution spectra during two moderate-sized flares in the dM4.5e star YZ CMi. By avoiding the spectral complexity within the broadband Johnson filters, the ULTRACAM filters are shown to characterize bona fide continuum emission in the NUV, blue, and red wavelength regimes. The NUV/blue flux ratio in flares is equivalent to a Balmer jump ratio, and the blue/red flux ratio provides an estimate for the color temperature of the optical continuum emission. We present a new “color-color” relationship for these continuum flux ratios at the peaks of the flares. Using the RADYN and RH codes, we interpret the ULTRACAM filter emission using the dominant emission processes from a radiative-hydrodynamic flare model with a high nonthermal electron beam flux, which explains a hot, T ≈ 10^4 K, color temperature at blue-to-red optical wavelengths and a small Balmer jump ratio as observed in moderate-sized and large flares alike. We also discuss the high time resolution, high signal-to-noise continuum color variations observed in YZ CMi during a giant flare, which increased the NUV flux from this star by over a factor of 100. Based on observations obtained with the Apache Point Observatory 3.5 m telescope, which is owned and operated by the Astrophysical Research Consortium, on observations made with the William Herschel Telescope operated on the island of La Palma by the Isaac Newton Group in the Spanish Observatorio del Roque de los Muchachos of the Instituto de Astrofísica de Canarias, and on observations made with the ESO Telescopes at the La Silla Paranal Observatory under programme ID 085.D-0501(A).
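
    A hedged sketch (the filter wavelengths and the blackbody assumption are illustrative, not the ULTRACAM calibration) of how a blue/red continuum flux ratio maps to a color temperature by inverting the ratio of Planck functions:

        import numpy as np

        H = 6.626e-34; C = 2.998e8; KB = 1.381e-23   # SI constants

        def planck(wavelength_m, T):
            """Planck spectral radiance B_lambda(T)."""
            x = H * C / (wavelength_m * KB * T)
            return (2 * H * C ** 2 / wavelength_m ** 5) / np.expm1(x)

        def color_temperature(ratio, lam_blue=4170e-10, lam_red=6010e-10):
            """Find T such that B(lam_blue, T)/B(lam_red, T) matches the observed ratio (bisection)."""
            lo, hi = 3000.0, 50000.0
            for _ in range(80):
                mid = 0.5 * (lo + hi)
                if planck(lam_blue, mid) / planck(lam_red, mid) < ratio:
                    lo = mid
                else:
                    hi = mid
            return 0.5 * (lo + hi)

        print(round(color_temperature(1.8)))   # a blue/red ratio of ~1.8 gives roughly 9000 K here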

  3. Progress with the COGENT Edge Kinetic Code: Collision operator options

    DOE PAGES

    Dorf, M. A.; Cohen, R. H.; Compton, J. C.; ...

    2012-06-27

    In this study, COGENT is a continuum gyrokinetic code for edge plasmas being developed by the Edge Simulation Laboratory collaboration. The code is distinguished by the application of a fourth-order conservative discretization and mapped multiblock grid technology to handle the geometric complexity of the tokamak edge. It is written in v∥-μ (parallel velocity – magnetic moment) velocity coordinates and makes use of the gyrokinetic Poisson equation for the calculation of a self-consistent electric potential. In the present manuscript we report on the implementation and initial testing of a succession of increasingly detailed collision operator options, including a simple drag-diffusion operator in the parallel velocity space, Lorentz collisions, and a linearized model Fokker-Planck collision operator conserving momentum and energy. (© 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim)
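
    As an illustration of the simplest of these options, the following Python sketch advances a drag-diffusion (Lenard-Bernstein-type) collision operator in parallel velocity with a low-order conservative flux discretization; it is not COGENT's fourth-order finite-volume implementation, and all parameters are illustrative:

      import numpy as np

      # Drag-diffusion (Lenard-Bernstein-type) collision operator in parallel
      # velocity, C[f] = nu * d/dv[(v - u) f + vt2 * df/dv], in conservative
      # flux form with an explicit Euler step.  Low-order illustration only.
      def drag_diffusion_step(f, v, nu, u, vt2, dt):
          dv = v[1] - v[0]
          v_face = 0.5 * (v[:-1] + v[1:])
          f_face = 0.5 * (f[:-1] + f[1:])
          flux = (v_face - u) * f_face + vt2 * (f[1:] - f[:-1]) / dv
          df = np.zeros_like(f)
          df[1:-1] = nu * (flux[1:] - flux[:-1]) / dv   # endpoints held fixed
          return f + dt * df

      v = np.linspace(-6.0, 6.0, 121)
      f = np.exp(-(v - 2.0) ** 2)                       # shifted initial distribution
      for _ in range(2000):                             # relax toward a centered Maxwellian
          f = drag_diffusion_step(f, v, nu=1.0, u=0.0, vt2=1.0, dt=5e-4)
      print("mean parallel velocity after relaxation:", round(float(np.sum(v * f) / np.sum(f)), 3))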

  4. Application of advanced computational procedures for modeling solar-wind interactions with Venus: Theory and computer code

    NASA Technical Reports Server (NTRS)

    Stahara, S. S.; Klenke, D.; Trudinger, B. C.; Spreiter, J. R.

    1980-01-01

    Computational procedures are developed and applied to the prediction of solar wind interaction with nonmagnetic terrestrial planet atmospheres, with particular emphasis on Venus. The theoretical method is based on a single fluid, steady, dissipationless, magnetohydrodynamic continuum model, and is appropriate for the calculation of axisymmetric, supersonic, super-Alfvenic solar wind flow past terrestrial planets. The procedures, which consist of finite difference codes to determine the gasdynamic properties and a variety of special purpose codes to determine the frozen magnetic field, streamlines, contours, plots, etc. of the flow, are organized into one computational program. Theoretical results based upon these procedures are reported for a wide variety of solar wind conditions and ionopause obstacle shapes. Plasma and magnetic field comparisons in the ionosheath are also provided with actual spacecraft data obtained by the Pioneer Venus Orbiter.

  5. Water Vapor Self-Continuum by Cavity Ring Down Spectroscopy in the 1.6 Micron Transparency Window

    NASA Astrophysics Data System (ADS)

    Campargue, Alain; Kassi, Samir; Mondelain, Didier

    2014-06-01

    Since its discovery one century ago, a deep and unresolved controversy remains on the nature of the water vapor continuum. Several interpretations are proposed: the accumulated effect of the distant wings of many individual spectral lines, metastable or true bound water dimers, or collision-induced absorption. The atmospheric science community has largely sidestepped this controversy and adopted a pragmatic approach: most radiative transfer codes used in climate modelling, numerical weather prediction and remote sensing use the MT_CKD model, which is a semi-empirical formulation of the continuum. The MT_CKD cross-sections were tuned to available observations in the mid-infrared, but in the absence of experimental constraints the extrapolated near infrared (NIR) values are much more uncertain. Due to the weakness of the broadband absorption signal to be measured, very few measurements of the water vapor continuum are available in the NIR windows, especially for temperature conditions relevant for our atmosphere. This is in particular the case for the 1.6 μm window, where the very few available measurements show a large disagreement. Here we present the first measurements of the water vapor self-continuum cross-sections in the 1.6 μm window by cavity ring down spectroscopy (CRDS). The pressure dependence of the absorption continuum was investigated during pressure cycles up to 12 Torr for selected wavenumber values. The continuum level is observed to deviate from the expected quadratic dependence on pressure. This deviation is interpreted as due to a significant contribution of water adsorbed on the super mirrors to the cavity loss rate. The pressure dependence is well reproduced by a second order polynomial. We interpret the linear and quadratic terms as the adsorbed water and water vapour contributions, respectively. The derived self-continuum cross sections, measured between 5875 and 6450 cm-1, show a minimum value around 6300 cm-1. These cross sections will be compared to the existing experimental data and models, especially to recent FTS measurements and to the latest version of the MT_CKD model (MT_CKD 2.5). Mlawer, E.J., V.H. Payne, J.L. Moncet, et al. (2012), Phil. Trans. R. Soc. A, 370, 2520-2556. Mondelain, D., A. Aradj, S. Kassi, et al. (2013), JQSRT, 130, 381-391.
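
    The separation of the adsorbed-water and self-continuum contributions rests on fitting the extra cavity loss with a polynomial that is linear plus quadratic in pressure. A minimal Python sketch with synthetic data (all coefficients invented for illustration) is:

      import numpy as np

      # Fit alpha(P) = a*P + b*P**2 to the extra cavity loss measured during a
      # pressure cycle: the linear term is attributed to adsorbed water on the
      # mirrors, the quadratic term to the self-continuum.  Data are synthetic.
      P = np.linspace(0.5, 12.0, 24)                             # Torr
      rng = np.random.default_rng(1)
      alpha = 2.0e-10 * P + 7.5e-11 * P**2 + 1e-11 * rng.standard_normal(P.size)

      design = np.column_stack([P, P**2])                        # no constant term
      (a_lin, b_quad), *_ = np.linalg.lstsq(design, alpha, rcond=None)
      print(f"adsorbed-water (linear) term   : {a_lin:.2e} cm^-1 Torr^-1")
      print(f"self-continuum (quadratic) term: {b_quad:.2e} cm^-1 Torr^-2")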

  6. Analysis of Compton continuum measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gold, R.; Olson, I. K.

    1970-01-01

    Five computer programs: COMPSCAT, FEND, GABCO, DOSE, and COMPLOT, have been developed and used for the analysis and subsequent reduction of measured energy distributions of Compton recoil electrons to continuous gamma spectra. In addition to detailed descriptions of these computer programs, the relationship amongst these codes is stressed. The manner in which these programs function is illustrated by tracing a sample measurement through a complete cycle of the data-reduction process.

  7. A Ceramic Fracture Model for High Velocity Impact

    DTIC Science & Technology

    1993-05-01

    employ damage concepts appear more relevant than crack growth models for this application. This research adopts existing fracture model concepts and...extends them through applications in an existing finite element continuum mechanics code (hydrocode) to the prediction of the damage and fracture processes...to be accurate in the lower velocity range of this work. Mescall and Tracy [15] investigated the selection of ceramic material for application in armors

  8. Agent-Based Framework for Discrete Entity Simulations

    DTIC Science & Technology

    2006-11-01

    Postgres database server for environment queries of neighbors and continuum data. As expected for raw database queries (no database optimizations in...form. Eventually the code was ported to GNU C++ on the same single Intel Pentium 4 CPU running RedHat Linux 9.0 and Postgres database server...Again Postgres was used for environmental queries, and the tool remained relatively slow because of the immense number of queries necessary to assess

  9. Cancer communication science funding trends, 2000-2012.

    PubMed

    Ramírez, A Susana; Galica, Kasia; Blake, Kelly D; Chou, Wen-Ying Sylvia; Hesse, Bradford W

    2013-12-01

    Since 2000, the field of health communication has grown tremendously, owing largely to research funding by the National Cancer Institute (NCI). This study provides an overview of cancer communication science funding trends in the past decade. We conducted an analysis of communication-related grant applications submitted to the NCI in fiscal years 2000-2012. Using 103 keywords related to health communication, data were extracted from the Portfolio Management Application, a grants management application used at NCI. Automated coding described key grant characteristics such as mechanism and review study section. Manual coding determined funding across the cancer control continuum, by cancer site, and by cancer risk factors. A total of 3307 unique grant applications met initial inclusion criteria; 1013 of these were funded over the 12-year period. The top funded grant mechanisms were the R01, R21, and R03. Applications were largely investigator-initiated proposals as opposed to responses to particular funding opportunity announcements. Among funded communication research, the top risk factor being studied was tobacco, and across the cancer control continuum, cancer prevention was the most common stage investigated. NCI support of cancer communication research has been an important source of growth for health communication science over the last 12 years. The analysis' findings describe NCI's priorities in cancer communication science and suggest areas for future investments.

  10. Dynamic simulations of geologic materials using combined FEM/DEM/SPH analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morris, J P; Johnson, S M

    2008-03-26

    An overview of the Livermore Distinct Element Code (LDEC) is presented, and results from a study investigating the effect of explosive and impact loading on geologic materials using LDEC are detailed. LDEC was initially developed to simulate tunnels and other structures in jointed rock masses using large numbers of polyhedral blocks. Many geophysical applications, such as projectile penetration into rock, concrete targets, and boulder fields, require a combination of continuum and discrete methods in order to predict the formation and interaction of the fragments produced. In an effort to model this class of problems, LDEC now includes implementations of Cosserat point theory and cohesive elements. This approach directly simulates the transition from continuum to discontinuum behavior, thereby allowing for dynamic fracture within a combined finite element/discrete element framework. In addition, there are many applications involving geologic materials where fluid-structure interaction is important. To facilitate solution of this class of problems, a Smooth Particle Hydrodynamics (SPH) capability has been incorporated into LDEC to simulate fully coupled systems involving geologic materials and a saturating fluid. We will present results from a study of a broad range of geomechanical problems that exercise the various components of LDEC in isolation and in tandem.

  11. A code for optically thick and hot photoionized media

    NASA Astrophysics Data System (ADS)

    Dumont, A.-M.; Abrassart, A.; Collin, S.

    2000-05-01

    We describe a code designed for hot media (T >= a few 10^4 K), optically thick to Compton scattering. It computes the structure of a plane-parallel slab of gas in thermal and ionization equilibrium, illuminated on one or on both sides by a given spectrum. Unlike other photoionization codes, it solves the transfer of the continuum and of the lines in a two-stream approximation, without using the local escape probability formalism to approximate the line transfer. We stress the importance of taking into account the returning flux even for small column densities (10^22 cm^-2), and we show that the escape probability approximation can lead to strong errors in the thermal and ionization structure, as well as in the emitted spectrum, for a Thomson thickness larger than a few tenths. The transfer code is coupled with a Monte Carlo code that takes Compton and inverse Compton scattering into account and computes the spectrum emitted up to MeV energies, in any geometry. Comparisons with CLOUDY show that it gives similar results for small column densities. Several applications are mentioned.

  12. Intercomparison of three microwave/infrared high resolution line-by-line radiative transfer codes

    NASA Astrophysics Data System (ADS)

    Schreier, Franz; Milz, Mathias; Buehler, Stefan A.; von Clarmann, Thomas

    2018-05-01

    An intercomparison of three line-by-line (lbl) codes developed independently for atmospheric radiative transfer and remote sensing - ARTS, GARLIC, and KOPRA - has been performed for a thermal infrared nadir sounding application assuming a HIRS-like (High resolution Infrared Radiation Sounder) setup. Radiances for the 19 HIRS infrared channels and a set of 42 atmospheric profiles from the "Garand dataset" have been computed. The mutual differences of the equivalent brightness temperatures are presented and possible causes of disagreement are discussed. In particular, the impact of path integration schemes and atmospheric layer discretization is assessed. When the continuum absorption contribution is ignored because of the different implementations, residuals are generally in the sub-Kelvin range and smaller than 0.1 K for some window channels (and all atmospheric models and lbl codes). None of the three codes turned out to be perfect for all channels and atmospheres. Remaining discrepancies are attributed to different lbl optimization techniques. Lbl codes seem to have reached such a maturity in the implementation of radiative transfer that the choice of the underlying physical models (line shape models, continua, etc.) becomes increasingly relevant.
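
    Equivalent brightness temperatures follow from the channel radiances through the inverse Planck function. The Python sketch below shows how sub-percent radiance differences translate into sub-Kelvin residuals; the channel wavenumber and radiance values are illustrative, not numbers from the study:

      import numpy as np

      # Equivalent brightness temperature via the inverse Planck function in
      # wavenumber units, then the residual between two codes in Kelvin.
      # Channel wavenumber and radiances are illustrative, not from the study.
      C1 = 1.191042e-5       # mW m^-2 sr^-1 (cm^-1)^-4  (2*h*c^2)
      C2 = 1.4387769         # K cm                       (h*c/k)

      def brightness_temperature(radiance, wavenumber):
          return C2 * wavenumber / np.log(1.0 + C1 * wavenumber**3 / radiance)

      nu = 900.0                                          # cm^-1, window region
      rad_a = np.array([85.0, 100.0, 115.0])              # mW m^-2 sr^-1 (cm^-1)^-1
      rad_b = rad_a * 1.001                               # 0.1% radiance difference
      residual = brightness_temperature(rad_b, nu) - brightness_temperature(rad_a, nu)
      print("brightness-temperature residuals [K]:", np.round(residual, 3))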

  13. Impact of the level of state tax code progressivity on children's health outcomes.

    PubMed

    Granruth, Laura Brierton; Shields, Joseph J

    2011-08-01

    This research study examines the impact of the level of state tax code progressivity on selected children's health outcomes. Specifically, it examines the degree to which a state's tax code ranking along the progressive-regressive continuum relates to percentage of low birthweight babies, infant and child mortality rates, and percentage of uninsured children. Using data merged from a number of public data sets, the authors find that the level of state tax code progressivity is a factor in state rates of infant and child mortality. States with lower median incomes and regressive tax policies have the highest rates of infant and child mortality. With regard to the percentage of children 17 years of age and below who lack health insurance, it is found that larger states with regressive tax policies have the largest percentage of uninsured children. In general, more heavily populated states with more progressive tax codes have healthier children. The implications of these findings are discussed in terms of tax policy and the well-being of children as well as for social work education, social work practice, and social work research.

  14. Computational methods for coupling microstructural and micromechanical materials response simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    HOLM,ELIZABETH A.; BATTAILE,CORBETT C.; BUCHHEIT,THOMAS E.

    2000-04-01

    Computational materials simulations have traditionally focused on individual phenomena: grain growth, crack propagation, plastic flow, etc. However, real materials behavior results from a complex interplay between phenomena. In this project, the authors explored methods for coupling mesoscale simulations of microstructural evolution and micromechanical response. In one case, massively parallel (MP) simulations for grain evolution and microcracking in alumina stronglink materials were dynamically coupled. In the other, codes for domain coarsening and plastic deformation in CuSi braze alloys were iteratively linked. This program provided the first comparison of two promising ways to integrate mesoscale computer codes. Coupled microstructural/micromechanical codes were applied to experimentally observed microstructures for the first time. In addition to the coupled codes, this project developed a suite of new computational capabilities (PARGRAIN, GLAD, OOF, MPM, polycrystal plasticity, front tracking). The problem of plasticity length scale in continuum calculations was recognized and a solution strategy was developed. The simulations were experimentally validated on stockpile materials.

  15. Case Managers on the Front Lines of Ethical Dilemmas: Advocacy, Autonomy, and Preventing Case Manager Burnout.

    PubMed

    Sortedahl, Charlotte; Mottern, Nina; Campagna, Vivian

    The purpose of this article is to examine how case managers are routinely confronted by ethical dilemmas within a fragmented health care system and given the reality of financial pressures that influence life-changing decisions. The Code of Professional Conduct for Case Managers (Code), published by the Commission for Case Manager Certification, acknowledges "case managers may often confront ethical dilemmas" (Code 1996, Rev. 2015). The Code and expectations that professional case managers, particularly those who are board certified, will uphold ethical and legal practice apply to case managers in every practice setting across the full continuum of health care. This discussion acknowledges the ethical dilemmas that case managers routinely confront, which empowers them to seek support, guidance, and resources to support ethical practice. In addition, the article seeks to raise awareness of the effects of burnout and moral distress on case managers and others with whom they work closely on interdisciplinary teams.

  16. Progress with the COGENT Edge Kinetic Code: Implementing the Fokker-Planck Collision Operator

    DOE PAGES

    Dorf, M. A.; Cohen, R. H.; Dorr, M.; ...

    2014-06-20

    Here, COGENT is a continuum gyrokinetic code for edge plasma simulations being developed by the Edge Simulation Laboratory collaboration. The code is distinguished by application of a fourth-order finite-volume (conservative) discretization, and mapped multiblock grid technology to handle the geometric complexity of the tokamak edge. The distribution function F is discretized in v∥ – μ (parallel velocity – magnetic moment) velocity coordinates, and the code presently solves an axisymmetric full-f gyro-kinetic equation coupled to the long-wavelength limit of the gyro-Poisson equation. COGENT capabilities are extended by implementing the fully nonlinear Fokker-Planck operator to model Coulomb collisions in magnetized edge plasmas. The corresponding Rosenbluth potentials are computed by making use of a finite-difference scheme and multipole-expansion boundary conditions. Details of the numerical algorithms and results of the initial verification studies are discussed. (© 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim)

  17. Two-photon production of dilepton pairs in peripheral heavy ion collisions

    NASA Astrophysics Data System (ADS)

    Klein, Spencer R.

    2018-05-01

    The STAR collaboration has observed an excess production of e+e- pairs in relativistic heavy ion collisions, over the expectations from hadronic production models. The excess pairs have transverse momenta pT<150 MeV/c and are most prominent in peripheral gold-gold and uranium-uranium collisions. The pairs exhibit a peak at the J/ψ mass, but include a wide continuum, with pair invariant masses from 400 MeV/c^2 up to 2.6 GeV/c^2. The ALICE Collaboration observes a similar excess in peripheral lead-lead collisions, but only at the J/ψ mass, without a corresponding continuum. This paper presents a calculation of the cross section and kinematics for two-photon production of e+e- pairs, and finds general agreement with the STAR data. The calculation is based on the STARlight simulation code, which uses the Weizsäcker-Williams virtual photon approach. The STAR continuum observations are compatible with two-photon production of e+e- pairs. The ALICE analysis required that individual muon pT be greater than 1 GeV/c; this eliminated almost all of the pairs from two-photon interactions, while leaving most of the J/ψ decays.
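
    In the equivalent-photon picture, the pair kinematics follow directly from the two photon energies. The Python sketch below shows only that mapping (invariant mass and rapidity); it does not model the photon fluxes or cross sections computed by STARlight:

      import numpy as np

      # Equivalent-photon (Weizsacker-Williams) kinematics: two quasi-real photons
      # with energies w1, w2 in the collider frame give a pair of invariant mass
      # W = 2*sqrt(w1*w2) and rapidity y = 0.5*ln(w1/w2).  Photon fluxes and cross
      # sections (as computed by STARlight) are not modeled here.
      def pair_kinematics(w1, w2):
          return 2.0 * np.sqrt(w1 * w2), 0.5 * np.log(w1 / w2)

      def photon_energies(mass, rapidity):
          # inverse mapping, convenient when sampling in (W, y)
          return 0.5 * mass * np.exp(rapidity), 0.5 * mass * np.exp(-rapidity)

      W, y = pair_kinematics(0.9, 0.4)                  # photon energies in GeV
      w1_back, w2_back = photon_energies(W, y)
      print(f"W = {W:.2f} GeV, y = {y:+.2f}, photons recovered: {w1_back:.2f}, {w2_back:.2f} GeV")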

  18. Overset grid implementation of the complex Kohn variational method for electron-polyatomic molecule scattering

    NASA Astrophysics Data System (ADS)

    McCurdy, C. William; Lucchese, Robert L.; Greenman, Loren

    2017-04-01

    The complex Kohn variational method, which represents the continuum wave function in each channel using a combination of Gaussians and Bessel or Coulomb functions, has been successful in numerous applications to electron-polyatomic molecule scattering and molecular photoionization. The hybrid basis representation limits it to relatively low energies (<50 eV), requires an approximation to exchange matrix elements involving continuum functions, and hampers its coupling to modern electronic structure codes for the description of correlated target states. We describe a successful implementation of the method using completely adaptive overset grids to describe continuum functions, in which spherical subgrids are placed on every atomic center to complement a spherical master grid that describes the behavior at large distances. An accurate method for applying the free-particle Green's function on the grid eliminates the need to operate explicitly with the kinetic energy, enabling a rapidly convergent Arnoldi algorithm for solving linear equations on the grid, and no approximations to exchange operators are made. Results for electron scattering from several polyatomic molecules will be presented. Army Research Office, MURI, WN911NF-14-1-0383 and U. S. DOE DE-SC0012198 (at Texas A&M).

  19. Continuum modeling of catastrophic collisions

    NASA Technical Reports Server (NTRS)

    Ryan, Eileen V.; Asphaug, Erik; Melosh, H. J.

    1991-01-01

    A two dimensional hydrocode based on 2-D SALE was modified to include strength effects and fragmentation equations for fracture resulting from tensile stress in one dimension. Output from this code includes a complete fragmentation summary for each cell of the modeled object: fragment size (mass) distribution, vector velocities of particles, peak values of pressure and tensile stress, and peak strain rates associated with fragmentation. Contour plots showing pressure and temperature at given times within the object are also produced. By invoking axial symmetry, three dimensional events can be modeled such as zero impact parameter collisions between asteroids. The code was tested against the one dimensional model and the analytical solution for a linearly increasing tensile stress under constant strain rate.

  20. Eight Leadership Emergency Codes Worth Calling.

    PubMed

    Freed, David H

    Hospitals have a contemporary opportunity to change themselves before attempting to transform the larger US health care system. However, actually implementing change is much more easily described than accomplished in practice. This article calls out 8 dysfunctional behaviors that compromise professional standards at the ground level of the hospital. The construct of calling a code when one witnesses such behaviors is intended to make it safe for leaders to "See something, say something" and confront them in real time. The coordinated continuum of services that health care reform seeks to attain will not emerge until individual hospital organizations prepare themselves to operate better in their own spaces and the ones that immediately surround them.

  1. Study of helium emissions from active solar regions

    NASA Technical Reports Server (NTRS)

    Kulander, J. L.

    1973-01-01

    A theoretical study is made of the visible and UV line radiation of He I atoms and He II ions from a plane-parallel model flare layer. Codes were developed for the solution of the statistically steady state equation for a 30 level He I - II - III model, and the line and continuum transport equations. These codes are described and documented in the report along with sample solutions. Optical depths and some line intensities are presented for a 1000 km thick layer. Solutions of the steady state equations are presented for electron temperatures of 10,000 to 50,000 K and electron densities of 10^10 to 10^14 cm^-3.
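
    The statistical-equilibrium step amounts to solving a linear system for the level populations given a rate matrix. A minimal Python sketch for a three-level model atom is given below; the rates are arbitrary placeholders, not the He I-II-III rates of the report:

      import numpy as np

      # Steady-state level populations n of a small model atom: R[i, j] is the
      # total rate from level j into level i, so dn/dt = R n - diag(colsum R) n = 0,
      # closed with the normalization sum(n) = 1.  The 3-level rates below are
      # arbitrary placeholders, not the He I-II-III rates of the report.
      def steady_state_populations(R):
          A = R - np.diag(R.sum(axis=0))     # net rate matrix, A @ n = 0
          A[-1, :] = 1.0                     # replace one equation by normalization
          b = np.zeros(R.shape[0])
          b[-1] = 1.0
          return np.linalg.solve(A, b)

      R = np.array([[0.0,    2.0e-1, 1.0e-2],
                    [5.0e-3, 0.0,    3.0e-1],
                    [1.0e-4, 2.0e-2, 0.0   ]])
      print("fractional level populations:", np.round(steady_state_populations(R), 4))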

  2. Simulation of 2D Kinetic Effects in Plasmas using the Grid Based Continuum Code LOKI

    NASA Astrophysics Data System (ADS)

    Banks, Jeffrey; Berger, Richard; Chapman, Tom; Brunner, Stephan

    2016-10-01

    Kinetic simulation of multi-dimensional plasma waves through direct discretization of the Vlasov equation is a useful tool to study many physical interactions and is particularly attractive for situations where minimal fluctuation levels are desired, for instance, when measuring growth rates of plasma wave instabilities. However, direct discretization of phase space can be computationally expensive, and as a result there are few examples of published results using Vlasov codes in more than a single configuration space dimension. In an effort to fill this gap we have developed the Eulerian-based kinetic code LOKI that evolves the Vlasov-Poisson system in 2+2-dimensional phase space. The code is designed to reduce the cost of phase-space computation by using fully 4th order accurate conservative finite differencing, while retaining excellent parallel scalability that efficiently uses large scale computing resources. In this poster I will discuss the algorithms used in the code as well as some aspects of their parallel implementation using MPI. I will also overview simulation results of basic plasma wave instabilities relevant to laser plasma interaction, which have been obtained using the code.
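
    As a small illustration of the kind of high-order differencing involved, the Python sketch below applies a standard fourth-order centered stencil to the advection term v df/dx on a periodic grid; LOKI's actual scheme is a fully conservative fourth-order discretization of the complete Vlasov-Poisson system:

      import numpy as np

      # Fourth-order centered finite difference for the advection term v*df/dx on a
      # periodic grid, using the stencil (f[i-2] - 8 f[i-1] + 8 f[i+1] - f[i+2])/(12 dx).
      # This only illustrates the order of accuracy of such stencils.
      def ddx_4th(f, dx):
          return (np.roll(f, 2) - 8.0 * np.roll(f, 1)
                  + 8.0 * np.roll(f, -1) - np.roll(f, -2)) / (12.0 * dx)

      x = np.linspace(0.0, 2.0 * np.pi, 64, endpoint=False)
      dx = x[1] - x[0]
      err = np.max(np.abs(ddx_4th(np.sin(x), dx) - np.cos(x)))
      print(f"max error of the 4th-order stencil on 64 points: {err:.2e}")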

  3. Inferring giant planets from ALMA millimeter continuum and line observations in (transition) disks

    NASA Astrophysics Data System (ADS)

    Facchini, S.; Pinilla, P.; van Dishoeck, E. F.; de Juan Ovelar, M.

    2018-05-01

    Context. Radial gaps or cavities in the continuum emission in the IR-mm wavelength range are potential signatures of protoplanets embedded in their natal protoplanetary disk. Hitherto, models have relied on the combination of mm continuum observations and near-infrared scattered light images to put constraints on the properties of embedded planets. Atacama Large Millimeter/submillimeter Array (ALMA) observations are now probing spatially resolved rotational line emission of CO and other chemical species. These observations can provide complementary information on the mechanism carving the gaps in dust and additional constraints on the purported planet mass. Aims: We investigate whether the combination of ALMA continuum and CO line observations can constrain the presence and mass of planets embedded in protoplanetary disks. Methods: We post-processed azimuthally averaged 2D hydrodynamical simulations of planet-disk models, in which the dust densities and grain size distributions are computed with a dust evolution code that considers radial drift, fragmentation, and growth. The simulations explored various planet masses (1 MJ ≤ Mp ≤ 15 MJ) and turbulent parameters (10^-4 ≤ α ≤ 10^-3). The outputs were then post-processed with the thermochemical code DALI, accounting for the radially and vertically varying dust properties. We obtained the gas and dust temperature structures, chemical abundances, and synthetic emission maps of both thermal continuum and CO rotational lines. This is the first study combining hydrodynamical simulations, dust evolution, full radiative transfer, and chemistry to predict gas emission of disks hosting massive planets. Results: All radial intensity profiles of 12CO, 13CO, and C18O show a gap at the planet location. The ratio between the location of the gap as seen in CO and the peak in the mm continuum at the pressure maximum outside the orbit of the planet shows a clear dependence on planet mass and is independent of disk viscosity for the parameters explored in this paper. Because of the low dust density in the gaps, the dust and gas components can become thermally decoupled and the gas becomes colder than the dust. The gaps seen in CO are due to a combination of the gas temperature dropping at the location of the planet and of the underlying surface density profile. Both effects need to be taken into account and disentangled when inferring gas surface densities from observed CO intensity profiles; otherwise, the gas surface density drop at the planet location can easily be overestimated. CO line ratios across the gap are able to quantify the gas temperature drop in the gaps in observed systems. Finally, a CO cavity is not observed in any of the models, only CO gaps, indicating that a single massive planet is not able to explain the CO cavities observed in transition disks, at least without additional physical or chemical mechanisms.

  4. SCORE-EVET: a computer code for the multidimensional transient thermal-hydraulic analysis of nuclear fuel rod arrays. [BWR; PWR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Benedetti, R. L.; Lords, L. V.; Kiser, D. M.

    1978-02-01

    The SCORE-EVET code was developed to study multidimensional transient fluid flow in nuclear reactor fuel rod arrays. The conservation equations used were derived by volume averaging the transient compressible three-dimensional local continuum equations in Cartesian coordinates. No assumptions associated with subchannel flow have been incorporated into the derivation of the conservation equations. In addition to the three-dimensional fluid flow equations, the SCORE-EVET code contains: (a) a one-dimensional steady state solution scheme to initialize the flow field, (b) steady state and transient fuel rod conduction models, and (c) comprehensive correlation packages to describe fluid-to-fuel rod interfacial energy and momentum exchange. Velocity and pressure boundary conditions can be specified as a function of time and space to model reactor transient conditions such as a hypothesized loss-of-coolant accident (LOCA) or flow blockage.

  5. Tartarus: A relativistic Green's function quantum average atom code

    DOE PAGES

    Gill, Nathanael Matthew; Starrett, Charles Edward

    2017-06-28

    A relativistic Green’s Function quantum average atom model is implemented in the Tartarus code for the calculation of equation of state data in dense plasmas. We first present the relativistic extension of the quantum Green’s Function average atom model described by Starrett [1]. The Green’s Function approach addresses the numerical challenges arising from resonances in the continuum density of states without the need for resonance tracking algorithms or adaptive meshes, though there are still numerical challenges inherent to this algorithm. We discuss how these challenges are addressed in the Tartarus algorithm. The outputs of the calculation are shown in comparison to PIMC/DFT-MD simulations of the Principal Shock Hugoniot in Silicon. Finally, we also present the calculation of the Hugoniot for Silver coming from both the relativistic and nonrelativistic modes of the Tartarus code.

  6. Tartarus: A relativistic Green's function quantum average atom code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gill, Nathanael Matthew; Starrett, Charles Edward

    A relativistic Green’s Function quantum average atom model is implemented in the Tartarus code for the calculation of equation of state data in dense plasmas. We first present the relativistic extension of the quantum Green’s Function average atom model described by Starrett [1]. The Green’s Function approach addresses the numerical challenges arising from resonances in the continuum density of states without the need for resonance tracking algorithms or adaptive meshes, though there are still numerical challenges inherent to this algorithm. We discuss how these challenges are addressed in the Tartarus algorithm. The outputs of the calculation are shown in comparison to PIMC/DFT-MD simulations of the Principal Shock Hugoniot in Silicon. Finally, we also present the calculation of the Hugoniot for Silver coming from both the relativistic and nonrelativistic modes of the Tartarus code.
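
    The practical point about resonances can be illustrated generically: computing the density of states from the imaginary part of a Green's function with a small broadening eta yields a smooth curve without tracking individual resonances. The Python sketch below does this for a random model Hamiltonian; it is not the Tartarus average-atom implementation:

      import numpy as np

      # Density of states from a retarded Green's function,
      #   g(E) = -(1/pi) * Im Tr[(E + i*eta - H)^(-1)],
      # for a random symmetric model Hamiltonian.  The broadening eta smooths sharp
      # resonances automatically; generic illustration only.
      rng = np.random.default_rng(0)
      H = rng.standard_normal((200, 200))
      H = 0.5 * (H + H.T)

      def dos(energies, H, eta=0.5):
          identity = np.eye(H.shape[0])
          return np.array([-np.trace(np.linalg.inv((E + 1j * eta) * identity - H)).imag / np.pi
                           for E in energies])

      E_grid = np.linspace(-25.0, 25.0, 11)
      print(np.round(dos(E_grid, H), 3))               # smooth, semicircle-like profile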

  7. Sierra/Solid Mechanics 4.48 User's Guide.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Merewether, Mark Thomas; Crane, Nathan K; de Frias, Gabriel Jose

    Sierra/SolidMechanics (Sierra/SM) is a Lagrangian, three-dimensional code for finite element analysis of solids and structures. It provides capabilities for explicit dynamic, implicit quasistatic and dynamic analyses. The explicit dynamics capabilities allow for the efficient and robust solution of models with extensive contact subjected to large, suddenly applied loads. For implicit problems, Sierra/SM uses a multi-level iterative solver, which enables it to effectively solve problems with large deformations, nonlinear material behavior, and contact. Sierra/SM has a versatile library of continuum and structural elements, and a large library of material models. The code is written for parallel computing environments, enabling scalable solutions of extremely large problems for both implicit and explicit analyses. It is built on the SIERRA Framework, which facilitates coupling with other SIERRA mechanics codes. This document describes the functionality and input syntax for Sierra/SM.

  8. OpenFOAM: Open source CFD in research and industry

    NASA Astrophysics Data System (ADS)

    Jasak, Hrvoje

    2009-12-01

    The current focus of development in industrial Computational Fluid Dynamics (CFD) is the integration of CFD into Computer-Aided product development, geometrical optimisation, robust design and similar. On the other hand, CFD research aims to extend the boundaries of practical engineering use in "non-traditional" areas. Requirements of computational flexibility and code integration are contradictory: a change of coding paradigm, with object orientation, library components and equation mimicking, is proposed as a way forward. This paper describes OpenFOAM, a C++ object oriented library for Computational Continuum Mechanics (CCM) developed by the author. Efficient and flexible implementation of complex physical models is achieved by mimicking the form of the partial differential equation in software, with code functionality provided in library form. The Open Source deployment and development model allows the user to achieve desired versatility in physical modeling without the sacrifice of complex geometry support and execution efficiency.
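
    To illustrate the equation-mimicking idea (only the idea; OpenFOAM's actual API is C++ with operators such as fvm::ddt and fvm::laplacian), the following toy Python sketch assembles and solves a one-dimensional diffusion equation from named operator objects so that the solver line reads like the PDE. The helper names are invented for this sketch:

      import numpy as np

      # Toy "equation mimicking": assemble a 1-D diffusion equation from named
      # operator objects so the solver line reads like the PDE, ddt(T) == laplacian(DT, T).
      # Python illustration of the idea only; these helpers are invented.
      N, dx, dt, DT = 50, 1.0 / 50, 1.0e-3, 0.01

      def ddt():                          # implicit time derivative: I/dt
          return np.eye(N) / dt

      def laplacian(coeff):               # second difference with zero-gradient ends
          L = -2.0 * np.eye(N) + np.eye(N, k=1) + np.eye(N, k=-1)
          L[0, 0] = L[-1, -1] = -1.0
          return coeff * L / dx**2

      T = np.zeros(N)
      T[N // 2] = 1.0                     # initial temperature spike
      for _ in range(100):                # backward-Euler steps of "ddt(T) == laplacian(DT, T)"
          T = np.linalg.solve(ddt() - laplacian(DT), ddt() @ T)
      print("total heat conserved:", round(float(T.sum()), 6))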

  9. Slip Continuity in Explicit Crystal Plasticity Simulations Using Nonlocal Continuum and Semi-discrete Approaches

    DTIC Science & Technology

    2013-01-01

    Based Micropolar Single Crystal Plasticity: Comparison of Multi- and Single Criterion Theories. J. Mech. Phys. Solids 2011, 59, 398–422. ALE3D ...element boundaries in a multi-step constitutive evaluation (Becker, 2011). The results showed the desired effects of smoothing the deformation field...Implementation The model was implemented in the large-scale parallel, explicit finite element code ALE3D (2012). The crystal plasticity

  10. TEMPEST Level-0 Theory

    DTIC Science & Technology

    2011-11-01

    trajectory of the ship-fixed reference system relative to an earth-fixed reference system. The earth-fixed reference frame, O_E X_E Y_E Z_E, is assumed to be...the ship and moves with all the motions of the ship. The O_E X_E Y_E Z_E axis system is fixed to the earth. A third axis system, O x'y'z', is required...added to account for the turbulence in the propeller slipstream: delta_S = 1.225 - 0.445 a_e + 0.075 a_e^2 radians for small a_e, and delta_S = 0.565 radians otherwise

  11. SEER*Educate: Use of Abstracting Quality Index Scores to Monitor Improvement of All Employees.

    PubMed

    Potts, Mary S; Scott, Tim; Hafterson, Jennifer L

    2016-01-01

    Integral parts of the Seattle-Puget Sound's Cancer Surveillance System registry's continuous improvement model include the incorporation of SEER*Educate into its training program for all staff and analyzing assessment results using the Abstracting Quality Index (AQI). The AQI offers a comprehensive measure of overall performance in SEER*Educate, which is a Web-based application used to personalize learning and diagnostically pinpoint each staff member's place on the AQI continuum. The assessment results are tallied from 6 abstracting standards within 2 domains: incidence reporting and coding accuracy. More than 100 data items are aligned to 1 or more of the 6 standards to build an aggregated score that is placed on a continuum for continuous improvement. The AQI score accurately identifies those individuals who have a good understanding of how to apply the 6 abstracting standards to reliably generate high quality abstracts.

  12. Gyrokinetic continuum simulation of turbulence in a straight open-field-line plasma

    DOE PAGES

    Shi, E. L.; Hammett, G. W.; Stoltzfus-Dueck, T.; ...

    2017-05-29

    Here, five-dimensional gyrokinetic continuum simulations of electrostatic plasma turbulence in a straight, open-field-line geometry have been performed using a full-f discontinuous-Galerkin approach implemented in the Gkeyll code. While various simplifications have been used for now, such as long-wavelength approximations in the gyrokinetic Poisson equation and the Hamiltonian, these simulations include the basic elements of a fusion-device scrape-off layer: localised sources to model plasma outflow from the core, cross-field turbulent transport, parallel flow along magnetic field lines, and parallel losses at the limiter or divertor with sheath-model boundary conditions. The set of sheath-model boundary conditions used in the model allows currents to flow through the walls. In addition to details of the numerical approach, results from numerical simulations of turbulence in the Large Plasma Device, a linear device featuring straight magnetic field lines, are presented.

  13. Subsistence and the evolution of religion.

    PubMed

    Peoples, Hervey C; Marlowe, Frank W

    2012-09-01

    We present a cross-cultural analysis showing that the presence of an active or moral High God in societies varies generally along a continuum from lesser to greater technological complexity and subsistence productivity. Foragers are least likely to have High Gods. Horticulturalists and agriculturalists are more likely. Pastoralists are most likely, though they are less easily positioned along the productivity continuum. We suggest that belief in moral High Gods was fostered by emerging leaders in societies dependent on resources that were difficult to manage and defend without group cooperation. These leaders used the concept of a supernatural moral enforcer to manipulate others into cooperating, which resulted in greater productivity. Reproductive success would accrue most to such leaders, but the average reproductive success of all individuals in the society would also increase with greater productivity. Supernatural enforcement of moral codes maintained social cohesion and allowed for further population growth, giving one society an advantage in competition with others.

  14. Reactive transport modeling in fractured rock: A state-of-the-science review

    NASA Astrophysics Data System (ADS)

    MacQuarrie, Kerry T. B.; Mayer, K. Ulrich

    2005-10-01

    The field of reactive transport modeling has expanded significantly in the past two decades and has assisted in resolving many issues in Earth Sciences. Numerical models allow for detailed examination of coupled transport and reactions, or more general investigation of controlling processes over geologic time scales. Reactive transport models serve to provide guidance in field data collection and, in particular, enable researchers to link modeling and hydrogeochemical studies. In this state-of-the-science review, the key objectives were to examine the applicability of reactive transport codes for exploring issues of redox stability to depths of several hundreds of meters in sparsely fractured crystalline rock, with a focus on the Canadian Shield setting. A conceptual model of oxygen ingress and redox buffering, within a Shield environment at time and space scales relevant to nuclear waste repository performance, is developed through a review of previous research. This conceptual model describes geochemical and biological processes and mechanisms materially important to understanding redox buffering capacity and radionuclide mobility in the far-field. Consistent with this model, reactive transport codes should ideally be capable of simulating the effects of changing recharge water compositions as a result of long-term climate change, and fracture-matrix interactions that may govern water-rock interaction. Other aspects influencing the suitability of reactive transport codes include the treatment of various reaction and transport time scales, the ability to apply equilibrium or kinetic formulations simultaneously, the need to capture feedback between water-rock interactions and porosity-permeability changes, and the representation of fractured crystalline rock environments as discrete fracture or dual continuum media. A review of modern multicomponent reactive transport codes indicates a relatively high level of maturity. Within the Yucca Mountain nuclear waste disposal program, reactive transport codes of varying complexity have been applied to investigate the migration of radionuclides and the geochemical evolution of host rock around the planned disposal facility. Through appropriate near- and far-field application of dual continuum codes, this example demonstrates how reactive transport models have been applied to assist in constraining historic water infiltration rates, interpreting the sealing of flow paths due to mineral precipitation, and investigating post-closure geochemical monitoring strategies. Natural analogue modeling studies, although few in number, are also of key importance as they allow the comparison of model results with hydrogeochemical and paleohydrogeological data over geologic time scales.
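
    As a schematic illustration of the dual-continuum concept mentioned above, the Python sketch below integrates a first-order fracture-matrix exchange model for dissolved oxygen with a pseudo-first-order redox sink in the matrix; all parameters are illustrative placeholders, not values from any of the reviewed codes:

      # Dual-continuum (fracture/matrix) sketch: dissolved O2 in the fracture
      # continuum exchanges with the matrix continuum through a first-order
      # transfer term and is consumed in the matrix by a pseudo-first-order
      # redox reaction.  All parameters are illustrative placeholders.
      alpha = 0.05          # fracture-matrix exchange coefficient [1/yr]
      k_redox = 0.5         # effective O2 consumption rate in the matrix [1/yr]
      dt, n_steps = 0.1, 2000

      c_frac, c_mat = 1.0, 0.0                     # normalized O2 concentrations
      for _ in range(n_steps):                     # explicit Euler in time
          exchange = alpha * (c_frac - c_mat)
          c_frac += dt * (-exchange)               # no recharge term in this sketch
          c_mat += dt * (exchange - k_redox * c_mat)
      print(f"after {n_steps * dt:.0f} yr: fracture O2 = {c_frac:.3f}, matrix O2 = {c_mat:.3f}")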

  15. Aerodynamic characteristics of the upper stages of a launch vehicle in low-density regime

    NASA Astrophysics Data System (ADS)

    Oh, Bum Seok; Lee, Joon Ho

    2016-11-01

    Aerodynamic characteristics of the orbital block (remaining configuration after separation of the nose fairing and the 1st and 2nd stages of the launch vehicle) and the upper 2-3 stage (configuration after separation of the 1st stage) of the three-stage launch vehicle (KSLV-II, Korea Space Launch Vehicle) at high altitude in the low-density regime are analyzed by the SMILE code, which is based on the DSMC (Direct Simulation Monte-Carlo) method. To validate the SMILE code, axial force and normal force coefficients of the Apollo capsule are also calculated and the results agree very well with the data predicted by others. For additional validations and applications of the DSMC code, aerodynamic calculation results for simple shapes of plate and wedge in the low-density regime are also introduced. Generally, aerodynamic characteristics in the low-density regime differ from those in the continuum regime. To understand those kinds of differences, aerodynamic coefficients of the upper stages (including the upper 2-3 stage and the orbital block) of the launch vehicle in the low-density regime are analyzed as a function of Mach number and altitude. The predicted axial force coefficients of the upper stages of the launch vehicle are very high compared to those in the continuum regime. In the case of the orbital block, which flies at very high altitude (higher than 250 km), all aerodynamic coefficients are more dependent on velocity variations than altitude variations. In the case of the upper 2-3 stage, which flies at high altitude (80-150 km), while the axial force coefficients and the locations of the center of pressure are less changed with the variations of Knudsen number (altitude), the normal force coefficients and pitching moment coefficients are more affected by variations of Knudsen number (altitude).

  16. Validation of the Chemistry Module for the Euler Solver in Unified Flow Solver

    DTIC Science & Technology

    2012-03-01

    traveling through the atmosphere there are three types of flow regimes that exist; the first is the continuum regime, the second is the rarefied regime and...The second method has been used in a program called Unified Flow Solver (UFS). UFS is currently being developed under collaborative efforts the Air...thermal non-equilibrium case and finally to a thermo-chemical non-equilibrium case. The data from the simulations will be compared to a second code

  17. Analysis of airborne radiometric data. Volume 3. Topical reports

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reed, J.H.; Shreve, D.C.; Sperling, M.

    1978-05-01

    This volume consists of four topical reports: a general discussion of the philosophy of unfolding spectra with continuum and discrete components, a mathematical treatment of the effects of various physical parameters on the uncollided gamma-ray spectrum at aircraft elevations, a discussion of the application of the unfolding code MAZNAI to airborne data, and a discussion of the effects of the nonlinear relationship between energy deposited and pulse height in NaI(Tl) detectors.

  18. Broadband Photometric Reverberation Mapping Analysis on SDSS-RM and Stripe 82 Quasars

    NASA Astrophysics Data System (ADS)

    Zhang, Haowen; Yang, Qian; Wu, Xuebing; Shen, Yue

    2018-01-01

    We extended the broadband photometric reverberation mapping (PRM) code JAVELIN and tested its ability to recover broad line region (BLR) time delays that are consistent with spectroscopic reverberation mapping (SRM) projects. Broadband light curves of SDSS-RM quasars, produced by convolution with the system transmission curve, were used in the test. We find that under similar sampling conditions (evenly and frequently sampled), the key factor determining whether the broadband PRM code can yield lags consistent with spectroscopic projects is the flux ratio of the line to the reference continuum, which is in line with the findings in Zu et al. (2016). We further find a crucial line-to-continuum flux ratio, above which the mean of the ratios between the lags from PRM and SRM becomes closer to unity, and the scatter is pronouncedly reduced. Based on this flux ratio criterion, we selected some of the quasars from Hernitschek et al. (2015) and carried out broadband PRM on this subset. The performance of the damped random walk (DRW) model and the power-law (PL) structure function model on broadband PRM is compared using mock light curves with high, even cadences and low, uneven ones, respectively. We find that the DRW model performs better in carrying out broadband PRM than the PL model both for high and low cadence light curves with other data qualities similar to SDSS-RM quasars.
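
    For reference, the damped random walk assumed for the continuum can be generated exactly from its conditional distribution. The Python sketch below simulates such a light curve on an uneven cadence; it is an illustration of the assumed stochastic process with made-up parameters, not JAVELIN's code:

      import numpy as np

      # Damped random walk (DRW) continuum light curve from its exact conditional
      # update: x_{i+1} = mean + a*(x_i - mean) + N(0, sigma^2*(1 - a^2)),
      # with a = exp(-dt/tau).  Illustration of the process only.
      def simulate_drw(times, tau, sigma, mean=0.0, seed=0):
          rng = np.random.default_rng(seed)
          x = np.empty(times.size)
          x[0] = mean + sigma * rng.standard_normal()
          for i in range(1, times.size):
              a = np.exp(-(times[i] - times[i - 1]) / tau)
              x[i] = mean + a * (x[i - 1] - mean) + sigma * np.sqrt(1.0 - a * a) * rng.standard_normal()
          return x

      t = np.sort(np.random.default_rng(1).uniform(0.0, 180.0, 60))   # uneven cadence [days]
      lc = simulate_drw(t, tau=50.0, sigma=0.2, mean=19.0)            # magnitudes
      print("simulated DRW light-curve rms:", round(float(lc.std()), 3))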

  19. Vector scattering analysis of TPF coronagraph pupil masks

    NASA Astrophysics Data System (ADS)

    Ceperley, Daniel P.; Neureuther, Andrew R.; Lieber, Michael D.; Kasdin, N. Jeremy; Shih, Ta-Ming

    2004-10-01

    Rigorous finite-difference time-domain electromagnetic simulation is used to model the scattering from prototypical pupil mask cross-section geometries and to quantify the differences from the normally assumed ideal on-off behavior. Shaped pupil plane masks are a promising technology for the TPF coronagraph mission. However, the stringent requirements placed on the optics require that the detailed behavior of the edge-effects of these masks be examined carefully. End-to-end optical system simulation is essential, and an important aspect is the polarization and cross-section dependent edge-effects which are the subject of this paper. Pupil plane masks are similar in many respects to photomasks used in the integrated circuit industry. Simulation capabilities such as the FDTD simulator TEMPEST, developed for analyzing polarization and intensity imbalance effects in nonplanar phase-shifting photomasks, offer a leg-up in analyzing coronagraph masks. However, the accuracy in magnitude and phase required for modeling a coronagraph system is extremely demanding and previously inconsequential errors may be of the same order of magnitude as the physical phenomena under study. In this paper, effects of thick masks, finite conductivity metals, and various cross-section geometries on the transmission of pupil-plane masks are illustrated. Undercutting the edge shape of Cr masks improves the effective opening width to within λ/5 of the actual opening, but TE and TM polarizations require opposite compensations. The deviation from ideal is examined at the reference plane of the mask opening. Numerical errors in TEMPEST, such as numerical dispersion, perfectly matched layer reflections, and source haze, are also discussed along with techniques for mitigating their impacts.

  20. Continuum kinetic and multi-fluid simulations of classical sheaths

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cagas, P.; Hakim, A.; Juno, J.

    The kinetic study of plasma sheaths is critical, among other things, to understand the deposition of heat on walls, the effect of sputtering, and contamination of the plasma with detrimental impurities. The plasma sheath also provides a boundary condition and can often have a significant global impact on the bulk plasma. In this paper, kinetic studies of classical sheaths are performed with the continuum kinetic code, Gkeyll, which directly solves the Vlasov-Maxwell equations. The code uses a novel version of the finite-element discontinuous Galerkin scheme that conserves energy in the continuous-time limit. The fields are computed using Maxwell equations. Ionization and scattering collisions are included; however, surface effects are neglected. The aim of this work is to introduce the continuum kinetic method and compare its results with those obtained from an already established finite-volume multi-fluid model also implemented in Gkeyll. Novel boundary conditions on the fluids allow the sheath to form without specifying wall fluxes, so the fluids and fields adjust self-consistently at the wall. Our work demonstrates that the kinetic and fluid results are in agreement for the momentum flux, showing that in certain regimes, a multifluid model can be a useful approximation for simulating the plasma boundary. There are differences in the electrostatic potential between the fluid and kinetic results. Further, the direct solutions of the distribution function presented here highlight the non-Maxwellian distribution of electrons in the sheath, emphasizing the need for a kinetic model. The densities, velocities, and the potential show a good agreement between the kinetic and fluid results. However, kinetic physics is highlighted through higher moments such as parallel and perpendicular temperatures, which provide significant differences from the fluid results in which the temperature is assumed to be isotropic. Besides decompression cooling, the heat flux is shown to play a role in the temperature differences that are observed, especially inside the collisionless sheath. Published by AIP Publishing.

  1. Continuum kinetic and multi-fluid simulations of classical sheaths

    DOE PAGES

    Cagas, P.; Hakim, A.; Juno, J.; ...

    2017-02-21

    The kinetic study of plasma sheaths is critical, among other things, to understand the deposition of heat on walls, the effect of sputtering, and contamination of the plasma with detrimental impurities. The plasma sheath also provides a boundary condition and can often have a significant global impact on the bulk plasma. In this paper, kinetic studies of classical sheaths are performed with the continuum kinetic code, Gkeyll, which directly solves the Vlasov-Maxwell equations. The code uses a novel version of the finite-element discontinuous Galerkin scheme that conserves energy in the continuous-time limit. The fields are computed using Maxwell equations. Ionization and scattering collisions are included; however, surface effects are neglected. The aim of this work is to introduce the continuum kinetic method and compare its results with those obtained from an already established finite-volume multi-fluid model also implemented in Gkeyll. Novel boundary conditions on the fluids allow the sheath to form without specifying wall fluxes, so the fluids and fields adjust self-consistently at the wall. Our work demonstrates that the kinetic and fluid results are in agreement for the momentum flux, showing that in certain regimes, a multifluid model can be a useful approximation for simulating the plasma boundary. There are differences in the electrostatic potential between the fluid and kinetic results. Further, the direct solutions of the distribution function presented here highlight the non-Maxwellian distribution of electrons in the sheath, emphasizing the need for a kinetic model. The densities, velocities, and the potential show a good agreement between the kinetic and fluid results. However, kinetic physics is highlighted through higher moments such as parallel and perpendicular temperatures, which provide significant differences from the fluid results in which the temperature is assumed to be isotropic. Besides decompression cooling, the heat flux is shown to play a role in the temperature differences that are observed, especially inside the collisionless sheath. Published by AIP Publishing.
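
    A standard analytic benchmark for such simulations is the classical floating-wall potential drop for Maxwellian electrons and cold ions entering at the Bohm speed. The Python sketch below evaluates that textbook estimate (hydrogen assumed); it is a sanity check on kinetic or multi-fluid sheath results, not output from Gkeyll:

      import numpy as np

      # Classical floating-wall sheath potential drop for Maxwellian electrons and
      # cold ions entering at the Bohm speed: e*dphi = (Te/2) * ln(m_i / (2*pi*m_e)).
      # Textbook estimate (hydrogen assumed), used only as a sanity check.
      MASS_RATIO_H = 1836.15

      def floating_sheath_drop(Te_eV, mass_ratio=MASS_RATIO_H):
          return 0.5 * Te_eV * np.log(mass_ratio / (2.0 * np.pi))

      for Te in (2.0, 5.0, 10.0):
          print(f"Te = {Te:4.1f} eV  ->  sheath potential drop ~ {floating_sheath_drop(Te):.1f} V")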

  2. Study of Plume Impingement Effects in the Lunar Lander Environment

    NASA Technical Reports Server (NTRS)

    Marichalar, Jeremiah; Prisbell, A.; Lumpkin, F.; LeBeau, G.

    2010-01-01

    Plume impingement effects from the descent and ascent engine firings of the Lunar Lander were analyzed in support of the Lunar Architecture Team under the Constellation Program. The descent stage analysis was performed to obtain shear and pressure forces on the lunar surface as well as velocity and density profiles in the flow field in an effort to understand lunar soil erosion and ejected soil impact damage which was analyzed as part of a separate study. A CFD/DSMC decoupled methodology was used with the Bird continuum breakdown parameter to distinguish the continuum flow from the rarefied flow. The ascent stage analysis was performed to ascertain the forces and moments acting on the Lunar Lander Ascent Module due to the firing of the main engine on take-off. The Reacting and Multiphase Program (RAMP) method of characteristics (MOC) code was used to model the continuum region of the nozzle plume, and the Direct Simulation Monte Carlo (DSMC) Analysis Code (DAC) was used to model the impingement results in the rarefied region. The ascent module (AM) was analyzed for various pitch and yaw rotations and for various heights in relation to the descent module (DM). For the ascent stage analysis, the plume inflow boundary was located near the nozzle exit plane in a region where the flow number density was large enough to make the DSMC solution computationally expensive. Therefore, a scaling coefficient was used to make the DSMC solution more computationally manageable. An analysis of the effectiveness of this scaling technique was performed by investigating various scaling parameters for a single height and rotation of the AM. Because the inflow boundary was near the nozzle exit plane, another analysis was performed investigating three different inflow contours to determine the effects of the flow expansion around the nozzle lip on the final plume impingement results.
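
    The hand-off between the continuum and rarefied solvers relies on a breakdown indicator evaluated on the continuum solution. The Python sketch below evaluates a gradient-length-local Knudsen number on an illustrative expanding-density profile; the profile, mean-free-path model, and the 0.05 threshold are assumptions for illustration, not the exact criterion or values used in this study:

      import numpy as np

      # Gradient-length-local Knudsen number Kn_GLL = (lambda/Q)*|dQ/dx| as a
      # continuum-breakdown indicator for deciding where a CFD solution should be
      # handed off to DSMC.  Profile, mean-free-path model, and threshold are
      # illustrative assumptions only.
      def kn_gll(q, x, mean_free_path):
          return mean_free_path * np.abs(np.gradient(q, x)) / q

      x = np.linspace(0.0, 1.0, 200)                 # m, distance along the plume axis
      rho = 1.0e-4 * np.exp(-8.0 * x)                # kg/m^3, expanding plume density
      lam = 1.0e-3 * (rho[0] / rho)                  # mean free path ~ 1/density
      breakdown = kn_gll(rho, x, lam) > 0.05
      print("continuum breakdown first flagged at x =",
            round(float(x[np.argmax(breakdown)]), 3), "m")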

  3. Multiresolution molecular mechanics: Implementation and efficiency

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Biyikli, Emre; To, Albert C., E-mail: albertto@pitt.edu

    Atomistic/continuum coupling methods combine accurate atomistic methods and efficient continuum methods to simulate the behavior of highly ordered crystalline systems. Coupled methods utilize the advantages of both approaches to simulate systems at a lower computational cost, while retaining the accuracy associated with atomistic methods. Many concurrent atomistic/continuum coupling methods have been proposed in the past; however, their true computational efficiency has not been demonstrated. The present work presents an efficient implementation of a concurrent coupling method called the Multiresolution Molecular Mechanics (MMM) for serial, parallel, and adaptive analysis. First, we present the features of the software implemented along with the associated technologies. The scalability of the software implementation is demonstrated, and the competing effects of multiscale modeling and parallelization are discussed. Then, the algorithms contributing to the efficiency of the software are presented. These include algorithms for eliminating latent ghost atoms from calculations and measurement-based dynamic balancing of parallel workload. The efficiency improvements made by these algorithms are demonstrated by benchmark tests. The efficiency of the software is found to be on par with LAMMPS, a state-of-the-art Molecular Dynamics (MD) simulation code, when performing full atomistic simulations. Speed-up of the MMM method is shown to be directly proportional to the reduction of the number of the atoms visited in force computation. Finally, an adaptive MMM analysis on a nanoindentation problem, containing over a million atoms, is performed, yielding an improvement of 6.3–8.5 times in efficiency, over the full atomistic MD method. For the first time, the efficiency of a concurrent atomistic/continuum coupling method is comprehensively investigated and demonstrated.

  4. Multiresolution molecular mechanics: Implementation and efficiency

    NASA Astrophysics Data System (ADS)

    Biyikli, Emre; To, Albert C.

    2017-01-01

    Atomistic/continuum coupling methods combine accurate atomistic methods and efficient continuum methods to simulate the behavior of highly ordered crystalline systems. Coupled methods utilize the advantages of both approaches to simulate systems at a lower computational cost, while retaining the accuracy associated with atomistic methods. Many concurrent atomistic/continuum coupling methods have been proposed in the past; however, their true computational efficiency has not been demonstrated. The present work presents an efficient implementation of a concurrent coupling method called the Multiresolution Molecular Mechanics (MMM) for serial, parallel, and adaptive analysis. First, we present the features of the software implemented along with the associated technologies. The scalability of the software implementation is demonstrated, and the competing effects of multiscale modeling and parallelization are discussed. Then, the algorithms contributing to the efficiency of the software are presented. These include algorithms for eliminating latent ghost atoms from calculations and measurement-based dynamic balancing of parallel workload. The efficiency improvements made by these algorithms are demonstrated by benchmark tests. The efficiency of the software is found to be on par with LAMMPS, a state-of-the-art Molecular Dynamics (MD) simulation code, when performing full atomistic simulations. Speed-up of the MMM method is shown to be directly proportional to the reduction of the number of the atoms visited in force computation. Finally, an adaptive MMM analysis on a nanoindentation problem, containing over a million atoms, is performed, yielding an improvement of 6.3-8.5 times in efficiency, over the full atomistic MD method. For the first time, the efficiency of a concurrent atomistic/continuum coupling method is comprehensively investigated and demonstrated.
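
    The measurement-based dynamic load balancing mentioned above can be illustrated with a small sketch: per-rank wall-clock times from the previous force evaluation are turned into per-atom costs, and atoms are then redistributed in proportion to each rank's measured speed. This is not the MMM implementation; the function and all numbers are hypothetical.

```python
import numpy as np

def rebalance(atom_counts, measured_times):
    """Measurement-based dynamic load balancing (illustrative only).

    Given the number of atoms each rank currently owns and the wall-clock time
    each rank spent in the last force evaluation, estimate a per-atom cost per
    rank and return target atom counts that equalize predicted times.
    """
    atom_counts = np.asarray(atom_counts, dtype=float)
    measured_times = np.asarray(measured_times, dtype=float)
    per_atom_cost = measured_times / atom_counts       # seconds per atom on each rank
    capacity = 1.0 / per_atom_cost                      # relative speed of each rank
    target = atom_counts.sum() * capacity / capacity.sum()
    return np.rint(target).astype(int)                  # rounding may leave a few atoms to reassign

# Example: rank 2 is slower (e.g., it owns the fully atomistic region).
print(rebalance(atom_counts=[1000, 1000, 1000, 1000],
                measured_times=[0.9, 1.0, 1.6, 1.1]))
```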

  5. Correcting X-ray spectra obtained from the AXAF VETA-I mirror calibration for pileup, continuum, background and deadtime

    NASA Technical Reports Server (NTRS)

    Chartas, G.; Flanagan, K.; Hughes, J. P.; Kellogg, E. M.; Nguyen, D.; Zombek, M.; Joy, M.; Kolodziejezak, J.

    1993-01-01

    The VETA-I mirror was calibrated with the use of a collimated soft X-ray source produced by electron bombardment of various anode materials. The FWHM, effective area and encircled energy were measured with the use of proportional counters that were scanned with a set of circular apertures. The pulses from the proportional counters were sent through a multichannel analyzer that produced a pulse height spectrum. In order to characterize the properties of the mirror at different discrete photon energies, one desires to extract from the pulse height distribution only those photons that originated from the characteristic line emission of the X-ray target source. We have developed a code that fits a modeled spectrum to the observed X-ray data, extracts the counts that originated from the line emission, and estimates the error in these counts. The function that is fitted to the X-ray spectra includes a Prescott function for the resolution of the detector, a second Prescott function for a pileup peak, and an X-ray continuum function. The continuum component is determined by calculating the absorption of the target Bremsstrahlung through various filters, correcting for the reflectivity of the mirror and convolving with the detector response.
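
    A minimal sketch of the line-plus-pileup-plus-continuum fit described above, using scipy.optimize.curve_fit on a synthetic pulse-height spectrum. Gaussians stand in for the Prescott detector-response function, and a linear term stands in for the filtered-Bremsstrahlung continuum; the channel numbers, amplitudes and the factor-of-two pile-up position are all assumptions, not the calibration model itself.

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, amp, mu, sigma):
    return amp * np.exp(-0.5 * ((x - mu) / sigma) ** 2)

def model(ch, a_line, mu, sig, a_pile, c0, c1):
    """Line peak + pile-up peak (assumed at ~2x the line energy) + linear continuum."""
    line = gaussian(ch, a_line, mu, sig)
    pileup = gaussian(ch, a_pile, 2.0 * mu, np.sqrt(2.0) * sig)
    continuum = c0 + c1 * ch
    return line + pileup + continuum

# Synthetic pulse-height spectrum just to exercise the fit.
ch = np.arange(256, dtype=float)
truth = model(ch, 500.0, 80.0, 6.0, 40.0, 5.0, 0.02)
counts = np.random.default_rng(0).poisson(truth).astype(float)

popt, pcov = curve_fit(model, ch, counts, p0=[400.0, 75.0, 5.0, 20.0, 1.0, 0.0])
line_counts = popt[0] * popt[2] * np.sqrt(2.0 * np.pi)   # area under the fitted line peak
print("counts attributed to the line:", round(line_counts))
```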

  6. Correcting x ray spectra obtained from the AXAF VETA-I mirror calibration for pileup, continuum, background and deadtime

    NASA Technical Reports Server (NTRS)

    Chartas, G.; Flanagan, Kathy; Hughes, John P.; Kellogg, Edwin M.; Nguyen, D.; Zombeck, M.; Joy, M.; Kolodziejezak, J.

    1992-01-01

    The VETA-I mirror was calibrated with the use of a collimated soft X-ray source produced by electron bombardment of various anode materials. The FWHM, effective area and encircled energy were measured with the use of proportional counters that were scanned with a set of circular apertures. The pulses from the proportional counters were sent through a multichannel analyzer that produced a pulse height spectrum. In order to characterize the properties of the mirror at different discrete photon energies, one desires to extract from the pulse height distribution only those photons that originated from the characteristic line emission of the X-ray target source. We have developed a code that fits a modeled spectrum to the observed X-ray data, extracts the counts that originated from the line emission, and estimates the error in these counts. The function that is fitted to the X-ray spectra includes a Prescott function for the resolution of the detector, a second Prescott function for a pileup peak, and an X-ray continuum function. The continuum component is determined by calculating the absorption of the target Bremsstrahlung through various filters, correcting for the reflectivity of the mirror and convolving with the detector response.

  7. Modeling the Martian neutron and gamma-ray leakage fluxes using Geant4

    NASA Astrophysics Data System (ADS)

    Pirard, Benoit; Desorgher, Laurent; Diez, Benedicte; Gasnault, Olivier

    A new evaluation of the Martian neutron and gamma-ray (continuum and line) leakage fluxes has been performed using the Geant4 code. Although numerous studies have recently been carried out with Monte Carlo methods to characterize planetary radiation environments, only a few have been able to reproduce in detail the neutron and gamma-ray spectra observed in orbit. We report on the efforts performed to adapt and validate the Geant4-based PLANETOCOSMICS code for use in planetary neutron and gamma-ray spectroscopy data analysis. Besides the advantage of high transparency and modularity common to Geant4 applications, the new code uses reviewed nuclear cross section data, realistic atmospheric profiles and soil layering, as well as specific effects such as gravity acceleration for low energy neutrons. Results from first simulations are presented for some Martian reference compositions and show high consistency with the corresponding neutron and gamma-ray spectra measured on board Mars Odyssey. Finally, we discuss the advantages and perspectives of the improved code for precise simulation of planetary radiation environments.

  8. Disseminating near-real-time hazards information and flood maps in the Philippines through Web-GIS.

    PubMed

    A Lagmay, Alfredo Mahar Francisco; Racoma, Bernard Alan; Aracan, Ken Adrian; Alconis-Ayco, Jenalyn; Saddi, Ivan Lester

    2017-09-01

    The Philippines, being a locus of tropical cyclones, tsunamis, earthquakes and volcanic eruptions, is a hotbed of disasters. These natural hazards inflict loss of lives and costly damage to property. Situated in a region where climatic and geophysical tempests are common, the Philippines will inevitably suffer from calamities similar to those experienced recently. With continued development and population growth in hazard-prone areas, it is expected that damage to infrastructure and human losses would persist and even rise unless appropriate measures are immediately implemented by government. In 2012, the Philippines launched a responsive program for disaster prevention and mitigation called the Nationwide Operational Assessment of Hazards (Project NOAH), specifically for government warning agencies to be able to provide a 6-hr lead-time warning to vulnerable communities against impending floods and to use advanced technology to enhance current geo-hazard vulnerability maps. To disseminate such critical information to as wide an audience as possible, a Web-GIS using mashups of freely available source codes and application program interfaces (APIs) was developed and can be found at the URLs http://noah.dost.gov.ph and http://noah.up.edu.ph/. This Web-GIS tool is now heavily used by local government units in the Philippines in their disaster prevention and mitigation efforts and can be replicated in countries that have a proactive approach to address the impacts of natural hazards but lack sufficient funds. Copyright © 2017. Published by Elsevier B.V.

  9. Thermal-hydraulic simulation of natural convection decay heat removal in the High Flux Isotope Reactor (HFIR) using RELAP5 and TEMPEST: Part 2, Interpretation and validation of results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruggles, A.E.; Morris, D.G.

    The RELAP5/MOD2 code was used to predict the thermal-hydraulic behavior of the HFIR core during decay heat removal through boiling natural circulation. The low system pressure and low mass flux values associated with boiling natural circulation are far from conditions for which RELAP5 is well exercised. Therefore, some simple hand calculations are used herein to establish the physics of the results. The interpretation and validation effort is divided between the time average flow conditions and the time varying flow conditions. The time average flow conditions are evaluated using a lumped parameter model and heat balance. The Martinelli-Nelson correlations are used to model the two-phase pressure drop and void fraction vs flow quality relationship within the core region. Systems of parallel channels are susceptible to both density wave oscillations and pressure drop oscillations. Periodic variations in the mass flux and exit flow quality of individual core channels are predicted by RELAP5. These oscillations are consistent with those observed experimentally and are of the density wave type. The impact of the time varying flow properties on local wall superheat is bounded herein. The conditions necessary for Ledinegg flow excursions are identified. These conditions do not fall within the envelope of decay heat levels relevant to HFIR in boiling natural circulation. 14 refs., 5 figs., 1 tab.
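
    The lumped-parameter heat balance used for the time-average conditions can be illustrated with a back-of-the-envelope exit-quality estimate. None of the numbers below are HFIR values; they are placeholders chosen only to exercise the arithmetic.

```python
# Rough lumped heat balance for boiling natural circulation (illustrative numbers only).
# Exit flow quality x_e follows from an energy balance over the core:
#   Q_decay = m_dot * (cp * (T_sat - T_in) + x_e * h_fg)
Q_decay = 600e3        # decay heat, W                      (assumed)
m_dot   = 1.0          # core mass flow, kg/s               (assumed)
cp      = 4200.0       # liquid specific heat, J/(kg K)
T_in    = 50.0         # core inlet temperature, C          (assumed)
T_sat   = 100.0        # saturation temperature at low pressure, C
h_fg    = 2.26e6       # latent heat of vaporization, J/kg

q_subcooled = m_dot * cp * (T_sat - T_in)                 # power absorbed before boiling starts
x_exit = max(0.0, (Q_decay - q_subcooled) / (m_dot * h_fg))
print(f"estimated exit quality ~ {x_exit:.3f}")
```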

  10. Initial parametric study of the flammability of plume releases in Hanford waste tanks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Antoniak, Z.I.; Recknagle, K.P.

    This study comprised systematic analyses of waste tank headspace flammability following a plume-type gas release from the waste. First, critical parameters affecting plume flammability were selected, evaluated, and refined. As part of the evaluation, the effect of ventilation (breathing) air inflow on the convective flow field inside the tank headspace was assessed, and the magnitude of the so-called "numerical diffusion" effect on numerical simulation accuracy was investigated. Both issues were concluded to be negligible influences on predicted flammable gas concentrations in the tank headspace. Previous validation of the TEMPEST code against experimental data is also discussed, with calculated results in good agreement with experimental data. Twelve plume release simulations were then run, using release volumes and flow rates that were thought to cover the range of actual release volumes and rates. The results indicate that most plume-type releases remain flammable only until the actual release ends. Only for very large releases representing a significant fraction of the volume necessary to make the entire mixed headspace flammable (many thousands of cubic feet) can flammable concentrations persist for several hours after the release ends. However, as in the smaller plumes, only a fraction of the total release volume is flammable at any one time. The transient evolution of several plume sizes is illustrated in a number of color contour plots that provide insight into plume mixing behavior.
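
    A well-mixed bound makes the conclusion above concrete: only a release that is a significant fraction of the headspace volume can keep the fully mixed headspace above the lower flammability limit. The volumes and limit below are assumed, not Hanford tank values.

```python
# Well-mixed bound on headspace flammability after a plume-type gas release
# (illustrative only; the study itself resolves the plumes with the TEMPEST CFD code).
headspace_volume = 30000.0   # tank headspace, ft^3                              (assumed)
release_volume   = 2500.0    # released gas volume, ft^3                         (assumed)
lfl_fraction     = 0.25      # lower flammability limit, volume fraction of gas  (assumed)

mixed_concentration = release_volume / (release_volume + headspace_volume)
print(f"fully mixed gas concentration: {mixed_concentration:.3f}")
print("mixed headspace flammable" if mixed_concentration >= lfl_fraction
      else "flammable only locally, inside the unmixed plume")
```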

  11. Analytical collisionless damping rate of geodesic acoustic mode

    NASA Astrophysics Data System (ADS)

    Ren, H.; Xu, X. Q.

    2016-10-01

    Collisionless damping of the geodesic acoustic mode (GAM) is analytically investigated by considering the finite-orbit-width (FOW) resonance effect to the 3rd order in the gyro-kinetic equations. A concise and transparent expression for the damping rate is presented for the first time. Good agreement is found between the analytical damping rate and the previous TEMPEST simulation result (Xu et al 2008 Phys. Rev. Lett. 100 215001) for systematic q scans. Our result also shows that the FOW effect must be taken into account to the 3rd order in order to reach sufficient accuracy.

  12. Evaluation of out-of-core computer programs for the solution of symmetric banded linear equations. [simultaneous equations

    NASA Technical Reports Server (NTRS)

    Dunham, R. S.

    1976-01-01

    FORTRAN-coded out-of-core equation solvers are evaluated that use direct methods to solve symmetric banded systems of simultaneous algebraic equations. Banded, frontal and column (skyline) solvers were studied, as well as solvers that can partition the working area and thus fit into any available core. Comparison timings are presented for several typical two-dimensional and three-dimensional continuum-type grids of elements with and without midside nodes. Extensive conclusions are also given.
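
    For a sense of the compact banded storage these solvers operate on, the in-core sketch below solves a small symmetric positive-definite banded system with scipy.linalg.solveh_banded; the out-of-core partitioning evaluated in the paper is not reproduced here.

```python
import numpy as np
from scipy.linalg import solveh_banded

# Symmetric positive-definite banded system (a 1D Laplacian-like stiffness matrix),
# stored in the compact "upper" banded form: row 0 = superdiagonal, row 1 = diagonal.
n = 8
ab = np.zeros((2, n))
ab[0, 1:] = -1.0     # superdiagonal
ab[1, :]  = 2.0      # main diagonal
b = np.ones(n)       # right-hand side, e.g. a uniform load

x = solveh_banded(ab, b)   # direct banded solve

# Sanity check against the equivalent dense system.
A = (np.diag(np.full(n, 2.0))
     + np.diag(np.full(n - 1, -1.0), 1)
     + np.diag(np.full(n - 1, -1.0), -1))
print(x)
print("residual check:", np.allclose(A @ x, b))
```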

  13. Neighborhood Effects on Health: Concentrated Advantage and Disadvantage

    PubMed Central

    Finch, Brian K.; Do, D. Phuong; Heron, Melonie; Bird, Chloe; Seeman, Teresa; Lurie, Nicole

    2010-01-01

    We investigate an alternative conceptualization of neighborhood context and its association with health. Using an index that measures a continuum of concentrated advantage and disadvantage, we examine whether the relationship between neighborhood conditions and health varies by socio-economic status. Using NHANES III data geo-coded to census tracts, we find that while largely uneducated neighborhoods are universally deleterious, individuals with more education benefit from living in highly educated neighborhoods to a greater degree than individuals with lower levels of education. PMID:20627796

  14. Small Business Innovations (Helicopters)

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The amount of engine power required for a helicopter to hover is an important, but difficult, consideration in helicopter design. The EHPIC program model produces converged, freely distorted wake geometries that generate accurate analysis of wake-induced downwash, allowing good predictions of rotor thrust and power requirements. Continuum Dynamics, Inc., the Small Business Innovation Research (SBIR) company that developed EHPIC, also produces RotorCRAFT, a program for analysis of aerodynamic loading of helicopter blades in forward flight. Both helicopter codes have been licensed to commercial manufacturers.

  15. MIRACAL: A mission radiation calculation program for analysis of lunar and interplanetary missions

    NASA Technical Reports Server (NTRS)

    Nealy, John E.; Striepe, Scott A.; Simonsen, Lisa C.

    1992-01-01

    A computational procedure and data base are developed for manned space exploration missions for which estimates are made for the energetic particle fluences encountered and the resulting dose equivalent incurred. The data base includes the following options: statistical or continuum model for ordinary solar proton events, selection of up to six large proton flare spectra, and galactic cosmic ray fluxes for elemental nuclei of charge numbers 1 through 92. The program requires input trajectory definition information and specification of optional parameters, which include desired spectral data and nominal shield thickness. The procedure may be implemented as an independent program or as a subroutine in trajectory codes. This code should be most useful in mission optimization and selection studies for which radiation exposure is of special importance.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Antonelli, Perry Edward

    A low-level model-to-model interface is presented that will enable independent models to be linked into an integrated system of models. The interface is based on a standard set of functions that contain appropriate export and import schemas that enable models to be linked with no changes to the models themselves. These ideas are presented in the context of a specific multiscale material problem that couples atomistic-based molecular dynamics calculations to continuum calculations of fluid flow. These simulations will be used to examine the influence of interactions of the fluid with an adjacent solid on the fluid flow. The interface will also be examined by adding it to an already existing modeling code, Large-scale Atomic/Molecular Massively Parallel Simulator (LAMMPS) and comparing it with our own molecular dynamics code.
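
    A minimal sketch of the export/import-schema idea: each model exposes a pair of functions against an agreed schema, and a thin driver exchanges state without either model knowing the other's internals. The class names, schema fields and values are hypothetical, not the interface presented in the record.

```python
# Illustrative low-level coupling through export/import functions and an agreed schema.
class AtomisticModel:
    def export_state(self):
        # schema field: wall shear sampled near the solid surface (placeholder value)
        return {"wall_shear": 0.012}

    def import_state(self, data):
        self.driving_velocity = data["bulk_velocity"]

class ContinuumFlowModel:
    def export_state(self):
        # schema field: bulk fluid velocity near the interface (placeholder value)
        return {"bulk_velocity": 1.5}

    def import_state(self, data):
        self.wall_shear_bc = data["wall_shear"]

def couple(step_count, md, cfd):
    """One exchange per coupling step; each side only ever sees the schema."""
    for _ in range(step_count):
        cfd.import_state(md.export_state())
        md.import_state(cfd.export_state())

couple(3, AtomisticModel(), ContinuumFlowModel())
```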

  17. JDFTx: Software for joint density-functional theory

    DOE PAGES

    Sundararaman, Ravishankar; Letchworth-Weaver, Kendra; Schwarz, Kathleen A.; ...

    2017-11-14

    Density-functional theory (DFT) has revolutionized computational prediction of atomic-scale properties from first principles in physics, chemistry and materials science. Continuing development of new methods is necessary for accurate predictions of new classes of materials and properties, and for connecting to nano- and mesoscale properties using coarse-grained theories. JDFTx is a fully-featured open-source electronic DFT software designed specifically to facilitate rapid development of new theories, models and algorithms. Using an algebraic formulation as an abstraction layer, compact C++11 code automatically performs well on diverse hardware including GPUs (Graphics Processing Units). This code hosts the development of joint density-functional theory (JDFT) that combines electronic DFT with classical DFT and continuum models of liquids for first-principles calculations of solvated and electrochemical systems. In addition, the modular nature of the code makes it easy to extend and interface with, facilitating the development of multi-scale toolkits that connect to ab initio calculations, e.g. photo-excited carrier dynamics combining electron and phonon calculations with electromagnetic simulations.

  18. JDFTx: Software for joint density-functional theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sundararaman, Ravishankar; Letchworth-Weaver, Kendra; Schwarz, Kathleen A.

    Density-functional theory (DFT) has revolutionized computational prediction of atomic-scale properties from first principles in physics, chemistry and materials science. Continuing development of new methods is necessary for accurate predictions of new classes of materials and properties, and for connecting to nano- and mesoscale properties using coarse-grained theories. JDFTx is a fully-featured open-source electronic DFT software designed specifically to facilitate rapid development of new theories, models and algorithms. Using an algebraic formulation as an abstraction layer, compact C++11 code automatically performs well on diverse hardware including GPUs (Graphics Processing Units). This code hosts the development of joint density-functional theory (JDFT) that combines electronic DFT with classical DFT and continuum models of liquids for first-principles calculations of solvated and electrochemical systems. In addition, the modular nature of the code makes it easy to extend and interface with, facilitating the development of multi-scale toolkits that connect to ab initio calculations, e.g. photo-excited carrier dynamics combining electron and phonon calculations with electromagnetic simulations.

  19. Molecular Diagnostics of the Internal Motions of Massive Cores

    NASA Astrophysics Data System (ADS)

    Pineda, Jorge; Velusamy, T.; Goldsmith, P.; Li, D.; Peng, R.; Langer, W.

    2009-12-01

    We present models of the internal kinematics of massive cores in the Orion molecular cloud. We use a sample of cores studied by Velusamy et al. (2008) that show red, blue, and no asymmetry in their HCO+ line profiles in equal proportion, and which therefore may represent a sample of cores in different kinematic states. We use the radiative transfer code RATRAN (Hogerheijde & van der Tak 2000) to model several transitions of HCO+ and H13CO+ as well as the dust continuum emission of a spherical model cloud with radial density, temperature, and velocity gradients. We find that excitation and velocity gradients are prerequisites for reproducing the observed line profiles. We use the dust continuum emission to constrain the density and temperature gradients. This allows us to narrow down the functional forms of the velocity gradient, giving us the opportunity to test several theoretical predictions of velocity gradients produced by the effect of magnetic fields (e.g. Tassis et al. 2007) and turbulence (e.g. Vasquez-Semanedi et al. 2007).

  20. Electrosensory neural responses to natural electro-communication stimuli are distributed along a continuum

    PubMed Central

    Sproule, Michael K. J.

    2017-01-01

    Neural heterogeneities are seen ubiquitously within the brain and greatly complicate classification efforts. Here we tested whether the responses of an anatomically well-characterized sensory neuron population to natural stimuli could be used for functional classification. To do so, we recorded from pyramidal cells within the electrosensory lateral line lobe (ELL) of the weakly electric fish Apteronotus leptorhynchus in response to natural electro-communication stimuli, as these cells can be anatomically classified into six different types. We then used two independent methodologies to functionally classify responses: one relies on reducing the dimensionality of a feature space while the other directly compares the responses themselves. Both methodologies gave rise to qualitatively similar results: while ON and OFF-type cells could easily be distinguished from one another, ELL pyramidal neuron responses are actually distributed along a continuum rather than forming distinct clusters due to heterogeneities. We discuss the implications of our results for neural coding and highlight some potential advantages. PMID:28384244
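
    One way to picture the "reduce the feature space, then test for clusters" methodology is the sketch below: if responses lie along a continuum, cluster-quality scores such as the silhouette stay low for any assumed number of clusters. The data are synthetic stand-ins for per-neuron response features, and scikit-learn's PCA/KMeans are used only for illustration, not as the paper's analysis pipeline.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(1)
# Synthetic "response features": noise plus one graded, continuum-like axis.
responses = rng.normal(size=(120, 40))
responses += np.outer(np.linspace(-2, 2, 120), rng.normal(size=40))

features = PCA(n_components=3).fit_transform(responses)   # dimensionality reduction
for k in (2, 3, 4):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(features)
    # Low silhouette scores across k are consistent with a continuum, not clusters.
    print(f"k={k}: silhouette = {silhouette_score(features, labels):.3f}")
```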

  1. LS-DYNA Simulation of Hemispherical-punch Stamping Process Using an Efficient Algorithm for Continuum Damage Based Elastoplastic Constitutive Equation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Salajegheh, Nima; Abedrabbo, Nader; Pourboghrat, Farhang

    An efficient integration algorithm for continuum damage based elastoplastic constitutive equations is implemented in LS-DYNA. The isotropic damage parameter is defined as the ratio of the damaged surface area over the total cross section area of the representative volume element. This parameter is incorporated into the integration algorithm as an internal variable. The developed damage model is then implemented in the FEM code LS-DYNA as user material subroutine (UMAT). Pure stretch experiments of a hemispherical punch are carried out for copper sheets and the results are compared against the predictions of the implemented damage model. Evaluation of damage parameters is carried out and the optimized values that correctly predicted the failure in the sheet are reported. Prediction of failure in the numerical analysis is performed through element deletion using the critical damage value. The set of failure parameters which accurately predict the failure behavior in copper sheets compared to experimental data is reported as well.
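
    The role of the isotropic damage parameter and the element-deletion criterion can be sketched as follows. The linear damage evolution law, the onset/rupture strains and the critical damage value are assumptions chosen for illustration, not the constitutive model implemented in the UMAT.

```python
import numpy as np

def update_damage(strain, strain_onset, strain_rupture):
    """Illustrative isotropic damage, D = damaged area / total area, growing
    linearly between an onset strain and a rupture strain (assumed law)."""
    D = (strain - strain_onset) / (strain_rupture - strain_onset)
    return float(np.clip(D, 0.0, 1.0))

def effective_stress(nominal_stress, D):
    """Continuum damage mechanics idea: only the intact ligament carries load."""
    return nominal_stress / (1.0 - D)

D_CRITICAL = 0.3   # element deleted once damage reaches this value (assumed)

for eps in (0.05, 0.12, 0.20):
    D = update_damage(eps, strain_onset=0.10, strain_rupture=0.40)
    status = "delete element" if D >= D_CRITICAL else "keep element"
    print(f"strain={eps:.2f}  D={D:.2f}  "
          f"sigma_eff(100 MPa nominal)={effective_stress(100.0, D):.1f} MPa  -> {status}")
```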

  2. Cosmic reionization on computers. Ultraviolet continuum slopes and dust opacities in high redshift galaxies

    DOE PAGES

    Khakhaleva-Li, Zimu; Gnedin, Nickolay Y.

    2016-03-30

    In this study, we compare the properties of stellar populations of model galaxies from the Cosmic Reionization On Computers (CROC) project with the existing UV and IR data. Since CROC simulations do not follow cosmic dust directly, we adopt two variants of the dust-follows-metals ansatz to populate model galaxies with dust. Using the dust radiative transfer code Hyperion, we compute synthetic stellar spectra, UV continuum slopes, and IR fluxes for simulated galaxies. We find that the simulation results generally match observational measurements, but, perhaps, not in full detail. The differences seem to indicate that our adopted dust-follows-metals ansatzes are not fully sufficient. While the discrepancies with the existing data are marginal, the future JWST data will be of much higher precision, rendering highly significant any tentative difference between theory and observations. It is, therefore, likely that in order to fully utilize the precision of JWST observations, fully dynamical modeling of dust formation, evolution, and destruction may be required.
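
    The simplest form of a dust-follows-metals ansatz is a fixed dust-to-metal mass ratio; the one-liner below shows that bookkeeping with an assumed ratio and solar metal fraction, and is not necessarily either of the two variants adopted for CROC.

```python
def dust_mass(gas_mass, metallicity, dust_to_metal=0.4):
    """Dust mass from a fixed dust-to-metal ratio (0.4 is a common assumption).

    gas_mass    : gas mass (e.g., in solar masses)
    metallicity : metal mass fraction of the gas
    """
    return dust_to_metal * metallicity * gas_mass

# Example: 1e8 solar masses of gas at 10% solar metallicity (Z_sun ~ 0.0134 assumed).
print(dust_mass(gas_mass=1e8, metallicity=0.1 * 0.0134))
```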

  3. An algorithm for continuum modeling of rocks with multiple embedded nonlinearly-compliant joints [Continuum modeling of elasto-plastic media with multiple embedded nonlinearly-compliant joints

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hurley, R. C.; Vorobiev, O. Y.; Ezzedine, S. M.

    Here, we present a numerical method for modeling the mechanical effects of nonlinearly-compliant joints in elasto-plastic media. The method uses a series of strain-rate and stress update algorithms to determine joint closure, slip, and solid stress within computational cells containing multiple “embedded” joints. This work facilitates efficient modeling of nonlinear wave propagation in large spatial domains containing a large number of joints that affect bulk mechanical properties. We implement the method within the massively parallel Lagrangian code GEODYN-L and provide verification and examples. We highlight the ability of our algorithms to capture joint interactions and multiple weakness planes within individual computational cells, as well as its computational efficiency. We also discuss the motivation for developing the proposed technique: to simulate large-scale wave propagation during the Source Physics Experiments (SPE), a series of underground explosions conducted at the Nevada National Security Site (NNSS).

  4. Experimental Verification of a Progressive Damage Model for IM7/5260 Laminates Subjected to Tension-Tension Fatigue

    NASA Technical Reports Server (NTRS)

    Coats, Timothy W.; Harris, Charles E.

    1995-01-01

    The durability and damage tolerance of laminated composites are critical design considerations for airframe composite structures. Therefore, the ability to model damage initiation and growth and predict the life of laminated composites is necessary to achieve structurally efficient and economical designs. The purpose of this research is to experimentally verify the application of a continuum damage model to predict progressive damage development in a toughened material system. Damage due to monotonic and tension-tension fatigue was documented for IM7/5260 graphite/bismaleimide laminates. Crack density and delamination surface area were used to calculate matrix cracking and delamination internal state variables to predict stiffness loss in unnotched laminates. A damage dependent finite element code predicted the stiffness loss for notched laminates with good agreement to experimental data. It was concluded that the continuum damage model can adequately predict matrix damage progression in notched and unnotched laminates as a function of loading history and laminate stacking sequence.

  5. An algorithm for continuum modeling of rocks with multiple embedded nonlinearly-compliant joints [Continuum modeling of elasto-plastic media with multiple embedded nonlinearly-compliant joints

    DOE PAGES

    Hurley, R. C.; Vorobiev, O. Y.; Ezzedine, S. M.

    2017-04-06

    Here, we present a numerical method for modeling the mechanical effects of nonlinearly-compliant joints in elasto-plastic media. The method uses a series of strain-rate and stress update algorithms to determine joint closure, slip, and solid stress within computational cells containing multiple “embedded” joints. This work facilitates efficient modeling of nonlinear wave propagation in large spatial domains containing a large number of joints that affect bulk mechanical properties. We implement the method within the massively parallel Lagrangian code GEODYN-L and provide verification and examples. We highlight the ability of our algorithms to capture joint interactions and multiple weakness planes within individual computational cells, as well as its computational efficiency. We also discuss the motivation for developing the proposed technique: to simulate large-scale wave propagation during the Source Physics Experiments (SPE), a series of underground explosions conducted at the Nevada National Security Site (NNSS).

  6. Cosmic reionization on computers. Ultraviolet continuum slopes and dust opacities in high redshift galaxies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khakhaleva-Li, Zimu; Gnedin, Nickolay Y.

    In this study, we compare the properties of stellar populations of model galaxies from the Cosmic Reionization On Computers (CROC) project with the existing UV and IR data. Since CROC simulations do not follow cosmic dust directly, we adopt two variants of the dust-follows-metals ansatz to populate model galaxies with dust. Using the dust radiative transfer code Hyperion, we compute synthetic stellar spectra, UV continuum slopes, and IR fluxes for simulated galaxies. We find that the simulation results generally match observational measurements, but, perhaps, not in full detail. The differences seem to indicate that our adopted dust-follows-metals ansatzes are not fully sufficient. While the discrepancies with the existing data are marginal, the future JWST data will be of much higher precision, rendering highly significant any tentative difference between theory and observations. It is, therefore, likely that in order to fully utilize the precision of JWST observations, fully dynamical modeling of dust formation, evolution, and destruction may be required.

  7. Parallel multiscale simulations of a brain aneurysm

    PubMed Central

    Grinberg, Leopold; Fedosov, Dmitry A.; Karniadakis, George Em

    2012-01-01

    Cardiovascular pathologies, such as a brain aneurysm, are affected by the global blood circulation as well as by the local microrheology. Hence, developing computational models for such cases requires the coupling of disparate spatial and temporal scales often governed by diverse mathematical descriptions, e.g., by partial differential equations (continuum) and ordinary differential equations for discrete particles (atomistic). However, interfacing atomistic-based with continuum-based domain discretizations is a challenging problem that requires both mathematical and computational advances. We present here a hybrid methodology that enabled us to perform the first multi-scale simulations of platelet depositions on the wall of a brain aneurysm. The large scale flow features in the intracranial network are accurately resolved by using the high-order spectral element Navier-Stokes solver NεκTαr. The blood rheology inside the aneurysm is modeled using a coarse-grained stochastic molecular dynamics approach (the dissipative particle dynamics method) implemented in the parallel code LAMMPS. The continuum and atomistic domains overlap with interface conditions provided by effective forces computed adaptively to ensure continuity of states across the interface boundary. A two-way interaction is allowed with the time-evolving boundary of the (deposited) platelet clusters tracked by an immersed boundary method. The corresponding heterogeneous solvers (NεκTαr and LAMMPS) are linked together by a computational multilevel message passing interface that facilitates modularity and high parallel efficiency. Results of multiscale simulations of clot formation inside the aneurysm in a patient-specific arterial tree are presented. We also discuss the computational challenges involved and present scalability results of our coupled solver on up to 300K computer processors. Validation of such coupled atomistic-continuum models is a main open issue that has to be addressed in future work. PMID:23734066

  8. The impact of health information technology on cancer care across the continuum: a systematic review and meta-analysis.

    PubMed

    Tarver, Will L; Menachemi, Nir

    2016-03-01

    Health information technology (HIT) has the potential to play a significant role in the management of cancer. The purpose of this review is to identify and examine empirical studies that investigate the impact of HIT in cancer care on different levels of the care continuum. Electronic searches were performed in four academic databases. The authors used a three-step search process to identify 122 studies that met specific inclusion criteria. Next, a coding sheet was used to extract information from each included article to use in an analysis. Logistic regression was used to determine study-specific characteristics that were associated with positive findings. Overall, 72.4% of published analyses reported a beneficial effect of HIT. Multivariate analysis found that the impact of HIT differs across the cancer continuum with studies targeting diagnosis and treatment being, respectively, 77 (P = .001) and 39 (P = .039) percentage points less likely to report a beneficial effect when compared to those targeting prevention. In addition, studies targeting HIT to patients were 31 percentage points less likely to find a beneficial effect than those targeting providers (P = .030). Lastly, studies assessing behavior change as an outcome were 41 percentage points less likely to find a beneficial effect (P = .006), while studies targeting decision making were 27 percentage points more likely to find a beneficial effect (P = .034). Based on current evidence, HIT interventions seem to be more successful when targeting physicians, care in the prevention phase of the cancer continuum, and/or decision making. An agenda for future research is discussed. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  9. Parallel multiscale simulations of a brain aneurysm.

    PubMed

    Grinberg, Leopold; Fedosov, Dmitry A; Karniadakis, George Em

    2013-07-01

    Cardiovascular pathologies, such as a brain aneurysm, are affected by the global blood circulation as well as by the local microrheology. Hence, developing computational models for such cases requires the coupling of disparate spatial and temporal scales often governed by diverse mathematical descriptions, e.g., by partial differential equations (continuum) and ordinary differential equations for discrete particles (atomistic). However, interfacing atomistic-based with continuum-based domain discretizations is a challenging problem that requires both mathematical and computational advances. We present here a hybrid methodology that enabled us to perform the first multi-scale simulations of platelet depositions on the wall of a brain aneurysm. The large scale flow features in the intracranial network are accurately resolved by using the high-order spectral element Navier-Stokes solver NεκTαr. The blood rheology inside the aneurysm is modeled using a coarse-grained stochastic molecular dynamics approach (the dissipative particle dynamics method) implemented in the parallel code LAMMPS. The continuum and atomistic domains overlap with interface conditions provided by effective forces computed adaptively to ensure continuity of states across the interface boundary. A two-way interaction is allowed with the time-evolving boundary of the (deposited) platelet clusters tracked by an immersed boundary method. The corresponding heterogeneous solvers (NεκTαr and LAMMPS) are linked together by a computational multilevel message passing interface that facilitates modularity and high parallel efficiency. Results of multiscale simulations of clot formation inside the aneurysm in a patient-specific arterial tree are presented. We also discuss the computational challenges involved and present scalability results of our coupled solver on up to 300K computer processors. Validation of such coupled atomistic-continuum models is a main open issue that has to be addressed in future work.

  10. Parallel multiscale simulations of a brain aneurysm

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grinberg, Leopold; Fedosov, Dmitry A.; Karniadakis, George Em, E-mail: george_karniadakis@brown.edu

    2013-07-01

    Cardiovascular pathologies, such as a brain aneurysm, are affected by the global blood circulation as well as by the local microrheology. Hence, developing computational models for such cases requires the coupling of disparate spatial and temporal scales often governed by diverse mathematical descriptions, e.g., by partial differential equations (continuum) and ordinary differential equations for discrete particles (atomistic). However, interfacing atomistic-based with continuum-based domain discretizations is a challenging problem that requires both mathematical and computational advances. We present here a hybrid methodology that enabled us to perform the first multiscale simulations of platelet depositions on the wall of a brain aneurysm. The large scale flow features in the intracranial network are accurately resolved by using the high-order spectral element Navier–Stokes solver NεκTαr. The blood rheology inside the aneurysm is modeled using a coarse-grained stochastic molecular dynamics approach (the dissipative particle dynamics method) implemented in the parallel code LAMMPS. The continuum and atomistic domains overlap with interface conditions provided by effective forces computed adaptively to ensure continuity of states across the interface boundary. A two-way interaction is allowed with the time-evolving boundary of the (deposited) platelet clusters tracked by an immersed boundary method. The corresponding heterogeneous solvers (NεκTαr and LAMMPS) are linked together by a computational multilevel message passing interface that facilitates modularity and high parallel efficiency. Results of multiscale simulations of clot formation inside the aneurysm in a patient-specific arterial tree are presented. We also discuss the computational challenges involved and present scalability results of our coupled solver on up to 300K computer processors. Validation of such coupled atomistic-continuum models is a main open issue that has to be addressed in future work.

  11. Temporal Trends and Temperature-Related Incidence of Electrical Storm: The TEMPEST Study (Temperature-Related Incidence of Electrical Storm).

    PubMed

    Guerra, Federico; Bonelli, Paolo; Flori, Marco; Cipolletta, Laura; Carbucicchio, Corrado; Izquierdo, Maite; Kozluk, Edward; Shivkumar, Kalyanam; Vaseghi, Marmar; Patani, Francesca; Cupido, Claudio; Pala, Salvatore; Ruiz-Granell, Ricardo; Ferrero, Angel; Tondo, Claudio; Capucci, Alessandro

    2017-03-01

    The occurrence of ventricular tachyarrhythmias seems to follow circadian, daily, and seasonal distributions. Our aim is to identify potential temporal patterns of electrical storm (ES), in which a cluster of ventricular tachycardias or ventricular fibrillation, negatively affects short- and long-term survival. The TEMPEST study (Circannual Pattern and Temperature-Related Incidence of Electrical Storm) is a patient-level, pooled analysis of previously published data sets. Study selection criteria included diagnosis of ES, absence of acute coronary syndrome as the arrhythmic trigger, and ≥10 patients included. At the end of the selection and collection processes, 5 centers had the data set from their article pooled into the present registry. Temperature data and sunrise and sunset hours were retrieved from Weather Underground, the largest weather database available online. Total sample included 246 patients presenting with ES (221 men; age: 65±9 years). Each ES episode included a median of 7 ventricular tachycardia/ventricular fibrillation episodes. Fifty-nine percent of patients experienced ES during daytime hours ( P <0.001). The prevalence of ES was significantly higher during workdays, with Saturdays and Sundays registering the lowest rates of ES (10.4% and 7.2%, respectively, versus 16.5% daily mean from Monday to Friday; P <0.001). ES occurrence was significantly associated with increased monthly temperature range when compared with the month before ( P =0.003). ES incidence is not homogenous over time but seems to have a clustered pattern, with a higher incidence during daytime hours and working days. ES is associated with an increase in monthly temperature variation. https://www.crd.york.ac.uk. Unique identifier: CRD42013003744. © 2017 American Heart Association, Inc.

  12. Regional TEMPEST survey in north-east Namibia

    NASA Astrophysics Data System (ADS)

    Peters, Geoffrey; Street, Gregory; Kahimise, Ivor; Hutchins, David

    2015-09-01

    A regional scale TEMPEST airborne electromagnetic survey was flown in north-east Namibia in 2011. With broad line spacing (4 km) and a relatively low-powered, fixed-wing system, the approach was intended to provide a regional geo-electric map of the area, rather than direct detection of potential mineral deposits. A key component of the geo-electric profiling was to map the relative thickness of the Kalahari sediments, which are up to 200 m thick and obscure most of the bedrock in the area. Knowledge of the thickness would allow explorers to better predict the costs of exploration under the Kalahari. An additional aim was to determine whether bedrock conductors were detectable beneath the Kalahari cover. The system succeeded in measuring the Kalahari thickness where this cover was relatively thin and moderately conductive. Limitations in depth penetration mean that it is not possible to map the thickness in the centre of the survey area or in much of the northern half of the survey area. Additional problems arise due to the variable conductivity of the Kalahari cover. Where the conductivity of the Kalahari sediment is close to that of the basement, there is no discernible contrast to delineate the base of the Kalahari. Basement conductors are visible beneath the more thinly covered areas such as in the north-west and south of the survey area. The remainder of the survey area generally comprises deeper, more conductive cover and for the most part basement conductors cannot be detected. A qualitative comparison with VTEM data shows comparable results in terms of regional mapping, and suggests that even more powerful systems such as the VTEM may not detect discrete conductors beneath the thick conductive parts of the Kalahari cover.

  13. Cancer Support Needs for African American Breast Cancer Survivors and Caregivers.

    PubMed

    Haynes-Maslow, Lindsey; Allicock, Marlyn; Johnson, La-Shell

    2016-03-01

    Improved cancer screening and treatment advances have led to higher cancer survival rates in the United States. However, racial disparities in breast cancer survival persist for African American women who experience lower survival rates than white women. These disparities suggest that unmet needs related to survivorship still exist. This study focuses on the challenges that both African American cancer survivors and caregivers face across the cancer continuum. Five African American focus groups examined cancer survivor and caregiver support needs. Focus groups were recorded, transcribed, and uploaded into Atlas.ti. Thematic content analysis was applied to the text during the coding process. Themes were identified and emphasized based on the research team's integrated and unified final codes. Forty-one African Americans participated in five focus groups: 22 cancer survivors and 19 caregivers. Participants discussed five themes: (1) a culture that discourages the discussion of cancer; (2) lack of support services for African American cancer survivors; (3) lack of support services for cancer caregivers; (4) need for culturally appropriate cancer resources, including resources targeted at African American women; and (5) aspects that were helpful to cancer survivors and caregivers, including connecting with other survivors and caregivers, and having strong social support networks. We gained new insight into the unmet support needs for survivors and caregivers, especially when coping with the cancer experience continuum. While some cancer and caregiver support services exist, our study reveals a great need for services that incorporate the cultural differences that exist across races.

  14. Photometric redshifts for the next generation of deep radio continuum surveys - II. Gaussian processes and hybrid estimates

    NASA Astrophysics Data System (ADS)

    Duncan, Kenneth J.; Jarvis, Matt J.; Brown, Michael J. I.; Röttgering, Huub J. A.

    2018-07-01

    Building on the first paper in this series (Duncan et al. 2018), we present a study investigating the performance of Gaussian process photometric redshift (photo-z) estimates for galaxies and active galactic nuclei (AGNs) detected in deep radio continuum surveys. A Gaussian process redshift code is used to produce photo-z estimates targeting specific subsets of both the AGN population - infrared (IR), X-ray, and optically selected AGNs - and the general galaxy population. The new estimates for the AGN population are found to perform significantly better at z > 1 than the template-based photo-z estimates presented in our previous study. Our new photo-z estimates are then combined with template estimates through hierarchical Bayesian combination to produce a hybrid consensus estimate that outperforms both of the individual methods across all source types. Photo-z estimates for radio sources that are X-ray sources or optical/IR AGNs are significantly improved in comparison to previous template-only estimates - with outlier fractions and robust scatter reduced by up to a factor of ~4. The ability of our method to combine the strengths of the two input photo-z techniques and the large improvements we observe illustrate its potential for enabling future exploitation of deep radio continuum surveys for both the study of galaxy and black hole coevolution and for cosmological studies.
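
    A rough sketch of the Gaussian-process photo-z idea using scikit-learn, followed by a simple inverse-variance stand-in for the hierarchical Bayesian combination with a template estimate. The photometry, redshift relation and error model are synthetic, and the combination rule is a simplification; this is not the code or data used in the paper.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(2)
# Synthetic training set: four hypothetical magnitudes and a spectroscopic redshift.
n_train = 300
mags = rng.uniform(18, 25, size=(n_train, 4))
z_spec = 0.15 * (mags[:, 0] - mags[:, 3]) + 0.05 * mags[:, 1] - 0.8
z_spec += rng.normal(scale=0.05, size=n_train)

kernel = 1.0 * RBF(length_scale=np.ones(4)) + WhiteKernel(noise_level=0.01)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(mags, z_spec)

test = rng.uniform(18, 25, size=(5, 4))
z_gp, z_gp_err = gp.predict(test, return_std=True)        # GP photo-z and its uncertainty

# Stand-in "template" estimates with their own (assumed) uncertainty, then a simple
# inverse-variance weighted combination as a proxy for the hierarchical Bayesian step.
z_template = z_gp + rng.normal(scale=0.1, size=5)
z_template_err = np.full(5, 0.15)
w_gp, w_tp = 1.0 / z_gp_err**2, 1.0 / z_template_err**2
z_hybrid = (w_gp * z_gp + w_tp * z_template) / (w_gp + w_tp)
print(z_hybrid)
```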

  15. Comprehensive Benchmark Suite for Simulation of Particle Laden Flows Using the Discrete Element Method with Performance Profiles from the Multiphase Flow with Interface eXchanges (MFiX) Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Peiyuan; Brown, Timothy; Fullmer, William D.

    Five benchmark problems are developed and simulated with the computational fluid dynamics and discrete element model code MFiX. The benchmark problems span dilute and dense regimes, consider statistically homogeneous and inhomogeneous (both clusters and bubbles) particle concentrations and a range of particle and fluid dynamic computational loads. Several variations of the benchmark problems are also discussed to extend the computational phase space to cover granular (particles only), bidisperse and heat transfer cases. A weak scaling analysis is performed for each benchmark problem and, in most cases, the scalability of the code appears reasonable up to approx. 10^3 cores. Profiling of the benchmark problems indicates that the most substantial computational time is being spent on particle-particle force calculations, drag force calculations and interpolating between discrete particle and continuum fields. Hardware performance analysis was also carried out showing significant Level 2 cache miss ratios and a rather low degree of vectorization. These results are intended to serve as a baseline for future developments to the code as well as a preliminary indicator of where to best focus performance optimizations.
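
    Weak-scaling efficiency, as used in the analysis above, compares wall-clock time on N cores against the single-core time for a fixed per-core problem size. The timings below are made-up placeholders, not MFiX benchmark results.

```python
# Weak scaling: the per-core problem size is held fixed, so the ideal is constant
# wall-clock time and efficiency(N) = t(1 core) / t(N cores).
timings = {1: 100.0, 8: 104.0, 64: 112.0, 512: 131.0, 1024: 168.0}   # seconds (placeholders)

t_ref = timings[1]
for cores, t in sorted(timings.items()):
    print(f"{cores:5d} cores: weak-scaling efficiency = {t_ref / t:.2f}")
```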

  16. SKIRT: Hybrid parallelization of radiative transfer simulations

    NASA Astrophysics Data System (ADS)

    Verstocken, S.; Van De Putte, D.; Camps, P.; Baes, M.

    2017-07-01

    We describe the design, implementation and performance of the new hybrid parallelization scheme in our Monte Carlo radiative transfer code SKIRT, which has been used extensively for modelling the continuum radiation of dusty astrophysical systems including late-type galaxies and dusty tori. The hybrid scheme combines distributed memory parallelization, using the standard Message Passing Interface (MPI) to communicate between processes, and shared memory parallelization, providing multiple execution threads within each process to avoid duplication of data structures. The synchronization between multiple threads is accomplished through atomic operations without high-level locking (also called lock-free programming). This improves the scaling behaviour of the code and substantially simplifies the implementation of the hybrid scheme. The result is an extremely flexible solution that adjusts to the number of available nodes, processors and memory, and consequently performs well on a wide variety of computing architectures.

  17. The polarization signature from the circumstellar disks of classical Be stars

    NASA Astrophysics Data System (ADS)

    Halonen, R. J.; Jones, C. E.

    2012-05-01

    The scattering of light in the nonspherical circumstellar envelopes of classical Be stars produces distinct polarimetric properties that can be used to investigate the physical nature of the scattering environment. Both the continuum and emission line polarization are potentially important diagnostic tools in the modeling of these systems. We combine the use of a new multiple scattering code with an established non-LTE radiative transfer code to study the characteristic wavelength-dependence of the intrinsic polarization of classical Be stars. We construct models using realistic chemical composition and self-consistent calculations of the thermal structure of the disk, and then determine the fraction of emergent polarized light. In particular, the aim of this theoretical research project is to investigate the effect of gas density and metallicity on the observed polarization properties of classical Be stars.

  18. Security of information in IT systems

    NASA Astrophysics Data System (ADS)

    Kaliczynska, Malgorzata

    2005-02-01

    The aim of the paper is to increase awareness of the dangers connected with social engineering methods of obtaining information. The article demonstrates psychological and sociological methods of influencing people that are used in attacks on IT systems. Little-known techniques are presented concerning one of the greater threats, namely electromagnetic emission (the corona effect). Moreover, the work shows methods of protecting against this type of danger. The paper also provides information on devices built according to the TEMPEST technology. The article not only discusses the methods of gathering information, but also explains how to protect against its uncontrolled loss.

  19. LAVA web-based remote simulation: enhancements for education and technology innovation

    NASA Astrophysics Data System (ADS)

    Lee, Sang Il; Ng, Ka Chun; Orimoto, Takashi; Pittenger, Jason; Horie, Toshi; Adam, Konstantinos; Cheng, Mosong; Croffie, Ebo H.; Deng, Yunfei; Gennari, Frank E.; Pistor, Thomas V.; Robins, Garth; Williamson, Mike V.; Wu, Bo; Yuan, Lei; Neureuther, Andrew R.

    2001-09-01

    The Lithography Analysis using Virtual Access (LAVA) web site at http://cuervo.eecs.berkeley.edu/Volcano/ has been enhanced with new optical and deposition applets, graphical infrastructure and linkage to parallel execution on networks of workstations. More than ten new graphical user interface applets have been designed to support education, illustrate novel concepts from research, and explore usage of parallel machines. These applets have been improved through feedback and classroom use. Over the last year LAVA provided industry and other academic communities with 1,300 sessions and 700 rigorous simulations per month among the SPLAT, SAMPLE2D, SAMPLE3D, TEMPEST, STORM, and BEBS simulators.

  20. Single ionization and capture cross sections from biological molecules by bare projectile impact*

    NASA Astrophysics Data System (ADS)

    Quinto, Michele A.; Monti, Juan M.; Montenegro, Pablo D.; Fojón, Omar A.; Champion, Christophe; Rivarola, Roberto D.

    2017-02-01

    We report calculations of single differential and total cross sections for single ionization and single electron capture from biological targets, namely, water vapor and DNA nucleobase molecules, by bare projectile impact: H+, He2+, and C6+. They are performed within the Continuum Distorted Wave - Eikonal Initial State approximation and compared to several existing experimental data sets. This study is aimed at obtaining a reliable set of theoretical data to be used as input in a Monte Carlo code intended for micro- and nanodosimetry.

  1. VizieR Online Data Catalog: NLTE spectral analysis of white dwarf G191-B2B (Rauch+, 2013)

    NASA Astrophysics Data System (ADS)

    Rauch, T.; Werner, K.; Bohlin, R.; Kruk, J. W.

    2013-08-01

    In the framework of the Virtual Observatory, the German Astrophysical Virtual Observatory developed the registered service TheoSSA. It provides easy access to stellar spectral energy distributions (SEDs) and is intended to ingest SEDs calculated by any model-atmosphere code. In case of the DA white dwarf G191-B2B, we demonstrate that the model reproduces not only its overall continuum shape but also the numerous metal lines exhibited in its ultraviolet spectrum. (3 data files).

  2. Metal Hydride and Alkali Halide Opacities in Extrasolar Giant Planets and Cool Stellar Atmospheres

    NASA Technical Reports Server (NTRS)

    Weck, Philippe F.; Stancil, Phillip C.; Kirby, Kate; Schweitzer, Andreas; Hauschildt, Peter H.

    2006-01-01

    The lack of accurate and complete molecular line and continuum opacity data has been a serious limitation to developing atmospheric models of cool stars and Extrasolar Giant Planets (EGPs). We report our recent calculations of molecular opacities resulting from the presence of metal hydrides and alkali halides. The resulting data have been included in the PHOENIX stellar atmosphere code (Hauschildt & Baron 1999). The new models, calculated using spherical geometry for all gravities considered, also incorporate our latest database of nearly 670 million molecular lines, and updated equations of state.

  3. Enabling Microscopic Simulators to Perform System Level Tasks: A System-Identification Based, Closure-on-Demand Toolkit for Multiscale Simulation Stability/Bifurcation Analysis, Optimization and Control

    DTIC Science & Technology

    2006-10-01

    The objective was to construct a bridge between existing and future microscopic simulation codes (kMC, MD, MC, BD, LB, etc.) and traditional, continuum... kinetic Monte Carlo, kMC, equilibrium MC, Lattice-Boltzmann, LB, Brownian Dynamics, BD, or general agent-based, AB) simulators. It also, fortuitously... cond-mat/0310460 at arXiv.org. 27. "Coarse Projective kMC Integration: Forward/Reverse Initial and Boundary Value Problems", R. Rico-Martinez, C. W
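
    The record above concerns the equation-free idea of wrapping system-level tasks around microscopic simulators. One of its building blocks, coarse projective integration, can be sketched as: lift a coarse state to a microscopic ensemble, run a short burst of the microscopic simulator, restrict back to the coarse variable, estimate its time derivative, and project forward with a large step. The toy "microscopic" model and all step sizes below are assumptions for illustration.

```python
import numpy as np

def micro_step(particles, dt):
    """Stand-in for an expensive microscopic simulator (e.g., a kMC/MD burst):
    here, noisy relaxation of each particle toward zero."""
    return particles + dt * (-particles) + np.sqrt(dt) * 0.05 * np.random.randn(particles.size)

def restrict(particles):
    return particles.mean()                             # coarse variable: ensemble mean

def lift(coarse_value, n):
    return coarse_value + 0.05 * np.random.randn(n)     # microscopic ensemble consistent with it

def coarse_projective_integration(u0, n_particles=2000, dt_micro=0.01,
                                  n_burst=20, dt_project=0.5, n_outer=10):
    u, history = u0, [u0]
    for _ in range(n_outer):
        particles = lift(u, n_particles)
        coarse = []
        for _ in range(n_burst):                        # short microscopic burst
            particles = micro_step(particles, dt_micro)
            coarse.append(restrict(particles))
        slope = (coarse[-1] - coarse[0]) / ((n_burst - 1) * dt_micro)
        u = coarse[-1] + dt_project * slope             # projective (extrapolation) step
        history.append(u)
    return history

print(coarse_projective_integration(u0=1.0))
```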

  4. Use of the Fracture Continuum Model for Numerical Modeling of Flow and Transport of Deep Geologic Disposal of Nuclear Waste in Crystalline Rock

    NASA Astrophysics Data System (ADS)

    Hadgu, T.; Kalinina, E.; Klise, K. A.; Wang, Y.

    2015-12-01

    Numerical modeling of disposal of nuclear waste in a deep geologic repository in fractured crystalline rock requires robust characterization of fractures. Various methods for fracture representation in granitic rocks exist. In this study we used the fracture continuum model (FCM) to characterize fractured rock for use in the simulation of flow and transport in the far field of a generic nuclear waste repository located at 500 m depth. The FCM approach is a stochastic method that maps the permeability of discrete fractures onto a regular grid. The method generates permeability fields using field observations of fracture sets. The original method described in McKenna and Reeves (2005) was designed for vertical fractures. The method has since been extended to incorporate fully three-dimensional representations of anisotropic permeability, multiple independent fracture sets, arbitrary fracture dips and orientations, and spatial correlation (Kalinina et al., 2012, 2014). For this study the numerical code PFLOTRAN (Lichtner et al., 2015) has been used to model flow and transport. PFLOTRAN solves a system of generally nonlinear partial differential equations describing multiphase, multicomponent and multiscale reactive flow and transport in porous materials. The code is designed to run on massively parallel computing architectures as well as workstations and laptops (e.g. Hammond et al., 2011). Benchmark tests were conducted to simulate flow and transport in a specified model domain. Distributions of fracture parameters were used to generate a selected number of realizations. For each realization, the FCM method was used to generate a permeability field of the fractured rock. The PFLOTRAN code was then used to simulate flow and transport in the domain. Simulation results and analysis are presented. The results indicate that the FCM approach is a viable method to model fractured crystalline rocks. The FCM is a computationally efficient way to generate realistic representations of complex fracture systems. This approach is of interest for nuclear waste disposal models applied over large domains.
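
    A toy version of the fracture-continuum mapping: stochastically generated fracture traces are rasterized onto a regular grid by assigning a high permeability to every cell a trace crosses. The fracture statistics and permeability values are illustrative placeholders, not those of the study.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100                                    # 100 x 100 grid
k_matrix, k_fracture = 1e-18, 1e-13        # matrix and fracture permeability, m^2 (assumed)
perm = np.full((n, n), k_matrix)

for _ in range(40):                        # 40 stochastic fractures (assumed count)
    x0, y0 = rng.uniform(0, n, size=2)                  # fracture center
    theta = rng.normal(loc=np.pi / 3, scale=0.2)        # preferred orientation of the set
    length = rng.exponential(scale=25.0)                # fracture trace length
    t = np.linspace(-length / 2, length / 2, 200)
    xs = np.clip((x0 + t * np.cos(theta)).astype(int), 0, n - 1)
    ys = np.clip((y0 + t * np.sin(theta)).astype(int), 0, n - 1)
    perm[ys, xs] = k_fracture              # cells intersected by the trace get fracture permeability

print("fraction of cells mapped as fracture:", np.mean(perm == k_fracture))
```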

  5. Aero-thermo-dynamic analysis of the Spaceliner-7.1 vehicle in high altitude flight

    NASA Astrophysics Data System (ADS)

    Zuppardi, Gennaro; Morsa, Luigi; Sippel, Martin; Schwanekamp, Tobias

    2014-12-01

    SpaceLiner, designed by DLR, is a visionary, extremely fast passenger transportation concept. It consists of two stages: a winged booster and a passenger vehicle. After separation of the two stages, the booster makes a controlled re-entry and returns to the launch site. According to the current project, version 7.1 of SpaceLiner (SpaceLiner-7.1), the vehicle should be brought to an altitude of 75 km and then released, undertaking the descent path. Considering that the SpaceLiner-7.1 vehicle could be brought to altitudes higher than 75 km, e.g. 100 km or above, and also for speculative purposes, this paper calculates the aerodynamic parameters of the SpaceLiner-7.1 vehicle over the whole transitional regime, from continuum, low-density flow to free-molecular flow. Computer simulations have been carried out with three codes: two DSMC codes, DS3V in the altitude interval 100-250 km for the evaluation of the global aerodynamic coefficients and DS2V at the altitude of 60 km for the evaluation of the heat flux and pressure distributions along the vehicle nose, and the DLR HOTSOSE code for the evaluation of the global aerodynamic coefficients in continuum, hypersonic flow at the altitude of 44.6 km. The effectiveness of the flaps with a deflection angle of -35 deg was evaluated in the above-mentioned altitude interval. The vehicle showed longitudinal stability over the whole altitude interval even with no flap. The global bridging formulae proved suitable for the evaluation of the aerodynamic coefficients in the altitude interval 80-100 km, where the computations cannot be carried out either by CFD, because of the failure of the classical equations for the transport coefficients, or by DSMC, because of the very high computer resources required both in terms of core storage (a high number of simulated molecules is needed) and of processing time.
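
    For readers unfamiliar with bridging relations, the short Python sketch below evaluates one commonly used sine-squared bridging function between a continuum and a free-molecular drag coefficient as a function of Knudsen number; the functional form and the sample coefficient values are illustrative assumptions, not the formulae or data used by the authors.

    import numpy as np

    def bridging_drag(kn, cd_cont, cd_fm):
        # Blend continuum and free-molecular drag coefficients across the
        # transitional regime with a sine-squared bridging function (one common
        # choice in the literature; illustrative only).
        log_kn = np.log10(kn)
        w = np.sin(np.pi / 8.0 * (3.0 + log_kn)) ** 2   # 0 at Kn=1e-3, 1 at Kn=10
        w = np.where(log_kn <= -3.0, 0.0, np.where(log_kn >= 1.0, 1.0, w))
        return cd_cont + (cd_fm - cd_cont) * w

    # Illustrative continuum and free-molecular limits (not SpaceLiner-7.1 values).
    for kn in [1e-4, 1e-2, 1e-1, 1.0, 10.0, 100.0]:
        print(f"Kn = {kn:8.4f}  CD = {bridging_drag(kn, cd_cont=0.8, cd_fm=2.0):.3f}")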

  6. Fidelity of the Integrated Force Method Solution

    NASA Technical Reports Server (NTRS)

    Hopkins, Dale; Halford, Gary; Coroneos, Rula; Patnaik, Surya

    2002-01-01

    The theory of strain compatibility in solid mechanics had remained incomplete since St. Venant's 'strain formulation' of 1876. We have addressed the compatibility condition both in the continuum and in the discrete system. This has led to the formulation of the Integrated Force Method (IFM). A dual Integrated Force Method with displacement as the primal variable has also been formulated. A modest finite element code (IFM/Analyzers) based on the IFM theory has been developed. For a set of standard test problems, the IFM results were compared with stiffness method solutions and with the MSC/Nastran code; for these problems, IFM outperformed the existing methods. The superior IFM performance is attributed to the simultaneous satisfaction of the equilibrium equations and the compatibility condition. The MSC/Nastran organization expressed reluctance to accept the high-fidelity IFM solutions. This report discusses the solutions to the examples. No inaccuracy was detected in the IFM solutions. With a small programming effort, a stiffness method code can be improved to reap the many IFM benefits when implemented with the IFMD elements. Dr. Halford conducted a peer review of the Integrated Force Method; the reviewers' responses are included.

  7. Features of Discontinuous Galerkin Algorithms in Gkeyll, and Exponentially-Weighted Basis Functions

    NASA Astrophysics Data System (ADS)

    Hammett, G. W.; Hakim, A.; Shi, E. L.

    2016-10-01

    There are various versions of Discontinuous Galerkin (DG) algorithms that have interesting features that could help with challenging problems of higher-dimensional kinetic problems (such as edge turbulence in tokamaks and stellarators). We are developing the gyrokinetic code Gkeyll based on DG methods. Higher-order methods do more FLOPS to extract more information per byte, thus reducing memory and communication costs (which are a bottleneck for exascale computing). The inner product norm can be chosen to preserve energy conservation with non-polynomial basis functions (such as Maxwellian-weighted bases), which alternatively can be viewed as a Petrov-Galerkin method. This allows a full- F code to benefit from similar Gaussian quadrature employed in popular δf continuum gyrokinetic codes. We show some tests for a 1D Spitzer-Härm heat flux problem, which requires good resolution for the tail. For two velocity dimensions, this approach could lead to a factor of 10 or more speedup. Supported by the Max-Planck/Princeton Center for Plasma Physics, the SciDAC Center for the Study of Plasma Microturbulence, and DOE Contract DE-AC02-09CH11466.
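
    As a toy illustration of why Maxwellian-weighted (Gaussian-quadrature-friendly) velocity bases are attractive, the Python sketch below uses Gauss-Hermite quadrature to evaluate Maxwellian-weighted inner products of polynomial basis functions exactly with a handful of nodes. This is a generic numerical-analysis demonstration under an assumed normalization, not code from Gkeyll.

    import numpy as np
    from numpy.polynomial.hermite import hermgauss

    def maxwellian_inner_product(p, q, n_nodes=8):
        # Compute <p, q> = int p(v) q(v) exp(-v^2/2)/sqrt(2*pi) dv with
        # Gauss-Hermite quadrature (exact when p*q is a polynomial of degree
        # < 2*n_nodes).  The substitution v = sqrt(2)*x maps the Maxwellian
        # weight onto the Hermite weight exp(-x^2).
        x, w = hermgauss(n_nodes)
        v = np.sqrt(2.0) * x
        return np.sum(w * p(v) * q(v)) / np.sqrt(np.pi)

    # Moments of a unit Maxwellian: <1,1> = 1, <v,v> = 1, <v^2,v^2> = 3.
    print(maxwellian_inner_product(lambda v: 1.0 + 0 * v, lambda v: 1.0 + 0 * v))
    print(maxwellian_inner_product(lambda v: v, lambda v: v))
    print(maxwellian_inner_product(lambda v: v**2, lambda v: v**2))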

  8. Case Management Ethics: High Professional Standards for Health Care's Interconnected Worlds.

    PubMed

    Sminkey, Patrice V; LeDoux, Jeannie

    2016-01-01

    The purpose of this discussion is to draw attention to the considerable pressure on professional case managers today to coordinate with multiple stakeholders, with responsibilities that put them at the forefront of contact with payers and providers. This discussion raises awareness of how case managers, and board-certified case managers in particular, must demonstrate that they adhere to the highest ethical standards, as codified by the Commission for Case Manager Certification's Code of Professional Conduct for Case Managers. This discussion applies to case management practices and work settings across the full continuum of health care. As advocates for clients (individuals receiving case management services) and their families/support systems, case managers must adhere to the highest of ethical and professional standards. The Code of Professional Conduct for Case Managers is an indispensable resource for case managers to ensure that they place the public interest above their own, respect the rights and inherent dignity of clients, maintain objectivity in their relationships with clients, and act with integrity and fidelity with clients and others, as stipulated by the code.

  9. Cultural and Technological Issues and Solutions for Geodynamics Software Citation

    NASA Astrophysics Data System (ADS)

    Heien, E. M.; Hwang, L.; Fish, A. E.; Smith, M.; Dumit, J.; Kellogg, L. H.

    2014-12-01

    Computational software and custom-written codes play a key role in scientific research and teaching, providing tools to perform data analysis and forward modeling through numerical computation. However, development of these codes is often hampered by the fact that there is no well-defined way for the authors to receive credit or professional recognition for their work through the standard methods of scientific publication and subsequent citation of the work. This in turn may discourage researchers from publishing their codes or making them easier for other scientists to use. We investigate the issues involved in citing software in a scientific context, and introduce features that should be components of a citation infrastructure, particularly oriented towards the codes and scientific culture in the area of geodynamics research. The codes used in geodynamics are primarily specialized numerical modeling codes for continuum mechanics problems; they may be developed by individual researchers, teams of researchers, geophysicists in collaboration with computational scientists and applied mathematicians, or by coordinated community efforts such as the Computational Infrastructure for Geodynamics. Some but not all geodynamics codes are open-source. These characteristics are common to many areas of geophysical software development and use. We provide background on the problem of software citation and discuss some of the barriers preventing adoption of such citations, including social/cultural barriers, insufficient technological support infrastructure, and an overall lack of agreement about what a software citation should consist of. We suggest solutions in an initial effort to create a system to support citation of software and promotion of scientific software development.

  10. Kinetic modeling of x-ray laser-driven solid Al plasmas via particle-in-cell simulation

    NASA Astrophysics Data System (ADS)

    Royle, R.; Sentoku, Y.; Mancini, R. C.; Paraschiv, I.; Johzaki, T.

    2017-06-01

    Solid-density plasmas driven by intense x-ray free-electron laser (XFEL) radiation are seeded by sources of nonthermal photoelectrons and Auger electrons that ionize and heat the target via collisions. Simulation codes that are commonly used to model such plasmas, such as collisional-radiative (CR) codes, typically assume a Maxwellian distribution and thus instantaneous thermalization of the source electrons. In this study, we present a detailed description and initial applications of a collisional particle-in-cell code, picls, that has been extended with a self-consistent radiation transport model and Monte Carlo models for photoionization and KLL Auger ionization, enabling the fully kinetic simulation of XFEL-driven plasmas. The code is used to simulate two experiments previously performed at the Linac Coherent Light Source investigating XFEL-driven solid-density Al plasmas. It is shown that picls-simulated pulse transmissions using the Ecker-Kröll continuum-lowering model agree much better with measurements than do simulations using the Stewart-Pyatt model. Good quantitative agreement is also found between the time-dependent picls results and those of analogous simulations by the CR code scfly, which was used in the analysis of the experiments to accurately reproduce the observed Kα emissions and pulse transmissions. Finally, it is shown that the effects of the nonthermal electrons are negligible for the conditions of the particular experiments under investigation.

  11. Interrogating Seyferts with NebulaBayes: Spatially Probing the Narrow-line Region Radiation Fields and Chemical Abundances

    NASA Astrophysics Data System (ADS)

    Thomas, Adam D.; Dopita, Michael A.; Kewley, Lisa J.; Groves, Brent A.; Sutherland, Ralph S.; Hopkins, Andrew M.; Blanc, Guillermo A.

    2018-04-01

    NebulaBayes is a new Bayesian code that implements a general method of comparing observed emission-line fluxes to photoionization model grids. The code enables us to extract robust, spatially resolved measurements of abundances in the extended narrow-line regions (ENLRs) produced by Active Galactic Nuclei (AGN). We observe near-constant ionization parameters but steeply radially declining pressures, which together imply that radiation pressure regulates the ENLR density structure on large scales. Our sample includes four “pure Seyfert” galaxies from the S7 survey that have extensive ENLRs. NGC 2992 shows steep metallicity gradients from the nucleus into the ionization cones. An inverse metallicity gradient is observed in ESO 138-G01, which we attribute to a recent gas inflow or minor merger. A uniformly high metallicity and hard ionizing continuum are inferred across the ENLR of Mrk 573. Our analysis of IC 5063 is likely affected by contamination from shock excitation, which appears to soften the inferred ionizing spectrum. The peak of the ionizing continuum E_peak is determined by the nuclear spectrum and the absorbing column between the nucleus and the ionized nebula. We cannot separate variation in this intrinsic E_peak from the effects of shock or H II region contamination, but E_peak measurements nevertheless give insights into ENLR excitation. We demonstrate the general applicability of NebulaBayes by analyzing a nuclear spectrum from the non-active galaxy NGC 4691 using a H II region grid. The NLR and H II region model grids are provided with NebulaBayes for use by the astronomical community.
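
    The core operation described here, comparing a set of observed line fluxes to a grid of model fluxes and forming a posterior over the grid parameters, can be sketched in a few lines of Python; the grid, line list, toy flux formulas, error model, and flat prior below are illustrative assumptions rather than the actual NebulaBayes interface.

    import numpy as np

    # Hypothetical model grid: fluxes of 3 emission lines over a 2D parameter
    # grid (log U = ionization parameter, 12 + log O/H = oxygen abundance).
    logU = np.linspace(-4.0, -2.0, 21)
    logOH = np.linspace(8.0, 9.2, 25)
    U, Z = np.meshgrid(logU, logOH, indexing="ij")

    # Toy model fluxes normalized to Hbeta (purely illustrative functional forms).
    model = np.stack([
        10 ** (0.5 * (Z - 8.6) + 0.1 * (U + 3.0)),   # e.g. [NII]/Hbeta
        10 ** (-0.4 * (Z - 8.6) + 0.6 * (U + 3.0)),  # e.g. [OIII]/Hbeta
        np.ones_like(U),                             # Hbeta itself
    ])

    observed = np.array([1.1, 2.3, 1.0])
    errors = np.array([0.1, 0.2, 0.05])

    # Gaussian likelihood on each grid point, flat prior, normalized posterior.
    chi2 = np.sum(((model - observed[:, None, None]) / errors[:, None, None]) ** 2, axis=0)
    posterior = np.exp(-0.5 * (chi2 - chi2.min()))
    posterior /= posterior.sum()

    # Best-fit grid point and marginalized (posterior-mean) parameter estimates.
    best = np.unravel_index(np.argmax(posterior), posterior.shape)
    print("best-fit (log U, 12+log O/H):", U[best], Z[best])
    print("posterior mean log U        :", np.sum(posterior * U))
    print("posterior mean 12+log O/H   :", np.sum(posterior * Z))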

  12. Moment Tensor Descriptions for Simulated Explosions of the Source Physics Experiment (SPE)

    NASA Astrophysics Data System (ADS)

    Yang, X.; Rougier, E.; Knight, E. E.; Patton, H. J.

    2014-12-01

    In this research we seek to understand damage mechanisms governing the behavior of geo-materials in the explosion source region, and the role they play in seismic-wave generation. Numerical modeling tools can be used to describe these mechanisms through the development and implementation of appropriate material models. Researchers at Los Alamos National Laboratory (LANL) have been working on a novel continuum-based-viscoplastic strain-rate-dependent fracture material model, AZ_Frac, in an effort to improve the description of these damage sources. AZ_Frac has the ability to describe continuum fracture processes, and at the same time, to handle pre-existing anisotropic material characteristics. The introduction of fractures within the material generates further anisotropic behavior that is also accounted for within the model. The material model has been calibrated to a granitic medium and has been applied in a number of modeling efforts under the SPE project. In our modeling, we use a 2D, axisymmetric layered earth model of the SPE site consisting of a weathered layer on top of a half-space. We couple the hydrodynamic simulation code with a seismic simulation code and propagate the signals to distances of up to 2 km. The signals are inverted for time-dependent moment tensors using a modified inversion scheme that accounts for multiple sources at different depths. The inversion scheme is evaluated for its resolving power to determine a centroid depth and a moment tensor description of the damage source. The capabilities of the inversion method to retrieve such information from waveforms recorded on three SPE tests conducted to date are also being assessed.

  13. The Herschel view of GAS in Protoplanetary Systems (GASPS). First comparisons with a large grid of models

    NASA Astrophysics Data System (ADS)

    Pinte, C.; Woitke, P.; Ménard, F.; Duchêne, G.; Kamp, I.; Meeus, G.; Mathews, G.; Howard, C. D.; Grady, C. A.; Thi, W.-F.; Tilling, I.; Augereau, J.-C.; Dent, W. R. F.; Alacid, J. M.; Andrews, S.; Ardila, D. R.; Aresu, G.; Barrado, D.; Brittain, S.; Ciardi, D. R.; Danchi, W.; Eiroa, C.; Fedele, D.; de Gregorio-Monsalvo, I.; Heras, A.; Huelamo, N.; Krivov, A.; Lebreton, J.; Liseau, R.; Martin-Zaïdi, C.; Mendigutía, I.; Montesinos, B.; Mora, A.; Morales-Calderon, M.; Nomura, H.; Pantin, E.; Pascucci, I.; Phillips, N.; Podio, L.; Poelman, D. R.; Ramsay, S.; Riaz, B.; Rice, K.; Riviere-Marichalar, P.; Roberge, A.; Sandell, G.; Solano, E.; Vandenbussche, B.; Walker, H.; Williams, J. P.; White, G. J.; Wright, G.

    2010-07-01

    The Herschel GASPS key program is a survey of the gas phase of protoplanetary discs, targeting 240 objects which cover a large range of ages, spectral types, and disc properties. To interpret this large quantity of data and initiate self-consistent analyses of the gas and dust properties of protoplanetary discs, we have combined the capabilities of the radiative transfer code MCFOST with the gas thermal balance and chemistry code ProDiMo to compute a grid of ≈300 000 disc models (DENT). We present a comparison of the first Herschel/GASPS line and continuum data with the predictions from the DENT grid of models. Our objective is to test some of the main trends already identified in the DENT grid, as well as to define better empirical diagnostics to estimate the total gas mass of protoplanetary discs. Photospheric UV radiation appears to be the dominant gas-heating mechanism for Herbig stars, whereas UV excess and/or X-rays emission dominates for T Tauri stars. The DENT grid reveals the complexity in the analysis of far-IR lines and the difficulty to invert these observations into physical quantities. The combination of Herschel line observations with continuum data and/or with rotational lines in the (sub-)millimetre regime, in particular CO lines, is required for a detailed characterisation of the physical and chemical properties of circumstellar discs. Herschel is an ESA space observatory with science instruments provided by European-led Principal Investigator consortia and with important participation from NASA.

  14. Ultraviolet spectrophotometry of three LINERs

    NASA Technical Reports Server (NTRS)

    Goodrich, R. W.; Keel, W. C.

    1986-01-01

    Three galaxies known to be LINERs were observed spectroscopically in the ultraviolet in an attempt to detect the presumed nonthermal continuum source thought to be the source of photoionization in the nuclei. NGC 4501 was found to be too faint for study with the IUE spectrographs, while NGC 5005 had an extended ultraviolet light profile. Comparison with the optical light profile of NGC 5005 indicates that the ultraviolet source is distributed spatially in the same manner as the optical starlight, probably indicating that the ultraviolet excess is due to a component of hot stars in the nucleus. These stars contribute detectable absorption features longward of 2500 A; together with optical data, the IUE spectra suggest a burst of star formation about 1 billion yr ago, with a lower rate continuing to produce a few OB stars. In NGC 4579, a point source contributing most of the ultraviolet excess is found that is much different than the optical light distribution. Furthermore, the ultraviolet to X-ray spectral index in NGC 4579 is 1.4, compatible with the UV to X-ray indices found for samples of Seyfert galaxies. This provides compelling evidence for the detection of the photoionizing continuum in NGC 4579 and draws the research fields of normal galaxies and active galactic nuclei closer together. The emission-line spectrum of NGC 4579 is compared with calculations from a photoionization code, CLOUDY, and several shock models. The photoionization code is found to give superior results, adding to the increasing weight of evidence that the LINER phenomenon is essentially a scaled-down version of the Seyfert phenomenon.

  15. Role of zonal flows in trapped electron mode turbulence through nonlinear gyrokinetic particle and continuum simulation

    NASA Astrophysics Data System (ADS)

    Ernst, D. R.; Lang, J.; Nevins, W. M.; Hoffman, M.; Chen, Y.; Dorland, W.; Parker, S.

    2009-05-01

    Trapped electron mode (TEM) turbulence exhibits a rich variety of collisional and zonal flow physics. This work explores the parametric variation of zonal flows and underlying mechanisms through a series of linear and nonlinear gyrokinetic simulations, using both particle-in-cell and continuum methods. A new stability diagram for electron modes is presented, identifying a critical boundary at ηe=1, separating long and short wavelength TEMs. A novel parity test is used to separate TEMs from electron temperature gradient driven modes. A nonlinear scan of ηe reveals fine scale structure for ηe≳1, consistent with linear expectation. For ηe<1, zonal flows are the dominant saturation mechanism, and TEM transport is insensitive to ηe. For ηe>1, zonal flows are weak, and TEM transport falls inversely with a power law in ηe. The role of zonal flows appears to be connected to linear stability properties. Particle and continuum methods are compared in detail over a range of ηe=d ln Te/d ln ne values from zero to five. Linear growth rate spectra, transport fluxes, fluctuation wavelength spectra, zonal flow shearing spectra, and correlation lengths and times are in close agreement. In addition to identifying the critical parameter ηe for TEM zonal flows, this paper takes a challenging step in code verification, directly comparing very different methods of simulating simultaneous kinetic electron and ion dynamics in TEM turbulence.

  16. Gravitational instabilities in a protosolar-like disc - II. Continuum emission and mass estimates

    NASA Astrophysics Data System (ADS)

    Evans, M. G.; Ilee, J. D.; Hartquist, T. W.; Caselli, P.; Szűcs, L.; Purser, S. J. D.; Boley, A. C.; Durisen, R. H.; Rawlings, J. M. C.

    2017-09-01

    Gravitational instabilities (GIs) are most likely a fundamental process during the early stages of protoplanetary disc formation. Recently, there have been detections of spiral features in young, embedded objects that appear consistent with GI-driven structure. It is crucial to perform hydrodynamic and radiative transfer simulations of gravitationally unstable discs in order to assess the validity of GIs in such objects, and constrain optimal targets for future observations. We utilize the radiative transfer code lime (Line modelling Engine) to produce continuum emission maps of a 0.17 M⊙ self-gravitating protosolar-like disc. We note the limitations of using lime as is and explore methods to improve upon the default gridding. We use casa to produce synthetic observations of 270 continuum emission maps generated across different frequencies, inclinations and dust opacities. We find that the spiral structure of our protosolar-like disc model is distinguishable across the majority of our parameter space after 1 h of observation, and is especially prominent at 230 GHz due to the favourable combination of angular resolution and sensitivity. Disc mass derived from the observations is sensitive to the assumed dust opacities and temperatures, and therefore can be underestimated by a factor of at least 30 at 850 GHz and 2.5 at 90 GHz. As a result, this effect could retrospectively validate GIs in discs previously thought not massive enough to be gravitationally unstable, which could have a significant impact on the understanding of the formation and evolution of protoplanetary discs.

  17. Perinatal depression: a review of US legislation and law.

    PubMed

    Rhodes, Ann M; Segre, Lisa S

    2013-08-01

    Accumulating research documenting the prevalence and negative effects of perinatal depression, together with highly publicized tragic critical incidents of suicide and filicide by mothers with postpartum psychosis, have fueled a continuum of legislation. Specialists in perinatal mental health should recognize how their work influences legislative initiatives and penal codes, and take this into consideration when developing perinatal services and research. Yet, without legal expertise, the status of legislative initiatives can be confusing. To address this shortfall, we assembled an interdisciplinary team of academics specializing in law, as well as perinatal mental health, to summarize these issues. This review presents the relevant federal and state legislation and summarizes the criminal codes that governed the court decisions on cases in which a mother committed filicide because of postpartum psychosis. Moreover, the review aims to help researchers and providers who specialize in perinatal depression understand their role in this legal landscape.

  18. Flexible Automatic Discretization for Finite Differences: Eliminating the Human Factor

    NASA Astrophysics Data System (ADS)

    Pranger, Casper

    2017-04-01

    In the geophysical numerical modelling community, finite differences are (in part due to their small footprint) a popular spatial discretization method for PDEs in the regular-shaped continuum that is the earth. However, they rapidly become prone to programming mistakes when the physics increases in complexity. To eliminate opportunities for human error, we have designed an automatic discretization algorithm using Wolfram Mathematica, in which the user supplies symbolic PDEs, the number of spatial dimensions, and a choice of symbolic boundary conditions, and the script transforms this information into matrix- and right-hand-side rules ready for use in a C++ code that will accept them. The symbolic PDEs are further used to automatically develop and perform manufactured solution benchmarks, ensuring at all stages physical fidelity while providing pragmatic targets for numerical accuracy. We find that this procedure greatly accelerates code development and provides a great deal of flexibility in one's choice of physics.
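
    To make the idea concrete, here is a minimal Python sketch (not the authors' Mathematica toolchain) that assembles a 1D second-derivative finite-difference matrix and checks it against a manufactured solution u(x) = sin(pi x), for which -u'' = pi^2 sin(pi x); the observed second-order convergence of the error plays the role of the automated benchmark described above.

    import numpy as np

    def assemble_laplacian(n, h):
        # Standard 3-point finite-difference matrix for -d2u/dx2 with
        # homogeneous Dirichlet boundaries on a uniform grid of n interior points.
        A = np.zeros((n, n))
        for i in range(n):
            A[i, i] = 2.0 / h**2
            if i > 0:
                A[i, i - 1] = -1.0 / h**2
            if i < n - 1:
                A[i, i + 1] = -1.0 / h**2
        return A

    def manufactured_error(n):
        h = 1.0 / (n + 1)
        x = np.linspace(h, 1.0 - h, n)
        u_exact = np.sin(np.pi * x)
        rhs = np.pi**2 * np.sin(np.pi * x)   # -u'' for the manufactured solution
        u_num = np.linalg.solve(assemble_laplacian(n, h), rhs)
        return np.max(np.abs(u_num - u_exact))

    # Halving h should reduce the error by roughly 4x for a second-order scheme.
    for n in (20, 40, 80, 160):
        print(n, manufactured_error(n))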

  19. Experimental differential cross sections, level densities, and spin cutoffs as a testing ground for nuclear reaction codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Voinov, Alexander V.; Grimes, Steven M.; Brune, Carl R.

    Proton double-differential cross sections from 59Co(α,p) 62Ni, 57Fe(α,p) 60Co, 56Fe( 7Li,p) 62Ni, and 55Mn( 6Li,p) 60Co reactions have been measured with 21-MeV α and 15-MeV lithium beams. Cross sections have been compared against calculations with the empire reaction code. Different input level density models have been tested. It was found that the Gilbert and Cameron [A. Gilbert and A. G. W. Cameron, Can. J. Phys. 43, 1446 (1965)] level density model is best to reproduce experimental data. Level densities and spin cutoff parameters for 62Ni and 60Co above the excitation energy range of discrete levels (in continuum) have been obtained with a Monte Carlo technique. Furthermore, excitation energy dependencies were found to be inconsistent with the Fermi-gas model.

  20. Application of electron closures in extended MHD

    NASA Astrophysics Data System (ADS)

    Held, Eric; Adair, Brett; Taylor, Trevor

    2017-10-01

    Rigorous closure of the extended MHD equations in plasma fluid codes includes the effects of electron heat conduction along perturbed magnetic fields and contributions of the electron collisional friction and stress to the extended Ohm's law. In this work we discuss the application of a continuum numerical solution to the Chapman-Enskog-like electron drift kinetic equation using the NIMROD code. The implementation is a tightly-coupled fluid/kinetic system that carefully addresses time-centering in the advance of the fluid variables with their kinetically-computed closures. Comparisons of spatial accuracy, computational efficiency and required velocity space resolution are presented for applications involving growing magnetic islands in cylindrical and toroidal geometry. The reduction in parallel heat conduction due to particle trapping in toroidal geometry is emphasized. Work supported by DOE under Grant Nos. DE-FC02-08ER54973 and DE-FG02-04ER54746.

  1. Laboratory simulation of photoionized plasma among astronomical compact objects

    NASA Astrophysics Data System (ADS)

    Fujioka, Shinsuke; Yamamoto, Norimasa; Wang, Feilu; Salzmann, David; Li, Yutong; Rhee, Yong-Joo; Nishimura, Hiroaki; Takabe, Hideaki; Mima, Kunioki

    2008-11-01

    X-ray line emission with photon energies of several keV has been observed from photoionized accreting clouds, for example in CYGNUS X-3 and VELA X-1, which are exposed to the hard x-ray continuum from compact objects such as neutron stars, black holes, or white dwarfs, even though the accreting clouds themselves are thermally cold. This x-ray continuum-induced line emission gives good insight into the accreting clouds. We will present a novel laboratory simulation of the photoionized plasma under well-characterized conditions using a high-power laser facility. A blackbody radiator with a temperature of 500 eV, as a miniature of a hot compact object, was created. A silicon (Si) plasma with an electron temperature of 30 eV was produced in the vicinity of the 0.5-keV blackbody radiator. Line emission from lithium- and helium-like Si ions was clearly observed around 2 keV photon energy from the thermally cold Si plasma; this result can hardly be interpreted without taking photoionization into account. An atomic kinetics code reveals the importance of inner-shell ionization directly caused by the incoming hard x-rays.

  2. Numerical simulation of the multiple core localized low shear toroidal Alfvenic eigenmodes

    NASA Astrophysics Data System (ADS)

    Wang, Wenjia; Zhou, Deng; Hu, Youjun; Ming, Yue

    2018-03-01

    In modern tokamak experiments, scenarios with weak central magnetic shear have been proposed, and it is necessary to study Alfvenic mode activity in such scenarios. Theoretical studies have predicted the multiplicity of core-localized toroidally induced Alfvenic eigenmodes for ɛ/s > 1, where ɛ is the inverse aspect ratio and s is the magnetic shear. In the present work we numerically investigate the existence and multiplicity of core-localized TAEs and their mode characteristics using the NOVA code. We first verify the existence of the multiplicity for a zero-beta plasma and of the even mode in the forbidden zone. For a finite-beta plasma, the mode parities become more distinguishable: the frequencies of odd modes are close to the upper tip of the continuum, while the frequencies of even modes are close to the lower tip, and the frequencies are well separated by the forbidden zone. With increasing ɛ/s, more modes with multiple radial nodes appear, in agreement with theoretical prediction. The discrepancy between theoretical prediction and our numerical simulation is also discussed in the main text.

  3. COSMIC REIONIZATION ON COMPUTERS. ULTRAVIOLET CONTINUUM SLOPES AND DUST OPACITIES IN HIGH REDSHIFT GALAXIES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khakhaleva-Li, Zimu; Gnedin, Nickolay Y., E-mail: zimu@uchicago.edu, E-mail: gnedin@fnal.gov

    We compare the properties of stellar populations of model galaxies from the Cosmic Reionization On Computers (CROC) project with the existing ultraviolet (UV) and IR data. Since CROC simulations do not follow cosmic dust directly, we adopt two variants of the dust-follows-metals ansatz to populate model galaxies with dust. Using the dust radiative transfer code Hyperion, we compute synthetic stellar spectra, UV continuum slopes, and IR fluxes for simulated galaxies. We find that the simulation results generally match observational measurements, but, perhaps, not in full detail. The differences seem to indicate that our adopted dust-follows-metals ansatzes are not fully sufficient. While the discrepancies with the existing data are marginal, the future James Webb Space Telescope (JWST) data will be of much higher precision, rendering highly significant any tentative difference between theory and observations. It is, therefore, likely that in order to fully utilize the precision of JWST observations, fully dynamical modeling of dust formation, evolution, and destruction may be required.

  4. Timing the warm absorber in NGC4051

    NASA Astrophysics Data System (ADS)

    Silva, C.; Uttley, P.; Costantini, E.

    2015-07-01

    In this work we have combined spectral and timing analysis in the characterization of highly ionized outflows in Seyfert galaxies, the so-called warm absorbers. Here, we present our results on the extensive ˜600ks of XMM-Newton archival observations of the bright and highly variable Seyfert 1 galaxy NGC4051, whose spectrum has revealed a complex multi-component wind. Working simultaneously with RGS and PN data, we have performed a detailed analysis using a time-dependent photoionization code in combination with spectral and Fourier timing techniques. This method allows us to study in detail the response of the gas due to variations in the ionizing flux of the central source. As a result, we will show the contribution of the recombining gas to the time delays of the most highly absorbed energy bands relative to the continuum (Silva, Uttley & Costantini in prep.), which is also vital information for interpreting the continuum lags associated with propagation and reverberation effects in the inner emitting regions. Furthermore, we will illustrate how this powerful method can be applied to other sources and warm-absorber configurations, allowing for a wide range of studies.

  5. An object oriented code for simulating supersymmetric Yang-Mills theories

    NASA Astrophysics Data System (ADS)

    Catterall, Simon; Joseph, Anosh

    2012-06-01

    We present SUSY_LATTICE - a C++ program that can be used to simulate certain classes of supersymmetric Yang-Mills (SYM) theories, including the well known N=4 SYM in four dimensions, on a flat Euclidean space-time lattice. Discretization of SYM theories is an old problem in lattice field theory. It has resisted solution until recently when new ideas drawn from orbifold constructions and topological field theories have been brought to bear on the question. The result has been the creation of a new class of lattice gauge theories in which the lattice action is invariant under one or more supersymmetries. The resultant theories are local, free of doublers and also possess exact gauge-invariance. In principle they form the basis for a truly non-perturbative definition of the continuum SYM theories. In the continuum limit they reproduce versions of the SYM theories formulated in terms of twisted fields, which on a flat space-time is just a change of the field variables. In this paper, we briefly review these ideas and then go on to provide the details of the C++ code. We sketch the design of the code, with particular emphasis being placed on SYM theories with N=(2,2) in two dimensions and N=4 in three and four dimensions, making one-to-one comparisons between the essential components of the SYM theories and their corresponding counterparts appearing in the simulation code. The code may be used to compute several quantities associated with the SYM theories such as the Polyakov loop, mean energy, and the width of the scalar eigenvalue distributions.

    Program summary
    Program title: SUSY_LATTICE
    Catalogue identifier: AELS_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AELS_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC license, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 9315
    No. of bytes in distributed program, including test data, etc.: 95 371
    Distribution format: tar.gz
    Programming language: C++
    Computer: PCs and Workstations
    Operating system: Any, tested on Linux machines
    Classification: 11.6
    Nature of problem: To compute some of the observables of supersymmetric Yang-Mills theories such as supersymmetric action, Polyakov/Wilson loops, scalar eigenvalues and Pfaffian phases.
    Solution method: We use the Rational Hybrid Monte Carlo algorithm followed by a Leapfrog evolution and a Metropolis test. The input parameters of the model are read in from a parameter file.
    Restrictions: This code applies only to supersymmetric gauge theories with extended supersymmetry, which undergo the process of maximal twisting. (See Section 2 of the manuscript for details.)
    Running time: From a few minutes to several hours depending on the amount of statistics needed.
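
    For orientation, the sketch below shows a generic Hybrid Monte Carlo update - leapfrog integration of a fictitious Hamiltonian followed by a Metropolis accept/reject step - applied to a toy one-site action S(phi) = phi^2/2 + lambda*phi^4. It only illustrates the algorithmic skeleton named in the program summary (RHMC additionally uses a rational approximation for the fermion determinant) and is unrelated to the actual SUSY_LATTICE sources.

    import numpy as np

    rng = np.random.default_rng(0)
    LAM = 0.1

    def action(phi):
        return 0.5 * phi**2 + LAM * phi**4

    def force(phi):                       # -dS/dphi
        return -(phi + 4.0 * LAM * phi**3)

    def hmc_update(phi0, n_steps=20, dt=0.05):
        # One HMC trajectory: draw a momentum, integrate the fictitious
        # dynamics with leapfrog, then apply a Metropolis accept/reject test.
        phi, p = phi0, rng.normal()
        h_old = 0.5 * p**2 + action(phi)

        p += 0.5 * dt * force(phi)        # half-step kick
        for _ in range(n_steps - 1):
            phi += dt * p                 # drift
            p += dt * force(phi)          # kick
        phi += dt * p
        p += 0.5 * dt * force(phi)        # final half-step kick

        h_new = 0.5 * p**2 + action(phi)
        if rng.random() < np.exp(h_old - h_new):
            return phi, True              # accepted
        return phi0, False                # rejected: keep the old field

    phi, n_acc, samples = 0.0, 0, []
    for _ in range(5000):
        phi, accepted = hmc_update(phi)
        n_acc += accepted
        samples.append(phi)

    print("acceptance rate:", n_acc / 5000)
    print("<phi^2> =", np.mean(np.array(samples[500:]) ** 2))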

  6. Electric tempest in a teacup: The tea leaf analogy to microfluidic blood plasma separation

    NASA Astrophysics Data System (ADS)

    Yeo, Leslie Y.; Friend, James R.; Arifin, Dian R.

    2006-09-01

    In a similar fashion to Einstein's tea leaf paradox, the rotational liquid flow induced by ionic wind above a liquid surface can trap suspended microparticles by a helical motion, spinning them down towards a bottom stagnation point. The motion is similar to Batchelor [Q. J. Mech. Appl. Math. 4, 29 (1951)] flows occurring between stationary and rotating disks and arises due to a combination of the primary azimuthal and secondary bulk meridional recirculation that produces a centrifugal and enhanced inward radial force near the chamber bottom. The technology is thus useful for microfluidic particle trapping/concentration; the authors demonstrate its potential for rapid erythrocyte/blood plasma separation for miniaturized medical diagnostic kits.

  7. Bragg x-ray survey spectrometer for ITER.

    PubMed

    Varshney, S K; Barnsley, R; O'Mullane, M G; Jakhar, S

    2012-10-01

    Several potential impurity ions in the ITER plasmas will lead to loss of confined energy through line and continuum emission. For real time monitoring of impurities, a seven channel Bragg x-ray spectrometer (XRCS survey) is considered. This paper presents design and analysis of the spectrometer, including x-ray tracing by the Shadow-XOP code, sensitivity calculations for reference H-mode plasma and neutronics assessment. The XRCS survey performance analysis shows that the ITER measurement requirements of impurity monitoring in 10 ms integration time at the minimum levels for low-Z to high-Z impurity ions can largely be met.

  8. A Transversely Isotropic Thermoelastic Theory

    NASA Technical Reports Server (NTRS)

    Arnold, S. M.

    1989-01-01

    A continuum theory is presented for representing the thermoelastic behavior of composites that can be idealized as transversely isotropic. This theory is consistent with anisotropic viscoplastic theories being developed presently at NASA Lewis Research Center. A multiaxial statement of the theory is presented, as well as plane stress and plane strain reductions. Experimental determination of the required material parameters and their theoretical constraints are discussed. Simple homogeneously stressed elements are examined to illustrate the effect of fiber orientation on the resulting strain distribution. Finally, the multiaxial stress-strain relations are expressed in matrix form to simplify and accelerate implementation of the theory into structural analysis codes.
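
    For reference, one standard textbook form of the transversely isotropic thermoelastic strain-stress relations, written here for a material whose axis of symmetry is z; the five elastic constants (E_p, E_z, nu_p, nu_zp, G_zp) and two expansion coefficients (alpha_p, alpha_z) follow a common convention and are not necessarily the parameter set or notation adopted in the NASA report.

    \begin{aligned}
    \varepsilon_{xx} &= \frac{\sigma_{xx}}{E_p} - \nu_p\frac{\sigma_{yy}}{E_p} - \nu_{zp}\frac{\sigma_{zz}}{E_z} + \alpha_p\,\Delta T,\\
    \varepsilon_{yy} &= -\nu_p\frac{\sigma_{xx}}{E_p} + \frac{\sigma_{yy}}{E_p} - \nu_{zp}\frac{\sigma_{zz}}{E_z} + \alpha_p\,\Delta T,\\
    \varepsilon_{zz} &= -\nu_{zp}\frac{\sigma_{xx}+\sigma_{yy}}{E_z} + \frac{\sigma_{zz}}{E_z} + \alpha_z\,\Delta T,\\
    \gamma_{yz} &= \frac{\tau_{yz}}{G_{zp}}, \qquad
    \gamma_{xz} = \frac{\tau_{xz}}{G_{zp}}, \qquad
    \gamma_{xy} = \frac{2(1+\nu_p)}{E_p}\,\tau_{xy}.
    \end{aligned}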

  9. VizieR Online Data Catalog: 6 & 1.3cm deep VLA obs. toward 58 high-mass SFRs (Rosero+, 2016)

    NASA Astrophysics Data System (ADS)

    Rosero, V.; Hofner, P.; Claussen, M.; Kurtz, S.; Cesaroni, R.; Araya, E. D.; Carrasco-Gonzalez, C.; Rodriguez, L. F.; Menten, K. M.; Wyrowski, F.; Loinard, L.; Ellingsen, S. P.

    2017-01-01

    VLA continuum observations (project codes 10B-124 and 13B-210) at 6 and 1.3cm were made for all sources in the sample. The 6cm observations were made in the A configuration between 2011 June and August, providing a typical angular resolution of about 0.4". The 1.3cm observations were made in the B configuration, acquiring the first half of the data between 2010 November and 2011 May, and the second half between 2013 November and 2014 January. (2 data files).

  10. EXCALIBUR-at-CALIBAN: a neutron transmission experiment for 238U(n,n'_continuum γ) nuclear data validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bernard, David; Leconte, Pierre; Destouches, Christophe

    2015-07-01

    Two recent papers motivated a new experimental program to provide a new basis for the validation of 238U nuclear data, namely neutron-induced inelastic scattering, and of transport codes at neutron fission energies. The general idea is to perform a neutron transmission experiment through natural uranium material. As shown by Hans Bethe, neutron transmissions measured by dosimetric responses are linked to inelastic cross sections. This paper describes the principle and the results of such an experiment, called EXCALIBUR, performed recently (January and October 2014) at the CALIBAN reactor facility. (authors)

  11. Efficient radiative transfer methods for continuum and line transfer in large three-dimensional models

    NASA Astrophysics Data System (ADS)

    Juvela, Mika J.

    The relationship between physical conditions of an interstellar cloud and the observed radiation is defined by the radiative transfer problem. Radiative transfer calculations are needed if, e.g., one wants to disentangle abundance variations from excitation effects or wants to model variations of dust properties inside an interstellar cloud. New observational facilities (e.g., ALMA and Herschel) will bring improved accuracy both in terms of intensity and spatial resolution. This will enable detailed studies of the densest sub-structures of interstellar clouds and star forming regions. Such observations must be interpreted with accurate radiative transfer methods and realistic source models. In many cases this will mean modelling in three dimensions. High optical depths and the observed wide range of linear scales are, however, challenging for radiative transfer modelling. A large range of linear scales can be accessed only with hierarchical models. Figure 1 shows an example of the use of a hierarchical grid for radiative transfer calculations when the original model cloud (L=10 pc, <n>=500 cm-3) was based on an MHD simulation carried out on a regular grid (Juvela & Padoan, 2005). For computed line intensities an accuracy of 10% was still reached when the number of individual cells (and the run time) was reduced by a factor of ten. This illustrates how, as long as the cloud is not extremely optically thick, most of the emission comes from a small sub-volume. It is also worth noting that while errors are ~10% for any given point they are much smaller when compared with intensity variations. In particular, calculations on a hierarchical grid recovered the spatial power spectrum of line emission with very good accuracy. Monte Carlo codes are used widely in both continuum and line transfer calculations. Like any lambda iteration scheme, these suffer from slow convergence when models are optically thick. In line transfer Accelerated Monte Carlo methods (AMC) present a partial solution to this problem (Juvela & Padoan, 2000; Hogerheijde & van der Tak, 2000). AMC methods can be used similarly in continuum calculations to speed up the computation of dust temperatures (Juvela, 2005). The sampling problems associated with high optical depths can be solved with weighted sampling, and the handling of models with τV ~ 1000 is perfectly feasible. Transiently heated small dust grains pose another problem because the calculation of their temperature distribution is very time consuming. However, a 3D model will contain thousands of cells at very similar conditions. If dust temperature distributions are calculated only once for such a set, an approximate solution can be found in a much shorter time (Juvela & Padoan, 2003; see Figure 2a). MHD simulations with Automatic Mesh Refinement (AMR) techniques present an exciting development for the modelling of interstellar clouds. Cloud models consist of a hierarchy of grids with different grid steps, and the ratio between the cloud size and the smallest resolution elements can be 10^6 or even larger. We are currently working on radiative transfer codes (line and continuum) that could be used efficiently on such grids (see Figure 2b). The radiative transfer problem can be solved relatively independently on each of the sub-grids. This means that the use of convergence acceleration methods can be limited to those sub-grids where they are needed and, on the other hand, parallelization of the code is straightforward.
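
    As a minimal illustration of the Monte Carlo continuum-transfer step discussed above (absorption only, no scattering, regular 1D grid, arbitrary units; none of this is the actual code used in the paper), the Python sketch below propagates photon packets through a density grid and tallies the energy absorbed per cell, which is the quantity an accelerated scheme would then reuse to update dust temperatures.

    import numpy as np

    rng = np.random.default_rng(42)

    N_CELLS, CELL_SIZE = 50, 1.0          # 1D cloud, arbitrary units
    KAPPA = 0.03                          # opacity per unit density
    density = 1.0 + 4.0 * np.exp(-((np.arange(N_CELLS) - 25.0) / 6.0) ** 2)
    absorbed = np.zeros(N_CELLS)

    N_PACKETS, E_PACKET = 20_000, 1.0
    for _ in range(N_PACKETS):
        tau_target = -np.log(rng.random())    # optical depth to the absorption event
        tau, cell = 0.0, 0                    # packet enters from the left edge
        while cell < N_CELLS:
            dtau = KAPPA * density[cell] * CELL_SIZE
            if tau + dtau >= tau_target:      # absorbed somewhere in this cell
                absorbed[cell] += E_PACKET
                break
            tau += dtau
            cell += 1                         # otherwise move to the next cell

    print("fraction of packets escaping:", 1.0 - absorbed.sum() / (N_PACKETS * E_PACKET))
    print("cell with peak absorption   :", int(np.argmax(absorbed)))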

  12. Tank 241-AZ-101 criticality assessment resulting from pump jet mixing: Sludge mixing simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Onishi, Y.; Recknagle, K.

    Tank 241-AZ-101 (AZ-101) is one of 28 double-shell tanks located in the AZ farm in the Hanford Site's 200 East Area. The tank contains a significant quantity of fissile materials, including an estimated 9.782 kg of plutonium. Before beginning jet pump mixing for mitigative purposes, the operations must be evaluated to demonstrate that they will be subcritical under both normal and credible abnormal conditions. The main objective of this study was to address a concern about whether two 300-hp pumps with four rotating 18.3-m/s (60-ft/s) jets can concentrate plutonium in their pump housings during mixer pump operation and cause a criticality. The three-dimensional simulation was performed with the time-varying TEMPEST code to determine how much the pump jet mixing of Tank AZ-101 will concentrate plutonium in the pump housing. The AZ-101 model predicted that the total amount of plutonium within the pump housing peaks at 75 g at 10 simulation seconds and decreases to less than 10 g at four minutes. The plutonium concentration in the entire pump housing peaks at 0.60 g/L at 10 simulation seconds and is reduced to below 0.1 g/L after four minutes. Since the minimum critical concentration of plutonium is 2.6 g/L, and the minimum critical plutonium mass under idealized plutonium-water conditions is 520 g, these predicted maxima in the pump housing are much lower than the minimum plutonium conditions needed to reach a criticality level. The initial plutonium maximum of 1.88 g/L still results in a safety factor of 4.3 in the pump housing during the pump jet mixing operation.

  13. Benchmark studies of the gyro-Landau-fluid code and gyro-kinetic codes on kinetic ballooning modes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tang, T. F.; Lawrence Livermore National Laboratory, Livermore, California 94550; Xu, X. Q.

    2016-03-15

    A Gyro-Landau-Fluid (GLF) 3 + 1 model has recently been implemented in the BOUT++ framework, which contains full Finite-Larmor-Radius effects, Landau damping, and toroidal resonance [Ma et al., Phys. Plasmas 22, 055903 (2015)]. A linear global beta scan has been conducted using the JET-like circular equilibria (cbm18 series), showing that the unstable modes are kinetic ballooning modes (KBMs). In this work, we use the GYRO code, a gyrokinetic continuum code widely used for simulation of plasma microturbulence, to benchmark the GLF 3 + 1 code on KBMs. To verify our code on the KBM case, we first perform the beta scan based on the "Cyclone base case parameter set." We find that the growth rate is almost the same for the two codes, and the KBM mode is further destabilized as beta increases. For the JET-like global circular equilibria, as the modes localize in the peak pressure gradient region, a linear local beta scan using the same set of equilibria has been performed at this position for comparison. With the drift kinetic electron module in the GYRO code, including a small electron-electron collisionality to damp electron modes, the GYRO-generated mode structures and parities suggest that they are kinetic ballooning modes, and the growth rate is comparable to the GLF results. However, a radial scan of the pedestal for a particular set of cbm18 equilibria, using the GYRO code, shows different trends for the low-n and high-n modes. The low-n modes show that the linear growth rate peaks at the peak pressure gradient position, as in the GLF results. However, for high-n modes, the growth rate of the most unstable mode shifts outward to the bottom of the pedestal, and the real frequency of what was originally the KBM in the ion diamagnetic drift direction steadily approaches and crosses over to the electron diamagnetic drift direction.

  14. Cancer Care Coordination: a Systematic Review and Meta-Analysis of Over 30 Years of Empirical Studies.

    PubMed

    Gorin, Sherri Sheinfeld; Haggstrom, David; Han, Paul K J; Fairfield, Kathleen M; Krebs, Paul; Clauser, Steven B

    2017-08-01

    According to a landmark study by the Institute of Medicine, patients with cancer often receive poorly coordinated care in multiple settings from many providers. Lack of coordination is associated with poor symptom control, medical errors, and higher costs. The aims of this systematic review and meta-analysis were to (1) synthesize the findings of studies addressing cancer care coordination, (2) describe study outcomes across the cancer continuum, and (3) obtain a quantitative estimate of the effect of interventions in cancer care coordination on service system processes and patient health outcomes. Of 1241 abstracts identified through MEDLINE, EMBASE, CINAHL, and the Cochrane Library, 52 studies met the inclusion criteria. Each study had US or Canadian participants, comparison or control groups, measures, times, samples, and/or interventions. Two researchers independently applied a standardized search strategy, coding scheme, and online coding program to each study. Eleven studies met the additional criteria for the meta-analysis; a random effects estimation model was used for data analysis. Cancer care coordination approaches led to improvements in 81 % of outcomes, including screening, measures of patient experience with care, and quality of end-of-life care. Across the continuum of cancer care, patient navigation was the most frequent care coordination intervention, followed by home telehealth; nurse case management was third in frequency. The meta-analysis of a subset of the reviewed studies showed that the odds of appropriate health care utilization in cancer care coordination interventions were almost twice (OR = 1.9, 95 % CI = 1.5-3.5) that of comparison interventions. This review offers promising findings on the impact of cancer care coordination on increasing value and reducing healthcare costs in the USA.

  15. MGGPOD: a Monte Carlo Suite for Modeling Instrumental Line and Continuum Backgrounds in Gamma-Ray Astronomy

    NASA Technical Reports Server (NTRS)

    Weidenspointner, G.; Harris, M. J.; Sturner, S.; Teegarden, B. J.; Ferguson, C.

    2004-01-01

    Intense and complex instrumental backgrounds, against which the much smaller signals from celestial sources have to be discerned, are a notorious problem for low and intermediate energy gamma-ray astronomy (approximately 50 keV - 10 MeV). Therefore a detailed qualitative and quantitative understanding of instrumental line and continuum backgrounds is crucial for most stages of gamma-ray astronomy missions, ranging from the design and development of new instrumentation through performance prediction to data reduction. We have developed MGGPOD, a user-friendly suite of Monte Carlo codes built around the widely used GEANT (Version 3.21) package, to simulate ab initio the physical processes relevant for the production of instrumental backgrounds. These include the build-up and delayed decay of radioactive isotopes as well as the prompt de-excitation of excited nuclei, both of which give rise to a plethora of instrumental gamma-ray background lines in addition to continuum backgrounds. The MGGPOD package and documentation are publicly available for download. We demonstrate the capabilities of the MGGPOD suite by modeling high resolution gamma-ray spectra recorded by the Transient Gamma-Ray Spectrometer (TGRS) on board Wind during 1995. The TGRS is a Ge spectrometer operating in the 40 keV to 8 MeV range. Due to its fine energy resolution, these spectra reveal the complex instrumental background in formidable detail, particularly the many prompt and delayed gamma-ray lines. We evaluate the successes and failures of the MGGPOD package in reproducing TGRS data, and provide identifications for the numerous instrumental lines.

  16. Initial development of 5D COGENT

    NASA Astrophysics Data System (ADS)

    Cohen, R. H.; Lee, W.; Dorf, M.; Dorr, M.

    2015-11-01

    COGENT is a continuum gyrokinetic edge code being developed by the Edge Simulation Laboratory (ESL) collaboration. Work to date has been primarily focussed on a 4D (axisymmetric) version that models transport properties of edge plasmas. We have begun development of an initial 5D version to study edge turbulence, with initial focus on kinetic effects on blob dynamics and drift-wave instability in a shearless magnetic field. We are employing compiler directives and preprocessor macros to create a single source code that can be compiled in 4D or 5D, which helps to ensure consistency of physics representation between the two versions. A key aspect of COGENT is the employment of mapped multi-block grid capability to handle the complexity of divertor geometry. It is planned to eventually exploit this capability to handle magnetic shear, through a series of successively skewed unsheared grid blocks. The initial version has an unsheared grid and will be used to explore the degree to which a radial domain must be block decomposed. We report on the status of code development and initial tests. Work performed for USDOE, at LLNL under contract DE-AC52-07NA27344.

  17. Advances in stellarator gyrokinetics

    NASA Astrophysics Data System (ADS)

    Helander, P.; Bird, T.; Jenko, F.; Kleiber, R.; Plunk, G. G.; Proll, J. H. E.; Riemann, J.; Xanthopoulos, P.

    2015-05-01

    Recent progress in the gyrokinetic theory of stellarator microinstabilities and turbulence simulations is summarized. The simulations have been carried out using two different gyrokinetic codes, the global particle-in-cell code EUTERPE and the continuum code GENE, which operates in the geometry of a flux tube or a flux surface but is local in the radial direction. Ion-temperature-gradient (ITG) and trapped-electron modes are studied and compared with their counterparts in axisymmetric tokamak geometry. Several interesting differences emerge. Because of the more complicated structure of the magnetic field, the fluctuations are much less evenly distributed over each flux surface in stellarators than in tokamaks. Instead of covering the entire outboard side of the torus, ITG turbulence is localized to narrow bands along the magnetic field in regions of unfavourable curvature, and the resulting transport depends on the normalized gyroradius ρ* even in radially local simulations. Trapped-electron modes can be significantly more stable than in typical tokamaks, because of the spatial separation of regions with trapped particles from those with bad magnetic curvature. Preliminary non-linear simulations in flux-tube geometry suggest differences in the turbulence levels in Wendelstein 7-X and a typical tokamak.

  18. Comprehensive Micromechanics-Analysis Code - Version 4.0

    NASA Technical Reports Server (NTRS)

    Arnold, S. M.; Bednarcyk, B. A.

    2005-01-01

    Version 4.0 of the Micromechanics Analysis Code With Generalized Method of Cells (MAC/GMC) has been developed as an improved means of computational simulation of advanced composite materials. The previous version of MAC/GMC was described in "Comprehensive Micromechanics-Analysis Code" (LEW-16870), NASA Tech Briefs, Vol. 24, No. 6 (June 2000), page 38. To recapitulate: MAC/GMC is a computer program that predicts the elastic and inelastic thermomechanical responses of continuous and discontinuous composite materials with arbitrary internal microstructures and reinforcement shapes. The predictive capability of MAC/GMC rests on a model known as the generalized method of cells (GMC) - a continuum-based model of micromechanics that provides closed-form expressions for the macroscopic response of a composite material in terms of the properties, sizes, shapes, and responses of the individual constituents or phases that make up the material. Enhancements in version 4.0 include a capability for modeling thermomechanically and electromagnetically coupled ("smart") materials; a more-accurate (high-fidelity) version of the GMC; a capability to simulate discontinuous plies within a laminate; additional constitutive models of materials; expanded yield-surface-analysis capabilities; and expanded failure-analysis and life-prediction capabilities on both the microscopic and macroscopic scales.

  19. Understanding Measurements Returned by the Helioseismic and Magnetic Imager

    NASA Astrophysics Data System (ADS)

    Cohen, Daniel Parke; Criscuoli, Serena

    2014-06-01

    The Helioseismic and Magnetic Imager (HMI) aboard the Solar Dynamics Observatory (SDO) observes the Sun at the FeI 6173 Å line and returns full disk maps of line-of-sight observables including the magnetic field flux, FeI line width, line depth, and continuum intensity. To properly interpret such data it is important to understand any issues with the HMI and the pipeline that produces these observables. To this aim, HMI data were analyzed both at daily intervals for a span of 3 years at disk center in the quiet Sun and at hourly intervals for a span of 200 hours around an active region. Systematic effects attributed to issues with instrument adjustments and re-calibrations, variations in the transmission filters, and the orbital velocities of the SDO were found, while the actual physical evolution of such observables was difficult to determine. Velocity and magnetic flux measurements are less affected, as the aforementioned effects are partially compensated for by the HMI algorithm; the other observables are instead affected by larger uncertainties. In order to model these uncertainties, the HMI pipeline was tested with synthetic spectra generated through various 1D atmosphere models with a radiative transfer code (the RH code). It was found that HMI estimates of line width, line depth, and continuum intensity are highly dependent on the shape of the line, and therefore highly dependent on the line-of-sight angle and the magnetic field associated with the model. The best estimates are found for quiet regions at disk center, for which the relative differences between theoretical and HMI algorithm values are 6-8% for line width, 10-15% for line depth, and 0.1-0.2% for continuum intensity. In general, the relative difference between theoretical values and HMI estimates increases toward the limb and with increasing field strength; the HMI algorithm seems to fail in regions with fields larger than ~2000 G. This work is carried out through the National Solar Observatory Research Experiences for Undergraduates (REU) site program, which is co-funded by the Department of Defense in partnership with the NSF REU Program. The National Solar Observatory is operated by the Association of Universities for Research in Astronomy, Inc. (AURA) under cooperative agreement with the National Science Foundation.

  20. Numerical modelling of gravel unconstrained flow experiments with the DAN3D and RASH3D codes

    NASA Astrophysics Data System (ADS)

    Sauthier, Claire; Pirulli, Marina; Pisani, Gabriele; Scavia, Claudio; Labiouse, Vincent

    2015-12-01

    Landslide continuum dynamic models have improved considerably in recent years, but a consensus on the best method of calibrating the input resistance parameter values for predictive analyses has not yet emerged. In the present paper, numerical simulations of a series of laboratory experiments performed at the Laboratory for Rock Mechanics of the EPF Lausanne were undertaken with the RASH3D and DAN3D numerical codes. They aimed at analysing whether calibrated parameter ranges can be used (1) in a code different from the one with which they were obtained and (2) to simulate potential events made of a material with the same characteristics as back-analysed past events, but involving a different volume and propagation path. For this purpose, one of the four benchmark laboratory tests was used as the past event to calibrate the dynamic basal friction angle, assuming a Coulomb-type behaviour of the sliding mass, and this back-analysed value was then used to simulate the three other experiments, treated as potential events. The computational findings show good correspondence with experimental results in terms of the characteristics of the final deposits (i.e., runout, length and width). Furthermore, the best-fit values of the dynamic basal friction angle obtained for the two codes turn out to be close to each other and within the range of values measured with pseudo-dynamic tilting tests.
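
    Since the abstract hinges on a Coulomb-type basal resistance controlled by a single dynamic basal friction angle, a minimal sketch of that rheology is given below. It only illustrates the assumed frictional law (tau = rho*g*h*cos(theta)*tan(delta)); the function name and the numbers are hypothetical, and this is not the RASH3D or DAN3D input format.

        import math

        def coulomb_basal_resistance(density, flow_depth, slope_deg, friction_deg, g=9.81):
            """Basal shear resistance tau = rho*g*h*cos(theta)*tan(delta) for a
            Coulomb-type rheology (illustrative sketch, not the codes' input format)."""
            theta = math.radians(slope_deg)       # local slope angle
            delta = math.radians(friction_deg)    # dynamic basal friction angle
            sigma_n = density * g * flow_depth * math.cos(theta)  # bed-normal stress
            return sigma_n * math.tan(delta)

        # Hypothetical example: 1 m of gravel (bulk density 1700 kg/m^3) on a 30 deg
        # slope with a back-analysed dynamic basal friction angle of 25 deg.
        print(coulomb_basal_resistance(1700.0, 1.0, 30.0, 25.0))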

  1. mdFoam+: Advanced molecular dynamics in OpenFOAM

    NASA Astrophysics Data System (ADS)

    Longshaw, S. M.; Borg, M. K.; Ramisetti, S. B.; Zhang, J.; Lockerby, D. A.; Emerson, D. R.; Reese, J. M.

    2018-03-01

    This paper introduces mdFoam+, which is an MPI parallelised molecular dynamics (MD) solver implemented entirely within the OpenFOAM software framework. It is open-source and released under the same GNU General Public License (GPL) as OpenFOAM. The source code is released as a publicly open software repository that includes detailed documentation and tutorial cases. Since mdFoam+ is designed entirely within the OpenFOAM C++ object-oriented framework, it inherits a number of key features. The code is designed for extensibility and flexibility, so it is aimed first and foremost as an MD research tool, in which new models and test cases can be developed and tested rapidly. Implementing mdFoam+ in OpenFOAM also enables easier development of hybrid methods that couple MD with continuum-based solvers. Setting up MD cases follows the standard OpenFOAM format, as mdFoam+ also relies upon the OpenFOAM dictionary-based directory structure. This ensures that useful pre- and post-processing capabilities provided by OpenFOAM remain available even though the fully Lagrangian nature of an MD simulation is not typical of most OpenFOAM applications. Results show that mdFoam+ compares well to another well-known MD code (e.g. LAMMPS) in terms of benchmark problems, although it also has additional functionality that does not exist in other open-source MD codes.

  2. Massive star formation at high spatial resolution

    NASA Astrophysics Data System (ADS)

    Pascucci, Ilaria

    2004-05-01

    This thesis studies the early phases of massive stars and their impact on their surroundings. The capabilities of continuum radiative transfer (RT) codes to interpret the observations are also investigated. The main results of this work are: 1) Two massive star-forming regions are observed in the infrared. The thermal emission from the ultra-compact H II regions is resolved and the spectral type of the ionizing stars is estimated. The hot cores are not detected, implying line-of-sight extinction larger than 200 visual magnitudes. 2) The first mid-infrared interferometric measurements towards a young massive star resolve thermal emission on scales of 30-50 AU, probing the size of the predicted disk. The visibility curve differs from those of intermediate-mass stars. 3) The close vicinity of Θ1C Ori is imaged using the NACO adaptive optics system. The binary proplyd Orion 168-326 and its interaction with the wind from Θ1C Ori are resolved. A proplyd uniquely seen face-on is also identified. 4) Five RT codes are compared in a disk configuration. The solutions provide the first 2D benchmark and serve to test the reliability of other RT codes. The images/visibilities from two RT codes are compared for a distorted disk. The parameter range in which such a distortion is detectable with MIDI is explored.

  3. The Tempest: Difficult to Control Asthma in Adolescence.

    PubMed

    Burg, Gregory T; Covar, Ronina; Oland, Alyssa A; Guilbert, Theresa W

    Severe asthma is associated with significant morbidity and is a highly heterogeneous disorder. Severe asthma in adolescence has some unique elements compared with the features of severe asthma a medical provider would see in younger children or adults. A specific focus on psychological issues and adherence highlights some of the challenges in the management of asthma in adolescents. Treatment of adolescents with severe asthma now includes 3 approved biologic phenotype-directed therapies. Therapies available to adults may be beneficial to adolescents with severe asthma. Research into predictors of specific treatment response by phenotypes is ongoing. Optimal treatment strategies are not yet defined and warrant further investigation. Copyright © 2018 American Academy of Allergy, Asthma & Immunology. Published by Elsevier Inc. All rights reserved.

  4. PAHFIT: Properties of PAH Emission

    NASA Astrophysics Data System (ADS)

    Smith, J. D.; Draine, Bruce

    2012-10-01

    PAHFIT is an IDL tool for decomposing Spitzer IRS spectra of PAH emission sources, with a special emphasis on the careful recovery of ambiguous silicate absorption, and weak, blended dust emission features. PAHFIT is primarily designed for use with full 5-35 micron Spitzer low-resolution IRS spectra. PAHFIT is a flexible tool for fitting spectra, and you can add or disable features, compute combined flux bands, change fitting limits, etc., without changing the code. PAHFIT uses a simple, physically-motivated model, consisting of starlight, thermal dust continuum in a small number of fixed temperature bins, resolved dust features and feature blends, prominent emission lines (which themselves can be blended with dust features), as well as simple fully-mixed or screen dust extinction, dominated by the silicate absorption bands at 9.7 and 18 microns. Most model components are held fixed or are tightly constrained. PAHFIT uses Drude profiles to recover the full strength of dust emission features and blends, including the significant power in the wings of the broad emission profiles. This means the resulting feature strengths are larger (by factors of 2-4) than are recovered by methods which estimate the underlying continuum using line segments or spline curves fit through fiducial wavelength anchors.
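
    A commonly quoted form of the Drude profile used for broad dust emission features (central wavelength, fractional FWHM, and central intensity) is sketched below for orientation; this is an illustrative Python rendering, not the PAHFIT IDL implementation, and the 11.3 micron parameters are placeholders.

        import numpy as np

        def drude_profile(wavelength, lam_r, gamma_r, b_r):
            """Drude profile for a broad dust emission feature with central
            wavelength lam_r, fractional FWHM gamma_r and central intensity b_r.
            Illustrative sketch only; not the PAHFIT IDL implementation."""
            x = wavelength / lam_r - lam_r / wavelength
            return b_r * gamma_r**2 / (x**2 + gamma_r**2)

        lam = np.linspace(5.0, 20.0, 2000)               # wavelength grid in microns
        feature = drude_profile(lam, 11.3, 0.03, 1.0)    # placeholder 11.3 micron feature
        # The broad Drude wings carry much of the feature power, which is why
        # spline- or segment-based continua recover systematically lower strengths.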

  5. Particle/Continuum Hybrid Simulation in a Parallel Computing Environment

    NASA Technical Reports Server (NTRS)

    Baganoff, Donald

    1996-01-01

    The objective of this study was to modify an existing parallel particle code based on the direct simulation Monte Carlo (DSMC) method to include a Navier-Stokes (NS) calculation so that a hybrid solution could be developed. In carrying out this work, it was determined that the following five issues had to be addressed before extensive program development of a three dimensional capability was pursued: (1) find a set of one-sided kinetic fluxes that are fully compatible with the DSMC method, (2) develop a finite volume scheme to make use of these one-sided kinetic fluxes, (3) make use of the one-sided kinetic fluxes together with DSMC type boundary conditions at a material surface so that velocity slip and temperature slip arise naturally for near-continuum conditions, (4) find a suitable sampling scheme so that the values of the one-sided fluxes predicted by the NS solution at an interface between the two domains can be converted into the correct distribution of particles to be introduced into the DSMC domain, (5) carry out a suitable number of tests to confirm that the developed concepts are valid, individually and in concert for a hybrid scheme.
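
    For orientation on item (1), the standard kinetic-theory expression for the one-sided (half-range) number flux of a drifting Maxwellian across a plane is sketched below; it is a textbook result offered as a hedged illustration, not the flux splitting actually implemented in the hybrid code described above.

        import math

        def one_sided_number_flux(n, T, u, mass, kB=1.380649e-23):
            """Half-range number flux of a drifting Maxwellian across a plane:
            F = n*sqrt(kB*T/(2*pi*m)) * [exp(-s^2) + sqrt(pi)*s*(1+erf(s))],
            with speed ratio s = u/sqrt(2*kB*T/m). Textbook sketch, not the
            code's actual one-sided kinetic fluxes."""
            s = u * math.sqrt(mass / (2.0 * kB * T))
            return n * math.sqrt(kB * T / (2.0 * math.pi * mass)) * (
                math.exp(-s * s) + math.sqrt(math.pi) * s * (1.0 + math.erf(s)))

        # Example: nitrogen-like gas (m = 4.65e-26 kg) at 300 K, n = 1e20 m^-3,
        # with a 100 m/s bulk velocity toward the plane.
        print(one_sided_number_flux(1e20, 300.0, 100.0, 4.65e-26))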

  6. Simulation of Chirping Avalanche in Neighborhood of TAE gap

    NASA Astrophysics Data System (ADS)

    Berk, Herb; Breizman, Boris; Wang, Ge; Zheng, Linjin

    2016-10-01

    A new kinetic code, CHIRP, focuses on the nonlinear response of resonant energetic particles (EPs) that destabilize Alfven waves, which can then produce hole and clump phase-space chirping structures, while the background plasma currents are assumed to respond linearly to the generated fields. EP currents are due to the motion arising from the perturbed field, time averaged over an equilibrium orbit. A moderate EP source produces TAE chirping structures with a limited chirping range that does not reach the continuum. When the source is sufficiently strong, an EPM is excited in the lower continuum and it chirps rapidly downward as its amplitude grows in time. This response resembles the experimental observation of an avalanche, which occurs after a series of successive chirping events with a modest frequency shift, followed suddenly by a large-amplitude burst that chirps rapidly to low frequency with the loss of EPs. From these simulation observations we propose that in the experiment the EP population slowly increases to the point where the EPM is eventually excited. Supported by the SCIDAC Center for Nonlinear Simulation of Energetic Particles Burning Plasmas (CSEP).

  7. Verification of continuum drift kinetic equation solvers in NIMROD

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Held, E. D.; Ji, J.-Y.; Kruger, S. E.

    Verification of continuum solutions to the electron and ion drift kinetic equations (DKEs) in NIMROD [C. R. Sovinec et al., J. Comp. Phys. 195, 355 (2004)] is demonstrated through comparison with several neoclassical transport codes, most notably NEO [E. A. Belli and J. Candy, Plasma Phys. Controlled Fusion 54, 015015 (2012)]. The DKE solutions use NIMROD's spatial representation: 2D finite elements in the poloidal plane and a 1D Fourier expansion in toroidal angle. For 2D velocity space, a novel 1D expansion in finite elements is applied for the pitch angle dependence and a collocation grid is used for the normalized speed coordinate. The full, linearized Coulomb collision operator is kept and shown to be important for obtaining quantitative results. Bootstrap currents, parallel ion flows, and radial particle and heat fluxes show quantitative agreement between NIMROD and NEO for a variety of tokamak equilibria. In addition, velocity space distribution function contours for ions and electrons show nearly identical detailed structure and agree quantitatively. A Θ-centered, implicit time discretization and a block-preconditioned, iterative linear algebra solver provide efficient electron and ion DKE solutions that ultimately will be used to obtain closures for NIMROD's evolving fluid model.

  8. Shape design sensitivity analysis and optimal design of structural systems

    NASA Technical Reports Server (NTRS)

    Choi, Kyung K.

    1987-01-01

    The material derivative concept of continuum mechanics and an adjoint variable method of design sensitivity analysis are used to relate variations in structural shape to measures of structural performance. A domain method of shape design sensitivity analysis is used to best utilize the basic character of the finite element method, which gives accurate information not on the boundary but in the domain. Implementation of shape design sensitivity analysis using finite element computer codes is discussed. Recent numerical results are used to demonstrate the accuracy obtainable using the method. The results of the design sensitivity analysis are used to carry out design optimization of a built-up structure.

  9. Infrared emission from isolated dust clouds in the presence of very small dust grains

    NASA Technical Reports Server (NTRS)

    Lis, Dariusz C.; Leung, Chun M.

    1991-01-01

    Models of the effects of small grain-generated temperature fluctuations on the IR spectrum and surface brightness of externally heated interstellar dust clouds are constructed on the basis of a continuum radiation transport computer code which encompasses the transient heating of small dust grains. The models assume a constant fractional abundance of large and small grains throughout the given cloud. A comparison of model results with IRAS observations indicates that the observed 12-25 micron band emissions are associated with grains of about 10 Å radius, while the 60-100 micron emission is primarily due to large grains which are heated under equilibrium conditions.

  10. The Zugspitze radiative closure experiment for quantifying water vapor absorption over the terrestrial and solar infrared - Part 1: Setup, uncertainty analysis, and assessment of far-infrared water vapor continuum

    NASA Astrophysics Data System (ADS)

    Sussmann, Ralf; Reichert, Andreas; Rettinger, Markus

    2016-09-01

    Quantitative knowledge of water vapor radiative processes in the atmosphere throughout the terrestrial and solar infrared spectrum is still incomplete even though this is crucial input to the radiation codes forming the core of both remote sensing methods and climate simulations. Beside laboratory spectroscopy, ground-based remote sensing field studies in the context of so-called radiative closure experiments are a powerful approach because this is the only way to quantify water absorption under cold atmospheric conditions. For this purpose, we have set up at the Zugspitze (47.42° N, 10.98° E; 2964 m a.s.l.) a long-term radiative closure experiment designed to cover the infrared spectrum between 400 and 7800 cm-1 (1.28-25 µm). As a benefit for such experiments, the atmospheric states at the Zugspitze frequently comprise very low integrated water vapor (IWV; minimum = 0.1 mm, median = 2.3 mm) and very low aerosol optical depth (AOD = 0.0024-0.0032 at 7800 cm-1 at air mass 1). All instruments for radiance measurements and atmospheric-state measurements are described along with their measurement uncertainties. Based on all parameter uncertainties and the corresponding radiance Jacobians, a systematic residual radiance uncertainty budget has been set up to characterize the sensitivity of the radiative closure over the whole infrared spectral range. The dominant uncertainty contribution in the spectral windows used for far-infrared (FIR) continuum quantification is from IWV uncertainties, while T profile uncertainties dominate in the mid-infrared (MIR). Uncertainty contributions to near-infrared (NIR) radiance residuals are dominated by water vapor line parameters in the vicinity of the strong water vapor bands. The window regions in between these bands are dominated by solar Fourier transform infrared (FTIR) calibration uncertainties at low NIR wavenumbers, while uncertainties due to AOD become an increasing and dominant contribution towards higher NIR wavenumbers. Exceptions are methane or nitrous oxide bands in the NIR, where the associated line parameter uncertainties dominate the overall uncertainty. As a first demonstration of the Zugspitze closure experiment, a water vapor continuum quantification in the FIR spectral region (400-580 cm-1) has been performed. The resulting FIR foreign-continuum coefficients are consistent with the MT_CKD 2.5.2 continuum model and also agree with the most recent atmospheric closure study carried out in Antarctica. Results from the first determination of the NIR water vapor continuum in a field experiment are detailed in a companion paper (Reichert and Sussmann, 2016) while a novel NIR calibration scheme for the underlying FTIR measurements of incoming solar radiance is presented in another companion paper (Reichert et al., 2016).
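
    The residual radiance uncertainty budget described above combines parameter uncertainties with radiance Jacobians. A minimal sketch of that bookkeeping, assuming independent parameter errors and a simple root-sum-square combination, is given below; the array layout and the numbers are hypothetical, and this is not the closure experiment's actual budget code.

        import numpy as np

        def residual_radiance_uncertainty(jacobians, param_sigmas):
            """Root-sum-square residual radiance uncertainty per spectral point,
            assuming independent parameter errors:
            sigma_L(nu) = sqrt(sum_i (J_i(nu) * sigma_i)^2).
            Sketch only; not the Zugspitze closure budget implementation."""
            J = np.asarray(jacobians, dtype=float)     # shape (n_params, n_spectral_points)
            s = np.asarray(param_sigmas, dtype=float)  # shape (n_params,)
            return np.sqrt(np.sum((J * s[:, None]) ** 2, axis=0))

        # Hypothetical example: two parameters (e.g. IWV and AOD) on a 3-point grid.
        J = [[0.8, 0.5, 0.2],     # dL/dIWV at each spectral point
             [0.1, 0.2, 0.4]]     # dL/dAOD at each spectral point
        print(residual_radiance_uncertainty(J, [0.05, 0.001]))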

  11. Nature of the Galactic centre NIR-excess sources. I. What can we learn from the continuum observations of the DSO/G2 source?

    NASA Astrophysics Data System (ADS)

    Zajaček, Michal; Britzen, Silke; Eckart, Andreas; Shahzamanian, Banafsheh; Busch, Gerold; Karas, Vladimír; Parsa, Marzieh; Peissker, Florian; Dovčiak, Michal; Subroweit, Matthias; Dinnbier, František; Zensus, J. Anton

    2017-06-01

    Context. The Dusty S-cluster Object (DSO/G2) orbiting the supermassive black hole (Sgr A*) in the Galactic centre has been monitored in both near-infrared continuum and line emission. There has been a dispute about the character and the compactness of the object: it has been interpreted as either a gas cloud or a dust-enshrouded star. A recent analysis of polarimetry data in Ks-band (2.2 μm) allows us to put further constraints on the geometry of the DSO. Aims: The purpose of this paper is to constrain the nature and the geometry of the DSO. Methods: We compared 3D radiative transfer models of the DSO with the near-infrared (NIR) continuum data including polarimetry. In the analysis, we used basic dust continuum radiative transfer theory implemented in the 3D Monte Carlo code Hyperion. Moreover, we implemented analytical results of two-body problem mechanics and the theory of non-thermal processes. Results: We present a composite model of the DSO - a dust-enshrouded star that consists of a stellar source, a dusty, optically thick envelope, bipolar cavities, and a bow shock. This scheme can match both the total and the polarized NIR properties of the observed spectral energy distribution (SED). The SED may also be explained in theory by a young pulsar wind nebula, which typically exhibits a large linear polarization degree due to magnetospheric synchrotron emission. Conclusions: The analysis of NIR polarimetry data combined with the radiative transfer modelling shows that the DSO is a peculiar source of compact nature in the S cluster (r ≲ 0.04 pc). It is most probably a young stellar object embedded in a non-spherical dusty envelope, whose components include an optically thick dusty envelope, bipolar cavities, and a bow shock. Alternatively, the continuum emission could be of non-thermal origin due to the presence of a young neutron star and its wind nebula. Although there has so far been no detection of X-ray or radio counterparts of the DSO, the analysis of the neutron star model shows that young, energetic neutron stars similar to the Crab pulsar could in principle be detected in the S cluster with current NIR facilities, and they would appear as reddened, near-infrared-excess sources. Searches for pulsars in the NIR bands can thus complement standard radio searches, which can put further constraints on the unexplored pulsar population in the Galactic centre. Both thermal and non-thermal models are in accordance with the observed compactness and the total as well as polarized continuum emission of the DSO.

  12. Design of the superconducting magnet for 9.4 Tesla whole-body magnetic resonance imaging

    NASA Astrophysics Data System (ADS)

    Li, Y.; Wang, Q.; Dai, Y.; Ni, Z.; Zhu, X.; Li, L.; Zhao, B.; Chen, S.

    2017-02-01

    A superconducting magnet for 9.4 Tesla whole-body magnetic resonance imaging is designed and fabricated at the Institute of Electrical Engineering, Chinese Academy of Sciences. In this paper, the electromagnetic design methods of the main coils and compensating coils are presented. Sensitivity analysis is performed for all superconducting coils. The design of the superconducting shimming coils is also presented, and the design of the electromagnetic decoupling of the Z2 coils from the main coils is introduced. Stress and strain analysis with both averaged and detailed models is performed with the finite element method. A quench simulation code based on an anisotropic continuum model and the control volume method is developed by the authors and verified experimentally. By means of the quench simulation code, the quench protection system for the 9.4 T magnet is designed for the main coils, the compensating coils and the shimming coils. The magnet cryostat design with zero helium boil-off technology is also introduced.

  13. Aerodynamic Database Development for Mars Smart Lander Vehicle Configurations

    NASA Technical Reports Server (NTRS)

    Bobskill, Glenn J.; Parikh, Paresh C.; Prabhu, Ramadas K.; Tyler, Erik D.

    2002-01-01

    An aerodynamic database has been generated for the Mars Smart Lander Shelf-All configuration using computational fluid dynamics (CFD) simulations. Three different CFD codes were used: USM3D and FELISA, which are based on unstructured grid technology, and LAURA, an established and validated structured CFD code. As part of this database development, the results for the Mars continuum regime were validated with experimental data and comparisons made where applicable. The validation of USM3D and LAURA with the Unitary experimental data, the use of intermediate LAURA check analyses, as well as the validation of FELISA with the Mach 6 CF(sub 4) experimental data provided higher confidence in the ability of CFD to provide aerodynamic data for determining the static trim characteristics for longitudinal stability. The analyses of the noncontinuum regime showed the existence of multiple trim angles of attack that can be either stable or unstable trim points. This information is needed to design the guidance controller throughout the trajectory.

  14. Patch Finder Plus (PFplus): a web server for extracting and displaying positive electrostatic patches on protein surfaces.

    PubMed

    Shazman, Shula; Celniker, Gershon; Haber, Omer; Glaser, Fabian; Mandel-Gutfreund, Yael

    2007-07-01

    Positively charged electrostatic patches on protein surfaces are usually indicative of nucleic acid binding interfaces. Interestingly, many proteins which are not involved in nucleic acid binding possess large positive patches on their surface as well. In some cases, the positive patches on the protein are related to other functional properties of the protein family. PatchFinderPlus (PFplus) http://pfp.technion.ac.il is a web-based tool for extracting and displaying continuous electrostatic positive patches on protein surfaces. The input required for PFplus is either a four letter PDB code or a protein coordinate file in PDB format, provided by the user. PFplus computes the continuum electrostatics potential and extracts the largest positive patch for each protein chain in the PDB file. The server provides an output file in PDB format including a list of the patch residues. In addition, the largest positive patch is displayed on the server by a graphical viewer (Jmol), using a simple color coding.
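
    The patch-extraction step described above (compute the continuum electrostatics potential, then report the largest connected positive patch per chain) can be pictured with the small sketch below: given a per-residue potential and a surface adjacency graph, keep the positive residues and return the largest connected component. The data structures are hypothetical, and this is not the PFplus server code.

        from collections import deque

        def largest_positive_patch(potential, neighbors):
            """Largest connected set of surface residues with positive potential.
            potential: dict residue -> electrostatic potential value.
            neighbors: dict residue -> iterable of adjacent surface residues.
            Illustrative sketch; not the PFplus implementation."""
            positive = {r for r, v in potential.items() if v > 0.0}
            seen, best = set(), set()
            for start in positive:
                if start in seen:
                    continue
                seen.add(start)
                patch, queue = set(), deque([start])
                while queue:                       # breadth-first search over the patch
                    r = queue.popleft()
                    patch.add(r)
                    for nb in neighbors.get(r, ()):
                        if nb in positive and nb not in seen:
                            seen.add(nb)
                            queue.append(nb)
                if len(patch) > len(best):
                    best = patch
            return best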

  15. Crystal growth and furnace analysis

    NASA Technical Reports Server (NTRS)

    Dakhoul, Youssef M.

    1986-01-01

    A thermal analysis of Hg/Cd/Te solidification in a Bridgman cell is made using Continuum's VAST code. The energy equation is solved in an axisymmetric, quasi-steady domain for both the molten and solid alloy regions. Alloy composition is calculated by a simplified one-dimensional model to estimate its effect on melt thermal conductivity and, consequently, on the temperature field within the cell. Solidification is assumed to occur at a fixed temperature of 979 K. Simplified boundary conditions are included to model both the radiant and conductive heat exchange between the furnace walls and the alloy. Calculations are performed to show how the steady-state isotherms are affected by: the hot and cold furnace temperatures, boundary condition parameters, and the growth rate which affects the calculated alloy's composition. The Advanced Automatic Directional Solidification Furnace (AADSF), developed by NASA, is also thermally analyzed using the CINDA code. The objective is to determine the performance and the overall power requirements for different furnace designs.

  16. Photoactive Self-Shaping Hydrogels as Noncontact 3D Macro/Microscopic Photoprinting Platforms.

    PubMed

    Liao, Yue; An, Ning; Wang, Ning; Zhang, Yinyu; Song, Junfei; Zhou, Jinxiong; Liu, Wenguang

    2015-12-01

    A photocleavable terpolymer hydrogel cross-linked with an o-nitrobenzyl derivative cross-linker is shown to be capable of self-shaping without losing its physical integrity and robustness, owing to spontaneous asymmetric swelling of the network caused by UV-light-induced gradient cleavage of chemical cross-linkages. The continuum model and the finite element method are used to elucidate the underlying curling mechanism. Remarkably, based on the self-changing principle, the photosensitive hydrogels can be developed as soft, wet photoprinting platforms onto which specific 3D characters and images are faithfully duplicated at the macro/microscale without contact by UV light irradiation under the cover of customized photomasks. Importantly, a quick response (QR) code is accurately printed on the photoactive hydrogel for the first time. Scanning the QR code with a smartphone can quickly connect to a web page. This photoactive hydrogel is a promising new printing or recording material. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. Computation of three-dimensional nozzle-exhaust flow fields with the GIM code

    NASA Technical Reports Server (NTRS)

    Spradley, L. W.; Anderson, P. G.

    1978-01-01

    A methodology is introduced for constructing numerical analogs of the partial differential equations of continuum mechanics. A general formulation is provided which permits classical finite element and many of the finite difference methods to be derived directly. The approach, termed the General Interpolants Method (GIM), combines the best features of finite element and finite difference methods. A quasi-variational procedure is used to formulate the element equations, to introduce boundary conditions into the method and to provide a natural assembly sequence. A derivation is given in terms of general interpolation functions from this procedure. Example computations for transonic and supersonic flows in two and three dimensions are given to illustrate the utility of GIM. A three-dimensional nozzle-exhaust flow field is solved, including interaction with the freestream and a coupled treatment of the shear layer. Potential applications of the GIM code to a variety of computational fluid dynamics problems are then discussed in terms of existing capability or by extension of the methodology.

  18. "But I didn't do it!": ethical treatment of sex offenders in denial.

    PubMed

    Levenson, Jill S

    2011-09-01

    This article addresses ethical questions and issues related to the treatment of sex offenders in denial, using the empirical research literature and the ethical codes of American Psychological Association (APA) and National Association of Social Workers (NASW) to guide the ethical decision-making process. The empirical literature does not provide an unequivocal link between denial and recidivism, though some studies suggest that decreased denial and increased accountability appear to be associated with greater therapeutic engagement and reduced recidivism for some offenders. The ethical codes of APA and NASW value the client's self-determination and autonomy, and psychologists and social workers have a duty to empower individual well-being while doing no harm to clients or others. Clinicians should view denial not as a categorical construct but as a continuum of distorted cognitions requiring clinical attention. Denial might also be considered as a responsivity factor that can interfere with treatment progress. Offering a reasonable time period for therapeutic engagement might provide a better alternative than automatically refusing treatment to categorical deniers.

  19. Patch Finder Plus (PFplus): A web server for extracting and displaying positive electrostatic patches on protein surfaces

    PubMed Central

    Shazman, Shula; Celniker, Gershon; Haber, Omer; Glaser, Fabian; Mandel-Gutfreund, Yael

    2007-01-01

    Positively charged electrostatic patches on protein surfaces are usually indicative of nucleic acid binding interfaces. Interestingly, many proteins which are not involved in nucleic acid binding possess large positive patches on their surface as well. In some cases, the positive patches on the protein are related to other functional properties of the protein family. PatchFinderPlus (PFplus) http://pfp.technion.ac.il is a web-based tool for extracting and displaying continuous electrostatic positive patches on protein surfaces. The input required for PFplus is either a four letter PDB code or a protein coordinate file in PDB format, provided by the user. PFplus computes the continuum electrostatics potential and extracts the largest positive patch for each protein chain in the PDB file. The server provides an output file in PDB format including a list of the patch residues. In addition, the largest positive patch is displayed on the server by a graphical viewer (Jmol), using a simple color coding. PMID:17537808

  20. The HIV Prison Paradox: Agency and HIV-Positive Women's Experiences in Jail and Prison in Alabama.

    PubMed

    Sprague, Courtenay; Scanlon, Michael L; Radhakrishnan, Bharathi; Pantalone, David W

    2017-08-01

    Incarcerated women face significant barriers to achieve continuous HIV care. We employed a descriptive, exploratory design using qualitative methods and the theoretical construct of agency to investigate participants' self-reported experiences accessing HIV services in jail, in prison, and post-release in two Alabama cities. During January 2014, we conducted in-depth interviews with 25 formerly incarcerated HIV-positive women. Two researchers completed independent coding, producing preliminary codes from transcripts using content analysis. Themes were developed iteratively, verified, and refined. They encompassed (a) special rules for HIV-positive women: isolation, segregation, insults, food rationing, and forced disclosure; (b) absence of counseling following initial HIV diagnosis; and (c) HIV treatment impediments: delays, interruption, and denial. Participants deployed agentic strategies of accommodation, resistance, and care-seeking to navigate the social world of prison and HIV services. Findings illuminate the "HIV prison paradox": the chief opportunities that remain unexploited to engage and re-engage justice-involved women in the HIV care continuum.

  1. 77 FR 45367 - Continuum of Care Homeless Assistance Grant Application; Continuum of Care Application

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-31

    ... DEPARTMENT OF HOUSING AND URBAN DEVELOPMENT [Docket No. FR-5603-N-53] Continuum of Care Homeless Assistance Grant Application; Continuum of Care Application AGENCY: Office of the Chief Information Officer..., called Continuums of Care (CoC), will complete the Exhibit 1 of the Continuum of Care Homeless Assistance...

  2. ALMA unveils rings and gaps in the protoplanetary system HD 169142: signatures of two giant protoplanets

    NASA Astrophysics Data System (ADS)

    Fedele, D.; Carney, M.; Hogerheijde, M. R.; Walsh, C.; Miotello, A.; Klaassen, P.; Bruderer, S.; Henning, Th.; van Dishoeck, E. F.

    2017-04-01

    The protoplanetary system HD 169142 is one of the few cases where a potential candidate protoplanet has recently been detected by direct imaging in the near-infrared. To study the interaction between the protoplanet and the disk itself, observations of the gas and dust surface density structure are needed. This paper reports new ALMA observations of the dust continuum at 1.3 mm and of 12CO, 13CO, and C18O J = 2-1 emission from the system HD 169142 (which is observed almost face-on) at an angular resolution of 0.3 arcsec × 0.2 arcsec (~35 × 20 au). The dust continuum emission reveals a double-ring structure with an inner ring between 0.17-0.28 arcsec (~20-35 au) and an outer ring between 0.48-0.64 arcsec (~56-83 au). The size and position of the inner ring are in good agreement with previous polarimetric observations in the near-infrared and are consistent with dust trapping by a massive planet. No dust emission is detected inside the inner dust cavity (R ≲ 20 au) or within the dust gap (~35-56 au) down to the noise level. In contrast, the channel maps of the J = 2-1 line of the three CO isotopologs reveal gas inside the dust cavity and dust gap. The gaseous disk is also much larger than the compact dust emission; it extends to 1.5 arcsec (~180 au) in radius. This difference and the sharp drop of the continuum emission at large radii point to radial drift of large dust grains (>μm size). Using the thermo-chemical disk code dali, we modeled the continuum and the CO isotopolog emission to quantitatively measure the gas and dust surface densities. The resulting gas surface density is reduced by a factor of 30-40 inward of the dust gap. The gas and dust distribution indicates that two giant planets shape the disk structure through dynamical clearing (dust cavity and gap) and dust trapping (double-ring dust distribution).

  3. "A tempest in a cocktail glass": mothers, alcohol, and television, 1977-1996.

    PubMed

    Golden, J

    2000-06-01

    This article examines the portrayal of pregnancy and alcohol in thirty-six national network evening news broadcasts (ABC, CBS, NBC). Early coverage focused on white, middle-class women, as scientific authorities and government officials warned against drinking during pregnancy. After 1987, however, women who drank during pregnancy were depicted as members of minority groups and as a danger to society. The thematic transition began before warning labels appeared on alcoholic beverages and gained strength from official government efforts to prevent fetal alcohol syndrome. The greatest impetus for the revised discourse, however, was the eruption of a "moral panic" over crack cocaine use. By linking fetal harm to substance abuse, the panic suggested it was in the public's interest to control the behavior of pregnant women.

  4. TEMPEST in a gallimaufry: applying multilevel systems theory to person-in-context research.

    PubMed

    Peck, Stephen C

    2007-12-01

    Terminological ambiguity and inattention to personal and contextual multilevel systems undermine personality, self, and identity theories. Hierarchical and heterarchical systems theories are used to describe contents and processes existing within and across three interrelated multilevel systems: levels of organization, representation, and integration. Materially nested levels of organization are used to distinguish persons from contexts and personal from social identity. Functionally nested levels of representation are used to distinguish personal identity from the sense of identity and symbolic (belief) from iconic (schema) systems. Levels of integration are hypothesized to unfold separately but interdependently across levels of representation. Multilevel system configurations clarify alternative conceptualizations of traits and contextualized identity. Methodological implications for measurement and analysis (e.g., integrating variable- and pattern-centered methods) are briefly described.

  5. EUV phase-shifting masks and aberration monitors

    NASA Astrophysics Data System (ADS)

    Deng, Yunfei; Neureuther, Andrew R.

    2002-07-01

    Rigorous electromagnetic simulation with TEMPEST is used to examine the use of phase-shifting masks in EUV lithography. The effects of oblique incident illumination and of mask patterning by ion-mixing of multilayers are analyzed. Oblique incident illumination causes streamers at absorber edges and position shifts in aerial images. The diffraction waves between ion-mixed and pristine multilayers are observed. The phase shifting caused by stepped substrates is simulated, and images show that it succeeds in creating phase-shifting effects. The diffraction process at the phase boundary is also analyzed. As an example of EUV phase-shifting masks, a coma pattern-and-probe-based aberration monitor is simulated and aerial images are formed under different levels of coma aberration. As designed, the probe signal rises quickly as coma increases.

  6. Ethical issues in nanomedicine: Tempest in a teapot?

    PubMed

    Allon, Irit; Ben-Yehudah, Ahmi; Dekel, Raz; Solbakk, Jan-Helge; Weltring, Klaus-Michael; Siegal, Gil

    2017-03-01

    Nanomedicine offers remarkable options for new therapeutic avenues. As methods in nanomedicine advance, ethical questions conjunctly arise. Nanomedicine is an exceptional niche in several aspects as it reflects risks and uncertainties not encountered in other areas of medical research or practice. Nanomedicine partially overlaps, partially interlocks and partially exceeds other medical disciplines. Some interpreters agree that advances in nanotechnology may pose varied ethical challenges, whilst others argue that these challenges are not new and that nanotechnology basically echoes recurrent bioethical dilemmas. The purpose of this article is to discuss some of the ethical issues related to nanomedicine and to reflect on the question whether nanomedicine generates ethical challenges of new and unique nature. Such a determination should have implications on regulatory processes and professional conducts and protocols in the future.

  7. Tempest in a sugar-coated lab vial.

    PubMed

    Dragun, Duska; Philippe, Aurélie

    2018-06-23

    The angiotensin II type 1 receptor (AT1R) is a classical G-protein-coupled receptor (GPCR) displaying a complex structure consisting of seven transmembrane helices connected by intracellular and extracellular loops. Besides angiotensin II binding within transmembrane sites and mechanically induced ligand-free activation, AT1R can also be activated by agonistic autoantibodies (AT1R-Ab) recognizing conformational epitopes contained in the second extracellular loop. Direct pathophysiologic involvement of AT1R-Abs is well established in several autoimmune contexts and in organ transplantation (1). A commercially available sandwich ELISA that preserves the native receptor conformation relies on cell-membrane AT1R extracts from human-AT1R-overexpressing Chinese hamster ovary (CHO) cells as the solid phase. This article is protected by copyright. All rights reserved.

  8. Simulation of exposure and alignment for nanoimprint lithography

    NASA Astrophysics Data System (ADS)

    Deng, Yunfei; Neureuther, Andrew R.

    2002-07-01

    Rigorous electromagnetic simulation with TEMPEST is used to examine the exposure and alignment processes for nano-imprint lithography with attenuating thin-film molds. Parameters in the design of topographical features of the nano-imprint system and material choices for the components are analyzed. The small feature size limits light transmission through the feature. While little can be done with auxiliary structures to attract light into small holes, the use of an absorbing material with a low real part of the refractive index, such as silver, helps mitigate the problem. Results on complementary alignment marks show that the small transmission through the metal layer and the vertical separation of the two alignment marks create leakage equivalent to a 1 nm misalignment, but satisfactory alignment can be obtained by measuring alignment signals over a +/- 30 nm range.

  9. Decoding the neural representation of fine-grained conceptual categories.

    PubMed

    Ghio, Marta; Vaghi, Matilde Maria Serena; Perani, Daniela; Tettamanti, Marco

    2016-05-15

    Neuroscientific research on conceptual knowledge based on the grounded cognition framework has shed light on the organization of concrete concepts into semantic categories that rely on different types of experiential information. Abstract concepts have traditionally been investigated as an undifferentiated whole, and have only recently been addressed in a grounded cognition perspective. The present fMRI study investigated the involvement of brain systems coding for experiential information in the conceptual processing of fine-grained semantic categories along the abstract-concrete continuum. These categories consisted of mental state-, emotion-, mathematics-, mouth action-, hand action-, and leg action-related meanings. Thirty-five sentences for each category were used as stimuli in a 1-back task performed by 36 healthy participants. A univariate analysis failed to reveal category-specific activations. Multivariate pattern analyses, in turn, revealed that fMRI data contained sufficient information to disentangle all six fine-grained semantic categories across participants. However, the category-specific activity patterns showed no overlap with the regions coding for experiential information. These findings demonstrate the possibility of detecting specific patterns of neural representation associated with the processing of fine-grained conceptual categories, crucially including abstract ones, though bearing no anatomical correspondence with regions coding for experiential information as predicted by the grounded cognition hypothesis. Copyright © 2016 Elsevier Inc. All rights reserved.
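
    As a rough illustration of the multivariate pattern analysis (decoding) logic mentioned above, the sketch below trains a cross-validated linear classifier on synthetic "voxel patterns" for six categories; it is generic scikit-learn usage on made-up data, not the study's pipeline or parameters.

        import numpy as np
        from sklearn.svm import LinearSVC
        from sklearn.model_selection import cross_val_score

        # Synthetic stand-in for voxel patterns: 180 trials x 500 voxels, 6 categories.
        rng = np.random.default_rng(0)
        X = rng.normal(size=(180, 500))
        y = np.repeat(np.arange(6), 30)
        X[np.arange(180), y] += 1.0          # inject a weak category-specific signal

        # Cross-validated decoding accuracy; chance level is 1/6.
        scores = cross_val_score(LinearSVC(dual=False), X, y, cv=5)
        print(scores.mean())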

  10. Continuum modeling of large lattice structures: Status and projections

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Mikulas, Martin M., Jr.

    1988-01-01

    The status and some recent developments of continuum modeling for large repetitive lattice structures are summarized. Discussion focuses on a number of aspects including definition of an effective substitute continuum; characterization of the continuum model; and the different approaches for generating the properties of the continuum, namely, the constitutive matrix, the matrix of mass densities, and the matrix of thermal coefficients. Also, a simple approach is presented for generating the continuum properties. The approach can be used to generate analytic and/or numerical values of the continuum properties.

  11. Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC).

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schultz, Peter Andrew

    The objective of the U.S. Department of Energy Office of Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) is to provide an integrated suite of computational modeling and simulation (M&S) capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive-waste storage facility or disposal repository. Achieving the objective of modeling the performance of a disposal scenario requires describing processes involved in waste form degradation and radionuclide release at the subcontinuum scale, beginning with mechanistic descriptions of chemical reactions and chemical kinetics at the atomic scale, and upscaling into effective, validated constitutive models for input to high-fidelity continuum-scale codes for coupled multiphysics simulations of release and transport. Verification and validation (V&V) is required throughout the system to establish evidence-based metrics for the level of confidence in M&S codes and capabilities, including at the subcontinuum scale and in the constitutive models they inform or generate. This report outlines the nature of the V&V challenge at the subcontinuum scale, an approach to incorporate V&V concepts into subcontinuum-scale modeling and simulation (M&S), and a plan to incrementally incorporate effective V&V into subcontinuum-scale M&S destined for use in the NEAMS Waste IPSC work flow to meet requirements of quantitative confidence in the constitutive models informed by subcontinuum-scale phenomena.

  12. Passing waves from atomistic to continuum

    NASA Astrophysics Data System (ADS)

    Chen, Xiang; Diaz, Adrian; Xiong, Liming; McDowell, David L.; Chen, Youping

    2018-02-01

    Progress in the development of coupled atomistic-continuum methods for simulations of critical dynamic material behavior has been hampered by a spurious wave reflection problem at the atomistic-continuum interface. This problem is mainly caused by the difference in material descriptions between the atomistic and continuum models, which results in a mismatch in phonon dispersion relations. In this work, we introduce a new method based on atomistic dynamics of lattice coupled with a concurrent atomistic-continuum method to enable a full phonon representation in the continuum description. This permits the passage of short-wavelength, high-frequency phonon waves from the atomistic to continuum regions. The benchmark examples presented in this work demonstrate that the new scheme enables the passage of all allowable phonons through the atomistic-continuum interface; it also preserves the wave coherency and energy conservation after phonons transport across multiple atomistic-continuum interfaces. This work is the first step towards developing a concurrent atomistic-continuum simulation tool for non-equilibrium phonon-mediated thermal transport in materials with microstructural complexity.

  13. Calculations of atmospheric transmittance in the 11 micrometer window for estimating skin temperature from VISSR infrared brightness temperatures

    NASA Technical Reports Server (NTRS)

    Chesters, D.

    1984-01-01

    An algorithm for calculating the atmospheric transmittance in the 10 to 20 µm spectral band from a known temperature and dewpoint profile, and then using this transmittance to estimate the surface (skin) temperature from a VISSR observation in the 11 µm window, is presented. Parameterizations are drawn from the literature for computing the molecular absorption due to the water vapor continuum, water vapor lines, and carbon dioxide lines. The FORTRAN code is documented for this application, and the sensitivity of the derived skin temperature to variations in the model's parameters is calculated. The VISSR calibration uncertainties are identified as the largest potential source of error.
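
    A minimal sketch of the final inversion step is given below, assuming the window-channel radiance is modeled as L_obs = tau*B(T_skin) + L_up so that the skin temperature follows by inverting the Planck function; the single-layer form, the function names, and the numbers are assumptions for illustration, not the documented FORTRAN code.

        import math

        H, C, KB = 6.62607015e-34, 2.99792458e8, 1.380649e-23

        def planck(wavelength_m, T):
            """Planck spectral radiance B(lambda, T) in W m^-2 sr^-1 m^-1."""
            return (2.0 * H * C**2 / wavelength_m**5 /
                    (math.exp(H * C / (wavelength_m * KB * T)) - 1.0))

        def skin_temperature(L_obs, tau, L_up, wavelength_m=11e-6):
            """Invert L_obs = tau*B(T_skin) + L_up for T_skin (single-layer sketch;
            an assumed form for illustration, not the documented algorithm)."""
            B_skin = (L_obs - L_up) / tau
            return (H * C / (wavelength_m * KB) /
                    math.log(1.0 + 2.0 * H * C**2 / (wavelength_m**5 * B_skin)))

        # Made-up example: transmittance 0.8, some upwelling path radiance, 300 K skin.
        L_up = 0.15 * planck(11e-6, 280.0)
        L_obs = 0.8 * planck(11e-6, 300.0) + L_up
        print(skin_temperature(L_obs, 0.8, L_up))   # recovers ~300 K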

  14. Multilayer Insulation Ascent Venting Model

    NASA Technical Reports Server (NTRS)

    Tramel, R. W.; Sutherlin, S. G.; Johnson, W. L.

    2017-01-01

    The thermal and venting transient experienced by tank-applied multilayer insulation (MLI) in the Earth-to-orbit environment is very dynamic and not well characterized. This new predictive code is a first principles-based engineering model which tracks the time history of the mass and temperature (internal energy) of the gas in each MLI layer. A continuum-based model is used for early portions of the trajectory while a kinetic theory-based model is used for the later portions of the trajectory, and the models are blended based on a reference mean free path. This new capability should improve understanding of the Earth-to-orbit transient and enable better insulation system designs for in-space cryogenic propellant systems.
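
    Since the abstract describes blending a continuum-based model with a kinetic theory-based model according to a reference mean free path, a minimal sketch of that idea is given below; the hard-sphere mean free path is standard, but the specific weighting function, names, and numbers are assumptions, not the model's actual scheme.

        import math

        def mean_free_path(pressure, temperature, d_molecule=3.7e-10, kB=1.380649e-23):
            """Hard-sphere mean free path: lambda = kB*T / (sqrt(2)*pi*d^2*p)."""
            return kB * temperature / (math.sqrt(2.0) * math.pi * d_molecule**2 * pressure)

        def blended_prediction(value_continuum, value_kinetic, lam, lam_ref):
            """Blend continuum- and kinetic-regime predictions with a weight based on
            the ratio of the local mean free path to a reference value. The simple
            rational weighting here is an assumption, not the model's actual scheme."""
            w = 1.0 / (1.0 + (lam / lam_ref) ** 2)   # w -> 1 in the continuum limit
            return w * value_continuum + (1.0 - w) * value_kinetic

        lam = mean_free_path(pressure=10.0, temperature=250.0)   # e.g. 10 Pa, 250 K
        print(blended_prediction(1.0, 0.2, lam, lam_ref=1e-3))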

  15. Determining mechanical behavior of solid materials using miniature specimens

    DOEpatents

    Manahan, Michael P.; Argon, Ali S.; Harling, Otto K.

    1986-01-01

    A Miniaturized Bend Test (MBT) capable of extracting and determining mechanical behavior information from specimens only so large as to have at least a volume or smallest dimension sufficient to satisfy continuum behavior in all directions. The mechanical behavior of the material is determined from the measurements taken during the bending of the specimen and is processed according to the principles of linear or nonlinear material mechanics or both. In a preferred embodiment the determination is carried out by a code which is constructed according to the finite element method, and the specimen used for the determinations is a miniature disk simply supported for central loading at the axis on the center of the disk.

  16. UniPOPS: Unified data reduction suite

    NASA Astrophysics Data System (ADS)

    Maddalena, Ronald J.; Garwood, Robert W.; Salter, Christopher J.; Stobie, Elizabeth B.; Cram, Thomas R.; Morgan, Lorrie; Vance, Bob; Hudson, Jerome

    2015-03-01

    UniPOPS, a suite of programs and utilities developed at the National Radio Astronomy Observatory (NRAO), reduced data from the observatory's single-dish telescopes: the Tucson 12-m, the Green Bank 140-ft, and archived data from the Green Bank 300-ft. The primary reduction programs, 'line' (for spectral-line reduction) and 'condar' (for continuum reduction), used the People-Oriented Parsing Service (POPS) as the command line interpreter. UniPOPS unified previous analysis packages and provided new capabilities; development of UniPOPS continued within the NRAO until 2004 when the 12-m was turned over to the Arizona Radio Observatory (ARO). The submitted code is version 3.5 from 2004, the last supported by the NRAO.

  17. X-Ray Spectra from MHD Simulations of Accreting Black Holes

    NASA Technical Reports Server (NTRS)

    Schnittman, Jeremy D.; Noble, Scott C.; Krolik, Julian H.

    2011-01-01

    We present new global calculations of X-ray spectra from fully relativistic magnetohydrodynamic (MHD) simulations of black hole (BH) accretion disks. With a self-consistent radiative transfer code including Compton scattering and returning radiation, we can reproduce the predominant spectral features seen in decades of X-ray observations of stellar-mass BHs: a broad thermal peak around 1 keV, a power-law continuum up to >100 keV, and a relativistically broadened iron fluorescent line. By varying the mass accretion rate, different spectral states naturally emerge: thermal-dominant, steep power-law, and low/hard. In addition to the spectral features, we briefly discuss applications to X-ray timing and polarization.

  18. 77 FR 33229 - Notice of Proposed Information Collection for Public Comment; Continuum of Care Homeless...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-05

    ... Information Collection for Public Comment; Continuum of Care Homeless Assistance Grant Application--Continuum of Care Application AGENCY: Office of Assistant Secretary for Community Planning and Development... collection for public comment entitled Continuum of Care of Homeless Assistance Grant Application- Continuum...

  19. PACCE: Perl Algorithm to Compute Continuum and Equivalent Widths

    NASA Astrophysics Data System (ADS)

    Riffel, Rogério; Borges Vale, Tibério

    2011-05-01

    PACCE (Perl Algorithm to Compute continuum and Equivalent Widths) computes continuum and equivalent widths. PACCE is able to determine mean continuum and continuum at line center values, which are helpful in stellar population studies, and is also able to compute the uncertainties in the equivalent widths using photon statistics.
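
    A minimal sketch of the underlying equivalent-width computation is given below: a linear pseudo-continuum is anchored on two side bands, and EW is the integral of (1 - F/F_cont) over the line band. This is an illustrative Python rendering with hypothetical band definitions, not the PACCE Perl implementation.

        import numpy as np

        def equivalent_width(wave, flux, blue_band, red_band, line_band):
            """Equivalent width from a linear pseudo-continuum anchored on two side
            bands; illustrative sketch, not the PACCE Perl code."""
            def band_mean(band):
                m = (wave >= band[0]) & (wave <= band[1])
                return wave[m].mean(), flux[m].mean()
            (xb, yb), (xr, yr) = band_mean(blue_band), band_mean(red_band)
            slope = (yr - yb) / (xr - xb)
            m = (wave >= line_band[0]) & (wave <= line_band[1])
            cont = yb + slope * (wave[m] - xb)        # pseudo-continuum at line wavelengths
            depth = 1.0 - flux[m] / cont              # normalized line depth
            dl = np.diff(wave[m])
            return np.sum(0.5 * (depth[:-1] + depth[1:]) * dl)   # trapezoidal integral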

  20. Exploring the Continuum of Vaccine Hesitancy Between African American and White Adults: Results of a Qualitative Study

    PubMed Central

    Quinn, Sandra; Jamison, Amelia; Musa, Donald; Hilyard, Karen; Freimuth, Vicki

    2016-01-01

    Vaccine delay and refusal present very real threats to public health. Since even a slight reduction in vaccination rates could produce major consequences as herd immunity is eroded, it is imperative to understand the factors that contribute to decision-making about vaccines. Recent scholarship on the concept of “vaccine hesitancy” emphasizes that vaccine behaviors and beliefs tend to fall along a continuum from refusal to acceptance. Most research on hesitancy has focused on parental decision-making about childhood vaccines, but could be extended to explore decision-making related to adult immunization against seasonal influenza. In particular, vaccine hesitancy could be a useful approach to understand the persistence of racial/ethnic disparities between African American and White adults. This study relied on a thematic content analysis of qualitative data, including 12 semi-structured interviews, 9 focus groups (N=90), and 16 in-depth interviews, for a total sample of 118 (N=118) African American and White adults. All data were transcribed and analyzed with Atlas.ti. A coding scheme combining both inductive and deductive codes was utilized to identify themes related to vaccine hesitancy. The study found a continuum of vaccine behavior from never-takers, sometimes-takers, and always-takers, with significant differences between African Americans and Whites.  We compared our findings to the Three Cs: Complacency, Convenience, and Confidence framework. Complacency contributed to low vaccine acceptance with both races.  Among sometimes-takers and always-takers, convenience was often cited as a reason for their behavior, while never-takers of both races were more likely to describe other reasons for non-vaccination, with convenience only a secondary explanation.  However, for African Americans, cost was a barrier.  There were racial differences in trust and confidence that impacted the decision-making process. The framework, though not a natural fit for the data, does provide some insight into the differential sources of hesitancy between these two populations. Complacency and confidence clearly impact vaccine behavior, often more profoundly than convenience, which can contribute either negatively or positively to vaccine acceptance. The Three Cs framework is a useful, but limited tool to understanding racial disparities. Understanding the distinctions in those cultural factors that drive lower vaccine confidence and greater hesitancy among African Americans could lead to more effective communication strategies as well as changes in the delivery of vaccines to increase convenience and passive acceptance. PMID:28239512

  1. Analysis of Plume Impingement Effects from Orion Crew Service Module Dual Reaction Control System Engine Firings

    NASA Technical Reports Server (NTRS)

    Prisbell, Andrew; Marichalar, J.; Lumpkin, F.; LeBeau, G.

    2010-01-01

    Plume impingement effects on the Orion Crew Service Module (CSM) were analyzed for various dual Reaction Control System (RCS) engine firings and various configurations of the solar arrays. The study was performed using a decoupled computational fluid dynamics (CFD) and Direct Simulation Monte Carlo (DSMC) approach. This approach included a single jet plume solution for the R1E RCS engine computed with the General Aerodynamic Simulation Program (GASP) CFD code. The CFD solution was used to create an inflow surface for the DSMC solution based on the Bird continuum breakdown parameter. The DSMC solution was then used to model the dual RCS plume impingement effects on the entire CSM geometry with deployed solar arrays. However, because the continuum breakdown parameter of 0.5 could not be achieved due to geometrical constraints and because high resolution in the plume shock interaction region is desired, a focused DSMC simulation modeling only the plumes and the shock interaction region was performed. This high resolution intermediate solution was then used as the inflow to the larger DSMC solution to obtain plume impingement heating, forces, and moments on the CSM and the solar arrays for a total of 21 cases that were analyzed. The results of these simulations were used to populate the Orion CSM Aerothermal Database.

  2. Numerical investigation of rarefaction effects in the vicinity of a sharp leading edge

    NASA Astrophysics Data System (ADS)

    Pan, Shaowu; Gao, Zhenxun; Lee, Chunhian

    2014-12-01

    This paper presents a study of rarefaction effects on hypersonic flow over a sharp leading edge. Both a continuum approach and a kinetic method are employed to simulate the transition regime, with Knudsen numbers ranging from 0.005 to 0.2: a widely used commercial Computational Fluid Dynamics Navier-Stokes-Fourier (CFD-NSF) solver, Fluent, together with a direct simulation Monte Carlo (DSMC) code developed by the authors. It is found that Fluent can predict the wall fluxes for hypersonic argon flow over the sharp leading edge at the lowest Kn considered (Kn = 0.005), while for the other cases it also agrees well with DSMC except near the sharp leading edge. Among the wall fluxes, the pressure coefficient is found to be the most sensitive to rarefaction, while heat transfer is the least sensitive. A parameter based on translational nonequilibrium, with a cut-off value of 0.34, is proposed for continuum breakdown. The structure of the entropy and velocity profiles in the boundary layer is analyzed. It is also found that the ratio of the heat transfer coefficient to the skin friction coefficient remains uniform along the surface for the four cases considered.

  3. The effect of spatial discretization upon traveling wave body forcing of a turbulent wall-bounded flow

    NASA Astrophysics Data System (ADS)

    You, Soyoung; Goldstein, David

    2015-11-01

    DNS is employed to simulate turbulent channel flow subject to a traveling-wave body force field near the wall. The regions in which forces are applied are made progressively more discrete in a sequence of simulations to explore the boundary between the effects of discrete flow actuators and spatially continuous actuation. The continuous body force field is designed to correspond to the ``optimal'' resolvent mode of McKeon and Sharma (2010), whose gain is the leading singular value σ1; that is, the normalized harmonic forcing that produces the largest disturbance energy is the first singular mode, with gain σ1. 2D and 3D resolvent modes are examined at a modest Reτ of 180. For code validation, nominal flow simulations without discretized forcing are compared with previous work by Sharma and Goldstein (2014), for which we find that increasing the forcing amplitude decreases the mean velocity and increases the turbulent kinetic energy. The same force field is then sampled into isolated sub-domains to emulate the effect of discrete physical actuators. Several cases will be presented to explore the dependence of the turbulent flow behavior on the level of discretization.
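
    As a rough illustration of the type of near-wall traveling-wave forcing described above (not the actual DNS forcing amplitudes or the resolvent-mode shapes), the sketch below builds a streamwise-traveling body force confined to a thin wall layer and then masks it onto isolated streamwise patches to mimic discrete actuators; all names and parameters are placeholders.

        import numpy as np

        def traveling_wave_force(x, y, t, amp=1.0, kx=2.0, omega=1.0, y_decay=0.05):
            """Continuous near-wall body force f(x, y, t) = amp * exp(-y/y_decay)
            * sin(kx*x - omega*t): a streamwise traveling wave confined to a thin
            wall layer.  Returns an array of shape (len(x), len(y))."""
            return amp * np.exp(-y / y_decay) * np.sin(kx * x[:, None] - omega * t)

        def discretize_force(f, x, n_actuators=8, duty=0.5):
            """Mask the continuous force onto isolated streamwise patches to emulate
            discrete actuators; duty is the active fraction of each patch."""
            phase = (x * n_actuators / x.max()) % 1.0
            mask = (phase < duty).astype(float)
            return f * mask[:, None]

        x = np.linspace(0.0, 2.0 * np.pi, 256)       # streamwise coordinate
        y = np.linspace(0.0, 0.3, 64)                # wall-normal coordinate
        f_cont = traveling_wave_force(x, y, t=0.0)   # continuous forcing field
        f_disc = discretize_force(f_cont, x)         # actuator-like, discretized forcing
        print(f_cont.shape, f_disc.shape)            # (256, 64) for both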

  4. Modifications to the Conduit Flow Process Mode 2 for MODFLOW-2005

    USGS Publications Warehouse

    Reimann, T.; Birk, S.; Rehrl, C.; Shoemaker, W.B.

    2012-01-01

    As a result of rock dissolution processes, karst aquifers exhibit highly conductive features such as caves and conduits. Within these structures, groundwater flow can become turbulent and therefore be described by nonlinear gradient functions. Some numerical groundwater flow models explicitly account for pipe hydraulics by coupling the continuum model with a pipe network that represents the conduit system. In contrast, the Conduit Flow Process Mode 2 (CFPM2) for MODFLOW-2005 approximates turbulent flow by reducing the hydraulic conductivity within the existing linear head gradient of the MODFLOW continuum model. This approach reduces the practical as well as numerical efforts for simulating turbulence. The original formulation was for large-pore aquifers where the onset of turbulence is at low Reynolds numbers (1 to 100) and not for conduits or pipes. In addition, the existing code requires multiple time steps for convergence due to iterative adjustment of the hydraulic conductivity. Modifications to the existing CFPM2 were made by implementing a generalized power function with a user-defined exponent. This allows for matching turbulence in porous media or pipes and eliminates the time steps required for iterative adjustment of hydraulic conductivity. The modified CFPM2 successfully replicated simple benchmark test problems. © 2011 The Author(s). Ground Water © 2011, National Ground Water Association.
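
    The modification described above replaces the iterative conductivity adjustment with a generalized power function. The sketch below is a minimal, hypothetical illustration of that idea, not the CFPM2 implementation: above a user-defined critical head gradient the effective hydraulic conductivity is scaled so that the flux follows q ~ |grad h|^a (a = 0.5 mimics fully turbulent conduit flow); the exponent, threshold, and values are placeholders.

        import numpy as np

        def effective_conductivity(K_lam, grad_h, grad_crit, a=0.5):
            """Gradient-dependent effective hydraulic conductivity such that the flux
            q = K_eff * |grad h| follows Darcy's law (q ~ |grad h|) below a critical
            gradient and a power law q ~ |grad h|**a above it (a = 0.5 mimics fully
            turbulent conduit flow).  All numbers here are illustrative only."""
            grad_h = np.asarray(grad_h, dtype=float)
            K_eff = np.full_like(grad_h, K_lam)
            turb = grad_h > grad_crit
            # Scale K so the flux is continuous at grad_crit and grows as |grad h|**a beyond it.
            K_eff[turb] = K_lam * (grad_h[turb] / grad_crit) ** (a - 1.0)
            return K_eff

        grads = np.logspace(-4, -1, 7)                      # head gradients (dimensionless)
        K = effective_conductivity(K_lam=1e-2, grad_h=grads, grad_crit=1e-3)
        for g, k in zip(grads, K):
            print(f"|grad h| = {g:.1e}  K_eff = {k:.3e}  q = {k * g:.3e}")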

  5. Some Developments of the Equilibrium Particle Simulation Method for the Direct Simulation of Compressible Flows

    NASA Technical Reports Server (NTRS)

    Macrossan, M. N.

    1995-01-01

    The direct simulation Monte Carlo (DSMC) method is the established technique for the simulation of rarefied gas flows. In some flows of engineering interest, such as occur for aero-braking spacecraft in the upper atmosphere, DSMC can become prohibitively expensive in CPU time because some regions of the flow, particularly on the windward side of blunt bodies, become collision dominated. As an alternative to using a hybrid DSMC and continuum gas solver (Euler or Navier-Stokes solver), this work is aimed at making the particle simulation method efficient in the high density regions of the flow. A high-density, infinite-collision-rate limit of DSMC, the Equilibrium Particle Simulation method (EPSM), was proposed some 15 years ago. EPSM is developed here for the flow of a gas consisting of many different species of molecules and is shown to be computationally efficient (compared to DSMC) for high collision rate flows. It thus offers great potential as part of a hybrid DSMC/EPSM code which could handle flows in the transition regime between rarefied gas flows and fully continuum flows. As a first step towards this goal a pure EPSM code is described. The next step of combining DSMC and EPSM is not attempted here but should be straightforward. EPSM and DSMC are applied to Taylor-Couette flow with Kn = 0.02 and 0.0133 and S(omega) = 3. Toroidal vortices develop for both methods but some differences are found, as might be expected for the given flow conditions. EPSM appears to be less sensitive to the sequence of random numbers used in the simulation than is DSMC and may also be more dissipative. The question of the origin and the magnitude of the dissipation in EPSM is addressed. It is suggested that this analysis is also relevant to DSMC when the usual accuracy requirements on the cell size and decoupling time step are relaxed in the interests of computational efficiency.
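
    EPSM replaces the pairwise collisions of DSMC with a per-cell relaxation to the local equilibrium distribution. As a hedged sketch of that core idea (not Macrossan's implementation), the code below resamples the velocities of the particles in a cell from a Maxwellian at the cell temperature and then recenters and rescales the samples so that the cell's momentum and kinetic energy are conserved exactly.

        import numpy as np

        def epsm_cell_update(v, mass, kB=1.380649e-23, rng=np.random.default_rng()):
            """Replace the velocities of all particles in a cell (v: shape (N, 3))
            with samples from the local Maxwellian, then correct the samples so the
            cell's momentum and kinetic energy are conserved exactly.  This is the
            infinite-collision-rate limit that EPSM exploits."""
            u_mean = v.mean(axis=0)                           # bulk velocity of the cell
            c = v - u_mean                                    # peculiar velocities
            T = mass * (c ** 2).sum() / (3.0 * len(v) * kB)   # cell temperature
            # Draw new thermal velocities from a Maxwellian at temperature T.
            c_new = rng.normal(scale=np.sqrt(kB * T / mass), size=v.shape)
            c_new -= c_new.mean(axis=0)                       # zero net thermal momentum
            # Rescale so the sampled thermal energy matches the original exactly.
            scale = np.sqrt((c ** 2).sum() / (c_new ** 2).sum())
            return u_mean + scale * c_new

        rng = np.random.default_rng(0)
        v_old = rng.normal(loc=[500.0, 0.0, 0.0], scale=300.0, size=(1000, 3))  # m/s, argon-like
        v_new = epsm_cell_update(v_old, mass=6.63e-26, rng=rng)
        print(np.allclose(v_old.mean(axis=0), v_new.mean(axis=0)))   # momentum conserved
        print(np.isclose((v_old**2).sum(), (v_new**2).sum()))        # energy conserved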

  6. Induced seismicity in a salt mine environment evaluated by a coupled continuum-discrete modelling.

    NASA Astrophysics Data System (ADS)

    Mercerat, E.; Souley, M.; Driad, L.; Bernard, P.

    2005-12-01

    Within the framework of a research project launched to assess the feasibility of seismic monitoring of growing underground cavities, this work focuses on two main complementary axes: the validation of seismic monitoring techniques in salt mine environments, and the numerical modelling of deformation and failure mechanisms together with their associated acoustic emissions, i.e. the induced microseismicity. The monitored underground cavity is located at Cerville (Lorraine, France) within a salt layer 180 m deep, and it presents a rather regular cylindrical shape of 100 m diameter. The overburden is characterized by the presence of two competent layers with elasto-brittle behaviour located 50 m above the salt layer. When salt exploitation restarts, the cavity will progressively grow, causing irreversible damage to the upper layers until its final collapse on a time scale of the order of one year. Numerical modelling of such a complex process requires a large-scale model that accounts for both the growing cavity within the salt layer and the mechanical behaviour of the overburden, where high deformation and fracturing are expected. To capture the elasto-brittle behaviour of the competent layers, where most seismic damage is expected, we use the PFC code (Itasca Cons.). For the other layers (mainly composed of marls and salt), which show more ductile and/or viscoplastic behaviour, a continuum approach based on the FLAC code (Itasca Cons.) is employed. Numerous calibration runs were needed to estimate the microproperties used in PFC to reproduce the macroscopic behaviour observed in laboratory tests performed on samples extracted from the competent layers. Since the size of the PFC inclusion representing the brittle material is much larger than the core sample sizes, the scale effect of the microproperties is examined. The next stage is to perform calculations on the basis of the previous macroscopic and microproperty calibration results and compare them with the observed microseismicity in the rock mass.

  7. Herschel-PACS observation of the 10 Myr old T Tauri disk TW Hya. Constraining the disk gas mass

    NASA Astrophysics Data System (ADS)

    Thi, W.-F.; Mathews, G.; Ménard, F.; Woitke, P.; Meeus, G.; Riviere-Marichalar, P.; Pinte, C.; Howard, C. D.; Roberge, A.; Sandell, G.; Pascucci, I.; Riaz, B.; Grady, C. A.; Dent, W. R. F.; Kamp, I.; Duchêne, G.; Augereau, J.-C.; Pantin, E.; Vandenbussche, B.; Tilling, I.; Williams, J. P.; Eiroa, C.; Barrado, D.; Alacid, J. M.; Andrews, S.; Ardila, D. R.; Aresu, G.; Brittain, S.; Ciardi, D. R.; Danchi, W.; Fedele, D.; de Gregorio-Monsalvo, I.; Heras, A.; Huelamo, N.; Krivov, A.; Lebreton, J.; Liseau, R.; Martin-Zaidi, C.; Mendigutía, I.; Montesinos, B.; Mora, A.; Morales-Calderon, M.; Nomura, H.; Phillips, N.; Podio, L.; Poelman, D. R.; Ramsay, S.; Rice, K.; Solano, E.; Walker, H.; White, G. J.; Wright, G.

    2010-07-01

    Planets are formed in disks around young stars. With an age of ~10 Myr, TW Hya is one of the nearest T Tauri stars that is still surrounded by a relatively massive disk. In addition, a large number of molecules have been found in the TW Hya disk, making TW Hya the perfect test case in a large survey of disks with Herschel-PACS to directly study their gaseous component. We aim to constrain the gas and dust mass of the circumstellar disk around TW Hya. We observed the fine-structure lines of [O i] and [C ii] as part of the open-time large program GASPS. We complement this with continuum data and ground-based 12CO 3-2 and 13CO 3-2 observations. We simultaneously model the continuum and the line fluxes with the 3D Monte-Carlo code MCFOST and the thermo-chemical code ProDiMo to derive the gas and dust masses. We detect the [O i] line at 63 μm. The other lines that were observed, [O i] at 145 μm and [C ii] at 157 μm, are not detected. No extended emission has been found. Preliminary modeling of the photometric and line data assuming [12CO]/[13CO] = 69 suggests a dust mass for grains with radius <1 mm of ~1.9 × 10^-4 M⊙ (total solid mass of 3 × 10^-3 M⊙) and a gas mass of (0.5-5) × 10^-3 M⊙. The gas-to-dust mass ratio may be lower than the standard interstellar value of 100. Herschel is an ESA space observatory with science instruments provided by Principal Investigator consortia. It is open for proposals for observing time from the worldwide astronomical community. The Appendix is only available in electronic form at http://www.aanda.org

  8. Broadly continuously tunable slot waveguide quantum cascade lasers based on a continuum-to-continuum active region design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meng, Bo; Zeng, Yong Quan; Liang, Guozhen

    2015-09-14

    We report our progress in the development of broadly tunable single-mode slot waveguide quantum cascade lasers based on a continuum-to-continuum active region design. The electroluminescence spectrum of the continuum-to-continuum active region design has a full width at half maximum of 440 cm^-1 at a center wavelength of ∼10 μm at room temperature (300 K). Devices using the optimized slot waveguide structure and the continuum-to-continuum design can be tuned continuously over a lasing emission range of 42 cm^-1, from 9.74 to 10.16 μm, at room temperature using only a current-tuning scheme, with a side-mode suppression ratio above 15 dB over the whole tuning range.

  9. Convective Heating of the LIFE Engine Target During Injection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holdener, D S; Tillack, M S; Wang, X R

    2011-10-24

    Target survival in the hostile, high temperature xenon environment of the proposed Laser Inertial Fusion Energy (LIFE) engine is critical. This work focuses on the flow properties and convective heat load imposed upon the surface of the indirect drive target while traveling through the xenon gas. While this rarefied flow is traditionally characterized as being within the continuum regime, it is approaching transition where conventional CFD codes reach their bounds of operation. Thus ANSYS, specifically the Navier-Stokes module CFX, will be used in parallel with the direct simulation Monte Carlo code DS2V and analytically and empirically derived expressions for heat transfer to the hohlraum for validation. Comparison of the viscous and thermal boundary layers of ANSYS and DS2V showed them to be nearly identical, with the surface heat flux varying less than 8% on average. From the results herein, external baffles have been shown to reduce this heat transfer to the sensitive laser entrance hole (LEH) windows and optimize target survival independent of other reactor parameters.

  10. DAMAGE MODELING OF INJECTION-MOLDED SHORT- AND LONG-FIBER THERMOPLASTICS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nguyen, Ba Nghiep; Kunc, Vlastimil; Bapanapalli, Satish K.

    2009-10-30

    This article applies the recent anisotropic rotary diffusion – reduced strain closure (ARD-RSC) model for predicting fiber orientation and a new damage model for injection-molded long-fiber thermoplastics (LFTs) to analyze progressive damage leading to total failure of injection-molded long-glass-fiber/polypropylene (PP) specimens. The ARD-RSC model was implemented in a research version of the Autodesk Moldflow Plastics Insight (MPI) processing code, and it has been used to simulate injection-molding of a long-glass-fiber/PP plaque. The damage model combines micromechanical modeling with a continuum damage mechanics description to predict the nonlinear behavior due to plasticity coupled with damage in LFTs. This model has been implemented in the ABAQUS finite element code via user-subroutines and has been used in the damage analyses of tensile specimens removed from the injection-molded long-glass-fiber/PP plaques. Experimental characterization and mechanical testing were performed to provide input data to support and validate both process modeling and damage analyses. The predictions are in agreement with the experimental results.

  11. Process metallurgy simulation for metal drawing process optimization by using two-scale finite element method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nakamachi, Eiji; Yoshida, Takashi; Yamaguchi, Toshihiko

    2014-10-06

    We developed a two-scale FE analysis procedure based on the crystallographic homogenization method, considering the hierarchical structure of polycrystalline aluminium alloy. It can be characterized as the combination of a two-scale structure: the microscopic polycrystal structure and the macroscopic elastic-plastic continuum. The micro polycrystal structure is modeled as a three-dimensional representative volume element (RVE). The RVE is discretized with 3×3×3 eight-node solid finite elements, carrying 216 crystal orientations. This FE analysis code can predict the deformation, strain, and stress evolutions in the wire drawing process at the macro-scale, and the crystal texture and hardening evolutions at the micro-scale. In this study, we analyzed the texture evolution in wire drawing with our two-scale FE analysis code for various die drawing angles. We evaluate the texture evolution in the surface and center regions of the wire cross-section and clarify the effects of the processing conditions on the texture evolution.
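
    The micro model above assigns one crystal orientation to each Gauss point of a 3×3×3 block of eight-node elements (27 × 8 = 216 orientations). The sketch below only illustrates that bookkeeping by drawing 216 uniformly random orientations as unit quaternions and mapping them to (element, Gauss point) slots; it is not the authors' texture initialization, and the random-orientation assumption is ours.

        import numpy as np

        def random_orientations(n, rng):
            """Draw n uniformly random crystal orientations as unit quaternions
            (Shoemake's method for a uniform distribution on SO(3))."""
            u1, u2, u3 = rng.random((3, n))
            q = np.stack([np.sqrt(1 - u1) * np.sin(2 * np.pi * u2),
                          np.sqrt(1 - u1) * np.cos(2 * np.pi * u2),
                          np.sqrt(u1) * np.sin(2 * np.pi * u3),
                          np.sqrt(u1) * np.cos(2 * np.pi * u3)], axis=1)
            return q                                            # shape (n, 4), unit norm

        rng = np.random.default_rng(42)
        n_elems, n_gauss = 3 * 3 * 3, 8                         # 27 elements x 8 Gauss points
        quats = random_orientations(n_elems * n_gauss, rng)     # 216 orientations
        # Map each orientation to its (element, Gauss point) slot in the RVE.
        rve_orientations = quats.reshape(n_elems, n_gauss, 4)
        print(rve_orientations.shape)                           # (27, 8, 4)
        print(np.allclose(np.linalg.norm(quats, axis=1), 1.0))  # all unit quaternions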

  12. Impact of Martian atmosphere parameter uncertainties on entry vehicles aerodynamic for hypersonic rarefied conditions

    NASA Astrophysics Data System (ADS)

    Fei, Huang; Xu-hong, Jin; Jun-ming, Lv; Xiao-li, Cheng

    2016-11-01

    An attempt has been made to analyze the impact of Martian atmosphere parameter uncertainties on entry vehicle aerodynamics in hypersonic rarefied conditions with a DSMC code. The code has been validated by comparing the present computational results with Viking vehicle flight data. Then, by simulating flows around the Mars Science Laboratory, the impact of free-stream parameter uncertainties on aerodynamics is investigated. The validation results show that the present numerical approach agrees well with the Viking flight data. The physical and chemical properties of CO2 have a strong impact on the aerodynamics of Mars entry vehicles, so it is necessary to apply proper corrections to data obtained with an air model in hypersonic rarefied conditions, which is consistent with the conclusions drawn in the continuum regime. Uncertainties in free-stream density and velocity weakly influence the aerodynamics and pitching moment, and the aerodynamics is little influenced by free-stream temperature, for which the maximum error is below 0.5%. The center-of-pressure position is not sensitive to the free-stream parameters.

  13. Process metallurgy simulation for metal drawing process optimization by using two-scale finite element method

    NASA Astrophysics Data System (ADS)

    Nakamachi, Eiji; Yoshida, Takashi; Kuramae, Hiroyuki; Morimoto, Hideo; Yamaguchi, Toshihiko; Morita, Yusuke

    2014-10-01

    We developed a two-scale FE analysis procedure based on the crystallographic homogenization method, considering the hierarchical structure of polycrystalline aluminium alloy. It can be characterized as the combination of a two-scale structure: the microscopic polycrystal structure and the macroscopic elastic-plastic continuum. The micro polycrystal structure is modeled as a three-dimensional representative volume element (RVE). The RVE is discretized with 3×3×3 eight-node solid finite elements, carrying 216 crystal orientations. This FE analysis code can predict the deformation, strain, and stress evolutions in the wire drawing process at the macro-scale, and the crystal texture and hardening evolutions at the micro-scale. In this study, we analyzed the texture evolution in wire drawing with our two-scale FE analysis code for various die drawing angles. We evaluate the texture evolution in the surface and center regions of the wire cross-section and clarify the effects of the processing conditions on the texture evolution.

  14. Notes on the KIVA-2 software and chemically reactive fluid mechanics

    NASA Astrophysics Data System (ADS)

    Holst, M. J.

    1992-09-01

    Working notes regarding the mechanics of chemically reactive fluids with sprays, and their numerical simulation with the KIVA-2 software, are presented. KIVA-2 is a large FORTRAN program developed at Los Alamos National Laboratory for internal combustion engine simulation. It is our hope that these notes summarize some of the necessary background material in fluid mechanics and combustion, explain the numerical methods currently used in KIVA-2 and similar combustion codes, and provide an outline of the overall structure of KIVA-2 as a representative combustion program, in order to aid the researcher in the task of implementing KIVA-2 or a similar combustion code on a massively parallel computer. The notes are organized into three parts as follows. In Part 1, a brief introduction to continuum mechanics, to fluid mechanics, and to the mechanics of chemically reactive fluids with sprays is presented. In Part 2, a close look at the governing equations of KIVA-2 is taken, and the methods employed in the numerical solution of these equations are discussed. Some conclusions are drawn and some observations are made in Part 3.

  15. Improved Strength and Damage Modeling of Geologic Materials

    NASA Astrophysics Data System (ADS)

    Stewart, Sarah; Senft, Laurel

    2007-06-01

    Collisions and impact cratering events are important processes in the evolution of planetary bodies. The time and length scales of planetary collisions, however, are inaccessible in the laboratory and require the use of shock physics codes. We present the results from a new rheological model for geological materials implemented in the CTH code [1]. The `ROCK' model includes pressure, temperature, and damage effects on strength, as well as acoustic fluidization during impact crater collapse. We demonstrate that the model accurately reproduces final crater shapes, tensile cracking, and damaged zones from laboratory to planetary scales. The strength model requires basic material properties; hence, the input parameters may be benchmarked to laboratory results and extended to planetary collision events. We show the effects of varying material strength parameters, which are dependent on both scale and strain rate, and discuss choosing appropriate parameters for laboratory and planetary situations. The results are a significant improvement in models of continuum rock deformation during large scale impact events. [1] Senft, L. E., Stewart, S. T. Modeling Impact Cratering in Layered Surfaces, J. Geophys. Res., submitted.
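
    As a hedged illustration of the general structure of pressure-, temperature-, and damage-dependent rock strength models of this kind (not the exact ROCK formulation in CTH), the sketch below blends a pressure-hardening intact yield envelope with a frictional damaged envelope through a scalar damage parameter D and applies a simple thermal-softening factor; every coefficient is a placeholder.

        import numpy as np

        def yield_strength(P, T, D, Y0=10e6, mu_i=1.2, Y_lim=2.5e9,
                           mu_d=0.7, T_melt=1500.0, xi=1.2):
            """Illustrative damage- and temperature-dependent rock strength:
               intact envelope  Y_i = Y0 + mu_i*P / (1 + mu_i*P/(Y_lim - Y0))
               damaged envelope Y_d = min(mu_d*P, Y_i)
               blended          Y   = ((1 - D)*Y_i + D*Y_d) * thermal_softening(T).
            Placeholder coefficients; the published ROCK parameters differ."""
            Y_i = Y0 + mu_i * P / (1.0 + mu_i * P / (Y_lim - Y0))
            Y_d = np.minimum(mu_d * P, Y_i)
            soft = np.clip(1.0 - (T / T_melt) ** xi, 0.0, 1.0)   # vanishes at the melt point
            return ((1.0 - D) * Y_i + D * Y_d) * soft

        P = np.array([1e6, 1e8, 1e9])              # pressure, Pa
        print(yield_strength(P, T=300.0, D=0.0))   # intact, cold rock
        print(yield_strength(P, T=300.0, D=1.0))   # fully damaged (frictional)
        print(yield_strength(P, T=1400.0, D=0.5))  # hot, partially damaged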

  16. Derivation of effective fission gas diffusivities in UO2 from lower length scale simulations and implementation of fission gas diffusion models in BISON

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andersson, Anders David Ragnar; Pastore, Giovanni; Liu, Xiang-Yang

    2014-11-07

    This report summarizes the development of new fission gas diffusion models from lower length scale simulations and assessment of these models in terms of annealing experiments and fission gas release simulations using the BISON fuel performance code. Based on the mechanisms established from density functional theory (DFT) and empirical potential calculations, continuum models for diffusion of xenon (Xe) in UO2 were derived for both intrinsic conditions and under irradiation. The importance of the large XeU3O cluster (a Xe atom in a uranium + oxygen vacancy trap site with two bound uranium vacancies) is emphasized, which is a consequence of its high mobility and stability. These models were implemented in the MARMOT phase field code, which is used to calculate effective Xe diffusivities for various irradiation conditions. The effective diffusivities were used in BISON to calculate fission gas release for a number of test cases. The results are assessed against experimental data and future directions for research are outlined based on the conclusions.
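
    The continuum fission-gas models ultimately hand BISON an effective Xe diffusivity as a function of temperature and fission rate. The sketch below shows the familiar three-term structure of such correlations (intrinsic thermal, irradiation-enhanced, and athermal contributions, in the spirit of Turnbull-type fits); the coefficients are illustrative placeholders, not the values derived from the DFT/MARMOT work summarized in this report.

        import numpy as np

        def effective_xe_diffusivity(T, fission_rate,
                                     d1=7.6e-10, q1=35000.0,     # intrinsic thermal term
                                     d2=1.4e-25, q2=13800.0,     # irradiation-enhanced term
                                     d3=2.0e-40):                # athermal term
            """Illustrative three-term effective Xe diffusivity (m^2/s):
                 D = d1*exp(-q1/T) + d2*sqrt(Fdot)*exp(-q2/T) + d3*Fdot
            with T in K and Fdot the fission rate density in fissions/(m^3 s).
            Coefficients are placeholders in the spirit of Turnbull-type correlations."""
            T = np.asarray(T, dtype=float)
            return (d1 * np.exp(-q1 / T)
                    + d2 * np.sqrt(fission_rate) * np.exp(-q2 / T)
                    + d3 * fission_rate)

        for T in (1000.0, 1400.0, 1800.0):
            print(T, effective_xe_diffusivity(T, fission_rate=1e19))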

  17. Study of the L-mode tokamak plasma “shortfall” with local and global nonlinear gyrokinetic δf particle-in-cell simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chowdhury, J.; Wan, Weigang; Chen, Yang

    2014-11-15

    The δf particle-in-cell code GEM is used to study the transport “shortfall” problem of gyrokinetic simulations. In local simulations, the GEM results confirm the previously reported simulation results of the DIII-D [Holland et al., Phys. Plasmas 16, 052301 (2009)] and Alcator C-Mod [Howard et al., Nucl. Fusion 53, 123011 (2013)] tokamaks obtained with the continuum code GYRO. Namely, for DIII-D the simulations closely predict the ion heat flux in the core while substantially underpredicting transport towards the edge; for Alcator C-Mod, the simulations agree with the experimental values of ion heat flux, at least within the range of experimental error. Global simulations are carried out for DIII-D L-mode plasmas to study the effect of edge turbulence on the outer core ion heat transport. The edge turbulence enhances the outer core ion heat transport through turbulence spreading. However, this edge turbulence spreading effect is not enough to explain the transport underprediction.

  18. Enhanced Verification Test Suite for Physics Simulation Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kamm, J R; Brock, J S; Brandon, S T

    2008-10-10

    This document discusses problems with which to augment, in quantity and in quality, the existing tri-laboratory suite of verification problems used by Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL). The purpose of verification analysis is to demonstrate whether the numerical results of the discretization algorithms in physics and engineering simulation codes provide correct solutions of the corresponding continuum equations. The key points of this document are: (1) Verification deals with mathematical correctness of the numerical algorithms in a code, while validation deals with physical correctness of a simulation in a regime of interest. This document is about verification. (2) The current seven-problem Tri-Laboratory Verification Test Suite, which has been used for approximately five years at the DOE WP laboratories, is limited. (3) Both the methodology for and technology used in verification analysis have evolved and been improved since the original test suite was proposed. (4) The proposed test problems are in three basic areas: (a) Hydrodynamics; (b) Transport processes; and (c) Dynamic strength-of-materials. (5) For several of the proposed problems we provide a 'strong sense verification benchmark', consisting of (i) a clear mathematical statement of the problem with sufficient information to run a computer simulation, (ii) an explanation of how the code result and benchmark solution are to be evaluated, and (iii) a description of the acceptance criterion for simulation code results. (6) It is proposed that the set of verification test problems with which any particular code is evaluated include some of the problems described in this document. Analysis of the proposed verification test problems constitutes part of a necessary--but not sufficient--step that builds confidence in physics and engineering simulation codes. More complicated test cases, including physics models of greater sophistication or other physics regimes (e.g., energetic material response, magneto-hydrodynamics), would represent a scientifically desirable complement to the fundamental test cases discussed in this report. The authors believe that this document can be used to enhance the verification analyses undertaken at the DOE WP Laboratories and, thus, to improve the quality, credibility, and usefulness of the simulation codes that are analyzed with these problems.
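
    Verification of the kind discussed above usually reduces to comparing code output with an exact continuum solution on successively refined grids and checking the observed order of convergence against the scheme's design order. The sketch below is a generic, hypothetical example of that bookkeeping using a manufactured exact solution; it is not one of the tri-laboratory test problems.

        import numpy as np

        def l2_error(numerical, exact, dx):
            """Grid-weighted L2 norm of the pointwise error."""
            return np.sqrt(np.sum((numerical - exact) ** 2) * dx)

        def observed_order(err_coarse, err_fine, refinement_ratio=2.0):
            """Observed order of convergence p from errors on two grids."""
            return np.log(err_coarse / err_fine) / np.log(refinement_ratio)

        # Hypothetical "code output": a 2nd-order central-difference derivative of sin(x),
        # verified against the exact derivative cos(x) on a periodic grid.
        def code_result(n):
            x = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
            dx = x[1] - x[0]
            u = np.sin(x)
            dudx = (np.roll(u, -1) - np.roll(u, 1)) / (2.0 * dx)   # periodic stencil
            return l2_error(dudx, np.cos(x), dx)

        e_coarse, e_fine = code_result(64), code_result(128)
        print("observed order:", observed_order(e_coarse, e_fine))  # ~2, matching design order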

  19. Astrocytes in the tempest of multiple sclerosis.

    PubMed

    Miljković, Djordje; Timotijević, Gordana; Mostarica Stojković, Marija

    2011-12-01

    Astrocytes are the most abundant cell population within the CNS of mammals. Their glial role is perfectly performed in the healthy CNS as they support functions of neurons. The omnipresence of astrocytes throughout the white and grey matter and their intimate relation with blood vessels of the CNS, as well as numerous immunity-related actions that these cells are capable of, imply that astrocytes should have a prominent role in neuroinflammatory disorders, such as multiple sclerosis (MS). The role of astrocytes in MS is rather ambiguous, as they have the capacity to both stimulate and restrain neuroinflammation and tissue destruction. In this paper we present some of the proved and the proposed functions of astrocytes in neuroinflammation and discuss the effect of MS therapeutics on astrocytes. Copyright © 2011 Federation of European Biochemical Societies. Published by Elsevier B.V. All rights reserved.

  20. Manufacturing of ArF chromeless hard shifter for 65-nm technology

    NASA Astrophysics Data System (ADS)

    Park, Keun-Taek; Dieu, Laurent; Hughes, Greg P.; Green, Kent G.; Croffie, Ebo H.; Taravade, Kunal N.

    2003-12-01

    For logic design, the chromeless phase-shift mask is one possible solution for defining small geometries with a low MEF (mask enhancement factor) at the 65 nm node. There have been many dedicated studies of PCO (Phase Chrome Off-axis) mask technology, and several design approaches have been proposed, including grating backgrounds and chrome patches (or chrome shields) for applying PCO to line/space and contact patterns. In this paper we study the feasibility of the grating design for line and contact patterns. The grating pattern design was derived from the EM simulation software (TEMPEST) and aerial image simulation software. AIMS measurements with high-NA annular illumination were performed, and resist images of the designed patterns were taken at different focus settings. Simulations and AIMS measurements are compared against wafer printing performance to verify the consistency of the process.

  1. Sturm und Drang: The turbulent, magnetic tempest in the Galactic center

    NASA Astrophysics Data System (ADS)

    Lacki, Brian C.

    2014-05-01

    The Galactic center central molecular zone (GCCMZ) bears similarities with extragalactic starburst regions, including a high supernova (SN) rate density. As in other starbursts like M82, the frequent SNe can heat the ISM until it is filled with a hot (˜ 4 × 107 K) superwind. Furthermore, the random forcing from SNe stirs up the wind, powering Mach 1 turbulence. I argue that a turbulent dynamo explains the strong magnetic fields in starbursts, and I predict an average B ˜70 μG in the GCCMZ. I demonstrate how the SN driving of the ISM leads to equipartition between various pressure components in the ISM. The SN-heated wind escapes the center, but I show that it may be stopped in the Galactic halo. I propose that the Fermi bubbles are the wind's termination shock.
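
    The turbulent-dynamo argument above amounts to equating the magnetic energy density with the turbulent kinetic energy density of the hot wind. A back-of-the-envelope sketch of that equipartition estimate, B = sqrt(8*pi * (1/2) rho v^2) in Gaussian units, is given below; the density and velocity are placeholder values chosen only to land in the tens-of-microgauss range, not numbers taken from the paper.

        import numpy as np

        # Equipartition in Gaussian-cgs units: B^2 / (8*pi) = (1/2) * rho * v^2.
        def equipartition_B(n, v_kms, mean_mass=1.0e-24):
            """Magnetic field (gauss) for equipartition with turbulent kinetic energy.
            n: particle number density [cm^-3]; v_kms: turbulent velocity [km/s];
            mean_mass: mean mass per particle [g] (~1e-24 g for mu ~ 0.6)."""
            rho = n * mean_mass                 # g/cm^3
            v = v_kms * 1.0e5                   # cm/s
            u_turb = 0.5 * rho * v ** 2         # erg/cm^3
            return np.sqrt(8.0 * np.pi * u_turb)

        # Placeholder wind parameters (illustrative only, not taken from the paper):
        B = equipartition_B(n=0.03, v_kms=1000.0)
        print(f"B ~ {B * 1e6:.0f} microgauss")   # lands in the tens-of-microgauss range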

  2. Simulations of pattern dynamics for reaction-diffusion systems via SIMULINK

    PubMed Central

    2014-01-01

    Background: Investigation of the nonlinear pattern dynamics of a reaction-diffusion system almost always requires numerical solution of the system’s set of defining differential equations. Traditionally, this would be done by selecting an appropriate differential equation solver from a library of such solvers, then writing computer codes (in a programming language such as C or Matlab) to access the selected solver and display the integrated results as a function of space and time. This “code-based” approach is flexible and powerful, but requires a certain level of programming sophistication. A modern alternative is to use a graphical programming interface such as Simulink to construct a data-flow diagram by assembling and linking appropriate code blocks drawn from a library. The result is a visual representation of the inter-relationships between the state variables whose output can be made completely equivalent to the code-based solution. Results: As a tutorial introduction, we first demonstrate application of the Simulink data-flow technique to the classical van der Pol nonlinear oscillator, and compare Matlab and Simulink coding approaches to solving the van der Pol ordinary differential equations. We then show how to introduce space (in one and two dimensions) by solving numerically the partial differential equations for two different reaction-diffusion systems: the well-known Brusselator chemical reactor, and a continuum model for a two-dimensional sheet of human cortex whose neurons are linked by both chemical and electrical (diffusive) synapses. We compare the relative performances of the Matlab and Simulink implementations. Conclusions: The pattern simulations by Simulink are in good agreement with theoretical predictions. Compared with traditional coding approaches, the Simulink block-diagram paradigm reduces the time and programming burden required to implement a solution for reaction-diffusion systems of equations. Construction of the block-diagram does not require high-level programming skills, and the graphical interface lends itself to easy modification and use by non-experts. PMID:24725437
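
    For readers without Matlab or Simulink, a minimal Python analogue of the code-based route through the van der Pol tutorial example is sketched below (SciPy's solve_ivp standing in for a Matlab ODE solver); it reproduces the standard relaxation-oscillation behaviour but is not the authors' implementation.

        import numpy as np
        from scipy.integrate import solve_ivp

        def van_der_pol(t, y, mu=1.0):
            """Classical van der Pol oscillator:
               x'' - mu*(1 - x^2)*x' + x = 0, written as a first-order system."""
            x, xdot = y
            return [xdot, mu * (1.0 - x ** 2) * xdot - x]

        sol = solve_ivp(van_der_pol, t_span=(0.0, 30.0), y0=[2.0, 0.0],
                        args=(5.0,), max_step=0.01)       # mu = 5: relaxation oscillations

        # The limit-cycle amplitude of x stays close to 2 regardless of mu.
        print("max |x| on the attractor:", np.abs(sol.y[0][sol.t > 10.0]).max())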

  3. Diagnosing somatisation disorder (P75) in routine general practice using the International Classification of Primary Care.

    PubMed

    Schaefert, Rainer; Laux, Gunter; Kaufmann, Claudia; Schellberg, Dieter; Bölter, Regine; Szecsenyi, Joachim; Sauer, Nina; Herzog, Wolfgang; Kuehlein, Thomas

    2010-09-01

    (i) To analyze general practitioners' diagnosis of somatisation disorder (P75) using the International Classification of Primary Care (ICPC)-2-E in routine general practice. (ii) To validate the distinctiveness of the ICD-10 to ICPC-2 conversion rule which maps ICD-10 dissociative/conversion disorder (F44) as well as half of the somatoform categories (F45.0-2) to P75 and codes the other half of these disorders (F45.3-9), including autonomic organ dysfunctions and pain syndromes, as symptom diagnoses plus a psychosocial code in a multiaxial manner. Cross-sectional analysis of routine data from a German research database comprising the electronic patient records of 32 general practitioners from 22 practices. For each P75 patient, control subjects matched for age, gender, and practice were selected from the 2007 yearly contact group (YCG) without a P75 diagnosis using a propensity-score algorithm that resulted in eight controls per P75 patient. Of the 49,423 patients in the YCG, P75 was diagnosed in 0.6% (302) and F45.3-9 in 1.8% (883) of cases; overall, somatisation syndromes were diagnosed in 2.4% of patients. The P75 coding pattern coincided with typical characteristics of severe, persistent medically unexplained symptoms (MUS). F45.3-9 was found to indicate moderate MUS that otherwise showed little clinical difference from P75. Pain syndromes exhibited an unspecific coding pattern. Mild and moderate MUS were predominantly recorded as symptom diagnoses. Psychosocial codes were rarely documented. ICPC-2 P75 was mainly diagnosed in cases of severe MUS. Multiaxial coding appears to be too complicated for routine primary care. Instead of splitting P75 and F45.3-9 diagnoses, it is proposed that the whole MUS spectrum should be conceptualized as a continuum model comprising categorizations of uncomplicated (mild) and complicated (moderate and severe) courses. Psychosocial factors require more attention. Copyright (c) 2010 Elsevier Inc. All rights reserved.

  4. Simulations of pattern dynamics for reaction-diffusion systems via SIMULINK.

    PubMed

    Wang, Kaier; Steyn-Ross, Moira L; Steyn-Ross, D Alistair; Wilson, Marcus T; Sleigh, Jamie W; Shiraishi, Yoichi

    2014-04-11

    Investigation of the nonlinear pattern dynamics of a reaction-diffusion system almost always requires numerical solution of the system's set of defining differential equations. Traditionally, this would be done by selecting an appropriate differential equation solver from a library of such solvers, then writing computer codes (in a programming language such as C or Matlab) to access the selected solver and display the integrated results as a function of space and time. This "code-based" approach is flexible and powerful, but requires a certain level of programming sophistication. A modern alternative is to use a graphical programming interface such as Simulink to construct a data-flow diagram by assembling and linking appropriate code blocks drawn from a library. The result is a visual representation of the inter-relationships between the state variables whose output can be made completely equivalent to the code-based solution. As a tutorial introduction, we first demonstrate application of the Simulink data-flow technique to the classical van der Pol nonlinear oscillator, and compare Matlab and Simulink coding approaches to solving the van der Pol ordinary differential equations. We then show how to introduce space (in one and two dimensions) by solving numerically the partial differential equations for two different reaction-diffusion systems: the well-known Brusselator chemical reactor, and a continuum model for a two-dimensional sheet of human cortex whose neurons are linked by both chemical and electrical (diffusive) synapses. We compare the relative performances of the Matlab and Simulink implementations. The pattern simulations by Simulink are in good agreement with theoretical predictions. Compared with traditional coding approaches, the Simulink block-diagram paradigm reduces the time and programming burden required to implement a solution for reaction-diffusion systems of equations. Construction of the block-diagram does not require high-level programming skills, and the graphical interface lends itself to easy modification and use by non-experts.

  5. Aerodynamic characterization of the jet of an arc wind tunnel

    NASA Astrophysics Data System (ADS)

    Zuppardi, Gennaro; Esposito, Antonio

    2016-11-01

    It is well known that, due to the very aggressive environment and the rather high rarefaction level of an arc wind tunnel jet, the measurement of fluid-dynamic parameters is difficult. For this reason, the aerodynamic characterization of the jet also relies on computer codes simulating the operation of the tunnel. The present authors have already used such a computing procedure successfully for tests in the arc wind tunnel (SPES) in Naples, Italy. In the present work an improved procedure is proposed. Like the former procedure, it relies on two codes working in tandem: (1) a one-dimensional code simulating the inviscid, thermally non-conducting flow field in the torch, the mixing chamber, and the nozzle up to the location of continuum breakdown along the nozzle axis; (2) a Direct Simulation Monte Carlo (DSMC) code simulating the flow field in the remaining part of the nozzle. In the present procedure, the DSMC computation covers both the nozzle and the test chamber. An interesting problem considered here with the present procedure is the simulation of the flow field around a Pitot tube and of the related measurement of the stagnation pressure. Under rarefied conditions the measured stagnation pressure may be as much as four times the theoretical value, so a substantial correction has to be applied to the measured pressure. A correction factor for the stagnation pressure measured in SPES is proposed, based on twelve tests made in SPES.

  6. Navigating the changing learning landscape: perspective from bioinformatics.ca

    PubMed Central

    Ouellette, B. F. Francis

    2013-01-01

    With the advent of YouTube channels in bioinformatics, open platforms for problem solving in bioinformatics, active web forums in computing analyses and online resources for learning to code or use a bioinformatics tool, the more traditional continuing education bioinformatics training programs have had to adapt. Bioinformatics training programs that solely rely on traditional didactic methods are being superseded by these newer resources. Yet such face-to-face instruction is still invaluable in the learning continuum. Bioinformatics.ca, which hosts the Canadian Bioinformatics Workshops, has blended more traditional learning styles with current online and social learning styles. Here we share our growing experiences over the past 12 years and look toward what the future holds for bioinformatics training programs. PMID:23515468

  7. Determining mechanical behavior of solid materials using miniature specimens

    DOEpatents

    Manahan, M.P.; Argon, A.S.; Harling, O.K.

    1986-02-04

    A Miniaturized Bend Test (MBT) capable of extracting and determining mechanical behavior information from specimens only so large as to have at least a volume or smallest dimension sufficient to satisfy continuum behavior in all directions is disclosed. The mechanical behavior of the material is determined from the measurements taken during the bending of the specimen and is processed according to the principles of linear or nonlinear material mechanics or both. In a preferred embodiment the determination is carried out by a code which is constructed according to the finite element method, and the specimen used for the determinations is a miniature disk simply supported for central loading at the axis on the center of the disk. 51 figs.

  8. The interaction between fishbone modes and shear Alfvén waves in tokamak plasmas

    NASA Astrophysics Data System (ADS)

    He, Hongda; Liu, Yueqiang; Dong, J. Q.; Hao, G. Z.; Wu, Tingting; He, Zhixiong; Zhao, K.

    2016-05-01

    The resonant interaction between the energetic particle triggered fishbone mode and the shear Alfvén waves is computationally investigated and firmly demonstrated based on a tokamak plasma equilibrium, using the self-consistent MHD-kinetic hybrid code MARS-K (Liu et al 2008 Phys. Plasmas 15 112503). This type of continuum resonance, occurring critically due to the mode’s toroidal rotation in the plasma frame, significantly modifies the eigenmode structure of the fishbone instability, by introducing two large peaks of the perturbed parallel current density near but offside the q  =  1 rational surface (q is the safety factor). The self-consistently computed radial plasma displacement substantially differs from that being assumed in the conventional fishbone theory.

  9. Detection and Characterization of the Photospheric and Coronal Magnetic Fields of Sunspot Groups: Implications for Flare Forecasting

    DTIC Science & Technology

    2015-05-25

    Techniques & Data used.    For this project data was used primarily from two instruments, HMI and AIA onboard SDO...automatically using SMART­DF code (shown in  circles) using  HMI  data (left side). The same region observed in high resolution using  1m­Swedish Solar Telescope...simultaneous measurements of continuum  intensity and LOS magnetic field to segment ARs into umbrae, penumbrae and plage. Data  from the  HMI  instrument

  10. Quasi-Continuum Reduction of Field Theories: A Route to Seamlessly Bridge Quantum and Atomistic Length-Scales with Continuum

    DTIC Science & Technology

    2016-04-01

    AFRL-AFOSR-VA-TR-2016-0145. Quasi-continuum reduction of field theories: a route to seamlessly bridge quantum and atomistic length-scales with continuum. Principal Investigator: Vikram Gavini, Department of... The developed methods enable calculations on tens of thousands of atoms and support continuing efforts towards a seamless bridging of the quantum and continuum length-scales.

  11. Residence in Rural Areas of the United States and Lung Cancer Mortality. Disease Incidence, Treatment Disparities, and Stage-Specific Survival.

    PubMed

    Atkins, Graham T; Kim, Taeha; Munson, Jeffrey

    2017-03-01

    There is increased lung cancer mortality in rural areas of the United States. However, it remains unclear to what extent rural-urban differences in disease incidence, stage at diagnosis, or treatment explain this finding. To explore the relationship between smoking rates, lung cancer incidence, and lung cancer mortality in populations across the rural-urban continuum and to determine whether survival is decreased in rural patients diagnosed with lung cancer and whether this is associated with rural-urban differences in stage at diagnosis or the treatment received. We conducted a retrospective cohort study of 348,002 patients diagnosed with lung cancer between 2000 and 2006. Data from metropolitan, urban, suburban, and rural areas in the United States were obtained from the Surveillance, Epidemiology, and End Results program database. County-level population estimates for 2003 were obtained from the U.S. Census Bureau, and corresponding estimates of smoking prevalence were obtained from published literature. The exposure was rurality, defined by the rural-urban continuum code area linked to each cohort participant by county of residence. Outcomes included lung cancer incidence, mortality, diagnostic stage, and treatment received. Lung cancer mortality increased with rurality in a dose-dependent fashion across the rural-urban continuum. The most rural areas had almost twice the smoking prevalence and lung cancer incidence of the largest metropolitan areas. Rural patients diagnosed with stage I non-small cell lung cancer underwent fewer surgeries (69% vs. 75%; P < 0.001) and had significantly reduced median survival (40 vs. 52 mo; P = 0.0006) compared with the most urban patients. Stage at diagnosis was similar across the rural-urban continuum, as was median survival for patients with stages II-IV lung cancer. Higher rural smoking rates drive increased disease incidence and per capita lung cancer mortality in rural areas of the United States. There were no rural-urban discrepancies in diagnostic stage, suggesting similar access to diagnostic services. Rural patients diagnosed with stage I non-small cell lung cancer had shorter survival, which may reflect disparities in access to surgical care. No survival difference for patients with advanced-stage lung cancer is attributed to lack of effective treatment during the time period of this study.

  12. 77 FR 44653 - Continuum of Care Homeless Assistance Grant Application-Technical Submission

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-30

    ... DEPARTMENT OF HOUSING AND URBAN DEVELOPMENT [Docket No. FR-5603-N-50] Continuum of Care Homeless... obtain more detailed technical information not contained in the original Continuum of Care Homeless...: Continuum of Care Homeless Assistance Grant Application--Technical Submission. OMB Approval Number: 2506...

  13. HYDROGEN BALMER CONTINUUM IN SOLAR FLARES DETECTED BY THE INTERFACE REGION IMAGING SPECTROGRAPH (IRIS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heinzel, P.; Kleint, L., E-mail: pheinzel@asu.cas.cz

    We present a novel observation of the white light flare (WLF) continuum, which was significantly enhanced during the X1 flare on 2014 March 29 (SOL2014-03-29T17:48). Data from the Interface Region Imaging Spectrograph (IRIS) in its near-UV channel show that at the peak of the continuum enhancement, the contrast at the quasi-continuum window above 2813 Å reached 100%-200% and can be even larger closer to Mg II lines. This is fully consistent with the hydrogen recombination Balmer-continuum emission, which follows an impulsive thermal and non-thermal ionization caused by the precipitation of electron beams through the chromosphere. However, a less probable photospheric continuum enhancement cannot be excluded. The light curves of the Balmer continuum have an impulsive character with a gradual fading, similar to those detected recently in the optical region on the Solar Optical Telescope on board Hinode. This observation represents a first Balmer-continuum detection from space far beyond the Balmer limit (3646 Å), eliminating seeing effects known to complicate the WLF detection. Moreover, we use a spectral window so far unexplored for flare studies, which provides the potential to study the Balmer continuum, as well as many metallic lines appearing in emission during flares. Combined with future ground-based observations of the continuum near the Balmer limit, we will be able to disentangle various scenarios of the WLF origin. IRIS observations also provide a critical quantitative measure of the energy radiated in the Balmer continuum, which constrains various models of the energy transport and deposit during flares.

  14. Thermal and Nonthermal Contributions to the Solar Flare X-Ray Flux

    NASA Technical Reports Server (NTRS)

    Dennis, Brian R.; Phillips, K. J. H.; Sylwester, Janusz; Sylwester, Barbara; Schwartz, Richard A.; Tolbert, A. Kimberley

    2004-01-01

    The relative thermal and nonthermal contributions to the total energy budget of a solar flare are being determined through analysis of RHESSI X-ray imaging and spectral observations in the energy range from approx. 5 to approx. 50 keV. The classic ways of differentiating between the thermal and nonthermal components - exponential vs. sources - can now be combined for individual flares. In addition, RHESSI's sensitivity down to approx. 4 keV and energy resolution of approx. 1 keV FWHM allow the intensities and equivalent widths of the complex of highly ionized iron lines at approx. 6.7 keV and the complex of highly ionized iron and nickel lines at approx. 8 keV to be measured as a function of time. Using the spectral line and continuum intensities from the Chianti (version 4.2) atomic code, the thermal component of the total flare emission can be more reliably separated from the nonthermal component in the measured X-ray spectrum. The abundance of iron can also be determined from RHESSI line-to-continuum measurements as a function of time during larger flares. Results will be shown for the intensities and equivalent widths of these line complexes for several flares and the temperatures, emission measures, and iron abundances derived from them. Comparisons will be made with 6.7-keV Fe-line fluxes measured with the RESIK bent crystal spectrometer on the Coronas-F spacecraft operating in third order during the peak times of three flares (2002 May 31 at 00:12 UT, 2002 December 2 at 19:26 UT, and 2003 April 26 at 03:00 UT). During the rise and decay of these flares, RESIK was operating in first order allowing the continuum flux to be measured between 2.9 and 3.7 keV for comparison with RHESSI fluxes at its low-energy end.

  15. A qualitative grounded theory study of the conceptions of clinical practice in osteopathy - a continuum from technical rationality to professional artistry.

    PubMed

    Thomson, Oliver P; Petty, Nicola J; Moore, Ann P

    2014-02-01

    How practitioners conceive clinical practice influences many aspects of their clinical work including how they view knowledge, clinical decision-making, and their actions. Osteopaths have relied upon the philosophical and theoretical foundations upon which the profession was built to guide clinical practice. However, it is currently unknown how osteopaths conceive clinical practice, and how these conceptions develop and influence their clinical work. This paper reports the conceptions of practice of experienced osteopaths in the UK. A constructivist grounded theory approach was taken in this study. The constant comparative method of analysis was used to code and analyse data. Purposive sampling was employed to initially select participants. Subsequent theoretical sampling, informed by data analysis, allowed specific participants to be sampled. Data collection methods involved semi-structured interviews and non-participant observation of practitioners during a patient appointment, which was video-recorded and followed by a video-prompted reflective interview. Participants' conception of practice lay on a continuum, from technical rationality to professional artistry and the development of which was influenced by their educational experience, view of health and disease, epistemology of practice knowledge, theory-practice relationship and their perceived therapeutic role. The findings from this study provide the first theoretical insight of osteopaths' conceptions of clinical practice and the factors which influence such conceptions. Copyright © 2013 Elsevier Ltd. All rights reserved.

  16. Perceived stops to suicidal thoughts, plans, and actions in persons experiencing psychosis.

    PubMed

    Gooding, P A; Sheehy, K; Tarrier, N

    2013-01-01

    Suicide has been conceived as involving a continuum, whereby suicidal plans and acts emerge from thoughts about suicide. Suicide prevention strategies need to determine whether different responses are needed at these points on the continuum. This study investigates factors that were perceived to counter suicidal ideation, plans, and acts. The 36 participants, all of whom had had experiences of psychosis and some level of suicidality, were presented with a vignette describing a protagonist with psychotic symptoms. They were asked to indicate what would counter the suicidal thoughts, plans, and acts of the protagonist described in the vignette. Qualitative techniques were first used to code these free responses into themes/categories. Correspondence analysis was then applied to the frequency of responses in each of these categories. Social support was identified as a strong counter to suicidal ideation but not as a counter to suicidal plans or acts. Help from health professionals was strongly related to the cessation of suicidal plans as were the opinions of the protagonist's children. Changing cognitions and strengthening psychological resources were more weakly associated with the cessation of suicidal ideation and plans. The protagonist's children were considered potentially helpful in addressing suicidal acts. These results suggest that both overlapping and nonoverlapping factors need to be considered in understanding suicide prevention, dependent on whether individuals are thinking about, planning, or attempting suicide.

  17. The shell spectrum of HD 94509

    NASA Astrophysics Data System (ADS)

    Cowley, Charles R.; Przybilla, Norbert; Hubrig, Swetlana

    2015-01-01

    HD 94509 is a 9th magnitude Be star with an unusually rich metallic-lined shell. The absorption spectrum is rich, comparable to that of an A or F supergiant, but Mg II (4481A) and Si II (4128 and 4130A) are weak, indicating a dilute radiation field, as described by Otto Struve. The H-alpha emission is double with components of equal intensity and an absorption core that dips well below the stellar continuum. H-beta is weaker, but with a similar structure. H-gamma through H-epsilon have virtually black cores, indicating that the shell covers the stellar disk. The stronger metallic absorption lines are wide near the continuum, but taper to very narrow cores. This line shape is unexplained. However, the total absorption can be modeled to reveal overall particle densities of 10^{10}-10^{12} cm^{-3}. An electron density log(n_e) = 11.2 is obtained from the Paschen-line convergence and the Inglis-Teller relation. Column densities are obtained with the help of curves of growth by assuming uniform conditions in the cloud. These indicate a nearly solar composition. The CLOUDY code (Ferland et al., Rev. Mex. Astron. Astrofis. 49, 137, 2013) is used to produce a model that predicts matching column densities of the dominant ions, the n = 3 level of hydrogen, the H-alpha strength, and the electron density (± 0.5 dex).
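
    The electron density quoted above follows from identifying the highest resolvable Paschen line and applying the Inglis-Teller relation. A small sketch of that estimate, using one common form, log10(N_e) ~ 23.26 - 7.5 log10(n_max), is given below; the example n_max is a placeholder, not the value measured for HD 94509.

        import numpy as np

        def inglis_teller_log_ne(n_max):
            """One common form of the Inglis-Teller relation:
               log10(N_e [cm^-3]) ~ 23.26 - 7.5 * log10(n_max),
            where n_max is the principal quantum number of the last resolvable
            line before the series merges into the continuum."""
            return 23.26 - 7.5 * np.log10(n_max)

        # Hypothetical example: if the Paschen lines merge near n_max ~ 40, the implied
        # electron density is close to the log(n_e) ~ 11.2 quoted in the abstract.
        print(inglis_teller_log_ne(40))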

  18. A Suzaku, NuSTAR and XMM-Newton view on variable absorption and relativistic reflection in NGC 4151

    NASA Astrophysics Data System (ADS)

    Beuchert, T.; Markowitz, A.; Dauser, T.; Garcia, J.; Keck, M.; Wilms, J.; Kadler, M.; Brenneman, L.; Zdziarski, A.

    2017-10-01

    We disentangle X-ray disk reflection from complex line-of-sight absorption in NGC 4151 using Suzaku, NuSTAR, and XMM-Newton. Extending upon Keck et al. (2015), we develop a physically-motivated baseline model using the latest lamp-post reflection code relxillCp_lp, which includes a Comptonization continuum. We identify two components at heights of 1.2 and 15.0 gravitational radii using a long-look simultaneous Suzaku/NuSTAR observation but argue for a vertically extended corona as opposed to distinct primary sources. We also find two neutral absorbers (one full-covering and one partial-covering), an ionized absorber (log ξ=2.8), and a highly-ionized ultra-fast outflow, all reported previously. All analyzed spectra are well described by this baseline model. The bulk of the spectral variability on time-scales from days to years can be attributed to changes of both neutral absorbers, which are inversely correlated with the hard X-ray continuum flux. The observed evolution is either consistent with changes in the absorber structure (clumpy absorber in the outer BLR or a dusty radiatively driven wind) or a geometrically stable neutral absorber that becomes increasingly ionized at a rising flux level. The soft X-rays below 1 keV are dominated by photoionized emission from extended gas, which may act as a warm mirror for the nuclear radiation.

  19. Continuum of Collaboration: Little Steps for Little Feet

    ERIC Educational Resources Information Center

    Powell, Gwynn M.

    2013-01-01

    This mini-article outlines a continuum of collaboration for faculty within a department of the same discipline. The goal of illustrating this continuum is to showcase different stages of collaboration so that faculty members can assess where they are as a collective and consider steps to collaborate more. The separate points along a continuum of…

  20. Thermal-hydraulic simulation of natural convection decay heat removal in the High Flux Isotope Reactor using RELAP5 and TEMPEST: Part 1, Models and simulation results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morris, D.G.; Wendel, M.W.; Chen, N.C.J.

    A study was conducted to examine decay heat removal requirements in the High Flux Isotope Reactor (HFIR) following shutdown from 85 MW. The objective of the study was to determine when forced flow through the core could be terminated without causing the fuel to melt. This question is particularly relevant when a station blackout caused by an external event is considered. Analysis of natural circulation in the core, vessel upper plenum, and reactor pool indicates that 12 h of forced flow will permit a safe shutdown with some margin. However, uncertainties in the analysis preclude conclusive proof that 12 h is sufficient. As a result of the study, two seismically qualified diesel generators were installed in HFIR. 9 refs., 4 figs.
