Sample records for source terms due

  1. Reducing Human Radiation Risks on Deep Space Missions

    DTIC Science & Technology

    2017-09-01

    Risk of Acute Radiation Syndromes Due to Solar Particle Events. Figure 53 highlights the fact that acute radiation syndrome is a short-term risk...acceptable for long-term missions. Figure 53: Risk Assessment for Acute Radiation Syndrome Due to SPEs. Source: NASA Human Research Roadmap (2016).

  2. High Order Finite Difference Methods with Subcell Resolution for 2D Detonation Waves

    NASA Technical Reports Server (NTRS)

    Wang, W.; Shu, C. W.; Yee, H. C.; Sjogreen, B.

    2012-01-01

    In simulating hyperbolic conservation laws in conjunction with an inhomogeneous stiff source term, if the solution is discontinuous, spurious numerical results may be produced due to different time scales of the transport part and the source term. This numerical issue often arises in combustion and high speed chemical reacting flows.
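    The time-scale mismatch described above can be illustrated with a minimal fractional-step scheme for linear advection with a decay source (a sketch only, not the subcell-resolution method of the paper; the grid, rates, and initial pulse are chosen purely for illustration):

```python
import numpy as np

# Minimal fractional-step sketch for u_t + a*u_x = -k*u: an explicit upwind
# transport step followed by exact integration of the decay source. This
# illustrates the splitting idea only, not the subcell-resolution method of
# the paper; for stiff nonlinear reactions with discontinuous solutions this
# kind of splitting is what produces spurious wave speeds.
a, k = 1.0, 10.0             # advection speed, decay rate
nx, L, T = 200, 1.0, 0.2     # cells, domain length, final time
dx = L / nx
dt = 0.5 * dx / a            # CFL number 0.5
nsteps = int(round(T / dt))
x = (np.arange(nx) + 0.5) * dx
u = np.exp(-100.0 * (x - 0.25) ** 2)    # smooth initial pulse

for _ in range(nsteps):
    u = u - a * dt / dx * (u - np.roll(u, 1))   # first-order upwind, periodic
    u = u * np.exp(-k * dt)                     # exact source substep

# Exact solution: pulse advected by a*T, damped by exp(-k*T)
u_exact = np.exp(-100.0 * (x - 0.25 - a * T) ** 2) * np.exp(-k * T)
err = float(np.max(np.abs(u - u_exact)))
```

    For this linear, non-stiff source the split solution stays close to the exact one; the pathologies the paper addresses appear when the reaction time scale is far shorter than the transport time scale.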

  3. Air Pollution Manual, Part 1--Evaluation. Second Edition.

    ERIC Educational Resources Information Center

    Giever, Paul M., Ed.

    Due to the great increase in technical knowledge and improvement in procedures, this second edition has been prepared to update existing information. Air pollution legislation is reviewed. Sources of air pollution are examined extensively. They are treated in terms of natural sources, man-made sources, metropolitan regional emissions, emission…

  4. Observation-based source terms in the third-generation wave model WAVEWATCH

    NASA Astrophysics Data System (ADS)

    Zieger, Stefan; Babanin, Alexander V.; Erick Rogers, W.; Young, Ian R.

    2015-12-01

    Measurements collected during the AUSWEX field campaign at Lake George (Australia) resulted in new insights into the processes of wind-wave interaction and whitecapping dissipation, and consequently new parameterizations of the input and dissipation source terms. The newly developed nonlinear wind input term accounts for the dependence of growth on wave steepness, for airflow separation, and for negative growth rates under adverse winds. The new dissipation terms feature the inherent breaking term, a cumulative dissipation term, and a term due to production of turbulence by waves, which is particularly relevant for decaying seas and for swell. The latter is consistent with the observed decay rate of ocean swell. This paper describes these source terms implemented in WAVEWATCH III® and evaluates their performance against existing source terms in academic duration-limited tests, against buoy measurements for windsea-dominated conditions, under conditions of extreme wind forcing (Hurricane Katrina), and against altimeter data in global hindcasts. Results show agreement in growth curves as well as in integral and spectral parameters across the simulations and hindcasts.

  5. Plasma particle sources due to interactions with neutrals in a turbulent scrape-off layer of a toroidally confined plasma

    NASA Astrophysics Data System (ADS)

    Thrysøe, A. S.; Løiten, M.; Madsen, J.; Naulin, V.; Nielsen, A. H.; Rasmussen, J. Juul

    2018-03-01

    The conditions in the edge and scrape-off layer (SOL) of magnetically confined plasmas determine the overall performance of the device, and it is of great importance to study and understand the mechanisms that drive transport in those regions. If a significant amount of neutral molecules and atoms is present in the edge and SOL regions, those will influence the plasma parameters and thus the plasma confinement. In this paper, we show how neutrals, described by a fluid model, introduce source terms in a plasma drift-fluid model due to inelastic collisions. The resulting source terms are included in a four-field drift-fluid model, and it is shown how an increasing neutral particle density in the edge and SOL regions influences the plasma particle transport across the last closed flux surface. It is found that an appropriate gas puffing rate allows the edge density in the simulation to be self-consistently maintained by ionization of neutrals in the confined region.

  6. Semi-implicit and fully implicit shock-capturing methods for hyperbolic conservation laws with stiff source terms

    NASA Technical Reports Server (NTRS)

    Yee, H. C.; Shinn, J. L.

    1986-01-01

    Some numerical aspects of finite-difference algorithms for nonlinear multidimensional hyperbolic conservation laws with stiff nonhomogeneous (source) terms are discussed. If the stiffness is entirely dominated by the source term, a semi-implicit shock-capturing method is proposed, provided that the Jacobian of the source terms possesses certain properties. The proposed semi-implicit method can be viewed as a variant of the Bussing and Murman point-implicit scheme with a more appropriate numerical dissipation for the computation of strong shock waves. However, if the stiffness is not solely dominated by the source terms, a fully implicit method would be a better choice. The situation is more complicated in problems of more than one dimension, and the presence of stiff source terms further complicates the solution procedures for alternating direction implicit (ADI) methods. Several alternatives are discussed. The primary motivation for constructing these schemes was to address thermally and chemically nonequilibrium flows in the hypersonic regime. Due to the unique structure of the eigenvalues and eigenvectors for fluid flows of this type, the computation can be simplified, thus providing a more efficient solution procedure than one might have anticipated.
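    The point-implicit idea can be sketched on a scalar relaxation source S(u) = -k(u - u_eq): the source Jacobian is treated implicitly, so the update remains stable even when k·dt >> 1 (an illustrative model problem with assumed values, not the scheme of the paper):

```python
# Sketch of a point-implicit (semi-implicit) treatment of a stiff relaxation
# source S(u) = -k*(u - u_eq): the source Jacobian dS/du = -k is folded into
# the update so that stiffness does not limit the time step. Illustrative
# scalar model only; all values are assumptions.
k, u_eq = 1.0e4, 1.0
dt, nsteps = 1.0e-3, 100      # k*dt = 10: explicit Euler is violently unstable

u_expl = u_impl = 2.0
for _ in range(nsteps):
    # explicit Euler: the error is amplified by |1 - k*dt| = 9 each step
    u_expl += dt * (-k) * (u_expl - u_eq)
    # point-implicit: solve (1 + dt*k) * du = -dt*k*(u_impl - u_eq)
    u_impl += -dt * k * (u_impl - u_eq) / (1.0 + dt * k)
```

    The explicit iterate diverges while the point-implicit one relaxes to equilibrium, which is the basic motivation for treating only the stiff source implicitly.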

  7. A Semi-implicit Treatment of Porous Media in Steady-State CFD.

    PubMed

    Domaingo, Andreas; Langmayr, Daniel; Somogyi, Bence; Almbauer, Raimund

    There are many situations in computational fluid dynamics which require the definition of source terms in the Navier-Stokes equations. These source terms not only make it possible to model the physics of interest but also have a strong impact on the reliability, stability, and convergence of the numerics involved. Therefore, sophisticated numerical approaches exist for the description of such source terms. In this paper, we focus on the source terms present in the Navier-Stokes or Euler equations due to porous media, in particular the Darcy-Forchheimer equation. We introduce a method for the numerical treatment of the source term which is independent of the spatial discretization and based on linearization. In this description, the source term is treated in a fully implicit way, whereas the other flow variables can be computed in an implicit or explicit manner. This leads to a more robust description in comparison with a fully explicit approach. The method is well suited to be combined with coarse-grid CFD on Cartesian grids, which makes it especially favorable for accelerated solution of coupled 1D-3D problems. To demonstrate the applicability and robustness of the proposed method, a proof-of-concept example in 1D and more complex examples in 2D and 3D are presented.
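    A minimal sketch of the linearized implicit source treatment, applied to a scalar Darcy-Forchheimer-type momentum balance (the names alpha, beta, and g, and all values, are illustrative assumptions, not the paper's notation):

```python
# Sketch: linearized fully implicit update for a Darcy-Forchheimer-type
# momentum source, du/dt = g - (alpha*u + beta*u*|u|). The source is
# linearized about the current state and solved implicitly, which keeps the
# iteration robust even for large resistance coefficients.
alpha, beta, g = 50.0, 10.0, 100.0   # Darcy and Forchheimer coefficients, forcing
dt, u = 0.05, 0.0
for _ in range(200):
    S = g - (alpha * u + beta * u * abs(u))    # net source at the current state
    dSdu = -(alpha + 2.0 * beta * abs(u))      # source Jacobian
    u += dt * S / (1.0 - dt * dSdu)            # implicit linearized step
residual = abs(alpha * u + beta * u * abs(u) - g)
```

    The iteration converges to the steady balance alpha·u + beta·u·|u| = g; with a purely explicit source update the same coefficients would force a much smaller stable time step.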

  8. Long-term carbon loss in fragmented Neotropical forests.

    PubMed

    Pütz, Sandro; Groeneveld, Jürgen; Henle, Klaus; Knogge, Christoph; Martensen, Alexandre Camargo; Metz, Markus; Metzger, Jean Paul; Ribeiro, Milton Cezar; de Paula, Mateus Dantas; Huth, Andreas

    2014-10-07

    Tropical forests play an important role in the global carbon cycle, as they store a large amount of carbon (C). Tropical deforestation has been identified as a major source of CO2 emissions, though biomass loss due to fragmentation (the creation of additional forest edges) has been largely overlooked as an additional CO2 source. Here, through the combination of remote sensing and knowledge of ecological processes, we present long-term carbon loss estimates due to fragmentation of Neotropical forests: within 10 years the Brazilian Atlantic Forest has lost 69 (±14) Tg C, and the Amazon 599 (±120) Tg C, due to fragmentation alone. For all tropical forests, we estimate emissions of up to 0.2 Pg C y^-1, or 9 to 24% of the annual global C loss due to deforestation. In conclusion, tropical forest fragmentation increases carbon loss and should be accounted for when attempting to understand the role of vegetation in the global carbon balance.
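    As a back-of-envelope consistency check using only the numbers quoted in the abstract, the implied total annual carbon loss from deforestation can be recovered from the stated percentages:

```python
# Back-of-envelope check using only numbers quoted in the abstract: if
# fragmentation emits up to 0.2 Pg C per year, and that equals 9-24% of the
# annual global C loss due to deforestation, the implied total deforestation
# loss brackets out as follows.
frag_flux = 0.2                       # Pg C per year (upper estimate from the abstract)
lo_share, hi_share = 0.09, 0.24       # fragmentation's share of deforestation C loss
deforestation_hi = frag_flux / lo_share   # upper bracket, ~2.2 Pg C/yr
deforestation_lo = frag_flux / hi_share   # lower bracket, ~0.83 Pg C/yr
```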

  9. ERROR IN ANNUAL AVERAGE DUE TO USE OF LESS THAN EVERYDAY MEASUREMENTS

    EPA Science Inventory

    Long-term averages of the concentration of PM mass and components are of interest for determining compliance with annual averages, for developing exposure surrogates for cross-sectional epidemiologic studies of the long-term effects of PM, and for determination of aerosol sources by chem...

  10. Bayesian estimation of a source term of radiation release with approximately known nuclide ratios

    NASA Astrophysics Data System (ADS)

    Tichý, Ondřej; Šmídl, Václav; Hofman, Radek

    2016-04-01

    We are concerned with estimation of a source term in the case of an accidental release from a known location, e.g. a power plant. Usually, the source term of an accidental release of radiation comprises a mixture of nuclides. The gamma dose rate measurements do not provide direct information on the source term composition. However, physical properties of the respective nuclides (deposition properties, decay half-life) can be used when uncertain information on nuclide ratios is available, e.g. from the known reactor inventory. The proposed method is based on a linear inverse model where the observation vector y arises as a linear combination y = Mx of a source-receptor-sensitivity (SRS) matrix M and the source term x. The task is to estimate the unknown source term x. The problem is ill-conditioned, and further regularization is needed to obtain a reasonable solution. In this contribution, we assume that the nuclide ratios of the release are known with some degree of uncertainty. This knowledge is used to form the prior covariance matrix of the source term x. Due to the uncertainty in the ratios, the diagonal elements of the covariance matrix are considered to be unknown. Positivity of the source term estimate is guaranteed by using a multivariate truncated Gaussian distribution. Following the Bayesian approach, we estimate all parameters of the model from the data, so that y, M, and the known ratios are the only inputs of the method. Since the inference of the model is intractable, we follow the Variational Bayes method, yielding an iterative algorithm for estimation of all model parameters. Performance of the method is studied on a simulated 6-hour power plant release where 3 nuclides are released and 2 nuclide ratios are approximately known. A comparison with a method with unknown nuclide ratios is given to demonstrate the usefulness of the proposed approach.
This research is supported by EEA/Norwegian Financial Mechanism under project MSMT-28477/2014 Source-Term Determination of Radionuclide Releases by Inverse Atmospheric Dispersion Modelling (STRADI).
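    The structure of the estimation problem can be sketched as a simple Gaussian MAP estimate for y = Mx with a diagonal prior covariance (a stand-in for the paper's Variational Bayes treatment with truncated Gaussians; the SRS matrix, noise level, and prior variances below are all synthetic assumptions):

```python
import numpy as np

# Gaussian MAP sketch of the linear inverse problem y = M x. The diagonal
# prior covariance plays the role of the ratio-derived prior in the abstract,
# and clipping stands in for the truncated Gaussian. M, the noise level, and
# the prior variances are synthetic, not data from the study.
rng = np.random.default_rng(0)
n_obs, n_src = 40, 12
M = rng.uniform(0.0, 1.0, (n_obs, n_src))           # synthetic SRS matrix
x_true = np.sin(np.arange(n_src)) + 1.1             # synthetic nonnegative source term
y = M @ x_true + 0.01 * rng.standard_normal(n_obs)  # noisy observations

sigma2 = 0.01 ** 2                                  # observation-noise variance
prior_var = (0.5 * x_true + 0.1) ** 2               # prior spread, as if from known ratios
# MAP: x = argmin ||y - M x||^2 / sigma2 + x^T diag(1/prior_var) x
A = M.T @ M / sigma2 + np.diag(1.0 / prior_var)
x_map = np.linalg.solve(A, M.T @ y / sigma2)
x_map = np.clip(x_map, 0.0, None)                   # crude positivity constraint
```

    The full method additionally learns the prior variances and noise parameters from the data via Variational Bayes rather than fixing them as done here.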

  11. Isotropic source terms of San Jacinto fault zone earthquakes based on waveform inversions with a generalized CAP method

    NASA Astrophysics Data System (ADS)

    Ross, Z. E.; Ben-Zion, Y.; Zhu, L.

    2015-02-01

    We analyse source tensor properties of seven Mw > 4.2 earthquakes in the complex trifurcation area of the San Jacinto Fault Zone, CA, with a focus on isotropic radiation that may be produced by rock damage in the source volumes. The earthquake mechanisms are derived with generalized `Cut and Paste' (gCAP) inversions of three-component waveforms typically recorded by >70 stations at regional distances. The gCAP method includes parameters ζ and χ representing, respectively, the relative strength of the isotropic and CLVD source terms. The possible errors in the isotropic and CLVD components due to station variability are quantified with bootstrap resampling for each event. The results indicate statistically significant explosive isotropic components for at least six of the events, corresponding to ~0.4-8 per cent of the total potency/moment of the sources. In contrast, the CLVD components for most events are not found to be statistically significant. Trade-off and correlation between the isotropic and CLVD components are studied using synthetic tests with realistic station configurations. The associated uncertainties are found to be generally smaller than the observed isotropic components. Two different tests with velocity model perturbation are conducted to quantify the uncertainty due to inaccuracies in the Green's functions. Applications of the Mann-Whitney U test indicate statistically significant explosive isotropic terms for most events, consistent with brittle damage production at the source.
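    The bootstrap-resampling step used to quantify station variability can be sketched generically: resample the per-station estimates with replacement and take the spread of the resampled means (the data below are synthetic, not values from the study):

```python
import numpy as np

# Generic sketch of bootstrap resampling for uncertainty due to station
# variability: draw many resamples of the per-station estimates with
# replacement and use the spread of the resampled means. Synthetic data only.
rng = np.random.default_rng(1)
station_iso = rng.normal(0.04, 0.01, 70)   # synthetic per-station isotropic fractions
boot_means = np.array([
    rng.choice(station_iso, station_iso.size, replace=True).mean()
    for _ in range(2000)
])
iso_mean = station_iso.mean()
iso_std = boot_means.std()   # close to sample_std / sqrt(70) for this simple case
```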

  12. Radiological analysis of plutonium glass batches with natural/enriched boron

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rainisch, R.

    2000-06-22

    The disposition of surplus plutonium inventories by the US Department of Energy (DOE) includes the immobilization of certain plutonium materials in a borosilicate glass matrix, also referred to as vitrification. This paper addresses source terms of plutonium masses immobilized in a borosilicate glass matrix where the glass components include both natural boron and enriched boron. The calculated source terms pertain to neutron and gamma source strength (particles per second) and source spectrum changes. The calculated source terms corresponding to natural boron and enriched boron are compared to determine the benefits (decrease in radiation source terms) from the use of enriched boron. The analysis of plutonium glass source terms shows that a large component of the neutron source terms is due to (a, n) reactions. The Americium-241 and plutonium present in the glass emit alpha particles (a). These alpha particles interact with low-Z nuclides like B-11, B-10, and O-17 in the glass to produce neutrons. The low-Z nuclides are referred to as target particles. The reference glass contains 9.4 wt percent B2O3. Boron-11 was found to strongly support the (a, n) reactions in the glass matrix. B-11 has a natural abundance of over 80 percent. The (a, n) reaction rates for B-10 are lower than for B-11, and the analysis shows that the plutonium glass neutron source terms can be reduced by artificially enriching natural boron with B-10. The natural abundance of B-10 is 19.9 percent. Boron enriched to 96 wt percent B-10 or above can be obtained commercially. Since lower source terms imply lower dose rates to radiation workers handling the plutonium glass materials, it is important to know the achievable decrease in source terms as a result of boron enrichment. Plutonium materials are normally handled in glove boxes with shielded glass windows, and the work entails both extremity and whole-body exposures. Lowering the source terms of the plutonium batches will make the handling of these materials less difficult and will reduce radiation exposure to operating workers.

  13. Extended lattice Boltzmann scheme for droplet combustion.

    PubMed

    Ashna, Mostafa; Rahimian, Mohammad Hassan; Fakhari, Abbas

    2017-05-01

    The available lattice Boltzmann (LB) models for combustion or phase change are focused on either single-phase flow combustion or two-phase flow with evaporation, assuming a constant density for both liquid and gas phases. To pave the way towards simulation of spray combustion, we propose a two-phase LB method for modeling combustion of liquid fuel droplets. We develop an LB scheme to model phase change and combustion by taking into account the density variation in the gas phase and accounting for the chemical reaction based on the Cahn-Hilliard free-energy approach. Evaporation of liquid fuel is modeled by adding a source term, arising from the nontrivial divergence of the velocity field, to the continuity equation. The low-Mach-number approximation in the governing Navier-Stokes and energy equations is used to incorporate source terms due to heat release from chemical reactions, density variation, and nonluminous radiative heat loss. Additionally, the conservation equation for chemical species is formulated by including a source term due to chemical reaction. To validate the model, we consider the combustion of n-heptane and n-butanol droplets in stagnant air using overall single-step reactions. The diameter history and flame standoff ratio obtained from the proposed LB method are found to be in good agreement with available numerical and experimental data. The present LB scheme is believed to be a promising approach for modeling spray combustion.
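    Diameter histories in such droplet studies are conventionally compared against the classical d²-law, d(t)² = d0² - K·t; a trivial sketch (d0 and the evaporation constant K are assumed illustration values, not from the paper):

```python
import numpy as np

# Sketch of the classical d^2-law often used as a reference for droplet
# diameter histories: d(t)^2 = d0^2 - K*t, giving a lifetime of d0^2/K.
# Both d0 and the evaporation constant K are assumed illustration values.
d0 = 1.0e-3        # initial droplet diameter [m]
K = 1.0e-7         # evaporation constant [m^2/s]
t_burnout = d0 ** 2 / K                        # droplet lifetime under the d^2-law
t = np.linspace(0.0, t_burnout, 5)
d = np.sqrt(np.maximum(d0 ** 2 - K * t, 0.0))  # diameter history
```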

  14. Influence of heat conducting substrates on explosive crystallization in thin layers

    NASA Astrophysics Data System (ADS)

    Schneider, Wilhelm

    2017-09-01

    Crystallization in a thin, initially amorphous layer is considered. The layer is in thermal contact with a substrate of very large dimensions. The energy equation of the layer contains source and sink terms. The source term is due to liberation of latent heat in the crystallization process, while the sink term is due to conduction of heat into the substrate. To determine the latter, the heat diffusion equation for the substrate is solved by applying Duhamel's integral. Thus, the energy equation of the layer becomes a heat diffusion equation with a time integral as an additional term. The latter term indicates that the heat loss due to the substrate depends on the history of the process. To complete the set of equations, the crystallization process is described by a rate equation for the degree of crystallization. The governing equations are then transformed to a moving co-ordinate system in order to analyze crystallization waves that propagate with invariant properties. Dual solutions are found by an asymptotic expansion for large activation energies of molecular diffusion. By introducing suitable variables, the results can be presented in a universal form that comprises the influence of all non-dimensional parameters that govern the process. Of particular interest for applications is the prediction of a critical heat loss parameter for the existence of crystallization waves with invariant properties.

  15. The Application of Function Points to Predict Source Lines of Code for Software Development

    DTIC Science & Technology

    1992-09-01

    there are some disadvantages. Software estimating tools are expensive. A single tool may cost more than $15,000 due to the high market value of the...term and Lang variables simultaneously only added marginal improvements over models with these terms included singly. Using all the available

  16. Very slow lava extrusion continued for more than five years after the 2011 Shinmoedake eruption observed from SAR interferometry

    NASA Astrophysics Data System (ADS)

    Ozawa, T.; Miyagi, Y.

    2017-12-01

    Shinmoe-dake, located in SW Japan, erupted in January 2011, and lava accumulated in the crater (e.g., Ozawa and Kozono, EPS, 2013). The last Vulcanian eruption occurred in September 2011, and no eruption has occurred since. Miyagi et al. (GRL, 2014) analyzed TerraSAR-X and Radarsat-2 SAR data acquired after the last eruption and found continuous inflation in the crater. The inflation decayed with time but had not terminated by May 2013. Since the time series of the inflation volume change rate fitted well to an exponential function with a constant term, we suggested that lava extrusion had continued over the long term due to deflation of a shallow magma source and to magma supply from a deeper source. To investigate the subsequent deformation, we applied InSAR to Sentinel-1 and ALOS-2 SAR data. Inflation decayed further and had almost terminated by the end of 2016, meaning that this deformation continued for more than five years after the last eruption. We have found that the time series of the inflation volume change rate fits better to a double-exponential function than to a single-exponential function with a constant term. The exponential component with the short time constant had almost settled within one year of the last eruption. Although the InSAR result from TerraSAR-X data of November 2011 and May 2013 indicated deflation of a shallow source under the crater, such deformation has not been obtained from recent SAR data. This suggests that this component was due to deflation of a shallow magma source with excess pressure. In this study, we found the possibility that the long-term component also decayed exponentially; this factor may be deflation of a deep source or delayed vesiculation.
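    The single- versus double-exponential distinction can be made concrete for a volume change rate r(t) = A1·exp(-t/τ1) + A2·exp(-t/τ2): with an assumed fast time constant of a few months, that component is essentially settled within a year while the slow component persists (all values below are invented for illustration):

```python
import numpy as np

# Illustration of a two-time-constant decay of the inflation volume change
# rate: r(t) = A1*exp(-t/tau1) + A2*exp(-t/tau2). The amplitudes and time
# constants are invented for illustration, not fitted values from the study.
A1, tau1 = 1.0, 0.25    # fast component (shallow source), time in years
A2, tau2 = 0.3, 3.0     # slow component (deep source)
t = 1.0                 # one year after the last eruption
fast_remaining = np.exp(-t / tau1)   # fraction of the fast component remaining
slow_remaining = np.exp(-t / tau2)   # fraction of the slow component remaining
rate = A1 * fast_remaining + A2 * slow_remaining
```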

  17. Interlaboratory study of the ion source memory effect in 36Cl accelerator mass spectrometry

    NASA Astrophysics Data System (ADS)

    Pavetich, Stefan; Akhmadaliev, Shavkat; Arnold, Maurice; Aumaître, Georges; Bourlès, Didier; Buchriegler, Josef; Golser, Robin; Keddadouche, Karim; Martschini, Martin; Merchel, Silke; Rugel, Georg; Steier, Peter

    2014-06-01

    Understanding and minimizing contamination in the ion source due to cross-contamination and the long-term memory effect is one of the key issues for accurate accelerator mass spectrometry (AMS) measurements of volatile elements. The focus of this work is on the investigation of the long-term memory effect for the volatile element chlorine, and on the minimization of this effect in the ion source of the Dresden accelerator mass spectrometry facility (DREAMS). For this purpose, one of the two original HVE ion sources at the DREAMS facility was modified, allowing the use of larger sample holders having individual target apertures. Additionally, a more open geometry was used to improve the vacuum level. To evaluate this improvement in comparison to other up-to-date ion sources, an interlaboratory comparison was initiated. The long-term memory effect of the four Cs sputter ion sources at DREAMS (two sources: original and modified), ASTER (Accélérateur pour les Sciences de la Terre, Environnement, Risques) and VERA (Vienna Environmental Research Accelerator) was investigated by measuring samples of natural 35Cl/37Cl ratio and samples highly enriched in 35Cl (35Cl/37Cl ∼ 999). Besides investigating and comparing the individual levels of long-term memory, recovery time constants could be calculated. The tests show that all four sources suffer from long-term memory, but the modified DREAMS ion source showed the lowest level of contamination. The recovery times of the four ion sources were widely spread between 61 and 1390 s; the modified DREAMS ion source, with values between 156 and 262 s, showed the fastest recovery in 80% of the measurements.

  18. Soundscapes

    DTIC Science & Technology

    2015-09-30

    DISTRIBUTION STATEMENT A. Approved for public release; distribution is unlimited. Soundscapes. Michael B. Porter and Laurel J. Henderson...hindcasts, nowcasts, and forecasts of the time-evolving soundscape. In terms of the types of sound sources, we will focus initially on commercial...modeling of the soundscape due to noise involves running an acoustic model for a grid of source positions over latitude and longitude. Typically

  19. Boundary control of bidomain equations with state-dependent switching source functions in the ionic model

    NASA Astrophysics Data System (ADS)

    Chamakuri, Nagaiah; Engwer, Christian; Kunisch, Karl

    2014-09-01

    Optimal control for cardiac electrophysiology based on the bidomain equations in conjunction with the Fenton-Karma ionic model is considered. This generic ventricular model approximates well the restitution properties and spiral wave behavior of more complex ionic models of cardiac action potentials. However, it is challenging due to the appearance of state-dependent discontinuities in the source terms. A computational framework for the numerical realization of optimal control problems is presented. Essential ingredients are a shape calculus based treatment of the sensitivities of the discontinuous source terms and a marching cubes algorithm to track iso-surface of excitation wavefronts. Numerical results exhibit successful defibrillation by applying an optimally controlled extracellular stimulus.

  20. Refinement of Regional Distance Seismic Moment Tensor and Uncertainty Analysis for Source-Type Identification

    DTIC Science & Technology

    2011-09-01

    an NSS that lies in this negative explosion positive CLVD quadrant due to the large degree of tectonic release in this event that reversed the phase...Mellman (1986) in their analysis of fundamental-mode Love and Rayleigh wave amplitude and phase for nuclear and tectonic release source terms, and...(1986). Estimating explosion and tectonic release source parameters of underground nuclear explosions from Rayleigh and Love wave observations, Air

  1. Bias in Terms of Culture and a Method for Reducing It: An Eight-Country "Explanations of Unemployment Scale" Study

    ERIC Educational Resources Information Center

    Mylonas, Kostas; Furnham, Adrian; Divale, William; Leblebici, Cigdem; Gondim, Sonia; Moniz, Angela; Grad, Hector; Alvaro, Jose Luis; Cretu, Romeo Zeno; Filus, Ania; Boski, Pawel

    2014-01-01

    Several sources of bias can plague research data and individual assessment. When cultural groups are considered, across or even within countries, it is essential that the constructs assessed and evaluated are as free as possible from any source of bias, and specifically from bias arising from culturally specific characteristics. Employing the…

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    Values of current energy technology costs and prices, available from a variety of sources, can sometimes vary. While some of this variation can be due to differences in the specific materials or configurations assumed, it can also reflect differences in the definition and context of the terms "cost" and "price." This fact sheet illustrates and explains this latter source of variation in a case study of automotive lithium-ion batteries.

  3. Amplitude loss of sonic waveform due to source coupling to the medium

    NASA Astrophysics Data System (ADS)

    Lee, Myung W.; Waite, William F.

    2007-03-01

    In contrast to hydrate-free sediments, sonic waveforms acquired in gas hydrate-bearing sediments indicate strong amplitude attenuation associated with a sonic velocity increase. The amplitude attenuation increase has been used to quantify pore-space hydrate content by attributing observed attenuation to the hydrate-bearing sediment's intrinsic attenuation. A second attenuation mechanism must be considered, however. Theoretically, energy radiation from sources inside fluid-filled boreholes strongly depends on the elastic parameters of materials surrounding the borehole. It is therefore plausible to interpret amplitude loss in terms of source coupling to the surrounding medium as well as to intrinsic attenuation. Analyses of sonic waveforms from the Mallik 5L-38 well, Northwest Territories, Canada, indicate a significant component of sonic waveform amplitude loss is due to source coupling. Accordingly, all sonic waveform amplitude analyses should include the effect of source coupling to accurately characterize a formation's intrinsic attenuation.

  4. Amplitude loss of sonic waveform due to source coupling to the medium

    USGS Publications Warehouse

    Lee, Myung W.; Waite, William F.

    2007-01-01

    In contrast to hydrate-free sediments, sonic waveforms acquired in gas hydrate-bearing sediments indicate strong amplitude attenuation associated with a sonic velocity increase. The amplitude attenuation increase has been used to quantify pore-space hydrate content by attributing observed attenuation to the hydrate-bearing sediment's intrinsic attenuation. A second attenuation mechanism must be considered, however. Theoretically, energy radiation from sources inside fluid-filled boreholes strongly depends on the elastic parameters of materials surrounding the borehole. It is therefore plausible to interpret amplitude loss in terms of source coupling to the surrounding medium as well as to intrinsic attenuation. Analyses of sonic waveforms from the Mallik 5L-38 well, Northwest Territories, Canada, indicate a significant component of sonic waveform amplitude loss is due to source coupling. Accordingly, all sonic waveform amplitude analyses should include the effect of source coupling to accurately characterize a formation's intrinsic attenuation.

  5. Binary Source Microlensing Event OGLE-2016-BLG-0733: Interpretation of a Long-Term Asymmetric Perturbation

    NASA Technical Reports Server (NTRS)

    Jung, Y. K.; Udalski, A.; Yee, J. C.; Sumi, T.; Gould, A.; Han, C.; Albrow, M. D.; Lee, C.-U.; Bennett, D. P.; Suzuki, D.

    2017-01-01

    In the process of analyzing an observed light curve, one often confronts various scenarios that can mimic planetary signals, causing difficulties in the accurate interpretation of the lens system. In this paper, we present the analysis of the microlensing event OGLE-2016-BLG-0733. The light curve of the event shows a long-term asymmetric perturbation that would appear to be due to a planet. From the detailed modeling of the lensing light curve, however, we find that the perturbation originates from the binarity of the source rather than the lens. This result demonstrates that binary sources with roughly equal-luminosity components can mimic long-term perturbations induced by planets with projected separations near the Einstein ring. The result also demonstrates the importance of considering various interpretations of planet-like perturbations and of high-cadence observations for ensuring the unambiguous detection of planets.

  6. The short-term association of selected components of fine particulate matter and mortality in the Denver Aerosol Sources and Health (DASH) study

    EPA Science Inventory

    Associations of short-term exposure to fine particulate matter (PM2.5) with daily mortality may be due to specific PM2.5 chemical components. Objectives: Daily concentrations of PM2.5 chemical species were measured over five consecutive years in Denver, CO to investigate whethe...

  7. Simulating long-term landcover change and water yield dynamics in a forested, snow-dominated Rocky Mountain watershed

    Treesearch

    R. S. Ahl; S. W. Woods

    2006-01-01

    Changes in the extent, composition, and configuration of forest cover over time due to succession or disturbance processes can result in measurable changes in streamflow and water yield. Removal of forest cover generally increases streamflow due to reduced canopy interception and evapotranspiration. In watersheds where snow is the dominant source of water, yield...

  8. Is amplitude loss of sonic waveforms due to intrinsic attenuation or source coupling to the medium?

    USGS Publications Warehouse

    Lee, Myung W.

    2006-01-01

    Sonic waveforms acquired in gas-hydrate-bearing sediments indicate strong amplitude loss associated with an increase in sonic velocity. Because the gas hydrate increases sonic velocities, the amplitude loss has been interpreted as due to intrinsic attenuation caused by the gas hydrate in the pore space, which apparently contradicts conventional wave propagation theory. For a sonic source in a fluid-filled borehole, the signal amplitude transmitted into the formation depends on the physical properties of the formation, including any pore contents, in the immediate vicinity of the source. A signal in acoustically fast material, such as gas-hydrate-bearing sediments, has a smaller amplitude than a signal in acoustically slower material. Therefore, it is reasonable to interpret the amplitude loss in the gas-hydrate-bearing sediments in terms of source coupling to the surrounding medium as well as intrinsic attenuation. An analysis of sonic waveforms measured at the Mallik 5L-38 well, Northwest Territories, Canada, indicates that a significant part of the sonic waveform's amplitude loss is due to a source-coupling effect. All amplitude analyses of sonic waveforms should include the effect of source coupling in order to accurately characterize the formation's intrinsic attenuation.

  9. Effect of Americium-241 Content on Plutonium Radiation Source Terms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rainisch, R.

    1998-12-28

The management of excess plutonium by the US Department of Energy includes a number of storage and disposition alternatives. Savannah River Site (SRS) is supporting DOE with plutonium disposition efforts, including the immobilization of certain plutonium materials in a borosilicate glass matrix. Surplus plutonium inventories slated for vitrification include materials with elevated levels of Americium-241. The Am-241 content of plutonium materials generally reflects in-growth of the isotope due to decay of plutonium and is age-dependent. However, select plutonium inventories have Am-241 levels considerably above the age-based levels. Elevated levels of americium significantly impact the radiation source terms of plutonium materials and will make handling of the materials more difficult. Plutonium materials are normally handled in shielded glove boxes, and the work entails both extremity and whole-body exposures. This paper reports results of an SRS analysis of plutonium material source terms versus the Americium-241 content of the materials. Data on the dependence and magnitude of source terms with respect to Am-241 levels are presented and discussed. The investigation encompasses both vitrified and un-vitrified plutonium oxide (PuO2) batches.

  10. An investigation on nuclear energy policy in Turkey and public perception

    NASA Astrophysics Data System (ADS)

    Coskun, Mehmet Burhanettin; Tanriover, Banu

    2016-11-01

Turkey, which meets nearly 70 per cent of its energy demand through imports, faces problems of energy security and a current account deficit as a result of its dependence on foreign sources of energy inputs. Turkey is also experiencing environmental problems due to increases in CO2 emissions. Given these problems in the Turkish economy, where energy inputs are widely used, it is necessary to use energy sources efficiently and to provide alternative energy sources. Because renewable sources depend on meteorological conditions (the absence of sufficient sun, wind, and water), they cannot provide energy generation efficiently and continuously. At this point, nuclear energy as an alternative energy source maintains its importance as a sustainable source that can supply energy 24 hours a day, 7 days a week. The main purpose of this study is to evaluate nuclear energy in the context of the negative public perceptions that emerged after the Chernobyl (1986) and Fukushima (2011) disasters and to investigate it within an economic framework.

  11. Using landscape-level forest monitoring data to draw a representative picture of an iconic subalpine tree species

    Treesearch

    Sara A. Goeking; Deborah K. Izlar

    2015-01-01

    Whitebark pine (Pinus albicaulis) is an ecologically important species in high-altitude, mid-latitude areas of western North America due to the habitat and food source it provides for many wildlife species. Recent concerns about the long-term viability of whitebark pine stands have arisen in the face of high mortality due to a combination of fire...

  12. Alpine Warming induced Nitrogen Export from Green Lakes Valley, Colorado Front Range, USA

    NASA Astrophysics Data System (ADS)

    Barnes, R. T.; Williams, M. W.; Parman, J.

    2012-12-01

Alpine ecosystems are particularly susceptible to disturbance due to their short growing seasons, sparse vegetation, and thin soils. Atmospheric nitrogen deposition and warming temperatures currently affect Green Lakes Valley (GLV) within the Colorado Front Range. Research conducted within the alpine links chronic nitrogen inputs to a suite of ecological impacts, resulting in increased nitrate export. According to NADP records at the site, the atmospheric flux of nitrogen has decreased by 0.56 kg ha-1 yr-1 since 2000, due to a decrease in precipitation. Concurrent with this decrease, alpine nitrate yields have continued to increase, by 32% relative to the previous decade (1990-1999). To determine the source(s) of the sustained nitrate increases, we used long-term datasets to construct a mass balance model for four stream segments (glacier to subalpine) for nitrogen and weathering product constituents. We also compared geochemical fingerprints of various solute sources (glacial meltwater, thawing permafrost, snow, and stream water) to alpine stream water to determine if sources had changed over time. Long-term trends indicate that, in addition to nitrate, sulfate, calcium, and silica have also increased over the same period. The geochemical composition of thawing permafrost (as indicated by rock glacier meltwater) suggests it is the source of these weathering products. Mass balance results indicate the high ammonium loads within glacial meltwater are rapidly nitrified, contributing approximately 0.45 kg yr-1 to the NO3- flux within the upper reaches of the watershed. The sustained export of these solutes during dry summer months is likely facilitated by the thawing cryosphere providing hydraulic connectivity late into the growing season.
In a neighboring catchment lacking permafrost and glacial features, there were no long-term weathering or nitrogen solute trends, providing further evidence that the changes in alpine chemistry in GLV are likely due to cryospheric thaw exposing soils to biological and geochemical processes. These findings suggest that efforts to reduce nitrogen deposition loads may not improve water quality, as the thawing cryosphere associated with climate change may affect alpine nitrate concentrations as much as, or more than, atmospheric deposition trends.
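The segment-scale mass balance described above amounts to differencing solute loads between gauges: the load gained in a reach is the outflow load minus the sum of inflow loads. A minimal sketch with made-up flows and concentrations (not Green Lakes Valley data):

```python
# Hypothetical segment mass balance: flows in L/s, concentrations in mg/L,
# so each product q * c is a solute load in mg/s.
def segment_gain(q_out, c_out, inflows):
    """Load gained within a reach: outflow load minus summed inflow loads.
    inflows is a list of (flow, concentration) pairs."""
    return q_out * c_out - sum(q * c for q, c in inflows)

# Glacier meltwater plus a hillslope tributary entering an alpine reach;
# all numbers are illustrative.
gain = segment_gain(q_out=120.0, c_out=0.40,
                    inflows=[(60.0, 0.55), (40.0, 0.10)])
# A positive gain indicates an in-reach source of the solute
# (e.g. nitrification of meltwater ammonium), a negative gain a sink.
```

Applying this reach by reach down the valley is what lets the abstract attribute the nitrate flux to nitrification in the upper watershed.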

  13. Flavobacterium sepsis outbreak due to contaminated distilled water in a neonatal intensive care unit.

    PubMed

    Mosayebi, Z; Movahedian, A H; Soori, T

    2011-07-01

Outbreaks of sepsis due to water or contaminated equipment can cause significant mortality and morbidity in neonatal intensive care units. We studied an outbreak among neonates caused by flavobacterium and investigated the characteristics of the infected neonates, antimicrobial susceptibilities, and the source of the outbreak. Forty-five neonates with documented flavobacterium sepsis were evaluated in this descriptive study. Data including sex, vaginal delivery or caesarean, preterm or term, birth weight, results of blood cultures and antibiograms were recorded and cases followed up until death or recovery. Environmental sampling for detecting the source of contamination was performed. Among the 45 patients, 28 (62.2%) were male and 17 (37.8%) female (P<0.001). The commonest clinical manifestation was respiratory distress (60%). Eighteen neonates (40%) were low birth weight. Thirty-seven neonates (82.2%) were born via caesarean section. Twenty (44.4%) of them were preterm whereas 25 (55.6%) were term (P<0.001). Mortality was 17.7%. All strains were resistant to ampicillin, and susceptible to amikacin. The source of outbreak was contaminated distilled water.

  14. A radio source occultation experiment with comet Austin 1982g, with unusual results

    NASA Technical Reports Server (NTRS)

    De Pater, I.; Ip, W.-H.

    1984-01-01

    A radio source occultation by comet Austin 1982g was observed on September 15-16, 1982. A change in the apparent position of 1242 + 41 by 1.3 arcsec occurred when the source was 220,000 km away from the cometary ion tail. If this change was due to refraction by the cometary plasma, it indicates an electron density of the plasma of about 10,000/cu cm. When the radio source was on the other side of the plasma tail, at a distance of 230,000 km, the position angle of the electric vector of the radio source changed gradually over about 140 deg within two hours. This observation cannot be explained in terms of ionospheric Faraday rotation, and results from either an intrinsic change in the radio source or Faraday rotation in the cometary plasma due to a change in the direction and/or strength of the magnetic field. In the latter case, the cometary coma must have an electron density and a magnetic field strength orders of magnitude larger than current theories predict.

  15. Water Resources Adaptation to Global Changes: Risk Management through Sustainable Infrastructure Planning and Managements

    EPA Science Inventory

    Global changes due to cyclic and long-term climatic variations, demographic changes and economic development, have impacts on the quality and quantity of potable and irrigation source waters. Internal and external climatic forcings, for example, redistribute precipitation season...

  16. Water Resources Adaptation to Global Changes: Risk Management through Sustainable Infrastructure Planning and Management - Paper

    EPA Science Inventory

    Global changes due to cyclic and long-term climatic variations, demographic changes and economic development, have impacts on the quality and quantity of potable and irrigation source waters. Internal and external climatic forcings, for example, redistribute precipitation season...

  17. Recent advances in laser-driven neutron sources

    NASA Astrophysics Data System (ADS)

    Alejo, A.; Ahmed, H.; Green, A.; Mirfayzi, S. R.; Borghesi, M.; Kar, S.

    2016-11-01

Due to the limited number and high cost of large-scale neutron facilities, there has been a growing interest in compact accelerator-driven sources. In this context, several potential schemes of laser-driven neutron sources are being intensively studied employing laser-accelerated electron and ion beams. In addition to the potential of delivering neutron beams with high brilliance, directionality and ultra-short burst duration, a laser-driven neutron source would offer further advantages in terms of cost-effectiveness, compactness and radiation confinement in close-coupled experiments. Some of the recent advances in this field are discussed, showing improvements in the directionality and flux of the laser-driven neutron beams.

  18. Unusual rainbows as auroral candidates: Another point of view

    NASA Astrophysics Data System (ADS)

    Carrasco, Víctor M. S.; Trigo, Ricardo M.; Vaquero, José M.

    2017-04-01

    Several auroral events that occurred in the past have not been cataloged as such due to the fact that they were described in the historical sources with different terminologies. Hayakawa et al. (2016, PASJ, 68, 33) have reviewed historical Oriental chronicles and proposed the terms “unusual rainbow” and “white rainbow” as candidates for auroras. In this work, we present three events that took place in the 18th century in two different settings (the Iberian Peninsula and Brazil) that were originally described with similar definitions or wording to that used by the Oriental chronicles, despite the inherent differences in terms associated with Oriental and Latin languages. We show that these terms are indeed applicable to the three case studies from Europe and South America. Thus, the auroral catalogs available can be extended to Occidental sources using this new terminology.

  19. Coastal and Estuarine Mangrove Squeeze in the Mekong and Saigon Delta

    NASA Astrophysics Data System (ADS)

    Stive, M.

    2016-02-01

In both the Mekong and Saigon deltas, coastal squeeze is a frequent and pressing problem that leads to alarming coastal and estuarine erosion rates. From the land side the squeeze is due to encroaching dike relocations and agri- and aquaculture; from the sea side it is due to diminishing sediment sources and relative sea level rise. These multiple pressures, particularly at locations far from the sediment sources (such as Ca Mau), lead to unprecedented erosion rates. Managed retreat may be a longer-term solution, but it will require a new way of thinking. Sand and silt nourishment strategies may be an innovative alternative, but they will require supporting scientific and practical research.

  20. Reduced mercury deposition in New Hampshire from 1996 to 2002 due to changes in local sources.

    PubMed

    Han, Young-Ji; Holsen, Thomas M; Evers, David C; Driscoll, Charles T

    2008-12-01

Changes in deposition of gaseous divalent mercury (Hg(II)) and particulate mercury (Hg(p)) in New Hampshire due to changes in local sources from 1996 to 2002 were assessed using the Industrial Source Complex Short Term (ISCST3) model (regional and global sources and Hg atmospheric reactions were not considered). Mercury (Hg) emissions in New Hampshire and adjacent areas decreased significantly (from 1540 to 880 kg yr-1) during this period, and the average annual modeled deposition of total Hg also declined from 17 to 7.0 μg m-2 yr-1 for the same period. In 2002, the maximum amount of Hg deposition was modeled to be in southern New Hampshire, while for 1996 the maximum deposition occurred farther north and east. The ISCST3 was also used to evaluate two future scenarios. The average percent difference in deposition across all cells was 5% for the 50% reduction scenario and 9% for the 90% reduction scenario.
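ISCST3 is a Gaussian-plume model. As a rough illustration of the kind of calculation involved, the sketch below evaluates the ground-level centreline concentration downwind of a single point source. The dispersion-coefficient formulas and every number here are hypothetical placeholders for a single stability class, not ISCST3's actual coefficients:

```python
import math

def plume_concentration(Q, u, x, H, a=0.08, b=0.06):
    """Ground-level centreline concentration (mass per m^3) of a Gaussian
    plume. Q: emission rate (g/s), u: wind speed (m/s), x: downwind
    distance (m), H: effective stack height (m). The sigma formulas below
    are illustrative assumptions, not ISCST3's tabulated coefficients."""
    sigma_y = a * x / math.sqrt(1.0 + 0.0001 * x)   # crosswind spread (m)
    sigma_z = b * x / math.sqrt(1.0 + 0.0015 * x)   # vertical spread (m)
    # Centreline (y = 0), receptor at ground level (z = 0), with the
    # ground-reflection term folded in (factor 2 cancels one 2 in 2*pi).
    return (Q / (math.pi * u * sigma_y * sigma_z)) * \
        math.exp(-H**2 / (2.0 * sigma_z**2))

# 1 g/s source, 3 m/s wind, 50 m effective stack height, 1 km downwind
c = plume_concentration(Q=1.0, u=3.0, x=1000.0, H=50.0)
```

Deposition models like ISCST3 then multiply such concentrations by species-specific deposition velocities and sum over sources and meteorological hours, which is why halving local emissions roughly halves modeled local deposition.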

  1. Seismic hazard assessment over time: Modelling earthquakes in Taiwan

    NASA Astrophysics Data System (ADS)

    Chan, Chung-Han; Wang, Yu; Wang, Yu-Ju; Lee, Ya-Ting

    2017-04-01

To assess the seismic hazard with temporal change in Taiwan, we develop a new approach combining the Brownian Passage Time (BPT) model and the Coulomb stress change, and implement the seismogenic source parameters of the Taiwan Earthquake Model (TEM). The BPT model was adopted to describe the rupture recurrence intervals of the specific fault sources, together with the time elapsed since the last fault rupture, to derive their long-term rupture probability. We also evaluate the short-term seismicity rate change based on the static Coulomb stress interaction between seismogenic sources. By considering the above time-dependent factors, our new combined model suggests an increased long-term seismic hazard in the vicinity of active faults along the western Coastal Plain and the Longitudinal Valley, where active faults have short recurrence intervals and long elapsed times since their last ruptures, and/or short-term elevated hazard levels right after the occurrence of large earthquakes due to the stress triggering effect. The stress enhanced by the 6 February 2016 Meinong ML 6.6 earthquake also significantly increased the rupture probabilities of several neighbouring seismogenic sources in Southwestern Taiwan and raised the hazard level in the near future. Our approach draws on the advantage of incorporating long- and short-term models to provide time-dependent earthquake probability constraints. Our time-dependent model considers more detailed information than any other published model. It thus offers decision-makers and public officials an adequate basis for rapid evaluation of and response to future emergency scenarios such as victim relocation and sheltering.
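The long-term part of such a model rests on the BPT (inverse Gaussian) renewal distribution: the probability of rupture in the next ΔT years, given the time already elapsed since the last rupture, is a conditional probability under that density. A minimal sketch, with fault parameters (mean recurrence interval, aperiodicity, elapsed time) that are illustrative assumptions rather than TEM values:

```python
import math

def bpt_pdf(t, mu, alpha):
    """Brownian Passage Time (inverse Gaussian) density with mean
    recurrence interval mu and aperiodicity alpha."""
    return math.sqrt(mu / (2.0 * math.pi * alpha**2 * t**3)) * \
        math.exp(-(t - mu)**2 / (2.0 * mu * alpha**2 * t))

def bpt_cdf(t, mu, alpha, n=10000):
    """CDF by trapezoidal integration of the density from 0 to t."""
    dt = t / n
    total = 0.0
    for i in range(1, n + 1):
        a, b = (i - 1) * dt, i * dt
        fa = bpt_pdf(a, mu, alpha) if a > 0 else 0.0
        total += 0.5 * (fa + bpt_pdf(b, mu, alpha)) * dt
    return total

def conditional_rupture_prob(elapsed, window, mu, alpha):
    """P(rupture within the next `window` years | quiet for `elapsed` years)."""
    f_e = bpt_cdf(elapsed, mu, alpha)
    return (bpt_cdf(elapsed + window, mu, alpha) - f_e) / (1.0 - f_e)

# e.g. a fault with a 200-yr mean interval and aperiodicity 0.5,
# quiet for 150 yr: probability of rupture in the next 50 yr
p = conditional_rupture_prob(150.0, 50.0, 200.0, 0.5)
```

A Coulomb stress step from a nearby earthquake is typically folded in by shifting the effective elapsed time (or the mean interval) before evaluating this conditional probability, which is how the Meinong event raises neighbouring sources' probabilities.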

  2. Revisiting the radionuclide atmospheric dispersion event of the Chernobyl disaster - modelling sensitivity and data assimilation

    NASA Astrophysics Data System (ADS)

    Roustan, Yelva; Duhanyan, Nora; Bocquet, Marc; Winiarek, Victor

    2013-04-01

A sensitivity study of the numerical model, as well as an inverse modelling approach applied to the atmospheric dispersion issues after the Chernobyl disaster, are both presented in this paper. On the one hand, the robustness of the source term reconstruction through advanced data assimilation techniques was tested. On the other hand, the classical approaches for sensitivity analysis were enhanced by the use of an optimised forcing field which is otherwise known to be strongly uncertain. The POLYPHEMUS air quality system was used to perform the simulations of radionuclide dispersion. Activity concentrations in air and deposited to the ground of iodine-131, caesium-137 and caesium-134 were considered. The impact of the implemented parameterizations of the physical processes (dry and wet deposition, vertical turbulent diffusion), of the forcing fields (meteorology and source terms) and of the numerical configuration (horizontal resolution) was investigated for the sensitivity study of the model. A four-dimensional variational scheme (4D-Var) based on the approximate adjoint of the chemistry transport model was used to invert the source term. The data assimilation is performed with measurements of activity concentrations in air extracted from the Radioactivity Environmental Monitoring (REM) database. For most of the investigated configurations (sensitivity study), the statistics comparing the model results to the field measurements of concentrations in air are clearly improved when using a reconstructed source term. As regards the ground-deposited concentrations, an improvement can only be seen in the case of a satisfactorily modelled episode. Through these studies, the source term and the meteorological fields are shown to have a major impact on the activity concentrations in air. These studies also reinforce the use of a reconstructed source term instead of the usual estimated one.
A more detailed parameterization of the deposition process also seems able to improve the simulation results. For deposited activities the results are more complex, probably due to a strong sensitivity to some of the meteorological fields, which remain quite uncertain.
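The source-term inversion step can be illustrated with a much-simplified linear analogue: for a linear observation operator, 4D-Var with Gaussian background and observation errors reduces to regularised least squares on the release rates. Every matrix and value below is synthetic, standing in for the transport model's source-receptor sensitivities and the REM measurements:

```python
import numpy as np

# Observed air concentrations y relate linearly to the release rates s
# through a source-receptor matrix H (in practice computed by running the
# transport model or its adjoint). All data here are synthetic.
rng = np.random.default_rng(0)
n_obs, n_src = 40, 6
H = rng.uniform(0.0, 1.0, size=(n_obs, n_src))        # transport operator
s_true = np.array([5.0, 20.0, 3.0, 0.5, 12.0, 8.0])   # "true" release rates
y = H @ s_true + rng.normal(0.0, 0.1, size=n_obs)     # noisy observations

# Tikhonov-regularised least squares: the lam * I term plays the role of
# the background-error term in the 4D-Var cost function.
lam = 1e-3
s_hat = np.linalg.solve(H.T @ H + lam * np.eye(n_src), H.T @ y)
```

The real problem adds nonlinearity, error covariances, and a positivity constraint on releases, but the structure, misfit term plus regularisation, is the same.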

  3. Long-Term Temporal Trends of Polychlorinated Biphenyls and Their Controlling Sources in China.

    PubMed

    Zhao, Shizhen; Breivik, Knut; Liu, Guorui; Zheng, Minghui; Jones, Kevin C; Sweetman, Andrew J

    2017-03-07

Polychlorinated biphenyls (PCBs) are industrial organic contaminants identified as persistent, bioaccumulative, toxic (PBT), and subject to long-range transport (LRT) with global-scale significance. This study focuses on a reconstruction and prediction for China of the long-term emission trends of intentionally produced (IP) and unintentionally produced (UP) Σ7PCBs (UP-PCBs arising from the manufacture of steel, cement, and sinter iron) and their re-emissions from secondary sources (e.g., soils and vegetation) using a dynamic fate model (BETR-Global). Contemporary emission estimates combined with predictions from the multimedia fate model suggest that primary sources still dominate, although unintentional sources are predicted to become a main contributor from 2035 for PCB-28. Imported e-waste is predicted to play an increasing role until 2020-2030 on a national scale due to the decline of IP emissions. Hypothetical emission scenarios suggest that China could become a potential source to neighboring regions, with a net output of ∼0.4 t yr-1 by around 2050. However, future emission scenarios and hence model results will be dictated by the efficiency of control measures.

  4. Error sources in passive and active microwave satellite soil moisture over Australia

    USDA-ARS?s Scientific Manuscript database

    Development of a long-term climate record of soil moisture (SM) involves combining historic and present satellite-retrieved SM data sets. This in turn requires a consistent characterization and deep understanding of the systematic differences and errors in the individual data sets, which vary due to...

  5. Geocoronal hydrogen studies using Fabry Perot interferometers, part 2: Long-term observations

    NASA Astrophysics Data System (ADS)

    Nossal, S. M.; Mierkiewicz, E. J.; Roesler, F. L.; Reynolds, R. J.; Haffner, L. M.

    2006-09-01

    Long-term data sets are required to investigate sources of natural variability in the upper atmosphere. Understanding the influence of sources of natural variability such as the solar cycle is needed to characterize the thermosphere + exosphere, to understand coupling processes between atmospheric regions, and to isolate signatures of natural variability from those due to human-caused change. Multi-year comparisons of thermospheric + exospheric Balmer α emissions require cross-calibrated and well-understood instrumentation, a stable calibration source, reproducible observing conditions, separation of the terrestrial from the Galactic emission line, and consistent data analysis accounting for differences in viewing geometry. We discuss how we address these criteria in the acquisition and analysis of a mid-latitude geocoronal Balmer α column emission data set now spanning two solar cycles and taken mainly from Wisconsin and Kitt Peak, Arizona. We also discuss results and outstanding challenges for increasing the accuracy and use of these observations.

  6. Atomic processes and equation of state of high Z plasmas for EUV sources and their effects on the spatial and temporal evolution of the plasmas

    NASA Astrophysics Data System (ADS)

    Sasaki, Akira; Sunahara, Atushi; Furukawa, Hiroyuki; Nishihara, Katsunobu; Nishikawa, Takeshi; Koike, Fumihiro

    2016-03-01

Laser-produced plasma (LPP) extreme ultraviolet (EUV) light sources have been intensively investigated due to their potential application to next-generation semiconductor technology. Current studies focus on the atomic processes and hydrodynamics of plasmas to develop shorter-wavelength sources at λ = 6.x nm as well as to improve the conversion efficiency (CE) of λ = 13.5 nm sources. This paper examines the atomic processes of mid-Z elements, which are potential candidates for a λ = 6.x nm source using n = 3-3 transitions. Furthermore, a method to calculate the hydrodynamics of the plasmas in terms of the initial interaction with a relatively weak prepulse laser is presented.

  7. Modeling and observations of an elevated, moving infrasonic source: Eigenray methods.

    PubMed

    Blom, Philip; Waxler, Roger

    2017-04-01

    The acoustic ray tracing relations are extended by the inclusion of auxiliary parameters describing variations in the spatial ray coordinates and eikonal vector due to changes in the initial conditions. Computation of these parameters allows one to define the geometric spreading factor along individual ray paths and assists in identification of caustic surfaces so that phase shifts can be easily identified. A method is developed leveraging the auxiliary parameters to identify propagation paths connecting specific source-receiver geometries, termed eigenrays. The newly introduced method is found to be highly efficient in cases where propagation is non-planar due to horizontal variations in the propagation medium or the presence of cross winds. The eigenray method is utilized in analysis of infrasonic signals produced by a multi-stage sounding rocket launch with promising results for applications of tracking aeroacoustic sources in the atmosphere and specifically to analysis of motor performance during dynamic tests.

  8. Management of Ultimate Risk of Nuclear Power Plants by Source Terms - Lessons Learned from the Chernobyl Accident

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Genn Saji

    2006-07-01

The term 'ultimate risk' is used here to describe the probabilities and radiological consequences that should be incorporated in the siting, containment design, and accident management of nuclear power plants for hypothetical accidents. It is closely related to the source terms specified in siting criteria, which assure an adequate separation of the plants' radioactive inventories from the public in the event of a hypothetical, severe accident. The author would like to point out that current source terms, which are based on information from the Windscale accident (1957) via TID-14844, are very outdated and do not incorporate lessons learned from either the Three Mile Island (TMI, 1979) or the Chernobyl (1986) accident, two of the most severe accidents ever experienced. As a result of the observations of benign radionuclide releases at TMI, the technical community in the US felt that a more realistic evaluation of severe reactor accident source terms was necessary. Against this background, the 'source term research project' was organized in 1984 to respond to these challenges. Unfortunately, soon after the final report from this project was released, the Chernobyl accident occurred. Due to the enormous consequences of that accident, the once-optimistic prospects for establishing a more realistic source term were completely shattered. The Chernobyl accident, with its human death toll and the dispersion of a large part of the fission fragment inventories into the environment, significantly degraded public acceptance of nuclear energy throughout the world. In spite of this, nuclear communities have been prudent in responding to the public's anxiety about the ultimate safety of nuclear plants, since many unknowns still surround the mechanism of the Chernobyl accident.
In order to resolve some of these mysteries, the author has performed a scoping study of the dispersion and deposition mechanisms of fuel particles and fission fragments during the initial phase of the Chernobyl accident. Through this study, it is now possible to broadly reconstruct the radiological consequences by using a dispersion calculation technique combined with the meteorological data at the time of the accident and the 137Cs land contamination densities measured and reported around the Chernobyl area. Although it is challenging to incorporate lessons learned from the Chernobyl accident into the source term issues, the author has already developed an example of safety goals incorporating the radiological consequences of the accident. The example provides safety goals by specifying source term releases in a graded approach in combination with probabilities, i.e., risks. The author believes that future source term specifications should be directly linked with safety goals. (author)

  9. Detailed source term estimation of the atmospheric release for the Fukushima Daiichi Nuclear Power Station accident by coupling simulations of atmospheric dispersion model with improved deposition scheme and oceanic dispersion model

    NASA Astrophysics Data System (ADS)

    Katata, G.; Chino, M.; Kobayashi, T.; Terada, H.; Ota, M.; Nagai, H.; Kajino, M.; Draxler, R.; Hort, M. C.; Malo, A.; Torii, T.; Sanada, Y.

    2014-06-01

Temporal variations in the amount of radionuclides released into the atmosphere during the Fukushima Dai-ichi Nuclear Power Station (FNPS1) accident and their atmospheric and marine dispersion are essential to evaluate the environmental impacts and resultant radiological doses to the public. In this paper, we estimate a detailed time trend of atmospheric releases during the accident by combining environmental monitoring data with atmospheric model simulations from WSPEEDI-II (Worldwide version of System for Prediction of Environmental Emergency Dose Information), and simulations from the oceanic dispersion model SEA-GEARN-FDM, both developed by the authors. A sophisticated deposition scheme, which deals with dry and fogwater deposition, cloud condensation nuclei (CCN) activation, and subsequent wet scavenging due to mixed-phase cloud microphysics (in-cloud scavenging) for radioactive iodine gas (I2 and CH3I) and other particles (CsI, Cs, and Te), was incorporated into WSPEEDI-II to improve the surface deposition calculations. The fallout to the ocean surface calculated by WSPEEDI-II was used as input data for the SEA-GEARN-FDM calculations. Reverse and inverse source-term estimation methods based on coupling the simulations from both models were adopted, using air dose rates and concentrations, and sea surface concentrations. The results revealed that the major releases of radionuclides due to the FNPS1 accident occurred in the following periods during March 2011: the afternoon of 12 March, due to the wet venting and hydrogen explosion at Unit 1; the morning of 13 March, after the venting event at Unit 3; midnight of 14 March, when the SRV (Safety Relief Valve) at Unit 2 was opened three times; the morning and night of 15 March; and the morning of 16 March.
According to the simulation results, the highest radioactive contamination areas around FNPS1 were created from 15 to 16 March by complicated interactions among rainfall, plume movements, and the temporal variation of release rates associated with reactor pressure changes in Units 2 and 3. The modified WSPEEDI-II simulation using the new source term reproduced local and regional patterns of cumulative surface deposition of total 131I and 137Cs and the air dose rates obtained by airborne surveys. The new source term was also tested using three atmospheric dispersion models (MLDP0, HYSPLIT, and NAME) for regional and global calculations and showed good agreement between calculated and observed air concentrations and surface deposition of 137Cs in East Japan. Moreover, the HYSPLIT model using the new source term also reproduced the plume arrivals in several other countries, showing a good correlation with measured air concentration data. A large part of the deposition pattern of total 131I and 137Cs in East Japan was explained by in-cloud particulate scavenging. However, for the regional-scale contaminated areas, there were large uncertainties due to the overestimation of rainfall amounts and the underestimation of fogwater and drizzle deposition. The computations showed that approximately 27% of the 137Cs discharged from FNPS1 was deposited on land in East Japan, mostly in forest areas.

  10. Evolution of air pollution source contributions over one decade, derived by PM10 and PM2.5 source apportionment in two metropolitan urban areas in Greece

    NASA Astrophysics Data System (ADS)

    Diapouli, E.; Manousakas, M.; Vratolis, S.; Vasilatou, V.; Maggos, Th; Saraga, D.; Grigoratos, Th; Argyropoulos, G.; Voutsa, D.; Samara, C.; Eleftheriadis, K.

    2017-09-01

    Metropolitan Urban areas in Greece have been known to suffer from poor air quality, due to variety of emission sources, topography and climatic conditions favouring the accumulation of pollution. While a number of control measures have been implemented since the 1990s, resulting in reductions of atmospheric pollution and changes in emission source contributions, the financial crisis which started in 2009 has significantly altered this picture. The present study is the first effort to assess the contribution of emission sources to PM10 and PM2.5 concentration levels and their long-term variability (over 5-10 years), in the two largest metropolitan urban areas in Greece (Athens and Thessaloniki). Intensive measurement campaigns were conducted during 2011-2012 at suburban, urban background and urban traffic sites in these two cities. In addition, available datasets from previous measurements in Athens and Thessaloniki were used in order to assess the long-term variability of concentrations and sources. Chemical composition analysis of the 2011-2012 samples showed that carbonaceous matter was the most abundant component for both PM size fractions. Significant increase of carbonaceous particle concentrations and of OC/EC ratio during the cold period, especially in the residential urban background sites, pointed towards domestic heating and more particularly wood (biomass) burning as a significant source. PMF analysis further supported this finding. Biomass burning was the largest contributing source at the two urban background sites (with mean contributions for the two size fractions in the range of 24-46%). Secondary aerosol formation (sulphate, nitrate & organics) was also a major contributing source for both size fractions at the suburban and urban background sites. At the urban traffic site, vehicular traffic (exhaust and non-exhaust emissions) was the source with the highest contributions, accounting for 44% of PM10 and 37% of PM2.5, respectively. 
The long-term variability of emission sources in the two cities (over 5-10 years), assessed through a harmonized application of the PMF technique on recent and past year data, clearly demonstrates the effective reduction in emissions during the last decade due to control measures and technological development; however, it also reflects the effects of the financial crisis in Greece during these years, which has led to decreased economic activities and the adoption of more polluting practices by the local population in an effort to reduce living costs.
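The PMF apportionment used in studies like the one above factorizes a samples-by-species concentration matrix into non-negative source contributions and source profiles. A minimal sketch of the core idea, using plain NMF with Lee-Seung multiplicative updates (full PMF additionally weights each observation by its measurement uncertainty, which this sketch omits; all data here are synthetic):

```python
import numpy as np

def nmf(X, k, n_iter=1000, eps=1e-9, seed=0):
    """Lee-Seung multiplicative updates: X (n_samples x n_species) ~ G @ F,
    G = source contributions (n_samples x k), F = source profiles (k x n_species).
    Non-negativity is preserved because updates are multiplicative."""
    rng = np.random.default_rng(seed)
    n, m = X.shape
    G = rng.random((n, k)) + eps
    F = rng.random((k, m)) + eps
    for _ in range(n_iter):
        F *= (G.T @ X) / (G.T @ G @ F + eps)
        G *= (X @ F.T) / (G @ F @ F.T + eps)
    return G, F

# Synthetic example: two "sources" mixed into 100 samples of 6 species
rng = np.random.default_rng(1)
true_F = np.array([[5.0, 1.0, 0.0, 2.0, 0.0, 1.0],
                   [0.0, 2.0, 4.0, 0.0, 3.0, 0.0]])
true_G = rng.random((100, 2)) * 10
X = true_G @ true_F + 0.01 * rng.random((100, 6))
G, F = nmf(X, k=2)
print(np.linalg.norm(X - G @ F) / np.linalg.norm(X))  # small relative error
```

Rows of `F`, normalized, play the role of PMF source profiles; the corresponding columns of `G` give each source's contribution per sample.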

  11. Numerical simulations of the Cosmic Battery in accretion flows around astrophysical black holes

    NASA Astrophysics Data System (ADS)

    Contopoulos, I.; Nathanail, A.; Sądowski, A.; Kazanas, D.; Narayan, R.

    2018-01-01

    We implement the KORAL code to perform two sets of very long general relativistic radiation magnetohydrodynamic simulations of an axisymmetric optically thin magnetized flow around a non-rotating black hole: one with a new term in the electromagnetic field tensor due to the radiation pressure felt by the plasma electrons in the comoving frame of the electron-proton plasma, and one without. The source of the radiation is the accretion flow itself. Without the new term, the system evolves to a standard accretion flow due to the development of the magneto-rotational instability. With the new term, however, the system eventually evolves to a magnetically arrested disc state in which a large-scale jet-like magnetic field threads the black hole horizon. Our results confirm the secular action of the Cosmic Battery in accretion flows around astrophysical black holes.

  12. SOURCE IDENTIFICATION OF PM2.5 IN AN ARID NORTHWEST U.S. CITY BY POSITIVE MATRIX FACTORIZATION. (R827355C008)

    EPA Science Inventory

    Spokane, WA is prone to frequent particulate pollution episodes due to dust storms, biomass burning, and periods of stagnant meteorological conditions. Spokane is the location of a long-term study examining the association between health effects and chemical or physical consti...

  13. Mach wave properties in the presence of source and medium heterogeneity

    NASA Astrophysics Data System (ADS)

    Vyas, J. C.; Mai, P. M.; Galis, M.; Dunham, Eric M.; Imperatori, W.

    2018-06-01

    We investigate Mach wave coherence for kinematic supershear ruptures with spatially heterogeneous source parameters, embedded in 3D scattering media. We assess Mach wave coherence considering: (1) source heterogeneities in terms of variations in slip, rise time and rupture speed; (2) small-scale heterogeneities in Earth structure, parameterized from combinations of three correlation lengths and two standard deviations (assuming a von Karman power spectral density with a fixed Hurst exponent); and (3) joint effects of source and medium heterogeneities. Ground-motion simulations are conducted using a generalized finite-difference method, choosing a parameterization such that the highest resolved frequency is ˜5 Hz. We discover that Mach wave coherence is slightly diminished at near-fault distances (< 10 km) due to spatially variable slip and rise time; beyond this distance the Mach wave coherence is more strongly reduced by wavefield scattering due to small-scale heterogeneities in Earth structure. Based on our numerical simulations and theoretical considerations we demonstrate that the standard deviation of the medium heterogeneities controls the wavefield scattering, rather than the correlation length. In addition, we find that peak ground accelerations in the case of combined source and medium heterogeneities are consistent with empirical ground motion prediction equations for all distances, suggesting that in nature ground shaking amplitudes for supershear ruptures may not be elevated, due to complexities in the rupture process and seismic wave scattering.

  14. Sensitivity Analysis Tailored to Constrain 21st Century Terrestrial Carbon-Uptake

    NASA Astrophysics Data System (ADS)

    Muller, S. J.; Gerber, S.

    2013-12-01

    The long-term fate of terrestrial carbon (C) in response to climate change remains a dominant source of uncertainty in Earth-system model projections. Increasing atmospheric CO2 could be mitigated by long-term net uptake of C, through processes such as increased plant productivity due to "CO2-fertilization". Conversely, atmospheric conditions could be exacerbated by long-term net release of C, through processes such as increased decomposition due to higher temperatures. This balance is an important area of study, and a major source of uncertainty in long-term (>year 2050) projections of planetary response to climate change. We present results from an innovative application of sensitivity analysis to LM3V, a dynamic global vegetation model (DGVM), intended to identify observed/observable variables that are useful for constraining long-term projections of C-uptake. We analyzed the sensitivity of cumulative C-uptake by 2100, as modeled by LM3V in response to IPCC AR4 scenario climate data (1860-2100), to perturbations in over 50 model parameters. We concurrently analyzed the sensitivity of over 100 observable model variables, during the extant record period (1970-2010), to the same parameter changes. By correlating the sensitivities of observable variables with the sensitivity of long-term C-uptake we identified model calibration variables that would also constrain long-term C-uptake projections. LM3V employs a coupled carbon-nitrogen cycle to account for N-limitation, and we find that N-related variables have an important role to play in constraining long-term C-uptake. This work has implications for prioritizing field campaigns to collect global data that can help reduce uncertainties in the long-term land-atmosphere C-balance. 
Though the results of this study are specific to LM3V, the processes that characterize this model are not completely divorced from other DGVMs (or reality), and our approach provides valuable insights into how data can be leveraged to better constrain projections for the land carbon sink.

  15. Aeromicrobiology/air quality

    USGS Publications Warehouse

    Andersen, Gary L.; Frisch, A.S.; Kellogg, Christina A.; Levetin, E.; Lighthart, Bruce; Paterno, D.

    2009-01-01

    The most prevalent microorganisms, viruses, bacteria, and fungi, are introduced into the atmosphere from many anthropogenic sources, such as agricultural, industrial and urban activities (termed microbial air pollution, MAP), and from natural sources, including soil, vegetation, and ocean surfaces that have been disturbed by atmospheric turbulence. The airborne concentrations range from nil to great numbers and change as functions of time of day, season, location, and upwind sources. While airborne, the organisms may settle out immediately or be transported great distances. Furthermore, most viable airborne cells can be rendered nonviable by temperature effects, dehydration or rehydration, UV radiation, and/or air pollution. Mathematical microbial survival models that simulate these effects have been developed.

  16. The Scaling of Broadband Shock-Associated Noise with Increasing Temperature

    NASA Technical Reports Server (NTRS)

    Miller, Steven A.

    2012-01-01

    A physical explanation for the saturation of broadband shock-associated noise (BBSAN) intensity with increasing jet stagnation temperature has eluded investigators. An explanation is proposed for this phenomenon with the use of an acoustic analogy. For this purpose the acoustic analogy of Morris and Miller is examined. To isolate the relevant physics, the scaling of BBSAN at the peak intensity level at the sideline (psi = 90 degrees) observer location is examined. Scaling terms are isolated from the acoustic analogy and the result is compared using a convergent nozzle with the experiments of Bridges and Brown and using a convergent-divergent nozzle with the experiments of Kuo, McLaughlin, and Morris at four nozzle pressure ratios in increments of total temperature ratios from one to four. The equivalent source within the framework of the acoustic analogy for BBSAN is based on local field quantities at shock wave shear layer interactions. The equivalent source, combined with accurate calculations of the propagation of sound through the jet shear layer using an adjoint vector Green's function solver of the linearized Euler equations, allows for predictions that retain the scaling with respect to stagnation pressure and capture the saturation of BBSAN with increasing stagnation temperature. This is a minor change to the source model relative to the previously developed models. The full development of the scaling term is shown. The sources and vector Green's function solver are informed by steady Reynolds-Averaged Navier-Stokes solutions. These solutions are examined as a function of stagnation temperature at the first shock wave shear layer interaction. 
It is discovered that saturation of BBSAN with increasing jet stagnation temperature occurs due to a balance between the amplification of the sound propagation through the shear layer and the source term scaling.

  17. Inverse modelling of radionuclide release rates using gamma dose rate observations

    NASA Astrophysics Data System (ADS)

    Hamburger, Thomas; Stohl, Andreas; von Haustein, Christoph; Thummerer, Severin; Wallner, Christian

    2014-05-01

    Severe accidents in nuclear power plants, such as the historical accident in Chernobyl in 1986 or the more recent disaster in the Fukushima Dai-ichi nuclear power plant in 2011, have drastic impacts on the population and environment. The hazardous consequences reach out on a national and continental scale. Environmental measurements and methods to model the transport and dispersion of the released radionuclides serve as a platform to assess the regional impact of nuclear accidents, both for research purposes and, more importantly, to determine the immediate threat to the population. However, the assessments of the regional radionuclide activity concentrations and of the individual radiation dose are subject to several uncertainties, for example the accurate model representation of wet and dry deposition. One of the most significant uncertainties, however, results from the estimation of the source term, that is, the time-dependent quantification of the released spectrum of radionuclides during the course of the nuclear accident. The quantification of the source terms of severe nuclear accidents may either remain uncertain (e.g. Chernobyl, Devell et al., 1995) or rely on rather rough estimates of released key radionuclides given by the operators. Precise measurements are mostly missing due to practical limitations during the accident. Inverse modelling can be used to realise a feasible estimation of the source term (Davoine and Bocquet, 2007). Existing point measurements of radionuclide activity concentrations are combined with atmospheric transport models, and the release rates of radionuclides at the accident site are then obtained by improving the agreement between the modelled and observed concentrations (Stohl et al., 2012). The accuracy of the method, and hence of the resulting source term, depends amongst others on the availability, reliability and the resolution in time and space of the observations. 
Radionuclide activity concentrations are observed on a relatively sparse grid, and the temporal resolution of the available data may be low, on the order of hours or a day. Gamma dose rates, on the other hand, are observed routinely on a much denser grid and at higher temporal resolution. Gamma dose rate measurements contain no explicit information on the observed spectrum of radionuclides and have to be interpreted carefully. Nevertheless, they provide valuable information for the inverse evaluation of the source term due to their availability (Saunier et al., 2013). We present a new inversion approach combining an atmospheric dispersion model and observations of radionuclide activity concentrations and gamma dose rates to obtain the source term of radionuclides. We use the Lagrangian particle dispersion model FLEXPART (Stohl et al., 1998; Stohl et al., 2005) to model the atmospheric transport of the released radionuclides. The gamma dose rates are calculated from the modelled activity concentrations. The inversion method uses a Bayesian formulation considering uncertainties for the a priori source term and the observations (Eckhardt et al., 2008). The a priori information on the source term is a first guess; the gamma dose rate observations are used with inverse modelling to improve this first guess and to retrieve a reliable source term. The details of this method will be presented at the conference. This work is funded by the Bundesamt für Strahlenschutz BfS, Forschungsvorhaben 3612S60026. References: Davoine, X. and Bocquet, M., Atmos. Chem. Phys., 7, 1549-1564, 2007. Devell, L., et al., OCDE/GD(96)12, 1995. Eckhardt, S., et al., Atmos. Chem. Phys., 8, 3881-3897, 2008. Saunier, O., et al., Atmos. Chem. Phys., 13, 11403-11421, 2013. Stohl, A., et al., Atmos. Environ., 32, 4245-4264, 1998. Stohl, A., et al., Atmos. Chem. Phys., 5, 2461-2474, 2005. Stohl, A., et al., Atmos. Chem. Phys., 12, 2313-2343, 2012.
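The Bayesian formulation described above has a closed-form minimum when the cost function is quadratic in the release rates. A minimal sketch under simplifying assumptions (a linear source-receptor matrix `M` taken as given from a dispersion model, uncorrelated Gaussian errors, scalar uncertainties; all names and numbers are illustrative, not from the study):

```python
import numpy as np

def invert_source_term(M, y, x_prior, sigma_obs, sigma_prior):
    """Analytic minimum of the quadratic Bayesian cost function
    J(x) = |M x - y|^2 / sigma_obs^2 + |x - x_prior|^2 / sigma_prior^2,
    where x holds release rates per time interval and y the observations."""
    n = len(x_prior)
    A = M.T @ M / sigma_obs**2 + np.eye(n) / sigma_prior**2
    b = M.T @ (y - M @ x_prior) / sigma_obs**2
    return x_prior + np.linalg.solve(A, b)

# Toy example: 3 release intervals observed at 8 hypothetical stations
rng = np.random.default_rng(0)
M = rng.random((8, 3))               # source-receptor sensitivities (model output)
x_true = np.array([5.0, 1.0, 3.0])   # "true" release rates
y = M @ x_true + 0.01 * rng.standard_normal(8)   # noisy observations
x_prior = np.array([2.0, 2.0, 2.0])  # rough first-guess source term
x_hat = invert_source_term(M, y, x_prior, sigma_obs=0.01, sigma_prior=10.0)
print(x_hat)  # close to x_true
```

With a loose prior and precise observations the retrieval is dominated by the data; tightening `sigma_prior` pulls the solution toward the first guess, which is the mechanism the abstract relies on when observations are sparse.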

  18. Detecting the permafrost carbon feedback: talik formation and increased cold-season respiration as precursors to sink-to-source transitions

    NASA Astrophysics Data System (ADS)

    Parazoo, Nicholas C.; Koven, Charles D.; Lawrence, David M.; Romanovsky, Vladimir; Miller, Charles E.

    2018-01-01

    Thaw and release of permafrost carbon (C) due to climate change is likely to offset increased vegetation C uptake in northern high-latitude (NHL) terrestrial ecosystems. Models project that this permafrost C feedback may act as a slow leak, in which case detection and attribution of the feedback may be difficult. The formation of talik, a subsurface layer of perennially thawed soil, can accelerate permafrost degradation and soil respiration, ultimately shifting the C balance of permafrost-affected ecosystems from long-term C sinks to long-term C sources. It is imperative to understand and characterize mechanistic links between talik, permafrost thaw, and respiration of deep soil C to detect and quantify the permafrost C feedback. Here, we use the Community Land Model (CLM) version 4.5, a permafrost and biogeochemistry model, in comparison to long-term deep borehole data along North American and Siberian transects, to investigate thaw-driven C sources in NHL ( > 55° N) from 2000 to 2300. Widespread talik at depth is projected across most of the NHL permafrost region (14 million km2) by 2300, 6.2 million km2 of which is projected to become a long-term C source, emitting 10 Pg C by 2100, 50 Pg C by 2200, and 120 Pg C by 2300, with few signs of slowing. Roughly half of the projected C source region is in predominantly warm sub-Arctic permafrost following talik onset. This region emits only 20 Pg C by 2300, but the CLM4.5 estimate may be biased low by not accounting for deep C in yedoma. Accelerated decomposition of deep soil C following talik onset shifts the ecosystem C balance away from surface dominant processes (photosynthesis and litter respiration), but sink-to-source transition dates are delayed by 20-200 years by high ecosystem productivity, such that talik peaks early ( ˜ 2050s, although borehole data suggest sooner) and C source transition peaks late ( ˜ 2150-2200). 
The remaining C source region in cold northern Arctic permafrost, which shifts to a net source early (late 21st century), emits 5 times more C (95 Pg C) by 2300, and prior to talik formation due to the high decomposition rates of shallow, young C in organic-rich soils coupled with low productivity. Our results provide important clues signaling imminent talik onset and C source transition, including (1) late cold-season (January-February) soil warming at depth ( ˜ 2 m), (2) increasing cold-season emissions (November-April), and (3) enhanced respiration of deep, old C in warm permafrost and young, shallow C in organic-rich cold permafrost soils. Our results suggest a mosaic of processes that govern carbon source-to-sink transitions at high latitudes and emphasize the urgency of monitoring soil thermal profiles, organic C age and content, cold-season CO2 emissions, and atmospheric 14CO2 as key indicators of the permafrost C feedback.

  19. On the effect of using the Shapiro filter to smooth winds on a sphere

    NASA Technical Reports Server (NTRS)

    Takacs, L. L.; Balgovind, R. C.

    1984-01-01

    Spatial differencing schemes which are neither enstrophy conserving nor implicitly damping require global filtering of short waves to eliminate the build-up of energy in the shortest wavelengths due to aliasing. Takacs and Balgovind (1983) have shown that filtering on a sphere with a latitude-dependent damping function will cause spurious vorticity and divergence source terms to occur if care is not taken to ensure the irrotationality of the gradients of the stream function and velocity potential. Using a shallow water model with fourth-order energy-conserving spatial differencing, it is found that using a 16th-order Shapiro (1979) filter on the winds and heights to control nonlinear instability also creates spurious source terms when the winds are filtered in the meridional direction.
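The Shapiro filter referred to above has a simple one-dimensional form: the order-2n filter has spectral response 1 - sin^(2n)(k dx/2), so the 2-gridpoint wave is removed exactly while well-resolved waves pass nearly untouched. A minimal periodic-domain sketch (n = 8 corresponds to the 16th-order filter mentioned in the abstract; this illustrates the filter itself, not the spherical-geometry issues the paper studies):

```python
import numpy as np

def shapiro_filter(u, n=8):
    """Order-2n Shapiro filter on a periodic 1D field:
    u_new = u - (-1)**n * (delta^2 / 4)**n u, where delta^2 is the
    discrete Laplacian. Response is 1 - sin^(2n)(k dx / 2)."""
    d = u.copy()
    for _ in range(n):  # apply (delta^2 / 4) n times
        d = 0.25 * (np.roll(d, -1) - 2.0 * d + np.roll(d, 1))
    return u - (-1.0) ** n * d

x = np.linspace(0, 2 * np.pi, 64, endpoint=False)
long_wave = np.sin(x)                            # well-resolved signal
two_grid = 0.3 * np.cos(np.pi * np.arange(64))   # 2-gridpoint noise (+0.3, -0.3, ...)
filtered = shapiro_filter(long_wave + two_grid)
print(np.max(np.abs(filtered - long_wave)))      # 2-grid noise removed almost exactly
```

On a sphere the filter is applied along latitude circles, which is where the latitude-dependent damping, and hence the spurious vorticity and divergence sources discussed in the abstract, enters.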

  20. Effect of inlet conditions on the turbulent statistics in a buoyant jet

    NASA Astrophysics Data System (ADS)

    Kumar, Rajesh; Dewan, Anupam

    2015-11-01

    Buoyant jets have been the subject of research due to their technological and environmental importance in many physical processes, such as the spread of smoke and toxic gases from fires and the release of gases from volcanic eruptions and industrial stacks. The flow near the source is initially laminar and quickly transitions to turbulence. We present a large eddy simulation of a buoyant jet. In the present study a careful investigation has been carried out of the influence of inlet conditions at the source on the turbulent statistics far from the source. It has been observed that the influence of the initial conditions on the second-order buoyancy terms extends further in the axial direction from the source than their influence on the time-averaged flow and second-order velocity statistics. We have studied the evolution of vortical structures in the buoyant jet. It has been shown that the generation of helical vortex rings in the vicinity of the source around a laminar core could be the reason for the larger influence of the inlet conditions on the second-order buoyancy terms as compared to the second-order velocity statistics.

  1. Field measurements and modeling to resolve m2 to km2 CH4 emissions for a complex urban source: An Indiana landfill study

    USDA-ARS?s Scientific Manuscript database

    Large uncertainties for landfill CH4 emissions due to spatial and temporal variabilities remain unresolved by short-term field campaigns and historic GHG inventory models. Using four field methods (aircraft-based mass balance, tracer correlation, vertical radial plume mapping, and static chambers) ...

  2. Toward an Integrated System of Income Acquisition and Management: Four Community College Responses.

    ERIC Educational Resources Information Center

    Birmingham, Kathryn M.

    This study argues that community college funding and resource development must become a long-term core function of the institution due to changes in the source of revenue for community colleges. The research problem was: (1) to identify and describe how organizational structure and management activities have changed in four community colleges in…

  3. FT-IR and C-13 NMR analysis of soil humic fractions from a long term cropping systems study

    USDA-ARS?s Scientific Manuscript database

    Increased knowledge of humic fractions is important due to their involvement in many soil ecosystem processes. Soil humic acid (HA) and fulvic acid (FA) from a nine-year agroecosystem study with different tillage, cropping system, and N source treatments were characterized using FT-IR and solid-state ...

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    J.C. Ryman

    This calculation is a revision of a previous calculation (Ref. 7.5) that bears the same title and has the document identifier BBAC00000-01717-0210-00006 REV 01. The purpose of this revision is to remove TBV (to-be-verified) -41 10 associated with the output files of the previous version (Ref. 7.30). The purpose of this and the previous calculation is to generate source terms for a representative boiling water reactor (BWR) spent nuclear fuel (SNF) assembly for the first one million years after the SNF is discharged from the reactors. This calculation includes an examination of several ways to represent BWR assemblies and operating conditions in SAS2H in order to quantify the effects these representations may have on source terms. These source terms provide information characterizing the neutron and gamma spectra in particles per second, the decay heat in watts, and radionuclide inventories in curies. Source terms are generated for a range of burnups and enrichments (see Table 2) that are representative of the waste stream and stainless steel (SS) clad assemblies. During this revision, it was determined that the burnups used for the computer runs of the previous revision were actually about 1.7% less than the stated, or nominal, burnups. See Section 6.6 for a discussion of how to account for this effect before using any source terms from this calculation. The source term due to the activation of corrosion products deposited on the surfaces of the assembly from the coolant is also calculated. The results of this calculation support many areas of the Monitored Geologic Repository (MGR), which include thermal evaluation, radiation dose determination, radiological safety analyses, surface and subsurface facility designs, and total system performance assessment. This includes MGR items classified as Quality Level 1, for example, the Uncanistered Spent Nuclear Fuel Disposal Container (Ref. 7.27, page 7). 
Therefore, this calculation is subject to the requirements of the Quality Assurance Requirements and Description (Ref. 7.28). The performance of the calculation and development of this document are carried out in accordance with AP-3.124, ''Design Calculation and Analyses'' (Ref. 7.29).

  5. Using the example of Istanbul to outline general aspects of protecting reservoirs, rivers and lakes used for drinking water abstraction.

    PubMed

    Tanik, A

    2000-01-01

    The six main drinking water reservoirs of Istanbul are under the threat of pollution due to rapid population increase, unplanned urbanisation and insufficient infrastructure. In contrast to the present land use profile, the environmental evaluation of the catchment areas reveals that point sources of pollutants, especially of domestic origin, dominate over diffuse sources. The water quality studies also support these findings, emphasising that if no substantial precautions are taken, there will be no possibility of obtaining drinking water from the reservoirs. In this paper, in light of the present status of the reservoirs, possible and probable short- and long-term protective measures are outlined for reducing the impact of point sources. Immediate precautions mostly depend on reducing the pollution arising from the existing settlements. Long-term measures mainly emphasise the preparation of new land use plans taking into consideration the protection of unoccupied lands. Recommendations on protection and control of the reservoirs are stated.

  6. A semi-empirical analysis of strong-motion peaks in terms of seismic source, propagation path, and local site conditions

    NASA Astrophysics Data System (ADS)

    Kamiyama, M.; Orourke, M. J.; Flores-Berrones, R.

    1992-09-01

    A new type of semi-empirical expression for scaling strong-motion peaks in terms of seismic source, propagation path, and local site conditions is derived. Peak acceleration, peak velocity, and peak displacement are analyzed in a similar fashion because they are interrelated. However, emphasis is placed on the peak velocity, which is a key ground motion parameter for lifeline earthquake engineering studies. With the help of seismic source theories, the semi-empirical model is derived using strong motions obtained in Japan. In the derivation, statistical considerations are used in the selection of the model itself and the model parameters. Earthquake magnitude M and hypocentral distance r are selected as independent variables, and dummy variables are introduced to identify the amplification factor due to individual local site conditions. The resulting semi-empirical expressions for the peak acceleration, velocity, and displacement are then compared with strong-motion data observed during three earthquakes in the U.S. and Mexico.

  7. Neutron crosstalk between liquid scintillators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Verbeke, J. M.; Prasad, M. K.; Snyderman, N. J.

    2015-05-01

    We propose a method to quantify the fractions of neutrons scattering between liquid scintillators. Using a spontaneous fission source, this method can be utilized to quickly characterize an array of liquid scintillators in terms of crosstalk. The point model theory due to Feynman is corrected to account for these multiple scatterings. Using spectral information measured by the liquid scintillators, fractions of multiple scattering can be estimated, and mass reconstruction of fissile materials under investigation can be improved. Monte Carlo simulations of mono-energetic neutron sources were performed to estimate neutron crosstalk. A californium source in an array of liquid scintillators was modeled to illustrate the improvement of the mass reconstruction.
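The Feynman point model referenced above is built on the excess variance-to-mean (Feynman-Y) statistic of gated neutron counts, which crosstalk biases upward because one neutron registered in several detectors mimics a correlated multiplet. A toy sketch of the statistic, with a deliberately crude double-counting model standing in for crosstalk (an illustration of the bias, not the paper's correction method):

```python
import numpy as np

def feynman_y(counts):
    """Feynman-Y (excess variance-to-mean) of neutron counts per time gate.
    Y = 0 for a pure Poisson source; correlated fission chains, and
    spurious crosstalk, push Y above zero."""
    c = np.asarray(counts, dtype=float)
    return c.var() / c.mean() - 1.0

rng = np.random.default_rng(0)
poisson = rng.poisson(lam=4.0, size=200_000)   # uncorrelated source
print(round(feynman_y(poisson), 3))            # ~ 0

# Crude crosstalk model: each detected neutron is double-counted with
# probability p, mimicking a scatter into a second scintillator
p = 0.1
crosstalk = poisson + rng.binomial(poisson, p)
print(round(feynman_y(crosstalk), 3))          # clearly > 0
```

For this compound model the bias is analytic, Y = 2p/(1+p), which is why even modest crosstalk fractions matter for mass reconstruction.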

  8. Reducing mortality risk by targeting specific air pollution sources: Suva, Fiji.

    PubMed

    Isley, C F; Nelson, P F; Taylor, M P; Stelcer, E; Atanacio, A J; Cohen, D D; Mani, F S; Maata, M

    2018-01-15

    Health implications of air pollution vary depending upon the pollutant sources. This work determines the value, in terms of reduced mortality, of reducing the ambient particulate matter (PM2.5: effective aerodynamic diameter 2.5 μm or less) concentration due to different emission sources. Suva, a Pacific Island city with substantial input from combustion sources, is used as a case study. Elemental concentration was determined, by ion beam analysis, for PM2.5 samples from Suva spanning one year. Sources of PM2.5 were quantified by positive matrix factorisation. A review of recent literature was carried out to delineate the mortality risk associated with these sources. Risk factors were then applied for Suva to calculate the possible mortality reduction that may be achieved through reduction in pollutant levels. Higher risk ratios for black carbon and sulphur resulted in mortality predictions for PM2.5 from fossil fuel combustion, road vehicle emissions and waste burning that surpass predictions for these sources based on the health risk of PM2.5 mass alone. Predicted mortality for Suva from fossil fuel smoke exceeds the national toll from road accidents in Fiji. The greatest benefit for Suva, in terms of reduced mortality, is likely to be accomplished by reducing emissions from fossil fuel combustion (diesel), vehicles and waste burning. Copyright © 2017. Published by Elsevier B.V.

  9. Comparison of the Chernobyl and Fukushima nuclear accidents: a review of the environmental impacts.

    PubMed

    Steinhauser, Georg; Brandl, Alexander; Johnson, Thomas E

    2014-02-01

    The environmental impacts of the nuclear accidents of Chernobyl and Fukushima are compared. In almost every respect, the consequences of the Chernobyl accident clearly exceeded those of the Fukushima accident. In both accidents, most of the radioactivity released was due to volatile radionuclides (noble gases, iodine, cesium, tellurium). However, the amount of refractory elements (including actinides) emitted in the course of the Chernobyl accident was approximately four orders of magnitude higher than during the Fukushima accident. For Chernobyl, a total release of 5,300 PBq (excluding noble gases) has been established as the most cited source term. For Fukushima, we estimated a total source term of 520 (340-800) PBq. In the course of the Fukushima accident, the majority of the radionuclides (more than 80%) was transported offshore and deposited in the Pacific Ocean. Monitoring campaigns after both accidents reveal that the environmental impact of the Chernobyl accident was much greater than of the Fukushima accident. Both the highly contaminated areas and the evacuated areas are smaller around Fukushima and the projected health effects in Japan are significantly lower than after the Chernobyl accident. This is mainly due to the fact that food safety campaigns and evacuations worked quickly and efficiently after the Fukushima accident. In contrast to Chernobyl, no fatalities due to acute radiation effects occurred in Fukushima. © 2013.

  10. Algorithms and analytical solutions for rapidly approximating long-term dispersion from line and area sources

    NASA Astrophysics Data System (ADS)

    Barrett, Steven R. H.; Britter, Rex E.

    Predicting long-term mean pollutant concentrations in the vicinity of airports, roads and other industrial sources is frequently of concern in regulatory and public health contexts. Many emissions are represented geometrically as ground-level line or area sources. Well-developed modelling tools such as AERMOD and ADMS are able to model dispersion from finite (i.e. non-point) sources with considerable accuracy, drawing upon an up-to-date understanding of boundary layer behaviour. Due to mathematical difficulties associated with line and area sources, computationally expensive numerical integration schemes have been developed. For example, some models decompose area sources into a large number of line sources orthogonal to the mean wind direction, for which an analytical (Gaussian) solution exists. Models also employ a time-series approach, which involves computing mean pollutant concentrations for every hour over one or more years of meteorological data. This can give rise to computer runtimes of several days for the assessment of a site. While this may be acceptable for the assessment of a single industrial complex, airport, etc., this level of computational cost precludes national or international policy assessments at the level of detail available with dispersion modelling. In this paper, we extend previous work [S.R.H. Barrett, R.E. Britter, 2008. Development of algorithms and approximations for rapid operational air quality modelling. Atmospheric Environment 42 (2008) 8105-8111] to line and area sources. We introduce approximations which allow for the development of new analytical solutions for long-term mean dispersion from line and area sources, based on hypergeometric functions. We describe how these solutions can be parameterized from a single point-source run of an existing advanced dispersion model, thereby accounting for all processes modelled in the more costly algorithms. 
The parameterization method combined with the analytical solutions for long-term mean dispersion is shown to produce results several orders of magnitude more efficiently, with a loss of accuracy that is small compared to the absolute accuracy of advanced dispersion models near sources. The method can be readily incorporated into existing dispersion models, and may allow additional computation time to be spent on modelling dispersion processes more accurately in future, rather than on accounting for source geometry.
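The analytical crosswind line-source solution that this record refers to can be sketched directly. The snippet below is an illustration, not the authors' hypergeometric formulation: it checks the closed-form (erf-based) ground-level solution for a finite crosswind line source against a brute-force decomposition of the line into Gaussian point sources, which is exactly the decomposition the abstract describes as computationally expensive. All parameter values are invented for the example.

```python
import math

def point_conc(Q, u, y, sigma_y, sigma_z, H=0.0):
    # Ground-level (z = 0) concentration downwind of a Gaussian point
    # source at crosswind offset y, with full ground reflection.
    # sigma_y, sigma_z are the plume spreads at the receptor distance.
    return (Q / (math.pi * u * sigma_y * sigma_z)
            * math.exp(-y * y / (2.0 * sigma_y ** 2))
            * math.exp(-H * H / (2.0 * sigma_z ** 2)))

def line_conc_analytic(q, u, y, L, sigma_y, sigma_z, H=0.0):
    # Closed-form result of integrating point_conc along a finite
    # crosswind line source of strength q (mass/length/time) spanning
    # y' in [-L/2, L/2]; the crosswind Gaussian integrates to erf terms.
    s = math.sqrt(2.0) * sigma_y
    bracket = math.erf((L / 2.0 - y) / s) + math.erf((L / 2.0 + y) / s)
    return (q / (math.sqrt(2.0 * math.pi) * u * sigma_z)
            * math.exp(-H * H / (2.0 * sigma_z ** 2)) * bracket)

def line_conc_numeric(q, u, y, L, sigma_y, sigma_z, H=0.0, n=4000):
    # Brute-force decomposition of the line into n point sources
    # (midpoint rule), the costly approach the analytic form replaces.
    dy = L / n
    total = 0.0
    for j in range(n):
        yj = -L / 2.0 + (j + 0.5) * dy
        total += point_conc(q * dy, u, y - yj, sigma_y, sigma_z, H)
    return total

args = dict(q=1.0, u=5.0, y=10.0, L=100.0, sigma_y=20.0, sigma_z=10.0)
c_exact = line_conc_analytic(**args)
c_sum = line_conc_numeric(**args)
```

The two estimates agree closely, while the analytic form costs two erf evaluations instead of thousands of exponentials per receptor.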

  11. Aerosol contribution to the rapid warming of near-term climate under RCP 2.6

    NASA Astrophysics Data System (ADS)

    Chalmers, N.; Highwood, E. J.; Hawkins, E.; Sutton, R.; Wilcox, L. J.

    2012-09-01

The importance of aerosol emissions for near-term climate projections is investigated by analysing simulations with the HadGEM2-ES model under two different emissions scenarios: RCP2.6 and RCP4.5. It is shown that the near-term warming projected under RCP2.6 is greater than under RCP4.5, even though the greenhouse gas forcing is lower. Rapid and substantial reductions in sulphate aerosol emissions due to a reduction of coal burning in RCP2.6 lead to a reduction in the negative shortwave forcing due to aerosol direct and indirect effects. Indirect effects play an important role over the northern hemisphere oceans, especially the subtropical northeastern Pacific, where an anomaly of 5-10 W m-2 develops. The pattern of surface temperature change is consistent with the expected response to this surface radiation anomaly, whilst also exhibiting features that reflect redistribution of energy, and feedbacks, within the climate system. These results demonstrate the importance of aerosol emissions as a key source of uncertainty in near-term projections of global and regional climate.

  12. Multisource Estimation of Long-term Global Terrestrial Surface Radiation

    NASA Astrophysics Data System (ADS)

    Peng, L.; Sheffield, J.

    2017-12-01

Land surface net radiation is the essential energy source at the earth's surface. It determines the surface energy budget and its partitioning, drives the hydrological cycle by providing available energy, and offers heat, light, and energy for biological processes. Individual components of net radiation have changed historically due to natural and anthropogenic climate change and land use change. Decadal variations in radiation such as global dimming or brightening have important implications for the hydrological and carbon cycles. In order to assess the trends and variability of net radiation and evapotranspiration, there is a need for accurate estimates of long-term terrestrial surface radiation. While large progress has been made in measuring the top-of-atmosphere energy budget, huge discrepancies exist among ground observations, satellite retrievals, and reanalysis fields of surface radiation, due to the lack of observational networks, the difficulty of measuring from space, and the uncertainty in algorithm parameters. To overcome the weaknesses of single-source datasets, we propose a multi-source merging approach to fully utilize and combine multiple datasets of the individual radiation components separately, as they are complementary in space and time. First, we conduct diagnostic analysis of multiple satellite and reanalysis datasets based on in-situ measurements such as the Global Energy Balance Archive (GEBA), existing validation studies, and other information such as network density and consistency with other meteorological variables. Then, we calculate the optimal weighted average of the multiple datasets by minimizing the variance of the error between in-situ measurements and other observations. Finally, we quantify the uncertainties in the estimates of surface net radiation and employ physical constraints based on the surface energy balance to reduce these uncertainties. 
The final dataset is evaluated in terms of the long-term variability and its attribution to changes in individual components. The goal of this study is to provide a merged observational benchmark for large-scale diagnostic analyses, remote sensing and land surface modeling.
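The "optimal weighted average ... minimizing the variance of the error" step corresponds, in its simplest independent-error form, to inverse-variance weighting. A minimal sketch follows; the numbers are illustrative, not taken from any of the datasets named in the record.

```python
def merge_estimates(values, variances):
    # Minimum-variance weighted average of independent, unbiased
    # estimates of the same quantity: weights are proportional to the
    # inverse of each estimate's error variance, and the merged
    # estimate's variance is the harmonic combination 1 / sum(1/var_i).
    inv = [1.0 / v for v in variances]
    norm = sum(inv)
    weights = [w / norm for w in inv]
    merged = sum(w, 0.0) if False else sum(w * x for w, x in zip(weights, values))
    merged_var = 1.0 / norm
    return merged, merged_var, weights

# e.g. three net-radiation estimates (W m^-2) for one grid cell from a
# satellite product, a reanalysis, and a station-calibrated product,
# with assumed error variances (all numbers hypothetical):
merged, merged_var, weights = merge_estimates([105.0, 98.0, 101.0],
                                              [25.0, 16.0, 9.0])
```

The merged variance is always smaller than the best single input, which is the formal sense in which the sources are "complementary".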

  13. High current liquid metal ion source using porous tungsten multiemitters.

    PubMed

    Tajmar, M; Vasiljevich, I; Grienauer, W

    2010-12-01

    We recently developed an indium Liquid-Metal-Ion-Source that can emit currents from sub-μA up to several mA. It is based on a porous tungsten crown structure with 28 individual emitters, which is manufactured using Micro-Powder Injection Molding (μPIM) and electrochemical etching. The emitter combines the advantages of internal capillary feeding with excellent emission properties due to micron-size tips. Significant progress was made on the homogeneity of the emission over its current-voltage characteristic as well as on investigating its long-term stability. This LMIS seems very suitable for space propulsion as well as for micro/nano manufacturing applications with greatly increased milling/drilling speeds. This paper summarizes the latest developments on our porous multiemitters with respect to manufacturing, emission properties and long-term testing. Copyright © 2010 Elsevier B.V. All rights reserved.

  14. A high accuracy sequential solver for simulation and active control of a longitudinal combustion instability

    NASA Technical Reports Server (NTRS)

    Shyy, W.; Thakur, S.; Udaykumar, H. S.

    1993-01-01

A high accuracy convection scheme using a sequential solution technique has been developed and applied to simulate the longitudinal combustion instability and its active control. The scheme has been devised in the spirit of the Total Variation Diminishing (TVD) concept with special source term treatment. Due to the substantial heat release effect, a clear delineation has been made of the key elements employed by the scheme, i.e., the adjustable damping factor and the source term treatment. Compared with the first-order upwind scheme previously utilized, the present results exhibit less damping and are free from spurious oscillations, offering improved quantitative accuracy while confirming the spectral analysis reported earlier. A simple feedback type of active control has been found to be capable of enhancing or attenuating the magnitude of the combustion instability.
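The kind of scheme this record describes, a TVD-style limited convection step combined with a dedicated source-term treatment, can be sketched generically. This is not the authors' solver: it uses a minmod-limited second-order upwind flux for linear advection and, as a stand-in for "special source term treatment", integrates a simple linear sink term exactly in an operator-split step.

```python
import math

def minmod(a, b):
    # Slope limiter: returns zero at extrema, which is what suppresses
    # the spurious oscillations of unlimited second-order schemes.
    if a * b <= 0.0:
        return 0.0
    return min(abs(a), abs(b)) * (1.0 if a > 0 else -1.0)

def step(u, c, k, dt):
    # One step of linear advection (wave speed > 0, CFL number c) with a
    # minmod-limited second-order upwind flux, followed by exact
    # integration of the split source equation du/dt = -k*u.
    n = len(u)
    flux = [0.0] * n                      # flux[i] lives at interface i+1/2
    for i in range(n):
        du_left = u[i] - u[i - 1]         # periodic via negative indexing
        du_right = u[(i + 1) % n] - u[i]
        flux[i] = u[i] + 0.5 * (1.0 - c) * minmod(du_left, du_right)
    decay = math.exp(-k * dt)             # exact solution of the sink term
    return [(u[i] - c * (flux[i] - flux[i - 1])) * decay for i in range(n)]

def total_variation(u):
    return sum(abs(u[i] - u[i - 1]) for i in range(len(u)))

# Advect a square wave: a TVD scheme must create no new extrema.
u = [1.0 if 20 <= i < 40 else 0.0 for i in range(100)]
tv0 = total_variation(u)
for _ in range(200):
    u = step(u, c=0.5, k=0.01, dt=0.1)
```

The assertions below check the TVD property (no overshoot below 0 or above 1, and no growth in total variation), which is exactly the "free from spurious oscillations" behaviour the abstract contrasts with first-order damping.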

  15. Re-addressable Interconnects with Light-Induced Waveguides in Liquid Crystals

    DTIC Science & Technology

    2011-08-09

15. SUBJECT TERMS: EOARD, Liquid Crystals, Laser beam control. …external stimuli, their performance is far from optimal: their response time can be larger than 100 ms and they exhibit transverse fluctuations due…

  16. Effects of foliar and tuber sprout suppressants on shelf life of ware potatoes under tropical ambient conditions

    USDA-ARS?s Scientific Manuscript database

    Potato (Solanum tuberosum) is an important source of dietary carbohydrate and cash income for farmers in the tropical highlands of Kenya. The feasibility for cold storage at the farm level is limited due to the high costs of maintaining such a facility and there is limited data on the long-term post...

  17. Long-term enrichment of the stable isotopic composition of stream water due to the release of groundwater recharge from extreme precipitation events

    NASA Astrophysics Data System (ADS)

    Boutt, D. F.

    2017-12-01

The isotopic composition of surface and groundwater is impacted by a multitude of hydrologic processes. The long-term response of these systems to hydrologic change is critical for appropriately interpreting isotopic information for streamflow generation, stream-aquifer coupling, sources of water to wells, and understanding recharge processes. To evaluate the response time of stream-aquifer systems to extreme precipitation events we use a long-term isotope dataset from Western Massachusetts with drainage areas ranging from 0.1 to > 800 km2. The year 2011 was the wettest calendar year on record, and August and September of 2011 were the wettest consecutive two-month period in the 123-year record. The stable isotopic composition of surface waters of catchments ranging from 1-1000 km2 shows an enrichment due to summertime and Tropical Storm precipitation. Enrichment in potential recharge water is shown to have a significant long-term impact (> 3 hydrologic years) on the isotopic composition of both surface and groundwater. This highlights the importance of groundwater sources of baseflow to streams and the transient storage and release mechanisms of shallow groundwater storage. The length of the isotopic recession of stream water is also a strong function of watershed area. It is concluded that the stream water isotopes are consistent with a large pulse of water being stored and released from enriched groundwater emplaced during this period of above-average precipitation. Ultimately the results point to the importance of considering hydrological processes of streamflow generation and their role in hydrologic processes beyond traditional catchment response analysis.

  18. Detecting the permafrost carbon feedback: talik formation and increased cold-season respiration as precursors to sink-to-source transitions

    DOE PAGES

    Parazoo, Nicholas C.; Koven, Charles D.; Lawrence, David M.; ...

    2018-01-12

Thaw and release of permafrost carbon (C) due to climate change is likely to offset increased vegetation C uptake in northern high-latitude (NHL) terrestrial ecosystems. Models project that this permafrost C feedback may act as a slow leak, in which case detection and attribution of the feedback may be difficult. The formation of talik, a subsurface layer of perennially thawed soil, can accelerate permafrost degradation and soil respiration, ultimately shifting the C balance of permafrost-affected ecosystems from long-term C sinks to long-term C sources. It is imperative to understand and characterize mechanistic links between talik, permafrost thaw, and respiration of deep soil C to detect and quantify the permafrost C feedback. Here, we use the Community Land Model (CLM) version 4.5, a permafrost and biogeochemistry model, in comparison to long-term deep borehole data along North American and Siberian transects, to investigate thaw-driven C sources in NHL (> 55°N) from 2000 to 2300. Widespread talik at depth is projected across most of the NHL permafrost region (14 million km2) by 2300, 6.2 million km2 of which is projected to become a long-term C source, emitting 10 Pg C by 2100, 50 Pg C by 2200, and 120 Pg C by 2300, with few signs of slowing. Roughly half of the projected C source region is in predominantly warm sub-Arctic permafrost following talik onset. This region emits only 20 Pg C by 2300, but the CLM4.5 estimate may be biased low by not accounting for deep C in yedoma. Accelerated decomposition of deep soil C following talik onset shifts the ecosystem C balance away from surface-dominant processes (photosynthesis and litter respiration), but sink-to-source transition dates are delayed by 20–200 years by high ecosystem productivity, such that talik peaks early (~2050s, although borehole data suggest sooner) and the C source transition peaks late (~2150–2200). The remaining C source region, in cold northern Arctic permafrost, shifts to a net source early (late 21st century) and emits 5 times more C (95 Pg C) by 2300, prior to talik formation, due to the high decomposition rates of shallow, young C in organic-rich soils coupled with low productivity. Our results provide important clues signaling imminent talik onset and C source transition, including (1) late cold-season (January–February) soil warming at depth (~2 m), (2) increasing cold-season emissions (November–April), and (3) enhanced respiration of deep, old C in warm permafrost and young, shallow C in organic-rich cold permafrost soils. Our results suggest a mosaic of processes that govern carbon source-to-sink transitions at high latitudes and emphasize the urgency of monitoring soil thermal profiles, organic C age and content, cold-season CO2 emissions, and atmospheric 14CO2 as key indicators of the permafrost C feedback.

  19. Detecting the permafrost carbon feedback: talik formation and increased cold-season respiration as precursors to sink-to-source transitions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parazoo, Nicholas C.; Koven, Charles D.; Lawrence, David M.

Thaw and release of permafrost carbon (C) due to climate change is likely to offset increased vegetation C uptake in northern high-latitude (NHL) terrestrial ecosystems. Models project that this permafrost C feedback may act as a slow leak, in which case detection and attribution of the feedback may be difficult. The formation of talik, a subsurface layer of perennially thawed soil, can accelerate permafrost degradation and soil respiration, ultimately shifting the C balance of permafrost-affected ecosystems from long-term C sinks to long-term C sources. It is imperative to understand and characterize mechanistic links between talik, permafrost thaw, and respiration of deep soil C to detect and quantify the permafrost C feedback. Here, we use the Community Land Model (CLM) version 4.5, a permafrost and biogeochemistry model, in comparison to long-term deep borehole data along North American and Siberian transects, to investigate thaw-driven C sources in NHL (> 55°N) from 2000 to 2300. Widespread talik at depth is projected across most of the NHL permafrost region (14 million km2) by 2300, 6.2 million km2 of which is projected to become a long-term C source, emitting 10 Pg C by 2100, 50 Pg C by 2200, and 120 Pg C by 2300, with few signs of slowing. Roughly half of the projected C source region is in predominantly warm sub-Arctic permafrost following talik onset. This region emits only 20 Pg C by 2300, but the CLM4.5 estimate may be biased low by not accounting for deep C in yedoma. Accelerated decomposition of deep soil C following talik onset shifts the ecosystem C balance away from surface-dominant processes (photosynthesis and litter respiration), but sink-to-source transition dates are delayed by 20–200 years by high ecosystem productivity, such that talik peaks early (~2050s, although borehole data suggest sooner) and the C source transition peaks late (~2150–2200). The remaining C source region, in cold northern Arctic permafrost, shifts to a net source early (late 21st century) and emits 5 times more C (95 Pg C) by 2300, prior to talik formation, due to the high decomposition rates of shallow, young C in organic-rich soils coupled with low productivity. Our results provide important clues signaling imminent talik onset and C source transition, including (1) late cold-season (January–February) soil warming at depth (~2 m), (2) increasing cold-season emissions (November–April), and (3) enhanced respiration of deep, old C in warm permafrost and young, shallow C in organic-rich cold permafrost soils. Our results suggest a mosaic of processes that govern carbon source-to-sink transitions at high latitudes and emphasize the urgency of monitoring soil thermal profiles, organic C age and content, cold-season CO2 emissions, and atmospheric 14CO2 as key indicators of the permafrost C feedback.

  20. Spatial & temporal variations of PM10 and particle number concentrations in urban air.

    PubMed

    Johansson, Christer; Norman, Michael; Gidhagen, Lars

    2007-04-01

The size of particles in urban air varies over four orders of magnitude (from 0.001 µm to 10 µm in diameter). In many cities only particle mass concentrations (PM10, i.e. particles <10 µm in diameter) are measured. In this paper we analyze how differences in emissions, background concentrations and meteorology affect the temporal and spatial distribution of PM10 and total particle number concentrations (PNC), based on measurements and dispersion modeling in Stockholm, Sweden. PNC at densely trafficked kerbside locations are dominated by ultrafine particles (<0.1 µm in diameter) due to vehicle exhaust emissions, as verified by high correlation with NOx. But PNC contribute only marginally to PM10, due to the small size of exhaust particles. Instead, wear of the road surface is an important factor for the highest PM10 concentrations observed. In Stockholm, road wear increases drastically due to the use of studded tires and traction sand on streets during winter; up to 90% of the locally emitted PM10 may be due to road abrasion. PM10 emissions and concentrations at kerbside, but not PNC, are controlled by road moisture. Annual mean urban background PM10 levels are relatively uniformly distributed over the city, due to the importance of long-range transport. For PNC, local sources often dominate the concentrations, resulting in large temporal and spatial gradients. Despite these differences in the origin of PM10 and PNC, the spatial gradients of annual mean concentrations due to local sources are of equal magnitude because of the common source, namely traffic. Thus, people in different areas experiencing a factor of 2 difference in annual PM10 exposure due to local sources will also experience a factor of 2 difference in exposure in terms of PNC. This implies that health impact studies based solely on spatial differences in annual exposure to PM10 may not separate differences in health effects due to ultrafine and coarse particles. On the other hand, health effect assessments based on time-series exposure analysis of PM10 and PNC should be able to observe differences in the health effects of ultrafine particles versus coarse particles.

  1. The use of the virtual source technique in computing scattering from periodic ocean surfaces.

    PubMed

    Abawi, Ahmad T

    2011-08-01

In this paper the virtual source technique is used to compute scattering of a plane wave from a periodic ocean surface. The virtual source technique is a method of imposing boundary conditions using virtual sources with initially unknown complex amplitudes. These amplitudes are then determined by applying the boundary conditions. The fields due to these virtual sources are given by the environment Green's function. In principle, satisfying boundary conditions on an infinite surface requires an infinite number of sources. In this paper, the periodic nature of the surface is employed to populate a single period of the surface with virtual sources, and m surface periods are added to obtain scattering from the entire surface. The use of an accelerated sum formula makes it possible to obtain a convergent sum with a relatively small number of terms (∼40). The accuracy of the technique is verified by comparing its results with those obtained using the integral equation technique.
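The core of the virtual source idea, collocating the boundary condition to solve a linear system for the unknown source amplitudes, can be sketched on a simpler model problem. The example below uses the 2-D Laplace equation on a circle (the method of fundamental solutions) rather than the paper's periodic Helmholtz problem and accelerated sums; the geometry, source placement and test field are all invented for illustration.

```python
import math

def solve(A, b):
    # Dense Gaussian elimination with partial pivoting.
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        p = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[p] = M[p], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

n = 24
# Collocation points on the unit circle (the "surface") and virtual
# sources placed on a larger circle outside the solution domain.
bpts = [(math.cos(2 * math.pi * i / n), math.sin(2 * math.pi * i / n))
        for i in range(n)]
spts = [(2.0 * math.cos(2 * math.pi * (i + 0.5) / n),
         2.0 * math.sin(2 * math.pi * (i + 0.5) / n)) for i in range(n)]

def G(p, s):
    # Free-space Green's function of the 2-D Laplace equation
    # (constants absorbed into the source amplitudes).
    return math.log(math.hypot(p[0] - s[0], p[1] - s[1]))

exact = lambda p: p[0]                  # harmonic test field u = x
A = [[G(p, s) for s in spts] for p in bpts]
b = [exact(p) for p in bpts]
amps = solve(A, b)                      # amplitudes from the boundary condition

p0 = (0.3, 0.2)                         # interior evaluation point
u0 = sum(a * G(p0, s) for a, s in zip(amps, spts))
```

Once the amplitudes are known, the field anywhere is just a weighted sum of Green's functions, which is what makes the technique attractive for scattering computations.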

  2. Experimental study of the thermal-acoustic efficiency in a long turbulent diffusion-flame burner

    NASA Technical Reports Server (NTRS)

    Mahan, J. R.

    1983-01-01

    A two-year study of noise production in a long tubular burner is described. The research was motivated by an interest in understanding and eventually reducing core noise in gas turbine engines. The general approach is to employ an acoustic source/propagation model to interpret the sound pressure spectrum in the acoustic far field of the burner in terms of the source spectrum that must have produced it. In the model the sources are assumed to be due uniquely to the unsteady component of combustion heat release; thus only direct combustion-noise is considered. The source spectrum is then the variation with frequency of the thermal-acoustic efficiency, defined as the fraction of combustion heat release which is converted into acoustic energy at a given frequency. The thrust of the research was to study the variation of the source spectrum with the design and operating parameters of the burner.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Freeze, R.A.

Many emerging remediation technologies are designed to remove contaminant mass from source zones at DNAPL sites in response to regulatory requirements. There is often concern in the regulated community as to whether mass removal actually reduces risk, or whether the small risk reductions achieved warrant the large costs incurred. This paper sets out a framework for quantifying the degree to which risk is reduced as mass is removed from shallow, saturated, low-permeability, dual-porosity DNAPL source zones. Risk is defined in terms of meeting an alternate concentration level (ACL) at a compliance well in an aquifer underlying the source zone. The ACL is back-calculated from a carcinogenic health-risk characterization at a downstream water-supply well. Source-zone mass-removal efficiencies are heavily dependent on the distribution of mass between media (fractures, matrix) and phases (dissolved, sorbed, free product). Due to the uncertainties in currently available technology performance data, the scope of the paper is limited to developing a framework for generic technologies rather than making risk-reduction calculations for specific technologies. Despite the qualitative nature of the exercise, results imply that very high mass-removal efficiencies are required to achieve significant long-term risk reduction with technology applications of finite duration. 17 refs., 7 figs., 6 tabs.
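The back-calculation of an ACL from a downstream health-risk characterization can be sketched with the standard chronic-daily-intake form of the carcinogenic risk equation. Everything below is a generic illustration, not the paper's site data: the exposure parameters are common risk-assessment defaults, and the slope factor and dilution/attenuation factor are hypothetical.

```python
def acl_from_target_risk(target_risk, SF, IR=2.0, EF=350.0, ED=30.0,
                         BW=70.0, AT=70.0 * 365.0, DAF=10.0):
    # Back-calculate an alternate concentration level (ACL, mg/L) at a
    # compliance well from a target carcinogenic risk at a downstream
    # water-supply well, using the standard intake form:
    #   risk = C_supply * IR * EF * ED / (BW * AT) * SF
    # IR: ingestion rate (L/day), EF: exposure frequency (days/yr),
    # ED: exposure duration (yr), BW: body weight (kg),
    # AT: averaging time (days), SF: slope factor ((mg/kg-day)^-1),
    # DAF: dilution/attenuation factor between the two wells.
    c_supply = target_risk * BW * AT / (IR * EF * ED * SF)
    return c_supply * DAF

# Hypothetical slope factor of 0.05 (mg/kg-day)^-1, target risk 1e-6:
acl = acl_from_target_risk(target_risk=1e-6, SF=0.05)
```

Running the intake equation forward from the derived ACL (divided by the DAF) recovers the target risk, which is the consistency check a regulator would apply.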

  4. Glacial influence on the geochemistry of riverine iron fluxes to the Gulf of Alaska and effects of deglaciation

    USGS Publications Warehouse

    Schroth, A.W.; Crusius, John; Chever, F.; Bostick, B.C.; Rouxel, O.J.

    2011-01-01

    Riverine iron (Fe) derived from glacial weathering is a critical micronutrient source to ecosystems of the Gulf of Alaska (GoA). Here we demonstrate that the source and chemical nature of riverine Fe input to the GoA could change dramatically due to the widespread watershed deglaciation that is underway. We examine Fe size partitioning, speciation, and isotopic composition in tributaries of the Copper River which exemplify a long-term GoA watershed evolution from one strongly influenced by glacial weathering to a boreal-forested watershed. Iron fluxes from glacierized tributaries bear high suspended sediment and colloidal Fe loads of mixed valence silicate species, with low concentrations of dissolved Fe and dissolved organic carbon (DOC). Iron isotopic composition is indicative of mechanical weathering as the Fe source. Conversely, Fe fluxes from boreal-forested systems have higher dissolved Fe concentrations corresponding to higher DOC concentrations. Iron colloids and suspended sediment consist of Fe (hydr)oxides and organic complexes. These watersheds have an iron isotopic composition indicative of an internal chemical processing source. We predict that as the GoA watershed evolves due to deglaciation, so will the source, flux, and chemical nature of riverine Fe loads, which could have significant ramifications for Alaskan marine and freshwater ecosystems.

  5. Changing sources and environmental factors reduce the rates of decline of organochlorine pesticides in the Arctic Atmosphere

    NASA Astrophysics Data System (ADS)

    Becker, S.; Halsall, C. J.; Tych, W.; Kallenborn, R.; Schlabach, M.; Manø, S.

    2009-01-01

An extensive database of organochlorine (OC) pesticide concentrations measured at the Norwegian Arctic Monitoring Station was analysed to assess longer-term trends in the Arctic atmosphere. Dynamic Harmonic Regression (DHR) is employed to investigate the seasonal and cyclical behaviour of chlordanes, DDTs and hexachlorobenzene (HCB), and to isolate underlying inter-annual trends. Although a simple comparison of annual mean concentrations (1994-2005) suggests a decline for all of the OCs investigated, the longer-term trends identified by DHR only show a significant decline for p,p'-DDT. Indeed, HCB shows an increase from 2003-2005. This is thought to be due to changes in source types and the presence of impurities in current-use pesticides, together with retreating sea ice affecting air-water exchange. Changes in source types were revealed by using isomeric ratios for the chlordanes and DDTs. Declining trends in ratios of trans-chlordane/cis-chlordane (TC/CC) indicate a shift from primary sources to more "weathered" secondary sources, whereas an increasing trend in o,p'-DDT/p,p'-DDT ratios indicates a shift from use of technical DDT to dicofol. Continued monitoring of these OC pesticides is required to fully understand the influence of a changing climate on the behaviour and environmental cycling of these chemicals in the Arctic, as well as possible impacts from "new" sources.
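DHR estimates time-varying trend and seasonal components; a fixed-coefficient harmonic regression gives the flavour of separating an inter-annual trend from a seasonal cycle. The sketch below fits trend-plus-annual-harmonic terms to a synthetic monthly series by ordinary least squares; the data are invented, not the monitoring-station record, and real DHR lets the coefficients evolve in time.

```python
import math

def lstsq(X, y):
    # Ordinary least squares via the normal equations X^T X b = X^T y,
    # solved by Gaussian elimination with partial pivoting.
    k = len(X[0])
    A = [[sum(X[r][i] * X[r][j] for r in range(len(X))) for j in range(k)]
         for i in range(k)]
    b = [sum(X[r][i] * y[r] for r in range(len(X))) for i in range(k)]
    for col in range(k):
        p = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[p] = A[p], A[col]
        b[col], b[p] = b[p], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            b[r] -= f * b[col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):
        beta[r] = (b[r] - sum(A[r][c] * beta[c]
                              for c in range(r + 1, k))) / A[r][r]
    return beta

# Synthetic monthly ln-concentration: linear decline + annual cycle.
t = [m / 12.0 for m in range(144)]          # 12 years, time in years
y = [2.0 - 0.05 * ti + 0.4 * math.cos(2 * math.pi * ti) for ti in t]
X = [[1.0, ti, math.cos(2 * math.pi * ti), math.sin(2 * math.pi * ti)]
     for ti in t]
intercept, slope, c_cos, c_sin = lstsq(X, y)
```

The fitted `slope` is the underlying inter-annual trend with the seasonal cycle removed, which is the quantity the annual-mean comparison in the abstract can misjudge.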

  6. Changing sources and environmental factors reduce the rates of decline of organochlorine pesticides in the Arctic atmosphere

    NASA Astrophysics Data System (ADS)

    Becker, S.; Halsall, C. J.; Tych, W.; Kallenborn, R.; Schlabach, M.; Manø, S.

    2012-05-01

An extensive database of organochlorine (OC) pesticide concentrations measured at the Norwegian Arctic monitoring station at Ny-Ålesund, Svalbard, was analysed to assess longer-term trends in the Arctic atmosphere. Dynamic Harmonic Regression (DHR) is employed to investigate the seasonal and cyclical behaviour of chlordanes, DDTs and hexachlorobenzene (HCB), and to isolate underlying inter-annual trends. Although a simple comparison of annual mean concentrations (1994-2005) suggests a decline for all of the OCs investigated, the longer-term trends identified by DHR only show a significant decline for p,p'-DDT. Indeed, HCB shows an increase from 2003-2005. This is thought to be due to changes in source types and the presence of impurities in current-use pesticides, together with retreating sea ice affecting air-water exchange. Changes in source types were revealed by using isomeric ratios for the chlordanes and DDTs. Declining trends in ratios of trans-chlordane/cis-chlordane (TC/CC) indicate a shift from primary sources to more "weathered" secondary sources, whereas an increasing trend in o,p'-DDT/p,p'-DDT ratios indicates a shift from use of technical DDT to dicofol. Continued monitoring of these OC pesticides is required to fully understand the influence of a changing climate on the behaviour and environmental cycling of these chemicals in the Arctic, as well as possible impacts from "new" sources.

  7. Short-term microbial release during rain events from on-site sewers and cattle in a surface water source.

    PubMed

    Aström, Johan; Pettersson, Thomas J R; Reischer, Georg H; Hermansson, Malte

    2013-09-01

The protection of drinking water from pathogens such as Cryptosporidium and Giardia requires an understanding of the short-term microbial release from faecal contamination sources in the catchment. Flow-weighted samples were collected during two rainfall events in a stream draining an area with on-site sewers and during two rainfall events in surface runoff from a bovine cattle pasture. Samples were analysed for human (BacH) and ruminant (BacR) Bacteroidales genetic markers through quantitative polymerase chain reaction (qPCR) and for sorbitol-fermenting bifidobacteria through culturing, as a complement to traditional faecal indicator bacteria, somatic coliphages and the parasitic protozoa Cryptosporidium spp. and Giardia spp. analysed by standard methods. Significant positive correlations were observed between BacH, Escherichia coli, intestinal enterococci, sulphite-reducing Clostridia, turbidity, conductivity and UV254 in the stream contaminated by on-site sewers. For the cattle pasture, no correlation was found between any of the genetic markers and the other parameters. Although parasitic protozoa were not detected, the analysis for genetic markers provided baseline data on the short-term faecal contamination due to these potential sources of parasites. Background levels of BacH and BacR markers in soil emphasise the need to include soil reference samples in qPCR-based analyses for Bacteroidales genetic markers.

  8. Nonlocal effects in nonisothermal hydrodynamics from the perspective of beyond-equilibrium thermodynamics.

    PubMed

    Hütter, Markus; Brader, Joseph M

    2009-06-07

    We examine the origins of nonlocality in a nonisothermal hydrodynamic formulation of a one-component fluid of particles that exhibit long-range correlations, e.g., due to a spherically symmetric, long-range interaction potential. In order to furnish the continuum modeling with physical understanding of the microscopic interactions and dynamics, we make use of systematic coarse graining from the microscopic to the continuum level. We thus arrive at a thermodynamically admissible and closed set of evolution equations for the densities of momentum, mass, and internal energy. From the consideration of an illustrative special case, the following main conclusions emerge. There are two different source terms in the momentum balance. The first is a body force, which in special circumstances can be related to the functional derivative of a nonlocal Helmholtz free energy density with respect to the mass density. The second source term is proportional to the temperature gradient, multiplied by the nonlocal entropy density. These two source terms combine into a pressure gradient only in the absence of long-range effects. In the irreversible contributions to the time evolution, the nonlocal contributions arise since the self-correlations of the stress tensor and heat flux, respectively, are nonlocal as a result of the microscopic nonlocal correlations. Finally, we point out specific points that warrant further discussions.
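The statement that the two source terms combine into a pressure gradient only in the absence of long-range effects can be made concrete via the Gibbs-Duhem relation. The notation below is generic (not necessarily the authors'): a schematic momentum balance with the body-force and temperature-gradient source terms described in the abstract.

```latex
% Schematic momentum balance with the two source terms
% (m: momentum density, sigma: stress tensor, F: Helmholtz free
% energy functional, rho: mass density, s: entropy density):
\partial_t \mathbf{m} + \nabla\!\cdot\!(\mathbf{m}\,\mathbf{v})
  = \nabla\!\cdot\!\boldsymbol{\sigma}
    \;-\; \rho\,\nabla \frac{\delta F}{\delta \rho}
    \;-\; s\,\nabla T .
% In the local (short-range) limit, \delta F/\delta\rho \to \mu(\rho,T),
% and the Gibbs--Duhem relation
%   \nabla p \;=\; \rho\,\nabla\mu \;+\; s\,\nabla T
% collapses the two source terms into the single pressure-gradient
% force -\nabla p familiar from local hydrodynamics.
```

For nonlocal free energies the functional derivative is not a local chemical potential, so the reduction fails, which is the paper's point.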

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Benites, J.; Alumno del Posgrado en CBAP, Universidad Autonoma de Nayarit, Carretera Tepic-Compostela km9. C.P. 63780. Xalisco-Nayarit-Mexico; Vega-Carrillo, H. R.

Neutron spectra and the ambient dose equivalent were calculated inside the bunker of a 15 MV Varian linac, model CLINAC iX. Calculations were carried out using Monte Carlo methods. Neutron spectra in the vicinity of the isocentre show the presence of evaporation and knock-on neutrons produced by the source term, while epithermal and thermal neutrons remain constant regardless of the distance from the isocentre, due to room return. The neutron spectrum becomes softer as the detector moves along the maze. The ambient dose equivalent decreases along the maze but does not follow the 1/r2 rule, due to changes in the neutron spectra.

  10. Darwin Core: An Evolving Community-Developed Biodiversity Data Standard

    PubMed Central

    Wieczorek, John; Bloom, David; Guralnick, Robert; Blum, Stan; Döring, Markus; Giovanni, Renato; Robertson, Tim; Vieglais, David

    2012-01-01

    Biodiversity data derive from myriad sources stored in various formats on many distinct hardware and software platforms. An essential step towards understanding global patterns of biodiversity is to provide a standardized view of these heterogeneous data sources to improve interoperability. Fundamental to this advance are definitions of common terms. This paper describes the evolution and development of Darwin Core, a data standard for publishing and integrating biodiversity information. We focus on the categories of terms that define the standard, differences between simple and relational Darwin Core, how the standard has been implemented, and the community processes that are essential for maintenance and growth of the standard. We present case-study extensions of the Darwin Core into new research communities, including metagenomics and genetic resources. We close by showing how Darwin Core records are integrated to create new knowledge products documenting species distributions and changes due to environmental perturbations. PMID:22238640
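A Simple Darwin Core record, as described above, is just a flat set of term/value pairs, typically published as a CSV row. The sketch below builds one occurrence record in Python; the column headers are genuine Darwin Core term names, while the record contents themselves are invented for illustration.

```python
import csv
import io

# One Simple Darwin Core occurrence record: keys are real Darwin Core
# terms; the values (IDs, coordinates, date) are hypothetical.
occurrence = {
    "occurrenceID": "urn:catalog:EXAMPLE:mammals:12345",
    "basisOfRecord": "PreservedSpecimen",
    "scientificName": "Puma concolor",
    "eventDate": "2011-07-16",
    "decimalLatitude": "37.0",
    "decimalLongitude": "-122.2",
    "countryCode": "US",
}

# Serialize as the flat CSV that Simple Darwin Core publishing uses.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=list(occurrence))
writer.writeheader()
writer.writerow(occurrence)
dwc_csv = buf.getvalue()
```

Because every publisher uses the same term names, records like this one can be aggregated across institutions without per-source field mapping, which is the interoperability point of the standard.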

  11. Reprint of: High current liquid metal ion source using porous tungsten multiemitters.

    PubMed

    Tajmar, M; Vasiljevich, I; Grienauer, W

    2011-05-01

    We recently developed an indium Liquid-Metal-Ion-Source that can emit currents from sub-μA up to several mA. It is based on a porous tungsten crown structure with 28 individual emitters, which is manufactured using Micro-Powder Injection Molding (μPIM) and electrochemical etching. The emitter combines the advantages of internal capillary feeding with excellent emission properties due to micron-size tips. Significant progress was made on the homogeneity of the emission over its current-voltage characteristic as well as on investigating its long-term stability. This LMIS seems very suitable for space propulsion as well as for micro/nano manufacturing applications with greatly increased milling/drilling speeds. This paper summarizes the latest developments on our porous multiemitters with respect to manufacturing, emission properties and long-term testing. Copyright © 2010 Elsevier B.V. All rights reserved.

  12. Part 1 of a Computational Study of a Drop-Laden Mixing Layer

    NASA Technical Reports Server (NTRS)

    Okong'o, Nora A.; Bellan, Josette

    2004-01-01

    This first of three reports on a computational study of a drop-laden temporal mixing layer presents the results of direct numerical simulations (DNS) of well-resolved flow fields and the derivation of the large-eddy simulation (LES) equations that would govern the larger scales of a turbulent flow field. The mixing layer consisted of two counterflowing gas streams, one of which was initially laden with evaporating liquid drops. The gas phase was composed of two perfect gas species, the carrier gas and the vapor emanating from the drops, and was computed in an Eulerian reference frame, whereas each drop was tracked individually in a Lagrangian manner. The flow perturbations that were initially imposed on the layer caused mixing and eventual transition to turbulence. The DNS database obtained included transitional states for layers with various liquid mass loadings. For the DNS, the gas-phase equations were the compressible Navier-Stokes equations for conservation of momentum and additional conservation equations for total energy and species mass. These equations included source terms representing the effect of the drops on the mass, momentum, and energy of the gas phase. From the DNS equations, the expression for the irreversible entropy production (dissipation) was derived and used to determine the dissipation due to the source terms. The LES equations were derived by spatially filtering the DNS set and the magnitudes of the terms were computed at transitional states, leading to a hierarchy of terms to guide simplification of the LES equations. It was concluded that effort should be devoted to the accurate modeling of both the subgrid-scale fluxes and the filtered source terms, which were the dominant unclosed terms appearing in the LES equations.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parworth, Caroline; Fast, Jerome D.; Mei, Fan

    In this study the long-term trends of non-refractory submicrometer aerosol (NR-PM1) composition and mass concentration measured by an Aerosol Chemical Speciation Monitor (ACSM) at the U.S. Department of Energy's Southern Great Plains (SGP) site are discussed. Over a period of 19 months (Nov. 20, 2010 – June 2012), highly time-resolved (~30 min) NR-PM1 data were recorded. Using this dataset, the value-added product (VAP) for deriving organic aerosol components (OACOMP) is introduced. With this VAP, multivariate analysis of the measured organic mass spectral matrix can be performed on long-term data to return organic aerosol (OA) factors that are associated with distinct sources, evolution processes, and physicochemical properties. Three factors were obtained from this VAP: two oxygenated OA (OOA) factors, differing in degree of oxidation, and a biomass burning OA (BBOA) factor. Back-trajectory analyses were performed to investigate possible sources of major NR-PM1 species at the SGP site. Organics dominated the NR-PM1 mass concentration for the majority of the study, with the exception of winter, when nitrate increased due to transport of precursor species from surrounding urban and agricultural areas and also due to cooler temperatures. Sulfate mass concentrations showed little seasonal variation, with mixed regional and local sources. In the spring, BBOA emissions increased and were mainly associated with local fires. Isoprene and carbon monoxide emission rates were computed by the Model of Emissions of Gases and Aerosols from Nature (MEGAN) to represent the spatial distribution of biogenic and anthropogenic sources, respectively. From this model there is evidence that biogenic emissions from the southeast contribute to SOA formation at the SGP site during the summer.

  14. Contributions of wildland fire to terrestrial ecosystem carbon dynamics in North America from 1990 to 2012

    USGS Publications Warehouse

    Chen, Guangsheng; Hayes, Daniel J.; McGuire, A. David

    2017-01-01

    Burn area and the frequency of extreme fire events have been increasing during recent decades in North America, and this trend is expected to continue over the 21st century. While many aspects of the North American carbon budget have been intensively studied, the net contribution of fire disturbance to the overall net carbon flux at the continental scale remains uncertain. Based on national scale, spatially explicit and long-term fire data, along with the improved model parameterization in a process-based ecosystem model, we simulated the impact of fire disturbance on both direct carbon emissions and net terrestrial ecosystem carbon balance in North America. Fire-caused direct carbon emissions were 106.55 ± 15.98 Tg C/yr during 1990–2012; however, the net ecosystem carbon balance associated with fire was −26.09 ± 5.22 Tg C/yr, indicating that most of the emitted carbon was resequestered by the terrestrial ecosystem. Direct carbon emissions showed an increase in Alaska and Canada during 1990–2012 as compared to prior periods due to more extreme fire events, resulting in a large carbon source from these two regions. Among biomes, the largest carbon source was found to be from the boreal forest, primarily due to large reductions in soil organic matter during, and with slower recovery after, fire events. The interactions between fire and environmental factors reduced the fire-caused ecosystem carbon source. Fire disturbance only caused a weak carbon source as compared to the best estimate terrestrial carbon sink in North America owing to the long-term legacy effects of historical burn area coupled with fast ecosystem recovery during 1990–2012.

  15. Quantitative investigation into the source of current slump in AlGaN/GaN HEMT on both Si (111) and sapphire: Self-heating and trapping

    NASA Astrophysics Data System (ADS)

    Bag, Ankush; Mukhopadhyay, Partha; Ghosh, Saptarsi; Das, Palash; Chakraborty, Apurba; Dinara, Syed M.; Kabi, Sanjib; Biswas, Dhurbes

    2015-05-01

    We have experimentally studied trapping and self-heating effects, in terms of current slump, in AlGaN/GaN HEMTs grown and identically processed on Silicon (111) and Sapphire (0001) substrates. Different responses were observed through DC characterization with drain pulses of different duty cycles (100%, 50%, 5% and 0.5%). The effect of self-heating is greater for the HEMT on sapphire, due to its comparatively poor thermal conductivity, whereas trapped charges contribute strongly to the current drop of the HEMT on Si (111), due to the larger lattice and thermal-expansion-coefficient mismatch between GaN and Si (111). Comparing these results across substrates allows the dominant source of current slump, traps versus self-heating, to be identified quantitatively.

  16. Cosine beamforming

    NASA Astrophysics Data System (ADS)

    Ruigrok, Elmer; Wapenaar, Kees

    2014-05-01

    In various application areas, e.g., seismology, astronomy and geodesy, arrays of sensors are used to characterize incoming wavefields due to distant sources. Beamforming is a general term for phase-adjusted summations over the different array elements, for untangling the directionality and elevation angle of the incoming waves. For characterizing noise sources, beamforming is conventionally applied with a temporal Fourier and a 2D spatial Fourier transform, possibly with additional weights. These transforms become aliased for higher frequencies and sparser array-element distributions. As a partial remedy, we derive a kernel for beamforming crosscorrelated data and call it cosine beamforming (CBF). By applying beamforming not directly to the data, but to crosscorrelated data, the sampling is effectively increased. We show that CBF, due to this better sampling, suffers less from aliasing and yields higher resolution than conventional beamforming. As the flip side of the coin, the CBF output shows more smearing for spherical waves than conventional beamforming.
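    The conventional beamforming that CBF improves on is a phase-adjusted summation over array elements; a minimal single-frequency sketch, with a hypothetical four-sensor array and a synthetic plane wave (geometry, frequency and wave speed are illustrative, not from the paper):

```python
import cmath, math

# Conventional frequency-domain beamforming at one angular frequency:
# B(az) = |sum_i X_i * exp(+i*omega*tau_i(az))|^2, scanning over azimuth.
# The beam power is maximal when the trial azimuth matches the true one.

c = 340.0                      # wave speed (m/s), hypothetical
omega = 2 * math.pi * 10.0     # 10 Hz
sensors = [(0.0, 0.0), (30.0, 0.0), (0.0, 30.0), (30.0, 30.0)]
true_az = math.radians(60.0)   # arrival azimuth of the synthetic wave

def steering_delay(az, x, y):
    # travel-time delay of a plane wave from azimuth az at sensor (x, y)
    return (math.cos(az) * x + math.sin(az) * y) / c

# synthetic single-frequency recordings X_i = exp(-i*omega*tau_i)
data = [cmath.exp(-1j * omega * steering_delay(true_az, x, y)) for x, y in sensors]

def beam_power(az):
    # phase-align each element and sum coherently
    s = sum(d * cmath.exp(1j * omega * steering_delay(az, x, y))
            for d, (x, y) in zip(data, sensors))
    return abs(s) ** 2

best = max(range(0, 360, 5), key=lambda a: beam_power(math.radians(a)))
print(best)  # → 60
```

CBF applies an analogous kernel to crosscorrelated rather than raw data, which is what improves the effective sampling.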

  17. Taste and odor occurrence in Lake William C. Bowen and Municipal Reservoir #1, Spartanburg County, South Carolina

    USGS Publications Warehouse

    Journey, Celeste; Arrington, Jane M.

    2009-01-01

    The U.S. Geological Survey and Spartanburg Water are working cooperatively on an ongoing study of Lake Bowen and Reservoir #1 to identify environmental factors that enhance or influence the production of geosmin in the source-water reservoirs. Spartanburg Water is using information from this study to develop management strategies to reduce (short-term solution) and prevent (long-term solution) geosmin occurrence. Spartanburg Water utility treats and distributes drinking water to the Spartanburg area of South Carolina. The drinking water sources for the area are Lake William C. Bowen (Lake Bowen) and Municipal Reservoir #1 (Reservoir #1), located north of Spartanburg. These reservoirs, which were formed by the impoundment of the South Pacolet River, were assessed in 2006 by the South Carolina Department of Health and Environmental Control (SCDHEC) as being fully supportive of all uses based on established criteria. Nonetheless, Spartanburg Water had noted periodic taste and odor problems due to the presence of geosmin, a naturally occurring compound in the source water. Geosmin is not harmful, but its presence in drinking water is aesthetically unpleasant.

  18. Combining molecular fingerprints with multidimensional scaling analyses to identify the source of spilled oil from highly similar suspected oils.

    PubMed

    Zhou, Peiyu; Chen, Changshu; Ye, Jianjun; Shen, Wenjie; Xiong, Xiaofei; Hu, Ping; Fang, Hongda; Huang, Chuguang; Sun, Yongge

    2015-04-15

    Oil fingerprints have been a powerful tool widely used for determining the source of spilled oil. In most cases, this tool works well. However, it is usually difficult to identify the source if the oil spill accident occurs during offshore petroleum exploration, due to the highly similar physicochemical characteristics of suspected oils from the same drilling platform. In this report, a case study from the waters of the South China Sea is presented, and multidimensional scaling analysis (MDS) is introduced to demonstrate how oil fingerprints can be combined with mathematical methods to identify the source of spilled oil from highly similar suspected sources. The results suggest that the MDS calculation based on oil fingerprints, subsequently integrated with specific biomarkers in spilled oils, is the most effective method, with great potential for determining the source among highly similar suspected oils. Copyright © 2015 Elsevier Ltd. All rights reserved.

  19. On estimating attenuation from the amplitude of the spectrally whitened ambient seismic field

    NASA Astrophysics Data System (ADS)

    Weemstra, Cornelis; Westra, Willem; Snieder, Roel; Boschi, Lapo

    2014-06-01

    Measuring attenuation on the basis of interferometric, receiver-receiver surface waves is a non-trivial task: the amplitude, more than the phase, of ensemble-averaged cross-correlations is strongly affected by non-uniformities in the ambient wavefield. In addition, ambient noise data are typically pre-processed in ways that affect the amplitude itself. Some authors have recently attempted to measure attenuation in receiver-receiver cross-correlations obtained after the usual pre-processing of seismic ambient-noise records, including, most notably, spectral whitening. Spectral whitening replaces the cross-spectrum with a unit amplitude spectrum. It is generally assumed that cross-terms have cancelled each other prior to spectral whitening. Cross-terms are peaks in the cross-correlation due to simultaneously acting noise sources, that is, spurious traveltime delays due to constructive interference of signal coming from different sources. Cancellation of these cross-terms is a requirement for the successful retrieval of interferometric receiver-receiver signal and results from ensemble averaging. In practice, ensemble averaging is replaced by integrating over sufficiently long time or averaging over several cross-correlation windows. Contrary to the general assumption, we show in this study that cross-terms are not required to cancel each other prior to spectral whitening, but may also cancel each other after the whitening procedure. Specifically, we derive an analytic approximation for the amplitude difference associated with the reversed order of cancellation and normalization. Our approximation shows that an amplitude decrease results from the reversed order. This decrease is predominantly non-linear at small receiver-receiver distances: at distances smaller than approximately two wavelengths, whitening prior to ensemble averaging causes a significantly stronger decay of the cross-spectrum.
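    Spectral whitening as described above replaces the cross-spectrum with a unit amplitude spectrum; a minimal sketch with a toy cross-spectrum (the decay and phase constants are made up for illustration):

```python
import cmath

# Spectral whitening: the cross-spectrum C(omega) is replaced by
# C(omega)/|C(omega)|, preserving phase (traveltime information) while
# discarding amplitude (the attenuation information discussed above).

def whiten(cross_spectrum):
    return [c / abs(c) if abs(c) > 0 else 0j for c in cross_spectrum]

# toy cross-spectrum with exponentially decaying amplitude and linear phase
spectrum = [2.0 * cmath.exp(-0.1 * k) * cmath.exp(1j * 0.3 * k) for k in range(5)]
white = whiten(spectrum)

print([round(abs(w), 6) for w in white])  # all amplitudes forced to 1.0
```

This is why measuring attenuation from whitened data is delicate: the amplitude decay one would like to measure is exactly what the normalization removes, so the order of cancellation and normalization matters, as the authors show.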

  20. Unconsidered sporadic sources of carbon dioxide emission from soils in taiga forests.

    PubMed

    Karelin, D V; Zamolodchikov, D G; Isaev, A S

    2017-07-01

    Long-term monitoring in the Russian taiga zone has shown that all known extreme destructive effects resulting in the weakening and death of tree stands (windfalls, pest attacks, drought events, etc.) can be sporadic but significant sources of soil CO2 emission. Among them are (i) a recently found effect of multiyear CO2 emission from soil at the base of dead spruce trees that died due to climate warming and subsequent pest outbreaks, (ii) increased soil CO2 emissions due to the fall of tree trunks during massive windfalls, and (iii) pulse CO2 emission as a result of the so-called Birch effect after drought events in the taiga zone. According to the modeling, and depending on the spatial and temporal scales of their manifestation, the impact of these sporadic effects on regional and global soil respiration fluxes could be significant and should be taken into consideration, given continuing climate change and the further increase of local, regional and global human impacts on the atmospheric greenhouse-gas balance and on land use.

  1. Evaluation of sensor, environment and operational factors impacting the use of multiple sensor constellations for long term resource monitoring

    NASA Astrophysics Data System (ADS)

    Rengarajan, Rajagopalan

    Moderate resolution remote sensing data offer the potential to monitor long- and short-term trends in the condition of the Earth's resources at finer spatial scales and over longer time periods. While improved calibration (radiometric and geometric), free access (Landsat, Sentinel, CBERS), and higher-level products in reflectance units have made it easier for the science community to derive biophysical parameters from these remotely sensed data, a number of issues still affect the analysis of multi-temporal datasets. These are primarily due to sources that are inherent in the process of imaging from single or multiple sensors. Some of these undesired or uncompensated sources of variation include variation in the view angles, illumination angles, atmospheric effects, and sensor effects such as Relative Spectral Response (RSR) variation between different sensors. The complex interaction of these sources of variation would make their study extremely difficult, if not impossible, with real data; therefore, a simulated analysis approach is used in this study. A synthetic forest canopy is produced using the Digital Imaging and Remote Sensing Image Generation (DIRSIG) model and its measured BRDFs are modeled using the RossLi canopy BRDF model. The simulated BRDF matches the real data to within 2% of the reflectance in the red and NIR spectral bands studied. The BRDF modeling process is extended to model and characterize the defoliation of a forest, which is used in factor sensitivity studies to estimate the effect of each factor under varying environment and sensor conditions. Finally, a factorial experiment is designed to understand the significance of the sources of variation, and regression-based analyses are performed to understand the relative importance of the factors.
    The design of experiments and the sensitivity analysis conclude that atmospheric attenuation and variations due to the illumination angles are the dominant sources impacting the at-sensor radiance.

  2. A framework for emissions source apportionment in industrial areas: MM5/CALPUFF in a near-field application.

    PubMed

    Ghannam, K; El-Fadel, M

    2013-02-01

    This paper examines the relative source contribution to ground-level concentrations of carbon monoxide (CO), nitrogen dioxide (NO2), and PM10 (particulate matter with an aerodynamic diameter < 10 microm) in a coastal urban area due to emissions from an industrial complex with multiple stacks, quarrying activities, and a nearby highway. For this purpose, an inventory of CO, oxides of nitrogen (NO(x)), and PM10 emissions was coupled with the non-steady-state Mesoscale Model 5/California Puff Dispersion Modeling system to simulate individual source contributions under several spatial and temporal scales. As the contribution of a particular source to ground-level concentrations can be evaluated by simulating that single source's emissions, or otherwise total emissions except that source, a set of emission sensitivity simulations was designed to examine whether CALPUFF maintains a linear relationship between emission rates and predicted concentrations in cases where emitted plumes overlap and chemical transformations are simulated. Source apportionment revealed that ground-level releases (i.e., highway and quarries) extending over large areas dominated the contribution to exposure levels over elevated point sources, despite the fact that cumulative emissions from point sources are higher. Sensitivity analysis indicated that chemical transformations of NO(x) are insignificant, possibly due to short-range plume transport, with CALPUFF exhibiting a linear response to changes in emission rate. The current paper points to the significance of ground-level emissions in contributing to urban air pollution exposure and questions the viability of the prevailing paradigm of point-source emission reduction, especially since the incremental improvement in air quality associated with this common abatement strategy may not accomplish the desired benefit in terms of lower exposure, despite costly emissions capping.
The application of atmospheric dispersion models for source apportionment helps in identifying major contributors to regional air pollution. In industrial urban areas where multiple sources with different geometry contribute to emissions, ground-level releases extended over large areas such as roads and quarries often dominate the contribution to ground-level air pollution. Industrial emissions released at elevated stack heights may experience significant dilution, resulting in minor contribution to exposure at ground level. In such contexts, emission reduction, which is invariably the abatement strategy targeting industries at a significant investment in control equipment or process change, may result in minimal return on investment in terms of improvement in air quality at sensitive receptors.

  3. RADTRAD: A simplified model for RADionuclide Transport and Removal And Dose estimation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Humphreys, S.L.; Miller, L.A.; Monroe, D.K.

    1998-04-01

    This report documents the RADTRAD computer code developed for the U.S. Nuclear Regulatory Commission (NRC) Office of Nuclear Reactor Regulation (NRR) to estimate transport and removal of radionuclides and dose at selected receptors. The document includes a users' guide to the code, a description of the technical basis for the code, the quality assurance and code acceptance testing documentation, and a programmers' guide. The RADTRAD code can be used to estimate the containment release using either the NRC TID-14844 or NUREG-1465 source terms and assumptions, or a user-specified table. In addition, the code can account for a reduction in the quantity of radioactive material due to containment sprays, natural deposition, filters, and other natural and engineered safety features. The RADTRAD code uses a combination of tables and/or numerical models of source term reduction phenomena to determine the time-dependent dose at user-specified locations for a given accident scenario. The code system also provides the inventory, decay chain, and dose conversion factor tables needed for the dose calculation. The RADTRAD code can be used to assess occupational radiation exposures, typically in the control room; to estimate site boundary doses; and to estimate dose attenuation due to modification of a facility or accident sequence.
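    The source term reduction mechanisms listed above (sprays, deposition, filters) are often treated as independent first-order removal processes; a minimal sketch of that idea, with hypothetical removal constants that are not RADTRAD's models or defaults:

```python
import math

# First-order removal model for airborne containment activity:
# A(t) = A0 * exp(-(lam_decay + lam_spray + lam_dep) * t),
# with each mechanism contributing an independent removal constant.
# The constants below (per hour) are illustrative placeholders.

def airborne_activity(a0, t_hours, lam_decay=0.02, lam_spray=0.5, lam_dep=0.1):
    """Airborne activity remaining after t_hours of combined removal."""
    lam_total = lam_decay + lam_spray + lam_dep
    return a0 * math.exp(-lam_total * t_hours)

a0 = 1.0e6   # initial airborne activity, arbitrary units
print(airborne_activity(a0, 2.0))  # sprays dominate the reduction here
```

RADTRAD's actual models are more elaborate (time-dependent tables, aerosol physics), but the exponential-removal picture is the common starting point for such codes.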

  4. Characterization and Evolution of the Swift X-ray Telescope Instrumental Background

    NASA Technical Reports Server (NTRS)

    Hill, Joanne; Pagani, C.; Morris, D. C.; Racusin, J.; Grupe, D.; Vetere, L.; Stroh, M.; Falcone, A.; Kennea, J.; Burrows, D. N.

    2007-01-01

    The X-ray telescope (XRT) on board the Swift Gamma Ray Burst Explorer has successfully operated since the spacecraft launch on 20 November 2004, automatically locating GRB afterglows, measuring their spectra and lightcurves and performing observations of high-energy sources. In this work we investigate the properties of the instrumental background, focusing on its dynamic behavior on both long and short timescales. The operational temperature of the CCD is the main factor that influences the XRT background level. After the failure of the Swift active on-board temperature control system, the XRT detector now operates at a temperature range between -75C and -45C thanks to a passive cooling Heat Rejection System. We report on the long-term effects on the background caused by radiation, consisting mainly of proton irradiation in Swift's low Earth orbit and on the short-term effects of transits through the South Atlantic Anomaly (SAA), which expose the detector to periods of intense proton flux. We have determined the fraction of the detector background that is due to the internal, instrumental background and the part that is due to unresolved astrophysical sources (the cosmic X-ray background) by investigating the degree of vignetting of the measured background and comparing it to the expected value from calibration data.

  5. Fundamental Rotorcraft Acoustic Modeling From Experiments (FRAME)

    NASA Technical Reports Server (NTRS)

    Greenwood, Eric

    2011-01-01

    A new methodology is developed for the construction of helicopter source noise models for use in mission planning tools from experimental measurements of helicopter external noise radiation. The models are constructed by employing a parameter identification method to an assumed analytical model of the rotor harmonic noise sources. This new method allows for the identification of individual rotor harmonic noise sources and allows them to be characterized in terms of their individual non-dimensional governing parameters. The method is applied to both wind tunnel measurements and ground noise measurements of two-bladed rotors. The method is shown to match the parametric trends of main rotor harmonic noise, allowing accurate estimates of the dominant rotorcraft noise sources to be made for operating conditions based on a small number of measurements taken at different operating conditions. The ability of this method to estimate changes in noise radiation due to changes in ambient conditions is also demonstrated.

  6. Tracing changes in soil N transformations to explain the doubling of N2O emissions under elevated CO2 in the Giessen FACE

    NASA Astrophysics Data System (ADS)

    Moser, Gerald; Brenzinger, Kristof; Gorenflo, Andre; Clough, Tim; Braker, Gesche; Müller, Christoph

    2017-04-01

    To reduce the emissions of greenhouse gases (CO2, CH4 & N2O) it is important to quantify the main sources and identify the respective ecosystem processes. While the main sources of N2O emissions in agro-ecosystems under current conditions are well known, the influence of a projected higher level of CO2 on the main ecosystem processes responsible for N2O emissions has not been investigated in detail. A major result of the Giessen FACE in a managed temperate grassland was that a +20% CO2 level caused a positive feedback, with N2O emissions increasing to 221% of those under control conditions. To trace the sources of the additional N2O emissions, a 15N tracing study was conducted. We measured the N2O emission and its 15N signature, together with the 15N signatures of soil and plant samples. The results were analyzed using a 15N tracing model which quantified the main changes in N transformation rates under elevated CO2. Directly after 15N fertilizer application, N transformations were much more dynamic than in the long run. Absolute mineralisation and DNRA rates were lower under elevated CO2 in the short term but higher in the long term. During the one-year study period beginning with the 15N labelling, a 1.8-fold increase of N2O emissions occurred under elevated CO2. The source of the increased N2O was associated with NO3- in the first weeks after 15N application. Elevated CO2 affected denitrification rates, which resulted in increased N2O emissions due to a change of gene transcription rates (nosZ/(nirK+nirS)) and resulting enzyme activity (see: Brenzinger et al.). Here we show that the enhanced N2O emissions reported for the first 8 FACE years prevail even in the long term (> 15 years). The effect of elevated CO2 on N2O production/emission can be explained by altered activity ratios within a stable microbial community.

  7. Source identification and spatial distribution of heavy metals in tobacco-growing soils in Shandong province of China with multivariate and geostatistical analysis.

    PubMed

    Liu, Haiwei; Zhang, Yan; Zhou, Xue; You, Xiuxuan; Shi, Yi; Xu, Jialai

    2017-02-01

    Samples of surface soil from tobacco (Nicotiana tabacum L.) fields were analysed for heavy metals and showed the following concentrations (mean of 246 samples, mg/kg): As, 5.10; Cd, 0.11; Cr, 49.49; Cu, 14.72; Hg, 0.08; Ni, 19.28; Pb, 20.20; and Zn, 30.76. The values of the index of geoaccumulation (Igeo) and of the enrichment factor indicated modest enrichment with As, Cd, Cr, Hg, Ni or Pb. Principal component analysis and cluster analysis correctly allocated each investigated element to its source, whether anthropogenic or natural. The results were consistent with estimated inputs of heavy metals from fertilizers, irrigation water and atmospheric deposition. The variation in the concentrations of As, Cd, Cu, Pb and Zn in the soil was mainly due to long-term agricultural practises, and that of Cr and Ni was mainly due to the soil parent material, whereas the source of Hg was industrial activity, which ultimately led to atmospheric deposition. Atmospheric deposition was the main exogenous source of heavy metals, and fertilizers also played an important role in the accumulation of these elements in soil. Identifying the sources of heavy metals in agricultural soils can serve as a basis for appropriate action to control and reduce the addition of heavy metals to cultivated soils.
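    The index of geoaccumulation used above is commonly computed as Igeo = log2(Cn / (1.5 * Bn)); a minimal sketch using the abstract's mean concentrations, with hypothetical background values that are not the paper's reference values:

```python
import math

# Index of geoaccumulation: Igeo = log2(Cn / (1.5 * Bn)), where Cn is the
# measured concentration and Bn the geochemical background; the factor 1.5
# allows for natural variability of the background.

def igeo(measured, background):
    return math.log2(measured / (1.5 * background))

# mean soil concentrations from the abstract (mg/kg), paired with
# placeholder background values for illustration only
samples = {"Cd": (0.11, 0.10), "Pb": (20.20, 25.0), "Zn": (30.76, 70.0)}
for metal, (c, b) in samples.items():
    print(metal, round(igeo(c, b), 2))
# Igeo <= 0 corresponds to "practically unpolluted" on the usual class scale
```

With real regional background values, the same two-line function reproduces the modest-enrichment classification reported in the abstract.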

  8. Complete Moment Tensor Determination of Induced Seismicity in Unconventional and Conventional Oil/Gas Fields

    NASA Astrophysics Data System (ADS)

    Gu, C.; Li, J.; Toksoz, M. N.

    2013-12-01

    Induced seismicity occurs both in conventional oil/gas fields due to production and water injection and in unconventional oil/gas fields due to hydraulic fracturing. Source mechanisms of these induced earthquakes are of great importance for understanding their causes and the physics of the seismic processes in reservoirs. Previous research on the analysis of induced seismic events in conventional oil/gas fields assumed a double couple (DC) source mechanism. However, recent studies have shown a non-negligible percentage of a non-double-couple (non-DC) component of source moment tensor in hydraulic fracturing events (Šílený et al., 2009; Warpinski and Du, 2010; Song and Toksöz, 2011). In this study, we determine the full moment tensor of the induced seismicity data in a conventional oil/gas field and for hydrofrac events in an unconventional oil/gas field. Song and Toksöz (2011) developed a full waveform based complete moment tensor inversion method to investigate a non-DC source mechanism. We apply this approach to the induced seismicity data from a conventional gas field in Oman. In addition, this approach is also applied to hydrofrac microseismicity data monitored by downhole geophones in four wells in US. We compare the source mechanisms of induced seismicity in the two different types of gas fields and explain the differences in terms of physical processes.

  9. [A survey on fatal work accidents based on Mortality Registry data: results of the Tuscany study on INAIL and RMR cases in the period 1992-96].

    PubMed

    Chellini, Elisabetta; Baldasseroni, Alberto; Giovannetti, Lucia; Zoppi, Ombretta

    2002-01-01

    Work-related deaths are important "sentinel events" of unsuccessful prevention. In Italy the most exhaustive source of such events is the National Fund for Occupational Diseases (INAIL), but the number of cases from this source seems to be underestimated because it covers only cases occurring to subjects insured by the Fund. A previous survey estimated the real number of work-related deaths to be 10-20% higher than that quantified by INAIL. This study evaluated the contribution of the two most important sources (INAIL and the Regional Mortality Registry of Tuscany, RMR) to estimating the number of these cases in Tuscany in the period 1992-96. Cases were identified from each source, and a capture-recapture method was then applied to size the number of cases from work-related accidents other than road accidents. The RMR appeared to be the most exhaustive source, with 72.3% completeness versus 56.4% completeness for the INAIL source. Nevertheless, the latter must be considered the primary source, being more specific and accurate than any other, and in recent years also timely. Work-related deaths from road accidents represent 35.9% of INAIL cases, but they are difficult to identify from the RMR and were not considered in this study. In conclusion, mortality data should be used for an epidemiologic surveillance system on work-related deaths not due to road accidents, in order to identify cases occurring to subjects not insured by INAIL (and therefore not recorded by the Fund). These deaths are also important in terms of public health. Cases identified only from the RMR, occurring in Tuscany in 1992-96, numbered 155: the vast majority occurred to farmers (mainly pensioners, and due to crawler-tractor overturning), bricklayers, railway workers, soldiers and entrepreneurs.
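    Two-source capture-recapture of the kind applied above is usually done with the Lincoln-Petersen estimator (here in its Chapman bias-corrected form); a minimal sketch with illustrative counts, not the Tuscany study's figures:

```python
# Two-source capture-recapture: with n1 cases found in source A, n2 in
# source B, and m found in both, the Chapman estimator of the true total is
#   N = (n1 + 1) * (n2 + 1) / (m + 1) - 1,
# assuming the two sources capture cases independently.

def chapman_estimate(n1, n2, m):
    return (n1 + 1) * (n2 + 1) / (m + 1) - 1

# hypothetical counts for INAIL, RMR, and their overlap
n_inail, n_rmr, n_both = 120, 154, 90
print(round(chapman_estimate(n_inail, n_rmr, n_both)))  # estimated true total
```

Dividing each source's count by the estimated total yields the kind of completeness percentages (72.3% and 56.4%) reported in the abstract.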

  10. MSW-resonant fermion mixing during reheating

    NASA Astrophysics Data System (ADS)

    Kanai, Tsuneto; Tsujikawa, Shinji

    2003-10-01

    We study the dynamics of reheating in which an inflaton field couples to two flavors of fermions through Yukawa couplings. When the two fermions have a mixing term with a constant coupling, we show that a Mikheyev-Smirnov-Wolfenstein (MSW)-type resonance emerges due to the time-dependent background, in addition to the standard fermion creation via parametric resonance. This MSW resonance not only alters the number densities of fermions generated by preheating but can also lead to larger energy transfer from the inflaton to the fermions. Our mechanism can provide additional source terms for the creation of superheavy fermions, which may be relevant for the leptogenesis scenario.

  11. Global Infrasound Association Based on Probabilistic Clutter Categorization

    NASA Astrophysics Data System (ADS)

    Arora, N. S.; Mialle, P.

    2015-12-01

    The IDC collects waveforms from a global network of infrasound sensors maintained by the IMS, and automatically detects signal onsets and associates them to form event hypotheses. However, a large number of signal onsets are due to local clutter sources such as microbaroms (from standing waves in the oceans), waterfalls, dams, gas flares, surf (breaking ocean waves), etc. These sources are either too diffuse or too local to form events. Worse still, the repetitive nature of this clutter leads to a large number of false event hypotheses due to the random matching of clutter at multiple stations. Previous studies, for example [1], have worked on the categorization of clutter using long-term trends in detection azimuth, frequency, and amplitude at each station. In this work we continue the same line of reasoning to build a probabilistic model of clutter that is used as part of NET-VISA [2], a Bayesian approach to network processing. The resulting model is a fusion of seismic, hydroacoustic and infrasound processing built on a unified probabilistic framework. Notes: The attached figure shows all the unassociated arrivals detected at IMS station I09BR for 2012, distributed by azimuth and center frequency. (The title displays the bandwidth of the kernel density estimate along the azimuth and frequency dimensions.) This plot shows multiple microbarom sources as well as other sources of infrasound clutter. A diverse clutter field such as this one is quite common for most IMS infrasound stations, and it highlights the dangers of forming events without due consideration of this source of noise. References: [1] Infrasound categorization: Towards a statistics-based approach. J. Vergoz, P. Gaillard, A. Le Pichon, N. Brachet, and L. Ceranna. ITW 2011. [2] NET-VISA: Network Processing Vertically Integrated Seismic Analysis. N. S. Arora, S. Russell, and E. Sudderth. BSSA 2013.
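    The kernel density estimate mentioned in the figure note can be illustrated with a minimal wrapped-Gaussian KDE over detection azimuth. This is a deliberate simplification with illustrative names: the NET-VISA clutter model is a full probabilistic model over azimuth, frequency and amplitude, not this one-dimensional sketch.

    ```python
    import math

    def azimuth_kde(azimuths_deg, bandwidth_deg, grid_deg):
        # Wrapped-Gaussian kernel density estimate over detection azimuths.
        # The circular difference handles the 0/360 degree discontinuity.
        dens = []
        for g in grid_deg:
            total = 0.0
            for a in azimuths_deg:
                d = (g - a + 180.0) % 360.0 - 180.0  # signed circular difference
                total += math.exp(-0.5 * (d / bandwidth_deg) ** 2)
            dens.append(total / (len(azimuths_deg) * bandwidth_deg * math.sqrt(2 * math.pi)))
        return dens
    ```

    Peaks in such a density over a year of unassociated arrivals mark persistent clutter directions (e.g. microbarom source regions) at a station.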

  12. Prevalence of microbiological contaminants in groundwater sources and risk factor assessment in Juba, South Sudan.

    PubMed

    Engström, Emma; Balfors, Berit; Mörtberg, Ulla; Thunvik, Roger; Gaily, Tarig; Mangold, Mikael

    2015-05-15

    In low-income regions, drinking water is often derived from groundwater sources, which might spread diarrheal disease if they are microbiologically polluted. This study aimed to investigate the occurrence of fecal contamination in 147 improved groundwater sources in Juba, South Sudan and to assess potential contributing risk factors, based on bivariate statistical analysis. Thermotolerant coliforms (TTCs) were detected in 66% of the investigated sources, including 95 boreholes, breaching the health-based recommendations for drinking water. A significant association (p<0.05) was determined between the presence of TTCs and the depth of cumulative, long-term prior precipitation (both within the previous five days and within the past month). No such link was found to short-term rainfall, the presence of latrines or damage to the borehole apron. However, the risk factor analysis further suggested, to a lesser degree, that the local topography and on-site hygiene were also significant. In summary, the analysis indicated that an important contamination mechanism was fecal pollution of the contributing groundwater, which was unlikely to be due to the presence of latrines; instead, infiltration from contaminated surface water was more probable. Reducing fecal sources in the environment in Juba is thus recommended, for example, through constructing latrines or designating protection areas near water sources. The study results contribute to the understanding of microbiological contamination of groundwater sources in areas with low incomes and high population densities, tropical climates and weathered basement complex environments, which are common in urban sub-Saharan Africa.

  13. Source calibrations and SDC calorimeter requirements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Green, D.

    Several studies of the problem of calibration of the SDC calorimeter exist. In this note the attempt is made to give a connected account of the requirements on the source calibration from the point of view of the desired, and acceptable, constant term induced in the EM resolution. It is assumed that a "local" calibration resulting from exposing each tower to a beam of electrons is not feasible. It is further assumed that an "in situ" calibration is either not yet performed, or is unavailable due to tracking alignment problems or high luminosity operation rendering tracking inoperative. Therefore, the assumptions used are rather conservative. In this scenario, each scintillator plate of each tower is exposed to a moving radioactive source. That reading is used to "mask" an optical "cookie" in a grey code chosen so as to make the response uniform. The source is assumed to be the sole calibration of the tower. Therefore, the phrase "global" calibration of towers by movable radioactive sources is adopted.

  15. A Numerical Experiment on the Role of Surface Shear Stress in the Generation of Sound

    NASA Technical Reports Server (NTRS)

    Shariff, Karim; Wang, Meng; Merriam, Marshal (Technical Monitor)

    1996-01-01

    The sound generated due to a localized flow over an infinite flat surface is considered. It is known that the unsteady surface pressure, while appearing in a formal solution to the Lighthill equation, does not constitute a source of sound but rather represents the effect of image quadrupoles. The question of whether a similar surface shear stress term constitutes a true source of dipole sound is less settled. Some have boldly assumed it is a true source, while others have argued that, like the surface pressure, it depends on the sound field (via an acoustic boundary layer) and is therefore not a true source. A numerical experiment based on the viscous, compressible Navier-Stokes equations was undertaken to investigate the issue. A small region of a wall was oscillated tangentially. The directly computed sound field was found to agree with an acoustic-analogy-based calculation which regards the surface shear as an acoustically compact dipole source of sound.

  16. Estimation of the time-dependent radioactive source-term from the Fukushima nuclear power plant accident using atmospheric transport modelling

    NASA Astrophysics Data System (ADS)

    Schoeppner, M.; Plastino, W.; Budano, A.; De Vincenzi, M.; Ruggieri, F.

    2012-04-01

    Several nuclear reactors at the Fukushima Dai-ichi power plant were severely damaged by the Tōhoku earthquake and the subsequent tsunami in March 2011. Due to the extremely difficult on-site situation, it has not been possible to directly determine the emissions of radioactive material. However, during the following days and weeks, radionuclides such as caesium-137 and iodine-131 were detected at monitoring stations throughout the world. Atmospheric transport models are able to simulate the worldwide dispersion of particles according to location, time and meteorological conditions following the release. The Lagrangian atmospheric transport model Flexpart is used by many authorities and has been proven to make valid predictions in this regard. The Flexpart software was first ported to a local cluster computer at the Grid Lab of INFN and the Department of Physics of University of Roma Tre (Rome, Italy) and subsequently also to the European Mediterranean Grid (EUMEDGRID). With this computing power available, it has been possible to simulate the transport of particles originating from the Fukushima Dai-ichi plant site. Using the time series of the sampled concentration data and the assumption that the Fukushima accident was the only source of these radionuclides, it has been possible to estimate the time-dependent source term for the fourteen days following the accident using the atmospheric transport model. A reasonable agreement has been obtained between the modelling results and the estimated radionuclide release rates from the Fukushima accident.
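    The source-term estimation step can be viewed as a linear inversion: the transport model supplies a sensitivity matrix mapping the release rate in each time interval to the sampled concentrations. The sketch below is a minimal illustration under strong simplifying assumptions (a single receptor, one sample per release interval, and causality, so the system is lower-triangular and solvable by forward substitution); the actual study fits many stations worldwide, which requires a least-squares treatment.

    ```python
    def retrieve_release_rates(M, c):
        """Solve M q = c for the release rates q.

        M[i][j] -- modelled concentration at sampling time i per unit release
                   in interval j (zero for j > i: future releases cannot
                   contribute to earlier samples)
        c[i]    -- measured concentration at sampling time i
        """
        n = len(c)
        q = [0.0] * n
        for i in range(n):
            # subtract contributions of all earlier release intervals
            s = sum(M[i][j] * q[j] for j in range(i))
            q[i] = (c[i] - s) / M[i][i]
        return q
    ```

    In practice M is built by running the transport model once per release interval with a unit emission, and negative retrieved rates signal model or measurement error.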

  17. The Fukushima releases: an inverse modelling approach to assess the source term by using gamma dose rate observations

    NASA Astrophysics Data System (ADS)

    Saunier, Olivier; Mathieu, Anne; Didier, Damien; Tombette, Marilyne; Quélo, Denis; Winiarek, Victor; Bocquet, Marc

    2013-04-01

    The Chernobyl nuclear accident and, more recently, the Fukushima accident highlighted that the largest source of error in consequence assessment is the source term estimation, including the time evolution of the release rate and its distribution between radioisotopes. Inverse modelling methods have proved to be efficient for assessing the source term in accident situations (Gudiksen, 1989; Krysta and Bocquet, 2007; Stohl et al., 2011; Winiarek et al., 2012). These methods combine environmental measurements and atmospheric dispersion models, and they have recently been applied to the Fukushima accident. Most existing approaches are designed to use air sampling measurements (Winiarek et al., 2012) and some of them also use deposition measurements (Stohl et al., 2012; Winiarek et al., 2013). During the Fukushima accident, such measurements were far less numerous, and not as well distributed within Japan, as the dose rate measurements. To efficiently document the evolution of the contamination, gamma dose rate measurements were numerous, well distributed within Japan, and offered a high temporal frequency. However, dose rate data are not as easy to use as air sampling measurements, and until now they had not been used in inverse modelling approaches. Indeed, dose rate data result from all the gamma emitters present in the ground and in the atmosphere in the vicinity of the receptor. They do not allow one to determine the isotopic composition or to distinguish the plume contribution from wet deposition. The presented approach proposes a way to use dose rate measurements in an inverse modelling framework without the need for a priori information on emissions. The method proved to be efficient and reliable when applied to the Fukushima accident. The emissions of the 8 main isotopes Xe-133, Cs-134, Cs-136, Cs-137, Ba-137m, I-131, I-132 and Te-132 have been assessed.
The Dai-ichi power plant events (such as ventings, explosions…) known to have caused atmospheric releases are well identified in the retrieved source term, except for the unit 3 explosion, where no measurement was available. Comparisons between simulations of atmospheric dispersion and deposition based on the retrieved source term show a good agreement with environmental observations. Moreover, an important outcome of this study is that the method proved to be perfectly suited to crisis management and should contribute to improving our response in case of a nuclear accident.

  18. The RATIO method for time-resolved Laue crystallography

    PubMed Central

    Coppens, Philip; Pitak, Mateusz; Gembicky, Milan; Messerschmidt, Marc; Scheins, Stephan; Benedict, Jason; Adachi, Shin-ichi; Sato, Tokushi; Nozawa, Shunsuke; Ichiyanagi, Kohei; Chollet, Matthieu; Koshihara, Shin-ya

    2009-01-01

    A RATIO method for analysis of intensity changes in time-resolved pump–probe Laue diffraction experiments is described. The method eliminates the need for scaling the data with a wavelength curve representing the spectral distribution of the source and removes the effect of possible anisotropic absorption. It does not require relative scaling of series of frames and removes errors due to all but very short-term fluctuations in the synchrotron beam. PMID:19240334

  19. Coppicing evaluation of short rotation coppice in the southeast of the U.S. to determine appropriate harvesting methods.

    Treesearch

    Rafael Santiago; Tom Gallagher; Matthew Smidt; Dana Mitchell

    2016-01-01

    Renewable fuels are being tested as an alternative for fossil fuels. For the Southeast U.S., the use of woody biomass has proven to be an excellent source of renewable energy in terms of cost benefit and availability. Short rotation woody crops (SRWC) are timber plantations with exclusive characteristics that can meet the intensive demand for wood due to their fast...

  20. A Survey of Blast Injury across the Full Landscape of Military Science (Etude d’ensemble des blessures dues aux explosions a travers le panorama complet de la science militaire)

    DTIC Science & Technology

    2011-04-01

    Military Science (RTO-MP-HFM-207) Executive Summary Blast injury is a significant source of casualties in current NATO operations. The term “blast...blast toxicology, including dose mechanisms (for example, shock-tube exposure standards), the description of dose endpoints

  1. Hydrometeorological extremes at the Veselí nad Moravou estate (Czech Republic) in the period 1794-1850 derived from documentary evidence of the economic character

    NASA Astrophysics Data System (ADS)

    Chromá, Kateřina

    2010-05-01

    Hydrometeorological extremes have always influenced human activities (agriculture, forestry, water management) and have caused loss of human life and great material damage. Systematic meteorological and hydrological observations in the Czech Lands (the present-day Czech Republic) generally started in the latter half of the 19th century. In order to create long-term series of hydrometeorological extremes, it is therefore necessary to search for other sources of information for the period before 1850. Such direct and indirect information about hydrometeorological extremes is contained in documentary evidence (e.g. chronicles, memoirs, diaries, early visual weather observations, newspapers, economic sources, etc.). Documentary evidence of an economic character is among the most important of these sources, especially documents related to taxation records. Damage to agricultural crops in the fields or to hay on meadows due to hydrological and meteorological phenomena was a valid reason for abatement of tax duty. Archival information about taxation was excerpted from the official correspondence of the Veselí nad Moravou estate (southern Moravia), held in the Moravian Land Archives in Brno. From it, 46 hydrometeorological extremes that occurred between 1794 and 1850 were selected and further analysed. Because the fields and meadows of the estate were located along the Morava River, reports of damage due to floods were the most frequent, followed by damage due to torrential rains and hailstorms.

  2. Observational data on the effects of infection by the copepod Salmincola californiensis on the short- and long-term viability of juvenile Chinook salmon (Oncorhynchus tshawytscha) implanted with telemetry tags

    USGS Publications Warehouse

    Beeman, John W.; Hansen, Amy C.; Sprando, Jamie M.

    2015-01-01

    Infection with Salmincola californiensis is common in juvenile Chinook salmon in western USA reservoirs and may affect the viability of fish used in studies of telemetered animals. Our limited assessment suggests infection by Salmincola californiensis affects the short-term mortality of tagged fish and may affect long-term viability of tagged fish after release; however, the intensity of infection in the sample population did not represent the source population due to the observational nature of the data. We suggest these results warrant further study into the effects of infection by Salmincola californiensis on the results obtained through active telemetry and perhaps other methods requiring handling of infected fish.

  3. Recent Advances in Laplace Transform Analytic Element Method (LT-AEM) Theory and Application to Transient Groundwater Flow

    NASA Astrophysics Data System (ADS)

    Kuhlman, K. L.; Neuman, S. P.

    2006-12-01

    Furman and Neuman (2003) proposed a Laplace Transform Analytic Element Method (LT-AEM) for transient groundwater flow. LT-AEM applies the traditionally steady-state AEM to the Laplace-transformed groundwater flow equation, and back-transforms the resulting solution to the time domain using a Fourier series numerical inverse Laplace transform method (de Hoog et al., 1982). We have extended the method so it can compute hydraulic head and flow velocity distributions due to any two-dimensional combination and arrangement of point, line, circular and elliptical area sinks and sources, nested circular or elliptical regions having different hydraulic properties, and areas of specified head, flux or initial condition. The strengths of all sinks and sources, and the specified head and flux values, can all vary in both space and time in an independent and arbitrary fashion. Initial conditions may vary from one area element to another. A solution is obtained by matching heads and normal fluxes along the boundary of each element. The effect which each element has on the total flow is expressed in terms of generalized Fourier series which converge rapidly (<20 terms) in most cases. As there are more matching points than unknown Fourier terms, the matching is accomplished in Laplace space using least squares. The method is illustrated by calculating the resulting transient heads and flow velocities due to an arrangement of elements in both finite and infinite domains. The 2D LT-AEM elements already developed and implemented are currently being extended to solve the 3D groundwater flow equation.
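    The workflow above hinges on numerical inversion of the Laplace transform. LT-AEM uses the de Hoog Fourier-series method; as a simpler stand-in, the sketch below uses the Gaver-Stehfest algorithm, which likewise recovers f(t) from samples of the transform F(s), though only on the real axis and only for smooth, non-oscillatory solutions. This is an illustrative substitute, not the method of the paper.

    ```python
    import math

    def stehfest_weights(N=12):
        # Gaver-Stehfest coefficients V_k (N must be even)
        half = N // 2
        V = []
        for k in range(1, N + 1):
            s = 0.0
            for j in range((k + 1) // 2, min(k, half) + 1):
                s += (j ** half * math.factorial(2 * j) /
                      (math.factorial(half - j) * math.factorial(j) *
                       math.factorial(j - 1) * math.factorial(k - j) *
                       math.factorial(2 * j - k)))
            V.append((-1) ** (k + half) * s)
        return V

    def invert_laplace(F, t, N=12):
        # f(t) ~ (ln 2 / t) * sum_k V_k * F(k ln 2 / t)
        ln2_t = math.log(2.0) / t
        V = stehfest_weights(N)
        return ln2_t * sum(Vk * F((k + 1) * ln2_t) for k, Vk in enumerate(V))
    ```

    For example, inverting F(s) = 1/(s + 1) at t = 1 recovers e^(-1) to several digits; LT-AEM applies the same idea to the Laplace-space head solution at each point and time of interest.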

  4. On the gravitational potential and field anomalies due to thin mass layers

    NASA Technical Reports Server (NTRS)

    Ockendon, J. R.; Turcotte, D. L.

    1977-01-01

    The gravitational potential and field anomalies for thin mass layers are derived using the technique of matched asymptotic expansions. An inner solution is obtained using an expansion in powers of the thickness and it is shown that the outer solution is given by a surface distribution of mass sources and dipoles. Coefficients are evaluated by matching the inner expansion of the outer solution with the outer expansion of the inner solution. The leading term in the inner expansion for the normal gravitational field gives the Bouguer formula. The leading term in the expansion for the gravitational potential gives an expression for the perturbation to the geoid. The predictions given by this term are compared with measurements by satellite altimetry. The second-order terms in the expansion for the gravitational field are required to predict the gravity anomaly at a continental margin. The results are compared with observations.

  5. Role of large-scale motions to turbulent inertia in turbulent pipe and channel flows

    NASA Astrophysics Data System (ADS)

    Hwang, Jinyul; Lee, Jin; Sung, Hyung Jin

    2015-11-01

    The role of large-scale motions (LSMs) in the turbulent inertia (TI) term (the wall-normal gradient of the Reynolds shear stress) is examined in turbulent pipe and channel flows at Reτ ~ 930. The TI term in the mean momentum equation represents the net force of inertia exerted by the Reynolds shear stress. Although the turbulence statistics characterizing internal turbulent flows are similar close to the wall, the TI term differs in the logarithmic region due to the different characteristics of the LSMs (λx > 3δ). The contribution of the LSMs to the TI term and the Reynolds shear stress in the channel flow is larger than that in the pipe flow. The LSMs in the logarithmic region act like a mean momentum source (where TI > 0) even though the TI profile is negative above the peak of the Reynolds shear stress. The momentum sources carried by the LSMs are related to the low-speed regions elongated in the downstream direction, revealing that momentum-source-like motions occur in the upstream portion of the low-speed structure. The streamwise extent of this structure is relatively long in the channel flow, whereas the high-speed regions on both sides of the low-speed region in the channel flow are shorter and weaker than those in the pipe flow. This work was supported by the Creative Research Initiatives (No. 2015-001828) program of the National Research Foundation of Korea (MSIP) and partially supported by KISTI under the Strategic Supercomputing Support Program.

  6. Detection of a gas flaring signature in the AERONET optical properties of aerosols at a tropical station in West Africa

    NASA Astrophysics Data System (ADS)

    Fawole, Olusegun G.; Cai, Xiaoming; Levine, James G.; Pinker, Rachel T.; MacKenzie, A. R.

    2016-12-01

    The West African region, with its peculiar climate and atmospheric dynamics, is a prominent source of aerosols. Reliable, long-term in situ measurements of aerosol properties are not readily available across the region. In this study, Version 2 Level 1.5 Aerosol Robotic Network (AERONET) data were used to study the absorption and size distribution properties of aerosols from dominant sources identified by trajectory analysis. The trajectory analysis was used to define four sources of aerosols over a 10 year period. Sorting the AERONET aerosol retrievals by these putative sources, the hypothesis that there exists an optically distinct gas flaring signal was tested. The dominance of each source cluster varies with season: desert-dust (DD) and biomass burning (BB) aerosols are dominant in the months prior to the West African Monsoon (WAM); urban (UB) and gas flaring (GF) aerosols are dominant during the WAM months. BB aerosol, with a single scattering albedo (SSA) at 675 nm of 0.86 ± 0.03, and GF aerosol, with an SSA (675 nm) of 0.9 ± 0.07, are the most absorbing of the aerosol categories. The ranges of the Absorption Ångström Exponent (AAE) for the DD, BB, UB and GF classes are 1.99 ± 0.35, 1.45 ± 0.26, 1.21 ± 0.38 and 0.98 ± 0.25, respectively, indicating a different aerosol composition for each source. The AAE (440-870 nm) and Ångström Exponent (AE) (440-870 nm) relationships further show the spread and overlap of these optical and microphysical properties, presumably due in part to similarity in the sources of aerosols and in part to mixing of air parcels from different sources en route to the measurement site.
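    The Ångström exponents quoted above follow from optical depths (absorption optical depths, in the case of the AAE) at two wavelengths via the standard two-wavelength formula. A minimal sketch, with an illustrative function name:

    ```python
    import math

    def angstrom_exponent(tau_1, tau_2, lambda_1, lambda_2):
        # Two-wavelength Angstrom exponent, defined via tau ∝ lambda**(-AE):
        # AE = -ln(tau_1 / tau_2) / ln(lambda_1 / lambda_2)
        return -math.log(tau_1 / tau_2) / math.log(lambda_1 / lambda_2)
    ```

    For instance, an absorption optical depth that halves from 440 nm to 880 nm gives an AAE of exactly 1, the canonical value for pure black carbon, consistent with the GF class value of 0.98 ± 0.25 above.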

  7. Assessment of groundwater exploitation in an aquifer using the random walk on grid method: a case study at Ordos, China

    NASA Astrophysics Data System (ADS)

    Nan, Tongchao; Li, Kaixuan; Wu, Jichun; Yin, Lihe

    2018-04-01

    Sustainability has been one of the key criteria of effective water exploitation. Groundwater exploitation and water-table decline at the Haolebaoji water source site in the Ordos basin in NW China have drawn public attention due to concerns about potential threats to ecosystems and grazing land in the area. To better investigate the impact of the production wells at Haolebaoji on the water table, an adapted algorithm called the random walk on grid method (WOG) is applied to simulate the hydraulic head in the unconfined and confined aquifers. This is the first attempt to apply WOG to a real groundwater problem. The method can evaluate not only the head values but also the contributions made by each source/sink term, so one can analyze the impact of the source/sink terms just as if one had an analytical solution. The head values evaluated by WOG match the values derived from the software Groundwater Modeling System (GMS). This suggests that WOG is effective and applicable in a heterogeneous aquifer for practical problems, and the resulting information is useful for groundwater management.
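    The paper's adapted WOG algorithm, with its per-source/sink bookkeeping, is more involved than can be shown here; the sketch below illustrates only the underlying walk-on-grid idea for a steady-state head with Dirichlet boundaries, using illustrative names and a homogeneous grid.

    ```python
    import random

    def head_by_walk_on_grid(boundary_head, nx, ny, start, n_walks=5000, rng=None):
        """Estimate the steady-state head at interior node `start` of an
        nx-by-ny grid governed by Laplace's equation with Dirichlet
        boundaries. Each walk steps to a uniformly random neighbour until it
        reaches the boundary; the estimate is the mean boundary value hit."""
        rng = rng or random.Random(0)
        total = 0.0
        for _ in range(n_walks):
            x, y = start
            while 0 < x < nx - 1 and 0 < y < ny - 1:
                dx, dy = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
                x, y = x + dx, y + dy
            total += boundary_head(x, y)
        return total / n_walks
    ```

    Because each walk's boundary hit can be tagged with where it terminated, the same machinery attributes the head at a node to individual boundary segments (and, in the adapted method, to source/sink terms), which is what permits the analytical-solution-like decomposition described above.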

  8. Short-term variability of mineral dust, metals and carbon emission from road dust resuspension

    NASA Astrophysics Data System (ADS)

    Amato, Fulvio; Schaap, Martijn; Denier van der Gon, Hugo A. C.; Pandolfi, Marco; Alastuey, Andrés; Keuken, Menno; Querol, Xavier

    2013-08-01

    Particulate matter (PM) pollution has a severe impact on the morbidity and mortality of urban populations. In cities, road dust resuspension contributes largely to PM and airborne heavy metal concentrations. However, the short-term variation of emission through resuspension is not well described in air quality models, hampering a reliable description of air pollution and related health effects. In this study we show experimentally that the emission strength of resuspension varies widely among road dust components/sources. Our results offer the first experimental evidence of different emission rates for mineral dust, heavy metals and carbon fractions due to traffic-induced resuspension. Also, the same component (or source) recovers differently on a road in Barcelona (Spain) and a road in Utrecht (The Netherlands). This finding has important implications for atmospheric pollution modelling, mostly for mineral dust, heavy metals and carbon species. After rain events, recoveries were generally faster in Barcelona than in Utrecht. The largest difference was found for the mineral dust (Al, Si, Ca). Tyre wear particles (organic carbon and zinc) recovered faster than other road dust particles in both cities. The source apportionment of road dust mass provides useful information for air quality management.

  9. Romanian Experience in The Conditioning of Radium Sources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dogaru, Gh.; Dragolici, F.; Rotarescu, Gh.

    2008-07-01

    Ra-226, the first radionuclide separated from pitchblende, in 1898 by Pierre and Marie Curie, was successfully used in medicine, industry and other fields, being the only radionuclide available until the 1940s, when other radionuclides were produced in accelerators. In the long term, the use of Ra-226 sealed sources is no longer safe due to: the high specific activity; the long half-life; decay to Rn-222 gas, which increases the internal pressure of the capsule, leading in time to leakage; the solubility of the salts from which the sealed sources are manufactured; and the lack of information and records on their manufacture and operation. On this basis, the regulatory authority in Romania no longer authorizes the use of these sealed sources [1]. The paper presents some aspects of the Romanian experience related to the collection and conditioning of radium sealed sources. Data on the radium inventory, as well as the arrangements made in order to create a workshop for the conditioning of radium sources, are presented. (authors)

  10. Study of RCR Catalogue Radio Source Integral Spectra

    NASA Astrophysics Data System (ADS)

    Zhelenkova, O. P.; Majorova, E. K.

    2018-04-01

    We present the characteristics of the sources found on the averaged scans of the "Cold" experiment surveys of 1980-1999 in the right-ascension interval 2h < RA < 7h, thereby completing the refinement of the parameters of the RC catalog (RATAN Cold) sources for this interval. To date, the RCR catalog (RATAN Cold Refined) covers the right-ascension interval 2h < RA < 17h and includes 830 sources. Spectra were built for them with the use of new data in the range of 70-230 MHz. The dependence between the spectral indices α0.5 and α3.94 and the integral flux density at the frequencies of 74 and 150 MHz and at 1.4, 3.94 and 4.85 GHz is discussed. We found that at 150 MHz the spectral index α0.5 of most sources gets steeper with increasing flux density. In general, the sources with flat spectra are weaker in terms of flux density than the sources with steep spectra, a difference that is especially pronounced at 150 MHz. We believe that this is due to the brightness of their extended components, which can be determined by the type of accretion and the neighborhood of the source.
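    The spectral indices α0.5 and α3.94 above are two-frequency power-law slopes of the flux density spectrum. A minimal sketch, assuming the convention S ∝ ν^α (the catalogue's sign convention may differ):

    ```python
    import math

    def spectral_index(S1, S2, nu1, nu2):
        # Slope alpha of a power-law spectrum S ∝ nu**alpha through the two
        # measured points (nu1, S1) and (nu2, S2)
        return math.log(S1 / S2) / math.log(nu1 / nu2)
    ```

    A flux density that doubles when the frequency doubles gives α = 1 (an inverted spectrum); "steep spectrum" sources have strongly negative α under this convention.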

  11. Assessment of general public exposure to LTE and RF sources present in an urban environment.

    PubMed

    Joseph, Wout; Verloock, Leen; Goeminne, Francis; Vermeeren, Günter; Martens, Luc

    2010-10-01

    For the first time, in situ electromagnetic field exposure of the general public to fields from long term evolution (LTE) cellular base stations is assessed. Exposure contributions due to different radiofrequency (RF) sources are compared with LTE exposure at 30 locations in Stockholm, Sweden. Total exposures (0.2-2.6 V/m) satisfy the International Commission on Non-Ionizing Radiation Protection (ICNIRP) reference levels (from 28 V/m for frequency modulation (FM), up to 61 V/m for LTE) at all locations. LTE exposure levels up to 0.8 V/m were measured, and the average contribution of the LTE signal to the total RF exposure equals 4%.
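    Combining exposure contributions from several RF sources, as in the measurements above, is commonly done by root-sum-square of the field strengths (since power densities add), with the ICNIRP multi-frequency criterion expressed as a sum of squared exposure quotients. This is a generic sketch of those standard formulas, not the authors' exact procedure.

    ```python
    import math

    def total_field(field_strengths_v_per_m):
        # Power densities add, so field strengths combine as a root-sum-square
        return math.sqrt(sum(e * e for e in field_strengths_v_per_m))

    def exposure_quotient(fields, limits):
        # ICNIRP multi-frequency criterion (thermal regime): the sum of
        # (E_i / reference_level_i)**2 over all sources must not exceed 1
        return sum((e / l) ** 2 for e, l in zip(fields, limits))
    ```

    For example, two sources of 0.3 and 0.4 V/m combine to 0.5 V/m, well below the frequency-dependent reference levels (28-61 V/m) cited above.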

  12. A study of the variable impedance surface concept as a means for reducing noise from jet interaction with deployed lift-augmenting flaps

    NASA Technical Reports Server (NTRS)

    Hayden, R. E.; Kadman, Y.; Chanaud, R. C.

    1972-01-01

    The feasibility of quieting the externally-blown-flap (EBF) noise sources which are due to interaction of jet exhaust flow with deployed flaps was demonstrated on a 1/15-scale 3-flap EBF model. Sound field characteristics were measured and noise reduction fundamentals were reviewed in terms of source models. Tests of the 1/15-scale model showed broadband noise reductions of up to 20 dB resulting from a combination of variable impedance flap treatment and mesh grids placed in the jet flow upstream of the flaps. Steady-state lift, drag, and pitching moment were measured with and without the noise reduction treatment.

  13. Instantaneous and time-averaged dispersion and measurement models for estimation theory applications with elevated point source plumes

    NASA Technical Reports Server (NTRS)

    Diamante, J. M.; Englar, T. S., Jr.; Jazwinski, A. H.

    1977-01-01

    Estimation theory, which originated in guidance and control research, is applied to the analysis of air quality measurements and atmospheric dispersion models to provide reliable area-wide air quality estimates. A method for low dimensional modeling (in terms of the estimation state vector) of the instantaneous and time-average pollutant distributions is discussed. In particular, the fluctuating plume model of Gifford (1959) is extended to provide an expression for the instantaneous concentration due to an elevated point source. Individual models are also developed for all parameters in the instantaneous and the time-average plume equations, including the stochastic properties of the instantaneous fluctuating plume.

  14. JAMSS: proteomics mass spectrometry simulation in Java.

    PubMed

    Smith, Rob; Prince, John T

    2015-03-01

    Countless proteomics data processing algorithms have been proposed, yet few have been critically evaluated due to a lack of labeled data (data with known identities and quantities). Although labeling techniques exist, they are limited in terms of confidence and accuracy. In silico simulators have recently been used to create complex data with known identities and quantities. We propose Java Mass Spectrometry Simulator (JAMSS): a fast, self-contained in silico simulator capable of generating simulated MS and LC-MS runs while providing meta information on the provenance of each generated signal. JAMSS improves upon previous in silico simulators in terms of its ease to install, minimal parameters, graphical user interface, multithreading capability, retention time shift model and reproducibility. The simulator writes its output in mzML 1.1.0 format. It is open source software licensed under the GPLv3. The software and source are available at https://github.com/optimusmoose/JAMSS. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  15. Slicer Method Comparison Using Open-source 3D Printer

    NASA Astrophysics Data System (ADS)

    Ariffin, M. K. A. Mohd; Sukindar, N. A.; Baharudin, B. T. H. T.; Jaafar, C. N. A.; Ismail, M. I. S.

    2018-01-01

    Open-source 3D printers have become one of the popular choices for fabricating 3D models. This technology is easily accessible and low in cost. However, several studies have been made to improve the performance of this low-cost technology in terms of the accuracy of the part finish. This study focuses on the selection of slicer between CuraEngine and Slic3r. The effect of the slicer has been observed in terms of accuracy and surface visualization. The results show that if accuracy is the top priority, CuraEngine is the better option, as it produces more accurate parts and requires less filament than Slic3r. Slic3r may be very useful for complicated parts such as hanging structures, since its excess material acts as support material. The study provides a basic platform for users to decide which option to use in fabricating a 3D model.

  16. A Simulated Environment Experiment on Annoyance Due to Combined Road Traffic and Industrial Noises.

    PubMed

    Marquis-Favre, Catherine; Morel, Julien

    2015-07-21

    Total annoyance due to combined noises is still difficult to predict adequately. This scientific gap is an obstacle for noise action planning, especially in urban areas where inhabitants are usually exposed to high noise levels from multiple sources. In this context, this work aims to highlight the potential to enhance the prediction of total annoyance. The work is based on a simulated environment experiment in which participants performed activities in a living room while exposed to combined road traffic and industrial noises. The first objective of the experiment presented in this paper was to gain further understanding of the effects on annoyance of some acoustical factors, non-acoustical factors and potential interactions between the combined noise sources. The second was to assess total annoyance models constructed from the data collected during the experiment and tested using data gathered in situ. The results obtained in this work highlighted the superiority of perceptual models. In particular, perceptual models with an interaction term seemed to be the best predictors for the two combined noise sources under study, even with high differences in sound pressure level. Thus, these results reinforced the need to focus on perceptual models and to improve the prediction of partial annoyances.

  17. Fish-Eye Observing with Phased Array Radio Telescopes

    NASA Astrophysics Data System (ADS)

    Wijnholds, S. J.

    The radio astronomical community is currently developing and building several new radio telescopes based on phased array technology. These telescopes provide a large field of view that may in principle span a full hemisphere. This makes calibration and imaging very challenging tasks due to the complex source structures and direction-dependent radio wave propagation effects. In this thesis, calibration and imaging methods are developed based on least squares estimation of instrument and source parameters. Monte Carlo simulations and actual observations with several prototypes show that this model-based approach provides statistically and computationally efficient solutions. The error analysis provides a rigorous mathematical framework to assess the imaging performance of current and future radio telescopes in terms of the effective noise, which is the combined effect of propagated calibration errors, noise in the data and source confusion.

  18. Evaluation of the communications impact of a low power arcjet thruster

    NASA Technical Reports Server (NTRS)

    Carney, Lynnette M.

    1988-01-01

    The interaction of a 1 kW arcjet thruster plume with a communications signal is evaluated. A two-parameter, source flow equation has been used to represent the far flow field distribution of the arcjet plume in a realistic spacecraft configuration. Modelling the plume as a plasma slab, the interaction of the plume with a 4 GHz communications signal is then evaluated in terms of signal attenuation and phase shift between transmitting and receiving antennas. Except for propagation paths which pass very near the arcjet source, the impacts to transmission appear to be negligible. The dominant signal loss mechanism is refraction of the beam rather than absorption losses due to collisions. However, significant reflection of the signal at the sharp vacuum-plasma boundary may also occur for propagation paths which pass near the source.

  19. A web-based screening tool for near-port air quality assessments

    PubMed Central

    Isakov, Vlad; Barzyk, Timothy M.; Smith, Elizabeth R.; Arunachalam, Saravanan; Naess, Brian; Venkatram, Akula

    2018-01-01

    The Community model for near-PORT applications (C-PORT) is a screening tool with an intended purpose of calculating differences in annual averaged concentration patterns and relative contributions of various source categories over the spatial domain within about 10 km of the port. C-PORT can inform decision-makers and concerned citizens about local air quality due to mobile source emissions related to commercial port activities. It allows users to visualize and evaluate different planning scenarios, helping them identify the best alternatives for making long-term decisions that protect community health and sustainability. The web-based, easy-to-use interface currently includes data from 21 seaports primarily in the Southeastern U.S., and has a map-based interface based on Google Maps. The tool was developed to visualize and assess changes in air quality due to changes in emissions and/or meteorology in order to analyze development scenarios, and is not intended to support or replace any regulatory models or programs. PMID:29681760

  20. Nonlinear radiated MHD flow of nanoliquids due to a rotating disk with irregular heat source and heat flux condition

    NASA Astrophysics Data System (ADS)

    Mahanthesh, B.; Gireesha, B. J.; Shehzad, S. A.; Rauf, A.; Kumar, P. B. Sampath

    2018-05-01

    This research visualizes the nonlinear radiated flow of a hydromagnetic nanofluid induced by rotation of a disk. The considered nanofluid is a mixture of water and Ti6Al4V or AA7072 nanoparticles. Various shapes of nanoparticles, namely lamina, column, sphere, tetrahedron and hexahedron, are chosen in the analysis. The irregular heat source and nonlinear radiative terms are accounted for in the energy equation. We used a heat flux condition instead of a constant surface temperature condition; the heat flux condition is more realistic and consistent with the physical nature of the problem. The problem is made dimensionless with the help of suitable similarity constraints. The Runge-Kutta-Fehlberg scheme is adopted to find numerical solutions of the governing nonlinear ordinary differential systems. The solutions are plotted for various values of the emerging physical constraints. The effects of the various shapes of nanoparticles are drawn and discussed.

  1. Accuracy-preserving source term quadrature for third-order edge-based discretization

    NASA Astrophysics Data System (ADS)

    Nishikawa, Hiroaki; Liu, Yi

    2017-09-01

    In this paper, we derive a family of source term quadrature formulas for preserving third-order accuracy of the node-centered edge-based discretization for conservation laws with source terms on arbitrary simplex grids. A three-parameter family of source term quadrature formulas is derived, and as a subset, a one-parameter family of economical formulas is identified that does not require second derivatives of the source term. Among the economical formulas, a unique formula is then derived that does not require gradients of the source term at neighbor nodes, thus leading to a significantly smaller discretization stencil for source terms. All the formulas derived in this paper do not require a boundary closure, and therefore can be directly applied at boundary nodes. Numerical results are presented to demonstrate third-order accuracy at interior and boundary nodes for one-dimensional grids and linear triangular/tetrahedral grids over straight and curved geometries.
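    As background, where a source-term quadrature enters a discretization can be seen in a deliberately simple first-order sketch: a finite-volume update for u_t + a u_x = s(x) that pairs an upwind flux difference with a midpoint quadrature of the source over each cell. This illustrates only the generic structure, not the third-order edge-based scheme derived in the paper:

```python
# First-order upwind finite-volume update for u_t + a u_x = s(x)
# with midpoint-rule source quadrature, marched to steady state.
a, n, L = 1.0, 100, 1.0
dx = L / n
x = [(i + 0.5) * dx for i in range(n)]
s = [1.0 for _ in x]          # constant source; midpoint quadrature per cell
u = [0.0] * n
dt = 0.5 * dx / a             # CFL-limited time step
for _ in range(2000):
    new = u[:]
    for i in range(n):
        left = u[i - 1] if i > 0 else 0.0   # inflow boundary u(0) = 0
        new[i] = u[i] - a * dt / dx * (u[i] - left) + dt * s[i]
    u = new
```

At steady state the discrete balance a (u_i - u_{i-1}) / dx = s_i reproduces the exact linear profile u(x) = x at the cell edges; higher-order schemes like the paper's require correspondingly higher-order source quadratures to avoid degrading the flux accuracy.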

  2. Due diligence responsibilities of the professional geologist

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hobbs, G.W.

    1991-03-01

    Whether in the role of independent consultant or company employee, a geologist has certain professional obligations in the evaluation of an oil and gas submittal from a third party. 'Due diligence' is the term used to describe the analysis of an investment opportunity. Due diligence involves a multidisciplinary examination of both the technical and business aspects of a submittal. In addition to the obvious geological considerations, prospect evaluations should include relevant details about the specific technical documentation reviewed, information sources, and how the data were verified. Full disclosure of ownership, technical risks, and negative aspects of the prospect should be included along with the positive elements. After the geological analysis is completed, the economic merits of the prospect should be analyzed, incorporating all lease burdens and terms of participation into the calculations. Estimated exploration, development, and operating costs, together with projected annual production, cash flow, and reserves must be examined as to their reasonableness. Finally, the due diligence review should include a thorough check on the reputation, financial condition, technical and managerial expertise, and prior track record of the operator. Bank, trade, legal, and prior partner references should be contacted. The successful professional geologist in today's competitive world must have multidisciplinary skills. A solid background in geology and geophysics, a basic understanding of the principles of petroleum engineering and economics, and the wits of a private eye are needed for good due diligence work.

  3. INEEL Subregional Conceptual Model Report Volume 3: Summary of Existing Knowledge of Natural and Anthropogenic Influences on the Release of Contaminants to the Subsurface Environment from Waste Source Terms at the INEEL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paul L. Wichlacz

    2003-09-01

    This source-term summary document is intended to describe the current understanding of contaminant source terms and the conceptual model for potential source-term release to the environment at the Idaho National Engineering and Environmental Laboratory (INEEL), as presented in published INEEL reports. The document presents a generalized conceptual model of the sources of contamination and describes the general categories of source terms, primary waste forms, and factors that affect the release of contaminants from the waste form into the vadose zone and Snake River Plain Aquifer. Where the information has previously been published and is readily available, summaries of the inventory of contaminants are also included. Uncertainties that affect the estimation of the source term release are also discussed where they have been identified by the Source Term Technical Advisory Group. Areas in which additional information is needed (i.e., research needs) are also identified.

  4. A comparison between active and passive sensing of soil moisture from vegetated terrains

    NASA Technical Reports Server (NTRS)

    Fung, A. K.; Eom, H. J.

    1985-01-01

    A comparison between active and passive sensing of soil moisture over vegetated areas is studied via scattering models. In active sensing three contributing terms to radar backscattering can be identified: (1) the ground surface scatter term; (2) the volume scatter term representing scattering from the vegetation layer; and (3) the surface volume scatter term accounting for scattering from both surface and volume. In emission three sources of contribution can also be identified: (1) surface emission; (2) upward volume emission from the vegetation layer; and (3) downward volume emission scattered upward by the ground surface. As ground moisture increases, terms (1) and (3) increase due to the increase in permittivity in the active case. However, in passive sensing, term (1) decreases but term (3) increases for the same reason. This self-compensating effect produces a loss in sensitivity to change in ground moisture. Furthermore, emission from vegetation may be larger than that from the ground. Hence, the presence of a vegetation layer causes a much greater loss of sensitivity to passive than active sensing of soil moisture.

  5. A comparison between active and passive sensing of soil moisture from vegetated terrains

    NASA Technical Reports Server (NTRS)

    Fung, A. K.; Eom, H. J.

    1984-01-01

    A comparison between active and passive sensing of soil moisture over vegetated areas is studied via scattering models. In active sensing three contributing terms to radar backscattering can be identified: (1) the ground surface scatter term; (2) the volume scatter term representing scattering from the vegetation layer; and (3) the surface volume scatter term accounting for scattering from both surface and volume. In emission three sources of contribution can also be identified: (1) surface emission; (2) upward volume emission from the vegetation layer; and (3) downward volume emission scattered upward by the ground surface. As ground moisture increases, terms (1) and (3) increase due to the increase in permittivity in the active case. However, in passive sensing, term (1) decreases but term (3) increases for the same reason. This self-compensating effect produces a loss in sensitivity to change in ground moisture. Furthermore, emission from vegetation may be larger than that from the ground. Hence, the presence of a vegetation layer causes a much greater loss of sensitivity to passive than active sensing of soil moisture.

  6. Committing to coal and gas: Long-term contracts, regulation, and fuel switching in power generation

    NASA Astrophysics Data System (ADS)

    Rice, Michael

    Fuel switching in the electricity sector has important economic and environmental consequences. In the United States, the increased supply of gas during the last decade has led to substantial switching in the short term. Fuel switching is constrained, however, by the existing infrastructure. The power generation infrastructure, in turn, represents commitments to specific sources of energy over the long term. This dissertation explores fuel contracts as the link between short-term price response and long-term plant investments. Contracting choices enable power plant investments that are relationship-specific, often regulated, and face uncertainty. Many power plants are subject to both hold-up in investment and cost-of-service regulation. I find that capital bias is robust when considering either irreversibility or hold-up due to the uncertain arrival of an outside option. For sunk capital, the rental rate is inappropriate for determining capital bias. Instead, capital bias depends on the regulated rate of return, discount rate, and depreciation schedule. If policies such as emissions regulations increase fuel-switching flexibility, this can lead to capital bias. Cost-of-service regulation can shorten the duration of a long-term contract. From the firm's perspective, the existing literature provides limited guidance when bargaining and writing contracts for fuel procurement. I develop a stochastic programming framework to optimize long-term contracting decisions under both endogenous and exogenous sources of hold-up risk. These typically include policy changes, price shocks, availability of fuel, and volatility in derived demand. For price risks, the optimal contract duration is the moment when the expected benefits of the contract are just outweighed by the expected opportunity costs of remaining in the contract. I prove that imposing early renegotiation costs decreases contract duration. 
Finally, I provide an empirical approach to show how coal contracts can limit short-term fuel switching in power production. During the era prior to shale gas and electricity market deregulation, I do not find evidence that gas generation substituted for coal in response to fuel price changes. However, I do find evidence that coal plant operations are constrained by fuel contracts. As the min-take commitment to coal increases, changes to annual coal plant output decrease. My conclusions are robust in spite of bias due to the selective reporting of proprietary coal delivery contracts by utilities.

  7. What are the current solutions for interfacing supercritical fluid chromatography and mass spectrometry?

    PubMed

    Guillarme, Davy; Desfontaine, Vincent; Heinisch, Sabine; Veuthey, Jean-Luc

    2018-04-15

    Mass spectrometry (MS) is considered today as one of the most popular detection methods, due to its high selectivity and sensitivity. In particular, this detector has become the gold standard for the analysis of complex mixtures such as biological samples. The first successful SFC-MS hyphenation was reported in the 1980s, and since then, several ionization sources, mass analyzers and interfacing technologies have been combined. Due to the specific physicochemical properties and compressibility of the SFC mobile phase, directing the column effluent into the ionization source is more challenging than in LC. Therefore, some specific interfaces have to be employed in SFC-MS, to i) avoid (or at least limit) analyte precipitation due to CO2 decompression when the SFC mobile phase is no longer under backpressure control, ii) achieve adequate ionization yield, even with a low proportion of MeOH in the mobile phase and iii) preserve the chromatographic integrity (i.e. maintaining retention, selectivity, and efficiency). The goal of this review is to describe the various SFC-MS interfaces and highlight the most favorable ones in terms of reliability, flexibility, sensitivity and user-friendliness. Copyright © 2018 Elsevier B.V. All rights reserved.

  8. Development of a Chemically Reacting Flow Solver on the Graphic Processing Units

    DTIC Science & Technology

    2011-05-10

    been implemented on the GPU by Schive et al. (2010). The outcome of their work is the GAMER code for astrophysical simulation. Thibault and...Euler equations at each cell. For simplification, consider the Euler equations in one dimension with no source terms; the discretized form of the...is known to be more diffusive than the other fluxes due to the large bound of the numerical signal velocities: b+, b-. 3.4 Time Marching Methods

  9. Final Environmental Assessment Addressing Construction of a Fitness Center at Beale Air Force Base, California

    DTIC Science & Technology

    2009-10-01

    adverse impacts on geology and soils would be anticipated due to construction and demolition activities, such as grading, excavation, and...2, during construction and demolition activities would limit adverse impacts on geology and soils. Therefore, no long-term, adverse, direct or...20 99 113 70 70 99 65 70 20 Live Oak Loma Rica Tierra Buena Wheatland Lincoln Linda Marysville Olivehurst South Yuba City Yuba City Source: ESRI

  10. Novel Epitaxy Between Oxides and Semiconductors - Growth and Interfacial Structures

    DTIC Science & Technology

    2007-05-16

    observed to be impressively good. 15. SUBJECT TERMS Nanotechnology, Gallium Nitride 16. SECURITY CLASSIFICATION OF: 17. LIMITATION OF ABSTRACT Same as...with precursors or gases, a high-purity sapphire was employed in this work. E-beam evaporation was used due to the high melting point of sapphire, and...were carried out on a four-circle triple-axis diffractometer, using a 12 kW rotating anode Cu K-alpha source. A pair of graphite crystals is used to

  11. Long-term changes after brief dynamic psychotherapy: symptomatic versus dynamic assessments.

    PubMed

    Høglend, P; Sørlie, T; Sørbye, O; Heyerdahl, O; Amlo, S

    1992-08-01

    Dynamic change in psychotherapy, as measured by theory-related or mode-specific instruments, has been criticized for being too highly intercorrelated with symptomatic change measures. In this study, long-term changes after brief dynamic psychotherapy were studied in 45 moderately disturbed neurotic patients using a reliable outcome battery. The factor structure of all the change variables suggested that they tapped two distinct and stable sources of variance: dynamic and symptomatic change. The categories of overall dynamic change were different from categories of change on the Global Assessment Scale. A small systematic difference was also found between the categories of overall dynamic change and the categories of target complaints change, due to false solutions of dynamic conflicts.

  12. [Personalization in the medicine of the future : Opportunities and risks].

    PubMed

    Malek, N P

    2017-07-01

    Personalized medicine is not a new concept. The renaissance of the term is due to the enormous progress in gene sequencing technology and functional imaging, as well as the development of targeted therapies. Application of these technologies in clinical medicine will necessitate infrastructural as well as organizational and educational changes in the healthcare system. An important change required already in the short term is the introduction of centralized structures, preferably in university clinics, which adopt these innovations and incorporate them into clinical care. Simultaneously, the collation and use of large quantities of relevant data from highly variable sources must be successfully mastered, in order to pave the way for disruptive technologies such as artificial intelligence.

  13. Aerosol characterization over the southeastern United States using high resolution aerosol mass spectrometry: spatial and seasonal variation of aerosol composition, sources, and organic nitrates

    NASA Astrophysics Data System (ADS)

    Xu, L.; Suresh, S.; Guo, H.; Weber, R. J.; Ng, N. L.

    2015-04-01

    We deployed a High-Resolution Time-of-Flight Aerosol Mass Spectrometer (HR-ToF-AMS) and an Aerosol Chemical Speciation Monitor (ACSM) to characterize the chemical composition of submicron non-refractory particles (NR-PM1) in the southeastern US. Measurements were performed in both rural and urban sites in the greater Atlanta area, GA and Centreville, AL for approximately one year, as part of the Southeastern Center for Air Pollution and Epidemiology study (SCAPE) and the Southern Oxidant and Aerosol Study (SOAS). Organic aerosol (OA) accounts for more than half of NR-PM1 mass concentration regardless of sampling sites and seasons. Positive matrix factorization (PMF) analysis of HR-ToF-AMS measurements identified various OA sources, depending on location and season. Hydrocarbon-like OA (HOA) and cooking OA (COA) have important but not dominant contributions to total OA in urban sites. Biomass burning OA (BBOA) concentration shows a distinct seasonal variation with a larger enhancement in winter than summer. We find a good correlation between BBOA and brown carbon, indicating biomass burning is an important source for brown carbon, although an additional, unidentified brown carbon source is likely present at the rural Yorkville site. Isoprene-derived OA (Isoprene-OA) is only deconvolved in warmer months and contributes 18-36% of total OA. The presence of Isoprene-OA factor in urban sites is more likely from local production in the presence of NOx than transport from rural sites. More-oxidized and less-oxidized oxygenated organic aerosol (MO-OOA and LO-OOA, respectively) are dominant fractions (47-79%) of OA in all sites. MO-OOA correlates well with ozone in summer, but not in winter, indicating MO-OOA sources may vary with seasons. LO-OOA, which reaches a daily maximum at night, correlates better with estimated nitrate functionality from organic nitrates than total nitrates. 
Based on the HR-ToF-AMS measurements, we estimate that the nitrate functionality from organic nitrates contributes 63-100% of total measured nitrates in summer. Further, the contribution of organic nitrates to total OA is estimated to be 5-12% in summer, suggesting that organic nitrates are important components in the ambient aerosol in the southeastern US. The spatial distribution of OA is investigated by comparing simultaneous HR-ToF-AMS measurements with ACSM measurements at two different sampling sites. OA is found to be spatially homogeneous in summer, possibly due to stagnant air mass and a dominant amount of regional SOA in the southeastern US. The homogeneity is less in winter, which is likely due to spatial variation of primary emissions. We observed that the seasonality of OA concentration shows a clear urban/rural contrast. While OA exhibits weak seasonal variation in the urban sites, its concentration is higher in summer than winter for rural sites. This observation from our year-long measurements is consistent with 14 years of organic carbon (OC) data from the SouthEastern Aerosol Research and Characterization (SEARCH) network. The comparison between short-term measurements with advanced instruments and long-term measurements of basic air quality indicators not only tests the robustness of the short-term measurements but also provides insights in interpreting long-term measurements. We find that OA factors resolved from PMF analysis on HR-ToF-AMS measurements have distinctly different diurnal variations. The compensation of OA factors with different diurnal trends is one possible reason for the repeatedly observed, relatively flat OA diurnal profile in the southeastern US. In addition, analysis of long-term measurements shows that the correlation between OC and sulfate is substantially higher in summer than winter. 
This seasonality could be partly due to the effects of sulfate on isoprene SOA formation as revealed by the short-term, intensive measurements.
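    Positive matrix factorization, as used in the PMF analysis above, decomposes a (time x m/z) data matrix X into non-negative factor time series W and factor profiles H with X ≈ W H. A minimal analogous sketch using plain Lee-Seung multiplicative updates on synthetic data; real AMS PMF additionally weights residuals by measurement uncertainty, which this sketch omits:

```python
import numpy as np

rng = np.random.default_rng(0)
W_true = rng.uniform(0.1, 1.0, (60, 2))   # synthetic factor time series
H_true = rng.uniform(0.1, 1.0, (2, 20))   # synthetic factor profiles
X = W_true @ H_true                       # exact rank-2 non-negative data

# Random non-negative initialization, then Lee-Seung multiplicative
# updates for the Frobenius-norm objective ||X - W H||.
W = rng.uniform(0.1, 1.0, (60, 2))
H = rng.uniform(0.1, 1.0, (2, 20))
eps = 1e-12
for _ in range(500):
    H *= (W.T @ X) / (W.T @ W @ H + eps)
    W *= (X @ H.T) / (W @ H @ H.T + eps)

residual = np.linalg.norm(X - W @ H) / np.linalg.norm(X)
```

The multiplicative form keeps W and H non-negative at every iteration, which is what lets the recovered factors be interpreted as physical source profiles and concentrations.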

  14. Aerosol characterization over the southeastern United States using high-resolution aerosol mass spectrometry: spatial and seasonal variation of aerosol composition and sources with a focus on organic nitrates

    NASA Astrophysics Data System (ADS)

    Xu, L.; Suresh, S.; Guo, H.; Weber, R. J.; Ng, N. L.

    2015-07-01

    We deployed a High-Resolution Time-of-Flight Aerosol Mass Spectrometer (HR-ToF-AMS) and an Aerosol Chemical Speciation Monitor (ACSM) to characterize the chemical composition of submicron non-refractory particulate matter (NR-PM1) in the southeastern USA. Measurements were performed in both rural and urban sites in the greater Atlanta area, Georgia (GA), and Centreville, Alabama (AL), for approximately 1 year as part of Southeastern Center for Air Pollution and Epidemiology study (SCAPE) and Southern Oxidant and Aerosol Study (SOAS). Organic aerosol (OA) accounts for more than half of NR-PM1 mass concentration regardless of sampling sites and seasons. Positive matrix factorization (PMF) analysis of HR-ToF-AMS measurements identified various OA sources, depending on location and season. Hydrocarbon-like OA (HOA) and cooking OA (COA) have important, but not dominant, contributions to total OA in urban sites (i.e., 21-38 % of total OA depending on site and season). Biomass burning OA (BBOA) concentration shows a distinct seasonal variation with a larger enhancement in winter than summer. We find a good correlation between BBOA and brown carbon, indicating biomass burning is an important source for brown carbon, although an additional, unidentified brown carbon source is likely present at the rural Yorkville site. Isoprene-derived OA factor (isoprene-OA) is only deconvolved in warmer months and contributes 18-36 % of total OA. The presence of isoprene-OA factor in urban sites is more likely from local production in the presence of NOx than transport from rural sites. More-oxidized and less-oxidized oxygenated organic aerosol (MO-OOA and LO-OOA, respectively) are dominant fractions (47-79 %) of OA in all sites. MO-OOA correlates well with ozone in summer but not in winter, indicating MO-OOA sources may vary with seasons. LO-OOA, which reaches a daily maximum at night, correlates better with estimated nitrate functionality from organic nitrates than total nitrates. 
Based on the HR-ToF-AMS measurements, we estimate that the nitrate functionality from organic nitrates contributes 63-100 % to the total measured nitrates in summer. Furthermore, the contribution of organic nitrates to total OA is estimated to be 5-12 % in summer, suggesting that organic nitrates are important components in the ambient aerosol in the southeastern USA. The spatial distribution of OA is investigated by comparing simultaneous HR-ToF-AMS measurements with ACSM measurements at two different sampling sites. OA is found to be spatially homogeneous in summer due possibly to stagnant air mass and a dominant amount of regional secondary organic aerosol (SOA) in the southeastern USA. The homogeneity is less in winter, which is likely due to spatial variation of primary emissions. We observe that the seasonality of OA concentration shows a clear urban/rural contrast. While OA exhibits weak seasonal variation in the urban sites, its concentration is higher in summer than winter for rural sites. This observation from our year-long measurements is consistent with 14 years of organic carbon (OC) data from the SouthEastern Aerosol Research and Characterization (SEARCH) network. The comparison between short-term measurements with advanced instruments and long-term measurements of basic air quality indicators not only tests the robustness of the short-term measurements but also provides insights in interpreting long-term measurements. We find that OA factors resolved from PMF analysis on HR-ToF-AMS measurements have distinctly different diurnal variations. The compensation of OA factors with different diurnal trends is one possible reason for the repeatedly observed, relatively flat OA diurnal profile in the southeastern USA. In addition, analysis of long-term measurements shows that the correlation between OC and sulfate is substantially stronger in summer than winter. 
This seasonality could be partly due to the effects of sulfate on isoprene SOA formation as revealed by the short-term intensive measurements.

  15. Deep Sequencing of RNA from Ancient Maize Kernels

    PubMed Central

    Rasmussen, Morten; Cappellini, Enrico; Romero-Navarro, J. Alberto; Wales, Nathan; Alquezar-Planas, David E.; Penfield, Steven; Brown, Terence A.; Vielle-Calzada, Jean-Philippe; Montiel, Rafael; Jørgensen, Tina; Odegaard, Nancy; Jacobs, Michael; Arriaza, Bernardo; Higham, Thomas F. G.; Ramsey, Christopher Bronk; Willerslev, Eske; Gilbert, M. Thomas P.

    2013-01-01

    The characterization of biomolecules from ancient samples can shed otherwise unobtainable insights into the past. Despite the fundamental role of transcriptomal change in evolution, the potential of ancient RNA remains unexploited – perhaps due to dogma associated with the fragility of RNA. We hypothesize that seeds offer a plausible refuge for long-term RNA survival, due to the fundamental role of RNA during seed germination. Using RNA-Seq on cDNA synthesized from nucleic acid extracts, we validate this hypothesis through demonstration of partial transcriptomal recovery from two sources of ancient maize kernels. The results suggest that ancient seed transcriptomics may offer a powerful new tool with which to study plant domestication. PMID:23326310

  16. Long-term trends in dissolved iron and DOC concentration linked to nitrate depletion in riparian soils

    NASA Astrophysics Data System (ADS)

    Musolff, Andreas; Selle, Benny; Fleckenstein, Jan H.; Oosterwoud, Marieke R.; Tittel, Jörg

    2016-04-01

The instream concentrations of dissolved organic carbon (DOC) are rising in many catchments of the northern hemisphere. Elevated concentrations of DOC, mainly in the form of colored humic components, increase the effort and cost of drinking water purification. In this study, we evaluated a long-term dataset of 110 catchments draining into German drinking water reservoirs in order to assess sources of DOC and drivers of a potential long-term change. The average DOC concentrations across the wide range of catchments were found to be well explained by the catchment's topographic wetness index. Higher wetness indices were connected to higher average DOC concentrations, which implies that catchments with shallow topography and pronounced riparian wetlands mobilize more DOC. Overall, 37% of the investigated catchments showed a significant long-term increase in DOC concentrations, while 22% exhibited significant negative trends. Moreover, we found that increasing trends in DOC were positively correlated to trends in dissolved iron concentrations at pH ≤ 6 due to remobilization of DOC previously sorbed to iron minerals. Increasing trends in both DOC and dissolved iron were found to be connected to decreasing trends and low concentrations of nitrate (below ~6 mg/L). This was especially observed in forested catchments where atmospheric N deposition was the major source of nitrate. In these catchments, we also found long-term increases in phosphate concentrations. Therefore, we argue that dissolved iron, DOC and phosphate were jointly released under iron-reducing conditions when the concentration of nitrate, a competing electron acceptor, was too low to prevent microbial iron reduction. In contrast, we could not explain the observed increasing trends in DOC, iron and phosphate concentrations by the long-term trends of pH, sulfate or precipitation.
Altogether, this study gives strong evidence that both the source of DOC and its long-term increase are primarily controlled by riparian wetland soils within the catchments. Here, the achievement of a long-term reduction in nitrogen deposition may in turn lead to more pronounced iron reduction and a subsequent release of DOC and other iron-bound substances such as phosphate.

  17. Alternatives to an extended Kalman Filter for target image tracking

    NASA Astrophysics Data System (ADS)

    Leuthauser, P. R.

    1981-12-01

Four alternative filters are compared to an extended Kalman filter (EKF) algorithm for tracking a distributed (elliptical) source target in a closed-loop tracking problem, using outputs from a forward-looking infrared (FLIR) sensor as measurements. These were (1) an EKF with a (second order) bias correction term, (2) a constant-gain EKF, (3) a constant-gain EKF with bias correction term, and (4) a statistically linearized filter. Estimates are made of both actual target motion and of apparent motion due to atmospheric jitter. These alternative designs are considered specifically to address some of the significant biases exhibited by an EKF due to initial acquisition difficulties, unmodelled maneuvering by the target, low signal-to-noise ratio, and real-world conditions varying significantly from those assumed in the filter design (robustness). Filter performance was determined with a Monte Carlo study under both ideal and non-ideal conditions for tracking targets on a constant-velocity cross-range path, and during constant-acceleration turns of 5G, 10G, and 20G.
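All four alternatives above are variations on the same predict/update recursion. As a point of reference, a minimal Kalman cycle for an illustrative constant-velocity model can be sketched as follows; the model matrices and noise values are placeholders, not the FLIR tracking model of the thesis (for which a linear measurement makes the EKF reduce to this standard form).

```python
import numpy as np

# Illustrative constant-velocity model (not the thesis' FLIR model):
# state x = [position, velocity], measurement z = position + noise.
F = np.array([[1.0, 1.0],
              [0.0, 1.0]])        # state transition (unit time step)
H = np.array([[1.0, 0.0]])        # measurement model
Q = 0.01 * np.eye(2)              # process noise covariance
R = np.array([[0.25]])            # measurement noise covariance

def kalman_step(x, P, z):
    """One predict/update cycle; for a linear model the EKF is exactly this."""
    # Predict
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update
    S = H @ P_pred @ H.T + R                 # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(2) - K @ H) @ P_pred
    return x_new, P_new

# Track a target moving at constant velocity 1.0 using noiseless measurements.
x, P = np.zeros(2), 10.0 * np.eye(2)
for t in range(1, 21):
    z = np.array([float(t)])                 # true position at time t
    x, P = kalman_step(x, P, z)
```

The bias-correction and constant-gain variants in the thesis modify the gain computation `K`; the recursion itself is unchanged.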

  18. Long-term measurements of submicrometer aerosol chemistry at the Southern Great Plains (SGP) using an Aerosol Chemical Speciation Monitor (ACSM)

    DOE PAGES

    Parworth, Caroline; Tilp, Alison; Fast, Jerome; ...

    2015-04-01

In this study the long-term trends of non-refractory submicrometer aerosol (NR-PM1) composition and mass concentration measured by an Aerosol Chemical Speciation Monitor (ACSM) at the Atmospheric Radiation Measurement (ARM) program's Southern Great Plains (SGP) site are discussed. NR-PM1 data were recorded at ~30 min intervals over a period of 19 months between November 2010 and June 2012. Positive Matrix Factorization (PMF) was performed on the measured organic mass spectral matrix using a rolling window technique to derive factors associated with distinct sources, evolution processes, and physicochemical properties. The rolling window approach also allows us to capture the dynamic variations of the chemical properties in the organic aerosol (OA) factors over time. Three OA factors were obtained, including two oxygenated OA (OOA) factors, differing in degree of oxidation, and a biomass burning OA (BBOA) factor. Back trajectory analyses were performed to investigate possible sources of major NR-PM1 species at the SGP site. Organics dominated NR-PM1 mass concentration for the majority of the study with the exception of winter, when ammonium nitrate increases due to transport of precursor species from surrounding urban and agricultural areas and also due to cooler temperatures. Sulfate mass concentrations have little seasonal variation, with mixed regional and local sources. In the spring, BBOA emissions increase and are mainly associated with local fires. Isoprene and carbon monoxide emission rates were obtained from the Model of Emissions of Gases and Aerosols from Nature (MEGAN) and the 2011 U.S. National Emissions Inventory to represent the spatial distribution of biogenic and anthropogenic sources, respectively. The combined spatial distribution of isoprene emissions and air mass trajectories suggests that biogenic emissions from the southeast contribute to SOA formation at the SGP site during the summer.

  19. The Iterative Reweighted Mixed-Norm Estimate for Spatio-Temporal MEG/EEG Source Reconstruction.

    PubMed

    Strohmeier, Daniel; Bekhti, Yousra; Haueisen, Jens; Gramfort, Alexandre

    2016-10-01

Source imaging based on magnetoencephalography (MEG) and electroencephalography (EEG) allows for the non-invasive analysis of brain activity with high temporal and good spatial resolution. As the bioelectromagnetic inverse problem is ill-posed, constraints are required. For the analysis of evoked brain activity, spatial sparsity of the neuronal activation is a common assumption. It is often taken into account using convex constraints based on the ℓ1-norm. The resulting source estimates are however biased in amplitude and often suboptimal in terms of source selection due to high correlations in the forward model. In this work, we demonstrate that an inverse solver based on a block-separable penalty with a Frobenius norm per block and an ℓ0.5-quasinorm over blocks addresses both of these issues. For solving the resulting non-convex optimization problem, we propose the iterative reweighted Mixed Norm Estimate (irMxNE), an optimization scheme based on iterative reweighted convex surrogate optimization problems, which are solved efficiently using a block coordinate descent scheme and an active set strategy. We compare the proposed sparse imaging method to the dSPM and the RAP-MUSIC approach based on two MEG data sets. We provide empirical evidence based on simulations and analysis of MEG data that the proposed method improves on the standard Mixed Norm Estimate (MxNE) in terms of amplitude bias, support recovery, and stability.
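The full irMxNE solver (block-separable penalties, block coordinate descent, active sets) is beyond a short sketch, but its core idea, minimizing a non-convex ℓ0.5-type penalty through a sequence of reweighted convex least-squares problems, can be illustrated with a classic FOCUSS-style iteration on a toy underdetermined problem. The problem sizes, the random forward matrix, and the plain least-squares inner solve are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative underdetermined "forward model": 8 sensors, 20 candidate sources.
A = rng.standard_normal((8, 20))
x_true = np.zeros(20)
x_true[[2, 7, 15]] = [1.5, -2.0, 1.0]        # sparse ground truth
y = A @ x_true

# Iteratively reweighted minimum-norm estimate promoting an lp quasinorm
# (p = 0.5): each iteration solves a *convex* weighted least-squares problem,
# mirroring the reweighted convex surrogates used by irMxNE, but without the
# spatio-temporal block structure.
p = 0.5
x = np.linalg.pinv(A) @ y                    # start from the min-l2-norm solution
for _ in range(30):
    w = np.abs(x) ** (1.0 - p / 2.0)         # reweighting; small entries shrink
    z, *_ = np.linalg.lstsq(A * w, y, rcond=None)
    x = w * z                                # map back to the original variables
```

In irMxNE the inner problem is instead a weighted Mixed Norm Estimate over source blocks, solved by block coordinate descent with an active-set strategy for speed.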

  20. Luminosity distance in ``Swiss cheese'' cosmology with randomized voids. II. Magnification probability distributions

    NASA Astrophysics Data System (ADS)

    Flanagan, Éanna É.; Kumar, Naresh; Wasserman, Ira; Vanderveld, R. Ali

    2012-01-01

We study the fluctuations in luminosity distances due to gravitational lensing by large scale (≳ 35 Mpc) structures, specifically voids and sheets. We use a simplified “Swiss cheese” model consisting of a ΛCDM Friedmann-Robertson-Walker background in which a number of randomly distributed nonoverlapping spherical regions are replaced by mass-compensating comoving voids, each with a uniform density interior and a thin shell of matter on the surface. We compute the distribution of magnitude shifts using a variant of the method of Holz and Wald, which includes the effect of lensing shear. The standard deviation of this distribution is ~0.027 magnitudes and the mean is ~0.003 magnitudes for voids of radius 35 Mpc, sources at redshift z_s = 1.0, with the voids chosen so that 90% of the mass is on the shell today. The standard deviation varies from 0.005 to 0.06 magnitudes as we vary the void size, source redshift, and fraction of mass on the shells today. If the shell walls are given a finite thickness of ~1 Mpc, the standard deviation is reduced to ~0.013 magnitudes. This standard deviation due to voids is a factor ~3 smaller than that due to galaxy scale structures. We summarize our results in terms of a fitting formula that is accurate to ~20%, and also build a simplified analytic model that reproduces our results to within ~30%. Our model also allows us to explore the domain of validity of weak-lensing theory for voids. We find that for 35 Mpc voids, corrections to the dispersion due to lens-lens coupling are of order ~4%, and corrections due to shear are ~3%. Finally, we estimate the bias due to source-lens clustering in our model to be negligible.
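The magnitude shifts whose distribution is studied above follow from the magnification μ along each line of sight via the standard relation Δm = -2.5 log10(μ). A minimal sketch (the sample magnification values are purely illustrative, not drawn from the Swiss-cheese model):

```python
import math

def magnitude_shift(mu):
    """Apparent-magnitude shift for lensing magnification mu:
    Delta m = -2.5 * log10(mu). A magnified source (mu > 1) appears
    brighter, i.e. its magnitude shift is negative; a demagnified
    (void-dominated) line of sight is dimmed (positive shift)."""
    return -2.5 * math.log10(mu)

shift_void = magnitude_shift(0.99)   # slight demagnification -> dimming
shift_lens = magnitude_shift(1.01)   # slight magnification -> brightening
```

In the weak-lensing regime Δm is nearly linear in the density fluctuation, which is why the dispersion of Δm scales simply with void size and source redshift.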

  1. Assessment of short-term PM2.5-related mortality due to different emission sources in the Yangtze River Delta, China

    NASA Astrophysics Data System (ADS)

    Wang, Jiandong; Wang, Shuxiao; Voorhees, A. Scott; Zhao, Bin; Jang, Carey; Jiang, Jingkun; Fu, Joshua S.; Ding, Dian; Zhu, Yun; Hao, Jiming

    2015-12-01

Air pollution is a major environmental risk to health. In this study, short-term premature mortality due to particulate matter equal to or less than 2.5 μm in aerodynamic diameter (PM2.5) in the Yangtze River Delta (YRD) is estimated using a PC-based human health benefits software tool. The economic loss is assessed using the willingness-to-pay (WTP) method. The contributions of each region, sector and gaseous precursor are also determined by employing the brute-force method. The results show that, in the YRD in 2010, the short-term premature deaths caused by PM2.5 are estimated to be 13,162 (95% confidence interval (CI): 10,761-15,554), while the economic loss is 22.1 (95% CI: 18.1-26.1) billion Chinese Yuan. The industrial and residential sectors contributed the most, accounting for more than 50% of the total economic loss. Emissions of primary PM2.5 and NH3 are major contributors to the health-related loss in winter, while the contribution of gaseous precursors such as SO2 and NOx is higher than primary PM2.5 in summer.
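Health-impact assessments of this kind typically combine a log-linear concentration-response function with a baseline mortality rate and an exposed population. A minimal sketch of that calculation follows; the coefficient, baseline rate and population are illustrative placeholders, not values from this study or from the software it uses.

```python
import math

def excess_deaths(beta, delta_c, baseline_rate, population):
    """Short-term excess mortality from a log-linear concentration-response
    function of the form commonly used in PM2.5 health-impact assessments:
        delta_M = y0 * (1 - exp(-beta * delta_C)) * Pop
    beta:          concentration-response coefficient per (ug/m3), illustrative
    delta_c:       attributable PM2.5 concentration change (ug/m3)
    baseline_rate: baseline daily mortality rate (deaths per person per day)
    population:    exposed population
    """
    return baseline_rate * (1.0 - math.exp(-beta * delta_c)) * population

# Hypothetical example: beta = 4e-4 per ug/m3, 10 ug/m3 attributable PM2.5,
# baseline daily mortality rate 2e-5, population of ten million.
deaths_per_day = excess_deaths(4e-4, 10.0, 2e-5, 1e7)
```

A WTP-style economic loss would then be the estimated deaths multiplied by a monetary value per statistical life, with the CI propagated from the CI of beta.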

  2. Source apportionments of aerosols and their direct radiative forcing and long-term trends over continental United States

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Yang; Wang, Hailong; Smith, Steven J.

Due to US air pollution regulations, aerosol and precursor emissions have decreased during recent decades, while changes in emissions in other regions of the world also influence US aerosol trends through long-range transport. We examine here the relative roles of these domestic and foreign emission changes on aerosol concentrations and direct radiative forcing (DRF) at the top of the atmosphere over the continental US. Long-term (1980-2014) trends and aerosol source apportionment are quantified in this study using a global aerosol-climate model equipped with an explicit aerosol source tagging technique. Due to US emission control policies, the annual mean near-surface concentration of particles, consisting of sulfate, black carbon, and primary organic aerosol, decreases by about –1.1 (±0.1) / –1.4 (±0.1) μg m⁻³ in western US and –3.3 (±0.2) / –2.9 (±0.2) μg m⁻³ in eastern US during 2010–2014, as compared to those in 1980–1984. Meanwhile, decreases in US emissions lead to a warming of +0.48 (±0.03) / +0.46 (±0.03) W m⁻² in western US and +1.41 (±0.07) / +1.32 (±0.09) W m⁻² in eastern US through changes in aerosol DRF. Increases in emissions from East Asia generally have a modest impact on US air quality, but mitigated the warming effect induced by reductions in US emissions by 25% in western US and 7% in eastern US. Thus, as US domestic aerosol and precursor emissions continue to decrease, foreign emissions may become increasingly important to radiative forcing over the US.

  3. Source apportionments of aerosols and their direct radiative forcing and long-term trends over continental United States

    DOE PAGES

    Yang, Yang; Wang, Hailong; Smith, Steven J.; ...

    2018-05-23

Due to US air pollution regulations, aerosol and precursor emissions have decreased during recent decades, while changes in emissions in other regions of the world also influence US aerosol trends through long-range transport. We examine here the relative roles of these domestic and foreign emission changes on aerosol concentrations and direct radiative forcing (DRF) at the top of the atmosphere over the continental US. Long-term (1980-2014) trends and aerosol source apportionment are quantified in this study using a global aerosol-climate model equipped with an explicit aerosol source tagging technique. Due to US emission control policies, the annual mean near-surface concentration of particles, consisting of sulfate, black carbon, and primary organic aerosol, decreases by about –1.1 (±0.1) / –1.4 (±0.1) μg m⁻³ in western US and –3.3 (±0.2) / –2.9 (±0.2) μg m⁻³ in eastern US during 2010–2014, as compared to those in 1980–1984. Meanwhile, decreases in US emissions lead to a warming of +0.48 (±0.03) / +0.46 (±0.03) W m⁻² in western US and +1.41 (±0.07) / +1.32 (±0.09) W m⁻² in eastern US through changes in aerosol DRF. Increases in emissions from East Asia generally have a modest impact on US air quality, but mitigated the warming effect induced by reductions in US emissions by 25% in western US and 7% in eastern US. Thus, as US domestic aerosol and precursor emissions continue to decrease, foreign emissions may become increasingly important to radiative forcing over the US.

  4. Radon Levels Measured at a Touristic Thermal Spa Resort in Montagu (South Africa) and Associated Effective Doses.

    PubMed

    Botha, R; Newman, R T; Maleka, P P

    2016-09-01

Radon activity concentrations (in water and in air) were measured at 13 selected locations at the Avalon Springs thermal spa resort in Montagu (Western Cape, South Africa) to estimate the associated effective dose received by employees and visitors. A RAD-7 detector (DURRIDGE), based on alpha spectrometry, and electret detectors (E-PERM® Radelec) were used for these radon measurements. The primary source of radon was natural thermal water from the hot spring, which was pumped to various locations on the resort, and consequently a range of radon in-water analyses were performed. Radon in-water activity concentration as a function of time (short-term and long-term measurements) and spatial distribution (different bathing pools, etc.) were studied. The mean radon in-water activity concentrations were found to be 205 ± 6 Bq L⁻¹ (source), 112 ± 5 Bq L⁻¹ (outdoor pool) and 79 ± 4 Bq L⁻¹ (indoor pool). Radon in-air activity concentrations were found to range from 33 ± 4 Bq m⁻³ (at the outside bar) to 523 ± 26 Bq m⁻³ (building enclosing the hot spring's source). The most significant potential radiation exposure identified is that due to inhalation of air rich in radon and its progeny by the resort employees. The annual occupational effective dose due to the inhalation of radon progeny ranges from 0.16 ± 0.01 mSv to 0.40 ± 0.02 mSv. For the water samples collected, the Ra in-water activity concentrations were below the lower detection limit (~0.7 Bq L⁻¹) of the γ-ray detector system used. No significant radiological health risk can be associated with radon and progeny from the hot spring at the Avalon Springs resort.
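Effective doses of the kind quoted above are usually obtained from the standard radon-progeny dose model, E = C_Rn × F × T × DCF. The following sketch uses a UNSCEAR-style dose conversion factor and illustrative textbook values for the equilibrium factor and occupancy time; these are not necessarily the parameters used in the paper.

```python
def radon_effective_dose_mSv(c_rn, equilibrium_factor, hours, dcf_mSv=9e-6):
    """Annual effective dose from inhalation of radon progeny:
        E = C_Rn * F * T * DCF
    c_rn:               radon activity concentration in air (Bq/m3)
    equilibrium_factor: F, progeny-to-radon equilibrium (dimensionless)
    hours:              occupancy time per year (h)
    dcf_mSv:            dose conversion factor, ~9 nSv per (Bq h m-3) of
                        equilibrium-equivalent concentration (UNSCEAR-style)
    All parameter values passed below are illustrative."""
    return c_rn * equilibrium_factor * hours * dcf_mSv

# Hypothetical employee: 2000 h/year at 33 Bq/m3 with F = 0.4.
dose = radon_effective_dose_mSv(33.0, 0.4, 2000.0)
```

The spread between the study's lowest and highest occupational doses reflects exactly these inputs: where an employee works (concentration) and for how long (occupancy).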

  5. A novel integrated approach for the hazardous radioactive dust source terms estimation in future nuclear fusion power plants.

    PubMed

    Poggi, L A; Malizia, A; Ciparisse, J F; Gaudio, P

    2016-10-01

An open issue still under investigation by several international entities working in the safety and security field for the foreseen nuclear fusion reactors is the estimation of source terms that are a hazard for operators and the public, and for the machine itself in terms of efficiency and integrity in case of severe accident scenarios. Source term estimation is a crucial safety issue to be addressed in future reactor safety assessments, and the estimates currently available are not sufficiently satisfactory. The lack of neutronic data, along with the insufficiently accurate methodologies used until now, calls for an integrated methodology for source term estimation that can provide predictions with adequate accuracy. This work proposes a complete methodology to estimate dust source terms starting from a broad information gathering. The wide number of parameters that can influence dust source term production is reduced with statistical tools using a combination of screening, sensitivity analysis, and uncertainty analysis. Finally, a preliminary and simplified methodology for predicting dust source term production in future devices is presented.

  6. Statistical analysis of the limitation of half integer resonances on the available momentum acceptance of the High Energy Photon Source

    NASA Astrophysics Data System (ADS)

    Jiao, Yi; Duan, Zhe

    2017-01-01

    In a diffraction-limited storage ring, half integer resonances can have strong effects on the beam dynamics, associated with the large detuning terms from the strong focusing and strong sextupoles as required for an ultralow emittance. In this study, the limitation of half integer resonances on the available momentum acceptance (MA) was statistically analyzed based on one design of the High Energy Photon Source (HEPS). It was found that the probability of MA reduction due to crossing of half integer resonances is closely correlated with the level of beta beats at the nominal tunes, but independent of the error sources. The analysis indicated that for the presented HEPS lattice design, the rms amplitude of beta beats should be kept below 1.5% horizontally and 2.5% vertically to reach a small MA reduction probability of about 1%.

  7. Investigation of organic light emitting diodes for interferometric purposes

    NASA Astrophysics Data System (ADS)

    Pakula, Anna; Zimak, Marzena; Sałbut, Leszek

    2011-05-01

Recently a new type of light source has been introduced to the market. The organic light emitting diode (OLED) is interesting not only because of its low driving voltage, wide light emitting area and emission efficiency; it also offers the possibility to create a light source of various shapes and colors and, in the near future, very likely one whose shape and spectrum can be changed in time in a controlled way. Such possibilities have not been within reach until now. In this paper the authors address the question of whether the new light source, the OLED, is suitable for interferometric purposes. Tests cover short- and long-term spectrum stability and spectrum changes due to emission area selection. The results for two OLEDs (red and white) are shown, together with the result of an attempt to use them in an interferometric setup.

  8. Phase-and-amplitude recovery from a single phase-contrast image using partially spatially coherent x-ray radiation

    NASA Astrophysics Data System (ADS)

    Beltran, Mario A.; Paganin, David M.; Pelliccia, Daniele

    2018-05-01

A simple method of phase-and-amplitude extraction is derived that corrects for image blurring induced by partially spatially coherent incident illumination using only a single intensity image as input. The method is based on Fresnel diffraction theory for the case of high Fresnel number, merged with the space-frequency description formalism used to quantify partially coherent fields, and assumes the object under study is composed of a single material. A priori knowledge of the object’s complex refractive index and information obtained by characterizing the spatial coherence of the source are required. The algorithm was applied to propagation-based phase-contrast data measured with a laboratory-based micro-focus x-ray source. The blurring due to the finite spatial extent of the source is embedded within the algorithm as a simple correction term to the so-called Paganin algorithm and is also numerically stable in the presence of noise.

  9. The Effect of Growth Environment and Salinity on Lipid Production and Composition of Salicornia virginica

    NASA Technical Reports Server (NTRS)

    Bomani, Bilal Mark McDowell; Link, Dirk; Kail, Brian; Morreale, Bryan; Lee, Eric S.; Gigante, Bethany M.; Hendricks, Robert C.

    2014-01-01

    Finding a viable and sustainable source of renewable energy is a global task. Biofuels as a renewable energy source can potentially be a viable option for sustaining long-term energy needs. Biodiesel from halophytes shows great promise due to their ability to serve not only as a fuel source, but a food source as well. Halophytes are one of the few biomass plant species that can tolerate a wide range of saline conditions. We investigate the feasibility of using the halophyte, Salicornia virginica as a biofuel source by conducting a series of experiments utilizing various growth and salinity conditions. The goal is to determine if the saline content of Salicornia virginica in our indoor growth vs outdoor growth conditions has an influence on lipid recovery and total biomass composition. We focused on using standard lipid extraction protocols and characterization methods to evaluate twelve Salicornia virginica samples under six saline values ranging from freshwater to seawater and two growth conditions. The overall goal is to develop an optimal lipid extraction protocol for Salicornia virginica and potentially apply this protocol to halophytes in general.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Almeida, G. L.; Silvani, M. I.; Lopes, R. T.

Two main parameters rule the performance of an image acquisition system, namely spatial resolution and contrast. For radiographic systems using cone beam arrangements, the farther the source, the better the resolution, but the contrast diminishes due to the lower statistics. A closer source would yield a higher contrast, but it would no longer reproduce the attenuation map of the object, as the incoming beam flux would be reduced by unequal, large divergences and attenuation factors. This work proposes a procedure to correct these effects when the object is comprised of a hull - or encased in it - possessing a shape that can be described in analytical geometry terms. Such a description allows the construction of a matrix containing the attenuation factors undergone by the beam from the source until its final destination at each coordinate on the 2D detector. Each matrix element incorporates the attenuation suffered by the beam after its travel through the hull wall, as well as its reduction due to the square of the distance to the source and the angle at which it hits the detector surface. When the pixel intensities of the original image are corrected by these factors, the image contrast, reduced by the overall attenuation in the exposure phase, is recovered, allowing one to see details otherwise concealed due to the low contrast. In order to verify the soundness of this approach, synthetic images of objects of different shapes, such as plates and tubes, incorporating defects and statistical fluctuation, were generated, recorded for further comparison and afterwards processed to improve their contrast. The developed algorithm, which generates, processes and plots the images, has been written in Fortran 90. As the resulting final images exhibit the expected improvements, it seems worthwhile to carry out further tests with actual experimental radiographs.
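The per-pixel correction described, undoing hull attenuation, inverse-square falloff and detector obliquity, can be sketched as below. The original code is Fortran 90; this Python version with a point source on the detector axis and a constant wall path length is only an illustrative simplification (the paper computes the path length analytically from the hull geometry).

```python
import numpy as np

def correction_matrix(nx, ny, pixel, sdd, mu, t_wall):
    """Multiplicative per-pixel correction for a cone-beam radiograph of an
    object enclosed in a hull. Undoes three effects on the measured image:
      (i)   exponential attenuation exp(-mu * t_wall) in the hull wall,
      (ii)  inverse-square falloff with source-to-pixel distance r,
      (iii) oblique incidence (cos theta) on a flat detector.
    Assumptions (illustrative): point source on the central axis at
    source-detector distance sdd; constant wall path length t_wall."""
    ys, xs = np.mgrid[0:ny, 0:nx]
    u = (xs - (nx - 1) / 2.0) * pixel        # detector coordinates
    v = (ys - (ny - 1) / 2.0) * pixel
    r = np.sqrt(u**2 + v**2 + sdd**2)        # source-to-pixel distance
    cos_theta = sdd / r                      # incidence angle on the detector
    # measured ~ exp(-mu*t) * (sdd/r)**2 * cos_theta * ideal  =>  divide it out
    return np.exp(mu * t_wall) * (r / sdd) ** 2 / cos_theta

# Tiny 5x5 detector, 100 mm source-detector distance, mu*t_wall = 0.4.
C = correction_matrix(5, 5, pixel=1.0, sdd=100.0, mu=0.2, t_wall=2.0)
```

Multiplying the measured image by `C` pixel-wise restores the contrast lost during exposure, which is the operation the abstract describes.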

  11. Detection of a Moving Gas Source and Estimation of its Concentration Field with a Sensing Aerial Vehicle Integration of Theoretical Controls and Computational Fluids

    DTIC Science & Technology

    2016-07-21

constants. The model (2.42) is popular for simulation of the UAV motion [60], [61], [62] due to the fact that it models the aircraft response to...inputs to the dynamic model (2.42). The concentration sensors onboard the UAV record concentration (simulated) data according to its spatial location...vehicle dynamics and guidance, and the onboard sensor modeling. 15. SUBJECT TERMS: State estimation; UAVs, mobile sensors; grid adaptation; plume

  12. Deadly Cold: Health Hazards Due to Cold Weather. An Information Paper by the Subcommittee on Health and Long-Term Care of the Select Committee on Aging. House of Representatives, Ninety-Eighth Congress, Second Session (February 1984).

    ERIC Educational Resources Information Center

    Congress of the U.S., Washington, DC. House Select Committee on Aging.

    This paper, on the health hazards of cold weather for elderly persons, presents information from various sources on the death rates in winter throughout the United States. After reviewing the scope of the problem, specific health hazards associated with cold weather are discussed, i.e., hypothermia, fires, carbon monoxide poisoning, and influenza…

  13. Periodic variations in the signal-to-noise ratios of signals received from the ICE spacecraft

    NASA Technical Reports Server (NTRS)

    Nadeau, T.

    1986-01-01

Data from the ICE probe to comet Giacobini-Zinner are analyzed to determine the effects of spacecraft rotation upon the signal-to-noise ratio (SNR) for the two channels of data. In addition, long-term variations from sources other than rotation are considered. Results include a pronounced SNR variation over a period of three seconds (one rotation) and a lesser effect over a two-minute period (possibly due to the receiving antenna conscan).

  14. Evaluating Long-Term Impacts of Soil-Mixing Source-Zone Treatment using Cryogenic Core Collection

    DTIC Science & Technology

    2017-06-01

...encountered due to (a) coring equipment freezing downhole, (b) freezing or binding of the core sample in barrel, and (c) running out of LN in the vicinity of sampling.

  15. On butterfly effect in higher derivative gravities

    NASA Astrophysics Data System (ADS)

    Alishahiha, Mohsen; Davody, Ali; Naseh, Ali; Taghavi, Seyed Farid

    2016-11-01

We study the butterfly effect in D-dimensional gravitational theories containing terms quadratic in the Ricci scalar and Ricci tensor. One observes that due to the higher order derivatives in the corresponding equations of motion there are two butterfly velocities. The velocities are determined by the dimension of the operators whose sources are provided by the metric. The three-dimensional TMG model is also studied, where we get two butterfly velocities at a generic point of the moduli space of parameters. At the critical point the two velocities coincide.

  16. Use of multitemporal InSAR data to develop geohazard scenarios for Bandung, Western Java, Indonesia

    NASA Astrophysics Data System (ADS)

    Salvi, Stefano; Tolomei, Cristiano; Duro, Javier; Pezzo, Giuseppe; Koudogbo, Fifamè

    2015-04-01

The Greater Bandung metropolitan area is the second largest urban area in Indonesia, with a population of 8.6 million. It is subject to a variety of geohazards: volcanic hazards from seven active volcanoes within a radius of 50 km; high flood hazard; seismic hazard due to active crustal faults, the best known being the 30-km-long Lembang fault, 10 km north of the city centre; subsidence hazard due to strong aquifer depletion; and landslide hazard in the surrounding high country. In the framework of the FP7 RASOR project, multitemporal satellite SAR data have been processed over Bandung, Western Java. We used the SBAS InSAR technique (Berardino et al., 2002) to process two ALOS-1 datasets to investigate the various sources of surface deformation acting in the area in the period 2008-2011. Persistent Scatterer Interferometry (PSI) has also been applied to achieve ground motion measurements with millimetric precision and high accuracy. The PSI processing technique considers a system of points that reflect the radar signal from the satellite continuously through time. It makes use of differential interferometric phase measurements to generate long-term terrain deformation and digital surface model maps. The GlobalSAR™ algorithms developed by Altamira Information are applied to COSMO-SkyMed data acquired to measure ground motion over the area of interest. Strong ground displacements (up to 7 cm/yr) due to groundwater abstraction have been measured in the Bandung basin. The identification of long-wavelength signals from tectonic sources is difficult due to the limited InSAR coherence outside of the urban environment. Limited deformation is also observed on the Tangkuban Perahu volcano to the north. The spatial and temporal distribution of the ground motion is important supporting information for the generation of long-term subsidence and flood hazard scenarios.

  17. Modeling the contribution of point sources and non-point sources to Thachin River water pollution.

    PubMed

    Schaffner, Monika; Bader, Hans-Peter; Scheidegger, Ruth

    2009-08-15

Major rivers in developing and emerging countries suffer increasingly from severe degradation of water quality. The current study uses a mathematical Material Flow Analysis (MMFA) as a complementary approach to address the degradation of river water quality due to nutrient pollution in the Thachin River Basin in Central Thailand. This paper gives an overview of the origins and flow paths of the various point- and non-point pollution sources in the Thachin River Basin (in terms of nitrogen and phosphorus) and quantifies their relative importance within the system. The key parameters influencing the main nutrient flows are determined and possible mitigation measures discussed. The results show that aquaculture (as a point source) and rice farming (as a non-point source) are the key nutrient sources in the Thachin River Basin. Other point sources such as pig farms, households and industries, which were previously cited as the most relevant pollution sources in terms of organic pollution, play less significant roles in comparison. This order of importance shifts when considering the model results for the provincial level. Crosschecks with secondary data and field studies confirm the plausibility of our simulations. Specific nutrient loads for the pollution sources are derived; these can be used for a first broad quantification of nutrient pollution in comparable river basins. Based on an identification of the sensitive model parameters, possible mitigation scenarios are determined and their potential to reduce the nutrient load evaluated. A comparison of simulated nutrient loads with measured nutrient concentrations shows that nutrient retention in the river system may be significant. Sedimentation in the slow-flowing surface water network as well as nitrogen emission to the air from the warm, oxygen-deficient waters are certainly partly responsible, but wetlands along the river banks could also play an important role as nutrient sinks.

  18. Helicopter external noise prediction and reduction

    NASA Astrophysics Data System (ADS)

    Lewy, Serge

    Helicopter external noise is a major challenge for manufacturers, in both the civil and military domains. The strongest acoustic sources are due to the main rotor. Two flight conditions are analyzed in detail because the radiated sound is then very loud and very impulsive: (1) high-speed flight, with large thickness and shear terms on the advancing-blade side; and (2) descent flight, with blade-vortex interaction at certain rates of descent. In both cases, computational results were obtained and tests on new blade designs were conducted in wind tunnels. These studies prove that large noise reductions can be achieved. It is shown in conclusion, however, that the other acoustic sources (tail rotor, turboshaft engines) must not be neglected in defining a quiet helicopter.

  19. Performance comparison of single axis tracking and 40° solar panels for sunny weather

    NASA Astrophysics Data System (ADS)

    Chua, Yaw Long; Yong, Yoon Kuang; Koh, Yit Yan

    2017-09-01

    The rapid increase in human population and economic growth has led to a rise in global energy demand. With fossil-fuel-based energy sources rapidly diminishing, renewable energy sources have been introduced due to their availability, especially solar energy, which is sustainable and reliable. This research was conducted to study and compare the efficiency of a single-axis tracking solar panel with a solar panel at a fixed 40° inclination angle under sunny weather conditions. The results indicated that the output generated by a solar panel is directly affected by the angle at which the panel faces the sun. In terms of performance, the single-axis tracking solar panel emerged as the more efficient, generating greater energy.
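    The geometric effect underlying this comparison can be sketched with a toy 2D sun-path model: collected direct irradiance scales with the cosine of the incidence angle, so a tracker that keeps incidence near zero outperforms a fixed tilt over the day. The model below is a simplifying assumption for illustration, not the paper's experimental setup:

```python
import math

# Toy 2D model: sun elevation sweeps 0..180 deg over the day. A fixed panel
# tilted 40 deg has its normal at a fixed elevation of 50 deg; a single-axis
# tracker keeps the incidence angle ~0 all day. Collected direct irradiance
# is taken proportional to cos(incidence), clipped at zero.

PANEL_NORMAL_ELEV = 90.0 - 40.0   # degrees, for the fixed 40-deg tilt

def daily_output(track, steps=1000):
    """Relative daily energy: average cos(incidence) over sun positions."""
    total = 0.0
    for i in range(steps):
        sun_elev = 180.0 * (i + 0.5) / steps      # sunrise (0) to sunset (180)
        inc = 0.0 if track else abs(sun_elev - PANEL_NORMAL_ELEV)
        total += max(0.0, math.cos(math.radians(inc)))
    return total / steps

fixed = daily_output(track=False)
tracked = daily_output(track=True)
gain = tracked / fixed     # tracker's relative daily-energy advantage
```

In this simplified model the tracker collects roughly 1.8x the fixed panel's daily energy; real gains are smaller once diffuse light, weather, and tracker consumption are included.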

  20. Magnetic Reconnection Driven by Thermonuclear Burning

    NASA Astrophysics Data System (ADS)

    Gatto, R.; Coppi, B.

    2017-10-01

    Considering that fusion reaction products (e.g. α-particles) deposit their energy on the electrons, the relevant thermal energy balance equation is characterized by a fusion source term, a relatively large longitudinal thermal conductivity and an appropriate transverse thermal conductivity. Looking for modes that are radially localized around rational surfaces, reconnected field configurations are found that can be sustained by the electron thermal energy source due to fusion reactions. This process can thus be included in the category of endogenous reconnection processes and may be viewed as a form of the thermonuclear instability that can develop in an ignited inhomogeneous plasma. A complete analysis of the equations supporting the relevant theory is reported. Sponsored in part by the U.S. DoE.

  1. Near-Source Shaking and Dynamic Rupture in Plastic Media

    NASA Astrophysics Data System (ADS)

    Gabriel, A.; Mai, P. M.; Dalguer, L. A.; Ampuero, J. P.

    2012-12-01

    Recent well recorded earthquakes show a high degree of complexity at the source level that severely affects the resulting ground motion in near and far-field seismic data. In our study, we focus on investigating source-dominated near-field ground motion features from numerical dynamic rupture simulations in an elasto-visco-plastic bulk. Our aim is to contribute to a more direct connection from theoretical and computational results to field and seismological observations. Previous work showed that a diversity of rupture styles emerges from simulations on faults governed by velocity-and-state-dependent friction with rapid velocity-weakening at high slip rate. For instance, growing pulses lead to re-activation of slip due to gradual stress build-up near the hypocenter, as inferred in some source studies of the 2011 Tohoku-Oki earthquake. Moreover, off-fault energy dissipation implied physical limits on extreme ground motion by limiting peak slip rate and rupture velocity. We investigate characteristic features in near-field strong ground motion generated by dynamic in-plane rupture simulations. We present effects of plasticity on source process signatures, off-fault damage patterns and ground shaking. Independent of rupture style, asymmetric damage patterns across the fault are produced that contribute to the total seismic moment, and even dominantly at high angles between the fault and the maximum principal background stress. The off-fault plastic strain fields induced by transitions between rupture styles reveal characteristic signatures of the mechanical source processes during the transition. Comparing different rupture styles in elastic and elasto-visco-plastic media to identify signatures of off-fault plasticity, we find varying degrees of alteration of near-field radiation due to plastic energy dissipation. Subshear pulses suffer more peak particle velocity reduction due to plasticity than cracks. Supershear ruptures are affected even more. 
The occurrence of multiple rupture fronts affects the seismic potency release rate, amplitude spectra, peak particle velocity distributions and near-field seismograms. Our simulations enable us to trace features of source processes in synthetic seismograms, for example the re-activation of slip. Such physical models may provide starting points for future investigations of field properties of earthquake source mechanisms and natural fault conditions. In the long term, our findings may be helpful for seismic hazard analysis and the improvement of seismic source models.

  2. Survey on the Performance of Source Localization Algorithms.

    PubMed

    Fresno, José Manuel; Robles, Guillermo; Martínez-Tarifa, Juan Manuel; Stewart, Brian G

    2017-11-18

    The localization of emitters using an array of sensors or antennas is a prevalent issue approached in several applications. There exist different techniques for source localization, which can be classified into multilateration, received signal strength (RSS) and proximity methods. The performance of multilateration techniques relies on measured time variables: the time of flight (ToF) of the emission from the emitter to the sensor, the time differences of arrival (TDoA) of the emission between sensors and the pseudo-time of flight (pToF) of the emission to the sensors. The multilateration algorithms presented and compared in this paper can be classified as iterative and non-iterative methods. Both standard least squares (SLS) and hyperbolic least squares (HLS) are iterative and based on the Newton-Raphson technique to solve the non-linear equation system. The metaheuristic technique particle swarm optimization (PSO) used for source localisation is also studied. This optimization technique estimates the source position as the optimum of an objective function based on HLS and is also iterative in nature. Three non-iterative algorithms, namely the hyperbolic positioning algorithms (HPA), the maximum likelihood estimator (MLE) and Bancroft algorithm, are also presented. A non-iterative combined algorithm, MLE-HLS, based on MLE and HLS, is further proposed in this paper. The performance of all algorithms is analysed and compared in terms of accuracy in the localization of the position of the emitter and in terms of computational time. The analysis is also undertaken with three different sensor layouts since the positions of the sensors affect the localization; several source positions are also evaluated to make the comparison more robust. The analysis is carried out using theoretical time differences, as well as including errors due to the effect of digital sampling of the time variables. 
It is shown that the most balanced algorithm, yielding better results than the other algorithms in terms of accuracy and short computational time, is the combined MLE-HLS algorithm.
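    As a rough illustration of the iterative hyperbolic least-squares (HLS) family surveyed here, the sketch below fits the range-difference equations with Gauss-Newton steps on noise-free TDoAs. The sensor layout, source position, and propagation speed are invented for the example, not taken from the paper's test cases:

```python
import math

# Hyperbolic least-squares TDoA localization via Gauss-Newton iterations.
# Residual per sensor i: (d_i - d_0) - c * tdoa_i, minimized over (x, y).

C = 343.0                                  # propagation speed (m/s), assumed
SENSORS = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0), (10.0, 10.0)]
TRUE_SRC = (3.0, 7.0)                      # hypothetical emitter position

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

# Noise-free TDoAs relative to sensor 0 (idealized measurements).
tdoa = [(dist(TRUE_SRC, s) - dist(TRUE_SRC, SENSORS[0])) / C
        for s in SENSORS[1:]]

def hls_gauss_newton(tdoa, guess=(4.0, 6.0), iters=25):
    x, y = guess
    for _ in range(iters):
        d0 = dist((x, y), SENSORS[0])
        JTJ = [[0.0, 0.0], [0.0, 0.0]]
        JTr = [0.0, 0.0]
        for s, t in zip(SENSORS[1:], tdoa):
            di = dist((x, y), s)
            r = (di - d0) - C * t                      # range-difference residual
            gx = (x - s[0]) / di - (x - SENSORS[0][0]) / d0
            gy = (y - s[1]) / di - (y - SENSORS[0][1]) / d0
            JTJ[0][0] += gx * gx; JTJ[0][1] += gx * gy
            JTJ[1][0] += gy * gx; JTJ[1][1] += gy * gy
            JTr[0] += gx * r;     JTr[1] += gy * r
        # Gauss-Newton step: subtract (J^T J)^-1 J^T r (2x2 Cramer's rule).
        det = JTJ[0][0] * JTJ[1][1] - JTJ[0][1] * JTJ[1][0]
        x -= (JTJ[1][1] * JTr[0] - JTJ[0][1] * JTr[1]) / det
        y -= (JTJ[0][0] * JTr[1] - JTJ[1][0] * JTr[0]) / det
    return x, y

est = hls_gauss_newton(tdoa)
```

With digitally sampled (quantized) TDoAs, as studied in the paper, the same iteration converges to a biased estimate, which is exactly the error source the survey quantifies.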

  3. Survey on the Performance of Source Localization Algorithms

    PubMed Central

    2017-01-01

    The localization of emitters using an array of sensors or antennas is a prevalent issue approached in several applications. There exist different techniques for source localization, which can be classified into multilateration, received signal strength (RSS) and proximity methods. The performance of multilateration techniques relies on measured time variables: the time of flight (ToF) of the emission from the emitter to the sensor, the time differences of arrival (TDoA) of the emission between sensors and the pseudo-time of flight (pToF) of the emission to the sensors. The multilateration algorithms presented and compared in this paper can be classified as iterative and non-iterative methods. Both standard least squares (SLS) and hyperbolic least squares (HLS) are iterative and based on the Newton–Raphson technique to solve the non-linear equation system. The metaheuristic technique particle swarm optimization (PSO) used for source localisation is also studied. This optimization technique estimates the source position as the optimum of an objective function based on HLS and is also iterative in nature. Three non-iterative algorithms, namely the hyperbolic positioning algorithms (HPA), the maximum likelihood estimator (MLE) and Bancroft algorithm, are also presented. A non-iterative combined algorithm, MLE-HLS, based on MLE and HLS, is further proposed in this paper. The performance of all algorithms is analysed and compared in terms of accuracy in the localization of the position of the emitter and in terms of computational time. The analysis is also undertaken with three different sensor layouts since the positions of the sensors affect the localization; several source positions are also evaluated to make the comparison more robust. The analysis is carried out using theoretical time differences, as well as including errors due to the effect of digital sampling of the time variables. 
It is shown that the most balanced algorithm, yielding better results than the other algorithms in terms of accuracy and short computational time, is the combined MLE-HLS algorithm. PMID:29156565

  4. Relativistic effects in local inertial frames including parametrized-post-Newtonian effects

    NASA Astrophysics Data System (ADS)

    Shahid-Saless, Bahman; Ashby, Neil

    1988-09-01

    We use the concept of a generalized Fermi frame to describe relativistic effects, due to local and distant sources of gravitation, on a body placed in a local inertial frame of reference. In particular, we have considered a model of two spherically symmetric gravitating point sources moving in circular orbits around a common barycenter, where one of the bodies is chosen to be the local and the other the distant one. This has been done using the slow-motion, weak-field approximation and including four of the parametrized-post-Newtonian (PPN) parameters. The position of the classical center of mass must be modified when the PPN parameter ζ2 is included. We show that the main relativistic effect on a local satellite is described by the Schwarzschild field of the local body and the nonlinear term corresponding to the self-interaction of the local source with itself. There are also much smaller terms that are proportional, respectively, to the product of the potentials of the local and distant bodies and to the distant body's self-interaction. The spatial axes of the local frame undergo geodetic precession. In addition, there is an acceleration of the order of 10⁻¹¹ cm sec⁻² that vanishes in the case of general relativity, which is discussed in detail.

  5. Computation of nonlinear ultrasound fields using a linearized contrast source method.

    PubMed

    Verweij, Martin D; Demi, Libertario; van Dongen, Koen W A

    2013-08-01

    Nonlinear ultrasound is important in medical diagnostics because imaging of the higher harmonics improves resolution and reduces scattering artifacts. Second harmonic imaging is currently standard, and higher harmonic imaging is under investigation. The efficient development of novel imaging modalities and equipment requires accurate simulations of nonlinear wave fields in large volumes of realistic (lossy, inhomogeneous) media. The Iterative Nonlinear Contrast Source (INCS) method has been developed to deal with spatiotemporal domains measuring hundreds of wavelengths and periods. This full wave method considers the nonlinear term of the Westervelt equation as a nonlinear contrast source, and solves the equivalent integral equation via the Neumann iterative solution. Recently, the method has been extended with a contrast source that accounts for spatially varying attenuation. The current paper addresses the problem that the Neumann iterative solution converges badly for strong contrast sources. The remedy is linearization of the nonlinear contrast source, combined with application of more advanced methods for solving the resulting integral equation. Numerical results show that linearization in combination with a Bi-Conjugate Gradient Stabilized method allows the INCS method to deal with fairly strong, inhomogeneous attenuation, while the error due to the linearization can be eliminated by restarting the iterative scheme.
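    The convergence problem addressed here can be illustrated with a toy analogue: for an integral equation written as (I - K)u = b, the Neumann iteration u ← b + Ku converges only when the contrast operator K has spectral radius below 1, whereas the underlying linear system remains solvable by direct or Krylov methods. The 2x2 matrices standing in for the operators below are hypothetical:

```python
# Toy illustration of why fixed-point (Neumann) iteration fails for strong
# contrast sources while the linearized system stays solvable.

def neumann_solve(K, b, iters=60):
    """Fixed-point scheme u <- b + K u for (I - K) u = b."""
    u = [0.0, 0.0]
    for _ in range(iters):
        u = [b[0] + K[0][0] * u[0] + K[0][1] * u[1],
             b[1] + K[1][0] * u[0] + K[1][1] * u[1]]
    return u

def direct_solve(K, b):
    """Solve (I - K) u = b exactly with Cramer's rule."""
    a11, a12 = 1.0 - K[0][0], -K[0][1]
    a21, a22 = -K[1][0], 1.0 - K[1][1]
    det = a11 * a22 - a12 * a21
    return [(b[0] * a22 - a12 * b[1]) / det,
            (a11 * b[1] - b[0] * a21) / det]

b = [1.0, 1.0]
weak = [[0.3, 0.1], [0.0, 0.2]]      # spectral radius < 1: Neumann converges
strong = [[1.5, 0.2], [0.1, 1.1]]    # spectral radius > 1: Neumann diverges

u_weak = neumann_solve(weak, b)
u_exact = direct_solve(weak, b)
diverged = abs(neumann_solve(strong, b)[0]) > 1e6
u_strong_exact = direct_solve(strong, b)   # still perfectly well defined
```

In the INCS setting the role of the direct solve is played by a Krylov method (Bi-CGSTAB) on the linearized contrast-source equation, which, like `direct_solve` here, does not require the spectral radius condition.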

  6. 77 FR 19740 - Water Sources for Long-Term Recirculation Cooling Following a Loss-of-Coolant Accident

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-02

    ... NUCLEAR REGULATORY COMMISSION [NRC-2010-0249] Water Sources for Long-Term Recirculation Cooling... Regulatory Guide (RG) 1.82, ``Water Sources for Long-Term Recirculation Cooling Following a Loss-of-Coolant... regarding the sumps and suppression pools that provide water sources for emergency core cooling, containment...

  7. Biomass burning contributions estimated by synergistic coupling of daily and hourly aerosol composition records.

    PubMed

    Nava, S; Lucarelli, F; Amato, F; Becagli, S; Calzolai, G; Chiari, M; Giannoni, M; Traversi, R; Udisti, R

    2015-04-01

    Biomass burning (BB) is a significant source of particulate matter (PM) in many parts of the world. Whereas numerous studies demonstrate the relevance of BB emissions in central and northern Europe, the quantification of this source has been assessed in only a few cities in southern European countries. In this work, the application of Positive Matrix Factorisation (PMF) allowed a clear identification and quantification of an unexpectedly high biomass burning contribution in Tuscany (central Italy), at the most polluted site of the PATOS project. At this urban background site, BB accounted for 37% of the mass of PM10 (particulate matter with aerodynamic diameter <10 μm) as an annual average, and more than 50% during winter, being the main cause of all the PM10 limit exceedances. Due to the chemical complexity of BB emissions, an accurate assessment of this source contribution is not always easily achievable using just a single tracer. The present work takes advantage of the combination of a long-term daily data-set, characterized by an extended chemical speciation, with a short-term, high time resolution (1-hour), size-segregated data-set obtained by PIXE analyses of streaker samples. The hourly time pattern of the BB source, characterised by a periodic behaviour with peaks starting at about 6 p.m. and lasting all evening and night, and its strong seasonality, with higher values in the winter period, clearly confirmed the hypothesis of a domestic heating source (also excluding important contributions from wildfires and agricultural waste burning). Copyright © 2014 Elsevier B.V. All rights reserved.
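    PMF is closely related to nonnegative matrix factorisation: the sample-by-species concentration matrix is approximated as source contributions times source profiles, both constrained nonnegative. The sketch below uses the classic unweighted multiplicative updates on synthetic two-source data (PMF additionally weights the fit by per-measurement uncertainties, which this simplification omits):

```python
import random

# Unweighted NMF sketch of receptor modeling: V (samples x species) ~ G * F,
# G = nonnegative source contributions, F = nonnegative source profiles.

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def transpose(A):
    return [list(row) for row in zip(*A)]

def nmf(V, n_sources, iters=2000, seed=0):
    rng = random.Random(seed)
    n, m = len(V), len(V[0])
    G = [[rng.uniform(0.1, 1.0) for _ in range(n_sources)] for _ in range(n)]
    F = [[rng.uniform(0.1, 1.0) for _ in range(m)] for _ in range(n_sources)]
    for _ in range(iters):
        Gt = transpose(G)
        num, den = matmul(Gt, V), matmul(matmul(Gt, G), F)
        F = [[F[k][j] * num[k][j] / (den[k][j] + 1e-12) for j in range(m)]
             for k in range(n_sources)]
        Ft = transpose(F)
        num, den = matmul(V, Ft), matmul(G, matmul(F, Ft))
        G = [[G[i][k] * num[i][k] / (den[i][k] + 1e-12)
              for k in range(n_sources)] for i in range(n)]
    return G, F

# Synthetic rank-2 data: e.g. a "biomass burning" and a "traffic" profile.
F_true = [[5.0, 1.0, 0.2], [0.5, 2.0, 4.0]]
G_true = [[1.0, 0.1], [0.8, 0.5], [0.2, 1.0], [0.1, 0.9]]
V = matmul(G_true, F_true)

G, F = nmf(V, n_sources=2)
recon = matmul(G, F)
err = max(abs(recon[i][j] - V[i][j]) for i in range(4) for j in range(3))
```

Identifying a recovered factor with a physical source (here, domestic wood burning) is then done by inspecting its profile for tracer species and its time pattern, as in the paper.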

  8. Multi-decadal Dynamics of Mercury in a Complex Ecosystem

    NASA Astrophysics Data System (ADS)

    Levin, L.

    2016-12-01

    A suite of air quality and watershed models was applied to track the ecosystem contributions of mercury (Hg), as well as arsenic (As) and selenium (Se), from local and global sources to the San Juan River basin in the Four Corners region of the American Southwest. Long-term changes in surface water and fish tissue mercury concentrations were also simulated, out to the year 2074. Atmospheric mercury was modeled using a nested, spatial-scale modeling system comprising the GEOS-Chem (global scale) and CMAQ-APT (national and regional) models. Four emission scenarios were modeled, including two growth scenarios for Asian mercury emissions. Results showed that the average mercury deposition over the San Juan basin was 21 µg/m2-y. Source contributions to mercury deposition range from 2% to 9% of total deposition prior to post-2016 U.S. controls for air toxics regulatory compliance. Most of the contributions to mercury deposition in the basin are from non-U.S. sources. Watershed simulations showed power plant contributions to fish tissue mercury never exceeded 0.035% during the 85-year model simulation period, even with the long-term growth in fish tissue mercury over that period. Local coal-fired power plants contributed relatively small fractions to mercury deposition (less than 4%) in the basin; background and non-U.S. anthropogenic sources dominated. Fish-tissue mercury levels are projected to increase through 2074 due to growth projections for non-U.S. emission sources. The most important contributor to methylmercury in the lower reaches of the watershed was advection of MeHg produced in situ at upstream headwater locations.

  9. Routes to short-term memory indexing: lessons from deaf native users of American Sign Language.

    PubMed

    Hirshorn, Elizabeth A; Fernandez, Nina M; Bavelier, Daphne

    2012-01-01

    Models of working memory (WM) have been instrumental in understanding foundational cognitive processes and sources of individual differences. However, current models cannot conclusively explain the consistent group differences between deaf signers and hearing speakers on a number of short-term memory (STM) tasks. Here we take the perspective that these results are not due to a temporal order-processing deficit in deaf individuals, but rather reflect different biases in how different types of memory cues are used to do a given task. We further argue that the main driving force behind the shifts in relative biasing is a consequence of language modality (sign vs. speech) and the processing they afford, and not deafness, per se.

  10. Routes to short term memory indexing: Lessons from deaf native users of American Sign Language

    PubMed Central

    Hirshorn, Elizabeth A.; Fernandez, Nina M.; Bavelier, Daphne

    2012-01-01

    Models of working memory (WM) have been instrumental in understanding foundational cognitive processes and sources of individual differences. However, current models cannot conclusively explain the consistent group differences between deaf signers and hearing speakers on a number of short-term memory (STM) tasks. Here we take the perspective that these results are not due to a temporal order-processing deficit in deaf individuals, but rather reflect different biases in how different types of memory cues are used to do a given task. We further argue that the main driving force behind the shifts in relative biasing is a consequence of language modality (sign vs. speech) and the processing they afford, and not deafness, per se. PMID:22871205

  11. A Simulated Environment Experiment on Annoyance Due to Combined Road Traffic and Industrial Noises

    PubMed Central

    Marquis-Favre, Catherine; Morel, Julien

    2015-01-01

    Total annoyance due to combined noises is still difficult to predict adequately. This scientific gap is an obstacle for noise action planning, especially in urban areas where inhabitants are usually exposed to high noise levels from multiple sources. In this context, this work aims to highlight potential to enhance the prediction of total annoyance. The work is based on a simulated environment experiment where participants performed activities in a living room while exposed to combined road traffic and industrial noises. The first objective of the experiment presented in this paper was to gain further understanding of the effects on annoyance of some acoustical factors, non-acoustical factors and potential interactions between the combined noise sources. The second one was to assess total annoyance models constructed from the data collected during the experiment and tested using data gathered in situ. The results obtained in this work highlighted the superiority of perceptual models. In particular, perceptual models with an interaction term seemed to be the best predictors for the two combined noise sources under study, even with high differences in sound pressure level. Thus, these results reinforced the need to focus on perceptual models and to improve the prediction of partial annoyances. PMID:26197326

  12. Local SPTHA through tsunami inundation simulations: a test case for two coastal critical infrastructures in the Mediterranean

    NASA Astrophysics Data System (ADS)

    Volpe, M.; Selva, J.; Tonini, R.; Romano, F.; Lorito, S.; Brizuela, B.; Argyroudis, S.; Salzano, E.; Piatanesi, A.

    2016-12-01

    Seismic Probabilistic Tsunami Hazard Analysis (SPTHA) is a methodology to assess the exceedance probability for different thresholds of tsunami hazard intensity, at a specific site or region in a given time period, due to a seismic source. A large number of high-resolution inundation simulations is typically required to take into account the full variability of potential seismic sources and their slip distributions. Starting from regional SPTHA offshore results, the computational cost can be reduced by considering only a subset of 'important' scenarios for inundation calculations. Here we use a method based on an event tree for the treatment of the seismic source aleatory variability; a cluster analysis on the offshore results to define the important sources; and epistemic uncertainty treatment through an ensemble modeling approach. We consider two target sites in the Mediterranean (Milazzo, Italy, and Thessaloniki, Greece) where coastal (non-nuclear) critical infrastructures (CIs) are located. After performing a regional SPTHA covering the whole Mediterranean, for each target site a few hundred representative scenarios are filtered out of all the potential seismic sources and the tsunami inundation is explicitly modeled, obtaining a site-specific SPTHA with a complete characterization of the tsunami hazard in terms of flow depth and velocity time histories. Moreover, we also explore the variability of SPTHA at the target site accounting for coseismic deformation (i.e. uplift or subsidence) due to near-field sources located in very shallow water. The results are suitable for, and will be applied to, subsequent multi-hazard risk analysis for the CIs. These applications have been developed in the framework of the Italian Flagship Project RITMARE, the EC FP7 ASTARTE (Grant agreement 603839) and STREST (Grant agreement 603389) projects, and the INGV-DPC Agreement.

  13. SISSY: An efficient and automatic algorithm for the analysis of EEG sources based on structured sparsity.

    PubMed

    Becker, H; Albera, L; Comon, P; Nunes, J-C; Gribonval, R; Fleureau, J; Guillotel, P; Merlet, I

    2017-08-15

    Over the past decades, a multitude of different brain source imaging algorithms have been developed to identify the neural generators underlying the surface electroencephalography measurements. While most of these techniques focus on determining the source positions, only a small number of recently developed algorithms provides an indication of the spatial extent of the distributed sources. In a recent comparison of brain source imaging approaches, the VB-SCCD algorithm has been shown to be one of the most promising algorithms among these methods. However, this technique suffers from several problems: it leads to amplitude-biased source estimates, it has difficulties in separating close sources, and it has a high computational complexity due to its implementation using second order cone programming. To overcome these problems, we propose to include an additional regularization term that imposes sparsity in the original source domain and to solve the resulting optimization problem using the alternating direction method of multipliers. Furthermore, we show that the algorithm yields more robust solutions by taking into account the temporal structure of the data. We also propose a new method to automatically threshold the estimated source distribution, which permits delineation of the active brain regions. The new algorithm, called Source Imaging based on Structured Sparsity (SISSY), is analyzed by means of realistic computer simulations and is validated on the clinical data of four patients. Copyright © 2017 Elsevier Inc. All rights reserved.
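    The core numerical step named above, solving a sparsity-regularised least-squares problem with the alternating direction method of multipliers (ADMM), can be sketched on a generic lasso problem. This is not the SISSY cost function itself (which adds a structured, total-variation-like spatial term); the data are invented for the example:

```python
# ADMM for the lasso: minimize 0.5*||Ax - b||^2 + lam*||z||_1  s.t.  x = z.
# x-update: solve (A^T A + rho*I) x = A^T b + rho*(z - u)  (2x2 closed form)
# z-update: elementwise soft-thresholding; u-update: dual ascent.

def admm_lasso(A, b, lam, rho=1.0, iters=200):
    m, n = len(A), len(A[0])           # n == 2 here, for the closed-form solve
    AtA = [[sum(A[k][i] * A[k][j] for k in range(m)) for j in range(n)]
           for i in range(n)]
    Atb = [sum(A[k][i] * b[k] for k in range(m)) for i in range(n)]
    P = [[AtA[0][0] + rho, AtA[0][1]], [AtA[1][0], AtA[1][1] + rho]]
    det = P[0][0] * P[1][1] - P[0][1] * P[1][0]
    Pinv = [[P[1][1] / det, -P[0][1] / det],
            [-P[1][0] / det, P[0][0] / det]]
    soft = lambda v, t: max(v - t, 0.0) + min(v + t, 0.0)   # soft threshold
    x, z, u = [0.0] * n, [0.0] * n, [0.0] * n
    for _ in range(iters):
        q = [Atb[i] + rho * (z[i] - u[i]) for i in range(n)]
        x = [Pinv[i][0] * q[0] + Pinv[i][1] * q[1] for i in range(n)]
        z = [soft(x[i] + u[i], lam / rho) for i in range(n)]
        u = [u[i] + x[i] - z[i] for i in range(n)]
    return z                           # z carries the exactly-sparse estimate

# Data consistent with a sparse ground truth x = (2, 0).
A = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
b = [2.0, 0.0, 2.0]
x_hat = admm_lasso(A, b, lam=0.1)
```

The soft-thresholding in the z-update is what zeroes out inactive coefficients exactly, the same mechanism that lets sparsity-regularised source imaging delineate active regions.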

  14. Guiding optimal biofuels:

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paap, Scott M.; West, Todd H.; Manley, Dawn Kataoka

    2013-01-01

    In the current study, processes to produce either ethanol or a representative fatty acid ethyl ester (FAEE) via the fermentation of sugars liberated from lignocellulosic materials pretreated in acid or alkaline environments are analyzed in terms of economic and environmental metrics. Simplified process models are introduced and employed to estimate process performance, and Monte Carlo analyses were carried out to identify key sources of uncertainty and variability. We find that the near-term performance of processes to produce FAEE is significantly worse than that of ethanol production processes for all metrics considered, primarily due to poor fermentation yields and higher electricity demands for aerobic fermentation. In the longer term, the reduced cost and energy requirements of FAEE separation processes will be at least partially offset by inherent limitations in the relevant metabolic pathways that constrain the maximum yield potential of FAEE from biomass-derived sugars.
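    A Monte Carlo uncertainty analysis of the kind mentioned can be sketched as follows. The toy cost model, the distributions, and every parameter value are invented for illustration, not taken from the study:

```python
import random

# Monte Carlo propagation of uncertain inputs (fermentation yield,
# electricity demand) through a toy cost-per-liter model.

def unit_cost(yield_frac, elec_kwh_per_l, feedstock_cost=0.5, elec_price=0.1):
    # Hypothetical $/liter: feedstock cost scaled by yield, plus power cost.
    return feedstock_cost / yield_frac + elec_kwh_per_l * elec_price

def monte_carlo(n=20000, seed=42):
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        y = rng.uniform(0.25, 0.45)      # uncertain fermentation yield
        e = max(rng.gauss(2.0, 0.3), 0)  # uncertain aeration electricity
        samples.append(unit_cost(y, e))
    samples.sort()
    mean = sum(samples) / n
    p5, p95 = samples[int(0.05 * n)], samples[int(0.95 * n)]
    return mean, (p5, p95)

mean_cost, (low, high) = monte_carlo()
```

Repeating the sampling while freezing one input at a time (a simple sensitivity screen) identifies which uncertain parameter, here yield versus electricity demand, dominates the spread, which is how such analyses flag the key sources of uncertainty.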

  15. A step towards ending the isolation of behavior analysis: A common language with evolutionary science

    PubMed Central

    Brown, J. F.; Hendy, Steve

    2001-01-01

    In spite of repeated efforts to explain itself to a wider audience, behavior analysis remains a largely misunderstood and isolated discipline. In this article we argue that this situation is in part due to the terms we use in our technical discussions. In particular, reinforcement and punishment, with their vernacular associations of reward and retribution, are a source of much misunderstanding. Although contemporary thinking within behavior analysis holds that reinforcement and punishment are Darwinian processes whereby behavioral variants are selected and deselected by their consequences, the continued use of the terms reinforcement and punishment to account for behavioral evolution obscures this fact. To clarify and simplify matters, we propose replacing the terms reinforcement and punishment with selection and deselection, respectively. These changes would provide a terminological meeting point with other selectionist sciences, thereby increasing the likelihood that behavior analysis will contribute to Darwinian science. PMID:22478361

  16. Long-term dataset on aquatic responses to concurrent climate change and recovery from acidification

    NASA Astrophysics Data System (ADS)

    Leach, Taylor H.; Winslow, Luke A.; Acker, Frank W.; Bloomfield, Jay A.; Boylen, Charles W.; Bukaveckas, Paul A.; Charles, Donald F.; Daniels, Robert A.; Driscoll, Charles T.; Eichler, Lawrence W.; Farrell, Jeremy L.; Funk, Clara S.; Goodrich, Christine A.; Michelena, Toby M.; Nierzwicki-Bauer, Sandra A.; Roy, Karen M.; Shaw, William H.; Sutherland, James W.; Swinton, Mark W.; Winkler, David A.; Rose, Kevin C.

    2018-04-01

    Concurrent regional and global environmental changes are affecting freshwater ecosystems. Decadal-scale data on lake ecosystems that can describe processes affected by these changes are important as multiple stressors often interact to alter the trajectory of key ecological phenomena in complex ways. Due to the practical challenges associated with long-term data collections, the majority of existing long-term data sets focus on only a small number of lakes or few response variables. Here we present physical, chemical, and biological data from 28 lakes in the Adirondack Mountains of northern New York State. These data span the period from 1994-2012 and harmonize multiple open and as-yet unpublished data sources. The dataset creation is reproducible and transparent; R code and all original files used to create the dataset are provided in an appendix. This dataset will be useful for examining ecological change in lakes undergoing multiple stressors.

  17. Study of the surface contamination of copper with the improved positron annihilation-induced Auger electron spectrometer at NEPOMUC

    NASA Astrophysics Data System (ADS)

    Mayer, J.; Hugenschmidt, C.; Schreckenbach, K.

    2008-10-01

    The high intensity positron source NEPOMUC at the FRM-II in Munich enables measurement times for positron annihilation-induced Auger electron spectroscopy (PAES) of only 2.4 h/spectrum, in contrast to usual lab beams with measurement times of up to several days. The high electron background due to surrounding experiments in the experimental hall of the FRM-II has been eliminated, and hence background-free experiments have become possible. As a result, the signal-to-noise ratio has been enhanced to 4.5:1, compared to 1:3 with EAES. In addition, a long-term measurement has been performed in order to observe the contamination of a polycrystalline copper foil at 150 °C.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ammann, H.M.; Bradow, F.; Fennell, D.

    Hydrogen sulfide is a highly toxic gas which is immediately lethal in concentrations greater than 2000 ppm. The toxic end-point is due to anoxia of brain and heart tissues, which results from its interaction with the cellular enzyme cytochrome oxidase. Inhibition of the enzyme halts oxidative metabolism, which is the primary energy source for cells. A second toxic end-point is the irritative effect of hydrogen sulfide on mucous membranes, particularly edema at sublethal doses (250 to 500 ppm) in which sufficient exposure occurs before consciousness is lost. Recovered victims of exposure report neurologic symptoms such as headache, fatigue, irritability, vertigo, and loss of libido. Long-term effects are similar to those caused by anoxia due to other toxic agents like CO, and probably are not due to specific H2S effects. H2S is not a cumulative poison. No mutagenic, carcinogenic, reproductive, or teratogenic effects have been reported in the literature.

  19. Clay mineralogy, strontium and neodymium isotope ratios in the sediments of two High Arctic catchments (Svalbard)

    NASA Astrophysics Data System (ADS)

    Hindshaw, Ruth S.; Tosca, Nicholas J.; Piotrowski, Alexander M.; Tipper, Edward T.

    2018-03-01

    The identification of sediment sources to the ocean is a prerequisite to using marine sediment cores to extract information on past climate and ocean circulation. Sr and Nd isotopes are classical tools with which to trace source provenance. Despite considerable interest in the Arctic Ocean, the circum-Arctic source regions are poorly characterised in terms of their Sr and Nd isotopic compositions. In this study we present Sr and Nd isotope data from the Paleogene Central Basin sediments of Svalbard, including the first published data of stream suspended sediments from Svalbard. The stream suspended sediments exhibit considerable isotopic variation (ɛNd = -20.6 to -13.4; 87Sr / 86Sr = 0.73421 to 0.74704) which can be related to the depositional history of the sedimentary formations from which they are derived. In combination with analysis of the clay mineralogy of catchment rocks and sediments, we suggest that the Central Basin sedimentary rocks were derived from two sources. One source is Proterozoic sediments derived from Greenlandic basement rocks which are rich in illite and have high 87Sr / 86Sr and low ɛNd values. The second source is Carboniferous to Jurassic sediments derived from Siberian basalts which are rich in smectite and have low 87Sr / 86Sr and high ɛNd values. Due to a change in depositional conditions throughout the Paleogene (from deep sea to continental) the relative proportions of these two sources vary in the Central Basin formations. The modern stream suspended sediment isotopic composition is then controlled by modern processes, in particular glaciation, which determines the present-day exposure of the formations and therefore the relative contribution of each formation to the stream suspended sediment load. 
This study demonstrates that the Nd isotopic composition of stream suspended sediments exhibits seasonal variation, which likely mirrors longer-term hydrological changes, with implications for source provenance studies based on fixed end-members through time.
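The two-source interpretation above lends itself to a simple mass-balance check. The sketch below inverts a concentration-weighted two-end-member mixing equation for εNd to recover the mass fraction of the Greenlandic-derived end-member; the end-member εNd values are taken from the range quoted in the abstract, while the Nd concentrations are purely illustrative assumptions.

```python
# Hypothetical two-end-member mixing sketch: estimates the mass fraction of the
# illite-rich (Greenlandic-derived) source in a stream sediment from its Nd
# isotope composition. End-member epsilon-Nd values come from the ranges quoted
# above; the Nd concentrations are illustrative assumptions.

def mixing_ratio(eps_mix, eps_a, eps_b, nd_a, nd_b):
    """Mass fraction of end-member A in a two-component isotope mixture.

    eps_mix      : measured epsilon-Nd of the sediment
    eps_a, eps_b : end-member epsilon-Nd values
    nd_a, nd_b   : end-member Nd concentrations (ppm), weighting the mixture
    """
    # eps_mix = (f*nd_a*eps_a + (1-f)*nd_b*eps_b) / (f*nd_a + (1-f)*nd_b),
    # solved for f:
    num = nd_b * (eps_mix - eps_b)
    den = nd_a * (eps_a - eps_mix) + nd_b * (eps_mix - eps_b)
    return num / den

# Illustrative end-members: Greenlandic-derived (A) vs. basalt-derived (B).
f = mixing_ratio(eps_mix=-17.0, eps_a=-20.6, eps_b=-13.4, nd_a=30.0, nd_b=30.0)
print(f"fraction of end-member A: {f:.2f}")  # equal Nd contents -> 0.50
```

With equal Nd concentrations the relation collapses to a linear mixing line, which is why a measured εNd exactly midway between the end-members yields a 50:50 mixture.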

  20. The HEMP QSO Monitoring Project

    NASA Astrophysics Data System (ADS)

    Welsh, William F.; Robinson, E. L.

    2000-02-01

    Many AGN are highly variable sources. Some of these show a pronounced time delay between variations seen in their optical continuum and in their emission lines. "Echo mapping" is a technique that uses these time delays to measure the geometry and kinematics of the gas inside the AGN, near the supermassive black hole. The technique is immensely powerful, but the results so far have been modest due to relatively low-quality data. We have initiated a long-term project to echo map QSOs. We will examine nearby (but intrinsically faint) QSOs as well as QSOs at high redshift. The high-z QSOs present a problem: it is not known ahead of time which of these are variable sources. Thus we have started a campaign to monitor about 60 high-redshift QSOs for the purpose of determining their variability characteristics. We request SSTO time on the 0.9m telescope for long-term monitoring of high-redshift QSOs to: (i) test their suitability as viable echo mapping candidates; and (ii) measure (for the first time) their variability properties, which is of intrinsic value in itself.

  1. Analysis of the SPS Long Term Orbit Drifts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Velotti, Francesco; Bracco, Chiara; Cornelis, Karel

    2016-06-01

    The Super Proton Synchrotron (SPS) is the last accelerator in the Large Hadron Collider (LHC) injector chain, and has to deliver the two high-intensity 450 GeV proton beams to the LHC. The transport from the SPS to the LHC is done through two Transfer Lines (TL), TI2 and TI8, for Beam 1 (B1) and Beam 2 (B2) respectively. During the first LHC operation period, Run 1, a long-term drift of the SPS orbit was observed, causing changes at LHC injection due to the resulting changes in the TL trajectories. This translated into a longer LHC turnaround because of the necessity to periodically correct the TL trajectories in order to preserve the beam quality at injection into the LHC. Different sources for the SPS orbit drifts have been investigated: each of them can account only partially for the total orbit drift observed. In this paper, the possible sources of such drift are described, together with the simulated and measured effect they cause. Possible solutions and countermeasures are also discussed.

  2. Unbound motion on a Schwarzschild background: Practical approaches to frequency domain computations

    NASA Astrophysics Data System (ADS)

    Hopper, Seth

    2018-03-01

    Gravitational perturbations due to a point particle moving on a static black hole background are naturally described in Regge-Wheeler gauge. The first-order field equations reduce to a single master wave equation for each radiative mode. The master function satisfying this wave equation is a linear combination of the metric perturbation amplitudes with a source term arising from the stress-energy tensor of the point particle. The original master functions were found by Regge and Wheeler (odd parity) and Zerilli (even parity). Subsequent work by Moncrief and then Cunningham, Price and Moncrief introduced new master variables which allow time domain reconstruction of the metric perturbation amplitudes. Here, I explore the relationship between these different functions and develop a general procedure for deriving new higher-order master functions from ones already known. The benefit of higher-order functions is that their source terms always converge faster at large distance than their lower-order counterparts. This makes for a dramatic improvement in both the speed and accuracy of frequency domain codes when analyzing unbound motion.

  3. NSRD-15:Computational Capability to Substantiate DOE-HDBK-3010 Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Louie, David; Bignell, John; Dingreville, Remi Philippe Michel

    Safety basis analysts throughout the U.S. Department of Energy (DOE) complex rely heavily on the information provided in the DOE Handbook, DOE-HDBK-3010, Airborne Release Fractions/Rates and Respirable Fractions for Nonreactor Nuclear Facilities, to determine radionuclide source terms from postulated accident scenarios. In calculating source terms, analysts tend to use the DOE Handbook’s bounding values on airborne release fractions (ARFs) and respirable fractions (RFs) for various categories of insults (representing potential accident release categories). This is typically due to both time constraints and the avoidance of regulatory critique. Unfortunately, these bounding ARFs/RFs represent extremely conservative values. Moreover, they were derived from very limited small-scale bench/laboratory experiments and/or from engineering judgment. Thus, the basis for the data may not be representative of the actual unique accident conditions and configurations being evaluated. The goal of this research is to develop a more accurate and defensible method to determine bounding values for the DOE Handbook using state-of-the-art multi-physics-based computer codes.

  4. Experimental investigation of the influence of Mo contained in stainless steel on Cs chemisorption behavior

    NASA Astrophysics Data System (ADS)

    Di Lemma, F. G.; Nakajima, K.; Yamashita, S.; Osaka, M.

    2017-02-01

    Chemisorption phenomena can affect fission product (FP) retention in a nuclear reactor vessel during a severe accident (SA). Detailed information on the FP chemisorbed deposits, especially for Cs, is important for a rational decommissioning of the reactor following an SA, as is the case for the Fukushima Daiichi Nuclear Power Station. Moreover, the retention of Cs will influence the source term assessment, and thus improved models of this phenomenon are needed in SA codes. This paper describes the influence on Cs chemisorption of the molybdenum contained in stainless steel (SS) type 316. In our experiments it was observed that Cs-Mo deposits (CsFe(MoO4)3, Cs2MoO4) were formed together with CsFeSiO4, which is the predominant compound formed by chemisorption. The Cs-Mo deposits were found to revaporize from the SS sample at 1000 °C, and thus could contribute to the source term. On the other hand, CsFeSiO4 will probably be retained in the reactor during an SA due to its stability.

  5. Synthesis of Exotic Soaps in the Chemistry Laboratory

    NASA Astrophysics Data System (ADS)

    Phanstiel, Otto, IV; Dueno, Eric; Xianghong Wang, Queenie

    1998-05-01

    A variety of different triglyceride sources ranging from Vietnamese garlic oil to a local restaurant's grill sludge were saponified to generate a series of exotic soaps. Students did not quantify their results, but described their products in terms of color, texture and odor. Their results were compared with existing data on the triglyceride content for each source used (when possible). Soap texture seemed to be related to the degree of unsaturation present in the starting triglyceride. However, texture alterations due to occluded impurities could not be ruled out. In general, fats and oils high in saturated fats (butter) gave hard, chunky, and waxlike soaps, while those high in unsaturated fats gave flaky and easily crumbled soaps (olive, corn, peanut and sunflower oils). Soap color was not consistent with triglyceride unsaturation levels during the time frame studied. Odor changes were dramatic and were explained in terms of a change in chemical structure (i.e. conversion from an ester to a carboxylate salt). In general, the experiment was well received by students and stressed the importance of making precise qualitative observations during the experiment.

  6. Verification and Improvement of Flamelet Approach for Non-Premixed Flames

    NASA Technical Reports Server (NTRS)

    Zaitsev, S.; Buriko, Yu.; Guskov, O.; Kopchenov, V.; Lubimov, D.; Tshepin, S.; Volkov, D.

    1997-01-01

    Mathematical modeling of high-speed turbulent combustion has received renewed attention in recent years. A review of the fundamentals and approaches, with an extensive bibliography, was presented by Bray, Libby and Williams. In order to obtain accurate predictions for turbulent combustible flows, the effects of turbulent fluctuations on the chemical source terms should be taken into account. Averaging the chemical source terms requires a probability density function (PDF) model. Two main approaches currently dominate high-speed combustion modeling. In the first approach, the form of the PDF is assumed, based on the intuition of the modeller (see, for example, Spiegler et al.; Girimaji; Baurle et al.). The second approach is considerably more elaborate and is based on the solution of an evolution equation for the PDF. This approach was proposed by S. Pope for incompressible flames. Recently, it was modified for the modeling of compressible flames in studies by Farschi; Hsu; Hsu, Raji, Norris; and Eifer, Kollman. But its realization in CFD is extremely expensive computationally, due to the high dimensionality of the PDF evolution equation (Baurle, Hsu, Hassan).
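The presumed-PDF averaging described above can be illustrated with a toy calculation: the mean of an Arrhenius source term over an assumed Gaussian temperature PDF differs strongly from the rate evaluated at the mean temperature, which is exactly why fluctuation effects must be modelled. All numbers below (pre-exponential factor, activation temperature, mean and rms temperature, clipping floor) are illustrative assumptions, not values from the studies cited.

```python
# Minimal sketch of the "presumed PDF" approach: the mean chemical source term
# is obtained by integrating the instantaneous rate over an assumed temperature
# PDF rather than evaluating it at the mean temperature. Illustrative numbers.
import math
import random

def arrhenius(T, A=1.0, Ta=15000.0):
    """Instantaneous reaction rate in Arrhenius form: A * exp(-Ta / T)."""
    return A * math.exp(-Ta / T)

def pdf_averaged_rate(T_mean, T_rms, n=100_000, seed=0):
    """Average the rate over a presumed (clipped) Gaussian temperature PDF."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        T = rng.gauss(T_mean, T_rms)
        T = max(T, 300.0)  # clip unphysical low-temperature samples
        total += arrhenius(T)
    return total / n

mean_of_rate = pdf_averaged_rate(T_mean=1200.0, T_rms=150.0)
rate_of_mean = arrhenius(1200.0)
# The strong convexity of exp(-Ta/T) means turbulent fluctuations raise the
# mean rate well above the rate at the mean temperature:
print(mean_of_rate / rate_of_mean)
```

A presumed beta or delta-function PDF would be used in practice; the Monte Carlo average here simply makes Jensen's inequality visible without any quadrature machinery.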

  7. Photospheric and coronal magnetic fields in six magnetographs. I. Consistent evolution of the bashful ballerina

    NASA Astrophysics Data System (ADS)

    Virtanen, Ilpo; Mursula, Kalevi

    2016-06-01

    Aims: We study the long-term evolution of photospheric and coronal magnetic fields and the heliospheric current sheet (HCS), especially its north-south asymmetry. Special attention is paid to the reliability of the six data sets used in this study and to the consistency of the results based on these data sets. Methods: We use synoptic maps constructed from Wilcox Solar Observatory (WSO), Mount Wilson Observatory (MWO), Kitt Peak (KP), SOLIS, SOHO/MDI, and SDO/HMI measurements of the photospheric field and the potential field source surface (PFSS) model. Results: The six data sets depict a fairly similar long-term evolution of magnetic fields and the heliospheric current sheet, including polarity reversals and hemispheric asymmetry. However, there are intervals several years long when first KP measurements in the 1970s and 1980s, and later WSO measurements in the 1990s and early 2000s, deviate significantly from the other simultaneous data sets, likely reflecting errors at these times. All six magnetographs agree on the southward shift of the heliospheric current sheet (the so-called bashful ballerina phenomenon) in the declining-to-minimum phase of the solar cycle, during a few years in each of the five cycles included. We show that during solar cycles 20-22, the southward shift of the HCS is mainly due to the axial quadrupole term, reflecting the stronger magnetic field intensity at the southern pole during these times. During cycle 23 the asymmetry is less persistent and is mainly due to harmonics higher than the quadrupole term. Currently, in the early declining phase of cycle 24, the HCS is again shifted southward, mainly due to the axial quadrupole as in most earlier cycles. This further emphasizes the special character of the global solar field during cycle 23.
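The role of the axial quadrupole in shifting the HCS can be illustrated with a toy axisymmetric field: for an axial dipole alone, the B_r = 0 neutral line (a proxy for the HCS base at the source surface) lies on the equator, while adding an axial quadrupole displaces it. The harmonic coefficients below are illustrative, not fitted to any of the six data sets.

```python
# Toy illustration of the "bashful ballerina": the magnetic equator (B_r = 0
# neutral line) of an axial dipole plus an axial quadrupole is displaced from
# the heliographic equator. Coefficients g1, g2 are illustrative placeholders.
import numpy as np

def br(lat_deg, g1=1.0, g2=-0.1):
    """Radial field from axial dipole (g1) and axial quadrupole (g2) terms.

    P1(cos th) = cos th, P2(cos th) = (3 cos^2 th - 1) / 2, th = colatitude.
    """
    x = np.cos(np.radians(90.0 - lat_deg))  # cos(colatitude) = sin(latitude)
    return g1 * x + g2 * (3.0 * x**2 - 1.0) / 2.0

# Locate the neutral line by scanning latitude for the zero crossing.
lats = np.linspace(-30.0, 30.0, 60001)
neutral_lat = lats[np.argmin(np.abs(br(lats)))]
print(f"neutral line at {neutral_lat:+.2f} deg latitude")
# A quadrupole with g2/g1 < 0 (stronger field over the southern pole) shifts
# the neutral line south of the equator, as reported for cycles 20-22.
```

Setting g2 = 0 puts the neutral line back on the equator, which is the sense in which the asymmetry is "mainly due to the axial quadrupole term".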

  8. Language differences in verbal short-term memory do not exclusively originate in the process of subvocal rehearsal.

    PubMed

    Thorn, A S; Gathercole, S E

    2001-06-01

    Language differences in verbal short-term memory were investigated in two experiments. In Experiment 1, bilinguals with high competence in English and French and monolingual English adults with extremely limited knowledge of French were assessed on their serial recall of words and nonwords in both languages. In all cases recall accuracy was superior in the language with which individuals were most familiar, a first-language advantage that remained when variation due to differential rates of articulation in the two languages was taken into account. In Experiment 2, bilinguals recalled lists of English and French words with and without concurrent articulatory suppression. First-language superiority persisted under suppression, suggesting that the language differences in recall accuracy were not attributable to slower rates of subvocal rehearsal in the less familiar language. The findings indicate that language-specific differences in verbal short-term memory do not exclusively originate in the subvocal rehearsal process. It is suggested that one source of language-specific variation might relate to the use of long-term knowledge to support short-term memory performance.

  9. Campylobacteriosis - an overview.

    PubMed

    Sarkar, S R; Hossain, M A; Paul, S K; Ray, N C; Sultana, S; Rahman, M M; Islam, A

    2014-01-01

    Campylobacteriosis is a collective term used for an infectious, emerging foodborne disease caused by Campylobacter species, which are Gram-negative, curved, microaerophilic pathogens. The true incidence of human campylobacteriosis is unknown for most countries of the world, including Bangladesh, but campylobacteriosis is not uncommon in our country. Due to its increasing incidence in many countries of the world, it is an important issue nowadays. Animals such as birds are the main sources of infection. Farm animals such as cattle and poultry are commonly infected from such sources, and raw milk and undercooked or poorly handled meat become contaminated. Transmission of campylobacteriosis to humans occurs through consumption of infected, unpasteurized animal milk and milk products, undercooked poultry, and contaminated drinking water. Contact with contaminated poultry, livestock or household pets, especially puppies, can also cause disease. Due to the variability of its clinical features and the limited availability of laboratory facilities, the disease remains largely under-reported. Early and specific diagnosis is important to ensure a favourable outcome for this foodborne disease. Antibiotic treatment is controversial, and benefits only the duration of symptoms. Campylobacter infections can be prevented by simple hygienic food handling practices.

  10. Assessing the short-term clock drift of early broadband stations with burst events of the 26 s persistent and localized microseism

    NASA Astrophysics Data System (ADS)

    Xie, J.; Ni, S.; Chu, R.; Xia, Y.

    2017-12-01

    Accurate seismometer clocks play an important role in seismological studies including earthquake location and tomography. However, some seismic stations may have clock drifts larger than 1 second, especially in the early days of global seismic networks. The 26 s Persistent Localized (PL) microseism event in the Gulf of Guinea sometimes excites strong and coherent signals, and can be used as a repeating source for assessing the stability of seismometer clocks. Taking station GSC/TS in southern California, USA as an example, the 26 s PL signal can be easily observed in the ambient Noise Cross-correlation Function (NCF) between GSC/TS and a remote station. The variation of the travel time of this 26 s signal in the NCF is used to infer clock error. A drastic clock error is detected during June 1992. This short-term clock error is confirmed by both teleseismic and local earthquake records, with a magnitude of ±25 s. Using the 26 s PL source, clocks can be validated for historical records of sparsely distributed stations, where the usual NCF of short-period microseisms (<20 s) might be less effective due to attenuation over long interstation distances. However, this method suffers from a cycle-skipping problem, and should be verified with teleseismic/local P waves. The location change of the 26 s PL source may influence the measured clock drift; using regional stations with stable clocks, we estimate the possible location change of the source.
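The travel-time measurement underlying this method can be sketched with synthetic data: the lag of the 26 s signal in a noise cross-correlation function (NCF) is measured against a reference NCF by cross-correlation, and a shift in that lag maps directly to a clock error. The waveform parameters below are invented for illustration, and the imposed shift is kept well away from the 26 s period to sidestep the cycle-skipping ambiguity noted in the abstract.

```python
# Sketch of the travel-time measurement: measure the lag of a 26 s NCF signal
# against a reference NCF by cross-correlation; the shift equals the clock
# error. Synthetic waveform; windowing and cycle-skipping checks are omitted.
import numpy as np

def measure_shift(ncf, reference, dt):
    """Time shift (s) of `ncf` relative to `reference` via cross-correlation."""
    cc = np.correlate(ncf, reference, mode="full")
    lag = np.argmax(cc) - (len(reference) - 1)
    return lag * dt

dt = 1.0                                   # sample interval (s)
t = np.arange(0.0, 600.0, dt)
# Reference NCF: a 26 s oscillation under a Gaussian envelope.
reference = np.sin(2.0 * np.pi * t / 26.0) * np.exp(-((t - 300.0) / 80.0) ** 2)
clock_error = 10.0                         # imposed drift (s)
# A station clock running slow delays the recorded waveform.
shifted = np.interp(t - clock_error, t, reference, left=0.0, right=0.0)
print(measure_shift(shifted, reference, dt))  # -> 10.0
```

When the true drift approaches a multiple of the 26 s period, neighbouring correlation peaks become nearly indistinguishable, which is precisely the cycle-skipping problem that teleseismic or local P waves are needed to resolve.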

  11. Fission Product Appearance Rate Coefficients in Design Basis Source Term Determinations - Past and Present

    NASA Astrophysics Data System (ADS)

    Perez, Pedro B.; Hamawi, John N.

    2017-09-01

    Nuclear power plant radiation protection design features are based on radionuclide source terms derived from conservative assumptions that envelope expected operating experience. Two parameters that significantly affect the radionuclide concentrations in the source term are the failed fuel fraction and the effective fission product appearance rate coefficients. The failed fuel fraction may be a regulatory based assumption, as in the U.S. Appearance rate coefficients are not specified in regulatory requirements, but have been referenced to experimental data that is over 50 years old. No doubt the source terms are conservative, as demonstrated by operating experience that has included failed fuel, but they may be too conservative, leading, for example, to over-designed shielding for normal operations. Design basis source term methodologies for normal operations had not advanced until 2015, when EPRI published an updated ANSI/ANS 18.1 source term basis document. Our paper revisits the fission product appearance rate coefficients as applied in the derivation of source terms following the original U.S. NRC NUREG-0017 methodology. New coefficients have been calculated based on recent EPRI results, which demonstrate the conservatism in nuclear power plant shielding design.
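To make the role of the appearance rate coefficient concrete, a minimal equilibrium balance can be sketched: the coolant activity of a nuclide is its appearance rate from failed fuel divided by the sum of its decay constant and the cleanup removal rate. All parameter values below are illustrative placeholders, not NUREG-0017 or EPRI data; the point is only that the source term scales linearly with the appearance rate coefficient.

```python
# Hedged sketch of how an appearance rate enters a design-basis coolant source
# term: at equilibrium, coolant inventory balances appearance from failed fuel
# against radioactive decay plus cleanup removal. Illustrative values only.
import math

def coolant_activity(release_rate, failed_fuel_fraction, half_life_s, cleanup_rate_s):
    """Equilibrium coolant activity (Bq) for one nuclide.

    release_rate         : appearance rate assuming 100% failed fuel (Bq/s)
    failed_fuel_fraction : assumed fraction of fuel with cladding defects
    half_life_s          : nuclide half-life (s)
    cleanup_rate_s       : first-order removal by the cleanup system (1/s)
    """
    decay = math.log(2) / half_life_s
    source = release_rate * failed_fuel_fraction
    return source / (decay + cleanup_rate_s)

# Lowering the appearance rate by 10x lowers the equilibrium source term by
# 10x, with everything else held fixed (the relation is linear).
a_old = coolant_activity(1e9, 0.0025, 8.0 * 24 * 3600, 1e-5)
a_new = coolant_activity(1e8, 0.0025, 8.0 * 24 * 3600, 1e-5)
print(a_old / a_new)
```

This linearity is why re-deriving the appearance rate coefficients from modern EPRI data directly quantifies how much conservatism the 50-year-old experimental values carried into shielding design.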

  12. Long Term Leaching of Chlorinated Solvents from Source Zones in Low Permeability Settings with Fractures

    NASA Astrophysics Data System (ADS)

    Bjerg, P. L.; Chambon, J.; Troldborg, M.; Binning, P. J.; Broholm, M. M.; Lemming, G.; Damgaard, I.

    2008-12-01

    Groundwater contamination by chlorinated solvents, such as perchloroethylene (PCE), often occurs via leaching from complex sources located in low permeability sediments such as clayey tills overlying aquifers. Clayey tills are mostly fractured, and contamination migrating through the fractures spreads to the low permeability matrix by diffusion. This results in a long term source of contamination due to back-diffusion. Leaching from such sources is further complicated by microbial degradation under anaerobic conditions to sequentially form the daughter products trichloroethylene, cis-dichloroethylene (cis-DCE), vinyl chloride (VC) and ethene. This process can be enhanced by addition of electron donors and/or bioaugmentation and is termed Enhanced Reductive Dechlorination (ERD). This work aims to improve our understanding of the physical, chemical and microbial processes governing source behaviour under natural and enhanced conditions. That understanding is applied to risk assessment, and to determining the relationship and time frames of source cleanup and plume response. To meet that aim, field and laboratory observations are coupled to state-of-the-art models incorporating new insights into contaminant behaviour. The long term leaching of chlorinated ethenes from clay aquitards is currently being monitored at a number of Danish sites. The observed data are simulated using a coupled fracture flow and clay matrix diffusion model. Sequential degradation is represented by modified Monod kinetics accounting for competitive inhibition between the chlorinated ethenes. The model is constructed using Comsol Multiphysics, a generic finite-element partial differential equation solver. The model is applied at two well-characterised field sites with respect to hydrogeology, fracture network, contaminant distribution and microbial processes (lab and field experiments). 
At the study sites (Sortebrovej and Vadsbyvej), the source areas are situated in a clayey till with fractures and interbedded sand lenses. The field sites are both highly contaminated with chlorinated ethenes which impact the underlying sand aquifer. Anaerobic dechlorination is taking place, and cis-DCE and VC have been found in significant amounts in the matrix. Full-scale remediation using ERD was implemented at Sortebrovej in 2006, and ERD has been suggested as a remedy at Vadsbyvej. Results reveal several interesting findings. The physical processes of matrix diffusion and advection in the fractures appear more important than the microbial degradation processes for estimating the time frames, and the distance between fractures is amongst the most sensitive model parameters. However, the inclusion of sequential degradation is crucial to determining the composition of contamination leaching into the underlying aquifer. Degradation products like VC will peak at an earlier stage than the parent compound due to their higher mobility. The findings highlight a need for improved characterization of low permeability aquitards lying above aquifers used for water supply. The fracture network in aquitards is currently poorly described at larger depths (below 5-8 m) and the effect of sand lenses on leaching behaviour is not well understood. The microbial processes are assumed to take place in the fracture system, but the interaction with and processes in the matrix need to be further explored. Development of new methods for field site characterisation and integrated field and model expertise are crucial for the design of remedial actions and for risk assessment of contaminated sites in low permeability settings.
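The sequential degradation scheme described above can be sketched in a few lines: modified Monod kinetics in which each chlorinated ethene competitively inhibits the degradation of the others. The rate constants, half-saturation constants and initial concentration below are illustrative placeholders, not the calibrated values from the Sortebrovej or Vadsbyvej models, and a simple forward-Euler step stands in for the Comsol finite-element solver.

```python
# Sketch of sequential Monod kinetics with competitive inhibition along the
# chain PCE -> TCE -> cis-DCE -> VC -> ethene. Illustrative parameters only.
import numpy as np

names = ["PCE", "TCE", "cDCE", "VC", "ethene"]
kmax = np.array([2.0, 1.5, 1.0, 0.5])   # max utilization rates (1/d), per step
Ks = np.array([5.0, 5.0, 5.0, 5.0])     # half-saturation constants (uM)

def rates(c):
    """Degradation rate of each chlorinated ethene with competitive inhibition."""
    r = np.zeros(4)
    for i in range(4):
        inhib = sum(c[j] / Ks[j] for j in range(4) if j != i)
        r[i] = kmax[i] * c[i] / (Ks[i] * (1.0 + inhib) + c[i])
    return r

c = np.array([100.0, 0.0, 0.0, 0.0, 0.0])  # initial concentrations (uM)
dt = 0.01                                   # time step (d)
for _ in range(int(200 / dt)):              # simulate 200 days
    r = rates(c[:4])
    c[:4] -= r * dt                         # each compound is consumed...
    c[1:] += r * dt                         # ...and produced from its parent
print(dict(zip(names, np.round(c, 2))))
```

Mass is conserved along the chain, so whatever is not yet dechlorinated sits in the intermediates; it is the slow VC step (smallest kmax here) that makes VC accumulate before ethene, matching the observation that daughter products dominate the leachate composition.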

  13. Gordonia (nocardia) amarae foaming due to biosurfactant production.

    PubMed

    Pagilla, K R; Sood, A; Kim, H

    2002-01-01

    Gordonia amarae, a filamentous actinomycete commonly found in foaming activated sludge wastewater treatment plants, was investigated for its biosurfactant production capability. Soluble acetate and sparingly soluble hexadecane were used as carbon sources for G. amarae growth and biosurfactant production in laboratory-scale batch reactors. The lowest surface tension (at the critical micelle concentration, CMC) of the cell-free culture broth was 55 dynes/cm when 1,900 mg/L acetate was used as the sole carbon source. The lowest surface tension was less than 40 dynes/cm when either 1% (v/v) hexadecane or a mixture of 1% (v/v) hexadecane and 0.5% (w/v) acetate was used as the carbon source. The maximum biomass concentration (the stationary phase) was achieved after 4 days when acetate was used along with hexadecane, whereas it took about 8 days to achieve the stationary phase with hexadecane alone. The maximum biosurfactant production was 3 x CMC with hexadecane as the sole carbon source, and 5 x CMC with the mixture of hexadecane and acetate. Longer-term growth studies (approximately 35 days of culture growth) indicated that G. amarae produces biosurfactant in order to solubilize hexadecane, and that adding acetate improves biosurfactant production by providing readily degradable substrate for initial biomass growth. This research confirms that the foaming problems in activated sludge containing G. amarae are due to biosurfactant production by G. amarae when hydrophobic substrates such as hexadecane are present.

  14. Detailed source term estimation of the atmospheric release for the Fukushima Daiichi Nuclear Power Station accident by coupling simulations of an atmospheric dispersion model with an improved deposition scheme and oceanic dispersion model

    NASA Astrophysics Data System (ADS)

    Katata, G.; Chino, M.; Kobayashi, T.; Terada, H.; Ota, M.; Nagai, H.; Kajino, M.; Draxler, R.; Hort, M. C.; Malo, A.; Torii, T.; Sanada, Y.

    2015-01-01

    Temporal variations in the amount of radionuclides released into the atmosphere during the Fukushima Daiichi Nuclear Power Station (FNPS1) accident and their atmospheric and marine dispersion are essential to evaluate the environmental impacts and resultant radiological doses to the public. In this paper, we estimate the detailed atmospheric releases during the accident using a reverse estimation method which calculates the release rates of radionuclides by comparing measurements of air concentration of a radionuclide or its dose rate in the environment with the ones calculated by atmospheric and oceanic transport, dispersion and deposition models. The atmospheric and oceanic models used are WSPEEDI-II (Worldwide version of System for Prediction of Environmental Emergency Dose Information) and SEA-GEARN-FDM (Finite difference oceanic dispersion model), both developed by the authors. A sophisticated deposition scheme, which deals with dry and fog-water depositions, cloud condensation nuclei (CCN) activation, and subsequent wet scavenging due to mixed-phase cloud microphysics (in-cloud scavenging) for radioactive iodine gas (I2 and CH3I) and other particles (CsI, Cs, and Te), was incorporated into WSPEEDI-II to improve the surface deposition calculations. The results revealed that the major releases of radionuclides due to the FNPS1 accident occurred in the following periods during March 2011: the afternoon of 12 March due to the wet venting and hydrogen explosion at Unit 1, midnight of 14 March when the SRV (safety relief valve) was opened three times at Unit 2, the morning and night of 15 March, and the morning of 16 March. According to the simulation results, the highest radioactive contamination areas around FNPS1 were created from 15 to 16 March by complicated interactions among rainfall, plume movements, and the temporal variation of release rates. 
The simulation by WSPEEDI-II using the new source term reproduced the local and regional patterns of cumulative surface deposition of total 131I and 137Cs and air dose rate obtained by airborne surveys. The new source term was also tested using three atmospheric dispersion models (Modèle Lagrangien de Dispersion de Particules d'ordre zéro: MLDP0, Hybrid Single Particle Lagrangian Integrated Trajectory Model: HYSPLIT, and Met Office's Numerical Atmospheric-dispersion Modelling Environment: NAME) for regional and global calculations, and the calculated results showed good agreement with observed air concentration and surface deposition of 137Cs in eastern Japan.

  15. Improvements and limitations on understanding of atmospheric processes of Fukushima Daiichi NPS radioactivity

    NASA Astrophysics Data System (ADS)

    Yamazawa, Hiromi; Terasaka, Yuta; Mizutani, Kenta; Sugiura, Hiroki; Hirao, Shigekazu

    2017-04-01

    Understanding of the release of radioactivity into the atmosphere from the damaged units of the Fukushima Daiichi Nuclear Power Station has improved owing to recent analyses of atmospheric radionuclide concentrations. Our analysis of gamma-ray spectra from monitoring posts located about 100 km to the south of the site revealed temporal changes in the atmospheric concentrations of several key nuclides, including the noble gas Xe-133 in addition to radio-iodine and cesium nuclides such as I-131 and Cs-137, at 10-minute intervals. By using the atmospheric concentration data in combination with inverse atmospheric transport modelling with a Bayesian statistical method, a modification was proposed for the widely used Katata source term. A source term for Xe-133 was also proposed. Although the atmospheric concentration data and the source terms help us understand the atmospheric transport processes of radionuclides, they still have significant uncertainty due to limitations in the availability of the concentration data. There also remain limitations in the atmospheric transport modeling, the largest uncertainty being in the deposition processes. It had been pointed out that, within the 100 km range from the accident site, there were locations at which the ambient dose rate significantly increased a few hours before precipitation detectors recorded the start of rain. According to our analysis, the dose rate increase was not caused directly by airborne radioactivity but by deposition. This phenomenon can be attributed to a deposition process in which evaporating precipitation enhances the efficiency of deposition even where no precipitation is observed at ground level.
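The inverse estimation principle used here (and in the reverse method of record 14 above) can be reduced to a linear-algebra sketch: a dispersion model provides the concentration each detector would see per unit release in each time window (a source-receptor matrix), and the release rates are found by fitting that matrix to the measurements. The matrix and "true" release rates below are synthetic; the actual analyses use Bayesian regularization with noisy data rather than bare least squares.

```python
# Minimal sketch of inverse source-term estimation: solve y = M q for release
# rates q, where M comes from forward runs of an atmospheric transport model.
# Synthetic, noiseless numbers; not data from the Fukushima analyses.
import numpy as np

# Source-receptor matrix: rows = detector readings, cols = release windows.
M = np.array([[1.0, 0.2, 0.0],
              [0.3, 1.0, 0.2],
              [0.0, 0.3, 1.0],
              [0.1, 0.5, 0.8]])
q_true = np.array([5.0, 1.0, 3.0])   # hypothetical release rates per window
y = M @ q_true                       # synthetic "measured" concentrations

# Ordinary least squares recovers the release rates from the measurements.
q_est, *_ = np.linalg.lstsq(M, y, rcond=None)
print(np.round(q_est, 6))  # recovers [5. 1. 3.]
```

With real, noisy and incomplete monitoring data the problem is ill-posed, which is why a Bayesian prior (or other regularization) replaces the bare `lstsq` call; the structure of the fit is the same.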

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Freeze, R.A.; McWhorter, D.B.

    Many emerging remediation technologies are designed to remove contaminant mass from source zones at DNAPL sites in response to regulatory requirements. There is often concern in the regulated community as to whether mass removal actually reduces risk, or whether the small risk reductions achieved warrant the large costs incurred. This paper sets out a proposed framework for quantifying the degree to which risk is reduced as mass is removed from DNAPL source areas in shallow, saturated, low-permeability media. Risk is defined in terms of meeting an alternate concentration limit (ACL) at a compliance well in an aquifer underlying the source zone. The ACL is back-calculated from a carcinogenic health-risk characterization at a downgradient water-supply well. Source-zone mass-removal efficiencies are heavily dependent on the distribution of mass between media (fractures, matrix) and phase (aqueous, sorbed, NAPL). Due to the uncertainties in currently available technology performance data, the scope of the paper is limited to developing a framework for generic technologies rather than making specific risk-reduction calculations for individual technologies. Despite the qualitative nature of the exercise, results imply that very high total mass-removal efficiencies are required to achieve significant long-term risk reduction with technology applications of finite duration. This paper is not an argument for no action at contaminated sites. Rather, it provides support for the conclusions of Cherry et al. (1992) that the primary goal of current remediation should be short-term risk reduction through containment, with the aim to pass on to future generations site conditions that are well-suited to the future applications of emerging technologies with improved mass-removal capabilities.

  17. Implications of matrix diffusion on 1,4-dioxane persistence at contaminated groundwater sites.

    PubMed

    Adamson, David T; de Blanc, Phillip C; Farhat, Shahla K; Newell, Charles J

    2016-08-15

    Management of groundwater sites impacted by 1,4-dioxane can be challenging due to its migration potential and perceived recalcitrance. This study examined the extent to which 1,4-dioxane's persistence was subject to diffusion of mass into and out of lower-permeability zones relative to co-released chlorinated solvents. Two different release scenarios were evaluated within a two-layer aquifer system using an analytical modeling approach. The first scenario simulated a 1,4-dioxane and 1,1,1-TCA source zone where spent solvent was released. The period when 1,4-dioxane was actively loading the low-permeability layer within the source zone was estimated to be <3 years due to its high effective solubility. While this was approximately an order-of-magnitude shorter than the loading period for 1,1,1-TCA, the mass of 1,4-dioxane stored within the low-permeability zone at the end of the simulation period (26 kg) was larger than that predicted for 1,1,1-TCA (17 kg). Even 80 years after release, the aqueous 1,4-dioxane concentration was still several orders-of-magnitude higher than potentially-applicable criteria. Within the downgradient plume, diffusion contributed to higher concentrations and enhanced penetration of 1,4-dioxane into the low-permeability zones relative to 1,1,1-TCA. In the second scenario, elevated 1,4-dioxane concentrations were predicted at a site impacted by migration of a weak source from an upgradient site. Plume cutoff was beneficial because it could be implemented in time to prevent further loading of the low-permeability zone at the downgradient site. Overall, this study documented that 1,4-dioxane within transmissive portions of the source zone is quickly depleted due to characteristics that favor both diffusion-based storage and groundwater transport, leaving little mass to treat using conventional means. 
Furthermore, the results highlight the differences between 1,4-dioxane and chlorinated solvent source zones, suggesting that back diffusion of 1,4-dioxane mass may be serving as the dominant long-term "secondary source" at many contaminated sites that must be managed using alternative approaches. Copyright © 2016 Elsevier B.V. All rights reserved.

  18. 26 CFR 1.737-3 - Basis adjustments; Recovery rules.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... Properties A1, A2, and A3 is long-term, U.S.-source capital gain or loss. The character of gain on Property A4 is long-term, foreign-source capital gain. B contributes Property B, nondepreciable real property...-term, foreign-source capital gain ($3,000 total gain under section 737 × $2,000 net long-term, foreign...

  19. 26 CFR 1.737-3 - Basis adjustments; Recovery rules.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... Properties A1, A2, and A3 is long-term, U.S.-source capital gain or loss. The character of gain on Property A4 is long-term, foreign-source capital gain. B contributes Property B, nondepreciable real property...-term, foreign-source capital gain ($3,000 total gain under section 737 × $2,000 net long-term, foreign...

  20. 26 CFR 1.737-3 - Basis adjustments; Recovery rules.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... Properties A1, A2, and A3 is long-term, U.S.-source capital gain or loss. The character of gain on Property A4 is long-term, foreign-source capital gain. B contributes Property B, nondepreciable real property...-term, foreign-source capital gain ($3,000 total gain under section 737 × $2,000 net long-term, foreign...

  1. 26 CFR 1.737-3 - Basis adjustments; Recovery rules.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... Properties A1, A2, and A3 is long-term, U.S.-source capital gain or loss. The character of gain on Property A4 is long-term, foreign-source capital gain. B contributes Property B, nondepreciable real property...-term, foreign-source capital gain ($3,000 total gain under section 737 × $2,000 net long-term, foreign...

  2. 26 CFR 1.737-3 - Basis adjustments; Recovery rules.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... Properties A1, A2, and A3 is long-term, U.S.-source capital gain or loss. The character of gain on Property A4 is long-term, foreign-source capital gain. B contributes Property B, nondepreciable real property...-term, foreign-source capital gain ($3,000 total gain under section 737 × $2,000 net long-term, foreign...

  3. Effects of variability of X-ray binaries on the X-ray luminosity functions of Milky Way

    NASA Astrophysics Data System (ADS)

    Islam, Nazma; Paul, Biswajit

    2016-08-01

    The X-ray luminosity functions of galaxies have become a useful tool for population studies of the X-ray binaries in them. The availability of long-term light curves of X-ray binaries from all-sky X-ray monitors opens up the possibility of constructing X-ray luminosity functions that also include the intensity variations of the galactic X-ray binaries. We have constructed multiple realizations of the X-ray luminosity functions (XLFs) of the Milky Way, using the long-term light curves of sources obtained in the 2-10 keV energy band with the RXTE-ASM. The observed spread in the slopes of both the HMXB and LMXB XLFs is due to the inclusion of the variable luminosities of the X-ray binaries in the construction of these XLFs, as well as to finite-sample effects. The XLF constructed for galactic HMXBs in the luminosity range 10^36-10^39 erg/s is described by a power-law model with a mean power-law index of -0.48 and a spread due to HMXB variability of 0.19. The XLF constructed for galactic LMXBs in the same luminosity range has the shape of a cut-off power law with a mean power-law index of -0.31 and a spread due to LMXB variability of 0.07.
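    Power-law indices like those quoted above can be estimated from a luminosity sample with the standard maximum-likelihood estimator for a power law above a threshold; the sketch below is a generic illustration (the luminosity values are hypothetical, not RXTE-ASM data), not the realization procedure used in the paper.

```python
import math

def powerlaw_mle_index(luminosities, l_min):
    """Maximum-likelihood estimate of the differential power-law index
    gamma for dN/dL ~ L^-gamma, using only the sample with L >= l_min."""
    sample = [l for l in luminosities if l >= l_min]
    n = len(sample)
    return 1.0 + n / sum(math.log(l / l_min) for l in sample)

# Hypothetical 2-10 keV luminosities in erg/s:
lums = [2e36, 5e36, 1e37, 4e37, 2e38]
gamma = powerlaw_mle_index(lums, 1e36)  # differential slope estimate
```

Repeating this estimate over many light-curve-sampled realizations of the luminosities would yield a spread in the slope of the kind the abstract describes.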

  4. Perturbation of a Schwarzschild Black Hole Due to a Rotating Thin Disk

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Čížek, P.; Semerák, O., E-mail: oldrich.semerak@mff.cuni.cz

    Will, in 1974, treated the perturbation of a Schwarzschild black hole due to a slowly rotating, light, concentric thin ring by solving the perturbation equations in terms of a multipole expansion of the mass-and-rotation perturbation series. In the Schwarzschild background, his approach can be generalized to perturbation by a thin disk (which is more relevant astrophysically), but, due to rather bad convergence properties, the resulting expansions are not suitable for specific (numerical) computations. However, we show that the Green's functions represented by Will's result can be expressed in closed form (without multipole expansion), which is more useful. In particular, they can be integrated out over the source (a thin disk in our case) to yield well converging series both for the gravitational potential and for the dragging angular velocity. The procedure is demonstrated, in the first perturbation order, on the simplest case of a constant-density disk, including the physical interpretation of the results in terms of a one-component perfect fluid or a two-component dust in a circular orbit about the central black hole. Free parameters are chosen in such a way that the resulting black hole has zero angular momentum but non-zero angular velocity, as it is just carried along by the dragging effect of the disk.

  5. Recent Progress on Spherical Torus Research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ono, Masayuki; Kaita, Robert

    2014-01-01

    The spherical torus or spherical tokamak (ST) is a member of the tokamak family with its aspect ratio (A = R0/a) reduced to A ~ 1.5, well below the normal tokamak operating range of A ≥ 2.5. As the aspect ratio is reduced, the ideal tokamak stability limit on beta β (the ratio of plasma to magnetic pressure) increases rapidly, approximately as β ~ 1/A. The plasma current that can be sustained for a given edge safety factor q95 also increases rapidly. Because of the above, as well as the natural elongation κ, which makes its plasma shape appear spherical, the ST configuration can yield exceptionally high tokamak performance in a compact geometry. Due to its compactness and high performance, the ST configuration has various near-term applications, including a compact fusion neutron source with low tritium consumption, in addition to its longer-term goal of an attractive fusion energy power source. Since the start of the two megaampere-class ST facilities in 2000, the National Spherical Torus Experiment (NSTX) in the US and the Mega Ampere Spherical Tokamak (MAST) in the UK, active ST research has been conducted worldwide. More than sixteen ST research facilities operating during this period have achieved remarkable advances in all areas of fusion science, involving fundamental fusion energy science as well as innovation. These results suggest exciting future prospects for ST research both near term and longer term. The present paper reviews the scientific progress made by the worldwide ST research community during this new mega-ampere-ST era.

  6. The Scaling of Broadband Shock-Associated Noise with Increasing Temperature

    NASA Technical Reports Server (NTRS)

    Miller, Steven A. E.

    2013-01-01

    A physical explanation for the saturation of broadband shock-associated noise (BBSAN) intensity with increasing jet stagnation temperature has eluded investigators. An explanation is proposed for this phenomenon with the use of an acoustic analogy. To isolate the relevant physics, the scaling of the BBSAN peak intensity level at the sideline observer location is examined. The equivalent source within the framework of an acoustic analogy for BBSAN is based on local field quantities at shock wave shear layer interactions. The equivalent source, combined with accurate calculations of the propagation of sound through the jet shear layer using an adjoint vector Green's function solver of the linearized Euler equations, allows for predictions that retain the scaling with respect to stagnation pressure and allows for saturation of BBSAN with increasing stagnation temperature. The sources and vector Green's function have arguments involving the steady Reynolds-Averaged Navier-Stokes solution of the jet. It is proposed that saturation of BBSAN with increasing jet temperature occurs due to a balance between the amplification of the sound propagation through the shear layer and the source term scaling.

  7. Analysis and Design of Symmetrical Capacitor Diode Voltage Multiplier Driven by LCL-T Resonant Converter

    NASA Astrophysics Data System (ADS)

    Malviya, Devesh; Borage, Mangesh Balkrishna; Tiwari, Sunil

    2017-12-01

    This paper investigates the possibility of applying Resonant Immittance Converters (RICs) as a current source for the current-fed symmetrical Capacitor-Diode Voltage Multiplier (CDVM), with the LCL-T Resonant Converter (RC) as an example. First, detailed characterization of the current-fed symmetrical CDVM is carried out using repeated simulations, followed by normalization of the simulation results to derive closed-form curve-fit equations that predict the operating modes, output voltage and ripple in terms of the operating parameters. RICs, due to their ability to convert a voltage source into a current source, are a possible candidate for the realization of a current source for the current-fed symmetrical CDVM. Detailed analysis, optimization and design of the LCL-T RC with CDVM is performed in this paper. A step-by-step procedure for the design of the CDVM and the converter is proposed. A 5-stage prototype symmetrical CDVM driven by an LCL-T RC to produce a 2.5 kV, 50 mA dc output is designed, built and tested to validate the findings of the analysis and simulation.

  8. A source study of atmospheric polycyclic aromatic hydrocarbons in Shenzhen, South China.

    PubMed

    Liu, Guoqing; Tong, Yongpeng; Luong, John H T; Zhang, Hong; Sun, Huibin

    2010-04-01

    Air pollution has become a serious problem in the Pearl River Delta, South China, particularly in winter due to the local micrometeorology. In this study, atmospheric polycyclic aromatic hydrocarbons (PAHs) were monitored weekly in Shenzhen during the winter of 2006. Results indicated that the detected PAHs were mainly vapor-phase compounds, with phenanthrene dominant. The average vapor-phase and particle-phase PAH concentrations in Shenzhen were 101.3 and 26.7 ng m⁻³, respectively. Meteorological conditions had a strong effect on PAH concentrations. The higher PAH concentrations observed during haze episodes might result from the accumulation of pollutants under conditions of a lower boundary layer, slower wind speeds, and long-term dryness. The sources of PAHs in the air were estimated by principal component analysis in combination with diagnostic ratios. Vehicle exhaust was the major PAH source in Shenzhen, accounting for 50.0% of the total PAH emissions, whereas coal combustion and solid waste incineration contributed 29.4% and 20.6% of the total PAH concentration, respectively. The results clearly indicated that the growing number of solid waste incinerators has become an important new PAH source in this region.

  9. Pectin, Hemicellulose, or Lignin? Impact of the Biowaste Source on the Performance of Hard Carbons for Sodium-Ion Batteries.

    PubMed

    Dou, Xinwei; Hasa, Ivana; Hekmatfar, Maral; Diemant, Thomas; Behm, R Jürgen; Buchholz, Daniel; Passerini, Stefano

    2017-06-22

    Hard carbons are currently the most widely used negative electrode materials in Na-ion batteries. This is due to their promising electrochemical performance, with capacities of 200-300 mAh g⁻¹ and stable long-term cycling. However, an abundant and cheap carbon source is necessary in order to comply with the low-cost philosophy of Na-ion technology. Many biological or waste materials have been used to synthesize hard carbons, but the impact of the precursors on the final properties of the anode material is not fully understood. In this study, the impact of the biomass source on the structural and electrochemical properties of hard carbons is unraveled by using different, representative types of biomass as examples. The systematic structural and electrochemical investigation of hard carbons derived from different sources (namely corncobs, peanut shells, and waste apples, which are representative of hemicellulose-, lignin- and pectin-rich biomass, respectively) enables understanding and interlinking of the structural and electrochemical properties. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Ion tracking in photocathode rf guns

    NASA Astrophysics Data System (ADS)

    Lewellen, John W.

    2002-02-01

    Projected next-generation linac-based light sources, such as PERL or the TESLA free-electron laser, generally assume, as essential components of their injector complexes, long-pulse photocathode rf electron guns. These guns, due to their design rf pulse durations of many milliseconds to continuous wave, may be more susceptible to ion bombardment damage of their cathodes than conventional rf guns, which typically use rf pulses of microsecond duration. This paper explores this possibility in terms of ion propagation within the gun, and presents a basis for future study of the subject.

  11. Hybrid Hydro Renewable Energy Storage Model

    NASA Astrophysics Data System (ADS)

    Dey, Asit Kr

    2018-01-01

    This paper presents wind and tidal turbine pumped-storage solutions for improving the energy efficiency and economic sustainability of renewable energy systems. Pumped storage is indicated as a viable option for solving problems of energy production as well as for the integration of intermittent renewable energies, providing system flexibility against load fluctuations along with storage of energy from intermittent sources. Seawater energy storage is one of the best and most efficient options in terms of renewable resources as an integrated solution, improving the elasticity of the energy system and the overall system efficiency.

  12. 10 CFR 50.67 - Accident source term.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... occupancy of the control room under accident conditions without personnel receiving radiation exposures in... 10 Energy 1 2014-01-01 2014-01-01 false Accident source term. 50.67 Section 50.67 Energy NUCLEAR... Conditions of Licenses and Construction Permits § 50.67 Accident source term. (a) Applicability. The...

  13. 10 CFR 50.67 - Accident source term.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... occupancy of the control room under accident conditions without personnel receiving radiation exposures in... 10 Energy 1 2012-01-01 2012-01-01 false Accident source term. 50.67 Section 50.67 Energy NUCLEAR... Conditions of Licenses and Construction Permits § 50.67 Accident source term. (a) Applicability. The...

  14. 10 CFR 50.67 - Accident source term.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... occupancy of the control room under accident conditions without personnel receiving radiation exposures in... 10 Energy 1 2010-01-01 2010-01-01 false Accident source term. 50.67 Section 50.67 Energy NUCLEAR... Conditions of Licenses and Construction Permits § 50.67 Accident source term. (a) Applicability. The...

  15. 10 CFR 50.67 - Accident source term.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... occupancy of the control room under accident conditions without personnel receiving radiation exposures in... 10 Energy 1 2013-01-01 2013-01-01 false Accident source term. 50.67 Section 50.67 Energy NUCLEAR... Conditions of Licenses and Construction Permits § 50.67 Accident source term. (a) Applicability. The...

  16. 10 CFR 50.67 - Accident source term.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... occupancy of the control room under accident conditions without personnel receiving radiation exposures in... 10 Energy 1 2011-01-01 2011-01-01 false Accident source term. 50.67 Section 50.67 Energy NUCLEAR... Conditions of Licenses and Construction Permits § 50.67 Accident source term. (a) Applicability. The...

  17. A radio spectral index map and catalogue at 147-1400 MHz covering 80 per cent of the sky

    NASA Astrophysics Data System (ADS)

    de Gasperin, F.; Intema, H. T.; Frail, D. A.

    2018-03-01

    The radio spectral index is a powerful probe for classifying cosmic radio sources and understanding the origin of the radio emission. Combining data at 147 MHz and 1.4 GHz from the TIFR GMRT Sky Survey (TGSS) and the NRAO VLA Sky Survey (NVSS), we produced a large-area radio spectral index map of ~80 per cent of the sky (Dec. > -40 deg), as well as a radio spectral index catalogue containing 1,396,515 sources, of which 503,647 are not upper or lower limits. Almost every TGSS source has a detected counterpart, while this is true for only 36 per cent of NVSS sources. We released both the map and the catalogue to the astronomical community. The catalogue is analysed to discover systematic behaviours in the cosmic radio population. We find a differential spectral behaviour between faint and bright sources as well as between compact and extended sources. These trends are explained in terms of radio galaxy evolution. We also confirm earlier reports of an excess of steep-spectrum sources along the galactic plane. This corresponds to 86 compact, steep-spectrum sources in excess of expectations. The properties of this excess are consistent with normal non-recycled pulsars, which may have been missed by pulsation searches due to larger-than-average scattering along the line of sight.
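    The two-point spectral index behind such a catalogue follows directly from the flux densities at the two survey frequencies, via the convention S ∝ ν^α. A minimal sketch, with hypothetical flux densities (not catalogue values):

```python
import math

def spectral_index(s_low, s_high, nu_low=147e6, nu_high=1400e6):
    """Two-point radio spectral index alpha (S ~ nu^alpha) between the
    TGSS (147 MHz) and NVSS (1.4 GHz) survey frequencies."""
    return math.log(s_high / s_low) / math.log(nu_high / nu_low)

# A hypothetical source with 100 mJy at 147 MHz and 20 mJy at 1.4 GHz:
alpha = spectral_index(100.0, 20.0)  # steep spectrum, alpha ≈ -0.71
```

A source brighter at the lower frequency yields a negative (steep) index; flat or inverted spectra give alpha near or above zero.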

  18. Emulsion chamber observations of primary cosmic-ray electrons in the energy range 30-1000 GeV

    NASA Technical Reports Server (NTRS)

    Nishimura, J.; Fujii, M.; Taira, T.; Aizu, E.; Hiraiwa, H.; Kobayashi, T.; Niu, K.; Ohta, I.; Golden, R. L.; Koss, T. A.

    1980-01-01

    The results of a series of emulsion exposures, beginning in Japan in 1968 and continued in the U.S. since 1975, which have yielded a total balloon-altitude exposure of 98,700 sq m sr s, are presented. The data are discussed in terms of several models of cosmic-ray propagation. Interpreted in terms of the energy-dependent leaky-box model, the spectrum results suggest a galactic electron residence time of 1.0 (+2.0, -0.5) × 10^7 yr, which is consistent with results from Be-10 observations. Finally, the possibility that departures from smooth power-law behavior in the spectrum due to individual nearby sources will be observable in the energy range above 1 TeV is discussed.

  19. Inverse modelling of radionuclide release rates using gamma dose rate observations

    NASA Astrophysics Data System (ADS)

    Hamburger, Thomas; Evangeliou, Nikolaos; Stohl, Andreas; von Haustein, Christoph; Thummerer, Severin; Wallner, Christian

    2015-04-01

    Severe accidents in nuclear power plants such as the historical accident in Chernobyl 1986 or the more recent disaster in the Fukushima Dai-ichi nuclear power plant in 2011 have drastic impacts on the population and environment. Observations and dispersion modelling of the released radionuclides help to assess the regional impact of such nuclear accidents. Modelling the increase in regional radionuclide activity concentrations that results from nuclear accidents is subject to a multiplicity of uncertainties. One of the most significant is the estimation of the source term, that is, the time-dependent quantification of the spectrum of radionuclides released during the course of the nuclear accident. The quantification of the source term may either remain uncertain (e.g. Chernobyl, Devell et al., 1995) or rely on estimates given by the operators of the nuclear power plant. Precise measurements are mostly missing due to practical limitations during the accident. The release rates of radionuclides at the accident site can be estimated using inverse modelling (Davoine and Bocquet, 2007). The accuracy of the method depends, among other factors, on the availability and reliability of the observations used and on their resolution in time and space. Radionuclide activity concentrations are observed on a relatively sparse grid, and the temporal resolution of the available data may be low, on the order of hours or a day. Gamma dose rates, on the other hand, are observed routinely on a much denser grid and at higher temporal resolution, and therefore provide a wider basis for inverse modelling (Saunier et al., 2013). We present a new inversion approach, which combines an atmospheric dispersion model and observations of radionuclide activity concentrations and gamma dose rates to obtain the source term of radionuclides. We use the Lagrangian particle dispersion model FLEXPART (Stohl et al., 1998; Stohl et al., 2005) to model the atmospheric transport of the released radionuclides. 
The inversion method uses a Bayesian formulation considering uncertainties for the a priori source term and the observations (Eckhardt et al., 2008, Stohl et al., 2012). The a priori information on the source term is a first guess. The gamma dose rate observations are used to improve the first guess and to retrieve a reliable source term. The details of this method will be presented at the conference. This work is funded by the Bundesamt für Strahlenschutz BfS, Forschungsvorhaben 3612S60026. References Davoine, X. and Bocquet, M., Atmos. Chem. Phys., 7, 1549-1564, 2007. Devell, L., et al., OCDE/GD(96)12, 1995. Eckhardt, S., et al., Atmos. Chem. Phys., 8, 3881-3897, 2008. Saunier, O., et al., Atmos. Chem. Phys., 13, 11403-11421, 2013. Stohl, A., et al., Atmos. Environ., 32, 4245-4264, 1998. Stohl, A., et al., Atmos. Chem. Phys., 5, 2461-2474, 2005. Stohl, A., et al., Atmos. Chem. Phys., 12, 2313-2343, 2012.
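    The Bayesian combination of an a priori source term with observations can be illustrated in the simplest possible setting: a single release rate constrained by one gamma dose rate observation through a linear source-receptor relationship, assuming Gaussian errors. The sketch below is a toy stand-in for the full FLEXPART-based, multi-time-step inversion; all numbers and the sensitivity value are hypothetical.

```python
def bayesian_release_rate(x_prior, sigma_prior, y_obs, sigma_obs, m):
    """Posterior mean of a single release rate x given one observation
    y = m * x, with Gaussian prior (x_prior, sigma_prior) and
    observation error sigma_obs; m is the source-receptor sensitivity
    (dose rate per unit release rate)."""
    w_prior = 1.0 / sigma_prior**2        # prior precision
    w_obs = m**2 / sigma_obs**2           # precision carried by the observation
    x_from_obs = y_obs / m                # release rate implied by the observation
    return (w_prior * x_prior + w_obs * x_from_obs) / (w_prior + w_obs)

# Hypothetical numbers: prior 1.0e12 Bq/h +/- 0.5e12; observed dose rate
# 2.4 uSv/h +/- 0.2, with sensitivity 2.0e-12 uSv/h per Bq/h:
x_post = bayesian_release_rate(1.0e12, 0.5e12, 2.4, 0.2, 2.0e-12)
```

The posterior lands between the first guess and the observation-implied rate, weighted by their precisions; the full method applies the same principle to a time-resolved release vector and many receptors.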

  20. FERMI OBSERVATION OF THE TRANSITIONAL PULSAR BINARY XSS J12270–4859

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xing, Yi; Wang, Zhongxiang

    Because of the disappearance of its accretion disk during the time period of 2012 November-December, XSS J12270–4859 has recently been identified as a transitional millisecond pulsar binary, joining PSR J1023+0038. We have carried out a detailed analysis of the Fermi Large Area Telescope data for this binary. While both spectra are well described by an exponentially cut-off power law before and after the disk-disappearance transition, which is typical for pulsars' emission in Fermi's 0.2-300 GeV band, we have detected a factor of 2 flux decrease related to the transition. A weak orbital modulation is possibly seen, but is only detectable in the after-transition data, making it the same as the orbital modulations found in X-rays. In the long-term light curve of the source before the transition, factor of 3 flux variations are seen. Compared to the properties of J1023+0038, we discuss the implications of these results. We suggest that since the modulation is aligned in orbital phase with the modulations in X-rays, it possibly arises due to the occultation of the γ-ray emitting region by the companion. The origin of the variations in the long-term light curve is not clear, because the source field also contains unidentified radio or X-ray sources and their contamination cannot be excluded. Multi-wavelength observations of the source field will help identify the origin of the variations by detecting any related flux changes from the in-field sources.

  1. Financing biotechnology projects: lender due diligence requirements and the role of independent technical consultants.

    PubMed

    Keller, J B; Plath, P B

    1999-01-01

    An increasing number of biotechnology projects are being brought to commercialization using conventional structured-finance sources, which have traditionally only been available to proven technologies and primary industries. Attracting and securing competitively priced financing from mainstream lenders, however, will require the sponsor of a new technology or process to undergo a greater level of due diligence. The specific areas and intensity of investigation typically required by lenders in order to secure long-term financing for biotechnology-based manufacturing systems are reviewed. The processes for evaluating the adequacy of prior laboratory testing and pilot plant demonstrations are discussed. Particular emphasis is given to scale-up considerations and the ability of the proposed facility design to accommodate significant modifications, in the event that scale-up problems are encountered.

  2. An Interactive Computer Package for Use with Simulation Models Which Performs Multidimensional Sensitivity Analysis by Employing the Techniques of Response Surface Methodology.

    DTIC Science & Technology

    1984-12-01

    ... the total sum of squares at the center points minus the correction factor for the mean at the center points (SS_pe = Y'Y - n1*Ybar^2), where n1 is the number of ... (SS_lack = SS_res - SS_pe). The sum of squares due to pure error estimates sigma^2, and the sum of squares due to lack of fit estimates sigma^2 plus a bias term if ... ANOVA for Response Surface Methodology (Source, d.f., SS, MS): Regression: n_b, b'X'Y, b'X'Y/n_b; Residual: n - n_b, Y'Y - b'X'Y, (Y'Y - b'X'Y)/(n - n_b); Pure Error: n1 - 1, Y'Y - n1*Ybar^2, SS_pe/(n1 ...
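    The pure-error and lack-of-fit decomposition described in this record can be sketched directly; the replicate values below are hypothetical, and the notation (SS_pe, SS_lack) follows the text.

```python
def pure_error_ss(center_replicates):
    """Sum of squares due to pure error from n1 replicated center-point
    responses: SS_pe = sum(y_i^2) - n1 * ybar^2, i.e. Y'Y minus the
    correction factor for the mean at the center points."""
    n1 = len(center_replicates)
    ybar = sum(center_replicates) / n1
    return sum(y * y for y in center_replicates) - n1 * ybar**2

def lack_of_fit_ss(ss_residual, ss_pure_error):
    """SS_lack = SS_res - SS_pe; its mean square estimates sigma^2 plus
    a bias term when the fitted response surface is inadequate."""
    return ss_residual - ss_pure_error

reps = [10.2, 9.8, 10.1, 9.9]  # hypothetical center-point responses
ss_pe = pure_error_ss(reps)    # SS_pe on n1 - 1 = 3 degrees of freedom
ss_lack = lack_of_fit_ss(0.35, ss_pe)
```

Comparing the lack-of-fit mean square against the pure-error mean square is the usual F-test for model adequacy in response surface methodology.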

  3. Elementary Theoretical Forms for the Spatial Power Spectrum of Earth's Crustal Magnetic Field

    NASA Technical Reports Server (NTRS)

    Voorhies, C.

    1998-01-01

    The magnetic field produced by magnetization in Earth's crust and lithosphere can be distinguished from the field produced by electric currents in Earth's core because the spatial magnetic power spectrum of the crustal field differs from that of the core field. Theoretical forms for the spectrum of the crustal field are derived by treating each magnetic domain in the crust as the point source of a dipole field. The geologic null-hypothesis that such moments are uncorrelated is used to obtain the magnetic spectrum expected from a randomly magnetized, or unstructured, spherical crust of negligible thickness. This simplest spectral form is modified to allow for uniform crustal thickness, ellipsoidality, and the polarization of domains by a periodically reversing, geocentric axial dipole field from Earth's core. Such spectra are intended to describe the background crustal field. Magnetic anomalies due to correlated magnetization within coherent geologic structures may well be superimposed upon this background; yet representing each such anomaly with a single point dipole may lead to similar spectral forms. Results from attempts to fit these forms to observational spectra, determined via spherical harmonic analysis of MAGSAT data, are summarized in terms of amplitude, source depth, and misfit. Each theoretical spectrum reduces to a source factor multiplied by the usual exponential function of spherical harmonic degree n due to geometric attenuation with altitude above the source layer. The source factors always vary with n and are approximately proportional to n(exp 3) for degrees 12 through 120. The theoretical spectra are therefore not directly proportional to an exponential function of spherical harmonic degree n. There is no radius at which these spectra are flat, level, or otherwise independent of n.

  4. Contribution of Fugitive Emissions for PM10 Concentrations in an Industrial Area of Portugal

    NASA Astrophysics Data System (ADS)

    Marta Almeida, Susana; Viana Silva, Alexandra; Garcia, Silvia; Miranda, Ana Isabel

    2013-04-01

    Significant atmospheric dust arises from the mechanical disturbance of granular material exposed to the air. Dust generated from these open sources is termed "fugitive" because it is not discharged to the atmosphere in a confined flow stream. Common sources of fugitive dust include unpaved roads, agricultural tilling operations, aggregate storage piles, heavy construction and harbor operations. The objective of this work was to identify the likelihood and extent of PM10 limit-value exceedances due to fugitive emissions in a zone where fugitive PM emissions are a core environmental concern: Mitrena, Portugal. Mitrena is an industrial area that coexists with a high-density urban region (Setúbal) and areas of important environmental concern (the Sado Estuary and Arrábida, which belongs to the protected Natura 2000 Network). Due to the typology of the industry sited in Mitrena (e.g. a power plant, a paper mill, and cement, pesticide and fertilizer production), there are large uncontrolled fugitive PM emissions, arising from heavy traffic and from the handling and storage of raw material in uncovered stockyards at the harbor and industries. Dispersion modeling was performed with the software TAPM (The Air Pollution Model) and results were mapped over the study area using GIS (Geographic Information Systems). Results showed that managing local particle concentrations can be a frustrating affair, because the weight of fugitive sources is very high compared with the local anthropogenic stationary sources. In order to ensure that industry can continue to meet its commitments to protecting air quality, it is essential that the characteristics of releases from all fugitive sources be fully understood, in order to target future investments in those areas where maximum benefit will be achieved.

  5. Comparison of a new integrated current source with the modified Howland circuit for EIT applications.

    PubMed

    Hong, Hongwei; Rahal, Mohamad; Demosthenous, Andreas; Bayford, Richard H

    2009-10-01

    Multi-frequency electrical impedance tomography (MF-EIT) systems require current sources that are accurate over a wide frequency range (up to 1 MHz) and with large load impedance variations. The most commonly employed current source design in EIT systems is the modified Howland circuit (MHC). The MHC requires tight matching of resistors to achieve high output impedance and may suffer from instability over a wide frequency range in an integrated solution. In this paper, we introduce a new integrated current source design in CMOS technology and compare its performance with the MHC. The new integrated design has advantages over the MHC in terms of power consumption and area. The output current and the output impedance of both circuits were determined through simulations and measurements over the frequency range of 10 kHz to 1 MHz. For frequencies up to 1 MHz, the measured maximum variation of the output current for the integrated current source is 0.8%, whereas for the MHC the corresponding value is 1.5%. Although the integrated current source has an output impedance greater than 1 MΩ up to 1 MHz in simulations, in practice the impedance is greater than 160 kΩ up to 1 MHz due to the presence of stray capacitance.

  6. The effect of barriers on wave propagation phenomena: With application for aircraft noise shielding

    NASA Technical Reports Server (NTRS)

    Mgana, C. V. M.; Chang, I. D.

    1982-01-01

    The frequency spectrum was divided into high- and low-frequency regimes, and two separate methods were developed and applied to account for physical factors associated with flight conditions. For long-wave propagation, the acoustic field due to a point source near a solid obstacle was treated in terms of an inner region, where the fluid motion is essentially incompressible, and an outer region, which is a linear acoustic field generated by hydrodynamic disturbances in the inner region. This method was applied to the case of a finite slotted plate modelled to represent a wing with an extended flap, for both stationary and moving media. Ray acoustics, the Kirchhoff integral formulation, and the stationary phase approximation were combined to study short-wavelength propagation in many limiting cases, as well as in the case of a semi-infinite plate in a uniform flow with a point source above the plate embedded in a different flow velocity, to simulate an engine exhaust jet stream surrounding the source.

  7. Stockholm Arlanda Airport as a source of per- and polyfluoroalkyl substances to water, sediment and fish.

    PubMed

    Ahrens, Lutz; Norström, Karin; Viktor, Tomas; Cousins, Anna Palm; Josefsson, Sarah

    2015-06-01

    Fire training facilities are potential sources of per- and polyfluoroalkyl substances (PFASs) to the nearby environment due to the use of PFAS-containing aqueous fire-fighting foams (AFFFs). The multimedia distribution of perfluoroalkyl carboxylates (PFCAs), perfluoroalkyl sulfonates (PFSAs), perfluorooctanesulfonamide (PFOSA) and 6:2 fluorotelomer sulfonate (FTSA) was investigated near a fire training facility at Stockholm Arlanda Airport in Sweden. The whole-body burden of PFASs in European perch (Perca fluviatilis) was 334 ± 80 μg (absolute) and was distributed as follows: gonad > liver ≈ muscle > blood > gill. The bioconcentration factor (BCF) and sediment/water partition coefficient (Kd) increased by 0.6-1.7 and 0.2-0.5 log units, respectively, for each additional CF2 moiety for PFCAs and PFSAs. PFAS concentrations in water showed no significant decreasing trend between 2009 and 2013 (p > 0.05), which indicates that Stockholm Arlanda Airport may be an important source of long-term contamination of the nearby environment with PFASs. Copyright © 2014 Elsevier Ltd. All rights reserved.

  8. Post-reionization Kinetic Sunyaev-Zel'dovich Signal in the Illustris simulation

    NASA Astrophysics Data System (ADS)

    Park, Hyunbae; Alvarez, Marcelo A.; Bond, John Richard

    2017-06-01

    Using Illustris, a state-of-the-art cosmological simulation of gravity, hydrodynamics, and star formation, we revisit the calculation of the angular power spectrum of the kinetic Sunyaev-Zel'dovich effect from the post-reionization (z < 6) epoch by Shaw et al. (2012). We not only report the updated value given by the analytical model used in previous studies, but also examine the simplifying assumptions made in the model. These assumptions include using the gas density in place of the free electron density and neglecting the connected term that arises from the fourth-order nature of the momentum power spectrum sourcing the signal. With these assumptions, Illustris gives a slightly (˜ 10%) larger signal than in their work. The signal is then reduced by ˜ 20% when the actual free electron density is used in the calculation instead of the gas density, because the larger neutral fraction in dense regions results in a loss of total free electrons and a suppression of fluctuations in the free electron density. We find that the connected term can account for up to half of the momentum power spectrum at z < 2. Due to a strong suppression of the low-z signal by baryonic physics, the extra contribution from the connected term is limited to the ˜ 10% level, although it may have been underestimated due to the finite box size of Illustris. With these corrections, our result is very close to the original result of Shaw et al. (2012), which is well described by a simple power law, D_l = 1.38 [l/3000]^0.21 μK², at 3000 < l < 10000.
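The quoted power-law fit is straightforward to evaluate; a minimal sketch (the coefficient, exponent, and validity range are taken directly from the abstract):

```python
def d_ell_ksz(ell):
    """Post-reionization kSZ template D_l = 1.38 (l/3000)^0.21 in micro-K^2,
    quoted as valid for 3000 < l < 10000."""
    return 1.38 * (ell / 3000.0) ** 0.21

print(d_ell_ksz(3000))            # 1.38
print(round(d_ell_ksz(10000), 2)) # ~1.78
```

The shallow exponent (0.21) means the template rises by only about 30% across the full multipole range quoted.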

  9. Piecewise synonyms for enhanced UMLS source terminology integration.

    PubMed

    Huang, Kuo-Chuan; Geller, James; Halper, Michael; Cimino, James J

    2007-10-11

    The UMLS contains more than 100 source vocabularies and is growing via the integration of others. When integrating a new source, the source terms already in the UMLS must first be found. The easiest approach to this is simple string matching. However, string matching usually does not find all concepts that should be found. A new methodology, based on the notion of piecewise synonyms, for enhancing the process of concept discovery in the UMLS is presented. This methodology is supported by first creating a general synonym dictionary based on the UMLS. Each multi-word source term is decomposed into its component words, allowing for the generation of separate synonyms for each word from the general synonym dictionary. The recombination of these synonyms into new terms creates an expanded pool of matching candidates for terms from the source. The methodology is demonstrated with respect to an existing UMLS source. It shows a 34% improvement over simple string matching.
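The decompose-substitute-recombine step described above can be sketched as follows. The synonym dictionary and input term here are toy stand-ins, not actual UMLS data:

```python
from itertools import product

# Toy general synonym dictionary (the paper derives its dictionary from the UMLS itself)
synonyms = {
    "kidney": ["kidney", "renal"],
    "failure": ["failure", "insufficiency"],
}

def candidate_terms(term):
    """Decompose a multi-word term into its component words, substitute
    per-word synonyms, and recombine into an expanded pool of candidates."""
    words = term.lower().split()
    choices = [synonyms.get(w, [w]) for w in words]
    return {" ".join(combo) for combo in product(*choices)}

pool = candidate_terms("acute kidney failure")
print(sorted(pool))
# Any existing UMLS concept whose string matches a member of `pool`
# becomes a matching candidate that plain string matching would miss.
```

Here "acute renal insufficiency" is found even though it shares no full-string match with the source term, which is the effect that raises recall over simple string matching.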

  10. Environmental impact and risk assessments and key factors contributing to the overall uncertainties.

    PubMed

    Salbu, Brit

    2016-01-01

    There is a significant number of nuclear and radiological sources that have contributed, are still contributing, or have the potential to contribute to radioactive contamination of the environment in the future. To protect the environment from radioactive contamination, impact and risk assessments are performed prior to or during a release event, in the short or long term after deposition, or before and after implementation of countermeasures. When environmental impact and risks are assessed, however, a series of factors contribute to the overall uncertainties. To provide environmental impact and risk assessments, information on processes, kinetics and a series of input variables is needed. Compounded by variability, questionable assumptions, gaps in knowledge, extrapolations and poor conceptual model structures, these factors produce large and often unacceptable uncertainties in impact and risk assessments. Information on the source term and the release scenario is an essential starting point in impact and risk models; the source determines the activity concentrations and atom ratios of the radionuclides released, while the release scenario determines the physico-chemical forms of released radionuclides, such as particle size distribution, structure and density. Releases will most often contain other contaminants such as metals, and due to interactions, contaminated sites should be assessed as a multiple stressor scenario. Following deposition, a series of stressors, interactions and processes will influence the ecosystem transfer of radionuclide species and thereby influence biological uptake (toxicokinetics) and responses (toxicodynamics) in exposed organisms. Due to the variety of biological species, extrapolation is frequently needed to fill gaps in knowledge, e.g., from effects to no effects, from effects in one organism to others, or from one stressor to mixtures.
Most toxicity tests are, however, performed as short-term exposures of adult organisms, ignoring sensitive life history stages and transgenerational effects. To link sources, ecosystem transfer and biological effects to future impact and risks, a series of models are usually interfaced, while uncertainty estimates are seldom given. The model predictions are, however, only valid within the boundaries of the overall uncertainties. Furthermore, the model predictions are only useful and relevant when uncertainties are estimated, communicated and understood. Among the key factors contributing most to uncertainties, the present paper focuses especially on structure uncertainties (model bias or discrepancies), as aspects such as particle releases, ecosystem dynamics, mixed exposure, sensitive life history stages and transgenerational effects are usually ignored in assessment models. Research focus on these aspects should significantly reduce the overall uncertainties in the impact and risk assessment of radioactively contaminated ecosystems. Copyright © 2015 Elsevier Ltd. All rights reserved.

  11. Toxicity studies of detoxified Jatropha meal (Jatropha curcas) in rats.

    PubMed

    Rakshit, K D; Darukeshwara, J; Rathina Raj, K; Narasimhamurthy, K; Saibaba, P; Bhagya, S

    2008-12-01

    Jatropha curcas, a tropical plant introduced in many Asian and African countries, is presently used as a source of biodiesel. The cake remaining after oil extraction is rich in protein and is a potential source of livestock feed. In view of the highly toxic nature of both whole and dehulled seed meal due to the presence of toxic phorbol esters and lectin, the meal was subjected to alkali and heat treatments to deactivate the phorbol ester and lectin content. After treatment, the phorbol ester content was reduced by up to 89% in whole and dehulled seed meal. Toxicity studies were conducted on growing male rats by feeding treated as well as untreated meal through a dietary source. All rats, irrespective of treatment, had reduced appetite; diet intake was low and accompanied by diarrhoea. The rats also exhibited reduced motor activity. Rats fed the treated meals exhibited delayed mortality compared to rats fed untreated meal (p < 0.02). There were significant changes both in terms of food intake and gain in body weight. Gross examination of vital organs indicated atrophy compared to control casein-fed rats. However, histopathological examination of various vital organs did not reveal any treatment-related microscopic changes, suggesting that the mortality of rats occurred due to lack of food intake, diarrhoea and emaciation. Further studies are in progress for complete detoxification of J. curcas meal for use in livestock feed.

  12. Low birth weight and air pollution in California: Which sources and components drive the risk?

    PubMed

    Laurent, Olivier; Hu, Jianlin; Li, Lianfa; Kleeman, Michael J; Bartell, Scott M; Cockburn, Myles; Escobedo, Loraine; Wu, Jun

    2016-01-01

    Intrauterine growth restriction has been associated with exposure to air pollution, but there is a need to clarify which sources and components are most likely responsible. This study investigated the associations between low birth weight (LBW, < 2500 g) in term-born infants (≥ 37 gestational weeks) and air pollution by source and composition in California over the period 2001-2008. Complementary exposure models were used: an empirical Bayesian kriging model for the interpolation of ambient pollutant measurements; a source-oriented chemical transport model (using California emission inventories) that estimated fine and ultrafine particulate matter (PM2.5 and PM0.1, respectively) mass concentrations (4 km × 4 km) by source and composition; a line-source roadway dispersion model at fine resolution; and traffic index estimates. Birth weight was obtained from California birth certificate records. A case-cohort design was used. Five controls per term LBW case were randomly selected (without covariate matching or stratification) from among term births. The resulting datasets were analyzed by logistic regression with a random effect by hospital, using generalized additive mixed models adjusted for race/ethnicity, education, maternal age and household income. In total, 72,632 singleton term LBW cases were included. Term LBW was positively and significantly associated with interpolated measurements of ozone, but not with total fine PM or nitrogen dioxide. No significant association was observed between term LBW and primary PM from all sources grouped together. A positive significant association was observed for secondary organic aerosols. Exposures to elemental carbon (EC), nitrates and ammonium were also positively and significantly associated with term LBW, but only for exposure during the third trimester of pregnancy. Significant positive associations were observed between term LBW risk and primary PM emitted by on-road gasoline and diesel or by commercial meat cooking sources. 
Primary PM from wood burning was inversely associated with term LBW. Significant positive associations were also observed between term LBW and ultrafine particle numbers modeled with the line-source roadway dispersion model, traffic density and proximity to roadways. This large study based on complementary exposure metrics suggests that not only primary pollution sources (traffic and commercial meat cooking) but also EC and secondary pollutants are risk factors for term LBW. Copyright © 2016 Elsevier Ltd. All rights reserved.

  13. Water relations of riparian plants from warm desert regions

    USGS Publications Warehouse

    Smith, S.D.; Devitt, Dale A.; Cleverly, James R.; Busch, David E.

    1998-01-01

    Riparian plants have been classified as 'drought avoiders' due to their access to an abundant subsurface water supply. Recent water-relations research that tracks the water sources of riparian plants using the stable isotopes of water suggests that many plants of the riparian zone use ground water rather than stream water, and that not all riparian plants are obligate phreatophytes (dependent on ground water as a moisture source) but may occasionally depend on unsaturated soil moisture sources. A more thorough understanding of riparian plant-water relations must include water-source dynamics and how those dynamics vary over both space and time. Many rivers in the desert Southwest have been invaded by the exotic shrub Tamarix ramosissima (saltcedar). Our studies of Tamarix invasion into habitats formerly dominated by native riparian forests of primarily Populus and Salix have shown that Tamarix successfully invades these habitats because of its (1) greater tolerance to water stress and salinity, (2) status as a facultative, rather than obligate, phreatophyte and, therefore, its ability to recover from droughts and periods of ground-water drawdown, and (3) superior regrowth after fire. Analysis of water-loss rates indicates that Tamarix-dominated stands can have extremely high evapotranspiration rates when water tables are high, but not necessarily when water tables are lower. Tamarix has leaf-level transpiration rates that are comparable to native species, whereas its sap-flow rates per unit sapwood area are higher than in natives, suggesting that Tamarix maintains a higher leaf area than natives can, probably due to its greater water stress tolerance. Tamarix desiccates and salinizes floodplains, due to its salt exudation and high transpiration rates, and may also accelerate fire cycles, thus predisposing these ecosystems to further loss of native taxa. 
Riparian species on regulated rivers can be exposed to seasonal water stress due to depression of floodplain water tables and elimination of annual floods. This can potentially result in a community shift toward more stress-tolerant taxa, such as Tamarix, due to the inability of other riparian species to germinate and establish in the desiccated floodplain environment. Management efforts aimed at maintaining native forests on regulated rivers and slowing the spread of Tamarix invasion must include at least partial reintroduction of historical flow regimes, which favor the recruitment of native riparian species and reverse the long-term desiccation of desert floodplain environments.

  14. Simultaneous event-specific estimates of transport, loss, and source rates for relativistic outer radiation belt electrons: Event-Specific 1-D Modeling

    DOE PAGES

    Schiller, Q.; Tu, W.; Ali, A. F.; ...

    2017-03-11

    The most significant unknown regarding relativistic electrons in Earth’s outer Van Allen radiation belt is the relative contribution of loss, transport, and acceleration processes within the inner magnetosphere. Disentangling each individual process is critical to improve the understanding of radiation belt dynamics, but determining a single component is challenging due to sparse measurements in diverse spatial and temporal regimes. However, there are currently an unprecedented number of spacecraft taking measurements that sample different regions of the inner magnetosphere. With the increasing number of varied observational platforms, system dynamics can begin to be unraveled. In this work, we employ in-situ measurements during the 13-14 January 2013 enhancement event to isolate transport, loss, and source dynamics in a one-dimensional radial diffusion model. We then validate the results by comparing them to Van Allen Probes and THEMIS observations, indicating that the three terms have been accurately and individually quantified for the event. Finally, a direct comparison is performed between the model containing event-specific terms and various models containing terms parameterized by geomagnetic index. Models using a simple 3/Kp loss timescale show deviation from the event-specific model of nearly two orders of magnitude within 72 hours of the enhancement event. However, models using alternative loss timescales closely resemble the event-specific model.
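The Kp-parameterized loss model mentioned above uses an electron lifetime of τ = 3/Kp (in days). A minimal illustration of the resulting exponential decay; the decay form is the standard lifetime parameterization, and the initial flux is an arbitrary placeholder:

```python
import math

def kp_loss_timescale_days(kp):
    """Electron lifetime tau = 3/Kp (days), the simple empirical parameterization
    referenced in the abstract."""
    return 3.0 / kp

def flux_after(f0, t_days, kp):
    """Exponential decay of flux with lifetime tau = 3/Kp, holding Kp constant."""
    return f0 * math.exp(-t_days / kp_loss_timescale_days(kp))

# During an active period (Kp = 6), tau = 0.5 day: after 3 days, only exp(-6)
# of the initial flux remains, illustrating how quickly this model empties the belt
print(flux_after(1.0, 3.0, kp=6))  # exp(-6) ~ 2.5e-3
```

Over 72 hours of sustained high Kp this model sheds several e-foldings of flux, which is consistent with the nearly two-orders-of-magnitude deviation from the event-specific model reported above.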

  15. Simultaneous event-specific estimates of transport, loss, and source rates for relativistic outer radiation belt electrons: Event-Specific 1-D Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schiller, Q.; Tu, W.; Ali, A. F.

    The most significant unknown regarding relativistic electrons in Earth’s outer Van Allen radiation belt is the relative contribution of loss, transport, and acceleration processes within the inner magnetosphere. Disentangling each individual process is critical to improve the understanding of radiation belt dynamics, but determining a single component is challenging due to sparse measurements in diverse spatial and temporal regimes. However, there are currently an unprecedented number of spacecraft taking measurements that sample different regions of the inner magnetosphere. With the increasing number of varied observational platforms, system dynamics can begin to be unraveled. In this work, we employ in-situ measurements during the 13-14 January 2013 enhancement event to isolate transport, loss, and source dynamics in a one-dimensional radial diffusion model. We then validate the results by comparing them to Van Allen Probes and THEMIS observations, indicating that the three terms have been accurately and individually quantified for the event. Finally, a direct comparison is performed between the model containing event-specific terms and various models containing terms parameterized by geomagnetic index. Models using a simple 3/Kp loss timescale show deviation from the event-specific model of nearly two orders of magnitude within 72 hours of the enhancement event. However, models using alternative loss timescales closely resemble the event-specific model.

  16. High-order scheme for the source-sink term in a one-dimensional water temperature model

    PubMed Central

    Jing, Zheng; Kang, Ling

    2017-01-01

    The source-sink term in water temperature models represents the net heat absorbed or released by a water system. This term is very important because it accounts for solar radiation that can significantly affect water temperature, especially in lakes. However, existing numerical methods for discretizing the source-sink term are very simplistic, causing significant deviations between simulation results and measured data. To address this problem, we present a numerical method specific to the source-sink term. A vertical one-dimensional heat conduction equation was chosen to describe water temperature changes. A two-step operator-splitting method was adopted as the numerical solution. In the first step, using the undetermined coefficient method, a high-order scheme was adopted for discretizing the source-sink term. In the second step, the diffusion term was discretized using the Crank-Nicolson scheme. The effectiveness and capability of the numerical method was assessed by performing numerical tests. Then, the proposed numerical method was applied to a simulation of Guozheng Lake (located in central China). The modeling results were in excellent agreement with measured data. PMID:28264005
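A minimal sketch of the two-step operator splitting described above, with a simple explicit source update in step one (the paper's contribution is a higher-order scheme at this step) and Crank-Nicolson diffusion in step two. The grid, diffusivity, and solar-heating profile are illustrative assumptions:

```python
import numpy as np

def step(T, S, dt, dz, kappa):
    """One time step of operator splitting for dT/dt = kappa d2T/dz2 + S(z).
    Step 1: source-sink update (explicit here; the paper uses a high-order scheme).
    Step 2: Crank-Nicolson for the diffusion term, with zero-flux boundaries."""
    n = len(T)
    T = T + dt * S                        # step 1: source-sink term
    r = kappa * dt / (2 * dz**2)
    A = np.zeros((n, n)); B = np.zeros((n, n))
    for i in range(n):
        A[i, i] = 1 + 2*r; B[i, i] = 1 - 2*r
        if i > 0:   A[i, i-1] = -r; B[i, i-1] = r
        if i < n-1: A[i, i+1] = -r; B[i, i+1] = r
    # zero-flux (insulated) boundaries via mirrored ghost nodes
    A[0, 1] = -2*r;  B[0, 1] = 2*r
    A[-1, -2] = -2*r; B[-1, -2] = 2*r
    return np.linalg.solve(A, B @ T)      # step 2: diffusion

z = np.linspace(0, 10, 51)                # 10 m water column
T = np.full_like(z, 15.0)                 # initial temperature (deg C)
S = 0.5 * np.exp(-z)                      # solar heating decaying with depth (deg C/day)
for _ in range(10):
    T = step(T, S, dt=0.1, dz=z[1] - z[0], kappa=0.1)
print(T[0], T[-1])                        # surface warms more than the bottom
```

The split structure makes it easy to swap in a higher-order source discretization, which is the substitution the paper performs, without touching the unconditionally stable Crank-Nicolson diffusion step.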

  17. High-order scheme for the source-sink term in a one-dimensional water temperature model.

    PubMed

    Jing, Zheng; Kang, Ling

    2017-01-01

    The source-sink term in water temperature models represents the net heat absorbed or released by a water system. This term is very important because it accounts for solar radiation that can significantly affect water temperature, especially in lakes. However, existing numerical methods for discretizing the source-sink term are very simplistic, causing significant deviations between simulation results and measured data. To address this problem, we present a numerical method specific to the source-sink term. A vertical one-dimensional heat conduction equation was chosen to describe water temperature changes. A two-step operator-splitting method was adopted as the numerical solution. In the first step, using the undetermined coefficient method, a high-order scheme was adopted for discretizing the source-sink term. In the second step, the diffusion term was discretized using the Crank-Nicolson scheme. The effectiveness and capability of the numerical method was assessed by performing numerical tests. Then, the proposed numerical method was applied to a simulation of Guozheng Lake (located in central China). The modeling results were in excellent agreement with measured data.

  18. Short-term emergency response planning and risk assessment via an integrated modeling system for nuclear power plants in complex terrain

    NASA Astrophysics Data System (ADS)

    Chang, Ni-Bin; Weng, Yu-Chi

    2013-03-01

    Short-term predictions of potential impacts from the accidental release of various radionuclides at nuclear power plants are acutely needed, especially after the Fukushima accident in Japan. An integrated modeling system that provides expert services to assess the consequences of accidental or intentional releases of radioactive materials to the atmosphere has received wide attention. Such scenarios can be initiated either by accident, due to human, software, or mechanical failures, or by intentional acts such as sabotage and radiological dispersal devices. Stringent action might be required just minutes after the occurrence of an accidental or intentional release. Although emergency preparedness and response systems must fulfill these basic functions, previous studies seldom consider the suitability of air pollutant dispersion models or the connectivity between source term, dispersion, and exposure assessment models in a holistic context for decision support. In particular, the Gaussian plume and puff models, which are suitable only for representing neutral air pollutants over flat terrain under a limited range of meteorological conditions, are frequently used to predict the impact of accidental releases from industrial sources. In situations with complex terrain or special meteorological conditions, the proposed emergency response actions might be questionable and even intractable for decision-makers responsible for maintaining public health and environmental quality. This study is a preliminary effort to integrate source term, dispersion, and exposure assessment models into a Spatial Decision Support System (SDSS) to tackle the complex issues of short-term emergency response planning and risk assessment at nuclear power plants. 
Through a series of model screening procedures, we found that a diagnostic (objective) wind field model, aided by sufficient on-site meteorological monitoring data, was the most applicable model for promptly resolving local wind field patterns. However, most of the hazardous materials released into the environment from nuclear power plants are not neutral pollutants, so particle and multi-segment puff models can be regarded as the most suitable models to couple with the output of the diagnostic wind field model in a modern emergency preparedness and response system. The proposed SDSS illustrates a state-of-the-art system design based on the complex terrain of South Taiwan. This SDSS design, with 3-dimensional animation capability and a tailored source term model connected to ArcView® Geographical Information System map layers and remote sensing images, is useful for meeting the design goals of nuclear power plants located in complex terrain.
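For contrast with the terrain-aware models the study favors, the flat-terrain Gaussian plume model it critiques is simple to state. A minimal implementation; the linear dispersion coefficients are crude placeholders, not a validated stability-class scheme:

```python
import math

def gaussian_plume(q, u, x, y, z, h, a=0.08, b=0.06):
    """Ground-reflected Gaussian plume concentration (g/m^3) for a continuous
    point source of strength q (g/s), wind speed u (m/s), release height h (m),
    at downwind distance x, crosswind offset y, height z (all in m).
    sigma_y = a*x and sigma_z = b*x are simplistic placeholder dispersion curves."""
    sy, sz = a * x, b * x
    lateral = math.exp(-y**2 / (2 * sy**2))
    vertical = (math.exp(-(z - h)**2 / (2 * sz**2))
                + math.exp(-(z + h)**2 / (2 * sz**2)))  # image source: ground reflection
    return q / (2 * math.pi * u * sy * sz) * lateral * vertical

# Centerline, ground-level concentration 1 km downwind of a 50 m stack
print(gaussian_plume(q=100.0, u=5.0, x=1000.0, y=0.0, z=0.0, h=50.0))
```

The formula's assumptions (uniform wind, flat terrain, passive tracer) are exactly what break down in the complex-terrain, buoyant-release situations the abstract describes, motivating the diagnostic wind field plus particle/puff approach.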

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williamson, Jeffrey F.

    This paper briefly reviews the evolution of brachytherapy dosimetry from 1900 to the present. Dosimetric practices in brachytherapy fall into three distinct eras: During the era of biological dosimetry (1900-1938), radium pioneers could only specify Ra-226 and Rn-222 implants in terms of the mass of radium encapsulated within the implanted sources. Due to the high energy of its emitted gamma rays and the long range of its secondary electrons in air, free-air chambers could not be used to quantify the output of Ra-226 sources in terms of exposure. Biological dosimetry, most prominently the threshold erythema dose, gained currency as a means of intercomparing radium treatments with exposure-calibrated orthovoltage x-ray units. The classical dosimetry era (1940-1980) began with successful exposure standardization of Ra-226 sources by Bragg-Gray cavity chambers. Classical dose-computation algorithms, based upon 1-D buildup factor measurements and point-source superposition computational algorithms, were able to accommodate artificial radionuclides such as Co-60, Ir-192, and Cs-137. The quantitative dosimetry era (1980- ) arose in response to the increasing utilization of low-energy K-capture radionuclides such as I-125 and Pd-103, for which classical approaches could not be expected to estimate correct doses. This led to intensive development of both experimental (largely TLD-100) dosimetry and Monte Carlo dosimetry techniques, along with more accurate air-kerma strength standards. As a result of extensive benchmarking and intercomparison of these different methods, single-seed low-energy radionuclide dose distributions are now known with a total uncertainty of 3%-5%.

  20. History of dose specification in Brachytherapy: From Threshold Erythema Dose to Computational Dosimetry

    NASA Astrophysics Data System (ADS)

    Williamson, Jeffrey F.

    2006-09-01

    This paper briefly reviews the evolution of brachytherapy dosimetry from 1900 to the present. Dosimetric practices in brachytherapy fall into three distinct eras: During the era of biological dosimetry (1900-1938), radium pioneers could only specify Ra-226 and Rn-222 implants in terms of the mass of radium encapsulated within the implanted sources. Due to the high energy of its emitted gamma rays and the long range of its secondary electrons in air, free-air chambers could not be used to quantify the output of Ra-226 sources in terms of exposure. Biological dosimetry, most prominently the threshold erythema dose, gained currency as a means of intercomparing radium treatments with exposure-calibrated orthovoltage x-ray units. The classical dosimetry era (1940-1980) began with successful exposure standardization of Ra-226 sources by Bragg-Gray cavity chambers. Classical dose-computation algorithms, based upon 1-D buildup factor measurements and point-source superposition computational algorithms, were able to accommodate artificial radionuclides such as Co-60, Ir-192, and Cs-137. The quantitative dosimetry era (1980- ) arose in response to the increasing utilization of low-energy K-capture radionuclides such as I-125 and Pd-103, for which classical approaches could not be expected to estimate correct doses. This led to intensive development of both experimental (largely TLD-100) dosimetry and Monte Carlo dosimetry techniques, along with more accurate air-kerma strength standards. As a result of extensive benchmarking and intercomparison of these different methods, single-seed low-energy radionuclide dose distributions are now known with a total uncertainty of 3%-5%.

  1. Low latitude ice core evidence for dust deposition on high altitude glaciers

    NASA Astrophysics Data System (ADS)

    Gabrielli, P.; Thompson, L. G.

    2017-12-01

    Polar ice cores from Antarctica and Greenland have provided a wealth of information on dust emission, transport and deposition over glacial to interglacial timescales. These ice cores mainly entrap dust transported long distances from source areas such as Asia for Greenland and South America for Antarctica. Thus, these dust records provide paleo-information about environmental conditions at the source and the strength and pathways of atmospheric circulation at continental scales. Ice cores have also been extracted from high altitude glaciers in the mid- and low latitudes and provide dust records generally extending back several centuries and, in a few cases, back to the last glacial period. For these glaciers the potential sources of dust emission include areas close or adjacent to the drilling site, which facilitates a strong imprinting of local dust in the records. In addition, only a few high altitude glaciers allow the reconstruction of past snow accumulation and hence the expression of the dust records in terms of fluxes. Due to their extreme elevation, a few of these high altitude ice cores offer dust histories with the potential to record environmental conditions at remote sources. Dust records (in terms of dust concentration/size, crustal trace elements and terrigenous cations) from Africa, the European Alps, South America and the Himalayas are examined over the last millennium. The interplay of seasonal atmospheric circulation (e.g. westerlies, monsoons and vertical convection) is shown to play a major role in determining the intensity and origin of dust fallout on high altitude glaciers around the world.

  2. The time variability of Jupiter's synchrotron radiation

    NASA Astrophysics Data System (ADS)

    Bolton, Scott Jay

    1991-02-01

    The time variability of the Jovian synchrotron emission is investigated by analyzing radio observations of Jupiter at decimetric wavelengths. The observations are composed of two distinct sets of measurements addressing both short-term (days to weeks) and long-term (months to years) variability. The study of long-term variations utilizes a set of measurements made several times each month with the NASA Deep Space Network (DSN) antennas operating at 2295 MHz (13.1 cm). The DSN data set, covering 1971 through 1985, is compared with a set of measurements of the solar wind from a number of Earth-orbiting spacecraft. The analysis indicates a maximum correlation between the synchrotron emission and the solar wind ram pressure with a two-year time lag. Physical mechanisms affecting the synchrotron emission are discussed with an emphasis on radial diffusion. Calculations are performed that suggest the correlation is consistent with inward adiabatic diffusion of solar wind particles driven by Brice's model of ionospheric neutral wind convection (Brice 1972). The implication is that the solar wind could be a source of particles for Jupiter's radiation belts. The investigation of short-term variability focuses on a three-year Jupiter observing program using the University of California's Hat Creek radio telescope operating at 1400 MHz (21 cm). Measurements were made every two days during the months surrounding opposition. Results from the three-year program suggest short-term variability near the 10-20 percent level but should be considered inconclusive due to scheduling and observational limitations. A discussion of magnetospheric processes on short timescales identifies wave-particle interactions as a candidate source. Further analysis finds that the short-term variations could be related to whistler-mode wave-particle interactions in the radiation belts associated with atmospheric lightning on Jupiter. 
However, theoretical calculations of wave-particle interactions place constraints on whether whistler-mode waves can interact with the synchrotron-emitting electrons.

  3. Antimicrobial Resistance Profiles and Diversity in Salmonella from Humans and Cattle, 2004-2011.

    PubMed

    Afema, J A; Mather, A E; Sischo, W M

    2015-11-01

    Analysis of long-term anti-microbial resistance (AMR) data is useful to understand source and transmission dynamics of AMR. We analysed 5124 human clinical isolates from Washington State Department of Health, 391 cattle clinical isolates from the Washington Animal Disease Diagnostic Laboratory and 1864 non-clinical isolates from foodborne disease research on dairies in the Pacific Northwest. Isolates were assigned profiles based on phenotypic resistance to 11 anti-microbials belonging to eight classes. Salmonella Typhimurium (ST), Salmonella Newport (SN) and Salmonella Montevideo (SM) were the most common serovars in both humans and cattle. Multinomial logistic regression showed ST and SN from cattle had greater probability of resistance to multiple classes of anti-microbials than ST and SN from humans (P < 0.0001). While these findings could be consistent with the belief that cattle are a source of resistant ST and SN for people, occurrence of profiles unique to cattle and not observed in temporally related human isolates indicates these profiles are circulating in cattle only. We used various measures to assess AMR diversity, conditional on the weighting of rare versus abundant profiles. AMR profile richness was greater in the common serovars from humans, although both source data sets were dominated by relatively few profiles. The greater profile richness in human Salmonella may be due to greater diversity of sources entering the human population compared to cattle or due to continuous evolution in the human environment. Also, AMR diversity was greater in clinical compared to non-clinical cattle Salmonella, and this could be due to anti-microbial selection pressure in diseased cattle that received treatment. 
The use of bootstrapping techniques showed that although there were shared profiles between humans and cattle, the expected and observed number of profiles was different, suggesting Salmonella and associated resistance from humans and cattle may not be wholly derived from a common population. © 2014 The Authors. Zoonoses and Public Health Published by Blackwell Verlag GmbH.

  4. Using particle swarm optimization to enhance PI controller performances for active and reactive power control in wind energy conversion systems

    NASA Astrophysics Data System (ADS)

    Taleb, M.; Cherkaoui, M.; Hbib, M.

    2018-05-01

Recently, renewable energy sources have been seriously impacting the power quality of grids in terms of frequency and voltage stability, due to their intermittency and limited forecasting accuracy. Among these sources, wind energy conversion systems (WECS) have received great interest, especially the configuration with a Doubly Fed Induction Generator. However, WECS are strongly nonlinear, which makes them difficult to control with classical approaches such as a PI controller. In this paper, we deepen the study of the PI controller used in active and reactive power control of this kind of WECS. Particle Swarm Optimization (PSO) is suggested to improve its dynamic performance and its robustness against parameter variations. This work highlights the performance of PSO-optimized PI control against a classical PI tuned with a pole-compensation strategy. Simulations are carried out in MATLAB/Simulink.
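As a hedged sketch of the PSO-tuned PI idea in this record: the snippet below tunes (kp, ki) for a hypothetical first-order plant 1/(τs+1) under an integral-of-absolute-error cost. The plant, the cost function, and all constants are illustrative assumptions; the paper's WECS model and tuning objective are not reproduced here.

```python
import random

def closed_loop_iae(kp, ki, tau=0.5, dt=0.01, t_end=3.0):
    """Integral of absolute error for a unit step; assumed plant 1/(tau*s+1)."""
    y, integ, iae = 0.0, 0.0, 0.0
    for _ in range(int(t_end / dt)):
        e = 1.0 - y                      # error against a unit set-point
        integ += e * dt
        u = kp * e + ki * integ          # PI control law
        y += dt * (u - y) / tau          # explicit Euler step of the plant
        iae += abs(e) * dt
    return iae

def pso_tune(n_particles=20, iters=40, seed=1):
    """Particle swarm search over (kp, ki) in [0, 10]^2 minimizing the IAE."""
    rng = random.Random(seed)
    w, c1, c2 = 0.7, 1.5, 1.5            # inertia and acceleration constants
    pos = [[rng.uniform(0, 10), rng.uniform(0, 10)] for _ in range(n_particles)]
    vel = [[0.0, 0.0] for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pcost = [closed_loop_iae(*p) for p in pos]
    g = min(range(n_particles), key=lambda i: pcost[i])
    gbest, gcost = pbest[g][:], pcost[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(2):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = min(10.0, max(0.0, pos[i][d] + vel[i][d]))
            cost = closed_loop_iae(*pos[i])
            if cost < pcost[i]:
                pbest[i], pcost[i] = pos[i][:], cost
                if cost < gcost:
                    gbest, gcost = pos[i][:], cost
    return gbest, gcost
```

A classical pole-compensation tuning would instead place the PI zero to cancel the plant pole; the swarm search needs no model structure at all, only the ability to evaluate the cost.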

  5. Endocrine and metabolic consequences due to restrictive carbohydrate diets in children with type 1 diabetes: An illustrative case series.

    PubMed

    de Bock, Martin; Lobley, Kristine; Anderson, Donald; Davis, Elizabeth; Donaghue, Kim; Pappas, Marcelle; Siafarikas, Aris; Cho, Yoon Hi; Jones, Timothy; Smart, Carmel

    2018-02-01

    Low carbohydrate diets for the management of type 1 diabetes have been popularised by social media. The promotion of a low carbohydrate diet in lay media is in contrast to published pediatric diabetes guidelines that endorse a balanced diet from a variety of foods for optimal growth and development in children with type 1 diabetes. This can be a source of conflict in clinical practice. We describe a series of 6 cases where adoption of a low carbohydrate diet in children impacted growth and cardiovascular risk factors with potential long-term sequelae. These cases support current clinical guidelines for children with diabetes that promote a diet where total energy intake is derived from balanced macronutrient sources. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  6. Recycling and source reduction for long duration space habitation

    NASA Technical Reports Server (NTRS)

    Hightower, T. M.

    1992-01-01

A direct mathematical approach has been established for characterizing the performance of closed-loop life support systems. The understanding that this approach gives clearly illustrates the options available for increasing the performance of a life support system by changing various parameters. New terms are defined and utilized, such as Segregation Factor, Resource Recovery Efficiency, Overall Reclamation Efficiency, Resupply Reduction Factor, and Life Support Extension Factor. The effects of increases in expendable system supplies required due to increases in life support system complexity are shown. Minimizing resupply through increased recycling and source reduction is illustrated. The effects of recycling upon resupply launch cost are also shown. Finally, material balance analyses have been performed based on quantity and composition data for both supplies and wastes, to illustrate the use of this approach by comparing ten different closed-loop life support system cases.
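To make the resupply bookkeeping concrete, here is a minimal sketch. Both formulas are plausible illustrative definitions (steady-state resupply at a given recovery efficiency, and the ratio of open-loop to closed-loop resupply); they are not necessarily the exact definitions used in the report.

```python
def resupply_rate(daily_demand_kg, recovery_efficiency):
    """Daily resupply mass when a fraction of each day's demand is reclaimed."""
    return daily_demand_kg * (1.0 - recovery_efficiency)

def resupply_reduction_factor(recovery_efficiency):
    """Resupply without recycling divided by resupply with recycling
    (assumed reading of the report's Resupply Reduction Factor)."""
    return 1.0 / (1.0 - recovery_efficiency)
```

For example, reclaiming 90% of a 10 kg/day water demand cuts resupply to 1 kg/day, a reduction factor of 10; the nonlinearity explains why the last few points of recovery efficiency are so valuable for long-duration habitation.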

  7. Estimates of ground level TSP, SO{sub 2} and HCl for a municipal waste incinerator to be located at Tynes Bay - Bermuda

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kent Simmons, J.A.; Knap, A.H.

    1991-04-01

The computer model Industrial Source Complex Short Term (ISCST) was used to study the stack emissions from a refuse incinerator proposed for the island of Bermuda. The model predicts that the highest ground level pollutant concentrations will occur near Prospect, 800 m to 1,000 m due south of the stack. The authors installed a portable laboratory and instruments at Prospect to begin making air quality baseline measurements. By comparing the model's estimates of the incinerator contribution to the background levels measured at the site, they predicted that stack emissions would not cause an increase in TSP or SO{sub 2}. The incinerator will be a significant source of HCl to Bermuda air, with ambient levels approaching air quality guidelines.

  8. The electromagnetic radiation from simple sources in the presence of a homogeneous dielectric sphere

    NASA Technical Reports Server (NTRS)

    Mason, V. B.

    1973-01-01

    In this research, the effect of a homogeneous dielectric sphere on the electromagnetic radiation from simple sources is treated as a boundary value problem, and the solution is obtained by the technique of dyadic Green's functions. Exact representations of the electric fields in the various regions due to a source located inside, outside, or on the surface of a dielectric sphere are formulated. Particular attention is given to the effect of sphere size, source location, dielectric constant, and dielectric loss on the radiation patterns and directivity of small spheres (less than 5 wavelengths in diameter) using the Huygens' source excitation. The computed results are found to closely agree with those measured for waveguide-excited plexiglas spheres. Radiation patterns for an extended Huygens' source and for curved electric dipoles located on the sphere's surface are also presented. The resonance phenomenon associated with the dielectric sphere is studied in terms of the modal representation of the radiated fields. It is found that when the sphere is excited at certain frequencies, much of the energy is radiated into the sidelobes. The addition of a moderate amount of dielectric loss, however, quickly attenuates this resonance effect. A computer program which may be used to calculate the directivity and radiation pattern of a Huygens' source located inside or on the surface of a lossy dielectric sphere is listed.

  9. Important fossil source contribution to brown carbon in Beijing during winter

    NASA Astrophysics Data System (ADS)

    Yan, Caiqing; Zheng, Mei; Bosch, Carme; Andersson, August; Desyaterik, Yury; Sullivan, Amy P.; Collett, Jeffrey L.; Zhao, Bin; Wang, Shuxiao; He, Kebin; Gustafsson, Örjan

    2017-03-01

Organic aerosol (OA) constitutes a substantial fraction of fine particles and affects both human health and climate. It is becoming clear that OA absorbs light substantially (hence termed Brown Carbon, BrC), adding uncertainties to global aerosol radiative forcing estimations. The few current radiative-transfer and chemical-transport models that include BrC primarily consider sources from biogenic and biomass combustion. However, radiocarbon fingerprinting here clearly indicates that light-absorbing organic carbon in winter Beijing, the capital of China, is mainly due to fossil sources, which contribute the largest part to organic carbon (OC, 67 ± 3%) and its sub-constituents (water-soluble OC, WSOC: 54 ± 4%, and water-insoluble OC, WIOC: 73 ± 3%). The dual-isotope (Δ14C/δ13C) signatures, organic molecular tracers and Beijing-tailored emission inventory identify that this fossil source is primarily from coal combustion activities in winter, especially from the residential sector. Source testing on Chinese residential coal combustion provides direct evidence that intensive coal combustion could contribute to increased light-absorptivity of ambient BrC in Beijing winter. Coal combustion is an important source of BrC in regions such as northern China, especially during the winter season. Future modeling of OA radiative forcing should consider the importance of both biomass and fossil sources.

  10. Important fossil source contribution to brown carbon in Beijing during winter

    PubMed Central

    Yan, Caiqing; Zheng, Mei; Bosch, Carme; Andersson, August; Desyaterik, Yury; Sullivan, Amy P.; Collett, Jeffrey L.; Zhao, Bin; Wang, Shuxiao; He, Kebin; Gustafsson, Örjan

    2017-01-01

Organic aerosol (OA) constitutes a substantial fraction of fine particles and affects both human health and climate. It is becoming clear that OA absorbs light substantially (hence termed Brown Carbon, BrC), adding uncertainties to global aerosol radiative forcing estimations. The few current radiative-transfer and chemical-transport models that include BrC primarily consider sources from biogenic and biomass combustion. However, radiocarbon fingerprinting here clearly indicates that light-absorbing organic carbon in winter Beijing, the capital of China, is mainly due to fossil sources, which contribute the largest part to organic carbon (OC, 67 ± 3%) and its sub-constituents (water-soluble OC, WSOC: 54 ± 4%, and water-insoluble OC, WIOC: 73 ± 3%). The dual-isotope (Δ14C/δ13C) signatures, organic molecular tracers and Beijing-tailored emission inventory identify that this fossil source is primarily from coal combustion activities in winter, especially from the residential sector. Source testing on Chinese residential coal combustion provides direct evidence that intensive coal combustion could contribute to increased light-absorptivity of ambient BrC in Beijing winter. Coal combustion is an important source of BrC in regions such as northern China, especially during the winter season. Future modeling of OA radiative forcing should consider the importance of both biomass and fossil sources. PMID:28266611

  11. Characterizing open and non-uniform vertical heat sources: towards the identification of real vertical cracks in vibrothermography experiments

    NASA Astrophysics Data System (ADS)

    Castelo, A.; Mendioroz, A.; Celorrio, R.; Salazar, A.; López de Uralde, P.; Gorosmendi, I.; Gorostegui-Colinas, E.

    2017-05-01

Lock-in vibrothermography is used to characterize vertical kissing and open cracks in metals. In this technique the crack heats up during ultrasound excitation, due mainly to friction between the defect's faces. We have solved the inverse problem, which consists of determining the heat source distribution produced at cracks under amplitude-modulated ultrasound excitation; this is an ill-posed inverse problem, and as a consequence the minimization of the residual is unstable. We have stabilized the algorithm by introducing a penalty term based on the Total Variation functional. In the inversion, we combine amplitude and phase surface temperature data obtained at several modulation frequencies. Inversions of synthetic data with added noise indicate that compact heat sources are characterized accurately and that their particular upper contours can be retrieved for shallow heat sources. The overall shape of open and homogeneous semicircular strip-shaped heat sources representing open half-penny cracks can also be retrieved, but the reconstruction of the deeper end of the heat source loses contrast. Angle-, radius- and depth-dependent inhomogeneous heat flux distributions within these semicircular strips can also be qualitatively characterized. Reconstructions of experimental data taken on samples containing calibrated heat sources confirm the predictions from reconstructions of synthetic data. We also present inversions of experimental data obtained from a real welded Inconel 718 specimen. The results are in good qualitative agreement with the results of liquid penetrant testing.
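The stabilization idea in this record, adding a Total Variation penalty to an unstable least-squares residual, can be sketched on a 1D toy problem. The 3-point blur standing in for the thermographic forward operator, the finite-difference gradient, and all constants are assumptions for illustration only; the real inversion uses multi-frequency amplitude and phase data.

```python
import math

def forward(x):
    """Stand-in forward operator: a 3-point moving-average blur."""
    n = len(x)
    return [(x[max(i - 1, 0)] + x[i] + x[min(i + 1, n - 1)]) / 3.0
            for i in range(n)]

def objective(x, y, lam, eps=1e-6):
    """Least-squares data misfit plus a smoothed Total Variation penalty."""
    r = [fi - yi for fi, yi in zip(forward(x), y)]
    data = sum(v * v for v in r)
    tv = sum(math.sqrt((x[i + 1] - x[i]) ** 2 + eps) for i in range(len(x) - 1))
    return data + lam * tv

def reconstruct(y, lam=0.01, iters=60, h=1e-6):
    """Gradient descent with backtracking line search on the TV-penalized
    objective; the gradient is taken by central finite differences."""
    x = [0.0] * len(y)
    for _ in range(iters):
        grad = []
        for i in range(len(x)):
            x[i] += h
            fp = objective(x, y, lam)
            x[i] -= 2 * h
            fm = objective(x, y, lam)
            x[i] += h
            grad.append((fp - fm) / (2 * h))
        j0, step = objective(x, y, lam), 1.0
        while step > 1e-12:
            trial = [xi - step * gi for xi, gi in zip(x, grad)]
            if objective(trial, y, lam) < j0:   # accept only strict decrease
                x = trial
                break
            step *= 0.5
        else:
            break                               # no step improves: stop
    return x
```

The TV term penalizes total jump size rather than squared slope, which is why it stabilizes the inversion while still allowing the sharp edges of a compact heat source.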

  12. Source Term Model for Vortex Generator Vanes in a Navier-Stokes Computer Code

    NASA Technical Reports Server (NTRS)

    Waithe, Kenrick A.

    2004-01-01

    A source term model for an array of vortex generators was implemented into a non-proprietary Navier-Stokes computer code, OVERFLOW. The source term models the side force created by a vortex generator vane. The model is obtained by introducing a side force to the momentum and energy equations that can adjust its strength automatically based on the local flow. The model was tested and calibrated by comparing data from numerical simulations and experiments of a single low profile vortex generator vane on a flat plate. In addition, the model was compared to experimental data of an S-duct with 22 co-rotating, low profile vortex generators. The source term model allowed a grid reduction of about seventy percent when compared with the numerical simulations performed on a fully gridded vortex generator on a flat plate without adversely affecting the development and capture of the vortex created. The source term model was able to predict the shape and size of the stream-wise vorticity and velocity contours very well when compared with both numerical simulations and experimental data. The peak vorticity and its location were also predicted very well when compared to numerical simulations and experimental data. The circulation predicted by the source term model matches the prediction of the numerical simulation. The source term model predicted the engine fan face distortion and total pressure recovery of the S-duct with 22 co-rotating vortex generators very well. The source term model allows a researcher to quickly investigate different locations of individual or a row of vortex generators. The researcher is able to conduct a preliminary investigation with minimal grid generation and computational time.

  13. A Methodology for the Integration of a Mechanistic Source Term Analysis in a Probabilistic Framework for Advanced Reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grabaskas, Dave; Brunett, Acacia J.; Bucknor, Matthew

GE Hitachi Nuclear Energy (GEH) and Argonne National Laboratory are currently engaged in a joint effort to modernize and develop probabilistic risk assessment (PRA) techniques for advanced non-light water reactors. At a high level, the primary outcome of this project will be the development of next-generation PRA methodologies that will enable risk-informed prioritization of safety- and reliability-focused research and development, while also identifying gaps that may be resolved through additional research. A subset of this effort is the development of PRA methodologies to conduct a mechanistic source term (MST) analysis for event sequences that could result in the release of radionuclides. The MST analysis seeks to realistically model and assess the transport, retention, and release of radionuclides from the reactor to the environment. The MST methods developed during this project seek to satisfy the requirements of the Mechanistic Source Term element of the ASME/ANS Non-LWR PRA standard. The MST methodology consists of separate analysis approaches for risk-significant and non-risk-significant event sequences that may result in the release of radionuclides from the reactor. For risk-significant event sequences, the methodology focuses on a detailed assessment, using mechanistic models, of radionuclide release from the fuel, transport through and release from the primary system, transport in the containment, and finally release to the environment. The analysis approach for non-risk-significant event sequences examines the possibility of large radionuclide releases due to events such as re-criticality or the complete loss of radionuclide barriers. This paper provides details on the MST methodology, including the interface between the MST analysis and other elements of the PRA, and provides a simplified example MST calculation for a sodium fast reactor.

  14. Analysis and Application of Microgrids

    NASA Astrophysics Data System (ADS)

    Yue, Lu

New trends of generating electricity locally and utilizing non-conventional or renewable energy sources have attracted increasing interest due to the gradual depletion of conventional fossil fuel energy sources. The new type of power generation is called Distributed Generation (DG) and the energy sources utilized by Distributed Generation are termed Distributed Energy Sources (DERs). With DGs embedded in the distribution networks, the networks evolve from passive distribution networks to active distribution networks, enabling bidirectional power flows. Further incorporating flexible and intelligent controllers and employing future technologies, active distribution networks will turn into a Microgrid. A Microgrid is a small-scale, low-voltage Combined Heat and Power (CHP) supply network designed to supply electrical and heat loads for a small community. To further implement Microgrids, a sophisticated Microgrid Management System must be integrated. However, because a Microgrid integrates multiple DERs and is likely to be deregulated, the ability to perform real-time OPF and economic dispatch over a fast, advanced communication network is necessary. In this thesis, first, problems such as power system modelling, power flow solving and power system optimization are studied. Then, Distributed Generation and Microgrids are studied and reviewed, including a comprehensive review of current distributed generation technologies and Microgrid Management Systems, etc. Finally, a computer-based AC optimization method which minimizes the total transmission loss and generation cost of a Microgrid is proposed, and a wireless communication scheme based on synchronized Code Division Multiple Access (sCDMA) is proposed. The algorithm is tested with a 6-bus power system and a 9-bus power system.

  15. Water use patterns of co-occurring C3 and C4 shrubs in the Gurbantonggut desert in northwestern China.

    PubMed

    Tiemuerbieke, Bahejiayinaer; Min, Xiao-Jun; Zang, Yong-Xin; Xing, Peng; Ma, Jian-Ying; Sun, Wei

    2018-09-01

In water-limited ecosystems, spatial and temporal partitioning of water sources is an important mechanism that facilitates plant survival and lessens the competition intensity of co-existing plants. Insights into species-specific root functional plasticity and differences in the water sources of co-existing plants under changing water conditions can aid in accurate prediction of the response of desert ecosystems to future climate change. We used stable isotopes of soil water, groundwater and xylem water to determine the seasonal, interspecific and intraspecific variations in the water sources of six C3 and C4 shrubs in the Gurbantonggut desert. We also measured the stem water potentials to determine the water stress levels of each species under varying water conditions. The studied shrubs exhibited similar seasonal water uptake patterns, i.e., all shrubs extracted shallow soil water recharged by snowmelt water during early spring and reverted to deeper water sources during dry summer periods, indicating that all of the studied shrubs have dimorphic root systems that enable them to obtain water sources that differ in space and time. Species in the C4 shrub community exhibited differences in seasonal water absorption and water status due to differences in topography and rooting depth, demonstrating divergent adaptations to water availability and water stress. Haloxylon ammodendron and T. ramosissima in the C3/C4 mixed community were similar in terms of seasonal water extraction but differed with respect to water potential, which indicated that plant water status is controlled by both root functioning and shoot eco-physiological traits. The two Tamarix species in the C3 shrub community were similar in terms of water uptake and water status, which suggests functional convergence of the root system and physiological performance under the same soil water conditions.
In different communities, Haloxylon ammodendron differed in terms of summer water extraction, which suggests that this species exhibits plasticity with respect to rooting depth under different soil water conditions. Shrubs in the Gurbantonggut desert displayed varying adaptations across species and communities through divergent root functioning and shoot eco-physiological traits. Copyright © 2018 Elsevier B.V. All rights reserved.

  16. A Well-Balanced Path-Integral f-Wave Method for Hyperbolic Problems with Source Terms

    PubMed Central

    2014-01-01

    Systems of hyperbolic partial differential equations with source terms (balance laws) arise in many applications where it is important to compute accurate time-dependent solutions modeling small perturbations of equilibrium solutions in which the source terms balance the hyperbolic part. The f-wave version of the wave-propagation algorithm is one approach, but requires the use of a particular averaged value of the source terms at each cell interface in order to be “well balanced” and exactly maintain steady states. A general approach to choosing this average is developed using the theory of path conservative methods. A scalar advection equation with a decay or growth term is introduced as a model problem for numerical experiments. PMID:24563581
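The model problem named in this abstract, scalar advection with a decay source term (u_t + a u_x = -λu), admits a compact f-wave sketch. Choosing the interface source average as the logarithmic mean of the neighboring cell values makes the exponential steady state an exact fixed point of the update; this averaging is one well-balanced choice for illustration, not necessarily the paper's.

```python
import math

def log_mean(a, b):
    """Logarithmic mean (b - a) / ln(b / a); tends to a as b -> a."""
    if abs(a - b) < 1e-12 * max(abs(a), abs(b), 1.0):
        return 0.5 * (a + b)
    return (b - a) / math.log(b / a)

def step_fwave(u, a, lam, dx, dt):
    """One upwind f-wave step for u_t + a u_x = -lam * u with a > 0.
    The f-wave at each interface is the flux difference minus the cell-size
    times the interface-averaged source; with the logarithmic mean the wave
    vanishes exactly on the steady profile u(x) = u0 * exp(-lam * x / a)."""
    new = u[:]                       # inflow cell u[0] is held fixed
    for i in range(1, len(u)):
        s_avg = -lam * log_mean(u[i - 1], u[i])
        beta = a * (u[i] - u[i - 1]) - dx * s_avg
        new[i] -= dt / dx * beta
    return new
```

Starting the grid on the steady profile and stepping repeatedly leaves it unchanged up to rounding error, which is exactly the "well balanced" property the abstract describes.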

  17. Laboratory tools and e-learning elements in training of acousto-optics

    NASA Astrophysics Data System (ADS)

    Barócsi, Attila; Lenk, Sándor; Ujhelyi, Ferenc; Majoros, Tamás; Maák, Pál

    2015-10-01

Due to the acousto-optic (AO) effect, the refractive index of an optical interaction medium is perturbed by an acoustic wave induced in the medium that builds up a phase grating, which will diffract the incident light beam if the condition of constructive interference is satisfied. All parameters of the grating, such as magnitude, period or phase, can be controlled, which allows the construction of useful devices (modulators, switches, one- or multi-dimensional deflectors, spectrum analyzers, tunable filters, frequency shifters, etc.). The research and training of acousto-optics have a long-term tradition at our department. In this presentation, we introduce the related laboratory exercises fitted into an e-learning frame. The BSc-level exercise utilizes a laser source and an AO cell to demonstrate the effect and principal AO functions, explaining signal-processing terms such as amplitude or frequency modulation, modulation depth and Fourier transformation, ending with the building of a free-space sound transmission and demodulation system. The setup for MSc level utilizes an AO filter with mono- and polychromatic light sources to learn about spectral analysis and synthesis. Smart phones can be used to generate signal inputs or outputs for both setups, as well as to help students' preparation and reporting.

  18. Theoretical simulation of the multipole seismoelectric logging while drilling

    NASA Astrophysics Data System (ADS)

    Guan, Wei; Hu, Hengshan; Zheng, Xiaobo

    2013-11-01

Acoustic logging-while-drilling (LWD) technology has been commercially used in the petroleum industry. However, it remains a rather difficult task to invert formation compressional and shear velocities from acoustic LWD signals due to the unwanted strong collar wave, which covers or interferes with signals from the formation. In this paper, seismoelectric LWD is investigated for solving that problem. The seismoelectric field is calculated by solving a modified Poisson's equation, whose source term is the electric disturbance induced electrokinetically by the travelling seismic wave. The seismic wavefield itself is obtained by solving Biot's equations for poroelastic waves. From the simulated waveforms and the semblance plots for monopole, dipole and quadrupole sources, it is found that the electric field accompanies the collar wave as well as other wave groups of the acoustic pressure, despite the fact that seismoelectric conversion occurs only in porous formations. The collar wave in the electric field, however, is significantly weakened compared with that in the acoustic pressure, in terms of its amplitude relative to the other wave groups in the full waveforms. Thus fewer and shallower grooves are required to damp the collar wave if the seismoelectric LWD signals are recorded for extracting formation compressional and shear velocities.
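The computational core mentioned here, a Poisson-type equation with a prescribed source term, can be illustrated in 1D with a tridiagonal solve (Thomas algorithm). The real seismoelectric problem is multidimensional with electrokinetic sources, so this is only a structural sketch.

```python
def solve_poisson_1d(source, dx):
    """Solve phi'' = source on a uniform grid with phi = 0 at both ends.
    The standard 3-point stencil gives a tridiagonal system
    (phi[i-1] - 2*phi[i] + phi[i+1]) / dx**2 = source[i],
    solved here with the Thomas algorithm (off-diagonals 1, diagonal -2)."""
    n = len(source)                       # number of interior unknowns
    rhs = [s * dx * dx for s in source]
    cp, dp = [0.0] * n, [0.0] * n
    cp[0] = 1.0 / -2.0                    # forward elimination
    dp[0] = rhs[0] / -2.0
    for i in range(1, n):
        m = -2.0 - cp[i - 1]
        cp[i] = 1.0 / m
        dp[i] = (rhs[i] - dp[i - 1]) / m
    phi = [0.0] * n                       # back substitution
    phi[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        phi[i] = dp[i] - cp[i] * phi[i + 1]
    return phi
```

For the constant source phi'' = -1 on [0, 1] the exact solution is phi(x) = x(1 - x)/2, and since it is quadratic the 3-point scheme reproduces it at the nodes to rounding error.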

  19. A Second Law Based Unstructured Finite Volume Procedure for Generalized Flow Simulation

    NASA Technical Reports Server (NTRS)

    Majumdar, Alok

    1998-01-01

    An unstructured finite volume procedure has been developed for steady and transient thermo-fluid dynamic analysis of fluid systems and components. The procedure is applicable for a flow network consisting of pipes and various fittings where flow is assumed to be one dimensional. It can also be used to simulate flow in a component by modeling a multi-dimensional flow using the same numerical scheme. The flow domain is discretized into a number of interconnected control volumes located arbitrarily in space. The conservation equations for each control volume account for the transport of mass, momentum and entropy from the neighboring control volumes. In addition, they also include the sources of each conserved variable and time dependent terms. The source term of entropy equation contains entropy generation due to heat transfer and fluid friction. Thermodynamic properties are computed from the equation of state of a real fluid. The system of equations is solved by a hybrid numerical method which is a combination of simultaneous Newton-Raphson and successive substitution schemes. The paper also describes the application and verification of the procedure by comparing its predictions with the analytical and numerical solution of several benchmark problems.
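The Newton-Raphson half of the hybrid solver described above can be sketched on a one-unknown flow-network residual: splitting a fixed total flow between two hypothetical parallel pipes with quadratic pressure drop dP = K*Q**2. The network topology and friction law are illustrative assumptions, not the paper's system.

```python
def newton_flow_split(k1, k2, q_total, tol=1e-10, max_iter=50):
    """Newton-Raphson for the flow split between two parallel pipes: find q1
    such that the branch pressure drops balance, k1*q1^2 = k2*(q_total - q1)^2."""
    q1 = 0.5 * q_total                            # initial guess: even split
    for _ in range(max_iter):
        f = k1 * q1 ** 2 - k2 * (q_total - q1) ** 2        # residual
        df = 2.0 * k1 * q1 + 2.0 * k2 * (q_total - q1)     # d(residual)/dq1
        dq = f / df
        q1 -= dq                                  # Newton update
        if abs(dq) < tol:
            break
    return q1
```

For K1 = 4, K2 = 1 and a total flow of 3, the analytic split is q1 = q_total / (1 + sqrt(K1/K2)) = 1; a full network solver applies the same iteration simultaneously to the residuals of every control volume.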

  20. Demand-driven biogas production by flexible feeding in full-scale - Process stability and flexibility potentials.

    PubMed

    Mauky, Eric; Weinrich, Sören; Jacobi, Hans-Fabian; Nägele, Hans-Joachim; Liebetrau, Jan; Nelles, Michael

    2017-08-01

For future energy supply systems with high proportions of renewable energy sources, biogas plants are a promising option to supply demand-driven electricity that compensates for the divergence between energy demand and energy supply from uncontrolled sources like wind and solar. Apart from expanding gas storage capacity, demand-oriented feeding with the aim of flexible gas production can be an effective alternative. The presented study demonstrated a high degree of intraday flexibility (up to 50% compared to the average) and a potential for an electricity shutdown of up to 3 days (decreasing gas production by more than 60%) by flexible feeding in full scale. Furthermore, the long-term process stability was not affected negatively by the flexible feeding. The flexible feeding resulted in a variable rate of gas production and a dynamic progression of individual acids and the respective pH value. In consequence, a demand-driven biogas production may enable significant savings in terms of the required gas storage volume (up to 65%) and permit far greater plant flexibility compared to constant gas production. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  1. Development of Approach for Long-Term Management of Disused Sealed Radioactive Sources - 13630

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kinker, M.; Reber, E.; Mansoux, H.

Radioactive sources are used widely throughout the world in a variety of medical, industrial, research and military applications. When such radioactive sources are no longer used and are not intended to be used for the practice for which an authorization was granted, they are designated as 'disused sources'. Whether appropriate controls are in place during the useful life of a source or not, the end of this useful life is often a turning point after which it is more difficult to ensure the safety and security of the source over time. For various reasons, many disused sources cannot be returned to the manufacturer or the supplier for reuse or recycling. When these attempts fail, disused sources should be declared as radioactive waste and should be managed as such, in compliance with relevant international legal instruments and safety standards. However, disposal remains an unresolved issue in many countries, due in part to limited public acceptance, insufficient funding, and a lack of practical examples of strategies for determining suitable disposal options. As a result, disused sources are often stored indefinitely at the facilities where they were once used. In order to prevent disused sources from becoming orphan sources, each country must develop and implement a comprehensive waste management strategy that includes disposal of disused sources. The International Atomic Energy Agency (IAEA) fosters international cooperation between countries and encourages the development of a harmonized 'cradle to grave' approach to managing sources consistent with international legal instruments, IAEA safety standards, and international good practices. This 'cradle to grave' approach requires the development of a national policy and implementing strategy, an adequate legal and regulatory framework, and adequate resources and infrastructure that cover the entire life cycle, from production and use of radioactive sources to disposal. (authors)

  2. Towards an operational high-resolution air quality forecasting system at ECCC

    NASA Astrophysics Data System (ADS)

    Munoz-Alpizar, Rodrigo; Stroud, Craig; Ren, Shuzhan; Belair, Stephane; Leroyer, Sylvie; Souvanlasy, Vanh; Spacek, Lubos; Pavlovic, Radenko; Davignon, Didier; Moran, Moran

    2017-04-01

Urban environments are particularly sensitive to weather, air quality (AQ), and climatic conditions. Despite the efforts made in Canada to reduce pollution in urban areas, AQ continues to be a concern for the population, especially during short-term episodes that could lead to exceedances of daily air quality standards. Furthermore, urban air pollution has long been associated with significant adverse health effects. In Canada, the large percentage of the population living in urban areas (81%, according to Canada's 2011 census) is exposed to elevated air pollution due to local emission sources. Thus, in order to improve the services offered to the Canadian public, Environment and Climate Change Canada has launched an initiative to develop a high-resolution air quality prediction capacity for urban areas in Canada. This presentation will show observed pollution trends (2010-2016) for Canadian mega-cities along with some preliminary high-resolution air quality modelling results. Short-term and long-term plans for urban AQ forecasting in Canada will also be described.

  3. Gas Turbine Energy Conversion Systems for Nuclear Power Plants Applicable to LiFTR Liquid Fluoride Thorium Reactor Technology

    NASA Technical Reports Server (NTRS)

    Juhasz, Albert J.

    2014-01-01

This panel plans to cover thermal energy and electric power production issues facing our nation and the world over the next decades, with relevant technologies ranging from near term to mid- and far term. Although the main focus will be on ground-based plants to provide baseload electric power, energy conversion systems (ECS) for space are also included, with solar or nuclear energy sources for output power levels ranging from tens of watts to kilowatts for unmanned spacecraft, and eventual megawatts for lunar outposts and planetary surface colonies. Implications of these technologies for future terrestrial energy systems, combined with advanced fracking, are touched upon. Thorium-based reactors, and nuclear fusion along with suitable gas turbine energy conversion systems (ECS), will also be considered by the panelists. The characteristics of the above-mentioned ECS will be described, both in terms of their overall energy utilization effectiveness and also with regard to climatic effects due to exhaust emissions.

  4. Southern Hemisphere Carbon Monoxide Interannual Variability Observed by Terra/Measurement of Pollution in the Troposphere (MOPITT)

    NASA Technical Reports Server (NTRS)

    Edwards, D. P.; Petron, G.; Novelli, P. C.; Emmons, L. K.; Gille, J. C.; Drummond, J. R.

    2010-01-01

    Biomass burning is an annual occurrence in the tropical southern hemisphere (SH) and represents a major source of regional pollution. Vegetation fires emit carbon monoxide (CO), which due to its medium lifetime is an excellent tracer of tropospheric transport. CO is also one of the few tropospheric trace gases currently observed from satellite, and this provides long-term global measurements. In this paper, we use the 5-year CO data record from the Measurement Of Pollution In The Troposphere (MOPITT) instrument to examine the inter-annual variability of the SH CO loading and show how this relates to the climate conditions which determine the intensity of fire sources. The MOPITT observations show an annual austral springtime peak in the SH zonal CO loading each year, with dry-season biomass burning emissions in S. America, southern Africa, the Maritime Continent, and northwestern Australia. Although fires in southern Africa and S. America typically produce the greatest amount of CO, the most significant inter-annual variation is due to varying fire activity and emissions from the Maritime Continent and northern Australia. We find that this variation in turn correlates well with the El Niño-Southern Oscillation precipitation index. Between 2000 and 2005, emissions were greatest in late 2002, and an inverse modeling of the MOPITT data using the MOZART chemical transport model estimates the southeast Asia regional fire source for the year August 2002 to September 2003 to be 52 Tg CO. Comparison of the MOPITT retrievals and NOAA surface network measurements indicates that the latter do not fully capture the inter-annual variability or the seasonal range of the CO zonal average concentration, due to biases associated with atmospheric and geographic sampling.

  5. 26 CFR 1.737-1 - Recognition of precontribution gain.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... Property A1 and Property A2 is long-term, U.S.-source capital gain or loss. The character of gain on Property A3 is long-term, foreign-source capital gain. B contributes Property B, nondepreciable real... long-term, U.S.-source capital gain ($10,000 gain on Property A1 and $8,000 loss on Property A2) and $1...

  6. Source term model evaluations for the low-level waste facility performance assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yim, M.S.; Su, S.I.

    1995-12-31

    The estimation of release of radionuclides from various waste forms to the bottom boundary of the waste disposal facility (source term) is one of the most important aspects of LLW facility performance assessment. In this work, several currently used source term models are comparatively evaluated for the release of carbon-14 based on a test case problem. The models compared include PRESTO-EPA-CPG, IMPACTS, DUST and NEFTRAN-II. Major differences in assumptions and approaches between the models are described and key parameters are identified through sensitivity analysis. The source term results from different models are compared and other concerns or suggestions are discussed.

  7. Important ingredients for health adaptive information systems.

    PubMed

    Senathirajah, Yalini; Bakken, Suzanne

    2011-01-01

    Healthcare information systems frequently do not truly meet clinician needs, due to the complexity, variability, and rapid change in medical contexts. Recently the internet world has been transformed by approaches commonly termed 'Web 2.0'. This paper proposes a Web 2.0 model for a healthcare adaptive architecture. The vision includes creating modular, user-composable systems which aim to make all necessary information from multiple internal and external sources available via a platform, for the user to use, arrange, recombine, author, and share at will, using rich interfaces where advisable. Clinicians can create a set of 'widgets' and 'views' which can transform data, reflect their domain knowledge and cater to their needs, using simple drag and drop interfaces without the intervention of programmers. We have built an example system, MedWISE, embodying the user-facing parts of the model. This approach to HIS is expected to have several advantages, including greater suitability to user needs (reflecting clinician rather than programmer concepts and priorities), incorporation of multiple information sources, agile reconfiguration to meet emerging situations and new treatment deployment, capture of user domain expertise and tacit knowledge, efficiencies due to workflow and human-computer interaction improvements, and greater user acceptance.

  8. Water use trends in Washington, 1985-2005

    USGS Publications Warehouse

    Lane, R.C.

    2010-01-01

    Since 1950, the U.S. Geological Survey Washington Water Science Center (USGS-WAWSC) has collected, compiled, and published, at 5-year intervals, statewide estimates of the amounts of water withdrawn and used for various purposes in Washington State. As new data and methods became available, some of the original datasets were recompiled. The most recent versions of these datasets were used in this fact sheet. The datasets are available online along with other USGS-WAWSC water-use publications at the USGS-WAWSC water use web page: http://wa.water.usgs.gov/data/wuse/. Values on these datasets and in this fact sheet may not sum to the indicated total due to independent rounding. Due to variations in data requirements, collection methods, terminology, and data sources, the direct assessment of water-use trends between compilations is difficult. This fact sheet focuses on the trends in total State and public-supplied populations, freshwater withdrawals and use, public-supply withdrawals and deliveries, and crop irrigation withdrawals and acreage in Washington from 1985 through 2005. These four categories were included in all five compilations and were the most stable in terms of data requirements, collection methods, terminology, and data sources.

  9. Bayesian source term determination with unknown covariance of measurements

    NASA Astrophysics Data System (ADS)

    Belal, Alkomiet; Tichý, Ondřej; Šmídl, Václav

    2017-04-01

    Determination of a source term of release of a hazardous material into the atmosphere is a very important task for emergency response. We are concerned with the problem of estimation of the source term in the conventional linear inverse problem, y = Mx, where the relationship between the vector of observations y and the unknown source term x is described using the source-receptor-sensitivity (SRS) matrix M. Since the system is typically ill-conditioned, the problem is recast as the optimization problem min_x (y - Mx)^T R^{-1} (y - Mx) + x^T B^{-1} x. The first term minimizes the error of the measurements with covariance matrix R, and the second term is a regularization of the source term. Different types of regularization arise for different choices of the matrices R and B; for example, Tikhonov regularization takes the covariance matrix B to be the identity matrix multiplied by a scalar parameter. In this contribution, we adopt a Bayesian approach to make inference on the unknown source term x as well as the unknown R and B. We assume the prior on x to be Gaussian with zero mean and unknown diagonal covariance matrix B. The covariance matrix of the likelihood, R, is also unknown. We consider two potential choices for the structure of the matrix R: the first is a diagonal matrix, and the second is a locally correlated structure using information on the topology of the measuring network. Since inference in the model is intractable, an iterative variational Bayes algorithm is used for simultaneous estimation of all model parameters. The practical usefulness of our contribution is demonstrated by applying the resulting algorithm to real data from the European Tracer Experiment (ETEX). This research is supported by the EEA/Norwegian Financial Mechanism under project MSMT-28477/2014 Source-Term Determination of Radionuclide Releases by Inverse Atmospheric Dispersion Modelling (STRADI).
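
    For fixed R and B, the quadratic objective above has the closed-form minimizer x_hat = (M^T R^{-1} M + B^{-1})^{-1} M^T R^{-1} y. A minimal numerical sketch with a toy SRS matrix (hypothetical values, not the ETEX data, and without the variational Bayes iteration the abstract describes, which would additionally estimate R and B):

```python
import numpy as np

# Toy source-receptor-sensitivity (SRS) matrix: 5 receptors, 3 source-term elements
rng = np.random.default_rng(0)
M = rng.normal(size=(5, 3))
x_true = np.array([2.0, 0.0, 1.0])
y = M @ x_true                # noiseless observations for the sketch

R_inv = np.eye(5)             # measurement covariance R = I
B_inv = 1e-3 * np.eye(3)      # weak Tikhonov-style prior on the source term

# Minimizer of (y - Mx)^T R^{-1} (y - Mx) + x^T B^{-1} x
x_hat = np.linalg.solve(M.T @ R_inv @ M + B_inv, M.T @ R_inv @ y)
```

    With the prior this weak, x_hat recovers x_true up to a small shrinkage; the Bayesian treatment replaces the hand-picked R_inv and B_inv with quantities inferred from the data.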

  10. Source Term Model for Steady Micro Jets in a Navier-Stokes Computer Code

    NASA Technical Reports Server (NTRS)

    Waithe, Kenrick A.

    2005-01-01

    A source term model for steady micro jets was implemented into a non-proprietary Navier-Stokes computer code, OVERFLOW. The source term models the mass flow and momentum created by a steady blowing micro jet. The model is obtained by adding the momentum and mass flow created by the jet to the Navier-Stokes equations. The model was tested by comparing with data from numerical simulations of a single, steady micro jet on a flat plate in two and three dimensions. The source term model predicted the velocity distribution well compared to the two-dimensional plate using a steady mass flow boundary condition, which was used to simulate a steady micro jet. The model was also compared to two three-dimensional flat plate cases using a steady mass flow boundary condition to simulate a steady micro jet. The three-dimensional comparison included a case with a grid generated to capture the circular shape of the jet and a case without a grid generated for the micro jet. The case without the jet grid mimics the application of the source term. The source term model compared well with both of the three-dimensional cases. Comparisons of velocity distribution were made before and after the jet and Mach and vorticity contours were examined. The source term model allows a researcher to quickly investigate different locations of individual or several steady micro jets. The researcher is able to conduct a preliminary investigation with minimal grid generation and computational time.
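
    The core idea (adding the jet's mass flow and momentum directly to the governing equations of the cell containing the jet) can be sketched as per-cell source terms in an explicit finite-volume update; the jet parameters and the update step below are illustrative assumptions, not the OVERFLOW implementation:

```python
def jet_source_terms(mdot, v_jet, cell_volume):
    """Per-unit-volume source terms that a steady blowing micro jet
    contributes to the continuity and momentum equations of its cell."""
    s_mass = mdot / cell_volume               # kg/(m^3 s)
    s_momentum = mdot * v_jet / cell_volume   # N/m^3
    return s_mass, s_momentum

# Hypothetical micro jet: 1e-5 kg/s at 100 m/s through a 1e-6 m^3 cell
s_mass, s_mom = jet_source_terms(mdot=1e-5, v_jet=100.0, cell_volume=1e-6)

# One explicit time step of the cell's conserved variables (rho, rho*u)
rho, rho_u = 1.2, 0.0
dt = 1e-6
rho += dt * s_mass       # mass added by the jet
rho_u += dt * s_mom      # streamwise momentum added by the jet
```

    The appeal noted in the abstract follows from this structure: because the jet enters only as a source term in existing cells, moving or multiplying jets requires no new grid generation.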

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perrin, Tess E.; Davis, Robert G.; Wilkerson, Andrea M.

    This GATEWAY project evaluated four field installations to better understand the long-term performance of a number of LED products, which can hopefully stimulate improvements in designing, manufacturing, specifying, procuring, and installing LED products. Field studies provide the opportunity to discover and investigate issues that cannot be simulated or uncovered in a laboratory, but the installed performance over time of commercially available LED products has not been well documented. Improving long-term performance can provide both direct energy savings by reducing the need to over-light to account for light loss and indirect energy savings through better market penetration due to SSL’s competitive advantages over less-efficient light source technologies. The projects evaluated for this report illustrate that SSL use is often motivated by advantages other than energy savings, including maintenance savings, easier integration with control systems, and improved lighting quality.

  12. Measuring soil moisture near soil surface...minor differences due to neutron source type

    Treesearch

    Robert R. Ziemer; Irving Goldberg; Norman A. MacGillivray

    1967-01-01

    Moisture measurements were made in three media (paraffin, water, and saturated sand) with four neutron moisture meters, each containing a 226-radium-beryllium, 227-actinium-beryllium, 238-plutonium-beryllium, or 241-americium-beryllium neutron source. Variability in surface detection by the different sources may be due to differences in neutron sources, in length of source,...

  13. Seismological evidence for monsoon induced micro to moderate earthquake sequence beneath the 2011 Talala, Saurashtra earthquake, Gujarat, India

    NASA Astrophysics Data System (ADS)

    Singh, A. P.; Mishra, O. P.

    2015-10-01

    In order to understand the processes involved in the genesis of monsoon induced micro to moderate earthquakes after heavy rainfall during the Indian summer monsoon period beneath the 2011 Talala, Saurashtra earthquake (Mw 5.1) source zone, we assimilated 3-D microstructures of the sub-surface rock materials using a data set recorded by the Seismic Network of Gujarat (SeisNetG), India. Crack attributes in terms of crack density (ε), the saturation rate (ξ) and porosity parameter (ψ) were determined from the estimated 3-D sub-surface velocities (Vp, Vs) and Poisson's ratio (σ) structures of the area at varying depths. We distinctly imaged high-ε, high-ξ and low-ψ anomalies at shallow depths, extending up to 9-15 km. We infer that the existence of sub-surface fractured rock matrix connected to the surface from the source zone may have contributed to the changes in differential strain deep down to the crust due to the infiltration of rainwater, which in turn induced micro to moderate earthquake sequence beneath Talala source zone. Infiltration of rainwater during the Indian summer monsoon might have hastened the failure of the rock by perturbing the crustal volume strain of the causative source rock matrix associated with the changes in the seismic moment release beneath the surface. Analyses of crack attributes suggest that the fractured volume of the rock matrix with high porosity and lowered seismic strength beneath the source zone might have considerable influence on the style of fault displacements due to seismo-hydraulic fluid flows. Localized zone of micro-cracks diagnosed within the causative rock matrix connected to the water table and their association with shallow crustal faults might have acted as a conduit for infiltrating the precipitation down to the shallow crustal layers following the fault suction mechanism of pore pressure diffusion, triggering the monsoon induced earthquake sequence beneath the source zone.

  14. Interannual, solar cycle, and trend terms in middle atmospheric temperature time series from HALOE

    NASA Astrophysics Data System (ADS)

    Remsberg, E. E.; Deaver, L. E.

    2005-03-01

    Temperature versus pressure or T(p) time series from the Halogen Occultation Experiment (HALOE) have been generated and analyzed for the period of 1991-2004 and for the mesosphere and upper stratosphere for latitude zones from 40N to 40S. Multiple linear regression (MLR) techniques were used for the analysis of the seasonal and the significant interannual and solar cycle (or decadal-scale) terms. An 11-yr solar cycle (SC) term of amplitude 0.5 to 1.7 K was found for the middle to upper mesosphere; its phase was determined by a Fourier fit to the de-seasonalized residual. This SC term is largest and has a lag of several years for northern hemisphere middle latitudes of the middle mesosphere, perhaps due to the interfering effects of wintertime wave dissipation. The SC response from the MLR models is weaker but essentially in-phase at low latitudes and in the southern hemisphere. An in-phase SC response term is also significant near the tropical stratopause with an amplitude of about 0.4 to 0.6 K, which is somewhat less than predicted from models. Both sub-biennial (688-dy) and QBO (800-dy) terms are resolved for the mid to upper stratosphere along with a decadal-scale term that is presumed to have a 13.5-yr period due to their predicted modulation. This decadal-scale term is out-of-phase with the SC during 1991-2004. However, the true nature and source of this term is still uncertain, especially at 5 hPa. Significant linear cooling trends ranging from -0.3 K to -1.1 K per decade were found in the tropical upper stratosphere and subtropical mesosphere. Trends have not emerged so far for the tropical mesosphere, so it is concluded that the cooling rates that have been resolved for the subtropics are likely upper limits. As HALOE-like measurements continue and their time series lengthen, it is anticipated that better accuracy can be achieved for these interannual, SC, and trend terms.
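
    The regression described (seasonal and slow sinusoidal terms plus a linear trend fitted to a temperature time series) can be sketched with ordinary least squares; the synthetic monthly series below merely stands in for the HALOE T(p) data, and all amplitudes are made-up illustration values:

```python
import numpy as np

t = np.arange(0, 13, 1 / 12.0)   # 13 years of monthly samples, in years
rng = np.random.default_rng(1)
# Synthetic series: annual cycle + 11-yr "solar cycle" term + cooling trend + noise
y = (2.0 * np.cos(2 * np.pi * t)             # seasonal term, 2 K amplitude
     + 0.8 * np.cos(2 * np.pi * t / 11.0)    # SC term, 0.8 K amplitude
     - 0.05 * t                              # -0.5 K/decade linear trend
     + 0.1 * rng.normal(size=t.size))

# MLR design matrix: constant, trend, and a sine/cosine pair per period
X = np.column_stack([np.ones_like(t), t,
                     np.cos(2 * np.pi * t),        np.sin(2 * np.pi * t),
                     np.cos(2 * np.pi * t / 11.0), np.sin(2 * np.pi * t / 11.0)])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

sc_amplitude = np.hypot(coef[4], coef[5])  # recovered solar-cycle amplitude (K)
trend_per_decade = 10.0 * coef[1]          # recovered trend (K/decade)
```

    The phase of the SC term (the angle between the two fitted coefficients) is the quantity the abstract determines from a Fourier fit to the de-seasonalized residual.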

  15. Tracking nonpoint source nitrogen pollution in human-impacted watersheds

    USGS Publications Warehouse

    Kaushal, Sujay S.; Groffman, Peter M; Band, Lawrence; Elliott, Emily M.; Shields, Catherine A.; Kendall, Carol

    2011-01-01

    Nonpoint source nitrogen (N) pollution is a leading contributor to U.S. water quality impairments. We combined watershed N mass balances and stable isotopes to investigate fate and transport of nonpoint N in forest, agricultural, and urbanized watersheds at the Baltimore Long-Term Ecological Research site. Annual N retention was 55%, 68%, and 82% for agricultural, suburban, and forest watersheds, respectively. Analysis of δ15N-NO3–, and δ18O-NO3– indicated wastewater was an important nitrate source in urbanized streams during baseflow. Negative correlations between δ15N-NO3– and δ18O-NO3– in urban watersheds indicated mixing between atmospheric deposition and wastewater, and N source contributions changed with storm magnitude (atmospheric sources contributed ∼50% at peak storm N loads). Positive correlations between δ15N-NO3– and δ18O-NO3– in watersheds suggested denitrification was removing septic system and agriculturally derived N, but N from belowground leaking sewers was less susceptible to denitrification. N transformations were also observed in a storm drain (no natural drainage network) potentially due to organic carbon inputs. Overall, nonpoint sources such as atmospheric deposition, wastewater, and fertilizer showed different susceptibility to watershed N export. There were large changes in nitrate sources as a function of runoff, and anticipating source changes in response to climate and storms will be critical for managing nonpoint N pollution.

  16. Quantum corrections to the gravitational potentials of a point source due to conformal fields in de Sitter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fröb, Markus B.; Verdaguer, Enric, E-mail: mfroeb@itp.uni-leipzig.de, E-mail: enric.verdaguer@ub.edu

    We derive the leading quantum corrections to the gravitational potentials in a de Sitter background, due to the vacuum polarization from loops of conformal fields. Our results are valid for arbitrary conformal theories, even strongly interacting ones, and are expressed using the coefficients b and b' appearing in the trace anomaly. Apart from the de Sitter generalization of the known flat-space results, we find two additional contributions: one which depends on the finite coefficients of terms quadratic in the curvature appearing in the renormalized effective action, and one which grows logarithmically with physical distance. While the first contribution corresponds to a rescaling of the effective mass, the second contribution leads to a faster fall-off of the Newton potential at large distances, and is potentially measurable.

  17. Source effects on the simulation of the strong ground motion of the 2011 Lorca earthquake

    NASA Astrophysics Data System (ADS)

    Saraò, Angela; Moratto, Luca; Vuan, Alessandro; Mucciarelli, Marco; Jimenez, Maria Jose; Garcia Fernandez, Mariano

    2016-04-01

    On May 11, 2011, a moderate seismic event (Mw=5.2) struck the city of Lorca (south-east Spain), causing nine casualties, a large number of injured people and damage to civil buildings. The largest PGA value (360 cm/s²) ever recorded so far in Spain was observed at the accelerometric station located in Lorca (LOR), and it was explained as due to source directivity rather than to local site effects. In recent years, different source models, retrieved from inversions of geodetic or seismological data, or a combination of the two, have been published. To investigate the variability that equivalent source models of an average earthquake can introduce in the computation of strong motion, we calculated seismograms (up to 1 Hz) using an approach based on wavenumber integration and, as input, four different source models taken from the literature. The source models differ mainly in the slip distribution on the fault. Our results show that, as an effect of the different sources, the ground motion variability, in terms of pseudo-spectral velocity (1 s), can reach one order of magnitude for near-source receivers or for sites influenced by the forward-directivity effect. Finally, we compute the strong motion at frequencies higher than 1 Hz using empirical Green's functions and the source model parameters that best reproduce the recorded shaking up to 1 Hz: the computed seismograms fit satisfactorily the signals recorded at the LOR station as well as at the other stations close to the source.

  18. Quantitative Determination of Vinpocetine in Dietary Supplements.

    PubMed

    French, John M T; King, Matthew D; McDougal, Owen M

    2016-05-01

    Current United States regulatory policies allow for the addition of pharmacologically active substances in dietary supplements if derived from a botanical source. The inclusion of certain nootropic drugs, such as vinpocetine, in dietary supplements has recently come under scrutiny due to the lack of defined dosage parameters and yet unproven short- and long-term benefits and risks to human health. This study quantified the concentration of vinpocetine in several commercially available dietary supplements and found that a highly variable range of 0.6-5.1 mg/serving was present across the tested products, with most products providing no specification of vinpocetine concentrations.

  19. A model for the spectroscopic variations of the peculiar symbiotic star MWC 560

    NASA Technical Reports Server (NTRS)

    Shore, Steven N.; Aufdenberg, Jason P.; Michalitsianos, A. G.

    1994-01-01

    In this note, we show that the ultraviolet and optical spectroscopic variability of this unique symbiotic star can be understood in terms of a time variable collimated stellar wind with a rapid acceleration near the source. Using the radial velocities observed during the ultraviolet bright phase, we find that a variation in the mass loss rate of a factor of ten can explain the ultraviolet spectral changes. The acceleration is far faster than normally observed in radiatively driven stellar winds and may be due to mechanical driving of the outflow from the disk.

  20. The influence of cross-order terms in interface mobilities for structure-borne sound source characterization

    NASA Astrophysics Data System (ADS)

    Bonhoff, H. A.; Petersson, B. A. T.

    2010-08-01

    For the characterization of structure-borne sound sources with multi-point or continuous interfaces, substantial simplifications and physical insight can be obtained by incorporating the concept of interface mobilities. The applicability of interface mobilities, however, relies upon the admissibility of neglecting the so-called cross-order terms. Hence, the objective of the present paper is to clarify the importance and significance of cross-order terms for the characterization of vibrational sources. From previous studies, four conditions have been identified for which the cross-order terms can become more influential. Such are non-circular interface geometries, structures with distinctively differing transfer paths as well as a suppression of the zero-order motion and cases where the contact forces are either in phase or out of phase. In a theoretical study, the former four conditions are investigated regarding the frequency range and magnitude of a possible strengthening of the cross-order terms. For an experimental analysis, two source-receiver installations are selected, suitably designed to obtain strong cross-order terms. The transmitted power and the source descriptors are predicted by the approximations of the interface mobility approach and compared with the complete calculations. Neglecting the cross-order terms can result in large misinterpretations at certain frequencies. On average, however, the cross-order terms are found to be insignificant and can be neglected with good approximation. The general applicability of interface mobilities for structure-borne sound source characterization and the description of the transmission process thereby is confirmed.

  1. Can fungi compete with marine sources for chitosan production?

    PubMed

    Ghormade, V; Pathan, E K; Deshpande, M V

    2017-11-01

    Chitosan, a β-1,4-linked glucosamine polymer, is formed by deacetylation of chitin. It has a wide range of applications from agriculture to human health care products. Chitosan is commercially produced from shellfish, shrimp waste, and crab and lobster processing using strong alkalis at high temperatures for long time periods. The production of chitin and chitosan from fungal sources has gained increased attention in recent years due to potential advantages over the current marine source in terms of homogeneous polymer length, high degree of deacetylation, and solubility. Zygomycetous fungi such as Absidia coerulea, Benjaminiella poitrasii, Cunninghamella elegans, Gongrenella butleri, Mucor rouxii, Mucor racemosus and Rhizopus oryzae have been studied extensively. Isolation of chitosan has been reported from a few edible basidiomycetous fungi such as Agaricus bisporus, Lentinula edodes and Pleurotus sajor-caju. Other organisms from mycotech industries explored for chitosan production are Aspergillus niger, Penicillium chrysogenum, Saccharomyces cerevisiae and other wine yeasts. A number of aspects, such as value addition to the existing applications of fungi, utilization of waste from the agriculture sector, issues and challenges for the production of fungal chitosan to compete with existing sources, metabolic engineering, and novel applications, are discussed to adjudge the potential of fungal sources for commercial chitosan production. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. Image transmission system using adaptive joint source and channel decoding

    NASA Astrophysics Data System (ADS)

    Liu, Weiliang; Daut, David G.

    2005-03-01

    In this paper, an adaptive joint source and channel decoding method is designed to accelerate the convergence of the iterative log-domain sum-product decoding procedure of LDPC codes as well as to improve the reconstructed image quality. Error resilience modes are used in the JPEG2000 source codec, which makes it possible to provide useful source decoded information to the channel decoder. After each iteration, a tentative decoding is made and the channel decoded bits are then sent to the JPEG2000 decoder. Due to the error resilience modes, some bits are known to be either correct or in error. The positions of these bits are then fed back to the channel decoder. The log-likelihood ratios (LLRs) of these bits are then modified by a weighting factor for the next iteration. By observing the statistics of the decoding procedure, the weighting factor is designed as a function of the channel condition. That is, for lower channel SNR, a larger factor is assigned, and vice versa. Results show that the proposed joint decoding method can greatly reduce the number of iterations, and thereby reduce the decoding delay considerably. At the same time, this method always outperforms the non-source-controlled decoding method by up to 5 dB in terms of PSNR for various reconstructed images.
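
    The feedback step (rescaling the channel decoder's LLRs for bits that the JPEG2000 error resilience modes flag as known-correct or known-erroneous) can be sketched as below; the weighting-factor formula and all values are illustrative assumptions, not the function tuned in the paper:

```python
import numpy as np

def reweight_llrs(llr, known_correct, known_error, snr_db):
    """Scale log-likelihood ratios using source-decoder feedback.
    Bits flagged correct are reinforced, bits flagged in error are damped;
    the factor grows as channel SNR drops (an assumed functional form)."""
    w = 1.0 + 2.0 / (1.0 + 10.0 ** (snr_db / 10.0))  # larger factor at low SNR
    out = llr.copy()
    out[known_correct] *= w   # strengthen confidence in verified bits
    out[known_error] /= w     # weaken confidence in bits known to be wrong
    return out

llr = np.array([1.5, -0.4, 0.2, -2.0])
correct = np.array([True, False, False, True])
error = np.array([False, True, False, False])
new_llr = reweight_llrs(llr, correct, error, snr_db=0.0)  # w = 2 at 0 dB
```

    The reweighted LLRs would then seed the next sum-product iteration, which is how the tentative source decoding accelerates channel decoder convergence.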

  3. Status of a standard for neutron skyshine calculation and measurement

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Westfall, R.M.; Wright, R.Q.; Greenborg, J.

    1990-01-01

    An effort has been under way for several years to prepare a draft standard, ANS-6.6.2, Calculation and Measurement of Direct and Scattered Neutron Radiation from Contained Sources Due to Nuclear Power Operations. At the outset, the work group adopted a three-phase study involving one-dimensional analyses, a measurements program, and multi-dimensional analyses. Of particular interest are the neutron radiation levels associated with dry-fuel storage at reactor sites. The need for dry storage has been investigated for various scenarios of repository and monitored retrievable storage (MRS) facilities availability with the waste stream analysis model. The concern is with long-term integrated, low-level doses at long distances from a multiplicity of sources. To evaluate the conservatism associated with one-dimensional analyses, the work group has specified a series of simple problems. Sources as a function of fuel exposure were determined for a Westinghouse 17 x 17 pressurized water reactor assembly with the ORIGEN-S module of the SCALE system. The energy degradation of the 35 GWd/ton U sources was determined for two generic designs of dry-fuel storage casks.

  4. Kernel temporal enhancement approach for LORETA source reconstruction using EEG data.

    PubMed

    Torres-Valencia, Cristian A; Santamaria, M Claudia Joana; Alvarez, Mauricio A

    2016-08-01

    Reconstruction of brain sources from magnetoencephalography and electroencephalography (M/EEG) data is a well-known problem in the neuroengineering field. An inverse problem must be solved, and several methods have been proposed. Low Resolution Electromagnetic Tomography (LORETA) and its proposed variations, standardized LORETA (sLORETA) and standardized weighted LORETA (swLORETA), solve the inverse problem following a non-parametric approach, that is, by setting dipoles in the whole brain domain in order to estimate the dipole positions from the M/EEG data while assuming some spatial priors. Errors in the reconstruction of sources arise due to the low spatial resolution of the LORETA framework and the influence of noise in the observable data. In this work, a kernel temporal enhancement (kTE) is proposed in order to build a preprocessing stage of the data that, in combination with the swLORETA method, allows an improvement in the source reconstruction. The results are quantified in terms of three dipole error localization metrics, and the strategy of swLORETA + kTE obtained the best results across different signal-to-noise ratios (SNR) in random-dipole simulations from synthetic EEG data.

  5. The role of alprazolam for the treatment of panic disorder in Australia.

    PubMed

    Moylan, Steven; Giorlando, Francesco; Nordfjærn, Trond; Berk, Michael

    2012-03-01

    To investigate the potential impact of increasing prescription rates of alprazolam for the treatment of panic disorder (PD) in Australia through a review of the efficacy, tolerability and adverse outcome literature. Data were sourced by a literature search using MEDLINE, Embase, PsycINFO and a manual search of scientific journals to identify relevant articles. Clinical practice guidelines from the American Psychiatric Association, National Institute of Clinical Excellence, Royal Australian and New Zealand College of Psychiatrists and World Federation of Societies of Biological Psychiatry were sourced. Prescription data were sourced from Australian governmental sources. Alprazolam has shown efficacy for control of PD symptoms, particularly in short-term controlled clinical trials, but is no longer recommended as a first-line pharmacological treatment due to concerns about the risks of developing tolerance, dependence and abuse potential. Almost no evidence is available comparing alprazolam to current first-line pharmacological treatments. Despite this, prescription rates are increasing. A number of potential issues, including use in overdose and impact on car accidents, are noted. Conclusion: Although effective for PD symptoms in clinical trials, a number of potential issues may exist with its use. Consideration of its future place in PD treatment in Australia may be warranted.

  6. Saline Groundwater from Coastal Aquifers As a Source for Desalination.

    PubMed

    Stein, Shaked; Russak, Amos; Sivan, Orit; Yechieli, Yoseph; Rahav, Eyal; Oren, Yoram; Kasher, Roni

    2016-02-16

    Reverse osmosis (RO) seawater desalination is currently a widespread means of closing the gap between supply and demand for potable water in arid regions. Currently, one of the main setbacks of RO operation is fouling, which hinders membrane performance and induces pressure loss, thereby reducing system efficiency. An alternative water source is saline groundwater with salinity close to seawater, pumped from beach wells in coastal aquifers which penetrate beneath the freshwater-seawater interface. In this research, we studied the potential use of saline groundwater of the coastal aquifer as feedwater for desalination in comparison to seawater using fieldwork and laboratory approaches. The chemistry, microbiology and physical properties of saline groundwater were characterized and compared with seawater. Additionally, reverse osmosis desalination experiments in a cross-flow system were performed, evaluating the permeate flux, salt rejection and fouling propensities of the different water types. Our results indicated that saline groundwater was significantly favored over seawater as a feed source in terms of chemical composition, microorganism content, silt density, and fouling potential, and exhibited better desalination performance with less flux decline. Saline groundwater may be a better water source for desalination by RO due to lower fouling potential, and reduced pretreatment costs.

  7. Modeling the refraction of microbaroms by the winds of a large maritime storm.

    PubMed

    Blom, Philip; Waxler, Roger

    2017-12-01

    Continuous infrasonic signals produced by the ocean surface interacting with the atmosphere, termed microbaroms, are known to be generated by a number of phenomena including large maritime storms. Storm generated microbaroms exhibit axial asymmetry when observed at locations far from the storm due to the source location being offset from the storm center. Because of this offset, a portion of the microbarom energy will radiate towards the storm center and interact with the winds in the region. Detailed here are predictions for the propagation of microbaroms through an axisymmetric, three-dimensional model storm. Geometric propagation methods have been utilized and the predicted horizontal refraction is found to produce signals that appear to emanate from a virtual source near the storm center when observed far from the storm. This virtual source near the storm center is expected to be observed only from a limited arc around the storm system with increased extent associated with more intense wind fields. This result implies that identifying the extent of the arc observing signal from the virtual source could provide a means to estimate the wind structure using infrasonic observations far from the storm system.

  8. Efficient 1.6 Micron Laser Source for Methane DIAL

    NASA Technical Reports Server (NTRS)

    Shuman, Timothy; Burnham, Ralph; Nehrir, Amin R.; Ismail, Syed; Hair, Johnathan W.

    2013-01-01

    Methane is a potent greenhouse gas and, on a per-molecule basis, has a warming influence 72 times that of carbon dioxide over a 20-year horizon. Therefore, it is important to look at near-term radiative effects due to methane to develop mitigation strategies to counteract global warming trends via ground- and airborne-based measurement systems. These systems require the development of a time-resolved DIAL capability using a narrow-line laser source allowing observation of atmospheric methane on local, regional and global scales. In this work, a demonstrated and efficient nonlinear conversion scheme meeting the performance requirements of a deployable methane DIAL system is presented. By combining a single-frequency 1064 nm pump source and a seeded KTP OPO, more than 5 mJ of 1.6 µm pulse energy is generated with conversion efficiencies in excess of 20%. Even without active cavity control, instrument-limited linewidths (50 pm) were achieved with an estimated spectral purity of 95%. Tunable operation over 400 pm (limited by the tuning range of the seed laser) was also demonstrated. This source meets the critical needs of a methane DIAL system, motivating additional development of the technology.

  9. Viking-Age Sails: Form and Proportion

    NASA Astrophysics Data System (ADS)

    Bischoff, Vibeke

    2017-04-01

    Archaeological ship-finds have shed much light on the design and construction of vessels from the Viking Age. However, the exact proportions of their sails remain unknown due to the lack of fully preserved sails, or other definite indicators of their proportions. Key Viking-Age ship-finds from Scandinavia—the Oseberg Ship, the Gokstad Ship and Skuldelev 3—have all revealed traces of rigging. In all three finds, the keelson—with the mast position—is preserved, together with fastenings for the sheets and the tack, indicating the breadth of the sail. The sail area can then be estimated based on practical experience of how large a sail the specific ship can carry, in conjunction with hull form and displacement. This article presents reconstructions of the form and dimensions of rigging and sail based on the archaeological finds, evidence from iconographic and written sources, and ethnographic parallels with traditional Nordic boats. When these sources are analysed, not only do the similarities become apparent, but so too does the relative disparity between the archaeological record and the other sources. Preferential selection in terms of which source is given the greatest merit is therefore required, as it is not possible to afford them all equal value.

  10. Dissociation between memory accuracy and memory confidence following bilateral parietal lesions.

    PubMed

    Simons, Jon S; Peers, Polly V; Mazuz, Yonatan S; Berryhill, Marian E; Olson, Ingrid R

    2010-02-01

    Numerous functional neuroimaging studies have observed lateral parietal lobe activation during memory tasks: a surprise to clinicians who have traditionally associated the parietal lobe with spatial attention rather than memory. Recent neuropsychological studies examining episodic recollection after parietal lobe lesions have reported differing results. Performance was preserved in unilateral lesion patients on source memory tasks involving recollecting the context in which stimuli were encountered, and impaired in patients with bilateral parietal lesions on tasks assessing free recall of autobiographical memories. Here, we investigated a number of possible accounts for these differing results. In 3 experiments, patients with bilateral parietal lesions performed as well as controls at source recollection, confirming the previous unilateral lesion results and arguing against an explanation for those results in terms of contralesional compensation. Reducing the behavioral relevance of mnemonic information critical to the source recollection task did not affect performance of the bilateral lesion patients, indicating that the previously observed reduced autobiographical free recall might not be due to impaired bottom-up attention. The bilateral patients did, however, exhibit reduced confidence in their source recollection abilities across the 3 experiments, consistent with a suggestion that parietal lobe lesions might lead to impaired subjective experience of rich episodic recollection.

  11. Contribution of Satellite Gravimetry to Understanding Seismic Source Processes of the 2011 Tohoku-Oki Earthquake

    NASA Technical Reports Server (NTRS)

    Han, Shin-Chan; Sauber, Jeanne; Riva, Riccardo

    2011-01-01

    The 2011 great Tohoku-Oki earthquake, apart from shaking the ground, perturbed the motions of satellites such as GRACE, orbiting hundreds of kilometers above the ground, due to the coseismic change in the gravity field. Significant changes in inter-satellite distance were observed after the earthquake. These unconventional satellite measurements were inverted to examine the earthquake source processes from a radically different perspective that complements the analyses of seismic and geodetic ground recordings. We found the average slip located up-dip of the hypocenter but within the lower crust, as characterized by a limited range of bulk and shear moduli. The GRACE data constrained a group of earthquake source parameters that yield increasing dip (7-16 degrees plus or minus 2 degrees) and, simultaneously, decreasing moment magnitude (9.17-9.02 plus or minus 0.04) with increasing source depth (15-24 kilometers). The GRACE solution includes the cumulative moment released over a month and provides a unique view of the long-wavelength gravimetric response to all mass redistribution processes associated with the dynamic rupture and short-term postseismic mechanisms, improving our understanding of the physics of megathrusts.

  12. Fermi Large Area Telescope First Source Catalog

    DOE PAGES

    Abdo, A. A.; Ackermann, M.; Ajello, M.; ...

    2010-05-25

    Here, we present a catalog of high-energy gamma-ray sources detected by the Large Area Telescope (LAT), the primary science instrument on the Fermi Gamma-ray Space Telescope (Fermi), during the first 11 months of the science phase of the mission, which began on 2008 August 4. The First Fermi-LAT catalog (1FGL) contains 1451 sources detected and characterized in the 100 MeV to 100 GeV range. Source detection was based on the average flux over the 11 month period, and the threshold likelihood Test Statistic is 25, corresponding to a significance of just over 4σ. The 1FGL catalog includes source location regions, defined in terms of elliptical fits to the 95% confidence regions, and power-law spectral fits as well as flux measurements in five energy bands for each source. In addition, monthly light curves are provided. Using a protocol defined before launch we have tested for several populations of gamma-ray sources among the sources in the catalog. For individual LAT-detected sources we provide firm identifications or plausible associations with sources in other astronomical catalogs. Identifications are based on correlated variability with counterparts at other wavelengths, or on spin or orbital periodicity. For the catalogs and association criteria that we have selected, 630 of the sources are unassociated. In conclusion, care was taken to characterize the sensitivity of the results to the model of interstellar diffuse gamma-ray emission used to model the bright foreground, with the result that 161 sources at low Galactic latitudes and toward bright local interstellar clouds are flagged as having properties that are strongly dependent on the model or as potentially being due to incorrectly modeled structure in the Galactic diffuse emission.

  13. A comparison of Lorentz, planetary gravitational, and satellite gravitational resonances

    NASA Technical Reports Server (NTRS)

    Hamilton, Douglas P.

    1994-01-01

    We consider a charged dust grain whose orbital motion is dominated by a planet's point-source gravity, but perturbed by higher-order terms in the planet's gravity field as well as by the Lorentz force arising from an asymmetric planetary magnetic field. Perturbations to Keplerian orbits due to a nonspherical gravity field are expressed in the traditional way: in terms of a disturbing function which can be expanded in a series of spherical harmonics (W. M. Kaula, 1966). In order to calculate the electromagnetic perturbation, we first write the Lorentz force in terms of the orbital elements and then substitute it into Gauss' perturbation equations. We use our result to derive strengths of Lorentz resonances and elucidate their properties. In particular, we compare Lorentz resonances to two types of gravitational resonances: those arising from periodic tugs of a satellite and those due to the attraction of an arbitrarily shaped planet. We find that Lorentz resonances share numerous properties with their gravitational counterparts and show, using simple physical arguments, that several of these patterns are fundamental, applying not only to our expansions, but to all quantities expressed in terms of orbital elements. Some of these patterns have been previously called 'd'Alembert rules' for satellite resonances. Other similarities arise because, to first-order in the perturbing force, the three problems share an integral of the motion. Yet there are also differences; for example, first-order inclination resonances exist for perturbations arising from planetary gravity and from the Lorentz force, but not for those due to an orbiting satellite. Finally, we provide a heuristic treatment of a particle's orbital evolution under the influence of drag and resonant forces. Particles brought into mean-motion resonances experience either trapping or resonant 'jumps,' depending on the direction from which the resonance is approached. 
We show that this behavior does not depend on the details of the perturbing force but rather is fundamental to all mean-motion resonances.

  14. Lexicon-enhanced sentiment analysis framework using rule-based classification scheme.

    PubMed

    Asghar, Muhammad Zubair; Khan, Aurangzeb; Ahmad, Shakeel; Qasim, Maria; Khan, Imran Ali

    2017-01-01

    With the rapid increase in social networks and blogs, social media services are increasingly being used by online communities to share their views and experiences about a particular product, policy or event. Due to the economic importance of these reviews, there is a growing trend of writing user reviews to promote a product. Nowadays, users consult online blogs and review sites before purchasing products. Therefore, user reviews are considered an important source of information in Sentiment Analysis (SA) applications for decision making. In this work, we exploit the wealth of user reviews available through online forums to analyze the semantic orientation of words by categorizing them into +ive and -ive classes, identifying and classifying the emoticons, modifiers, general-purpose and domain-specific words expressed in the public's feedback about products. However, the unsupervised learning approach employed in previous studies suffers from data sparseness and low accuracy because it does not consider emoticons, modifiers and domain-specific words, which may result in inaccurate classification of users' reviews. Lexicon-enhanced sentiment analysis based on a rule-based classification scheme is an alternative approach for improving the sentiment classification of users' reviews in online communities. In addition to the sentiment terms used in general-purpose sentiment analysis, we integrate emoticons, modifiers and domain-specific terms to analyze the reviews posted in online communities. To test the effectiveness of the proposed method, we considered users' reviews in three domains. The results obtained from different experiments demonstrate that the proposed method overcomes the limitations of previous methods, and the performance of the sentiment analysis is improved after considering emoticons, modifiers, negations and domain-specific terms when compared to baseline methods.
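
    The rule-based, lexicon-enhanced scoring described in this abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the lexicon entries, emoticon map, modifier weights, and negation list below are all hypothetical placeholders for the resources such a system would plug in.

    ```python
    # Illustrative lexicon resources -- placeholders, not the paper's actual data.
    LEXICON = {"good": 1, "great": 2, "bad": -1, "terrible": -2}
    EMOTICONS = {":)": 1, ":(": -1}          # emoticon polarities
    MODIFIERS = {"very": 1.5, "slightly": 0.5}  # intensity weights
    NEGATIONS = {"not", "never", "no"}

    def score_review(text: str) -> float:
        """Sum word polarities, weighting by modifiers and flipping
        polarity after a negation word (a simple rule-based scheme)."""
        score, weight, negate = 0.0, 1.0, False
        for token in text.lower().split():
            if token in NEGATIONS:
                negate = True
            elif token in MODIFIERS:
                weight *= MODIFIERS[token]
            else:
                polarity = LEXICON.get(token, EMOTICONS.get(token, 0))
                if negate:
                    polarity = -polarity
                score += weight * polarity
                weight, negate = 1.0, False  # rules reset after a content word
        return score

    def classify(text: str) -> str:
        """Map the numeric score onto the +ive / -ive classes used above."""
        s = score_review(text)
        return "+ive" if s > 0 else "-ive" if s < 0 else "neutral"
    ```

    A domain-specific lexicon would simply extend `LEXICON` with terms whose polarity depends on the product domain under review.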

  15. On the possibility of the multiple inductively coupled plasma and helicon plasma sources for large-area processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Jin-Won; Lee, Yun-Seong, E-mail: leeeeys@kaist.ac.kr; Chang, Hong-Young

    2014-08-15

    In this study, we attempted to determine the feasibility of multiple inductively coupled plasma (ICP) and helicon plasma sources for large-area processes. Experiments were performed with one and two coils to measure plasma and electrical parameters, and a circuit simulation was performed to calculate the current at each coil in the 2-coil experiment. Based on the results, we could establish the feasibility of multiple ICP sources, owing to a direct change of impedance with current and saturation of impedance due to the skin-depth effect. However, a helicon plasma source is difficult to adapt to multiple sources due to the consistent change of real impedance during mode transition and the low uniformity of the B-field confinement. As a result, it is expected that ICP can be adapted to multiple sources for large-area processes.

  16. Terms used by nurses to describe patient problems: can SNOMED III represent nursing concepts in the patient record?

    PubMed Central

    Henry, S B; Holzemer, W L; Reilly, C A; Campbell, K E

    1994-01-01

    OBJECTIVE: To analyze the terms used by nurses in a variety of data sources and to test the feasibility of using SNOMED III to represent nursing terms. DESIGN: Prospective research design with manual matching of terms to the SNOMED III vocabulary. MEASUREMENTS: The terms used by nurses to describe patient problems during 485 episodes of care for 201 patients hospitalized for Pneumocystis carinii pneumonia were identified. Problems from four data sources (nurse interview, intershift report, nursing care plan, and nurse progress note/flowsheet) were classified based on the substantive area of the problem and on the terminology used to describe the problem. A test subset of the 25 most frequently used terms from the two written data sources (nursing care plan and nurse progress note/flowsheet) were manually matched to SNOMED III terms to test the feasibility of using that existing vocabulary to represent nursing terms. RESULTS: Nurses most frequently described patient problems as signs/symptoms in the verbal nurse interview and intershift report. In the written data sources, problems were recorded as North American Nursing Diagnosis Association (NANDA) terms and signs/symptoms with similar frequencies. Of the nursing terms in the test subset, 69% were represented using one or more SNOMED III terms. PMID:7719788

  17. Highlighting Uncertainty and Recommendations for Improvement of Black Carbon Biomass Fuel-Based Emission Inventories in the Indo-Gangetic Plain Region.

    PubMed

    Soneja, Sutyajeet I; Tielsch, James M; Khatry, Subarna K; Curriero, Frank C; Breysse, Patrick N

    2016-03-01

    Black carbon (BC) is a major contributor to hydrological cycle change and glacial retreat within the Indo-Gangetic Plain (IGP) and surrounding region. However, significant variability exists among estimates of regional BC concentration. Existing inventories within the IGP suffer from limited representation of rural sources, reliance on idealized point-source estimates (e.g., utilization of emission factors or fuel-use estimates for cooking along with demographic information), and difficulty in distinguishing sources. Inventory development utilizes two approaches, termed top-down and bottom-up, which rely on various sources including transport models, emission factors, and remote sensing applications. Large discrepancies exist for BC source attribution throughout the IGP depending on the approach utilized. Cooking with biomass fuels, a major contributor to BC production, exhibits great source-apportionment variability. Recognized areas requiring attention in cookstove and biomass fuel-use research to improve emission inventory estimates include emission factors, particulate matter speciation, and better quantification of regional/economic sectors. However, limited attention has been given to understanding ambient small-scale spatial variation of BC between cooking and non-cooking periods in low-resource environments. Understanding the indoor-to-outdoor relationship of BC emissions due to cooking at a local level is a top priority for improving emission inventories, as many health and climate applications rely on accurate emission inventories.

  18. Climate Change Impacts of US Reactive Nitrogen Emissions

    NASA Astrophysics Data System (ADS)

    Pinder, R. W.; Davidson, E. A.; Goodale, C. L.; Greaver, T.; Herrick, J.; Liu, L.

    2011-12-01

    By fossil fuel combustion and fertilizer application, the US has substantially altered the nitrogen cycle, with serious effects on climate change. The climate effects can be short-lived, by impacting the chemistry of the atmosphere, or long-lived, by altering ecosystem greenhouse gas fluxes. Here, we develop a coherent framework for assessing the climate change impacts of US reactive nitrogen emissions. We use the global temperature potential (GTP) as a common metric and calculate the GTP at 20 and 100 years in units of CO2 equivalents. At both time scales, nitrogen enhancement of CO2 uptake has the largest impact, because in the eastern US, areas of high nitrogen deposition are co-located with forests. In the short term, the effect of NOx altering ozone and methane concentrations is also substantial, but it is not important on the 100-year time scale. Finally, the GTP of N2O emissions is substantial at both time scales. We have also attributed these impacts to combustion and agricultural sources, and quantified the uncertainty. Reactive nitrogen from combustion sources contributes more to cooling than warming. The impacts of agricultural sources tend to cancel each other out, and the net effect is uncertain. Recent trends show decreasing reactive nitrogen from US combustion sources, while agricultural sources are increasing. Fortunately, many mitigation strategies are currently available to reduce the climate change impacts of US agricultural sources.
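
    The common-metric bookkeeping described above, weighting each emitted species by its GTP at a chosen time horizon to express it in CO2 equivalents, can be sketched as follows. The GTP values below are hypothetical placeholders, not the study's numbers.

    ```python
    # kg CO2-eq per kg of species at (20-year, 100-year) horizons.
    # These values are illustrative assumptions, not the paper's estimates;
    # a negative GTP indicates a net cooling effect.
    GTP = {
        "N2O": (280.0, 265.0),
        "NOx": (-60.0, -5.0),  # net cooling via ozone/methane chemistry
    }

    def co2_equivalent(emissions_kg: dict, horizon: int) -> float:
        """Sum emissions (kg per species) weighted by the GTP at the
        requested horizon (20 or 100 years), giving kg CO2-equivalents."""
        idx = {20: 0, 100: 1}[horizon]
        return sum(GTP[species][idx] * mass
                   for species, mass in emissions_kg.items())
    ```

    Evaluating the same emission inventory at both horizons is what lets short-lived effects (large at 20 years) fade relative to long-lived ones (dominant at 100 years).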

  19. Estimating air emissions from ships: Meta-analysis of modelling approaches and available data sources

    NASA Astrophysics Data System (ADS)

    Miola, Apollonia; Ciuffo, Biagio

    2011-04-01

    Maritime transport plays a central role in the transport sector's sustainability debate. Its contribution to air pollution and greenhouse gases is significant. An effective policy strategy to regulate air emissions requires their robust estimation in terms of quantification and location. This paper provides a critical analysis of the ship emission modelling approaches and data sources available, identifying their limits and constraints. It classifies the main methodologies on the basis of the approach followed (bottom-up or top-down) for the evaluation and geographic characterisation of emissions. The analysis highlights the uncertainty of results from the different methods. This is mainly due to the level of uncertainty connected with the sources of information that are used as inputs to the different studies. This paper describes the sources of the information required for these analyses, paying particular attention to AIS data and to the possible problems associated with their use. One way of reducing the overall uncertainty in the results could be the simultaneous use of different sources of information. This paper presents an alternative methodology based on this approach. As a final remark, it can be expected that new approaches to the problem together with more reliable data sources over the coming years could give more impetus to the debate on the global impact of maritime traffic on the environment that, currently, has only reached agreement via the "consensus" estimates provided by IMO (2009).

  20. The characteristics and impact of source of infection on sepsis-related ICU outcomes.

    PubMed

    Jeganathan, Niranjan; Yau, Stephen; Ahuja, Neha; Otu, Dara; Stein, Brian; Fogg, Louis; Balk, Robert

    2017-10-01

    Source of infection is an independent predictor of sepsis-related mortality. To date, studies have failed to evaluate differences in septic patients based on the source of infection. This was a retrospective study of all patients with sepsis admitted to the ICU of a university hospital within a 12-month period. Sepsis due to an intravascular device or multiple sources had the highest rates of positive blood cultures and microbiology, whereas lung and abdominal sepsis had the lowest. The observed hospital mortality was highest for sepsis due to multiple sources or an unknown cause, and lowest when due to abdominal, genitourinary (GU) or skin/soft tissue sources. Patients with sepsis due to the lungs, unknown and multiple sources had the highest rates of multi-organ failure, whereas those with sepsis due to GU and skin/soft tissue sources had the lowest rates. Those with multisource sepsis had a significantly higher median ICU length of stay and hospital cost. There are significant differences in patient characteristics, microbiology positivity, organs affected, mortality, length of stay and cost based on the source of sepsis. These differences should be considered in future studies to be able to deliver personalized care. Copyright © 2017 Elsevier Inc. All rights reserved.

  1. Estimation of the caesium-137 source term from the Fukushima Daiichi nuclear power plant using a consistent joint assimilation of air concentration and deposition observations

    NASA Astrophysics Data System (ADS)

    Winiarek, Victor; Bocquet, Marc; Duhanyan, Nora; Roustan, Yelva; Saunier, Olivier; Mathieu, Anne

    2014-01-01

    Inverse modelling techniques can be used to estimate the amount of radionuclides and the temporal profile of the source term released in the atmosphere during the accident of the Fukushima Daiichi nuclear power plant in March 2011. In Winiarek et al. (2012b), the lower bounds of the caesium-137 and iodine-131 source terms were estimated with such techniques, using activity concentration measurements. The importance of an objective assessment of prior errors (the observation errors and the background errors) was emphasised for a reliable inversion. In such critical context where the meteorological conditions can make the source term partly unobservable and where only a few observations are available, such prior estimation techniques are mandatory, the retrieved source term being very sensitive to this estimation. We propose to extend the use of these techniques to the estimation of prior errors when assimilating observations from several data sets. The aim is to compute an estimate of the caesium-137 source term jointly using all available data about this radionuclide, such as activity concentrations in the air, but also daily fallout measurements and total cumulated fallout measurements. It is crucial to properly and simultaneously estimate the background errors and the prior errors relative to each data set. A proper estimation of prior errors is also a necessary condition to reliably estimate the a posteriori uncertainty of the estimated source term. Using such techniques, we retrieve a total released quantity of caesium-137 in the interval 11.6-19.3 PBq with an estimated standard deviation range of 15-20% depending on the method and the data sets. The “blind” time intervals of the source term have also been strongly mitigated compared to the first estimations with only activity concentration data.
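
    The role of prior errors in such an inversion can be illustrated with a scalar sketch: a single release quantity q estimated from observations y_i = h_i * q + noise, with observation error variance r and background (prior) error variance b around a prior guess q_b. This is a textbook Bayesian least-squares estimate for illustration only, not the authors' full multi-dataset method; all variable names are hypothetical.

    ```python
    def estimate_source(y, h, r, q_b, b):
        """Posterior mean and variance of a scalar release quantity q,
        given observations y[i] = h[i] * q + noise with observation error
        variance r, and a Gaussian prior N(q_b, b). The posterior precision
        is the sum of the prior and observation precisions, which is why a
        proper estimate of r and b directly controls both the retrieved
        value and its a posteriori uncertainty."""
        precision = 1.0 / b + sum(hi * hi for hi in h) / r
        mean = (q_b / b + sum(hi * yi for hi, yi in zip(h, y)) / r) / precision
        return mean, 1.0 / precision
    ```

    With no observations the estimate falls back to the prior; with very small observation error it converges to the value implied by the data, mirroring how the retrieved source term is sensitive to the prior-error estimation emphasised above.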

  2. The new generation of beta-cells: replication, stem cell differentiation, and the role of small molecules.

    PubMed

    Borowiak, Malgorzata

    2010-01-01

    Diabetic patients suffer from the loss of insulin-secreting β-cells, or from an improperly functioning β-cell mass. Due to the increasing prevalence of diabetes across the world, there is a compelling need for a renewable source of cells that could replace pancreatic β-cells. In recent years, several promising approaches to the generation of new β-cells have been developed. These include directed differentiation of pluripotent cells such as embryonic stem (ES) cells or induced pluripotent stem (iPS) cells, or reprogramming of mature tissue cells. High-yield methods to differentiate cell populations into β-cells, definitive endoderm, and pancreatic progenitors have been established using growth factors and small molecules. However, the final step of directed differentiation to generate functional, mature β-cells in sufficient quantities has yet to be achieved in vitro. Besides the needs of transplantation medicine, a renewable source of β-cells would also be important as a platform to study the pathogenesis of diabetes and to seek alternative treatments. Finally, by generating new β-cells, we could learn more about pancreatic development and β-cell specification. This review gives an overview of pancreas ontogenesis from the perspective of stem cell differentiation, and highlights the critical aspects of small molecules in the generation of a renewable β-cell source. It also discusses longer-term challenges and opportunities in moving towards a therapeutic goal for diabetes.

  3. Reachability Analysis in Probabilistic Biological Networks.

    PubMed

    Gabr, Haitham; Todor, Andrei; Dobra, Alin; Kahveci, Tamer

    2015-01-01

    Extra-cellular molecules trigger a response inside the cell by initiating a signal at special membrane receptors (i.e., sources), which is then transmitted to reporters (i.e., targets) through various chains of interactions among proteins. Understanding whether such a signal can reach from membrane receptors to reporters is essential in studying the cell response to extra-cellular events. This problem is drastically complicated due to the unreliability of the interaction data. In this paper, we develop a novel method, called PReach (Probabilistic Reachability), that precisely computes the probability that a signal can reach from a given collection of receptors to a given collection of reporters when the underlying signaling network is uncertain. This is a very difficult computational problem with no known polynomial-time solution. PReach represents each uncertain interaction as a bi-variate polynomial. It transforms the reachability problem to a polynomial multiplication problem. We introduce novel polynomial collapsing operators that associate polynomial terms with possible paths between sources and targets as well as the cuts that separate sources from targets. These operators significantly shrink the number of polynomial terms and thus the running time. PReach has much better time complexity than the recent solutions for this problem. Our experimental results on real data sets demonstrate that this improvement leads to orders of magnitude of reduction in the running time over the most recent methods. Availability: All the data sets used, the software implemented and the alignments found in this paper are available at http://bioinformatics.cise.ufl.edu/PReach/.
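
    The quantity PReach computes, the probability that a signal can travel from receptors to reporters when every interaction is uncertain, can be illustrated with a brute-force sketch that enumerates all on/off states of the edges. This exponential enumeration is exactly what PReach's polynomial-collapsing operators are designed to avoid; the tiny network in the usage note is hypothetical.

    ```python
    from itertools import product

    def reach_probability(edges, sources, targets):
        """Exact probability that any source reaches any target, obtained by
        enumerating all 2^|E| presence/absence states of the uncertain edges.
        `edges` maps directed (u, v) pairs to their existence probability.
        Exponential in |E|: usable only for tiny illustrative networks."""
        total = 0.0
        for states in product([0, 1], repeat=len(edges)):
            prob = 1.0
            adj = {}
            for ((u, v), p), on in zip(edges.items(), states):
                prob *= p if on else (1.0 - p)
                if on:
                    adj.setdefault(u, []).append(v)
            # depth-first search over the edges present in this state
            seen, stack = set(sources), list(sources)
            while stack:
                node = stack.pop()
                for nxt in adj.get(node, []):
                    if nxt not in seen:
                        seen.add(nxt)
                        stack.append(nxt)
            if seen & set(targets):
                total += prob
        return total
    ```

    For a hypothetical two-step chain receptor → a → reporter with each interaction present with probability 0.5, both edges must exist, so the reachability probability is 0.25.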

  4. Analysis of photovoltaic with water pump cooling by using ANSYS

    NASA Astrophysics Data System (ADS)

    Syafiqah, Z.; Amin, N. A. M.; Irwan, Y. M.; Shobry, M. Z.; Majid, M. S. A.

    2017-10-01

    Almost all regions in the world face the problem of increasing electricity costs over time. Moreover, concern about global warming has driven a rapid move towards renewable energy sources, which are believed to be more reliable and safer. One of the best alternatives to fossil fuels is solar energy. A photovoltaic (PV) panel is used to convert sunlight into electricity. Unfortunately, the performance of a PV panel is affected by its operating temperature: as the ambient temperature increases, the PV panel's operating temperature also increases, degrading its performance (in terms of power generated). To address this, a water cooling system was installed on top of the PV panel to help reduce its temperature. Five different water mass flow rates were tested to investigate their impact on the thermal performance and heat transfer rate.

  5. Configurations and implementation of payroll system using open source erp: a case study of Koperasi PT Sri

    NASA Astrophysics Data System (ADS)

    Terminanto, A.; Swantoro, H. A.; Hidayanto, A. N.

    2017-12-01

    Enterprise Resource Planning (ERP) is an integrated information system for managing the business processes of companies of various scales. Because of the high cost of ERP investment, ERP implementation is usually undertaken by large-scale enterprises, and due to the complexity of implementation problems, the success rate of ERP implementation is still low. Open source software (OSS) ERP has become an alternative ERP choice for SME companies in terms of cost and customization. This study aims to identify the characteristics and configuration of an OSS ERP Payroll module implementation at KKPS (Employee Cooperative PT SRI) using OSS ERP Odoo and the ASAP method. The study is classified as case study and action research. Implementation of the OSS ERP Payroll module was undertaken because the HR section of KKPS was not integrated with other parts. The results of this study are the characteristics and configuration of the OSS ERP payroll module at KKPS.

  6. Beam shaping in high-power broad-area quantum cascade lasers using optical feedback

    PubMed Central

    Ferré, Simon; Jumpertz, Louise; Carras, Mathieu; Ferreira, Robson; Grillot, Frédéric

    2017-01-01

    Broad-area quantum cascade lasers with high output powers are highly desirable sources for various applications, including infrared countermeasures. However, such structures suffer from strongly deteriorated beam quality due to multimode behavior, diffraction of light, and self-focusing. Quantum cascade lasers with high performance in terms of power and heat-load dissipation are reported, and their response to a nonlinear control scheme based on optical feedback is studied. Applying optical feedback enables the near-field beam profile to be tailored efficiently. The different cavity modes are excited sequentially by shifting the feedback mirror angle. Further control of the near-field profile is demonstrated using spatial filtering. The impact of an inhomogeneous gain, as well as the influence of the cavity width, is investigated. Compared to existing technologies, which are complex and costly, beam shaping with optical feedback is a more flexible solution for obtaining high-quality mid-infrared sources. PMID:28287175

  7. Essential oils: extraction, bioactivities, and their uses for food preservation.

    PubMed

    Tongnuanchan, Phakawat; Benjakul, Soottawat

    2014-07-01

    Essential oils are concentrated liquids of complex mixtures of volatile compounds that can be extracted from several plant organs. They are a good source of bioactive compounds with antioxidative and antimicrobial properties, and some have been used as medicines. Furthermore, essential oils have received increasing attention as natural additives for extending the shelf life of food products, owing to the risks of using synthetic preservatives. Essential oils can be incorporated into packaging, where they can provide multiple functions, termed "active or smart packaging." These oils can also modify the matrix of packaging materials, thereby improving their properties. This review covers the up-to-date literature on essential oils, including their sources, chemical composition, extraction methods, bioactivities, and applications, with particular emphasis on the preservation and shelf-life extension of food products. © 2014 Institute of Food Technologists®

  8. Theobroma cacao: Review of the Extraction, Isolation, and Bioassay of Its Potential Anti-cancer Compounds

    PubMed Central

    Baharum, Zainal; Akim, Abdah Md; Hin, Taufiq Yap Yun; Hamid, Roslida Abdul; Kasran, Rosmin

    2016-01-01

    Plants have been a good source of therapeutic agents for thousands of years; an impressive number of modern drugs used to treat human diseases are derived from natural sources. The Theobroma cacao tree, or cocoa, has recently garnered increasing attention and become a subject of research due to its antioxidant properties, which are related to potential anti-cancer effects. In the past few years, identifying and developing active compounds or extracts from the cocoa bean that might exert anti-cancer effects have become an important area of health- and biomedicine-related research. This review provides an updated overview of T. cacao in terms of its potential anti-cancer compounds and their extraction, in vitro bioassay, purification, and identification. It also discusses the advantages and disadvantages of the techniques described and offers perspectives on future analytical methods from the viewpoint of anti-cancer compound discovery. PMID:27019680

  9. Application of Biomass from Palm Oil Mill for Organic Rankine Cycle to Generate Power in North Sumatera Indonesia

    NASA Astrophysics Data System (ADS)

    Nur, T. B.; Pane, Z.; Amin, M. N.

    2017-03-01

    Increasing oil and gas demand, combined with the depletion of fossil resources, makes efficient energy systems and alternative energy-conversion processes urgently needed. Given Indonesia's great resource potential, biomass is considered a major potential fuel and renewable resource for the near future. In this paper, the potential of palm oil mill waste as a bioenergy source is investigated. A small-scale organic Rankine cycle (ORC) power plant was preliminarily designed to generate electricity, and working-fluid candidates for the ORC plant were evaluated for the relevant heat-source temperature range. The ORC system with a regenerator has higher thermal efficiency than the basic ORC system. The study demonstrates the technical feasibility of ORC solutions in terms of resource optimization and reduction of greenhouse gas emissions.

  10. Numerical Prediction of Combustion-induced Noise using a hybrid LES/CAA approach

    NASA Astrophysics Data System (ADS)

    Ihme, Matthias; Pitsch, Heinz; Kaltenbacher, Manfred

    2006-11-01

    Noise generation in technical devices is an increasingly important problem. Jet engines in particular produce sound levels that are not only a nuisance but may also impair hearing. The noise emitted by such engines is generated by different sources, such as jet exhaust, fans or turbines, and combustion. Whereas the former acoustic mechanisms are reasonably well understood, combustion-generated noise is not. A methodology for the prediction of combustion-generated noise is developed. In this hybrid approach, unsteady acoustic source terms are obtained from an LES, and the propagation of pressure perturbations is computed using acoustic analogies. Lighthill's acoustic analogy and a non-linear wave equation, accounting for a variable speed of sound, have been employed. Both models are applied to an open diffusion flame. The effects on the far-field pressure and directivity due to the variation of the speed of sound are analyzed. Results for the sound pressure level will be compared with experimental data.
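
Lighthill's acoustic analogy, named above, recasts the compressible flow equations as an inhomogeneous wave equation for the density perturbation, with the flow entering only through a quadrupole source term (this is the standard textbook form, not an equation specific to this study):

```latex
\frac{\partial^2 \rho'}{\partial t^2} - c_0^2 \nabla^2 \rho'
  = \frac{\partial^2 T_{ij}}{\partial x_i \, \partial x_j},
\qquad
T_{ij} = \rho u_i u_j + \bigl(p' - c_0^2 \rho'\bigr)\,\delta_{ij} - \tau_{ij}
```

Here ρ′ and p′ are the density and pressure perturbations, c₀ the ambient speed of sound, and τ_ij the viscous stress tensor. In a hybrid LES/CAA approach of the kind described, the Lighthill tensor T_ij is evaluated from the unsteady LES fields and then used to drive the acoustic propagation step.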

  11. Correlation between Ti source/drain contact and performance of InGaZnO-based thin film transistors

    NASA Astrophysics Data System (ADS)

    Choi, Kwang-Hyuk; Kim, Han-Ki

    2013-02-01

    Ti contact properties and their electrical contribution to an amorphous InGaZnO (a-IGZO) semiconductor-based thin film transistor (TFT) were investigated from chemical, structural, and electrical standpoints. TFT device parameters were quantitatively studied by the transmission line method. By comparing the parameters of a-IGZO TFTs with Ag and Ti source/drain (S/D) electrodes, the Ti S/D contact with the a-IGZO channel was found to lead to a negative shift in VT (ΔVT = -0.52 V). This resulted in a higher saturation mobility (8.48 cm2/Vs) of the a-IGZO TFTs due to an effective interfacial reaction between Ti and the a-IGZO semiconducting layer. Based on transmission electron microscopy, x-ray photoelectron depth-profile analyses, and numerical calculation of TFT parameters, we suggest a possible Ti contact mechanism on semiconducting a-IGZO channel layers for TFTs.
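
The transmission line method mentioned above extracts contact resistance from a linear fit of total resistance versus electrode spacing; a minimal sketch follows, with synthetic example resistances (not data from the study).

```python
# Sketch of the transmission line method (TLM): total resistance
# measured between pads of varying spacing L follows
#     R_total = 2*Rc + (R_sheet / W) * L,
# so a linear fit gives the contact resistance Rc from the intercept.
# The spacings and resistances below are synthetic example values.
import numpy as np

spacings_um = np.array([5.0, 10.0, 20.0, 40.0])           # pad spacing L
r_total_ohm = np.array([1100.0, 1200.0, 1400.0, 1800.0])  # measured R

slope, intercept = np.polyfit(spacings_um, r_total_ohm, 1)
r_contact = intercept / 2.0        # intercept = 2*Rc
sheet_term = slope                 # slope = R_sheet / W, in ohm/um here
print(f"contact resistance Rc ~ {r_contact:.0f} ohm")
```

With real data the fit residuals also indicate whether the contact is ohmic enough for the linear TLM model to apply.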

  12. Beam shaping in high-power broad-area quantum cascade lasers using optical feedback.

    PubMed

    Ferré, Simon; Jumpertz, Louise; Carras, Mathieu; Ferreira, Robson; Grillot, Frédéric

    2017-03-13

    Broad-area quantum cascade lasers with high output powers are highly desirable sources for various applications, including infrared countermeasures. However, such structures suffer from strongly deteriorated beam quality due to multimode behavior, diffraction of light, and self-focusing. Quantum cascade lasers with high performance in terms of power and heat-load dissipation are reported, and their response to a nonlinear control scheme based on optical feedback is studied. Applying optical feedback enables the near-field beam profile to be tailored efficiently. The different cavity modes are excited sequentially by shifting the feedback mirror angle. Further control of the near-field profile is demonstrated using spatial filtering. The impact of an inhomogeneous gain, as well as the influence of the cavity width, is investigated. Compared to existing technologies, which are complex and costly, beam shaping with optical feedback is a more flexible solution for obtaining high-quality mid-infrared sources.

  13. Energy and human health.

    PubMed

    Smith, Kirk R; Frumkin, Howard; Balakrishnan, Kalpana; Butler, Colin D; Chafe, Zoë A; Fairlie, Ian; Kinney, Patrick; Kjellstrom, Tord; Mauzerall, Denise L; McKone, Thomas E; McMichael, Anthony J; Schneider, Mycle

    2013-01-01

    Energy use is central to human society and provides many health benefits. But each source of energy entails some health risks. This article reviews the health impacts of each major source of energy, focusing on those with major implications for the burden of disease globally. The biggest health impacts accrue to the harvesting and burning of solid fuels, coal and biomass, mainly in the form of occupational health risks and household and general ambient air pollution. Lack of access to clean fuels and electricity in the world's poor households is a particularly serious risk for health. Although energy efficiency brings many benefits, it also entails some health risks, as do renewable energy systems, if not managed carefully. We do not review health impacts of climate change itself, which are due mostly to climate-altering pollutants from energy systems, but do discuss the potential for achieving near-term health cobenefits by reducing certain climate-related emissions.

  14. A simplified Mach number scaling law for helicopter rotor noise

    NASA Technical Reports Server (NTRS)

    Aravamudan, K. S.; Lee, A.; Harris, W. L.

    1978-01-01

    Mach number scaling laws are derived for the rotational and the high-frequency broadband noise from helicopter rotors. The rotational scaling law is obtained directly from the theory of Lowson and Ollerhead (1969) by exploiting the properties of the dominant terms in the expression for the complex Fourier coefficients of sound radiation from a point source. The scaling law for the high-frequency broadband noise is obtained by assuming that the noise sources are acoustically compact and computing the instantaneous pressure due to an element on an airfoil where vortices are shed. Experimental results on the correlation lengths for stationary airfoils are extended to rotating airfoils. On the assumption that the correlation length varies as the boundary layer displacement thickness, it is found that the Mach number scaling law contains a factor of Mach number raised to the exponent 5.8. Both scaling laws were verified by model tests.
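
If the M^5.8 factor in the broadband scaling law above is read as a scaling of radiated acoustic intensity (an interpretive assumption here, not stated in the abstract), the corresponding sound-pressure-level shift between two tip Mach numbers is a one-line calculation:

```python
# Illustrative consequence of an M^5.8 intensity scaling: the SPL
# change between two tip Mach numbers is 10*5.8*log10(M2/M1) dB,
# since SPL = 10*log10(I/I_ref). Mach numbers below are example
# values, not measurements from the model tests.
import math

def delta_spl_db(m1, m2, exponent=5.8):
    """SPL change (dB) when tip Mach number goes from m1 to m2."""
    return 10.0 * exponent * math.log10(m2 / m1)

# Raising tip Mach number from 0.4 to 0.6 (~10 dB louder broadband noise):
print(f"{delta_spl_db(0.4, 0.6):.1f} dB")
```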

  15. Long-Term Stability of Oxide Nanowire Sensors via Heavily Doped Oxide Contact.

    PubMed

    Zeng, Hao; Takahashi, Tsunaki; Kanai, Masaki; Zhang, Guozhu; He, Yong; Nagashima, Kazuki; Yanagida, Takeshi

    2017-12-22

    Long-term stability of a chemical sensor is an essential quality for long-term collection of data related to exhaled breath, environmental air, and other sources in the Internet of things (IoT) era. Although an oxide nanowire sensor has shown great potential as a chemical sensor, the long-term stability of sensitivity has not been realized yet due to electrical degradation under harsh sensing conditions. Here, we report a rational concept to accomplish long-term electrical stability of metal oxide nanowire sensors via introduction of a heavily doped metal oxide contact layer. Antimony-doped SnO2 (ATO) contacts on SnO2 nanowires show much more stable and lower electrical contact resistance than conventional Ti contacts for high temperature (200 °C) conditions, which are required to operate chemical sensors. The stable and low contact resistance of ATO was confirmed for at least 1960 h under 200 °C in open air. This heavily doped oxide contact enables us to realize the long-term stability of SnO2 nanowire sensors while maintaining the sensitivity for both NO2 gas and light (photo) detections. The applicability of our method is confirmed for sensors on a flexible polyethylene naphthalate (PEN) substrate. Since the proposed fundamental concept can be applied to various oxide nanostructures, it will give a foundation for designing long-term stable oxide nanomaterial-based IoT sensors.

  16. Sensitivity of WRF-chem predictions to dust source function specification in West Asia

    NASA Astrophysics Data System (ADS)

    Nabavi, Seyed Omid; Haimberger, Leopold; Samimi, Cyrus

    2017-02-01

    Dust storms tend to form in sparsely populated areas covered by only a few observations. Dust source maps, known as source functions, are used in dust models to allocate a certain potential of dust release to each place. Recent research showed that the well-known Ginoux source function (GSF), currently used in the Weather Research and Forecasting model coupled with Chemistry (WRF-chem), exhibits large errors over some regions in West Asia, particularly near the Iraq/Syria border. This study aims to improve the specification of this critical part of dust forecasts. A new source function based on multi-year analysis of satellite observations, called the West Asia source function (WASF), is therefore proposed to raise the quality of WRF-chem predictions in the region. WASF has been implemented in three dust schemes of WRF-chem. Remotely sensed and ground-based observations have been used to verify the horizontal and vertical extent and location of simulated dust clouds. Results indicate that WRF-chem performance is significantly improved in many areas after the implementation of WASF. The modified runs (long-term simulations over the summers of 2008-2012, using nudging) yielded an average increase in the Spearman correlation between observed and forecast aerosol optical thickness of 12-16 percentage points compared to control runs with standard source functions. They even outperform MACC and DREAM dust simulations over many dust source regions. However, the quality of the forecasts decreased with distance from sources, probably due to deficiencies in the transport and deposition characteristics of the forecast model in these areas.

  17. Beam energy considerations for gold nano-particle enhanced radiation treatment.

    PubMed

    Van den Heuvel, F; Locquet, Jean-Pierre; Nuyts, S

    2010-08-21

    A novel approach using nano-technology enhanced radiation modalities is investigated. The proposed methodology uses antibodies labeled with organically inert metals with a high atomic number. Irradiation using photons with energies in the kilo-electron volt (keV) range shows an increase in dose due to a combination of an increase in photo-electric interactions and a pronounced generation of Auger and/or Coster-Krönig (A-CK) electrons. The dependence of the dose deposition on various factors is investigated using Monte Carlo simulation models. The factors investigated include agent concentration, spectral dependence looking at mono-energetic sources as well as classical bremsstrahlung sources. The optimization of the energy spectrum is performed in terms of physical dose enhancement as well as the dose deposited by Auger and/or Coster-Krönig electrons and their biological effectiveness. A quasi-linear dependence on concentration and an exponential decrease within the target medium is observed. The maximal dose enhancement is dependent on the position of the target in the beam. Apart from irradiation with low-photon energies (10-20 keV) there is no added benefit from the increase in generation of Auger electrons. Interestingly, a regular 110 kVp bremsstrahlung spectrum shows a comparable enhancement in comparison with the optimized mono-energetic sources. In conclusion we find that the use of enhanced nano-particles shows promise to be implemented quite easily in regular clinics on a physical level due to the advantageous properties in classical beams.

  18. Beam energy considerations for gold nano-particle enhanced radiation treatment

    NASA Astrophysics Data System (ADS)

    Van den Heuvel, F.; Locquet, Jean-Pierre; Nuyts, S.

    2010-08-01

    A novel approach using nano-technology enhanced radiation modalities is investigated. The proposed methodology uses antibodies labeled with organically inert metals with a high atomic number. Irradiation using photons with energies in the kilo-electron volt (keV) range shows an increase in dose due to a combination of an increase in photo-electric interactions and a pronounced generation of Auger and/or Coster-Krönig (A-CK) electrons. The dependence of the dose deposition on various factors is investigated using Monte Carlo simulation models. The factors investigated include agent concentration, spectral dependence looking at mono-energetic sources as well as classical bremsstrahlung sources. The optimization of the energy spectrum is performed in terms of physical dose enhancement as well as the dose deposited by Auger and/or Coster-Krönig electrons and their biological effectiveness. A quasi-linear dependence on concentration and an exponential decrease within the target medium is observed. The maximal dose enhancement is dependent on the position of the target in the beam. Apart from irradiation with low-photon energies (10-20 keV) there is no added benefit from the increase in generation of Auger electrons. Interestingly, a regular 110 kVp bremsstrahlung spectrum shows a comparable enhancement in comparison with the optimized mono-energetic sources. In conclusion we find that the use of enhanced nano-particles shows promise to be implemented quite easily in regular clinics on a physical level due to the advantageous properties in classical beams.

  19. Estimation of methane flux from fish ponds of southwestern Taiwan

    NASA Astrophysics Data System (ADS)

    Huang, K. H.; Hung, C. C.

    2016-02-01

    CH4 is a trace gas in the atmosphere, but it is an important greenhouse gas, roughly 15 times more effective than CO2 at absorbing infrared radiation. To date, scientists have generally considered methane production to come mainly from livestock farming, such as pigs and cattle, while methane emission from aquaculture ponds has been ignored. Due to overfishing in the ocean, coastal aquaculture has been increasing globally, yet methane emission from fish ponds has seldom been studied. To better evaluate methane emission from fish ponds, we measured methane concentrations in both the atmosphere and fish ponds of southwestern Taiwan from March to September 2015. Apart from one extremely high flux (829 mmol/m2/d), the fluxes of methane in different fish ponds ranged from 19 to 725 μmol/m2/d, lower than the global mean value for lakes (2.7 mmol/m2/d). The low methane fluxes during the sampling period may reflect the non-harvest season; when the harvest season comes, the trophic status rises and more organic matter is available for methanogenesis. The origin of the extremely high methane flux is not yet known; we will measure carbon isotopes to trace the sources of the highest methane fluxes. Overall, these preliminary results provide substantive evidence that methane emission from aquaculture ponds could be an important source that warrants long-term investigation.

  20. Long Term 2 Second Round Source Water Monitoring and Bin Placement Memo

    EPA Pesticide Factsheets

    The Long Term 2 Enhanced Surface Water Treatment Rule (LT2ESWTR) applies to all public water systems served by a surface water source or public water systems served by a ground water source under the direct influence of surface water.

  1. Assessing the short-term clock drift of early broadband stations with burst events of the 26 s persistent and localized microseism

    NASA Astrophysics Data System (ADS)

    Xie, Jun; Ni, Sidao; Chu, Risheng; Xia, Yingjie

    2018-01-01

    Accurate seismometer clocks play an important role in seismological studies, including earthquake location and tomography. However, some seismic stations may have clock drift larger than 1 s (e.g. GSC in 1992), especially in the early days of global seismic networks. The 26 s Persistent Localized (PL) microseism event in the Gulf of Guinea sometimes excites strong and coherent signals and can be used as a repeating source for assessing the stability of seismometer clocks. Taking stations GSC, PAS and PFO in the TERRAscope network as examples, the 26 s PL signal can easily be observed in the ambient noise cross-correlation function between these stations and a remote station, OBN, at an interstation distance of about 9700 km. The travel-time variation of this 26 s signal in the ambient noise cross-correlation function is used to infer clock error. A drastic clock error is detected during June 1992 for station GSC, but not for stations PAS and PFO. This short-term clock error, with a magnitude of 25 s, is confirmed by both teleseismic and local earthquake records. Averaged over the three stations, the accuracy of the ambient noise cross-correlation method with the 26 s source is about 0.3-0.5 s. Using this PL source, the clock can be validated for historical records of sparsely distributed stations, where the usual cross-correlation of short-period (<20 s) ambient noise might be less effective due to its attenuation over long interstation distances. However, this method suffers from a cycle-ambiguity (cycling) problem and should be verified against teleseismic/local P waves. Further studies are also needed to investigate whether the 26 s source moves spatially and how that would affect clock drift detection.
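
The core measurement described above can be sketched with synthetic data: cross-correlate a reference trace against a later, clock-shifted trace of the same repeating 26 s signal and read the clock error from the lag of the correlation peak. Signal parameters and the injected 3 s error below are illustrative, not values from the study.

```python
# Clock-drift detection with a repeating narrowband source: the lag of
# the cross-correlation peak between two traces of the same 26 s signal
# gives the clock error. Because the signal is nearly monochromatic,
# the lag is ambiguous modulo one 26 s cycle (the "cycling" problem),
# so we restrict the search to +-half a period.
import numpy as np

dt = 1.0                          # sample interval, s
t = np.arange(0.0, 2600.0, dt)    # 100 cycles of the 26 s signal
period = 26.0
clock_error = 3.0                 # injected error: later clock runs 3 s late

ref = np.sin(2.0 * np.pi * t / period)
shifted = np.sin(2.0 * np.pi * (t - clock_error) / period)

xcorr = np.correlate(shifted, ref, mode="full")
lags = np.arange(-len(t) + 1, len(t)) * dt
keep = np.abs(lags) <= period / 2.0      # avoid the cycle ambiguity
best = lags[keep][np.argmax(xcorr[keep])]
print(f"estimated clock error: {best:.1f} s")
```

A 25 s error like the one reported for GSC is nearly a full cycle, which is exactly why the abstract recommends verifying the result against teleseismic/local P waves.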

  2. Surveillance for cancer recurrence in long-term young breast cancer survivors randomly selected from a statewide cancer registry.

    PubMed

    Jones, Tarsha; Duquette, Debra; Underhill, Meghan; Ming, Chang; Mendelsohn-Victor, Kari E; Anderson, Beth; Milliron, Kara J; Copeland, Glenn; Janz, Nancy K; Northouse, Laurel L; Duffy, Sonia M; Merajver, Sofia D; Katapodi, Maria C

    2018-05-01

    This study examined clinical breast exam (CBE) and mammography surveillance in long-term young breast cancer survivors (YBCS) and identified barriers and facilitators to cancer surveillance practices. Data were collected with a self-administered survey from a statewide, randomly selected sample of YBCS diagnosed with invasive breast cancer or ductal carcinoma in situ when younger than 45 years old, stratified by race (Black vs. White/Other). Multivariate logistic regression models identified predictors of annual CBEs and mammograms. Among 859 YBCS (n = 340 Black; n = 519 White/Other; mean age = 51.0 ± 5.9; diagnosed 11.0 ± 4.0 years ago), the majority (> 85%) reported an annual CBE and a mammogram. Black YBCS in the study were more likely to report lower rates of annual mammography and more barriers accessing care compared to White/Other YBCS. Having a routine source of care, confidence to use healthcare services, perceived expectations from family members and healthcare providers to engage in cancer surveillance, and motivation to comply with these expectations were significant predictors of having annual CBEs and annual mammograms. Cost-related lack of access to care was a significant barrier to annual mammograms. A routine source of post-treatment care facilitated breast cancer surveillance above national average rates. Persistent disparities regarding access to mammography surveillance were identified for Black YBCS, primarily due to lack of access to a routine source of care and high out-of-pocket costs. Public health action targeting cancer surveillance in YBCS should ensure a routine source of post-treatment care and address cost-related barriers. Clinical Trials Registration Number: NCT01612338.

  3. Where did all the Nitrogen go? Use of Watershed-Scale Budgets to Quantify Nitrogen Inputs, Storages, and Losses.

    NASA Astrophysics Data System (ADS)

    Boyer, E. W.; Goodale, C. L.; Howarth, R. W.; VanBreemen, N.

    2001-12-01

    Inputs of nitrogen (N) to aquatic and terrestrial ecosystems have increased during recent decades, primarily from the production and use of fertilizers, the planting of N-fixing crops, and the combustion of fossil fuels. We present mass-balance budgets of N for 16 catchments along a latitudinal profile from Maine to Virginia, which encompass a range of climatic variability and are major drainages to the coast of the North Atlantic Ocean. We quantify inputs of N to each catchment from atmospheric deposition, application of nitrogenous fertilizers, biological nitrogen fixation by crops and trees, and import of N in agricultural products (food and feed). We relate these input terms to losses of N (total, organic, and nitrate) in streamflow. The importance of the relative N sources to N exports varies widely by watershed and is related to land use. Atmospheric deposition was the largest source of N to the forested catchments of northern New England (e.g., Penobscot and Kennebec); import of N in food was the largest source of N to the more populated regions of southern New England (e.g., Charles and Blackstone); and agricultural inputs were the dominant N sources in the Mid-Atlantic region (e.g., Schuylkill and Potomac). In all catchments, N inputs greatly exceed outputs, implying additional loss terms (e.g., denitrification or volatilization and transport of animal wastes) or changes in internal N stores (e.g., accumulation of N in vegetation, soil, or groundwater). We use our N budgets and several modeling approaches to constrain estimates of the fate of this excess N, including estimates of N storage in accumulating woody biomass, N losses due to in-stream denitrification, and more. This work is an effort of the SCOPE Nitrogen Project.
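
The budget arithmetic described above reduces to summing the four input terms and subtracting riverine export; the residual must be stored or lost. The flux values below are illustrative placeholders, not numbers from the study.

```python
# Minimal watershed N mass balance: total inputs minus streamflow
# export leaves a residual that must be stored (vegetation, soil,
# groundwater) or lost (e.g. denitrification, animal-waste transport).
# All fluxes (kg N/ha/yr) are illustrative, not values from the study.

inputs = {
    "atmospheric_deposition": 8.0,
    "fertilizer_application": 10.0,
    "biological_fixation": 5.0,
    "net_food_feed_import": 6.0,
}
riverine_export = 7.0

total_in = sum(inputs.values())           # 29.0 kg N/ha/yr
unaccounted = total_in - riverine_export  # 22.0 kg N/ha/yr stored or lost
print(f"inputs {total_in:.1f}, export {riverine_export:.1f}, "
      f"residual {unaccounted:.1f} kg N/ha/yr")
```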

  4. In Situ NAPL Modification for Contaminant Source-Zone Passivation, Mass Flux Reduction, and Remediation

    NASA Astrophysics Data System (ADS)

    Mateas, D. J.; Tick, G.; Carroll, K. C.

    2016-12-01

    A remediation method was developed to reduce the aqueous solubility and mass flux of target NAPL contaminants through the in-situ creation of a NAPL-mixture source zone. The method was tested in the laboratory using equilibrium batch tests and two-dimensional flow-cell experiments. Two different NAPL-mixture source zones were tested: 1) volumes of relatively insoluble n-hexadecane (HEX) or vegetable oil (VO) were injected into a trichloroethene (TCE) contaminant source zone; and 2) pre-determined HEX-TCE and VO-TCE mixture-ratio source zones were emplaced in the flow cell prior to water flushing. NAPL-aqueous phase batch tests were conducted prior to the flow-cell experiments to evaluate the effects of various NAPL mixture ratios on equilibrium aqueous-phase concentrations of TCE and toluene (TOL) and to design optimal NAPL (HEX or VO) injection volumes for the flow-cell experiments. Uniform NAPL-mixture source zones were able to quickly decrease contaminant mass flux, as demonstrated by the emplaced source-zone experiments. The success of the HEX and VO injections in decreasing mass flux depended on the ability of these injectants to mix homogeneously with the TCE source zone. Upon injection, both HEX and VO migrated away from the source zone to some extent. However, the lack of a steady-state dissolution phase and the inefficient mass-flux-reduction/mass-removal behavior produced after VO injection suggest that VO was more effective than HEX at mixing and partitioning within the source-zone region to form a more homogeneous NAPL mixture with TCE. VO appears to be a promising source-zone injectant NAPL due to its negligible long-term toxicity and lower mobilization potential.
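
The solubility reduction that this method exploits is conventionally estimated with Raoult's law for ideal NAPL mixtures (the abstract does not name the model, so this is an interpretive sketch): the effective aqueous concentration of a compound equals its mole fraction in the NAPL times its pure-phase solubility.

```python
# Raoult's-law estimate of effective aqueous solubility in a NAPL
# mixture. The pure-phase TCE solubility used below (~1100 mg/L) is an
# approximate handbook value, and the mole fraction is illustrative;
# neither is a measurement from the study.

def effective_solubility(mole_fraction, pure_solubility_mg_l):
    """Ideal (Raoult's law) effective aqueous solubility, mg/L."""
    return mole_fraction * pure_solubility_mg_l

s_tce_pure = 1100.0  # mg/L, approximate pure-phase TCE solubility

# Diluting TCE to a 0.2 mole fraction with a low-solubility NAPL
# (e.g. hexadecane or vegetable oil) cuts the dissolution driving force:
print(effective_solubility(0.2, s_tce_pure))
```

Real HEX-TCE or VO-TCE mixtures deviate from ideality, which is one reason the batch tests in the study were needed to calibrate injection volumes.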

  5. Uncertainty, variability, and earthquake physics in ground‐motion prediction equations

    USGS Publications Warehouse

    Baltay, Annemarie S.; Hanks, Thomas C.; Abrahamson, Norm A.

    2017-01-01

    Residuals between ground‐motion data and ground‐motion prediction equations (GMPEs) can be decomposed into terms representing earthquake source, path, and site effects. These terms can be cast in terms of repeatable (epistemic) residuals and the random (aleatory) components. Identifying the repeatable residuals leads to a GMPE with reduced uncertainty for a specific source, site, or path location, which in turn can yield a lower hazard level at small probabilities of exceedance. We illustrate a schematic framework for this residual partitioning with a dataset from the ANZA network, which straddles the central San Jacinto fault in southern California. The dataset consists of more than 3200 1.15≤M≤3 earthquakes and their peak ground accelerations (PGAs), recorded at close distances (R≤20  km). We construct a small‐magnitude GMPE for these PGA data, incorporating VS30 site conditions and geometrical spreading. Identification and removal of the repeatable source, path, and site terms yield an overall reduction in the standard deviation from 0.97 (in ln units) to 0.44, for a nonergodic assumption, that is, for a single‐source location, single site, and single path. We give examples of relationships between independent seismological observables and the repeatable terms. We find a correlation between location‐based source terms and stress drops in the San Jacinto fault zone region; an explanation of the site term as a function of kappa, the near‐site attenuation parameter; and a suggestion that the path component can be related directly to elastic structure. These correlations allow the repeatable source location, site, and path terms to be determined a priori using independent geophysical relationships. Those terms could be incorporated into location‐specific GMPEs for more accurate and precise ground‐motion prediction.
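
The residual-partitioning idea above can be demonstrated with synthetic data: build residuals from repeatable per-event and per-site terms plus random scatter, then remove the repeatable parts as group means and watch the standard deviation shrink, mirroring the reported drop from 0.97 to 0.44. The data and term sizes below are synthetic, not the ANZA dataset, and the path term is folded into the random component for brevity.

```python
# Sketch of decomposing ground-motion residuals into repeatable
# (epistemic) source and site terms plus aleatory scatter. Removing
# the repeatable terms, estimated as per-event and per-station mean
# residuals, reduces the remaining standard deviation.
import numpy as np

rng = np.random.default_rng(0)
n_events, n_stations = 200, 30

event_term = rng.normal(0.0, 0.6, n_events)    # repeatable source terms
site_term = rng.normal(0.0, 0.5, n_stations)   # repeatable site terms
aleatory = rng.normal(0.0, 0.4, (n_events, n_stations))

# total residual (ln units) for each event-station pair
resid = event_term[:, None] + site_term[None, :] + aleatory

sigma_total = resid.std()
# remove repeatable terms as row (per-event) and column (per-site) means
de_evented = resid - resid.mean(axis=1, keepdims=True)
single_site = de_evented - de_evented.mean(axis=0, keepdims=True)
sigma_nonergodic = single_site.std()
print(f"sigma: {sigma_total:.2f} -> {sigma_nonergodic:.2f}")
```

In practice the repeatable terms are estimated with mixed-effects regression rather than plain group means, but the mechanism of the uncertainty reduction is the same.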

  6. Long range laser propagation: power scaling and beam quality issues

    NASA Astrophysics Data System (ADS)

    Bohn, Willy L.

    2010-09-01

    This paper addresses long-range laser propagation applications in which power and, in particular, beam quality play a major role; the required power level is defined by the specific mission under consideration. I restrict myself to the following application areas: (1) remote sensing / space-based LIDAR, (2) space debris removal, (3) energy transmission, and (4) directed energy weapons. Typical examples of space-based LIDARs are the ESA ADM-Aeolus mission, using the ALADIN Nd:YAG laser at its third harmonic of 355 nm, and the NASA 2 μm Tm:Ho:LuLiF convectively cooled solid-state laser. Space debris removal has attracted more attention in recent years due to the dangerous accumulation of debris in orbit, which has become a threat to satellites and the ISS space station. High-power, high-brightness lasers may address this problem by partially ablating the debris material, generating an impulse that eventually de-orbits the debris, with subsequent disintegration in the lower atmosphere. Energy transmission via laser beam from space to Earth has long been discussed as a novel long-term approach to the energy problem on Earth; in addition, orbital transfer and stationkeeping are among the more mid-term applications of high-power laser beams. Finally, directed energy weapons are coming closer to reality as the corresponding laser sources have matured thanks to recent efforts in the JHPSSL program. All of this can be realized only if the laser sources fulfill the necessary power requirements while keeping the beam quality as close as possible to the diffraction-limited value, and that is the rationale and motivation of this paper.

  7. Determining the contribution of long-range transport, regional and local source areas, to PM10 mass loading in Hessen, Germany using a novel multi-receptor based statistical approach

    NASA Astrophysics Data System (ADS)

    Garg, Saryu; Sinha, Baerbel

    2017-10-01

    This study uses two newly developed statistical source apportionment models, MuSAM and MuReSAM, to perform quantitative statistical source apportionment of PM10 at multiple receptor sites in South Hessen. MuSAM uses multi-site back trajectory data to quantify the contribution of long-range transport, while MuReSAM uses wind speed and direction as proxy for regional transport and quantifies the contribution of regional source areas. On average, between 7.8 and 9.1 μg/m3 of PM10 (∼50%) at receptor sites in South Hessen is contributed by long-range transport. The dominant source regions are Eastern, South Eastern, and Southern Europe. 32% of the PM10 at receptor sites in South Hessen is contributed by regional source areas (2.8-9.41 μg/m3). This fraction varies from <20% at remote sites to >40% for urban stations. Sources located within a 2 km radius around the receptor site are responsible for 7%-20% of the total PM10 mass (0.7-4.4 μg/m3). The perturbation study of the traffic flow due to the closing and reopening of the Schiersteiner Brücke revealed that the contribution of the bridge to PM10 mass loadings at two nearby receptor sites increased by approximately 120% after it reopened and became a bottleneck, although in absolute terms, the increase is small.

  8. Development of episodic and autobiographical memory: The importance of remembering forgetting

    PubMed Central

    Bauer, Patricia J.

    2015-01-01

    Some memories of the events of our lives have a long shelf-life—they remain accessible to recollection even after long delays. Yet many of our other experiences are forgotten, sometimes very soon after they take place. In spite of the prevalence of forgetting, theories of the development of episodic and autobiographical memory largely ignore it as a potential source of variance in explaining age-related variability in long-term recall. They focus instead on what may be viewed as positive developmental changes, that is, changes that result in improvements in the quality of memory representations that are formed. The purpose of this review is to highlight the role of forgetting as an important variable in understanding the development of episodic and autobiographical memory. Forgetting processes are implicated as a source of variability in long-term recall due to the protracted course of development of the neural substrate responsible for transformation of fleeting experiences into memory traces that can be integrated into long-term stores and retrieved at later points in time. It is logical to assume that while the substrate is developing, neural processing is relatively inefficient and ineffective, resulting in loss of information from memory (i.e., forgetting). For this reason, a focus on developmental increases in the quality of representations of past events and experiences will tell only a part of the story of how memory develops. A more complete account is afforded when we also consider changes in forgetting. PMID:26644633

  9. Daily trends and source apportionment of ultrafine particulate mass (PM0.1) over an annual cycle in a typical California city.

    PubMed

    Kuwayama, Toshihiro; Ruehl, Chris R; Kleeman, Michael J

    2013-12-17

    Toxicology studies indicate that inhalation of ultrafine particles (Dp < 0.1 μm) causes adverse health effects, presumably due to their large surface area-to-volume ratio that can drive heterogeneous reactions. Epidemiological associations between ultrafine particles and health effects, however, have been difficult to identify due to the lack of appropriate long-term monitoring and exposure data. The majority of the existing ultrafine particle epidemiology studies are based on exposure to particle number, although an independent analysis suggests that ultrafine particle mass (PM0.1) correlates better with particle surface area. More information is needed to characterize PM0.1 exposure to fully evaluate the health effects of ultrafine particles using epidemiology. The present study summarizes 1 year of daily PM0.1 chemistry and source apportionment at Sacramento, CA, USA. Positive matrix factorization (PMF) was used to resolve PM0.1 source contributions from old-technology diesel engines, residential wood burning, rail, regional traffic, and brake wear/road dust. Diesel PM0.1 and total PM0.1 concentrations were reduced by 97 and 26%, respectively, as a result of the adoption of cleaner diesel technology. The strong linear correlation between PM0.1 and particle surface area in central California suggests that the adoption of clean diesel engines reduced particle surface area by similar amounts. PM0.1 sulfate reduction occurred as a result of reduced primary particle surface area available for sulfate condensation. The current study demonstrates the capability of measuring PM0.1 source contributions over a 12 month period and identifies the extended benefits of emissions reduction efforts for diesel engines on ambient concentrations of primary and secondary PM0.1.
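    Positive matrix factorization, as used in the study above, decomposes the sample-by-species concentration matrix into non-negative source contributions and source profiles. The following is a minimal sketch of that idea using unweighted multiplicative-update NMF on synthetic data; real PMF additionally weights each matrix entry by its measurement uncertainty, and every name and value here is illustrative rather than taken from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic example: 100 daily samples x 8 chemical species,
# generated from 3 hypothetical "true" source profiles.
G_true = rng.uniform(0, 5, (100, 3))   # source contributions per sample
F_true = rng.uniform(0, 1, (3, 8))     # source profiles (species fractions)
X = G_true @ F_true                    # "measured" concentration matrix

def nmf(X, k, iters=2000, eps=1e-9):
    """Unweighted multiplicative-update NMF: X ~ G @ F with G, F >= 0."""
    G = rng.uniform(0.1, 1, (X.shape[0], k))
    F = rng.uniform(0.1, 1, (k, X.shape[1]))
    for _ in range(iters):
        G *= (X @ F.T) / (G @ F @ F.T + eps)
        F *= (G.T @ X) / (G.T @ G @ F + eps)
    return G, F

G, F = nmf(X, k=3)
rel_err = np.linalg.norm(X - G @ F) / np.linalg.norm(X)
print(f"relative reconstruction error: {rel_err:.4f}")
```

The recovered factors are only defined up to scaling and permutation, which is why published PMF work (as in the abstract) interprets factors by inspecting profiles against known source signatures.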

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parworth, Caroline; Tilp, Alison; Fast, Jerome

    In this study the long-term trends of non-refractory submicrometer aerosol (NR-PM1) composition and mass concentration measured by an Aerosol Chemical Speciation Monitor (ACSM) at the Atmospheric Radiation Measurement (ARM) program's Southern Great Plains (SGP) site are discussed. NR-PM1 data were recorded at ~30 min intervals over a period of 19 months between November 2010 and June 2012. Positive Matrix Factorization (PMF) was performed on the measured organic mass spectral matrix using a rolling window technique to derive factors associated with distinct sources, evolution processes, and physiochemical properties. The rolling window approach also allows us to capture the dynamic variations of the chemical properties in the organic aerosol (OA) factors over time. Three OA factors were obtained, including two oxygenated OA (OOA) factors, differing in degrees of oxidation, and a biomass burning OA (BBOA) factor. Back trajectory analyses were performed to investigate possible sources of major NR-PM1 species at the SGP site. Organics dominated NR-PM1 mass concentration for the majority of the study with the exception of winter, when ammonium nitrate increases due to transport of precursor species from surrounding urban and agricultural areas and also due to cooler temperatures. Sulfate mass concentrations have little seasonal variation with mixed regional and local sources. In the spring, BBOA emissions increase and are mainly associated with local fires. Isoprene and carbon monoxide emission rates were obtained from the Model of Emissions of Gases and Aerosols from Nature (MEGAN) and the 2011 U.S. National Emissions Inventory to represent the spatial distribution of biogenic and anthropogenic sources, respectively. The combined spatial distribution of isoprene emissions and air mass trajectories suggests that biogenic emissions from the southeast contribute to SOA formation at the SGP site during the summer.

  11. Development, Evaluation, and Application of a Primary Aerosol Model.

    PubMed

    Wang, I T; Chico, T; Huang, Y H; Farber, R J

    1999-09-01

    The Segmented-Plume Primary Aerosol Model (SPPAM) has been developed over the past several years. The earlier model development goals were simply to generalize the widely used Industrial Source Complex Short-Term (ISCST) model to simulate plume transport and dispersion under light wind conditions and to handle a large number of roadway or line sources. The goals have been expanded to include development of an improved algorithm for effective plume transport velocity, more accurate and efficient line and area source dispersion algorithms, and, recently, a more realistic and computationally efficient algorithm for plume depletion due to particle dry deposition. A performance evaluation of the SPPAM has been carried out using the 1983 PNL dual tracer experimental data. The results show the model predictions to be in good agreement with observations in both plume advection-dispersion and particulate matter (PM) depletion by dry deposition. For PM2.5 impact analysis, the SPPAM has been applied to the Rubidoux area of California. Emission sources included in the modeling analysis are paved road dust, diesel vehicular exhaust, gasoline vehicular exhaust, and tire wear particles from a large number of roadways in Rubidoux and surrounding areas. For the selected modeling periods, the predicted primary PM2.5 to primary PM10 concentration ratios for the Rubidoux sampling station are in the range of 0.39-0.46. The organic fractions of the primary PM2.5 impacts are estimated to be at least 34-41%. Detailed modeling results indicate that the relatively high organic fractions are primarily due to the proximity of heavily traveled roadways north of the sampling station. The predictions are influenced by a number of factors; principal among them are the receptor locations relative to major roadways, the volume and composition of traffic on these roadways, and the prevailing meteorological conditions.

  12. Uncertainty assessment of source attribution of PM(2.5) and its water-soluble organic carbon content using different biomass burning tracers in positive matrix factorization analysis--a case study in Beijing, China.

    PubMed

    Tao, Jun; Zhang, Leiming; Zhang, Renjian; Wu, Yunfei; Zhang, Zhisheng; Zhang, Xiaoling; Tang, Yixi; Cao, Junji; Zhang, Yuanhang

    2016-02-01

    Daily PM2.5 samples were collected at an urban site in Beijing during four one-month periods in 2009-2010, with each period in a different season. Samples were subject to chemical analysis for various chemical components including major water-soluble ions, organic carbon (OC) and water-soluble organic carbon (WSOC), element carbon (EC), trace elements, anhydrosugar levoglucosan (LG), and mannosan (MN). Three sets of source profiles of PM2.5 were first identified through positive matrix factorization (PMF) analysis using single or combined biomass tracers - non-sea salt potassium (nss-K(+)), LG, and a combination of nss-K(+) and LG. The six major source factors of PM2.5 included secondary inorganic aerosol, industrial pollution, soil dust, biomass burning, traffic emission, and coal burning, which were estimated to contribute 31±37%, 39±28%, 14±14%, 7±7%, 5±6%, and 4±8%, respectively, to PM2.5 mass if using the nss-K(+) source profiles, 22±19%, 29±17%, 20±20%, 13±13%, 12±10%, and 4±6%, respectively, if using the LG source profiles, and 21±17%, 31±18%, 19±19%, 11±12%, 14±11%, and 4±6%, respectively, if using the combined nss-K(+) and LG source profiles. The uncertainties in the estimation of biomass burning contributions to WSOC due to the different choices of biomass burning tracers were around 3% annually and up to 24% seasonally in terms of absolute percentage contributions, or a factor of 1.7 annually and up to a factor of 3.3 seasonally in terms of the actual concentrations. The uncertainty from the major source (e.g. industrial pollution) was a factor of 1.9 annually and up to a factor of 2.5 seasonally in the estimated WSOC concentrations. Copyright © 2015 Elsevier B.V. All rights reserved.

  13. C-arm based cone-beam CT using a two-concentric-arc source trajectory: system evaluation

    NASA Astrophysics Data System (ADS)

    Zambelli, Joseph; Zhuang, Tingliang; Nett, Brian E.; Riddell, Cyril; Belanger, Barry; Chen, Guang-Hong

    2008-03-01

    The current x-ray source trajectory for C-arm based cone-beam CT is a single arc. Reconstruction from data acquired with this trajectory yields cone-beam artifacts for regions other than the central slice. In this work we present the preliminary evaluation of reconstruction from a source trajectory of two concentric arcs using a flat-panel detector equipped C-arm gantry (GE Healthcare Innova 4100 system, Waukesha, Wisconsin). The reconstruction method employed is a summation of FDK-type reconstructions from the two individual arcs. For the angle between arcs studied here, 30°, this method offers a significant reduction in the visibility of cone-beam artifacts, with the additional advantages of simplicity and ease of implementation due to the fact that it is a direct extension of the reconstruction method currently implemented on commercial systems. Reconstructed images from data acquired from the two arc trajectory are compared to those reconstructed from a single arc trajectory and evaluated in terms of spatial resolution, low contrast resolution, noise, and artifact level.

  14. C-arm based cone-beam CT using a two-concentric-arc source trajectory: system evaluation.

    PubMed

    Zambelli, Joseph; Zhuang, Tingliang; Nett, Brian E; Riddell, Cyril; Belanger, Barry; Chen, Guang-Hong

    2008-01-01

    The current x-ray source trajectory for C-arm based cone-beam CT is a single arc. Reconstruction from data acquired with this trajectory yields cone-beam artifacts for regions other than the central slice. In this work we present the preliminary evaluation of reconstruction from a source trajectory of two concentric arcs using a flat-panel detector equipped C-arm gantry (GE Healthcare Innova 4100 system, Waukesha, Wisconsin). The reconstruction method employed is a summation of FDK-type reconstructions from the two individual arcs. For the angle between arcs studied here, 30°, this method offers a significant reduction in the visibility of cone-beam artifacts, with the additional advantages of simplicity and ease of implementation due to the fact that it is a direct extension of the reconstruction method currently implemented on commercial systems. Reconstructed images from data acquired from the two arc trajectory are compared to those reconstructed from a single arc trajectory and evaluated in terms of spatial resolution, low contrast resolution, noise, and artifact level.

  15. Analysis of capital spending and capital financing among large US nonprofit health systems.

    PubMed

    Stewart, Louis J

    2012-01-01

    This article examines the recent trends (2006 to 2009) in capital spending among 25 of the largest nonprofit health systems in the United States and analyzes the financing sources that these large nonprofit health care systems used to fund their capital spending. Total capital spending for these 25 nonprofit health entities exceeded $41 billion for the four-year period of this study. Less than 3 percent of total capital spending resulted in mergers and acquisition activities. Total annual capital spending grew at an average annual rate of 17.6 percent during the first three years of this study's period of analysis. Annual capital spending for 2009 fell by more than 22 percent from the prior year's level due to the impact of widespread disruption in US tax-exempt variable rate debt markets. While cash inflow from long-term debt issues was a significant source of capital financing, this study's primary finding was that operating cash flow was the predominant source of funding for capital spending. Key words: nonprofit, mergers and acquisitions (M&A), capital spending, capital financing.

  16. Well water quality in rural Nicaragua using a low-cost bacterial test and microbial source tracking.

    PubMed

    Weiss, Patricia; Aw, Tiong Gim; Urquhart, Gerald R; Galeano, Miguel Ruiz; Rose, Joan B

    2016-04-01

    Water-related diseases, particularly diarrhea, are major contributors to morbidity and mortality in developing countries. Monitoring water quality on a global scale is crucial to making progress in terms of population health. Traditional analytical methods are difficult to use in many regions of the world in low-resource settings that face severe water quality issues due to the inaccessibility of laboratories. This study aimed to evaluate a new low-cost method (the compartment bag test (CBT)) in rural Nicaragua. The CBT was used to quantify the presence of Escherichia coli in drinking water wells and to determine the source(s) of any microbial contamination. Results indicate that the CBT is a viable method for use in remote rural regions. The overall quality of well water in Pueblo Nuevo, Nicaragua was deemed unsafe, and results led to the conclusion that animal fecal wastes may be one of the leading causes of well contamination. Elevation and depth of wells were not found to impact overall water quality. However, rope-pump wells had a 64.1% reduction in contamination when compared with simple wells.

  17. Redistribution of energy available for ocean mixing by long-range propagation of internal waves.

    PubMed

    Alford, Matthew H

    2003-05-08

    Ocean mixing, which affects pollutant dispersal, marine productivity and global climate, largely results from the breaking of internal gravity waves--disturbances propagating along the ocean's internal stratification. A global map of internal-wave dissipation would be useful in improving climate models, but would require knowledge of the sources of internal gravity waves and their propagation. Towards this goal, I present here computations of horizontal internal-wave propagation from 60 historical moorings and relate them to the source terms of internal waves as computed previously. Analysis of the two most energetic frequency ranges--near-inertial frequencies and semidiurnal tidal frequencies--reveals that the fluxes in both frequency bands are of the order of 1 kW x m(-1) (that is, 15-50% of the energy input) and are directed away from their respective source regions. However, the energy flux due to near-inertial waves is stronger in winter, whereas the tidal fluxes are uniform throughout the year. Both varieties of internal waves can thus significantly affect the space-time distribution of energy available for global mixing.

  18. Near-field sound radiation of fan tones from an installed turbofan aero-engine.

    PubMed

    McAlpine, Alan; Gaffney, James; Kingan, Michael J

    2015-09-01

    The development of a distributed source model to predict fan tone noise levels of an installed turbofan aero-engine is reported. The key objective is to examine a canonical problem: how to predict the pressure field due to a distributed source located near an infinite, rigid cylinder. This canonical problem is a simple representation of an installed turbofan, where the distributed source is based on the pressure pattern generated by a spinning duct mode, and the rigid cylinder represents an aircraft fuselage. The radiation of fan tones can be modelled in terms of spinning modes. In this analysis, based on duct modes, theoretical expressions for the near-field acoustic pressures on the cylinder, or at the same locations without the cylinder, have been formulated. Simulations of the near-field acoustic pressures are compared against measurements obtained from a fan rig test. Also, the installation effect is quantified by calculating the difference in the sound pressure levels with and without the adjacent cylindrical fuselage. Results are shown for the blade passing frequency fan tone radiated at a supersonic fan operating condition.

  19. The Potential for Engineering Enhanced Functional-Feed Soybeans for Sustainable Aquaculture Feed.

    PubMed

    Herman, Eliot M; Schmidt, Monica A

    2016-01-01

    Aquaculture is the most rapidly growing segment of global animal production; it now surpasses wild-capture fisheries production and is continuing to grow 10% annually. Sustainable aquaculture needs to diminish, and progressively eliminate, its dependence on fishmeal-sourced feed from over-harvested fisheries. Sustainable aquafeed sources will need to be primarily of plant origin. Soybean is currently the primary global vegetable-origin protein source for aquaculture. Direct exchange of soybean meal for fishmeal in aquafeed has resulted in reduced growth rates, due in part to soybean's anti-nutritional proteins. To produce soybeans for use in aquaculture feeds, a new conventional line, termed Triple Null, has been bred by stacking null alleles for the feed-relevant proteins Kunitz Trypsin Inhibitor, lectin, and P34 allergen. Triple Null is now being further enhanced as a platform to build additional transgene traits for vaccines and altered protein composition, and to produce high levels of β-carotene, an intrinsic orange-colored aquafeed marker that distinguishes the seeds from commodity beans and serves as the metabolic feedstock precursor of highly valued astaxanthin.

  20. Exact Closed-form Solutions for Lamb's Problem

    NASA Astrophysics Data System (ADS)

    Feng, Xi; Zhang, Haiming

    2018-04-01

    In this article, we report on an exact closed-form solution for the displacement at the surface of an elastic half-space elicited by a buried point source that acts at some point underneath that surface. This is commonly referred to as the 3-D Lamb's problem, for which previous solutions were restricted to sources and receivers placed at the free surface. By means of the reciprocity theorem, our solution should also be valid as a means to obtain the displacements at interior points when the source is placed at the free surface. We manage to obtain explicit results by expressing the solution in terms of elementary algebraic expression as well as elliptic integrals. We anchor our developments on Poisson's ratio 0.25 starting from Johnson's (1974) integral solutions which must be computed numerically. In the end, our closed-form results agree perfectly with the numerical results of Johnson (1974), which strongly confirms the correctness of our explicit formulas. It is hoped that in due time, these formulas may constitute a valuable canonical solution that will serve as a yardstick against which other numerical solutions can be compared and measured.

  1. Exact closed-form solutions for Lamb's problem

    NASA Astrophysics Data System (ADS)

    Feng, Xi; Zhang, Haiming

    2018-07-01

    In this paper, we report on an exact closed-form solution for the displacement at the surface of an elastic half-space elicited by a buried point source that acts at some point underneath that surface. This is commonly referred to as the 3-D Lamb's problem for which previous solutions were restricted to sources and receivers placed at the free surface. By means of the reciprocity theorem, our solution should also be valid as a means to obtain the displacements at interior points when the source is placed at the free surface. We manage to obtain explicit results by expressing the solution in terms of elementary algebraic expression as well as elliptic integrals. We anchor our developments on Poisson's ratio 0.25 starting from Johnson's integral solutions which must be computed numerically. In the end, our closed-form results agree perfectly with the numerical results of Johnson, which strongly confirms the correctness of our explicit formulae. It is hoped that in due time, these formulae may constitute a valuable canonical solution that will serve as a yardstick against which other numerical solutions can be compared and measured.

  2. Relationship between source clean up and mass flux of chlorinated solvents in low permeability settings with fractures

    NASA Astrophysics Data System (ADS)

    Bjerg, P. L.; Chambon, J. C.; Christiansen, C. M.; Broholm, M. M.; Binning, P. J.

    2009-04-01

    Groundwater contamination by chlorinated solvents, such as perchloroethylene (PCE), often occurs via leaching from complex sources located in low permeability sediments such as clayey tills overlying aquifers. Clayey tills are mostly fractured, and contamination migrating through the fractures spreads to the low permeability matrix by diffusion. This results in a long term source of contamination due to back-diffusion. Leaching from such sources is further complicated by microbial degradation under anaerobic conditions to sequentially form the daughter products trichloroethylene, cis-dichloroethylene (cis-DCE), vinyl chloride (VC) and ethene. This process can be enhanced by addition of electron donors and/or bioaugmentation and is termed Enhanced Reductive Dechlorination (ERD). This work aims to improve our understanding of the physical, chemical and microbial processes governing source behaviour under natural and enhanced conditions. That understanding is applied to risk assessment, and to determine the relationship and time frames of source clean up and plume response. To meet that aim, field and laboratory observations are coupled to state of the art models incorporating new insights of contaminant behaviour. The long term leaching of chlorinated ethenes from clay aquitards is currently being monitored at a number of Danish sites. The observed data is simulated using a coupled fracture flow and clay matrix diffusion model. Sequential degradation is represented by modified Monod kinetics accounting for competitive inhibition between the chlorinated ethenes. The model is constructed using Comsol Multiphysics, a generic finite-element partial differential equation solver. The model is applied at well characterised field sites with respect to hydrogeology, fracture network, contaminant distribution and microbial processes (lab and field experiments).
    At one of the study sites (Sortebrovej), the source areas are situated in a clayey till with fractures and interbedded sand lenses. The site is highly contaminated with chlorinated ethenes which impact the underlying sand aquifer. Full scale remediation using ERD was implemented at Sortebrovej in 2006. Anaerobic dechlorination is taking place, and cis-DCE and VC have been found in significant amounts in monitoring wells and to some degree in sediment cores representing the clayey till matrix. Model results reveal several interesting findings. The physical processes of matrix diffusion and advection in the fractures seem to be more important than the microbial degradation processes for estimation of the time frames, and the distance between fractures is amongst the most sensitive model parameters. However, the inclusion of sequential degradation is crucial to determining the composition of contamination leaching into the underlying aquifer. Degradation products like VC will peak at an earlier stage compared to the parent compound due to their higher mobility. These model results are supported by actual findings at the Sortebrovej site. The findings highlight a need for improved characterization of low permeability aquitards lying above aquifers used for water supply. The fracture network in aquitards is currently poorly described at larger depths (below 5-8 m) and the effect of sand lenses on leaching behaviour is not well understood. The microbial processes are assumed to be taking place in the fracture system, but the interaction with and processes in the matrix need to be further explored. Development of new methods for field site characterisation and integrated field and model expertise are crucial for the design of remedial actions and for risk assessment of contaminated sites in low permeability settings.
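    The sequential dechlorination chain with competitive inhibition described above (PCE → TCE → cis-DCE → VC → ethene) can be sketched as a small kinetic model. This is an illustrative forward-Euler implementation of Monod kinetics with competitive inhibition, not the Comsol fracture-matrix model from the study; all rate constants and concentrations are invented for the example.

```python
import numpy as np

species = ["PCE", "TCE", "cDCE", "VC", "ETH"]   # ethene is the terminal product
kmax = np.array([1.0, 0.8, 0.5, 0.3])  # max degradation rates (1/d), assumed
Ks   = np.array([2.0, 2.0, 3.0, 5.0])  # half-saturation constants (uM), assumed

def rates(C):
    """Monod rates with competitive inhibition among the chlorinated ethenes."""
    r = np.zeros(4)
    for i in range(4):
        inhib = sum(C[j] / Ks[j] for j in range(4) if j != i)
        r[i] = kmax[i] * C[i] / (Ks[i] * (1.0 + inhib) + C[i])
    return r

# Forward-Euler integration of the sequential chain.
C = np.array([10.0, 0.0, 0.0, 0.0, 0.0])   # start with PCE only (uM)
dt, T = 0.01, 50.0
for _ in range(int(T / dt)):
    r = rates(C[:4])
    dC = np.zeros(5)
    dC[:4] -= r     # each chlorinated species is degraded at its own rate...
    dC[1:] += r     # ...and produced from its parent in the chain
    C = np.maximum(C + dt * dC, 0.0)

print({s: round(c, 3) for s, c in zip(species, C)})
```

Because every reaction converts one mole of parent into one mole of daughter, total molar concentration is conserved, which is a useful sanity check on any such scheme; intermediate daughters such as VC peak and then decline, matching the peaking behaviour the abstract describes.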

  3. Common Calibration Source for Monitoring Long-term Ozone Trends

    NASA Technical Reports Server (NTRS)

    Kowalewski, Matthew

    2004-01-01

    Accurate long-term satellite measurements are crucial for monitoring the recovery of the ozone layer. The slow pace of the recovery and limited lifetimes of satellite monitoring instruments demands that datasets from multiple observation systems be combined to provide the long-term accuracy needed. A fundamental component of accurately monitoring long-term trends is the calibration of these various instruments. NASA's Radiometric Calibration and Development Facility at the Goddard Space Flight Center has provided resources to minimize calibration biases between multiple instruments through the use of a common calibration source and standardized procedures traceable to national standards. The Facility's 50 cm barium sulfate integrating sphere has been used as a common calibration source for both US and international satellite instruments, including the Total Ozone Mapping Spectrometer (TOMS), Solar Backscatter Ultraviolet 2 (SBUV/2) instruments, Shuttle SBUV (SSBUV), Ozone Mapping Instrument (OMI), Global Ozone Monitoring Experiment (GOME) (ESA), Scanning Imaging SpectroMeter for Atmospheric ChartographY (SCIAMACHY) (ESA), and others. We will discuss the advantages of using a common calibration source and its effects on long-term ozone data sets. In addition, sphere calibration results from various instruments will be presented to demonstrate the accuracy of the long-term characterization of the source itself.

  4. Watershed nitrogen and phosphorus balance: The upper Potomac River basin

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jaworski, N.A.; Groffman, P.M.; Keller, A.A.

    1992-01-01

    Nitrogen and phosphorus mass balances were estimated for the portion of the Potomac River basin watershed located above Washington, D.C. The total nitrogen (N) balance included seven input source terms, six sinks, and one 'change-in-storage' term, but was simplified to five input terms and three output terms. The phosphorus (P) balance had four input and three output terms. The estimated balances are based on watershed data from seven information sources. Major sources of nitrogen are animal waste and atmospheric deposition. The major sources of phosphorus are animal waste and fertilizer. The major sink for nitrogen is combined denitrification, volatilization, and change-in-storage. The major sink for phosphorus is change-in-storage. River exports of N and P were 17% and 8%, respectively, of the total N and P inputs. Over 60% of the N and P were volatilized or stored. The major input and output terms on the budget are estimated from direct measurements, but the change-in-storage term is calculated by difference. The factors regulating retention and storage processes are discussed and research needs are identified.
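    The budget logic described above (measured inputs and exports, with the change-in-storage term closed by difference) is simple to express. The categories and numbers below are invented for illustration and are not the Potomac values from the study.

```python
# Hypothetical watershed nitrogen budget (units: kt N/yr); the values are
# illustrative only, chosen so the arithmetic is easy to follow.
inputs = {"atmospheric_deposition": 30, "animal_waste": 35, "fertilizer": 20,
          "point_sources": 10, "biological_fixation": 5}
outputs = {"river_export": 17, "crop_harvest_export": 15}

total_in = sum(inputs.values())
total_out = sum(outputs.values())

# Close the budget by difference: whatever is neither exported in the river
# nor harvested is lumped into denitrification/volatilization/storage.
change_in_storage = total_in - total_out
river_fraction = outputs["river_export"] / total_in

print(f"total inputs: {total_in}, residual (denitrif.+storage): {change_in_storage}")
print(f"river export = {river_fraction:.0%} of inputs")
```

The key design point, as in the abstract, is that the residual term inherits the combined uncertainty of every directly measured term, which is why the factors controlling retention and storage need independent study.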

  5. Measuring soil moisture near soil surface ... minor differences due to neutron source type

    Treesearch

    Robert R. Ziemer; Irving Goldberg; Norman A. MacGillivray

    1967-01-01

    Abstract - Moisture measurements were made in three media--paraffin, water, saturated sand--with four neutron moisture meters, each containing 226-radium-beryllium, 227-actinium-beryllium, 239-plutonium-beryllium, or 241-americium-beryllium neutron sources. Variability in surface detection by the different sources may be due to differences in neutron sources, in...

  6. Estimation of the isotopic composition and origins of winter precipitation over Japan using a regional isotope circulation model

    NASA Astrophysics Data System (ADS)

    Tanoue, M.; Ichiyanagi, K.; Yoshimura, K.; Shimada, J.; Hirabayashi, Y.

    2017-12-01

    Understanding the dynamics of the origins of precipitation (i.e., vapor source regions of evaporated moisture) is useful for long-term forecasting and calibration of the water isotope thermometer. In the Asian monsoon region, vapor source regions are identified by the deuterium excess (d-excess; defined as δD - 8 • δ18O) of precipitation because its values mainly reflect humidity conditions during evaporation at the source regions. In Japan, previous studies assumed the Sea of Japan to be the dominant source of winter precipitation when the d-excess value in winter is >20‰ or higher than the average value in summer. Because this assumption is based on an interpretation that the high d-excess value is due to an interaction between the continental winter monsoon (WM) and the warm sea surface of the Sea of Japan, it may not be appropriate for winter precipitation caused by extratropical cyclones (EC). Here, we utilized a regional isotope circulation model and then clarified local patterns of isotopic composition and the origins of precipitation in the WM and EC types over Japan. The results indicated that moisture originating from the Sea of Japan made the highest contribution to precipitation on the Sea of Japan side of Japan in the WM type, whereas the Pacific Ocean was the dominant source of precipitation over Japan in the EC type. Because d-excess values were higher in the WM than in the EC type, we can assume that the Sea of Japan was the dominant source of precipitation on the Sea of Japan side when the d-excess value was high. Because precipitation on the Pacific Ocean side and the Kyushu island of Japan was mainly caused by the EC type, we could not identify the dominant source of precipitation as the Sea of Japan from only the d-excess values in these regions. We also found that WM activity could be estimated from observed d-excess values due to a clear positive correlation between simulated d-excess values and WM activity.
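    The d-excess diagnostic used above is a one-line computation. A minimal sketch follows; the δ values are invented for illustration, and only the d-excess definition and the ~20‰ threshold come from the abstract.

```python
def d_excess(delta_D, delta_18O):
    """Deuterium excess (permil): d = δD - 8·δ18O."""
    return delta_D - 8.0 * delta_18O

# Illustrative values (permil): a high d-excess is read as a Sea of Japan
# moisture signature in winter-monsoon precipitation.
winter_monsoon = d_excess(-60.0, -10.5)   # -> 24.0, above the ~20 permil threshold
extratropical  = d_excess(-70.0, -10.5)   # -> 14.0, below the threshold
print(winter_monsoon, extratropical)
```

Note that two samples with identical δ18O can still differ sharply in d-excess, which is exactly what makes it useful for separating moisture sources when δ values alone are ambiguous.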

  7. Long-term risk in a recently active volcanic system: Evaluation of doses and indoor radiological risk in the quaternary Vulsini Volcanic District (Central Italy)

    NASA Astrophysics Data System (ADS)

    Capaccioni, B.; Cinelli, G.; Mostacci, D.; Tositti, L.

    2012-12-01

    Volcanic rocks in the Vulsini Volcanic District (Central Italy) contain high concentrations of 238U, 232Th and 40K due to subduction-related metasomatic enrichment of incompatible elements in the mantle source, coupled with magma differentiation within the upper crust. Due to their favorable mechanical properties, they have been extensively used for construction since the Etruscan age. In the old buildings of the Bolsena village, one of the most populated ancient villages in the area, the major source of indoor radioactivity is 222Rn, a radioactive noble gas descendant of 238U. Direct 222Rn indoor measurements have detected extremely high values in the old center due to the combined effect of building materials, radon fluxes from the volcanic basement and low air exchange rates. In these cases the evaluated risk of developing lung cancer within a 75-year lifetime reaches up to 40% for ever-smokers. Simulations of "standard rooms" built with different tuffs and lavas collected from the Vulsini Volcanic District have also provided estimates of the effective doses and lifetime risk for radiogenic cancer. Apart from the method adopted for calculation, the total evaluated risk for each volcanic rock depends on several parameters: radionuclide content, radon emanation power, occupancy factor and air exchange rate. Occupancy factor and air exchange rate appear to be the only controlling parameters able to mitigate the indoor radiological risk.

  8. Relativistic analogue of the Newtonian fluid energy equation with nucleosynthesis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cardall, Christian Y.

    In Newtonian fluid dynamics simulations in which composition has been tracked by a nuclear reaction network, energy generation due to composition changes has generally been handled as a separate source term in the energy equation. Here, a relativistic equation in conservative form for total fluid energy, obtained from the spacetime divergence of the stress-energy tensor, in principle encompasses such energy generation; but it is not explicitly manifest. An alternative relativistic energy equation in conservative form—in which the nuclear energy generation appears explicitly, and that reduces directly to the Newtonian internal+kinetic energy in the appropriate limit—emerges naturally and self-consistently from the difference of the equation for total fluid energy and the equation for baryon number conservation multiplied by the average baryon mass m, when m is expressed in terms of contributions from the nuclear species in the fluid, and allowed to be mutable.
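The verbal derivation above can be sketched schematically; the notation below (index conventions, the symbol τ for the energy density, and the flux symbol) is our assumption for illustration, not necessarily the paper's:

```latex
% Total fluid energy: time component of stress-energy conservation
\nabla_\mu T^{\mu 0} = 0, \qquad
% baryon number conservation, with N^\mu = n u^\mu:
\nabla_\mu \left( n u^\mu \right) = 0 .
% Multiply the second equation by the mutable average baryon mass
% m = \sum_s m_s Y_s and subtract; schematically, with
% \tau \equiv T^{00} - m\, n u^0 the internal+kinetic energy density,
\partial_t \tau + \partial_j F_\tau^{\,j}
  = -\, n u^\mu \partial_\mu m
  = -\, n u^\mu \sum_s m_s \,\partial_\mu Y_s ,
% so the nuclear energy generation (rest-mass change as the composition
% Y_s evolves) appears explicitly on the right-hand side.
```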

  9. High-heat-load monochromator options for the RIXS beamline at the APS with the MBA lattice

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Zunping, E-mail: zpliu@anl.gov; Gog, Thomas, E-mail: gog@aps.anl.gov; Stoupin, Stanislav A.

    2016-07-27

    With the MBA lattice for the APS-Upgrade, tuning curves of 2.6 cm period undulators meet the source requirements for the RIXS beamline. The high-heat-load monochromator (HHLM) is the first optical white-beam component. Four options are considered for the HHLM: diamond monochromators cooled with either water or liquid nitrogen (LN2), and silicon monochromators with either direct or indirect cooling. Their performances are evaluated at an energy of 11.215 keV (Ir L-III edge). The cryo-cooled diamond monochromator has performance similar to the water-cooled diamond monochromator because the GaIn of the Cu-GaIn-diamond interface becomes solid. The cryo-cooled silicon monochromators perform better, not only in terms of surface slope error due to thermal deformation, but also in terms of thermal capacity.

  10. Long-term transport behavior of psychoactive compounds in sewage-affected groundwater

    NASA Astrophysics Data System (ADS)

    Nham, Hang Thuy Thi; Greskowiak, Janek; Hamann, Enrico; Meffe, Raffaella; Hass, Ulrike; Massmann, Gudrun

    2016-11-01

    The present study provides a model-based characterization of the long-term transport behavior of five psychoactive compounds (meprobamate, pyrithyldione, primidone, phenobarbital and phenylethylmalonamide) introduced into groundwater via sewage irrigation in Berlin, Germany. Compounds are still present in the groundwater despite the sewage farm closure in the year 1980. Due to the limited information on (i) compound concentrations in the source water and (ii) substance properties, a total of 180 cross-sectional model realizations for each compound were carried out, covering a large range of possible parameter combinations. Results were compared with the present-day contamination patterns in the aquifer and the most likely scenarios were identified based on a number of model performance criteria. The simulation results show that (i) compounds are highly persistent under the present field conditions, and (ii) sorption is insignificant. Thus, back-diffusion from low permeability zones appears as the main reason for the compound retardation.

  11. The modelling and assessment of whale-watching impacts

    USGS Publications Warehouse

    New, Leslie; Hall, Ailsa J.; Harcourt, Robert; Kaufman, Greg; Parsons, E.C.M.; Pearson, Heidi C.; Cosentino, A. Mel; Schick, Robert S

    2015-01-01

    In recent years there has been significant interest in modelling cumulative effects and the population consequences of individual changes in cetacean behaviour and physiology due to disturbance. One potential source of disturbance that has garnered particular interest is whale-watching. Though perceived as ‘green’ or eco-friendly tourism, there is evidence that whale-watching can result in statistically significant and biologically meaningful changes in cetacean behaviour, raising the question of whether whale-watching is in fact a sustainable activity in the long term. However, an assessment of the impacts of whale-watching on cetaceans requires an understanding of the potential behavioural and physiological effects, data to effectively address the question, and suitable modelling techniques. Here, we review the current state of knowledge on the viability of long-term whale-watching, as well as logistical limitations and potential opportunities. We conclude that an integrated, coordinated approach will be needed to further understanding of the possible effects of whale-watching on cetaceans.

  12. New solution decomposition and minimization schemes for Poisson-Boltzmann equation in calculation of biomolecular electrostatics

    NASA Astrophysics Data System (ADS)

    Xie, Dexuan

    2014-10-01

    The Poisson-Boltzmann equation (PBE) is a widely used implicit-solvent continuum model for calculating the electrostatic potential energy of biomolecules in ionic solvent, but its numerical solution remains a challenge due to the strong singularity and nonlinearity caused by its singular distribution source terms and exponential nonlinear terms. To effectively deal with this challenge, in this paper new solution decomposition and minimization schemes are proposed, together with a new PBE analysis of solution existence and uniqueness. Moreover, a PBE finite element program package is developed in Python based on the FEniCS program library and GAMer, a molecular surface and volumetric mesh generation program package. Numerical tests on proteins and a nonlinear Born ball model with an analytical solution validate the new solution decomposition and minimization schemes, and demonstrate the effectiveness and efficiency of the new PBE finite element program package.
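The singular source terms and exponential nonlinearity mentioned above are visible in one commonly written dimensionless form of the nonlinear PBE (conventions vary between authors; this form is for illustration and is not necessarily the paper's):

```latex
% epsilon: piecewise dielectric coefficient (solute vs. solvent regions)
% kappa-bar: modified Debye-Huckel parameter, zero inside the solute
% z_i, r_i: charge numbers and positions of the N solute atoms
-\,\nabla \cdot \bigl( \epsilon(\mathbf{r})\, \nabla u(\mathbf{r}) \bigr)
  + \bar{\kappa}^{2}(\mathbf{r}) \sinh u(\mathbf{r})
  = \alpha \sum_{i=1}^{N} z_i\, \delta(\mathbf{r} - \mathbf{r}_i).
% A solution decomposition splits u into the singular Coulomb part
% (known analytically) plus smoother components that a finite element
% method can resolve, removing the delta sources from the discrete problem.
```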

  13. Numerical and experimental study of the fundamental flow characteristics of a 3D gully box under drainage.

    PubMed

    Lopes, Pedro; Carvalho, Rita F; Leandro, Jorge

    2017-05-01

    Numerical studies regarding the influence of entrapped air on the hydraulic performance of gullies are nonexistent. This is due to the lack of a model that simulates the air-entrainment phenomena and consequently the entrapped air. In this work, we used experimental data to validate an air-entrainment model that uses a Volume-of-Fluid based method to detect the interface and the Shear-stress transport k-ω turbulence model. The air is detected in a sub-grid scale, generated by a source term and transported using a slip velocity formulation. Results are shown in terms of free-surface elevation, velocity profiles, turbulent kinetic energy and discharge coefficients. The air-entrainment model allied to the turbulence model showed a good accuracy in the prediction of the zones of the gully where the air is more concentrated.

  14. Polynomial-interpolation algorithm for van der Pauw Hall measurement in a metal hydride film

    NASA Astrophysics Data System (ADS)

    Koon, D. W.; Ares, J. R.; Leardini, F.; Fernández, J. F.; Ferrer, I. J.

    2008-10-01

    We apply a four-term polynomial-interpolation extension of the van der Pauw Hall measurement technique to a 330 nm Mg-Pd bilayer during both absorption and desorption of hydrogen at room temperature. We show that standard versions of the van der Pauw DC Hall measurement technique produce an error of over 100% due to a drifting offset signal and can lead to unphysical interpretations of the physical processes occurring in this film. The four-term technique effectively removes this source of error, even when the offset signal is drifting by an amount larger than the Hall signal in the time interval between successive measurements. This technique can be used to increase the resolution of transport studies of any material in which the resistivity is rapidly changing, particularly when the material is changing from metallic to insulating behavior.
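The paper's exact four-term scheme is not reproduced here, but the underlying idea — cancelling a smoothly drifting offset by an alternating-sign polynomial combination of successive readings — can be illustrated. The (1, −3, 3, −1)/8 weights below annihilate offset drift up to quadratic order and are an illustrative choice, not necessarily the authors' algorithm:

```python
import numpy as np

def hall_four_term(v1, v2, v3, v4):
    """Recover the Hall signal from four equally spaced readings taken with
    alternating field polarity (+, -, +, -). The binomial weights cancel any
    offset varying as a polynomial of degree <= 2 in time."""
    return (v1 - 3.0 * v2 + 3.0 * v3 - v4) / 8.0

# Simulate a quadratically drifting offset much larger than the Hall signal.
t = np.arange(4.0)
offset = 2.0 + 0.5 * t + 0.1 * t**2          # drifting offset (pseudo signal)
v_hall = 1e-3                                 # true Hall voltage
polarity = np.array([1.0, -1.0, 1.0, -1.0])   # field reversal each reading
readings = offset + polarity * v_hall

print(hall_four_term(*readings))  # ~0.001: the large quadratic drift cancels
```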

  15. Assessment of rainfall thresholds for landslide triggering in the Pacific Northwest: extreme short-term rainfall and long-term trends

    NASA Astrophysics Data System (ADS)

    Stanley, T.; Kirschbaum, D.; Sobieszczyk, S.; Jasinski, M. F.; Borak, J.; Yatheendradas, S.

    2017-12-01

    Landslides occur every year in the U.S. Pacific Northwest due to extreme rainfall, snow cover, and rugged topography. Data for 15,000 landslide events in Washington and Oregon were assembled from State Surveys, Departments of Transportation, a Global Landslide Catalog compiled by NASA, and other sources. This new inventory was evaluated against rainfall data from the National Climate Assessment (NCA) Land Data Assimilation System to characterize the regional rainfall conditions that trigger landslides. Analysis of these data sets indicates clear differences in triggering thresholds between extreme weather systems such as a Pineapple Express and the more typical peak seasonal rainfall between November and February. The study also leverages over 30 years of precipitation and land surface information to inform variability of landslide triggering over multiple decades and landslide trends within the region.

  16. Ca(2+) signaling mechanisms in bovine adrenal chromaffin cells.

    PubMed

    Weiss, Jamie L

    2012-01-01

    Calcium (Ca(2+)) is a crucial intracellular messenger in physiological aspects of cell signaling. Adrenal chromaffin cells are the secretory cells of the adrenal gland medulla that secrete catecholamines, which include epinephrine and norepinephrine, important in the 'fight or flight' response. Bovine adrenal chromaffin cells have long been used as an important model for secretion (exocytosis), not only due to their importance in the short-term stress response, but also as a neuroendocrine model of neurotransmitter release, as they have all the same exocytotic proteins as neurons but are easier to prepare, culture and use in functional assays. The components of the Ca(2+) signal transduction cascade and their role in secretion have been extensively characterized in bovine adrenal chromaffin cells. The Ca(2+) sources, signaling molecules and how these relate to the short-term stress response are reviewed in this book chapter, in an endeavor to overview these mechanisms in a concise and uncomplicated manner.

  17. Relativistic analogue of the Newtonian fluid energy equation with nucleosynthesis

    DOE PAGES

    Cardall, Christian Y.

    2017-12-15

    In Newtonian fluid dynamics simulations in which composition has been tracked by a nuclear reaction network, energy generation due to composition changes has generally been handled as a separate source term in the energy equation. Here, a relativistic equation in conservative form for total fluid energy, obtained from the spacetime divergence of the stress-energy tensor, in principle encompasses such energy generation; but it is not explicitly manifest. An alternative relativistic energy equation in conservative form—in which the nuclear energy generation appears explicitly, and that reduces directly to the Newtonian internal+kinetic energy in the appropriate limit—emerges naturally and self-consistently from the difference of the equation for total fluid energy and the equation for baryon number conservation multiplied by the average baryon mass m, when m is expressed in terms of contributions from the nuclear species in the fluid, and allowed to be mutable.

  18. Doors for memory: A searchable database.

    PubMed

    Baddeley, Alan D; Hitch, Graham J; Quinlan, Philip T; Bowes, Lindsey; Stone, Rob

    2016-11-01

    The study of human long-term memory has for over 50 years been dominated by research on words. This is partly due to lack of suitable nonverbal materials. Experience in developing a clinical test suggested that door scenes can provide an ecologically relevant and sensitive alternative to the faces and geometrical figures traditionally used to study visual memory. In pursuing this line of research, we have accumulated over 2000 door scenes providing a database that is categorized on a range of variables including building type, colour, age, condition, glazing, and a range of other physical characteristics. We describe an illustrative study of recognition memory for 100 doors tested by yes/no, two-alternative, or four-alternative forced-choice paradigms. These stimuli, together with the full categorized database, are available through a dedicated website. We suggest that door scenes provide an ecologically relevant and participant-friendly source of material for studying the comparatively neglected field of visual long-term memory.

  19. Long-term Satellite Observations of Asian Dust Storm: Source, Pathway, and Interannual Variability

    NASA Technical Reports Server (NTRS)

    Hsu, N. Christina

    2008-01-01

    Among the many components that contribute to air pollution, airborne mineral dust plays an important role due to its biogeochemical impact on the ecosystem and its radiative-forcing effect on the climate system. In East Asia, dust storms frequently accompany the cold and dry air masses that occur as part of springtime cold front systems. Outbreaks of Asian dust storms occur often in the arid and semi-arid areas of northwestern China (about 1.6×10⁶ square kilometers, including the Gobi and Taklimakan deserts), with continuously expanding spatial coverage. These airborne dust particles, originating in desert areas far from polluted regions, interact with anthropogenic sulfate and soot aerosols emitted from Chinese megacities during their transport over the mainland. Adding the intricate effects of clouds and marine aerosols, dust particles reaching the marine environment can have drastically different properties than those at their sources. Furthermore, these aerosols, once generated over the source regions, can be transported out of the boundary layer into the free troposphere and can travel thousands of kilometers across the Pacific into the United States and beyond. In this paper, we will demonstrate the capability of a new satellite algorithm to retrieve aerosol properties (e.g., optical thickness, single scattering albedo) over bright-reflecting surfaces such as urban areas and deserts. Such retrievals have been difficult to perform using previously available algorithms that use wavelengths from the mid-visible to the near IR, because they have trouble separating the aerosol signal from the contribution of the bright surface reflectance. This new algorithm, called Deep Blue, utilizes blue-wavelength measurements from instruments such as SeaWiFS and MODIS to infer the properties of aerosols, since the surface reflectance over land in the blue part of the spectrum is much lower than for longer wavelength channels.
Reasonable agreements have been achieved between Deep Blue retrievals of aerosol optical thickness and those directly from AERONET sunphotometers over desert and semi-desert regions. New Deep Blue products will allow scientists to determine quantitatively the aerosol properties near sources using high spatial resolution measurements from SeaWiFS and MODIS-like instruments. Long-term satellite measurements (1998 - 2007) from SeaWiFS will be utilized to investigate the interannual variability of source, pathway, and dust loading associated with the Asian dust storm outbreaks. In addition, monthly averaged aerosol optical thickness during the springtime from SeaWiFS will also be compared with the MODIS Deep Blue products.

  20. Recent Progress on the Deep Blue Aerosol Algorithm as Applied to MODIS, SeaWiFS, and VIIRS, and Intercomparisons with Ground-Based and Other Satellite Measurements

    NASA Technical Reports Server (NTRS)

    Hsu, N. Christina; Bettenhausen, Corey; Sawyer, Andrew; Tsay, Si-Chee

    2012-01-01

    The impact of natural and anthropogenic sources of aerosols has gained increasing attention from scientific communities in recent years. Indeed, tropospheric aerosols not only perturb the radiative energy balance by interacting with solar and terrestrial radiation, but also change cloud properties and lifetime. Furthermore, these anthropogenic and natural particles, once generated over the source regions, can be transported out of the boundary layer into the free troposphere and can travel thousands of kilometers across oceans and continents, resulting in important biogeochemical impacts on the ecosystem. With the launch of SeaWiFS in 1997, Terra/MODIS in 1999, and Aqua/MODIS in 2002, a high-quality comprehensive aerosol climatology is becoming feasible for the first time. As a result of these unprecedented data records, studies of the radiative and biogeochemical effects due to tropospheric aerosols are now possible. In this talk, we will demonstrate how this newly available SeaWiFS/MODIS aerosol climatology can provide an important piece of the puzzle in reducing the uncertainty of estimated climatic forcing due to aerosols. We will start with the global distribution of aerosol loading and its variability over both land and ocean on short- and long-term temporal scales observed over the last decade. The recent progress made in the Deep Blue aerosol algorithm toward improving the accuracy of these SeaWiFS/MODIS aerosol products, in particular over land, will be discussed. The impact on quantifying the physical and optical processes of aerosols over source regions of adding the Deep Blue products of aerosol properties over bright-reflecting surfaces to the SeaWiFS/MODIS as well as VIIRS data suites will also be addressed. We will also show intercomparison results of SeaWiFS/MODIS retrieved aerosol optical thickness with data from ground-based AERONET sunphotometers over land and ocean, as well as with other satellite measurements.
The trends observed in global aerosol loadings of both natural and anthropogenic sources, based upon more than a decade of combined MODIS/SeaWiFS data (1997-2011), will be discussed. We will also address how various key issues, such as differences in spatial-temporal sampling rates and observation times between different satellite measurements, could potentially impact these intercomparison results, especially when using monthly mean data, and thus affect estimates of long-term aerosol trends.

  1. Nodal Green’s Function Method Singular Source Term and Burnable Poison Treatment in Hexagonal Geometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    A.A. Bingham; R.M. Ferrer; A.M. Ougouag

    2009-09-01

    An accurate and computationally efficient two- or three-dimensional neutron diffusion model will be necessary for the development, computation of safety parameters, and fuel cycle analysis of a prismatic Very High Temperature Reactor (VHTR) design under the Next Generation Nuclear Plant Project (NGNP). For this purpose, an analytical nodal Green’s function solution for the transverse-integrated neutron diffusion equation is developed in two- and three-dimensional hexagonal geometry. This scheme is incorporated into HEXPEDITE, a code first developed by Fitzpatrick and Ougouag. HEXPEDITE neglects non-physical discontinuity terms that arise in the transverse leakage due to the application of the transverse integration procedure to hexagonal geometry, and cannot account for the effects of burnable poisons across nodal boundaries. The test code being developed for this document accounts for these terms by maintaining an inventory of neutrons, using the nodal balance equation as a constraint on the neutron flux equation. The method developed in this report is intended to restore neutron conservation and increase the accuracy of the code by adding these terms to the transverse-integrated flux solution and applying the nodal Green’s function solution to the resulting equation to derive a semi-analytical solution.

  2. Effect of natural and synthetic iron corrosion products on silicate glass alteration processes

    NASA Astrophysics Data System (ADS)

    Dillmann, Philippe; Gin, Stéphane; Neff, Delphine; Gentaz, Lucile; Rebiscoul, Diane

    2016-01-01

    Glass long-term alteration in the context of high-level radioactive waste (HLW) storage is influenced by near-field materials and the environmental context. As previous studies have shown, the extent of glass alteration is strongly related to the presence of iron in the system, mainly provided by the steel overpack surrounding the HLW glass package. A key to understanding what will happen to the glass-borne elements in geological disposal lies in the relationship between the iron-bearing phases and the glass alteration products formed. In this study, we focus on the influence of the formation conditions (synthesized or formed in situ) and the age of different iron corrosion products on SON68 glass alteration. Corrosion products obtained from archaeological iron artifacts are considered here to be true analogues of the corrosion products in a waste disposal system, due to the similarities in formation conditions and physical properties. These representative corrosion products (RCP) are used in the experiments along with synthesized anoxic iron corrosion products (SCP) and pristine metallic iron. Model cracks in SON68 glass were altered in cell reactors, with one of the different iron sources inserted in the crack each time. The study was successful in reproducing most of the processes observed in the long-term archaeological system. Between the different systems, variations in alteration were noted both in nature and intensity, confirming the influence of the iron source on glass alteration. The results seem to point to a lesser effect of the long-term iron corrosion products (RCP) on glass alteration than that of the more recent products (SCP), both in terms of general glass alteration and of iron transport.

  3. Advanced relativistic VLBI model for geodesy

    NASA Astrophysics Data System (ADS)

    Soffel, Michael; Kopeikin, Sergei; Han, Wen-Biao

    2017-07-01

    Our present relativistic part of the geodetic VLBI model for Earthbound antennas is a consensus model which is considered as a standard for processing high-precision VLBI observations. It was created as a compromise between a variety of relativistic VLBI models proposed by different authors, as documented in the IERS Conventions 2010. The accuracy of the consensus model is in the picosecond range for the group delay, but this is not sufficient for current geodetic purposes. This paper provides a fully documented derivation of a new relativistic model having an accuracy substantially higher than one picosecond, based upon a well-accepted formalism of relativistic celestial mechanics, astrometry and geodesy. Our new model fully confirms the consensus model at the picosecond level and in several respects goes well beyond it. More specifically, terms related to the acceleration of the geocenter are considered and kept in the model, the gravitational time delay due to a massive body (planet, Sun, etc.) with arbitrary mass and spin-multipole moments is derived taking into account the motion of the body, and a new formalism for the time-delay problem of radio sources located at finite distance from VLBI stations is presented. Thus, the paper presents a substantially elaborated theoretical justification of the consensus model and a significant extension that allows researchers to make concrete estimates of the magnitude of residual terms of this model for any conceivable configuration of the source of light, massive bodies, and VLBI stations. The largest terms in the relativistic time delay that can affect current VLBI observations are from the quadrupole and the angular momentum of the gravitating bodies, which are known from the literature. These terms should be included in the new geodetic VLBI model to improve its consistency.

  4. Is Earth coming out of the recent ice house age in the long-term? - constraints from probable mantle CO2-degassing reconstructions

    NASA Astrophysics Data System (ADS)

    Hartmann, Jens; Li, Gaojun; West, A. Joshua

    2017-04-01

    Enhanced partial melting of mantle material probably started when the subduction motor started around 3.2 Ga ago, as evidenced by the formation history of the continental crust. Carbon degasses during partial melting because it is an incompatible element. Therefore, mantle carbon degassing rates would change with time in proportion to the evolution of the mantle reservoir concentration and the ocean crust production rate, causing a distinct change in the CO2-degassing rate with time. The evolution of the mantle degassing rate has implications for the reconstruction of the carbon cycle, and therefore for climate and Earth surface process rates, as CO2-degassing rates are used to constrain or to balance the atmosphere-ocean-crust carbon cycle system. It will be shown that compilations of CO2 degassing from relevant geological sources probably exceed the established terrestrial weathering CO2 sink, which is often used to constrain long-term mantle degassing rates to close the carbon cycle on geological time scales. In addition, the scenarios for the degassing dynamics from the mantle sources suggest that the mantle has been depleting its carbon content since 3 Ga. This has further implications for the long-term weathering CO2 sink. Results will be compared with geochemical proxies for weathering and weathering-intensity dynamics, and will be set in the context of snowball Earth events and the long-term emplacement dynamics of mafic areas such as Large Igneous Provinces. Decreasing mantle degassing rates since about 2 Ga suggest a constraint on the evolution of the carbon cycle and the recycling potential of the amount of subducted carbon. If the given scenarios hold up to further investigation, the contribution of mantle degassing to climate forcing (directly and via recycling) will decrease further.
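The proportionality argued above (degassing rate tracks the remaining mantle carbon reservoir times the crust production rate) implies a monotonically declining degassing history as the reservoir depletes. A minimal numeric sketch, with arbitrary made-up numbers and a constant crust-production factor k, chosen only to show the qualitative behavior:

```python
def degassing_history(c0=1.0, k=0.5, dt=0.1, steps=60):
    """Forward-Euler integration of dC/dt = -k*C; the degassing rate
    D = k*C declines as the mantle carbon reservoir C depletes."""
    c, rates = c0, []
    for _ in range(steps):
        rate = k * c          # degassing rate tracks reservoir content
        c -= rate * dt        # reservoir depletion
        rates.append(rate)
    return rates

rates = degassing_history()
print(rates[0], rates[-1])    # initial rate is the largest; decline follows
```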

  5. SOARCA Peach Bottom Atomic Power Station Long-Term Station Blackout Uncertainty Analysis: Convergence of the Uncertainty Results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bixler, Nathan E.; Osborn, Douglas M.; Sallaberry, Cedric Jean-Marie

    2014-02-01

    This paper describes the convergence of MELCOR Accident Consequence Code System, Version 2 (MACCS2) probabilistic results of offsite consequences for the uncertainty analysis of the State-of-the-Art Reactor Consequence Analyses (SOARCA) unmitigated long-term station blackout scenario at the Peach Bottom Atomic Power Station. The consequence metrics evaluated are individual latent-cancer fatality (LCF) risk and individual early fatality risk. Consequence results are presented as conditional risk (i.e., assuming the accident occurs, risk per event) to individuals of the public as a result of the accident. In order to verify convergence for this uncertainty analysis, as recommended by the Nuclear Regulatory Commission’s Advisory Committee on Reactor Safeguards, a ‘high’ source term from the original population of Monte Carlo runs has been selected to be used for: (1) a study of the distribution of consequence results stemming solely from epistemic uncertainty in the MACCS2 parameters (i.e., separating the effect from the source term uncertainty), and (2) a comparison between Simple Random Sampling (SRS) and Latin Hypercube Sampling (LHS) in order to validate the original results obtained with LHS. Three replicates (each using a different random seed) of size 1,000 each using LHS and another set of three replicates of size 1,000 using SRS are analyzed. The results show that the LCF risk results are well converged with either LHS or SRS sampling. The early fatality risk results are less well converged at radial distances beyond 2 miles, and this is expected due to the sparse data (predominance of “zero” results).
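The LHS-versus-SRS distinction at the heart of the comparison can be illustrated with a minimal stand-alone Latin hypercube sampler (a generic implementation for illustration, not MACCS2's sampling machinery):

```python
import numpy as np

def latin_hypercube(n, d, rng):
    """Draw n points in [0,1)^d: one sample per equal-probability stratum
    in each dimension, with strata independently permuted per dimension."""
    u = rng.random((n, d))                 # position within each stratum
    out = np.empty((n, d))
    for j in range(d):
        out[:, j] = (rng.permutation(n) + u[:, j]) / n
    return out

rng = np.random.default_rng(42)
n, d = 1000, 3
lhs = latin_hypercube(n, d, rng)
srs = rng.random((n, d))                   # simple random sampling

# Every 1-D marginal of the LHS design hits each of the n strata exactly
# once, which is why LHS replicates converge with fewer runs; SRS offers
# no such stratification guarantee.
print(lhs.shape, srs.shape)
```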

  6. Analytic estimation of recycled products added value as a means for effective environmental management

    NASA Astrophysics Data System (ADS)

    Batzias, Dimitris F.

    2012-12-01

    In this work, we present an analytic estimation of the added value of recycled products in order to provide a means for determining the degree of recycling that maximizes profit, taking also into account the social interest by including the subsidy of the corresponding investment. A methodology has been developed based on the Life Cycle Product (LCP) concept, with emphasis on the added values H, R as fractions of production and recycling cost, respectively (H, R > 1, since profit is included), which decrease at the corresponding rates h, r over the course of recycling, due to deterioration of quality. At the macrolevel, the claim that "an increase of exergy price, as a result of available cheap energy sources becoming more scarce, leads to less recovered quantity of any recyclable material" is proved by means of the tradeoff between the partial benefits due to material saving and resources degradation/consumption (assessed in monetary terms).
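A hypothetical illustration of this setup: the symbols R and r are from the abstract, but the geometric decay law and the profit criterion below are assumptions made purely for illustration, not the paper's actual model. If the recycling added value R decays at rate r each cycle, a cycle remains profitable while its added value still exceeds its cost:

```python
def profitable_cycles(R, r):
    """Count recycling cycles i for which the decayed added value
    R * (1 - r)**i still exceeds the (normalized) recycling cost of 1.
    Illustrative model only, assuming geometric quality deterioration."""
    i = 0
    while R * (1.0 - r) ** i > 1.0:
        i += 1
    return i

print(profitable_cycles(1.5, 0.2))  # 2: added value drops below cost afterwards
```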

  7. Elimination of the light shift in rubidium gas cell frequency standards using pulsed optical pumping

    NASA Technical Reports Server (NTRS)

    English, T. C.; Jechart, E.; Kwon, T. M.

    1978-01-01

    Changes in the intensity of the light source in an optically pumped, rubidium, gas cell frequency standard can produce corresponding frequency shifts, with possible adverse effects on the long-term frequency stability. A pulsed optical pumping apparatus was constructed with the intent of investigating the frequency stability in the absence of light shifts. Contrary to original expectations, a small residual frequency shift due to changes in light intensity was experimentally observed. Evidence is given which indicates that this is not a true light-shift effect. Preliminary measurements of the frequency stability of this apparatus, with this small residual pseudo light shift present, are presented. It is shown that this pseudo light shift can be eliminated by using a more homogeneous C-field. This is consistent with the idea that the pseudo light shift is due to inhomogeneity in the physics package (position-shift effect).

  8. Long-term Photometric Variability in Kepler Full-frame Images: Magnetic Cycles of Sun-like Stars

    NASA Astrophysics Data System (ADS)

    Montet, Benjamin T.; Tovar, Guadalupe; Foreman-Mackey, Daniel

    2017-12-01

    Photometry from the Kepler mission is optimized to detect small, short-duration signals like planet transits at the expense of long-term trends. This long-term variability can be recovered in photometry from the full-frame images (FFIs), a set of calibration data collected approximately monthly during the Kepler mission. Here we present f3, an open-source package to perform photometry on the Kepler FFIs in order to detect changes in the brightness of stars in the Kepler field of view over long time baselines. We apply this package to a sample of 4000 Sun-like stars with measured rotation periods. We find that ≈10% of these targets have long-term variability in their observed flux. For the majority of targets, we find that the luminosity variations are either correlated or anticorrelated with the short-term variability due to starspots on the stellar surface. We find a transition between anticorrelated (starspot-dominated) variability and correlated (facula-dominated) variability between rotation periods of 15 and 25 days, suggesting the transition between the two modes is complete for stars at the age of the Sun. We also identify a sample of stars with apparently complete cycles, as well as a collection of short-period binaries with extreme photometric variation over the Kepler mission.

  9. Allometric Trajectories and "Stress": A Quantitative Approach.

    PubMed

    Anfodillo, Tommaso; Petit, Giai; Sterck, Frank; Lechthaler, Silvia; Olson, Mark E

    2016-01-01

    The term "stress" is an important but vague term in plant biology. We show situations in which thinking in terms of "stress" is profitably replaced by quantifying distance from functionally optimal scaling relationships between plant parts. These relationships include, for example, the often-cited one between leaf area and sapwood area, which presumably reflects mutual dependence between source and sink tissues and which scales positively within individuals and across species. These relationships seem to be so basic to plant functioning that they are favored by selection across nearly all plant lineages. Within a species or population, individuals that are far from the common scaling patterns are thus expected to perform poorly. For instance, "too little" leaf area (e.g., due to herbivory or disease) per unit of active stem mass would be expected to result in low carbon income per unit respiratory cost and thus lead to lower growth. We present a framework that allows quantitative study of phenomena traditionally assigned to "stress," without need for recourse to this term. Our approach contrasts with traditional approaches for studying "stress," e.g., revealing that small "stressed" plants likely are in fact well suited to local conditions. We thus offer a quantitative perspective to the study of phenomena often referred to under such terms as "stress," plasticity, adaptation, and acclimation.

  10. Efficient Development of High Fidelity Structured Volume Grids for Hypersonic Flow Simulations

    NASA Technical Reports Server (NTRS)

    Alter, Stephen J.

    2003-01-01

    A new technique for the control of grid line spacing and intersection angles of a structured volume grid, using elliptic partial differential equations (PDEs) is presented. Existing structured grid generation algorithms make use of source term hybridization to provide control of grid lines, imposing orthogonality implicitly at the boundary and explicitly on the interior of the domain. A bridging function between the two types of grid line control is typically used to blend the different orthogonality formulations. It is shown that utilizing such a bridging function with source term hybridization can result in the excessive use of computational resources and diminishes robustness. A new approach, Anisotropic Lagrange Based Trans-Finite Interpolation (ALBTFI), is offered as a replacement to source term hybridization. The ALBTFI technique captures the essence of the desired grid controls while improving the convergence rate of the elliptic PDEs when compared with source term hybridization. Grid generation on a blunt cone and a Shuttle Orbiter is used to demonstrate and assess the ALBTFI technique, which is shown to be as much as 50% faster, more robust, and produces higher quality grids than source term hybridization.

  11. BWR ASSEMBLY SOURCE TERMS FOR WASTE PACKAGE DESIGN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    T.L. Lotz

    1997-02-15

    This analysis is prepared by the Mined Geologic Disposal System (MGDS) Waste Package Development Department (WPDD) to provide boiling water reactor (BWR) assembly radiation source term data for use during Waste Package (WP) design. The BWR assembly radiation source terms are to be used for evaluation of radiolysis effects at the WP surface, and for personnel shielding requirements during assembly or WP handling operations. The objectives of this evaluation are to generate BWR assembly radiation source terms that bound selected groupings of BWR assemblies, with regard to assembly average burnup and cooling time, which comprise the anticipated MGDS BWR commercial spent nuclear fuel (SNF) waste stream. The source term data is to be provided in a form which can easily be utilized in subsequent shielding/radiation dose calculations. Since these calculations may also be used for Total System Performance Assessment (TSPA), with appropriate justification provided by TSPA, or radionuclide release rate analysis, the grams of each element and additional cooling times out to 25 years will also be calculated and the data included in the output files.

  12. High stability integrated Tri-axial fluxgate sensor with suspended technology

    NASA Astrophysics Data System (ADS)

    Wang, Chen; Teng, Yuntian; Wang, Xiaomei; Fan, Xiaoyong; Wu, Qiong

    2017-04-01

    The relative geomagnetic recording network of the Geomagnetic Network of China (GNC) has been digitized and networked, achieving one-second data acquisition and storage through the upgrades of the 9th and 10th five-year plans. Currently, relative recording at geomagnetic observatories generally relies on two sets of the same type of instrument operating in parallel, which makes it possible to distinguish instrument failures from environmental interference and ensures the continuity and integrity of the observation data. The fluxgate magnetometer has become the mainstream instrument for relative geomagnetic recording because of its low noise, high sensitivity, and fast response. Analysis of several years of observation data, however, reveals inconsistencies between instruments of the same type at the same station. Extensive experiments have identified three main error sources: 1) instrument performance: owing to limitations of the manufacturing and assembly process it is difficult to guarantee the orthogonality of the sensor axes, and scale factor, zero offset, and temperature coefficient also vary between instruments; 2) leveling error, introduced by horizontal adjustment during initial installation and by pillar tilting over long-term observation; 3) the observation environment, including temperature, humidity, and the power supply system. The new fluxgate magnetometer uses a special nonmagnetic gimbal (made of beryllium bronze) for suspension, so the fluxgate sensor is fixed on a suspended platform that automatically maintains horizontal level. The advantage of this design is that it eliminates the leveling error introduced by horizontal adjustment during initial installation and by pillar tilting over long-term observation. The signal-processing circuit board is fixed above the suspended platform at a distance chosen so that the static and dynamic magnetic fields produced by the board have no effect on the sensor, while keeping the signal transmission cable short enough to avoid signal attenuation.

  13. Recent H- diagnostics, plasma simulations, and 2X scaled Penning ion source developments at the Rutherford Appleton Laboratory

    NASA Astrophysics Data System (ADS)

    Lawrie, S. R.; Faircloth, D. C.; Smith, J. D.; Sarmento, T. M.; Whitehead, M. O.; Wood, T.; Perkins, M.; Macgregor, J.; Abel, R.

    2018-05-01

    A vessel for extraction and source plasma analyses is being used for Penning H- ion source development at the Rutherford Appleton Laboratory. A new set of optical elements including an einzel lens has been installed, which transports over 80 mA of H- beam successfully. Simultaneously, a 2X scaled Penning source has been developed to reduce cathode power density. The 2X source is now delivering a 65 mA H- ion beam at 10% duty factor, meeting its design criteria. The long-term viability of the einzel lens and 2X source is now being evaluated, so new diagnostic devices have been installed. A pair of electrostatic deflector plates is used to correct beam misalignment and perform fast chopping, with a voltage rise time of 24 ns. A suite of four quartz crystal microbalances has shown that the cesium flux in the vacuum vessel is only increased by a factor of two, despite the absence of a dedicated cold trap. Finally, an infrared camera has demonstrated good agreement with thermal simulations but has indicated unexpected heating due to beam loss on the downstream electrode. These types of diagnostics are suitable for monitoring all operational ion sources. In addition to experimental campaigns and new diagnostic tools, the high-performance VSim and COMSOL software packages are being used for plasma simulations of two novel ion thrusters for space propulsion applications. In parallel, a VSim framework has been established to include arbitrary temperature and cesium fields to allow the modeling of surface physics in H- ion sources.

  14. A comprehensive Probabilistic Tsunami Hazard Assessment for the city of Naples (Italy)

    NASA Astrophysics Data System (ADS)

    Anita, G.; Tonini, R.; Selva, J.; Sandri, L.; Pierdominici, S.; Faenza, L.; Zaccarelli, L.

    2012-12-01

    A comprehensive Probabilistic Tsunami Hazard Assessment (PTHA) should consider different tsunamigenic sources (seismic events, slide failures, volcanic eruptions) to calculate the hazard on given target sites. This implies a multi-disciplinary analysis of all natural tsunamigenic sources, in a multi-hazard/risk framework, which considers also the effects of interaction/cascade events. Our approach shows the ongoing effort to analyze the comprehensive PTHA for the city of Naples (Italy) including all types of sources located in the Tyrrhenian Sea, as developed within the Italian project ByMuR (Bayesian Multi-Risk Assessment). The project combines a multi-hazard/risk approach to treat the interactions among different hazards, and a Bayesian approach to handle the uncertainties. The natural potential tsunamigenic sources analyzed are: 1) submarine seismic sources located on active faults in the Tyrrhenian Sea and close to the Southern Italian shore line (we also consider the effects of onshore seismic sources and provide the rupture properties of the associated active faults), 2) mass failures and collapses around the target area (spatially identified on the basis of their propensity to failure), and 3) volcanic sources mainly identified by pyroclastic flows and collapses from the volcanoes in the Neapolitan area (Vesuvius, Campi Flegrei and Ischia). All these natural sources are here preliminarily analyzed and combined, in order to provide a complete picture of a PTHA for the city of Naples. In addition, the treatment of interaction/cascade effects is formally discussed in the case of significant temporary variations in the short-term PTHA due to an earthquake.

  15. A multi-source data assimilation framework for flood forecasting: Accounting for runoff routing lags

    NASA Astrophysics Data System (ADS)

    Meng, S.; Xie, X.

    2015-12-01

    In flood forecasting practice, model performance is usually degraded by various sources of uncertainty, including uncertainties from input data, model parameters, model structures and output observations. Data assimilation is a useful methodology for reducing these uncertainties. For short-term flood forecasting, an accurate estimate of the initial soil moisture condition improves forecasting performance, and the time delay of runoff routing is another important influence on it. Moreover, observations of hydrological variables (including ground observations and satellite observations) are becoming readily available, so the reliability of short-term flood forecasting can be improved by assimilating multi-source data. The objective of this study is to develop a multi-source data assimilation framework for real-time flood forecasting. In this framework, the first step assimilates up-layer soil moisture observations to update the model state and generated runoff using the ensemble Kalman filter (EnKF), and the second step assimilates discharge observations to update the model state and runoff within a fixed time window using the ensemble Kalman smoother (EnKS). The smoothing technique is adopted to account for the runoff routing lag. Assimilating both soil moisture and discharge observations in this way is expected to improve flood forecasting. To isolate the effectiveness of this dual-step framework, we also designed a dual-EnKF algorithm in which the observed soil moisture and discharge are assimilated separately without accounting for the runoff routing lag. The results show that the multi-source data assimilation framework can effectively improve flood forecasting, especially when the runoff routing has a distinct time lag. Thus, this new data assimilation framework holds great potential for operational flood forecasting by merging observations from ground measurements and remote sensing retrievals.
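
    The EnKF analysis step at the core of such a framework can be sketched in a minimal, generic form (this is the standard perturbed-observations update, not the paper's implementation; the state layout and all numbers below are illustrative):

```python
import numpy as np

def enkf_update(X, y, H, R, rng):
    """One EnKF analysis step (perturbed-observations form).
    X: (n_state, n_ens) forecast ensemble; y: (n_obs,) observation vector;
    H: (n_obs, n_state) observation operator; R: (n_obs, n_obs) obs-error cov."""
    n_obs, n_ens = y.size, X.shape[1]
    A = X - X.mean(axis=1, keepdims=True)        # ensemble anomalies
    Pf_Ht = A @ (H @ A).T / (n_ens - 1)          # sample P_f H^T
    S = H @ Pf_Ht + R                            # innovation covariance
    K = Pf_Ht @ np.linalg.inv(S)                 # Kalman gain
    Y = y[:, None] + rng.multivariate_normal(np.zeros(n_obs), R, n_ens).T
    return X + K @ (Y - H @ X)                   # analysis ensemble

# Tiny illustration: 500-member ensemble, state = (soil moisture, runoff),
# observing soil moisture only with a very small observation error.
rng = np.random.default_rng(0)
X = rng.normal(0.3, 0.1, size=(2, 500))
Xa = enkf_update(X, np.array([0.5]), np.array([[1.0, 0.0]]),
                 np.array([[1e-4]]), rng)
```

The EnKS second step differs mainly in that the same gain-based update is applied to states across a fixed lag window, which is how the runoff routing delay is accounted for.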

  16. Characterisation of exposure to non-ionising electromagnetic fields in the Spanish INMA birth cohort: study protocol.

    PubMed

    Gallastegi, Mara; Guxens, Mònica; Jiménez-Zabala, Ana; Calvente, Irene; Fernández, Marta; Birks, Laura; Struchen, Benjamin; Vrijheid, Martine; Estarlich, Marisa; Fernández, Mariana F; Torrent, Maties; Ballester, Ferrán; Aurrekoetxea, Juan J; Ibarluzea, Jesús; Guerra, David; González, Julián; Röösli, Martin; Santa-Marina, Loreto

    2016-02-18

    Analysis of the association between exposure to electromagnetic fields of non-ionising radiation (EMF-NIR) and health in children and adolescents is hindered by the limited availability of data, mainly due to the difficulties of exposure assessment. This study protocol describes the methodologies used for characterising the exposure of children to EMF-NIR in the INMA (INfancia y Medio Ambiente - Environment and Childhood) Project, a prospective cohort study. Indirect methods (proximity to emission sources, questionnaires on source use and geospatial propagation models) and direct methods (spot and fixed longer-term measurements and personal measurements) were used to assess the exposure levels of study participants aged between 7 and 18 years. The methodology varies depending on the frequency of the EMF-NIR and the environment (homes, schools and parks). Questionnaires assessed the use of sources contributing to both Extremely Low Frequency (ELF) and Radiofrequency (RF) exposure levels. Geospatial propagation models (NISMap) are implemented and validated for environmental outdoor RF sources using spot measurements. Spot and fixed longer-term ELF and RF measurements were taken in the environments where children spend most of their time. Moreover, personal measurements were taken to assess individual exposure to RF. The exposure data are used to explore their relationships with proximity to and/or use of EMF-NIR sources. Characterising EMF-NIR exposure with this combination of methods is intended to overcome problems encountered in other research. Assessing the exposure of INMA cohort children and adolescents living in different regions of Spain to the full frequency range of EMF-NIR extends the characterisation of environmental exposures in this cohort. Together with other data obtained in the project, on socioeconomic and family characteristics and the development of the children and adolescents, this will make it possible to evaluate the complex interaction between health outcomes in children and adolescents and the various environmental factors that surround them.

  17. Analysis of long-term water quality for effective river health monitoring in peri-urban landscapes--a case study of the Hawkesbury-Nepean river system in NSW, Australia.

    PubMed

    Pinto, U; Maheshwari, B L; Ollerton, R L

    2013-06-01

    The Hawkesbury-Nepean River (HNR) system in South-Eastern Australia is the main source of water supply for the Sydney Metropolitan area and is one of the more complex river systems due to the influence of urbanisation and other activities in the peri-urban landscape through which it flows. The long-term monitoring of river water quality is likely to suffer from data gaps due to funding cuts, changes in priority and related reasons. Nevertheless, we need to assess river health based on the available information. In this study, we demonstrated how Factor Analysis (FA), Hierarchical Agglomerative Cluster Analysis (HACA) and Trend Analysis (TA) can be applied to evaluate long-term historic data sets. Six water quality parameters, viz., temperature, chlorophyll-a, dissolved oxygen, oxides of nitrogen, suspended solids and reactive silicates, measured at weekly intervals between 1985 and 2008 at 12 monitoring stations located along the 300 km length of the HNR system, were evaluated to understand the human and natural influences on the river system in a peri-urban landscape. The application of FA extracted three latent factors which explained more than 70 % of the total variance of the data and related to the 'bio-geographical', 'natural' and 'nutrient pollutant' dimensions of the HNR system. The bio-geographical and nutrient pollutant factors most likely relate to the direct influence of changes and activities of a peri-urban nature, and accounted for approximately 50 % of variability in water quality. The application of HACA indicated two major clusters representing clean and polluted zones of the river. On the spatial scale, one cluster was represented by the upper and lower sections of the river (clean zone) and accounted for approximately 158 km of the river. The other cluster was represented by the middle section (polluted zone) with a length of approximately 98 km.
Trend Analysis indicated how the point sources influence river water quality on spatio-temporal scales, taking into account the various effects of nutrient and other pollutant loads from sewerage effluents, agriculture and other point and non-point sources along the river and major tributaries of the HNR. Over the past 26 years, water temperature has significantly increased while suspended solids have significantly decreased (p < 0.05). The analysis of water quality data through FA, HACA and TA helped to characterise the key sections and cluster the key water quality variables of the HNR system. The insights gained from this study have the potential to improve the effectiveness of river health-monitoring programs in terms of cost, time and effort, particularly in a peri-urban context.

  18. Assessing risk of non-compliance of phosphorus standards for lakes in England and Wales

    NASA Astrophysics Data System (ADS)

    Duethmann, D.; Anthony, S.; Carvalho, L.; Spears, B.

    2009-04-01

    High population densities, use of inorganic fertilizer and intensive livestock agriculture have increased phosphorus loads to lakes, and accelerated eutrophication is a major pressure for many lakes. The EC Water Framework Directive (WFD) requires that good chemical and ecological quality is restored in all surface water bodies by 2015. Total phosphorus (TP) standards for lakes in England and Wales have been agreed recently, and our aim was to estimate what percentage of lakes in England and Wales is at risk of failing these standards. With measured lake phosphorus concentrations only being available for a small number of lakes, such an assessment had to be model based. The study also makes a source apportionment of phosphorus inputs into lakes. Phosphorus loads were estimated from a range of sources including agricultural loads, sewage effluents, septic tanks, diffuse urban sources, atmospheric deposition, groundwater and bank erosion. Lake phosphorus concentrations were predicted using the Vollenweider model, and the model framework was satisfactorily tested against available observed lake concentration data. Even though predictions for individual lakes remain uncertain, results for a population of lakes are considered as sufficiently robust. A scenario analysis was carried out to investigate to what extent reductions in phosphorus loads would increase the number of lakes achieving good ecological status in terms of TP standards. Applying the model to all lakes in England and Wales greater than 1 ha, it was calculated that under current conditions roughly two thirds of the lakes would fail the good ecological status with respect to phosphorus. According to our estimates, agricultural phosphorus loads represent the most frequent dominant source for the majority of catchments, but diffuse urban runoff also is important in many lakes. Sewage effluents are the most frequent dominant source for large lake catchments greater than 100 km². 
    Evaluation in terms of total load can be misleading about which sources need to be tackled by catchment management for most of the lakes. For example, sewage effluents are responsible for the majority of the total load but are the dominant source in only a small number of larger lake catchments. If loads from all sources were halved, this would potentially increase the number of complying lakes to two thirds, but it would require substantial measures to reduce phosphorus inputs to lakes. For agriculture, the required changes would have to go beyond improvements of agricultural practice and would need to include reducing the intensity of land use. The time required for many lakes to respond to reduced nutrient loading is likely to extend beyond the current timelines of the WFD due to internal loading and biological resistances.
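
    The abstract does not specify which variant of the Vollenweider model was used; a common OECD-style form, shown here purely as an illustration, predicts in-lake TP from the areal load, mean depth and water residence time. Note it is linear in the load, which is why scenario analyses can simply scale phosphorus loads:

```python
import math

def vollenweider_tp(areal_load, mean_depth, residence_time):
    """In-lake total phosphorus (mg m^-3) from areal P load (mg m^-2 yr^-1),
    mean depth z (m) and residence time tau (yr), using the classic form
    TP = L / (q_s * (1 + sqrt(tau))) with hydraulic load q_s = z / tau."""
    qs = mean_depth / residence_time            # hydraulic loading rate, m yr^-1
    return areal_load / (qs * (1.0 + math.sqrt(residence_time)))

tp = vollenweider_tp(1000.0, 5.0, 1.0)          # predicted lake TP for this load
half = vollenweider_tp(500.0, 5.0, 1.0)         # halving the load halves TP
```

Comparing the predicted TP against the agreed class boundary for each lake then gives the fraction of the lake population at risk of non-compliance.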

  19. Aerodynamic sound of flow past an airfoil

    NASA Technical Reports Server (NTRS)

    Wang, Meng

    1995-01-01

    The long term objective of this project is to develop a computational method for predicting the noise of turbulence-airfoil interactions, particularly at the trailing edge. We seek to obtain the energy-containing features of the turbulent boundary layers and the near-wake using Navier-Stokes Simulation (LES or DNS), and then to calculate the far-field acoustic characteristics by means of acoustic analogy theories, using the simulation data as acoustic source functions. Two distinct types of noise can be emitted from airfoil trailing edges. The first, a tonal or narrowband sound caused by vortex shedding, is normally associated with blunt trailing edges, high angles of attack, or laminar flow airfoils. The second source is of broadband nature arising from the aeroacoustic scattering of turbulent eddies by the trailing edge. Due to its importance to airframe noise, rotor and propeller noise, etc., trailing edge noise has been the subject of extensive theoretical (e.g. Crighton & Leppington 1971; Howe 1978) as well as experimental investigations (e.g. Brooks & Hodgson 1981; Blake & Gershfeld 1988). A number of challenges exist concerning acoustic analogy based noise computations. These include the elimination of spurious sound caused by vortices crossing permeable computational boundaries in the wake, the treatment of noncompact source regions, and the accurate description of wave reflection by the solid surface and scattering near the edge. In addition, accurate turbulence statistics in the flow field are required for the evaluation of acoustic source functions. Major efforts to date have been focused on the first two challenges. To this end, a paradigm problem of laminar vortex shedding, generated by a two dimensional, uniform stream past a NACA0012 airfoil, is used to address the relevant numerical issues. 
Under the low Mach number approximation, the near-field flow quantities are obtained by solving the incompressible Navier-Stokes equations numerically at a chord Reynolds number of 10^4. The far-field noise is computed using Curle's extension to the Lighthill analogy (Curle 1955). An effective method for separating the physical noise source from spurious boundary contributions is developed. This allows an accurate evaluation of the Reynolds stress volume quadrupoles, in addition to the more readily computable surface dipoles due to the unsteady lift and drag. The effect of noncompact source distribution on the far-field sound is assessed using an efficient integration scheme for the Curle integral, with full account of retarded-time variations. The numerical results confirm in quantitative terms that the far-field sound is dominated by the surface pressure dipoles at low Mach number. The techniques developed are applicable to a wide range of flows, including jets and mixing layers, where the Reynolds stress quadrupoles play a prominent or even dominant role in the overall sound generation.
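
    For reference, Curle's (1955) extension of the Lighthill analogy mentioned above expresses the far-field pressure as a volume quadrupole integral plus a surface dipole integral, both evaluated at the retarded time:

```latex
p'(\mathbf{x},t)
  = \frac{1}{4\pi}\,\frac{\partial^{2}}{\partial x_i\,\partial x_j}
    \int_{V} \frac{T_{ij}\left(\mathbf{y},\,t-|\mathbf{x}-\mathbf{y}|/c_0\right)}
                  {|\mathbf{x}-\mathbf{y}|}\,\mathrm{d}V(\mathbf{y})
  \;-\; \frac{1}{4\pi}\,\frac{\partial}{\partial x_i}
    \oint_{S} \frac{f_i\left(\mathbf{y},\,t-|\mathbf{x}-\mathbf{y}|/c_0\right)}
                   {|\mathbf{x}-\mathbf{y}|}\,\mathrm{d}S(\mathbf{y})
```

Here T_ij is the Lighthill stress tensor and f_i the force per unit area exerted by the surface on the fluid (sign conventions for the surface term vary with the choice of normal). For a compact surface at low Mach number, the surface integral reduces to dipoles driven by the unsteady lift and drag, which is the dominant contribution the computation verifies.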

  20. Boreal forest soil erosion and soil-atmosphere carbon exchange

    NASA Astrophysics Data System (ADS)

    Billings, S. A.; Harden, J. W.; O'Donnell, J.; Sierra, C. A.

    2013-12-01

    Erosion may become an increasingly important agent of change in boreal systems with climate warming, due to enhanced ice wedge degradation and increases in the frequency and intensity of stand-replacing fires. Ice wedge degradation can induce ground surface subsidence and lateral movement of mineral soil downslope, and fire can result in the loss of O horizons and live roots, with associated increases in wind- and water-promoted erosion until vegetation re-establishment. It is well-established that soil erosion can induce significant atmospheric carbon (C) source and sink terms, with the strength of these terms dependent on the fate of eroded soil organic carbon (SOC) and the extent to which SOC oxidation and production characteristics change with erosion. In spite of the large SOC stocks in the boreal system and the high probability that boreal soil profiles will experience enhanced erosion in the coming decades, no one has estimated the influence of boreal erosion on the atmospheric C budget, a phenomenon that can serve as a positive or negative feedback to climate. We employed an interactive erosion model that permits the user to define 1) profile characteristics, 2) the erosion rate, and 3) the extent to which each soil layer at an eroding site retains its pre-erosion SOC oxidation and production rates (nox and nprod=0, respectively) vs. adopts the oxidation and production rates of previous, non-eroded soil layers (nox and nprod=1, respectively). We parameterized the model using soil profile characteristics observed at a recently burned site in interior Alaska (Hess Creek), defining SOC content and turnover times. We computed the degree to which post-burn erosion of mineral soil generates an atmospheric C sink or source while varying erosion rates and assigning multiple values of nox and nprod between 0 and 1, providing insight into the influence of erosion rate, SOC oxidation, and SOC production on C dynamics in this and similar profiles. 
Varying nox and nprod did not induce meaningful changes in model estimates of atmospheric C source or sink strength, likely due to the low turnover rate of SOC in this system. However, variation in mineral soil erosion rates induced large shifts in the source and sink strengths for atmospheric C; after 50 y of mineral soil erosion at 5 cm y^-1, we observed a maximum C source of 35 kg C m^-2 and negligible sink strength. Doubling the erosion rate approximately doubled the source strength. Scaling these estimates to the region requires estimates of the area undergoing mineral soil erosion in forests similar to those modeled. We suggest that erosion is an important but little studied feature of fire-driven boreal systems that will influence atmospheric CO2 budgets.

  1. Time-frequency approach to underdetermined blind source separation.

    PubMed

    Xie, Shengli; Yang, Liu; Yang, Jun-Mei; Zhou, Guoxu; Xiang, Yong

    2012-02-01

    This paper presents a new time-frequency (TF) underdetermined blind source separation approach based on the Wigner-Ville distribution (WVD) and the Khatri-Rao product to separate N non-stationary sources from M (M < N) mixtures. First, an improved method is proposed for estimating the mixing matrix, where the negative value of the auto WVD of the sources is fully considered. Then, after extracting all the auto-term TF points, the auto WVD value of the sources at every auto-term TF point can be found exactly with the proposed approach no matter how many active sources there are, as long as N ≤ 2M-1. Further discussion about the extraction of auto-term TF points is made, and finally the numerical simulation results are presented to show the superiority of the proposed algorithm by comparing it with the existing ones.
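
    A minimal discrete Wigner-Ville distribution, the TF representation underlying this approach, can be sketched as follows (this is a generic textbook pseudo-WVD, not the paper's estimator; for a single analytic tone the auto-term concentrates along its frequency):

```python
import numpy as np

def wvd(x):
    """Discrete Wigner-Ville distribution of an analytic signal x.
    W[n, k] is the energy at time n and frequency k/(2N) cycles/sample."""
    N = len(x)
    W = np.zeros((N, N))
    for n in range(N):
        taumax = min(n, N - 1 - n)              # largest symmetric lag at time n
        m = np.arange(-taumax, taumax + 1)
        kernel = np.zeros(N, dtype=complex)
        kernel[m % N] = x[n + m] * np.conj(x[n - m])  # instantaneous autocorrelation
        W[n] = np.fft.fft(kernel).real          # Hermitian in m, so the FFT is real
    return W

# Single analytic tone at f0 = 0.125 cycles/sample: auto-term sits at bin 2*f0*N.
N = 64
x = np.exp(2j * np.pi * 0.125 * np.arange(N))
W = wvd(x)
```

For multicomponent mixtures the quadratic kernel also produces cross-terms, which is why the method must identify genuine auto-term TF points before the source values can be recovered there.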

  2. PAHs and PCBs in an Eastern Mediterranean megacity, Istanbul: Their spatial and temporal distributions, air-soil exchange and toxicological effects.

    PubMed

    Cetin, Banu; Ozturk, Fatma; Keles, Melek; Yurdakul, Sema

    2017-01-01

    Istanbul, one of the mega cities in the world, located between Asia and Europe, has suffered from severe air pollution problems due to rapid population growth, traffic and industry. Atmospheric levels of PAHs and PCBs were investigated in Istanbul at 22 sampling sites during four different sampling periods using PUF disk passive air samplers, and the spatial and temporal variations of these chemicals were determined. Soil samples were also taken at the air sampling sites. At all sites, the average ambient air Σ15-PAH and Σ41-PCB concentrations were found to be 85.6 ± 68.3 ng m^-3 and 246 ± 122 pg m^-3, respectively. Phenanthrene and anthracene were the predominant PAHs, and low molecular weight congeners dominated the PCBs. The PAH concentrations were higher especially at urban sites close to highways. However, the PCBs showed moderately uniform spatial variations. Except at four sites, PAH concentrations increased with decreasing temperature during the sampling period, indicating the contributions of combustion sources for residential heating, while PCB concentrations mostly increased with temperature, probably due to enhanced volatilization from their sources at higher temperatures. The results of the Factor Analysis represented the impact of traffic, petroleum, coal/biomass and natural gas combustion and medical waste incineration plants on ambient air concentrations. A similar spatial distribution trend was observed in the soil samples. Fugacity ratio results indicated that the source/sink tendency of soil for PAHs and PCBs depends on their volatility and temperature; soil generally acts as a source for lighter PAHs and PCBs, particularly at higher temperatures, while atmospheric deposition is a main source for higher molecular weight compounds in local soils. Toxicological effect studies also revealed the severity of air and soil pollution, especially in terms of PAHs, in Istanbul. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. A study on the uncertainty based on Meteorological fields on Source-receptor Relationships for Total Nitrate in the Northeast Asia

    NASA Astrophysics Data System (ADS)

    Sunwoo, Y.; Park, J.; Kim, S.; Ma, Y.; Chang, I.

    2010-12-01

    Northeast Asia hosts more than one third of the world's population, and its pollutant emissions tend to increase rapidly because of economic growth and rising energy-intensive consumption. The emission and transport characteristics of air pollutants in particular have become a national issue, in terms not only of environmental impacts but also of long-range transboundary transport. Meteorologically, the region lies within the westerlies, so air pollutants emitted from China can be delivered to South Korea; considering meteorological factors is therefore important for understanding air pollution phenomena. In this study, we used MM5 (Fifth-Generation Mesoscale Model) and WRF (Weather Research and Forecasting Model) to produce the meteorological fields. We analyzed the features of the physics options in each model and the differences arising from the characteristics of WRF and MM5, aiming to quantify the uncertainty of source-receptor relationships for total nitrate in Northeast Asia according to the meteorological fields. Each set of meteorological fields was produced with the same domain, the same initial and boundary conditions, and the most closely matching physics options. S-R relationships in terms of amounts and fractions of total nitrate (the sum of N from HNO3, nitrate and PAN) were calculated by EMEP method 3.
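    The source-receptor bookkeeping described above can be illustrated with a toy contribution matrix; the region names and numbers are hypothetical placeholders, not the study's results or the EMEP method itself.

```python
import numpy as np

# Toy source-receptor (S-R) matrix for total nitrate: entry [i, j] is the
# deposition at receptor i attributable to emissions from source region j.
# Regions and values are hypothetical placeholders.
sources = ["CHN", "KOR", "JPN"]
receptors = ["KOR", "JPN"]
sr_amount = np.array([
    [40.0, 25.0, 5.0],   # deposition at KOR from each source (kt N)
    [30.0, 10.0, 20.0],  # deposition at JPN from each source (kt N)
])

# Fractional S-R relationship: normalize each receptor row to sum to 1,
# giving the share of each source region in that receptor's total.
sr_fraction = sr_amount / sr_amount.sum(axis=1, keepdims=True)

for i, rec in enumerate(receptors):
    for j, src in enumerate(sources):
        print(f"{src} -> {rec}: {sr_fraction[i, j]:.2f}")
```

Comparing `sr_fraction` matrices computed from runs driven by different meteorological fields (e.g. MM5 vs. WRF) is one simple way to express the uncertainty the abstract refers to.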

  4. Hydrodynamic model for expansion and collisional relaxation of x-ray laser-excited multi-component nanoplasma

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saxena, Vikrant, E-mail: vikrant.saxena@desy.de; Hamburg Center for Ultrafast Imaging, Luruper Chaussee 149, 22761 Hamburg; Ziaja, Beata, E-mail: ziaja@mail.desy.de

    The irradiation of an atomic cluster with a femtosecond x-ray free-electron laser pulse results in the formation of a nanoplasma, typically within a few hundred femtoseconds. By this time the x-ray pulse is over, the direct photoinduced processes no longer contribute, and all electrons created within the nanoplasma are thermalized. The nanoplasma thus formed is a mixture of atoms, electrons, and ions of various charges; while expanding, it undergoes electron impact ionization and three-body recombination. Below we present a hydrodynamic model describing the dynamics of such multi-component nanoplasmas. The model equations are derived by taking the moments of the corresponding Boltzmann kinetic equations. We include the equations obtained, together with the source terms due to electron impact ionization and three-body recombination, in our hydrodynamic solver. Model predictions are obtained for a test case, an expanding spherical Ar nanoplasma. With this model, we complete the two-step approach to simulating x-ray-created nanoplasmas, enabling computationally efficient simulations of their picosecond dynamics. Moreover, the hydrodynamic framework including collisional processes can easily be extended with other source terms and then applied to follow the relaxation of any finite non-isothermal multi-component nanoplasma whose components have relaxed into local thermodynamic equilibrium.
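    Schematically, the zeroth-moment (continuity) equations of such a model carry the collisional source terms in the following generic form, written here for the ion density $n_q$ of charge state $q$ (a sketch of the standard structure, not the paper's exact notation):

```latex
\frac{\partial n_q}{\partial t} + \nabla\cdot\left(n_q\,\mathbf{u}_q\right)
  = S^{\mathrm{ion}}_{q-1\to q} - S^{\mathrm{ion}}_{q\to q+1}
  + S^{\mathrm{rec}}_{q+1\to q} - S^{\mathrm{rec}}_{q\to q-1},
```

where $S^{\mathrm{ion}}_{q\to q+1}$ is the electron-impact ionization rate out of charge state $q$ and $S^{\mathrm{rec}}_{q+1\to q}$ the corresponding three-body recombination rate. Summing over $q$, the total heavy-particle density is conserved, while the electron density changes by the net ionization balance.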

  5. Dorsal Augmentation with Homologous Rib.

    PubMed

    Kridel, Russell W H; Sturm, Angela K

    2017-04-01

    Dorsal augmentation grafts are used to reconstruct and raise the nasal dorsum in patients with dorsal saddling due to trauma, infection, or previous nasal surgery, as well as in patients with a narrow, congenitally low, and/or wide dorsum. Alloplastic implants and various biomaterials are available for grafting, each with advantages and disadvantages. Although autologous septal cartilage is a preferable and often convenient source of cartilage, it is frequently not sufficient for large-volume dorsal augmentation, nor is it available in patients who have had septoplasty, infection, previous rhinoplasty with grafting, or significant trauma. Ear cartilage may be used, but it is difficult to make homogeneous and smooth, and dorsal irregularities can be seen in the long term, especially in thin-skinned patients. For these reasons, we frequently use irradiated costal cartilage from tissue banks as our grafting source, thereby eliminating the morbidity of harvesting the patient's own rib. Proper surgical techniques, the use of antibiotics, and proper sculpting and placement of the cartilage limit complications such as warping, resorption, infection, and extrusion. Irradiated homograft costal cartilage grafts have been used successfully in large numbers of patients with long-term follow-up and low complication rates, and serve as a welcome alternative to harvesting a patient's rib cartilage.

  6. Long-term influence of asteroids on planet longitudes and chaotic dynamics of the solar system

    NASA Astrophysics Data System (ADS)

    Woillez, E.; Bouchet, F.

    2017-11-01

    Over timescales much longer than an orbital period, the solar system exhibits large-scale chaotic behavior and can thus be viewed as a stochastic dynamical system. The aim of the present paper is to compare different sources of stochasticity in the solar system. More precisely, we studied the importance of the long-term influence of asteroids on the chaotic dynamics of the solar system. We show that the effect of asteroids on the planets is similar to a white-noise process when considered on a timescale much larger than the correlation time τϕ ≃ 10⁴ yr of asteroid trajectories. We computed the timescale τe after which the stochastic evolution of the asteroids leads to a loss of information on the initial conditions of the perturbed Laplace-Lagrange secular dynamics. The order of magnitude of this timescale is precisely determined by theoretical arguments, and we find that τe ≃ 10⁴ Myr. Although comparable to the full main-sequence lifetime of the Sun, this timescale is considerably longer than the Lyapunov time τI ≃ 10 Myr of the solar system without asteroids. This shows that external sources of chaos arise as a small perturbation in the stochastic secular behavior of the solar system, which is instead dominated by intrinsic chaos.

  7. Multi-criteria analysis for PM10 planning

    NASA Astrophysics Data System (ADS)

    Pisoni, Enrico; Carnevale, Claudio; Volta, Marialuisa

    To implement sound air quality policies, regulatory agencies require tools to evaluate the outcomes and costs associated with different emission reduction strategies. These tools are even more useful for atmospheric PM10 concentrations, due to the complex nonlinear processes that govern production and accumulation of the secondary fraction of this pollutant. The approaches presented in the literature (integrated assessment modeling) are mainly cost-benefit and cost-effectiveness analyses. In this work, the formulation of a multi-objective problem to control particulate matter is proposed. The methodology defines: (a) the control objectives (the air quality indicator and the emission reduction cost functions); (b) the decision variables (precursor emission reductions); (c) the problem constraints (maximum feasible technology reductions). The cause-effect relations between air quality indicators and decision variables are identified by tuning nonlinear source-receptor models. The multi-objective problem solution provides the decision maker with a set of non-dominated scenarios representing the efficient trade-off between the air quality benefit and the internal costs (emission reduction technology costs). The methodology has been implemented for Northern Italy, an area often affected by high long-term exposure to PM10. The source-receptor models used in the multi-objective analysis are identified by processing long-term simulations of the GAMES multiphase modeling system, performed in the framework of the CAFE-Citydelta project.
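    A bi-objective problem of this kind can be sketched with a weighted-sum scan over a single decision variable; both surrogate objective functions below are hypothetical stand-ins, not the identified GAMES-based source-receptor models.

```python
import numpy as np

# Toy bi-objective emission-control problem. x in [0, 0.8] is the precursor
# emission reduction (decision variable, capped by the maximum feasible
# technology reduction). Both objectives are hypothetical surrogates.

def air_quality_indicator(x):
    # Nonlinear surrogate: the PM10 indicator falls with reductions,
    # with diminishing returns (secondary-aerosol nonlinearity).
    return 50.0 * np.exp(-2.0 * x)

def reduction_cost(x):
    # Convex surrogate: marginal abatement cost grows with reduction level.
    return 100.0 * x**2

# A weighted-sum scan over w traces out a set of non-dominated solutions.
grid = np.linspace(0.0, 0.8, 801)  # 0.8 = maximum feasible reduction
pareto = []
for w in np.linspace(0.0, 1.0, 11):
    obj = w * air_quality_indicator(grid) + (1 - w) * reduction_cost(grid)
    x_best = grid[np.argmin(obj)]
    pareto.append((x_best, air_quality_indicator(x_best), reduction_cost(x_best)))

for x, aq, cost in pareto:
    print(f"reduction={x:.2f}  PM10 indicator={aq:5.1f}  cost={cost:5.1f}")
```

Note that a weighted-sum scan only recovers the convex part of the trade-off curve; genuinely multi-objective solvers (e.g. evolutionary methods) are needed when the Pareto front is nonconvex.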

  8. Rapid discrimination of Herba Cistanches by multi-step infrared macro-fingerprinting combined with soft independent modeling of class analogy (SIMCA)

    NASA Astrophysics Data System (ADS)

    Xu, Changhua; Jia, Xiaoguang; Xu, Rong; Wang, Yang; Zhou, Qun; Sun, Suqin

    2013-10-01

    Herba Cistanche, an important Chinese herbal medicine, comprises four common species, Cistanche deserticola (CD), Cistanche tubulosa (CT), Cistanche salsa (CS) and Cistanche sinensis (CSN), which have frequently been used interchangeably. To clarify the sources of Herba Cistanches and ensure clinical efficacy and safety, a method combining IR macro-fingerprinting with statistical pattern recognition was developed to analyze and discriminate the four species of Herba Cistanche. By comparing FT-IR and second-derivative spectral fingerprints via group-peak matching, the similarity to CD in terms of total saccharides (TS) followed an increasing sequence, CT < CSN < CS < CD, whereas that in terms of total glycosides (TG) followed a decreasing order, CT > CSN > CS > CD. Characteristic fingerprints of their 2D-IR correlation spectra in the 1750-1000 cm-1 region confirmed the above findings in a more intuitive way. As a source of phenylethanoid glycosides (PhGs), CT can be an ideal alternative species. However, for use as a whole herb, more pharmacological study should be conducted due to the different ratios of their chemical constituents, which also applies to CSN and CS. Moreover, the four species (179 samples) were objectively classified by SIMCA based on the IR macro-fingerprints.
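    The SIMCA idea, fitting a separate principal-component model per class and assigning an unknown sample to the class whose subspace reconstructs it best, can be sketched with synthetic "spectra"; this is an illustration of the technique, not the paper's model or data.

```python
import numpy as np

# Minimal SIMCA-style classifier: one PCA model per class, classification
# by smallest reconstruction residual. Synthetic vectors stand in for the
# IR fingerprints; class labels "CD"/"CT" are borrowed only as names.
rng = np.random.default_rng(0)

def fit_pca(X, n_components):
    mean = X.mean(axis=0)
    _, _, vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, vt[:n_components]          # class mean + loading vectors

def residual(x, model):
    mean, loadings = model
    centered = x - mean
    recon = loadings.T @ (loadings @ centered)  # projection onto the subspace
    return np.linalg.norm(centered - recon)

# Two synthetic classes with different underlying "spectra".
base_a, base_b = rng.normal(size=50), rng.normal(size=50)
class_a = base_a + 0.05 * rng.normal(size=(20, 50))
class_b = base_b + 0.05 * rng.normal(size=(20, 50))

models = {"CD": fit_pca(class_a, 2), "CT": fit_pca(class_b, 2)}
query = base_a + 0.05 * rng.normal(size=50)   # unknown sample from class A
label = min(models, key=lambda k: residual(query, models[k]))
print(label)  # -> CD
```

Full SIMCA additionally sets a statistical acceptance threshold per class, so a sample can be rejected by every class ("soft" modeling) rather than forced into the nearest one.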

  9. Effects of air pollution on human health and practical measures for prevention in Iran

    PubMed Central

    Ghorani-Azam, Adel; Riahi-Zanjani, Bamdad; Balali-Mood, Mahdi

    2016-01-01

    Air pollution is a major concern of the modern civilized world, with serious toxicological impacts on human health and the environment. It has a number of different emission sources, but motor vehicles and industrial processes contribute the major part of air pollution. According to the World Health Organization, the six major air pollutants are particle pollution, ground-level ozone, carbon monoxide, sulfur oxides, nitrogen oxides, and lead. Long- and short-term exposure to air-suspended toxicants has a range of toxicological impacts on humans, including respiratory and cardiovascular diseases, neuropsychiatric complications, eye irritation, skin diseases, and long-term chronic diseases such as cancer. Several reports have revealed a direct association between exposure to poor air quality and increasing rates of morbidity and mortality, mostly due to cardiovascular and respiratory diseases. Air pollution is considered the major environmental risk factor in the incidence and progression of diseases such as asthma, lung cancer, ventricular hypertrophy, Alzheimer's and Parkinson's diseases, psychological complications, autism, and retinopathy, as well as in impaired fetal growth and low birth weight. In this review article, we discuss the toxicology of major air pollutants, their sources of emission, and their impact on human health. We also propose practical measures to reduce air pollution in Iran. PMID:27904610

  10. Effects of air pollution on human health and practical measures for prevention in Iran.

    PubMed

    Ghorani-Azam, Adel; Riahi-Zanjani, Bamdad; Balali-Mood, Mahdi

    2016-01-01

    Air pollution is a major concern of the modern civilized world, with serious toxicological impacts on human health and the environment. It has a number of different emission sources, but motor vehicles and industrial processes contribute the major part of air pollution. According to the World Health Organization, the six major air pollutants are particle pollution, ground-level ozone, carbon monoxide, sulfur oxides, nitrogen oxides, and lead. Long- and short-term exposure to air-suspended toxicants has a range of toxicological impacts on humans, including respiratory and cardiovascular diseases, neuropsychiatric complications, eye irritation, skin diseases, and long-term chronic diseases such as cancer. Several reports have revealed a direct association between exposure to poor air quality and increasing rates of morbidity and mortality, mostly due to cardiovascular and respiratory diseases. Air pollution is considered the major environmental risk factor in the incidence and progression of diseases such as asthma, lung cancer, ventricular hypertrophy, Alzheimer's and Parkinson's diseases, psychological complications, autism, and retinopathy, as well as in impaired fetal growth and low birth weight. In this review article, we discuss the toxicology of major air pollutants, their sources of emission, and their impact on human health. We also propose practical measures to reduce air pollution in Iran.

  11. 26 CFR 31.3401(a)(14)-1 - Group-term life insurance.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 26 Internal Revenue 15 2010-04-01 2010-04-01 false Group-term life insurance. 31.3401(a)(14)-1... SOURCE Collection of Income Tax at Source § 31.3401(a)(14)-1 Group-term life insurance. (a) The cost of group-term life insurance on the life of an employee is excepted from wages, and hence is not subject to...

  12. 26 CFR 31.3401(a)(14)-1 - Group-term life insurance.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 26 Internal Revenue 15 2011-04-01 2011-04-01 false Group-term life insurance. 31.3401(a)(14)-1... SOURCE Collection of Income Tax at Source § 31.3401(a)(14)-1 Group-term life insurance. (a) The cost of group-term life insurance on the life of an employee is excepted from wages, and hence is not subject to...

  13. 26 CFR 31.3401(a)(14)-1 - Group-term life insurance.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 26 Internal Revenue 15 2012-04-01 2012-04-01 false Group-term life insurance. 31.3401(a)(14)-1... SOURCE Collection of Income Tax at Source § 31.3401(a)(14)-1 Group-term life insurance. (a) The cost of group-term life insurance on the life of an employee is excepted from wages, and hence is not subject to...

  14. 26 CFR 31.3401(a)(14)-1 - Group-term life insurance.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 26 Internal Revenue 15 2014-04-01 2014-04-01 false Group-term life insurance. 31.3401(a)(14)-1... SOURCE Collection of Income Tax at Source § 31.3401(a)(14)-1 Group-term life insurance. (a) The cost of group-term life insurance on the life of an employee is excepted from wages, and hence is not subject to...

  15. 26 CFR 31.3401(a)(14)-1 - Group-term life insurance.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 26 Internal Revenue 15 2013-04-01 2013-04-01 false Group-term life insurance. 31.3401(a)(14)-1... SOURCE Collection of Income Tax at Source § 31.3401(a)(14)-1 Group-term life insurance. (a) The cost of group-term life insurance on the life of an employee is excepted from wages, and hence is not subject to...

  16. Global Economic Impact of Dental Diseases.

    PubMed

    Listl, S; Galloway, J; Mossey, P A; Marcenes, W

    2015-10-01

    Reporting the economic burden of oral diseases is important to evaluate the societal relevance of preventing and addressing oral diseases. In addition to treatment costs, there are indirect costs to consider, mainly in terms of productivity losses due to absenteeism from work. The purpose of the present study was to estimate the direct and indirect costs of dental diseases worldwide to approximate the global economic impact. Estimation of direct treatment costs was based on a systematic approach. For estimation of indirect costs, an approach suggested by the World Health Organization's Commission on Macroeconomics and Health was employed, which factored in 2010 values of gross domestic product per capita as provided by the International Monetary Fund and oral burden of disease estimates from the 2010 Global Burden of Disease Study. Direct treatment costs due to dental diseases worldwide were estimated at US$298 billion yearly, corresponding to an average of 4.6% of global health expenditure. Indirect costs due to dental diseases worldwide amounted to US$144 billion yearly, corresponding to economic losses within the range of the 10 most frequent global causes of death. Within the limitations of currently available data sources and methodologies, these findings suggest that the global economic impact of dental diseases amounted to US$442 billion in 2010. Improvements in population oral health may imply substantial economic benefits not only in terms of reduced treatment costs but also because of fewer productivity losses in the labor market. © International & American Associations for Dental Research 2015.

  17. Quantitative determination of vinpocetine in dietary supplements

    PubMed Central

    French, John M. T.; King, Matthew D.

    2017-01-01

    Current United States regulatory policies allow for the addition of pharmacologically active substances in dietary supplements if derived from a botanical source. The inclusion of certain nootropic drugs, such as vinpocetine, in dietary supplements has recently come under scrutiny due to the lack of defined dosage parameters and yet unproven short- and long-term benefits and risks to human health. This study quantified the concentration of vinpocetine in several commercially available dietary supplements and found that a highly variable range of 0.6–5.1 mg/serving was present across the tested products, with most products providing no specification of vinpocetine concentrations. PMID:27319129

  18. Re-Evaluating Satellite Solar Power Systems for Earth

    NASA Technical Reports Server (NTRS)

    Landis, Geoffrey A.

    2006-01-01

    The Solar Power Satellite System is a concept to collect solar power in space and then transport it to the surface of the Earth by microwave (or possibly laser) beam, where it is converted into electrical power for terrestrial use. The recent increase in energy costs, predictions of the near-term exhaustion of oil, and the prominence of possible climate change due to the "greenhouse effect" from the burning of fossil fuels have again brought alternative energy sources to public attention, and the time is certainly appropriate to reexamine the economics of space-based power. Several new concepts for Satellite Power System designs were evaluated to make the concept more economically feasible.

  19. Cosmic matter-antimatter asymmetry and gravitational force

    NASA Technical Reports Server (NTRS)

    Hsu, J. P.

    1980-01-01

    Cosmic matter-antimatter asymmetry due to the gravitational interaction alone is discussed, considering the gravitational coupling of fermion matter related to the Yang-Mills (1954) gauge symmetry with the unique generalization of the four-dimensional Poincare group. Attention is given to the case of weak static fields which determines the space-time metric where only large source terms are retained. In addition, considering lowest-order Feynman diagrams, there are presented gravitational potential energies between fermions, between antifermions, and between a fermion and an antifermion. It is concluded that the gravitational force between matter is different from that between antimatter; implications from this concerning the evolution of the universe are discussed.

  20. Long-Term (2002-2015) Changes in Mercury Contamination in NE Brazil Depicted by the Mangrove Oyster Crassostraea rhizophorae (Guilding, 1828).

    PubMed

    Rios, J H L; Marins, R V; Oliveira, K F; Lacerda, L D

    2016-10-01

    Mercury concentrations in oysters from four estuaries in northeastern Brazil varied following source changes during the past 13 years. Concentrations were higher in urban estuaries relative to rural areas, but decreased over the 13-year interval following improvements in solid waste disposal and sewage treatment. Of the rural estuaries, the one located in an environmental protection area showed no change in Hg concentrations over the period. In the Jaguaribe estuary, however, remobilization from soils and sediments due to regional environmental changes increased Hg concentrations in oysters to values similar to those at the most contaminated metropolitan sites.

  1. Comparison of Fully-Compressible Equation Sets for Atmospheric Dynamics

    NASA Technical Reports Server (NTRS)

    Ahmad, Nashat N.

    2016-01-01

    Traditionally, the equation for the conservation of energy used in atmospheric models is based on potential temperature and is used in place of total energy conservation. This paper compares the application of the two equation sets for both the Euler and the Navier-Stokes solutions using several benchmark test cases. A high-resolution wave-propagation method which accurately accounts for the source term due to gravity is used for computing the non-hydrostatic atmospheric flows. It is demonstrated that there is little to no difference between the results obtained using the two different equation sets for the Euler as well as the Navier-Stokes solutions.
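    For reference, the total-energy form of the equations carries the gravity source term in standard conservation form as follows (a textbook sketch, not reproduced from the paper):

```latex
\begin{aligned}
\partial_t \rho + \nabla\cdot(\rho\mathbf{u}) &= 0,\\
\partial_t(\rho\mathbf{u}) + \nabla\cdot\left(\rho\,\mathbf{u}\otimes\mathbf{u} + p\,\mathbf{I}\right) &= -\rho g\,\hat{\mathbf{z}},\\
\partial_t E + \nabla\cdot\left[(E+p)\,\mathbf{u}\right] &= -\rho g\, w,
\end{aligned}
```

with $E = p/(\gamma-1) + \tfrac{1}{2}\rho|\mathbf{u}|^2$ and $w$ the vertical velocity component. In the potential-temperature alternative, the energy equation is replaced by $\partial_t(\rho\theta) + \nabla\cdot(\rho\theta\mathbf{u}) = 0$, which carries no gravity source term; the comparison in the paper is between these two formulations.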

  2. Environmental Assessment: Construct POV Parking Lot Main Gate-Separated at Grand Forks AFB, North Dakota

    DTIC Science & Technology

    2004-09-01

    water quality could be degraded, both in the short term during actual construction and over the long term due to reduced storm water quality caused by the increase of paved area. The short-term effects come from possible erosion contributing to...

  3. Long term performance stability of silicon sensors

    NASA Astrophysics Data System (ADS)

    Mori, R.; Betancourt, C.; Kühn, S.; Hauser, M.; Messmer, I.; Hasenfratz, A.; Thomas, M.; Lohwasser, K.; Parzefall, U.; Jakobs, K.

    2015-10-01

    The HL-LHC investigations of silicon particle sensor performance are carried out with the intention of reproducing the harsh environments foreseen, but usually in individual short measurements. Recently, several groups have observed a decrease in the charge collection of silicon strip sensors after several days, in particular in sensors showing charge multiplication. This phenomenon has been explained as a surface effect: an increase of charge sharing due to the buildup of positive charge in the silicon oxide coming from the radioactive source used for charge collection measurements. Observing a similar behaviour in other sensors for which we can exclude this surface effect, we propose and investigate alternative explanations, namely trapping-related effects (change of polarization) and annealing-related effects. Several n-on-p strip sensors, as-processed and irradiated with protons and neutrons up to 5 × 10¹⁵ neq/cm², have been subjected to charge collection efficiency measurements for several days, while parameters such as the impedance have been monitored. The presumed stressing conditions were varied in an attempt to recover the collected charge in case of a decrease. The results show that for the investigated sensors the effect of charge sharing induced by a radioactive source is not important; the main detrimental factor is very high voltage, while at lower voltages the performance is stable.

  4. Three Dimensional Vapor Intrusion Modeling: Model Validation and Uncertainty Analysis

    NASA Astrophysics Data System (ADS)

    Akbariyeh, S.; Patterson, B.; Rakoczy, A.; Li, Y.

    2013-12-01

    Volatile organic chemicals (VOCs), such as chlorinated solvents and petroleum hydrocarbons, are prevalent groundwater contaminants due to their improper disposal and accidental spillage. In addition to contaminating groundwater, VOCs may partition into the overlying vadose zone and enter buildings through gaps and cracks in foundation slabs or basement walls, a process termed vapor intrusion. Vapor intrusion of VOCs has been recognized as a detrimental source for human exposures to potential carcinogenic or toxic compounds. The simulation of vapor intrusion from a subsurface source has been the focus of many studies to better understand the process and guide field investigation. While multiple analytical and numerical models were developed to simulate the vapor intrusion process, detailed validation of these models against well controlled experiments is still lacking, due to the complexity and uncertainties associated with site characterization and soil gas flux and indoor air concentration measurement. In this work, we present an effort to validate a three-dimensional vapor intrusion model based on a well-controlled experimental quantification of the vapor intrusion pathways into a slab-on-ground building under varying environmental conditions. Finally, a probabilistic approach based on Monte Carlo simulations is implemented to determine the probability distribution of indoor air concentration based on the most uncertain input parameters.
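    The probabilistic Monte Carlo step described above can be sketched as follows; the attenuation-factor model and all parameter distributions here are hypothetical placeholders, not the validated 3-D model from the study.

```python
import numpy as np

# Monte Carlo sketch of indoor-air concentration uncertainty for vapor
# intrusion. The indoor concentration is modeled as the subsurface source
# concentration times an uncertain attenuation factor alpha, sampled from
# an assumed lognormal distribution (illustrative values only).
rng = np.random.default_rng(42)
n = 100_000

c_source = 100.0                                         # source vapor conc., ug/m3
alpha = 10 ** rng.normal(loc=-3.0, scale=0.5, size=n)    # lognormal attenuation factor
c_indoor = c_source * alpha

p5, p50, p95 = np.percentile(c_indoor, [5, 50, 95])
print(f"indoor air conc. (ug/m3): median={p50:.3f}, "
      f"90% interval=[{p5:.3f}, {p95:.3f}]")
```

In a real study the sampled inputs would be the most uncertain physical parameters (soil permeability, crack area, pressure differential), propagated through the full transport model rather than a single lumped factor.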

  5. 12 CFR 201.4 - Availability and terms of credit.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... overnight, as a backup source of funding to a depository institution that is in generally sound financial... to a few weeks as a backup source of funding to a depository institution if, in the judgment of the... very short-term basis, usually overnight, as a backup source of funding to a depository institution...

  6. Attenuation Tomography of Northern California and the Yellow Sea / Korean Peninsula from Coda-source Normalized and Direct Lg Amplitudes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ford, S R; Dreger, D S; Phillips, W S

    2008-07-16

    Inversions for regional attenuation (1/Q) of Lg are performed in two different regions. The path attenuation component of the Lg spectrum is isolated using the coda-source normalization method, which corrects the Lg spectral amplitude for the source using the stable, coda-derived source spectra. Tomographic images of Northern California agree well with one-dimensional (1-D) Lg Q estimated from five different methods. We note there is some tendency for tomographic smoothing to increase Q relative to targeted 1-D methods; for example, in the San Francisco Bay Area, which shows high attenuation relative to the rest of its region, Q is over-estimated by approximately 30. Coda-source normalized attenuation tomography is also carried out for the Yellow Sea/Korean Peninsula (YSKP), where output parameters (site, source, and path terms) are compared with those from the amplitude tomography method of Phillips et al. (2005) as well as a new method that ties the source term to the MDAC formulation (Walter and Taylor, 2001). The source terms show similar scatter between the coda-source-corrected and MDAC source perturbation methods, whereas the amplitude method has the greatest correlation with estimated true source magnitude. The coda-source spectra better represent the source than the estimated magnitude does, which could be the cause of the scatter. The similarity in the source terms between the coda-source and MDAC-linked methods shows that the latter may approximate the effect of the former, and therefore could be useful in regions without coda-derived sources. The site terms from the MDAC-linked method correlate slightly with global Vs30 measurements. While the coda-source and amplitude-ratio methods do not correlate with Vs30 measurements, they do correlate with one another, which provides confidence that the two methods are consistent. The path Q⁻¹ values are very similar between the coda-source and amplitude-ratio methods except for small differences in the Daxing'anling Mountains, in the northern YSKP. However, there is one large difference between the MDAC-linked method and the others in the region near stations TJN and INCN, which points to site effects as the cause of the difference.

  7. Green materials for sustainable development

    NASA Astrophysics Data System (ADS)

    Purwasasmita, B. S.

    2017-03-01

    Sustainable development is an integrated, multidisciplinary concept combining ecological, social and economic aspects to construct a livable human living system. Sustainable development can be supported through the development of green materials. Green materials offer unique characteristics and properties: they are abundant in nature, less toxic, economically affordable and versatile in terms of physical and chemical properties. Green materials can be applied in numerous fields of science and technology, including energy; building, construction and infrastructure; materials science and engineering; and pollution management and technology. For instance, green materials can be developed as a source for energy production: biomass-based sources can be developed for biodiesel and bioethanol production. Biomass-based materials can also be transformed into advanced functionalized materials for bio-applications, such as the transformation of chitin into chitosan, which is further used in biomedicine, biomaterials and tissue engineering. Recently, cellulose-based and lignocellulose-based materials as sources for developing functional materials have attracted interest for biomaterials, reinforcing materials and nanotechnology. Furthermore, the development of pigment materials from green sources has gained interest due to their unique properties. Finally, Indonesia, a large country with great biodiversity, can advance the development of green materials to strengthen national competitiveness and develop materials technology for the future.

  8. Assessing Pyrite-Derived Sulfate in the Mississippi River with Four Years of Sulfur and Triple-Oxygen Isotope Data.

    PubMed

    Killingsworth, Bryan A; Bao, Huiming; Kohl, Issaku E

    2018-05-17

    Riverine dissolved sulfate (SO₄²⁻) sulfur and oxygen isotope variations reflect their controls, such as SO₄²⁻ reduction and reoxidation, and source mixing. However, unconstrained temporal variability of riverine SO₄²⁻ isotope compositions due to short sampling durations may lead to mischaracterization of SO₄²⁻ sources, particularly for the pyrite-derived sulfate load. We measured the sulfur and triple-oxygen isotopes (δ³⁴S, δ¹⁸O, and Δ′¹⁷O) of Mississippi River SO₄²⁻ with biweekly sampling between 2009 and 2013 to test isotopic variability and constrain sources. Sulfate δ³⁴S and δ¹⁸O ranged from -6.3‰ to -0.2‰ and -3.6‰ to +8.8‰, respectively. Our sampling period captured the most severe flooding and drought in the Mississippi River basin since 1927 and 1956, respectively, and a first year of sampling that was unrepresentative of long-term average SO₄²⁻. The δ³⁴S data indicate pyrite-derived SO₄²⁻ sources make up 74 ± 10% of the Mississippi River sulfate budget. Furthermore, pyrite oxidation is implicated as the dominant process supplying SO₄²⁻ to the Mississippi River, whereas the Δ′¹⁷O data show that 18 ± 9% of the oxygen in this sulfate is sourced from air O₂.
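    A source-apportionment estimate of this kind can be sketched as a two-endmember isotope mixing calculation; the endmember values below are assumed for illustration and are not the study's values.

```python
# Two-endmember mixing sketch for apportioning riverine sulfate from d34S.
# f_pyrite is the mass fraction of the isotopically light, pyrite-derived
# endmember in a two-source mixture. Endmember values are assumptions.

def endmember_fraction(d34s_mix: float, d34s_pyrite: float, d34s_other: float) -> float:
    """Mass fraction of the pyrite endmember in a two-source mixture."""
    return (d34s_mix - d34s_other) / (d34s_pyrite - d34s_other)

# Hypothetical endmembers (permil): light pyrite-derived sulfate vs. a
# heavier evaporite/atmospheric endmember.
f_pyrite = endmember_fraction(d34s_mix=-3.0, d34s_pyrite=-6.0, d34s_other=+5.0)
print(f"pyrite-derived fraction: {f_pyrite:.2f}")  # -> 0.73
```

Real apportionments like the one in the abstract propagate the uncertainty of both the measured river values and the assumed endmembers, which is how a range such as 74 ± 10% arises.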

  9. Receptivity of the compressible mixing layer

    NASA Astrophysics Data System (ADS)

    Barone, Matthew F.; Lele, Sanjiva K.

    2005-09-01

    Receptivity of compressible mixing layers to general source distributions is examined by a combined theoretical/computational approach. The properties of solutions to the adjoint Navier-Stokes equations are exploited to derive expressions for receptivity in terms of the local value of the adjoint solution. The result is a description of receptivity for arbitrary small-amplitude mass, momentum, and heat sources in the vicinity of a mixing-layer flow, including the edge-scattering effects due to the presence of a splitter plate of finite width. The adjoint solutions are examined in detail for a Mach 1.2 mixing-layer flow. The near field of the adjoint solution reveals regions of relatively high receptivity to direct forcing within the mixing layer, with receptivity to nearby acoustic sources depending on the source type and position. Receptivity 'nodes' are present at certain locations near the splitter plate edge where the flow is not sensitive to forcing. The presence of the nodes is explained by interpretation of the adjoint solution as the superposition of incident and scattered fields. The adjoint solution within the boundary layer upstream of the splitter-plate trailing edge reveals a mechanism for transfer of energy from boundary-layer stability modes to Kelvin-Helmholtz modes. Extension of the adjoint solution to the far field using a Kirchhoff surface gives the receptivity of the mixing layer to incident sound from distant sources.

  10. Identification and measurement of combustion noise from a turbofan engine using correlation and coherence techniques. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Karchmer, A. M.

    1977-01-01

    Fluctuating pressure measurements within the combustor and tailpipe of a turbofan engine are made simultaneously with far field acoustic measurements. The pressure measurements within the engine are accomplished with cooled semi-infinite waveguide probes utilizing conventional condenser microphones as the transducers. The measurements are taken over a broad range of engine operating conditions and for 16 far field microphone positions between 10 deg and 160 deg relative to the engine inlet axis. Correlation and coherence techniques are used to determine the relative phase and amplitude relationships between the internal pressures and far field acoustic pressures. The results indicate that the combustor is a low frequency source region for acoustic propagation through the tailpipe and out to the far field. Specifically, it is found that the relation between source pressure and the resulting sound pressure involves a 180 deg phase shift. The latter result is obtained by Fourier transforming the cross correlation function between the source pressure and acoustic pressure after removing the propagation delay time. Further, it is found that the transfer function between the source pressure and acoustic pressure has a magnitude approximately proportional to frequency squared. These results are shown to be consistent with a model using a modified source term in Lighthill's turbulence stress tensor, wherein the fluctuating Reynolds stresses are replaced with the pressure fluctuations due to fluctuating entropy.
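
    The correlation-and-coherence workflow described in this record can be illustrated on synthetic signals: a tone received at two "sensors" with a pure delay chosen so the phase shift at the tone frequency is about 180°, echoing the phase result reported above. A minimal numpy sketch (all signals synthetic, not engine data):

```python
import numpy as np

def coherence_phase(x, y, fs, nseg=8):
    """Magnitude-squared coherence and cross-spectrum phase by segment averaging."""
    n = len(x) // nseg
    Sxx = Syy = Sxy = 0
    for k in range(nseg):
        X = np.fft.rfft(x[k*n:(k+1)*n] * np.hanning(n))
        Y = np.fft.rfft(y[k*n:(k+1)*n] * np.hanning(n))
        Sxx += np.abs(X)**2          # auto-spectrum of x
        Syy += np.abs(Y)**2          # auto-spectrum of y
        Sxy += np.conj(X) * Y        # cross-spectrum
    coh = np.abs(Sxy)**2 / (Sxx * Syy)
    return np.fft.rfftfreq(n, d=1/fs), coh, np.angle(Sxy)

# Synthetic "source" and "far-field" signals: a 100 Hz tone, the second
# delayed by 5 ms (half a period, i.e. a 180-degree shift), plus noise.
fs = 1000.0
t = np.arange(8192) / fs
rng = np.random.default_rng(0)
x = np.sin(2*np.pi*100*t) + 0.1*rng.standard_normal(t.size)
y = np.sin(2*np.pi*100*(t - 0.005)) + 0.1*rng.standard_normal(t.size)
freqs, coh, phase = coherence_phase(x, y, fs)
idx = np.argmin(np.abs(freqs - 100.0))
print(coh[idx] > 0.9)    # high coherence at the tone frequency
```

Segment averaging is essential here: without it, the magnitude-squared coherence estimate is identically 1 at every frequency.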

  11. 3D Hydrodynamics Simulation of Amazonian Seasonally Flooded Wetlands

    NASA Astrophysics Data System (ADS)

    Pinel, S. S.; Bonnet, M. P.; Da Silva, J. S.; Cavalcanti, R., Sr.; Calmant, S.

    2016-12-01

    In the lower Amazon basin, interactions between floodplains and river channels are important in terms of exchanges of water, sediments, and nutrients. These wetlands are considered hotspots of biodiversity and are among the most productive ecosystems in the world. However, they are threatened by climate change and anthropogenic activities. Given the implications for predicting the inundation status of floodplain habitats, and the strong interactions between water circulation, energy fluxes, and biogeochemical and ecological processes, detailed analyses of flooding dynamics are useful and needed. Numerical inundation models offer a means to study the interactions among different water sources. Modeling flood events in this area is challenging because flows respond to dynamic hydraulic controls from several water sources, complex geomorphology, and vegetation. In addition, due to the difficulty of access, hydrological data are scarce, which makes monitoring by remote sensing a good option. In this study, we simulated the filling and drainage processes of an Amazon floodplain (Janauacá Lake, AM, Brazil) over a six-year period (2006-2012). Common approaches to flow modeling in the Amazon region couple a 1D simulation of the main-channel flood wave to a 2D simulation of the inundation of the floodplain. Our approach differs in that the floodplain is fully simulated, using the 3D model IPH-ECO, which consists of a three-dimensional hydrodynamic module coupled with an ecosystem module. The IPH-ECO hydrodynamic module solves the Reynolds-averaged Navier-Stokes equations using a semi-implicit discretization.
After calibrating the roughness coefficients, we validated the model in terms of vertical accuracy against water levels (daily in situ and altimetric data), in terms of flood extent against inundation maps derived from available remote-sensing imagery (ALOS-1/PALSAR), and in terms of velocity. We analyzed the inter-annual variability in hydrological fluxes and inundation dynamics of the floodplain unit. The dominant sources of inflow varied seasonally: direct rain and local runoff (November to April), the Amazon River (May to August), and seepage (September to October).

  12. 10 CFR 40.41 - Terms and conditions of licenses.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 1 2010-01-01 2010-01-01 false Terms and conditions of licenses. 40.41 Section 40.41 Energy NUCLEAR REGULATORY COMMISSION DOMESTIC LICENSING OF SOURCE MATERIAL Licenses § 40.41 Terms and... the regulations in this part shall confine his possession and use of source or byproduct material to...

  13. Short-Term and Long-Term Biological Effects of Chronic Chemical Contamination on Natural Populations of a Marine Bivalve.

    PubMed

    Breitwieser, Marine; Viricel, Amélia; Graber, Marianne; Murillo, Laurence; Becquet, Vanessa; Churlaud, Carine; Fruitier-Arnaudin, Ingrid; Huet, Valérie; Lacroix, Camille; Pante, Eric; Le Floch, Stéphane; Thomas-Guyon, Hélène

    2016-01-01

    Understanding the effects of chronic chemical contamination on natural populations of marine organisms is complex due to the combined effects of different types of pollutants and environmental parameters that can modulate the physiological responses to stress. Here, we present the effects of chronic contamination in a marine bivalve by combining multiple approaches that provide information on individual and population health. We sampled variegated scallops (Mimachlamys varia) at sites characterized by different contaminants and contamination levels to study the short- and long-term (intergenerational) responses of this species to physiological stress. We used biomarkers (SOD, MDA, GST, laccase, citrate synthase and phosphatases) as indicators of oxidative stress, immune system alteration, mitochondrial respiration and general metabolism, and measured population genetic diversity at each site. In parallel, concentrations of 14 trace metals and 45 organic contaminants (PAHs, PCBs, pesticides) in tissues were measured. Scallops were collected outside and during their reproductive season to investigate temporal variability in contaminant and biomarker levels. Our analyses revealed that the levels of two biomarkers (laccase-type phenoloxidase and malondialdehyde) were significantly correlated with Cd concentration. Additionally, we observed significant seasonal differences for four of the five biomarkers, likely due to the scallops' reproductive status at the time of sampling. As a source of concern, a location identified as a reference site on the basis of inorganic contaminant levels presented the same levels of some persistent organic pollutants (DDT and its metabolites) as more impacted sites. Finally, potential long-term effects of heavy metal contamination were observed for variegated scallops, as genetic diversity was depressed at the most polluted sites.

  14. The course of posttraumatic stress symptoms and functional impairment following a disaster: what is the lasting influence of acute vs. ongoing traumatic events and stressors?

    PubMed Central

    Cerdá, M.; Bordelois, P.M.; Galea, S.; Norris, F.; Tracy, M.; Koenen, K.C.

    2012-01-01

    Purpose Ongoing traumatic events and stressors, rather than acute sources of trauma, may shape long-term post-disaster mental health. The purpose of this study was to compare the influence of acute hurricane-related exposures and ongoing post-hurricane exposures on the short- and long-term course of posttraumatic stress symptoms (PTSS) and functional impairment (FI). Methods A random sample of adults (n=658) in Galveston and Chambers Counties, Texas, was selected 2–6 months after Hurricane Ike and interviewed 3 times over 18 months. Hurricane-related exposures included traumatic events such as death of a family member due to the hurricane and stressors such as loss of or damage to personal property due to the hurricane. Post-hurricane exposures included traumatic events such as sexual assault and stressors such as divorce or serious financial problems. Results Experiencing an acute hurricane-related traumatic event or stressor was associated with initial post-hurricane PTSS [RR=1.92 (95% CI=1.13–3.26) and RR=1.62 (1.36–1.94), respectively] and FI [RR=1.76 (1.05–2.97) and RR=1.74 (1.46–2.08), respectively], and acute hurricane-related stressors were associated with a higher rate of increase in FI over time [RR=1.09 (1.01–1.19)]. In contrast, ongoing post-hurricane daily stressors were not associated with initial PTSS and FI, but were associated with PTSS and FI at the second and third interviews. Conclusions While immediate post-disaster interventions may influence short-term mental health, investment in the prevention of ongoing stressors may be instrumental in managing long-term mental health status. PMID:22878832

  15. Spatial and temporal changes in the structure of groundwater nitrate concentration time series (1935 1999) as demonstrated by autoregressive modelling

    NASA Astrophysics Data System (ADS)

    Jones, A. L.; Smart, P. L.

    2005-08-01

    Autoregressive modelling is used to investigate the internal structure of long-term (1935-1999) records of nitrate concentration for five karst springs in the Mendip Hills. There is significant short-term (1-2 months) positive autocorrelation at three of the five springs due to the availability of sufficient nitrate within the soil store to maintain concentrations in winter recharge for several months. The absence of short-term (1-2 months) positive autocorrelation at the other two springs is due to the marked contrast in land use between the limestone and swallet parts of the catchment: rapid concentrated recharge from the latter causes short-term switching in the dominant water source at the spring and thus fluctuating nitrate concentrations. Significant negative autocorrelation is evident at lags varying from 4-7 months through to 14-22 months for individual springs, with positive autocorrelation at 19-20 months at one site. This variable timing is explained by moderation of the exhaustion effect in the soil by groundwater storage, which gives longer residence times in large catchments and those with a dominance of diffuse flow. The lags derived from autoregressive modelling may therefore provide an indication of average groundwater residence times. Significant differences in the structure of the autocorrelation function for successive 10-year periods are evident at Cheddar Spring, and are explained by the effects of the ploughing up of grasslands during the Second World War and of increased fertiliser usage on available nitrogen in the soil store. This effect is moderated by the influence of summer temperatures on rates of mineralization, and of both summer and winter rainfall on the timing and magnitude of nitrate leaching. The pattern of nitrate leaching also appears to have been perturbed by the 1976 drought.
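
    The lag structure discussed in this record can be illustrated on a synthetic monthly series; a minimal numpy sketch (seasonal cycle plus noise, purely illustrative, not the Mendip spring data):

```python
import numpy as np

def autocorrelation(x, max_lag):
    """Sample autocorrelation r(k) for lags 0..max_lag."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    denom = np.dot(x, x)
    return np.array([np.dot(x[:len(x)-k], x[k:]) / denom
                     for k in range(max_lag + 1)])

# Synthetic 20-year monthly record: a 12-month nitrate-like cycle plus noise.
rng = np.random.default_rng(1)
months = np.arange(240)
series = np.cos(2*np.pi*months/12) + 0.3*rng.standard_normal(months.size)

r = autocorrelation(series, 12)
# Short positive lags and a negative lobe near the half cycle, as in the
# spring records; lags outside +/- 1.96/sqrt(N) are conventionally "significant".
print(r[1] > 0, r[6] < 0)
```

Fitting an AR(p) model (e.g. via Yule-Walker equations on these r(k) values) is the natural next step, but the raw autocorrelation function already exposes the positive short-lag and negative mid-lag structure the abstract describes.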

  16. Short-Term and Long-Term Biological Effects of Chronic Chemical Contamination on Natural Populations of a Marine Bivalve

    PubMed Central

    Graber, Marianne; Murillo, Laurence; Becquet, Vanessa; Churlaud, Carine; Fruitier-Arnaudin, Ingrid; Huet, Valérie; Lacroix, Camille; Pante, Eric; Le Floch, Stéphane; Thomas-Guyon, Hélène

    2016-01-01

    Understanding the effects of chronic chemical contamination on natural populations of marine organisms is complex due to the combined effects of different types of pollutants and environmental parameters that can modulate the physiological responses to stress. Here, we present the effects of chronic contamination in a marine bivalve by combining multiple approaches that provide information on individual and population health. We sampled variegated scallops (Mimachlamys varia) at sites characterized by different contaminants and contamination levels to study the short- and long-term (intergenerational) responses of this species to physiological stress. We used biomarkers (SOD, MDA, GST, laccase, citrate synthase and phosphatases) as indicators of oxidative stress, immune system alteration, mitochondrial respiration and general metabolism, and measured population genetic diversity at each site. In parallel, concentrations of 14 trace metals and 45 organic contaminants (PAHs, PCBs, pesticides) in tissues were measured. Scallops were collected outside and during their reproductive season to investigate temporal variability in contaminant and biomarker levels. Our analyses revealed that the levels of two biomarkers (laccase-type phenoloxidase and malondialdehyde) were significantly correlated with Cd concentration. Additionally, we observed significant seasonal differences for four of the five biomarkers, likely due to the scallops' reproductive status at the time of sampling. As a source of concern, a location identified as a reference site on the basis of inorganic contaminant levels presented the same levels of some persistent organic pollutants (DDT and its metabolites) as more impacted sites. Finally, potential long-term effects of heavy metal contamination were observed for variegated scallops, as genetic diversity was depressed at the most polluted sites. PMID:26938082

  17. Spatial variability of soil carbon and nitrogen in two hybrid poplar-hay crop systems in southern Quebec, Canada

    NASA Astrophysics Data System (ADS)

    Winans, K. S.

    2013-12-01

    Canadian agricultural operations contribute approximately 8% of national GHG emissions each year, mainly from fertilizers, enteric fermentation, and manure management (Environment Canada, 2010). With improved management of cropland and forests, it is possible to mitigate GHG emissions through carbon (C) sequestration while enhancing soil and crop productivity. Tree-based intercropped (TBI) systems, consisting of a fast-growing woody species such as poplar (Populus spp.) planted in widely spaced rows with crops cultivated between the tree rows, were one of the technologies prioritized for investigation by the Agreement for the Agricultural Greenhouse Gases Program (AAGGP), because fast-growing trees can be a sink for atmospheric carbon dioxide (CO2) as well as a long-term source of farm income (Montagnini and Nair, 2004). However, there are relatively few estimates of C sequestration in the trees or due to tree inputs (e.g., fine-root turnover and litterfall incorporated into SOC), and hybrid poplars grow exponentially in the first 8-10 years after planting. In the current study, our objectives were (1) to evaluate spatial variation in soil C and nitrogen (N) storage, CO2 and nitrous oxide (N2O) fluxes, and tree and crop productivity for two hybrid poplar-hay intercrop systems at year 9, comparing TBI vs. non-TBI systems, and (2) to evaluate TBI systems in the current context of C trading markets, which value C sequestration in the trees, unharvested crop components, and soils of TBI systems. The study results will provide meaningful measures of the changes due to TBI systems in the short and long term, in terms of GHG mitigation, enhanced soil and crop productivity, and expected economic returns.

  18. Tropospheric ozone using an emission tagging technique in the CAM-Chem and WRF-Chem models

    NASA Astrophysics Data System (ADS)

    Lupascu, A.; Coates, J.; Zhu, S.; Butler, T. M.

    2017-12-01

    Tropospheric ozone is a short-lived climate-forcing pollutant. High concentrations of ozone affect human health (cardiorespiratory effects and increased mortality due to long-term exposure) and damage crops. Attributing ozone concentrations to the contributions from different sources indicates the effects of locally emitted or transported precursors on ozone levels in specific regions. This information can be an important component of the design of emission reduction strategies, by indicating which emission sources could be targeted for effective reductions, thus reducing the burden of ozone pollution. Using a "tagging" approach within the CAM-Chem (global) and WRF-Chem (regional) models, we can quantify the contribution of individual NOx and VOC precursor emissions to air quality. When precursor emissions of NOx are tagged, we find that the largest contributors to ozone levels are anthropogenic sources, whereas when precursor emissions of VOCs are tagged, biogenic sources and methane account for more than 50% of ozone levels. Further, we have extended the NOx tagging method to investigate continental source-region contributions to concentrations of ozone over various receptor regions across the globe, with a focus on Europe. In general, summertime maximum ozone in most receptor regions is largely attributable to local emissions of anthropogenic NOx and biogenic VOC. During the rest of the year, especially during springtime, ozone in most receptor regions shows stronger influences from anthropogenic emissions of NOx and VOC in remote source regions.
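
    The core idea behind tagging is that under linear (or linearized) chemistry, a duplicate tracer can be carried for each emission source, and the tagged concentrations sum exactly to the total, yielding an additive source attribution. A toy sketch of that bookkeeping, with invented source names and rates (this is not CAM-Chem or WRF-Chem code):

```python
# Toy tagging illustration: tracers with constant emissions and a common
# first-order loss reach steady state c = E / k, so each tagged tracer
# carries one source's contribution and the tags sum to the total.

def steady_state(emission, loss_rate):
    """Steady-state concentration for constant emission and first-order loss."""
    return emission / loss_rate

# Hypothetical emission rates (arbitrary units) and loss rate (per day).
sources = {"anthropogenic_NOx": 40.0, "biogenic_VOC": 25.0, "methane": 15.0}
loss_rate = 0.2

tags = {name: steady_state(e, loss_rate) for name, e in sources.items()}
total = steady_state(sum(sources.values()), loss_rate)

# Additivity is exact in the linear case, so fractional shares are well defined.
shares = {name: c / total for name, c in tags.items()}
print(shares["anthropogenic_NOx"])  # 0.5
```

Real ozone chemistry is nonlinear in NOx and VOC, which is why the models tag NOx and VOC precursors separately rather than attributing ozone itself by simple superposition.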

  19. Subalpine Forest Carbon Cycling Short- and Long-Term Influence ofClimate and Species

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kueppers, L.; Harte, J.

    2005-08-23

    Ecosystem carbon cycle feedbacks to climate change comprise one of the largest remaining sources of uncertainty in global model predictions of future climate. Both direct climate effects on carbon cycling and indirect effects via climate-induced shifts in species composition may alter ecosystem carbon balance over the long term. In the short term, climate effects on carbon cycling may be mediated by ecosystem species composition. We used an elevational climate and tree species composition gradient in Rocky Mountain subalpine forest to quantify the sensitivity of all major ecosystem carbon stocks and fluxes to these factors. The climate sensitivities of carbon fluxes were species-specific in the cases of relative aboveground productivity and litter decomposition, whereas the climate sensitivity of dead wood decay did not differ between species, and total annual soil CO2 flux showed no strong climate trend. Lodgepole pine relative productivity increased with warmer temperatures and earlier snowmelt, while Engelmann spruce relative productivity was insensitive to climate variables. Engelmann spruce needle decomposition decreased linearly with increasing temperature (decreasing litter moisture), while lodgepole pine and subalpine fir needle decay showed a hump-shaped temperature response. We also found that total ecosystem carbon declined by 50 percent with a 2.8 °C increase in mean annual temperature and a concurrent 63 percent decrease in growing season soil moisture, primarily due to large declines in mineral soil and dead wood carbon. We detected no independent effect of species composition on ecosystem C stocks. Overall, our carbon flux results suggest that, in the short term, any change in subalpine forest net carbon balance will depend on the specific climate scenario and spatial distribution of tree species.
Over the long term, our carbon stock results suggest that with regional warming and drying, Rocky Mountain subalpine forest will be a net source of carbon to the atmosphere.

  20. Enhancing GADRAS Source Term Inputs for Creation of Synthetic Spectra.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Horne, Steven M.; Harding, Lee

    The Gamma Detector Response and Analysis Software (GADRAS) team has enhanced the source term input for the creation of synthetic spectra. These enhancements include the following: allowing users to programmatically provide source information to GADRAS through memory, rather than through a string limited to 256 characters; allowing users to provide their own source decay database information; and updating the default GADRAS decay database to fix errors and include coincident gamma information.

  1. Localization of sound sources in a room with one microphone

    NASA Astrophysics Data System (ADS)

    Peić Tukuljac, Helena; Lissek, Hervé; Vandergheynst, Pierre

    2017-08-01

    Estimation of the location of sound sources is usually done using microphone arrays. Such settings provide an environment where we know the differences between the signals received at different microphones in terms of phase or attenuation, which enables localization of the sound sources. In our solution we exploit the properties of the room transfer function in order to localize a sound source inside a room with only one microphone. The shape of the room and the position of the microphone are assumed to be known. The design guidelines and limitations of the sensing matrix are given. The implementation is based on sparsity: only a few voxels in the room are occupied by a source. What is especially interesting about our solution is that it provides localization of the sound sources not only in the horizontal plane but in full 3D coordinates inside the room.
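
    The sparse-recovery idea in this record can be sketched in a toy form: each column of a sensing matrix holds a fingerprint (here random stand-ins for room-transfer-function responses, not physically modeled ones), and a single occupied voxel is recovered by one matching-pursuit step. All dimensions and names below are illustrative assumptions.

```python
import numpy as np

# Toy single-microphone localization: columns of A are (hypothetical) spectral
# fingerprints of candidate voxels; the measurement is a scaled column + noise.
rng = np.random.default_rng(2)
n_freqs, n_voxels = 64, 50
A = rng.standard_normal((n_freqs, n_voxels))
A /= np.linalg.norm(A, axis=0)            # unit-norm voxel fingerprints

true_voxel = 17
measurement = 3.0 * A[:, true_voxel] + 0.05 * rng.standard_normal(n_freqs)

# One matching-pursuit step: the occupied voxel maximizes |A^T y|.
scores = np.abs(A.T @ measurement)
estimate = int(np.argmax(scores))
print(estimate)
```

Real room-transfer fingerprints are highly structured (modal), so the paper's sensing-matrix design guidelines matter precisely because correlated columns make this greedy step harder than in the random toy case.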

  2. A theoretical prediction of the acoustic pressure generated by turbulence-flame front interactions

    NASA Technical Reports Server (NTRS)

    Huff, R. G.

    1984-01-01

    The equations of momentum and continuity are combined and linearized yielding the one dimensional nonhomogeneous acoustic wave equation. Three terms in the non-homogeneous equation act as acoustic sources and are taken to be forcing functions acting on the homogeneous wave equation. The three source terms are: fluctuating entropy, turbulence gradients, and turbulence-flame interactions. Each source term is discussed. The turbulence-flame interaction source is used as the basis for computing the source acoustic pressure from the Fourier transformed wave equation. Pressure fluctuations created in turbopump gas generators and turbines may act as a forcing function for turbine and propellant tube vibrations in Earth to orbit space propulsion systems and could reduce their life expectancy. A preliminary assessment of the acoustic pressure fluctuations in such systems is presented.

  3. A theoretical prediction of the acoustic pressure generated by turbulence-flame front interactions

    NASA Technical Reports Server (NTRS)

    Huff, R. G.

    1984-01-01

    The equations of momentum and continuity are combined and linearized yielding the one dimensional nonhomogeneous acoustic wave equation. Three terms in the non-homogeneous equation act as acoustic sources and are taken to be forcing functions acting on the homogeneous wave equation. The three source terms are: fluctuating entropy, turbulence gradients, and turbulence-flame interactions. Each source term is discussed. The turbulence-flame interaction source is used as the basis for computing the source acoustic pressure from the Fourier transformed wave equation. Pressure fluctuations created in turbopump gas generators and turbines may act as a forcing function for turbine and propellant tube vibrations in earth to orbit space propulsion systems and could reduce their life expectancy. A preliminary assessment of the acoustic pressure fluctuations in such systems is presented.
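
    The forcing-function view above (source terms driving the homogeneous wave operator) can be illustrated numerically. A minimal 1D finite-difference sketch of p_tt - c^2 p_xx = s(x, t), with an assumed compact oscillating source standing in for the turbulence-flame interaction term; all grid and source parameters are illustrative, not from the report.

```python
import numpy as np

# 1D nonhomogeneous acoustic wave equation, explicit leapfrog scheme.
c, L, nx, nt = 340.0, 1.0, 201, 400
dx = L / (nx - 1)
dt = 0.4 * dx / c                       # CFL-stable time step (r = 0.4 < 1)
x = np.linspace(0.0, L, nx)
src = np.exp(-((x - 0.5) / 0.02)**2)    # compact source region at x = 0.5

p_prev = np.zeros(nx)                   # p at step n-1
p = np.zeros(nx)                        # p at step n
r2 = (c * dt / dx)**2
for n in range(nt):
    s = src * np.sin(2*np.pi*2000.0*n*dt)   # oscillating source term
    p_next = np.zeros(nx)                   # ends held at zero (rigid walls)
    p_next[1:-1] = (2*p[1:-1] - p_prev[1:-1]
                    + r2*(p[2:] - 2*p[1:-1] + p[:-2])
                    + dt**2 * s[1:-1])
    p_prev, p = p, p_next

print(np.abs(p).max() > 0)   # pressure has radiated away from the source
```

With the source term switched off, the same scheme reduces to the homogeneous wave equation, which is exactly the decomposition the report exploits: the flow physics enters only through the forcing.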

  4. Sources and characteristics of terrestrial carbon in Holocene-scale sediments of the East Siberian Sea

    NASA Astrophysics Data System (ADS)

    Keskitalo, Kirsi; Tesi, Tommaso; Bröder, Lisa; Andersson, August; Pearce, Christof; Sköld, Martin; Semiletov, Igor P.; Dudarev, Oleg V.; Gustafsson, Örjan

    2017-09-01

    Thawing of permafrost carbon (PF-C) due to climate warming can remobilise considerable amounts of terrestrial carbon from its long-term storage to the marine environment. PF-C can then be buried in sediments or remineralised to CO2, with implications for the carbon-climate feedback. Studying historical sediment records of past natural climate changes can help us to understand the response of permafrost to current climate warming. In this study, two sediment cores collected from the East Siberian Sea were used to study terrestrial organic carbon sources, composition and degradation during the past ˜ 9500 cal yrs BP. CuO-derived lignin and cutin products (i.e., compounds solely biosynthesised in terrestrial plants) combined with δ13C suggest that there was a higher input of terrestrial organic carbon to the East Siberian Sea between ˜ 9500 and 8200 cal yrs BP than in all later periods. This high input was likely caused by marine transgression and permafrost destabilisation in the early Holocene climatic optimum. Based on source apportionment modelling using dual-carbon isotope (Δ14C, δ13C) data, coastal erosion releasing old Pleistocene permafrost carbon was identified as a significant source of organic matter translocated to the East Siberian Sea during the Holocene.

  5. Experimental and numerical study of the impact of voltage fluctuation, flicker and power factor of a wave electric generator on local distribution

    NASA Astrophysics Data System (ADS)

    Hadi, Nik Azran Ab; Rashid, Wan Norhisyam Abd; Hashim, Nik Mohd Zarifie; Mohamad, Najmiah Radiah; Kadmin, Ahmad Fauzan

    2017-10-01

    Electricity is the world's most widely used form of energy. Engineers and technologists have cooperated to develop new low-cost, carbon-free technologies, carbon emissions being a major concern due to global warming. Renewable energy sources such as hydro, wind and wave power are becoming widespread as a means of reducing carbon emissions; however, this effort requires methods, techniques and technologies different from those of coal-based power. The power quality of renewable sources needs in-depth research and continued study to improve renewable energy technologies. The aim of this project is to investigate the impact of a wave electric generator on its local distribution system. The power farm was designed to connect to the local distribution system, which is investigated and analyzed to ensure that the energy supplied to customers is clean. MATLAB tools are used for the overall analysis. At the end of the project, a summary identifying various sources of voltage fluctuation is presented in terms of voltage flicker. An analysis of the impact of wave power generation on the local distribution system is also presented for the development of wave generator farms.

  6. General relativistic corrections in density-shear correlations

    NASA Astrophysics Data System (ADS)

    Ghosh, Basundhara; Durrer, Ruth; Sellentin, Elena

    2018-06-01

    We investigate the corrections which relativistic light-cone computations induce on the correlation of the tangential shear with galaxy number counts, also known as galaxy-galaxy lensing. The standard approach to galaxy-galaxy lensing treats the number density of sources in a foreground bin as observable, whereas it is in reality unobservable due to the presence of relativistic corrections. We find that already in the redshift range covered by the DES first year data, these currently neglected relativistic terms lead to a systematic correction of up to 50% in the density-shear correlation function for the highest redshift bins. This correction is dominated by the fact that a redshift bin of number counts does not only lens sources in a background bin, but is itself again lensed by all masses between the observer and the counted source population. Relativistic corrections are currently ignored in standard galaxy-galaxy analyses, and the additional lensing of the counted source population is only included in the error budget (via the covariance matrix). At higher redshifts and larger scales, these relativistic and lensing corrections become increasingly important, and we argue that it is then more efficient, and also cleaner, to account for them directly in the density-shear correlations.

  7. A unified model of supernova driven by magnetic monopoles

    NASA Astrophysics Data System (ADS)

    Peng, Qiu-He; Liu, Jing-Jing; Chou, Chih-Kang

    2017-12-01

    In this paper, we first discuss a series of important but puzzling physical mechanisms concerning the energy source and the various core-collapse supernova explosion mechanisms during central gravitational collapse in astrophysics. We also discuss the puzzle of the possible association of γ-ray bursts with gravitational wave events, the heat source for the molten interior of the Earth's core, and finally the puzzling problem of the cooling of white dwarfs. We then make use of estimates of the space flux of magnetic monopoles (hereafter MMs) and of nucleon decay induced by MMs (the Rubakov-Callen (RC) effect) to obtain the luminosity due to the RC effect. In terms of the formula for this RC luminosity, we present a unified treatment of the heat source of the Earth's core, the energy source of the white dwarf interior, various kinds of core-collapse supernovae (Type II (SNII), Type Ib (SNIb), Type Ic (SNIc), and superluminous supernovae (SLSN)), and the production mechanism for γ-ray bursts. This unified model can also reasonably explain the possible association of the short γ-ray burst detected by the Fermi Gamma-ray Burst Monitor (GBM) with the LIGO gravitational wave event GW150914 in September 2015.

  8. Simulation of Mechanical Processes in Gas Storage Caverns for Short-Term Energy Storage

    NASA Astrophysics Data System (ADS)

    Böttcher, Norbert; Nagel, Thomas; Kolditz, Olaf

    2015-04-01

    In recent years, Germany's energy supply has begun to shift from fossil fuels to renewable and sustainable energy carriers. Renewable energy sources such as solar and wind power are subject to fluctuations, so the development and extension of energy storage capacities is a priority in German R&D programs. This work is part of the ANGUS+ project, funded by the Federal Ministry of Education and Research, which investigates the influence of subsurface energy storage on the underground. The utilization of subsurface salt caverns as long-term storage reservoirs for fossil fuels is a common method, since construction of caverns in salt rock by solution mining is inexpensive in comparison to excavation in solid rock formations. Another advantage of evaporite as a host material is the self-healing behaviour of salt rock, so the cavity can be assumed to be impermeable. For short-term energy storage (hours to days), caverns can be used as reservoirs for natural or synthetic fuel gases, such as hydrogen, methane, or compressed air, in which case the operating pressures inside the caverns fluctuate more frequently. This work uses numerical models to investigate the influence of frequently changing operating pressures on the stability of the host rock of gas storage caverns. To this end, we developed a coupled thermo-hydro-mechanical (THM) model based on the finite element method, built on the open-source software platform OpenGeoSys. The salt behaviour is described by well-known constitutive material models capable of predicting creep, self-healing, and dilatancy processes. Our simulations include the thermodynamic behaviour of the gas storage process, temperature development and distribution on the cavern boundary, deformation of the cavern geometry, and prediction of the dilatancy zone.
Based on the numerical results, optimal operation modes can be found for individual caverns, so the risk of host rock damage can be minimized. Furthermore, the model can be used to design efficient monitoring programs to detect possible variations of the host rock due to construction and operation of the storage facility. The developed model will be used by public authorities for land use planning issues.

  9. Shipboard monitoring of non-CO2 greenhouse gases in Asia and Oceania using commercial cargo vessels

    NASA Astrophysics Data System (ADS)

    Nara, H.; Tanimoto, H.; Mukai, H.; Nojiri, Y.; Tohjima, Y.; Machida, T.; Hashimoto, S.

    2011-12-01

    The National Institute for Environmental Studies (NIES) has been performing a long-term program for monitoring trace gases of atmospheric importance over the Pacific Ocean since 1995. The NIES Voluntary Observing Ships (NIES-VOS) program currently makes use of commercial cargo vessels because they operate regularly over fixed routes for long periods and sail over a wide area between various ports (e.g., between Japan and the United States, between Japan and Australia/New Zealand, and between Japan and southeast Asia). This program allows systematic and continuous measurements of non-CO2 greenhouse gases, providing long-term datasets for background air over the Pacific Ocean and regionally polluted air around east Asia. We observe both long-lived greenhouse gases (e.g., carbon dioxide) and short-lived air pollutants (e.g., tropospheric ozone, carbon monoxide) on a continuous basis. Flask samples are collected for later laboratory analysis of carbon dioxide, methane, nitrous oxide, and carbon monoxide using gas chromatographic techniques. In addition, we recently installed cavity ringdown spectrometers for high-resolution measurement of methane and carbon dioxide to capture their highly variable features in regionally polluted air around southeast Asia (e.g., Hong Kong, Thailand, Singapore, Malaysia, Indonesia and the Philippines), which is now thought to be a large source due to expanding socioeconomic activities as well as biomass burning. Contrasting the Japan-Australia/New Zealand and Japan-southeast Asia cruises revealed regional characteristics of sources and sinks of these atmospherically important species, suggesting the existence of additional sources of methane, nitrous oxide, and carbon monoxide in this tropical Asian region.

  10. Recent changes in the oxidized to reduced nitrogen ratio in atmospheric precipitation

    NASA Astrophysics Data System (ADS)

    Kurzyca, Iwona; Frankowski, Marcin

    2017-10-01

    In this study, the characteristics of precipitation in terms of various nitrogen forms (NO3-, NO2-, NH4+, Norganic, Ntotal) are presented. The samples were collected in areas of different anthropogenic pressure (an urban area vs. an ecologically protected woodland area, ∼30 km apart; Wielkopolska region, Poland). Based on the Nox and Nred emission profiles (Nox/Nred ratio), a temporal and spatial comparison was carried out. For both sites, during a decade of observation, more than 60% of samples had a higher contribution of N-NH4+ than of N-NO3-, the amount of N-NO2- was negligible, and organic nitrogen amounted to 30% of the total nitrogen content, which varied up to 16 mg/l. The precipitation events with high concentrations of nitrogen species were investigated in terms of possible local and remote sources of nitrogen (synoptic meteorology) to indicate the areas which can act as potential sources of N-compounds. Based on the chemometric analysis, it was found that Nred implies Nox and vice versa, due to interactions between them in the atmosphere. Taking into account the analysis of precipitation occurring simultaneously in both locations (about 50% of all rainfall episodes), it was observed that a factor such as anthropogenic pressure differentiates but does not determine the chemical composition of precipitation in the investigated areas (urban vs. woodland area; distance of ∼30 km). The thermodynamics of the atmosphere had a significant impact on concentrations of N-NO3- and N-NH4+ in precipitation, as did the circulation of air masses and remote N sources responsible for the transboundary inflow of pollutants.

  11. Chemical characterization and source apportionment of fine particulate matter in Yangzhou, China, using offline aerosol mass spectrometry

    NASA Astrophysics Data System (ADS)

    Li, L.; Ge, X.; Xu, J.; Ye, Z.

    2016-12-01

    In recent years, the Aerodyne Aerosol Mass Spectrometer (AMS) has been widely used for online, real-time monitoring of fine aerosol particles all over the world. However, due to its high cost and complex maintenance, the AMS has typically been deployed for short-term intensive field measurements, limiting its ability to elucidate the long-term behaviors and dominant sources of regional fine particles (PM2.5). In this study, we collected daily PM2.5 filter samples across a relatively long period (November 2015 to April 2016, in total >100 samples) using a high-volume sampler in urban Yangzhou, a city in the Yangtze River Delta (YRD) region, China. These samples were analyzed using a suite of analytical techniques for water-soluble inorganic ions (WSIs), organic carbon (OC), elemental carbon (EC), water-soluble organic carbon (WSOC), total nitrogen (TN), trace metal elements, etc. More importantly, an Aerodyne soot particle aerosol mass spectrometer (SP-AMS) was introduced for the first time for offline characterization of the PM2.5 samples collected in this region. In particular, positive matrix factorization (PMF) was conducted on the SP-AMS-determined water-soluble fraction of organic aerosols (WSOA), and three distinct sources were separated: a primary OA (POA), a less oxygenated OA (LOOA), and a more oxygenated OA (MOOA). Chemical characteristics and evolution processes of these OA subcomponents were further discussed. Our results are useful for air pollution management in the YRD region, and the technique developed can be applied elsewhere as well.
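Positive matrix factorization decomposes the sample-by-ion signal matrix into non-negative factor contributions and factor profiles. As a rough, unweighted illustration of the idea on synthetic data (plain NMF with multiplicative updates, not the uncertainty-weighted PMF algorithm actually used with AMS spectra):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "offline AMS" data: 100 filter samples x 20 ion signals,
# built from 3 non-negative source factors so the answer is known.
G_true = rng.random((100, 3))   # factor contributions (per-sample time series)
F_true = rng.random((3, 20))    # factor profiles (mass spectra)
X = G_true @ F_true

# Lee-Seung multiplicative updates minimizing ||X - G F||_F^2,
# which preserve non-negativity of both matrices at every step.
k = 3
G = rng.random((100, k)) + 1e-3
F = rng.random((k, 20)) + 1e-3
for _ in range(1000):
    F *= (G.T @ X) / (G.T @ G @ F + 1e-12)
    G *= (X @ F.T) / (G @ F @ F.T + 1e-12)

residual = np.linalg.norm(X - G @ F) / np.linalg.norm(X)
assert residual < 0.1 and (G >= 0).all() and (F >= 0).all()
```

In the real workflow each matrix element is additionally weighted by its measurement uncertainty, and the number of factors (here three: POA, LOOA, MOOA) is chosen by inspecting residuals and factor interpretability.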

  12. A numerical method for shock driven multiphase flow with evaporating particles

    NASA Astrophysics Data System (ADS)

    Dahal, Jeevan; McFarland, Jacob A.

    2017-09-01

    A numerical method for predicting the interaction of active, phase-changing particles in a shock-driven flow is presented in this paper. The Particle-in-Cell (PIC) technique was used to couple particles in a Lagrangian coordinate system with a fluid in an Eulerian coordinate system. The Piecewise Parabolic Method (PPM) hydrodynamics solver was used for solving the conservation equations and was modified with mass, momentum, and energy source terms from the particle phase. The method was implemented in the open-source hydrodynamics software FLASH, developed at the University of Chicago. A simple validation of the methods is accomplished by comparing velocity and temperature histories from a single-particle simulation with the analytical solution. Furthermore, simple single-particle parcel simulations were run at two different sizes to study the effect of particle size on vorticity deposition in a shock-driven multiphase instability. Large particles were found to have lower enstrophy production at early times and higher enstrophy dissipation at late times due to the advection of the particle vorticity source term through the carrier gas. A 2D shock-driven instability of a circular perturbation is studied in simulations and compared to previous experimental data as further validation of the numerical methods. The effect of the particle size distribution and particle evaporation is examined further for this case. The results show that larger particles reduce the vorticity deposition, while particle evaporation increases it. It is also shown that for a distribution of particle sizes the vorticity deposition is decreased compared to the single-size case at the mean diameter.
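The two-way momentum coupling described above can be illustrated with a single particle: drag accelerates the particle toward the gas velocity, and an equal and opposite momentum source is applied to the gas. A minimal Stokes-drag sketch (hypothetical particle and gas properties, not the FLASH implementation):

```python
# One-particle momentum coupling with Stokes drag (illustrative only).
rho_p = 1000.0   # particle density [kg/m^3] (hypothetical)
d_p = 10e-6      # particle diameter [m]
mu_g = 1.8e-5    # gas dynamic viscosity [Pa s]
m_p = rho_p * 3.141592653589793 / 6.0 * d_p**3   # particle mass [kg]
tau_p = rho_p * d_p**2 / (18.0 * mu_g)           # Stokes response time [s]

u_gas, u_p = 300.0, 0.0   # post-shock gas vs. initially still particle [m/s]
dt = tau_p / 100.0
gas_momentum_source = 0.0
for _ in range(1000):     # integrate over ~10 response times
    drag = m_p * (u_gas - u_p) / tau_p   # Stokes drag force on particle [N]
    u_p += drag / m_p * dt               # particle accelerates toward gas
    gas_momentum_source -= drag * dt     # equal/opposite source to the gas

# After ~10 tau_p the particle has essentially reached the gas velocity.
assert abs(u_p - u_gas) / u_gas < 1e-3
```

Because tau_p scales with the particle diameter squared, larger particles lag the gas longer, consistent with the slower early-time vorticity deposition reported above for large particles.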

  13. Relevance analysis and short-term prediction of PM2.5 concentrations in Beijing based on multi-source data

    NASA Astrophysics Data System (ADS)

    Ni, X. Y.; Huang, H.; Du, W. P.

    2017-02-01

    The PM2.5 problem is proving to be a major public crisis and is of great public concern, requiring an urgent response. Information about, and prediction of, PM2.5 from the perspective of atmospheric dynamic theory is still limited due to the complexity of the formation and development of PM2.5. In this paper, we attempted to carry out relevance analysis and short-term prediction of PM2.5 concentrations in Beijing, China, using multi-source data mining. A correlation analysis model relating PM2.5 to physical data (meteorological data, including regional average rainfall, daily mean temperature, average relative humidity, average wind speed, and maximum wind speed, and other pollutant concentration data, including CO, NO2, SO2, and PM10) and social media data (microblog data) was proposed, based on the multivariate statistical analysis method. The study found that among these factors, the average wind speed, the concentrations of CO, NO2, and PM10, and the daily number of microblog entries with the key words 'Beijing; Air pollution' show high mathematical correlation with PM2.5 concentrations. The correlation analysis was further studied with a machine learning model, the Back Propagation Neural Network (BPNN). It was found that the BPNN method performs better in correlation mining. Finally, an Autoregressive Integrated Moving Average (ARIMA) time series model was applied to explore the prediction of PM2.5 in the short-term time series. The predicted results were in good agreement with the observed data. This study is useful for helping realize real-time monitoring, analysis and pre-warning of PM2.5, and it also helps to broaden the application of big data and multi-source data mining methods.
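ARIMA models predict a series from its own recent history. The autoregressive core of the idea can be sketched by fitting an AR(1) model with ordinary least squares on a synthetic, persistence-dominated series (illustrative only; the study fitted a full ARIMA model to observed PM2.5 data):

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic daily PM2.5-like series with strong day-to-day persistence:
# x[t] = 20 + 0.75 * x[t-1] + noise  (stationary mean of 80)
n = 400
x = np.empty(n)
x[0] = 80.0
for t in range(1, n):
    x[t] = 20.0 + 0.75 * x[t - 1] + rng.normal(0.0, 10.0)

# Fit x[t] = c + phi * x[t-1] by least squares: the "AR" part of ARIMA.
A = np.column_stack([np.ones(n - 1), x[:-1]])
c, phi = np.linalg.lstsq(A, x[1:], rcond=None)[0]

one_step_forecast = c + phi * x[-1]
assert 0.6 < phi < 0.9   # recovers the persistence coefficient
```

A full ARIMA(p, d, q) model adds differencing (d) to remove trends and a moving-average term (q) on past forecast errors, but the one-step forecast above already captures why yesterday's concentration is the strongest single predictor of today's.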

  14. Experimental evaluation of the significance of the pressure transport term for the Turbulence Kinetic Energy Budget across contrasting forest architectures

    NASA Astrophysics Data System (ADS)

    Ehrnsperger, Laura; Wunder, Tobias; Thomas, Christoph

    2017-04-01

    Forests are one of the dominant vegetation types on Earth and are an important sink for carbon on our planet. Forests are special ecosystems due to their great canopy height and complex architecture consisting of a subcanopy and a canopy layer, which changes the mechanisms of turbulent exchange within the plant canopy. To date, the sinks and sources of turbulence in forest canopies are not completely understood; in particular, the role of the pressure transport remains unclear. The INTRAMIX experiment was conducted in a mountainous Norway spruce (Picea abies) forest at the Fluxnet Waldstein site (DE-Bay) in Bavaria, Germany, for a period of 10 weeks in order to experimentally evaluate the significance of the pressure transport to the turbulence kinetic energy (TKE) budget for the first time. The INTRAMIX data from the dense mountain forest were compared to observations from a sparse Ponderosa pine (Pinus ponderosa) stand in Oregon, USA, to study the influence of forest architecture. We hypothesized that the pressure transport is more important in dense forest canopies, as the crown decouples the subcanopy from the buoyancy- and shear-driven flow above the canopy. We also investigated how atmospheric stability influences the TKE budget. Based upon model results from the literature, we expect the pressure transport to act as a source for TKE, especially under free-convective and unstable dynamic stability. Results to date indicate that pressure transport is most important in the subcanopy, with decreasing magnitude with increasing height. Nevertheless, pressure transport is a continuous source of TKE above the canopy, while in the canopy and subcanopy layers pressure transport acts both as a sink and a source term for TKE. In the tree crown layer pressure transport is a source in the morning and afternoon hours and acts as a sink during the evening, while in the subcanopy pressure transport is a source around noon and during the night and acts as a sink in the early morning and afternoon hours.
This complementary pattern suggests that the pressure transport is an important means for exchanging TKE across canopy layers.
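For reference, the pressure-transport term sits alongside the other sources and sinks in the standard horizontally homogeneous TKE budget (conventional Reynolds-averaging notation, not specific to this experiment):

```latex
\frac{\partial \overline{e}}{\partial t} =
\underbrace{-\,\overline{u'w'}\,\frac{\partial \overline{u}}{\partial z}}_{\text{shear production}}
\;+\;\underbrace{\frac{g}{\overline{\theta_v}}\,\overline{w'\theta_v'}}_{\text{buoyancy}}
\;-\;\underbrace{\frac{\partial \overline{w'e'}}{\partial z}}_{\text{turbulent transport}}
\;-\;\underbrace{\frac{1}{\overline{\rho}}\,\frac{\partial \overline{w'p'}}{\partial z}}_{\text{pressure transport}}
\;-\;\varepsilon
```

The pressure-transport term is the one evaluated above; it is difficult to measure because it requires fast, collocated static-pressure fluctuation measurements alongside the velocity covariances.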

  15. Auditing the multiply-related concepts within the UMLS

    PubMed Central

    Mougin, Fleur; Grabar, Natalia

    2014-01-01

    Objective This work focuses on multiply-related Unified Medical Language System (UMLS) concepts, that is, concepts associated through multiple relations. The relations involved in such situations are audited to determine whether they are provided by source vocabularies or result from the integration of these vocabularies within the UMLS. Methods We study the compatibility of the multiple relations which associate the concepts under investigation and try to explain the reason why they co-occur. Towards this end, we analyze the relations both at the concept and term levels. In addition, we randomly select 288 concepts associated through contradictory relations and manually analyze them. Results At the UMLS scale, only 0.7% of combinations of relations are contradictory, while homogeneous combinations are observed in one-third of situations. At the scale of source vocabularies, one-third do not contain more than one relation between the concepts under investigation. Among the remaining source vocabularies, seven of them mainly present multiple non-homogeneous relations between terms. Analysis at the term level also shows that only in a quarter of cases are the source vocabularies responsible for the presence of multiply-related concepts in the UMLS. These results are available at: http://www.isped.u-bordeaux2.fr/ArticleJAMIA/results_multiply_related_concepts.aspx. Discussion Manual analysis was useful to explain the conceptualization difference in relations between terms across source vocabularies. The exploitation of source relations was helpful for understanding why some source vocabularies describe multiple relations between a given pair of terms. PMID:24464853

  16. The European Infrasound Bulletin

    NASA Astrophysics Data System (ADS)

    Pilger, Christoph; Ceranna, Lars; Ross, J. Ole; Vergoz, Julien; Le Pichon, Alexis; Brachet, Nicolas; Blanc, Elisabeth; Kero, Johan; Liszka, Ludwik; Gibbons, Steven; Kvaerna, Tormod; Näsholm, Sven Peter; Marchetti, Emanuele; Ripepe, Maurizio; Smets, Pieter; Evers, Laslo; Ghica, Daniela; Ionescu, Constantin; Sindelarova, Tereza; Ben Horin, Yochai; Mialle, Pierrick

    2018-05-01

    The European Infrasound Bulletin highlights infrasound activity, produced mostly by anthropogenic sources, recorded all over Europe and collected in the course of the ARISE and ARISE2 projects (Atmospheric dynamics Research InfraStructure in Europe). The data include high-frequency (> 0.7 Hz) infrasound detections at 24 European infrasound arrays from nine different national institutions, complemented with infrasound stations of the International Monitoring System for the Comprehensive Nuclear-Test-Ban Treaty (CTBT). Data were acquired during 16 years of operation (from 2000 to 2015) and processed to identify and locate ˜48,000 infrasound events within Europe. The source locations of these events were derived by combining at least two corresponding station detections per event. Comparisons with ground-truth sources, e.g., Scandinavian mining activity, are provided, as well as comparisons with the CTBT Late Event Bulletin (LEB). Relocation is performed using ray-tracing methods to estimate celerity and back-azimuth corrections for source location, based on meteorological wind and temperature values derived for each event from European Centre for Medium-Range Weather Forecasts (ECMWF) data. This study focuses on the analysis of repeating, man-made infrasound events (e.g., mining blasts and supersonic flights) and on the seasonal, weekly and diurnal variation of the infrasonic activity of sources in Europe. Comparison with previous studies shows that this study improves detection, association and location by increasing the station density and thus the number of events and determined source regions. This improves the capability of the infrasound station network in Europe to more comprehensively estimate the activity of anthropogenic infrasound sources.
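Locating an event from two arrays amounts to intersecting the two back-azimuth bearings. A flat-earth sketch of this cross-bearing step (simplified geometry; the bulletin additionally applies celerity estimates and ray-traced wind/temperature corrections to the azimuths):

```python
import math

def cross_bearing(p1, az1_deg, p2, az2_deg):
    """Intersect two bearing rays on a flat (east, north) plane [km].

    az is the direction from the station toward the source, measured
    clockwise from north. Fails (det = 0) for parallel bearings.
    """
    d1 = (math.sin(math.radians(az1_deg)), math.cos(math.radians(az1_deg)))
    d2 = (math.sin(math.radians(az2_deg)), math.cos(math.radians(az2_deg)))
    # Solve p1 + t1*d1 = p2 + t2*d2 for t1 by Cramer's rule on the 2x2 system
    det = d1[0] * (-d2[1]) - (-d2[0]) * d1[1]
    rx, ry = p2[0] - p1[0], p2[1] - p1[1]
    t1 = (rx * (-d2[1]) - (-d2[0]) * ry) / det
    return (p1[0] + t1 * d1[0], p1[1] + t1 * d1[1])

# A source at (100, 100) km seen from two arrays 200 km apart:
x, y = cross_bearing((0.0, 0.0), 45.0, (200.0, 0.0), 315.0)
assert abs(x - 100.0) < 1e-6 and abs(y - 100.0) < 1e-6
```

With more than two detecting arrays the bearings generally do not meet in a single point, and the event location becomes a least-squares problem over all bearings.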

  17. New era of electronic brachytherapy

    PubMed Central

    Ramachandran, Prabhakar

    2017-01-01

    Traditional brachytherapy refers to the placement of radioactive sources on or inside the cancer tissues. Based on the type of sources, brachytherapy can be classified as radionuclide and electronic brachytherapy. Electronic brachytherapy uses miniaturized X-ray sources instead of radionuclides to deliver high doses of radiation. The advantages of electronic brachytherapy include low dose to organs at risk, reduced dose to treating staff, no leakage radiation in off state, less shielding, and no radioactive waste. Most of these systems operate between 50 and 100 kVp and are widely used in the treatment of skin cancer. Intrabeam, Xoft and Papillon systems are also used in the treatment of intra-operative radiotherapy to breast in addition to other treatment sites. The rapid fall-off in the dose due to its low energy is a highly desirable property in brachytherapy and results in a reduced dose to the surrounding normal tissues compared to the Ir-192 source. The Xoft Axxent brachytherapy system uses a 2.25 mm miniaturized X-ray tube and the source almost mimics the high dose rate Ir-192 source in terms of dose rate and it is the only electronic brachytherapy system specifically used in the treatment of cervical cancers. One of the limiting factors that impede the use of electronic brachytherapy for interstitial application is the source dimension. However, it is highly anticipated that the design of miniaturized X-ray tube closer to the dimension of an Ir-192 wire is not too far away, and the new era of electronic brachytherapy has just begun. PMID:28529679
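The steeper dose fall-off of a low-energy X-ray source relative to Ir-192 can be illustrated with a toy inverse-square-plus-attenuation model. The effective attenuation coefficients below are hypothetical, chosen only to show the trend; clinical brachytherapy dosimetry uses the TG-43 formalism, not this simplification:

```python
import math

def relative_dose(r_cm, mu_per_cm, r_ref=1.0):
    """Point-source dose at depth r relative to the dose at r_ref (1 cm),
    modeling inverse-square fall-off plus exponential attenuation.
    Illustrative model only, not clinical dosimetry."""
    return (r_ref / r_cm) ** 2 * math.exp(-mu_per_cm * (r_cm - r_ref))

mu_50kvp, mu_ir192 = 0.25, 0.04   # hypothetical effective values [1/cm]

# At 3 cm the low-energy source delivers a smaller fraction of its 1 cm
# dose than Ir-192 does, i.e. it spares surrounding normal tissue more.
assert relative_dose(3.0, mu_50kvp) < relative_dose(3.0, mu_ir192)
```

The same fall-off that spares normal tissue also means the prescription point must sit close to the source, which is why miniaturizing the X-ray tube toward Ir-192 wire dimensions matters for interstitial use.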

  18. Food sources of sodium, saturated fat, and added sugar in the Physical Activity and Nutrition for Diabetes in Alberta (PANDA) trial.

    PubMed

    Asaad, Ghada; Chan, Catherine B

    2017-12-01

    Diabetic patients may find it difficult to achieve the recommended nutrient intakes embedded within dietary guidelines. The objective of this analysis was to document total sodium, saturated fat, and added sugar intake, as well as the main food sources of these nutrients, in Canadian adults with type 2 diabetes before and after an intervention focused on healthy eating. Participants were enrolled in a single-arm dietary intervention trial designed to improve glycemic control and adherence to dietary recommendations. A 4-week menu plan and recipes were provided for participants along with a 6-week educational curriculum. Three repeated 24-h dietary recalls were collected at baseline and 3 months. Food sources of sodium, saturated fat, and added sugar were a secondary outcome derived from the dietary recalls. After 3 months, there was a reduction (p < 0.05) in sodium intake of 561 mg/day, which was mainly due to reduced consumption of processed meats, soups, and condiments. Significantly lower intake of processed meat contributed to a 2.9 g/day reduction in saturated fat intake (p < 0.1), while added sugar intake declined by 7 g/day (p < 0.1) due to lower consumption of baked goods/desserts and chocolate (both p < 0.05). The intervention was beneficial for type 2 diabetes patients in terms of changing dietary habits. However, the majority of the participants still exceeded the dietary guidelines for sodium and saturated fat. In addition to the efforts of individuals and their healthcare providers, strategies to increase the nutritional quality of prepared foods could provide widespread benefits.

  19. Groundwater quality in Ghaziabad district, Uttar Pradesh, India: Multivariate and health risk assessment.

    PubMed

    Chabukdhara, Mayuri; Gupta, Sanjay Kumar; Kotecha, Yatharth; Nema, Arvind K

    2017-07-01

    This study aimed to assess the quality of groundwater and the potential health risk due to ingestion of heavy metals in the peri-urban and urban-industrial clusters of Ghaziabad district, Uttar Pradesh, India. Furthermore, the study aimed to evaluate heavy metal sources and their pollution level using multivariate analysis and fuzzy comprehensive assessment (FCA), respectively. Multivariate analysis using principal component analysis (PCA) showed a mixed origin for Pb, Cd, Zn, Fe, and Ni, a natural source for Cu and Mn, and an anthropogenic source for Cr. Among the metals, Pb, Cd, Fe and Ni were above the safe limits of the Bureau of Indian Standards (BIS) and the World Health Organization (WHO), except Ni. Health risk in terms of the hazard quotient (HQ) showed that the HQ values for children were higher than the safe level (HQ = 1) for Pb (2.4) and Cd (2.1) in the pre-monsoon, while in the post-monsoon the value exceeded it only for Pb (HQ = 1.23). The health risks of heavy metals for adults were well within safe limits. The findings of this study indicate potential health risks to children due to chronic exposure to contaminated groundwater in the region. Based on FCA, groundwater pollution could be categorized as quite high in the peri-urban region and absolutely high in the urban region of Ghaziabad district. This study showed that different approaches are required for the integrated assessment of groundwater pollution, and it provides a scientific basis for strategic future planning and comprehensive management. Copyright © 2017 Elsevier Ltd. All rights reserved.
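Hazard quotients of this kind follow the standard chronic-ingestion form HQ = CDI/RfD, with the chronic daily intake CDI = (C × IR × EF × ED)/(BW × AT). A sketch with hypothetical exposure parameters for a generic metal (not the study's measured values or reference doses):

```python
def hazard_quotient(c_mg_per_l, rfd_mg_per_kg_day,
                    ir_l_per_day, ef_days_per_yr, ed_years,
                    bw_kg, at_days):
    """HQ = CDI / RfD, with chronic daily intake
    CDI = (C * IR * EF * ED) / (BW * AT).
    C: concentration, IR: ingestion rate, EF: exposure frequency,
    ED: exposure duration, BW: body weight, AT: averaging time."""
    cdi = (c_mg_per_l * ir_l_per_day * ef_days_per_yr * ed_years) / (bw_kg * at_days)
    return cdi / rfd_mg_per_kg_day

# Hypothetical child exposure to a metal in drinking water (illustrative):
hq = hazard_quotient(c_mg_per_l=0.06, rfd_mg_per_kg_day=0.0035,
                     ir_l_per_day=1.0, ef_days_per_yr=365, ed_years=6,
                     bw_kg=15.0, at_days=6 * 365)
assert hq > 1.0   # HQ above 1 flags potential non-carcinogenic risk
```

The lower body weight of children is the main reason their HQ values exceed those of adults for the same water concentration, as reported above for Pb and Cd.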

  20. Multicriteria analysis for sources of renewable energy using data from remote sensing

    NASA Astrophysics Data System (ADS)

    Matejicek, L.

    2015-04-01

    Renewable energy sources are major components of the strategy to reduce harmful emissions and to replace depleting fossil energy resources. Data from remote sensing can provide information for multicriteria analysis of sources of renewable energy. Advanced land cover quantification makes it possible to search for suitable sites. Multicriteria analysis, together with other data, is used to determine the energy potential and social acceptability of suggested locations. The described case study is focused on an area of surface coal mines in the northwestern region of the Czech Republic, where the impacts of surface mining and reclamation constitute a dominant force in land cover changes. High-resolution satellite images represent the main input datasets for identification of suitable sites. Solar mapping, wind predictions, the location of weirs in watersheds, road maps and demographic information complement the data from remote sensing for multicriteria analysis, which is implemented in a geographic information system (GIS). The input spatial datasets for multicriteria analysis in GIS are reclassified to a common scale and processed with raster algebra tools to identify suitable sites for sources of renewable energy. The selection of suitable sites is limited by the CORINE land cover database to mining and agricultural areas. The case study is focused on long-term land cover changes in the 1985-2015 period. Multicriteria analysis based on CORINE data shows moderate changes in the mapping of suitable sites for utilization of selected sources of renewable energy in 1990, 2000, 2006 and 2012. The results represent map layers showing the energy potential on a scale of a few preference classes (1-7), where the first class is linked to minimum preference and the last class to maximum preference. The attached histograms show the moderate variability of preference classes due to land cover changes caused by mining activities.
The results also show a slight increase in the more preferred classes for utilization of sources of renewable energy due to an increased area of reclaimed sites. Using data from remote sensing, such as multispectral images and the CORINE land cover datasets, can reduce the financial resources currently required for finding and assessing suitable areas.
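The reclassify-to-a-common-scale and raster-algebra steps can be sketched on small arrays. The weights, class breaks, and exclusion mask below are illustrative, not the study's actual criteria:

```python
import numpy as np

rng = np.random.default_rng(1)

# Two criterion rasters normalized to 0-1, e.g. solar and wind potential.
solar = rng.random((50, 50))
wind = rng.random((50, 50))

def to_classes(raster, n_classes=7):
    """Reclassify a 0-1 raster to the common 1-7 preference scale."""
    edges = np.linspace(0.0, 1.0, n_classes + 1)[1:-1]  # interior breaks
    return np.digitize(raster, edges) + 1               # values in 1..7

# Weighted overlay (raster algebra) of the reclassified criteria.
suit = np.rint(0.6 * to_classes(solar) + 0.4 * to_classes(wind)).astype(int)

# Mask out cells excluded by land cover (e.g. outside mining/agricultural
# areas in CORINE); 0 marks excluded cells.
mask = rng.random((50, 50)) > 0.5
suit = np.where(mask, suit, 0)

assert suit.max() <= 7 and suit.min() >= 0
```

A real workflow would read the criterion rasters from GIS layers, align them to a common grid and projection, and choose weights via a method such as pairwise comparison, but the core overlay is exactly this per-cell arithmetic.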

  1. The long-term problems of contaminated land: Sources, impacts and countermeasures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baes, C.F. III

    1986-11-01

    This report examines the various sources of radiological land contamination; its extent; its impacts on man, agriculture, and the environment; countermeasures for mitigating exposures; radiological standards; alternatives for achieving land decontamination and cleanup; and possible alternatives for utilizing the land. The major potential sources of extensive long-term land contamination with radionuclides, in order of decreasing extent, are nuclear war, detonation of a single nuclear weapon (e.g., a terrorist act), serious reactor accidents, and nonfission nuclear weapons accidents that disperse the nuclear fuels (termed ''broken arrows'').

  2. Biotic Nitrogen Enrichment Regulates Calcium Sources to Forests

    NASA Astrophysics Data System (ADS)

    Pett-Ridge, J. C.; Perakis, S. S.; Hynicka, J. D.

    2015-12-01

    Calcium is an essential nutrient in forest ecosystems that is susceptible to leaching loss and depletion. Calcium depletion can affect plant and animal productivity, soil acid buffering capacity, and fluxes of carbon and water. Excess nitrogen supply and associated soil acidification are often implicated in short-term calcium loss from soils, but the long-term role of nitrogen enrichment in calcium sources and resupply is unknown. Here we use strontium isotopes (87Sr/86Sr) as a proxy for calcium to investigate how soil nitrogen enrichment from biological nitrogen fixation interacts with bedrock calcium to regulate both short-term available supplies and the long-term sources of calcium in montane conifer forests. Our study examines 22 sites in western Oregon, spanning a 20-fold range of bedrock calcium on sedimentary and basaltic lithologies. In contrast to previous studies emphasizing abiotic control of weathering as a determinant of long-term ecosystem calcium dynamics and sources (via bedrock fertility, climate, or topographic/tectonic controls), we find instead that biotic nitrogen enrichment of soil can strongly regulate calcium sources and supplies in forest ecosystems. For forests on calcium-rich basaltic bedrock, increasing nitrogen enrichment causes calcium sources to shift from rock-weathering to atmospheric dominance, with minimal influence from other major soil-forming factors, despite regionally high rates of tectonic uplift and erosion that can rejuvenate the weathering supply of soil minerals. For forests on calcium-poor sedimentary bedrock, we find that atmospheric inputs dominate regardless of the degree of nitrogen enrichment. Short-term measures of soil and ecosystem calcium fertility are decoupled from calcium source sustainability, with fundamental implications for understanding nitrogen impacts, both in natural ecosystems and in the context of global change.
Our finding that long-term nitrogen enrichment increases forest reliance on atmospheric calcium helps explain reports of greater ecological calcium limitation in an increasingly nitrogen-rich world.
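The 87Sr/86Sr proxy supports a two-endmember mixing calculation between rock-weathering and atmospheric sources. Neglecting Sr-concentration weighting for simplicity, and using illustrative, textbook-scale endmember ratios rather than the study's measurements:

```python
def atmospheric_fraction(r_sample, r_rock, r_atm):
    """Two-endmember 87Sr/86Sr mixing: fraction of Sr (proxy for Ca)
    derived from atmospheric input (concentration weighting omitted)."""
    return (r_sample - r_rock) / (r_atm - r_rock)

# Illustrative ratios: basaltic bedrock near 0.7035, seawater-derived
# atmospheric input near 0.7092 (generic values, not site data).
f = atmospheric_fraction(r_sample=0.7080, r_rock=0.7035, r_atm=0.7092)
assert 0.75 < f < 0.82  # this sample's Sr is mostly atmospheric
```

Basaltic and sedimentary bedrock differ strongly in 87Sr/86Sr, which is what makes the endmembers separable and lets the study detect the shift toward atmospheric dominance under nitrogen enrichment.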

  3. Advanced Reactor PSA Methodologies for System Reliability Analysis and Source Term Assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grabaskas, D.; Brunett, A.; Passerini, S.

    Beginning in 2015, a project was initiated to update and modernize the probabilistic safety assessment (PSA) of the GE-Hitachi PRISM sodium fast reactor. This project is a collaboration between GE-Hitachi and Argonne National Laboratory (Argonne), and is funded in part by the U.S. Department of Energy. Specifically, the role of Argonne is to assess the reliability of passive safety systems, complete a mechanistic source term calculation, and provide component reliability estimates. The assessment of passive system reliability focused on the performance of the Reactor Vessel Auxiliary Cooling System (RVACS) and the inherent reactivity feedback mechanisms of the metal fuel core. The mechanistic source term assessment attempted to provide a sequence-specific source term evaluation to quantify offsite consequences. Lastly, the reliability assessment focused on components specific to the sodium fast reactor, including electromagnetic pumps, intermediate heat exchangers, the steam generator, and sodium valves and piping.

  4. Environmental Assessment: Construct Mass/Mobility Parking Lot at Grand Forks AFB, North Dakota

    DTIC Science & Technology

    2004-02-13

    Water: Surface water quality could be degraded, both in the short-term, during actual construction, and over the long-term due to reduced storm water quality caused by the increase of exposed soil.

  5. Mathematical Fluid Dynamics of Store and Stage Separation

    DTIC Science & Technology

    2005-05-01

    coordinates; r = stretched inner radius; S,(x) = effective source strength; Re, = transition Reynolds number; t = time; r = reflection coefficient; T = temperature … the wave drag due to lift integral has the same form as that due to thickness; the source strength of the equivalent body depends on streamwise derivatives … revolution in which the source strength S,(x) is proportional to the x rate of change of cross-sectional area; the source strength depends on the streamwise

  6. The mass-zero spin-two field and gravitational theory.

    NASA Technical Reports Server (NTRS)

    Coulter, C. A.

    1972-01-01

    Demonstration that the conventional theory of the mass-zero spin-two field with sources introduces extraneous non-spin-two field components in source regions and fails to be covariant under the full or restricted conformal group. A modified theory is given, expressed in terms of the physical components of the mass-zero spin-two field rather than in terms of 'potentials', which has no extraneous components inside or outside sources, and which is covariant under the full conformal group. For a proper choice of source term, this modified theory has the correct Newtonian limit and automatically implies that a symmetric second-rank source tensor has zero divergence. It is shown that a generally covariant form of the spin-two theory derived here can possibly be constructed to agree with general relativity in all currently accessible experimental situations.
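For context, in the conventional linearized theory the divergence-free requirement on the source appears as follows (standard trace-reversed notation in Lorenz gauge, not the paper's modified formulation):

```latex
\Box \bar{h}_{\mu\nu} = -16\pi G\,T_{\mu\nu},
\qquad
\bar{h}_{\mu\nu} \equiv h_{\mu\nu} - \tfrac{1}{2}\eta_{\mu\nu}h,
\qquad
\partial^{\mu}T_{\mu\nu} = 0,
```

where the Lorenz gauge condition \(\partial^{\mu}\bar{h}_{\mu\nu} = 0\), applied to the field equation, forces the vanishing divergence of the source tensor, the property the modified theory derives automatically.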

  7. A review on effectiveness of best management practices in improving hydrology and water quality: Needs and opportunities.

    PubMed

    Liu, Yaoze; Engel, Bernard A; Flanagan, Dennis C; Gitau, Margaret W; McMillan, Sara K; Chaubey, Indrajeet

    2017-12-01

    Best management practices (BMPs) have been widely used to address hydrology and water quality issues in both agricultural and urban areas. Increasing numbers of BMPs have been studied in research projects and implemented in watershed management projects, but a gap remains in quantifying their effectiveness through time. In this paper, we review the current knowledge about BMP efficiencies, which indicates that most empirical studies have focused on short-term efficiencies, while few have explored long-term efficiencies. Most simulation efforts that consider BMPs assume constant performance irrespective of ages of the practices, generally based on anticipated maintenance activities or the expected performance over the life of the BMP(s). However, efficiencies of BMPs likely change over time irrespective of maintenance due to factors such as degradation of structures and accumulation of pollutants. Generally, the impacts of BMPs implemented in water quality protection programs at watershed levels have not been as rapid or large as expected, possibly due to overly high expectations for practice long-term efficiency, with BMPs even being sources of pollutants under some conditions and during some time periods. The review of available datasets reveals that current data are limited regarding both short-term and long-term BMP efficiency. Based on this review, this paper provides suggestions regarding needs and opportunities. Existing practice efficiency data need to be compiled. New data on BMP efficiencies that consider important factors, such as maintenance activities, also need to be collected. Then, the existing and new data need to be analyzed. Further research is needed to create a framework, as well as modeling approaches built on the framework, to simulate changes in BMP efficiencies with time. 
The research community needs to work together in addressing these needs and opportunities, which will assist decision makers in formulating better decisions regarding BMP implementation in watershed management projects. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. Is there an environmental benefit from remediation of a contaminated site? Combined assessments of the risk reduction and life cycle impact of remediation.

    PubMed

    Lemming, Gitte; Chambon, Julie C; Binning, Philip J; Bjerg, Poul L

    2012-12-15

    A comparative life cycle assessment is presented for four different management options for a trichloroethene-contaminated site with a contaminant source zone located in a fractured clay till. The compared options are (i) long-term monitoring (ii) in-situ enhanced reductive dechlorination (ERD), (iii) in-situ chemical oxidation (ISCO) with permanganate and (iv) long-term monitoring combined with treatment by activated carbon at the nearby waterworks. The life cycle assessment included evaluation of both primary and secondary environmental impacts. The primary impacts are the local human toxic impacts due to contaminant leaching into groundwater that is used for drinking water, whereas the secondary environmental impacts are related to remediation activities such as monitoring, drilling and construction of wells and use of remedial amendments. The primary impacts for the compared scenarios were determined by a numerical risk assessment and remedial performance model, which predicted the contaminant mass discharge over time at a point of compliance in the aquifer and at the waterworks. The combined assessment of risk reduction and life cycle impacts showed that all management options result in higher environmental impacts than they remediate, in terms of person equivalents and assuming equal weighting of all impacts. The ERD and long-term monitoring were the scenarios with the lowest secondary life cycle impacts and are therefore the preferred alternatives. However, if activated carbon treatment at the waterworks is required in the long-term monitoring scenario, then it becomes unfavorable because of large secondary impacts. ERD is favorable due to its low secondary impacts, but only if leaching of vinyl chloride to the groundwater aquifer can be avoided. Remediation with ISCO caused the highest secondary impacts and cannot be recommended for the site. Copyright © 2012 Elsevier Ltd. All rights reserved.

  9. Sources of Sodium in the Lunar Exosphere: Modeling Using Ground-Based Observations of Sodium Emission and Spacecraft Data of the Plasma

    NASA Technical Reports Server (NTRS)

    Sarantos, Menelaos; Killen, Rosemary M.; Sharma, A. Surjalal; Slavin, James A.

    2009-01-01

    Observations of the equatorial lunar sodium emission are examined to quantify the effect of precipitating ions on source rates for the Moon's exospheric volatile species. Using a model of exospheric sodium transport under lunar gravity forces, the measured emission intensity is normalized to a constant lunar phase angle to minimize the effect of different viewing geometries. Daily averages of the solar Lyman alpha flux and ion flux are used as the input variables for photon-stimulated desorption (PSD) and ion sputtering, respectively, while impact vaporization due to the micrometeoritic influx is assumed constant. Additionally, a proxy term proportional to both the Lyman alpha and to the ion flux is introduced to assess the importance of ion-enhanced diffusion and/or chemical sputtering. The combination of particle transport and constrained regression models demonstrates that, assuming sputtering yields that are typical of protons incident on lunar soils, the primary effect of ion impact on the surface of the Moon is not direct sputtering but rather an enhancement of the PSD efficiency. It is inferred that the ion-induced effects must double the PSD efficiency for flux typical of the solar wind at 1 AU. The enhancement in relative efficiency of PSD due to the bombardment of the lunar surface by the plasma sheet ions during passages through the Earth's magnetotail is shown to be approximately two times higher than when it is due to solar wind ions. This leads to the conclusion that the priming of the surface is more efficiently carried out by the energetic plasma sheet ions.
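
The constrained-regression step described above — fitting normalized emission intensity to a PSD term, a sputtering term, an ion-enhanced-PSD proxy proportional to both drivers, and a constant impact-vaporization term — can be sketched as a non-negative least-squares fit. The coefficient values, noise level, and linear source model below are illustrative assumptions, not the paper's actual fit:

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(0)
n = 200
lyman_alpha = rng.uniform(0.8, 1.2, n)   # normalized daily Lyman-alpha flux
ion_flux = rng.uniform(0.0, 3.0, n)      # normalized precipitating-ion flux

# Design matrix: PSD term, direct sputtering term, ion-enhanced-PSD proxy
# (proportional to both drivers), and a constant impact-vaporization term.
A = np.column_stack([lyman_alpha, ion_flux,
                     lyman_alpha * ion_flux,
                     np.ones(n)])

true = np.array([1.0, 0.05, 0.9, 0.3])   # hypothetical source-rate coefficients
intensity = A @ true + rng.normal(0.0, 0.02, n)

# Non-negativity keeps the inferred source rates physical.
coef, rnorm = nnls(A, intensity)
```

A small direct-sputtering coefficient alongside a large cross-term, as in this synthetic recovery, is the pattern the abstract describes: ion impact acts mainly by enhancing PSD efficiency rather than by sputtering directly.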

  10. Regulatory Technology Development Plan - Sodium Fast Reactor: Mechanistic Source Term – Trial Calculation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grabaskas, David; Bucknor, Matthew; Jerden, James

    2016-10-01

    The potential release of radioactive material during a plant incident, referred to as the source term, is a vital design metric and will be a major focus of advanced reactor licensing. The U.S. Nuclear Regulatory Commission has stated an expectation for advanced reactor vendors to present a mechanistic assessment of the potential source term in their license applications. The mechanistic source term presents an opportunity for vendors to realistically assess the radiological consequences of an incident, and may allow reduced emergency planning zones and smaller plant sites. However, the development of a mechanistic source term for advanced reactors is not without challenges, as there are often numerous phenomena impacting the transportation and retention of radionuclides. This project sought to evaluate U.S. capabilities regarding the mechanistic assessment of radionuclide release from core damage incidents at metal-fueled, pool-type sodium fast reactors (SFRs). The purpose of the analysis was to identify, and prioritize, any gaps regarding computational tools or data necessary for the modeling of radionuclide transport and retention phenomena. To accomplish this task, a parallel-path analysis approach was utilized. One path, led by Argonne and Sandia National Laboratories, sought to perform a mechanistic source term assessment using available codes, data, and models, with the goal of identifying gaps in the current knowledge base. The second path, performed by an independent contractor, conducted sensitivity analyses to determine the importance of particular radionuclides and transport phenomena with regard to offsite consequences. The results of the two pathways were combined to prioritize gaps in current capabilities.

  11. Neuroimaging Evidence for Agenda-Dependent Monitoring of Different Features during Short-Term Source Memory Tests

    ERIC Educational Resources Information Center

    Mitchell, Karen J.; Raye, Carol L.; McGuire, Joseph T.; Frankel, Hillary; Greene, Erich J.; Johnson, Marcia K.

    2008-01-01

    A short-term source monitoring procedure with functional magnetic resonance imaging assessed neural activity when participants made judgments about the format of 1 of 4 studied items (picture, word), the encoding task performed (cost, place), or whether an item was old or new. The results support findings from long-term memory studies showing that…

  12. Evaluation of Long-term Performance of Enhanced Anaerobic Source Zone Bioremediation using mass flux

    NASA Astrophysics Data System (ADS)

    Haluska, A.; Cho, J.; Hatzinger, P.; Annable, M. D.

    2017-12-01

    Chlorinated ethene DNAPL source zones in groundwater act as potential long-term sources of contamination as they dissolve, yielding concentrations well above MCLs and posing an ongoing public health risk. Enhanced bioremediation has been applied to treat many source zones with significant promise, but the long-term sustainability of this technology has not been thoroughly assessed. This study evaluated the long-term effectiveness of enhanced anaerobic source zone bioremediation at chloroethene-contaminated sites to determine whether the treatment prevented contaminant rebound and removed NAPL from the source zone. Long-term performance was evaluated based on achieving MCL-based contaminant mass fluxes in parent compound concentrations during different monitoring periods. Groundwater concentration versus time data were compiled for six sites, and post-remedial contaminant mass fluxes were measured using passive flux meters at wells both within and down-gradient of the source zone. Post-remedial mass flux data were then combined with pre-remedial water quality data to estimate pre-remedial mass flux. This information was used to characterize a DNAPL dissolution source strength function, such as the Power Law Model or the Equilibrium Streamtube Model. The six sites characterized in this study were (1) Former Charleston Air Force Base, Charleston, SC; (2) Dover Air Force Base, Dover, DE; (3) Treasure Island Naval Station, San Francisco, CA; (4) Former Raritan Arsenal, Edison, NJ; (5) Naval Air Station, Jacksonville, FL; and (6) Former Naval Air Station, Alameda, CA. Contaminant mass fluxes decreased at all sites by the end of the post-treatment monitoring period, and rebound was limited within the source zone. Post-remedial source strength function estimates suggest that decreases in contaminant mass flux will continue at these sites, but a mass flux goal based on MCLs may never be achieved. Thus, site clean-up goals should be evaluated as order-of-magnitude reductions. Additionally, sites may require monitoring for a minimum of five years to sufficiently evaluate remedial performance. The study shows that enhanced anaerobic source zone bioremediation contributed to a modest reduction of source zone contaminant mass discharge and appears to have mitigated rebound of chlorinated ethenes.
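
The Power Law source-strength function mentioned above relates aqueous concentration to remaining source mass, C/C0 = (M/M0)^Γ, coupled to a simple mass balance dM/dt = -Q·C. A minimal sketch of how such a depletion curve is generated, with all parameter values hypothetical:

```python
import numpy as np

def power_law_depletion(m0, c0, q, gamma, dt, t_end):
    """Integrate the Power Law source-strength function:
    C/C0 = (M/M0)**gamma with mass balance dM/dt = -q*C (explicit Euler)."""
    times = np.arange(0.0, t_end, dt)
    m, conc = m0, c0
    history = []
    for t in times:
        history.append((t, conc))
        m = max(m - q * conc * dt, 0.0)       # mass removed by dissolution
        conc = c0 * (m / m0) ** gamma          # concentration tracks mass
    return history

# Hypothetical source: 1000 kg DNAPL, 50 mg/L initial concentration,
# q lumps Darcy flux and source-zone cross-section; gamma = 1 gives
# the classic exponential-decline limit.
hist = power_law_depletion(m0=1000.0, c0=50.0, q=0.5, gamma=1.0,
                           dt=1.0, t_end=100.0)
```

Fitting Γ to pre- and post-remedial flux estimates, as in the study, determines whether concentration declines faster (Γ > 1) or slower (Γ < 1) than the remaining mass.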

  13. NASA thesaurus. Volume 3: Definitions

    NASA Technical Reports Server (NTRS)

    1988-01-01

    Publication of NASA Thesaurus definitions began with Supplement 1 to the 1985 NASA Thesaurus. The definitions given here represent the complete file of over 3,200 definitions, complemented by nearly 1,000 use references. Definitions of more common or general scientific terms are given a NASA slant if one exists. Certain terms are not defined as a matter of policy: common names, chemical elements, specific models of computers, and nontechnical terms. The NASA Thesaurus predates by a number of years the systematic effort to define terms; therefore, not all Thesaurus terms have been defined. Nevertheless, definitions of older terms are continually being added. The following data are provided for each entry: term in uppercase/lowercase form, definition, source, and year the term (not the definition) was added to the NASA Thesaurus. The NASA History Office is the authority for capitalization in satellite and spacecraft names. Definitions with no source given were constructed by lexicographers at the NASA Scientific and Technical Information (STI) Facility who rely on the following sources for their information: experts in the field, literature searches from the NASA STI database, and specialized references.

  14. Comparative evaluation of statistical and mechanistic models of Escherichia coli at beaches in southern Lake Michigan

    USGS Publications Warehouse

    Safaie, Ammar; Wendzel, Aaron; Ge, Zhongfu; Nevers, Meredith; Whitman, Richard L.; Corsi, Steven R.; Phanikumar, Mantha S.

    2016-01-01

    Statistical and mechanistic models are popular tools for predicting the levels of indicator bacteria at recreational beaches. Researchers tend to use one class of model or the other, and it is difficult to generalize statements about their relative performance due to differences in how the models are developed, tested, and used. We describe a cooperative modeling approach for freshwater beaches impacted by point sources in which insights derived from mechanistic modeling were used to further improve the statistical models and vice versa. The statistical models provided a basis for assessing the mechanistic models which were further improved using probability distributions to generate high-resolution time series data at the source, long-term “tracer” transport modeling based on observed electrical conductivity, better assimilation of meteorological data, and the use of unstructured-grids to better resolve nearshore features. This approach resulted in improved models of comparable performance for both classes including a parsimonious statistical model suitable for real-time predictions based on an easily measurable environmental variable (turbidity). The modeling approach outlined here can be used at other sites impacted by point sources and has the potential to improve water quality predictions resulting in more accurate estimates of beach closures.

  15. Mobile sensing of point-source fugitive methane emissions using Bayesian inference: the determination of the likelihood function

    NASA Astrophysics Data System (ADS)

    Zhou, X.; Albertson, J. D.

    2016-12-01

    Natural gas is considered a bridge fuel toward clean energy due to its potentially lower greenhouse gas emissions compared with other fossil fuels. Despite numerous efforts, an efficient and cost-effective approach to monitoring fugitive methane emissions along the natural gas production-supply chain has not yet been developed. Recently, mobile methane measurement has been introduced, which applies a Bayesian approach to probabilistically infer methane emission rates and update estimates recursively as new measurements become available. However, the likelihood function, especially the error term that determines the shape of the estimate uncertainty, has not been rigorously defined and evaluated with field data. To address this issue, we performed a series of near-source (< 30 m) controlled methane release experiments using a specialized vehicle mounted with fast-response methane analyzers and a GPS unit. Methane concentrations were measured at two different heights along mobile traversals downwind of the sources, and concurrent wind and temperature data were recorded by nearby 3-D sonic anemometers. With known methane release rates, the measurements were used to determine the functional form and the parameterization of the likelihood function in the Bayesian inference scheme under different meteorological conditions.
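
The recursive Bayesian update described above can be sketched on a discretized grid of candidate leak rates. The linear plume model, Gaussian likelihood, and all parameter values below are illustrative assumptions; the paper's contribution is precisely the field calibration of the error term σ:

```python
import numpy as np

# Hypothetical grid of candidate leak rates (g/s) with a flat prior.
rates = np.linspace(0.0, 2.0, 401)
posterior = np.ones_like(rates) / rates.size

def predicted_conc(rate, dispersion=0.3):
    # Placeholder forward model: downwind concentration ~ rate * dispersion.
    return rate * dispersion

true_rate = 0.8
sigma = 0.05  # likelihood error term (the field-calibrated quantity)
rng = np.random.default_rng(1)

for _ in range(20):  # each mobile traversal yields one measurement
    obs = predicted_conc(true_rate) + rng.normal(0.0, sigma)
    like = np.exp(-0.5 * ((obs - predicted_conc(rates)) / sigma) ** 2)
    posterior *= like               # recursive Bayesian update
    posterior /= posterior.sum()    # renormalize

estimate = rates[np.argmax(posterior)]
```

The shape of `like` — hence the width of the posterior after each traversal — is controlled entirely by σ, which is why determining it from controlled releases matters.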

  16. The Radiated Field Generated by a Monopole Source in a Short, Rigid, Rectangular Duct. Degree awarded by George Washington Univ.

    NASA Technical Reports Server (NTRS)

    Lakota, Barbara Anne

    1998-01-01

    This thesis develops a method to model the acoustic field generated by a monopole source placed in a moving rectangular duct. The walls of the duct are assumed to be infinitesimally thin and the source is placed at the center of the duct. The total acoustic pressure is written in terms of the free-space pressure, or incident pressure, and the scattered pressure. The scattered pressure is the augmentation to the incident pressure due to the presence of the duct. It satisfies a homogeneous wave equation and is discontinuous across the duct walls. Utilizing an integral representation of the scattered pressure, a set of singular boundary integral equations governing the unknown jump in scattered pressure is derived. This equation is solved by the method of collocation after representing the jump in pressure as a double series of shape functions. The solution obtained is then substituted back into the integral representation to determine the scattered pressure, and the total acoustic pressure at any point in the field. A few examples are included to illustrate the influence of various geometric and kinematic parameters on the radiated sound field.

  17. Beer as a Rich Source of Fluoride Delivered into the Body.

    PubMed

    Styburski, D; Baranowska-Bosiacka, I; Goschorska, M; Chlubek, D; Gutowska, I

    2017-06-01

    Fluoride is an element that, in small amounts, is necessary for the proper formation of teeth and bones. On the other hand, it increases the synthesis of reactive oxygen species and inflammatory mediators and impairs the action of enzymes. Beer is the most popular alcoholic beverage in the world. Due to its prevalence and volume of consumption, it should be considered a potential source of F- and taken into account in designing a balanced diet. Therefore, the aim of this study was to analyze beer samples in terms of F- levels. Fluoride concentrations were examined using a Thermo Scientific Orion ion-selective electrode, and statistical analysis was based on two-way ANOVA and the t test. Compared to imported beers, Polish beers were characterized by the lowest mean F- concentration (0.089 ppm). The highest mean F- concentrations were recorded in beers from Thailand (0.260 ppm), Italy (0.238 ppm), Mexico (0.210 ppm), and China (0.203 ppm). Our study shows that beer is a significant source of fluoride for humans, which is mainly associated with the quality of the water used in beer production.

  18. Improvement of dem Generation from Aster Images Using Satellite Jitter Estimation and Open Source Implementation

    NASA Astrophysics Data System (ADS)

    Girod, L.; Nuth, C.; Kääb, A.

    2015-12-01

    The Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) system aboard the Terra (EOS AM-1) satellite has been a source of stereoscopic images covering the whole globe at 15 m resolution at a consistent quality for over 15 years. The potential of these data for geomorphological analysis and change detection in three dimensions is unrivaled and needs to be exploited. However, the DEMs and ortho-images currently delivered by NASA (ASTER DMO products) are often of insufficient quality for a number of applications, such as mountain glacier mass balance. For this study, the use of Ground Control Points (GCPs) or other ground truth was rejected due to the global "big data" type of processing that we hope to perform on the ASTER archive. We have therefore developed a tool to compute Rational Polynomial Coefficient (RPC) models from the ASTER metadata and a method that improves the quality of the matching by identifying and correcting jitter-induced cross-track parallax errors. Our method outputs more accurate DEMs with fewer unmatched areas and reduced overall noise. The algorithms were implemented in the open source photogrammetric library and software suite MicMac.

  19. A 10-year observation of PM2.5-bound nickel in Xi’an, China: Effects of source control on its trend and associated health risks

    NASA Astrophysics Data System (ADS)

    Xu, Hongmei; Ho, Steven Sai Hang; Cao, Junji; Guinot, Benjamin; Kan, Haidong; Shen, Zhenxing; Ho, Kin Fai; Liu, Suixin; Zhao, Zhuzi; Li, Jianjun; Zhang, Ningning; Zhu, Chongshu; Zhang, Qian; Huang, Rujin

    2017-01-01

    This study presents the first long-term (10-year, 2004-2013) dataset of PM2.5-bound nickel (Ni) concentrations obtained from daily samples in urban Xi’an, Northwestern China. The Ni concentration trend, pollution sources, and the potential health risks associated with Ni were investigated. Ni concentrations increased from 2004 to 2008 but then decreased due to reduced coal consumption, energy structure reconstruction, tighter emission rules, and improved industrial and motor vehicle waste control techniques. By comparing distributions between workday and non-workday periods, the effectiveness of local and regional air pollution control policies and the contributions of hypothesized Ni sources (industrial and automobile exhausts) were evaluated, demonstrating the health benefits to the population over the ten years. The mean Ni cancer risk was higher than the threshold value of 10^-6, suggesting that carcinogenic Ni was still a concern for residents. Our findings indicate that stricter strategies and guidelines for atmospheric Ni are still needed, helping to balance the relationship between economic growth and environmental conservation in China.
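
The comparison against the 10^-6 benchmark is simple arithmetic: lifetime inhalation cancer risk is the long-term mean concentration multiplied by an inhalation unit risk (IUR). The IUR and concentration below are placeholders of plausible magnitude, not the study's values; consult the current EPA IRIS entry for nickel compounds before any real assessment:

```python
# Lifetime inhalation cancer risk = concentration (ug/m3) x IUR (per ug/m3).
IUR = 2.4e-4          # assumed inhalation unit risk, per ug/m3 (placeholder)
THRESHOLD = 1e-6      # acceptable-risk benchmark used in the abstract

def cancer_risk(conc_ug_m3, iur=IUR):
    """Screening-level lifetime inhalation cancer risk."""
    return conc_ug_m3 * iur

# A hypothetical long-term mean Ni concentration of 0.01 ug/m3:
risk = cancer_risk(0.01)
exceeds = risk > THRESHOLD
```

Even a concentration of a few ng/m3 can exceed the 10^-6 benchmark under such unit-risk values, which is consistent with the abstract's conclusion that Ni remained a concern.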

  20. Thermal analysis of a Phase Change Material for a Solar Organic Rankine Cycle

    NASA Astrophysics Data System (ADS)

    Iasiello, M.; Braimakis, K.; Andreozzi, A.; Karellas, S.

    2017-11-01

    The Organic Rankine Cycle (ORC) is a promising technology for low-temperature power generation, for example the utilization of medium-temperature solar energy. Since heat generated from a solar source varies throughout the day, implementing Thermal Energy Storage (TES) systems to guarantee the continuous operation of solar ORCs is a critical task, and Phase Change Materials (PCMs), which rely on latent heat, can store large amounts of energy. In the present study, a thermal analysis of a PCM for a solar ORC is carried out. Three different types of PCMs are analyzed. The energy equation for the PCM is modeled using the heat capacity method and solved with a 1D explicit finite difference scheme. The solar source is modeled with a time-variable temperature boundary condition, with experimental data taken from the literature for two different solar collectors. Results are presented in terms of temperature profiles and stored energy. It is shown that the stored energy depends on the heat source temperature, the employed PCM, and the boundary conditions. It is also demonstrated that the use of a metal foam can drastically enhance the stored energy due to the higher overall thermal conductivity.
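
The heat capacity (apparent heat capacity) method folds the latent heat into an enlarged specific heat over the melting range, so a single energy equation covers solid, mushy, and liquid states. A minimal 1D explicit finite-difference sketch with invented material parameters and a fixed hot-side temperature standing in for the time-variable solar boundary:

```python
import numpy as np

# Apparent-heat-capacity method: latent heat is spread over the melting
# range [t_melt - half_range, t_melt + half_range] as extra specific heat.
nx, dx, dt = 50, 0.002, 0.05          # nodes, spacing (m), time step (s)
k, rho, cp = 0.2, 800.0, 2000.0       # conductivity, density, sensible cp
latent, t_melt, half_range = 2.0e5, 30.0, 1.0   # J/kg, deg C, deg C

def apparent_cp(temp):
    cp_eff = np.full_like(temp, cp)
    melting = np.abs(temp - t_melt) < half_range
    cp_eff[melting] += latent / (2.0 * half_range)
    return cp_eff

T = np.full(nx, 20.0)                  # initial temperature (deg C)
for step in range(20000):              # 1000 s of simulated time
    T[0] = 60.0                        # hot-side boundary (solar proxy)
    alpha = k / (rho * apparent_cp(T))
    T[1:-1] += alpha[1:-1] * dt / dx**2 * (T[2:] - 2*T[1:-1] + T[:-2])
    T[-1] = T[-2]                      # adiabatic far end
```

The melting front visibly slows where `apparent_cp` is enlarged, which is exactly how latent storage shows up in the temperature profiles; the explicit scheme requires dt <= dx^2/(2*alpha), comfortably satisfied here.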

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oloff, L.-P., E-mail: oloff@physik.uni-kiel.de; Hanff, K.; Stange, A.

    With the advent of ultrashort-pulsed extreme ultraviolet sources, such as free-electron lasers or high-harmonic-generation (HHG) sources, a new research field for photoelectron spectroscopy has opened up in terms of femtosecond time-resolved pump-probe experiments. The impact of the high peak brilliance of these novel sources on photoemission spectra, so-called vacuum space-charge effects caused by the Coulomb interaction among the photoemitted probe electrons, has been studied extensively. However, possible distortions of the energy and momentum distributions of the probe photoelectrons caused by the low-photon-energy pump pulse due to the nonlinear emission of electrons have not yet been studied in detail. Here, we systematically investigate these pump laser-induced space-charge effects in an HHG-based experiment for the test case of highly oriented pyrolytic graphite. Specifically, we determine how the key parameters of the pump pulse—the excitation density, wavelength, spot size, and emitted electron energy distribution—affect the measured time-dependent energy and momentum distributions of the probe photoelectrons. The results are well reproduced by a simple mean-field model, which could open a path for the correction of pump laser-induced space-charge effects and thus toward probing ultrafast electron dynamics in strongly excited materials.

  2. Fluid source inferred from strontium isotopes in pore fluid and carbonate recovered during Expedition 337 off Shimokita, Japan

    NASA Astrophysics Data System (ADS)

    Hong, W.; Moen, N.; Haley, B. A.

    2013-12-01

    IODP Expedition 337 was designed to understand the relationship between a deeply buried (2000 meters below seafloor) hydrocarbon reservoir off the Shimokita Peninsula (Japan) and the microbial community that this carbon reservoir sustains at such depth. Understanding the sources and flow pathways of fluids that carry hydrocarbons, nutrients, and other reduced components is of particular interest to fulfilling the expedition objectives, since this migrating fluid supports microbial activity not only in the deep-seated communities but also in the shallow-dwelling organisms. To this aim, the concentration and isotopic signature of Sr are valuable because Sr is relatively free from biogenic influence and largely unaffected by drill fluid contamination. The pore water Sr profile shows concentrations gradually increasing from 1500 to 2400 mbsf. The depth of the highest Sr concentration corresponds to the depths where a few carbonate layers were observed. Such a profile suggests that an upward-migrating fluid carries Sr from those deep-seated carbonate layers (>2400 mbsf) to shallower sediments. To confirm this inference, pore water, in-situ formation fluid, and carbonate samples were analyzed for Sr isotopes to investigate the fluid source.

  3. On the inclusion of mass source terms in a single-relaxation-time lattice Boltzmann method

    NASA Astrophysics Data System (ADS)

    Aursjø, Olav; Jettestuen, Espen; Vinningland, Jan Ludvig; Hiorth, Aksel

    2018-05-01

    We present a lattice Boltzmann algorithm for incorporating a mass source in a fluid flow system. The proposed mass source/sink term, included in the lattice Boltzmann equation, maintains the Galilean invariance and the accuracy of the overall method, while introducing a mass source/sink term in the fluid dynamical equations. The method can, for instance, be used to inject or withdraw fluid from any preferred lattice node in a system. This suggests that injection and withdrawal of fluid does not have to be introduced through cumbersome, and sometimes less accurate, boundary conditions. The method also suggests that, through a chosen equation of state relating mass density to pressure, the proposed mass source term will render it possible to set a preferred pressure at any lattice node in a system. We demonstrate how this model handles injection and withdrawal of a fluid, and we show how it can be used to incorporate pressure boundaries. The accuracy of the algorithm is identified through a Chapman-Enskog expansion of the model and supported by the numerical simulations.
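
A toy D2Q9 BGK illustration of injecting mass at a single lattice node: here the source is simply distributed over the discrete velocities by lattice weight, a simplified stand-in for the paper's Galilean-invariant formulation, with all parameters invented. Collision and periodic streaming conserve mass, so the total mass grows by exactly the injected amount per step:

```python
import numpy as np

# D2Q9 weights and lattice velocities.
w = np.array([4/9] + [1/9]*4 + [1/36]*4)
c = np.array([(0,0),(1,0),(0,1),(-1,0),(0,-1),(1,1),(-1,1),(-1,-1),(1,-1)])
nx = ny = 21
tau, q = 1.0, 0.01                       # relaxation time, injected mass/step
f = np.ones((9, nx, ny)) * w[:, None, None]   # uniform rho = 1 everywhere

def equilibrium(rho, ux, uy):
    feq = np.empty((9,) + rho.shape)
    for i, (cx, cy) in enumerate(c):
        cu = cx*ux + cy*uy
        feq[i] = w[i]*rho*(1 + 3*cu + 4.5*cu**2 - 1.5*(ux**2 + uy**2))
    return feq

for step in range(100):
    rho = f.sum(axis=0)
    ux = (f * c[:, 0, None, None]).sum(axis=0) / rho
    uy = (f * c[:, 1, None, None]).sum(axis=0) / rho
    f += -(f - equilibrium(rho, ux, uy)) / tau   # BGK collision
    f[:, nx//2, ny//2] += w * q                  # mass source (assumed form)
    for i, (cx, cy) in enumerate(c):             # periodic streaming
        f[i] = np.roll(np.roll(f[i], cx, axis=0), cy, axis=1)

total_mass = f.sum()
```

Because the weight-proportional increment carries zero net momentum (the weighted lattice velocities sum to zero), the injected mass spreads outward from the node without an imposed flow direction, loosely mimicking a point injection well.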

  4. A comparison of community response to aircraft noise at Toronto International and Oshawa Municipal airports†

    NASA Astrophysics Data System (ADS)

    Taylor, S. M.; Hall, F. L.; Birnie, S. E.

    1981-07-01

    Debate continues over the validity of a single dose-response relationship to describe annoyance due to transportation noise. Doubts about the appropriateness of a single relationship have centred primarily on the issue of differential response to the same noise level for different sources (e.g., aircraft, road traffic and trains). However, recent work suggests that response may vary for different types of the same source, namely aircraft, dependent upon the character, and specifically the number, of operations. Recent data collected around Toronto International and Oshawa Municipal airports permit a test of differences in four aggregate response variables. For the same NEF level, the percent at all annoyed at the two airports is not statistically different. The percent highly annoyed and the percent reporting speech interference are both significantly greater at Toronto but the percent reporting sleep interruption is greater at Oshawa. These differences can be explained in terms of the operational characteristics of the two airports.

  5. First measurement of Lyman alpha x-ray lines in hydrogen-like vanadium: results and implications for precision wavelength metrology and tests of QED

    NASA Astrophysics Data System (ADS)

    Gillaspy, J. D.; Chantler, C. T.; Paterson, D.; Hudson, L. T.; Serpa, F. G.; Takács, E.

    2010-04-01

    The first measurement of hydrogen-like vanadium x-ray Lyman alpha transitions has been made. The measurement was made on an absolute scale, fully independent of atomic structure calculations. Sufficient signal was obtained to reduce the statistical uncertainty to a small fraction of the total uncertainty budget. Potential sources of systematic error due to Doppler shifts were eliminated by performing the measurement on trapped ions. The energies for Ly α1 (1s-2p3/2) and Ly α2 (1s-2p1/2) are found to be 5443.95(25) eV and 5431.10(25) eV, respectively. These results are within approximately 1.5 σ (experimental) of the theoretical values 5443.63 eV and 5430.70 eV. The results are discussed in terms of their relation to the Lamb shift and the development of an x-ray wavelength standard based on a compact source of trapped highly charged ions.

  6. Efficient RNA drug delivery using red blood cell extracellular vesicles.

    PubMed

    Usman, Waqas Muhammad; Pham, Tin Chanh; Kwok, Yuk Yan; Vu, Luyen Tien; Ma, Victor; Peng, Boya; Chan, Yuen San; Wei, Likun; Chin, Siew Mei; Azad, Ajijur; He, Alex Bai-Liang; Leung, Anskar Y H; Yang, Mengsu; Shyh-Chang, Ng; Cho, William C; Shi, Jiahai; Le, Minh T N

    2018-06-15

    Most of the current methods for programmable RNA drug therapies are unsuitable for the clinic due to low uptake efficiency and high cytotoxicity. Extracellular vesicles (EVs) could solve these problems because they represent a natural mode of intercellular communication. However, current cellular sources for EV production are limited in availability and safety in terms of horizontal gene transfer. One potentially ideal source could be human red blood cells (RBCs). Group O-RBCs can be used as universal donors for large-scale EV production since they are readily available in blood banks and they are devoid of DNA. Here, we describe and validate a new strategy to generate large-scale amounts of RBC-derived EVs for the delivery of RNA drugs, including antisense oligonucleotides, Cas9 mRNA, and guide RNAs. RNA drug delivery with RBCEVs shows highly robust microRNA inhibition and CRISPR-Cas9 genome editing in both human cells and xenograft mouse models, with no observable cytotoxicity.

  7. Synthesis and structural property of Si nanosheets connected to Si nanowires using MnCl2/Si powder source

    NASA Astrophysics Data System (ADS)

    Meng, Erchao; Ueki, Akiko; Meng, Xiang; Suzuki, Hiroaki; Itahara, Hiroshi; Tatsuoka, Hirokazu

    2016-08-01

    Si nanosheets connected to Si nanowires were synthesized using a MnCl2/Si powder source with an Au catalyst. The synthesis method has the benefit of avoiding the conventionally used air-sensitive SiH4 or SiCl4. Si nanosheets connected to Si<111> nanowires, like sprouts or leaves with petioles, were observed, and the surface of the nanosheets was Si{111}. The nanosheets grew in the <211> direction, perpendicular to that of the Si nanowires. These structural features indicate that the nanosheets were formed by the twin-plane reentrant-edge mechanism. The lattice fringes of the Si(111) nanosheets observed by high-resolution transmission electron microscopy, which do not appear for bulk Si crystals, are clearly explained by the extra diffraction spots arising from the reciprocal-lattice streaking effect.

  8. Urban light pollution - The effect of atmospheric aerosols on astronomical observations at night

    NASA Astrophysics Data System (ADS)

    Joseph, J. H.; Kaufman, Y. J.; Mekler, Y.

    1991-07-01

    The transfer of diffuse city light from a localized source through a dust-laden atmosphere with optical depth less than 0.5 has been analyzed in the source-observer plane on the basis of an approximate treatment. The effect on several types of astronomical observation at night has been studied, considering different size distributions and amounts as well as particle shapes of the aerosols. The analysis is made in terms of the signal-to-noise ratios for a given amount of aerosol. The model is applied to conditions at the Wise Astronomical Observatory in the Negev desert, and limiting backgrounds for spectroscopy, photometry, and photography of stars and extended objects have been calculated for a variety of signal-to-noise ratios. Applications to observations with different equipment at various distances from an urban area of any size are possible. Due to the use of signal-to-noise ratios, the conclusions are different for the different experimental techniques used in astronomy.

  9. Urban light pollution - The effect of atmospheric aerosols on astronomical observations at night

    NASA Technical Reports Server (NTRS)

    Joseph, Joachim H.; Mekler, Yuri; Kaufman, Yoram J.

    1991-01-01

    The transfer of diffuse city light from a localized source through a dust-laden atmosphere with optical depth less than 0.5 has been analyzed in the source-observer plane on the basis of an approximate treatment. The effect on several types of astronomical observation at night has been studied, considering different size distributions and amounts as well as particle shapes of the aerosols. The analysis is made in terms of the signal-to-noise ratios for a given amount of aerosol. The model is applied to conditions at the Wise Astronomical Observatory in the Negev desert, and limiting backgrounds for spectroscopy, photometry, and photography of stars and extended objects have been calculated for a variety of signal-to-noise ratios. Applications to observations with different equipment at various distances from an urban area of any size are possible. Because the analysis is framed in terms of signal-to-noise ratios, the conclusions differ among the experimental techniques used in astronomy.

  10. Biomass for energy in the European Union - a review of bioenergy resource assessments

    PubMed Central

    2012-01-01

    This paper reviews recent literature on bioenergy potentials in conjunction with available biomass conversion technologies. The geographical scope is the European Union, which has set a course for the long-term development of its energy supply from the current dependence on fossil resources to a dominance of renewable resources. A cornerstone of European energy policies and strategies is biomass and bioenergy. The annual demand for biomass for energy is estimated to increase from the current level of 5.7 EJ to 10.0 EJ in 2020. Assessments of bioenergy potentials vary substantially due to methodological inconsistencies and the assumptions applied by individual authors. Forest biomass, agricultural residues and energy crops constitute the three major sources of biomass for energy, with energy crops probably developing into the most important source over the 21st century. Land use, and changes in it, is a key issue in sustainable bioenergy production, as land availability is ultimately a limiting factor. PMID:22546368

  11. Study of flow control by localized volume heating in hypersonic boundary layers

    NASA Astrophysics Data System (ADS)

    Keller, M. A.; Kloker, M. J.; Kirilovskiy, S. V.; Polivanov, P. A.; Sidorenko, A. A.; Maslov, A. A.

    2014-12-01

    Boundary-layer flow control is a prerequisite for the safe and efficient operation of future hypersonic transport systems. Here, the influence of an electric discharge—modeled by a heat-source term in the energy equation—on laminar boundary-layer flows over a flat plate with zero pressure gradient at Mach 3, 5, and 7 is investigated numerically. The aim is to appraise the potential of electro-gasdynamic devices for application as turbulence generators in the super- and hypersonic flow regimes. The results with localized heat-source elements in boundary layers are compared to cases with roughness elements serving as classical passive trips. The numerical simulations are performed using the commercial code ANSYS FLUENT (by ITAM) and the high-order finite-difference DNS code NS3D (by IAG), the latter allowing for a detailed analysis of laminar flow instability. For the investigated setups with steady heating, transition to turbulence is not observed, due to the Reynolds-number-lowering effect of heating.
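    As a minimal illustration of modeling a discharge as a heat-source term in the energy equation, the sketch below adds a localized Gaussian source to a 1D heat equation solved with an explicit finite-difference scheme. The geometry, coefficients, and source shape are illustrative assumptions, not the FLUENT/NS3D setups of the study.

```python
import numpy as np

nx, L = 201, 1.0
dx = L / (nx - 1)
x = np.linspace(0.0, L, nx)
alpha = 1e-3                                    # diffusivity (illustrative)
dt = 0.4 * dx ** 2 / alpha                      # stable explicit step (<= 0.5 dx^2/alpha)
q = 50.0 * np.exp(-(((x - 0.3) / 0.02) ** 2))   # localized Gaussian heat source

T = np.zeros(nx)
for _ in range(2000):
    lap = (np.roll(T, -1) - 2.0 * T + np.roll(T, 1)) / dx ** 2
    T[1:-1] += dt * (alpha * lap[1:-1] + q[1:-1])   # boundaries held at T = 0

# The temperature rise stays concentrated around the source at x = 0.3,
# analogous to a localized volume-heating element in a boundary layer.
assert T[np.abs(x - 0.3).argmin()] > T[np.abs(x - 0.7).argmin()]
```

    In a compressible Navier-Stokes solver the same idea appears as an extra source term on the right-hand side of the energy equation, evaluated at each grid point inside the heated region.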

  12. An auto-adaptive optimization approach for targeting nonpoint source pollution control practices.

    PubMed

    Chen, Lei; Wei, Guoyuan; Shen, Zhenyao

    2015-10-21

    To address the computationally intensive and technically complex control of nonpoint source pollution, the traditional genetic algorithm was modified into an auto-adaptive pattern, and a new framework was proposed by integrating this algorithm with a watershed model and an economic module. Although conceptually simple and comprehensive, the proposed algorithm searches automatically for Pareto-optimal solutions without a complex calibration of optimization parameters. The model was applied in a case study in a typical watershed of the Three Gorges Reservoir area, China. The results indicated that the evolutionary process of optimization was improved by the incorporation of auto-adaptive parameters. In addition, the proposed algorithm outperformed state-of-the-art existing algorithms in terms of convergence ability and computational efficiency. At the same cost level, solutions with greater pollutant reductions could be identified. From a scientific viewpoint, the proposed algorithm could be extended to other watersheds to provide cost-effective configurations of BMPs.
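    The auto-adaptive idea (tying an optimization parameter such as the mutation rate to the state of the population rather than calibrating it by hand) can be sketched as follows. The fitness function, adaptation rule, and all constants are illustrative stand-ins, not the watershed/economic model of the paper.

```python
import random

random.seed(1)

def fitness(ind):
    # Stand-in objective: maximize -sum(g^2); optimum at the origin.
    return -sum(g * g for g in ind)

def diversity(pop):
    # Mean squared distance of individuals from the population centroid.
    means = [sum(col) / len(pop) for col in zip(*pop)]
    return sum(sum((g - m) ** 2 for g, m in zip(ind, means)) for ind in pop) / len(pop)

def select(pop):
    # Tournament selection of size 3.
    return max(random.sample(pop, 3), key=fitness)

pop = [[random.uniform(-5, 5) for _ in range(4)] for _ in range(30)]
f0 = fitness(max(pop, key=fitness))        # best fitness before optimization
mut_rate = 0.2

for gen in range(100):
    children = []
    for _ in range(len(pop)):
        a, b = select(pop), select(pop)
        child = [(u + v) / 2 for u, v in zip(a, b)]    # blend crossover
        if random.random() < mut_rate:
            i = random.randrange(len(child))
            child[i] += random.gauss(0, 0.5)
        children.append(child)
    pop = children
    # Auto-adaptation: the mutation rate rises (toward its cap) as
    # diversity collapses, so no manual tuning of mut_rate is needed.
    mut_rate = min(0.8, max(0.05, 0.2 / (1.0 + diversity(pop))))

best = max(pop, key=fitness)
assert fitness(best) > f0    # the search improved on the initial population
```

    A multi-objective version would replace the scalar fitness with Pareto ranking over cost and pollutant reduction, but the adaptation mechanism is the same.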

  13. Sources and Resources Into the Dark Domain: The UK Web Archive as a Source for the Contemporary History of Public Health.

    PubMed

    Gorsky, Martin

    2015-08-01

    With the migration of the written record from paper to digital format, archivists and historians must urgently consider how web content should be conserved, retrieved and analysed. The British Library has recently acquired a large number of UK domain websites, captured 1996-2010, which is colloquially termed the Dark Domain Archive while technical issues surrounding user access are resolved. This article reports the results of an invited pilot project that explores methodological issues surrounding use of this archive. It asks how the relationship between UK public health and local government was represented on the web, drawing on the 'declinist' historiography to frame its questions. It points up some difficulties in developing an aggregate picture of web content due to duplication of sites. It also highlights their potential for thematic and discourse analysis, using both text and image, illustrated through an argument about the contradictory rationale for public health policy under New Labour.

  14. Gravity effects obtained from global hydrology models in comparison with high precision gravimetric time series

    NASA Astrophysics Data System (ADS)

    Wziontek, Hartmut; Wilmes, Herbert; Güntner, Andreas; Creutzfeldt, Benjamin

    2010-05-01

    Water mass changes are a major source of variations in residual gravimetric time series obtained from the combination of observations with superconducting and absolute gravimeters. Changes in the local water storage are the main influence, but global variations contribute significantly to the signal. For three European gravity stations, Bad Homburg, Wettzell and Medicina, different global hydrology models are compared. The influence of topographic effects is discussed and, owing to the long-term stability of the combined gravity time series, inter-annual signals in model data and gravimetric observations are compared. Two sources of influence are discriminated, i.e., the effect of a local zone with an extent of a few kilometers around the gravimetric station and the global contribution beyond 50 km. Considering their coarse resolution and uncertainties, local effects calculated from global hydrological models are compared with the in-situ gravity observations and, for the station Wettzell, with local hydrological monitoring data.
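    A common first-order estimate of how a local water-storage change maps into gravity, though not the topography-resolving computation used in the study, is the infinite Bouguer plate approximation delta_g = 2*pi*G*rho*delta_h:

```python
import math

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
RHO_WATER = 1000.0     # density of water, kg m^-3

def bouguer_plate_ugal(delta_h_m):
    """Gravity change in microgal for a water-layer thickness change in metres."""
    dg_si = 2.0 * math.pi * G * RHO_WATER * delta_h_m   # in m s^-2
    return dg_si * 1e8                                  # 1 uGal = 1e-8 m s^-2

# About 42 uGal per metre of water column, a magnitude comparable to
# the seasonal hydrological signals seen by superconducting gravimeters.
assert 41.0 < bouguer_plate_ugal(1.0) < 43.0
```

    Real station corrections must additionally account for topography and for the station's position relative to the water masses (e.g. water stored above a gravimeter in a building attracts upward), which is why local terrain models matter.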

  15. Hydrological considerations in providing data for water agreements

    NASA Astrophysics Data System (ADS)

    Shamir, U.

    2011-12-01

    Conflicts over water are as old as human history. Still, analysis of past and present water conflicts, cooperation and agreements clearly indicates a preponderance of cooperation over conflict. How can hydrologists help maximize the probability of this outcome when the interests of adjacent political entities over water move towards conflict? Hydrology is among the most important data bases for crafting a water agreement across a political boundary (others include political, social and economic data) and is often the most elusive and controversial. We deal here with cases of water scarcity, although flood protection issues are no less interesting and challenging. For hydrologists, some of the important points in this regard are:
- An agreed and "stable" hydrological data base: hydrologists know that data bases are always a "moving target", changing with new and better information, improved understanding of the hydrological components and the use of models, and the influence of changing internal and external drivers (land use and land cover, modified precipitation fields, climate change). On the other hand, it is not possible to manage an agreement that requires continuous change of the hydrological information; doing so would cause endless discussions between the parties and make the agreement unstable. The tendency is therefore to "freeze" the hydrological information in the agreement and introduce a mechanism for periodic updates.
- Variability and uncertainty: while the basic hydrology is kept "stable", the agreement must recognize variability and uncertainty. Various mechanisms can be used for this, depending on the specific circumstances of the case, including the range of variability, the degree of uncertainty, the consequences of systematic excursions from nominal values, and the effects of random variability.
- Water quality: an important parameter that determines usability for various purposes; treatment is required when source quality does not match consumer requirements.
- Complexity/difficulty and associated cost of extraction/production needed to turn the "potential" source water into "usable" water.
- Looking jointly for new sources and benefits (expanding the "cake"): agreements should look beyond the issues and water sources under imminent discussion due to competition and disagreement, to see whether the "cake" can be expanded, in terms of the water itself and of the benefits that can accrue from a creative water agreement.
- Conversion of "potential" water into "usable" water: water in a source requires transformation in time, space and quality, and incurs a cost.
- Introduction of expanded, previously unused resources that become available through advanced extraction/production capabilities and additional treatment processes, and/or by changing water use patterns and land use practices.
- Negotiating over, and jointly managing, the benefits and losses due to water (wherever and whenever possible), rather than merely the physical parameters of the water itself (volume, flow, concentration).

  16. Involvement of WNT Signaling in the Regulation of Gestational Age-Dependent Umbilical Cord-Derived Mesenchymal Stem Cell Proliferation

    PubMed Central

    Shono, Akemi; Yoshida, Makiko; Yamana, Keiji; Thwin, Khin Kyae Mon; Kuroda, Jumpei; Kurokawa, Daisuke; Koda, Tsubasa; Nishida, Kosuke; Ikuta, Toshihiko; Mizobuchi, Masami; Taniguchi-Ikeda, Mariko

    2017-01-01

    Mesenchymal stem cells (MSCs) are a heterogeneous cell population isolated initially from the bone marrow (BM) and subsequently from almost all tissues, including umbilical cord (UC). UC-derived MSCs (UC-MSCs) have attracted increasing attention as a source for cell therapy against various degenerative diseases due to their vigorous proliferation and differentiation. Although the proliferation and differentiation of BM-derived MSCs are known to decline with age, the functional differences between preterm and term UC-MSCs are poorly characterized. In the present study, we isolated UC-MSCs from 23 infants delivered at 22–40 weeks of gestation and analyzed their gene expression and cell proliferation. Microarray analysis revealed that global gene expression in preterm UC-MSCs was distinct from that in term UC-MSCs. WNT signaling affects the proliferation and differentiation of a variety of tissue stem cells, and its pathway genes were enriched among the genes differentially expressed between preterm and term UC-MSCs. Cell proliferation of preterm UC-MSCs was significantly enhanced compared to term UC-MSCs and was counteracted by the WNT signaling inhibitor XAV939. Furthermore, WNT2B expression in UC-MSCs showed a significant negative correlation with gestational age (GA). These results suggest that WNT signaling is involved in the regulation of GA-dependent UC-MSC proliferation. PMID:29138639

  17. Alcohol, appetite and energy balance: is alcohol intake a risk factor for obesity?

    PubMed

    Yeomans, Martin R

    2010-04-26

    The growing recognition that the worldwide increase in the incidence of obesity is due to a positive energy balance has led to a focus on lifestyle choices that may contribute to excess energy intake, including the widespread belief that alcohol intake is a significant risk factor for the development of obesity. This brief review examines the issue by contrasting short-term laboratory-based studies of the effects of alcohol on appetite and energy balance with longer-term epidemiological data exploring the relationship between alcohol intake and body weight. Current research clearly shows that energy consumed as alcohol is additive to that from other dietary sources, leading to short-term passive over-consumption of energy when alcohol is consumed. Indeed, alcohol consumed before or with meals tends to increase food intake, probably by enhancing the short-term rewarding effects of food. However, while these data might suggest that alcohol is a risk factor for obesity, epidemiological data suggest that moderate alcohol intake may protect against obesity, particularly in women. In contrast, higher intakes of alcohol in the absence of alcohol dependence may increase the risk of obesity, as may binge drinking; however, these effects may be secondary to personality and habitual beverage preferences. Copyright 2010 Elsevier Inc. All rights reserved.

  18. Short- and long-term effects of carbohydrate limitation on sugar and organic acid accumulation during mandarin fruit growth.

    PubMed

    Antoine, Sandrine; Pailly, Olivier; Gibon, Yves; Luro, François; Santini, Jérémie; Giannettini, Jean; Berti, Liliane

    2016-08-01

    The physiological roles of organic acids in fruit cells are not fully understood, especially in citrus, whereas the decline in titratable acidity during ripening shown by many citrus fruits is due to the utilization of citric acid. We induced carbohydrate depletion by removing source leaves at two key periods in mandarin development (early and full citric acid accumulation). Then, we assessed the resulting changes in the short term (within 48 h) and long term (several weeks until ripening). Control mature fruits were characterized by elevated fresh weight, large diameters and high quantities of malic acid, citric acid and sucrose. At the same stage, fruits subjected to early or late defoliation had higher glucose, fructose, citric acid concentrations and lower sucrose concentrations. They differed only in their malic acid concentrations, which were higher in early defoliation fruits and similar in late defoliation fruits when compared to control fruits. Finally, fruits subjected to late defoliation were characterized by high proline and γ-aminobutyric acid concentrations, and low fructose and glucose concentrations. We have shown that short- and long-term carbohydrate limitation modifies sugar and organic acid metabolism during mandarin fruit growth. © 2015 Society of Chemical Industry.

  19. Directional Unfolded Source Term (DUST) for Compton Cameras.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mitchell, Dean J.; Horne, Steven M.; O'Brien, Sean

    2018-03-01

    A Directional Unfolded Source Term (DUST) algorithm was developed to enable improved spectral analysis capabilities using data collected by Compton cameras. Achieving this objective required modification of the detector response function in the Gamma Detector Response and Analysis Software (GADRAS). Experimental data that were collected in support of this work include measurements of calibration sources at a range of separation distances and cylindrical depleted uranium castings.

  20. Erratum to Surface‐wave Green’s tensors in the near field

    USGS Publications Warehouse

    Haney, Matthew M.; Nakahara, Hisashi

    2016-01-01

    Haney and Nakahara (2014) derived expressions for surface‐wave Green’s tensors that included near‐field behavior. Building on the result for a force source, Haney and Nakahara (2014) further derived expressions for a general point moment tensor source using the exact Green’s tensors. However, it has come to our attention that, although the Green’s tensors were correct, the resulting expressions for a general point moment tensor source were missing some terms. In this erratum, we provide updated expressions with these missing terms. The inclusion of the missing terms changes the example given in Haney and Nakahara (2014).
