Sample records for additional source term

  1. INEEL Subregional Conceptual Model Report Volume 3: Summary of Existing Knowledge of Natural and Anthropogenic Influences on the Release of Contaminants to the Subsurface Environment from Waste Source Terms at the INEEL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paul L. Wichlacz

    2003-09-01

    This source-term summary document is intended to describe the current understanding of contaminant source terms and the conceptual model for potential source-term release to the environment at the Idaho National Engineering and Environmental Laboratory (INEEL), as presented in published INEEL reports. The document presents a generalized conceptual model of the sources of contamination and describes the general categories of source terms, primary waste forms, and factors that affect the release of contaminants from the waste form into the vadose zone and Snake River Plain Aquifer. Where the information has previously been published and is readily available, summaries of the inventory of contaminants are also included. Uncertainties that affect the estimation of the source term release are also discussed where they have been identified by the Source Term Technical Advisory Group. Areas in which additional information is needed (i.e., research needs) are also identified.

  2. A Systematic Search for Short-term Variability of EGRET Sources

    NASA Technical Reports Server (NTRS)

    Wallace, P. M.; Griffis, N. J.; Bertsch, D. L.; Hartman, R. C.; Thompson, D. J.; Kniffen, D. A.; Bloom, S. D.

    2000-01-01

    The 3rd EGRET Catalog of High-energy Gamma-ray Sources contains 170 unidentified sources, and there is great interest in the nature of these sources. One means of determining source class is the study of flux variability on time scales of days; pulsars are believed to be stable on these time scales, while blazars are known to be highly variable. In addition, previous work has demonstrated that 3EG J0241-6103 and 3EG J1837-0606 are candidates for a new gamma-ray source class. These sources near the Galactic plane display transient behavior but cannot be associated with any known blazars. Although many instances of flaring AGN have been reported, the EGRET database has not been systematically searched for occurrences of short-timescale (approximately 1 day) variability. These considerations have led us to conduct a systematic search for short-term variability in EGRET data, covering all viewing periods through proposal cycle 4. Six 3EG catalog sources are reported here to display variability on short time scales; four of them are unidentified. In addition, three non-catalog variable sources are discussed.

  3. On volume-source representations based on the representation theorem

    NASA Astrophysics Data System (ADS)

    Ichihara, Mie; Kusakabe, Tetsuya; Kame, Nobuki; Kumagai, Hiroyuki

    2016-01-01

    We discuss different ways to characterize a moment tensor associated with an actual volume change ΔV_C, which has been represented in terms of either the stress glut or the corresponding stress-free volume change ΔV_T. Eshelby's virtual operation provides a conceptual model relating ΔV_C to ΔV_T and the stress glut, where non-elastic processes such as phase transitions allow ΔV_T to be introduced and subsequent elastic deformation of −ΔV_T is assumed to produce the stress glut. While it is true that ΔV_T correctly represents the moment tensor of an actual volume source with volume change ΔV_C, an explanation as to why such an operation relating ΔV_C to ΔV_T exists has not previously been given. This study presents a comprehensive explanation of the relationship between ΔV_C and ΔV_T based on the representation theorem. The displacement field is represented using Green's function as the sum of two integrals over the source surface: one over the displacement and the other over the traction. Both integrals are necessary for representing volumetric sources, whereas the representation of seismic faults includes only the first term, because the second integral over the two adjacent fault surfaces, across which the traction balances, always vanishes. Therefore, in a seismological framework, the contribution from the second term should be included as an additional surface displacement. We show that the seismic moment tensor of a volume source is obtained directly from the actual state of the displacement and stress at the source, without considering any virtual non-elastic operations. A purely mathematical procedure based on the representation theorem enables us to specify the additional imaginary displacement necessary for representing a volume source by the displacement term alone, which links ΔV_C to ΔV_T. It also specifies the additional imaginary stress necessary for representing a moment tensor by the traction term alone, which gives the "stress glut." The imaginary displacement-stress approach clarifies the mathematical background to the classical theory.

  4. BWR ASSEMBLY SOURCE TERMS FOR WASTE PACKAGE DESIGN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    T.L. Lotz

    1997-02-15

    This analysis is prepared by the Mined Geologic Disposal System (MGDS) Waste Package Development Department (WPDD) to provide boiling water reactor (BWR) assembly radiation source term data for use during Waste Package (WP) design. The BWR assembly radiation source terms are to be used for evaluation of radiolysis effects at the WP surface, and for personnel shielding requirements during assembly or WP handling operations. The objectives of this evaluation are to generate BWR assembly radiation source terms that bound selected groupings of BWR assemblies, with regard to assembly average burnup and cooling time, which comprise the anticipated MGDS BWR commercial spent nuclear fuel (SNF) waste stream. The source term data is to be provided in a form which can easily be utilized in subsequent shielding/radiation dose calculations. Since these calculations may also be used for Total System Performance Assessment (TSPA), with appropriate justification provided by TSPA, or radionuclide release rate analysis, the grams of each element and additional cooling times out to 25 years will also be calculated and the data included in the output files.

  5. Solutions of Boltzmann's Equation for Mono-energetic Neutrons in an Infinite Homogeneous Medium

    DOE R&D Accomplishments Database

    Wigner, E. P.

    1943-11-30

    Boltzmann's equation is solved for the case of monoenergetic neutrons created by a plane or point source in an infinite medium which has spherically symmetric scattering. The customary solution of the diffusion equation appears to be multiplied by a constant factor which is smaller than 1. In addition to this term, the total neutron density contains another term which is important in the neighborhood of the source. It varies as 1/r² in the neighborhood of a point source. (auth)
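    The qualitative structure described in this abstract can be sketched numerically. The snippet below is purely illustrative (it is not Wigner's derivation): it models the total density near a point source as a diffusion-theory term scaled by a constant factor less than 1, plus an uncollided-flux-like term varying as 1/r² that dominates near the source. All parameter values (`SIGMA_T`, `L_DIFF`, `C_FACTOR`) are arbitrary assumptions for demonstration.

```python
import math

SIGMA_T = 1.0   # total macroscopic cross section (1/cm), assumed
L_DIFF = 2.0    # diffusion length (cm), assumed
C_FACTOR = 0.8  # constant factor < 1 multiplying the diffusion solution, assumed

def diffusion_term(r):
    """Point-source diffusion-theory density, ~ exp(-r/L) / r."""
    return math.exp(-r / L_DIFF) / (4.0 * math.pi * L_DIFF**2 * r)

def near_source_term(r):
    """Near-source contribution, ~ exp(-Sigma_t r) / r^2."""
    return math.exp(-SIGMA_T * r) / (4.0 * math.pi * r**2)

def total_density(r):
    """Total density: scaled diffusion solution plus near-source term."""
    return C_FACTOR * diffusion_term(r) + near_source_term(r)

# Close to the source the 1/r^2 term dominates; far away the diffusion term does.
for r in (0.01, 1.0, 10.0):
    print(r, diffusion_term(r), near_source_term(r))
```

    As the printed values show, the 1/r² term overwhelms the diffusion term at small r, consistent with the abstract's statement that it matters only in the neighborhood of the source.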

  6. Common Calibration Source for Monitoring Long-term Ozone Trends

    NASA Technical Reports Server (NTRS)

    Kowalewski, Matthew

    2004-01-01

    Accurate long-term satellite measurements are crucial for monitoring the recovery of the ozone layer. The slow pace of the recovery and the limited lifetimes of satellite monitoring instruments demand that datasets from multiple observation systems be combined to provide the long-term accuracy needed. A fundamental component of accurately monitoring long-term trends is the calibration of these various instruments. NASA's Radiometric Calibration and Development Facility at the Goddard Space Flight Center has provided resources to minimize calibration biases between multiple instruments through the use of a common calibration source and standardized procedures traceable to national standards. The Facility's 50 cm barium sulfate integrating sphere has been used as a common calibration source for both US and international satellite instruments, including the Total Ozone Mapping Spectrometer (TOMS), Solar Backscatter Ultraviolet 2 (SBUV/2) instruments, Shuttle SBUV (SSBUV), Ozone Monitoring Instrument (OMI), Global Ozone Monitoring Experiment (GOME) (ESA), Scanning Imaging SpectroMeter for Atmospheric ChartographY (SCIAMACHY) (ESA), and others. We will discuss the advantages of using a common calibration source and its effects on long-term ozone data sets. In addition, sphere calibration results from various instruments will be presented to demonstrate the accuracy of the long-term characterization of the source itself.

  7. Galactic water vapor emission: further observations of variability.

    PubMed

    Knowles, S H; Mayer, C H; Sullivan, W T; Cheung, A C

    1969-10-10

    Recent observations of the 1.35-centimeter line emission of water vapor from galactic sources show short-term variability in the spectra of several sources. Two additional sources, Cygnus 1 and NGC 6334N, have been observed, and the spectra of W49 and VY Canis Majoris were measured over a wider range of radial velocity.

  8. Source Term Model for Vortex Generator Vanes in a Navier-Stokes Computer Code

    NASA Technical Reports Server (NTRS)

    Waithe, Kenrick A.

    2004-01-01

    A source term model for an array of vortex generators was implemented into a non-proprietary Navier-Stokes computer code, OVERFLOW. The source term models the side force created by a vortex generator vane. The model is obtained by introducing a side force to the momentum and energy equations that can adjust its strength automatically based on the local flow. The model was tested and calibrated by comparing data from numerical simulations and experiments of a single low profile vortex generator vane on a flat plate. In addition, the model was compared to experimental data of an S-duct with 22 co-rotating, low profile vortex generators. The source term model allowed a grid reduction of about seventy percent when compared with the numerical simulations performed on a fully gridded vortex generator on a flat plate without adversely affecting the development and capture of the vortex created. The source term model was able to predict the shape and size of the stream-wise vorticity and velocity contours very well when compared with both numerical simulations and experimental data. The peak vorticity and its location were also predicted very well when compared to numerical simulations and experimental data. The circulation predicted by the source term model matches the prediction of the numerical simulation. The source term model predicted the engine fan face distortion and total pressure recovery of the S-duct with 22 co-rotating vortex generators very well. The source term model allows a researcher to quickly investigate different locations of individual or a row of vortex generators. The researcher is able to conduct a preliminary investigation with minimal grid generation and computational time.
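    The central idea of this record, replacing a physical vane with a body force whose strength adapts to the local flow, can be sketched schematically. The snippet below is NOT the OVERFLOW implementation; the force law, constants (`C_VG`, `S_VANE`, `RHO`), and vane orientation vectors are hypothetical placeholders that merely illustrate a lift-like side force added to the momentum and energy equations.

```python
import numpy as np

RHO = 1.2          # local density (kg/m^3), assumed
C_VG = 2.0         # model calibration constant, assumed
S_VANE = 1e-4      # vane planform area (m^2), assumed

def vane_source_terms(u, normal, tangent):
    """Return (momentum_source, energy_source) for a local velocity u (3-vector).

    The side force is taken proportional to the local dynamic pressure and to
    the flow incidence onto the vane plane, so its strength adjusts
    automatically with the local flow, as the abstract describes.
    """
    u = np.asarray(u, dtype=float)
    speed = np.linalg.norm(u)
    if speed == 0.0:
        return np.zeros(3), 0.0
    incidence = np.dot(u, normal) / speed        # sine of flow angle onto vane
    magnitude = C_VG * 0.5 * RHO * speed**2 * S_VANE * incidence
    direction = np.cross(u / speed, tangent)     # side-force direction (normal to u)
    f = magnitude * direction
    return f, float(np.dot(f, u))                # energy source = f . u

f, e = vane_source_terms([50.0, 5.0, 0.0],
                         normal=np.array([0.0, 1.0, 0.0]),
                         tangent=np.array([0.0, 0.0, 1.0]))
```

    Note one design consequence of a lift-like model: because the force is constructed normal to the local velocity, it does no work on the flow, so the corresponding energy-equation source f·u vanishes identically.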

  9. SOURCES OF PCBS TO THE ATMOSPHERE IN CHICAGO

    EPA Science Inventory

    The project will obtain additional short-term PCB samples in southwestern Chicago to determine the amount of PCB emissions to the air from a sludge drying facility. Four different types of samples will be collected: (1) short-term ambient air samples surrounding the drying beds,...

  10. 7 CFR 1786.167 - Restrictions to additional RUS financing.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 12 2010-01-01 2010-01-01 false Restrictions to additional RUS financing. 1786.167... additional RUS financing. (a) No borrower that prepays an electric loan at a discount as provided under this... borrower is unable to obtain financing at reasonable terms to restore the system from non-RUS sources...

  11. Interlaboratory study of the ion source memory effect in 36Cl accelerator mass spectrometry

    NASA Astrophysics Data System (ADS)

    Pavetich, Stefan; Akhmadaliev, Shavkat; Arnold, Maurice; Aumaître, Georges; Bourlès, Didier; Buchriegler, Josef; Golser, Robin; Keddadouche, Karim; Martschini, Martin; Merchel, Silke; Rugel, Georg; Steier, Peter

    2014-06-01

    Understanding and minimizing contamination in the ion source, caused by cross-contamination and the long-term memory effect, is one of the key issues for accurate accelerator mass spectrometry (AMS) measurements of volatile elements. This work focuses on investigating the long-term memory effect for the volatile element chlorine, and on minimizing this effect in the ion source of the Dresden accelerator mass spectrometry facility (DREAMS). For this purpose, one of the two original HVE ion sources at the DREAMS facility was modified, allowing the use of larger sample holders with individual target apertures. Additionally, a more open geometry was used to improve the vacuum level. To evaluate this improvement in comparison with other up-to-date ion sources, an interlaboratory comparison was initiated. The long-term memory effect of the four Cs sputter ion sources at DREAMS (two sources: original and modified), ASTER (Accélérateur pour les Sciences de la Terre, Environnement, Risques) and VERA (Vienna Environmental Research Accelerator) was investigated by measuring samples with a natural 35Cl/37Cl ratio and samples highly enriched in 35Cl (35Cl/37Cl ∼ 999). Besides investigating and comparing the individual levels of long-term memory, recovery time constants could be calculated. The tests show that all four sources suffer from long-term memory, but the modified DREAMS ion source showed the lowest level of contamination. The recovery times of the four ion sources were widely spread, between 61 and 1390 s; the modified DREAMS ion source, with values between 156 and 262 s, showed the fastest recovery in 80% of the measurements.
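    A recovery time constant of the kind quoted above can be estimated by fitting an exponential decay to the excess isotope ratio measured after switching from an enriched sample back to natural material. The sketch below uses synthetic, noiseless data with an assumed τ of 200 s; it illustrates the fitting idea only and does not reproduce the paper's analysis.

```python
import numpy as np

TAU_TRUE = 200.0   # recovery time constant in seconds, assumed
A_TRUE = 5.0       # initial excess ratio above baseline, arbitrary units

# Synthetic measurement: excess(t) = A * exp(-t / tau)
t = np.linspace(0.0, 1000.0, 50)
excess = A_TRUE * np.exp(-t / TAU_TRUE)

# Linearize (log(excess) = log(A) - t/tau) and fit a straight line.
slope, intercept = np.polyfit(t, np.log(excess), 1)
tau_fit = -1.0 / slope
a_fit = np.exp(intercept)
```

    On real data one would fit `baseline + A * exp(-t/tau)` with a nonlinear solver instead, since the measured ratio decays toward the natural value rather than zero.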

  12. Noise-enhanced CVQKD with untrusted source

    NASA Astrophysics Data System (ADS)

    Wang, Xiaoqun; Huang, Chunhui

    2017-06-01

    The performance of one-way and two-way continuous variable quantum key distribution (CVQKD) protocols can be increased by adding some noise on the reconciliation side. In this paper, we propose to add noise at the reconciliation end to improve the performance of CVQKD with untrusted source. We derive the key rate of this case and analyze the impact of the additive noise. The simulation results show that the optimal additive noise can improve the performance of the system in terms of maximum transmission distance and tolerable excess noise.

  13. An Investigation into the Comparative Costs of Additive Manufacture vs. Machine from Solid for Aero Engine Parts

    DTIC Science & Technology

    2006-05-01

    welding power sources are not totally efficient at converting power drawn from the wall into heat energy used for the welding process. TIG sources are... • Powder bed + Laser • Wire + Laser • Wire + Electron Beam • Wire + TIG. Each system has its own unique attributes in terms of process variables... relative economics of producing a near net shape by Additive Manufacturing (AM) processes compared with traditional machine from solid processes (MFS)...

  14. Low birth weight and air pollution in California: Which sources and components drive the risk?

    PubMed

    Laurent, Olivier; Hu, Jianlin; Li, Lianfa; Kleeman, Michael J; Bartell, Scott M; Cockburn, Myles; Escobedo, Loraine; Wu, Jun

    2016-01-01

    Intrauterine growth restriction has been associated with exposure to air pollution, but there is a need to clarify which sources and components are most likely responsible. This study investigated the associations between low birth weight (LBW, <2500g) in term born infants (≥37 gestational weeks) and air pollution by source and composition in California, over the period 2001-2008. Complementary exposure models were used: an empirical Bayesian kriging model for the interpolation of ambient pollutant measurements, a source-oriented chemical transport model (using California emission inventories) that estimated fine and ultrafine particulate matter (PM2.5 and PM0.1, respectively) mass concentrations (4km×4km) by source and composition, a line-source roadway dispersion model at fine resolution, and traffic index estimates. Birth weight was obtained from California birth certificate records. A case-cohort design was used. Five controls per term LBW case were randomly selected (without covariate matching or stratification) from among term births. The resulting datasets were analyzed by logistic regression with a random effect by hospital, using generalized additive mixed models adjusted for race/ethnicity, education, maternal age and household income. In total 72,632 singleton term LBW cases were included. Term LBW was positively and significantly associated with interpolated measurements of ozone but not total fine PM or nitrogen dioxide. No significant association was observed between term LBW and primary PM from all sources grouped together. A positive significant association was observed for secondary organic aerosols. Exposure to elemental carbon (EC), nitrates and ammonium were also positively and significantly associated with term LBW, but only for exposure during the third trimester of pregnancy. Significant positive associations were observed between term LBW risk and primary PM emitted by on-road gasoline and diesel or by commercial meat cooking sources. 
Primary PM from wood burning was inversely associated with term LBW. Significant positive associations were also observed between term LBW and ultrafine particle numbers modeled with the line-source roadway dispersion model, traffic density and proximity to roadways. This large study based on complementary exposure metrics suggests that not only primary pollution sources (traffic and commercial meat cooking) but also EC and secondary pollutants are risk factors for term LBW. Copyright © 2016 Elsevier Ltd. All rights reserved.

  15. Auditing the multiply-related concepts within the UMLS

    PubMed Central

    Mougin, Fleur; Grabar, Natalia

    2014-01-01

    Objective: This work focuses on multiply-related Unified Medical Language System (UMLS) concepts, that is, concepts associated through multiple relations. The relations involved in such situations are audited to determine whether they are provided by source vocabularies or result from the integration of these vocabularies within the UMLS. Methods: We study the compatibility of the multiple relations which associate the concepts under investigation and try to explain the reason why they co-occur. Towards this end, we analyze the relations both at the concept and term levels. In addition, we randomly select 288 concepts associated through contradictory relations and manually analyze them. Results: At the UMLS scale, only 0.7% of combinations of relations are contradictory, while homogeneous combinations are observed in one-third of situations. At the scale of source vocabularies, one-third do not contain more than one relation between the concepts under investigation. Among the remaining source vocabularies, seven of them mainly present multiple non-homogeneous relations between terms. Analysis at the term level also shows that only in a quarter of cases are the source vocabularies responsible for the presence of multiply-related concepts in the UMLS. These results are available at: http://www.isped.u-bordeaux2.fr/ArticleJAMIA/results_multiply_related_concepts.aspx. Discussion: Manual analysis was useful to explain the conceptualization difference in relations between terms across source vocabularies. The exploitation of source relations was helpful for understanding why some source vocabularies describe multiple relations between a given pair of terms. PMID:24464853

  16. Understanding the electrical behavior of the action potential in terms of elementary electrical sources.

    PubMed

    Rodriguez-Falces, Javier

    2015-03-01

    A concept of major importance in human electrophysiology studies is the process by which activation of an excitable cell results in a rapid rise and fall of the electrical membrane potential, the so-called action potential. Hodgkin and Huxley proposed a model to explain the ionic mechanisms underlying the formation of action potentials. However, this model is unsuitably complex for teaching purposes. In addition, the Hodgkin and Huxley approach describes the shape of the action potential only in terms of ionic currents, i.e., it is unable to explain the electrical significance of the action potential or describe the electrical field arising from this source using basic concepts of electromagnetic theory. The goal of the present report was to propose a new model to describe the electrical behavior of the action potential in terms of elementary electrical sources (in particular, dipoles). The efficacy of this model was tested through a closed-book written exam. The proposed model increased the ability of students to appreciate the distributed character of the action potential and also to recognize that this source spreads out along the fiber as a function of space. In addition, the new approach allowed students to realize that the amplitude and sign of the extracellular electrical potential arising from the action potential are determined by the spatial derivative of this intracellular source. The proposed model, which incorporates intuitive graphical representations, has improved students' understanding of the electrical potentials generated by bioelectrical sources and has heightened their interest in bioelectricity. Copyright © 2015 The American Physiological Society.

  17. An Exact Form of Lilley's Equation with a Velocity Quadrupole/Temperature Dipole Source Term

    NASA Technical Reports Server (NTRS)

    Goldstein, Marvin E.

    2001-01-01

    There have been several attempts to introduce approximations into the exact form of Lilley's equation in order to express the source term as the sum of a quadrupole whose strength is quadratic in the fluctuating velocities and a dipole whose strength is proportional to the temperature fluctuations. The purpose of this note is to show that it is possible to choose the dependent (i.e., the pressure) variable so that this type of result can be derived directly from the Euler equations without introducing any additional approximations.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brauner, Edwin Jr.; Carlson, Daniel C.

    The Geysers steamfields in northern Sonoma County have produced reliable "green" power for many years. An impediment to long-term continued production has been the ability to provide a reliable source of injection water to replace water extracted and lost in the form of steam. The steamfield operators have historically used cooling towers to recycle a small portion of the steam and have collected water during the winter months using stream extraction. These two sources, however, could not by themselves sustain the steamfield in the long term. The Lake County Reclaimed Water Project (SEGEP) was initiated in 1997 and provides another source of steamfield replenishment water. The Santa Rosa Geysers Recharge Project provides another significant step in replenishing the steamfield. In addition, the Santa Rosa Geysers Recharge Project has been built with capacity to potentially meet virtually all injection water requirements, when combined with these other sources. Figure 2.1 graphically depicts the combination of injection sources.

  19. A Comparison of Mathematical Models of Fish Mercury Concentration as a Function of Atmospheric Mercury Deposition Rate and Watershed Characteristics

    NASA Astrophysics Data System (ADS)

    Smith, R. A.; Moore, R. B.; Shanley, J. B.; Miller, E. K.; Kamman, N. C.; Nacci, D.

    2009-12-01

    Mercury (Hg) concentrations in fish and aquatic wildlife are complex functions of atmospheric Hg deposition rate, terrestrial and aquatic watershed characteristics that influence Hg methylation and export, and food chain characteristics determining Hg bioaccumulation. Because of the complexity and incomplete understanding of these processes, regional-scale models of fish tissue Hg concentration are necessarily empirical in nature, typically constructed through regression analysis of fish tissue Hg concentration data from many sampling locations on a set of potential explanatory variables. Unless the data sets are unusually long and show clear time trends, the empirical basis for model building must be based solely on spatial correlation. Predictive regional scale models are highly useful for improving understanding of the relevant biogeochemical processes, as well as for practical fish and wildlife management and human health protection. Mechanistically, the logical arrangement of explanatory variables is to multiply each of the individual Hg source terms (e.g. dry, wet, and gaseous deposition rates, and residual watershed Hg) for a given fish sampling location by source-specific terms pertaining to methylation, watershed transport, and biological uptake for that location (e.g. SO4 availability, hill slope, lake size). This mathematical form has the desirable property that predicted tissue concentration will approach zero as all individual source terms approach zero. One complication with this form, however, is that it is inconsistent with the standard linear multiple regression equation in which all terms (including those for sources and physical conditions) are additive. 
An important practical disadvantage of a model in which the Hg source terms are additive (rather than multiplicative) with their modifying factors is that predicted concentration is not zero when all sources are zero, making it unreliable for predicting the effects of large future reductions in Hg deposition. In this paper we compare the results of using several different linear and non-linear models in an analysis of watershed and fish Hg data for 450 New England lakes. The differences in model results pertain to both their utility in interpreting methylation and export processes as well as in fisheries management.
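    The modeling distinction drawn above can be made concrete with a toy comparison. The sketch below is a schematic illustration, not one of the paper's fitted models: all coefficients are arbitrary. It shows that a multiplicative form predicts zero concentration when all Hg source terms are zero, while the standard additive linear form generally does not.

```python
def multiplicative_model(sources, factors, coeffs):
    """C = sum_i b_i * source_i * factor_i: each source term is scaled by its
    modifying factor, so C -> 0 as all sources -> 0."""
    return sum(b * s * f for b, s, f in zip(coeffs, sources, factors))

def additive_model(sources, factors, coeffs, factor_coeffs, intercept):
    """C = a + sum_i b_i * source_i + sum_j c_j * factor_j: the standard linear
    regression form, in which sources and conditions are all additive."""
    return (intercept
            + sum(b * s for b, s in zip(coeffs, sources))
            + sum(c * f for c, f in zip(factor_coeffs, factors)))

sources = [0.0, 0.0, 0.0]   # e.g. wet, dry, gaseous deposition, all zero
factors = [1.4, 0.7, 2.1]   # e.g. SO4 availability, hill slope, lake size
coeffs = [0.3, 0.2, 0.1]    # arbitrary example coefficients

c_mult = multiplicative_model(sources, factors, coeffs)
c_add = additive_model(sources, factors, coeffs, [0.05, 0.02, 0.01], 0.5)
```

    With every deposition source set to zero, `c_mult` is exactly zero while `c_add` remains positive, which is precisely why the additive form is unreliable for extrapolating to large future reductions in Hg deposition.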

  20. Coarse Grid Modeling of Turbine Film Cooling Flows Using Volumetric Source Terms

    NASA Technical Reports Server (NTRS)

    Heidmann, James D.; Hunter, Scott D.

    2001-01-01

    The recent trend in numerical modeling of turbine film cooling flows has been toward higher fidelity grids and more complex geometries. This trend has been enabled by the rapid increase in computing power available to researchers. However, the turbine design community requires fast turnaround time in its design computations, rendering these comprehensive simulations ineffective in the design cycle. The present study describes a methodology for implementing a volumetric source term distribution in a coarse grid calculation that can model the small-scale and three-dimensional effects present in turbine film cooling flows. This model could be implemented in turbine design codes or in multistage turbomachinery codes such as APNASA, where the computational grid size may be larger than the film hole size. Detailed computations of a single row of 35 deg round holes on a flat plate have been obtained for blowing ratios of 0.5, 0.8, and 1.0, and density ratios of 1.0 and 2.0 using a multiblock grid system to resolve the flows on both sides of the plate as well as inside the hole itself. These detailed flow fields were spatially averaged to generate a field of volumetric source terms for each conservative flow variable. Solutions were also obtained using three coarse grids having streamwise and spanwise grid spacings of 3d, 1d, and d/3. These coarse grid solutions used the integrated hole exit mass, momentum, energy, and turbulence quantities from the detailed solutions as volumetric source terms. It is shown that a uniform source term addition over a distance from the wall on the order of the hole diameter is able to predict adiabatic film effectiveness better than a near-wall source term model, while strictly enforcing correct values of integrated boundary layer quantities.
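    The spatial-averaging step described in this record can be sketched in one dimension. The snippet below is a schematic illustration with synthetic grids and fluxes, not the APNASA or OVERFLOW machinery: it averages detailed hole-exit mass fluxes into volumetric source terms on a coarse grid while conserving the integrated quantity exactly, which is the key property the abstract emphasizes.

```python
import numpy as np

fine_dx = 0.1                    # fine-grid spacing, arbitrary units
fine_flux = np.zeros(30)         # mass flux per fine cell
fine_flux[12:18] = 2.0           # the film-cooling "hole" spans a few fine cells

coarse_per_fine = 10             # 10 fine cells per coarse cell
coarse_dx = fine_dx * coarse_per_fine

# Volumetric source per coarse cell = injected mass in that cell / cell volume.
injected = fine_flux.reshape(-1, coarse_per_fine).sum(axis=1) * fine_dx
source = injected / coarse_dx

# Conservation check: the coarse sources integrate to the same total as the
# detailed solution, mirroring "strictly enforcing correct values of
# integrated boundary layer quantities".
total_fine = float((fine_flux * fine_dx).sum())
total_coarse = float((source * coarse_dx).sum())
```

    The same averaging would be applied to momentum, energy, and turbulence quantities, and, per the abstract's finding, the sources would be distributed over a wall-normal distance on the order of the hole diameter.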

  1. Auditing the multiply-related concepts within the UMLS.

    PubMed

    Mougin, Fleur; Grabar, Natalia

    2014-10-01

    This work focuses on multiply-related Unified Medical Language System (UMLS) concepts, that is, concepts associated through multiple relations. The relations involved in such situations are audited to determine whether they are provided by source vocabularies or result from the integration of these vocabularies within the UMLS. We study the compatibility of the multiple relations which associate the concepts under investigation and try to explain the reason why they co-occur. Towards this end, we analyze the relations both at the concept and term levels. In addition, we randomly select 288 concepts associated through contradictory relations and manually analyze them. At the UMLS scale, only 0.7% of combinations of relations are contradictory, while homogeneous combinations are observed in one-third of situations. At the scale of source vocabularies, one-third do not contain more than one relation between the concepts under investigation. Among the remaining source vocabularies, seven of them mainly present multiple non-homogeneous relations between terms. Analysis at the term level also shows that only in a quarter of cases are the source vocabularies responsible for the presence of multiply-related concepts in the UMLS. These results are available at: http://www.isped.u-bordeaux2.fr/ArticleJAMIA/results_multiply_related_concepts.aspx. Manual analysis was useful to explain the conceptualization difference in relations between terms across source vocabularies. The exploitation of source relations was helpful for understanding why some source vocabularies describe multiple relations between a given pair of terms. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  2. What's in a ray set: moving towards a unified ray set format

    NASA Astrophysics Data System (ADS)

    Muschaweck, Julius

    2011-10-01

    For the purpose of optical simulation, a plethora of formats exist to describe the properties of a light source. Except for the EULUMDAT and IES formats, which describe sources in terms of aperture area and far field intensity, all these formats are vendor specific, and no generally accepted standard exists. Most illumination simulation software vendors use their own format for ray sets, which describe sources in terms of many rays. Some of them keep their format definition proprietary. Thus, software packages typically can read or write only their own specific format, although the actual data content is not so different. Typically, they describe the origin and direction of each ray with 3D vectors, and use one additional number for the magnitude, where magnitude may denote radiant flux, luminous flux (equivalently tristimulus Y), or tristimulus X and Z. Sometimes each ray also carries its wavelength, while other formats allow an overall spectrum to be specified for the whole source. In addition, in at least one format, polarization properties are also included for each ray. This situation makes it inefficient and potentially error prone for light source manufacturers to provide ray data sets for their sources in many different formats. Furthermore, near field goniometer vendors again use their proprietary formats to store the source description in terms of luminance data, and offer their proprietary software to generate ray sets from this data base. Again, the plethora of ray-set formats makes ray-set production inefficient and potentially error prone. In this paper, we propose to describe ray data sets in terms of phase space, as a step towards a standardized ray set format. It is well known that luminance and radiance can be defined as flux density in phase space: luminance is flux divided by etendue. Therefore, single rays can be thought of as center points of phase space cells, where each cell possesses its volume (i.e. etendue), its flux, and therefore its luminance.
In addition, each phase space cell possesses its spectrum, and its polarization properties. We show how this approach leads to a unification of the EULUMDAT/IES, ray set and near field goniometer formats, making possible the generation of arbitrarily many additional rays by luminance interpolation. We also show how the EULUMDAT/IES and individual ray set formats can be derived from the proposed general format, making software using a possible standard format downward compatible.
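    The phase-space picture described above maps naturally onto a compact record layout. The sketch below is a minimal illustration of the idea, not part of any existing or proposed format; the class and field names are our own. It treats one ray as the center of a phase-space cell whose luminance falls out of its flux and etendue:

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class PhaseSpaceCell:
    """One ray interpreted as the center point of a phase-space cell."""
    origin: Tuple[float, float, float]     # ray start point in 3D [m]
    direction: Tuple[float, float, float]  # unit direction vector
    etendue: float                         # phase-space volume of the cell [m^2 sr]
    flux: float                            # luminous flux carried by the cell [lm]

    @property
    def luminance(self) -> float:
        """Luminance as flux density in phase space: flux / etendue."""
        return self.flux / self.etendue
```

    Interpolating luminance between neighboring cells is then what permits generating arbitrarily many additional rays from the same stored data.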

  3. An Improved Elastic and Nonelastic Neutron Transport Algorithm for Space Radiation

    NASA Technical Reports Server (NTRS)

    Clowdsley, Martha S.; Wilson, John W.; Heinbockel, John H.; Tripathi, R. K.; Singleterry, Robert C., Jr.; Shinn, Judy L.

    2000-01-01

    A neutron transport algorithm including both elastic and nonelastic particle interaction processes, for use in space radiation protection with arbitrary shield materials, is developed. The algorithm is based upon multiple energy grouping and analysis of the straight-ahead Boltzmann equation using a mean value theorem for integrals. The algorithm is then coupled to the Langley HZETRN code through a bidirectional neutron evaporation source term. The neutron fluence generated by the solar particle event of February 23, 1956, for an aluminum-water shield-target configuration is then compared with MCNPX and LAHET Monte Carlo calculations for the same configuration. With the Monte Carlo calculation as a benchmark, the algorithm developed in this paper shows a great improvement over the unmodified HZETRN solution. In addition, a high-energy bidirectional neutron source based on a formula by Ranft further improves the fluence results near the front of the water target, where diffusion out of the front surface is important. Effects of improved interaction cross sections are modest compared with the addition of the high-energy bidirectional source terms.

  4. Improvements of PKU PMECRIS for continuous hundred hours CW proton beam operation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peng, S. X., E-mail: sxpeng@pku.edu.cn; Ren, H. T.; Zhang, T.

    2016-02-15

    In order to improve source stability, a long-term continuous wave (CW) proton beam experiment has been carried out with the Peking University compact permanent magnet 2.45 GHz ECR ion source (PKU PMECRIS). Before this experiment, a number of improvements and modifications were made to the source body, the Faraday cup, and the PKU ion source test bench. At the beginning of 2015, a continuous 306-h run of the PKU PMECRIS with more than 50 mA CW beam was carried out after the success of many short-term tests. No plasma generator failure or high-voltage breakdown was observed during that running period, and the proton source reliability was near 100%. Total beam availability, defined as 35-keV beam-on time divided by elapsed time, was higher than 99% [S. X. Peng et al., Chin. Phys. B 24(7), 075203 (2015)]. A re-inspection was performed after an additional 100 h of operation (counting time), and no obvious sign of component failure was observed. Counting the previous source testing time, the PMECRIS longevity is now demonstrated to be greater than 460 h. This paper concentrates mainly on the improvements made for this long-term experiment.

  5. Learning Discriminative Sparse Models for Source Separation and Mapping of Hyperspectral Imagery

    DTIC Science & Technology

    2010-10-01

    allowing spectroscopic analysis. The data acquired by these spectrometers play significant roles in biomedical, environmental, land-survey, and... noisy in nature, so there are differences between the true and the observed signals. In addition, there are distortions associated with atmosphere... handwriting classification, showing advantages of using both terms instead of only using the reconstruction term as in previous approaches. C. Dictionary

  6. Evaluation of Long-term Performance of Enhanced Anaerobic Source Zone Bioremediation using mass flux

    NASA Astrophysics Data System (ADS)

    Haluska, A.; Cho, J.; Hatzinger, P.; Annable, M. D.

    2017-12-01

    Chlorinated ethene DNAPL source zones in groundwater act as potential long-term sources of contamination: as they dissolve, they yield concentrations well above MCLs, posing an ongoing public health risk. Enhanced bioremediation has been applied to treat many source zones with significant promise, but the long-term sustainability of this technology has not been thoroughly assessed. This study evaluated the long-term effectiveness of enhanced anaerobic source zone bioremediation at chloroethene-contaminated sites to determine whether the treatment prevented contaminant rebound and removed NAPL from the source zone. Long-term performance was evaluated based on achieving MCL-based contaminant mass fluxes in parent compound concentrations during different monitoring periods. Groundwater concentration versus time data were compiled for six sites, and post-remedial contaminant mass flux was measured using passive flux meters at wells both within and down-gradient of the source zone. Post-remedial mass flux data were then combined with pre-remedial water quality data to estimate pre-remedial mass flux. This information was used to characterize a DNAPL dissolution source strength function, such as the Power Law Model or the Equilibrium Streamtube Model. The six sites characterized for this study were (1) Former Charleston Air Force Base, Charleston, SC; (2) Dover Air Force Base, Dover, DE; (3) Treasure Island Naval Station, San Francisco, CA; (4) Former Raritan Arsenal, Edison, NJ; (5) Naval Air Station, Jacksonville, FL; and (6) Former Naval Air Station, Alameda, CA. Contaminant mass fluxes decreased at all sites by the end of the post-treatment monitoring period, and rebound was limited within the source zone. Post-remedial source strength function estimates suggest that contaminant mass flux will continue to decrease at these sites, but an MCL-based mass flux goal may never be achieved. Thus, site clean-up goals should be evaluated as order-of-magnitude reductions. Additionally, sites may require monitoring for a minimum of 5 years in order to sufficiently evaluate remedial performance. The study shows that enhanced anaerobic source zone bioremediation contributed to a modest reduction of source zone contaminant mass discharge and appears to have mitigated rebound of chlorinated ethenes.
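    The Power Law Model mentioned above ties source mass discharge to the remaining source mass. As a hedged illustration (the function and the symbols M0, J0, and gamma are ours, chosen for this sketch, not values from the study), the code below integrates the resulting depletion with forward Euler:

```python
import numpy as np

def power_law_depletion(M0, J0, gamma, dt, n_steps):
    """Power Law Model sketch: mass discharge J declines with remaining
    source mass M as J = J0 * (M / M0)**gamma; forward-Euler time stepping."""
    M = np.empty(n_steps + 1)
    J = np.empty(n_steps + 1)
    M[0] = M0
    for n in range(n_steps):
        J[n] = J0 * (M[n] / M0) ** gamma       # current mass discharge
        M[n + 1] = max(M[n] - J[n] * dt, 0.0)  # deplete the source, floor at zero
    J[-1] = J0 * (M[-1] / M0) ** gamma
    return M, J
```

    For gamma = 1 this reduces to exponential decline, while gamma > 1 produces a long low-flux tail, which is one way to see why strict flux-based clean-up targets can remain out of reach even as the source visibly depletes.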

  7. Numerical models for the diffuse ionized gas in galaxies. I. Synthetic spectra of thermally excited gas with turbulent magnetic reconnection as energy source

    NASA Astrophysics Data System (ADS)

    Hoffmann, T. L.; Lieb, S.; Pauldrach, A. W. A.; Lesch, H.; Hultzsch, P. J. N.; Birk, G. T.

    2012-08-01

    Aims: The aim of this work is to verify whether turbulent magnetic reconnection can provide the additional energy input required to explain the still poorly understood ionization mechanism of the diffuse ionized gas (DIG) in galaxies and its observed emission line spectra. Methods: We use a detailed non-LTE radiative transfer code, free of the usual restrictive gaseous-nebula approximations, to compute synthetic spectra for gas at low densities. The gas is excited via an additional heating term in the energy balance as well as by photoionization. Numerical values for this heating term are derived from three-dimensional resistive magnetohydrodynamic two-fluid plasma-neutral-gas simulations, which compute energy dissipation rates for the DIG under typical conditions. Results: Our simulations show that magnetic reconnection can liberate enough energy to fully or partially ionize the gas by itself. However, synthetic spectra from purely thermally excited gas are incompatible with the observed spectra; a photoionization source must additionally be present to establish the correct (observed) ionization balance in the gas.

  8. Multiple vesicle recycling pathways in central synapses and their impact on neurotransmission

    PubMed Central

    Kavalali, Ege T

    2007-01-01

    Short-term synaptic depression during repetitive activity is a common property of most synapses. Multiple mechanisms contribute to this rapid depression in neurotransmission, including a decrease in vesicle fusion probability, inactivation of voltage-gated Ca2+ channels, or use-dependent inhibition of the release machinery by presynaptic receptors. In addition, synaptic depression can arise from a rapid reduction in the number of vesicles available for release. This reduction can be countered from two sources. One source is replenishment from a set of reserve vesicles. The second is the reuse of vesicles that have undergone exocytosis and endocytosis. If synaptic vesicle reuse is fast enough, it can replenish vesicles during a brief burst of action potentials and play a substantial role in regulating the rate of synaptic depression. In the last 5 years, we have examined the impact of synaptic vesicle reuse on neurotransmission using fluorescence imaging of synaptic vesicle trafficking in combination with electrophysiological detection of short-term synaptic plasticity. These studies have revealed that synaptic vesicle reuse shapes the kinetics of short-term synaptic depression in a frequency-dependent manner. In addition, synaptic vesicle recycling helps maintain the level of neurotransmission at steady state. Moreover, our studies showed that synaptic vesicle reuse is a highly plastic process, as it varies widely among synapses and can adapt to changes in chronic activity levels. PMID:17690145

  9. 40 CFR 418.71 - Specialized definitions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... this subpart. (b) The term mixed fertilizer shall mean a mixture of wet and/or dry straight fertilizer materials, mixed fertilizer materials, fillers and additives prepared through chemical reaction to a given... STANDARDS FERTILIZER MANUFACTURING POINT SOURCE CATEGORY Mixed and Blend Fertilizer Production Subcategory...

  10. 40 CFR 418.71 - Specialized definitions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... this subpart. (b) The term mixed fertilizer shall mean a mixture of wet and/or dry straight fertilizer materials, mixed fertilizer materials, fillers and additives prepared through chemical reaction to a given... STANDARDS FERTILIZER MANUFACTURING POINT SOURCE CATEGORY Mixed and Blend Fertilizer Production Subcategory...

  11. 40 CFR 418.71 - Specialized definitions.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... this subpart. (b) The term mixed fertilizer shall mean a mixture of wet and/or dry straight fertilizer materials, mixed fertilizer materials, fillers and additives prepared through chemical reaction to a given... STANDARDS FERTILIZER MANUFACTURING POINT SOURCE CATEGORY Mixed and Blend Fertilizer Production Subcategory...

  12. 40 CFR 418.71 - Specialized definitions.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... apply to this subpart. (b) The term mixed fertilizer shall mean a mixture of wet and/or dry straight fertilizer materials, mixed fertilizer materials, fillers and additives prepared through chemical reaction to... AND STANDARDS FERTILIZER MANUFACTURING POINT SOURCE CATEGORY Mixed and Blend Fertilizer Production...

  13. 40 CFR 418.71 - Specialized definitions.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... this subpart. (b) The term mixed fertilizer shall mean a mixture of wet and/or dry straight fertilizer materials, mixed fertilizer materials, fillers and additives prepared through chemical reaction to a given... STANDARDS FERTILIZER MANUFACTURING POINT SOURCE CATEGORY Mixed and Blend Fertilizer Production Subcategory...

  14. Glossary of CERCLA, RCRA and TSCA related terms and acronyms. Environmental Guidance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1993-10-01

    This glossary contains CERCLA, RCRA, and TSCA related terms that are most often encountered in US Department of Energy (DOE) Environmental Restoration and Emergency Preparedness activities. Detailed definitions are included for key terms. The CERCLA definitions included in this glossary are taken from the Comprehensive Environmental Response, Compensation and Liability Act (CERCLA), as amended, and related federal rulemakings. The RCRA definitions are taken from the Resource Conservation and Recovery Act (RCRA) and related federal rulemakings. The TSCA definitions are taken from the Toxic Substances Control Act (TSCA) and related federal rulemakings; definitions related to TSCA are limited to those sections in the statute and regulations concerning PCBs and asbestos. Other sources for definitions include additional federal rulemakings, assorted guidance documents prepared by the US Environmental Protection Agency (EPA), guidance and informational documents prepared by the US Department of Energy (DOE), and DOE Orders. The source of each term is noted beside the term. Terms presented in this document reflect revised and new definitions published before July 1, 1993.

  15. Non-additive dissipation in open quantum networks out of equilibrium

    NASA Astrophysics Data System (ADS)

    Mitchison, Mark T.; Plenio, Martin B.

    2018-03-01

    We theoretically study a simple non-equilibrium quantum network whose dynamics can be expressed and exactly solved in terms of a time-local master equation. Specifically, we consider a pair of coupled fermionic modes, each one locally exchanging energy and particles with an independent, macroscopic thermal reservoir. We show that the generator of the asymptotic master equation is not additive, i.e. it cannot be expressed as a sum of contributions describing the action of each reservoir alone. Instead, we identify an additional interference term that generates coherences in the energy eigenbasis, associated with the current of conserved particles flowing in the steady state. Notably, non-additivity arises even for wide-band reservoirs coupled arbitrarily weakly to the system. Our results shed light on the non-trivial interplay between multiple thermal noise sources in modular open quantum systems.

  16. Recent advances in laser-driven neutron sources

    NASA Astrophysics Data System (ADS)

    Alejo, A.; Ahmed, H.; Green, A.; Mirfayzi, S. R.; Borghesi, M.; Kar, S.

    2016-11-01

    Due to the limited number and high cost of large-scale neutron facilities, there has been a growing interest in compact accelerator-driven neutron sources. In this context, several potential schemes of laser-driven neutron sources are being intensively studied, employing laser-accelerated electron and ion beams. In addition to the potential of delivering neutron beams with high brilliance, directionality, and ultra-short burst duration, a laser-driven neutron source would offer further advantages in terms of cost-effectiveness, compactness, and radiation confinement in close-coupled experiments. Some of the recent advances in this field are discussed, showing improvements in the directionality and flux of laser-driven neutron beams.

  17. MSW-resonant fermion mixing during reheating

    NASA Astrophysics Data System (ADS)

    Kanai, Tsuneto; Tsujikawa, Shinji

    2003-10-01

    We study the dynamics of reheating in which an inflaton field couples to two flavor fermions through Yukawa couplings. When the two fermions have a mixing term with a constant coupling, we show that a Mikheyev-Smirnov-Wolfenstein (MSW)-type resonance emerges due to the time-dependent background, in addition to the standard fermion creation via parametric resonance. This MSW resonance not only alters the number densities of fermions generated by preheating but can also lead to larger energy transfer from the inflaton to the fermions. Our mechanism can provide additional source terms for the creation of superheavy fermions, which may be relevant for the leptogenesis scenario.

  18. Forcing scheme analysis for the axisymmetric lattice Boltzmann method under incompressible limit.

    PubMed

    Zhang, Liangqi; Yang, Shiliang; Zeng, Zhong; Chen, Jie; Yin, Linmao; Chew, Jia Wei

    2017-04-01

    Because the standard lattice Boltzmann (LB) method was proposed for Cartesian Navier-Stokes (NS) equations, additional source terms are necessary in the axisymmetric LB method to represent the axisymmetric effects. Therefore, the accuracy and applicability of axisymmetric LB models depend on the forcing schemes adopted for discretization of the source terms. In this study, three forcing schemes, namely, the trapezium-rule-based scheme, the direct forcing scheme, and the semi-implicit centered scheme, are analyzed theoretically by investigating their derived macroscopic equations in the diffusive scale. In particular, the finite-difference interpretation of the standard LB method is extended to LB equations with source terms, and the accuracy of the different forcing schemes is evaluated for the axisymmetric LB method. Theoretical analysis indicates that the discrete lattice effects arising from the direct forcing scheme are part of the truncation error terms and thus do not affect the overall accuracy of the standard LB method with a general force term (i.e., when only the source terms in the momentum equation are considered), but lead to incorrect macroscopic equations for the axisymmetric LB models. On the other hand, the trapezium-rule-based scheme and the semi-implicit centered scheme both have the advantage of avoiding the discrete lattice effects and recovering the correct macroscopic equations. Numerical tests applied to validate the theoretical analysis show that both the numerical stability and the accuracy of the axisymmetric LB simulations are affected by the direct forcing scheme, which indicates that forcing schemes free of the discrete lattice effects are necessary for the axisymmetric LB method.
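    The role of the forcing scheme can be illustrated outside the LB framework with a one-step toy problem: accumulating a source S(t) over a time step with the rectangle rule (loosely, the analogue of direct forcing without correction terms) versus the trapezium rule. This is only an illustration of the quadrature orders involved, not the axisymmetric LB discretization itself:

```python
import math

def one_step_errors(dt):
    """Integrate dphi/dt = S(t) = exp(t), phi(0) = 0, over one step of size dt.
    Exact answer: phi(dt) = exp(dt) - 1."""
    exact = math.exp(dt) - 1.0
    rectangle = dt * math.exp(0.0)                         # source sampled at old time only
    trapezium = 0.5 * dt * (math.exp(0.0) + math.exp(dt))  # average of both endpoints
    return abs(rectangle - exact), abs(trapezium - exact)
```

    Halving dt cuts the rectangle error roughly fourfold (second-order local error) but the trapezium error roughly eightfold (third-order local error), reflecting the one-order gap in quadrature accuracy between the two rules.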

  19. Environmental performance of bio-based and biodegradable plastics: the road ahead.

    PubMed

    Lambert, Scott; Wagner, Martin

    2017-11-13

    Future plastic materials will be very different from those that are used today. The increasing importance of sustainability promotes the development of bio-based and biodegradable polymers, sometimes misleadingly referred to as 'bioplastics'. Because both terms imply "green" sources and "clean" removal, this paper aims at critically discussing the sometimes-conflicting terminology as well as renewable sources, with a special focus on the degradation of these polymers in natural environments. With regard to the former, we review innovations in feedstock development (e.g. microalgae and food wastes). In terms of the latter, we highlight the effects that polymer structure, additives, and environmental variables have on plastic biodegradability. We argue that a 'biodegradable' end-product does not necessarily degrade once emitted to the environment, because the chemical additives used to make it fit for purpose increase its longevity. In the future, this trend may continue as the plastics industry is also expected to be a major user of nanocomposites. Overall, there is a need to assess the performance of polymer innovations in terms of their biodegradability, especially under realistic waste management and environmental conditions, to avoid the unwanted release of plastic degradation products into receiving environments.

  20. Light-emitting diodes as an illumination source for plants: a review of research at Kennedy Space Center

    NASA Technical Reports Server (NTRS)

    Kim, Hyeon-Hye; Wheeler, Raymond M.; Sager, John C.; Yorio, Neil C.; Goins, Gregory D.

    2005-01-01

    The provision of sufficient light is a fundamental requirement for supporting long-term plant growth in space. Several types of electric lamps have been tested to provide radiant energy for plants in this regard, including fluorescent, high-pressure sodium, and metal halide lamps. These lamps vary in terms of spectral quality, which can result in differences in plant growth and morphology. Current lighting research for space-based plant culture is focused on innovative lighting technologies that offer high electrical efficiency and reduced mass and volume. Among the lighting technologies considered for space are light-emitting diodes (LEDs). The combination of red and blue LEDs has proven to be an effective lighting source for several crops, yet the appearance of plants under red and blue lighting is purplish gray, making visual assessment of plant health difficult. Additional green light would make the plant leaves appear green and normal, similar to a natural setting under white light, and may also offer psychological benefits for the crew. The addition of 24% green light (500-600 nm) to red and blue LEDs enhanced the growth of lettuce plants compared with plants grown under cool-white fluorescent lamps. These plants grown under supplemental green light would also have the aesthetic appeal of a green appearance.

  1. MAIL LOG, program theory, volume 2

    NASA Technical Reports Server (NTRS)

    Harris, D. K.

    1979-01-01

    Information relevant to the MAIL LOG program theory is documented. The L-files for mail correspondence, design information release/report, and the drawing/engineering order are given. In addition, sources for miscellaneous external routines and special support routines are documented along with a glossary of terms.

  2. Long-term carbon loss in fragmented Neotropical forests.

    PubMed

    Pütz, Sandro; Groeneveld, Jürgen; Henle, Klaus; Knogge, Christoph; Martensen, Alexandre Camargo; Metz, Markus; Metzger, Jean Paul; Ribeiro, Milton Cezar; de Paula, Mateus Dantas; Huth, Andreas

    2014-10-07

    Tropical forests play an important role in the global carbon cycle, as they store a large amount of carbon (C). Tropical deforestation has been identified as a major source of CO2 emissions, though biomass loss due to fragmentation (the creation of additional forest edges) has been largely overlooked as an additional CO2 source. Here, through the combination of remote sensing and knowledge of ecological processes, we present long-term carbon loss estimates due to fragmentation of Neotropical forests: within 10 years the Brazilian Atlantic Forest has lost 69 (±14) Tg C, and the Amazon 599 (±120) Tg C, due to fragmentation alone. For all tropical forests, we estimate emissions of up to 0.2 Pg C y(-1), or 9 to 24% of the annual global C loss due to deforestation. In conclusion, tropical forest fragmentation increases carbon loss and should be accounted for when attempting to understand the role of vegetation in the global carbon balance.

  3. Does ascorbic acid supplementation affect iron bioavailability in rats fed micronized dispersible ferric pyrophosphate fortified fruit juice?

    PubMed

    Haro-Vicente, Juan Francisco; Pérez-Conesa, Darío; Rincón, Francisco; Ros, Gaspar; Martínez-Graciá, Carmen; Vidal, Maria Luisa

    2008-12-01

    Food iron (Fe) fortification is an adequate approach to preventing Fe-deficiency anemia. Poorly water-soluble Fe compounds have good sensory attributes but low bioavailability. Reducing the particle size of Fe fortificants and adding ascorbic acid might increase the bioavailability of poorly soluble compounds. The present work compares the Fe absorption and bioavailability of micronized dispersible ferric pyrophosphate (MDFP) (poorly soluble) with ferrous sulfate (FS) (highly soluble) added to a fruit juice, in the presence or absence of ascorbic acid (AA), using the hemoglobin repletion assay in rats. After a hemoglobin depletion period, four fruit juices containing (1) FS, (2) MDFP, (3) FS + AA, and (4) MDFP + AA were produced, and each was administered to a different group of rats (n = 18) over 21 days. During the repletion period, Fe balance, hemoglobin regeneration efficiency (HRE), relative bioavailability (RBV), and Fe tissue content were determined in the short, medium, and long term. Fe absorption and bioavailability showed no significant differences between fortifying the fruit juice with FS or MDFP. The addition of AA to the juice enhanced Fe absorption during the long-term balance study within the same Fe source. HRE and Fe utilization increased after AA addition in both the FS and MDFP groups in every period. Fe absorption and bioavailability from MDFP were comparable to FS added to a fruit juice in rats. Further, the addition of AA enhanced Fe absorption in the long term, as well as Fe bioavailability throughout the repletion period, regardless of the Fe source employed.

  4. A glossary for biometeorology

    NASA Astrophysics Data System (ADS)

    Gosling, Simon N.; Bryce, Erin K.; Dixon, P. Grady; Gabriel, Katharina M. A.; Gosling, Elaine Y.; Hanes, Jonathan M.; Hondula, David M.; Liang, Liang; Bustos Mac Lean, Priscilla Ayleen; Muthers, Stefan; Nascimento, Sheila Tavares; Petralli, Martina; Vanos, Jennifer K.; Wanka, Eva R.

    2014-03-01

    Here we present, for the first time, a glossary of biometeorological terms. The glossary aims to address the need for a reliable source of biometeorological definitions, thereby facilitating communication and mutual understanding in this rapidly expanding field. A total of 171 terms are defined, with reference to 234 citations. It is anticipated that the glossary will be revisited in coming years, updating terms and adding new terms, as appropriate. The glossary is intended to provide a useful resource to the biometeorology community, and to this end, readers are encouraged to contact the lead author to suggest additional terms for inclusion in later versions of the glossary as a result of new and emerging developments in the field.

  5. A glossary for biometeorology.

    PubMed

    Gosling, Simon N; Bryce, Erin K; Dixon, P Grady; Gabriel, Katharina M A; Gosling, Elaine Y; Hanes, Jonathan M; Hondula, David M; Liang, Liang; Bustos Mac Lean, Priscilla Ayleen; Muthers, Stefan; Nascimento, Sheila Tavares; Petralli, Martina; Vanos, Jennifer K; Wanka, Eva R

    2014-03-01

    Here we present, for the first time, a glossary of biometeorological terms. The glossary aims to address the need for a reliable source of biometeorological definitions, thereby facilitating communication and mutual understanding in this rapidly expanding field. A total of 171 terms are defined, with reference to 234 citations. It is anticipated that the glossary will be revisited in coming years, updating terms and adding new terms, as appropriate. The glossary is intended to provide a useful resource to the biometeorology community, and to this end, readers are encouraged to contact the lead author to suggest additional terms for inclusion in later versions of the glossary as a result of new and emerging developments in the field.

  6. 17 CFR 38.1201 - Additional sources for compliance.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... designing a futures contract, the designated contract market should conduct market research so that the contract design meets the risk management needs of prospective users and promotes price discovery of the... and opinions during the contract design process to ensure the contract's term and conditions reflect...

  7. 40 CFR 469.12 - Specialized definitions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... AND STANDARDS ELECTRICAL AND ELECTRONIC COMPONENTS POINT SOURCE CATEGORY Semiconductor Subcategory... in 40 CFR part 136 apply to this subpart. In addition, (a) The term “total toxic organics (TTO)” means the sum of the concentrations for each of the following toxic organic compounds which is found in...

  8. 40 CFR 469.12 - Specialized definitions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... AND STANDARDS ELECTRICAL AND ELECTRONIC COMPONENTS POINT SOURCE CATEGORY Semiconductor Subcategory... in 40 CFR part 136 apply to this subpart. In addition, (a) The term “total toxic organics (TTO)” means the sum of the concentrations for each of the following toxic organic compounds which is found in...

  9. Numerical Simulations of Reacting Flows Using Asynchrony-Tolerant Schemes for Exascale Computing

    NASA Astrophysics Data System (ADS)

    Cleary, Emmet; Konduri, Aditya; Chen, Jacqueline

    2017-11-01

    Communication and data synchronization between processing elements (PEs) are likely to pose a major challenge in scalability of solvers at the exascale. Recently developed asynchrony-tolerant (AT) finite difference schemes address this issue by relaxing communication and synchronization between PEs at a mathematical level while preserving accuracy, resulting in improved scalability. The performance of these schemes has been validated for simple linear and nonlinear homogeneous PDEs. However, many problems of practical interest are governed by highly nonlinear PDEs with source terms, whose solution may be sensitive to perturbations caused by communication asynchrony. The current work applies the AT schemes to combustion problems with chemical source terms, yielding a stiff system of PDEs with nonlinear source terms highly sensitive to temperature. Examples shown will use single-step and multi-step CH4 mechanisms for 1D premixed and nonpremixed flames. Error analysis will be discussed both in physical and spectral space. Results show that additional errors introduced by the AT schemes are negligible and the schemes preserve their accuracy. We acknowledge funding from the DOE Computational Science Graduate Fellowship administered by the Krell Institute.
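    A toy experiment conveys why communication asynchrony perturbs a finite-difference solver; this sketch shows the problem AT schemes are designed to absorb, not the AT schemes themselves. We advance the 1D heat equation explicitly, but let one grid point read a one-step-stale neighbor value, as a delayed halo exchange between PEs would:

```python
import numpy as np

def heat_step(u, r):
    """One explicit step of u_t = u_xx with zero Dirichlet boundaries, r = dt/dx^2."""
    un = u.copy()
    un[1:-1] = u[1:-1] + r * (u[2:] - 2.0 * u[1:-1] + u[:-2])
    return un

def run(n=65, r=0.25, steps=50, stale_at=None):
    """Advance a decaying sine mode; optionally one point reads a stale neighbor."""
    x = np.linspace(0.0, 1.0, n)
    u = np.sin(np.pi * x)        # smooth decaying eigenmode of the heat equation
    prev = u.copy()              # previous time level, standing in for a delayed halo
    for _ in range(steps):
        un = heat_step(u, r)
        if stale_at is not None:
            i = stale_at         # emulate asynchrony: right neighbor is one step old
            un[i] = u[i] + r * (prev[i + 1] - 2.0 * u[i] + u[i - 1])
        prev, u = u, un
    return u

u_sync = run()
u_async = run(stale_at=32)
```

    With a smooth decaying solution, the stale read systematically biases the asynchronous field high. Broadly speaking, AT schemes absorb such delays by incorporating values from older time levels into extended stencils so that the formal order of accuracy is preserved.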

  10. Emergency Preparedness technology support to the Health and Safety Executive (HSE), Nuclear Installations Inspectorate (NII) of the United Kingdom. Appendix A

    DOE Office of Scientific and Technical Information (OSTI.GOV)

O'Kula, K. R.

    1994-03-01

    The Nuclear Installations Inspectorate (NII) of the United Kingdom (UK) suggested the use of an accident progression logic model method, developed by Westinghouse Savannah River Company (WSRC) and Science Applications International Corporation (SAIC) for K Reactor, to predict the magnitude and timing of radioactivity releases (the source term) based on an advanced logic model methodology. Predicted releases are output from the personal computer-based model in a level-of-confidence format. Additional technical discussions eventually led to a request from the NII to develop a proposal for assembling a similar technology to predict source terms for the UK's advanced gas-cooled reactor (AGR) type. To respond to this request, WSRC is submitting a proposal to provide contractual assistance as specified in the Scope of Work. The work will produce, document, and transfer technology associated with a Decision-Oriented Source Term Estimator for Emergency Preparedness (DOSE-EP) for the NII to apply to AGRs in the United Kingdom. This document, Appendix A, is part of that proposal.

  11. A Generalized Evolution Criterion in Nonequilibrium Convective Systems

    NASA Astrophysics Data System (ADS)

    Ichiyanagi, Masakazu; Nisizima, Kunisuke

    1989-04-01

    A general evolution criterion, applicable to transport processes such as heat conduction and mass diffusion, is obtained as a direct version of the Le Chatelier-Braun principle for stationary states. The present theory is not a radical departure from the conventional one. The generalized theory is made determinate by proposing balance equations for extensive thermodynamic variables that reflect the character of convective systems under the assumption of local equilibrium. As a consequence of introducing source terms into the balance equations, additional terms appear in the expression for the local entropy production that are bilinear in the intensive variables and the sources. In the present paper, we show that a dissipation function can be constructed for such general cases, subsuming the premises of the Glansdorff-Prigogine theory. The new dissipation function permits us to formulate a generalized evolution criterion for convective systems.

  12. Open source acceleration of wave optics simulations on energy efficient high-performance computing platforms

    NASA Astrophysics Data System (ADS)

    Beck, Jeffrey; Bos, Jeremy P.

    2017-05-01

    We compare several modifications to the open-source wave optics package WavePy intended to improve execution time. Specifically, we compare the relative performance of the Intel MKL, a CPU-based OpenCV distribution, and a GPU-based version. Performance is compared between distributions both on the same compute platform and between a fully featured computing workstation and the NVIDIA Jetson TX1 platform. Comparisons are drawn in terms of both execution time and power consumption. We have found that substituting the Fast Fourier Transform operation from OpenCV provides a marked improvement on all platforms. In addition, we show that embedded platforms offer some possibility for extensive improvement in terms of efficiency compared to a fully featured workstation.
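    The abstract does not show WavePy's internals; the following is a minimal, generic sketch of the kind of FFT-backend benchmark it describes, using NumPy and SciPy's pocketfft as stand-ins for the MKL/OpenCV/GPU backends compared in the paper. The field size and timing harness are illustrative assumptions.

```python
import timeit
import numpy as np
from scipy import fft as spfft

# 512x512 complex field standing in for a propagated wavefront.
rng = np.random.default_rng(0)
field = rng.standard_normal((512, 512)) + 1j * rng.standard_normal((512, 512))

def best_time(fn, repeats=5):
    """Best wall-clock time (s) over `repeats` single runs of fn(field)."""
    return min(timeit.repeat(lambda: fn(field), number=1, repeat=repeats))

t_np = best_time(np.fft.fft2)
t_sp = best_time(spfft.fft2)  # pocketfft backend; also accepts workers= for threading

# Backends must agree numerically before timings mean anything.
agree = np.allclose(np.fft.fft2(field), spfft.fft2(field))
print(f"numpy: {t_np:.4f} s, scipy: {t_sp:.4f} s, results agree: {agree}")
```

    The same harness extends naturally to power measurements by sampling a platform's power rail (as done on the Jetson TX1) around each timed call.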

  13. Nurses' Information Seeking Behavior for Clinical Practice: A Case Study in a Developing Country.

    PubMed

    Sarbaz, Masoumeh; Kimiafar, Khalil; Sheikhtaheri, Abbas; Taherzadeh, Zhila; Eslami, Saeid

    2016-01-01

    We used a validated questionnaire to survey Iranian nurses' information-seeking behavior and their confidence in different information sources. The most frequently used sources were the "Internet" and "personal experiences" (54.8% and 48.2%, respectively); "English medical journals" (61.9%) and "English textbooks" (41.3%) were the least frequently used. Nurses expressed high confidence in sources such as "international instructions/guidelines" (58.6%) and "English medical textbooks" (50.4%). The main reasons for selecting sources were easy accessibility, currency, and reliability. Google, PubMed, and UpToDate were the most used electronic sources. In addition, there were differences in the use of some of these resources by nurses' age and gender. In developing information sources for nurses, factors such as reliability, availability, and currency of resources should be emphasized.

  14. Additional Evidence for the Accuracy of Biographical Data: Long-Term Retest and Observer Ratings.

    ERIC Educational Resources Information Center

    Shaffer, Garnett Stokes; And Others

    1986-01-01

    Investigated accuracy of responses to biodata questionnaire using a test-retest design and informed external observers for verification. Responses from 237 subjects and 200 observers provided evidence that many responses to biodata questionnaire were accurate. Assessed sources of inaccuracy, including social desirability effects, and noted…

  15. 40 CFR 466.02 - General definitions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... EPA if the State does not have an approved program. (h) The term “precious metal” means gold, silver... STANDARDS PORCELAIN ENAMELING POINT SOURCE CATEGORY General Provisions § 466.02 General definitions. In addition to the definitions set forth in 40 CFR part 401, the following definitions apply to this part: (a...

  16. Final Technical Report - SciDAC Cooperative Agreement: Center for Wave Interactions with Magnetohydrodynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schnack, Dalton D.

    Final technical report for research performed by Dr. Thomas G. Jenkins in collaboration with Professor Dalton D. Schnack on SciDAC Cooperative Agreement: Center for Wave Interactions with Magnetohydrodynamics, DE-FC02-06ER54899, for the period 8/15/06 - 8/14/11. This report centers on the Slow MHD physics campaign work performed by Dr. Jenkins while at UW-Madison and then at Tech-X Corporation. To make progress on the problem of how RF-induced currents affect magnetic island evolution in toroidal plasmas, a set of research approaches is outlined. Three approaches can be pursued in parallel: (1) analytically prescribe an additional term in Ohm's law to model the effect of localized ECCD current drive; (2) introduce an additional evolution equation for the Ohm's law source term, establishing an RF source 'box' where information from the RF code couples to the fluid evolution; and (3) carry out a more rigorous analytic calculation treating the additional RF terms in a closure problem. These approaches rely on reinvigorating the computational modeling of resistive and neoclassical tearing modes with present-day versions of the numerical tools. For the RF community, the relevant action item is that RF ray-tracing codes need to be modified so that general three-dimensional spatial information can be obtained. Further, interface efforts between the two codes require work, as does an assessment of the numerical stability properties of the procedures to be used.

  17. Analysis of jet-airfoil interaction noise sources by using a microphone array technique

    NASA Astrophysics Data System (ADS)

    Fleury, Vincent; Davy, Renaud

    2016-03-01

    The paper is concerned with the characterization of jet noise sources and jet-airfoil interaction sources by using microphone array data. The measurements were carried out in the anechoic open-test-section wind tunnel of Onera, Cepra19. The microphone array technique relies on the convected Lighthill and Ffowcs Williams-Hawkings acoustic analogy equation. The cross-spectrum of the source term of the analogy equation is sought, defined as the optimal solution to a minimal-error equation using the measured microphone cross-spectra as reference. This inverse problem is ill-posed, however, so a penalty term based on a localization operator is added to improve the recovery of jet noise sources. The analysis of isolated jet noise data in the subsonic regime shows the contribution of the conventional mixing noise source in the low frequency range, as expected, and of uniformly distributed, uncorrelated noise sources in the jet flow at higher frequencies. In the underexpanded supersonic regime, a shock-associated noise source is clearly identified as well. An additional source is detected in the vicinity of the nozzle exit in both supersonic and subsonic regimes. In the presence of the airfoil, the distribution of the noise sources is deeply modified; in particular, a strong noise source is localized on the flap. For Strouhal numbers higher than about 2 (based on the jet mixing velocity and diameter), a significant contribution from the shear layer near the flap is also observed. Indications of acoustic reflections on the airfoil are also discerned.

  18. A modification of Einstein-Schrödinger theory that contains both general relativity and electrodynamics

    NASA Astrophysics Data System (ADS)

    Shifflett, J. A.

    2008-08-01

    We modify the Einstein-Schrödinger theory to include a cosmological constant Λ_z which multiplies the symmetric metric, and we show how the theory can be easily coupled to additional fields. The cosmological constant Λ_z is assumed to be nearly cancelled by Schrödinger's cosmological constant Λ_b which multiplies the nonsymmetric fundamental tensor, such that the total Λ = Λ_z + Λ_b matches measurement. The resulting theory becomes exactly Einstein-Maxwell theory in the limit as |Λ_z| → ∞. For |Λ_z| ~ 1/(Planck length)² the field equations match the ordinary Einstein and Maxwell equations except for extra terms which are < 10⁻¹⁶ of the usual terms for worst-case field strengths and rates of change accessible to measurement. Additional fields can be included in the Lagrangian, and these fields may couple to the symmetric metric and the electromagnetic vector potential, just as in Einstein-Maxwell theory. The ordinary Lorentz force equation is obtained by taking the divergence of the Einstein equations when sources are included. The Einstein-Infeld-Hoffmann (EIH) equations of motion match the equations of motion for Einstein-Maxwell theory to Newtonian/Coulombian order, which proves the existence of a Lorentz force without requiring sources. This fixes a problem of the original Einstein-Schrödinger theory, which failed to predict a Lorentz force. An exact charged solution matches the Reissner-Nordström solution except for additional terms which are ~10⁻⁶⁶ of the usual terms for worst-case radii accessible to measurement. An exact electromagnetic plane-wave solution is identical to its counterpart in Einstein-Maxwell theory.

  19. Microstructure of the combustion zone: Thin-binder AP-polymer sandwiches

    NASA Technical Reports Server (NTRS)

    Price, E. W.; Panyam, R. R.; Sigman, R. K.

    1980-01-01

    Experimental results are summarized for systematic quench-burning tests on ammonium perchlorate-HC binder sandwiches with binder thicknesses in the range 10 - 150 microns. Tests included three binders (polysulfide, polybutadiene-acrylonitrile, and hydroxy terminated polybutadiene), and pressures from 1.4 to 14 MPa. In addition, deflagration limits were determined in terms of binder thickness and pressure. Results are discussed in terms of a qualitative theory of sandwich burning consolidated from various sources. Some aspects of the observed results are explained only speculatively.

  20. Source terms, shielding calculations and soil activation for a medical cyclotron.

    PubMed

    Konheiser, J; Naumann, B; Ferrari, A; Brachem, C; Müller, S E

    2016-12-01

    Calculations of the shielding and estimates of soil activation for a medical cyclotron are presented in this work. Based on the neutron source term from the ¹⁸O(p,n)¹⁸F reaction produced by a 28 MeV proton beam, neutron and gamma dose rates outside the building were estimated with the Monte Carlo code MCNP6 (Goorley et al 2012 Nucl. Technol. 180 298-315). The neutron source term was calculated with the MCNP6 and FLUKA (Ferrari et al 2005 INFN/TC_05/11, SLAC-R-773) codes as well as from data supplied by the manufacturer. MCNP and FLUKA calculations yielded comparable results, while the neutron yield obtained using the manufacturer-supplied information is about a factor of 5 smaller. The difference is attributed to missing channels in the manufacturer-supplied neutron source term, which considers only the ¹⁸O(p,n)¹⁸F reaction, whereas the MCNP and FLUKA calculations include additional neutron reaction channels. Soil activation was calculated using the FLUKA code. The estimated dose rate based on MCNP6 calculations in the public area is about 0.035 µSv h⁻¹ and thus significantly below the reference value of 0.5 µSv h⁻¹ (2011 Strahlenschutzverordnung, 9. Auflage vom 01.11.2011, Bundesanzeiger Verlag). After 5 years of continuous beam operation and a subsequent decay time of 30 d, the activity concentration of the soil is about 0.34 Bq g⁻¹.

  1. OntoFox: web-based support for ontology reuse

    PubMed Central

    2010-01-01

    Background Ontology development is a rapidly growing area of research, especially in the life sciences domain. To promote collaboration and interoperability between different projects, the OBO Foundry principles require that these ontologies be open and non-redundant, avoiding duplication of terms through the re-use of existing resources. As current options to do so present various difficulties, a new approach, MIREOT, allows specifying import of single terms. Initial implementations allow for controlled import of selected annotations and certain classes of related terms. Findings OntoFox http://ontofox.hegroup.org/ is a web-based system that allows users to input terms, fetch selected properties, annotations, and certain classes of related terms from the source ontologies and save the results using the RDF/XML serialization of the Web Ontology Language (OWL). Compared to an initial implementation of MIREOT, OntoFox allows additional and more easily configurable options for selecting and rewriting annotation properties, and for inclusion of all or a computed subset of terms between low and top level terms. Additional methods for including related classes include a SPARQL-based ontology term retrieval algorithm that extracts terms related to a given set of signature terms and an option to extract the hierarchy rooted at a specified ontology term. OntoFox's output can be directly imported into a developer's ontology. OntoFox currently supports term retrieval from a selection of 15 ontologies accessible via SPARQL endpoints and allows users to extend this by specifying additional endpoints. An OntoFox application in the development of the Vaccine Ontology (VO) is demonstrated. Conclusions OntoFox provides a timely publicly available service, providing different options for users to collect terms from external ontologies, making them available for reuse by import into client OWL ontologies. PMID:20569493
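    One of OntoFox's options described above is extracting the hierarchy rooted at a specified ontology term. As an illustrative sketch (not OntoFox's actual implementation), the core operation is a breadth-first traversal of is-a edges; the term IDs below are invented stand-ins for rdfs:subClassOf triples fetched from a SPARQL endpoint.

```python
from collections import defaultdict, deque

# Toy is-a edges (child -> parent), standing in for rdfs:subClassOf triples;
# the term IDs are hypothetical, for illustration only.
IS_A = {
    "VO:0000002": "VO:0000001",   # live vaccine        is-a vaccine
    "VO:0000003": "VO:0000001",   # inactivated vaccine is-a vaccine
    "VO:0000010": "VO:0000002",   # BCG-like vaccine    is-a live vaccine
    "GO:0008150": None,           # unrelated root
}

def subtree(root):
    """Return all terms in the hierarchy rooted at `root` (BFS over is-a)."""
    children = defaultdict(list)
    for child, parent in IS_A.items():
        if parent is not None:
            children[parent].append(child)
    found, queue = {root}, deque([root])
    while queue:
        for c in children[queue.popleft()]:
            if c not in found:
                found.add(c)
                queue.append(c)
    return found

print(sorted(subtree("VO:0000001")))
# ['VO:0000001', 'VO:0000002', 'VO:0000003', 'VO:0000010']
```

    The extracted term set would then be serialized as RDF/XML OWL for import into the client ontology.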

  2. Biological and Health Effects of Electromagnetic (Nonionizing) Radiation. LC Science Tracer Bullet.

    ERIC Educational Resources Information Center

    Halasz, Hisako, Comp.

    The environment we live in today is filled with human-created electromagnetic fields generated by a variety of sources, including radio and television transmitters, power lines, and visual display terminals. (In addition, there exists a natural background of electromagnetic fields.) The term "electromagnetic pollution" is often used to…

  3. Evaluation of a Brief Homework Assignment Designed to Reduce Citation Problems

    ERIC Educational Resources Information Center

    Schuetze, Pamela

    2004-01-01

    I evaluated a brief homework assignment designed to reduce citation problems in research-based term papers. Students in 2 developmental psychology classes received a brief presentation and handout defining plagiarism with tips on how to cite sources to avoid plagiarizing. In addition, students in 1 class completed 2 brief homework assignments in…

  4. Source determination of benzotriazoles in sediment cores from two urban estuaries on the Atlantic Coast of the United States

    EPA Science Inventory

    Benzotriazoles (BZTs) are used in a broad range of commercial and industrial products, particularly as metal corrosion inhibitors and as ultraviolet (UV) light stabilizer additives in plastics and polymers. Their long-term usage and high production volumes have resulted in the r...

  5. Evaluation of a Consistent LES/PDF Method Using a Series of Experimental Spray Flames

    NASA Astrophysics Data System (ADS)

    Heye, Colin; Raman, Venkat

    2012-11-01

    A consistent method for the evolution of the joint-scalar probability density function (PDF) transport equation is proposed for application to large eddy simulation (LES) of turbulent reacting flows containing evaporating spray droplets. PDF transport equations provide the benefit of including the chemical source term in closed form; however, additional terms describing LES subfilter mixing must be modeled. Recently available detailed experimental measurements provide model validation data for the wide range of evaporation rates and combustion regimes that are well known to occur in spray flames. In this work, the experimental data will be used to investigate the impact of droplet mass loading and evaporation rates on the subfilter scalar PDF shape in comparison with conventional flamelet models. In addition, existing model term closures in the PDF transport equations are evaluated with a focus on their validity in the presence of regime changes.

  6. Algorithms and analytical solutions for rapidly approximating long-term dispersion from line and area sources

    NASA Astrophysics Data System (ADS)

    Barrett, Steven R. H.; Britter, Rex E.

    Predicting long-term mean pollutant concentrations in the vicinity of airports, roads and other industrial sources is frequently of concern in regulatory and public health contexts. Many emissions are represented geometrically as ground-level line or area sources. Well-developed modelling tools such as AERMOD and ADMS are able to model dispersion from finite (i.e. non-point) sources with considerable accuracy, drawing upon an up-to-date understanding of boundary layer behaviour. Due to mathematical difficulties associated with line and area sources, computationally expensive numerical integration schemes have been developed. For example, some models decompose area sources into a large number of line sources orthogonal to the mean wind direction, for which an analytical (Gaussian) solution exists. Models also employ a time-series approach, which involves computing mean pollutant concentrations for every hour over one or more years of meteorological data. This can give rise to computer runtimes of several days for the assessment of a site. While this may be acceptable for the assessment of a single industrial complex, airport, etc., this level of computational cost precludes national or international policy assessments at the level of detail available with dispersion modelling. In this paper, we extend previous work [S.R.H. Barrett, R.E. Britter, 2008. Development of algorithms and approximations for rapid operational air quality modelling. Atmospheric Environment 42 (2008) 8105-8111] to line and area sources. We introduce approximations which allow for the development of new analytical solutions for long-term mean dispersion from line and area sources, based on hypergeometric functions. We describe how these solutions can be parameterized from a single point-source run of an existing advanced dispersion model, thereby accounting for all processes modelled in the more costly algorithms.
The parameterization method combined with the analytical solutions for long-term mean dispersion are shown to produce results several orders of magnitude more efficiently with a loss of accuracy small compared to the absolute accuracy of advanced dispersion models near sources. The method can be readily incorporated into existing dispersion models, and may allow for additional computation time to be expended on modelling dispersion processes more accurately in future, rather than on accounting for source geometry.
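    The hypergeometric-function solutions of the paper are not reproduced in the abstract. The sketch below shows only the simpler, well-known building block the abstract mentions: integrating a ground-level Gaussian point-source solution across a crosswind line source collapses to a difference of error functions, which can be checked against direct numerical integration. All parameter values are illustrative.

```python
import numpy as np
from scipy.special import erf
from scipy.integrate import quad

def point_conc(Q, u, sy, sz, y):
    """Ground-level Gaussian point-source concentration (source and receptor
    at z = 0, full ground reflection), at crosswind offset y."""
    return Q / (np.pi * u * sy * sz) * np.exp(-y**2 / (2.0 * sy**2))

def line_conc_analytic(q, u, sy, sz, y, y1, y2):
    """Closed-form concentration from a crosswind line source spanning y1..y2
    with emission q per unit length: the crosswind integral of `point_conc`
    reduces to a difference of error functions."""
    d_erf = (erf((y2 - y) / (np.sqrt(2.0) * sy))
             - erf((y1 - y) / (np.sqrt(2.0) * sy)))
    return q * d_erf / (np.sqrt(2.0 * np.pi) * u * sz)

u, sy, sz = 4.0, 30.0, 15.0    # wind speed (m/s), dispersion parameters (m)
q, y1, y2 = 1.0, -50.0, 50.0   # line strength (g s^-1 m^-1) and extent (m)
y_rec = 10.0                   # receptor crosswind position (m)

c_closed = line_conc_analytic(q, u, sy, sz, y_rec, y1, y2)
c_numeric, _ = quad(lambda yp: point_conc(q, u, sy, sz, y_rec - yp), y1, y2)
print(c_closed, c_numeric)
```

    The paper's contribution is, in effect, finding analogous closed forms for the remaining integrals (over downwind distance and meteorology) that conventional models evaluate numerically hour by hour.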

  7. Software Tool for Researching Annotations of Proteins (STRAP): Open-Source Protein Annotation Software with Data Visualization

    PubMed Central

    Bhatia, Vivek N.; Perlman, David H.; Costello, Catherine E.; McComb, Mark E.

    2009-01-01

    In order that biological meaning may be derived and testable hypotheses may be built from proteomics experiments, assignments of proteins identified by mass spectrometry or other techniques must be supplemented with additional annotation, such as information on known protein functions, protein-protein interactions, or biological pathway associations. Collecting, organizing, and interpreting these data often requires the input of experts in the biological field of study, in addition to the time-consuming search for and compilation of information from online protein databases. Furthermore, visualizing this bulk of information can be challenging due to the limited availability of easy-to-use and freely available tools for this process. In response to these constraints, we have undertaken the design of software to automate annotation and visualization of proteomics data in order to accelerate the pace of research. Here we present the Software Tool for Researching Annotations of Proteins (STRAP), a user-friendly, open-source C# application. STRAP automatically obtains gene ontology (GO) terms associated with proteins in a proteomics results list using the freely accessible UniProtKB and EBI GOA databases. Summarized in an easy-to-navigate tabular format, STRAP includes meta-information on the protein in addition to complementary GO terminology. Additionally, this information can be edited by the user so that in-house expertise on particular proteins may be integrated into the larger dataset. STRAP provides a sortable tabular view for all terms, as well as graphical representations of GO-term association data in pie charts (biological process, cellular component and molecular function) and bar charts (cross-comparison of sample sets) to aid in the interpretation of large datasets and differential analysis experiments.
Furthermore, proteins of interest may be exported as a unique FASTA-formatted file to allow for customizable re-searching of mass spectrometry data, and gene names corresponding to the proteins in the lists may be encoded in the Gaggle microformat for further characterization, including pathway analysis. STRAP, a tutorial, and the C# source code are freely available from http://cpctools.sourceforge.net. PMID:19839595
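    The counts behind STRAP-style pie charts (one per GO aspect) and cross-sample bar charts reduce to a simple tally of terms per aspect. A minimal sketch, with invented protein accessions and (aspect, term) pairs standing in for what STRAP fetches from UniProtKB/EBI GOA:

```python
from collections import Counter

# Toy protein -> GO annotations; BP/CC/MF denote the three GO aspects
# (biological process, cellular component, molecular function).
annotations = {
    "P12345": [("BP", "glycolysis"), ("CC", "cytoplasm"), ("MF", "kinase activity")],
    "P67890": [("BP", "apoptosis"), ("CC", "cytoplasm"), ("MF", "ATP binding")],
    "Q11111": [("BP", "glycolysis"), ("CC", "nucleus")],
}

# Tally terms within each aspect; each Counter backs one pie chart.
by_aspect = {"BP": Counter(), "CC": Counter(), "MF": Counter()}
for terms in annotations.values():
    for aspect, term in terms:
        by_aspect[aspect][term] += 1

print(by_aspect["CC"].most_common())   # [('cytoplasm', 2), ('nucleus', 1)]
```

    Comparing two such tallies for different sample sets gives the data for the differential bar charts.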

  8. Dynamic power balance analysis in JET

    NASA Astrophysics Data System (ADS)

    Matthews, G. F.; Silburn, S. A.; Challis, C. D.; Eich, T.; Iglesias, D.; King, D.; Sieglin, B.; Contributors, JET

    2017-12-01

    The full-scale realisation of nuclear fusion as an energy source requires a detailed understanding of power and energy balance in current experimental devices. In this paper we explore whether a global power balance model, in which some of the calibration factors applied to the source or sink terms are fitted to the data, can provide insight into possible causes of discrepancies in power and energy balance seen in the JET tokamak. We show that the dynamics of the power balance can only be properly reproduced by including the changes in the thermal stored energy, which therefore provides an additional opportunity to cross-calibrate other terms in the power balance equation. Although the results are inconclusive with respect to the original goal of identifying the source of the discrepancies in the energy balance, we do find that with optimised parameters an extremely good prediction of the total power measured at the outer divertor target can be obtained over a wide range of pulses, with time resolution up to ∼25 ms.
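    The abstract does not give the fitted model; the sketch below shows the generic shape of such a fit on synthetic data: a target power modeled as calibrated input power minus the rate of change of stored energy, with the calibration factors recovered by linear least squares. Variable names and values are illustrative assumptions, not JET data.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 500
P_heat = rng.uniform(5.0, 30.0, n)    # MW, total input heating power (synthetic)
dWdt   = rng.uniform(-2.0, 2.0, n)    # MW, rate of change of thermal stored energy

# Synthetic "measured" divertor target power with true calibration factors + noise.
c_true = np.array([0.62, 1.0])
P_target = c_true[0] * P_heat - c_true[1] * dWdt + rng.normal(0.0, 0.3, n)

# Recover the calibration factors: P_target ~ c1*P_heat - c2*dW/dt.
A = np.column_stack([P_heat, -dWdt])
c_fit, *_ = np.linalg.lstsq(A, P_target, rcond=None)
print(c_fit)
```

    In the paper's terms, omitting the dW/dt column would leave the transient dynamics unexplained, which is why including the stored-energy term is essential to reproduce the power balance dynamics.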

  9. The influence of sulphate deposition on the seasonal variation of peat pore water methyl Hg in a boreal mire.

    PubMed

    Bergman, Inger; Bishop, Kevin; Tu, Qiang; Frech, Wolfgang; Åkerblom, Staffan; Nilsson, Mats

    2012-01-01

    In this paper we investigate the hypothesis that long-term sulphate (SO₄²⁻) deposition has made peatlands a larger source of methyl mercury (MeHg) to remote boreal lakes. This was done on experimental plots at a boreal, low-sedge mire where the effect of long-term addition of SO₄²⁻ on peat pore water MeHg concentrations was observed weekly throughout the snow-free portion of 1999. The additions of SO₄²⁻ started in 1995. The seasonal mean of the pore water MeHg concentrations on the plots with 17 kg ha⁻¹ yr⁻¹ of sulphur (S) addition (1.3±0.08 ng L⁻¹, SE; n = 44) was significantly (p<0.0001) higher than the mean MeHg concentration on the plots with 3 kg ha⁻¹ yr⁻¹ of ambient S deposition (0.6±0.02 ng L⁻¹, SE; n = 44). The temporal variation in pore water MeHg concentrations during the snow-free season was larger in the S-addition plots, with an amplitude of >2 ng L⁻¹ compared to ±0.5 ng L⁻¹ in the ambient S deposition plots. The concentrations of pore water MeHg in the S-addition plots were positively correlated (r² = 0.21; p = 0.001) to the groundwater level, with the lowest concentrations of MeHg during the period with the lowest groundwater levels. The pore water MeHg concentrations were not correlated to total Hg, DOC concentration or pH. The results from this study indicate that the persistently higher pore water concentrations of MeHg in the S-addition plots are caused by the long-term additions of SO₄²⁻ to the mire surface. Since these waters are an important source of runoff, the results support the hypothesis that SO₄²⁻ deposition has increased the contribution of peatlands to MeHg in downstream aquatic systems. This would mean that the increased deposition of SO₄²⁻ in acid rain has contributed to the modern increase in the MeHg burdens of remote lakes hydrologically connected to peatlands.

  10. Contrast and Assimilation Effects of Dimensional Comparisons in Five Subjects: An Extension of the I/E Model

    ERIC Educational Resources Information Center

    Jansen, Malte; Schroeders, Ulrich; Lüdtke, Oliver; Marsh, Herbert W.

    2015-01-01

    Students evaluate their achievement in a specific domain in relation to their achievement in other domains and form their self-concepts accordingly. These comparison processes have been termed "dimensional comparisons" and shown to be an important source of academic self-concepts in addition to social and temporal comparisons. Research…

  11. Possible source term of high concentrations of mecoprop-p in leachate and water quality: impact of climate change, public use and disposal.

    PubMed

    Idowu, I A; Alkhaddar, R M; Atherton, W

    2014-08-01

    Mecoprop-p herbicide is often found in wells and water abstractions in many areas around Europe, including the UK. There is growing environmental and public health concern about mecoprop-p pollution in ground and surface water in England. Reviews suggest that extensive work has been carried out on the contribution of mecoprop-p from agricultural use, whilst more work needs to be carried out on the contribution from non-agricultural use. The study covers two landfill sites in the Weaver/Gowy catchment. Mecoprop-p concentrations in the leachate range between 0.06 and 290 µg l⁻¹ in cells. The high concentration of mecoprop-p in the leachate suggests that there is a possible source term in the waste stream. This paper addresses the gap by exploring possible source terms of mecoprop-p contamination on landfill sites and evaluates the impact of public purchase, use and disposal, alongside climate change, on seasonal variations in mecoprop-p concentrations. Mecoprop-p was found to exceed the EU drinking water quality standards at the unsaturated zone/aquifer, with observed average concentrations ranging between 0.005 and 7.96 µg l⁻¹. A route map for mecoprop-p source term contamination is essential for mitigation and pollution management, with emphasis on both consumer and producer responsibility towards the use of mecoprop-p products. In addition, improvements in data collection on mecoprop-p concentrations and detailed seasonal herbicide sales for non-agricultural purposes are needed to inform the analysis and decision process.

  12. Surfzone alongshore advective accelerations: observations and modeling

    NASA Astrophysics Data System (ADS)

    Hansen, J.; Raubenheimer, B.; Elgar, S.

    2014-12-01

    The sources, magnitudes, and impacts of non-linear advective accelerations on alongshore surfzone currents are investigated with observations and a numerical model. Previous numerical modeling results have indicated that advective accelerations are an important contribution to the alongshore force balance, and are required to understand spatial variations in alongshore currents (which may result in spatially variable morphological change). However, most prior observational studies have neglected advective accelerations in the alongshore force balance. Using a numerical model (Delft3D) to predict optimal sensor locations, a dense array of 26 colocated current meters and pressure sensors was deployed between the shoreline and 3-m water depth over a 200 by 115 m region near Duck, NC in fall 2013. The array included 7 cross- and 3 alongshore transects. Here, observational and numerical estimates of the dominant forcing terms in the alongshore balance (pressure and radiation-stress gradients) and the advective acceleration terms will be compared with each other. In addition, the numerical model will be used to examine the force balance, including sources of velocity gradients, at a higher spatial resolution than possible with the instrument array. Preliminary numerical results indicate that at O(10-100 m) alongshore scales, bathymetric variations and the ensuing alongshore variations in the wave field and subsequent forcing are the dominant sources of the modeled velocity gradients and advective accelerations. Additional simulations and analysis of the observations will be presented. Funded by NSF and ASDR&E.

  13. Weak unique continuation property and a related inverse source problem for time-fractional diffusion-advection equations

    NASA Astrophysics Data System (ADS)

    Jiang, Daijun; Li, Zhiyuan; Liu, Yikan; Yamamoto, Masahiro

    2017-05-01

    In this paper, we first establish a weak unique continuation property for time-fractional diffusion-advection equations. The proof is mainly based on the Laplace transform and the unique continuation properties for elliptic and parabolic equations. The result is weaker than its parabolic counterpart in the sense that we additionally impose the homogeneous boundary condition. As a direct application, we prove the uniqueness for an inverse problem on determining the spatial component in the source term by interior measurements. Numerically, we reformulate our inverse source problem as an optimization problem, and propose an iterative thresholding algorithm. Finally, several numerical experiments are presented to show the accuracy and efficiency of the algorithm.
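    The abstract names an iterative thresholding algorithm without detailing it; a standard instance is iterative soft-thresholding (ISTA) for a sparsity-regularized linear inverse problem. The sketch below recovers a sparse spatial source from a generic linear forward map; the random matrix, dimensions, and regularization weight are illustrative assumptions, not the paper's discretized diffusion-advection operator.

```python
import numpy as np

rng = np.random.default_rng(7)
m, n, k = 80, 120, 5
A = rng.standard_normal((m, n)) / np.sqrt(m)   # stand-in forward operator
f_true = np.zeros(n)                            # sparse spatial source component
f_true[rng.choice(n, k, replace=False)] = rng.choice([-1.0, 1.0], k)
g = A @ f_true                                  # noiseless interior measurements

lam = 1e-3                                      # sparsity (l1) weight
step = 1.0 / np.linalg.norm(A, 2) ** 2          # stable gradient step size

def soft(x, t):
    """Soft-thresholding: the proximal map of t*||x||_1."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

f = np.zeros(n)
for _ in range(2000):
    # Gradient step on the data misfit, then shrink toward sparsity.
    f = soft(f - step * A.T @ (A @ f - g), step * lam)

rel_err = np.linalg.norm(f - f_true) / np.linalg.norm(f_true)
print(rel_err)
```

    The uniqueness result quoted in the abstract is what justifies seeking a single minimizer in the first place; the thresholding iteration is one practical way to compute it.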

  14. Coarse Grid CFD for underresolved simulation

    NASA Astrophysics Data System (ADS)

    Class, Andreas G.; Viellieber, Mathias O.; Himmel, Steffen R.

    2010-11-01

    CFD simulation of the complete reactor core of a nuclear power plant requires exceedingly large computational resources, so this brute-force approach has not been pursued yet. The traditional approach is 1D subchannel analysis employing calibrated transport models. Coarse Grid CFD is an attractive alternative technique based on strongly under-resolved CFD and the inviscid Euler equations. Obviously, using inviscid equations and coarse grids does not resolve all the physics, requiring additional volumetric source terms modelling viscosity and other sub-grid effects. The source terms are implemented via correlations derived from fully resolved representative simulations, which can be tabulated or computed on the fly. The technique is demonstrated for a Carnot diffusor and a wire-wrap fuel assembly [1]. [1] Himmel, S.R., PhD thesis, Stuttgart University, Germany, 2009, http://bibliothek.fzk.de/zb/berichte/FZKA7468.pdf
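    A minimal sketch of the tabulated-correlation idea: a volumetric momentum sink for one coarse cell is tabulated as a function of bulk velocity from a (hypothetical) fully resolved reference run, and the coarse solver interpolates it on the fly. The table values and log-log interpolation are illustrative assumptions.

```python
import numpy as np

# Hypothetical tabulated correlation from a fully resolved reference run:
# bulk velocity (m/s) -> volumetric momentum sink (N/m^3) for one coarse cell.
u_table = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
s_table = np.array([12.0, 45.0, 170.0, 640.0, 2400.0])

def subgrid_source(u):
    """Look up the unresolved-physics source term for the coarse solver,
    interpolating the tabulated correlation in log-log space."""
    return np.exp(np.interp(np.log(u), np.log(u_table), np.log(s_table)))

# The coarse inviscid momentum equation then carries this extra sink, e.g.
#   d(rho*u)/dt + div(rho*u*u) + grad(p) = -S(u)
print(subgrid_source(1.0), subgrid_source(3.0))
```

    In practice the correlation would depend on more than the bulk velocity (e.g. local geometry and flow direction in a wire-wrap bundle), but the lookup-and-apply structure is the same.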

  15. Quantum connectivity optimization algorithms for entanglement source deployment in a quantum multi-hop network

    NASA Astrophysics Data System (ADS)

    Zou, Zhen-Zhen; Yu, Xu-Tao; Zhang, Zai-Chen

    2018-04-01

    The entanglement source deployment problem, which has a significant influence on quantum connectivity, is studied in a quantum multi-hop network. Two optimization algorithms with limited entanglement sources are introduced in this paper. A deployment algorithm based on node position (DNP) improves connectivity by guaranteeing that all overlapping areas of the distribution ranges of the entanglement sources contain nodes. In addition, a deployment algorithm based on an improved genetic algorithm (DIGA) is implemented by dividing the region into grids. From the simulation results, DNP and DIGA improve quantum connectivity by 213.73% and 248.83%, respectively, compared to random deployment, and the latter performs better in terms of connectivity. However, DNP is more flexible and adaptive to change, as it stops running when all nodes are covered.
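    The abstract does not specify DIGA's encoding or operators; the sketch below is a generic grid-based genetic algorithm for the same kind of problem: place a limited number of sources on grid cells to maximize the fraction of nodes within source range. All parameters (grid, radius, population size, rates) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
nodes = rng.uniform(0.0, 10.0, size=(60, 2))    # quantum node positions
GRID = [(x + 0.5, y + 0.5) for x in range(10) for y in range(10)]  # cell centers
S, R = 4, 2.5                                    # sources available, coverage radius

def coverage(cells):
    """Fitness: fraction of nodes within range R of at least one source."""
    pos = np.array([GRID[int(c)] for c in cells])
    d = np.linalg.norm(nodes[:, None, :] - pos[None, :, :], axis=2)
    return float(np.mean(d.min(axis=1) <= R))

pop = rng.integers(0, len(GRID), size=(30, S))   # chromosomes = grid-cell indices
best_initial = max(coverage(ind) for ind in pop)

for _ in range(60):
    fit = np.array([coverage(ind) for ind in pop])
    elite = pop[np.argsort(fit)[::-1][:6]]       # elitism keeps the best intact
    kids = []
    while len(kids) < len(pop) - len(elite):
        p1, p2 = elite[rng.integers(0, len(elite), 2)]
        cut = int(rng.integers(1, S))            # one-point crossover
        child = np.concatenate([p1[:cut], p2[cut:]])
        if rng.random() < 0.3:                   # mutation: relocate one source
            child[int(rng.integers(0, S))] = rng.integers(0, len(GRID))
        kids.append(child)
    pop = np.vstack([elite] + kids)

best_final = max(coverage(ind) for ind in pop)
print(best_initial, best_final)
```

    With elitism the best fitness is monotonically non-decreasing, which is the property the simulation comparison against random deployment relies on.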

  16. Effect of Dairy Proteins on Appetite, Energy Expenditure, Body Weight, and Composition: a Review of the Evidence from Controlled Clinical Trials1

    PubMed Central

    Bendtsen, Line Q.; Lorenzen, Janne K.; Bendsen, Nathalie T.; Rasmussen, Charlotte; Astrup, Arne

    2013-01-01

    Evidence supports that a high proportion of calories from protein increases weight loss and prevents weight (re)gain. Proteins are known to induce satiety, increase secretion of gastrointestinal hormones, and increase diet-induced thermogenesis, but less is known about whether various types of proteins exert different metabolic effects. In the Western world, dairy protein, which consists of 80% casein and 20% whey, is a large contributor to our daily protein intake. Casein and whey differ in absorption and digestion rates, with casein being a "slow" protein and whey being a "fast" protein. In addition, they differ in amino acid composition. This review examines whether casein, whey, and other protein sources exert different metabolic effects and aims to clarify the underlying mechanisms. Data indicate that whey is more satiating in the short term, whereas casein is more satiating in the long term. In addition, some studies indicate that whey stimulates the secretion of the incretin hormones glucagon-like peptide-1 and glucose-dependent insulinotropic polypeptide more than other proteins. However, for the satiety (cholecystokinin and peptide YY) and hunger-stimulating (ghrelin) hormones, no clear evidence exists that 1 protein source has a greater stimulating effect compared with others. Likewise, no clear evidence exists that 1 protein source results in higher diet-induced thermogenesis and promotes more beneficial changes in body weight and composition compared with other protein sources. However, data indicate that amino acid composition, rate of absorption, and protein/food texture may be important factors for protein-stimulated metabolic effects. PMID:23858091

  17. Radionuclides in the Arctic seas from the former Soviet Union: Potential health and ecological risks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Layton, D W; Edson, R; Varela, M

    1999-11-15

    The primary goal of the assessment reported here is to evaluate the health and environmental threat to coastal Alaska posed by radioactive-waste dumping in the Arctic and Northwest Pacific Oceans by the FSU. In particular, the FSU discarded 16 nuclear reactors from submarines and an icebreaker in the Kara Sea near the island of Novaya Zemlya, of which 6 contained spent nuclear fuel (SNF); disposed of liquid and solid wastes in the Sea of Japan; lost a 90Sr-powered radioisotope thermoelectric generator at sea in the Sea of Okhotsk; and disposed of liquid wastes at several sites in the Pacific Ocean, east of the Kamchatka Peninsula. In addition to these known sources in the oceans, the RAIG evaluated FSU waste-disposal practices at inland weapons-development sites that have contaminated major rivers flowing into the Arctic Ocean. The RAIG evaluated these sources for the potential for release to the environment, transport, and impact to Alaskan ecosystems and peoples through a variety of scenarios, including a worst-case total instantaneous and simultaneous release of the sources under investigation. The risk-assessment process described in this report is applicable to and can be used by other circumpolar countries, with the addition of information about specific ecosystems and human lifestyles. They can use the ANWAP risk-assessment framework and approach used by ONR to establish potential doses for Alaska, but add their own specific data sets about human and ecological factors. The ANWAP risk assessment addresses the following Russian wastes, media, and receptors: dumped nuclear submarines and icebreaker in Kara Sea--marine pathways; solid reactor parts in Sea of Japan and Pacific Ocean--marine pathways; thermoelectric generator in Sea of Okhotsk--marine pathways; current known aqueous wastes in Mayak reservoirs and Asanov Marshes--riverine to marine pathways; and Alaska as receptor.
For these waste and source terms addressed, other pathways, such as atmospheric transport, could be considered under future-funded research efforts for impacts to Alaska. The ANWAP risk assessment does not address the following wastes, media, and receptors: radioactive sources in Alaska (except to add perspective for the Russian source term); radioactive wastes associated with Russian naval military operations and decommissioning; Russian production reactor and spent-fuel reprocessing facilities nonaqueous source terms; atmospheric, terrestrial and nonaqueous pathways; and dose calculations for any circumpolar locality other than Alaska. These other, potentially serious sources of radioactivity to the Arctic environment, while outside the scope of the current ANWAP mandate, should be considered for future-funded research efforts.

  18. Porous elastic system with nonlinear damping and sources terms

    NASA Astrophysics Data System (ADS)

    Freitas, Mirelson M.; Santos, M. L.; Langa, José A.

    2018-02-01

    We study the long-time behavior of a porous-elastic system, focusing on the interplay between nonlinear damping and source terms. The sources may represent restoring forces, but may also be focusing, thus potentially amplifying the total energy, which is the primary scenario of interest. By employing nonlinear semigroups and the theory of monotone operators, we obtain several results on the existence of local and global weak solutions and on the uniqueness of weak solutions. Moreover, we prove that such unique solutions depend continuously on the initial data. Under some restrictions on the parameters, we also prove that every weak solution to our system blows up in finite time, provided the initial energy is negative and the sources are more dominant than the damping in the system. Additional results are obtained via careful analysis involving the Nehari manifold. Specifically, we prove the existence of a unique global weak solution with initial data coming from the "good" part of the potential well. For such a global solution, we prove that the total energy of the system decays exponentially or algebraically, depending on the behavior of the dissipation in the system near the origin. We also prove the existence of a global attractor.

  19. Stable source reconstruction from a finite number of measurements in the multi-frequency inverse source problem

    NASA Astrophysics Data System (ADS)

    Karamehmedović, Mirza; Kirkeby, Adrian; Knudsen, Kim

    2018-06-01

    We consider the multi-frequency inverse source problem for the scalar Helmholtz equation in the plane. The goal is to reconstruct the source term in the equation from measurements of the solution on a surface outside the support of the source. We study the problem in a certain finite dimensional setting: from measurements made at a finite set of frequencies we uniquely determine and reconstruct sources in a subspace spanned by finitely many Fourier–Bessel functions. Further, we obtain a constructive criterion for identifying a minimal set of measurement frequencies sufficient for reconstruction, and under an additional, mild assumption, the reconstruction method is shown to be stable. Our analysis is based on a singular value decomposition of the source-to-measurement forward operators and the distribution of positive zeros of the Bessel functions of the first kind. The reconstruction method is implemented numerically and our theoretical findings are supported by numerical experiments.
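
    The positive zeros of the Bessel functions of the first kind figure directly in the frequency-selection criterion. As a self-contained illustration (not the authors' code), the sketch below computes the first positive zeros of J0 from its power series by bracketing and bisection.

```python
import math

def j0(x):
    """Bessel function J0 via its power series (adequate for moderate x)."""
    term, total = 1.0, 1.0
    for k in range(1, 60):
        term *= -(x * x) / (4.0 * k * k)
        total += term
    return total

def j0_zeros(count):
    """First `count` positive zeros of J0 by bisection, using the fact
    that consecutive zeros are roughly pi apart."""
    zeros, a = [], 2.0  # J0 > 0 on (0, ~2.405)
    while len(zeros) < count:
        b = a + math.pi
        if j0(a) * j0(b) < 0:
            lo, hi = a, b
            for _ in range(60):
                mid = 0.5 * (lo + hi)
                if j0(lo) * j0(mid) <= 0:
                    hi = mid
                else:
                    lo = mid
            zeros.append(0.5 * (lo + hi))
            a = zeros[-1] + 0.5
        else:
            a = b
    return zeros

print([round(z, 4) for z in j0_zeros(3)])  # first zeros near 2.4048, 5.5201, 8.6537
```

    In practice one would reach for a library routine (e.g. SciPy's `jn_zeros`); the point here is only that the zeros driving the frequency criterion are cheap to enumerate.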

  20. The Denver Aerosol Sources and Health (DASH) Study: Overview and Early Findings

    PubMed Central

    Vedal, S.; Hannigan, M.P.; Dutton, S.J.; Miller, S. L.; Milford, J.B.; Rabinovitch, N.; Kim, S.-Y.; Sheppard, L.

    2012-01-01

    Improved understanding of the sources of air pollution that are most harmful could aid in developing more effective measures for protecting human health. The Denver Aerosol Sources and Health (DASH) study was designed to identify the sources of ambient fine particulate matter (PM2.5) that are most responsible for the adverse health effects of short-term exposure to PM 2.5. Daily 24-hour PM2.5 sampling began in July 2002 at a residential monitoring site in Denver, Colorado, using both Teflon and quartz filter samplers. Sampling is planned to continue through 2008. Chemical speciation is being carried out for mass, inorganic ionic compounds (sulfate, nitrate and ammonium), and carbonaceous components, including elemental carbon, organic carbon, temperature-resolved organic carbon fractions and a large array of organic compounds. In addition, water soluble metals were measured daily for 12 months in 2003. A receptor-based source apportionment approach utilizing positive matrix factorization (PMF) will be used to identify PM 2.5 source contributions for each 24-hour period. Based on a preliminary assessment using synthetic data, the proposed source apportionment should be able to identify many important sources on a daily basis, including secondary ammonium nitrate and ammonium sulfate, diesel vehicle exhaust, road dust, wood combustion and vegetative debris. Meat cooking, gasoline vehicle exhaust and natural gas combustion were more challenging for PMF to accurately identify due to high detection limits for certain organic molecular marker compounds. Measurements of these compounds are being improved and supplemented with additional organic molecular marker compounds. The health study will investigate associations between daily source contributions and an array of health endpoints, including daily mortality and hospitalizations and measures of asthma control in asthmatic children. 
Findings from the DASH study, in addition to being of interest to policymakers, may provide insights into mechanisms of PM effects by identifying harmful PM2.5 sources. PMID:22723735
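
    PMF is a weighted, uncertainty-aware factorization; as a rough unweighted stand-in, the sketch below factors a toy "speciated PM2.5" matrix with Lee-Seung multiplicative updates. All numbers, and the use of plain NMF in place of PMF, are illustrative assumptions.

```python
import random

def nmf(V, k, iters=400, seed=0):
    """Tiny non-negative matrix factorization (Lee-Seung multiplicative
    updates); PMF as used in DASH is a weighted cousin of this."""
    rng = random.Random(seed)
    m, n = len(V), len(V[0])
    W = [[rng.random() + 0.1 for _ in range(k)] for _ in range(m)]
    H = [[rng.random() + 0.1 for _ in range(n)] for _ in range(k)]

    def product():
        return [[sum(W[i][r] * H[r][j] for r in range(k)) for j in range(n)]
                for i in range(m)]

    for _ in range(iters):
        WH = product()
        for r in range(k):           # update source profiles H
            for j in range(n):
                num = sum(W[i][r] * V[i][j] for i in range(m))
                den = sum(W[i][r] * WH[i][j] for i in range(m)) + 1e-12
                H[r][j] *= num / den
        WH = product()
        for i in range(m):           # update daily source contributions W
            for r in range(k):
                num = sum(H[r][j] * V[i][j] for j in range(n))
                den = sum(H[r][j] * WH[i][j] for j in range(n)) + 1e-12
                W[i][r] *= num / den
    return W, H

# toy matrix: four daily samples mixing two invented source profiles
V = [[2, 1, 0], [4, 2, 0], [0, 1, 3], [2, 2, 3]]
W, H = nmf(V, k=2)
err = sum((V[i][j] - sum(W[i][r] * H[r][j] for r in range(2))) ** 2
          for i in range(4) for j in range(3))
print(round(err, 4))
```

    The non-negativity constraint is what makes the recovered factors interpretable as source profiles and contributions.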

  1. The Denver Aerosol Sources and Health (DASH) study: Overview and early findings

    NASA Astrophysics Data System (ADS)

    Vedal, S.; Hannigan, M. P.; Dutton, S. J.; Miller, S. L.; Milford, J. B.; Rabinovitch, N.; Kim, S.-Y.; Sheppard, L.

    Improved understanding of the sources of air pollution that are most harmful could aid in developing more effective measures for protecting human health. The Denver Aerosol Sources and Health (DASH) study was designed to identify the sources of ambient fine particulate matter (PM 2.5) that are most responsible for the adverse health effects of short-term exposure to PM 2.5. Daily 24-h PM 2.5 sampling began in July 2002 at a residential monitoring site in Denver, Colorado, using both Teflon and quartz filter samplers. Sampling is planned to continue through 2008. Chemical speciation is being carried out for mass, inorganic ionic compounds (sulfate, nitrate and ammonium), and carbonaceous components, including elemental carbon, organic carbon, temperature-resolved organic carbon fractions and a large array of organic compounds. In addition, water-soluble metals were measured daily for 12 months in 2003. A receptor-based source apportionment approach utilizing positive matrix factorization (PMF) will be used to identify PM 2.5 source contributions for each 24-h period. Based on a preliminary assessment using synthetic data, the proposed source apportionment should be able to identify many important sources on a daily basis, including secondary ammonium nitrate and ammonium sulfate, diesel vehicle exhaust, road dust, wood combustion and vegetative debris. Meat cooking, gasoline vehicle exhaust and natural gas combustion were more challenging for PMF to accurately identify due to high detection limits for certain organic molecular marker compounds. Measurements of these compounds are being improved and supplemented with additional organic molecular marker compounds. The health study will investigate associations between daily source contributions and an array of health endpoints, including daily mortality and hospitalizations and measures of asthma control in asthmatic children. 
Findings from the DASH study, in addition to being of interest to policymakers, may provide insights into mechanisms of PM effects by identifying harmful PM 2.5 sources.

  2. An Ultradeep Chandra Catalog of X-Ray Point Sources in the Galactic Center Star Cluster

    NASA Astrophysics Data System (ADS)

    Zhu, Zhenlin; Li, Zhiyuan; Morris, Mark R.

    2018-04-01

    We present an updated catalog of X-ray point sources in the inner 500″ (∼20 pc) of the Galactic center (GC), where the nuclear star cluster (NSC) stands, based on a total of ∼4.5 Ms of Chandra observations taken from 1999 September to 2013 April. This ultradeep data set offers unprecedented sensitivity for detecting X-ray sources in the GC, down to an intrinsic 2–10 keV luminosity of 1.0 × 1031 erg s‑1. A total of 3619 sources are detected in the 2–8 keV band, among which ∼3500 are probable GC sources and ∼1300 are new identifications. The GC sources collectively account for ∼20% of the total 2–8 keV flux from the inner 250″ region where detection sensitivity is the greatest. Taking advantage of this unprecedented sample of faint X-ray sources that primarily traces the old stellar populations in the NSC, we revisit global source properties, including long-term variability, cumulative spectra, luminosity function, and spatial distribution. Based on the equivalent width and relative strength of the iron lines, we suggest that in addition to the arguably predominant population of magnetic cataclysmic variables (CVs), nonmagnetic CVs contribute substantially to the detected sources, especially in the lower-luminosity group. On the other hand, the X-ray sources have a radial distribution closely following the stellar mass distribution in the NSC, but much flatter than that of the known X-ray transients, which are presumably low-mass X-ray binaries (LMXBs) caught in outburst. This, together with the very modest long-term variability of the detected sources, strongly suggests that quiescent LMXBs are a minor (less than a few percent) population.

  3. The future of meat: a qualitative analysis of cultured meat media coverage.

    PubMed

    Goodwin, J N; Shoulders, C W

    2013-11-01

    This study sought to explore the informational themes and information sources cited by the media to cover stories of cultured meat in both the United States and the European Union. The results indicated that cultured meat news articles in both the United States and the European Union commonly discuss cultured meat in terms of benefits, history, process, time, livestock production problems, and skepticism. Additionally, the information sources commonly cited in the articles included cultured meat researchers, sources from academia, People for the Ethical Treatment of Animals (PETA), New Harvest, Winston Churchill, restaurant owners/chefs, and sources from the opposing countries (e.g., US articles cited some EU sources and vice versa). The findings of this study will allow meat scientists to understand how the media are influencing consumers' perceptions of the topic, and also allow them to strategize how to shape future communication about cultured meat. Published by Elsevier Ltd.

  4. Conglomeration or Chameleon? Teachers' Representations of Language in the Assessment of Learners with English as an Additional Language.

    ERIC Educational Resources Information Center

    Gardner, Sheena; Rea-Dickins, Pauline

    2001-01-01

    Investigates teacher representations of language in relation to assessment contexts. Analyzes not only what is represented in teachers' use of metalanguage, but also how it is presented--in terms of expression, voice, and source. The analysis is based on interviews with teachers, transcripts of lessons, and classroom-based assessments, formal…

  5. Department of Defense Partners in Flight Strategic Plan

    DTIC Science & Technology

    2004-07-28

    forests are private industrial timberlands and often are heavily fragmented. Reconciliation of the need for long-term, sustainable timber production...habitats have been set aside as protected areas or incorporated into the existing Parque Nacional Soberania. In addition, the upper Panama Bay...lines present another source of potential mortality, especially for raptors in western states. Several raptor conservation organizations and industry

  6. Mitigating climate change through small-scale forestry in the USA: opportunities and challenges

    Treesearch

    Susan Charnley; David Diaz; Hannah Gosnell

    2010-01-01

    Forest management for carbon sequestration is a low-cost, low-technology, relatively easy way to help mitigate global climate change that can be adopted now while additional long-term solutions are developed. Carbon-oriented management of forests also offers forest owners an opportunity to obtain a new source of income, and commonly has environmental co-benefits. The...

  7. Improvements to Passive Acoustic Tracking Methods for Marine Mammal Monitoring

    DTIC Science & Technology

    2016-05-02

    individual animals. 15. SUBJECT TERMS: Marine mammal; Passive acoustic monitoring; Localization; Tracking; Multiple source; Sparse array 16. SECURITY...al. 2004; Thode 2005; Nosal 2007] to localize animals in situations where straight-line propagation assumptions made by conventional marine mammal...Objective 1: Invert for sound speed profiles, hydrophone position and hydrophone timing offset in addition to animal position. Almost all marine mammal

  8. 75 FR 48743 - Mandatory Reporting of Greenhouse Gases

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-11

    ...EPA is proposing to amend specific provisions in the GHG reporting rule to clarify certain provisions, to correct technical and editorial errors, and to address certain questions and issues that have arisen since promulgation. These proposed changes include providing additional information and clarity on existing requirements, allowing greater flexibility or simplified calculation methods for certain sources in a facility, amending data reporting requirements to provide additional clarity on when different types of GHG emissions need to be calculated and reported, clarifying terms and definitions in certain equations, and technical corrections.

  9. 75 FR 79091 - Mandatory Reporting of Greenhouse Gases

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-17

    ...EPA is amending specific provisions in the greenhouse gas reporting rule to clarify certain provisions, to correct technical and editorial errors, and to address certain questions and issues that have arisen since promulgation. These final changes include generally providing additional information and clarity on existing requirements, allowing greater flexibility or simplified calculation methods for certain sources, amending data reporting requirements to provide additional clarity on when different types of greenhouse gas emissions need to be calculated and reported, clarifying terms and definitions in certain equations and other technical corrections and amendments.

  10. OntoBrowser: a collaborative tool for curation of ontologies by subject matter experts.

    PubMed

    Ravagli, Carlo; Pognan, Francois; Marc, Philippe

    2017-01-01

    The lack of controlled terminology and ontology usage leads to incomplete search results and poor interoperability between databases. One of the major underlying challenges of data integration is curating data to adhere to controlled terminologies and/or ontologies. Finding subject matter experts with the time and skills required to perform data curation is often problematic. In addition, existing tools are not designed for continuous data integration and collaborative curation. This results in time-consuming curation workflows that often become unsustainable. The primary objective of OntoBrowser is to provide an easy-to-use online collaborative solution for subject matter experts to map reported terms to preferred ontology (or code list) terms and facilitate ontology evolution. Additional features include web service access to data, visualization of ontologies in hierarchical/graph format and a peer review/approval workflow with alerting. The source code is freely available under the Apache v2.0 license. Source code and installation instructions are available at http://opensource.nibr.com. This software is designed to run on a Java EE application server and store data in a relational database. Contact: philippe.marc@novartis.com. © The Author 2016. Published by Oxford University Press.
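
    The core mapping task can be sketched in a few lines. The terms, synonyms, and the "ONT:0001" code below are invented examples, not OntoBrowser's actual data model, which also supports curation workflows and approvals.

```python
# Minimal stand-in for a reported-term -> preferred-term mapping:
# verbatim terms are normalized, routed through a synonym table,
# then resolved to a preferred ontology code.
PREFERRED = {"hepatocellular necrosis": "ONT:0001"}  # hypothetical code list
SYNONYMS = {
    "liver cell necrosis": "hepatocellular necrosis",
    "necrosis, hepatocellular": "hepatocellular necrosis",
}

def map_reported_term(term):
    """Return the preferred ontology code for a verbatim reported term,
    or None when no curated mapping exists yet."""
    key = term.strip().lower()
    canonical = SYNONYMS.get(key, key)
    return PREFERRED.get(canonical)

print(map_reported_term("Liver cell necrosis"))
```

    Terms that resolve to None are exactly the ones a curation tool would queue for a subject matter expert to map.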

  11. OntoBrowser: a collaborative tool for curation of ontologies by subject matter experts

    PubMed Central

    Ravagli, Carlo; Pognan, Francois

    2017-01-01

    Summary: The lack of controlled terminology and ontology usage leads to incomplete search results and poor interoperability between databases. One of the major underlying challenges of data integration is curating data to adhere to controlled terminologies and/or ontologies. Finding subject matter experts with the time and skills required to perform data curation is often problematic. In addition, existing tools are not designed for continuous data integration and collaborative curation. This results in time-consuming curation workflows that often become unsustainable. The primary objective of OntoBrowser is to provide an easy-to-use online collaborative solution for subject matter experts to map reported terms to preferred ontology (or code list) terms and facilitate ontology evolution. Additional features include web service access to data, visualization of ontologies in hierarchical/graph format and a peer review/approval workflow with alerting. Availability and implementation: The source code is freely available under the Apache v2.0 license. Source code and installation instructions are available at http://opensource.nibr.com. This software is designed to run on a Java EE application server and store data in a relational database. Contact: philippe.marc@novartis.com PMID:27605099

  12. Part 1 of a Computational Study of a Drop-Laden Mixing Layer

    NASA Technical Reports Server (NTRS)

    Okong'o, Nora A.; Bellan, Josette

    2004-01-01

    This first of three reports on a computational study of a drop-laden temporal mixing layer presents the results of direct numerical simulations (DNS) of well-resolved flow fields and the derivation of the large-eddy simulation (LES) equations that would govern the larger scales of a turbulent flow field. The mixing layer consisted of two counterflowing gas streams, one of which was initially laden with evaporating liquid drops. The gas phase was composed of two perfect gas species, the carrier gas and the vapor emanating from the drops, and was computed in an Eulerian reference frame, whereas each drop was tracked individually in a Lagrangian manner. The flow perturbations that were initially imposed on the layer caused mixing and eventual transition to turbulence. The DNS database obtained included transitional states for layers with various liquid mass loadings. For the DNS, the gas-phase equations were the compressible Navier-Stokes equations for conservation of momentum and additional conservation equations for total energy and species mass. These equations included source terms representing the effect of the drops on the mass, momentum, and energy of the gas phase. From the DNS equations, the expression for the irreversible entropy production (dissipation) was derived and used to determine the dissipation due to the source terms. The LES equations were derived by spatially filtering the DNS set and the magnitudes of the terms were computed at transitional states, leading to a hierarchy of terms to guide simplification of the LES equations. It was concluded that effort should be devoted to the accurate modeling of both the subgridscale fluxes and the filtered source terms, which were the dominant unclosed terms appearing in the LES equations.
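
    The spatial filtering that produces the LES equations from the DNS set can be illustrated on a toy 1-D periodic field. The box filter below is a generic sketch (field, filter width, and wavenumbers are invented), showing how filtering retains the large scale while strongly damping the small scale.

```python
import math

def box_filter(field, width):
    """Top-hat filter of a periodic 1-D field: the kind of spatial filtering
    that turns DNS variables into resolved-scale (LES) variables."""
    n = len(field)
    half = width // 2
    w = 2 * half + 1
    return [sum(field[(i + j) % n] for j in range(-half, half + 1)) / w
            for i in range(n)]

n = 64
x = [2 * math.pi * i / n for i in range(n)]
field = [math.sin(xi) + 0.3 * math.sin(16 * xi) for xi in x]  # large + small scale
filtered = box_filter(field, width=9)

def sine_amp(sig, k):
    """Amplitude of the sin(k x) component on the periodic grid."""
    return abs(2 / n * sum(sig[i] * math.sin(k * x[i]) for i in range(n)))

amp_large, amp_small = sine_amp(filtered, 1), sine_amp(filtered, 16)
print(amp_large > 5 * amp_small)
```

    The energy removed from the small scales is what the subgrid-scale models and filtered source terms discussed above must account for.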

  13. Systematically biological prioritizing remediation sites based on datasets of biological investigations and heavy metals in soil

    NASA Astrophysics Data System (ADS)

    Lin, Wei-Chih; Lin, Yu-Pin; Anthony, Johnathen

    2015-04-01

    Heavy metal pollution has adverse effects not only on the focal invertebrate species of this study, such as reduction in pupa weight and increased larval mortality, but also on the higher trophic level organisms which feed on them, either directly or indirectly, through the process of biomagnification. Despite this, few studies regarding remediation prioritization take species distribution or biological conservation priorities into consideration. This study develops a novel approach for delineating sites which are both contaminated by any of 5 readily bioaccumulated heavy metal soil contaminants and are of high ecological importance for the highly mobile, low trophic level focal species. The conservation priority of each site was based on the projected distributions of 6 moth species simulated via the presence-only maximum entropy species distribution model followed by the subsequent application of a systematic conservation tool. In order to increase the number of available samples, we also integrated crowd-sourced data with professionally-collected data via a novel optimization procedure based on a simulated annealing algorithm. This integration step matters because, while crowd-sourced data can drastically increase the number of samples available to ecologists, their quality or reliability can be called into question, adding yet another source of uncertainty in projecting species distributions. The optimization method screens crowd-sourced data in terms of the environmental variables which correspond to professionally-collected data. The sample distribution data was derived from two different sources, including the EnjoyMoths project in Taiwan (crowd-sourced data) and the Global Biodiversity Information Facility (GBIF) field data (professional data). The distributions of heavy metal concentrations were generated via 1000 iterations of a geostatistical co-simulation approach.
The uncertainties in distributions of the heavy metals were then quantified based on the overall consistency between realizations. Finally, Information-Gap Decision Theory (IGDT) was applied to rank the remediation priorities of contaminated sites in terms of both spatial consensus of multiple heavy metal realizations and the priority of specific conservation areas. Our results show that the crowd-sourced optimization algorithm developed in this study is effective at selecting suitable data from crowd-sourced data. By using this technique the available sample data increased to a total number of 96, 162, 72, 62, 69 and 62 or, that is, 2.6, 1.6, 2.5, 1.6, 1.2 and 1.8 times that originally available through the GBIF professionally-assembled database. Additionally, for all species considered the performance of models, in terms of test-AUC values, based on the combination of both data sources exceeded those models which were based on a single data source. Furthermore, the additional optimization-selected data lowered the overall variability, and therefore uncertainty, of model outputs. Based on the projected species distributions, our results revealed that around 30% of high species hotspot areas were also identified as contaminated. The decision-making tool, IGDT, successfully yielded remediation plans in terms of specific ecological value requirements, false positive tolerance rates of contaminated areas, and expected decision robustness. The proposed approach can be applied both to identify high conservation priority sites contaminated by heavy metals, based on the combination of screened crowd-sourced and professionally-collected data, and in making robust remediation decisions.
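
    As a loose illustration of screening crowd-sourced records with simulated annealing, the sketch below selects a subset whose mean environmental covariates match a professionally collected reference set. The cost function, data, and cooling schedule are invented and far simpler than the paper's procedure.

```python
import math, random

def anneal_select(crowd, reference, k, steps=4000, seed=1):
    """Simulated annealing: choose k crowd-sourced records whose mean
    covariates best match the reference; cost = squared distance of means."""
    rng = random.Random(seed)
    dims = len(reference[0])
    ref_mean = [sum(p[d] for p in reference) / len(reference)
                for d in range(dims)]

    def cost(idx):
        return sum((sum(crowd[i][d] for i in idx) / k - ref_mean[d]) ** 2
                   for d in range(dims))

    current = rng.sample(range(len(crowd)), k)
    cur_c = cost(current)
    best, best_c = list(current), cur_c
    for step in range(steps):
        temp = max(1e-9, 0.05 * (1 - step / steps))  # linear cooling
        cand = list(current)
        cand[rng.randrange(k)] = rng.randrange(len(crowd))  # swap one record
        if len(set(cand)) < k:
            continue  # keep selected records distinct
        cand_c = cost(cand)
        if cand_c < cur_c or rng.random() < math.exp((cur_c - cand_c) / temp):
            current, cur_c = cand, cand_c
            if cur_c < best_c:
                best, best_c = list(current), cur_c
    return best, best_c

rng = random.Random(0)
reference = [(rng.gauss(0, 1), rng.gauss(5, 1)) for _ in range(30)]   # "professional"
crowd = [(rng.gauss(0, 3), rng.gauss(5, 3)) for _ in range(200)]      # "crowd-sourced"
picked, c = anneal_select(crowd, reference, k=40)
print(round(c, 6))
```

    The occasional acceptance of uphill moves at nonzero temperature is what lets the search escape poor subsets before the cooling schedule makes it greedy.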

  14. Semiconductor Laser Low Frequency Noise Characterization

    NASA Technical Reports Server (NTRS)

    Maleki, Lute; Logan, Ronald T.

    1996-01-01

    This work summarizes the efforts in identifying the fundamental noise limit in semiconductor optical sources (lasers) to determine the source of 1/F noise and its associated behavior. The study also addresses the effects of this 1/F noise on RF phased arrays. It showed that the 1/F noise in semiconductor lasers has an ultimate physical limit based upon factors similar to those behind the fundamental noise generated in other semiconductor and solid-state devices. It also showed that both additive and multiplicative noise can significantly degrade the performance of RF phased arrays, especially with regard to very low sidelobe performance and ultimate beam-steering accuracy. The final result is that a noise-power-related term must be included in a complete analysis of the noise spectrum of any semiconductor device, including semiconductor lasers.
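
    The 1/F (flicker) spectral shape at issue can be illustrated with the classic Voss-McCartney generator. This is a generic sketch of 1/f-like noise, unrelated to the lasers or measurement setup of the study; the row count, seed, and band choices are arbitrary.

```python
import math, random

def pink_noise(n, rows=12, seed=7):
    """Voss-McCartney pink (~1/f) noise: white-noise rows updated at
    halving rates are summed for each output sample."""
    rng = random.Random(seed)
    vals = [rng.uniform(-1, 1) for _ in range(rows)]
    out = []
    for i in range(1, n + 1):
        k = (i & -i).bit_length() - 1  # row index = lowest set bit of i
        if k < rows:
            vals[k] = rng.uniform(-1, 1)
        out.append(sum(vals))
    return out

def band_power(sig, freqs):
    """Mean squared DFT magnitude over the given frequency bins."""
    n = len(sig)
    total = 0.0
    for f in freqs:
        re = sum(sig[t] * math.cos(2 * math.pi * f * t / n) for t in range(n))
        im = sum(sig[t] * math.sin(2 * math.pi * f * t / n) for t in range(n))
        total += re * re + im * im
    return total / len(freqs)

x = pink_noise(1024)
low = band_power(x, range(1, 9))       # low-frequency bins
high = band_power(x, range(100, 108))  # higher-frequency bins
print(low > high)
```

    The rising power toward low frequencies is exactly why 1/f noise dominates long-integration measurements such as phased-array phase stability.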

  15. Evaluation of the Hydrologic Source Term from Underground Nuclear Tests on Pahute Mesa at the Nevada Test Site: The CHESHIRE Test

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pawloski, G A; Tompson, A F B; Carle, S F

    The objectives of this report are to develop, summarize, and interpret a series of detailed unclassified simulations that forecast the nature and extent of radionuclide release and near-field migration in groundwater away from the CHESHIRE underground nuclear test at Pahute Mesa at the NTS over 1000 yrs. Collectively, these results are called the CHESHIRE Hydrologic Source Term (HST). The CHESHIRE underground nuclear test was one of 76 underground nuclear tests that were fired below or within 100 m of the water table between 1965 and 1992 in Areas 19 and 20 of the NTS. These areas now comprise the Pahute Mesa Corrective Action Unit (CAU) for which a separate subregional scale flow and transport model is being developed by the UGTA Project to forecast the larger-scale migration of radionuclides from underground tests on Pahute Mesa. The current simulations are being developed, on one hand, to more fully understand the complex coupled processes involved in radionuclide migration, with a specific focus on the CHESHIRE test. While remaining unclassified, they are as site specific as possible and involve a level of modeling detail that is commensurate with the most fundamental processes, conservative assumptions, and representative data sets available. However, the simulation results are also being developed so that they may be simplified and interpreted for use as a source term boundary condition at the CHESHIRE location in the Pahute Mesa CAU model. In addition, the processes of simplification and interpretation will provide generalized insight as to how the source term behavior at other tests may be considered or otherwise represented in the Pahute Mesa CAU model.

  16. The Plant Research Unit: Long-Term Plant Growth Support for Space Station

    NASA Technical Reports Server (NTRS)

    Heathcote, D. G.; Brown, C. S.; Goins, G. D.; Kliss, M.; Levine, H.; Lomax, P. A.; Porter, R. L.; Wheeler, R.

    1996-01-01

    The specifications of the plant research unit (PRU) plant habitat, designed for space station operations, are presented. A prototype brassboard model of the PRU is described, and the results of the subsystem tests are outlined. The effects of long-term red light-emitting diode (LED) illumination as the sole light source for plant development were compared with those of red LEDs supplemented with blue wavelengths and of white fluorescent sources. It was found that wheat and Arabidopsis were able to complete a life cycle under red LEDs alone, but with differences in physiology and morphology. The differences noted were greatest for Arabidopsis, where the time to flowering was increased under red illumination. The addition of 10 percent blue light was effective in eliminating the observed differences. The results of comparative testing of three nutrient delivery systems for the PRU are discussed.

  17. Observations of rapid-fire event tremor at Lascar volcano, Chile

    USGS Publications Warehouse

    Asch, Guenter; Wylegalla, K.; Hellweg, M.; Seidl, D.; Rademacher, H.

    1996-01-01

    During the Proyecto de Investigación Sismológica de la Cordillera Occidental (PISCO '94) in the Atacama desert of Northern Chile, a continuously recording broadband seismic station was installed to the NW of the currently active volcano, Lascar. For the month of April, 1994, an additional network of three short-period, three-component stations was deployed around the volcano to help discriminate its seismic signals from other local seismicity. During the deployment, the volcanic activity at Lascar appeared to be limited mainly to the emission of steam and SO2. Tremor from Lascar is a random, "rapid-fire" series of events with a wide range of amplitudes and a quasi-fractal structure. The tremor is generated by an ensemble of independent elementary sources clustered in the volcanic edifice. In the short term, the excitation of the sources fluctuates strongly, while the long-term power spectrum is very stationary.

  18. Neon reduction program on Cymer ArF light sources

    NASA Astrophysics Data System (ADS)

    Kanawade, Dinesh; Roman, Yzzer; Cacouris, Ted; Thornes, Josh; O'Brien, Kevin

    2016-03-01

    In response to significant neon supply constraints, Cymer has developed a multi-part plan to support its customers. Cymer's primary objective is to ensure that reliable system performance is maintained while minimizing gas consumption. Gas algorithms were optimized to ensure stable performance across all operating conditions. The Cymer neon support plan contains four elements: 1. a gas reduction program to reduce neon use by >50% while maintaining existing performance levels and availability; 2. short-term containment solutions for immediate relief; 3. qualification of additional gas suppliers; and 4. a long-term recycling/reclaim opportunity. The Cymer neon reduction program has shown excellent results, as demonstrated by comparing standard gas use with the new >50%-reduced-neon performance for ArF immersion light sources. Testing included stressful conditions such as repetition rate, duty cycle, and energy target changes. No performance degradation has been observed over typical gas lives.

  19. Accuracy-preserving source term quadrature for third-order edge-based discretization

    NASA Astrophysics Data System (ADS)

    Nishikawa, Hiroaki; Liu, Yi

    2017-09-01

    In this paper, we derive a family of source term quadrature formulas for preserving third-order accuracy of the node-centered edge-based discretization for conservation laws with source terms on arbitrary simplex grids. A three-parameter family of source term quadrature formulas is derived, and as a subset, a one-parameter family of economical formulas is identified that does not require second derivatives of the source term. Among the economical formulas, a unique formula is then derived that does not require gradients of the source term at neighbor nodes, thus leading to a significantly smaller discretization stencil for source terms. None of the formulas derived in this paper requires a boundary closure, so they can be applied directly at boundary nodes. Numerical results are presented to demonstrate third-order accuracy at interior and boundary nodes for one-dimensional grids and linear triangular/tetrahedral grids over straight and curved geometries.
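
    Schematically, a node-centered edge-based residual with such a source term quadrature has the following shape; the notation and the single weight α below are illustrative placeholders standing in for the paper's derived three-parameter family, not its actual formulas:

```latex
% Schematic edge-based residual at node j (all symbols illustrative):
% \Phi_{jk} is the numerical flux on edge \{j,k\}, A_{jk} the directed
% edge area, V_j the dual volume, \mathcal{N}(j) the edge-neighbors of j,
% and \tilde{s}_j a weighted source quadrature over node and neighbors.
\mathrm{Res}_j \;=\; \frac{1}{V_j} \sum_{k \in \mathcal{N}(j)} \Phi_{jk}\, A_{jk}
\;-\; \tilde{s}_j,
\qquad
\tilde{s}_j \;=\; (1-\alpha)\, s_j
\;+\; \frac{\alpha}{|\mathcal{N}(j)|} \sum_{k \in \mathcal{N}(j)} s_k .
```

    The point of the paper's derivation is that the quadrature weights can be chosen so that the source discretization error cancels against the flux discretization error to third order.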

  20. Quiet Clean Short-Haul Experimental Engine (QCSEE): Acoustic treatment development and design

    NASA Technical Reports Server (NTRS)

    Clemons, A.

    1979-01-01

    Acoustic treatment designs for the quiet clean short-haul experimental engines are defined. The procedures used in the development of each noise-source suppressor device are presented and discussed in detail. All treatment concepts considered, and the test facilities utilized in obtaining background data for treatment development, are also described. Additional supporting investigations complementary to the treatment development work are presented. The expected suppression results for each treatment configuration are given in terms of delta SPL versus frequency and in terms of delta PNdB.

  1. The effect of coverings, including plastic bags and wraps, on mortality and morbidity in preterm and full-term neonates.

    PubMed

    Oatley, H K; Blencowe, H; Lawn, J E

    2016-05-01

    Neonatal hypothermia is an important risk factor for mortality and morbidity, and is common even in temperate climates. We conducted a systematic review to determine whether plastic coverings, used immediately following delivery, were effective in reducing the incidence of mortality, hypothermia and morbidity. A total of 26 studies (2271 preterm and 1003 term neonates) were included. Meta-analyses were conducted as appropriate. Plastic wraps were associated with a reduction in hypothermia in preterm (⩽29 weeks; risk ratio (RR)=0.57; 95% confidence interval (CI) 0.46 to 0.71) and term neonates (RR=0.76; 95% CI 0.60 to 0.96). No significant reduction in neonatal mortality or morbidity was found; however, the studies were underpowered for these outcomes. For neonates, especially preterm, plastic wraps combined with other environmental heat sources are effective in reducing hypothermia during stabilization and transfer within hospital. Further research is needed to quantify the effects on mortality or morbidity, and investigate the use of plastic coverings outside hospital settings or without additional heat sources.
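
    Risk ratios and confidence intervals of the kind reported above can be computed from a 2×2 table with a log-normal approximation. The sketch below uses hypothetical counts, not data from the review:

```python
import math

def risk_ratio(events_tx, n_tx, events_ctl, n_ctl):
    """Risk ratio and 95% CI (log-normal approximation) from 2x2 counts."""
    p_tx = events_tx / n_tx
    p_ctl = events_ctl / n_ctl
    rr = p_tx / p_ctl
    # Standard error of log(RR) for binomial counts
    se = math.sqrt(1 / events_tx - 1 / n_tx + 1 / events_ctl - 1 / n_ctl)
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, lo, hi

# Hypothetical trial: 40/100 hypothermic with a wrap vs 70/100 without
rr, lo, hi = risk_ratio(40, 100, 70, 100)
print(f"RR = {rr:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```

    A risk ratio below 1 whose upper confidence limit is also below 1, as in the preterm result quoted above, indicates a statistically significant reduction.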

  2. The effect of coverings, including plastic bags and wraps, on mortality and morbidity in preterm and full-term neonates

    PubMed Central

    Oatley, H K; Blencowe, H; Lawn, J E

    2016-01-01

    Neonatal hypothermia is an important risk factor for mortality and morbidity, and is common even in temperate climates. We conducted a systematic review to determine whether plastic coverings, used immediately following delivery, were effective in reducing the incidence of mortality, hypothermia and morbidity. A total of 26 studies (2271 preterm and 1003 term neonates) were included. Meta-analyses were conducted as appropriate. Plastic wraps were associated with a reduction in hypothermia in preterm (⩽29 weeks; risk ratio (RR)=0.57; 95% confidence interval (CI) 0.46 to 0.71) and term neonates (RR=0.76; 95% CI 0.60 to 0.96). No significant reduction in neonatal mortality or morbidity was found; however, the studies were underpowered for these outcomes. For neonates, especially preterm, plastic wraps combined with other environmental heat sources are effective in reducing hypothermia during stabilization and transfer within hospital. Further research is needed to quantify the effects on mortality or morbidity, and investigate the use of plastic coverings outside hospital settings or without additional heat sources. PMID:27109095

  3. Poster — Thur Eve — 40: Automated Quality Assurance for Remote-Afterloading High Dose Rate Brachytherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Anthony; Ravi, Ananth

    2014-08-15

    High dose rate (HDR) remote afterloading brachytherapy involves sending a small, high-activity radioactive source attached to a cable to different positions within a hollow applicator implanted in the patient. It is critical that the source position within the applicator and the dwell time of the source are accurate. Daily quality assurance (QA) tests of positional and dwell time accuracy are essential to ensure that the accuracy of the remote afterloader is not compromised prior to patient treatment. Our centre has developed an automated, video-based QA system for HDR brachytherapy that is dramatically superior to existing diode or film QA solutions in terms of cost, objectivity, and positional accuracy, with additional functionality such as the ability to determine the dwell time and transit time of the source. In our system, a video is taken of the brachytherapy source as it is sent out through a position check ruler, with the source visible through a clear window. Using a proprietary image analysis algorithm, the source position is determined with respect to time as it moves to different positions along the check ruler. The total material cost of the video-based system was under $20, consisting of a commercial webcam and an adjustable stand. The accuracy of the position measurement is ±0.2 mm, and the time resolution is 30 msec. Additionally, our system is capable of robustly verifying the source transit time and velocity (a test required by the AAPM and CPQR recommendations), which is currently difficult to perform accurately.
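
    The image analysis algorithm itself is proprietary; a minimal, hypothetical version of such frame-by-frame tracking (threshold at half peak intensity, intensity-weighted centroid, pixel-to-mm calibration, grouping of consecutive frames into dwells) might look like the sketch below. The calibration constants are invented for illustration:

```python
import numpy as np

PIXELS_PER_MM = 10.0   # hypothetical calibration of the webcam geometry
FRAME_DT = 1 / 30.0    # 30 fps webcam, i.e. ~30 ms time resolution

def source_position_mm(frame):
    """Estimate the bright source's position (mm along the ruler) in one frame.

    frame: 2-D array of pixel intensities. Thresholds at half the peak
    intensity and returns the intensity-weighted centroid column.
    """
    mask = frame >= 0.5 * frame.max()
    weights = np.where(mask, frame, 0.0)
    cols = np.arange(frame.shape[1])
    x_px = (weights.sum(axis=0) * cols).sum() / weights.sum()
    return x_px / PIXELS_PER_MM

def dwell_times(positions, tol_mm=0.5):
    """Group consecutive frame positions within tol_mm into dwells.

    Returns a list of (mean position in mm, dwell duration in seconds).
    """
    dwells = []
    start = 0
    for i in range(1, len(positions) + 1):
        if i == len(positions) or abs(positions[i] - positions[start]) > tol_mm:
            dwells.append((float(np.mean(positions[start:i])),
                           (i - start) * FRAME_DT))
            start = i
    return dwells
```

    Transit velocity between dwells follows directly from the position difference divided by the inter-frame time.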

  4. The psychological contract: enhancing productivity and its implications for long-term care.

    PubMed

    Flannery, Raymond B

    2002-01-01

    When hired, a new employee is usually given a job description and an explanation of benefits. In addition, the employee will also have a psychological contract with the organization. This contract, often unstated, reflects the main source of the employee's motivation to work hard. This is true of all groups of employees, including long-term care staff. Common examples of psychological contracts for long-term care administrative staff include autonomy, social acceptance, and being at the forefront of cutting-edge research. An awareness of these psychological contracts can result in better "fits" between employee aspirations and relevant long-term care organization tasks so that productivity is enhanced. This article outlines the steps necessary to create these good fits in ways that benefit both the organization and its employees. These recommendations are of particular relevance to administrators and supervisors in long-term care facilities.

  5. Life-cycle energy impacts for adapting an urban water supply system to droughts.

    PubMed

    Lam, Ka Leung; Stokes-Draut, Jennifer R; Horvath, Arpad; Lane, Joe L; Kenway, Steven J; Lant, Paul A

    2017-12-15

    In recent years, cities in some water stressed regions have explored alternative water sources such as seawater desalination and potable water recycling in spite of concerns over increasing energy consumption. In this study, we evaluate the current and future life-cycle energy impacts of four alternative water supply strategies introduced during a decade-long drought in South East Queensland (SEQ), Australia. These strategies were: seawater desalination, indirect potable water recycling, network integration, and rainwater tanks. Our work highlights the energy burden of alternative water supply strategies which added approximately 24% life-cycle energy use to the existing supply system (with surface water sources) in SEQ even for a current post-drought low utilisation status. Over half of this additional life-cycle energy use was from the centralised alternative supply strategies. Rainwater tanks contributed an estimated 3% to regional water supply, but added over 10% life-cycle energy use to the existing system. In the future scenario analysis, we compare the life-cycle energy use between "Normal", "Dry", "High water demand" and "Design capacity" scenarios. In the "Normal" scenario, a long-term low utilisation of the desalination system and the water recycling system has greatly reduced the energy burden of these centralised strategies to only 13%. In contrast, higher utilisation in the unlikely "Dry" and "Design capacity" scenarios add 86% and 140% to life-cycle energy use of the existing system respectively. In the "High water demand" scenario, a 20% increase in per capita water use over 20 years "consumes" more energy than is used by the four alternative strategies in the "Normal" scenario. This research provides insight for developing more realistic long-term scenarios to evaluate and compare life-cycle energy impacts of drought-adaptation infrastructure and regional decentralised water sources. 
Scenario building for life-cycle assessments of water supply systems should consider: i) climate variability and, therefore, infrastructure utilisation rates; ii) potential under-utilisation of both installed centralised and decentralised sources; and iii) the potential energy penalty for operating infrastructure well below its design capacity (e.g., the operational energy intensity of the desalination system is three times higher at low utilisation rates). This study illustrates that evaluating the life-cycle energy use and intensity of these types of supply sources without considering their realistic long-term operating scenario(s) can distort and overemphasise their energy implications. For other water-stressed regions, this work shows that managing long-term water demand is also important, in addition to acknowledging the energy-intensive nature of some alternative water sources. Copyright © 2017 Elsevier Ltd. All rights reserved.
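
    The utilisation effect described above is simple arithmetic: the energy a source adds is its delivered volume times its utilisation-dependent energy intensity, expressed as a share of the baseline system's energy use. The figures below are invented for illustration, not taken from the study:

```python
# Hypothetical baseline: annual life-cycle energy of the existing
# surface-water system (GJ), and per-ML intensities for an alternative
# source whose intensity is assumed ~3x higher at low utilisation.
BASELINE_GJ = 1_000_000

def added_energy_pct(alt_supply_ml, intensity_gj_per_ml):
    """Life-cycle energy added by an alternative source, as % of baseline."""
    return 100 * alt_supply_ml * intensity_gj_per_ml / BASELINE_GJ

# Same hypothetical plant in two utilisation states: at low utilisation it
# delivers a tenth of the water but at triple the per-ML intensity.
full = added_energy_pct(alt_supply_ml=50_000, intensity_gj_per_ml=4.0)
idle = added_energy_pct(alt_supply_ml=5_000, intensity_gj_per_ml=12.0)
print(f"high utilisation: +{full:.0f}%, low utilisation: +{idle:.0f}%")
```

    The point mirrors the study's finding: an idle plant adds far less absolute energy than a fully used one, even though its energy per megalitre is much worse.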

  6. The Influence of Sulphate Deposition on the Seasonal Variation of Peat Pore Water Methyl Hg in a Boreal Mire

    PubMed Central

    Bergman, Inger; Bishop, Kevin; Tu, Qiang; Frech, Wolfgang; Åkerblom, Staffan; Nilsson, Mats

    2012-01-01

    In this paper we investigate the hypothesis that long-term sulphate (SO₄²⁻) deposition has made peatlands a larger source of methyl mercury (MeHg) to remote boreal lakes. This was done on experimental plots at a boreal, low sedge mire where the effect of long-term addition of SO₄²⁻ on peat pore water MeHg concentrations was observed weekly throughout the snow-free portion of 1999. The additions of SO₄²⁻ started in 1995. The seasonal mean of the pore water MeHg concentrations on the plots with 17 kg ha⁻¹ yr⁻¹ of sulphur (S) addition (1.3±0.08 ng L⁻¹, SE; n = 44) was significantly (p<0.0001) higher than the mean MeHg concentration on the plots with 3 kg ha⁻¹ yr⁻¹ of ambient S deposition (0.6±0.02 ng L⁻¹, SE; n = 44). The temporal variation in pore water MeHg concentrations during the snow-free season was larger in the S-addition plots, with an amplitude of >2 ng L⁻¹ compared to ±0.5 ng L⁻¹ in the ambient S deposition plots. The concentrations of pore water MeHg in the S-addition plots were positively correlated (r² = 0.21; p = 0.001) with the groundwater level, with the lowest concentrations of MeHg during the period with the lowest groundwater levels. The pore water MeHg concentrations were not correlated with total Hg, DOC concentration or pH. The results from this study indicate that the persistently higher pore water concentrations of MeHg in the S-addition plots are caused by the long-term additions of SO₄²⁻ to the mire surface. Since these waters are an important source of runoff, the results support the hypothesis that SO₄²⁻ deposition has increased the contribution of peatlands to MeHg in downstream aquatic systems. This would mean that the increased deposition of SO₄²⁻ in acid rain has contributed to the modern increase in the MeHg burdens of remote lakes hydrologically connected to peatlands. PMID:23029086

  7. Sources of Uncertainty and the Interpretation of Short-Term Fluctuations

    NASA Astrophysics Data System (ADS)

    Lewandowsky, S.; Risbey, J.; Cowtan, K.; Rahmstorf, S.

    2016-12-01

    The alleged significant slowdown in global warming during the first decade of the 21st century, and the appearance of a discrepancy between models and observations, has attracted considerable research attention. We trace the history of this research and show how its conclusions were shaped by several sources of uncertainty and ambiguity about models and observations. We show that as those sources of uncertainty were gradually eliminated by further research, insufficient evidence remained to infer any discrepancy between models and observations or a significant slowing of warming. Specifically, we show that early research had to contend with uncertainties about coverage biases in the global temperature record and biases in the sea surface temperature observations which turned out to have exaggerated the extent of slowing. In addition, uncertainties in the observed forcings were found to have exaggerated the mismatch between models and observations. Further sources of uncertainty that were ultimately eliminated involved the use of incommensurate sea surface temperature data between models and observations and a tacit interpretation of model projections as predictions or forecasts. After all those sources of uncertainty were eliminated, the most recent research finds little evidence for an unusual slowdown or a discrepancy between models and observations. We discuss whether these different kinds of uncertainty could have been anticipated or managed differently, and how one can apply those lessons to future short-term fluctuations in warming.

  8. A shock capturing technique for hypersonic, chemically relaxing flows

    NASA Technical Reports Server (NTRS)

    Eberhardt, S.; Brown, K.

    1986-01-01

    A fully coupled, shock-capturing technique is presented for chemically reacting flows at high Mach numbers. The technique makes use of a total variation diminishing (TVD) dissipation operator, which results in sharp, crisp shocks. The eigenvalues and eigenvectors of the fully coupled system, which includes species conservation equations in addition to the gas dynamics equations, are analytically derived for a general reacting gas. Species production terms for a model dissociating gas are introduced and included in the algorithm. The convective terms are solved using a first-order TVD scheme while the source terms are solved using a fourth-order Runge-Kutta scheme to enhance stability. Results from one-dimensional numerical experiments are shown for a two-species and a three-species gas.
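
    The splitting described above (a TVD convection step followed by a Runge-Kutta step for the source term) can be sketched for a scalar model equation u_t + a u_x = s(u). The paper's actual scheme treats the fully coupled multi-species system, so this is only a structural illustration; first-order upwind is used as the (TVD) convective scheme:

```python
import numpy as np

def step(u, a, dx, dt, source):
    """One split time step: first-order upwind (TVD) convection for a > 0,
    then classical RK4 for the (possibly stiff) source term du/dt = s(u).
    Periodic boundaries via np.roll."""
    # Convective update: u_i* = u_i - a*dt/dx * (u_i - u_{i-1})
    u_star = u - a * dt / dx * (u - np.roll(u, 1))
    # RK4 on the source ODE, starting from the convected state
    k1 = source(u_star)
    k2 = source(u_star + 0.5 * dt * k1)
    k3 = source(u_star + 0.5 * dt * k2)
    k4 = source(u_star + dt * k3)
    return u_star + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
```

    With a relaxation source such as s(u) = -(u - u_eq)/tau, the RK4 sub-step integrates the chemistry accurately even when it is much faster than the convective time scale.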

  9. A novel integrated approach for the hazardous radioactive dust source terms estimation in future nuclear fusion power plants.

    PubMed

    Poggi, L A; Malizia, A; Ciparisse, J F; Gaudio, P

    2016-10-01

    An open issue still under investigation by several international entities working in the safety and security field for the foreseen nuclear fusion reactors is the estimation of source terms that are a hazard for operators and the public, and for the machine itself in terms of efficiency and integrity in case of severe accident scenarios. Source term estimation is a crucial safety issue to be addressed in future reactors' safety assessments, and the estimates available at this time are not sufficiently satisfactory. The lack of neutronic data, along with the insufficiently accurate methodologies used until now, calls for an integrated methodology for source term estimation that can provide predictions with adequate accuracy. This work proposes a complete methodology to estimate dust source terms, starting from a broad information gathering. The wide number of parameters that can influence dust source term production is reduced with statistical tools using a combination of screening, sensitivity analysis, and uncertainty analysis. Finally, a preliminary and simplified methodology for predicting dust source term production in future devices is presented.

  10. Group long-term care insurance: decision-making factors and implications for financing long-term care.

    PubMed

    Stum, Marlene S

    2008-01-01

    This study proposes and tests a systemic family decision-making framework to understand group long-term care insurance (LTCI) enrollment decisions. A random sample of public employees who were offered group LTCI as a workplace benefit was examined. Findings reveal very good predictive efficacy for the overall conceptual framework, with a pseudo-R² value of 0.687, and reinforce the contributions of factors within the family system. Enrollees were more likely to have discussed the decision with others, used information sources, and had prior experience when compared to non-enrollees. Perceived health status, financial knowledge, attitudes regarding the role of private insurance, risk taking, and coverage features were additional factors related to enrollment decisions. The findings help to inform policymakers about the potential of LTCI as one strategy for financing long-term care.

  11. Men's strategic preferences for femininity in female faces.

    PubMed

    Little, Anthony C; Jones, Benedict C; Feinberg, David R; Perrett, David I

    2014-08-01

    Several evolutionarily relevant sources of individual differences in face preference have been documented for women. Here, we examine three such sources of individual variation in men's preference for female facial femininity: term of relationship, partnership status and self-perceived attractiveness. We show that men prefer more feminine female faces when rating for a short-term relationship and when they have a partner (Study 1). These variables were found to interact in a follow-up study (Study 2). Men who thought themselves attractive also preferred more feminized female faces for short-term relationships than men who thought themselves less attractive (Study 1 and Study 2). In women, similar findings for masculine preferences in male faces have been interpreted as adaptive. In men, such preferences potentially reflect that attractive males are able to compete for high-quality female partners in short-term contexts. When a man has secured a mate, the potential cost of being discovered may increase his choosiness regarding short-term partners relative to unpartnered men, who can better increase their short-term mating success by relaxing their standards. Such potentially strategic preferences imply that men also face trade-offs when choosing relatively masculine or feminine faced partners. In line with a trade-off, women with feminine faces were seen as more likely to be unfaithful and more likely to pursue short-term relationships (Study 3), suggesting that risk of cuckoldry is one factor that may limit men's preferences for femininity in women and could additionally lead to preferences for femininity in short-term mates. © 2013 The British Psychological Society.

  12. Essential oils: extraction, bioactivities, and their uses for food preservation.

    PubMed

    Tongnuanchan, Phakawat; Benjakul, Soottawat

    2014-07-01

    Essential oils are concentrated liquids of complex mixtures of volatile compounds and can be extracted from several plant organs. Essential oils are a good source of several bioactive compounds, which possess antioxidative and antimicrobial properties. In addition, some essential oils have been used as medicine. Furthermore, the uses of essential oils have received increasing attention as natural additives for the shelf-life extension of food products, due to the risk of using synthetic preservatives. Essential oils can be incorporated into packaging, in which they can provide multiple functions, termed "active or smart packaging." Those essential oils are able to modify the matrix of packaging materials, thereby improving their properties. This review covers up-to-date literature on essential oils, including sources, chemical composition, extraction methods, bioactivities, and their applications, particularly with an emphasis on preservation and the shelf-life extension of food products. © 2014 Institute of Food Technologists®

  13. Effects of Varying Nitrogen Sources on Amino Acid Synthesis Costs in Arabidopsis thaliana under Different Light and Carbon-Source Conditions

    PubMed Central

    Nikoloski, Zoran

    2015-01-01

    Plants as sessile organisms cannot escape their environment and have to adapt to any changes in the availability of sunlight and nutrients. The quantification of synthesis costs of metabolites, in terms of consumed energy, is a prerequisite to understand trade-offs arising from energetic limitations. Here, we examine the energy consumption of amino acid synthesis in Arabidopsis thaliana. To quantify these costs in terms of the energy equivalent ATP, we introduce an improved cost measure based on flux balance analysis and apply it to three state-of-the-art metabolic reconstructions to ensure robust results. We present the first systematic in silico analysis of the effect of nitrogen supply (nitrate/ammonium) on individual amino acid synthesis costs as well as of the effect of photoautotrophic and heterotrophic growth conditions, integrating day/night-specific regulation. Our results identify nitrogen supply as a key determinant of amino acid costs, in agreement with experimental evidence. In addition, the association of the determined costs with experimentally observed growth patterns suggests that metabolite synthesis costs are involved in shaping regulation of plant growth. Finally, we find that simultaneous uptake of both nitrogen sources can lead to efficient utilization of energy sources, which may be the result of evolutionary optimization. PMID:25706533

  14. Effects of varying nitrogen sources on amino acid synthesis costs in Arabidopsis thaliana under different light and carbon-source conditions.

    PubMed

    Arnold, Anne; Sajitz-Hermstein, Max; Nikoloski, Zoran

    2015-01-01

    Plants as sessile organisms cannot escape their environment and have to adapt to any changes in the availability of sunlight and nutrients. The quantification of synthesis costs of metabolites, in terms of consumed energy, is a prerequisite to understand trade-offs arising from energetic limitations. Here, we examine the energy consumption of amino acid synthesis in Arabidopsis thaliana. To quantify these costs in terms of the energy equivalent ATP, we introduce an improved cost measure based on flux balance analysis and apply it to three state-of-the-art metabolic reconstructions to ensure robust results. We present the first systematic in silico analysis of the effect of nitrogen supply (nitrate/ammonium) on individual amino acid synthesis costs as well as of the effect of photoautotrophic and heterotrophic growth conditions, integrating day/night-specific regulation. Our results identify nitrogen supply as a key determinant of amino acid costs, in agreement with experimental evidence. In addition, the association of the determined costs with experimentally observed growth patterns suggests that metabolite synthesis costs are involved in shaping regulation of plant growth. Finally, we find that simultaneous uptake of both nitrogen sources can lead to efficient utilization of energy sources, which may be the result of evolutionary optimization.
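
    The idea of expressing synthesis costs in ATP equivalents can be illustrated with a toy bookkeeping example. The carrier-to-ATP conversion factors and the carrier budgets below are illustrative assumptions only; the paper derives its costs from full flux-balance computations on genome-scale reconstructions:

```python
# Illustrative ATP-equivalent accounting for amino acid synthesis cost.
# Reducing equivalents are converted to ATP equivalents with assumed
# factors (a common textbook-style convention, not the paper's values).
ATP_EQUIV = {"ATP": 1.0, "NADH": 2.5, "NADPH": 2.5}

def atp_cost(consumed):
    """Total synthesis cost in ATP equivalents for a dict of carrier usage."""
    return sum(ATP_EQUIV[carrier] * n for carrier, n in consumed.items())

# Hypothetical carrier budgets for one amino acid under two nitrogen
# sources: assimilating nitrate costs extra reducing power to reach
# ammonium, so the nitrate route is more expensive.
cost_nh4 = atp_cost({"ATP": 2, "NADPH": 1})             # ammonium supply
cost_no3 = atp_cost({"ATP": 2, "NADPH": 1, "NADH": 4})  # + nitrate reduction
print(cost_nh4, cost_no3)
```

    This captures, in miniature, why the nitrogen source is such a strong determinant of amino acid cost: the assimilation route changes the reducing-power budget before amino acid biosynthesis proper even begins.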

  15. Effects of volcano topography on seismic broad-band waveforms

    NASA Astrophysics Data System (ADS)

    Neuberg, Jürgen; Pointer, Tim

    2000-10-01

    Volcano seismology often deals with rather shallow seismic sources and seismic stations deployed in their near field. The complex stratigraphy on volcanoes and near-field source effects have a strong impact on the seismic wavefield, complicating the interpretation techniques that are usually employed in earthquake seismology. In addition, as most volcanoes have a pronounced topography, the interference of the seismic wavefield with the stress-free surface results in severe waveform perturbations that affect seismic interpretation methods. In this study we deal predominantly with the surface effects, but take into account the impact of a typical volcano stratigraphy as well as near-field source effects. We derive a correction term for plane seismic waves and a plane-free surface such that for smooth topographies the effect of the free surface can be totally removed. Seismo-volcanic sources radiate energy in a broad frequency range with a correspondingly wide range of different Fresnel zones. A 2-D boundary element method is employed to study how the size of the Fresnel zone is dependent on source depth, dominant wavelength and topography in order to estimate the limits of the plane wave approximation. This approximation remains valid if the dominant wavelength does not exceed twice the source depth. Further aspects of this study concern particle motion analysis to locate point sources and the influence of the stratigraphy on particle motions. Furthermore, the deployment strategy of seismic instruments on volcanoes, as well as the direct interpretation of the broad-band waveforms in terms of pressure fluctuations in the volcanic plumbing system, are discussed.

  16. Prevalence of microbiological contaminants in groundwater sources and risk factor assessment in Juba, South Sudan.

    PubMed

    Engström, Emma; Balfors, Berit; Mörtberg, Ulla; Thunvik, Roger; Gaily, Tarig; Mangold, Mikael

    2015-05-15

    In low-income regions, drinking water is often derived from groundwater sources, which might spread diarrheal disease if they are microbiologically polluted. This study aimed to investigate the occurrence of fecal contamination in 147 improved groundwater sources in Juba, South Sudan and to assess potential contributing risk factors, based on bivariate statistical analysis. Thermotolerant coliforms (TTCs) were detected in 66% of the investigated sources, including 95 boreholes, breaching the health-based recommendations for drinking water. A significant association (p<0.05) was determined between the presence of TTCs and the depth of cumulative, long-term prior precipitation (both within the previous five days and within the past month). No such link was found to short-term rainfall, the presence of latrines or damages in the borehole apron. However, the risk factor analysis further suggested, to a lesser degree, that the local topography and on-site hygiene were additionally significant. In summary, the analysis indicated that an important contamination mechanism was fecal pollution of the contributing groundwater, which was unlikely due to the presence of latrines; instead, infiltration from contaminated surface water was more probable. The reduction in fecal sources in the environment in Juba is thus recommended, for example, through constructing latrines or designating protection areas near water sources. The study results contribute to the understanding of microbiological contamination of groundwater sources in areas with low incomes and high population densities, tropical climates and weathered basement complex environments, which are common in urban sub-Saharan Africa. Copyright © 2015 Elsevier B.V. All rights reserved.
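
    Bivariate association between a binary contamination outcome and a binary risk factor, as in the analysis above, is commonly assessed with a Pearson chi-square test on a 2×2 table. The counts below are hypothetical, chosen only to match the study's scale (147 sources, 66% TTC-positive):

```python
import numpy as np

def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table
    (no continuity correction)."""
    table = np.asarray(table, dtype=float)
    row = table.sum(axis=1, keepdims=True)
    col = table.sum(axis=0, keepdims=True)
    expected = row @ col / table.sum()
    return float(((table - expected) ** 2 / expected).sum())

# Hypothetical counts: TTC presence vs high/low antecedent rainfall
table = [[60, 37],   # TTC present
         [20, 30]]   # TTC absent
chi2 = chi_square_2x2(table)
print(f"chi-square = {chi2:.2f}")  # compare with 3.84 (p = 0.05, 1 dof)
```

    A statistic above the 3.84 critical value corresponds to p < 0.05 with one degree of freedom, the significance threshold quoted in the abstract.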

  17. Dynamics of two competing species in the presence of Lévy noise sources.

    PubMed

    La Cognata, A; Valenti, D; Dubkov, A A; Spagnolo, B

    2010-07-01

    We consider a Lotka-Volterra system of two competing species subject to multiplicative α-stable Lévy noise. The interaction parameter between the species is a random process which obeys a stochastic differential equation with a generalized bistable potential in the presence both of a periodic driving term and an additive α-stable Lévy noise. We study the species dynamics, which is characterized by two different regimes, exclusion of one species and coexistence of both. We find quasiperiodic oscillations and stochastic resonance phenomenon in the dynamics of the competing species, analyzing the role of the Lévy noise sources.

  18. Dynamics of two competing species in the presence of Lévy noise sources

    NASA Astrophysics Data System (ADS)

    La Cognata, A.; Valenti, D.; Dubkov, A. A.; Spagnolo, B.

    2010-07-01

    We consider a Lotka-Volterra system of two competing species subject to multiplicative α-stable Lévy noise. The interaction parameter between the species is a random process which obeys a stochastic differential equation with a generalized bistable potential in the presence both of a periodic driving term and an additive α-stable Lévy noise. We study the species dynamics, which is characterized by two different regimes, exclusion of one species and coexistence of both. We find quasiperiodic oscillations and stochastic resonance phenomenon in the dynamics of the competing species, analyzing the role of the Lévy noise sources.
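
    A minimal numerical sketch of this class of model: an interaction parameter driven by a bistable potential, a periodic force, and additive α-stable noise (sampled with the Chambers-Mallows-Stuck method), coupled to two symmetrically competing species. The parameter values and the mapping from the noise process to the interaction strength are illustrative, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(0)

def alpha_stable(alpha, size):
    """Symmetric alpha-stable samples via the Chambers-Mallows-Stuck method."""
    v = rng.uniform(-np.pi / 2, np.pi / 2, size)
    w = rng.exponential(1.0, size)
    return (np.sin(alpha * v) / np.cos(v) ** (1 / alpha)
            * (np.cos(v - alpha * v) / w) ** ((1 - alpha) / alpha))

def simulate(T=20.0, dt=1e-3, alpha=1.5, noise=0.05, drive=0.2, omega=1.0):
    """Euler-Maruyama for an interaction parameter b with bistable potential
    U(b) = b**4/4 - b**2/2, periodic drive, and additive alpha-stable noise,
    coupled to two symmetric competing species x and y."""
    n = int(round(T / dt))
    L = alpha_stable(alpha, n)
    x = np.empty(n)
    y = np.empty(n)
    xc, yc, b = 0.6, 0.4, -1.0
    for i in range(n):
        b += (b - b**3 + drive * np.cos(omega * i * dt)) * dt \
             + noise * dt ** (1 / alpha) * L[i]
        b = min(max(b, -3.0), 3.0)   # clip rare heavy-tail kicks
        beta = 1.0 + 0.5 * b         # illustrative mapping: beta < 1 gives
        xc += xc * (1.0 - xc - beta * yc) * dt   # coexistence, beta > 1
        yc += yc * (1.0 - yc - beta * xc) * dt   # gives exclusion
        x[i], y[i] = xc, yc
    return x, y
```

    Switching of b between the two potential wells moves the ecosystem between the coexistence and exclusion regimes, which is the mechanism behind the quasiperiodic oscillations the authors report.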

  19. Orthogonal strip HPGe planar SmartPET detectors in Compton configuration

    NASA Astrophysics Data System (ADS)

    Boston, H. C.; Gillam, J.; Boston, A. J.; Cooper, R. J.; Cresswell, J.; Grint, A. N.; Mather, A. R.; Nolan, P. J.; Scraggs, D. P.; Turk, G.; Hall, C. J.; Lazarus, I.; Berry, A.; Beveridge, T.; Lewis, R.

    2007-10-01

    The evolution of germanium detector technology over the last decade has led to the possibility that such detectors can be employed in medical and security imaging. The excellent energy resolution coupled with good position information that germanium affords removes the necessity for the mechanical collimators that would be required in a conventional gamma camera system. By removing this constraint, the overall dose to the patient can be reduced or the throughput of the system can be increased. An additional benefit of excellent energy resolution is that tight gates can be placed on energies from either a multi-line gamma source or from multi-nuclide sources, increasing the number of sources that can be used in medical imaging. In terms of security imaging, segmented germanium gives directionality and excellent spectroscopic information.

  20. A dynamical regularization algorithm for solving inverse source problems of elliptic partial differential equations

    NASA Astrophysics Data System (ADS)

    Zhang, Ye; Gong, Rongfang; Cheng, Xiaoliang; Gulliksson, Mårten

    2018-06-01

    This study considers the inverse source problem for elliptic partial differential equations with both Dirichlet and Neumann boundary data. The unknown source term is to be determined by additional boundary conditions. Unlike the existing methods found in the literature, which usually employ the first-order in time gradient-like system (such as the steepest descent methods) for numerically solving the regularized optimization problem with a fixed regularization parameter, we propose a novel method with a second-order in time dissipative gradient-like system and a dynamical selected regularization parameter. A damped symplectic scheme is proposed for the numerical solution. Theoretical analysis is given for both the continuous model and the numerical algorithm. Several numerical examples are provided to show the robustness of the proposed algorithm.
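
    A toy linear analogue of such a second-order-in-time damped flow, integrated with a damped symplectic (leapfrog-type) step and settling at a Tikhonov-regularized solution, is sketched below. This illustrates only the dynamical idea; the authors' algorithm additionally selects the regularization parameter dynamically, and all symbols and parameter values here are illustrative:

```python
import numpy as np

def damped_dynamics(A, g, lam=1e-2, eta=1.0, dt=0.2, n_steps=5000):
    """Integrate f'' + eta*f' = -grad J(f) for the Tikhonov functional
    J(f) = 0.5*||A f - g||^2 + 0.5*lam*||f||^2 with a damped symplectic
    Euler step. The flow dissipates energy and settles at the minimizer
    f* = (A^T A + lam I)^{-1} A^T g."""
    f = np.zeros(A.shape[1])
    v = np.zeros_like(f)
    c = eta * dt / 2.0
    for _ in range(n_steps):
        grad = A.T @ (A @ f - g) + lam * f
        v = ((1.0 - c) * v - dt * grad) / (1.0 + c)  # damped velocity update
        f = f + dt * v                               # position update
    return f
```

    Compared with a first-order (steepest-descent) flow, the second-order dynamics adds inertia, which typically accelerates convergence toward the regularized solution for ill-conditioned operators.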

  1. Ancient Glass: A Literature Search and its Role in Waste Management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Strachan, Denis M.; Pierce, Eric M.

    2010-07-01

When developing a performance assessment (PA) model for the long-term disposal of immobilized low-activity waste (ILAW) glass, it is desirable to determine the durability of glass forms over very long periods of time. However, testing is limited to short time spans, so experiments are performed under conditions that accelerate the key geochemical processes that control weathering. Verification that the models currently in use can reliably calculate the long-term behavior of ILAW glass is a key component of the overall PA strategy. Therefore, Pacific Northwest National Laboratory was contracted by Washington River Protection Solutions, LLC to evaluate alternative strategies that can be used for PA source-term model validation. One viable alternative strategy is the use of independent experimental data from archaeological studies of ancient or natural glass contained in the literature. These results represent a potential independent experiment dating back approximately 3600 years, or to 1600 before the current era (BCE), in the case of ancient glass, and 10^6 years or more in the case of natural glass. The results of this literature review suggest that additional experimental data may be needed before the results from archaeological studies can be used as a tool for model validation of glass weathering and, more specifically, disposal facility performance. This is largely because none of the existing data sets contains all of the information required to conduct PA source-term calculations. For example, in many cases the sediments surrounding the glass were not collected and analyzed, so the data required to compare computer simulations of concentration flux are not available. This type of information is important to understanding the element release profile from the glass to the surrounding environment and provides a metric that can be used to calibrate source-term models. 
Although useful, the available literature sources do not contain the information required to simulate the long-term performance of nuclear waste glasses in near-surface or deep geologic repositories. The information that will be required includes 1) experimental measurements to quantify the model parameters, 2) detailed analyses of altered glass samples, and 3) detailed analyses of the sediment surrounding the ancient glass samples.

  2. Extended lattice Boltzmann scheme for droplet combustion.

    PubMed

    Ashna, Mostafa; Rahimian, Mohammad Hassan; Fakhari, Abbas

    2017-05-01

The available lattice Boltzmann (LB) models for combustion or phase change are focused on either single-phase flow combustion or two-phase flow with evaporation assuming a constant density for both liquid and gas phases. To pave the way towards simulation of spray combustion, we propose a two-phase LB method for modeling combustion of liquid fuel droplets. We develop an LB scheme to model phase change and combustion by taking into account the density variation in the gas phase and accounting for the chemical reaction based on the Cahn-Hilliard free-energy approach. Evaporation of the liquid fuel is modeled by adding a source term to the continuity equation, which accounts for the nonzero divergence of the velocity field. The low-Mach-number approximation in the governing Navier-Stokes and energy equations is used to incorporate source terms due to heat release from chemical reactions, density variation, and nonluminous radiative heat loss. Additionally, the conservation equation for chemical species is formulated by including a source term due to chemical reaction. To validate the model, we consider the combustion of n-heptane and n-butanol droplets in stagnant air using overall single-step reactions. The diameter history and flame standoff ratio obtained from the proposed LB method are found to be in good agreement with available numerical and experimental data. The present LB scheme is believed to be a promising approach for modeling spray combustion.
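For context, diameter histories of evaporating or burning droplets are classically summarized by the d²-law, a common reference when validating simulations of this kind. The burning-rate constant below is an illustrative value, not one taken from the paper:

```python
import numpy as np

def diameter_history(d0, K, t):
    """Classical d-squared law: d(t)^2 = d0^2 - K*t, clipped at burnout."""
    return np.sqrt(np.maximum(d0**2 - K * t, 0.0))

d0 = 1.0e-3            # initial droplet diameter, m
K = 8.0e-7             # burning-rate constant, m^2/s (illustrative)
t_burnout = d0**2 / K  # time to complete consumption of the droplet
print(round(t_burnout, 6))  # → 1.25 (seconds, for these made-up values)
```

A detailed model such as the LB scheme above is expected to reproduce a roughly linear decay of d² in time after an initial heat-up transient.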

  3. Short-Term Rhizosphere Effect on Available Carbon Sources, Phenanthrene Degradation, and Active Microbiome in an Aged-Contaminated Industrial Soil

    PubMed Central

    Thomas, François; Cébron, Aurélie

    2016-01-01

Over the last decades, understanding of the effects of plants on soil microbiomes has greatly advanced. However, knowledge on the assembly of rhizospheric communities in aged-contaminated industrial soils is still limited, especially with regard to transcriptionally active microbiomes and their link to the quality or quantity of carbon sources. We compared the short-term (2–10 days) dynamics of bacterial communities and potential PAH-degrading bacteria in bare or ryegrass-planted aged-contaminated soil spiked with phenanthrene, in relation to dissolved organic carbon (DOC) sources and polycyclic aromatic hydrocarbon (PAH) pollution. Both resident and active bacterial communities (analyzed from DNA and RNA, respectively) showed higher species richness and smaller dispersion between replicates in planted soils. Root development strongly favored the activity of Pseudomonadales within the first 2 days, and of members of Actinobacteria, Caulobacterales, Rhizobiales, and Xanthomonadales within 6–10 days. Plants slowed down the dissipation of phenanthrene, while root exudation provided a cocktail of labile substrates that might preferentially fuel microbial growth. Although the abundance of PAH-degrading genes increased in planted soil, their transcription level stayed similar to bare soil. In addition, network analysis revealed that plants induced an early shift in the identity of potential phenanthrene degraders, which might influence PAH dissipation in the long term. PMID:26903971

  4. Evaluating the behavior of polychlorinated biphenyl compounds in Lake Superior using a dynamic multimedia model

    NASA Astrophysics Data System (ADS)

    Khan, T.; Perlinger, J. A.; Urban, N. R.

    2017-12-01

Certain toxic, persistent, bioaccumulative, and semivolatile compounds known as atmosphere-surface exchangeable pollutants or ASEPs are emitted into the environment by primary sources, are transported, deposited to water surfaces, and can be later re-emitted causing the water to act as a secondary source. Polychlorinated biphenyl (PCB) compounds, a class of ASEPs, are of major concern in the Laurentian Great Lakes because of their historical use primarily as additives to oils and industrial fluids, and discharge from industrial sources. Following the ban on production in the U.S. in 1979, atmospheric concentrations of PCBs in the Lake Superior region decreased rapidly. Subsequently, PCB concentrations in the lake surface water also reached near equilibrium as the atmospheric levels of PCBs declined. However, previous studies on long-term PCB levels and trends in lake trout and walleye suggested that the initial rate of decline of PCB concentrations in fish has leveled off in Lake Superior. In this study, a dynamic multimedia flux model was developed with the objective of investigating the observed leveling off of PCB concentrations in Lake Superior fish. The model structure consists of two water layers (the epilimnion and the hypolimnion), and the surface mixed sediment layer, while atmospheric deposition is the primary external pathway of PCB inputs to the lake. The model was applied for different PCB congeners having a range of hydrophobicity and volatility. Using this model, we compare the long-term trends in predicted PCB concentrations in different environmental media with relevant available measurements for Lake Superior. We examine the seasonal depositional and exchange patterns, the relative importance of different process terms, and identify the most probable source of the current observed PCB levels in Lake Superior fish. 
In addition, we evaluate the role of current atmospheric PCB levels in sustaining the observed fish concentrations and appraise the need for continuous atmospheric PCB monitoring by the Great Lakes Integrated Atmospheric Deposition Network. By combining the modeled lake and biota response times resulting from atmospheric PCB inputs, we predict the time scale for safe fish consumption in Lake Superior.
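A minimal sketch of the kind of dynamic mass-balance structure described above: two coupled compartments (water column and surficial sediment) driven by declining atmospheric deposition. All rate constants and inputs are hypothetical placeholders, not values calibrated for Lake Superior:

```python
import numpy as np

def simulate(years=60.0, dt=0.01):
    """Forward-Euler integration of a two-box PCB mass balance (kg)."""
    k_vol = 0.5      # volatilization loss from water, 1/yr (assumed)
    k_settle = 0.8   # settling of particle-bound PCB to sediment, 1/yr
    k_resus = 0.02   # resuspension/diffusion back to water, 1/yr
    k_burial = 0.01  # burial below the mixed sediment layer, 1/yr
    atm_in = lambda t: 100.0 * np.exp(-0.2 * t)  # declining deposition, kg/yr

    m_w, m_s, t = 0.0, 0.0, 0.0
    hist = []
    while t < years:
        dm_w = atm_in(t) - (k_vol + k_settle) * m_w + k_resus * m_s
        dm_s = k_settle * m_w - (k_resus + k_burial) * m_s
        m_w += dt * dm_w
        m_s += dt * dm_s
        t += dt
        hist.append((t, m_w, m_s))
    return np.array(hist)

h = simulate()
# Late in the run the water burden is sustained mainly by the slowly
# depleting sediment reservoir, mimicking the observed leveling off.
print(h[-1, 2] > h[-1, 1])  # → True: sediment >> water at the end
```

Even in this toy setting, the water-column burden peaks and then decays far more slowly than the atmospheric input, which is the qualitative behavior the abstract attributes to a secondary (sediment) source.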

  5. Influence of heat conducting substrates on explosive crystallization in thin layers

    NASA Astrophysics Data System (ADS)

    Schneider, Wilhelm

    2017-09-01

    Crystallization in a thin, initially amorphous layer is considered. The layer is in thermal contact with a substrate of very large dimensions. The energy equation of the layer contains source and sink terms. The source term is due to liberation of latent heat in the crystallization process, while the sink term is due to conduction of heat into the substrate. To determine the latter, the heat diffusion equation for the substrate is solved by applying Duhamel's integral. Thus, the energy equation of the layer becomes a heat diffusion equation with a time integral as an additional term. The latter term indicates that the heat loss due to the substrate depends on the history of the process. To complete the set of equations, the crystallization process is described by a rate equation for the degree of crystallization. The governing equations are then transformed to a moving co-ordinate system in order to analyze crystallization waves that propagate with invariant properties. Dual solutions are found by an asymptotic expansion for large activation energies of molecular diffusion. By introducing suitable variables, the results can be presented in a universal form that comprises the influence of all non-dimensional parameters that govern the process. Of particular interest for applications is the prediction of a critical heat loss parameter for the existence of crystallization waves with invariant properties.
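Schematically (in illustrative notation, not the paper's), the layer energy balance described above takes the form:

```latex
% Layer energy equation: conduction + latent-heat source - substrate sink
\frac{\partial T}{\partial t}
  = a \frac{\partial^{2} T}{\partial x^{2}}
  + \frac{L}{c}\,\frac{\partial \chi}{\partial t}
  - \underbrace{\beta \int_{0}^{t} \frac{1}{\sqrt{t-\tau}}\,
      \frac{\partial T}{\partial \tau}\, \mathrm{d}\tau}_{\text{Duhamel heat loss to substrate}},
\qquad
\frac{\partial \chi}{\partial t} = f(\chi, T)
```

Here $T$ is the layer temperature, $\chi$ the degree of crystallization, $L$ the latent heat, and $\beta$ a substrate coupling parameter. The time integral is the history-dependent sink term the abstract refers to: the heat lost to the substrate at time $t$ depends on the entire preceding temperature evolution.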

  6. Mass discharge assessment at a brominated DNAPL site: Effects of known DNAPL source mass removal

    NASA Astrophysics Data System (ADS)

    Johnston, C. D.; Davis, G. B.; Bastow, T. P.; Woodbury, R. J.; Rao, P. S. C.; Annable, M. D.; Rhodes, S.

    2014-08-01

Management and closure of contaminated sites is increasingly being proposed on the basis of mass flux of dissolved contaminants in groundwater. Better understanding of the links between source mass removal and contaminant mass fluxes in groundwater would allow greater acceptance of this metric in dealing with contaminated sites. Our objectives here were to show how measurements of the distribution of contaminant mass flux and the overall mass discharge emanating from the source under undisturbed groundwater conditions could be related to the processes and extent of source mass depletion. In addition, these estimates of mass discharge were sought in the application of agreed remediation targets set in terms of pumped groundwater quality from offsite wells. Results are reported from field studies conducted over a 5-year period at a brominated DNAPL (tetrabromoethane, TBA; and tribromoethene, TriBE) site located in suburban Perth, Western Australia. Groundwater fluxes (qw; L³/L²/T) and mass fluxes (Jc; M/L²/T) of dissolved brominated compounds were simultaneously estimated by deploying Passive Flux Meters (PFMs) in wells in a heterogeneous layered aquifer. PFMs were deployed in control plane (CP) wells immediately down-gradient of the source zone, before (2006) and after (2011) 69-85% of the source mass was removed, mainly by groundwater pumping from the source zone. The high-resolution (26-cm depth interval) measures of qw and Jc along the source CP allowed investigation of the DNAPL source-zone architecture and impacts of source mass removal. Comparable estimates of total mass discharge (MD; M/T) across the source zone CP reduced from 104 g day⁻¹ to 24-31 g day⁻¹ (70-77% reductions). Importantly, this mass discharge reduction was consistent with the estimated proportion of source mass remaining at the site (15-31%). That is, a linear relationship between mass discharge and source mass is suggested. 
The spatial detail of groundwater and mass flux distributions also provided further evidence of the source zone architecture and DNAPL mass depletion processes. This was especially apparent in different mass-depletion rates from distinct parts of the CP. High mass fluxes and groundwater fluxes located near the base of the aquifer dominated in terms of the dissolved mass flux in the profile, although not in terms of concentrations. Reductions observed in Jc and MD were used to better target future remedial efforts. Integration of the observations from the PFM deployments and the source mass depletion provided a basis for establishing flux-based management criteria for the site.
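The assembly of a total mass discharge from depth-discrete PFM flux measurements on a control plane can be sketched as follows; the flux values and well spacing below are invented for illustration, not data from the study:

```python
# Total mass discharge M_D = sum over control-plane cells of J_c * A_cell,
# where each PFM depth interval in a well represents one cell.
depth_interval_m = 0.26   # 26-cm PFM measurement resolution (from the study)
well_spacing_m = 2.0      # width represented by each well (assumed)
mass_flux_g_m2_day = [0.5, 1.2, 8.4, 15.0, 3.1]  # J_c per interval (made up)

area_per_interval_m2 = depth_interval_m * well_spacing_m
md_g_day = sum(j * area_per_interval_m2 for j in mass_flux_g_m2_day)
print(round(md_g_day, 2))  # → 14.66 g/day for these illustrative values
```

In the field study the same summation runs over all wells and depth intervals on the control plane, which is how the reported discharge estimates (e.g., 104 g day⁻¹ before source removal) are obtained.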

  7. Mass discharge assessment at a brominated DNAPL site: Effects of known DNAPL source mass removal.

    PubMed

    Johnston, C D; Davis, G B; Bastow, T P; Woodbury, R J; Rao, P S C; Annable, M D; Rhodes, S

    2014-08-01

Management and closure of contaminated sites is increasingly being proposed on the basis of mass flux of dissolved contaminants in groundwater. Better understanding of the links between source mass removal and contaminant mass fluxes in groundwater would allow greater acceptance of this metric in dealing with contaminated sites. Our objectives here were to show how measurements of the distribution of contaminant mass flux and the overall mass discharge emanating from the source under undisturbed groundwater conditions could be related to the processes and extent of source mass depletion. In addition, these estimates of mass discharge were sought in the application of agreed remediation targets set in terms of pumped groundwater quality from offsite wells. Results are reported from field studies conducted over a 5-year period at a brominated DNAPL (tetrabromoethane, TBA; and tribromoethene, TriBE) site located in suburban Perth, Western Australia. Groundwater fluxes (qw; L³/L²/T) and mass fluxes (Jc; M/L²/T) of dissolved brominated compounds were simultaneously estimated by deploying Passive Flux Meters (PFMs) in wells in a heterogeneous layered aquifer. PFMs were deployed in control plane (CP) wells immediately down-gradient of the source zone, before (2006) and after (2011) 69-85% of the source mass was removed, mainly by groundwater pumping from the source zone. The high-resolution (26-cm depth interval) measures of qw and Jc along the source CP allowed investigation of the DNAPL source-zone architecture and impacts of source mass removal. Comparable estimates of total mass discharge (MD; M/T) across the source zone CP reduced from 104 g day⁻¹ to 24-31 g day⁻¹ (70-77% reductions). Importantly, this mass discharge reduction was consistent with the estimated proportion of source mass remaining at the site (15-31%). That is, a linear relationship between mass discharge and source mass is suggested. 
The spatial detail of groundwater and mass flux distributions also provided further evidence of the source zone architecture and DNAPL mass depletion processes. This was especially apparent in different mass-depletion rates from distinct parts of the CP. High mass fluxes and groundwater fluxes located near the base of the aquifer dominated in terms of the dissolved mass flux in the profile, although not in terms of concentrations. Reductions observed in Jc and MD were used to better target future remedial efforts. Integration of the observations from the PFM deployments and the source mass depletion provided a basis for establishing flux-based management criteria for the site. Copyright © 2013 Elsevier B.V. All rights reserved.

  8. Degenerative meniscus: Pathogenesis, diagnosis, and treatment options

    PubMed Central

    Howell, Richard; Kumar, Neil S; Patel, Nimit; Tom, James

    2014-01-01

The symptomatic degenerative meniscus continues to be a source of discomfort for a significant number of patients. With vascular penetration of less than one-third of the adult meniscus, healing potential in the setting of chronic degeneration remains low. Continued hoop and shear stresses upon the degenerative meniscus result in gross failure, often in the form of complex tears in the posterior horn and midbody. Patient history and physical examination are critical to determine the true source of pain, particularly with the significant incidence of simultaneous articular pathology. Joint line tenderness, a positive McMurray test, and mechanical catching or locking can be highly suggestive of a meniscal source of knee pain and dysfunction. Radiographs and magnetic resonance imaging are frequently utilized to examine for osteoarthritis and to verify the presence of meniscal tears, in addition to ruling out other sources of pain. Non-operative therapy focused on non-steroidal anti-inflammatory drugs and physical therapy may be able to provide pain relief as well as improve mechanical function of the knee joint. For patients refractory to conservative therapy, arthroscopic partial meniscectomy can provide short-term gains regarding pain relief, especially when combined with an effective, regular physiotherapy program. Patients with clear mechanical symptoms and meniscal pathology may benefit from arthroscopic partial meniscectomy, but surgery is not a guaranteed success, especially with concomitant articular pathology. Ultimately, the long-term outcomes of either treatment arm provide similar results for most patients. Further study is needed on the short- and long-term outcomes of conservative and surgical therapy, with a particular focus on the economic impact of treatment. PMID:25405088

  9. Repeat immigration: A previously unobserved source of heterogeneity?

    PubMed

    Aradhya, Siddartha; Scott, Kirk; Smith, Christopher D

    2017-07-01

    Register data allow for nuanced analyses of heterogeneities between sub-groups which are not observable in other data sources. One heterogeneity for which register data is particularly useful is in identifying unique migration histories of immigrant populations, a group of interest across disciplines. Years since migration is a commonly used measure of integration in studies seeking to understand the outcomes of immigrants. This study constructs detailed migration histories to test whether misclassified migrations may mask important heterogeneities. In doing so, we identify a previously understudied group of migrants called repeat immigrants, and show that they differ systematically from permanent immigrants. In addition, we quantify the degree to which migration information is misreported in the registers. The analysis is carried out in two steps. First, we estimate income trajectories for repeat immigrants and permanent immigrants to understand the degree to which they differ. Second, we test data validity by cross-referencing migration information with changes in income to determine whether there are inconsistencies indicating misreporting. From the first part of the analysis, the results indicate that repeat immigrants systematically differ from permanent immigrants in terms of income trajectories. Furthermore, income trajectories differ based on the way in which years since migration is calculated. The second part of the analysis suggests that misreported migration events, while present, are negligible. Repeat immigrants differ in terms of income trajectories, and may differ in terms of other outcomes as well. Furthermore, this study underlines that Swedish registers provide a reliable data source to analyze groups which are unidentifiable in other data sources.

  10. An Empirical Temperature Variance Source Model in Heated Jets

    NASA Technical Reports Server (NTRS)

    Khavaran, Abbas; Bridges, James

    2012-01-01

An acoustic analogy approach is implemented that models the sources of jet noise in heated jets. The equivalent sources of turbulent mixing noise are recognized as the differences between the fluctuating and Favre-averaged Reynolds stresses and enthalpy fluxes. While in a conventional acoustic analogy only Reynolds stress components are scrutinized for their noise generation properties, it is now accepted that a comprehensive source model should include the additional entropy source term. Following Goldstein's generalized acoustic analogy, the set of Euler equations is divided into two sets of equations that govern a non-radiating base flow plus its residual components. When the base flow is considered as a locally parallel mean flow, the residual equations may be rearranged to form an inhomogeneous third-order wave equation. A general solution is subsequently written using a Green's function method, while all non-linear terms are treated as the equivalent sources of aerodynamic sound and are modeled accordingly. In a previous study, a specialized Reynolds-averaged Navier-Stokes (RANS) solver was implemented to compute the variance of thermal fluctuations that determines the enthalpy flux source strength. The main objective here is to present an empirical model capable of providing a reasonable estimate of the stagnation temperature variance in a jet. Such a model is parameterized as a function of the mean stagnation temperature gradient in the jet, and is evaluated using commonly available RANS solvers. The ensuing thermal source distribution is compared with measurements as well as computational results from a dedicated RANS solver that employs an enthalpy variance and dissipation rate model. Turbulent mixing noise predictions are presented for a wide range of jet temperature ratios from 1.0 to 3.20.

  11. Electron Energy Deposition in Atomic Nitrogen

    DTIC Science & Technology

    1987-10-06

known theoretical results, and their relative accuracy in comparison to existing measurements and calculations is given elsewhere.[20] 2.1 The Source Term...with the proper choice of parameters, reduces to well-known theoretical results.[20] Table 2 gives the parameters for collisional excitation of the...calculations of McGuire[36] and experimental measurements of Brook et al.[37] Additional theoretical and experimental results are discussed in detail elsewhere

  12. POGO-FAN: Remarkable Empirical Indicators for the Local Chemical Production of Smog- Ozone and NOx-Sensitivity of Air Parcels

    NASA Astrophysics Data System (ADS)

    Chatfield, R. B.; Browell, E. V.; Brune, W. H.; Crawford, J. H.; Esswein, R.; Fried, A.; Olson, J. R.; Shetter, R. E.; Singh, H. B.

    2006-12-01

We propose and evaluate two related and surprisingly simple empirical estimators for the local chemical production term for photochemical ozone; each uses two moderate-technology chemical measurements and a measurement of ultraviolet light. We nickname the techniques POGO-FAN: Production of Ozone by Gauging Oxidation: Formaldehyde and NO. (1) A non-linear function of a single three-factor index-variable, j (HCHO=>rads) [HCHO] [NO], seems to provide a good estimator of the largest single term in the production of smog ozone, the HOO+NO term, over a very wide range of situations. (2) By considering empirical contour plots summarizing isopleths of HOO+NO using j (HCHO=>rads) [HCHO] and [NO] separately as coordinates, we provide a slightly more complex 2-d indicator of smog ozone production that additionally allows an estimate of the NOx-sensitivity or NOx-saturation (i.e., VOC-sensitivity) of sampled air parcels. ~85 to >90 % of the variance is explained. The correspondence to "EKMA" contour plots, estimating afternoon ozone based on morning organics and NOx mixes, is not coincidental. We utilize a broad set of urban plume, regionally polluted and cleaner NASA DC-8 PBL samples from the Intercontinental Transport Experiment-North America (INTEX-NA), in which each of the variables was measured, to help establish our relationship. The estimator is described in terms of asymptotic smog photochemistry theory; primarily this suggests appropriate statistical approaches which can capture some of the complex interrelations of the lower-tropospheric smog mix through correlation of reactive mixture components. HCHO is not only an important source of HOO radicals, but, more importantly, it serves as a "gauge" of all photochemical processing of volatile organic compounds. It probably captures information related to coincident VOC sources of various compounds and parallels in photochemical processing. 
Constrained modeling of observed atmospheric concentrations suggests that the prime source of ozone from HOO+NO reaction and other peroxy radical ozone formation reactions (ROO+NO), thus all ozone production, are closely related. Additionally, modeling allows us to follow ozone production and NOx-sensitivity throughout the varying photolytic cycle.

  13. First beam measurements on the vessel for extraction and source plasma analyses (VESPA) at the Rutherford Appleton Laboratory (RAL)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lawrie, Scott R., E-mail: scott.lawrie@stfc.ac.uk; John Adams Institute for Accelerator Science, Department of Physics, University of Oxford; Faircloth, Daniel C.

    2015-04-08

In order to facilitate the testing of advanced H⁻ ion sources for the ISIS and Front End Test Stand (FETS) facilities at the Rutherford Appleton Laboratory (RAL), a Vessel for Extraction and Source Plasma Analyses (VESPA) has been constructed. This will perform the first detailed plasma measurements on the ISIS Penning-type H⁻ ion source using emission spectroscopic techniques. In addition, the 30-year-old extraction optics are re-designed from the ground up in order to fully transport the beam. Using multiple beam and plasma diagnostic devices, the ultimate aim is to improve H⁻ production efficiency and subsequent transport for either long-term ISIS user operations or high-power FETS requirements. The VESPA will also accommodate and test a new scaled-up Penning H⁻ source design. This paper details the VESPA design, construction and commissioning, as well as initial beam and spectroscopy results.

  14. The potential contribution of geothermal energy to electricity supply in Saudi Arabia

    NASA Astrophysics Data System (ADS)

    Chandrasekharam, D.; Lashin, Aref; Al Arifi, Nassir

    2016-10-01

With electricity demand increasing at 7.5% per year, a major concern for Saudi Arabia is the amount of CO₂ being emitted. The country has the potential to generate 200×10⁶ kWh from hydrothermal sources and 120×10⁶ terawatt hours from Enhanced Geothermal System (EGS) sources. In addition to electricity generation and desalination, the country has substantial sources for direct applications such as space cooling and heating, a sector that consumes 80% of the electricity generated from fossil fuels. Geothermal energy can easily offset the 17 million kWh of electricity currently used for desalination. At least a part of the 181,000 Gg of CO₂ emitted by conventional space cooling units can also be mitigated immediately through ground-source heat pump technology. Future development of EGS sources together with the wet geothermal systems will make the country stronger in terms of oil reserves saved and increased exports.

  15. Noninvasive Electromagnetic Source Imaging and Granger Causality Analysis: An Electrophysiological Connectome (eConnectome) Approach

    PubMed Central

    Sohrabpour, Abbas; Ye, Shuai; Worrell, Gregory A.; Zhang, Wenbo

    2016-01-01

Objective: Combined source imaging techniques and directional connectivity analysis can provide useful information about the underlying brain networks in a non-invasive fashion. Source imaging techniques have previously been used successfully either to determine the source of activity or to extract source time-courses for Granger causality analysis. In this work, we utilize source imaging algorithms both to find the network nodes (regions of interest) and to extract the activation time series for further Granger causality analysis. The aim of this work is to find network nodes objectively from noninvasive electromagnetic signals, extract activation time-courses, and apply Granger analysis to the extracted series to study brain networks under realistic conditions. Methods: Source imaging methods are used to identify network nodes and extract time-courses, and then Granger causality analysis is applied to delineate the directional functional connectivity of underlying brain networks. Computer simulation studies where the underlying network (nodes and connectivity pattern) is known were performed; additionally, this approach has been evaluated in partial epilepsy patients to study epilepsy networks from inter-ictal and ictal signals recorded by EEG and/or MEG. Results: Localization errors of network nodes were less than 5 mm, with normalized connectivity errors of ~20%, in estimating underlying brain networks in simulation studies. Additionally, two focal epilepsy patients were studied, and the identified nodes driving the epileptic network were concordant with clinical findings from intracranial recordings or surgical resection. Conclusion: Our study indicates that combining source imaging algorithms with Granger causality analysis can identify underlying networks precisely (both in terms of network node location and internodal connectivity). 
Significance: The combined source imaging and Granger analysis technique is an effective tool for studying normal or pathological brain conditions. PMID:27740473

  16. Noninvasive Electromagnetic Source Imaging and Granger Causality Analysis: An Electrophysiological Connectome (eConnectome) Approach.

    PubMed

    Sohrabpour, Abbas; Ye, Shuai; Worrell, Gregory A; Zhang, Wenbo; He, Bin

    2016-12-01

Combined source-imaging techniques and directional connectivity analysis can provide useful information about the underlying brain networks in a noninvasive fashion. Source-imaging techniques have previously been used successfully either to determine the source of activity or to extract source time-courses for Granger causality analysis. In this work, we utilize source-imaging algorithms both to find the network nodes [regions of interest (ROI)] and to extract the activation time series for further Granger causality analysis. The aim of this work is to find network nodes objectively from noninvasive electromagnetic signals, extract activation time-courses, and apply Granger analysis to the extracted series to study brain networks under realistic conditions. Source-imaging methods are used to identify network nodes and extract time-courses, and then Granger causality analysis is applied to delineate the directional functional connectivity of underlying brain networks. Computer simulation studies where the underlying network (nodes and connectivity pattern) is known were performed; additionally, this approach has been evaluated in partial epilepsy patients to study epilepsy networks from interictal and ictal signals recorded by EEG and/or Magnetoencephalography (MEG). Localization errors of network nodes were less than 5 mm, with normalized connectivity errors of ∼20%, in estimating underlying brain networks in simulation studies. Additionally, two focal epilepsy patients were studied, and the identified nodes driving the epileptic network were concordant with clinical findings from intracranial recordings or surgical resection. Our study indicates that combining source-imaging algorithms with Granger causality analysis can identify underlying networks precisely (both in terms of network node location and internodal connectivity). The combined source imaging and Granger analysis technique is an effective tool for studying normal or pathological brain conditions.
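As a toy illustration of the Granger step described above, a minimal bivariate test on simulated time-courses (all signals, coefficients, and the order-1, no-intercept model are invented for illustration; real analyses use the extracted source waveforms and higher model orders with significance testing):

```python
import numpy as np

# Simulate two coupled AR(1) "source time-courses" where x drives y.
rng = np.random.default_rng(1)
n = 2000
x = np.zeros(n)
y = np.zeros(n)
for t in range(1, n):
    x[t] = 0.6 * x[t - 1] + rng.standard_normal()
    y[t] = 0.5 * y[t - 1] + 0.4 * x[t - 1] + rng.standard_normal()

def resid_var(target, regressors):
    """OLS residual variance of target regressed on lagged series."""
    X = np.column_stack(regressors)
    beta, *_ = np.linalg.lstsq(X, target, rcond=None)
    return np.var(target - X @ beta)

v_r = resid_var(y[1:], [y[:-1]])          # restricted: y's own past only
v_f = resid_var(y[1:], [y[:-1], x[:-1]])  # full: plus x's past
gc_x_to_y = np.log(v_r / v_f)             # > 0: x Granger-causes y
print(gc_x_to_y > 0.05)  # → True for this simulated x -> y network
```

The directional measure is asymmetric: repeating the computation with the roles of `x` and `y` swapped yields a value near zero here, which is what lets the analysis assign directions to the internodal connections.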

  17. 77 FR 19740 - Water Sources for Long-Term Recirculation Cooling Following a Loss-of-Coolant Accident

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-02

    ... NUCLEAR REGULATORY COMMISSION [NRC-2010-0249] Water Sources for Long-Term Recirculation Cooling... Regulatory Guide (RG) 1.82, ``Water Sources for Long-Term Recirculation Cooling Following a Loss-of-Coolant... regarding the sumps and suppression pools that provide water sources for emergency core cooling, containment...

  18. Carbon and nitrogen mineralization and enzyme activities in soil aggregate-size classes: Effects of biochar, oyster shells, and polymers.

    PubMed

    Awad, Yasser Mahmoud; Lee, Sang Soo; Kim, Ki-Hyun; Ok, Yong Sik; Kuzyakov, Yakov

    2018-05-01

    Biochar (BC) and polymers are cost-effective additives for soil quality improvement and long-term sustainability. The additional use of oyster shell (OS) powder in BC- or polymer-treated soils is recommended as a nutrient source, to enhance aggregation, and to increase enzyme activities. The effects of soil treatments (i.e., BC (5 Mg ha⁻¹) and polymers (biopolymer (BP) at 0.4 Mg ha⁻¹ or polyacrylamide (PAM) at 0.4 Mg ha⁻¹), with or without OS (1%)) on short-term changes were evaluated in a 30-day incubation experiment with respect to several variables (e.g., CO₂ release, NH₄⁺ and NO₃⁻ concentrations, aggregate-size classes, and enzyme activities in an agricultural Luvisol). BC and BP with the addition of OS increased the portion of microaggregates (<0.25 mm) relative to the control soil without any additions, while PAM alone increased the portion of large macroaggregates (1-2 mm). Concentrations of NO₃⁻ also increased in soils treated with OS, OS + BC, and OS + BP as a result of the increased chitinase and leucine aminopeptidase activities. BC and BP combined with additional OS had significant short-term impacts on N mineralization without affecting C mineralization in soil. Consequently, the combination of BC or BP with OS accelerated N turnover without affecting C turnover (and related C losses) from soil. As such, these additives contributed considerably to the improvement of soil fertility and C sequestration. Copyright © 2018 Elsevier Ltd. All rights reserved.

  19. Plant-wax D/H ratios in the southern European Alps record multiple aspects of climate variability

    NASA Astrophysics Data System (ADS)

    Wirth, Stefanie B.; Sessions, Alex L.

    2016-09-01

    We present a Younger Dryas-Holocene record of the hydrogen isotopic composition of sedimentary plant waxes (δDwax) from the southern European Alps (Lake Ghirla, N-Italy) to investigate its sensitivity to climatic forcing variations in this mid-latitude region (45°N). A modern altitudinal transect of δD values of river water and leaf waxes in the Lake Ghirla catchment is used to test present-day climate sensitivity of δDwax. While we find that altitudinal effects on δDwax are minor at our study site, temperature, precipitation amount, and evapotranspiration all appear to influence δDwax to varying extents. In the lake-sediment record, δDwax values vary between -134 and -180‰ over the past 13 kyr. The long-term Holocene pattern of δDwax parallels the trend of decreasing temperature and is thus likely forced by the decline of northern hemisphere summer insolation. Shorter-term fluctuations, in contrast, may reflect both temperature and moisture-source changes. During the cool Younger Dryas and Little Ice Age (LIA) periods we observe unexpectedly high δDwax values relative to those before and after. We suggest that a change towards a more D-enriched moisture source is required during these intervals. In fact, a shift from northern N-Atlantic to southern N-Atlantic/western Mediterranean Sea sources would be consistent with a southward migration of the Westerlies with climate cooling. Prominent δDwax fluctuations in the early and middle Holocene are negative and potentially associated with temperature declines. In the late Holocene (<4 kyr BP), excursions are partly positive (as for the LIA) suggesting a stronger influence of moisture-source changes on δDwax variation. In addition to isotopic fractionations of the hydrological cycle, changes in vegetation composition, in the length of the growing season, and in snowfall amount provide additional potential sources of variability, although we cannot yet quantitatively assess these in the paleo-record. 
We conclude that while our δDwax record from the Alps does contain climatic information, it is a complicated record that would require additional constraints to be robustly interpreted. This also has important implications for other water-isotope-based proxy records of precipitation and hydro-climate from this region, such as cave speleothems.

  20. Shipboard monitoring of non-CO2 greenhouse gases in Asia and Oceania using commercial cargo vessels

    NASA Astrophysics Data System (ADS)

    Nara, H.; Tanimoto, H.; Mukai, H.; Nojiri, Y.; Tohjima, Y.; Machida, T.; Hashimoto, S.

    2011-12-01

    The National Institute for Environmental Studies (NIES) has been conducting a long-term program for monitoring trace gases of atmospheric importance over the Pacific Ocean since 1995. The NIES Voluntary Observing Ships (NIES-VOS) program currently makes use of commercial cargo vessels because they operate regularly over fixed routes for long periods and sail over a wide area between various ports (e.g., between Japan and the United States, between Japan and Australia/New Zealand, and between Japan and southeast Asia). This program allows systematic and continuous measurements of non-CO2 greenhouse gases, providing long-term datasets for background air over the Pacific Ocean and regionally polluted air around east Asia. We observe both long-lived greenhouse gases (e.g., carbon dioxide) and short-lived air pollutants (e.g., tropospheric ozone, carbon monoxide) on a continuous basis. Flask samples are collected for later laboratory analysis of carbon dioxide, methane, nitrous oxide, and carbon monoxide using gas chromatographic techniques. In addition, we recently installed cavity ring-down spectrometers for high-resolution measurement of methane and carbon dioxide to capture their highly variable features in regionally polluted air around southeast Asia (e.g., Hong Kong, Thailand, Singapore, Malaysia, Indonesia, and the Philippines), which is now thought to be a large source due to expanding socioeconomic activities as well as biomass burning. Contrasting the Japan-Australia/New Zealand and Japan-southeast Asia cruises revealed regional characteristics of sources and sinks of these atmospherically important species, suggesting the existence of additional sources of methane, nitrous oxide, and carbon monoxide in this tropical Asian region.

  1. Water and rock geochemistry, geologic cross sections, geochemical modeling, and groundwater flow modeling for identifying the source of groundwater to Montezuma Well, a natural spring in central Arizona

    USGS Publications Warehouse

    Johnson, Raymond H.; DeWitt, Ed; Wirt, Laurie; Arnold, L. Rick; Horton, John D.

    2011-01-01

    The National Park Service (NPS) seeks additional information to better understand the source(s) of groundwater and associated groundwater flow paths to Montezuma Well in Montezuma Castle National Monument, central Arizona. The source of water to Montezuma Well, a flowing sinkhole in a desert setting, is poorly understood. Water emerges from the middle limestone facies of the lacustrine Verde Formation, but the precise origin of the water and its travel path are largely unknown. Some have proposed artesian flow to Montezuma Well through the Supai Formation, which is exposed along the eastern margin of the Verde Valley and underlies the Verde Formation. The groundwater recharge zone likely lies above the floor of the Verde Valley somewhere to the north or east of Montezuma Well, where precipitation is more abundant. Additional data from groundwater, surface water, and bedrock geology are required for Montezuma Well and the surrounding region to test the current conceptual ideas, to provide new details on the groundwater flow in the area, and to assist in future management decisions. The results of this research will provide information for long-term water resource management and the protection of water rights.

  2. Long-period noise source inversion in a 3-D heterogeneous Earth

    NASA Astrophysics Data System (ADS)

    Sager, K.; Ermert, L. A.; Afanasiev, M.; Boehm, C.; Fichtner, A.

    2017-12-01

    We have implemented a new method for ambient noise source inversion that fully honors finite-frequency wave propagation and 3-D heterogeneous Earth structure. Here, we present results of its first application to the Earth's long-period background signal, the hum, in a period band of around 120-300 s. In addition to being a computationally convenient test case, the hum is also the topic of ongoing research in its own right, because different physical mechanisms have been proposed for its excitation. The broad patterns of this model for Southern and Northern Hemisphere winter are qualitatively consistent with previous long-term studies of the hum sources; however, thanks to methodological improvements, the iterative refinement, and the use of a comparatively extensive dataset, we retrieve a more detailed model in certain locations. In particular, our results support findings that the dominant hum sources are focused along coasts and shelf areas, particularly in the Northern Hemisphere winter, with a possible though not well-constrained contribution of pelagic sources. Additionally, our findings indicate that hum source locations in the ocean, tentatively linked to locally high bathymetry, are important contributors particularly during Southern Hemisphere winter. These results, in conjunction with synthetic recovery tests and observed cross-correlation waveforms, suggest that hum sources are rather narrowly concentrated in space, with length scales on the order of a few hundred kilometers. Future work includes the extension of the model to the spring and fall seasons and to shorter periods, as well as its use in full-waveform ambient noise inversion for 3-D Earth structure.

  3. Matrix effect and recovery terminology issues in regulated drug bioanalysis.

    PubMed

    Huang, Yong; Shi, Robert; Gee, Winnie; Bonderud, Richard

    2012-02-01

    Understanding the meaning of the terms used in the bioanalytical method validation guidance is essential for practitioners to implement best practice. However, terms with several meanings or differing interpretations exist within bioanalysis, and this may give rise to differing practices. In this perspective we discuss an important but often confusing term in regulated drug bioanalysis: 'matrix effect' (ME). The ME can be interpreted either as the ionization change or as the measurement bias of the method caused by the nonanalyte matrix. This definitional dilemma makes evaluation of the ME challenging. The matrix factor is currently used as a standard method for evaluating ionization changes caused by the matrix in MS-based methods. Standard additions to pre-extraction samples have been suggested for evaluating the overall effect of a matrix from different sources on the analytical system, because this approach covers both ionization variation and extraction recovery variation. We also provide our personal views on the term 'recovery'.
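    To make the matrix-factor evaluation concrete, the sketch below computes the matrix factor as the ratio of the analyte response in post-extraction spiked matrix to the response in neat solution, along with an internal-standard-normalized version; the peak areas are hypothetical:

```python
# Matrix factor (MF): analyte response in post-extraction spiked matrix
# divided by the response in neat solution; MF far from 1 signals
# ionization suppression (<1) or enhancement (>1).
def matrix_factor(area_in_matrix, area_in_neat):
    return area_in_matrix / area_in_neat

def is_normalized_mf(mf_analyte, mf_internal_std):
    # a co-eluting internal standard cancels matrix effects it shares
    # with the analyte
    return mf_analyte / mf_internal_std

# hypothetical peak areas from six matrix lots, plus the neat-solution area
matrix_areas = [8.1e5, 7.9e5, 8.4e5, 7.6e5, 8.0e5, 8.3e5]
neat_area = 1.0e6
mfs = [matrix_factor(a, neat_area) for a in matrix_areas]
print(min(mfs), max(mfs))  # all below 1: mild ionization suppression
```

    Note that the MF quantifies only the ionization change; the standard-addition approach mentioned above additionally folds in extraction recovery.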

  4. Performance Analysis of Physical Layer Security of Opportunistic Scheduling in Multiuser Multirelay Cooperative Networks

    PubMed Central

    Shim, Kyusung; Do, Nhu Tri; An, Beongku

    2017-01-01

    In this paper, we study the physical layer security (PLS) of opportunistic scheduling for uplink scenarios of multiuser multirelay cooperative networks. To this end, we propose a low-complexity source relay selection scheme with comparable secrecy performance, called the proposed source relay selection (PSRS) scheme. Specifically, the PSRS scheme first selects the least vulnerable source and then selects the relay that maximizes the system secrecy capacity for the given selected source. Additionally, the maximal ratio combining (MRC) and selection combining (SC) techniques are each considered at the eavesdropper. Investigating the system performance in terms of secrecy outage probability (SOP), closed-form expressions of the SOP are derived. The developed analysis is corroborated through Monte Carlo simulation. Numerical results show that the PSRS scheme significantly improves the security of the system compared to the random source relay selection scheme, but does not outperform the optimal joint source relay selection (OJSRS) scheme. However, the PSRS scheme drastically reduces the required amount of channel state information (CSI) estimation compared to that required by the OJSRS scheme, especially in dense cooperative networks. PMID:28212286
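    The secrecy outage probability itself is straightforward to estimate by Monte Carlo, which is how analyses like this are typically corroborated. The sketch below uses a generic single-link Rayleigh-fading setup with assumed average SNRs, not the paper's multiuser multirelay model, and estimates the probability that the secrecy capacity falls below a target secrecy rate:

```python
import numpy as np

# Monte Carlo secrecy outage probability for one Rayleigh-fading main link
# and one eavesdropper link; the average SNRs and target rate are assumed
# illustrative values, not taken from the paper.
rng = np.random.default_rng(1)
n = 200_000
gamma_m = rng.exponential(10.0, n)   # instantaneous main-channel SNR
gamma_e = rng.exponential(2.0, n)    # instantaneous eavesdropper SNR
c_s = np.maximum(0.0, np.log2(1.0 + gamma_m) - np.log2(1.0 + gamma_e))
r_s = 1.0                            # target secrecy rate (bit/s/Hz)
sop = np.mean(c_s < r_s)             # fraction of fades below the target
print(round(float(sop), 3))
```

    A selection scheme such as PSRS effectively improves the distribution of gamma_m (and the relative quality versus gamma_e), which lowers the estimated SOP.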

  5. Radiological analysis of plutonium glass batches with natural/enriched boron

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rainisch, R.

    2000-06-22

    The disposition of surplus plutonium inventories by the US Department of Energy (DOE) includes the immobilization of certain plutonium materials in a borosilicate glass matrix, also referred to as vitrification. This paper addresses source terms of plutonium masses immobilized in a borosilicate glass matrix where the glass components include both natural boron and enriched boron. The calculated source terms pertain to neutron and gamma source strength (particles per second) and source spectrum changes. The calculated source terms corresponding to natural boron and enriched boron are compared to determine the benefit (decrease in radiation source terms) of using enriched boron. The analysis of plutonium glass source terms shows that a large component of the neutron source terms is due to (α, n) reactions. The americium-241 and plutonium present in the glass emit alpha particles (α). These alpha particles interact with low-Z nuclides like B-11, B-10, and O-17 in the glass to produce neutrons; the low-Z nuclides are referred to as target particles. The reference glass contains 9.4 wt percent B₂O₃. Boron-11 was found to strongly support the (α, n) reactions in the glass matrix. B-11 has a natural abundance of over 80 percent. The (α, n) reaction rates for B-10 are lower than for B-11, and the analysis shows that the plutonium glass neutron source terms can be reduced by artificially enriching natural boron with B-10. The natural abundance of B-10 is 19.9 percent. Boron enriched to 96 wt percent B-10 or above can be obtained commercially. Since lower source terms imply lower dose rates to radiation workers handling the plutonium glass materials, it is important to know the achievable decrease in source terms as a result of boron enrichment. Plutonium materials are normally handled in glove boxes with shielded glass windows, and the work entails both extremity and whole-body exposures. Lowering the source terms of the plutonium batches will make the handling of these materials less difficult and will reduce radiation exposure to operating workers.
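    The enrichment benefit can be illustrated with a back-of-the-envelope mixture calculation. The per-isotope relative (α, n) yields below are illustrative placeholders (the abstract states only that B-10's yield is lower than B-11's), so the computed ratio is schematic, not the paper's result:

```python
# Weighted per-isotope (alpha, n) contribution of the boron in the glass.
# y_b10 and y_b11 are illustrative relative yields (placeholders); the
# abstract states only that B-10's (alpha, n) rate is lower than B-11's.
def relative_yield(frac_b10, y_b10=0.3, y_b11=1.0):
    return frac_b10 * y_b10 + (1.0 - frac_b10) * y_b11

natural = relative_yield(0.199)   # natural boron: 19.9% B-10
enriched = relative_yield(0.96)   # commercially available: 96% B-10
print(round(enriched / natural, 2))  # fraction of the boron (alpha, n)
                                     # source term remaining after enrichment
```

    Any real assessment would, of course, use measured thick-target (α, n) yields for each isotope and include the non-boron target nuclides such as O-17.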

  6. Topological Hall effect in diffusive ferromagnetic thin films with spin-flip scattering

    DOE PAGES

    Zhang, Steven S. -L.; Heinonen, Olle

    2018-04-02

    In this paper, we study the topological Hall (TH) effect in a diffusive ferromagnetic metal thin film by solving a Boltzmann transport equation in the presence of spin-flip scattering. A generalized spin-diffusion equation is derived which contains an additional source term associated with the gradient of the emergent magnetic field that arises from skyrmions. Because of the source term, spin accumulation may build up in the vicinity of the skyrmions. This gives rise to a spin-polarized diffusion current that in general suppresses the bulk TH current. Only when the spin-diffusion length is much smaller than the skyrmion size does the TH resistivity approach the value derived by Bruno et al. [Phys. Rev. Lett. 93, 096806 (2004)]. Finally, we derive a general expression of the TH resistivity that applies to thin-film geometries with spin-flip scattering, and show that the corrections to the TH resistivity become large when the size of room-temperature skyrmions is further reduced to tens of nanometers.
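    Schematically (our notation, not the paper's exact equation), a spin-diffusion equation of the kind described would read, with μs the spin accumulation, λsf the spin-diffusion length, and b_z^em the emergent magnetic field of the skyrmion texture:

```latex
% Schematic only: C bundles material parameters (conductivity, spin
% polarization, etc.), and \hat{\jmath} is the charge-current direction.
% The exact prefactor and tensor structure are derived in the paper.
\nabla^{2}\mu_{s} \;-\; \frac{\mu_{s}}{\lambda_{sf}^{2}}
  \;=\; C\,\hat{\jmath}\cdot\boldsymbol{\nabla}\, b_{z}^{\mathrm{em}}
```

    The limit statement in the abstract then reads naturally: when λsf is much smaller than the skyrmion size, the diffusive correction stays confined and the Bruno et al. value of the TH resistivity is recovered.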

  7. Topological Hall effect in diffusive ferromagnetic thin films with spin-flip scattering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Steven S. -L.; Heinonen, Olle

    In this paper, we study the topological Hall (TH) effect in a diffusive ferromagnetic metal thin film by solving a Boltzmann transport equation in the presence of spin-flip scattering. A generalized spin-diffusion equation is derived which contains an additional source term associated with the gradient of the emergent magnetic field that arises from skyrmions. Because of the source term, spin accumulation may build up in the vicinity of the skyrmions. This gives rise to a spin-polarized diffusion current that in general suppresses the bulk TH current. Only when the spin-diffusion length is much smaller than the skyrmion size does the TH resistivity approach the value derived by Bruno et al. [Phys. Rev. Lett. 93, 096806 (2004)]. Finally, we derive a general expression of the TH resistivity that applies to thin-film geometries with spin-flip scattering, and show that the corrections to the TH resistivity become large when the size of room-temperature skyrmions is further reduced to tens of nanometers.

  8. The solution of three-variable duct-flow equations

    NASA Technical Reports Server (NTRS)

    Stuart, A. R.; Hetherington, R.

    1974-01-01

    This paper establishes a numerical method for the solution of three-variable problems, applied here to rotational flows through ducts of various cross sections. An iterative scheme is developed, the main feature of which is the addition of a duplicate variable to the forward component of velocity. Two forward components of velocity result from integrating two sets of first-order ordinary differential equations for the streamline curvatures, in intersecting directions across the duct. Two pseudo-continuity equations are introduced with source/sink terms, whose strengths are dependent on the difference between the forward components of velocity. When convergence is obtained, the two forward components of velocity are identical, the source/sink terms are zero, and the original equations are satisfied. A computer program solves the exact equations and boundary conditions numerically. The method is economical and compares successfully with experiments on bent ducts of circular and rectangular cross section where secondary flows are caused by gradients of total pressure upstream.
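    The defining feature of the scheme, a duplicate velocity variable with a source/sink that vanishes at convergence, can be caricatured in a few lines. The toy below is our own schematic with an arbitrary relaxation factor k; the real method integrates streamline-curvature equations rather than applying a simple relaxation:

```python
import numpy as np

# Caricature of the duplicate-variable iteration: u1 and u2 are two
# estimates of the forward velocity, and the pseudo-continuity source/sink
# s = k*(u1 - u2) drives them together; at convergence s vanishes and the
# original equations are recovered. k is an arbitrary relaxation factor.
rng = np.random.default_rng(2)
u1 = rng.uniform(0.5, 1.5, 32)   # sweep in one direction across the duct
u2 = rng.uniform(0.5, 1.5, 32)   # sweep in the intersecting direction
k = 0.2
for _ in range(200):
    s = k * (u1 - u2)            # source/sink strength
    u1 = u1 - s
    u2 = u2 + s
print(bool(np.max(np.abs(u1 - u2)) < 1e-9))  # True: the fields coincide
```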

  9. On recontamination and directional-bias problems in Monte Carlo simulation of PDF turbulence models

    NASA Technical Reports Server (NTRS)

    Hsu, Andrew T.

    1991-01-01

    Turbulent combustion cannot be simulated adequately by conventional moment-closure turbulence models. The difficulty lies in the fact that the reaction rate is in general an exponential function of the temperature, and the higher-order correlations in conventional moment-closure models of the chemical source term cannot be neglected, making the application of such models impractical. The probability density function (pdf) method offers an attractive alternative: in a pdf model, the chemical source terms are closed and do not require additional models. A grid-dependent Monte Carlo scheme was studied, since it is a logical alternative wherein the number of computer operations increases only linearly with the number of independent variables, compared to the exponential increase in a conventional finite-difference scheme. A new algorithm was devised that satisfies a conservation restriction in the case of pure diffusion or uniform flow problems. Although absolute conservation seems impossible for nonuniform flows, the present scheme has reduced the error considerably.

  10. Topological Hall effect in diffusive ferromagnetic thin films with spin-flip scattering

    NASA Astrophysics Data System (ADS)

    Zhang, Steven S.-L.; Heinonen, Olle

    2018-04-01

    We study the topological Hall (TH) effect in a diffusive ferromagnetic metal thin film by solving a Boltzmann transport equation in the presence of spin-flip scattering. A generalized spin-diffusion equation is derived which contains an additional source term associated with the gradient of the emergent magnetic field that arises from skyrmions. Because of the source term, spin accumulation may build up in the vicinity of the skyrmions. This gives rise to a spin-polarized diffusion current that in general suppresses the bulk TH current. Only when the spin-diffusion length is much smaller than the skyrmion size does the TH resistivity approach the value derived by Bruno et al. [Phys. Rev. Lett. 93, 096806 (2004), 10.1103/PhysRevLett.93.096806]. We derive a general expression of the TH resistivity that applies to thin-film geometries with spin-flip scattering, and show that the corrections to the TH resistivity become large when the size of room temperature skyrmions is further reduced to tens of nanometers.

  11. Fission Product Appearance Rate Coefficients in Design Basis Source Term Determinations - Past and Present

    NASA Astrophysics Data System (ADS)

    Perez, Pedro B.; Hamawi, John N.

    2017-09-01

    Nuclear power plant radiation protection design features are based on radionuclide source terms derived from conservative assumptions that envelope expected operating experience. Two parameters that significantly affect the radionuclide concentrations in the source term are the failed-fuel fraction and the effective fission product appearance rate coefficients. The failed-fuel fraction may be a regulatory assumption, as in the U.S. Appearance rate coefficients are not specified in regulatory requirements but have been referenced to experimental data that is over 50 years old. The source terms are no doubt conservative, as demonstrated by operating experience that has included failed fuel, but they may be too conservative, leading, for example, to over-designed shielding for normal operations. Design basis source term methodologies for normal operations had not advanced until EPRI published an updated ANSI/ANS 18.1 source term basis document in 2015. Our paper revisits the fission product appearance rate coefficients as applied in the derivation of source terms following the original U.S. NRC NUREG-0017 methodology. New coefficients have been calculated based on recent EPRI results, which demonstrate the conservatism in nuclear power plant shielding design.

  12. New VLBI2010 scheduling strategies and implications on the terrestrial reference frames.

    PubMed

    Sun, Jing; Böhm, Johannes; Nilsson, Tobias; Krásná, Hana; Böhm, Sigrid; Schuh, Harald

    In connection with the work for the next generation VLBI2010 Global Observing System (VGOS) of the International VLBI Service for Geodesy and Astrometry, a new scheduling package (Vie_Sched) has been developed at the Vienna University of Technology as a part of the Vienna VLBI Software. In addition to the classical station-based approach it is equipped with a new scheduling strategy based on the radio sources to be observed. We introduce different configurations of source-based scheduling options and investigate the implications on present and future VLBI2010 geodetic schedules. By comparison to existing VLBI schedules of the continuous campaign CONT11, we find that the source-based approach with two sources has a performance similar to the station-based approach in terms of number of observations, sky coverage, and geodetic parameters. For an artificial 16 station VLBI2010 network, the source-based approach with four sources provides an improved distribution of source observations on the celestial sphere. Monte Carlo simulations yield slightly better repeatabilities of station coordinates with the source-based approach with two sources or four sources than the classical strategy. The new VLBI scheduling software with its alternative scheduling strategy offers a promising option with respect to applications of the VGOS.

  13. New VLBI2010 scheduling strategies and implications on the terrestrial reference frames

    NASA Astrophysics Data System (ADS)

    Sun, Jing; Böhm, Johannes; Nilsson, Tobias; Krásná, Hana; Böhm, Sigrid; Schuh, Harald

    2014-05-01

    In connection with the work for the next generation VLBI2010 Global Observing System (VGOS) of the International VLBI Service for Geodesy and Astrometry, a new scheduling package (Vie_Sched) has been developed at the Vienna University of Technology as a part of the Vienna VLBI Software. In addition to the classical station-based approach it is equipped with a new scheduling strategy based on the radio sources to be observed. We introduce different configurations of source-based scheduling options and investigate the implications on present and future VLBI2010 geodetic schedules. By comparison to existing VLBI schedules of the continuous campaign CONT11, we find that the source-based approach with two sources has a performance similar to the station-based approach in terms of number of observations, sky coverage, and geodetic parameters. For an artificial 16 station VLBI2010 network, the source-based approach with four sources provides an improved distribution of source observations on the celestial sphere. Monte Carlo simulations yield slightly better repeatabilities of station coordinates with the source-based approach with two sources or four sources than the classical strategy. The new VLBI scheduling software with its alternative scheduling strategy offers a promising option with respect to applications of the VGOS.

  14. 26 CFR 1.737-3 - Basis adjustments; Recovery rules.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... Properties A1, A2, and A3 is long-term, U.S.-source capital gain or loss. The character of gain on Property A4 is long-term, foreign-source capital gain. B contributes Property B, nondepreciable real property...-term, foreign-source capital gain ($3,000 total gain under section 737 × $2,000 net long-term, foreign...

  15. 26 CFR 1.737-3 - Basis adjustments; Recovery rules.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... Properties A1, A2, and A3 is long-term, U.S.-source capital gain or loss. The character of gain on Property A4 is long-term, foreign-source capital gain. B contributes Property B, nondepreciable real property...-term, foreign-source capital gain ($3,000 total gain under section 737 × $2,000 net long-term, foreign...

  16. 26 CFR 1.737-3 - Basis adjustments; Recovery rules.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... Properties A1, A2, and A3 is long-term, U.S.-source capital gain or loss. The character of gain on Property A4 is long-term, foreign-source capital gain. B contributes Property B, nondepreciable real property...-term, foreign-source capital gain ($3,000 total gain under section 737 × $2,000 net long-term, foreign...

  17. 26 CFR 1.737-3 - Basis adjustments; Recovery rules.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... Properties A1, A2, and A3 is long-term, U.S.-source capital gain or loss. The character of gain on Property A4 is long-term, foreign-source capital gain. B contributes Property B, nondepreciable real property...-term, foreign-source capital gain ($3,000 total gain under section 737 × $2,000 net long-term, foreign...

  18. 26 CFR 1.737-3 - Basis adjustments; Recovery rules.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... Properties A1, A2, and A3 is long-term, U.S.-source capital gain or loss. The character of gain on Property A4 is long-term, foreign-source capital gain. B contributes Property B, nondepreciable real property...-term, foreign-source capital gain ($3,000 total gain under section 737 × $2,000 net long-term, foreign...

  19. Open-Source Assisted Laboratory Automation through Graphical User Interfaces and 3D Printers: Application to Equipment Hyphenation for Higher-Order Data Generation.

    PubMed

    Siano, Gabriel G; Montemurro, Milagros; Alcaráz, Mirta R; Goicoechea, Héctor C

    2017-10-17

    Higher-order data generation implies some automation challenges, which are mainly related to the hidden programming languages and electronic details of the equipment. When techniques and/or equipment hyphenation is the key to obtaining higher-order data, the required simultaneous control of the instruments demands funds for new hardware, software, and licenses, in addition to very skilled operators. In this work, we present Design of Inputs-Outputs with Sikuli (DIOS), a free and open-source program that provides a general framework for the design of automated experimental procedures without prior knowledge of programming or electronics. Basically, instruments and devices are considered nodes in a network, and every node is associated with both physical and virtual inputs and outputs. Virtual components, such as the graphical user interfaces (GUIs) of equipment, are handled by means of the image-recognition tools provided by the Sikuli scripting language, while their physical counterparts are handled using an adapted open-source three-dimensional (3D) printer. Two previously reported experiments from our research group, involving fluorescence matrices derived from kinetics and high-performance liquid chromatography, were adapted to be carried out in a more automated fashion. Satisfactory results were obtained in terms of analytical performance. Similarly, the advantages of open-source tools could be appreciated, mainly in terms of reduced operator intervention and cost savings.
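    A minimal sketch of the node abstraction described above, with names of our own choosing rather than DIOS's actual API: each instrument is a node whose output can be wired to downstream nodes' inputs, so a hyphenated experiment becomes a small directed network:

```python
# Hypothetical node abstraction (names are ours, not DIOS's API): every
# node wraps an action and forwards its output to downstream nodes, so a
# hyphenated experiment is a small directed network of inputs/outputs.
class Node:
    def __init__(self, name, action):
        self.name = name
        self.action = action        # callable: input value -> output value
        self.downstream = []

    def connect(self, other):
        self.downstream.append(other)

    def fire(self, value):
        out = self.action(value)
        results = {self.name: out}
        for node in self.downstream:
            results.update(node.fire(out))   # propagate through the network
        return results

pump = Node("pump", lambda flow: flow)                # passes set flow rate
detector = Node("detector", lambda flow: 2.0 * flow)  # fake detector gain
pump.connect(detector)
print(pump.fire(1.5))  # {'pump': 1.5, 'detector': 3.0}
```

    In DIOS itself, the "virtual" actions would be Sikuli image-recognition clicks on an instrument's GUI, and the "physical" ones would be movements of the adapted 3D printer.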

  20. Path spectra derived from inversion of source and site spectra for earthquakes in Southern California

    NASA Astrophysics Data System (ADS)

    Klimasewski, A.; Sahakian, V. J.; Baltay, A.; Boatwright, J.; Fletcher, J. B.; Baker, L. M.

    2017-12-01

    A large source of epistemic uncertainty in Ground Motion Prediction Equations (GMPEs) derives from the path term, currently represented as a simple geometric-spreading and intrinsic-attenuation term. Including additional physical relationships between the path properties and predicted ground motions would produce more accurate and precise, region-specific GMPEs by reclassifying some of the random, aleatory uncertainty as epistemic. This study focuses on regions of Southern California, using data from the Anza network and the Southern California Seismic Network to create a catalog of events of magnitude 2.5 and larger from 1998 to 2016. The catalog encompasses regions of varying geology and therefore varying path and site attenuation. Within this catalog, we investigate several collections of event-region-to-station pairs, each of which shares similar origin locations and stations so that all events have similar paths. Compared with a simple regional GMPE, these paths consistently have high or low residuals. By working with events that have the same path, we can isolate source and site effects, and focus on the remaining residual as path effects. We decompose the recordings into source and site spectra for each unique event and site in our greater Southern California regional database using the inversion method of Andrews (1986). This model represents each natural-log record spectrum as the sum of its natural-log event and site spectra, while constraining each record to a reference site or Brune source spectrum. We estimate a regional, path-specific anelastic attenuation (Q) and site attenuation (t*) from the inversion site spectra, and corner frequency from the inversion event spectra. We then compute the residuals between the observed record data and the inversion model prediction (event × site spectra). This residual is representative of path effects, likely anelastic attenuation along the path that varies from the regional median attenuation.
We examine the residuals for our different sets independently to see how path terms differ between event-to-station collections. The path-specific information gained can inform the development of path terms for regional GMPEs through an improved understanding of these seismological phenomena.
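The Andrews (1986)-style decomposition described in this record can be sketched as a linear least-squares problem. The sketch below is illustrative, using synthetic spectra and a zero-valued reference-site constraint as a stand-in for the authors' actual constraint choices.

```python
import numpy as np

# Each log record spectrum (at one frequency) is modeled as the sum of an
# event term and a site term, with one reference site constrained to zero.
rng = np.random.default_rng(0)
n_events, n_sites = 5, 4
true_event = rng.normal(0.0, 1.0, n_events)
true_site = rng.normal(0.0, 0.5, n_sites)
true_site[0] = 0.0                      # reference-site constraint

# Synthetic log record spectra for every event/site pair.
log_records = true_event[:, None] + true_site[None, :]

# Build the linear system G m = d, with m = [event terms, site terms].
rows, d = [], []
for i in range(n_events):
    for j in range(n_sites):
        g = np.zeros(n_events + n_sites)
        g[i] = 1.0                      # event term
        g[n_events + j] = 1.0           # site term
        rows.append(g)
        d.append(log_records[i, j])
# Constraint row: site 0 is the reference (site term = 0).
g = np.zeros(n_events + n_sites)
g[n_events] = 1.0
rows.append(g)
d.append(0.0)

m, *_ = np.linalg.lstsq(np.array(rows), np.array(d), rcond=None)
event_terms, site_terms = m[:n_events], m[n_events:]
residuals = log_records - (event_terms[:, None] + site_terms[None, :])
# With noise-free synthetic data the residuals are ~0; with real records
# they carry the path effects discussed above.
```

The reference-site constraint removes the usual trade-off ambiguity (adding a constant to all event terms and subtracting it from all site terms), which is what makes the system uniquely solvable.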

  1. Human-Induced Long-Term Shifts in Gull Diet from Marine to Terrestrial Sources in North America's Coastal Pacific: More Evidence from More Isotopes (δ2H, δ34S).

    PubMed

    Hobson, Keith A; Blight, Louise K; Arcese, Peter

    2015-09-15

    Measurements of naturally occurring stable isotopes in the tissues of seabirds and their prey are a powerful tool for investigating long-term changes in marine foodwebs. Recent isotopic (δ(15)N, δ(13)C) evidence from feathers of Glaucous-winged Gulls (Larus glaucescens) has shown that over the last 150 years, this species shifted from a midtrophic marine diet to one including lower-trophic marine prey and/or more terrestrial or freshwater foods. However, long-term isotopic patterns of δ(15)N and δ(13)C cannot distinguish the relative importance of lower-trophic-level marine foods from that of terrestrial sources. We examined 48 feather stable-hydrogen (δ(2)H) and -sulfur (δ(34)S) isotope values from this same 150-year feather set and found additional isotopic evidence supporting the hypothesis that gulls shifted to terrestrial and/or freshwater prey. Mean feather δ(2)H and δ(34)S values (± SD) declined from -2.5 ± 21.4 ‰ and 18.9 ± 2.7 ‰, respectively, in the earliest period (1860-1915; n = 12) to -35.5 ± 15.5 ‰ and 14.8 ± 2.4 ‰, respectively, in the period 1980-2009 (n = 12). We estimated an increase of ∼30% in dependence on terrestrial/freshwater sources. These results are consistent with the hypothesis that gulls increased terrestrial food inputs in response to declining forage fish availability.

  2. Bayesian estimation of a source term of radiation release with approximately known nuclide ratios

    NASA Astrophysics Data System (ADS)

    Tichý, Ondřej; Šmídl, Václav; Hofman, Radek

    2016-04-01

    We are concerned with estimation of a source term in the case of an accidental release from a known location, e.g. a power plant. The source term of an accidental release of radiation usually comprises a mixture of nuclides. Gamma dose rate measurements do not provide direct information on the source term composition. However, the physical properties of the respective nuclides (deposition properties, decay half-life) can be exploited when uncertain information on nuclide ratios is available, e.g. from a known reactor inventory. The proposed method is based on a linear inverse model in which the observation vector y arises as a linear combination y = Mx of a source-receptor-sensitivity (SRS) matrix M and the source term x. The task is to estimate the unknown source term x. The problem is ill-conditioned, and regularization is needed to obtain a reasonable solution. In this contribution, we assume that the nuclide ratios of the release are known with some degree of uncertainty. This knowledge is used to form the prior covariance matrix of the source term x. Due to uncertainty in the ratios, the diagonal elements of the covariance matrix are considered unknown. Positivity of the source term estimate is guaranteed by using a multivariate truncated Gaussian distribution. Following a Bayesian approach, we estimate all parameters of the model from the data, so that y, M, and the known ratios are the only inputs of the method. Since exact inference in the model is intractable, we follow the Variational Bayes method, yielding an iterative algorithm for estimation of all model parameters. Performance of the method is studied on a simulated 6-hour power plant release in which 3 nuclides are released and 2 nuclide ratios are approximately known. A comparison with a method using unknown nuclide ratios demonstrates the usefulness of the proposed approach.
This research is supported by EEA/Norwegian Financial Mechanism under project MSMT-28477/2014 Source-Term Determination of Radionuclide Releases by Inverse Atmospheric Dispersion Modelling (STRADI).
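A minimal sketch of the underlying linear inverse problem y = Mx, assuming a Gaussian prior built from an approximately known ratio pattern. This is not the authors' Variational Bayes algorithm: the truncated-Gaussian positivity constraint is omitted, and all dimensions and numbers are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
n_obs, n_src = 30, 12
M = rng.uniform(0.0, 1.0, (n_obs, n_src))   # source-receptor-sensitivity matrix
x_true = np.abs(rng.normal(1.0, 0.5, n_src))
y = M @ x_true + rng.normal(0.0, 0.05, n_obs)

# Prior: mean follows the assumed ratio pattern, variance reflects how
# uncertain each ratio is (larger variance = weaker constraint).
ratio_prior = np.ones(n_src)            # e.g. an expected relative release profile
prior_var = np.full(n_src, 4.0)
sigma = 0.05                            # assumed observation noise std

# MAP estimate for a Gaussian likelihood and prior: solve the augmented system
#   [ M / sigma  ] x ≈ [ y / sigma        ]
#   [ P^(-1/2)   ]     [ P^(-1/2) * m0    ]
A = np.vstack([M / sigma, np.diag(1.0 / np.sqrt(prior_var))])
b = np.concatenate([y / sigma, ratio_prior / np.sqrt(prior_var)])
x_map, *_ = np.linalg.lstsq(A, b, rcond=None)
# The paper additionally enforces x >= 0 via a truncated Gaussian posterior
# and estimates the prior variances themselves; both steps are skipped here.
```

The augmented-rows trick is a standard way to fold a Gaussian prior into an ordinary least-squares solve, which is why it serves as a readable stand-in for the full Bayesian treatment.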

  3. Establishing Standards on Colors from Natural Sources.

    PubMed

    Simon, James E; Decker, Eric A; Ferruzzi, Mario G; Giusti, M Monica; Mejia, Carla D; Goldschmidt, Mark; Talcott, Stephen T

    2017-11-01

    Color additives are applied to many food, drug, and cosmetic products. With up to 85% of consumer buying decisions potentially influenced by color, appropriate application of color additives and their safety are critical. Color additives are defined by the U.S. Federal Food, Drug, and Cosmetic Act (FD&C Act) as any dye, pigment, or substance that can impart color to a food, drug, or cosmetic or to the human body. Under current U.S. Food and Drug Administration (FDA) regulations, colors fall into 2 categories: those subject to an FDA certification process and those exempt from certification, often referred to as "natural" colors by consumers because they are sourced from plants, minerals, and animals. Certified colors have been used for decades in food and beverage products, but consumer interest in natural colors is now driving market applications. However, the popularity of natural colors has also opened a door to both unintentional and intentional economic adulteration. Whereas FDA certification of synthetic dyes and lakes involves strict quality control, natural colors are not evaluated by the FDA and often lack clear definitions and industry-accepted quality and safety specifications. A significant risk of adulteration of natural colors exists, ranging from simple misbranding or misuse of the term "natural" on a product label to potentially serious cases of physical, chemical, and/or microbial contamination from raw material sources, improper processing methods, or intentional postproduction adulteration. Consistent industry-wide safety standards are needed to address the manufacturing, processing, application, and international trade of colors from natural sources to ensure quality and safety throughout the supply chain. © 2017 Institute of Food Technologists®.

  4. Effect of different concentrations of dl-isoleucine, dl-valine, and dl-alanine on growth and sporulation in Fusarium oxysporum f. udum (Butl.) Sn. et H.

    PubMed

    Prasad, M; Chaudhary, S K

    1977-01-01

    Dl-alanine and dl-valine, when added as extra nitrogen to fortify the inorganic nitrogen source already present, actually acted as growth retardants for F. oxysporum f. udum (Butl.) Sn. et H. Sporulation of microconidia was only indifferently affected by these two amino acids. Dl-valine stimulated microconidial formation in young cultures only. In both young and old cultures, the lowest concentration of dl-valine depressed macroconidial sporulation. In old cultures, the lowest concentration of valine rapidly stimulated chlamydospore differentiation, higher concentrations being less effective. Dl-alanine, as an additional nitrogen source, depressed both macro- and microconidial sporulation; it did not invigorate chlamydospore formation either. Dl-isoleucine, on the other hand, acted as a growth promoter and strongly stimulated sporulation of both macro- and microconidia. This pathogen requires very specific, preferential doses of the three amino acids if these are used as a booster in addition to the nitrogen source already present. The response, both in terms of mycelial growth and sporulation of the three spore forms, was also conditioned by the age of the culture.

  5. 10 CFR 50.67 - Accident source term.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... occupancy of the control room under accident conditions without personnel receiving radiation exposures in... 10 Energy 1 2014-01-01 2014-01-01 false Accident source term. 50.67 Section 50.67 Energy NUCLEAR... Conditions of Licenses and Construction Permits § 50.67 Accident source term. (a) Applicability. The...

  6. 10 CFR 50.67 - Accident source term.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... occupancy of the control room under accident conditions without personnel receiving radiation exposures in... 10 Energy 1 2012-01-01 2012-01-01 false Accident source term. 50.67 Section 50.67 Energy NUCLEAR... Conditions of Licenses and Construction Permits § 50.67 Accident source term. (a) Applicability. The...

  7. 10 CFR 50.67 - Accident source term.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... occupancy of the control room under accident conditions without personnel receiving radiation exposures in... 10 Energy 1 2010-01-01 2010-01-01 false Accident source term. 50.67 Section 50.67 Energy NUCLEAR... Conditions of Licenses and Construction Permits § 50.67 Accident source term. (a) Applicability. The...

  8. 10 CFR 50.67 - Accident source term.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... occupancy of the control room under accident conditions without personnel receiving radiation exposures in... 10 Energy 1 2013-01-01 2013-01-01 false Accident source term. 50.67 Section 50.67 Energy NUCLEAR... Conditions of Licenses and Construction Permits § 50.67 Accident source term. (a) Applicability. The...

  9. 10 CFR 50.67 - Accident source term.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... occupancy of the control room under accident conditions without personnel receiving radiation exposures in... 10 Energy 1 2011-01-01 2011-01-01 false Accident source term. 50.67 Section 50.67 Energy NUCLEAR... Conditions of Licenses and Construction Permits § 50.67 Accident source term. (a) Applicability. The...

  10. WebGIVI: a web-based gene enrichment analysis and visualization tool.

    PubMed

    Sun, Liang; Zhu, Yongnan; Mahmood, A S M Ashique; Tudor, Catalina O; Ren, Jia; Vijay-Shanker, K; Chen, Jian; Schmidt, Carl J

    2017-05-04

    A major challenge of high-throughput transcriptome studies is presenting the data to researchers in an interpretable format. In many cases, the outputs of such studies are gene lists, which are then examined for enriched biological concepts. One approach to help the researcher interpret large gene datasets is to associate genes with informative terms (iTerms) obtained from the biomedical literature using the eGIFT text-mining system. However, examining large lists of iTerm and gene pairs is a daunting task. We have developed WebGIVI, an interactive web-based visualization tool ( http://raven.anr.udel.edu/webgivi/ ) to explore gene:iTerm pairs. WebGIVI was built with the Cytoscape and Data-Driven Documents JavaScript libraries and can be used to relate genes to iTerms and then visualize gene and iTerm pairs. WebGIVI can accept a gene list that is used to retrieve the gene symbols and corresponding iTerm list. This list can be submitted to visualize the gene:iTerm pairs using two distinct methods: a Concept Map or a Cytoscape Network Map. In addition, WebGIVI also supports uploading and visualization of any two-column tab-separated data. WebGIVI provides an interactive and integrated network graph of genes and iTerms that allows filtering, sorting, and grouping, which can aid biologists in developing hypotheses based on the input gene lists. In addition, WebGIVI can visualize hundreds of nodes and generate high-resolution images suitable for most research publications. The source code can be freely downloaded at https://github.com/sunliang3361/WebGIVI . The WebGIVI tutorial is available at http://raven.anr.udel.edu/webgivi/tutorial.php .

  11. What is What in the Nanoworld: A Handbook on Nanoscience and Nanotechnology

    NASA Astrophysics Data System (ADS)

    Borisenko, Victor E.; Ossicini, Stefano

    2004-10-01

    This introductory reference handbook summarizes the terms and definitions, most important phenomena, and regularities discovered in the physics, chemistry, technology, and application of nanostructures. These nanostructures are typically inorganic and organic structures at the atomic scale. Fast-progressing nanoelectronics and optoelectronics, molecular electronics and spintronics, nanotechnology and quantum processing of information are of strategic importance for the information society of the 21st century. The short-form information, taken from textbooks, special encyclopedias, and recent original books and papers, provides fast support in understanding "old" and new terms of nanoscience and technology widely used in the scientific literature on recent developments. Such support is indeed important when one reads a scientific paper presenting new results in nanoscience. A representative collection of fundamental terms and definitions from quantum physics, quantum chemistry, special mathematics, organic and inorganic chemistry, solid state physics, and materials science and technology is accompanied by recommended secondary sources (books, reviews, websites) for extended study of a subject. Each entry interprets the term or definition under consideration and briefly presents the main features of the phenomena behind it. Additional information in the form of notes ("First described in: …", "Recognition: …", "More details in: …") supplements the entries and gives a historical retrospective of the subject with references to further sources. Ideal for answering the questions of undergraduate and Ph.D. students studying the physics of low-dimensional structures, nanoelectronics, and nanotechnology about unfamiliar terms and definitions.
The handbook provides fast support when one wants to know, or be reminded of, the essence of a scientific term, especially when it contains a personal name in its title, as in "Anderson localization", "Aharonov-Bohm effect", or "Bose-Einstein condensate". More than 1000 entries, from a few sentences to a page in length.

  12. Estimation of the Cesium-137 Source Term from the Fukushima Daiichi Power Plant Using Air Concentration and Deposition Data

    NASA Astrophysics Data System (ADS)

    Winiarek, Victor; Bocquet, Marc; Duhanyan, Nora; Roustan, Yelva; Saunier, Olivier; Mathieu, Anne

    2013-04-01

    A major difficulty when inverting the source term of an atmospheric tracer dispersion problem is the estimation of the prior errors: those of the atmospheric transport model, those ascribed to the representativeness of the measurements, the instrumental errors, and those attached to the prior knowledge of the variables one seeks to retrieve. In the case of an accidental release of pollutant, and especially in a situation of sparse observability, the reconstructed source is sensitive to these assumptions. This sensitivity makes the quality of the retrieval dependent on the methods used to model and estimate the prior errors of the inverse modeling scheme. In Winiarek et al. (2012), we proposed an estimation method for the errors' amplitudes based on the maximum likelihood principle. Under semi-Gaussian assumptions, it takes into account, without approximation, the positivity assumption on the source. We applied the method to the estimation of the Fukushima Daiichi cesium-137 and iodine-131 source terms using activity concentrations in the air. The results were compared to an L-curve estimation technique and to Desroziers's scheme. In addition to the estimated released activities, we provided the related uncertainties (12 PBq with a std. of 15-20% for cesium-137 and 190-380 PBq with a std. of 5-10% for iodine-131). We also showed that, because of the low number of available observations (a few hundred), and even though orders of magnitude were consistent, the reconstructed activities depended significantly on the method used to estimate the prior errors. In order to use more data, we propose to extend the methods to several data types, such as activity concentrations in the air and fallout measurements. The idea is to simultaneously estimate the prior errors related to each dataset, in order to fully exploit the information content of each one.
Using the activity concentration measurements, as well as daily fallout data from prefectures and cumulated deposition data over a region lying approximately 150 km around the nuclear power plant, we can use a few thousand data points in our inverse modeling algorithm to reconstruct the cesium-137 source term. To improve the parameterization of removal processes, rainfall fields have also been corrected using outputs from the mesoscale meteorological model WRF and ground-station rainfall data. As expected, the different methods yield closer results as the number of data increases. Reference: Winiarek, V., M. Bocquet, O. Saunier, A. Mathieu (2012), Estimation of errors in the inverse modeling of accidental release of atmospheric pollutant: Application to the reconstruction of the cesium-137 and iodine-131 source terms from the Fukushima Daiichi power plant, J. Geophys. Res., 117, D05122, doi:10.1029/2011JD016932.
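The idea of simultaneously estimating the prior errors for each dataset can be illustrated with a toy two-dataset inversion in which each dataset's error variance is iteratively re-estimated from its own residuals. This is an assumption-laden stand-in for the maximum-likelihood scheme, with made-up operators and noise levels.

```python
import numpy as np

# Stack two observation types (e.g. air concentrations and deposition) and
# re-estimate each dataset's error std from its residuals, so both datasets
# are weighted according to their actual information content.
rng = np.random.default_rng(2)
n1, n2, n_src = 40, 60, 10
H1 = rng.uniform(0, 1, (n1, n_src))     # air-concentration operator (synthetic)
H2 = rng.uniform(0, 1, (n2, n_src))     # deposition operator (synthetic)
x_true = np.abs(rng.normal(1.0, 0.3, n_src))
y1 = H1 @ x_true + rng.normal(0, 0.02, n1)   # precise dataset
y2 = H2 @ x_true + rng.normal(0, 0.30, n2)   # noisy dataset

s1, s2 = 1.0, 1.0                       # initial error std guesses
for _ in range(20):
    # Weighted least squares with the current error estimates.
    A = np.vstack([H1 / s1, H2 / s2])
    b = np.concatenate([y1 / s1, y2 / s2])
    x_hat, *_ = np.linalg.lstsq(A, b, rcond=None)
    # Re-estimate each dataset's error std from its own residuals.
    s1 = np.sqrt(np.mean((y1 - H1 @ x_hat) ** 2))
    s2 = np.sqrt(np.mean((y2 - H2 @ x_hat) ** 2))
# s1 and s2 settle near the true noise levels, and the noisier deposition
# data is automatically down-weighted in the final estimate.
```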

  13. The scope and control of attention: Sources of variance in working memory capacity.

    PubMed

    Chow, Michael; Conway, Andrew R A

    2015-04-01

    Working memory capacity is a strong positive predictor of many cognitive abilities, across various domains. The pattern of positive correlations across domains has been interpreted as evidence for a unitary source of inter-individual differences in behavior. However, recent work suggests that there are multiple sources of variance contributing to working memory capacity. The current study (N = 71) investigates individual differences in the scope and control of attention, in addition to the number and resolution of items maintained in working memory. Latent variable analyses indicate that the scope and control of attention reflect independent sources of variance and each account for unique variance in general intelligence. Also, estimates of the number of items maintained in working memory are consistent across tasks and related to general intelligence whereas estimates of resolution are task-dependent and not predictive of intelligence. These results provide insight into the structure of working memory, as well as intelligence, and raise new questions about the distinction between number and resolution in visual short-term memory.

  14. Excitation of Love waves in a thin film layer by a line source.

    NASA Technical Reports Server (NTRS)

    Tuan, H.-S.; Ponamgi, S. R.

    1972-01-01

    The excitation of a Love surface wave guided by a thin film layer deposited on a semi-infinite substrate is studied in this paper. Both the thin film and the substrate are considered to be elastically isotropic. Amplitudes of the surface wave in the thin film region and the substrate are found in terms of the strength of a line source vibrating in a direction transverse to the propagating wave. In addition to the surface wave, the bulk shear wave excited by the source is also studied. Analytical expressions for the bulk wave amplitude as a function of the direction of propagation, the acoustic powers transported by the surface and bulk waves, and the efficiency of surface wave excitation are obtained. A numerical example is given to show how the bulk wave radiation pattern depends upon the source frequency, the film thickness and other important parameters of the problem. The efficiency of surface wave excitation is also calculated for various parameter values.

  15. Long-term financing needs for HIV control in sub-Saharan Africa in 2015–2050: a modelling study

    PubMed Central

    Atun, Rifat; Chang, Angela Y; Ogbuoji, Osondu; Silva, Sachin; Resch, Stephen; Hontelez, Jan; Bärnighausen, Till

    2016-01-01

    Objectives To estimate the present value of current and future funding needed for HIV treatment and prevention in 9 sub-Saharan African (SSA) countries that account for 70% of HIV burden in Africa under different scenarios of intervention scale-up. To analyse the gaps between current expenditures and funding obligation, and discuss the policy implications of future financing needs. Design We used the Goals module from Spectrum, and applied the most up-to-date cost and coverage data to provide a range of estimates for future financing obligations. The four different scale-up scenarios vary by treatment initiation threshold and service coverage level. We compared the model projections to current domestic and international financial sources available in selected SSA countries. Results In the 9 SSA countries, the estimated resources required for HIV prevention and treatment in 2015–2050 range from US$98 billion to maintain current coverage levels for treatment and prevention with eligibility for treatment initiation at CD4 count of <500/mm3 to US$261 billion if treatment were to be extended to all HIV-positive individuals and prevention scaled up. With the addition of new funding obligations for HIV—which arise implicitly through commitment to achieve higher than current treatment coverage levels—overall financial obligations (sum of debt levels and the present value of the stock of future HIV funding obligations) would rise substantially. Conclusions Investing upfront in scale-up of HIV services to achieve high coverage levels will reduce HIV incidence, prevention and future treatment expenditures by realising long-term preventive effects of ART to reduce HIV transmission. Future obligations are too substantial for most SSA countries to be met from domestic sources alone. New sources of funding, in addition to domestic sources, include innovative financing. Debt sustainability for sustained HIV response is an urgent imperative for affected countries and donors. 
PMID:26948960

  16. The National Geographic Names Data Base: Phase II instructions

    USGS Publications Warehouse

    Orth, Donald J.; Payne, Roger L.

    1987-01-01

    not recorded on topographic maps be added. The systematic collection of names from other sources, including maps, charts, and texts, is termed Phase II. In addition, specific types of features not compiled during Phase I are encoded and added to the data base. Other names of importance to researchers and users, such as historical and variant names, are also included. The rules and procedures for Phase II research, compilation, and encoding are contained in this publication.

  17. Evolution of air pollution source contributions over one decade, derived by PM10 and PM2.5 source apportionment in two metropolitan urban areas in Greece

    NASA Astrophysics Data System (ADS)

    Diapouli, E.; Manousakas, M.; Vratolis, S.; Vasilatou, V.; Maggos, Th; Saraga, D.; Grigoratos, Th; Argyropoulos, G.; Voutsa, D.; Samara, C.; Eleftheriadis, K.

    2017-09-01

    Metropolitan urban areas in Greece have long been known to suffer from poor air quality, due to a variety of emission sources, topography, and climatic conditions favouring the accumulation of pollution. While a number of control measures have been implemented since the 1990s, resulting in reductions of atmospheric pollution and changes in emission source contributions, the financial crisis that started in 2009 has significantly altered this picture. The present study is the first effort to assess the contribution of emission sources to PM10 and PM2.5 concentration levels and their long-term variability (over 5-10 years) in the two largest metropolitan urban areas in Greece (Athens and Thessaloniki). Intensive measurement campaigns were conducted during 2011-2012 at suburban, urban-background and urban-traffic sites in the two cities. In addition, available datasets from previous measurements in Athens and Thessaloniki were used to assess the long-term variability of concentrations and sources. Chemical composition analysis of the 2011-2012 samples showed that carbonaceous matter was the most abundant component in both PM size fractions. A significant increase of carbonaceous particle concentrations and of the OC/EC ratio during the cold period, especially at the residential urban-background sites, pointed towards domestic heating, and more particularly wood (biomass) burning, as a significant source. PMF analysis further supported this finding. Biomass burning was the largest contributing source at the two urban-background sites (with mean contributions for the two size fractions in the range of 24-46%). Secondary aerosol formation (sulphate, nitrate & organics) was also a major contributing source for both size fractions at the suburban and urban-background sites. At the urban-traffic site, vehicular traffic (exhaust and non-exhaust emissions) was the source with the highest contributions, accounting for 44% of PM10 and 37% of PM2.5.
The long-term variability of emission sources in the two cities (over 5-10 years), assessed through a harmonized application of the PMF technique on recent and past year data, clearly demonstrates the effective reduction in emissions during the last decade due to control measures and technological development; however, it also reflects the effects of the financial crisis in Greece during these years, which has led to decreased economic activities and the adoption of more polluting practices by the local population in an effort to reduce living costs.
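As a rough illustration of the factor-analysis step, the sketch below uses plain multiplicative-update NMF on synthetic data. Real PMF additionally weights each measurement by its uncertainty and is solved with dedicated software; that weighting is omitted here, and all dimensions are made up.

```python
import numpy as np

# Factorize a (samples x species) concentration matrix X into non-negative
# source contributions G and source profiles F, X ≈ G F, via Lee-Seung
# multiplicative updates.
rng = np.random.default_rng(3)
n_samples, n_species, n_factors = 50, 8, 3
G_true = rng.uniform(0, 1, (n_samples, n_factors))   # "true" contributions
F_true = rng.uniform(0, 1, (n_factors, n_species))   # "true" profiles
X = G_true @ F_true                                  # synthetic exact data

G = rng.uniform(0.1, 1, (n_samples, n_factors))
F = rng.uniform(0.1, 1, (n_factors, n_species))
eps = 1e-9                                           # guard against division by 0
for _ in range(500):
    G *= (X @ F.T) / (G @ F @ F.T + eps)             # update contributions
    F *= (G.T @ X) / (G.T @ G @ F + eps)             # update profiles

reconstruction_error = np.linalg.norm(X - G @ F) / np.linalg.norm(X)
# A small relative error indicates the three "sources" explain the data;
# in practice the factors are then interpreted via marker species.
```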

  18. Inverse modelling-based reconstruction of the Chernobyl source term available for long-range transport

    NASA Astrophysics Data System (ADS)

    Davoine, X.; Bocquet, M.

    2007-03-01

    The reconstruction of the Chernobyl accident source term has previously been carried out using core inventories, but also through back-and-forth comparisons between model simulations and activity concentration or deposited-activity measurements. The approach presented in this paper is based on inverse modelling techniques. It relies both on the activity concentration measurements and on the adjoint of a chemistry-transport model. The location of the release is assumed to be known, and one looks for a source term available for long-range transport that depends on both time and altitude. The method relies on the maximum entropy on the mean principle and exploits source positivity. The inversion results are mainly sensitive to two tuning parameters: a mass scale and the scale of the prior errors in the inversion. To overcome this difficulty, we resort to the statistical L-curve method to estimate balanced values for these two parameters. Once this is done, many of the retrieved features of the source are robust within a reasonable range of parameter values. Our results favour the acknowledged three-step scenario, with a strong initial release (26 to 27 April), followed by a weak emission period of four days (28 April-1 May) and again a release, longer but less intense than the initial one (2 May-6 May). The retrieved quantities of iodine-131, caesium-134 and caesium-137 that were released are in good agreement with the latest reported estimations, although a stronger apportionment of the total released activity is ascribed to the first period and less to the third one. Finer chronological details are obtained, such as a sequence of eruptive episodes in the first two days, likely related to the modulation of the boundary layer diurnal cycle. In addition, the first two-day release surges are found to have effectively reached an altitude up to the top of the domain (5000 m).
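The L-curve selection of tuning parameters can be sketched generically: scan a regularization weight, trace residual norm against solution norm, and pick the corner. The corner-finding rule below (nearest point to the origin in normalized log-log coordinates) is a crude stand-in for the paper's actual scheme, and the problem is synthetic.

```python
import numpy as np

rng = np.random.default_rng(4)
n_obs, n_src = 25, 40                   # under-determined, like the real problem
H = rng.normal(0, 1, (n_obs, n_src))
x_true = np.zeros(n_src)
x_true[::5] = 1.0                       # sparse-ish "release episodes"
y = H @ x_true + rng.normal(0, 0.1, n_obs)

# Tikhonov-regularized solutions over a sweep of the weight lambda.
lambdas = np.logspace(-3, 2, 40)
res_norms, sol_norms = [], []
for lam in lambdas:
    A = np.vstack([H, np.sqrt(lam) * np.eye(n_src)])
    b = np.concatenate([y, np.zeros(n_src)])
    x_hat, *_ = np.linalg.lstsq(A, b, rcond=None)
    res_norms.append(np.linalg.norm(y - H @ x_hat))
    sol_norms.append(np.linalg.norm(x_hat))

# Crude corner: minimize distance to the origin of the normalized log-log curve.
r = np.log(res_norms)
s = np.log(sol_norms)
r = (r - r.min()) / (r.max() - r.min())
s = (s - s.min()) / (s.max() - s.min())
best = int(np.argmin(np.hypot(r, s)))
lam_star = lambdas[best]                # balanced regularization weight
```

Increasing lambda trades fit for regularity, so the residual norm grows while the solution norm shrinks; the corner marks the balance point between the two.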

  19. Maximizing the spatial representativeness of NO2 monitoring data using a combination of local wind-based sectoral division and seasonal and diurnal correction factors.

    PubMed

    Donnelly, Aoife; Naughton, Owen; Misstear, Bruce; Broderick, Brian

    2016-10-14

    This article describes a new methodology for increasing the spatial representativeness of individual monitoring sites. Air pollution levels at a given point are influenced by emission sources in the immediate vicinity. Since emission sources are rarely uniformly distributed around a site, concentration levels will inevitably be most affected by the sources in the prevailing upwind direction. The methodology provides a means of capturing this effect and providing additional information regarding source/pollution relationships. The methodology allows for the division of the air quality data from a given monitoring site into a number of sectors or wedges based on wind direction and estimation of annual mean values for each sector, thus optimising the information that can be obtained from a single monitoring station. The method corrects for short-term data, diurnal and seasonal variations in concentrations (which can produce uneven weighting of data within each sector) and uneven frequency of wind directions. Significant improvements in correlations between the air quality data and the spatial air quality indicators were obtained after application of the correction factors. This suggests the application of these techniques would be of significant benefit in land-use regression modelling studies. Furthermore, the method was found to be very useful for estimating long-term mean values and wind direction sector values using only short-term monitoring data. The methods presented in this article can result in cost savings through minimising the number of monitoring sites required for air quality studies while also capturing a greater degree of variability in spatial characteristics. In this way, more reliable, but also more expensive monitoring techniques can be used in preference to a higher number of low-cost but less reliable techniques. 
The methods described in this article have applications in local air quality management, source receptor analysis, land-use regression mapping and modelling and population exposure studies.
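A toy version of the sector-division and diurnal-correction idea, with synthetic NO2 data and a made-up correction factor; the article's actual methodology also corrects for seasonal variation, short-term records, and uneven wind-direction frequencies.

```python
import numpy as np

# Divide hourly NO2 data into wind-direction sectors, correcting each
# observation for uneven diurnal sampling by normalizing with the mean of
# its hour of day.
rng = np.random.default_rng(5)
n = 24 * 365
hour = np.arange(n) % 24
wind_dir = rng.uniform(0, 360, n)
# Synthetic NO2: a diurnal cycle plus a stronger source in the 90-180° sector.
no2 = 20 + 10 * np.sin(2 * np.pi * hour / 24) + rng.normal(0, 2, n)
no2 += np.where((wind_dir >= 90) & (wind_dir < 180), 15.0, 0.0)

# Diurnal correction factor: ratio of the overall mean to each hour's mean.
hourly_mean = np.array([no2[hour == h].mean() for h in range(24)])
corrected = no2 * (no2.mean() / hourly_mean[hour])

sector = (wind_dir // 90).astype(int)   # four 90° sectors
sector_means = np.array([corrected[sector == k].mean() for k in range(4)])
# The 90-180° sector stands out as the upwind direction with extra emissions,
# giving four directional "annual means" from a single monitoring site.
```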

  20. A general circulation model study of atmospheric carbon monoxide

    NASA Technical Reports Server (NTRS)

    Pinto, J. P.; Rind, D.; Russell, G. L.; Lerner, J. A.; Hansen, J. E.; Yung, Y. L.; Hameed, S.

    1983-01-01

    The carbon monoxide cycle is studied by incorporating the known and hypothetical sources and sinks in a tracer model that uses the winds generated by a general circulation model. Photochemical production and loss terms, which depend on OH radical concentrations, are calculated in an interactive fashion. The computed global distribution and seasonal variations of CO are compared with observations to obtain constraints on the distribution and magnitude of the sources and sinks of CO, and on the tropospheric abundance of OH. The simplest model that accounts for available observations requires a low-latitude plant source of about 1.3 × 10^15 g/yr, in addition to sources from incomplete combustion of fossil fuels and oxidation of methane. The globally averaged OH concentration calculated in the model is 750,000 cm^-3. Models that calculate globally averaged OH concentrations much lower than this nominal value are not consistent with the observed variability of CO. Such models are also inconsistent with measurements of CO isotopic abundances, which imply the existence of plant sources.
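The budget arithmetic implied by these numbers can be checked with a one-box steady-state estimate. The rate constant and the source breakdown below are rough assumptions for illustration only, not values from the paper.

```python
# One-box CO budget: sources balanced against first-order loss by OH.
k_oh = 1.5e-13           # CO + OH rate constant, cm^3 molecule^-1 s^-1 (approx.)
oh = 7.5e5               # globally averaged OH, molecules cm^-3 (as in the text)

# Assumed annual sources, g/yr: plant source (from the text) plus assumed
# magnitudes for fossil fuel combustion and methane oxidation.
sources_g_per_yr = 1.3e15 + 0.6e15 + 0.8e15

lifetime_s = 1.0 / (k_oh * oh)                 # first-order lifetime against OH
lifetime_months = lifetime_s / (3600 * 24 * 30)

# Steady state: atmospheric burden = source rate x lifetime.
burden_g = sources_g_per_yr / (365 * 24 * 3600) * lifetime_s
# With these assumptions the CO lifetime comes out at a few months, the
# order of magnitude commonly quoted for tropospheric CO.
```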

  1. A Laboratory Study of River Discharges into Shallow Seas

    NASA Astrophysics Data System (ADS)

    Crawford, T. J.; Linden, P. F.

    2016-02-01

    We present an experimental study that aims to simulate the buoyancy driven coastal currents produced by estuarine freshwater discharges into the ocean. The currents are generated inside a rotating tank filled with saltwater by the continuous release of buoyant freshwater from a source structure located at the fluid surface. The freshwater is discharged horizontally from a finite-depth source, giving rise to significant momentum-flux effects and a non-zero potential vorticity. We perform a parametric study in which we vary the rotation rate, freshwater discharge magnitude, the density difference and the source cross-sectional area. The parameter values are chosen to match the regimes appropriate to the River Rhine and River Elbe when entering the North Sea. Persistent features of an anticyclonic outflow vortex and a propagating boundary current were identified and their properties quantified. We also present a finite potential vorticity, geostrophic model that provides theoretical predictions for the current height, width and velocity as functions of the experimental parameters. The experiments and model are compared with each other in terms of a set of non-dimensional parameters identified in the theoretical analysis of the problem. Good agreement between the model and the experimental data is found. The effect of mixing in the turbulent ocean is also addressed with the addition of an oscillating grid to the experimental setup. The grid generates turbulence in the saltwater ambient that is designed to represent the mixing effects of the wind, tides and bathymetry in a shallow shelf sea. The impact of the addition of turbulence is discussed in terms of the experimental data and through modifications to the theoretical model to include mixing. Once again, good agreement is seen between the experiments and the model.

  2. Piecewise synonyms for enhanced UMLS source terminology integration.

    PubMed

    Huang, Kuo-Chuan; Geller, James; Halper, Michael; Cimino, James J

    2007-10-11

    The UMLS contains more than 100 source vocabularies and is growing via the integration of others. When integrating a new source, the source terms already in the UMLS must first be found. The easiest approach to this is simple string matching. However, string matching usually does not find all concepts that should be found. A new methodology, based on the notion of piecewise synonyms, for enhancing the process of concept discovery in the UMLS is presented. This methodology is supported by first creating a general synonym dictionary based on the UMLS. Each multi-word source term is decomposed into its component words, allowing for the generation of separate synonyms for each word from the general synonym dictionary. The recombination of these synonyms into new terms creates an expanded pool of matching candidates for terms from the source. The methodology is demonstrated with respect to an existing UMLS source. It shows a 34% improvement over simple string matching.
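    The recombination step described above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the paper builds its general synonym dictionary from the UMLS itself, whereas the dictionary and terms below are purely hypothetical stand-ins.

```python
from itertools import product

# Toy general synonym dictionary: an illustrative stand-in for the
# UMLS-derived dictionary the methodology actually builds.
SYNONYMS = {
    "kidney": {"kidney", "renal"},
    "failure": {"failure", "insufficiency"},
}

def piecewise_candidates(term):
    """Decompose a multi-word source term into its component words,
    substitute synonyms for each word, and recombine the pieces into
    an expanded pool of string-matching candidates."""
    words = term.lower().split()
    choices = [sorted(SYNONYMS.get(w, {w})) for w in words]
    return {" ".join(combo) for combo in product(*choices)}

# "acute kidney failure" expands to 4 candidates, including
# "acute renal insufficiency", which exact string matching would miss.
print(sorted(piecewise_candidates("acute kidney failure")))
```

    In practice the candidate pool would still be filtered before matching, since naive recombination can generate strings that are not valid terms.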

  3. Sources and contents of air pollution affecting term low birth weight in Los Angeles County, California, 2001-2008.

    PubMed

    Laurent, Olivier; Hu, Jianlin; Li, Lianfa; Cockburn, Myles; Escobedo, Loraine; Kleeman, Michael J; Wu, Jun

    2014-10-01

    Low birth weight (LBW, <2500 g) has been associated with exposure to air pollution, but it is still unclear which sources or components of air pollution might be at play. The association between ultrafine particles and LBW has never been studied. To study the relationships between LBW in term-born infants and exposure to particles by size fraction, source and chemical composition, and complementary components of air pollution in Los Angeles County (California, USA) over the period 2001-2008. Birth certificates (n=960,945) were geocoded to maternal residence. Primary particulate matter (PM) concentrations by source and composition were modeled. Measured fine PM, nitrogen dioxide and ozone concentrations were interpolated using empirical Bayesian kriging. Traffic indices were estimated. Associations between LBW and air pollution metrics were examined using generalized additive models, adjusting for maternal age, parity, race/ethnicity, education, neighborhood income, gestational age and infant sex. Increased LBW risks were associated with the mass of primary fine and ultrafine PM, with several major sources (especially gasoline, wood burning and commercial meat cooking) of primary PM, and chemical species in primary PM (elemental and organic carbon, potassium, iron, chromium, nickel, and titanium but not lead or arsenic). Increased LBW risks were also associated with total fine PM mass, nitrogen dioxide and local traffic indices (especially within 50 m from home), but not with ozone. Stronger associations were observed in infants born to women with low socioeconomic status, chronic hypertension, diabetes and a high body mass index. This study supports previously reported associations between traffic-related pollutants and LBW and suggests other pollution sources and components, including ultrafine particles, as possible risk factors. Copyright © 2014 Elsevier Inc. All rights reserved.

  4. Hormonal and metabolic regulation of source-sink relations under salinity and drought: from plant survival to crop yield stability.

    PubMed

    Albacete, Alfonso A; Martínez-Andújar, Cristina; Pérez-Alfocea, Francisco

    2014-01-01

    Securing food production for the growing population will require closing the gap between potential crop productivity under optimal conditions and the yield captured by farmers under a changing environment, which is termed agronomical stability. Drought and salinity are major environmental factors contributing to the yield gap ultimately by inducing premature senescence in the photosynthetic source tissues of the plant and by reducing the number and growth of the harvestable sink organs by affecting the transport and use of assimilates between and within them. However, the changes in source-sink relations induced by stress also include adaptive changes in the reallocation of photoassimilates that influence crop productivity, ranging from plant survival to yield stability. While the massive utilization of -omic technologies in model plants is discovering hundreds of genes with potential impacts in alleviating short-term applied drought and salinity stress (usually measured as plant survival), only in relatively few cases has an effect on crop yield stability been proven. However, achieving the former does not necessarily imply the latter. Plant survival only requires water status conservation and delayed leaf senescence (thus maintaining source activity) that is usually accompanied by growth inhibition. However, yield stability will additionally require the maintenance or increase in sink activity in the reproductive structures, thus contributing to the transport of assimilates from the source leaves and to delayed stress-induced leaf senescence. This review emphasizes the role of several metabolic and hormonal factors influencing not only the source strength, but especially the sink activity and their inter-relations, and their potential to improve yield stability under drought and salinity stresses. © 2013.

  5. Apparatus And Method For Osl-Based, Remote Radiation Monitoring And Spectrometry

    DOEpatents

    Miller, Steven D.; Smith, Leon Eric; Skorpik, James R.

    2006-03-07

    Compact, OSL-based devices for long-term, unattended radiation detection and spectroscopy are provided. In addition, a method for extracting spectroscopic information from these devices is taught. The devices can comprise OSL pixels and at least one radiation filter surrounding at least a portion of the OSL pixels. The filter can modulate an incident radiation flux. The devices can further comprise a light source and a detector, both proximally located to the OSL pixels, as well as a power source and a wireless communication device, each operably connected to the light source and the detector. Power consumption of the device ranges from ultra-low to zero. The OSL pixels can retain data regarding incident radiation events as trapped charges. The data can be extracted wirelessly or manually. The method for extracting spectroscopic data comprises optically stimulating the exposed OSL pixels, detecting a readout luminescence, and reconstructing an incident-energy spectrum from the luminescence.

  6. Apparatus and method for OSL-based, remote radiation monitoring and spectrometry

    DOEpatents

    Smith, Leon Eric [Richland, WA; Miller, Steven D [Richland, WA; Bowyer, Theodore W [Oakton, VA

    2008-05-20

    Compact, OSL-based devices for long-term, unattended radiation detection and spectroscopy are provided. In addition, a method for extracting spectroscopic information from these devices is taught. The devices can comprise OSL pixels and at least one radiation filter surrounding at least a portion of the OSL pixels. The filter can modulate an incident radiation flux. The devices can further comprise a light source and a detector, both proximally located to the OSL pixels, as well as a power source and a wireless communication device, each operably connected to the light source and the detector. Power consumption of the device ranges from ultra-low to zero. The OSL pixels can retain data regarding incident radiation events as trapped charges. The data can be extracted wirelessly or manually. The method for extracting spectroscopic data comprises optically stimulating the exposed OSL pixels, detecting a readout luminescence, and reconstructing an incident-energy spectrum from the luminescence.

  7. EEG source localization: Sensor density and head surface coverage.

    PubMed

    Song, Jasmine; Davey, Colin; Poulsen, Catherine; Luu, Phan; Turovets, Sergei; Anderson, Erik; Li, Kai; Tucker, Don

    2015-12-30

    The accuracy of EEG source localization depends on a sufficient sampling of the surface potential field, an accurate conducting volume estimation (head model), and a suitable and well-understood inverse technique. The goal of the present study is to examine the effect of sampling density and coverage on the ability to accurately localize sources, using common linear inverse weight techniques, at different depths. Several inverse methods are examined, using commonly adopted head conductivity values. Simulation studies were employed to examine the effect of spatial sampling of the potential field at the head surface, in terms of sensor density and coverage of the inferior and superior head regions. In addition, the effects of sensor density and coverage are investigated in the source localization of epileptiform EEG. Greater sensor density improves source localization accuracy. Moreover, across all sampling density and inverse methods, adding samples on the inferior surface improves the accuracy of source estimates at all depths. More accurate source localization of EEG data can be achieved with high spatial sampling of the head surface electrodes. The most accurate source localization is obtained when the voltage surface is densely sampled over both the superior and inferior surfaces. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.

  8. Microbial biogeochemistry of uranium mill tailings

    USGS Publications Warehouse

    Landa, Edward R.

    2005-01-01

    Uranium mill tailings (UMT) are the crushed ore residues from the extraction of uranium (U) from ores. Among the radioactive wastes associated with the nuclear fuel cycle, UMT are unique in terms of their volume and their limited isolation from the surficial environment. For this latter reason, their management and long-term fate has many interfaces with environmental microbial communities and processes. The interactions of microorganisms with UMT have been shown to be diverse and with significant consequences for radionuclide mobility and bioremediation. These radionuclides are associated with the U-decay series. The addition of organic carbon and phosphate is required to initiate the reduction of the U present in the groundwater down gradient of the mills. Investigations of sediment and water from the U-contaminated aquifer indicate that the addition of a carbon source stimulates the rate of U removal by microbial reduction. Moreover, most attention with respect to passive or engineered removal of U from groundwaters focuses on iron-reducing and sulfate-reducing bacteria.

  9. Separated by a common language: awareness of term usage differences between languages and disciplines in biopreparedness.

    PubMed

    Andersson, M Gunnar; Tomuzia, Katharina; Löfström, Charlotta; Appel, Bernd; Bano, Luca; Keremidis, Haralampos; Knutsson, Rickard; Leijon, Mikael; Lövgren, Susanna Ekströmer; De Medici, Dario; Menrath, Andrea; van Rotterdam, Bart J; Wisselink, Henk J; Barker, Gary C

    2013-09-01

    Preparedness for bioterrorism is based on communication between people in organizations who are educated and trained in several disciplines, including law enforcement, health, and science. Various backgrounds, cultures, and vocabularies generate difficulties in understanding and interpreting terms and concepts, which may impair communication. This is especially true in emergency situations, in which the need for clarity and consistency is vital. The EU project AniBioThreat initiated methods and made a rough estimate of the terms and concepts that are crucial for an incident, and a pilot database with key terms and definitions has been constructed. Analysis of collected terms and sources has shown that many of the participating organizations use various international standards in their area of expertise. The same term often represents different concepts in the standards from different sectors, or, alternatively, different terms were used to represent the same or similar concepts. The use of conflicting terminology can be problematic for decision makers and communicators in planning and prevention or when handling an incident. Since the CBRN area has roots in multiple disciplines, each with its own evolving terminology, it may not be realistic to achieve unequivocal communication through a standardized vocabulary and joint definitions for words from common language. We suggest that a communication strategy should include awareness of alternative definitions and ontologies and the ability to talk and write without relying on the implicit knowledge underlying specialized jargon. Consequently, cross-disciplinary communication skills should be part of training of personnel in the CBRN field. In addition, a searchable repository of terms and definitions from relevant organizations and authorities would be a valuable addition to existing glossaries for improving awareness concerning bioterrorism prevention planning.

  10. Separated by a Common Language: Awareness of Term Usage Differences Between Languages and Disciplines in Biopreparedness

    PubMed Central

    Tomuzia, Katharina; Löfström, Charlotta; Appel, Bernd; Bano, Luca; Keremidis, Haralampos; Knutsson, Rickard; Leijon, Mikael; Lövgren, Susanna Ekströmer; De Medici, Dario; Menrath, Andrea; van Rotterdam, Bart J.; Wisselink, Henk J.; Barker, Gary C.

    2013-01-01

    Preparedness for bioterrorism is based on communication between people in organizations who are educated and trained in several disciplines, including law enforcement, health, and science. Various backgrounds, cultures, and vocabularies generate difficulties in understanding and interpreting terms and concepts, which may impair communication. This is especially true in emergency situations, in which the need for clarity and consistency is vital. The EU project AniBioThreat initiated methods and made a rough estimate of the terms and concepts that are crucial for an incident, and a pilot database with key terms and definitions has been constructed. Analysis of collected terms and sources has shown that many of the participating organizations use various international standards in their area of expertise. The same term often represents different concepts in the standards from different sectors, or, alternatively, different terms were used to represent the same or similar concepts. The use of conflicting terminology can be problematic for decision makers and communicators in planning and prevention or when handling an incident. Since the CBRN area has roots in multiple disciplines, each with its own evolving terminology, it may not be realistic to achieve unequivocal communication through a standardized vocabulary and joint definitions for words from common language. We suggest that a communication strategy should include awareness of alternative definitions and ontologies and the ability to talk and write without relying on the implicit knowledge underlying specialized jargon. Consequently, cross-disciplinary communication skills should be part of training of personnel in the CBRN field. In addition, a searchable repository of terms and definitions from relevant organizations and authorities would be a valuable addition to existing glossaries for improving awareness concerning bioterrorism prevention planning. PMID:23971818

  11. Relativistic effects in local inertial frames including parametrized-post-Newtonian effects

    NASA Astrophysics Data System (ADS)

    Shahid-Saless, Bahman; Ashby, Neil

    1988-09-01

    We use the concept of a generalized Fermi frame to describe relativistic effects, due to local and distant sources of gravitation, on a body placed in a local inertial frame of reference. In particular we have considered a model of two spherically symmetric gravitating point sources, moving in circular orbits around a common barycenter where one of the bodies is chosen to be the local and the other the distant one. This has been done using the slow-motion, weak-field approximation and including four of the parametrized-post-Newtonian (PPN) parameters. The position of the classical center of mass must be modified when the PPN parameter ζ2 is included. We show that the main relativistic effect on a local satellite is described by the Schwarzschild field of the local body and the nonlinear term corresponding to the self-interaction of the local source with itself. There are also much smaller terms that are proportional, respectively, to the product of the potentials of local and distant bodies and to the distant body's self-interactions. The spatial axes of the local frame undergo geodetic precession. In addition we have an acceleration of the order of 10⁻¹¹ cm sec⁻² that vanishes in the case of general relativity, which is discussed in detail.

  12. High-order scheme for the source-sink term in a one-dimensional water temperature model

    PubMed Central

    Jing, Zheng; Kang, Ling

    2017-01-01

    The source-sink term in water temperature models represents the net heat absorbed or released by a water system. This term is very important because it accounts for solar radiation that can significantly affect water temperature, especially in lakes. However, existing numerical methods for discretizing the source-sink term are very simplistic, causing significant deviations between simulation results and measured data. To address this problem, we present a numerical method specific to the source-sink term. A vertical one-dimensional heat conduction equation was chosen to describe water temperature changes. A two-step operator-splitting method was adopted as the numerical solution. In the first step, using the undetermined coefficient method, a high-order scheme was adopted for discretizing the source-sink term. In the second step, the diffusion term was discretized using the Crank-Nicolson scheme. The effectiveness and capability of the numerical method was assessed by performing numerical tests. Then, the proposed numerical method was applied to a simulation of Guozheng Lake (located in central China). The modeling results were in excellent agreement with measured data. PMID:28264005

  13. High-order scheme for the source-sink term in a one-dimensional water temperature model.

    PubMed

    Jing, Zheng; Kang, Ling

    2017-01-01

    The source-sink term in water temperature models represents the net heat absorbed or released by a water system. This term is very important because it accounts for solar radiation that can significantly affect water temperature, especially in lakes. However, existing numerical methods for discretizing the source-sink term are very simplistic, causing significant deviations between simulation results and measured data. To address this problem, we present a numerical method specific to the source-sink term. A vertical one-dimensional heat conduction equation was chosen to describe water temperature changes. A two-step operator-splitting method was adopted as the numerical solution. In the first step, using the undetermined coefficient method, a high-order scheme was adopted for discretizing the source-sink term. In the second step, the diffusion term was discretized using the Crank-Nicolson scheme. The effectiveness and capability of the numerical method was assessed by performing numerical tests. Then, the proposed numerical method was applied to a simulation of Guozheng Lake (located in central China). The modeling results were in excellent agreement with measured data.
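    The two-step operator splitting described in this abstract can be sketched as follows. This is a hedged illustration, not the authors' code: the paper's high-order undetermined-coefficient scheme for the source-sink term is not specified here, so a classical fourth-order Runge-Kutta step stands in for it, followed by a Crank-Nicolson diffusion step with assumed zero-flux boundaries.

```python
import numpy as np

def crank_nicolson_step(T, dt, dz, kappa):
    """Diffusion sub-step: Crank-Nicolson for T_t = kappa * T_zz with
    zero-flux boundaries, solved as a dense system for clarity (a
    tridiagonal solver would be used in practice)."""
    n = len(T)
    r = kappa * dt / (2 * dz**2)
    A = np.eye(n) * (1 + 2 * r)   # implicit (left-hand) operator
    B = np.eye(n) * (1 - 2 * r)   # explicit (right-hand) operator
    for i in range(n - 1):
        A[i, i + 1] = A[i + 1, i] = -r
        B[i, i + 1] = B[i + 1, i] = r
    # Zero-flux walls: mirror the single off-diagonal neighbour.
    A[0, 1] = A[-1, -2] = -2 * r
    B[0, 1] = B[-1, -2] = 2 * r
    return np.linalg.solve(A, B @ T)

def split_step(T, dt, dz, kappa, source):
    """One operator-split step: integrate the source-sink term first
    (RK4 as a stand-in for the paper's high-order scheme), then diffuse."""
    k1 = source(T)
    k2 = source(T + 0.5 * dt * k1)
    k3 = source(T + 0.5 * dt * k2)
    k4 = source(T + dt * k3)
    T = T + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
    return crank_nicolson_step(T, dt, dz, kappa)
```

    Splitting this way lets each sub-step use the discretization best suited to it, which is the point the abstract makes about treating the source-sink term separately from diffusion.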

  14. An experimental study on the near-source region of lazy turbulent plumes

    NASA Astrophysics Data System (ADS)

    Ciriello, Francesco; Hunt, Gary R.

    2017-11-01

    The near-source region of a `lazy' turbulent buoyant plume issuing from a circular source is examined for source Richardson numbers in the range of 10¹ to 10⁷. New data is acquired for the radial contraction and streamwise variation of volume flux through an experimental programme of dye visualisations and particle image velocimetry. This data reveals the limited applicability of traditional entrainment laws used in integral modelling approaches for the description of the near-source region for these source Richardson numbers. A revised entrainment function is proposed, based on which we introduce a classification of plume behaviour whereby the degree of `laziness' may be expressed in terms of the excess dilution that occurs compared to a `pure' constant Richardson number plume. The increased entrainment measured in lazy plumes is attributed to Rayleigh-Taylor instabilities developing along the contraction of the plume which promote the additional engulfment of ambient fluid into the plume. This work was funded by an EPSRC Industrial Case Award sponsored by Dyson Technology Ltd. Special thanks go to the members of the Dyson Environmental Control Group that regularly visit us in Cambridge for discussions about our work.

  15. Towards a semantic lexicon for biological language processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Verspoor, K.

    It is well understood that natural language processing (NLP) applications require sophisticated lexical resources to support their processing goals. In the biomedical domain, we are privileged to have access to extensive terminological resources in the form of controlled vocabularies and ontologies, which have been integrated into the framework of the National Library of Medicine's Unified Medical Language System's (UMLS) Metathesaurus. However, the existence of such terminological resources does not guarantee their utility for NLP. In particular, we have two core requirements for lexical resources for NLP in addition to the basic enumeration of important domain terms: representation of morphosyntactic information about those terms, specifically part of speech information and inflectional patterns to support parsing and lemma assignment, and representation of semantic information indicating general categorical information about terms, and significant relations between terms to support text understanding and inference (Hahn et al., 1999). Biomedical vocabularies by and large leave out morphosyntactic information, and where they address semantic considerations, they often do so in an unprincipled manner, for instance by indicating a relation between two concepts without indicating the type of that relation. But all is not lost. The UMLS knowledge sources include two additional resources which are relevant - the SPECIALIST lexicon, a lexicon addressing our morphosyntactic requirements, and the Semantic Network, a representation of core conceptual categories in the biomedical domain. The coverage of these two knowledge sources with respect to the full coverage of the Metathesaurus is, however, not entirely clear. Furthermore, when our goals are specifically to process biological text - and often more specifically, text in the molecular biology domain - it is difficult to say whether the coverage of these resources is meaningful.
The utility of the UMLS knowledge sources for medical language processing (MLP) has been explored (Johnson, 1999; Friedman et al., 2001); the time has now come to repeat these experiments with respect to biological language processing (BLP). To that end, this paper presents an analysis of the UMLS resources, specifically with an eye towards constructing lexical resources suitable for BLP. We follow the paradigm presented in Johnson (1999) for medical language, exploring overlap between the UMLS Metathesaurus and SPECIALIST lexicon to construct a morphosyntactic and semantically-specified lexicon, and then further explore the overlap with a relevant domain corpus for molecular biology.

  16. SISSY: An efficient and automatic algorithm for the analysis of EEG sources based on structured sparsity.

    PubMed

    Becker, H; Albera, L; Comon, P; Nunes, J-C; Gribonval, R; Fleureau, J; Guillotel, P; Merlet, I

    2017-08-15

    Over the past decades, a multitude of different brain source imaging algorithms have been developed to identify the neural generators underlying the surface electroencephalography measurements. While most of these techniques focus on determining the source positions, only a small number of recently developed algorithms provide an indication of the spatial extent of the distributed sources. In a recent comparison of brain source imaging approaches, the VB-SCCD algorithm has been shown to be one of the most promising algorithms among these methods. However, this technique suffers from several problems: it leads to amplitude-biased source estimates, it has difficulties in separating close sources, and it has a high computational complexity due to its implementation using second order cone programming. To overcome these problems, we propose to include an additional regularization term that imposes sparsity in the original source domain and to solve the resulting optimization problem using the alternating direction method of multipliers. Furthermore, we show that the algorithm yields more robust solutions by taking into account the temporal structure of the data. We also propose a new method to automatically threshold the estimated source distribution, which makes it possible to delineate the active brain regions. The new algorithm, called Source Imaging based on Structured Sparsity (SISSY), is analyzed by means of realistic computer simulations and is validated on the clinical data of four patients. Copyright © 2017 Elsevier Inc. All rights reserved.
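    The sparsity-plus-ADMM machinery mentioned above can be illustrated in isolation. The sketch below is not SISSY itself: it solves only the plain l1-regularized least-squares building block by the alternating direction method of multipliers (the full SISSY cost adds a structured variation term over the cortical mesh and an automatic thresholding step), and the lead-field matrix here is randomly generated for illustration.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of the l1 norm (element-wise shrinkage)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def admm_l1(A, b, lam, rho=1.0, n_iter=200):
    """ADMM for min_x 0.5*||A x - b||^2 + lam*||x||_1, the sparsity
    building block of structured-sparsity source imaging."""
    m, n = A.shape
    x = np.zeros(n)
    z = np.zeros(n)   # auxiliary sparse copy of x
    u = np.zeros(n)   # scaled dual variable
    # Factor once: each x-update solves (A^T A + rho I) x = A^T b + rho (z - u).
    L = np.linalg.cholesky(A.T @ A + rho * np.eye(n))
    Atb = A.T @ b
    for _ in range(n_iter):
        rhs = Atb + rho * (z - u)
        x = np.linalg.solve(L.T, np.linalg.solve(L, rhs))
        z = soft_threshold(x + u, lam / rho)   # sparsity-inducing step
        u = u + x - z                          # dual ascent
    return z
```

    The appeal over second-order cone programming, as the abstract notes, is that each ADMM iteration costs only a cached triangular solve plus an element-wise shrinkage.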

  17. A Well-Balanced Path-Integral f-Wave Method for Hyperbolic Problems with Source Terms

    PubMed Central

    2014-01-01

    Systems of hyperbolic partial differential equations with source terms (balance laws) arise in many applications where it is important to compute accurate time-dependent solutions modeling small perturbations of equilibrium solutions in which the source terms balance the hyperbolic part. The f-wave version of the wave-propagation algorithm is one approach, but requires the use of a particular averaged value of the source terms at each cell interface in order to be “well balanced” and exactly maintain steady states. A general approach to choosing this average is developed using the theory of path conservative methods. A scalar advection equation with a decay or growth term is introduced as a model problem for numerical experiments. PMID:24563581
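    The model problem named in the abstract, scalar advection with a decay term (u_t + a u_x = -λu), makes the well-balancing idea concrete. The sketch below is an assumed first-order construction rather than the paper's general path-conservative average: the interface source average uses a logarithmic mean of u, a choice that happens to balance the exponential steady state u(x) ∝ exp(-λx/a) to machine precision.

```python
import numpy as np

def fwave_step(u, a, lam, dx, dt):
    """One upwind f-wave update for u_t + a u_x = -lam * u with a > 0.
    The fluctuation subtracts a path-averaged source from the flux
    difference, so it vanishes identically on the balanced steady state."""
    un = u.copy()
    for i in range(1, len(u)):
        ul, ur = u[i - 1], u[i]
        if ul > 0 and ur > 0 and not np.isclose(ul, ur):
            # Logarithmic mean of u at the interface (assumed average).
            s_avg = -lam * (ur - ul) / np.log(ur / ul)
        else:
            s_avg = -lam * 0.5 * (ul + ur)
        Z = a * (ur - ul) - dx * s_avg   # f-wave fluctuation
        un[i] = u[i] - dt / dx * Z       # a > 0: all into the right cell
    return un
```

    On the discrete steady state u_i = u_{i-1} exp(-λ dx / a), the log of the cell ratio is exactly -λ dx / a, so the averaged source cancels the flux difference and Z = 0 in every interface, which is precisely the "well balanced" property the abstract describes.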

  18. Functional Analysis in Long-Term Operation of High Power UV-LEDs in Continuous Fluoro-Sensing Systems for Hydrocarbon Pollution

    PubMed Central

    Arques-Orobon, Francisco Jose; Nuñez, Neftali; Vazquez, Manuel; Gonzalez-Posadas, Vicente

    2016-01-01

    This work analyzes the long-term functionality of HP (High-power) UV-LEDs (Ultraviolet Light Emitting Diodes) as the exciting light source in non-contact, continuous 24/7 real-time fluoro-sensing pollutant identification in inland water. Fluorescence is an effective alternative in the detection and identification of hydrocarbons. The HP UV-LEDs are more advantageous than classical light sources (xenon and mercury lamps) and help in the development of a low-cost, non-contact, and compact system for continuous real-time fieldwork. This work analyzes the wavelength, output optical power, and the effects of viscosity, temperature of the water pollutants, and the functional consistency for long-term HP UV-LED working operation. To accomplish the latter, an analysis of the influence of two types of 365 nm HP UV-LED degradation under two continuous real-system working mode conditions was done, by temperature Accelerated Life Tests (ALTs). These tests estimate the mean life under continuous working conditions of 6200 h and for cycled working conditions (30 s ON & 30 s OFF) of 66,000 h, over 7 years of 24/7 operating life of hydrocarbon pollution monitoring. In addition, the durability in the face of the internal and external parameter system variations is evaluated. PMID:26927113

  19. Functional Analysis in Long-Term Operation of High Power UV-LEDs in Continuous Fluoro-Sensing Systems for Hydrocarbon Pollution.

    PubMed

    Arques-Orobon, Francisco Jose; Nuñez, Neftali; Vazquez, Manuel; Gonzalez-Posadas, Vicente

    2016-02-26

    This work analyzes the long-term functionality of HP (High-power) UV-LEDs (Ultraviolet Light Emitting Diodes) as the exciting light source in non-contact, continuous 24/7 real-time fluoro-sensing pollutant identification in inland water. Fluorescence is an effective alternative in the detection and identification of hydrocarbons. The HP UV-LEDs are more advantageous than classical light sources (xenon and mercury lamps) and help in the development of a low-cost, non-contact, and compact system for continuous real-time fieldwork. This work analyzes the wavelength, output optical power, and the effects of viscosity, temperature of the water pollutants, and the functional consistency for long-term HP UV-LED working operation. To accomplish the latter, an analysis of the influence of two types of 365 nm HP UV-LED degradation under two continuous real-system working mode conditions was done, by temperature Accelerated Life Tests (ALTs). These tests estimate the mean life under continuous working conditions of 6200 h and for cycled working conditions (30 s ON & 30 s OFF) of 66,000 h, over 7 years of 24/7 operating life of hydrocarbon pollution monitoring. In addition, the durability in the face of the internal and external parameter system variations is evaluated.

  20. Block 4 solar cell module design and test specification for residential applications

    NASA Technical Reports Server (NTRS)

    1978-01-01

    Near-term design, qualification and acceptance requirements are provided for terrestrial solar cell modules suitable for incorporation in photovoltaic power sources (2 kW to 10 kW) applied to single family residential installations. Requirement levels and recommended design limits for selected performance criteria are specified for modules intended principally for rooftop installations. Modules satisfying the requirements of this specification fall into one of two categories, residential panel or residential shingle, both meeting general performance requirements plus additional category-peculiar constraints.

  1. Periodic variations in the signal-to-noise ratios of signals received from the ICE spacecraft

    NASA Technical Reports Server (NTRS)

    Nadeau, T.

    1986-01-01

    Data from the ICE probe to comet Giacobini-Zinner are analyzed to determine the effects of spacecraft rotation upon the signal to noise ratio (SNR) for the two channels of data. In addition, long-term variations from sources other than rotations are considered. Results include a pronounced SNR variation over a period of three seconds (one rotation) and a lesser effect over a two minute period (possibly due to the receiving antenna conscan).

  2. Decadal Variability of the Tropical Stratosphere: Secondary Influence of the El Nino-Southern Oscillation

    DTIC Science & Technology

    2010-02-04

    the QBO [McCormack et al., 2007, and references therein]. However, it is also possible that "feedbacks from below" are a significant contributing cause... ozone variability from other sources (notably from the equatorial quasi-biennial wind oscillation, or QBO)... work (SH06 and references therein), but with the addition of an ENSO term and including a more complete representation of the QBO: X(t) = μ(i) + β_trend·t

  3. A consistent modelling methodology for secondary settling tanks: a reliable numerical method.

    PubMed

    Bürger, Raimund; Diehl, Stefan; Farås, Sebastian; Nopens, Ingmar; Torfs, Elena

    2013-01-01

    The consistent modelling methodology for secondary settling tanks (SSTs) leads to a partial differential equation (PDE) of nonlinear convection-diffusion type as a one-dimensional model for the solids concentration as a function of depth and time. This PDE includes a flux that depends discontinuously on spatial position modelling hindered settling and bulk flows, a singular source term describing the feed mechanism, a degenerating term accounting for sediment compressibility, and a dispersion term for turbulence. In addition, the solution itself is discontinuous. A consistent, reliable and robust numerical method that properly handles these difficulties is presented. Many constitutive relations for hindered settling, compression and dispersion can be used within the model, allowing the user to switch on and off effects of interest depending on the modelling goal as well as investigate the suitability of certain constitutive expressions. Simulations show the effect of the dispersion term on effluent suspended solids and total sludge mass in the SST. The focus is on correct implementation whereas calibration and validation are not pursued.
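    As a hedged illustration of the one-dimensional SST model structure, and not the authors' consistent method, the sketch below advances u_t + f(u)_z = (d·u_z)_z on a closed column with an assumed Vesilind-type hindered-settling flux and simple upwinding (valid only in the dilute regime where f'(u) > 0). The singular feed source term, discontinuous bulk flows, and degenerate compression term of the full model are omitted, and all parameter values are illustrative.

```python
import numpy as np

def settling_step(u, dz, dt, v0=1e-3, rh=5.0, d=1e-6):
    """One explicit step for u_t + f(u)_z = (d * u_z)_z on a closed
    column, z positive downward, with hindered-settling flux
    f(u) = v0 * u * exp(-rh * u) (Vesilind form, illustrative values).
    Zero flux at the walls keeps the total sludge mass conserved."""
    f = v0 * u * np.exp(-rh * u)
    F = np.zeros(len(u) + 1)                  # convective interface fluxes
    F[1:-1] = f[:-1]                          # upwind from the cell above
    Fd = np.zeros(len(u) + 1)                 # diffusive interface fluxes
    Fd[1:-1] = -d * (u[1:] - u[:-1]) / dz
    return u - dt / dz * np.diff(F + Fd)      # conservative update
```

    Because the update is in conservation form and both wall fluxes are zero, the column's total mass is preserved exactly, one of the consistency requirements the abstract emphasizes; the full method additionally needs a Godunov-type flux to handle the non-monotone f and the discontinuous coefficients at the feed and outlets.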

  4. 26 CFR 1.737-1 - Recognition of precontribution gain.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... Property A1 and Property A2 is long-term, U.S.-source capital gain or loss. The character of gain on Property A3 is long-term, foreign-source capital gain. B contributes Property B, nondepreciable real... long-term, U.S.-source capital gain ($10,000 gain on Property A1 and $8,000 loss on Property A2) and $1...

  5. Source term model evaluations for the low-level waste facility performance assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yim, M.S.; Su, S.I.

    1995-12-31

    The estimation of release of radionuclides from various waste forms to the bottom boundary of the waste disposal facility (source term) is one of the most important aspects of LLW facility performance assessment. In this work, several currently used source term models are comparatively evaluated for the release of carbon-14 based on a test case problem. The models compared include PRESTO-EPA-CPG, IMPACTS, DUST and NEFTRAN-II. Major differences in assumptions and approaches between the models are described and key parameters are identified through sensitivity analysis. The source term results from different models are compared and other concerns or suggestions are discussed.

  6. Design of a Nutrient Reclamation System for the Cultivation of Microalgae for Biofuel Production and Other Industrial Applications

    NASA Astrophysics Data System (ADS)

    Sandefur, Heather Nicole

Microalgal biomass has been identified as a promising feedstock for a number of industrial applications, including the synthesis of new pharmaceutical and biofuel products. However, there are several economic limitations associated with the scale-up of existing algal production processes. Critical economic studies of algae-based industrial processes highlight the high cost of supplying essential nutrients to microalgae cultures. With microalgae cells having relatively high nitrogen contents (4 to 8%), the N fertilizer cost in industrial-scale production is significant. In addition, the disposal of the large volumes of cell residuals that are generated during product extraction stages can pose other economic challenges. While waste streams can provide a concentrated source of nutrients, concerns about the presence of biological contaminants and the expense of heat treatment pose challenges to processes that use wastewater as a nutrient source in microalgae cultures. The goal of this study was to evaluate the potential application of ultrafiltration technology to aid in the utilization of agricultural wastewater in the cultivation of a high-value microalgae strain. An ultrafiltration system was used to remove inorganic solids and biological contaminants from wastewater taken from a swine farm in Savoy, Arkansas. The permeate from the system was then used as the nutrient source for the cultivation of the marine microalga Porphyridium cruentum. During ultrafiltration system operation, little membrane fouling was observed, and permeate fluxes remained relatively constant during both short-term and long-term tests. Complete rejection of E. coli and coliforms from the wastewater was also observed, in addition to a 75% reduction in total solids, including inorganic materials. The processed permeate was shown to have very high concentrations of total nitrogen (695.6 mg L-1) and total phosphorus (69.1 mg L-1). In addition, the growth of P. cruentum was analyzed in a medium containing swine waste permeate and was compared to P. cruentum growth in a control medium. Higher biomass productivity, lipid productivity, and lipid content were observed in the microalgae cultivated in the swine waste medium compared to the control medium. These results suggest that, through the use of ultrafiltration technology as an alternative to traditional heat treatment, agricultural wastewaters could be effectively utilized as a nutrient source for microalgae cultivation.

  7. Disentangling the effects of CO2 and short-lived climate forcer mitigation.

    PubMed

    Rogelj, Joeri; Schaeffer, Michiel; Meinshausen, Malte; Shindell, Drew T; Hare, William; Klimont, Zbigniew; Velders, Guus J M; Amann, Markus; Schellnhuber, Hans Joachim

    2014-11-18

    Anthropogenic global warming is driven by emissions of a wide variety of radiative forcers ranging from very short-lived climate forcers (SLCFs), like black carbon, to very long-lived, like CO2. These species are often released from common sources and are therefore intricately linked. However, for reasons of simplification, this CO2-SLCF linkage was often disregarded in long-term projections of earlier studies. Here we explicitly account for CO2-SLCF linkages and show that the short- and long-term climate effects of many SLCF measures consistently become smaller in scenarios that keep warming to below 2 °C relative to preindustrial levels. Although long-term mitigation of methane and hydrofluorocarbons are integral parts of 2 °C scenarios, early action on these species mainly influences near-term temperatures and brings small benefits for limiting maximum warming relative to comparable reductions taking place later. Furthermore, we find that maximum 21st-century warming in 2 °C-consistent scenarios is largely unaffected by additional black-carbon-related measures because key emission sources are already phased-out through CO2 mitigation. Our study demonstrates the importance of coherently considering CO2-SLCF coevolutions. Failing to do so leads to strongly and consistently overestimating the effect of SLCF measures in climate stabilization scenarios. Our results reinforce that SLCF measures are to be considered complementary rather than a substitute for early and stringent CO2 mitigation. Near-term SLCF measures do not allow for more time for CO2 mitigation. We disentangle and resolve the distinct benefits across different species and therewith facilitate an integrated strategy for mitigating both short and long-term climate change.

  8. Disentangling the effects of CO2 and short-lived climate forcer mitigation

    PubMed Central

    Rogelj, Joeri; Schaeffer, Michiel; Meinshausen, Malte; Shindell, Drew T.; Hare, William; Klimont, Zbigniew; Amann, Markus; Schellnhuber, Hans Joachim

    2014-01-01

    Anthropogenic global warming is driven by emissions of a wide variety of radiative forcers ranging from very short-lived climate forcers (SLCFs), like black carbon, to very long-lived, like CO2. These species are often released from common sources and are therefore intricately linked. However, for reasons of simplification, this CO2–SLCF linkage was often disregarded in long-term projections of earlier studies. Here we explicitly account for CO2–SLCF linkages and show that the short- and long-term climate effects of many SLCF measures consistently become smaller in scenarios that keep warming to below 2 °C relative to preindustrial levels. Although long-term mitigation of methane and hydrofluorocarbons are integral parts of 2 °C scenarios, early action on these species mainly influences near-term temperatures and brings small benefits for limiting maximum warming relative to comparable reductions taking place later. Furthermore, we find that maximum 21st-century warming in 2 °C-consistent scenarios is largely unaffected by additional black-carbon-related measures because key emission sources are already phased-out through CO2 mitigation. Our study demonstrates the importance of coherently considering CO2–SLCF coevolutions. Failing to do so leads to strongly and consistently overestimating the effect of SLCF measures in climate stabilization scenarios. Our results reinforce that SLCF measures are to be considered complementary rather than a substitute for early and stringent CO2 mitigation. Near-term SLCF measures do not allow for more time for CO2 mitigation. We disentangle and resolve the distinct benefits across different species and therewith facilitate an integrated strategy for mitigating both short and long-term climate change. PMID:25368182

  9. Dependence of near field co-seismic ionospheric perturbations on surface deformations: A case study based on the April, 25 2015 Gorkha Nepal earthquake

    NASA Astrophysics Data System (ADS)

    Sunil, A. S.; Bagiya, Mala S.; Catherine, Joshi; Rolland, Lucie; Sharma, Nitin; Sunil, P. S.; Ramesh, D. S.

    2017-03-01

Ionospheric response to the recent 25 April 2015 Gorkha, Nepal earthquake is studied in terms of Global Positioning System-Total Electron Content (GPS-TEC) from the viewpoints of source directivity, rupture propagation, and the associated surface deformations over and near the fault plane. The azimuthal directivity of near-field co-seismic ionospheric perturbation (CIP) amplitudes correlates closely with the east-southeast propagation of the earthquake rupture and the associated surface deformations. In addition, the CIP amplitude is observed to be very small in the direction opposite to the rupture movement. While conceptual explanations of the poleward directivity of CIP exist in the literature, we present observational evidence of an additional equatorward directivity, interpreted in terms of the rupture propagation direction. We also discuss the coupling between earthquake-induced acoustic waves and the local geomagnetic field and its effects on near-field CIP amplitudes. We suggest that the variability of near-field CIP over and near the fault plane is a manifestation of this geomagnetic field-wave coupling, in addition to the crustal deformations observed through GPS measurements and corroborated by Interferometric Synthetic Aperture Radar (InSAR) data sets.

  10. Observation-based source terms in the third-generation wave model WAVEWATCH

    NASA Astrophysics Data System (ADS)

    Zieger, Stefan; Babanin, Alexander V.; Erick Rogers, W.; Young, Ian R.

    2015-12-01

Measurements collected during the AUSWEX field campaign at Lake George (Australia) resulted in new insights into the processes of wind-wave interaction and whitecapping dissipation, and consequently new parameterizations of the input and dissipation source terms. The new nonlinear wind input term accounts for the dependence of growth on wave steepness, for airflow separation, and for negative growth rates under adverse winds. The new dissipation terms feature an inherent breaking term, a cumulative dissipation term, and a term due to the production of turbulence by waves, which is particularly relevant for decaying seas and for swell. The latter is consistent with the observed decay rate of ocean swell. This paper describes these source terms as implemented in WAVEWATCH III® and evaluates their performance against existing source terms in academic duration-limited tests, against buoy measurements for windsea-dominated conditions, under conditions of extreme wind forcing (Hurricane Katrina), and against altimeter data in global hindcasts. Results show agreement in terms of growth curves as well as integral and spectral parameters in the simulations and hindcasts.

  11. FY2004 SYSTEM ENGINEER PROGRAM MANAGER ANNUAL REPORT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    JACKSON, G.J.

    2004-10-29

During FY 2004, reviews of the FH System Engineer (SE) Program were conducted by the Independent Assessment (IA) Group. The results of these reviews are summarized as a part of this document. Additional reviews were performed by FH Engineering personnel. SE Engineering reviews performed include periodic walkdowns (typically quarterly) by the SEs, a review of System Notebooks by the System Engineer Program Manager (SEPM), an annual status report by each SE, and an annual status report by each of the Project Chief Engineers (PCEs). FY 2004 marked the completion of the first round of Vital Safety System (VSS) assessments. Each of the VSSs on the FH VSS list has been evaluated at least once, either by the FH Independent Assessment organization or as part of a DOE Phase II assessment. Following the completion of the K-Basins Assessment in May 2004, a review of the VSS assessment process was completed. Criteria were developed by FH, and concurred with by RL, to determine the frequency and priority of future VSS assessments. Additional actions have been taken to increase the visibility and emphasis assigned to VSSs. Completion of several Documented Safety Analyses (DSA), in combination with efforts to remove source term materials from several facilities, enabled the number of systems on the FH VSS list to be reduced from 60 at the beginning of FY 2004 to 48 by the end of FY 2004. It is expected that there will be further changes to the FH VSS list based on additional DSA revisions and continued progress toward reduction of source terms across the Hanford Site. Other new VSSs may be added to the list to reflect the relocation of materials away from the River Corridor to interim storage locations on the Central Plateau.

  12. Is Urinary Cadmium a Biomarker of Long-term Exposure in Humans? A Review

    PubMed Central

    Kruse, Danielle; Harrington, James; Levine, Keith; Meliker, Jaymie R.

    2017-01-01

    Cadmium is a naturally-occurring element, and humans are exposed from cigarettes, food, and industrial sources. Following exposure, cadmium accumulates in the kidney and is slowly released into the urine, usually proportionally to the levels found in the kidneys. Cadmium levels in a single spot urine sample have been considered indicative of long-term exposure to cadmium; however, such a potentially exceptional biomarker requires careful scrutiny. In this review, we report good to excellent temporal stability of urinary cadmium (intraclass correlation coefficient 0.66–0.81) regardless of spot urine or first morning void sampling. Factors such as changes in smoking habits and diseases characterized by increased excretion of proteins may produce short-term changes in urinary cadmium levels. We recommend that epidemiologists use this powerful biomarker in prospective studies stratified by smoking status, along with thoughtful consideration of additional factors that can influence renal physiology and cadmium excretion. PMID:27696280
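The temporal-stability statistic quoted above is the intraclass correlation coefficient. As an illustration only, a one-way random-effects ICC can be computed from repeated spot samples as follows; the data here are synthetic, not drawn from the review:

```python
import numpy as np

# One-way random-effects ICC from an (subjects x repeats) matrix of
# measurements. Synthetic data: a stable between-person level plus
# within-person noise, mimicking repeated urinary cadmium samples.
rng = np.random.default_rng(1)
n, k = 40, 2                                    # subjects, repeated samples
subject = rng.normal(0.0, 1.0, size=(n, 1))     # stable between-person level
x = subject + rng.normal(0.0, 0.6, size=(n, k)) # add within-person variation

grand = x.mean()
# Between- and within-subject mean squares (one-way ANOVA decomposition)
msb = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)
msw = ((x - x.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))
icc = (msb - msw) / (msb + (k - 1) * msw)
```

With these synthetic variances the true ICC is 1/(1 + 0.36) ≈ 0.74, in the same range as the 0.66–0.81 reported for urinary cadmium; an ICC near 1 means a single spot sample ranks individuals almost as reliably as repeated sampling would.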

  13. Data and methods for studying commercial motor vehicle driver fatigue, highway safety and long-term driver health.

    PubMed

    Stern, Hal S; Blower, Daniel; Cohen, Michael L; Czeisler, Charles A; Dinges, David F; Greenhouse, Joel B; Guo, Feng; Hanowski, Richard J; Hartenbaum, Natalie P; Krueger, Gerald P; Mallis, Melissa M; Pain, Richard F; Rizzo, Matthew; Sinha, Esha; Small, Dylan S; Stuart, Elizabeth A; Wegman, David H

    2018-03-09

This article summarizes the recommendations on data and methodology issues for studying commercial motor vehicle driver fatigue of a National Academies of Sciences, Engineering, and Medicine study. A framework is provided that identifies the various factors affecting driver fatigue and relating driver fatigue to crash risk and long-term driver health. The relevant factors include characteristics of the driver, vehicle, carrier and environment. Limitations of existing data are considered and potential sources of additional data described. Statistical methods that can be used to improve understanding of the relevant relationships from observational data are also described. The recommendations for enhanced data collection and the use of modern statistical methods for causal inference have the potential to enhance our understanding of the relationship of fatigue to highway safety and to long-term driver health. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.

  14. Sources and fate of bioavailable dissolved organic nitrogen in the Neuse River Estuary, North Carolina

    NASA Astrophysics Data System (ADS)

    Paerl, H. W.; Peierls, B. L.; Hounshell, A.; Osburn, C. L.

    2015-12-01

Eutrophication is a widespread problem affecting the structure and function of estuaries and is often linked to anthropogenic nitrogen (N) enrichment, since N is the primary nutrient limiting algal production. Watershed management actions have typically ignored dissolved organic nitrogen (DON) loading because of its perceived refractory nature and have instead focused on inorganic N as the target for loading reductions. A fluorescence-based model indicated that anthropogenic sources of DON near the head of the microtidal Neuse River Estuary (NRE), NC were dominated by septic systems and poultry waste. A series of bioassays was used to determine the bioavailability of river DON and DON-rich sources to primary producers and whether those additions promoted the growth of certain phytoplankton taxa, particularly harmful species. Overall, at time scales of up to two to three weeks, estuarine phytoplankton and bacteria showed only limited responses to additions of high molecular weight (HMW, >1 kDa) river DON. When increases in productivity and biomass did occur, they were quite small compared with the response to inorganic N. Low molecular weight (LMW) river DON, wastewater treatment plant effluent, and poultry litter extract did have a positive effect on phytoplankton and bacterial production, indicating a bioavailable fraction. High variability of bulk DON concentration suggested that bioavailable compounds added in the experimental treatments were low in concentration and turned over quite rapidly. Some phytoplankton taxa, as measured by diagnostic photopigments, appeared to be selectively enhanced by the HMW and specific-source DON additions, although the taxa could not be positively identified as harmful species. Preliminary tests show that labile autochthonous organic matter may act as a primer for the mineralization of HMW DON. These and other, longer-term bioavailability studies will be needed to adequately address the fate of watershed DON in estuarine ecosystems.

  15. Human Rights Texts: Converting Human Rights Primary Source Documents into Data.

    PubMed

    Fariss, Christopher J; Linder, Fridolin J; Jones, Zachary M; Crabtree, Charles D; Biek, Megan A; Ross, Ana-Sophia M; Kaur, Taranamol; Tsai, Michael

    2015-01-01

    We introduce and make publicly available a large corpus of digitized primary source human rights documents which are published annually by monitoring agencies that include Amnesty International, Human Rights Watch, the Lawyers Committee for Human Rights, and the United States Department of State. In addition to the digitized text, we also make available and describe document-term matrices, which are datasets that systematically organize the word counts from each unique document by each unique term within the corpus of human rights documents. To contextualize the importance of this corpus, we describe the development of coding procedures in the human rights community and several existing categorical indicators that have been created by human coding of the human rights documents contained in the corpus. We then discuss how the new human rights corpus and the existing human rights datasets can be used with a variety of statistical analyses and machine learning algorithms to help scholars understand how human rights practices and reporting have evolved over time. We close with a discussion of our plans for dataset maintenance, updating, and availability.
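A document-term matrix of the kind distributed with the corpus can be illustrated in a few lines; the two toy "reports" below are invented, not drawn from the actual corpus:

```python
from collections import Counter

# Build a document-term matrix: one row per document, one column per
# unique term across the whole corpus, cells holding raw word counts.
docs = [
    "arbitrary detention reported detention increased",
    "freedom of assembly restricted detention reported",
]
counts = [Counter(d.split()) for d in docs]       # per-document word counts
vocab = sorted(set().union(*counts))              # shared column ordering
dtm = [[c[t] for t in vocab] for c in counts]     # Counter returns 0 if absent
```

Real pipelines would add tokenization, case-folding, and stop-word handling, but the resulting structure is the same: a documents-by-terms count matrix ready for the statistical and machine-learning analyses the article describes.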

  16. Personal assistance services in the workplace: A literature review.

    PubMed

    Dowler, Denetta L; Solovieva, Tatiana I; Walls, Richard T

    2011-10-01

    Personal assistance services (PAS) can be valuable adjuncts to the complement of accommodations that support workers with disabilities. This literature review explored the professional literature on the use of PAS in the workplace. Bibliographic sources were used to locate relevant research studies on the use of PAS in the workplace. The studies in this review used both qualitative and quantitative methods to identify current definitions of work-related and personal care-related PAS, agency-directed versus consumer-directed PAS, long-term and short-term funding issues, development of PAS policy, and barriers to successful implementation of PAS. The studies uncovered issues related to (a) recruiting, training, and retaining personal assistants, (b) employer concerns, (c) costs and benefits of workplace PAS, (d) wages and incentives for personal assistants, and (e) sources for financing PAS as a workplace accommodation. The findings reveal the value and benefits of effective PAS on the job. PAS can lead to successful employment of people with disabilities when other accommodations cannot provide adequate workplace support. Additionally, the evolution of workplace PAS is dependent on development of realistic PAS policy and funding options. Published by Elsevier Inc.

  17. Human Rights Texts: Converting Human Rights Primary Source Documents into Data

    PubMed Central

    Fariss, Christopher J.; Linder, Fridolin J.; Jones, Zachary M.; Crabtree, Charles D.; Biek, Megan A.; Ross, Ana-Sophia M.; Kaur, Taranamol; Tsai, Michael

    2015-01-01

    We introduce and make publicly available a large corpus of digitized primary source human rights documents which are published annually by monitoring agencies that include Amnesty International, Human Rights Watch, the Lawyers Committee for Human Rights, and the United States Department of State. In addition to the digitized text, we also make available and describe document-term matrices, which are datasets that systematically organize the word counts from each unique document by each unique term within the corpus of human rights documents. To contextualize the importance of this corpus, we describe the development of coding procedures in the human rights community and several existing categorical indicators that have been created by human coding of the human rights documents contained in the corpus. We then discuss how the new human rights corpus and the existing human rights datasets can be used with a variety of statistical analyses and machine learning algorithms to help scholars understand how human rights practices and reporting have evolved over time. We close with a discussion of our plans for dataset maintenance, updating, and availability. PMID:26418817

  18. Characterization and in vitro properties of potentially probiotic Bifidobacterium strains isolated from breast-milk.

    PubMed

    Arboleya, Silvia; Ruas-Madiedo, Patricia; Margolles, Abelardo; Solís, Gonzalo; Salminen, Seppo; de Los Reyes-Gavilán, Clara G; Gueimonde, Miguel

    2011-09-01

    Most of the current commercial probiotic strains have not been selected for specific applications, but rather on the basis of their technological potential for use in diverse applications. Therefore, by selecting them from appropriate sources, depending on the target population, it is likely that better performing strains may be identified. Few strains have been specifically selected for human neonates, where the applications of probiotics may have a great positive impact. Breast-milk constitutes an interesting source of potentially probiotic bifidobacteria for inclusion in infant formulas and foods targeted to both pre-term and full-term infants. In this study six Bifidobacterium strains isolated from breast-milk were phenotypically and genotypically characterised according to international guidelines for probiotics. In addition, different in vitro tests were used to assess the safety and probiotic potential of the strains. Although clinical data would be needed before drawing any conclusion on the probiotic properties of the strains, our results indicate that some of them may have probiotic potential for their inclusion in products targeting infants. Copyright © 2010 Elsevier B.V. All rights reserved.

  19. Polarization and long-term variability of Sgr A* X-ray echo

    NASA Astrophysics Data System (ADS)

    Churazov, E.; Khabibullin, I.; Ponti, G.; Sunyaev, R.

    2017-06-01

We use a model of the molecular gas distribution within ˜100 pc of the centre of the Milky Way (Kruijssen, Dale & Longmore) to simulate the time evolution and polarization properties of the reflected X-ray emission associated with past outbursts from Sgr A*. While this model is too simple to describe the complexity of the true gas distribution, it illustrates the importance and power of long-term observations of the reflected emission. We show that the variable part of the X-ray emission observed by Chandra and XMM-Newton from prominent molecular clouds is well described by a pure reflection model, providing strong support for the reflection scenario. While the identification of Sgr A* as the primary source of this reflected emission is already a very appealing hypothesis, a decisive test of this model can be provided by future X-ray polarimetric observations, which will allow placing constraints on the location of the primary source. In addition, X-ray polarimeters (like, e.g., XIPE) have sufficient sensitivity to constrain the line-of-sight positions of molecular complexes, removing a major uncertainty in the model.

  20. CW injection locking for long-term stability of frequency combs

    NASA Astrophysics Data System (ADS)

    Williams, Charles; Quinlan, Franklyn; Delfyett, Peter J.

    2009-05-01

    Harmonically mode-locked semiconductor lasers with external ring cavities offer high repetition rate pulse trains while maintaining low optical linewidth via long cavity storage times. Continuous wave (CW) injection locking further reduces linewidth and stabilizes the optical frequencies. The output can be stabilized long-term with the help of a modified Pound-Drever-Hall feedback loop. Optical sidemode suppression of 36 dB has been shown, as well as RF supermode noise suppression of 14 dB for longer than 1 hour. In addition to the injection locking of harmonically mode-locked lasers requiring an external frequency source, recent work shows the viability of the injection locking technique for regeneratively mode-locked lasers, or Coupled Opto-Electronic Oscillators (COEO).

  1. An efficient unstructured WENO method for supersonic reactive flows

    NASA Astrophysics Data System (ADS)

    Zhao, Wen-Geng; Zheng, Hong-Wei; Liu, Feng-Jun; Shi, Xiao-Tian; Gao, Jun; Hu, Ning; Lv, Meng; Chen, Si-Cong; Zhao, Hong-Da

    2018-03-01

An efficient high-order numerical method for supersonic reactive flows is proposed in this article. The reactive source term and the convection term are solved separately by a splitting scheme. In the reaction step, an adaptive time-step method is presented, which improves efficiency greatly. In the convection step, a third-order accurate weighted essentially non-oscillatory (WENO) method is adopted to reconstruct the solution on unstructured grids. Numerical results show that the new method captures the correct propagation speed of the detonation wave even on coarse grids, while high-order accuracy is achieved in smooth regions. In addition, the proposed adaptive splitting method reduces the computational cost greatly compared with the traditional splitting method.
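The splitting idea can be illustrated on a model problem u_t + a u_x = S(u). The sketch below uses Strang splitting with an adaptively sub-stepped reaction solve; a first-order upwind step stands in for the paper's third-order unstructured WENO reconstruction, and the source term, sub-step criterion, and all parameter values are invented for illustration:

```python
import numpy as np

# Strang splitting for u_t + a u_x = S(u), with S(u) = -k u^2 as a
# stand-in nonlinear source. Each full step is R(dt/2) A(dt) R(dt/2).
a, k = 1.0, 50.0
N, L = 200, 1.0
dx = L / N
x = (np.arange(N) + 0.5) * dx
dt = 0.4 * dx / a                         # convective CFL step

def advect(u, dt):
    """Conservative first-order upwind step, periodic boundaries (a > 0)."""
    F = a * u                             # upwind flux takes the left state
    return u - dt / dx * (F - np.roll(F, 1))

def react(u, dt, eps=0.1):
    """Adaptive sub-stepping: each explicit Euler sub-step is limited so
    the relative change of u stays below eps (an illustrative criterion;
    the paper's exact adaptive rule is not specified in the abstract)."""
    u = u.copy()
    for i in range(u.size):
        t = 0.0
        while dt - t > 1e-12:
            s = -k * u[i] * u[i]
            h = dt - t if s == 0 else min(dt - t, eps * u[i] / abs(s))
            u[i] += h * s
            t += h
    return u

u = np.exp(-200 * (x - 0.3) ** 2)         # smooth initial pulse
for _ in range(100):
    u = react(u, dt / 2)
    u = advect(u, dt)
    u = react(u, dt / 2)
mass = u.sum() * dx
```

The sub-step limit h ≤ eps·u/|S(u)| keeps each reaction update small relative to the local state, so u stays positive and stiff cells automatically take more, shorter sub-steps than quiescent ones, which is the efficiency gain the adaptive reaction step is after.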

  2. Search for the Footprints of New Physics with Laboratory and Cosmic Neutrinos

    NASA Technical Reports Server (NTRS)

    Stecker, Floyd W.

    2017-01-01

Observations of high energy neutrinos, both in the laboratory and from cosmic sources, can be a useful probe in searching for new physics. Such observations can provide sensitive tests of Lorentz invariance violation (LIV), which may be a result of quantum gravity (QG) physics. We review some observationally testable consequences of LIV using the effective field theory (EFT) formalism. To do this, one can postulate the existence of additional small LIV terms in free particle Lagrangians, suppressed by powers of the Planck mass. The observational consequences of such terms are then examined. In particular, one can place limits on a class of non-renormalizable, mass dimension five and six Lorentz invariance violating operators that may be the result of QG.

  3. Development of axisymmetric lattice Boltzmann flux solver for complex multiphase flows

    NASA Astrophysics Data System (ADS)

    Wang, Yan; Shu, Chang; Yang, Li-Ming; Yuan, Hai-Zhuan

    2018-05-01

This paper presents an axisymmetric lattice Boltzmann flux solver (LBFS) for simulating axisymmetric multiphase flows. In the solver, the two-dimensional (2D) multiphase LBFS is applied to reconstruct macroscopic fluxes excluding axisymmetric effects, while source terms accounting for the axisymmetric effects are introduced directly into the governing equations. Compared to the conventional axisymmetric multiphase lattice Boltzmann (LB) method, the present solver retains the kinetic character of the flux evaluation and avoids the complex derivation of external forcing terms. It also saves considerable computational effort in comparison with three-dimensional (3D) computations. The capability of the proposed solver in simulating complex multiphase flows is demonstrated by studying a single bubble rising in a circular tube. The obtained results compare well with published data.

  4. Electromagnetic fields and the public: EMF standards and estimation of risk

    NASA Astrophysics Data System (ADS)

    Grigoriev, Yury

    2010-04-01

    Mobile communications are a relatively new and additional source of electromagnetic exposure for the population. Standard daily mobile-phone use is known to increase RF-EMF (radiofrequency electromagnetic field) exposure to the brains of users of all ages, whilst mobile-phone base stations, and base station units for cordless phones, can regularly increase the exposures of large numbers of the population to RF-EMF radiation in everyday life. The need to determine appropriate standards stipulating the maximum acceptable short-term and long-term RF-EMF levels encountered by the public, and set such levels as general guidelines, is of great importance in order to help preserve the general public's health and that of the next generation of humanity.

  5. Bayesian source term determination with unknown covariance of measurements

    NASA Astrophysics Data System (ADS)

    Belal, Alkomiet; Tichý, Ondřej; Šmídl, Václav

    2017-04-01

Determination of the source term of a release of hazardous material into the atmosphere is a very important task for emergency response. We are concerned with the problem of estimating the source term in the conventional linear inverse problem y = Mx, where the relationship between the vector of observations y and the unknown source term x is described by the source-receptor-sensitivity (SRS) matrix M. Since the system is typically ill-conditioned, the problem is recast as the optimization problem min_x (y - Mx)^T R^(-1) (y - Mx) + x^T B^(-1) x. The first term penalizes the misfit to the measurements, with covariance matrix R, and the second term is a regularization of the source term. Different choices of the matrices R and B yield different types of regularization; for example, Tikhonov regularization takes the covariance matrix B to be the identity matrix multiplied by a scalar parameter. In this contribution, we adopt a Bayesian approach to make inference on the unknown source term x as well as the unknown R and B. We assume the prior on x to be Gaussian with zero mean and an unknown diagonal covariance matrix B. The covariance matrix R of the likelihood is also unknown. We consider two potential choices for the structure of R: first, a diagonal matrix, and second, a locally correlated structure using information on the topology of the measuring network. Since exact inference in the model is intractable, an iterative variational Bayes algorithm is used for simultaneous estimation of all model parameters. The practical usefulness of our contribution is demonstrated by applying the resulting algorithm to real data from the European Tracer Experiment (ETEX). This research is supported by the EEA/Norwegian Financial Mechanism under project MSMT-28477/2014, Source-Term Determination of Radionuclide Releases by Inverse Atmospheric Dispersion Modelling (STRADI).
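For fixed R and B, the minimizer of the quadratic objective above has the closed form x = (M^T R^(-1) M + B^(-1))^(-1) M^T R^(-1) y. The sketch below applies this MAP/Tikhonov estimate to synthetic data; the SRS matrix, noise level, and prior scale are all invented here, and the paper's variational estimation of R and B themselves is not reproduced:

```python
import numpy as np

# Regularized inversion of y = Mx with known covariances R and B.
rng = np.random.default_rng(0)
n_obs, n_src = 60, 12
M = rng.random((n_obs, n_src))            # stand-in SRS matrix
x_true = np.zeros(n_src)
x_true[[3, 7]] = [2.0, 5.0]               # sparse "release" profile
sigma = 0.05
y = M @ x_true + sigma * rng.standard_normal(n_obs)

R_inv = np.eye(n_obs) / sigma**2          # diagonal measurement precision
B_inv = np.eye(n_src) / 10.0**2           # weak Gaussian prior on the source

# MAP estimate: x = argmin (y - Mx)^T R^-1 (y - Mx) + x^T B^-1 x
x_hat = np.linalg.solve(M.T @ R_inv @ M + B_inv, M.T @ R_inv @ y)
rel_err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
```

The variational Bayes approach described in the record essentially iterates this solve while re-estimating the diagonal of B (and the parameters of R) from the current residuals, rather than fixing them in advance as done here.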

  6. Improvements and limitations on understanding of atmospheric processes of Fukushima Daiichi NPS radioactivity

    NASA Astrophysics Data System (ADS)

    Yamazawa, Hiromi; Terasaka, Yuta; Mizutani, Kenta; Sugiura, Hiroki; Hirao, Shigekazu

    2017-04-01

    Understanding of the release of radioactivity into the atmosphere from the damaged units of the Fukushima Daiichi Nuclear Power Station has improved owing to recent analyses of atmospheric radionuclide concentrations. Our analysis of gamma-ray spectra from monitoring posts located about 100 km south of the site revealed temporal changes in the atmospheric concentrations of several key nuclides, including the noble gas Xe-133 in addition to radio-iodine and radio-cesium nuclides such as I-131 and Cs-137, at 10-minute intervals. Using the atmospheric concentration data in combination with inverse atmospheric transport modelling and a Bayesian statistical method, a modification was proposed to the widely used source term of Katata. A source term for Xe-133 was also proposed. Although the atmospheric concentration data and the source terms help us understand the atmospheric transport processes of radionuclides, they still carry significant uncertainty due to the limited availability of concentration data. Limitations also remain in the atmospheric transport modeling; the largest model uncertainty lies in the deposition processes. It had been pointed out that, within 100 km of the accident site, there were locations at which the ambient dose rate increased significantly a few hours before precipitation detectors recorded the start of rain. According to our analysis, the dose rate increase was caused not directly by airborne radioactivity but by deposition. This phenomenon can be attributed to a deposition process in which evaporating precipitation enhances the efficiency of deposition even when no precipitation is observed at ground level.

  7. The temporal evolution of electromagnetic markers sensitive to the capacity limits of visual short-term memory.

    PubMed

    Mitchell, Daniel J; Cusack, Rhodri

    2011-01-01

    An electroencephalographic (EEG) marker of the limited contents of human visual short-term memory (VSTM) has previously been described. Termed contralateral delay activity, this consists of a sustained, posterior, negative potential that correlates with memory load and is greatest contralateral to the remembered hemifield. The current investigation replicates this finding and uses magnetoencephalography (MEG) to characterize its magnetic counterparts and their neural generators as they evolve throughout the memory delay. A parametric manipulation of memory load, within and beyond capacity limits, allows separation of signals that asymptote with behavioral VSTM performance from additional responses that contribute to a linear increase with set-size. Both EEG and MEG yielded bilateral signals that track the number of objects held in memory, and contralateral signals that are independent of memory load. In MEG, unlike EEG, the contralateral interaction between hemisphere and item load is much weaker, suggesting that bilateral and contralateral markers of memory load reflect distinct sources to which EEG and MEG are differentially sensitive. Nonetheless, source estimation allowed both the bilateral and the weaker contralateral capacity-limited responses to be localized, along with a load-independent contralateral signal. Sources of global and hemisphere-specific signals all localized to the posterior intraparietal sulcus during the early delay. However the bilateral load response peaked earlier and its generators shifted later in the delay. Therefore the hemifield-specific response may be more closely tied to memory maintenance while the global load response may be involved in initial processing of a limited number of attended objects, such as their individuation or consolidation into memory.

  8. NSRD-10: Leak Path Factor Guidance Using MELCOR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Louie, David; Humphries, Larry L.

    Estimating the source term from a U.S. Department of Energy (DOE) nuclear facility requires that analysts know how to apply the simulation tools used, such as the MELCOR code, particularly for a complicated facility that may include an air ventilation system and other active systems that can influence the environmental pathway of the released materials. DOE has designated MELCOR 1.8.5, an unsupported version, as a DOE ToolBox code in its Central Registry, which includes a leak-path-factor guidance report written in 2004 that did not include experimental validation data. Continuing to use this MELCOR version would require additional verification and validation, which may not be feasible from a project cost standpoint; instead, a recent version of MELCOR should be used. Without developer support and experimental validation, it is difficult to convince regulators that the calculated source term from a DOE facility is accurate and defensible. This research replaces the obsolete version in the 2004 DOE leak path factor guidance report with MELCOR 2.1 (the latest version of MELCOR, with continuing model development and user support) and includes applicable experimental data from the reactor safety arena and from the experimental data used in DOE-HDBK-3010. This research provides best-practice values used in MELCOR 2.1 specifically for leak path determination. With these enhancements, the revised leak-path-guidance report should provide confidence to the DOE safety analyst using MELCOR as a source-term determination tool for mitigated accident evaluations.

  9. Audit of manufactured products: use of allergen advisory labels and identification of labeling ambiguities.

    PubMed

    Pieretti, Mariah M; Chung, Danna; Pacenza, Robert; Slotkin, Todd; Sicherer, Scott H

    2009-08-01

    The Food Allergy Labeling and Consumer Protection Act became effective January 1, 2006, and mandates disclosure of the 8 major allergens in plain English, as the source of ingredients, in the ingredient statement. It does not regulate advisory labels. We sought to determine the frequency of, and language used in, voluntary advisory labels among commercially available products and to identify labeling ambiguities affecting consumers with food allergy. Trained surveyors performed a supermarket survey of 20,241 unique manufactured food products (from an original assessment of 49,604 products) for use of advisory labels. A second detailed survey of 744 unique products evaluated additional labeling practices. Overall, 17% of the 20,241 products surveyed contained advisory labels. Chocolate candy, cookies, and baking mixes were the 3 of 24 categories with the greatest frequency (≥40%). Categorically, advisory warnings included "may contain" (38%), "shared equipment" (33%), and "within plant" (29%). The subsurvey disclosed 25 different types of advisory terminology. Nonspecific terms, such as "natural flavors" and "spices," were found on 65% of products and were not linked to a specific ingredient for 83% of them. Additional ambiguities included unclear sources of soy (lecithin vs protein), nondisclosure of the sources of gelatin and lecithin, and simultaneous disclosure of "contains" and "may contain" for the same allergen, among others. Numerous products have advisory labeling and ambiguities that present challenges to consumers with food allergy. Additional allergen labeling regulation could improve safety and quality of life for individuals with food allergy.

  10. Principal component analysis to separate deformation signals from multiple sources during a 2015 intrusive sequence at Kīlauea Volcano

    NASA Astrophysics Data System (ADS)

    Johanson, I. A.; Miklius, A.; Poland, M. P.

    2016-12-01

    A sequence of magmatic events in April-May 2015 at Kīlauea Volcano produced a complex deformation pattern that can be described by multiple deforming sources active simultaneously. The 2015 intrusive sequence began with inflation in the volcano's summit caldera near Halema`uma`u (HMM) Crater, which continued over a few weeks, followed by rapid deflation of the HMM source and inflation of a source in the south caldera region during the next few days. In Kīlauea Volcano's summit area, multiple deformation centers are active at varying times, and all contribute to the overall pattern observed with GPS, tiltmeters, and InSAR. Isolating the contribution of each source is a challenge and complicates the determination of optimal source geometry for the underlying magma bodies. We used principal component analysis of continuous GPS time series from the 2015 intrusion sequence to determine three basis vectors which together account for 83% of the variance in the data set. The three basis vectors are non-orthogonal and not strictly the principal components of the data set. In addition to separating deformation sources in the continuous GPS data, the basis vectors provide a means to scale the contribution of each source in a given interferogram. This provides an additional constraint in a joint model of GPS and InSAR data (COSMO-SkyMed and Sentinel-1A) to determine source geometry. The first basis vector corresponds with inflation in the south caldera region, an area long recognized as the location of a long-term storage reservoir. The second vector represents deformation of the HMM source, which is in the same location as a previously modeled shallow reservoir; however, InSAR data suggest a more complicated source. Preliminary modeling of the deformation attributed to the third basis vector shows that it is consistent with inflation of a steeply dipping ellipsoid centered below Keanakāko`i crater, southeast of HMM.
Keanakāko`i crater is the locus of a known, intermittently active deformation source, which was not previously recognized to have been active during the 2015 event.
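    The source-separation idea can be sketched with a synthetic data matrix: principal component analysis via the singular value decomposition of mean-centered time series recovers the dominant spatiotemporal patterns. Everything below (station counts, source histories, noise level) is illustrative, not the Kīlauea GPS data:

```python
import numpy as np

# Synthetic stand-in for a GPS displacement matrix: rows = epochs,
# columns = station components. Two "sources", each with a fixed spatial
# pattern and its own temporal history, plus noise.
rng = np.random.default_rng(1)
n_epochs, n_channels = 200, 30
pattern1 = rng.standard_normal(n_channels)   # e.g. a summit inflation pattern
pattern2 = rng.standard_normal(n_channels)   # e.g. a south caldera pattern
t = np.linspace(0.0, 1.0, n_epochs)
history1 = np.sin(2 * np.pi * t)             # temporal evolution of source 1
history2 = t**2                              # temporal evolution of source 2
X = np.outer(history1, pattern1) + np.outer(history2, pattern2)
X += 0.05 * rng.standard_normal(X.shape)

# PCA via SVD of the mean-centered data matrix.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / np.sum(s**2)   # variance fraction per component
# With two underlying sources, the first two components capture
# nearly all of the variance in this synthetic example.
```

    In the study summarized above, the retained basis vectors were then rotated away from strict orthogonality to align with physically distinct sources, which is why they are "not strictly the principal components."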

  11. Source Term Model for Steady Micro Jets in a Navier-Stokes Computer Code

    NASA Technical Reports Server (NTRS)

    Waithe, Kenrick A.

    2005-01-01

    A source term model for steady micro jets was implemented into a non-proprietary Navier-Stokes computer code, OVERFLOW. The source term models the mass flow and momentum created by a steady blowing micro jet. The model is obtained by adding the momentum and mass flow created by the jet to the Navier-Stokes equations. The model was tested by comparison with numerical simulations of a single, steady micro jet on a flat plate in two and three dimensions. In two dimensions, the source term model predicted the velocity distribution well compared with a flat plate simulation that used a steady mass flow boundary condition to represent the micro jet. The model was also compared with two three-dimensional flat plate cases that used a steady mass flow boundary condition to simulate a steady micro jet. The three-dimensional comparison included a case with a grid generated to capture the circular shape of the jet and a case without a grid generated for the micro jet; the case without the jet grid mimics the application of the source term. The source term model compared well with both three-dimensional cases. Comparisons of velocity distribution were made before and after the jet, and Mach and vorticity contours were examined. The source term model allows a researcher to quickly investigate different locations of individual or multiple steady micro jets, enabling a preliminary investigation with minimal grid generation and computational time.
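    The core idea, representing the jet by adding its mass flow to the discretized governing equations rather than resolving the jet geometry with a dedicated grid, can be illustrated with a toy 1D conservation problem. This is only a sketch of the source-term concept; it is not the OVERFLOW model, and all numbers are made up:

```python
import numpy as np

# 1D mass conservation with a steady injection at one cell:
# d(rho)/dt + d(rho*u)/dx = S, advanced with first-order upwind differencing
# on a periodic domain, with the advection velocity frozen for simplicity.
nx, dx, dt = 50, 0.1, 0.01
u = 1.0                       # frozen advection velocity (CFL = u*dt/dx = 0.1)
rho = np.ones(nx)             # initial density
S = np.zeros(nx)
S[25] = 0.5                   # steady mass source at the "jet" cell (illustrative)

for _ in range(1000):
    flux = rho * u
    drho = -(flux - np.roll(flux, 1)) / dx   # upwind flux divergence (u > 0)
    rho = rho + dt * (drho + S)

# The divergence term conserves mass exactly on the periodic domain, so the
# total mass grows only by the injected amount: S[25]*dx*dt per step.
total_mass = rho.sum() * dx
```

    In the actual implementation described above, analogous momentum sources are added to the momentum equations at the cells containing the jet, which is what removes the need for a jet-resolving grid.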

  12. Beyond the double banana: improved recognition of temporal lobe seizures in long-term EEG.

    PubMed

    Rosenzweig, Ivana; Fogarasi, András; Johnsen, Birger; Alving, Jørgen; Fabricius, Martin Ejler; Scherg, Michael; Neufeld, Miri Y; Pressler, Ronit; Kjaer, Troels W; van Emde Boas, Walter; Beniczky, Sándor

    2014-02-01

    To investigate whether extending the 10-20 array with 6 electrodes in the inferior temporal chain and constructing computed montages increases the diagnostic value of ictal EEG activity originating in the temporal lobe. In addition, the accuracy of computer-assisted spectral source analysis was investigated. Forty EEG samples were reviewed by 7 EEG experts in various montages (longitudinal and transversal bipolar, common average, source derivation, source montage, current source density, and reference-free montages) using 2 electrode arrays (the 10-20 array and the extended one). Spectral source analysis used the source montage to calculate a density spectral array, defining the earliest oscillatory onset. From this, phase maps were calculated for localization. The reference standard was the decision of the multidisciplinary epilepsy surgery team on the seizure onset zone. Clinical performance was compared with the double banana (longitudinal bipolar montage, 10-20 array). Adding the inferior temporal electrode chain, computed montages (reference free, common average, and source derivation), and voltage maps significantly increased the sensitivity. Phase maps had the highest sensitivity and identified ictal activity at an earlier time point than visual inspection. There was no significant difference concerning specificity. The findings advocate for the use of these digital EEG technology-derived analysis methods in clinical practice.

  13. Possible sources of nitrate in ground water at swine licensed-managed feeding operations in Oklahoma, 2001

    USGS Publications Warehouse

    Becker, Mark F.; Peter, Kathy D.; Masoner, Jason

    2002-01-01

    Samples collected and analyzed by the Oklahoma Department of Agriculture, Food, and Forestry from 1999 to 2001 determined that nitrate exceeded the U.S. Environmental Protection Agency maximum contaminant level for public drinking-water supplies of 10 milligrams per liter as nitrogen in 79 monitoring wells at 35 swine licensed-managed feeding operations (LMFOs) in Oklahoma. The LMFOs are located in rural agricultural settings where long-term agriculture has potentially affected the ground-water quality in some areas. Land use prior to the construction of the LMFOs was assessed to evaluate the types of agricultural land use within a 500-meter radius of the sampled wells. Chemical and microbiological techniques were used to determine the possible sources of nitrate in water sampled from 10 wastewater lagoons and 79 wells. Samples were analyzed for dissolved major ions, dissolved trace elements, dissolved nutrients, nitrogen isotope ratios of nitrate and ammonia, wastewater organic compounds, and fecal coliform bacteria. Bacteria ribotyping analysis was done on selected samples to identify possible specific animal sources. A decision process was developed to identify the possible sources of nitrate. First, nitrogen isotope ratios were used to classify sources as animal, mixed animal and fertilizer, or fertilizer. Second, wastewater organic compound detections, nitrogen isotope ratios, fecal coliform bacteria detections, and ribotyping were used to refine the identification of possible sources as LMFO waste, fertilizer, or unidentified animal, or mixtures of these sources. Additional evidence provided by ribotyping and wastewater organic compound data can, in some cases, specifically indicate the animal source. Detections of three or more wastewater organic compounds that are indicators of animal sources and detections of fecal coliform bacteria provided additional evidence of an animal source.
    LMFO waste was designated as a possible source of nitrate in water from 10 wells. The source of waste in water from five of those wells was determined through ribotyping, and the source in the remaining five was determined by detections of three or more animal-waste compounds in the well samples. An unidentified animal source of nitrate in water from a well does not mean that LMFO waste was not the source; rather, it indicates that multiple animal sources, including LMFO waste, may have contributed the nitrate.

  14. Functional Additive Mixed Models

    PubMed Central

    Scheipl, Fabian; Staicu, Ana-Maria; Greven, Sonja

    2014-01-01

    We propose an extensive framework for additive regression models for correlated functional responses, allowing for multiple partially nested or crossed functional random effects with flexible correlation structures for, e.g., spatial, temporal, or longitudinal functional data. Additionally, our framework includes linear and nonlinear effects of functional and scalar covariates that may vary smoothly over the index of the functional response. It accommodates densely or sparsely observed functional responses and predictors which may be observed with additional error and includes both spline-based and functional principal component-based terms. Estimation and inference in this framework is based on standard additive mixed models, allowing us to take advantage of established methods and robust, flexible algorithms. We provide easy-to-use open source software in the pffr() function for the R-package refund. Simulations show that the proposed method recovers relevant effects reliably, handles small sample sizes well and also scales to larger data sets. Applications with spatially and longitudinally observed functional data demonstrate the flexibility in modeling and interpretability of results of our approach. PMID:26347592

  15. Functional Additive Mixed Models.

    PubMed

    Scheipl, Fabian; Staicu, Ana-Maria; Greven, Sonja

    2015-04-01

    We propose an extensive framework for additive regression models for correlated functional responses, allowing for multiple partially nested or crossed functional random effects with flexible correlation structures for, e.g., spatial, temporal, or longitudinal functional data. Additionally, our framework includes linear and nonlinear effects of functional and scalar covariates that may vary smoothly over the index of the functional response. It accommodates densely or sparsely observed functional responses and predictors which may be observed with additional error and includes both spline-based and functional principal component-based terms. Estimation and inference in this framework is based on standard additive mixed models, allowing us to take advantage of established methods and robust, flexible algorithms. We provide easy-to-use open source software in the pffr() function for the R-package refund. Simulations show that the proposed method recovers relevant effects reliably, handles small sample sizes well and also scales to larger data sets. Applications with spatially and longitudinally observed functional data demonstrate the flexibility in modeling and interpretability of results of our approach.
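    The framework above is implemented in R as the pffr() function of the refund package. As a language-consistent sketch of the underlying idea, penalized basis-expansion additive regression, here is a minimal Python version; the data, basis choice (Gaussian radial basis functions), and penalty value are all made up for illustration and do not reproduce the refund implementation:

```python
import numpy as np

def rbf_basis(x, centers, width):
    """Gaussian radial basis expansion of a 1-D covariate."""
    return np.exp(-((x[:, None] - centers[None, :]) / width) ** 2)

# Synthetic additive data: y = f1(x1) + f2(x2) + noise.
rng = np.random.default_rng(2)
n = 500
x1 = rng.uniform(0.0, 1.0, n)
x2 = rng.uniform(0.0, 1.0, n)
y = np.sin(2 * np.pi * x1) + 4.0 * (x2 - 0.5) ** 2 + 0.1 * rng.standard_normal(n)

# Design matrix: intercept plus one basis block per smooth term.
centers = np.linspace(0.0, 1.0, 12)
X = np.hstack([np.ones((n, 1)),
               rbf_basis(x1, centers, 0.15),
               rbf_basis(x2, centers, 0.15)])

# Ridge-type smoothness penalty (intercept left unpenalized).
lam = 1e-2
P = lam * np.eye(X.shape[1])
P[0, 0] = 0.0
beta = np.linalg.solve(X.T @ X + P, X.T @ y)
resid = y - X @ beta   # residual scatter close to the noise level
```

    The mixed-model framework of the paper goes further: it chooses the smoothing parameters by treating the basis coefficients as random effects, and extends the response to functions rather than scalars.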

  16. Hybrid Wing Body Aircraft System Noise Assessment with Propulsion Airframe Aeroacoustic Experiments

    NASA Technical Reports Server (NTRS)

    Thomas, Russell H.; Burley, Casey L.; Olson, Erik D.

    2010-01-01

    A system noise assessment of a hybrid wing body configuration was performed using NASA's best available aircraft models, engine model, and system noise assessment method. A propulsion airframe aeroacoustic effects experimental database for key noise sources and interaction effects was used to provide data directly in the noise assessment where prediction methods are inadequate. NASA engine and aircraft system models were created to define the hybrid wing body aircraft concept as a twin-engine aircraft with a 7500 nautical mile mission. The engines were modeled as existing-technology high bypass ratio turbofans. The baseline hybrid wing body aircraft was assessed at 22 dB cumulative below the FAA Stage 4 certification level. To determine the potential for noise reduction with relatively near-term technologies, seven other configurations were assessed, beginning with moving the engines two fan nozzle diameters upstream of the trailing edge and then adding technologies for reduction of the highest noise sources. Aft-radiated noise was expected to be the most challenging to reduce; therefore, the experimental database focused on jet nozzle and pylon configurations that could reduce jet noise through a combination of source reduction and shielding effectiveness. The best configuration for reduction of jet noise used state-of-the-art technology chevrons with a pylon above the engine in the crown position. This configuration resulted in jet source noise reduction, favorable azimuthal directivity, relocation of the noise source upstream where it is more effectively shielded by the limited airframe surface, and additional fan noise attenuation from acoustic liners on the crown pylon internal surfaces. Vertical and elevon surfaces were also assessed to add shielding area. Elevon deflection above the trailing edge showed a small additional noise reduction, whereas vertical surfaces resulted in a slight noise increase.
    With the effects of the configurations from the database included, the best available noise reduction was 40 dB cumulative. Projected effects from additional technologies were assessed for an advanced noise reduction configuration including landing gear fairings and advanced pylon and chevron nozzles. Incorporating the three additional technology improvements, the aircraft noise is projected to be 42.4 dB cumulative below the Stage 4 level.

  17. Constraining the Long-Term Average of Earthquake Recurrence Intervals From Paleo- and Historic Earthquakes by Assimilating Information From Instrumental Seismicity

    NASA Astrophysics Data System (ADS)

    Zoeller, G.

    2017-12-01

    Paleo- and historic earthquakes are the most important source of information for the estimation of long-term recurrence intervals in fault zones, because sequences of paleoearthquakes cover more than one seismic cycle. On the other hand, these events are often rare, dating uncertainties are enormous, and missing or misinterpreted events lead to additional problems. Because of these shortcomings, long-term recurrence interval estimates are usually unstable unless additional information is included. In the present study, we assume that the time to the next major earthquake depends on the rate of small and intermediate events between the large ones, in terms of a "clock-change" model that leads to a Brownian Passage Time distribution for recurrence intervals. We take advantage of an earlier finding that the aperiodicity of this distribution can be related to the Gutenberg-Richter b-value, which is usually around one and can be estimated easily from instrumental seismicity in the region under consideration. This allows the uncertainties in the estimation of the mean recurrence interval to be reduced significantly, especially for short paleoearthquake sequences and high dating uncertainties. We present illustrative case studies from Southern California and compare the method with the commonly used approach of exponentially distributed recurrence times, which assumes a stationary Poisson process.
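    The Brownian Passage Time density used above (an inverse Gaussian distribution, parameterized by the mean recurrence interval mu and the aperiodicity alpha) can be written down and checked numerically. The parameter values here are purely illustrative, not estimates for Southern California:

```python
import numpy as np

def bpt_pdf(t, mu, alpha):
    """Brownian Passage Time (inverse Gaussian) density with mean mu and
    aperiodicity alpha (the coefficient of variation of recurrence times)."""
    return (np.sqrt(mu / (2.0 * np.pi * alpha**2 * t**3))
            * np.exp(-(t - mu)**2 / (2.0 * mu * alpha**2 * t)))

# Illustrative numbers: mean recurrence 150 years, aperiodicity 0.7.
mu, alpha = 150.0, 0.7
step = 0.01
t = np.arange(step, 5000.0, step)
f = bpt_pdf(t, mu, alpha)

# Numerical sanity checks via a Riemann sum over a truncated support:
total = f.sum() * step        # should be close to 1
mean = (t * f).sum() * step   # should be close to mu
```

    Tying alpha to the Gutenberg-Richter b-value, as in the study, then lets instrumental seismicity constrain the shape of this density so that only mu needs to be estimated from the sparse paleoearthquake record.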

  18. The influence of cross-order terms in interface mobilities for structure-borne sound source characterization

    NASA Astrophysics Data System (ADS)

    Bonhoff, H. A.; Petersson, B. A. T.

    2010-08-01

    For the characterization of structure-borne sound sources with multi-point or continuous interfaces, substantial simplifications and physical insight can be obtained by incorporating the concept of interface mobilities. The applicability of interface mobilities, however, relies upon the admissibility of neglecting the so-called cross-order terms. Hence, the objective of the present paper is to clarify the significance of cross-order terms for the characterization of vibrational sources. From previous studies, four conditions have been identified under which the cross-order terms can become more influential: non-circular interface geometries, structures with distinctly differing transfer paths, suppression of the zero-order motion, and cases where the contact forces are either in phase or out of phase. In a theoretical study, these four conditions are investigated regarding the frequency range and magnitude of a possible strengthening of the cross-order terms. For the experimental analysis, two source-receiver installations are selected, suitably designed to obtain strong cross-order terms. The transmitted power and the source descriptors are predicted by the approximations of the interface mobility approach and compared with the complete calculations. Neglecting the cross-order terms can result in large misinterpretations at certain frequencies. On average, however, the cross-order terms are found to be insignificant and can be neglected with good approximation. The general applicability of interface mobilities for structure-borne sound source characterization, and for the description of the transmission process, is thereby confirmed.

  19. Recent Progress on Spherical Torus Research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ono, Masayuki; Kaita, Robert

    2014-01-01

    The spherical torus or spherical tokamak (ST) is a member of the tokamak family with its aspect ratio (A = R0/a) reduced to A ~ 1.5, well below the normal tokamak operating range of A ≥ 2.5. As the aspect ratio is reduced, the ideal tokamak beta β (ratio of plasma to magnetic pressure) stability limit increases rapidly, approximately as β ~ 1/A. The plasma current that can be sustained for a given edge safety factor q95 also increases rapidly. Because of this, as well as the natural elongation κ, which makes its plasma shape appear spherical, the ST configuration can yield exceptionally high tokamak performance in a compact geometry. Due to its compactness and high performance, the ST configuration has various near-term applications, including a compact fusion neutron source with low tritium consumption, in addition to its longer-term goal of an attractive fusion energy power source. Since the start of the two megaampere-class ST facilities in 2000, the National Spherical Torus Experiment (NSTX) in the US and the Mega Ampere Spherical Tokamak (MAST) in the UK, active ST research has been conducted worldwide. More than sixteen ST research facilities operating during this period have achieved remarkable advances in all areas of fusion science, involving fundamental fusion energy science as well as innovation. These results suggest exciting future prospects for ST research both near term and longer term. The present paper reviews the scientific progress made by the worldwide ST research community during this new mega-ampere-ST era.

  20. Long-Term Space Astrophysics Program

    NASA Technical Reports Server (NTRS)

    Nowak, Michael A.

    2001-01-01

    This is the final report for our Long-Term Space Astrophysics Program (NRA 94-OSS-12) grant NAG 5-3225. The proposal, entitled 'Spectral and Temporal Properties of Black Hole Candidates', began funding in May 1995 and ran through 31 August 2000. The project summary from the original proposal was as follows: 'We will study the spectral and temporal properties of black hole candidates (BHC) by using data from archival sources (e.g., EXOSAT, Ginga, ROSAT) and proposed follow-up observations with modern instruments (e.g., ASCA, XTE). Our spectral studies will focus on identifying the basic characteristics and luminosities of the emission components in the various 'states' of BHC. We hope to understand and quantify the global energetics of these states. Our temporal studies will focus on expanding and classifying our knowledge of BHC variability properties in each state. We will explore the nature of quasi-periodic oscillations in BHC. We will combine our spectral and temporal studies by analyzing time lags and variability coherence between energy channels. In addition, we will investigate ways of correlating observed variability behavior with specific emission components.' We have accomplished many of the goals laid out in the original proposal. As originally proposed, we have utilized both archival and proprietary satellite data. In terms of archival data, we have utilized data from the Advanced Satellite for Cosmology and Astrophysics (ASCA), ROSAT, and the Rossi X-ray Timing Explorer (RXTE). We also obtained proprietary data from ASCA, RXTE, and the Extreme Ultraviolet Explorer (EUVE). In terms of sources, we have examined a wide variety of both galactic black hole candidates and extragalactic black holes. For the galactic black holes we have observed and analyzed both the low/hard state and the high/soft state. We have performed both spectral and timing analyses on all of these objects.
In addition, we have also examined a number of neutron stars or potential neutron stars. All of our research on the above mentioned objects has resulted in one or more publications in peer-reviewed journals. Attached is a list of refereed publications of research results which have been funded by this grant over approximately the past five and a half years. In addition, we have included a list of conference proceedings and other similar reports that have been associated with this grant.

  1. Transitioning NPOESS Data to Weather Offices: The SPoRT Paradigm with EOS Data

    NASA Technical Reports Server (NTRS)

    Jedlovec, Gary

    2009-01-01

    Real-time satellite information provides one of many data sources used by NWS weather forecast offices (WFOs) to diagnose current weather conditions and to assist in short-term forecast preparation. While GOES satellite data provide relatively coarse spatial resolution coverage of the continental U.S. on a 10-15 minute repeat cycle, polar-orbiting imagery has the potential to provide high-resolution snapshots of weather conditions in many spectral channels. Additionally, polar-orbiting sounding data can provide information on the thermodynamic structure of the atmosphere in data-sparse regions and at asynoptic observation times. The NASA Short-term Prediction Research and Transition (SPoRT) project has demonstrated the utility of polar-orbiting MODIS and AIRS data from the Terra and Aqua satellites to improve weather diagnostics and short-term forecasting on the regional and local scales. SPoRT scientists work directly with forecasters at selected WFOs in the Southern Region (SR) to help them ingest these unique data streams into their AWIPS system, understand how to use the data (through on-site and distance learning techniques), and demonstrate the utility of these products to address significant forecast problems. This process also prepares forecasters for the use of similar observational capabilities from NPOESS operational sensors. NPOESS environmental data records (EDRs) from the Visible/Infrared Imager/Radiometer Suite (VIIRS), the Cross-track Infrared Sounder (CrIS), and the Advanced Technology Microwave Sounder (ATMS) instruments, along with additional value-added products produced by NESDIS, will be available in near real time and made available to WFOs to extend their use of NASA EOS data into the NPOESS era. These new data streams will be integrated into the NWS's new AWIPS II decision support tools.
The AWIPS II system to be unveiled in WFOs in 2009 will be a Java-based decision support system that preserves the functionality of the existing systems and offers unique development opportunities for new data sources and applications in the Service Oriented Architecture (SOA) environment. This paper will highlight some of the SPoRT activities leading to the integration of VIIRS and CrIS/ATMS data into the display capabilities of these new systems to support short-term forecasting problems at WFOs.

  2. Laser Scanning Systems and Techniques in Rockfall Source Identification and Risk Assessment: A Critical Review

    NASA Astrophysics Data System (ADS)

    Fanos, Ali Mutar; Pradhan, Biswajeet

    2018-04-01

    Rockfall poses a risk to people, their property, and transportation routes in mountainous and hilly regions. The phenomenon exhibits characteristics such as wide distribution, sudden occurrence, variable magnitude, high lethality, and randomness. Predicting rockfall both spatially and temporally is therefore a challenging task. The digital terrain model (DTM) is one of the most significant elements in rockfall source identification and risk assessment, and light detection and ranging (LiDAR) is the most effective advanced technique for deriving high-resolution, accurate DTMs. This paper presents a critical overview of the rockfall phenomenon (definition, triggering factors, motion modes, and modeling) and of the LiDAR technique in terms of data pre-processing, DTM generation, and the factors that can be obtained from this technique for rockfall source identification and risk assessment. It also reviews the existing methods used to evaluate rockfall trajectories and their characteristics (frequency, velocity, bouncing height, and kinetic energy), as well as rockfall probability, susceptibility, hazard, and risk. Detailed consideration is given to quantitative methodologies in addition to qualitative ones. The various methods are discussed with respect to their application scales (local and regional). Attention is also given to the latest developments, particularly those accounting for the intensity of the phenomena and the magnitude of events at selected sites.

  3. Gravity Waves in the Southern Hemisphere Extratropical Winter in the 7-km GEOS-5 Nature Run

    NASA Astrophysics Data System (ADS)

    Holt, L. A.; Alexander, M. J.; Coy, L.; Putman, W.; Molod, A.; Pawson, S.

    2016-12-01

    This study investigates winter Southern Hemisphere extratropical gravity waves and their sources in a 7-km horizontal resolution global climate simulation, the GEOS-5 Nature Run (NR). Gravity waves are evaluated by comparing brightness temperature anomalies to those from the Atmospheric Infrared Sounder (AIRS). Gravity wave amplitudes, wavelengths, and propagation directions are also computed in the NR and AIRS. The NR shows good agreement with AIRS in terms of spatial patterns of gravity wave activity and propagation directions, but the NR amplitudes are smaller by about a factor of 5 and the wavelengths are about a factor of 2 longer than in AIRS. In addition to evaluating gravity wave characteristics, gravity wave sources in the NR are also investigated by relating diagnostics of tropospheric sources of gravity waves, such as precipitation, frontogenesis, and potential vorticity anomalies to absolute gravity wave momentum fluxes in the lower stratosphere. Strong precipitation events are the most strongly correlated with absolute momentum flux, supporting previous studies highlighting the importance of moist processes in the generation of Southern Hemisphere extratropical gravity waves. Additionally, gravity wave absolute momentum fluxes over land are compared to those over ocean, and the contribution of orographic and nonorographic gravity waves to the total absolute momentum flux is examined.

  4. The Tea-Carbon Dioxide Laser as a Means of Generating Ultrasound in Solids

    NASA Astrophysics Data System (ADS)

    Taylor, Gregory Stuart

    1990-01-01

    Available from UMI in association with The British Library. Requires signed TDF. The aim of this thesis is to characterise the interaction between pulsed, high-power, 10.6 μm radiation and solids. The work is considered both in the general context of laser generation of ultrasound and specifically to gain a deeper understanding of the interaction between a laser-supported plasma and a solid. The predominant experimental tools used are the homodyne Michelson interferometer and a range of electromagnetic acoustic transducers. To complement the ultrasonic data, various plasma inspection techniques, such as high-speed streak-camera photography and reflection photometry, have been used to correlate the plasma properties with those of the ultrasonic transients. The work on the characterisation of a laser-supported plasma on a solid, which builds on previous experimental and theoretical analysis, gives an increased understanding of the plasma's ultrasonic generation mechanism. The ability to record the time history of the entire plasma-sample interaction yields information on the internal dynamics of plasma growth and shock wave generation. The interaction of the radiation with a solid is characterised in both the plasma breakdown and non-breakdown regimes by a wide ultrasonic source. The variation in source diameter enables the transition from a point to a near-planar ultrasonic source to be studied. The resultant ultrasonic modifications are examined in terms of the wave structure and the directivity pattern. The wave structure is analysed in terms of existing wide-source bulk wave theories and extended to consider the effects on surface and Lamb waves. The directivity patterns of the longitudinal and shear waves are analysed in terms of top-hat and non-uniform source profiles, giving additional insight into the radiation-solid interaction.
The wide, one-dimensional source analysis is extended to a two-dimensional ultrasonic source, generated on non-metals by the optical penetration of radiation within the target. The generation of ultrasound in both metals and non-metals using the CO2 laser is shown to be an efficient process that may be employed almost entirely non-destructively. Such a laser may therefore be used effectively on a far wider range of materials than those tested to date via laser generation, increasing the suitability of the laser technique within the field of Non-Destructive Testing.

  5. Non-diffusive ignition of a gaseous reactive mixture following time-resolved, spatially distributed energy deposition

    NASA Astrophysics Data System (ADS)

    Kassoy, D. R.

    2014-01-01

    Systematic asymptotic methods are applied to the compressible conservation and state equations for a reactive gas, including transport terms, to develop a rational thermomechanical formulation for the ignition of a chemical reaction following time-resolved, spatially distributed thermal energy addition from an external source into a finite volume of gas. A multi-parameter asymptotic analysis is developed for a wide range of energy deposition levels relative to the initial internal energy in the volume when the heating timescale is short compared to the characteristic acoustic timescale of the volume. Below a quantitatively defined threshold for energy addition, a nearly constant volume heating process occurs, with a small but finite internal gas expansion Mach number. Very little added thermal energy is converted to kinetic energy. The gas expelled from the boundary of the hot, high-pressure spot is the source of mechanical disturbances (acoustic and shock waves) that propagate away into the neighbouring unheated gas. When the energy addition reaches the threshold value, the heating process is fully compressible with a substantial internal gas expansion Mach number, the source of blast waves propagating into the unheated environmental gas. This case corresponds to an extremely large non-dimensional hot-spot temperature and pressure. If the former is sufficiently large, a high activation energy chemical reaction is initiated on the short heating timescale. This phenomenon is in contrast to that for more modest levels of energy addition, where a thermal explosion occurs only after the familiar extended ignition delay period for a classical high activation reaction. Transport effects, modulated by an asymptotically small Knudsen number, are shown to be negligible unless a local gradient in temperature, concentration or velocity is exceptionally large.
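The regime distinction described above rests on two nondimensional ratios: heating time against the acoustic time of the volume, and deposited energy against the initial internal energy. A minimal sketch follows; the function, its interface, and the order-one cutoff are purely illustrative stand-ins for the quantitative threshold derived in the asymptotic analysis.

```python
import math

def ignition_regime(E_dep, L, p0, T0, t_heat, gamma=1.4, R=287.0):
    """Classify the heating regime of a hot spot of size L in a gas.

    E_dep  : thermal energy deposited per unit volume (J/m^3)
    L      : characteristic size of the heated volume (m)
    p0, T0 : initial pressure (Pa) and temperature (K)
    t_heat : energy-deposition timescale (s)
    Returns the acoustic timescale, the nondimensional energy addition,
    whether heating is acoustically short, and an (illustrative) regime.
    """
    c0 = math.sqrt(gamma * R * T0)      # initial sound speed
    t_ac = L / c0                       # acoustic timescale of the volume
    e_int = p0 / (gamma - 1.0)          # initial internal energy density
    eps = E_dep / e_int                 # nondimensional energy addition
    short_heating = t_heat < t_ac       # regime assumed by the analysis
    # Hypothetical order-one cutoff separating nearly constant-volume
    # heating from the fully compressible, blast-wave-generating case.
    regime = "compressible" if eps >= 1.0 else "nearly constant volume"
    return t_ac, eps, short_heating, regime
```

For a centimetre-scale spot in ambient air the acoustic time is tens of microseconds, so laser or spark deposition on microsecond scales sits comfortably in the short-heating regime the analysis assumes.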

  6. Soil Organic Carbon and Below Ground Biomass: Development of New GLOBE Special Measurements

    NASA Technical Reports Server (NTRS)

    Levine, Elissa; Haskett, Jonathan

    1999-01-01

    A scientific consensus is building that changes in the atmospheric concentrations of radiatively active gases are changing the climate (IPCC, 1990). One of these gases, CO2, has been increasing in concentration due to additions from anthropogenic sources that are primarily industrial and land-use related. The soil contains a very large pool of carbon, estimated at 1550 Gt (Lal, 1995), which is larger than the atmospheric and biospheric pools of carbon combined (Greenland, 1995). The flux between the soil and the atmosphere is very large, 60 Pg C/yr (Lal, 1997), and is especially important because the soil can act as either a source or a sink for carbon. On any given landscape, as much as 50% of the biomass that provides the major source of carbon can be below ground. In addition, the movement of carbon into and out of the soil is mediated by living organisms. At present, there is no widespread sampling of soil biomass in any consistent or coordinated manner. Current large-scale estimates of soil carbon are limited by the number and widely dispersed nature of the data points available. A measurement of the amount of carbon in the soil would supplement existing carbon databases as well as provide a benchmark that can be used to determine whether the soil is storing carbon or releasing it to the atmosphere. Information on the below-ground biomass would be a valuable addition to our understanding of net primary productivity and standing biomass. The addition of these as special measurements within GLOBE would be unique in terms of areal extent and continuity, and would make a real contribution to scientific understanding of carbon dynamics.

  7. Comparative Study on High-Order Positivity-preserving WENO Schemes

    NASA Technical Reports Server (NTRS)

    Kotov, Dmitry V.; Yee, Helen M.; Sjogreen, Bjorn Axel

    2013-01-01

    The goal of this study is to compare the results obtained by non-positivity-preserving methods with the recently developed positivity-preserving schemes for representative test cases. In particular the more difficult 3D Noh and Sedov problems are considered. These test cases are chosen because of the negative pressure/density most often exhibited by standard high-order shock-capturing schemes. The simulation of a hypersonic nonequilibrium viscous shock tube that is related to the NASA Electric Arc Shock Tube (EAST) is also included. EAST is a high-temperature, high Mach number viscous nonequilibrium flow consisting of 13 species. In addition, as most common shock-capturing schemes have been developed for problems without source terms, when applied to problems with nonlinear and/or stiff source terms these methods can result in spurious solutions, even when solving a conservative system of equations with a conservative scheme. This kind of behavior can be observed even for a scalar case (LeVeque & Yee 1990) as well as for the case consisting of two species and one reaction (Wang et al. 2012). For further information concerning this issue see (LeVeque & Yee 1990; Griffiths et al. 1992; Lafon & Yee 1996; Yee et al. 2012). The EAST example indicated that standard high-order shock-capturing methods exhibit instability of density/pressure in addition to grid-dependent discontinuity locations with insufficient grid points. The evaluation of these test cases is based on the stability of the numerical schemes together with the accuracy of the obtained solutions.
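A common building block of the positivity-preserving schemes referenced here is a conservative scaling limiter that pulls reconstructed point values toward the (positive) cell average just enough to keep them nonnegative. The sketch below is in the style of the Zhang-Shu limiter and is not the authors' exact implementation; the function name and interface are hypothetical.

```python
import numpy as np

def scaling_limiter(cell_avg, point_vals, eps=1e-13):
    """Positivity-preserving scaling limiter (1D sketch, Zhang-Shu style).

    cell_avg   : (N,) positive cell averages of e.g. density
    point_vals : (N, q) reconstructed values at quadrature points per cell
    Each cell's reconstruction is scaled toward its average just enough
    that all point values stay >= eps. Because scaling is about the cell
    average, the average itself (and hence conservation) is unchanged.
    """
    limited = point_vals.copy()
    for i, ubar in enumerate(cell_avg):
        umin = point_vals[i].min()
        if umin < eps:
            # theta in [0, 1]; theta = 1 leaves the reconstruction intact
            theta = min(1.0, (ubar - eps) / (ubar - umin))
            limited[i] = ubar + theta * (point_vals[i] - ubar)
    return limited
```

Applied after each reconstruction step, this keeps density and pressure admissible without altering the underlying high-order scheme away from troubled cells.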

  8. Managing soil nutrients with compost in organic farms of East Georgia

    NASA Astrophysics Data System (ADS)

    Ghambashidze, Giorgi

    2013-04-01

    Soil fertility management in organic farming relies on a long-term integrated approach rather than the more short-term, narrowly targeted solutions common in conventional agriculture. Increasing soil organic matter content through the addition of organic amendments has proven to be a valuable practice for maintaining or restoring soil quality. Organic agriculture relies greatly on building soil organic matter, with compost typically replacing inorganic fertilizers and animal manure as the fertility source of choice. In Georgia, more and more attention is being paid to the development of organic farming, which occupies less than 1% of the country's total agricultural land. Owing to the increased interest in organic production, questions about soil amendments are arising, with special focus on organic fertilizers as basic nutrient supply sources under organic management practice. Within the current research, two different types of compost were prepared and their nutritional value was studied: one from the organic fraction of municipal solid waste and another from fruit-processing residues. In addition to the main nutritional properties, both composts were tested for heavy metal content, one of the main quality parameters. The results show that concentrations of the main nutrients are higher in the municipal solid waste compost, but it also contains more heavy metals, which is not permitted in an organic farming system. The fruit-processing residue compost also has a lower pH value and lower total salt content, making it more suitable for the soils of the East Georgian lowlands, which are mainly characterised by an alkaline reaction.

  9. “Gestaltomics”: Systems Biology Schemes for the Study of Neuropsychiatric Diseases

    PubMed Central

    Gutierrez Najera, Nora A.; Resendis-Antonio, Osbaldo; Nicolini, Humberto

    2017-01-01

    The integration of different sources of biological information about what defines a behavioral phenotype is difficult to unify in an entity that reflects the arithmetic sum of its individual parts. In this sense, the challenge of Systems Biology for understanding the “psychiatric phenotype” is to provide an improved vision of the shape of the phenotype as it is visualized by “Gestalt” psychology, whose fundamental axiom is that the observed phenotype (behavior or mental disorder) will be the result of the integrative composition of every part. Therefore, we propose “Gestaltomics” as a term from Systems Biology to integrate data coming from different sources of information (such as the genome, transcriptome, proteome, epigenome, metabolome, phenome, and microbiome). In addition to this biological complexity, the mind is integrated through multiple brain functions that receive and process complex information through channels and perception networks (i.e., sight, hearing, smell, memory, and attention) that in turn are programmed by genes and influenced by environmental processes (epigenetics). Today, the approach of medical research in human diseases is to isolate one disease for study; however, the presence of an additional disease (co-morbidity) or more than one disease (multimorbidity) adds complexity to the study of these conditions. This review will present the challenge of integrating psychiatric disorders at different levels of information (Gestaltomics). The implications of increasing the level of complexity, for example, studying the co-morbidity with another disease such as cancer, will also be discussed. PMID:28536537

  10. A Source-Term Based Boundary Layer Bleed/Effusion Model for Passive Shock Control

    NASA Technical Reports Server (NTRS)

    Baurle, Robert A.; Norris, Andrew T.

    2011-01-01

    A modeling framework for boundary layer effusion has been developed based on the use of source (or sink) terms instead of the usual practice of specifying bleed directly as a boundary condition. This framework allows the surface boundary condition (i.e. isothermal wall, adiabatic wall, slip wall, etc.) to remain unaltered in the presence of bleed. The approach also readily permits the addition of empirical models for second-order effects that are not easily accounted for by simply defining effective transpiration values. Two effusion models formulated for supersonic flows have been implemented into this framework: the Doerffer/Bohning law and the Slater formulation. These models were applied to unit problems that contain key aspects of the flow physics applicable to bleed systems designed for hypersonic air-breathing propulsion systems. The ability of each model to predict bulk bleed properties was assessed, as well as the response of the boundary layer as it passes through and downstream of a porous bleed system. The model assessment was performed with and without the presence of shock waves. Three-dimensional CFD simulations that included the geometric details of the porous plate bleed systems were also carried out to supplement the experimental data and provide additional insights into the bleed flow physics. Overall, both bleed formulations fared well for the tests performed in this study. However, the sample of test problems considered in this effort was not large enough to permit a comprehensive validation of the models.
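The core idea, replacing a bleed boundary condition with volumetric sink terms so that the wall condition itself stays untouched, can be sketched for the continuity equation alone. This is a minimal illustration, not the Doerffer/Bohning or Slater formulations; the uniform distribution of the sink over the bleed cells and all names here are assumptions.

```python
import numpy as np

def apply_bleed_sink(rho, dx, dt, bleed_cells, mdot_bleed, area):
    """Source-term-based bleed sketch for a 1D continuity update.

    rho         : (N,) cell densities (kg/m^3)
    bleed_cells : boolean mask marking cells over the porous plate
    mdot_bleed  : total bleed mass flow to remove (kg/s), assumed known
    area        : cross-sectional area of each cell face (m^2)
    The mass sink is distributed uniformly over the marked cells as a
    volumetric term, leaving the wall boundary condition unaltered.
    Returns the updated density and the mass removed this step.
    """
    rho_new = rho.copy()
    n = int(bleed_cells.sum())
    sink = mdot_bleed / (area * n * dx)    # volumetric sink, kg/(m^3 s)
    rho_new[bleed_cells] -= dt * sink      # explicit sink update
    removed = dt * sink * n * dx * area    # bookkeeping: equals mdot*dt
    return rho_new, removed
```

In a full solver the same sink pattern would also appear in the momentum and energy equations, scaled by the local velocity and total enthalpy of the extracted fluid.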

  11. Integrating Information in Biological Ontologies and Molecular Networks to Infer Novel Terms.

    PubMed

    Li, Le; Yip, Kevin Y

    2016-12-15

    Currently most terms and term-term relationships in Gene Ontology (GO) are defined manually, which creates cost, consistency, and completeness issues. Recent studies have demonstrated the feasibility of inferring GO automatically from biological networks, which represents an important complementary approach to GO construction. These methods (NeXO and CliXO) are unsupervised, which means 1) they cannot use the information contained in existing GO, 2) the way they integrate biological networks may not optimize accuracy, and 3) they are not customized to infer the three different sub-ontologies of GO. Here we present a semi-supervised method called Unicorn that extends these previous methods to tackle the three problems. Unicorn uses a sub-tree of an existing GO sub-ontology as a training set to learn parameters for integrating multiple networks. Cross-validation results show that Unicorn reliably inferred the left-out parts of each specific GO sub-ontology. In addition, by training Unicorn with an old version of GO together with biological networks, it successfully re-discovered some terms and term-term relationships present only in a new version of GO. Unicorn also successfully inferred some novel terms that were not contained in GO but have biological meanings well-supported by the literature. Source code of Unicorn is available at http://yiplab.cse.cuhk.edu.hk/unicorn/.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barry, Kenneth

    The Nuclear Energy Institute (NEI) Small Modular Reactor (SMR) Licensing Task Force (TF) has been evaluating licensing issues unique and important to iPWRs, ranking these issues, and developing NEI position papers for submittal to the U.S. Nuclear Regulatory Commission (NRC) during the past three years. Papers have been developed and submitted to the NRC in a range of areas including: Price-Anderson Act, NRC annual fees, security, modularity, and staffing. In December 2012, NEI completed a draft position paper on SMR source terms and participated in an NRC public meeting presenting a summary of this paper, which was subsequently submitted to the NRC. One important conclusion of the source term paper was the evaluation and selection of high-importance areas where additional research would have a significant impact on source terms. The highest ranked research area was iPWR containment aerosol natural deposition. The NRC accepts the use of existing aerosol deposition correlations in Regulatory Guide 1.183, but these were developed for large light water reactor (LWR) containments. Application of these correlations to an iPWR design has resulted in greater than a ten-fold reduction of containment airborne aerosol inventory as compared to large LWRs. Development and experimental justification of containment aerosol natural deposition correlations specifically for the unique iPWR containments is expected to result in a large reduction of design basis and beyond-design-basis accident source terms with concomitantly smaller dose to workers and the public. Therefore, NRC acceptance of iPWR containment aerosol natural deposition correlations will directly support the industry's goal of reducing the Emergency Planning Zone (EPZ) for SMRs. Based on the results in this work, it is clear that thermophoresis is relatively unimportant for iPWRs. Gravitational settling is well understood, and may be the dominant process for a dry environment.
Diffusiophoresis and enhanced settling by particle growth are the dominant processes for determining DFs for expected conditions in an iPWR containment. These processes are dependent on the area-to-volume (A/V) ratio, which should benefit iPWR designs because these reactors have higher A/Vs compared to existing LWRs.
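For the well-understood gravitational settling mechanism, the dependence of the decontamination factor (DF) on the area-to-volume ratio can be sketched with Stokes settling in a well-mixed volume. This is an illustrative scaling argument under assumed inputs, not a licensing-grade correlation, and the function name is hypothetical.

```python
import math

def settling_df(d_p, rho_p, t, A_floor, V, mu=1.8e-5, g=9.81):
    """Gravitational settling DF for a well-mixed containment atmosphere.

    d_p     : aerosol particle diameter (m)
    rho_p   : particle density (kg/m^3)
    t       : elapsed time (s)
    A_floor : horizontal deposition area (m^2)
    V       : containment free volume (m^3)
    mu      : gas dynamic viscosity (Pa s), air near room temperature
    Removal is first-order with rate v_s * (A/V), so a higher A/V, as in
    iPWR containments, directly increases the achievable DF.
    """
    v_s = rho_p * d_p**2 * g / (18.0 * mu)   # Stokes settling velocity
    lam = v_s * A_floor / V                  # first-order removal rate (1/s)
    return math.exp(lam * t)                 # DF = C(0) / C(t)
```

Because the exponent is linear in A/V, doubling the area-to-volume ratio squares the DF over a fixed time window, which is the leverage the abstract attributes to iPWR geometry.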

  13. Terms used by nurses to describe patient problems: can SNOMED III represent nursing concepts in the patient record?

    PubMed Central

    Henry, S B; Holzemer, W L; Reilly, C A; Campbell, K E

    1994-01-01

    OBJECTIVE: To analyze the terms used by nurses in a variety of data sources and to test the feasibility of using SNOMED III to represent nursing terms. DESIGN: Prospective research design with manual matching of terms to the SNOMED III vocabulary. MEASUREMENTS: The terms used by nurses to describe patient problems during 485 episodes of care for 201 patients hospitalized for Pneumocystis carinii pneumonia were identified. Problems from four data sources (nurse interview, intershift report, nursing care plan, and nurse progress note/flowsheet) were classified based on the substantive area of the problem and on the terminology used to describe the problem. A test subset of the 25 most frequently used terms from the two written data sources (nursing care plan and nurse progress note/flowsheet) were manually matched to SNOMED III terms to test the feasibility of using that existing vocabulary to represent nursing terms. RESULTS: Nurses most frequently described patient problems as signs/symptoms in the verbal nurse interview and intershift report. In the written data sources, problems were recorded as North American Nursing Diagnosis Association (NANDA) terms and signs/symptoms with similar frequencies. Of the nursing terms in the test subset, 69% were represented using one or more SNOMED III terms. PMID:7719788

  14. Open-source micro-tensile testers via additive manufacturing for the mechanical characterization of thin films and papers

    PubMed Central

    Scheftic, Charlie M.; Brinson, L. Catherine

    2018-01-01

    The cost of specialized scientific equipment can be high and with limited funding resources, researchers and students are often unable to access or purchase the ideal equipment for their projects. In the fields of materials science and mechanical engineering, fundamental equipment such as tensile testing devices can cost tens to hundreds of thousands of dollars. While a research lab often has access to a large-scale testing machine suitable for conventional samples, loading devices for meso- and micro-scale samples for in-situ testing with the myriad of microscopy tools are often hard to source and cost prohibitive. Open-source software has allowed for great strides in the reduction of costs associated with software development and open-source hardware and additive manufacturing have the potential to similarly reduce the costs of scientific equipment and increase the accessibility of scientific research. To investigate the feasibility of open-source hardware, a micro-tensile tester was designed with a freely accessible computer-aided design package and manufactured with a desktop 3D-printer and off-the-shelf components. To our knowledge this is one of the first demonstrations of a tensile tester with additively manufactured components for scientific research. The capabilities of the tensile tester were demonstrated by investigating the mechanical properties of Graphene Oxide (GO) paper and thin films. A 3D printed tensile tester was successfully used in conjunction with an atomic force microscope to provide one of the first quantitative measurements of GO thin film buckling under compression. The tensile tester was also used in conjunction with an atomic force microscope to observe the change in surface topology of a GO paper in response to increasing tensile strain. No significant change in surface topology was observed in contrast to prior hypotheses from the literature. 
Based on this result obtained with the new open-source tensile stage we propose an alternative hypothesis we term ‘superlamellae consolidation’ to explain the initial deformation of GO paper. The additively manufactured tensile tester represents cost savings of >99% compared to commercial solutions in its class and offers simple customization. However, continued development is needed for the tensile tester presented here to approach the technical specifications achievable with commercial solutions. PMID:29813103

  15. Open-source micro-tensile testers via additive manufacturing for the mechanical characterization of thin films and papers.

    PubMed

    Nandy, Krishanu; Collinson, David W; Scheftic, Charlie M; Brinson, L Catherine

    2018-01-01

    The cost of specialized scientific equipment can be high and with limited funding resources, researchers and students are often unable to access or purchase the ideal equipment for their projects. In the fields of materials science and mechanical engineering, fundamental equipment such as tensile testing devices can cost tens to hundreds of thousands of dollars. While a research lab often has access to a large-scale testing machine suitable for conventional samples, loading devices for meso- and micro-scale samples for in-situ testing with the myriad of microscopy tools are often hard to source and cost prohibitive. Open-source software has allowed for great strides in the reduction of costs associated with software development and open-source hardware and additive manufacturing have the potential to similarly reduce the costs of scientific equipment and increase the accessibility of scientific research. To investigate the feasibility of open-source hardware, a micro-tensile tester was designed with a freely accessible computer-aided design package and manufactured with a desktop 3D-printer and off-the-shelf components. To our knowledge this is one of the first demonstrations of a tensile tester with additively manufactured components for scientific research. The capabilities of the tensile tester were demonstrated by investigating the mechanical properties of Graphene Oxide (GO) paper and thin films. A 3D printed tensile tester was successfully used in conjunction with an atomic force microscope to provide one of the first quantitative measurements of GO thin film buckling under compression. The tensile tester was also used in conjunction with an atomic force microscope to observe the change in surface topology of a GO paper in response to increasing tensile strain. No significant change in surface topology was observed in contrast to prior hypotheses from the literature. 
Based on this result obtained with the new open-source tensile stage we propose an alternative hypothesis we term 'superlamellae consolidation' to explain the initial deformation of GO paper. The additively manufactured tensile tester represents cost savings of >99% compared to commercial solutions in its class and offers simple customization. However, continued development is needed for the tensile tester presented here to approach the technical specifications achievable with commercial solutions.

  16. A Semi-implicit Treatment of Porous Media in Steady-State CFD.

    PubMed

    Domaingo, Andreas; Langmayr, Daniel; Somogyi, Bence; Almbauer, Raimund

    There are many situations in computational fluid dynamics which require the definition of source terms in the Navier-Stokes equations. These source terms not only make it possible to model the physics of interest but also have a strong impact on the reliability, stability, and convergence of the numerics involved. Therefore, sophisticated numerical approaches exist for the description of such source terms. In this paper, we focus on the source terms present in the Navier-Stokes or Euler equations due to porous media, in particular the Darcy-Forchheimer equation. We introduce a method for the numerical treatment of the source term which is independent of the spatial discretization and based on linearization. In this description, the source term is treated in a fully implicit way, whereas the other flow variables can be computed in an implicit or explicit manner. This leads to a more robust description in comparison with a fully explicit approach. The method is well suited to be combined with coarse-grid CFD on Cartesian grids, which makes it especially favorable for accelerated solution of coupled 1D-3D problems. To demonstrate the applicability and robustness of the proposed method, a proof-of-concept example in 1D, as well as more complex examples in 2D and 3D, is presented.
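The linearization described can be illustrated for a single velocity component with a Darcy-Forchheimer sink: freezing the |u| factor at the current state makes the new velocity appear linearly, so the sink can be inverted exactly each step. This is a minimal sketch under assumed names and coefficients, with a forward-Euler baseline for comparison, not the paper's full discretization.

```python
import math

def step_velocity(u, dt, dpdx, rho, mu, K, cf, implicit=True):
    """One step of du/dt = -dpdx/rho - (mu/(rho*K)) u - (cf/sqrt(K)) |u| u.

    The Darcy term is linear in u; the Forchheimer term is quadratic.
    Semi-implicit form: freeze the coefficient a + b*|u^n| and solve
        (u_new - u)/dt = forcing - (a + b*|u^n|) * u_new
    for u_new, which damps stiff sinks unconditionally. The explicit
    form evaluates the whole sink at u^n and can overshoot badly.
    """
    a = mu / (rho * K)            # Darcy (linear) coefficient, 1/s
    b = cf / math.sqrt(K)         # Forchheimer (quadratic) coefficient
    forcing = -dpdx / rho
    if implicit:
        return (u + dt * forcing) / (1.0 + dt * (a + b * abs(u)))
    return u + dt * (forcing - (a + b * abs(u)) * u)
```

With a stiff Darcy coefficient (small K) and a large time step, the implicit update decays the velocity monotonically toward equilibrium, while the explicit one changes sign and diverges, which is the robustness gap the abstract addresses.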

  17. Modeling mesoscale eddies

    NASA Astrophysics Data System (ADS)

    Canuto, V. M.; Dubovikov, M. S.

    Mesoscale eddies are not resolved in coarse resolution ocean models and must be modeled. They affect both mean momentum and scalars. At present, no generally accepted model exists for the former; in the latter case, mesoscales are modeled with a bolus velocity u∗ to represent a sink of mean potential energy. However, comparison of u∗(model) vs. u∗ (eddy resolving code, [J. Phys. Ocean. 29 (1999) 2442]) has shown that u∗(model) is incomplete and that additional terms, "unrelated to thickness source or sinks", are required. Thus far, no form of the additional terms has been suggested. To describe mesoscale eddies, we employ the Navier-Stokes and scalar equations and a turbulence model to treat the non-linear interactions. We then show that the problem reduces to an eigenvalue problem for the mesoscale Bernoulli potential. The solution, which we derive in analytic form, is used to construct the momentum and thickness fluxes. In the latter case, the bolus velocity u∗ is found to contain two types of terms: the first type entails the gradient of the mean potential vorticity and represents a positive contribution to the production of mesoscale potential energy; the second type of terms, which is new, entails the velocity of the mean flow and represents a negative contribution to the production of mesoscale potential energy, or equivalently, a backscatter process whereby a fraction of the mesoscale potential energy is returned to the original reservoir of mean potential energy. This type of terms satisfies the physical description of the additional terms given by [J. Phys. Ocean. 29 (1999) 2442]. The mesoscale flux that enters the momentum equations is also contributed by two types of terms of the same physical nature as those entering the thickness flux. The potential vorticity flux is also shown to contain two types of terms: the first is of the gradient-type while the other terms entail the velocity of the mean flow. 
An expression is derived for the mesoscale diffusivity κM and for the mesoscale kinetic energy K in terms of the large-scale fields. The predicted κM(z) agrees with that of heuristic models. The complete mesoscale model in isopycnal coordinates is presented in Appendix D and can be used in coarse resolution ocean global circulation models.

  18. Using high-frequency sensors to identify hydroclimatological controls on storm-event variability in catchment nutrient fluxes and source zone activation

    NASA Astrophysics Data System (ADS)

    Blaen, Phillip; Khamis, Kieran; Lloyd, Charlotte; Krause, Stefan

    2017-04-01

    At the river catchment scale, storm events can drive highly variable behaviour in nutrient and water fluxes, yet short-term dynamics are frequently missed by low resolution sampling regimes. In addition, nutrient source contributions can vary significantly within and between storm events. Our inability to identify and characterise time dynamic source zone contributions severely hampers the adequate design of land use management practices in order to control nutrient exports from agricultural landscapes. Here, we utilise an 8-month high-frequency (hourly) time series of streamflow, nitrate concentration (NO3) and fluorescent dissolved organic matter concentration (FDOM) derived from optical in-situ sensors located in a headwater agricultural catchment. We characterised variability in flow and nutrient dynamics across 29 storm events. Storm events represented 31% of the time series and contributed disproportionately to nutrient loads (43% of NO3 and 36% of FDOM) relative to their duration. Principal components analysis of potential hydroclimatological controls on nutrient fluxes demonstrated that a small number of components, representing >90% of variance in the dataset, were highly significant model predictors of inter-event variability in catchment nutrient export. Hysteresis analysis of nutrient concentration-discharge relationships suggested spatially discrete source zones existed for NO3 and FDOM, and that activation of these zones varied on an event-specific basis. Our results highlight the benefits of high-frequency in-situ monitoring for characterising complex short-term nutrient dynamics and unravelling connections between hydroclimatological variability and river nutrient export and source zone activation under extreme flow conditions. These new process-based insights are fundamental to underpinning the development of targeted management measures to reduce nutrient loading of surface waters.
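The hysteresis analysis mentioned above can be illustrated with a minimal sketch: an event-scale hysteresis index compares concentrations on the rising and falling limbs of the storm hydrograph at matched, normalised discharge levels. The function and the synthetic event data below are illustrative assumptions, not the study's method or measurements.

```python
# Minimal sketch of a concentration-discharge hysteresis index (HI),
# comparing rising- and falling-limb concentrations at matched,
# normalised discharge levels. Names and data are illustrative.

def hysteresis_index(q, c, levels=(0.25, 0.5, 0.75)):
    """Mean of (C_rising - C_falling) at fractional discharge levels.

    q, c: discharge and concentration series for one storm event,
    assumed to rise to a single peak and then recede.
    HI > 0 indicates clockwise hysteresis (higher C on the rising limb),
    suggesting a rapidly mobilised, near-channel source zone.
    """
    peak = q.index(max(q))
    qmin, qmax = min(q), max(q)

    def c_at(level, limb):
        target = qmin + level * (qmax - qmin)
        # nearest sample to the target discharge on the chosen limb
        idx = min(limb, key=lambda i: abs(q[i] - target))
        return c[idx]

    rising = range(0, peak + 1)
    falling = range(peak, len(q))
    diffs = [c_at(lv, rising) - c_at(lv, falling) for lv in levels]
    return sum(diffs) / len(diffs)

# Synthetic clockwise event: concentration peaks before discharge does.
q = [1, 3, 6, 10, 7, 4, 2, 1]
c = [5, 9, 8, 6, 4, 3, 3, 3]
hi = hysteresis_index(q, c)
```

A positive index here flags a clockwise loop; computing it per event, as the study does per storm, is what reveals event-specific source zone activation.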

  19. Estimation of the caesium-137 source term from the Fukushima Daiichi nuclear power plant using a consistent joint assimilation of air concentration and deposition observations

    NASA Astrophysics Data System (ADS)

    Winiarek, Victor; Bocquet, Marc; Duhanyan, Nora; Roustan, Yelva; Saunier, Olivier; Mathieu, Anne

    2014-01-01

    Inverse modelling techniques can be used to estimate the amount of radionuclides and the temporal profile of the source term released in the atmosphere during the accident of the Fukushima Daiichi nuclear power plant in March 2011. In Winiarek et al. (2012b), the lower bounds of the caesium-137 and iodine-131 source terms were estimated with such techniques, using activity concentration measurements. The importance of an objective assessment of prior errors (the observation errors and the background errors) was emphasised for a reliable inversion. In such critical context where the meteorological conditions can make the source term partly unobservable and where only a few observations are available, such prior estimation techniques are mandatory, the retrieved source term being very sensitive to this estimation. We propose to extend the use of these techniques to the estimation of prior errors when assimilating observations from several data sets. The aim is to compute an estimate of the caesium-137 source term jointly using all available data about this radionuclide, such as activity concentrations in the air, but also daily fallout measurements and total cumulated fallout measurements. It is crucial to properly and simultaneously estimate the background errors and the prior errors relative to each data set. A proper estimation of prior errors is also a necessary condition to reliably estimate the a posteriori uncertainty of the estimated source term. Using such techniques, we retrieve a total released quantity of caesium-137 in the interval 11.6-19.3 PBq with an estimated standard deviation range of 15-20% depending on the method and the data sets. The “blind” time intervals of the source term have also been strongly mitigated compared to the first estimations with only activity concentration data.
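The weighting logic at the heart of such an inversion can be sketched in scalar form: a background (prior) value and several observation data sets are combined with weights set by their error variances, so mis-specified prior errors directly bias the estimate. This is a one-parameter analogue under invented numbers, not the paper's full variational inversion.

```python
# Minimal scalar sketch of the assimilation idea: a released quantity x
# is estimated from a background value x_b and several observation data
# sets, each weighted by its (prior) error variance. All numbers are
# illustrative, not the Fukushima estimates.

def blue_estimate(x_b, var_b, obs):
    """Best linear unbiased estimate of x.

    obs: list of (y, var) pairs, one per data set (e.g. air
    concentrations, daily fallout, cumulated fallout), after the
    transport model has mapped each to the same released quantity.
    Mis-specifying the variances skews the weights, which is why an
    objective prior-error estimation is stressed in the paper.
    """
    w = [1.0 / var_b] + [1.0 / v for _, v in obs]
    num = x_b / var_b + sum(y / v for y, v in obs)
    x_a = num / sum(w)
    var_a = 1.0 / sum(w)  # posterior variance: always below the prior's
    return x_a, var_a

x_a, var_a = blue_estimate(x_b=10.0, var_b=4.0,
                           obs=[(14.0, 2.0), (15.0, 8.0)])
```

The shrinking posterior variance is the scalar counterpart of the 15-20% a posteriori standard deviation the study reports for the total release.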

  20. Probabilistic seismic hazard analysis for a nuclear power plant site in southeast Brazil

    NASA Astrophysics Data System (ADS)

    de Almeida, Andréia Abreu Diniz; Assumpção, Marcelo; Bommer, Julian J.; Drouet, Stéphane; Riccomini, Claudio; Prates, Carlos L. M.

    2018-05-01

    A site-specific probabilistic seismic hazard analysis (PSHA) has been performed for the only nuclear power plant site in Brazil, located 130 km southwest of Rio de Janeiro at Angra dos Reis. Logic trees were developed for both the seismic source characterisation and ground-motion characterisation models, in both cases seeking to capture the appreciable ranges of epistemic uncertainty with relatively few branches. This logic-tree structure allowed the hazard calculations to be performed efficiently while obtaining results that reflect the inevitable uncertainty in long-term seismic hazard assessment in this tectonically stable region. An innovative feature of the study is an additional seismic source zone added to capture the potential contributions of characteristic earthquakes associated with geological faults in the region surrounding the coastal site.
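The core PSHA calculation can be sketched compactly: the annual rate of exceeding a ground-motion level sums, over earthquake scenarios, the occurrence rate times the probability that a lognormally distributed ground-motion model exceeds that level. The recurrence rates, ground-motion coefficients and sigma below are invented for illustration, not the Angra dos Reis model.

```python
import math

# Minimal PSHA sketch: annual exceedance rate of a PGA level a, summed
# over (rate, magnitude, distance) scenarios with a lognormal GMPE.
# All coefficients and rates are hypothetical placeholders.

def gmpe_median_pga(m, r_km):
    # toy median ground-motion model (units: g), invented coefficients
    return math.exp(-4.0 + 1.2 * m - 1.1 * math.log(r_km + 10.0))

def norm_sf(z):
    """Standard normal survival function P(Z > z)."""
    return 0.5 * math.erfc(z / math.sqrt(2.0))

def annual_exceedance_rate(a, scenarios, sigma_ln=0.6):
    """scenarios: list of (annual_rate, magnitude, distance_km)."""
    lam = 0.0
    for nu, m, r in scenarios:
        z = (math.log(a) - math.log(gmpe_median_pga(m, r))) / sigma_ln
        lam += nu * norm_sf(z)
    return lam

scenarios = [(0.05, 5.0, 50.0), (0.01, 6.0, 50.0), (0.002, 7.0, 50.0)]
lam_low = annual_exceedance_rate(0.01, scenarios)
lam_high = annual_exceedance_rate(0.1, scenarios)
```

Epistemic uncertainty enters by repeating this calculation for every logic-tree branch and weighting the resulting hazard curves; the extra source zone in the study simply adds scenarios to the sum.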

  1. Metabolite profiling of Dioscorea (yam) species reveals underutilised biodiversity and renewable sources for high-value compounds

    PubMed Central

    Price, Elliott J.; Wilkin, Paul; Sarasan, Viswambharan; Fraser, Paul D.

    2016-01-01

    Yams (Dioscorea spp.) are a multispecies crop with production in over 50 countries generating ~50 MT of edible tubers annually. The long-term storage potential of these tubers is vital for food security in developing countries. Furthermore, many species are important sources of pharmaceutical precursors. Despite these attributes as staple food crops and sources of high-value chemicals, Dioscorea spp. remain largely neglected in comparison to other staple tuber crops of tropical agricultural systems such as cassava (Manihot esculenta) and sweet potato (Ipomoea batatas). To date, studies have focussed on the tubers or rhizomes of Dioscorea, neglecting the foliage as waste. In the present study metabolite profiling procedures, using GC-MS approaches, have been established to assess biochemical diversity across species. The robustness of the procedures was shown using material from the phylogenetic clades. The resultant data allowed separation of the genotypes into clades, species and morphological traits with a putative geographical origin. Additionally, we show the potential of foliage material as a renewable source of high-value compounds. PMID:27385275

  2. PolEASIA Project: Pollution in Eastern Asia - towards better Air Quality Prevision and Impacts' Evaluation

    NASA Astrophysics Data System (ADS)

    Dufour, Gaëlle; Albergel, Armand; Balkanski, Yves; Beekmann, Matthias; Cai, Zhaonan; Fortems-Cheiney, Audrey; Cuesta, Juan; Derognat, Claude; Eremenko, Maxim; Foret, Gilles; Hauglustaine, Didier; Lachatre, Matthieu; Laurent, Benoit; Liu, Yi; Meng, Fan; Siour, Guillaume; Tao, Shu; Velay-Lasry, Fanny; Zhang, Qijie; Zhang, Yuli

    2017-04-01

    The rapid economic development and urbanization of China during the last decades resulted in rising pollutant emissions leading to amongst the largest pollutant concentrations in the world for the major pollutants (ozone, PM2.5, and PM10). Robust monitoring and forecasting systems associated with downstream services providing comprehensive risk indicators are highly needed to establish efficient pollution mitigation strategies. In addition, a precise evaluation of the present and future impacts of Chinese pollutant emissions is of importance to quantify: first, the consequences of pollutants export on atmospheric composition and air quality all over the globe; second, the additional radiative forcing induced by the emitted and produced short-lived climate forcers (ozone and aerosols); third, the long-term health consequences of pollution exposure. To achieve this, a detailed understanding of East Asian pollution is necessary. The French PolEASIA project aims at addressing these different issues by providing a better quantification of major pollutants sources and distributions as well as of their recent and future evolution. The main objectives, methodologies and tools of this starting 4-year project will be presented. An ambitious synergistic and multi-scale approach coupling innovative satellite observations, in situ measurements and chemical transport model simulations will be developed to characterize the spatial distribution, the interannual to daily variability and the trends of the major pollutants (ozone and aerosols) and their sources over East Asia, and to quantify the role of the different processes (emissions, transport, chemical transformation) driving the observed pollutant distributions. A particular attention will be paid to assess the natural and anthropogenic contributions to East Asian pollution. 
Progress made with the understanding of pollutant sources, especially in terms of modeling of pollution over East Asia and advanced numerical approaches such as inverse modeling will serve the development of an efficient and marketable forecasting system for regional outdoor air pollution. The performances of this upgraded forecasting system will be evaluated and promoted to ensure a good visibility of the French technology. In addition, the contribution of Chinese pollution to the regional and global atmospheric composition, as well as the resulting radiative forcing of short-lived species will be determined using both satellite observations and model simulations. Health Impact Assessment (HIA) methods coupled with model simulations will be used to estimate the long-term impacts of exposure to pollutants (PM2.5 and ozone) on cardiovascular and respiratory mortality. First results obtained in this framework will be presented.

  3. The role of noise in clinical environments with particular reference to mental health care: A narrative review.

    PubMed

    Brown, Brian; Rutherford, Peter; Crawford, Paul

    2015-09-01

    There is a large literature suggesting that noise can be detrimental to health and numerous policy documents have promoted noise abatement in clinical settings. This paper documents the role of noise in clinical environments and its deleterious effects with a particular focus on mental health care. Our intention however, is to go beyond the notion that noise is simply undesirable and examine the extent to which researchers have explored the meaning of sound in hospital settings and identify new opportunities for research and practice. This is a narrative review which has grouped the literature and issues in the field into themes concerning the general issues of noise in health care; sleep, noise and hospital environments; noise in intensive care units; implications for service users and staff; and suggestions for new ways of conceptualising and researching clinical soundscapes. Data sources comprised relevant UK policy documents and the results of a literature search of Pubmed, Scopus and Web of Knowledge using terms such as noise, health, hospital, soundscape and relevant additional terms derived from the papers retrieved. In addition the references of retrieved articles were scanned for additional relevant material and historical items significant in shaping the field. Excess unwanted noise can clearly be detrimental to health and impede recovery, and this is clearly recognised by policymakers especially in the UK context. We use the literature surveyed to argue that it is important also to see the noise in clinical environments in terms of the meaning it conveys and rather than merely containing unwanted sound, clinical environments have a 'soundscape'. This comprises noises which convey meaning, for example about the activities of other people, the rhythms of the day and the nature of the auditory community of the hospital. 
Unwanted sound may have unwanted effects, especially on those who are most vulnerable, yet this does not necessarily mean that silence is the better option. Therefore it is our contention that it is important to begin thinking about the social functions of sound in the mental health environment. Whilst it can be stressful, sound can also be soothing, reassuring and a rich source of information about the environment as well. It may be used to secure a degree of privacy for oneself, to exclude others or as a source of solidarity among friends and colleagues. The challenge then is to understand the work that sound does in its ecological context in health care settings. Copyright © 2015. Published by Elsevier Ltd.

  4. Long Term 2 Second Round Source Water Monitoring and Bin Placement Memo

    EPA Pesticide Factsheets

    The Long Term 2 Enhanced Surface Water Treatment Rule (LT2ESWTR) applies to all public water systems served by a surface water source or public water systems served by a ground water source under the direct influence of surface water.

  5. Progress Toward Improving Jet Noise Predictions in Hot Jets

    NASA Technical Reports Server (NTRS)

    Khavaran, Abbas; Kenzakowski, Donald C.

    2007-01-01

    An acoustic analogy methodology for improving noise predictions in hot round jets is presented. Past approaches have often neglected the impact of temperature fluctuations on the predicted sound spectral density, which could be significant for heated jets, and this has yielded noticeable acoustic under-predictions in such cases. The governing acoustic equations adopted here are a set of linearized, inhomogeneous Euler equations. These equations are combined into a single third order linear wave operator when the base flow is considered as a locally parallel mean flow. The remaining second-order fluctuations are regarded as the equivalent sources of sound and are modeled. It is shown that the hot jet effect may be introduced primarily through a fluctuating velocity/enthalpy term. Modeling this additional source requires specialized inputs from a RANS-based flowfield simulation. The information is supplied using an extension to a baseline two equation turbulence model that predicts total enthalpy variance in addition to the standard parameters. Preliminary application of this model to a series of unheated and heated subsonic jets shows significant improvement in the acoustic predictions at the 90 degree observer angle.

  6. Uncertainty, variability, and earthquake physics in ground‐motion prediction equations

    USGS Publications Warehouse

    Baltay, Annemarie S.; Hanks, Thomas C.; Abrahamson, Norm A.

    2017-01-01

    Residuals between ground‐motion data and ground‐motion prediction equations (GMPEs) can be decomposed into terms representing earthquake source, path, and site effects. These terms can be cast in terms of repeatable (epistemic) residuals and the random (aleatory) components. Identifying the repeatable residuals leads to a GMPE with reduced uncertainty for a specific source, site, or path location, which in turn can yield a lower hazard level at small probabilities of exceedance. We illustrate a schematic framework for this residual partitioning with a dataset from the ANZA network, which straddles the central San Jacinto fault in southern California. The dataset consists of more than 3200 earthquakes (1.15≤M≤3) and their peak ground accelerations (PGAs), recorded at close distances (R≤20  km). We construct a small‐magnitude GMPE for these PGA data, incorporating VS30 site conditions and geometrical spreading. Identification and removal of the repeatable source, path, and site terms yield an overall reduction in the standard deviation from 0.97 (in ln units) to 0.44, for a nonergodic assumption, that is, for a single‐source location, single site, and single path. We give examples of relationships between independent seismological observables and the repeatable terms. We find a correlation between location‐based source terms and stress drops in the San Jacinto fault zone region; an explanation of the site term as a function of kappa, the near‐site attenuation parameter; and a suggestion that the path component can be related directly to elastic structure. These correlations allow the repeatable source location, site, and path terms to be determined a priori using independent geophysical relationships. Those terms could be incorporated into location‐specific GMPEs for more accurate and precise ground‐motion prediction.
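The residual partitioning can be sketched with a toy decomposition: total residuals are split into a repeatable event (source) term, a repeatable site term, and an aleatory remainder, and the standard deviation of the remainder is what a nonergodic GMPE retains. The data are synthetic, and the study uses a proper mixed-effects fit rather than this simple sequential averaging.

```python
import statistics

# Minimal sketch of residual partitioning into repeatable event and
# site terms plus an aleatory remainder. Synthetic data; the study
# fits these terms with mixed-effects regression, not averaging.

def partition(residuals):
    """residuals: list of (event_id, site_id, residual)."""
    events = {e for e, _, _ in residuals}
    ev_term = {e: statistics.mean(r for ee, _, r in residuals if ee == e)
               for e in events}
    de_evented = [(e, s, r - ev_term[e]) for e, s, r in residuals]
    sites = {s for _, s, _ in de_evented}
    si_term = {s: statistics.mean(r for _, ss, r in de_evented if ss == s)
               for s in sites}
    remainder = [r - si_term[s] for _, s, r in de_evented]
    return ev_term, si_term, remainder

data = [("eq1", "A", 0.9), ("eq1", "B", 0.5), ("eq2", "A", 0.3),
        ("eq2", "B", -0.1), ("eq3", "A", -0.4), ("eq3", "B", -0.8)]
ev, si, rem = partition(data)
sd_total = statistics.pstdev([r for _, _, r in data])
sd_single = statistics.pstdev(rem)
```

The drop from `sd_total` to `sd_single` mirrors, in miniature, the study's reduction from 0.97 to 0.44 ln units once repeatable terms are removed.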

  7. An imaging-based photometric and colorimetric measurement method for characterizing OLED panels for lighting applications

    NASA Astrophysics Data System (ADS)

    Zhu, Yiting; Narendran, Nadarajah; Tan, Jianchuan; Mou, Xi

    2014-09-01

    The organic light-emitting diode (OLED) has demonstrated its novelty in displays and certain lighting applications. Similar to white light-emitting diode (LED) technology, it also holds the promise of saving energy. Even though the luminous efficacy values of OLED products have been steadily growing, their longevity is still not well understood. Furthermore, currently there is no industry standard for photometric and colorimetric testing, short and long term, of OLEDs. Each OLED manufacturer tests its OLED panels under different electrical and thermal conditions using different measurement methods. In this study, an imaging-based photometric and colorimetric measurement method for OLED panels was investigated. Unlike an LED that can be considered as a point source, the OLED is a large form area source. Therefore, for an area source to satisfy lighting application needs, it is important that it maintains uniform light level and color properties across the emitting surface of the panel over a long period. This study intended to develop a measurement procedure that can be used to test long-term photometric and colorimetric properties of OLED panels. The objective was to better understand how test parameters such as drive current or luminance and temperature affect the degradation rate. In addition, this study investigated whether data interpolation could allow for determination of degradation and lifetime, L70, at application conditions based on the degradation rates measured at different operating conditions.
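The L70 extrapolation idea can be sketched under a simple assumption: if luminance decays exponentially, the decay rate can be estimated from two maintained-luminance measurements and the time to 70% of initial luminance solved in closed form. The exponential model and the numbers are illustrative assumptions, not the study's measured OLED data.

```python
import math

# Minimal sketch of L70 extrapolation: assume L(t) = L0 * exp(-a t),
# estimate the decay rate a from two normalised luminance readings,
# then solve L(t70) = 0.7 * L0. Model and numbers are assumptions.

def decay_rate(t1, l1, t2, l2):
    """Decay rate a from two normalised luminance readings."""
    return math.log(l1 / l2) / (t2 - t1)

def l70_hours(a):
    """Time to reach 70% of initial luminance."""
    return math.log(1.0 / 0.7) / a

# hypothetical readings: 98% output at 1000 h, 92% at 5000 h
a = decay_rate(1000.0, 0.98, 5000.0, 0.92)
t70 = l70_hours(a)
```

Repeating this fit at several drive currents and temperatures, then interpolating the rates to application conditions, is the kind of data interpolation the study set out to evaluate.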

  8. Integrating new Storage Technologies into EOS

    NASA Astrophysics Data System (ADS)

    Peters, Andreas J.; van der Ster, Dan C.; Rocha, Joaquim; Lensing, Paul

    2015-12-01

    The EOS[1] storage software was designed to cover CERN disk-only storage use cases in the medium-term trading scalability against latency. To cover and prepare for long-term requirements the CERN IT data and storage services group (DSS) is actively conducting R&D and open source contributions to experiment with a next generation storage software based on CEPH[3] and ethernet enabled disk drives. CEPH provides a scale-out object storage system RADOS and additionally various optional high-level services like S3 gateway, RADOS block devices and a POSIX compliant file system CephFS. The acquisition of CEPH by Redhat underlines the promising role of CEPH as the open source storage platform of the future. CERN IT is running a CEPH service in the context of OpenStack on a moderate scale of 1 PB replicated storage. Building a 100+PB storage system based on CEPH will require software and hardware tuning. It is of capital importance to demonstrate the feasibility and possibly iron out bottlenecks and blocking issues beforehand. The main idea behind this R&D is to leverage and contribute to existing building blocks in the CEPH storage stack and implement a few CERN specific requirements in a thin, customisable storage layer. A second research topic is the integration of ethernet enabled disks. This paper introduces various ongoing open source developments, their status and applicability.

  9. A real-time laser feedback control method for the three-wave laser source used in the polarimeter-interferometer diagnostic on Joint-TEXT tokamak

    NASA Astrophysics Data System (ADS)

    Xiong, C. Y.; Chen, J.; Li, Q.; Liu, Y.; Gao, L.

    2014-12-01

    A three-wave laser polarimeter-interferometer, equipped with three independent far-infrared laser sources, has been developed on Joint-TEXT (J-TEXT) tokamak. The diagnostic system is capable of high-resolution temporal and phase measurement of the Faraday angle and line-integrated density. However, for long-term operation (>10 min), the free-running lasers can lead to large drifts of the intermediate frequencies (˜100-˜500 kHz/10 min) and decay of laser power (˜10%-˜20%/10 min), which act to degrade diagnostic performance. In addition, these effects lead to increased maintenance cost and limit measurement applicability to long pulse/steady state experiments. To solve this problem, a real-time feedback control method of the laser source is proposed. By accurately controlling the length of each laser cavity, both the intermediate frequencies and laser power can be simultaneously controlled: the intermediate frequencies are controlled according to the pre-set values, while the laser powers are maintained at an optimal level. Based on this approach, a real-time feedback control system has been developed and applied on J-TEXT polarimeter-interferometer. Long-term (theoretically no time limit) feedback of intermediate frequencies (maximum change less than ±12 kHz) and laser powers (maximum relative power change less than ±7%) has been successfully achieved.
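The control principle can be sketched as a proportional loop: the intermediate frequency drifts with free-running cavity length, and each cycle the controller trims the cavity toward the pre-set frequency. The linear frequency/length relation, the gains and the drift rate are illustrative assumptions, not the J-TEXT implementation.

```python
# Minimal sketch of proportional feedback holding an intermediate
# frequency at its pre-set value against slow drift. Parameters are
# illustrative placeholders, not the J-TEXT controller's values.

def run_feedback(f_set_khz, f0_khz, drift_khz, gain, steps=200):
    """Return the frequency history under proportional correction."""
    f = f0_khz
    history = [f]
    for _ in range(steps):
        f += drift_khz               # slow drift of the free-running laser
        f -= gain * (f - f_set_khz)  # cavity-length correction
        history.append(f)
    return history

hist = run_feedback(f_set_khz=500.0, f0_khz=380.0, drift_khz=0.5, gain=0.2)
final_err = abs(hist[-1] - 500.0)
free_err = abs((380.0 + 0.5 * 200) - 500.0)   # no feedback: drift wins
```

A purely proportional loop leaves a small steady-state offset (drift/gain), which is why practical controllers add an integral term; either way the bounded residual error is the analogue of the ±12 kHz band reported above.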

  10. A real-time laser feedback control method for the three-wave laser source used in the polarimeter-interferometer diagnostic on Joint-TEXT tokamak.

    PubMed

    Xiong, C Y; Chen, J; Li, Q; Liu, Y; Gao, L

    2014-12-01

    A three-wave laser polarimeter-interferometer, equipped with three independent far-infrared laser sources, has been developed on Joint-TEXT (J-TEXT) tokamak. The diagnostic system is capable of high-resolution temporal and phase measurement of the Faraday angle and line-integrated density. However, for long-term operation (>10 min), the free-running lasers can lead to large drifts of the intermediate frequencies (∼100-∼500 kHz/10 min) and decay of laser power (∼10%-∼20%/10 min), which act to degrade diagnostic performance. In addition, these effects lead to increased maintenance cost and limit measurement applicability to long pulse/steady state experiments. To solve this problem, a real-time feedback control method of the laser source is proposed. By accurately controlling the length of each laser cavity, both the intermediate frequencies and laser power can be simultaneously controlled: the intermediate frequencies are controlled according to the pre-set values, while the laser powers are maintained at an optimal level. Based on this approach, a real-time feedback control system has been developed and applied on J-TEXT polarimeter-interferometer. Long-term (theoretically no time limit) feedback of intermediate frequencies (maximum change less than ±12 kHz) and laser powers (maximum relative power change less than ±7%) has been successfully achieved.

  11. Auroral Proper Motion in the Era of AMISR and EMCCD

    NASA Astrophysics Data System (ADS)

    Semeter, J. L.

    2016-12-01

    The term "aurora" is a catch-all for luminosity produced by the deposition of magnetospheric energy in the outer atmosphere. The use of this single phenomenological term occludes the rich variety of sources and mechanisms responsible for the excitation. Among these are electron thermal conduction (SAR arcs), electrostatic potential fields ("inverted-V" aurora), wave-particle resonance (Alfvenic aurora, pulsating aurora), pitch-angle scattering (diffuse aurora), and direct injection of plasma sheet particles (PBIs, substorms). Much information about auroral energization has been derived from the energy spectrum of primary particles, which may be measured directly with an in situ detector or indirectly via analysis of the atmospheric response (e.g., auroral spectroscopy, tomography, ionization). Somewhat less emphasized has been the information in the B_perp dimension. Specifically, the scale-dependent motions of auroral forms in the rest frame of the ambient plasma provide a means of partitioning both the source region and the source mechanism. These results, in turn, affect ionospheric state parameters that control the M-I coupling process-most notably, the degree of structure imparted to the conductance field. This paper describes recent results enabled by the advent of two technologies: high frame-rate, high-resolution imaging detectors, and electronically steerable incoherent scatter radar (the AMISR systems). In addition to contributing to our understanding of the aurora, these results may be used in predictive models of multi-scale energy transfer within the disturbed geospace system.

  12. Analysis of temporal decay of diffuse broadband sound fields in enclosures by decomposition in powers of an absorption parameter

    NASA Astrophysics Data System (ADS)

    Bliss, Donald; Franzoni, Linda; Rouse, Jerry; Manning, Ben

    2005-09-01

    An analysis method for time-dependent broadband diffuse sound fields in enclosures is described. Beginning with a formulation utilizing time-dependent broadband intensity boundary sources, the strength of these wall sources is expanded in a series in powers of an absorption parameter, thereby giving a separate boundary integral problem for each power. The temporal behavior is characterized by a Taylor expansion in the delay time for a source to influence an evaluation point. The lowest-order problem has a uniform interior field proportional to the reciprocal of the absorption parameter, as expected, and exhibits relatively slow exponential decay. The next-order problem gives a mean-square pressure distribution that is independent of the absorption parameter and is primarily responsible for the spatial variation of the reverberant field. This problem, which is driven by input sources and the lowest-order reverberant field, depends on source location and the spatial distribution of absorption. Additional problems proceed at integer powers of the absorption parameter, but are essentially higher-order corrections to the spatial variation. Temporal behavior is expressed in terms of an eigenvalue problem, with boundary source strength distributions expressed as eigenmodes. Solutions exhibit rapid short-time spatial redistribution followed by long-time decay of a predominant spatial mode.

  13. A comparison of color fidelity metrics for light sources using simulation of color samples under lighting conditions

    NASA Astrophysics Data System (ADS)

    Kwon, Hyeokjun; Kang, Yoojin; Jang, Junwoo

    2017-09-01

    Color fidelity has been used as one of indices to evaluate the performance of light sources. Since the Color Rendering Index (CRI) was proposed at CIE, many color fidelity metrics have been proposed to increase the accuracy of the metric. This paper focuses on a comparison of the color fidelity metrics in an aspect of accuracy with human visual assessments. To visually evaluate the color fidelity of light sources, we made a simulator that reproduces the color samples under lighting conditions. In this paper, eighteen color samples of the Macbeth color checker under test light sources and reference illuminant for each of them are simulated and displayed on a well-characterized monitor. With only a spectrum set of the test light source and reference illuminant, color samples under any lighting condition can be reproduced. In this paper, the spectrums of the two LED and two OLED light sources that have similar values of CRI are used for the visual assessment. In addition, the results of the visual assessment are compared with the two color fidelity metrics that include CRI and IES TM-30-15 (Rf), proposed by Illuminating Engineering Society (IES) in 2015. Experimental results indicate that Rf outperforms CRI in terms of the correlation with visual assessment.
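The fidelity scoring compared above can be sketched for the CRI case: each test sample's colour shift between the test source and the reference illuminant is penalised linearly, and the general index averages over samples. The 4.6 factor follows the classic CRI definition, but the colour-difference values below are made-up inputs, not measurements from this experiment.

```python
# Minimal sketch of a CRI-style fidelity score: special index per
# colour sample from its colour shift under the test source vs the
# reference illuminant, general index as the average. The shifts
# listed are hypothetical, not measured.

def special_index(delta_e):
    """CRI-style special index: 100 minus 4.6 times the colour shift."""
    return 100.0 - 4.6 * delta_e

def general_index(delta_es):
    """General index: mean special index over the test samples."""
    return sum(special_index(d) for d in delta_es) / len(delta_es)

# hypothetical colour shifts for eight test samples under an LED source
shifts = [1.2, 0.8, 2.0, 1.5, 0.5, 1.0, 2.4, 0.6]
ra = general_index(shifts)
```

TM-30's Rf follows the same average-shift logic but with more samples, a modern colour space and a different scaling, which is one reason it can track visual assessments more closely.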

  14. Source-specific fine particulate air pollution and systemic inflammation in ischaemic heart disease patients

    PubMed Central

    Siponen, Taina; Yli-Tuomi, Tarja; Aurela, Minna; Dufva, Hilkka; Hillamo, Risto; Hirvonen, Maija-Riitta; Huttunen, Kati; Pekkanen, Juha; Pennanen, Arto; Salonen, Iiris; Tiittanen, Pekka; Salonen, Raimo O; Lanki, Timo

    2015-01-01

    Objective To compare short-term effects of fine particles (PM2.5; aerodynamic diameter <2.5 µm) from different sources on the blood levels of markers of systemic inflammation. Methods We followed a panel of 52 ischaemic heart disease patients from 15 November 2005 to 21 April 2006 with clinic visits in every second week in the city of Kotka, Finland, and determined nine inflammatory markers from blood samples. In addition, we monitored outdoor air pollution at a fixed site during the study period and conducted a source apportionment of PM2.5 using the Environmental Protection Agency's model EPA PMF 3.0. We then analysed associations between levels of source-specific PM2.5 and markers of systemic inflammation using linear mixed models. Results We identified five source categories: regional and long-range transport (LRT), traffic, biomass combustion, sea salt, and pulp industry. We found most evidence for the relation of air pollution and inflammation in LRT, traffic and biomass combustion; the most relevant inflammation markers were C-reactive protein, interleukin-12 and myeloperoxidase. Sea salt was not positively associated with any of the inflammatory markers. Conclusions Results suggest that PM2.5 from several sources, such as biomass combustion and traffic, are promoters of systemic inflammation, a risk factor for cardiovascular diseases. PMID:25479755

  15. Semi-implicit and fully implicit shock-capturing methods for hyperbolic conservation laws with stiff source terms

    NASA Technical Reports Server (NTRS)

    Yee, H. C.; Shinn, J. L.

    1986-01-01

    Some numerical aspects of finite-difference algorithms for nonlinear multidimensional hyperbolic conservation laws with stiff nonhomogeneous (source) terms are discussed. If the stiffness is entirely dominated by the source term, a semi-implicit shock-capturing method is proposed provided that the Jacobian of the source terms possesses certain properties. The proposed semi-implicit method can be viewed as a variant of the Bussing and Murman point-implicit scheme with a more appropriate numerical dissipation for the computation of strong shock waves. However, if the stiffness is not solely dominated by the source terms, a fully implicit method would be a better choice. The situation is complicated by problems that are higher than one dimension, and the presence of stiff source terms further complicates the solution procedures for alternating direction implicit (ADI) methods. Several alternatives are discussed. The primary motivation for constructing these schemes was to address thermally and chemically nonequilibrium flows in the hypersonic regime. Due to the unique structure of the eigenvalues and eigenvectors for fluid flows of this type, the computation can be simplified, thus providing a more efficient solution procedure than one might have anticipated.
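The benefit of treating the stiff source implicitly can be shown on the scalar model equation u' = -k u: when k·dt >> 1 the explicit update oscillates and diverges, while the point-implicit update u_{n+1} = u_n / (1 + k·dt) remains stable. This is only the scalar analogue of treating the source-term Jacobian implicitly, not the full shock-capturing scheme.

```python
# Minimal sketch of explicit vs point-implicit treatment of a stiff
# source term, on the scalar model equation u' = -k u with k*dt = 10.

def explicit_step(u, k, dt):
    # forward Euler: amplification factor (1 - k*dt), unstable if k*dt > 2
    return u + dt * (-k * u)

def point_implicit_step(u, k, dt):
    # backward Euler on the source: factor 1/(1 + k*dt), always stable
    return u / (1.0 + k * dt)

k, dt = 1000.0, 0.01          # k*dt = 10: stiff for the explicit scheme
ue = ui = 1.0
for _ in range(50):
    ue = explicit_step(ue, k, dt)
    ui = point_implicit_step(ui, k, dt)
```

The explicit iterate grows by a factor of 9 in magnitude every step while the implicit one shrinks toward the correct equilibrium u = 0, which is why only the source-term Jacobian needs implicit treatment when the stiffness lives entirely in the source.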

  16. Watershed nitrogen and phosphorus balance: The upper Potomac River basin

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jaworski, N.A.; Groffman, P.M.; Keller, A.A.

    1992-01-01

    Nitrogen and phosphorus mass balances were estimated for the portion of the Potomac River basin watershed located above Washington, D.C. The total nitrogen (N) balance included seven input source terms, six sinks, and one 'change-in-storage' term, but was simplified to five input terms and three output terms. The phosphorus (P) balance had four input and three output terms. The estimated balances are based on watershed data from seven information sources. Major sources of nitrogen are animal waste and atmospheric deposition. The major sources of phosphorus are animal waste and fertilizer. The major sink for nitrogen is combined denitrification, volatilization, and change-in-storage. The major sink for phosphorus is change-in-storage. River exports of N and P were 17% and 8%, respectively, of the total N and P inputs. Over 60% of the N and P were volatilized or stored. The major input and output terms on the budget are estimated from direct measurements, but the change-in-storage term is calculated by difference. The factors regulating retention and storage processes are discussed and research needs are identified.
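The bookkeeping behind such a budget is simple to sketch: the change-in-storage term closes the balance by difference, and river export is expressed as a fraction of total inputs. The term names and flux numbers below are placeholders chosen for illustration, not the Potomac values.

```python
# Minimal sketch of mass-balance closure: the residual (change-in-
# storage plus unmeasured losses) is inputs minus measured outputs,
# and river export is reported as a fraction of total inputs.
# All fluxes are hypothetical placeholders.

def close_balance(inputs, outputs):
    """Return (storage_residual, export_fraction).

    inputs/outputs: dicts of term -> flux; 'river_export' must be a
    key in outputs. The residual lumps together change-in-storage and
    losses (e.g. denitrification) not measured directly.
    """
    total_in = sum(inputs.values())
    total_out = sum(outputs.values())
    residual = total_in - total_out
    return residual, outputs["river_export"] / total_in

n_in = {"animal_waste": 40.0, "atmospheric": 25.0, "fertilizer": 20.0,
        "fixation": 10.0, "point_sources": 5.0}
n_out = {"river_export": 17.0, "crop_removal": 30.0, "groundwater": 3.0}
residual, export_frac = close_balance(n_in, n_out)
```

Because the residual absorbs every measurement error in the other terms, it is the least certain entry in the budget, which is exactly why the abstract flags the retention and storage processes as a research need.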

  17. Bulawayo water supplies: Sustainable alternatives for the next decade

    NASA Astrophysics Data System (ADS)

    Mkandla, Noel; Van der Zaag, Pieter; Sibanda, Peter

    Bulawayo is the second largest city in Zimbabwe, with a population of nearly one million people. It is located on the watershed between the Umzingwane and Gwayi catchments. The former is part of the Limpopo basin, while the latter drains into the Zambezi basin. Bulawayo has good potential for economic development but has been stymied by a lack of sufficient water. The city currently relies on five surface sources in the Umzingwane catchment, where it has to compete with evaporation. The well field in the Nyamandlovu aquifer in the Gwayi catchment, which was constructed as an emergency measure during the 1992 drought, is currently not operational. Alternative water supply sources are distant and expensive. A multilinear regression model was developed to analyse and quantify the factors affecting water consumption. It was found that per capita water consumption is very low, indicating suppressed demand. Water rationing, tariffs, rainfall, population growth and gross domestic product are the main factors influencing water consumption in Bulawayo. Assuming that these factors will continue to be influential, future water consumption was projected for intensive, regular and slack water demand management. Future water consumption was then compared with the current water supply capacity in order to determine the date by which the next water supply source is required. With slack demand management, the Nyamandlovu well field should have been operational by 2003, while by the year 2007 an additional source of water is required. With intensive demand management and assuming low population growth, current capacities may suffice to satisfy the suppressed demand until the year 2015, by which time the Nyamandlovu wells should be operational again.
The additional water supply sources that are currently being considered for Bulawayo (namely the Zambezi water pipeline; Gwayi Shangani dam; Mtshabezi dam; Lower Tuli dam; and Glass block dam) were then compared with an alternative water source not yet contemplated, namely drawing groundwater from Umguza, part of the Nyamandlovu aquifer. The paper then provides details of the Umguza alternative, which was designed at pre-feasibility level by Mkandla [Mkandla, N., 2003. Bulawayo water supplies: Umguza well field as a sustainable alternative for the next decade. Unpublished M.Sc. WREM dissertation. University of Zimbabwe, Harare]. All alternative additional water supply sources were compared in terms of their Net Present Values. It was found that Umguza well field is the least-cost alternative to meet additional water demand. The Umguza alternative will be able to satisfy water demand for a period of six to ten years. Thereafter, the second least-cost alternative, namely Gwayi Shangani dam, must be on stream.
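The least-cost comparison described above reduces each candidate source to the net present value of its cost stream and ranks the candidates. A minimal sketch, with purely illustrative capital and running costs and an assumed discount rate (the study's actual figures are not reproduced here):

```python
def npv(cashflows, rate):
    # cashflows[t] is the cost incurred in year t (year 0 = present)
    return sum(c / (1.0 + rate) ** t for t, c in enumerate(cashflows))

# year-0 capital cost followed by nine years of running costs (arbitrary units)
candidates = {
    "Umguza well field":  [12.0] + [1.5] * 9,   # low capital, modest running cost
    "Gwayi Shangani dam": [80.0] + [2.0] * 9,
    "Zambezi pipeline":   [150.0] + [4.0] * 9,
}

ranked = sorted(candidates, key=lambda name: npv(candidates[name], rate=0.08))
print(ranked[0])  # least-cost alternative with these illustrative numbers
```

With these placeholder numbers the well field wins on NPV, mirroring the paper's qualitative conclusion that low-capital groundwater development can outrank large dam and pipeline schemes.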

  18. General public knowledge, preferred dosage forms, and beliefs toward medicines in western Saudi Arabia.

    PubMed

    Alhaddad, Mahmoud S; Abdallah, Qasem M; Alshakhsheer, Sami M; Alosaimi, Salman B; Althmali, Ahmed R; Alahmari, Solaiman A

    2014-06-01

    To measure general public knowledge, sources of knowledge, preferred dosage forms, and beliefs toward medicines. A cross-sectional study design using a convenience-sampling technique was used. A pre-validated questionnaire was designed and distributed to the general public through face-to-face interviews. All data were analyzed, and p-values less than 0.05 were considered significant. The study took place in the Clinical Pharmacy Department, Taif University, Taif, Kingdom of Saudi Arabia, between August 2012 and February 2013. Nine hundred participants successfully responded to this study. Males represented two-thirds of the respondents (66.8%). In addition, 52% of respondents had a high education level. Modern (74.2%) and alternative medicines (88.7%) were understood by most respondents. Tablets (69.6%) and capsules (37.6%) were the most preferred dosage forms. In addition, physicians (66.6%) and pharmacists (46.2%) were the main sources of information regarding medicines. In terms of beliefs, respondents held incorrect beliefs about many of the statements used in this study. There is a need to improve public knowledge and beliefs toward medicines, as well as to make use of the public's preferred dosage forms. In addition, pharmacists should play a major role in these programs, since they are experts on medicines and can play a more active role in patient education and counseling.

  19. A multi-scalar PDF approach for LES of turbulent spray combustion

    NASA Astrophysics Data System (ADS)

    Raman, Venkat; Heye, Colin

    2011-11-01

    A comprehensive joint-scalar probability density function (PDF) approach is proposed for large eddy simulation (LES) of turbulent spray combustion, and tests are conducted to analyze its validity and modeling requirements. The PDF method has the advantage that the chemical source term appears in closed form, but it requires models for the small-scale mixing process. A stable and consistent numerical algorithm for the LES/PDF approach is presented. To understand the modeling issues in the PDF method, direct numerical simulations of a spray flame at three different fuel droplet Stokes numbers and of an equivalent gaseous flame are carried out. Assumptions in closing the subfilter conditional diffusion term in the filtered PDF transport equation are evaluated for various model forms. In addition, the validity of evaporation rate models in high Stokes number flows is analyzed.

  20. Finite Moment Tensors of Southern California Earthquakes

    NASA Astrophysics Data System (ADS)

    Jordan, T. H.; Chen, P.; Zhao, L.

    2003-12-01

    We have developed procedures for inverting broadband waveforms for the finite moment tensors (FMTs) of regional earthquakes. The FMT is defined in terms of second-order polynomial moments of the source space-time function and provides the lowest-order representation of a finite fault rupture; it removes the fault-plane ambiguity of the centroid moment tensor (CMT) and yields several additional parameters of seismological interest: the characteristic length Lc, width Wc, and duration Tc of the faulting, as well as the directivity vector vd of the fault slip. To formulate the inverse problem, we follow and extend the methods of McGuire et al. [2001, 2002], who have successfully recovered the second-order moments of large earthquakes using low-frequency teleseismic data. We express the Fourier spectra of a synthetic point-source waveform in its exponential (Rytov) form and represent the observed waveform relative to the synthetic in terms of two frequency-dependent differential times, a phase delay δτp(ω) and an amplitude-reduction time δτq(ω), which we measure using Gee and Jordan's [1992] isolation-filter technique. We numerically calculate the FMT partial derivatives in terms of second-order spatiotemporal gradients, which allows us to use 3D finite-difference seismograms as our isolation filters. We have applied our methodology to a set of small to medium-sized earthquakes in Southern California. Errors in the anelastic structure introduced perturbations larger than the signal caused by finite-source effects. We have therefore employed a joint inversion technique that recovers the CMT parameters of the aftershocks, as well as the CMT and FMT parameters of the mainshock, under the assumption that the source finiteness of the aftershocks can be ignored.
The joint system of equations relating the δ τ {p} and δ τ {q} data to the source parameters of the mainshock-aftershock cluster is denuisanced for path anomalies in both observables; this projection operation effectively corrects the mainshock data for path-related amplitude anomalies in a way similar to, but more flexible than, empirical Green function (EGF) techniques.
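The characteristic parameters quoted above can be recovered from second-order moments of the source space-time function. The sketch below follows one common convention in the second-moment literature (characteristic dimension = 2 × square root of the corresponding moment eigenvalue); both that factor and the moment values are assumptions for illustration, not the authors' data.

```python
import numpy as np

# hypothetical second central moments of the source space-time function
mu_xx = np.array([[9.0, 1.0, 0.0],    # spatial moment (km^2)
                  [1.0, 4.0, 0.0],
                  [0.0, 0.0, 0.5]])
mu_tt = 4.0                            # temporal moment (s^2)
mu_xt = np.array([6.0, 2.0, 0.0])      # mixed space-time moment (km*s)

eigvals = np.sort(np.linalg.eigvalsh(mu_xx))[::-1]
L_c = 2.0 * np.sqrt(eigvals[0])        # characteristic length (km)
W_c = 2.0 * np.sqrt(eigvals[1])        # characteristic width (km)
T_c = 2.0 * np.sqrt(mu_tt)             # characteristic duration (s)
v_d = mu_xt / mu_tt                    # directivity vector (km/s)
```

The eigenvector associated with the largest spatial eigenvalue gives the rupture orientation, which is how the second-moment representation removes the fault-plane ambiguity of the CMT.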

  1. Alpine Warming induced Nitrogen Export from Green Lakes Valley, Colorado Front Range, USA

    NASA Astrophysics Data System (ADS)

    Barnes, R. T.; Williams, M. W.; Parman, J.

    2012-12-01

    Alpine ecosystems are particularly susceptible to disturbance due to their short growing seasons, sparse vegetation and thin soils. Atmospheric nitrogen deposition and warming temperatures currently affect Green Lakes Valley (GLV) within the Colorado Front Range. Research conducted within the alpine zone links chronic nitrogen inputs to a suite of ecological impacts, resulting in increased nitrate export. According to NADP records at the site, the atmospheric flux of nitrogen has decreased by 0.56 kg ha-1 yr-1 since 2000, due to a decrease in precipitation. Concurrent with this decrease, alpine nitrate yields have continued to increase, by 32% relative to the previous decade (1990-1999). In order to determine the source(s) of the sustained nitrate increases, we utilized long-term datasets to construct a mass balance model for four stream segments (glacier to subalpine) for nitrogen and weathering-product constituents. We also compared geochemical fingerprints of various solute sources (glacial meltwater, thawing permafrost, snow, and stream water) to alpine stream water to determine if sources had changed over time. Long-term trends indicate that, in addition to nitrate, sulfate, calcium, and silica have also increased over the same period. The geochemical composition of thawing permafrost (as indicated by rock glacier meltwater) suggests it is the source of these weathering products. Mass balance results indicate the high ammonium loads within glacial meltwater are rapidly nitrified, contributing approximately 0.45 kg yr-1 to the NO3- flux within the upper reaches of the watershed. The sustained export of these solutes during dry summer months is likely facilitated by the thawing cryosphere providing hydraulic connectivity late into the growing season.
In a neighboring catchment lacking permafrost and glacial features, there were no long-term weathering or nitrogen solute trends, providing further evidence that the changes in alpine chemistry in GLV are likely due to cryospheric thaw exposing soils to biological and geochemical processes. These findings suggest that efforts to reduce nitrogen deposition loads may not improve water quality, as the thawing cryosphere associated with climate change may affect alpine nitrate concentrations as much as, or more than, atmospheric deposition trends.

  2. An X-Ray Investigation of the NGC346 Field in the SMC (3): XMM-Newton Data

    NASA Technical Reports Server (NTRS)

    Naze, Yael; Manfroid, Jean; Corcoran, Michael F.; Stevens, Ian R.

    2004-01-01

    We present new XMM-Newton results on the field around the NGC346 star cluster in the SMC. This continues and extends previously published work on Chandra observations of the same field. The two XMM-Newton observations were obtained, respectively, six months before and six months after the previously published Chandra data. Of the 51 X-ray sources detected with XMM-Newton, 29 were already detected with Chandra. Comparing the properties of these X-ray sources in each of our three datasets has enabled us to investigate their variability on time scales of a year. Changes in the flux levels and/or spectral properties were observed for 21 of these sources. In addition, we discovered long-term variations in the X-ray properties of the peculiar system HD5980, a luminous blue variable star that is likely to be a colliding-wind binary system, which displayed its largest luminosity during the first XMM-Newton observation.

  3. Carbon-dependent alleviation of ammonia toxicity for algae cultivation and associated mechanisms exploration.

    PubMed

    Lu, Qian; Chen, Paul; Addy, Min; Zhang, Renchuan; Deng, Xiangyuan; Ma, Yiwei; Cheng, Yanling; Hussain, Fida; Chen, Chi; Liu, Yuhuan; Ruan, Roger

    2018-02-01

    Ammonia toxicity in wastewater is one of the factors that limit the application of algae technology in wastewater treatment. This work explored the correlation between carbon sources and ammonia assimilation and applied a glucose-assisted nitrogen starvation method to alleviate ammonia toxicity. In this study, ammonia toxicity to Chlorella sp. was observed when the NH3-N concentration reached 28.03 mM in artificial wastewater. Addition of alpha-ketoglutarate to the wastewater promoted ammonia assimilation, but the low utilization efficiency and high cost of alpha-ketoglutarate limit its application in wastewater treatment. Comparison of three common carbon sources, glucose, citric acid, and sodium bicarbonate, indicates that in terms of ammonia assimilation, glucose is the best carbon source. Experimental results suggest that an organic carbon source that is effective at generating energy and hydride donors may be critical to ammonia assimilation. Nitrogen starvation treatment assisted by glucose increased ammonia removal efficiencies and algal viabilities. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. Galaxy evolution and large-scale structure in the far-infrared. I - IRAS pointed observations

    NASA Astrophysics Data System (ADS)

    Lonsdale, Carol J.; Hacking, Perry B.

    1989-04-01

    Redshifts for 66 galaxies were obtained from a sample of 93 60-micron sources detected serendipitously in 22 IRAS deep pointed observations, covering a total area of 18.4 sq deg. The flux density limit of this survey is 150 mJy, 4 times fainter than the IRAS Point Source Catalog (PSC). The luminosity function is similar in shape with those previously published for samples selected from the PSC, with a median redshift of 0.048 for the fainter sample, but shifted to higher space densities. There is evidence that some of the excess number counts in the deeper sample can be explained in terms of a large-scale density enhancement beyond the Pavo-Indus supercluster. In addition, the faintest counts in the new sample confirm the result of Hacking et al. (1989) that faint IRAS 60-micron source counts lie significantly in excess of an extrapolation of the PSC counts assuming no luminosity or density evolution.

  5. Galaxy evolution and large-scale structure in the far-infrared. I. IRAS pointed observations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lonsdale, C.J.; Hacking, P.B.

    1989-04-01

    Redshifts for 66 galaxies were obtained from a sample of 93 60-micron sources detected serendipitously in 22 IRAS deep pointed observations, covering a total area of 18.4 sq deg. The flux density limit of this survey is 150 mJy, 4 times fainter than the IRAS Point Source Catalog (PSC). The luminosity function is similar in shape with those previously published for samples selected from the PSC, with a median redshift of 0.048 for the fainter sample, but shifted to higher space densities. There is evidence that some of the excess number counts in the deeper sample can be explained in terms of a large-scale density enhancement beyond the Pavo-Indus supercluster. In addition, the faintest counts in the new sample confirm the result of Hacking et al. (1989) that faint IRAS 60-micron source counts lie significantly in excess of an extrapolation of the PSC counts assuming no luminosity or density evolution. 81 refs.

  6. Galaxy evolution and large-scale structure in the far-infrared. I - IRAS pointed observations

    NASA Technical Reports Server (NTRS)

    Lonsdale, Carol J.; Hacking, Perry B.

    1989-01-01

    Redshifts for 66 galaxies were obtained from a sample of 93 60-micron sources detected serendipitously in 22 IRAS deep pointed observations, covering a total area of 18.4 sq deg. The flux density limit of this survey is 150 mJy, 4 times fainter than the IRAS Point Source Catalog (PSC). The luminosity function is similar in shape with those previously published for samples selected from the PSC, with a median redshift of 0.048 for the fainter sample, but shifted to higher space densities. There is evidence that some of the excess number counts in the deeper sample can be explained in terms of a large-scale density enhancement beyond the Pavo-Indus supercluster. In addition, the faintest counts in the new sample confirm the result of Hacking et al. (1989) that faint IRAS 60-micron source counts lie significantly in excess of an extrapolation of the PSC counts assuming no luminosity or density evolution.

  7. Convenient yet not a convenience sample: Jury pools as experimental subject pools.

    PubMed

    Murray, Gregg R; Rugeley, Cynthia R; Mitchell, Dona-Gene; Mondak, Jeffery J

    2013-01-01

    Scholars greatly benefit from access to convenient, inexpensive data sources. Many researchers rely on student subject pools, a practice that raises concern about the "college sophomore problem," or the possibility that findings from student subjects do not generalize beyond the campus. As an accessible, low cost, and heterogeneous data source, some researchers have used subjects recruited from jury pools, which are drawn from randomly-selected citizens required by law to appear for jury duty. In this paper, we discuss the strengths and weaknesses of this approach. First, we review pragmatic considerations involving access to jury pools, substantive content, the administration of survey-experiments, and the financial costs and benefits of this approach. Next, we present evidence regarding the quality of jury pool samples in terms of response rates, diversity, and representativeness. We conclude that jury pools, given proper attention to their limitations, offer an attractive addition to the viable sources of experimental data. Copyright © 2012 Elsevier Inc. All rights reserved.

  8. Null stream analysis of Pulsar Timing Array data: localisation of resolvable gravitational wave sources

    NASA Astrophysics Data System (ADS)

    Goldstein, Janna; Veitch, John; Sesana, Alberto; Vecchio, Alberto

    2018-04-01

    Super-massive black hole binaries are expected to produce a gravitational wave (GW) signal in the nano-Hertz frequency band which may be detected by pulsar timing arrays (PTAs) in the coming years. The signal is composed of both stochastic and individually resolvable components. Here we develop a generic Bayesian method for the analysis of resolvable sources based on the construction of `null-streams' which cancel the part of the signal held in common for each pulsar (the Earth-term). For an array of N pulsars there are N - 2 independent null-streams that cancel the GW signal from a particular sky location. This method is applied to the localisation of quasi-circular binaries undergoing adiabatic inspiral. We carry out a systematic investigation of the scaling of the localisation accuracy with signal strength and number of pulsars in the PTA. Additionally, we find that source sky localisation with the International PTA data release one is vastly superior to that achieved by its constituent regional PTAs.
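The null-stream count quoted above follows from linear algebra: the Earth-term of a GW from a given sky location enters each pulsar's timing residual through two antenna-pattern coefficients (one per polarisation), so the coefficient vectors that cancel it form the (N - 2)-dimensional null space of a 2 × N matrix. A sketch of that construction, using random antenna patterns as placeholders for real array geometry:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 5                                   # number of pulsars in the array
F = rng.standard_normal((2, N))         # rows: F+ and Fx pattern for each pulsar

# orthonormal basis of the null space via SVD: rows of Vt beyond the rank
_, _, Vt = np.linalg.svd(F)
null_basis = Vt[2:]                     # shape (N - 2, N)

# any residual vector containing only the common Earth-term signal
# (a combination of the two polarisations) is cancelled by every null stream
signal = F.T @ rng.standard_normal(2)
cancelled = null_basis @ signal
```

Projecting real data onto this basis removes the candidate signal while preserving noise, which is what makes null streams useful as a consistency test and, scanned over sky positions, as a localisation statistic.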

  9. Attention during memory retrieval enhances future remembering.

    PubMed

    Dudukovic, Nicole M; Dubrow, Sarah; Wagner, Anthony D

    2009-10-01

    Memory retrieval is a powerful learning event that influences whether an experience will be remembered in the future. Although retrieval can succeed in the presence of distraction, dividing attention during retrieval may reduce the power of remembering as an encoding event. In the present experiments, participants studied pictures of objects under full attention and then engaged in item recognition and source memory retrieval under full or divided attention. Two days later, a second recognition and source recollection test assessed the impact of attention during initial retrieval on long-term retention. On this latter test, performance was superior for items that had been tested initially under full versus divided attention. More importantly, even when items were correctly recognized on the first test, divided attention reduced the likelihood of subsequent recognition on the second test. The same held true for source recollection. Additionally, foils presented during the first test were also less likely to be later recognized if they had been encountered initially under divided attention. These findings demonstrate that attentive retrieval is critical for learning through remembering.

  10. Stockholm Arlanda Airport as a source of per- and polyfluoroalkyl substances to water, sediment and fish.

    PubMed

    Ahrens, Lutz; Norström, Karin; Viktor, Tomas; Cousins, Anna Palm; Josefsson, Sarah

    2015-06-01

    Fire training facilities are potential sources of per- and polyfluoroalkyl substances (PFASs) to the nearby environment due to the usage of PFAS-containing aqueous fire-fighting foams (AFFFs). The multimedia distribution of perfluoroalkyl carboxylates (PFCAs), perfluoroalkyl sulfonates (PFSAs), perfluorooctanesulfonamide (PFOSA) and 6:2 fluorotelomer sulfonate (FTSA) was investigated near a fire training facility at Stockholm Arlanda Airport in Sweden. The whole-body burden of PFASs in European perch (Perca fluviatilis) was 334±80 μg and was distributed as follows: gonad > liver ≈ muscle > blood > gill. The bioconcentration factor (BCF) and the sediment/water partition coefficient (Kd) increased by 0.6-1.7 and 0.2-0.5 log units, respectively, for each additional CF2 moiety for PFCAs and PFSAs. PFAS concentrations in water showed no significant decreasing trend between 2009 and 2013 (p > 0.05), which indicates that Stockholm Arlanda Airport may be an important source of long-term contamination of the nearby environment with PFASs. Copyright © 2014 Elsevier Ltd. All rights reserved.
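The per-CF2 increases quoted above describe a log-linear chain-length trend: log BCF (or log Kd) rises roughly linearly with the number of CF2 units, so the slope of an ordinary least-squares fit gives the increase per additional CF2 moiety. A sketch with hypothetical data points (not the study's measurements):

```python
import numpy as np

n_cf2   = np.array([5, 6, 7, 8, 9])             # perfluorinated chain length
log_bcf = np.array([1.2, 2.1, 3.3, 4.0, 5.1])   # hypothetical log10 BCF values

slope, intercept = np.polyfit(n_cf2, log_bcf, 1)
print(round(slope, 2))  # log-units of BCF gained per CF2; 0.97 for these numbers
```

A slope near 1 would mean each extra CF2 unit multiplies the bioconcentration factor by roughly a factor of ten, which is why long-chain PFASs dominate the fish body burden.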

  11. Transition From Ideal To Viscous Mach Cones In A Partonic Transport Model

    NASA Astrophysics Data System (ADS)

    Bouras, I.; El, A.; Fochler, O.; Niemi, H.; Xu, Z.; Greiner, C.

    2013-09-01

    Using a partonic transport model, we investigate the evolution of conical structures in ultrarelativistic matter. Using two different source terms and varying the transport properties of the matter, we study the formation of Mach cones. In an additional study, we extract the two-particle correlations from the numerical calculations and compare them to an analytical approximation. The influence of viscosity on the shape of Mach cones and the corresponding two-particle correlations is studied by adjusting the cross section of the medium.

  12. Low latitude ice core evidence for dust deposition on high altitude glaciers

    NASA Astrophysics Data System (ADS)

    Gabrielli, P.; Thompson, L. G.

    2017-12-01

    Polar ice cores from Antarctica and Greenland have provided a wealth of information on dust emission, transport and deposition over glacial to interglacial timescales. These ice cores mainly entrap dust transported long distances from source areas such as Asia for Greenland and South America for Antarctica. Thus, these dust records provide paleo-information about the environmental conditions at the source and the strength/pathways of atmospheric circulation at continental scales. Ice cores have also been extracted from high altitude glaciers in the mid- and low-latitudes and provide dust records generally extending back several centuries and in a few cases back to the last glacial period. For these glaciers the potential sources of dust emission include areas that are close or adjacent to the drilling site which facilitates the potential for a strong imprinting of local dust in the records. In addition, only a few high altitude glaciers allow the reconstruction of past snow accumulation and hence the expression of the dust records in terms of fluxes. Due to their extreme elevation, a few of these high altitude ice cores offer dust histories with the potential to record environmental conditions at remote sources. Dust records (in terms of dust concentration/size, crustal trace elements and terrigenous cations) from Africa, the European Alps, South America and the Himalayas are examined over the last millennium. The interplay of the seasonal atmospheric circulation (e.g. westerlies, monsoons and vertical convection) is shown to play a major role in determining the intensity and origin of dust fallout to the high altitude glaciers around the world.

  13. The importance of electrothermal terms in Ohm's law for magnetized spherical implosions

    DOE PAGES

    Davies, J. R.; Betti, R.; Chang, P. -Y.; ...

    2015-11-06

    The magnetohydrodynamics (MHD) of magnetic-field compression in laser-driven spherical targets is considered. Magnetic-field evolution is cast in terms of an effective fluid velocity, a convective term resulting from resistivity gradients, a resistive diffusion term, and a source term. Effective velocity is the sum of fluid velocity, drift velocity, and heat-flux velocity, given by electron heat flux divided by electron enthalpy density, which has two components: the perpendicular or Nernst velocity and the cross-field velocity. The Nernst velocity compresses the magnetic field as a heat front moves into the gas. The cross-field velocity leads to dynamo generation of an azimuthal magnetic field. It is proposed that the heat-flux velocity should be flux limited using a “Nernst” flux limiter independent of the thermal flux limiter but should not exceed it. The addition of MHD routines to the 1-D, Lagrangian hydrocode LILAC and the Eulerian version of the 2-D hydrocode DRACO is described, and the codes are used to model a magnetized spherical compression on the OMEGA laser. Thermal flux limiting at a shock front is found to cause unphysical electron temperature gradients that lead to large, unphysical magnetic fields caused by the resistivity gradient, so thermal flux limiting in the gas is removed. The Nernst term reduces the benefits of magnetization in inertial fusion. In addition, a Nernst flux limiter ≤ 0.12 is required in the gas in order to agree with measured neutron yield and increases in the neutron-averaged ion temperature caused by magnetization. This corresponds to maintaining the Nernst velocity below the shock velocity, which prevents significant decoupling of the magnetic field and gas compression.

  14. Using patient experiences on Dutch social media to supervise health care services: exploratory study.

    PubMed

    van de Belt, Tom H; Engelen, Lucien J L P G; Verhoef, Lise M; van der Weide, Marian J A; Schoonhoven, Lisette; Kool, Rudolf B

    2015-01-15

    Social media has become mainstream and a growing number of people use it to share health care-related experiences, for example on health care rating sites. These users' experiences and ratings on social media seem to be associated with quality of care. Therefore, information shared by citizens on social media could be of additional value for supervising the quality and safety of health care services by regulatory bodies, thereby stimulating participation by consumers. The objective of the study was to identify the added value of social media for two types of supervision by the Dutch Healthcare Inspectorate (DHI), which is the regulatory body charged with supervising the quality and safety of health care services in the Netherlands. These were (1) supervision in response to incidents reported by individuals, and (2) risk-based supervision. We performed an exploratory study in cooperation with the DHI and searched different social media sources such as Twitter, Facebook, and healthcare rating sites to find additional information for these incidents and topics, from five different sectors. Supervision experts determined the added value for each individual result found, making use of pre-developed scales. Searches in social media resulted in relevant information for six of 40 incidents studied and provided relevant additional information in 72 of 116 cases in risk-based supervision of long-term elderly care. The results showed that social media could be used to include the patient's perspective in supervision. However, it appeared that the rating site ZorgkaartNederland was the only source that provided information that was of additional value for the DHI, while other sources such as forums and social networks like Twitter and Facebook did not result in additional information. This information could be of importance for health care inspectorates, particularly for its enforcement by risk-based supervision in care of the elderly. 
Further research is needed to determine the added value for other health care sectors.

  15. Using Patient Experiences on Dutch Social Media to Supervise Health Care Services: Exploratory Study

    PubMed Central

    Engelen, Lucien JLPG; Verhoef, Lise M; van der Weide, Marian JA; Schoonhoven, Lisette; Kool, Rudolf B

    2015-01-01

    Background Social media has become mainstream and a growing number of people use it to share health care-related experiences, for example on health care rating sites. These users’ experiences and ratings on social media seem to be associated with quality of care. Therefore, information shared by citizens on social media could be of additional value for supervising the quality and safety of health care services by regulatory bodies, thereby stimulating participation by consumers. Objective The objective of the study was to identify the added value of social media for two types of supervision by the Dutch Healthcare Inspectorate (DHI), which is the regulatory body charged with supervising the quality and safety of health care services in the Netherlands. These were (1) supervision in response to incidents reported by individuals, and (2) risk-based supervision. Methods We performed an exploratory study in cooperation with the DHI and searched different social media sources such as Twitter, Facebook, and healthcare rating sites to find additional information for these incidents and topics, from five different sectors. Supervision experts determined the added value for each individual result found, making use of pre-developed scales. Results Searches in social media resulted in relevant information for six of 40 incidents studied and provided relevant additional information in 72 of 116 cases in risk-based supervision of long-term elderly care. Conclusions The results showed that social media could be used to include the patient’s perspective in supervision. However, it appeared that the rating site ZorgkaartNederland was the only source that provided information that was of additional value for the DHI, while other sources such as forums and social networks like Twitter and Facebook did not result in additional information. 
This information could be of importance for health care inspectorates, particularly for its enforcement by risk-based supervision in care of the elderly. Further research is needed to determine the added value for other health care sectors. PMID:25592481

  16. Establishment of a Comprehensive List of Candidate Antiaging Medicinal Herb Used in Korean Medicine by Text Mining of the Classical Korean Medical Literature, “Dongeuibogam,” and Preliminary Evaluation of the Antiaging Effects of These Herbs

    PubMed Central

    Choi, Moo Jin; Choi, Byung Tae; Shin, Hwa Kyoung; Shin, Byung Cheul; Han, Yoo Kyoung; Baek, Jin Ung

    2015-01-01

    The major objectives of this study were to provide a list of candidate antiaging medicinal herbs that have been widely utilized in Korean medicine and to organize preliminary data for the benefit of experimental and clinical researchers to develop new drug therapies by analyzing previous studies. “Dongeuibogam,” a representative source of the Korean medicine literature, was selected to investigate candidate antiaging medicinal herbs and to identify appropriate terms that describe the specific antiaging effects that these herbs are predicted to elicit. In addition, we aimed to review previous studies that referenced the selected candidate antiaging medicinal herbs. From our chosen source, “Dongeuibogam,” we were able to screen 102 terms describing antiaging effects, which were further classified into 11 subtypes. Ninety-seven candidate antiaging medicinal herbs were selected using the criterion that their antiaging effects were described using the same terms as those employed in “Dongeuibogam.” These candidates were classified into 11 subtypes. Of the 97 candidate antiaging medicinal herbs selected, 47 are widely used by Korean medical doctors in Korea and were selected for further analysis of their antiaging effects. Overall, we found an average of 7.7 previous studies per candidate herb that described their antiaging effects. PMID:25861371

  17. Long-term financing needs for HIV control in sub-Saharan Africa in 2015-2050: a modelling study.

    PubMed

    Atun, Rifat; Chang, Angela Y; Ogbuoji, Osondu; Silva, Sachin; Resch, Stephen; Hontelez, Jan; Bärnighausen, Till

    2016-03-06

To estimate the present value of current and future funding needed for HIV treatment and prevention in 9 sub-Saharan African (SSA) countries that account for 70% of HIV burden in Africa under different scenarios of intervention scale-up. To analyse the gaps between current expenditures and funding obligations, and discuss the policy implications of future financing needs. We used the Goals module from Spectrum, and applied the most up-to-date cost and coverage data to provide a range of estimates for future financing obligations. The four scale-up scenarios vary by treatment initiation threshold and service coverage level. We compared the model projections to current domestic and international financial sources available in selected SSA countries. In the 9 SSA countries, the estimated resources required for HIV prevention and treatment in 2015-2050 range from US$98 billion, to maintain current coverage levels for treatment and prevention with eligibility for treatment initiation at a CD4 count of <500/mm³, to US$261 billion if treatment were to be extended to all HIV-positive individuals and prevention scaled up. With the addition of new funding obligations for HIV, which arise implicitly through the commitment to achieve higher than current treatment coverage levels, overall financial obligations (the sum of debt levels and the present value of the stock of future HIV funding obligations) would rise substantially. Investing upfront in the scale-up of HIV services to achieve high coverage levels will reduce HIV incidence and future prevention and treatment expenditures by realising the long-term preventive effects of ART in reducing HIV transmission. Future obligations are too substantial for most SSA countries to be met from domestic sources alone. New sources of funding, in addition to domestic sources, include innovative financing. Debt sustainability for a sustained HIV response is an urgent imperative for affected countries and donors.
Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
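The "present value of the stock of future HIV funding obligations" is a standard discounting calculation. A minimal sketch follows, with a made-up flat cost stream and a 3% discount rate rather than the study's Spectrum-based estimates:

```python
# Present value of a stream of future funding obligations, discounted at a
# constant annual rate. The cost stream and the 3% rate are illustrative only.

def present_value(costs, rate):
    """Discount costs[t], paid t years from now, back to today and sum."""
    return sum(cost / (1.0 + rate) ** t for t, cost in enumerate(costs))

# e.g. a flat obligation of 10 (billion US$) per year for 5 years
pv = present_value([10.0] * 5, rate=0.03)
print(round(pv, 2))  # somewhat less than the undiscounted 50
```

Higher discount rates or longer horizons change the answer substantially, which is why multi-decade estimates like the 2015-2050 figures above are reported as ranges across scenarios.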

  18. Metabolic analyses of the improved ε-poly-L-lysine productivity using a glucose-glycerol mixed carbon source in chemostat cultures.

    PubMed

    Zhang, Jian-Hua; Zeng, Xin; Chen, Xu-Sheng; Mao, Zhong-Gui

    2018-04-21

The glucose-glycerol mixed carbon source remarkably reduced the batch fermentation time of ε-poly-L-lysine (ε-PL) production, leading to higher productivity of both biomass and ε-PL, which is of great significance in industrial microbial fermentation. Our previous study confirmed the positive influence of fast cell growth on ε-PL biosynthesis, but the direct influence of the mixed carbon source on ε-PL production was still unknown. In this work, chemostat culture was employed to study the capacity for ε-PL biosynthesis on different carbon sources at the same dilution rate of 0.05 h⁻¹. The results indicated that the mixed carbon source could enhance ε-PL productivity beyond the effect of rapid cell growth. Analysis of key enzymes demonstrated that the activities of phosphoenolpyruvate carboxylase, citrate synthase, aspartokinase and ε-PL synthetase were all increased in chemostat culture with the mixed carbon source. In addition, the carbon fluxes were also improved with the mixed carbon source in terms of the tricarboxylic acid cycle, anaplerotic and diaminopimelate pathways. Moreover, the mixed carbon source also accelerated energy metabolism, leading to higher levels of energy charge and a higher NADH/NAD⁺ ratio. The overall improvements of primary metabolism in chemostat culture with the glucose-glycerol combination provided sufficient carbon skeletons and ATP for ε-PL biosynthesis. Therefore, the significantly higher ε-PL productivity with the mixed carbon source was a combined effect of both a superior substrate group and rapid cell growth.

  19. "Do-It-Yourself" reliable pH-stat device by using open-source software, inexpensive hardware and available laboratory equipment

    PubMed Central

    Kragic, Rastislav; Kostic, Mirjana

    2018-01-01

In this paper, we present the construction of a reliable and inexpensive pH stat device, by using open-source “OpenPhControl” software, inexpensive hardware (a peristaltic and a syringe pump, Arduino, a step motor…), and readily available laboratory devices: a pH meter, a computer, a webcam, and some 3D printed parts. We provide a methodology for the design, development and test results of each part of the device, as well as of the entire system. In addition to dosing reagents by means of a low-cost peristaltic pump, we also present carefully controlled dosing of reagents by an open-source syringe pump. The upgrading of the basic open-source syringe pump is described in terms of pump control and the use of a larger syringe. In addition to the basic functions of a pH stat, i.e. pH value measurement and maintenance, an improvement allowing the device to be used for potentiometric titration has been made as well. We have demonstrated the device’s utility when applied to cellulose fiber oxidation with the 2,2,6,6-tetramethylpiperidine-1-oxyl radical, i.e. TEMPO-mediated oxidation. In support of this, we present the results obtained for the oxidation kinetics, the consumption of added reagent and experimental repeatability. Considering that open-source scientific tools are available to everyone, that researchers can construct and adjust the device according to their needs, and that the total cost of the open-source pH stat device, excluding the existing laboratory equipment (pH meter, computer and glassware), was less than 150 EUR, we believe that, at a small fraction of the cost of available commercial offers, our open-source pH stat can significantly improve experimental work wherever the use of a pH stat is necessary. PMID:29509793

  20. "Do-It-Yourself" reliable pH-stat device by using open-source software, inexpensive hardware and available laboratory equipment.

    PubMed

    Milanovic, Jovana Z; Milanovic, Predrag; Kragic, Rastislav; Kostic, Mirjana

    2018-01-01

In this paper, we present the construction of a reliable and inexpensive pH stat device, by using open-source "OpenPhControl" software, inexpensive hardware (a peristaltic and a syringe pump, Arduino, a step motor…), and readily available laboratory devices: a pH meter, a computer, a webcam, and some 3D printed parts. We provide a methodology for the design, development and test results of each part of the device, as well as of the entire system. In addition to dosing reagents by means of a low-cost peristaltic pump, we also present carefully controlled dosing of reagents by an open-source syringe pump. The upgrading of the basic open-source syringe pump is described in terms of pump control and the use of a larger syringe. In addition to the basic functions of a pH stat, i.e. pH value measurement and maintenance, an improvement allowing the device to be used for potentiometric titration has been made as well. We have demonstrated the device's utility when applied to cellulose fiber oxidation with the 2,2,6,6-tetramethylpiperidine-1-oxyl radical, i.e. TEMPO-mediated oxidation. In support of this, we present the results obtained for the oxidation kinetics, the consumption of added reagent and experimental repeatability. Considering that open-source scientific tools are available to everyone, that researchers can construct and adjust the device according to their needs, and that the total cost of the open-source pH stat device, excluding the existing laboratory equipment (pH meter, computer and glassware), was less than 150 EUR, we believe that, at a small fraction of the cost of available commercial offers, our open-source pH stat can significantly improve experimental work wherever the use of a pH stat is necessary.
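The core pH-stat behaviour described above, measure the pH and dose reagent whenever it drifts out of a band around the setpoint, can be sketched as a simple control loop. This is not the OpenPhControl code; the dose volume, tolerance and pH readings below are invented for illustration, and the real device drives an Arduino-controlled pump rather than returning a number:

```python
# Minimal sketch of the pH-stat idea: on each control cycle, read the pH and,
# if it has fallen below the setpoint band, dose a fixed volume of base.
# Hypothetical parameters; stand-in for the real pump/probe interfaces.

def ph_stat_step(current_ph, setpoint, dose_ml, tolerance=0.05):
    """Return the reagent volume (mL) to dose for one control cycle."""
    if current_ph < setpoint - tolerance:
        return dose_ml          # pH too low: add base
    return 0.0                  # within the band (or above): do nothing

def run_titration(ph_readings, setpoint, dose_ml=0.1):
    """Accumulate total reagent consumption over a series of readings."""
    return sum(ph_stat_step(ph, setpoint, dose_ml) for ph in ph_readings)

# TEMPO-mediated oxidation releases acid, so the pH keeps drifting downward:
readings = [10.0, 9.9, 9.8, 9.95, 9.7, 9.85]
total = run_titration(readings, setpoint=10.0)
print(total)   # reagent consumed over the six cycles
```

Logging the cumulative dose per cycle is what yields the reagent-consumption curves the authors use to follow the oxidation kinetics.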

  1. A Methodology for the Integration of a Mechanistic Source Term Analysis in a Probabilistic Framework for Advanced Reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grabaskas, Dave; Brunett, Acacia J.; Bucknor, Matthew

GE Hitachi Nuclear Energy (GEH) and Argonne National Laboratory are currently engaged in a joint effort to modernize and develop probabilistic risk assessment (PRA) techniques for advanced non-light water reactors. At a high level, the primary outcome of this project will be the development of next-generation PRA methodologies that will enable risk-informed prioritization of safety- and reliability-focused research and development, while also identifying gaps that may be resolved through additional research. A subset of this effort is the development of PRA methodologies to conduct a mechanistic source term (MST) analysis for event sequences that could result in the release of radionuclides. The MST analysis seeks to realistically model and assess the transport, retention, and release of radionuclides from the reactor to the environment. The MST methods developed during this project seek to satisfy the requirements of the Mechanistic Source Term element of the ASME/ANS Non-LWR PRA standard. The MST methodology consists of separate analysis approaches for risk-significant and non-risk significant event sequences that may result in the release of radionuclides from the reactor. For risk-significant event sequences, the methodology focuses on a detailed assessment, using mechanistic models, of radionuclide release from the fuel, transport through and release from the primary system, transport in the containment, and finally release to the environment. The analysis approach for non-risk significant event sequences examines the possibility of large radionuclide releases due to events such as re-criticality or the complete loss of radionuclide barriers. This paper provides details on the MST methodology, including the interface between the MST analysis and other elements of the PRA, and provides a simplified example MST calculation for a sodium fast reactor.

  2. Impact of routine episodic emissions on the expected frequency distribution of emissions from oil and gas production sources.

    NASA Astrophysics Data System (ADS)

    Smith, N.; Blewitt, D.; Hebert, L. B.

    2015-12-01

In coordination with oil and gas operators, we developed a high resolution (< 1 min) simulation of temporal variability in well-pad oil and gas emissions over a year. We include routine emissions from condensate tanks, dehydrators, pneumatic devices, fugitive leaks and liquids unloading. We explore the variability in natural gas emissions from these individual well-pad sources, and find that routine short-term episodic emissions such as tank flashing and liquids unloading result in the appearance of a skewed, or 'fat-tail', distribution of emissions from an individual well-pad over time. Additionally, we explore the expected variability in emissions from multiple wells with different raw gas composition, gas/liquids production volumes and control equipment. Differences in well-level composition, production volume and control equipment translate into differences in well-level emissions, leading to a fat-tail distribution of emissions in the absence of operational upsets. Our results have several implications for recent studies focusing on emissions from oil and gas sources. The time scale of emission estimates is important and has significant policy implications. Fat-tail distributions may not be entirely driven by avoidable mechanical failures, and are expected to occur under routine operational conditions from short-duration emissions (e.g., tank flashing, liquids unloading). An understanding of the expected distribution of emissions for a particular population of wells is necessary to evaluate whether an observed distribution is more skewed than expected. Temporal variability in well-pad emissions makes comparisons to annual average emissions inventories difficult and may complicate the interpretation of long-term ambient fenceline monitoring data. Sophisticated change-detection algorithms will be necessary to distinguish true operational upsets from routine short-term emissions.
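The mechanism described above, rare but large routine episodes skewing the distribution of snapshot observations, can be illustrated with a toy Monte Carlo sketch. The baseline rate, episode probability and episode size are invented numbers, not the study's simulation inputs:

```python
# Toy illustration of how routine episodic events skew an emissions
# distribution: a steady baseline emission plus occasional short "flashing"
# events yields a right-skewed (fat-tailed) set of snapshot observations.
# All rates and probabilities here are invented for illustration.

import random
import statistics

def snapshot_emission(rng, baseline=1.0, episode_rate=0.05, episode_size=20.0):
    """One instantaneous observation of a well-pad's emission rate."""
    emission = baseline
    if rng.random() < episode_rate:          # a brief episodic event is active
        emission += episode_size
    return emission

rng = random.Random(42)                      # fixed seed for reproducibility
observations = [snapshot_emission(rng) for _ in range(10000)]

mean = statistics.mean(observations)
median = statistics.median(observations)
# Episodes are rare but large, so the mean sits well above the median:
print(mean > median)
```

This is exactly why comparing a short-duration measurement to an annual-average inventory is fraught: most snapshots catch only the baseline, while the annual mean is dominated by the episodes.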

  3. Emergence of a dark force in corpuscular gravity

    NASA Astrophysics Data System (ADS)

    Cadoni, M.; Casadio, R.; Giusti, A.; Tuveri, M.

    2018-02-01

We investigate the emergent laws of gravity when dark energy and the de Sitter space-time are modeled as a critical Bose-Einstein condensate of a large number of soft gravitons NG. We argue that this scenario requires the presence of various regimes of gravity in which NG scales in different ways. Moreover, the local gravitational interaction affecting baryonic matter can be naturally described in terms of gravitons pulled out from this dark energy condensate (DEC). We then explain the additional component of the acceleration at galactic scales, commonly attributed to dark matter, as the reaction of the DEC to the presence of baryonic matter. This additional dark force is also associated with gravitons pulled out from the DEC and correctly reproduces the modified Newtonian dynamics (MOND) acceleration. It also allows for an effective description in terms of general relativity sourced by an anisotropic fluid. We finally calculate the mass ratio between the contributions of the apparent dark matter and the baryonic matter in a region of size r at galactic scales and show that it is consistent with the ΛCDM predictions.

  4. Epidemiological considerations for the use of databases in transfusion research: a Scandinavian perspective.

    PubMed

    Edgren, Gustaf; Hjalgrim, Henrik

    2010-11-01

    At current safety levels, with adverse events from transfusions being relatively rare, further progress in risk reductions will require large-scale investigations. Thus, truly prospective studies may prove unfeasible and other alternatives deserve consideration. In this review, we will try to give an overview of recent and historical developments in the use of blood donation and transfusion databases in research. In addition, we will go over important methodological issues. There are at least three nationwide or near-nationwide donation/transfusion databases with the possibility for long-term follow-up of donors and recipients. During the past few years, a large number of reports have been published utilizing such data sources to investigate transfusion-associated risks. In addition, numerous clinics systematically collect and use such data on a smaller scale. Combining systematically recorded donation and transfusion data with long-term health follow-up opens up exciting opportunities for transfusion medicine research. However, the correct analysis of such data requires close attention to methodological issues, especially including the indication for transfusion and reverse causality.

  5. Consistent lattice Boltzmann methods for incompressible axisymmetric flows

    NASA Astrophysics Data System (ADS)

    Zhang, Liangqi; Yang, Shiliang; Zeng, Zhong; Yin, Linmao; Zhao, Ya; Chew, Jia Wei

    2016-08-01

    In this work, consistent lattice Boltzmann (LB) methods for incompressible axisymmetric flows are developed based on two efficient axisymmetric LB models available in the literature. In accord with their respective original models, the proposed axisymmetric models evolve within the framework of the standard LB method and the source terms contain no gradient calculations. Moreover, the incompressibility conditions are realized with the Hermite expansion, thus the compressibility errors arising in the existing models are expected to be reduced by the proposed incompressible models. In addition, an extra relaxation parameter is added to the Bhatnagar-Gross-Krook collision operator to suppress the effect of the ghost variable and thus the numerical stability of the present models is significantly improved. Theoretical analyses, based on the Chapman-Enskog expansion and the equivalent moment system, are performed to derive the macroscopic equations from the LB models and the resulting truncation terms (i.e., the compressibility errors) are investigated. In addition, numerical validations are carried out based on four well-acknowledged benchmark tests and the accuracy and applicability of the proposed incompressible axisymmetric LB models are verified.

  6. Source term identification in atmospheric modelling via sparse optimization

    NASA Astrophysics Data System (ADS)

    Adam, Lukas; Branda, Martin; Hamburger, Thomas

    2015-04-01

Inverse modelling plays an important role in identifying the amount of harmful substances released into the atmosphere during major incidents such as power plant accidents or volcano eruptions. Another possible application of inverse modelling lies in monitoring CO2 emission limits, where only observations at certain places are available and the task is to estimate the total releases at given locations. This gives rise to minimizing the discrepancy between the observations and the model predictions. There are two standard ways of solving such problems. In the first one, the discrepancy is regularized by adding additional terms. Such terms may include a Tikhonov regularization, the distance from a priori information or a smoothing term. The resulting, usually quadratic, problem is then solved via standard optimization solvers. The second approach assumes that the error term has a (normal) distribution and makes use of Bayesian modelling to identify the source term. Instead of following the above-mentioned approaches, we utilize techniques from the field of compressive sensing. Such techniques look for the sparsest solution (the solution with the smallest number of nonzeros) of a linear system, to which a maximal allowed error term may be added. Even though this is a well-developed field with many possible solution techniques, most of them do not consider even the simplest constraints which are naturally present in atmospheric modelling. One such example is the nonnegativity of release amounts. We believe that the concept of a sparse solution is natural in both the problem of identifying the source location and that of identifying the time process of the source release. In the first case, it is usually assumed that there are only a few release points and the task is to find them. In the second case, the time window is usually much longer than the duration of the actual release.
In both cases, the optimal solution should contain a large number of zeros, giving rise to the concept of sparsity. In the paper, we summarize several optimization techniques which are used for finding sparse solutions and propose their modifications to handle selected constraints such as nonnegativity constraints and simple linear constraints, for example on the minimal or maximal amount of total release. These techniques range from successive convex approximations to the solution of one nonconvex problem. On simple examples, we explain these techniques and compare them in terms of implementation simplicity, approximation capability and convergence properties. Finally, these methods are applied to the European Tracer Experiment (ETEX) data and the results are compared with current state-of-the-art techniques such as regularized least squares or the Bayesian approach. The obtained results show the surprisingly good performance of these techniques. This research is supported by the EEA/Norwegian Financial Mechanism under project 7F14287 STRADI.
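A minimal sketch of the kind of sparsity-promoting, nonnegativity-constrained recovery discussed above: a projected iterative-shrinkage (ISTA-style) loop minimizing ||Ax − b||² + λ·Σx subject to x ≥ 0. The 3×4 system is a toy stand-in for a source-receptor model, not ETEX data, and this is only one of the possible solution techniques the abstract surveys:

```python
# Sparse, nonnegative recovery: minimize ||Ax - b||^2 + lam * sum(x) over
# x >= 0 via a projected gradient/shrinkage loop. Toy data, pure stdlib.

def nonneg_sparse_solve(A, b, lam=0.01, step=0.05, iters=5000):
    m, n = len(A), len(A[0])
    x = [0.0] * n
    for _ in range(iters):
        # residual r = A x - b
        r = [sum(A[i][j] * x[j] for j in range(n)) - b[i] for i in range(m)]
        # gradient of the squared-error term: 2 A^T r
        g = [2.0 * sum(A[i][j] * r[i] for i in range(m)) for j in range(n)]
        # gradient step, shrinkage by lam, projection onto x >= 0
        x = [max(0.0, x[j] - step * (g[j] + lam)) for j in range(n)]
    return x

# The true release vector is sparse and nonnegative: only source 1 is active.
A = [[1.0, 0.5, 0.2, 0.1],
     [0.3, 1.0, 0.4, 0.2],
     [0.1, 0.6, 1.0, 0.3]]
x_true = [0.0, 2.0, 0.0, 0.0]
b = [sum(A[i][j] * x_true[j] for j in range(4)) for i in range(3)]

x_hat = nonneg_sparse_solve(A, b)
print([round(v, 2) for v in x_hat])
```

The projection `max(0.0, ...)` is the whole nonnegativity constraint; without it, a plain sparse solver is free to return physically meaningless negative release amounts.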

  7. Efficient Development of High Fidelity Structured Volume Grids for Hypersonic Flow Simulations

    NASA Technical Reports Server (NTRS)

    Alter, Stephen J.

    2003-01-01

A new technique for the control of grid line spacing and intersection angles of a structured volume grid, using elliptic partial differential equations (PDEs), is presented. Existing structured grid generation algorithms make use of source term hybridization to provide control of grid lines, imposing orthogonality implicitly at the boundary and explicitly on the interior of the domain. A bridging function between the two types of grid line control is typically used to blend the different orthogonality formulations. It is shown that utilizing such a bridging function with source term hybridization can result in the excessive use of computational resources and diminishes robustness. A new approach, Anisotropic Lagrange Based Trans-Finite Interpolation (ALBTFI), is offered as a replacement for source term hybridization. The ALBTFI technique captures the essence of the desired grid controls while improving the convergence rate of the elliptic PDEs when compared with source term hybridization. Grid generation on a blunt cone and a Shuttle Orbiter is used to demonstrate and assess the ALBTFI technique, which is shown to be as much as 50% faster and more robust than source term hybridization, and to produce higher-quality grids.
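For context, plain transfinite interpolation (the "TFI" in ALBTFI) blends the four boundary curves of a patch into interior grid points. A minimal sketch of the classical Coons-patch form follows; the anisotropic, Lagrange-based controls of ALBTFI itself are not reproduced here:

```python
# Minimal 2D transfinite interpolation (Coons patch): an interior point is a
# blend of the four boundary curves minus a bilinear corner correction. This
# illustrates only the basic TFI building block, not the paper's ALBTFI.

def tfi_point(u, v, bottom, top, left, right):
    """Blend four boundary curves (each a function of a [0,1] parameter) at (u, v)."""
    b, t = bottom(u), top(u)
    l, r = left(v), right(v)
    # corner values for the bilinear correction term
    b0, b1 = bottom(0.0), bottom(1.0)
    t0, t1 = top(0.0), top(1.0)
    def blend(c):
        return ((1 - v) * b[c] + v * t[c]
                + (1 - u) * l[c] + u * r[c]
                - ((1 - u) * (1 - v) * b0[c] + u * (1 - v) * b1[c]
                   + (1 - u) * v * t0[c] + u * v * t1[c]))
    return blend(0), blend(1)

# Unit-square boundaries: TFI should reproduce the uniform grid exactly.
bottom = lambda u: (u, 0.0)
top    = lambda u: (u, 1.0)
left   = lambda v: (0.0, v)
right  = lambda v: (1.0, v)

x, y = tfi_point(0.25, 0.75, bottom, top, left, right)
print(x, y)
```

In elliptic grid generation such an algebraic TFI grid typically serves as the initial guess that the elliptic PDE smoothing then refines, which is where control formulations like ALBTFI enter.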

  8. Effect of wine addition on microbiological characteristics, volatile molecule profiles and biogenic amine contents in fermented sausages.

    PubMed

    Coloretti, Fabio; Tabanelli, Giulia; Chiavari, Cristiana; Lanciotti, Rosalba; Grazia, Luigi; Gardini, Fausto; Montanari, Chiara

    2014-03-01

The aim was to evaluate the effect of wine addition during the manufacturing of dry fermented sausages, in terms of safety aspects (biogenic amine accumulation), aroma profile and sensory characteristics. Three batches of salami were produced: without wine addition and with 7.5% or 15% (v/w) of white wine. The fermented sausages showed characteristics that can increase product diversification. Some of the sensory features (i.e. increased salty perception) can represent an important strategy given the trend to reduce salt intake for health reasons. The presence of wine immediately reduced the pH and provided a source of ethanol, which can have an inhibitory effect against undesirable microflora. The microbiological results observed for Enterobacteriaceae and enterococci were encouraging. The addition of wine did not negatively affect the ripening time or increase the presence of biogenic amines. The samples containing wine showed reduced concentrations of putrescine. Copyright © 2013 Elsevier Ltd. All rights reserved.

  9. Utilizing bi-spectral index (BIS) for the monitoring of sedated adult ICU patients: a systematic review.

    PubMed

    Bilgili, Beliz; Montoya, Juan C; Layon, A J; Berger, Andrea L; Kirchner, H L; Gupta, Leena K; Gloss, David S

    2017-03-01

The ideal level of sedation in the ICU is an ongoing source of scrutiny. At higher levels of sedation, the current scoring systems are not ideal. BIS may be able to improve both. We evaluated the literature on the effectiveness of BIS monitoring in sedated, mechanically ventilated (MV) ICU patients compared to clinical sedation scores (CSS). For this systematic review, full-text articles were searched in the OVID, MEDLINE, EMBASE, and Cochrane databases from 1986 to 2014. Additional studies were identified by searching bibliographies/abstracts from national/international Critical Care Medicine conferences and the references of retrieved articles. Search terms were: 'Clinical sedation scale, Bi-spectral Index, Mechanical ventilation, Intensive care Unit'. Included were prospective, randomized and non-randomized studies comparing BIS monitoring with any CSS in MV adult (>18 yr old) ICU patients. Studies were graded for quality of evidence based on bias as established by the GRADE guidelines. Additional sources of bias were examined. Five studies met the inclusion criteria. All five were either unclear or high risk for blinding of participants and blinding of outcome assessment. All papers had at least one additional source of bias that was high risk, or unclear/unstated. BIS monitoring in the mechanically ventilated ICU patient may decrease sedative drug dose, recall, and time to wake-up. The studies suggesting this are severely limited methodologically. BIS, when compared to subjective CSSs, is not, at this time, clearly indicated. An appropriately powered randomized, controlled study is needed to determine if this monitoring modality is of use in the ICU.

  10. The electromagnetic radiation from simple sources in the presence of a homogeneous dielectric sphere

    NASA Technical Reports Server (NTRS)

    Mason, V. B.

    1973-01-01

    In this research, the effect of a homogeneous dielectric sphere on the electromagnetic radiation from simple sources is treated as a boundary value problem, and the solution is obtained by the technique of dyadic Green's functions. Exact representations of the electric fields in the various regions due to a source located inside, outside, or on the surface of a dielectric sphere are formulated. Particular attention is given to the effect of sphere size, source location, dielectric constant, and dielectric loss on the radiation patterns and directivity of small spheres (less than 5 wavelengths in diameter) using the Huygens' source excitation. The computed results are found to closely agree with those measured for waveguide-excited plexiglas spheres. Radiation patterns for an extended Huygens' source and for curved electric dipoles located on the sphere's surface are also presented. The resonance phenomenon associated with the dielectric sphere is studied in terms of the modal representation of the radiated fields. It is found that when the sphere is excited at certain frequencies, much of the energy is radiated into the sidelobes. The addition of a moderate amount of dielectric loss, however, quickly attenuates this resonance effect. A computer program which may be used to calculate the directivity and radiation pattern of a Huygens' source located inside or on the surface of a lossy dielectric sphere is listed.

  11. Echolocation versus echo suppression in humans

    PubMed Central

    Wallmeier, Ludwig; Geßele, Nikodemus; Wiegrebe, Lutz

    2013-01-01

    Several studies have shown that blind humans can gather spatial information through echolocation. However, when localizing sound sources, the precedence effect suppresses spatial information of echoes, and thereby conflicts with effective echolocation. This study investigates the interaction of echolocation and echo suppression in terms of discrimination suppression in virtual acoustic space. In the ‘Listening’ experiment, sighted subjects discriminated between positions of a single sound source, the leading or the lagging of two sources, respectively. In the ‘Echolocation’ experiment, the sources were replaced by reflectors. Here, the same subjects evaluated echoes generated in real time from self-produced vocalizations and thereby discriminated between positions of a single reflector, the leading or the lagging of two reflectors, respectively. Two key results were observed. First, sighted subjects can learn to discriminate positions of reflective surfaces echo-acoustically with accuracy comparable to sound source discrimination. Second, in the Listening experiment, the presence of the leading source affected discrimination of lagging sources much more than vice versa. In the Echolocation experiment, however, the presence of both the lead and the lag strongly affected discrimination. These data show that the classically described asymmetry in the perception of leading and lagging sounds is strongly diminished in an echolocation task. Additional control experiments showed that the effect is owing to both the direct sound of the vocalization that precedes the echoes and owing to the fact that the subjects actively vocalize in the echolocation task. PMID:23986105

  12. Development of a Persistent Reactive Treatment Zone for Containment of Sources Located in Lower-Permeability Strata

    NASA Astrophysics Data System (ADS)

    Marble, J.; Carroll, K. C.; Brusseau, M. L.; Plaschke, M.; Brinker, F.

    2013-12-01

    Source zones located in relatively deep, low-permeability formations provide special challenges for remediation. Application of permeable reactive barriers, in-situ thermal, or electrokinetic methods would be expensive and generally impractical. In addition, the use of enhanced mass-removal approaches based on reagent injection (e.g., ISCO, enhanced-solubility reagents) is likely to be ineffective. One possible approach for such conditions is to create a persistent treatment zone for purposes of containment. This study examines the efficacy of this approach for containment and treatment of contaminants in a lower permeability zone using potassium permanganate (KMnO4) as the reactant. A localized 1,1-dichloroethene (DCE) source zone is present in a section of the Tucson International Airport Area (TIAA) Superfund Site. Characterization studies identified the source of DCE to be located in lower-permeability strata adjacent to the water table. Bench-scale studies were conducted using core material collected from boreholes drilled at the site to measure DCE concentrations and determine natural oxidant demand. The reactive zone was created by injecting ~1.7% KMnO4 solution into multiple wells screened within the lower-permeability unit. The site has been monitored for ~8 years to characterize the spatial distribution of DCE and permanganate. KMnO4 continues to persist at the site, demonstrating successful creation of a long-term reactive zone. Additionally, the footprint of the DCE contaminant plume in groundwater has decreased continuously with time. This project illustrates the application of ISCO as a reactive-treatment system for lower-permeability source zones, which appears to effectively mitigate persistent mass flux into groundwater.

  13. Time-frequency approach to underdetermined blind source separation.

    PubMed

    Xie, Shengli; Yang, Liu; Yang, Jun-Mei; Zhou, Guoxu; Xiang, Yong

    2012-02-01

This paper presents a new time-frequency (TF) underdetermined blind source separation approach based on the Wigner-Ville distribution (WVD) and the Khatri-Rao product to separate N non-stationary sources from M (M < N) mixtures. First, an improved method is proposed for estimating the mixing matrix, where the negative values of the auto WVD of the sources are fully considered. Then, after extracting all the auto-term TF points, the auto WVD value of the sources at every auto-term TF point can be found exactly with the proposed approach, no matter how many active sources there are, as long as N ≤ 2M-1. Further discussion about the extraction of auto-term TF points is given, and finally numerical simulation results are presented to show the superiority of the proposed algorithm by comparing it with existing ones.

  14. New Observations of UV Emissions from Europa

    NASA Technical Reports Server (NTRS)

    McGrath, Melissa; Sparks, William

    2009-01-01

The recent top prioritization of the Europa Jupiter System Mission for the next outer solar system flagship mission is refocusing attention on Europa and the other Galilean satellites and their contextual environments in the Jupiter system. Surface sputtering by magnetospheric plasma generates a tenuous atmosphere for Europa, dominated by O2 gas. This tenuous gas is in turn excited by plasma electrons, producing ultraviolet and visible emissions. Two sets of imaging observations have been published to date: UV images from the Hubble Space Telescope, and visible eclipse images from Cassini. Three additional sets of HST UV observations were acquired in February 2007, April 2007 and June 2009. The signal-to-noise ratio in these data is not high; however, given the paucity of data and its increasing importance in terms of planning for EJSM, we have attempted to extract as much new information as possible from these data. This talk will summarize our analysis to date and discuss the results in terms of existing models, which attempt to explain the image morphology either in terms of the underlying source production and loss processes or in terms of the plasma interaction with the exosphere.

  15. 26 CFR 31.3401(a)(14)-1 - Group-term life insurance.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 26 Internal Revenue 15 2010-04-01 2010-04-01 false Group-term life insurance. 31.3401(a)(14)-1... SOURCE Collection of Income Tax at Source § 31.3401(a)(14)-1 Group-term life insurance. (a) The cost of group-term life insurance on the life of an employee is excepted from wages, and hence is not subject to...

  16. 26 CFR 31.3401(a)(14)-1 - Group-term life insurance.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 26 Internal Revenue 15 2011-04-01 2011-04-01 false Group-term life insurance. 31.3401(a)(14)-1... SOURCE Collection of Income Tax at Source § 31.3401(a)(14)-1 Group-term life insurance. (a) The cost of group-term life insurance on the life of an employee is excepted from wages, and hence is not subject to...

  17. 26 CFR 31.3401(a)(14)-1 - Group-term life insurance.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 26 Internal Revenue 15 2012-04-01 2012-04-01 false Group-term life insurance. 31.3401(a)(14)-1... SOURCE Collection of Income Tax at Source § 31.3401(a)(14)-1 Group-term life insurance. (a) The cost of group-term life insurance on the life of an employee is excepted from wages, and hence is not subject to...

  18. 26 CFR 31.3401(a)(14)-1 - Group-term life insurance.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 26 Internal Revenue 15 2014-04-01 2014-04-01 false Group-term life insurance. 31.3401(a)(14)-1... SOURCE Collection of Income Tax at Source § 31.3401(a)(14)-1 Group-term life insurance. (a) The cost of group-term life insurance on the life of an employee is excepted from wages, and hence is not subject to...

  19. 26 CFR 31.3401(a)(14)-1 - Group-term life insurance.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 26 Internal Revenue 15 2013-04-01 2013-04-01 false Group-term life insurance. 31.3401(a)(14)-1... SOURCE Collection of Income Tax at Source § 31.3401(a)(14)-1 Group-term life insurance. (a) The cost of group-term life insurance on the life of an employee is excepted from wages, and hence is not subject to...

  20. pyJac: Analytical Jacobian generator for chemical kinetics

    NASA Astrophysics Data System (ADS)

    Niemeyer, Kyle E.; Curtis, Nicholas J.; Sung, Chih-Jen

    2017-06-01

    Accurate simulations of combustion phenomena require the use of detailed chemical kinetics in order to capture limit phenomena such as ignition and extinction as well as predict pollutant formation. However, the chemical kinetic models for hydrocarbon fuels of practical interest typically have large numbers of species and reactions and exhibit high levels of mathematical stiffness in the governing differential equations, particularly for larger fuel molecules. To integrate the stiff equations governing chemical kinetics, reactive-flow simulations generally rely on implicit algorithms that require frequent Jacobian matrix evaluations. Some in situ and a posteriori computational diagnostics methods also require accurate Jacobian matrices, including computational singular perturbation and chemical explosive mode analysis. Typically, these matrices are approximated numerically with finite differences, but for larger chemical kinetic models this poses significant computational demands, since the number of chemical source term evaluations scales with the square of the species count. Furthermore, existing analytical Jacobian tools do not optimize evaluations or support emerging SIMD processors such as GPUs. Here we introduce pyJac, a Python-based open-source program that generates analytical Jacobian matrices for use in chemical kinetics modeling and analysis. In addition to producing the necessary customized source code for evaluating reaction rates (including all modern reaction rate formulations), the chemical source terms, and the Jacobian matrix, pyJac uses an optimized evaluation order to minimize computational and memory operations. As a demonstration, we first establish the correctness of the Jacobian matrices for kinetic models of hydrogen, methane, ethylene, and isopentanol oxidation (with species counts ranging from 13 to 360) by showing agreement within 0.001% of matrices obtained via automatic differentiation.
We then demonstrate the performance achievable on CPUs and GPUs using pyJac via matrix evaluation timing comparisons; the routines produced by pyJac outperformed first-order finite differences by 3-7.5 times and the existing analytical Jacobian software TChem by 1.1-2.2 times on a single-threaded basis. It is noted that TChem is not thread-safe, while pyJac is easily parallelized, and hence can greatly outperform TChem on multicore CPUs. The Jacobian matrix generator we describe here will be useful for reducing the cost of integrating chemical source terms with implicit algorithms in particular and algorithms that require an accurate Jacobian matrix in general. Furthermore, the open-source release of the program and Python-based implementation will enable wide adoption.
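
To make the scaling argument concrete, here is a hedged sketch of the first-order finite-difference Jacobian that analytical generators such as pyJac replace; `toy_source` is a made-up 3-species mechanism invented for illustration, not taken from pyJac:

```python
import numpy as np

def toy_source(y):
    """Hypothetical 3-species chemical source term dy/dt = f(y).
    A stand-in: practical mechanisms have hundreds of species."""
    k1, k2 = 1.0, 0.5
    return np.array([
        -k1 * y[0] * y[1],
        -k1 * y[0] * y[1] + k2 * y[2],
         k1 * y[0] * y[1] - k2 * y[2],
    ])

def fd_jacobian(f, y, eps=1e-7):
    """First-order finite-difference Jacobian: N+1 source-term calls,
    each costing roughly O(N^2) for dense kinetics -- the expense an
    analytical Jacobian avoids."""
    n = len(y)
    f0 = f(y)
    J = np.zeros((n, n))
    for j in range(n):
        yp = y.copy()
        yp[j] += eps            # perturb one species at a time
        J[:, j] = (f(yp) - f0) / eps
    return J
```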

  1. Integrating multiple remote sensing and surface measurements with models, to quantify and constrain the past decade's total 4D aerosol source profile and impacts

    NASA Astrophysics Data System (ADS)

    Cohen, J. B.; Lan, R.; Lin, C.; Ng, D. H. L.; Lim, A.

    2017-12-01

    A multi-instrument, inverse modeling approach is employed to identify and quantify large-scale global biomass-burning and urban aerosol emissions profiles. The approach uses MISR, MODIS, OMI, and MOPITT data from 2006 to 2016 to generate spatial and temporal loads, as well as some information about composition. The method is able to identify regions impacted by stable urban sources, changing urban sources, intense fires, and linear combinations of these. Subsequent quantification yields a unified field and a less biased profile, without requiring arbitrary scaling to match long-term means, and the result reasonably reproduces inter- and intra-annual variation. Both meso-scale (WRF-CHEM) and global (MIT-AERO, a multi-mode, multi-mixing-state aerosol model) models of aerosol transport, chemistry, and physics are used to generate the resulting 4D aerosol fields. Comparisons with CALIOP, AERONET, and surface chemical and aerosol networks provide unbiased confirmation, while column and vertical loadings provide additional feedback. There are three significant results. First, there is a reduction in sources over existing urban areas in East Asia. Second, there is an increase in sources over new urban areas in South, South East, and East Asia. Third, there is an increase in fire sources in South and South East Asia. There are other initial findings relevant to the global tropics, which have not been as deeply investigated. The results improve the model match with both the mean and the variation, which is essential if we hope to understand seasonal extremes. The results also quantify the impacts of both local and long-range sources. This is of extreme urgency, particularly in developing nations, where there are considerable contributions from long-range or otherwise unknown sources that impact hundreds of millions of people throughout Asia. 
It is hoped that the approach provided here can help inform critical decisions about total sources, as well as point out the many outstanding scientific and analytical issues that remain to be addressed.

  2. SYNTHESIS OF NOVEL ALL-DIELECTRIC GRATING FILTERS USING GENETIC ALGORITHMS

    NASA Technical Reports Server (NTRS)

    Zuffada, Cinzia; Cwik, Tom; Ditchman, Christopher

    1997-01-01

    We are concerned with the design of inhomogeneous, all-dielectric (lossless) periodic structures which act as filters. Dielectric filters made as stacks of inhomogeneous gratings and layers of materials are used in optical technology but are not common at microwave frequencies. The problem is then finding the periodic cell's geometric configuration and permittivity values which correspond to a specified reflectivity/transmittivity response as a function of frequency/illumination angle. This type of design can be thought of as an inverse-source problem, since it entails finding a distribution of sources which produce fields (or quantities derived from them) of given characteristics. Electromagnetic sources (electric and magnetic current densities) in a volume are related to the outside fields by a well-known linear integral equation. Additionally, the sources are related to the fields inside the volume by a constitutive equation involving the material properties. The relationship linking the fields outside the source region to those inside is therefore non-linear in terms of material properties such as permittivity, permeability, and conductivity. The solution of the non-linear inverse problem is cast here as a combination of two linear steps, by explicitly introducing the electromagnetic sources in the computational volume as a set of unknowns in addition to the material unknowns. This makes it possible to solve for material parameters and related electric fields in the source volume which are consistent with Maxwell's equations. Solutions are obtained iteratively by decoupling the two steps. First, we invert for the permittivity only in the minimization of a cost function; second, given the materials, we find the corresponding electric fields through direct solution of the integral equation in the source volume. The sources thus computed are used to generate the far fields and the synthesized filter response. 
The cost function is obtained by calculating the deviation between the synthesized value of reflectivity/transmittivity and the desired one. Solution geometries for the periodic cell are sought as gratings (ensembles of columns of different heights and widths), or combinations of homogeneous layers of different dielectric materials and gratings. Hence the explicit unknowns of the inversion step are the material permittivities and the relative boundaries separating homogeneous parcels of the periodic cell.
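
The genetic-algorithm machinery implied by the title can be sketched as follows. Everything here is an assumed illustration: a trivial surrogate stands in for the electromagnetic solver, and the population sizes, bounds, and operators are invented, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def cost(eps_profile, target):
    """Toy stand-in for the reflectivity deviation; a real design
    would evaluate the grating's EM response here."""
    response = np.cumsum(eps_profile) / len(eps_profile)  # surrogate
    return np.sum((response - target) ** 2)

def genetic_minimize(target, n_genes=8, pop_size=40, generations=200,
                     mutation=0.1, bounds=(1.0, 4.0)):
    """Elitist real-coded GA over a vector of permittivity genes."""
    lo, hi = bounds
    pop = rng.uniform(lo, hi, (pop_size, n_genes))
    for _ in range(generations):
        fitness = np.array([cost(ind, target) for ind in pop])
        order = np.argsort(fitness)
        parents = pop[order[: pop_size // 2]]      # truncation selection
        # uniform crossover between random parent pairs
        i = rng.integers(0, len(parents), (pop_size, 2))
        mask = rng.random((pop_size, n_genes)) < 0.5
        children = np.where(mask, parents[i[:, 0]], parents[i[:, 1]])
        # Gaussian mutation, clipped to the permittivity bounds
        children += mutation * rng.standard_normal(children.shape)
        pop = np.clip(children, lo, hi)
        pop[0] = parents[0]                        # elitism
    fitness = np.array([cost(ind, target) for ind in pop])
    return pop[np.argmin(fitness)], fitness.min()
```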

  3. Integrating nursing diagnostic concepts into the medical entities dictionary using the ISO Reference Terminology Model for Nursing Diagnosis.

    PubMed

    Hwang, Jee-In; Cimino, James J; Bakken, Suzanne

    2003-01-01

    The purposes of the study were (1) to evaluate the usefulness of the International Standards Organization (ISO) Reference Terminology Model for Nursing Diagnoses as a terminology model for defining nursing diagnostic concepts in the Medical Entities Dictionary (MED) and (2) to create the additional hierarchical structures required for integration of nursing diagnostic concepts into the MED. The authors dissected nursing diagnostic terms from two source terminologies (Home Health Care Classification and the Omaha System) into the semantic categories of the ISO model. Consistent with the ISO model, they selected Focus and Judgment as required semantic categories for creating intensional definitions of nursing diagnostic concepts in the MED. Because the MED does not include Focus and Judgment hierarchies, the authors developed them to define the nursing diagnostic concepts. The ISO model was sufficient for dissecting the source terminologies into atomic terms. The authors identified 162 unique focus concepts from the 266 nursing diagnosis terms for inclusion in the Focus hierarchy. For the Judgment hierarchy, the authors precoordinated Judgment and Potentiality instead of using Potentiality as a qualifier of Judgment as in the ISO model. Impairment and Alteration were the most frequently occurring judgments. Nursing care represents a large proportion of health care activities; thus, it is vital that terms used by nurses are integrated into concept-oriented terminologies that provide broad coverage for the domain of health care. This study supports the utility of the ISO Reference Terminology Model for Nursing Diagnoses as a facilitator for the integration process.
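
The dissection step can be illustrated with a toy routine. The terms and the tiny judgment list below are hypothetical stand-ins (the study's actual atoms come from the HHCC and Omaha System source terminologies), but they show the precoordination of Judgment with Potentiality described above:

```python
# Hypothetical judgments, with Potentiality precoordinated into the
# Judgment ("Risk for Impaired") as the study did, rather than kept
# as a separate qualifier as in the ISO model.
JUDGMENTS = {"Impaired", "Altered", "Risk for Impaired"}

def dissect(term):
    """Split a diagnosis term into (judgment, focus) atoms.
    Longest judgment is matched first so precoordinated forms win."""
    for judgment in sorted(JUDGMENTS, key=len, reverse=True):
        if term.startswith(judgment + " "):
            return judgment, term[len(judgment) + 1:]
    return None, term  # no recognizable judgment prefix
```

For example, `dissect("Impaired Skin Integrity")` yields the Judgment `"Impaired"` and the Focus `"Skin Integrity"`.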

  4. Integrating Nursing Diagnostic Concepts into the Medical Entities Dictionary Using the ISO Reference Terminology Model for Nursing Diagnosis

    PubMed Central

    Hwang, Jee-In; Cimino, James J.; Bakken, Suzanne

    2003-01-01

    Objective: The purposes of the study were (1) to evaluate the usefulness of the International Standards Organization (ISO) Reference Terminology Model for Nursing Diagnoses as a terminology model for defining nursing diagnostic concepts in the Medical Entities Dictionary (MED) and (2) to create the additional hierarchical structures required for integration of nursing diagnostic concepts into the MED. Design and Measurements: The authors dissected nursing diagnostic terms from two source terminologies (Home Health Care Classification and the Omaha System) into the semantic categories of the ISO model. Consistent with the ISO model, they selected Focus and Judgment as required semantic categories for creating intensional definitions of nursing diagnostic concepts in the MED. Because the MED does not include Focus and Judgment hierarchies, the authors developed them to define the nursing diagnostic concepts. Results: The ISO model was sufficient for dissecting the source terminologies into atomic terms. The authors identified 162 unique focus concepts from the 266 nursing diagnosis terms for inclusion in the Focus hierarchy. For the Judgment hierarchy, the authors precoordinated Judgment and Potentiality instead of using Potentiality as a qualifier of Judgment as in the ISO model. Impairment and Alteration were the most frequently occurring judgments. Conclusions: Nursing care represents a large proportion of health care activities; thus, it is vital that terms used by nurses are integrated into concept-oriented terminologies that provide broad coverage for the domain of health care. This study supports the utility of the ISO Reference Terminology Model for Nursing Diagnoses as a facilitator for the integration process. PMID:12668692

  5. 12 CFR 201.4 - Availability and terms of credit.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... overnight, as a backup source of funding to a depository institution that is in generally sound financial... to a few weeks as a backup source of funding to a depository institution if, in the judgment of the... very short-term basis, usually overnight, as a backup source of funding to a depository institution...

  6. AMOEBA 2.0: A physics-first approach to biomolecular simulations

    NASA Astrophysics Data System (ADS)

    Rackers, Joshua; Ponder, Jay

    The goal of the AMOEBA force field project is to use classical physics to understand and predict the nature of interactions between biological molecules. While significant advances have been made over the past decade, the ultimate goal of predicting binding energies with "chemical accuracy" remains elusive. The primary source of this inaccuracy is the physics of how molecules interact at short range. For example, despite AMOEBA's advanced treatment of electrostatics, the force field dramatically overpredicts the electrostatic energy of DNA stacking interactions. AMOEBA 2.0 works to correct these errors by including simple, first-principles physics-based terms to account for the quantum mechanical nature of these short-range molecular interactions. We have added a charge penetration term that considerably improves the description of electrostatic interactions at short range. We are reformulating the polarization term of AMOEBA from basic physical principles. And we are reevaluating the van der Waals term to match ab initio energy decompositions. These additions and changes promise to make AMOEBA more predictive. By including more physical detail of the important short-range interactions of biological molecules, we hope to move closer to the ultimate goal of true predictive power.

  7. Numerical Analysis of 2-D and 3-D MHD Flows Relevant to Fusion Applications

    DOE PAGES

    Khodak, Andrei

    2017-08-21

    Here, the analysis of many fusion applications such as liquid-metal blankets requires application of computational fluid dynamics (CFD) methods for electrically conductive liquids in geometrically complex regions and in the presence of a strong magnetic field. A current state-of-the-art general purpose CFD code allows modeling of the flow in complex geometric regions, with simultaneous conjugate heat transfer analysis in liquid and surrounding solid parts. Together with a magnetohydrodynamics (MHD) capability, the general purpose CFD code will be a valuable tool for the design and optimization of fusion devices. This paper describes the introduction of MHD capability into the general purpose CFD code CFX, part of the ANSYS Workbench. The code was adapted for MHD problems using a magnetic induction approach. CFX allows introduction of user-defined variables using transport or Poisson equations. For the MHD adaptation of the code, three additional transport equations were introduced for the components of the magnetic field, in addition to the Poisson equation for electric potential. The Lorentz force is included in the momentum transport equation as a source term. Fusion applications usually involve very strong magnetic fields, with values of the Hartmann number of up to tens of thousands. In this situation the system of MHD equations becomes very rigid, with very large source terms and very strong variable gradients. To increase robustness, special measures were introduced during the iterative convergence process, such as linearization using source coefficients for the momentum equations. The MHD implementation in the general purpose CFD code was tested against benchmarks specifically selected for liquid-metal blanket applications. Results of numerical simulations using the present implementation closely match analytical solutions for Hartmann numbers of up to 1500 for 2-D laminar flow in a duct of square cross section, with conducting and nonconducting walls. 
Results for a 3-D test case are also included.
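
The Lorentz-force source term mentioned above can be sketched generically: with the induction approach one carries B, forms the current density J = (∇×B)/μ0, and adds f = J×B to the momentum equation. The finite-difference evaluation below is an assumed illustration on a uniform grid, not the CFX implementation:

```python
import numpy as np

MU0 = 4e-7 * np.pi  # vacuum permeability [H/m]

def lorentz_force(B, dx):
    """Momentum-equation source term f = J x B with J = curl(B)/mu0,
    evaluated by central differences on a uniform 3-D grid.
    B has shape (3, nx, ny, nz); returns f with the same shape."""
    def curl(F):
        dFz_dy = np.gradient(F[2], dx, axis=1)
        dFy_dz = np.gradient(F[1], dx, axis=2)
        dFx_dz = np.gradient(F[0], dx, axis=2)
        dFz_dx = np.gradient(F[2], dx, axis=0)
        dFy_dx = np.gradient(F[1], dx, axis=0)
        dFx_dy = np.gradient(F[0], dx, axis=1)
        return np.stack([dFz_dy - dFy_dz, dFx_dz - dFz_dx, dFy_dx - dFx_dy])
    J = curl(B) / MU0
    # cross product over the leading (component) axis
    return np.cross(J, B, axisa=0, axisb=0).transpose(3, 0, 1, 2)
```

A uniform field has zero curl, hence zero force, which makes a cheap sanity check.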

  8. Numerical Analysis of 2-D and 3-D MHD Flows Relevant to Fusion Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khodak, Andrei

    Here, the analysis of many fusion applications such as liquid-metal blankets requires application of computational fluid dynamics (CFD) methods for electrically conductive liquids in geometrically complex regions and in the presence of a strong magnetic field. A current state-of-the-art general purpose CFD code allows modeling of the flow in complex geometric regions, with simultaneous conjugate heat transfer analysis in liquid and surrounding solid parts. Together with a magnetohydrodynamics (MHD) capability, the general purpose CFD code will be a valuable tool for the design and optimization of fusion devices. This paper describes the introduction of MHD capability into the general purpose CFD code CFX, part of the ANSYS Workbench. The code was adapted for MHD problems using a magnetic induction approach. CFX allows introduction of user-defined variables using transport or Poisson equations. For the MHD adaptation of the code, three additional transport equations were introduced for the components of the magnetic field, in addition to the Poisson equation for electric potential. The Lorentz force is included in the momentum transport equation as a source term. Fusion applications usually involve very strong magnetic fields, with values of the Hartmann number of up to tens of thousands. In this situation the system of MHD equations becomes very rigid, with very large source terms and very strong variable gradients. To increase robustness, special measures were introduced during the iterative convergence process, such as linearization using source coefficients for the momentum equations. The MHD implementation in the general purpose CFD code was tested against benchmarks specifically selected for liquid-metal blanket applications. Results of numerical simulations using the present implementation closely match analytical solutions for Hartmann numbers of up to 1500 for 2-D laminar flow in a duct of square cross section, with conducting and nonconducting walls. 
Results for a 3-D test case are also included.

  9. Accident Source Terms for Pressurized Water Reactors with High-Burnup Cores Calculated using MELCOR 1.8.5.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gauntt, Randall O.; Goldmann, Andrew; Kalinich, Donald A.

    2016-12-01

    In this study, risk-significant pressurized-water reactor severe accident sequences are examined using MELCOR 1.8.5 to explore the range of fission product releases to the reactor containment building. Advances in the understanding of fission product release and transport behavior and severe accident progression are used to render best-estimate analyses of selected accident sequences. Particular emphasis is placed on estimating the effects of high fuel burnup, in contrast with low burnup, on fission product releases to the containment. Supporting this emphasis, recent data on fission product release from high-burnup (HBU) fuel from the French VERCORS project are used in this study. The results of these analyses are treated as samples from a population of accident sequences in order to employ an approximate order-statistics characterization of the results. These trends and tendencies are then compared to the NUREG-1465 alternative source term prescription used today for regulatory applications. In general, greater differences are observed between the state-of-the-art calculations for either HBU or low-burnup (LBU) fuel and the NUREG-1465 containment release fractions than exist between HBU and LBU release fractions. Current analyses suggest that retention of fission products within the vessel and the reactor coolant system (RCS) is greater than contemplated in the NUREG-1465 prescription, and that, overall, release fractions to the containment are therefore lower across the board in the present analyses than suggested in NUREG-1465. The decreased volatility of Cs2MoO4 compared to CsI or CsOH increases the predicted RCS retention of cesium, and as a result, cesium and iodine do not follow identical behaviors with respect to distribution among vessel, RCS, and containment. 
With respect to the regulatory alternative source term, greater differences are observed between the NUREG-1465 prescription and both HBU and LBU predictions than exist between HBU and LBU analyses. Additionally, current analyses suggest that the NUREG-1465 release fractions are conservative by about a factor of 2 and that the durations of the in-vessel and late in-vessel release periods are in fact longer than the NUREG-1465 durations. A subsequent report is planned that will further characterize these results using more refined statistical methods, permitting a more precise reformulation of the NUREG-1465 alternative source term for both LBU and HBU fuels; the most important finding is that the NUREG-1465 formula appears to embody significant conservatism compared to current best-estimate analyses. Acknowledgements: This work was supported by the United States Nuclear Regulatory Commission, Office of Nuclear Regulatory Research. The authors would like to thank Dr. Ian Gauld and Dr. Germina Ilas of Oak Ridge National Laboratory for their contributions to this work. In addition to developing core fission product inventory and decay heat information for use in MELCOR models, their insights into fuel management practices and the resulting effects on the spatial distribution of fission products in the core were instrumental in the completion of this work.

  10. An Improved Neutron Transport Algorithm for Space Radiation

    NASA Technical Reports Server (NTRS)

    Heinbockel, John H.; Clowdsley, Martha S.; Wilson, John W.

    2000-01-01

    A low-energy neutron transport algorithm for use in space radiation protection is developed. The algorithm is based upon a multigroup analysis of the straight-ahead Boltzmann equation by using a mean value theorem for integrals. This analysis is accomplished by solving a realistic but simplified neutron transport test problem. The test problem is analyzed by using numerical and analytical procedures to obtain an accurate solution within specified error bounds. Results from the test problem are then used for determining mean values associated with rescattering terms that are associated with a multigroup solution of the straight-ahead Boltzmann equation. The algorithm is then coupled to the Langley HZETRN code through the evaporation source term. Evaluation of the neutron fluence generated by the solar particle event of February 23, 1956, for a water and an aluminum-water shield-target configuration is then compared with LAHET and MCNPX Monte Carlo code calculations for the same shield-target configuration. The algorithm developed showed a great improvement in results over the unmodified HZETRN solution. In addition, a two-directional solution of the evaporation source showed even further improvement of the fluence near the front of the water target where diffusion from the front surface is important.
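
The multigroup straight-ahead structure referred to above can be sketched on a toy slab problem. The cross sections, downscatter-only assumption, and forward-Euler march below are invented for illustration; this is not the HZETRN algorithm or its mean-value-theorem treatment of the rescattering terms:

```python
import numpy as np

def straight_ahead_multigroup(phi0, sigma_t, sigma_s, dx, nsteps):
    """March a straight-ahead multigroup flux through a slab:
        d(phi_g)/dx = -sigma_t[g] * phi_g + sum_{g'} sigma_s[g', g] * phi_g'
    where sigma_s[g', g] scatters group g' into group g (downscatter only
    if sigma_s is strictly upper triangular). Forward Euler in x."""
    phi = np.array(phi0, dtype=float)
    for _ in range(nsteps):
        scatter = sigma_s.T @ phi          # rescattering source per group
        phi = phi + dx * (-sigma_t * phi + scatter)
    return phi
```

With pure downscatter and no absorption the total straight-ahead flux is conserved, which gives a simple consistency check.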

  11. Influence of ammonium sulphate feeding time on fed-batch Arthrospira (Spirulina) platensis cultivation and biomass composition with and without pH control.

    PubMed

    Rodrigues, Mayla Santos; Ferreira, Lívia Seno; Converti, Attilio; Sato, Sunao; de Carvalho, João Carlos Monteiro

    2011-06-01

    Previous work demonstrated that a mixture of NH4Cl and KNO3 as nitrogen source was beneficial to fed-batch Arthrospira (Spirulina) platensis cultivation, in terms of either lower costs or higher cell concentration. On the basis of those results, this study focused on the use of a cheaper nitrogen source mixture, namely (NH4)2SO4 plus NaNO3, varying the ammonium feeding time (T = 7-15 days), either controlling the pH by CO2 addition or not. A. platensis was cultivated in mini-tanks at 30°C, 156 μmol photons m⁻² s⁻¹, and a starting cell concentration of 400 mg L⁻¹, on a modified Schlösser medium. T = 13 days under pH control was selected as the optimum condition, ensuring the best results in terms of biomass production (maximum cell concentration of 2911 mg L⁻¹, cell productivity of 179 mg L⁻¹ d⁻¹, and specific growth rate of 0.77 d⁻¹) and satisfactory protein and lipid contents (around 30% each).

  12. pyBadlands: A framework to simulate sediment transport, landscape dynamics and basin stratigraphic evolution through space and time

    PubMed Central

    2018-01-01

    Understanding Earth surface responses, in terms of sediment dynamics, to climatic variability and tectonic forcing is hindered by the limited ability of current models to simulate the long-term evolution of sediment transfer and associated morphological changes. This paper presents pyBadlands, an open-source Python-based framework which computes, over geological time, (1) sediment transport from landmasses to coasts, (2) reworking of marine sediments by longshore currents, and (3) development of coral reef systems. pyBadlands is cross-platform, distributed under the GPLv3 license, and available on GitHub (http://github.com/badlands-model). Here, we describe the underlying physical assumptions behind the simulated processes and the main options already available in the numerical framework. Along with the source code, a list of hands-on examples is provided that illustrates the model capabilities. In addition, pre- and post-processing classes have been built and are accessible as a companion toolbox which comprises a series of workflows to efficiently build, quantify and explore simulation input and output files. While the framework has been primarily designed for research, its simplicity of use and portability make it a great tool for teaching purposes. PMID:29649301
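
Landscape-evolution frameworks of this kind commonly solve detachment-limited stream-power incision laws of the form E = K·Aᵐ·Sⁿ. The 1-D sketch below is an assumed illustration of that class of law with made-up coefficients; pyBadlands itself operates on 2-D unstructured meshes with many more processes:

```python
import numpy as np

def stream_power_erosion(z, area, dx, k=1e-5, m=0.5, n=1.0, dt=1000.0):
    """One explicit step of detachment-limited stream-power incision,
        E = k * A^m * S^n,
    on a 1-D elevation profile z draining to the left (node 0 is the
    fixed base level). area is drainage area per node; dx the spacing."""
    slope = np.maximum(np.diff(z) / dx, 0.0)   # downstream slope per node
    erosion = k * area[1:] ** m * slope ** n   # incision rate [m/yr]
    z_new = z.copy()
    z_new[1:] -= dt * erosion                  # lower every node but base
    return z_new
```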

  13. A Second Law Based Unstructured Finite Volume Procedure for Generalized Flow Simulation

    NASA Technical Reports Server (NTRS)

    Majumdar, Alok

    1998-01-01

    An unstructured finite volume procedure has been developed for steady and transient thermo-fluid dynamic analysis of fluid systems and components. The procedure is applicable for a flow network consisting of pipes and various fittings where flow is assumed to be one dimensional. It can also be used to simulate flow in a component by modeling a multi-dimensional flow using the same numerical scheme. The flow domain is discretized into a number of interconnected control volumes located arbitrarily in space. The conservation equations for each control volume account for the transport of mass, momentum and entropy from the neighboring control volumes. In addition, they also include the sources of each conserved variable and time dependent terms. The source term of entropy equation contains entropy generation due to heat transfer and fluid friction. Thermodynamic properties are computed from the equation of state of a real fluid. The system of equations is solved by a hybrid numerical method which is a combination of simultaneous Newton-Raphson and successive substitution schemes. The paper also describes the application and verification of the procedure by comparing its predictions with the analytical and numerical solution of several benchmark problems.
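
The hybrid Newton-Raphson/successive-substitution idea described above can be sketched on a scalar toy problem. Everything below is an assumed illustration (the actual procedure iterates full networks of mass, momentum, and entropy equations): the "hard" equation is solved for p by Newton steps, while a secondary variable x (e.g. a fluid property) is updated by substitution between passes:

```python
def hybrid_solve(f, dfdp, g, p0, x0, tol=1e-10, max_iter=100):
    """Solve f(p, x) = 0 together with x = g(p) by alternating a
    Newton-Raphson step for p (at frozen x) with a successive
    substitution update of x."""
    p, x = p0, x0
    for _ in range(max_iter):
        p = p - f(p, x) / dfdp(p, x)   # Newton-Raphson update for p
        x_new = g(p)                   # successive substitution for x
        if abs(x_new - x) < tol and abs(f(p, x_new)) < tol:
            return p, x_new
        x = x_new
    return p, x
```

Here the decoupling mirrors the paper's scheme in miniature: each sub-problem stays simple, and convergence is declared only when both residuals are small.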

  14. Energy harvesting concepts for small electric unmanned systems

    NASA Astrophysics Data System (ADS)

    Qidwai, Muhammad A.; Thomas, James P.; Kellogg, James C.; Baucom, Jared N.

    2004-07-01

    In this study, we identify and survey energy harvesting technologies for small electrically powered unmanned systems designed for long-term (>1 day) time-on-station missions. An environmental energy harvesting scheme will provide long-term energy additions to the on-board energy source. We have identified four technologies that cover a broad array of available energy sources: solar, kinetic (wind) flow, autophagous structure-power (both combustible and metal-air battery systems), and electromagnetic (EM) energy scavenging. We present existing conceptual designs, critical system components, performance, constraints, and state of readiness for each technology. We have concluded that the solar and autophagous technologies are relatively mature for small-scale applications and are capable of moderate power output levels (>1 W). We have identified key components and possible multifunctionalities in each technology. The kinetic flow and EM energy scavenging technologies will require more in-depth study before they can be considered for implementation. We have also found that all of the harvesting systems require design and integration of various electrical, mechanical, and chemical components, which will require modeling and optimization using hybrid mechatronics-circuit simulation tools. This study provides a starting point for detailed investigation into the proposed technologies for unmanned system applications under current development.

  15. Main field and recent secular variation.

    USGS Publications Warehouse

    Alldredge, L.R.

    1983-01-01

    As Cain (1979) indicated might happen in the last IUGG quadrennial report, added resources were made available during the past few years and a real impulse was added to the geomagnetic work in the US by the launching of the MAGSAT Satellite. This new effort paid off in terms of new charts, additional long wavelength studies, and external source studies. As before, however, the future funding for new starts in geomagnetism does not look bright at the present time. A single MAGSAT in orbit a little more than seven months did wonders for main field (M.F.) charting, but did little or nothing for secular variation (S.V.) charting. It would take a number of repeated MAGSATS to help the S.V. picture. Meanwhile, the world magnetic observatory net and surface repeat stations remain as the main source of S.V. data. -from Author

  16. Measurement of volatile organic chemicals at selected sites in California

    NASA Technical Reports Server (NTRS)

    Singh, Hanwant B.; Salas, L.; Viezee, W.; Sitton, B.; Ferek, R.

    1992-01-01

    Urban air concentrations of 24 selected volatile organic chemicals that may be potentially hazardous to human health and environment were measured during field experiments conducted at two California locations, at Houston, and at Denver. Chemicals measured included chlorofluorocarbons, halomethanes, haloethanes, halopropanes, chloroethylenes, and aromatic hydrocarbons. With emphasis on California sites, data from these studies are analyzed and interpreted with respect to variabilities in ambient air concentrations, diurnal changes, relation to prevailing meteorology, sources and trends. Except in a few instances, mean concentrations are typically between 0 and 5 ppb. Significant variabilities in atmospheric concentrations associated with intense sources and adverse meteorological conditions are shown to exist. In addition to short-term variability, there is evidence of systematic diurnal and seasonal trends. In some instances it is possible to detect declining trends resulting from the effectiveness of control strategies.

  17. Material from the Internal Surface of Squid Axon Exhibits Excess Noise

    PubMed Central

    Fishman, Harvey M.

    1981-01-01

    A fluid material from a squid (Loligo pealei) axon was isolated by mechanical application of two types of microcapillary (1-3 μm diameter) to the internal surface of intact and cut-axon preparations. Current noise in the isolated material exceeded thermal levels, and power spectra were 1/f in form in the frequency range 1.25-500 Hz, with voltage-dependent intensities that were unrelated to specific ion channels. Whether conduction in this material is a significant source of excess noise during axon conduction remains to be determined. Nevertheless, a source of excess noise external to or within an ion channel may not be properly represented solely as an additive term to the spectrum of ion channel noise; a deconvolution of these spectral components may be required for modeling purposes. PMID:6266542
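
    The 1/f character of such a spectrum can be checked numerically. The sketch below is illustrative only (synthetic noise, not the paper's analysis pipeline): it synthesizes 1/f noise by spectral shaping, estimates the power spectrum with a periodogram, and fits the log-log slope over the 1.25-500 Hz band quoted above.

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 1000.0                      # sampling rate, Hz (assumed)
n = 2 ** 16

# Synthesize 1/f ("pink") noise by shaping white noise in the frequency domain.
white = rng.standard_normal(n)
spec = np.fft.rfft(white)
freqs = np.fft.rfftfreq(n, d=1.0 / fs)
shape = np.ones_like(freqs)
shape[1:] = 1.0 / np.sqrt(freqs[1:])   # amplitude ~ f^-1/2  =>  power ~ 1/f
pink = np.fft.irfft(spec * shape, n)

# Periodogram estimate of the power spectral density.
psd = np.abs(np.fft.rfft(pink)) ** 2 / (n * fs)

# Fit the spectral slope over the band reported in the abstract (1.25-500 Hz).
band = (freqs >= 1.25) & (freqs <= 500.0)
slope, _ = np.polyfit(np.log10(freqs[band]), np.log10(psd[band]), 1)
print(f"fitted spectral slope: {slope:.2f}")   # close to -1 for 1/f noise
```

    A slope near -1 on a log-log plot is the operational signature of 1/f noise, as opposed to the flat spectrum of thermal (Johnson) noise.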

  18. Water on Mars: Inventory, distribution, and possible sources of polar ice

    NASA Technical Reports Server (NTRS)

    Clifford, S. M.

    1992-01-01

    Theoretical considerations and various lines of morphologic evidence suggest that, in addition to the normal seasonal and climatic exchange of H2O that occurs between the Martian polar caps, atmosphere, and mid to high latitude regolith, large volumes of water have been introduced into the planet's long term hydrologic cycle by the sublimation of equatorial ground ice, impacts, catastrophic flooding, and volcanism. Under the climatic conditions that are thought to have prevailed on Mars throughout the past 3 to 4 b.y., much of this water is expected to have been cold trapped at the poles. The amount of polar ice contributed by each of the planet's potential crustal sources is discussed and estimated. The final analysis suggests that only 5 to 15 pct. of this potential inventory is now in residence at the poles.

  19. Mushrooms: A Potential Natural Source of Anti-Inflammatory Compounds for Medical Applications

    PubMed Central

    Elsayed, Elsayed A.; El Enshasy, Hesham; Wadaan, Mohammad A. M.; Aziz, Ramlan

    2014-01-01

    For centuries, macrofungi have been used as food and medicine in different parts of the world. This is mainly attributed to their nutritional value as a potential source of carbohydrates, proteins, amino acids, and minerals. In addition, they also include many bioactive metabolites which make mushrooms and truffles common components in folk medicine, especially in Africa, the Middle East, China, and Japan. The reported medicinal effects of mushrooms include anti-inflammatory effects, with anti-inflammatory compounds of mushrooms comprising a highly diversified group in terms of their chemical structure. They include polysaccharides, terpenoids, phenolic compounds, and many other low molecular weight molecules. The aims of this review are to report the different types of bioactive metabolites and their relevant producers, as well as the different mechanisms of action of mushroom compounds as potent anti-inflammatory agents. PMID:25505823

  20. Solar radiation data sources, applications, and network design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    A prerequisite to considering solar energy projects is to determine the requirements for information about solar radiation to apply to possible projects. This report offers techniques to help the reader specify requirements in terms of solar radiation data and information currently available, describes the past and present programs to record and present information to be used for most requirements, presents courses of action to help the user meet his needs for information, lists sources of solar radiation data, and presents the problems, costs, benefits and responsibilities of programs to acquire additional solar radiation data. Extensive background information is provided about solar radiation data and its use. Specialized information about recording, collecting, processing, storing and disseminating solar radiation data is given. Several appendices are included which provide reference material for special situations.

  1. Attenuation Tomography of Northern California and the Yellow Sea / Korean Peninsula from Coda-source Normalized and Direct Lg Amplitudes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ford, S R; Dreger, D S; Phillips, W S

    2008-07-16

    Inversions for regional attenuation (1/Q) of Lg are performed in two different regions. The path attenuation component of the Lg spectrum is isolated using the coda-source normalization method, which corrects the Lg spectral amplitude for the source using the stable, coda-derived source spectra. Tomographic images of Northern California agree well with one-dimensional (1-D) Lg Q estimated from five different methods. We note there is some tendency for tomographic smoothing to increase Q relative to targeted 1-D methods. For example, in the San Francisco Bay Area, which has high attenuation relative to the rest of the region, Q is over-estimated by approximately 30. Coda-source normalized attenuation tomography is also carried out for the Yellow Sea/Korean Peninsula (YSKP), where output parameters (site, source, and path terms) are compared with those from the amplitude tomography method of Phillips et al. (2005) as well as a new method that ties the source term to the MDAC formulation (Walter and Taylor, 2001). The source terms show similar scatter between the coda-source corrected and MDAC source perturbation methods, whereas the amplitude method has the greatest correlation with estimated true source magnitude. The coda-source spectra better represent the source than the estimated magnitude does, which could be the cause of the scatter. The similarity in the source terms between the coda-source and MDAC-linked methods shows that the latter method may approximate the effect of the former, and therefore could be useful in regions without coda-derived sources. The site terms from the MDAC-linked method correlate slightly with global Vs30 measurements. While the coda-source and amplitude ratio methods do not correlate with Vs30 measurements, they do correlate with one another, which provides confidence that the two methods are consistent. 
The path 1/Q values are very similar between the coda-source and amplitude ratio methods except for small differences in the Daxing'anling Mountains in the northern YSKP. However, there is one large difference between the MDAC-linked method and the others in the region near stations TJN and INCN, which points to site effects as the cause of the difference.
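
    The essence of such a path-attenuation inversion can be shown with a toy version of the path term: after coda-source normalization, the log Lg amplitude decays linearly with distance at a rate set by Q. The sketch below uses synthetic data; the frequency, group velocity, and Q values are assumptions for illustration, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(1)

f, v = 1.0, 3.5          # Lg frequency (Hz) and group velocity (km/s), assumed
Q_true = 400.0
dist = rng.uniform(100.0, 800.0, size=200)     # epicentral distances, km

# Coda-source-normalized log amplitudes: the source term has been removed,
# leaving a constant plus path attenuation exp(-pi*f*d/(Q*v)) and noise.
lnA = 2.0 - (np.pi * f / (Q_true * v)) * dist + 0.05 * rng.standard_normal(dist.size)

# Linear least squares for intercept and attenuation slope, then convert
# the slope back to an apparent Q.
G = np.column_stack([np.ones_like(dist), dist])
m, slope = np.linalg.lstsq(G, lnA, rcond=None)[0]
Q_est = -np.pi * f / (v * slope)
print(f"estimated Q = {Q_est:.0f}")
```

    A tomographic inversion generalizes this single regression to many crossing paths, solving for a 2-D Q model plus per-event source and per-station site terms simultaneously.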

  2. Integrating Information in Biological Ontologies and Molecular Networks to Infer Novel Terms

    PubMed Central

    Li, Le; Yip, Kevin Y.

    2016-01-01

    Currently most terms and term-term relationships in Gene Ontology (GO) are defined manually, which creates cost, consistency and completeness issues. Recent studies have demonstrated the feasibility of inferring GO automatically from biological networks, which represents an important complementary approach to GO construction. These methods (NeXO and CliXO) are unsupervised, which means 1) they cannot use the information contained in existing GO, 2) the way they integrate biological networks may not optimize the accuracy, and 3) they are not customized to infer the three different sub-ontologies of GO. Here we present a semi-supervised method called Unicorn that extends these previous methods to tackle the three problems. Unicorn uses a sub-tree of an existing GO sub-ontology as a training set to learn parameters for integrating multiple networks. Cross-validation results show that Unicorn reliably inferred the left-out parts of each specific GO sub-ontology. In addition, by training Unicorn with an old version of GO together with biological networks, it successfully re-discovered some terms and term-term relationships present only in a new version of GO. Unicorn also successfully inferred some novel terms that were not contained in GO but have biological meanings well-supported by the literature. Availability: Source code of Unicorn is available at http://yiplab.cse.cuhk.edu.hk/unicorn/. PMID:27976738
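
    The semi-supervised integration step can be sketched in miniature: learn network-combination weights from a known part of the ontology, then check whether the integrated network predicts term co-membership on a held-out part. Everything below is synthetic and hypothetical; it illustrates the general idea of supervised network integration, not Unicorn's actual algorithm.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 30  # genes; the first 15 play the role of the "training" sub-tree

# Hypothetical ground truth: three terms of 10 genes each, encoded as a
# co-membership matrix (1 if two genes share a term).
groups = np.repeat([0, 1, 2], 10)
co_member = (groups[:, None] == groups[None, :]).astype(float)

# Two noisy biological networks to integrate (e.g. co-expression and PPI).
def noisy(strength, noise):
    m = np.clip(strength * co_member + noise * rng.standard_normal((n, n)), 0, 1)
    return (m + m.T) / 2

net1, net2 = noisy(1.0, 0.3), noisy(0.5, 0.5)

# Semi-supervised step: learn integration weights on the training genes by
# regressing known co-membership on the networks' edge weights.
tr = slice(0, 15)
X = np.column_stack([net1[tr, tr].ravel(), net2[tr, tr].ravel()])
w, *_ = np.linalg.lstsq(X, co_member[tr, tr].ravel(), rcond=None)
combined = w[0] * net1 + w[1] * net2

# "Cross-validation": on held-out genes, the integrated network should
# separate within-term pairs from between-term pairs.
te = slice(15, 30)
pred, truth = combined[te, te].ravel(), co_member[te, te].ravel()
sep = pred[truth == 1].mean() - pred[truth == 0].mean()
print(f"within/between separation on held-out genes: {sep:.2f}")
```

    A positive separation on the held-out genes is the toy analogue of Unicorn reliably inferring the left-out parts of a sub-ontology.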

  3. 10 CFR 40.41 - Terms and conditions of licenses.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 1 2010-01-01 2010-01-01 false Terms and conditions of licenses. 40.41 Section 40.41 Energy NUCLEAR REGULATORY COMMISSION DOMESTIC LICENSING OF SOURCE MATERIAL Licenses § 40.41 Terms and... the regulations in this part shall confine his possession and use of source or byproduct material to...

  4. Evaluation of an 18-year CMAQ simulation: Seasonal variations and long-term temporal changes in sulfate and nitrate

    NASA Astrophysics Data System (ADS)

    Civerolo, Kevin; Hogrefe, Christian; Zalewsky, Eric; Hao, Winston; Sistla, Gopal; Lynn, Barry; Rosenzweig, Cynthia; Kinney, Patrick L.

    2010-10-01

    This paper compares spatial and seasonal variations and temporal trends in modeled and measured concentrations of sulfur and nitrogen compounds in wet and dry deposition over an 18-year period (1988-2005) over a portion of the northeastern United States. Substantial emissions reduction programs occurred over this time period, including Title IV of the Clean Air Act Amendments of 1990, which primarily resulted in large decreases in sulfur dioxide (SO2) emissions by 1995, and nitrogen oxide (NOx) trading programs, which resulted in large decreases in warm season NOx emissions by 2004. Additionally, NOx emissions from mobile sources declined more gradually over this period. The results presented here illustrate the use of both operational and dynamic model evaluation and suggest that the modeling system largely captures the seasonal and long-term changes in sulfur compounds. The modeling system generally captures the long-term trends in nitrogen compounds, but does not reproduce the average seasonal variation or spatial patterns in nitrate.
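
    The two evaluation modes mentioned here, operational (bias against observations) and dynamic (whether the model reproduces long-term change), can be sketched as follows. The deposition series are synthetic stand-ins with an assumed declining trend, not CMAQ output or monitoring-network data.

```python
import numpy as np

rng = np.random.default_rng(3)
years = np.arange(1988, 2006)

# Hypothetical annual wet-deposition sulfate (kg/ha): a declining trend
# loosely mimicking the SO2 emission reductions described above, plus noise.
obs = 25.0 - 0.6 * (years - 1988) + rng.normal(0.0, 1.0, years.size)
mod = 24.0 - 0.55 * (years - 1988) + rng.normal(0.0, 1.0, years.size)

# Operational evaluation: mean bias of the model over the whole record.
bias = np.mean(mod - obs)

# Dynamic evaluation: does the model reproduce the observed long-term trend?
obs_trend = np.polyfit(years, obs, 1)[0]
mod_trend = np.polyfit(years, mod, 1)[0]
print(f"bias={bias:.2f} kg/ha, obs trend={obs_trend:.2f}, model trend={mod_trend:.2f}")
```

    A model can pass the operational test (small bias) while failing the dynamic one (wrong trend), or vice versa, which is why the paper applies both.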

  5. Quality of life for chronic psychiatric illnesses and home care

    PubMed Central

    Molu, Nesibe Gunay; Ozkan, Birgul; Icel, Sema

    2016-01-01

    Nowadays, mental illnesses are gradually increasing, and so is the number of chronic psychiatric patients. As a result of this increase, chronic psychiatric disorders add to the burden on patients and their families. To reduce the burden of mental illnesses on individuals and their families, treatment and care are given, including psychosocial, physiological and medical support and social services. To begin with, home care enables both the patient and his or her family to stay in their own house rather than in residential or long-term, institution-based nursing homes. In addition, home care providers deliver services to patients at their own houses. Other advantages of care at home are that it eases financial issues by reducing costs, reduces the patient's symptoms and improves the individual's quality of life (QoL). In addition, home care minimizes the burden on outpatient services and helps the patient and the family to solve their problems and gives them support. Home care services help patients regain their freedom and enhance the quality of their lives. Thus, it is necessary to procure and implement these services and to provide both the patient and his or her family with a high-quality life. Sources of data/study selection: A literature review was conducted using the keywords "home care, patient with chronic mental illness, quality of life, home care nursing" in the sources PsychINFO, PsychARTICLES, MEDLINE, PubMED, EBSCOHOST and The COCHRANE LIBRARY for the period 2005-2015. PMID:27182272

  6. Polyfunctional epoxies - Different molecular weights of brominated polymeric additives as flame retardants in graphite composites

    NASA Technical Reports Server (NTRS)

    Nir, Z.; Gilwee, W. J.; Kourtides, D. A.; Parker, J. A.

    1983-01-01

    The imparting of flame retardancy to graphite-reinforced composites without incurring mechanical property deterioration is investigated for the case of an experimental, trifunctional epoxy resin incorporating brominated polymeric additives (BPAs) of the diglycidyl type. Such mechanical properties as flexural strength and modulus, and short beam shear strength, were measured in dry and in hot/wet conditions, and the glass transition temperature, flammability, and water absorption were measured and compared with nonbrominated systems. Another comparison was made with a tetrafunctional epoxy system. The results obtained are explained in terms of differences in the polymeric backbone length of the bromine carrier polymer. BPAs are found to be a reliable bromine source for fire inhibition in carbon-reinforced composites without compromise of mechanical properties.

  7. Langmuir cells and mixing in the upper ocean

    NASA Astrophysics Data System (ADS)

    Carniel, S.; Sclavo, M.; Kantha, L. H.; Clayson, C. A.

    2005-01-01

    The presence of surface gravity waves at the ocean surface has two important effects on turbulence in the oceanic mixed layer (ML): wave breaking and Langmuir cells (LC). Both effects act as additional sources of turbulent kinetic energy (TKE) in the oceanic ML, and hence are important to mixing in the upper ocean. The breaking of high wave-number components of the wind wave spectrum provides an intense but sporadic source of turbulence in the upper surface; turbulence thus injected diffuses downward, while decaying rapidly, modifying oceanic near-surface properties which in turn could affect the air-sea transfer of heat and dissolved gases. LC provide another source of additional turbulence in the water column; they are counter-rotating cells inside the ML, with their axes roughly aligned in the direction of the wind (Langmuir, I., Science, 87, 119, 1938). These structures are usually made evident by the presence of debris and foam in the convergence areas of the cells, and are generated by the interaction of the wave-field-induced Stokes drift with the wind-induced shear stress. LC have long been thought to have a substantial influence on mixing in the upper ocean, but the difficulty of parameterizing them has led ML modelers to consistently ignore them in the past. However, recent Large Eddy Simulation (LES) studies suggest that it is possible to include their effect on mixing by simply adding additional production terms in the turbulence equations, thus enabling even 1D models to incorporate LC-driven turbulence. Since LC also modify the Coriolis terms in the mean momentum equations by the addition of a term involving the Stokes drift, their effect on the velocity structure in the ML is also quite significant and could have a major impact on the drift of objects and spilled oil in the upper ocean. 
In this paper we examine the effect of surface gravity waves on mixing in the upper ocean, focusing on Langmuir circulations, which is by far the dominant part of the surface wave contribution to mixing. Oceanic ML models incorporating these effects are applied to an observation station in the Northern Adriatic Sea to see what the extent of these effects might be. It is shown that the surface wave effects can indeed be significant; in particular, the modification of the velocity profile due to LC-generated turbulence can be large under certain conditions. However, the surface wave effects on the bulk properties of the ML, such as the associated temperature, while significant, are generally speaking well within the errors introduced by uncertainties in the external forcing of the models. This seems to be the reason why ML models, though pretty much ignoring surface wave effects until recently, have been reasonably successful in depicting the evolution of the mixed layer temperature (MLT) at various timescales.
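
    The LES-motivated parameterization described above amounts to adding a Stokes-shear production term to the TKE budget alongside the ordinary shear production. A minimal sketch, with assumed current, wave, and eddy-viscosity values (not parameters from the Adriatic study):

```python
import numpy as np

# Depth grid (m): z = 0 at the surface, negative downward.
z = np.linspace(0.0, -30.0, 61)

# Assumed mean current and monochromatic-wave Stokes drift profiles.
u = 0.2 * np.exp(z / 10.0)            # wind-driven current, m/s
Us0, k = 0.1, 0.05                    # surface Stokes drift (m/s), wavenumber (1/m)
Us = Us0 * np.exp(2.0 * k * z)        # Stokes drift decays as exp(2kz)

nu_t = 1e-2                           # constant eddy viscosity, m^2/s (assumed)
dudz = np.gradient(u, z)
dUsdz = np.gradient(Us, z)

# TKE production terms: ordinary shear production nu_t*(du/dz)^2, plus the
# Langmuir (Stokes-shear) production nu_t*(du/dz)*(dUs/dz) that LES-based
# parameterizations add to the turbulence equations.
P_shear = nu_t * dudz ** 2
P_stokes = nu_t * dudz * dUsdz
ratio = P_stokes[0] / P_shear[0]
print(f"surface Stokes/shear production ratio: {ratio:.2f}")
```

    With these (illustrative) parameters the Stokes production is comparable to the ordinary shear production near the surface, which is why neglecting LC can noticeably distort modeled near-surface mixing and velocity profiles.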

  8. Enhancing GADRAS Source Term Inputs for Creation of Synthetic Spectra.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Horne, Steven M.; Harding, Lee

    The Gamma Detector Response and Analysis Software (GADRAS) team has enhanced the source term input for the creation of synthetic spectra. These enhancements include the following: allowing users to programmatically provide source information to GADRAS through memory, rather than through a string limited to 256 characters; allowing users to provide their own source decay database information; and updating the default GADRAS decay database to fix errors and include coincident gamma information.

  9. Localization of sound sources in a room with one microphone

    NASA Astrophysics Data System (ADS)

    Peić Tukuljac, Helena; Lissek, Hervé; Vandergheynst, Pierre

    2017-08-01

    Estimation of the location of sound sources is usually done using microphone arrays. Such settings provide an environment where we know the difference between the received signals among different microphones in terms of phase or attenuation, which enables localization of the sound sources. In our solution we exploit the properties of the room transfer function in order to localize a sound source inside a room with only one microphone. The shape of the room and the position of the microphone are assumed to be known. The design guidelines and limitations of the sensing matrix are given. The implementation is based on sparsity in terms of the voxels in the room that are occupied by a source. What is especially interesting about our solution is that we provide localization of the sound sources not only in the horizontal plane, but in terms of full 3D coordinates inside the room.
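
    The sparsity argument can be illustrated with a toy single-microphone setup: stack the voxel-to-microphone transfer functions as columns of a sensing matrix and pick the column most correlated with the measurement, the first step of a matching-pursuit recovery. In the paper the columns come from the known room geometry; here they are random placeholders, and the voxel count and source position are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(5)
n_freq, n_vox = 64, 125              # measured frequencies, candidate voxels

# Hypothetical sensing matrix: column j stands in for the room transfer
# function from voxel j to the single microphone, sampled at n_freq
# frequencies, normalized to unit energy.
A = rng.standard_normal((n_freq, n_vox))
A /= np.linalg.norm(A, axis=0)

true_voxel = 42                      # hypothetical source position
y = 3.0 * A[:, true_voxel] + 0.01 * rng.standard_normal(n_freq)

# Matching-pursuit step: with a 1-sparse source, the occupied voxel is the
# column of the sensing matrix most correlated with the measurement.
est_voxel = int(np.argmax(np.abs(A.T @ y)))
print(f"estimated voxel: {est_voxel}")
```

    Recovery succeeds when the columns are sufficiently incoherent, which is exactly the sensing-matrix design constraint the abstract refers to.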

  10. A theoretical prediction of the acoustic pressure generated by turbulence-flame front interactions

    NASA Technical Reports Server (NTRS)

    Huff, R. G.

    1984-01-01

    The equations of momentum and continuity are combined and linearized, yielding the one-dimensional nonhomogeneous acoustic wave equation. Three terms in the nonhomogeneous equation act as acoustic sources and are taken to be forcing functions acting on the homogeneous wave equation. The three source terms are: fluctuating entropy, turbulence gradients, and turbulence-flame interactions. Each source term is discussed. The turbulence-flame interaction source is used as the basis for computing the source acoustic pressure from the Fourier-transformed wave equation. Pressure fluctuations created in turbopump gas generators and turbines may act as a forcing function for turbine and propellant tube vibrations in Earth-to-orbit space propulsion systems and could reduce their life expectancy. A preliminary assessment of the acoustic pressure fluctuations in such systems is presented.
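
    The solution route described, treating a source term as a forcing function and solving the Fourier-transformed wave equation algebraically, can be sketched in one dimension. The source shape, forcing frequency, and domain below are illustrative assumptions, not quantities from the report.

```python
import numpy as np

c = 340.0                 # speed of sound, m/s (assumed)
L, n = 10.0, 1024         # periodic domain length (m) and grid size
x = np.linspace(0.0, L, n, endpoint=False)
kx = 2.0 * np.pi * np.fft.fftfreq(n, d=L / n)

omega = 2.0 * np.pi * 200.0          # single forcing frequency, 200 Hz
# Localized source distribution standing in for the turbulence-flame
# interaction term (a Gaussian pulse; purely illustrative).
s = np.exp(-((x - L / 2) ** 2) / 0.01)

# Fourier-transformed wave equation: (d^2/dx^2 + omega^2/c^2) p = -s
# becomes algebraic in wavenumber space, p_hat = s_hat / (omega^2/c^2 - kx^2);
# the small imaginary part keeps the resonant wavenumbers finite (a crude
# radiation condition).
s_hat = np.fft.fft(s)
p = np.real(np.fft.ifft(s_hat / (omega ** 2 / c ** 2 - kx ** 2 + 1e-6j)))
print(f"peak acoustic pressure amplitude (arbitrary units): {np.max(np.abs(p)):.3f}")
```

    The same algebraic division would be applied per frequency bin of a broadband forcing spectrum, which is how a fluctuating source term maps onto an acoustic pressure spectrum.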

  12. The validity of open-source data when assessing jail suicides.

    PubMed

    Thomas, Amanda L; Scott, Jacqueline; Mellow, Jeff

    2018-05-09

    The Bureau of Justice Statistics' Deaths in Custody Reporting Program (DCRP) is the primary source for jail suicide research, though the data are restricted from general dissemination. This study is the first to examine whether jail suicide data obtained from publicly available sources can help inform our understanding of this serious public health problem. Of the 304 suicides reported through the DCRP in 2009, roughly 56 percent (N = 170) were identified through the open-source search protocol. Each of the sources was assessed based on how much information was collected on the incident and the types of variables available. A descriptive analysis was then conducted on the variables that were present in both data sources. The four variables present in each data source were: (1) demographic characteristics of the victim, (2) the location of occurrence within the facility, (3) the location of occurrence by state, and (4) the size of the facility. Findings demonstrate that the prevalence and correlates of jail suicides are extremely similar in both open-source and official data. Moreover, for almost every variable measured, open-source data captured as much information as official data did, if not more. Further, variables not found in official data were identified in the open-source database, allowing researchers a more nuanced understanding of the situational characteristics of the event. This research provides support for including open-source data in jail suicide research, as it illustrates how open-source data can provide additional information not found in official data. In sum, this research is vital in terms of possible suicide prevention, which may be directly linked to being able to manipulate environmental factors.

  13. Fermi Large Area Telescope First Source Catalog

    DOE PAGES

    Abdo, A. A.; Ackermann, M.; Ajello, M.; ...

    2010-05-25

    Here, we present a catalog of high-energy gamma-ray sources detected by the Large Area Telescope (LAT), the primary science instrument on the Fermi Gamma-ray Space Telescope (Fermi), during the first 11 months of the science phase of the mission, which began on 2008 August 4. The First Fermi-LAT catalog (1FGL) contains 1451 sources detected and characterized in the 100 MeV to 100 GeV range. Source detection was based on the average flux over the 11 month period, and the threshold likelihood Test Statistic is 25, corresponding to a significance of just over 4σ. The 1FGL catalog includes source location regions, defined in terms of elliptical fits to the 95% confidence regions, and power-law spectral fits as well as flux measurements in five energy bands for each source. In addition, monthly light curves are provided. Using a protocol defined before launch we have tested for several populations of gamma-ray sources among the sources in the catalog. For individual LAT-detected sources we provide firm identifications or plausible associations with sources in other astronomical catalogs. Identifications are based on correlated variability with counterparts at other wavelengths, or on spin or orbital periodicity. For the catalogs and association criteria that we have selected, 630 of the sources are unassociated. Care was taken to characterize the sensitivity of the results to the model of interstellar diffuse gamma-ray emission used to model the bright foreground, with the result that 161 sources at low Galactic latitudes and toward bright local interstellar clouds are flagged as having properties that are strongly dependent on the model or as potentially being due to incorrectly modeled structure in the Galactic diffuse emission.
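
    The quoted detection threshold can be unpacked approximately: a likelihood Test Statistic behaves asymptotically as chi-squared with one degree of freedom per free source parameter, so the Gaussian-equivalent significance of TS = 25 depends on the dof convention (sqrt(25) = 5σ for 1 dof, close to 4σ for 4 free parameters such as two position coordinates, flux, and spectral index). The sketch below assumes the 4-dof convention; the exact convention used by the catalog pipeline may differ.

```python
import math

def chi2_sf_4dof(ts):
    # Survival function of chi-squared with 4 degrees of freedom
    # (closed form for even dof): P(chi2_4 > ts) = exp(-ts/2) * (1 + ts/2).
    return math.exp(-ts / 2.0) * (1.0 + ts / 2.0)

def sigma_equiv(p):
    # Gaussian-equivalent one-sided significance: solve sf(z) = p by
    # bisection, using sf(z) = 0.5 * erfc(z / sqrt(2)).
    lo, hi = 0.0, 40.0
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if 0.5 * math.erfc(mid / math.sqrt(2.0)) > p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

p = chi2_sf_4dof(25.0)
print(f"TS=25 with 4 dof -> p = {p:.1e}, roughly {sigma_equiv(p):.1f} sigma")
```

    This is the sense in which TS = 25 corresponds to a significance in the neighborhood of 4σ rather than the naive sqrt(TS) = 5σ.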

  14. Aerosol characterization over the southeastern United States using high resolution aerosol mass spectrometry: spatial and seasonal variation of aerosol composition, sources, and organic nitrates

    NASA Astrophysics Data System (ADS)

    Xu, L.; Suresh, S.; Guo, H.; Weber, R. J.; Ng, N. L.

    2015-04-01

    We deployed a High-Resolution Time-of-Flight Aerosol Mass Spectrometer (HR-ToF-AMS) and an Aerosol Chemical Speciation Monitor (ACSM) to characterize the chemical composition of submicron non-refractory particles (NR-PM1) in the southeastern US. Measurements were performed in both rural and urban sites in the greater Atlanta area, GA and Centreville, AL for approximately one year, as part of the Southeastern Center for Air Pollution and Epidemiology study (SCAPE) and Southern Oxidant and Aerosol Study (SOAS). Organic aerosol (OA) accounts for more than half of NR-PM1 mass concentration regardless of sampling sites and seasons. Positive matrix factorization (PMF) analysis of HR-ToF-AMS measurements identified various OA sources, depending on location and season. Hydrocarbon-like OA (HOA) and cooking OA (COA) have important but not dominant contributions to total OA in urban sites. Biomass burning OA (BBOA) concentration shows a distinct seasonal variation with a larger enhancement in winter than summer. We find a good correlation between BBOA and brown carbon, indicating biomass burning is an important source for brown carbon, although an additional, unidentified brown carbon source is likely present at the rural Yorkville site. Isoprene-derived OA (Isoprene-OA) is only deconvolved in warmer months and contributes 18-36% of total OA. The presence of Isoprene-OA factor in urban sites is more likely from local production in the presence of NOx than transport from rural sites. More-oxidized and less-oxidized oxygenated organic aerosol (MO-OOA and LO-OOA, respectively) are dominant fractions (47-79%) of OA in all sites. MO-OOA correlates well with ozone in summer, but not in winter, indicating MO-OOA sources may vary with seasons. LO-OOA, which reaches a daily maximum at night, correlates better with estimated nitrate functionality from organic nitrates than total nitrates. 
Based on the HR-ToF-AMS measurements, we estimate that the nitrate functionality from organic nitrates contributes 63-100% of total measured nitrates in summer. Further, the contribution of organic nitrates to total OA is estimated to be 5-12% in summer, suggesting that organic nitrates are important components in the ambient aerosol in the southeastern US. The spatial distribution of OA is investigated by comparing simultaneous HR-ToF-AMS measurements with ACSM measurements at two different sampling sites. OA is found to be spatially homogeneous in summer, possibly due to stagnant air mass and a dominant amount of regional SOA in the southeastern US. The homogeneity is less in winter, which is likely due to spatial variation of primary emissions. We observed that the seasonality of OA concentration shows a clear urban/rural contrast. While OA exhibits weak seasonal variation in the urban sites, its concentration is higher in summer than winter for rural sites. This observation from our year-long measurements is consistent with 14 years of organic carbon (OC) data from the SouthEastern Aerosol Research and Characterization (SEARCH) network. The comparison between short-term measurements with advanced instruments and long-term measurements of basic air quality indicators not only tests the robustness of the short-term measurements but also provides insights in interpreting long-term measurements. We find that OA factors resolved from PMF analysis on HR-ToF-AMS measurements have distinctly different diurnal variations. The compensation of OA factors with different diurnal trends is one possible reason for the repeatedly observed, relatively flat OA diurnal profile in the southeastern US. In addition, analysis of long-term measurements shows that the correlation between OC and sulfate is substantially higher in summer than winter. 
This seasonality could be partly due to the effects of sulfate on isoprene SOA formation as revealed by the short-term, intensive measurements.
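
    PMF factor analysis of AMS data has the generic form of a non-negative matrix factorization of a time-by-m/z matrix into factor time series and factor mass spectra. The sketch below uses scikit-learn's NMF as a stand-in for the dedicated PMF solvers used with AMS data, on a synthetic two-factor dataset with opposed diurnal cycles (loosely mimicking the MO-OOA/LO-OOA day/night contrast); all dimensions and profiles are invented.

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(7)

# Hypothetical AMS dataset: 500 time points x 80 m/z channels, built from
# two factor mass-spectral profiles with distinct diurnal time series.
t = np.linspace(0.0, 5.0, 500)                     # time, days
profiles = rng.random((2, 80))
ts_day = 1.0 + np.sin(2 * np.pi * t) ** 2          # daytime-peaking factor
ts_night = 1.0 + np.cos(2 * np.pi * t) ** 2        # nighttime-peaking factor
X = np.outer(ts_day, profiles[0]) + np.outer(ts_night, profiles[1])
X += 0.01 * rng.random(X.shape)                    # small positive noise

# PMF constrains both factor matrices to be non-negative: X ~ G @ F.
model = NMF(n_components=2, init="nndsvda", max_iter=1000, random_state=0)
G = model.fit_transform(X)      # factor time series (rows of X explained)
F = model.components_           # factor mass spectra
print(f"reconstruction error: {model.reconstruction_err_:.3f}")
```

    In practice the number of factors, rotational ambiguity, and error weighting all require care; the point here is only the shape of the decomposition that yields factor time series with different diurnal profiles.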

  15. Aerosol characterization over the southeastern United States using high-resolution aerosol mass spectrometry: spatial and seasonal variation of aerosol composition and sources with a focus on organic nitrates

    NASA Astrophysics Data System (ADS)

    Xu, L.; Suresh, S.; Guo, H.; Weber, R. J.; Ng, N. L.

    2015-07-01

    We deployed a High-Resolution Time-of-Flight Aerosol Mass Spectrometer (HR-ToF-AMS) and an Aerosol Chemical Speciation Monitor (ACSM) to characterize the chemical composition of submicron non-refractory particulate matter (NR-PM1) in the southeastern USA. Measurements were performed in both rural and urban sites in the greater Atlanta area, Georgia (GA), and Centreville, Alabama (AL), for approximately 1 year as part of Southeastern Center for Air Pollution and Epidemiology study (SCAPE) and Southern Oxidant and Aerosol Study (SOAS). Organic aerosol (OA) accounts for more than half of NR-PM1 mass concentration regardless of sampling sites and seasons. Positive matrix factorization (PMF) analysis of HR-ToF-AMS measurements identified various OA sources, depending on location and season. Hydrocarbon-like OA (HOA) and cooking OA (COA) have important, but not dominant, contributions to total OA in urban sites (i.e., 21-38 % of total OA depending on site and season). Biomass burning OA (BBOA) concentration shows a distinct seasonal variation with a larger enhancement in winter than summer. We find a good correlation between BBOA and brown carbon, indicating biomass burning is an important source for brown carbon, although an additional, unidentified brown carbon source is likely present at the rural Yorkville site. Isoprene-derived OA factor (isoprene-OA) is only deconvolved in warmer months and contributes 18-36 % of total OA. The presence of isoprene-OA factor in urban sites is more likely from local production in the presence of NOx than transport from rural sites. More-oxidized and less-oxidized oxygenated organic aerosol (MO-OOA and LO-OOA, respectively) are dominant fractions (47-79 %) of OA in all sites. MO-OOA correlates well with ozone in summer but not in winter, indicating MO-OOA sources may vary with seasons. LO-OOA, which reaches a daily maximum at night, correlates better with estimated nitrate functionality from organic nitrates than total nitrates. 
Based on the HR-ToF-AMS measurements, we estimate that the nitrate functionality from organic nitrates contributes 63-100 % to the total measured nitrates in summer. Furthermore, the contribution of organic nitrates to total OA is estimated to be 5-12 % in summer, suggesting that organic nitrates are important components in the ambient aerosol in the southeastern USA. The spatial distribution of OA is investigated by comparing simultaneous HR-ToF-AMS measurements with ACSM measurements at two different sampling sites. OA is found to be spatially homogeneous in summer due possibly to stagnant air mass and a dominant amount of regional secondary organic aerosol (SOA) in the southeastern USA. The homogeneity is less in winter, which is likely due to spatial variation of primary emissions. We observe that the seasonality of OA concentration shows a clear urban/rural contrast. While OA exhibits weak seasonal variation in the urban sites, its concentration is higher in summer than winter for rural sites. This observation from our year-long measurements is consistent with 14 years of organic carbon (OC) data from the SouthEastern Aerosol Research and Characterization (SEARCH) network. The comparison between short-term measurements with advanced instruments and long-term measurements of basic air quality indicators not only tests the robustness of the short-term measurements but also provides insights in interpreting long-term measurements. We find that OA factors resolved from PMF analysis on HR-ToF-AMS measurements have distinctly different diurnal variations. The compensation of OA factors with different diurnal trends is one possible reason for the repeatedly observed, relatively flat OA diurnal profile in the southeastern USA. In addition, analysis of long-term measurements shows that the correlation between OC and sulfate is substantially stronger in summer than winter. 
This seasonality could be partly due to the effects of sulfate on isoprene SOA formation as revealed by the short-term intensive measurements.
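The organic-nitrate mass estimate described above can be sketched as a back-of-envelope scaling: the AMS measures only the nitrate (-ONO2) functionality, so the full organic-nitrate mass is obtained by scaling with an assumed average molecular weight. All numbers below are illustrative assumptions, not the study's values.

```python
MW_NO3 = 62.0           # g/mol, nitrate (-ONO2) functionality
MW_ORG_NITRATE = 250.0  # g/mol, assumed average organic nitrate molecule

def organic_nitrate_oa_fraction(no3_total, f_organic, oa):
    """Fraction of OA mass contributed by organic nitrates."""
    no3_org = no3_total * f_organic              # organic part of measured nitrate
    return no3_org * (MW_ORG_NITRATE / MW_NO3) / oa

# e.g. 0.2 ug/m3 measured nitrate, 80 % organic, 8 ug/m3 OA
print(round(organic_nitrate_oa_fraction(0.2, 0.8, 8.0), 3))   # ≈ 0.081
```

The result is sensitive to the assumed molecular weight, which is why estimates of this kind are usually quoted as a range.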

  16. Mercury in municipal solid wastes and New Jersey mercury prevention and reduction program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Erdogan, H.; Stevenson, E.

    1994-12-31

    Mercury is a very toxic heavy metal that accumulates in the brain, causing neurological damage involving psychasthenic and vegetative syndromes. At high exposure levels it causes behavioral and personality changes, loss of memory and insomnia. Long-term exposure, or exposure during pregnancy, to mercury or mercury compounds can permanently damage the kidneys and the fetus. In addition to its potential effects on human health, mercury poisoning can also affect other living organisms. Mercury differs from other heavy metals in that it consistently biomagnifies and bioaccumulates within the aquatic food chain. Global sources of mercury release are both natural and anthropogenic. Natural sources include volatilization of gaseous mercury from soils and rocks, volcanic releases, and evaporation from the ocean and other water bodies. Anthropogenic sources are fuel and coal combustion, mining, smelting, manufacturing activities, disposal of sludge, pesticides, animal and food waste, and incineration of municipal solid waste. Worldwide, combustion of municipal solid waste is the second largest source of atmospheric mercury emissions. In New Jersey, incineration of solid waste is the largest source of atmospheric mercury emissions. The New Jersey Department of Environmental Protection and Energy (NJDEPE) has developed a comprehensive program to control and prevent mercury emissions resulting from the combustion of municipal solid waste.

  17. Can fungi compete with marine sources for chitosan production?

    PubMed

    Ghormade, V; Pathan, E K; Deshpande, M V

    2017-11-01

    Chitosan, a β-1,4-linked glucosamine polymer, is formed by deacetylation of chitin. It has a wide range of applications, from agriculture to human health care products. Chitosan is commercially produced from shellfish waste from shrimp, crab and lobster processing, using strong alkalis at high temperatures for long periods. The production of chitin and chitosan from fungal sources has gained increased attention in recent years due to potential advantages over the current marine source in terms of homogeneous polymer length, high degree of deacetylation and solubility. Zygomycetous fungi such as Absidia coerulea, Benjaminiella poitrasii, Cunninghamella elegans, Gongronella butleri, Mucor rouxii, Mucor racemosus and Rhizopus oryzae have been studied extensively. Isolation of chitosan has also been reported from a few edible basidiomycetous fungi such as Agaricus bisporus, Lentinula edodes and Pleurotus sajor-caju. Other organisms from the mycotech industries explored for chitosan production are Aspergillus niger, Penicillium chrysogenum, Saccharomyces cerevisiae and other wine yeasts. A number of aspects, such as value addition to the existing applications of fungi, utilization of waste from the agriculture sector, issues and challenges for the production of fungal chitosan that can compete with existing sources, metabolic engineering, and novel applications, are discussed to adjudge the potential of fungal sources for commercial chitosan production. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. Effects of argument quality, source credibility and self-reported diabetes knowledge on message attitudes: an experiment using diabetes related messages.

    PubMed

    Lin, Tung-Cheng; Hwang, Lih-Lian; Lai, Yung-Jye

    2017-05-17

    Previous studies have reported that credibility and content (argument quality) are the most critical factors affecting the quality of health information and its acceptance and use; however, this causal relationship merits further investigation in the context of health education. Moreover, message recipients' prior knowledge may moderate these relationships. This study used the elaboration likelihood model to determine the main effects of argument quality and source credibility, and the moderating effect of self-reported diabetes knowledge, on message attitudes. A between-subjects experimental design using an educational message concerning diabetes for manipulation was applied to validate the effects empirically. A total of 181 participants without diabetes were recruited from the Department of Health, Taipei City Government. Four message groups were manipulated in terms of argument quality (high and low) × source credibility (high and low). Argument quality and source credibility of health information significantly influenced the attitude of message recipients. Participants with high self-reported knowledge exhibited significant disapproval of messages with low argument quality. Effective health information should provide objective descriptions and cite reliable sources; in addition, it should provide accurate, customised messages for recipients who have a high background knowledge level and the ability to discern message quality. © 2017 Health Libraries Group Health Information & Libraries Journal.

  19. An adaptive Bayesian inference algorithm to estimate the parameters of a hazardous atmospheric release

    NASA Astrophysics Data System (ADS)

    Rajaona, Harizo; Septier, François; Armand, Patrick; Delignon, Yves; Olry, Christophe; Albergel, Armand; Moussafir, Jacques

    2015-12-01

    In the event of an accidental or intentional atmospheric release, the reconstruction of the source term using measurements from a set of sensors is an important and challenging inverse problem. A rapid and accurate estimation of the source allows faster and more efficient action by first-response teams, in addition to providing better damage assessment. This paper presents a Bayesian probabilistic approach to estimate the location and the temporal emission profile of a pointwise source. The release rate is evaluated analytically using a Gaussian assumption on its prior distribution, and is enhanced with a positivity constraint to improve the estimation. The source location is obtained by means of an advanced iterative Monte Carlo technique called Adaptive Multiple Importance Sampling (AMIS), which uses a recycling process at each iteration to accelerate its convergence. The proposed methodology is tested using synthetic and real concentration data in the framework of the Fusion Field Trials 2007 (FFT-07) experiment. The quality of the results is comparable to that of the Markov Chain Monte Carlo (MCMC) algorithm, a popular Bayesian method used for source estimation. Moreover, the adaptive processing of the AMIS provides better sampling efficiency by reusing all the generated samples.
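The recycling idea behind AMIS, reweighting all past samples against the mixture of every proposal used so far before adapting the proposal, can be sketched on a toy 2-D source-location problem. All specifics below (forward model, sensor layout, noise level, prior box, proposal settings) are invented assumptions, not the paper's setup.

```python
import math, random

def forward(src, sensor):
    """Hypothetical concentration at a sensor from a unit point source."""
    d2 = (src[0] - sensor[0]) ** 2 + (src[1] - sensor[1]) ** 2
    return 1.0 / (1.0 + d2)

SENSORS = [(0.0, 0.0), (4.0, 0.0), (0.0, 4.0), (4.0, 4.0)]
TRUE_SRC = (1.0, 2.0)
SIGMA = 0.05                                   # assumed measurement noise (std)
DATA = [forward(TRUE_SRC, s) for s in SENSORS]

def log_post(src):
    # Gaussian likelihood, flat prior with bounded support (assumed search box)
    if not (-2 <= src[0] <= 6 and -2 <= src[1] <= 6):
        return -math.inf
    return -0.5 * sum(((forward(src, s) - d) / SIGMA) ** 2
                      for s, d in zip(SENSORS, DATA))

def gauss_pdf(x, mu, sd):
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

def amis(n_iter=20, n_per=200, seed=1):
    rng = random.Random(seed)
    mu, sd = [2.0, 2.0], [2.0, 2.0]            # initial proposal
    proposals, samples = [], []
    for _ in range(n_iter):
        proposals.append((list(mu), list(sd)))
        samples += [(rng.gauss(mu[0], sd[0]), rng.gauss(mu[1], sd[1]))
                    for _ in range(n_per)]
        # Recycling step: reweight ALL past samples against the mixture of
        # every proposal used so far, then adapt the proposal moments.
        weighted = []
        for (x, y) in samples:
            mix = sum(gauss_pdf(x, m[0], s[0]) * gauss_pdf(y, m[1], s[1])
                      for m, s in proposals) / len(proposals)
            weighted.append((math.exp(log_post((x, y))) / mix, x, y))
        tot = sum(w for w, _, _ in weighted)
        mu = [sum(w * x for w, x, _ in weighted) / tot,
              sum(w * y for w, _, y in weighted) / tot]
        sd = [max(math.sqrt(sum(w * (x - mu[0]) ** 2 for w, x, _ in weighted) / tot), 1e-3),
              max(math.sqrt(sum(w * (y - mu[1]) ** 2 for w, _, y in weighted) / tot), 1e-3)]
    return mu

print(amis())   # posterior-mean estimate of the source location
```

With noise-free synthetic data, the adapted posterior mean settles near the true location; the actual method also infers the temporal release profile analytically, which this sketch omits.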

  20. Evaluation of Intercontinental Transport of Ozone Using Full-tagged, Tagged-N and Sensitivity Methods

    NASA Astrophysics Data System (ADS)

    Guo, Y.; Liu, J.; Mauzerall, D. L.; Emmons, L. K.; Horowitz, L. W.; Fan, S.; Li, X.; Tao, S.

    2014-12-01

    Long-range transport of ozone is of great concern, yet the source-receptor relationships derived previously depend strongly on the source attribution techniques used. Here we describe a new tagged ozone mechanism (full-tagged), the design of which seeks to take into account the combined effects of emissions of ozone precursors, CO, NOx and VOCs, from a particular source, while keeping the current state of chemical equilibrium unchanged. We label emissions from the target source (A) and background (B). When two species from A and B sources react with each other, half of the resulting products are labeled A, and half B. Thus the impact of a given source on downwind regions is recorded through tagged chemistry. We then incorporate this mechanism into the Model for Ozone and Related chemical Tracers (MOZART-4) to examine the impact of anthropogenic emissions within North America, Europe, East Asia and South Asia on ground-level ozone downwind of source regions during 1999-2000. We compare our results with two previously used methods -- the sensitivity and tagged-N approaches. The ozone attributed to a given source by the full-tagged method is more widely distributed spatially, but has weaker seasonal variability than that estimated by the other methods. On a seasonal basis, for most source/receptor pairs, the full-tagged method estimates the largest amount of tagged ozone, followed by the sensitivity and tagged-N methods. In terms of trans-Pacific influence of ozone pollution, the full-tagged method estimates the strongest impact of East Asian (EA) emissions on the western U.S. (WUS) in MAM and JJA (~3 ppbv), which is substantially different in magnitude and seasonality from tagged-N and sensitivity studies. 
This difference results from the full-tagged method accounting for the maintenance of peroxy radicals (e.g., CH3O2, CH3CO3, and HO2), in addition to NOy, as effective reservoirs of EA source impact across the Pacific, allowing for a significant contribution to ozone formation over WUS (particularly in summer). Thus, the full-tagged method, with its clear discrimination of source and background contributions on a per-reaction basis, provides unique insights into the critical role of VOCs (and additional reactive nitrogen species) in determining the nonlinear inter-continental influence of ozone pollution.
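The half/half product-labeling rule described above can be illustrated with a minimal bookkeeping sketch. Each species carries a tagged fraction f (the share attributable to source A); when two reactants combine, the product inherits the average of their labels. The species names and reaction chain below are illustrative only, not MOZART-4 chemistry.

```python
def react(fx, fy):
    """Tagged fraction of a product formed from reactants with fractions fx, fy:
    half of the product carries each parent's label."""
    return 0.5 * (fx + fy)

# NO2 formed from fully tagged NO (source A, f = 1.0) reacting with
# background HO2 (f = 0.0): the product is half-attributed to A.
f_no2 = react(1.0, 0.0)    # 0.5

# Ozone formed downstream from that NO2 and a background reactant:
f_o3 = react(f_no2, 0.0)   # 0.25

print(f_no2, f_o3)
```

Because the label is diluted by half at each reaction with background species rather than dropped, source influence propagates through radical reservoirs, which is why the full-tagged attribution is spatially broader than tagged-N results.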

  1. Management of Ultimate Risk of Nuclear Power Plants by Source Terms - Lessons Learned from the Chernobyl Accident

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Genn Saji

    2006-07-01

    The term 'ultimate risk' is used here to describe the probabilities and radiological consequences that should be incorporated in the siting, containment design and accident management of nuclear power plants for hypothetical accidents. It is closely related to the source terms specified in siting criteria, which assure an adequate separation of the radioactive inventories of the plants from the public in the event of a hypothetical and severe accident. The author would like to point out that current source terms, which are based on information from the Windscale accident (1957) through TID-14844, are very outdated and do not incorporate lessons learned from either the Three Mile Island (TMI, 1979) or the Chernobyl (1986) accident, two of the most severe accidents ever experienced. As a result of the observations of benign radionuclide releases at TMI, the technical community in the US felt that a more realistic evaluation of severe reactor accident source terms was necessary. Against this background, the 'source term research project' was organized in 1984 to respond to these challenges. Unfortunately, soon after the final report from this project was released, the Chernobyl accident occurred. Due to the enormous consequences of that accident, the once-optimistic prospects for establishing a more realistic source term were completely shattered. The Chernobyl accident, with its human death toll and the dispersion of a large part of the fission fragment inventory into the environment, created a significant degradation in the public's acceptance of nuclear energy throughout the world. In spite of this, nuclear communities have been prudent in responding to the public's anxiety about the ultimate safety of nuclear plants, since many unknowns still remained concerning the mechanism of the Chernobyl accident.
In order to resolve some of these mysteries, the author has performed a scoping study of the dispersion and deposition mechanisms of fuel particles and fission fragments during the initial phase of the Chernobyl accident. Through this study, it is now possible to broadly reconstruct the radiological consequences by using a dispersion calculation technique, combined with the meteorological data at the time of the accident and the 137Cs land contamination densities measured and reported around the Chernobyl area. Although it is challenging to incorporate lessons learned from the Chernobyl accident into the source term issues, the author has already developed an example of safety goals incorporating the radiological consequences of the accident. The example provides safety goals by specifying source term releases in a graded approach in combination with probabilities, i.e. risks. The author believes that future source term specifications should be directly linked with safety goals. (author)

  2. The long-term problems of contaminated land: Sources, impacts and countermeasures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baes, C.F. III

    1986-11-01

    This report examines the various sources of radiological land contamination; its extent; its impacts on man, agriculture, and the environment; countermeasures for mitigating exposures; radiological standards; alternatives for achieving land decontamination and cleanup; and possible alternatives for utilizing the land. The major potential sources of extensive long-term land contamination with radionuclides, in order of decreasing extent, are nuclear war, detonation of a single nuclear weapon (e.g., a terrorist act), serious reactor accidents, and nonfission nuclear weapons accidents that disperse the nuclear fuels (termed ''broken arrows'').

  3. Improved Discrete Ordinate Solutions in the Presence of an Anisotropically Reflecting Lower Boundary: Upgrades of the DISORT Computational Tool

    NASA Technical Reports Server (NTRS)

    Lin, Z.; Stamnes, S.; Jin, Z.; Laszlo, I.; Tsay, S. C.; Wiscombe, W. J.; Stamnes, K.

    2015-01-01

    A successor version 3 of DISORT (DISORT3) is presented, with important upgrades that improve the accuracy, efficiency, and stability of the algorithm. Compared with version 2 (DISORT2, released in 2000), these upgrades include (a) a redesigned BRDF computation that improves both speed and accuracy, (b) a revised treatment of the single scattering correction, and (c) additional efficiency and stability upgrades for beam sources. In DISORT3 the BRDF computation is improved in the following three ways: (i) the Fourier decomposition is prepared "off-line", thus avoiding the repeated internal computations done in DISORT2; (ii) a large enough number of terms in the Fourier expansion of the BRDF is employed to guarantee accurate values of the expansion coefficients (the default is 200 instead of 50 in DISORT2); (iii) in the post-processing step, the reflection of the direct attenuated beam from the lower boundary is included, resulting in a more accurate single scattering correction. These improvements in the treatment of the BRDF have led to improved accuracy and a several-fold increase in speed. In addition, the stability of beam sources has been improved by removing a singularity occurring when the cosine of the incident beam angle is too close to the reciprocal of any of the eigenvalues. The efficiency for beam sources has been further improved by reducing by a factor of 2 (compared to DISORT2) the dimension of the linear system of equations that must be solved to obtain the particular solutions, and by replacing the LINPACK routines used in DISORT2 with LAPACK 3.5 in DISORT3. These beam-source stability and efficiency upgrades bring enhanced stability and an additional 5-7% improvement in speed. Numerical results are provided to demonstrate and quantify the improvements in accuracy and efficiency of DISORT3 compared to DISORT2.
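The "off-line" azimuthal Fourier decomposition in (i) amounts to projecting the BRDF onto cosine modes in the relative azimuth. A sketch with a made-up, azimuth-only BRDF follows; real BRDFs also depend on the incident and viewing zenith angles, and the coefficient count (200 by default in DISORT3) is chosen per BRDF.

```python
import math

def brdf(dphi):
    """Illustrative smooth BRDF as a function of relative azimuth only."""
    return 0.3 + 0.1 * math.cos(dphi) + 0.05 * math.cos(2 * dphi)

def fourier_coeffs(f, n_modes, n_quad=512):
    """Cosine-mode expansion coefficients via quadrature over one period."""
    coeffs = []
    for m in range(n_modes):
        s = sum(f(2 * math.pi * k / n_quad) * math.cos(m * 2 * math.pi * k / n_quad)
                for k in range(n_quad))
        integral = s * (2 * math.pi / n_quad)
        # rho_0 = (1/2pi) * integral; rho_m = (1/pi) * integral for m >= 1
        coeffs.append(integral / (2 * math.pi) if m == 0 else integral / math.pi)
    return coeffs

c = fourier_coeffs(brdf, 4)
# recovers the original cosine amplitudes: approximately [0.3, 0.1, 0.05, 0.0]
print(c)
```

Because the quadrature is over a full period, the rectangle rule is spectrally accurate here; doing this projection once, outside the main solver loop, is what saves the repeated internal computations mentioned above.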

  4. Saline Groundwater from Coastal Aquifers As a Source for Desalination.

    PubMed

    Stein, Shaked; Russak, Amos; Sivan, Orit; Yechieli, Yoseph; Rahav, Eyal; Oren, Yoram; Kasher, Roni

    2016-02-16

    Reverse osmosis (RO) seawater desalination is currently a widespread means of closing the gap between supply and demand for potable water in arid regions. Currently, one of the main setbacks of RO operation is fouling, which hinders membrane performance and induces pressure loss, thereby reducing system efficiency. An alternative water source is saline groundwater with salinity close to seawater, pumped from beach wells in coastal aquifers which penetrate beneath the freshwater-seawater interface. In this research, we studied the potential use of saline groundwater of the coastal aquifer as feedwater for desalination in comparison to seawater using fieldwork and laboratory approaches. The chemistry, microbiology and physical properties of saline groundwater were characterized and compared with seawater. Additionally, reverse osmosis desalination experiments in a cross-flow system were performed, evaluating the permeate flux, salt rejection and fouling propensities of the different water types. Our results indicated that saline groundwater was significantly favored over seawater as a feed source in terms of chemical composition, microorganism content, silt density, and fouling potential, and exhibited better desalination performance with less flux decline. Saline groundwater may be a better water source for desalination by RO due to lower fouling potential, and reduced pretreatment costs.

  5. Efficient 1.6 Micron Laser Source for Methane DIAL

    NASA Technical Reports Server (NTRS)

    Shuman, Timothy; Burnham, Ralph; Nehrir, Amin R.; Ismail, Syed; Hair, Johnathan W.

    2013-01-01

    Methane is a potent greenhouse gas and, on a per molecule basis, has a warming influence 72 times that of carbon dioxide over a 20-year horizon. It is therefore important to assess near-term radiative effects due to methane and to develop mitigation strategies to counteract global warming trends via ground- and airborne-based measurement systems. These systems require the development of a time-resolved DIAL capability using a narrow-line laser source, allowing observation of atmospheric methane on local, regional and global scales. In this work, a demonstrated, efficient nonlinear conversion scheme meeting the performance requirements of a deployable methane DIAL system is presented. By combining a single-frequency 1064 nm pump source with a seeded KTP OPO, more than 5 mJ of 1.6 µm pulse energy is generated, with conversion efficiencies in excess of 20%. Even without active cavity control, instrument-limited linewidths (50 pm) were achieved, with an estimated spectral purity of 95%. Tunable operation over 400 pm (limited by the tuning range of the seed laser) was also demonstrated. This source meets the critical needs of a methane DIAL system, motivating additional development of the technology.

  6. What are the Starting Points? Evaluating Base-Year Assumptions in the Asian Modeling Exercise

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chaturvedi, Vaibhav; Waldhoff, Stephanie; Clarke, Leon E.

    2012-12-01

    A common feature of model inter-comparison efforts is that the base-year numbers for important parameters such as population and GDP can differ substantially across models. This paper explores the sources and implications of this variation in Asian countries across the models participating in the Asian Modeling Exercise (AME). Because the models do not all have a common base year, each team was required to provide data for 2005 for comparison purposes. This paper compares the year-2005 information for the different models, noting the degree of variation in important parameters, including population, GDP, primary energy, electricity, and CO2 emissions. It then explores the difference in these key parameters across different sources of base-year information. The analysis confirms that the sources provide different values for many key parameters. This variation across data sources, and additional reasons why models might provide different base-year numbers, including differences in regional definitions, differences in model base year, and differences in GDP transformation methodologies, are then discussed in the context of the AME scenarios. Finally, the paper explores the implications of base-year variation on long-term model results.

  7. Biotic Nitrogen Enrichment Regulates Calcium Sources to Forests

    NASA Astrophysics Data System (ADS)

    Pett-Ridge, J. C.; Perakis, S. S.; Hynicka, J. D.

    2015-12-01

    Calcium is an essential nutrient in forest ecosystems that is susceptible to leaching loss and depletion. Calcium depletion can affect plant and animal productivity, soil acid buffering capacity, and fluxes of carbon and water. Excess nitrogen supply and associated soil acidification are often implicated in short-term calcium loss from soils, but the long-term influence of nitrogen enrichment on calcium sources and resupply is unknown. Here we use strontium isotopes (87Sr/86Sr) as a proxy for calcium to investigate how soil nitrogen enrichment from biological nitrogen fixation interacts with bedrock calcium to regulate both short-term available supplies and the long-term sources of calcium in montane conifer forests. Our study examines 22 sites in western Oregon, spanning a 20-fold range of bedrock calcium on sedimentary and basaltic lithologies. In contrast to previous studies emphasizing abiotic control of weathering as a determinant of long-term ecosystem calcium dynamics and sources (via bedrock fertility, climate, or topographic/tectonic controls), we find instead that biotic nitrogen enrichment of soil can strongly regulate calcium sources and supplies in forest ecosystems. For forests on calcium-rich basaltic bedrock, increasing nitrogen enrichment causes calcium sources to shift from rock-weathering to atmospheric dominance, with minimal influence from other major soil-forming factors, despite regionally high rates of tectonic uplift and erosion that can rejuvenate the weathering supply of soil minerals. For forests on calcium-poor sedimentary bedrock, we find that atmospheric inputs dominate regardless of the degree of nitrogen enrichment. Short-term measures of soil and ecosystem calcium fertility are decoupled from calcium source sustainability, with fundamental implications for understanding nitrogen impacts, both in natural ecosystems and in the context of global change.
Our finding that long-term nitrogen enrichment increases forest reliance on atmospheric calcium helps explain reports of greater ecological calcium limitation in an increasingly nitrogen-rich world.
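In its simplest form, the 87Sr/86Sr proxy reduces to a two-endmember linear mixing calculation between atmospheric and rock-weathering sources. The sketch below uses placeholder ratios, not the Oregon site values, and ignores Sr-concentration weighting between endmembers.

```python
def atmospheric_fraction(r_sample, r_rock, r_atm):
    """Fraction of Sr (and, by proxy, Ca) from the atmospheric endmember,
    assuming simple linear mixing of isotope ratios."""
    return (r_sample - r_rock) / (r_atm - r_rock)

r_rock = 0.7035   # basalt-like weathering endmember (assumed)
r_atm = 0.7092    # seawater-derived atmospheric endmember (assumed)

f = atmospheric_fraction(0.7080, r_rock, r_atm)
print(round(f, 3))   # ≈ 0.789, i.e. mostly atmospheric Sr
```

The same calculation applied across a nitrogen-enrichment gradient is what reveals the shift from rock-weathering to atmospheric dominance described above.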

  8. When and where in aging: the role of music on source monitoring.

    PubMed

    Palumbo, Rocco; Mammarella, Nicola; Di Domenico, Alberto; Fairfield, Beth

    2018-06-01

    Difficulties in source monitoring (SM) tasks observed in healthy older adults may be linked to associative memory deficits, since SM requires individuals to correctly bind and later remember bound features to discriminate the origin of a memory. Therefore, focusing attention on discriminating factors that may attenuate older adults' difficulties in attributing contextual information to memories is necessary. We investigated the effect of affective information on source monitoring in younger and older adults by manipulating the type of affective information (pictures and music) and assessing the ability to remember spatial and temporal source details for affective pictures encoded while listening to classical music. Older and younger adults viewed a series of affective IAPS pictures presented on the left or right side of the computer screen in two different lists. At test, participants were asked to remember where the picture was seen (right/left) and in which list (list1/list2), or whether it was new. Results showed that spatial information was attributed better than temporal information, and emotional pictures were attributed better than neutral pictures, in both younger and older adults. In addition, although music significantly increased source memory performance in both younger and older participants compared to the white noise condition, the pleasantness of the music differentially affected memory for source details. The authors discuss these findings in terms of an interaction between music, emotion and cognition in aging.

  9. Bremsstrahlung versus Monoenergetic Photon Dose and Photonuclear Stimulation Comparisons at Long Standoff Distances

    NASA Astrophysics Data System (ADS)

    Jones, J. L.; Sterbentz, J. W.; Yoon, W. Y.; Norman, D. R.

    2009-12-01

    Energetic photon sources with energies greater than 6 MeV continue to be recognized as viable sources for various types of inspection applications, especially those related to nuclear and/or explosive material detection. These energetic photons can be produced as a continuum of energies (i.e., bremsstrahlung) or as a set of one or more discrete photon energies (i.e., monoenergetic). This paper provides a follow-on extension of the photon dose comparison presented at the 9th International Conference on Applications of Nuclear Techniques (June 2008). Our previous paper showed the comparative advantages and disadvantages of the photon doses provided by these two energetic interrogation sources and highlighted the higher-energy advantage of the bremsstrahlung source, especially at long standoff distances (i.e., the distance from source to the inspected object). This paper pursues the higher-energy photon inspection advantage (up to 100 MeV) by providing dose and stimulated photonuclear interaction predictions in air for an infinitely dilute interrogated material (used for comparative interaction-rate assessments since it excludes material self-shielding) positioned forward on the inspection beam axis at increasing standoff distances. In addition to the direct energetic photon-induced stimulation, the predictions identify the importance of secondary downscattered/attenuated source-term effects arising from photon transport in the intervening air environment.

  10. Large Energy Development Projects: Lessons Learned from Space and Politics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schmitt, Harrison H.

    2005-04-15

    The challenge for the global energy future lies in meeting the needs and aspirations of the ten to twelve billion earthlings that will be on this planet by 2050. At least an eight-fold increase in annual production will be required by the middle of this century. The energy sources that can be considered developed and 'in the box' for consideration as sources of major increases in supply over the next half century are fossil fuels, nuclear fission, and, to a lesser degree, various forms of direct and stored solar energy and conservation. None of these near-term sources of energy will provide an eight-fold or greater increase in energy supply, for various technical, environmental and political reasons. Only a few potential energy sources that fall 'out of the box' appear worthy of additional consideration as possible contributors to energy demand in 2050 and beyond. These particular candidates are deuterium-tritium fusion, space solar energy, and lunar helium-3 fusion. The primary advantage that lunar helium-3 fusion will have over other 'out of the box' energy sources in the pre-2050 timeframe is a clear path into the private capital markets. The development and demonstration of new energy sources will require several development paths, each of Apollo-like complexity and each with sub-paths of parallel development for critical functions and components.

  11. Evaluating Contaminants of Emerging Concern as tracers of wastewater from septic systems.

    PubMed

    James, C Andrew; Miller-Schulze, Justin P; Ultican, Shawn; Gipe, Alex D; Baker, Joel E

    2016-09-15

    Bacterial and nutrient contamination from anthropogenic sources impacts fresh and marine waters, reducing water quality and restricting recreational and commercial activities. In many cases the source of this contamination is ambiguous, and a tracer or set of tracers linking contamination to source would be valuable. In this work, the effectiveness of utilizing a suite of Contaminants of Emerging Concern (CECs) as tracers of bacteria from human septic system effluent is investigated. Field sampling was performed at more than 20 locations over approximately 18 months, and samples were analyzed for a suite of CECs and fecal coliform bacteria. The sampling locations included seeps and small freshwater discharges to the shoreline. Sites were selected and grouped according to the level of impact by septic systems as determined by previous field sampling programs. A subset of selected locations had been positively identified as being impacted by effluent from failing septic systems through dye testing. The CECs were selected based on their predominant use, their frequency of use, and putative fate and transport properties. In addition, two rounds of focused sampling were performed at selected sites to characterize short-term variations in CEC and fecal coliform concentrations, and to evaluate environmental persistence following source correction activities. The results indicate that a suite of common-use compounds are suitable as generalized tracers of bacterial contamination from septic systems and that fate and transport properties are important in tracer selection. Highly recalcitrant or highly labile compounds likely follow different loss profiles in the subsurface compared to fecal bacteria and are not suitable tracers. The use of more than one tracer compound is recommended due to the source variability of septic systems and to account for variations in subsurface conditions.
In addition, concentrations of some CECs were measured in receiving waters at levels which suggested the potential for environmental harm, indicating that the possible risk presented from these sources warrants further investigation. Copyright © 2016 Elsevier Ltd. All rights reserved.

  12. Multiobjective optimization of cluster-scale urban water systems investigating alternative water sources and level of decentralization

    NASA Astrophysics Data System (ADS)

    Newman, J. P.; Dandy, G. C.; Maier, H. R.

    2014-10-01

    In many regions, conventional water supplies are unable to meet projected consumer demand. Consequently, interest has arisen in integrated urban water systems, which involve the reclamation or harvesting of alternative, localized water sources. However, this makes the planning and design of water infrastructure more difficult, as multiple objectives need to be considered, water sources need to be selected from a number of alternatives, and end uses of these sources need to be specified. In addition, the scale at which each treatment, collection, and distribution network should operate needs to be investigated. In order to deal with this complexity, a framework for planning and designing water infrastructure taking into account integrated urban water management principles is presented in this paper and applied to a rural greenfield development. Various options for water supply, and the scale at which they operate were investigated in order to determine the life-cycle trade-offs between water savings, cost, and GHG emissions as calculated from models calibrated using Australian data. The decision space includes the choice of water sources, storage tanks, treatment facilities, and pipes for water conveyance. For each water system analyzed, infrastructure components were sized using multiobjective genetic algorithms. The results indicate that local water sources are competitive in terms of cost and GHG emissions, and can reduce demand on the potable system by as much as 54%. Economies of scale in treatment dominated the diseconomies of scale in collection and distribution of water. Therefore, water systems that connect large clusters of households tend to be more cost efficient and have lower GHG emissions. In addition, water systems that recycle wastewater tended to perform better than systems that captured roof-runoff. 
Through these results, the framework was shown to be effective at identifying near-optimal trade-offs between competing objectives, thereby enabling informed decisions to be made when planning water systems for greenfield developments.
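
    The multiobjective sizing step above rests on Pareto dominance: a candidate system survives only if no other candidate is at least as good on every objective and strictly better on at least one. A minimal sketch of that filter for a cost-versus-GHG trade-off (all candidate designs and objective values below are hypothetical, not from the study):

```python
def dominates(a, b):
    """True if design a is no worse than b in every objective and better in one (minimisation)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(designs):
    """Keep only non-dominated (cost, GHG) vectors; these are the trade-off solutions."""
    return [d for d in designs if not any(dominates(o, d) for o in designs if o != d)]

# Hypothetical (cost in $M, GHG in kt CO2-e) outcomes for candidate cluster-scale systems
designs = [(40, 12), (35, 15), (50, 8), (45, 9), (60, 7), (42, 13), (55, 10)]
front = sorted(pareto_front(designs))   # (42, 13) and (55, 10) are dominated
```

    A multiobjective genetic algorithm of the kind used in the study repeatedly applies this filter while mutating and recombining candidate designs.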

  13. Assessment and application of clustering techniques to atmospheric particle number size distribution for the purpose of source apportionment

    NASA Astrophysics Data System (ADS)

    Salimi, F.; Ristovski, Z.; Mazaheri, M.; Laiman, R.; Crilley, L. R.; He, C.; Clifford, S.; Morawska, L.

    2014-06-01

    Long-term measurements of particle number size distribution (PNSD) produce a very large number of observations, and their analysis requires an efficient approach in order to produce results in the least possible time and with maximum accuracy. Clustering techniques are a family of sophisticated methods that have recently been employed to analyse PNSD data; however, very little information is available comparing the performance of different clustering techniques on PNSD data. This study aims to apply several clustering techniques (i.e. K-means, PAM, CLARA and SOM) to PNSD data, in order to identify and apply the optimum technique to PNSD data measured at 25 sites across Brisbane, Australia. A new method, based on the Generalised Additive Model (GAM) with a basis of penalised B-splines, was proposed to parameterise the PNSD data, and the temporal weight of each cluster was also estimated using the GAM. In addition, each cluster was associated with its possible source based on the results of this parameterisation, together with the characteristics of each cluster. The performances of the four clustering techniques were compared using the Dunn index and silhouette width validation values, and the K-means technique was found to have the highest performance, with five clusters being the optimum. Therefore, five clusters were found within the data using the K-means technique. The diurnal occurrence of each cluster was used together with other air quality parameters, temporal trends and the physical properties of each cluster, in order to attribute each cluster to its source and origin. The five clusters were attributed to three major sources and origins, including regional background particles, photochemically induced nucleated particles and vehicle-generated particles. Overall, clustering was found to be an effective technique for attributing each particle size spectrum to its source, and the GAM was suitable to parameterise the PNSD data.
These two techniques can help researchers immensely in analysing PNSD data for characterisation and source apportionment purposes.
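
    The selection step described above (run a clusterer for several candidate cluster counts and keep the count with the best silhouette width) can be sketched compactly. The 1-D toy data and the plain K-means below are illustrative stand-ins for the full PNSD spectra and techniques:

```python
import random

def kmeans_1d(points, k, iters=30, seed=0):
    """Plain K-means on 1-D data; returns the final non-empty clusters."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            groups[min(range(k), key=lambda i: abs(p - centers[i]))].append(p)
        centers = [sum(g) / len(g) if g else centers[i] for i, g in enumerate(groups)]
    return [g for g in groups if g]

def mean_silhouette(groups):
    """Average silhouette width s = (b - a) / max(a, b) over all points."""
    if len(groups) < 2:
        return -1.0
    total, n = 0.0, sum(len(g) for g in groups)
    for gi, g in enumerate(groups):
        for p in g:
            a = sum(abs(p - q) for q in g) / max(len(g) - 1, 1)   # within-cluster distance
            b = min(sum(abs(p - q) for q in h) / len(h)           # nearest other cluster
                    for hi, h in enumerate(groups) if hi != gi)
            total += (b - a) / max(a, b)
    return total / n

# Two well-separated groups of 1-D observations: k = 2 should score highest
data = [1.0, 1.2, 0.9, 1.1, 8.0, 8.3, 7.9, 8.1]
best_k = max(range(2, 5), key=lambda k: mean_silhouette(kmeans_1d(data, k)))
```

    The same scheme extends to multi-dimensional size spectra by swapping the absolute difference for a vector distance.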

  14. Assessment and application of clustering techniques to atmospheric particle number size distribution for the purpose of source apportionment

    NASA Astrophysics Data System (ADS)

    Salimi, F.; Ristovski, Z.; Mazaheri, M.; Laiman, R.; Crilley, L. R.; He, C.; Clifford, S.; Morawska, L.

    2014-11-01

    Long-term measurements of particle number size distribution (PNSD) produce a very large number of observations, and their analysis requires an efficient approach in order to produce results in the least possible time and with maximum accuracy. Clustering techniques are a family of sophisticated methods that have recently been employed to analyse PNSD data; however, very little information is available comparing the performance of different clustering techniques on PNSD data. This study aims to apply several clustering techniques (i.e. K-means, PAM, CLARA and SOM) to PNSD data, in order to identify and apply the optimum technique to PNSD data measured at 25 sites across Brisbane, Australia. A new method, based on the Generalised Additive Model (GAM) with a basis of penalised B-splines, was proposed to parameterise the PNSD data, and the temporal weight of each cluster was also estimated using the GAM. In addition, each cluster was associated with its possible source based on the results of this parameterisation, together with the characteristics of each cluster. The performances of the four clustering techniques were compared using the Dunn index and silhouette width validation values, and the K-means technique was found to have the highest performance, with five clusters being the optimum. Therefore, five clusters were found within the data using the K-means technique. The diurnal occurrence of each cluster was used together with other air quality parameters, temporal trends and the physical properties of each cluster, in order to attribute each cluster to its source and origin. The five clusters were attributed to three major sources and origins, including regional background particles, photochemically induced nucleated particles and vehicle-generated particles. Overall, clustering was found to be an effective technique for attributing each particle size spectrum to its source, and the GAM was suitable to parameterise the PNSD data.
These two techniques can help researchers immensely in analysing PNSD data for characterisation and source apportionment purposes.

  15. Enhanced MFC power production and struvite recovery by the addition of sea salts to urine.

    PubMed

    Merino-Jimenez, Irene; Celorrio, Veronica; Fermin, David J; Greenman, John; Ieropoulos, Ioannis

    2017-02-01

    Urine is an excellent fuel for electricity generation in Microbial Fuel Cells (MFCs), especially with practical implementations in mind. Moreover, urine has a high nutrient content, and these nutrients can be easily recovered. Struvite (MgNH4PO4·6H2O) crystals naturally precipitate in urine, but this reaction can be enhanced by the introduction of additional magnesium. In this work, the effect of magnesium additives on the power output of the MFCs and on the catholyte generation is evaluated. Several magnesium sources, including MgCl2, artificial sea water and a commercially available sea-salts mixture for seawater preparation (SeaMix), were mixed with real fresh human urine in order to enhance struvite precipitation. The supernatant of each mixture was tested as a feedstock for the MFCs and evaluated in terms of power output and catholyte generation. The commercial SeaMix showed the best performance in terms of struvite precipitation, increasing the amount of struvite in the collected solid from 21% to 94%. Moreover, the SeaMix increased the maximum power performance of the MFCs by over 10%, and it also changed the properties of the collected catholyte by increasing the pH, conductivity and the concentration of chloride ions. These results demonstrate that the addition of sea salts to real urine is beneficial for both struvite recovery and electricity generation in MFCs. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  16. Advanced Reactor PSA Methodologies for System Reliability Analysis and Source Term Assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grabaskas, D.; Brunett, A.; Passerini, S.

    Beginning in 2015, a project was initiated to update and modernize the probabilistic safety assessment (PSA) of the GE-Hitachi PRISM sodium fast reactor. This project is a collaboration between GE-Hitachi and Argonne National Laboratory (Argonne), and funded in part by the U.S. Department of Energy. Specifically, the role of Argonne is to assess the reliability of passive safety systems, complete a mechanistic source term calculation, and provide component reliability estimates. The assessment of passive system reliability focused on the performance of the Reactor Vessel Auxiliary Cooling System (RVACS) and the inherent reactivity feedback mechanisms of the metal fuel core. The mechanistic source term assessment attempted to provide a sequence-specific source term evaluation to quantify offsite consequences. Lastly, the reliability assessment focused on components specific to the sodium fast reactor, including electromagnetic pumps, intermediate heat exchangers, the steam generator, and sodium valves and piping.

  17. Identification of metapopulation dynamics among Northern Goshawks of the Alexander Archipelago, Alaska, and Coastal British Columbia

    USGS Publications Warehouse

    Sonsthagen, Sarah A.; McClaren, Erica L.; Doyle, Frank I.; Titus, K.; Sage, George K.; Wilson, Robert E.; Gust, Judy R.; Talbot, Sandra L.

    2012-01-01

    Northern Goshawks occupying the Alexander Archipelago, Alaska, and coastal British Columbia nest primarily in old-growth and mature forest, which results in spatial heterogeneity in the distribution of individuals across the landscape. We used microsatellite and mitochondrial data to infer genetic structure, gene flow, and fluctuations in population demography through evolutionary time. Patterns in the genetic signatures were used to assess predictions associated with three population models: panmixia, metapopulation, and isolated populations. Population genetic structure was observed, along with asymmetry in gene flow estimates that changed directionality at different temporal scales, consistent with metapopulation model predictions. Therefore, Northern Goshawk assemblages located in the Alexander Archipelago and coastal British Columbia interact through a metapopulation framework, though they may not fit the classic metapopulation model. Long-term population sources (coastal mainland British Columbia) and sinks (Revillagigedo and Vancouver islands) were identified. However, there was no trend through evolutionary time in the directionality of dispersal among the remaining assemblages, suggestive of a rescue-effect dynamic. The Admiralty, Douglas, and Chichagof island complex appears to be an evolutionarily recent source population in the Alexander Archipelago. In addition, the Kupreanof island complex and Kispiox Forest District populations have high dispersal rates to populations in close geographic proximity and potentially serve as local source populations. The metapopulation dynamics of Northern Goshawks in the Alexander Archipelago and coastal British Columbia highlight the importance of both occupied and unoccupied habitats to the long-term persistence of goshawks in this region.

  18. Associations of Mortality with Long-Term Exposures to Fine and Ultrafine Particles, Species and Sources: Results from the California Teachers Study Cohort

    PubMed Central

    Hu, Jianlin; Goldberg, Debbie; Reynolds, Peggy; Hertz, Andrew; Bernstein, Leslie; Kleeman, Michael J.

    2015-01-01

    Background: Although several cohort studies report associations between chronic exposure to fine particles (PM2.5) and mortality, few have studied the effects of chronic exposure to ultrafine (UF) particles. In addition, few studies have estimated the effects of the constituents of either PM2.5 or UF particles.
    Methods: We used a statewide cohort of > 100,000 women from the California Teachers Study who were followed from 2001 through 2007. Exposure data at the residential level were provided by a chemical transport model that computed pollutant concentrations from > 900 sources in California. Besides particle mass, monthly concentrations of 11 species and 8 sources or primary particles were generated on 4-km grids. We used a Cox proportional hazards model to estimate the association between the pollutants and all-cause, cardiovascular, ischemic heart disease (IHD), and respiratory mortality.
    Results: We observed statistically significant (p < 0.05) associations of IHD with PM2.5 mass, nitrate, elemental carbon (EC), copper (Cu), and secondary organics, and with the sources gas- and diesel-fueled vehicles, meat cooking, and high-sulfur fuel combustion. The hazard ratio estimate of 1.19 (95% CI: 1.08, 1.31) for IHD in association with a 10-μg/m3 increase in PM2.5 is consistent with findings from the American Cancer Society cohort. We also observed significant positive associations between IHD and several UF components, including EC, Cu, metals, and mobile sources.
    Conclusions: Using an emissions-based model with a 4-km spatial scale, we observed significant positive associations between IHD mortality and both fine and ultrafine particle species and sources. Our results suggest that the exposure model effectively measured local exposures and facilitated the examination of the relative toxicity of particle species.
    Citation: Ostro B, Hu J, Goldberg D, Reynolds P, Hertz A, Bernstein L, Kleeman MJ. 2015.
Associations of mortality with long-term exposures to fine and ultrafine particles, species and sources: results from the California Teachers Study cohort. Environ Health Perspect 123:549–556; http://dx.doi.org/10.1289/ehp.1408565 PMID:25633926
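
    The hazard ratio reported above follows from the Cox model's log-linear form: the HR for an exposure increment Δx is exp(β·Δx). A small sketch (the coefficient below is back-solved from the headline HR of 1.19 per 10 μg/m3, not taken from the paper's tables):

```python
import math

def hazard_ratio(beta_per_unit, increment):
    """Cox proportional hazards: relative hazard for an `increment`-unit exposure rise."""
    return math.exp(beta_per_unit * increment)

# Illustrative coefficient reproducing HR = 1.19 per 10 ug/m3 PM2.5
beta = math.log(1.19) / 10.0
hr_10 = hazard_ratio(beta, 10.0)   # 1.19 by construction
hr_5 = hazard_ratio(beta, 5.0)     # about 1.09 for a 5 ug/m3 rise
```

    Rescaling the increment this way is how per-IQR or per-unit hazard ratios are converted to a common basis for comparison across studies.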

  19. Mass transfer processes in a post eruption hydrothermal system: Parameterisation of microgravity changes at Te Maari craters, New Zealand

    NASA Astrophysics Data System (ADS)

    Miller, Craig A.; Currenti, Gilda; Hamling, Ian; Williams-Jones, Glyn

    2018-05-01

    Fluid transfer and ground deformation at hydrothermal systems occur both as a precursor to, or as a result of, an eruption. Typically studies focus on pre-eruption changes to understand the likelihood of unrest leading to eruption; however, monitoring post-eruption changes is important for tracking the return of the system towards background activity. Here we describe processes occurring in a hydrothermal system following the 2012 eruption of Upper Te Maari crater on Mt Tongariro, New Zealand, from observations of microgravity change and deformation. Our aim is to assess the post-eruption recovery of the system, to provide a baseline for long-term monitoring. Residual microgravity anomalies of up to 92 ± 11 μGal per year are accompanied by up to 0.037 ± 0.01 m subsidence. We model microgravity changes using analytic solutions to determine the most likely geometry and source location. A multiobjective inversion tests whether the gravity change models are consistent with the observed deformation. We conclude that the source of subsidence is separate from the location of mass addition. From this unusual combination of observations, we develop a conceptual model of fluid transfer within a condensate layer, occurring in response to eruption-driven pressure changes. We find that depressurisation drives the evacuation of pore fluid, either exiting the system completely as vapour through newly created vents and fumaroles, or migrating to shallower levels where it accumulates in empty pore space, resulting in positive gravity changes. Evacuated pores then collapse, causing subsidence. In addition we find that significant mass addition occurs from influx of meteoric fluids through the fractured hydrothermal seal. 
Long-term combined microgravity and deformation monitoring will allow us to track the resealing and re-pressurisation of the hydrothermal system and assess what hazard it presents to thousands of hikers who annually traverse the volcano, within 2 km of the eruption site.
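
    In the simplest analytic case, the modelled microgravity change reduces to the vertical attraction of a point mass at depth. The sketch below uses only the 92 μGal figure from the abstract; the 500 m depth and the point-source geometry are hypothetical illustrations, not the study's inversion result:

```python
G = 6.674e-11   # gravitational constant, m3 kg-1 s-2

def delta_g_ugal(mass_kg, depth_m, offset_m=0.0):
    """Vertical gravity change (microGal) from a point mass; 1 m/s2 = 1e8 microGal."""
    r2 = depth_m ** 2 + offset_m ** 2
    return G * mass_kg * depth_m / r2 ** 1.5 * 1e8

# Mass addition needed to produce a 92-microGal anomaly directly above a 500 m deep source
mass = 92e-8 * 500 ** 2 / G        # roughly 3.4e9 kg of accumulated fluid
```

    The offset term shows why station geometry matters: the same mass produces a weaker, broader signal away from the source, which is what the multiobjective inversion exploits to separate source location from geometry.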

  20. What is the impact of different VLBI analysis setups of the tropospheric delay on precipitable water vapor trends?

    NASA Astrophysics Data System (ADS)

    Balidakis, Kyriakos; Nilsson, Tobias; Heinkelmann, Robert; Glaser, Susanne; Zus, Florian; Deng, Zhiguo; Schuh, Harald

    2017-04-01

    The quality of the parameters estimated by global navigation satellite systems (GNSS) and very long baseline interferometry (VLBI) is distorted by erroneous meteorological observations applied to model the propagation delay in the electrically neutral atmosphere. For early VLBI sessions with poor geometry, unsuitable constraints imposed on the a priori tropospheric gradients are an additional source of difficulty in VLBI analysis. Therefore, climate change indicators deduced from the geodetic analysis, such as the long-term precipitable water vapor (PWV) trends, are strongly affected. In this contribution we investigate the impact of different modeling and parameterization of the propagation delay in the troposphere on the estimates of long-term PWV trends from geodetic VLBI analysis results. We address the influence of the meteorological data source, and of the a priori non-hydrostatic delays and gradients employed in the VLBI processing, on the estimated PWV trends. In particular, we assess the effect of employing temperature and pressure from (i) homogenized in situ observations, (ii) the model levels of the ERA Interim reanalysis numerical weather model and (iii) our own blind model in the style of GPT2w with enhanced parameterization, calculated using the latter data set. Furthermore, we utilize non-hydrostatic delays and gradients estimated from (i) a GNSS reprocessing at GeoForschungsZentrum Potsdam, rigorously considering tropospheric ties, and (ii) direct ray-tracing through ERA Interim, as additional observations. To evaluate the above, the least-squares module of the VieVS@GFZ VLBI software was appropriately modified. Additionally, we study the noise characteristics of the non-hydrostatic delays and gradients estimated from our VLBI and GNSS analyses as well as from ray-tracing. We have modified the Theil-Sen estimator appropriately to robustly deduce PWV trends from VLBI, GNSS, ray-tracing and direct numerical integration in ERA Interim.
We disseminate all our solutions in the latest Tropo-SINEX format.
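
    The Theil-Sen estimator the authors modify is, in its basic form, simply the median of the slopes over all pairs of observations, which is what makes it robust for trend estimation from noisy geodetic series. A stdlib sketch of the unmodified estimator on a synthetic PWV series (data hypothetical):

```python
from itertools import combinations
from statistics import median

def theil_sen_slope(times, values):
    """Robust linear trend: median of the slopes of all point pairs."""
    pairs = combinations(zip(times, values), 2)
    return median((v2 - v1) / (t2 - t1) for (t1, v1), (t2, v2) in pairs if t1 != t2)

# Synthetic PWV series with a 0.05 mm/yr trend and one gross outlier
years = list(range(2000, 2011))
pwv = [20.0 + 0.05 * (year - 2000) for year in years]
pwv[5] = 35.0   # a single bad observation
trend = theil_sen_slope(years, pwv)   # still 0.05: the outlier is voted down
```

    An ordinary least-squares fit would be pulled well off the true slope by the same outlier, which is why robust estimators are preferred for climate-indicator trends.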

  1. Chemical characteristic and toxicity assessment of particle associated PAHs for the short-term anthropogenic activity event: During the Chinese New Year's Festival in 2013.

    PubMed

    Shi, Guo-Liang; Liu, Gui-Rong; Tian, Ying-Ze; Zhou, Xiao-Yu; Peng, Xing; Feng, Yin-Chang

    2014-06-01

    PM10 and PM2.5 samples were simultaneously collected during a period covering the Chinese New Year's (CNY) Festival. The concentrations of particulate matter (PM) and 16 polycyclic aromatic hydrocarbons (PAHs) were measured. The possible source contributions and toxicity risks were estimated for the Festival and non-Festival periods. According to the diagnostic ratios and Multilinear Engine 2 (ME2), three sources were identified and their contributions were calculated: vehicle emission (48.97% for PM10, 53.56% for PM2.5), biomass & coal combustion (36.83% for PM10, 28.76% for PM2.5), and cooking emission (22.29% for PM10, 27.23% for PM2.5). An interesting result was found: although the PAHs are not emitted directly by the fireworks displays, they were still indirectly influenced by the biomass combustion associated with the fireworks displays. Additionally, the toxicity risks of the different sources were estimated by Multilinear Engine 2-BaP equivalent (ME2-BaPE): vehicle emission (54.01% for PM10, 55.42% for PM2.5), cooking emission (25.59% for PM10, 29.05% for PM2.5), and biomass & coal combustion (20.90% for PM10, 14.28% for PM2.5). It is worth noting that the toxicity contribution of cooking emission was considerable in the Festival period. The findings can provide useful information for protecting urban human health, as well as for developing effective air quality control strategies for special short-term anthropogenic activity events. Copyright © 2014 Elsevier B.V. All rights reserved.
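
    The BaP-equivalent (BaPE) weighting underlying ME2-BaPE converts each congener concentration into benzo[a]pyrene-equivalent mass via a toxic equivalency factor (TEF) and sums the results. A sketch with illustrative values (BaP's TEF is 1 by definition; the other TEFs and all concentrations here are hypothetical placeholders, not the study's data):

```python
def bap_equivalent(concs_ng_m3, tefs):
    """BaP-equivalent concentration: sum over congeners of concentration x TEF."""
    return sum(concs_ng_m3[c] * tefs[c] for c in concs_ng_m3)

tefs = {"BaP": 1.0, "BaA": 0.1, "BkF": 0.1, "Chr": 0.01}     # illustrative TEFs
concs = {"BaP": 1.5, "BaA": 3.0, "BkF": 2.0, "Chr": 10.0}    # hypothetical ng/m3
bape = bap_equivalent(concs, tefs)   # 1.5 + 0.3 + 0.2 + 0.1 = 2.1 ng/m3
```

    Applying this weighting to each resolved source profile, rather than to the bulk sample, is what lets the study rank sources by toxicity contribution rather than by mass.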

  2. Where did all the Nitrogen go? Use of Watershed-Scale Budgets to Quantify Nitrogen Inputs, Storages, and Losses.

    NASA Astrophysics Data System (ADS)

    Boyer, E. W.; Goodale, C. L.; Howarth, R. W.; VanBreemen, N.

    2001-12-01

    Inputs of nitrogen (N) to aquatic and terrestrial ecosystems have increased during recent decades, primarily from the production and use of fertilizers, the planting of N-fixing crops, and the combustion of fossil fuels. We present mass-balance budgets of N for 16 catchments along a latitudinal profile from Maine to Virginia, which encompass a range of climatic variability and are major drainages to the coast of the North Atlantic Ocean. We quantify inputs of N to each catchment from atmospheric deposition, application of nitrogenous fertilizers, biological nitrogen fixation by crops and trees, and import of N in agricultural products (food and feed). We relate these input terms to losses of N (total, organic, and nitrate) in streamflow. The importance of the relative N sources to N exports varies widely by watershed and is related to land use. Atmospheric deposition was the largest source of N to the forested catchments of northern New England (e.g., Penobscot and Kennebec); import of N in food was the largest source of N to the more populated regions of southern New England (e.g., Charles and Blackstone); and agricultural inputs were the dominant N sources in the Mid-Atlantic region (e.g., Schuylkill and Potomac). In all catchments, N inputs greatly exceed outputs, implying additional loss terms (e.g., denitrification or volatilization and transport of animal wastes) or changes in internal N stores (e.g., accumulation of N in vegetation, soil, or groundwater). We use our N budgets and several modeling approaches to constrain estimates of the fate of this excess N, including estimates of N storage in accumulating woody biomass, N losses due to in-stream denitrification, and more. This work is an effort of the SCOPE Nitrogen Project.
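
    The budget arithmetic behind the "where did all the nitrogen go" question is straightforward: sum the input terms, subtract riverine export, and the residual must be explained by denitrification, volatilization, or storage. A sketch with hypothetical fluxes (the values are illustrative, not the paper's):

```python
def n_residual(inputs_kg_km2_yr, export_kg_km2_yr):
    """Unaccounted nitrogen: total inputs minus stream export (kg N per km2 per year)."""
    return sum(inputs_kg_km2_yr.values()) - export_kg_km2_yr

# Hypothetical catchment budget, kg N per km2 per year
inputs = {
    "atmospheric deposition": 900,
    "fertilizer application": 1200,
    "biological N fixation": 600,
    "net food/feed import": 800,
}
residual = n_residual(inputs, export_kg_km2_yr=1050)   # 3500 - 1050 = 2450 unaccounted
```

    In the catchments studied, a residual of this kind (inputs greatly exceeding export) is what motivates the search for additional loss and storage terms.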

  3. Sources of atmospheric aerosol from long-term measurements (5 years) of chemical composition in Athens, Greece.

    PubMed

    Paraskevopoulou, D; Liakakou, E; Gerasopoulos, E; Mihalopoulos, N

    2015-09-15

    To identify the sources of aerosols in the Greater Athens Area (GAA), a total of 1510 daily samples of fine (PM2.5) and coarse (PM10-2.5) aerosols were collected at a suburban site (Penteli) during a five-year period (May 2008-April 2013), corresponding to the period before and during the financial crisis. In addition, aerosol sampling was also conducted in parallel at an urban site (Thissio) during specific short-term campaigns in all seasons. Mass and chemical composition measurements were performed on all these samples, the latter only on the fine fraction. Particulate organic matter (POM) and ionic mass (IM) are the main contributors to aerosol mass, each accounting for about 24% of the fine aerosol mass. Within the IM, nss-SO4(2-) is the prevailing species, followed by NO3(-) and NH4(+), and it shows a decreasing trend during the 2008-2013 period similar to that observed for the PM masses. The contribution of water to fine aerosol is equally significant (21 ± 2%), while during dust transport the contribution of dust increases from 7 ± 2% to 31 ± 9%. Source apportionment (PCA and PMF) and mass closure exercises identified six sources of fine aerosols: secondary photochemistry, primary combustion, soil, biomass burning, sea salt and traffic. Finally, from winter 2012 to winter 2013 the contribution of POM to the urban aerosol mass increased by almost 30%, reflecting the impact on air quality in Athens of wood combustion (the dominant fuel for domestic heating), which increased massively in winter 2013. Copyright © 2015 Elsevier B.V. All rights reserved.

  4. The mass-zero spin-two field and gravitational theory.

    NASA Technical Reports Server (NTRS)

    Coulter, C. A.

    1972-01-01

    Demonstration that the conventional theory of the mass-zero spin-two field with sources introduces extraneous non-spin-two field components in source regions and fails to be covariant under the full or restricted conformal group. A modified theory is given, expressed in terms of the physical components of the mass-zero spin-two field rather than in terms of 'potentials,' which has no extraneous components inside or outside sources, and which is covariant under the full conformal group. For a proper choice of source term, this modified theory has the correct Newtonian limit and automatically implies that a symmetric second-rank source tensor has zero divergence. It is shown that a generally covariant form of the spin-two theory derived here can possibly be constructed to agree with general relativity in all currently accessible experimental situations.

  5. Feasibility of commercialization of Russian thistle, Salsola kali L. , as a fuel source. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karpiscak, M.M.; Foster, K.E.; Rawles, R.L.

    1981-10-01

    The use of Russian thistle as an energy resource has been demonstrated. Russian thistle biomass can be harvested, stored and transported using readily available machinery. Propagation seed can be harvested, cleaned and sown using commercially available machines and traditional techniques. In addition, preliminary tests did not detect any major toxicological or immunological problems from burning Russian thistle biomass. Many questions remain to be answered, however, concerning the use of Russian thistle as a biomass fuel. The lack of confirmed long-term data on the agronomics of Russian thistle makes additional research necessary. Additional data are required to produce a sound data base for evaluating the economics of Russian thistle production, for improving agricultural methods, and for fully evaluating the toxic and immunologic properties of Russian thistle. In conclusion, it appears that Russian thistle biomass has great potential for becoming a fuel source in arid areas that lack fossil fuel reserves or where a reduction of the environmental problems associated with the use of fossil fuels is desired. Analyses of economic and energy factors show that there is a significant net gain in energy with the production and processing of Russian thistle biomass into synthetic logs (Tumblelogs), although the cost of Tumblelogs is slightly higher than that of synthetic logs made from wood waste. 10 refs., 12 figs., 17 tabs.

  6. Regulatory Technology Development Plan - Sodium Fast Reactor: Mechanistic Source Term – Trial Calculation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grabaskas, David; Bucknor, Matthew; Jerden, James

    2016-10-01

    The potential release of radioactive material during a plant incident, referred to as the source term, is a vital design metric and will be a major focus of advanced reactor licensing. The U.S. Nuclear Regulatory Commission has stated an expectation for advanced reactor vendors to present a mechanistic assessment of the potential source term in their license applications. The mechanistic source term presents an opportunity for vendors to realistically assess the radiological consequences of an incident, and may allow reduced emergency planning zones and smaller plant sites. However, the development of a mechanistic source term for advanced reactors is not without challenges, as there are often numerous phenomena impacting the transportation and retention of radionuclides. This project sought to evaluate U.S. capabilities regarding the mechanistic assessment of radionuclide release from core damage incidents at metal-fueled, pool-type sodium fast reactors (SFRs). The purpose of the analysis was to identify, and prioritize, any gaps regarding computational tools or data necessary for the modeling of radionuclide transport and retention phenomena. To accomplish this task, a parallel-path analysis approach was utilized. One path, led by Argonne and Sandia National Laboratories, sought to perform a mechanistic source term assessment using available codes, data, and models, with the goal of identifying gaps in the current knowledge base. The second path, performed by an independent contractor, carried out sensitivity analyses to determine the importance of particular radionuclides and transport phenomena with regard to offsite consequences. The results of the two pathways were combined to prioritize gaps in current capabilities.

  7. A Novel Airborne Carbon Isotope Analyzer for Methane and Carbon Dioxide Source Fingerprinting

    NASA Astrophysics Data System (ADS)

    Berman, E. S.; Huang, Y. W.; Owano, T. G.; Leifer, I.

    2014-12-01

    Recent field studies on major sources of the important greenhouse gas methane (CH4) indicate significant underestimation of methane release from fossil fuel industrial (FFI) and animal husbandry sources, among others. In addition, uncertainties still exist with respect to carbon dioxide (CO2) measurements, especially source fingerprinting. CO2 isotopic analysis provides a valuable in situ measurement approach to fingerprint CH4 and CO2 as associated with combustion sources, leakage from geologic reservoirs, or biogenic sources. As a result, these measurements can characterize strong combustion source plumes, such as power plant emissions, and discriminate these emissions from other sources. As part of the COMEX (CO2 and MEthane eXperiment) campaign, a novel CO2 isotopic analyzer was installed and collected data aboard the CIRPAS Twin Otter aircraft. Developing methods to derive CH4 and CO2 budgets from remote sensing data is the goal of the summer 2014 COMEX campaign, which combines hyperspectral imaging (HSI) and non-imaging spectroscopy (NIS) with in situ airborne and surface data. COMEX leverages the synergy between high spatial resolution HSI and moderate spatial resolution NIS. The carbon dioxide isotope analyzer developed by Los Gatos Research (LGR) uses LGR's patented Off-Axis ICOS (Integrated Cavity Output Spectroscopy) technology and incorporates proprietary internal thermal control for high sensitivity and optimal instrument stability. This analyzer measures CO2 concentration as well as δ13C, δ18O, and δ17O from CO2 at natural abundance (100-3000 ppm). The laboratory accuracy is ±1.2 ppm (1σ) in CO2 from 370-1000 ppm, with a long-term (1000 s) precision of ±0.012 ppm. The long-term precision for both δ13C and δ18O is 0.04 ‰, and for δ17O is 0.06 ‰.
The analyzer was field-tested as part of the COWGAS campaign, a precursor campaign to COMEX in March 2014, where it successfully discriminated plumes related to combustion processes associated with dairy activities (tractor exhaust) from plumes and sources in air enriched in methane and ammonia from bovine activities, including waste maintenance. Methodology, laboratory data, field data from COWGAS, and field data from the COMEX campaign acquired by LGR's carbon isotope analyzer as well as other COMEX analyzers are presented.
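
    The δ13C values such an analyzer reports use standard delta notation: the per-mil deviation of the sample 13C/12C ratio from the VPDB reference ratio (approximately 0.011180). The sample ratios below are hypothetical, chosen to illustrate why combustion plumes stand out isotopically:

```python
R_VPDB = 0.011180   # approximate 13C/12C ratio of the VPDB standard

def delta13C_permil(r_sample):
    """Per-mil deviation of a sample 13C/12C ratio from VPDB."""
    return (r_sample / R_VPDB - 1.0) * 1000.0

# Hypothetical ratios: ambient CO2 sits near -8 permil; fossil-combustion CO2 is
# depleted in 13C and sits closer to -27 permil
d_ambient = delta13C_permil(0.011091)
d_plume = delta13C_permil(0.010880)
```

    A plume measurably more negative than the ambient background is the fingerprint of combustion-derived CO2 that the campaign exploits.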

  8. High single-spatial-mode pulsed power from 980 nm emitting diode lasers

    NASA Astrophysics Data System (ADS)

    Hempel, Martin; Tomm, Jens W.; Elsaesser, Thomas; Bettiati, Mauro

    2012-11-01

    Single-spatial-mode pulsed powers as high as 13 W and 20 W in 150 and 50 ns pulses, respectively, are reported for 980 nm emitting lasers. In terms of energy, single-spatial-mode values of up to 2 μJ within 150 ns pulses are shown. In this high-power pulsed operation, the devices shield themselves from facet degradation, which is the main degradation mechanism in continuous wave (cw) operation. Our results pave the way towards additional applications employing available standard devices, which were originally designed as highly reliable cw fiber pumps.

  9. Observational methods for solar origin diagnostics of energetic protons

    NASA Astrophysics Data System (ADS)

    Miteva, Rositsa

    2017-12-01

    The aim of the present report is to outline the observational methods used to determine the solar origin - in terms of flares and coronal mass ejections (CMEs) - of the in situ observed solar energetic protons. Several widely used guidelines are given and different sources of uncertainties are summarized and discussed. In the present study, a new quality factor is proposed as a certainty check on the so-identified flare-CME pairs. In addition, the correlations between the proton peak intensity and the properties of their solar origin are evaluated as a function of the quality factor.

  10. The Science Manager's Guide to Case Studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Branch, Kristi M.; Peffers, Melissa S.; Ruegg, Rosalie T.

    2001-09-24

    This guide takes the science manager through the steps of planning, implementing, validating, communicating, and using case studies. It outlines the major methods of analysis, describing their relative merits and applicability while providing relevant examples and sources of additional information. Well-designed case studies can provide a combination of rich qualitative and quantitative information, offering valuable insights into the nature, outputs, and longer-term impacts of the research. An objective, systematic, and credible approach to the evaluation of U.S. Department of Energy Office of Science programs adds value to the research process and is the subject of this guide.

  11. Quantitative Determination of Vinpocetine in Dietary Supplements.

    PubMed

    French, John M T; King, Matthew D; McDougal, Owen M

    2016-05-01

    Current United States regulatory policies allow for the addition of pharmacologically active substances in dietary supplements if derived from a botanical source. The inclusion of certain nootropic drugs, such as vinpocetine, in dietary supplements has recently come under scrutiny due to the lack of defined dosage parameters and yet unproven short- and long-term benefits and risks to human health. This study quantified the concentration of vinpocetine in several commercially available dietary supplements and found that a highly variable range of 0.6-5.1 mg/serving was present across the tested products, with most products providing no specification of vinpocetine concentrations.

  12. Determining the perceived value of information when combining supporting and conflicting data

    NASA Astrophysics Data System (ADS)

    Hanratty, Timothy; Heilman, Eric; Richardson, John; Mittrick, Mark; Caylor, Justine

    2017-05-01

    Modern military intelligence operations involve a deluge of information from a large number of sources. A data-ranking algorithm that enables the most valuable information to be reviewed first may improve timely and effective analysis. This ranking is termed the value of information (VoI), and its calculation is a current area of research within the US Army Research Laboratory (ARL). ARL has conducted an experiment to correlate the perceptions of subject matter experts with the ARL VoI model and, additionally, to construct a cognitive model of the ranking process and the amalgamation of supporting and conflicting information.

  13. Revisiting the social cost of carbon.

    PubMed

    Nordhaus, William D

    2017-02-14

    The social cost of carbon (SCC) is a central concept for understanding and implementing climate change policies. This term represents the economic cost caused by an additional ton of carbon dioxide emissions or its equivalent. The present study presents updated estimates based on a revised DICE model (Dynamic Integrated model of Climate and the Economy). The study estimates that the SCC is $31 per ton of CO2 (in 2010 US$) for the current period (2015). For the central case, the real SCC grows at 3% per year over the period to 2050. The paper also compares the estimates with those from other sources.
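
    The growth assumption lends itself to a quick arithmetic check: starting from a base value of $31/tCO2 in 2015 and compounding at 3% real growth per year gives the central-case SCC trajectory through 2050. A minimal sketch (the function name and defaults are illustrative conveniences, not taken from the paper):

```python
def scc(year, base_year=2015, base_value=31.0, growth=0.03):
    """Social cost of carbon in 2010 US$ per ton of CO2,
    compounding at a constant real growth rate per year."""
    return base_value * (1.0 + growth) ** (year - base_year)

# Under these assumptions the SCC reaches about $87/tCO2 by 2050.
```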

  14. Revisiting the social cost of carbon

    NASA Astrophysics Data System (ADS)

    Nordhaus, William D.

    2017-02-01

    The social cost of carbon (SCC) is a central concept for understanding and implementing climate change policies. This term represents the economic cost caused by an additional ton of carbon dioxide emissions or its equivalent. The present study presents updated estimates based on a revised DICE model (Dynamic Integrated model of Climate and the Economy). The study estimates that the SCC is $31 per ton of CO2 (in 2010 US$) for the current period (2015). For the central case, the real SCC grows at 3% per year over the period to 2050. The paper also compares the estimates with those from other sources.

  15. Neuroimaging Evidence for Agenda-Dependent Monitoring of Different Features during Short-Term Source Memory Tests

    ERIC Educational Resources Information Center

    Mitchell, Karen J.; Raye, Carol L.; McGuire, Joseph T.; Frankel, Hillary; Greene, Erich J.; Johnson, Marcia K.

    2008-01-01

    A short-term source monitoring procedure with functional magnetic resonance imaging assessed neural activity when participants made judgments about the format of 1 of 4 studied items (picture, word), the encoding task performed (cost, place), or whether an item was old or new. The results support findings from long-term memory studies showing that…

  16. Organic carbon sources and sinks in San Francisco Bay: variability induced by river flow

    USGS Publications Warehouse

    Jassby, Alan D.; Powell, T.M.; Cloern, James E.

    1993-01-01

    Sources and sinks of organic carbon for San Francisco Bay (California, USA) were estimated for 1980. Sources for the southern reach were dominated by phytoplankton and benthic microalgal production. River loading of organic matter was an additional important factor in the northern reach. Tidal marsh export and point sources played a secondary role. Autochthonous production in San Francisco Bay appears to be less than the mean for temperate-zone estuaries, primarily because turbidity limits microalgal production and the development of seagrass beds. Exchange between the Bay and Pacific Ocean plays an unknown but potentially important role in the organic carbon balance. Interannual variability in the organic carbon supply was assessed for Suisun Bay, a northern reach subembayment that provides habitat for important fish species (delta smelt Hypomesus transpacificus and larval striped bass Morone saxatilis). The total supply fluctuated by an order of magnitude; depending on the year, either autochthonous sources (phytoplankton production) or allochthonous sources (riverine loading) could be dominant. The primary cause of the year-to-year change was variability of freshwater inflows from the Sacramento and San Joaquin rivers, and its magnitude was much larger than long-term changes arising from marsh destruction and point source decreases. Although interannual variability of the total organic carbon supply could not be assessed for the southern reach, year-to-year changes in phytoplankton production were much smaller than in Suisun Bay, reflecting a relative lack of river influence.

  17. NASA thesaurus. Volume 3: Definitions

    NASA Technical Reports Server (NTRS)

    1988-01-01

    Publication of NASA Thesaurus definitions began with Supplement 1 to the 1985 NASA Thesaurus. The definitions given here represent the complete file of over 3,200 definitions, complemented by nearly 1,000 use references. Definitions of more common or general scientific terms are given a NASA slant if one exists. Certain terms are not defined as a matter of policy: common names, chemical elements, specific models of computers, and nontechnical terms. The NASA Thesaurus predates by a number of years the systematic effort to define terms; therefore, not all Thesaurus terms have been defined. Nevertheless, definitions of older terms are continually being added. The following data are provided for each entry: term in uppercase/lowercase form, definition, source, and year the term (not the definition) was added to the NASA Thesaurus. The NASA History Office is the authority for capitalization in satellite and spacecraft names. Definitions with no source given were constructed by lexicographers at the NASA Scientific and Technical Information (STI) Facility, who rely on the following sources for their information: experts in the field, literature searches from the NASA STI database, and specialized references.

  18. OSRP Source Repatriations-Case Studies: Brazil, Ecuador, Uruguay

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Greenberg, Ray Jr.; Abeyta, Cristy; Matzke, Jim

    2012-07-01

    The Global Threat Reduction Initiative's (GTRI) Offsite Source Recovery Project (OSRP) began recovering excess and unwanted radioactive sealed sources (sources) in 1999. As of February 2012, the project had recovered over 30,000 sources totaling over 820,000 Ci. OSRP grew out of early efforts at Los Alamos National Laboratory (LANL) to recover disused excess Plutonium-239 (Pu-239) sources that were distributed in the 1960s and 1970s under the Atoms for Peace Program. Source recovery was initially considered a waste management activity. However, after the 9/11 terrorist attacks, the interagency community began to recognize that excess and unwanted radioactive sealed sources pose a national security threat, particularly those that lack a disposition path. After OSRP's transfer to the U.S. National Nuclear Security Administration (NNSA) to be part of GTRI, its mission was expanded to include all disused sealed sources that might require national security consideration. Recognizing the transnational threat posed by porous borders and the ubiquitous nature of sources, GTRI/OSRP repatriates U.S.-origin sources based on threat reduction prioritization criteria. For example, several recent challenging source repatriation missions have been conducted by GTRI/OSRP in South America. These include the repatriation of a significant amount of Cs-137 and other isotopes from Brazil; re-packaging of conditioned Ra-226 sources in Ecuador for future repatriation; and multilateral cooperation in the consolidation and export of Canadian, U.S., and Indian Co-60/Cs-137 sources from Uruguay. In addition, cooperation with regulators and private source owners in other countries presents opportunities for GTRI/OSRP to exchange best practices for managing disused sources. These positive experiences often result in long-term cooperation and information sharing with key foreign counterparts. International source recovery operations are essential to the preservation of U.S. national security interests. They are also mutually beneficial for fostering positive relationships with other governments and private industry, and they demonstrate that responsible end-of-life options exist for legacy U.S.-origin sources in other countries. GTRI/OSRP does not take back sources that have a viable path for commercial disposal; most U.S.-origin sources were sold commercially and were not provided by the U.S. government. Below is a synopsis of cooperative efforts with Brazil, Ecuador, and Uruguay. Bilateral and multilateral efforts have been successful in removing hundreds of U.S.-origin sealed radioactive sources from Latin American countries to the U.S. As many disused sources remain in the region, and since repatriation is not always an option, GTRI will continue to work with those countries to ensure that these sources are stored securely for the long term. Successful Latin America operations should serve as a model for regional cooperation in the repatriation of sealed sources, encouraging other source-exporting countries to implement similar programs. Securing and removing sources, both domestically and internationally, is crucial to strengthening the life-cycle management of radioactive sources worldwide. Such efforts not only prevent these materials from being used maliciously, but also address public health and safety concerns and undergird the IAEA Code of Conduct on the Safety and Security of Radioactive Sources. (authors)

  19. On the inclusion of mass source terms in a single-relaxation-time lattice Boltzmann method

    NASA Astrophysics Data System (ADS)

    Aursjø, Olav; Jettestuen, Espen; Vinningland, Jan Ludvig; Hiorth, Aksel

    2018-05-01

    We present a lattice Boltzmann algorithm for incorporating a mass source in a fluid flow system. The proposed mass source/sink term, included in the lattice Boltzmann equation, maintains the Galilean invariance and the accuracy of the overall method while introducing a mass source/sink term in the fluid dynamical equations. The method can, for instance, be used to inject or withdraw fluid from any preferred lattice node in a system. This suggests that injection and withdrawal of fluid do not have to be introduced through cumbersome, and sometimes less accurate, boundary conditions. It also suggests that, through a chosen equation of state relating mass density to pressure, the proposed mass source term makes it possible to set a preferred pressure at any lattice node in a system. We demonstrate how this model handles injection and withdrawal of a fluid, and we show how it can be used to incorporate pressure boundaries. The accuracy of the algorithm is identified through a Chapman-Enskog expansion of the model and supported by numerical simulations.
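
    As a toy illustration of the general idea (not the paper's kernel, which is carefully constructed to preserve Galilean invariance and the method's accuracy), a mass source can be appended to a single-relaxation-time D2Q9 collision step by distributing the injected mass over the lattice directions with the lattice weights; the node's density then grows by exactly the injected amount per step:

```python
import numpy as np

# D2Q9 lattice: weights and discrete velocities
w = np.array([4/9] + [1/9]*4 + [1/36]*4)
c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])

def equilibrium(rho, u):
    """Standard second-order equilibrium distribution."""
    cu = c @ u
    return rho * w * (1 + 3*cu + 4.5*cu**2 - 1.5*(u @ u))

def bgk_step_with_source(f, q, tau=1.0):
    """One BGK collision at a single node with mass source q.
    Zeroth-order choice: spread the injected mass by the weights w_i."""
    rho = f.sum()
    u = (f @ c) / rho
    return f - (f - equilibrium(rho, u)) / tau + w * q

f0 = equilibrium(1.0, np.array([0.0, 0.0]))
f1 = bgk_step_with_source(f0, q=0.01)
# the node's density increases from 1.0 to 1.01 after the step
```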

  20. 10 CFR 1.3 - Sources of additional information.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 1 2010-01-01 2010-01-01 false Sources of additional information. 1.3 Section 1.3 Energy NUCLEAR REGULATORY COMMISSION STATEMENT OF ORGANIZATION AND GENERAL INFORMATION Introduction § 1.3 Sources of additional information. (a) A statement of the NRC's organization, policies, procedures...

  1. Cosine beamforming

    NASA Astrophysics Data System (ADS)

    Ruigrok, Elmer; Wapenaar, Kees

    2014-05-01

    In various application areas, e.g., seismology, astronomy and geodesy, arrays of sensors are used to characterize incoming wavefields due to distant sources. Beamforming is a general term for phase-adjusted summations over the different array elements, used to untangle the directionality and elevation angle of the incoming waves. For characterizing noise sources, beamforming is conventionally applied with a temporal Fourier and a 2D spatial Fourier transform, possibly with additional weights. These transforms become aliased for higher frequencies and sparser array-element distributions. As a partial remedy, we derive a kernel for beamforming crosscorrelated data and call it cosine beamforming (CBF). By applying beamforming not directly to the data, but to crosscorrelated data, the sampling is effectively increased. We show that CBF, owing to this better sampling, suffers less from aliasing and yields higher resolution than conventional beamforming. On the flip side, the CBF output shows more smearing for spherical waves than conventional beamforming.
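
    For reference, the conventional plane-wave beamformer that CBF is compared against can be sketched in a few lines: a steering vector phase-aligns the sensor spectra before summation, and the beam power peaks when the steering matches the incoming wavenumber. The array geometry and wave parameters below are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
coords = rng.uniform(-1000.0, 1000.0, size=(8, 2))  # sensor positions (m)

# incoming plane wave: 1 Hz, phase speed 3 km/s, azimuth 0.3 rad
k_true = (2*np.pi*1.0/3000.0) * np.array([np.cos(0.3), np.sin(0.3)])
d = np.exp(1j * coords @ k_true)  # narrow-band complex sensor spectra

def beam_power(kx, ky):
    """Conventional beamformer: steer, sum, normalize to [0, 1]."""
    steer = np.exp(-1j * coords @ np.array([kx, ky]))
    return np.abs(steer @ d) ** 2 / len(d) ** 2

# power equals 1 when steered exactly at the true wavenumber,
# and drops off for mismatched steering vectors
```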

  2. Finite-amplitude, pulsed, ultrasonic beams

    NASA Astrophysics Data System (ADS)

    Coulouvrat, François; Frøysa, Kjell-Eivind

    An analytical, approximate solution of the inviscid KZK equation for a nonlinear pulsed sound beam radiated by an acoustic source with a Gaussian velocity distribution is obtained by means of the renormalization method. This method involves two steps. First, the transient, weakly nonlinear field is computed. However, because of cumulative nonlinear effects, that expansion is non-uniform and breaks down at some distance from the source. So, in order to extend its validity, it is re-written in a new frame of coordinates better suited to following the nonlinear distortion of the wave profile. Basically, the nonlinear coordinate transform introduces additional terms in the expansion, which are chosen so as to counterbalance the non-uniform ones. Special care is devoted to the treatment of shock waves. Finally, comparisons with the results of a finite-difference scheme prove favorable and show the efficiency of the method for a rather large range of parameters.

  3. Coding conventions and principles for a National Land-Change Modeling Framework

    USGS Publications Warehouse

    Donato, David I.

    2017-07-14

    This report establishes specific rules for writing computer source code for use with the National Land-Change Modeling Framework (NLCMF). These specific rules consist of conventions and principles for writing code primarily in the C and C++ programming languages. Collectively, these coding conventions and coding principles create an NLCMF programming style. In addition to detailed naming conventions, this report provides general coding conventions and principles intended to facilitate the development of high-performance software implemented with code that is extensible, flexible, and interoperable. Conventions for developing modular code are explained in general terms and also enabled and demonstrated through the appended templates for C++ base source-code and header files. The NLCMF limited-extern approach to module structure, code inclusion, and cross-module access to data is both explained in the text and then illustrated through the module templates. Advice on the use of global variables is provided.

  4. Fine-Tuning the Accretion Disk Clock in Hercules X-1

    NASA Technical Reports Server (NTRS)

    Still, M.; Boyd, P.

    2004-01-01

    RXTE ASM count rates from the X-ray pulsar Her X-1 began falling consistently during the late months of 2003. The source is undergoing another state transition similar to the anomalous low state of 1999. This new event has triggered observations from both space- and ground-based observatories. In order to aid data interpretation and telescope scheduling, and to facilitate the phase-connection of cycles before and after the state transition, we have re-calculated the precession ephemeris using cycles over the last 3.5 years. We report that the source has displayed a different precession period since the last anomalous event. Additional archival data from CGRO suggest that each low state is accompanied by a change in precession period and that the subsequent period is correlated with accretion flux. Consequently, our analysis reveals long-term accretion disk behaviour that is predicted by theoretical models of radiation-driven warping.

  5. NASA's Quiet Aircraft Technology Project

    NASA Technical Reports Server (NTRS)

    Whitfield, Charlotte E.

    2004-01-01

    NASA's Quiet Aircraft Technology Project is developing physics-based understanding, models and concepts to discover and realize technology that will, when implemented, achieve the goals of a reduction of one-half in perceived community noise (relative to 1997) by 2007 and a further one-half in the far term. Noise sources generated by both the engine and the airframe are considered, and the effects of engine/airframe integration are accounted for through the propulsion airframe aeroacoustics element. Assessments of the contribution of individual source noise reductions to the reduction in community noise are developed to guide the work and the development of new tools for evaluation of unconventional aircraft is underway. Life in the real world is taken into account with the development of more accurate airport noise models and flight guidance methodology, and in addition, technology is being developed that will further reduce interior noise at current weight levels or enable the use of lighter-weight structures at current noise levels.

  6. Methods for the behavioral, educational, and social sciences: an R package.

    PubMed

    Kelley, Ken

    2007-11-01

    Methods for the Behavioral, Educational, and Social Sciences (MBESS; Kelley, 2007b) is an open source package for R (R Development Core Team, 2007b), an open source statistical programming language and environment. MBESS implements methods that are not widely available elsewhere, yet are especially helpful for the idiosyncratic techniques used within the behavioral, educational, and social sciences. The major categories of functions are those that relate to confidence interval formation for noncentral t, F, and chi-square parameters; confidence intervals for standardized effect sizes (which require noncentral distributions); and sample size planning from the power analytic and accuracy in parameter estimation perspectives. In addition, MBESS contains collections of other functions that should be helpful to substantive researchers and methodologists. MBESS is a long-term project that will continue to be updated and expanded so that important methods can continue to be made available to researchers in the behavioral, educational, and social sciences.

  7. Stochastic memory: getting memory out of noise

    NASA Astrophysics Data System (ADS)

    Stotland, Alexander; di Ventra, Massimiliano

    2011-03-01

    Memory circuit elements, namely memristors, memcapacitors and meminductors, can store information without the need of a power source. These systems are generally defined in terms of deterministic equations of motion for the state variables that are responsible for memory. However, in real systems noise sources can never be eliminated completely, and one would expect noise to be detrimental to memory. Here, we show that under specific conditions on the noise intensity, memory can actually be enhanced. We illustrate this phenomenon using a physical model of a memristor in which the addition of white noise to the state variable equation improves the memory and helps the operation of the system. We discuss under which conditions this effect can be realized experimentally, examine its implications for memory systems reported in the literature, and also analyze the effects of colored noise. Work supported in part by NSF.
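
    The setting can be sketched with a generic memristor whose internal state variable is integrated by the Euler-Maruyama method, with white noise of intensity sigma added to the state equation. The model, parameter values, and function names below are illustrative stand-ins, not the authors' device model:

```python
import numpy as np

def simulate_memristor(V, dt=1e-3, sigma=0.0, x0=0.5, seed=0):
    """Euler-Maruyama integration of a generic memristor state equation
    dx = alpha*V(t)*dt + sigma*dW, with the state x clipped to [0, 1]."""
    rng = np.random.default_rng(seed)
    alpha = 1.0                      # hypothetical drift coefficient
    x = np.empty(len(V) + 1)
    x[0] = x0
    for n, v in enumerate(V):
        dx = alpha*v*dt + sigma*np.sqrt(dt)*rng.standard_normal()
        x[n+1] = min(1.0, max(0.0, x[n] + dx))
    return x

def memristance(x, r_on=100.0, r_off=16e3):
    """Memristance interpolates between the on and off resistances."""
    return r_on*x + r_off*(1.0 - x)
```

With sigma = 0 the trajectory is deterministic; sweeping sigma upward and measuring how well the final state encodes the applied voltage history is one way to probe the noise-enhanced regime the abstract describes.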

  8. Directional Unfolded Source Term (DUST) for Compton Cameras.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mitchell, Dean J.; Horne, Steven M.; O'Brien, Sean

    2018-03-01

    A Directional Unfolded Source Term (DUST) algorithm was developed to enable improved spectral analysis capabilities using data collected by Compton cameras. Achieving this objective required modification of the detector response function in the Gamma Detector Response and Analysis Software (GADRAS). Experimental data that were collected in support of this work include measurements of calibration sources at a range of separation distances and cylindrical depleted uranium castings.

  9. A 305 year monthly rainfall series for the Island of Ireland (1711-2016)

    NASA Astrophysics Data System (ADS)

    Murphy, Conor; Burt, Tim P.; Broderick, Ciaran; Duffy, Catriona; Macdonald, Neil; Matthews, Tom; McCarthy, Mark P.; Mullan, Donal; Noone, Simon; Ryan, Ciara; Thorne, Peter; Walsh, Seamus; Wilby, Robert L.

    2017-04-01

    This paper derives a continuous 305-year monthly rainfall series for the Island of Ireland (IoI) for the period 1711-2016. Two key data sources are employed: i) a previously unpublished UK Met Office Note which compiled annual rainfall anomalies and corresponding monthly per mille amounts from weather diaries and early observational records for the period 1711-1977; and ii) a long-term, homogenised monthly IoI rainfall series for the period 1850-2016. Using estimates of long-term average precipitation sampled from the quality assured series, the full record is reconstituted and insights drawn regarding notable periods and the range of climate variability and change experienced. Consistency with other long records for the region is examined, including: the England and Wales Precipitation series (EWP; 1766-2016); the early EWP Glasspoole series (1716-1765) and the Central England Temperature series (CET; 1711-2016). Strong correspondence between all records is noted from 1780 onwards. While disparities are evident between the early EWP and Ireland series, the latter shows strong decadal consistency with CET throughout the record. In addition, independent, early observations from Cork and Dublin, along with available documentary sources, corroborate the derived series and add confidence to our reconstruction. The new IoI rainfall record reveals that the wettest decades occurred in the early 18th Century, despite the fact that IoI has experienced a long-term winter wetting trend consistent with climate model projections. These exceptionally wet winters of the 1720s and 1730s were concurrent with almost unprecedented warmth in the CET, glacial advance throughout Scandinavia, and glacial retreat in West Greenland, consistent with a wintertime NAO-type forcing. Our study therefore demonstrates the value of long-term observational records for providing insight to the natural climate variability of the North Atlantic region.

  10. The influence of initial conditions on dispersion and reactions

    NASA Astrophysics Data System (ADS)

    Wood, B. D.

    2016-12-01

    In various generalizations of the reaction-dispersion problem, researchers have developed frameworks in which the apparent dispersion coefficient can be negative. Such dispersion coefficients raise several difficult questions. Most importantly, a negative dispersion coefficient at the macroscale leads to a macroscale representation with an apparent decrease in entropy over time, which appears to violate basic thermodynamic principles. In addition, a negative dispersion coefficient leads to an inherently ill-posed mathematical transport equation. The ill-posedness arises because there is no unique initial condition that corresponds to a later-time concentration distribution (assuming discontinuous initial conditions are allowed). In this presentation, we explain how negative dispersion coefficients actually arise because the governing differential equation for early times should, when derived correctly, incorporate a term that depends upon the initial and boundary conditions. Reactions introduce a similar phenomenon, in which the structure of the initial and boundary conditions influences the form of the macroscopic balance equations. When upscaling is done properly, new equations are developed that include source terms not present in the classical (late-time) reaction-dispersion equation. These source terms depend upon the structure of the initial condition of the reacting species, and they decrease exponentially in time (thus converging to the conventional equations at asymptotic times). With this formulation, the resulting dispersion tensor is always positive-semidefinite, and the reaction terms directly incorporate information about the state of mixedness of the system. This formulation avoids many of the problems that would be engendered by defining negative-definite dispersion tensors, and it properly represents the effective rate of reaction at early times.
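
    The flavor of such an early-time correction can be shown with a one-dimensional finite-difference sketch in which an extra source term, tied to the initial condition, decays exponentially so the model relaxes toward the classical late-time equation. The coupling factor 0.1 and the timescale tau are arbitrary illustrative choices, not values from the work described:

```python
import numpy as np

def dispersion_with_ic_source(c0, D=1.0, dx=1.0, dt=0.2, tau=5.0, steps=50):
    """Explicit FTCS solve of  dc/dt = D d2c/dx2 + S0(x)*exp(-t/tau),
    with periodic boundaries. The source S0 is tied to the initial
    condition, mimicking the early-time terms described above."""
    c = c0.copy()
    S0 = 0.1 * c0  # hypothetical: source proportional to the initial condition
    for n in range(steps):
        lap = np.roll(c, 1) - 2*c + np.roll(c, -1)  # periodic Laplacian
        c = c + dt * (D*lap/dx**2 + S0*np.exp(-n*dt/tau))
    return c
```

Because dt*D/dx**2 = 0.2 satisfies the explicit stability bound, the scheme preserves nonnegativity, and the exponentially decaying source injects mass only at early times.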

  11. Is Earth coming out of the recent ice house age in the long-term? - constraints from probable mantle CO2-degassing reconstructions

    NASA Astrophysics Data System (ADS)

    Hartmann, Jens; Li, Gaojun; West, A. Joshua

    2017-04-01

    Enhanced partial melting of mantle material probably started when the subduction motor started around 3.2 Ga ago, as evidenced by the formation history of the continental crust. Carbon degasses during partial melting because it is an incompatible element. Therefore, mantle carbon degassing rates would change with time in proportion to the evolution of the mantle carbon reservoir concentration and the ocean crust production rate, causing a distinct change in the CO2-degassing rate over time. The evolution of the mantle degassing rate has implications for the reconstruction of the carbon cycle, and therefore for climate and rates of Earth surface processes, as CO2-degassing rates are used to constrain or to balance the atmosphere-ocean-crust carbon cycle system. It will be shown that compilations of CO2 degassing from relevant geological sources probably exceed the established terrestrial weathering CO2 sink, which is often used to constrain long-term mantle degassing rates to close the carbon cycle on geological time scales. In addition, the scenarios for the degassing dynamics from the mantle sources suggest that the mantle has been depleting its carbon content since 3 Ga. This has further implications for the long-term weathering CO2 sink. Results will be compared with geochemical proxies for weathering and weathering intensity dynamics, and will be set in the context of snowball Earth events and long-term emplacement dynamics of mafic areas such as Large Igneous Provinces. Decreasing mantle degassing rates since about 2 Ga suggest a constraint on the evolution of the carbon cycle and on the recycling potential of subducted carbon. If the given scenarios hold up to further investigation, the contribution of mantle degassing to climate forcing (directly and via recycling) will decrease further.

  12. Numerical simulation of the environmental impact of hydraulic fracturing of tight/shale gas reservoirs on near-surface groundwater: Background, base cases, shallow reservoirs, short-term gas, and water transport.

    PubMed

    Reagan, Matthew T; Moridis, George J; Keen, Noel D; Johnson, Jeffrey N

    2015-04-01

    Hydrocarbon production from unconventional resources and the use of reservoir stimulation techniques, such as hydraulic fracturing, have grown explosively over the last decade. However, concerns have arisen that reservoir stimulation creates significant environmental threats through the creation of permeable pathways connecting the stimulated reservoir with shallower freshwater aquifers, resulting in the contamination of potable groundwater by escaping hydrocarbons or other reservoir fluids. This study investigates, by numerical simulation, gas and water transport between a shallow tight-gas reservoir and a shallower overlying freshwater aquifer following hydraulic fracturing operations, if such a connecting pathway has been created. We focus on two general failure scenarios: (1) communication between the reservoir and aquifer via a connecting fracture or fault and (2) communication via a deteriorated, preexisting nearby well. We conclude that the key factors driving short-term transport of gas include high permeability of the connecting pathway and the overall volume of the connecting feature. Production from the reservoir is likely to mitigate release through reduction of available free gas and lowering of reservoir pressure, and not producing may increase the potential for release. We also find that hydrostatic tight-gas reservoirs are unlikely to act as a continuing source of migrating gas, as gas contained within the newly formed hydraulic fracture is the primary source for potential contamination. Such incidents of gas escape are likely to be limited in duration and scope for hydrostatic reservoirs. Reliable field and laboratory data must be acquired to constrain the factors and determine the likelihood of these outcomes. In summary: short-term leakage from fractured reservoirs requires high-permeability pathways; production strategy affects the likelihood and magnitude of gas release; and gas release is likely short-term, absent additional driving forces.

  13. On estimating attenuation from the amplitude of the spectrally whitened ambient seismic field

    NASA Astrophysics Data System (ADS)

    Weemstra, Cornelis; Westra, Willem; Snieder, Roel; Boschi, Lapo

    2014-06-01

    Measuring attenuation on the basis of interferometric, receiver-receiver surface waves is a non-trivial task: the amplitude, more than the phase, of ensemble-averaged cross-correlations is strongly affected by non-uniformities in the ambient wavefield. In addition, ambient noise data are typically pre-processed in ways that affect the amplitude itself. Some authors have recently attempted to measure attenuation in receiver-receiver cross-correlations obtained after the usual pre-processing of seismic ambient-noise records, including, most notably, spectral whitening. Spectral whitening replaces the cross-spectrum with a unit amplitude spectrum. It is generally assumed that cross-terms have cancelled each other prior to spectral whitening. Cross-terms are peaks in the cross-correlation due to simultaneously acting noise sources, that is, spurious traveltime delays due to constructive interference of signal coming from different sources. Cancellation of these cross-terms is a requirement for the successful retrieval of interferometric receiver-receiver signal and results from ensemble averaging. In practice, ensemble averaging is replaced by integrating over a sufficiently long time or by averaging over several cross-correlation windows. Contrary to the general assumption, we show in this study that cross-terms are not required to cancel each other prior to spectral whitening, but may also cancel each other after the whitening procedure. Specifically, we derive an analytic approximation for the amplitude difference associated with the reversed order of cancellation and normalization. Our approximation shows that an amplitude decrease results from the reversed order. This decrease is predominantly non-linear at small receiver-receiver distances: at distances smaller than approximately two wavelengths, whitening prior to ensemble averaging causes a significantly stronger decay of the cross-spectrum.
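
    The non-commutativity of whitening and ensemble averaging can be demonstrated numerically. In this synthetic sketch (all parameters are arbitrary illustrative choices), each cross-spectrum window shares a common signal phase but has a noisy amplitude; whitening each window before stacking lowers the stacked amplitude relative to whitening the stack:

```python
import numpy as np

rng = np.random.default_rng(1)

def whiten(spec, eps=1e-12):
    """Spectral whitening: keep the phase, force unit amplitude."""
    return spec / (np.abs(spec) + eps)

# 200 windows of a 256-bin cross-spectrum with a common (signal) phase
phase = np.exp(1j * 2*np.pi * rng.random(256))
windows = [phase * (1.0 + 0.5*rng.standard_normal(256)) for _ in range(200)]

stack_then_whiten = whiten(np.mean(windows, axis=0))
whiten_then_stack = np.mean([whiten(w) for w in windows], axis=0)

# whitening after stacking leaves unit amplitude in every bin; whitening
# each window first lets occasional sign flips partially cancel in the
# stack, biasing the stacked amplitude downward
```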

  14. Erratum to Surface‐wave Green’s tensors in the near field

    USGS Publications Warehouse

    Haney, Matthew M.; Nakahara, Hisashi

    2016-01-01

    Haney and Nakahara (2014) derived expressions for surface‐wave Green’s tensors that included near‐field behavior. Building on the result for a force source, Haney and Nakahara (2014) further derived expressions for a general point moment tensor source using the exact Green’s tensors. However, it has come to our attention that, although the Green’s tensors were correct, the resulting expressions for a general point moment tensor source were missing some terms. In this erratum, we provide updated expressions with these missing terms. The inclusion of the missing terms changes the example given in Haney and Nakahara (2014).

  15. Flowsheets and source terms for radioactive waste projections

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Forsberg, C.W.

    1985-03-01

    Flowsheets and source terms used to generate radioactive waste projections in the Integrated Data Base (IDB) Program are given. Volumes of each waste type generated per unit product throughput have been determined for the following facilities: uranium mining, UF6 conversion, uranium enrichment, fuel fabrication, boiling-water reactors (BWRs), pressurized-water reactors (PWRs), and fuel reprocessing. Source terms for DOE/defense wastes have been developed. Expected wastes from typical decommissioning operations for each facility type have been determined. All wastes are also characterized by isotopic composition at time of generation and by general chemical composition. 70 references, 21 figures, 53 tables.

  16. Traveltime delay relative to the maximum energy of the wave train for dispersive tsunamis propagating across the Pacific Ocean: the case of 2010 and 2015 Chilean Tsunamis

    NASA Astrophysics Data System (ADS)

    Poupardin, A.; Heinrich, P.; Hébert, H.; Schindelé, F.; Jamelot, A.; Reymond, D.; Sugioka, H.

    2018-05-01

    This paper evaluates the importance of frequency dispersion in the propagation of recent trans-Pacific tsunamis. Frequency dispersion induces a time delay for the most energetic waves, which increases for long propagation distances and short source dimensions. To calculate this time delay, propagation of tsunamis is simulated and analyzed from spectrograms of time-series at specific gauges in the Pacific Ocean. One- and two-dimensional simulations are performed by solving either shallow water or Boussinesq equations and by considering realistic seismic sources. One-dimensional sensitivity tests are first performed in a constant-depth channel to study the influence of the source width. Two-dimensional tests are then performed in a simulated Pacific Ocean with a 4000-m constant depth and by considering tectonic sources of 2010 and 2015 Chilean earthquakes. For these sources, both the azimuth and the distance play a major role in the frequency dispersion of tsunamis. Finally, simulations are performed considering the real bathymetry of the Pacific Ocean. Multiple reflections, refractions as well as shoaling of waves result in much more complex time series for which the effects of the frequency dispersion are hardly discernible. The main point of this study is to evaluate frequency dispersion in terms of traveltime delays by calculating spectrograms for a time window of 6 hours after the arrival of the first wave. Results of the spectral analysis show that the wave packets recorded by pressure and tide sensors in the Pacific Ocean seem to be better reproduced by the Boussinesq model than the shallow water model and approximately follow the theoretical dispersion relationship linking wave arrival times and frequencies. Additionally, a traveltime delay is determined above which effects of frequency dispersion are considered to be significant in terms of maximum surface elevations.
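
    The dispersive traveltime delay can be estimated from the linear dispersion relation omega^2 = g*k*tanh(k*h). A rough sketch using the paper's 4000-m constant-depth test configuration with illustrative wave periods (not the authors' code):

```python
import math

g, h = 9.81, 4000.0               # gravity (m/s^2), constant ocean depth (m)

def wavenumber(omega):
    """Solve the linear dispersion relation omega^2 = g*k*tanh(k*h) for k."""
    k = omega**2 / g              # deep-water initial guess
    for _ in range(50):           # Newton iterations
        f = g * k * math.tanh(k * h) - omega**2
        df = g * math.tanh(k * h) + g * k * h / math.cosh(k * h)**2
        k -= f / df
    return k

def group_velocity(omega):
    k = wavenumber(omega)
    c = omega / k                 # phase speed
    return 0.5 * c * (1.0 + 2.0 * k * h / math.sinh(2.0 * k * h))

D = 10_000e3                      # 10,000 km propagation distance
c_long = math.sqrt(g * h)         # non-dispersive shallow-water speed
delays = []
for period in (2000.0, 1000.0, 500.0):   # wave periods, s
    omega = 2.0 * math.pi / period
    delays.append(D / group_velocity(omega) - D / c_long)
    print(f"T = {period:6.0f} s  delay = {delays[-1]/60:6.1f} min")
```

    The delay relative to the non-dispersive front grows with distance and with decreasing wave period, which is why short-source-dimension events show the effect most clearly.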

  17. Acceleration of auroral electrons in parallel electric fields

    NASA Technical Reports Server (NTRS)

    Kaufmann, R. L.; Walker, D. N.; Arnoldy, R. L.

    1976-01-01

    Rocket observations of auroral electrons are compared with the predictions of a number of theoretical acceleration mechanisms that involve an electric field parallel to the earth's magnetic field. The theoretical models are discussed in terms of required plasma sources, the location of the acceleration region, and properties of necessary wave-particle scattering mechanisms. We have been unable to find any steady state scatter-free electric field configuration that predicts electron flux distributions in agreement with the observations. The addition of a fluctuating electric field or wave-particle scattering several thousand kilometers above the rocket can modify the theoretical flux distributions so that they agree with measurements. The presence of very narrow energy peaks in the flux contours implies a characteristic temperature of several tens of electron volts or less for the source of field-aligned auroral electrons and a temperature of several hundred electron volts or less for the relatively isotropic 'monoenergetic' auroral electrons. The temperature of the field-aligned electrons is more representative of the magnetosheath or possibly the ionosphere as a source region than of the plasma sheet.

  18. C-arm based cone-beam CT using a two-concentric-arc source trajectory: system evaluation

    NASA Astrophysics Data System (ADS)

    Zambelli, Joseph; Zhuang, Tingliang; Nett, Brian E.; Riddell, Cyril; Belanger, Barry; Chen, Guang-Hong

    2008-03-01

    The current x-ray source trajectory for C-arm based cone-beam CT is a single arc. Reconstruction from data acquired with this trajectory yields cone-beam artifacts for regions other than the central slice. In this work we present the preliminary evaluation of reconstruction from a source trajectory of two concentric arcs using a flat-panel detector equipped C-arm gantry (GE Healthcare Innova 4100 system, Waukesha, Wisconsin). The reconstruction method employed is a summation of FDK-type reconstructions from the two individual arcs. For the angle between arcs studied here, 30°, this method offers a significant reduction in the visibility of cone-beam artifacts, with the additional advantages of simplicity and ease of implementation due to the fact that it is a direct extension of the reconstruction method currently implemented on commercial systems. Reconstructed images from data acquired from the two arc trajectory are compared to those reconstructed from a single arc trajectory and evaluated in terms of spatial resolution, low contrast resolution, noise, and artifact level.

  19. C-arm based cone-beam CT using a two-concentric-arc source trajectory: system evaluation.

    PubMed

    Zambelli, Joseph; Zhuang, Tingliang; Nett, Brian E; Riddell, Cyril; Belanger, Barry; Chen, Guang-Hong

    2008-01-01

    The current x-ray source trajectory for C-arm based cone-beam CT is a single arc. Reconstruction from data acquired with this trajectory yields cone-beam artifacts for regions other than the central slice. In this work we present the preliminary evaluation of reconstruction from a source trajectory of two concentric arcs using a flat-panel detector equipped C-arm gantry (GE Healthcare Innova 4100 system, Waukesha, Wisconsin). The reconstruction method employed is a summation of FDK-type reconstructions from the two individual arcs. For the angle between arcs studied here, 30°, this method offers a significant reduction in the visibility of cone-beam artifacts, with the additional advantages of simplicity and ease of implementation due to the fact that it is a direct extension of the reconstruction method currently implemented on commercial systems. Reconstructed images from data acquired from the two arc trajectory are compared to those reconstructed from a single arc trajectory and evaluated in terms of spatial resolution, low contrast resolution, noise, and artifact level.

  20. Yield Determination of Underground and Near Surface Explosions

    NASA Astrophysics Data System (ADS)

    Pasyanos, M.

    2015-12-01

    As seismic coverage of the earth's surface continues to improve, we are faced with signals from a wide variety of explosions ranging from oil train and ordnance explosions to military and terrorist attacks, as well as underground nuclear tests. We present a method for determining the yield of underground and near-surface explosions, which should be applicable to many of these. We first review the regional envelope method that was developed for underground explosions (Pasyanos et al., 2012) and more recently modified for near-surface explosions (Pasyanos and Ford, 2015). The technique models the waveform envelope templates as a product of source, propagation (geometrical spreading and attenuation), and site terms, while near-surface explosions include an additional surface effect. Yields and depths are determined by comparing the observed envelopes to the templates and minimizing the misfit. We then apply the method to nuclear and chemical explosions for a range of yields, depths, and distances. We review some results from previous work, and show new examples from ordnance explosions in Scandinavia, nuclear explosions in Eurasia, and chemical explosions in Nevada associated with the Source Physics Experiments (SPE).
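
    A toy version of the envelope-template idea: amplitudes are a product of source, spreading, attenuation, and site terms (additive in log space), and yield is recovered by a grid search over the misfit. All parameter values here (yield-scaling exponent, Q, distances) are hypothetical, not those of Pasyanos et al.

```python
import numpy as np

# Hypothetical log10 envelope template: amplitude scales with yield W (kt)
# as a power law (exponent a, assumed), decays by geometrical spreading and
# anelastic attenuation with distance r (km), plus a fixed site term.
def log_envelope(W, r, site=0.2, a=0.8, Q=200.0, f=2.0, beta=3.5):
    spreading = -np.log10(r)
    attenuation = -(np.pi * f * r / (Q * beta)) / np.log(10.0)
    return a * np.log10(W) + spreading + attenuation + site

r = np.array([200.0, 400.0, 800.0])          # station distances, km
# Synthetic "observed" envelopes for a 4 kt event, with small noise
observed = log_envelope(4.0, r) + np.array([0.05, -0.03, 0.02])

# Grid search over yield, minimizing the L2 misfit to the templates
yields = np.linspace(0.5, 20.0, 400)
misfit = [np.sum((observed - log_envelope(W, r))**2) for W in yields]
best = yields[int(np.argmin(misfit))]
print(f"estimated yield ~ {best:.2f} kt")
```

    A real application would search jointly over yield and depth and use calibrated propagation and site terms; the sketch only shows the misfit-minimization structure.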

  1. Crowd-Sourced Amputee Gait Data: A Feasibility Study Using YouTube Videos of Unilateral Trans-Femoral Gait.

    PubMed

    Gardiner, James; Gunarathne, Nuwan; Howard, David; Kenney, Laurence

    2016-01-01

    Collecting large datasets of amputee gait data is notoriously difficult. Additionally, collecting data on less prevalent amputations or on gait activities other than level walking and running on hard surfaces is rarely attempted. However, with the wealth of user-generated content on the Internet, the scope for collecting amputee gait data from alternative sources other than traditional gait labs is intriguing. Here we investigate the potential of YouTube videos to provide gait data on amputee walking. We use an example dataset of trans-femoral amputees level walking at self-selected speeds to collect temporal gait parameters and calculate gait asymmetry. We compare our YouTube data with typical literature values, and show that our methodology produces results that are highly comparable to data collected in a traditional manner. The similarity between the results of our novel methodology and literature values lends confidence to our technique. Nevertheless, clear challenges with the collection and interpretation of crowd-sourced gait data remain, including long-term access to datasets, and a lack of validity and reliability studies in this area.
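
    Temporal gait asymmetry of the kind extracted from such videos is commonly quantified with a symmetry index; a minimal sketch with illustrative step times (not data from the study):

```python
# Symmetry index (SI) for temporal gait parameters: 0% = perfect symmetry.
# This is a standard asymmetry measure; the step times are illustrative.
def symmetry_index(prosthetic, intact):
    return 100.0 * abs(prosthetic - intact) / (0.5 * (prosthetic + intact))

# Hypothetical step times (s) for a unilateral trans-femoral amputee
prosthetic_step, intact_step = 0.72, 0.58
si = symmetry_index(prosthetic_step, intact_step)
print(f"step-time symmetry index: {si:.1f}%")
```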

  2. Crowd-Sourced Amputee Gait Data: A Feasibility Study Using YouTube Videos of Unilateral Trans-Femoral Gait

    PubMed Central

    Gardiner, James; Gunarathne, Nuwan; Howard, David; Kenney, Laurence

    2016-01-01

    Collecting large datasets of amputee gait data is notoriously difficult. Additionally, collecting data on less prevalent amputations or on gait activities other than level walking and running on hard surfaces is rarely attempted. However, with the wealth of user-generated content on the Internet, the scope for collecting amputee gait data from alternative sources other than traditional gait labs is intriguing. Here we investigate the potential of YouTube videos to provide gait data on amputee walking. We use an example dataset of trans-femoral amputees level walking at self-selected speeds to collect temporal gait parameters and calculate gait asymmetry. We compare our YouTube data with typical literature values, and show that our methodology produces results that are highly comparable to data collected in a traditional manner. The similarity between the results of our novel methodology and literature values lends confidence to our technique. Nevertheless, clear challenges with the collection and interpretation of crowd-sourced gait data remain, including long-term access to datasets, and a lack of validity and reliability studies in this area. PMID:27764226

  3. Jet crackle: skewness transport budget and a mechanistic source model

    NASA Astrophysics Data System (ADS)

    Buchta, David; Freund, Jonathan

    2016-11-01

    The sound from high-speed (supersonic) jets, such as on military aircraft, is distinctly different from that of lower-speed jets, such as on commercial airliners. Atop the already loud noise, a higher speed adds an intense, fricative, and intermittent character. The observed pressure wave patterns have strong peaks followed by relatively long shallows; notably, their pressure skewness is Sk >= 0.4. Direct numerical simulations of free-shear-flow turbulence show that these skewed pressure waves occur immediately adjacent to the turbulence source for M >= 2.5. Additionally, the near-field waves are seen to intersect and nonlinearly merge with other waves. Statistical analysis of terms in a pressure skewness transport equation shows that, starting just beyond δ99, the nonlinear wave mechanics that add to Sk are balanced by damping molecular effects, consistent with this aspect of the sound arising in the source region. A gas dynamics description is developed that neglects rotational turbulence dynamics and yet reproduces the key crackle features. At its core, this mechanism shows simply that nonlinear compressive effects lead directly to stronger compressions than expansions and thus Sk > 0.
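
    The pressure skewness Sk used here is the normalized third moment of the pressure signal. A minimal sketch on a synthetic, crackle-like waveform (illustrative only, not a DNS pressure trace):

```python
import numpy as np

# Sample skewness of a pressure record: crackling signals have Sk >~ 0.4,
# i.e. sharp compressions outweigh the long, shallow expansions.
def skewness(p):
    d = np.asarray(p, dtype=float)
    d = d - d.mean()
    return (d**3).mean() / (d**2).mean()**1.5

# Synthetic crackle-like waveform: sharp positive peaks separated by long,
# shallow troughs (an exponentiated sine, purely illustrative).
t = np.linspace(0.0, 50.0, 20000)
crackle = np.exp(3.0 * np.sin(2.0 * np.pi * t))
print(f"Sk = {skewness(crackle):.2f}")
```

    A symmetric waveform such as a pure sine gives Sk near zero, which is why the Sk >= 0.4 threshold discriminates crackle from ordinary jet noise.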

  4. The Potential for Engineering Enhanced Functional-Feed Soybeans for Sustainable Aquaculture Feed.

    PubMed

    Herman, Eliot M; Schmidt, Monica A

    2016-01-01

    Aquaculture is the most rapidly growing segment of global animal production; it now surpasses wild-capture fisheries production and is continuing to grow 10% annually. Sustainable aquaculture needs to diminish, and progressively eliminate, its dependence on fishmeal-sourced feed from over-harvested fisheries. Sustainable aquafeed sources will need to be primarily of plant origin. Soybean is currently the primary global vegetable-origin protein source for aquaculture. Direct exchange of soybean meal for fishmeal in aquafeed has resulted in reduced growth rates, due in part to soybean's anti-nutritional proteins. To produce soybeans for use in aquaculture feeds, a new conventional line, termed Triple Null, has been bred by stacking null alleles for the feed-relevant proteins Kunitz trypsin inhibitor, lectin, and the P34 allergen. Triple Null is now being further enhanced as a platform on which to build additional transgene traits for vaccines and altered protein composition, and to produce high levels of β-carotene, an intrinsic orange-colored aquafeed marker that distinguishes the seeds from commodity beans and serves as the metabolic feedstock precursor of highly valued astaxanthin.

  5. Unrecognized astrometric confusion in the Galactic Centre

    NASA Astrophysics Data System (ADS)

    Plewa, P. M.; Sari, R.

    2018-06-01

    The Galactic Centre is a crowded stellar field and frequent unrecognized events of source confusion, which involve undetected faint stars, are expected to introduce astrometric noise on a sub-mas level. This confusion noise is the main non-instrumental effect limiting the astrometric accuracy and precision of current near-infrared imaging observations and the long-term monitoring of individual stellar orbits in the vicinity of the central supermassive black hole. We self-consistently simulate the motions of the known and the yet unidentified stars to characterize this noise component and show that a likely consequence of source confusion is a bias in estimates of the stellar orbital elements, as well as the inferred mass and distance of the black hole, in particular if stars are being observed at small projected separations from it, such as the star S2 during pericentre passage. Furthermore, we investigate modelling the effect of source confusion as an additional noise component that is time-correlated, demonstrating a need for improved noise models to obtain trustworthy estimates of the parameters of interest (and their uncertainties) in future astrometric studies.
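
    Modelling confusion noise as time-correlated, as the abstract proposes, can be sketched with a first-order autoregressive process: an unresolved faint star perturbs the measured position coherently over several epochs rather than independently. The correlation time and amplitude below are illustrative, not fitted to Galactic Centre data.

```python
import numpy as np

rng = np.random.default_rng(1)

# AR(1) model of time-correlated astrometric noise: successive epochs are
# correlated with e-folding time tau (epochs); sigma is the rms amplitude.
def ar1_noise(n_epochs, tau=5.0, sigma=0.3, dt=1.0):
    phi = np.exp(-dt / tau)
    x = np.empty(n_epochs)
    x[0] = rng.normal(0.0, sigma)
    for i in range(1, n_epochs):
        x[i] = phi * x[i - 1] + rng.normal(0.0, sigma * np.sqrt(1.0 - phi**2))
    return x

noise = ar1_noise(10000)
# Lag-1 autocorrelation should be near exp(-1/5) ~ 0.82, unlike white noise.
r1 = np.corrcoef(noise[:-1], noise[1:])[0, 1]
print(f"lag-1 autocorrelation: {r1:.2f}")
```

    Fitting orbits with a white-noise likelihood when the residuals are actually correlated like this is what biases the inferred orbital elements, hence the paper's call for improved noise models.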

  6. Estimating source parameters from deformation data, with an application to the March 1997 earthquake swarm off the Izu Peninsula, Japan

    NASA Astrophysics Data System (ADS)

    Cervelli, P.; Murray, M. H.; Segall, P.; Aoki, Y.; Kato, T.

    2001-06-01

    We have applied two Monte Carlo optimization techniques, simulated annealing and random cost, to the inversion of deformation data for fault and magma chamber geometry. These techniques involve an element of randomness that permits them to escape local minima and ultimately converge to the global minimum of misfit space. We have tested the Monte Carlo algorithms on two synthetic data sets and compared them to one another in terms of efficiency and reliability. We have applied the bootstrap method to estimate confidence intervals for the source parameters, including the correlations inherent in the data. Additionally, we present methods that use the information from the bootstrapping procedure to visualize the correlations between the different model parameters. We have applied these techniques to GPS, tilt, and leveling data from the March 1997 earthquake swarm off the Izu Peninsula, Japan. Using the two Monte Carlo algorithms, we have inferred two sources, a dike and a fault, that fit the deformation data and the patterns of seismicity and that are consistent with the regional stress field.
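
    Simulated annealing, one of the two Monte Carlo techniques used, can be sketched on a one-dimensional misfit surface with a local minimum; the misfit function, proposal width, and cooling schedule below are illustrative, not the geodetic inversion itself.

```python
import math, random

random.seed(0)

# Toy misfit with a local minimum near x ~ 1.6 and the global minimum
# near x ~ 3.6; accepting occasional uphill moves lets the chain escape
# the local basin, which is the point of the annealing step.
def misfit(x):
    return 0.1 * (x - 3.0)**2 + math.sin(3.0 * x)

x = -2.0                          # start far from the global minimum
best_x, best_f = x, misfit(x)
T = 2.0                           # initial "temperature"
for step in range(20000):
    cand = x + random.gauss(0.0, 0.5)          # random perturbation
    df = misfit(cand) - misfit(x)
    # Metropolis rule: always accept downhill, sometimes accept uphill
    if df < 0 or random.random() < math.exp(-df / T):
        x = cand
        if misfit(x) < best_f:
            best_x, best_f = x, misfit(x)
    T *= 0.9995                   # geometric cooling schedule
print(f"best x = {best_x:.2f}, misfit = {best_f:.2f}")
```

    In the actual inversion the model vector holds fault or dike geometry parameters and the misfit compares predicted to observed deformation, but the accept/reject and cooling logic is the same.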

  7. 36 CFR 1290.3 - Sources of assassination records and additional records and information.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Sources of assassination records and additional records and information. Assassination records and... 36 Parks, Forests, and Public Property 3 2010-07-01 2010-07-01 false Sources of assassination records and additional records and information. 1290.3 Section 1290.3 Parks, Forests, and Public Property...

  8. 37 CFR 1.776 - Calculation of patent term extension for a food additive or color additive.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... extension for a food additive or color additive. 1.776 Section 1.776 Patents, Trademarks, and Copyrights... Calculation of patent term extension for a food additive or color additive. (a) If a determination is made pursuant to § 1.750 that a patent for a food additive or color additive is eligible for extension, the term...

  9. 37 CFR 1.776 - Calculation of patent term extension for a food additive or color additive.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... extension for a food additive or color additive. 1.776 Section 1.776 Patents, Trademarks, and Copyrights... Calculation of patent term extension for a food additive or color additive. (a) If a determination is made pursuant to § 1.750 that a patent for a food additive or color additive is eligible for extension, the term...

  10. 37 CFR 1.776 - Calculation of patent term extension for a food additive or color additive.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... extension for a food additive or color additive. 1.776 Section 1.776 Patents, Trademarks, and Copyrights... Calculation of patent term extension for a food additive or color additive. (a) If a determination is made pursuant to § 1.750 that a patent for a food additive or color additive is eligible for extension, the term...

  11. 37 CFR 1.776 - Calculation of patent term extension for a food additive or color additive.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... extension for a food additive or color additive. 1.776 Section 1.776 Patents, Trademarks, and Copyrights... Calculation of patent term extension for a food additive or color additive. (a) If a determination is made pursuant to § 1.750 that a patent for a food additive or color additive is eligible for extension, the term...

  12. 37 CFR 1.776 - Calculation of patent term extension for a food additive or color additive.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... extension for a food additive or color additive. 1.776 Section 1.776 Patents, Trademarks, and Copyrights... Calculation of patent term extension for a food additive or color additive. (a) If a determination is made pursuant to § 1.750 that a patent for a food additive or color additive is eligible for extension, the term...

  13. Oxidation study of coated Crofer 22 APU steel in dry oxygen

    NASA Astrophysics Data System (ADS)

    Molin, Sebastian; Chen, Ming; Hendriksen, Peter Vang

    2014-04-01

    The effect of a dual-layer coating, composed of a layer of Co3O4 and a layer of a La0.85Sr0.15MnO3/Co3O4 mixture, on the high-temperature corrosion of the Crofer 22 APU alloy is reported. Oxidation experiments were performed in dry oxygen at three temperatures, 800 °C, 850 °C and 900 °C, for periods up to 1000 h. Additionally, at 850 °C a 5000 h oxidation test was performed to evaluate the longer-term suitability of the proposed coating. Corrosion kinetics were evaluated by measuring mass gain during oxidation. The corrosion kinetics for the coated samples are analyzed in terms of a parabolic rate law. Microstructural features were investigated by scanning electron microscopy, energy-dispersive X-ray analysis and X-ray diffractometry. The coating is effective in reducing the corrosion rate and in ensuring a long lifetime of the coated alloys. The calculated activation energy for the corrosion process is around 1.8 eV. A complex Co-Mn-Cr spinel is formed by diffusion of Cr and Mn from the alloy into the Co3O4 coating and by additional diffusion of Mn from the LSM layer. Adding a layer of LSM/Co3O4 on top of the cobalt spinel, acting as an additional Mn source, is beneficial for corrosion resistance.
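
    The parabolic rate law analysis and the Arrhenius estimate of the activation energy can be sketched as follows; the rate constants are synthetic, chosen only so the recovered Ea matches the reported ~1.8 eV, and are not the measured data.

```python
import math

# Parabolic oxidation kinetics: (mass gain)^2 = kp * t. Fit kp at several
# temperatures, then estimate the activation energy Ea from the Arrhenius
# slope of ln(kp) vs 1/T. All numbers are illustrative.
kB = 8.617e-5                       # Boltzmann constant, eV/K

def kp_from_data(times, mass_gains):
    # Least-squares slope through the origin for (mass gain)^2 vs t
    num = sum(t * m**2 for t, m in zip(times, mass_gains))
    den = sum(t * t for t in times)
    return num / den

temps = [1073.0, 1123.0, 1173.0]    # 800, 850, 900 degC in kelvin
Ea_true, k0 = 1.8, 1.0e5            # assumed activation energy (eV), prefactor
times = [100.0, 400.0, 1000.0]      # oxidation times, h

ln_kp, inv_T = [], []
for T in temps:
    kp = k0 * math.exp(-Ea_true / (kB * T))
    gains = [math.sqrt(kp * t) for t in times]   # ideal parabolic growth
    ln_kp.append(math.log(kp_from_data(times, gains)))
    inv_T.append(1.0 / T)

# Slope of the Arrhenius line gives -Ea/kB
slope = (ln_kp[-1] - ln_kp[0]) / (inv_T[-1] - inv_T[0])
Ea_est = -slope * kB
print(f"estimated activation energy: {Ea_est:.2f} eV")
```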

  14. A Benchmark Study of Large Contract Supplier Monitoring Within DOD and Private Industry

    DTIC Science & Technology

    1994-03-01

    Topics covered include long-term supplier relationships, global sourcing, and refocusing on customer quality. Initiatives examined (supplier monitoring and recognition, a reduced number of suppliers, global sourcing, and long-term contractor relationships) were compared to DCMC practice. Subject terms: benchmark study of large contract supplier monitoring.

  15. Recent H- diagnostics, plasma simulations, and 2X scaled Penning ion source developments at the Rutherford Appleton Laboratory

    NASA Astrophysics Data System (ADS)

    Lawrie, S. R.; Faircloth, D. C.; Smith, J. D.; Sarmento, T. M.; Whitehead, M. O.; Wood, T.; Perkins, M.; Macgregor, J.; Abel, R.

    2018-05-01

    A vessel for extraction and source plasma analyses is being used for Penning H- ion source development at the Rutherford Appleton Laboratory. A new set of optical elements including an einzel lens has been installed, which transports over 80 mA of H- beam successfully. Simultaneously, a 2X scaled Penning source has been developed to reduce cathode power density. The 2X source is now delivering a 65 mA H- ion beam at 10% duty factor, meeting its design criteria. The long-term viability of the einzel lens and 2X source is now being evaluated, so new diagnostic devices have been installed. A pair of electrostatic deflector plates is used to correct beam misalignment and perform fast chopping, with a voltage rise time of 24 ns. A suite of four quartz crystal microbalances has shown that the cesium flux in the vacuum vessel is only increased by a factor of two, despite the absence of a dedicated cold trap. Finally, an infrared camera has demonstrated good agreement with thermal simulations but has indicated unexpected heating due to beam loss on the downstream electrode. These types of diagnostics are suitable for monitoring all operational ion sources. In addition to experimental campaigns and new diagnostic tools, the high-performance VSim and COMSOL software packages are being used for plasma simulations of two novel ion thrusters for space propulsion applications. In parallel, a VSim framework has been established to include arbitrary temperature and cesium fields to allow the modeling of surface physics in H- ion sources.

  16. A comprehensive Probabilistic Tsunami Hazard Assessment for the city of Naples (Italy)

    NASA Astrophysics Data System (ADS)

    Anita, G.; Tonini, R.; Selva, J.; Sandri, L.; Pierdominici, S.; Faenza, L.; Zaccarelli, L.

    2012-12-01

    A comprehensive Probabilistic Tsunami Hazard Assessment (PTHA) should consider different tsunamigenic sources (seismic events, slide failures, volcanic eruptions) to calculate the hazard at given target sites. This implies a multi-disciplinary analysis of all natural tsunamigenic sources, in a multi-hazard/risk framework that also considers the effects of interaction/cascade events. Our approach shows the ongoing effort to analyze the comprehensive PTHA for the city of Naples (Italy), including all types of sources located in the Tyrrhenian Sea, as developed within the Italian project ByMuR (Bayesian Multi-Risk Assessment). The project combines a multi-hazard/risk approach to treat the interactions among different hazards, and a Bayesian approach to handle the uncertainties. The natural potential tsunamigenic sources analyzed are: 1) submarine seismic sources located on active faults in the Tyrrhenian Sea and close to the southern Italian shoreline (we also consider the effects of inshore seismic sources and the associated active faults, for which we provide rupture properties), 2) mass failures and collapses around the target area (spatially identified on the basis of their propensity to failure), and 3) volcanic sources, mainly pyroclastic flows and collapses from the volcanoes in the Neapolitan area (Vesuvius, Campi Flegrei and Ischia). All these natural sources are preliminarily analyzed and combined here, in order to provide a complete picture of a PTHA for the city of Naples. In addition, the treatment of interaction/cascade effects is formally discussed in the case of significant temporary variations in the short-term PTHA due to an earthquake.
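
    If each source class is treated as an independent Poisson process of threshold-exceeding events at the target site, the combined hazard follows from the summed annual rates. The rates below are purely illustrative placeholders, not ByMuR results.

```python
import math

# Aggregating independent tsunamigenic source classes: each class produces
# events exceeding the hazard threshold as a Poisson process with annual
# rate lam_i; the combined exceedance probability over an exposure time T
# uses the summed rate. All rates are illustrative.
rates = {
    "seismic":   1.0e-3,   # submarine earthquakes, events/yr above threshold
    "landslide": 2.0e-4,   # mass failures and collapses
    "volcanic":  5.0e-5,   # pyroclastic flows / volcano collapses
}
T = 50.0                   # exposure period, years
lam_total = sum(rates.values())
p_exceed = 1.0 - math.exp(-lam_total * T)
print(f"50-yr exceedance probability: {p_exceed:.3f}")
```

    A full PTHA replaces each constant rate with a distribution over source scenarios and propagates the uncertainties (here handled with a Bayesian approach), but the combination rule across independent source classes has this form.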

  17. Spatial variation in anthropogenic mortality induces a source-sink system in a hunted mesopredator.

    PubMed

    Minnie, Liaan; Zalewski, Andrzej; Zalewska, Hanna; Kerley, Graham I H

    2018-04-01

    Lethal carnivore management is a prevailing strategy to reduce livestock predation. Intensity of lethal management varies according to land-use, where carnivores are more intensively hunted on farms relative to reserves. Variations in hunting intensity may result in the formation of a source-sink system where carnivores disperse from high-density to low-density areas. Few studies quantify dispersal between supposed sources and sinks, a fundamental requirement for source-sink systems. We used the black-backed jackal (Canis mesomelas) as a model to determine if heterogeneous anthropogenic mortality induces a source-sink system. We analysed 12 microsatellite loci from 554 individuals from lightly hunted and previously unhunted reserves, as well as heavily hunted livestock and game farms. Bayesian genotype assignment showed that jackal populations displayed a hierarchical population structure. We identified two genetically distinct populations at the regional level and nine distinct subpopulations at the local level, with each cluster corresponding to distinct land-use types separated by various dispersal barriers. Migration, estimated using Bayesian multilocus genotyping, between reserves and farms was asymmetric, and heterogeneous anthropogenic mortality induced source-sink dynamics via compensatory immigration. Additionally, some heavily hunted populations acted as source populations, exporting individuals to other heavily hunted populations. This indicates that heterogeneous anthropogenic mortality results in the formation of a complex series of interconnected sources and sinks. Thus, lethal management of mesopredators may not be an effective long-term strategy in reducing livestock predation, as dispersal and, more importantly, compensatory immigration may continue to affect population reduction efforts as long as dispersal from other areas persists.

  18. Effects of dendritic load on the firing frequency of oscillating neurons.

    PubMed

    Schwemmer, Michael A; Lewis, Timothy J

    2011-03-01

    We study the effects of passive dendritic properties on the dynamics of neuronal oscillators. We find that the addition of a passive dendrite can sometimes have counterintuitive effects on firing frequency. Specifically, the addition of a hyperpolarized passive dendritic load can either increase, decrease, or have negligible effects on firing frequency. We use the theory of weak coupling to derive phase equations for "ball-and-stick" model neurons and two-compartment model neurons. We then develop a framework for understanding how the addition of passive dendrites modulates the frequency of neuronal oscillators. We show that the average value of the neuronal oscillator's phase response curve measures the sensitivity of the neuron's firing rate to the dendritic load, including whether the addition of the dendrite causes an increase or decrease in firing frequency. We interpret this finding in terms of the slope of the neuronal oscillator's frequency-applied current curve. We also show that equivalent results exist for constant and noisy point-source input to the dendrite. We note that the results are not specific to neurons but are applicable to any oscillator subject to a passive load.
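
    The role of the phase response curve (PRC) average can be checked with a phase-reduced sketch: for dphi/dt = w + eps*I*Z(phi), a weak constant input I shifts the mean firing frequency by approximately eps*I times the mean of Z. The PRC below is illustrative, not derived from the ball-and-stick model.

```python
import math

# Illustrative PRC with positive mean: a hyperpolarizing load (I < 0)
# should then slow firing, by roughly eps*I*mean(Z) to first order.
def Z(phi):
    return 1.0 + math.sin(phi)        # mean over a cycle is 1.0

w, eps, I = 1.0, 0.05, -1.0           # intrinsic frequency, weak load
phi, dt, T = 0.0, 1e-3, 2000.0
for _ in range(int(T / dt)):          # forward-Euler phase integration
    phi += dt * (w + eps * I * Z(phi))

freq = phi / T                        # time-averaged firing frequency
predicted = w + eps * I * 1.0         # first-order weak-coupling prediction
print(f"simulated {freq:.4f} vs first-order prediction {predicted:.4f}")
```

    A PRC with zero mean would give no net frequency shift from the same load, which is the sense in which the PRC average measures the firing rate's sensitivity.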

  19. Using a Simple Knowledge Organization System to facilitate Catalogue and Search for the ESA CCI Open Data Portal

    NASA Astrophysics Data System (ADS)

    Wilson, Antony; Bennett, Victoria; Donegan, Steve; Juckes, Martin; Kershaw, Philip; Petrie, Ruth; Stephens, Ag; Waterfall, Alison

    2016-04-01

    The ESA Climate Change Initiative (CCI) is a €75m programme that runs from 2009-2016, with a goal to provide stable, long-term, satellite-based essential climate variable (ECV) data products for climate modellers and researchers. As part of the CCI, ESA have funded the Open Data Portal project to establish a central repository to bring together the data from these multiple sources and make it available in a consistent way, in order to maximise its dissemination amongst the international user community. Search capabilities are a critical component to attaining this goal. To this end, the project is providing dataset-level metadata in the form of ISO 19115 records served via a standard OGC CSW interface. In addition, the Open Data Portal is re-using the search system from the Earth System Grid Federation (ESGF), successfully applied to support CMIP5 (5th Coupled Model Intercomparison Project) and obs4MIPs. This uses a tightly defined controlled vocabulary of metadata terms, the DRS (Data Reference Syntax), which encompasses different aspects of the data. This system has facilitated the construction of a powerful faceted search interface that enables users to discover data at the individual file level of granularity through ESGF's web portal frontend. The use of a consistent set of model experiments for CMIP5 allowed the definition of a uniform DRS for all model data served from ESGF. For CCI, however, there are thirteen ECVs, each of which is derived from multiple sources and different science communities, resulting in highly heterogeneous metadata. An analysis has been undertaken of the concepts in use, with the aim of producing a CCI DRS that could provide a single authoritative source for cataloguing and searching the CCI data for the Open Data Portal.
The use of SKOS (Simple Knowledge Organization System) and OWL (Web Ontology Language) to represent the DRS is a natural fit, providing controlled vocabularies as well as a way to represent relationships between similar terms used in different ECVs. An iterative approach has been adopted for the model development, working closely with domain experts and drawing on practical experience with content in the input datasets. Tooling has been developed to enable the definition of vocabulary terms via a simple spreadsheet format, which can then be automatically converted into Turtle notation and uploaded to the CCI DRS vocabulary service. With a baseline model established, work is underway to develop an ingestion pipeline to import validated search metadata into the ESGF and OGC CSW search services. In addition to the search terms indexed into the ESGF search system, ISO 19115 records will also be tagged during this process with search terms from the data model. In this way it will be possible to construct a faceted search user interface for the Portal that can yield linked search results for data at both the file and dataset levels of granularity. It is hoped that this will also provide a rich range of content for third-party organisations wishing to incorporate access to CCI data in their own applications and services.
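The spreadsheet-to-Turtle conversion described in this record can be illustrated with a minimal, stdlib-only sketch; the namespace, row format, and vocabulary terms below are hypothetical illustrations, not the actual CCI DRS content or the project's tooling:

```python
# Hypothetical sketch: turn simple vocabulary rows, as might be exported from
# a spreadsheet, into SKOS concepts in Turtle notation.

def rows_to_turtle(base, rows):
    """rows: iterable of (term, english_label, broader_term_or_None)."""
    lines = ["@prefix skos: <http://www.w3.org/2004/02/skos/core#> .", ""]
    for term, label, broader in rows:
        lines.append(f"<{base}{term}> a skos:Concept ;")
        if broader:
            # Link narrower concept to its parent term in the same vocabulary.
            lines.append(f'    skos:prefLabel "{label}"@en ;')
            lines.append(f"    skos:broader <{base}{broader}> .")
        else:
            lines.append(f'    skos:prefLabel "{label}"@en .')
    return "\n".join(lines)

# Example rows loosely modelled on ECV-style terms (illustrative only).
rows = [
    ("seaIce", "sea ice", None),
    ("seaIceThickness", "sea ice thickness", "seaIce"),
]
print(rows_to_turtle("http://example.org/cci/vocab/", rows))
```

A real pipeline would additionally validate terms against the baseline model before upload; this sketch only shows the mechanical row-to-triples step.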

  20. GX 3+1: The Stability of Spectral Index as a Function of Mass Accretion Rate

    NASA Technical Reports Server (NTRS)

    Seifina, Elena; Titarchuk, Lev

    2012-01-01

    We present an analysis of the spectral and timing properties observed in X-rays from neutron star (NS) binary GX 3+1 (4U 1744-26) during long-term transitions between the faint and bright phases superimposed on short-term transitions between lower banana (LB) and upper banana (UB) branches in terms of its color-color diagram. We analyze all observations of this source obtained with the Rossi X-ray Timing Explorer and BeppoSAX satellites. We find that the X-ray broadband energy spectra during these spectral transitions can be adequately reproduced by a composition of a low-temperature blackbody component, a Comptonized component (COMPTB), and a Gaussian component. We argue that the electron temperature kTe of the Compton cloud monotonically increases from 2.3 keV to 4.5 keV, when GX 3+1 makes a transition from UB to LB. We also detect an evolution of noise components (a very low frequency noise and a high-frequency noise) during these LB-UB transitions. Using a disk seed photon normalization of COMPTB, which is proportional to the mass accretion rate, we find that the photon power-law index Gamma is almost constant (Gamma = 2.00 +/- 0.02) when mass accretion rate changes by a factor of four. In addition, we find that the emergent spectrum is dominated by the strong Comptonized component. We interpret this quasi-stability of the index Gamma and a particular form of the spectrum in the framework of a model in which the energy release in the transition layer located between the accretion disk and NS surface dominates that in the disk. Moreover, this index stability effect now established for GX 3+1 was previously found in the atoll source 4U 1728-34 and suggested for a number of other low-mass X-ray NS binaries. 
This intrinsic behavior of NSs, in particular for atoll sources, is fundamentally different from that seen in black hole binary sources where the index monotonically increases during spectral transition from the low state to the high state and then finally saturates at high values of mass accretion rate.

  1. GX 3+1: The Stability of Spectral Index as a Function of Mass Accretion Rate

    NASA Astrophysics Data System (ADS)

    Seifina, Elena; Titarchuk, Lev

    2012-03-01

    We present an analysis of the spectral and timing properties observed in X-rays from neutron star (NS) binary GX 3+1 (4U 1744-26) during long-term transitions between the faint and bright phases superimposed on short-term transitions between lower banana (LB) and upper banana (UB) branches in terms of its color-color diagram. We analyze all observations of this source obtained with the Rossi X-ray Timing Explorer and Beppo SAX satellites. We find that the X-ray broadband energy spectra during these spectral transitions can be adequately reproduced by a composition of a low-temperature blackbody component, a Comptonized component (COMPTB), and a Gaussian component. We argue that the electron temperature kTe of the Compton cloud monotonically increases from 2.3 keV to 4.5 keV, when GX 3+1 makes a transition from UB to LB. We also detect an evolution of noise components (a very low frequency noise and a high-frequency noise) during these LB-UB transitions. Using a disk seed photon normalization of COMPTB, which is proportional to the mass accretion rate, we find that the photon power-law index Γ is almost constant (Γ = 2.00 ± 0.02) when mass accretion rate changes by a factor of four. In addition, we find that the emergent spectrum is dominated by the strong Comptonized component. We interpret this quasi-stability of the index Γ and a particular form of the spectrum in the framework of a model in which the energy release in the transition layer located between the accretion disk and NS surface dominates that in the disk. Moreover, this index stability effect now established for GX 3+1 was previously found in the atoll source 4U 1728-34 and suggested for a number of other low-mass X-ray NS binaries (see Farinelli & Titarchuk). 
This intrinsic behavior of NSs, in particular for atoll sources, is fundamentally different from that seen in black hole binary sources where the index monotonically increases during spectral transition from the low state to the high state and then finally saturates at high values of mass accretion rate.

  2. Isotopic composition and neutronics of the Okelobondo natural reactor

    NASA Astrophysics Data System (ADS)

    Palenik, Christopher Samuel

    The Oklo-Okelobondo and Bangombe uranium deposits in Gabon, Africa, host Earth's only known natural nuclear fission reactors. These 2-billion-year-old reactors represent a unique opportunity to study used nuclear fuel over geologic periods of time. The reactors in these deposits have been studied as a means to constrain the source term of fission product concentrations produced during reactor operation. The source term depends on the neutronic parameters, which include reactor operation duration, neutron flux, and the neutron energy spectrum. Reactor operation has been modeled using a point-source computer simulation (the Oak Ridge Isotope Generation and Depletion, ORIGEN, code) for a light water reactor. Model results have been constrained using secondary ion mass spectrometry (SIMS) isotopic measurements of the fission products Nd and Te, as well as U, in uraninite from samples collected in the Okelobondo reactor zone. Based upon the constraints on the operating conditions, the pre-reactor concentrations of Nd (150 ppm +/- 75 ppm) and Te (<1 ppm) in uraninite were estimated. Based on the burnup measured in Okelobondo samples (0.7 to 13.8 GWd/MTU), the final fission product inventories of Nd (90 to 1200 ppm) and Te (10 to 110 ppm) were calculated. By the same means, the ranges of all other fission products and actinides produced during reactor operation were calculated as a function of burnup. These results provide a source term against which the present elemental and decay abundances at the fission reactor can be compared. Furthermore, they provide new insights into the extent to which a "fossil" nuclear reactor can be characterized on the basis of its isotopic signatures. In addition, results from the study of two other natural systems related to radionuclide and fission product transport are included. 
A detailed mineralogical characterization of the uranyl mineralogy at the Bangombe uranium deposit in Gabon, Africa was completed to improve geochemical models of the solubility-limiting phase. A study of the competing effects of radiation damage and annealing in a U-bearing crystal of zircon shows that low temperature annealing in actinide-bearing phases is significant in the annealing of radiation damage.

  3. Tracing changes in soil N transformations to explain the doubling of N2O emissions under elevated CO2 in the Giessen FACE

    NASA Astrophysics Data System (ADS)

    Moser, Gerald; Brenzinger, Kristof; Gorenflo, Andre; Clough, Tim; Braker, Gesche; Müller, Christoph

    2017-04-01

    To reduce the emissions of greenhouse gases (CO2, CH4 & N2O) it is important to quantify the main sources and identify the respective ecosystem processes. While the main sources of N2O emissions in agro-ecosystems under current conditions are well known, the influence of a projected higher level of CO2 on the main ecosystem processes responsible for N2O emissions has not been investigated in detail. A major result of the Giessen FACE in a managed temperate grassland was that a +20% CO2 level caused a positive feedback, increasing N2O emissions to 221% of those under control conditions. To trace the sources of the additional N2O emissions, a 15N tracing study was conducted. We measured the N2O emission and its 15N signature, together with the 15N signature of soil and plant samples. The results were analyzed using a 15N tracing model which quantified the main changes in N transformation rates under elevated CO2. Directly after 15N fertilizer application, N transformation dynamics were much higher than over the long run. Absolute mineralisation and DNRA rates were lower under elevated CO2 in the short term but higher in the long term. During the one-year study period beginning with the 15N labelling, a 1.8-fold increase of N2O emissions occurred under elevated CO2. The source of the increased N2O was associated with NO3- in the first weeks after 15N application. Elevated CO2 affected denitrification rates, which resulted in increased N2O emissions due to a change of gene transcription rates (nosZ/(nirK+nirS)) and the resulting enzyme activity (see: Brenzinger et al.). Here we show that the enhanced N2O emissions reported for the first 8 FACE years prevail even in the long term (>15 years). The effect of elevated CO2 on N2O production/emission can be explained by altered activity ratios within a stable microbial community.

  4. Adaptation

    ERIC Educational Resources Information Center

    Littlejohn, Emily

    2018-01-01

    "Adaptation" originally began as a scientific term, but from 1860 to today it most often refers to an altered version of a text, film, or other literary source. When this term was first analyzed, humanities scholars often measured adaptations against their source texts, frequently privileging "original" texts. However, this…

  5. 40 CFR 401.11 - General definitions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Environmental Protection Agency. (d) The term point source means any discernible, confined and discrete conveyance, including but not limited to any pipe, ditch, channel, tunnel, conduit, well, discrete fissure... which pollutants are or may be discharged. (e) The term new source means any building, structure...

  6. Feasibility of co-composting of sewage sludge, spent mushroom substrate and wheat straw.

    PubMed

    Meng, Liqiang; Li, Weiguang; Zhang, Shumei; Wu, Chuandong; Lv, Longyi

    2017-02-01

    In this study, the lab-scale co-composting of sewage sludge (SS) with spent mushroom substrate (SMS) and wheat straw (WS), conducted for 20 days, was evaluated. The addition of SMS evidently increased CO2 production and dehydrogenase activity. The combined addition of SMS and WS significantly improved the compost quality in terms of temperature, organic matter degradation, and germination index and, in particular, reduced NH3 emission by 21.9%. This is because SMS and WS were complementary in terms of free air space and contained plenty of degradable carbon sources. The SMS could create a comfortable environment for the nitrifying bacteria and improve nitrification. The carbohydrates from the combined addition of SMS and WS could be utilized by thermophilic microorganisms, stimulate ammonia assimilation, and reduce NH3 emission. These results suggested that adding SMS and WS could not only improve the degradation of organic matter and the quality of the compost product, but also stimulate ammonia assimilation and reduce ammonia emission. Copyright © 2016. Published by Elsevier Ltd.

  7. What Is eHealth (3): A Systematic Review of Published Definitions

    PubMed Central

    Oh, Hans; Rizo, Carlos; Enkin, Murray

    2005-01-01

    Context The term eHealth is widely used by many individuals, academic institutions, professional bodies, and funding organizations. It has become an accepted neologism despite the lack of an agreed-upon clear or precise definition. We believe that communication among the many individuals and organizations that use the term could be improved by comprehensive data about the range of meanings encompassed by the term. Objective To report the results of a systematic review of published, suggested, or proposed definitions of eHealth. Data Sources Using the search query string “eHealth” OR “e-Health” OR “electronic health”, we searched the following databases: Medline and Premedline (1966-June 2004), EMBASE (1980-May 2004), International Pharmaceutical Abstracts (1970-May 2004), Web of Science (all years), Information Sciences Abstracts (1966-May 2004), Library Information Sciences Abstracts (1969-May 2004), and Wilson Business Abstracts (1982-March 2004). In addition, we searched dictionaries and an Internet search engine. Study Selection We included any source published in either print format or on the Internet, available in English, and containing text that defines or attempts to define eHealth in explicit terms. Two of us independently reviewed titles and abstracts of citations identified in the bibliographic databases and Internet search, reaching consensus on relevance by discussion. Data Extraction We retrieved relevant reports, articles, references, letters, and websites containing definitions of eHealth. Two of us qualitatively analyzed the definitions and coded them for content, emerging themes, patterns, and novel ideas. Data Synthesis The 51 unique definitions that we retrieved showed a wide range of themes, but no clear consensus about the meaning of the term eHealth. We identified 2 universal themes (health and technology) and 6 less general (commerce, activities, stakeholders, outcomes, place, and perspectives). 
Conclusions The widespread use of the term eHealth suggests that it is an important concept, and that there is a tacit understanding of its meaning. This compendium of proposed definitions may improve communication among the many individuals and organizations that use the term. PMID:15829471

  8. A Systematic Review of Chronic Fatigue Syndrome: Don't Assume It's Depression

    PubMed Central

    Griffith, James P.; Zarrouf, Fahd A.

    2008-01-01

    Objective: Chronic fatigue syndrome (CFS) is characterized by profound, debilitating fatigue and a combination of several other symptoms resulting in substantial reduction in occupational, personal, social, and educational status. CFS is often misdiagnosed as depression. The objective of this study was to evaluate and discuss different etiologies, approaches, and management strategies of CFS and to present ways to differentiate it from the fatigue symptom of depression. Data Sources: A MEDLINE search was conducted to identify existing information about CFS and depression using the headings chronic fatigue syndrome AND depression. The alternative terms major depressive disorder and mood disorder were also searched in conjunction with the term chronic fatigue syndrome. Additionally, MEDLINE was searched using the term chronic fatigue. All searches were limited to articles published within the last 10 years, in English. A total of 302 articles were identified by these searches. Also, the term chronic fatigue syndrome was searched by itself. This search was limited to articles published within the last 5 years, in English, and resulted in an additional 460 articles. Additional publications were identified by manually searching the reference lists of the articles from both searches. Study Selection and Data Extraction: CFS definitions, etiologies, differential diagnoses (especially depression) and management strategies were extracted, reviewed, and summarized to meet the objectives of this article. Data Synthesis: CFS is underdiagnosed in more than 80% of the people who have it; at the same time, it is often misdiagnosed as depression. Genetic, immunologic, infectious, metabolic, and neurologic etiologies were suggested to explain CFS. A biopsychosocial model was suggested for evaluating, managing, and differentiating CFS from depression. 
Conclusions: Evaluating and managing chronic fatigue is a challenging situation for physicians, as it is a challenging and difficult condition for patients. A biopsychosocial approach in the evaluation and management is recommended. More studies about CFS manifestations, evaluation, and management are needed. PMID:18458765

  9. Possible Dual Earthquake-Landslide Source of the 13 November 2016 Kaikoura, New Zealand Tsunami

    NASA Astrophysics Data System (ADS)

    Heidarzadeh, Mohammad; Satake, Kenji

    2017-10-01

    An earthquake (Mw 7.8) with a complicated rupture mechanism occurred off the NE coast of South Island, New Zealand, on 13 November 2016 (UTC) in a complex tectonic setting comprising a transition strike-slip zone between two subduction zones. The earthquake generated a moderate tsunami with a zero-to-crest amplitude of 257 cm at the near-field tide gauge station of Kaikoura. Spectral analysis of the tsunami observations showed dual peaks at 3.6-5.7 and 5.7-56 min, which we attribute to the potential landslide and earthquake sources of the tsunami, respectively. Tsunami simulations showed that a source model with slip on an offshore plate-interface fault reproduces the near-field tsunami observation in terms of amplitude, but fails in terms of tsunami period. On the other hand, a source model without offshore slip fails to reproduce the first peak, but the later phases are reproduced well in terms of both amplitude and period. It can be inferred that an offshore source must be involved, but it needs to be smaller in size than the plate-interface slip, which most likely points to a confined submarine landslide source, consistent with the dual-peak tsunami spectrum. We estimated the dimension of the potential submarine landslide at 8-10 km.
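The spectral analysis described in this record rests on a standard discrete Fourier transform of the tide-gauge record; a minimal, stdlib-only sketch on a synthetic two-period signal (illustrative only, not the Kaikoura data or the authors' code) is:

```python
import math

def dominant_period(x, dt):
    """Return the period (in units of dt) of the largest-magnitude DFT bin."""
    n = len(x)
    best_k, best_mag = 1, 0.0
    for k in range(1, n // 2):  # skip the mean (k = 0)
        re = sum(x[i] * math.cos(2 * math.pi * k * i / n) for i in range(n))
        im = sum(x[i] * math.sin(2 * math.pi * k * i / n) for i in range(n))
        mag = math.hypot(re, im)
        if mag > best_mag:
            best_k, best_mag = k, mag
    return n * dt / best_k  # period = record duration / number of cycles

# Synthetic "tide gauge" record: a 20-min wave plus a weaker 5-min wave,
# sampled once per minute for 200 minutes.
dt = 1.0
signal = [math.sin(2 * math.pi * t / 20) + 0.4 * math.sin(2 * math.pi * t / 5)
          for t in range(200)]
```

Here `dominant_period(signal, dt)` recovers the 20-min component; in practice one would inspect the full spectrum (both peaks), window the data, and use an FFT rather than this O(n²) loop.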

  10. Microbially-Enhanced Coal Bed Methane: Strategies for Increased Biogenic Production

    NASA Astrophysics Data System (ADS)

    Davis, K.; Barhart, E. P.; Schweitzer, H. D.; Cunningham, A. B.; Gerlach, R.; Hiebert, R.; Fields, M. W.

    2014-12-01

    Coal is the largest fossil fuel resource in the United States. Most of this coal is deep in the subsurface making it costly and potentially dangerous to extract. However, in many of these deep coal seams, methane, the main component of natural gas, has been discovered and successfully harvested. Coal bed methane (CBM) currently accounts for approximately 7.5% of the natural gas produced in the U.S. Combustion of natural gas produces substantially less CO2 and toxic emissions (e.g. heavy metals) than combustion of coal or oil thereby making it a cleaner energy source. In the large coal seams of the Powder River Basin (PRB) in southeast Montana and northeast Wyoming, CBM is produced almost entirely by biogenic processes. The in situ conversion of coal to CBM by the native microbial community is of particular interest for present and future natural gas sources as it provides the potential to harvest energy from coal seams with lesser environmental impacts than mining and burning coal. Research at Montana State University has shown the potential for enhancing the subsurface microbial processes that produce CBM. Long-term batch enrichments have investigated the methane enhancement potential of yeast extract as well as algal and cyanobacterial biomass additions with increased methane production observed with all three additions when compared to no addition. Future work includes quantification of CBM enhancement and normalization of additions. This presentation addresses the options thus far investigated for increasing CBM production and the next steps for developing the enhanced in situ conversion of coal to CBM.

  11. Tropical waves and the quasi-biennial oscillation in the lower stratosphere

    NASA Technical Reports Server (NTRS)

    Miller, A. J.; Angell, J. K.; Korshover, J.

    1976-01-01

    By means of spectrum analysis of 11 years of lower stratospheric daily winds and temperatures at Balboa, Ascension and Canton-Singapore, evidence is presented supporting the existence of two principal wave modes with periods of about 11-17 days (Kelvin waves) and about 4-5 days (mixed Rossby-gravity waves). The structure of the two wave modes, as well as the vertical eddy momentum flux by the waves, is shown to be related to the quasi-biennial cycle, although for the mixed Rossby-gravity waves this is obvious only at Ascension. In addition, the Coriolis term, suggested as a source of vertical easterly momentum flux for the mixed Rossby-gravity waves, is investigated and found to be of the same magnitude as the vertical eddy flux term. Finally, we have examined the mean meridional motion and the meridional eddy momentum flux for their possible association with the quasi-biennial variation.

  12. Alternative pharmacological strategies for adult ADHD treatment: a systematic review.

    PubMed

    Buoli, Massimiliano; Serati, Marta; Cahn, Wiepke

    2016-01-01

    Adult Attention Deficit Hyperactivity Disorder (ADHD) is a prevalent psychiatric condition associated with high disability and frequent comorbidity. Current standard pharmacotherapy (methylphenidate and atomoxetine) improves ADHD symptoms in the short term, but few data have been published about long-term treatment. In addition, a number of patients show partial or no response to methylphenidate and atomoxetine. A search of the main database sources was conducted to obtain an overview of alternative pharmacological approaches in adult ADHD patients. Among alternative compounds, amphetamines (mixed amphetamine salts and lisdexamfetamine) have the most robust evidence of efficacy, but they may be associated with serious side effects (e.g. psychotic symptoms or hypertension). Antidepressants, particularly those acting as noradrenaline or dopamine enhancers, have evidence of efficacy, but they should be avoided in patients with comorbid bipolar disorder. Finally, metadoxine and lithium may be particularly suitable in cases of comorbid alcohol misuse or bipolar disorder.

  13. Cosmological implications of scalar field dark energy models in f(T,𝒯 ) gravity

    NASA Astrophysics Data System (ADS)

    Salako, Ines G.; Jawad, Abdul; Moradpour, Hooman

    After reviewing f(T,𝒯) gravity, in which T is the torsion scalar and 𝒯 is the trace of the energy-momentum tensor, we refer to two cosmological models of this theory in agreement with observational data. Thereafter, we consider a flat Friedmann-Robertson-Walker (FRW) universe filled by a pressureless source and regard the terms other than the Einstein terms in the corresponding Friedmann equations as the dark energy (DE) candidate. In addition, some cosmological features of the models, including equations of state and deceleration parameters, are addressed, helping us obtain the accelerated expansion of the universe in the quintessence era. Finally, we extract the scalar field as well as the potential of quintessence, tachyon, K-essence and dilatonic fields for both f(T,𝒯) models. It is observed that the dynamics of the scalar field as well as the scalar potential of these models indicate an accelerated expanding universe in these models.

  14. Assessment of Technologies for the Space Shuttle External Tank Thermal Protection System and Recommendations for Technology Improvement - Part III: Material Property Characterization, Analysis, and Test Methods

    NASA Technical Reports Server (NTRS)

    Gates, Thomas S.; Johnson, Theodore F.; Whitley, Karen S.

    2005-01-01

    The objective of this report is to contribute to the independent assessment of the Space Shuttle External Tank Foam Material. This report specifically addresses material modeling, characterization testing, data reduction methods, and data pedigree. A brief description of the External Tank foam materials, locations, and standard failure modes is provided to develop suitable background information. A review of mechanics based analysis methods from the open literature is used to provide an assessment of the state-of-the-art in material modeling of closed cell foams. Further, this report assesses the existing material property database and investigates sources of material property variability. The report presents identified deficiencies in testing methods and procedures, recommendations for additional testing as required, identification of near-term improvements that should be pursued, and long-term capabilities or enhancements that should be developed.

  15. Tissue engineering and regenerative medicine: recent innovations and the transition to translation.

    PubMed

    Fisher, Matthew B; Mauck, Robert L

    2013-02-01

    The field of tissue engineering and regenerative medicine (TERM) has exploded in the last decade. In this Year (or so) in Review, we highlight some of the high impact advances within the field over the past several years. Using the past as our guide and starting with an objective premise, we attempt to identify recent "hot topics" and transformative publications within the field. Through this process, several key themes emerged: (1) tissue engineering: grafts and materials, (2) regenerative medicine: scaffolds and factors that control endogenous tissue formation, (3) clinical trials, and (4) novel cell sources: induced pluripotent stem cells. Within these focus areas, we summarize the highly impactful articles that emerged from our objective analysis and review additional recent publications to augment and expand upon these key themes. Finally, we discuss where the TERM field may be headed and how to monitor such a broad-based and ever-expanding community.

  16. [Organisational responsibility versus individual responsibility: safety culture? About the relationship between patient safety and medical malpractice law].

    PubMed

    Hart, Dieter

    2009-01-01

    The contribution is concerned with the correlations between risk information, patient safety, responsibility and liability, in particular in terms of liability law. These correlations have an impact on safety culture in healthcare, which can be evaluated positively if--in addition to good quality of medical care--as many sources of error as possible can be identified, analysed, and minimised or eliminated by corresponding measures (safety or risk management). Liability influences the conduct of individuals and enterprises; safety is (probably) also a function of liability; this should also apply to safety culture. The standard of safety culture does not only depend on individual liability for damages, but first of all on strict enterprise liability (system responsibility) and its preventive effects. Patient safety through quality and risk management is therefore also an organisational programme of considerable relevance in terms of liability law.

  17. Chaos in the sunspot cycle - Analysis and prediction

    NASA Technical Reports Server (NTRS)

    Mundt, Michael D.; Maguire, W. Bruce, II; Chase, Robert R. P.

    1991-01-01

    The variability of solar activity over long time scales, given semiquantitatively by measurements of sunspot numbers, is examined as a nonlinear dynamical system. First, the data set used and the techniques employed to reduce the noise and capture the long-term dynamics inherent in the data are discussed. Subsequently, an attractor is reconstructed from the data set using the method of time delays. The reconstructed attractor is then used to determine both the dimension of the underlying system and the largest Lyapunov exponent, which together indicate that the sunspot cycle is indeed chaotic and low dimensional. In addition, recent techniques that exploit chaotic dynamics to provide accurate short-term predictions are utilized in order to improve upon current forecasting methods and to place theoretical limits on the extent of predictability. The results are compared to chaotic solar-dynamo models as a possible physically motivated source of this chaotic behavior.
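The method of time delays used in this record to reconstruct an attractor from a scalar series can be sketched in a few lines; the embedding dimension, lag, and demo series below are illustrative choices, not those of the sunspot study:

```python
def delay_embed(x, dim, tau):
    """Build delay vectors (x[i], x[i+tau], ..., x[i+(dim-1)*tau]) from a
    scalar time series x, per the method of time delays."""
    last = len(x) - (dim - 1) * tau
    return [tuple(x[i + j * tau] for j in range(dim)) for i in range(last)]

# Demo on a chaotic scalar series: the logistic map at r = 4.
v, series = 0.3, []
for _ in range(500):
    v = 4.0 * v * (1.0 - v)
    series.append(v)

# 3-D reconstructed "attractor" from the 1-D series.
vectors = delay_embed(series, dim=3, tau=2)
```

Dimension and Lyapunov estimates would then be computed on `vectors` (e.g. via correlation-dimension or nearest-neighbour divergence methods, not shown here).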

  18. Tissue Engineering and Regenerative Medicine: Recent Innovations and the Transition to Translation

    PubMed Central

    Fisher, Matthew B.

    2013-01-01

    The field of tissue engineering and regenerative medicine (TERM) has exploded in the last decade. In this Year (or so) in Review, we highlight some of the high impact advances within the field over the past several years. Using the past as our guide and starting with an objective premise, we attempt to identify recent “hot topics” and transformative publications within the field. Through this process, several key themes emerged: (1) tissue engineering: grafts and materials, (2) regenerative medicine: scaffolds and factors that control endogenous tissue formation, (3) clinical trials, and (4) novel cell sources: induced pluripotent stem cells. Within these focus areas, we summarize the highly impactful articles that emerged from our objective analysis and review additional recent publications to augment and expand upon these key themes. Finally, we discuss where the TERM field may be headed and how to monitor such a broad-based and ever-expanding community. PMID:23253031

  19. Green-light supplementation for enhanced lettuce growth under red- and blue-light-emitting diodes

    NASA Technical Reports Server (NTRS)

    Kim, Hyeon-Hye; Goins, Gregory D.; Wheeler, Raymond M.; Sager, John C.

    2004-01-01

    Plants will be an important component of future long-term space missions. Lighting systems for growing plants will need to be lightweight, reliable, and durable, and light-emitting diodes (LEDs) have these characteristics. Previous studies demonstrated that the combination of red and blue light was an effective light source for several crops. Yet the appearance of plants under red and blue lighting is purplish gray, making visual assessment of any problems difficult. The addition of green light would make the plant leaves appear green and normal, similar to a natural setting under white light, and may also offer a psychological benefit to the crew. Green supplemental lighting could also offer benefits, since green light can better penetrate the plant canopy and potentially increase plant growth by increasing photosynthesis from the leaves in the lower canopy. In this study, four light sources were tested: 1) red and blue LEDs (RB), 2) red and blue LEDs with green fluorescent lamps (RGB), 3) green fluorescent lamps (GF), and 4) cool-white fluorescent lamps (CWF), which provided 0%, 24%, 86%, and 51% of the total PPF in the green region of the spectrum, respectively. The addition of 24% green light (500 to 600 nm) to red and blue LEDs (RGB treatment) enhanced plant growth. The RGB treatment plants produced more biomass than the plants grown under the cool-white fluorescent lamps (CWF treatment), a commonly tested light source used as a broad-spectrum control.

  20. A Long Decay of X-Ray Flux and Spectral Evolution in the Supersoft Active Galactic Nucleus GSN 069

    NASA Astrophysics Data System (ADS)

    Shu, X. W.; Wang, S. S.; Dou, L. M.; Jiang, N.; Wang, J. X.; Wang, T. G.

    2018-04-01

    GSN 069 is an optically identified very low-mass active galactic nucleus (AGN) that shows supersoft X-ray emission. The source is known to exhibit a huge X-ray outburst, with flux increased by more than a factor of ∼240 compared to the quiescent state. We report its long-term evolution in X-ray flux and spectral variations over a timescale of ∼a decade, using both new and archival X-ray observations from XMM-Newton and Swift. The new Swift observations detected the source in its lowest level of X-ray activity since the outburst, a factor of ∼4 lower in the 0.2–2 keV flux than that obtained with the XMM-Newton observations nearly eight years earlier. Combined with the historical X-ray measurements, we find that the X-ray flux is decreasing slowly. Spectral softening appears to be associated with the drop in X-ray flux. In addition, we find evidence for the presence of a weak, variable, hard X-ray component, in addition to the dominant thermal blackbody emission reported before. The long decay of X-ray flux and spectral evolution, as well as the supersoft X-ray spectra, suggest that the source could be a tidal disruption event (TDE), though a highly variable AGN cannot be fully ruled out. Further continued X-ray monitoring would be required to test the TDE interpretation, by better determining the flux evolution in the decay phase.

  1. Two Outbreak Sources of Influenza A (H7N9) Viruses Have Been Established in China.

    PubMed

    Wang, Dayan; Yang, Lei; Zhu, Wenfei; Zhang, Ye; Zou, Shumei; Bo, Hong; Gao, Rongbao; Dong, Jie; Huang, Weijuan; Guo, Junfeng; Li, Zi; Zhao, Xiang; Li, Xiaodan; Xin, Li; Zhou, Jianfang; Chen, Tao; Dong, Libo; Wei, Hejiang; Li, Xiyan; Liu, Liqi; Tang, Jing; Lan, Yu; Yang, Jing; Shu, Yuelong

    2016-06-15

    Due to enzootic infections in poultry and persistent human infections in China, influenza A (H7N9) virus has remained a public health threat. The Yangtze River Delta region, which is located in eastern China, is well recognized as the original source for H7N9 outbreaks. Based on the evolutionary analysis of H7N9 viruses from all three outbreak waves since 2013, we identified the Pearl River Delta region as an additional H7N9 outbreak source. H7N9 viruses are repeatedly introduced from these two sources to the other areas, and the persistent circulation of H7N9 viruses occurs in poultry, causing continuous outbreak waves. Poultry movements may contribute to the geographic expansion of the virus. In addition, the AnH1 genotype, which was predominant during wave 1, was replaced by the JS537, JS18828, and AnH1887 genotypes during waves 2 and 3. The establishment of a new source and the continuous evolution of the virus hamper the elimination of H7N9 viruses, thus posing a long-term threat of H7N9 infection in humans. Therefore, both surveillance of H7N9 viruses in humans and poultry and supervision of poultry movements should be strengthened. Since their occurrence in humans in eastern China in spring 2013, the avian H7N9 viruses have demonstrated the continuing pandemic threat posed by the current influenza ecosystem in China. As the viruses circulate silently in poultry, with potentially severe outcomes in humans, it is very important to understand H7N9 virus activity in humans in China. In this study, we identified a newly emerged H7N9 outbreak source in the Pearl River Delta region. Both sources, in the Yangtze River Delta region and the Pearl River Delta region, have been established and found to be responsible for the H7N9 outbreaks in mainland China. Copyright © 2016, American Society for Microbiology. All Rights Reserved.

  2. Two Outbreak Sources of Influenza A (H7N9) Viruses Have Been Established in China

    PubMed Central

    Wang, Dayan; Yang, Lei; Zhu, Wenfei; Zhang, Ye; Zou, Shumei; Bo, Hong; Gao, Rongbao; Dong, Jie; Huang, Weijuan; Guo, Junfeng; Li, Zi; Zhao, Xiang; Li, Xiaodan; Xin, Li; Zhou, Jianfang; Chen, Tao; Dong, Libo; Wei, Hejiang; Li, Xiyan; Liu, Liqi; Tang, Jing; Lan, Yu; Yang, Jing

    2016-01-01

    ABSTRACT Due to enzootic infections in poultry and persistent human infections in China, influenza A (H7N9) virus has remained a public health threat. The Yangtze River Delta region, which is located in eastern China, is well recognized as the original source for H7N9 outbreaks. Based on the evolutionary analysis of H7N9 viruses from all three outbreak waves since 2013, we identified the Pearl River Delta region as an additional H7N9 outbreak source. H7N9 viruses are repeatedly introduced from these two sources to the other areas, and the persistent circulation of H7N9 viruses occurs in poultry, causing continuous outbreak waves. Poultry movements may contribute to the geographic expansion of the virus. In addition, the AnH1 genotype, which was predominant during wave 1, was replaced by the JS537, JS18828, and AnH1887 genotypes during waves 2 and 3. The establishment of a new source and the continuous evolution of the virus hamper the elimination of H7N9 viruses, thus posing a long-term threat of H7N9 infection in humans. Therefore, both surveillance of H7N9 viruses in humans and poultry and supervision of poultry movements should be strengthened. IMPORTANCE Since their occurrence in humans in eastern China in spring 2013, the avian H7N9 viruses have demonstrated the continuing pandemic threat posed by the current influenza ecosystem in China. As the viruses circulate silently in poultry, with potentially severe outcomes in humans, it is very important to understand H7N9 virus activity in humans in China. In this study, we identified a newly emerged H7N9 outbreak source in the Pearl River Delta region. Both sources, in the Yangtze River Delta region and the Pearl River Delta region, have been established and found to be responsible for the H7N9 outbreaks in mainland China. PMID:27030268

  3. 3D Hydrodynamics Simulation of Amazonian Seasonally Flooded Wetlands

    NASA Astrophysics Data System (ADS)

    Pinel, S. S.; Bonnet, M. P.; Da Silva, J. S.; Cavalcanti, R., Sr.; Calmant, S.

    2016-12-01

    In the lower Amazon basin, interactions between floodplains and river channels are important in terms of exchanges of water, sediments, and nutrients. These wetlands are considered hotspots of biodiversity and are among the most productive in the world. However, they are threatened by climate change and anthropogenic activities. Hence, given the implications for predicting the inundation status of floodplain habitats and the strong interactions between water circulation, energy fluxes, and biogeochemical and ecological processes, detailed analyses of flooding dynamics are both useful and needed. Numerical inundation models offer a means to study the interactions among different water sources. Modeling flood events in this area is challenging because flows respond to dynamic hydraulic controls arising from several water sources, complex geomorphology, and vegetation. In addition, because of the difficulty of access, existing hydrological data are scarce; in this context, monitoring by remote sensing is a good option. In this study, we simulated the filling and drainage processes of an Amazon floodplain (Janauacá Lake, AM, Brazil) over a six-year period (2006-2012). Common approaches to flow modeling in the Amazon region couple a 1D simulation of the main-channel flood wave to a 2D simulation of floodplain inundation. Here, our approach differs in that the floodplain is fully simulated. The model used is IPH-ECO, a three-dimensional hydrodynamic module coupled with an ecosystem module; the hydrodynamic module solves the Reynolds-averaged Navier-Stokes equations using a semi-implicit discretization.
After calibrating the simulation against roughness coefficients, we validated the model in terms of vertical accuracy against water levels (daily in situ and altimetry data), in terms of flood extent against inundation maps derived from available remote-sensing imagery (ALOS-1/PALSAR), and in terms of velocity. We analyzed the inter-annual variability in hydrological fluxes and inundation dynamics of the floodplain unit. The dominant sources of inflow varied seasonally: direct rainfall and local runoff (November to April), the Amazon River (May to August), and seepage (September to October).

  4. Aeolian controls of soil geochemistry and weathering fluxes in high-elevation ecosystems of the Rocky Mountains, Colorado

    USGS Publications Warehouse

    Lawrence, Corey R.; Reynolds, Richard L.; Kettterer, Michael E.; Neff, Jason C.

    2013-01-01

    When dust inputs are large or have persisted for long periods of time, the signature of dust additions is often apparent in soils. The influence of dust will be greatest where the geochemical composition of dust is distinct from local sources of soil parent material. In this study the influence of dust accretion on soil geochemistry is quantified for two different soils from the San Juan Mountains of southwestern Colorado, USA. At both study sites, dust is enriched in several trace elements relative to local rock, especially Cd, Cu, Pb, and Zn. Mass-balance calculations that do not explicitly account for dust inputs indicate the accumulation of some elements in soil beyond what can be explained by weathering of local rock. Most observed elemental enrichments are explained by accounting for the long-term accretion of dust, based on modern isotopic and geochemical estimates. One notable exception is Pb, which, based on mass-balance calculations and isotopic measurements, may have an additional source at one of the study sites. These results suggest that dust is a major factor influencing the development of soil in these settings and is also an important control on soil weathering fluxes. After accounting for dust inputs in mass-balance calculations, Si weathering fluxes from San Juan Mountain soils are within the range observed for other temperate systems. Comparing dust inputs with mass-balance-based flux estimates suggests dust could account for as much as 50–80% of total long-term chemical weathering fluxes. These results support the notion that dust inputs may sustain chemical weathering fluxes even in relatively young continental settings. Given the widespread input of far-traveled dust, the weathering of dust is likely an important and underappreciated aspect of the global weathering engine.
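
    The open-system mass-balance approach the abstract refers to is commonly expressed through a mass-transfer coefficient (tau), computed relative to an element assumed immobile during weathering (e.g. Zr or Ti). The sketch below illustrates the idea only; the element choices and concentrations are hypothetical, not values from the study.

```python
# Open-system mass-balance (tau) sketch: tau > 0 implies a net gain of an
# element beyond what parent-rock weathering can supply (e.g. dust input);
# tau < 0 implies net loss. All numbers are illustrative placeholders.

def tau(c_elem_soil, c_elem_parent, c_immobile_soil, c_immobile_parent):
    """Mass-transfer coefficient relative to an assumed-immobile index element."""
    return (c_elem_soil / c_elem_parent) * (c_immobile_parent / c_immobile_soil) - 1.0

# Hypothetical concentrations (ppm), with Zr taken as the immobile index:
t_pb = tau(c_elem_soil=40.0, c_elem_parent=15.0,
           c_immobile_soil=220.0, c_immobile_parent=200.0)
print(f"tau(Pb) = {t_pb:.2f}")  # positive -> enrichment requiring an external source
```

    A strongly positive tau for an element like Pb, as in this hypothetical case, is the kind of signal that motivates invoking an additional (aeolian or anthropogenic) source in the mass balance.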

  5. IMPLEMENTATION AND OPERATION OF THE REPOSITORY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marcus Milling

    2003-10-01

    The NGDRS has facilitated the transfer to the public sector of 85% of the cores, cuttings, and other data identified as available. Over 12 million linear feet of cores and cuttings, in addition to large numbers of paleontological samples, are now available for public use. To date, with industry contributions for program operations and data transfers, the NGDRS project has realized a 6.5-to-1 return on investment of Department of Energy funds. Large-scale transfers of seismic data have been evaluated, but based on the recommendation of the NGDRS steering committee, cores have been given priority because of the vast scale of the seismic data problem relative to the available funding. The rapidly changing industry conditions have required that the primary core and cuttings preservation strategy evolve as well. Additionally, the NGDRS clearinghouse is evaluating the viability of transferring seismic data covering the western shelf of the Florida Gulf Coast. AGI remains actively involved in working to realize the vision of the National Research Council's report on geoscience data preservation. GeoTrek has been ported to Linux and MySQL, ensuring a purely open-source version of the software. This effort is key to ensuring the long-term viability of the software so that it can continue basic operation regardless of specific funding levels. Work has begun on a major revision of GeoTrek, using the open-source MapServer project and its related MapScript language. This effort will address a number of key technology issues that appear to be arising for 2003, including the discontinuation of the use of Java in future Microsoft operating systems. The recent donation of BPAmoco's Houston core facility to the Texas Bureau of Economic Geology has provided substantial short-term relief of the constraints on public repository space.

  6. Aeolian controls of soil geochemistry and weathering fluxes in high-elevation ecosystems of the Rocky Mountains, Colorado

    NASA Astrophysics Data System (ADS)

    Lawrence, Corey R.; Reynolds, Richard L.; Ketterer, Michael E.; Neff, Jason C.

    2013-04-01

    When dust inputs are large or have persisted for long periods of time, the signature of dust additions is often apparent in soils. The influence of dust will be greatest where the geochemical composition of dust is distinct from local sources of soil parent material. In this study the influence of dust accretion on soil geochemistry is quantified for two different soils from the San Juan Mountains of southwestern Colorado, USA. At both study sites, dust is enriched in several trace elements relative to local rock, especially Cd, Cu, Pb, and Zn. Mass-balance calculations that do not explicitly account for dust inputs indicate the accumulation of some elements in soil beyond what can be explained by weathering of local rock. Most observed elemental enrichments are explained by accounting for the long-term accretion of dust, based on modern isotopic and geochemical estimates. One notable exception is Pb, which, based on mass-balance calculations and isotopic measurements, may have an additional source at one of the study sites. These results suggest that dust is a major factor influencing the development of soil in these settings and is also an important control on soil weathering fluxes. After accounting for dust inputs in mass-balance calculations, Si weathering fluxes from San Juan Mountain soils are within the range observed for other temperate systems. Comparing dust inputs with mass-balance-based flux estimates suggests dust could account for as much as 50-80% of total long-term chemical weathering fluxes. These results support the notion that dust inputs may sustain chemical weathering fluxes even in relatively young continental settings. Given the widespread input of far-traveled dust, the weathering of dust is likely an important and underappreciated aspect of the global weathering engine.

  7. Cable equation for general geometry

    NASA Astrophysics Data System (ADS)

    López-Sánchez, Erick J.; Romero, Juan M.

    2017-02-01

    The cable equation describes the voltage in a straight cylindrical cable, and this model has been employed to model the electrical potential in dendrites and axons. However, this equation can give incorrect predictions for some realistic geometries, in particular when the radius of the cable changes significantly. Cables with a nonconstant radius are important for some phenomena; for example, discrete swellings along axons appear in neurodegenerative diseases such as Alzheimer's, Parkinson's, human immunodeficiency virus-associated dementia, and multiple sclerosis. In this paper, using the Frenet-Serret frame, we propose a generalized cable equation for a general cable geometry. This generalized equation depends on geometric quantities such as the curvature and torsion of the cable. We show that when the cable has a constant circular cross section, the first fundamental form of the cable can be simplified and the generalized cable equation depends on neither the curvature nor the torsion of the cable. Additionally, we find an exact solution for an ideal cable which has a particular variable circular cross section and zero curvature. For this case we show that when the cross section of the cable increases, the voltage decreases. Inspired by this ideal case, we rewrite the generalized cable equation as a diffusion equation with a source term generated by the cable geometry. This source term depends on the cable cross-sectional area and its derivatives. In addition, we study different cables with swellings and provide their numerical solutions. The numerical solutions show that when the cross section of the cable has abrupt changes, its voltage is smaller than the voltage in the cylindrical cable. Furthermore, these numerical solutions show that the voltage can be affected by geometrical inhomogeneities in the cable.
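
    The idea of a diffusion equation with a geometry-generated source term can be sketched numerically. The toy model below is not the paper's actual formulation: it solves a generic leaky cable equation V_t = D V_xx − V/τ + S(x)·V with an explicit finite-difference scheme, where a localized negative S(x) stands in for the effect of a swelling. All parameters are illustrative.

```python
import numpy as np

# Explicit finite-difference sketch of a cable-like diffusion equation with
# a geometry-dependent source term S(x). Stability: D*dt/dx^2 = 0.1 <= 0.5.
nx, nt = 101, 2000
dx, dt = 0.01, 1e-5
D, leak_tau = 1.0, 0.05                        # diffusion coeff., leak time constant
x = np.linspace(0.0, 1.0, nx)
S = -5.0 * np.exp(-((x - 0.5) / 0.05) ** 2)    # localized sink mimicking a swelling

V = np.zeros(nx)
V[0] = 1.0                                     # voltage clamp at the left end
for _ in range(nt):
    lap = (V[2:] - 2 * V[1:-1] + V[:-2]) / dx**2       # discrete Laplacian
    V[1:-1] += dt * (D * lap - V[1:-1] / leak_tau + S[1:-1] * V[1:-1])
    V[0], V[-1] = 1.0, V[-2]                   # Dirichlet left, no-flux right

# The voltage profile decays away from the clamp; the sink near x = 0.5
# further depresses the voltage downstream of the "swelling".
```

    Comparing a run with S set to zero against this one reproduces, qualitatively, the paper's observation that abrupt cross-section changes lower the voltage relative to a uniform cylindrical cable.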

  8. Urea increased nickel and copper accumulation in the leaves of Egeria densa (Planch.) Casp. and Ceratophyllum demersum L. during short-term exposure.

    PubMed

    Maleva, Maria; Borisova, Galina; Chukina, Nadezhda; Kumar, Adarsh

    2018-02-01

    In the present study, two freshwater plant species, Egeria densa (Planch.) Casp. and Ceratophyllum demersum L., were subjected to separate and combined action of urea (2 mM) and metals (Ni and Cu, 10 μM) to investigate the phytoremediation potential of these two submerged macrophytes during short-term experiments (48 h). Both submerged macrophytes demonstrated high accumulative potential for Ni and Cu (average bioconcentration factors were 2505 for Ni and 3778 for Cu). Urea (2 mM) was not significantly toxic to the studied plant species. Furthermore, urea worked as an additional source of nitrogen and stimulated some metabolic processes, such as the synthesis of photosynthetic pigments, soluble proteins, and non-enzymatic antioxidants, and activated some enzymes. Adding urea to the metals increased their accumulation in both macrophytes (on average by 35% for Ni and 15% for Cu). The combined action of urea and Ni did not have a significant effect on the antioxidant response, but caused a sharp increase in urease activity (fourfold on average) in both plants. Copper exerted a stronger toxic effect on both studied macrophytes than nickel. Adding urea to copper in some cases diminished the toxic action of this metal. The study concludes that the responses of E. densa and C. demersum to urea and metal action (separate and combined) depended on the type of pollutant and the activity of the antioxidant defence system. The studied aquatic macrophytes were therefore found to be potential phytoremediators of water bodies, and the addition of an organic nitrogen source in the form of urea at an environmentally relevant concentration will increase the efficiency of phytoextraction of metals. Copyright © 2017 Elsevier Inc. All rights reserved.
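
    The bioconcentration factor (BCF) reported in the abstract is a simple ratio of tissue to water concentration. The sketch below uses hypothetical numbers (not the study's raw data) to show how an exposure given in micromolar units converts to the same basis:

```python
# Bioconcentration factor sketch: metal concentration in plant tissue divided
# by concentration in the surrounding water. Numbers are illustrative; the
# abstract reports average BCFs of 2505 (Ni) and 3778 (Cu).

def bcf(c_tissue_mg_per_kg, c_water_mg_per_l):
    return c_tissue_mg_per_kg / c_water_mg_per_l

# Hypothetical exposure: 10 uM Ni x 58.69 g/mol = 0.587 mg/L in water;
# assumed tissue concentration of 1470 mg/kg dry weight.
print(round(bcf(1470.0, 0.587)))  # -> 2504, i.e. the order of magnitude reported
```
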

  9. Long-term trends in California mobile source emissions and ambient concentrations of black carbon and organic aerosol.

    PubMed

    McDonald, Brian C; Goldstein, Allen H; Harley, Robert A

    2015-04-21

    A fuel-based approach is used to assess long-term trends (1970-2010) in mobile source emissions of black carbon (BC) and organic aerosol (OA, including both primary emissions and secondary formation). The main focus of this analysis is the Los Angeles Basin, where a long record of measurements is available to infer trends in ambient concentrations of BC and organic carbon (OC), with OC used here as a proxy for OA. Mobile source emissions and ambient concentrations have decreased similarly, reflecting the importance of on- and off-road engines as sources of BC and OA in urban areas. In 1970, the on-road sector accounted for ∼90% of total mobile source emissions of BC and OA (primary + secondary). Over time, as on-road engine emissions have been controlled, the relative importance of off-road sources has grown. By 2010, off-road engines were estimated to account for 37 ± 20% and 45 ± 16% of total mobile source contributions to BC and OA, respectively, in the Los Angeles area. This study highlights both the success of efforts to control on-road emission sources and the importance of considering off-road engine and other VOC source contributions when assessing long-term emission and ambient air quality trends.
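
    The fuel-based method the abstract describes scales a per-unit-fuel emission factor by fuel consumed, which makes sector shares straightforward to compare. The sketch below uses entirely hypothetical emission factors and fuel-sales figures, not the study's values:

```python
# Fuel-based emission inventory sketch: emissions = emission factor
# (g pollutant per kg fuel burned) x fuel consumed. Placeholder numbers.

def annual_emissions_tonnes(ef_g_per_kg_fuel, fuel_million_kg_per_yr):
    # (g/kg) x (1e6 kg) = 1e6 g = 1 tonne
    return ef_g_per_kg_fuel * fuel_million_kg_per_yr

bc = {
    "on_road_diesel": annual_emissions_tonnes(1.1, 900.0),   # hypothetical EF, fuel
    "off_road_diesel": annual_emissions_tonnes(1.8, 300.0),  # hypothetical EF, fuel
}
total = sum(bc.values())
share_off_road = bc["off_road_diesel"] / total
print(f"off-road share of mobile-source BC: {share_off_road:.0%}")
```

    With these placeholder inputs the off-road share comes out near a third, illustrating how a sector with smaller fuel sales but a higher emission factor can dominate the trend as the other sector is controlled.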

  10. Effectiveness of additional supervised exercises compared with conventional treatment alone in patients with acute lateral ankle sprains: systematic review

    PubMed Central

    van Ochten, John; Luijsterburg, Pim A J; van Middelkoop, Marienke; Koes, Bart W; Bierma-Zeinstra, Sita M A

    2010-01-01

    Objective To summarise the effectiveness of adding supervised exercises to conventional treatment compared with conventional treatment alone in patients with acute lateral ankle sprains. Design Systematic review. Data sources Medline, Embase, Cochrane Central Register of Controlled Trials, Cinahl, and reference screening. Study selection Included studies were randomised controlled trials, quasi-randomised controlled trials, or clinical trials. Patients were adolescents or adults with an acute lateral ankle sprain. The treatment options were conventional treatment alone or conventional treatment combined with supervised exercises. Two reviewers independently assessed the risk of bias, and one reviewer extracted data. Because of clinical heterogeneity we analysed the data using a best evidence synthesis. Follow-up was classified as short term (up to two weeks), intermediate (two weeks to three months), and long term (more than three months). Results 11 studies were included. There was limited to moderate evidence to suggest that the addition of supervised exercises to conventional treatment leads to faster and better recovery and a faster return to sport at short-term follow-up than conventional treatment alone. In specific populations (athletes, soldiers, and patients with severe injuries) this evidence was restricted to a faster return to work and sport only. There was no strong evidence of effectiveness for any of the outcome measures. Most of the included studies had a high risk of bias, with few having adequate statistical power to detect clinically relevant differences. Conclusion Additional supervised exercises compared with conventional treatment alone have some benefit for recovery and return to sport in patients with ankle sprain, though the evidence is limited or moderate and many studies are subject to bias. PMID:20978065

  11. Source apportionment of speciated PM2.5 and non-parametric regressions of PM2.5 and PM(coarse) mass concentrations from Denver and Greeley, Colorado, and construction and evaluation of dichotomous filter samplers

    NASA Astrophysics Data System (ADS)

    Piedrahita, Ricardo A.

    The Denver Aerosol Sources and Health study (DASH) was a long-term study of the relationship between the variability in fine particulate mass and chemical constituents (PM2.5, particulate matter less than 2.5 µm) and adverse health effects such as cardio-respiratory illnesses and mortality. Daily filter samples were chemically analyzed for multiple species. We present findings based on 2.8 years of DASH data, from 2003 to 2005. Multilinear Engine 2 (ME-2), a receptor-based source apportionment model, was applied to the data to estimate source contributions to PM2.5 mass concentrations. This study relied on two different ME-2 models: (1) a 2-way model that closely reflects PMF-2; and (2) an enhanced model that incorporated additional temporal and meteorological factors. The Coarse Rural Urban Sources and Health study (CRUSH) is a long-term study of the relationship between the variability in coarse particulate mass (PMcoarse, particulate matter between 2.5 and 10 µm) and adverse health effects such as cardio-respiratory illnesses, pre-term births, and mortality. Hourly mass concentrations of PMcoarse and fine particulate matter (PM2.5) are measured using tapered element oscillating microbalances (TEOMs) with Filter Dynamics Measurement Systems (FDMS) at two rural and two urban sites. We present findings based on nine months of mass concentration data, including temporal trends and non-parametric regression (NPR) results, which were used to characterize the wind speed and wind direction relationships that might point to sources. As part of CRUSH, a 1-year coarse- and fine-mode particulate matter filter sampling network will allow us to characterize the chemical composition of the particulate matter collected and to perform spatial comparisons. This work describes the construction and validation testing of four dichotomous filter samplers for this purpose.
The use of dichotomous splitters with an approximate 2.5 µm cut point, coupled with a 10 µm cut-diameter inlet head, allows us to collect separately the size fractions that the collocated TEOMs measure continuously. Chemical analysis of the filters will include inorganic ions, organic compounds, EC, OC, and biological analyses. Side-by-side testing showed the cut diameters were in agreement with each other and with a well-characterized virtual impactor lent to the group by the University of Southern California. Error propagation was performed, and uncertainty results were similar to the observed standard deviations.
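
    The non-parametric regression (NPR) of concentration on wind direction mentioned above is typically a kernel-weighted average over circular distance. The sketch below is a generic illustration with made-up data and bandwidth, not the study's implementation:

```python
import math

# Kernel (nonparametric) regression of concentration on wind direction:
# directions associated with elevated concentrations point toward sources.

def circ_diff_deg(a, b):
    """Smallest angular separation between two bearings, in degrees."""
    d = abs(a - b) % 360.0
    return min(d, 360.0 - d)

def npr_estimate(theta, directions, concentrations, bandwidth=20.0):
    """Gaussian-kernel-weighted mean concentration at bearing theta."""
    weights = [math.exp(-0.5 * (circ_diff_deg(theta, d) / bandwidth) ** 2)
               for d in directions]
    return sum(w * c for w, c in zip(weights, concentrations)) / sum(weights)

dirs = [10, 20, 30, 180, 190, 200]            # wind directions (degrees)
concs = [5.0, 6.0, 5.5, 20.0, 22.0, 21.0]     # concentrations (ug/m3)
print(round(npr_estimate(190.0, dirs, concs), 1))  # dominated by the ~190 deg samples
```

    A peak in the NPR curve near one bearing, as in this toy data around 190°, is the kind of wind-direction signature used to flag candidate source locations.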

  12. Emissions of black carbon and co-pollutants emitted from diesel vehicles in the Mexico City Metropolitan Area

    NASA Astrophysics Data System (ADS)

    Zavala, Miguel; Molina, Luisa T.; Fortner, Edward; Knighton, Berk; Herndon, Scott; Yacovitch, Tara; Floerchinger, Cody; Roscioli, Joseph; Kolb, Charles; Mejia, Jose Antonio; Sarmiento, Jorge; Paramo, Victor Hugo; Zirath, Sergio; Jazcilevich, Aron

    2014-05-01

    Black carbon emitted from freight, public transport, and heavy-duty truck sources is linked with adverse effects on human health. In addition, the control of emissions of black carbon, an important short-lived climate forcing agent (SLCF), has recently been considered one of the key strategies for mitigating regional near-term climate change. Despite the availability of new emissions-control technologies for reducing emissions from diesel-powered mobile sources, their introduction is still not widespread in many urban areas, and there is a need to characterize real-world emission rates of black carbon from this key source. The emissions of black carbon, organic carbon, and other gaseous and particle pollutants from diesel-powered mobile sources in Mexico were characterized by deploying a mobile laboratory equipped with real-time instrumentation in Mexico City as part of the SLCFs-Mexico 2013 project. From February 25 to 28, 2013, the emissions from selected diesel-powered vehicles were measured in both controlled experiments and real-world on-road driving conditions. Sampled vehicles spanned several emissions-level technologies, including EPA98, EPA03, EPA04, EURO3-5, and Hybrid. All vehicles were sampled using diesel fuel, and several vehicles were measured using both diesel and biodiesel fuels. Additional measurements included the use of a remote sensing unit for the co-sampling of all tested vehicles, and the installation and operation of a Portable Emissions Measurement System (PEMS) for the measurement of emissions from a test vehicle. We will present inter-comparisons of the emission factors obtained among the various vehicle technologies that were sampled during the experiment, as well as an inter-comparison of results from the various sampling platforms. The results can be used to

  13. Contamination, Transport, and Exposure Mapping and Assessment of Karst Groundwater Systems in Northern Puerto Rico Using GIS

    NASA Astrophysics Data System (ADS)

    Howard, J.; Schifman, L. A.; Irrizary, C.; Torres, P.; Padilla, I. Y.

    2011-12-01

    Ground waters from karst aquifer systems are one of the most important sources of freshwater worldwide and are highly vulnerable to both natural and anthropogenic contamination. Contaminants released into karst groundwater systems move through complex pathways from their sources to discharge areas of potential exposure. Points of exposure can include wells, springs, and surface waters that serve as drinking water sources. In Puerto Rico, the North Coast Limestone Aquifer System, which extends 90 miles across the north coast with an area of nearly 700 sq. miles, provides more than 50% of the potable water demand for industrial and drinking purposes. Historical reports from the 1980s revealed that volatile organic compounds, phthalates, and metals were close to or exceeded maximum contaminant levels. Exposure to such contaminants has been reported to cause reproductive and developmental issues, such as preterm birth. Since there is minimal understanding of the extent of contamination, it is important to identify areas of potential concern. Preliminary analysis of 20 groundwater/spring and 20 tap water sites within the North Coast suggests that contamination is still a major concern. In addition, mixed-effects model analyses suggest that >60% of pre-term birth rates may be explained by the presence of sites contaminated with volatile organic compounds, phthalates, and metals within the North Coast region. This presentation will focus primarily on how GIS was used as a tool for developing sampling strategies for collecting groundwater and tap water sources within the North Coast Limestone Aquifer System of Puerto Rico. In addition, the linkage of contamination, transport, and exposure to volatile organic compounds and phthalates will be addressed.

  14. Compound specific radiocarbon analyses to apportion sources of combustion products in sedimentary pyrogenic carbon deposits

    NASA Astrophysics Data System (ADS)

    Hanke, Ulrich M.; Schmidt, Michael W. I.; McIntyre, Cameron P.; Reddy, Christopher M.; Wacker, Lukas; Eglinton, Timothy I.

    2016-04-01

    Pyrogenic carbon (PyC) is a collective term for carbon-rich residues comprised of a continuum of products generated during biomass burning and fossil fuel combustion. PyC is a key component of the global carbon cycle due to its slow intrinsic decomposition rate and its ubiquity in the environment. It can originate from natural or anthropogenic vegetation fires, coal mining, energy production, industry, and transport. Subsequently, PyC can be transported over long distances by wind and water and can eventually be buried in sediments. Information about the origin of PyC (biomass burning vs. fossil fuel combustion) deposited in estuarine sediments is scarce. We studied the highly anoxic estuarine sediments of the Pettaquamscutt River (Rhode Island, U.S.) at high temporal resolution over 250 years and found that different combustion proxies reflect local and regional sources of PyC (Hanke et al., in review; Lima et al. 2003). The polycyclic aromatic hydrocarbons (PAH) originate from long-range atmospheric transport, whereas bulk PyC, detected as benzene polycarboxylic acids (BPCA), mainly stems from local catchment run-off. However, to unambiguously apportion PyC sources, we need additional information, such as compound-specific radiocarbon (14C) measurements. We report 14C data for individual BPCA, including error analysis, and for combustion-related PAH. First results indicate that biomass burning is the main source of PyC deposits, with additional minor contributions from fossil fuel combustion. References: Hanke, U.M.; Eglinton, T.I.; Braun, A.L.L.; Reddy, C.; Wiedemeier, D.B.; Schmidt, M.W.I. Decoupled sedimentary records of combustion: causes and implications. In review. Lima, A.L.; Eglinton, T.I.; Reddy, C.M. High-resolution record of pyrogenic polycyclic aromatic hydrocarbon deposition during the 20th century. ES&T 2003, 37(1), 53-61.

  15. Rising atmospheric CO2 is reducing the protein concentration of a floral pollen source essential for North American bees.

    PubMed

    Ziska, Lewis H; Pettis, Jeffery S; Edwards, Joan; Hancock, Jillian E; Tomecek, Martha B; Clark, Andrew; Dukes, Jeffrey S; Loladze, Irakli; Polley, H Wayne

    2016-04-13

    At present, there is substantive evidence that the nutritional content of agriculturally important food crops will decrease in response to rising levels of atmospheric carbon dioxide (Ca). However, whether Ca-induced declines in nutritional quality are also occurring for pollinator food sources is unknown. Flowering late in the season, goldenrod (Solidago spp.) pollen is a widely available autumnal food source commonly acknowledged by apiarists to be essential to native bee (e.g. Bombus spp.) and honeybee (Apis mellifera) health and winter survival. Using floral collections obtained from the Smithsonian Natural History Museum, we quantified Ca-induced temporal changes in the pollen protein concentration of Canada goldenrod (Solidago canadensis), the most widespread Solidago taxon, from hundreds of samples collected throughout the USA and southern Canada over the period 1842-2014 (i.e., a Ca increase from approximately 280 to 398 ppm). In addition, we conducted a 2-year in situ trial of S. canadensis populations grown along a continuous Ca gradient from approximately 280 to 500 ppm. The historical data indicated a strong significant correlation between recent increases in Ca and reductions in pollen protein concentration (r² = 0.81). Experimental data confirmed this decrease in pollen protein concentration and indicated that it would be ongoing as Ca continues to rise in the near term, i.e., to 500 ppm (r² = 0.88). While additional data are needed to quantify the subsequent effects of reduced protein concentration in Canada goldenrod on bee health and population stability, these results are the first to indicate that increasing Ca can reduce the protein content of a floral pollen source widely used by North American bees. © 2016 The Author(s).

  16. Technical Note: On the calculation of stopping-power ratio for stoichiometric calibration in proton therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ödén, Jakob; Zimmerman, Jens; Nowik, Patrik

    2015-09-15

    Purpose: The quantitative effects of assumptions made in the calculation of stopping-power ratios (SPRs) are investigated for stoichiometric CT calibration in proton therapy. The assumptions investigated include the use of the Bethe formula without correction terms, Bragg additivity, the choice of I-value for water, and the data source for elemental I-values. Methods: The predictions of the Bethe formula for SPR (no correction terms) were validated against more sophisticated calculations using the SRIM software package for 72 human tissues. A stoichiometric calibration was then performed at our hospital. SPR was calculated for the human tissues using either the assumption of simple Bragg additivity or the Seltzer-Berger rule (as used in ICRU Reports 37 and 49). In each case, the calculation was performed twice: first, by assuming the I-value of water was an experimentally based value of 78 eV (value proposed in Errata and Addenda for ICRU Report 73) and second, by recalculating the I-value theoretically. The discrepancy between predictions using ICRU elemental I-values and the commonly used tables of Janni was also investigated. Results: Errors due to neglecting the correction terms to the Bethe formula were calculated at less than 0.1% for biological tissues. Discrepancies greater than 1%, however, were estimated due to departures from simple Bragg additivity when a fixed I-value for water was imposed. When the I-value for water was calculated in a consistent manner to that for tissue, this disagreement was substantially reduced. The difference between SPR predictions when using Janni's or ICRU tables for I-values was up to 1.6%. Experimental data used for materials of relevance to proton therapy suggest that the ICRU-derived values provide somewhat more accurate results (root-mean-square-error: 0.8% versus 1.6%).
Conclusions: The conclusions from this study are that (1) the Bethe formula can be safely used for SPR calculations without correction terms; (2) simple Bragg additivity can be reasonably assumed for compound materials; (3) if simple Bragg additivity is assumed, then the I-value for water should be calculated in a consistent manner to that of the tissue of interest (rather than using an experimentally derived value); (4) the ICRU Report 37 I-values may provide a better agreement with experiment than Janni's tables.
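    The simple Bragg additivity rule discussed in this record combines elemental mean excitation energies (I-values) into a compound value. Below is a minimal sketch, not the authors' code; the elemental I-values are illustrative round numbers of the kind tabulated in ICRU Report 37, and should not be taken as authoritative:

```python
import math

# Illustrative elemental data: symbol -> (Z, A in g/mol, I in eV).
# Values are approximate; consult ICRU Report 37 for authoritative numbers.
ELEMENTS = {
    "H": (1, 1.008, 19.2),
    "O": (8, 15.999, 95.0),
}

def mean_ionization_energy(mass_fractions):
    """Mean excitation energy of a compound via simple Bragg additivity:
    ln I = sum(w_i * Z_i/A_i * ln I_i) / sum(w_i * Z_i/A_i)."""
    num = 0.0
    den = 0.0
    for symbol, w in mass_fractions.items():
        z, a, i_elem = ELEMENTS[symbol]
        num += w * z / a * math.log(i_elem)
        den += w * z / a
    return math.exp(num / den)

# Water: 11.19% H and 88.81% O by mass.
i_water = mean_ionization_energy({"H": 0.1119, "O": 0.8881})
```

    With these inputs, simple additivity gives roughly 69 eV for water, below the experimentally based 78 eV cited in the abstract, which illustrates the inconsistency that arises when a fixed experimental I-value for water is mixed with additivity-based tissue I-values.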

  17. Eddy Covariance Measurements Over a Maize Field: The Contribution of Minor Flux Terms to the Energy Balance Gap

    NASA Astrophysics Data System (ADS)

    Smidt, J.; Ingwersen, J.; Streck, T.

    2015-12-01

    The lack of energy balance closure is a long-standing problem in eddy covariance (EC) measurements. The energy balance equation is defined as Rn - G = H + λE, where Rn is net radiation, G is the ground heat flux, H is the sensible heat flux and λE is the latent heat flux. In most cases of energy imbalance, either Rn is overestimated or the ground heat and turbulent fluxes are underestimated. Multiple studies have shown that calculation errors, incorrect instrument installation/calibration and measurement errors alone do not entirely account for this imbalance. Rather, research is now focused on previously neglected sources of heat storage in the soil, biomass and air beneath the EC station. This project examined the potential of five "minor flux terms" (soil heat storage, biomass heat storage, energy consumption by photosynthesis, air heat storage and atmospheric moisture change) to further close the energy balance gap. Eddy covariance measurements were conducted at a maize (Zea mays) field in southwest Germany during summer 2014. Soil heat storage was measured for six weeks at 11 sites around the field footprint. Biomass and air heat storage were measured for six subsequent weeks at seven sites around the field footprint. Energy consumption by photosynthesis was calculated using the CO2 flux data. Evapotranspiration was calculated using the water balance method and then compared to the flux data processed with three post-closure methods: the sensible heat flux, the latent heat flux and the Bowen ratio post-closure methods. An energy balance closure of 66% was achieved by the EC station measurements over the entire investigation period. During the soil heat flux campaign, EC station closure was 74.1%, and the field footprint soil heat storage contributed 3.3% additional closure. During the second minor flux term measurement period, closure with the EC station data was 91%.
Biomass heat storage resulted in 1.1% additional closure, the photosynthesis flux closed the gap by an additional 7.8%, air heat storage closure was -0.3% and atmospheric moisture change was negligible with an additional closure of <0.01%. These four terms resulted in a total additional closure of 8.6% over the EC station measurements. The Bowen Ratio post-closure method yielded values most similar to the water balance method over the entire season.
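    The closure percentages in this record follow directly from the energy balance equation Rn - G = H + λE: closure is the ratio of turbulent fluxes to available energy, and a minor flux term credits additional closure. A minimal sketch with hypothetical flux values (the study's percentages come from measured data, not these numbers):

```python
def closure_fraction(Rn, G, H, LE, storage=0.0):
    """Energy balance closure: turbulent fluxes (H + LE) divided by
    available energy (Rn - G), optionally crediting a storage term
    (e.g. soil or biomass heat storage) to the turbulent side."""
    return (H + LE + storage) / (Rn - G)

# Hypothetical daytime means in W/m^2, for illustration only.
base = closure_fraction(Rn=500.0, G=50.0, H=120.0, LE=210.0)
with_storage = closure_fraction(Rn=500.0, G=50.0, H=120.0, LE=210.0,
                                storage=15.0)
gap_closed = (with_storage - base) * 100.0  # additional closure in percent
```

    With these made-up numbers, base closure is about 73% and the storage term adds roughly 3 percentage points, the same bookkeeping the abstract reports for soil and biomass heat storage.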

  18. A functional magnetic resonance imaging investigation of short-term source and item memory for negative pictures.

    PubMed

    Mitchell, Karen J; Mather, Mara; Johnson, Marcia K; Raye, Carol L; Greene, Erich J

    2006-10-02

    We investigated the hypothesis that arousal recruits attention to item information, thereby disrupting working memory processes that help bind items to context. Using functional magnetic resonance imaging, we compared brain activity when participants remembered negative or neutral picture-location conjunctions (source memory) versus pictures only. Behaviorally, negative trials showed disruption of short-term source, but not picture, memory; long-term picture recognition memory was better for negative than for neutral pictures. Activity in areas involved in working memory and feature integration (precentral gyrus and its intersect with superior temporal gyrus) was attenuated on negative compared with neutral source trials relative to picture-only trials. Visual processing areas (middle occipital and lingual gyri) showed greater activity for negative than for neutral trials, especially on picture-only trials.

  19. Circular current loops, magnetic dipoles and spherical harmonic analysis.

    USGS Publications Warehouse

    Alldredge, L.R.

    1980-01-01

    Spherical harmonic analysis (SHA) is the most widely used method of describing the Earth's magnetic field, even though spherical harmonic coefficients (SHC) almost completely defy interpretation in terms of real sources. Some moderately successful efforts have been made to represent the field in terms of dipoles placed in the core in an effort to have the model come closer to representing real sources. Dipole sources are only a first approximation to the real sources, which are thought to be a very complicated network of electrical currents in the core of the Earth.

  20. Numerical simulations of LNG vapor dispersion in Brayton Fire Training Field tests with ANSYS CFX.

    PubMed

    Qi, Ruifeng; Ng, Dedy; Cormier, Benjamin R; Mannan, M Sam

    2010-11-15

    Federal safety regulations require the use of validated consequence models to determine the vapor cloud dispersion exclusion zones for accidental liquefied natural gas (LNG) releases. One tool that is being developed in industry for exclusion zone determination and LNG vapor dispersion modeling is computational fluid dynamics (CFD). This paper uses the ANSYS CFX CFD code to model LNG vapor dispersion in the atmosphere. Discussed are important parameters that are essential inputs to the ANSYS CFX simulations, including the atmospheric conditions, LNG evaporation rate and pool area, turbulence in the source term, ground surface temperature and roughness height, and effects of obstacles. A sensitivity analysis was conducted to illustrate uncertainties in the simulation results arising from the mesh size and source term turbulence intensity. In addition, a set of medium-scale LNG spill tests were performed at the Brayton Fire Training Field to collect data for validating the ANSYS CFX prediction results. A comparison of test data with simulation results demonstrated that CFX was able to describe the dense gas behavior of the LNG vapor cloud, and its predictions of downwind gas concentrations close to ground level were in approximate agreement with the test data.

  1. Application of the DG-1199 methodology to the ESBWR and ABWR.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kalinich, Donald A.; Gauntt, Randall O.; Walton, Fotini

    2010-09-01

    Appendix A-5 of Draft Regulatory Guide DG-1199 'Alternative Radiological Source Term for Evaluating Design Basis Accidents at Nuclear Power Reactors' provides guidance - applicable to RADTRAD MSIV leakage models - for scaling containment aerosol concentration to the expected steam dome concentration in order to preserve the simplified use of the Accident Source Term (AST) in assessing containment performance under assumed design basis accident (DBA) conditions. In this study Economic and Safe Boiling Water Reactor (ESBWR) and Advanced Boiling Water Reactor (ABWR) RADTRAD models are developed using the DG-1199, Appendix A-5 guidance. The models were run using RADTRAD v3.03. Low Population Zone (LPZ), control room (CR), and worst-case 2-hr Exclusion Area Boundary (EAB) doses were calculated and compared to the relevant accident dose criteria in 10 CFR 50.67. For the ESBWR, the dose results were all lower than the MSIV leakage doses calculated by General Electric/Hitachi (GEH) in their licensing technical report. There are no comparable ABWR MSIV leakage doses; however, it should be noted that the ABWR doses are lower than the ESBWR doses. In addition, sensitivity cases were evaluated to ascertain the influence/importance of key input parameters/features of the models.

  2. Effect of fiber sources on fatty acids profile, glycemic index, and phenolic compound content of in vitro digested fortified wheat bread.

    PubMed

    Kurek, Marcin Andrzej; Wyrwisz, Jarosław; Karp, Sabina; Wierzbicka, Agnieszka

    2018-05-01

    In this study, three dietary fiber (DF) sources were investigated as fortifiers of wheat bread: oat (OB), flax (FB), and apple (AB). Adding oat and flax fibers to bread significantly changed the fatty acid profiles. OB was highest in oleic acid (33.83% of lipids) and linoleic acid (24.31% of lipids). Only in FB was γ-linolenic fatty acid present in a significant amount (18.32%). The bioaccessibility trials revealed that DF slows the intake of saturated fatty acids. PUFA were the least bioaccessible of all fatty acid groups (72% in OB to 87% in FB). In terms of glycemic index, the control bread had the highest value (80.5), significantly higher than OB, FB, and AB; the addition of oat, flax, or apple fiber yielded breads with a low glycemic index. AB had the highest total phenolic content (897.2 mg/kg), and FB the lowest (541.2 mg/kg). The only significant lowering of caloric value in this study was observed in AB. The study addresses a gap in the literature by considering glycemic index, fatty acid profile, and phenolic content in parallel when applying DF to breads.

  3. Confronting effective models for deconfinement in dense quark matter with lattice data

    NASA Astrophysics Data System (ADS)

    Andersen, Jens O.; Brauner, Tomáš; Naylor, William R.

    2015-12-01

    Ab initio numerical simulations of the thermodynamics of dense quark matter remain a challenge. Apart from the infamous sign problem, lattice methods have to deal with finite volume and discretization effects as well as with the necessity to introduce sources for symmetry-breaking order parameters. We study these artifacts in the Polyakov-loop-extended Nambu-Jona-Lasinio (PNJL) model and compare its predictions to existing lattice data for cold and dense two-color matter with two flavors of Wilson quarks. To achieve even qualitative agreement with lattice data requires the introduction of two novel elements in the model: (i) explicit chiral symmetry breaking in the effective contact four-fermion interaction, referred to as the chiral twist, and (ii) renormalization of the Polyakov loop. The feedback of the dense medium to the gauge sector is modeled by a chemical-potential-dependent scale in the Polyakov-loop potential. In contrast to previously used analytical Ansätze, we determine its dependence on the chemical potential from lattice data for the expectation value of the Polyakov loop. Finally, we propose adding a two-derivative operator to our effective model. This term acts as an additional source of explicit chiral symmetry breaking, mimicking an analogous term in the lattice Wilson action.

  4. RADTRAD: A simplified model for RADionuclide Transport and Removal And Dose estimation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Humphreys, S.L.; Miller, L.A.; Monroe, D.K.

    1998-04-01

    This report documents the RADTRAD computer code developed for the U.S. Nuclear Regulatory Commission (NRC) Office of Nuclear Reactor Regulation (NRR) to estimate transport and removal of radionuclides and dose at selected receptors. The document includes a users' guide to the code, a description of the technical basis for the code, the quality assurance and code acceptance testing documentation, and a programmers' guide. The RADTRAD code can be used to estimate the containment release using either the NRC TID-14844 or NUREG-1465 source terms and assumptions, or a user-specified table. In addition, the code can account for a reduction in the quantity of radioactive material due to containment sprays, natural deposition, filters, and other natural and engineered safety features. The RADTRAD code uses a combination of tables and/or numerical models of source term reduction phenomena to determine the time-dependent dose at user-specified locations for a given accident scenario. The code system also provides the inventory, decay chain, and dose conversion factor tables needed for the dose calculation. The RADTRAD code can be used to assess occupational radiation exposures, typically in the control room; to estimate site boundary doses; and to estimate dose attenuation due to modification of a facility or accident sequence.

  5. A study of numerical methods for hyperbolic conservation laws with stiff source terms

    NASA Technical Reports Server (NTRS)

    Leveque, R. J.; Yee, H. C.

    1988-01-01

    The proper modeling of nonequilibrium gas dynamics is required in certain regimes of hypersonic flow. For inviscid flow this gives a system of conservation laws coupled with source terms representing the chemistry. Often a wide range of time scales is present in the problem, leading to numerical difficulties as in stiff systems of ordinary differential equations. Stability can be achieved by using implicit methods, but other numerical difficulties are observed. The behavior of typical numerical methods on a simple advection equation with a parameter-dependent source term was studied. Two approaches to incorporate the source term were utilized: MacCormack type predictor-corrector methods with flux limiters, and splitting methods in which the fluid dynamics and chemistry are handled in separate steps. Various comparisons over a wide range of parameter values were made. In the stiff case where the solution contains discontinuities, incorrect numerical propagation speeds are observed with all of the methods considered. This phenomenon is studied and explained.
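    A splitting method of the kind compared in this record can be sketched for a model advection equation with a parameter-dependent relaxation source, u_t + a u_x = -μ u(u-1)(u-1/2): the fluid step and the chemistry step are handled separately. The grid, μ, and the cubic source below are illustrative choices, not the paper's exact test problem:

```python
import numpy as np

def advect_upwind(u, a, dt, dx):
    """First-order upwind step for u_t + a*u_x = 0 (a > 0), periodic BCs."""
    return u - a * dt / dx * (u - np.roll(u, 1))

def react(u, mu, dt, nsub=10):
    """Source step for u_t = -mu*u*(u-1)*(u-0.5), integrated with a few
    explicit Euler sub-steps; a stiff (implicit) solver would replace this
    when mu*dt is large."""
    h = dt / nsub
    for _ in range(nsub):
        u = u - h * mu * u * (u - 1.0) * (u - 0.5)
    return u

# Godunov (first-order) splitting on a step profile: advect, then react.
nx, a, mu = 200, 1.0, 100.0
dx = 1.0 / nx
dt = 0.5 * dx / a          # CFL number 0.5
x = (np.arange(nx) + 0.5) * dx
u = np.where(x < 0.3, 1.0, 0.0)
for _ in range(100):
    u = react(advect_upwind(u, a, dt, dx), a and mu, dt)
```

    The source term drives the solution toward its stable equilibria 0 and 1 (0.5 is unstable), sharpening the advected front. In the strongly stiff limit this is exactly the regime where, as the abstract notes, splitting and predictor-corrector schemes can propagate discontinuities at incorrect speeds.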

  6. Occurrence and temporal variability of methyl tert-butyl ether (MTBE) and other volatile organic compounds in select sources of drinking water : results of the focused survey

    USGS Publications Warehouse

    Delzer, Gregory C.; Ivahnenko, Tamara

    2003-01-01

    The large-scale use of the gasoline oxygenate methyl tert-butyl ether (MTBE), and its high solubility, low soil adsorption, and low biodegradability, has resulted in its detection in ground water and surface water in many places throughout the United States. Studies by numerous researchers, as well as many State and local environmental agencies, have discovered high levels of MTBE in soils and ground water at leaking underground gasoline-storage-tank sites and frequent occurrence of low to intermediate levels of MTBE in reservoirs used for both public water supply and recreational boating. In response to these findings, the American Water Works Association Research Foundation sponsored an investigation of MTBE and other volatile organic compounds (VOCs) in the Nation's sources of drinking water. The goal of the investigation was to provide additional information on the frequency of occurrence, concentration, and temporal variability of MTBE and other VOCs in source water used by community water systems (CWSs). The investigation was completed in two stages: (1) reviews of available literature and (2) the collection of new data. Two surveys were associated with the collection of new data. The first, termed the Random Survey, employed a statistically stratified design for sampling source water from 954 randomly selected CWSs. The second, which is the focus of this report, is termed the Focused Survey, which included samples collected from 134 CWS source waters, including ground water, reservoirs, lakes, rivers, and streams, that were suspected or known to contain MTBE. The general intent of the Focused Survey was to compare results with the Random Survey and provide an improved understanding of the occurrence, concentration, temporal variability, and anthropogenic factors associated with frequently detected VOCs. Each sample collected was analyzed for 66 VOCs, including MTBE and three other ether gasoline oxygenates (hereafter termed gasoline oxygenates).
As part of the Focused Survey, 451 source-water samples and 744 field quality-control (QC) samples were collected from 78 ground-water, 39 reservoir and (or) lake, and 17 river and (or) stream source waters at fixed intervals for a period of 1 year. Using a common assessment level of 0.2 μg/L (micrograms per liter) (2.0 μg/L for methyl ethyl ketone), 37 of the 66 VOCs analyzed were detected in both surveys. However, VOCs, especially MTBE and other gasoline oxygenates, were detected more frequently in the Focused Survey than in the Random Survey. MTBE was detected in 55.5 percent of the CWSs sampled in the Focused Survey and in 8.7 percent of those sampled in the Random Survey. Little difference in occurrence, however, was observed for trihalomethanes (THMs), which were detected in 16.4 and 14.8 percent of Focused Survey and Random Survey CWSs, respectively.

  7. S100B Protein concentration in milk-formulas for preterm and term infants. Correlation with industrial preparation procedures.

    PubMed

    Nigro, Francesco; Gagliardi, Luigi; Ciotti, Sabina; Galvano, Fabio; Pietri, Amedeo; Tina, Gabriella Lucia; Cavallaro, Daniela; La Fauci, Luca; Iacopino, Leonardo; Bognanno, Matteo; Li Volti, Giovanni; Scacco, Antonio; Michetti, Fabrizio; Gazzolo, Diego

    2008-05-01

    Human milk S100B protein possesses important neurotrophic properties. However, in some conditions human milk is substituted by milk formulas. The aims of the present study were: to assess S100B concentrations in milk formulas, to verify any differences in S100B levels between preterm and term infant formulas, and to evaluate the impact of industrial preparation at predetermined phases on S100B content. Two different sets of samples were tested: (i) commercial preterm (n = 36) and term (n = 36) infant milk formulas; (ii) preterm (n = 10) and term infant (n = 10) milk formulas sampled at the following predetermined industrial preparation time points: skimmed cow milk (Time 0); after protein source supplementation (Time 1); after pasteurization (Time 2); after spray-drying (Time 3). Our results showed that S100B concentrations in preterm formulas were higher than in term ones (p < 0.01). In addition, S100B concentrations during industrial preparation showed a significant increase (p < 0.001) at Time 1 followed by a slight decrease (p > 0.05) at Time 2, whereas a significant (p < 0.001) dip was observed at Time 3. In conclusion, S100B showed sufficient thermostability to resist pasteurization but not spray-drying. New feeding strategies in preterm and term infants are therefore warranted in order to preserve S100B protein during industrial preparation.

  8. Analysis of Multiplatform CO (Carbon Monoxide) Measurements During TRACE-P Mission

    NASA Technical Reports Server (NTRS)

    Pougatchev, Nikita S.

    2004-01-01

    Carbon monoxide is considered mission critical (TRACE-P NRA) because it is one of the gases involved in controlling the oxidizing power of the atmosphere and, as a tracer gas, is valuable in interpreting mission data sets. Carbon monoxide exhibits interannual differences, suggesting relatively short-term imbalances in sources and sinks. Sources of CO are dominated by fossil fuel combustion, biomass burning, and the photochemical oxidation of CH4 and nonmethane hydrocarbons, while reaction with OH is believed to be the major sink for atmospheric CO, with additional losses due to soil uptake. Uncertainties in the magnitude and distribution of both sources and sinks remain fairly large, however, and additional data are required to refine the global budget. Seasonal changes and a northern hemispheric latitudinal gradient have been described for a variety of Pacific basin sites through long-term monitoring of surface background levels. Latitudinal variations have also recently been described at upper tropospheric altitudes over a multi-year period. TRACE-P will provide an aircraft survey of CO over the northern Pacific in the northern spring when CO concentrations are at their seasonal maximum in the northern hemisphere (NH) and at their seasonal minimum in the southern hemisphere (SH). Previous GTE missions, i.e., PEM West-B and PEM Tropics-B, ground-based, and satellite observations (MAPS, April 1994) give us a general picture of the distribution of CO over the northern Pacific during this season. Based on these measurements, background CO levels over remote ocean areas are anticipated to be in the range of 110-180 ppbv, while those closer to the Asian continent may rise as high as 600 ppbv. These measurements also reveal high spatial variability (both horizontal and vertical) as well as temporal variations in CO over the area planned for the TRACE-P mission.
This variability is a result of multiple CO sources, the meteorological complexity of transport processes, and the photochemical aging of air masses. The influence of biomass burning in the southern Pacific should be relatively small since the mission coincides with the southern tropical wet season when agricultural burning is at its seasonal low. The proposed CO measurements taken during TRACE-P should therefore largely be a function of the impact of various NH sources, primarily Asian and predominantly fossil fuel combustion and biomass burning. These processes are also major sources of many other atmospheric pollutants; consequently, making accurate and precise CO measurements is one of the highest TRACE-P priorities [TRACE-P NRA]. The TRACE-P mission emphasizes the dual objectives of assessing the magnitude of the transport of chemically and radiatively important gases such as CO from Asia to the western Pacific, and determining how emissions change and are modified during this transport.

  9. Aerodynamic sound of flow past an airfoil

    NASA Technical Reports Server (NTRS)

    Wang, Meng

    1995-01-01

    The long-term objective of this project is to develop a computational method for predicting the noise of turbulence-airfoil interactions, particularly at the trailing edge. We seek to obtain the energy-containing features of the turbulent boundary layers and the near-wake using Navier-Stokes simulation (LES or DNS), and then to calculate the far-field acoustic characteristics by means of acoustic analogy theories, using the simulation data as acoustic source functions. Two distinct types of noise can be emitted from airfoil trailing edges. The first, a tonal or narrowband sound caused by vortex shedding, is normally associated with blunt trailing edges, high angles of attack, or laminar flow airfoils. The second source is of broadband nature, arising from the aeroacoustic scattering of turbulent eddies by the trailing edge. Due to its importance to airframe noise, rotor and propeller noise, etc., trailing edge noise has been the subject of extensive theoretical (e.g. Crighton & Leppington 1971; Howe 1978) as well as experimental investigations (e.g. Brooks & Hodgson 1981; Blake & Gershfeld 1988). A number of challenges exist concerning acoustic analogy based noise computations. These include the elimination of spurious sound caused by vortices crossing permeable computational boundaries in the wake, the treatment of noncompact source regions, and the accurate description of wave reflection by the solid surface and scattering near the edge. In addition, accurate turbulence statistics in the flow field are required for the evaluation of acoustic source functions. Major efforts to date have been focused on the first two challenges. To this end, a paradigm problem of laminar vortex shedding, generated by a two-dimensional, uniform stream past a NACA0012 airfoil, is used to address the relevant numerical issues.
Under the low Mach number approximation, the near-field flow quantities are obtained by solving the incompressible Navier-Stokes equations numerically at a chord Reynolds number of 10^4. The far-field noise is computed using Curle's extension to the Lighthill analogy (Curle 1955). An effective method for separating the physical noise source from spurious boundary contributions is developed. This allows an accurate evaluation of the Reynolds stress volume quadrupoles, in addition to the more readily computable surface dipoles due to the unsteady lift and drag. The effect of noncompact source distribution on the far-field sound is assessed using an efficient integration scheme for the Curle integral, with full account of retarded-time variations. The numerical results confirm in quantitative terms that the far-field sound is dominated by the surface pressure dipoles at low Mach number. The techniques developed are applicable to a wide range of flows, including jets and mixing layers, where the Reynolds stress quadrupoles play a prominent or even dominant role in the overall sound generation.

  10. Air pollution and subclinical airway inflammation in the SALIA cohort study

    PubMed Central

    2014-01-01

    Background The association between long-term exposure to air pollution and local inflammation in the lung has rarely been investigated in the general population of elderly subjects before. We investigated this association in a population-based cohort of elderly women from Germany. Methods In a follow-up examination of the SALIA cohort study in 2008/2009, 402 women aged 68 to 79 years from the Ruhr Area and Borken (Germany) were clinically examined. Inflammatory markers were determined in exhaled breath condensate (EBC) and in induced sputum (IS). We used traffic indicators and measured air pollutants at single monitoring stations in the study area to assess individual traffic exposure and long-term air pollution background exposure. Additionally, long-term residential exposure to air pollution was estimated using land-use regression (LUR) models. We applied multiple logistic and linear regression analyses adjusted for age, indoor mould, smoking, passive smoking and socio-economic status and additionally conducted sensitivity analyses. Results Inflammatory markers showed a high variability between individuals and were higher with higher exposure to air pollution. NO derivatives, leukotriene (LT) B4 and tumour necrosis factor-α (TNF-α) showed the strongest associations. An increase of 9.42 μg/m3 (interquartile range) in LUR-modelled NO2 was associated with a measurable LTB4 level (values above the detection limit) in EBC (odds ratio: 1.38, 95% CI: 1.02-1.86) as well as with LTB4 in IS (%-change: 19%, 95% CI: 7%-32%). The results remained consistent after exclusion of subpopulations with risk factors for inflammation (smoking, respiratory diseases, mould infestation) and after extension of models with additional adjustment for season of examination, mass of IS and urban/rural living as sensitivity analyses.
Conclusions In this analysis of the SALIA study we found that long-term exposure to air pollutants from traffic and industrial sources was associated with an increase of several inflammatory markers in EBC and in IS. We conclude that long-term exposure to air pollution might lead to changes in the inflammatory marker profile in the lower airways in an elderly female population. PMID:24645673

  11. Source-term characterisation and solid speciation of plutonium at the Semipalatinsk NTS, Kazakhstan.

    PubMed

    Nápoles, H Jiménez; León Vintró, L; Mitchell, P I; Omarova, A; Burkitbayev, M; Priest, N D; Artemyev, O; Lukashenko, S

    2004-01-01

    New data on the concentrations of key fission/activation products and transuranium nuclides in samples of soil and water from the Semipalatinsk Nuclear Test Site are presented and interpreted. Sampling was carried out at Ground Zero, Lake Balapan, the Tel'kem craters and reference locations within the test site boundary well removed from localised sources. Radionuclide ratios have been used to characterise the source term(s) at each of these sites. The geochemical partitioning of plutonium has also been examined and it is shown that the bulk of the plutonium contamination at most of the sites examined is in a highly refractory, non-labile form.

  12. Source-Sink Colonization as a Possible Strategy of Insects Living in Temporary Habitats.

    PubMed

    Frouz, Jan; Kindlmann, Pavel

    2015-01-01

    Continuous colonization and re-colonization is critical for survival of insect species living in temporary habitats. When insect populations in temporary habitats are depleted, some species may escape extinction by surviving in permanent, but less suitable habitats, in which long-term population survival can be maintained only by immigration from other populations. Such situations have been repeatedly described in nature, but when and how this occurs and how important this phenomenon is for insect metapopulation survival are still poorly known, mainly because it is difficult to study experimentally. Therefore, we used a simulation model to investigate how environmental stochasticity, growth rate and the incidence of dispersal affect the positive effect of permanent but poor ("sink") habitats on the likelihood of metapopulation persistence in a network of high quality but temporary ("source") habitats. This model revealed that permanent habitats substantially increase the probability of metapopulation persistence of insect species with poor dispersal ability if the availability of temporary habitats is spatio-temporally synchronized. Addition of permanent habitats to a system sometimes enabled metapopulation persistence even in cases in which the metapopulation would otherwise go extinct, especially for species with high growth rates. For insect species with low growth rates the probability of metapopulation persistence strongly depended on the proportions of "source" to "source" and "sink" to "source" dispersal rates.

  13. Mach wave properties in the presence of source and medium heterogeneity

    NASA Astrophysics Data System (ADS)

    Vyas, J. C.; Mai, P. M.; Galis, M.; Dunham, Eric M.; Imperatori, W.

    2018-06-01

    We investigate Mach wave coherence for kinematic supershear ruptures with spatially heterogeneous source parameters, embedded in 3D scattering media. We assess Mach wave coherence considering: (1) source heterogeneities in terms of variations in slip, rise time and rupture speed; (2) small-scale heterogeneities in Earth structure, parameterized from combinations of three correlation lengths and two standard deviations (assuming a von Karman power spectral density with fixed Hurst exponent); and (3) the joint effects of source and medium heterogeneities. Ground-motion simulations are conducted using a generalized finite-difference method, with a parameterization such that the highest resolved frequency is ~5 Hz. We find that Mach wave coherence is slightly diminished at near-fault distances (< 10 km) by spatially variable slip and rise time; beyond this distance it is more strongly reduced by wavefield scattering due to small-scale heterogeneities in Earth structure. Based on our numerical simulations and theoretical considerations, we demonstrate that the standard deviation of the medium heterogeneities, rather than their correlation length, controls the wavefield scattering. In addition, we find that peak ground accelerations for combined source and medium heterogeneities are consistent with empirical ground-motion prediction equations at all distances, suggesting that in nature, ground-shaking amplitudes for supershear ruptures may not be elevated, owing to complexities in the rupture process and seismic wave scattering.
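    The von Karman medium parameterization mentioned above has a closed-form power spectral density. A minimal 1D sketch follows (the 3D form used in such simulations differs in constants and exponent; the parameter values in the test below are assumptions, not the study's):

```python
import math

def von_karman_psd_1d(k, sigma, a, hurst):
    """1D von Karman power spectral density of a random medium.
    sigma: RMS fractional perturbation; a: correlation length;
    hurst: Hurst exponent.  Falls off as k**-(2*hurst + 1) once
    k*a >> 1, i.e. beyond the corner wavenumber 1/a."""
    num = sigma**2 * 2.0 * math.sqrt(math.pi) * a * math.gamma(hurst + 0.5)
    den = math.gamma(hurst) * (1.0 + (k * a) ** 2) ** (hurst + 0.5)
    return num / den
```

    Doubling `sigma` quadruples the PSD at every wavenumber, whereas changing `a` mainly shifts the corner wavenumber, which is one way to see why the standard deviation, not the correlation length, dominates scattering strength.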

  14. Source identification and spatial distribution of heavy metals in tobacco-growing soils in Shandong province of China with multivariate and geostatistical analysis.

    PubMed

    Liu, Haiwei; Zhang, Yan; Zhou, Xue; You, Xiuxuan; Shi, Yi; Xu, Jialai

    2017-02-01

    Samples of surface soil from tobacco (Nicotiana tabacum L.) fields were analysed for heavy metals and showed the following concentrations (mean of 246 samples, mg/kg): As, 5.10; Cd, 0.11; Cr, 49.49; Cu, 14.72; Hg, 0.08; Ni, 19.28; Pb, 20.20; and Zn, 30.76. The values of the index of geoaccumulation (Igeo) and of the enrichment factor indicated modest enrichment with As, Cd, Cr, Hg, Ni or Pb. Principal component analysis and cluster analysis correctly allocated each investigated element to its source, whether anthropogenic or natural. The results were consistent with estimated inputs of heavy metals from fertilizers, irrigation water and atmospheric deposition. The variation in the concentrations of As, Cd, Cu, Pb and Zn in the soil was mainly due to long-term agricultural practices, and that of Cr and Ni was mainly due to the soil parent material, whereas the source of Hg was industrial activity, which ultimately led to atmospheric deposition. Atmospheric deposition was the main exogenous source of heavy metals, and fertilizers also played an important role in the accumulation of these elements in soil. Identifying the sources of heavy metals in agricultural soils can serve as a basis for appropriate action to control and reduce the addition of heavy metals to cultivated soils.
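    The two pollution indices used in this record have standard definitions, sketched below. Any background and reference concentrations passed in would come from local baseline data; the values in the test are placeholders, not the study's.

```python
import math

def igeo(c_sample, c_background):
    """Index of geoaccumulation: Igeo = log2(C / (1.5 * B)), where the
    factor 1.5 allows for natural fluctuation of the background B."""
    return math.log2(c_sample / (1.5 * c_background))

def enrichment_factor(c_sample, ref_sample, c_background, ref_background):
    """EF = (C/ref)_sample / (C/ref)_background, normalising by a
    conservative reference element (commonly Al or Fe) to suppress
    grain-size and dilution effects."""
    return (c_sample / ref_sample) / (c_background / ref_background)
```

    An Igeo below 0 indicates an unpolluted soil, while an EF near 1 suggests a purely crustal (natural) origin for the element.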

  15. Complete Moment Tensor Determination of Induced Seismicity in Unconventional and Conventional Oil/Gas Fields

    NASA Astrophysics Data System (ADS)

    Gu, C.; Li, J.; Toksoz, M. N.

    2013-12-01

    Induced seismicity occurs both in conventional oil/gas fields, due to production and water injection, and in unconventional oil/gas fields, due to hydraulic fracturing. The source mechanisms of these induced earthquakes are of great importance for understanding their causes and the physics of seismic processes in reservoirs. Previous analyses of induced seismic events in conventional oil/gas fields assumed a double-couple (DC) source mechanism. However, recent studies have shown a non-negligible percentage of non-double-couple (non-DC) components in the source moment tensors of hydraulic fracturing events (Šílený et al., 2009; Warpinski and Du, 2010; Song and Toksöz, 2011). In this study, we determine the full moment tensors of induced seismicity in a conventional oil/gas field and of hydrofrac events in an unconventional oil/gas field. Song and Toksöz (2011) developed a full-waveform complete moment tensor inversion method to investigate non-DC source mechanisms. We apply this approach to induced seismicity data from a conventional gas field in Oman, and also to hydrofrac microseismicity data monitored by downhole geophones in four wells in the US. We compare the source mechanisms of induced seismicity in the two types of gas fields and explain the differences in terms of physical processes.
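    The DC versus non-DC distinction drawn above is usually quantified by decomposing the moment tensor, given its principal values, into isotropic (ISO), DC and compensated-linear-vector-dipole (CLVD) percentages. Conventions for this apportioning vary between authors; the sketch below follows one common epsilon-based convention and is not necessarily the decomposition used in the cited studies.

```python
def decompose_moment_tensor(m1, m2, m3):
    """Decompose a moment tensor, given its three principal values
    (eigenvalues), into ISO, DC and CLVD percentages."""
    iso = (m1 + m2 + m3) / 3.0
    dev = sorted((m1 - iso, m2 - iso, m3 - iso))  # deviatoric part, ascending
    big = max(abs(dev[0]), abs(dev[2]))
    # eps = 0 for a pure DC, +-0.5 for a pure CLVD
    eps = dev[1] / big if big > 0 else 0.0
    m_total = abs(iso) + big
    p_iso = 100.0 * abs(iso) / m_total if m_total else 0.0
    p_clvd = (100.0 - p_iso) * 2.0 * abs(eps)
    p_dc = 100.0 - p_iso - p_clvd
    return p_iso, p_dc, p_clvd
```

    For example, principal values (1, 0, -1) give a pure DC, (1, 1, 1) a pure explosion, and (2, -1, -1) a pure CLVD, so a hydrofrac event with significant tensile opening shows elevated ISO and CLVD percentages.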

  16. The use of an active controlled enclosure to attenuate sound radiation from a heavy radiator

    NASA Astrophysics Data System (ADS)

    Sun, Yao; Yang, Tiejun; Zhu, Minggang; Pan, Jie

    2017-03-01

    Active structural acoustic control usually experiences difficulty in controlling heavy sources, or sources to which control forces cannot practically be applied directly. To overcome this difficulty, an actively controlled enclosure, which forms a cavity with both a flexible and an open boundary, is employed. This configuration permits indirect implementation of active control, in which the control inputs can be applied to subsidiary structures rather than to the sources themselves. To determine the control effectiveness of the configuration, the vibro-acoustic behavior of the system, which consists of a top plate with an opening, a sound cavity and a source panel, is investigated in this paper. A complete mathematical model of the system is formulated using modified Fourier series formulations, and the governing equations are solved with the Rayleigh-Ritz method. The coupling mechanisms of a partly opened cavity and a plate are analysed in terms of modal responses and directivity patterns. Furthermore, to attenuate the sound power radiated from both the top panel and the opening, two strategies are studied: minimizing the total radiated power and cancelling the volume velocity. Three control configurations are compared: a point force on the control panel (structural control), a sound source in the cavity (acoustical control), and hybrid structural-acoustical control. In addition, the effects of the boundary condition of the control panel on sound radiation and control performance are discussed.

  17. Multi-sources data fusion framework for remote triage prioritization in telehealth.

    PubMed

    Salman, O H; Rasid, M F A; Saripan, M I; Subramaniam, S K

    2014-09-01

    The healthcare industry is streamlining processes to offer more timely and effective services to all patients. Computerized algorithms and smart devices can streamline the relation between users and doctors by providing more services inside healthcare telemonitoring systems. This paper proposes a multi-source framework to support advanced healthcare applications. The proposed framework, named Multi Sources Healthcare Architecture (MSHA), considers multiple sources: sensors (ECG, SpO2 and blood pressure) and text-based inputs from wireless and pervasive devices of a Wireless Body Area Network. The framework is used to improve healthcare scalability and efficiency by enhancing the remote triaging and remote prioritization processes for patients, and to provide intelligent services over telemonitoring healthcare systems by using a data fusion method and a prioritization technique. As a telemonitoring system consists of three tiers (sensors/sources, base station and server), the simulation of the MSHA algorithm in the base station is demonstrated in this paper. Our main goal is to achieve a high level of accuracy in prioritizing and triaging patients remotely, while demonstrating the role of multi-source data fusion in telemonitoring healthcare systems. We also discuss how the proposed framework can be applied in a healthcare telemonitoring scenario. Simulation results for different symptoms, related to different emergency levels of chronic heart disease, demonstrate the superiority of our algorithm over conventional algorithms in classifying and prioritizing patients remotely.
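    A remote triage step of the kind described above can be sketched as a rule-based fusion of several vital-sign sources into a single emergency level. This is purely illustrative: the thresholds below are invented for the example, are not clinical guidance, and are not the MSHA algorithm itself.

```python
def triage_priority(heart_rate, spo2, systolic_bp):
    """Toy fusion of three vital-sign sources into an emergency
    level (1 = most urgent, 4 = routine).  Each abnormal source
    contributes to a combined risk score."""
    score = 0
    if spo2 < 90:
        score += 2
    elif spo2 < 94:
        score += 1
    if heart_rate > 120 or heart_rate < 50:
        score += 2
    elif heart_rate > 100:
        score += 1
    if systolic_bp > 180 or systolic_bp < 90:
        score += 2
    if score >= 4:
        return 1
    if score >= 2:
        return 2
    if score >= 1:
        return 3
    return 4
```

    In a three-tier telemonitoring system, a score of this kind computed at the base station lets the server dispatch the most urgent patients first rather than processing arrivals in order.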

  18. Characterization of Industrial Emission Sources and Photochemistry in Houston, Texas

    NASA Astrophysics Data System (ADS)

    Washenfelder, R. A.; Atlas, E. L.; Degouw, J.; Flocke, F. M.; Fried, A.; Frost, G. J.; Holloway, J.; Richter, D.; Ryerson, T. B.; Schauffler, S.; Trainer, M.; Walega, J.; Warneke, C.; Weibring, P.; Zheng, W.

    2009-12-01

    The Houston-Galveston urban area contains a number of large industrial petrochemical emission sources that produce volatile organic compounds and nitrogen oxides. These co-located emissions result in rapid and efficient ozone production downwind. Unlike a single large power plant, the industrial complexes consist of numerous sources that can be difficult to quantify in emission inventories. During September-October 2006, the NOAA WP-3 aircraft conducted research flights as part of the second Texas Air Quality Study (TexAQS II). We examine measurements of NOx, SO2 and speciated hydrocarbons from the Houston Ship Channel, which contains a dense concentration of industrial petrochemical sources, and from isolated petrochemical facilities. These measurements are used to derive source emission estimates, which are then compared with available emission inventories. We find that high hydrocarbon emissions are typical of the Houston Ship Channel and the isolated petrochemical facilities, and that ethene and propene are major contributors to ozone formation. Ratios of C2H4/NOx and C3H6/NOx exceed emission inventory values by factors of 10-50. These findings are consistent with the first TexAQS study in 2000. We examine trends in the C2H4/NOx and C3H6/NOx ratios between 2000 and 2006, and determine that day-to-day and within-plume variability exceeds any long-term reduction in ethene and propene emissions from the isolated petrochemical sources. We additionally examine downwind photochemical products formed by these alkenes.
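    Emission ratios of the kind compared with inventories above are commonly estimated as the slope of correlated enhancements measured in a plume transect. A minimal sketch, assuming a regression through the origin (the TexAQS analyses may well have used a different estimator):

```python
def emission_ratio(x_enhancements, y_enhancements):
    """Molar emission ratio y/x (e.g. C2H4/NOx) estimated as the
    least-squares slope through the origin of paired in-plume
    enhancements, each taken above the local background."""
    num = sum(x * y for x, y in zip(x_enhancements, y_enhancements))
    den = sum(x * x for x in x_enhancements)
    return num / den
```

    Comparing such an observed slope with the inventory's emitted-mole ratio for the same facility is what yields discrepancy factors like the 10-50 reported here.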

  19. 5 CFR 3601.103 - Additional exceptions for gifts from outside sources.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 5 Administrative Personnel 3 2010-01-01 2010-01-01 false Additional exceptions for gifts from... gifts from outside sources. In addition to the gifts which come within the exceptions set forth in 5 CFR... gifts from outside sources otherwise prohibited by 5 CFR 2635.202(a) as follows: (a) Events sponsored by...

  20. The Muon Conditions Data Management: Database Architecture and Software Infrastructure

    NASA Astrophysics Data System (ADS)

    Verducci, Monica

    2010-04-01

    The management of the Muon Conditions Database will be one of the most challenging applications for the Muon System, both in terms of data volumes and rates and in terms of the variety of data stored and their analysis. The Muon conditions database is responsible for storing almost all of the 'non-event' data and detector quality flags needed for debugging detector operations and for performing reconstruction and analysis. For the early data in particular, knowledge of the detector performance and of the corrections in terms of efficiency and calibration will be extremely important for the correct reconstruction of events. In this work, an overview of the entire Muon conditions database architecture is given, covering the different sources of the data and the storage model used, including the associated database technology. Particular emphasis is given to the Data Quality chain: the flow of the data, the analysis and the final results are described. In addition, the software interfaces used to access the conditions data are described, in particular within ATHENA, the ATLAS offline reconstruction framework.
