Sample records for assessment source term

  1. Advanced Reactor PSA Methodologies for System Reliability Analysis and Source Term Assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grabaskas, D.; Brunett, A.; Passerini, S.

    Beginning in 2015, a project was initiated to update and modernize the probabilistic safety assessment (PSA) of the GE-Hitachi PRISM sodium fast reactor. This project is a collaboration between GE-Hitachi and Argonne National Laboratory (Argonne), and funded in part by the U.S. Department of Energy. Specifically, the role of Argonne is to assess the reliability of passive safety systems, complete a mechanistic source term calculation, and provide component reliability estimates. The assessment of passive system reliability focused on the performance of the Reactor Vessel Auxiliary Cooling System (RVACS) and the inherent reactivity feedback mechanisms of the metal fuel core. The mechanistic source term assessment attempted to provide a sequence specific source term evaluation to quantify offsite consequences. Lastly, the reliability assessment focused on components specific to the sodium fast reactor, including electromagnetic pumps, intermediate heat exchangers, the steam generator, and sodium valves and piping.

  2. Source term model evaluations for the low-level waste facility performance assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yim, M.S.; Su, S.I.

    1995-12-31

    The estimation of release of radionuclides from various waste forms to the bottom boundary of the waste disposal facility (source term) is one of the most important aspects of LLW facility performance assessment. In this work, several currently used source term models are comparatively evaluated for the release of carbon-14 based on a test case problem. The models compared include PRESTO-EPA-CPG, IMPACTS, DUST and NEFTRAN-II. Major differences in assumptions and approaches between the models are described and key parameters are identified through sensitivity analysis. The source term results from different models are compared and other concerns or suggestions are discussed.

  3. Regulatory Technology Development Plan - Sodium Fast Reactor: Mechanistic Source Term – Trial Calculation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grabaskas, David; Bucknor, Matthew; Jerden, James

    2016-10-01

    The potential release of radioactive material during a plant incident, referred to as the source term, is a vital design metric and will be a major focus of advanced reactor licensing. The U.S. Nuclear Regulatory Commission has stated an expectation for advanced reactor vendors to present a mechanistic assessment of the potential source term in their license applications. The mechanistic source term presents an opportunity for vendors to realistically assess the radiological consequences of an incident, and may allow reduced emergency planning zones and smaller plant sites. However, the development of a mechanistic source term for advanced reactors is not without challenges, as there are often numerous phenomena impacting the transportation and retention of radionuclides. This project sought to evaluate U.S. capabilities regarding the mechanistic assessment of radionuclide release from core damage incidents at metal fueled, pool-type sodium fast reactors (SFRs). The purpose of the analysis was to identify, and prioritize, any gaps regarding computational tools or data necessary for the modeling of radionuclide transport and retention phenomena. To accomplish this task, a parallel-path analysis approach was utilized. One path, led by Argonne and Sandia National Laboratories, sought to perform a mechanistic source term assessment using available codes, data, and models, with the goal to identify gaps in the current knowledge base. The second path, performed by an independent contractor, performed sensitivity analyses to determine the importance of particular radionuclides and transport phenomena in regards to offsite consequences. The results of the two pathways were combined to prioritize gaps in current capabilities.

  4. Source-term development for a contaminant plume for use by multimedia risk assessment models

    NASA Astrophysics Data System (ADS)

    Whelan, Gene; McDonald, John P.; Taira, Randal Y.; Gnanapragasam, Emmanuel K.; Yu, Charley; Lew, Christine S.; Mills, William B.

    2000-02-01

    Multimedia modelers from the US Environmental Protection Agency (EPA) and US Department of Energy (DOE) are collaborating to conduct a comprehensive and quantitative benchmarking analysis of four intermedia models: MEPAS, MMSOILS, PRESTO, and RESRAD. These models represent typical analytically based tools that are used in human-risk and endangerment assessments at installations containing radioactive and hazardous contaminants. The objective is to demonstrate an approach for developing an adequate source term by simplifying an existing, real-world, 90Sr plume at DOE's Hanford installation in Richland, WA, for use in a multimedia benchmarking exercise between MEPAS, MMSOILS, PRESTO, and RESRAD. Source characteristics and a release mechanism are developed and described; also described is a typical process and procedure that an analyst would follow in developing a source term for using this class of analytical tool in a preliminary assessment.

  5. A novel integrated approach for the hazardous radioactive dust source terms estimation in future nuclear fusion power plants.

    PubMed

    Poggi, L A; Malizia, A; Ciparisse, J F; Gaudio, P

    2016-10-01

    An open issue still under investigation by several international entities working in the safety and security field for the foreseen nuclear fusion reactors is the estimation of source terms that are a hazard for the operators and public, and for the machine itself in terms of efficiency and integrity in case of severe accident scenarios. Source term estimation is a crucial key safety issue to be addressed in future reactor safety assessments, and the estimates available to date are not yet satisfactory. The lack of neutronic data, along with the insufficiently accurate methodologies used until now, calls for an integrated methodology for source term estimation that can provide predictions with adequate accuracy. This work proposes a complete methodology to estimate dust source terms starting from broad information gathering. The large number of parameters that can influence dust source term production is reduced with statistical tools using a combination of screening, sensitivity analysis, and uncertainty analysis. Finally, a preliminary and simplified methodology for dust source term production prediction for future devices is presented.

  6. Neuroimaging Evidence for Agenda-Dependent Monitoring of Different Features during Short-Term Source Memory Tests

    ERIC Educational Resources Information Center

    Mitchell, Karen J.; Raye, Carol L.; McGuire, Joseph T.; Frankel, Hillary; Greene, Erich J.; Johnson, Marcia K.

    2008-01-01

    A short-term source monitoring procedure with functional magnetic resonance imaging assessed neural activity when participants made judgments about the format of 1 of 4 studied items (picture, word), the encoding task performed (cost, place), or whether an item was old or new. The results support findings from long-term memory studies showing that…

  7. National Air Toxic Assessments (NATA) Results

    EPA Pesticide Factsheets

    The National Air Toxics Assessment was conducted by EPA in 2002 to assess air toxics emissions in order to identify and prioritize air toxics, emission source types and locations which are of greatest potential concern in terms of contributing to population risk. This data source provides downloadable information on emissions at the state, county and census tract level.

  8. Bioaerosol releases from compost facilities: Evaluating passive and active source terms at a green waste facility for improved risk assessments

    NASA Astrophysics Data System (ADS)

    Taha, M. P. M.; Drew, G. H.; Longhurst, P. J.; Smith, R.; Pollard, S. J. T.

    The passive and active release of bioaerosols during green waste composting, measured at source, is reported for a commercial composting facility in South East (SE) England as part of a research programme focused on improving risk assessments at composting facilities. Aspergillus fumigatus and actinomycetes concentrations of 9.8-36.8×10⁶ and 18.9-36.0×10⁶ cfu m⁻³, respectively, measured during the active turning of green waste compost, were typically 3-log higher than previously reported concentrations from static compost windrows. Source depletion curves constructed for A. fumigatus during compost turning and modelled using SCREEN3 suggest that bioaerosol concentrations could reduce to background concentrations of 10³ cfu m⁻³ within 100 m of this site. Authentic source term data produced from this study will help to refine the risk assessment methodologies that support improved permitting of compost facilities.

  9. A critical assessment of flux and source term closures in shallow water models with porosity for urban flood simulations

    NASA Astrophysics Data System (ADS)

    Guinot, Vincent

    2017-11-01

    The validity of flux and source term formulae used in shallow water models with porosity for urban flood simulations is assessed by solving the two-dimensional shallow water equations over computational domains representing periodic building layouts. The models under assessment are the Single Porosity (SP), the Integral Porosity (IP) and the Dual Integral Porosity (DIP) models. 9 different geometries are considered. 18 two-dimensional initial value problems and 6 two-dimensional boundary value problems are defined. This results in a set of 96 fine grid simulations. Analysing the simulation results leads to the following conclusions: (i) the DIP flux and source term models outperform those of the SP and IP models when the Riemann problem is aligned with the main street directions, (ii) all models give erroneous flux closures when the Riemann problem is not aligned with one of the main street directions or when the main street directions are not orthogonal, (iii) the solution of the Riemann problem is self-similar in space-time when the street directions are orthogonal and the Riemann problem is aligned with one of them, (iv) a momentum balance confirms the existence of the transient momentum dissipation model presented in the DIP model, (v) none of the source term models presented so far in the literature allows all flow configurations to be accounted for, and (vi) future laboratory experiments aiming at the validation of flux and source term closures should focus on the high-resolution, two-dimensional monitoring of both water depth and flow velocity fields.

  10. COMPARATIVE POTENCY OF COMPLEX MIXTURES: USE OF SHORT-TERM GENETIC BIOASSAYS IN CANCER RISK ASSESSMENT

    EPA Science Inventory

    The primary problem regarding the introduction of new energy sources is whether they will alter the mutagenicity, carcinogenicity and potential human cancer risk from combustion emissions. New risk assessment methodologies utilizing data from short-term bioassays, therefore, are ...

  11. Long-term trends in California mobile source emissions and ambient concentrations of black carbon and organic aerosol.

    PubMed

    McDonald, Brian C; Goldstein, Allen H; Harley, Robert A

    2015-04-21

    A fuel-based approach is used to assess long-term trends (1970-2010) in mobile source emissions of black carbon (BC) and organic aerosol (OA, including both primary emissions and secondary formation). The main focus of this analysis is the Los Angeles Basin, where a long record of measurements is available to infer trends in ambient concentrations of BC and organic carbon (OC), with OC used here as a proxy for OA. Mobile source emissions and ambient concentrations have decreased similarly, reflecting the importance of on- and off-road engines as sources of BC and OA in urban areas. In 1970, the on-road sector accounted for ∼90% of total mobile source emissions of BC and OA (primary + secondary). Over time, as on-road engine emissions have been controlled, the relative importance of off-road sources has grown. By 2010, off-road engines were estimated to account for 37 ± 20% and 45 ± 16% of total mobile source contributions to BC and OA, respectively, in the Los Angeles area. This study highlights both the success of efforts to control on-road emission sources, and the importance of considering off-road engine and other VOC source contributions when assessing long-term emission and ambient air quality trends.

  12. Building Assessment Survey and Evaluation Study: Summarized Data - Test Space Pollutant Sources

    EPA Pesticide Factsheets

    Information collected regarding sources that may have a potential impact on the building in terms of indoor air quality, including sources such as past or current water damage, pesticide application practices, special use spaces, etc.

  13. Generation of Alternative Assessment Scores using TEST and online data sources

    EPA Science Inventory

    Alternatives assessment frameworks such as DfE (Design for the Environment) evaluate chemical alternatives in terms of human health effects, ecotoxicity, and fate. T.E.S.T. (Toxicity Estimation Software Tool) can be utilized to evaluate human health in terms of acute oral rat tox...

  14. High-order scheme for the source-sink term in a one-dimensional water temperature model

    PubMed Central

    Jing, Zheng; Kang, Ling

    2017-01-01

    The source-sink term in water temperature models represents the net heat absorbed or released by a water system. This term is very important because it accounts for solar radiation that can significantly affect water temperature, especially in lakes. However, existing numerical methods for discretizing the source-sink term are very simplistic, causing significant deviations between simulation results and measured data. To address this problem, we present a numerical method specific to the source-sink term. A vertical one-dimensional heat conduction equation was chosen to describe water temperature changes. A two-step operator-splitting method was adopted as the numerical solution. In the first step, using the undetermined coefficient method, a high-order scheme was adopted for discretizing the source-sink term. In the second step, the diffusion term was discretized using the Crank-Nicolson scheme. The effectiveness and capability of the numerical method were assessed by performing numerical tests. Then, the proposed numerical method was applied to a simulation of Guozheng Lake (located in central China). The modeling results were in excellent agreement with measured data. PMID:28264005

  15. High-order scheme for the source-sink term in a one-dimensional water temperature model.

    PubMed

    Jing, Zheng; Kang, Ling

    2017-01-01

    The source-sink term in water temperature models represents the net heat absorbed or released by a water system. This term is very important because it accounts for solar radiation that can significantly affect water temperature, especially in lakes. However, existing numerical methods for discretizing the source-sink term are very simplistic, causing significant deviations between simulation results and measured data. To address this problem, we present a numerical method specific to the source-sink term. A vertical one-dimensional heat conduction equation was chosen to describe water temperature changes. A two-step operator-splitting method was adopted as the numerical solution. In the first step, using the undetermined coefficient method, a high-order scheme was adopted for discretizing the source-sink term. In the second step, the diffusion term was discretized using the Crank-Nicolson scheme. The effectiveness and capability of the numerical method were assessed by performing numerical tests. Then, the proposed numerical method was applied to a simulation of Guozheng Lake (located in central China). The modeling results were in excellent agreement with measured data.
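    The two-step splitting described in the abstract above can be illustrated with a minimal numerical sketch: the source-sink (heating) term is integrated first, then vertical diffusion is advanced with Crank-Nicolson. The Python outline below is an illustrative sketch under stated assumptions (a classical RK4 step stands in for the paper's undetermined-coefficient high-order scheme, and zero-flux boundaries are assumed); it is not the authors' implementation.

```python
import numpy as np

def step_water_temperature(T, dz, dt, kappa, source):
    """Advance a 1-D vertical temperature profile by one time step using
    operator splitting: (1) integrate the source-sink term, (2) apply
    Crank-Nicolson to the vertical diffusion term. Illustrative sketch only.

    T      : temperatures at vertical nodes [deg C]
    dz     : vertical grid spacing [m]
    dt     : time step [s]
    kappa  : thermal diffusivity [m^2/s]
    source : callable source(T) returning the heating rate dT/dt [deg C/s]
    """
    n = T.size

    # Step 1: source-sink term, integrated here with classical RK4
    # (a stand-in for the paper's high-order undetermined-coefficient scheme).
    k1 = source(T)
    k2 = source(T + 0.5 * dt * k1)
    k3 = source(T + 0.5 * dt * k2)
    k4 = source(T + dt * k3)
    T_star = T + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

    # Step 2: Crank-Nicolson for diffusion dT/dt = kappa d2T/dz2,
    # with zero-flux (insulated) boundaries for simplicity.
    r = kappa * dt / (2.0 * dz ** 2)
    A = np.zeros((n, n))
    B = np.zeros((n, n))
    for i in range(n):
        A[i, i] = 1.0 + 2.0 * r
        B[i, i] = 1.0 - 2.0 * r
        if i > 0:
            A[i, i - 1] = -r
            B[i, i - 1] = r
        if i < n - 1:
            A[i, i + 1] = -r
            B[i, i + 1] = r
    # Zero-flux boundaries via ghost-node mirroring.
    A[0, 1] = -2.0 * r;  B[0, 1] = 2.0 * r
    A[-1, -2] = -2.0 * r; B[-1, -2] = 2.0 * r

    return np.linalg.solve(A, B @ T_star)
```

    A caller would supply source(T) as, for example, a depth-decaying solar heating rate divided by the volumetric heat capacity of water.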

  16. Source inventory for Department of Energy solid low-level radioactive waste disposal facilities: What it means and how to get one of your own

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, M.A.

    1991-12-31

    In conducting a performance assessment for a low-level waste (LLW) disposal facility, one of the important considerations for determining the source term, which is defined as the amount of radioactivity being released from the facility, is the quantity of radioactive material present. This quantity, which will be referred to as the source inventory, is generally estimated through a review of historical records and waste tracking systems at the LLW facility. In theory, estimating the total source inventory for Department of Energy (DOE) LLW disposal facilities should be possible by reviewing the national data base maintained for LLW operations, the Solid Waste Information Management System (SWIMS), or through the annual report that summarizes the SWIMS data, the Integrated Data Base (IDB) report. However, in practice, there are some difficulties in making this estimate. This is not unexpected, since the SWIMS and the IDB were not developed with the goal of developing a performance assessment source term in mind. The practical shortcomings of using the existing data to develop a source term for DOE facilities will be discussed in this paper.

  17. Long-Term Metacognitive Effects of a Strategic Learning Course for Postsecondary Students with and without Disabilities

    ERIC Educational Resources Information Center

    Burchard, Melinda S.

    2010-01-01

    This dissertation examined long-term metacognitive effects of participation in a Strategic Learning course for postsecondary students with and without disabilities. The researcher integrated existing archival data from three sources, a university-wide assessment program, assessments of 114 students who took a postsecondary Strategic Learning…

  18. A theoretical prediction of the acoustic pressure generated by turbulence-flame front interactions

    NASA Technical Reports Server (NTRS)

    Huff, R. G.

    1984-01-01

    The equations of momentum and continuity are combined and linearized yielding the one dimensional nonhomogeneous acoustic wave equation. Three terms in the non-homogeneous equation act as acoustic sources and are taken to be forcing functions acting on the homogeneous wave equation. The three source terms are: fluctuating entropy, turbulence gradients, and turbulence-flame interactions. Each source term is discussed. The turbulence-flame interaction source is used as the basis for computing the source acoustic pressure from the Fourier transformed wave equation. Pressure fluctuations created in turbopump gas generators and turbines may act as a forcing function for turbine and propellant tube vibrations in Earth to orbit space propulsion systems and could reduce their life expectancy. A preliminary assessment of the acoustic pressure fluctuations in such systems is presented.

  19. A theoretical prediction of the acoustic pressure generated by turbulence-flame front interactions

    NASA Technical Reports Server (NTRS)

    Huff, R. G.

    1984-01-01

    The equations of momentum and continuity are combined and linearized yielding the one dimensional nonhomogeneous acoustic wave equation. Three terms in the non-homogeneous equation act as acoustic sources and are taken to be forcing functions acting on the homogeneous wave equation. The three source terms are: fluctuating entropy, turbulence gradients, and turbulence-flame interactions. Each source term is discussed. The turbulence-flame interaction source is used as the basis for computing the source acoustic pressure from the Fourier transformed wave equation. Pressure fluctuations created in turbopump gas generators and turbines may act as a forcing function for turbine and propellant tube vibrations in earth to orbit space propulsion systems and could reduce their life expectancy. A preliminary assessment of the acoustic pressure fluctuations in such systems is presented.
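    For orientation, the structure described in the two NTRS records above can be written schematically as a one-dimensional nonhomogeneous wave equation with the three sources lumped on the right-hand side. The notation below is generic rather than the report's own:

    \[
    \frac{1}{c^2}\frac{\partial^2 p'}{\partial t^2}-\frac{\partial^2 p'}{\partial x^2}
    = q_{\mathrm{entropy}}(x,t)+q_{\mathrm{turb}}(x,t)+q_{\mathrm{turb\text{-}flame}}(x,t),
    \]

    where p' is the acoustic pressure fluctuation and c the speed of sound. Fourier transforming in time turns this into an inhomogeneous Helmholtz equation, from which the acoustic pressure driven by the turbulence-flame interaction source can be evaluated.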

  20. YouTube as a patient-information source for root canal treatment.

    PubMed

    Nason, K; Donnelly, A; Duncan, H F

    2016-12-01

    To assess the content and completeness of YouTube™ as an information source for patients undergoing root canal treatment procedures. YouTube™ (https://www.youtube.com/) was searched for information using three relevant treatment search terms ('endodontics', 'root canal' and 'root canal treatment'). After exclusions (language, no audio, >15 min, duplicates), 20 videos per search term were selected. General video assessment included duration, ownership, views, age, likes/dislikes, target audience and video/audio quality, whilst content was analysed under six categories ('aetiology', 'anatomy', 'symptoms', 'procedure', 'postoperative course' and 'prognosis'). Content was scored for completeness level and statistically analysed using ANOVA and post hoc Tukey's test (P < 0.05). To obtain 60 acceptable videos, 124 were assessed. Depending on the search term employed, the video content and ownership differed markedly. There was wide variation in both the number of video views and 'likes/dislikes'. The average video age was 788 days. In total, 46% of videos were 'posted' by a dentist/specialist source; however, this was search term specific, rising to 70% of uploads for the search 'endodontic', whilst laypersons contributed 18% of uploads for the search 'root canal treatment'. Every video lacked content in the designated six categories, although 'procedure' details were covered more frequently and in better detail than other categories. Videos posted by dental professionals (P = 0.046) and commercial sources (P = 0.009) were significantly more complete than videos posted by laypeople. YouTube™ videos for endodontic search terms varied significantly by source and content and were generally incomplete. The danger of patient reliance on YouTube™ is highlighted, as is the need for endodontic professionals to play an active role in directing patients towards alternative high-quality information sources. © 2015 International Endodontic Journal. Published by John Wiley & Sons Ltd.

  1. Seismic hazard assessment of the Province of Murcia (SE Spain): analysis of source contribution to hazard

    NASA Astrophysics Data System (ADS)

    García-Mayordomo, J.; Gaspar-Escribano, J. M.; Benito, B.

    2007-10-01

    A probabilistic seismic hazard assessment of the Province of Murcia in terms of peak ground acceleration (PGA) and spectral accelerations [SA(T)] is presented in this paper. In contrast to most of the previous studies in the region, which were performed for PGA making use of intensity-to-PGA relationships, hazard is here calculated in terms of magnitude and using European spectral ground-motion models. Moreover, we have considered the most important faults in the region as specific seismic sources, and also comprehensively reviewed the earthquake catalogue. Hazard calculations are performed following the Probabilistic Seismic Hazard Assessment (PSHA) methodology using a logic tree, which accounts for three different seismic source zonings and three different ground-motion models. Hazard maps in terms of PGA and SA(0.1, 0.2, 0.5, 1.0 and 2.0 s) and coefficient of variation (COV) for the 475-year return period are shown. Subsequent analysis is focused on three sites of the province, namely, the cities of Murcia, Lorca and Cartagena, which are important industrial and tourism centres. Results at these sites have been analysed to evaluate the influence of the different input options. The most important factor affecting the results is the choice of the attenuation relationship, whereas the influence of the selected seismic source zonings appears strongly site dependent. Finally, we have performed an analysis of source contribution to hazard at each of these cities to provide preliminary guidance in devising specific risk scenarios. We have found that local source zones control the hazard for PGA and SA(T ≤ 1.0 s), although contribution from specific fault sources and long-distance north Algerian sources becomes significant from SA(0.5 s) onwards.
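    The logic-tree combination mentioned in the abstract (three source zonings × three ground-motion models) amounts to a weighted average of branch hazard curves. The Python fragment below is illustrative only; the branch weights and hazard arrays are placeholders, not values from the study.

```python
import numpy as np

# Annual exceedance probabilities P(PGA > a) for each logic-tree branch,
# computed separately for 3 source zonings x 3 ground-motion models.
# Shape: (n_zonings, n_gmms, n_acceleration_levels). Values are placeholders.
branch_curves = np.random.rand(3, 3, 50) * 1e-2

w_zoning = np.array([0.4, 0.3, 0.3])   # assumed branch weights, sum to 1
w_gmm = np.array([0.5, 0.3, 0.2])      # assumed branch weights, sum to 1

# Weighted mean hazard curve over all 9 branches.
mean_curve = np.einsum("i,j,ijk->k", w_zoning, w_gmm, branch_curves)

# Coefficient of variation across branches at each acceleration level,
# analogous to the COV maps reported for the 475-year return period.
flat = branch_curves.reshape(9, -1)
weights = np.outer(w_zoning, w_gmm).reshape(9)
mean = np.average(flat, axis=0, weights=weights)
var = np.average((flat - mean) ** 2, axis=0, weights=weights)
cov = np.sqrt(var) / mean
```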

  2. Bias in Terms of Culture and a Method for Reducing It: An Eight-Country "Explanations of Unemployment Scale" Study

    ERIC Educational Resources Information Center

    Mylonas, Kostas; Furnham, Adrian; Divale, William; Leblebici, Cigdem; Gondim, Sonia; Moniz, Angela; Grad, Hector; Alvaro, Jose Luis; Cretu, Romeo Zeno; Filus, Ania; Boski, Pawel

    2014-01-01

    Several sources of bias can plague research data and individual assessment. When cultural groups are considered, across or even within countries, it is essential that the constructs assessed and evaluated are as free as possible from any source of bias and specifically from bias caused due to culturally specific characteristics. Employing the…

  3. Algorithms and analytical solutions for rapidly approximating long-term dispersion from line and area sources

    NASA Astrophysics Data System (ADS)

    Barrett, Steven R. H.; Britter, Rex E.

    Predicting long-term mean pollutant concentrations in the vicinity of airports, roads and other industrial sources is frequently of concern in regulatory and public health contexts. Many emissions are represented geometrically as ground-level line or area sources. Well-developed modelling tools such as AERMOD and ADMS are able to model dispersion from finite (i.e. non-point) sources with considerable accuracy, drawing upon an up-to-date understanding of boundary layer behaviour. Due to mathematical difficulties associated with line and area sources, computationally expensive numerical integration schemes have been developed. For example, some models decompose area sources into a large number of line sources orthogonal to the mean wind direction, for which an analytical (Gaussian) solution exists. Models also employ a time-series approach, which involves computing mean pollutant concentrations for every hour over one or more years of meteorological data. This can give rise to computer runtimes of several days for assessment of a site. While this may be acceptable for assessment of a single industrial complex, airport, etc., this level of computational cost precludes national or international policy assessments at the level of detail available with dispersion modelling. In this paper, we extend previous work [S.R.H. Barrett, R.E. Britter, 2008. Development of algorithms and approximations for rapid operational air quality modelling. Atmospheric Environment 42 (2008) 8105-8111] to line and area sources. We introduce approximations which allow for the development of new analytical solutions for long-term mean dispersion from line and area sources, based on hypergeometric functions. We describe how these solutions can be parameterized from a single point source run from an existing advanced dispersion model, thereby accounting for all processes modelled in the more costly algorithms. The parameterization method combined with the analytical solutions for long-term mean dispersion is shown to produce results several orders of magnitude more efficiently with a loss of accuracy small compared to the absolute accuracy of advanced dispersion models near sources. The method can be readily incorporated into existing dispersion models, and may allow for additional computation time to be expended on modelling dispersion processes more accurately in future, rather than on accounting for source geometry.
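    As a point of reference for the numerical-integration approach the paper seeks to replace, the following sketch discretizes a finite ground-level crosswind line source into Gaussian point sources and sums their ground-level contributions. The power-law dispersion coefficients and all numbers are illustrative assumptions, not AERMOD/ADMS parameterizations or the paper's hypergeometric solutions.

```python
import numpy as np

def sigma(x, a, b):
    """Simple power-law dispersion coefficient sigma = a * x**b (illustrative)."""
    return a * np.maximum(x, 1.0) ** b

def line_source_concentration(x_r, y_r, q_per_m, u, half_length,
                              a_y=0.08, b_y=0.9, a_z=0.06, b_z=0.85, n=400):
    """Ground-level concentration at receptor (x_r, y_r) downwind of a
    ground-level crosswind line source of strength q_per_m [g m^-1 s^-1],
    approximated as n Gaussian point sources (ground reflection included)."""
    ys = np.linspace(-half_length, half_length, n)
    dq = q_per_m * (2 * half_length / n)          # emission per segment [g/s]
    sy, sz = sigma(x_r, a_y, b_y), sigma(x_r, a_z, b_z)
    # Ground-level, ground-release Gaussian point-source kernel: Q/(pi u sy sz)
    c = dq / (np.pi * u * sy * sz) * np.exp(-((y_r - ys) ** 2) / (2 * sy ** 2))
    return c.sum()

# Example: 1 km line source, receptor 500 m downwind on the centreline.
print(line_source_concentration(x_r=500.0, y_r=0.0, q_per_m=1e-3,
                                u=3.0, half_length=500.0))
```

    Repeating such sums for every receptor, hour and source segment is what drives the multi-day runtimes the abstract describes; the paper's analytical solutions replace the inner summation.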

  4. Efficient Development of High Fidelity Structured Volume Grids for Hypersonic Flow Simulations

    NASA Technical Reports Server (NTRS)

    Alter, Stephen J.

    2003-01-01

    A new technique for the control of grid line spacing and intersection angles of a structured volume grid, using elliptic partial differential equations (PDEs) is presented. Existing structured grid generation algorithms make use of source term hybridization to provide control of grid lines, imposing orthogonality implicitly at the boundary and explicitly on the interior of the domain. A bridging function between the two types of grid line control is typically used to blend the different orthogonality formulations. It is shown that utilizing such a bridging function with source term hybridization can result in the excessive use of computational resources and diminishes robustness. A new approach, Anisotropic Lagrange Based Trans-Finite Interpolation (ALBTFI), is offered as a replacement to source term hybridization. The ALBTFI technique captures the essence of the desired grid controls while improving the convergence rate of the elliptic PDEs when compared with source term hybridization. Grid generation on a blunt cone and a Shuttle Orbiter is used to demonstrate and assess the ALBTFI technique, which is shown to be as much as 50% faster, more robust, and produces higher quality grids than source term hybridization.

  5. BWR ASSEMBLY SOURCE TERMS FOR WASTE PACKAGE DESIGN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    T.L. Lotz

    1997-02-15

    This analysis is prepared by the Mined Geologic Disposal System (MGDS) Waste Package Development Department (WPDD) to provide boiling water reactor (BWR) assembly radiation source term data for use during Waste Package (WP) design. The BWR assembly radiation source terms are to be used for evaluation of radiolysis effects at the WP surface, and for personnel shielding requirements during assembly or WP handling operations. The objectives of this evaluation are to generate BWR assembly radiation source terms that bound selected groupings of BWR assemblies, with regard to assembly average burnup and cooling time, which comprise the anticipated MGDS BWR commercial spent nuclear fuel (SNF) waste stream. The source term data is to be provided in a form which can easily be utilized in subsequent shielding/radiation dose calculations. Since these calculations may also be used for Total System Performance Assessment (TSPA), with appropriate justification provided by TSPA, or radionuclide release rate analysis, the grams of each element and additional cooling times out to 25 years will also be calculated and the data included in the output files.

  6. PFLOTRAN-RepoTREND Source Term Comparison Summary.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frederick, Jennifer M.

    Code inter-comparison studies are useful exercises to verify and benchmark independently developed software to ensure proper function, especially when the software is used to model high-consequence systems which cannot be physically tested in a fully representative environment. This summary describes the results of the first portion of the code inter-comparison between PFLOTRAN and RepoTREND, which compares the radionuclide source term used in a typical performance assessment.

  7. Reducing Human Radiation Risks on Deep Space Missions

    DTIC Science & Technology

    2017-09-01

    Risk of Acute Radiation Syndromes Due to Solar Particle Events: Figure 53, "Risk Assessment for Acute Radiation Syndrome Due to SPEs" (Source: NASA Human Research Roadmap, 2016), highlights the fact that acute radiation syndrome is a short-term risk ... acceptable for long-term missions.

  8. Open source posturography.

    PubMed

    Rey-Martinez, Jorge; Pérez-Fernández, Nicolás

    2016-12-01

    The proposed validation goal of 0.9 in intra-class correlation coefficient was reached with the results of this study. With the obtained results we consider that the developed software (RombergLab) is validated balance assessment software. The reliability of this software depends on the technical specifications of the force platform used. The aims were to develop and validate posturography software and to share its source code in open source terms. Prospective non-randomized validation study: 20 consecutive adults underwent two balance assessment tests; six-condition posturography was performed using clinically approved software and a force platform, and the same conditions were measured using the newly developed open source software with a low-cost force platform. The intra-class correlation index of the sway area obtained from the center of pressure variations in both devices for the six conditions was the main variable used for validation. Excellent concordance between RombergLab and the clinically approved force platform was obtained (intra-class correlation coefficient = 0.94). A Bland and Altman concordance plot was also obtained. The source code used to develop RombergLab was published in open source terms.
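    The agreement statistic used for validation, the single-measure intra-class correlation coefficient, can be computed from a two-way ANOVA decomposition; the Shrout-Fleiss ICC(2,1) form is assumed here. The sketch below uses placeholder sway-area data, not the study's measurements.

```python
import numpy as np

def icc_2_1(x):
    """ICC(2,1): two-way random effects, absolute agreement, single measure.
    x has shape (n_subjects, k_devices)."""
    n, k = x.shape
    grand = x.mean()
    row_means = x.mean(axis=1)
    col_means = x.mean(axis=0)

    msr = k * np.sum((row_means - grand) ** 2) / (n - 1)   # between subjects
    msc = n * np.sum((col_means - grand) ** 2) / (k - 1)   # between devices
    sse = np.sum((x - row_means[:, None] - col_means[None, :] + grand) ** 2)
    mse = sse / ((n - 1) * (k - 1))                         # residual

    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Example: sway areas (cm^2) for 20 subjects on two platforms (placeholder data).
rng = np.random.default_rng(0)
true_sway = rng.uniform(2.0, 12.0, size=20)
data = np.column_stack([true_sway + rng.normal(0, 0.4, 20),
                        true_sway + rng.normal(0, 0.4, 20)])
print(round(icc_2_1(data), 3))
```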

  9. The Fukushima releases: an inverse modelling approach to assess the source term by using gamma dose rate observations

    NASA Astrophysics Data System (ADS)

    Saunier, Olivier; Mathieu, Anne; Didier, Damien; Tombette, Marilyne; Quélo, Denis; Winiarek, Victor; Bocquet, Marc

    2013-04-01

    The Chernobyl nuclear accident and more recently the Fukushima accident highlighted that the largest source of error in consequence assessment is the source term estimation, including the time evolution of the release rate and its distribution between radioisotopes. Inverse modelling methods have proved to be efficient for assessing the source term in accidental situations (Gudiksen, 1989, Krysta and Bocquet, 2007, Stohl et al 2011, Winiarek et al 2012). These methods combine environmental measurements and atmospheric dispersion models. They have been recently applied to the Fukushima accident. Most existing approaches are designed to use air sampling measurements (Winiarek et al, 2012) and some of them also use deposition measurements (Stohl et al, 2012, Winiarek et al, 2013). During the Fukushima accident, such measurements were far less numerous and not as well distributed within Japan as the dose rate measurements. Gamma dose rate measurements, by contrast, were numerous, well distributed within Japan and offered a high temporal frequency, which allowed the evolution of the contamination to be documented efficiently. However, dose rate data are not as easy to use as air sampling measurements and until now they had not been used in inverse modelling approaches. Indeed, dose rate data result from all the gamma emitters present in the ground and in the atmosphere in the vicinity of the receptor. They do not allow one to determine the isotopic composition or to distinguish the plume contribution from wet deposition. The presented approach proposes a way to use dose rate measurements in an inverse modelling approach without the need for a priori information on emissions. The method proved to be efficient and reliable when applied to the Fukushima accident. The emissions for the 8 main isotopes Xe-133, Cs-134, Cs-136, Cs-137, Ba-137m, I-131, I-132 and Te-132 have been assessed. The Daiichi power plant events (such as ventings, explosions…) known to have caused atmospheric releases are well identified in the retrieved source term, except for the unit 3 explosion where no measurement was available. The comparisons between the simulations of atmospheric dispersion and deposition of the retrieved source term show good agreement with environmental observations. Moreover, an important outcome of this study is that the method proved to be perfectly suited to crisis management and should contribute to improving our response in case of a nuclear accident.

  10. Making the right long-term prescription for medical equipment financing.

    PubMed

    Conbeer, George P

    2007-06-01

    For hospital financial executives charged with assessing new technologies, obtaining access to sufficient information to support an in-depth analysis can be a daunting challenge. The information should come not only from direct sources, such as the equipment manufacturer, but also from indirect sources, such as leasing companies. A thorough knowledge of financing methods--including tax-exempt bonds, bank debt, standard leasing, tax-exempt leasing, and equipment rental terms--is critical.

  11. Seismic hazard assessment over time: Modelling earthquakes in Taiwan

    NASA Astrophysics Data System (ADS)

    Chan, Chung-Han; Wang, Yu; Wang, Yu-Ju; Lee, Ya-Ting

    2017-04-01

    To assess the seismic hazard with temporal change in Taiwan, we develop a new approach, combining both the Brownian Passage Time (BPT) model and the Coulomb stress change, and implement the seismogenic source parameters by the Taiwan Earthquake Model (TEM). The BPT model was adopted to describe the rupture recurrence intervals of the specific fault sources, together with the time elapsed since the last fault-rupture to derive their long-term rupture probability. We also evaluate the short-term seismicity rate change based on the static Coulomb stress interaction between seismogenic sources. By considering above time-dependent factors, our new combined model suggests an increased long-term seismic hazard in the vicinity of active faults along the western Coastal Plain and the Longitudinal Valley, where active faults have short recurrence intervals and long elapsed time since their last ruptures, and/or short-term elevated hazard levels right after the occurrence of large earthquakes due to the stress triggering effect. The stress enhanced by the February 6th, 2016, Meinong ML 6.6 earthquake also significantly increased rupture probabilities of several neighbouring seismogenic sources in Southwestern Taiwan and raised hazard level in the near future. Our approach draws on the advantage of incorporating long- and short-term models, to provide time-dependent earthquake probability constraints. Our time-dependent model considers more detailed information than any other published models. It thus offers decision-makers and public officials an adequate basis for rapid evaluations of and response to future emergency scenarios such as victim relocation and sheltering.
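    The time-dependent element of the approach, a Brownian Passage Time renewal model conditioned on the elapsed time since the last rupture, can be sketched as follows. The mean recurrence interval and aperiodicity are illustrative values, not TEM fault parameters, and the Coulomb stress adjustment is omitted.

```python
import numpy as np

def bpt_pdf(t, mu, alpha):
    """Brownian Passage Time (inverse Gaussian) density with mean recurrence
    interval mu and aperiodicity alpha."""
    return np.sqrt(mu / (2.0 * np.pi * alpha ** 2 * t ** 3)) * \
        np.exp(-(t - mu) ** 2 / (2.0 * mu * alpha ** 2 * t))

def conditional_rupture_probability(t_elapsed, dt, mu, alpha):
    """P(rupture in [t_elapsed, t_elapsed + dt] | no rupture before t_elapsed)."""
    t = np.linspace(1e-3, t_elapsed + dt, 200_000)
    pdf = bpt_pdf(t, mu, alpha)
    cdf = np.cumsum(pdf) * (t[1] - t[0])          # simple numerical CDF
    f_elapsed = np.interp(t_elapsed, t, cdf)
    f_end = np.interp(t_elapsed + dt, t, cdf)
    return (f_end - f_elapsed) / (1.0 - f_elapsed)

# Example: mean recurrence 300 yr, aperiodicity 0.5, 150 yr elapsed, next 50 yr.
print(conditional_rupture_probability(150.0, 50.0, mu=300.0, alpha=0.5))
```

    In a combined model of the kind described above, a short-term seismicity-rate factor derived from the static Coulomb stress change would then scale this long-term renewal probability.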

  12. Generation of GHS Scores from TEST and online sources ...

    EPA Pesticide Factsheets

    Alternatives assessment frameworks such as DfE (Design for the Environment) evaluate chemical alternatives in terms of human health effects, ecotoxicity, and fate. T.E.S.T. (Toxicity Estimation Software Tool) can be utilized to evaluate human health in terms of acute oral rat toxicity, developmental toxicity, endocrine activity, and mutagenicity. It can be used to evaluate ecotoxicity (in terms of acute fathead minnow toxicity) and fate (in terms of bioconcentration factor). It can also be used to estimate a variety of key physicochemical properties such as melting point, boiling point, vapor pressure, water solubility, and bioconcentration factor. A web-based version of T.E.S.T. is currently being developed to allow predictions to be made from other web tools. Online data sources such as NCCT’s Chemistry Dashboard, REACH dossiers, or ChemHat.org can also be utilized to obtain GHS (Global Harmonization System) scores for comparing alternatives. The purpose of this talk is to show how GHS (Global Harmonization System) data can be obtained from literature sources and from T.E.S.T. (Toxicity Estimation Software Tool). This data will be used to compare chemical alternatives in the alternatives assessment dashboard (a 2018 CSS product).

  13. Source-term development for a contaminant plume for use by multimedia risk assessment models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whelan, Gene; McDonald, John P.; Taira, Randal Y.

    1999-12-01

    Multimedia modelers from the U.S. Environmental Protection Agency (EPA) and the U.S. Department of Energy (DOE) are collaborating to conduct a comprehensive and quantitative benchmarking analysis of four intermedia models: DOE's Multimedia Environmental Pollutant Assessment System (MEPAS), EPA's MMSOILS, EPA's PRESTO, and DOE's RESidual RADioactivity (RESRAD). These models represent typical analytically, semi-analytically, and empirically based tools that are utilized in human risk and endangerment assessments for use at installations containing radioactive and/or hazardous contaminants. Although the benchmarking exercise traditionally emphasizes the application and comparison of these models, the establishment of a Conceptual Site Model (CSM) should be viewed with equal importance. This paper reviews an approach for developing a CSM of an existing, real-world, Sr-90 plume at DOE's Hanford installation in Richland, Washington, for use in a multimedia-based benchmarking exercise between MEPAS, MMSOILS, PRESTO, and RESRAD. In an unconventional move for analytically based modeling, the benchmarking exercise will begin with the plume as the source of contamination. The source and release mechanism are developed and described within the context of performing a preliminary risk assessment utilizing these analytical models. By beginning with the plume as the source term, this paper reviews a typical process and procedure an analyst would follow in developing a CSM for use in a preliminary assessment using this class of analytical tool.

  14. Education and Human Resources Sector Assessment Manual.

    ERIC Educational Resources Information Center

    Pigozzi, Mary Joy; Cieutat, Victor J.

    This manual endorses and adopts the sector-assessment approach for planning and managing the allocation of educational resources. Chapter 1 presents the manual's goals. Chapter 2 describes the manual's content and information sources, explains the term "sector assessment," identifies the groups that benefit from recommendations made by…

  15. Access to safe water in rural Artibonite, Haiti 16 months after the onset of the cholera epidemic.

    PubMed

    Patrick, Molly; Berendes, David; Murphy, Jennifer; Bertrand, Fabienne; Husain, Farah; Handzel, Thomas

    2013-10-01

    Haiti has the lowest improved water and sanitation coverage in the Western Hemisphere and is suffering from the largest cholera epidemic on record. In May of 2012, an assessment was conducted in rural areas of the Artibonite Department to describe the type and quality of water sources and determine knowledge, access, and use of household water treatment products to inform future programs. It was conducted after emergency response was scaled back but before longer-term water, sanitation, and hygiene activities were initiated. The household survey and source water quality analysis documented low access to safe water, with only 42.3% of households using an improved drinking water source. One-half (50.9%) of the improved water sources tested positive for Escherichia coli. Of households with water to test, 12.7% had positive chlorine residual. The assessment reinforces the identified need for major investments in safe water and sanitation infrastructure and the importance of household water treatment to improve access to safe water in the near term.

  16. LONG TERM HYDROLOGICAL IMPACT ASSESSMENT (LTHIA)

    EPA Science Inventory

    LTHIA is a universal Urban Sprawl analysis tool that is available to all at no charge through the Internet. It estimates impacts on runoff, recharge and nonpoint source pollution resulting from past or proposed land use changes. It gives long-term average annual runoff for a lan...

  17. Spent fuel radionuclide source-term model for assessing spent fuel performance in geological disposal. Part I: Assessment of the instant release fraction

    NASA Astrophysics Data System (ADS)

    Johnson, Lawrence; Ferry, Cécile; Poinssot, Christophe; Lovera, Patrick

    2005-11-01

    A source-term model for the short-term release of radionuclides from spent nuclear fuel (SNF) has been developed. It provides quantitative estimates of the fraction of various radionuclides that are expected to be released rapidly (the instant release fraction, or IRF) when water contacts the UO2 or MOX fuel after container breaching in a geological repository. The estimates are based on correlation of leaching data for radionuclides with fuel burnup and fission gas release. Extrapolation of the data to higher fuel burnup values is based on examination of data on fuel restructuring, such as rim development, and on fission gas release data, which permits bounding IRF values to be estimated assuming that radionuclide releases will be less than fission gas release. The consideration of long-term solid-state changes influencing the IRF prior to canister breaching is addressed by evaluating alpha self-irradiation enhanced diffusion, which may gradually increase the accumulation of fission products at grain boundaries.
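    The bounding logic described in the abstract can be summarized schematically; the symbols are illustrative rather than the paper's notation:

    \[
    \mathrm{IRF}_i(\mathrm{BU}) \;\le\; F_{\mathrm{FGR}}(\mathrm{BU}),
    \]

    i.e., for each labile radionuclide i the instant release fraction at burnup BU is taken to be correlated with, and bounded above by, the fission gas release fraction F_FGR at that burnup, with a further time-dependent increment allowed for the slow accumulation of fission products at grain boundaries by alpha self-irradiation enhanced diffusion prior to canister breaching.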

  18. A Comprehensive Probabilistic Tsunami Hazard Assessment: Multiple Sources and Short-Term Interactions

    NASA Astrophysics Data System (ADS)

    Anita, G.; Selva, J.; Laura, S.

    2011-12-01

    We develop a comprehensive and total probabilistic tsunami hazard assessment (TotPTHA), in which many different possible source types contribute to the definition of the total tsunami hazard at given target sites. From a multi-hazard and multi-risk perspective, such an innovative approach allows, in principle, all possible tsunamigenic sources to be considered, from seismic events to slides, asteroids, volcanic eruptions, etc. In this respect, we also formally introduce and discuss the treatment of interaction/cascade effects in the TotPTHA analysis. We demonstrate how external triggering events may induce significant temporary variations in the tsunami hazard. Because of this, such effects should always be considered, at least in short-term applications, to obtain unbiased analyses. Finally, we prove the feasibility of the TotPTHA and of the treatment of interaction/cascade effects by applying this methodology to an ideal region with realistic characteristics (Neverland).

  19. Comparing different types of source memory attributes in dementia of Alzheimer's type.

    PubMed

    Mammarella, Nicola; Fairfield, Beth; Di Domenico, Alberto

    2012-04-01

    Source monitoring (SM) refers to our ability to discriminate between memories from different sources. Twenty healthy high-cognitive functioning older adults, 20 healthy low-cognitive functioning older adults, and 20 older adults with dementia of Alzheimer's type (DAT) were asked to perform a series of SM tasks that varied in terms of the to-be-remembered source attribute (perceptual, spatial, temporal, semantic, social, and affective details). Results indicated that older DAT adults had greater difficulty in SM compared to the healthy control groups, especially with spatial and semantic details. Data are discussed in terms of the SM framework and suggest that poor memory for some types of source information may be considered as an important indicator of clinical memory function when assessing for the presence and severity of dementia.

  20. Generation of GHS Scores from TEST and online sources

    EPA Science Inventory

    Alternatives assessment frameworks such as DfE (Design for the Environment) evaluate chemical alternatives in terms of human health effects, ecotoxicity, and fate. T.E.S.T. (Toxicity Estimation Software Tool) can be utilized to evaluate human health in terms of acute oral rat tox...

  1. MODELING MINERAL NITROGEN EXPORT FROM A FOREST TERRESTRIAL ECOSYSTEM TO STREAMS

    EPA Science Inventory

    Terrestrial ecosystems are major sources of N pollution to aquatic ecosystems. Predicting N export to streams is a critical goal of non-point source modeling. This study was conducted to assess the effect of terrestrial N cycling on stream N export using long-term monitoring da...

  2. Implementation of the Leaching Environmental Assessment Framework

    EPA Science Inventory

    New leaching tests are available in the U.S. for developing more accurate source terms for use in fate and transport models. For beneficial use or disposal, the use of the leaching environmental assessment framework (LEAF) will provide leaching results that reflect field condit...

  3. A New Generation of Leaching Tests – The Leaching Environmental Assessment Framework

    EPA Science Inventory

    Provides an overview of newly released leaching tests that provide a more accurate source term when estimating environmental release of metals and other constituents of potential concern (COPCs). The Leaching Environmental Assessment Framework (LEAF) methods have been (1) develo...

  4. Source term evaluation model for high-level radioactive waste repository with decay chain build-up.

    PubMed

    Chopra, Manish; Sunny, Faby; Oza, R B

    2016-09-18

    A source term model based on a two-component leach flux concept is developed for a high-level radioactive waste repository. The long-lived radionuclides associated with high-level waste may give rise to the build-up of activity because of radioactive decay chains. The ingrowth of progeny is incorporated in the model using the Bateman decay chain build-up equations. The model is applied to different radionuclides present in the high-level radioactive waste, which form a part of decay chains (4n to 4n + 3 series), and the activity of the parent and daughter radionuclides leaching out of the waste matrix is estimated. Two cases are considered: one in which only the parent is initially present in the waste and another in which the daughters are also initially present in the waste matrix. The incorporation of in situ production of daughter radionuclides in the source is important for realistic estimates. It is shown that the inclusion of decay chain build-up is essential to avoid underestimation in the radiological impact assessment of the repository. The model can be a useful tool for evaluating the source term of the radionuclide transport models used for the radiological impact assessment of high-level radioactive waste repositories.
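    The decay chain build-up invoked in the abstract follows the Bateman solution; a minimal sketch for a straight chain with distinct decay constants and only the parent initially present is given below. The example chain and half-lives are placeholders, not values from the paper.

```python
import numpy as np

def bateman(t, n1_0, lambdas):
    """Number of atoms of each member of a straight decay chain at time t,
    given n1_0 atoms of the parent at t = 0 and a list of (distinct) decay
    constants. Returns one value per chain member."""
    out = []
    for n in range(1, len(lambdas) + 1):
        lam = lambdas[:n]
        coeff = n1_0 * np.prod(lam[:-1]) if n > 1 else n1_0
        total = 0.0
        for i in range(n):
            denom = np.prod([lam[j] - lam[i] for j in range(n) if j != i])
            total += np.exp(-lam[i] * t) / denom
        out.append(coeff * total)
    return out

# Example: a hypothetical 3-member chain with half-lives of 1000, 50 and 5 years.
lams = np.log(2.0) / np.array([1000.0, 50.0, 5.0])   # decay constants [1/yr]
print(bateman(t=200.0, n1_0=1.0e20, lambdas=list(lams)))
```

    Multiplying each member's atom count by its decay constant gives the activity that feeds the two-component leach flux at a given time.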

  5. Design, implementation, and initial results from a water-quality monitoring network for Atlanta, Georgia, USA

    USGS Publications Warehouse

    Horowitz, A.J.; Elrick, K.A.; Smith, J.J.

    2005-01-01

    In cooperation with the City of Atlanta, Georgia, the US Geological Survey has designed and implemented a water-quantity and quality monitoring network that measures a variety of biological and chemical constituents in water and suspended sediment. The network consists of 20 long-term monitoring sites and is intended to assess water-quality trends in response to planned infrastructural improvements. Initial results from the network indicate that nonpoint-source contributions may be more significant than point-source contributions for selected sediment-associated trace elements and nutrients. There also are indications of short-term discontinuous point-source contributions of these same constituents during baseflow.

  6. EXPERIENCES FROM THE SOURCE-TERM ANALYSIS OF A LOW AND INTERMEDIATE LEVEL RADWASTE DISPOSAL FACILITY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, Jin Beak; Park, Joo-Wan; Lee, Eun-Young

    2003-02-27

    Enhancement of a computer code SAGE for evaluation of the Korean concept for a LILW waste disposal facility is discussed. Several features of source term analysis are embedded into SAGE to analyze: (1) effects of degradation mode of an engineered barrier, (2) effects of dispersion phenomena in the unsaturated zone and (3) effects of time dependent sorption coefficient in the unsaturated zone. IAEA's Vault Safety Case (VSC) approach is used to demonstrate the ability of this assessment code. Results of MASCOT are used for comparison purposes. These enhancements of the safety assessment code, SAGE, can contribute to realistic evaluation of the Korean concept of the LILW disposal project in the near future.

  7. Estimation of the caesium-137 source term from the Fukushima Daiichi nuclear power plant using a consistent joint assimilation of air concentration and deposition observations

    NASA Astrophysics Data System (ADS)

    Winiarek, Victor; Bocquet, Marc; Duhanyan, Nora; Roustan, Yelva; Saunier, Olivier; Mathieu, Anne

    2014-01-01

    Inverse modelling techniques can be used to estimate the amount of radionuclides and the temporal profile of the source term released in the atmosphere during the accident at the Fukushima Daiichi nuclear power plant in March 2011. In Winiarek et al. (2012b), the lower bounds of the caesium-137 and iodine-131 source terms were estimated with such techniques, using activity concentration measurements. The importance of an objective assessment of prior errors (the observation errors and the background errors) was emphasised for a reliable inversion. In such a critical context, where the meteorological conditions can make the source term partly unobservable and where only a few observations are available, such prior estimation techniques are mandatory, the retrieved source term being very sensitive to this estimation. We propose to extend the use of these techniques to the estimation of prior errors when assimilating observations from several data sets. The aim is to compute an estimate of the caesium-137 source term jointly using all available data about this radionuclide, such as activity concentrations in the air, but also daily fallout measurements and total cumulated fallout measurements. It is crucial to properly and simultaneously estimate the background errors and the prior errors relative to each data set. A proper estimation of prior errors is also a necessary condition to reliably estimate the a posteriori uncertainty of the estimated source term. Using such techniques, we retrieve a total released quantity of caesium-137 in the interval 11.6-19.3 PBq with an estimated standard deviation range of 15-20% depending on the method and the data sets. The “blind” time intervals of the source term have also been strongly mitigated compared with the first estimations based only on activity concentration data.
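    The variational machinery underlying this family of inversions can be sketched, under simplifying assumptions, as a Tikhonov-regularized least squares with diagonal observation and background error covariances. This is a generic illustration, not the authors' estimator; all matrices below are synthetic placeholders, and a realistic application would build H column-by-column from dispersion model runs and estimate the error variances objectively, as the abstract emphasizes.

```python
import numpy as np

def invert_source(H, y, s_b, sigma_obs, sigma_b):
    """Minimise J(s) = ||y - H s||^2 / sigma_obs^2 + ||s - s_b||^2 / sigma_b^2
    for the release-rate vector s (one entry per emission time step).
    H maps unit releases to the observations (from a dispersion model)."""
    R_inv = np.eye(len(y)) / sigma_obs ** 2      # observation-error precision
    B_inv = np.eye(len(s_b)) / sigma_b ** 2      # background-error precision
    A = H.T @ R_inv @ H + B_inv
    b = H.T @ R_inv @ y + B_inv @ s_b
    s_hat = np.linalg.solve(A, b)
    posterior_cov = np.linalg.inv(A)             # a posteriori uncertainty
    return s_hat, posterior_cov

# Tiny synthetic example (placeholder numbers): 5 release steps, 8 observations.
rng = np.random.default_rng(1)
H = rng.random((8, 5))
s_true = np.array([0.0, 2.0, 5.0, 1.0, 0.0])
y = H @ s_true + rng.normal(0, 0.1, 8)
s_hat, P = invert_source(H, y, s_b=np.zeros(5), sigma_obs=0.1, sigma_b=5.0)
print(np.round(s_hat, 2))
```

    Assimilating several data sets jointly amounts to stacking their observation vectors and operators with their own error variances, which is why the objective estimation of those variances matters so much here.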

  8. Short-term emergency response planning and risk assessment via an integrated modeling system for nuclear power plants in complex terrain

    NASA Astrophysics Data System (ADS)

    Chang, Ni-Bin; Weng, Yu-Chi

    2013-03-01

    Short-term predictions of potential impacts from accidental release of various radionuclides at nuclear power plants are acutely needed, especially after the Fukushima accident in Japan. An integrated modeling system that provides expert services to assess the consequences of accidental or intentional releases of radioactive materials to the atmosphere has received wide attention. These scenarios can be initiated either by accident due to human, software, or mechanical failures, or from intentional acts such as sabotage and radiological dispersal devices. Stringent action might be required just minutes after the occurrence of accidental or intentional release. Although previous studies fulfill the basic functions of emergency preparedness and response systems, they seldom consider the suitability of air pollutant dispersion models or the connectivity between source term, dispersion, and exposure assessment models in a holistic context for decision support. Therefore, the Gaussian plume and puff models, which are only suitable for illustrating neutral air pollutants in flat terrain under limited meteorological conditions, are frequently used to predict the impact of accidental releases from industrial sources. In situations with complex terrain or special meteorological conditions, the proposed emergency response actions might be questionable and even intractable for decision-makers responsible for maintaining public health and environmental quality. This study is a preliminary effort to integrate the source term, dispersion, and exposure assessment models into a Spatial Decision Support System (SDSS) to tackle the complex issues of short-term emergency response planning and risk assessment at nuclear power plants. Through a series of model screening procedures, we found that the diagnostic (objective) wind field model with the aid of sufficient on-site meteorological monitoring data was the most applicable model to promptly address the trend of local wind field patterns. However, most of the hazardous materials being released into the environment from nuclear power plants are not neutral pollutants, so the particle and multi-segment puff models can be regarded as the most suitable models to incorporate into the output of the diagnostic wind field model in a modern emergency preparedness and response system. The proposed SDSS illustrates the state-of-the-art system design based on the situation of complex terrain in South Taiwan. This system design of SDSS with 3-dimensional animation capability using a tailored source term model in connection with ArcView® Geographical Information System map layers and remote sensing images is useful for meeting the design goal of nuclear power plants located in complex terrain.
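    For context, multi-segment puff models of the kind recommended in the abstract superpose contributions of individual Gaussian puffs advected by the diagnostic wind field. A single textbook puff term (with ground reflection) is written below in generic notation; it is not the formulation implemented in the proposed SDSS:

    \[
    \chi(x,y,z,t) = \frac{Q}{(2\pi)^{3/2}\,\sigma_x \sigma_y \sigma_z}
    \exp\!\left[-\frac{(x-x_c(t))^2}{2\sigma_x^2}-\frac{(y-y_c(t))^2}{2\sigma_y^2}\right]
    \left\{\exp\!\left[-\frac{(z-H)^2}{2\sigma_z^2}\right]+\exp\!\left[-\frac{(z+H)^2}{2\sigma_z^2}\right]\right\},
    \]

    where Q is the puff mass, (x_c(t), y_c(t)) the puff centre advected by the wind field, H the effective release height, and σx, σy, σz the growing puff spreads. The total concentration is the sum over all released puff segments, and for radionuclides each term would also carry a radioactive-decay factor e^{-λt}.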

  9. Radionuclides in the Arctic seas from the former Soviet Union: Potential health and ecological risks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Layton, D W; Edson, R; Varela, M

    1999-11-15

    The primary goal of the assessment reported here is to evaluate the health and environmental threat to coastal Alaska posed by radioactive-waste dumping in the Arctic and Northwest Pacific Oceans by the FSU. In particular, the FSU discarded 16 nuclear reactors from submarines and an icebreaker in the Kara Sea near the island of Novaya Zemlya, of which 6 contained spent nuclear fuel (SNF); disposed of liquid and solid wastes in the Sea of Japan; lost a {sup 90}Sr-powered radioisotope thermoelectric generator in the Sea of Okhotsk; and disposed of liquid wastes at several sites in the Pacific Ocean, east of the Kamchatka Peninsula. In addition to these known sources in the oceans, the RAIG evaluated FSU waste-disposal practices at inland weapons-development sites that have contaminated major rivers flowing into the Arctic Ocean. The RAIG evaluated these sources for the potential for release to the environment, transport, and impact on Alaskan ecosystems and peoples through a variety of scenarios, including a worst-case total instantaneous and simultaneous release of the sources under investigation. The risk-assessment process described in this report is applicable to and can be used by other circumpolar countries, with the addition of information about specific ecosystems and human life-styles. They can use the ANWAP risk-assessment framework and approach used by ONR to establish potential doses for Alaska, but add their own specific data sets about human and ecological factors. The ANWAP risk assessment addresses the following Russian wastes, media, and receptors: dumped nuclear submarines and icebreaker in Kara Sea--marine pathways; solid reactor parts in Sea of Japan and Pacific Ocean--marine pathways; thermoelectric generator in Sea of Okhotsk--marine pathways; current known aqueous wastes in Mayak reservoirs and Asanov Marshes--riverine to marine pathways; and Alaska as receptor. For the wastes and source terms addressed, other pathways, such as atmospheric transport, could be considered in future funded research efforts for impacts on Alaska. The ANWAP risk assessment does not address the following wastes, media, and receptors: radioactive sources in Alaska (except to add perspective for the Russian source term); radioactive wastes associated with Russian naval military operations and decommissioning; nonaqueous source terms from Russian production reactor and spent-fuel reprocessing facilities; atmospheric, terrestrial and nonaqueous pathways; and dose calculations for any circumpolar locality other than Alaska. These other, potentially serious sources of radioactivity to the Arctic environment, while outside the scope of the current ANWAP mandate, should be considered in future funded research efforts.

  10. The exact calculation of quadrupole sources for some incompressible flows

    NASA Technical Reports Server (NTRS)

    Brentner, Kenneth S.

    1988-01-01

    This paper is concerned with the application of Lighthill's acoustic analogy to the acoustic and aerodynamic problems associated with moving bodies. The Ffowcs Williams-Hawkings (FW-H) equation, which is an interpretation of the acoustic analogy for sound generation by moving bodies, recasts the source terms into surface and volume sources. Quite often in practice the volume sources, or quadrupoles, are neglected for various reasons. Recently, Farassat, Long and others have attempted to use the FW-H equation, with the quadrupole source neglected, to solve for the surface pressure on the body. The purpose of this paper is to examine the contribution of the quadrupole source to the acoustic pressure and body surface pressure for some problems for which the exact solution is known. The inviscid, incompressible, 2-D flow, calculated using the velocity potential, is used to calculate the individual contributions of the various surface and volume source terms in the FW-H equation. The relative importance of each of the sources is then assessed.
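
    For reference, one common textbook statement of the FW-H equation for an impenetrable surface f = 0 (the notation here is generic and not necessarily that of the paper) separates the three source types:

$$
\bar{\Box}^2 p'(\mathbf{x},t) \;=\; \underbrace{\frac{\partial^2}{\partial x_i\,\partial x_j}\big[T_{ij}\,H(f)\big]}_{\text{quadrupole (volume)}} \;-\; \underbrace{\frac{\partial}{\partial x_i}\big[P_{ij}\,n_j\,\delta(f)\big]}_{\text{loading (surface)}} \;+\; \underbrace{\frac{\partial}{\partial t}\big[\rho_0\,v_n\,\delta(f)\big]}_{\text{thickness (surface)}},
\qquad T_{ij} = \rho u_i u_j + P_{ij} - c_0^2\,\rho'\,\delta_{ij},
$$

    where H and δ are the Heaviside and Dirac functions, P_ij is the compressive stress tensor, v_n the local normal velocity of the surface, and T_ij the Lighthill stress tensor. The first term is the quadrupole source whose contribution the paper examines; the other two are the surface sources usually retained.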

  11. Photovoltaic highway applications: Assessment of the near-term market

    NASA Technical Reports Server (NTRS)

    Rosenblum, L.; Scudder, L. R.; Bifano, W. J.; Poley, W. A.

    1977-01-01

    A preliminary assessment of the near-term market for photovoltaic highway applications is presented. Among the potential users, two market sectors are considered: government and commercial. Within these sectors, two possible application areas, signs and motorist aids, are discussed. Based on judgemental information, obtained by a brief survey of representatives of the two user sectors, the government sector appears more amenable to the introduction of photovoltaic power sources for highway applications in the near-term. However, considerable interest and potential opportunities were also found to exist in the commercial sector. Further studies to quantify the market for highway applications appear warranted.

  12. Access to Safe Water in Rural Artibonite, Haiti 16 Months after the Onset of the Cholera Epidemic

    PubMed Central

    Patrick, Molly; Berendes, David; Murphy, Jennifer; Bertrand, Fabienne; Husain, Farah; Handzel, Thomas

    2013-01-01

    Haiti has the lowest improved water and sanitation coverage in the Western Hemisphere and is suffering from the largest cholera epidemic on record. In May of 2012, an assessment was conducted in rural areas of the Artibonite Department to describe the type and quality of water sources and determine knowledge, access, and use of household water treatment products to inform future programs. It was conducted after emergency response was scaled back but before longer-term water, sanitation, and hygiene activities were initiated. The household survey and source water quality analysis documented low access to safe water, with only 42.3% of households using an improved drinking water source. One-half (50.9%) of the improved water sources tested positive for Escherichia coli. Of households with water to test, 12.7% had positive chlorine residual. The assessment reinforces the identified need for major investments in safe water and sanitation infrastructure and the importance of household water treatment to improve access to safe water in the near term. PMID:24106191

  13. Assessment of macroseismic intensity in the Nile basin, Egypt

    NASA Astrophysics Data System (ADS)

    Fergany, Elsayed

    2018-01-01

    This work presents a deterministic seismic hazard and risk analysis for the Egyptian Nile basin sector in terms of a maximum expected intensity map. A seismic source zone model of Egypt was delineated based on an updated, compatible earthquake catalog from 2015, focal mechanisms, and the common tectonic elements. Four effective seismic source zones were identified along the Nile basin. The observed macroseismic intensity data along the basin were used to develop an intensity prediction equation defined in terms of moment magnitude. A maximum expected intensity map was then derived from the developed intensity prediction equation, the identified effective seismic source zones, and the maximum expected magnitude for each zone along the basin. The earthquake hazard and risk were discussed and analyzed in view of the maximum expected moment magnitude and the maximum expected intensity values for each effective source zone. Moderate expected magnitudes would pose a high risk to the Cairo and Aswan regions. The results of this study could serve as a recommendation for the planners in charge of mitigating seismic risk in these strategic zones of Egypt.
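
    An intensity prediction equation of the generic form I = c0 + c1*Mw + c2*log10(R) can be calibrated by least squares. The sketch below fits synthetic observations and then evaluates an expected maximum intensity for an assumed zone magnitude and distance; the functional form and all numbers are illustrative assumptions, not the equation developed in the study.

```python
import numpy as np

# Generic intensity prediction equation (IPE):
#   I = c0 + c1*Mw + c2*log10(R)     (R = hypocentral distance, km)
# fitted to synthetic macroseismic observations by least squares.
rng = np.random.default_rng(1)
mw = rng.uniform(4.0, 6.5, 200)                  # moment magnitudes
r = rng.uniform(10.0, 300.0, 200)                # distances (km)
true_c = np.array([3.0, 1.2, -3.0])              # illustrative coefficients
intensity = (true_c[0] + true_c[1] * mw + true_c[2] * np.log10(r)
             + rng.normal(0, 0.3, 200))          # "observed" intensities

X = np.column_stack([np.ones_like(mw), mw, np.log10(r)])
coef, *_ = np.linalg.lstsq(X, intensity, rcond=None)
print("fitted c0, c1, c2:", coef)

# Expected maximum intensity for a hypothetical zone with Mw_max = 6.0 at 25 km
print("I_max estimate:", coef[0] + coef[1] * 6.0 + coef[2] * np.log10(25.0))
```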

  14. Risk assessment of water pollution sources based on an integrated k-means clustering and set pair analysis method in the region of Shiyan, China.

    PubMed

    Li, Chunhui; Sun, Lian; Jia, Junxiang; Cai, Yanpeng; Wang, Xuan

    2016-07-01

    Source water areas face many potential water pollution risks, and risk assessment is an effective method to evaluate them. In this paper an integrated model based on k-means clustering analysis and set pair analysis was established for evaluating the risks associated with water pollution in source water areas, in which the weights of indicators were determined through the entropy weight method. The proposed model was then applied to assess water pollution risks in the region of Shiyan, where the Danjiangkou Reservoir, the key source water area for the middle route of China's South-to-North Water Diversion Project, is located. The results showed that eleven sources with relatively high risk values were identified. At the regional scale, Shiyan City and Danjiangkou City had high risk values in terms of industrial discharge, while Danjiangkou City and Yunxian County had high risk values in terms of agricultural pollution. Overall, the risk values of the northern areas close to the main stream and reservoir of the region of Shiyan were higher than those in the south. The risk levels indicated that five sources were at a lower risk level (level II), two at a moderate risk level (level III), one at a higher risk level (level IV) and three at the highest risk level (level V). Risks from industrial discharge were also higher than those from the agricultural sector. It is thus essential to manage the pillar industries of the region of Shiyan and certain agricultural companies in the vicinity of the reservoir to reduce the water pollution risks of source water areas. Copyright © 2016 Elsevier B.V. All rights reserved.
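
    Of the three ingredients named above, the entropy weight method is the most mechanical and can be sketched compactly: indicator weights follow from the information entropy of the normalised indicator columns. The example below uses an invented 6-source, 4-indicator matrix and a simple weighted score; the k-means grading and set pair analysis steps of the paper are not reproduced.

```python
import numpy as np

def entropy_weights(X):
    """Entropy weight method for an (n_sources, m_indicators) matrix X.
    Larger indicator values are assumed to mean higher risk."""
    # min-max normalisation, then column-wise proportions
    Xn = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0) + 1e-12)
    P = (Xn + 1e-12) / (Xn + 1e-12).sum(axis=0)
    n = X.shape[0]
    e = -(P * np.log(P)).sum(axis=0) / np.log(n)   # entropy per indicator
    d = 1.0 - e                                    # degree of divergence
    return d / d.sum()                             # normalised weights

# Toy example: 6 pollution sources, 4 risk indicators (all values invented)
X = np.array([[0.8, 120, 3, 0.20],
              [0.4,  60, 1, 0.10],
              [0.9, 200, 4, 0.50],
              [0.2,  30, 1, 0.05],
              [0.6,  90, 2, 0.30],
              [0.7, 150, 3, 0.40]])
w = entropy_weights(X)
Xn = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0) + 1e-12)
risk = Xn @ w                                       # composite risk score per source
print("weights:", w.round(3))
print("composite risk scores:", risk.round(3))
```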

  15. Updating source term and atmospheric dispersion simulations for the dose reconstruction in Fukushima Daiichi Nuclear Power Station Accident

    NASA Astrophysics Data System (ADS)

    Nagai, Haruyasu; Terada, Hiroaki; Tsuduki, Katsunori; Katata, Genki; Ota, Masakazu; Furuno, Akiko; Akari, Shusaku

    2017-09-01

    In order to assess the radiological dose to the public resulting from the Fukushima Daiichi Nuclear Power Station (FDNPS) accident in Japan, especially for the early phase of the accident when no measured data are available for that purpose, the spatial and temporal distribution of radioactive materials in the environment is reconstructed by computer simulations. In this study, the atmospheric dispersion simulation of radioactive materials is improved by refining the source term of radioactive materials discharged into the atmosphere and by modifying the atmospheric transport, dispersion and deposition model (ATDM). A database of the spatiotemporal distribution of radioactive materials in the air and on the ground surface is then developed from the output of the simulation. This database is used in other studies for dose assessment by coupling it with the behavioral patterns of evacuees from the FDNPS accident. By improving the ATDM simulation to use a new meteorological model and a sophisticated deposition scheme, the simulations reproduced the 137Cs and 131I deposition patterns well. To further improve the reproducibility of the dispersion processes, the source term was refined by optimizing it against the improved ATDM simulation using new monitoring data.

  16. ADAPTATION OF THE ADVANCED STATISTICAL TRAJECTORY REGIONAL AIR POLLUTION (ASTRAP) MODEL TO THE EPA VAX COMPUTER - MODIFICATIONS AND TESTING

    EPA Science Inventory

    The Advanced Statistical Trajectory Regional Air Pollution (ASTRAP) model simulates long-term transport and deposition of oxides of sulfur and nitrogen. It is a potential screening tool for assessing long-term effects on regional visibility from sulfur emission sources. However, a rigorous...

  17. Conglomeration or Chameleon? Teachers' Representations of Language in the Assessment of Learners with English as an Additional Language.

    ERIC Educational Resources Information Center

    Gardner, Sheena; Rea-Dickins, Pauline

    2001-01-01

    Investigates teacher representations of language in relation to assessment contexts. Analyzes not only what is represented in teachers' use of metalanguage, but also how it is presented--in terms of expression, voice, and source. The analysis is based on interviews with teachers, transcripts of lessons, and classroom-based assessments, formal…

  18. [Schizophrenia and psychosis on the internet].

    PubMed

    Schrank, Beate; Seyringer, Michaela-Elena; Berger, Peter; Katschnig, Heinz; Amering, Michaela

    2006-09-01

    The internet is an increasingly important source of information for patients concerning their illness. This has to be borne in mind concerning its growing influence on communications between patients and clinicians. The aim of this study is to assess the quality of German-language information on schizophrenia on the internet. Two searches of the terms schizophrenia and psychosis were conducted, using the Google search engine set to produce only German hits. The quality of the first hundred resulting sites was assessed according to a range of criteria, including diagnosis and therapy, links and interactive offers. Evidence-based medical information was provided by more than half of the sites resulting from the search term schizophrenia and by less than one third of psychosis hits. Information and discussion on the relationship between drugs and psychosis appeared almost exclusively under the term psychosis. It is suggested that mental health care professionals can use knowledge on what sort of information their patients are confronted with on the internet in order to assist them in profiting from this source of information.

  19. COMPARATIVE POTENCY METHOD FOR CANCER RISK ASSESSMENT: APPLICATION TO THE QUANTITATIVE ASSESSMENT OF THE CONTRIBUTION OF COMBUSTION EMISSIONS TO LUNG CANCER RISK

    EPA Science Inventory

    Combustion sources emit soot particles containing carcinogenic polycyclic organic compounds which are mutagenic in short-term genetic bioassays in microbial and mammalian cells and are tumorigenic in animals. Although soot is considered to be a human carcinogen, soots from differ...

  20. Methods for nuclear air-cleaning-system accident-consequence assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andrae, R.W.; Bolstad, J.W.; Gregory, W.S.

    1982-01-01

    This paper describes a multilaboratory research program that is directed toward addressing many questions that analysts face when performing air cleaning accident consequence assessments. The program involves developing analytical tools and supportive experimental data that will be useful in making more realistic assessments of accident source terms within and up to the atmospheric boundaries of nuclear fuel cycle facilities. The types of accidents considered in this study include fires, explosions, spills, tornadoes, criticalities, and equipment failures. The main focus of the program is developing an accident analysis handbook (AAH). We will describe the contents of the AAH, which include descriptions of selected nuclear fuel cycle facilities, process unit operations, source-term development, and accident consequence analyses. Three computer codes designed to predict gas and material propagation through facility air cleaning systems are described. These computer codes address accidents involving fires (FIRAC), explosions (EXPAC), and tornadoes (TORAC). The handbook relies on many illustrative examples to show the analyst how to approach accident consequence assessments. We will use the FIRAC code and a hypothetical fire scenario to illustrate the accident analysis capability.

  1. A controlled variation scheme for convection treatment in pressure-based algorithm

    NASA Technical Reports Server (NTRS)

    Shyy, Wei; Thakur, Siddharth; Tucker, Kevin

    1993-01-01

    Convection effects and source terms are two primary sources of difficulty in computing the turbulent reacting flows typically encountered in propulsion devices. The present work aims to elucidate the individual as well as the collective roles of convection and source terms in the fluid flow equations, and to devise appropriate treatments and implementations to improve our current capability of predicting such flows. A controlled variation scheme (CVS) has been under development in the context of a pressure-based algorithm; it has the characteristic of adaptively regulating the amount of numerical diffusivity, relative to the central difference scheme, according to the variation of the local flow field. Both the basic concepts and a pragmatic assessment are presented to highlight the status of this work.
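
    As a loose illustration of the idea of adaptively regulating numerical diffusivity, the sketch below blends a dissipative first-order upwind flux with a central-difference flux according to a local smoothness ratio. This is a generic limiter-style blend, not the CVS formulation itself; the blending rule, the 1-D setting and the test profile are assumptions.

```python
import numpy as np

def blended_convective_flux(phi, u):
    """Face convective fluxes for a 1-D scalar phi with constant velocity u > 0,
    blending central differencing with first-order upwind according to the
    local variation of phi (a generic adaptive-diffusivity idea)."""
    flux = np.zeros(len(phi) - 1)
    for i in range(1, len(phi) - 1):
        # local smoothness indicator: ratio of consecutive gradients
        denom = phi[i + 1] - phi[i]
        r = (phi[i] - phi[i - 1]) / denom if abs(denom) > 1e-12 else 0.0
        gamma = max(0.0, min(1.0, r))                 # blend factor in [0, 1]
        f_upwind = u * phi[i]                          # dissipative, robust
        f_central = u * 0.5 * (phi[i] + phi[i + 1])    # accurate, oscillatory
        flux[i] = gamma * f_central + (1.0 - gamma) * f_upwind
    flux[0] = u * phi[0]                               # inflow face: upwind
    return flux

phi = np.array([0.0, 0.0, 0.2, 0.8, 1.0, 1.0, 1.0])   # a smeared step profile
print(blended_convective_flux(phi, u=1.0))
```

    In smooth, monotone regions the blend approaches the central scheme, while near sharp variations it reverts to upwind differencing, which is the qualitative behaviour the abstract attributes to the CVS.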

  2. Overview of major hazards. Part 2: Source term; dispersion; combustion; blast, missiles, venting; fire; radiation; runaway reactions; toxic substances; dust explosions

    NASA Astrophysics Data System (ADS)

    Vilain, J.

    Approaches to major hazard assessment and prediction are reviewed. Topics discussed include: source term (phenomenology/modeling of release, influence on early stages of dispersion); dispersion (atmospheric advection, diffusion and deposition, with emphasis on dense/cold gases); combustion (flammable clouds and mists, covering flash fires, deflagration, and transition to detonation; mostly unconfined/partly confined situations); blast formation, propagation, and interaction with structures; catastrophic fires (pool fires, torches and fireballs; highly reactive substances); runaway reactions; features of more general interest; toxic substances, excluding toxicology; and dust explosions (phenomenology and protective measures).

  3. Assessment of infrasound signals recorded on seismic stations and infrasound arrays in the western United States using ground truth sources

    NASA Astrophysics Data System (ADS)

    Park, Junghyun; Hayward, Chris; Stump, Brian W.

    2018-06-01

    Ground truth sources in Utah during 2003-2013 are used to assess the contribution of temporal atmospheric conditions to infrasound detection and the predictive capabilities of atmospheric models. Ground truth sources consist of 28 long duration static rocket motor burn tests and 28 impulsive rocket body demolitions. Automated infrasound detections from a hybrid of regional seismometers and infrasound arrays use a combination of short-term time average/long-term time average ratios and spectral analyses. These detections are grouped into station triads using a Delaunay triangulation network and then associated to estimate phase velocity and azimuth to filter signals associated with a particular source location. The resulting range and azimuth distribution from sources to detecting stations varies seasonally and is consistent with predictions based on seasonal atmospheric models. Impulsive signals from rocket body detonations are observed at greater distances (>700 km) than the extended duration signals generated by the rocket burn test (up to 600 km). Infrasound energy attenuation associated with the two source types is quantified as a function of range and azimuth from infrasound amplitude measurements. Ray-tracing results using Ground-to-Space atmospheric specifications are compared to these observations and illustrate the degree to which the time variations in characteristics of the observations can be predicted over a multiple year time period.
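
    The detection step described above combines short-term-average/long-term-average (STA/LTA) ratios with spectral analysis. The sketch below is a generic STA/LTA trigger applied to a synthetic trace, not the study's exact detector; the window lengths, threshold and sampling rate are assumptions.

```python
import numpy as np

def sta_lta(signal, fs, sta_win=1.0, lta_win=30.0, threshold=3.0):
    """Classic STA/LTA detector on the squared trace; returns the ratio and
    the sample indices where it first crosses the threshold."""
    power = signal.astype(float) ** 2
    n_sta = int(sta_win * fs)
    n_lta = int(lta_win * fs)
    sta = np.convolve(power, np.ones(n_sta) / n_sta, mode="same")
    lta = np.convolve(power, np.ones(n_lta) / n_lta, mode="same") + 1e-20
    ratio = sta / lta
    above = ratio > threshold
    onsets = np.flatnonzero(above[1:] & ~above[:-1]) + 1
    return ratio, onsets

# Toy trace: background noise with a transient arriving at 60 s
fs = 20.0
t = np.arange(0, 120, 1 / fs)
trace = np.random.default_rng(2).normal(0, 1, t.size)
trace[int(60 * fs):int(65 * fs)] += 8 * np.sin(2 * np.pi * 1.5 * t[:int(5 * fs)])
ratio, onsets = sta_lta(trace, fs)
print("detections at t =", t[onsets])
```

    In the study, detections of this kind from individual sensors are then grouped into Delaunay station triads and associated by phase velocity and back azimuth; that association step is not shown here.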

  4. Assessing the Impact of Source-Zone Remediation Efforts at the Contaminant-Plume Scale Through Analysis of Contaminant Mass Discharge

    PubMed Central

    Brusseau, M. L.; Hatton, J.; DiGuiseppi, W.

    2011-01-01

    The long-term impact of source-zone remediation efforts was assessed for a large site contaminated by trichloroethene. The impact of the remediation efforts (soil vapor extraction and in-situ chemical oxidation) was assessed through analysis of plume-scale contaminant mass discharge, which was measured using a high-resolution data set obtained from 23 years of operation of a large pump-and-treat system. The initial contaminant mass discharge peaked at approximately 7 kg/d, and then declined to approximately 2 kg/d. This latter value was sustained for several years prior to the initiation of source-zone remediation efforts. The contaminant mass discharge in 2010, measured several years after completion of the two source-zone remediation actions, was approximately 0.2 kg/d, which is ten times lower than the value prior to source-zone remediation. The time-continuous contaminant mass discharge data can be used to evaluate the impact of the source-zone remediation efforts on reducing the time required to operate the pump-and-treat system, and to estimate the cost savings associated with the decreased operational period. While significant reductions have been achieved, it is evident that the remediation efforts have not completely eliminated contaminant mass discharge and associated risk. Remaining contaminant mass contributing to the current mass discharge is hypothesized to comprise poorly-accessible mass in the source zones, as well as aqueous (and sorbed) mass present in the extensive lower-permeability units located within and adjacent to the contaminant plume. The fate of these sources is an issue of critical import to the remediation of chlorinated-solvent contaminated sites, and development of methods to address these sources will be required to achieve successful long-term management of such sites and to ultimately transition them to closure. PMID:22115080

  5. Sources of Evidence-of-Learning: Learning and Assessment in the Era of Big Data

    ERIC Educational Resources Information Center

    Cope, Bill; Kalantzis, Mary

    2015-01-01

    This article sets out to explore a shift in the sources of evidence-of-learning in the era of networked computing. One of the key features of recent developments has been popularly characterized as "big data". We begin by examining, in general terms, the frame of reference of contemporary debates on machine intelligence and the role of…

  6. Evolution of air pollution source contributions over one decade, derived by PM10 and PM2.5 source apportionment in two metropolitan urban areas in Greece

    NASA Astrophysics Data System (ADS)

    Diapouli, E.; Manousakas, M.; Vratolis, S.; Vasilatou, V.; Maggos, Th; Saraga, D.; Grigoratos, Th; Argyropoulos, G.; Voutsa, D.; Samara, C.; Eleftheriadis, K.

    2017-09-01

    Metropolitan urban areas in Greece are known to suffer from poor air quality, due to a variety of emission sources, topography and climatic conditions favouring the accumulation of pollution. While a number of control measures have been implemented since the 1990s, resulting in reductions of atmospheric pollution and changes in emission source contributions, the financial crisis which started in 2009 has significantly altered this picture. The present study is the first effort to assess the contribution of emission sources to PM10 and PM2.5 concentration levels and their long-term variability (over 5-10 years) in the two largest metropolitan urban areas in Greece (Athens and Thessaloniki). Intensive measurement campaigns were conducted during 2011-2012 at suburban, urban background and urban traffic sites in these two cities. In addition, available datasets from previous measurements in Athens and Thessaloniki were used in order to assess the long-term variability of concentrations and sources. Chemical composition analysis of the 2011-2012 samples showed that carbonaceous matter was the most abundant component for both PM size fractions. A significant increase of carbonaceous particle concentrations and of the OC/EC ratio during the cold period, especially at the residential urban background sites, pointed towards domestic heating, and more particularly wood (biomass) burning, as a significant source. PMF analysis further supported this finding. Biomass burning was the largest contributing source at the two urban background sites (with mean contributions for the two size fractions in the range of 24-46%). Secondary aerosol formation (sulphate, nitrate & organics) was also a major contributing source for both size fractions at the suburban and urban background sites. At the urban traffic site, vehicular traffic (exhaust and non-exhaust emissions) was the source with the highest contributions, accounting for 44% of PM10 and 37% of PM2.5, respectively. The long-term variability of emission sources in the two cities (over 5-10 years), assessed through a harmonized application of the PMF technique on recent and past year data, clearly demonstrates the effective reduction in emissions during the last decade due to control measures and technological development; however, it also reflects the effects of the financial crisis in Greece during these years, which has led to decreased economic activities and the adoption of more polluting practices by the local population in an effort to reduce living costs.
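
    Positive matrix factorization (PMF) decomposes the sample-by-species concentration matrix into non-negative source contributions and source profiles. The sketch below is a simplified stand-in using scikit-learn's NMF on synthetic data; unlike EPA PMF it does not weight the fit by measurement uncertainties, and all matrix sizes and values are illustrative assumptions.

```python
import numpy as np
from sklearn.decomposition import NMF

# X (samples x chemical species) is approximated as G @ F with non-negativity
# constraints, akin to PMF but without uncertainty weighting.
rng = np.random.default_rng(3)
n_samples, n_species, n_sources = 200, 15, 4
F_true = rng.uniform(0, 1, (n_sources, n_species))     # source profiles
G_true = rng.gamma(2.0, 1.0, (n_samples, n_sources))    # source contributions
X = np.clip(G_true @ F_true + rng.normal(0, 0.05, (n_samples, n_species)), 0, None)

model = NMF(n_components=n_sources, init="nndsvda", max_iter=1000, random_state=0)
G = model.fit_transform(X)          # estimated contributions per sample
F = model.components_               # estimated species signatures per factor

# mean relative contribution of each factor to the total reconstructed mass
contrib = (G * F.sum(axis=1)).sum(axis=0)
print("mean source contributions (%):", (100 * contrib / contrib.sum()).round(1))
```

    In practice the resolved factors are identified as biomass burning, secondary aerosol, traffic, etc., by inspecting their species signatures, which is how the contribution percentages quoted in the abstract are obtained.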

  7. Assessing the risk of second malignancies after modern radiotherapy

    PubMed Central

    Newhauser, Wayne D.; Durante, Marco

    2014-01-01

    Recent advances in radiotherapy have enabled the use of different types of particles, such as protons and heavy ions, as well as refinements to the treatment of tumours with standard sources (photons). However, the risk of second cancers arising in long-term survivors continues to be a problem. The long-term risks from treatments such as particle therapy have not yet been determined and are unlikely to become apparent for many years. Therefore, there is a need to develop risk assessments based on our current knowledge of radiation-induced carcinogenesis. PMID:21593785

  8. Influence of Iterative Reconstruction Algorithms on PET Image Resolution

    NASA Astrophysics Data System (ADS)

    Karpetas, G. E.; Michail, C. M.; Fountos, G. P.; Valais, I. G.; Nikolopoulos, D.; Kandarakis, I. S.; Panayiotakis, G. S.

    2015-09-01

    The aim of the present study was to assess the image quality of PET scanners using a thin layer chromatography (TLC) plane source. The source was simulated using a previously validated Monte Carlo model. The model was developed with the GATE MC package, and reconstructed images were obtained with the STIR software for tomographic image reconstruction. The simulated PET scanner was the GE DiscoveryST. A plane source consisting of a TLC plate was simulated as a layer of silica gel on aluminum (Al) foil substrates, immersed in an 18F-FDG bath solution (1 MBq). Image quality was assessed in terms of the modulation transfer function (MTF). MTF curves were estimated from transverse reconstructed images of the plane source. Images were reconstructed with the maximum likelihood estimation (MLE) OSMAPOSL, ordered subsets separable paraboloidal surrogate (OSSPS), median root prior (MRP), and OSMAPOSL-with-quadratic-prior algorithms. OSMAPOSL reconstruction was assessed using fixed subsets and various iterations, as well as various beta (hyper)parameter values. MTF values were found to increase with increasing iterations. MTF also improves with lower beta values. The simulated PET evaluation method, based on the TLC plane source, can be useful for the resolution assessment of PET scanners.
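
    The MTF assessment described here can be illustrated generically: a profile taken across the reconstructed image of the thin plane source serves as a line spread function (LSF), whose Fourier transform gives the MTF. The sketch below uses a synthetic Gaussian LSF and a hypothetical pixel spacing; it is not the GATE/STIR workflow of the study.

```python
import numpy as np

def mtf_from_lsf(lsf, pixel_mm):
    """Modulation transfer function from a line spread function (LSF),
    e.g. a profile across the reconstructed image of a thin plane source.
    Returns spatial frequencies (cycles/mm) and the normalised MTF."""
    lsf = lsf - lsf.min()
    lsf = lsf / lsf.sum()                       # area-normalise the LSF
    mtf = np.abs(np.fft.rfft(lsf))
    mtf = mtf / mtf[0]                          # normalise to unity at f = 0
    freqs = np.fft.rfftfreq(lsf.size, d=pixel_mm)
    return freqs, mtf

# Toy LSF: Gaussian blur with ~5 mm FWHM sampled on a 2 mm grid (assumed values)
pixel_mm = 2.0
x = np.arange(-40, 42, pixel_mm)
sigma = 5.0 / 2.355                             # FWHM -> standard deviation
lsf = np.exp(-x**2 / (2 * sigma**2))
freqs, mtf = mtf_from_lsf(lsf, pixel_mm)
print("MTF at 0.1 cycles/mm ≈", np.interp(0.1, freqs, mtf).round(3))
```

    A narrower LSF (better resolution) yields an MTF that stays closer to unity at higher spatial frequencies, which is the trend reported for more iterations and lower beta values.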

  9. Characterisation of exposure to non-ionising electromagnetic fields in the Spanish INMA birth cohort: study protocol.

    PubMed

    Gallastegi, Mara; Guxens, Mònica; Jiménez-Zabala, Ana; Calvente, Irene; Fernández, Marta; Birks, Laura; Struchen, Benjamin; Vrijheid, Martine; Estarlich, Marisa; Fernández, Mariana F; Torrent, Maties; Ballester, Ferrán; Aurrekoetxea, Juan J; Ibarluzea, Jesús; Guerra, David; González, Julián; Röösli, Martin; Santa-Marina, Loreto

    2016-02-18

    Analysis of the association between exposure to electromagnetic fields of non-ionising radiation (EMF-NIR) and health in children and adolescents is hindered by the limited availability of data, mainly due to the difficulties of exposure assessment. This study protocol describes the methodologies used for characterising the exposure of children to EMF-NIR in the INMA (INfancia y Medio Ambiente - Environment and Childhood) Project, a prospective cohort study. Indirect methods (proximity to emission sources, questionnaires on source use, and geospatial propagation models) and direct methods (spot and fixed longer-term measurements and personal measurements) were used to assess the exposure levels of study participants aged between 7 and 18 years. The methodology used varies depending on the frequency of the EMF-NIR and the environment (homes, schools and parks). Questionnaires assessed the use of sources contributing to both Extremely Low Frequency (ELF) and Radiofrequency (RF) exposure levels. Geospatial propagation models (NISMap) are implemented and validated for environmental outdoor sources of RF using spot measurements. Spot and fixed longer-term ELF and RF measurements were made in the environments where children spend most of their time. Moreover, personal measurements were taken in order to assess individual exposure to RF. The exposure data are used to explore their relationships with proximity to and/or use of EMF-NIR sources. Characterisation of EMF-NIR exposure by this combination of methods is intended to overcome problems encountered in other research. The assessment of the exposure of INMA cohort children and adolescents living in different regions of Spain to the full frequency range of EMF-NIR extends the characterisation of environmental exposures in this cohort. Together with other data obtained in the project, on socioeconomic and family characteristics and the development of the children and adolescents, this will enable evaluation of the complex interaction between health outcomes in children and adolescents and the various environmental factors that surround them.

  10. Term amniotic fluid: an unexploited reserve of mesenchymal stromal cells for reprogramming and potential cell therapy applications.

    PubMed

    Moraghebi, Roksana; Kirkeby, Agnete; Chaves, Patricia; Rönn, Roger E; Sitnicka, Ewa; Parmar, Malin; Larsson, Marcus; Herbst, Andreas; Woods, Niels-Bjarne

    2017-08-25

    Mesenchymal stromal cells (MSCs) are currently being evaluated in numerous pre-clinical and clinical cell-based therapy studies. Furthermore, there is an increasing interest in exploring alternative uses of these cells in disease modelling, pharmaceutical screening, and regenerative medicine by applying reprogramming technologies. However, the limited availability of MSCs from various sources restricts their use. Term amniotic fluid has been proposed as an alternative source of MSCs. Previously, only low volumes of term fluid and its cellular constituents have been collected, and current knowledge of the MSCs derived from this fluid is limited. In this study, we collected amniotic fluid at term using a novel collection system and evaluated amniotic fluid MSC content and their characteristics, including their feasibility to undergo cellular reprogramming. Amniotic fluid was collected at term caesarean section deliveries using a closed catheter-based system. Following fluid processing, amniotic fluid was assessed for cellularity, MSC frequency, in-vitro proliferation, surface phenotype, differentiation, and gene expression characteristics. Cells were also reprogrammed to the pluripotent stem cell state and differentiated towards neural and haematopoietic lineages. The average volume of term amniotic fluid collected was approximately 0.4 litres per donor, containing an average of 7 million viable mononuclear cells per litre, and a CFU-F content of 15 per 100,000 MNCs. Expanded CFU-F cultures showed similar surface phenotype, differentiation potential, and gene expression characteristics to MSCs isolated from traditional sources, and showed extensive expansion potential and rapid doubling times. Given the high proliferation rates of these neonatal source cells, we assessed them in a reprogramming application, where the derived induced pluripotent stem cells showed multigerm layer lineage differentiation potential. The potentially large donor base from caesarean section deliveries, the high yield of term amniotic fluid MSCs obtainable, the properties of the MSCs identified, and the suitability of the cells to be reprogrammed into the pluripotent state demonstrated these cells to be a promising and plentiful resource for further evaluation in bio-banking, cell therapy, disease modelling, and regenerative medicine applications.

  11. A comparison of color fidelity metrics for light sources using simulation of color samples under lighting conditions

    NASA Astrophysics Data System (ADS)

    Kwon, Hyeokjun; Kang, Yoojin; Jang, Junwoo

    2017-09-01

    Color fidelity has been used as one of the indices to evaluate the performance of light sources. Since the Color Rendering Index (CRI) was proposed by the CIE, many color fidelity metrics have been proposed to increase the accuracy of the metric. This paper focuses on a comparison of color fidelity metrics in terms of their agreement with human visual assessments. To visually evaluate the color fidelity of light sources, we built a simulator that reproduces color samples under given lighting conditions. Eighteen color samples of the Macbeth color checker under test light sources, and under the corresponding reference illuminant for each of them, are simulated and displayed on a well-characterized monitor. With only the spectrum of a test light source and its reference illuminant, color samples under any lighting condition can be reproduced. The spectra of two LED and two OLED light sources with similar CRI values are used for the visual assessment. In addition, the results of the visual assessment are compared with two color fidelity metrics: CRI and IES TM-30-15 (Rf), proposed by the Illuminating Engineering Society (IES) in 2015. Experimental results indicate that Rf outperforms CRI in terms of correlation with the visual assessment.

  12. Evaluation of Long-term Performance of Enhanced Anaerobic Source Zone Bioremediation using mass flux

    NASA Astrophysics Data System (ADS)

    Haluska, A.; Cho, J.; Hatzinger, P.; Annable, M. D.

    2017-12-01

    Chlorinated ethene DNAPL source zones in groundwater act as potential long-term sources of contamination as they dissolve, yielding concentrations well above MCLs and posing an ongoing public health risk. Enhanced bioremediation has been applied to treat many source zones with significant promise, but the long-term sustainability of this technology has not been thoroughly assessed. This study evaluated the long-term effectiveness of enhanced anaerobic source zone bioremediation at chloroethene-contaminated sites to determine whether the treatment prevented contaminant rebound and removed NAPL from the source zone. Long-term performance was evaluated based on the achievement of MCL-based contaminant mass fluxes in parent compound concentrations during different monitoring periods. Groundwater concentration versus time data were compiled for six sites, and post-remedial contaminant mass fluxes were measured using passive flux meters at wells both within and down-gradient of the source zone. The post-remedial mass flux data were then combined with pre-remedial water quality data to estimate pre-remedial mass flux. This information was used to characterize a DNAPL dissolution source strength function, such as the Power Law Model or the Equilibrium Streamtube Model. The six sites characterized for this study were (1) Former Charleston Air Force Base, Charleston, SC; (2) Dover Air Force Base, Dover, DE; (3) Treasure Island Naval Station, San Francisco, CA; (4) Former Raritan Arsenal, Edison, NJ; (5) Naval Air Station, Jacksonville, FL; and (6) Former Naval Air Station, Alameda, CA. Contaminant mass fluxes decreased at all sites by the end of the post-treatment monitoring period, and rebound was limited within the source zone. Post-remedial source strength function estimates suggest that decreases in contaminant mass flux will continue at these sites, but a mass flux based on MCL levels may never be achieved. Site clean-up goals should therefore be evaluated as order-of-magnitude reductions. Additionally, sites may require monitoring for a minimum of five years in order to sufficiently evaluate remedial performance. The study shows that enhanced anaerobic source zone bioremediation contributed to a modest reduction of source zone contaminant mass discharge and appears to have mitigated rebound of chlorinated ethenes.
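
    The Power Law Model mentioned above is commonly written as a relation between mass discharge (or flux-averaged concentration) and the fraction of source mass remaining, Cd/Cd0 = (M/M0)^gamma. A minimal sketch under that assumption, with purely illustrative parameter values, is:

```python
import numpy as np

def power_law_discharge(m_frac, cd0, gamma):
    """Power-law source depletion model (one common simplified form):
    Cd/Cd0 = (M/M0)**gamma, relating contaminant mass discharge Cd to the
    fraction of source mass remaining."""
    return cd0 * m_frac ** gamma

# Illustrative depletion: initial discharge 2 kg/d, gamma = 1 (assumed values)
m0, cd0, gamma = 100.0, 2.0, 1.0          # kg, kg/d, dimensionless
dt, t_end = 1.0, 5000.0                   # time step and horizon (days)
m, t, series = m0, 0.0, []
while t < t_end and m > 0.01 * m0:
    cd = power_law_discharge(m / m0, cd0, gamma)
    series.append((t, cd))
    m -= cd * dt                          # mass balance: discharge removes mass
    t += dt
print("discharge after ~%d days: %.3f kg/d" % (series[-1][0], series[-1][1]))
```

    With gamma = 1 the discharge declines exponentially, the behaviour of the equilibrium streamtube end-member; other gamma values give faster or slower tailing, which is why order-of-magnitude flux reductions rather than MCL-based fluxes are suggested as clean-up metrics.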

  13. Policy and the Standards Debate: Mapping Changes in Assessment in Mathematics

    ERIC Educational Resources Information Center

    Lerman, Stephen; Adler, Jill

    2016-01-01

    The influences on governments for policy changes in schools range across many agencies, including the political party in power. When policies change, the sources of these influences are not always clear. The project whose work is presented in this special issue examines what these changes look like in terms of the differences in assessment tasks…

  14. The role of objective personality inventories in suicide risk assessment: an evaluation and proposal.

    PubMed

    Johnson, W B; Lall, R; Bongar, B; Nordlund, M D

    1999-01-01

    Objective personality assessment instruments offer a comparatively underutilized source of clinical data in attempts to evaluate and predict risk for suicide. In contrast to focal suicide risk measures, global personality inventories may be useful in identification of long-standing styles that predispose persons to eventual suicidal behavior. This article reviews the empirical literature regarding the efficacy of established personality inventories in predicting suicidality. The authors offer several recommendations for future research with these measures and conclude that such objective personality instruments offer only marginal utility as sources of clinical information in comprehensive suicide risk evaluations. Personality inventories may offer greatest utility in long-term assessment of suicide risk.

  15. Design and realization of disaster assessment algorithm after forest fire

    NASA Astrophysics Data System (ADS)

    Xu, Aijun; Wang, Danfeng; Tang, Lihua

    2008-10-01

    Based on GIS technology, this paper focuses on the design and realization of a disaster assessment algorithm for use after forest fires. Through the analysis and processing of multi-source, heterogeneous data, it combines the foundation laid by domestic and international research on forest fire loss assessment with related knowledge of assessment, accounting and forest resource appraisal, in order to develop a theoretical framework and assessment indices for forest fire loss. The technologies of boundary extraction, overlay analysis, and division processing of multi-source spatial data are used to implement the investigation of the burnt forest area and the computation of the fire area. The assessment provides evidence for clearing burnt areas and for new restoration policies, in terms of the direct and indirect economic loss and the ecological and environmental damage caused by forest fire under different fire danger classes and different amounts of forest stock, thus making forest resource protection faster, more efficient and more economical. Finally, this paper takes Lin'an city of Zhejiang province as a test area to validate the key technologies of the proposed method.

  16. Final safety analysis report for the Galileo Mission: Volume 2: Book 1, Accident model document

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    The Accident Model Document (AMD) is the second volume of the three-volume Final Safety Analysis Report (FSAR) for the Galileo outer planetary space science mission. This mission employs Radioisotope Thermoelectric Generators (RTGs) as the prime electrical power sources for the spacecraft. Galileo will be launched into Earth orbit using the Space Shuttle and will use the Inertial Upper Stage (IUS) booster to place the spacecraft into an Earth escape trajectory. The RTGs employ silicon-germanium thermoelectric couples to produce electricity from the heat energy that results from the decay of the radioisotope fuel, Plutonium-238, used in the RTG heat source. The heat source configuration used in the RTGs is termed the General Purpose Heat Source (GPHS), and the RTGs are designated GPHS-RTGs. The use of radioactive material in these missions necessitates evaluations of the radiological risks that may be encountered by launch complex personnel as well as by the Earth's general population as a result of postulated malfunctions or failures occurring in the mission operations. The FSAR presents the results of a rigorous safety assessment, including substantial analyses and testing, of the launch and deployment of the RTGs for the Galileo mission. This AMD is a summary of the potential accident and failure sequences which might result in fuel release, the analysis and testing methods employed, and the predicted source terms. Each source term consists of a quantity of fuel released, the location of release, and the physical characteristics of the fuel released. Each source term has an associated probability of occurrence. 27 figs., 11 tabs.

  17. Next Generation of Leaching Tests

    EPA Science Inventory

    A corresponding abstract has been cleared for this presentation. The four methods comprising the Leaching Environmental Assessment Framework are described along with the tools to support implementation of the more rigorous and accurate source terms that are developed using LEAF ...

  18. Improving bioaerosol exposure assessments of composting facilities — Comparative modelling of emissions from different compost ages and processing activities

    NASA Astrophysics Data System (ADS)

    Taha, M. P. M.; Drew, G. H.; Tamer, A.; Hewings, G.; Jordinson, G. M.; Longhurst, P. J.; Pollard, S. J. T.

    We present bioaerosol source term concentrations from passive and active composting sources and compare emissions from green waste compost aged 1, 2, 4, 6, 8, 12 and 16 weeks. Results reveal that the age of compost has little effect on the bioaerosol concentrations emitted for passive windrow sources. However, emissions from turning compost during the early stages may be higher than during the later stages of the composting process. The bioaerosol emissions from passive sources were in the range of 10³-10⁴ cfu m⁻³, with releases from active sources typically 1-log higher. We propose improvements to current risk assessment methodologies by examining emission rates and the differences between two air dispersion models for the prediction of downwind bioaerosol concentrations at off-site points of exposure. The SCREEN3 model provides a more precautionary estimate of the source depletion curves of bioaerosol emissions in comparison to ADMS 3.3. The results from both models predict that bioaerosol concentrations decrease to below typical background concentrations before 250 m, the distance at which the regulator in England and Wales may require a risk assessment to be completed.

  19. Global biodiversity monitoring: from data sources to essential biodiversity variables

    USGS Publications Warehouse

    Proenca, Vania; Martin, Laura J.; Pereira, Henrique M.; Fernandez, Miguel; McRae, Louise; Belnap, Jayne; Böhm, Monika; Brummitt, Neil; Garcia-Moreno, Jaime; Gregory, Richard D.; Honrado, Joao P; Jürgens, Norbert; Opige, Michael; Schmeller, Dirk S.; Tiago, Patricia; van Sway, Chris A

    2016-01-01

    Essential Biodiversity Variables (EBVs) consolidate information from varied biodiversity observation sources. Here we demonstrate the links between data sources, EBVs and indicators and discuss how different sources of biodiversity observations can be harnessed to inform EBVs. We classify sources of primary observations into four types: extensive and intensive monitoring schemes, ecological field studies and satellite remote sensing. We characterize their geographic, taxonomic and temporal coverage. Ecological field studies and intensive monitoring schemes inform a wide range of EBVs, but the former tend to deliver short-term data, while the geographic coverage of the latter is limited. In contrast, extensive monitoring schemes mostly inform the population abundance EBV, but deliver long-term data across an extensive network of sites. Satellite remote sensing is particularly suited to providing information on ecosystem function and structure EBVs. Biases behind data sources may affect the representativeness of global biodiversity datasets. To improve them, researchers must assess data sources and then develop strategies to compensate for identified gaps. We draw on the population abundance dataset informing the Living Planet Index (LPI) to illustrate the effects of data sources on EBV representativeness. We find that long-term monitoring schemes informing the LPI are still scarce outside of Europe and North America and that ecological field studies play a key role in covering that gap. Achieving representative EBV datasets will depend both on the ability to integrate available data, through data harmonization and modeling efforts, and on the establishment of new monitoring programs to address critical data gaps.

  20. Radiological assessment. A textbook on environmental dose analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Till, J.E.; Meyer, H.R.

    1983-09-01

    Radiological assessment is the quantitative process of estimating the consequences to humans resulting from the release of radionuclides to the biosphere. It is a multidisciplinary subject requiring the expertise of a number of individuals in order to predict source terms, describe environmental transport, calculate internal and external dose, and extrapolate dose to health effects. Until now, no comprehensive book has been available that describes, on a uniform and comprehensive level, the techniques and models used in radiological assessment. Radiological Assessment is based on material presented at the 1980 Health Physics Society Summer School held in Seattle, Washington. The material has been expanded and edited to make it comprehensive in scope and useful as a text. Topics covered include (1) source terms for nuclear facilities and medical and industrial sites; (2) transport of radionuclides in the atmosphere; (3) transport of radionuclides in surface waters; (4) transport of radionuclides in groundwater; (5) terrestrial and aquatic food chain pathways; (6) reference man: a system for internal dose calculations; (7) internal dosimetry; (8) external dosimetry; (9) models for special-case radionuclides; (10) calculation of health effects in irradiated populations; (11) evaluation of uncertainties in environmental radiological assessment models; (12) regulatory standards for environmental releases of radionuclides; (13) development of computer codes for radiological assessment; and (14) assessment of accidental releases of radionuclides.

  1. Environmental impact and risk assessments and key factors contributing to the overall uncertainties.

    PubMed

    Salbu, Brit

    2016-01-01

    There is a significant number of nuclear and radiological sources that have contributed, are still contributing, or have the potential to contribute to radioactive contamination of the environment in the future. To protect the environment from radioactive contamination, impact and risk assessments are performed prior to or during a release event, in the short or long term after deposition, or before and after the implementation of countermeasures. When environmental impact and risks are assessed, however, a series of factors contributes to the overall uncertainties. To provide environmental impact and risk assessments, information on processes, kinetics and a series of input variables is needed. Problems such as variability, questionable assumptions, gaps in knowledge, extrapolations and poor conceptual model structures add further to the large and often unacceptable uncertainties in impact and risk assessments. Information on the source term and the release scenario is an essential starting point in impact and risk models; the source determines the activity concentrations and atom ratios of the radionuclides released, while the release scenario determines the physico-chemical forms of released radionuclides, such as particle size distribution, structure and density. Releases will most often contain other contaminants such as metals, and due to interactions, contaminated sites should be assessed as a multiple stressor scenario. Following deposition, a series of stressors, interactions and processes will influence the ecosystem transfer of radionuclide species and thereby influence biological uptake (toxicokinetics) and responses (toxicodynamics) in exposed organisms. Due to the variety of biological species, extrapolation is frequently needed to fill gaps in knowledge, e.g., from effects to no effects, from effects in one organism to others, or from one stressor to mixtures. Most toxicity tests are, however, performed as short-term exposures of adult organisms, ignoring sensitive life-history stages of organisms and transgenerational effects. To link sources, ecosystem transfer and biological effects to future impact and risks, a series of models is usually interfaced, while uncertainty estimates are seldom given. The model predictions are, however, only valid within the boundaries of the overall uncertainties. Furthermore, the model predictions are only useful and relevant when uncertainties are estimated, communicated and understood. Among the key factors contributing most to uncertainties, the present paper focuses especially on structural uncertainties (model bias or discrepancies), as aspects such as particle releases, ecosystem dynamics, mixed exposure, sensitive life-history stages and transgenerational effects are usually ignored in assessment models. Research focused on these aspects should significantly reduce the overall uncertainties in the impact and risk assessment of radioactively contaminated ecosystems. Copyright © 2015 Elsevier Ltd. All rights reserved.

  2. Reliability of MEG source imaging of anterior temporal spikes: analysis of an intracranially characterized spike focus.

    PubMed

    Wennberg, Richard; Cheyne, Douglas

    2014-05-01

    To assess the reliability of MEG source imaging (MSI) of anterior temporal spikes through detailed analysis of the localization and orientation of source solutions obtained for a large number of spikes that were separately confirmed by intracranial EEG to be focally generated within a single, well-characterized spike focus. MSI was performed on 64 identical right anterior temporal spikes from an anterolateral temporal neocortical spike focus. The effects of different volume conductors (sphere and realistic head model), removal of noise with low frequency filters (LFFs) and averaging multiple spikes were assessed in terms of the reliability of the source solutions. MSI of single spikes resulted in scattered dipole source solutions that showed reasonable reliability for localization at the lobar level, but only for solutions with a goodness-of-fit exceeding 80% using a LFF of 3 Hz. Reliability at a finer level of intralobar localization was limited. Spike averaging significantly improved the reliability of source solutions and averaging 8 or more spikes reduced dependency on goodness-of-fit and data filtering. MSI performed on topographically identical individual spikes from an intracranially defined classical anterior temporal lobe spike focus was limited by low reliability (i.e., scattered source solutions) in terms of fine, sublobar localization within the ipsilateral temporal lobe. Spike averaging significantly improved reliability. MSI performed on individual anterior temporal spikes is limited by low reliability. Reduction of background noise through spike averaging significantly improves the reliability of MSI solutions. Copyright © 2013 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.

  3. Prevalence of microbiological contaminants in groundwater sources and risk factor assessment in Juba, South Sudan.

    PubMed

    Engström, Emma; Balfors, Berit; Mörtberg, Ulla; Thunvik, Roger; Gaily, Tarig; Mangold, Mikael

    2015-05-15

    In low-income regions, drinking water is often derived from groundwater sources, which might spread diarrheal disease if they are microbiologically polluted. This study aimed to investigate the occurrence of fecal contamination in 147 improved groundwater sources in Juba, South Sudan and to assess potential contributing risk factors, based on bivariate statistical analysis. Thermotolerant coliforms (TTCs) were detected in 66% of the investigated sources, including 95 boreholes, breaching the health-based recommendations for drinking water. A significant association (p<0.05) was determined between the presence of TTCs and the depth of cumulative, long-term prior precipitation (both within the previous five days and within the past month). No such link was found to short-term rainfall, the presence of latrines or damages in the borehole apron. However, the risk factor analysis further suggested, to a lesser degree, that the local topography and on-site hygiene were additionally significant. In summary, the analysis indicated that an important contamination mechanism was fecal pollution of the contributing groundwater, which was unlikely due to the presence of latrines; instead, infiltration from contaminated surface water was more probable. The reduction in fecal sources in the environment in Juba is thus recommended, for example, through constructing latrines or designating protection areas near water sources. The study results contribute to the understanding of microbiological contamination of groundwater sources in areas with low incomes and high population densities, tropical climates and weathered basement complex environments, which are common in urban sub-Saharan Africa. Copyright © 2015 Elsevier B.V. All rights reserved.

  4. Assessment of Noise and Associated Health Impacts at Selected Secondary Schools in Ibadan, Nigeria

    PubMed Central

    Ana, Godson R. E. E.; Shendell, Derek G.; Brown, G. E.; Sridhar, M. K. C.

    2009-01-01

    Background. Most schools in Ibadan, Nigeria, are located near major roads (mobile line sources). We conducted an initial assessment of noise levels and adverse noise-related health and learning effects. Methods. For this descriptive, cross-sectional study, four schools were selected randomly from the eight participating in the overall project. We administered 200 questionnaires, 50 per school, assessing health and learning-related outcomes. Noise levels (A-weighted decibels, dBA) were measured with calibrated sound level meters. Traffic density was assessed for the school with the highest measured dBA. Observational checklists assessed noise control parameters and building physical attributes. Results. Short-term, cross-sectional school-day noise levels ranged from 68.3 to 84.7 dBA. Over 60% of respondents reported that vehicular traffic was the major source of noise, and over 70% complained of being disturbed by noise. Three schools reported tiredness, and one school lack of concentration, as the most prevalent noise-related health problems. Conclusion. Secondary school occupants in Ibadan, Nigeria, were potentially affected by exposure to noise from mobile line sources. PMID:20041025

  5. Probabilistic Volcanic Hazard and Risk Assessment

    NASA Astrophysics Data System (ADS)

    Marzocchi, W.; Neri, A.; Newhall, C. G.; Papale, P.

    2007-08-01

    Quantifying Long- and Short-Term Volcanic Hazard: Building Up a Common Strategy for Italian Volcanoes, Erice, Italy, 8 November 2006. The term "hazard" can lead to some misunderstanding. In English, hazard has the generic meaning "potential source of danger," but for more than 30 years [e.g., Fournier d'Albe, 1979] hazard has also been used in a more quantitative sense, namely "the probability of a certain hazardous event in a specific time-space window." However, many volcanologists still use "hazard" and "volcanic hazard" in purely descriptive and subjective ways. A recent meeting held in November 2006 at Erice, Italy, entitled "Quantifying Long- and Short-Term Volcanic Hazard: Building up a Common Strategy for Italian Volcanoes" (http://www.bo.ingv.it/erice2006), concluded that a more suitable term for the quantitative estimation of hazard is "probabilistic volcanic hazard assessment" (PVHA).

  6. A post-implementation evaluation of ceramic water filters distributed to tsunami-affected communities in Sri Lanka.

    PubMed

    Casanova, Lisa M; Walters, Adam; Naghawatte, Ajith; Sobsey, Mark D

    2012-06-01

    Sri Lanka was devastated by the 2004 Indian Ocean tsunami. During recovery, the Red Cross distributed approximately 12,000 free ceramic water filters. This cross-sectional study was an independent post-implementation assessment of 452 households that received filters, to determine the proportion still using filters, household characteristics associated with use, and quality of household drinking water. The proportion of continued users was high (76%). The most common household water sources were taps or shallow wells. The majority (82%) of users used filtered water for drinking only. Mean filter flow rate was 1.12 L/hr (0.80 L/hr for households with taps and 0.71 for those with wells). Water quality varied by source; households using tap water had source water of high microbial quality. Filters improved water quality, reducing Escherichia coli for households (largely well users) with high levels in their source water. Households were satisfied with filters and are potentially long-term users. To promote sustained use, recovery filter distribution efforts should try to identify households at greatest long-term risk, particularly those who have not moved to safer water sources during recovery. They should be joined with long-term commitment to building supply chains and local production capacity to ensure safe water access.

  7. Challenges Ahead for Nuclear Facility Site-Specific Seismic Hazard Assessment in France: The Alternative Energies and the Atomic Energy Commission (CEA) Vision

    NASA Astrophysics Data System (ADS)

    Berge-Thierry, C.; Hollender, F.; Guyonnet-Benaize, C.; Baumont, D.; Ameri, G.; Bollinger, L.

    2017-09-01

    Seismic analysis in the context of nuclear safety in France is currently guided by a purely deterministic approach based on Basic Safety Rule (Règle Fondamentale de Sûreté) RFS 2001-01 for seismic hazard assessment, and on the ASN/2/01 Guide that provides design rules for nuclear civil engineering structures. After the 2011 Tohoku earthquake, nuclear operators worldwide were asked to estimate the ability of their facilities to sustain extreme seismic loads. The French licensees then defined the `hard core seismic levels', which are higher than those considered for design or re-assessment of the safety of a facility. These were initially established on a deterministic basis, and were finally justified through state-of-the-art probabilistic seismic hazard assessments. The appreciation and propagation of uncertainties when assessing seismic hazard in France have changed considerably over the past 15 years. This evolution provided the motivation for the present article, the objectives of which are threefold: (1) to provide a description of the current practices in France to assess seismic hazard in terms of nuclear safety; (2) to discuss and highlight the sources of uncertainties and their treatment; and (3) to use a specific case study to illustrate how extended source modeling can help to constrain the key assumptions or parameters that impact upon seismic hazard assessment. This article discusses in particular seismic source characterization, strong ground motion prediction, and maximal magnitude constraints, according to the practice of the French Atomic Energy Commission. Due to the growth of strong motion databases in terms of the number and quality of records, their metadata, and the characterization of uncertainty, several recently published empirical ground motion prediction models are eligible for seismic hazard assessment in France. We show that propagation of epistemic and aleatory uncertainties is feasible in a deterministic approach as well as in a probabilistic one. Assessment of seismic hazard in France in the framework of the safety of nuclear facilities should consider these recent advances. In this sense, the opening of discussions with all of the stakeholders in France to update the reference documents (i.e., RFS 2001-01; ASN/2/01 Guide) appears appropriate in the short term.

  8. Guidelines for Analysis of Indigeneous and Private Health Care Planning in Developing Countries. International Health Planning Methods Series, Volume 6.

    ERIC Educational Resources Information Center

    Scrimshaw, Susan

    This guidebook is both a practical tool and a source book to aid health planners assess the importance, extent, and impact of indigenous and private sector medical systems in developing nations. Guidelines are provided for assessment in terms of: use patterns; the meaning and importance to users of various available health services; and ways of…

  9. Introduction to energy sources. [Monograph

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1978-01-01

    Energy resources are reviewed in lay terms in an effort to increase the public's awareness of energy issues. Summaries of the principal sources of energy describe availability, technological requirements, and environmental impacts. The emphasis is placed on making energy use more efficient and the implications of shifting to centralized power plants, with more reliance on electricity. The purpose of this monograph is to demonstrate that energy issues can be examined and assessed by non-experts.

  10. URBAN STORMWATER TOXIC POLLUTANTS: ASSESSMENT, SOURCES, AND TREATABILITY

    EPA Science Inventory

    This paper summarizes an investigation to characterize and treat selected storm water contaminants that are listed as toxic pollutants (termed toxicants in this paper) in the Clean Water Act, Section 307 (Arbuckle et al., 1991). The first project phase investigated typical toxica...

  11. HISTORICAL COASTAL WETLANDS OF PRUDENCE ISLAND

    EPA Science Inventory

    Historical maps are useful tools to assess long-term change. The topographic surveys (T-charts) produced by the U.S. Coast and Geodetic Survey, predecessor of the U.S. Geological Survey (USGS), provide a rich source of information on historical environmental features dating back t...

  12. Increasing Confidence In Treatment Performance Assessment Using Geostatistical Methods

    EPA Science Inventory

    It is well established that the presence of dense non-aqueous phase liquids (DNAPLs) such as trichloroethylene (TCE) in aquifer systems represents a very long-term source of groundwater contamination. Significant effort in recent years has been focussed on developing effective me...

  13. Occurrence and risk assessment of potentially toxic elements and typical organic pollutants in contaminated rural soils.

    PubMed

    Xu, Yongfeng; Dai, Shixiang; Meng, Ke; Wang, Yuting; Ren, Wenjie; Zhao, Ling; Christie, Peter; Teng, Ying

    2018-07-15

    The residual levels and risk assessment of several potentially toxic elements (PTEs), phthalate esters (PAEs) and polycyclic aromatic hydrocarbons (PAHs) in rural soils near different types of pollution sources in Tianjin, China, were studied. The soils were found to be polluted to different extents with PTEs, PAEs and PAHs from different pollution sources. The soil concentrations of chromium (Cr), nickel (Ni), di-n-butyl phthalate (DnBP), acenaphthylene (Any) and acenaphthene (Ane) were higher than their corresponding regulatory reference limits. The health risk assessment model used to calculate human exposure indicates that both non-carcinogenic and carcinogenic risks from selected pollutants were generally acceptable or close to acceptable. Different types of pollution sources and soil physicochemical properties substantially affected the soil residual concentrations of and risks from these pollutants. PTEs in soils collected from agricultural lands around industrial and residential areas and organic pollutants (PAEs and PAHs) in soils collected from agricultural areas around livestock breeding were higher than those from other types of pollution sources and merit long-term monitoring. Copyright © 2018 Elsevier B.V. All rights reserved.
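
    The health risk assessment model mentioned above follows the familiar pattern of estimating an average daily dose for each exposure route and comparing it with toxicity reference values. A generic, USEPA-style sketch of a single soil-ingestion pathway is given below; every parameter value is an illustrative placeholder rather than an input from this study.

      # Generic soil-ingestion exposure and risk sketch (USEPA-style average
      # daily dose, hazard quotient, and incremental lifetime cancer risk).
      # All values are illustrative placeholders, not the study's inputs.

      C = 50.0          # soil concentration, mg/kg (hypothetical)
      IR = 100.0        # soil ingestion rate, mg/day
      EF = 350.0        # exposure frequency, days/year
      ED = 24.0         # exposure duration, years
      BW = 60.0         # body weight, kg
      AT_NC = ED * 365.0       # averaging time, non-carcinogenic, days
      AT_C = 70.0 * 365.0      # averaging time, carcinogenic, days
      RFD = 3e-3        # reference dose, mg/kg-day (hypothetical)
      SF = 0.5          # cancer slope factor, (mg/kg-day)^-1 (hypothetical)

      add_nc = C * IR * 1e-6 * EF * ED / (BW * AT_NC)   # mg/kg-day
      add_c = C * IR * 1e-6 * EF * ED / (BW * AT_C)

      hq = add_nc / RFD      # hazard quotient; values > 1 flag potential concern
      ilcr = add_c * SF      # incremental lifetime cancer risk
      print(f"HQ = {hq:.2e}, ILCR = {ilcr:.2e}")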

  14. Evaluation of long-term community recovery from Hurricane Andrew: sources of assistance received by population sub-groups.

    PubMed

    McDonnell, S; Troiano, R P; Barker, N; Noji, E; Hlady, W G; Hopkins, R

    1995-12-01

    Two three-stage cluster surveys were conducted in South Dade County, Florida, 14 months apart, to assess recovery following Hurricane Andrew. Response rates were 75 per cent and 84 per cent. Sources of assistance used in recovery from Hurricane Andrew differed according to race, per capita income, ethnicity, and education. Reports of improved living situation post-hurricane were not associated with receiving relief assistance, but reports of a worse situation were associated with loss of income, being exploited, or job loss. The number of households reporting problems with crime and community violence doubled between the two surveys. Disaster relief efforts had less impact on subjective long-term recovery than did job or income loss or housing repair difficulties. Existing sources of assistance were used more often than specific post-hurricane relief resources. The demographic make-up of a community may determine which are the most effective means to inform them after a disaster and what sources of assistance may be useful.

  15. Toward a Mechanistic Source Term in Advanced Reactors: Characterization of Radionuclide Transport and Retention in a Sodium Cooled Fast Reactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brunett, Acacia J.; Bucknor, Matthew; Grabaskas, David

    A vital component of the U.S. reactor licensing process is an integrated safety analysis in which a source term representing the release of radionuclides during normal operation and accident sequences is analyzed. Historically, source term analyses have utilized bounding, deterministic assumptions regarding radionuclide release. However, advancements in technical capabilities and the knowledge state have enabled the development of more realistic and best-estimate retention and release models such that a mechanistic source term assessment can be expected to be a required component of future licensing of advanced reactors. Recently, as part of a Regulatory Technology Development Plan effort for sodium cooled fast reactors (SFRs), Argonne National Laboratory has investigated the current state of knowledge of potential source terms in an SFR via an extensive review of previous domestic experiments, accidents, and operation. As part of this work, the significant sources and transport processes of radionuclides in an SFR have been identified and characterized. This effort examines all stages of release and source term evolution, beginning with release from the fuel pin and ending with retention in containment. Radionuclide sources considered in this effort include releases originating both in-vessel (e.g. in-core fuel, primary sodium, cover gas cleanup system, etc.) and ex-vessel (e.g. spent fuel storage, handling, and movement). Releases resulting from a primary sodium fire are also considered as a potential source. For each release group, dominant transport phenomena are identified and qualitatively discussed. The key product of this effort was the development of concise, inclusive diagrams that illustrate the release and retention mechanisms at a high level, where unique schematics have been developed for in-vessel, ex-vessel and sodium fire releases. This review effort has also found that despite the substantial range of phenomena affecting radionuclide release, the current state of knowledge is extensive, and in most areas may be sufficient. Several knowledge gaps were identified, such as uncertainty in release from molten fuel and availability of thermodynamic data for lanthanides and actinides in liquid sodium. However, the overall findings suggest that high retention rates can be expected within the fuel and primary sodium for all radionuclides other than noble gases.

  16. A Methodology for the Integration of a Mechanistic Source Term Analysis in a Probabilistic Framework for Advanced Reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grabaskas, Dave; Brunett, Acacia J.; Bucknor, Matthew

    GE Hitachi Nuclear Energy (GEH) and Argonne National Laboratory are currently engaged in a joint effort to modernize and develop probabilistic risk assessment (PRA) techniques for advanced non-light water reactors. At a high level, the primary outcome of this project will be the development of next-generation PRA methodologies that will enable risk-informed prioritization of safety- and reliability-focused research and development, while also identifying gaps that may be resolved through additional research. A subset of this effort is the development of PRA methodologies to conduct a mechanistic source term (MST) analysis for event sequences that could result in the release of radionuclides. The MST analysis seeks to realistically model and assess the transport, retention, and release of radionuclides from the reactor to the environment. The MST methods developed during this project seek to satisfy the requirements of the Mechanistic Source Term element of the ASME/ANS Non-LWR PRA standard. The MST methodology consists of separate analysis approaches for risk-significant and non-risk significant event sequences that may result in the release of radionuclides from the reactor. For risk-significant event sequences, the methodology focuses on a detailed assessment, using mechanistic models, of radionuclide release from the fuel, transport through and release from the primary system, transport in the containment, and finally release to the environment. The analysis approach for non-risk significant event sequences examines the possibility of large radionuclide releases due to events such as re-criticality or the complete loss of radionuclide barriers. This paper provides details on the MST methodology, including the interface between the MST analysis and other elements of the PRA, and provides a simplified example MST calculation for a sodium fast reactor.

  17. Solute source depletion control of forward and back diffusion through low-permeability zones

    NASA Astrophysics Data System (ADS)

    Yang, Minjune; Annable, Michael D.; Jawitz, James W.

    2016-10-01

    Solute diffusive exchange between low-permeability aquitards and high-permeability aquifers acts as a significant mediator of long-term contaminant fate. Aquifer contaminants diffuse into aquitards, but as contaminant sources are depleted, aquifer concentrations decline, triggering back diffusion from aquitards. The dynamics of the contaminant source depletion, or the source strength function, controls the timing of the transition of aquitards from sinks to sources. Here, we experimentally evaluate three archetypical transient source depletion models (step-change, linear, and exponential), and we use novel analytical solutions to accurately account for dynamic aquitard-aquifer diffusive transfer. Laboratory diffusion experiments were conducted using a well-controlled flow chamber to assess solute exchange between sand aquifer and kaolinite aquitard layers. Solute concentration profiles in the aquitard were measured in situ using electrical conductivity. Back diffusion was shown to begin earlier and produce larger mass flux for rapidly depleting sources. The analytical models showed very good correspondence with measured aquifer breakthrough curves and aquitard concentration profiles. The modeling approach links source dissolution and back diffusion, enabling assessment of human exposure risk and calculation of the back diffusion initiation time, as well as the resulting plume persistence.
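
    For the simplest of the three source histories evaluated above, a step-change boundary, the classical one-dimensional solution for diffusion into an initially clean, semi-infinite aquitard is C(z,t) = C0 erfc(z/(2*sqrt(D*t))). The sketch below evaluates that profile; the diffusion coefficient and times are assumed values, and the paper's solutions for linear and exponential source depletion are not reproduced.

      # Classical step-change case: 1-D diffusion into an initially clean,
      # semi-infinite aquitard below an aquifer held at concentration C0.
      # D and the times are illustrative assumptions.
      import numpy as np
      from scipy.special import erfc

      C0 = 1.0                          # aquifer concentration (normalized)
      D = 1e-9 * 86400.0                # effective diffusion coefficient, m^2/day
      z = np.linspace(0.0, 0.3, 7)      # depth into the aquitard, m

      for t_days in (30.0, 365.0, 3650.0):
          profile = C0 * erfc(z / (2.0 * np.sqrt(D * t_days)))
          print(f"t = {t_days:6.0f} d:", np.round(profile, 3))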

  18. Solute source depletion control of forward and back diffusion through low-permeability zones.

    PubMed

    Yang, Minjune; Annable, Michael D; Jawitz, James W

    2016-10-01

    Solute diffusive exchange between low-permeability aquitards and high-permeability aquifers acts as a significant mediator of long-term contaminant fate. Aquifer contaminants diffuse into aquitards, but as contaminant sources are depleted, aquifer concentrations decline, triggering back diffusion from aquitards. The dynamics of the contaminant source depletion, or the source strength function, controls the timing of the transition of aquitards from sinks to sources. Here, we experimentally evaluate three archetypical transient source depletion models (step-change, linear, and exponential), and we use novel analytical solutions to accurately account for dynamic aquitard-aquifer diffusive transfer. Laboratory diffusion experiments were conducted using a well-controlled flow chamber to assess solute exchange between sand aquifer and kaolinite aquitard layers. Solute concentration profiles in the aquitard were measured in situ using electrical conductivity. Back diffusion was shown to begin earlier and produce larger mass flux for rapidly depleting sources. The analytical models showed very good correspondence with measured aquifer breakthrough curves and aquitard concentration profiles. The modeling approach links source dissolution and back diffusion, enabling assessment of human exposure risk and calculation of the back diffusion initiation time, as well as the resulting plume persistence. Copyright © 2016 Elsevier B.V. All rights reserved.

  19. Analysis of accident sequences and source terms at waste treatment and storage facilities for waste generated by U.S. Department of Energy Waste Management Operations, Volume 3: Appendixes C-H

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mueller, C.; Nabelssi, B.; Roglans-Ribas, J.

    1995-04-01

    This report contains the Appendices for the Analysis of Accident Sequences and Source Terms at Waste Treatment and Storage Facilities for Waste Generated by the U.S. Department of Energy Waste Management Operations. The main report documents the methodology, computational framework, and results of facility accident analyses performed as a part of the U.S. Department of Energy (DOE) Waste Management Programmatic Environmental Impact Statement (WM PEIS). The accident sequences potentially important to human health risk are specified, their frequencies are assessed, and the resultant radiological and chemical source terms are evaluated. A personal computer-based computational framework and database have been developed that provide these results as input to the WM PEIS for calculation of human health risk impacts. This report summarizes the accident analyses and aggregates the key results for each of the waste streams. Source terms are estimated and results are presented for each of the major DOE sites and facilities by WM PEIS alternative for each waste stream. The appendices identify the potential atmospheric release of each toxic chemical or radionuclide for each accident scenario studied. They also provide discussion of specific accident analysis data and guidance used or consulted in this report.

  20. Integrating data types to enhance shoreline change assessments

    NASA Astrophysics Data System (ADS)

    Long, J.; Henderson, R.; Plant, N. G.; Nelson, P. R.

    2016-12-01

    Shorelines represent the variable boundary between terrestrial and marine environments. Assessment of geographic and temporal variability in shoreline position and related variability in shoreline change rates is an important part of studies and applications related to impacts from sea-level rise and storms. The results from these assessments are used to quantify future ecosystem services and coastal resilience and guide selection of appropriate coastal restoration and protection designs. But existing assessments typically fail to incorporate all available shoreline observations because they are derived from multiple data types and have different or unknown biases and uncertainties. Shoreline-change research and assessments often focus on either the long-term trajectory using sparse data over multiple decades or shorter-term evolution using data collected more frequently but over a shorter period of time. The combination of data collected with significantly different temporal resolution is not often considered. Also, differences in the definition of the shoreline metric itself can occur, whether using a single or multiple data source(s), due to variation in the signal being detected in the data (e.g. instantaneous land/water interface, swash zone, wrack line, or topographic contours). Previous studies have not explored whether more robust shoreline change assessments are possible if all available data are utilized and all uncertainties are considered. In this study, we test the hypothesis that incorporating all available shoreline data will lead to both improved historical assessments and enhanced predictive capability of shoreline-change forecasts. Using over 250 observations of shoreline position at Dauphin Island, Alabama over the last century, we compare shoreline-change rates derived from individual data sources (airborne lidar, satellite, aerial photographs) with an assessment using the combination of all available data. Biases or simple uncertainties in the shoreline metric from different data types and varying temporal/spatial resolution of the data are examined. As part of this test, we also demonstrate application of data assimilation techniques to predict shoreline position by accurately including the uncertainty in each type of data.
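
    A straightforward way to fold observations of differing quality into a single change-rate estimate is an uncertainty-weighted linear regression of shoreline position on time. The sketch below does this with synthetic positions and per-observation standard deviations; all numbers are hypothetical, and the data-assimilation approach mentioned above is not reproduced here.

      # Uncertainty-weighted least-squares shoreline-change rate from
      # heterogeneous observations (e.g., lidar, satellite, air photos).
      # Dates, positions, and uncertainties are synthetic placeholders.
      import numpy as np

      t = np.array([1990.0, 1998.0, 2005.0, 2010.0, 2014.0, 2016.0])   # years
      y = np.array([12.0, 9.5, 7.8, 6.1, 4.9, 4.5])      # cross-shore position, m
      sigma = np.array([5.0, 5.0, 2.0, 1.0, 0.5, 0.5])   # 1-sigma uncertainty, m

      W = np.diag(1.0 / sigma**2)
      A = np.column_stack([t - t.mean(), np.ones_like(t)])
      cov = np.linalg.inv(A.T @ W @ A)
      rate, intercept = cov @ (A.T @ W @ y)
      print(f"rate = {rate:.2f} +/- {np.sqrt(cov[0, 0]):.2f} m/yr")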

  1. A Promising Tool to Assess Long Term Public Health Effects of Natural Disasters: Combining Routine Health Survey Data and Geographic Information Systems to Assess Stunting after the 2001 Earthquake in Peru

    PubMed Central

    Rydberg, Henny; Marrone, Gaetano; Strömdahl, Susanne; von Schreeb, Johan

    2015-01-01

    Background Research on long-term health effects of earthquakes is scarce, especially in low- and middle-income countries, which are disproportionately affected by disasters. To date, progress in this area has been hampered by the lack of tools to accurately measure these effects. Here, we explored whether long-term public health effects of earthquakes can be assessed using a combination of readily available data sources on public health and geographic distribution of seismic activity. Methods We used childhood stunting as a proxy for public health effects. Data on stunting were attained from Demographic and Health Surveys. Earthquake data were obtained from U.S. Geological Survey’s ShakeMaps, geographic information system-based maps that divide earthquake affected areas into different shaking intensity zones. We combined these two data sources to categorize the surveyed children into different earthquake exposure groups, based on how much their area of residence was affected by the earthquake. We assessed the feasibility of the approach using a real earthquake case – an 8.4 magnitude earthquake that hit southern Peru in 2001. Results and conclusions Our results indicate that the combination of health survey data and disaster data may offer a readily accessible and accurate method for determining the long-term public health consequences of a natural disaster. Our work allowed us to make pre- and post- earthquake comparisons of stunting, an important indicator of the well-being of a society, as well as comparisons between populations with different levels of exposure to the earthquake. Furthermore, the detailed GIS based data provided a precise and objective definition of earthquake exposure. Our approach should be considered in future public health and disaster research exploring the long-term effects of earthquakes and potentially other natural disasters. PMID:26090999

  2. A Promising Tool to Assess Long Term Public Health Effects of Natural Disasters: Combining Routine Health Survey Data and Geographic Information Systems to Assess Stunting after the 2001 Earthquake in Peru.

    PubMed

    Rydberg, Henny; Marrone, Gaetano; Strömdahl, Susanne; von Schreeb, Johan

    2015-01-01

    Research on long-term health effects of earthquakes is scarce, especially in low- and middle-income countries, which are disproportionately affected by disasters. To date, progress in this area has been hampered by the lack of tools to accurately measure these effects. Here, we explored whether long-term public health effects of earthquakes can be assessed using a combination of readily available data sources on public health and geographic distribution of seismic activity. We used childhood stunting as a proxy for public health effects. Data on stunting were attained from Demographic and Health Surveys. Earthquake data were obtained from U.S. Geological Survey's ShakeMaps, geographic information system-based maps that divide earthquake affected areas into different shaking intensity zones. We combined these two data sources to categorize the surveyed children into different earthquake exposure groups, based on how much their area of residence was affected by the earthquake. We assessed the feasibility of the approach using a real earthquake case--an 8.4 magnitude earthquake that hit southern Peru in 2001. Our results indicate that the combination of health survey data and disaster data may offer a readily accessible and accurate method for determining the long-term public health consequences of a natural disaster. Our work allowed us to make pre- and post-earthquake comparisons of stunting, an important indicator of the well-being of a society, as well as comparisons between populations with different levels of exposure to the earthquake. Furthermore, the detailed GIS based data provided a precise and objective definition of earthquake exposure. Our approach should be considered in future public health and disaster research exploring the long-term effects of earthquakes and potentially other natural disasters.

  3. Predictable Unpredictability: the Problem with Basing Medicare Policy on Long-Term Financial Forecasting.

    PubMed

    Glied, Sherry; Zaylor, Abigail

    2015-07-01

    The authors assess how Medicare financing and projections of future costs have changed since 2000. They also assess the impact of legislative reforms on the sources and levels of financing and compare cost forecasts made at different times. Although the aging U.S. population and rising health care costs are expected to increase the share of gross domestic product devoted to Medicare, changes made in the program over the past decade have helped stabilize Medicare's financial outlook--even as benefits have been expanded. Long-term forecasting uncertainty should make policymakers and beneficiaries wary of dramatic changes to the program in the near term that are intended to alter its long-term forecast: the range of error associated with cost forecasts rises as the forecast window lengthens. Instead, policymakers should focus on the immediate policy window, taking steps to reduce the current burden of Medicare costs by containing spending today.

  4. Biological and Dose Thresholds for an Early Genomic Biomarker of Liver Carcinogenesis in Mice.

    EPA Science Inventory

    Traditional data sources for cancer risk assessment are resource-intensive, retrospective, and not feasible for the vast majority of environmental chemicals. The use of quantitative short-term genomic biomarkers may streamline this process by providing protective limits for known...

  5. Biological and Dose Thresholds for an Early Genomic Biomarker of Liver Carcinogenesis in Mice

    EPA Science Inventory

    Traditional data sources for cancer risk assessment are resource-intensive, retrospective, and not feasible for the vast majority of environmental chemicals. The use of quantitative short-term genomic biomarkers may streamline this process by providing protective limits for known...

  6. Energy requirement for the production of silicon solar arrays

    NASA Technical Reports Server (NTRS)

    Lindmayer, J.; Wihl, M.; Scheinne, A.; Morrison, A. D.

    1977-01-01

    Photovoltaics is the subject of an extensive technology assessment in terms of its net energy potential as an alternate energy source. Reduction of quartzite pebbles, refinement, crystal growth, cell processing, and panel building are evaluated for energy expenditure compared to direct, indirect, and overhead energies.

  7. Assessment of seismic hazard in the North Caucasus

    NASA Astrophysics Data System (ADS)

    Ulomov, V. I.; Danilova, T. I.; Medvedeva, N. S.; Polyakova, T. P.; Shumilina, L. S.

    2007-07-01

    The seismicity of the North Caucasus is the highest in the European part of Russia. The detection of potential seismic sources here and long-term prediction of earthquakes are extremely important for the assessment of seismic hazard and seismic risk in this densely populated and industrially developed region of the country. The seismogenic structures of the Iran-Caucasus-Anatolia and Central Asia regions, adjacent to European Russia, are the subjects of this study. These structures are responsible for the specific features of regional seismicity and for the geodynamic interaction with adjacent areas of the Scythian and Turan platforms. The most probable potential sources of earthquakes with magnitudes M = 7.0 ± 0.2 and 7.5 ± 0.2 in the North Caucasus are located. The possible macroseismic effect of one of them is assessed.

  8. Verification of Methods for Assessing the Sustainability of Monitored Natural Attenuation (MNA)

    DTIC Science & Technology

    2013-01-01

    Only fragments of this record's abstract are recoverable: an acronym list including TOC (total organic carbon), TSR (thermal source removal), USACE (U.S. Army Corps of Engineers), USEPA (U.S. Environmental Protection Agency), and USGS, together with text discussing the SZD function for long-term DNAPL dissolution simulations, an alternative implementation of the sustainability assessment, and the suggestion by Chapelle et al. (2009) of THAA and THNS (total hydrolyzable neutral sugars) as measures of the bioavailability of organic carbon.

  9. Groundwater vulnerability and risk mapping using GIS, modeling and a fuzzy logic tool.

    PubMed

    Nobre, R C M; Rotunno Filho, O C; Mansur, W J; Nobre, M M M; Cosenza, C A N

    2007-12-07

    A groundwater vulnerability and risk mapping assessment, based on a source-pathway-receptor approach, is presented for an urban coastal aquifer in northeastern Brazil. A modified version of the DRASTIC methodology was used to map the intrinsic and specific groundwater vulnerability of a 292 km(2) study area. A fuzzy hierarchy methodology was adopted to evaluate the potential contaminant source index, including diffuse and point sources. Numerical modeling was performed for delineation of well capture zones, using MODFLOW and MODPATH. The integration of these elements provided the mechanism to assess groundwater pollution risks and identify areas that must be prioritized in terms of groundwater monitoring and restriction on use. A groundwater quality index based on nitrate and chloride concentrations was calculated, which had a positive correlation with the specific vulnerability index.
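
    The intrinsic vulnerability mapping above builds on DRASTIC, which scores each mapped cell as a weighted sum of ratings for seven hydrogeologic parameters. The sketch below shows that index structure using the commonly cited standard weights; the ratings are placeholders, and the paper applies a modified version of the method, so this is only an illustration.

      # Schematic DRASTIC-style index for one grid cell:
      # index = sum(weight_i * rating_i) over the seven parameters.
      # Weights are the commonly cited standard DRASTIC weights; ratings
      # (1-10) are placeholders. The paper uses a modified DRASTIC, so this
      # only illustrates the structure of the index.

      weights = {
          "Depth_to_water": 5, "net_Recharge": 4, "Aquifer_media": 3,
          "Soil_media": 2, "Topography": 1, "Impact_of_vadose_zone": 5,
          "hydraulic_Conductivity": 3,
      }
      ratings = {
          "Depth_to_water": 7, "net_Recharge": 6, "Aquifer_media": 8,
          "Soil_media": 5, "Topography": 9, "Impact_of_vadose_zone": 6,
          "hydraulic_Conductivity": 4,
      }
      index = sum(weights[k] * ratings[k] for k in weights)
      print(f"DRASTIC index = {index}")   # possible range is roughly 23-230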

  10. TRANSPORTATION FUEL FROM CELLULOSIC BIOMASS: A COMPARATIVE ASSESSMENT OF ETHANOL AND METHANOL OPTIONS

    EPA Science Inventory

    Future sources of renewable fuel energy will be needed to supplement or displace petroleum. Biomass can be converted to ethanol or methanol, either having good properties as motor fuel, but distinctly different production technology. Those technologies are compared in terms of ...

  11. Financial Aid.

    ERIC Educational Resources Information Center

    Graves, Mary A.

    This workbook assists college and vocational school bound American Indian students in determining their financial needs and in locating sources of financial aid. A checklist helps students assess the state of their knowledge of financial programs; a glossary defines terms pertinent to the realm of financial aid (i.e., graduate study programs,…

  12. Assessment of general public exposure to LTE and RF sources present in an urban environment.

    PubMed

    Joseph, Wout; Verloock, Leen; Goeminne, Francis; Vermeeren, Günter; Martens, Luc

    2010-10-01

    For the first time, in situ electromagnetic field exposure of the general public to fields from long term evolution (LTE) cellular base stations is assessed. Exposure contributions due to different radiofrequency (RF) sources are compared with LTE exposure at 30 locations in Stockholm, Sweden. Total exposures (0.2-2.6 V/m) satisfy the International Commission on Non-Ionizing Radiation Protection (ICNIRP) reference levels (from 28 V/m for frequency modulation (FM), up to 61 V/m for LTE) at all locations. LTE exposure levels up to 0.8 V/m were measured, and the average contribution of the LTE signal to the total RF exposure equals 4%.
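
    When several frequency bands contribute, total exposure is normally combined on a root-sum-of-squares basis and compliance is checked through a summed exposure quotient against each band's reference level. The sketch below illustrates this; the per-source field strengths and the GSM reference level are assumed, while the FM (28 V/m) and LTE (61 V/m) levels are the ones quoted in the abstract.

      # Multi-source E-field combination and a simplified ICNIRP-style
      # exposure quotient. Per-source fields are hypothetical; the GSM
      # reference level is an assumed value, while the FM and LTE levels
      # are those quoted in the abstract.
      import math

      sources = {          # (measured E-field, V/m; reference level, V/m)
          "FM": (0.30, 28.0),
          "GSM900": (0.60, 41.0),   # assumed reference level
          "LTE": (0.80, 61.0),
      }
      e_total = math.sqrt(sum(e**2 for e, _ in sources.values()))
      quotient = sum((e / limit) ** 2 for e, limit in sources.values())
      print(f"E_total = {e_total:.2f} V/m, quotient = {quotient:.4f} (<= 1 complies)")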

  13. Inverse modelling of radionuclide release rates using gamma dose rate observations

    NASA Astrophysics Data System (ADS)

    Hamburger, Thomas; Stohl, Andreas; von Haustein, Christoph; Thummerer, Severin; Wallner, Christian

    2014-05-01

    Severe accidents in nuclear power plants, such as the historical accident in Chernobyl in 1986 or the more recent disaster at the Fukushima Dai-ichi nuclear power plant in 2011, have drastic impacts on the population and environment. The hazardous consequences extend to national and continental scales. Environmental measurements and methods to model the transport and dispersion of the released radionuclides serve as a platform to assess the regional impact of nuclear accidents - both for research purposes and, more importantly, to determine the immediate threat to the population. However, the assessments of regional radionuclide activity concentrations and of individual exposure to radiation dose are subject to several uncertainties, for example the accurate model representation of wet and dry deposition. One of the most significant uncertainties, however, results from the estimation of the source term, that is, the time-dependent quantification of the spectrum of radionuclides released during the course of the nuclear accident. The quantification of the source terms of severe nuclear accidents may either remain uncertain (e.g. Chernobyl, Devell et al., 1995) or rely on rather rough estimates of released key radionuclides given by the operators. Precise measurements are mostly missing due to practical limitations during the accident. Inverse modelling can be used to obtain a feasible estimate of the source term (Davoine and Bocquet, 2007). Existing point measurements of radionuclide activity concentrations are combined with atmospheric transport models, and the release rates of radionuclides at the accident site are then obtained by improving the agreement between the modelled and observed concentrations (Stohl et al., 2012). The accuracy of the method, and hence of the resulting source term, depends among other factors on the availability and reliability of the observations and on their resolution in time and space. Radionuclide activity concentrations are observed on a relatively sparse grid, and the temporal resolution of available data may be low, on the order of hours or a day. Gamma dose rates, on the other hand, are observed routinely on a much denser grid and at higher temporal resolution. Gamma dose rate measurements contain no explicit information on the observed spectrum of radionuclides and have to be interpreted carefully. Nevertheless, they provide valuable information for the inverse evaluation of the source term due to their availability (Saunier et al., 2013). We present a new inversion approach combining an atmospheric dispersion model and observations of radionuclide activity concentrations and gamma dose rates to obtain the source term of radionuclides. We use the Lagrangian particle dispersion model FLEXPART (Stohl et al., 1998; Stohl et al., 2005) to model the atmospheric transport of the released radionuclides. The gamma dose rates are calculated from the modelled activity concentrations. The inversion method uses a Bayesian formulation considering uncertainties for the a priori source term and the observations (Eckhardt et al., 2008). The a priori information on the source term is a first guess; the gamma dose rate observations are used with inverse modelling to improve this first guess and to retrieve a reliable source term. The details of this method will be presented at the conference. This work is funded by the Bundesamt für Strahlenschutz BfS, Forschungsvorhaben 3612S60026. References: Davoine, X. and Bocquet, M., Atmos. Chem. Phys., 7, 1549-1564, 2007; Devell, L., et al., OCDE/GD(96)12, 1995; Eckhardt, S., et al., Atmos. Chem. Phys., 8, 3881-3897, 2008; Saunier, O., et al., Atmos. Chem. Phys., 13, 11403-11421, 2013; Stohl, A., et al., Atmos. Environ., 32, 4245-4264, 1998; Stohl, A., et al., Atmos. Chem. Phys., 5, 2461-2474, 2005; Stohl, A., et al., Atmos. Chem. Phys., 12, 2313-2343, 2012.
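
    With linear source-receptor relationships, the Bayesian formulation referenced above (a priori release rates x_a with covariance B, observations y with covariance R, and source-receptor matrix H) has the standard closed-form solution x_hat = x_a + (H^T R^-1 H + B^-1)^-1 H^T R^-1 (y - H x_a). The toy sketch below solves it with synthetic numbers; it is a generic linear-Gaussian illustration, not the FLEXPART-based system described in the abstract.

      # Toy Bayesian inversion for time-resolved release rates. Generic
      # linear-Gaussian analysis with synthetic H, y, xa, B, and R; not the
      # FLEXPART-based system described in the abstract.
      import numpy as np

      rng = np.random.default_rng(0)
      n_rel, n_obs = 6, 40                              # release intervals, observations
      H = rng.uniform(0.0, 1e-3, size=(n_obs, n_rel))   # dilution factors, s/m^3
      x_true = np.array([0.0, 5.0, 20.0, 8.0, 1.0, 0.0])   # "true" release, Bq/s
      y = H @ x_true + rng.normal(0.0, 1e-3, n_obs)         # noisy observations

      xa = np.full(n_rel, 5.0)                   # a priori first guess
      B = np.diag(np.full(n_rel, 10.0**2))       # prior error covariance
      R = np.diag(np.full(n_obs, 1e-3**2))       # observation error covariance

      Binv, Rinv = np.linalg.inv(B), np.linalg.inv(R)
      lhs = H.T @ Rinv @ H + Binv
      x_hat = xa + np.linalg.solve(lhs, H.T @ Rinv @ (y - H @ xa))
      print("posterior release estimate:", np.round(x_hat, 2))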

  14. Impact of earthquake source complexity and land elevation data resolution on tsunami hazard assessment and fatality estimation

    NASA Astrophysics Data System (ADS)

    Muhammad, Ario; Goda, Katsuichiro

    2018-03-01

    This study investigates the impact of model complexity in source characterization and digital elevation model (DEM) resolution on the accuracy of tsunami hazard assessment and fatality estimation through a case study in Padang, Indonesia. Two types of earthquake source models, i.e. complex and uniform slip models, are adopted by considering three resolutions of DEMs, i.e. 150 m, 50 m, and 10 m. For each of the three grid resolutions, 300 complex source models are generated using new statistical prediction models of earthquake source parameters developed from extensive finite-fault models of past subduction earthquakes, whilst 100 uniform slip models are constructed with variable fault geometry without slip heterogeneity. The results highlight that significant changes to tsunami hazard and fatality estimates are observed with regard to earthquake source complexity and grid resolution. Coarse resolution (i.e. 150 m) leads to inaccurate tsunami hazard prediction and fatality estimation, whilst 50-m and 10-m resolutions produce similar results. However, velocity and momentum flux are sensitive to the grid resolution and hence, at least 10-m grid resolution needs to be implemented when considering flow-based parameters for tsunami hazard and risk assessments. In addition, the results indicate that the tsunami hazard parameters and fatality number are more sensitive to the complexity of earthquake source characterization than the grid resolution. Thus, the uniform models are not recommended for probabilistic tsunami hazard and risk assessments. Finally, the findings confirm that uncertainties of tsunami hazard level and fatality in terms of depth, velocity and momentum flux can be captured and visualized through the complex source modeling approach. From tsunami risk management perspectives, this indeed creates big data, which are useful for making effective and robust decisions.

  15. Hydrogeochemistry of the drinking water sources of Derebogazi Village (Kahramanmaras) and their effects on human health.

    PubMed

    Uras, Yusuf; Uysal, Yagmur; Arikan, Tugba Atilan; Kop, Alican; Caliskan, Mustafa

    2015-06-01

    The aim of this study was to investigate the sources of drinking water for Derebogazi Village, Kahramanmaras Province, Turkey, in terms of hydrogeochemistry, isotope geochemistry, and medical geology. Water samples were obtained from seven different water sources in the area, all of which are located within quartzite units of Paleozoic age, and isotopic analyses of (18)O and (2)H (deuterium) were conducted on the samples. Samples were collected from the region over a period of 1 year. Water quality of the samples was assessed in terms of various water quality parameters, such as temperature, pH, conductivity, alkalinity, trace element concentrations, anion-cation measurements, and metal concentrations, using ion chromatography, inductively coupled plasma (ICP) mass spectrometry, and ICP-optical emission spectrometry techniques. Regional health surveys had revealed that the heights of local people are significantly below the average for the country. In terms of medical geology, the sampled drinking water from the seven sources was deficient in calcium and magnesium ions, which promote bone development. Bone mineral density screening tests were conducted on ten females using dual energy X-ray absorptiometry to investigate possible developmental disorder(s) and potential for mineral loss in the region. Of these ten women, three had T-scores close to the osteoporosis range (T-score < -2.5).

  16. Assessment of Technologies for the Space Shuttle External Tank Thermal Protection System and Recommendations for Technology Improvement - Part III: Material Property Characterization, Analysis, and Test Methods

    NASA Technical Reports Server (NTRS)

    Gates, Thomas S.; Johnson, Theodore F.; Whitley, Karen S.

    2005-01-01

    The objective of this report is to contribute to the independent assessment of the Space Shuttle External Tank Foam Material. This report specifically addresses material modeling, characterization testing, data reduction methods, and data pedigree. A brief description of the External Tank foam materials, locations, and standard failure modes is provided to develop suitable background information. A review of mechanics based analysis methods from the open literature is used to provide an assessment of the state-of-the-art in material modeling of closed cell foams. Further, this report assesses the existing material property database and investigates sources of material property variability. The report presents identified deficiencies in testing methods and procedures, recommendations for additional testing as required, identification of near-term improvements that should be pursued, and long-term capabilities or enhancements that should be developed.

  17. Assessment of diesel particulate matter exposure in the workplace: freight terminals†

    PubMed Central

    Sheesley, Rebecca J.; Schauer, James J.; Smith, Thomas J.; Garshick, Eric; Laden, Francine; Marr, Linsey C.; Molina, Luisa T.

    2008-01-01

    A large study has been undertaken to assess the exposure to diesel exhaust within diesel trucking terminals. A critical component of this assessment is an analysis of the variation in carbonaceous particulate matter (PM) across trucking terminal locations; consistency in the primary sources can be effectively tracked by analyzing trends in elemental carbon (EC) and organic molecular marker concentrations. Ambient samples were collected at yard, dock and repair shop work stations in 7 terminals in the USA and 1 in Mexico. Concentrations of EC ranged from 0.2 to 12 μg m−3 among the terminals, which corresponds to the range seen in the concentration of summed hopanes (0.5 to 20.5 ng m−3). However, when chemical mass balance (CMB) source apportionment results were presented as percent contribution to organic carbon (OC) concentrations, the contribution of mobile sources to OC was similar among the terminals in different cities. The average mobile source percent contribution to OC was 75.3 ± 17.1% for truck repair shops, 65.4 ± 20.4% for the docks and 38.4 ± 9.5% for the terminal yard samples. A relatively consistent mobile source impact was present at all the terminals only when considering the percentage of total OC concentrations, not in terms of absolute concentrations. PMID:18392272
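
    Chemical mass balance apportions the measured ambient species concentrations among fixed source profiles, typically by a weighted least-squares fit with non-negative source contributions. The stripped-down sketch below uses invented profiles and a single ambient sample; a real CMB run also weights the fit by measurement and profile uncertainties.

      # Stripped-down chemical mass balance (CMB) sketch: solve ambient ~ F @ s
      # for non-negative source contributions s, given fixed source profiles F.
      # Profiles and the ambient sample are invented; real CMB applications use
      # an effective-variance weighted solution.
      import numpy as np
      from scipy.optimize import nnls

      # Rows: species (EC, OC, hopanes, levoglucosan);
      # columns: diesel exhaust, wood smoke, road dust (mass fractions).
      F = np.array([
          [0.45, 0.05, 0.02],
          [0.35, 0.55, 0.10],
          [0.002, 0.0, 0.0],
          [0.0, 0.12, 0.0],
      ])
      ambient = np.array([3.0, 6.5, 0.013, 0.55])    # ug/m^3 (hypothetical)

      s, residual = nnls(F, ambient)                 # contributions, ug/m^3
      for name, contrib in zip(["diesel", "wood smoke", "road dust"], s):
          print(f"{name:10s}: {contrib:6.2f} ug/m^3")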

  18. Additional Evidence for the Accuracy of Biographical Data: Long-Term Retest and Observer Ratings.

    ERIC Educational Resources Information Center

    Shaffer, Garnett Stokes; And Others

    1986-01-01

    Investigated accuracy of responses to biodata questionnaire using a test-retest design and informed external observers for verification. Responses from 237 subjects and 200 observers provided evidence that many responses to biodata questionnaire were accurate. Assessed sources of inaccuracy, including social desirability effects, and noted…

  19. Assessment of Methane Emissions from Oil and Gas Production Pads using Mobile Measurements

    EPA Science Inventory

    Journal Article Abstract --- "A mobile source inspection approach called OTM 33A was used to quantify short-term methane emission rates from 218 oil and gas production pads in Texas, Colorado, and Wyoming from 2010 to 2013. The emission rates were log-normally distributed with ...

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    J.C. Ryman

    This calculation is a revision of a previous calculation (Ref. 7.5) that bears the same title and has the document identifier BBAC00000-01717-0210-00006 REV 01. The purpose of this revision is to remove TBV (to-be-verified)-4110 associated with the output files of the previous version (Ref. 7.30). The purpose of this and the previous calculation is to generate source terms for a representative boiling water reactor (BWR) spent nuclear fuel (SNF) assembly for the first one million years after the SNF is discharged from the reactors. This calculation includes an examination of several ways to represent BWR assemblies and operating conditions in SAS2H in order to quantify the effects these representations may have on source terms. These source terms provide information characterizing the neutron and gamma spectra in particles per second, the decay heat in watts, and radionuclide inventories in curies. Source terms are generated for a range of burnups and enrichments (see Table 2) that are representative of the waste stream and stainless steel (SS) clad assemblies. During this revision, it was determined that the burnups used for the computer runs of the previous revision were actually about 1.7% less than the stated, or nominal, burnups. See Section 6.6 for a discussion of how to account for this effect before using any source terms from this calculation. The source term due to the activation of corrosion products deposited on the surfaces of the assembly from the coolant is also calculated. The results of this calculation support many areas of the Monitored Geologic Repository (MGR), which include thermal evaluation, radiation dose determination, radiological safety analyses, surface and subsurface facility designs, and total system performance assessment. This includes MGR items classified as Quality Level 1, for example, the Uncanistered Spent Nuclear Fuel Disposal Container (Ref. 7.27, page 7). Therefore, this calculation is subject to the requirements of the Quality Assurance Requirements and Description (Ref. 7.28). The performance of the calculation and development of this document are carried out in accordance with AP-3.124, ''Design Calculation and Analyses'' (Ref. 7.29).

  1. Proceedings of the international meeting on thermal nuclear reactor safety. Vol. 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    Separate abstracts are included for each of the papers presented concerning current issues in nuclear power plant safety; national programs in nuclear power plant safety; radiological source terms; probabilistic risk assessment methods and techniques; non-LOCA and small-break LOCA transients; safety goals; pressurized thermal shocks; applications of reliability and risk methods to probabilistic risk assessment; human factors and the man-machine interface; and data bases and special applications.

  2. EN FACE IMAGING OF RETINAL ARTERY MACROANEURYSMS USING SWEPT-SOURCE OPTICAL COHERENCE TOMOGRAPHY.

    PubMed

    Hanhart, Joel; Strassman, Israel; Rozenman, Yaakov

    2017-01-01

    To describe the advantages of en face view with swept-source optical coherence tomography in assessing the morphologic features of retinal arterial macroaneurysms, their consequences on adjacent retina, planning laser treatment, and evaluating its effects. Three eyes were treated for retinal arterial macroaneurysms and followed by swept-source optical coherence tomography in 2014-2015. En face images of the retina and choroid were obtained by EnView, a swept-source optical coherence tomography program. Retinal arterial macroaneurysms have a typical optical coherence tomography appearance. En face view allows delineation of the macroaneurysm wall, thrombotic components within the dilation, and lumen measurement. Hemorrhage, lipids, and fluids can be precisely described in terms of amount and extent over the macula and depth. This technique is also practical for planning focal laser treatment and determining its effects. En face swept-source optical coherence tomography is a rapid, noninvasive, high-resolution, promising technology, which allows excellent visualization of retinal arterial macroaneurysms and their consequences on surrounding tissues. It could make angiography with intravenous injection redundant in planning and assessing therapy.

  3. Long-term changes after brief dynamic psychotherapy: symptomatic versus dynamic assessments.

    PubMed

    Høglend, P; Sørlie, T; Sørbye, O; Heyerdahl, O; Amlo, S

    1992-08-01

    Dynamic change in psychotherapy, as measured by theory-related or mode-specific instruments, has been criticized for being too highly intercorrelated with symptomatic change measures. In this study, long-term changes after brief dynamic psychotherapy were studied in 45 moderately disturbed neurotic patients using a reliable outcome battery. The factor structure of all the change variables suggested that they tapped 2 distinct and stable sources of variance: dynamic and symptomatic change. The categories of overall dynamic change were different from categories of change on the Global Assessment Scale. A small systematic difference was also found between the categories of overall dynamic change and the categories of target-complaints change, due to false solutions of dynamic conflicts.

  4. Biomass burning contributions estimated by synergistic coupling of daily and hourly aerosol composition records.

    PubMed

    Nava, S; Lucarelli, F; Amato, F; Becagli, S; Calzolai, G; Chiari, M; Giannoni, M; Traversi, R; Udisti, R

    2015-04-01

    Biomass burning (BB) is a significant source of particulate matter (PM) in many parts of the world. Whereas numerous studies demonstrate the relevance of BB emissions in central and northern Europe, this source has been quantified in only a few cities in southern European countries. In this work, the application of Positive Matrix Factorisation (PMF) allowed a clear identification and quantification of an unexpectedly high biomass burning contribution in Tuscany (central Italy), at the most polluted site of the PATOS project. At this urban background site, BB accounted for 37% of the mass of PM10 (particulate matter with aerodynamic diameter<10 μm) as an annual average, and more than 50% during winter, being the main cause of all the PM10 limit exceedances. Due to the chemical complexity of BB emissions, an accurate assessment of this source contribution is not always easily achievable using just a single tracer. The present work takes advantage of the combination of a long-term daily data-set, characterized by an extended chemical speciation, with a short-term high time resolution (1-hour) and size-segregated data-set, obtained by PIXE analyses of streaker samples. The hourly time pattern of the BB source, characterised by a periodic behaviour with peaks starting at about 6 p.m. and lasting all the evening-night, and its strong seasonality, with higher values in the winter period, clearly confirmed the hypothesis of a domestic heating source (also excluding important contributions from wildfires and agricultural waste burning). Copyright © 2014 Elsevier B.V. All rights reserved.
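
    PMF factors the samples-by-species concentration matrix into non-negative factor contributions and factor (source) profiles, down-weighting uncertain measurements. As a rough, unweighted stand-in for that decomposition step, the sketch below applies scikit-learn's NMF to a synthetic matrix; real PMF applications (e.g. EPA PMF or ME-2) minimize an uncertainty-weighted objective, which is not reproduced here.

      # Rough illustration of the factorization underlying PMF, using an
      # unweighted NMF on a synthetic samples-by-species matrix X ~ G @ F
      # (G: factor contributions, F: factor profiles). Real PMF minimizes an
      # uncertainty-weighted objective; this sketch only shows the
      # non-negative decomposition itself.
      import numpy as np
      from sklearn.decomposition import NMF

      rng = np.random.default_rng(1)
      n_samples, n_species, n_factors = 120, 10, 3
      G_true = rng.gamma(2.0, 1.0, size=(n_samples, n_factors))
      F_true = rng.dirichlet(np.ones(n_species), size=n_factors)
      X = G_true @ F_true + rng.normal(0.0, 0.01, size=(n_samples, n_species))
      X = np.clip(X, 0.0, None)               # keep the matrix non-negative

      model = NMF(n_components=n_factors, init="nndsvda", max_iter=500,
                  random_state=0)
      G = model.fit_transform(X)              # contributions per sample
      F = model.components_                   # factor (source) profiles
      print("mean contribution per factor:", np.round(G.mean(axis=0), 2))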

  5. Above and beyond short-term mating, long-term mating is uniquely tied to human personality.

    PubMed

    Holtzman, Nicholas S; Strube, Michael J

    2013-12-16

    To what extent are personality traits and sexual strategies linked? The literature does not provide a clear answer, as it is based on the Sociosexuality model, a one-dimensional model that fails to measure long-term mating (LTM). An improved two-dimensional model separately assesses long-term and short-term mating (STM; Jackson and Kirkpatrick, 2007). In this paper, we link this two-dimensional model to an array of personality traits (Big 5, Dark Triad, and Schizoid Personality). We collected data from different sources (targets and peers; Study 1), and from different nations (United States, Study 1; India, Study 2). We demonstrate for the first time that, above and beyond STM, LTM captures variation in personality.

  6. Numerical and experimental evaluations of the flow past nested chevrons

    NASA Technical Reports Server (NTRS)

    Foss, J. F.; Foss, J. K.; Spalart, P. R.

    1989-01-01

    An effort is made to contribute to the development of CFD by relating the successful use of vortex dynamics in the computation of the pressure drop past a planar array of chevron-shaped obstructions. An ensemble of results was used to compute the loss coefficient k, stimulating an experimental program for the assessment of the measured loss coefficient for the same geometry. The most provocative result of this study has been the representation of kinetic energy production in terms of vorticity source terms.

  7. Photovoltaic village power application: Assessment of the near-term market

    NASA Technical Reports Server (NTRS)

    Rosenblum, L.; Bifano, W. J.; Poley, W. A.; Scudder, L. R.

    1978-01-01

    The village power application represents a potential market for photovoltaics. The price of energy for photovoltaic systems was compared to that of utility line extensions and diesel generators. The potential domestic demand was defined in both the government and commercial sectors. The foreign demand and sources of funding for village power systems in the developing countries were also discussed briefly. It was concluded that a near-term domestic market of at least 12 MW and a foreign market of about 10 GW exist.
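
    Comparisons between photovoltaic systems, diesel generation, and line extensions ultimately reduce to a cost-of-energy calculation. The sketch below shows a highly simplified levelized-cost comparison; every capital, fuel, and performance figure is a placeholder assumption rather than a value from the report.

      # Highly simplified levelized cost of energy (LCOE) comparison between
      # a small photovoltaic system and a diesel generator. Every number is
      # a placeholder assumption, not a figure from this assessment.

      def crf(rate, years):
          """Capital recovery factor used to annualize an up-front cost."""
          return rate * (1 + rate) ** years / ((1 + rate) ** years - 1)

      def lcoe(capital, annual_om, annual_fuel, annual_kwh, rate=0.08, years=20):
          return (capital * crf(rate, years) + annual_om + annual_fuel) / annual_kwh

      pv = lcoe(capital=15000.0, annual_om=150.0, annual_fuel=0.0, annual_kwh=3500.0)
      diesel = lcoe(capital=2000.0, annual_om=400.0, annual_fuel=2200.0,
                    annual_kwh=3500.0)
      print(f"PV: {pv:.2f} $/kWh   diesel: {diesel:.2f} $/kWh")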

  8. Photovoltaic water pumping applications: Assessment of the near-term market

    NASA Technical Reports Server (NTRS)

    Rosenblum, L.; Bifano, W. J.; Scudder, L. R.; Poley, W. A.; Cusick, J. P.

    1978-01-01

    Water pumping applications represent a potential market for photovoltaics. The price of energy for photovoltaic systems was compared to that of utility line extensions and diesel generators. The potential domestic demand was defined in the government, commercial/institutional and public sectors. The foreign demand and sources of funding for water pumping systems in the developing countries were also discussed briefly. It was concluded that a near term domestic market of at least 240 megawatts and a foreign market of about 6 gigawatts exist.

  9. Variational Iterative Refinement Source Term Estimation Algorithm Assessment for Rural and Urban Environments

    NASA Astrophysics Data System (ADS)

    Delle Monache, L.; Rodriguez, L. M.; Meech, S.; Hahn, D.; Betancourt, T.; Steinhoff, D.

    2016-12-01

    It is necessary to accurately estimate the initial source characteristics in the event of an accidental or intentional release of a Chemical, Biological, Radiological, or Nuclear (CBRN) agent into the atmosphere. Accurate estimation of the source characteristics is important because they are often unknown, and the Atmospheric Transport and Dispersion (AT&D) models rely heavily on these estimates to create hazard assessments. To correctly assess the source characteristics in an operational environment where time is critical, the National Center for Atmospheric Research (NCAR) has developed a Source Term Estimation (STE) method, known as the Variational Iterative Refinement STE algorithm (VIRSA). VIRSA consists of a combination of modeling systems. These systems include an AT&D model, its corresponding STE model, a Hybrid Lagrangian-Eulerian Plume Model (H-LEPM), and its mathematical adjoint model. In an operational scenario where we have information regarding the infrastructure of a city, the AT&D model used is the Urban Dispersion Model (UDM), and when using this model in VIRSA we refer to the system as uVIRSA. In all other scenarios where we do not have the city infrastructure information readily available, the AT&D model used is the Second-order Closure Integrated PUFF model (SCIPUFF) and the system is referred to as sVIRSA. VIRSA was originally developed using SCIPUFF 2.4 for the Defense Threat Reduction Agency and integrated into the Hazard Prediction and Assessment Capability and Joint Program for Information Systems Joint Effects Model. The results discussed here are the verification and validation of the upgraded system with SCIPUFF 3.0 and the newly implemented UDM capability. To verify uVIRSA and sVIRSA, synthetic concentration observation scenarios were created in urban and rural environments, and the results of this verification are shown. Finally, we validate the STE performance of uVIRSA using scenarios from the Joint Urban 2003 (JU03) experiment, which was held in Oklahoma City, and also validate the performance of sVIRSA using scenarios from the FUsing Sensor Integrated Observing Network (FUSION) Field Trial 2007 (FFT07), held at Dugway Proving Grounds in rural Utah.
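
    At its simplest, source term estimation exploits the fact that, for a fixed source location and meteorology, predicted concentrations scale linearly with the release rate, so the rate can be fit directly to observations. The sketch below does this with a basic Gaussian plume as the forward model; the dispersion-coefficient power laws and all observations are illustrative assumptions, and this is not the VIRSA, SCIPUFF, or UDM system.

      # Minimal source-strength estimation: fit the release rate Q of a
      # ground-level point source to downwind observations, using a basic
      # Gaussian plume as the forward model. Dispersion coefficients and the
      # synthetic "observations" are illustrative assumptions only.
      import numpy as np

      def plume_factor(x, y, u=3.0):
          """Ground-level concentration per unit emission (s/m^3)."""
          sigma_y = 0.08 * x ** 0.9          # illustrative power-law widths
          sigma_z = 0.06 * x ** 0.85
          return np.exp(-0.5 * (y / sigma_y) ** 2) / (np.pi * u * sigma_y * sigma_z)

      # Synthetic sensors (downwind distance x, crosswind offset y), in meters.
      x = np.array([200.0, 400.0, 600.0, 800.0])
      y = np.array([0.0, 30.0, -20.0, 10.0])
      f = plume_factor(x, y)
      Q_true = 2.5                                            # g/s
      noise = np.random.default_rng(2).normal(0.0, 0.1, x.size)
      obs = Q_true * f * (1.0 + noise)

      # Because obs ~ Q * f, the least-squares estimate of Q is closed form.
      Q_hat = float(f @ obs) / float(f @ f)
      print(f"estimated release rate Q = {Q_hat:.2f} g/s")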

  10. Information sources in biomedical science and medical journalism: methodological approaches and assessment.

    PubMed

    Miranda, Giovanna F; Vercellesi, Luisa; Bruno, Flavia

    2004-09-01

    Throughout the world the public is showing increasing interest in medical and scientific subjects, and journalists largely spread this information, with an important impact on knowledge and health. Clearly, therefore, the relationship between the journalist and his sources is delicate: freedom and independence of information depend on the independence and truthfulness of the sources. The new "precision journalism" holds that scientific methods should be applied to journalism, so authoritative sources are a common need for journalists and scientists. We therefore compared the individual classifications and methods of assessing sources in biomedical science and medical journalism to try to extrapolate scientific methods of evaluation to journalism. In journalism and science, terms used to classify sources of information show some similarities, but their meanings are different. In science, primary and secondary classes of information, for instance, refer to the levels of processing, but in journalism to the official nature of the source itself. Scientists and journalists must both always consult as many sources as possible and check their authoritativeness, reliability, completeness, up-to-dateness and balance. In journalism, however, there are some important differences and limits: too many sources can sometimes diminish the quality of the information. The sources serve as a first filter between the event and the journalist, who is not providing the reader with the fact, but with its projection. Journalists have time constraints and lack the objective criteria for searching, the specific background knowledge, and the expertise to fully assess sources. To assist in understanding the wealth of sources of information in journalism, we have prepared a checklist of items and questions. There are at least four fundamental points that a good journalist, like any scientist, should know: how to find the latest information (the sources), how to assess it (the quality and authoritativeness), how to analyse and filter it (selection), and how to deal with too many sources of information, sometimes biased by conflicting interests (balance). The journalist must, in addition, know how to translate it to render it accessible and useful to the general public (dissemination), and how to use it best.

  11. Substantial reductions in ambient PAHs pollution and lives saved as a co-benefit of effective long-term PM2.5 pollution controls.

    PubMed

    Kong, Shaofei; Yan, Qin; Zheng, Huang; Liu, Haibiao; Wang, Wei; Zheng, Shurui; Yang, Guowei; Zheng, Mingming; Wu, Jian; Qi, Shihua; Shen, Guofeng; Tang, Lili; Yin, Yan; Zhao, Tianliang; Yu, Huan; Liu, Dantong; Zhao, Delong; Zhang, Tao; Ruan, Jujun; Huang, Mingzhi

    2018-05-01

    Under China's great efforts to fight its serious haze problem since 2013, decreases in air pollutants, especially fine particles (PM2.5), have been observed in several key regions. This study tried to answer whether the reduction of PM2.5-bound polycyclic aromatic hydrocarbons (PAHs) was coincident with that of PM2.5 as a result of long-term pollution control measures (PCM), and to assess source-oriented health risks associated with inhalation exposure to PAHs. Field measurements were carried out before and after the publication of the local air pollution protection plan for Nanjing, a mega-city in east China. Results indicated that air quality improved substantially, with a significant reduction in annual average PM2.5 of 34%; moreover, PM2.5-bound PAHs were significantly reduced by 63% (p < 0.001). The remarkable reduction was mainly attributable to changes in emission sources, as compared with the influence of atmospheric circulation patterns, surface meteorological conditions, and atmospheric chemical reactions. Four PAH sources were identified: coal combustion (CC), petroleum and oil burning (PO), wood burning (WB) and vehicle emission (VE). On an annual basis, contributions to ambient PM2.5-bound PAHs from the WB, PO, CC and VE sources in the period before the control measures were 2.26, 2.20, 1.96 and 5.62 ng m-3, respectively. They fell to 1.09, 0.37, 1.31 and 1.77 ng m-3 for the four source types, reductions of 51, 83, 33 and 68%, respectively. The estimated reduction in lifetime lung cancer risk was around 61%. This study, the first to assess the health effects of PAH reduction as a co-benefit of air PCM sustained over a long period, is believed to be applicable and a useful reference for other mega-cities around the world in assessing the benefits of PCM. Copyright © 2018 Elsevier Ltd. All rights reserved.

  12. Review of atmospheric ammonia data in the context of developing technologies, changing climate, and future policy evidence needs

    NASA Astrophysics Data System (ADS)

    Braban, Christine; Tang, Sim; Bealey, Bill; Roberts, Elin; Stephens, Amy; Galloway, Megan; Greenwood, Sarah; Sutton, Mark; Nemitz, Eiko; Leaver, David

    2017-04-01

    Ambient ammonia measurements have been undertaken both to understand sources and concentrations at background sites and vulnerable ecosystems, and for long-term monitoring of concentrations. As ammonia is a pollutant whose concentrations are projected to increase in the coming decades, with significant policy challenges to implementing mitigation strategies, it is useful to assess what has been measured, where, and why. In this study, a review of the literature has shown that ammonia measurements are frequently not publicly reported and in general are not deposited in open data centres available for research. The specific sectors where measurements have been undertaken are: agricultural point source assessments, agricultural surface exchange measurements, sensitive ecosystem monitoring, landscape/regional studies and governmental long-term monitoring. Less frequently, ammonia is measured as part of an intensive atmospheric chemistry field campaign. Technology is developing, which means a shift from chemical denuder methods to spectroscopic techniques may be possible; however, chemical denuder techniques with off-line laboratory analysis will likely remain an economical approach for some time to come. This paper reviews existing datasets from the different sectors of research and integrates them into a global picture to allow both a long-term understanding and comparison with future measurements.

  13. Assessing the short-term clock drift of early broadband stations with burst events of the 26 s persistent and localized microseism

    NASA Astrophysics Data System (ADS)

    Xie, J.; Ni, S.; Chu, R.; Xia, Y.

    2017-12-01

    An accurate seismometer clock plays an important role in seismological studies including earthquake location and tomography. However, some seismic stations may have clock drifts larger than 1 second, especially in the early days of the global seismic network. The 26 s Persistent Localized (PL) microseism event in the Gulf of Guinea sometimes excites strong and coherent signals, and can be used as a repeating source for assessing the stability of seismometer clocks. Taking station GSC/TS in southern California, USA as an example, the 26 s PL signal can be easily observed in the ambient Noise Cross-correlation Function (NCF) between GSC/TS and a remote station. The variation of the travel time of this 26 s signal in the NCF is used to infer the clock error. A drastic clock error is detected during June 1992. This short-term clock error is confirmed by both teleseismic and local earthquake records, with a magnitude of ±25 s. Using the 26 s PL source, the clock can be validated for historical records of sparsely distributed stations, where the usual NCF of short-period microseisms (<20 s) might be less effective due to its attenuation over long interstation distances. However, this method suffers from a cycle ambiguity problem and should be verified by teleseismic/local P waves. The location change of the 26 s PL source may influence the measured clock drift; using regional stations with stable clocks, we estimate the possible location change of the source.
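
    The core measurement in this approach, reading an apparent clock error off the lag of a cross-correlation peak, can be sketched with synthetic data. The snippet below is not the authors' processing chain; the sample interval, the synthetic 26 s wavelet standing in for the NCF, and the shift to be recovered are assumptions made purely for illustration.

      # Minimal sketch (not the authors' code): estimate an apparent clock shift by
      # cross-correlating a test-period noise-correlation function against a
      # long-term reference. Synthetic 26 s signals stand in for real NCFs.
      import numpy as np

      dt = 1.0                                          # sample interval in seconds (assumed)
      t = np.arange(0, 600, dt)
      reference = np.sin(2 * np.pi * t / 26.0) * np.exp(-((t - 300.0) / 120.0) ** 2)

      true_shift = 7.0                                  # clock error (s) to recover
      test = np.interp(t - true_shift, t, reference)    # delayed copy = "drifting clock"

      # Full cross-correlation; the lag of its maximum is the apparent clock error.
      corr = np.correlate(test, reference, mode="full")
      lags = (np.arange(corr.size) - (reference.size - 1)) * dt
      print("estimated clock shift (s):", lags[np.argmax(corr)])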

  14. A comprehensive Probabilistic Tsunami Hazard Assessment for the city of Naples (Italy)

    NASA Astrophysics Data System (ADS)

    Anita, G.; Tonini, R.; Selva, J.; Sandri, L.; Pierdominici, S.; Faenza, L.; Zaccarelli, L.

    2012-12-01

    A comprehensive Probabilistic Tsunami Hazard Assessment (PTHA) should consider different tsunamigenic sources (seismic events, slide failures, volcanic eruptions) to calculate the hazard at given target sites. This implies a multi-disciplinary analysis of all natural tsunamigenic sources, in a multi-hazard/risk framework, which also considers the effects of interaction/cascade events. Our approach shows the ongoing effort to analyze a comprehensive PTHA for the city of Naples (Italy) including all types of sources located in the Tyrrhenian Sea, as developed within the Italian project ByMuR (Bayesian Multi-Risk Assessment). The project combines a multi-hazard/risk approach to treat the interactions among different hazards, and a Bayesian approach to handle the uncertainties. The natural potential tsunamigenic sources analyzed are: 1) submarine seismic sources located on active faults in the Tyrrhenian Sea and close to the southern Italian shoreline (we also consider the effects of onshore seismic sources and the associated active faults, for which we provide rupture properties), 2) mass failures and collapses around the target area (spatially identified on the basis of their propensity to failure), and 3) volcanic sources, mainly pyroclastic flows and collapses from the volcanoes in the Neapolitan area (Vesuvius, Campi Flegrei and Ischia). All these natural sources are preliminarily analyzed and combined here, in order to provide a complete picture of a PTHA for the city of Naples. In addition, the treatment of interaction/cascade effects is formally discussed in the case of significant temporary variations in the short-term PTHA due to an earthquake.

  15. Monitoring Heritage Buildings with Open Source Hardware Sensors: A Case Study of the Mosque-Cathedral of Córdoba

    PubMed Central

    Mesas-Carrascosa, Francisco Javier; Verdú Santano, Daniel; Meroño de Larriva, Jose Emilio; Ortíz Cordero, Rafael; Hidalgo Fernández, Rafael Enrique; García-Ferrer, Alfonso

    2016-01-01

    A number of physical factors can adversely affect cultural heritage. Therefore, monitoring parameters involved in the deterioration process, principally temperature and relative humidity, is useful for preventive conservation. In this study, a total of 15 microclimate stations using open source hardware were developed and stationed at the Mosque-Cathedral of Córdoba, which is registered with UNESCO for its outstanding universal value, to assess the behavior of interior temperature and relative humidity in relation to exterior weather conditions, public hours and interior design. Long-term monitoring of these parameters is of interest in terms of preservation and reducing the costs of future conservation strategies. Results from monitoring are presented to demonstrate the usefulness of this system. PMID:27690056

  16. Implementation of the Leaching Environmental Assessment ...

    EPA Pesticide Factsheets

    New leaching tests are available in the U.S. for developing more accurate source terms for use in fate and transport models. For beneficial use or disposal, the leaching environmental assessment framework (LEAF) will provide leaching results that reflect field conditions for either use or disposal of the material or waste. The purpose is to provide an overview of the implementation of the new leaching tests for presentation at the MEGA symposium, which serves the coal-fired power industry.

  17. Cross-Disciplinary and Intermode Agreement on the Description and Evaluation of Landscape Resources

    ERIC Educational Resources Information Center

    Zube, Ervin H.

    1974-01-01

    Data obtained from ground reconnaissance and from office studies employing aerial photography show that the extent of agreement between environmental designers and resource managers on the use of descriptive and evaluative landscape terms is generally high. Use of remote data sources for science resource assessments is supported. (DT)

  18. 40 CFR 69.22 - Title V conditional exemption.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... the issuance of permits with a fixed term that shall not exceed five years. (10) The program shall... authority to assess civil and criminal penalties up to $10,000 per day per violation and to enjoin... issuing permits to all subject sources within three years of EPA approval of the program. (8) The program...

  19. 40 CFR 69.22 - Title V conditional exemption.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... the issuance of permits with a fixed term that shall not exceed five years. (10) The program shall... authority to assess civil and criminal penalties up to $10,000 per day per violation and to enjoin... issuing permits to all subject sources within three years of EPA approval of the program. (8) The program...

  20. Short-term C mineralization (aka the flush of CO2) as an indicator of soil biological health

    USDA-ARS's Scientific Manuscript database

    Soil biological activity is a key component of soil health assessments, as it (a) indicates soil nutrient cycling capacity from various organic matter sources to inorganic availability, (b) relates to soil structural conditions, (c) informs about the potential to harbor biodiversity in soil, and (d)...

  1. Reducing DoD Fossil-Fuel Dependence

    DTIC Science & Technology

    2006-09-01

    Record excerpts (glossary and contents fragments): gigawatt-hour = the amount of energy available from one gigawatt in one hour; HFCS = high-fructose corn syrup; HHV = high-heat value; HICE = hydrogen internal combustion engine; "Ethanol derived from corn" (p. 63). In particular, alternate fuels and energy sources are to be assessed in terms of multiple parameters, to include (but not limited to) stability, high & low ...

  2. Role of Universities in the National Innovation System. Discussion Paper

    ERIC Educational Resources Information Center

    Group of Eight (NJ1), 2011

    2011-01-01

    Over recent years governments have been placing more emphasis on innovation as a source of national competitiveness. Governments now assess their investments across many areas in terms of the contribution that such investments make to increasing innovation. This has been especially significant for education and in particular for the development of…

  3. Methods for assessing long-term mean pathogen count in drinking water and risk management implications.

    PubMed

    Englehardt, James D; Ashbolt, Nicholas J; Loewenstine, Chad; Gadzinski, Erik R; Ayenu-Prah, Albert Y

    2012-06-01

    Recently, pathogen counts in drinking and source waters were shown theoretically to have the discrete Weibull (DW) or closely related discrete growth distribution (DGD). The result was demonstrated against nine short-term and three simulated long-term water quality datasets. These distributions are highly skewed, such that available datasets seldom represent the rare but important high-count events, making estimation of the long-term mean difficult. In the current work the methods, and data record length, required to assess long-term mean microbial count were evaluated by simulation of representative DW and DGD waterborne pathogen count distributions. Also, microbial count data were analyzed spectrally for correlation and cycles. In general, longer data records were required for more highly skewed distributions, conceptually associated with more highly treated water. In particular, 500-1,000 random samples were required for reliable assessment of the population mean ±10%, though 50-100 samples produced an estimate within one log (45%) below. A simple correlated first-order model was shown to produce count series with a 1/f signal, and such periodicity over many scales was shown in empirical microbial count data, for consideration in sampling. A tiered management strategy is recommended, including a plan for rapid response to unusual levels of routinely monitored water quality indicators.
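
    The record-length question examined here, how many samples a highly skewed count distribution demands before its mean is estimated reliably, can be explored with a simple simulation. The sketch below does not reproduce the paper's DW/DGD fits; an arbitrary discretized Weibull is assumed as a stand-in for a skewed pathogen-count distribution, and the script simply tabulates how often the sample mean lands within ±10% of the long-run mean for several record lengths.

      # Illustrative simulation, not the paper's analysis: heavy-tailed synthetic
      # "pathogen counts" and the record length needed for a +/-10% mean estimate.
      import numpy as np

      rng = np.random.default_rng(1)
      shape, scale = 0.5, 2.0                                 # assumed parameters (heavy right tail)
      counts = np.floor(rng.weibull(shape, size=200_000) * scale).astype(int)

      true_mean = counts.mean()                               # treat the large sample as the "population"
      for n in (50, 100, 500, 1000, 5000):
          means = np.array([rng.choice(counts, n).mean() for _ in range(500)])
          within = np.mean(np.abs(means - true_mean) <= 0.1 * true_mean)
          print(f"n={n:5d}: fraction of estimates within +/-10% of the mean = {within:.2f}")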

  4. Respiratory source control using a surgical mask: An in vitro study.

    PubMed

    Patel, Rajeev B; Skaria, Shaji D; Mansour, Mohamed M; Smaldone, Gerald C

    2016-07-01

    Cough etiquette and respiratory hygiene are forms of source control encouraged to prevent the spread of respiratory infection. The use of surgical masks as a means of source control has not been quantified in terms of reducing exposure to others. We designed an in vitro model using various facepieces to assess their contribution to exposure reduction when worn at the infectious source (Source) relative to facepieces worn for primary (Receiver) protection, and the factors that contribute to each. In a chamber with various airflows, radiolabeled aerosols were exhaled via a ventilated soft-face manikin head using tidal breathing and cough (Source). Another manikin, containing a filter, quantified recipient exposure (Receiver). The natural fit surgical mask, fitted (SecureFit) surgical mask and an N95-class filtering facepiece respirator (commonly known as an "N95 respirator") with and without a Vaseline-seal were tested. With cough, source control (mask or respirator on Source) was statistically superior to mask or unsealed respirator protection on the Receiver (Receiver protection) in all environments. To equal source control during coughing, the N95 respirator must be Vaseline-sealed. During tidal breathing, source control was comparable or superior to mask or respirator protection on the Receiver. Source control via surgical masks may be an important adjunct defense against the spread of respiratory infections. The fit of the mask or respirator, in combination with the airflow patterns in a given setting, are significant contributors to source control efficacy. Future clinical trials should include a surgical mask source control arm to assess the contribution of source control in overall protection against airborne infection.

  5. Is Privately Funded Research on the Rise in Ocean Science?

    NASA Astrophysics Data System (ADS)

    Spring, M.; Cooksey, S. W.; Orcutt, J. A.; Ramberg, S. E.; Jankowski, J. E.; Mengelt, C.

    2014-12-01

    While federal funding for oceanography is leveling off or declining, private sector funding from industry and philanthropy appears to be on the rise. The Ocean Studies Board of the National Research Council is discussing these changes in the ocean science funding landscape. In 2014 the Board convened experts to better understand the long-term public and private funding trends for the ocean sciences and the implications of such trends for the ocean science enterprise and the nation. Specific topics of discussion included: (1) the current scope of philanthropic and industry funding for the ocean sciences; (2) the long-term trends in the funding balance between federal and other sources of funding; (3) the priorities and goals for private funders; and (4) the characteristics of various modes of engagement for private funders. Although public funding remains the dominant source of research funding, it is unclear how far or fast that balance might shift in the future, or what a shifting balance may mean. There has been no comprehensive assessment of the magnitude and impact of privately funded science, particularly in the ocean sciences, as public funding sources decline. Nevertheless, the existing data can shed some light on these questions. We will present available data on long-term trends in federal and other sources of funding for science (focusing on ocean science) and report on preliminary findings from a panel discussion with key private foundations and industry funders.

  6. The microbial quality of drinking water in Manonyane community: Maseru District (Lesotho).

    PubMed

    Gwimbi, P

    2011-09-01

    Provision of good-quality household drinking water is an important means of improving public health in rural communities, especially in Africa, and is the rationale behind protecting drinking water sources and promoting healthy practices at and around such sources. To examine the microbial content of drinking water from different types of drinking water sources in the Manonyane community of Lesotho. The community's hygienic practices around the water sources are also assessed to establish their contribution to water quality. Water samples from thirty-five water sources comprising 22 springs, 6 open wells, 6 boreholes and 1 open reservoir were assessed. Total coliform and Escherichia coli bacteria were analyzed in the water samples. Results of the tests were compared with the prescribed World Health Organization desirable limits. A household survey and field observations were conducted to assess the hygienic conditions and practices at and around the water sources. Total coliforms were detected in 97% and Escherichia coli in 71% of the water samples. The concentration levels of total coliforms and Escherichia coli were above the permissible limits of the World Health Organization drinking water quality guidelines in each case. Protected sources had significantly fewer colony-forming units (cfu) per 100 ml of water than unprotected sources (56% versus 95%, p < 0.05). Similarly, in terms of Escherichia coli, protected sources had lower counts (7% versus 40%, p < 0.05) than unprotected sources. Hygiene conditions and practices that appeared to contribute to increased total coliform and Escherichia coli counts included lack of protection of water sources from livestock faeces, laundry practices, and water sources being downslope of pit latrines in some cases. These findings suggest that source water protection and good hygiene practices can improve the quality of household drinking water where disinfection is not available. The results also suggest important lines of inquiry and provide support and input for environmental and public health programmes, particularly those related to water and sanitation.

  7. Multi-Decadal Change of Atmospheric Aerosols and Their Effect on Surface Radiation

    NASA Technical Reports Server (NTRS)

    Chin, Mian; Diehl, Thomas; Tan, Qian; Wild, Martin; Qian, Yun; Yu, Hongbin; Bian, Huisheng; Wang, Weiguo

    2012-01-01

    We present an investigation of multi-decadal changes of atmospheric aerosols and their effects on surface radiation using a global chemistry transport model along with near-term to long-term data records. We focus on a 28-year time period of the satellite era from 1980 to 2007, during which a suite of aerosol data from satellite observations, ground-based remote sensing, and in-situ measurements has become available. We analyze the long-term global and regional aerosol optical depth and concentration trends and their relationship to the changes of emissions, and assess the role aerosols play in the multi-decadal change of solar radiation reaching the surface (known as "dimming" or "brightening") in different regions of the world, including the major anthropogenic source regions (North America, Europe, Asia) that have been experiencing considerable changes of emissions, dust and biomass burning regions that have large interannual variabilities, downwind regions that are directly affected by the changes in the source area, and remote regions that are considered to represent "background" conditions.

  8. The aromatic amino acids biosynthetic pathway: A core platform for products

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lievense, J.C.; Frost, J.W.

    The aromatic amino acids biosynthetic pathway is conventionally viewed primarily as the source of the amino acids L-tyrosine, L-phenylalanine, and L-tryptophan. The authors have recognized the expanded role of the pathway as the major source of aromatic raw materials on earth. With the development of metabolic engineering approaches, it is now possible to biosynthesize a wide variety of aromatic compounds from inexpensive, clean, abundant, renewable sugars using fermentation methods. Examples of already and soon-to-be commercialized biosynthesis of such compounds are described. The long-term prospects are also assessed.

  9. Platinum-group elements in southern Africa: mineral inventory and an assessment of undiscovered mineral resources: Chapter Q in Global mineral resource assessment

    USGS Publications Warehouse

    Zientek, Michael L.; Causey, J. Douglas; Parks, Heather L.; Miller, Robert J.

    2014-01-01

    The large layered intrusions in southern Africa—the Bushveld Complex and the Great Dyke—are now and will continue to be a major source of the world’s supply of PGE. Mining will not deplete the identified mineral resources and reserves or potential undiscovered mineral resources for many decades; however, in the near-term, PGE supply could be affected by social, environmental, political, and economic factors.

  10. Communicating Science to Impact Learning? A Phenomenological Inquiry into 4th and 5th Graders' Perceptions of Science Information Sources

    NASA Astrophysics Data System (ADS)

    Gelmez Burakgazi, Sevinc; Yildirim, Ali; Weeth Feinstein, Noah

    2016-04-01

    Rooted in science education and science communication studies, this study examines 4th and 5th grade students' perceptions of science information sources (SIS) and their use in communicating science to students. It combines situated learning theory with uses and gratifications theory in a qualitative phenomenological analysis. Data were gathered through classroom observations and interviews in four Turkish elementary schools. Focus group interviews with 47 students and individual interviews with 17 teachers and 10 parents were conducted. Participants identified a wide range of SIS, including TV, magazines, newspapers, internet, peers, teachers, families, science centers/museums, science exhibitions, textbooks, science books, and science camps. Students reported using various SIS in school-based and non-school contexts to satisfy their cognitive, affective, personal, and social integrative needs. SIS were used for science courses, homework/project assignments, examination/test preparations, and individual science-related research. Students assessed SIS in terms of the perceived accessibility of the sources, the quality of the content, and the content presentation. In particular, some sources such as teachers, families, TV, science magazines, textbooks, and science centers/museums ("directive sources") predictably led students to other sources such as teachers, families, internet, and science books ("directed sources"). A small number of sources crossed context boundaries, being useful in both school and out. Results shed light on the connection between science education and science communication in terms of promoting science learning.

  11. A method for the development of disease-specific reference standards vocabularies from textual biomedical literature resources

    PubMed Central

    Wang, Liqin; Bray, Bruce E.; Shi, Jianlin; Fiol, Guilherme Del; Haug, Peter J.

    2017-01-01

    Objective Disease-specific vocabularies are fundamental to many knowledge-based intelligent systems and applications like text annotation, cohort selection, disease diagnostic modeling, and therapy recommendation. Reference standards are critical in the development and validation of automated methods for disease-specific vocabularies. The goal of the present study is to design and test a generalizable method for the development of vocabulary reference standards from expert-curated, disease-specific biomedical literature resources. Methods We formed disease-specific corpora from literature resources like textbooks, evidence-based synthesized online sources, clinical practice guidelines, and journal articles. Medical experts annotated and adjudicated disease-specific terms in four classes (i.e., causes or risk factors, signs or symptoms, diagnostic tests or results, and treatment). Annotations were mapped to UMLS concepts. We assessed source variation, the contribution of each source to building disease-specific vocabularies, the saturation of the vocabularies with respect to the number of sources used, and the generalizability of the method with different diseases. Results The study resulted in 2588 string-unique annotations for heart failure in four classes, and 193 and 425 annotations, respectively, for pulmonary embolism and rheumatoid arthritis in the treatment class. Approximately 80% of the annotations were mapped to UMLS concepts. The agreement among heart failure sources ranged between 0.28 and 0.46. The contribution of these sources to the final vocabulary ranged between 18% and 49%. With the sources explored, the heart failure vocabulary reached near saturation in all four classes with the inclusion of a minimum of six sources (or between four and seven sources if counting only terms that occurred in two or more sources). It took fewer sources to reach near saturation for the other two diseases in terms of the treatment class. Conclusions We developed a method for the development of disease-specific reference vocabularies. Expert-curated biomedical literature resources are a substantial resource for acquiring disease-specific medical knowledge. It is feasible to reach near saturation in a disease-specific vocabulary using a relatively small number of literature sources. PMID:26971304

  12. Are we missing the boat? Current uses of long-term biological monitoring data in the evaluation and management of marine protected areas.

    PubMed

    Addison, P F E; Flander, L B; Cook, C N

    2015-02-01

    Protected area management agencies are increasingly using management effectiveness evaluation (MEE) to better understand, learn from and improve conservation efforts around the globe. Outcome assessment is the final stage of MEE, where conservation outcomes are measured to determine whether management objectives are being achieved. When quantitative monitoring data are available, best-practice examples of outcome assessments demonstrate that data should be assessed against quantitative condition categories. Such assessments enable more transparent and repeatable integration of monitoring data into MEE, which can promote evidence-based management and improve public accountability and reporting. We interviewed key informants from marine protected area (MPA) management agencies to investigate how scientific data sources, especially long-term biological monitoring data, are currently informing conservation management. Our study revealed that even when long-term monitoring results are available, management agencies are not using them for quantitative condition assessment in MEE. Instead, many agencies conduct qualitative condition assessments, where monitoring results are interpreted using expert judgment only. Whilst we found substantial evidence for the use of long-term monitoring data in the evidence-based management of MPAs, MEE is rarely the sole mechanism that facilitates the knowledge transfer of scientific evidence to management action. This suggests that the first goal of MEE (to enable environmental accountability and reporting) is being achieved, but the second and arguably more important goal of facilitating evidence-based management is not. Given that many MEE approaches are in their infancy, recommendations are made to assist management agencies realize the full potential of long-term quantitative monitoring data for protected area evaluation and evidence-based management. Copyright © 2014 Elsevier Ltd. All rights reserved.

  13. FY2004 SYSTEM ENGINEER PROGRAM MANAGER ANNUAL REPORT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    JACKSON, G.J.

    2004-10-29

    During FY 2004, reviews of the FH System Engineer (SE) Program were conducted by the Independent Assessment (IA) Group. The results of these reviews are summarized as a part of this document. Additional reviews were performed by FH Engineering personnel. SE Engineering reviews performed include Periodic Walkdowns (typically quarterly) by the SEs, a review of System Notebooks by the System Engineer Program Manager (SEPM), an annual status report by each SE, and an annual status report by each of the Project Chief Engineers (PCEs). FY 2004 marked the completion of the first round of Vital Safety System assessments. Each of the VSSs on the FH VSS list has been evaluated at least once, either by the FH Independent Assessment organization or as part of a DOE Phase II assessment. Following the completion of the K-Basins Assessment in May 2004, a review of the VSS assessment process was completed. Criteria were developed by FH, and concurred with by RL, to determine the frequency and priority of future VSS assessments. Additional actions have been taken to increase the visibility and emphasis assigned to VSSs. Completion of several Documented Safety Analyses (DSA), in combination with efforts to remove source term materials from several facilities, enabled the number of systems on the FH VSS list to be reduced from 60 at the beginning of FY 2004 to 48 by the end of FY 2004. It is expected that there will be further changes to the FH VSS list based on additional DSA revisions and continued progress towards reduction of source terms across the Hanford Site. Other new VSSs may be added to the list to reflect the relocation of materials away from the River Corridor to interim storage locations on the Central Plateau.

  14. Assessment of source-specific health effects associated with an unknown number of major sources of multiple air pollutants: a unified Bayesian approach.

    PubMed

    Park, Eun Sug; Hopke, Philip K; Oh, Man-Suk; Symanski, Elaine; Han, Daikwon; Spiegelman, Clifford H

    2014-07-01

    There has been increasing interest in assessing health effects associated with multiple air pollutants emitted by specific sources. A major difficulty with achieving this goal is that the pollution source profiles are unknown and source-specific exposures cannot be measured directly; rather, they need to be estimated by decomposing ambient measurements of multiple air pollutants. This estimation process, called multivariate receptor modeling, is challenging because of the unknown number of sources and unknown identifiability conditions (model uncertainty). The uncertainty in source-specific exposures (source contributions) as well as the uncertainty in the number of major pollution sources and identifiability conditions have been largely ignored in previous studies. A multipollutant approach that can deal with model uncertainty in multivariate receptor models, while simultaneously accounting for parameter uncertainty in estimated source-specific exposures in the assessment of source-specific health effects, is presented in this paper. The methods are applied to daily ambient air measurements of the chemical composition of fine particulate matter (PM2.5), weather data, and counts of cardiovascular deaths from 1995 to 1997 for Phoenix, AZ, USA. Our approach for evaluating source-specific health effects yields not only estimates of source contributions along with their uncertainties and associated health effects estimates, but also estimates of model uncertainty (posterior model probabilities) that have been ignored in previous studies. The results from our methods agreed in general with those from the previously conducted workshop/studies on the source apportionment of PM health effects in terms of the number of major contributing sources, estimated source profiles, and contributions. However, some of the adverse source-specific health effects identified in the previous studies were not statistically significant in our analysis, probably because we incorporated into the estimation of the health effects parameters the uncertainty in estimated source contributions that had been ignored in the previous studies. © The Author 2014. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
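
    At its core, multivariate receptor modeling decomposes a matrix of ambient concentrations (days by chemical species) into source contributions and source profiles. As a much-simplified stand-in for the paper's Bayesian treatment, the sketch below factors a synthetic data matrix with a plain nonnegative matrix factorization; the fixed number of sources, the synthetic profiles, and the use of scikit-learn's NMF are all assumptions, and none of the paper's handling of model uncertainty or identifiability is reproduced.

      # Simplified sketch: X (days x species) ~ G (days x sources) @ F (sources x species),
      # a stand-in for multivariate receptor modeling, not the paper's Bayesian method.
      import numpy as np
      from sklearn.decomposition import NMF

      rng = np.random.default_rng(2)
      n_days, n_species, n_sources = 300, 12, 3
      G_true = rng.gamma(2.0, 1.0, size=(n_days, n_sources))       # daily source contributions
      F_true = rng.dirichlet(np.ones(n_species), size=n_sources)   # source chemical profiles
      X = G_true @ F_true + rng.normal(0.0, 0.01, size=(n_days, n_species)).clip(min=0.0)

      model = NMF(n_components=n_sources, init="nndsvda", max_iter=1000, random_state=0)
      G_est = model.fit_transform(X)      # estimated source contributions
      F_est = model.components_           # estimated source profiles
      print("reconstruction error:", model.reconstruction_err_)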

  15. Maternal recall of breastfeeding duration twenty years after delivery.

    PubMed

    Natland, Siv Tone; Andersen, Lene Frost; Nilsen, Tom Ivar Lund; Forsmo, Siri; Jacobsen, Geir W

    2012-11-23

    Studies on the health benefits from breastfeeding often rely on maternal recall of breastfeeding. Although short-term maternal recall has been found to be quite accurate, less is known about long-term accuracy. The objective of this study was to assess the accuracy of long-term maternal recall of breastfeeding duration. In a prospective study of pregnancy and birth outcome, detailed information on breastfeeding during the child's first year of life was collected from a cohort of Norwegian women who gave birth in 1986-88. Among 374 of the participants, data on breastfeeding initiation and duration were compared to recalled data obtained from mailed questionnaires some 20 years later. Intraclass correlation coefficient (ICC), Bland-Altman plot, and Kappa statistics were used to assess the agreement between the two sources of data. Logistic regression was used to assess predictors of misreporting breastfeeding duration by more than one month. Recorded and recalled breastfeeding duration were strongly correlated (ICC=0.82, p < 0.001). Nearly two thirds of women recalled their breastfeeding to within one month. Recall data showed a modest median overestimation of about 2 weeks. There were no apparent systematic discrepancies between the two sources of information, but recall error was predicted by the age when infants were introduced to another kind of milk. Across categories of breastfeeding, the overall weighted Kappa statistic showed an almost perfect agreement (κ = 0.85, 95% confidence interval [CI] 0.82 - 0.88). Breastfeeding duration was recalled quite accurately 20 years after mothers gave birth in a population where breastfeeding is common and its duration long.
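
    Two of the agreement measures named above can be sketched on synthetic data: Bland-Altman limits of agreement on the paired durations, and a weighted kappa after binning durations into categories. The duration categories, the synthetic recall error, and the use of scikit-learn's cohen_kappa_score are assumptions for illustration; the paper's ICC is not recomputed here.

      # Sketch of Bland-Altman limits and a weighted kappa on synthetic recall data;
      # not the study's actual dataset or category definitions.
      import numpy as np
      from sklearn.metrics import cohen_kappa_score

      rng = np.random.default_rng(3)
      recorded = rng.uniform(0.0, 12.0, size=374)               # months, synthetic "records"
      recalled = recorded + rng.normal(0.5, 1.0, size=374)      # recall with a slight overestimate

      diff = recalled - recorded
      print("Bland-Altman: mean diff = %.2f months, limits of agreement = %.2f to %.2f"
            % (diff.mean(), diff.mean() - 1.96 * diff.std(), diff.mean() + 1.96 * diff.std()))

      bins = [0.0, 3.0, 6.0, 9.0, 13.0]                         # assumed duration categories (months)
      kappa = cohen_kappa_score(np.digitize(recorded, bins),
                                np.digitize(recalled, bins), weights="quadratic")
      print("weighted kappa on duration categories: %.2f" % kappa)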

  16. Assessment of YouTube videos as a source of information on medication use in pregnancy.

    PubMed

    Hansen, Craig; Interrante, Julia D; Ailes, Elizabeth C; Frey, Meghan T; Broussard, Cheryl S; Godoshian, Valerie J; Lewis, Courtney; Polen, Kara N D; Garcia, Amanda P; Gilboa, Suzanne M

    2016-01-01

    When making decisions about medication use in pregnancy, women consult many information sources, including the Internet. The aim of this study was to assess the content of publicly accessible YouTube videos that discuss medication use in pregnancy. Using 2023 distinct combinations of search terms related to medications and pregnancy, we extracted metadata from YouTube videos using a YouTube video Application Programming Interface. Relevant videos were defined as those with a medication search term and a pregnancy-related search term in either the video title or description. We viewed relevant videos and abstracted content from each video into a database. We documented whether videos implied each medication to be "safe" or "unsafe" in pregnancy and compared that assessment with the medication's Teratogen Information System (TERIS) rating. After viewing 651 videos, 314 videos with information about medication use in pregnancy were available for the final analyses. The majority of videos were from law firms (67%), television segments (10%), or physicians (8%). Selective serotonin reuptake inhibitors (SSRIs) were the most common medication class named (225 videos, 72%), and 88% of videos about SSRIs indicated that they were unsafe for use in pregnancy. However, the TERIS ratings for medication products in this class range from "unlikely" to "minimal" teratogenic risk. For the majority of medications, current YouTube video content does not adequately reflect what is known about the safety of their use in pregnancy and should be interpreted cautiously. However, YouTube could serve as a platform for communicating evidence-based medication safety information. Copyright © 2015 John Wiley & Sons, Ltd.

  17. Medicaid at Forty

    PubMed Central

    Rowland, Diane

    2005-01-01

    This article examines Medicaid's evolution over the last four decades in its role as a health insurer for low-income families, a source of health and long-term care (LTC) coverage for people with disabilities, and as the supplement to Medicare for low-income aged and disabled Medicare beneficiaries. Medicaid's role and impact on each of these beneficiary groups is assessed. PMID:17290638

  18. 78 FR 48156 - Update to An Inventory of Sources and Environmental Releases of Dioxin-Like Compounds in the...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-07

    ... Environmental Assessment (NCEA) within EPA's Office of Research and Development. In November 2006, EPA released... classified as preliminary and not included in the quantitative inventory. The updated inventory lists the top... 2000. The quantitative results are expressed in terms of the toxicity equivalent (TEQ) of the mixture...

  19. Long-term assessment of runoff and sediment transport from grass and agroforestry buffers in corn/soybean watersheds using APEX

    USDA-ARS's Scientific Manuscript database

    Existence of a claypan layer in soils at depths ranging from 4 to 37 cm restricts water movement and has contributed significantly to high rates of runoff, sediment transport, and other non-point source loadings from croplands in watersheds. The deposition of these pollutants in rivers, streams and...

  20. Accuracy-preserving source term quadrature for third-order edge-based discretization

    NASA Astrophysics Data System (ADS)

    Nishikawa, Hiroaki; Liu, Yi

    2017-09-01

    In this paper, we derive a family of source term quadrature formulas for preserving third-order accuracy of the node-centered edge-based discretization for conservation laws with source terms on arbitrary simplex grids. A three-parameter family of source term quadrature formulas is derived, and as a subset, a one-parameter family of economical formulas is identified that does not require second derivatives of the source term. Among the economical formulas, a unique formula is then derived that does not require gradients of the source term at neighbor nodes, thus leading to a significantly smaller discretization stencil for source terms. All the formulas derived in this paper do not require a boundary closure, and therefore can be directly applied at boundary nodes. Numerical results are presented to demonstrate third-order accuracy at interior and boundary nodes for one-dimensional grids and linear triangular/tetrahedral grids over straight and curved geometries.
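
    As a schematic illustration only, and not the specific three-parameter family derived in the paper, a parameterized node-centered source-term quadrature of the kind described above can be written (in LaTeX notation) as

      \[
        \int_{\Omega_i} s \, d\Omega \;\approx\; V_i \Big[\, \alpha\, s_i
          \;+\; \frac{1-\alpha}{|\mathcal{N}_i|} \sum_{k \in \mathcal{N}_i} s_k
          \;+\; \beta\, h_i^2 \, (\nabla^2 s)_i \,\Big],
      \]

    where $V_i$ is the dual volume at node $i$, $\mathcal{N}_i$ its edge-connected neighbors, $h_i$ a local mesh length, and $(\alpha,\beta)$ free parameters. In this schematic picture, the "economical" members of such a family are those that avoid second derivatives of the source, and a smaller-stencil member further avoids source information at the neighbor nodes; the actual formulas and their third-order accuracy conditions are given in the paper.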

  1. INEEL Subregional Conceptual Model Report Volume 3: Summary of Existing Knowledge of Natural and Anthropogenic Influences on the Release of Contaminants to the Subsurface Environment from Waste Source Terms at the INEEL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paul L. Wichlacz

    2003-09-01

    This source-term summary document is intended to describe the current understanding of contaminant source terms and the conceptual model for potential source-term release to the environment at the Idaho National Engineering and Environmental Laboratory (INEEL), as presented in published INEEL reports. The document presents a generalized conceptual model of the sources of contamination and describes the general categories of source terms, primary waste forms, and factors that affect the release of contaminants from the waste form into the vadose zone and Snake River Plain Aquifer. Where the information has previously been published and is readily available, summaries of the inventory of contaminants are also included. Uncertainties that affect the estimation of the source term release are also discussed where they have been identified by the Source Term Technical Advisory Group. Areas in which additional information is needed (i.e., research needs) are also identified.

  2. Personality disorder assessment: the challenge of construct validity.

    PubMed

    Clark, L A; Livesley, W J; Morey, L

    1997-01-01

    We begin with a review of the data that challenge the current categorical system for classifying personality disorder, focusing on the central assessment issues of convergent and discriminant validity. These data indicate that while there is room for improvement in assessment, even greater change is needed in conceptualization than in instrumentation. Accordingly, we then refocus the categorical-dimensional debate in assessment terms, and place it in the broader context of such issues as the hierarchical structure of personality, overlap and distinctions between normal and abnormal personality, sources of information in personality disorder assessment, and overlap and discrimination of trait and state assessment. We conclude that more complex conceptual models that can incorporate both biological and environmental influences on the development of adaptive and maladaptive personality are needed.

  3. Changing sources and environmental factors reduce the rates of decline of organochlorine pesticides in the Arctic Atmosphere

    NASA Astrophysics Data System (ADS)

    Becker, S.; Halsall, C. J.; Tych, W.; Kallenborn, R.; Schlabach, M.; Manø, S.

    2009-01-01

    An extensive database of organochlorine (OC) pesticide concentrations measured at the Norwegian Arctic Monitoring Station was analysed to assess longer-term trends in the Arctic atmosphere. Dynamic Harmonic Regression (DHR) is employed to investigate the seasonal and cyclical behaviour of chlordanes, DDTs and hexachlorobenzene (HCB), and to isolate underlying inter-annual trends. Although a simple comparison of annual mean concentrations (1994-2005) suggests a decline for all of the OCs investigated, the longer-term trends identified by DHR only show a significant decline for p,p'-DDT. Indeed, HCB shows an increase from 2003 to 2005. This is thought to be due to changes in source types and the presence of impurities in current-use pesticides, together with retreating sea ice affecting air-water exchange. Changes in source types were revealed by using isomeric ratios for the chlordanes and DDTs. Declining trends in ratios of trans-chlordane/cis-chlordane (TC/CC) indicate a shift from primary sources to more "weathered" secondary sources, whereas an increasing trend in o,p'-DDT/p,p'-DDT ratios indicates a shift from use of technical DDT to dicofol. Continued monitoring of these OC pesticides is required to fully understand the influence of a changing climate on the behaviour and environmental cycling of these chemicals in the Arctic as well as possible impacts from "new" sources.

  4. Changing sources and environmental factors reduce the rates of decline of organochlorine pesticides in the Arctic atmosphere

    NASA Astrophysics Data System (ADS)

    Becker, S.; Halsall, C. J.; Tych, W.; Kallenborn, R.; Schlabach, M.; Manø, S.

    2012-05-01

    An extensive database of organochlorine (OC) pesticide concentrations measured at the Norwegian Arctic monitoring station at Ny-Ålesund, Svalbard, was analysed to assess longer-term trends in the Arctic atmosphere. Dynamic Harmonic Regression (DHR) is employed to investigate the seasonal and cyclical behaviour of chlordanes, DDTs and hexachlorobenzene (HCB), and to isolate underlying inter-annual trends. Although a simple comparison of annual mean concentrations (1994-2005) suggests a decline for all of the OCs investigated, the longer-term trends identified by DHR only show a significant decline for p,p'-DDT. Indeed, HCB shows an increase from 2003 to 2005. This is thought to be due to changes in source types and the presence of impurities in current-use pesticides, together with retreating sea ice affecting air-water exchange. Changes in source types were revealed by using isomeric ratios for the chlordanes and DDTs. Declining trends in ratios of trans-chlordane/cis-chlordane (TC/CC) indicate a shift from primary sources to more "weathered" secondary sources, whereas an increasing trend in o,p'-DDT/p,p'-DDT ratios indicates a shift from use of technical DDT to dicofol. Continued monitoring of these OC pesticides is required to fully understand the influence of a changing climate on the behaviour and environmental cycling of these chemicals in the Arctic as well as possible impacts from "new" sources.
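
    The basic decomposition behind this analysis, separating a slow inter-annual trend from an annual cycle in a monthly concentration series, can be illustrated with an ordinary fixed-parameter harmonic regression. The sketch below is a simplified stand-in for Dynamic Harmonic Regression, in which the trend and harmonic components are allowed to vary over time; the synthetic series and the single annual harmonic are assumptions for illustration.

      # Simplified stand-in for DHR: fixed-parameter harmonic regression separating
      # a slow trend from an annual cycle in a synthetic monthly series.
      import numpy as np

      rng = np.random.default_rng(4)
      months = np.arange(144, dtype=float)                         # 12 years of monthly data
      series = (np.exp(-0.005 * months)                            # slow decline
                + 0.3 * np.sin(2 * np.pi * months / 12.0)          # seasonal cycle
                + rng.normal(0.0, 0.05, months.size))

      # Design matrix: intercept, linear trend, annual sine/cosine pair.
      X = np.column_stack([np.ones_like(months), months,
                           np.sin(2 * np.pi * months / 12.0),
                           np.cos(2 * np.pi * months / 12.0)])
      coef, *_ = np.linalg.lstsq(X, series, rcond=None)
      print("fitted linear trend per month:", coef[1])
      print("fitted seasonal amplitude:", np.hypot(coef[2], coef[3]))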

  5. Upper and lower bounds of ground-motion variabilities: implication for source properties

    NASA Astrophysics Data System (ADS)

    Cotton, Fabrice; Reddy-Kotha, Sreeram; Bora, Sanjay; Bindi, Dino

    2017-04-01

    One of the key challenges of seismology is to be able to analyse the physical factors that control earthquakes and ground-motion variabilities. Such analysis is particularly important to calibrate physics-based simulations and seismic hazard estimations at high frequencies. Within the framework of ground-motion prediction equation (GMPE) development, ground-motion residuals (differences between recorded ground motions and the values predicted by a GMPE) are computed. The exponential growth of seismological near-source records and modern GMPE analysis techniques allow these residuals to be partitioned into between-event and within-event components. In particular, the between-event term quantifies all those repeatable source effects (e.g. related to stress-drop or kappa-source variability) which have not been accounted for by the magnitude-dependent term of the model. In this presentation, we first discuss the between-event variabilities computed both in the Fourier and response-spectra domains, using recent high-quality global accelerometric datasets (e.g. NGA-West2, RESORCE, KiK-net). These analyses lead to the assessment of upper bounds for the ground-motion variability. Then, we compare these upper bounds with lower bounds estimated by analysing seismic sequences which occurred on specific fault systems (e.g., located in central Italy or in Japan). We show that the lower bounds of between-event variabilities are surprisingly large, which indicates a large variability of earthquake dynamic properties even within the same fault system. Finally, these upper and lower bounds of ground-shaking variability are discussed in terms of variability of earthquake physical properties (e.g., stress-drop and kappa-source).

  6. Non-Point Source Pollutant Load Variation in Rapid Urbanization Areas by Remote Sensing, GIS and the L-THIA Model: A Case in Bao'an District, Shenzhen, China.

    PubMed

    Li, Tianhong; Bai, Fengjiao; Han, Peng; Zhang, Yuanyan

    2016-11-01

    Urban sprawl is a major driving force that alters local and regional hydrology and increases non-point source pollution. Using the Bao'an District in Shenzhen, China, a typical rapid urbanization area, as the study area and land-use change maps from 1988 to 2014 that were obtained by remote sensing, the contributions of different land-use types to NPS pollutant production were assessed with a localized long-term hydrologic impact assessment (L-THIA) model. The results show that the non-point source pollution load changed significantly both in terms of magnitude and spatial distribution. The loads of chemical oxygen demand, total suspended substances, total nitrogen and total phosphorus were affected by the interactions between event mean concentration and the magnitude of changes in land-use acreages and the spatial distribution. From 1988 to 2014, the loads of chemical oxygen demand, suspended substances and total phosphorus showed clearly increasing trends with rates of 132.48 %, 32.52 % and 38.76 %, respectively, while the load of total nitrogen decreased by 71.52 %. The immigrant population ratio was selected as an indicator to represent the level of rapid urbanization and industrialization in the study area, and a comparison analysis of the indicator with the four non-point source loads demonstrated that the chemical oxygen demand, total phosphorus and total nitrogen loads are linearly related to the immigrant population ratio. The results provide useful information for environmental improvement and city management in the study area.

  7. Non-Point Source Pollutant Load Variation in Rapid Urbanization Areas by Remote Sensing, GIS and the L-THIA Model: A Case in Bao'an District, Shenzhen, China

    NASA Astrophysics Data System (ADS)

    Li, Tianhong; Bai, Fengjiao; Han, Peng; Zhang, Yuanyan

    2016-11-01

    Urban sprawl is a major driving force that alters local and regional hydrology and increases non-point source pollution. Using the Bao'an District in Shenzhen, China, a typical rapid urbanization area, as the study area and land-use change maps from 1988 to 2014 that were obtained by remote sensing, the contributions of different land-use types to NPS pollutant production were assessed with a localized long-term hydrologic impact assessment (L-THIA) model. The results show that the non-point source pollution load changed significantly both in terms of magnitude and spatial distribution. The loads of chemical oxygen demand, total suspended substances, total nitrogen and total phosphorus were affected by the interactions between event mean concentration and the magnitude of changes in land-use acreages and the spatial distribution. From 1988 to 2014, the loads of chemical oxygen demand, suspended substances and total phosphorus showed clearly increasing trends with rates of 132.48 %, 32.52 % and 38.76 %, respectively, while the load of total nitrogen decreased by 71.52 %. The immigrant population ratio was selected as an indicator to represent the level of rapid urbanization and industrialization in the study area, and a comparison analysis of the indicator with the four non-point source loads demonstrated that the chemical oxygen demand, total phosphorus and total nitrogen loads are linearly related to the immigrant population ratio. The results provide useful information for environmental improvement and city management in the study area.

  8. Assessment of vaccine testing at three laboratories using the guinea pig model of tuberculosis.

    PubMed

    Grover, Ajay; Troudt, Jolynn; Arnett, Kimberly; Izzo, Linda; Lucas, Megan; Strain, Katie; McFarland, Christine; Hall, Yper; McMurray, David; Williams, Ann; Dobos, Karen; Izzo, Angelo

    2012-01-01

    The guinea pig model of tuberculosis is used extensively in different locations to assess the efficacy of novel tuberculosis vaccines during pre-clinical development. Two key assays are used to measure protection against virulent challenge: a 30 day post-infection assessment of mycobacterial burden and long-term post-infection survival and pathology analysis. To determine the consistency and robustness of the guinea pig model for testing vaccines, a comparative assessment between three sites that are currently involved in testing tuberculosis vaccines from external providers was performed. Each site was asked to test two "subunit" type vaccines in their routine animal model as if testing vaccines from a provider. All sites performed a 30 day study, and one site also performed a long-term survival/pathology study. Despite some differences in experimental approach between the sites, such as the origin of the Mycobacterium tuberculosis strain and the type of aerosol exposure device used to infect the animals and the source of the guinea pigs, the data obtained between sites were consistent in regard to the ability of each "vaccine" tested to reduce the mycobacterial burden. The observations also showed that there was good concurrence between the results of short-term and long-term studies. This validation exercise means that efficacy data can be compared between sites. Copyright © 2011 Elsevier Ltd. All rights reserved.

  9. Vulnerability of Forests in India: A National Scale Assessment.

    PubMed

    Sharma, Jagmohan; Upgupta, Sujata; Jayaraman, Mathangi; Chaturvedi, Rajiv Kumar; Bala, Govindswamy; Ravindranath, N H

    2017-09-01

    Forests are subjected to stress from climatic and non-climatic sources. In this study, we have reported the results of inherent, as well as climate change driven, vulnerability assessments for Indian forests. To assess inherent vulnerability of forests under current climate, we have used four indicators, namely biological richness, disturbance index, canopy cover, and slope. The assessment is presented as a spatial profile of inherent vulnerability in low, medium, high and very high vulnerability classes. Forty percent of forest grid points in India show high or very high inherent vulnerability. Plantation forests show higher inherent vulnerability than natural forests. We assess the climate change driven vulnerability by combining the results of the inherent vulnerability assessment with the climate change impact projections simulated by the Integrated Biosphere Simulator dynamic global vegetation model. While 46% of forest grid points show high, very high, or extremely high vulnerability under future climate in the short term (2030s) under both representative concentration pathways 4.5 and 8.5, the corresponding shares are 49% and 54%, respectively, in the long term (2080s). Generally, forests in the higher rainfall zones show lower vulnerability as compared to drier forests under future climate. Minimizing anthropogenic disturbance and conserving biodiversity can potentially reduce forest vulnerability under climate change. For disturbed forests and plantations, adaptive management aimed at forest restoration is necessary to build long-term resilience.

  10. Vulnerability of Forests in India: A National Scale Assessment

    NASA Astrophysics Data System (ADS)

    Sharma, Jagmohan; Upgupta, Sujata; Jayaraman, Mathangi; Chaturvedi, Rajiv Kumar; Bala, Govindswamy; Ravindranath, N. H.

    2017-09-01

    Forests are subjected to stress from climatic and non-climatic sources. In this study, we have reported the results of inherent, as well as climate change driven, vulnerability assessments for Indian forests. To assess the inherent vulnerability of forests under the current climate, we have used four indicators, namely biological richness, disturbance index, canopy cover, and slope. The assessment is presented as a spatial profile of inherent vulnerability in low, medium, high and very high vulnerability classes. Forty percent of forest grid points in India show high or very high inherent vulnerability. Plantation forests show higher inherent vulnerability than natural forests. We assess the climate change driven vulnerability by combining the results of the inherent vulnerability assessment with the climate change impact projections simulated by the Integrated Biosphere Simulator dynamic global vegetation model. While 46% of forest grid points show high, very high, or extremely high vulnerability under future climate in the short term (2030s) under both representative concentration pathways 4.5 and 8.5, such grid points are 49 and 54%, respectively, in the long term (2080s). Generally, forests in the higher rainfall zones show lower vulnerability as compared to drier forests under future climate. Minimizing anthropogenic disturbance and conserving biodiversity can potentially reduce forest vulnerability under climate change. For disturbed forests and plantations, adaptive management aimed at forest restoration is necessary to build long-term resilience.

  11. Preliminary results of the U.S. Nuclear Regulatory Commission collaborative research program to assess tsunami hazard for nuclear power plants on the Atlantic and gulf coasts

    USGS Publications Warehouse

    Kammerer, A.M.; ten Brink, Uri S.; Twitchell, David C.; Geist, Eric L.; Chaytor, Jason D.; Locat, J.; Lee, H.J.; Buczkowski, Brian J.; Sansoucy, M.

    2008-01-01

    In response to the 2004 Indian Ocean Tsunami, the United States Nuclear Regulatory Commission (US NRC) initiated a long-term research program to improve understanding of tsunami hazard levels for nuclear facilities in the United States. For this effort, the US NRC organized a collaborative research program with the United States Geological Survey (USGS) and other key researchers for the purpose of assessing tsunami hazard on the Atlantic and Gulf Coasts of the United States. The initial phase of this work consisted principally of collection, interpretation, and analysis of available offshore data and information. Necessarily, the US NRC research program includes both seismic- and landslide-based tsunamigenic sources in both the near and the far fields. The inclusion of tsunamigenic landslides, an important category of sources that impact tsunami hazard levels for the Atlantic and Gulf Coasts over the long time periods of interest to the US NRC, is a key difference between this program and most other tsunami hazard assessment programs. Although only a few years old, this program is already producing results that both support current US NRC activities and look toward the long-term goal of probabilistic tsunami hazard assessment. This paper provides a summary of results from several areas of current research. An overview of the broader US NRC research program is provided in a companion paper in this conference.

  12. Respiratory source control using a surgical mask: An in vitro study

    PubMed Central

    Patel, Rajeev B.; Skaria, Shaji D.; Mansour, Mohamed M.; Smaldone, Gerald C.

    2016-01-01

    Cough etiquette and respiratory hygiene are forms of source control encouraged to prevent the spread of respiratory infection. The use of surgical masks as a means of source control has not been quantified in terms of reducing exposure to others. We designed an in vitro model using various facepieces to assess their contribution to exposure reduction when worn at the infectious source (Source) relative to facepieces worn for primary (Receiver) protection, and the factors that contribute to each. In a chamber with various airflows, radiolabeled aerosols were exhaled via a ventilated soft-face manikin head using tidal breathing and cough (Source). Another manikin, containing a filter, quantified recipient exposure (Receiver). The natural fit surgical mask, fitted (SecureFit) surgical mask and an N95-class filtering facepiece respirator (commonly known as an “N95 respirator”) with and without a Vaseline-seal were tested. With cough, source control (mask or respirator on Source) was statistically superior to mask or unsealed respirator protection on the Receiver (Receiver protection) in all environments. To equal source control during coughing, the N95 respirator must be Vaseline-sealed. During tidal breathing, source control was comparable or superior to mask or respirator protection on the Receiver. Source control via surgical masks may be an important adjunct defense against the spread of respiratory infections. The fit of the mask or respirator, in combination with the airflow patterns in a given setting, is a significant contributor to source control efficacy. Future clinical trials should include a surgical mask source control arm to assess the contribution of source control in overall protection against airborne infection. PMID:26225807

  13. Health information needs and preferences in relation to survivorship care plans of long-term cancer survivors in the American Cancer Society's Study of Cancer Survivors-I.

    PubMed

    Playdon, Mary; Ferrucci, Leah M; McCorkle, Ruth; Stein, Kevin D; Cannady, Rachel; Sanft, Tara; Cartmel, Brenda

    2016-08-01

    Survivorship care plans (SCPs) provide cancer patients and health care providers with a treatment summary and outline of recommended medical follow-up. Few studies have investigated the information needs and preferred sources among long-term cancer survivors. Cancer survivors of the ten most common cancers enrolled in the longitudinal Study of Cancer Survivors-I (SCS-I) completed a survey 9 years post-diagnosis (n = 3138); at time of diagnosis of the SCS-I cohort, SCPs were not considered usual care. We assessed participants' current desire and preferred sources for information across ten SCP items and evaluated factors associated with information need 9 years after diagnosis. The proportion of long-term cancer survivors endorsing a need for cancer and health information 9 years post-diagnosis ranged from 43 % (cancer screening) to 9 % (consequences of cancer on ability to work). Print media and personalized reading materials were the most preferred information sources. Younger age, higher education, race other than non-Hispanic white, later cancer stage, having breast cancer, having ≥2 comorbidities, and self-reporting poor health were associated with greater informational need (p < 0.05). Long-term cancer survivors continue to report health information needs for most SCP items and would prefer a print format; however, level of need differs by socio-demographic and cancer characteristics. Cancer survivors who did not previously receive a SCP may still benefit from receiving SCP content, and strategies for enabling dissemination to long-term survivors warrant further investigation.

  14. Considering potential seismic sources in earthquake hazard assessment for Northern Iran

    NASA Astrophysics Data System (ADS)

    Abdollahzadeh, Gholamreza; Sazjini, Mohammad; Shahaky, Mohsen; Tajrishi, Fatemeh Zahedi; Khanmohammadi, Leila

    2014-07-01

    Located on the Alpine-Himalayan earthquake belt, Iran is one of the seismically active regions of the world. Northern Iran, south of the Caspian Basin, a hazardous subduction zone, is a densely populated and developing area of the country. Historical and instrumental documented seismicity indicates the occurrence of severe earthquakes leading to many deaths and large losses in the region. With the growth of seismological and tectonic data, updated seismic hazard assessment is a worthwhile issue in emergency management programs and long-term development plans in urban and rural areas of this region. In the present study, probabilistic seismic hazard assessment is carried out for the region using three recent ground motion prediction equations, drawing on the up-to-date information required for such an assessment: geological data and the active tectonic setting for a thorough investigation of the active and potential seismogenic sources, and historical and instrumental events for compiling the earthquake catalogue. The logic tree method is utilized to capture the epistemic uncertainty of the seismic hazard assessment in the delineation of the seismic sources and the selection of attenuation relations. The results are compared with recent practice in the code-prescribed seismic hazard of the region and are discussed in detail to explore their variation in each branch of the logic tree approach. Also, seismic hazard maps of peak ground acceleration at rock sites for 475- and 2,475-year return periods are provided for the region.
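
    The epistemic-uncertainty treatment described here boils down to weighting alternative hazard curves from the candidate ground motion prediction equations by their logic-tree branch weights. A minimal sketch of that combination step is given below; the hazard curves, weights, and interpolation are synthetic illustrations, not results for Northern Iran.

```python
import numpy as np

# Hazard curves (annual exceedance rate vs. PGA) from three hypothetical
# GMPE branches are combined with logic-tree weights, then the PGA for a
# 475-year return period is interpolated. All values are synthetic.

pga = np.array([0.05, 0.1, 0.2, 0.3, 0.5, 0.8])            # g
curves = {
    "gmpe_a": np.array([2e-2, 8e-3, 2e-3, 8e-4, 2e-4, 4e-5]),
    "gmpe_b": np.array([3e-2, 1e-2, 3e-3, 1e-3, 3e-4, 6e-5]),
    "gmpe_c": np.array([1.5e-2, 6e-3, 1.5e-3, 6e-4, 1.5e-4, 3e-5]),
}
weights = {"gmpe_a": 0.4, "gmpe_b": 0.3, "gmpe_c": 0.3}    # branch weights sum to 1

mean_curve = sum(w * curves[k] for k, w in weights.items())

target_rate = 1.0 / 475.0                                  # ~10% in 50 years
# interpolate in log-log space to find the PGA at the target exceedance rate
pga_475 = np.exp(np.interp(np.log(target_rate),
                           np.log(mean_curve[::-1]), np.log(pga[::-1])))
print(f"PGA (475-yr return period): {pga_475:.3f} g")
```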

  15. Automated source term and wind parameter estimation for atmospheric transport and dispersion applications

    NASA Astrophysics Data System (ADS)

    Bieringer, Paul E.; Rodriguez, Luna M.; Vandenberghe, Francois; Hurst, Jonathan G.; Bieberbach, George; Sykes, Ian; Hannan, John R.; Zaragoza, Jake; Fry, Richard N.

    2015-12-01

    Accurate simulations of the atmospheric transport and dispersion (AT&D) of hazardous airborne materials rely heavily on the source term parameters necessary to characterize the initial release and the meteorological conditions that drive the downwind dispersion. In many cases the source parameters are not known and are consequently based on rudimentary assumptions. This is particularly true of accidental releases and the intentional releases associated with terrorist incidents. When available, meteorological observations are often not representative of the conditions at the location of the release, and the use of these non-representative meteorological conditions can result in significant errors in the hazard assessments downwind of the sensors, even when the other source parameters are accurately characterized. Here, we describe a computationally efficient methodology to characterize both the release source parameters and the low-level winds (e.g., winds near the surface) required to produce a refined downwind hazard. This methodology, known as the Variational Iterative Refinement Source Term Estimation (STE) Algorithm (VIRSA), consists of a combination of modeling systems. These systems include a back-trajectory based source inversion method, a forward Gaussian puff dispersion model, and a variational refinement algorithm that uses both a simple forward AT&D model that is a surrogate for the more complex Gaussian puff model and a formal adjoint of this surrogate model. The back-trajectory based method is used to calculate a "first guess" source estimate based on the available observations of the airborne contaminant plume and atmospheric conditions. The variational refinement algorithm is then used to iteratively refine the first guess STE parameters and meteorological variables. The algorithm has been evaluated across a wide range of scenarios of varying complexity. It has been shown to improve the source location estimate by several hundred percent (normalized by the distance from the source to the closest sampler) and to improve mass estimates by several orders of magnitude. Furthermore, it also has the ability to operate in scenarios with inconsistencies between the wind and airborne contaminant sensor observations and to adjust the wind to provide a better match between the hazard prediction and the observations.
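
    The following toy sketch is not VIRSA, but it illustrates the core refinement idea under stated assumptions: start from a first-guess source, then jointly adjust source location, emission rate, and wind direction so that a simple Gaussian-plume surrogate matches sensor readings. The plume model, sensor layout, and all parameter values are invented for illustration.

```python
import numpy as np
from scipy.optimize import least_squares

# Toy refinement of a first-guess source term against sensor observations
# using a crude Gaussian-plume surrogate. Not the VIRSA algorithm itself.

def gaussian_plume(params, sensors, u=3.0):
    x0, y0, q, wind_dir = params
    theta = np.deg2rad(wind_dir)
    # rotate sensor coordinates into a plume-aligned frame
    dx = (sensors[:, 0] - x0) * np.cos(theta) + (sensors[:, 1] - y0) * np.sin(theta)
    dy = -(sensors[:, 0] - x0) * np.sin(theta) + (sensors[:, 1] - y0) * np.cos(theta)
    dx = np.maximum(dx, 1.0)                    # only downwind distances
    sigma_y = 0.08 * dx                         # crude dispersion growth
    return q / (2 * np.pi * u * sigma_y**2) * np.exp(-0.5 * (dy / sigma_y) ** 2)

# synthetic "truth" and noisy observations at four samplers
sensors = np.array([[300.0, 20.0], [500.0, -40.0], [800.0, 60.0], [1200.0, 0.0]])
truth = [0.0, 0.0, 50.0, 5.0]
obs = gaussian_plume(truth, sensors) * (1 + 0.05 * np.random.default_rng(1).standard_normal(4))

first_guess = [80.0, -60.0, 10.0, 0.0]          # e.g. from a back-trajectory step
fit = least_squares(lambda p: gaussian_plume(p, sensors) - obs, first_guess)
print("refined (x, y, q, wind_dir):", np.round(fit.x, 2))
```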

  16. Photovoltaic village power application: assessment of the near-term market

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rosenblum, L.; Bifano, W.J.; Poley, W.A.

    1978-01-01

    A preliminary assessment of the near-term market for photovoltaic village power applications is presented. One of the objectives of the Department of Energy's (DOE) National Photovoltaic Program is to stimulate the demand for photovoltaic power systems so that appropriate markets will be developed in the near term to support the increasing photovoltaic production capacity also being developed by DOE. The village power application represents such a potential market for photovoltaics. The price of energy for photovoltaic systems is compared to that of utility line extensions and diesel generators. The potential 'domestic' demand (including the 50 states of the union plus the areas under legal control of the U.S. government) is defined in both the government and commercial sectors. The foreign demand and sources of funding for village power systems in the developing countries are also discussed briefly. It is concluded that a near-term domestic market of at least 12 MW (peak) and a foreign market of about 10 GW (peak) exists and that significant market penetration should be possible beginning in the 1981-82 period.

  17. River stage influences on uranium transport in a hydrologically dynamic groundwater-surface water transition zone

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zachara, John M.; Chen, Xingyuan; Murray, Chris

    In this study, a well-field within a uranium (U) plume in the groundwater-surface water transition zone was monitored for a 3 year period for water table elevation and dissolved solutes. The plume discharges to the Columbia River, which displays a dramatic spring stage surge resulting from snowmelt. Groundwater exhibits a low hydrologic gradient and chemical differences with river water. River water intrudes the site in spring. Specific aims were to assess the impacts of river intrusion on dissolved uranium (U aq), specific conductance (SpC), and other solutes, and to discriminate between transport, geochemical, and source term heterogeneity effects. Time series trends for U aq and SpC were complex and displayed large temporal and well-to-well variability as a result of water table elevation fluctuations, river water intrusion, and changes in groundwater flow directions. The wells were clustered into subsets exhibiting common behaviors resulting from the intrusion dynamics of river water and the location of source terms. Hot-spots in U aq varied in location with increasing water table elevation through the combined effects of advection and source term location. Heuristic reactive transport modeling with PFLOTRAN demonstrated that mobilized U aq was transported between wells and source terms in complex trajectories, and was diluted as river water entered and exited the groundwater system. While U aq time-series concentration trends varied significantly from year-to-year as a result of climate-caused differences in the spring hydrograph, common and partly predictable response patterns were observed that were driven by water table elevation, and the extent and duration of river water intrusion.

  18. River stage influences on uranium transport in a hydrologically dynamic groundwater-surface water transition zone

    DOE PAGES

    Zachara, John M.; Chen, Xingyuan; Murray, Chris; ...

    2016-03-04

    In this study, a well-field within a uranium (U) plume in the groundwater-surface water transition zone was monitored for a 3 year period for water table elevation and dissolved solutes. The plume discharges to the Columbia River, which displays a dramatic spring stage surge resulting from snowmelt. Groundwater exhibits a low hydrologic gradient and chemical differences with river water. River water intrudes the site in spring. Specific aims were to assess the impacts of river intrusion on dissolved uranium (U aq), specific conductance (SpC), and other solutes, and to discriminate between transport, geochemical, and source term heterogeneity effects. Time series trends for U aq and SpC were complex and displayed large temporal and well-to-well variability as a result of water table elevation fluctuations, river water intrusion, and changes in groundwater flow directions. The wells were clustered into subsets exhibiting common behaviors resulting from the intrusion dynamics of river water and the location of source terms. Hot-spots in U aq varied in location with increasing water table elevation through the combined effects of advection and source term location. Heuristic reactive transport modeling with PFLOTRAN demonstrated that mobilized U aq was transported between wells and source terms in complex trajectories, and was diluted as river water entered and exited the groundwater system. While U aq time-series concentration trends varied significantly from year-to-year as a result of climate-caused differences in the spring hydrograph, common and partly predictable response patterns were observed that were driven by water table elevation, and the extent and duration of river water intrusion.

  19. Filtered Mass Density Function for Design Simulation of High Speed Airbreathing Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Drozda, T. G.; Sheikhi, R. M.; Givi, Peyman

    2001-01-01

    The objective of this research is to develop and implement new methodology for large eddy simulation (LES) of high-speed reacting turbulent flows. We have just completed two (2) years of Phase I of this research. This annual report provides a brief and up-to-date summary of our activities during the period September 1, 2000 through August 31, 2001. In the work within the past year, a methodology termed "velocity-scalar filtered density function" (VSFDF) is developed and implemented for LES of turbulent flows. In this methodology the effects of the unresolved subgrid scales (SGS) are taken into account by considering the joint probability density function (PDF) of all of the components of the velocity and scalar vectors. An exact transport equation is derived for the VSFDF in which the effects of the unresolved SGS convection, SGS velocity-scalar source, and SGS scalar-scalar source terms appear in closed form. The remaining unclosed terms in this equation are modeled. A system of stochastic differential equations (SDEs) which yields statistically equivalent results to the modeled VSFDF transport equation is constructed. These SDEs are solved numerically by a Lagrangian Monte Carlo procedure. The consistency of the proposed SDEs and the convergence of the Monte Carlo solution are assessed by comparison with results obtained by an Eulerian LES procedure in which the corresponding transport equations for the first two SGS moments are solved. The unclosed SGS convection, SGS velocity-scalar source, and SGS scalar-scalar source terms in the Eulerian LES are replaced by the corresponding terms from the VSFDF equation. The consistency of the results is then analyzed for the case of a two-dimensional mixing layer.
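
    As a rough illustration of the Lagrangian Monte Carlo solution strategy mentioned above, the sketch below advances an ensemble of notional particles whose velocity follows a Langevin-type SDE (Euler-Maruyama integration) and whose scalar relaxes toward the ensemble mean. This is a generic particle PDF/FDF illustration with invented constants, not the actual VSFDF model equations.

```python
import numpy as np

# Notional particles carry a velocity fluctuation and a scalar; the velocity
# obeys a Langevin-type SDE and the scalar mixes toward the ensemble mean.
# Constants and initial conditions are invented for illustration.

rng = np.random.default_rng(0)
n, dt, nsteps = 5000, 1e-3, 2000
tau_u, tau_phi, sigma_u = 0.05, 0.1, 1.0     # relaxation times, velocity scale

u = rng.normal(0.0, sigma_u, n)              # particle velocity fluctuation
phi = rng.choice([0.0, 1.0], n)              # initially unmixed scalar

for _ in range(nsteps):
    # Euler-Maruyama step: drift toward zero mean plus stochastic forcing
    u += -u / tau_u * dt + np.sqrt(2 * sigma_u**2 / tau_u * dt) * rng.standard_normal(n)
    # IEM-style mixing: relax each particle scalar toward the ensemble mean
    phi += -(phi - phi.mean()) / tau_phi * dt

print("velocity variance:", u.var(), "scalar variance:", phi.var())
```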

  20. Understanding cancer survivors' information needs and information-seeking behaviors for complementary and alternative medicine from short- to long-term survival: a mixed-methods study.

    PubMed

    Scarton, Lou Ann; Del Fiol, Guilherme; Oakley-Girvan, Ingrid; Gibson, Bryan; Logan, Robert; Workman, T Elizabeth

    2018-01-01

    The research examined complementary and alternative medicine (CAM) information-seeking behaviors and preferences from short- to long-term cancer survival, including goals, motivations, and information sources. A mixed-methods approach was used with cancer survivors from the "Assessment of Patients' Experience with Cancer Care" 2004 cohort. Data collection included a mail survey and phone interviews using the critical incident technique (CIT). Seventy survivors from the 2004 study responded to the survey, and eight participated in the CIT interviews. Quantitative results showed that CAM usage did not change significantly between 2004 and 2015. The following themes emerged from the CIT: families' and friends' provision of the initial introduction to a CAM, use of CAM to manage the emotional and psychological impact of cancer, utilization of trained CAM practitioners, and online resources as a prominent source for CAM information. The majority of participants expressed an interest in an online information-sharing portal for CAM. Patients continue to use CAM well into long-term cancer survivorship. Finding trustworthy sources for information on CAM presents many challenges such as reliability of source, conflicting information on efficacy, and unknown interactions with conventional medications. Study participants expressed interest in an online portal to meet these needs through patient testimonials and linkage of claims to the scientific literature. Such a portal could also aid medical librarians and clinicians in locating and evaluating CAM information on behalf of patients.

  1. Understanding cancer survivors’ information needs and information-seeking behaviors for complementary and alternative medicine from short- to long-term survival: a mixed-methods study

    PubMed Central

    Scarton, Lou Ann; Del Fiol, Guilherme; Oakley-Girvan, Ingrid; Gibson, Bryan; Logan, Robert; Workman, T. Elizabeth

    2018-01-01

    Objective The research examined complementary and alternative medicine (CAM) information-seeking behaviors and preferences from short- to long-term cancer survival, including goals, motivations, and information sources. Methods A mixed-methods approach was used with cancer survivors from the “Assessment of Patients’ Experience with Cancer Care” 2004 cohort. Data collection included a mail survey and phone interviews using the critical incident technique (CIT). Results Seventy survivors from the 2004 study responded to the survey, and eight participated in the CIT interviews. Quantitative results showed that CAM usage did not change significantly between 2004 and 2015. The following themes emerged from the CIT: families’ and friends’ provision of the initial introduction to a CAM, use of CAM to manage the emotional and psychological impact of cancer, utilization of trained CAM practitioners, and online resources as a prominent source for CAM information. The majority of participants expressed an interest in an online information-sharing portal for CAM. Conclusion Patients continue to use CAM well into long-term cancer survivorship. Finding trustworthy sources for information on CAM presents many challenges such as reliability of source, conflicting information on efficacy, and unknown interactions with conventional medications. Study participants expressed interest in an online portal to meet these needs through patient testimonials and linkage of claims to the scientific literature. Such a portal could also aid medical librarians and clinicians in locating and evaluating CAM information on behalf of patients. PMID:29339938

  2. Internet Interventions for Long-Term Conditions: Patient and Caregiver Quality Criteria

    PubMed Central

    Murray, Elizabeth; Stevenson, Fiona; Gore, Charles; Nazareth, Irwin

    2006-01-01

    Background Interactive health communication applications (IHCAs) that combine high-quality health information with interactive components, such as self-assessment tools, behavior change support, peer support, or decision support, are likely to benefit people with long-term conditions. IHCAs are now largely Web-based and are becoming known as "Internet interventions." Although there are numerous professionally generated criteria to assess health-related websites, to date there has been scant exploration of patient-generated assessment criteria even though patients and professionals use different criteria for assessing the quality of traditional sources of health information. Objective We aimed to determine patients' and caregivers' requirements of IHCAs for long-term conditions as well as their criteria for assessing the quality of different programs. Methods This was a qualitative study with focus groups. Patients and caregivers managing long-term conditions used three (predominantly Web-based) IHCAs relevant to their condition and subsequently discussed the strengths and weaknesses of the different IHCAs in focus groups. Participants in any one focus group all shared the same long-term condition and viewed the same three IHCAs. Patient and caregiver criteria for IHCAs emerged from the data. Results There were 40 patients and caregivers who participated in 10 focus groups. Participants welcomed the potential of Internet interventions but felt that many were not achieving their full potential. Participants generated detailed and specific quality criteria relating to information content, presentation, interactivity, and trustworthiness, which can be used by developers and purchasers of Internet interventions. Conclusions The user-generated quality criteria reported in this paper should help developers and purchasers provide Internet interventions that better meet user needs. PMID:16954123

  3. Source-water susceptibility assessment in Texas—Approach and methodology

    USGS Publications Warehouse

    Ulery, Randy L.; Meyer, John E.; Andren, Robert W.; Newson, Jeremy K.

    2011-01-01

    Public water systems provide potable water for the public's use. The Safe Drinking Water Act amendments of 1996 required States to prepare a source-water susceptibility assessment (SWSA) for each public water system (PWS). States were required to determine the source of water for each PWS, the origin of any contaminant of concern (COC) monitored or to be monitored, and the susceptibility of the public water system to COC exposure, to protect public water supplies from contamination. In Texas, the Texas Commission on Environmental Quality (TCEQ) was responsible for preparing SWSAs for the more than 6,000 public water systems, representing more than 18,000 surface-water intakes or groundwater wells. The U.S. Geological Survey (USGS) worked in cooperation with TCEQ to develop the Source Water Assessment Program (SWAP) approach and methodology. Texas' SWAP meets all requirements of the Safe Drinking Water Act and ultimately provides the TCEQ with a comprehensive tool for protection of public water systems from contamination by up to 247 individual COCs. TCEQ staff identified both the list of contaminants to be assessed and contaminant threshold values (THR) to be applied. COCs were chosen because they were regulated contaminants, were expected to become regulated contaminants in the near future, or were unregulated but thought to represent long-term health concerns. THRs were based on maximum contaminant levels from U.S. Environmental Protection Agency (EPA)'s National Primary Drinking Water Regulations. For reporting purposes, COCs were grouped into seven contaminant groups: inorganic compounds, volatile organic compounds, synthetic organic compounds, radiochemicals, disinfection byproducts, microbial organisms, and physical properties. Expanding on the TCEQ's definition of susceptibility, subject-matter expert working groups formulated the SWSA approach based on assumptions that natural processes and human activities contribute COCs in quantities that vary in space and time; that increased levels of COC-producing activities within a source area may increase susceptibility to COC exposure; and that natural and manmade conditions within the source area may increase, decrease, or have no observable effect on susceptibility to COC exposure. Incorporating these assumptions, eight SWSA components were defined: identification, delineation, intrinsic susceptibility, point- and nonpoint-source susceptibility, contaminant occurrence, area-of-primary influence, and summary components. Spatial datasets were prepared to represent approximately 170 attributes or indicators used in the assessment process. These primarily were static datasets (approximately 46 gigabytes (GB) in size). Selected datasets such as PWS surface-water-intake or groundwater-well locations and potential source of contamination (PSOC) locations were updated weekly. Completed assessments were archived, and that database is approximately 10 GB in size. SWSA components currently (2011) are implemented in the Source Water Assessment Program-Decision Support System (SWAP-DSS) computer software, specifically developed to produce SWSAs. On execution of the software, the components work to identify the source of water for the well or intake, assess intrinsic susceptibility of the water- supply source, assess susceptibility to contamination with COCs from point and nonpoint sources, identify any previous detections of COCs from existing water-quality databases, and summarize the results. 
    Each water-supply source's susceptibility is assessed, source results are weighted by source capacity (when a PWS has multiple sources), and results are combined into a single SWSA for the PWS. SWSA reports are generated using the software; during 2003, more than 6,000 reports were provided to PWS operators and the public. The ability to produce detailed or summary reports for individual sources, and detailed or summary reports for a PWS, by COC or COC group was a unique capability of SWAP-DSS. In 2004, the TCEQ began a rotating schedule for SWSA wherein one-third of PWSs statewide would be assessed annually, or sooner if protection-program activities deemed it necessary, and that schedule has continued to the present. Cooperative efforts by the TCEQ and the USGS for SWAP software maintenance and enhancements ended in 2011 with the TCEQ assuming responsibility for all tasks.
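
    The capacity-weighted roll-up described in the preceding paragraph can be expressed very simply; the sketch below shows the weighting step with invented source names, capacities, and susceptibility scores.

```python
# Each source's susceptibility score is weighted by its capacity and combined
# into a single assessment for the public water system (PWS).
# Names, capacities, and scores below are invented placeholders.

def pws_susceptibility(sources):
    total_capacity = sum(s["capacity_mgd"] for s in sources)
    return sum(s["score"] * s["capacity_mgd"] for s in sources) / total_capacity

sources = [
    {"name": "well_1", "capacity_mgd": 0.5, "score": 0.7},   # mgd = million gal/day
    {"name": "well_2", "capacity_mgd": 1.5, "score": 0.3},
    {"name": "intake_1", "capacity_mgd": 3.0, "score": 0.5},
]
print(f"PWS-level susceptibility: {pws_susceptibility(sources):.2f}")
```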

  4. A World of Competitors: Assessing the US High-Tech Advantage and the Process of Globalisation

    ERIC Educational Resources Information Center

    Douglass, John Aubrey

    2008-01-01

    Research universities throughout the world are part of a larger effort by countries to bolster science and technological innovation and compete economically. The United States remains highly competitive as a source of high-tech innovation because of a number of market positions, many the results of long-term investments in institutions (such as…

  5. An Assessment of Fission Product Scrubbing in Sodium Pools Following a Core Damage Event in a Sodium Cooled Fast Reactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bucknor, M.; Farmer, M.; Grabaskas, D.

    The U.S. Nuclear Regulatory Commission has stated that mechanistic source term (MST) calculations are expected to be required as part of the advanced reactor licensing process. A recent study by Argonne National Laboratory has concluded that fission product scrubbing in sodium pools is an important aspect of an MST calculation for a sodium-cooled fast reactor (SFR). To model the phenomena associated with sodium pool scrubbing, a computational tool, developed as part of the Integral Fast Reactor (IFR) program, was utilized in an MST trial calculation. This tool was developed by applying classical theories of aerosol scrubbing to the decontamination of gases produced as a result of postulated fuel pin failures during an SFR accident scenario. The model currently considers aerosol capture by Brownian diffusion, inertial deposition, and gravitational sedimentation. The effects of sodium vapour condensation on aerosol scrubbing are also treated. This paper provides details of the individual scrubbing mechanisms utilized in the IFR code as well as results from a trial mechanistic source term assessment led by Argonne National Laboratory in 2016.
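
    A convenient way to see how the individual capture mechanisms listed above combine is through the overall pool decontamination factor (DF): each mechanism removes a fraction of the aerosol carried by a rising bubble, the remaining penetrations multiply, and the DF is the reciprocal of the overall penetration. The per-mechanism efficiencies in the sketch below are placeholder values, not output of the IFR-era code, which evaluates them from bubble size, rise time, particle size, and sodium vapour condensation.

```python
# Combine per-mechanism aerosol capture efficiencies into an overall pool
# decontamination factor. Efficiency values are illustrative placeholders.

def decontamination_factor(efficiencies):
    """Mechanisms act on the same bubble in series, so penetrations multiply."""
    penetration = 1.0
    for eta in efficiencies:
        penetration *= (1.0 - eta)
    return 1.0 / penetration

eta_brownian, eta_inertial, eta_sedimentation = 0.30, 0.55, 0.20
df = decontamination_factor([eta_brownian, eta_inertial, eta_sedimentation])
print(f"overall DF: {df:.1f}")   # ~4 for these illustrative values
```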

  6. Managing nuclear power plant induced disasters.

    PubMed

    Kyne, Dean

    2015-01-01

    The purpose of this study is to understand the management process for nuclear power plant (NPP) induced disasters; it sheds light on the phases and issues associated with NPP induced disaster management. The study uses the Palo Verde Nuclear Generating Station as the study subject and the State of Arizona as the study area. It uses the Radiological Assessment System for Consequence Analysis (RASCAL) Source Term to Dose (STDose) software of the Nuclear Regulatory Commission to project and assess the source term dose and release pathway, and ArcGIS, a geographic information system, to analyze geospatial data. A detailed case study of the Palo Verde plant was conducted. The findings reveal that the NPP induced disaster management process is conducted by various stakeholders. To save lives and to minimize impacts, it is vital to link the planning and the process of disaster management. The number of people exposed to the radioactive plume pathway and the level of radioactivity could vary depending on the speed and direction of the wind on the day the event takes place. The findings also show a need to address the issue of unequal exposure and unequal protection of different racial and ethnic groups with respect to the potential risks associated with NPPs.

  7. Surveillance system for air pollutants by combination of the decision support system COMPAS and optical remote sensing systems

    NASA Astrophysics Data System (ADS)

    Flassak, Thomas; de Witt, Helmut; Hahnfeld, Peter; Knaup, Andreas; Kramer, Lothar

    1995-09-01

    COMPAS is a decision support system designed to assist in the assessment of the consequences of accidental releases of toxic and flammable substances. One of the key elements of COMPAS is a feedback algorithm which allows the source term to be calculated with the aid of concentration measurements. Up to now the feedback technique has been applied to concentration measurements made with test tubes or conventional point sensors. In this paper an extension of the current method is presented: the combination of COMPAS with an optical remote sensing system such as the KAYSER-THREDE K300 FTIR system. Active remote sensing methods based on FTIR are, among other applications, ideal for so-called fence line monitoring of diffuse emissions and accidental releases from industrial facilities, since averaged concentration levels along the measurement path can be obtained from the FTIR spectra. These line-averaged concentrations are ideally suited as on-line input for COMPAS' feedback technique. Uncertainties in the assessment of the source term, related both to shortcomings of the dispersion model itself and to problems of a feedback strategy based on point measurements, are thereby reduced.
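
    As a toy illustration of the feedback idea, note that for a linear dispersion model the predicted concentration scales with the emission rate, so a measured line-averaged concentration along the FTIR path can be used to rescale an assumed source term. The crosswind profile, path geometry, and numbers below are invented and stand in for whatever dispersion model COMPAS actually applies.

```python
import numpy as np

# One feedback step: rescale an assumed emission rate so that the modeled
# path-averaged concentration matches the FTIR line-averaged measurement.
# The plume shape and all values are invented for illustration.

def crosswind_profile(y, sigma_y=25.0):
    """Relative crosswind concentration shape at the fence line (unit source)."""
    return np.exp(-0.5 * (y / sigma_y) ** 2)

def update_source_rate(q_assumed, path_y, c_measured_path_avg):
    c_model_unit = crosswind_profile(path_y).mean()      # per unit emission
    c_model = q_assumed * c_model_unit
    return q_assumed * c_measured_path_avg / c_model

path_y = np.linspace(-100.0, 100.0, 201)                 # FTIR beam across the plume
q0 = 1.0                                                  # assumed release rate, kg/s
q1 = update_source_rate(q0, path_y, c_measured_path_avg=0.35)
print(f"updated source term: {q1:.2f} kg/s")
```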

  8. Dust: a metric for use in residential and building exposure assessment and source characterization.

    PubMed Central

    Lioy, Paul J; Freeman, Natalie C G; Millette, James R

    2002-01-01

    In this review, we examine house dust and residential soil and their use for identifying sources and quantifying the levels of toxicants for the estimation of exposure. We answer critical questions that focus on the selection of samples or sampling strategies for collection and discuss areas of uncertainty and gaps in knowledge. We discuss the evolution of dust sampling with a special emphasis on work conducted after the publication of the 1992 review by McArthur [Appl Occup Environ Hyg 7(9):599-606 (1992)]. The approaches to sampling dust examined include surface wipe sampling, vacuum sampling, and other sampling approaches, including attic sampling. The metrics of presentation of results for toxicants in dust, surface loading (micrograms per square centimeter) or surface concentration (micrograms per gram), are discussed. We evaluate these metrics in terms of how the information can be used in source characterization and in exposure characterization. We discuss the types of companion information on source use and household or personal activity patterns required to assess the significance of the dust exposure. The status and needs for wipe samplers, surface samplers, and vacuum samplers are summarized with some discussion on the strengths and weaknesses of each type of sampler. We also discuss needs for research and development and the current status of standardization. Case studies are provided to illustrate the use of house dust and residential soil in source characterization, forensic analyses, or human exposure assessment. PMID:12361921
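
    The two metrics named above are related through the amount of dust present on the surface: a surface loading in micrograms per square centimeter is the dust concentration in micrograms per gram multiplied by the dust mass per square centimeter. The sketch below simply encodes that relationship; the lead concentration and dust loading values are illustrative.

```python
# Relate the two dust metrics: surface loading (ug of toxicant per cm^2)
# equals concentration in the dust (ug/g) times dust mass per unit area (g/cm^2).
# Example values are purely illustrative.

def surface_loading_ug_per_cm2(conc_ug_per_g, dust_loading_g_per_cm2):
    return conc_ug_per_g * dust_loading_g_per_cm2

lead_in_dust = 400.0        # ug of lead per gram of collected dust
dust_on_sill = 5.0e-5       # grams of dust per cm^2 of window sill
print(surface_loading_ug_per_cm2(lead_in_dust, dust_on_sill), "ug/cm^2")
```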

  9. Dust: a metric for use in residential and building exposure assessment and source characterization.

    PubMed

    Lioy, Paul J; Freeman, Natalie C G; Millette, James R

    2002-10-01

    In this review, we examine house dust and residential soil and their use for identifying sources and quantifying the levels of toxicants for the estimation of exposure. We answer critical questions that focus on the selection of samples or sampling strategies for collection and discuss areas of uncertainty and gaps in knowledge. We discuss the evolution of dust sampling with a special emphasis on work conducted after the publication of the 1992 review by McArthur [Appl Occup Environ Hyg 7(9):599-606 (1992)]. The approaches to sampling dust examined include surface wipe sampling, vacuum sampling, and other sampling approaches, including attic sampling. The metrics of presentation of results for toxicants in dust, surface loading (micrograms per square centimeter) or surface concentration (micrograms per gram), are discussed. We evaluate these metrics in terms of how the information can be used in source characterization and in exposure characterization. We discuss the types of companion information on source use and household or personal activity patterns required to assess the significance of the dust exposure. The status and needs for wipe samplers, surface samplers, and vacuum samplers are summarized with some discussion on the strengths and weaknesses of each type of sampler. We also discuss needs for research and development and the current status of standardization. Case studies are provided to illustrate the use of house dust and residential soil in source characterization, forensic analyses, or human exposure assessment.

  10. PCB remediation in schools: a review.

    PubMed

    Brown, Kathleen W; Minegishi, Taeko; Cummiskey, Cynthia Campisano; Fragala, Matt A; Hartman, Ross; MacIntosh, David L

    2016-02-01

    Growing awareness of polychlorinated biphenyls (PCBs) in legacy caulk and other construction materials of schools has created a need for information on best practices to control human exposures and comply with applicable regulations. A concise review of approaches and techniques for management of building-related PCBs is the focus of this paper. Engineering and administrative controls that block pathways of PCB transport, dilute concentrations of PCBs in indoor air or other exposure media, or establish uses of building space that mitigate exposure can be effective initial responses to identification of PCBs in a building. Mitigation measures also provide time for school officials to plan a longer-term remediation strategy and to secure the necessary resources. These longer-term strategies typically involve removal of caulk or other primary sources of PCBs as well as nearby masonry or other materials contaminated with PCBs by the primary sources. The costs of managing PCB-containing building materials from assessment through ultimate disposal can be substantial. Optimizing the efficacy and cost-effectiveness of remediation programs requires aligning a thorough understanding of sources and exposure pathways with the most appropriate mitigation and abatement methods.

  11. Taste and odor occurrence in Lake William C. Bowen and Municipal Reservoir #1, Spartanburg County, South Carolina

    USGS Publications Warehouse

    Journey, Celeste; Arrington, Jane M.

    2009-01-01

    The U.S. Geological Survey and Spartanburg Water are working cooperatively on an ongoing study of Lake Bowen and Reservoir #1 to identify environmental factors that enhance or influence the production of geosmin in the source-water reservoirs. Spartanburg Water is using information from this study to develop management strategies to reduce (short-term solution) and prevent (long-term solution) geosmin occurrence. Spartanburg Water utility treats and distributes drinking water to the Spartanburg area of South Carolina. The drinking water sources for the area are Lake William C. Bowen (Lake Bowen) and Municipal Reservoir #1 (Reservoir #1), located north of Spartanburg. These reservoirs, which were formed by the impoundment of the South Pacolet River, were assessed in 2006 by the South Carolina Department of Health and Environmental Control (SCDHEC) as being fully supportive of all uses based on established criteria. Nonetheless, Spartanburg Water had noted periodic taste and odor problems due to the presence of geosmin, a naturally occurring compound in the source water. Geosmin is not harmful, but its presence in drinking water is aesthetically unpleasant.

  12. Supersonic jet noise - Its generation, prediction and effects on people and structures

    NASA Technical Reports Server (NTRS)

    Preisser, J. S.; Golub, R. A.; Seiner, J. M.; Powell, C. A.

    1990-01-01

    This paper presents the results of a study aimed at quantifying the effects of jet source noise reduction, increases in aircraft lift, and reduced aircraft thrust on the take-off noise associated with supersonic civil transports. Supersonic jet noise sources are first described, and their frequency and directivity dependence are defined. The study utilizes NASA's Aircraft Noise Prediction Program in a parametric study to weigh the relative benefits of several approaches to low noise. The baseline aircraft concept used in these predictions is the AST-205-1 powered by GE21/J11-B14A scaled engines. Noise assessment is presented in terms of effective perceived noise levels at the FAA's centerline and sideline measuring locations for current subsonic aircraft, and in terms of audiologically perceived sound of people and other indirect effects. The results show that significant noise benefit can be achieved through proper understanding and utilization of all available approaches.

  13. Annual Rates on Seismogenic Italian Sources with Models of Long-Term Predictability for the Time-Dependent Seismic Hazard Assessment In Italy

    NASA Astrophysics Data System (ADS)

    Murru, Maura; Falcone, Giuseppe; Console, Rodolfo

    2016-04-01

    The present study is carried out in the framework of the Center for Seismic Hazard (CPS) of INGV, under the agreement signed in 2015 with the Department of Civil Protection for developing a new model of seismic hazard for the country that can update the current reference (MPS04-S1; zonesismiche.mi.ingv.it and esse1.mi.ingv.it) released between 2004 and 2006. In this initiative, we participate with the Long-Term Stress Transfer (LTST) Model to provide the annual occurrence rate of a seismic event over the entire Italian territory, from a minimum magnitude of Mw 4.5, considering bins of 0.1 magnitude units on geographical cells of 0.1° x 0.1°. Our methodology is based on the fusion of a statistical time-dependent renewal model (Brownian Passage Time, BPT; Matthews et al., 2002) with a physical model which considers the permanent stress change that a seismogenic source undergoes as a result of the earthquakes that occur on surrounding sources. For each considered catalog (historical, instrumental and individual seismogenic sources) we determined a distinct rate value for each 0.1° x 0.1° cell for the next 50 yrs. If a cell falls within one of the sources in question, we adopted the respective rate value, which refers only to the magnitude of the characteristic event. This rate value is divided by the number of grid cells that fall on the horizontal projection of the source. If instead a cell falls outside any seismogenic source, we considered the average rate value obtained from the historical and the instrumental catalogs, using the method of Frankel (1995). The annual occurrence rate was computed for each of the three considered distributions (Poisson, BPT, and BPT with inclusion of stress transfer).
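
    For the renewal part of such a model, the Brownian Passage Time density with mean recurrence time mu and aperiodicity alpha gives the probability of the next characteristic event within a forecast window, conditioned on the time elapsed since the previous event, which can then be turned into an equivalent annual rate. The sketch below uses the standard BPT density; the recurrence parameters and elapsed time are invented, not values for any Italian source.

```python
import numpy as np
from scipy.integrate import quad

# Conditional 50-year probability and equivalent annual rate from the
# Brownian Passage Time (BPT) renewal density. Fault parameters are invented.

def bpt_pdf(t, mu, alpha):
    return np.sqrt(mu / (2.0 * np.pi * alpha**2 * t**3)) * \
        np.exp(-(t - mu) ** 2 / (2.0 * mu * alpha**2 * t))

def conditional_probability(mu, alpha, t_elapsed, dt=50.0):
    num, _ = quad(bpt_pdf, t_elapsed, t_elapsed + dt, args=(mu, alpha))
    den, _ = quad(bpt_pdf, t_elapsed, np.inf, args=(mu, alpha))
    return num / den

mu, alpha = 700.0, 0.5          # mean recurrence (yr) and aperiodicity
p50 = conditional_probability(mu, alpha, t_elapsed=400.0)
annual_rate = -np.log(1.0 - p50) / 50.0   # equivalent Poisson rate per year
print(f"P(event in next 50 yr) = {p50:.3f}, annual rate = {annual_rate:.2e}")
```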

  14. Hybrid Wing Body Aircraft System Noise Assessment with Propulsion Airframe Aeroacoustic Experiments

    NASA Technical Reports Server (NTRS)

    Thomas, Russell H.; Burley, Casey L.; Olson, Erik D.

    2010-01-01

    A system noise assessment of a hybrid wing body configuration was performed using NASA's best available aircraft models, engine model, and system noise assessment method. A propulsion airframe aeroacoustic effects experimental database for key noise sources and interaction effects was used to provide data directly in the noise assessment where prediction methods are inadequate. NASA engine and aircraft system models were created to define the hybrid wing body aircraft concept as a twin engine aircraft with a 7500 nautical mile mission. The engines were modeled as existing technology high bypass ratio turbofans. The baseline hybrid wing body aircraft was assessed at 22 dB cumulative below the FAA Stage 4 certification level. To determine the potential for noise reduction with relatively near term technologies, seven other configurations were assessed, beginning with moving the engines two fan nozzle diameters upstream of the trailing edge and then adding technologies for reduction of the highest noise sources. Aft radiated noise was expected to be the most challenging to reduce and, therefore, the experimental database focused on jet nozzle and pylon configurations that could reduce jet noise through a combination of source reduction and shielding effectiveness. The best configuration for reduction of jet noise used state-of-the-art technology chevrons with a pylon above the engine in the crown position. This configuration resulted in jet source noise reduction, favorable azimuthal directivity, noise source relocation upstream where it is more effectively shielded by the limited airframe surface, and additional fan noise attenuation from acoustic liner on the crown pylon internal surfaces. Vertical and elevon surfaces were also assessed to add shielding area. The elevon deflection above the trailing edge showed some small additional noise reduction, whereas vertical surfaces resulted in a slight noise increase. With the effects of the configurations from the database included, the best available noise reduction was 40 dB cumulative. Projected effects from additional technologies were assessed for an advanced noise reduction configuration including landing gear fairings and advanced pylon and chevron nozzles. Incorporating the three additional technology improvements, the aircraft noise is projected to be 42.4 dB cumulative below the Stage 4 level.

  15. The validity of open-source data when assessing jail suicides.

    PubMed

    Thomas, Amanda L; Scott, Jacqueline; Mellow, Jeff

    2018-05-09

    The Bureau of Justice Statistics' Deaths in Custody Reporting Program (DCRP) is the primary source for jail suicide research, though the data is restricted from general dissemination. This study is the first to examine whether jail suicide data obtained from publicly available sources can help inform our understanding of this serious public health problem. Of the 304 suicides that were reported through the DCRP in 2009, roughly 56 percent (N = 170) were identified through the open-source search protocol. Each of the sources was assessed based on how much information was collected on the incident and the types of variables available. A descriptive analysis was then conducted on the variables that were present in both data sources. The four variables present in each data source were: (1) demographic characteristics of the victim, (2) the location of occurrence within the facility, (3) the location of occurrence by state, and (4) the size of the facility. Findings demonstrate that the prevalence and correlates of jail suicides are extremely similar in both open-source and official data. Moreover, for almost every variable measured, open-source data captured as much information as official data did, if not more. Further, variables not found in official data were identified in the open-source database, thus allowing researchers to have a more nuanced understanding of the situational characteristics of the event. This research provides support for the argument in favor of including open-source data in jail suicide research, as it illustrates how open-source data can be used to provide additional information not originally found in official data. In sum, this research is vital in terms of possible suicide prevention, which may be directly linked to being able to manipulate environmental factors.

  16. Feasibility of future epidemiological studies on possible health effects of mobile phone base stations.

    PubMed

    Neubauer, Georg; Feychting, Maria; Hamnerius, Yngve; Kheifets, Leeka; Kuster, Niels; Ruiz, Ignacio; Schüz, Joachim; Uberbacher, Richard; Wiart, Joe; Röösli, Martin

    2007-04-01

    The increasing deployment of mobile communication base stations has led to increasing demand for epidemiological studies on possible health effects of radio frequency emissions. The methodological challenges of such studies have been critically evaluated by a panel of scientists in the fields of radiofrequency engineering/dosimetry and epidemiology. Strengths and weaknesses of previous studies have been identified. Dosimetric concepts and crucial aspects of exposure assessment were evaluated in terms of epidemiological studies on different types of outcomes. We conclude that in principle base station epidemiological studies are feasible. However, the exposure contributions from all relevant radio frequency sources have to be taken into account. The applied exposure assessment method should be piloted and validated. Short- to medium-term effects on physiology or health related quality of life are best investigated by cohort studies. For long-term effects, groups with a potential for high exposure need first to be identified; for immediate effects, human laboratory studies are the preferred approach. (c) 2006 Wiley-Liss, Inc.

  17. Active System for Electromagnetic Perturbation Monitoring in Vehicles

    NASA Astrophysics Data System (ADS)

    Matoi, Adrian Marian; Helerea, Elena

    Nowadays the electromagnetic environment is rapidly expanding in the frequency domain, and wireless services are extending in terms of covered area. European electromagnetic compatibility regulations specify limit values for emissions as well as procedures for determining the susceptibility of a vehicle. The approval procedure for a series of cars is based on determining the emissions/immunity level for a few vehicles picked randomly from the series, on the assumption that the entire vehicle series is compliant. During immunity assessment, the vehicle is not subjected to real perturbation sources but is exposed to electric/magnetic fields generated by laboratory equipment. Since the current approach only partially takes the real situation regarding perturbation sources into account, this paper proposes an active system for determining the electromagnetic parameters of a vehicle's environment that implements a logical diagram for measurement and satisfies the imposed requirements. This new and original solution is useful for EMC assessment of hybrid and electric vehicles.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frayce, D.; Khayat, R.E.; Derdouri, A.

    The dual reciprocity boundary element method (DRBEM) is implemented to solve three-dimensional transient heat conduction problems in the presence of arbitrary sources, typically as these problems arise in materials processing. The DRBEM has a major advantage over conventional BEM, since it avoids the computation of volume integrals. These integrals stem from transient, nonlinear, and/or source terms. Thus there is no need to discretize the inner domain, since only a number of internal points are needed for the computation. The validity of the method is assessed upon comparison with results from benchmark problems where analytical solutions exist. There is generally good agreement. Comparison against finite element results is also favorable. Calculations are carried out in order to assess the influence of the number and location of internal nodes. The influence of the ratio of the numbers of internal to boundary nodes is also examined.

  19. Gingival Retraction Methods: A Systematic Review.

    PubMed

    Tabassum, Sadia; Adnan, Samira; Khan, Farhan Raza

    2017-12-01

    The aim of this systematic review was to assess gingival retraction methods in terms of the amount of gingival retraction achieved and the changes observed in various clinical parameters: gingival index (GI), plaque index (PI), probing depth (PD), and attachment loss (AL). Data sources included three major databases, PubMed, CINAHL plus (Ebsco), and Cochrane, along with a hand search. The search was made using the key terms in different permutations of gingival retraction* AND displacement method* OR technique* OR agents OR material* OR medicament*. The initial search yielded 145 articles, which were narrowed down to 10 articles using strict eligibility criteria: clinical trials or experimental studies on gingival retraction methods conducted on human permanent teeth only, with the amount of tooth structure gained and the assessment of clinical parameters as the outcomes. Gingival retraction was measured in 6/10 studies, whereas the clinical parameters were assessed in 5/10 studies. The total number of teeth assessed in the 10 included studies was 400. The most common method used for gingival retraction was chemomechanical. The results were heterogeneous with regard to the outcome variables. No method seemed to be significantly superior to the others in terms of gingival retraction achieved. Clinical parameters were not significantly affected by the gingival retraction method. © 2016 by the American College of Prosthodontists.

  20. Accounting for multiple sources of uncertainty in impact assessments: The example of the BRACE study

    NASA Astrophysics Data System (ADS)

    O'Neill, B. C.

    2015-12-01

    Assessing climate change impacts often requires the use of multiple scenarios, types of models, and data sources, leading to a large number of potential sources of uncertainty. For example, a single study might require a choice of a forcing scenario, climate model, bias correction and/or downscaling method, societal development scenario, model (typically several) for quantifying elements of societal development such as economic and population growth, biophysical model (such as for crop yields or hydrology), and societal impact model (e.g. economic or health model). Some sources of uncertainty are reduced or eliminated by the framing of the question. For example, it may be useful to ask what an impact outcome would be conditional on a given societal development pathway, forcing scenario, or policy. However, many sources of uncertainty remain, and it is rare for all or even most of these sources to be accounted for. I use the example of a recent integrated project on the Benefits of Reduced Anthropogenic Climate changE (BRACE) to explore useful approaches to uncertainty across multiple components of an impact assessment. BRACE comprises 23 papers that assess the differences in impacts between two alternative climate futures: those associated with Representative Concentration Pathways (RCPs) 4.5 and 8.5. It quantifies differences in impacts in terms of extreme events, health, agriculture, tropical cyclones, and sea level rise. Methodologically, it includes climate modeling, statistical analysis, integrated assessment modeling, and sector-specific impact modeling. It employs alternative scenarios of both radiative forcing and societal development, but generally uses a single climate model (CESM), partially accounting for climate uncertainty by drawing heavily on large initial condition ensembles. Strengths and weaknesses of the approach to uncertainty in BRACE are assessed. Options under consideration for improving the approach include the use of perturbed physics ensembles of CESM, employing results from multiple climate models, and combining the results from single impact models with statistical representations of uncertainty across multiple models. A key consideration is the relationship between the question being addressed and the uncertainty approach.

  1. ASSESSMENT OF YOUTUBE VIDEOS AS A SOURCE OF INFORMATION ON MEDICATION USE IN PREGNANCY

    PubMed Central

    Hansen, Craig; Interrante, Julia D; Ailes, Elizabeth C; Frey, Meghan T; Broussard, Cheryl S; Godoshian, Valerie J; Lewis, Courtney; Polen, Kara ND; Garcia, Amanda P; Gilboa, Suzanne M

    2015-01-01

    Background When making decisions about medication use in pregnancy, women consult many information sources, including the Internet. The aim of this study was to assess the content of publicly-accessible YouTube videos that discuss medication use in pregnancy. Methods Using 2,023 distinct combinations of search terms related to medications and pregnancy, we extracted metadata from YouTube videos using a YouTube video Application Programming Interface. Relevant videos were defined as those with a medication search term and a pregnancy-related search term in either the video title or description. We viewed relevant videos and abstracted content from each video into a database. We documented whether videos implied each medication to be ‘safe’ or ‘unsafe’ in pregnancy and compared that assessment with the medication’s Teratogen Information System (TERIS) rating. Results After viewing 651 videos, 314 videos with information about medication use in pregnancy were available for the final analyses. The majority of videos were from law firms (67%), television segments (10%), or physicians (8%). Selective serotonin reuptake inhibitors (SSRIs) were the most common medication class named (225 videos, 72%), and 88% of videos about SSRIs indicated they were ‘unsafe’ for use in pregnancy. However, the TERIS ratings for medication products in this class range from ‘unlikely’ to ‘minimal’ teratogenic risk. Conclusion For the majority of medications, current YouTube video content does not adequately reflect what is known about the safety of their use in pregnancy and should be interpreted cautiously. However, YouTube could serve as a valuable platform for communicating evidence-based medication safety information. PMID:26541372
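
    The relevance rule described above (a medication term and a pregnancy-related term in the title or description) can be expressed as a simple filter over video metadata. The following sketch is illustrative only; the term lists and records are hypothetical placeholders, not the study's 2,023 search-term combinations.

    ```python
    # Hedged sketch of the relevance filter described in the abstract: a video counts as
    # relevant when a medication term AND a pregnancy-related term appear in its title or
    # description. Term lists and records below are small hypothetical placeholders.

    MEDICATION_TERMS = {"sertraline", "ssri", "paroxetine", "ondansetron"}
    PREGNANCY_TERMS = {"pregnancy", "pregnant", "birth defect"}

    def is_relevant(video: dict) -> bool:
        text = f"{video.get('title', '')} {video.get('description', '')}".lower()
        return (any(term in text for term in MEDICATION_TERMS)
                and any(term in text for term in PREGNANCY_TERMS))

    videos = [
        {"title": "Sertraline use during pregnancy", "description": "Talk to your doctor"},
        {"title": "Cooking with herbs", "description": "A kitchen tutorial"},
    ]
    print([v["title"] for v in videos if is_relevant(v)])  # only the first record matches
    ```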

  2. Antimatter Requirements and Energy Costs for Near-Term Propulsion Applications

    NASA Technical Reports Server (NTRS)

    Schmidt, G. R.; Gerrish, H. P.; Martin, J. J.; Smith, G. A.; Meyer, K. J.

    1999-01-01

    The superior energy density of antimatter annihilation has often been pointed to as the ultimate source of energy for propulsion. However, the limited capacity and very low efficiency of present-day antiproton production methods suggest that antimatter may be too costly to consider for near-term propulsion applications. We address this issue by assessing the antimatter requirements for six different types of propulsion concepts, including two in which antiprotons are used to drive energy release from combined fission/fusion. These requirements are compared against the capacity of both the current antimatter production infrastructure and the improved capabilities that could exist within the early part of next century. Results show that although it may be impractical to consider systems that rely on antimatter as the sole source of propulsive energy, the requirements for propulsion based on antimatter-assisted fission/fusion do fall within projected near-term production capabilities. In fact, a new facility designed solely for antiproton production but based on existing technology could feasibly support interstellar precursor missions and omniplanetary spaceflight with antimatter costs ranging up to $6.4 million per mission.

  3. Miniaturized pulsed laser source for time-domain diffuse optics routes to wearable devices

    NASA Astrophysics Data System (ADS)

    Di Sieno, Laura; Nissinen, Jan; Hallman, Lauri; Martinenghi, Edoardo; Contini, Davide; Pifferi, Antonio; Kostamovaara, Juha; Mora, Alberto Dalla

    2017-08-01

    We validate a miniaturized pulsed laser source for use in time-domain (TD) diffuse optics, following rigorous and shared protocols for performance assessment of this class of devices. This compact source (12 × 6 mm²) was previously developed for range-finding applications and provides short, high-energy (~100 ps, ~0.5 nJ) optical pulses at up to a 1 MHz repetition rate. Here, we start with a basic laser characterization and an analysis of the suitability of this laser for the diffuse optics application. Then, we present a TD optical system using this source and its performance both in recovering the optical properties of tissue-mimicking homogeneous phantoms and in detecting localized absorption perturbations. Finally, as a proof of concept of in vivo application, we demonstrate that the system is able to detect hemodynamic changes occurring in the arm of healthy volunteers during a venous occlusion. Squeezing the laser source into a small footprint removes a key technological bottleneck that has so far hampered the realization of a miniaturized TD diffuse optics system able to compete with established continuous-wave devices in terms of size and cost, but with wider performance potential, as demonstrated by research over the last two decades.

  4. On the Assessment of Acoustic Scattering and Shielding by Time Domain Boundary Integral Equation Solutions

    NASA Technical Reports Server (NTRS)

    Hu, Fang Q.; Pizzo, Michelle E.; Nark, Douglas M.

    2016-01-01

    Based on the time domain boundary integral equation formulation of the linear convective wave equation, a computational tool dubbed Time Domain Fast Acoustic Scattering Toolkit (TD-FAST) has recently been under development. The time domain approach has a distinct advantage that the solutions at all frequencies are obtained in a single computation. In this paper, the formulation of the integral equation, as well as its stabilization by the Burton-Miller type reformulation, is extended to cases of a constant mean flow in an arbitrary direction. In addition, a "Source Surface" is also introduced in the formulation that can be employed to encapsulate regions of noise sources and to facilitate coupling with CFD simulations. This is particularly useful for applications where the noise sources are not easily described by analytical source terms. Numerical examples are presented to assess the accuracy of the formulation, including a computation of noise shielding by a thin barrier motivated by recent Historical Baseline F31A31 open rotor noise shielding experiments. Furthermore, spatial resolution requirements of the time domain boundary element method are also assessed using points-per-wavelength metrics. It is found that, using only constant basis functions and high-order quadrature for surface integration, relative errors of less than 2% may be obtained when the surface spatial resolution is 5 points-per-wavelength (PPW) or 25 points-per-wavelength squared (PPW²).
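
    As a rough illustration of the resolution metric quoted above, the sketch below converts a frequency, sound speed, and surface element size into points per wavelength (PPW) and its square; all numerical values are hypothetical and not taken from the paper.

    ```python
    # Hedged sketch: estimating surface-mesh resolution in points per wavelength (PPW).
    # The frequency, sound speed, and element edge length below are illustrative only.

    def points_per_wavelength(frequency_hz: float, sound_speed_ms: float, element_size_m: float) -> float:
        """Number of surface points spanning one acoustic wavelength."""
        wavelength = sound_speed_ms / frequency_hz
        return wavelength / element_size_m

    if __name__ == "__main__":
        ppw = points_per_wavelength(frequency_hz=2000.0, sound_speed_ms=340.0, element_size_m=0.034)
        print(f"PPW   = {ppw:.1f}")    # ~5 points per wavelength
        print(f"PPW^2 = {ppw**2:.0f}") # ~25 points per wavelength squared
    ```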

  5. Logic-based assessment of the compatibility of UMLS ontology sources

    PubMed Central

    2011-01-01

    Background The UMLS Metathesaurus (UMLS-Meta) is currently the most comprehensive effort for integrating independently-developed medical thesauri and ontologies. UMLS-Meta is being used in many applications, including PubMed and ClinicalTrials.gov. The integration of new sources combines automatic techniques, expert assessment, and auditing protocols. The automatic techniques currently in use, however, are mostly based on lexical algorithms and often disregard the semantics of the sources being integrated. Results In this paper, we argue that UMLS-Meta’s current design and auditing methodologies could be significantly enhanced by taking into account the logic-based semantics of the ontology sources. We provide empirical evidence suggesting that UMLS-Meta in its 2009AA version contains a significant number of errors; these errors become immediately apparent if the rich semantics of the ontology sources is taken into account, manifesting themselves as unintended logical consequences that follow from the ontology sources together with the information in UMLS-Meta. We then propose general principles and specific logic-based techniques to effectively detect and repair such errors. Conclusions Our results suggest that the methodologies employed in the design of UMLS-Meta are not only very costly in terms of human effort, but also error-prone. The techniques presented here can be useful for both reducing human effort in the design and maintenance of UMLS-Meta and improving the quality of its contents. PMID:21388571

  6. Towards next generation time-domain diffuse optics devices

    NASA Astrophysics Data System (ADS)

    Dalla Mora, Alberto; Contini, Davide; Arridge, Simon R.; Martelli, Fabrizio; Tosi, Alberto; Boso, Gianluca; Farina, Andrea; Durduran, Turgut; Martinenghi, Edoardo; Torricelli, Alessandro; Pifferi, Antonio

    2015-03-01

    Diffuse optics is growing in terms of applications, ranging from oximetry to mammography, molecular imaging, quality assessment of food and pharmaceuticals, wood optics, and the physics of random media. Time-domain (TD) approaches, although appealing in terms of quantitation and depth sensitivity, are presently limited to large fiber-based systems with a limited number of source-detector pairs. We present a miniaturized TD source-detector probe embedding integrated laser sources and single-photon detectors. Some electronics are still external (e.g. power supply, pulse generators, timing electronics), yet full on-board integration using already proven technologies is feasible. The novel devices were successfully validated on heterogeneous phantoms, showing performance comparable to large state-of-the-art TD rack-based systems. With an investigation based on simulations, we provide numerical evidence that stacking many compact TD source-detector pairs in a dense, null source-detector distance arrangement could yield about one order of magnitude higher contrast on the brain cortex compared to a continuous-wave (CW) approach. Further, a 3-fold increase in the maximum depth (down to 6 cm) is estimated, opening access to new organs such as the lung or the heart. Finally, these new technologies show the way towards compact and wearable TD probes with orders-of-magnitude reductions in size and cost, for widespread use of TD devices in real life.

  7. Assessing the short-term clock drift of early broadband stations with burst events of the 26 s persistent and localized microseism

    NASA Astrophysics Data System (ADS)

    Xie, Jun; Ni, Sidao; Chu, Risheng; Xia, Yingjie

    2018-01-01

    An accurate seismometer clock plays an important role in seismological studies, including earthquake location and tomography. However, some seismic stations may have clock drift larger than 1 s (e.g. GSC in 1992), especially in the early days of global seismic networks. The 26 s Persistent Localized (PL) microseism event in the Gulf of Guinea sometimes excites strong and coherent signals, and can be used as a repeating source for assessing the stability of seismometer clocks. Taking stations GSC, PAS and PFO in the TERRAscope network as an example, the 26 s PL signal can be easily observed in the ambient noise cross-correlation function between these stations and a remote station, OBN, at an interstation distance of about 9700 km. The travel-time variation of this 26 s signal in the ambient noise cross-correlation function is used to infer clock error. A drastic clock error is detected during June 1992 for station GSC, but not for stations PAS and PFO. This short-term clock error, with a magnitude of about 25 s, is confirmed by both teleseismic and local earthquake records. Averaged over the three stations, the accuracy of the ambient noise cross-correlation function method with the 26 s source is about 0.3-0.5 s. Using this PL source, the clock can be validated for historical records of sparsely distributed stations, where the usual cross-correlation of short-period (<20 s) ambient noise might be less effective due to its attenuation over long interstation distances. However, this method suffers from a cycling problem and should be verified with teleseismic/local P waves. Further studies are also needed to investigate whether the 26 s source moves spatially and what effect this has on clock drift detection.
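
    The clock-drift measurement described above amounts to reading the lag of the 26 s wave packet between a reference cross-correlation function and one from the suspect epoch. The NumPy sketch below illustrates the idea on purely synthetic waveforms; the 25 s shift, lag axis, and envelope are invented for illustration.

    ```python
    import numpy as np

    # Hedged sketch: inferring a station clock error from the time shift of a narrow-band
    # (~26 s period) signal between two noise cross-correlation functions (CCFs).
    # All waveforms and the 25 s "drift" below are synthetic illustrations.

    dt = 1.0                                   # sample interval, s
    t = np.arange(0, 4000, dt)                 # lag axis of the CCF
    period = 26.0

    def ccf(clock_error_s: float) -> np.ndarray:
        """A synthetic CCF: a 26 s wave packet near 2000 s lag, shifted by a clock error."""
        arrival = 2000.0 + clock_error_s
        envelope = np.exp(-0.5 * ((t - arrival) / 100.0) ** 2)
        return envelope * np.cos(2 * np.pi * (t - arrival) / period)

    reference = ccf(0.0)        # CCF stacked over an epoch with a healthy clock
    shifted   = ccf(25.0)       # CCF from the epoch with a suspected clock error

    # Cross-correlate the two CCFs and read the lag of the maximum.
    xc = np.correlate(shifted, reference, mode="full")
    lags = (np.arange(xc.size) - (reference.size - 1)) * dt
    print(f"estimated clock error: {lags[np.argmax(xc)]:.1f} s")
    # ~25 s here; with real, noisy CCFs a +/-26 s cycle ambiguity can remain
    ```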

  8. An Assessment of Some Design Constraints on Heat Production of a 3D Conceptual EGS Model Using an Open-Source Geothermal Reservoir Simulation Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yidong Xia; Mitch Plummer; Robert Podgorney

    2016-02-01

    The performance of the heat production process over a 30-year period is assessed in a conceptual EGS model with a geothermal gradient of 65 K per km depth in the reservoir. Water is circulated through a pair of parallel wells connected by a set of single large wing fractures. The results indicate that the desired output electric power rate and lifespan could be obtained under suitable material properties and system parameters. A sensitivity analysis of some design constraints and operation parameters indicates that 1) the horizontal fracture spacing has a profound effect on the long-term performance of heat production, 2) the downward deviation angle of the parallel doublet wells may help overcome the difficulty of vertical drilling to reach a favorable production temperature, and 3) the thermal energy production rate and lifespan depend closely on the water mass flow rate. The results also indicate that heat production can be improved when the horizontal fracture spacing, well deviation angle, and production flow rate are kept within reasonable ranges. To conduct the reservoir modeling and simulations, an open-source, finite-element-based, fully implicit, fully coupled hydrothermal code, namely FALCON, has been developed and used in this work. Compared with most other existing codes in this area, which are either closed-source or commercial, this new open-source code demonstrates a code development strategy that aims to provide unparalleled ease of user customization and multi-physics coupling. Test results have shown that the FALCON code is able to complete the long-term tests efficiently and accurately, thanks to the state-of-the-art nonlinear and linear solver algorithms implemented in the code.

  9. 77 FR 19740 - Water Sources for Long-Term Recirculation Cooling Following a Loss-of-Coolant Accident

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-02

    ... NUCLEAR REGULATORY COMMISSION [NRC-2010-0249] Water Sources for Long-Term Recirculation Cooling... Regulatory Guide (RG) 1.82, ``Water Sources for Long-Term Recirculation Cooling Following a Loss-of-Coolant... regarding the sumps and suppression pools that provide water sources for emergency core cooling, containment...

  10. Effects of a constructed Technosol on mortality, survival and reproduction of earthworms

    NASA Astrophysics Data System (ADS)

    Pey, Benjamin; Cortet, Jerome; Capowiez, Yvan; Mignot, Lenaic; Nahmani, Johanne; Watteau, Francoise; Schwartz, Christophe

    2010-05-01

    Soils whose properties and pedogenesis are dominated by artificial or transported materials are classified as Technosols. Some of these Technosols are used in soil engineering, the deliberate combination of technical materials with the objective of restoring an ecosystem. Primary by-products that are used to build these Technosols need to be assessed from an ecotoxicological point of view. The present study aims to assess the effects of a constructed Technosol made from different primary by-products on the mortality, survival and reproduction of two earthworm species. The model Technosol used here is a combination of green-waste compost (GWC) and papermill sludge (PS) mixed with thermally treated industrial soil (TIS). OECD soil is used as a control. Three different experiments were conducted: i) the first to assess the potential toxic effect on Eisenia foetida biomass (28 days) and reproduction (56 days), ii) the second to assess the short-term effect (7 days) on Lumbricus terrestris biomass, and iii) the third to assess the medium-term effect (30 days) on L. terrestris biomass. Reproduction of E. foetida is enhanced with high proportions of GWC. For biomass, GWC seems to improve body mass, in contrast to the other materials, which lead to losses of body mass. Thus, for E. foetida, GWC seems to be a high-quality, long-term source of food. Body mass of L. terrestris decreased with GWC and OECD soil. In the short term only, TIS/PS leads to a gain in body mass. Only a mixture of 25% GWC - 75% TIS/PS allows a gain in body mass at medium term. TIS/PS appears to be a low-quality, short-term food resource but an excellent water reservoir. It can be concluded that the constructed Technosol is not toxic to soil fauna, but some differences appear between the tested material combinations, depending on the nature, proportion and trophic properties of the materials.

  11. Tsunami Hazard Assessment: Source regions of concern to U.S. interests derived from NOAA Tsunami Forecast Model Development

    NASA Astrophysics Data System (ADS)

    Eble, M. C.; uslu, B. U.; Wright, L.

    2013-12-01

    Synthetic tsunamis generated from source regions around the Pacific Basin are analyzed in terms of their relative impact on United States coastal locations. The region of tsunami origin is as important as the expected magnitude and the predicted inundation for understanding tsunami hazard. The NOAA Center for Tsunami Research has developed high-resolution tsunami models capable of predicting tsunami arrival time and wave amplitude at each location. These models have been used to conduct tsunami hazard assessments of maximum impact and tsunami inundation for use by local communities in education and evacuation map development. Hazard assessment studies conducted for Los Angeles, San Francisco, Crescent City, Hilo, and Apra Harbor are combined with results of tsunami forecast model development at each of seventy-five locations. A complete hazard assessment identifies every possible tsunami variation from a pre-computed propagation database. Study results indicate that the Eastern Aleutian Islands and Alaska are the most likely regions to produce the largest impact on the West Coast of the United States, while the East Philippines and Mariana trench regions impact Apra Harbor, Guam. Hawaii appears to be impacted equally by South America, Alaska and the Kuril Islands.

  12. Long-term agroecosystem research in the Central Mississippi River Basin: hydrogeologic controls and crop management influence on nitrates in loess and fractured glacial till

    USDA-ARS's Scientific Manuscript database

    Nitrogen (N) from agricultural activities has been suspected as a primary source of elevated ground water nitrate (NO3-N). The objective of this research was to assess the impact of common cropping systems on NO3-N levels for a glacial till aquifer underlying claypan soils in a predominantly agricul...

  13. Unregulated private wells in the Republic of Ireland: consumer awareness, source susceptibility and protective actions.

    PubMed

    Hynds, Paul D; Misstear, Bruce D; Gill, Laurence W

    2013-09-30

    While the safety of public drinking water supplies in the Republic of Ireland is governed and monitored at both local and national levels, there are currently no legislative tools in place relating to private supplies. It is therefore paramount that private well owners (and users) be aware of source specifications and potential contamination risks, to ensure adequate water quality. The objective of this study was to investigate the level of awareness among private well owners in the Republic of Ireland relating to source characterisation and groundwater contamination issues. This was undertaken through interviews with 245 private well owners. Statistical analysis indicates that respondents' source type significantly influences owner awareness, particularly regarding well construction and design parameters. Water treatment, source maintenance and regular water quality testing are considered the three primary "protective actions" (or "stewardship activities") against consumption of contaminated groundwater and were reported as being absent in 64%, 72% and 40% of cases, respectively. Results indicate that the level of awareness exhibited by well users did not significantly affect the likelihood of their source being contaminated (source susceptibility); increased awareness on the part of well users was associated with increased levels of protective action, particularly among borehole owners. Hence, lower levels of awareness may result in increased contraction of waterborne illnesses where contaminants have entered the well. Accordingly, focused educational strategies to increase awareness among private groundwater users are advocated in the short term; the development and introduction of formal legislation is recommended in the long term, including an integrated programme of well inspections and risk assessments. Copyright © 2013 Elsevier Ltd. All rights reserved.

  14. Organic carbon sources and sinks in San Francisco Bay: variability induced by river flow

    USGS Publications Warehouse

    Jassby, Alan D.; Powell, T.M.; Cloern, James E.

    1993-01-01

    Sources and sinks of organic carbon for San Francisco Bay (California, USA) were estimated for 1980. Sources for the southern reach were dominated by phytoplankton and benthic microalgal production. River loading of organic matter was an additional important factor in the northern reach. Tidal marsh export and point sources played a secondary role. Autochthonous production in San Francisco Bay appears to be less than the mean for temperate-zone estuaries, primarily because turbidity limits microalgal production and the development of seagrass beds. Exchange between the Bay and Pacific Ocean plays an unknown but potentially important role in the organic carbon balance. Interannual variability in the organic carbon supply was assessed for Suisun Bay, a northern reach subembayment that provides habitat for important fish species (delta smelt Hypomesus transpacificus and larval striped bass Morone saxatilis). The total supply fluctuated by an order of magnitude; depending on the year, either autochthonous sources (phytoplankton production) or allochthonous sources (riverine loading) could be dominant. The primary cause of the year-to-year change was variability of freshwater inflows from the Sacramento and San Joaquin rivers, and its magnitude was much larger than long-term changes arising from marsh destruction and point source decreases. Although interannual variability of the total organic carbon supply could not be assessed for the southern reach, year-to-year changes in phytoplankton production were much smaller than in Suisun Bay, reflecting a relative lack of river influence.

  15. Applying the WHO recommendations on health-sector response to violence against women to assess the Spanish health system. A mixed methods approach.

    PubMed

    Goicolea, Isabel; Vives-Cases, Carmen; Minvielle, Fauhn; Briones-Vozmediano, Erica; Ohman, Ann

    2014-01-01

    This methodological note describes the development and application of a mixed-methods protocol to assess the responsiveness of Spanish health systems to violence against women in Spain, based on the World Health Organization (WHO) recommendations. Five areas for exploration were identified based on the WHO recommendations: policy environment, protocols, training, accountability/monitoring, and prevention/promotion. Two data collection instruments were developed to assess the situation of 17 Spanish regional health systems (RHS) with respect to these areas: 1) a set of indicators to guide a systematic review of secondary sources, and 2) an interview guide to be used with 26 key informants at the regional and national levels. We found differences between RHSs in the five areas assessed. The progress of RHSs on the WHO recommendations was notable at the level of policies, moderate in terms of health service delivery, and very limited in terms of preventive actions. Using a mixed-methods approach was useful for triangulation and complementarity during instrument design, data collection and interpretation. Copyright © 2013 SESPAS. Published by Elsevier Espana. All rights reserved.

  16. Hydro power flexibility for power systems with variable renewable energy sources: an IEA Task 25 collaboration: Hydro power flexibility for power systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huertas-Hernando, Daniel; Farahmand, Hossein; Holttinen, Hannele

    2016-06-20

    Hydro power is one of the most flexible sources of electricity production. Power systems with considerable amounts of flexible hydro power potentially offer easier integration of variable generation, e.g., wind and solar. However, there exist operational constraints to ensure mid-/long-term security of supply while keeping river flows and reservoir levels within permitted limits. In order to properly assess the effective available hydro power flexibility and its value for storage, a detailed assessment of hydro power is essential. Due to the inherent uncertainty of the weather-dependent hydrological cycle, regulation constraints on the hydro system, and uncertainty of internal load as well as variable generation (wind and solar), this assessment is complex. Hence, it requires proper modeling of all the underlying interactions between hydro power and the power system, with a large share of other variable renewables. A summary of existing experience of wind integration in hydro-dominated power systems clearly points to strict simulation methodologies. Recommendations include requirements for techno-economic models to correctly assess strategies for hydro power and pumped storage dispatch. These models are based not only on seasonal water inflow variations but also on variable generation, and all these are in time horizons from very short term up to multiple years, depending on the studied system. Another important recommendation is to include a geographically detailed description of hydro power systems, rivers' flows, and reservoirs as well as grid topology and congestion.

  17. Ancient Glass: A Literature Search and its Role in Waste Management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Strachan, Denis M.; Pierce, Eric M.

    2010-07-01

    When developing a performance assessment model for the long-term disposal of immobilized low-activity waste (ILAW) glass, it is desirable to determine the durability of glass forms over very long periods of time. However, testing is limited to short time spans, so experiments are performed under conditions that accelerate the key geochemical processes that control weathering. Verification that the models currently being used can reliably calculate the long-term behavior of ILAW glass is a key component of the overall PA strategy. Therefore, Pacific Northwest National Laboratory was contracted by Washington River Protection Solutions, LLC to evaluate alternative strategies that can be used for PA source term model validation. One viable alternative strategy is the use of independent experimental data from archaeological studies of ancient or natural glass contained in the literature. These results represent potential independent experiments that date back to approximately 3600 years ago, or 1600 before the current era (bce), in the case of ancient glass, and 10⁶ years or older in the case of natural glass. The results of this literature review suggest that additional experimental data may be needed before the results from archaeological studies can be used as a tool for model validation of glass weathering and, more specifically, disposal facility performance. This is largely because none of the existing data sets contains all of the information required to conduct PA source term calculations. For example, in many cases the sediments surrounding the glass were not collected and analyzed; therefore, the data required to compare computer simulations of concentration flux are not available. This type of information is important for understanding the element release profile from the glass to the surrounding environment and provides a metric that can be used to calibrate source term models. Although useful, the available literature sources do not contain the information needed to simulate the long-term performance of nuclear waste glasses in near-surface or deep geologic repositories. The information that will be required includes 1) experimental measurements to quantify the model parameters, 2) detailed analyses of altered glass samples, and 3) detailed analyses of the sediment surrounding the ancient glass samples.

  18. Mass discharge assessment at a brominated DNAPL site: Effects of known DNAPL source mass removal

    NASA Astrophysics Data System (ADS)

    Johnston, C. D.; Davis, G. B.; Bastow, T. P.; Woodbury, R. J.; Rao, P. S. C.; Annable, M. D.; Rhodes, S.

    2014-08-01

    Management and closure of contaminated sites is increasingly being proposed on the basis of mass flux of dissolved contaminants in groundwater. Better understanding of the links between source mass removal and contaminant mass fluxes in groundwater would allow greater acceptance of this metric in dealing with contaminated sites. Our objectives here were to show how measurements of the distribution of contaminant mass flux and the overall mass discharge emanating from the source under undisturbed groundwater conditions could be related to the processes and extent of source mass depletion. In addition, these estimates of mass discharge were sought in the application of agreed remediation targets set in terms of pumped groundwater quality from offsite wells. Results are reported from field studies conducted over a 5-year period at a brominated DNAPL (tetrabromoethane, TBA; and tribromoethene, TriBE) site located in suburban Perth, Western Australia. Groundwater fluxes (qw; L³/L²/T) and mass fluxes (Jc; M/L²/T) of dissolved brominated compounds were simultaneously estimated by deploying Passive Flux Meters (PFMs) in wells in a heterogeneous layered aquifer. PFMs were deployed in control plane (CP) wells immediately down-gradient of the source zone, before (2006) and after (2011) 69-85% of the source mass was removed, mainly by groundwater pumping from the source zone. The high-resolution (26-cm depth interval) measures of qw and Jc along the source CP allowed investigation of the DNAPL source-zone architecture and impacts of source mass removal. Comparable estimates of total mass discharge (MD; M/T) across the source zone CP reduced from 104 g day⁻¹ to 24-31 g day⁻¹ (70-77% reductions). Importantly, this mass discharge reduction was consistent with the estimated proportion of source mass remaining at the site (15-31%). That is, a linear relationship between mass discharge and source mass is suggested. The spatial detail of groundwater and mass flux distributions also provided further evidence of the source zone architecture and DNAPL mass depletion processes. This was especially apparent in different mass-depletion rates from distinct parts of the CP. High mass fluxes and groundwater fluxes located near the base of the aquifer dominated in terms of the dissolved mass flux in the profile, although not in terms of concentrations. Reductions observed in Jc and MD were used to better target future remedial efforts. Integration of the observations from the PFM deployments and the source mass depletion provided a basis for establishing flux-based management criteria for the site.
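
    The mass discharge figure quoted above is, in essence, the PFM mass fluxes integrated over the control-plane area. The sketch below shows that bookkeeping with invented flux values; the interval height, well spacing, and before/after discharges are illustrative placeholders, not site data.

    ```python
    import numpy as np

    # Hedged sketch: integrating Passive Flux Meter (PFM) mass-flux measurements over a
    # control plane to estimate total mass discharge MD (g/day). The flux values, interval
    # height and well spacing below are illustrative placeholders, not site data.

    interval_height_m = 0.26          # vertical resolution of the PFM measurements
    well_spacing_m = 2.0              # horizontal width assigned to each monitoring well

    # Jc[i, j]: mass flux (g/m^2/day) at depth interval i in well j
    Jc = np.array([
        [0.5, 1.2, 0.8],
        [2.0, 4.5, 3.1],
        [9.8, 15.0, 11.2],            # high-flux zone near the base of the aquifer
    ])

    cell_area_m2 = interval_height_m * well_spacing_m
    MD = float(np.sum(Jc) * cell_area_m2)           # g/day
    print(f"total mass discharge across the control plane: {MD:.1f} g/day")

    # A before/after comparison (as in the 2006 vs 2011 deployments) is then just the
    # ratio of two MD estimates (numbers below are illustrative):
    MD_before, MD_after = 104.0, 28.0               # g/day
    print(f"reduction: {100 * (1 - MD_after / MD_before):.0f}%")
    ```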

  19. Mass discharge assessment at a brominated DNAPL site: Effects of known DNAPL source mass removal.

    PubMed

    Johnston, C D; Davis, G B; Bastow, T P; Woodbury, R J; Rao, P S C; Annable, M D; Rhodes, S

    2014-08-01

    Management and closure of contaminated sites is increasingly being proposed on the basis of mass flux of dissolved contaminants in groundwater. Better understanding of the links between source mass removal and contaminant mass fluxes in groundwater would allow greater acceptance of this metric in dealing with contaminated sites. Our objectives here were to show how measurements of the distribution of contaminant mass flux and the overall mass discharge emanating from the source under undisturbed groundwater conditions could be related to the processes and extent of source mass depletion. In addition, these estimates of mass discharge were sought in the application of agreed remediation targets set in terms of pumped groundwater quality from offsite wells. Results are reported from field studies conducted over a 5-year period at a brominated DNAPL (tetrabromoethane, TBA; and tribromoethene, TriBE) site located in suburban Perth, Western Australia. Groundwater fluxes (qw; L³/L²/T) and mass fluxes (Jc; M/L²/T) of dissolved brominated compounds were simultaneously estimated by deploying Passive Flux Meters (PFMs) in wells in a heterogeneous layered aquifer. PFMs were deployed in control plane (CP) wells immediately down-gradient of the source zone, before (2006) and after (2011) 69-85% of the source mass was removed, mainly by groundwater pumping from the source zone. The high-resolution (26-cm depth interval) measures of qw and Jc along the source CP allowed investigation of the DNAPL source-zone architecture and impacts of source mass removal. Comparable estimates of total mass discharge (MD; M/T) across the source zone CP reduced from 104 g day⁻¹ to 24-31 g day⁻¹ (70-77% reductions). Importantly, this mass discharge reduction was consistent with the estimated proportion of source mass remaining at the site (15-31%). That is, a linear relationship between mass discharge and source mass is suggested. The spatial detail of groundwater and mass flux distributions also provided further evidence of the source zone architecture and DNAPL mass depletion processes. This was especially apparent in different mass-depletion rates from distinct parts of the CP. High mass fluxes and groundwater fluxes located near the base of the aquifer dominated in terms of the dissolved mass flux in the profile, although not in terms of concentrations. Reductions observed in Jc and MD were used to better target future remedial efforts. Integration of the observations from the PFM deployments and the source mass depletion provided a basis for establishing flux-based management criteria for the site. Copyright © 2013 Elsevier B.V. All rights reserved.

  20. Assessing risk of non-compliance of phosphorus standards for lakes in England and Wales

    NASA Astrophysics Data System (ADS)

    Duethmann, D.; Anthony, S.; Carvalho, L.; Spears, B.

    2009-04-01

    High population densities, use of inorganic fertilizer and intensive livestock agriculture have increased phosphorus loads to lakes, and accelerated eutrophication is a major pressure for many lakes. The EC Water Framework Directive (WFD) requires that good chemical and ecological quality is restored in all surface water bodies by 2015. Total phosphorus (TP) standards for lakes in England and Wales have been agreed recently, and our aim was to estimate what percentage of lakes in England and Wales is at risk of failing these standards. With measured lake phosphorus concentrations only being available for a small number of lakes, such an assessment had to be model based. The study also makes a source apportionment of phosphorus inputs into lakes. Phosphorus loads were estimated from a range of sources including agricultural loads, sewage effluents, septic tanks, diffuse urban sources, atmospheric deposition, groundwater and bank erosion. Lake phosphorus concentrations were predicted using the Vollenweider model, and the model framework was satisfactorily tested against available observed lake concentration data. Even though predictions for individual lakes remain uncertain, results for a population of lakes are considered sufficiently robust. A scenario analysis was carried out to investigate to what extent reductions in phosphorus loads would increase the number of lakes achieving good ecological status in terms of TP standards. Applying the model to all lakes in England and Wales greater than 1 ha, it was calculated that under current conditions roughly two thirds of the lakes would fail good ecological status with respect to phosphorus. According to our estimates, agricultural phosphorus loads represent the most frequent dominant source for the majority of catchments, but diffuse urban runoff is also important in many lakes. Sewage effluents are the most frequent dominant source for large lake catchments greater than 100 km². An evaluation in terms of total load can be misleading about which sources need to be tackled by catchment management for most of the lakes. For example, sewage effluents are responsible for the majority of the total load but are the dominant source in only a small number of larger lake catchments. If loads from all sources were halved, the number of complying lakes would potentially increase to two thirds, but this would require substantial measures to reduce phosphorus inputs to lakes. For agriculture, the required changes would have to go beyond improvements in agricultural practice and would need to include reducing the intensity of land use. The time required for many lakes to respond to reduced nutrient loading is likely to extend beyond the current timelines of the WFD due to internal loading and biological resistance.
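
    For readers unfamiliar with the lake model referred to above, the sketch below shows one common Vollenweider-type formulation relating in-lake total phosphorus to the areal load and hydraulic residence time; the exact variant and coefficients used in the study may differ, and the inputs are illustrative only.

    ```python
    import math

    # Hedged sketch of a common Vollenweider-type formulation for predicting mean lake total
    # phosphorus (TP) from the external load. The specific variant and coefficients used in
    # the study may differ; the inputs below are illustrative only.

    def lake_tp_ugL(areal_load_mg_m2_yr: float, mean_depth_m: float, residence_time_yr: float) -> float:
        """Predicted in-lake TP (ug/L = mg/m^3): P = L / (q_s * (1 + sqrt(tau_w)))."""
        qs = mean_depth_m / residence_time_yr              # areal hydraulic load, m/yr
        return areal_load_mg_m2_yr / (qs * (1.0 + math.sqrt(residence_time_yr)))

    # Example: a shallow lake with a 0.5-year residence time receiving 500 mg P/m^2/yr
    tp = lake_tp_ugL(areal_load_mg_m2_yr=500.0, mean_depth_m=3.0, residence_time_yr=0.5)
    print(f"predicted TP: {tp:.0f} ug/L")   # compared against the class-specific TP standard
    ```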

  1. Radiological analysis of plutonium glass batches with natural/enriched boron

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rainisch, R.

    2000-06-22

    The disposition of surplus plutonium inventories by the US Department of Energy (DOE) includes the immobilization of certain plutonium materials in a borosilicate glass matrix, also referred to as vitrification. This paper addresses source terms of plutonium masses immobilized in a borosilicate glass matrix where the glass components include both natural boron and enriched boron. The calculated source terms pertain to neutron and gamma source strength (particles per second) and source spectrum changes. The calculated source terms corresponding to natural boron and enriched boron are compared to determine the benefits (decrease in radiation source terms) of using enriched boron. The analysis of plutonium glass source terms shows that a large component of the neutron source terms is due to (α,n) reactions. The americium-241 and plutonium present in the glass emit alpha particles (α). These alpha particles interact with low-Z nuclides such as B-11, B-10, and O-17 in the glass to produce neutrons. The low-Z nuclides are referred to as target particles. The reference glass contains 9.4 wt percent B₂O₃. Boron-11 was found to strongly support the (α,n) reactions in the glass matrix; B-11 has a natural abundance of over 80 percent. The (α,n) reaction rates for B-10 are lower than for B-11, and the analysis shows that the plutonium glass neutron source terms can be reduced by artificially enriching natural boron in B-10. The natural abundance of B-10 is 19.9 percent. Boron enriched to 96 wt percent B-10 or above can be obtained commercially. Since lower source terms imply lower dose rates to radiation workers handling the plutonium glass materials, it is important to know the achievable decrease in source terms as a result of boron enrichment. Plutonium materials are normally handled in glove boxes with shielded glass windows, and the work entails both extremity and whole-body exposures. Lowering the source terms of the plutonium batches will make the handling of these materials less difficult and will reduce radiation exposure to operating workers.
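
    The benefit of B-10 enrichment can be pictured as an isotope-weighted (α,n) yield. The sketch below shows only the bookkeeping; the per-isotope relative yields are placeholders chosen to reflect the qualitative statement that B-11 supports (α,n) production more strongly than B-10, and real thick-target yields from nuclear data would be needed for an actual source term.

    ```python
    # Hedged sketch: relative (alpha,n) neutron production for natural vs B-10-enriched boron.
    # The per-isotope yields are placeholder relative numbers chosen only to show the
    # bookkeeping; actual thick-target yields must come from nuclear data evaluations.

    RELATIVE_ALPHA_N_YIELD = {"B-10": 1.0, "B-11": 2.5}   # placeholder: B-11 assumed higher

    def relative_neutron_source(b10_atom_fraction: float) -> float:
        """Isotope-weighted relative (alpha,n) yield for a given B-10 atom fraction."""
        b11_fraction = 1.0 - b10_atom_fraction
        return (b10_atom_fraction * RELATIVE_ALPHA_N_YIELD["B-10"]
                + b11_fraction * RELATIVE_ALPHA_N_YIELD["B-11"])

    natural = relative_neutron_source(0.199)    # natural boron: 19.9 at.% B-10
    enriched = relative_neutron_source(0.96)    # commercially available enrichment: 96 at.% B-10
    print(f"relative reduction in the boron (alpha,n) component: {100 * (1 - enriched / natural):.0f}%")
    ```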

  2. Best Practices in Physics-Based Fault Rupture Models for Seismic Hazard Assessment of Nuclear Installations

    NASA Astrophysics Data System (ADS)

    Dalguer, Luis A.; Fukushima, Yoshimitsu; Irikura, Kojiro; Wu, Changjiang

    2017-09-01

    Inspired by the first workshop on Best Practices in Physics-Based Fault Rupture Models for Seismic Hazard Assessment of Nuclear Installations (BestPSHANI) conducted by the International Atomic Energy Agency (IAEA) on 18-20 November 2015 in Vienna (http://www-pub.iaea.org/iaeameetings/50896/BestPSHANI), this PAGEOPH topical volume collects several extended articles from this workshop as well as several new contributions. A total of 17 papers have been selected on topics ranging from the seismological aspects of earthquake cycle simulations for source-scaling evaluation, seismic source characterization, source inversion and ground motion modeling (based on finite fault rupture using dynamic, kinematic, stochastic and empirical Green's functions approaches) to the engineering application of simulated ground motion for the analysis of seismic response of structures. These contributions include applications to real earthquakes and descriptions of current practice to assess seismic hazard in terms of nuclear safety in low seismicity areas, as well as proposals for physics-based hazard assessment for critical structures near large earthquakes. Collectively, the papers of this volume highlight the usefulness of physics-based models to evaluate and understand the physical causes of observed and empirical data, as well as to predict ground motion beyond the range of recorded data. Particular importance is given to the validation and verification of the models by comparing synthetic results with observed data and empirical models.

  3. Probabilistic seismic hazard assessments of Sabah, east Malaysia: accounting for local earthquake activity near Ranau

    NASA Astrophysics Data System (ADS)

    Khalil, Amin E.; Abir, Ismail A.; Ginsos, Hanteh; Abdel Hafiez, Hesham E.; Khan, Sohail

    2018-02-01

    Sabah state in eastern Malaysia, unlike most other Malaysian states, is characterized by relatively frequent seismological activity; an earthquake of moderate magnitude is generally experienced roughly every 20 years, originating mainly from two major sources, either a local source (e.g. Ranau and Lahad Datu) or a regional source (e.g. the Kalimantan and South Philippines subductions). The seismicity map of Sabah shows two zones of distinctive seismicity, near Ranau (close to Kota Kinabalu) and Lahad Datu in the southeast of Sabah. The seismicity record of Ranau begins in 1991, according to the international seismicity bulletins (e.g. the United States Geological Survey and the International Seismological Center), and this short record is not sufficient for seismic source characterization. Fortunately, active Quaternary fault systems have been delineated in the area, so the seismicity of the area is represented as line sources referring to these faults. Two main fault systems are believed to be the source of this activity, namely the Mensaban fault zone and the Crocker fault zone, in addition to some other faults in their vicinity. Seismic hazard assessment has become an important need for the extensive development projects in Sabah, especially given the presence of earthquake activity. A probabilistic seismic hazard assessment is adopted for the present work since it can provide the probability of various ground motion levels expected from future large earthquakes. The results are presented in terms of spectral acceleration curves and uniform hazard curves for return periods of 500, 1000 and 2500 years. Since this is the first time a complete hazard study has been done for the area, the output will serve as a baseline and standard for future strategic planning in the area.
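
    The return periods quoted above map to probabilities of exceedance over an exposure time under the Poisson assumption that is standard in probabilistic seismic hazard assessment. The short sketch below illustrates the conversion; the 50-year exposure time is an assumed example.

    ```python
    import math

    # Hedged sketch: under the usual Poisson assumption in probabilistic seismic hazard
    # assessment, an annual exceedance rate (the inverse of the return period) maps to a
    # probability of exceedance over an exposure time T as P = 1 - exp(-rate * T).

    def prob_of_exceedance(return_period_yr: float, exposure_yr: float = 50.0) -> float:
        rate = 1.0 / return_period_yr
        return 1.0 - math.exp(-rate * exposure_yr)

    for rp in (500, 1000, 2500):
        print(f"{rp:>5}-yr return period -> {100 * prob_of_exceedance(rp):.1f}% in 50 years")
    # e.g. a ~2500-yr return period corresponds to roughly 2% probability of exceedance in 50 years
    ```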

  4. Comparative assessment of the global fate and transport pathways of long-chain perfluorocarboxylic acids (PFCAs) and perfluorocarboxylates (PFCs) emitted from direct sources.

    PubMed

    Armitage, James M; Macleod, Matthew; Cousins, Ian T

    2009-08-01

    A global-scale multispecies mass balance model was used to simulate the long-term fate and transport of perfluorocarboxylic acids (PFCAs) with eight to thirteen carbons (C8-C13) and their conjugate bases, the perfluorocarboxylates (PFCs). The main purpose of this study was to assess the relative long-range transport (LRT) potential of each conjugate pair, collectively termed PFC(A)s, considering emissions from direct sources (i.e., manufacturing and use) only. Overall LRT potential (atmospheric + oceanic) varied as a function of chain length and depended on assumptions regarding pKa and mode of entry. Atmospheric transport makes a relatively higher contribution to overall LRT potential for PFC(A)s with longer chain length, reflecting the increasing trend in the air-water partition coefficient (K(AW)) of the neutral PFCA species with chain length. Model scenarios using estimated direct emissions of the C8, C9, and C11 PFC(A)s indicate that the mass fluxes to the Arctic marine environment associated with oceanic transport are in excess of mass fluxes from indirect sources (i.e., atmospheric transport of precursor substances such as fluorotelomer alcohols and subsequent degradation to PFCAs). Modeled concentrations of C8 and C9 in the abiotic environment are broadly consistent with available monitoring data in surface ocean waters. Furthermore, the modeled concentration ratios of C8 to C9 are reconcilable with the homologue pattern frequently observed in biota, assuming a positive correlation between bioaccumulation potential and chain length. Modeled concentration ratios of C11 to C10 are more difficult to reconcile with monitoring data in both source and remote regions. Our model results for C11 and C10 therefore imply that either (i) indirect sources are dominant or (ii) estimates of direct emission are not accurate for these homologues.
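
    The dependence on pKa noted above enters through the fraction of the acid present in its neutral, volatile form, which scales the effective air-water partitioning. The sketch below illustrates this speciation bookkeeping; the pKa, pH, and K_AW values are placeholders, not the evaluated property data used in the model.

    ```python
    # Hedged sketch: the effective (bulk) air-water partitioning of a PFCA depends on the
    # fraction present as the neutral acid, the perfluorocarboxylate anion being assumed
    # non-volatile. pKa and K_AW values below are illustrative placeholders, not evaluated data.

    def neutral_fraction(pH: float, pKa: float) -> float:
        """Henderson-Hasselbalch: fraction of the acid in its neutral (protonated) form."""
        return 1.0 / (1.0 + 10.0 ** (pH - pKa))

    def effective_kaw(kaw_neutral: float, pH: float, pKa: float) -> float:
        return kaw_neutral * neutral_fraction(pH, pKa)

    # Ocean surface water at pH ~8.1; placeholder pKa of 0.5 and neutral-species K_AW of 1e-2
    f = neutral_fraction(pH=8.1, pKa=0.5)
    print(f"neutral fraction: {f:.1e}")                      # almost fully dissociated
    print(f"effective K_AW:   {effective_kaw(1e-2, 8.1, 0.5):.1e}")
    ```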

  5. Fission Product Appearance Rate Coefficients in Design Basis Source Term Determinations - Past and Present

    NASA Astrophysics Data System (ADS)

    Perez, Pedro B.; Hamawi, John N.

    2017-09-01

    Nuclear power plant radiation protection design features are based on radionuclide source terms derived from conservative assumptions that envelope expected operating experience. Two parameters that significantly affect the radionuclide concentrations in the source term are the failed fuel fraction and the effective fission product appearance rate coefficients. The failed fuel fraction may be a regulatory-based assumption, as in the U.S. Appearance rate coefficients are not specified in regulatory requirements, but have been referenced to experimental data that are over 50 years old. The source terms are no doubt conservative, as demonstrated by operating experience that has included failed fuel, but they may be too conservative, leading, for example, to over-designed shielding for normal operations. Design basis source term methodologies for normal operations had not advanced until EPRI published an updated ANSI/ANS 18.1 source term basis document in 2015. Our paper revisits the fission product appearance rate coefficients as applied in the derivation of source terms following the original U.S. NRC NUREG-0017 methodology. New coefficients have been calculated based on recent EPRI results, which demonstrate the conservatism in nuclear power plant shielding design.
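
    The role of an appearance rate coefficient can be illustrated with a generic first-order coolant activity balance: a nuclide appears in the coolant at some rate and is removed by decay and cleanup, giving a steady-state concentration. The sketch below is only that generic balance, not the NUREG-0017 or ANSI/ANS-18.1 equations, and every number in it is an illustrative placeholder.

    ```python
    import math

    # Hedged sketch: a generic first-order activity balance of the kind underlying design-basis
    # coolant source terms. A nuclide appears in the coolant at some rate (driven by the failed
    # fuel fraction and its appearance rate coefficient) and is removed by decay and cleanup.
    # All numbers below are illustrative placeholders.

    def steady_state_coolant_activity(appearance_rate_uCi_per_s: float,
                                      decay_const_per_s: float,
                                      cleanup_const_per_s: float,
                                      coolant_mass_g: float) -> float:
        """Steady-state specific activity (uCi/g) = appearance rate / (total removal * mass)."""
        total_removal = decay_const_per_s + cleanup_const_per_s
        return appearance_rate_uCi_per_s / (total_removal * coolant_mass_g)

    half_life_s = 8.0 * 24 * 3600                     # an ~8-day half-life nuclide, for example
    decay = math.log(2) / half_life_s
    activity = steady_state_coolant_activity(appearance_rate_uCi_per_s=5.0e3,
                                              decay_const_per_s=decay,
                                              cleanup_const_per_s=2.0e-5,
                                              coolant_mass_g=2.5e8)
    print(f"steady-state coolant activity: {activity:.2e} uCi/g")
    ```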

  6. Elemental composition and size distribution of particulates in Cleveland, Ohio

    NASA Technical Reports Server (NTRS)

    King, R. B.; Fordyce, J. S.; Neustadter, H. E.; Leibecki, H. F.

    1975-01-01

    Measurements were made of the elemental particle size distribution at five contrasting urban environments with different source-type distributions in Cleveland, Ohio. Air quality conditions ranged from normal to air pollution alert levels. A parallel network of high-volume cascade impactors (5-stage) was used for simultaneous sampling on glass fiber surfaces for mass determinations and on Whatman-41 surfaces for elemental analysis by neutron activation for 25 elements. The elemental data are assessed in terms of distribution functions and interrelationships and are compared between locations as a function of resultant wind direction in an attempt to relate the findings to sources.

  7. Elemental composition and size distribution of particulates in Cleveland, Ohio

    NASA Technical Reports Server (NTRS)

    Leibecki, H. F.; King, R. B.; Fordyce, J. S.; Neustadter, H. E.

    1975-01-01

    Measurements have been made of the elemental particle size distribution at five contrasting urban environments with different source-type distributions in Cleveland, Ohio. Air quality conditions ranged from normal to air pollution alert levels. A parallel network of high-volume cascade impactors (5-stage) were used for simultaneous sampling on glass fiber surfaces for mass determinations and on Whatman-41 surfaces for elemental analysis by neutron activation for 25 elements. The elemental data are assessed in terms of distribution functions and interrelationships and are compared between locations as a function of resultant wind direction in an attempt to relate the findings to sources.

  8. Do differences in future sulfate emission pathways matter for near-term climate? A case study for the Asian monsoon

    NASA Astrophysics Data System (ADS)

    Bartlett, Rachel E.; Bollasina, Massimo A.; Booth, Ben B. B.; Dunstone, Nick J.; Marenco, Franco; Messori, Gabriele; Bernie, Dan J.

    2018-03-01

    Anthropogenic aerosols could dominate over greenhouse gases in driving near-term hydroclimate change, especially in regions with high present-day aerosol loading such as Asia. Uncertainties in near-future aerosol emissions represent a potentially large, yet unexplored, source of ambiguity in climate projections for the coming decades. We investigated the near-term sensitivity of the Asian summer monsoon to aerosols by means of transient modelling experiments using HadGEM2-ES under two existing climate change mitigation scenarios selected to have similar greenhouse gas forcing, but to span a wide range of plausible global sulfur dioxide emissions. Increased sulfate aerosols, predominantly from East Asian sources, lead to large regional dimming through aerosol-radiation and aerosol-cloud interactions. This results in surface cooling and anomalous anticyclonic flow over land, while abating the western Pacific subtropical high. The East Asian monsoon circulation weakens and precipitation stagnates over Indochina, resembling the observed southern-flood-northern-drought pattern over China. Large-scale circulation adjustments drive suppression of the South Asian monsoon and a westward extension of the Maritime Continent convective region. Remote impacts across the Northern Hemisphere are also generated, including a northwestward shift of West African monsoon rainfall induced by the westward displacement of the Indian Ocean Walker cell, and temperature anomalies in northern midlatitudes linked to propagation of Rossby waves from East Asia. These results indicate that aerosol emissions are a key source of uncertainty in near-term projection of regional and global climate; a careful examination of the uncertainties associated with aerosol pathways in future climate assessments must be highly prioritised.

  9. 26 CFR 1.737-3 - Basis adjustments; Recovery rules.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... Properties A1, A2, and A3 is long-term, U.S.-source capital gain or loss. The character of gain on Property A4 is long-term, foreign-source capital gain. B contributes Property B, nondepreciable real property...-term, foreign-source capital gain ($3,000 total gain under section 737 × $2,000 net long-term, foreign...

  10. 26 CFR 1.737-3 - Basis adjustments; Recovery rules.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... Properties A1, A2, and A3 is long-term, U.S.-source capital gain or loss. The character of gain on Property A4 is long-term, foreign-source capital gain. B contributes Property B, nondepreciable real property...-term, foreign-source capital gain ($3,000 total gain under section 737 × $2,000 net long-term, foreign...

  11. 26 CFR 1.737-3 - Basis adjustments; Recovery rules.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... Properties A1, A2, and A3 is long-term, U.S.-source capital gain or loss. The character of gain on Property A4 is long-term, foreign-source capital gain. B contributes Property B, nondepreciable real property...-term, foreign-source capital gain ($3,000 total gain under section 737 × $2,000 net long-term, foreign...

  12. 26 CFR 1.737-3 - Basis adjustments; Recovery rules.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... Properties A1, A2, and A3 is long-term, U.S.-source capital gain or loss. The character of gain on Property A4 is long-term, foreign-source capital gain. B contributes Property B, nondepreciable real property...-term, foreign-source capital gain ($3,000 total gain under section 737 × $2,000 net long-term, foreign...

  13. 26 CFR 1.737-3 - Basis adjustments; Recovery rules.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... Properties A1, A2, and A3 is long-term, U.S.-source capital gain or loss. The character of gain on Property A4 is long-term, foreign-source capital gain. B contributes Property B, nondepreciable real property...-term, foreign-source capital gain ($3,000 total gain under section 737 × $2,000 net long-term, foreign...

  14. Household food waste separation behavior and the importance of convenience.

    PubMed

    Bernstad, Anna

    2014-07-01

    Two different strategies aimed at increasing household source-separation of food waste were assessed through a case study in a Swedish residential area: (a) use of written information, distributed as leaflets among households, and (b) installation of equipment for source-segregation of waste, with the aim of making food waste sorting in kitchens more convenient. Weighing of separately collected food waste before and after distribution of the written information suggests that it resulted in neither a significantly increased amount of separately collected food waste nor an increased source-separation ratio. After installation of sorting equipment in households, both the amount of separately collected food waste and the source-separation ratio increased substantially. Long-term monitoring shows that the results were long-lasting. The results emphasize convenience and the existence of the necessary infrastructure as important factors for household source-segregation of waste, but also highlight the need to address these aspects where the waste is generated, i.e. already inside the household. Copyright © 2014 Elsevier Ltd. All rights reserved.

  15. Aerosol Source Attributions and Source-Receptor Relationships Across the Northern Hemisphere

    NASA Technical Reports Server (NTRS)

    Bian, Huisheng; Chin, Mian; Kucsera, Tom; Pan, Xiaohua; Darmenov, Anton; Colarco, Peter; Torres, Omar; Shults, Michael

    2014-01-01

    Emissions and long-range transport of air pollution pose major concerns for air quality and climate change. To better assess the impact of intercontinental transport of air pollution on regional and global air quality, ecosystems, and near-term climate change, the UN Task Force on Hemispheric Transport of Air Pollution (HTAP) is organizing a phase II activity (HTAP2) that includes global and regional model experiments and data analysis, focusing on ozone and aerosols. This study presents the initial results of the HTAP2 global aerosol modeling experiments. We will (a) evaluate the model results against surface and aircraft measurements, (b) examine the relative contributions of regional emissions and extra-regional sources to surface PM concentrations and column aerosol optical depth (AOD) over several NH pollution and dust source regions and the Arctic, and (c) quantify the source-receptor relationships in the pollution regions, which reflect the sensitivity of regional aerosol amounts to regional and extra-regional emission reductions.

  16. Landscape pattern metrics and regional assessment

    USGS Publications Warehouse

    O'Neill, R. V.; Riitters, K.H.; Wickham, J.D.; Jones, K.B.

    1999-01-01

    The combination of remote imagery data, geographic information systems software, and landscape ecology theory provides a unique basis for monitoring and assessing large-scale ecological systems. The unique feature of the work has been the need to develop and interpret quantitative measures of spatial pattern: the landscape indices. This article reviews what is known about the statistical properties of these pattern metrics and suggests some additional metrics based on island biogeography, percolation theory, hierarchy theory, and economic geography. Assessment applications of this approach have required interpreting the pattern metrics in terms of specific environmental endpoints, such as wildlife and water quality, and research into how to represent synergistic effects of many overlapping sources of stress.

  17. Near Field Modeling for the Maule Tsunami from DART, GPS and Finite Fault Solutions (Invited)

    NASA Astrophysics Data System (ADS)

    Arcas, D.; Chamberlin, C.; Lagos, M.; Ramirez-Herrera, M.; Tang, L.; Wei, Y.

    2010-12-01

    The earthquake and tsunami of February 27, 2010, in central Chile have rekindled an interest in developing techniques to predict the impact of near-field tsunamis along the Chilean coastline. Following the earthquake, several initiatives were proposed to increase the density of seismic, pressure and motion sensors along the South American trench, in order to provide field data that could be used to estimate tsunami impact on the coast. However, the precise use of those data in the elaboration of a quantitative assessment of coastal tsunami damage has not been clarified. The present work makes use of seismic data, Deep-ocean Assessment and Reporting of Tsunamis (DART®) systems, and GPS measurements obtained during the Maule earthquake to initialize a number of tsunami inundation models along the rupture area by expressing different versions of the seismic crustal deformation in terms of NOAA’s tsunami unit source functions. Translation of all available real-time data into a feasible tsunami source is essential in near-field tsunami impact prediction, in which an impact assessment must be generated under very stringent time constraints. Inundation results from each source are then contrasted with field and tide gauge data by comparing arrival time, maximum wave height, maximum inundation and tsunami decay rate, using field data collected by the authors.

  18. Laser Scanning Systems and Techniques in Rockfall Source Identification and Risk Assessment: A Critical Review

    NASA Astrophysics Data System (ADS)

    Fanos, Ali Mutar; Pradhan, Biswajeet

    2018-04-01

    Rockfall poses a risk to people, their property and transportation routes in mountainous and hilly regions. The hazard is characterized by wide spatial distribution, sudden occurrence, variable magnitude, high lethality and randomness, so predicting the rockfall phenomenon both spatially and temporally is a challenging task. A digital terrain model (DTM) is one of the most significant elements in rockfall source identification and risk assessment, and light detection and ranging (LiDAR) is the most advanced and effective technique for deriving high-resolution, accurate DTMs. This paper presents a critical overview of the rockfall phenomenon (definition, triggering factors, motion modes and modeling) and of the LiDAR technique in terms of data pre-processing, DTM generation and the factors that can be obtained from this technique for rockfall source identification and risk assessment. It also reviews the existing methods that are utilized for the evaluation of rockfall trajectories and their characteristics (frequency, velocity, bouncing height and kinetic energy), probability, susceptibility, hazard and risk. Detailed consideration is given to quantitative methodologies in addition to qualitative ones. Various methods are demonstrated with respect to their application scales (local and regional). Additionally, attention is given to the latest improvements, particularly the consideration of the intensity of the phenomena and the magnitude of the events at chosen sites.

  19. [Features of control of electromagnetic radiation emitted by personal computers].

    PubMed

    Pal'tsev, Iu P; Buzov, A L; Kol'chugin, Iu I

    1996-01-01

    Measurements of PC electromagnetic radiation show that the main sources are PC units emitting waves at particular frequencies. Using wide-band detectors that measure overall field intensity to assess PC electromagnetic radiation gives unreliable results; more precise measurements with selective devices are required. It is therefore expedient to introduce the term "spectral density of field intensity" and a corresponding maximum allowable level. In this approach, the frequency spectrum of PC electromagnetic radiation is divided into four ranges; in one of them the field intensity is calculated for each harmonic frequency, while in the others the spectral density of field intensity is assessed.

  20. Algae Biofuels Co-Location Assessment Tool for Canada

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2011-11-29

    The Algae Biofuels Co-Location Assessment Tool for Canada uses chemical stoichiometry to estimate nitrogen, phosphorus, and carbon atom availability from wastewater and carbon dioxide emission streams, and the requirements for those same elements to produce a unit of algae. This information is then combined to find the limiting nutrient and to estimate the potential productivity associated with wastewater and carbon dioxide sources. Output is visualized in terms of distributions or spatial locations. Distances are calculated between points of interest in the model using the great-circle distance equation, and the smallest distances are found by an exhaustive search-and-sort algorithm.
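
    The distance step lends itself to a short illustration. The following is a minimal sketch, not the tool itself: it computes great-circle (haversine) distances between a candidate algae site and emission sources and picks the nearest one by exhaustive search. The coordinates and function names are hypothetical.

        # Great-circle distances between a candidate site and emission sources,
        # followed by an exhaustive nearest-source search (illustrative only).
        from math import radians, sin, cos, asin, sqrt

        def great_circle_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
            """Haversine form of the great-circle distance between two points given in degrees."""
            phi1, phi2 = radians(lat1), radians(lat2)
            dphi = radians(lat2 - lat1)
            dlam = radians(lon2 - lon1)
            a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
            return 2 * radius_km * asin(sqrt(a))

        def nearest_source(site, sources):
            """Return (distance_km, source) for the source closest to the site."""
            return min((great_circle_km(site[0], site[1], s[0], s[1]), s) for s in sources)

        site = (49.25, -123.10)                         # candidate algae site (lat, lon)
        sources = [(49.90, -119.50), (50.45, -104.61)]  # CO2 / wastewater sources (lat, lon)
        print(nearest_source(site, sources))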

  1. Methods Used to Assess the Susceptibility to Contamination of Transient, Non-Community Public Ground-Water Supplies in Indiana

    USGS Publications Warehouse

    Arihood, Leslie D.; Cohen, David A.

    2006-01-01

    The Safe Drinking Water Act of 1974, as amended in 1996, gave each State the responsibility of developing a Source-Water Assessment Plan (SWAP) that is designed to protect public-water supplies from contamination. Each SWAP must include three elements: (1) a delineation of the source-water protection area, (2) an inventory of potential sources of contaminants within the area, and (3) a determination of the susceptibility of the public-water supply to contamination from the inventoried sources. The Indiana Department of Environmental Management (IDEM) was responsible for preparing a SWAP for all public-water supplies in Indiana, including about 2,400 small public ground-water supplies that are designated transient, non-community (TNC) supplies. In cooperation with IDEM, the U.S. Geological Survey compiled information on conditions near the TNC supplies and helped IDEM complete source-water assessments for each TNC supply. The delineation of a source-water protection area (called the assessment area) for each TNC ground-water supply was defined by IDEM as a circular area enclosed by a 300-foot radius centered at the TNC supply well. Contaminants of concern (COCs) were defined by IDEM as any of the 90 contaminants for which the U.S. Environmental Protection Agency has established primary drinking-water standards. Two of these, nitrate as nitrogen and total coliform bacteria, are Indiana State-regulated contaminants for TNC water supplies. IDEM representatives identified potential point and nonpoint sources of COCs within the assessment area, and computer database retrievals were used to identify potential point sources of COCs in the area outside the assessment area. Two types of methods, subjective and subjective hybrid, were used in the SWAP to determine susceptibility to contamination. Subjective methods involve decisions based upon professional judgment, prior experience, and (or) the application of a fundamental understanding of processes without the collection and analysis of data for a specific condition. Subjective hybrid methods combine subjective methods with quantitative hydrologic analyses. The subjective methods included an inventory of potential sources and associated contaminants, and a qualitative description of the inherent susceptibility of the area around the TNC supply. The description relies on a classification of the hydrogeologic and geomorphic characteristics of the general area around the TNC supply in terms of its surficial geology, regional aquifer system, the occurrence of fine- and coarse-grained geologic materials above the screen of the TNC well, and the potential for infiltration of contaminants. The subjective hybrid method combined the results of a logistic regression analysis with a subjective analysis of susceptibility and a subjective set of definitions that classify the thickness of fine-grained geologic materials above the screen of a TNC well in terms of impedance to vertical flow. The logistic regression determined the probability of elevated concentrations of nitrate as nitrogen (greater than or equal to 3 milligrams per liter) in ground water associated with specific thicknesses of fine-grained geologic materials above the screen of a TNC well. In this report, fine-grained geologic materials are referred to as a geologic barrier that generally impedes vertical flow through an aquifer. 
A geologic barrier was defined to be thin for fine-grained materials between 0 and 45 feet thick, moderate for materials between 45 and 75 feet thick, and thick if the fine-grained materials were greater than 75 feet thick. A flow chart was used to determine the susceptibility rating for each TNC supply. The flow chart indicated a susceptibility rating using (1) concentrations of nitrate as nitrogen and total coliform bacteria reported from routine compliance monitoring of the TNC supply, (2) the presence or absence of potential sources of regulated contaminants (nitrate as nitrogen and coliform bac
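
    The logistic-regression element of the subjective hybrid method can be illustrated with a short sketch. The intercept and slope below are hypothetical stand-ins, not the fitted values from the USGS/IDEM analysis; only the geologic-barrier class breaks (45 and 75 feet) come from the report.

        # Probability of elevated nitrate (>= 3 mg/L as N) as a logistic function of the
        # thickness of fine-grained geologic materials above the well screen (illustrative
        # coefficients), plus the thin/moderate/thick barrier classification from the report.
        from math import exp

        def prob_elevated_nitrate(thickness_ft, intercept=0.5, slope=-0.04):
            """Logistic model: p = 1 / (1 + exp(-(intercept + slope * thickness)))."""
            z = intercept + slope * thickness_ft
            return 1.0 / (1.0 + exp(-z))

        def barrier_class(thickness_ft):
            """Thin (< 45 ft), moderate (45-75 ft), or thick (> 75 ft) geologic barrier."""
            if thickness_ft < 45:
                return "thin"
            return "moderate" if thickness_ft <= 75 else "thick"

        for t in (10, 45, 80):
            print(t, barrier_class(t), round(prob_elevated_nitrate(t), 3))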

  2. Assessment of MRI-Based Marker of Dopaminergic Integrity as a Biological Indicator of Gulf War Illness

    DTIC Science & Technology

    2016-10-01

    SUBJECT TERMS: Gulf war illness; magnetic resonance imaging; dopamine; diffusion tensor imaging. … substantia nigra, basal ganglia and cortex as markers of integrity of the nigro-striatal dopaminergic pathway using high-resolution diffusion tensor imaging (DTI) …

  3. Is Iowa Educationally Competitive? Children and Iowa's Economic Future--March 2010 Update on NAEP Reading Scores. Iowa Kids Count Special Report Update

    ERIC Educational Resources Information Center

    Child and Family Policy Center, 2010

    2010-01-01

    In January, 2010, Iowa Kids Count produced a special report that showed long-term trends in Iowa student reading and mathematics scores on the National Assessment of Educational Progress (NAEP), the only source for comparative state information on student achievement. The January report showed a decline in Iowa's ranking since 1992, when the first…

  4. Deterrence Impact Modeling Environment (DIME) Proof-of-Concept Test Evaluations and Findings

    DTIC Science & Technology

    2016-06-01

    Sources of this caution are financial, technical, legal, and ethical. Several current Coast Guard policies complicate ongoing engagement with and assessment … There is evidence that several of the stakeholder communities most important to the Coast Guard have not been early adopters of the … self-organization) or longer-term outcomes (such as over-harvesting, regeneration of biodiversity, resilience of an ecological system to human nature …

  5. Diabetic Macular Edema: What is Focal and What is Diffuse?

    PubMed Central

    Browning, David J.; Altaweel, Michael M.; Bressler, Neil M.; Bressler, Susan B.; Scott, Ingrid U.

    2009-01-01

    Purpose To review the available information on classification of diabetic macular edema (DME) as focal or diffuse. Design Interpretive essay. Methods Literature review and interpretation. Results The terms focal and diffuse diabetic macular edema are frequently used without clear definitions. Published definitions often use different examination modalities and are often inconsistent. Evaluating published information on prevalence of focal and diffuse DME, response of focal and diffuse DME to treatments, and importance of focal and diffuse DME in assessing prognosis is hindered because the terms are inconsistently employed. A newer vocabulary may be more constructive, one that describes discrete components of the concepts such as extent and location of macular thickening, involvement of the center of the macula, quantity and pattern of lipid exudates, source of fluorescein leakage, and regional variation in macular thickening, and that distinguishes these terms from the use of the term focal when describing one type of photocoagulation technique. Developing methods for assessing component variables that can be used in clinical practice and establishing reproducibility of the methods will be important tasks. Conclusion Little evidence exists that characteristics of DME described by the terms focal and diffuse help to explain variation in visual acuity or response to treatment. It is unresolved whether a concept of focal and diffuse DME will prove clinically useful despite frequent usage of the terms when describing management of DME. Further studies to address the issues are needed. PMID:18774122

  6. Lessons Learned Through the Follow-up of the Long-Term Effects of Over-Exposure to an Ir192 Industrial Radiography Source in Bangladesh

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jalil, A.; Rabbani, G.; Hossain, M. K.

    2003-02-24

    An industrial radiographer was accidentally over-exposed while radiographing weld joints of gas pipelines in 1985 in Bangladesh. Symptoms of high radiation exposure occurred immediately after the accident, and skin erythema developed, leading to progressive tissue deterioration. The consequences of this over-exposure are being followed up to assess the long-term effects of ionizing radiation on the victim. Progressive tissue deterioration has so far led to multiple surgeries and successive amputations of the finger-tips. Lessons learned from this accident are also reported in this paper.

  7. Mechanisms of Post-Infarct Left Ventricular Remodeling

    PubMed Central

    French, Brent A.; Kramer, Christopher M.

    2008-01-01

    Heart failure secondary to myocardial infarction (MI) remains a major source of morbidity and mortality. Long-term outcome after MI can largely be defined in terms of its impact on the size and shape of the left ventricle (i.e., LV remodeling). Three major mechanisms contribute to LV remodeling: 1) early infarct expansion, 2) subsequent infarct extension into adjacent noninfarcted myocardium, and 3) late hypertrophy in the remote LV. Future developments in preventing post-MI heart failure will depend not only on identifying drugs targeting each of these individual mechanisms, but also on diagnostic techniques capable of assessing efficacy against each mechanism. PMID:18690295

  8. Bayesian estimation of a source term of radiation release with approximately known nuclide ratios

    NASA Astrophysics Data System (ADS)

    Tichý, Ondřej; Šmídl, Václav; Hofman, Radek

    2016-04-01

    We are concerned with estimation of a source term in case of an accidental release from a known location, e.g. a power plant. Usually, the source term of an accidental release of radiation comprises a mixture of nuclides. Gamma dose rate measurements do not provide direct information on the source term composition. However, the physical properties of the respective nuclides (deposition properties, decay half-life) can be used when uncertain information on nuclide ratios is available, e.g. from the known reactor inventory. The proposed method is based on a linear inverse model in which the observation vector y arises as a linear combination y = Mx of a source-receptor-sensitivity (SRS) matrix M and the source term x. The task is to estimate the unknown source term x. The problem is ill-conditioned and further regularization is needed to obtain a reasonable solution. In this contribution, we assume that the nuclide ratios of the release are known with some degree of uncertainty. This knowledge is used to form the prior covariance matrix of the source term x. Due to uncertainty in the ratios, the diagonal elements of the covariance matrix are considered to be unknown. Positivity of the source term estimate is guaranteed by using a multivariate truncated Gaussian distribution. Following a Bayesian approach, we estimate all parameters of the model from the data, so that y, M, and the known ratios are the only inputs of the method. Since the inference of the model is intractable, we follow the Variational Bayes method, yielding an iterative algorithm for estimation of all model parameters. Performance of the method is studied on a simulated 6-hour power plant release in which 3 nuclides are released and 2 nuclide ratios are approximately known. A comparison with a method using unknown nuclide ratios is given to demonstrate the usefulness of the proposed approach. This research is supported by EEA/Norwegian Financial Mechanism under project MSMT-28477/2014 Source-Term Determination of Radionuclide Releases by Inverse Atmospheric Dispersion Modelling (STRADI).
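
    A minimal numerical sketch of the underlying inverse problem is given below. It is not the Variational Bayes algorithm of the contribution: it simply estimates a non-negative source term x from y = Mx by projected gradient descent on a regularized least-squares objective, with a prior mean standing in for the ratio-based prior information. All matrices, noise levels and the regularization weight are synthetic.

        # Regularized, non-negative estimate of a source term x from observations y = M x.
        # The truncation step (np.maximum) plays the role of the positivity constraint; the
        # prior weight lam would be inferred from the data in the full Bayesian treatment.
        import numpy as np

        rng = np.random.default_rng(0)
        n_obs, n_src = 40, 12                       # dose-rate samples, release segments
        M = rng.uniform(0.0, 1.0, (n_obs, n_src))   # source-receptor-sensitivity (SRS) matrix
        x_true = np.abs(rng.normal(1.0, 0.5, n_src))
        y = M @ x_true + rng.normal(0.0, 0.05, n_obs)

        x_prior = np.full(n_src, x_true.mean())     # stand-in for the ratio-based prior mean
        lam = 0.1                                   # prior weight (synthetic)

        # Minimize ||y - Mx||^2 + lam * ||x - x_prior||^2 subject to x >= 0.
        x = np.maximum(x_prior, 0.0)
        step = 1.0 / (np.linalg.norm(M, 2) ** 2 + lam)
        for _ in range(5000):
            grad = M.T @ (M @ x - y) + lam * (x - x_prior)
            x = np.maximum(x - step * grad, 0.0)

        print(np.round(x, 2))       # estimated source term
        print(np.round(x_true, 2))  # reference values used to generate the data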

  9. Long-term trends in dissolved iron and DOC concentration linked to nitrate depletion in riparian soils

    NASA Astrophysics Data System (ADS)

    Musolff, Andreas; Selle, Benny; Fleckenstein, Jan H.; Oosterwoud, Marieke R.; Tittel, Jörg

    2016-04-01

    The instream concentrations of dissolved organic carbon (DOC) are rising in many catchments of the northern hemisphere. Elevated concentrations of DOC, mainly in the form of colored humic components, increase the effort and cost of drinking water purification. In this study, we evaluated a long-term dataset of 110 catchments draining into German drinking water reservoirs in order to assess sources of DOC and drivers of a potential long-term change. The average DOC concentrations across the wide range of different catchments were found to be well explained by the catchment's topographic wetness index. Higher wetness indices were connected to higher average DOC concentrations, which implies that catchments with shallow topography and pronounced riparian wetlands mobilize more DOC. Overall, 37% of the investigated catchments showed a significant long-term increase in DOC concentrations, while 22% exhibited significant negative trends. Moreover, we found that increasing trends in DOC were positively correlated with trends in dissolved iron concentrations at pH ≤ 6 due to remobilization of DOC previously sorbed to iron minerals. Both the increasing trends in DOC and in dissolved iron were found to be connected to decreasing trends and low concentrations of nitrate (below ~6 mg/L). This was especially observed in forested catchments where atmospheric N deposition was the major source of nitrate. In these catchments, we also found long-term increases of phosphate concentrations. Therefore, we argue that dissolved iron, DOC and phosphate were jointly released under iron-reducing conditions when the concentration of nitrate, as a competing electron acceptor, was too low to prevent microbial iron reduction. In contrast, we could not explain the observed increasing trends in DOC, iron and phosphate concentrations by the long-term trends of pH, sulfate or precipitation. Altogether, this study gives strong evidence that both the sources and the long-term increases in DOC are primarily controlled by riparian wetland soils within the catchments. Here, the achievement of a long-term reduction in nitrogen deposition may in turn lead to a more pronounced iron reduction and a subsequent release of DOC and other iron-bound substances such as phosphate.

  10. 10 CFR 50.67 - Accident source term.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... occupancy of the control room under accident conditions without personnel receiving radiation exposures in... 10 Energy 1 2014-01-01 2014-01-01 false Accident source term. 50.67 Section 50.67 Energy NUCLEAR... Conditions of Licenses and Construction Permits § 50.67 Accident source term. (a) Applicability. The...

  11. 10 CFR 50.67 - Accident source term.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... occupancy of the control room under accident conditions without personnel receiving radiation exposures in... 10 Energy 1 2012-01-01 2012-01-01 false Accident source term. 50.67 Section 50.67 Energy NUCLEAR... Conditions of Licenses and Construction Permits § 50.67 Accident source term. (a) Applicability. The...

  12. 10 CFR 50.67 - Accident source term.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... occupancy of the control room under accident conditions without personnel receiving radiation exposures in... 10 Energy 1 2010-01-01 2010-01-01 false Accident source term. 50.67 Section 50.67 Energy NUCLEAR... Conditions of Licenses and Construction Permits § 50.67 Accident source term. (a) Applicability. The...

  13. 10 CFR 50.67 - Accident source term.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... occupancy of the control room under accident conditions without personnel receiving radiation exposures in... 10 Energy 1 2013-01-01 2013-01-01 false Accident source term. 50.67 Section 50.67 Energy NUCLEAR... Conditions of Licenses and Construction Permits § 50.67 Accident source term. (a) Applicability. The...

  14. 10 CFR 50.67 - Accident source term.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... occupancy of the control room under accident conditions without personnel receiving radiation exposures in... 10 Energy 1 2011-01-01 2011-01-01 false Accident source term. 50.67 Section 50.67 Energy NUCLEAR... Conditions of Licenses and Construction Permits § 50.67 Accident source term. (a) Applicability. The...

  15. Assessment of groundwater exploitation in an aquifer using the random walk on grid method: a case study at Ordos, China

    NASA Astrophysics Data System (ADS)

    Nan, Tongchao; Li, Kaixuan; Wu, Jichun; Yin, Lihe

    2018-04-01

    Sustainability has been one of the key criteria of effective water exploitation. Groundwater exploitation and water-table decline at Haolebaoji water source site in the Ordos basin in NW China has drawn public attention due to concerns about potential threats to ecosystems and grazing land in the area. To better investigate the impact of production wells at Haolebaoji on the water table, an adapted algorithm called the random walk on grid method (WOG) is applied to simulate the hydraulic head in the unconfined and confined aquifers. This is the first attempt to apply WOG to a real groundwater problem. The method can not only evaluate the head values but also the contributions made by each source/sink term. One is allowed to analyze the impact of source/sink terms just as if one had an analytical solution. The head values evaluated by WOG match the values derived from the software Groundwater Modeling System (GMS). It suggests that WOG is effective and applicable in a heterogeneous aquifer with respect to practical problems, and the resultant information is useful for groundwater management.
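
    The core idea can be illustrated with a minimal sketch of a random walk on a grid for steady, homogeneous two-dimensional flow (a Laplace equation with fixed-head boundaries): the head at an interior node is estimated as the average boundary head reached by random walks started from that node, and tallying which boundary each walk hits decomposes the estimate into per-boundary contributions, analogous to the per-source/sink decomposition described above. The grid, boundary heads and walk counts are illustrative, not the Haolebaoji model.

        # Monte Carlo estimate of hydraulic head at one node of a 2-D grid with Dirichlet
        # (fixed-head) boundaries, including a breakdown of the estimate by boundary hit.
        import random

        NX, NY = 21, 21
        H_LEFT, H_RIGHT, H_TOP, H_BOTTOM = 100.0, 90.0, 95.0, 95.0

        def boundary_head(i, j):
            if i == 0:      return "left",   H_LEFT
            if i == NX - 1: return "right",  H_RIGHT
            if j == 0:      return "bottom", H_BOTTOM
            if j == NY - 1: return "top",    H_TOP
            return None

        def estimate_head(i0, j0, n_walks=20000, seed=1):
            random.seed(seed)
            total = 0.0
            contrib = {"left": 0.0, "right": 0.0, "top": 0.0, "bottom": 0.0}
            for _ in range(n_walks):
                i, j = i0, j0
                while True:
                    hit = boundary_head(i, j)
                    if hit is not None:
                        side, h = hit
                        total += h
                        contrib[side] += h
                        break
                    di, dj = random.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
                    i, j = i + di, j + dj
            return total / n_walks, {k: v / n_walks for k, v in contrib.items()}

        head, parts = estimate_head(10, 10)
        print(round(head, 2), {k: round(v, 2) for k, v in parts.items()})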

  16. Nonlinear synthesis of infrasound propagation through an inhomogeneous, absorbing atmosphere.

    PubMed

    de Groot-Hedlin, C D

    2012-08-01

    An accurate and efficient method to predict infrasound amplitudes from large explosions in the atmosphere is required for diverse source types, including bolides, volcanic eruptions, and nuclear and chemical explosions. A finite-difference, time-domain approach is developed to solve a set of nonlinear fluid dynamic equations for total pressure, temperature, and density fields rather than acoustic perturbations. Three key features for the purpose of synthesizing nonlinear infrasound propagation in realistic media are that it includes gravitational terms, it allows for acoustic absorption, including molecular vibration losses at frequencies well below the molecular vibration frequencies, and the environmental models are constrained to have axial symmetry, allowing a three-dimensional simulation to be reduced to two dimensions. Numerical experiments are performed to assess the algorithm's accuracy and the effect of source amplitudes and atmospheric variability on infrasound waveforms and shock formation. Results show that infrasound waveforms steepen and their associated spectra are shifted to higher frequencies for nonlinear sources, leading to enhanced infrasound attenuation. Results also indicate that nonlinear infrasound amplitudes depend strongly on atmospheric temperature and pressure variations. The solution for total field variables and insertion of gravitational terms also allows for the computation of other disturbances generated by explosions, including gravity waves.

  17. Technology integration performance assessment using lean principles in health care.

    PubMed

    Rico, Florentino; Yalcin, Ali; Eikman, Edward A

    2015-01-01

    This study assesses the impact of an automated infusion system (AIS) integration at a positron emission tomography (PET) center based on "lean thinking" principles. The authors propose a systematic measurement system that evaluates improvement in terms of the "8 wastes." This adaptation to the health care context consisted of performance measurement before and after integration of AIS in terms of time, utilization of resources, amount of materials wasted/saved, system variability, distances traveled, and worker strain. The authors' observations indicate that AIS stands to be very effective in a busy PET department, such as the one in Moffitt Cancer Center, owing to its accuracy, pace, and reliability, especially after the necessary adjustments are made to reduce or eliminate the source of errors. This integration must be accompanied by a process reengineering exercise to realize the full potential of AIS in reducing waste and improving patient care and worker satisfaction. © The Author(s) 2014.

  18. Probabilistic estimation of long-term volcanic hazard under evolving tectonic conditions in a 1 Ma timeframe

    NASA Astrophysics Data System (ADS)

    Jaquet, O.; Lantuéjoul, C.; Goto, J.

    2017-10-01

    Risk assessments in relation to the siting of potential deep geological repositories for radioactive wastes demand the estimation of long-term tectonic hazards such as volcanicity and rock deformation. Owing to their tectonic situation, such evaluations concern many industrial regions around the world. For sites near volcanically active regions, a prevailing source of uncertainty is related to volcanic hazard. For specific situations, in particular in relation to geological repository siting, the requirements for the assessment of volcanic and tectonic hazards have to be expanded to 1 million years. At such time scales, tectonic changes are likely to influence volcanic hazard and therefore a particular stochastic model needs to be developed for the estimation of volcanic hazard. The concepts and theoretical basis of the proposed model are given and a methodological illustration is provided using data from the Tohoku region of Japan.

  19. The modelling and assessment of whale-watching impacts

    USGS Publications Warehouse

    New, Leslie; Hall, Ailsa J.; Harcourt, Robert; Kaufman, Greg; Parsons, E.C.M.; Pearson, Heidi C.; Cosentino, A. Mel; Schick, Robert S

    2015-01-01

    In recent years there has been significant interest in modelling cumulative effects and the population consequences of individual changes in cetacean behaviour and physiology due to disturbance. One potential source of disturbance that has garnered particular interest is whale-watching. Though perceived as ‘green’ or eco-friendly tourism, there is evidence that whale-watching can result in statistically significant and biologically meaningful changes in cetacean behaviour, raising the question of whether whale-watching is in fact a long-term sustainable activity. However, an assessment of the impacts of whale-watching on cetaceans requires an understanding of the potential behavioural and physiological effects, data to effectively address the question and suitable modelling techniques. Here, we review the current state of knowledge on the viability of long-term whale-watching, as well as logistical limitations and potential opportunities. We conclude that an integrated, coordinated approach will be needed to further understanding of the possible effects of whale-watching on cetaceans.

  20. Assessment of rainfall thresholds for landslide triggering in the Pacific Northwest: extreme short-term rainfall and long-term trends

    NASA Astrophysics Data System (ADS)

    Stanley, T.; Kirschbaum, D.; Sobieszczyk, S.; Jasinski, M. F.; Borak, J.; Yatheendradas, S.

    2017-12-01

    Landslides occur every year in the U.S. Pacific Northwest due to extreme rainfall, snow cover, and rugged topography. Data for 15,000 landslide events in Washington and Oregon were assembled from State Surveys, Departments of Transportation, a Global Landslide Catalog compiled by NASA, and other sources. This new inventory was evaluated against rainfall data from the National Climate Assessment (NCA) Land Data Assimilation System to characterize the regional rainfall conditions that trigger landslides. Analysis of these data sets indicates clear differences in triggering thresholds between extreme weather systems such as a Pineapple Express and the more typical peak seasonal rainfall between November and February. The study also leverages over 30 years of precipitation and land surface information to inform variability of landslide triggering over multiple decades and landslide trends within the region.

  1. Miniaturized pulsed laser source for time-domain diffuse optics routes to wearable devices.

    PubMed

    Di Sieno, Laura; Nissinen, Jan; Hallman, Lauri; Martinenghi, Edoardo; Contini, Davide; Pifferi, Antonio; Kostamovaara, Juha; Mora, Alberto Dalla

    2017-08-01

    We validate a miniaturized pulsed laser source for use in time-domain (TD) diffuse optics, following rigorous and shared protocols for performance assessment of this class of devices. This compact source (12×6 mm²) was previously developed for range-finding applications and is able to provide short, high-energy (∼100 ps, ∼0.5 nJ) optical pulses at up to a 1 MHz repetition rate. Here, we start with a basic laser characterization and an analysis of the suitability of this laser for the diffuse optics application. Then, we present a TD optical system using this source and its performance both in recovering the optical properties of tissue-mimicking homogeneous phantoms and in detecting localized absorption perturbations. Finally, as a proof of concept of in vivo application, we demonstrate that the system is able to detect hemodynamic changes occurring in the arm of healthy volunteers during a venous occlusion. Squeezing the laser source into a small footprint removes a key technological bottleneck that has so far hampered the realization of a miniaturized TD diffuse optics system able to compete with already established continuous-wave devices in terms of size and cost, but with wider performance potential, as demonstrated by research over the last two decades. © 2017 Society of Photo-Optical Instrumentation Engineers (SPIE).

  2. Evaluation of an unsteady flamelet progress variable model for autoignition and flame development in compositionally stratified mixtures

    NASA Astrophysics Data System (ADS)

    Mukhopadhyay, Saumyadip; Abraham, John

    2012-07-01

    The unsteady flamelet progress variable (UFPV) model has been proposed by Pitsch and Ihme ["An unsteady/flamelet progress variable method for LES of nonpremixed turbulent combustion," AIAA Paper No. 2005-557, 2005] for modeling the averaged/filtered chemistry source terms in Reynolds-averaged simulations and large eddy simulations of reacting non-premixed combustion. In the UFPV model, a look-up table of source terms is generated as a function of mixture fraction Z, scalar dissipation rate χ, and progress variable C by solving the unsteady flamelet equations. The assumption is that the unsteady flamelet represents the evolution of the reacting mixing layer in the non-premixed flame. We assess the accuracy of the model in predicting autoignition and flame development in compositionally stratified n-heptane/air mixtures using direct numerical simulations (DNS). The focus in this work is primarily on assessing the accuracy of the probability density functions (PDFs) employed for obtaining averaged source terms. The performance of commonly employed presumed functions, such as the Dirac delta distribution function, the β distribution function, and the statistically most likely distribution (SMLD) approach, in approximating the shapes of the PDFs of the reactive and conserved scalars is evaluated. For unimodal distributions, it is observed that functions that use two-moment information, e.g., the β distribution function and the SMLD approach with two-moment closure, are able to reasonably approximate the actual PDF. As the distribution becomes multimodal, higher-moment information is required. Differences are observed between the ignition trends obtained from DNS and those predicted by the look-up table, especially for smaller gradients where the flamelet assumption becomes less applicable. The formulation assumes that the shape of the χ(Z) profile can be modeled by an error function which remains unchanged in the presence of heat release. We show that this assumption is not accurate.
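
    For readers unfamiliar with presumed-PDF averaging, the following minimal sketch shows how a β distribution in mixture fraction, parameterized by its mean and variance, is used to average a source-term profile. The ω(Z) profile here is an arbitrary stand-in rather than an entry from a flamelet table, and the moments are chosen for illustration.

        # Presumed beta-PDF averaging of a source term omega(Z) over mixture fraction Z.
        # The beta parameters a and b are recovered from the mean and variance of Z.
        import numpy as np
        from math import lgamma

        def beta_pdf(z, mean, var):
            """Beta PDF with parameters obtained from the first two moments of Z."""
            f = mean * (1.0 - mean) / var - 1.0          # requires var < mean * (1 - mean)
            a, b = mean * f, (1.0 - mean) * f
            log_norm = lgamma(a + b) - lgamma(a) - lgamma(b)
            return np.exp(log_norm + (a - 1.0) * np.log(z) + (b - 1.0) * np.log(1.0 - z))

        def pdf_average(omega, mean, var, n=2000):
            """Average omega(Z) against the presumed PDF on a uniform grid (dz cancels)."""
            z = np.linspace(1e-6, 1.0 - 1e-6, n)
            p = beta_pdf(z, mean, var)
            return float(np.sum(omega(z) * p) / np.sum(p))

        omega = lambda z: np.exp(-((z - 0.3) / 0.05) ** 2)   # stand-in source-term profile
        print(round(pdf_average(omega, mean=0.3, var=0.01), 4))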

  3. Effects of Host-rock Fracturing on Elastic-deformation Source Models of Volcano Deflation.

    PubMed

    Holohan, Eoghan P; Sudhaus, Henriette; Walter, Thomas R; Schöpfer, Martin P J; Walsh, John J

    2017-09-08

    Volcanoes commonly inflate or deflate during episodes of unrest or eruption. Continuum mechanics models that assume linear elastic deformation of the Earth's crust are routinely used to invert the observed ground motions. The source(s) of deformation in such models are generally interpreted in terms of magma bodies or pathways, and thus form a basis for hazard assessment and mitigation. Using discontinuum mechanics models, we show how host-rock fracturing (i.e. non-elastic deformation) during drainage of a magma body can progressively change the shape and depth of an elastic-deformation source. We argue that this effect explains the marked spatio-temporal changes in source model attributes inferred for the March-April 2007 eruption of Piton de la Fournaise volcano, La Reunion. We find that pronounced deflation-related host-rock fracturing can: (1) yield inclined source model geometries for a horizontal magma body; (2) cause significant upward migration of an elastic-deformation source, leading to underestimation of the true magma body depth and potentially to a misinterpretation of ascending magma; and (3) at least partly explain underestimation by elastic-deformation sources of changes in sub-surface magma volume.
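
    As a point of reference for the elastic-deformation sources discussed above, the sketch below evaluates the widely used Mogi point-source approximation for surface displacement caused by a volume change at depth in a uniform elastic half-space. It is a textbook formula, not the discontinuum models of the study, and the depth, volume change and Poisson ratio are illustrative.

        # Surface displacements of a Mogi point source: vertical and radial components
        # for a volume change dvol (negative for deflation) at depth in an elastic half-space.
        import numpy as np

        def mogi_surface_displacement(r, depth, dvol, nu=0.25):
            """u_z and u_r at radial distance r from the source axis (SI units)."""
            R3 = (r ** 2 + depth ** 2) ** 1.5
            uz = (1.0 - nu) / np.pi * dvol * depth / R3
            ur = (1.0 - nu) / np.pi * dvol * r / R3
            return uz, ur

        r = np.linspace(0.0, 10e3, 6)                                 # distance from axis (m)
        uz, ur = mogi_surface_displacement(r, depth=2e3, dvol=-1e6)   # deflation of 1e6 m^3
        print(np.round(uz, 4))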

  4. Observational data on the effects of infection by the copepod Salmincola californiensis on the short- and long-term viability of juvenile Chinook salmon (Oncorhynchus tshawytscha) implanted with telemetry tags

    USGS Publications Warehouse

    Beeman, John W.; Hansen, Amy C.; Sprando, Jamie M.

    2015-01-01

    Infection with Salmincola californiensis is common in juvenile Chinook salmon in western USA reservoirs and may affect the viability of fish used in studies of telemetered animals. Our limited assessment suggests infection by Salmincola californiensis affects the short-term mortality of tagged fish and may affect the long-term viability of tagged fish after release; however, the intensity of infection in the sample population did not represent the source population due to the observational nature of the data. We suggest these results warrant further study into the effects of infection by Salmincola californiensis on the results obtained through active telemetry and perhaps other methods requiring handling of infected fish.

  5. Understanding causality and uncertainty in volcanic observations: An example of forecasting eruptive activity on Soufrière Hills Volcano, Montserrat

    NASA Astrophysics Data System (ADS)

    Sheldrake, T. E.; Aspinall, W. P.; Odbert, H. M.; Wadge, G.; Sparks, R. S. J.

    2017-07-01

    Following a cessation in eruptive activity it is important to understand how a volcano will behave in the future and when it may next erupt. Such an assessment can be based on the volcano's long-term pattern of behaviour and insights into its current state via monitoring observations. We present a Bayesian network that integrates these two strands of evidence to forecast future eruptive scenarios using expert elicitation. The Bayesian approach provides a framework to quantify the magmatic causes in terms of volcanic effects (i.e., eruption and unrest). In October 2013, an expert elicitation was performed to populate a Bayesian network designed to help forecast future eruptive (in-)activity at Soufrière Hills Volcano. The Bayesian network was devised to assess the state of the shallow magmatic system, as a means to forecast the future eruptive activity in the context of the long-term behaviour at similar dome-building volcanoes. The findings highlight coherence amongst experts when interpreting the current behaviour of the volcano, but reveal considerable ambiguity when relating this to longer patterns of volcanism at dome-building volcanoes, as a class. By asking questions in terms of magmatic causes, the Bayesian approach highlights the importance of using short-term unrest indicators from monitoring data as evidence in long-term forecasts at volcanoes. Furthermore, it highlights potential biases in the judgements of volcanologists and identifies sources of uncertainty in terms of magmatic causes rather than scenario-based outcomes.

  6. Environmental performance of bio-based and biodegradable plastics: the road ahead.

    PubMed

    Lambert, Scott; Wagner, Martin

    2017-11-13

    Future plastic materials will be very different from those that are used today. The increasing importance of sustainability promotes the development of bio-based and biodegradable polymers, sometimes misleadingly referred to as 'bioplastics'. Because both terms imply "green" sources and "clean" removal, this paper aims at critically discussing the sometimes-conflicting terminology as well as renewable sources, with a special focus on the degradation of these polymers in natural environments. With regard to the former, we review innovations in feedstock development (e.g. microalgae and food wastes). In terms of the latter, we highlight the effects that polymer structure, additives, and environmental variables have on plastic biodegradability. We argue that a 'biodegradable' end-product does not necessarily degrade once emitted to the environment, because the chemical additives used to make it fit for purpose will increase its longevity. In the future, this trend may continue as the plastics industry is also expected to be a major user of nanocomposites. Overall, there is a need to assess the performance of polymer innovations in terms of their biodegradability, especially under realistic waste management and environmental conditions, to avoid the unwanted release of plastic degradation products in receiving environments.

  7. Maternal DHA Status during Pregnancy Has a Positive Impact on Infant Problem Solving: A Norwegian Prospective Observation Study

    PubMed Central

    Braarud, Hanne Cecilie; Markhus, Maria Wik; Skotheim, Siv; Stormark, Kjell Morten; Frøyland, Livar; Graff, Ingvild Eide; Kjellevold, Marian

    2018-01-01

    Docosahexaenoic acid (DHA, 22:6, n-3) is a long-chain polyunsaturated fatty acid necessary for normal brain growth and cognitive development. Seafood and dietary supplements are the primary dietary sources of DHA. This study addresses the associations between DHA status in pregnant women and healthy, term-born infant problem-solving skills assessed using the Ages and Stages Questionnaire. The fatty acid status of maternal red blood cells (RBCs) was assessed in the 28th week of gestation and at three months postpartum. The infants’ fatty acid status (RBC) was assessed at three, six, and twelve months, and problem-solving skills were assessed at six and twelve months. Maternal DHA status in pregnancy was found to be positively associated with infants’ problem-solving skills at 12 months. This association remained significant even after controlling for the level of maternal education, a surrogate for socio-economic status. The infants’ DHA status at three months was associated with the infants’ problem solving at 12 months. The results accentuate the importance for pregnant and lactating women to have a satisfactory DHA status from dietary intake of seafood or other sources rich in DHA. PMID:29695097

  8. Maternal DHA Status during Pregnancy Has a Positive Impact on Infant Problem Solving: A Norwegian Prospective Observation Study.

    PubMed

    Braarud, Hanne Cecilie; Markhus, Maria Wik; Skotheim, Siv; Stormark, Kjell Morten; Frøyland, Livar; Graff, Ingvild Eide; Kjellevold, Marian

    2018-04-24

    Docosahexaenoic acid (DHA, 22:6, n-3) is a long-chain polyunsaturated fatty acid necessary for normal brain growth and cognitive development. Seafood and dietary supplements are the primary dietary sources of DHA. This study addresses the associations between DHA status in pregnant women and healthy, term-born infant problem-solving skills assessed using the Ages and Stages Questionnaire. The fatty acid status of maternal red blood cells (RBCs) was assessed in the 28th week of gestation and at three months postpartum. The infants’ fatty acid status (RBC) was assessed at three, six, and twelve months, and problem-solving skills were assessed at six and twelve months. Maternal DHA status in pregnancy was found to be positively associated with infants’ problem-solving skills at 12 months. This association remained significant even after controlling for the level of maternal education, a surrogate for socio-economic status. The infants’ DHA status at three months was associated with the infants’ problem solving at 12 months. The results accentuate the importance for pregnant and lactating women to have a satisfactory DHA status from dietary intake of seafood or other sources rich in DHA.

  9. Is there an environmental benefit from remediation of a contaminated site? Combined assessments of the risk reduction and life cycle impact of remediation.

    PubMed

    Lemming, Gitte; Chambon, Julie C; Binning, Philip J; Bjerg, Poul L

    2012-12-15

    A comparative life cycle assessment is presented for four different management options for a trichloroethene-contaminated site with a contaminant source zone located in a fractured clay till. The compared options are (i) long-term monitoring (ii) in-situ enhanced reductive dechlorination (ERD), (iii) in-situ chemical oxidation (ISCO) with permanganate and (iv) long-term monitoring combined with treatment by activated carbon at the nearby waterworks. The life cycle assessment included evaluation of both primary and secondary environmental impacts. The primary impacts are the local human toxic impacts due to contaminant leaching into groundwater that is used for drinking water, whereas the secondary environmental impacts are related to remediation activities such as monitoring, drilling and construction of wells and use of remedial amendments. The primary impacts for the compared scenarios were determined by a numerical risk assessment and remedial performance model, which predicted the contaminant mass discharge over time at a point of compliance in the aquifer and at the waterworks. The combined assessment of risk reduction and life cycle impacts showed that all management options result in higher environmental impacts than they remediate, in terms of person equivalents and assuming equal weighting of all impacts. The ERD and long-term monitoring were the scenarios with the lowest secondary life cycle impacts and are therefore the preferred alternatives. However, if activated carbon treatment at the waterworks is required in the long-term monitoring scenario, then it becomes unfavorable because of large secondary impacts. ERD is favorable due to its low secondary impacts, but only if leaching of vinyl chloride to the groundwater aquifer can be avoided. Remediation with ISCO caused the highest secondary impacts and cannot be recommended for the site. Copyright © 2012 Elsevier Ltd. All rights reserved.

  10. Challenges in coupling LTER with environmental assessments: An insight from potential and reality of the Chinese Ecological Research Network in servicing environment assessments.

    PubMed

    Xia, Shaoxia; Liu, Yu; Yu, Xiubo; Fu, Bojie

    2018-08-15

    Environmental assessments estimate, evaluate and predict the consequences of natural processes and human activities on the environment. Long-term ecosystem observation and research networks (LTERs) are potentially valuable infrastructure to support environmental assessments. However, very few environmental assessments have successfully incorporated them. In this study, we try to reveal the current status of coupling LTERs with environmental assessments and look at the challenges involved in improving this coupling through exploring the role that Chinese Ecological Research Network (CERN), the LTER of China, currently plays in regional environment assessments. A review of official protocols and standards, regional assessments and CERN researches related to ecosystems and environment shows that there is great potential for coupling CERN with environment assessments. However in practice, CERN does not currently play the expected role. Remote sensing and irregular inventory data are still the main data sources currently used in regional assessments. Several causes led to the present situation: (1) insufficient cross-site research and failure to scale up site-level variables to the regional scale; (2) data barriers resulting from incompatible protocols and low data usability due to lack of data assimilation and scaling; and (3) absence of indicators relevant to human activities in existing monitoring protocols. For these reasons, enhancing cross-site monitoring and research, data assimilation and scaling up are critical steps required to improve coupling of LTER with environmental assessments. Site-focused long-term monitoring should be combined with wide-scale ground surveys and remote sensing to establish an effective connection between different environmental monitoring platforms for regional assessments. It is also necessary to revise the current monitoring protocols to include human activities and their impacts on the ecosystem, or change the LTERs into Long-Term Socio-Ecological Research (LTSER) networks. Copyright © 2018 Elsevier B.V. All rights reserved.

  11. Health risk assessments for alumina refineries.

    PubMed

    Donoghue, A Michael; Coffey, Patrick S

    2014-05-01

    To describe contemporary air dispersion modeling and health risk assessment methodologies applied to alumina refineries and to summarize recent results. Air dispersion models using emission source and meteorological data have been used to assess ground-level concentrations (GLCs) of refinery emissions. Short-term (1-hour and 24-hour average) GLCs and annual average GLCs have been used to assess acute health, chronic health, and incremental carcinogenic risks. The acute hazard index can exceed 1 close to refineries, but it is typically less than 1 at neighboring residential locations. The chronic hazard index is typically substantially less than 1. The incremental carcinogenic risk is typically less than 10⁻⁶. The risks of acute health effects are adequately controlled, and the risks of chronic health effects and incremental carcinogenic risks are negligible around referenced alumina refineries.
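
    The screening arithmetic behind hazard indices and incremental risk can be sketched in a few lines. The substance names, ground-level concentrations (GLCs), guideline values and unit risk factor below are hypothetical placeholders, not values from the referenced refineries.

        # Acute hazard index: sum of (1-hour GLC / acute guideline) over substances.
        short_term_glc  = {"SO2": 40.0, "NO2": 30.0, "dust": 20.0}    # ug/m3, 1-hour average
        acute_guideline = {"SO2": 570.0, "NO2": 246.0, "dust": 90.0}  # ug/m3 (placeholders)
        acute_hazard_index = sum(short_term_glc[s] / acute_guideline[s] for s in short_term_glc)

        # Incremental carcinogenic risk: annual-average GLC times a unit risk factor.
        annual_glc_benzene = 0.05       # ug/m3, annual average (placeholder)
        unit_risk_benzene  = 6.0e-6     # lifetime risk per ug/m3 (placeholder)
        incremental_cancer_risk = annual_glc_benzene * unit_risk_benzene

        print(round(acute_hazard_index, 3), f"{incremental_cancer_risk:.1e}")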

  12. Parenting knowledge: experiential and sociodemographic factors in European American mothers of young children.

    PubMed

    Bornstein, Marc H; Cote, Linda R; Haynes, O Maurice; Hahn, Chun-Shin; Park, Yoonjung

    2010-11-01

    Knowledge of child rearing and child development is relevant to parenting and the well-being of children. Using a sociodemographically heterogeneous sample of 268 European American mothers of 2-year-olds, we assessed the state of mothers' parenting knowledge; compared parenting knowledge in groups of mothers who varied in terms of parenthood and social status; and identified principal sources of mothers' parenting knowledge in terms of social factors, parenting supports, and formal classes. On the whole, European American mothers demonstrated fair but less than complete basic parenting knowledge; age, education, and rated helpfulness of written materials each uniquely contributed to mothers' knowledge. Adult mothers scored higher than adolescent mothers, and mothers improved in their knowledge of parenting from their first to their second child (and were stable across time). No differences were found between mothers of girls and boys, mothers who varied in employment status, or birth and adoptive mothers. The implications of variation in parenting knowledge and its sources for parenting education and clinical interactions with parents are discussed.

  13. Antimatter Production for Near-Term Propulsion Applications

    NASA Technical Reports Server (NTRS)

    Schmidt, G. R.; Gerrish, H. P.; Martin, J. J.; Smith, G. A.; Meyer, K. J.

    1999-01-01

    The superior energy density of antimatter annihilation has often been pointed to as the ultimate source of energy for propulsion. However, the limited capacity and very low efficiency of present-day antiproton production methods suggest that antimatter may be too costly to consider for near-term propulsion applications. We address this issue by assessing the antimatter requirements for six different types of propulsion concepts, including two in which antiprotons are used to drive energy release from combined fission/fusion. These requirements are compared against the capacity of both the current antimatter production infrastructure and the improved capabilities which could exist within the early part of next century. Results show that although it may be impractical to consider systems which rely on antimatter as the sole source of propulsive energy, the requirements for propulsion based on antimatter-assisted fission/fusion do fall within projected near-term production capabilities. In fact, such systems could feasibly support interstellar precursor missions and omniplanetary spaceflight with antimatter costs ranging up to $60 million per mission.

  14. Design and implementation of wireless dose logger network for radiological emergency decision support system.

    PubMed

    Gopalakrishnan, V; Baskaran, R; Venkatraman, B

    2016-08-01

    A decision support system (DSS) is implemented in Radiological Safety Division, Indira Gandhi Centre for Atomic Research for providing guidance for emergency decision making in case of an inadvertent nuclear accident. Real time gamma dose rate measurement around the stack is used for estimating the radioactive release rate (source term) by using inverse calculation. Wireless gamma dose logging network is designed, implemented, and installed around the Madras Atomic Power Station reactor stack to continuously acquire the environmental gamma dose rate and the details are presented in the paper. The network uses XBee-Pro wireless modules and PSoC controller for wireless interfacing, and the data are logged at the base station. A LabView based program is developed to receive the data, display it on the Google Map, plot the data over the time scale, and register the data in a file to share with DSS software. The DSS at the base station evaluates the real time source term to assess radiation impact.
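
    The inverse calculation mentioned above can be reduced to a very small example: if a dispersion and dose model provides the gamma dose rate per unit release rate at each detector location, a single release-rate parameter can be recovered from the logged dose rates by least squares. The numbers below are illustrative, not data from the Madras Atomic Power Station installation.

        # Back-calculation of a constant release rate Q (Bq/s) from net gamma dose rates
        # logged at three detectors, given modeled dose rates per unit release at each point.
        import numpy as np

        dose_per_unit_release = np.array([2.0e-3, 1.2e-3, 0.6e-3])  # (uSv/h) per (Bq/s), modeled
        measured_dose_rate    = np.array([4.1e-1, 2.3e-1, 1.3e-1])  # uSv/h, background subtracted

        # Single-parameter least squares: Q = argmin || d - Q * k ||^2 = (k . d) / (k . k)
        k, d = dose_per_unit_release, measured_dose_rate
        release_rate = float(k @ d / (k @ k))
        print(f"estimated release rate: {release_rate:.3g} Bq/s")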

  15. Possible consequences of severe accidents at the Lubiatowo site, Poland

    NASA Astrophysics Data System (ADS)

    Seibert, Petra; Philipp, Anne; Hofman, Radek; Gufler, Klaus; Sholly, Steven

    2014-05-01

    The construction of a nuclear power plant is under consideration in Poland. One of the sites under discussion is near Lubiatowo, located on the coast of the Baltic Sea northwest of Gdansk. An assessment of possible environmental consequences is carried out for 88 real meteorological cases with the Lagrangian particle dispersion model FLEXPART. Based on literature research, three reactor designs (ABWR, EPR, AP 1000) were identified as being under discussion in Poland. For each of the designs, a set of accident scenarios was evaluated and two source terms per reactor design were selected for analysis. One of the selected source terms was a relatively large release, while the second was a severe accident with an intact containment. The endpoints considered in the calculations are ground contamination with Cs-137 and time-integrated concentrations of I-131 in air, as well as committed doses. They are evaluated on a grid of ca. 3 km mesh size covering eastern Central Europe.

  16. Design and implementation of wireless dose logger network for radiological emergency decision support system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gopalakrishnan, V.; Baskaran, R.; Venkatraman, B.

    A decision support system (DSS) is implemented in Radiological Safety Division, Indira Gandhi Centre for Atomic Research for providing guidance for emergency decision making in case of an inadvertent nuclear accident. Real time gamma dose rate measurement around the stack is used for estimating the radioactive release rate (source term) by using inverse calculation. Wireless gamma dose logging network is designed, implemented, and installed around the Madras Atomic Power Station reactor stack to continuously acquire the environmental gamma dose rate and the details are presented in the paper. The network uses XBee–Pro wireless modules and PSoC controller for wireless interfacing, and the data are logged at the base station. A LabView based program is developed to receive the data, display it on the Google Map, plot the data over the time scale, and register the data in a file to share with DSS software. The DSS at the base station evaluates the real time source term to assess radiation impact.

  17. A novel method for assessing chronic cortisol concentrations in dogs using the nail as a source.

    PubMed

    Mack, Z; Fokidis, H B

    2017-04-01

    Cortisol, a glucocorticoid secreted in response to stress, is used to assess adrenal function and mental health in clinical settings. Current methods assess cortisol sources that reflect short-term secretion that can vary with current stress state. Here, we present a novel method for the extraction and quantification of cortisol from the dog nail using solid phase extraction coupled to enzyme-linked immunosorbent assay. Validation experiments demonstrated accuracy (r = 0.836, P < 0.001), precision (15.1% coefficient of variation), and repeatability (14.4% coefficient of variation) with this method. Furthermore, nail cortisol concentrations were positively correlated with an established hair cortisol method (r = 0.736, P < 0.001). Nail cortisol concentrations did not differ with dog sex, breed, age, or weight; however, sample size limitations may preclude statistical significance. Nail cortisol may provide information on cortisol secretion integrated over the time corresponding to nail growth and may be useful as a tool for diagnosing stress and adrenal disorders in dogs. Copyright © 2016 Elsevier Inc. All rights reserved.

  18. Elkhorn Slough: Detecting Eutrophication through Geospatial Modeling Applications

    NASA Astrophysics Data System (ADS)

    Caraballo Álvarez, I. O.; Childs, A.; Jurich, K.

    2016-12-01

    Elkhorn Slough in Monterey, California, has experienced substantial nutrient loading and eutrophication over the past 21 years as a result of fertilizer-rich runoff from nearby agricultural fields. This study seeks to identify and track spatial patterns of eutrophication hotspots and the correlation to land use changes, possible nutrient sources, and general climatic trends using remotely sensed and in situ data. Threats of rising sea level, subsiding marshes, and increased eutrophication hotspots demonstrate the necessity to analyze the effects of increasing nutrient loads, relative sea level changes, and sedimentation within Elkhorn Slough. The Soil & Water Assessment Tool (SWAT) model integrates specified inputs to assess nutrient and sediment loading and their sources. TerrSet's Land Change Modeler forecasts the future potential of land change transitions for various land cover classes around the slough as a result of nutrient loading, eutrophication, and increased sedimentation. TerrSet's Earth Trends Modeler provides a comprehensive analysis of image time series to rapidly assess long term eutrophication trends and detect spatial patterns of known hotspots. Results from this study will inform future coastal management practices and provide greater spatial and temporal insight into Elkhorn Slough eutrophication dynamics.

  19. Piecewise synonyms for enhanced UMLS source terminology integration.

    PubMed

    Huang, Kuo-Chuan; Geller, James; Halper, Michael; Cimino, James J

    2007-10-11

    The UMLS contains more than 100 source vocabularies and is growing via the integration of others. When integrating a new source, the source terms already in the UMLS must first be found. The easiest approach to this is simple string matching. However, string matching usually does not find all concepts that should be found. A new methodology, based on the notion of piecewise synonyms, for enhancing the process of concept discovery in the UMLS is presented. This methodology is supported by first creating a general synonym dictionary based on the UMLS. Each multi-word source term is decomposed into its component words, allowing for the generation of separate synonyms for each word from the general synonym dictionary. The recombination of these synonyms into new terms creates an expanded pool of matching candidates for terms from the source. The methodology is demonstrated with respect to an existing UMLS source. It shows a 34% improvement over simple string matching.
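
    As a rough illustration of the recombination step described above, the sketch below decomposes a multi-word term, substitutes per-word synonyms from a small toy dictionary, and intersects the recombined candidates with a set of existing terms. The dictionary entries and term sets are invented for the example; the real method builds its synonym dictionary from the UMLS itself.

        from itertools import product

        # Toy general synonym dictionary (illustrative; the paper derives one from the UMLS).
        word_synonyms = {
            "kidney": {"kidney", "renal"},
            "failure": {"failure", "insufficiency"},
            "acute": {"acute"},
        }

        def piecewise_candidates(term):
            """Recombine per-word synonyms into an expanded pool of matching candidates."""
            words = term.lower().split()
            options = [sorted(word_synonyms.get(w, {w})) for w in words]
            return {" ".join(combo) for combo in product(*options)}

        existing_umls_terms = {"acute renal insufficiency", "chronic kidney disease"}

        candidates = piecewise_candidates("acute kidney failure")
        print(sorted(candidates))                   # expanded candidate pool
        print(candidates & existing_umls_terms)     # match that plain string matching misses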

  20. A model approach to assess the long-term trends of indirect photochemistry in lake water. The case of Lake Maggiore (NW Italy).

    PubMed

    Minella, Marco; Rogora, Michela; Vione, Davide; Maurino, Valter; Minero, Claudio

    2011-08-15

    A model-based approach is here developed and applied to predict the long-term trends of indirect photochemical processes in the surface layer (5 m water depth) of Lake Maggiore, NW Italy. For this lake, time series of the main parameters of photochemical importance that cover almost two decades are available. As a way to assess the relevant photochemical reactions, the modelled steady-state concentrations of important photogenerated transients (•OH, ³CDOM* and CO₃•⁻) were taken into account. A multivariate analysis approach was adopted to have an overview of the system, to emphasise relationships among chemical, photochemical and seasonal variables, and to highlight annual and long-term trends. Over the considered time period, because of the decrease of the dissolved organic carbon (DOC) content of water and of the increase of alkalinity, a significant increase is predicted for the steady-state concentrations of the radicals •OH and CO₃•⁻. Therefore, the photochemical degradation processes that involve the two radical species would be enhanced. Another issue of potential photochemical importance is related to the winter maxima of nitrate (a photochemical •OH source) and the summer maxima of DOC (•OH sink and ³CDOM* source) in the lake water under consideration. From the combination of sunlight irradiance and chemical composition data, one predicts that the processes involving •OH and CO₃•⁻ would be most important in spring, while the reactions involving ³CDOM* would be most important in summer. Copyright © 2011 Elsevier B.V. All rights reserved.

  1. Inverse modeling of the Chernobyl source term using atmospheric concentration and deposition measurements

    NASA Astrophysics Data System (ADS)

    Evangeliou, Nikolaos; Hamburger, Thomas; Cozic, Anne; Balkanski, Yves; Stohl, Andreas

    2017-07-01

    This paper describes the results of an inverse modeling study for the determination of the source term of the radionuclides 134Cs, 137Cs and 131I released after the Chernobyl accident. The accident occurred on 26 April 1986 in the former Soviet Union and released about 10¹⁹ Bq of radioactive materials that were transported as far away as the USA and Japan. Thereafter, several attempts to assess the magnitude of the emissions were made that were based on the knowledge of the core inventory and the levels of the spent fuel. More recently, when modeling tools were further developed, inverse modeling techniques were applied to the Chernobyl case for source term quantification. However, because radioactivity is a sensitive topic for the public and attracts a lot of attention, high-quality measurements, which are essential for inverse modeling, were not made available except for a few sparse activity concentration measurements far from the source and far from the main direction of the radioactive fallout. For the first time, we apply Bayesian inversion of the Chernobyl source term using not only activity concentrations but also deposition measurements from the most recent public data set. These observations stem from a data rescue effort that started more than 10 years ago, with the final goal of making the available measurements accessible to anyone interested. With regard to our inverse modeling results, emissions of 134Cs were estimated to be 80 PBq, or 30-50% higher than previously published. Of the released amount of 134Cs, about 70 PBq were deposited all over Europe. Similar to 134Cs, emissions of 137Cs were estimated as 86 PBq, on the same order as previously reported results. Finally, 131I emissions of 1365 PBq were found, which are about 10% less than the prior total releases. The inversion pushes the injection heights of the three radionuclides to higher altitudes (up to about 3 km) than previously assumed (≈ 2.2 km) in order to better match both concentration and deposition observations over Europe. The results of the present inversion were confirmed using an independent Eulerian model, for which deposition patterns were also improved when using the estimated posterior releases. Although the independent model tends to underestimate deposition in countries that are not in the main direction of the plume, it reproduces country-level deposition very efficiently. The results were also tested for robustness against different setups of the inversion through sensitivity runs. The source term data from this study are publicly available.

  2. Fish-Eye Observing with Phased Array Radio Telescopes

    NASA Astrophysics Data System (ADS)

    Wijnholds, S. J.

    The radio astronomical community is currently developing and building several new radio telescopes based on phased array technology. These telescopes provide a large field-of-view that may in principle span a full hemisphere. This makes calibration and imaging very challenging tasks due to the complex source structures and direction-dependent radio wave propagation effects. In this thesis, calibration and imaging methods are developed based on least squares estimation of instrument and source parameters. Monte Carlo simulations and actual observations with several prototypes show that this model-based approach provides statistically and computationally efficient solutions. The error analysis provides a rigorous mathematical framework to assess the imaging performance of current and future radio telescopes in terms of the effective noise, which is the combined effect of propagated calibration errors, noise in the data and source confusion.

  3. Low birth weight and air pollution in California: Which sources and components drive the risk?

    PubMed

    Laurent, Olivier; Hu, Jianlin; Li, Lianfa; Kleeman, Michael J; Bartell, Scott M; Cockburn, Myles; Escobedo, Loraine; Wu, Jun

    2016-01-01

    Intrauterine growth restriction has been associated with exposure to air pollution, but there is a need to clarify which sources and components are most likely responsible. This study investigated the associations between low birth weight (LBW, <2500 g) in term born infants (≥37 gestational weeks) and air pollution by source and composition in California, over the period 2001-2008. Complementary exposure models were used: an empirical Bayesian kriging model for the interpolation of ambient pollutant measurements, a source-oriented chemical transport model (using California emission inventories) that estimated fine and ultrafine particulate matter (PM2.5 and PM0.1, respectively) mass concentrations (4 km × 4 km) by source and composition, a line-source roadway dispersion model at fine resolution, and traffic index estimates. Birth weight was obtained from California birth certificate records. A case-cohort design was used. Five controls per term LBW case were randomly selected (without covariate matching or stratification) from among term births. The resulting datasets were analyzed by logistic regression with a random effect by hospital, using generalized additive mixed models adjusted for race/ethnicity, education, maternal age and household income. In total 72,632 singleton term LBW cases were included. Term LBW was positively and significantly associated with interpolated measurements of ozone but not total fine PM or nitrogen dioxide. No significant association was observed between term LBW and primary PM from all sources grouped together. A positive significant association was observed for secondary organic aerosols. Exposure to elemental carbon (EC), nitrates and ammonium were also positively and significantly associated with term LBW, but only for exposure during the third trimester of pregnancy. Significant positive associations were observed between term LBW risk and primary PM emitted by on-road gasoline and diesel or by commercial meat cooking sources. Primary PM from wood burning was inversely associated with term LBW. Significant positive associations were also observed between term LBW and ultrafine particle numbers modeled with the line-source roadway dispersion model, traffic density and proximity to roadways. This large study based on complementary exposure metrics suggests that not only primary pollution sources (traffic and commercial meat cooking) but also EC and secondary pollutants are risk factors for term LBW. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. Validation of the translation of an instrument to measure reliability of written information on treatment choices: a study on attention deficit/hyperactivity disorder (ADHD).

    PubMed

    Montoya, A; Llopis, N; Gilaberte, I

    2011-12-01

    DISCERN is an instrument designed to help patients assess the reliability of written information on treatment choices. Originally created in English, there is no validated Spanish version of this instrument. This study seeks to validate the Spanish translation of the DISCERN instrument used as a primary measure in a multicenter study aimed at assessing the reliability of web-based information on treatment choices for attention deficit/hyperactivity disorder (ADHD). We used a modified version of a method for validating translated instruments in which the original source-language version is formally compared with the back-translated source-language version. Each item was ranked in terms of comparability of language, similarity of interpretability, and degree of understandability. Responses used Likert scales ranging from 1 to 7, where 1 indicates the best interpretability, language and understandability, and 7 indicates the worst. Assessments were performed by 20 raters fluent in the source language. The Spanish translation of DISCERN, based on ratings of comparability, interpretability and degree of understandability (mean score (SD): 1.8 (1.1), 1.4 (0.9) and 1.6 (1.1), respectively), was considered extremely comparable. All items received a score of less than three, so no further revision of the translation was needed. The validation process showed that the quality of the DISCERN translation was high, validating the comparable language of the translated tool for assessing written information on treatment choices for ADHD.

  5. Regulatory Technology Development Plan - Sodium Fast Reactor. Mechanistic Source Term - Metal Fuel Radionuclide Release

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grabaskas, David; Bucknor, Matthew; Jerden, James

    2016-02-01

    The development of an accurate and defensible mechanistic source term will be vital for the future licensing efforts of metal fuel, pool-type sodium fast reactors. To assist in the creation of a comprehensive mechanistic source term, the current effort sought to estimate the release fraction of radionuclides from metal fuel pins to the primary sodium coolant during fuel pin failures at a variety of temperature conditions. These release estimates were based on the findings of an extensive literature search, which reviewed past experimentation and reactor fuel damage accidents. Data sources for each radionuclide of interest were reviewed to establish release fractions, along with possible release dependencies, and the corresponding uncertainty levels. Although the current knowledge base is substantial, and radionuclide release fractions were established for the elements deemed important for the determination of offsite consequences following a reactor accident, gaps were found pertaining to several radionuclides. First, there is uncertainty regarding the transport behavior of several radionuclides (iodine, barium, strontium, tellurium, and europium) during metal fuel irradiation to high burnup levels. The migration of these radionuclides within the fuel matrix and bond sodium region can greatly affect their release during pin failure incidents. Post-irradiation examination of existing high burnup metal fuel can likely resolve this knowledge gap. Second, data regarding the radionuclide release from molten high burnup metal fuel in sodium is sparse, which makes the assessment of radionuclide release from fuel melting accidents at high fuel burnup levels difficult. This gap could be addressed through fuel melting experimentation with samples from the existing high burnup metal fuel inventory.

  6. Annual Threat Assessment of the US Intelligence Community for the House Permanent Select Committee on Intelligence

    DTIC Science & Technology

    2010-02-03

    water and power shortages, and a major currency devaluation, raising questions about his longer term political future. On foreign policy, Chavez’s...have not disappeared. Most emerging market nations have weathered the crisis, international private investment flows are recovering, and the IMF has...lose support from the IMF and other sources of finance. Bulgaria, Estonia, Greece, Hungary, Iceland, Ireland, Latvia, Lithuania, and Romania remain

  7. An Evaluation of the Hazard Prediction and Assessment Capability (HPAC) Software’s Ability to Model the Chornobyl Accident

    DTIC Science & Technology

    2002-03-01

    source term. Several publications provided a thorough accounting of the accident, including “Chernobyl Record” [Mould], and the NRC technical report...Report on the Accident at the Chernobyl Nuclear Power Station” [NUREG-1250]. The most comprehensive study of transport models to predict the...from the Chernobyl Accident: The ATMES Report” [Klug, et al.]. The Atmospheric Transport Model Evaluation Study (ATMES) report used data

  8. Medical concerns for exploration-class missions

    NASA Technical Reports Server (NTRS)

    Stewart, Donald F.; Lujan, Barbara

    1991-01-01

    The Space Exploration initiative will challenge life scientists with a diverse set of crew medical risks. The varied sources of this cumulative risk are identified and briefly discussed in terms of risk assessment and preliminary plans for risk management. The roles of Space Station Freedom and other flight programs are discussed in the context of exploration medical objectives. The significant differences between Space Station era (second generation) and exploration medical support systems (third generation) are reviewed.

  9. ENERGY AND OUR ENVIRONMENT: A SYSTEMS AND LIFE ...

    EPA Pesticide Factsheets

    This is a presentation to the North Carolina BREATE Conference on March 28, 2017. The presentation provides an overview of energy modeling capabilities in ORD and includes examples related to scenario development, the water-energy nexus, bioenergy, etc. The focus is on systems approaches as well as life cycle assessment data and tools, giving an overview of system and life cycle approaches to modeling medium- to long-term changes in the drivers of emission sources.

  10. Energy performance of a ventilation system for a block of apartments with a ground source heat pump as generation system

    NASA Astrophysics Data System (ADS)

    Lucchi, M.; Lorenzini, M.; Valdiserri, P.

    2017-01-01

    This work presents a numerical simulation of the annual performance of two different systems: a traditional one composed of a gas boiler-chiller pair and one consisting of a ground source heat pump (GSHP), both coupled to two thermal storage tanks. The systems serve a block of flats located in northern Italy and are assessed over a typical weather year, covering both the heating and cooling seasons. The air handling unit (AHU) coupled with the GSHP exhibits excellent characteristics in terms of temperature control and has high performance parameters (EER and COP), which make running costs about 30% lower than those estimated for the traditional plant.

  11. Does the Method of Weight Loss Effect Long-Term Changes in Weight, Body Composition or Chronic Disease Risk Factors in Overweight or Obese Adults? A Systematic Review

    PubMed Central

    Washburn, Richard A.; Szabo, Amanda N.; Lambourne, Kate; Willis, Erik A.; Ptomey, Lauren T.; Honas, Jeffery J.; Herrmann, Stephen D.; Donnelly, Joseph E.

    2014-01-01

    Background Differences in biological changes from weight loss by energy restriction and/or exercise may be associated with differences in long-term weight loss/regain. Objective To assess the effect of weight loss method on long-term changes in weight, body composition and chronic disease risk factors. Data Sources PubMed and Embase were searched (January 1990-October 2013) for studies with data on the effect of energy restriction, exercise (aerobic and resistance) on long-term weight loss. Twenty articles were included in this review. Study Eligibility Criteria Primary source, peer reviewed randomized trials published in English with an active weight loss period of >6 months, or active weight loss with a follow-up period of any duration, conducted in overweight or obese adults were included. Study Appraisal and Synthesis Methods Considerable heterogeneity across trials existed for important study parameters, therefore a meta-analysis was considered inappropriate. Results were synthesized and grouped by comparisons (e.g. diet vs. aerobic exercise, diet vs. diet + aerobic exercise etc.) and study design (long-term or weight loss/follow-up). Results Forty percent of trials reported significantly greater long-term weight loss with diet compared with aerobic exercise, while results for differences in weight regain were inconclusive. Diet+aerobic exercise resulted in significantly greater weight loss than diet alone in 50% of trials. However, weight regain (∼55% of loss) was similar in diet and diet+aerobic exercise groups. Fat-free mass tended to be preserved when interventions included exercise. PMID:25333384

  12. Air quality impact assessment of at-berth ship emissions: Case-study for the project of a new freight port.

    PubMed

    Lonati, Giovanni; Cernuschi, Stefano; Sidi, Shelina

    2010-12-01

    This work is intended to assess the impact on local air quality of atmospheric emissions from port area activities for a planned new port in the Mediterranean Sea. The sources of air pollutants in the harbour area are auxiliary engines used by ships at berth during loading/offloading operations. A fleet activity-based methodology is first applied to evaluate annual pollutant emissions (NO(X), SO(X), PM, CO and VOC) based on vessel traffic data, ship tonnage and in-port hotelling time for loading/offloading operations. The three-dimensional CALPUFF transport and dispersion model is then applied for the subsequent assessment of the ground-level spatial distribution of atmospheric pollutants for both long-term and short-term averaging times. Compliance with current air quality standards in the port area is finally evaluated and indications for port operation are provided. Some methodological aspects of the impact assessment procedure, namely those concerning the definition of emission scenarios and the set-up of model simulations at the project stage, are specifically addressed, suggesting a pragmatic approach for similar evaluations of other small planned ports. Copyright © 2010 Elsevier B.V. All rights reserved.

  13. Morphodynamics and Sediment connectivity in the Kosi River basin in the Himalaya and their implications for river management

    NASA Astrophysics Data System (ADS)

    Sinha, R.; Mishra, K.; Swrankar, S.; Jain, V.; Nepal, S.; Uddin, K.

    2017-12-01

    The sediment flux of large tropical rivers is strongly influenced by the degree of linkage between sediment sources and sinks (i.e. sediment connectivity). Sediment connectivity, especially at the catchment scale, depends largely on the morphological characteristics of the catchment such as relief, terrain roughness, slope, elevation, stream network density and catchment shape, and on the combined effects of land use, particularly vegetation. Understanding the spatial distribution of sediment connectivity and its temporal evolution can be useful for the characterization of sediment source areas. Specifically, these areas represent sites of instability, and their connectivity influences the probability of sediment transfer at a local scale that will propagate downstream through a feedback system. This paper evaluates the morphodynamics and sediment connectivity of the Kosi basin in Nepal and India at various spatial and temporal scales. Our results provide a first-order assessment of the spatial sediment connectivity in terms of the channel connectivity (IC outlet) and source-to-channel connectivity (IC channel) of the upstream and midstream Kosi basin. This assessment helped in the characterization of sediment dynamics in complex morphological settings and in a mixed environment. Further, the Revised Universal Soil Loss Equation (RUSLE) was used to quantify soil erosion, and a sediment transport capacity equation was used to quantify sediment flux on a cell-by-cell basis. The Sediment Delivery Ratio (SDR) was calculated for each sub-basin to identify sediment production-limited and transport capacity-limited sub-basins. We then integrated all results to assess the sediment flux in the Kosi basin in relation to sediment connectivity and the factors controlling the pathways of sediment delivery. The results of this work have significant implications for sediment management of the Kosi river in terms of the identification of hotspots of sediment accumulation, which will in turn be manifested in the morphodynamics of the river in the alluvial reaches.

  14. Impact of saline water sources on hypertension and cardiovascular disease risk in coastal Bangladesh

    NASA Astrophysics Data System (ADS)

    Butler, Adrian; Hoque, Mohammad; Mathewson, Eleanor; Ahmed, Kazi; Rahman, Moshuir; Vineis, Paolo; Scheelbeek, Pauline

    2016-04-01

    Southern Bangladesh is periodically affected by tropical cyclone induced storm surges. Such events can result in the inundation of large areas of the coastal plain by sea water. Over time these episodic influxes of saline water have led to the build-up of high salinities (e.g. > 1,000 mg/l) in the shallow (up to ca. 150 m depth) groundwater. Owing to the highly saline groundwater, local communities have developed alternative surface water sources by constructing artificial drinking water ponds, which collect monsoonal rainwater. These have far greater storage than traditional rainwater harvesting systems, which typically use 40 litre storage containers that are quickly depleted during the dry season. Unfortunately, the ponds can also become salinised during storm surge events, the impacts of which can last for a number of years. A combined hydrological and epidemiological research programme has been undertaken over the past two years to understand the potential health risks associated with these saline water sources, as excessive intake of sodium can lead to hypertension and an increased risk of cardiovascular disease (such as stroke and heart attack). An important aspect of the selected research sites was the variety of drinking water sources available. These included managed aquifer recharge sites where monsoonal rainwater is stored in near-surface (semi-)confined aquifers for abstraction during the dry season. This provided an opportunity to assess the effects of interventions with lower salinity sources. Adjusting for confounding factors such as age, gender and diet, the results show a significant association between salinity and blood pressure. Furthermore, the results also showed that such impacts are reversible. In order to evaluate the costs and benefits of such interventions, a water salinity - dose impact model is being developed to assess the effectiveness of alternative drinking water sources, such as enhanced rainwater harvesting and localised solar distillation, as well as the long-term risks from traditional water sources due to climate change. Preliminary results from the model will be presented showing the relative impacts of these interventions. These highlight the need for an integrated approach to salinity management in such coastal deltas in order to improve the long-term health of local communities living in these areas.

  15. What do popular Spanish women's magazines say about caesarean section? A 21-year survey.

    PubMed

    Torloni, M R; Campos Mansilla, B; Merialdi, M; Betrán, A P

    2014-04-01

    Caesarean section (CS) rates are increasing worldwide and maternal request is cited as one of the main reasons for this trend. Women's preferences for route of delivery are influenced by popular media, including magazines. We assessed the information on CS presented in Spanish women's magazines. Systematic review. Women's magazines printed from 1989 to 2009 with the largest national distribution. Articles with any information on CS. Articles were selected, read and abstracted in duplicate. Sources of information, scientific accuracy, comprehensiveness and women's testimonials were objectively extracted using a content analysis form designed for this study. Accuracy, comprehensiveness and sources of information. Most (67%) of the 1223 selected articles presented exclusively personal opinion/birth stories, 12% reported the potential benefits of CS, 26% mentioned the short-term and 10% mentioned the long-term maternal risks, and 6% highlighted the perinatal risks of CS. The most frequent short-term risks were the increased time for maternal recovery (n = 86), frustration/feelings of failure (n = 83) and increased post-surgical pain (n = 71). The most frequently cited long-term risks were uterine rupture (n = 57) and the need for another CS in any subsequent pregnancy (n = 42). Less than 5% of the selected articles reported that CS could increase the risks of infection (n = 53), haemorrhage (n = 31) or placenta praevia/accreta in future pregnancies (n = 6). The sources of information were not reported by 68% of the articles. The portrayal of CS in Spanish women's magazines is not sufficiently comprehensive and does not provide adequate important information to help the readership to understand the real benefits and risks of this route of delivery. © 2014 The Authors. BJOG An International Journal of Obstetrics and Gynaecology published by John Wiley & Sons Ltd on behalf of Royal College of Obstetricians and Gynaecologists.

  16. What do popular Spanish women's magazines say about caesarean section? A 21-year survey

    PubMed Central

    Torloni, MR; Campos Mansilla, B; Merialdi, M; Betrán, AP

    2014-01-01

    Objectives Caesarean section (CS) rates are increasing worldwide and maternal request is cited as one of the main reasons for this trend. Women's preferences for route of delivery are influenced by popular media, including magazines. We assessed the information on CS presented in Spanish women's magazines. Design Systematic review. Setting Women's magazines printed from 1989 to 2009 with the largest national distribution. Sample Articles with any information on CS. Methods Articles were selected, read and abstracted in duplicate. Sources of information, scientific accuracy, comprehensiveness and women's testimonials were objectively extracted using a content analysis form designed for this study. Main outcome measures Accuracy, comprehensiveness and sources of information. Results Most (67%) of the 1223 selected articles presented exclusively personal opinion/birth stories, 12% reported the potential benefits of CS, 26% mentioned the short-term and 10% mentioned the long-term maternal risks, and 6% highlighted the perinatal risks of CS. The most frequent short-term risks were the increased time for maternal recovery (n = 86), frustration/feelings of failure (n = 83) and increased post-surgical pain (n = 71). The most frequently cited long-term risks were uterine rupture (n = 57) and the need for another CS in any subsequent pregnancy (n = 42). Less than 5% of the selected articles reported that CS could increase the risks of infection (n = 53), haemorrhage (n = 31) or placenta praevia/accreta in future pregnancies (n = 6). The sources of information were not reported by 68% of the articles. Conclusions The portrayal of CS in Spanish women's magazines is not sufficiently comprehensive and does not provide adequate important information to help the readership to understand the real benefits and risks of this route of delivery. PMID:24467797

  17. Source apportionment of wet-deposited atmospheric mercury in Tampa, Florida

    NASA Astrophysics Data System (ADS)

    Michael, Ryan; Stuart, Amy L.; Trotz, Maya A.; Akiwumi, Fenda

    2016-03-01

    In this paper, sources of mercury deposition to the Tampa area (Florida, USA) are investigated by analysis of one year (March 2000-March 2001) of daily wet deposition data. HYSPLIT back-trajectory modeling was performed to assess potential source locations for high versus low concentration events in data stratified by precipitation level. Positive matrix factorization (PMF) was also applied to apportion the elemental compositions from each event and to identify sources. Increased total mercury deposition was observed during summer months, corresponding to increased precipitation. However, mercury concentration in deposited samples was not strongly correlated with precipitation amount. Back-trajectories show air masses passing over Florida land in the short (12 h) and medium (24 h) term prior to deposition for high mercury concentration events. PMF results indicate that eleven factors contribute to the deposited elements in the event data. Diagnosed elemental profiles suggest the sources that contribute to mercury wet deposition at the study site are coal combustion (52% of the deposited mercury mass), municipal waste incineration (23%), medical waste incineration (19%), and crustal dust (6%). Overall, results suggest that sources local to the county and in Florida likely contributed substantially to mercury deposition at the study site, but distant sources may also contribute.
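
    As a loose illustration of the apportionment step, the sketch below factorizes an event-by-element concentration matrix and attributes the mass of one species to the resulting factors. It uses scikit-learn's non-negative matrix factorization as a simplified stand-in for PMF (it omits the measurement-uncertainty weighting that defines PMF proper), and all numbers are synthetic.

        import numpy as np
        from sklearn.decomposition import NMF

        # Synthetic event-by-element matrix (rows: deposition events, columns: elements
        # such as Hg, Se, As, Al, ...). Values are illustrative only.
        rng = np.random.default_rng(0)
        X = np.abs(rng.normal(loc=1.0, scale=0.3, size=(40, 8)))

        # Non-negative factorization X ~ G @ F: G holds the per-event factor
        # contributions, F the elemental profile of each factor.
        model = NMF(n_components=4, init="nndsvda", max_iter=1000, random_state=0)
        G = model.fit_transform(X)
        F = model.components_

        # Share of the first species (say Hg, column 0) attributed to each factor.
        hg_by_factor = (G * F[:, 0]).sum(axis=0)
        print(hg_by_factor / hg_by_factor.sum())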

  18. Assessment and treatment of sex offenders: the Prince of Wales Programme.

    PubMed

    McConaghy, N

    1990-06-01

    The treatment programme for sex offenders at the Prince of Wales Hospital, Sydney, is described. Penile circumference assessment is not used as there is no evidence it provides a valid measure of individuals' paedophile or rapist tendencies. Sex offenders' self-reports remain the major source of information in their assessment. The development of the two major techniques used--imaginal desensitization and short-term medroxyprogesterone--is outlined. About 80% of subjects can be expected to show a good response to one or other of these therapies. Of those who do not, most respond to the alternative or aversive therapy. Adolescent offenders appear to require more intensive treatment. Results appear comparable with those of more intensive programmes in use overseas.

  19. Source Term Model for Vortex Generator Vanes in a Navier-Stokes Computer Code

    NASA Technical Reports Server (NTRS)

    Waithe, Kenrick A.

    2004-01-01

    A source term model for an array of vortex generators was implemented into a non-proprietary Navier-Stokes computer code, OVERFLOW. The source term models the side force created by a vortex generator vane. The model is obtained by introducing a side force to the momentum and energy equations that can adjust its strength automatically based on the local flow. The model was tested and calibrated by comparing data from numerical simulations and experiments of a single low profile vortex generator vane on a flat plate. In addition, the model was compared to experimental data of an S-duct with 22 co-rotating, low profile vortex generators. The source term model allowed a grid reduction of about seventy percent when compared with the numerical simulations performed on a fully gridded vortex generator on a flat plate without adversely affecting the development and capture of the vortex created. The source term model was able to predict the shape and size of the stream-wise vorticity and velocity contours very well when compared with both numerical simulations and experimental data. The peak vorticity and its location were also predicted very well when compared to numerical simulations and experimental data. The circulation predicted by the source term model matches the prediction of the numerical simulation. The source term model predicted the engine fan face distortion and total pressure recovery of the S-duct with 22 co-rotating vortex generators very well. The source term model allows a researcher to quickly investigate different locations of individual or a row of vortex generators. The researcher is able to conduct a preliminary investigation with minimal grid generation and computational time.
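
    The sketch below illustrates the general idea of a vane source term: a side force, scaled by the local dynamic pressure and flow incidence, is added to the momentum residual of the cells that the modeled vane occupies (the corresponding work term would be added to the energy equation). It is a generic illustration under assumed scalings, not the actual OVERFLOW implementation or the published model's exact form; every name and constant here is hypothetical.

        import numpy as np

        def add_vane_source(dmomdt, rho, u, cells, area_per_cell, n_hat, t_hat, c_vg=10.0):
            """Add a vane-like side-force source to the momentum residual (illustrative).

            dmomdt        -- momentum residual, shape (ncells, 3); modified in place
            rho, u        -- cell density (ncells,) and velocity (ncells, 3)
            cells         -- indices of the cells occupied by the modeled vane
            area_per_cell -- vane planform area apportioned to each of those cells
            n_hat, t_hat  -- unit vectors normal to and along the vane surface
            c_vg          -- empirical model constant (hypothetical value)
            """
            for i in cells:
                speed = np.linalg.norm(u[i]) + 1e-12
                # Strength adapts to the local flow: dynamic pressure times the
                # incidence of the local velocity onto the vane normal.
                q = 0.5 * rho[i] * speed**2
                incidence = np.dot(u[i], n_hat) / speed
                side_dir = np.cross(u[i], t_hat)
                side_dir /= np.linalg.norm(side_dir) + 1e-12
                dmomdt[i] += c_vg * q * area_per_cell * incidence * side_dir
            return dmomdt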

  20. A Well-Balanced Path-Integral f-Wave Method for Hyperbolic Problems with Source Terms

    PubMed Central

    2014-01-01

    Systems of hyperbolic partial differential equations with source terms (balance laws) arise in many applications where it is important to compute accurate time-dependent solutions modeling small perturbations of equilibrium solutions in which the source terms balance the hyperbolic part. The f-wave version of the wave-propagation algorithm is one approach, but requires the use of a particular averaged value of the source terms at each cell interface in order to be “well balanced” and exactly maintain steady states. A general approach to choosing this average is developed using the theory of path conservative methods. A scalar advection equation with a decay or growth term is introduced as a model problem for numerical experiments. PMID:24563581
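
    For the scalar model problem mentioned above (advection with a decay term), a minimal f-wave update can be written in a few lines: the integrated source at each cell interface is subtracted from the flux difference before the fluctuation is propagated, so the corresponding discrete steady state is preserved exactly. The sketch below is a bare-bones illustration under that assumption, not the paper's general path-conservative construction.

        import numpy as np

        # Model problem: u_t + a u_x = -lam * u with a > 0 (advection with decay).
        a, lam = 1.0, 0.5
        nx, L = 200, 10.0
        dx = L / nx
        dt = 0.8 * dx / a                      # CFL-limited time step

        # Discrete steady state of the balance a*(u_i - u_{i-1}) = -lam*dx*(u_i + u_{i-1})/2.
        r = (2 * a - lam * dx) / (2 * a + lam * dx)
        u = r ** np.arange(nx)

        for _ in range(500):
            ul, ur = u[:-1], u[1:]
            # Interface-averaged source, integrated over the cell width.
            psi = -lam * 0.5 * (ul + ur) * dx
            # f-wave: flux difference minus the integrated source at the interface.
            z = a * (ur - ul) - psi
            # With a > 0 the whole fluctuation travels right; the inflow cell is held fixed.
            u[1:] -= dt / dx * z

        print("max drift from the discrete steady state:", np.abs(u - r ** np.arange(nx)).max())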

  1. Spatial and temporal variations of loads and sources of total and dissolved Phosphorus in a set of rivers (Western France).

    NASA Astrophysics Data System (ADS)

    Legeay, Pierre-Louis; Moatar, Florentina; Gascuel-Odoux, Chantal; Gruau, Gérard

    2015-04-01

    In intensive agricultural regions with extensive livestock farming, long-term land application of Phosphorus (P), both as chemical fertilizer and as animal waste, has resulted in elevated P contents in soils. Although high P concentrations in rivers are of major concern, few studies have assessed the spatiotemporal variability of P loads in rivers and the apportionment of point and nonpoint sources in total loads. Here we focus on Brittany (Western France) where, even though P is a major issue in terms of drinking water safety (cyanotoxins), environmental protection and economic costs because of the periodic proliferations of cyanobacteria that occur every year in this region, no regional-scale systematic study has been carried out so far. We selected a set of small rivers (stream order 3-5) with homogeneous agriculture and granitic catchments. By gathering data from three water quality monitoring networks, covering more than 100 measurement stations, we provide a regional-scale quantification of the spatiotemporal variability of dissolved P (DP) and total P (TP) interannual loads from 1992 to 2012. Building on mean P loads at low flow and statistical significance tests, we developed a new indicator, called 'low flow P load' (LFP-load), which allows us to determine the importance of domestic and industrial P sources in the total P load and to assess their spatiotemporal variability compared with agricultural sources. The calculation and map representation of DP and TP interannual load variations allow identification of the catchments contributing the most and the least P over the study period, and of the way P loads of Brittany rivers have evolved through time. Both mean DP and TP loads have been divided by more than two over the last 20 years. Mean LFDP-load decreased by more than 60% and mean LFTP-load by more than 45% on average over the same period, showing that this marked temporal decrease in total load is largely due to the decrease of domestic and industrial P effluents. A global shift in the apportionment of P inputs to freshwaters has thus occurred in Brittany over the last 20 years, as agricultural nonpoint sources now contribute a greater portion of inputs, showing the efficiency of the recent control of point sources through upgrades of wastewater treatment plants and the removal of phosphates from detergents. The spatialized P loads provided by this study could give a basis for a better understanding of the factors that drive P transfers in Brittany soils and of hotspots of P emissions, while the LFP-load indicator can be a tool to assess the effects of point-source P mitigation plans.
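
    The low-flow load idea can be sketched very simply: under low flow, diffuse agricultural transfers are small, so the mean P load on low-flow days mainly reflects domestic and industrial point discharges. The snippet below illustrates that reasoning with invented daily data and an arbitrary 30th-percentile flow threshold; it is not the exact definition or thresholds used in the study.

        import pandas as pd

        # Invented daily series of discharge (m3/s) and total-P concentration (mg/l).
        df = pd.DataFrame({
            "flow": [12.0, 10.5, 3.1, 2.8, 2.5, 9.7, 2.9, 11.3, 2.6, 2.7],
            "tp":   [0.12, 0.10, 0.25, 0.28, 0.30, 0.11, 0.27, 0.09, 0.29, 0.26],
        })

        # Instantaneous load in g/s: concentration (mg/l = g/m3) times discharge (m3/s).
        df["load"] = df["tp"] * df["flow"]

        # "Low-flow P load": mean load on days below a low-flow threshold.
        threshold = df["flow"].quantile(0.30)
        lfp_load = df.loc[df["flow"] <= threshold, "load"].mean()

        print("low-flow TP load (g/s):", round(lfp_load, 3))
        print("share of the mean total load:", round(lfp_load / df["load"].mean(), 2))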

  2. 26 CFR 1.737-1 - Recognition of precontribution gain.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... Property A1 and Property A2 is long-term, U.S.-source capital gain or loss. The character of gain on Property A3 is long-term, foreign-source capital gain. B contributes Property B, nondepreciable real... long-term, U.S.-source capital gain ($10,000 gain on Property A1 and $8,000 loss on Property A2) and $1...

  3. Hanford Site Composite Analysis Technical Approach Description: Automated Quality Assurance Process Design.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dockter, Randy E.

    2017-07-31

    The U.S. Department of Energy (DOE) in DOE O 435.1 Chg. 1, Radioactive Waste Management, requires the preparation and maintenance of a composite analysis (CA). The primary purpose of the CA is to provide a reasonable expectation that the primary public dose limit is not likely to be exceeded by multiple source terms that may significantly interact with plumes originating at a low-level waste disposal facility. The CA is used to facilitate planning and land use decisions that help assure disposal facility authorization will not result in long-term compliance problems; or, to determine management alternatives, corrective actions, or assessment needs if potential problems are identified.

  4. Providing long term care for sex offenders: liabilities and responsibilities.

    PubMed

    Corson, Tyler Rogers; Nadash, Pamela

    2013-11-01

    The high risk for recidivism among sex offenders who need long term care (LTC) raises serious issues when they are cared for alongside frail, vulnerable adults. LTC providers must balance offenders' right to access care with other residents' right to be free from abuse and must assess and manage the risks associated with admitting offenders. This article identifies sources of legal liability that derive from sex offender management and discusses the need for the LTC community to develop reasonable, balanced guidance on how best to mitigate the risks associated with sex offenders, protect the rights of all residents, and reduce provider liabilities. Copyright © 2013 American Medical Directors Association, Inc. Published by Elsevier Inc. All rights reserved.

  5. Hanford Site Composite Analysis Technical Approach Description: Atmospheric Transport Modeling.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sun, B.; Lehman, L. L.

    2017-10-02

    The U.S. Department of Energy (DOE) in DOE O 435.1 Chg. 1, Radioactive Waste Management, requires the preparation and maintenance of a composite analysis (CA). The primary purpose of the CA is to provide a reasonable expectation that the primary public dose limit is not likely to be exceeded by multiple source terms that may significantly interact with plumes originating at a low-level waste disposal facility. The CA is used to facilitate planning and land use decisions that help assure disposal facility authorization will not result in long-term compliance problems; or, to determine management alternatives, corrective actions or assessment needs, if potential problems are identified.

  6. Hanford Site Composite Analysis Technical Approach Description: Waste Form Release.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hardie, S.; Paris, B.; Apted, M.

    2017-09-14

    The U.S. Department of Energy (DOE) in DOE O 435.1 Chg. 1, Radioactive Waste Management, requires the preparation and maintenance of a composite analysis (CA). The primary purpose of the CA is to provide a reasonable expectation that the primary public dose limit is not likely to be exceeded by multiple source terms that may significantly interact with plumes originating at a low-level waste disposal facility. The CA is used to facilitate planning and land use decisions that help assure disposal facility authorization will not result in long-term compliance problems; or, to determine management alternatives, corrective actions or assessment needs, if potential problems are identified.

  7. Hanford Site Composite Analysis Technical Approach Description: Integrated Computational Framework.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, K. J.

    2017-09-14

    The U.S. Department of Energy (DOE) in DOE O 435.1 Chg. 1, Radioactive Waste Management, requires the preparation and maintenance of a composite analysis (CA). The primary purpose of the CA is to provide a reasonable expectation that the primary public dose limit is not likely to be exceeded by multiple source terms that may significantly interact with plumes originating at a low-level waste disposal facility. The CA is used to facilitate planning and land use decisions that help assure disposal facility authorization will not result in long-term compliance problems; or, to determine management alternatives, corrective actions, or assessment needs if potential problems are identified.

  8. Sources, distribution, bioavailability, toxicity, and risk assessment of heavy metal(loid)s in complementary medicines.

    PubMed

    Bolan, Shiv; Kunhikrishnan, Anitha; Seshadri, Balaji; Choppala, Girish; Naidu, Ravi; Bolan, Nanthi S; Ok, Yong Sik; Zhang, Ming; Li, Chun-Guang; Li, Feng; Noller, Barry; Kirkham, Mary Beth

    2017-11-01

    The last few decades have seen the rise of alternative medical approaches including the use of herbal supplements, natural products, and traditional medicines, which are collectively known as 'complementary medicines'. However, there are increasing concerns about the safety and health benefits of these medicines. One of the main hazards with the use of complementary medicines is the presence of heavy metal(loid)s such as arsenic (As), cadmium (Cd), lead (Pb), and mercury (Hg). This review deals with the characteristics of complementary medicines in terms of heavy metal(loid) sources, distribution, bioavailability, toxicity, and human risk assessment. The heavy metal(loid)s in these medicines are derived from uptake by medicinal plants, cross-contamination during processing, and therapeutic input of metal(loid)s. This paper discusses the distribution of heavy metal(loid)s in these medicines in terms of their nature, concentration, and speciation. The importance of determining bioavailability for human health risk assessment is emphasized by the need to estimate the daily intake of heavy metal(loid)s in complementary medicines. The review ends with selected case studies of heavy metal(loid) toxicity from complementary medicines, with specific reference to As, Cd, Pb, and Hg. The future research opportunities mentioned in the conclusion of the review will help researchers explore new avenues, methodologies, and approaches to the issue of heavy metal(loid)s in complementary medicines, thereby informing new regulations and proposing a fresh approach to the safe use of these medicines. Copyright © 2017 Elsevier Ltd. All rights reserved.

  9. Observation-based source terms in the third-generation wave model WAVEWATCH

    NASA Astrophysics Data System (ADS)

    Zieger, Stefan; Babanin, Alexander V.; Erick Rogers, W.; Young, Ian R.

    2015-12-01

    Measurements collected during the AUSWEX field campaign, at Lake George (Australia), resulted in new insights into the processes of wind wave interaction and whitecapping dissipation, and consequently new parameterizations of the input and dissipation source terms. The new nonlinear wind input term developed accounts for dependence of the growth on wave steepness, airflow separation, and for negative growth rate under adverse winds. The new dissipation terms feature the inherent breaking term, a cumulative dissipation term and a term due to production of turbulence by waves, which is particularly relevant for decaying seas and for swell. The latter is consistent with the observed decay rate of ocean swell. This paper describes these source terms implemented in WAVEWATCH III® and evaluates the performance against existing source terms in academic duration-limited tests, against buoy measurements for windsea-dominated conditions, under conditions of extreme wind forcing (Hurricane Katrina), and against altimeter data in global hindcasts. Results show agreement by means of growth curves as well as integral and spectral parameters in the simulations and hindcast.

  10. Optimization of a mirror-based neutron source using differential evolution algorithm

    NASA Astrophysics Data System (ADS)

    Yurov, D. V.; Prikhodko, V. V.

    2016-12-01

    This study is dedicated to assessing the capabilities of the gas-dynamic trap (GDT) and the gas-dynamic multiple-mirror trap (GDMT) as potential neutron sources for subcritical hybrids. In mathematical terms, the problem has been formulated as determining the global maximum of the fusion gain (Q_pl), represented as a function of trap parameters. A differential evolution method has been applied to perform the search. All calculations considered a neutron source configuration with a 20 m distance between the mirrors and 100 MW of heating power. It is important to mention that the numerical study also took into account a number of constraints on plasma characteristics so as to ensure the physical credibility of the searched-for trap configurations. According to the results obtained, the traps considered demonstrate a fusion gain of up to 0.2, depending on the constraints applied. This enables them to be used either as neutron sources within subcritical reactors for minor actinide incineration or as material-testing facilities.
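
    A compact way to reproduce the search strategy is to run a stock differential evolution optimizer over a box of trap parameters, with constraint violations handled by a penalty. The sketch below does exactly that using scipy, but the objective is a made-up surrogate for Q_pl and the parameter names, bounds and constraint are invented for illustration.

        import numpy as np
        from scipy.optimize import differential_evolution

        def neg_fusion_gain(params):
            """Toy surrogate for -Q_pl(params); stands in for the plasma model."""
            mirror_ratio, beta, t_e = params
            q = 0.2 * np.exp(-((mirror_ratio - 12.0) / 6.0) ** 2
                             - ((beta - 0.45) / 0.2) ** 2
                             - ((t_e - 0.7) / 0.4) ** 2)
            # Penalize configurations violating a (made-up) stability constraint.
            if beta * mirror_ratio > 8.0:
                q -= 10.0
            return -q

        bounds = [(2.0, 30.0),    # mirror ratio
                  (0.05, 0.8),    # relative plasma pressure (beta)
                  (0.1, 2.0)]     # electron temperature, keV

        result = differential_evolution(neg_fusion_gain, bounds, seed=1, tol=1e-8)
        print("best parameters:", result.x)
        print("maximum surrogate Q_pl:", -result.fun)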

  11. Reduced mercury deposition in New Hampshire from 1996 to 2002 due to changes in local sources.

    PubMed

    Han, Young-Ji; Holsen, Thomas M; Evers, David C; Driscoll, Charles T

    2008-12-01

    Changes in deposition of gaseous divalent mercury (Hg(II)) and particulate mercury (Hg(p)) in New Hampshire due to changes in local sources from 1996 to 2002 were assessed using the Industrial Source Complex Short Term (ISCST3) model (regional and global sources and Hg atmospheric reactions were not considered). Mercury (Hg) emissions in New Hampshire and adjacent areas decreased significantly (from 1540 to 880 kg yr(-1)) during this period, and the average annual modeled deposition of total Hg also declined from 17 to 7.0 microg m(-2) yr(-1) for the same period. In 2002, the maximum amount of Hg deposition was modeled to be in southern New Hampshire, while for 1996 the maximum deposition occurred farther north and east. The ISCST3 was also used to evaluate two future scenarios. The average percent difference in deposition across all cells was 5% for the 50% reduction scenario and 9% for the 90% reduction scenario.

  12. Health and ecological risk assessment of heavy metals pollution in an antimony mining region: a case study from South China.

    PubMed

    Fei, Jiang-Chi; Min, Xiao-Bo; Wang, Zhen-Xing; Pang, Zhi-Hua; Liang, Yan-Jie; Ke, Yong

    2017-12-01

    In recent years, international research on the toxicity of the heavy metal antimony has gradually shifted in focus from early medical and pharmacological toxicology to environmental toxicology and ecotoxicology. However, little research has been conducted on the identification of sources and the risk management of heavy metal pollution caused by long-term antimony mining activities. In this study, a large number of investigations were conducted on the temporal and spatial distribution of antimony and related heavy metal contaminants (lead, zinc, and arsenic), as well as on the exposure risks for the population of the Yuxi River basin in Hunan province, China. The scope of the investigations included mine water, waste rock, tailings, agricultural soil, surface water, river sediments, and groundwater sources of drinking water. Health and ecological risks from exposure to heavy metal pollution were evaluated. The main pollution sources of heavy metals in the Yuxi River basin were analyzed. Remediation programs and risk management strategies for heavy metal pollution were consequently proposed. This article provides a scientific basis for the risk assessment and management of heavy metal pollution caused by antimony ore mining in the basin.

  13. Seasonal variation of acute gastro-intestinal illness by hydroclimatic regime and drinking water source: a retrospective population-based study.

    PubMed

    Galway, Lindsay P; Allen, Diana M; Parkes, Margot W; Takaro, Tim K

    2014-03-01

    Acute gastro-intestinal illness (AGI) is a major cause of mortality and morbidity worldwide and an important public health problem. Despite the fact that AGI is currently responsible for a huge burden of disease throughout the world, important knowledge gaps exist in terms of its epidemiology. Specifically, an understanding of seasonality and of the factors driving seasonal variation remains elusive. This paper aims to assess variation in the incidence of AGI in British Columbia (BC), Canada over an 11-year study period. We assessed variation in AGI dynamics in general, and disaggregated by hydroclimatic regime and drinking water source. We used several different visual and statistical techniques to describe and characterize seasonal and annual patterns in AGI incidence over time. Our results consistently illustrate marked seasonal patterns; seasonality remains when the dataset is disaggregated by hydroclimatic regime and drinking water source, although differences in the magnitude and timing of the peaks and troughs are noted. We conclude that the systematic description of infectious illness dynamics over time is a valuable tool for informing disease prevention strategies and generating hypotheses to guide future research in an era of global environmental change.

  14. Bayesian source term determination with unknown covariance of measurements

    NASA Astrophysics Data System (ADS)

    Belal, Alkomiet; Tichý, Ondřej; Šmídl, Václav

    2017-04-01

    Determination of the source term of a release of hazardous material into the atmosphere is a very important task for emergency response. We are concerned with the problem of estimating the source term in the conventional linear inverse problem, y = Mx, where the relationship between the vector of observations y and the unknown source term x is described by the source-receptor-sensitivity (SRS) matrix M. Since the system is typically ill-conditioned, the problem is recast as the optimization problem min_x (y - Mx)^T R^(-1) (y - Mx) + x^T B^(-1) x. The first term minimizes the error of the measurements with covariance matrix R, and the second term is a regularization of the source term. Different types of regularization arise for different choices of the matrices R and B; for example, Tikhonov regularization takes the covariance matrix B to be the identity matrix multiplied by a scalar parameter. In this contribution, we adopt a Bayesian approach to make inference on the unknown source term x as well as the unknown R and B. We assume the prior on x to be Gaussian with zero mean and an unknown diagonal covariance matrix B. The covariance matrix of the likelihood, R, is also unknown. We consider two potential choices for the structure of the matrix R: the first is a diagonal matrix and the second is a locally correlated structure using information on the topology of the measuring network. Since exact inference in the model is intractable, an iterative variational Bayes algorithm is used for simultaneous estimation of all model parameters. The practical usefulness of our contribution is demonstrated by applying the resulting algorithm to real data from the European Tracer Experiment (ETEX). This research is supported by the EEA/Norwegian Financial Mechanism under project MSMT-28477/2014, Source-Term Determination of Radionuclide Releases by Inverse Atmospheric Dispersion Modelling (STRADI).
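
    For fixed R and B the quadratic objective above has the closed-form minimizer x = (M^T R^-1 M + B^-1)^-1 M^T R^-1 y; the sketch below implements just that deterministic special case on synthetic data (the full method in the abstract additionally estimates R and B with a variational Bayes iteration, which is not reproduced here). All matrices and values are invented for illustration.

        import numpy as np

        def regularized_source_term(M, y, R, B):
            """Minimize (y - Mx)^T R^-1 (y - Mx) + x^T B^-1 x with respect to x."""
            Rinv = np.linalg.inv(R)
            Binv = np.linalg.inv(B)
            A = M.T @ Rinv @ M + Binv
            return np.linalg.solve(A, M.T @ Rinv @ y)

        # Small synthetic example: 3 release segments observed by 6 receptors.
        rng = np.random.default_rng(2)
        M = np.abs(rng.normal(size=(6, 3)))           # source-receptor-sensitivity matrix
        x_true = np.array([5.0, 0.0, 2.0])            # true releases per segment
        y = M @ x_true + 0.05 * rng.normal(size=6)    # noisy observations

        R = 0.05 ** 2 * np.eye(6)                     # measurement-error covariance
        B = 10.0 * np.eye(3)                          # prior covariance (Tikhonov-like)
        print(regularized_source_term(M, y, R, B))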

  15. Source Term Model for Steady Micro Jets in a Navier-Stokes Computer Code

    NASA Technical Reports Server (NTRS)

    Waithe, Kenrick A.

    2005-01-01

    A source term model for steady micro jets was implemented into a non-proprietary Navier-Stokes computer code, OVERFLOW. The source term models the mass flow and momentum created by a steady blowing micro jet. The model is obtained by adding the momentum and mass flow created by the jet to the Navier-Stokes equations. The model was tested by comparing with data from numerical simulations of a single, steady micro jet on a flat plate in two and three dimensions. The source term model predicted the velocity distribution well compared to the two-dimensional plate using a steady mass flow boundary condition, which was used to simulate a steady micro jet. The model was also compared to two three-dimensional flat plate cases using a steady mass flow boundary condition to simulate a steady micro jet. The three-dimensional comparison included a case with a grid generated to capture the circular shape of the jet and a case without a grid generated for the micro jet. The case without the jet grid mimics the application of the source term. The source term model compared well with both of the three-dimensional cases. Comparisons of velocity distribution were made before and after the jet and Mach and vorticity contours were examined. The source term model allows a researcher to quickly investigate different locations of individual or several steady micro jets. The researcher is able to conduct a preliminary investigation with minimal grid generation and computational time.

  16. Integrating sentinel watershed-systems into the monitoring and assessment of Minnesota's (USA) waters quality.

    PubMed

    Magner, J A; Brooks, K N

    2008-03-01

    Section 303(d) of the Clean Water Act requires States and Tribes to list waters not meeting water quality standards. A total maximum daily load must be prepared for waters identified as impaired with respect to water quality standards. Historically, the management of pollution in Minnesota has been focused on point-source regulation. Regulatory effort in Minnesota has improved water quality over the last three decades. Non-point source pollution has become the largest driver of conventional 303(d) listings in the 21st century. Conventional pollutants, i.e., organic, sediment and nutrient imbalances can be identified with poor land use management practices. However, the cause and effect relationship can be elusive because of natural watershed-system influences that vary with scale. Elucidation is complex because the current water quality standards in Minnesota were designed to work best with water quality permits to control point sources of pollution. This paper presents a sentinel watershed-systems approach (SWSA) to the monitoring and assessment of Minnesota waterbodies. SWSA integrates physical, chemical, and biological data over space and time using advanced technologies at selected small watersheds across Minnesota to potentially improve understanding of natural and anthropogenic watershed processes and the management of point and non-point sources of pollution. Long-term, state-of-the-art monitoring and assessment is needed to advance and improve water quality standards. Advanced water quality or ecologically-based standards that integrate physical, chemical, and biological numeric criteria offer the potential to better understand, manage, protect, and restore Minnesota's waterbodies.

  17. Remotely measuring populations during a crisis by overlaying two data sources

    PubMed Central

    Bharti, Nita; Lu, Xin; Bengtsson, Linus; Wetter, Erik; Tatem, Andrew J.

    2015-01-01

    Background Societal instability and crises can cause rapid, large-scale movements. These movements are poorly understood and difficult to measure but strongly impact health. Data on these movements are important for planning response efforts. We retrospectively analyzed movement patterns surrounding a 2010 humanitarian crisis caused by internal political conflict in Côte d'Ivoire using two different methods. Methods We used two remote measures, nighttime lights satellite imagery and anonymized mobile phone call detail records, to assess average population sizes as well as dynamic population changes. These data sources detect movements across different spatial and temporal scales. Results The two data sources showed strong agreement in average measures of population sizes. Because the spatiotemporal resolution of the data sources differed, we were able to obtain measurements on long- and short-term dynamic elements of populations at different points throughout the crisis. Conclusions Using complementary, remote data sources to measure movement shows promise for future use in humanitarian crises. We conclude with challenges of remotely measuring movement and provide suggestions for future research and methodological developments. PMID:25733558

  18. Health Risk Assessments for Alumina Refineries

    PubMed Central

    Coffey, Patrick S.

    2014-01-01

    Objective: To describe contemporary air dispersion modeling and health risk assessment methodologies applied to alumina refineries and to summarize recent results. Methods: Air dispersion models using emission source and meteorological data have been used to assess ground-level concentrations (GLCs) of refinery emissions. Short-term (1-hour and 24-hour average) GLCs and annual average GLCs have been used to assess acute health, chronic health, and incremental carcinogenic risks. Results: The acute hazard index can exceed 1 close to refineries, but it is typically less than 1 at neighboring residential locations. The chronic hazard index is typically substantially less than 1. The incremental carcinogenic risk is typically less than 10−6. Conclusions: The risks of acute health effects are adequately controlled, and the risks of chronic health effects and incremental carcinogenic risks are negligible around referenced alumina refineries. PMID:24806721
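
    For context, acute/chronic hazard indices and incremental carcinogenic risk of the kind quoted above are typically formed by ratioing modeled ground-level concentrations against health guideline values and multiplying annual concentrations by unit risk factors. A minimal illustrative sketch (the pollutants, guideline values and unit risk factors below are placeholders, not values from the referenced assessments):

      # Hazard index and incremental cancer risk from modeled ground-level concentrations (GLCs).
      # All numbers are illustrative placeholders, not regulatory values.
      acute_glc_ugm3   = {"SO2": 120.0, "NO2": 80.0}        # 1-hour average GLCs
      acute_guideline  = {"SO2": 570.0, "NO2": 246.0}       # acute guideline concentrations
      annual_glc_ugm3  = {"benzene": 0.4}                   # annual average GLCs
      unit_risk_per_ugm3 = {"benzene": 6.0e-6}              # inhalation unit risk factors

      acute_hi = sum(acute_glc_ugm3[p] / acute_guideline[p] for p in acute_glc_ugm3)
      cancer_risk = sum(annual_glc_ugm3[p] * unit_risk_per_ugm3[p] for p in annual_glc_ugm3)

      print(f"acute hazard index = {acute_hi:.2f}")                  # HI > 1 flags potential acute concern
      print(f"incremental carcinogenic risk = {cancer_risk:.1e}")    # compared against e.g. 1e-6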

  19. Indices of soil contamination by heavy metals - methodology of calculation for pollution assessment (minireview).

    PubMed

    Weissmannová, Helena Doležalová; Pavlovský, Jiří

    2017-11-07

    This article provides an assessment of heavy metal soil pollution using the calculation of various pollution indices and also summarizes the sources of heavy metal soil pollution. The twenty indices described for the assessment of soil pollution fall into two groups: single indices and total (complex) indices of pollution or contamination, with the relevant classes of pollution. This minireview also provides a classification of pollution indices in terms of the complex assessment of soil quality. In addition, based on a comparison of metal concentrations in selected soil sites around the world and the indices of pollution or contamination used for soils, heavy metal concentrations in contaminated soils varied widely, and the pollution indices confirmed a significant contribution of soil pollution from anthropogenic activities, mainly in urban and industrial areas.
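
    Several of the single and complex indices reviewed in this literature are simple ratios of measured concentrations to geochemical background values. A minimal sketch of three commonly cited ones, the contamination factor, the geoaccumulation index and the pollution load index, is shown below; the concentrations and background values are placeholders, and the formulas follow the widely used Hakanson/Muller definitions rather than any single paper:

      import math

      # Measured topsoil concentrations and geochemical background values (mg/kg); placeholders.
      measured   = {"Pb": 85.0, "Cd": 0.9, "Zn": 210.0}
      background = {"Pb": 20.0, "Cd": 0.3, "Zn": 70.0}

      cf   = {m: measured[m] / background[m] for m in measured}                      # contamination factor
      igeo = {m: math.log2(measured[m] / (1.5 * background[m])) for m in measured}   # geoaccumulation index
      pli  = math.prod(cf.values()) ** (1.0 / len(cf))                               # pollution load index

      print(cf, igeo, round(pli, 2))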

  20. Pain assessment scales in newborns: integrative review

    PubMed Central

    de Melo, Gleicia Martins; Lélis, Ana Luíza Paula de Aguiar; de Moura, Alline Falconieri; Cardoso, Maria Vera Lúcia Moreira Leitão; da Silva, Viviane Martins

    2014-01-01

    OBJECTIVE: To analyze studies on methods used to assess pain in newborns. DATA SOURCES: Integrative review study of articles published from 2001 to 2012, carried out in the following databases: Scopus, PubMed, CINAHL, LILACS and Cochrane. The sample consisted of 13 articles with level of evidence 5. DATA SYNTHESIS: 29 pain assessment scales for newborns, including 13 one-dimensional and 16 multidimensional, that assess acute and prolonged pain in preterm and full-term infants were available in scientific publications. CONCLUSION: Based on the characteristics of the scales, one cannot choose a single one as the most appropriate, as this choice will depend on gestational age, type of painful stimulus and the environment in which the infant is placed. The use of multidimensional or one-dimensional scales is suggested; however, they must be reliable and validated. PMID:25511005

  1. River Export of Plastic from Land to Sea: A Global Modeling Approach

    NASA Astrophysics Data System (ADS)

    Siegfried, Max; Gabbert, Silke; Koelmans, Albert A.; Kroeze, Carolien; Löhr, Ansje; Verburg, Charlotte

    2016-04-01

    Plastic is increasingly considered a serious cause of water pollution. It is a threat to aquatic ecosystems, including rivers, coastal waters and oceans. Rivers transport considerable amounts of plastic from land to sea. The quantities and their main sources, however, are not well known. Assessing the amount of macro- and microplastic transport from river to sea is, therefore, important for understanding the dimension and the patterns of plastic pollution of aquatic ecosystems. In addition, it is crucial for assessing short- and long-term impacts caused by plastic pollution. Here we present a global modelling approach to quantify river export of plastic from land to sea. Our approach accounts for different types of plastic, including both macro- and micro-plastics. Moreover, we distinguish point sources and diffuse sources of plastic in rivers. Our modelling approach is inspired by global nutrient models, which include more than 6000 river basins. In this paper, we will present our modelling approach, as well as first model results for micro-plastic pollution in European rivers. Important sources of micro-plastics include personal care products, laundry, household dust and car tyre wear. We combine information on these sources with information on sewage management, and plastic retention during river transport for the largest European rivers. Our modelling approach may help to better understand and prevent water pollution by plastic, and at the same time serves as 'proof of concept' for future application at the global scale.
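
    The source-oriented accounting described above can be sketched as a simple mass balance per river basin: diffuse and point inputs are attenuated by sewage treatment and by retention during river transport before reaching the river mouth. A minimal illustrative sketch, in which the source categories, removal fractions and loads are hypothetical placeholders rather than model results:

      # Per-basin microplastic export as a simple mass balance (all numbers are placeholders).
      sources_t_per_yr = {            # gross microplastic inputs to the basin, tonnes/year
          "personal_care": 1.2,
          "laundry_fibres": 4.5,
          "household_dust": 2.0,
          "tyre_wear": 9.0,
      }
      sewered_fraction  = 0.80        # share of input routed through sewage systems
      treatment_removal = 0.90        # removal efficiency of treatment plants
      river_retention   = 0.30        # fraction retained in the river network

      def basin_export(sources, f_sew, f_trt, f_ret):
          """Load reaching the river mouth after sewage treatment and in-river retention."""
          gross = sum(sources.values())
          to_river = gross * (1.0 - f_sew) + gross * f_sew * (1.0 - f_trt)
          return to_river * (1.0 - f_ret)

      print(f"export to sea: {basin_export(sources_t_per_yr, sewered_fraction, treatment_removal, river_retention):.2f} t/yr")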

  2. Final Environmental Assessment Addressing Construction of a Fitness Center at Beale Air Force Base, California

    DTIC Science & Technology

    2009-10-01

    adverse impacts on geology and soils would be anticipated due to construction and demolition activities, such as grading, excavation, and...2, during construction and demolition activities would limit adverse impacts on geology and soils. Therefore, no long-term, adverse, direct or...

  3. [Patient safety: Glossary].

    PubMed

    Sabio Paz, Verónica; Panattieri, Néstor D; Cristina Godio, Farmacéutica; Ratto, María E; Arpí, Lucrecia; Dackiewicz, Nora

    2015-10-01

    Patient safety and quality of care have become a challenge for health systems. Health care is an increasingly complex and risky activity, as it represents a combination of human, technological and organizational processes. It is therefore necessary to take effective actions to reduce adverse events and mitigate their impact. This glossary is a local adaptation of key terms and concepts from international bibliographic sources. The aim is to provide a common language for assessing patient safety processes and comparing them.

  4. Building Partner Capacity: DOD Is Meeting Most Targets for Colombias Regional Helicopter Training Center but Should Track Graduates

    DTIC Science & Technology

    2013-07-01

    medium- and long-term results. In this review, GAO assesses (1) U.S. government allocations, obligations, and disbursements for RHTC in fiscal year...officials in Washington, D.C.; Alabama; Virginia; and Colombia. GAO also reviewed DOD documents and funding data. What GAO Recommends GAO is

  5. [Methodological approaches of a social budget of disability].

    PubMed

    Fardeau, M

    1994-01-01

    By gathering data from different sources, it may be possible to estimate the French social budget of disability. In 1990, approximately 126.9 million FF were devoted by the nation to its disabled population. One quarter of the amount is "in kind", for financing training centers, nursing homes for the disabled... The three remaining quarters are composed of "cash benefits" (disability allowances, work accident annuities,...). The approach makes it possible to assess disability in economic terms.

  6. Weathering and carbon fluxes of the Irrawaddy-Salween-Mekong river system

    NASA Astrophysics Data System (ADS)

    Baronas, J. J.; Tipper, E.; Hilton, R. G.; Bickle, M.; Relph, K.; Parsons, D. R.

    2017-12-01

    The Irrawaddy-Salween-Mekong (ISM) rivers, with their source regions draining the eastern Tibetan Plateau, account for a significant portion of the global solute and sediment flux to the ocean, and appear to exhibit some of the highest chemical weathering rates in the world. However, despite their significance, they remain greatly understudied. We will present data from the first part of a recently started multi-year study of these monsoon-controlled river systems. Our aim is to fully deconvolve and quantify the multiple processes and fluxes which play a role in the long-term feedback loop between tectonics, climate, and the critical zone. The long-term goals of the project are to accurately partition the silicate and carbonate weathering rates, acidity sources, and various organic and inorganic carbon fluxes, using a large range of geochemical and isotopic analyses. In addition, we have begun to collect extensive suspended sediment depth profiles to assess changes in sediment chemistry from the Himalayan headwaters to the river mouths, in an attempt to quantify whole-catchment silicate weathering rates over millennial timescales. Finally, bi-weekly multi-annual time-series data are being used to assess the catchment biogeochemical response to the strong hydrological seasonality imposed by the monsoonal climate. Here, we will present some preliminary findings from our dissolved and sediment data from the main stems and major tributaries of the ISM rivers.

  7. Regulatory Technology Development Plan - Sodium Fast Reactor. Mechanistic Source Term - Trial Calculation. Work Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grabaskas, David; Bucknor, Matthew; Jerden, James

    2016-02-01

    The overall objective of the SFR Regulatory Technology Development Plan (RTDP) effort is to identify and address potential impediments to the SFR regulatory licensing process. In FY14, an analysis by Argonne identified the development of an SFR-specific MST methodology as an existing licensing gap with high regulatory importance and a potentially long lead-time to closure. This work was followed by an initial examination of the current state-of-knowledge regarding SFR source term development (ANLART-3), which reported several potential gaps. Among these were the potential inadequacies of current computational tools to properly model and assess the transport and retention of radionuclides during a metal fuel pool-type SFR core damage incident. The objective of the current work is to determine the adequacy of existing computational tools, and the associated knowledge database, for the calculation of an SFR MST. To accomplish this task, a trial MST calculation will be performed using available computational tools to establish their limitations with regard to relevant radionuclide release/retention/transport phenomena. The application of existing modeling tools will provide a definitive test to assess their suitability for an SFR MST calculation, while also identifying potential gaps in the current knowledge base and providing insight into open issues regarding regulatory criteria/requirements. The findings of this analysis will assist in determining future research and development needs.

  8. Fabrication and In Situ Testing of Scalable Nitrate-Selective Electrodes for Distributed Observations

    NASA Astrophysics Data System (ADS)

    Harmon, T. C.; Rat'ko, A.; Dietrich, H.; Park, Y.; Wijsboom, Y. H.; Bendikov, M.

    2008-12-01

    Inorganic nitrogen (nitrate (NO3-) and ammonium (NH4+)) from chemical fertilizer and livestock waste is a major source of pollution in groundwater, surface water and the air. While some sources of these chemicals, such as waste lagoons, are well-defined, their application as fertilizer has the potential to create distributed or non-point source pollution problems. Scalable nitrate sensors (small and inexpensive) would enable us to better assess non-point source pollution processes in agronomic soils, groundwater and rivers subject to non-point source inputs. This work describes the fabrication and testing of inexpensive PVC-membrane-based ion selective electrodes (ISEs) for monitoring nitrate levels in soil water environments. ISE-based sensors have the advantages of being easy to fabricate and use, but suffer several shortcomings, including limited sensitivity, poor precision, and calibration drift. However, modern materials have begun to yield more robust ISE types in laboratory settings. This work emphasizes the in situ behavior of commercial and fabricated sensors in soils subject to irrigation with dairy manure water. Results are presented in the context of deployment techniques (in situ versus soil lysimeters), temperature compensation, and uncertainty analysis. Observed temporal responses of the nitrate sensors exhibited diurnal cycling with elevated nitrate levels at night and depressed levels during the day. Conventional samples collected via lysimeters validated this response. It is concluded that while modern ISEs are not yet ready for long-term, unattended deployment, short-term installations (on the order of 2 to 4 days) are viable and may provide valuable insights into nitrogen dynamics in complex soil systems.
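
    Converting an ISE voltage reading to a nitrate concentration typically relies on a Nernstian calibration against standards, which also gives a handle on the drift and temperature effects mentioned above. A minimal sketch with synthetic calibration points and a simple temperature-scaled slope, assuming near-Nernstian behaviour rather than reproducing the calibration procedure of the cited sensors:

      import numpy as np

      # Synthetic calibration: electrode potential (mV) vs nitrate activity (M).
      cal_conc_M = np.array([1e-4, 1e-3, 1e-2, 1e-1])
      cal_emf_mV = np.array([210.0, 152.0, 95.0, 38.0])     # roughly -57 mV/decade (anion response)

      slope, intercept = np.polyfit(np.log10(cal_conc_M), cal_emf_mV, 1)   # E = intercept + slope*log10(C)

      def emf_to_conc(emf_mV, temp_C=25.0):
          """Invert the calibration; rescale the slope with absolute temperature (Nernstian assumption)."""
          slope_T = slope * (temp_C + 273.15) / 298.15
          return 10 ** ((emf_mV - intercept) / slope_T)

      print(f"fitted slope: {slope:.1f} mV/decade")
      print(f"estimated nitrate at 120 mV: {emf_to_conc(120.0):.2e} M")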

  9. Fate of hydrocarbon pollutants in source and non-source control sustainable drainage systems.

    PubMed

    Roinas, Georgios; Mant, Cath; Williams, John B

    2014-01-01

    Sustainable drainage (SuDs) is an established method for managing runoff from developments, and source control is part of accepted design philosophy. However, there are limited studies into the contribution source control makes to pollutant removal, especially for roads. This study examines organic pollutants, total petroleum hydrocarbons (TPH) and polycyclic aromatic hydrocarbons (PAHs), in paired source and non-source control full-scale SuDs systems. Sites were selected to cover local roads, trunk roads and housing developments, with a range of SuDs, including porous asphalt, swales, detention basins and ponds. Soil and water samples were taken bi-monthly over 12 months to assess pollutant loads. Results show first flush patterns in storm events for solids, but not for TPH. The patterns of removal for specific PAHs were also different, reflecting varying physico-chemical properties. The pollution potential of trunk roads was illustrated by peak runoff TPH concentrations of >17,000 μg/l. Overall there was no significant difference between pollutant loads from source and non-source control systems, but the dynamic nature of runoff means that longer-term data are required. The outcomes of this project will increase understanding of organic pollutant behaviour in SuDs. This will provide design guidance about the most appropriate systems for treating these pollutants.

  10. Milan Army Ammunition Plant remedial investigation report: Volume 1. Final report 89-91

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Okusu, N.; Hall, H.; Orndorff, A.

    1991-12-09

    A Remedial Investigation at the Milan Army Ammunition Plant, TN, was conducted for the US Army Toxic and Hazardous Materials Agency, under the terms of an Interagency Agreement with the State of Tennessee and the US Environmental Protection Agency. The study focused on the CERCLA site and selected RCRA regulated units identified by previous studies as potential sources of contamination. A broad range of chemicals including metals, explosives, and other organic compounds were found in source areas and in groundwater. The results of a risk assessment indicate that unacceptable levels of human health risks potentially exist. Conceptual models of site and unit characteristics were formulated to explain major findings, and areas not contributing to the problem were identified. For many source areas, major unknowns exist regarding hydrology, extent of contamination, and current and future impacts to groundwater quality.

  11. A web-based screening tool for near-port air quality assessments

    PubMed Central

    Isakov, Vlad; Barzyk, Timothy M.; Smith, Elizabeth R.; Arunachalam, Saravanan; Naess, Brian; Venkatram, Akula

    2018-01-01

    The Community model for near-PORT applications (C-PORT) is a screening tool with an intended purpose of calculating differences in annual averaged concentration patterns and relative contributions of various source categories over the spatial domain within about 10 km of the port. C-PORT can inform decision-makers and concerned citizens about local air quality due to mobile source emissions related to commercial port activities. It allows users to visualize and evaluate different planning scenarios, helping them identify the best alternatives for making long-term decisions that protect community health and sustainability. The web-based, easy-to-use interface currently includes data from 21 seaports primarily in the Southeastern U.S., and has a map-based interface based on Google Maps. The tool was developed to visualize and assess changes in air quality due to changes in emissions and/or meteorology in order to analyze development scenarios, and is not intended to support or replace any regulatory models or programs. PMID:29681760

  12. Biomass for energy in the European Union - a review of bioenergy resource assessments

    PubMed Central

    2012-01-01

    This paper reviews recent literature on bioenergy potentials in conjunction with available biomass conversion technologies. The geographical scope is the European Union, which has set a course for long-term development of its energy supply from the current dependence on fossil resources to a dominance of renewable resources. A cornerstone in European energy policies and strategies is biomass and bioenergy. The annual demand for biomass for energy is estimated to increase from the current level of 5.7 EJ to 10.0 EJ in 2020. Assessments of bioenergy potentials vary substantially due to methodological inconsistency and assumptions applied by individual authors. Forest biomass, agricultural residues and energy crops constitute the three major sources of biomass for energy, with the latter probably developing into the most important source over the 21st century. Land use, and changes in it, is a key issue in sustainable bioenergy production, as land availability is ultimately the limiting factor. PMID:22546368

  13. Bioassay selection, experimental design and quality control/assurance for use in effluent assessment and control.

    PubMed

    Johnson, Ian; Hutchings, Matt; Benstead, Rachel; Thain, John; Whitehouse, Paul

    2004-07-01

    In the UK Direct Toxicity Assessment Programme, carried out in 1998-2000, a series of internationally recognised short-term toxicity test methods for algae, invertebrates and fishes, and rapid methods (ECLOX and Microtox) were used extensively. Abbreviated versions of conventional tests (algal growth inhibition tests, Daphnia magna immobilisation test and the oyster embryo-larval development test) were valuable for toxicity screening of effluent discharges and the identification of causes and sources of toxicity. Rapid methods based on chemiluminescence and bioluminescence were not generally useful in this programme, but may have a role where the rapid test has been shown to be an acceptable surrogate for a standardised test method. A range of quality assurance and control measures were identified. Requirements for quality control/assurance are most stringent when deriving data for characterising the toxic hazards of effluents and monitoring compliance against a toxicity reduction target. Lower quality control/assurance requirements can be applied to discharge screening and the identification of causes and sources of toxicity.

  14. A COMPARISON OF PATIENT AND HEALTHCARE PROFESSIONAL VIEWS WHEN ASSESSING QUALITY OF INFORMATION ON PITUITARY ADENOMA AVAILABLE ON THE INTERNET.

    PubMed

    Druce, Irena; Williams, Chantal; Baggoo, Carolyn; Keely, Erin; Malcolm, Janine

    2017-10-01

    Patients are increasingly turning to the internet to seek reliable sources of health information and desire guidance in assessing the quality of information as healthcare becomes progressively more complex. Pituitary adenomas are a rare, diverse group of tumors associated with increased mortality and morbidity whose management requires a multidisciplinary approach. As such, patients with this disorder are often searching for additional sources of healthcare information. We undertook a study to assess the quality of information available on the internet for patients with pituitary adenoma. After exclusion, 42 websites were identified based on a search engine query with various search terms. Each website was assessed in triplicate: once by a health professional, once by a simulated patient, and once by a patient who had a pituitary adenoma and underwent medical and surgical treatment. The assessment tools included a content-specific questionnaire, the DISCERN tool, and the Ensuring Quality Information for Patients tool. The readability of the information was assessed with the Flesch-Kincaid grade level. We found that the overall quality of information on pituitary adenoma on the internet was variable and written at a high grade level. Correlation between the different assessors was poor, indicating that there may be differences in how healthcare professionals and patients view healthcare information. Our findings highlight the importance of assessment of the health information by groups of the intended user to ensure the needs of that population are met. Abbreviation: EQIP = Ensuring Quality Information for Patients.
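
    The Flesch-Kincaid grade level mentioned above is a simple function of average sentence length and average syllables per word. A minimal sketch with a crude syllable heuristic is shown below; the heuristic is only an approximation, and published readability tools use more careful tokenization:

      import re

      def count_syllables(word: str) -> int:
          """Very rough vowel-group syllable count; adequate only for illustration."""
          groups = re.findall(r"[aeiouy]+", word.lower())
          return max(1, len(groups))

      def flesch_kincaid_grade(text: str) -> float:
          sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
          words = re.findall(r"[A-Za-z]+", text)
          syllables = sum(count_syllables(w) for w in words)
          # Standard Flesch-Kincaid grade level formula.
          return 0.39 * len(words) / len(sentences) + 11.8 * syllables / len(words) - 15.59

      sample = "Pituitary adenomas are benign tumours. Management requires a multidisciplinary team."
      print(round(flesch_kincaid_grade(sample), 1))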

  15. The influence of cross-order terms in interface mobilities for structure-borne sound source characterization

    NASA Astrophysics Data System (ADS)

    Bonhoff, H. A.; Petersson, B. A. T.

    2010-08-01

    For the characterization of structure-borne sound sources with multi-point or continuous interfaces, substantial simplifications and physical insight can be obtained by incorporating the concept of interface mobilities. The applicability of interface mobilities, however, relies upon the admissibility of neglecting the so-called cross-order terms. Hence, the objective of the present paper is to clarify the importance and significance of cross-order terms for the characterization of vibrational sources. From previous studies, four conditions have been identified for which the cross-order terms can become more influential. Such are non-circular interface geometries, structures with distinctively differing transfer paths as well as a suppression of the zero-order motion and cases where the contact forces are either in phase or out of phase. In a theoretical study, the former four conditions are investigated regarding the frequency range and magnitude of a possible strengthening of the cross-order terms. For an experimental analysis, two source-receiver installations are selected, suitably designed to obtain strong cross-order terms. The transmitted power and the source descriptors are predicted by the approximations of the interface mobility approach and compared with the complete calculations. Neglecting the cross-order terms can result in large misinterpretations at certain frequencies. On average, however, the cross-order terms are found to be insignificant and can be neglected with good approximation. The general applicability of interface mobilities for structure-borne sound source characterization and the description of the transmission process thereby is confirmed.
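
    In the interface-mobility description, the contact force and velocity distributions along the interface are expanded in spatial Fourier orders, and the transmitted power couples those orders through the interface mobilities. A schematic form is sketched below in generic notation assumed here for illustration rather than quoted from the paper:

      \begin{align}
        v_p &= \sum_{q} Y_{pq}\,F_q, &
        Q &= \tfrac{1}{2}\sum_{p}\sum_{q} F_p^{\ast}\, Y_{pq}\, F_q
           \;\approx\; \tfrac{1}{2}\sum_{p} F_p^{\ast}\, Y_{pp}\, F_p,
      \end{align}

    where F_q and v_p are the complex amplitudes of the interface orders, Y_{pq} are the interface mobilities, and the approximation on the right retains only the equal-order terms (p = q), i.e. it neglects the cross-order terms whose significance is examined in the paper.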

  16. Risk-Based Contaminated Land Investigation and Assessment

    NASA Astrophysics Data System (ADS)

    Davis, Donald R.

    With increasing frequency, problems of environmental contamination are being analyzed from a risk perspective. Risk-Based Contaminated Land Investigation and Assessment is written for those who wish to present the results of their examination of contaminated land in terms of risk. The opening chapters introduce the concepts of risk analysis for contaminated land. Risk management and the risk assessment process are based on a source-pathway-target framework. Readers are warned against an “over-reliance on the identification of contaminants rather than the potential pathways by which targets may be exposed to these hazards.” In the risk management framework presented in this book, risk evaluation and resultant decision making are seen as part of both the risk assessment and risk reduction process. The sharp separation of risk assessment from risk management as seen in the National Academy of Sciences' (NAS) risk assessment paradigm is not advocated; perhaps this is because the NAS' concern was regulatory decision while the book's concern is the assessment of a specific site.

  17. Palatal development of preterm and low birthweight infants compared to term infants – What do we know? Part 1: The palate of the term newborn

    PubMed Central

    Hohoff, Ariane; Rabe, Heike; Ehmer, Ulrike; Harms, Erik

    2005-01-01

    Background The evidence on prematurity as 'a priori' a risk for palatal disturbances that increase the need for orthodontic or orthognathic treatment is still weak. Further well-designed clinical studies are needed. The objective of this review is to provide a fundamental analysis of methodologies, confounding factors, and outcomes of studies on palatal development. One focus of this review is the analysis of studies on the palate of the term newborn, since knowing what is 'normal' is a precondition of being able to assess abnormalities. Methods A search profile based on Cochrane search strategies applied to 10 medical databases was used to identify existing studies. Articles, mainly those published before 1960, were identified from hand searches in textbooks, encyclopedias, reference lists and bibliographies. Sources in English, German, and French of more than a century were included. Data for term infants were recalculated if particular information about weight, length, or maturity was given. The extracted values, especially those from non-English paper sources, were provided unfiltered for comparison. Results The search strategy yielded 182 articles, of which 155 articles remained for final analysis. Morphology of the term newborn's palate was of great interest in the first half of the last century. Two general methodologies were used to assess palatal morphology: visual and metrical descriptions. Most of the studies on term infants suffer from lack of reliability tests. The groove system was recognized as the distinctive feature of the infant palate. The shape of the palate of the term infant may vary considerably, both visually and metrically. Gender, race, mode of delivery, and nasal deformities were identified as causes contributing to altered palatal morphology. Until today, anatomical features of the newborn's palate are subject to a non-uniform nomenclature. Conclusion Today's knowledge of a newborn's 'normal' palatal morphology is based on non-standardized and limited methodologies for measuring a three-dimensional shape. This shortcoming increases bias and is the reason for contradictory research results, especially if pathologic conditions like syndromes or prematurity are involved. Adequate measurement techniques are needed and the 'normal palatal morphology' should be defined prior to new clinical studies on palatal development. PMID:16270908

  18. Toward uniform implementation of parametric map Digital Imaging and Communication in Medicine standard in multisite quantitative diffusion imaging studies.

    PubMed

    Malyarenko, Dariya; Fedorov, Andriy; Bell, Laura; Prah, Melissa; Hectors, Stefanie; Arlinghaus, Lori; Muzi, Mark; Solaiyappan, Meiyappan; Jacobs, Michael; Fung, Maggie; Shukla-Dave, Amita; McManus, Kevin; Boss, Michael; Taouli, Bachir; Yankeelov, Thomas E; Quarles, Christopher Chad; Schmainda, Kathleen; Chenevert, Thomas L; Newitt, David C

    2018-01-01

    This paper reports on results of a multisite collaborative project launched by the MRI subgroup of Quantitative Imaging Network to assess current capability and provide future guidelines for generating a standard parametric diffusion map Digital Imaging and Communication in Medicine (DICOM) in clinical trials that utilize quantitative diffusion-weighted imaging (DWI). Participating sites used a multivendor DWI DICOM dataset of a single phantom to generate parametric maps (PMs) of the apparent diffusion coefficient (ADC) based on two models. The results were evaluated for numerical consistency among models and true phantom ADC values, as well as for consistency of metadata with attributes required by the DICOM standards. This analysis identified missing metadata descriptive of the sources for detected numerical discrepancies among ADC models. Instead of the DICOM PM object, all sites stored ADC maps as DICOM MR objects, generally lacking designated attributes and coded terms for quantitative DWI modeling. Source-image reference, model parameters, ADC units and scale, deemed important for numerical consistency, were either missing or stored using nonstandard conventions. Guided by the identified limitations, the DICOM PM standard has been amended to include coded terms for the relevant diffusion models. Open-source software has been developed to support conversion of site-specific formats into the standard representation.
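
    For reference, the apparent diffusion coefficient map exchanged by the sites is, in the simplest mono-exponential model, obtained voxel-wise from two diffusion weightings. A minimal sketch with synthetic arrays and b-values chosen for illustration, not the phantom protocol of the study:

      import numpy as np

      # Synthetic DWI magnitude images at two b-values (s/mm^2); placeholders for real DICOM frames.
      b_low, b_high = 0.0, 800.0
      S_low  = np.full((4, 4), 1000.0)
      S_high = S_low * np.exp(-b_high * 1.1e-3)      # ground-truth ADC of 1.1e-3 mm^2/s

      # Mono-exponential model: S(b) = S0 * exp(-b * ADC)  =>  ADC = ln(S_low/S_high) / (b_high - b_low)
      adc = np.log(S_low / S_high) / (b_high - b_low)

      print(adc.mean())      # ~1.1e-3 mm^2/s; scale and units metadata must accompany the stored map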

  19. Integrated watershed- and farm-scale modeling framework for targeting critical source areas while maintaining farm economic viability.

    PubMed

    Ghebremichael, Lula T; Veith, Tamie L; Hamlett, James M

    2013-01-15

    Quantitative risk assessments of pollution and data related to the effectiveness of mitigating best management practices (BMPs) are important aspects of nonpoint source pollution control efforts, particularly those driven by specific water quality objectives and by measurable improvement goals, such as the total maximum daily load (TMDL) requirements. Targeting critical source areas (CSAs) that generate disproportionately high pollutant loads within a watershed is a crucial step in successfully controlling nonpoint source pollution. The importance of watershed simulation models in assisting with the quantitative assessments of CSAs of pollution (relative to their magnitudes and extents) and of the effectiveness of associated BMPs has been well recognized. However, due to the distinct disconnect between the hydrological scale at which these models conduct their evaluation and the farm scale at which feasible BMPs are actually selected and implemented, and due to the difficulty and uncertainty involved in transferring watershed model data to farm fields, there are limited practical applications of these tools in current nonpoint source pollution control efforts by conservation specialists for delineating CSAs and planning targeting measures. Few approaches have been developed that can assess the impacts of CSA-targeted BMPs on farm productivity and profitability together with the water quality improvements expected from applying these measures. This study developed a modeling framework that integrates farm economics and environmental aspects (such as identification and mitigation of CSAs) through joint use of watershed- and farm-scale models in a closed feedback loop. The integration of models in a closed feedback loop provides a way for environmental changes to be evaluated with regard to their impact on the practical aspects of farm management and economics, adjusted or reformulated as necessary, and re-evaluated with respect to the effectiveness of environmental mitigation at the farm and watershed levels. This paper also outlines the steps needed to extract important CSA-related information from a watershed model to help inform targeting decisions at the farm scale. The modeling framework is demonstrated with two unique case studies in the northeastern United States (New York and Vermont), with supporting data from numerous published, location-specific studies at both the watershed and farm scales. Using the integrated modeling framework, it becomes possible to compare the costs (in terms of changes required in farm system components or financial compensation for retiring crop lands) and benefits (in terms of measurable water quality improvement goals) of implementing targeted BMPs. This multi-scale modeling approach can be used in the multi-objective task of mitigating CSAs of pollution to meet water quality goals while maintaining farm-level economic viability. Copyright © 2012 Elsevier Ltd. All rights reserved.

  20. Microplastics in Freshwater River Sediments in Shanghai, China: A Case Study of Risk Assessment in Mega Cities

    NASA Astrophysics Data System (ADS)

    Peng, G.; Xu, P.

    2017-12-01

    Microplastics are plastics that measure less than 5 mm and have attracted rapidly growing interest in recent years. Microplastics are widely distributed in water, sediments, and biota. Most distribution studies focus on the marine environment, yet methods to conduct risk assessment are limited. The widespread presence of microplastics has raised alarm for the well-being of marine living resources because of their demonstrated negative ecological effects. To understand the distribution of microplastics in urban rivers and the sources of marine microplastics, we investigated river sediments in Shanghai, the biggest city in China. Seven sampling sites covered most of the central city districts, including one site on a tidal flat. Density separation, microscopic inspection and identification were conducted to analyze microplastic abundance, shape and color. Pellets were found to be the most prevalent shape, followed by fibers and fragments. White microplastics were the most common type in terms of color, and white foamed microplastic pellets were widely distributed in urban river sediments. Microplastic abundance in rivers was one to two orders of magnitude higher than on the tidal flat. The significant difference between river and tidal flat samples leads to the conclusion that coastal rivers may be a source of marine microplastics; therefore, in situ data and sound estimation should be considered by policy-makers. Seven types of microplastics were identified by μ-FT-IR analysis, indicating secondary sources. A comparison between two types of μ-FT-IR instruments is summarized. A framework for environmental risk assessment of microplastics in sediments is proposed, and indicators and ranks were selected for the assessment of microplastics in sediments. It is recommended to select the index, integrate statistical data, draw extensively on expert opinion, and construct a comprehensive evaluation method and ecological risk assessment system for the Chinese context.

  1. Physiotherapy rehabilitation for whiplash associated disorder II: a systematic review and meta-analysis of randomised controlled trials

    PubMed Central

    Wright, Chris; Heneghan, Nicola; Eveleigh, Gillian; Calvert, Melanie; Freemantle, Nick

    2011-01-01

    Objective To evaluate effectiveness of physiotherapy management in patients experiencing whiplash associated disorder II, on clinically relevant outcomes in the short and longer term. Design Systematic review and meta-analysis. Two reviewers independently searched information sources, assessed studies for inclusion, evaluated risk of bias and extracted data. A third reviewer mediated disagreement. Assessment of risk of bias was tabulated across included trials. Quantitative synthesis was conducted on comparable outcomes across trials with similar interventions. Meta-analyses compared effect sizes, with random effects as primary analyses. Data sources Predefined terms were employed to search electronic databases. Additional studies were identified from key journals, reference lists, authors and experts. Eligibility criteria for selecting studies Randomised controlled trials (RCTs) published in English before 31 December 2010 evaluating physiotherapy management of patients (>16 years), experiencing whiplash associated disorder II. Any physiotherapy intervention was included, when compared with other types of management, placebo/sham, or no intervention. Measurements reported on ≥1 outcome from the domains within the international classification of function, disability and health, were included. Results 21 RCTs (2126 participants, 9 countries) were included. Interventions were categorised as active physiotherapy or a specific physiotherapy intervention. 20/21 trials were evaluated as high risk of bias and one as unclear. 1395 participants were incorporated in the meta-analyses on 12 trials. In evaluating short term outcome in the acute/sub-acute stage, there was some evidence that active physiotherapy intervention reduces pain and improves range of movement, and that a specific physiotherapy intervention may reduce pain. However, moderate/considerable heterogeneity suggested that treatments may differ in nature or effect in different trial patients. Differences between participants, interventions and trial designs limited potential meta-analyses. Conclusions Inconclusive evidence exists for the effectiveness of physiotherapy management for whiplash associated disorder II. There is potential benefit for improving range of movement and pain short term through active physiotherapy, and for improving pain through a specific physiotherapy intervention. PMID:22102642

  2. Quality of Health Information on the Internet for Urolithiasis on the Google Search Engine.

    PubMed

    Chang, Dwayne T S; Abouassaly, Robert; Lawrentschuk, Nathan

    2016-01-01

    Purpose. To compare the quality of health information on the Internet for keywords related to urolithiasis, to assess for differences in information quality across four main Western languages, and to compare the sources of sponsorship of these websites. Methods. Health On the Net (HON) Foundation principles were utilised to determine quality information. Fifteen keywords related to urolithiasis were searched on the Google search engine. The first 150 websites were assessed against the HON principles and the source of sponsorship determined. Results. A total of 8986 websites were analysed. The proportion of HON-accredited websites for individual search terms ranged between 2.5% and 12.0%. The first 50 websites were more likely to be HON-positive compared to websites 51-100 and 101-150. French websites were more likely, and German websites less likely, to be HON-positive than English websites. There was no statistically significant difference between the rates of HON-positive English and Spanish websites. The three main website sponsors were government/educational sources (40.2%), followed by commercial (29.9%) and physician/surgeon sources (18.6%). Conclusions. Health information on most urolithiasis websites was not validated. Nearly one-third of the websites in this study had commercial sponsorship. Doctors should recognise the need for more reliable health websites for their patients.

  3. Terms used by nurses to describe patient problems: can SNOMED III represent nursing concepts in the patient record?

    PubMed Central

    Henry, S B; Holzemer, W L; Reilly, C A; Campbell, K E

    1994-01-01

    OBJECTIVE: To analyze the terms used by nurses in a variety of data sources and to test the feasibility of using SNOMED III to represent nursing terms. DESIGN: Prospective research design with manual matching of terms to the SNOMED III vocabulary. MEASUREMENTS: The terms used by nurses to describe patient problems during 485 episodes of care for 201 patients hospitalized for Pneumocystis carinii pneumonia were identified. Problems from four data sources (nurse interview, intershift report, nursing care plan, and nurse progress note/flowsheet) were classified based on the substantive area of the problem and on the terminology used to describe the problem. A test subset of the 25 most frequently used terms from the two written data sources (nursing care plan and nurse progress note/flowsheet) were manually matched to SNOMED III terms to test the feasibility of using that existing vocabulary to represent nursing terms. RESULTS: Nurses most frequently described patient problems as signs/symptoms in the verbal nurse interview and intershift report. In the written data sources, problems were recorded as North American Nursing Diagnosis Association (NANDA) terms and signs/symptoms with similar frequencies. Of the nursing terms in the test subset, 69% were represented using one or more SNOMED III terms. PMID:7719788

  4. Assessing bisphenol A (BPA) exposure risk from long-term dietary intakes in Taiwan.

    PubMed

    Chen, Wei-Yu; Shen, Yi-Pei; Chen, Szu-Chieh

    2016-02-01

    Dietary intake is the major bisphenol A (BPA) exposure route in humans, and is a cause of BPA-related adverse effects. The large-scale exposure risk of humans to BPA through dietary sources in Taiwan is less well studied. The aim of this study was to assess the average daily dose (ADD) and hazard quotient (HQ) of BPA exposure from long-term dietary intake of BPA, as well as BPA concentrations, in different age-sex groups in Taiwan. We reanalyzed the BPA concentrations of regular daily food sources (rice, poultry, livestock, seafood, protein, fruits, and vegetables) and used a national dietary survey to estimate the contribution of variance to ADDs and potential human health effects for different age-sex groups. The daily consumption of chicken, pork/beef, and seafood was estimated to be 33.77 (male)/22.65 (female), 91.70 (M)/66.35 (F), and 54.15 (M)/40.78 (F) g/day, respectively. The highest BPA ADD was found in the 6-9 years age group (95% CI=0.0006-0.0027 mg/kg-bw/day), whereas the lowest BPA ADD was in the ≥65 years age group (0.0002-0.0020 mg/kg-bw/day). Based on the latest EFSA guidelines (0.004 mg/kg-bw/day), the 97.5th percentile HQs of BPA intake in the different age-sex groups in Taiwan indicated no risk from dietary intake. However, a combination of multiple exposure routes and long-term exposure in specific populations may be of concern in the future. Copyright © 2015 Elsevier B.V. All rights reserved.
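
    The ADD and HQ reported above follow the usual intake-rate construction: concentration times consumption summed over food groups, normalized by body weight, and compared against the tolerable daily intake. A minimal sketch using the EFSA TDI quoted in the abstract and otherwise placeholder concentrations, intake rates and body weight:

      # Average daily dose (ADD) and hazard quotient (HQ) for dietary BPA exposure.
      # Concentrations, intake rates and body weight are placeholders, not the study's data.
      foods = {
          #            BPA conc (mg/kg food), intake (g/day)
          "chicken":   (0.010, 33.77),
          "pork_beef": (0.008, 91.70),
          "seafood":   (0.015, 54.15),
      }
      body_weight_kg = 65.0
      tdi_mg_per_kg_day = 0.004            # EFSA temporary TDI cited in the abstract

      add = sum(conc * intake_g / 1000.0 for conc, intake_g in foods.values()) / body_weight_kg
      hq = add / tdi_mg_per_kg_day

      print(f"ADD = {add:.5f} mg/kg-bw/day, HQ = {hq:.2f}")   # HQ < 1 suggests no appreciable risk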

  5. A Semi-implicit Treatment of Porous Media in Steady-State CFD.

    PubMed

    Domaingo, Andreas; Langmayr, Daniel; Somogyi, Bence; Almbauer, Raimund

    There are many situations in computational fluid dynamics which require the definition of source terms in the Navier-Stokes equations. These source terms not only allow modeling of the physics of interest but also have a strong impact on the reliability, stability, and convergence of the numerics involved. Therefore, sophisticated numerical approaches exist for the description of such source terms. In this paper, we focus on the source terms present in the Navier-Stokes or Euler equations due to porous media, in particular the Darcy-Forchheimer equation. We introduce a method for the numerical treatment of the source term which is independent of the spatial discretization and based on linearization. In this description, the source term is treated in a fully implicit way whereas the other flow variables can be computed in an implicit or explicit manner. This leads to a more robust description in comparison with a fully explicit approach. The method is well suited to be combined with coarse-grid-CFD on Cartesian grids, which makes it especially favorable for accelerated solution of coupled 1D-3D problems. To demonstrate the applicability and robustness of the proposed method, a proof-of-concept example in 1D, as well as more complex examples in 2D and 3D, is presented.
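
    As a point of reference for the linearization idea, the Darcy-Forchheimer momentum sink and a Patankar-style implicit treatment can be written as below; the notation is generic and assumed for illustration, and the paper's exact formulation may differ:

      \begin{align}
        S(u) &= -\frac{\mu}{K}\,u \;-\; \frac{\rho\,C_F}{\sqrt{K}}\,|u|\,u, \\
        S(u^{n+1}) &\approx S_C + S_P\,u^{n+1}, \qquad
        S_P = \left.\frac{\partial S}{\partial u}\right|_{u^{n}}
            = -\frac{\mu}{K} - \frac{2\rho\,C_F}{\sqrt{K}}\,|u^{n}|, \qquad
        S_C = S(u^{n}) - S_P\,u^{n},
      \end{align}

    where K is the permeability and C_F the Forchheimer coefficient. Keeping S_P non-positive on the implicit side strengthens the diagonal of the discrete system, which is what makes the semi-implicit treatment more robust than adding S explicitly.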

  6. Computations of steady-state and transient premixed turbulent flames using pdf methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hulek, T.; Lindstedt, R.P.

    1996-03-01

    Premixed propagating turbulent flames are modeled using a one-point, single time, joint velocity-composition probability density function (pdf) closure. The pdf evolution equation is solved using a Monte Carlo method. The unclosed terms in the pdf equation are modeled using a modified version of the binomial Langevin model for scalar mixing of Valino and Dopazo, and the Haworth and Pope (HP) and Lagrangian Speziale-Sarkar-Gatski (LSSG) models for the viscous dissipation of velocity and the fluctuating pressure gradient. The source terms for the presumed one-step chemical reaction are extracted from the rate of fuel consumption in laminar premixed hydrocarbon flames, computed using a detailed chemical kinetic mechanism. Steady-state and transient solutions are obtained for planar turbulent methane-air and propane-air flames. The transient solution method features a coupling with a Finite Volume (FV) code to obtain the mean pressure field. The results are compared with the burning velocity measurements of Abdel-Gayed et al. and with velocity measurements obtained in freely propagating propane-air flames by Videto and Santavicca. The effects of different upstream turbulence fields, chemical source terms (different fuels and strained/unstrained laminar flames) and the influence of the velocity statistics models (HP and LSSG) are assessed.

  7. Thermal maturity of type II kerogen from the New Albany Shale assessed by 13C CP/MAS NMR

    USGS Publications Warehouse

    Werner-Zwanziger, U.; Lis, G.; Mastalerz, Maria; Schimmelmann, A.

    2005-01-01

    Thermal maturity of oil and gas source rocks is typically quantified in terms of vitrinite reflectance, which is based on optical properties of terrestrial woody remains. This study evaluates 13C CP/MAS NMR parameters in kerogen (i.e., the insoluble fraction of organic matter in sediments and sedimentary rocks) as proxies for thermal maturity in marine-derived source rocks where terrestrially derived vitrinite is often absent or sparse. In a suite of samples from the New Albany Shale (Middle Devonian to the Early Mississippian, Illinois Basin) the abundance of aromatic carbon in kerogen determined by 13C CP/MAS NMR correlates linearly with vitrinite reflectance. © 2004 Elsevier Inc. All rights reserved.

  8. Healthcare Databases in Thailand and Japan: Potential Sources for Health Technology Assessment Research.

    PubMed

    Saokaew, Surasak; Sugimoto, Takashi; Kamae, Isao; Pratoomsoot, Chayanin; Chaiyakunapruk, Nathorn

    2015-01-01

    Health technology assessment (HTA) has been continuously used for value-based healthcare decisions over the last decade. Healthcare databases represent an important source of information for HTA, which has seen a surge in use in Western countries. Although HTA agencies have been established in the Asia-Pacific region, application and understanding of healthcare databases for HTA is rather limited. Thus, we reviewed existing databases to assess their potential for HTA in Thailand, where HTA has been used officially, and Japan, where HTA is going to be officially introduced. Existing healthcare databases in Thailand and Japan were compiled and reviewed. Database characteristics, e.g. name of database, host, scope/objective, time/sample size, design, data collection method, population/sample, and variables, were described. Databases were assessed for their potential HTA use in terms of safety/efficacy/effectiveness, social/ethical, organization/professional, economic, and epidemiological domains. The request route for each database was also provided. Forty databases (20 from Thailand and 20 from Japan) were included. These comprised national censuses, surveys, registries, administrative data, and claims databases. All databases could potentially be used for epidemiological studies. In addition, data on mortality, morbidity, disability, adverse events, quality of life, service/technology utilization, length of stay, and economics were also found in some databases. However, access to patient-level data was limited since information about the databases was not available from public sources. Our findings show that existing databases provide valuable information for HTA research, with limitations on accessibility. Mutual dialogue on healthcare database development and usage for HTA across the Asia-Pacific region is needed.

  9. Groundwater quality in Ghaziabad district, Uttar Pradesh, India: Multivariate and health risk assessment.

    PubMed

    Chabukdhara, Mayuri; Gupta, Sanjay Kumar; Kotecha, Yatharth; Nema, Arvind K

    2017-07-01

    This study aimed to assess the quality of groundwater and the potential health risk due to ingestion of heavy metals in the peri-urban and urban-industrial clusters of Ghaziabad district, Uttar Pradesh, India. Furthermore, the study aimed to evaluate heavy metal sources and their pollution level using multivariate analysis and fuzzy comprehensive assessment (FCA), respectively. Multivariate analysis using principal component analysis (PCA) showed a mixed origin for Pb, Cd, Zn, Fe, and Ni, a natural source for Cu and Mn, and an anthropogenic source for Cr. Among all the metals, Pb, Cd, Fe, and Ni were above the safe limits of the Bureau of Indian Standards (BIS) and the World Health Organization (WHO), except Ni. Health risk in terms of the hazard quotient (HQ) showed that the HQ values for children were higher than the safe level (HQ = 1) for Pb (2.4) and Cd (2.1) in the pre-monsoon season, while in the post-monsoon season only Pb (HQ = 1.23) exceeded it. The health risks of heavy metals for adults were well within safe limits. The findings of this study indicate potential health risks to children due to chronic exposure to contaminated groundwater in the region. Based on FCA, groundwater pollution could be categorized as quite high in the peri-urban region, and absolutely high in the urban region of Ghaziabad district. This study showed that different approaches are required for the integrated assessment of groundwater pollution, and provides a scientific basis for strategic future planning and comprehensive management. Copyright © 2017 Elsevier Ltd. All rights reserved.

  10. Overview of the U.S. Nuclear Regulatory Commission collaborative research program to assess tsunami hazard for nuclear power plants on the Atlantic and Gulf Coasts

    USGS Publications Warehouse

    Kammerer, A.M.; ten Brink, Uri S.; Titov, V.V.

    2017-01-01

    In response to the 2004 Indian Ocean Tsunami, the United States Nuclear Regulatory Commission (US NRC) initiated a long-term research program to improve understanding of tsunami hazard levels for nuclear facilities in the United States. For this effort, the US NRC organized a collaborative research program with the United States Geological Survey (USGS) and the National Oceanic and Atmospheric Administration (NOAA) with a goal of assessing tsunami hazard on the Atlantic and Gulf Coasts of the United States. Necessarily, the US NRC research program includes both seismic- and landslide-based tsunamigenic sources in both the near and the far fields. The inclusion of tsunamigenic landslides, an important category of sources that impact tsunami hazard levels for the Atlantic and Gulf Coasts, is a key difference between this program and most other tsunami hazard assessment programs. The initial phase of this work consisted of collection, interpretation, and analysis of available offshore data, with significant effort focused on characterizing offshore near-field landslides and analyzing their tsunamigenic potential and properties. In the next phase of research, additional field investigations will be conducted in key locations of interest and additional analysis will be undertaken. Simultaneously, the MOST tsunami generation and propagation model used by NOAA will first be enhanced to include landslide-based initiation mechanisms and then will be used to investigate the impact of the tsunamigenic sources identified and characterized by the USGS. The potential for probabilistic tsunami hazard assessment will also be explored in the final phases of the program.

  11. Big Data Usage Patterns in the Health Care Domain: A Use Case Driven Approach Applied to the Assessment of Vaccination Benefits and Risks

    PubMed Central

    Liyanage, H.; Liaw, S-T.; Kuziemsky, C.; Mold, F.; Krause, P.; Fleming, D.; Jones, S.

    2014-01-01

    Summary Background Generally benefits and risks of vaccines can be determined from studies carried out as part of regulatory compliance, followed by surveillance of routine data; however there are some rarer and more long term events that require new methods. Big data generated by increasingly affordable personalised computing, and from pervasive computing devices is rapidly growing and low cost, high volume, cloud computing makes the processing of these data inexpensive. Objective To describe how big data and related analytical methods might be applied to assess the benefits and risks of vaccines. Method: We reviewed the literature on the use of big data to improve health, applied to generic vaccine use cases, that illustrate benefits and risks of vaccination. We defined a use case as the interaction between a user and an information system to achieve a goal. We used flu vaccination and pre-school childhood immunisation as exemplars. Results We reviewed three big data use cases relevant to assessing vaccine benefits and risks: (i) Big data processing using crowd-sourcing, distributed big data processing, and predictive analytics, (ii) Data integration from heterogeneous big data sources, e.g. the increasing range of devices in the “internet of things”, and (iii) Real-time monitoring for the direct monitoring of epidemics as well as vaccine effects via social media and other data sources. Conclusions Big data raises new ethical dilemmas, though its analysis methods can bring complementary real-time capabilities for monitoring epidemics and assessing vaccine benefit-risk balance. PMID:25123718

  12. Healthcare Databases in Thailand and Japan: Potential Sources for Health Technology Assessment Research

    PubMed Central

    Saokaew, Surasak; Sugimoto, Takashi; Kamae, Isao; Pratoomsoot, Chayanin; Chaiyakunapruk, Nathorn

    2015-01-01

    Background Health technology assessment (HTA) has been continuously used for value-based healthcare decisions over the last decade. Healthcare databases represent an important source of information for HTA, which has seen a surge in use in Western countries. Although HTA agencies have been established in the Asia-Pacific region, application and understanding of healthcare databases for HTA is rather limited. Thus, we reviewed existing databases to assess their potential for HTA in Thailand, where HTA has been used officially, and Japan, where HTA is going to be officially introduced. Method Existing healthcare databases in Thailand and Japan were compiled and reviewed. Database characteristics, e.g. name of database, host, scope/objective, time/sample size, design, data collection method, population/sample, and variables, were described. Databases were assessed for their potential HTA use in terms of safety/efficacy/effectiveness, social/ethical, organization/professional, economic, and epidemiological domains. The request route for each database was also provided. Results Forty databases (20 from Thailand and 20 from Japan) were included. These comprised national censuses, surveys, registries, administrative data, and claims databases. All databases could potentially be used for epidemiological studies. In addition, data on mortality, morbidity, disability, adverse events, quality of life, service/technology utilization, length of stay, and economics were also found in some databases. However, access to patient-level data was limited since information about the databases was not available from public sources. Conclusion Our findings show that existing databases provide valuable information for HTA research, with limitations on accessibility. Mutual dialogue on healthcare database development and usage for HTA across the Asia-Pacific region is needed. PMID:26560127

  13. Assessment of short-term PM2.5-related mortality due to different emission sources in the Yangtze River Delta, China

    NASA Astrophysics Data System (ADS)

    Wang, Jiandong; Wang, Shuxiao; Voorhees, A. Scott; Zhao, Bin; Jang, Carey; Jiang, Jingkun; Fu, Joshua S.; Ding, Dian; Zhu, Yun; Hao, Jiming

    2015-12-01

    Air pollution is a major environmental risk to health. In this study, short-term premature mortality due to particulate matter equal to or less than 2.5 μm in aerodynamic diameter (PM2.5) in the Yangtze River Delta (YRD) is estimated using PC-based human health benefits software. The economic loss is assessed using the willingness to pay (WTP) method. The contributions of each region, sector and gaseous precursor are also determined by employing the brute-force method. The results show that, in the YRD in 2010, the short-term premature deaths caused by PM2.5 are estimated to be 13,162 (95% confidence interval (CI): 10,761-15,554), while the economic loss is 22.1 (95% CI: 18.1-26.1) billion Chinese Yuan. The industrial and residential sectors contributed the most, accounting for more than 50% of the total economic loss. Emissions of primary PM2.5 and NH3 are major contributors to the health-related loss in winter, while the contribution of gaseous precursors such as SO2 and NOx is higher than that of primary PM2.5 in summer.
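
    A minimal Python sketch, under stated assumptions, of the log-linear concentration-response arithmetic that PC-based health-benefit tools of this kind typically apply: excess deaths from a PM2.5 increment, monetised with a willingness-to-pay unit value. The coefficient, baseline mortality rate, population and unit value below are illustrative placeholders, not figures from the study.

        import math

        def excess_deaths(delta_pm25, beta, baseline_rate, population):
            """Short-term excess deaths attributable to a PM2.5 increment (ug/m3)."""
            return baseline_rate * population * (1.0 - math.exp(-beta * delta_pm25))

        def economic_loss(deaths, value_per_case):
            """Monetised loss using a willingness-to-pay value per premature death."""
            return deaths * value_per_case

        if __name__ == "__main__":
            # all inputs are illustrative assumptions, not values from the paper
            deaths = excess_deaths(delta_pm25=35.0, beta=0.0004,
                                   baseline_rate=0.007, population=1.5e8)
            print(round(deaths), round(economic_loss(deaths, value_per_case=1.7e6)))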

  14. Detection and Estimation of 2-D Distributions of Greenhouse Gas Source Concentrations and Emissions over Complex Urban Environments and Industrial Sites

    NASA Astrophysics Data System (ADS)

    Zaccheo, T. S.; Pernini, T.; Dobler, J. T.; Blume, N.; Braun, M.

    2017-12-01

    This work highlights the use of the greenhouse-gas laser imaging tomography experiment (GreenLITETM) data in conjunction with a sparse tomography approach to identify and quantify both urban and industrial sources of CO2 and CH4. The GreenLITETM system provides a user-defined set of time-sequenced intersecting chords or integrated column measurements at a fixed height through a quasi-horizontal plane of interest. This plane, with unobstructed views along the lines of sight, may range from complex industrial facilities to a small city scale or urban sector. The continuous time phased absorption measurements are converted to column concentrations and combined with a plume based model to estimate the 2-D distribution of gas concentration over extended areas ranging from 0.04-25 km2. Finally, these 2-D maps of concentration are combined with ancillary meteorological and atmospheric data to identify potential emission sources and provide first order estimates of their associated fluxes. In this presentation, we will provide a brief overview of the systems and results from both controlled release experiments and a long-term system deployment in Paris, FR. These results provide a quantitative assessment of the system's ability to detect and estimate CO2 and CH4 sources, and demonstrate its ability to perform long-term autonomous monitoring and quantification of either persistent or sporadic emissions that may have both health and safety as well as environmental impacts.

  15. A structured approach to recording AIDS-defining illnesses in Kenya: A SNOMED CT based solution

    PubMed Central

    Oluoch, Tom; de Keizer, Nicolette; Langat, Patrick; Alaska, Irene; Ochieng, Kenneth; Okeyo, Nicky; Kwaro, Daniel; Cornet, Ronald

    2016-01-01

    Introduction Several studies conducted in sub-Saharan Africa (SSA) have shown that routine clinical data in HIV clinics often have errors. Lack of structured and coded documentation of diagnoses of AIDS-defining illnesses (ADIs) can compromise data quality and clinical care decisions. Methods We used a structured framework to derive a reference set of concepts and terms used to describe ADIs. The four sources used were: (i) the CDC/Accenture list of opportunistic infections, (ii) SNOMED Clinical Terms (SNOMED CT), (iii) Focus Group Discussion (FGD) among clinicians and nurses attending to patients at a referral provincial hospital in western Kenya, and (iv) chart abstraction from the Maternal Child Health (MCH) and HIV clinics at the same hospital. Using the January 2014 release of SNOMED CT, concepts were retrieved that matched terms abstracted from approaches (iii) and (iv), and the content coverage was assessed. Post-coordination matching was applied when needed. Results The final reference set had 1054 unique ADI concepts which were described by 1860 unique terms. Content coverage of SNOMED CT was high (99.9% with pre-coordinated concepts; 100% with post-coordination). The resulting reference set for ADIs was implemented as the interface terminology on OpenMRS data entry forms. Conclusion Different sources demonstrate complementarity in the collection of concepts and terms for an interface terminology. SNOMED CT provides high coverage in the domain of ADIs. Further work is needed to evaluate the effect of the interface terminology on data quality and quality of care. PMID:26184057

  16. Speckle variance OCT for depth resolved assessment of the viability of bovine embryos

    PubMed Central

    Caujolle, S.; Cernat, R.; Silvestri, G.; Marques, M. J.; Bradu, A.; Feuchter, T.; Robinson, G.; Griffin, D. K.; Podoleanu, A.

    2017-01-01

    The morphology of embryos produced by in vitro fertilization (IVF) is commonly used to estimate their viability. However, imaging by standard microscopy is subjective and unable to assess the embryo on a cellular scale after compaction. Optical coherence tomography is an imaging technique that can produce a depth-resolved profile of a sample and can be coupled with speckle variance (SV) to detect motion on a micron scale. In this study, day 7 post-IVF bovine embryos were observed either short-term (10 minutes) or long-term (over 18 hours) and analyzed by swept source OCT and SV to resolve their depth profile and characterize micron-scale movements potentially associated with viability. The percentage of en face images showing movement at any given time was calculated as a method to detect the vital status of the embryo. This method could be used to measure the levels of damage sustained by an embryo, for example after cryopreservation, in a rapid and non-invasive way. PMID:29188109
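
    A minimal Python sketch, under stated assumptions, of the inter-frame speckle variance calculation described above: per-pixel intensity variance across a stack of co-registered en face frames, with the fraction of "moving" pixels used as a viability proxy. The array shapes and threshold are illustrative, not parameters from the study.

        import numpy as np

        def speckle_variance(frames):
            """frames: array of shape (N, rows, cols) -> per-pixel variance across the N frames."""
            frames = np.asarray(frames, dtype=float)
            return frames.var(axis=0)

        def percent_moving(frames, threshold):
            """Percentage of pixels whose speckle variance exceeds a motion threshold."""
            sv = speckle_variance(frames)
            return 100.0 * (sv > threshold).mean()

        if __name__ == "__main__":
            stack = np.random.rand(8, 256, 256)      # placeholder for repeated en face frames
            print(percent_moving(stack, threshold=0.08))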

  17. Inter-Annual and Shorter-Term Variability in Physical and Biological Characteristics Across Barrow Canyon in August - September 2005-2014

    NASA Astrophysics Data System (ADS)

    Ashjian, C. J.; Okkonen, S. R.; Campbell, R. G.; Alatalo, P.

    2014-12-01

    Late summer physical and biological conditions along a 37-km transect crossing Barrow Canyon have been described for the past ten years as part of an ongoing program, supported by multiple funding sources including the NSF AON, focusing on inter-annual variability and the formation of a bowhead whale feeding hotspot near Barrow. These repeated transects (at least two per year, separated in time by days-weeks) provide an opportunity to assess the inter-annual and shorter term (days-weeks) changes in hydrographic structure, ocean temperature, current velocity and transport, chlorophyll fluorescence, nutrients, and micro- and mesozooplankton community composition and abundance. Inter-annual variability in all properties was high and was associated with larger scale, meteorological forcing. Shorter-term variability could also be high but was strongly influenced by changes in local wind forcing. The sustained sampling at this location provided critical measures of inter-annual variability that should permit detection of longer-term trends that are associated with ongoing climate change.

  18. Conducting Source Water Assessments

    EPA Pesticide Factsheets

    This page presents background information on the source water assessment steps, the four steps of a source water assessment, and how to use the results of an assessment to protect drinking water sources.

  19. Inverse modelling of radionuclide release rates using gamma dose rate observations

    NASA Astrophysics Data System (ADS)

    Hamburger, Thomas; Evangeliou, Nikolaos; Stohl, Andreas; von Haustein, Christoph; Thummerer, Severin; Wallner, Christian

    2015-04-01

    Severe accidents in nuclear power plants such as the historical accident in Chernobyl 1986 or the more recent disaster in the Fukushima Dai-ichi nuclear power plant in 2011 have drastic impacts on the population and environment. Observations and dispersion modelling of the released radionuclides help to assess the regional impact of such nuclear accidents. Modelling the increase of regional radionuclide activity concentrations, which results from nuclear accidents, underlies a multiplicity of uncertainties. One of the most significant uncertainties is the estimation of the source term. That is, the time dependent quantification of the released spectrum of radionuclides during the course of the nuclear accident. The quantification of the source term may either remain uncertain (e.g. Chernobyl, Devell et al., 1995) or rely on estimates given by the operators of the nuclear power plant. Precise measurements are mostly missing due to practical limitations during the accident. The release rates of radionuclides at the accident site can be estimated using inverse modelling (Davoine and Bocquet, 2007). The accuracy of the method depends amongst others on the availability, reliability and the resolution in time and space of the used observations. Radionuclide activity concentrations are observed on a relatively sparse grid and the temporal resolution of available data may be low within the order of hours or a day. Gamma dose rates, on the other hand, are observed routinely on a much denser grid and higher temporal resolution and provide therefore a wider basis for inverse modelling (Saunier et al., 2013). We present a new inversion approach, which combines an atmospheric dispersion model and observations of radionuclide activity concentrations and gamma dose rates to obtain the source term of radionuclides. We use the Lagrangian particle dispersion model FLEXPART (Stohl et al., 1998; Stohl et al., 2005) to model the atmospheric transport of the released radionuclides. The inversion method uses a Bayesian formulation considering uncertainties for the a priori source term and the observations (Eckhardt et al., 2008, Stohl et al., 2012). The a priori information on the source term is a first guess. The gamma dose rate observations are used to improve the first guess and to retrieve a reliable source term. The details of this method will be presented at the conference. This work is funded by the Bundesamt für Strahlenschutz BfS, Forschungsvorhaben 3612S60026. References Davoine, X. and Bocquet, M., Atmos. Chem. Phys., 7, 1549-1564, 2007. Devell, L., et al., OCDE/GD(96)12, 1995. Eckhardt, S., et al., Atmos. Chem. Phys., 8, 3881-3897, 2008. Saunier, O., et al., Atmos. Chem. Phys., 13, 11403-11421, 2013. Stohl, A., et al., Atmos. Environ., 32, 4245-4264, 1998. Stohl, A., et al., Atmos. Chem. Phys., 5, 2461-2474, 2005. Stohl, A., et al., Atmos. Chem. Phys., 12, 2313-2343, 2012.
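
    A hedged Python sketch of the kind of Bayesian least-squares source-term inversion outlined above: a source-receptor sensitivity matrix from a dispersion model links time-resolved release rates to gamma dose rate (or activity concentration) observations, and the a priori source term is corrected by those observations. All matrices and values below are synthetic placeholders, not FLEXPART output or BfS data.

        import numpy as np

        def bayesian_inversion(A, y, x_prior, sigma_obs, sigma_prior):
            """Posterior release rates minimising (y-Ax)' R^-1 (y-Ax) + (x-xa)' B^-1 (x-xa)."""
            R_inv = np.diag(1.0 / sigma_obs**2)        # diagonal observation error covariance
            B_inv = np.diag(1.0 / sigma_prior**2)      # diagonal a priori error covariance
            lhs = A.T @ R_inv @ A + B_inv
            rhs = A.T @ R_inv @ (y - A @ x_prior)
            return x_prior + np.linalg.solve(lhs, rhs)

        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            A = rng.random((50, 10))                   # 50 synthetic dose-rate obs, 10 release periods
            x_true = np.array([0, 0, 5, 8, 3, 1, 0, 0, 0, 0], dtype=float)
            y = A @ x_true + rng.normal(0.0, 0.1, 50)  # noisy synthetic observations
            x_post = bayesian_inversion(A, y, x_prior=np.full(10, 2.0),
                                        sigma_obs=np.full(50, 0.1),
                                        sigma_prior=np.full(10, 5.0))
            print(np.round(x_post, 2))                 # retrieved release rates per period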

  20. An Evaluation Methodology for Longitudinal Studies of Short Term Cancer Research Training Programs

    PubMed Central

    Padilla, Luz A.; Venkatesh, Raam; Daniel, Casey L.; Desmond, Renee A.; Brooks, C. Michael; Waterbor, John W.

    2014-01-01

    The need to familiarize medical students and graduate health professional students with research training opportunities that cultivate the appeal of research careers is vital to the future of research. Comprehensive evaluation of a cancer research training program can be achieved through longitudinal tracking of program alumni to assess the program’s impact on each participant’s career path and professional achievements. With advances in technology and smarter means of communication, effective ways to track alumni have changed. In order to collect data on the career outcomes and achievements of nearly 500 short-term cancer research training program alumni from 1999–2013, we sought to contact each alumnus to request completion of a survey instrument online, or by means of a telephone interview. The effectiveness of each contact method that we used was quantified according to ease of use and time required. The most reliable source of contact information for tracking alumni from the early years of the program was previous tracking results; and for alumni from the later years, the most important source of contact information was university alumni records that provided email addresses and telephone numbers. Personal contacts with former preceptors were sometimes helpful, as were generic search engines and people search engines. Social networking was of little value for most searches. Using information from two or more sources in combination was most effective in tracking alumni. These results provide insights and tools for other research training programs that wish to track their alumni for long-term program evaluation. PMID:25412722

  1. Operational source receptor calculations for large agglomerations

    NASA Astrophysics Data System (ADS)

    Gauss, Michael; Shamsudheen, Semeena V.; Valdebenito, Alvaro; Pommier, Matthieu; Schulz, Michael

    2016-04-01

    For air quality policy, an important question is how much of the air pollution within an urbanized region can be attributed to local sources and how much of it is imported through long-range transport. This is critical information for a correct assessment of the effectiveness of potential emission measures. The ratio between indigenous and long-range transported air pollution for a given region depends on its geographic location, the size of its area, the strength and spatial distribution of emission sources, the time of the year and, very strongly, on the current meteorological conditions, which change from day to day and thus make it important to provide such calculations in near-real-time to support short-term legislation. Similarly, analysis over longer periods (e.g. one year), or of specific air quality episodes in the past, can help to scientifically underpin multi-regional agreements and long-term legislation. Within the European MACC projects (Monitoring Atmospheric Composition and Climate) and the transition to the operational CAMS service (Copernicus Atmosphere Monitoring Service), the computationally efficient EMEP MSC-W air quality model has been applied with detailed emission data and comprehensive calculations of chemistry and microphysics, driven by high-quality meteorological forecast data (up to 96-hour forecasts), to provide source-receptor calculations on a regular basis in forecast mode. In its current state, the product allows the user to choose among different regions and regulatory pollutants (e.g. ozone and PM) to assess the effectiveness of fictive reductions in air pollutant emissions that are implemented immediately, either within the agglomeration or outside it. The effects are visualized as bar charts, showing resulting changes in air pollution levels within the agglomeration as a function of time (hourly resolution, 0 to 4 days into the future). The bar charts not only allow the effects of emission reduction measures to be assessed but also indicate the relative importance of indigenous versus imported air pollution. The calculations are currently performed weekly by MET Norway for the Paris, London, Berlin, Oslo, Po Valley and Rhine-Ruhr regions and the results are provided free of charge at the MACC website (http://www.gmes-atmosphere.eu/services/aqac/policy_interface/regional_sr/). A proposal to extend this service to all EU capitals on a daily basis within the Copernicus Atmosphere Monitoring Service is currently under review. The tool is an important example illustrating the increased application of scientific tools to operational services that support air quality policy. This paper will describe this tool in more detail, focusing on the experimental setup, underlying assumptions, uncertainties, computational demand, and its usefulness for air quality policy. Options to apply the tool for agglomerations outside the EU will also be discussed (making reference to, e.g., PANDA, which is a European-Chinese collaboration project).
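
    The attribution behind such bar charts rests on emission perturbation runs; the toy Python sketch below illustrates that brute-force bookkeeping (scale one region's emissions down, re-run, and divide the concentration change by the reduction factor). The linear "model" is only a stand-in for a chemistry-transport model such as EMEP MSC-W, and all names and numbers are illustrative.

        def contribution(model, emissions, region, reduction=0.15):
            """Estimate a region's contribution from a fractional emission reduction run."""
            base = model(emissions)
            perturbed = dict(emissions)
            perturbed[region] = emissions[region] * (1.0 - reduction)
            return (base - model(perturbed)) / reduction

        if __name__ == "__main__":
            def toy_model(emis):                       # toy linear receptor response
                weights = {"local": 0.8, "imported": 0.3}
                return sum(weights[r] * e for r, e in emis.items())

            emis = {"local": 10.0, "imported": 40.0}   # illustrative emission totals
            for region in emis:
                print(region, round(contribution(toy_model, emis, region), 2))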

  2. Long Term Leaching of Chlorinated Solvents from Source Zones in Low Permeability Settings with Fractures

    NASA Astrophysics Data System (ADS)

    Bjerg, P. L.; Chambon, J.; Troldborg, M.; Binning, P. J.; Broholm, M. M.; Lemming, G.; Damgaard, I.

    2008-12-01

    Groundwater contamination by chlorinated solvents, such as perchloroethylene (PCE), often occurs via leaching from complex sources located in low permeability sediments such as clayey tills overlying aquifers. Clayey tills are mostly fractured, and contamination migrating through the fractures spreads to the low permeability matrix by diffusion. This results in a long-term source of contamination due to back-diffusion. Leaching from such sources is further complicated by microbial degradation under anaerobic conditions to sequentially form the daughter products trichloroethylene, cis-dichloroethylene (cis-DCE), vinyl chloride (VC) and ethene. This process can be enhanced by addition of electron donors and/or bioaugmentation and is termed Enhanced Reductive Dechlorination (ERD). This work aims to improve our understanding of the physical, chemical and microbial processes governing source behaviour under natural and enhanced conditions. That understanding is applied to risk assessment, and to determine the relationship and time frames of source clean-up and plume response. To meet that aim, field and laboratory observations are coupled to state-of-the-art models incorporating new insights into contaminant behaviour. The long-term leaching of chlorinated ethenes from clay aquitards is currently being monitored at a number of Danish sites. The observed data are simulated using a coupled fracture flow and clay matrix diffusion model. Sequential degradation is represented by modified Monod kinetics accounting for competitive inhibition between the chlorinated ethenes. The model is constructed using Comsol Multiphysics, a generic finite-element partial differential equation solver. The model is applied at two well-characterised field sites with respect to hydrogeology, fracture network, contaminant distribution and microbial processes (lab and field experiments). At the study sites (Sortebrovej and Vadsbyvej), the source areas are situated in a clayey till with fractures and interbedded sand lenses. The field sites are both highly contaminated with chlorinated ethenes, which impact the underlying sand aquifer. Anaerobic dechlorination is taking place, and cis-DCE and VC have been found in significant amounts in the matrix. Full-scale remediation using ERD was implemented at Sortebrovej in 2006, and ERD has been suggested as a remedy at Vadsbyvej. Results reveal several interesting findings. The physical processes of matrix diffusion and advection in the fractures seem to be more important than the microbial degradation processes for estimation of the time frames, and the distance between fractures is amongst the most sensitive model parameters. However, the inclusion of sequential degradation is crucial to determining the composition of contamination leaching into the underlying aquifer. Degradation products like VC will peak at an earlier stage compared to the mother compound due to their higher mobility. The findings highlight a need for improved characterization of low permeability aquitards lying above aquifers used for water supply. The fracture network in aquitards is currently poorly described at larger depths (below 5-8 m) and the effect of sand lenses on leaching behaviour is not well understood. The microbial processes are assumed to be taking place in the fracture system, but the interaction with and processes in the matrix need to be further explored.
Development of new methods for field site characterisation and integrated field and model expertise are crucial for the design of remedial actions and for risk assessment of contaminated sites in low permeability settings.
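
    A hedged Python sketch of the sequential degradation kinetics mentioned above: Monod-type rates for PCE -> TCE -> cis-DCE -> VC -> ethene with competitive inhibition between the chlorinated ethenes. Rate constants and half-saturation values are illustrative assumptions, and the fracture-matrix transport that the Comsol model couples to is omitted.

        import numpy as np
        from scipy.integrate import solve_ivp

        K = np.array([2.0, 2.0, 2.0, 2.0])        # half-saturation constants (uM), assumed
        VMAX = np.array([1.0, 0.8, 0.5, 0.3])     # maximum degradation rates (uM/d), assumed

        def rates(c):
            """Monod rate of each parent compound with competitive inhibition by the others."""
            parents = c[:4]                        # PCE, TCE, cis-DCE, VC
            inhib = np.array([np.sum(parents / K) - parents[i] / K[i] for i in range(4)])
            return VMAX * parents / (K * (1.0 + inhib) + parents)

        def rhs(t, c):
            r = rates(c)
            dc = np.zeros(5)
            dc[:4] -= r                            # loss of each parent compound
            dc[1:] += r                            # gain of the corresponding daughter product
            return dc

        if __name__ == "__main__":
            c0 = [50.0, 0.0, 0.0, 0.0, 0.0]        # start with PCE only (uM)
            sol = solve_ivp(rhs, (0.0, 200.0), c0, t_eval=np.linspace(0.0, 200.0, 5))
            print(np.round(sol.y[:, -1], 2))       # PCE, TCE, cDCE, VC, ethene at day 200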

  3. Long Term 2 Second Round Source Water Monitoring and Bin Placement Memo

    EPA Pesticide Factsheets

    The Long Term 2 Enhanced Surface Water Treatment Rule (LT2ESWTR) applies to all public water systems served by a surface water source or public water systems served by a ground water source under the direct influence of surface water.

  4. Development of the Metropolitan Water Availability Index (MWAI) and short-term assessment with multi-scale remote sensing technologies.

    PubMed

    Chang, Ni-Bin; Yang, Y Jeffrey; Goodrich, James A; Daranpob, Ammarin

    2010-06-01

    Global climate change will influence environmental conditions including temperature, surface radiation, soil moisture, and sea level, and it will also significantly impact regional-scale hydrologic processes such as evapotranspiration (ET), precipitation, runoff, and snowmelt. The quantity and quality of water available for drinking and other domestic usage is also likely to be affected by changes in these processes. Consequently, it is necessary to assess and reflect upon the challenges ahead for water infrastructure and the general public in metropolitan regions. One approach to the problem is to use index-based assessment, forecasting and planning. The drought indices developed previously were not designed for domestic water supplies, and thus are insufficient for the purpose of such an assessment. This paper aims to propose and develop a "Metropolitan Water Availability Index (MWAI)" to assess the status of both the quantity and quality of available potable water sources diverted from the hydrologic cycle in a metropolitan region. In this approach, the accessible water may be expressed as volume per month or week (i.e., m³/month or m³/week) relative to a prescribed historical record, and such a trend analysis may result in final MWAI values ranging from -1 to +1 for regional water management decision making. The MWAI computation uses data and information from both historical point measurements and spatial remote-sensing-based monitoring. Variables such as precipitation, river discharge, and water quality changes at drinking water plant intakes at specific locations are past "point" measurements in MWAI calculations. On the other hand, remote sensing provides information on both spatial and temporal distributions of key variables. Examples of remote-sensing images and sensor network technologies are in-situ sensor networks, ground-based radar, air-borne aircraft, and even space-borne satellites. A case study in Tampa Bay, Florida is described to demonstrate the short-term assessment of the MWAI concept at a practical level. It is anticipated that such a forecasting methodology may be extended for middle-term and long-term water supply assessment. © 2010 Elsevier Ltd. All rights reserved.
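
    The published MWAI formula is not reproduced here; as a purely illustrative Python sketch of the idea, the snippet below scores a current monthly accessible volume against a historical record and clips the scaled anomaly to the [-1, +1] range described in the abstract. The scaling choice and all numbers are assumptions for illustration only.

        import numpy as np

        def toy_availability_index(current_volume, historical_volumes):
            """Scaled anomaly of the current monthly volume versus the historical record, clipped to [-1, 1]."""
            hist = np.asarray(historical_volumes, dtype=float)
            anomaly = current_volume - np.median(hist)
            half_range = 0.5 * (hist.max() - hist.min())
            return float(np.clip(anomaly / half_range, -1.0, 1.0))

        if __name__ == "__main__":
            record = [8.1e6, 9.4e6, 7.2e6, 10.3e6, 8.8e6, 6.9e6]   # m3/month, synthetic record
            print(toy_availability_index(7.0e6, record))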

  5. Uncertainty, variability, and earthquake physics in ground‐motion prediction equations

    USGS Publications Warehouse

    Baltay, Annemarie S.; Hanks, Thomas C.; Abrahamson, Norm A.

    2017-01-01

    Residuals between ground‐motion data and ground‐motion prediction equations (GMPEs) can be decomposed into terms representing earthquake source, path, and site effects. These terms can be cast in terms of repeatable (epistemic) residuals and the random (aleatory) components. Identifying the repeatable residuals leads to a GMPE with reduced uncertainty for a specific source, site, or path location, which in turn can yield a lower hazard level at small probabilities of exceedance. We illustrate a schematic framework for this residual partitioning with a dataset from the ANZA network, which straddles the central San Jacinto fault in southern California. The dataset consists of more than 3200 1.15≤M≤3 earthquakes and their peak ground accelerations (PGAs), recorded at close distances (R≤20  km). We construct a small‐magnitude GMPE for these PGA data, incorporating VS30 site conditions and geometrical spreading. Identification and removal of the repeatable source, path, and site terms yield an overall reduction in the standard deviation from 0.97 (in ln units) to 0.44, for a nonergodic assumption, that is, for a single‐source location, single site, and single path. We give examples of relationships between independent seismological observables and the repeatable terms. We find a correlation between location‐based source terms and stress drops in the San Jacinto fault zone region; an explanation of the site term as a function of kappa, the near‐site attenuation parameter; and a suggestion that the path component can be related directly to elastic structure. These correlations allow the repeatable source location, site, and path terms to be determined a priori using independent geophysical relationships. Those terms could be incorporated into location‐specific GMPEs for more accurate and precise ground‐motion prediction.
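
    A minimal Python sketch of the residual-partitioning idea (not the authors' exact procedure): total ln(PGA) residuals are split sequentially into repeatable event, site and path terms, here taken simply as grouped means, leaving a reduced aleatory remainder. Column names and numbers are assumed for illustration, not the ANZA dataset schema.

        import pandas as pd

        def partition_residuals(df):
            """df columns: 'resid', 'event', 'station', 'path' (e.g. a source-site cell label)."""
            out = df.copy()
            out["event_term"] = out.groupby("event")["resid"].transform("mean")
            r1 = out["resid"] - out["event_term"]
            out["site_term"] = r1.groupby(out["station"]).transform("mean")
            r2 = r1 - out["site_term"]
            out["path_term"] = r2.groupby(out["path"]).transform("mean")
            out["aleatory"] = r2 - out["path_term"]
            return out

        if __name__ == "__main__":
            df = pd.DataFrame({"resid": [0.3, 0.1, -0.2, 0.4, -0.1, 0.0],
                               "event": ["e1", "e1", "e2", "e2", "e3", "e3"],
                               "station": ["s1", "s2", "s1", "s2", "s1", "s2"],
                               "path": ["p1", "p2", "p1", "p2", "p1", "p2"]})
            parts = partition_residuals(df)
            print(parts[["event_term", "site_term", "path_term", "aleatory"]].round(3))
            print("sigma:", round(float(df["resid"].std()), 3),
                  "->", round(float(parts["aleatory"].std()), 3))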

  6. Drinking water studies: a review on heavy metal, application of biomarker and health risk assessment (a special focus in Malaysia).

    PubMed

    Ab Razak, Nurul Hafiza; Praveena, Sarva Mangala; Aris, Ahmad Zaharin; Hashim, Zailina

    2015-12-01

    Malaysia has abundant sources of drinking water from rivers and groundwater. However, rapid development has deteriorated the quality of drinking water sources in Malaysia. Studies of heavy metals in drinking water, applications of health risk assessment and bio-monitoring in Malaysia were reviewed from 2003 to 2013. Studies on heavy metals in drinking water showed that levels are below the permissible limits suggested by the World Health Organization and the Malaysian Ministry of Health. Future studies on the application of health risk assessment are crucial in order to understand the risk of heavy metal exposure through drinking water to the Malaysian population. Among the biomarkers that have been reviewed, toenails are the most useful tool for evaluating the body burden of heavy metals. Toenails are easy to collect, store, transport and analyse. This review gives clear guidance for future studies of Malaysian drinking water. In this way, it will help risk managers to minimize exposure to an optimum level and will help the government formulate policies for safeguarding the population. Copyright © 2015 Ministry of Health, Saudi Arabia. Published by Elsevier Ltd. All rights reserved.

  7. Background and Source Term Identification in Active Neutron Interrogation Methods

    DTIC Science & Technology

    2011-03-24

    interactions occurred to observe gamma ray peaks and not unduly increase simulation time. Not knowing the uranium enrichment modeled by Gozani, pure U...neutron interactions can occur. The uranium targets, though, should have increased neutron fluences as the energy levels fall below 2 MeV. This is...Assessment Monitor Site (TEAMS) at Kirtland AFB, NM. Iron (Fe-56), lead (Pb-207), polyethylene (C2H4 –– > C-12 & H-1), and uranium (U-235 and U-238) were

  8. Final Environmental Assessment of Installation Development at Grand Forks Air Force Base, North Dakota

    DTIC Science & Technology

    2010-07-01

    the ground source heat pump system. During installation, construction equipment would remove vegetation from the surface and disturb soil to a depth...levels of 50 to 55 dBA or higher on a daily basis. Studies specifically conducted to determine noise effects on various human activities show that about...needs to be evaluated for its potential effects on a project site and adjacent land uses. The foremost factor affecting a proposed action in terms of

  9. Orbital Debris Quarterly News. Volume 13; No. 1

    NASA Technical Reports Server (NTRS)

    Liou, J.-C. (Editor); Shoots, Debi (Editor)

    2009-01-01

    Topics discussed include: new debris from a decommissioned satellite with a nuclear power source; debris from the destruction of the Fengyun-1C meteorological satellite; quantitative analysis of the European Space Agency's Automated Transfer Vehicle 'Jules Verne' reentry event; microsatellite impact tests; solar cycle 24 predictions and other long-term projections; and the geosynchronous (GEO) environment for the Orbital Debris Engineering Model (ORDEM2008). Abstracts from the NASA Orbital Debris Program Office, examining satellite reentry risk assessments and statistical issues for uncontrolled reentry hazards, are also included.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    The feasibility of constructing a 25-50 MWe geothermal power plant using low salinity hydrothermal fluid as the energy source was assessed. Here, the geotechnical aspects of geothermal power generation and their relationship to environmental impacts in the Imperial Valley of California were investigated. Geology, geophysics, hydrogeology, seismicity and subsidence are discussed in terms of the availability of data, state-of-the-art analytical techniques, historical and technical background and interpretation of current data. Estimates of the impact of these geotechnical factors on the environment in the Imperial Valley, if geothermal development proceeds, are discussed.

  11. Experimental investigation of the influence of Mo contained in stainless steel on Cs chemisorption behavior

    NASA Astrophysics Data System (ADS)

    Di Lemma, F. G.; Nakajima, K.; Yamashita, S.; Osaka, M.

    2017-02-01

    Chemisorption phenomena can affect fission product (FP) retention in a nuclear reactor vessel during a severe accident (SA). Detailed information on the FP chemisorbed deposits, especially for Cs, is important for a rational decommissioning of the reactor following a SA, as is the case for the Fukushima Daiichi Nuclear Power Station. Moreover, the retention of Cs will influence the source term assessment, and thus improved models for this phenomenon are needed in SA codes. This paper describes the influence on Cs chemisorption of molybdenum contained in stainless steel (SS) type 316. In our experiments it was observed that Cs-Mo deposits (CsFe(MoO4)3, Cs2MoO4) were formed together with CsFeSiO4, which is the predominant compound formed by chemisorption. The Cs-Mo deposits were found to revaporize from the SS sample at 1000 °C, and thus could contribute to the source term. On the other hand, CsFeSiO4 will probably be retained in the reactor during a SA due to its stability.

  12. Characterization and in vitro properties of potentially probiotic Bifidobacterium strains isolated from breast-milk.

    PubMed

    Arboleya, Silvia; Ruas-Madiedo, Patricia; Margolles, Abelardo; Solís, Gonzalo; Salminen, Seppo; de Los Reyes-Gavilán, Clara G; Gueimonde, Miguel

    2011-09-01

    Most of the current commercial probiotic strains have not been selected for specific applications, but rather on the basis of their technological potential for use in diverse applications. Therefore, by selecting them from appropriate sources, depending on the target population, it is likely that better performing strains may be identified. Few strains have been specifically selected for human neonates, where the applications of probiotics may have a great positive impact. Breast-milk constitutes an interesting source of potentially probiotic bifidobacteria for inclusion in infant formulas and foods targeted to both pre-term and full-term infants. In this study six Bifidobacterium strains isolated from breast-milk were phenotypically and genotypically characterised according to international guidelines for probiotics. In addition, different in vitro tests were used to assess the safety and probiotic potential of the strains. Although clinical data would be needed before drawing any conclusion on the probiotic properties of the strains, our results indicate that some of them may have probiotic potential for their inclusion in products targeting infants. Copyright © 2010 Elsevier B.V. All rights reserved.

  13. Dosimetric comparison of Monte Carlo codes (EGS4, MCNP, MCNPX) considering external and internal exposures of the Zubal phantom to electron and photon sources.

    PubMed

    Chiavassa, S; Lemosquet, A; Aubineau-Lanièce, I; de Carlan, L; Clairand, I; Ferrer, L; Bardiès, M; Franck, D; Zankl, M

    2005-01-01

    This paper aims at comparing dosimetric assessments performed with three Monte Carlo codes: EGS4, MCNP4c2 and MCNPX2.5e, using a realistic voxel phantom, namely the Zubal phantom, in two configurations of exposure. The first one deals with an external irradiation corresponding to the example of a radiological accident. The results are obtained using the EGS4 and the MCNP4c2 codes and expressed in terms of the mean absorbed dose (in Gy per source particle) for brain, lungs, liver and spleen. The second one deals with an internal exposure corresponding to the treatment of a medullary thyroid cancer by 131I-labelled radiopharmaceutical. The results are obtained by EGS4 and MCNPX2.5e and compared in terms of S-values (expressed in mGy per kBq and per hour) for liver, kidney, whole body and thyroid. The results of these two studies are presented and differences between the codes are analysed and discussed.
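
    A hedged worked example, in Python, of the unit conversion behind such S-values (MIRD-style dose per unit cumulated activity): a Monte Carlo tally of energy deposited in a target organ per source decay is converted to mGy per kBq of source activity per hour. The deposited energy and organ mass below are placeholders, not results from the cited codes or the Zubal phantom.

        MEV_TO_J = 1.602176634e-13                 # joules per MeV
        DECAYS_PER_KBQ_HOUR = 1.0e3 * 3600.0       # decays in one hour at 1 kBq

        def s_value_mGy_per_kBq_h(edep_mev_per_decay, target_mass_kg):
            """Convert tallied energy per decay (MeV) and organ mass (kg) to mGy per kBq per hour."""
            gy_per_decay = edep_mev_per_decay * MEV_TO_J / target_mass_kg
            return gy_per_decay * DECAYS_PER_KBQ_HOUR * 1.0e3   # Gy -> mGy

        if __name__ == "__main__":
            # e.g. 0.05 MeV deposited per decay in a 1.8 kg liver (illustrative values only)
            print(s_value_mGy_per_kBq_h(0.05, 1.8))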

  14. Parenting Knowledge: Experiential and Sociodemographic Factors in European American Mothers of Young Children

    PubMed Central

    Bornstein, Marc H.; Cote, Linda R.; Haynes, O. Maurice; Hahn, Chun-Shin; Park, Yoonjung

    2011-01-01

    Knowledge of childrearing and child development is relevant to parenting and the well-being of children. In a sociodemographically heterogeneous sample of 268 European American mothers of 2-year-olds, we assessed the state of mothers’ parenting knowledge, compared parenting knowledge in groups of mothers who varied in terms of parenthood and social status, and identified principal sources of mothers’ parenting knowledge in terms of social factors, parenting supports, and formal classes. On the whole, European American mothers demonstrated a fair but less than complete basic parenting knowledge, and mothers’ age, education, and rated helpfulness of written materials each uniquely contributed to their knowledge. Adult mothers scored higher than adolescent mothers, and mothers improved in their knowledge of parenting from their first to their second child (and were stable across time). No differences were found between mothers of girls and boys, mothers who varied in employment status, or between birth and adoptive mothers. The implications of variation in parenting knowledge and its sources for parenting education and clinical interactions with parents are discussed. PMID:20836597

  15. PuLSE: Quality control and quantification of peptide sequences explored by phage display libraries.

    PubMed

    Shave, Steven; Mann, Stefan; Koszela, Joanna; Kerr, Alastair; Auer, Manfred

    2018-01-01

    The design of highly diverse phage display libraries is based on the assumption that DNA bases are incorporated at similar rates within the randomized sequence. As library complexity increases and expected copy numbers of unique sequences decrease, the exploration of library space becomes sparser and the presence of truly random sequences becomes critical. We present the program PuLSE (Phage Library Sequence Evaluation) as a tool for assessing randomness and therefore the diversity of phage display libraries. PuLSE runs on a collection of sequence reads in the fastq file format and generates tables profiling the library in terms of unique DNA sequence counts and positions, translated peptide sequences, and normalized 'expected' occurrences from base to residue codon frequencies. The output allows at-a-glance quantitative quality control of a phage library in terms of sequence coverage both at the DNA base and translated protein residue level, which has been missing from existing toolsets and the literature. The open source program PuLSE is available in two formats: a C++ source code package for compilation and integration into existing bioinformatics pipelines, and precompiled binaries for ease of use.
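
    A minimal Python sketch (not the PuLSE program itself, which is written in C++) of the core bookkeeping the abstract describes: counting unique DNA inserts in a FASTQ file and tallying their translated peptides. The codon table is deliberately abbreviated, so a real run would need the full standard table; the file name and sequences are illustrative.

        from collections import Counter

        # abbreviated codon table for illustration only; real use needs all 64 codons
        CODON_TABLE = {"TTT": "F", "TTC": "F", "CTG": "L", "ATG": "M",
                       "GCT": "A", "GCC": "A", "TAA": "*", "TAG": "*", "TGA": "*"}

        def read_fastq_sequences(path):
            with open(path) as handle:
                for i, line in enumerate(handle):
                    if i % 4 == 1:                  # the sequence line of each FASTQ record
                        yield line.strip().upper()

        def translate(dna):
            codons = (dna[i:i + 3] for i in range(0, len(dna) - 2, 3))
            return "".join(CODON_TABLE.get(c, "X") for c in codons)

        def profile_library(path):
            dna_counts = Counter(read_fastq_sequences(path))
            pep_counts = Counter()
            for seq, n in dna_counts.items():
                pep_counts[translate(seq)] += n
            return dna_counts, pep_counts

        if __name__ == "__main__":
            with open("demo.fastq", "w") as fh:     # tiny toy file, not a real library
                fh.write("@r1\nATGGCTTTT\n+\nIIIIIIIII\n@r2\nATGGCCTTC\n+\nIIIIIIIII\n")
            dna, pep = profile_library("demo.fastq")
            print(dna, pep)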

  16. Interlaboratory study of the ion source memory effect in 36Cl accelerator mass spectrometry

    NASA Astrophysics Data System (ADS)

    Pavetich, Stefan; Akhmadaliev, Shavkat; Arnold, Maurice; Aumaître, Georges; Bourlès, Didier; Buchriegler, Josef; Golser, Robin; Keddadouche, Karim; Martschini, Martin; Merchel, Silke; Rugel, Georg; Steier, Peter

    2014-06-01

    Understanding and minimizing contamination in the ion source due to cross-contamination and the long-term memory effect is one of the key issues for accurate accelerator mass spectrometry (AMS) measurements of volatile elements. The focus of this work is on the investigation of the long-term memory effect for the volatile element chlorine, and the minimization of this effect in the ion source of the Dresden accelerator mass spectrometry facility (DREAMS). For this purpose, one of the two original HVE ion sources at the DREAMS facility was modified, allowing the use of larger sample holders with individual target apertures. Additionally, a more open geometry was used to improve the vacuum level. To evaluate this improvement in comparison to other up-to-date ion sources, an interlaboratory comparison was initiated. The long-term memory effect of the four Cs sputter ion sources at DREAMS (two sources: original and modified), ASTER (Accélérateur pour les Sciences de la Terre, Environnement, Risques) and VERA (Vienna Environmental Research Accelerator) was investigated by measuring samples with a natural 35Cl/37Cl ratio and samples highly enriched in 35Cl (35Cl/37Cl ∼ 999). Besides investigating and comparing the individual levels of long-term memory, recovery time constants could be calculated. The tests show that all four sources suffer from long-term memory, but the modified DREAMS ion source showed the lowest level of contamination. The recovery times of the four ion sources were widely spread between 61 and 1390 s, where the modified DREAMS ion source, with values between 156 and 262 s, showed the fastest recovery in 80% of the measurements.
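
    A hedged Python sketch of how a recovery time constant can be extracted from such a long-term memory measurement: after switching from a 35Cl-enriched sample back to natural material, the measured 35Cl/37Cl ratio is fitted with an exponential decay towards the natural value. The data below are synthetic, and the single-exponential form is an assumption rather than the procedure used by the authors.

        import numpy as np
        from scipy.optimize import curve_fit

        NATURAL_RATIO = 3.127                       # approximate natural 35Cl/37Cl ratio

        def recovery(t, amplitude, tau):
            """Exponential relaxation of the measured ratio back to the natural value."""
            return NATURAL_RATIO + amplitude * np.exp(-t / tau)

        if __name__ == "__main__":
            t = np.linspace(0.0, 1200.0, 40)        # seconds after the sample change (synthetic)
            obs = recovery(t, amplitude=4.0, tau=200.0) + np.random.normal(0.0, 0.05, t.size)
            (amp, tau), _ = curve_fit(recovery, t, obs, p0=(1.0, 100.0))
            print(f"fitted recovery time constant: {tau:.0f} s")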

  17. Emergent Constraints for Cloud Feedbacks and Climate Sensitivity

    DOE PAGES

    Klein, Stephen A.; Hall, Alex

    2015-10-26

    Emergent constraints are physically explainable empirical relationships between characteristics of the current climate and long-term climate prediction that emerge in collections of climate model simulations. With the prospect of constraining long-term climate prediction, scientists have recently uncovered several emergent constraints related to long-term cloud feedbacks. We review these proposed emergent constraints, many of which involve the behavior of low-level clouds, and discuss criteria to assess their credibility. With further research, some of the cases we review may eventually become confirmed emergent constraints, provided they are accompanied by credible physical explanations. Because confirmed emergent constraints identify a source of model error that projects onto climate predictions, they deserve extra attention from those developing climate models and climate observations. While a systematic bias cannot be ruled out, it is noteworthy that the promising emergent constraints suggest larger cloud feedback and hence climate sensitivity.

  18. Language differences in verbal short-term memory do not exclusively originate in the process of subvocal rehearsal.

    PubMed

    Thorn, A S; Gathercole, S E

    2001-06-01

    Language differences in verbal short-term memory were investigated in two experiments. In Experiment 1, bilinguals with high competence in English and French and monolingual English adults with extremely limited knowledge of French were assessed on their serial recall of words and nonwords in both languages. In all cases recall accuracy was superior in the language with which individuals were most familiar, a first-language advantage that remained when variation due to differential rates of articulation in the two languages was taken into account. In Experiment 2, bilinguals recalled lists of English and French words with and without concurrent articulatory suppression. First-language superiority persisted under suppression, suggesting that the language differences in recall accuracy were not attributable to slower rates of subvocal rehearsal in the less familiar language. The findings indicate that language-specific differences in verbal short-term memory do not exclusively originate in the subvocal rehearsal process. It is suggested that one source of language-specific variation might relate to the use of long-term knowledge to support short-term memory performance.

  19. USGS perspectives on an integrated approach to watershed and coastal management

    USGS Publications Warehouse

    Larsen, Matthew C.; Hamilton, Pixie A.; Haines, John W.; Mason, Jr., Robert R.

    2010-01-01

    The writers discuss three critically important steps necessary for achieving the goal of improved integrated approaches to watershed and coastal protection and management. These steps involve modernization of monitoring networks, creation of common data and web services infrastructures, and development of modeling, assessment, and research tools. Long-term monitoring is needed for tracking the effectiveness of approaches for controlling land-based sources of nutrients, contaminants, and invasive species. The integration of mapping and monitoring with conceptual and mathematical models and multidisciplinary assessments is important in making well-informed decisions. Moreover, a better integrated data network is essential for mapping, statistical, and modeling applications, and timely dissemination of data and information products to a broad community of users.

  20. The Impact of High School Science Teachers' Beliefs, Curricular Enactments and Experience on Student Learning During an Inquiry-based Urban Ecology Curriculum

    NASA Astrophysics Data System (ADS)

    McNeill, Katherine L.; Silva Pimentel, Diane; Strauss, Eric G.

    2013-10-01

    Inquiry-based curricula are an essential tool for reforming science education yet the role of the teacher is often overlooked in terms of the impact of the curriculum on student achievement. Our research focuses on 22 teachers' use of a year-long high school urban ecology curriculum and how teachers' self-efficacy, instructional practices, curricular enactments and previous experience impacted student learning. Data sources included teacher belief surveys, teacher enactment surveys, a student multiple-choice assessment focused on defining and identifying science concepts and a student open-ended assessment focused on scientific inquiry. Results from the two hierarchical linear models indicate that there was significant variation between teachers in terms of student achievement. For the multiple-choice assessment, teachers who spent a larger percentage of time on group work and a smaller percentage of time lecturing had greater student learning. For the open-ended assessment, teachers who reported a higher frequency of students engaging in argument and sharing ideas had greater student learning while teachers who adapted the curriculum more had lower student learning. These results suggest the importance of supporting the active role of students in instruction, emphasising argumentation, and considering the types of adaptations teachers make to curriculum.

  1. Hanford Site Composite Analysis Technical Approach Description: Vadose Zone

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, M. D.; Nichols, W. E.; Ali, A.

    2017-10-30

    The U.S. Department of Energy (DOE) in DOE O 435.1 Chg. 1, Radioactive Waste Management, and DOE M 435.1 Chg 1, Radioactive Waste Management Manual, requires the preparation and maintenance of a composite analysis (CA). The primary purpose of the CA is to provide a reasonable expectation that the primary public dose limit is not likely to be exceeded by multiple source terms that may significantly interact with plumes originating at a low-level waste disposal facility. The CA is used to facilitate planning and land use decisions that help assure disposal facility authorization will not result in long-term compliance problems; or, to determine management alternatives, corrective actions, or assessment needs, if potential problems are identified.

  2. Near-term hybrid vehicle program, phase 1. Appendix B: Design trade-off studies report. Volume 2: Supplement to design trade-off studies

    NASA Technical Reports Server (NTRS)

    1979-01-01

    Results of studies leading to the preliminary design of a hybrid passenger vehicle which is projected to have the maximum potential for reducing petroleum consumption in the near term are presented. Heat engine/electric hybrid vehicle tradeoffs, assessment of battery power source, and weight and cost analysis of key components are among the topics covered. Performance of auxiliary equipment, such as power steering, power brakes, air conditioning, lighting and electrical accessories, heating and ventilation is discussed along with the selection of preferred passenger compartment heating procedure for the hybrid vehicle. Waste heat from the engine, thermal energy storage, and an auxiliary burner are among the approaches considered.

  3. Semi-implicit and fully implicit shock-capturing methods for hyperbolic conservation laws with stiff source terms

    NASA Technical Reports Server (NTRS)

    Yee, H. C.; Shinn, J. L.

    1986-01-01

    Some numerical aspects of finite-difference algorithms for nonlinear multidimensional hyperbolic conservation laws with stiff nonhomogeneous (source) terms are discussed. If the stiffness is entirely dominated by the source term, a semi-implicit shock-capturing method is proposed provided that the Jacobian of the source terms possesses certain properties. The proposed semi-implicit method can be viewed as a variant of the Bussing and Murman point-implicit scheme with a more appropriate numerical dissipation for the computation of strong shock waves. However, if the stiffness is not solely dominated by the source terms, a fully implicit method would be a better choice. The situation is more complicated for problems in more than one dimension, and the presence of stiff source terms further complicates the solution procedures for alternating direction implicit (ADI) methods. Several alternatives are discussed. The primary motivation for constructing these schemes was to address thermal and chemical nonequilibrium flows in the hypersonic regime. Due to the unique structure of the eigenvalues and eigenvectors for fluid flows of this type, the computation can be simplified, thus providing a more efficient solution procedure than one might have anticipated.
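
    A minimal Python sketch of the point-implicit idea for a stiff source term: only the source Jacobian is treated implicitly, while the explicit flux residual is retained from the current step. The scalar relaxation model problem and coefficients are illustrative, not the scheme or equations of the cited paper.

        def point_implicit_step(u, dt, explicit_residual, source, dsource_du):
            """Solve (1 - dt*dS/du) du = dt*(R + S(u)) for the scalar update du."""
            rhs = dt * (explicit_residual + source(u))
            return u + rhs / (1.0 - dt * dsource_du(u))

        if __name__ == "__main__":
            k, u_eq = 1.0e4, 1.0                    # stiff relaxation towards equilibrium (assumed)
            S = lambda u: -k * (u - u_eq)           # stiff source term
            dS = lambda u: -k                       # its Jacobian
            u, dt = 0.0, 1.0e-2                     # dt >> 1/k: explicit Euler would be unstable
            for _ in range(10):
                u = point_implicit_step(u, dt, explicit_residual=0.0,
                                        source=S, dsource_du=dS)
            print(round(u, 6))                      # relaxes stably towards u_eq = 1.0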

  4. Common Calibration Source for Monitoring Long-term Ozone Trends

    NASA Technical Reports Server (NTRS)

    Kowalewski, Matthew

    2004-01-01

    Accurate long-term satellite measurements are crucial for monitoring the recovery of the ozone layer. The slow pace of the recovery and the limited lifetimes of satellite monitoring instruments demand that datasets from multiple observation systems be combined to provide the long-term accuracy needed. A fundamental component of accurately monitoring long-term trends is the calibration of these various instruments. NASA's Radiometric Calibration and Development Facility at the Goddard Space Flight Center has provided resources to minimize calibration biases between multiple instruments through the use of a common calibration source and standardized procedures traceable to national standards. The Facility's 50 cm barium sulfate integrating sphere has been used as a common calibration source for both US and international satellite instruments, including the Total Ozone Mapping Spectrometer (TOMS), Solar Backscatter Ultraviolet 2 (SBUV/2) instruments, Shuttle SBUV (SSBUV), Ozone Mapping Instrument (OMI), Global Ozone Monitoring Experiment (GOME) (ESA), Scanning Imaging SpectroMeter for Atmospheric ChartographY (SCIAMACHY) (ESA), and others. We will discuss the advantages of using a common calibration source and its effects on long-term ozone data sets. In addition, sphere calibration results from various instruments will be presented to demonstrate the accuracy of the long-term characterization of the source itself.

  5. Watershed nitrogen and phosphorus balance: The upper Potomac River basin

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jaworski, N.A.; Groffman, P.M.; Keller, A.A.

    1992-01-01

    Nitrogen and phosphorus mass balances were estimated for the portion of the Potomac River basin watershed located above Washington, D.C. The total nitrogen (N) balance included seven input source terms, six sinks, and one 'change-in-storage' term, but was simplified to five input terms and three output terms. The phosphorus (P) balance had four input and three output terms. The estimated balances are based on watershed data from seven information sources. Major sources of nitrogen are animal waste and atmospheric deposition. The major sources of phosphorus are animal waste and fertilizer. The major sink for nitrogen is combined denitrification, volatilization, and change-in-storage. The major sink for phosphorus is change-in-storage. River exports of N and P were 17% and 8%, respectively, of the total N and P inputs. Over 60% of the N and P were volatilized or stored. The major input and output terms on the budget are estimated from direct measurements, but the change-in-storage term is calculated by difference. The factors regulating retention and storage processes are discussed and research needs are identified.
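
    A hedged Python sketch of the budgeting arithmetic described above: the change-in-storage term is obtained by difference between summed inputs and measured outputs, and river export is reported as a fraction of total input. The term names and numbers are placeholders chosen only to illustrate the arithmetic, not the Potomac estimates.

        def nutrient_budget(inputs, outputs):
            """inputs/outputs: dicts of term name -> mass flux (e.g. kg N per yr)."""
            total_in = sum(inputs.values())
            total_out = sum(outputs.values())
            residual = total_in - total_out            # denitrification/volatilization plus storage
            export_fraction = outputs.get("river_export", 0.0) / total_in
            return total_in, residual, export_fraction

        if __name__ == "__main__":
            n_in = {"atmospheric_deposition": 30.0, "animal_waste": 45.0,
                    "fertilizer": 20.0, "fixation": 10.0, "point_sources": 5.0}
            n_out = {"river_export": 18.0, "crop_harvest": 25.0}
            total, residual, frac = nutrient_budget(n_in, n_out)
            print(total, residual, round(100 * frac, 1), "% exported")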

  6. Development and comparison of Bayesian modularization method in uncertainty assessment of hydrological models

    NASA Astrophysics Data System (ADS)

    Li, L.; Xu, C.-Y.; Engeland, K.

    2012-04-01

    With respect to model calibration, parameter estimation and analysis of uncertainty sources, different approaches have been used in hydrological models. The Bayesian method is one of the most widely used approaches for uncertainty assessment of hydrological models, incorporating different sources of information into a single analysis through Bayes' theorem. However, none of these applications can adequately treat the uncertainty in the extreme flows of hydrological model simulations. This study proposes a Bayesian modularization approach to the uncertainty assessment of conceptual hydrological models that considers the extreme flows. It includes a comprehensive comparison and evaluation of uncertainty assessments by the new Bayesian modularization approach and traditional Bayesian models using the Metropolis-Hastings (MH) algorithm with the daily hydrological model WASMOD. Three likelihood functions are used in combination with the traditional Bayesian models: the AR(1) plus Normal and time period independent model (Model 1), the AR(1) plus Normal and time period dependent model (Model 2) and the AR(1) plus multi-normal model (Model 3). The results reveal that (1) the simulations derived from the Bayesian modularization method are more accurate, with the highest Nash-Sutcliffe efficiency value, and (2) the Bayesian modularization method performs best in uncertainty estimates of the entire flow range and in terms of application and computational efficiency. The study thus introduces a new approach for reducing the effect of extreme flows on the discharge uncertainty assessment of hydrological models via Bayesian methods. Keywords: extreme flow, uncertainty assessment, Bayesian modularization, hydrological model, WASMOD
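
    A hedged Python sketch of an AR(1)-plus-Normal error likelihood of the kind used in such Bayesian calibrations: residuals between observed and simulated discharge are assumed to follow eps_t = phi*eps_{t-1} + eta_t with Gaussian eta_t (the first-value stationary term is omitted for brevity). This is a generic formulation, not the exact WASMOD implementation, and the data are synthetic.

        import numpy as np

        def ar1_normal_loglik(obs, sim, phi, sigma):
            """Log-likelihood of discharge residuals under an AR(1) error model with Normal innovations."""
            eps = np.asarray(obs, dtype=float) - np.asarray(sim, dtype=float)
            innov = eps[1:] - phi * eps[:-1]          # AR(1) innovations
            n = innov.size
            return (-0.5 * n * np.log(2.0 * np.pi * sigma**2)
                    - 0.5 * np.sum(innov**2) / sigma**2)

        if __name__ == "__main__":
            rng = np.random.default_rng(1)
            obs = rng.gamma(2.0, 2.0, 200)            # synthetic "observed" discharge series
            sim = obs + rng.normal(0.0, 0.5, 200)     # synthetic model output
            print(ar1_normal_loglik(obs, sim, phi=0.6, sigma=0.5))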

  7. The NASA Lewis Research Center SBIR program: An assessment

    NASA Technical Reports Server (NTRS)

    Grimes, Hubert H.; Metzger, Marie E.; Kim, Walter S.

    1993-01-01

    An assessment was made of the NASA Lewis Small Business Innovation Research (SBIR) Program for the years 1983 to 1989. The assessment was based on the study of 99 Phase 1 contracts and 39 Phase 2 contracts. The overall impact of SBIR was found to be very positive, contributing strongly to many NASA programs. In addition, many successful efforts were commercialized benefiting the small business, federal agencies, and the aerospace industry. The program was evaluated in terms of contract quality, innovativeness, comparison to the state-of-the-art, achievement of goals, difficulty, and impact. Program difficulties were also identified, which could suggest possible program improvements. Much of the information gained in this assessment provided a basis for a SBIR data base which will be updated every year. This data base is computerized and will provide an excellent source of information about past SBIR efforts and company capabilities.

  8. Validation of learning assessments: A primer.

    PubMed

    Peeters, Michael J; Martin, Beth A

    2017-09-01

    The Accreditation Council for Pharmacy Education's Standards 2016 has placed greater emphasis on validating educational assessments. In this paper, we describe validity, reliability, and validation principles, drawing attention to the conceptual change that highlights one validity with multiple evidence sources; to this end, we recommend abandoning historical (confusing) terminology associated with the term validity. Further, we describe and apply Kane's framework (scoring, generalization, extrapolation, and implications) for the process of validation, with its inferences and conclusions from varied uses of assessment instruments by different colleges and schools of pharmacy. We then offer five practical recommendations that can improve reporting of validation evidence in pharmacy education literature. We describe application of these recommendations, including examples of validation evidence in the context of pharmacy education. After reading this article, the reader should be able to understand the current concept of validation, and use a framework as they validate and communicate their own institution's learning assessments. Copyright © 2017 Elsevier Inc. All rights reserved.

  9. Production and use of estimates for monitoring progress in the health sector: the case of Bangladesh

    PubMed Central

    Ahsan, Karar Zunaid; Tahsina, Tazeen; Iqbal, Afrin; Ali, Nazia Binte; Chowdhury, Suman Kanti; Huda, Tanvir M.; Arifeen, Shams El

    2017-01-01

    ABSTRACT Background: In order to support the progress towards the post-2015 development agenda for the health sector, the importance of high-quality and timely estimates has become evident both globally and at the country level. Objective and Methods: Based on desk review, key informant interviews and expert panel discussions, the paper critically reviews health estimates from both the local (i.e. nationally generated information by the government and other agencies) and the global sources (which are mostly modeled or interpolated estimates developed by international organizations based on different sources of information), and assesses the country capacity and monitoring strategies to meet the increasing data demand in the coming years. Primarily, this paper provides a situation analysis of Bangladesh in terms of production and use of health estimates for monitoring progress towards the post-2015 development goals for the health sector. Results: The analysis reveals that Bangladesh is data rich, particularly from household surveys and health facility assessments. Practices of data utilization also exist, with wide acceptability of survey results for informing policy, programme review and course corrections. Despite high data availability from multiple sources, the country capacity for providing regular updates of major global health estimates/indicators remains low. Major challenges also include limited human resources, capacity to generate quality data and multiplicity of data sources, where discrepancy and lack of linkages among different data sources (local sources and between local and global estimates) present emerging challenges for interpretation of the resulting estimates. Conclusion: To fulfill the increased data requirement for the post-2015 era, Bangladesh needs to invest more in electronic data capture and routine health information systems. Streamlining of data sources, integration of parallel information systems into a common platform, and capacity building for data generation and analysis are recommended as priority actions for Bangladesh in the coming years. In addition to automation of routine health information systems, establishing an Indicator Reference Group for Bangladesh to analyze data; building country capacity in data quality assessment and triangulation; and feeding into global, inter-agency estimates for better reporting would address a number of mentioned challenges in the short- and long-run. PMID:28532305

  10. Coupling Aggressive Mass Removal with Microbial Reductive Dechlorination for Remediation of DNAPL Source Zones: A Review and Assessment

    PubMed Central

    Christ, John A.; Ramsburg, C. Andrew; Abriola, Linda M.; Pennell, Kurt D.; Löffler, Frank E.

    2005-01-01

    The infiltration of dense non-aqueous-phase liquids (DNAPLs) into the saturated subsurface typically produces a highly contaminated zone that serves as a long-term source of dissolved-phase groundwater contamination. Applications of aggressive physical–chemical technologies to such source zones may remove > 90% of the contaminant mass under favorable conditions. The remaining contaminant mass, however, can create a rebounding of aqueous-phase concentrations within the treated zone. Stimulation of microbial reductive dechlorination within the source zone after aggressive mass removal has recently been proposed as a promising staged-treatment remediation technology for transforming the remaining contaminant mass. This article reviews available laboratory and field evidence that supports the development of a treatment strategy that combines aggressive source-zone removal technologies with subsequent promotion of sustained microbial reductive dechlorination. Physical–chemical source-zone treatment technologies compatible with posttreatment stimulation of microbial activity are identified, and studies examining the requirements and controls (i.e., limits) of reductive dechlorination of chlorinated ethenes are investigated. Illustrative calculations are presented to explore the potential effects of source-zone management alternatives. Results suggest that, for the favorable conditions assumed in these calculations (i.e., statistical homogeneity of aquifer properties, known source-zone DNAPL distribution, and successful bioenhancement in the source zone), source longevity may be reduced by as much as an order of magnitude when physical–chemical source-zone treatment is coupled with reductive dechlorination. PMID:15811838

  11. Quantitative evaluation of intensive remedial action using long-term monitoring and tracer data at a DNAPL contaminated site, Wonju, Korea

    NASA Astrophysics Data System (ADS)

    Lee, S. S.; Kim, H. J.; Kim, M. O.; Lee, K.; Lee, K. K.

    2016-12-01

    This study sought evidence of remediation effects in monitoring data collected before and after intensive in-situ remedial action at a DNAPL-contaminated site in Wonju, Korea, using several quantitative evaluation methods, including mass discharge analysis, tracer data, statistical trend analysis, and analytical solutions. Remediation technologies such as soil vapor extraction, soil flushing, biostimulation, and pump-and-treat were applied to eliminate the trichloroethylene (TCE) contaminant sources and to prevent migration of the TCE plume from the remediation target zones. Prior to the remedial action, TCE concentrations and mass discharges at all transects were affected by seasonal recharge variation and residual DNAPL sources. After remediation, its effect was clearly apparent at the main source zone and the industrial complex. Tracing the time series of plume evolution showed greater variation in TCE concentrations in the plumes near the source zones than in the relatively stable plumes downstream. The amount of residual source mass removed during the intensive remedial action was estimated with an analytical solution in order to evaluate the action's efficiency. This quantitative evaluation indicates that the intensive remedial action performed effectively, with a removal efficiency of 70% for the residual source mass over the remediation period. Analytical solutions that can represent and quantify the impacts of partial mass reduction have proven to be useful tools for estimating unknown contaminant source mass, verifying dissolved concentrations at the DNAPL-contaminated site, and evaluating remediation efficiency from long-term monitoring data. Acknowledgement: This work was supported by the Korea Ministry of Environment under the "GAIA project (173-092-009) and (201400540010)" and the "R&D Project on Environmental Management of Geologic CO2 storage" from the KEITI (Project number: 2014001810003).
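
    The abstract does not specify which analytical solution was used; a common choice for relating partial source-mass removal to downgradient concentrations is a power-function source-depletion model of the Falta type, in which C/C0 = (M/M0)^Γ. The sketch below is a minimal illustration of how such a relation can back out remaining mass and removal efficiency from pre- and post-remediation flux-averaged concentrations; the concentration values and Γ exponents are hypothetical assumptions, not data from the Wonju site.

      def source_mass_fraction(c_ratio, gamma=1.0):
          """
          Infer the remaining source-mass fraction M/M0 from the ratio of
          flux-averaged dissolved concentrations C/C0, assuming a
          power-function source model C/C0 = (M/M0)**gamma (Falta-type;
          gamma is a site-specific depletion exponent).
          """
          return c_ratio ** (1.0 / gamma)

      # Hypothetical numbers for illustration only (not from the Wonju study):
      c_before, c_after = 1200.0, 250.0   # flux-averaged TCE (ug/L), pre/post remediation
      for gamma in (0.5, 1.0, 2.0):
          m_frac = source_mass_fraction(c_after / c_before, gamma)
          print(f"gamma={gamma}: remaining mass ~{m_frac:.0%}, "
                f"removal efficiency ~{1 - m_frac:.0%}")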

  12. Assessment of effectiveness of geologic isolation systems. CIRMIS data system. Volume 4. Driller's logs, stratigraphic cross section and utility routines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Friedrichs, D.R.

    1980-01-01

    The Assessment of Effectiveness of Geologic Isolation Systems (AEGIS) Program is developing and applying the methodology for assessing the far-field, long-term post-closure safety of deep geologic nuclear waste repositories. AEGIS is being performed by Pacific Northwest Laboratory (PNL) under contract with the Office of Nuclear Waste Isolation (ONWI) for the Department of Energy (DOE). One task within AEGIS is the development of methodology for analysis of the consequences (water pathway) from loss of repository containment as defined by various release scenarios. Analysis of the long-term, far-field consequences of release scenarios requires the application of numerical codes which simulate the hydrologic systems, model the transport of released radionuclides through the hydrologic systems to the biosphere, and, where applicable, assess the radiological dose to humans. The various input parameters required in the analysis are compiled in data systems. The data are organized and prepared by various input subroutines for use by the hydrologic and transport codes. The hydrologic models simulate the groundwater flow systems and provide water flow directions, rates, and velocities as inputs to the transport models. Outputs from the transport models are basically graphs of radionuclide concentration in the groundwater plotted against time. After dilution in the receiving surface-water body (e.g., lake, river, bay), these data are the input source terms for the dose models, if dose assessments are required. The dose models calculate radiation dose to individuals and populations. CIRMIS (Comprehensive Information Retrieval and Model Input Sequence) Data System is a storage and retrieval system for model input and output data, including graphical interpretation and display. This is the fourth of four volumes of the description of the CIRMIS Data System.

  13. Leaching of PFC from soils contaminated with PFC of different origin

    NASA Astrophysics Data System (ADS)

    Kalbe, Ute; Piechotta, Christian; Rothe, Robert

    2017-04-01

    Leaching tests are fundamental tools for assessing the impact of contaminated soils on groundwater via the soil-groundwater pathway. Such procedures are intended to serve as the basis for a reliable leachate prognosis. They can be applied to determine the short- and long-term leaching behaviour as well as the source term of contaminated soils. For this purpose, two types of leaching procedures have been validated in Germany for examining the leaching behaviour of frequently occurring organic substances (DIN 19528 - column test and DIN 19529 - batch test). Liquid-to-solid ratios (L/S) of 2 L/kg and 10 L/kg form the basis for the risk assessment implemented in different German regulations. The equivalence of the results of both tests for the same material has been investigated for a variety of pollutants in order to assess their reliability in compliance testing. For emerging pollutants, however, hardly any data are available on this issue. Leaching tests on soils contaminated with emerging pollutants such as PFC (perfluorinated surfactants) are increasingly relevant due to the growing detection of contaminated sites. Therefore, two soils with different contamination sources (paper-sludge-containing compost and fire-extinguishing foam) were investigated in this study using both leaching tests and both liquid-to-solid ratios. The leachability of the various perfluorinated compounds in relation to their content in the solid matter was considered. Furthermore, the eluate pre-treatment prior to analysis (in particular the liquid/solid separation step needed for batch tests) was taken into account. The comparability of the batch and column results depends on the solubility of the various compounds, on the L/S, and on the turbidity of the eluates.
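
    As a minimal illustration of how batch or column results translate into a release estimate, the cumulative mass leached per kilogram of soil at a given L/S is the eluate concentration multiplied by the L/S, which can then be compared with the solid-phase content. The function and numbers below are illustrative assumptions, not measured values from this study.

      def released_fraction(c_eluate_ug_per_l, ls_l_per_kg, solid_content_ug_per_kg):
          """
          Fraction of a compound's solid-phase content that is mobilised in a
          batch or column leaching test, from the eluate concentration at a
          given liquid-to-solid ratio (L/S).
          """
          released_ug_per_kg = c_eluate_ug_per_l * ls_l_per_kg
          return released_ug_per_kg / solid_content_ug_per_kg

      # Hypothetical PFOA example (values are illustrative only):
      print(round(released_fraction(c_eluate_ug_per_l=2.5,          # ug/L in the eluate
                                    ls_l_per_kg=10.0,               # DIN test at L/S = 10
                                    solid_content_ug_per_kg=60.0),  # ug/kg dry soil
                  2))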

  14. Economic impact of medication error: a systematic review.

    PubMed

    Walsh, Elaine K; Hansen, Christina Raae; Sahm, Laura J; Kearney, Patricia M; Doherty, Edel; Bradley, Colin P

    2017-05-01

    Medication error is a significant source of morbidity and mortality among patients. Clinical and cost-effectiveness evidence are required for the implementation of quality of care interventions. Reduction of error-related cost is a key potential benefit of interventions addressing medication error. The aim of this review was to describe and quantify the economic burden associated with medication error. PubMed, Cochrane, Embase, CINAHL, EconLit, ABI/INFORM, Business Source Complete were searched. Studies published 2004-2016 assessing the economic impact of medication error were included. Cost values were expressed in Euro 2015. A narrative synthesis was performed. A total of 4572 articles were identified from database searching, and 16 were included in the review. One study met all applicable quality criteria. Fifteen studies expressed economic impact in monetary terms. Mean cost per error per study ranged from €2.58 to €111 727.08. Healthcare costs were used to measure economic impact in 15 of the included studies with one study measuring litigation costs. Four studies included costs incurred in primary care with the remaining 12 measuring hospital costs. Five studies looked at general medication error in a general population with 11 studies reporting the economic impact of an individual type of medication error or error within a specific patient population. Considerable variability existed between studies in terms of financial cost, patients, settings and errors included. Many were of poor quality. Assessment of economic impact was conducted predominantly in the hospital setting with little assessment of primary care impact. Limited parameters were used to establish economic impact. Copyright © 2017 John Wiley & Sons, Ltd.

  15. Pre/post-closure assessment of groundwater pharmaceutical fate in a wastewater-facility-impacted stream reach.

    PubMed

    Bradley, Paul M; Barber, Larry B; Clark, Jimmy M; Duris, Joseph W; Foreman, William T; Furlong, Edward T; Givens, Carrie E; Hubbard, Laura E; Hutchinson, Kasey J; Journey, Celeste A; Keefe, Steffanie H; Kolpin, Dana W

    2016-10-15

    Pharmaceutical contamination of contiguous groundwater is a substantial concern in wastewater-impacted streams, due to ubiquity in effluent, high aqueous mobility, designed bioactivity, and to effluent-driven hydraulic gradients. Wastewater treatment facility (WWTF) closures are rare environmental remediation events; offering unique insights into contaminant persistence, long-term wastewater impacts, and ecosystem recovery processes. The USGS conducted a combined pre/post-closure groundwater assessment adjacent to an effluent-impacted reach of Fourmile Creek, Ankeny, Iowa, USA. Higher surface-water concentrations, consistent surface-water to groundwater concentration gradients, and sustained groundwater detections tens of meters from the stream bank demonstrated the importance of WWTF effluent as the source of groundwater pharmaceuticals as well as the persistence of these contaminants under effluent-driven, pre-closure conditions. The number of analytes (110 total) detected in surface water decreased from 69 prior to closure down to 8 in the first post-closure sampling event approximately 30 d later, with a corresponding 2 order of magnitude decrease in the cumulative concentration of detected analytes. Post-closure cumulative concentrations of detected analytes were approximately 5 times higher in proximal groundwater than in surface water. About 40% of the 21 contaminants detected in a downstream groundwater transect immediately before WWTF closure exhibited rapid attenuation with estimated half-lives on the order of a few days; however, a comparable number exhibited no consistent attenuation during the year-long post-closure assessment. The results demonstrate the potential for effluent-impacted shallow groundwater systems to accumulate pharmaceutical contaminants and serve as long-term residual sources, further increasing the risk of adverse ecological effects in groundwater and the near-stream ecosystem. Published by Elsevier B.V.
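
    The half-lives quoted above follow from assuming first-order attenuation between successive post-closure sampling events; a minimal sketch of that calculation is given below, with hypothetical concentrations rather than values from the Fourmile Creek transect.

      import math

      def first_order_half_life(c0, c1, dt_days):
          """
          Half-life (days) from two concentrations measured dt_days apart,
          assuming first-order attenuation C(t) = C0 * exp(-k t).
          """
          k = math.log(c0 / c1) / dt_days
          return math.log(2) / k

      # Hypothetical post-closure groundwater concentrations (ng/L), 14 d apart:
      print(round(first_order_half_life(c0=200.0, c1=12.5, dt_days=14.0), 1))  # ~3.5 d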

  16. Pre/post-closure assessment of groundwater pharmaceutical fate in a wastewater‑facility-impacted stream reach

    USGS Publications Warehouse

    Bradley, Paul M.; Barber, Larry B.; Clark, Jimmy M.; Duris, Joseph W.; Foreman, William T.; Furlong, Edward T.; Givens, Carrie E.; Hubbard, Laura E.; Hutchinson, Kasey J.; Journey, Celeste A.; Keefe, Steffanie H.; Kolpin, Dana W.

    2016-01-01

    Pharmaceutical contamination of contiguous groundwater is a substantial concern in wastewater-impacted streams, due to ubiquity in effluent, high aqueous mobility, designed bioactivity, and to effluent-driven hydraulic gradients. Wastewater treatment facility (WWTF) closures are rare environmental remediation events; offering unique insights into contaminant persistence, long-term wastewater impacts, and ecosystem recovery processes. The USGS conducted a combined pre/post-closure groundwater assessment adjacent to an effluent-impacted reach of Fourmile Creek, Ankeny, Iowa, USA. Higher surface-water concentrations, consistent surface-water to groundwater concentration gradients, and sustained groundwater detections tens of meters from the stream bank demonstrated the importance of WWTF effluent as the source of groundwater pharmaceuticals as well as the persistence of these contaminants under effluent-driven, pre-closure conditions. The number of analytes (110 total) detected in surface water decreased from 69 prior to closure down to 8 in the first post-closure sampling event approximately 30 d later, with a corresponding 2 order of magnitude decrease in the cumulative concentration of detected analytes. Post-closure cumulative concentrations of detected analytes were approximately 5 times higher in proximal groundwater than in surface water. About 40% of the 21 contaminants detected in a downstream groundwater transect immediately before WWTF closure exhibited rapid attenuation with estimated half-lives on the order of a few days; however, a comparable number exhibited no consistent attenuation during the year-long post-closure assessment. The results demonstrate the potential for effluent-impacted shallow groundwater systems to accumulate pharmaceutical contaminants and serve as long-term residual sources, further increasing the risk of adverse ecological effects in groundwater and the near-stream ecosystem.

  17. Combining multiple sources of data to inform conservation of Lesser Prairie-Chicken populations

    USGS Publications Warehouse

    Ross, Beth; Haukos, David A.; Hagen, Christian A.; Pitman, James

    2018-01-01

    Conservation of small populations is often based on limited data from spatially and temporally restricted studies, resulting in management actions based on an incomplete assessment of the population drivers. If fluctuations in abundance are related to changes in weather, proper management is especially important, because extreme weather events could disproportionately affect population abundance. Conservation assessments, especially for vulnerable populations, are aided by a knowledge of how extreme events influence population status and trends. Although important for conservation efforts, data may be limited for small or vulnerable populations. Integrated population models maximize information from various sources of data to yield population estimates that fully incorporate uncertainty from multiple data sources while allowing for the explicit incorporation of environmental covariates of interest. Our goal was to assess the relative influence of population drivers for the Lesser Prairie-Chicken (Tympanuchus pallidicinctus) in the core of its range, western and southern Kansas, USA. We used data from roadside lek count surveys, nest monitoring surveys, and survival data from telemetry monitoring combined with climate (Palmer drought severity index) data in an integrated population model. Our results indicate that variability in population growth rate was most influenced by variability in juvenile survival. The Palmer drought severity index had no measurable direct effects on adult survival or mean number of offspring per female; however, there were declines in population growth rate following severe drought. Because declines in population growth rate occurred at a broad spatial scale, declines in response to drought were likely due to decreases in chick and juvenile survival rather than emigration outside of the study area. Overall, our model highlights the importance of accounting for environmental and demographic sources of variability, and provides a thorough method for simultaneously evaluating population demography in response to long-term climate effects.

  18. Coronary CT angiography with single-source and dual-source CT: comparison of image quality and radiation dose between prospective ECG-triggered and retrospective ECG-gated protocols.

    PubMed

    Sabarudin, Akmal; Sun, Zhonghua; Yusof, Ahmad Khairuddin Md

    2013-09-30

    This study was conducted to investigate and compare image quality and radiation dose between prospective ECG-triggered and retrospective ECG-gated coronary CT angiography (CCTA) performed with single-source CT (SSCT) and dual-source CT (DSCT). A total of 209 patients with suspected coronary artery disease who underwent CCTA on SSCT (n=95) and DSCT (n=114) scanners using prospective ECG-triggered and retrospective ECG-gated protocols were recruited from two institutions. Image quality was assessed by two experienced observers, while quantitative assessment was performed by measuring image noise, the signal-to-noise ratio (SNR) and the contrast-to-noise ratio (CNR). Effective dose was calculated using the latest published conversion coefficient factor. A total of 2087 out of 2880 coronary artery segments were assessable, with 98.0% classified as of sufficient and 2.0% as of insufficient image quality for clinical diagnosis. There was no significant difference in overall image quality between the prospective ECG-triggered and retrospective ECG-gated protocols, whether performed with DSCT or SSCT scanners. For the prospective ECG-triggered protocol, radiation dose did not differ significantly between DSCT (6.5 ± 2.9 mSv) and SSCT (6.2 ± 1.0 mSv) scanners (p=0.99). However, the effective dose was significantly lower with DSCT (18.2 ± 8.3 mSv) than with SSCT (28.3 ± 7.0 mSv) in the retrospective ECG-gated protocol. Prospective ECG-triggered CCTA reduces radiation dose significantly compared to retrospective ECG-gated CCTA, while maintaining good image quality. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
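
    The paper does not state which conversion coefficient it applied; effective dose from CT is conventionally estimated as E = k x DLP, where DLP is the dose-length product and k is a region-specific coefficient. The sketch below uses a commonly quoted chest value of 0.014 mSv/(mGy*cm) purely for illustration.

      def effective_dose_msv(dlp_mgy_cm, k_msv_per_mgy_cm=0.014):
          """
          Effective dose E = k * DLP.  The conversion coefficient k depends on
          the anatomical region and the reference publication used; 0.014
          mSv/(mGy*cm) is a commonly quoted chest value, used here only as an
          illustrative assumption.
          """
          return dlp_mgy_cm * k_msv_per_mgy_cm

      print(round(effective_dose_msv(450), 1))   # hypothetical DLP -> ~6.3 mSv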

  19. Improving understanding of near-term barrier island evolution through multi-decadal assessment of morphologic change

    USGS Publications Warehouse

    Lentz, Erika E.; Hapke, Cheryl J.; Stockdon, Hilary F.; Hehre, Rachel E.

    2013-01-01

    Observed morphodynamic changes over multiple decades were coupled with storm-driven run-up characteristics at Fire Island, New York, to explore the influence of wave processes relative to the impacts of other coastal change drivers on the near-term evolution of the barrier island. Historical topography was generated from digital stereo-photogrammetry and compared with more recent lidar surveys to quantify near-term (decadal) morphodynamic changes to the beach and primary dune system between the years 1969, 1999, and 2009. Notably increased profile volumes were observed along the entirety of the island in 1999, and likely provide the eolian source for the steady dune crest progradation observed over the relatively quiescent decade that followed. Persistent patterns of erosion and accretion over 10-, 30-, and 40-year intervals are attributable to variations in island morphology, human activity, and variations in offshore bathymetry and island orientation that influence the wave energy reaching the coast. Areas of documented long-term historical inlet formation and extensive bayside marsh development show substantial landward translation of the dune–beach profile over the near-term period of this study. Correlations among areas predicted to overwash, observed elevation changes of the dune crestline, and observed instances of overwash in undeveloped segments of the barrier island verify that overwash locations can be accurately predicted in undeveloped segments of coast. In fact, an assessment of 2012 aerial imagery collected after Hurricane Sandy confirms that overwash occurred at the majority of near-term locations persistently predicted to overwash. In addition to the storm wave climate, factors related to variations within the geologic framework which in turn influence island orientation, offshore slope, and sediment supply impact island behavior on near-term timescales.

  20. The impact of organizational factors on the urinary incontinence care quality in long-term care hospitals: a longitudinal correlational study.

    PubMed

    Yoon, Ju Young; Lee, Ji Yun; Bowers, Barbara J; Zimmerman, David R

    2012-12-01

    With the rapid increase in the number of long-term care hospitals in Korea, care quality has become an important issue. Urinary incontinence is an important condition affecting many residents' quality of life. Because urinary incontinence status can improve with appropriate interventions, changes in that status can reflect care quality in long-term care facilities once patient-level factors are adjusted for. We aim to examine the impact of organizational factors on urinary incontinence care quality, defined as improvement of urinary incontinence status or maintenance of continent status after admission to Korean long-term care hospitals. This is a longitudinal correlational study. Data came from two sources: monthly patient assessment reports using the Patient Assessment Instrument and the hospital information system from the Health Insurance Review and Assessment Services. The final analysis includes 5271 elderly adults without an indwelling urinary catheter or urostomy who were admitted to 534 Korean long-term care hospitals in April 2008. Multi-level logistic analysis was used to explore the organizational factors that influence urinary incontinence care quality, controlling for patient-level factors. With respect to organizational factors, the findings showed that location and the RN/total nursing staff ratio were statistically significant, controlling for risk factors at the patient level. The odds of urinary incontinence improvement from admission were 1.28 times higher in urban than in rural long-term care hospitals. In addition, when a long-term care hospital increased its RN ratio by one standard deviation (0.19), the odds of urinary incontinence improvement or maintenance of continence from admission increased about 1.8 times. The most significant finding was that, among organizational factors, a higher RN-to-patient ratio and urban location were associated with better resident outcomes for urinary incontinence. For a better understanding of how these organizational factors influence positive care outcomes, and to provide more practical implications, studies should examine concrete care process measures as well as structure and outcome measures based on systematic conceptual models. Copyright © 2012 Elsevier Ltd. All rights reserved.
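
    As a rough consistency check (an illustrative back-calculation, not a figure reported in the study), the ~1.8-fold odds increase per one-standard-deviation (0.19) rise in the RN ratio implies a logistic coefficient of roughly ln(1.8)/0.19 per unit RN ratio:

      import math

      # Back out the per-unit coefficient implied by the reported per-SD odds ratio.
      beta_per_unit_rn_ratio = math.log(1.8) / 0.19
      print(round(beta_per_unit_rn_ratio, 2))                    # ~3.09
      print(round(math.exp(beta_per_unit_rn_ratio * 0.19), 2))   # recovers OR ~1.8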

  1. Memory for performed and observed activities following traumatic brain injury

    PubMed Central

    Wright, Matthew J.; Wong, Andrew L.; Obermeit, Lisa C.; Woo, Ellen; Schmitter-Edgecombe, Maureen; Fuster, Joaquín M.

    2014-01-01

    Traumatic brain injury (TBI) is associated with deficits in memory for the content of completed activities. However, TBI groups have shown variable memory for the temporal order of activities. We sought to clarify the conditions under which temporal order memory for activities is intact following TBI. Additionally, we evaluated activity source memory and the relationship between activity memory and functional outcome in TBI participants. Thus, we completed a study of activity memory with 18 severe TBI survivors and 18 healthy age- and education-matched comparison participants. Both groups performed eight activities and observed eight activities that were fashioned after routine daily tasks. Incidental encoding conditions for activities were utilized. The activities were drawn from two counterbalanced lists, and both performance and observation were randomly determined and interspersed. After all of the activities were completed, content memory (recall and recognition), source memory (conditional source identification), and temporal order memory (correlation between order reconstruction and actual order) for the activities were assessed. Functional ability was assessed via the Community Integration Questionnaire (CIQ). In terms of content memory, TBI participants recalled and recognized fewer activities than comparison participants. Recognition of performed and observed activities was strongly associated with social integration on the CIQ. There were no between- or within-group differences in temporal order or source memory, although source memory performances were near ceiling. The findings were interpreted as suggesting that temporal order memory following TBI is intact under conditions of both purposeful activity completion and incidental encoding, and that activity memory is related to functional outcomes following TBI. PMID:24524393

  2. Novel Methods for Tracking Long-Term Maintenance Immunosuppression Regimens

    PubMed Central

    Buchanan, Paula M.; Schnitzler, Mark A.; Brennan, Daniel C.; Dzebisashvili, Nino; Willoughby, Lisa M.; Axelrod, David; Salvalaggio, Paolo R.; Abbott, Kevin C.; Burroughs, Thomas E.; Lentine, Krista L.

    2008-01-01

    Background and objectives: Accurate assessment of the use of immunosuppressive medications is vital for observational analyses that are widely used in transplantation research. This study assessed the accuracy of three potential sources of maintenance immunosuppression data. Design, setting, participants, & measurements: This study investigated the agreement of immunosuppression information in directly linked electronic medical records for Medicare beneficiaries who received a kidney transplant at one center in 1998 through 2001, Organ Procurement and Transplantation Network (OPTN) survey data, and Medicare pharmacy claims. Pair-wise, interdata concordance (κ) and percentage agreement statistics were used to compare immunosuppression regimens reported at discharge, and at 6 mo and 1 yr after transplantation in each data source. Results: Among 181 eligible participants, agreement between data sources for nonsteroid immunosuppression increased with time after transplantation. By 1-yr, concordance was excellent for calcineurin inhibitors and mycophenolate mofetil (κ = 0.79 to 1.00), and very good for azathioprine (κ = 0.73 to 0.85). Similarly, percentage agreement at 1 yr was 94.9 to 100% for calcineurin inhibitors, 91.1 to 95.7% for mycophenolate mofetil, and 87.5 to 92.8% for azathioprine. Widening the comparison time window resolved 33.6% of cases with discordant indications of calcineurin inhibitor and/or antimetabolite use in claims compared with other data sources. Conclusions: This analysis supports the accuracy of the three sources of data for description of nonsteroid immunosuppression after kidney transplantation. Given the current strategic focus on reducing collection of data, use of alternative measures of immunosuppression exposure is appropriate and will assume greater importance. PMID:18077785
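
    For readers unfamiliar with the concordance statistics used here, the sketch below computes Cohen's kappa and raw percentage agreement from a square agreement table of medication use in two data sources; the counts are hypothetical, not the study's data.

      def kappa_and_agreement(table):
          """
          Cohen's kappa and raw percentage agreement for a square contingency
          table (rows = source A, columns = source B), e.g. use vs non-use of
          a given immunosuppressant in two data sources.
          """
          n = sum(sum(row) for row in table)
          po = sum(table[i][i] for i in range(len(table))) / n
          pe = sum(sum(table[i]) * sum(row[i] for row in table)
                   for i in range(len(table))) / n**2
          return (po - pe) / (1 - pe), po

      # Hypothetical 2x2 agreement table (counts), purely for illustration:
      #                 claims: yes   claims: no
      table = [[160,            8],        # OPTN: yes
               [  6,           26]]        # OPTN: no
      kappa, agreement = kappa_and_agreement(table)
      print(f"kappa = {kappa:.2f}, agreement = {agreement:.1%}")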

  3. Ammonia in London: is it increasing and what is the relevance of urban ammonia for air quality impacts?

    NASA Astrophysics Data System (ADS)

    Braban, Christine; Tang, Sim; Poskitt, Janet; Van Dijk, Netty; Leeson, Sarah; Dragosits, Ulli; Hutchings, Torben; Twigg, Marsailidh; Di Marco, Chiara; Langford, Ben; Tremper, Anja; Nemitz, Eiko; Sutton, Mark

    2017-04-01

    Emissions of ammonia affect both rural and urban air quality primarily via reaction of ammonia in the atmosphere forming secondary ammonium salts in particulate matter (PM). Urban ammonia emissions come from a variety of sources including biological decomposition, human waste, industrial processes and combustion engines. In the UK, the only long-term urban ammonia measurement is a UK National Ammonia Monitoring Network site at London Cromwell Road, recording monthly average concentrations. Short term measurements have also been made in the past decade at Marylebone Road, North Kensington and on the BT Tower. Cromwell Road is a kerbside site operational since 1999. The Cromwell Road data indicates that ammonia concentrations may be increasing since 2010-2012 after a long period of decreasing. Data from the National Atmospheric Emissions Inventory indicates ammonia emissions from diesel fleet exhausts increasing over this time period but an overall net decrease in ammonia emissions. With changes in engine and exhaust technology to minimise pollutant emissions and the importance of ammonia as a precursor gas for secondary PM, there is a challenge to understand urban ammonia concentrations and subsequent impacts on urban air quality. In this paper the long term measurements are assessed in conjunction with the short-term measurements.The challenges to assess the relative importance of local versus long range ammonia emission are discussed.

  4. Time-frequency approach to underdetermined blind source separation.

    PubMed

    Xie, Shengli; Yang, Liu; Yang, Jun-Mei; Zhou, Guoxu; Xiang, Yong

    2012-02-01

    This paper presents a new time-frequency (TF) underdetermined blind source separation approach based on the Wigner-Ville distribution (WVD) and the Khatri-Rao product to separate N non-stationary sources from M (M < N) mixtures. First, an improved method is proposed for estimating the mixing matrix, in which the negative values of the auto WVD of the sources are fully considered. Then, after extracting all the auto-term TF points, the auto WVD value of the sources at every auto-term TF point can be determined exactly with the proposed approach, no matter how many active sources there are, as long as N ≤ 2M-1. The extraction of auto-term TF points is discussed further, and numerical simulation results are presented that show the superiority of the proposed algorithm over existing ones.
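
    A minimal sketch of the underlying time-frequency representation (not of the full underdetermined separation algorithm, whose auto-term selection and Khatri-Rao recovery steps are beyond a short example) is a discrete Wigner-Ville distribution, computed as the FFT over lag of the instantaneous autocorrelation of the analytic signal:

      import numpy as np
      from scipy.signal import hilbert

      def wigner_ville(x):
          """
          Discrete Wigner-Ville distribution of a real signal (via its analytic
          form).  Returns an (N x N) matrix; row n is the spectrum of the
          instantaneous autocorrelation z[n+m] * conj(z[n-m]) over lag m.
          """
          z = hilbert(np.asarray(x, dtype=float))
          N = len(z)
          W = np.zeros((N, N))
          for n in range(N):
              mmax = min(n, N - 1 - n)              # largest symmetric lag at time n
              m = np.arange(-mmax, mmax + 1)
              r = np.zeros(N, dtype=complex)
              r[m % N] = z[n + m] * np.conj(z[n - m])
              W[n, :] = np.fft.fft(r).real
          return W

      # Toy example: a chirp's energy concentrates along its instantaneous frequency.
      t = np.arange(256) / 256.0
      sig = np.cos(2 * np.pi * (20 * t + 40 * t**2))
      tfr = wigner_ville(sig)
      print(tfr.shape)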

  5. 26 CFR 31.3401(a)(14)-1 - Group-term life insurance.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 26 Internal Revenue 15 2010-04-01 2010-04-01 false Group-term life insurance. 31.3401(a)(14)-1... SOURCE Collection of Income Tax at Source § 31.3401(a)(14)-1 Group-term life insurance. (a) The cost of group-term life insurance on the life of an employee is excepted from wages, and hence is not subject to...

  6. 26 CFR 31.3401(a)(14)-1 - Group-term life insurance.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 26 Internal Revenue 15 2011-04-01 2011-04-01 false Group-term life insurance. 31.3401(a)(14)-1... SOURCE Collection of Income Tax at Source § 31.3401(a)(14)-1 Group-term life insurance. (a) The cost of group-term life insurance on the life of an employee is excepted from wages, and hence is not subject to...

  7. 26 CFR 31.3401(a)(14)-1 - Group-term life insurance.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 26 Internal Revenue 15 2012-04-01 2012-04-01 false Group-term life insurance. 31.3401(a)(14)-1... SOURCE Collection of Income Tax at Source § 31.3401(a)(14)-1 Group-term life insurance. (a) The cost of group-term life insurance on the life of an employee is excepted from wages, and hence is not subject to...

  8. 26 CFR 31.3401(a)(14)-1 - Group-term life insurance.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 26 Internal Revenue 15 2014-04-01 2014-04-01 false Group-term life insurance. 31.3401(a)(14)-1... SOURCE Collection of Income Tax at Source § 31.3401(a)(14)-1 Group-term life insurance. (a) The cost of group-term life insurance on the life of an employee is excepted from wages, and hence is not subject to...

  9. 26 CFR 31.3401(a)(14)-1 - Group-term life insurance.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 26 Internal Revenue 15 2013-04-01 2013-04-01 false Group-term life insurance. 31.3401(a)(14)-1... SOURCE Collection of Income Tax at Source § 31.3401(a)(14)-1 Group-term life insurance. (a) The cost of group-term life insurance on the life of an employee is excepted from wages, and hence is not subject to...

  10. Comparison of the Effects of Environmental Parameters on the Growth Variability of Vibrio parahaemolyticus Coupled with Strain Sources and Genotypes Analyses.

    PubMed

    Liu, Bingxuan; Liu, Haiquan; Pan, Yingjie; Xie, Jing; Zhao, Yong

    2016-01-01

    Microbial growth variability plays an important role in food safety risk assessment. In this study, the growth kinetic characteristics corresponding to maximum specific growth rate (μmax) of 50 V. parahaemolyticus isolates from different sources and genotypes were evaluated at different temperatures (10, 20, 30, and 37°C) and salinity (0.5, 3, 5, 7, and 9%) using the automated turbidimetric system Bioscreen C. The results demonstrated that strain growth variability increased as the growth conditions became more stressful both in terms of temperature and salinity. The coefficient of variation (CV) of μmax for temperature was larger than that for salinity, indicating that the impact of temperature on strain growth variability was greater than that of salinity. The strains isolated from freshwater aquatic products had more conspicuous growth variations than those from seawater. Moreover, the strains with tlh+/tdh+/trh− exhibited higher growth variability than tlh+/tdh−/trh− or tlh+/tdh−/trh+, revealing that gene heterogeneity might be related to the growth variability. This research illustrates that the growth environments, strain sources as well as genotypes have impacts on strain growth variability of V. parahaemolyticus, which can be helpful for incorporating strain variability in predictive microbiology and microbial risk assessment.

  11. Regional Scale Simulations of Nitrate Leaching through Agricultural Soils of California

    NASA Astrophysics Data System (ADS)

    Diamantopoulos, E.; Walkinshaw, M.; O'Geen, A. T.; Harter, T.

    2016-12-01

    Nitrate is recognized as one of California's most widespread groundwater contaminants. As opposed to point sources, which are relatively easily identifiable sources of contamination, non-point sources of nitrate are diffuse and linked with the widespread use of fertilizers in agricultural soils. California's agricultural regions have an incredible diversity of soils that encompass a huge range of properties. This complicates studies dealing with nitrate risk assessment, since important biological and physicochemical processes occur in the first meters of the vadose zone. The objective of this study is to evaluate all agricultural soils in California according to their potential for nitrate leaching, based on numerical simulations using the Richards equation. We conducted simulations for 6000 unique soil profiles (over 22000 soil horizons) taking into account the effect of climate, crop type, irrigation and fertilization management scenarios. The final goal of this study is to evaluate simple management methods in terms of reduced nitrate leaching. We estimated drainage rates of water under the root zone and nitrate concentrations in the drain water at the regional scale. We present maps for all agricultural soils in California which can be used for risk assessment studies. Finally, our results indicate that adoption of simple irrigation and fertilization methods may significantly reduce nitrate leaching in vulnerable regions.
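
    For reference, the 1-D mixed-form Richards equation with a root-water-uptake sink, which is the form typically solved in vadose-zone simulations of this kind (the abstract does not give the exact formulation used), reads

      \frac{\partial \theta(h)}{\partial t}
        = \frac{\partial}{\partial z}\left[ K(h)\left( \frac{\partial h}{\partial z} + 1 \right) \right] - S(z,t),

    where θ is the volumetric water content, h the pressure head, K(h) the unsaturated hydraulic conductivity, z the vertical coordinate (positive upward), and S a sink term for root water uptake; nitrate transport is then computed on the simulated water fluxes, typically with an advection-dispersion equation.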

  12. Comparison of the Effects of Environmental Parameters on the Growth Variability of Vibrio parahaemolyticus Coupled with Strain Sources and Genotypes Analyses

    PubMed Central

    Liu, Bingxuan; Liu, Haiquan; Pan, Yingjie; Xie, Jing; Zhao, Yong

    2016-01-01

    Microbial growth variability plays an important role in food safety risk assessment. In this study, the growth kinetic characteristics corresponding to maximum specific growth rate (μmax) of 50 V. parahaemolyticus isolates from different sources and genotypes were evaluated at different temperatures (10, 20, 30, and 37°C) and salinity (0.5, 3, 5, 7, and 9%) using the automated turbidimetric system Bioscreen C. The results demonstrated that strain growth variability increased as the growth conditions became more stressful both in terms of temperature and salinity. The coefficient of variation (CV) of μmax for temperature was larger than that for salinity, indicating that the impact of temperature on strain growth variability was greater than that of salinity. The strains isolated from freshwater aquatic products had more conspicuous growth variations than those from seawater. Moreover, the strains with tlh+/tdh+/trh− exhibited higher growth variability than tlh+/tdh−/trh− or tlh+/tdh−/trh+, revealing that gene heterogeneity might have possible relations with the growth variability. This research illustrates that the growth environments, strain sources as well as genotypes have impacts on strain growth variability of V. parahaemolyticus, which can be helpful for incorporating strain variability in predictive microbiology and microbial risk assessment. PMID:27446034

  13. Geochemistry of dissolved trace elements and heavy metals in the Dan River Drainage (China): distribution, sources, and water quality assessment.

    PubMed

    Meng, Qingpeng; Zhang, Jing; Zhang, Zhaoyu; Wu, Tairan

    2016-04-01

    Dissolved trace elements and heavy metals in the Dan River drainage basin, which is the drinking water source area of the South-to-North Water Transfer Project (China), affect large numbers of people and should therefore be carefully monitored. To investigate the distribution, sources, and quality of the river water, this study integrated catchment geology and multivariate statistical techniques applied to 99 river water samples collected across the Dan River drainage in 2013. The distribution of trace metal concentrations in the Dan River drainage was similar to that in the Danjiangkou Reservoir, indicating that the reservoir was significantly affected by the Dan River drainage. Moreover, our results suggested that As, Sb, Cd, Mn, and Ni were the major pollutants. We revealed extremely high concentrations of As and Sb in the Laoguan River, Cd in the Qingyou River, Mn, Ni, and Cd in the Yinhua River, As and Sb in the Laojun River, and Sb in the Dan River. According to the water quality index, water in the Dan River drainage was suitable for drinking; however, an exposure risk assessment model suggests that As and Sb in the Laojun and Laoguan rivers could pose a high risk to humans in terms of adverse health outcomes and potential non-carcinogenic effects.
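
    The exposure model referenced above is typically of the chronic-daily-intake/hazard-quotient form used for non-carcinogenic risk from drinking-water ingestion; the sketch below shows that form with generic default exposure parameters (assumptions for illustration, not the values used in the Dan River study).

      def hazard_quotient(c_mg_per_l, rfd_mg_per_kg_day,
                          ir_l_per_day=2.0, ef_days_per_yr=365, ed_years=30,
                          bw_kg=70.0, at_days=30 * 365):
          """
          Hazard quotient for drinking-water ingestion:
              CDI = (C * IR * EF * ED) / (BW * AT)   [mg/(kg*day)]
              HQ  = CDI / RfD
          HQ > 1 indicates potential non-carcinogenic risk.  Exposure
          parameters here are generic defaults, not study-specific values.
          """
          cdi = (c_mg_per_l * ir_l_per_day * ef_days_per_yr * ed_years) / (bw_kg * at_days)
          return cdi / rfd_mg_per_kg_day

      # Illustrative: As at 0.02 mg/L against the oral RfD of 3e-4 mg/(kg*day)
      print(round(hazard_quotient(0.02, 3e-4), 2))   # ~1.9 -> potential concern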

  14. A new modeling approach for assessing the contribution of industrial and traffic emissions to ambient NOx concentrations

    NASA Astrophysics Data System (ADS)

    Chen, Shimon; Yuval; Broday, David M.

    2018-01-01

    The Optimized Dispersion Model (ODM) is uniquely capable of incorporating emission estimates, ambient air quality monitoring data and meteorology to provide reliable high-resolution (in both time and space) air quality estimates using non-linear regression. Until now, however, it has not been capable of describing the effects of emissions from elevated sources. We formulated an additional term to extend the ODM such that these sources can be accounted for, and implemented it in modeling the fine spatiotemporal patterns of ambient NOx concentrations over the coastal plain of Israel. The diurnal and seasonal variation in the contribution of industry to ambient NOx is presented, as well as its spatial features. Although industrial stacks are responsible for 88% of the NOx emissions in the study area, their contribution to ambient NOx levels is generally about 2%, with a maximal upper bound of 27%. Meteorology has a major role in this source allocation, with the highest impact of industry in the summer months, when the wind blows inland past the coastal stacks and vertical mixing is substantial. The new ODM outperforms both Inverse-Distance-Weighting (IDW) interpolation and a previous ODM version in predicting ambient NOx concentrations. The performance of the new model is thoroughly assessed.
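
    The paper's exact elevated-source term is not reproduced in the abstract; such terms are usually built on the reflected Gaussian plume solution for an elevated point source, which for an effective release height H has the form

      C(x,y,z) = \frac{Q}{2\pi u\,\sigma_y(x)\,\sigma_z(x)}
                 \exp\!\left(-\frac{y^2}{2\sigma_y^2}\right)
                 \left[ \exp\!\left(-\frac{(z-H)^2}{2\sigma_z^2}\right)
                      + \exp\!\left(-\frac{(z+H)^2}{2\sigma_z^2}\right) \right],

    where Q is the emission rate, u the wind speed, and σ_y, σ_z the downwind-distance-dependent dispersion parameters; in an ODM-style framework such a term would be fitted alongside the surface-source terms by non-linear regression against the monitoring data.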

  15. Use of new scientific developments in regulatory risk assessments: challenges and opportunities.

    PubMed

    Tarazona, Jose V

    2013-07-01

    Since the 1990s, science-based ecological risk assessments have constituted an essential tool for supporting decision making in the regulatory context. Using the European REACH Regulation as an example, this article presents the challenges and opportunities for new scientific developments within the area of chemical control and environmental protection. These challenges can be grouped into three main related topics (sets). In the short term, the challenges are directly associated with the regulatory requirements and with facilitating a scientifically sound implementation of the different obligations of industry and authorities. It is important to mention that although the actual tools differ because of the regulatory requirements, the basic needs are still the same as those addressed in the early 1990s: understanding the ecological relevance of the predicted effects, including the uncertainty, and facilitating the link with the socio-economic assessment. The second set of challenges covers the opportunities for deriving added value from the regulatory efforts. The information compiled through REACH registration and notification processes is analyzed as a source for new integrative developments for assessing the combined chemical risk at the regional level. Finally, the article discusses the challenge of inverting the process and developing risk assessment methods focusing on the receptor, the individual or ecosystem, instead of on the stressor or source. These approaches were limited in the past due to the lack of information, but the identification and dissemination of standard information, including uses, manufacturing sites, physical-chemical, environmental, ecotoxicological, and toxicological properties as well as operational conditions and risk management measures for thousands of chemicals, combined with the knowledge gathered through large-scale monitoring programs and spatial information systems, is generating new opportunities. The challenge is linking predictions and measured data in an integral "-omic type" approach, considering collectively data from different sources and offering a complete assessment of the chemical risk of individuals and ecosystems, with new conceptual approaches that could be defined as "risk-omics based" paradigms and models. Copyright © 2013 SETAC.

  16. Remotely measuring populations during a crisis by overlaying two data sources.

    PubMed

    Bharti, Nita; Lu, Xin; Bengtsson, Linus; Wetter, Erik; Tatem, Andrew J

    2015-03-01

    Societal instability and crises can cause rapid, large-scale movements. These movements are poorly understood and difficult to measure but strongly impact health. Data on these movements are important for planning response efforts. We retrospectively analyzed movement patterns surrounding a 2010 humanitarian crisis caused by internal political conflict in Côte d'Ivoire using two different methods. We used two remote measures, nighttime lights satellite imagery and anonymized mobile phone call detail records, to assess average population sizes as well as dynamic population changes. These data sources detect movements across different spatial and temporal scales. The two data sources showed strong agreement in average measures of population sizes. Because the spatiotemporal resolution of the data sources differed, we were able to obtain measurements on long- and short-term dynamic elements of populations at different points throughout the crisis. Using complementary, remote data sources to measure movement shows promise for future use in humanitarian crises. We conclude with challenges of remotely measuring movement and provide suggestions for future research and methodological developments. © The Author 2015. Published by Oxford University Press on behalf of Royal Society of Tropical Medicine and Hygiene.

  17. Metabolite profiling of Dioscorea (yam) species reveals underutilised biodiversity and renewable sources for high-value compounds

    PubMed Central

    Price, Elliott J.; Wilkin, Paul; Sarasan, Viswambharan; Fraser, Paul D.

    2016-01-01

    Yams (Dioscorea spp.) are a multispecies crop with production in over 50 countries generating ~50 MT of edible tubers annually. The long-term storage potential of these tubers is vital for food security in developing countries. Furthermore, many species are important sources of pharmaceutical precursors. Despite these attributes as staple food crops and sources of high-value chemicals, Dioscorea spp. remain largely neglected in comparison to other staple tuber crops of tropical agricultural systems such as cassava (Manihot esculenta) and sweet potato (Ipomoea batatas). To date, studies have focussed on the tubers or rhizomes of Dioscorea, neglecting the foliage as waste. In the present study metabolite profiling procedures, using GC-MS approaches, have been established to assess biochemical diversity across species. The robustness of the procedures was shown using material from the phylogenetic clades. The resultant data allowed separation of the genotypes into clades, species and morphological traits with a putative geographical origin. Additionally, we show the potential of foliage material as a renewable source of high-value compounds. PMID:27385275

  18. Marine litter on the beaches of the Adriatic and Ionian Seas: An assessment of their abundance, composition and sources.

    PubMed

    Vlachogianni, Thomais; Fortibuoni, Tomaso; Ronchi, Francesca; Zeri, Christina; Mazziotti, Cristina; Tutman, Pero; Varezić, Dubravka Bojanić; Palatinus, Andreja; Trdan, Štefan; Peterlin, Monika; Mandić, Milica; Markovic, Olivera; Prvan, Mosor; Kaberi, Helen; Prevenios, Michael; Kolitari, Jerina; Kroqi, Gulielm; Fusco, Marina; Kalampokis, Evangelos; Scoullos, Michael

    2018-06-01

    The abundance, composition and sources of marine litter were determined on beaches located in the seven countries of the Adriatic-Ionian macroregion, namely Albania, Bosnia and Herzegovina, Croatia, Greece, Italy, Montenegro and Slovenia. A total of 70,581 marine litter items were classified and recorded through one-year long surveys carried out in 31 sites. The average litter density of 0.67 items/m² found within this study is considered to be relatively high. The beaches investigated differed in terms of human-induced pressures; the majority are classified as either semi-urban or semi-rural, while very few beaches could be characterized as urban or remote/natural. The majority of litter items were made of artificial/anthropogenic polymer materials, accounting for 91.1% of all litter. Litter from shoreline sources accounted for 33.4% of all litter collected. The amount of litter from sea-based sources ranged in the different countries from 1.54% to 14.84%, with an average of 6.30% at the regional level. Copyright © 2018 Elsevier Ltd. All rights reserved.

  19. Time-domain diffuse optics: towards next generation devices

    NASA Astrophysics Data System (ADS)

    Contini, Davide; Dalla Mora, Alberto; Arridge, Simon; Martelli, Fabrizio; Tosi, Alberto; Boso, Gianluca; Farina, Andrea; Durduran, Turgut; Martinenghi, Edoardo; Torricelli, Alessandro; Pifferi, Antonio

    2015-07-01

    Diffuse optics is a powerful tool for clinical applications ranging from oncology to neurology, as well as for molecular imaging and quality assessment of food, wood and pharmaceuticals. We show that, ideally, time-domain diffuse optics can give higher contrast and greater penetration depth than standard technology. In order to fully exploit the advantages of a time-domain system, a distribution of sources and detectors with fast gating capabilities covering the whole sample surface is needed. Here, we present the building block needed to construct such a system. This basic component consists of a miniaturised source-detector pair embedded in the probe, based on pulsed Vertical-Cavity Surface-Emitting Lasers (VCSEL) as sources and Single-Photon Avalanche Diodes (SPAD) or Silicon Photomultipliers (SiPM) as detectors. The possibility of miniaturising and dramatically increasing the number of source-detector pairs opens the way to an advancement of diffuse optics in terms of improved performance and the exploration of new applications. Furthermore, the availability of compact devices with reduced size and cost can boost the application of this technique.

  20. Energy issues in microwave food processing: A review of developments and the enabling potentials of solid-state power delivery.

    PubMed

    Atuonwu, J C; Tassou, S A

    2018-01-23

    The enormous magnitude and variety of microwave applications in household, commercial and industrial food processing create a strong motivation for improving the energy efficiency, and hence the sustainability, of the process. This review critically assesses key energy issues associated with microwave food processing, focusing on previous energy performance studies, energy performance metrics, standards and regulations. Factors affecting energy efficiency are categorised into source, load and source-load matching factors. This highlights the need for highly flexible and controllable power sources capable of receiving real-time feedback on load properties and effecting rapid control actions to minimise reflections, heating non-uniformities and other imperfections that lead to energy losses. A case is made for the use of solid-state amplifiers as alternatives to the conventional power source, the magnetron. A full-scale techno-economic analysis, including energy aspects, shows that the use of solid-state amplifiers as replacements for magnetrons is promising, not only from an energy and overall technical perspective, but also in terms of economics.

  1. An analysis of the readability characteristics of oral health information literature available to the public in Tasmania, Australia.

    PubMed

    Barnett, Tony; Hoang, Ha; Furlan, Ashlea

    2016-03-17

    The effectiveness of print-based health promotion materials is dependent on their readability. This study aimed to assess the characteristics of print-based oral health information literature publicly available in Tasmania, Australia. Oral health education brochures were collected from 11 dental clinics across Tasmania and assessed for structure and format, content and readability. Reading level was calculated using three widely-used measures: Flesch-Kincaid Grade Level (FKGL), Flesch Reading Ease, and Simple Measure of Gobbledygook (SMOG) reading grade level. The FKGL of the 67 brochures sampled ranged from grade 3 to 13. The grade level for government health department brochures (n = 14) ranged from grade 4 to 11 (5.6 ± 1.8). Reading levels for materials produced by commercial sources (n = 22) ranged from 3 to 13 (8.3 ± 2.1), those from professional associations (n = 22) ranged from grade 7 to 11 (8.9 ± 0.9) and brochures produced by other sources (n = 9) ranged from 5 to 10 (7.6 ± 1.5). The SMOG test was positively correlated with the FKGL (rs = 0.92, p < 0.001) though consistently rated materials 2-3 grades higher. The reading level required to comprehend brochures published by government sources was, on average, lower than that for materials from commercial, professional and other sources. Government materials were also more likely to contain fewer words and professional jargon terms than brochures from the other sources. A range of oral health information brochures were publicly available for patients in both public and private dental clinics. However, their readability characteristics differed. Many brochures required a reading skill level higher than that suited to a large proportion of the Tasmanian population. Readability and other characteristics of oral health education materials should be assessed to ensure their suitability for use with patients, especially those suspected of having low literacy skills.
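
    For reference, the two grade-level formulas used here can be computed directly from word, sentence, syllable and polysyllable counts; the counts in the example below are hypothetical, not taken from the Tasmanian sample.

      import math

      def fkgl(words, sentences, syllables):
          """Flesch-Kincaid Grade Level."""
          return 0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59

      def smog(polysyllables, sentences):
          """SMOG grade (count of 3+ syllable words, normalised to 30 sentences)."""
          return 1.0430 * math.sqrt(polysyllables * (30.0 / sentences)) + 3.1291

      # Hypothetical counts for a short brochure:
      print(round(fkgl(words=420, sentences=38, syllables=610), 1))   # ~5.9
      print(round(smog(polysyllables=25, sentences=38), 1))           # ~7.8 (about 2 grades higher)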

  2. Governing Laws of Complex System Predictability under Co-evolving Uncertainty Sources: Theory and Nonlinear Geophysical Applications

    NASA Astrophysics Data System (ADS)

    Perdigão, R. A. P.

    2017-12-01

    Predictability assessments are traditionally made on a case-by-case basis, often by running the particular model of interest with randomly perturbed initial/boundary conditions and parameters, producing computationally expensive ensembles. These approaches provide a lumped statistical view of uncertainty evolution, without eliciting the fundamental processes and interactions at play in the uncertainty dynamics. In order to address these limitations, we introduce a systematic dynamical framework for predictability assessment and forecast, by analytically deriving governing equations of predictability in terms of the fundamental architecture of dynamical systems, independent of any particular problem under consideration. The framework further relates multiple uncertainty sources along with their coevolutionary interplay, enabling a comprehensive and explicit treatment of uncertainty dynamics along time, without requiring the actual model to be run. In doing so, computational resources are freed and a quick and effective a-priori systematic dynamic evaluation is made of predictability evolution and its challenges, including aspects in the model architecture and intervening variables that may require optimization ahead of initiating any model runs. It further brings out universal dynamic features in the error dynamics elusive to any case specific treatment, ultimately shedding fundamental light on the challenging issue of predictability. The formulated approach, framed with broad mathematical physics generality in mind, is then implemented in dynamic models of nonlinear geophysical systems with various degrees of complexity, in order to evaluate their limitations and provide informed assistance on how to optimize their design and improve their predictability in fundamental dynamical terms.

  3. A Source-Term Based Boundary Layer Bleed/Effusion Model for Passive Shock Control

    NASA Technical Reports Server (NTRS)

    Baurle, Robert A.; Norris, Andrew T.

    2011-01-01

    A modeling framework for boundary layer effusion has been developed based on the use of source (or sink) terms instead of the usual practice of specifying bleed directly as a boundary condition. This framework allows the surface boundary condition (i.e. isothermal wall, adiabatic wall, slip wall, etc.) to remain unaltered in the presence of bleed. This approach also lends itself to easily permit the addition of empirical models for second order effects that are not easily accounted for by simply defining effective transpiration values. Two effusion models formulated for supersonic flows have been implemented into this framework; the Doerffer/Bohning law and the Slater formulation. These models were applied to unit problems that contain key aspects of the flow physics applicable to bleed systems designed for hypersonic air-breathing propulsion systems. The ability of each model to predict bulk bleed properties was assessed, as well as the response of the boundary layer as it passes through and downstream of a porous bleed system. The model assessment was performed with and without the presence of shock waves. Three-dimensional CFD simulations that included the geometric details of the porous plate bleed systems were also carried out to supplement the experimental data, and provide additional insights into the bleed flow physics. Overall, both bleed formulations fared well for the tests performed in this study. However, the sample of test problems considered in this effort was not large enough to permit a comprehensive validation of the models.
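
    As a generic illustration of the source-term idea described above (not the Doerffer/Bohning or Slater correlations themselves, whose functional forms are not reproduced here), a prescribed total bleed mass flow can be converted into continuity-equation sink terms distributed over the wall-adjacent cells of the porous region:

      def bleed_sink_terms(mdot_bleed_total, cell_areas, cell_volumes):
          """
          Distribute a prescribed total bleed mass-flow rate (kg/s) over the
          wall-adjacent cells of a porous-plate region as continuity-equation
          sink terms S_mass = -(mdot per cell) / V_cell  [kg/(m^3 s)],
          weighted by each cell's wetted wall area.
          """
          total_area = sum(cell_areas)
          return [-(mdot_bleed_total * a / total_area) / v
                  for a, v in zip(cell_areas, cell_volumes)]

      # Toy 3-cell porous patch (areas in m^2, volumes in m^3); values are illustrative.
      print(bleed_sink_terms(0.12, [1e-4, 1e-4, 2e-4], [5e-7, 5e-7, 1e-6]))

    In a full implementation, matching momentum and energy sinks would accompany the mass sink, and the bleed rate itself would come from the chosen porous-plate correlation rather than being prescribed.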

  4. 12 CFR 201.4 - Availability and terms of credit.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... overnight, as a backup source of funding to a depository institution that is in generally sound financial... to a few weeks as a backup source of funding to a depository institution if, in the judgment of the... very short-term basis, usually overnight, as a backup source of funding to a depository institution...

  5. Novel techniques for characterization of hydrocarbon emission sources in the Barnett Shale

    NASA Astrophysics Data System (ADS)

    Nathan, Brian Joseph

    Changes in ambient atmospheric hydrocarbon concentrations can have both short-term and long-term effects on the atmosphere and on human health. Thus, accurate characterization of emission sources is critically important. The recent boom in shale gas production has led to an increase in hydrocarbon emissions from associated processes, though the exact extent is uncertain. As an original quantification technique, a model airplane equipped with a specially designed, open-path methane sensor was flown multiple times over a natural gas compressor station in the Barnett Shale in October 2013. A linear optimization was introduced to a standard Gaussian plume model in an effort to determine the most probable emission rate coming from the station. This is shown to be a suitable approach given an ideal source with a single, central plume. Separately, an analysis was performed to characterize the nonmethane hydrocarbons in the Barnett during the same period. Starting with ambient hourly concentration measurements of forty-six hydrocarbon species, Lagrangian air parcel trajectories were implemented in a meteorological model to extend the resolution of these measurements and achieve domain-fillings of the region for the period of interest. A self-organizing map (a type of unsupervised classification) was then utilized to reduce the dimensionality of the total multivariate set of grids into characteristic one-dimensional signatures. By also introducing a self-organizing map classification of the contemporary wind measurements, the spatial hydrocarbon characterizations are analyzed for periods with similar wind conditions. The accuracy of the classification is verified through assessment of observed spatial mixing ratio enhancements of key species, through site comparisons with a related long-term study, and through a random forest analysis (an ensemble learning method of supervised classification) to determine the most important species for defining key classes. The hydrocarbon classification is shown to have performed very well in identifying expected signatures near and downwind of oil and gas facilities with active permits, which showcases this method's usefulness for future regional hydrocarbon source-apportionment analyses.
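    A hedged sketch of the linear-optimization step described above: because the Gaussian plume concentration is linear in the emission rate Q, the most probable Q follows from a least-squares fit of the unit-emission plume to the airborne methane enhancements. The dispersion-coefficient curves, wind speed, and source height below are illustrative assumptions, not values from the study.

```python
# Hedged sketch: least-squares estimate of the emission rate from a Gaussian
# plume model; plume parameters are assumed (Briggs-type rural curves).
import numpy as np

def gaussian_plume_unit(x, y, z, u=5.0, H=3.0):
    """Concentration per unit emission rate at downwind x, crosswind y, height z (m)."""
    sigma_y = 0.08 * x * (1 + 0.0001 * x) ** -0.5   # assumed lateral spread curve
    sigma_z = 0.06 * x * (1 + 0.0015 * x) ** -0.5   # assumed vertical spread curve
    lateral = np.exp(-y**2 / (2 * sigma_y**2))
    vertical = np.exp(-(z - H)**2 / (2 * sigma_z**2)) + np.exp(-(z + H)**2 / (2 * sigma_z**2))
    return lateral * vertical / (2 * np.pi * u * sigma_y * sigma_z)

def estimate_emission_rate(xyz_samples, measured_minus_background):
    """Least-squares Q (kg/s) from flight-path methane enhancements (kg/m^3)."""
    a = np.array([gaussian_plume_unit(x, y, z) for x, y, z in xyz_samples])
    q, *_ = np.linalg.lstsq(a[:, None], np.asarray(measured_minus_background), rcond=None)
    return float(q[0])
```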

  6. Spatial and temporal dynamics of nitrate fluxes in a mesoscale catchment

    NASA Astrophysics Data System (ADS)

    Muller, C.; Musolff, A.; Strachauer, U.; Brauns, M.; Tarasova, L.; Merz, R.; Knoeller, K.

    2017-12-01

    Spatially and temporally variable and often superimposing processes like mobilization and turnover of N-species strongly affect nitrate fluxes at catchment outlets. It thus remains challenging to determine the dominant nitrate sources needed to derive an effective river management. Here, we combine data sets from two spatially highly resolved key-date monitoring campaigns of nitrate fluxes along a mesoscale catchment in Germany with four years of monitoring data from two representative sites within the catchment. The study area is characterized by a strong land use gradient from pristine headwaters to lowland sub-catchments with intense agricultural land use and wastewater sources. Flow conditions were assessed by a hydrograph separation showing the clear dominance of base flow during both investigations. However, the absolute amounts of discharge differed significantly from each other (outlet: 1.42 m³ s⁻¹ versus 0.43 m³ s⁻¹). Nitrate concentration and flux in the headwater were found to be low. In contrast, nitrate loads further downstream originate from anthropogenic sources such as effluents from wastewater treatment plants (WWTP) and agricultural land use. The agricultural contribution did not vary between the years in terms of nitrate concentration and isotopic signature, but did in terms of flux. The contrasting amounts of discharge between the years led to a strongly increased relative wastewater contribution with decreasing discharge. This was mainly manifested in elevated δ18O-NO3- values downstream from the wastewater discharge. The four-year monitoring at two sites clearly indicates the chemostatic character of the agricultural N-source and its distinct, yet stable isotopic fingerprint. Denitrification was found to play no dominant role in controlling nitrate loads in the river. The spatially highly resolved monitoring approach helped to accurately define hot spots of nitrate inputs into the stream, while the long-term information allowed a classification of the results with respect to the seasonal N-dynamics in the catchment.
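    A minimal sketch of the flux bookkeeping implied above: the nitrate load at a transect is concentration times discharge, so a near-constant wastewater input makes up a larger share of the total load when river discharge drops. The concentrations below are illustrative, not the study's data.

```python
# Minimal sketch: nitrate load = concentration x discharge, and the relative
# wastewater share grows as river discharge falls. Values are illustrative.
def nitrate_load_g_per_day(concentration_mg_per_l, discharge_m3_per_s):
    # mg/L * m3/s = g/s; multiply by seconds per day to get g/day
    return concentration_mg_per_l * discharge_m3_per_s * 86_400

wwtp_load = nitrate_load_g_per_day(30.0, 0.02)        # near-constant point source
for q_river in (1.42, 0.43):                          # wet vs. dry campaign discharge
    total = nitrate_load_g_per_day(4.0, q_river) + wwtp_load
    print(f"Q={q_river} m3/s: wastewater share = {wwtp_load / total:.1%}")
```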

  7. Portrayal of caesarean section in Brazilian women’s magazines: 20 year review

    PubMed Central

    Daher, Silvia; Betrán, Ana Pilar; Widmer, Mariana; Montilla, Pilar; Souza, Joao Paulo; Merialdi, Mario

    2011-01-01

    Objective To assess the quality and comprehensiveness of the information on caesarean section provided in Brazilian women’s magazines. Design Review of articles published during 1988-2008 in top selling women’s magazines. Setting Brazil, one of the countries with the highest caesarean section rates in the world. Data sources Women’s magazines with the largest distribution during the study period, identified through the official national media indexing organisations. Selection criteria Articles with objective scientific information or advice, comments, opinions, or the experience of ordinary women or celebrities on delivery by caesarean section. Main outcome measures Sources of information mentioned by the author of the article, the accuracy and completeness of data presented on caesarean section, and alleged reasons why women would prefer to deliver through caesarean section. Results 118 articles were included. The main cited sources of information were health professionals (78% (n=92) of the articles). 71% (n=84) of the articles reported at least one benefit of caesarean section, and 82% (n=97) reported at least one short term maternal risk of caesarean section. The benefits most often attributed to delivery by caesarean section were reduction of pain and convenience for family or health professionals. The most frequently reported short term maternal risks of caesarean section were increased time to recover and that it is a less natural way of giving birth. Only one third of the articles mentioned any long term maternal risks or perinatal complications associated with caesarean section. Fear of pain was the main reported reason why women would prefer to deliver by caesarean section. Conclusions Most of the articles published in Brazilian women’s magazines do not use optimal sources of information. The portrayal of caesarean section is mostly balanced, not explicitly in favour of one or another route of delivery, but incomplete and may be leading women to underestimate the maternal/perinatal risks associated with this route of delivery. PMID:21266421

  8. Insights Gained from Forensic Analysis with MELCOR of the Fukushima-Daiichi Accidents.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andrews, Nathan C.; Gauntt, Randall O.

    Since the accidents at Fukushima-Daiichi, Sandia National Laboratories has been modeling these accident scenarios using the severe accident analysis code, MELCOR. MELCOR is a widely used computer code developed at Sandia National Laboratories since ~1982 for the U.S. Nuclear Regulatory Commission. Insights from the modeling of these accidents are being used to better inform future code development and potentially improve accident management. To date, the need to better capture in-vessel thermal-hydraulics and ex-vessel melt coolability and concrete interactions has led to the implementation of new models. The most recent analyses, presented in this paper, have been in support of the Organization for Economic Cooperation and Development Nuclear Energy Agency’s (OECD/NEA) Benchmark Study of the Accident at the Fukushima Daiichi Nuclear Power Station (BSAF) Project. The goal of this project is to accurately capture the source term from all three releases and then model the atmospheric dispersion. In order to do this, a forensic approach is being used in which available plant data and release timings inform the modeled MELCOR accident scenario. For example, containment failures, core slumping events and lower head failure timings are all enforced parameters in these analyses. This approach is fundamentally different from a blind code assessment analysis often used in standard problem exercises. The timings of these events are informed by representative spikes or decreases in plant data. The combination of improvements to the MELCOR source code resulting from previous accident analyses and this forensic approach has allowed Sandia to generate representative and plausible source terms for all three accidents at Fukushima Daiichi out to three weeks after the accident, capturing both early and late releases. In particular, using the source terms developed by MELCOR as input to the MACCS software code, which models atmospheric dispersion and deposition, we are able to reasonably capture the deposition of radionuclides to the northwest of the reactor site.

  9. Assessing and measuring wetland hydrology

    USGS Publications Warehouse

    Rosenberry, Donald O.; Hayashi, Masaki; Anderson, James T.; Davis, Craig A.

    2013-01-01

    Virtually all ecological processes that occur in wetlands are influenced by the water that flows to, from, and within these wetlands. This chapter provides the “how-to” information for quantifying the various source and loss terms associated with wetland hydrology. The chapter is organized from a water-budget perspective, with sections associated with each of the water-budget components that are common in most wetland settings. Methods for quantifying the water contained within the wetland are presented first, followed by discussion of each separate component. Measurement accuracy and sources of error are discussed for each of the methods presented, and a separate section discusses the cumulative error associated with determining a water budget for a wetland. Exercises and field activities will provide hands-on experience that will facilitate greater understanding of these processes.
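    A minimal sketch of the water-budget framing used in the chapter, assuming the common component set (precipitation, surface-water and groundwater inflows and outflows, evapotranspiration); the residual against measured storage change lumps the cumulative measurement error the chapter discusses. Values are illustrative.

```python
# Minimal water-budget sketch; term names and numbers are illustrative.
def wetland_storage_change(P, SWin, GWin, ET, SWout, GWout):
    """All terms in mm over the same period; returns predicted storage change."""
    return (P + SWin + GWin) - (ET + SWout + GWout)

predicted = wetland_storage_change(P=120, SWin=45, GWin=15, ET=90, SWout=50, GWout=10)
measured = 25                       # e.g., from stage recorder and bathymetry
residual = measured - predicted     # lumps the errors of every measured component
print(predicted, residual)
```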

  10. Aid to people with disabilities: Medicaid's growing role.

    PubMed

    Carbaugh, Alicia L; Elias, Risa; Rowland, Diane

    2006-01-01

    Medicaid is the nation's largest health care program providing assistance with health and long-term care services for millions of low-income Americans, including people with chronic illness and severe disabilities. This article traces the evolution of Medicaid's now-substantial role for people with disabilities; assesses Medicaid's contributions over the last four decades to improving health insurance coverage, access to care, and the delivery of care; and examines the program's future challenges as a source of assistance to children and adults with disabilities. Medicaid has shown that it is an important source of health insurance coverage for this population, people for whom private coverage is often unavailable or unaffordable, substantially expanding coverage and helping to reduce the disparities in access to care between the low-income population and the privately insured.

  11. Probabilistic seismic hazard analysis for a nuclear power plant site in southeast Brazil

    NASA Astrophysics Data System (ADS)

    de Almeida, Andréia Abreu Diniz; Assumpção, Marcelo; Bommer, Julian J.; Drouet, Stéphane; Riccomini, Claudio; Prates, Carlos L. M.

    2018-05-01

    A site-specific probabilistic seismic hazard analysis (PSHA) has been performed for the only nuclear power plant site in Brazil, located 130 km southwest of Rio de Janeiro at Angra dos Reis. Logic trees were developed for both the seismic source characterisation and ground-motion characterisation models, in both cases seeking to capture the appreciable ranges of epistemic uncertainty with relatively few branches. This logic-tree structure allowed the hazard calculations to be performed efficiently while obtaining results that reflect the inevitable uncertainty in long-term seismic hazard assessment in this tectonically stable region. An innovative feature of the study is an additional seismic source zone added to capture the potential contributions of characteristic earthquakes associated with geological faults in the region surrounding the coastal site.
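    For reference, the standard hazard integral behind such a PSHA (generic textbook form, not reproduced from the paper); the logic-tree branches vary its inputs, and the branch-weighted mean yields the site hazard curve:

```latex
% Generic PSHA hazard integral (illustrative, standard notation):
\begin{equation}
  \lambda(IM > im) \;=\; \sum_{i=1}^{N_{\mathrm{src}}} \nu_i
  \int_{m}\!\int_{r} P\!\left(IM > im \mid m, r\right)\,
  f_{M_i}(m)\, f_{R_i}(r)\, dm\, dr ,
\end{equation}
% where \nu_i is the activity rate of source zone i, f_{M_i} and f_{R_i} are the
% magnitude and distance densities, and P(IM > im | m, r) comes from the
% ground-motion model.
```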

  12. Definition of 1992 Technology Aircraft Noise Levels and the Methodology for Assessing Airplane Noise Impact of Component Noise Reduction Concepts

    NASA Technical Reports Server (NTRS)

    Kumasaka, Henry A.; Martinez, Michael M.; Weir, Donald S.

    1996-01-01

    This report describes the methodology for assessing the impact of component noise reduction on total airplane system noise. The methodology is intended to be applied to the results of individual study elements of the NASA Advanced Subsonic Technology (AST) Noise Reduction Program, which will address the development of noise reduction concepts for specific components. Program progress will be assessed in terms of noise reduction achieved, relative to baseline levels representative of 1992 technology airplane/engine design and performance. In this report, the 1992 technology reference levels are defined for assessment models based on four airplane sizes - an average business jet and three commercial transports: a small twin, a medium-sized twin, and a large quad. Study results indicate that the component changes defined as program final goals for nacelle treatment and engine/airframe source noise reduction would achieve a 6-7 EPNdB reduction of total airplane noise at FAR 36 Stage 3 noise certification conditions for all of the airplane noise assessment models.

  13. Portable Imagery Quality Assessment Test Field for Uav Sensors

    NASA Astrophysics Data System (ADS)

    Dąbrowski, R.; Jenerowicz, A.

    2015-08-01

    Nowadays the imagery data acquired from UAV sensors are the main source of data used in various remote sensing applications, photogrammetry projects and imagery intelligence (IMINT), as well as in other tasks such as decision support. Therefore quality assessment of such imagery is an important task. The research team from the Military University of Technology, Faculty of Civil Engineering and Geodesy, Geodesy Institute, Department of Remote Sensing and Photogrammetry has designed and prepared a special test field, the Portable Imagery Quality Assessment Test Field (PIQuAT), that provides quality assessment under field conditions of images obtained with sensors mounted on UAVs. The PIQuAT consists of 6 individual segments which, when combined, allow the radiometric, spectral and spatial resolution of images acquired from UAVs to be determined. All segments of the PIQuAT can be used together in various configurations or independently. All elements of the Portable Imagery Quality Assessment Test Field were tested in laboratory conditions in terms of their radiometry and spectral reflectance characteristics.

  14. Disaster Risk Reduction through Innovative Uses of Crowd Sourcing (Invited)

    NASA Astrophysics Data System (ADS)

    Berger, J.; Greene, M.

    2010-12-01

    Crowd sourcing can be described as a method of distributed problem-solving. It takes advantage of the power of the crowd, which can in some cases be a community of experts and in other cases the collective insight of a broader range of contributors with varying degrees of domain knowledge. The term crowd sourcing was first used by Jeff Howe in a June 2006 Wired magazine article “The Rise of Crowdsourcing,” and is a combination of the terms “crowd” and “outsourcing.” Some commonly known examples of crowd sourcing, in its broadest sense, include Wikipedia, distributed participatory design projects, and consumer websites such as Yelp and Angie’s List. The popularity and success of early large-scale crowd sourcing activities were made possible through leveraging Web 2.0 technologies that allow for mass participation from distributed individuals. The Earthquake Engineering Research Institute (EERI) in Oakland, California recently participated in two crowd sourcing projects. One was initiated and coordinated by EERI, while in the second case EERI was invited to contribute once the crowd sourcing activity was underway. In both projects there was: 1) the determination of a problem or set of tasks that could benefit immediately from the engagement of an informed volunteer group of professionals; 2) a segmenting of the problem into discrete pieces that could be completed in a short period of time (from ten minutes to four hours); 3) a call to action, where an interested community was made aware of the project; and 4) the collection, aggregation, vetting and ultimately distribution of the results in a relatively short period of time. The first EERI crowd sourcing example was the use of practicing engineers and engineering students in California to help estimate the number of pre-1980 concrete buildings in the high seismic risk counties in the state. This building type is known to perform poorly in earthquakes, and state officials were interested in understanding more about the size of the problem: how many buildings, and in which jurisdictions. Volunteers signed up for individual jurisdictions and used a variety of techniques to estimate the count. They shared their techniques at meetings and posted their results online. Over 100 volunteers also came together to walk the streets of downtown San Francisco, a city with a particularly large number of these buildings, gathering more data on each building that will be used in a later phase to identify possible mitigation strategies. The second example was EERI’s participation in a response network, GEO-CAN, created in support of the World Bank’s responsibility in the damage assessment of buildings in Port-au-Prince immediately after the January 12, 2010 earthquake. EERI members, primarily earthquake engineers, were invited to speed up critical damage assessment using pre- and post-event aerial imagery. An area of 300 sq km was divided into grids, and grids were then allocated to knowledgeable individuals for analysis. The initial analysis was completed within 96 hours through the participation of over 300 volunteers. Ultimately, over 600 volunteers completed damage assessments for about 30,000 buildings.

  15. Attenuation Tomography of Northern California and the Yellow Sea / Korean Peninsula from Coda-source Normalized and Direct Lg Amplitudes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ford, S R; Dreger, D S; Phillips, W S

    2008-07-16

    Inversions for regional attenuation (1/Q) of Lg are performed in two different regions. The path attenuation component of the Lg spectrum is isolated using the coda-source normalization method, which corrects the Lg spectral amplitude for the source using the stable, coda-derived source spectra. Tomographic images of Northern California agree well with one-dimensional (1-D) Lg Q estimated from five different methods. We note there is some tendency for tomographic smoothing to increase Q relative to targeted 1-D methods. For example, in the San Francisco Bay Area, which shows high attenuation relative to the rest of its region, Q is over-estimated by ~30. Coda-source normalized attenuation tomography is also carried out for the Yellow Sea/Korean Peninsula (YSKP), where output parameters (site, source, and path terms) are compared with those from the amplitude tomography method of Phillips et al. (2005) as well as a new method that ties the source term to the MDAC formulation (Walter and Taylor, 2001). The source terms show similar scatter between the coda-source corrected and MDAC source perturbation methods, whereas the amplitude method has the greatest correlation with estimated true source magnitude. The coda-source better represents the source spectra compared to the estimated magnitude and could be the cause of the scatter. The similarity in the source terms between the coda-source and MDAC-linked methods shows that the latter method may approximate the effect of the former, and therefore could be useful in regions without coda-derived sources. The site terms from the MDAC-linked method correlate slightly with global Vs30 measurements. While the coda-source and amplitude ratio methods do not correlate with Vs30 measurements, they do correlate with one another, which provides confidence that the two methods are consistent. The path Q⁻¹ values are very similar between the coda-source and amplitude ratio methods except for small differences in the Da-xin-anling Mountains in the northern YSKP. However, there is one large difference between the MDAC-linked method and the others in the region near stations TJN and INCN, which points to site effect as the cause for the difference.
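    An illustrative form of the Lg amplitude model that motivates coda-source normalization (standard notation, assumed rather than quoted from the report):

```latex
% Illustrative Lg amplitude model (standard notation):
\begin{equation}
  A_{Lg}(f, r) \;=\; S(f)\, G(r)\,
  \exp\!\left(-\frac{\pi f\, r}{Q(f)\, v_{Lg}}\right) P(f),
\end{equation}
% where S(f) is the source spectrum, G(r) geometrical spreading, P(f) the site
% term and v_{Lg} the group velocity. Dividing A_{Lg} by the stable,
% coda-derived source spectrum removes S(f), so the remaining path decay can be
% inverted tomographically for Q^{-1} along each source-station path.
```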

  16. The use of hierarchical clustering for the design of optimized monitoring networks

    NASA Astrophysics Data System (ADS)

    Soares, Joana; Makar, Paul Andrew; Aklilu, Yayne; Akingunola, Ayodeji

    2018-05-01

    Associativity analysis is a powerful tool to deal with large-scale datasets by clustering the data on the basis of (dis)similarity and can be used to assess the efficacy and design of air quality monitoring networks. We describe here our use of Kolmogorov-Zurbenko filtering and hierarchical clustering of NO2 and SO2 passive and continuous monitoring data to analyse and optimize air quality networks for these species in the province of Alberta, Canada. The methodology applied in this study assesses dissimilarity between monitoring station time series based on two metrics: 1 - R, R being the Pearson correlation coefficient, and the Euclidean distance; we find that both should be used in evaluating monitoring site similarity. We have combined the analytic power of hierarchical clustering with the spatial information provided by deterministic air quality model results, using the gridded time series of model output as potential station locations, as a proxy for assessing monitoring network design and for network optimization. We demonstrate that clustering results depend on the air contaminant analysed, reflecting the difference in the respective emission sources of SO2 and NO2 in the region under study. Our work shows that much of the signal identifying the sources of NO2 and SO2 emissions resides in shorter timescales (hourly to daily) due to short-term variation of concentrations and that longer-term averages in data collection may lose the information needed to identify local sources. However, the methodology identifies stations mainly influenced by seasonality, if larger timescales (weekly to monthly) are considered. We have performed the first dissimilarity analysis based on gridded air quality model output and have shown that the methodology is capable of generating maps of subregions within which a single station will represent the entire subregion, to a given level of dissimilarity. We have also shown that our approach is capable of identifying different sampling methodologies as well as outliers (stations' time series which are markedly different from all others in a given dataset).
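    A hedged Python sketch of the clustering step described above: station time series are compared with the two dissimilarity metrics (1 - R and Euclidean distance) and grouped by hierarchical clustering; stations sharing a cluster are candidates for network consolidation. The synthetic data below stand in for the filtered Alberta NO2/SO2 records.

```python
# Hedged sketch: hierarchical clustering of station time series under two
# dissimilarity metrics (1 - Pearson R, Euclidean distance). Data are synthetic.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist, squareform

rng = np.random.default_rng(0)
series = rng.normal(size=(12, 8760))           # 12 stations, hourly values for a year

corr_dist = 1.0 - np.corrcoef(series)          # metric 1: 1 - R
eucl_dist = squareform(pdist(series))          # metric 2: Euclidean distance

for name, d in (("1 - R", corr_dist), ("euclidean", eucl_dist)):
    z = linkage(squareform(d, checks=False), method="average")
    labels = fcluster(z, t=4, criterion="maxclust")
    print(name, labels)                        # stations sharing a label are consolidation candidates
```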

  17. Agreement of central site measurements and land use regression modeled oxidative potential of PM2.5 with personal exposure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Aileen, E-mail: Yang@uu.nl; Institute for Risk Assessment Sciences, Division Environmental Epidemiology, Utrecht University, P.O. Box 80.178, 3508TD Utrecht; Hoek, Gerard

    Oxidative potential (OP) of ambient particulate matter (PM) has been suggested as a health-relevant exposure metric. In order to use OP for exposure assessment, information is needed about how well central site OP measurements and modeled average OP at the home address reflect temporal and spatial variation of personal OP. We collected 96-hour personal, home outdoor and indoor PM2.5 samples from 15 volunteers living either at traffic, urban or regional background locations in Utrecht, the Netherlands. OP was also measured at one central reference site to account for temporal variations. OP was assessed using electron spin resonance (OP-ESR) and dithiothreitol (OP-DTT). Spatial variation of average OP at the home address was modeled using land use regression (LUR) models. For both OP-ESR and OP-DTT, temporal correlations of central site measurements with home outdoor measurements were high (R>0.75), and moderate to high (R=0.49–0.70) with personal measurements. The LUR model predictions for OP correlated significantly with the home outdoor concentrations for OP-DTT and OP-ESR (R=0.65 and 0.62, respectively). LUR model predictions were moderately correlated with personal OP-DTT measurements (R=0.50). Adjustment for indoor sources, such as vacuum cleaning and absence of a fume hood, improved the temporal and spatial agreement with measured personal exposure for OP-ESR. OP-DTT was not associated with any indoor sources. Our study results support the use of central site OP for exposure assessment in epidemiological studies focusing on short-term health effects. - Highlights: • Oxidative potential (OP) of PM was proposed as a health-relevant exposure metric. • We evaluated the relationship between measured and modeled outdoor and personal OP. • Temporal correlations of central site with personal OP are moderate to high. • Adjusting for indoor sources improved the agreement with personal OP. • Our results support the use of central site OP for short-term health effect studies.

  18. Remediation and its effect represented on long term monitoring data at a chlorinated ethenes contaminated site, Wonju, Korea

    NASA Astrophysics Data System (ADS)

    Lee, Seong-Sun; Lee, Seung Hyun; Lee, Kang-Kun

    2016-04-01

    Research on contamination by chlorinated ethenes such as trichloroethylene (TCE) at an industrial complex in Wonju, Korea, was carried out based on 17 rounds of groundwater quality data collection from 2009 to 2015. Remediation technologies such as soil vapor extraction, soil flushing, biostimulation, and pump-and-treat have been applied to eliminate the TCE contaminant sources and to prevent the migration of the TCE plume from remediation target zones to groundwater discharge areas such as a stream. The remediation efficiency of the remedial actions was evaluated by tracing a time series of plume evolution and temporal mass discharge at three transects (Source, Transect-1, Transect-2) assigned along the groundwater flow path. Also, based on long-term monitoring data, the dissolved TCE concentration and mass of residual TCE in the initial stage of disposal were estimated to evaluate the efficiency of in situ remediation. The results of temporal and spatial monitoring before remedial actions showed that a TCE plume originating from main and local source zones continued to be discharged to a stream. However, after the end of intensive remedial actions in 2012-2013, the aqueous TCE concentrations at and around the main source areas decreased significantly. In particular, during the intensive remediation period, the early average mass discharge at the source transect (26.58 g/day) decreased to an average of 4.99 g/day. The estimated initial dissolved concentration and residual mass of TCE in the initial stage of disposal decreased rapidly after an intensive remedial action in 2013 and are expected to continue decreasing from the end of remedial actions to 2020. This study demonstrates that long-term monitoring data are useful in assessing the effectiveness of remedial actions at a chlorinated-ethene-contaminated site. Acknowledgements: This project is supported by the Korea Ministry of Environment under "The GAIA Project (173-092-009)" and the "R&D Project on Environmental Management of Geologic CO2 storage" from the KEITI (Project number: 2014001810003).

  19. 10 CFR 40.41 - Terms and conditions of licenses.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 1 2010-01-01 2010-01-01 false Terms and conditions of licenses. 40.41 Section 40.41 Energy NUCLEAR REGULATORY COMMISSION DOMESTIC LICENSING OF SOURCE MATERIAL Licenses § 40.41 Terms and... the regulations in this part shall confine his possession and use of source or byproduct material to...

  20. Comparison of the landslide susceptibility models in Taipei Water Source Domain, Taiwan

    NASA Astrophysics Data System (ADS)

    WU, C. Y.; Yeh, Y. C.; Chou, T. H.

    2017-12-01

    Taipei Water Source Domain, located to the southeast of Taipei Metropolis, is the main source of water in this region. Recently, the downstream turbidity has often soared significantly during typhoon periods because of upstream landslides. The landslide susceptibilities should be analysed to assess the influence zones caused by different rainfall events, and to ensure the ability of this domain to supply sufficient, high-quality water. Generally, landslide susceptibility models can be established based on either a long-term landslide inventory or a specified landslide event. Sometimes there is no long-term landslide inventory in an area, so event-based landslide susceptibility models are widely established. However, inventory-based and event-based landslide susceptibility models may result in dissimilar susceptibility maps for the same area. The purposes of this study were therefore to compare the landslide susceptibility maps derived from the inventory-based and event-based models, and to interpret how to select a representative event to be included in the susceptibility model. The landslide inventory from Typhoon Tim in July 1994 and Typhoon Soudelor in August 2015 was collected and used to establish the inventory-based landslide susceptibility model. The landslides caused by Typhoon Nari and rainfall data were used to establish the event-based model. The results indicated that the high-susceptibility slope units were located in the middle and upstream Nan-Shih Stream basin.

  1. The combined effects of a long-term experimental drought and an extreme drought on the use of plant-water sources in a Mediterranean forest.

    PubMed

    Barbeta, Adrià; Mejía-Chang, Monica; Ogaya, Romà; Voltas, Jordi; Dawson, Todd E; Peñuelas, Josep

    2015-03-01

    Vegetation in water-limited ecosystems relies strongly on access to deep water reserves to withstand dry periods. Most of these ecosystems have shallow soils over deep groundwater reserves. Understanding the functioning and functional plasticity of species-specific root systems and the patterns of or differences in the use of water sources under more frequent or intense droughts is therefore necessary to properly predict the responses of seasonally dry ecosystems to future climate. We used stable isotopes to investigate the seasonal patterns of water uptake by a sclerophyll forest on sloped terrain with shallow soils. We assessed the effect of a long-term experimental drought (12 years) and the added impact of an extreme natural drought that produced widespread tree mortality and crown defoliation. The dominant species, Quercus ilex, Arbutus unedo and Phillyrea latifolia, all have dimorphic root systems enabling them to access different water sources in space and time. The plants extracted water mainly from the soil in the cold and wet seasons but increased their use of groundwater during the summer drought. Interestingly, the plants subjected to the long-term experimental drought shifted water uptake toward deeper (10-35 cm) soil layers during the wet season and reduced groundwater uptake in summer, indicating plasticity in the functional distribution of fine roots that dampened the effect of our experimental drought over the long term. An extreme drought in 2011, however, further reduced the contribution of deep soil layers and groundwater to transpiration, which resulted in greater crown defoliation in the drought-affected plants. This study suggests that extreme droughts aggravate moderate but persistent drier conditions (simulated by our manipulation) and may lead to the depletion of water from groundwater reservoirs and weathered bedrock, threatening the preservation of these Mediterranean ecosystems in their current structures and compositions. © 2014 John Wiley & Sons Ltd.

  2. Multisource Estimation of Long-term Global Terrestrial Surface Radiation

    NASA Astrophysics Data System (ADS)

    Peng, L.; Sheffield, J.

    2017-12-01

    Land surface net radiation is the essential energy source at the earth's surface. It determines the surface energy budget and its partitioning, drives the hydrological cycle by providing available energy, and offers heat, light, and energy for biological processes. Individual components in net radiation have changed historically due to natural and anthropogenic climate change and land use change. Decadal variations in radiation such as global dimming or brightening have important implications for hydrological and carbon cycles. In order to assess the trends and variability of net radiation and evapotranspiration, there is a need for accurate estimates of long-term terrestrial surface radiation. While large progress in measuring top of atmosphere energy budget has been made, huge discrepancies exist among ground observations, satellite retrievals, and reanalysis fields of surface radiation, due to the lack of observational networks, the difficulty in measuring from space, and the uncertainty in algorithm parameters. To overcome the weakness of single source datasets, we propose a multi-source merging approach to fully utilize and combine multiple datasets of radiation components separately, as they are complementary in space and time. First, we conduct diagnostic analysis of multiple satellite and reanalysis datasets based on in-situ measurements such as Global Energy Balance Archive (GEBA), existing validation studies, and other information such as network density and consistency with other meteorological variables. Then, we calculate the optimal weighted average of multiple datasets by minimizing the variance of error between in-situ measurements and other observations. Finally, we quantify the uncertainties in the estimates of surface net radiation and employ physical constraints based on the surface energy balance to reduce these uncertainties. The final dataset is evaluated in terms of the long-term variability and its attribution to changes in individual components. The goal of this study is to provide a merged observational benchmark for large-scale diagnostic analyses, remote sensing and land surface modeling.
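    A minimal sketch of the minimum-variance merging step described above: each dataset is weighted inversely to its error variance estimated against in-situ benchmarks such as GEBA, which minimizes the variance of the combined estimate among linear combinations. Dataset names and error variances below are illustrative assumptions.

```python
# Minimal sketch: inverse-variance (minimum-variance) merging of radiation
# estimates from multiple datasets. Values are illustrative.
import numpy as np

def merge_min_variance(estimates, error_variances):
    """estimates: (n_datasets, ...) array-like; error_variances: one scalar per dataset."""
    var = np.asarray(error_variances, dtype=float)
    w = 1.0 / var
    w /= w.sum()
    merged = np.tensordot(w, np.asarray(estimates, dtype=float), axes=1)
    merged_var = 1.0 / (1.0 / var).sum()       # variance of the combined estimate
    return merged, w, merged_var

sat_a, sat_b, reanalysis = 145.0, 152.0, 138.0           # W m-2 net radiation at one cell
merged, weights, var = merge_min_variance([sat_a, sat_b, reanalysis], [25.0, 16.0, 49.0])
print(merged, weights, var)
```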

  3. Exploiting heterogeneous publicly available data sources for drug safety surveillance: computational framework and case studies.

    PubMed

    Koutkias, Vassilis G; Lillo-Le Louët, Agnès; Jaulent, Marie-Christine

    2017-02-01

    Driven by the need of pharmacovigilance centres and companies to routinely collect and review all available data about adverse drug reactions (ADRs) and adverse events of interest, we introduce and validate a computational framework exploiting dominant as well as emerging publicly available data sources for drug safety surveillance. Our approach relies on appropriate query formulation for data acquisition and subsequent filtering, transformation and joint visualization of the obtained data. We acquired data from the FDA Adverse Event Reporting System (FAERS), PubMed and Twitter. In order to assess the validity and the robustness of the approach, we elaborated on two important case studies, namely, clozapine-induced cardiomyopathy/myocarditis versus haloperidol-induced cardiomyopathy/myocarditis, and apixaban-induced cerebral hemorrhage. The analysis of the obtained data provided interesting insights (identification of potential patient and health-care professional experiences regarding ADRs in Twitter, information/arguments against an ADR existence across all sources), while illustrating the benefits (complementing data from multiple sources to strengthen/confirm evidence) and the underlying challenges (selecting search terms, data presentation) of exploiting heterogeneous information sources, thereby advocating the need for the proposed framework. This work contributes in establishing a continuous learning system for drug safety surveillance by exploiting heterogeneous publicly available data sources via appropriate support tools.

  4. Changes in stable isotope composition in Lake Michigan trout ...

    EPA Pesticide Factsheets

    Researchers have frequently sought to use environmental archives of sediment, peat and glacial ice to try and assess historical trends in atmospheric mercury (Hg) deposition to aquatic ecosystems. While this information is valuable in the context of identifying temporal source trends, these types of assessments cannot account for likely changes in bioavailability of Hg sources that are tied to the formation of methylmercury (MeHg) and accumulation in fish tissues. For this study we propose the use of long-term fish archives and Hg stable isotope determination as an improved means to relate temporal changes in fish Hg levels to varying Hg sources in the Great Lakes. For this study we acquired 180 archived fish composites from Lake Michigan over a 40-year time period (1975 to 2014) from the Great Lakes Fish Monitoring and Surveillance Program, which were analyzed for their total Hg content and Hg isotope abundances. The results reveal that Hg sources to Lake Michigan trout (Salvelinus namaycush) have encountered considerable changes as well as a large shift in the food web trophic position as a result of the introduction of several invasive species, especially the recent invasion of dreissenid mussels. Total Hg concentrations span a large range (1,600 to 150 ng g-1) and exhibit large variations from 1975 to 1985. Δ199Hg signatures similarly exhibit large variation (3.2 to 6.9‰) until 1985, followed by less variation through the end of the data record in 2014.

  5. Evaluation of the Pivot Profile©, a new method to characterize a large variety of a single product: Case study on honeys from around the world.

    PubMed

    Deneulin, Pascale; Reverdy, Caroline; Rébénaque, Pierrick; Danthe, Eve; Mulhauser, Blaise

    2018-04-01

    Honey is a natural product with very diverse sensory attributes that are influenced by the flower source, the bee species, the geographic origin, the treatments and conditions during storage. This study aimed at describing 50 honeys from diverse flower sources in different continents and islands, stored under various conditions. Many articles have been published on the sensory characterization of honeys, thus a common list of attributes has been established, but it appeared to be poorly suited to describe a large number of honeys from around the world. This is why the novel and rapid sensory evaluation method, the Pivot Profile©, was tested, with the participation of 15 panelists during five sessions. The first objective was to obtain a sensory description of the 50 honeys that were tested. From 1152 distinct terms, a list of 29 sensory attributes was established and the attributes divided into three categories: color/texture (8 terms), aroma (16 terms), and taste (5 terms). At first, the honeys have been ranked according to their level of crystallization from fluid/liquid to viscous/hard. Then color was the second assessment factor of the variability. In terms of aroma, honeys from Africa were characterized by smoky, resin, caramel and dried fruit as opposed to floral and fruity, mainly for honeys from South America and Europe. Finally, the honeys were ranked according to their sweetness. The second objective of this study was to test the new sensory method, called Pivot Profile© which is used to describe a large number of products with interpretable results. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. Enhancing GADRAS Source Term Inputs for Creation of Synthetic Spectra.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Horne, Steven M.; Harding, Lee

    The Gamma Detector Response and Analysis Software (GADRAS) team has enhanced the source term input for the creation of synthetic spectra. These enhancements include the following: allowing users to programmatically provide source information to GADRAS through memory, rather than through a string limited to 256 characters; allowing users to provide their own source decay database information; and updating the default GADRAS decay database to fix errors and include coincident gamma information.

  7. Localization of sound sources in a room with one microphone

    NASA Astrophysics Data System (ADS)

    Peić Tukuljac, Helena; Lissek, Hervé; Vandergheynst, Pierre

    2017-08-01

    Estimation of the location of sound sources is usually done using microphone arrays. Such settings provide an environment where we know the difference between the received signals among different microphones in terms of phase or attenuation, which enables localization of the sound sources. In our solution we exploit the properties of the room transfer function in order to localize a sound source inside a room with only one microphone. The shape of the room and the position of the microphone are assumed to be known. The design guidelines and limitations of the sensing matrix are given. The implementation is based on sparsity in terms of the voxels in a room that are occupied by a source. What is especially interesting about our solution is that we provide localization of the sound sources not only in the horizontal plane, but in terms of the 3D coordinates inside the room.
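    A hedged sketch of the sparse-recovery idea described above: the columns of a sensing matrix hold the (known) response from each candidate voxel to the single microphone, and the occupied voxel is recovered with a greedy sparse solver. The random matrix below merely stands in for the simulated room transfer functions; it is not the sensing-matrix design of the paper.

```python
# Hedged sketch: greedy sparse recovery (orthogonal matching pursuit) of the
# occupied voxel from single-microphone data y ~ A x with x sparse.
import numpy as np

def omp(A, y, n_sources=1):
    """Orthogonal matching pursuit; returns the selected columns and their coefficients."""
    residual, support = y.copy(), []
    for _ in range(n_sources):
        support.append(int(np.argmax(np.abs(A.T @ residual))))   # best-matching voxel
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    return support, coef

rng = np.random.default_rng(1)
A = rng.normal(size=(512, 1000))          # 512 frequency samples x 1000 candidate voxels
true_voxel = 437
y = A[:, true_voxel] * 0.8 + 0.01 * rng.normal(size=512)
print(omp(A, y, n_sources=1)[0])          # expected to recover [437] in this toy example
```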

  8. River stage influences on uranium transport in a hydrologically dynamic groundwater-surface water transition zone

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zachara, John M.; Chen, Xingyuan; Murray, Chris

    A tightly spaced well-field within a groundwater uranium (U) plume in the groundwater-surface water transition zone was monitored for a three-year period for groundwater elevation and dissolved solutes. The plume discharges to the Columbia River, which displays a dramatic spring stage surge resulting from mountain snowmelt. Groundwater exhibits a low hydrologic gradient and chemical differences with river water. River water intrudes the site in spring. Specific aims were to assess the impacts of river intrusion on dissolved uranium (Uaq), specific conductance (SpC), and other solutes, and to discriminate between transport, geochemical, and source term heterogeneity effects. Time series trends for Uaq and SpC were complex and displayed large temporal well-to-well variability as a result of water table elevation fluctuations, river water intrusion, and changes in groundwater flow directions. The wells were clustered into subsets exhibiting common temporal behaviors resulting from the intrusion dynamics of river water and the location of source terms. Concentration hot spots were observed in groundwater that varied in location with increasing water table elevation. Heuristic reactive transport modeling with PFLOTRAN demonstrated that mobilized U was transported between wells and source terms in complex trajectories, and was diluted as river water entered and exited the groundwater system. While uranium time-series concentration trends varied significantly from year to year as a result of climate-caused differences in the spring hydrograph, common and partly predictable response patterns were observed that were driven by water table elevation and the extent and duration of the river water intrusion event.

  9. Testing a model-driven Geographical Information System for risk assessment during an effusive volcanic crisis

    NASA Astrophysics Data System (ADS)

    Harris, Andrew; Latutrie, Benjamin; Andredakis, Ioannis; De Groeve, Tom; Langlois, Eric; van Wyk de Vries, Benjamin; Del Negro, Ciro; Favalli, Massimiliano; Fujita, Eisuke; Kelfoun, Karim; Rongo, Rocco

    2016-04-01

    RED-SEED stands for Risk Evaluation, Detection and Simulation during Effusive Eruption Disasters, and combines stakeholders from the remote sensing, modeling and response communities with experience in tracking volcanic effusive events. It is an informal working group that has evolved around the philosophy of combining global scientific resources, in the realm of physical volcanology, remote sensing and modeling, to better define and limit uncertainty. The group first met during a three-day workshop held in Clermont Ferrand (France) between 28 and 30 May 2013. The main recommendation of the workshop in terms of modeling was that there is a pressing need for "real-time input of reliable Time-Averaged Discharge Rate (TADR) data with regular up-dates of Digital Elevation Models (DEMs) if modeling is to be effective; the DEMs can be provided by the radar/photogrammetry community." We thus set up a test to explore (i) which model source terms are needed, (ii) how they can be provided and updated, and (iii) how models can be run and applied in an ensemble approach. The test used two hypothetical effusive events in the Chaîne des Puys (Auvergne, France), for which a prototype Geographical Information System (GIS) was set up to allow loss assessment during an effusive crisis. This system drew on all immediately available data for population, land use, communications, utility and building type. After defining lava flow model source terms (vent location, effusion rate, lava chemistry, temperature, crystallinity and vesicularity), five operational lava flow emplacement models were run (DOWNFLOW, FLOWGO, LAVASIM, MAGFLOW and VOLCFLOW) to produce a projection of the likelihood of impact for all pixels within the area covered by the GIS, based on agreement between models. The test thus aimed not to assess each model's output individually, but to examine the overlap between model outputs. Next, inundation maps and damage reports for impacted zones were produced. The exercise identified several shortcomings of the modeling systems, but indicated that generation of a global response system for effusive crises that uses rapid-response model projections for lava inundation driven by real-time satellite hot spot detection - and open access data sets - is within the current capabilities of the community.
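    A minimal sketch of the model-agreement step described above: each lava flow model contributes a binary inundation raster, and the pixel-wise fraction of agreeing models gives the likelihood-of-impact layer fed to the GIS. The toy rasters below are placeholders for DOWNFLOW, FLOWGO, LAVASIM, MAGFLOW and VOLCFLOW output.

```python
# Minimal sketch: pixel-wise agreement across ensemble inundation footprints.
import numpy as np

def agreement_map(inundation_masks):
    """Stack of boolean rasters (n_models, ny, nx) -> fraction of models predicting impact."""
    return np.asarray(inundation_masks, dtype=float).mean(axis=0)

rng = np.random.default_rng(2)
masks = rng.random((5, 200, 200)) > 0.6       # five hypothetical model footprints
likelihood = agreement_map(masks)
high_confidence = likelihood >= 0.8           # pixels flagged by at least 4 of 5 models
print(high_confidence.sum(), "pixels at >= 4/5 agreement")
```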

  10. Effects of School-Based Educational Interventions for Enhancing Adolescents Abilities in Critical Appraisal of Health Claims: A Systematic Review.

    PubMed

    Nordheim, Lena V; Gundersen, Malene W; Espehaug, Birgitte; Guttersrud, Øystein; Flottorp, Signe

    2016-01-01

    Adolescents are frequent media users who access health claims from various sources. The plethora of conflicting, pseudo-scientific, and often misleading health claims in popular media makes critical appraisal of health claims an essential ability. Schools play an important role in educating youth to critically appraise health claims. The objective of this systematic review was to evaluate the effects of school-based educational interventions for enhancing adolescents' abilities in critically appraising health claims. We searched MEDLINE, Embase, PsycINFO, AMED, Cinahl, Teachers Reference Centre, LISTA, ERIC, Sociological Abstracts, Social Services Abstracts, The Cochrane Library, Science Citation Index Expanded, Social Sciences Citation Index, and sources of grey literature. Studies that evaluated school-based educational interventions to improve adolescents' critical appraisal ability for health claims through advancing the students' knowledge about science were included. Eligible study designs were randomised and non-randomised controlled trials, and interrupted time series. Two authors independently selected studies, extracted data, and assessed risk of bias in included studies. Due to heterogeneity in interventions and inadequate reporting of results, we performed a descriptive synthesis of studies. We used GRADE (Grading of Recommendations, Assessment, Development, and Evaluation) to assess the certainty of the evidence. Eight studies were included: two compared different teaching modalities, while the others compared educational interventions to instruction as usual. Studies mostly reported positive short-term effects on critical appraisal-related knowledge and skills in favour of the educational interventions. However, the certainty of the evidence for all comparisons and outcomes was very low. Educational interventions in schools may have beneficial short-term effects on knowledge and skills relevant to the critical appraisal of health claims. The small number of studies, their heterogeneity, and the predominantly high risk of bias inhibit any firm conclusions about their effects. None of the studies evaluated any long-term effects of interventions. Future intervention studies should adhere to high methodological standards, target a wider variety of school-based settings, and include a process evaluation. PROSPERO no. CRD42015017936.

  11. Effects of School-Based Educational Interventions for Enhancing Adolescents Abilities in Critical Appraisal of Health Claims: A Systematic Review

    PubMed Central

    Espehaug, Birgitte; Guttersrud, Øystein; Flottorp, Signe

    2016-01-01

    Background and Objective Adolescents are frequent media users who access health claims from various sources. The plethora of conflicting, pseudo-scientific, and often misleading health claims in popular media makes critical appraisal of health claims an essential ability. Schools play an important role in educating youth to critically appraise health claims. The objective of this systematic review was to evaluate the effects of school-based educational interventions for enhancing adolescents’ abilities in critically appraising health claims. Methods We searched MEDLINE, Embase, PsycINFO, AMED, Cinahl, Teachers Reference Centre, LISTA, ERIC, Sociological Abstracts, Social Services Abstracts, The Cochrane Library, Science Citation Index Expanded, Social Sciences Citation Index, and sources of grey literature. Studies that evaluated school-based educational interventions to improve adolescents’ critical appraisal ability for health claims through advancing the students’ knowledge about science were included. Eligible study designs were randomised and non-randomised controlled trials, and interrupted time series. Two authors independently selected studies, extracted data, and assessed risk of bias in included studies. Due to heterogeneity in interventions and inadequate reporting of results, we performed a descriptive synthesis of studies. We used GRADE (Grading of Recommendations, Assessment, Development, and Evaluation) to assess the certainty of the evidence. Results Eight studies were included: two compared different teaching modalities, while the others compared educational interventions to instruction as usual. Studies mostly reported positive short-term effects on critical appraisal-related knowledge and skills in favour of the educational interventions. However, the certainty of the evidence for all comparisons and outcomes was very low. Conclusion Educational interventions in schools may have beneficial short-term effects on knowledge and skills relevant to the critical appraisal of health claims. The small number of studies, their heterogeneity, and the predominantly high risk of bias inhibit any firm conclusions about their effects. None of the studies evaluated any long-term effects of interventions. Future intervention studies should adhere to high methodological standards, target a wider variety of school-based settings, and include a process evaluation. Systematic Review Registration PROSPERO no. CRD42015017936. PMID:27557129

  12. A Legacy of Wildfire-associated Nutrient Releases to Drinking Water Supplies: Treatment Challenges and Adaptations Opportunities

    NASA Astrophysics Data System (ADS)

    Emelko, M.; Silins, U.; Stone, M.

    2016-12-01

    Wildfire remains the most catastrophic agent of landscape disturbance in many forested source water regions. Notably, while wildfire impacts on water have been well studied, little if any of that work has specifically focused on drinking water treatability impacts, which will have both significant regional differences and similarities. Wildfire effects on water quality, particularly nutrient concentrations and character/forms, can be significant. The longevity and downstream propagation of these effects, as well as the geochemical mechanisms regulating them have been largely undocumented at larger river basin scales. This work demonstrates that fine sediment in gravel-bed rivers is a significant, long-term source of in-stream bioavailable P that contributes to a legacy of wildfire impacts on downstream water quality, aquatic ecology, and drinking water treatability in some ecoregions. The short- and mid-term impacts include increases in primary productivity and dissolved organic carbon, associated changes in carbon character, and increased potential for the formation of disinfection byproducts during drinking water treatment. The longer term impacts also may include increases in potentially toxic algal blooms and the production of taste and odor compounds. These documented impacts, as well as strategies for assessing the risk of wildfire-associated water service disruptions and infrastructure and land management-associated opportunities for adaptation to and mitigation of wildfire risk to drinking water supply will be discussed.

  13. The role of a detailed aqueous phase source release model in the LANL area G performance assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vold, E.L.; Shuman, R.; Hollis, D.K.

    1995-12-31

    A preliminary draft of the Performance Assessment for the Los Alamos National Laboratory (LANL) low-level radioactive waste disposal facility at Area G is currently being completed as required by Department of Energy orders. A detailed review of the inventory data base records and the existing models for source release led to the development of a new modeling capability to describe the liquid phase transport from the waste package volumes. Nuclide quantities are sorted down to four waste package release categories for modeling: rapid release, soil, concrete/sludge, and corrosion. Geochemistry for the waste packages was evaluated in terms of the equilibrium coefficients, Kds, and elemental solubility limits, Csl, interpolated from the literature. Percolation calculations for the base case closure cover show a highly skewed distribution with an average of 4 mm/yr percolation from the disposal unit bottom. The waste release model is based on a compartment representation of the package efflux, and depends on package size, percolation rate or Darcy flux, retardation coefficient, and moisture content.
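    A hedged sketch of a compartment-style aqueous release calculation of the kind described above (illustrative only, not the Area G model itself): leaching is driven by percolation through the package footprint, retarded by sorption (Kd), capped by elemental solubility, and limited by the remaining inventory. All parameter values are assumptions.

```python
# Hedged sketch of a compartment-style aqueous source release; not the Area G model.
def annual_release(inventory_ci, pore_water_conc, solubility_limit,
                   darcy_flux_m_per_yr, footprint_m2,
                   kd_ml_per_g, bulk_density_g_per_ml, porosity):
    """One-year aqueous release (Ci) from a waste-package compartment."""
    conc = min(pore_water_conc, solubility_limit)                # Ci/m3, solubility-capped
    retardation = 1.0 + bulk_density_g_per_ml * kd_ml_per_g / porosity
    advective = conc * darcy_flux_m_per_yr * footprint_m2 / retardation
    return min(advective, inventory_ci)                          # cannot exceed the inventory

release = annual_release(inventory_ci=2.0, pore_water_conc=1e-3, solubility_limit=5e-3,
                         darcy_flux_m_per_yr=0.004,              # ~4 mm/yr percolation
                         footprint_m2=100.0, kd_ml_per_g=5.0,
                         bulk_density_g_per_ml=1.5, porosity=0.35)
print(f"{release:.2e} Ci/yr")
```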

  14. Comparison of Thermal Detector Arrays for Off-Axis THz Holography and Real-Time THz Imaging

    PubMed Central

    Hack, Erwin; Valzania, Lorenzo; Gäumann, Gregory; Shalaby, Mostafa; Hauri, Christoph P.; Zolliker, Peter

    2016-01-01

    In terahertz (THz) materials science, imaging by scanning prevails when low power THz sources are used. However, the application of array detectors operating with high power THz sources is increasingly reported. We compare the imaging properties of four different array detectors that are able to record THz radiation directly. Two micro-bolometer arrays are designed for infrared imaging in the 8–14 μm wavelength range, but are based on different absorber materials (i) vanadium oxide; (ii) amorphous silicon; (iii) a micro-bolometer array optimized for recording THz radiation based on silicon nitride; and (iv) a pyroelectric array detector for THz beam profile measurements. THz wavelengths of 96.5 μm, 118.8 μm, and 393.6 μm from a powerful far infrared laser were used to assess the technical performance in terms of signal to noise ratio, detector response and detectivity. The usefulness of the detectors for beam profiling and digital holography is assessed. Finally, the potential and limitation for real-time digital holography are discussed. PMID:26861341

  15. Comparison of Thermal Detector Arrays for Off-Axis THz Holography and Real-Time THz Imaging.

    PubMed

    Hack, Erwin; Valzania, Lorenzo; Gäumann, Gregory; Shalaby, Mostafa; Hauri, Christoph P; Zolliker, Peter

    2016-02-06

    In terahertz (THz) materials science, imaging by scanning prevails when low-power THz sources are used. However, the application of array detectors operating with high-power THz sources is increasingly reported. We compare the imaging properties of four different array detectors that are able to record THz radiation directly: two micro-bolometer arrays designed for infrared imaging in the 8-14 μm wavelength range but based on different absorber materials, (i) vanadium oxide and (ii) amorphous silicon; (iii) a micro-bolometer array optimized for recording THz radiation, based on silicon nitride; and (iv) a pyroelectric array detector for THz beam profile measurements. THz wavelengths of 96.5 μm, 118.8 μm, and 393.6 μm from a powerful far-infrared laser were used to assess the technical performance in terms of signal-to-noise ratio, detector response, and detectivity. The usefulness of the detectors for beam profiling and digital holography is assessed. Finally, the potential and limitations for real-time digital holography are discussed.

  16. Measurements of multiple gas parameters in a pulsed-detonation combustor using time-division-multiplexed Fourier-domain mode-locked lasers.

    PubMed

    Caswell, Andrew W; Roy, Sukesh; An, Xinliang; Sanders, Scott T; Schauer, Frederick R; Gord, James R

    2013-04-20

    Hyperspectral absorption spectroscopy is being used to monitor gas temperature, velocity, pressure, and H2O mole fraction in a research-grade pulsed-detonation combustor (PDC) at the Air Force Research Laboratory. The hyperspectral source employed is termed the TDM 3-FDML because it consists of three time-division-multiplexed (TDM) Fourier-domain mode-locked (FDML) lasers. This optical-fiber-based source monitors sufficient spectral information in the H2O absorption spectrum near 1350 nm to permit measurements over the wide range of conditions encountered throughout the PDC cycle. Doppler velocimetry based on absorption features is accomplished using a counterpropagating beam approach that is designed to minimize common-mode flow noise. The PDC in this study is operated in two configurations: one in which the combustion tube exhausts directly to the ambient environment and another in which it feeds an automotive-style turbocharger to assess the performance of a detonation-driven turbine. Because the enthalpy flow [kJ/s] is important in assessing the performance of the PDC in various configurations, it is calculated from the measured gas properties.

  17. Brain-computer interaction research at the Computer Vision and Multimedia Laboratory, University of Geneva.

    PubMed

    Pun, Thierry; Alecu, Teodor Iulian; Chanel, Guillaume; Kronegg, Julien; Voloshynovskiy, Sviatoslav

    2006-06-01

    This paper describes the work being conducted in the domain of brain-computer interaction (BCI) at the Multimodal Interaction Group, Computer Vision and Multimedia Laboratory, University of Geneva, Geneva, Switzerland. The application focus of this work is on multimodal interaction rather than on rehabilitation, that is, how to augment classical interaction by means of physiological measurements. Three main research topics are addressed. The first concerns the more general problem of brain source activity recognition from EEGs. In contrast with classical deterministic approaches, we studied iterative, robust, stochastic reconstruction procedures that model source and noise statistics to overcome known limitations of current techniques. We also developed procedures for optimal electroencephalogram (EEG) sensor system design in terms of placement and number of electrodes. The second topic is the study of BCI protocols and performance from an information-theoretic point of view. Various information rate measurements have been compared for assessing BCI abilities. The third research topic concerns the use of EEG and other physiological signals for assessing a user's emotional status.

  18. Identify and Quantify the Mechanistic Sources of Sensor Performance Variation Between Individual Sensors SN1 and SN2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Diaz, Aaron A.; Baldwin, David L.; Cinson, Anthony D.

    2014-08-06

    This Technical Letter Report satisfies the M3AR-14PN2301022 milestone, and is focused on identifying and quantifying the mechanistic sources of sensor performance variation between individual 22-element, linear phased-array sensor prototypes, SN1 and SN2. This effort constitutes an iterative evolution that supports the longer-term goal of producing and demonstrating a pre-manufacturing prototype ultrasonic probe that possesses the fundamental performance characteristics necessary to enable the development of a high-temperature sodium-cooled fast reactor inspection system. The scope of the work for this portion of the PNNL effort conducted in FY14 includes performing a comparative evaluation and assessment of the performance characteristics of the SN1 and SN2 22-element PA-UT probes manufactured at PNNL. Key transducer performance parameters, such as sound field dimensions, resolution capabilities, frequency response, and bandwidth, are used as a metric for the comparative evaluation and assessment of the SN1 and SN2 engineering test units.

  19. Modeling the transport of PCDD/F compounds in a contaminated river and the possible influence of restoration dredging on calculated fluxes.

    PubMed

    Malve, Olli; Salo, Simo; Verta, Matti; Forsius, John

    2003-08-01

    River Kymijoki, the fourth largest river in Finland, has been heavily polluted by pulp mill effluents as well as by chemical industry. Loading has been reduced considerably, although remains of past emissions still exist in river sediments. The sediments are highly contaminated with polychlorinated dibenzo-p-dioxins (PCDDs), polychlorinated dibenzofurans (PCDFs), polychlorinated diphenyl ethers (PCDEs), and mercury originating from production of the chlorophenolic wood preservative (Ky-5) and other sources. The objective of this study was to simulate the transport of these PCDD/F compounds with a one-dimensional flow and transport model and to assess the impact of restoration dredging. Using the estimated trend in PCDD/F loading, downstream concentrations were calculated until 2020. If contaminated sediments are removed by dredging, the temporary increase of PCDD/F concentrations in downstream water and surface sediments will be within acceptable limits. Long-term predictions indicated only a minor decrease in surface sediment concentrations but a major decrease if the most contaminated sediments close to the emission source were removed. A more detailed assessment of the effects is suggested.

  20. A study of infrasound propagation based on high-order finite difference solutions of the Navier-Stokes equations.

    PubMed

    Marsden, O; Bogey, C; Bailly, C

    2014-03-01

    The feasibility of using numerical simulation of fluid dynamics equations for the detailed description of long-range infrasound propagation in the atmosphere is investigated. The two-dimensional (2D) Navier-Stokes equations are solved via high-fidelity spatial finite differences and Runge-Kutta time integration, coupled with a shock-capturing filter procedure allowing large amplitudes to be studied. The accuracy of acoustic prediction over long distances with this approach is first assessed in the linear regime thanks to two test cases featuring an acoustic source placed above a reflective ground in a homogeneous and a weakly inhomogeneous medium, solved for a range of grid resolutions. An atmospheric model which can account for realistic features affecting acoustic propagation is then described. A 2D study of the effect of source amplitude on signals recorded at ground level at varying distances from the source is carried out. Modifications both in terms of waveforms and arrival times are described.
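    As a concrete illustration of the numerical ingredients named above (high-order centred finite differences combined with Runge-Kutta time stepping), the sketch below integrates 1-D linearised acoustics with periodic boundaries. It is a toy reduction for illustration only, not the 2-D Navier-Stokes solver used in the study, and all grid and pulse parameters are arbitrary.

```python
import numpy as np

# Toy 1-D linearised acoustics solved with fourth-order central differences
# in space and classical RK4 in time (illustrative only; not the study's
# 2-D Navier-Stokes solver).

rho0, c0 = 1.2, 340.0              # ambient density (kg/m3) and sound speed (m/s)
length, nx = 10_000.0, 2000        # domain length (m), number of grid points
dx = length / nx
x = np.arange(nx) * dx
dt = 0.4 * dx / c0                 # CFL-limited time step
nsteps = 2000

def ddx(f):
    """Fourth-order central difference with periodic wrap-around."""
    return (-np.roll(f, -2) + 8.0 * np.roll(f, -1)
            - 8.0 * np.roll(f, 1) + np.roll(f, 2)) / (12.0 * dx)

def rhs(state):
    p, u = state
    dpdt = -rho0 * c0 ** 2 * ddx(u)   # linearised pressure equation
    dudt = -ddx(p) / rho0             # linearised momentum equation
    return np.array([dpdt, dudt])

# Gaussian pressure pulse as the acoustic source
p0 = 10.0 * np.exp(-((x - length / 2.0) / 200.0) ** 2)
state = np.array([p0, np.zeros(nx)])

for _ in range(nsteps):               # classical fourth-order Runge-Kutta
    k1 = rhs(state)
    k2 = rhs(state + 0.5 * dt * k1)
    k3 = rhs(state + 0.5 * dt * k2)
    k4 = rhs(state + dt * k3)
    state = state + dt * (k1 + 2.0 * k2 + 2.0 * k3 + k4) / 6.0

print("max |p| after propagation:", float(np.abs(state[0]).max()))
```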

  1. Laboratory-based micro-X-ray fluorescence setup using a von Hamos crystal spectrometer and a focused beam X-ray tube

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kayser, Y., E-mail: yves.kayser@psi.ch; Paul Scherrer Institut, 5232 Villigen-PSI; Błachucki, W.

    2014-04-15

    The high-resolution von Hamos bent crystal spectrometer of the University of Fribourg was upgraded with a focused X-ray beam source with the aim of performing micro-sized X-ray fluorescence (XRF) measurements in the laboratory. The focused X-ray beam source integrates a collimating optics mounted on a low-power micro-spot X-ray tube and a focusing polycapillary half-lens placed in front of the sample. The performance of the setup was probed in terms of spatial and energy resolution. In particular, the fluorescence intensity and energy resolution of the von Hamos spectrometer equipped with the novel micro-focused X-ray source and a standard high-power water-cooled X-ray tube were compared. The XRF analysis capability of the new setup was assessed by measuring the dopant distribution within the core of Er-doped SiO2 optical fibers.

  2. Assessment of the instantaneous unit hydrograph derived from the theory of topologically random networks

    USGS Publications Warehouse

    Karlinger, M.R.; Troutman, B.M.

    1985-01-01

    An instantaneous unit hydrograph (IUH) based on the theory of topologically random networks (topological IUH) is evaluated in terms of sets of basin characteristics and hydraulic parameters. Hydrographs were computed using two linear routing methods for each of two drainage basins in the southeastern United States and are the basis of comparison for the topological IUHs. Elements in the sets of basin characteristics for the topological IUHs are the number of first-order streams only (N), or the number of sources together with the number of channel links in the topological diameter (N, D); the hydraulic parameters are values of the celerity and diffusivity constant. Sensitivity analyses indicate that the mean celerity of the internal links in the network is the critical hydraulic parameter for determining the shape of the topological IUH, while the diffusivity constant has minimal effect on the topological IUH. Asymptotic results (source-only) indicate the number of sources need not be large to approximate the topological IUH with the Weibull probability density function.

  3. Attention during memory retrieval enhances future remembering.

    PubMed

    Dudukovic, Nicole M; Dubrow, Sarah; Wagner, Anthony D

    2009-10-01

    Memory retrieval is a powerful learning event that influences whether an experience will be remembered in the future. Although retrieval can succeed in the presence of distraction, dividing attention during retrieval may reduce the power of remembering as an encoding event. In the present experiments, participants studied pictures of objects under full attention and then engaged in item recognition and source memory retrieval under full or divided attention. Two days later, a second recognition and source recollection test assessed the impact of attention during initial retrieval on long-term retention. On this latter test, performance was superior for items that had been tested initially under full versus divided attention. More importantly, even when items were correctly recognized on the first test, divided attention reduced the likelihood of subsequent recognition on the second test. The same held true for source recollection. Additionally, foils presented during the first test were also less likely to be later recognized if they had been encountered initially under divided attention. These findings demonstrate that attentive retrieval is critical for learning through remembering.

  4. Using stable isotopes and functional wood anatomy to identify underlying mechanisms of drought tolerance in different provenances of lodgepole pine

    NASA Astrophysics Data System (ADS)

    Isaac-Renton, Miriam; Montwé, David; Hamann, Andreas; Spiecker, Heinrich; Cherubini, Paolo; Treydte, Kerstin

    2016-04-01

    Choosing drought-tolerant seed sources for reforestation may help adapt forests to climate change. By combining dendroecological growth analysis with a long-term provenance trial, we assessed growth and drought tolerance of different populations of a wide-ranging conifer, lodgepole pine (Pinus contorta). This experimental design simulated a climate warming scenario through southward seed transfer, and an exceptional drought also occurred in 2002. We felled over 500 trees, representing 23 seed sources, which were grown for 32 years at three warm, dry sites in southern British Columbia, Canada. Northern populations showed poor growth and drought tolerance. These seed sources therefore appear to be especially at risk under climate change. Before recommending assisted migration of southern seeds towards the north, however, it is important to understand the physiological mechanisms underlying these responses. We combine functional wood anatomy with a dual-isotope approach to evaluate the mechanisms underlying these drought responses.

  5. Physics design point for a 1 MW fusion neutron source

    NASA Astrophysics Data System (ADS)

    Woodruff, Simon; Melnik, Paul; Sieck, Paul; Stuber, James; Romero-Talamas, Carlos; O'Bryan, John; Miller, Ronald

    2016-10-01

    We are developing a design point for a spheromak experiment heated by adiabatic compression for use as a compact neutron source. We utilize the CORSICA and NIMROD MHD codes as well as analytic modeling to assess a concept with target parameters R0 = 0.5 m, Rf = 0.17 m, T0 = 1 keV, Tf = 8 keV, n0 = 2e20 m^-3, and nf = 5e21 m^-3, with a radial convergence of C = R0/Rf = 3. We present results from CORSICA showing the placement of coils and passive structure to ensure stability during compression. We specify target parameters for the compression in terms of plasma beta, formation efficiency, and energy confinement. We present results of simulations of magnetic compression using the NIMROD code to examine the role of rotation on the stability and confinement of the spheromak as it is compressed. Supported by DARPA Grant N66001-14-1-4044 and IAEA CRP on Compact Fusion Neutron Sources.
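    The quoted targets are consistent with simple adiabatic scaling, which can be checked in a few lines. This is an illustrative cross-check only, not part of the CORSICA/NIMROD analysis, and it assumes ideal adiabatic compression with gamma = 5/3 for a monatomic plasma.

```python
# Consistency check of the target parameters against ideal adiabatic scaling
# T ~ n**(gamma - 1) with gamma = 5/3 (illustrative only).
gamma = 5.0 / 3.0
n0, nf = 2e20, 5e21            # initial and final densities, m^-3
T0 = 1.0                       # initial temperature, keV

Tf_adiabatic = T0 * (nf / n0) ** (gamma - 1.0)
print(f"adiabatic T_f ~ {Tf_adiabatic:.1f} keV (stated target: 8 keV)")
# (5e21 / 2e20) ** (2/3) = 25 ** (2/3) ~ 8.5 keV, close to the 8 keV target.
```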

  6. Effect of end-stage renal disease on long-term survival after a first-ever mechanical ventilation: a population-based study.

    PubMed

    Chen, Chin-Ming; Lai, Chih-Cheng; Cheng, Kuo-Chen; Weng, Shih-Feng; Liu, Wei-Lun; Shen, Hsiu-Nien

    2015-10-01

    Patients with end-stage renal disease (ESRD(Pos)) usually have multiple comorbidities and are predisposed to acute organ failure and in-hospital mortality. We assessed the effect of ESRD on the poorly understood long-term mortality risk after a first-ever mechanical ventilation (1-MV) for acute respiratory failure. The data source was Taiwan's National Health Insurance (NHI) Research Database. All patients given a 1-MV between 1999 and 2008 from one million randomly selected NHI beneficiaries were identified (n = 38,659). Patients with or without ESRD (ESRD(Neg)) after a 1-MV between 1999 and 2008 were retrospectively compared and followed from the index admission date to death or the end of 2011. ESRD(Pos) patients (n = 1185; mean age: 65.9 years; men: 51.5 %) were individually matched to ESRD(Neg) patients (ratio: 1:8) using a propensity score method. The primary outcome was death after a 1-MV. A Cox proportional hazards regression model was used to assess how ESRD affected the mortality risk after a 1-MV. The baseline characteristics of the two cohorts were balanced, but the incidence of mortality was higher in ESRD(Pos) patients than in ESRD(Neg) patients (342.30 versus 179.67 per 1000 person-years; P <0.001; covariate-adjusted hazard ratio: 1.43; 95 % confidence interval: 1.31-1.51). For patients who survived until discharge, ESRD was not associated with long-term (>4 years) mortality. ESRD increased the mortality risk after a 1-MV, but long-term survival seemed similar.
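    A minimal sketch of the type of analysis described (a Cox proportional-hazards fit on a matched cohort) is given below using the Python lifelines package; it is not the study's actual code, and the file and column names are hypothetical.

```python
# Illustrative Cox proportional-hazards fit on a matched cohort
# (not the study's code; file and column names are hypothetical).
import pandas as pd
from lifelines import CoxPHFitter

# One row per patient: follow-up time, death indicator, ESRD status, covariates
df = pd.read_csv("matched_cohort.csv")

cph = CoxPHFitter()
cph.fit(df[["followup_years", "died", "esrd", "age", "male"]],
        duration_col="followup_years", event_col="died")
cph.print_summary()   # the coefficient on 'esrd' gives the adjusted hazard ratio
```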

  7. The Role of Near-Earth Asteroids in Long-Term Platinum Supply

    NASA Astrophysics Data System (ADS)

    Blair, B. R.

    2000-01-01

    High-grade platinum-group metal concentrations have been identified in an abundant class of near-Earth asteroids known as LL Chondrites. The potential existence of a high-value asteroid-derived mineral product is examined from an economic perspective to assess the possible impacts on long-term precious metal supply. It is hypothesized that extraterrestrial sources of platinum group metals will become available in the global marketplace in a 20-year time frame, based on current trends of growth in technology and increasing levels of human activities in near-Earth space. Current and projected trends in platinum supply and demand are cited from the relevant literature to provide an economic context and provide an example for evaluating the economic potential of future asteroid-derived precious and strategic metals.

  8. Auditing the multiply-related concepts within the UMLS

    PubMed Central

    Mougin, Fleur; Grabar, Natalia

    2014-01-01

    Objective This work focuses on multiply-related Unified Medical Language System (UMLS) concepts, that is, concepts associated through multiple relations. The relations involved in such situations are audited to determine whether they are provided by source vocabularies or result from the integration of these vocabularies within the UMLS. Methods We study the compatibility of the multiple relations which associate the concepts under investigation and try to explain the reason why they co-occur. Towards this end, we analyze the relations both at the concept and term levels. In addition, we randomly select 288 concepts associated through contradictory relations and manually analyze them. Results At the UMLS scale, only 0.7% of combinations of relations are contradictory, while homogeneous combinations are observed in one-third of situations. At the scale of source vocabularies, one-third do not contain more than one relation between the concepts under investigation. Among the remaining source vocabularies, seven of them mainly present multiple non-homogeneous relations between terms. Analysis at the term level also shows that only in a quarter of cases are the source vocabularies responsible for the presence of multiply-related concepts in the UMLS. These results are available at: http://www.isped.u-bordeaux2.fr/ArticleJAMIA/results_multiply_related_concepts.aspx. Discussion Manual analysis was useful to explain the conceptualization difference in relations between terms across source vocabularies. The exploitation of source relations was helpful for understanding why some source vocabularies describe multiple relations between a given pair of terms. PMID:24464853

  9. Drinking water sources, availability, quality, access and utilization for goats in the Karak Governorate, Jordan.

    PubMed

    Al-Khaza'leh, Ja'far Mansur; Reiber, Christoph; Al Baqain, Raid; Valle Zárate, Anne

    2015-01-01

    Goat production is an important agricultural activity in Jordan. The country is one of the poorest countries in the world in terms of water scarcity. Provision of sufficient quantity of good quality drinking water is important for goats to maintain feed intake and production. This study aimed to evaluate the seasonal availability and quality of goats' drinking water sources, accessibility, and utilization in different zones in the Karak Governorate in southern Jordan. Data collection methods comprised interviews with purposively selected farmers and quality assessment of water sources. The provision of drinking water was considered as one of the major constraints for goat production, particularly during the dry season (DS). Long travel distances to the water sources, waiting time at watering points, and high fuel and labor costs were the key reasons associated with the problem. All the values of water quality (WQ) parameters were within acceptable limits of the guidelines for livestock drinking WQ with exception of iron, which showed slightly elevated concentration in one borehole source in the DS. These findings show that water shortage is an important problem leading to consequences for goat keepers. To alleviate the water shortage constraint and in view of the depleted groundwater sources, alternative water sources at reasonable distance have to be tapped and monitored for water quality and more efficient use of rainwater harvesting systems in the study area is recommended.

  10. Management of Ultimate Risk of Nuclear Power Plants by Source Terms - Lessons Learned from the Chernobyl Accident

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Genn Saji

    2006-07-01

    The term 'ultimate risk' is used here to describe the probabilities and radiological consequences that should be incorporated in siting, containment design, and accident management of nuclear power plants for hypothetical accidents. It is closely related to the source terms specified in siting criteria, which assure an adequate separation of the radioactive inventories of the plants from the public in the event of a hypothetical and severe accident situation. The author would like to point out that current source terms, which are based on information from the Windscale accident (1957) through TID-14844, are very outdated and do not incorporate lessons learned from either the Three Mile Island (TMI, 1979) or the Chernobyl (1986) accident, two of the most severe accidents ever experienced. As a result of the observations of benign radionuclide releases at TMI, the technical community in the US felt that a more realistic evaluation of severe reactor accident source terms was necessary. Against this background, the 'source term research project' was organized in 1984 to respond to these challenges. Unfortunately, soon after the final report from this project was released, the Chernobyl accident occurred. Due to the enormous consequences induced by that accident, the once-optimistic perspectives on establishing a more realistic source term were completely shattered. The Chernobyl accident, with its human death toll and the dispersion of a large part of the fission fragment inventories into the environment, created a significant degradation in the public's acceptance of nuclear energy throughout the world. In spite of this, nuclear communities have been prudent in responding to the public's anxiety towards the ultimate safety of nuclear plants, since there still remained many unknown points revolving around the mechanism of the Chernobyl accident. In order to resolve some of these mysteries, the author has performed a scoping study of the dispersion and deposition mechanisms of fuel particles and fission fragments during the initial phase of the Chernobyl accident. Through this study, it is now possible to generally reconstruct the radiological consequences by using a dispersion calculation technique, combined with the meteorological data at the time of the accident and the land contamination densities of 137Cs measured and reported around the Chernobyl area. Although it is challenging to incorporate lessons learned from the Chernobyl accident into the source term issues, the author has already developed an example of safety goals by incorporating the radiological consequences of the accident. The example provides safety goals by specifying source term releases in a graded approach in combination with probabilities, i.e. risks. The author believes that the future source term specification should be directly linked with safety goals. (author)

  11. Long-term ecosystem monitoring and assessment of the Detroit River and Western Lake Erie.

    PubMed

    Hartig, J H; Zarull, M A; Ciborowski, J J H; Gannon, J E; Wilke, E; Norwood, G; Vincent, A N

    2009-11-01

    Over 35 years of US and Canadian pollution prevention and control efforts have led to substantial improvements in environmental quality of the Detroit River and western Lake Erie. However, the available information also shows that much remains to be done. Improvements in environmental quality have resulted in significant ecological recovery, including increasing populations of bald eagles (Haliaeetus leucocephalus), peregrine falcons (Falco peregrinus), lake sturgeon (Acipenser fulvescens), lake whitefish (Coregonus clupeaformis), walleye (Sander vitreus), and burrowing mayflies (Hexagenia spp.). Although this recovery is remarkable, many challenges remain, including population growth, transportation expansion, and land use changes; nonpoint source pollution; toxic substances contamination; habitat loss and degradation; introduction of exotic species; and greenhouse gases and global warming. Research/monitoring must be sustained for effective management. Priority research and monitoring needs include: demonstrating and quantifying cause-effect relationships; establishing quantitative endpoints and desired future states; determining cumulative impacts and how indicators relate; improving modeling and prediction; prioritizing geographic areas for protection and restoration; and fostering long-term monitoring for adaptive management. Key management agencies, universities, and environmental and conservation organizations should pool resources and undertake comprehensive and integrative assessments of the health of the Detroit River and western Lake Erie at least every 5 years to practice adaptive management for long-term sustainability.

  12. Assessments of a Turbulence Model Based on Menter's Modification to Rotta's Two-Equation Model

    NASA Technical Reports Server (NTRS)

    Abdol-Hamid, Khaled S.

    2013-01-01

    The main objective of this paper is to construct a turbulence model with a more reliable second equation simulating length scale. In the present paper, we assess the length scale equation based on Menter's modification to Rotta's two-equation model. Rotta shows that a reliable second equation can be formed in an exact transport equation from the turbulent length scale L and kinetic energy. Rotta's equation is well suited for term-by-term modeling and shows some interesting features compared to other approaches. The most important difference is that the formulation leads to a natural inclusion of higher-order velocity derivatives into the source terms of the scale equation, which has the potential to enhance the capability of Reynolds-averaged Navier-Stokes (RANS) to simulate unsteady flows. The model is implemented in the PAB3D solver with complete formulation, usage methodology, and validation examples to demonstrate its capabilities. The detailed studies include grid convergence. Near-wall and shear-flow cases are documented and compared with experimental and Large Eddy Simulation (LES) data. The results from this formulation are as good as or better than those of the well-known SST turbulence model and much better than k-epsilon results. Overall, the study provides useful insights into the model's capability in predicting attached and separated flows.

  13. The long-term problems of contaminated land: Sources, impacts and countermeasures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baes, C.F. III

    1986-11-01

    This report examines the various sources of radiological land contamination; its extent; its impacts on man, agriculture, and the environment; countermeasures for mitigating exposures; radiological standards; alternatives for achieving land decontamination and cleanup; and possible alternatives for utilizing the land. The major potential sources of extensive long-term land contamination with radionuclides, in order of decreasing extent, are nuclear war, detonation of a single nuclear weapon (e.g., a terrorist act), serious reactor accidents, and nonfission nuclear weapons accidents that disperse the nuclear fuels (termed "broken arrows").

  14. A System of Systems (SoS) Approach to transforming to a low carbon resource-efficient energy system: Insights for the European Union (EU)

    NASA Astrophysics Data System (ADS)

    Madani, K.; Jess, T.; Mahlooji, M.; Ristic, B.

    2015-12-01

    The world's energy sector is experiencing a serious transition from reliance on fossil fuel energy sources to extensive reliance on renewable energies. Europe is leading the way in this transition to a low carbon economy in an attempt to keep climate change below 2 °C. Member States have committed themselves to reducing greenhouse gas emissions by 20% and increasing the share of renewables in the EU's energy mix to 20% by 2020. The EU has now gone a step further with the objective of reducing greenhouse gas emissions by 80-95% by 2050. Nevertheless, the short-term focus of the European Commission is on "cost-efficient ways" to cut its greenhouse gas emissions, which overlooks the unintended impacts of a large expansion of low-carbon energy technologies on major natural resources such as water and land. This study uses the "System of Systems (SoS) Approach to Energy Sustainability Assessment" (Hadian and Madani, 2015) to evaluate the Relative Aggregate Footprint (RAF) of energy sources in different European Union (EU) member states. RAF reflects the overall resource-use efficiency of energy sources with respect to four criteria: carbon footprint, water footprint, land footprint, and economic cost. Weights are assigned to the four resource-use efficiency criteria based on each member state's varying natural and economic resources to examine changes in the desirability of energy sources under regional resource availability conditions, and to help evaluate the overall resource-use efficiency of the EU's energy portfolio. A longer-term strategy in Europe has been devised under the "Resource Efficient Europe" flagship initiative, intended to put the EU on course to using resources in a sustainable way. This study will highlight the resource efficiency of the EU's energy sector in order to assist in a sustainable transition to a low carbon economy in Europe. Reference: Hadian S, Madani K (2015) A System of Systems Approach to Energy Sustainability Assessment: Are All Renewables Really Green? Ecological Indicators, 52, 194-206.
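    The core of the RAF calculation is a weighted aggregation of normalised footprint criteria. A minimal sketch of that aggregation step is shown below; the technologies, footprint values, and weights are invented for illustration, and the published method involves additional normalisation and portfolio-level steps.

```python
import numpy as np

# Illustrative weighted aggregation of normalised footprints into a single
# RAF-style score (made-up numbers; not the published dataset or weights).

# Rows: candidate energy technologies; columns: carbon, water, land, cost,
# each normalised to [0, 1], where 1 is the worst performer in the portfolio.
footprints = np.array([
    [0.10, 0.30, 0.25, 0.40],   # hypothetical "wind"
    [0.05, 0.80, 0.60, 0.55],   # hypothetical "hydro"
    [0.90, 0.45, 0.10, 0.35],   # hypothetical "gas"
])

# Member-state-specific weights reflecting the relative scarcity of each
# resource; they must sum to one.
weights = np.array([0.4, 0.3, 0.2, 0.1])

raf = footprints @ weights      # lower score = more resource-efficient
for name, score in zip(["wind", "hydro", "gas"], raf):
    print(f"{name:>5s}: RAF = {score:.2f}")
```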

  15. Long-term chemical analysis and organic aerosol source apportionment at nine sites in central Europe: source identification and uncertainty assessment

    NASA Astrophysics Data System (ADS)

    Daellenbach, Kaspar R.; Stefenelli, Giulia; Bozzetti, Carlo; Vlachou, Athanasia; Fermo, Paola; Gonzalez, Raquel; Piazzalunga, Andrea; Colombi, Cristina; Canonaco, Francesco; Hueglin, Christoph; Kasper-Giebl, Anne; Jaffrezo, Jean-Luc; Bianchi, Federico; Slowik, Jay G.; Baltensperger, Urs; El-Haddad, Imad; Prévôt, André S. H.

    2017-11-01

    Long-term monitoring of organic aerosol is important for epidemiological studies, validation of atmospheric models, and air quality management. In this study, we apply a recently developed filter-based offline methodology using an aerosol mass spectrometer (AMS) to investigate the regional and seasonal differences of contributing organic aerosol sources. We present offline AMS measurements for particulate matter smaller than 10 µm at nine stations in central Europe with different exposure characteristics for the entire year of 2013 (819 samples). The focus of this study is a detailed source apportionment analysis (using positive matrix factorization, PMF) including in-depth assessment of the related uncertainties. Primary organic aerosol (POA) is separated into three components: hydrocarbon-like OA related to traffic emissions (HOA), cooking OA (COA), and biomass burning OA (BBOA). We observe enhanced production of secondary organic aerosol (SOA) in summer, following the increase in biogenic emissions with temperature (summer oxygenated OA, SOOA). In addition, a SOA component was extracted that correlated with an anthropogenic secondary inorganic species that is dominant in winter (winter oxygenated OA, WOOA). A factor (sulfur-containing organic, SC-OA) explaining sulfur-containing fragments (CH3SO2+), which has an event-driven temporal behaviour, was also identified. The relative yearly average factor contributions range from 4 to 14 % for HOA, from 3 to 11 % for COA, from 11 to 59 % for BBOA, from 5 to 23 % for SC-OA, from 14 to 27 % for WOOA, and from 15 to 38 % for SOOA. The uncertainty of the relative average factor contribution lies between 2 and 12 % of OA. At the sites north of the alpine crest, the sum of HOA, COA, and BBOA (POA) contributes less to OA (POA / OA = 0.3) than at the southern alpine valley sites (0.6). BBOA is the main contributor to POA with 87 % in alpine valleys and 42 % north of the alpine crest. Furthermore, the influence of primary biological particles (PBOAs), not resolved by PMF, is estimated and could contribute significantly to OA in PM10.
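    The decomposition at the heart of this source apportionment factorises the measured spectra matrix into factor time series and factor profiles. The sketch below uses plain non-negative matrix factorisation as an unweighted stand-in for PMF (the study applied PMF with measurement-uncertainty weighting, e.g. via ME-2); the data are synthetic and the dimensions are only indicative.

```python
import numpy as np
from sklearn.decomposition import NMF

# Unweighted NMF as a conceptual stand-in for PMF: X ~ G @ F, with
# non-negative factor time series G and mass-spectral profiles F.
# Synthetic data; PMF itself additionally weights residuals by the
# measurement uncertainties.
rng = np.random.default_rng(0)
X = rng.random((819, 300))          # filter samples x AMS fragment ions

model = NMF(n_components=6, init="nndsvda", max_iter=500, random_state=0)
G = model.fit_transform(X)          # factor contributions over time
F = model.components_               # factor spectral profiles
print(G.shape, F.shape)             # (819, 6) and (6, 300)

# Each factor would then be attributed to a source class (HOA, COA, BBOA,
# WOOA, SOOA, SC-OA) from its profile and correlations with external tracers.
```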

  16. How to Manual: How to Update and Enhance Your Local Source Water Protection Assessments

    EPA Pesticide Factsheets

    Describes opportunities for improving source water assessments performed under section 1453 of the Safe Drinking Water Act, including local delineations, potential contaminant source inventories, and susceptibility determinations.

  17. Assessment of Vegetation Responses and Sensitivity to the Millennium Drought in Australia

    NASA Astrophysics Data System (ADS)

    Jiao, T.; Williams, C. A.

    2017-12-01

    During the period from 1997 to 2009, Australia experienced one of its most severe and persistent droughts, known as the Millennium Drought (MD). Major water shortages were reported across the Australian continent, as well as widespread tree mortality during and after this drought event. Given the projection of hotter and drier conditions for much of the continent (Hughes 2003), it is critical to analyze the impacts of climate extremes like the MD as an indicator of possible impacts of future trends. A few drought assessments have been performed for the MD, but their use of single-source remote sensing data such as vegetation indices makes it difficult to produce a comprehensive understanding of drought responses for the diverse ecosystems in Australia. Furthermore, methods adopted in past drought assessments did not distinguish vegetation responses to drought events with different intensity, duration, and sequence, which are critically important in determining the magnitude of vegetation responses to drought. Here, multi-source remote sensing datasets and an event-based drought assessment method were employed to assess the impacts of the MD on vegetation in Australia in terms of magnitude and sensitivity. Vegetation variables examined include the fraction of photosynthetically absorbed radiation (Fpar), vegetation optical depth (VOD), and aboveground biomass (AGB). Drought indicators were calculated based on precipitation and potential evapotranspiration. Results show that most of Eastern Australia experienced abnormal water deficit during the MD and that drought intensity was greatest in humid regions. The decline in AGB varied consistently with drought intensity across aridity levels. Drought impacts on Fpar and VOD were greatest at intermediate dryness and for woodier ecosystems, with impacts appearing in Fpar before VOD. Drought sensitivity was also greatest at intermediate dryness and for woodier ecosystems. The small difference in drought sensitivity of Fpar and VOD across biomes suggests that trees, shrubs, and herbaceous vegetation are equally vulnerable to canopy dieback, while the high drought sensitivity of trees in terms of AGB implies that a large amount of carbon could be released to the atmosphere if an intense, long-duration drought occurs in forested areas.

  18. Impacts of Coal Seam Gas (Coal Bed Methane) Extraction on Water Resources in Australia

    NASA Astrophysics Data System (ADS)

    Post, David

    2017-04-01

    While extraction of methane from shale gas deposits has been the principal source of the recent expansion of the industry in the United States, in Australia extraction of methane from coal bed methane deposits (termed 'coal seam gas' in Australia) has been the focus to date. The two sources of methane share many of the same characteristics, including the potential requirement for hydraulic fracturing. However, as coal seam gas deposits generally occur at shallower depths than shale gas, the potential impacts of extraction on surface and groundwater resources may be of even greater concern. In Australia, an Independent Expert Scientific Committee (IESC) has been established to provide scientific advice to federal and state government regulators on the impact that coal seam gas and large coal mining developments may have on water resources. This advice is provided to enable decisions to be informed by the best available science about the potential water-related impacts associated with these developments. To support this advice, the Australian Government Department of the Environment has implemented a programme of research termed 'bioregional assessments' to investigate these potential impacts. A bioregional assessment is defined as a scientific analysis of the ecology, hydrology, geology and hydrogeology of a bioregion with explicit assessment of the potential direct, indirect and cumulative impacts of coal seam gas and large coal mining development on water resources. These bioregional assessments are currently being carried out across large portions of eastern Australia underlain by coal reserves. Further details of the programme and results to date can be found at http://www.bioregionalassessments.gov.au. The bioregional assessment programme has modelled the impacts of coal seam gas development on surface and groundwater resources in three regions of eastern Australia, namely the Clarence-Moreton, Gloucester, and Namoi regions. This presentation will describe the overall approach taken and discuss how the results of these modelling studies will be used to evaluate the impacts of the depressurisation of coal seams on ecological, economic, and socio-cultural assets that are dependent on surface and/or groundwater.

  19. Impacts of Coal Seam Gas (Coal Bed Methane) Extraction on Water Resources in Australia

    NASA Astrophysics Data System (ADS)

    Post, David

    2015-04-01

    While extraction of methane from shale gas deposits has been the principal source of the recent expansion of the industry in the United States and Europe, in Australia extraction of methane from coal bed methane deposits (termed 'coal seam gas' in Australia) has been the focus to date. The two sources of methane share many of the same characteristics, including the potential requirement for hydraulic fracturing. However, as coal seam gas deposits generally occur at shallower depths than shale gas, the potential impacts of extraction and hydraulic fracturing on surface and groundwater resources may be of even greater concern for coal seam gas than for shale gas. In Australia, an Independent Expert Scientific Committee (IESC) has been established to provide scientific advice to federal and state government regulators on the impact that coal seam gas and large coal mining developments may have on water resources. This advice is provided to enable decisions to be informed by the best available science about the potential water-related impacts associated with these developments. To support this advice, the Australian Government Department of the Environment has implemented a three-year programme of research termed 'bioregional assessments' to investigate these potential impacts. A bioregional assessment is defined as a scientific analysis of the ecology, hydrology, geology and hydrogeology of a bioregion with explicit assessment of the potential direct, indirect and cumulative impacts of coal seam gas and large coal mining development on water resources. These bioregional assessments are currently being carried out across large portions of eastern Australia underlain by coal reserves. Further details of the programme and results to date can be found at http://www.bioregionalassessments.gov.au. In this presentation, the methodology for undertaking bioregional assessments will be described and the application of this methodology to six priority bioregions in eastern Australia will be detailed. Results of the programme to date will be provided (being nearly two years into the three-year study), with a focus on the preliminary results of numerical groundwater modelling. Once completed, this modelling will be used to evaluate the impacts of the depressurisation of coal seams on aquifers and associated ecological, economic and socio-cultural water-dependent assets.

  20. Piloting water quality testing coupled with a national socioeconomic survey in Yogyakarta province, Indonesia, towards tracking of Sustainable Development Goal 6.

    PubMed

    Cronin, Aidan A; Odagiri, Mitsunori; Arsyad, Bheta; Nuryetty, Mariet Tetty; Amannullah, Gantjang; Santoso, Hari; Darundiyah, Kristin; Nasution, Nur 'Aisyah

    2017-10-01

    There remains a pressing need for systematic water quality monitoring strategies to assess drinking water safety and to track progress towards the Sustainable Development Goals (SDG). This study incorporated water quality testing into an existing national socioeconomic survey in Yogyakarta province, Indonesia; the first such study in Indonesia in terms of SDG tracking. Multivariate regression analysis assessed the association between faecal and nitrate contamination in drinking water sources and household drinking water, adjusted for wealth, education level, type of water source, and type of sanitation facility. The survey observed widespread faecal contamination in both sources of drinking water (89.2%, 95%CI: 86.9-91.5%; n=720) and household drinking water (67.1%, 95%CI: 64.1-70.1%; n=917) as measured by Escherichia coli. This was despite widespread improved drinking water source coverage (85.3%) and commonly self-reported boiling practices (82.2%). E. coli concentration levels in household drinking water were associated with wealth, education level of the household head, and type of water source (i.e. vendor water or local sources). Following the proposed SDG definition for Target 6.1 (water) and 6.2 (sanitation), the estimated proportion of households with access to safely managed drinking water and sanitation was 8.5% and 45.5%, respectively, in the study areas, indicating a substantial difference from improved drinking water (82.2%) and improved sanitation coverage (70.9%) as per the MDG targets. The greatest contamination and risk factors were found in the poorest households, indicating the urgent need for targeted and effective interventions there. There is suggested evidence that sub-surface leaching from on-site sanitation adversely impacts drinking water sources, which underscores the need for further technical assistance in promoting latrine construction. Urgent action is still needed to strengthen systematic monitoring efforts towards tracking SDG Goal 6. Copyright © 2017 Elsevier GmbH. All rights reserved.

  1. Source term evaluation for accident transients in the experimental fusion facility ITER

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Virot, F.; Barrachin, M.; Cousin, F.

    2015-03-15

    We have studied the transport and chemical speciation of radio-toxic and toxic species for an event of water ingress in the vacuum vessel of the experimental fusion facility ITER with the ASTEC code. In particular, our evaluation takes into account assessed thermodynamic data for the gaseous beryllium species. This study shows that deposited beryllium dusts of atomic Be and Be(OH)2 are formed. It also shows that Be(OT)2 could exist under some conditions in the drain tank. (authors)

  2. Natural and Induced Environment in Low Earth Orbit

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Badavi, Francis F.; Kim, Myung-Hee Y.; Clowdsley, Martha S.; Heinbockel, John H.; Cucinotta, Francis A.; Badhwar, Gautam D.; Atwell, William; Huston, Stuart L.

    2002-01-01

    The long-term exposure of astronauts on the developing International Space Station (ISS) requires an accurate knowledge of the internal exposure environment for human risk assessment and other onboard processes. The natural environment is moderated by the solar wind which varies over the solar cycle. The neutron environment within the Shuttle in low Earth orbit has two sources. A time dependent model for the ambient environment is used to evaluate the natural and induced environment. The induced neutron environment is evaluated using measurements on STS-31 and STS-36 near the 1990 solar maximum.

  3. Xanthos

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2017-05-30

    Xanthos is a Python package designed to quantify and analyze global water availability, historically and into the future, at 0.5° × 0.5° spatial resolution and a monthly time step under a changing climate. Its performance was also tested through real applications. It is open-source, extensible, and convenient for researchers who work on long-term climate data for studies of global water supply and the Global Change Assessment Model (GCAM). This package integrates inherent global gridded data maps, I/O modules, water-balance model modules, and diagnostics modules through a user-defined configuration.

  4. Using natural archives to track sources and long-term trends of pollution: some final thoughts and suggestions for future directions

    USGS Publications Warehouse

    Blais, Jules M.; Rosen, Michael R.; Smol, John P.

    2015-01-01

    Newly produced contaminants, as well as some so-called legacy contaminants, continue to be released into the environment at an accelerated rate. Given the general lack of integrated, direct monitoring programs, the use of natural archival records of contaminants will almost certainly continue to increase. We conclude this volume with a short chapter highlighting some of our final thoughts, with a focus on a call to action to develop and apply methodologies to assess the fidelity of the archival record.

  5. A recent assessment of cocoa and pesticides in Brazil: an unhealthy blend for plantation workers.

    PubMed

    Hay, A

    1991-07-01

    Cocoa is the major source of income for the Brazilian state of Bahia, and some 2.3 million people rely on the crop for their livelihood. Most cocoa is grown on large plantations where working conditions are often extremely poor. Few workers using pesticides on plantations, or on small-holdings, have had instruction in safe working practices. Protective clothing is rarely available. The result is frequent complaints about health. None of this is likely to improve in the short term, given Brazil's perilous economic situation.

  6. Assessment Methods of Groundwater Overdraft Area and Its Application

    NASA Astrophysics Data System (ADS)

    Dong, Yanan; Xing, Liting; Zhang, Xinhui; Cao, Qianqian; Lan, Xiaoxun

    2018-05-01

    Groundwater is an important source of water, and long-term heavy demand has made it over-exploited. Over-exploitation causes many environmental and geological problems. This paper explores the concept of the over-exploitation area, summarizes the natural and social attributes of over-exploitation areas, and expounds their evaluation methods, including single-factor evaluation, multi-factor system analysis, and numerical methods. The different methods are also compared and analyzed. Taking Northern Weifang as an example, the paper then demonstrates the practicality of the assessment methods.

  7. BiNGO: a Cytoscape plugin to assess overrepresentation of gene ontology categories in biological networks.

    PubMed

    Maere, Steven; Heymans, Karel; Kuiper, Martin

    2005-08-15

    The Biological Networks Gene Ontology tool (BiNGO) is an open-source Java tool to determine which Gene Ontology (GO) terms are significantly overrepresented in a set of genes. BiNGO can be used either on a list of genes, pasted as text, or interactively on subgraphs of biological networks visualized in Cytoscape. BiNGO maps the predominant functional themes of the tested gene set on the GO hierarchy, and takes advantage of Cytoscape's versatile visualization environment to produce an intuitive and customizable visual representation of the results.
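    The statistic behind this kind of over-representation test is the hypergeometric tail probability; a minimal illustration is given below with made-up counts (BiNGO itself also supports binomial tests and multiple-testing corrections such as Benjamini-Hochberg FDR).

```python
from scipy.stats import hypergeom

# Hypergeometric over-representation test for a single GO term
# (counts are invented for illustration).
N = 20000   # annotated genes in the background
K = 150     # background genes carrying the GO term of interest
n = 300     # genes in the test set (e.g. a network subgraph)
k = 12      # test-set genes carrying the term

# P(X >= k) for X ~ Hypergeom(N, K, n); sf(k - 1) gives the upper tail.
p_value = hypergeom.sf(k - 1, N, K, n)
print(f"enrichment p-value = {p_value:.3g}")
# In practice this is repeated over all GO terms and corrected for
# multiple testing before reporting significant categories.
```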

  8. Biotic Nitrogen Enrichment Regulates Calcium Sources to Forests

    NASA Astrophysics Data System (ADS)

    Pett-Ridge, J. C.; Perakis, S. S.; Hynicka, J. D.

    2015-12-01

    Calcium is an essential nutrient in forest ecosystems that is susceptible to leaching loss and depletion. Calcium depletion can affect plant and animal productivity, soil acid buffering capacity, and fluxes of carbon and water. Excess nitrogen supply and associated soil acidification are often implicated in short-term calcium loss from soils, but the long-term role of nitrogen enrichment on calcium sources and resupply is unknown. Here we use strontium isotopes (87Sr/86Sr) as a proxy for calcium to investigate how soil nitrogen enrichment from biological nitrogen fixation interacts with bedrock calcium to regulate both short-term available supplies and the long-term sources of calcium in montane conifer forests. Our study examines 22 sites in western Oregon, spanning a 20-fold range of bedrock calcium on sedimentary and basaltic lithologies. In contrast to previous studies emphasizing abiotic control of weathering as a determinant of long-term ecosystem calcium dynamics and sources (via bedrock fertility, climate, or topographic/tectonic controls), we find instead that biotic nitrogen enrichment of soil can strongly regulate calcium sources and supplies in forest ecosystems. For forests on calcium-rich basaltic bedrock, increasing nitrogen enrichment causes calcium sources to shift from rock-weathering to atmospheric dominance, with minimal influence from other major soil forming factors, despite regionally high rates of tectonic uplift and erosion that can rejuvenate weathering supply of soil minerals. For forests on calcium-poor sedimentary bedrock, we find that atmospheric inputs dominate regardless of degree of nitrogen enrichment. Short-term measures of soil and ecosystem calcium fertility are decoupled from calcium source sustainability, with fundamental implications for understanding nitrogen impacts, both in natural ecosystems and in the context of global change. Our finding that long-term nitrogen enrichment increases forest reliance on atmospheric calcium helps explain reports of greater ecological calcium limitation in an increasingly nitrogen-rich world.

  9. Cost of care of haemophilia with inhibitors.

    PubMed

    Di Minno, M N D; Di Minno, G; Di Capua, M; Cerbone, A M; Coppola, A

    2010-01-01

    In Western countries, the treatment of patients with inhibitors is presently the most challenging and serious issue in haemophilia management, with direct costs of clotting factor concentrates accounting for >98% of the overall economic burden of caring for patients in this setting. Being designed to address questions of resource allocation and effectiveness, decision models are the gold standard for reliably assessing the overall economic implications of haemophilia with inhibitors in terms of mortality, bleeding-related morbidity, and severity of arthropathy. However, presently, most data analyses stem from retrospective short-term evaluations, which only allow for the analysis of direct health costs. In the setting of chronic diseases, cost-utility analysis, which takes into account the beneficial effects of a given treatment/healthcare intervention in terms of health-related quality of life, is likely to be the most appropriate approach. To calculate net benefits, the quality-adjusted life year, which reflects such health gains, has to be compared with specific economic impacts. Differences in data sources, in medical practice, and/or in healthcare systems and costs imply that most current pharmacoeconomic analyses are confined to a narrow healthcare payer perspective. Long-term/lifetime prospective or observational studies, devoted to a careful definition of when to start treatment, of the regimens (dose and type of product) to employ, and of the inhibitor populations (children/adults, low-responding/high-responding inhibitors) to study, are thus urgently needed to allow for newer insights, based on reliable data sources, into resource allocation, effectiveness, and cost-utility analysis in the treatment of haemophiliacs with inhibitors.

  10. [Written pharmaceutical advertising--still unreliable?].

    PubMed

    Gladsø, Kristin Haugen; Garberg, Hedda Rosland; Spigset, Olav; Slørdal, Lars

    2014-09-02

    Marketing by the pharmaceutical industry affects doctors' prescribing habits. All pharmaceutical advertising received by nine doctors in two GP offices over a period of three months was collected. The advertising material was sorted by compound. For each compound, the advert with the highest number of references was selected. The cited references were obtained, and the claims in the adverts were assessed in terms of their consistency with the source data based on the provisions in the Norwegian regulations on pharmaceuticals. The references were also assessed with regard to the incidence of conflicts of interest among authors. The doctors received a total of 270 shipments of advertising for 46 different compounds. Altogether 95% of the 173 references cited in the 46 selected adverts could be obtained. The adverts contained a total of 156 claims. Of these, 56% were assessed as correct when compared to the source data and as having clinical relevance. Altogether 75% of the journal articles reported relevant conflicts of interest for the authors. About half the claims in the adverts were found to be correct and clinically relevant. These results concur with those from a methodologically identical study based on advertising material collected in 2004. The cited literature was of varying quality and often funded by the pharmaceutical companies. The findings indicate that the target group should be sceptical of this type of marketing.

  11. Do Methodological Choices in Environmental Modeling Bias Rebound Effects? A Case Study on Electric Cars.

    PubMed

    Font Vivanco, David; Tukker, Arnold; Kemp, René

    2016-10-18

    Improvements in resource efficiency often underperform because of rebound effects. Calculations of the size of rebound effects are subject to various types of bias, among which methodological choices have received particular attention. Modellers have primarily focused on choices related to changes in demand; however, choices related to modeling the environmental burdens from such changes have received less attention. In this study, we analyze choices in the environmental assessment methods (life cycle assessment (LCA) and hybrid LCA) and environmental input-output databases (E3IOT, Exiobase and WIOD) used as a source of bias. The analysis is done for a case study on battery electric and hydrogen cars in Europe. The results describe moderate rebound effects for both technologies in the short term. Additionally, long-run scenarios are calculated by simulating the total cost of ownership, and these describe notable rebound effect sizes, from 26 to 59% and from 18 to 28%, respectively, depending on the methodological choices, under favorable economic conditions. Relevant sources of bias are found to be related to incomplete background systems, technology assumptions, and sectorial aggregation. These findings highlight the importance of the method setup and of sensitivity analyses of choices related to environmental modeling in rebound effect assessments.

  12. [Interventions based on exercise and physical environment for preventing falls in cognitively impaired older people living in long-term care facilities: A systematic review and meta-analysis].

    PubMed

    González-Román, Loreto; Bagur-Calafat, Caritat; Urrútia-Cuchí, Gerard; Garrido-Pedrosa, Jèssica

    2016-01-01

    This systematic review aims to report the effectiveness of interventions based on exercise and/or physical environment for reducing falls in cognitively impaired older adults living in long-term care facilities. In July 2014, a literature search was conducted using main databases and specialised sources. Randomised controlled trials assessing the effectiveness of fall prevention interventions, which used exercise or physical environment among elderly people with cognitive impairment living in long-term care facilities, were selected. Two independent reviewers checked the eligibility of the studies, and evaluated their methodological quality. If it was adequate, data were gathered. Fourteen studies with 3,539 participants using exercise and/or physical environment by a single or combined approach were included. The data gathered from studies that used both interventions showed a significant reduction in fall rate. Further research is needed to demonstrate the effectiveness of those interventions for preventing falls in the elderly with cognitive impairment living in long-term care establishments. Copyright © 2015 SEGG. Published by Elsevier Espana. All rights reserved.

  13. California Drought Recovery Assessment Using GRACE Satellite Gravimetry Information

    NASA Astrophysics Data System (ADS)

    Love, C. A.; Aghakouchak, A.; Madadgar, S.; Tourian, M. J.

    2015-12-01

    California has been experiencing its most extreme drought in recent history due to a combination of record high temperatures and exceptionally low precipitation. An estimate of when the drought can be expected to end is needed for risk mitigation and water management. A crucial component of drought recovery assessments is the estimation of the terrestrial water storage (TWS) deficit. Previous studies on drought recovery have been limited to surface water hydrology (precipitation and/or runoff) for estimating changes in TWS, neglecting the contribution of groundwater deficits to the recovery time of the system. Groundwater requires more time to recover than surface water storage; therefore, the inclusion of groundwater storage in drought recovery assessments is essential for understanding the long-term vulnerability of a region. Here we assess the probability, for varying timescales, of California's current TWS deficit returning to its long-term historical mean. Our method consists of deriving the region's fluctuations in TWS from changes in the gravity field observed by NASA's Gravity Recovery and Climate Experiment (GRACE) satellites. We then estimate the probability that meteorological inputs (precipitation minus evaporation and runoff) over different timespans (e.g., 3, 6 or 12 months) will balance the current GRACE-derived TWS deficit. This method improves upon previous techniques because the GRACE-derived water deficit comprises all hydrologic sources, including surface water, groundwater, and snow cover. With this empirical probability assessment we expect to improve current estimates of California's drought recovery time, thereby improving risk mitigation.
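    The empirical recovery-probability idea can be sketched as follows, assuming a historical record of monthly net water flux (precipitation minus evaporation and runoff) and a single GRACE-derived TWS deficit are already in hand; the synthetic record, units and function names below are assumptions for illustration only, not the study's actual inputs.

      # Minimal sketch: fraction of historical windows of a given length whose
      # accumulated net flux (P - E - R) would have erased the current deficit.
      # The synthetic record and deficit below are stand-ins, not GRACE data.
      import numpy as np

      rng = np.random.default_rng(0)
      monthly_net_flux = rng.normal(loc=5.0, scale=20.0, size=360)  # mm/month, 30-yr stand-in
      tws_deficit = 150.0  # mm, stand-in for the GRACE-derived deficit

      def recovery_probability(net_flux, deficit, horizon_months):
          """Empirical probability that the deficit is balanced within the horizon."""
          windows = np.lib.stride_tricks.sliding_window_view(net_flux, horizon_months)
          return float((windows.sum(axis=1) >= deficit).mean())

      for months in (3, 6, 12):
          print(months, recovery_probability(monthly_net_flux, tws_deficit, months))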

  14. The mass-zero spin-two field and gravitational theory.

    NASA Technical Reports Server (NTRS)

    Coulter, C. A.

    1972-01-01

    Demonstration that the conventional theory of the mass-zero spin-two field with sources introduces extraneous non-spin-two field components in source regions and fails to be covariant under the full or restricted conformal group. A modified theory is given, expressed in terms of the physical components of the mass-zero spin-two field rather than in terms of 'potentials', which has no extraneous components inside or outside sources and which is covariant under the full conformal group. For a proper choice of source term, this modified theory has the correct Newtonian limit and automatically implies that a symmetric second-rank source tensor has zero divergence. It is shown that a generally covariant form of the spin-two theory derived here can possibly be constructed to agree with general relativity in all currently accessible experimental situations.
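    For orientation only, the conventional potential-based formulation that the paper argues against can be written in its standard linearized form, stated here under the harmonic gauge condition; this is the textbook version, not the modified physical-component theory derived in the paper.

      \[
        \Box \bar{h}_{\mu\nu} = -2\kappa\, T_{\mu\nu},
        \qquad
        \partial^{\mu}\bar{h}_{\mu\nu} = 0
        \quad\Longrightarrow\quad
        \partial^{\mu} T_{\mu\nu} = 0,
      \]
      where $\bar{h}_{\mu\nu}$ is the trace-reversed potential and $T_{\mu\nu}$ the symmetric second-rank source tensor. In the modified theory described above, the divergence-free condition on the source tensor is reported to follow automatically from the choice of source term, without recourse to such potentials.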

  15. Volatile organic compounds in the nation's ground water and drinking-water supply wells

    USGS Publications Warehouse

    Zogorski, John S.; Carter, Janet M.; Ivahnenko, Tamara; Lapham, Wayne W.; Moran, Michael J.; Rowe, Barbara L.; Squillace, Paul J.; Toccalino, Patricia L.

    2006-01-01

    This national assessment of 55 volatile organic compounds (VOCs) in ground water gives emphasis to the occurrence of VOCs in aquifers that are used as an important supply of drinking water. In contrast to the monitoring of VOC contamination of ground water at point-source release sites, such as landfills and leaking underground storage tanks (LUSTs), our investigations of aquifers are designed as large-scale resource assessments that provide a general characterization of water-quality conditions. Nearly all of the aquifers included in this assessment have been identified as regionally extensive aquifers or aquifer systems. The assessment of ground water (Chapter 3) included analyses of about 3,500 water samples collected during 1985-2001 from various types of wells, representing almost 100 different aquifer studies. This is the first national assessment of the occurrence of a large number of VOCs with different uses, and the assessment addresses key questions about VOCs in aquifers. The assessment also provides a foundation for subsequent decadal assessments of the U.S. Geological Survey (USGS) National Water-Quality Assessment (NAWQA) Program to ascertain long-term trends of VOC occurrence in these aquifers.

  16. International market assessment of stand-alone photovoltaic power systems for cottage industry applications

    NASA Astrophysics Data System (ADS)

    Philippi, T. M.

    1981-11-01

    The final result of an international assessment of the market for stand-alone photovoltaic systems in cottage industry applications is reported. Nonindustrialized countries without centrally planned economies were considered. Cottage industries were defined as small rural manufacturers, employing fewer than 50 people and producing simple consumer products. The data to support this analysis were obtained from secondary and expert sources in the U.S. and from in-country field investigations in the Philippines and Mexico. Based on an in-depth study of the nature of cottage industry, its role in the rural economy, the electric energy requirements of cottage industry, and a financial analysis of stand-alone photovoltaic systems compared with their most viable competitor, diesel-driven generators, the near-term market for photovoltaics in rural cottage industry applications appears to be limited to demonstration projects and pilot programs. Photovoltaics are shown to be a better long-term option only for very low power requirements. Some of these uses would include clay mixers, grinders, centrifuges, lathes, power saws and lighting of a workshop.
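    The kind of financial comparison described above can be illustrated with a minimal annualized life-cycle cost sketch, assuming capital is annualized with a capital recovery factor; every cost figure, parameter name and the minimum genset size below are placeholders chosen for illustration, not numbers from the assessment.

      # Minimal sketch of an annualized cost comparison, PV vs. diesel genset.
      # All cost inputs are illustrative placeholders, not values from the study.
      def crf(rate, years):
          """Capital recovery factor: converts capital cost to an equal annual payment."""
          return rate * (1 + rate) ** years / ((1 + rate) ** years - 1)

      def annual_cost_pv(peak_kw, capital_per_kw=10000.0, om_frac=0.01, rate=0.10, life=20):
          capital = peak_kw * capital_per_kw
          return capital * crf(rate, life) + capital * om_frac

      def annual_cost_diesel(avg_kw, hours=2000.0, min_genset_kw=2.5,
                             capital_per_kw=500.0, fuel_per_kwh=0.15,
                             om_per_hour=0.30, rate=0.10, life=10):
          genset_kw = max(avg_kw, min_genset_kw)  # gensets come in minimum sizes
          capital = genset_kw * capital_per_kw
          return (capital * crf(rate, life)
                  + avg_kw * hours * fuel_per_kwh
                  + hours * om_per_hour)

      for kw in (0.2, 0.5, 1.0, 3.0):
          print(kw, round(annual_cost_pv(kw)), round(annual_cost_diesel(kw)))

    With these placeholder inputs the photovoltaic option comes out cheaper only at the smallest loads, which mirrors the qualitative conclusion reported above.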

  17. Web Page Content and Quality Assessed for Shoulder Replacement.

    PubMed

    Matthews, John R; Harrison, Caitlyn M; Hughes, Travis M; Dezfuli, Bobby; Sheppard, Joseph

    2016-01-01

    The Internet has become a major source for obtaining health-related information. This study assesses and compares the quality of information available online for shoulder replacement using medical (total shoulder arthroplasty [TSA]) and nontechnical (shoulder replacement [SR]) terminology. Three evaluators reviewed 90 websites for each search term across 3 search engines (Google, Yahoo, and Bing). Websites were grouped into categories, identified as commercial or noncommercial, and evaluated with the DISCERN questionnaire. Total shoulder arthroplasty returned 53 unique websites compared to 38 for SR. Of the 53 TSA websites, 30% were oriented to health professionals versus 18% of SR websites. Shoulder replacement websites provided more patient-oriented information, at 48% versus 45% of TSA websites. Overall, 47% (42/90) of SR results were noncommercial websites, with the highest number returned by Yahoo, compared with 37% (33/90) for TSA, of which Google provided 13 of the 33 websites (39%). Using the nonmedical terminology with Yahoo's search engine returned the most noncommercial and patient-oriented websites. However, the quality of information found online was highly variable, with most websites being unreliable and incomplete, regardless of search term.

  18. International market assessment of stand-alone photovoltaic power systems for cottage industry applications

    NASA Technical Reports Server (NTRS)

    Philippi, T. M.

    1981-01-01

    The final result of an international assessment of the market for stand-alone photovoltaic systems in cottage industry applications is reported. Nonindustrialized countries without centrally planned economies were considered. Cottage industries were defined as small rural manufacturers, employing fewer than 50 people and producing simple consumer products. The data to support this analysis were obtained from secondary and expert sources in the U.S. and from in-country field investigations in the Philippines and Mexico. Based on an in-depth study of the nature of cottage industry, its role in the rural economy, the electric energy requirements of cottage industry, and a financial analysis of stand-alone photovoltaic systems compared with their most viable competitor, diesel-driven generators, the near-term market for photovoltaics in rural cottage industry applications appears to be limited to demonstration projects and pilot programs. Photovoltaics are shown to be a better long-term option only for very low power requirements. Some of these uses would include clay mixers, grinders, centrifuges, lathes, power saws and lighting of a workshop.

  19. Antibiotic Use and Need for Antimicrobial Stewardship in Long-Term Care.

    PubMed

    Wu, Lisa Dong-Ying; Walker, Sandra A N; Elligsen, Marion; Palmay, Lesley; Simor, Andrew; Daneman, Nick

    2015-01-01

    Antimicrobial stewardship may be important in long-term care facilities because of the unnecessary or inappropriate antibiotic use observed in these residents, coupled with their increased vulnerability to health care-associated infections. The objective of this study was to assess antibiotic use in a long-term care facility in order to identify potential antimicrobial stewardship needs. A retrospective descriptive study was conducted at the Veterans Centre, a long-term care facility at Sunnybrook Health Sciences Centre, Toronto, Ontario. All residents taking one or more antibiotics (n = 326) were included as participants. Antibiotic-use data for patients residing in the facility between April 1, 2011, and March 31, 2012, were collected and analyzed. Totals of 358 patient encounters, 835 antibiotic prescriptions, and 193 positive culture results were documented during the study period. For 36% (302/835) of antibiotic prescriptions, the duration was more than 7 days. Cephalosporins (30%; 251/835) and fluoroquinolones (28%; 235/835) were the most frequently prescribed antibiotic classes. Urine was the most common source of samples for culture (60%; 116/193). Characteristics of antimicrobial use at this long-term care facility that might benefit from further evaluation included potentially excessive use of fluoroquinolones and cephalosporins and potentially excessive duration of antibiotic use for individual patients.

  20. Children's exposure assessment of radiofrequency fields: Comparison between spot and personal measurements.

    PubMed

    Gallastegi, Mara; Huss, Anke; Santa-Marina, Loreto; Aurrekoetxea, Juan J; Guxens, Mònica; Birks, Laura Ellen; Ibarluzea, Jesús; Guerra, David; Röösli, Martin; Jiménez-Zabala, Ana

    2018-05-24

    Radiofrequency (RF) fields are widely used and, while it is still unknown whether children are more vulnerable to this type of exposure, it is essential to explore their level of exposure in order to conduct adequate epidemiological studies. Personal measurements provide individualized information, but they are costly in terms of time and resources, especially in large epidemiological studies. Other approaches, such as estimation of time-weighted averages (TWAs) based on spot measurements, could simplify the work. The aims of this study were to assess RF exposure in the Spanish INMA birth cohort by spot measurements and by personal measurements in the settings where children tend to spend most of their time, i.e., homes, schools and parks; to identify the settings and sources that contribute most to that exposure; and to explore whether exposure assessment based on spot measurements is a valid proxy for personal exposure. When children were 8 years old, spot measurements were conducted in the principal settings of 104 participants: homes (104), schools and their playgrounds (26) and parks (79). At the same time, personal measurements were taken for a subsample of 50 children over 3 days. Exposure assessments based on personal and on spot measurements were compared both in terms of mean exposures and in exposure-dependent categories by means of Bland-Altman plots, Cohen's kappa and the McNemar test. Median exposure levels from spot measurements ranged from 29.73 μW/m² (in children's bedrooms) to 200.10 μW/m² (in school playgrounds) and were higher outdoors than indoors. Median personal exposure was 52.13 μW/m², and median levels of assessments based on spot measurements ranged from 25.46 to 123.21 μW/m². Based on spot measurements, the sources that contributed most to the exposure were FM radio, mobile phone downlink and Digital Video Broadcasting-Terrestrial, while indoor and personal sources contributed very little (altogether <20%). A similar distribution was observed with personal measurements. There was a bias proportional to power density between personal measurements and estimates based on spot measurements, with the latter providing higher exposure estimates. Nevertheless, there were no systematic differences between the methodologies when classifying subjects into exposure categories. Personal measurements of total RF exposure showed low to moderate agreement with home and bedroom spot measurements, and agreed better, though still moderately, with TWAs based on spot measurements in the main settings where children spend time (homes, schools and parks; kappa = 0.46). Exposure assessment based on spot measurements could be a feasible proxy for ranking personal RF exposure in child populations, provided that all relevant locations are measured. Copyright © 2018. Published by Elsevier Ltd.
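    A time-weighted average of the kind compared against personal measurements above can be sketched as follows, assuming a simple per-setting time budget; the settings, power densities and hours below are illustrative placeholders rather than cohort data.

      # Minimal sketch of a TWA built from spot measurements and a time budget.
      # Power densities (uW/m2) and hours are illustrative placeholders only.
      spot_uW_m2 = {"bedroom": 30.0, "school": 120.0, "park": 200.0}   # spot measurements
      hours_per_day = {"bedroom": 10.0, "school": 6.0, "park": 2.0}    # time spent per setting

      def time_weighted_average(spot, hours):
          """TWA over the settings with both a spot measurement and a time budget."""
          common = [s for s in spot if s in hours]
          total_hours = sum(hours[s] for s in common)
          return sum(spot[s] * hours[s] for s in common) / total_hours

      print(time_weighted_average(spot_uW_m2, hours_per_day))  # ~78.9 uW/m2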
