Sample records for release scenario analysis

  1. 40 CFR 68.28 - Alternative release scenario analysis.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 15 2011-07-01 2011-07-01 false Alternative release scenario analysis... scenario analysis. (a) The number of scenarios. The owner or operator shall identify and analyze at least... release scenario under § 68.25; and (ii) That will reach an endpoint offsite, unless no such scenario...

  2. Analysis of LNG peakshaving-facility release-prevention systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pelto, P.J.; Baker, E.G.; Powers, T.B.

    1982-05-01

    The purpose of this study is to provide an analysis of release prevention systems for a reference LNG peakshaving facility. An overview assessment of the reference peakshaving facility, which preceded this effort, identified 14 release scenarios that are typical of the potential hazards involved in the operation of LNG peakshaving facilities. These scenarios formed the basis for this more detailed study. Failure modes and effects analysis and fault tree analysis were used to estimate the expected frequency of each release scenario for the reference peakshaving facility. In addition, the effectiveness of release prevention, release detection, and release control systems was evaluated.

  3. RMP Guidance for Warehouses - Chapter 4: Offsite Consequence Analysis

    EPA Pesticide Factsheets

    Offsite consequence analysis (OCA) informs government and the public about potential consequences of an accidental toxic or flammable chemical release at your facility, and consists of a worst-case release scenario and alternative release scenarios.

  4. Hydrothermal Liquefaction Treatment Hazard Analysis Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lowry, Peter P.; Wagner, Katie A.

    Hazard analyses were performed to evaluate the modular hydrothermal liquefaction treatment system. The hazard assessment process was performed in two stages. An initial assessment utilizing Hazard Identification and Preliminary Hazards Analysis (PHA) techniques identified areas with significant or unique hazards (process safety-related hazards) that fall outside the normal operating envelope of PNNL and warranted additional analysis. The subsequent assessment was based on a qualitative What-If analysis. The analysis was augmented, as necessary, by additional quantitative analysis for scenarios involving a release of hazardous material or energy with the potential for affecting the public. The following hazardous scenarios received increased attention: for scenarios involving a release of hazardous material or energy, controls were identified in the What-If analysis table that prevent the occurrence or mitigate the effects of the release; for scenarios with significant consequences that could impact personnel outside the immediate operations area, quantitative analyses were performed to determine the potential magnitude of the scenario. The set of “critical controls” that prevent the occurrence or mitigate the effects of events with significant consequences was identified for these scenarios (see Section 4).

  5. RMP Guidance for Offsite Consequence Analysis

    EPA Pesticide Factsheets

    Offsite consequence analysis (OCA) consists of a worst-case release scenario and alternative release scenarios. OCA is required from facilities with chemicals above threshold quantities. RMP*Comp software can be used to perform calculations described here.

  6. Validation of pharmaceutical potency determinations by quantitative nuclear magnetic resonance spectrometry.

    PubMed

    Webster, Gregory K; Marsden, Ian; Pommerening, Cynthia A; Tyrakowski, Christina M

    2010-05-01

    With the changing development paradigms in the pharmaceutical industry, laboratories are challenged to release materials for clinical studies with rapid turnaround times. To minimize cost demands, many businesses are looking to develop ways of using early Good Manufacturing Practice (GMP) materials of active pharmaceutical ingredients (API) for Good Laboratory Practice (GLP) toxicology studies. To make this happen, the analytical laboratory releases the material by one of three scenarios: (1) holding the GLP release until full GMP testing is ready, (2) issuing a separate lot number for a portion of the GMP material and releasing the material for GLP use, or (3) releasing the lot of material for GLP using alternate (equivalent) method(s) not specified for GMP release testing. Many companies are finding the third scenario to be advantageous in terms of cost and efficiency through the use of quantitative nuclear magnetic resonance (q-NMR). The use of q-NMR has proved to be a single-point replacement for routine early development testing that previously combined elements of identity testing, chromatographic assay, moisture analysis, residual solvent analysis, and elemental analysis. This study highlights that q-NMR can be validated to meet current regulatory analytical method guidelines for routine pharmaceutical analysis.
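The single-point q-NMR release testing described above rests on the standard internal-standard potency equation; the abstract does not give the formula, so the following is a generic sketch with illustrative numbers, not values from this paper.

```python
# Generic q-NMR potency calculation against an internal standard.
# All argument values below are illustrative, not data from this study.

def qnmr_potency(i_a, n_a, m_a, w_a, i_s, n_s, m_s, w_s, p_s):
    """Potency (%) of analyte A versus internal standard S.

    i_*: signal integral, n_*: protons in the integrated signal,
    m_*: molar mass (g/mol), w_*: weighed mass (mg),
    p_s: certified purity of the standard (%).
    """
    return (i_a / i_s) * (n_s / n_a) * (m_a / m_s) * (w_s / w_a) * p_s

# Equal integrals, proton counts, molar masses, and weights against a
# 99.9%-pure standard give a potency of 99.9%.
print(qnmr_potency(1.0, 1, 200.0, 10.0, 1.0, 1, 200.0, 10.0, 99.9))
```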

  7. RMP Guidance for Chemical Distributors - Chapter 4: Offsite Consequence Analysis

    EPA Pesticide Factsheets

    How to perform the OCA for regulated substances, informing the government and the public about potential consequences of an accidental chemical release at your facility. Includes calculations for worst-case scenario, alternative scenarios, and endpoints.

  8. Assessment of the Incentives Created by Public Disclosure of Off-Site Consequence Analysis Information for Reduction in the Risk of Accidental Releases

    EPA Pesticide Factsheets

    The off-site consequence analysis (OCA) evaluates the potential for worst-case and alternative accidental release scenarios to harm the public and environment around the facility. Public disclosure would likely reduce the number/severity of incidents.

  9. Space Propulsion Hazards Analysis Manual (SPHAM). Volume 2. Appendices

    DTIC Science & Technology

    1988-10-01

    Approved for public release... Volume I Chapter 2 - Requirements and the Hazards Analysis Process... Volume I Chapter 3 - Accident Scenarios... list of the hazardous materials that are discussed; 3) description of the failure scenarios; 4) type of post-accident environment that is discussed

  10. ISECG Global Exploration Roadmap: A Stepwise Approach to Deep Space Exploration

    NASA Technical Reports Server (NTRS)

    Martinez, Roland; Goodliff, Kandyce; Whitley, Ryan

    2013-01-01

    In 2011, ISECG released the Global Exploration Roadmap (GER), advancing the "Global Exploration Strategy: The Framework for Coordination" by articulating the perspectives of participating agencies on exploration goals and objectives, mission scenarios, and coordination of exploration preparatory activities. The GER featured a stepwise development and demonstration of the capabilities ultimately required for human exploration of Mars. In 2013 the GER was updated to reflect the ongoing evolution of agencies' exploration policies and plans, informed by individual agency and coordinated analysis activities relevant to various elements of the GER framework, as well as by coordinated stakeholder engagement activities. For this mid-2013 release of version 2 of the GER, a modified mission scenario is presented, more firmly reflecting the importance of a stepwise evolution of the critical capabilities, provided by multiple partners, necessary for executing increasingly complex missions to multiple destinations and leading to human exploration of Mars. This paper describes the updated mission scenario, the changes since the release of version 1, the mission themes incorporated into the scenario, and the risk reduction for Mars missions provided by exploration at various destinations.

  11. Quantitative risk assessment of the New York State operated West Valley Radioactive Waste Disposal Area.

    PubMed

    Garrick, B John; Stetkar, John W; Bembia, Paul J

    2010-08-01

    This article is based on a quantitative risk assessment (QRA) that was performed on a radioactive waste disposal area within the Western New York Nuclear Service Center in western New York State. The QRA results were instrumental in the decision by the New York State Energy Research and Development Authority to support a strategy of in-place management of the disposal area for another decade. The QRA methodology adopted for this first-of-a-kind application was a scenario-based approach in the framework of the triplet definition of risk (scenarios, likelihoods, consequences). The measure of risk is the frequency of occurrence of different levels of radiation dose to humans at prescribed locations. The risk from each scenario is determined by (1) the frequency of disruptive events or natural processes that cause a release of radioactive materials from the disposal area; (2) the physical form, quantity, and radionuclide content of the material that is released during each scenario; (3) distribution, dilution, and deposition of the released materials throughout the environment surrounding the disposal area; and (4) public exposure to the distributed material and the accumulated radiation dose from that exposure. The risks of the individual scenarios are assembled into a representation of the risk from the disposal area. In addition to quantifying the total risk to the public, the analysis ranks the importance of each contributing scenario, which facilitates taking corrective actions and implementing effective risk management. Perhaps most importantly, quantification of the uncertainties is an intrinsic part of the risk results. This approach to safety analysis has demonstrated the many advantages of applying QRA principles to assessing the risk of facilities involving hazardous materials.
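The triplet-based aggregation described above can be sketched in a few lines: each scenario carries a frequency and a consequence (dose), and the risk curve is the annual frequency of exceeding each dose level. Scenario names and numbers below are invented for illustration, not taken from the assessment.

```python
# Toy scenario set in the (scenario, likelihood, consequence) triplet form.
# Frequencies (per year) and doses (mSv) are invented for illustration.
scenarios = [
    ("erosion breach", 1e-3, 50.0),
    ("slope failure", 5e-4, 120.0),
    ("extreme flood", 1e-5, 400.0),
]

def exceedance_frequency(dose_level, scenarios):
    """Total annual frequency of scenarios whose dose exceeds dose_level."""
    return sum(freq for _, freq, dose in scenarios if dose > dose_level)

# Annual frequency of exceeding 100 mSv (slope failure + extreme flood).
print(exceedance_frequency(100.0, scenarios))
```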

  12. Financial analysis of experimental releases conducted at Glen Canyon Dam during water years 2006 through 2010.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Poch, L. A.; Veselka, T. D.; Palmer, C. S.

    2011-08-22

    Because of concerns about the impact that Glen Canyon Dam (GCD) operations were having on downstream ecosystems and endangered species, the Bureau of Reclamation (Reclamation) conducted an Environmental Impact Statement (EIS) on dam operations (DOE 1996). New operating rules and management goals for GCD that had been specified in the Record of Decision (ROD) (Reclamation 1996) were adopted in February 1997. In addition to issuing new operating criteria, the ROD mandated experimental releases for the purpose of conducting scientific studies. A report released in January 2011 examined the financial implications of the experimental flows that were conducted at the GCD from 1997 to 2005. This report continues the analysis and examines the financial implications of the experimental flows conducted at the GCD from 2006 to 2010. An experimental release may have either a positive or negative impact on the financial value of energy production. This study estimates the financial costs of experimental releases, identifies the main factors that contribute to these costs, and compares the interdependencies among these factors. An integrated set of tools was used to compute the financial impacts of the experimental releases by simulating the operation of the GCD under two scenarios, namely, (1) a baseline scenario that assumes both compliance with the ROD operating criteria and the experimental releases that actually took place during the study period, and (2) a 'without experiments' scenario that is identical to the baseline scenario of operations that comply with the GCD ROD, except it assumes that the experimental releases did not occur. The Generation and Transmission Maximization (GTMax) model was the main simulation tool used to dispatch GCD and other hydropower plants that comprise the Salt Lake City Area Integrated Projects (SLCA/IP). Extensive data sets and historical information on SLCA/IP powerplant characteristics, hydrologic conditions, and Western Area Power Administration's (Western's) power purchase prices were used for the simulation. In addition to estimating the financial impact of experimental releases, the GTMax model was also used to gain insights into the interplay among ROD operating criteria, exceptions that were made to criteria to accommodate the experimental releases, and Western operating practices. Experimental releases in some water years resulted in financial benefits to Western while others resulted in financial costs. During the study period, the total financial costs of all experimental releases were more than $4.8 million.
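The per-release accounting used in studies like this one reduces to a simple difference: simulate the system's energy-production value with and without the experiments, and attribute the gap to the experimental releases. The function and dollar figures below are an illustrative sketch, not the GTMax model.

```python
# Two-scenario comparison: a positive result is a financial cost of the
# experimental releases, a negative result a benefit.
# Dollar figures are invented for illustration.

def experiment_cost(baseline_value, without_experiments_value):
    """Cost of experimental releases: value forgone relative to baseline."""
    return without_experiments_value - baseline_value

# e.g. $10.2M of generation value with the experiments vs. $10.9M without
# implies a $700,000 cost attributable to the experimental releases.
print(experiment_cost(10.2e6, 10.9e6))  # 700000.0
```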

  13. Data Fusion Analysis For Test Validation System

    DTIC Science & Technology

    2009-11-16

    triethyl phosphate (TEP), methyl salicylate (MeS), and acetic acid (AA). A total of 29 release scenarios were conducted: fifteen TEP releases of 30...

  14. Financial analysis of experimental releases conducted at Glen Canyon Dam during water year 2011

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Poch, L. A.; Veselka, T. D.; Palmer, C. S.

    2012-07-16

    This report examines the financial implications of experimental flows conducted at the Glen Canyon Dam (GCD) in water year 2011. It is the third report in a series examining financial implications of experimental flows conducted since the Record of Decision (ROD) was adopted in February 1997 (Reclamation 1996). A report released in January 2011 examined water years 1997 to 2005 (Veselka et al. 2011), and a report released in August 2011 examined water years 2006 to 2010 (Poch et al. 2011). An experimental release may have either a positive or negative impact on the financial value of energy production. This study estimates the financial costs of experimental releases, identifies the main factors that contribute to these costs, and compares the interdependencies among these factors. An integrated set of tools was used to compute the financial impacts of the experimental releases by simulating the operation of the GCD under two scenarios, namely, (1) a baseline scenario that assumes both compliance with the ROD operating criteria and the experimental releases that actually took place during the study period, and (2) a 'without experiments' scenario that is identical to the baseline scenario of operations that comply with the GCD ROD, except it assumes that the experimental releases did not occur. The Generation and Transmission Maximization (GTMax) model was the main simulation tool used to dispatch GCD and other hydropower plants that comprise the Salt Lake City Area Integrated Projects (SLCA/IP). Extensive data sets and historical information on SLCA/IP powerplant characteristics, hydrologic conditions, and Western Area Power Administration's (Western's) power purchase prices were used for the simulation. In addition to estimating the financial impact of experimental releases, the GTMax model was also used to gain insights into the interplay among ROD operating criteria, exceptions that were made to criteria to accommodate the experimental releases, and Western operating practices. Experimental releases conducted in water year 2011 resulted only in financial costs; the total cost of all experimental releases was about $622,000.

  15. Evaluation of Water Year 2011 Glen Canyon Dam Flow Release Scenarios on Downstream Sand Storage along the Colorado River in Arizona

    USGS Publications Warehouse

    Wright, Scott A.; Grams, Paul E.

    2010-01-01

    This report describes numerical modeling simulations of sand transport and sand budgets for reaches of the Colorado River below Glen Canyon Dam. Two hypothetical Water Year 2011 annual release volumes were each evaluated with six hypothetical operational scenarios. The six operational scenarios include the current operation, scenarios with modifications to the monthly distribution of releases, and scenarios with modifications to daily flow fluctuations. Uncertainties in model predictions were evaluated by conducting simulations with error estimates for tributary inputs and mainstem transport rates. The modeling results illustrate the dependence of sand transport rates and sand budgets on the annual release volumes as well as the within-year operating rules. The six operational scenarios were ranked with respect to the predicted annual sand budgets for the Marble Canyon and eastern Grand Canyon reaches. While the actual WY 2011 annual release volume and levels of tributary inputs are unknown, the hypothetical conditions simulated and reported herein provide reasonable relative comparisons between the operational scenarios that may be used by decision makers within the Glen Canyon Dam Adaptive Management Program.
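The sand-budget bookkeeping behind the ranking above is conceptually simple: a reach gains sand when inputs exceed export. The following one-liner is a schematic sketch with invented numbers, not the report's sediment-transport model.

```python
# Annual sand budget for a river reach (million metric tons, invented values):
# positive = net accumulation, negative = net erosion of sand storage.

def annual_sand_budget(tributary_input, upstream_supply, mainstem_export):
    return tributary_input + upstream_supply - mainstem_export

print(annual_sand_budget(1.2, 0.3, 1.0))  # net gain of 0.5
```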

  16. Natural ageing process accelerates the release of Ag from functional textile in various exposure scenarios

    PubMed Central

    Ding, Dahu; Chen, Lulu; Dong, Shaowei; Cai, Hao; Chen, Jifei; Jiang, Canlan; Cai, Tianming

    2016-01-01

    The natural ageing process occurs throughout the life cycle of textile products and may influence the release behavior of additives such as silver nanoparticles (Ag NPs). In this study, we assessed the releasability of Ag NPs from an Ag NP-functionalized textile in five different exposure scenarios (tap water (TW), pond water (PW), rain water (RW), artificial sweat (AS), and detergent solution (DS), along with deionized water (DW) as a reference), all of which are likely to occur during the life cycle of the textile. For the pristine textile, although the most remarkable release was found in DW (6–15 μg Ag/g textile), the highest release rate was found in RW (around 7 μg Ag/(g textile·h)). After ageing treatment, the total released Ag increased by 75.7–386.0% in DW, AS and DS. Morphological analysis clearly showed that the Ag NPs were isolated from the surface of the textile fibre by the ageing treatment. This study provides useful information for risk assessment of nano-enhanced textile products. PMID:27869136
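The release-rate figure quoted above (μg Ag per gram of textile per hour) is cumulative release normalized by textile mass and exposure time; the sketch below uses invented inputs chosen to give a rate of the same order, not data from the paper.

```python
# Average Ag release rate from a textile sample.
# Inputs are illustrative, not measurements from this study.

def release_rate(total_ag_ug, textile_mass_g, exposure_h):
    """Release rate in ug Ag per g of textile per hour."""
    return total_ag_ug / (textile_mass_g * exposure_h)

# 14 ug Ag from a 1 g swatch over 2 h of simulated rain-water exposure.
print(release_rate(14.0, 1.0, 2.0))  # 7.0
```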

  17. Natural ageing process accelerates the release of Ag from functional textile in various exposure scenarios

    NASA Astrophysics Data System (ADS)

    Ding, Dahu; Chen, Lulu; Dong, Shaowei; Cai, Hao; Chen, Jifei; Jiang, Canlan; Cai, Tianming

    2016-11-01

    The natural ageing process occurs throughout the life cycle of textile products and may influence the release behavior of additives such as silver nanoparticles (Ag NPs). In this study, we assessed the releasability of Ag NPs from an Ag NP-functionalized textile in five different exposure scenarios (tap water (TW), pond water (PW), rain water (RW), artificial sweat (AS), and detergent solution (DS), along with deionized water (DW) as a reference), all of which are likely to occur during the life cycle of the textile. For the pristine textile, although the most remarkable release was found in DW (6-15 μg Ag/g textile), the highest release rate was found in RW (around 7 μg Ag/(g textile·h)). After ageing treatment, the total released Ag increased by 75.7-386.0% in DW, AS and DS. Morphological analysis clearly showed that the Ag NPs were isolated from the surface of the textile fibre by the ageing treatment. This study provides useful information for risk assessment of nano-enhanced textile products.

  18. Potential release scenarios for carbon nanotubes used in composites.

    PubMed

    Nowack, Bernd; David, Raymond M; Fissan, Heinz; Morris, Howard; Shatkin, Jo Anne; Stintz, Michael; Zepp, Richard; Brouwer, Derk

    2013-09-01

    The expected widespread use of carbon nanotube (CNT)-composites in consumer products calls for an assessment of the possible release and exposure to workers, consumers and the environment. Release of CNTs may occur at all steps in the life cycle of products, but to date only limited information is available about release of CNTs from actual products and articles. As a starting point for exposure assessment, exploring sources and pathways of release helps to identify relevant applications and situations where the environment and especially humans may encounter releases of CNTs. The aim of this review is to identify various potential release scenarios for CNTs used in polymers and to identify the greatest likelihood of release at the various stages throughout the life cycle of the product. The available information on release of CNTs from products and articles is reviewed in a first part. In a second part, relevant release scenarios are described in detail: injection molding, manufacturing, sports equipment, electronics, windmill blades, fuel system components, tires, textiles, incineration, and landfills. Release from products can potentially occur by two pathways: (a) free CNTs are released directly, or, more frequently, (b) the initial release is a particle with CNTs embedded in the matrix, potentially followed by the subsequent release of CNTs from the matrix. The potential for release during manufacturing exists for all scenarios; however, this is also the situation in which exposure can be best controlled. For most of the other life cycle stages and their corresponding release scenarios, the potential release of CNTs can be considered low, but it cannot be excluded totally. Direct release to the environment is also considered very low for most scenarios, except for the use of CNTs in tires, where significant abrasion during use and release into the environment would occur. The possible future use of CNTs in textiles could also result in consumer exposure. A possibility for significant release also exists during recycling operations, when polymers containing CNTs are handled together with other polymers and mainly occupational users would be exposed. It can be concluded that, in general, significant release of CNTs from products and articles is unlikely except in manufacturing and subsequent processing, tires, recycling, and potentially textiles. However, except for high-energy machining processes, the resulting exposure for these scenarios will most likely be low and to a non-pristine form of CNTs. Actual exposure studies that quantify the amount of material released should be conducted to provide further evidence for this conclusion. Copyright © 2013 The Authors. Published by Elsevier Ltd. All rights reserved.

  19. 40 CFR 68.28 - Alternative release scenario analysis.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... overfilling and spill, or overpressurization and venting through relief valves or rupture disks; and (v... Consequence Analysis Guidance or any commercially or publicly available air dispersion modeling techniques, provided the techniques account for the specified modeling conditions and are recognized by industry as...

  20. Ecological scenarios analyzed and evaluated by a shallow lake model.

    PubMed

    Kardaetz, Sascha; Strube, Torsten; Brüggemann, Rainer; Nützmann, Gunnar

    2008-07-01

    We applied the complex ecosystem model EMMO, adapted to the shallow lake Müggelsee (Germany), in order to evaluate a large set of ecological scenarios. By means of EMMO, 33 scenarios and 17 indicators were defined to characterize their effects on the lake ecosystem. The indicators were based on model outputs of EMMO and can be separated into biological indicators, such as chlorophyll-a and cyanobacteria, and hydro-chemical indicators, such as phosphorus. The question to be solved was: what is the ranking of the scenarios based on their characterization by these 17 indicators, and how can we handle large quantities of complex data within evaluation procedures? The scenario evaluation was performed by partial order theory, which, however, did not provide a clear result. By subsequently applying hierarchical cluster analysis (complete linkage) it was possible to reduce the data matrix to indicator and scenario representatives. Even though this step implies a loss of information, it simplifies the application of partial order theory and the post-processing by METEOR. METEOR is derived from partial order theory and allows the stepwise aggregation of indicators, which subsequently leads to a distinct and clear decision. In the final evaluation result, the best scenario was the one that defines a minimum nutrient input and no phosphorus release from the sediment, while the worst scenario is characterized by a maximum nutrient input and extensive phosphorus release from the sediment. The reasonable and comprehensive results show that the combination of partial order, cluster analysis and METEOR can handle large amounts of data in a very clear and transparent way, and is therefore ideal in the context of complex ecosystem models like the one we applied.
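The partial-order step described above amounts to a dominance check: one scenario precedes another when it is at least as good on every indicator. A minimal sketch, with three invented scenarios and three of the indicator types mentioned (lower values taken as better):

```python
# Scenario -> (chlorophyll-a, cyanobacteria, phosphorus); invented values,
# with lower taken as better for every indicator.
indicators = {
    "min_input_no_release":   (2.0, 1.0, 0.1),
    "moderate_input":         (5.0, 3.0, 0.4),
    "max_input_full_release": (9.0, 8.0, 1.2),
}

def dominates(a, b):
    """a dominates b if a is at least as good everywhere and differs."""
    return all(x <= y for x, y in zip(a, b)) and a != b

# Undominated scenarios are the best elements of the partial order.
best = [name for name, vals in indicators.items()
        if not any(dominates(other, vals) for other in indicators.values())]
print(best)  # ['min_input_no_release']
```

With real data many scenarios are mutually incomparable (better on some indicators, worse on others), which is why the study needed clustering and METEOR's stepwise aggregation to reach a single ranking.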

  21. 40 CFR 68.28 - Alternative release scenario analysis.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Consequence Analysis Guidance or any commercially or publicly available air dispersion modeling techniques, provided the techniques account for the specified modeling conditions and are recognized by industry as applicable as part of current practices. Proprietary models that account for the modeling conditions may be...

  22. 40 CFR 68.28 - Alternative release scenario analysis.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Consequence Analysis Guidance or any commercially or publicly available air dispersion modeling techniques, provided the techniques account for the specified modeling conditions and are recognized by industry as applicable as part of current practices. Proprietary models that account for the modeling conditions may be...

  23. 40 CFR 68.28 - Alternative release scenario analysis.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Consequence Analysis Guidance or any commercially or publicly available air dispersion modeling techniques, provided the techniques account for the specified modeling conditions and are recognized by industry as applicable as part of current practices. Proprietary models that account for the modeling conditions may be...

  24. Financial analysis of experimental releases conducted at Glen Canyon Dam during water years 1997 through 2005.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Veselka, T. D.; Poch, L. A.; Palmer, C. S.

    2010-04-21

    Because of concerns about the impact that Glen Canyon Dam (GCD) operations were having on downstream ecosystems and endangered species, the Bureau of Reclamation (Reclamation) conducted an Environmental Impact Statement (EIS) on dam operations (DOE 1996). New operating rules and management goals for GCD that had been specified in the Record of Decision (ROD) (Reclamation 1996) were adopted in February 1997. In addition to issuing new operating criteria, the ROD mandated experimental releases for the purpose of conducting scientific studies. This paper examines the financial implications of the experimental flows that were conducted at the GCD from 1997 to 2005. An experimental release may have either a positive or negative impact on the financial value of energy production. This study estimates the financial costs of experimental releases, identifies the main factors that contribute to these costs, and compares the interdependencies among these factors. An integrated set of tools was used to compute the financial impacts of the experimental releases by simulating the operation of the GCD under two scenarios, namely, (1) a baseline scenario that assumes both compliance with the ROD operating criteria and the experimental releases that actually took place during the study period, and (2) a ''without experiments'' scenario that is identical to the baseline scenario of operations that comply with the GCD ROD, except it assumes that the experimental releases did not occur. The Generation and Transmission Maximization (GTMax) model was the main simulation tool used to dispatch GCD and other hydropower plants that comprise the Salt Lake City Area Integrated Projects (SLCA/IP). Extensive data sets and historical information on SLCA/IP power plant characteristics, hydrologic conditions, and Western Area Power Administration's (Western's) power purchase prices were used for the simulation. In addition to estimating the financial impact of experimental releases, the GTMax model was also used to gain insights into the interplay among ROD operating criteria, exceptions that were made to criteria to accommodate the experimental releases, and Western operating practices. Experimental releases in some water years resulted in financial benefits to Western while others resulted in financial costs. During the study period, the total financial costs of all experimental releases were $11.9 million.

  25. Revised financial analysis of experimental releases conducted at Glen Canyon Dam during water years 1997 through 2005.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Veselka, T. D.; Poch, L. A.; Palmer, C. S.

    2011-01-11

    Because of concerns about the impact that Glen Canyon Dam (GCD) operations were having on downstream ecosystems and endangered species, the Bureau of Reclamation (Reclamation) conducted an Environmental Impact Statement (EIS) on dam operations (DOE 1996). New operating rules and management goals for GCD that had been specified in the Record of Decision (ROD) (Reclamation 1996) were adopted in February 1997. In addition to issuing new operating criteria, the ROD mandated experimental releases for the purpose of conducting scientific studies. This paper examines the financial implications of the experimental flows that were conducted at the GCD from 1997 to 2005. An experimental release may have either a positive or negative impact on the financial value of energy production. This study estimates the financial costs of experimental releases, identifies the main factors that contribute to these costs, and compares the interdependencies among these factors. An integrated set of tools was used to compute the financial impacts of the experimental releases by simulating the operation of the GCD under two scenarios, namely, (1) a baseline scenario that assumes both compliance with the ROD operating criteria and the experimental releases that actually took place during the study period, and (2) a 'without experiments' scenario that is identical to the baseline scenario of operations that comply with the GCD ROD, except it assumes that the experimental releases did not occur. The Generation and Transmission Maximization (GTMax) model was the main simulation tool used to dispatch GCD and other hydropower plants that comprise the Salt Lake City Area Integrated Projects (SLCA/IP). Extensive data sets and historical information on SLCA/IP power plant characteristics, hydrologic conditions, and Western Area Power Administration's (Western's) power purchase prices were used for the simulation. In addition to estimating the financial impact of experimental releases, the GTMax model was also used to gain insights into the interplay among ROD operating criteria, exceptions that were made to criteria to accommodate the experimental releases, and Western operating practices. Experimental releases in some water years resulted in financial benefits to Western while others resulted in financial costs. During the study period, the total financial costs of all experimental releases were more than $23 million.

  6. Toxic release consequence analysis tool (TORCAT) for inherently safer design plant.

    PubMed

    Shariff, Azmi Mohd; Zaini, Dzulkarnain

    2010-10-15

    Many major accidents due to toxic releases in the past, such as the tragedy of the MIC release in Bhopal, India (1984), have caused many fatalities. One approach is the inherently safer design technique, which utilizes inherent safety principles to eliminate or minimize accidents rather than to control the hazard. This technique is best implemented in the preliminary design stage, where the consequences of a toxic release can be evaluated and necessary design improvements can be implemented to reduce accidents to as low as reasonably practicable (ALARP) without resorting to costly protective systems. However, no commercial tool with such capability is currently available. This paper reports preliminary findings on the development of a prototype tool for consequence analysis and design improvement via inherent safety principles, built by integrating a process design simulator with a toxic release consequence analysis model. Consequence analyses based on worst-case scenarios during the process flowsheeting stage were conducted as case studies. The preliminary findings show that the toxic release consequence analysis tool (TORCAT) has the capability to eliminate or minimize potential toxic release accidents by adopting the inherent safety principle early in the preliminary design stage.

  7. 40 CFR 68.25 - Worst-case release scenario analysis.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... used is based on TNT equivalent methods. (1) For regulated flammable substances that are normally gases... shall be used to determine the distance to the explosion endpoint if the model used is based on TNT...

  8. 40 CFR 68.25 - Worst-case release scenario analysis.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... used is based on TNT equivalent methods. (1) For regulated flammable substances that are normally gases... shall be used to determine the distance to the explosion endpoint if the model used is based on TNT...

  9. 40 CFR 68.25 - Worst-case release scenario analysis.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... used is based on TNT equivalent methods. (1) For regulated flammable substances that are normally gases... shall be used to determine the distance to the explosion endpoint if the model used is based on TNT...

  10. 40 CFR 68.25 - Worst-case release scenario analysis.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... used is based on TNT equivalent methods. (1) For regulated flammable substances that are normally gases... shall be used to determine the distance to the explosion endpoint if the model used is based on TNT...

  11. 40 CFR 68.25 - Worst-case release scenario analysis.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... used is based on TNT equivalent methods. (1) For regulated flammable substances that are normally gases... shall be used to determine the distance to the explosion endpoint if the model used is based on TNT...

  12. A Preliminary Performance Assessment for Salt Disposal of High-Level Nuclear Waste - 12173

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Joon H.; Clayton, Daniel; Jove-Colon, Carlos

    2012-07-01

    A salt repository is one of the four geologic media currently under study by the U.S. DOE Office of Nuclear Energy to support the development of a long-term strategy for geologic disposal of commercial used nuclear fuel (UNF) and high-level radioactive waste (HLW). The immediate goal of the generic salt repository study is to develop the necessary modeling tools to evaluate and improve the understanding of the repository system response and processes relevant to long-term disposal of UNF and HLW in a salt formation. The current phase of this study considers representative geologic settings and features adopted from previous studies for salt repository sites. For the reference scenario, the brine flow rates in the repository and underlying interbeds are very low, and transport of radionuclides in the transport pathways is dominated by diffusion and greatly retarded by sorption on the interbed filling materials. I-129 is the dominant annual dose contributor at the hypothetical accessible environment, but the calculated mean annual dose is negligibly small. For the human intrusion (or disturbed) scenario, the mean mass release rate and mean annual dose histories are very different from those for the reference scenario. Actinides including Pu-239, Pu-242 and Np-237 are major annual dose contributors, and the calculated peak mean annual dose is acceptably low. A performance assessment model for a generic salt repository has been developed incorporating, where applicable, representative geologic settings and features adopted from literature data for salt repository sites. The conceptual model and scenario for radionuclide release and transport from a salt repository were developed utilizing literature data. The salt GDS model was developed in a probabilistic analysis framework. The preliminary performance analysis for demonstration of model capability is for an isothermal condition at the ambient temperature for the near field.
The capability demonstration emphasizes key attributes of a salt repository that are potentially important to the long-term safe disposal of UNF and HLW. The analysis presents and discusses the results showing repository responses to different radionuclide release scenarios (undisturbed and human intrusion). For the reference (or nominal or undisturbed) scenario, the brine flow rates in the repository and underlying interbeds are very low, and transport of radionuclides in the transport pathways is dominated by diffusion and greatly retarded by sorption on the interbed filling materials. I-129 (non-sorbing, with unlimited solubility and a very long half-life) is the dominant annual dose contributor at the hypothetical accessible environment, but the calculated mean annual dose is so negligibly small that there is no meaningful consequence for the repository performance. For the human intrusion (or disturbed) scenario analysis, the mean mass release rate and mean annual dose histories are very different from those for the reference scenario analysis. Compared to the reference scenario, the relative annual dose contributions by soluble, non-sorbing fission products, particularly I-129, are much lower than those by actinides including Pu-239, Pu-242 and Np-237. The lower relative mean annual dose contributions by the fission product radionuclides are due to their lower total inventory available for release (i.e., up to five affected waste packages), and the higher mean annual doses by the actinides are the outcome of the direct release of the radionuclides into the overlying aquifer having high water flow rates, thereby resulting in an early arrival of higher concentrations of the radionuclides at the biosphere drinking water well prior to their significant decay. The salt GDS model analysis has also identified the following recommendations and/or knowledge gaps to improve and enhance the confidence of future repository performance analyses.
- Repository thermal loading by UNF and HLW, and the effect on the engineered barrier and near-field performance.
- Closure and consolidation of salt rocks by creep deformation under the influence of thermal perturbation, and the effect on the engineered barrier and near-field performance.
- Brine migration and radionuclide transport under the influence of thermal perturbation in a generic salt repository environment, and the effect on the engineered barrier, near-field performance and far-field performance.
- Near-field geochemistry and radionuclide mobility in a generic salt repository environment (high ionic strength brines, elevated temperatures and chemically reducing conditions).
- Degradation of engineered barrier components (waste package, waste canister, waste forms, etc.) in a generic salt repository environment (high ionic strength brines, elevated temperatures and chemically reducing conditions).
- Waste stream types and inventory estimates, particularly for reprocessing high-level waste. (authors)
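The sorption retardation invoked above for the interbed pathway is conventionally expressed through the linear-sorption retardation factor R = 1 + (rho_b / n) * Kd, which scales the effective travel time of a sorbing radionuclide relative to a non-sorbing one. A short Python sketch (the parameter values are assumed for illustration and are not taken from the salt GDS model):

```python
# Illustration of why sorbing actinides lag non-sorbing I-129 in the
# interbed transport pathway. All parameter values are hypothetical.

rho_b = 2000.0   # bulk density of interbed filling material, kg/m^3 (assumed)
n = 0.2          # porosity, dimensionless (assumed)

kd = {           # sorption distribution coefficients, m^3/kg (assumed)
    "I-129": 0.0,     # non-sorbing
    "Np-237": 0.005,
    "Pu-239": 0.05,
}

# R = 1 + (rho_b / n) * Kd: factor by which transport is slowed
retardation = {nuc: 1.0 + (rho_b / n) * k for nuc, k in kd.items()}

for nuclide, r in retardation.items():
    print(f"{nuclide}: retardation factor R = {r:,.0f}")
```

With these illustrative values, I-129 travels unretarded (R = 1) while the actinides are slowed by factors of tens to hundreds, consistent with the qualitative behavior described for the reference scenario.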

  13. Dispersion modeling of accidental releases of toxic gases - utility for the fire brigades.

    NASA Astrophysics Data System (ADS)

    Stenzel, S.; Baumann-Stanzer, K.

    2009-09-01

    Several air dispersion models are available for prediction and simulation of the hazard areas associated with accidental releases of toxic gases. Most model packages (commercial or free of charge) include a chemical database, an intuitive graphical user interface (GUI) and automated graphical output for effective presentation of results. The models are designed especially for analyzing different accidental toxic release scenarios ("worst-case scenarios"), preparing emergency response plans and optimal countermeasures, as well as for real-time risk assessment and management. The research project RETOMOD (reference scenario calculations for toxic gas releases - model systems and their utility for the fire brigade) was conducted by the Central Institute for Meteorology and Geodynamics (ZAMG) in cooperation with the Viennese fire brigade, OMV Refining & Marketing GmbH and Synex Ries & Greßlehner GmbH. RETOMOD was funded by the KIRAS safety research program of the Austrian Ministry of Transport, Innovation and Technology (www.kiras.at). The main tasks of this project were: 1. Sensitivity study and optimization of the meteorological input for modeling of the hazard areas (human exposure) during accidental toxic releases. 2. Comparison of several model packages (based on reference scenarios) in order to estimate their utility for the fire brigades. For the purpose of our study the following models were tested and compared: ALOHA (Areal Locations of Hazardous Atmospheres, EPA), MEMPLEX (Keudel av-Technik GmbH), TRACE (SAFER Systems), BREEZE (Trinity Consultants), SAM (Engineering office Lohmeyer). A set of reference scenarios for chlorine, ammonia, butane and petrol was processed with the models above in order to predict and estimate the human exposure during the event.
Furthermore, the application of the observation-based analysis and forecasting system INCA, developed at the Central Institute for Meteorology and Geodynamics (ZAMG), in the case of a toxic release was investigated. INCA (Integrated Nowcasting through Comprehensive Analysis) data are calculated operationally at 1 km horizontal resolution and are based on the weather forecast model ALADIN. The meteorological fields analyzed with INCA include temperature, humidity, wind, precipitation, cloudiness and global radiation. Within the project, INCA data were compared with measurements from the meteorological observational network at traffic-near sites in Vienna. INCA analysis and very-short-term forecast fields (up to 6 hours) were found to be an advanced means of providing on-line meteorological input for the model packages used by the fire brigade. Since the input requirements differ from model to model, and the outputs are based on unequal criteria for toxic area and exposure, a high degree of caution in the interpretation of the model results is required - especially in the case of low wind speeds, stable atmospheric conditions, and flow deflection by buildings in the urban area or by complex topography.
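The hazard-area estimates produced by such model packages are typically rooted in Gaussian plume dispersion. A minimal Python sketch of the standard Gaussian plume formula with ground reflection follows; the source rate, wind speed, and dispersion parameters are assumed values for illustration only (real tools such as those compared above derive sigma_y and sigma_z from stability class and downwind distance, and some use far more elaborate dense-gas or CFD treatments):

```python
import math

def gaussian_plume_conc(q, u, y, z, h, sigma_y, sigma_z):
    """Concentration (g/m^3) from a continuous point source of strength q
    (g/s) in wind u (m/s), with ground reflection. y is crosswind offset,
    z is receptor height, h is effective release height (all in meters)."""
    coeff = q / (2 * math.pi * u * sigma_y * sigma_z)
    lateral = math.exp(-y**2 / (2 * sigma_y**2))
    vertical = (math.exp(-(z - h)**2 / (2 * sigma_z**2))
                + math.exp(-(z + h)**2 / (2 * sigma_z**2)))
    return coeff * lateral * vertical

# Illustrative release: 1 kg/s source, 2 m/s wind; sigma values assumed
# to represent a stable case around 1 km downwind (hypothetical numbers).
c = gaussian_plume_conc(q=1000.0, u=2.0, y=0.0, z=1.5, h=2.0,
                        sigma_y=50.0, sigma_z=20.0)
print(f"Near-ground centerline concentration: {c:.3f} g/m^3")
```

Comparing such concentrations against a threshold (e.g., an ERPG level) along the plume centerline is what delimits the hazard area; the sensitivity to wind speed and stability visible in this formula is exactly why the meteorological input study above matters.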

  14. Waste Isolation Safety Assessment Program. Technical progress report for FY-1978

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brandstetter, A.; Harwell, M.A.; Howes, B.W.

    1979-07-01

    Associated with commercial nuclear power production in the United States is the generation of potentially hazardous radioactive wastes. The Department of Energy (DOE) is seeking to develop nuclear waste isolation systems in geologic formations that will preclude contact with the biosphere of waste radionuclides in concentrations which are sufficient to cause deleterious impact on humans or their environments. Comprehensive analyses of specific isolation systems are needed to assess the expectations of meeting that objective. The Waste Isolation Safety Assessment Program (WISAP) has been established at the Pacific Northwest Laboratory (operated by Battelle Memorial Institute) for developing the capability of making those analyses. Progress on the following tasks is reported: release scenario analysis, waste form release rate analysis, release consequence analysis, sorption-desorption analysis, and societal acceptance analysis. (DC)

  15. Topical Backgrounder: Evaluating Chemical Hazards in the Community: Using RMP's Offsite Consequence Analysis

    EPA Pesticide Factsheets

    Part of a May 1999 series on the Risk Management Program Rule and issues related to chemical emergency management. Explains hazard versus risk, worst-case and alternative release scenarios, flammable endpoints and toxic endpoints.

  16. General RMP Guidance - Chapter 4: Offsite Consequence Analysis

    EPA Pesticide Factsheets

    This chapter provides basic compliance information, not modeling methodologies, for people who plan to do their own air dispersion modeling. OCA is a required part of the risk management program, and involves worst-case and alternative release scenarios.

  17. Industrial Accidents Triggered by Natural Hazards: an Emerging Risk Issue

    NASA Astrophysics Data System (ADS)

    Renni, Elisabetta; Krausmann, Elisabeth; Basco, Anna; Salzano, Ernesto; Cozzani, Valerio

    2010-05-01

    Natural disasters such as earthquakes, tsunamis, flooding or hurricanes have recently and dramatically hit several countries worldwide. Both direct and indirect consequences affected the population, causing on the one hand a high number of fatalities and on the other hand economic losses so significant that the gross national product may be affected for many years. Loss of critical industrial infrastructures (electricity generation and distribution, gas pipelines, oil refineries, etc.) also occurred, causing further indirect damage to the population. In several cases, accident scenarios with large releases of hazardous materials were triggered by these natural events, causing so-called "Natech events", in which the overall damage resulted from the simultaneous consequences of the natural event and of the release of hazardous substances. Toxic releases, large fires and explosions, as well as possible long-term environmental pollution, economic losses, and overloading of emergency systems were recognized by post-event studies as the main issues of these Natech scenarios. In recent years, the increasing frequency and severity of some natural hazards due to climate change have slowly raised the awareness of Natech risk as an emerging risk among stakeholders. Indeed, the iNTeg-Risk project, co-funded by the European Commission within the 7th Framework Programme, specifically addresses these scenarios among new technological issues in public safety. The present study, carried out in part within the iNTeg-Risk project, was aimed at the analysis and further development of methods and tools for the assessment and mitigation of Natech accidents. Available tools and knowledge gaps in the assessment of Natech scenarios were highlighted. The analysis mainly addressed the potential impact of flood, lightning and earthquake events on industrial installations where hazardous substances are present.
Preliminary screening methodologies and more detailed methods based on quantitative risk analysis were developed. Strategies based on the use of multiple information layers aiming at the identification of mitigation and early warning systems were also explored. A case-study in the Emilia-Romagna region is presented.

  18. Benchmarking analysis of three multimedia models: RESRAD, MMSOILS, and MEPAS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cheng, J.J.; Faillace, E.R.; Gnanapragasam, E.K.

    1995-11-01

    Multimedia modelers from the United States Environmental Protection Agency (EPA) and the United States Department of Energy (DOE) collaborated to conduct a comprehensive and quantitative benchmarking analysis of three multimedia models. The three models, RESRAD (DOE), MMSOILS (EPA), and MEPAS (DOE), represent analytically based tools that are used by the respective agencies for performing human exposure and health risk assessments. The study is performed by individuals who participate directly in the ongoing design, development, and application of the models. A list of physical/chemical/biological processes related to multimedia-based exposure and risk assessment is first presented as a basis for comparing the overall capabilities of RESRAD, MMSOILS, and MEPAS. Model design, formulation, and function are then examined by applying the models to a series of hypothetical problems. Major components of the models (e.g., atmospheric, surface water, groundwater) are evaluated separately and then studied as part of an integrated system for the assessment of a multimedia release scenario to determine effects due to linking components of the models. Seven modeling scenarios are used in the conduct of this benchmarking study: (1) direct biosphere exposure, (2) direct release to the air, (3) direct release to the vadose zone, (4) direct release to the saturated zone, (5) direct release to surface water, (6) surface water hydrology, and (7) multimedia release. Study results show that the models differ with respect to (1) environmental processes included (i.e., model features) and (2) the mathematical formulation and assumptions related to the implementation of solutions (i.e., parameterization).

  19. Mercury Dispersion Modeling And Purge Ventilation Stack Height Determination For Tank 40H

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rivera-Giboyeaux, A.

    2017-05-19

    The SRNL Atmospheric Technologies Group performed an analysis of mercury emissions from the H-Tank Farm - Tank 40 ventilation system exhaust in order to assess whether the Short Term Exposure Limit (STEL) or Threshold Limit Value (TLV) levels for mercury will be exceeded during bulk sludge slurry mixing and sludge removal operations. The American Meteorological Society/Environmental Protection Agency Regulatory Model (AERMOD) was used as the main dispersion modeling tool for this analysis. The results indicated that a 45-foot stack is sufficient to raise the plume centerline from the Tank 40 release to prevent mercury exposure problems for any of the stack discharge scenarios provided. However, a 42-foot stack at Tank 40 is sufficient to prevent mercury exposure concerns in all emission scenarios except the 50 mg/m3 release. At a 42-foot stack height, values exceeding the exposure standards are only measured at receptors located above 34 feet.

  20. Development of the table of initial isolation distances and protective action distances for the 2004 emergency response guidebook.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, D. F.; Freeman, W. A.; Carhart, R. A.

    2005-09-23

    This report provides technical documentation for values in the Table of Initial Isolation and Protective Action Distances (PADs) in the 2004 Emergency Response Guidebook (ERG2004). The objective in choosing the PADs specified in the ERG2004 is to balance the need to adequately protect the public from exposure to potentially harmful substances against the risks and expenses that could result from overreacting to a spill. To quantify this balance, a statistical approach is adopted, whereby the best available information is used to conduct an accident scenario analysis and develop a set of up to 1,000,000 hypothetical incidents. The set accounts for differences in container types, incident types, accident severity (i.e., amounts released), locations, times of day, times of year, and meteorological conditions. Each scenario is analyzed using detailed emission rate and atmospheric dispersion models to calculate the downwind chemical concentrations from which a 'safe distance' is determined. The safe distance is defined as the distance downwind from the source at which the chemical concentration falls below health protection criteria. The American Industrial Hygiene Association's Emergency Response Planning Guideline Level 2 (ERPG-2) or equivalent is the health criterion used. The statistical sample of safe distance values for all incidents considered in the analysis is separated into four categories: small spill/daytime release, small spill/nighttime release, large spill/daytime release, and large spill/nighttime release. The 90th-percentile safe distance values for each of these groups became the PADs that appear in the ERG2004.
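The percentile-based procedure described above can be sketched in a few lines of Python. The lognormal safe-distance generator below is a hypothetical stand-in for the report's detailed emission-rate and dispersion models, and the category scales are invented; only the sample-then-take-the-90th-percentile logic mirrors the described approach:

```python
import math
import random

# Assumed typical safe distances (m) per category, for illustration only
CATEGORY_SCALE = {
    ("small", "day"): 100.0,
    ("small", "night"): 300.0,
    ("large", "day"): 400.0,
    ("large", "night"): 1200.0,
}

random.seed(42)  # reproducible illustrative run

def simulated_safe_distance(spill, tod):
    """Stand-in for the per-incident modeling chain; returns meters."""
    return random.lognormvariate(math.log(CATEGORY_SCALE[(spill, tod)]), 0.5)

def percentile(values, p):
    """Simple nearest-rank percentile of a list of numbers."""
    s = sorted(values)
    return s[int(p / 100 * (len(s) - 1))]

pads = {}
for (spill, tod) in CATEGORY_SCALE:
    sample = [simulated_safe_distance(spill, tod) for _ in range(10_000)]
    pads[f"{spill} spill / {tod}"] = percentile(sample, 90)

for label, d in sorted(pads.items()):
    print(f"{label}: PAD = {d:,.0f} m")
```

Taking the 90th percentile rather than the maximum embodies the balance stated above: it protects the public in the large majority of incidents without sizing every response to the rarest extreme.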

  1. TSCA Work Plan Chemical Problem Formulation and Initial Assessment Tetrabromobisphenol A and Related Chemicals Cluster Flame Retardants

    EPA Pesticide Factsheets

    EPA released a problem formulation for TBBPA and related chemicals used as flame retardants in plastics/printed circuit boards for electronics. The goal of this problem formulation was to identify scenarios where further risk analysis may be necessary.

  2. Development of nonproliferation and assessment scenarios.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Finley, Melissa; Barnett, Natalie Beth

    2005-10-01

    The overall objective of the Nonproliferation and Assessments Scenario Development project is to create and analyze potential and plausible scenarios that would lead to an adversary's ability to acquire and use a biological weapon. The initial three months of funding was intended to be used to develop a scenario to demonstrate the efficacy of this analysis methodology; however, it was determined that a substantial amount of preliminary data collection would be needed before a proof of concept scenario could be developed. We have dedicated substantial effort to determine the acquisition pathways for Foot and Mouth Disease Virus, and similar processes will be applied to all pathogens of interest. We have developed a biosecurity assessments database to capture information on adversary skill locales, available skill sets in specific regions, pathogen sources and regulations involved in pathogen acquisition from legitimate facilities. FY06 funding, once released, will be dedicated to data collection on acquisition, production and dissemination requirements on a pathogen basis. Once pathogen data has been collected, scenarios will be developed and scored.

  3. Variable thickness transient ground-water flow model. Volume 3. Program listings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reisenauer, A.E.

    1979-12-01

    The Assessment of Effectiveness of Geologic Isolation Systems (AEGIS) Program is developing and applying the methodology for assessing the far-field, long-term post-closure safety of deep geologic nuclear waste repositories. AEGIS is being performed by Pacific Northwest Laboratory (PNL) under contract with the Office of Nuclear Waste Isolation (ONWI) for the Department of Energy (DOE). One task within AEGIS is the development of methodology for analysis of the consequences (water pathway) from loss of repository containment as defined by various release scenarios. Analysis of the long-term, far-field consequences of release scenarios requires the application of numerical codes which simulate the hydrologic systems, model the transport of released radionuclides through the hydrologic systems to the biosphere, and, where applicable, assess the radiological dose to humans. Hydrologic and transport models are available at several levels of complexity or sophistication. Model selection and use are determined by the quantity and quality of input data. Model development under AEGIS and related programs provides three levels of hydrologic models, two levels of transport models, and one level of dose models (with several separate models). This is the third of 3 volumes of the description of the VTT (Variable Thickness Transient) Groundwater Hydrologic Model - second level (intermediate complexity) two-dimensional saturated groundwater flow.

  4. Preliminary assessment of the health and environmental impacts of transporting M55 rockets from Lexington-Blue Grass Depot activity, Anniston Army depot, and Umatilla depot activity to alternative disposal facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carnes, S.A.; Breck, J.E.; Copenhaver, E.D.

    1986-03-01

    This assessment discusses the potential health and environmental impacts of transporting M55 rockets filled with nerve agent GB or VX from various existing Army storage depots to alternative Army depots for disposal. The origin depots include Anniston Army Depot in Alabama, Lexington-Blue Grass Depot Activity in Kentucky, and Umatilla Depot Activity in Oregon. The destination depots include Pine Bluff Arsenal in Arkansas, Tooele Army Depot in Utah, and the facility on Johnston Island in the central Pacific Ocean. This assessment considers the possible impacts of normal transport operations and of two postulated accident scenarios on the air quality, ground and surface water, aquatic ecology, terrestrial ecology, human health, and cultural and socioeconomic resources of the various transport corridors involved. The impacts of these scenarios are assessed for truck, train, and air transport for each origin-destination pair. The analysis considers three basic scenarios during transport: (1) normal operations with no atmospheric release of nerve agent; (2) a minor agent spill (the contents of one rocket being released to the biosphere); and (3) a worst-case accident involving the release of a large, specified quantity of nerve agent to the biosphere. The extremely low probabilities of such accidents, which are reported elsewhere, are noted.

  5. CIRMIS Data system. Volume 2. Program listings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Friedrichs, D.R.

    1980-01-01

    The Assessment of Effectiveness of Geologic Isolation Systems (AEGIS) Program is developing and applying the methodology for assessing the far-field, long-term post-closure safety of deep geologic nuclear waste repositories. AEGIS is being performed by Pacific Northwest Laboratory (PNL) under contract with the Office of Nuclear Waste Isolation (ONWI) for the Department of Energy (DOE). One task within AEGIS is the development of methodology for analysis of the consequences (water pathway) from loss of repository containment as defined by various release scenarios. Analysis of the long-term, far-field consequences of release scenarios requires the application of numerical codes which simulate the hydrologic systems, model the transport of released radionuclides through the hydrologic systems to the biosphere, and, where applicable, assess the radiological dose to humans. The various input parameters required in the analysis are compiled in data systems. The data are organized and prepared by various input subroutines for utilization by the hydraulic and transport codes. The hydrologic models simulate the groundwater flow systems and provide water flow directions, rates, and velocities as inputs to the transport models. Outputs from the transport models are basically graphs of radionuclide concentration in the groundwater plotted against time. After dilution in the receiving surface-water body (e.g., lake, river, bay), these data are the input source terms for the dose models, if dose assessments are required. The dose models calculate radiation dose to individuals and populations. CIRMIS (Comprehensive Information Retrieval and Model Input Sequence) Data System is a storage and retrieval system for model input and output data, including graphical interpretation and display. This is the second of four volumes of the description of the CIRMIS Data System.

  6. Conductive plastics: comparing alternative nanotechnologies by performance and life cycle release probability

    NASA Astrophysics Data System (ADS)

    Neubauer, Nicole; Wohlleben, Wendel; Tomović, Željko

    2017-03-01

    Nanocomposites can be considered safe during their life cycle as long as the nanofillers remain embedded in the matrix. Therefore, a possible release of nanofillers has to be assessed before commercialization. This report addresses possible life cycle release scenarios for carbon nanotubes (CNT), graphene, and carbon black (CB) from a thermoplastic polyurethane (TPU) matrix. The content of each nanofiller was adjusted to achieve the same conductivity level. The nanofillers reduced the rate of nanoscale releases during mechanical processing, with release decreasing in the order neat TPU, TPU-CNT, TPU-graphene, and TPU-CB. Released fragments were dominated by the polymer matrix with embedded or surface-protruding nanofillers. During electron microscopy analysis, free CB was observed; however, no free CNT or graphene was found. Quantitatively, the presence of free nanofillers remained below the detection limit of <0.01% of the generated dust. Further, both the production process and the type of mechanical processing showed a significant impact, with higher release rates for injection-molded compared to extruded materials and for sanded compared to drilled materials. Due to its optimal performance for further development, extruded TPU-CNT was investigated in a combined, stepwise worst-case scenario (mechanical processing after weathering). After weathering by simulated sunlight and rain, CNT were visible at the surface of the nanocomposite; after additional sanding, fragments showed protruding CNT, but free CNT were not detected. In summary, this preliminary exposure assessment showed no indication that recommended occupational exposure limits for carbonaceous nanomaterials can be exceeded during the life cycle of the specific TPU nanocomposites and conditions investigated in this study.

  7. The influence of air-conditioning on street temperatures in the city of Paris

    NASA Astrophysics Data System (ADS)

    de Munck, C. S.; Pigeon, G.; Masson, V.; Marchadier, C.; Meunier, F.; Tréméac, B.; Merchat, M.

    2010-12-01

    A consequence of urban heat islands in summer is the increased use of air-conditioning during extreme heat events: air-conditioning systems, while cooling the inside of buildings, release waste heat (as latent and sensible heat) into the lower part of the urban atmosphere, potentially increasing street air temperatures where the heat is released. This local temperature increase in turn raises the air cooling demand, while at the same time lowering the efficiency of the air-conditioning units. A coupled model consisting of a meso-scale meteorological model (MESO-NH) and an urban energy balance model (TEB) was implemented with an air-conditioning module and used in combination with real spatialised datasets to understand and quantify potential increases in temperature due to air-conditioning heat releases for the city of Paris. In a first instance, the current types of air-conditioning systems co-existing in the city (underground chilled water network, wet cooling towers and individual air-conditioning units) were simulated to study the effects of latent and sensible heat releases on street temperatures. In a second instance, two scenarios were tested to characterise the impacts of likely future trends in air-conditioning equipment in the city: a first scenario in which current heat releases were converted to sensible heat, and a second based on 2030s projections of air-conditioning equipment at the scale of the city. All the scenarios showed an increase in street temperature which, as expected, was greater at night time than at day time. For the first two scenarios, this increase in street temperatures was localised at or near the sources of air-conditioner heat releases, while the 2030s air-conditioning scenario impacted wider zones in the city.
The amplitude of the increase in temperature varied from 0,25°C to 1°C for the air-conditioning current state, between 0,25°C and 2°C for the sensible heat release only scenario, and finally from 0,25°C to 2 °C for the 2030s scenario, with impacts of up to 3°C locally. Overall, these results demonstrated to which extend the use air-conditioning could enhance street temperatures in the city of Paris and the importance of a spatialised approach.

  8. Nanosilver conductive ink: A case study for evaluating the potential risk of nanotechnology under hypothetical use scenarios.

    PubMed

    Martin, David P; Melby, Nicolas L; Jordan, Shinita M; Bednar, Anthony J; Kennedy, Alan J; Negrete, Maria E; Chappell, Mark A; Poda, Aimee R

    2016-11-01

    Engineered nanomaterials (ENMs) are being incorporated into a variety of consumer products due to unique properties that offer advantages over bulk materials. Understanding of the nano-specific risk associated with nano-enabled technologies, however, continues to lag behind research and development, registration with regulators, and commercialization. One example of a nano-enabled technology is nanosilver ink, which can be used in commercial ink-jet printers to produce low-cost printable electronics. This investigation uses a tiered EHS framework to evaluate the potential nano-specific release, exposure and hazard associated with typical use of both nanosilver ink and printed circuits. The framework guides determination of the potential for ENM release from both forms of the technology in simulated use scenarios, including spilling of the ink, aqueous release (washing) from the circuits, and UV light exposure. The as-supplied ink merits nano-specific consideration based on the presence of nanoparticles and their persistence in environmentally relevant media. The material released from the printed circuits upon aqueous exposure was characterized by a number of analysis techniques, including ultracentrifugation and single-particle ICP-MS; the results suggest that the vast majority of the material was ionic in nature, so nano-specific regulatory scrutiny may be less relevant. Published by Elsevier Ltd.

  9. Finite-element three-dimensional ground-water (FE3DGW) flow model - formulation, program listings and users' manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gupta, S.K.; Cole, C.R.; Bond, F.W.

    1979-12-01

    The Assessment of Effectiveness of Geologic Isolation Systems (AEGIS) Program is developing and applying the methodology for assessing the far-field, long-term post-closure safety of deep geologic nuclear waste repositories. AEGIS is being performed by Pacific Northwest Laboratory (PNL) under contract with the Office of Nuclear Waste Isolation (ONWI) for the Department of Energy (DOE). One task within AEGIS is the development of methodology for analysis of the consequences (water pathway) from loss of repository containment as defined by various release scenarios. Analysis of the long-term, far-field consequences of release scenarios requires the application of numerical codes which simulate the hydrologic systems, model the transport of released radionuclides through the hydrologic systems to the biosphere, and, where applicable, assess the radiological dose to humans. Hydrologic and transport models are available at several levels of complexity or sophistication. Model selection and use are determined by the quantity and quality of input data. Model development under AEGIS and related programs provides three levels of hydrologic models, two levels of transport models, and one level of dose models (with several separate models). This document describes the FE3DGW (Finite Element, Three-Dimensional Groundwater) hydrologic model, the third-level (high-complexity) model, which uses a three-dimensional finite-element approach (Galerkin formulation) for saturated groundwater flow.

  10. Risk analysis for veterinary biologicals released into the environment.

    PubMed

    Silva, S V; Samagh, B S; Morley, R S

    1995-12-01

    All veterinary biologicals licensed in Canada must be shown to be pure, potent, safe and effective. A risk-based approach is used to evaluate the safety of all biologicals, whether produced by conventional methods or by molecular biological techniques. Traditionally, qualitative risk assessment methods have been used for this purpose. More recently, quantitative risk assessment has become available for complex issues. The quantitative risk assessment method uses "scenario tree analysis" to predict the likelihood of various outcomes and their respective impacts. The authors describe the quantitative risk assessment approach, which is used within the broader context of risk analysis (i.e. risk assessment, risk management and risk communication) to develop recommendations for the field release of veterinary biologicals. The general regulatory framework for the licensing of veterinary biologicals in Canada is also presented.
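The scenario tree analysis mentioned above can be sketched numerically: each stage of a release pathway is given a conditional branch probability, and the joint probability of each outcome is the product of the probabilities along its path. The stages, labels, and numbers below are invented for illustration, and the sketch enumerates a full factorial of branches for simplicity (a real tree would prune branches terminated by an upstream "no"):

```python
import itertools

# Hypothetical stages of a field-release scenario tree; each stage lists
# (branch, conditional probability). Labels and numbers are invented.
tree = {
    "organism shed":       [("yes", 0.10), ("no", 0.90)],
    "spread to livestock": [("yes", 0.20), ("no", 0.80)],
    "clinical disease":    [("yes", 0.05), ("no", 0.95)],
}

def path_probabilities(tree):
    """Joint probability of every root-to-leaf path (full enumeration)."""
    stages = list(tree.values())
    paths = {}
    for combo in itertools.product(*stages):
        labels = tuple(label for label, _ in combo)
        p = 1.0
        for _, q in combo:
            p *= q
        paths[labels] = p
    return paths

paths = path_probabilities(tree)
print(f"P(worst-case path) = {paths[('yes', 'yes', 'yes')]:.4f}")  # 0.10 * 0.20 * 0.05
```

Each outcome's probability can then be paired with an impact estimate to rank scenarios, which is the essence of the quantitative approach the abstract describes.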

  11. Evaluating Potential for Large Releases from CO2 StorageReservoirs: Analogs, Scenarios, and Modeling Needs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Birkholzer, Jens; Pruess, Karsten; Lewicki, Jennifer

    2005-09-19

    While the purpose of geologic storage of CO{sub 2} in deep saline formations is to trap greenhouse gases underground, the potential exists for CO{sub 2} to escape from the target reservoir, migrate upward along permeable pathways, and discharge at the land surface. Such discharge is not necessarily a serious concern, as CO{sub 2} is a naturally abundant and relatively benign gas in low concentrations. However, there is a potential risk to health, safety and environment (HSE) in the event that large localized fluxes of CO{sub 2} were to occur at the land surface, especially where CO{sub 2} could accumulate. In this paper, we develop possible scenarios for large CO{sub 2} fluxes based on the analysis of natural analogues, where large releases of gas have been observed. We are particularly interested in scenarios which could generate sudden, possibly self-enhancing, or even eruptive release events. The probability for such events may be low, but the circumstances under which they might occur and potential consequences need to be evaluated in order to design appropriate site selection and risk management strategies. Numerical modeling of hypothetical test cases is needed to determine critical conditions for such events, to evaluate whether such conditions may be possible at designated storage sites, and, if applicable, to evaluate the potential HSE impacts of such events and design appropriate mitigation strategies.

  12. Diminished Wastewater Treatment: Evaluation of Septic System Performance Under a Climate Change Scenario

    NASA Astrophysics Data System (ADS)

    Cooper, J.; Loomis, G.; Kalen, D.; Boving, T. B.; Morales, I.; Amador, J.

    2015-12-01

    The effects of climate change are expected to reduce the ability of soil-based onsite wastewater treatment systems (OWTS) to treat domestic wastewater. In the northeastern U.S., the projected increase in atmospheric temperature, elevation of water tables from rising sea levels, and heightened precipitation will reduce the volume of unsaturated soil and oxygen available for treatment. Incomplete removal of contaminants may lead to transport of pathogens, nutrients, and biochemical oxygen demand (BOD) to groundwater, increasing the risk to public health and the likelihood of eutrophying aquatic ecosystems. Advanced OWTS, which include pre-treatment steps and provide unsaturated drainfields of greater volume relative to conventional OWTS, are expected to be more resilient to climate change. We used intact soil mesocosms to quantify water quality functions for two advanced shallow narrow drainfield types and a conventional drainfield under a current climate scenario and a moderate climate change scenario of 30 cm rise in water table and 5°C increase in soil temperature. While no fecal coliform bacteria (FCB) were released under the current climate scenario, up to 109 CFU FCB/mL (conventional) and up to 20 CFU FCB/mL (shallow narrow) were released under the climate change scenario. Total P removal rates dropped from 100% to 54% (conventional) and 71% (shallow narrow) under the climate change scenario. Total N removal averaged 17% under both climate scenarios in the conventional, but dropped from 5.4% to 0% in the shallow narrow under the climate change scenario, with additional leaching of N in excess of inputs indicating release of previously held N. No significant difference was observed between scenarios for BOD removal. The initial data indicate that while advanced OWTS retain more function under the climate change scenario, all three drainfield types experience some diminished treatment capacity.
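The removal rates quoted above reduce to a simple mass balance over each mesocosm. The sketch below, with invented concentrations (not values from the study), shows how a percent-removal figure is obtained from influent and effluent concentrations:

```python
def removal_pct(influent, effluent):
    """Percent removal of a contaminant across a drainfield mesocosm,
    from influent and effluent concentrations in the same units."""
    if influent <= 0:
        raise ValueError("influent concentration must be positive")
    return 100.0 * (influent - effluent) / influent

# Illustrative total-P concentrations in mg/L (invented, not study data).
print(removal_pct(10.0, 0.0))   # complete removal, as under the current climate
print(removal_pct(10.0, 4.6))   # diminished removal, as under the change scenario
```

A removal rate above 100% or below 0% (as for N in the shallow narrow drainfield) signals release of previously retained material rather than treatment.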

  13. Water temperature effects from simulated dam operations and structures in the Middle Fork Willamette River, western Oregon

    USGS Publications Warehouse

    Buccola, Norman L.; Turner, Daniel F.; Rounds, Stewart A.

    2016-09-14

    Significant Findings: Streamflow and water temperature in the Middle Fork Willamette River (MFWR), western Oregon, have been regulated and altered since the construction of Lookout Point, Dexter, and Hills Creek Dams in 1954 and 1961. Each year, summer releases from the dams typically are cooler than pre-dam conditions, with the reverse (warmer than pre-dam conditions) occurring in autumn. This pattern has been detrimental to habitat of endangered Upper Willamette River (UWR) Chinook salmon (Oncorhynchus tshawytscha) and UWR winter steelhead (O. mykiss) throughout multiple life stages. In this study, scenarios testing different dam-operation strategies and hypothetical dam-outlet structures were simulated using CE-QUAL-W2 hydrodynamic/temperature models of the MFWR system from Hills Creek Lake (HCR) to Lookout Point (LOP) and Dexter (DEX) Lakes to explore and understand the efficacy of potential flow and temperature mitigation options. Model scenarios were run in constructed wet, normal, and dry hydrologic calendar years, and designed to minimize the effects of Hills Creek and Lookout Point Dams on river temperature by prioritizing warmer lake-surface releases in May–August and cooler, deep releases in September–December. Operational scenarios consisted of a range of modified release-rate rules, relaxation of power-generation constraints, variations in the timing of refill and drawdown, and maintenance of different summer maximum lake levels at HCR and LOP. Structural scenarios included various combinations of hypothetical floating outlets near the lake surface and hypothetical new outlets at depth.
Scenario results were compared to scenarios using existing operational rules that give temperature management some priority (Base), scenarios using pre-2012 operational rules that prioritized power generation over temperature management (NoBlend), and estimated temperatures from a without-dams condition (WoDams). Results of the tested model scenarios led to the following conclusions:
    - The existing outlets at Lookout Point Dam, because of their range of depths, allow for greater temperature control than the two existing outlets at Hills Creek Dam, which are relatively deep.
    - Temperature control at HCR through operational scenarios generally was minimal near Hills Creek Dam, but improved downstream toward the head of LOP when decreased release rates held HCR at a low lake elevation year-round.
    - Inflows from unregulated streams between HCR and LOP helped to dilute the effects of HCR and achieve more natural stream temperatures before the MFWR entered LOP.
    - The relative benefit of any particular scenario depended on the location in the MFWR system used to assess the potential change, with most scenarios involving changes to Hills Creek Dam being less effective with increasing downstream distance, such as downstream of DEX.
    - To achieve as much temperature control as the most successful structural scenarios, which were able to resemble without-dam conditions for part of the year, most operational scenarios had to be free of any power-generation requirements at Lookout Point Dam.
    - Downstream of DEX, scenarios incorporating a hypothetical floating outlet at either HCR or LOP resulted in similar temperatures, with both scenarios delaying the estimated spring Chinook egg emergence by about 9–10 days compared to base-case temperature-management scenarios.

  14. An approach for estimating toxic releases of H2S-containing natural gas.

    PubMed

    Jianwen, Zhang; Da, Lei; Wenxing, Feng

    2014-01-15

    China is well known to be rich in sulfurous natural gas, with huge deposits widely distributed across the country. Because of its toxicity, the release of hydrogen-sulfide-containing natural gas from pipelines poses serious threats to people, society and the environment around the release source. A CFD algorithm is adopted to simulate the gas dispersion process, and the results show that a Gaussian plume model is suitable for determining the region affected by a well blowout of hydrogen-sulfide-containing natural gas. Based on an analysis of release scenarios, the present study proposes a new approach for estimating the risk of hydrogen sulfide poisoning hazards caused by such releases. Historical accident statistics from the EGIG (European Gas Pipeline Incident Data Group) and Britain's Gas Transco are integrated into the approach. The dose-load effect is also introduced to express the hazard in terms of two essential parameters - toxic concentration and exposure time. The approach was applied to three release scenarios on the East-Sichuan Gas Transportation Project, and the individual risk and societal risk are classified and discussed. Results show that societal risk varies significantly with factors including population density, distance from the pipeline, and operating conditions. Concerning the dispersion process of hazardous gas, available safe egress time was studied from the perspective of individual fatality risk. The present approach can provide reliable support for the safety management and maintenance of natural gas pipelines, as well as for evacuations after release incidents. Copyright © 2013 Elsevier B.V. All rights reserved.
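The two ingredients named in the abstract, a Gaussian plume concentration estimate and a concentration-time dose load, can be sketched as below. The linear dispersion coefficients and the H2S toxic-load exponent are assumptions for illustration, not values from the paper:

```python
import math

def plume_centerline(Q, u, x, H, a=0.08, b=0.06):
    """Ground-level centerline concentration (kg/m^3) downwind of a
    continuous point source: the standard Gaussian plume result with
    full ground reflection,
        C = Q / (pi * u * sigma_y * sigma_z) * exp(-H^2 / (2 sigma_z^2)).
    Q: release rate (kg/s), u: wind speed (m/s), x: downwind distance (m),
    H: effective release height (m). sigma_y = a*x and sigma_z = b*x are
    crude linear stand-ins for stability-class dispersion curves."""
    sigma_y, sigma_z = a * x, b * x
    return (Q / (math.pi * u * sigma_y * sigma_z)) * math.exp(-H**2 / (2 * sigma_z**2))

def toxic_load(conc_ppm, minutes, n=4.36):
    """Dose-load effect: toxic load = C^n * t. The exponent n = 4.36 is a
    commonly cited probit exponent for H2S, used here only as an assumption."""
    return conc_ppm ** n * minutes

# Example: 1 kg/s leak, 5 m/s wind, receptor 1 km downwind, ground-level source.
print(f"{plume_centerline(1.0, 5.0, 1000.0, 0.0):.3e} kg/m^3")
```

Combining the concentration field with exposure duration via the toxic load is what lets the approach convert a dispersion result into individual and societal risk estimates.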

  15. Application of an NLME-Stochastic Deconvolution Approach to Level A IVIVC Modeling.

    PubMed

    Kakhi, Maziar; Suarez-Sharp, Sandra; Shepard, Terry; Chittenden, Jason

    2017-07-01

    Stochastic deconvolution is a parameter estimation method that calculates drug absorption using a nonlinear mixed-effects model in which the random effects associated with absorption represent a Wiener process. The present work compares (1) stochastic deconvolution and (2) numerical deconvolution, using clinical pharmacokinetic (PK) data generated for an in vitro-in vivo correlation (IVIVC) study of extended release (ER) formulations of a Biopharmaceutics Classification System class III drug substance. The preliminary analysis found that numerical and stochastic deconvolution yielded superimposable fraction absorbed (Fabs) versus time profiles when supplied with exactly the same externally determined unit impulse response parameters. In a separate analysis, a full population-PK/stochastic deconvolution was applied to the clinical PK data. Scenarios were considered in which immediate release (IR) data were either retained or excluded to inform parameter estimation. The resulting Fabs profiles were then used to model level A IVIVCs. All the considered stochastic deconvolution scenarios, and numerical deconvolution, yielded on average similar results with respect to the IVIVC validation. These results could be achieved with stochastic deconvolution without recourse to IR data. Unlike numerical deconvolution, this also implies that in crossover studies where certain individuals do not receive an IR treatment, their ER data alone can still be included as part of the IVIVC analysis. Published by Elsevier Inc.
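Numerical deconvolution of the kind compared here can be sketched as a forward-substitution solve of the discrete convolution between an input (absorption) rate and a unit impulse response. The grid, impulse response, and input profile below are invented for illustration and are not from the study:

```python
import math

def deconvolve(c, h, dt):
    """Recover the input-rate profile r from an observed response c and a
    unit impulse response h sampled on the same uniform grid, using the
    discrete convolution c[k] = dt * sum_{j<=k} r[j] * h[k-j] and forward
    substitution (a minimal numerical-deconvolution sketch)."""
    r = []
    for k in range(len(c)):
        acc = dt * sum(r[j] * h[k - j] for j in range(k))
        r.append((c[k] - acc) / (h[0] * dt))
    return r

# Demo: simulate a response from a known input, then recover that input.
h = [math.exp(-0.1 * k) for k in range(8)]         # assumed unit impulse response
r_true = [1.0, 2.0, 3.0, 0.0, 0.0, 0.0, 0.0, 0.0]  # hypothetical absorption rates
c = [sum(r_true[j] * h[k - j] for j in range(k + 1)) for k in range(8)]  # dt = 1
print([round(x, 6) for x in deconvolve(c, h, 1.0)])  # → [1.0, 2.0, 3.0, 0.0, 0.0, 0.0, 0.0, 0.0]
```

Cumulatively summing the recovered rates and normalizing gives the fraction-absorbed profile used in level A IVIVC modeling; the stochastic variant instead estimates the input within a mixed-effects model.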

  16. Financial Analysis of Experimental Releases Conducted at Glen Canyon Dam during Water Year 2015

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Graziano, D. J.; Poch, L. A.; Veselka, T. D.

    This report examines the financial implications of experimental flows conducted at the Glen Canyon Dam (GCD) in water year (WY) 2015. It is the seventh report in a series examining the financial implications of experimental flows conducted since the Record of Decision (ROD) was adopted in February 1997 (Reclamation 1996). A report released in January 2011 examined WYs 1997 to 2005 (Veselka et al. 2011); a report released in August 2011 examined WYs 2006 to 2010 (Poch et al. 2011); a report released June 2012 examined WY 2011 (Poch et al. 2012); a report released April 2013 examined WY 2012 (Poch et al. 2013); a report released June 2014 examined WY 2013 (Graziano et al. 2014); and a report released September 2015 examined WY 2014 (Graziano et al. 2015). An experimental release may have either a positive or negative impact on the financial value of energy production. Only one experimental release was conducted at GCD in WY 2015; specifically, a high flow experimental (HFE) release conducted in November 2014. For this experimental release, financial costs of approximately $2.1 million were incurred because the HFE required sustained water releases that exceeded the powerplant’s maximum flow rate. In addition, during the month of the experiment, operators were not allowed to shape GCD power production to either follow firm power customer loads or to respond to market prices. This study identifies the main factors that contribute to HFE costs and examines the interdependencies among these factors. It applies an integrated set of tools to estimate financial impacts by simulating GCD operations under two scenarios: (1) a baseline scenario that mimics HFE operations during the experiment and compliance with the 1996 ROD operating criteria during the rest of the year, and (2) a “without experiments” scenario that is identical to the baseline except it assumes that the HFE did not occur.
The Generation and Transmission Maximization (GTMax) model was the main simulation tool used to simulate the dispatch of hydropower plants at GCD and other plants that comprise the Salt Lake City Area Integrated Projects (SLCA/IP). The research team used extensive data sets and historical information on SLCA/IP powerplant characteristics, hydrologic conditions, and Western Area Power Administration’s (WAPA’s) power purchase prices in the modeling process. In addition to estimating the financial impact of the HFE, the team used the GTMax model to gain insights into the interplay among ROD operating criteria, exceptions that were made to criteria to accommodate the experimental releases, and WAPA operating practices.
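At its core, the financial impact of an experimental release is the difference in production value between the counterfactual and baseline dispatches of the same water. A toy sketch with invented hourly prices and generation figures (not GTMax inputs or outputs) illustrates the accounting:

```python
# Invented hourly prices ($/MWh) and generation (MWh); illustrative only.
prices       = [30.0, 55.0, 80.0, 45.0]
gen_baseline = [900, 900, 900, 900]     # flat release during the HFE, no load shaping
gen_no_exper = [700, 900, 1100, 900]    # same water, shaped toward high-price hours

def production_value(prices, gen):
    """Market value of hydropower production over the period."""
    return sum(p * g for p, g in zip(prices, gen))

# Cost of the experiment = value forgone relative to the counterfactual.
impact = production_value(prices, gen_no_exper) - production_value(prices, gen_baseline)
assert sum(gen_baseline) == sum(gen_no_exper)   # same total water released
print(f"financial cost of the experiment: ${impact:,.0f}")
```

The sketch captures why the HFE's cost arises even with the same water volume: the baseline cannot shift generation into high-price hours, which is exactly the shaping restriction the report describes.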

  17. Hydrotreater/Distillation Column Hazard Analysis Report Rev. 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lowry, Peter P.; Wagner, Katie A.

    This Hazard and Risk Analysis Report contains the results of several hazard analyses and risk assessments. An initial assessment was conducted in 2012, which included a multi-step approach ranging from design reviews to a formal What-If hazard analysis. A second What-If hazard analysis was completed during February 2013 to evaluate the operation of the hydrotreater/distillation column processes to be installed in a process enclosure within the Process Development Laboratory West (PDL-West) facility located on the PNNL campus. The qualitative analysis included participation of project and operations personnel and applicable subject matter experts. The analysis identified potential hazardous scenarios, each based on an initiating event coupled with a postulated upset condition. The unmitigated consequences of each hazardous scenario were generally characterized as a process upset; the exposure of personnel to steam, vapors or hazardous material; a spray or spill of hazardous material; the creation of a flammable atmosphere; or an energetic release from a pressure boundary.

  18. Statistical, Graphical, and Learning Methods for Sensing, Surveillance, and Navigation Systems

    DTIC Science & Technology

    2016-06-28

    …harsh propagation environments. Conventional filtering techniques fail to provide satisfactory performance in many important nonlinear or non-Gaussian scenarios. In addition, there is a lack of a unified methodology for the design and analysis of different filtering techniques. To address these problems, we have proposed a new filtering methodology called belief condensation (BC). (DISTRIBUTION A: Distribution approved for public release.)

  19. Analysis of Pulsed Flow Modification Alternatives, Lower Missouri River, 2005

    USGS Publications Warehouse

    Jacobson, Robert B.

    2008-01-01

    The graphical, tabular, and statistical data presented in this report resulted from analysis of alternative flow regime designs considered by a group of Missouri River managers, stakeholders, and scientists during the summer of 2005. This plenary group was charged with designing a flow regime with increased spring flow pulses to support reproduction and survival of the endangered pallid sturgeon. Environmental flow components extracted from the reference natural flow regime were used to design and assess performance of alternative flow regimes. The analysis is based on modeled flow releases from Gavins Point Dam (near Yankton, South Dakota) for nine design alternatives and two reference scenarios; the reference scenarios are the run-of-the-river and the water-control plan implemented in 2004. The alternative designs were developed by the plenary group with the goal of providing pulsed spring flows, while retaining traditional social and economic uses of the river.

  20. Exploring Bioeconomy Growth through the Public Release of the Biomass Scenario Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Newes, Emily K; Biddy, Mary J; Bush, Brian W

    The Biomass Scenario Model (BSM) is an important tool for exploring vibrant future bioeconomy scenarios that leverage domestic resources. Developed by NREL and BETO, this model of the domestic biofuels supply chain has been used to explore success strategies for BETO's activities towards bioeconomy growth. The BSM offers a robust test bed for detailed exploration of effects of BETO activities within the complex context of resource availability; physical, technological, and economic constraints; behavior; and policy. The public release of the model in 2017 will allow broad engagement with the theme of the conference as model users can analyze bioeconomy growth, domestic biomass resource use, and associated effects. The BSM is a carefully validated, state-of-the-art, dynamic model of the biomass to biofuels supply chain. Using a system dynamics simulation modeling approach, the model tracks long-term deployment of biofuels given technology development and investment, considering land availability, the competing oil market, consumer demand, and government policies over time. Sample outputs include biofuels production, feedstock use, capital investment, incentives, and costs of feedstocks and fuels. BSM scenarios reveal technological, economic, and policy challenges, as well as opportunities for dynamic growth of the bioeconomy with strategic public and private investment at key points in the system. The model logic and results have been reviewed extensively, through collaborative analysis, expert reviews and external publications (https://www.zotero.org/groups/bsm_publications/).

  1. TSPA 1991: An initial total-system performance assessment for Yucca Mountain; Yucca Mountain Site Characterization Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barnard, R.W.; Wilson, M.L.; Dockery, H.A.

    1992-07-01

    This report describes an assessment of the long-term performance of a repository system that contains deeply buried highly radioactive waste; the system is assumed to be located at the potential site at Yucca Mountain, Nevada. The study includes an identification of features, events, and processes that might affect the potential repository, a construction of scenarios based on this identification, a selection of models describing these scenarios (including abstraction of appropriate models from detailed models), a selection of probability distributions for the parameters in the models, a stochastic calculation of radionuclide releases for the scenarios, and a derivation of complementary cumulative distribution functions (CCDFs) for the releases. Releases and CCDFs are calculated for four categories of scenarios: aqueous flow (modeling primarily the existing conditions at the site, with allowances for climate change), gaseous flow, basaltic igneous activity, and human intrusion. The study shows that models of complex processes can be abstracted into more simplified representations that preserve the understanding of the processes and produce results consistent with those of more complex models.
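The CCDFs described above are straightforward to build from the stochastic release calculations: for each release magnitude, plot the fraction of realizations that exceed it. A minimal sketch with hypothetical normalized release magnitudes:

```python
def empirical_ccdf(samples):
    """Empirical complementary CDF: for each observed release magnitude x,
    the fraction of Monte Carlo realizations strictly exceeding x
    (distinct sample values assumed for this sketch)."""
    xs = sorted(samples)
    n = len(xs)
    return [(x, (n - i - 1) / n) for i, x in enumerate(xs)]

# Hypothetical normalized release magnitudes from four realizations.
for x, p in empirical_ccdf([0.1, 0.5, 0.2, 0.9]):
    print(f"P(release > {x}) = {p:.2f}")
```

Regulatory comparisons of this era plotted such scenario-weighted CCDFs against a limit curve, which is why the exceedance (rather than cumulative) form is used.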

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ringbom, Anders; Axelssson, A.; Aldener, M.

    Abstract: Observations of the radioxenon isotopes 133Xe and 131mXe collected at the IMS stations RN38 and RN58 on April 7–8 and April 12–13, 2013, respectively, are unique with respect to the measurement history of these stations. Comparison of measured data with calculated isotopic ratios as well as analysis using atmospheric transport modeling indicate that it is likely that the xenon measured was created in the underground nuclear test conducted by North Korea on February 12, 2013, and released 7 weeks later. More than one release is required to explain all observations. The 131mXe source terms for each release were calculated to 7×10^11 Bq, corresponding to about 1-10% of the total xenon inventory for a 10-kt explosion, depending on fractionation and release scenario. The observed ratios could not be used to obtain any information regarding the fissile material that was used in the test.
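The isotopic-ratio argument rests on the two isotopes decaying at different rates: the 133Xe/131mXe activity ratio falls predictably with time, so a measured ratio constrains when the xenon was produced and released. A sketch using approximate half-lives (assumed standard nuclear-data values, not figures from the paper):

```python
import math

# Approximate half-lives in days (assumed values).
HALF_LIFE = {"Xe-133": 5.25, "Xe-131m": 11.84}

def ratio_after(days, r0=1.0):
    """133Xe/131mXe activity ratio after `days` of decay from an assumed
    initial ratio r0. Because 133Xe decays faster, the ratio falls with
    time, which is what lets a measured ratio date a release."""
    lam = {k: math.log(2) / t for k, t in HALF_LIFE.items()}
    return r0 * math.exp(-(lam["Xe-133"] - lam["Xe-131m"]) * days)

# Fraction of the initial ratio remaining after the ~7 weeks (49 days)
# between the test and the detections.
print(f"{ratio_after(49):.3f}")
```

The steep decline after seven weeks illustrates why such late detections are unusual and why multiple releases were needed to explain all of the observations.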

  3. A large-scale field trial experiment to derive effective release of heavy metals from incineration bottom ashes during construction in land reclamation.

    PubMed

    Chan, Wei-Ping; Ren, Fei; Dou, Xiaomin; Yin, Ke; Chang, Victor Wei-Chung

    2018-05-08

    Recycling of incineration bottom ashes (IBA) is attracting great interest, as it is considered vital for closing the waste loop to achieve sustainable development in growing cities around the world. Various laboratory testing methods have been developed to assess the release potential of heavy metals - one of the most important concerns in using IBA - by reflecting the conditions under which heavy metals are released from IBA in targeted land-reclamation scenarios and the corresponding environmental conditions. However, the realistic release of the elements of concern in actual applications, in the presence of a complex environment, can deviate from the outcomes of leaching tests carried out in the laboratory. Hence, a set of large-scale column trial experiments was performed to experimentally determine the effective release of heavy metals when IBA is used as a filling material in land reclamation. 20 tons of IBA and 320 m³ of seawater were used in six column trial experiments. The release of 13 heavy metal elements was analyzed from multiple aspects, including release kinetics, the distribution of elements in seawater, and the impacts of two different dumping methods, with and without the application of a chute. After dumping of IBA into the seawater, almost instantaneous release of heavy metals with uniform horizontal dispersion was observed. Higher concentrations of these elements were observed near the bottom of the column, especially when a chute was applied. A comparative analysis was then carried out to establish relationships between the results of the column trial and batch leaching tests carried out in the laboratory. Distinctive relationships were observed for different heavy metals, which suggests the need for further understanding of IBA leaching in real application scenarios and complex environments. Copyright © 2018 Elsevier B.V. All rights reserved.

  4. Hydrothermal Liquefaction Treatment Preliminary Hazard Analysis Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lowry, Peter P.; Wagner, Katie A.

    A preliminary hazard assessment was completed during February 2015 to evaluate the conceptual design of the modular hydrothermal liquefaction treatment system. The hazard assessment was performed in 2 stages. An initial assessment utilizing Hazard Identification and Preliminary Hazards Analysis (PHA) techniques identified areas with significant or unique hazards (process safety-related hazards) that fall outside of the normal operating envelope of PNNL and warranted additional analysis. The subsequent assessment was based on a qualitative What-If analysis. This analysis was augmented, as necessary, by additional quantitative analysis for scenarios involving a release of hazardous material or energy with the potential for affecting the public.

  5. Assessment of effectiveness of geologic isolation systems. CIRMIS data system. Volume 4. Driller's logs, stratigraphic cross section and utility routines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Friedrichs, D.R.

    1980-01-01

    The Assessment of Effectiveness of Geologic Isolation Systems (AEGIS) Program is developing and applying the methodology for assessing the far-field, long-term post-closure safety of deep geologic nuclear waste repositories. AEGIS is being performed by Pacific Northwest Laboratory (PNL) under contract with the Office of Nuclear Waste Isolation (ONWI) for the Department of Energy (DOE). One task within AEGIS is the development of methodology for analysis of the consequences (water pathway) from loss of repository containment as defined by various release scenarios. Analysis of the long-term, far-field consequences of release scenarios requires the application of numerical codes which simulate the hydrologic systems, model the transport of released radionuclides through the hydrologic systems to the biosphere, and, where applicable, assess the radiological dose to humans. The various input parameters required in the analysis are compiled in data systems. The data are organized and prepared by various input subroutines for use by the hydrologic and transport codes. The hydrologic models simulate the groundwater flow systems and provide water flow directions, rates, and velocities as inputs to the transport models. Outputs from the transport models are basically graphs of radionuclide concentration in the groundwater plotted against time. After dilution in the receiving surface-water body (e.g., lake, river, bay), these data are the input source terms for the dose models, if dose assessments are required. The dose models calculate radiation dose to individuals and populations. CIRMIS (Comprehensive Information Retrieval and Model Input Sequence) Data System is a storage and retrieval system for model input and output data, including graphical interpretation and display. This is the fourth of four volumes of the description of the CIRMIS Data System.

  6. Insights Gained from Forensic Analysis with MELCOR of the Fukushima-Daiichi Accidents.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andrews, Nathan C.; Gauntt, Randall O.

    Since the accidents at Fukushima-Daiichi, Sandia National Laboratories has been modeling these accident scenarios using the severe accident analysis code, MELCOR. MELCOR is a widely used computer code developed at Sandia National Laboratories since ~1982 for the U.S. Nuclear Regulatory Commission. Insights from the modeling of these accidents are being used to better inform future code development and potentially improve accident management. To date, the need to better capture in-vessel thermal-hydraulics and ex-vessel melt coolability and concrete interactions has led to the implementation of new models. The most recent analyses, presented in this paper, have been in support of the Organization for Economic Cooperation and Development Nuclear Energy Agency's (OECD/NEA) Benchmark Study of the Accident at the Fukushima Daiichi Nuclear Power Station (BSAF) Project. The goal of this project is to accurately capture the source term from all three releases and then model the atmospheric dispersion. To do this, a forensic approach is used in which available plant data and release timings inform the modeled MELCOR accident scenario. For example, containment failures, core slumping events, and lower head failure timings are all enforced parameters in these analyses. This approach is fundamentally different from the blind code assessment often used in standard problem exercises. The timings of these events are informed by representative spikes or decreases in plant data. The combination of improvements to the MELCOR source code resulting from previous accident analyses and this forensic approach has allowed Sandia to generate representative and plausible source terms for all three accidents at Fukushima Daiichi out to three weeks after the accident, capturing both early and late releases.
In particular, using the source terms developed by MELCOR as input to the MACCS code, which models atmospheric dispersion and deposition, we are able to reasonably capture the deposition of radionuclides to the northwest of the reactor site.
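
    The forensic step described above, inferring enforced event timings from abrupt changes in plant data, can be illustrated with a minimal sketch. The signal, units, and threshold below are invented for illustration; the BSAF analyses use actual recorded Fukushima plant data.

```python
# A minimal sketch of the "forensic" step: scan a plant measurement series
# (e.g. drywell pressure) for an abrupt change and use that time as an
# enforced event timing in the accident model. Data and threshold invented.

def detect_event_times(times, values, jump_threshold):
    """Return times where the signal jumps by more than the threshold."""
    events = []
    for i in range(1, len(values)):
        if abs(values[i] - values[i - 1]) >= jump_threshold:
            events.append(times[i])
    return events

# Synthetic pressure trace with a sudden drop (a containment-failure signature)
t = [0, 1, 2, 3, 4, 5]
p = [750, 752, 751, 400, 398, 399]   # kPa
print(detect_event_times(t, p, jump_threshold=100))  # → [3]
```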

  7. Nanoscale Coloristic Pigments: Upper Limits on Releases from Pigmented Plastic during Environmental Aging, In Food Contact, and by Leaching.

    PubMed

    Neubauer, Nicole; Scifo, Lorette; Navratilova, Jana; Gondikas, Andreas; Mackevica, Aiga; Borschneck, Daniel; Chaurand, Perrine; Vidal, Vladimir; Rose, Jerome; von der Kammer, Frank; Wohlleben, Wendel

    2017-10-17

    The life cycle of nanoscale pigments in plastics may cause environmental or human exposure by various release scenarios. We investigated spontaneous release and release induced by mechanical stress during/after simulated sunlight and rain degradation of polyethylene (PE) with organic and inorganic pigments. Additionally, primary leaching in food contact and secondary leaching into environmental media from nanocomposite fragments with an increased surface area were examined. Standardized protocols/methods for release sampling, detection, and characterization of release rate and form were applied: transformation of the bulk material was analyzed by Scanning Electron Microscopy (SEM), X-ray tomography, and Fourier-Transform Infrared spectroscopy (FTIR); releases were quantified by Inductively Coupled Plasma Mass Spectrometry (ICP-MS), single-particle ICP-MS (sp-ICP-MS), Transmission Electron Microscopy (TEM), Analytical Ultracentrifugation (AUC), and UV/Vis spectroscopy. In all scenarios, the detectable particulate releases were attributed primarily to contamination from handling and machining of the plastics and were not identified with the pigments, although the contamination of 4 mg/kg (Fe) was dwarfed by the intentional content of 5800 mg/kg (Fe as Fe₂O₃ pigment). We observed modulations (which were at least partially preventable by UV stabilizers) when comparing as-produced and aged nanocomposites, but no significant increase in releases. Release of pigments was negligible within the experimental error for all investigated scenarios, with upper limits of 10 mg/m² or 1600 particles/mL. This is the first holistic confirmation that pigment nanomaterials remain strongly contained in a plastic that has low diffusion and high persistence, such as the polyolefin High Density Polyethylene (HDPE).

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andrews, Nathan; Faucett, Christopher; Haskin, Troy Christopher

    Following the conclusion of the first phase of the crosswalk analysis, one of the key unanswered questions was whether the deviations found would persist during a partially recovered accident scenario, similar to the one that occurred at TMI-2. In particular, this analysis aims to compare the impact of core degradation morphology on the quenching models inherent within the two codes and on the coolability of debris during partially recovered accidents. A primary motivation for this study is the development of insights into how uncertainties in core damage progression models impact the ability to assess the potential for recovery of a degraded core. These quench and core recovery models are of the most interest when there is a significant amount of core damage but intact and degraded fuel still remain in the core region or the lower plenum. Accordingly, this analysis presents a spectrum of partially recovered accident scenarios, varying both water injection timing and rate to highlight the impact of core degradation phenomena on recovered accident scenarios. This analysis uses the newly released MELCOR 2.2 rev. 9665 and MAAP5, Version 5.04. These code versions incorporate a significant number of modifications driven by analyses and forensic evidence obtained from the Fukushima-Daiichi reactor site.

  9. A review and perspective of existing research on the release of nanomaterials from solid nanocomposites

    PubMed Central

    2014-01-01

    Advances in adding nanomaterials to various matrices have occurred in tandem with the identification of potential hazards associated with exposure to pure forms of nanomaterials. We searched multiple research publication databases and found that, relative to data generated on potential nanomaterial hazards or exposures, very little attention has focused on understanding the potential and conditions for release of nanomaterials from nanocomposites. However, as a prerequisite to exposure, studying release is necessary to inform risk assessments. We identified fifty-four studies that specifically investigated the release of nanomaterials, and review them in the following release scenario groupings: machining, weathering, washing, contact, and incineration. While all of the identified studies provided useful information, only half were controlled experiments. Based on these data, the debris released from solid, non-food nanocomposites contains, in varying frequencies, a mixture of four types of debris. Most frequently identified are (1) particles of matrix alone; slightly less often, (2) matrix particles with the nanomaterial partially or fully embedded; far less frequent is (3) the added nanomaterial entirely dissociated from the matrix; and rarest are (4) dissolved ionic forms of the added nanomaterial. The occurrence of specific debris types appeared to depend on the specific release scenario and environment. These data highlight that release from nanocomposites can take multiple forms and that additional research and guidance would be beneficial, allowing for more consistent characterization of the release potential of nanomaterials. In addition, these data support calls for method validation and standardization, as well as for understanding how laboratory release scenarios relate to real-world conditions.
Importantly, as risk is considered to be a function of the inherent hazards of a substance and the actual potential for exposure, data on nanomaterial release dynamics and debris composition from commercially relevant nanocomposites are a valuable starting point for consideration in fate and transport modeling, exposure assessment, and risk assessment frameworks for nanomaterials. PMID:24708765

  10. A review and perspective of existing research on the release of nanomaterials from solid nanocomposites.

    PubMed

    Froggett, Stephan J; Clancy, Shaun F; Boverhof, Darrell R; Canady, Richard A

    2014-04-07

    Advances in adding nanomaterials to various matrices have occurred in tandem with the identification of potential hazards associated with exposure to pure forms of nanomaterials. We searched multiple research publication databases and found that, relative to data generated on potential nanomaterial hazards or exposures, very little attention has focused on understanding the potential and conditions for release of nanomaterials from nanocomposites. However, as a prerequisite to exposure, studying release is necessary to inform risk assessments. We identified fifty-four studies that specifically investigated the release of nanomaterials, and review them in the following release scenario groupings: machining, weathering, washing, contact, and incineration. While all of the identified studies provided useful information, only half were controlled experiments. Based on these data, the debris released from solid, non-food nanocomposites contains, in varying frequencies, a mixture of four types of debris. Most frequently identified are (1) particles of matrix alone; slightly less often, (2) matrix particles with the nanomaterial partially or fully embedded; far less frequent is (3) the added nanomaterial entirely dissociated from the matrix; and rarest are (4) dissolved ionic forms of the added nanomaterial. The occurrence of specific debris types appeared to depend on the specific release scenario and environment. These data highlight that release from nanocomposites can take multiple forms and that additional research and guidance would be beneficial, allowing for more consistent characterization of the release potential of nanomaterials. In addition, these data support calls for method validation and standardization, as well as for understanding how laboratory release scenarios relate to real-world conditions.
Importantly, as risk is considered to be a function of the inherent hazards of a substance and the actual potential for exposure, data on nanomaterial release dynamics and debris composition from commercially relevant nanocomposites are a valuable starting point for consideration in fate and transport modeling, exposure assessment, and risk assessment frameworks for nanomaterials.

  11. Dynamics of Crust Dissolution and Gas Release in Tank 241-SY-101

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rassat, Scot D.; Stewart, Charles W.; Wells, Beric E.

    2000-01-24

    Due primarily to an increase in floating crust thickness, the waste level in Tank 241-SY-101 has grown appreciably, and the flammable gas volume stored in the crust has become a potential hazard. To remediate gas retention in the crust and the potential for buoyant displacement gas releases from the nonconvective layer at the bottom of the tank, SY-101 will be diluted to dissolve a large fraction of the solids that allow the waste to retain gas. The plan is to transfer some waste out and back-dilute with water in several steps. In this work, mechanisms and rates of waste solids dissolution and gas release are evaluated theoretically and experimentally. Particular emphasis is given to crust dissolution processes and associated gas releases, although dissolution and gas release from the mixed-slurry and nonconvective layers are also considered. The release of hydrogen gas to the tank dome space is modeled for a number of scenarios. Under the tank conditions expected at the time of back-dilution, no plausible continuous or sudden gas release scenarios resulting in flammable hydrogen concentrations were identified.
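
    As a rough illustration of the kind of dome-space flammability screening described above, the fragment below checks whether a postulated sudden release would push the headspace hydrogen fraction past the lower flammability limit. The volumes are hypothetical, not SY-101 values; the 4 vol% LFL for hydrogen in air is a standard figure.

```python
# Illustrative flammability screen: does a postulated sudden gas release push
# the hydrogen concentration in the tank headspace past the lower flammability
# limit (~4 vol% for H2 in air)? Volumes below are hypothetical.

H2_LFL = 0.04  # lower flammability limit of hydrogen in air, volume fraction

def headspace_h2_fraction(release_m3, headspace_m3):
    """Volume fraction of hydrogen after a release mixes into the headspace."""
    return release_m3 / (release_m3 + headspace_m3)

frac = headspace_h2_fraction(release_m3=10.0, headspace_m3=1000.0)
print(frac < H2_LFL)  # → True: this postulated release stays below the LFL
```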

  12. Collective doses to man from dumping of radioactive waste in the Arctic Seas.

    PubMed

    Nielsen, S P; Iosjpe, M; Strand, P

    1997-08-25

    A box model for the dispersion of radionuclides in the marine environment covering the Arctic Ocean and the North Atlantic Ocean has been constructed. Collective doses from ingestion pathways have been calculated from unit releases of the radionuclides 3H, 60Co, 63Ni, 90Sr, 129I, 137Cs, 239Pu and 241Am into a fjord on the east coast of Novaya Zemlya. The results show that doses for the shorter-lived radionuclides (e.g. 137Cs) are derived mainly from seafood production in the Barents Sea. Doses from the longer-lived radionuclides (e.g. 239Pu) are delivered through marine produce further away from the Arctic Ocean. Collective doses were calculated for two release scenarios, both based on information about the dumping of radioactive waste in the Barents and Kara Seas by the former Soviet Union and on preliminary information from the International Arctic Sea Assessment Programme. A worst-case scenario was assumed according to which all radionuclides in liquid and solid radioactive waste were available for dispersion in the marine environment at the time of dumping. Release of radionuclides from spent nuclear fuel was assumed to take place by direct corrosion of the fuel, ignoring the barriers that prevent direct contact between the fuel and the seawater. The second scenario assumed that releases of radionuclides from spent nuclear fuel do not occur until after failure of the protective barriers. All other liquid and solid radioactive waste was assumed to be available for dispersion at the time of discharge in both scenarios. The estimated collective dose for the worst-case scenario was about 9 man-Sv and that for the second scenario was about 3 man-Sv. In both cases, 137Cs is the radionuclide predicted to dominate the collective doses as well as the peak collective dose rates.
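
    A compartment ("box") model of the kind described above can be sketched with two boxes exchanging activity while it decays. The exchange rates below are invented for illustration; only the 137Cs half-life (about 30.1 years) is a physical constant.

```python
# Schematic two-box marine dispersion sketch (illustrative rates, not the
# model in the paper): a near-field box exchanges activity with a far-field
# box while radioactive decay removes activity from both.

import math

def simulate(release_bq, k_out, k_back, decay, dt, steps):
    """March the two-box activities forward with simple explicit time steps."""
    near, far = release_bq, 0.0
    history = []
    for _ in range(steps):
        flow_out = k_out * near * dt    # near-field -> far-field exchange
        flow_back = k_back * far * dt   # far-field -> near-field exchange
        near += flow_back - flow_out - decay * near * dt
        far += flow_out - flow_back - decay * far * dt
        history.append((near, far))
    return history

lam = math.log(2) / 30.1  # 137Cs decay constant, per year (half-life 30.1 y)
hist = simulate(release_bq=1e12, k_out=0.2, k_back=0.05,
                decay=lam, dt=0.1, steps=500)  # 50 simulated years
```

In the published model many more boxes, sediment interactions, and food-chain transfer factors sit between the water concentrations and the collective doses.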

  13. PROBABILISTIC SAFETY ASSESSMENT OF OPERATIONAL ACCIDENTS AT THE WASTE ISOLATION PILOT PLANT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rucker, D.F.

    2000-09-01

    This report presents a probabilistic safety assessment of radioactive doses as consequences from accident scenarios to complement the deterministic assessment presented in the Waste Isolation Pilot Plant (WIPP) Safety Analysis Report (SAR). The International Commission on Radiological Protection (ICRP) recommends that both assessments be conducted to ensure that ''an adequate level of safety has been achieved and that no major contributors to risk are overlooked'' (ICRP 1993). To that end, the probabilistic assessment for the WIPP accident scenarios addresses the wide range of assumptions, e.g., the range of values representing the radioactive source of an accident, that could possibly have been overlooked by the SAR. Routine releases of radionuclides from the WIPP repository to the environment during waste emplacement operations are expected to be essentially zero. In contrast, potential accidental releases from postulated accident scenarios during waste handling and emplacement could be substantial, which necessitates radiological air monitoring and confinement barriers (DOE 1999). The WIPP SAR calculated doses from accidental releases to the on-site (at 100 m from the source) and off-site (at the Exclusive Use Boundary and Site Boundary) public by a deterministic approach. This approach, as demonstrated in the SAR, uses single-point values of key parameters to assess the 50-year, whole-body committed effective dose equivalent (CEDE). The basic assumptions used in the SAR to formulate the CEDE are retained for this report's probabilistic assessment. However, for the probabilistic assessment, single-point parameter values were replaced with probability density functions (PDFs) and sampled over an expected range. Monte Carlo simulations were run, in which 10,000 iterations were performed by randomly selecting one value for each parameter and calculating the dose.
Statistical information was then derived from the 10,000-iteration batch, including the 5%, 50%, and 95% dose likelihoods and the sensitivity of the calculated doses to each assumption. As one would intuitively expect, the doses from the probabilistic assessment for most scenarios were found to be much less than those from the deterministic assessment. The lower dose of the probabilistic assessment can be attributed to a ''smearing'' of values from the high and low ends of the PDF spectrum of the various input parameters. The analysis also found a potential weakness in the deterministic analysis used in the SAR: a detail on drum loading was not taken into consideration. Waste emplacement operations thus far have handled drums from each shipment as a single unit, i.e., drums from each shipment are kept together. Shipments typically come from a single waste stream, and therefore the curie loading of each drum can be considered nearly identical to that of its neighbor. Calculations show that if there are large numbers of drums used in the accident scenario assessment, e.g., 28 drums in the waste hoist failure scenario (CH5), then the probabilistic dose assessment calculations will diverge from the deterministically determined doses. As currently calculated, the deterministic dose assessment assumes one drum loaded to the maximum allowable (80 PE-Ci) and the remaining at 10% of the maximum. The effective average drum curie content is therefore less in the deterministic assessment than in the probabilistic assessment for a large number of drums. EEG recommends that the WIPP SAR calculations be revisited and updated to include a probabilistic safety assessment.
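
    The probabilistic procedure described above (replace single-point parameters with sampled distributions, run 10,000 Monte Carlo iterations, and read off the 5%, 50%, and 95% dose levels) can be sketched as follows. The dose formula and all distribution bounds are invented for illustration and are not WIPP SAR values.

```python
# Minimal Monte Carlo dose-assessment sketch: sample each input parameter
# from a distribution instead of using a single point value, then summarize
# the resulting dose distribution. Formula and parameter ranges are invented.

import random
random.seed(1)

def dose_rem(source_ci, release_fraction, dispersion, dcf):
    return source_ci * release_fraction * dispersion * dcf

doses = []
for _ in range(10_000):
    d = dose_rem(source_ci=random.uniform(1.0, 80.0),       # drum loading, PE-Ci
                 release_fraction=random.uniform(1e-4, 1e-2),
                 dispersion=random.uniform(1e-5, 1e-3),     # s/m^3 at receptor
                 dcf=50.0)                                  # rem per Ci inhaled
    doses.append(d)

doses.sort()
p5, p50, p95 = (doses[int(0.05 * len(doses))],
                doses[len(doses) // 2],
                doses[int(0.95 * len(doses))])
```

The deterministic counterpart would call `dose_rem` once with fixed, typically bounding, parameter values; the percentile spread here is what the report compares against that single number.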

  14. Transportation accident scenarios for commercial spent fuel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilmot, E L

    1981-02-01

    A spectrum of high severity, low probability, transportation accident scenarios involving commercial spent fuel is presented together with mechanisms, pathways and quantities of material that might be released from spent fuel to the environment. These scenarios are based on conclusions from a workshop, conducted in May 1980 to discuss transportation accident scenarios, in which a group of experts reviewed and critiqued available literature relating to spent fuel behavior and cask response in accidents.

  15. Potential release scenarios for carbon nanotubes used in composites

    EPA Science Inventory

    The expected widespread use of carbon nanotube (CNT)-composites in consumer products calls for an assessment of the possible release and exposure to workers, consumers and the environment. Release of CNTs may occur at all steps in the life cycle of products, but to date only limi...

  16. A Nano-Selenium Reactive Barrier Approach for Managing Mercury over the Life-Cycle of Compact Fluorescent Lamps

    PubMed Central

    Lee, Brian; Sarin, Love; Johnson, Natalie C.; Hurt, Robert H.

    2013-01-01

    Compact fluorescent lamps contain small quantities of mercury, whose release can lead to human exposures of potential concern in special cases involving multiple lamps, confined spaces, or young children. The exposure scenarios typically involve solid lamp debris that slowly releases elemental mercury vapor to indoor spaces. Here we propose and demonstrate a reactive barrier approach for the suppression of that mercury release, and demonstrate the concept using uncoated amorphous nano-selenium as the reactive component. Multi-layer structures containing an impregnated reactive layer and a mercury vapor barrier are fabricated, characterized, and evaluated in three exposure prevention scenarios: carpeted break sites, disposal/recycling bags, and boxes as used for retail sales, shipping and collection. The reactive barriers achieve significant suppression of mercury release to indoor spaces in each of the three scenarios. The nano-selenium barriers also exhibit a unique indicator function that can reveal the location of Hg-contamination by local reaction-induced change in optical properties. The article also presents results on equilibrium Hg vapor pressure above lamp debris, mathematical modeling of reaction and transport processes within reactive barriers, and landfill stability of nano-selenium and its reaction products. PMID:19731697

  17. A nano-selenium reactive barrier approach for managing mercury over the life-cycle of compact fluorescent lamps.

    PubMed

    Lee, Brian; Sarin, Love; Johnson, Natalie C; Hurt, Robert H

    2009-08-01

    Compact fluorescent lamps contain small quantities of mercury, release of which can lead to human exposures of potential concern in special cases involving multiple lamps, confined spaces, or young children. The exposure scenarios typically involve solid lamp debris that slowly releases elemental mercury vapor to indoor spaces. Here we propose and demonstrate a reactive barrier approach for the suppression of that mercury release, and demonstrate the concept using uncoated amorphous nanoselenium as the reactive component. Multilayer structures containing an impregnated reactive layer and a mercury vapor barrier are fabricated, characterized, and evaluated in three exposure prevention scenarios: carpeted break sites, disposal/recycling bags, and boxes as used for retail sales, shipping, and collection. The reactive barriers achieve significant suppression of mercury release to indoor spaces in each of the three scenarios. The nanoselenium barriers also exhibit a unique indicator function that can reveal the location of Hg contamination by local reaction-induced change in optical properties. The article also presents results on equilibrium Hg vapor pressure above lamp debris, mathematical modeling of reaction and transport processes within reactive barriers, and landfill stability of nanoselenium and its reaction products.
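
    The reaction-and-transport modeling mentioned above can be illustrated with the classic steady-state result for diffusion through a slab with first-order consumption: the exit flux is attenuated by a factor phi/sinh(phi), where phi is the Thiele modulus. The parameter values below are illustrative, not fitted to the paper's barriers.

```python
# Steady-state diffusion through a reactive slab with first-order scavenging
# (here, mercury vapor captured by nano-selenium). With fixed concentration at
# the inlet face and a sink at the outlet, the exit flux relative to an inert
# slab is phi/sinh(phi), phi = L*sqrt(k/D). Parameter values are illustrative.

import math

def flux_attenuation(thickness_m, diffusivity_m2_s, rate_const_s):
    """Ratio of reactive-barrier exit flux to the inert-barrier flux."""
    phi = thickness_m * math.sqrt(rate_const_s / diffusivity_m2_s)
    return phi / math.sinh(phi) if phi > 0 else 1.0

# A thin reactive layer can suppress most of the mercury flux:
atten = flux_attenuation(thickness_m=1e-3, diffusivity_m2_s=1e-9,
                         rate_const_s=1e-2)
```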

  18. Probabilistic mapping of urban flood risk: Application to extreme events in Surat, India

    NASA Astrophysics Data System (ADS)

    Ramirez, Jorge; Rajasekar, Umamaheshwaran; Coulthard, Tom; Keiler, Margreth

    2016-04-01

    Surat, India, is a coastal city that lies on the banks of the river Tapti, downstream from the Ukai dam. Given Surat's geographic location, its population of five million is repeatedly exposed to flooding caused by high tide combined with large emergency dam releases into the Tapti river. In 2006 such a flood event occurred when intense rainfall in the Tapti catchment caused a dam release of nearly 25,000 m3 s-1 and flooded 90% of the city. A first step towards strengthening resilience in Surat requires a robust method for mapping potential flood risk that considers the uncertainty in future dam releases. In this study we develop many combinations of dam release magnitude and duration for the Ukai dam. We then use these dam releases to drive a two-dimensional flood model (CAESAR-Lisflood) of Surat that also considers tidal effects. Our flood model of Surat utilizes fine spatial resolution (30 m) topography produced from an extensive differential global positioning system survey and measurements of river cross-sections. Within the city we have modelled scenarios that include extreme conditions with near-maximum dam release levels (e.g. 1:250 year flood) and high tides. Results from all scenarios have been summarized into probabilistic flood risk maps for Surat. These maps are currently being integrated into the city disaster management plan to support both mitigation and adaptation measures for different flooding scenarios.
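
    The aggregation step behind such probabilistic flood risk maps can be sketched simply: each dam-release scenario yields a flood-depth grid, and the map records the per-cell fraction of scenarios exceeding a depth threshold. The tiny grids below are invented, not CAESAR-Lisflood output.

```python
# Toy probabilistic flood-map aggregation: per-cell fraction of scenarios
# whose simulated depth exceeds a threshold. Grids are invented examples.

def probabilistic_flood_map(depth_grids, threshold_m):
    rows, cols = len(depth_grids[0]), len(depth_grids[0][0])
    prob = [[0.0] * cols for _ in range(rows)]
    for grid in depth_grids:
        for r in range(rows):
            for c in range(cols):
                if grid[r][c] >= threshold_m:
                    prob[r][c] += 1.0 / len(depth_grids)
    return prob

scenarios = [
    [[0.0, 0.5], [1.2, 2.0]],   # moderate release
    [[0.1, 0.9], [1.5, 2.4]],   # large release
    [[0.0, 0.2], [0.3, 1.1]],   # small release with high tide
]
risk = probabilistic_flood_map(scenarios, threshold_m=0.5)
# risk[1][1] is ~1.0: the deepest cell floods in all three scenarios
```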

  19. Benefit Assessment of the Precision Departure Release Capability Concept

    NASA Technical Reports Server (NTRS)

    Palopo, Kee; Chatterji, Gano B.; Lee, Hak-Tae

    2011-01-01

    A Precision Departure Release Capability concept is being evaluated by both the National Aeronautics and Space Administration and the Federal Aviation Administration as part of a larger goal of improving throughput, efficiency, and capacity in integrated departure, arrival, and surface operations. The concept is believed to have the potential to increase flight efficiency and throughput by avoiding missed assigned slots and minimizing the speed increases or path stretches needed to recover a slot. The main thrust of the paper is determining the impact of early and late departures from the departure runway when an aircraft has a slot assigned either at a meter fix or at the arrival airport. Results reported in the paper are for two scenarios. The first scenario considers flights out of Dallas/Fort Worth destined for Hartsfield-Jackson International Airport in Atlanta flying through the Meridian meter-fix in the Memphis Center with miles-in-trail constraints. The second scenario considers flights destined for George Bush Intercontinental/Houston Airport with a specified airport arrival rate constraint. Results show that delay reduction can be achieved by allowing reasonable speed changes in scheduling. It was determined that the traffic volume between Dallas/Fort Worth and Atlanta via the Meridian fix is low and the departure times are spread enough that large departure schedule uncertainty can be tolerated. Flights can depart early or late within 90 minutes without accruing much more delay due to the miles-in-trail constraint at the Meridian fix. In the Houston scenario, 808 arrivals from 174 airports were considered. Results show that the delay experienced by the 16 Dallas/Fort Worth departures is higher if the initial schedules of the remaining 792 flights are kept unaltered while they are rescheduled.
Analysis shows that the probability of getting the initially assigned slot back after perturbation and rescheduling decreases with increasing standard deviation of the departure delay distributions. Results show that most Houston arrivals can be expected to be on time based on the assumed zero-mean Normal departure delay distributions achievable by Precision Departure Release Capability. In the current system, airport-departure delay, which is the sum of gate-departure delay and taxi-out delay, is observed at the airports. This delay acts as a bias, which can be reduced by Precision Departure Release Capability.
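
    The reported sensitivity (slot retention probability falling as the standard deviation of the zero-mean Normal departure delay grows) can be reproduced qualitatively with a small Monte Carlo sketch. The slot tolerance is an invented parameter, not a value from the study.

```python
# Qualitative sketch: probability that a flight still fits its assigned slot
# when its departure delay is zero-mean Normal with a given standard
# deviation. The +/- tolerance window is an invented parameter.

import random
random.seed(0)

def slot_retention_probability(sigma_min, tolerance_min, trials=20_000):
    hits = sum(abs(random.gauss(0.0, sigma_min)) <= tolerance_min
               for _ in range(trials))
    return hits / trials

p_small = slot_retention_probability(sigma_min=2.0, tolerance_min=5.0)
p_large = slot_retention_probability(sigma_min=15.0, tolerance_min=5.0)
# p_small > p_large: tighter departure-time control retains more slots
```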

  20. Modeling climate change impacts on water trading.

    PubMed

    Luo, Bin; Maqsood, Imran; Gong, Yazhen

    2010-04-01

    This paper presents a new method of evaluating the impacts of climate change on the long-term performance of water trading programs, through designing an indicator to measure the mean of periodic water volume that can be released by trading through a water-use system. The indicator is computed with a stochastic optimization model which can reflect the random uncertainty of water availability. The developed method was demonstrated in the Swift Current Creek watershed of Prairie Canada under two future scenarios simulated by a Canadian Regional Climate Model, in which total water availabilities under future scenarios were estimated using a monthly water balance model. Frequency analysis was performed to obtain the best probability distributions for both observed and simulated water quantity data. Results from the case study indicate that the performance of a trading system is highly scenario-dependent in future climate, with trading effectiveness highly optimistic or undesirable under different future scenarios. Trading effectiveness also largely depends on trading costs, with high costs resulting in failure of the trading program. (c) 2010 Elsevier B.V. All rights reserved.
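
    The indicator described above (the mean periodic water volume releasable by trading under random water availability) can be approximated with a simple Monte Carlo sketch. The supply distribution and firm demand below are invented, not the Swift Current Creek data, and the study itself uses a stochastic optimization model rather than this direct sampling.

```python
# Rough Monte Carlo estimate of mean tradable water volume: sample random
# availability and treat any supply above the users' firm demand as tradable.
# Distribution parameters and demand are invented for illustration.

import random
random.seed(42)

def mean_tradable_volume(mean_supply, sd_supply, firm_demand, samples=50_000):
    total = 0.0
    for _ in range(samples):
        supply = max(0.0, random.gauss(mean_supply, sd_supply))
        total += max(0.0, supply - firm_demand)
    return total / samples

v = mean_tradable_volume(mean_supply=100.0, sd_supply=30.0, firm_demand=80.0)
```

Running the same estimate with supply distributions taken from different climate scenarios is what makes the indicator scenario-dependent, as the abstract reports.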

  1. An assessment of power system vulnerability to release of carbon fibers during commercial aviation accidents

    NASA Technical Reports Server (NTRS)

    Larocque, G. R.

    1980-01-01

    The vulnerability of a power distribution system in Bedford and Lexington, Massachusetts, to power outages as a result of exposure to carbon fibers released in a commercial aviation accident in 1993 was examined. Possible crash scenarios at Logan Airport based on current operational data and estimated carbon fiber usage levels were used to predict exposure levels and occurrence probabilities. The analysis predicts a mean time between carbon fiber induced power outages of 2300 years with an expected annual consequence of 0.7 persons losing power. In comparison to historical outage data for the system, this represents a 0.007% increase in outage rate and 0.07% increase in consequence.

  2. Evaluating release alternatives for a long-lived bird species under uncertainty about long-term demographic rates

    USGS Publications Warehouse

    Moore, Clinton T.; Converse, Sarah J.; Folk, Martin J.; Runge, Michael C.; Nesbitt, Stephen A.

    2012-01-01

    The release of animals to reestablish an extirpated population is a decision problem that is often attended by considerable uncertainty about the probability of success. Annual releases of captive-reared juvenile Whooping Cranes (Grus americana) were begun in 1993 in central Florida, USA, to establish a breeding, non-migratory population. Over a 12-year period, 286 birds were released, but by 2004, the introduced flock had produced only four wild-fledged birds. Consequently, releases were halted over managers' concerns about the performance of the released flock and uncertainty about the efficacy of further releases. We used data on marked, released birds to develop predictive models for addressing whether releases should be resumed, and if so, under what schedule. To examine the outcome of different release scenarios, we simulated the survival and productivity of individual female birds under a baseline model that recognized age and breeding-class structure and which incorporated empirically estimated stochastic elements. As data on wild-fledged birds from captive-reared parents were sparse, a key uncertainty that confronts release decision-making is whether captive-reared birds and their offspring share the same vital rates. Therefore, we used data on the only population of wild Whooping Cranes in existence to construct two alternatives to the baseline model. The probability of population persistence was highly sensitive to the choice of these three models. Under the baseline model, extirpation of the population was nearly certain under any scenario of resumed releases. In contrast, the model based on estimates from wild birds projected a high probability of persistence under any release scenario, including cessation of releases. Therefore, belief in either of these models suggests that further releases are an ineffective use of resources. 
In the third model, which simulated a population Allee effect, population persistence was sensitive to the release decision: high persistence probability was achieved only through the release of more birds, whereas extirpation was highly probable with cessation of releases. Despite substantial investment of time and effort in the release program, evidence collected to date does not favor one model over another; therefore, any decision about further releases must be made under considerable biological uncertainty. However, given an assignment of credibility weight to each model, a best, informed decision about releases can be made under uncertainty. Furthermore, if managers can periodically revisit the release decision and collect monitoring data to further inform the models, then managers have a basis for confronting uncertainty and adaptively managing releases through time.
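
    The simulation logic described above (stochastic survival and productivity of individual females under alternative release schedules, summarized as a persistence probability) can be sketched in a few lines. All vital rates and the release schedule below are invented, not the estimated Whooping Crane rates.

```python
# Bare-bones individual-based projection: each female survives and fledges
# offspring stochastically each year; persistence is the fraction of replicate
# runs with birds remaining at the horizon. All rates are invented.

import random
random.seed(7)

def persistence_prob(n0, survival, fledge_rate, releases_per_yr,
                     years=50, reps=200):
    persisted = 0
    for _ in range(reps):
        n = n0
        for _ in range(years):
            n = sum(random.random() < survival for _ in range(n))      # survival
            n += sum(random.random() < fledge_rate for _ in range(n))  # fledging
            n += releases_per_yr                                       # releases
        if n > 0:
            persisted += 1
    return persisted / reps

p_with = persistence_prob(30, survival=0.85, fledge_rate=0.1, releases_per_yr=5)
p_without = persistence_prob(30, survival=0.85, fledge_rate=0.1, releases_per_yr=0)
```

Comparing `p_with` and `p_without` across alternative vital-rate models is the same kind of contrast the abstract draws between its baseline, wild-bird, and Allee-effect models.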

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jeremy Gwin and Douglas Frenette

    This document contains process knowledge and radiological data and analysis to support approval for release of the 25-ton locomotive, Serial Number 21547, at the Area 25 Engine Maintenance, Assembly, and Disassembly (EMAD) Facility, located on the Nevada Test Site (NTS). The 25-ton locomotive is a small, one-of-a-kind locomotive used to move railcars in support of the Nuclear Engine for Rocket Vehicle Application project. This locomotive was identified as having significant historical value by the Nevada State Railroad Museum in Boulder City, Nevada, where it will be used as a display piece. A substantial effort to characterize the radiological conditions of the locomotive was undertaken by the NTS Management and Operations Contractor, National Security Technologies, LLC (NSTec). During this characterization process, seven small areas on the locomotive had contamination levels that exceeded the NTS release criteria (limits consistent with U.S. Department of Energy [DOE] Order DOE O 5400.5, “Radiation Protection of the Public and the Environment”). The decision was made to perform radiological decontamination of these known accessible impacted areas to further the release process. On February 9, 2010, NSTec personnel completed decontamination of these seven areas to within the NTS release criteria. Although all accessible areas of the locomotive had been successfully decontaminated to within NTS release criteria, it was plausible that inaccessible areas of the locomotive (i.e., those areas on the locomotive where it was not possible to perform radiological surveys) could potentially have contamination above unrestricted release limits. To access the majority of these inaccessible areas, the locomotive would have to be disassembled. A complete disassembly for a full radiological survey could have permanently destroyed parts and would have ruined the historical value of the locomotive.
Complete disassembly would also add an unreasonable financial burden for the contractor. The NTS regulator and NSTec therefore agreed to pursue alternative authorized limits from DOE Headquarters. To support this, NSTec personnel performed a dose assessment using the DOE-approved modeling code RESRAD-BUILD v3.5 to evaluate release scenarios, using conservative model parameters. NSTec's Radiological Engineering Calculation, REC-2010-001, “Public Dose Estimate from the EMAD 25 Ton Locomotive,” concluded that the four scenarios evaluated were below the 25-millirem-per-year limit, that the “likely” dose scenarios met the “few millirem in a year” criterion, and that the EMAD 25-ton locomotive met the radiological requirements to be released to the public with residual radioactivity.

  4. A methodology to quantify the release of spent nuclear fuel from dry casks during security-related scenarios

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Durbin, Samuel G.; Luna, Robert Earl

Assessing the risk to the public and the environment from a release of radioactive material caused by accidental or purposeful forces and environments is an important aspect of the regulatory process in many facets of the nuclear industry. The transport and storage of radioactive materials is of particular concern to the public, especially with regard to potential sabotage acts that might be undertaken by terror groups to cause injuries, panic, and/or economic consequences to a nation. For many such postulated attacks, no breach in the robust cask or storage module containment is expected to occur. However, there exists evidence that some hypothetical attack modes can penetrate the containment and cause a release of radioactive material. This report is intended as an unclassified overview of the methodology for release estimation, as well as a guide to useful resource data from unclassified sources and relevant analysis methods for the estimation process.

  5. A Comparison of Three Algorithms for Orion Drogue Parachute Release

    NASA Technical Reports Server (NTRS)

    Matz, Daniel A.; Braun, Robert D.

    2015-01-01

The Orion Multi-Purpose Crew Vehicle is susceptible to flipping apex forward between drogue parachute release and main parachute inflation. A smart drogue release algorithm is required to select a drogue release condition that will not result in an apex-forward main parachute deployment. The baseline algorithm is simple and elegant, but does not perform as well as desired in drogue failure cases. A simple modification to the baseline algorithm can improve performance, but can also sometimes fail to identify a good release condition. A new algorithm employing simplified rotational dynamics and a numeric predictor to minimize a rotational energy metric is proposed. A Monte Carlo analysis of a drogue failure scenario is used to compare the performance of the algorithms. The numeric predictor prevents more of the cases from flipping apex forward, and also results in an improvement in the capsule attitude at main bag extraction. The sensitivity of the numeric predictor to aerodynamic dispersions, errors in the navigated state, and execution rate is investigated, showing little degradation in performance.
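The rotational-energy-metric idea described above can be sketched in a few lines: propagate a pitch history and pick the release sample that minimizes kinetic plus restoring-potential energy. This is a minimal illustration assuming a hypothetical 1-DOF pitch model; the inertia, stiffness, and trajectory values are invented and are not Orion parameters or the paper's actual predictor.

```python
import math

# Illustrative constants (assumed, not Orion values)
I_PITCH = 12000.0   # pitch moment of inertia, kg*m^2
K_AERO = 9000.0     # aerodynamic restoring stiffness, N*m/rad

def rotational_energy(theta, omega):
    """Energy metric: rotational kinetic energy plus restoring potential."""
    return 0.5 * I_PITCH * omega ** 2 + 0.5 * K_AERO * theta ** 2

def best_release_time(trajectory):
    """Return the time of the sample minimizing the energy metric."""
    t, _, _ = min(trajectory, key=lambda s: rotational_energy(s[1], s[2]))
    return t

# Synthetic attitude history under drogues: (time s, pitch rad, pitch rate rad/s)
traj = [(0.1 * k,
         0.3 * math.sin(0.8 * 0.1 * k + 1.0),
         0.3 * 0.8 * math.cos(0.8 * 0.1 * k + 1.0))
        for k in range(200)]
t_release = best_release_time(traj)
```

A release chosen this way carries the least rotational energy into main bag extraction, which is the intuition behind preferring it over a fixed-condition release in drogue failure cases.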

  6. Participation of the NDC Austria at the NDC Preparedness Exercise 2012

    NASA Astrophysics Data System (ADS)

    Mitterbauer, Ulrike; Wotawa, Gerhard; Schraick, Irene

    2013-04-01

NDC Preparedness Exercises (NPEs) are conducted annually by the National Data Centers (NDCs) of CTBT States Signatories to practice the detection of a (hypothetical) nuclear test. During the NDC Preparedness Exercise 2012, a fictitious radionuclide scenario originating from a real seismic event (a mining explosion) was calculated by the German NDC and distributed among all NDCs. For the scenario computation, it was assumed that the selected seismic event was the epicentre of an underground nuclear fission explosion. The scenario included detections of the iodine isotopes I-131 and I-133 (both particulates) and the radioxenon isotopes Xe-133, Xe-133m, Xe-131m and Xe-135 (noble gas). By means of atmospheric transport modelling (ATM), the concentrations of all six isotopes that would result from the hypothetical explosion were calculated and interpolated to the IMS station locations. The participating NDCs received information about the concentrations of the isotopes at the station locations without knowing the underlying seismic event. The aim of the exercise was to identify this event based on the detection scenario. The Austrian NDC performed the following analyses: • atmospheric backtracking and data fusion to identify seismic candidate events, • seismic analysis of candidate events within the possible source region, • atmospheric transport modelling (forward mode) from identified candidate events, comparing "measured" and simulated concentrations based on certain release assumptions. The main goal of the analysis was to identify the event selected by NDC Germany to calculate the radionuclide scenario, and to exclude other events. In the presentation, the analysis methodology as well as the final results and conclusions will be shown and discussed in detail.

  7. EVA Suit Microbial Leakage Investigation Project

    NASA Technical Reports Server (NTRS)

    Falker, Jay; Baker, Christopher; Clayton, Ronald; Rucker, Michelle

    2016-01-01

The objective of this project is to collect microbial samples from various EVA suits to determine how much microbial contamination is typically released during simulated planetary exploration activities. Data will be released to the planetary protection and science communities, and to advanced EVA system designers. In the best-case scenario, we will discover that very little microbial contamination leaks from our current or prototype suit designs; in the worst-case scenario, we will identify leak paths, learn more about what affects leakage, and we'll have a new, flight-certified swab tool for our EVA toolbox.

  8. Explosion impacts during transport of hazardous cargo: GIS-based characterization of overpressure impacts and delineation of flammable zones for ammonia.

    PubMed

    Inanloo, Bahareh; Tansel, Berrin

    2015-06-01

The aim of this research was to investigate accidental releases of ammonia following an en-route incident, in order to better predict the consequences of hazardous cargo accidents. The air dispersion model Areal Locations of Hazardous Atmospheres (ALOHA) was employed to track the probable outcomes of a hazardous material release from a tanker truck under different explosion scenarios. Identification of the flammable zones was considered important in case the flammable vapor causes an explosion. The impacted areas and the severity of the probable destruction were evaluated for an explosion by considering the overpressure waves. ALOHA in conjunction with ArcGIS was used to delineate the flammable and overpressure impact zones for the different scenarios. Based on the results, flammable fumes formed oval shapes with their major axis along the wind direction at the time of release. The extents of the impact areas at the overpressure value that can lead to property damage were estimated to be around 1708 and 1206 feet for a 2-ton release, and 3742 and 3527 feet for a 20-ton release, under very stable and unstable atmospheric conditions, respectively, along the wind direction. A sensitivity analysis was performed to assess the significance of wind speed on the impact zones. The insight provided by this study can be utilized by decision makers in the transportation of hazardous materials as a guide for possible rerouting, rescheduling, or limiting the quantity of hazardous cargo to reduce the possible impacts of hazardous cargo accidents during transport. Copyright © 2015 Elsevier Ltd. All rights reserved.
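The weak growth of the damage distance with release mass reported above is the familiar cube-root behavior of blast scaling. The sketch below shows generic TNT-equivalence scaling, R = k·W^(1/3), with a hypothetical constant k; ALOHA's actual overpressure correlations are more detailed, so this is only an order-of-magnitude illustration.

```python
# Hedged sketch of cube-root (TNT-equivalence) blast scaling. The constant k
# (m per kg^(1/3)) is an invented placeholder for a fixed overpressure
# threshold such as the ~1 psi property-damage level.
def damage_radius_m(mass_kg, k=4.5):
    """Distance (m) to a fixed overpressure threshold: R = k * W**(1/3)."""
    return k * mass_kg ** (1.0 / 3.0)

# A tenfold increase in released mass grows the damage radius only by a
# factor of 10**(1/3) ~ 2.15, consistent with the relatively close impact
# distances reported for the 2- and 20-ton releases.
r_2t = damage_radius_m(2000.0)
r_20t = damage_radius_m(20000.0)
```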

  9. Demonstration of fully coupled simplified extended station black-out accident simulation with RELAP-7

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, Haihua; Zhang, Hongbin; Zou, Ling

    2014-10-01

The RELAP-7 code is the next-generation nuclear reactor system safety analysis code being developed at the Idaho National Laboratory (INL). The RELAP-7 code development effort started in October 2011, and by the end of the second development year, a number of physical components with simplified two-phase flow capability had been developed to support simplified boiling water reactor (BWR) extended station blackout (SBO) analyses. The demonstration case includes the major components of the primary system of a BWR, as well as the safety system components for the safety relief valve (SRV), the reactor core isolation cooling (RCIC) system, and the wet well. Three scenarios for the SBO simulations have been considered. Since RELAP-7 is not a severe accident analysis code, the simulation stops when the fuel clad temperature reaches the damage point. Scenario I represents an extreme station blackout accident without any external cooling or cooling water injection. The system pressure is controlled by automatically releasing steam through the SRVs. Scenario II includes the RCIC system but no SRV. The RCIC system is fully coupled with the reactor primary system, and all the major components are dynamically simulated. The third scenario includes both the RCIC system and the SRV to provide a more realistic simulation. This paper describes the major models and discusses the results for the three scenarios. The RELAP-7 simulations for the three simplified SBO scenarios show the importance of dynamically simulating the SRVs, the RCIC system, and the wet well to reactor safety during extended SBO accidents.

  10. Contraceptive Options and Their Associated Estrogenic Environmental Loads: Relationships and Trade-Offs

    PubMed Central

    Khan, Usman; Nicell, Jim A.

    2014-01-01

This work explores the relationships between a user's choice of a given contraceptive option and the load of steroidal estrogens that can be associated with that choice. Family planning data for the USA served as the basis for the analysis. The results showed that, collectively, the use of contraception in the USA conservatively averts the release of approximately 4.8 tonnes of estradiol equivalents to the environment. 35% of the estrogenic load released over the course of all experienced pregnancy events, and 34% of the estrogenic load represented by all resultant legacies, are the result of contraceptive failure and the non-use of contraception. A scenario analysis conducted to explore the impacts of discontinuing the use of ethinylestradiol-based oral contraceptives revealed that this would not only result in a 1.7-fold increase in the estrogenic loading of the users, but the users would also be expected to experience undesired family planning outcomes at a rate 3.3 times higher. Additional scenario analyses, in which ethinylestradiol-based oral contraceptive users were modeled as having switched entirely to the use of male condoms, diaphragms or copper IUDs, suggested that whether a higher or lower estrogenic load can be associated with the switching population depends on the typical failure rates of the options adopted following discontinuation. Finally, it was estimated that, in the USA, at most 13% of the annual estrogenic load can be averted by fully meeting the contraceptive needs of the population. Therefore, while the issue of estrogen impacts on the environment cannot be addressed solely by meeting the population's contraceptive needs, a significant fraction of the estrogenic mass released to the environment can be averted by improving the extent to which those needs are met. PMID:24670973
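The trade-off the scenario analyses describe reduces to simple arithmetic: a method's total estrogenic load is its direct excretion load plus the load of the pregnancies expected from typical-use failures. The sketch below illustrates this with entirely invented placeholder numbers, not the paper's estimates.

```python
# Toy estrogenic-load accounting (arbitrary estradiol-equivalent units).
# All parameter values below are hypothetical placeholders.
def annual_load(users, direct_load_per_user, failure_rate, load_per_pregnancy):
    """Population load = direct method load + load of failure pregnancies."""
    expected_pregnancies = users * failure_rate
    return users * direct_load_per_user + expected_pregnancies * load_per_pregnancy

pill = annual_load(1000, direct_load_per_user=2.0, failure_rate=0.09,
                   load_per_pregnancy=50.0)
condom = annual_load(1000, direct_load_per_user=0.0, failure_rate=0.18,
                     load_per_pregnancy=50.0)
```

With these placeholder figures the zero-excretion method ends up with the larger total load because of its higher typical failure rate, which is exactly the dependence the abstract reports for switching scenarios.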

  11. Ingestion Pathway Consequences of a Major Release from SRTC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blanchard, A.

    1999-06-08

    The food ingestion consequences due to radioactive particulates of an accidental release, scenario 1-RD-3, are evaluated for Savannah River Technology Center. The sizes of land areas requiring the protective action of food interdiction are calculated. The consequences of the particulate portion of the release are evaluated with the HOTSPOT model and an EXCEL spreadsheet for particulates.

  12. Understanding and Predicting the Fate of Semivolatile Organic Pesticides in a Glacier-Fed Lake Using a Multimedia Chemical Fate Model.

    PubMed

    Wu, Xiaolin; Davie-Martin, Cleo L; Steinlin, Christine; Hageman, Kimberly J; Cullen, Nicolas J; Bogdal, Christian

    2017-10-17

Melting glaciers release previously ice-entrapped chemicals to the surrounding environment. As glacier melting accelerates under future climate warming, chemical release may also increase. This study investigated the behavior of semivolatile pesticides over the course of one year and predicted their behavior under two future climate change scenarios. Pesticides were quantified in air, lake water, glacial meltwater, and streamwater in the catchment of Lake Brewster, an alpine glacier-fed lake located in the Southern Alps of New Zealand. Two historic-use pesticides (endosulfan I and hexachlorobenzene) and three current-use pesticides (dacthal, triallate, and chlorpyrifos) were frequently found in both air and water samples from the catchment. Regression analysis indicated that the pesticide concentrations in glacial meltwater and lake water were strongly correlated. A multimedia environmental fate model was developed for these five chemicals in Lake Brewster. Modeling results indicated that the seasonal melting of lake ice cover, together with varying input contributions from glacial melt and streamwater, created pulses in pesticide concentrations in lake water. Under future climate scenarios, the concentration pulse was altered and glacial melt made a greater contribution (as mass flux) to pesticide input to the lake water.
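The pulse behavior described above falls out of a simple well-mixed mass balance when an inflow term switches on seasonally. The sketch below is a toy stand-in for the multimedia fate model, with invented flows and concentrations, showing a winter of clean inflow followed by a pesticide-bearing glacial melt pulse.

```python
# Toy well-mixed lake mass balance; all flows/concentrations are invented.
def lake_series(c0, volume, steps, dt=1.0):
    """Forward Euler on dC/dt = (sum(Q_i*C_i) - Q_out*C)/V, with the outflow
    balancing total inflow. `steps` holds, per time step, a list of
    (flow, concentration) inflow pairs."""
    c, out = c0, []
    for inflows in steps:
        q_out = sum(q for q, _ in inflows)
        mass_in = sum(q * ci for q, ci in inflows)
        c += dt * (mass_in - q_out * c) / volume
        out.append(c)
    return out

# Ice-covered winter (clean streamflow only), then a summer melt season adding
# pesticide-bearing glacial meltwater on top of the stream input.
winter = [[(5.0, 0.0)]] * 50
summer = [[(5.0, 0.1), (10.0, 2.0)]] * 50
series = lake_series(0.0, volume=500.0, steps=winter + summer)
```

The lake concentration stays at zero through winter, then rises toward the flow-weighted inflow concentration during the melt pulse, which is the qualitative pattern the model projections show.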

  13. Emission rate modeling and risk assessment at an automobile plant from painting operations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kumar, A.; Shrivastava, A.; Kulkarni, A.

Pollution from automobile painting operations has been addressed in the Clean Air Act Amendments of 1990. Pollutant emissions from automobile painting operations have mostly been estimated by approximate procedures rather than by actual calculations. The purpose of this study was to develop a methodology for calculating pollutant emissions from the painting operation of an automobile plant. Five scenarios involving an automobile painting operation located in Columbus, Ohio, were studied for pollutant emissions and the concomitant risk. In the risk study, a sensitivity analysis of the parameters involved was performed using Crystal Ball®, software based on the Monte Carlo method. The most sensitive factor in the risk analysis was the ground-level concentration of the pollutants. All scenarios studied met the safety goal (a risk value of 1 × 10⁻⁶) with different confidence levels; the highest level of confidence in meeting the safety goal was displayed by Scenario 1 (Alpha Industries). The results from the scenarios suggest that risk is associated with the quantity of toxic pollutants released. The sensitivity analysis of the various parameters showed that the average spray rate of paint is the most important parameter in estimating pollutants from the painting operations. The entire study is a complete module that can be used by environmental pollution control agencies for estimating pollution levels and the associated risk. The study can be further extended to other operations in an automobile plant or to different industries.
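The confidence-level idea in the abstract can be sketched as a plain Monte Carlo loop: sample the uncertain inputs, propagate them to an individual-risk estimate, and count how often the 1 × 10⁻⁶ goal is met. The distributions below (a lognormal ground-level concentration and a uniform unit-risk range) are invented for illustration and are not the study's emission equations.

```python
import math
import random

# Crystal Ball-style Monte Carlo sketch with hypothetical input distributions.
def risk_confidence(n=20000, seed=7, goal=1e-6):
    """Fraction of sampled cases whose risk meets the safety goal."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        conc = rng.lognormvariate(math.log(1e-4), 0.5)  # mg/m^3 (assumed)
        unit_risk = rng.uniform(1e-3, 3e-3)             # risk per mg/m^3 (assumed)
        if conc * unit_risk <= goal:
            hits += 1
    return hits / n

confidence = risk_confidence()
```

Ranking which sampled input correlates most strongly with the risk output is then what identifies the dominant parameter, as the study did for ground-level concentration and spray rate.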

  14. NPE 2010 results - Independent performance assessment by simulated CTBT violation scenarios

    NASA Astrophysics Data System (ADS)

    Ross, O.; Bönnemann, C.; Ceranna, L.; Gestermann, N.; Hartmann, G.; Plenefisch, T.

    2012-04-01

    For verification of compliance to the Comprehensive Nuclear-Test-Ban Treaty (CTBT) the global International Monitoring System (IMS) is currently being built up. The IMS is designed to detect nuclear explosions through their seismic, hydroacoustic, infrasound, and radionuclide signature. The IMS data are collected, processed to analysis products, and distributed to the state signatories by the International Data Centre (IDC) in Vienna. The state signatories themselves may operate National Data Centers (NDC) giving technical advice concerning CTBT verification to the government. NDC Preparedness Exercises (NPE) are regularly performed to practice the verification procedures for the detection of nuclear explosions in the framework of CTBT monitoring. The initial focus of the NPE 2010 was on the component of radionuclide detections and the application of Atmospheric Transport Modeling (ATM) for defining the source region of a radionuclide event. The exercise was triggered by fictitious radioactive noble gas detections which were calculated beforehand secretly by forward ATM for a hypothetical xenon release scenario starting at location and time of a real seismic event. The task for the exercise participants was to find potential source events by atmospheric backtracking and to analyze in the following promising candidate events concerning their waveform signals. The study shows one possible way of solution for NPE 2010 as it was performed at German NDC by a team without precedent knowledge of the selected event and release scenario. The ATM Source Receptor Sensitivity (SRS) fields as provided by the IDC were evaluated in a logical approach in order to define probable source regions for several days before the first reported fictitious radioactive xenon finding. Additional information on likely event times was derived from xenon isotopic ratios where applicable. 
Of the considered seismic events in the potential source region, all except one could be identified as earthquakes by seismological analysis. The remaining event, at Black Thunder Mine, Wyoming, on 23 October at 21:15 UTC, showed clear explosion characteristics. It also caused infrasound detections at one station in Canada. An infrasonic single-station localization algorithm yielded localization results comparable in precision to the teleseismic localization. However, the analysis of regional seismological stations gave the most accurate result, with an error ellipse of about 60 square kilometers. Finally, a forward ATM simulation was performed with the candidate event as the source in order to reproduce the original detection scenario. The ATM results showed a simulated station fingerprint in the IMS very similar to the fictitious detections given in the NPE 2010 scenario, an additional confirmation that the event was correctly identified. The event analysis of the NPE 2010 thus serves as a successful example of data fusion between radionuclide detection technology supported by ATM, seismological methodology, and infrasound signal processing.
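The "likely event times derived from xenon isotopic ratios" step rests on the fact that an activity ratio of two isotopes decays at the difference of their decay constants. The sketch below inverts that relation for Xe-133m/Xe-133; the half-lives are approximate and the zero-time ratio is a scenario-dependent assumption, not a universal constant.

```python
import math

# Approximate half-lives in hours (Xe-133 ~5.25 d, Xe-133m ~2.19 d).
T_HALF_XE133_H = 125.9
T_HALF_XE133M_H = 52.6

def hours_since_event(ratio_observed, ratio_at_zero):
    """Invert R(t) = R0 * exp(-(lam_m - lam_g) * t) for elapsed time t.
    `ratio_at_zero` (the Xe-133m/Xe-133 ratio at release) is an assumption
    that depends on the source term."""
    lam_g = math.log(2) / T_HALF_XE133_H
    lam_m = math.log(2) / T_HALF_XE133M_H
    return math.log(ratio_at_zero / ratio_observed) / (lam_m - lam_g)
```

Combining such a time estimate with the backtracking source region is what narrows the list of seismic candidate events.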

  15. Extreme scenarios for nuclear waste repositories.

    PubMed

    Brown, M J; Crouch, E

    1982-09-01

Two extreme scenarios for the release of radioactive waste have been constructed. In the first, a volcanic eruption releases 1 km² of an underground nuclear waste repository, while in the second, waste enters the drinking water reservoir of a major city. With pessimistic assumptions, upper bounds on the number of cancers due to radiation are calculated. In the volcano scenario, the effects of the waste are smaller than the effects of natural radioactivity in the volcanic dust if the delay between emplacement and eruption exceeds 2000 yr. The consequences of the waste in drinking water depend on the survival time of the canisters and the rate of leaching of the nuclides from the waste matrix. For a canister life of 400 yr and a leach time of 6300 yr, the cancer rate in the affected area would increase by 25%.
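The canister-life/leach-time dependence above can be illustrated with a toy release model: nothing escapes while the canister is intact, then the matrix leaches at a constant fractional rate while the activity decays. The half-lives and the model form below are illustrative, not the paper's calculation.

```python
import math

# Toy model: fraction of initial activity reaching water, given a canister
# that fails after `canister_life_yr` and a matrix that then leaches linearly
# over `leach_time_yr`, while the nuclide decays with `half_life_yr`.
def fraction_reaching_water(canister_life_yr, leach_time_yr, half_life_yr):
    lam = math.log(2) / half_life_yr
    frac, dt = 0.0, 1.0
    t = canister_life_yr
    while t < canister_life_yr + leach_time_yr:
        frac += (dt / leach_time_yr) * math.exp(-lam * t)
        t += dt
    return frac

# Long-lived nuclides (hypothetical 24,000 yr half-life) barely decay before
# release; short-lived fission products (30 yr) are essentially gone.
actinide = fraction_reaching_water(400.0, 6300.0, 24000.0)
fission = fraction_reaching_water(400.0, 6300.0, 30.0)
```

This is why the delay parameters dominate the drinking-water consequences: for short-lived activity a few-hundred-year canister life removes nearly everything, while long-lived nuclides pass through almost undiminished.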

  16. Carbon Nanotubes Released from an Epoxy-Based Nanocomposite: Quantification and Particle Toxicity.

    PubMed

    Schlagenhauf, Lukas; Buerki-Thurnherr, Tina; Kuo, Yu-Ying; Wichser, Adrian; Nüesch, Frank; Wick, Peter; Wang, Jing

    2015-09-01

    Studies combining both the quantification of free nanoparticle release and the toxicological investigations of the released particles from actual nanoproducts in a real-life exposure scenario are urgently needed, yet very rare. Here, a new measurement method was established to quantify the amount of free-standing and protruding multiwalled carbon nanotubes (MWCNTs) in the respirable fraction of particles abraded from a MWCNT-epoxy nanocomposite. The quantification approach involves the prelabeling of MWCNTs with lead ions, nanocomposite production, abrasion and collection of the inhalable particle fraction, and quantification of free-standing and protruding MWCNTs by measuring the concentration of released lead ions. In vitro toxicity studies for genotoxicity, reactive oxygen species formation, and cell viability were performed using A549 human alveolar epithelial cells and THP-1 monocyte-derived macrophages. The quantification experiment revealed that in the respirable fraction of the abraded particles, approximately 4000 ppm of the MWCNTs were released as exposed MWCNTs (which could contact lung cells upon inhalation) and approximately 40 ppm as free-standing MWCNTs in the worst-case scenario. The release of exposed MWCNTs was lower for nanocomposites containing agglomerated MWCNTs. The toxicity tests revealed that the abraded particles did not induce any acute cytotoxic effects.
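The lead-label quantification above is a back-calculation: measured released lead, divided by the lead loading per unit MWCNT mass, gives the released MWCNT mass, which is then expressed as ppm of the MWCNT content of the respirable fraction. The sketch below uses hypothetical loading and mass values chosen only to reproduce the reported order of magnitude.

```python
# Back-calculate exposed MWCNT mass from the lead label. The loading value
# and masses are hypothetical placeholders, not the study's measurements.
def exposed_mwcnt_ppm(pb_released_ng, pb_loading_ng_per_ug, mwcnt_in_fraction_ug):
    """MWCNT(ug) = Pb(ng) / loading(ng Pb per ug MWCNT), as ppm of the
    MWCNT content of the collected respirable fraction."""
    mwcnt_ug = pb_released_ng / pb_loading_ng_per_ug
    return 1e6 * mwcnt_ug / mwcnt_in_fraction_ug

exposed = exposed_mwcnt_ppm(40.0, 10.0, 1000.0)        # ~4000 ppm scale
free_standing = exposed_mwcnt_ppm(0.4, 10.0, 1000.0)   # ~40 ppm scale
```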

  17. A comparison between the example reference biosphere model ERB 2B and a process-based model: simulation of a natural release scenario.

    PubMed

    Almahayni, T

    2014-12-01

The BIOMASS methodology was developed with the objective of constructing defensible assessment biospheres for assessing the potential radiological impacts of radioactive waste repositories. To this end, a set of Example Reference Biospheres was developed to demonstrate the use of the methodology and to provide an international point of reference. In this paper, the performance of the Example Reference Biosphere model ERB 2B associated with the natural release scenario, discharge of contaminated groundwater to the surface environment, was evaluated by comparing its long-term projections of radionuclide dynamics and distribution in a soil-plant system to those of a process-based, transient advection-dispersion (AD) model. The models were parametrised with data characteristic of a typical rainfed winter wheat crop grown on a sandy loam soil under temperate climate conditions. Three safety-relevant radionuclides, (99)Tc, (129)I and (237)Np, with different degrees of sorption, were selected for the study. Although the models were driven by the same hydraulic (soil moisture content and water fluxes) and radiological (Kds) input data, their projections were remarkably different. Both models were able to capture short- and long-term variation in activity concentration in the subsoil compartment; however, the Reference Biosphere model did not project any radionuclide accumulation in the topsoil and crop compartments. This behaviour would underestimate the radiological exposure under natural release scenarios. The results highlight the potential role deep roots play in soil-to-plant transfer under a natural release scenario where radionuclides are released into the subsoil. When considering the relative activity and root-depth profiles within the soil column, much of the radioactivity was taken up into the crop from the subsoil compartment.
Further improvements were suggested to address the limitations of the Reference Biosphere model presented in this paper. Copyright © 2014 Elsevier Ltd. All rights reserved.

  18. Medical Device Plug-and-Play Interoperability Standards and Technology Leadership

    DTIC Science & Technology

    2016-10-01

    above for publication in a peer-reviewed journal 4. Expand the release of the Clinical Scenario Repository (CSR), also known as “Good Ideas for...the Clinical Scenario Repository (CSR). The CSR pilot with the American Society of Anesthesiologists (ASA) Committee on Patient Safety and Education

  19. Potential slab avalanche release area identification from estimated winter terrain: a multi-scale, fuzzy logic approach

    NASA Astrophysics Data System (ADS)

    Veitinger, Jochen; Purves, Ross Stuart; Sovilla, Betty

    2016-10-01

    Avalanche hazard assessment requires a very precise estimation of the release area, which still depends, to a large extent, on expert judgement of avalanche specialists. Therefore, a new algorithm for automated identification of potential avalanche release areas was developed. It overcomes some of the limitations of previous tools, which are currently not often applied in hazard mitigation practice. By introducing a multi-scale roughness parameter, fine-scale topography and its attenuation under snow influence is captured. This allows the assessment of snow influence on terrain morphology and, consequently, potential release area size and location. The integration of a wind shelter index enables the user to define release area scenarios as a function of the prevailing wind direction or single storm events. A case study illustrates the practical usefulness of this approach for the definition of release area scenarios under varying snow cover and wind conditions. A validation with historical data demonstrated an improved estimation of avalanche release areas. Our method outperforms a slope-based approach, in particular for more frequent avalanches; however, the application of the algorithm as a forecasting tool remains limited, as snowpack stability is not integrated. Future research activity should therefore focus on the coupling of the algorithm with snowpack conditions.
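The combination of slope suitability, snow-smoothed roughness, and wind shelter described above is naturally expressed with fuzzy memberships. The sketch below uses a product as the fuzzy AND and invented breakpoints; it illustrates the idea only and is not the paper's calibrated algorithm.

```python
# Fuzzy-logic sketch of per-cell release potential. Breakpoints and the
# product operator are illustrative choices, not the published calibration.
def ramp(x, a, b):
    """Piecewise-linear membership: 0 below a, rising to 1 at b (a < b)."""
    return min(1.0, max(0.0, (x - a) / (b - a)))

def release_potential(slope_deg, smoothness, wind_exposure):
    """Combine slope-angle suitability, snow-smoothed terrain (1 - multi-scale
    roughness), and wind exposure, each in [0, 1], via a fuzzy AND (product)."""
    mu_slope = ramp(slope_deg, 28.0, 35.0) * (1.0 - ramp(slope_deg, 45.0, 60.0))
    return mu_slope * smoothness * wind_exposure
```

Cells too flat to slide or too steep to accumulate a slab score zero, while a smooth, wind-loaded 35-45 degree cell scores highest; thresholding this score over a grid delineates candidate release areas for a given snow and wind scenario.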

  20. Water temperature effects from simulated changes to dam operations and structures in the Middle and South Santiam Rivers, Oregon

    USGS Publications Warehouse

    Buccola, Norman L.

    2017-05-31

Green Peter and Foster Dams on the Middle and South Santiam Rivers, Oregon, have altered the annual downstream water temperature profile (cycle). Operation of the dams has resulted in cooler summer releases and warmer autumn releases relative to pre-dam conditions, and that alteration can hinder the recovery of various life stages of threatened spring-run Chinook salmon (Oncorhynchus tshawytscha) and winter steelhead (O. mykiss). Lake-level management and the use of multiple outlets at varying depths at the dams can enable the maintenance of a temperature regime more closely resembling that in which the fish evolved, by releasing warm surface water during summer and cooler, deeper water in the autumn. At Green Peter and Foster Dams, the outlet configuration is such that temperature control is often limited by hydropower production at the dams. Previously calibrated CE-QUAL-W2 water temperature models of Green Peter and Foster Lakes were used to simulate the downstream thermal effects of hypothetical structures and modified operations at the dams. Scenarios with no minimum power production requirements allowed some releases through shallower and deeper outlets (summer and autumn) to achieve better temperature control throughout the year and less year-to-year variability in autumn release temperatures. Scenarios including a hypothetical outlet floating 1 meter below the lake surface resulted in a greater ability to release warm water during summer compared to existing structures. Later in autumn (October 15-December 31), a limited amount of temperature control was realized downstream from Foster Dam by scenarios limited to operational changes with existing structures, resulting in 15-day averages within 1.0 degree Celsius of current operations.

1. Consequence assessment for Airborne Releases of SO₂ from the Y-12 Pilot Dechlorination Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pendergrass, W.R.

The Atmospheric Turbulence and Diffusion Division was requested by the Department of Energy's Oak Ridge Operations Office to conduct a consequence assessment for potential atmospheric releases of SO₂ from the Y-12 Pilot Dechlorination Facility. The focus of the assessment was to identify "worst-case" meteorology, which poses the highest concentration exposure potential for both on-site and off-site populations. A series of plausible SO₂ release scenarios was provided by Y-12 for the consequence assessment. Each scenario was evaluated for predictions of downwind concentration, estimates of a five-minute time-weighted average, and an estimate of the dimensions of the puff. The highest hazard potential was associated with Scenario 1, in which a total of eight SO₂ cylinders are released internally to the Pilot Facility and exhausted through the emergency venting system. A companion effort was also conducted to evaluate the potential impact of releases of SO₂ from the Pilot Facility on the population of Oak Ridge. While specific transport trajectory data are not available for the Pilot Facility, extrapolations based on the Oak Ridge Site Survey and climatological records from the Y-12 meteorological program do not indicate a potential for impact on the city of Oak Ridge. Steering by the local topographical features severely limits the potential impact area. Because of the lack of specific observational data, both tracer and meteorological, only inferences can be made concerning impact zones. It is recommended that the Department of Energy Oak Ridge Operations examine the potential for off-site impact and develop the background data needed to prepare impact zones for releases of hazardous materials from the Y-12 facility.

  2. Development of a GIS-based spill management information system.

    PubMed

    Martin, Paul H; LeBoeuf, Eugene J; Daniel, Edsel B; Dobbins, James P; Abkowitz, Mark D

    2004-08-30

    Spill Management Information System (SMIS) is a geographic information system (GIS)-based decision support system designed to effectively manage the risks associated with accidental or intentional releases of a hazardous material into an inland waterway. SMIS provides critical planning and impact information to emergency responders in anticipation of, or following such an incident. SMIS couples GIS and database management systems (DBMS) with the 2-D surface water model CE-QUAL-W2 Version 3.1 and the air contaminant model Computer-Aided Management of Emergency Operations (CAMEO) while retaining full GIS risk analysis and interpretive capabilities. Live 'real-time' data links are established within the spill management software to utilize current meteorological information and flowrates within the waterway. Capabilities include rapid modification of modeling conditions to allow for immediate scenario analysis and evaluation of 'what-if' scenarios. The functionality of the model is illustrated through a case study of the Cheatham Reach of the Cumberland River near Nashville, TN.

  3. Potential impacts of climate change on the built environment: ASHRAE climate zones, building codes and national energy efficiency

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    New, Joshua Ryan; Kumar, Jitendra; Hoffman, Forrest M.

Statement of the Problem: ASHRAE releases updates to Standard 90.1, "Energy Standard for Buildings except Low-Rise Residential Buildings," every three years, resulting in a 3.7%-17.3% increase in energy efficiency for buildings with each release. The standard is adopted by, or informs, building codes in nations across the globe; it is the national standard for the US, and individual states elect which release year of the standard they will enforce. These codes are built upon Standard 169, "Climatic Data for Building Design Standards," the latest 2017 release of which defines climate zones based on 8,118 weather stations throughout the world and data from the past 8-25 years. This data may not be indicative of the weather that buildings built today will see during their 30-120 year lifespan. Methodology & Theoretical Orientation: Using more modern, high-resolution datasets from climate satellites, IPCC climate models (PCM and HadGCM), high-performance computing resources (Titan), and new capabilities for clustering and optimization, the authors analyzed different methods for redefining climate zones, employing a bottom-up analysis of multiple meteorological variables that subject-matter experts selected as being important to energy consumption, rather than the heating/cooling degree days currently used. Findings: We analyzed the accuracy of the redefined climate zones compared to the current climate zones, examined how the climate zones moved under different climate change scenarios, and quantified the accuracy of these methods at a local level and at a national scale for the US. Conclusion & Significance: Significant annual national energy and cost savings (billions of USD) could likely be realized by adjusting climate zones to take into account anticipated trends or scenarios in regional weather patterns.

  4. Reservoir optimisation using El Niño information. Case study of Daule Peripa (Ecuador)

    NASA Astrophysics Data System (ADS)

    Gelati, Emiliano; Madsen, Henrik; Rosbjerg, Dan

    2010-05-01

    The optimisation of water resources systems requires the ability to produce runoff scenarios that are consistent with available climatic information. We approach stochastic runoff modelling with a Markov-modulated autoregressive model with exogenous input, which belongs to the class of Markov-switching models. The model assumes runoff parameterisation to be conditioned on a hidden climatic state following a Markov chain, whose state transition probabilities depend on climatic information. This approach allows stochastic modelling of non-stationary runoff, as runoff anomalies are described by a mixture of autoregressive models with exogenous input, each one corresponding to a climate state. We calibrate the model on the inflows of the Daule Peripa reservoir located in western Ecuador, where the occurrence of El Niño leads to anomalously heavy rainfall caused by positive sea surface temperature anomalies along the coast. El Niño-Southern Oscillation (ENSO) information is used to condition the runoff parameterisation. Inflow predictions are realistic, especially at the occurrence of El Niño events. The Daule Peripa reservoir serves a hydropower plant and a downstream water supply facility. Using historical ENSO records, synthetic monthly inflow scenarios are generated for the period 1950-2007. These scenarios are used as input to perform stochastic optimisation of the reservoir rule curves with a multi-objective Genetic Algorithm (MOGA). The optimised rule curves are assumed to be the reservoir base policy. ENSO standard indices are currently forecasted at a monthly time scale with nine-month lead time.
These forecasts are used to perform stochastic optimisation of reservoir releases at each monthly time step according to the following procedure: (i) nine-month inflow forecast scenarios are generated using ENSO forecasts; (ii) a MOGA is set up to optimise the upcoming nine monthly releases; (iii) the optimisation is carried out by simulating the releases on the inflow forecasts, and by applying the base policy on a subsequent synthetic inflow scenario in order to account for long-term costs; (iv) the optimised release for the first month is implemented; (v) the state of the system is updated and (i), (ii), (iii), and (iv) are iterated for the following time step. The results highlight the advantages of using a climate-driven stochastic model to produce inflow scenarios and forecasts for reservoir optimisation, showing potential improvements with respect to the current management. Dynamic programming was used to find the best possible release time series given the inflow observations, in order to benchmark any possible operational improvement.
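The five-step receding-horizon procedure above can be sketched as a rolling optimisation loop. The sketch below is illustrative only: it replaces the paper's MOGA and ENSO-conditioned Markov-switching inflow model with a simple grid search and a generic stochastic inflow generator, and all reservoir parameters (capacity, demand, penalty weights) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(42)

def forecast_inflows(month, horizon=9, n_scenarios=20):
    # Placeholder for step (i), ENSO-conditioned inflow forecast scenarios;
    # here: lognormal noise around a seasonal mean, purely illustrative.
    seasonal = 50 + 30 * np.sin(2 * np.pi * (month + np.arange(horizon)) / 12)
    return seasonal * rng.lognormal(0.0, 0.3, size=(n_scenarios, horizon))

def simulate(storage, inflows, releases, capacity=1000.0):
    # Reservoir mass balance over the horizon; spill when capacity is exceeded.
    cost = 0.0
    for q_in, r in zip(inflows, releases):
        storage = storage + q_in - r
        spill = max(storage - capacity, 0.0)
        storage = min(max(storage, 0.0), capacity)
        cost += (r - 60.0) ** 2 + 10.0 * spill  # demand-shortfall + spill penalty
    return cost, storage

def optimise_release(storage, month):
    # Steps (ii)-(iii): search candidate releases, each evaluated against all
    # forecast scenarios (the paper's MOGA would search a richer policy space).
    scenarios = forecast_inflows(month)
    best_r, best_cost = 0.0, np.inf
    for r in np.linspace(0, 120, 25):
        cost = np.mean([simulate(storage, s, [r] * len(s))[0] for s in scenarios])
        if cost < best_cost:
            best_r, best_cost = r, cost
    return best_r

# Steps (iv)-(v): implement only the first month's release, then roll forward.
storage = 500.0
for month in range(12):
    r = optimise_release(storage, month)
    inflow = forecast_inflows(month, horizon=1, n_scenarios=1)[0, 0]
    storage = min(max(storage + inflow - r, 0.0), 1000.0)
```

In the full procedure the tail cost beyond the nine-month horizon is evaluated by applying the base policy (the optimised rule curves) to a subsequent synthetic inflow scenario; the quadratic demand penalty above merely stands in for that long-term objective.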

  5. Establishing the environmental risk of metal contaminated river bank sediments

    NASA Astrophysics Data System (ADS)

    Lynch, Sarah; Batty, Lesley; Byrne, Patrick

    2016-04-01

    Climate change predictions indicate an increase in the frequency and duration of flood events along with longer dry antecedent conditions, which could alter patterns of trace metal release from contaminated river bank sediments. This study took a laboratory mesocosm approach. Chemical analysis of water and sediment samples allowed the patterns of Pb and Zn release and the key mechanisms controlling Pb and Zn mobility to be determined. The trace metal contaminants Pb and Zn were released throughout flooded periods. The highest concentrations of dissolved Pb were observed at the end of the longest flood period, and high concentrations of dissolved Zn were released at the start of a flood. These concentrations were found to exceed environmental quality standards. The key mechanisms controlling mobility were (i) evaporation, precipitation and dissolution of Zn sulphate salts, (ii) anglesite solubility control of dissolved Pb, (iii) oxidation of galena and sphalerite, and (iv) reductive dissolution of Mn/Fe hydroxides and co-precipitation/adsorption with Zn. In light of climate change predictions, these results indicate future scenarios may include larger or more frequent transient 'pulses' of dissolved Pb and Zn released to river systems. These short-lived pollution episodes could act as a significant barrier to achieving the EU Water Framework Directive objectives.

  6. Potential Scenarios for Nanomaterial Release and Subsequent Alteration in the Environment

    EPA Science Inventory

    The risk associated with engineered nanomaterials (ENMs) will be determined in part by the processes that control their environmental fate and transformation. These processes act not only on ENMs that might be released directly to the environment, but perhaps more importantly als...

  7. Radioxenon detections in the CTBT international monitoring system likely related to the announced nuclear test in North Korea on February 12, 2013.

    PubMed

    Ringbom, A; Axelsson, A; Aldener, M; Auer, M; Bowyer, T W; Fritioff, T; Hoffman, I; Khrustalev, K; Nikkinen, M; Popov, V; Popov, Y; Ungar, K; Wotawa, G

    2014-02-01

    Observations made in April 2013 of the radioxenon isotopes (133)Xe and (131m)Xe at measurement stations in Japan and Russia, belonging to the International Monitoring System for verification of the Comprehensive Nuclear-Test-Ban Treaty, are unique with respect to the measurement history of these stations. Comparison of measured data with calculated isotopic ratios, as well as analysis using atmospheric transport modeling, indicates that the xenon measured was likely created in the underground nuclear test conducted by North Korea on February 12, 2013, and released 7-8 weeks later. More than one release is required to explain all observations. The (131m)Xe source terms for each release were calculated to be 0.7 TBq, corresponding to about 1-10% of the total xenon inventory for a 10 kt explosion, depending on fractionation and release scenario. The observed ratios could not be used to obtain any information regarding the fissile material that was used in the test. Copyright © 2013 The Authors. Published by Elsevier Ltd. All rights reserved.

  8. Mars Sample Return mission: Two alternate scenarios

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Two scenarios for accomplishing a Mars Sample Return mission are presented herein. Mission A is a low-cost, low-mass scenario, while Mission B is a high-technology, high-science alternative. Mission A begins with the launch of one Titan IV rocket with a Centaur G' upper stage. The Centaur performs the trans-Mars injection burn and is then released. The payload consists of two lander packages and the Orbital Transfer Vehicle, which is responsible for supporting the landers during launch and interplanetary cruise. After descending to the surface, the landers deploy small, local rovers to collect samples. Mission B starts with 4 Titan IV launches, used to place the parts of the Planetary Transfer Vehicle (PTV) into orbit. The fourth launch payload is able to assemble the entire vehicle by simple docking routines. Once complete, the PTV begins a low thrust trajectory out from low Earth orbit, through interplanetary space, and into low Martian orbit. It deploys a communication satellite into a 1/2 sol orbit and then releases the lander package at 500 km altitude. The lander package contains the lander, the Mars Ascent Vehicle (MAV), two lighter-than-air rovers (called Aereons), and one conventional land rover. The entire package is contained within a biconic aeroshell. After release from the PTV, the lander package descends to the surface, where all three rovers are released to collect samples and map the terrain.

  9. A novel approach to multihazard modeling and simulation.

    PubMed

    Smith, Silas W; Portelli, Ian; Narzisi, Giuseppe; Nelson, Lewis S; Menges, Fabian; Rekow, E Dianne; Mincer, Joshua S; Mishra, Bhubaneswar; Goldfrank, Lewis R

    2009-06-01

    To develop and apply a novel modeling approach to support medical and public health disaster planning and response using a sarin release scenario in a metropolitan environment. An agent-based disaster simulation model was developed incorporating the principles of dose response, surge response, and psychosocial characteristics superimposed on topographically accurate geographic information system architecture. The modeling scenarios involved passive and active releases of sarin in multiple transportation hubs in a metropolitan city. Parameters evaluated included emergency medical services, hospital surge capacity (including implementation of disaster plan), and behavioral and psychosocial characteristics of the victims. In passive sarin release scenarios of 5 to 15 L, mortality increased nonlinearly from 0.13% to 8.69%, reaching 55.4% with active dispersion, reflecting higher initial doses. Cumulative mortality rates from releases in 1 to 3 major transportation hubs similarly increased nonlinearly as a function of dose and systemic stress. The increase in mortality rate was most pronounced in the 80% to 100% emergency department occupancy range, analogous to the previously observed queuing phenomenon. Effective implementation of hospital disaster plans decreased mortality and injury severity. Decreasing ambulance response time and increasing available responding units reduced mortality among potentially salvageable patients. Adverse psychosocial characteristics (excess worry and low compliance) increased demands on health care resources. Transfer to alternative urban sites was possible. An agent-based modeling approach provides a mechanism to assess complex individual and systemwide effects in rare events.
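The nonlinear dose-mortality behaviour reported above is typically captured with a toxic-load probit model. The sketch below shows only the generic form of such a model; the coefficients are illustrative placeholders, not the study's calibrated sarin parameters.

```python
import math

def probit_fatality_probability(C_mg_m3, t_min, a=-9.8, b=2.1, n=1.5):
    """Generic toxic-load probit model: Pr = a + b*ln(C**n * t), converted to
    a probability via the standard normal CDF (probit = 5 at P = 0.5).
    Coefficients a, b, n are ILLUSTRATIVE ONLY, not validated agent values."""
    load = (C_mg_m3 ** n) * t_min
    pr = a + b * math.log(load)
    z = pr - 5.0
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Tripling the concentration raises fatality probability far more than 3x,
# reproducing the kind of nonlinear dose response the simulation describes.
p_low = probit_fatality_probability(10.0, 10.0)
p_high = probit_fatality_probability(30.0, 10.0)
```

In an agent-based model, each agent's accumulated toxic load (dose) would feed a function of this shape to decide its health state at each time step.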

  10. Effects of working memory load and repeated scenario exposure on emergency braking performance.

    PubMed

    Engström, Johan; Aust, Mikael Ljung; Viström, Matias

    2010-10-01

    The objective of the present study was to examine the effect of working memory load on drivers' responses to a suddenly braking lead vehicle and whether this effect (if any) is moderated by repeated scenario exposure. Several experimental studies have found delayed braking responses to lead vehicle braking events during concurrent performance of nonvisual, working memory-loading tasks, such as hands-free phone conversation. However, the common use of repeated, and hence somewhat expected, braking events may undermine the generalizability of these results to naturalistic, unexpected, emergency braking scenarios. A critical lead vehicle braking scenario was implemented in a fixed-based simulator.The effects of working memory load and repeated scenario exposure on braking performance were examined. Brake response time was decomposed into accelerator pedal release time and accelerator-to-brake pedal movement time. Accelerator pedal release times were strongly reduced with repeated scenario exposure and were delayed by working memory load with a small but significant amount (178 ms).The two factors did not interact. There were no effects on accelerator-to-brake pedal movement time. The results suggest that effects of working memory load on response performance obtained from repeated critical lead vehicle braking scenarios may be validly generalized to real world unexpected events. The results have important implications for the interpretation of braking performance in experimental settings, in particular in the context of safety-related evaluation of in-vehicle information and communication technologies.

  11. Integrating software architectures for distributed simulations and simulation analysis communities.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldsby, Michael E.; Fellig, Daniel; Linebarger, John Michael

    2005-10-01

    The one-year Software Architecture LDRD (No. 79819) was a cross-site effort between Sandia California and Sandia New Mexico. The purpose of this research was to further develop and demonstrate integrating software architecture frameworks for distributed simulation and distributed collaboration in the homeland security domain. The integrated frameworks were initially developed through the Weapons of Mass Destruction Decision Analysis Center (WMD-DAC), sited at SNL/CA, and the National Infrastructure Simulation & Analysis Center (NISAC), sited at SNL/NM. The primary deliverable was a demonstration of both a federation of distributed simulations and a federation of distributed collaborative simulation analysis communities in the context of the same integrated scenario, which was the release of smallpox in San Diego, California. To our knowledge this was the first time such a combination of federations under a single scenario had ever been demonstrated. A secondary deliverable was the creation of the standalone GroupMeld™ collaboration client, which uses the GroupMeld™ synchronous collaboration framework. In addition, a small pilot experiment that used both integrating frameworks allowed a greater range of crisis management options to be performed and evaluated than would have been possible without the use of the frameworks.

  12. Toxic chemical considerations for tank farm releases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Van Keuren, J.C.; Davis, J.S., Westinghouse Hanford

    1996-08-01

    This topical report contains technical information used to determine the accident consequences of releases of toxic chemicals and gases for the Tank Farm Final Safety Analysis Report (FSAR). It does not provide results for specific accident scenarios, but does provide information for use in those calculations, including chemicals to be considered, chemical concentrations, chemical limits, and a method of summing the fractional contributions of each chemical. Tank farm composites evaluated were liquids and solids for double shell tanks, single shell tanks, all solids, all liquids, headspace gases, and 241-C-106 solids. Emergency response planning guidelines (ERPGs) were used as the limits. Where ERPGs were not available for the chemicals of interest, surrogate ERPGs were developed. Revision 2 includes updated sample data, an executive summary, and some editorial revisions.
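The "method of summing the fractional contributions of each chemical" is conventionally the sum-of-fractions rule: each chemical's concentration is divided by its ERPG limit, and the mixture exceeds the guideline when the sum of those fractions reaches 1. A minimal sketch; the chemical names and values below are hypothetical, not tank-farm composite data.

```python
def sum_of_fractions(concentrations, limits):
    """Sum-of-fractions screening: sum(C_i / ERPG_i) over all chemicals.
    A result >= 1 indicates the combined release exceeds the guideline level
    even if no single chemical exceeds its own limit."""
    return sum(concentrations[c] / limits[c] for c in concentrations)

# Hypothetical receptor concentrations and ERPG-2 limits (mg/m^3),
# for illustration only.
conc = {"NH3": 70.0, "NOx": 6.0}
erpg2 = {"NH3": 140.0, "NOx": 12.0}
fraction = sum_of_fractions(conc, erpg2)  # 0.5 + 0.5 = 1.0 -> at the limit
```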

  13. Centralized versus distributed reservoirs: an investigation of their implications on environmental flows and sustainable water resources management

    NASA Astrophysics Data System (ADS)

    Eriyagama, Nishadi; Smakhtin, Vladimir; Udamulla, Lakshika

    2018-06-01

    Storage of surface water is widely regarded as a form of insurance against rainfall variability. However, the creation of surface storage often endangers the functions of natural ecosystems, and, in turn, the ecosystem services that benefit humans. The issues of the optimal size, placement and number of reservoirs in a river basin - which maximizes sustainable benefits from storage - remain subjects for debate. This study examines the above issues through the analysis of a range of reservoir configurations in the Malwatu Oya river basin in the dry zone of Sri Lanka. The study produced multiple surface storage development pathways for the basin under different scenarios of environmental flow (EF) releases and reservoir network configurations. The EF scenarios ranged from zero to very healthy releases. It is shown that if the middle ground between the two extreme EF scenarios is considered, the theoretical maximum safe yield from surface storage is about 65-70% of the mean annual runoff (MAR) of the basin. It is also identified that although distribution of reservoirs in the river network reduces the cumulative yield from the basin, this cumulative yield is maximized if the ratio among the storage capacities placed in each sub-drainage basin is equivalent to the ratio among their MAR. The study suggests a framework to identify drainage regions having higher surface storage potential, to plan for the right distribution of storage capacity within a river basin, and to plan for EF allocations.

  14. Testing high resolution numerical models for analysis of contaminant storage and release from low permeability zones

    NASA Astrophysics Data System (ADS)

    Chapman, Steven W.; Parker, Beth L.; Sale, Tom C.; Doner, Lee Ann

    2012-08-01

    It is now widely recognized that contaminant release from low permeability zones can sustain plumes long after primary sources are depleted, particularly for chlorinated solvents where regulatory limits are orders of magnitude below source concentrations. This has led to efforts to appropriately characterize sites and apply models for prediction incorporating these effects. A primary challenge is that diffusion processes are controlled by small-scale concentration gradients, and capturing mass distribution in low permeability zones requires much higher resolution than commonly practiced. This paper explores the validity of using numerical models (HydroGeoSphere, FEFLOW, MODFLOW/MT3DMS) in high resolution mode to simulate scenarios involving diffusion into and out of low permeability zones: 1) a laboratory tank study involving a continuous sand body with suspended clay layers, which was 'loaded' with bromide and fluorescein (for visualization) tracers followed by clean water flushing, and 2) the two-layer analytical solution of Sale et al. (2008) involving a relatively simple scenario with an aquifer and underlying low permeability layer. All three models are shown to provide close agreement when adequate spatial and temporal discretization is applied to represent problem geometry, resolve flow fields, capture advective transport in the sands and diffusive transfer with low permeability layers, and minimize numerical dispersion. The challenge for application at field sites then becomes appropriate site characterization to inform the models: capturing the style of the low permeability zone geometry and incorporating reasonable hydrogeologic parameters and estimates of source history, for scenario testing and more accurate prediction of plume response, leading to better site decision making.

  15. A method for simulating the release of natural gas from the rupture of high-pressure pipelines in any terrain.

    PubMed

    Deng, Yajun; Hu, Hongbing; Yu, Bo; Sun, Dongliang; Hou, Lei; Liang, Yongtu

    2018-01-15

    The rupture of a high-pressure natural gas pipeline can pose a serious threat to human life and the environment. In this research, a method has been proposed to simulate the release of natural gas from the rupture of high-pressure pipelines in any terrain. The process of gas release from the rupture of a high-pressure pipeline is divided into three stages, namely the discharge, jet, and dispersion stages. Firstly, a discharge model is established to calculate the release rate at the orifice. Secondly, an improved jet model is proposed to obtain the parameters of the pseudo source. Thirdly, a fast-modeling method applicable to any terrain is introduced. Finally, based upon these three steps, a dispersion model, which can take any terrain into account, is established. Then, the dispersion scenarios of released gas in four different terrains are studied. Moreover, the effects of pipeline pressure, pipeline diameter, wind speed and concentration of hydrogen sulfide on the dispersion scenario in real terrain are systematically analyzed. The results provide significant guidance for risk assessment and contingency planning for a ruptured natural gas pipeline. Copyright © 2017. Published by Elsevier B.V.
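The discharge stage of a full-bore rupture is commonly bounded with the choked (sonic) orifice-flow formula for an ideal gas, which gives a first estimate of the initial release rate before the pipeline depressurizes. A minimal sketch, assuming methane properties and an assumed discharge coefficient; the paper's actual discharge model may differ.

```python
import math

def choked_mass_flow(P0, T0, d_orifice, gamma=1.32, M=0.016, Cd=0.62):
    """Sonic (choked) discharge rate through an orifice [kg/s].
    P0: stagnation pressure [Pa]; T0: stagnation temperature [K];
    d_orifice: orifice diameter [m]; gamma, M default to methane;
    Cd is an assumed discharge coefficient (illustrative)."""
    R = 8.314                                 # universal gas constant [J/(mol K)]
    A = math.pi * (d_orifice / 2.0) ** 2      # orifice area [m^2]
    rho0 = P0 * M / (R * T0)                  # ideal-gas stagnation density
    term = (2.0 / (gamma + 1.0)) ** ((gamma + 1.0) / (gamma - 1.0))
    return Cd * A * math.sqrt(gamma * rho0 * P0 * term)

# Example: 8 MPa pipeline, 15 C gas, 0.3 m break -- order-of-magnitude estimate.
m_dot = choked_mass_flow(P0=8e6, T0=288.0, d_orifice=0.3)
```

Note the rate scales linearly with stagnation pressure and with orifice area, which is why pipeline pressure and diameter dominate the dispersion results discussed in the abstract.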

  16. Decision analysis of emergency ventilation and evacuation strategies against suddenly released contaminant indoors by considering the uncertainty of source locations.

    PubMed

    Cai, Hao; Long, Weiding; Li, Xianting; Kong, Lingjuan; Xiong, Shuang

    2010-06-15

    When hazardous contaminants are suddenly released indoors, prompt and proper emergency responses are critical to protect occupants. This paper aims to provide a framework for determining the optimal combination of ventilation and evacuation strategies by considering the uncertainty of source locations. The certainty of source locations is classified as complete certainty, incomplete certainty, and complete uncertainty to cover all possible situations. According to this classification, three types of decision analysis models are presented. A new concept, the efficiency factor of contaminant source (EFCS), is incorporated in these models to evaluate the payoffs of the ventilation and evacuation strategies. A decision-making procedure based on these models is proposed and demonstrated by numerical studies of one hundred scenarios with ten ventilation modes, two evacuation modes, and five source locations. The results show that the models can be useful for directing the decision analysis of both ventilation and evacuation strategies. In addition, the certainty of the source locations has an important effect on the outcomes of the decision-making. Copyright 2010 Elsevier B.V. All rights reserved.
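The three certainty classes map naturally onto classical decision rules: a known source location selects the best strategy for that location, probabilistic knowledge selects the best expected payoff, and complete uncertainty suggests a worst-case (maximin) choice. A minimal sketch with an illustrative payoff matrix; the paper's EFCS-based payoffs and its 10 x 2 x 5 scenario set would replace these numbers.

```python
import numpy as np

# Payoff (e.g., an EFCS-based score, higher is better) of each combined
# ventilation/evacuation strategy under each possible source location.
# Values are illustrative only.
payoff = np.array([
    [0.9, 0.2, 0.4],   # strategy A
    [0.6, 0.7, 0.5],   # strategy B
    [0.3, 0.4, 0.8],   # strategy C
])

# Complete certainty: the source location (column) is known.
best_given_source_1 = int(np.argmax(payoff[:, 1]))   # -> strategy B (index 1)

# Incomplete certainty: a probability over locations -> maximize expected payoff.
p_loc = np.array([0.5, 0.3, 0.2])
best_expected = int(np.argmax(payoff @ p_loc))

# Complete uncertainty: maximin, i.e., the best worst-case strategy.
best_maximin = int(np.argmax(payoff.min(axis=1)))    # -> strategy B (index 1)
```

The design choice here is that each certainty class changes only the selection rule, not the payoff model, which matches the paper's use of one EFCS payoff concept across all three decision models.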

  17. Influence of condensation and latent heat release upon barotropic and baroclinic instabilities of vortices in a rotating shallow water f-plane model

    NASA Astrophysics Data System (ADS)

    Rostami, Masoud; Zeitlin, Vladimir

    2017-01-01

    Analysis of the influence of condensation and related latent heat release upon developing barotropic and baroclinic instabilities of large-scale low Rossby-number shielded vortices on the f-plane is performed within the moist-convective rotating shallow water model, in its barotropic (one-layer) and baroclinic (two-layer) versions. Numerical simulations with a high-resolution well-balanced finite-volume code, using a relaxation parameterisation for condensation, are made. The evolution of the instability is inter-compared across four different environments, with humidity (i) behaving as a passive scalar, (ii) subject to condensation beyond a saturation threshold, and (iii) subject to condensation and evaporation, with three different parameterisations of the latter. The simulations are initialised with unstable modes determined from the detailed linear stability analysis in the "dry" version of the model. In a configuration corresponding to low-level mid-latitude atmospheric vortices, it is shown that the known scenario of evolution of barotropically unstable vortices, consisting in formation of a pair of dipoles (dipolar breakdown), is substantially modified by condensation and related moist convection, especially in the presence of surface evaporation. No enhancement of the instability due to precipitation was detected in this case. Cyclone-anticyclone asymmetry with respect to sensitivity to the moist effects is evidenced. It is shown that inertia-gravity wave emission during the vortex evolution is enhanced by the moist effects. In the baroclinic configuration corresponding to idealised cut-off lows in the atmosphere, it is shown that the azimuthal structure of the leading unstable mode is sensitive to the details of stratification. Scenarios of evolution are completely different for different azimuthal structures, one leading to dipolar breakdown, and another to tripole formation.
The effects of moisture considerably enhance the perturbations in the lower layer, especially in the tripole formation scenario.

  18. Transport and Dispersion Model Predictions of Elevated Source Tracer Experiments in the Copenhagen Area: Comparisons of Hazard Prediction and Assessment Capability (HPAC) and National Atmospheric Release Advisory Center (NARAC) Emergency Response Model Predictions

    DTIC Science & Technology

    2006-07-01

    HPAC (Blue --) and NARAC (Red -) predictions for two elevated releases (MvM 3 and MvM 15) considered in the model-to-model study [2]. MvM 3 was a gas release (SF6) ... carried out under stable conditions with a boundary layer height of 100 m and a release height of 80 m, while MvM 15 was a particle release carried out ... release scenarios: MvM 3 at 30 and 60 minutes and MvM 15 at 120 and 180 minutes. Each release shows significant NARAC underpredictions with ...

  19. Assessment of environmental public exposure from a hypothetical nuclear accident for Unit-1 Bushehr nuclear power plant.

    PubMed

    Sohrabi, M; Ghasemi, M; Amrollahi, R; Khamooshi, C; Parsouzi, Z

    2013-05-01

    Unit-1 of the Bushehr nuclear power plant (BNPP-1) is a VVER-type reactor with 1,000 MWe power constructed near Bushehr city on the coast of the Persian Gulf, Iran. The reactor has recently become operational at near its full power. The radiological impact of nuclear power plant (NPP) accidents is of public concern, and the assessment of the radiological consequences of any hypothetical nuclear accident on public exposure is vital. The hypothetical accident scenario considered in this paper is a design-basis accident, that is, a primary coolant leakage to the secondary circuit. This scenario was selected in order to compare and verify the results obtained in the present paper with those reported in the Final Safety Analysis Report (FSAR 2007) of the BNPP-1 and to develop a well-proven methodology that can be used to study other and more severe hypothetical accident scenarios for this reactor. In the present study, version 2.01 of the PC COSYMA code was applied. In the early phase of the accidental releases, effective doses (from external and internal exposures) as well as individual and collective doses (due to the late phase of accidental releases) were evaluated. The surrounding area of the BNPP-1 within a radius of 80 km was subdivided into seven concentric rings and 16 sectors, and the distribution of population and agricultural products was calculated for this grid. The results show that during the first year following the modeled hypothetical accident, the effective doses do not exceed the limit of 5 mSv for the considered distances from the BNPP-1. The results obtained in this study are in good agreement with those in the FSAR-2007 report.
This agreement was obtained despite the many inherent uncertainties and variables in the two modeling procedures applied, and it demonstrates that the methodology applied here can also be used to model other severe hypothetical accident scenarios of the BNPP-1, such as a small or large break in the reactor coolant system as well as beyond-design-basis accidents. Such scenarios are planned to be studied in the near future for this reactor.

  20. Projecting the release of carbon from permafrost soils using a perturbed physics ensemble

    NASA Astrophysics Data System (ADS)

    MacDougall, A. H.; Knutti, R.

    2015-12-01

    The soils of the Northern Hemisphere permafrost region are estimated to contain 1100 to 1500 Pg of carbon (Pg C). A substantial fraction of this carbon has been frozen and therefore protected from microbial decay for millennia. As anthropogenic climate warming progresses, much of this permafrost is expected to thaw. Here we conduct perturbed physics experiments on a climate model of intermediate complexity, with an improved permafrost carbon module, to estimate with formal uncertainty bounds the release of carbon from permafrost soils by year 2100 and 2300. We estimate that by 2100 the permafrost region may release between 56 (13 to 118) Pg C under Representative Concentration Pathway (RCP) 2.6 and 102 (27 to 199) Pg C under RCP 8.5, with substantially more to be released under each scenario by year 2300. A subset of 25 model variants were projected 8000 years into the future under continued RCP 4.5 and 8.5 forcing. Under the high forcing scenario the permafrost carbon pool decays away over several thousand years. Under the moderate forcing scenario a remnant near-surface permafrost region persists in the high Arctic, which develops a large permafrost carbon pool, leading to global recovery of the pool beginning in the mid third millennium of the common era (CE). Overall our simulations suggest that the permafrost carbon cycle feedback to climate change will make a significant but not cataclysmic contribution to climate change over the next centuries and millennia.

  1. Projecting the release of carbon from permafrost soils using a perturbed physics ensemble modelling approach

    NASA Astrophysics Data System (ADS)

    MacDougall, Andrew; Knutti, Reto

    2016-04-01

    The soils of the northern hemisphere permafrost region are estimated to contain 1100 to 1500 Pg of carbon. A substantial fraction of this carbon has been frozen and therefore protected from microbial decay for millennia. As anthropogenic climate warming progresses, permafrost soils are expected to thaw. Here we conduct perturbed physics experiments on a climate model of intermediate complexity, with an improved permafrost carbon module, to estimate with formal uncertainty bounds the release of carbon from permafrost soils by year 2100 and 2300. We estimate that by year 2100 the permafrost region may release between 56 (13 to 118) Pg C under Representative Concentration Pathway (RCP) 2.6 and 102 (27 to 199) Pg C under RCP 8.5, with substantially more to be released under each scenario by 2300. A subset of 25 model variants is projected 8000 years into the future under continued RCP 4.5 and 8.5 forcing. Under the high forcing scenario the permafrost carbon pool decays away over several thousand years. Under the moderate forcing scenario a remnant near-surface permafrost region persists in the High Arctic, which develops a large permafrost carbon pool, leading to a global recovery of the pool beginning in the mid third millennium of the common era. Overall our simulations suggest that the permafrost carbon cycle feedback to climate change will make a significant but not cataclysmic contribution to climate change over the next centuries and millennia.

  2. Measuring Nanomaterial Release from Carbon Nanotube Composites: Review of the State of the Science

    NASA Astrophysics Data System (ADS)

    Harper, Stacey; Wohlleben, Wendel; Doa, Maria; Nowack, Bernd; Clancy, Shaun; Canady, Richard; Maynard, Andrew

    2015-05-01

    Hazard studies of “as-produced” nanomaterials are increasingly available, yet a critical gap exists in exposure science that may impede safe development of nanomaterials. The gap is that we do not understand what is actually released, because nanomaterials can change when released in ways that are not understood. We also generally do not have methods capable of quantitatively measuring what is released to support dose assessment. This review presents a case study of multi-walled carbon nanotubes (MWCNTs) for the measurement challenge to bridge this gap. As the use and value of MWCNTs increases, methods to measure what is released in ways relevant to risk evaluation are critically needed if products containing these materials are to be economically, environmentally, and socially sustainable. This review draws on the input of over 50 experts engaged in a program of workshops and technical report writing to address the release of MWCNTs from nanocomposite materials across their life cycle. The expert analyses reveal that new and sophisticated methods are required to measure and assess MWCNT exposures for realistic exposure scenarios. Furthermore, method requirements vary with the materials and conditions of release across life cycle stages of products. While the review shows that the likelihood of significant release of MWCNTs appears to be low for many stages of the composite life cycle, measurement methods are needed so that exposures from MWCNT composites are understood and managed. In addition, there is an immediate need to refocus attention from the study of “as-produced” nanomaterials to coordinated research on actual release scenarios.

  3. Wheels-Off Time Uncertainty Impact on Benefits of Early Call for Release Scheduling

    NASA Technical Reports Server (NTRS)

    Palopo, Kee; Chatterji, Gano B.; Almog, Noam

    2017-01-01

    Arrival traffic scenarios with 808 flights from 173 airports to Houston George Bush International airport are simulated to determine if Call For Release flights can receive a benefit in terms of less delay over other flights by scheduling prior to gate pushback (look-ahead in time) as opposed to at gate pushback. Call For Release flights are departures that require approval from an Air Route Traffic Control Center prior to release. Realism is brought to the study by including gate departure delay and taxi-out delay uncertainties for the 77 major U.S. airports. Gate departure delay uncertainty is assumed to increase as a function of look-ahead time. Results show that Call For Release flights from an airport within the freeze horizon (a region surrounding the arrival airport) can gain an advantage over other flights to a capacity-constrained airport by scheduling prior to gate pushback, provided the wheels-off time uncertainty with respect to schedule is controlled to a small value, such as within a three-minute window. Another finding of the study is that system delay, measured as the sum of arrival delays, is smaller when flights are scheduled in the order of arrival rather than in the order of departure. Because flights from airports within the freeze horizon are scheduled in the order of departure, an increase in the number of internal airports with a larger freeze horizon increases system delay. Delay was found to increase by 126% (from 13.8 hours to 31.2 hours) as the freeze horizon was increased from 30 minutes to 2 hours in the baseline scenario.
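    The paper's central comparison, scheduling flights in order of arrival versus in order of departure at a capacity-constrained airport, can be illustrated with a toy first-come-first-served slot scheduler. This is a minimal sketch with invented flight data, not the NASA simulation; the slot spacing and times are arbitrary:

```python
def schedule(flights, order_key, slot=2.0):
    """Assign arrival slots at a capacity-constrained airport: each flight,
    processed in the given order, gets the earliest slot >= its unimpeded ETA."""
    next_free = 0.0
    total_delay = 0.0
    for f in sorted(flights, key=order_key):
        sta = max(f["eta"], next_free)   # scheduled time of arrival
        total_delay += sta - f["eta"]
        next_free = sta + slot           # runway occupied until next slot
    return total_delay

flights = [
    {"id": "A", "dep": 0.0, "eta": 10.0},
    {"id": "B", "dep": 1.0, "eta": 16.0},  # departs early, long flight
    {"id": "C", "dep": 2.0, "eta": 11.0},  # departs later, arrives sooner
]
delay_by_arrival = schedule(flights, order_key=lambda f: f["eta"])
delay_by_departure = schedule(flights, order_key=lambda f: f["dep"])
```

    With these numbers, departure-order scheduling makes flight C wait behind flight B's reserved slot even though C would naturally arrive first, reproducing the qualitative finding that arrival-order scheduling yields less total system delay.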

  4. Variation that can be expected when using particle tracking models in connectivity studies

    NASA Astrophysics Data System (ADS)

    Hufnagl, Marc; Payne, Mark; Lacroix, Geneviève; Bolle, Loes J.; Daewel, Ute; Dickey-Collas, Mark; Gerkema, Theo; Huret, Martin; Janssen, Frank; Kreus, Markus; Pätsch, Johannes; Pohlmann, Thomas; Ruardij, Piet; Schrum, Corinna; Skogen, Morten D.; Tiessen, Meinard C. H.; Petitgas, Pierre; van Beek, Jan K. L.; van der Veer, Henk W.; Callies, Ulrich

    2017-09-01

    Hydrodynamic ocean circulation models and Lagrangian particle tracking models are valuable tools in coastal ecology, e.g. for identifying the connectivity between offshore spawning and coastal nursery areas of commercially important fish, for risk assessment, and for defining or evaluating marine protected areas. Most studies are based on only one model and do not provide levels of uncertainty. Here this uncertainty was addressed by applying a suite of 11 North Sea models to test what variability can be expected concerning connectivity. Different notional test cases were calculated for three important and well-studied North Sea fish species: herring (Clupea harengus) and the flatfishes sole (Solea solea) and plaice (Pleuronectes platessa). For sole and plaice we determined which fraction of particles released in the respective spawning areas would reach a coastal marine protected area; for herring we determined the fraction located in a wind park after a predefined time span. Because temperature is an increasing focus, especially in biological and global change studies, inter-model variability in the temperatures experienced by the virtual particles was also determined. The main focus was on the transport variability originating from the physical models, so biological behavior was not included. Depending on the scenario, median experienced temperatures differed by 3 °C between years, and the range between the different models in one year was comparable to this temperature range observed between modelled years. Connectivity between flatfish spawning areas and the coastal protected area was highly dependent on the release location and spawning time: no particles released in the English Channel in the sole scenario reached the protected area, while up to 20% of the particles released in the plaice scenario did. Interannual trends in transport directions and connectivity rates were comparable between models, but absolute values displayed high variation. Most models showed systematic biases during all years in comparison to the ensemble median, indicating that interannual variation was generally represented but absolute values varied. In conclusion, variability between models is generally high; management decisions or scientific analyses using absolute values from a single model might be biased, and conclusions drawn from such studies need to be treated with caution. We further conclude that more real-world validation data for particle modelling are required.
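    The connectivity metric used in these experiments, the fraction of released particles ending up inside a target area, can be sketched with a minimal random-walk tracker. A constant current plus Gaussian noise stands in for a full ocean circulation model; all numbers are illustrative:

```python
import random

def track_particles(n_particles, n_steps, u=0.05, v=0.0, diffusion=0.02, seed=42):
    """Advect particles with a constant current (u, v) per step, plus a
    random-walk term standing in for unresolved turbulent diffusion."""
    rng = random.Random(seed)
    particles = [(0.0, 0.0)] * n_particles   # all released at the origin
    for _ in range(n_steps):
        particles = [
            (x + u + rng.gauss(0.0, diffusion),
             y + v + rng.gauss(0.0, diffusion))
            for x, y in particles
        ]
    return particles

def connectivity(particles, box):
    """Fraction of particles ending inside a rectangular 'protected area'."""
    x0, x1, y0, y1 = box
    inside = sum(1 for x, y in particles if x0 <= x <= x1 and y0 <= y <= y1)
    return inside / len(particles)

end_positions = track_particles(500, 100)
frac = connectivity(end_positions, box=(4.0, 6.0, -1.0, 1.0))
```

    Running several such trackers driven by different hydrodynamic fields, as the 11-model ensemble does, yields a distribution of `frac` values rather than a single connectivity estimate.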

  5. Release of Si from Silicon, a Ferrosilicon (FeSi) Alloy and a Synthetic Silicate Mineral in Simulated Biological Media

    PubMed Central

    Herting, Gunilla; Jiang, Tao; Sjöstedt, Carin; Odnevall Wallinder, Inger

    2014-01-01

    Unique quantitative bioaccessibility data have been generated, and the influence of surface/material and test-media characteristics on the elemental release process was assessed for silicon-containing materials in specific synthetic body fluids over set time periods at a fixed loading. The metal release test protocol, elaborated by the KTH team, has previously been used for classification, ranking, and screening of different alloys and metals. Time-resolved elemental release of Si, Fe and Al from particles, sized less than 50 µm, of two grades of metallurgical silicon (high purity silicon, SiHG; low purity silicon, SiLG), an alloy (ferrosilicon, FeSi) and a mineral (aluminium silicate, AlSi) has been investigated in synthetic body fluids of varying pH, composition and complexation capacity, as simple models of, for example, dermal contact and digestion scenarios. Individual methods for analysis of released Si (as silicic acid, Si(OH)4) in synthetic body fluids using GF-AAS were developed for each fluid, including optimisation of solution pH and graphite furnace parameters. The release of Si from the two metallurgical silicon grades was strongly dependent on both pH and media composition, with the highest release in pH-neutral media. No similar effect was observed for the FeSi alloy or the aluminium silicate mineral. Surface adsorption of phosphate and lactic acid was believed to hinder the release of Si, whereas the presence of citric acid enhanced the release as a result of surface complexation. An increased presence of Al and Fe in the material (low purity metalloid, alloy or mineral) resulted in a reduced release of Si in pH-neutral media. The release of Si was enhanced for all materials with Al at their outermost surface in acetic media. PMID:25225879

  6. Socio-economic analysis of the risk management of hexabromocyclododecane (HBCD) in China in the context of the Stockholm Convention.

    PubMed

    Zhu, Jing; Liu, Jian-Guo; Hu, Jian-Xin; Yi, Shan

    2016-05-01

    Socio-economic analysis (SEA) plays an important role in decision-making on risk management actions for certain chemicals under Multilateral Environmental Agreements (MEAs) in developing countries. This paper presents the first holistic and quantitative SEA case study of this kind, developing a country-specific SEA framework and methodology and applying them to the phase-out of HBCD in China under the Stockholm Convention on Persistent Organic Pollutants (POPs). The study indicates that, under possible phase-out scenarios of 10 years and 5 years, the economic costs of the HBCD phase-out in China would be between 9.032 and 19.021 billion RMB. Although the total economic costs seem significant, they would have only a marginal impact on the house building industry, with a likely cost increase of about 0.07‰-0.14‰. Meanwhile, the HBCD phase-out may yield significant environmental and health benefits, including about 23-29 tons of HBCD releases to the environment prevented, 1.142-1.469 million tons of potentially HBCD-containing hazardous wastes avoided, a significant reduction (from 58% up to almost 100%) in local environmental concentrations of HBCD, about 0.0996-0.128 million workers no longer at risk, and at least 3.067-4.033 billion RMB in health care savings. While the scenario of phasing out HBCD over 10 years would be less costly than that over 5 years, the latter scenario offers much greater environmental and health benefits for China. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. Anticipating environmental and environmental-health implications of extreme storms: ARkStorm scenario

    USGS Publications Warehouse

    Plumlee, Geoffrey S.; Alpers, Charles N.; Morman, Suzette A.; San Juan, Carma A.

    2016-01-01

    The ARkStorm Scenario predicts that a prolonged winter storm event across California would cause extreme precipitation, flooding, winds, physical damages, and economic impacts. This study uses a literature review and geographic information system-based analysis of national and state databases to infer how and where ARkStorm could cause environmental damages, release contamination from diverse natural and anthropogenic sources, affect ecosystem and human health, and cause economic impacts from environmental-remediation, liability, and health-care costs. Examples of plausible ARkStorm environmental and health concerns include complex mixtures of contaminants such as petroleum, mercury, asbestos, persistent organic pollutants, molds, and pathogens; adverse physical and contamination impacts on riverine and coastal marine ecosystems; and increased incidences of mold-related health concerns, some vector-borne diseases, and valley fever. Coastal cities, the San Francisco Bay area, the Sacramento-San Joaquin River Delta, parts of the Central Valley, and some mountainous areas would likely be most affected. This type of screening analysis, coupled with follow-up local assessments, can help stakeholders in California and disaster-prone areas elsewhere better plan for, mitigate, and respond to future environmental disasters.

  8. An integrated Riverine Environmental Flow Decision Support System (REFDSS) to evaluate the ecological effects of alternative flow scenarios on river ecosystems

    USGS Publications Warehouse

    Maloney, Kelly O.; Talbert, Colin B.; Cole, Jeffrey C.; Galbraith, Heather S.; Blakeslee, Carrie J.; Hanson, Leanne; Holmquist-Johnson, Christopher L.

    2015-01-01

    In regulated rivers, managers must evaluate competing flow release scenarios that attempt to balance human and natural needs. Meeting these natural flow needs is complex due to the myriad of interacting physical and hydrological factors that affect ecosystems, and tools that synthesize the voluminous scientific data and models on these factors facilitate management of these systems. Here, we present the Riverine Environmental Flow Decision Support System (REFDSS), a tool that enables evaluation of the effects of competing flow scenarios and other variables on instream habitat. We developed a REFDSS for the Upper Delaware River, USA, a system regulated by three headwater reservoirs. This version of the REFDSS can integrate any set of spatially explicit data; it synthesizes modeled discharge for three competing management scenarios, flow-specific 2-D hydrodynamic modeled estimates of local hydraulic conditions (e.g., depth, velocity, shear stress) at a fine pixel scale (1 m2), and habitat suitability criteria (HSC) for a variety of taxa. It contains all individual model outputs, computationally integrates these data, and outputs the amount of potentially available habitat for a suite of species of interest under each flow release scenario. Users have the flexibility to change the time period of interest and to vary the HSC. The REFDSS was developed to enable side-by-side evaluation of different flow management scenarios and their effects on potential habitat availability, allowing managers to make informed decisions on the best flow scenarios. An exercise comparing two alternative flow scenarios to a baseline scenario for several key species is presented; the Upper Delaware REFDSS was robust to minor changes in HSC (±10%). The general REFDSS platform was developed as a user-friendly Windows desktop application, designed to accommodate other potential parameters of interest (e.g., temperature) and to be transferable to other riverine systems.
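    The core REFDSS computation, combining pixel-scale hydraulic model output with habitat suitability criteria (HSC) into available habitat per flow scenario, can be sketched as follows. Piecewise-linear HSC and the product rule for composite suitability are common conventions in habitat modelling, not necessarily the exact REFDSS formulation, and all curves and values below are invented:

```python
def suitability(value, curve):
    """Piecewise-linear habitat suitability criterion (HSC):
    curve is a sorted list of (value, suitability) breakpoints."""
    if value <= curve[0][0]:
        return curve[0][1]
    for (x0, s0), (x1, s1) in zip(curve, curve[1:]):
        if value <= x1:
            return s0 + (s1 - s0) * (value - x0) / (x1 - x0)
    return curve[-1][1]

def usable_area(depths, velocities, depth_hsc, vel_hsc, pixel_area=1.0):
    """Composite suitability per pixel (product rule) summed over the grid
    into a weighted usable area."""
    return sum(
        suitability(d, depth_hsc) * suitability(v, vel_hsc) * pixel_area
        for d, v in zip(depths, velocities)
    )

# invented HSC breakpoints and per-pixel hydraulic output for two scenarios
depth_hsc = [(0.0, 0.0), (0.5, 1.0), (2.0, 1.0), (3.0, 0.0)]
vel_hsc = [(0.0, 1.0), (1.0, 1.0), (2.5, 0.0)]
low_flow = usable_area([0.3, 0.8, 1.5], [0.2, 0.5, 0.9], depth_hsc, vel_hsc)
high_flow = usable_area([0.9, 1.6, 2.6], [1.4, 1.8, 2.2], depth_hsc, vel_hsc)
```

    Comparing `low_flow` and `high_flow` side by side is the kind of scenario evaluation the tool automates across full 1 m2 grids and many taxa.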

  9. How "lucky" we are that the Fukushima disaster occurred in early spring: predictions on the contamination levels from various fission products released from the accident and updates on the risk assessment for solid and thyroid cancers.

    PubMed

    Evangeliou, Nikolaos; Balkanski, Yves; Cozic, Anne; Møller, Anders Pape

    2014-12-01

    The present paper studies how a random event (an earthquake) and the subsequent disaster in Japan affect the transport and deposition of fallout and the resulting health consequences. In addition to the original accident in March 2011, three further scenarios are assessed, assuming that the same releases took place in winter 2010, summer 2011 and autumn 2011, in order to cover the full range of annual seasonality. This is also the first study in which a large number of fission products released from the accident are used to assess health risks with the maximum possible efficiency. Xenon-133 and (137)Cs are estimated directly within the model, whereas 15 other radionuclides are calculated indirectly using reported isotopic ratios. As much as 85% of the released (137)Cs would have been deposited in continental regions worldwide had the accident occurred in winter 2010, versus 22% in spring 2011 (when it actually happened), 55% in summer 2011 and 48% in autumn 2011. Solid cancer incidents and mortalities from Fukushima are estimated at between 160 and 880 and from 110 to 640, close to previous estimates. Adding thyroid cancers raises the totals to 230-850 for incidents and 120-650 for mortalities. Fatalities due to worker exposure and mandatory evacuation have been reported to be around 610, increasing total estimated mortalities to 730-1260. These estimates are 2.8 times higher than those previously reported for radiocaesium and (131)I, and 16% higher than those based on radiocaesium only. Total expected fatalities from Fukushima are 32% lower than in the winter scenario, 5% lower than in the summer scenario and 30% lower than in the autumn scenario. Nevertheless, cancer fatalities are expected to be less than 5% of those from the tsunami (~20,000). Copyright © 2014 Elsevier B.V. All rights reserved.
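    The indirect source-term step described here, scaling unmeasured radionuclides from the modelled (137)Cs release via reported activity ratios, reduces to a simple calculation. Both the (137)Cs figure and the ratios below are placeholders for illustration, not the values used in the study:

```python
# Indirect source-term estimation: scale unmeasured radionuclides from the
# modelled (137)Cs release using activity ratios at the time of release.
cs137_release_pbq = 36.0          # placeholder release estimate, PBq
isotopic_ratios = {               # placeholder activity ratios to (137)Cs
    "Cs-134": 1.0,
    "I-131": 11.0,
    "Te-132": 24.0,
}
inferred_pbq = {nuclide: ratio * cs137_release_pbq
                for nuclide, ratio in isotopic_ratios.items()}
```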

  10. Optimizing Environmental Flow Operation Rules based on Explicit IHA Constraints

    NASA Astrophysics Data System (ADS)

    Dongnan, L.; Wan, W.; Zhao, J.

    2017-12-01

    Multi-objective operation of reservoirs is increasingly required to consider environmental flow to support ecosystem health. Indicators of Hydrologic Alteration (IHA) are widely used to describe environmental flow regimes, but few studies have explicitly formulated them into optimization models, so it is difficult to use them to direct reservoir releases. In an attempt to weigh the benefit of environmental flow against economic gains, a two-objective reservoir optimization model is developed in which all 33 hydrologic parameters of the IHA are explicitly formulated as constraints. The economic benefit is defined by Hydropower Production (HP), while the environmental-flow benefit is transformed into an Eco-Index (EI) that combines 5 of the 33 IHA parameters chosen by the principal component analysis method. Five scenarios (A to E) with different constraints are tested and solved by nonlinear programming. The case study of the Jing Hong reservoir, located in the upstream Mekong basin, China, shows: 1. A Pareto frontier is formed by maximizing only the HP objective in scenario A and only the EI objective in scenario B. 2. Scenario D, using IHA parameters as constraints, obtains optimal benefits both economically and ecologically. 3. A sensitive weight coefficient is found in scenario E, but the trade-offs between the HP and EI objectives are not within the Pareto frontier. 4. When the fraction of utilizable reservoir capacity reaches 0.8, both HP and EI attain acceptable values. Finally, to make this model more convenient for everyday practice, a simplified operation rule curve is extracted.
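    The structure of such a two-objective problem, a hydropower proxy traded off against an eco-index under an explicit minimum-flow constraint, can be sketched with a toy grid search. The study itself solves a nonlinear program over all 33 IHA parameters; here the inflows, capacity, minimum environmental flow, eco-index definition and weights are all invented:

```python
def evaluate(releases, inflows, natural, capacity=100.0, s0=50.0):
    """Toy two-objective evaluation: a hydropower proxy (release times a
    storage-head proxy) vs an eco-index penalising deviation from the
    natural flow regime."""
    s, hp, dev = s0, 0.0, 0.0
    for r, q, nat in zip(releases, inflows, natural):
        s = min(capacity, s + q - r)        # mass balance with spill at capacity
        if s < 0:                           # infeasible: over-released
            return None
        hp += r * (s / capacity)            # head grows with storage
        dev += abs(r - nat) / nat
    ei = 1.0 - dev / len(releases)          # 1.0 = perfectly natural regime
    return hp, ei

inflows = [30, 50, 80, 60, 40, 20]
natural = inflows                           # pre-dam regime as reference
best = None
# grid-search constant release fractions, with an IHA-style minimum-flow constraint
for frac in [0.5, 0.7, 0.9, 1.0, 1.1]:
    releases = [max(10.0, frac * q) for q in inflows]   # min env. flow = 10
    out = evaluate(releases, inflows, natural)
    if out is None:
        continue
    hp, ei = out
    score = 0.5 * hp + 0.5 * 100 * ei       # weighted trade-off (EI rescaled)
    if best is None or score > best[0]:
        best = (score, frac, hp, ei)
```

    Sweeping the weight in `score` instead of fixing it at 0.5 traces out the kind of Pareto frontier the abstract describes for scenarios A and B.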

  11. Scenarios and methods that induce protruding or released CNTs after degradation of nanocomposite materials.

    PubMed

    Hirth, Sabine; Cena, Lorenzo; Cox, Gerhard; Tomović, Zeljko; Peters, Thomas; Wohlleben, Wendel

    2013-04-01

    Nanocomposite materials may be considered as a low-risk application of nanotechnology, if the nanofillers remain embedded throughout the life-cycle of the products in which they are embedded. We hypothesize that release of free CNTs occurs by a combination of mechanical stress and chemical degradation of the polymer matrix. We experimentally address limiting cases: Mechanically released fragments may show tubular protrusions on their surface. Here we identify these protrusions unambiguously as naked CNTs by chemically resolved microscopy and a suitable preparation protocol. By size-selective quantification of fragments we establish as a lower limit that at least 95 % of the CNTs remain embedded. Contrary to classical fiber composite approaches, we link this phenomenon to matrix materials with only a few percent elongation at break, predicting which materials should still cover their CNT nanofillers after machining. Protruding networks of CNTs remain after photochemical degradation of the matrix, and we show that it takes the worst case combinations of weathering plus high-shear wear to release free CNTs in the order of mg/m 2 /year. Synergy of chemical degradation and mechanical energy input is identified as the priority scenario of CNT release, but its lab simulation by combined methods is still far from real-world validation.

  12. Testing high resolution numerical models for analysis of contaminant storage and release from low permeability zones.

    PubMed

    Chapman, Steven W; Parker, Beth L; Sale, Tom C; Doner, Lee Ann

    2012-08-01

    It is now widely recognized that contaminant release from low permeability zones can sustain plumes long after primary sources are depleted, particularly for chlorinated solvents, where regulatory limits are orders of magnitude below source concentrations. This has led to efforts to appropriately characterize sites and apply models for prediction incorporating these effects. A primary challenge is that diffusion processes are controlled by small-scale concentration gradients, and capturing mass distribution in low permeability zones requires much higher resolution than commonly practiced. This paper explores the validity of using numerical models (HydroGeoSphere, FEFLOW, MODFLOW/MT3DMS) in high resolution mode to simulate scenarios involving diffusion into and out of low permeability zones: 1) a laboratory tank study involving a continuous sand body with suspended clay layers, which was 'loaded' with bromide and fluorescein (for visualization) tracers followed by clean water flushing, and 2) the two-layer analytical solution of Sale et al. (2008) involving a relatively simple scenario with an aquifer and underlying low permeability layer. All three models are shown to provide close agreement when adequate spatial and temporal discretization is applied to represent the problem geometry, resolve flow fields, capture advective transport in the sands and diffusive transfer with low permeability layers, and minimize numerical dispersion. The challenge for application at field sites then becomes appropriate site characterization to inform the models: capturing the style of the low permeability zone geometry and incorporating reasonable hydrogeologic parameters and estimates of source history, for scenario testing and more accurate prediction of plume response, leading to better site decision making. Copyright © 2012 Elsevier B.V. All rights reserved.
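    The loading-then-flushing behaviour these models must capture can be reproduced with a one-dimensional explicit finite-difference scheme. This is a minimal sketch of the physics, not any of the three codes; the grid, time step and diffusion coefficient are illustrative:

```python
def diffuse(c, D=1e-9, dz=0.01, dt=1e4, steps=1000, c_boundary=1.0):
    """Explicit finite-difference diffusion into a low-permeability layer.
    c[0] is held at the aquifer concentration (Dirichlet boundary);
    c[-1] is a no-flux boundary at the bottom of the layer."""
    r = D * dt / dz**2            # explicit-scheme stability number
    assert r <= 0.5               # must hold for a stable solution
    c = list(c)
    for _ in range(steps):
        c[0] = c_boundary
        new = c[:]
        for i in range(1, len(c) - 1):
            new[i] = c[i] + r * (c[i-1] - 2 * c[i] + c[i+1])
        new[-1] = new[-2]         # zero-gradient (no-flux) bottom boundary
        c = new
    return c

n = 21
profile = diffuse([0.0] * n, steps=5000)                 # 'loading' phase
flushed = diffuse(profile, steps=5000, c_boundary=0.0)   # clean-water flushing
stored_after_flush = sum(flushed) / n                    # mass still held in clay
```

    The nonzero residual after flushing for as long as the loading took is the back-diffusion tailing that sustains plumes after source depletion; resolving it is exactly why the paper argues for fine discretization near the interface.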

  13. Expert assessment of vulnerability of permafrost carbon to climate change

    USGS Publications Warehouse

    Schuur, E.A.G.; Abbott, B.W.; Bowden, W.B.; Brovkin, V.; Camill, P.; Canadell, J.G.; Chanton, J.P.; Chapin, F. S.; Christensen, T.R.; Ciais, P.; Crosby, B.T.; Czimczik, C.I.; Grosse, G.; Harden, J.; Hayes, D.J.; Hugelius, G.; Jastrow, J.D.; Jones, J.B.; Kleinen, T.; Koven, C.D.; Krinner, G.; Kuhry, P.; Lawrence, D.M.; McGuire, A.D.; Natali, Susan M.; O'Donnell, J. A.; Ping, C.-L.; Riley, W.J.; Rinke, A.; Romanovsky, V.E.; Sannel, A.B.K.; Schädel, C.; Schaefer, K.; Sky, J.; Subin, Z.M.; Tarnocai, C.; Turetsky, M.R.; Waldrop, M.P.; Anthony, K.M. Walter; Wickland, K.P.; Wilson, C.J.; Zimov, S.A.

    2013-01-01

    Approximately 1700 Pg of soil carbon (C) are stored in the northern circumpolar permafrost zone, more than twice as much C as in the atmosphere. The overall amount, rate, and form of C released to the atmosphere in a warmer world will influence the strength of the permafrost C feedback to climate change. We used a survey to quantify variability in the perception of the vulnerability of permafrost C to climate change. Experts were asked to provide quantitative estimates of permafrost change in response to four scenarios of warming. For the highest warming scenario (RCP 8.5), experts hypothesized that C release from permafrost zone soils could be 19–45 Pg C by 2040, 162–288 Pg C by 2100, and 381–616 Pg C by 2300 in CO2 equivalent using 100-year CH4 global warming potential (GWP). These values become 50% larger using 20-year CH4 GWP, with a third to a half of expected climate forcing coming from CH4 even though CH4 was only 2.3% of the expected C release. Experts projected that two-thirds of this release could be avoided under the lowest warming scenario (RCP 2.6). These results highlight the potential risk from permafrost thaw and serve to frame a hypothesis about the magnitude of this feedback to climate change. However, the level of emissions proposed here is unlikely to overshadow the impact of fossil fuel burning, which will continue to be the main source of C emissions and climate forcing.
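    The CO2-equivalent conversion underlying these figures, splitting a C release between CO2 and CH4 and weighting the CH4 share by its GWP, is straightforward arithmetic. The GWP values 28 (100-year) and 84 (20-year) below are illustrative IPCC AR5 figures, not necessarily those used in the survey, and 200 Pg C is simply a round number within the experts' 2100 range:

```python
def co2_equivalent(total_c_pg, ch4_c_fraction=0.023, gwp_ch4=28.0):
    """Convert a permafrost C release (Pg C) to Pg CO2-equivalent.
    C emitted as CO2 scales by 44/12 (molar mass ratio CO2/C); C emitted
    as CH4 scales by 16/12 (CH4/C) and then by the CH4 GWP."""
    c_as_ch4 = total_c_pg * ch4_c_fraction
    c_as_co2 = total_c_pg - c_as_ch4
    co2_part = c_as_co2 * (44.0 / 12.0)
    ch4_part = c_as_ch4 * (16.0 / 12.0) * gwp_ch4
    total = co2_part + ch4_part
    return total, ch4_part / total      # total CO2-eq and CH4 forcing share

total_100yr, ch4_share_100 = co2_equivalent(200.0, gwp_ch4=28.0)
total_20yr, ch4_share_20 = co2_equivalent(200.0, gwp_ch4=84.0)
```

    Even with CH4 at only 2.3% of the carbon, its forcing share grows substantially when the 20-year GWP is used, which is the effect the abstract highlights.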

  14. Expert assessment of vulnerability of permafrost carbon to climate change

    DOE PAGES

    Schuur, E. A. G.; Abbott, B. W.; Bowden, W. B.; ...

    2013-03-26

    Approximately 1700 Pg of soil carbon (C) are stored in the northern circumpolar permafrost zone, more than twice as much C as in the atmosphere. The overall amount, rate, and form of C released to the atmosphere in a warmer world will influence the strength of the permafrost C feedback to climate change. We used a survey to quantify variability in the perception of the vulnerability of permafrost C to climate change. Experts were asked to provide quantitative estimates of permafrost change in response to four scenarios of warming. For the highest warming scenario (RCP 8.5), experts hypothesized that C release from permafrost zone soils could be 19–45 Pg C by 2040, 162–288 Pg C by 2100, and 381–616 Pg C by 2300 in CO2 equivalent using 100-year CH4 global warming potential (GWP). These values become 50% larger using 20-year CH4 GWP, with a third to a half of expected climate forcing coming from CH4 even though CH4 was only 2.3% of the expected C release. Experts projected that two-thirds of this release could be avoided under the lowest warming scenario (RCP 2.6). These results highlight the potential risk from permafrost thaw and serve to frame a hypothesis about the magnitude of this feedback to climate change. However, the level of emissions proposed here is unlikely to overshadow the impact of fossil fuel burning, which will continue to be the main source of C emissions and climate forcing.

  15. Collisionless coupling processes in AMPTE releases

    NASA Technical Reports Server (NTRS)

    Lui, A. T. Y.

    1990-01-01

    An evaluation is made of results obtained to date by in situ measurements, numerical simulations, and theoretical considerations of Active Magnetospheric Particle Tracer Explorer chemical releases bearing on the nature of collisionless coupling processes. It is noted that both laminar and turbulent forces act to couple the solar wind momentum and energy to the release cloud; the magnetic field compression formed in this interaction plays an important intermediary role in coupling the two plasmas, and the intense electrostatic turbulence generated enhances the interaction. A scenario accounting for several features in the observed evolution of the December 27, 1984 artificial comet release is presented.

  16. Halite as a Methane Sequestration Host: A Possible Explanation for Periodic Methane Release on Mars, and a Surface-accessible Source of Ancient Martian Carbon

    NASA Technical Reports Server (NTRS)

    Fries, M. D.; Steele, Andrew; Hynek, B. M.

    2015-01-01

    We present the hypothesis that halite may play a role in methane sequestration on the martian surface. In terrestrial examples, halite deposits sequester large volumes of methane and chloromethane. Also, examples of chloromethane-bearing, approximately 4.5 Ga old halite from the Monahans meteorite show that this system is very stable unless the halite is damaged. On Mars, methane may be generated from carbonaceous material trapped in ancient halite deposits and sequestered. The methane may be released by damaging its halite host; either by aqueous alteration, aeolian abrasion, heating, or impact shock. Such a scenario may help to explain the appearance of short-lived releases of methane on the martian surface. The methane may be of either biogenic or abiogenic origin. If this scenario plays a significant role on Mars, then martian halite deposits may contain samples of organic compounds dating to the ancient desiccation of the planet, accessible at the surface for future sample return missions.

  17. Techno-economic potential of bioethanol from bamboo in China

    PubMed Central

    2013-01-01

    Background Bamboo is potentially an interesting feedstock for advanced bioethanol production in China due to its natural abundance, rapid growth, perennial nature and low management requirements. Liquid hot water (LHW) pretreatment was selected as a promising technology to enhance sugar release from bamboo lignocellulose whilst keeping economic and environmental costs to a minimum. The present research was conducted to assess: 1) by how much LHW pretreatment can enhance sugar yields in bamboo, and 2) whether this process has the potential to be economically feasible for biofuel use at the commercial scale. Pretreatments were performed at temperatures of 170-190°C for 10–30 minutes, followed by enzymatic saccharification with a commercial enzyme cocktail at various loadings. These data were then used as inputs to a techno-economic model using AspenPlus™ to determine the production cost of bioethanol from bamboo in China. Results At the selected LHW pretreatment of 190°C for 10 minutes, 69% of the initial sugars were released under a standardised enzyme loading; this varied between 59-76% when 10–140 FPU/g glucan of commercial enzyme Cellic CTec2 was applied. Although the lowest enzyme loading yielded the least amount of bioethanol, the techno-economic evaluation revealed it to be the most economically viable scenario with a production cost of $0.484 per litre (with tax exemption and a $0.16/litre subsidy). The supply-chain analysis demonstrated that bioethanol could be economically competitive with petrol at the pump at enzyme loadings up to 60 FPU/g glucan. However, in a prospective scenario with reduced government support, this enzyme loading threshold would be reduced to 30 FPU/g glucan. Conclusions Bioethanol from bamboo is shown to be both technically and economically feasible, as well as competitive with petrol in China. 
    Alternative approaches to reducing bioethanol production costs are still needed, however, to ensure its competitiveness in a possible future scenario where neither tax exemptions nor subsidies are granted to producers. These measures may include improving sugar release with more effective pretreatments and reduced enzyme usage, accessing low-cost bamboo feedstock, or selecting feedstocks with higher or more accessible cellulose. PMID:24286490

  18. Modelling Behaviour of a Carbon Epoxy Composite Exposed to Fire: Part I-Characterisation of Thermophysical Properties.

    PubMed

    Tranchard, Pauline; Samyn, Fabienne; Duquesne, Sophie; Estèbe, Bruno; Bourbigot, Serge

    2017-05-04

    Thermophysical properties of a carbon-reinforced epoxy composite laminate (T700/M21 composite for aircraft structures) were evaluated using different innovative characterisation methods. Thermogravimetric Analysis (TGA), Simultaneous Thermal Analysis (STA), Laser Flash Analysis (LFA), and Fourier Transform Infrared (FTIR) analysis were used to measure the thermal decomposition, the specific heat capacity, the anisotropic thermal conductivity of the composite, the heats of decomposition and the specific heat capacity of the released gases. This provides the input data needed for a three-dimensional (3D) model of the temperature profile and mass loss obtained during well-defined fire scenarios (the model is presented in Part II of this paper). The measurements were optimised to obtain accurate data, which also serve to create a public database on an aeronautical carbon fibre/epoxy composite for fire safety engineering.

  19. Data reduction and analysis of graphite fiber release experiments

    NASA Technical Reports Server (NTRS)

    Lieberman, P.; Chovit, A. R.; Sussholz, B.; Korman, H. F.

    1979-01-01

    The burn and burn/explode effects on aircraft structures were examined in a series of fifteen outdoor tests conducted to verify the results obtained in previous burn and explode tests of carbon/graphite composite samples conducted in a closed chamber, and to simulate aircraft accident scenarios in which carbon/graphite fibers would be released. The primary effects to be investigated in these tests were the amount and size distribution of the conductive fibers released from the composite structures, and how these various sizes of fibers transported downwind. The structures included plates, barrels, aircraft spoilers and a cockpit. The heat sources included a propane gas burner and 20 ft by 20 ft and 40 ft by 60 ft JP-5 pool fires; the larger pool fire was selected to simulate an aircraft accident. The passive instrumentation included sticky paper and sticky bridal veil over an area extending 6000 ft downwind and 3000 ft crosswind. The active instrumentation included instrumented meteorological towers, movies, infrared imaging cameras, LADAR, high voltage ball gages, light emitting diode gages, microwave gages and a flame velocimeter.

  20. Dispersion modeling of accidental releases of toxic gases - Comparison of the models and their utility for the fire brigades.

    NASA Astrophysics Data System (ADS)

    Stenzel, S.; Baumann-Stanzer, K.

    2009-04-01

    In the case of an accidental release of hazardous gases into the atmosphere, emergency responders need a reliable and fast tool to assess the possible consequences and apply the optimal countermeasures. A number of air dispersion models are available for hazard prediction and simulation of hazard zones. Most model packages (commercial or free of charge) include a chemical database, an intuitive graphical user interface (GUI) and automated graphical output for displaying results; they are easy to use and can operate quickly and effectively in stressful situations. The models are designed especially for analyzing different accidental toxic release scenarios ("worst-case scenarios"), preparing emergency response plans and optimal countermeasures, and for real-time risk assessment and management. Some can also be coupled directly to automatic meteorological stations, in order to avoid uncertainties in the model output due to insufficient or incorrect meteorological data. Another key problem in coping with accidental toxic releases is the relatively wide spectrum of regulations and threshold values, such as IDLH, ERPG, AEGL and MAK, and the differing criteria for their application. Since the various emergency responders and organizations require different regulations and values for their purposes, it is quite difficult to predict the individual hazard areas. A number of research studies and investigations have addressed the problem, but the final decision rests with the authorities.
The research project RETOMOD (reference scenarios calculations for toxic gas releases - model systems and their utility for the fire brigade) was conducted by the Central Institute for Meteorology and Geodynamics (ZAMG) in cooperation with the Vienna fire brigade, OMV Refining & Marketing GmbH and Synex Ries & Greßlehner GmbH. RETOMOD was funded by the KIRAS safety research program at the Austrian Ministry of Transport, Innovation and Technology (www.kiras.at). The main tasks of this project were: 1. a sensitivity study and optimization of the meteorological input for modeling the hazard areas (human exposure) during accidental toxic releases; 2. a comparison of several model packages (based on reference scenarios) in order to estimate their utility for the fire brigades. This presentation introduces the models used in the project and presents the results of task 2. The results of task 1 are presented by Baumann-Stanzer and Stenzel in this session. For the purpose of this study the following models were tested and compared: ALOHA (Areal Locations of Hazardous Atmospheres, EPA), MEMPLEX (Keudel av-Technik GmbH), Breeze (Trinity Consulting), SAFER Systems, SAM (Engineering office Lohmeyer) and COMPAS. A set of reference scenarios for chlorine, ammonia, butane and petrol was processed in order to reliably predict and estimate the human exposure during such events. The models simulated accidental releases of the gases mentioned above and estimated the potential toxic areas. Since the input requirements differ from model to model, and the outputs are based on different criteria for toxic areas and exposure, a high degree of caution is needed in the interpretation of the model results.
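
As an illustration of the kind of calculation such packages perform internally, the sketch below evaluates a minimal Gaussian plume for a continuous ground-level release and scans for the farthest distance at which a toxicity threshold is exceeded. The Briggs rural class-D dispersion coefficients, the coarse scan, and all numeric inputs are illustrative assumptions, not taken from any of the packages compared in RETOMOD.

```python
import math

def plume_concentration(q_g_s, u_m_s, x_m):
    """Ground-level centerline concentration (g/m^3) of a continuous
    ground-level release, using illustrative Briggs rural class-D sigmas."""
    sigma_y = 0.08 * x_m / math.sqrt(1.0 + 0.0001 * x_m)
    sigma_z = 0.06 * x_m / math.sqrt(1.0 + 0.0015 * x_m)
    return q_g_s / (math.pi * sigma_y * sigma_z * u_m_s)

def hazard_distance(q_g_s, u_m_s, threshold_g_m3, x_max_m=50_000.0):
    """Farthest downwind distance at which a toxicity threshold (e.g. an
    ERPG- or AEGL-like value) is still exceeded; coarse 10 m scan."""
    farthest = 0.0
    x = 10.0
    while x <= x_max_m:
        if plume_concentration(q_g_s, u_m_s, x) >= threshold_g_m3:
            farthest = x
        x += 10.0
    return farthest
```

Because the concentration falls off with both dispersion coefficients, the hazard distance grows with release rate and shrinks with wind speed; this qualitative behaviour is shared by all the compared packages, while their quantitative results differ with the criteria discussed above.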

  1. Comparison of the predictions of alternative cosmologies to the standard model with cosmic microwave background data

    NASA Astrophysics Data System (ADS)

    Piccirilli, M. P.; Landau, S. J.; León, G.

    2016-08-01

    The cosmic microwave background radiation is one of the most powerful tools to study the early Universe and its evolution, providing also a method to test different cosmological scenarios. We consider alternative inflationary models where the emergence of the seeds of cosmic structure from a perfectly isotropic and homogeneous universe can be explained by the self-induced collapse of the inflaton wave function. Some of these alternative models may turn out to be indistinguishable from the standard model, while others must be compared with observational data through statistical analysis. In this article we show results concerning the first Planck release, the Atacama Cosmology Telescope, the South Pole Telescope, WMAP and the Sloan Digital Sky Survey datasets, reaching good agreement between data and theoretical predictions. In future work, we aim to achieve better limits on the cosmological parameters using the latest Planck release.

  2. Learning and robustness to catch-and-release fishing in a shark social network

    PubMed Central

    Brown, Culum; Planes, Serge

    2017-01-01

    Individuals can play different roles in maintaining connectivity and social cohesion in animal populations and thereby influence population robustness to perturbations. We performed a social network analysis in a reef shark population to assess the vulnerability of the global network to node removal under different scenarios. We found that the network was generally robust to the removal of nodes with high centrality. The network also appeared highly robust to experimental fishing. Individual shark catchability decreased as a function of experience, as revealed by comparing capture frequency and site presence. Altogether, these features suggest that individuals learnt to avoid capture, which ultimately increased network robustness to experimental catch-and-release. Our results also suggest that some caution must be taken when using capture–recapture models often used to assess population size as assumptions (such as equal probabilities of capture and recapture) may be violated by individual learning to escape recapture. PMID:28298593

  3. Classic Emergencies

    EPA Pesticide Factsheets

    This example scenario describes a hazardous release of liquid chlorine at a chemical manufacturing facility, notification of the National Response Center, and deployment of an on-scene coordinator and subsequent response procedures.

  4. Envisioning Nano Release Dynamics in a Changing World: Using Dynamic Probabilistic Modeling to Assess Future Environmental Emissions of Engineered Nanomaterials.

    PubMed

    Sun, Tian Yin; Mitrano, Denise M; Bornhöft, Nikolaus A; Scheringer, Martin; Hungerbühler, Konrad; Nowack, Bernd

    2017-03-07

    The need for an environmental risk assessment for engineered nanomaterials (ENM) requires knowledge of their environmental emissions. Material flow analysis (MFA) models have been used to provide predicted environmental emissions, but most current nano-MFA models consider neither the rapid development of ENM production nor the fact that a large proportion of ENM enter an in-use stock and are released from products over time (i.e., have a lag phase). Here we use dynamic probabilistic material flow modeling to predict scenarios of the future flows of four ENM (nano-TiO2, nano-ZnO, nano-Ag and CNT) to environmental compartments and to quantify their amounts in temporary sinks such as the in-use stock and in "final" environmental sinks such as soil and sediment. In these scenarios, we estimate likely future amounts if the use and distribution of ENM in products continue along current trends (i.e., a business-as-usual approach) and predict the effect of hypothetical trends in the market development of nanomaterials, such as the emergence of a new widely used product or a ban on certain substances, on the flows of nanomaterials to the environment in years to come. We show that depending on the scenario and the product type affected, significant changes in the flows occur over time, driven by the growth of stocks and delayed release dynamics.
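
The lag-phase idea can be sketched with a toy deterministic stock-and-flow model: each year's production enters an in-use stock and leaves it over a fixed product lifetime. The uniform release profile and the production figures below are hypothetical simplifications of the authors' dynamic probabilistic approach.

```python
def simulate_flows(production_t_per_yr, lifetime_yr):
    """Return (annual release, in-use stock) when each year's production
    is released back to the environment in equal parts over the lifetime."""
    n = len(production_t_per_yr)
    release = [0.0] * (n + lifetime_yr)
    for year, amount in enumerate(production_t_per_yr):
        for k in range(1, lifetime_yr + 1):
            release[year + k] += amount / lifetime_yr
    stock, in_use = [], 0.0
    for year in range(n):
        in_use += production_t_per_yr[year] - release[year]
        stock.append(in_use)
    return release[:n], stock

# Hypothetical growing production of an ENM, tonnes per year.
production = [10.0, 20.0, 40.0, 80.0, 80.0, 80.0]
annual_release, in_use_stock = simulate_flows(production, lifetime_yr=3)
```

Release trails production: in this toy run emissions keep rising even after production levels off, which is why static, lag-free models understate future flows to environmental sinks.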

  5. Analytical model of flame spread in full-scale room/corner tests (ISO9705)

    Treesearch

    Mark Dietenberger; Ondrej Grexa

    1999-01-01

    A physical, yet analytical, model of fire growth has predicted flame spread and rate of heat release (RHR) for an ISO9705 test scenario using bench-scale data from the cone calorimeter. The test scenario simulated was the propane ignition burner at the comer with a 100/300 kW program and the specimen lined on the walls only. Four phases of fire growth were simulated....

  6. Building Civilian-Military Collaboration to Enhance Response Following an Anthrax Release

    DTIC Science & Technology

    2012-05-04

    thought that rural communities are not considered "high risk" for the anthrax scenario as their widely dispersed population may not be a likely ... terrorist target. The community planners' perception of risk will impact the time and effort a rural community places towards planning for these ... types of scenarios. The diversity of urban and rural populations and their differing healthcare systems and infrastructures present complexities when

  7. Sensitivity of the global submarine hydrate inventory to scenarios of future climate change

    NASA Astrophysics Data System (ADS)

    Hunter, S. J.; Goldobin, D. S.; Haywood, A. M.; Ridgwell, A.; Rees, J. G.

    2013-04-01

    The global submarine inventory of methane hydrate is thought to be considerable. The stability of marine hydrates is sensitive to changes in temperature and pressure and, once destabilised, hydrates release methane into sediments and ocean and potentially into the atmosphere, creating a positive feedback with climate change. Here we present results from a multi-model study investigating how the methane hydrate inventory dynamically responds to different scenarios of future climate and sea level change. The results indicate that a warming-induced reduction is dominant even when assuming rather extreme rates of sea level rise (up to 20 mm yr-1) under moderate warming scenarios (RCP 4.5). Over the next century modelled hydrate dissociation is focussed in the top ~100 m of Arctic and Subarctic sediments beneath <500 m water depth. Predicted dissociation rates are particularly sensitive to the modelled vertical hydrate distribution within sediments. Under the worst-case business-as-usual scenario (RCP 8.5), upper estimates of resulting global sea-floor methane fluxes could exceed estimates of natural global fluxes by 2100 (>30-50 Tg CH4 yr-1), although subsequent oxidation in the water column could reduce peak atmospheric release rates to 0.75-1.4 Tg CH4 yr-1.

  8. The Role of Deposition in Limiting the Hazard Extent of Dense-Gas Plumes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dillon, M B

    2008-01-29

    Accidents involving release of large (multi-ton) quantities of toxic industrial chemicals often yield far fewer fatalities and casualties than standard, widely-used assessment and emergency response models predict. While recent work has suggested that models should incorporate the protection provided by buildings, more refined health-effect methodologies, and more detailed consideration of the release process, investigations into the role of deposition onto outdoor surfaces have been lacking. In this paper, we examine the conditions under which dry deposition may significantly reduce the extent of the downwind hazard zone. We provide theoretical arguments that in congested environments (e.g. suburbs, forests), deposition to vertical surfaces (such as building walls) may play a significant role in reducing the hazard zone extent, particularly under low-wind, stable atmospheric conditions, which are often considered to be the worst-case scenario for these types of releases. Our analysis suggests that in these urban or suburban environments, the amount of toxic chemicals lost to the earth's surface is typically a small fraction of overall depositional losses. For isothermal gases such as chlorine, the degree to which the chemicals stick to (or react with) surfaces (i.e. surface resistance) is demonstrated to be a key parameter controlling hazard extent (the maximum distance from the release at which hazards to human health are expected). This analysis does not consider the depositional effects associated with particulate matter or gases that undergo significant thermal change in the atmosphere. While no controlled experiments were available to validate our hypothesis, our analysis results are qualitatively consistent with the observed downwind extent of vegetation damage in two chlorine accidents.
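
The surface-resistance effect described above can be sketched with the standard resistance-in-series deposition velocity combined with a first-order depletion term for a plume mixed through a shallow layer. All resistance values, the travel distance and the mixing depth below are illustrative assumptions, not values from the paper.

```python
import math

def deposition_velocity(r_a, r_b, r_c):
    """Resistance-in-series model (s/m): aerodynamic (r_a), quasi-laminar
    (r_b) and surface (r_c) resistances give a deposition velocity in m/s."""
    return 1.0 / (r_a + r_b + r_c)

def fraction_airborne(v_d, x_m, u_m_s, h_m):
    """Mass fraction still airborne after travelling x metres, assuming the
    plume is well mixed through depth h and lost at first-order rate v_d/h."""
    return math.exp(-v_d * x_m / (u_m_s * h_m))

# A gas sticking readily to walls (low surface resistance) versus a more
# inert one (high surface resistance), same 2 km travel in a 2 m/s wind.
sticky = fraction_airborne(deposition_velocity(50.0, 30.0, 20.0), 2000.0, 2.0, 10.0)
inert = fraction_airborne(deposition_velocity(50.0, 30.0, 2000.0), 2000.0, 2.0, 10.0)
```

Lowering the surface resistance sharply reduces the airborne fraction and hence shortens the hazard extent, which is the qualitative role the paper assigns to surface resistance for gases such as chlorine.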

  9. Scenarios of stable Vapor→Liquid Droplet→Solid Nanowire growth

    NASA Astrophysics Data System (ADS)

    Nebol`sin, Valery A.; Dunaev, Alexander I.; Tatarenkov, Alexander F.; Shmakova, Svetlana S.

    2016-09-01

    In the process of Nanowire (NW) growth under the Vapor→Liquid Droplet→Solid (VLS) scheme, the steps that reach the boundary of the crystallization front (the triple phase line, TPL) under the catalyst droplet are either absorbed by the TPL or accumulate ahead of it. It has been shown that, in the first case, the TPL can release steps, which leads to a decrease in the supersaturation necessary for NW growth. An equation has been derived which defines the change in the free surface energy of the three-phase system upon the absorption (release) of a step, as a function of the contact angle of the droplet and the ratio between the phase conjugation angles at the interface at an equilibrium shift of the boundary line. A thermodynamic model has been developed, and three possible scenarios of sustainable NW growth (Non-Wetting, Wetting and Fully Wetting) have been considered in accordance with the processes occurring at the interface of the three phases. The results obtained for each scenario were used to analyze the polytypism of GaAs and InAs NWs, the radial periodic instability of Si NWs and the formation of "negative" NWs.

  10. Estimating emissions from adhesives and sealants uses and manufacturing for environmental risk assessments.

    PubMed

    Tolls, Johannes; Gómez, Divina; Guhl, Walter; Funk, Torsten; Seger, Erich; Wind, Thorsten

    2016-01-01

    Regulation (EC) No 1907/2006 concerning the Registration, Evaluation, Authorisation and Restriction of Chemicals (REACH) requires that environmental exposure assessments be performed for all uses of dangerous substances that are marketed in the European Union in quantities above 10 tons per year. The quantification of emissions to the environment is a key step in this process. This publication describes the derivation of release factors and gives guidance for estimating use rates for quantifying the emissions from the manufacturing and application of adhesives and sealants. Release factors available for coatings and paints are read across to adhesives and sealants based on similarities between these 2 product groups with regard to chemical composition and to processing during manufacturing and application. The granular emission scenarios in the coatings and paints documents are mapped to the broad emission scenarios for adhesives and sealants. According to this mapping, the worst-case release factors for coatings and paints are identified and assigned to the adhesives and sealants scenarios. The resulting 10 specific environmental release categories (SPERCs) for adhesives and sealants are defined by differentiating between solvent and nonsolvent ingredients and between water-borne and solvent-borne or solvent-free products. These cover the vast majority of the production processes and uses and are more realistic than the 5 relevant emission-estimation defaults provided in the REACH guidance. They are accompanied by adhesive and sealant consumption rates in the EU and by guidance for estimating conservative substance use rates at a generic level. The approach of combining conservative SPERC release factors with conservative estimates of substance use rates is likely to yield emission estimates that tend to overpredict actual releases.
Because this qualifies the approach for use in lower-tier environmental exposure assessment, the Association of the European Adhesive & Sealant Industry (FEICA) SPERCs are available in several exposure assessment tools that are used under REACH. Given the limited regional variation in the manufacturing and use processes of adhesives and sealants, the SPERCs may be applicable for emission estimation not only in the EU but also in other regions. © 2015 SETAC.
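
At its core, the SPERC calculation multiplies a substance use rate by a conservative, compartment-specific release factor. The factors and the use rate below are hypothetical placeholders for illustration only, not FEICA values.

```python
def emissions_kg_per_day(use_rate_kg_per_day, release_factors):
    """Daily emission to each compartment = use rate x release factor."""
    return {comp: use_rate_kg_per_day * f for comp, f in release_factors.items()}

# Hypothetical worst-case factors for a solvent-borne adhesive ingredient.
factors = {"air": 0.95, "wastewater": 0.01, "soil": 0.0001}
daily = emissions_kg_per_day(500.0, factors)
```

Because each factor is chosen conservatively, summing the resulting emissions across compartments tends to overpredict actual releases, which is what qualifies the approach for lower-tier assessment.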

  11. Influence of lag effect, soil release, and climate change on watershed anthropogenic nitrogen inputs and riverine export dynamics.

    PubMed

    Chen, Dingjiang; Huang, Hong; Hu, Minpeng; Dahlgren, Randy A

    2014-05-20

    This study demonstrates the importance of the nitrogen-leaching lag effect, soil nitrogen release, and climate change on anthropogenic N inputs (NANI) and riverine total nitrogen (TN) export dynamics using a 30-yr record for the Yongan River watershed in eastern China. Cross-correlation analysis indicated a 7-yr, 5-yr, and 4-yr lag time in riverine TN export in response to changes in NANI, temperature, and drained agricultural land area, respectively. Enhanced by warmer temperatures and improved agricultural drainage, the upper 20 cm of agricultural soils released 270 kg N ha⁻¹ between 1980 and 2009. Climate change also increased the fractional export of NANI to the river. An empirical model (R² = 0.96) for annual riverine TN flux incorporating these influencing factors estimated that 35%, 41%, and 24% of riverine TN flux originated from the soil N pool, NANI, and background N sources, respectively. The model forecasted an increase of 45%, 25%, and 6% and a decrease of 13% in riverine TN flux from 2010 to 2030 under continued development, climate change, status-quo, and tackling scenarios, respectively. The lag effect, soil N release, and climate change delay riverine TN export reductions with respect to decreases in NANI and should be considered in developing and evaluating N management measures.
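
The lag identification can be sketched as a scan over candidate lags, keeping the one that maximises the Pearson correlation between the input series and the lag-shifted response. The series below are synthetic, and the study's statistics are more involved than this sketch.

```python
import math

def pearson(a, b):
    """Pearson correlation of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

def best_lag(inputs, response, max_lag):
    """Lag (in years) at which the input series best explains the response."""
    return max(range(1, max_lag + 1),
               key=lambda lag: pearson(inputs[:-lag], response[lag:]))

# Synthetic check: a response that follows the input with a 7-year delay.
inputs = [math.sin(0.7 * t) + 0.05 * t for t in range(30)]
response = [0.0] * 7 + inputs[:-7]
```

On these synthetic series `best_lag(inputs, response, 10)` recovers the built-in 7-year delay, mirroring how the 7-yr NANI lag was identified from the 30-yr record.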

  12. Cost-effectiveness of extended-release niacin/laropiprant added to a stable simvastatin dose in secondary prevention patients not at cholesterol goal in Germany.

    PubMed

    Michailov, Galin V; Davies, Glenn M; Krobot, Karl J

    2012-06-01

    Coronary heart disease (CHD) remains the leading cause of death in Germany despite statin use to reduce low-density lipoprotein cholesterol (LDL-C) levels; improving lipids beyond LDL-C may further reduce cardiovascular risk. A fixed-dose combination of extended-release niacin (ERN) with laropiprant (LRPT) provides comprehensive lipid management. We adapted a decision-analytic model to evaluate the economic value (incremental cost-effectiveness ratio [ICER] in terms of costs per life-years gained [LYG]) of ERN/LRPT 2 g over a lifetime in secondary prevention patients in a German setting. Two scenarios were modelled: (1) ERN/LRPT 2 g added to simvastatin 40 mg in patients not at LDL-C goal with simvastatin 40 mg; (2) adding ERN/LRPT 2 g compared with titration to simvastatin 40 mg in patients not at LDL-C goal with simvastatin 20 mg. In both scenarios, adding ERN/LRPT was cost-effective relative to simvastatin monotherapy at a commonly accepted threshold of €30,000 per LYG; ICERs for ERN/LRPT were €13,331 per LYG in scenario 1 and €17,684 per LYG in scenario 2. Subgroup analyses showed that ERN/LRPT was cost-effective in patients with or without diabetes, patients aged ≤ 65 or >65 years and patients with low baseline high-density lipoprotein cholesterol levels; ICERs ranged from €10,342 to €15,579 in scenario 1, and from €14,081 to €20,462 in scenario 2. In conclusion, comprehensive lipid management with ERN/LRPT 2 g is cost-effective in secondary prevention patients in Germany who have not achieved LDL-C goal with simvastatin monotherapy.
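
The headline arithmetic here is the incremental cost-effectiveness ratio compared against a willingness-to-pay threshold. The euro and life-year figures below are hypothetical stand-ins, not inputs or outputs of the adapted model.

```python
def icer(cost_new, ly_new, cost_old, ly_old):
    """Incremental cost-effectiveness ratio: extra cost per life-year gained."""
    return (cost_new - cost_old) / (ly_new - ly_old)

# Hypothetical discounted lifetime cost and life-years per patient.
ratio = icer(cost_new=24_000.0, ly_new=12.4, cost_old=18_000.0, ly_old=12.0)
cost_effective = ratio <= 30_000.0  # vs. the EUR 30,000 per LYG threshold
```

A strategy is deemed cost-effective when its ICER falls below the accepted threshold, which is the comparison behind the scenario and subgroup results quoted above.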

  13. Assessment of the effects of release variables on the consequences of LNG spillage onto water using FERC models.

    PubMed

    Qiao, Yuanhua; West, Harry H; Mannan, M Sam; Johnson, David W; Cornwell, John B

    2006-03-17

    Liquefied natural gas (LNG) release, spread, evaporation, and dispersion processes are illustrated using the Federal Energy Regulatory Commission models in this paper. The spillage consequences are dependent upon the tank conditions, release scenarios, and the environmental conditions. The effects of the contributing variables, including the tank configuration, breach hole size, ullage pressure, wind speed and stability class, and surface roughness, on the consequence of LNG spillage onto water are evaluated using the models. The sensitivities of the consequences to those variables are discussed.

  14. Transportation analyses for the lunar-Mars initiative

    NASA Technical Reports Server (NTRS)

    Woodcock, Gordon R.; Buddington, Patricia A.

    1991-01-01

    This paper focuses on certain results of an ongoing NASA-sponsored study by Boeing, including (1) a series of representative space exploration scenarios; (2) the levels of effort required to accomplish each; and (3) a range of candidate transportation systems as partial implementations of the scenarios. This effort predated release of the Synthesis report; the three levels of activity described are not responses to the Synthesis architectures. These three levels (minimum, median and ambitious) do, however, envelop the range of scope described in the four Synthesis architecture models. The analysis was carried out to the currently known level of detail of transportation hardware systems and mission scenarios. The study did not include detailed analysis of earth-to-orbit transportation, surface systems, or tracking and communications systems. The influence of earth-to-orbit systems was considered in terms of delivery capacity and cost. Aspects of additional options, such as in situ resource utilization, are explored as needed to indicate potential benefits. Results favored cryogenic chemical propulsion for low activity levels and undemanding missions (such as cargo and some lunar missions), nuclear thermal propulsion for median activity levels similar to the Synthesis architectures, and nuclear thermal propulsion with aerobraking or nuclear electric propulsion for high activity levels. Solar electric propulsion was seen as having an important role if the present high unit cost (i.e., dollars per watt) of space photovoltaics could be reduced by a factor of five or more at production rates of megawatts per year.

  15. Carbon Calculator for Land Use Change from Biofuels Production (CCLUB) Users’ Manual and Technical Documentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dunn, Jennifer B.; Qin, Zhangcai; Mueller, Steffen

    The Carbon Calculator for Land Use Change from Biofuels Production (CCLUB) calculates carbon emissions from land use change (LUC) for four different ethanol production pathways including corn grain ethanol and cellulosic ethanol from corn stover, Miscanthus, and switchgrass, and a soy biodiesel pathway. This document discusses the version of CCLUB released September 30, 2017 which includes five ethanol LUC scenarios and four soy biodiesel LUC scenarios.

  16. Contamination and release of nanomaterials associated with the use of personal protective clothing.

    PubMed

    Tsai, Candace Su-Jung

    2015-05-01

    We investigated nanomaterial release associated with the contamination of protective clothing during manipulation of clothing fabrics contaminated with nanoparticles. Nanomaterials released as airborne nanoparticles can cause inhalation exposure, the exposure route of most concern for adverse health effects. Measurement of such nanoparticle re-suspension had not previously been conducted. Protective clothing can be contaminated with airborne nanoparticles during handling and operating processes, typically on the arms and front of the body, and the contaminated clothing can then release nanoparticles into the general room while the wearer performs other activities and manipulates the clothing after work. The exposures associated with three different fabric materials of contaminated laboratory coats (cotton, polyester, and Tyvek), including the magnitude of contamination and particle release, were investigated in this study by measuring the number concentration increase and the weight change on fabric pieces. This study simulated real-life occupational exposure scenarios and was performed in both regular and clean room environments to investigate the effect of background aerosols on the measurements. Concentrations were measured using particle spectrometers for diameters from 10 nm to 10 µm. Collected aerosol particles and contaminated fabric surfaces were characterized using scanning electron microscopy (SEM), transmission electron microscopy (TEM), and elemental composition analysis. The magnitude of particle release from contaminated lab coat fabric was found to vary by the type of fabric material; cotton fabric showed the highest level of contamination and particle release, followed by Tyvek and polyester fabrics. The polyester lab coat material was found to have the lowest particle release to deposition (R/D) ratio.
The particle release number concentrations were in the range of 768-119 particles cm⁻³ and 586-187 particles cm⁻³ in the regular and clean rooms, respectively. Multiple peaks were observed in the number concentration distribution data, with particle diameters peaking at 40-50 and 100-300 nm. The SEM analysis of the contaminated fabric surface found test particles and other environmental particles. The elemental composition analysis gave a detectable response for the studied aluminum oxide particles. The laboratory coat primarily made of cotton woven material is not recommended for worker protection against nanoparticle exposure because it showed the highest particle contamination and release. In addition, the results demonstrate that a well-controlled (cleanroom) environment is critical for investigating the factors affecting nanoparticle interaction with protective clothing. © The Author 2015. Published by Oxford University Press on behalf of the British Occupational Hygiene Society.

  17. Refinement and evaluation of the Massachusetts firm-yield estimator model version 2.0

    USGS Publications Warehouse

    Levin, Sara B.; Archfield, Stacey A.; Massey, Andrew J.

    2011-01-01

    The firm yield is the maximum average daily withdrawal that can be extracted from a reservoir without risk of failure during an extended drought period. Previously developed procedures for determining the firm yield of a reservoir were refined and applied to 38 reservoir systems in Massachusetts, including 25 single- and multiple-reservoir systems that were examined during previous studies and 13 additional reservoir systems. Changes to the firm-yield model include refinements to the simulation methods and input data, as well as the addition of several scenario-testing capabilities. The simulation procedure was adapted to run at a daily time step over a 44-year simulation period, and daily streamflow and meteorological data were compiled for all the reservoirs for input to the model. Another change to the model-simulation methods is the adjustment of the scaling factor used in estimating groundwater contributions to the reservoir. The scaling factor is used to convert the daily groundwater-flow rate into a volume by multiplying the rate by the length of reservoir shoreline that is hydrologically connected to the aquifer. Previous firm-yield analyses used a constant scaling factor that was estimated from the reservoir surface area at full pool. The use of a constant scaling factor caused groundwater flows during periods when the reservoir stage was very low to be overestimated. The constant groundwater scaling factor used in previous analyses was replaced with a variable scaling factor that is based on daily reservoir stage. This change reduced instability in the groundwater-flow algorithms and produced more realistic groundwater-flow contributions during periods of low storage. Uncertainty in the firm-yield model arises from many sources, including errors in input data. The sensitivity of the model to uncertainty in streamflow input data and uncertainty in the stage-storage relation was examined. 
A series of Monte Carlo simulations were performed on 22 reservoirs to assess the sensitivity of firm-yield estimates to errors in daily-streamflow input data. Results of the Monte Carlo simulations indicate that underestimation of the lowest stream inflows can cause firm yields to be underestimated by an average of 1 to 10 percent. Errors in the stage-storage relation can arise when the point density of bathymetric survey measurements is too low. Existing bathymetric surfaces were resampled using hypothetical transects of varying patterns and point densities in order to quantify the uncertainty in stage-storage relations. Reservoir-volume calculations and resulting firm yields were accurate to within 5 percent when point densities were greater than 20 points per acre of reservoir surface. Methods for incorporating summer water-demand-reduction scenarios into the firm-yield model were developed, as well as the ability to relax the no-fail reliability criterion. Although the original firm-yield model allowed monthly reservoir releases to be specified, there have been no previous studies examining the feasibility of controlled releases for downstream flows from Massachusetts reservoirs. Two controlled-release scenarios were tested, with and without a summer water-demand-reduction scenario, under both a no-fail criterion and a criterion allowing a 1-percent failure rate over the entire simulation period. Based on these scenarios, about one-third of the reservoir systems were able to support the flow-release scenarios at their 2000–2004 usage rates. Reservoirs with higher storage ratios (reservoir storage capacity to mean annual streamflow) and lower demand ratios (mean annual water demand to annual firm yield) were capable of higher downstream release rates. For the purposes of this research, all reservoir systems were assumed to have structures which enable controlled releases, although this assumption may not be true for many of the reservoirs studied.
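
The firm-yield definition (the largest constant withdrawal that never empties the reservoir over the simulation period) can be sketched as a daily storage balance plus a bisection search on demand. The inflow record and reservoir parameters below are synthetic; the actual estimator also models groundwater contributions, evaporation and usage patterns.

```python
def survives(inflows, capacity, start_storage, demand):
    """Daily mass balance with spill at capacity; True if storage
    never goes dry at this constant daily demand."""
    storage = start_storage
    for inflow in inflows:
        storage = min(capacity, storage + inflow - demand)
        if storage < 0.0:
            return False
    return True

def firm_yield(inflows, capacity, start_storage, tol=1e-6):
    """Bisection on demand: survives() is monotone in demand, so the
    feasible/infeasible boundary is the firm yield (no-fail criterion)."""
    lo, hi = 0.0, (start_storage + sum(inflows)) / len(inflows)
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if survives(inflows, capacity, start_storage, mid):
            lo = mid
        else:
            hi = mid
    return lo

# Synthetic 40-day inflow record (volume units per day).
inflows = [5.0, 0.0, 0.0, 10.0, 0.0, 0.0, 5.0, 20.0, 0.0, 0.0] * 4
fy = firm_yield(inflows, capacity=30.0, start_storage=15.0)
```

Relaxing the no-fail criterion, as the refined model allows, amounts to accepting a bounded number of dry days inside `survives` instead of failing on the first one.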

  18. Modelling Behaviour of a Carbon Epoxy Composite Exposed to Fire: Part I—Characterisation of Thermophysical Properties

    PubMed Central

    Tranchard, Pauline; Samyn, Fabienne; Duquesne, Sophie; Estèbe, Bruno; Bourbigot, Serge

    2017-01-01

    Thermophysical properties of a carbon-reinforced epoxy composite laminate (T700/M21 composite for aircraft structures) were evaluated using different innovative characterisation methods. Thermogravimetric Analysis (TGA), Simultaneous Thermal Analysis (STA), Laser Flash Analysis (LFA), and Fourier Transform Infrared (FTIR) analysis were used to measure the thermal decomposition, the specific heat capacity and the anisotropic thermal conductivity of the composite, the heats of decomposition, and the specific heat capacity of the released gases. This provides the input data needed to feed a three-dimensional (3D) model that predicts the temperature profile and the mass loss during well-defined fire scenarios (the model is presented in Part II of this paper). The measurements were optimised to obtain accurate data, which also serve to create a public database on an aeronautical carbon fibre/epoxy composite for fire safety engineering. PMID:28772854

  19. The DRG shift: a new twist for ICD-10 preparation.

    PubMed

    Long, Peri L

    2012-06-01

    Analysis of your specific business is a key component of ICD-10 implementation. Understanding your organization's current reimbursement trends will go a long way toward assessing and preparing for the impact of ICD-10 in your environment. If you cannot prepare for every detailed scenario, remember that much of the analysis and resolution relies on familiar coding, DRG analysis, and claims-processing best practices; they simply have the new twist of researching new codes and some new concepts. The news of a delay in the implementation compliance date, along with the release of grouper Version 29, should encourage your educational and business analysis efforts. This is a great opportunity to maintain open communication with the Centers for Medicare & Medicaid Services, the Department of Health and Human Services, and the Centers for Disease Control and Prevention. It is also a key time to report any unusual or discrepant findings in order to provide input to the final rule.

  20. Developing Health-Based Pre-Planning Clearance Goals for Airport Remediation Following a Chemical Terrorist Attack: Decision Criteria for Multipathway Exposure Routes

    PubMed Central

    Watson, Annetta; Dolislager, Fredrick; Hall, Linda; Raber, Ellen; Hauschild, Veronique D.; Love, Adam H.

    2011-01-01

    In the event of a chemical terrorist attack on a transportation hub, post-event remediation and restoration activities necessary to attain unrestricted facility re-use and re-entry could require hours to multiple days. While timeframes are dependent on numerous variables, a primary controlling factor is the level of pre-planning and decision-making completed prior to chemical release. What follows is the second of a two-part analysis identifying key considerations, critical information and decision criteria to facilitate post-attack and post-decontamination consequence management activities. Decision criteria analysis presented here provides first-time, open-literature documentation of multi-pathway, health-based remediation exposure guidelines for selected toxic industrial compounds, chemical warfare agents, and agent degradation products for pre-planning application in anticipation of a chemical terrorist attack. Guideline values are provided for inhalation and direct ocular vapor exposure routes as well as percutaneous vapor, surface contact, and ingestion. Target populations include various employees as well as transit passengers. This work has been performed as a national case study conducted in partnership with the Los Angeles International Airport and The Bradley International Terminal. All recommended guidelines have been selected for consistency with airport scenario release parameters of a one-time, short-duration, finite airborne release from a single source followed by compound-specific decontamination. PMID:21399674

  1. Developing health-based pre-planning clearance goals for airport remediation following a chemical terrorist attack: Decision criteria for multipathway exposure routes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Watson, Annetta Paule; Dolislager, Frederick; Hall, Dr. Linda

    2011-01-01

    In the event of a chemical terrorist attack on a transportation hub, post-event remediation and restoration activities necessary to attain unrestricted facility re-use and re-entry could require hours to multiple days. While timeframes are dependent on numerous variables, a primary controlling factor is the level of pre-planning and decision-making completed prior to chemical release. What follows is the second of a two-part analysis identifying key considerations, critical information and decision criteria to facilitate post-attack and post-decontamination consequence management activities. Decision criteria analysis presented here provides first-time, open-literature documentation of multi-pathway, health-based remediation exposure guidelines for selected toxic industrial compounds, chemical warfare agents, and agent degradation products for pre-planning application in anticipation of a chemical terrorist attack. Guideline values are provided for inhalation and direct ocular vapor exposure routes as well as percutaneous vapor, surface contact, and ingestion. Target populations include various employees as well as transit passengers. This work has been performed as a national case study conducted in partnership with the Los Angeles International Airport and The Bradley International Terminal. All recommended guidelines have been selected for consistency with airport scenario release parameters of a one-time, short-duration, finite airborne release from a single source followed by compound-specific decontamination.

  2. Gaia Confirms that SDSS J102915+172927 is a Dwarf Star

    NASA Astrophysics Data System (ADS)

    Bonifacio, P.; Caffau, E.; Spite, M.; Spite, F.; François, P.; Zaggia, S.; Arenou, F.; Haigron, R.; Leclerc, N.; Marchal, O.; Panuzzo, P.; Plum, G.; Sartoretti, P.

    2018-05-01

    The Gaia Data Release 2 provides a parallax of 0.734+/-0.073 mas for SDSS J102915+172927, currently the most metal-poor object known. This parallax implies that it is a dwarf star, ruling out the scenario that it is a subgiant. The subgiant scenario had as a corollary that the star had formed in a medium highly enriched in C, making line cooling efficient during the collapse, and also highly enriched in Fe by Type Ia SNe. That scenario can now be ruled out for this star, reinforcing the need for dust cooling and fragmentation to explain its formation.

  3. 300 Area Building Retention Evaluation Mitigation Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    D. J. McBride

    2007-07-03

    This evaluation addresses the long-term retention of several facilities associated with the PNNL Capability Replacement Laboratory and other Hanford mission needs. WCH prepared a mitigation plan for three scenarios with different release dates for specific buildings. The evaluations present a proposed plan for providing utility services to retained facilities in support of a long-term (+20 year) lifespan, in addition to temporary services to buildings with specified delayed release dates.

  4. Modeling of occupational exposure to accidentally released manufactured nanomaterials in a production facility and calculation of internal doses by inhalation

    PubMed Central

    Vaquero-Moralejo, Celina; Jaén, María; Lopez De Ipiña Peña, Jesús; Neofytou, Panagiotis

    2016-01-01

    Background Occupational exposure to manufactured nanomaterials (MNMs) and its potential health impacts are of scientific and practical interest, as previous epidemiological studies associate exposure to nanoparticles with health effects, including increased morbidity of the respiratory and circulatory systems. Objectives To estimate the occupational exposure and effective internal doses in a real production facility of TiO2 MNMs during hypothetical scenarios of accidental release. Methods Commercial software for geometry and mesh generation, as well as fluid flow and particle dispersion calculation, was used to estimate occupational exposure to MNMs. The results were fed into in-house software to calculate internal doses in the human respiratory tract by inhalation. Results Depending on the accidental scenario, different areas of the production facility were affected by the released MNMs, with a higher dose among individuals closer to the particle source. Conclusions Because an accidental release of particles can only be studied when one happens to occur, this numerical approach provides valuable information regarding occupational exposure and contributes to better protection of personnel. The methodology can be used to identify occupational settings where exposure to MNMs would be high during accidents, providing insight to health and safety officials. PMID:27670588

  5. Toxic industrial chemical (TIC) source emissions modeling for pressurized liquefied gases

    NASA Astrophysics Data System (ADS)

    Britter, Rex; Weil, Jeffrey; Leung, Joseph; Hanna, Steven

    2011-01-01

    The objective of this article is to report current toxic industrial chemical (TIC) source emissions formulas appropriate for use in comprehensive atmospheric risk assessment models so as to represent state-of-the-art knowledge. The focus is on high-priority scenarios, including two-phase releases of pressurized liquefied gases such as chlorine from rail cars. The total mass released and the release duration are major parameters, as are the velocity, thermodynamic state, and amount and droplet sizes of embedded aerosols of the material at the exit of the rupture, which are required as inputs to the subsequent jet and dispersion modeling. Because of the many possible release scenarios that could develop, a suite of model equations has been described. These allow for gas, two-phase, or liquid storage and release through ruptures of various types, including sharp-edged and "pipe-like" ruptures. Model equations for jet depressurization and phase change due to flashing are available. Consideration of the importance of vessel response to a rupture is introduced. The breakup of the jet into fine droplets and their subsequent suspension and evaporation, or rainout, is still a significant uncertainty in the overall modeling process. The recommended models are evaluated with data from various TIC field experiments, in particular recent experiments with pressurized liquefied gases. Models typically show a factor-of-two error compared with research-grade observations of mass flow rates. However, biases are present in models' estimates of the droplet size distributions resulting from flashing releases.
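    The flashing behavior described above can be illustrated with the standard adiabatic flash-fraction relation, a textbook sketch rather than the article's full suite of equations; the chlorine property values used below are approximate literature figures included only for illustration.

```python
# Adiabatic flash fraction for a pressurized liquefied gas release:
# the fraction of superheated liquid that vaporizes immediately on
# depressurization to atmospheric pressure. Standard textbook relation;
# property values are approximate figures for chlorine, for illustration.

def flash_fraction(cp_liquid, t_storage, t_boil, h_vap):
    """theta = cp * (T_storage - T_boil) / h_vap, clamped to [0, 1]."""
    theta = cp_liquid * (t_storage - t_boil) / h_vap
    return min(max(theta, 0.0), 1.0)

# Chlorine stored at ambient temperature (approximate properties):
cp_liquid = 926.0    # J/(kg K), liquid specific heat
t_storage = 293.0    # K, storage temperature
t_boil = 239.1       # K, normal boiling point
h_vap = 2.88e5       # J/kg, latent heat of vaporization

theta = flash_fraction(cp_liquid, t_storage, t_boil, h_vap)
# Roughly 17% of the released liquid flashes to vapor here; the rest
# partitions between embedded aerosol and rainout, the main uncertainty
# noted in the abstract.
```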

  6. ICP-MS analysis of fission product diffusion in graphite for High-Temperature Gas-Cooled Reactors

    NASA Astrophysics Data System (ADS)

    Carter, Lukas M.

    Release of radioactive fission products from nuclear fuel during normal reactor operation or in accident scenarios is a fundamental safety concern. Of paramount importance are the understanding and elucidation of mechanisms of chemical interaction, nuclear interaction, and transport phenomena involving fission products. Worldwide efforts to reduce fossil fuel dependence coupled with an increasing overall energy demand have generated renewed enthusiasm toward nuclear power technologies, and as such, these mechanisms continue to be the subjects of vigorous research. High-Temperature Gas-Cooled Reactors (HTGRs or VHTRs) remain one of the most promising candidates for the next generation of nuclear power reactors. An extant knowledge gap specific to HTGR technology derives from an incomplete understanding of fission product transport in major core materials under HTGR operational conditions. Our specific interest in the current work is diffusion in reactor graphite. Development of methods for analysis of diffusion of multiple fission products is key to providing accurate models for fission product release from HTGR core components and the reactor as a whole. In the present work, a specialized diffusion cell has been developed and constructed to facilitate real-time diffusion measurements via ICP-MS. The cell utilizes a helium gas-jet system which transports diffusing fission products to the mass spectrometer using carbon nanoparticles. The setup was designed to replicate conditions present in a functioning HTGR, and can be configured for real-time release or permeation measurements of single or multiple fission products from graphite or other core materials. In the present work, we have analyzed release rates of cesium in graphite grades IG-110, NBG-18, and a commercial grade of graphite, as well as release of iodine in IG-110. 
Additionally we have investigated infusion of graphite samples with Cs, I, Sr, Ag, and other surrogate fission products for use in release or profile measurements of diffusion coefficients.

  7. Modelling and assessment of accidental oil release from damaged subsea pipelines.

    PubMed

    Li, Xinhong; Chen, Guoming; Zhu, Hongwei

    2017-10-15

    This paper develops a 3D, transient, mathematical model to estimate the oil release rate and simulate oil dispersion behavior. The Euler-Euler method is used to estimate the subsea oil release rate, while the Eulerian-Lagrangian method is employed to track the migration trajectory of oil droplets. The model accounts for the quantitative effect of backpressure and hole size on the oil release rate, and the influence of oil release rate, oil density, current speed, water depth and leakage position on oil migration is also investigated. Finally, the results (e.g., the transient oil release rate, the rise time of oil, and the dispersion distance) are determined by the model, and the oil release and dispersion behavior under different scenarios is revealed. These assessment results can provide useful guidance for detecting the leakage position and placing oil containment booms. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. Releases of whooping cranes to the Florida nonmigratory flock: a structured decision-making approach: report to the International Whooping Crane Recovery Team, September 22, 2008

    USGS Publications Warehouse

    Moore, Clinton T.; Converse, Sarah J.; Folk, Martin J.; Boughton, Robin; Brooks, Bill; French, John B.; O'Meara, Timothy; Putnam, Michael; Rodgers, James; Spalding, Marilyn

    2008-01-01

    We used a structured decision-making approach to inform the decision of whether the Florida Fish and Wildlife Conservation Commission should request of the International Whooping Crane Recovery Team that additional whooping crane chicks be released into the Florida Non-Migratory Population (FNMP). Structured decision-making is an application of decision science that strives to produce transparent, replicable, and defensible decisions that recognize the appropriate roles of management policy and science in decision-making. We present a multi-objective decision framework, where management objectives include successful establishment of a whooping crane population in Florida, minimization of costs, positive public relations, information gain, and providing a supply of captive-reared birds to alternative crane release projects, such as the Eastern Migratory Population. We developed models to predict the outcome relative to each of these objectives under 29 different scenarios of the release methodology used from 1993 to 2004, including options of no further releases and variable numbers of releases per year over the next 5-30 years. In particular, we developed a detailed set of population projection models, which make substantially different predictions about the probability of successful establishment of the FNMP. We used expert elicitation to develop prior model weights (measures of confidence in population model predictions); the results of the population model weighting and model-averaging exercise indicated that the probability of successful establishment of the FNMP ranged from 9% if no additional releases are made, to as high as 41% with additional releases. We also used expert elicitation to develop weights (relative values) on the set of identified objectives, and we then used a formal optimization technique for identifying the optimal decision, which considers the tradeoffs between objectives. 
The optimal decision was identified as release of 3 cohorts (24 birds) per year over the next 10 years. However, any decision that involved release of 1-3 cohorts (8-24 birds) per year over the next 5 to 20 years, as well as decisions that involve skipping releases in every other year, performed better in our analysis than the alternative of no further releases. These results were driven by the relatively high objective weights that experts placed on the population objective (i.e., successful establishment of the FNMP) and the information gain objective (where releases are expected to accelerate learning on what was identified as a primary uncertainty: the demographic performance of wild-hatched birds). Additional considerations that were not formally integrated into the analysis are also discussed.

  9. Risk analysis of technological hazards: Simulation of scenarios and application of a local vulnerability index.

    PubMed

    Sanchez, E Y; Represa, S; Mellado, D; Balbi, K B; Acquesta, A D; Colman Lerner, J E; Porta, A A

    2018-06-15

    The potential impact of a technological accident can be assessed by risk estimation, so that the latent or potential condition can be recognized and mitigated. In this work we propose a methodology to estimate the risk of technological hazards, focused on two components: first, the processing of meteorological databases to define the most probable and conservative scenario of study, and second, the application of a local social vulnerability index to classify the population. In this case study, the risk was estimated for a hypothetical release of liquefied ammonia in a meat-packing plant in the city of La Plata, Argentina. The method consists of integrating the toxic threat zone simulated with the ALOHA software and the layer of sociodemographic classification of the affected population. The results show the areas associated with higher risks of exposure to ammonia, which warrant attention for disaster prevention in the region. Advantageously, this systemic approach is methodologically flexible, as it can be applied in various scenarios based on the available information on both the exposed population and its meteorology. Furthermore, the methodology optimizes the processing of the input data and its calculation. Copyright © 2018 Elsevier B.V. All rights reserved.
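    The core integration step, overlaying a simulated threat zone with a vulnerability classification per area unit, can be sketched as follows; the hazard levels and vulnerability classes below are illustrative placeholders, not values from the study.

```python
# Ordinal risk score per grid cell: risk = hazard level x vulnerability.
# The hazard levels stand in for ALOHA-style threat zones, and the 1..3
# vulnerability classes stand in for a local social vulnerability index.

HAZARD = {"red": 3, "orange": 2, "yellow": 1, "outside": 0}

def cell_risk(threat_zone, vulnerability_class):
    """Combine the threat-zone level with the cell's vulnerability class."""
    return HAZARD[threat_zone] * vulnerability_class

# Four hypothetical census blocks: (threat zone, vulnerability class).
cells = [("red", 3), ("orange", 1), ("yellow", 2), ("outside", 3)]
scores = [cell_risk(zone, vuln) for zone, vuln in cells]  # [9, 2, 2, 0]
```

The highest scores flag the areas where a high-hazard zone coincides with a highly vulnerable population, which is the prioritization the abstract describes.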

  10. Projecting the release of carbon from permafrost soils using a perturbed parameter ensemble modelling approach

    NASA Astrophysics Data System (ADS)

    MacDougall, Andrew H.; Knutti, Reto

    2016-04-01

    The soils of the northern hemispheric permafrost region are estimated to contain 1100 to 1500 Pg of carbon. A substantial fraction of this carbon has been frozen and therefore protected from microbial decay for millennia. As anthropogenic climate warming progresses much of this permafrost is expected to thaw. Here we conduct perturbed-parameter experiments on a climate model of intermediate complexity, with an improved permafrost carbon module, to estimate with formal uncertainty bounds the release of carbon from permafrost soils by the years 2100 and 2300 CE. We estimate that by year 2100 the permafrost region may release between 56 (13 to 118) Pg C under Representative Concentration Pathway (RCP) 2.6 and 102 (27 to 199) Pg C under RCP 8.5, with substantially more to be released under each scenario by the year 2300. Our analysis suggests that the two parameters that contribute most to the uncertainty in the release of carbon from permafrost soils are the size of the non-passive fraction of the permafrost carbon pool and the equilibrium climate sensitivity. A subset of 25 model variants are integrated 8000 years into the future under continued RCP forcing. Under the moderate RCP 4.5 forcing a remnant near-surface permafrost region persists in the high Arctic, eventually developing a new permafrost carbon pool. Overall our simulations suggest that the permafrost carbon cycle feedback to climate change will make a significant contribution to climate change over the next centuries and millennia, releasing a quantity of carbon equal to 3 to 54% of the cumulative anthropogenic total.
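    How central estimates with uncertainty bounds such as "102 (27 to 199) Pg C" fall out of a perturbed-parameter ensemble can be sketched as below. The toy emulator is a stand-in for the intermediate-complexity climate model, and its parameter ranges are illustrative assumptions, not the study's values.

```python
import random
import statistics

random.seed(1)

def toy_permafrost_release(nonpassive_frac, climate_sensitivity):
    """Placeholder emulator: released carbon scales with the two parameters
    the study identifies as dominating the uncertainty."""
    pool = 1300.0  # Pg C, mid-range permafrost carbon stock from the abstract
    return pool * nonpassive_frac * (climate_sensitivity / 3.0) * 0.1

# One model run per joint parameter draw (ranges are illustrative):
ensemble = sorted(toy_permafrost_release(random.uniform(0.3, 0.9),
                                         random.uniform(1.5, 4.5))
                  for _ in range(1000))
median = statistics.median(ensemble)
low, high = ensemble[50], ensemble[949]  # ~5th and ~95th percentiles
```

Reporting the median with the 5th-95th percentile range of the sorted ensemble yields the "central (low to high)" form used in the abstract.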

  11. Hydrogen release from 800 MeV proton-irradiated tungsten

    NASA Astrophysics Data System (ADS)

    Oliver, B. M.; Venhaus, T. J.; Causey, R. A.; Garner, F. A.; Maloy, S. A.

    2002-12-01

    Tungsten irradiated in spallation neutron sources, such as those proposed for the accelerator production of tritium (APT) project, will contain large quantities of generated helium and hydrogen gas. Tungsten used in proposed fusion reactors will also be exposed to neutrons, and the generated protium will be accompanied by deuterium and tritium diffusing in from the plasma-facing surface. The release kinetics of these gases during various off-normal scenarios involving loss of coolant and afterheat-induced rises in temperature are of particular interest for both applications. To determine the release kinetics of hydrogen from tungsten, tungsten rods irradiated with 800 MeV protons in the Los Alamos Neutron Science Center (LANSCE) to high exposures as part of the APT project have been examined. Hydrogen evolution from the tungsten has been measured using a dedicated mass-spectrometer system by subjecting the specimens to an essentially linear temperature ramp from ~300 to ~1500 K. Release profiles are compared with predictions obtained using the Tritium Migration Analysis Program (TMAP4). The measurements show that for high proton doses, the majority of the hydrogen is released gradually, starting at about 900 K and reaching a maximum at about 1400 K, after which release drops fairly rapidly. Comparisons with TMAP show quite reasonable agreement using a trap energy of 1.4 eV and a trap density of ~7%. There is a small additional release fraction occurring at ~550 K, which is believed to be associated with low-energy trapping at or near the surface and was therefore not included in the bulk TMAP model.

  12. Exposure to airborne engineered nanoparticles in the indoor environment

    NASA Astrophysics Data System (ADS)

    Vance, Marina E.; Marr, Linsey C.

    2015-04-01

    This literature review assesses the current state of knowledge about inhalation exposure to airborne, engineered nanoparticles in the indoor environment. We present principal exposure scenarios in indoor environments, complemented by analysis of the published literature and of an inventory of nanotechnology-enhanced consumer products. Of all products listed in the inventory, 10.8% (194 products) present the potential for aerosolization of nanomaterials and subsequent inhalation exposure during use or misuse. Among those, silver-containing products are the most prevalent (68 products). Roughly 50% of products would release wet aerosols and 50% would potentially release dry aerosols. Approximately 14% are cleaning products that can be broadly used in public indoor environments, where building occupants may be exposed. While a variety of nanomaterial compositions have been investigated in the limited number of published release and exposure studies, we identified a need for studies investigating nanofibers (beyond carbon nanotubes), nanofilms, nanoplatelets, and other emerging nanomaterials such as ceria and their nanocomposites. Finally, we provide recommendations for future research to advance the understanding of exposure to airborne nanomaterials indoors, such as studies into indoor chemistry of nanomaterials, better nanomaterial reporting and labeling in consumer products, and safer design of nanomaterial-containing consumer products.

  13. Modeling Streamflow and Water Temperature in the North Santiam and Santiam Rivers, Oregon, 2001-02

    USGS Publications Warehouse

    Sullivan, Annett B.; Rounds, Stewart A.

    2004-01-01

    To support the development of a total maximum daily load (TMDL) for water temperature in the Willamette Basin, the laterally averaged, two-dimensional model CE-QUAL-W2 was used to construct a water temperature and streamflow model of the Santiam and North Santiam Rivers. The rivers were simulated from downstream of Detroit and Big Cliff dams to the confluence with the Willamette River. Inputs to the model included bathymetric data, flow and temperature from dam releases, tributary flow and temperature, and meteorologic data. The model was calibrated for the period July 1 through November 21, 2001, and confirmed with data from April 1 through October 31, 2002. Flow calibration made use of data from two streamflow gages and travel-time and river-width data. Temperature calibration used data from 16 temperature monitoring locations in 2001 and 5 locations in 2002. A sensitivity analysis was completed by independently varying input parameters, including point-source flow, air temperature, flow and water temperature from dam releases, and riparian shading. Scenario analyses considered hypothetical river conditions without anthropogenic heat inputs, with restored riparian vegetation, with minimum streamflow from the dams, and with a more-natural seasonal water temperature regime from dam releases.

  14. Derivation of risk indices and analysis of variability for the management of incidents involving the transport of nuclear materials in the Northern Seas.

    PubMed

    Brown, J; Hosseini, A; Karcher, M; Kauker, F; Dowdall, M; Schnur, R; Strand, P

    2016-04-15

    The transport of nuclear or radioactive materials and the presence of nuclear-powered vessels pose risks to the Northern Seas in terms of potential impacts on man and the environment as well as socio-economic impacts. Management of incidents involving actual or potential releases to the marine environment is potentially difficult due to the complexity of the environment into which the release may occur and difficulties in quantifying risk to both man and the environment. In order to address this, a state-of-the-art oceanographic model was used to characterize the underlying variability for a specific radionuclide release scenario. The resultant probabilistic data were used as inputs to transfer and dose models, providing an indication of potential impacts on man and the environment. This characterization was then employed to facilitate a rapid means of quantifying risk to man and the environment that included and addressed this variability. The radionuclide-specific risk indices derived can be applied by simply multiplying the reported values by the magnitude of the source term and then summing over all radionuclides to provide an indication of total risk. Copyright © 2016. Published by Elsevier Ltd.
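    The application rule described in the abstract (multiply each reported index by the source-term magnitude, then sum over radionuclides) reduces to a one-line computation. The index and source-term values below are illustrative placeholders, not figures from the paper.

```python
# Per-unit-release risk indices (risk per Bq released) and a hypothetical
# source term. Both dictionaries hold made-up illustrative numbers.
risk_index = {"Cs-137": 2.0e-15, "Sr-90": 8.0e-16, "Pu-239": 5.0e-14}  # per Bq
source_term = {"Cs-137": 1.0e14, "Sr-90": 3.0e13, "Pu-239": 5.0e11}    # Bq released

# Total risk = sum over radionuclides of (index x released activity).
total_risk = sum(risk_index[nuc] * source_term[nuc] for nuc in source_term)
```

The appeal of the approach is exactly this separability: the expensive oceanographic and dose modeling is folded into the per-becquerel indices once, and incident responders only supply the source term.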

  15. Flu, risks, and videotape: escalating fear and avoidance.

    PubMed

    Rosoff, Heather; John, Richard S; Prager, Fynnwin

    2012-04-01

    While extensive risk perception research has focused on emotions, cognitions, and behavior at static points in time, less attention has been paid to how these variables might change over time. This study assesses how negative affect, threat beliefs, perceived risk, and intended avoidance behavior change over the course of an escalating biological disaster. A scenario simulation methodology was used that presents respondents with a video simulation of a 15-day series of local news reports to immerse them in the developing details of the disaster. Systematic manipulation of the virus's causal origin (terrorist attack, medical lab accident, unknown) and the respondent's proximity to the virus (local vs. opposite coast) allowed us to investigate the dynamics of public response. The unfolding scenario was presented in discrete episodes, allowing responses to be tracked over the episodes. The sample includes 600 respondents, equally split by sex and by location, with half in the Washington, DC area and half in the Los Angeles area. The results showed that respondents' reactions to the flu epidemic increased as the disaster escalated. More importantly, there was considerable consistency across respondents' emotional, cognitive, and behavioral responses to the epidemic over the episodes. In addition, the reactions of respondents proximally closer to the epidemic increased more rapidly and with greater intensity than those of their distant counterparts. Finally, as the flu epidemic escalated, both terrorist and accidental flu releases were perceived as less risky and were less likely to lead to avoidance behavior compared with the unknown flu release. © 2012 Society for Risk Analysis.

  16. Hard choices in assessing survival past dams — a comparison of single- and paired-release strategies

    USGS Publications Warehouse

    Zydlewski, Joseph D.; Stich, Daniel S.; Sigourney, Douglas B.

    2017-01-01

    Mark–recapture models are widely used to estimate survival of salmon smolts migrating past dams. Paired releases have been used to improve estimate accuracy by removing components of mortality not attributable to the dam. This method is accompanied by reduced precision because (i) sample size is reduced relative to a single, large release; and (ii) variance calculations inflate error. We modeled an idealized system with a single dam to assess trade-offs between accuracy and precision and compared methods using root mean squared error (RMSE). Simulations were run under predefined conditions (dam mortality, background mortality, detection probability, and sample size) to determine scenarios when the paired release was preferable to a single release. We demonstrate that a paired-release design provides a theoretical advantage over a single-release design only at large sample sizes and high probabilities of detection. At release numbers typical of many survival studies, paired release can result in overestimation of dam survival. Failures to meet model assumptions of a paired release may result in further overestimation of dam-related survival. Under most conditions, a single-release strategy was preferable.
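    The accuracy-versus-precision trade-off described above can be reproduced with a small Monte Carlo sketch. This is an idealization (perfect detection, known background survival, illustrative sample sizes), not the authors' mark-recapture formulation.

```python
import random

random.seed(42)
S_DAM, S_BG = 0.90, 0.95   # true dam-related and background survival
TRUE = S_DAM

def observed_survival(n, s):
    """Fraction of n released smolts surviving, with perfect detection."""
    return sum(random.random() < s for _ in range(n)) / n

def rmse(estimates):
    return (sum((e - TRUE) ** 2 for e in estimates) / len(estimates)) ** 0.5

single, paired = [], []
for _ in range(2000):
    # Single release: one group of 400 fish passes the dam plus the
    # downstream reach, so the estimate absorbs background mortality.
    single.append(observed_survival(400, S_DAM * S_BG))
    # Paired release: the same 400 fish split between a treatment group
    # (dam + reach) and a control group (reach only); the ratio removes
    # background mortality but inflates variance.
    treatment = observed_survival(200, S_DAM * S_BG)
    control = observed_survival(200, S_BG)
    paired.append(treatment / control if control > 0 else 1.0)

# single is biased low; paired is nearly unbiased with larger variance.
```

Re-running with smaller releases or imperfect detection shifts the RMSE comparison back toward the single-release design, mirroring the paper's conclusion that the paired design pays off only at large sample sizes and high detection probabilities.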

  17. EPA QUICK REFERENCE GUIDES

    EPA Science Inventory

    EPA Quick Reference Guides are compilations of information on chemical and biological terrorist agents. The information is presented in consistent format and includes agent characteristics, release scenarios, health and safety data, real-time field detection, effect levels, samp...

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bourham, Mohamed A.; Gilligan, John G.

    Safety considerations in large future fusion reactors like ITER are important before licensing the reactor. Several scenarios are considered hazardous, including safety of plasma-facing components during hard disruptions, high heat fluxes and thermal stresses during normal operation, accidental energy release, and aerosol formation and transport. Disruption events in large tokamaks like ITER are expected to produce local heat fluxes on plasma-facing components that may exceed 100 GW/m² over a period of about 0.1 ms. As a result, the surface temperature dramatically increases, which results in surface melting and vaporization and produces thermal stresses and surface erosion. Plasma-facing component safety issues extend to cover a wide range of possible scenarios, including disruption severity and the impact of plasma-facing components on disruption parameters, accidental energy release and short/long term LOCAs, and formation of airborne particles by convective current transport during a LOVA (water/air ingress disruption) accident scenario. Study and evaluation of disruption-induced aerosol generation and mobilization are essential to characterize the database on particulate formation and distribution for a large future fusion tokamak reactor like ITER. In order to provide a database relevant to ITER, the SIRENS electrothermal plasma facility at NCSU has been modified to closely simulate heat fluxes expected in ITER.

  19. Multimedia Environmental Distribution of Nanomaterials

    NASA Astrophysics Data System (ADS)

    Liu, Haoyang Haven

    Engineered nanomaterials (ENMs), which may be released to the environment due to human-related activities, can move across environmental phase boundaries and be found in most media. Given the rapid development and growing applications of nanotechnology, there is concern about, and thus the need to assess, the potential environmental impact associated with ENMs. Accordingly, a modeling platform was developed to enable evaluation of the dynamic multimedia environmental distribution of ENMs (MendNano) and the range of potential ENM exposure concentrations. MendNano is based on a dynamic multimedia compartmental modeling approach guided by detailed analysis of the agglomeration of ENMs, life-cycle-analysis-based estimates of their potential release to the environment, and incorporation of mechanistic sub-models of various intermedia transport processes. Model simulations for various environmental scenarios indicated that ENM accumulation in the sediment increased significantly with increased ENM attachment to suspended solids in water. Atmospheric dry and wet deposition can be important pathways for ENM input to the terrestrial environment in the absence of direct and distributed ENM release to soil. Increased ENM concentration in water due to atmospheric deposition (wet and dry) is expected as direct ENM release to water diminishes. However, for soluble ENMs, dissolution can be the dominant pathway for suspended ENM removal from water, even compared with advective transport. For example, simulations for Los Angeles showed that dry deposition, rain scavenging, and wind dilution can remove 90% of ENMs from the atmospheric airshed in ~100-230 days, ~2-6 hrs, and ~0.5-2 days, respectively. For the evaluated ENMs (metal, metal oxides, carbon nanotubes (CNT), nanoclays), mass accumulation in the multimedia environment was mostly in the soil and sediment. 
Additionally, simulation results for TiO2 in Los Angeles demonstrate that ENM concentrations in air and water increase rapidly to reach steady state, 72 hrs and 8 days after the start of source release, respectively. After termination of the source release, ENM concentrations would decrease by 90% in ~1 and ~4 days. In contrast, steady state for ENM concentrations in soil would not be expected to occur until after about 10 years. MendNano was further integrated with a sub-model of lifecycle environmental assessment for the release of ENMs (LearNano). Estimation of the releases of various ENMs and their environmental distributions in various regions in the U.S. and countries throughout the world revealed that the exposure concentrations for most ENMs (e.g., metals, metal oxides and carbon nanotubes) are expected to be in the range of 0.0003 - 30 ng m-3 (air), 0.006 - 150 ng L-1 (water), 0.01 - 40 μg kg-1 (soil), and 0.005 - 100 mg kg-1 (sediment). It is important to note that the environmental transport of ENMs is governed by particulate transport processes; thus the transport rates of ENMs depend on their particle size distribution. Accordingly, a computational constant-number Direct Simulation Monte Carlo (DSMC) model was also developed to assess ENM agglomeration in aqueous systems by solving the Smoluchowski coagulation equation coupled with particle-particle interaction energies provided by the classical Derjaguin-Landau-Verwey-Overbeek (DLVO) theory and a non-DLVO hydration repulsion interaction energy. Prediction of ENM agglomerate PSDs demonstrated excellent agreement with experimental measurements for TiO2, CeO2, alpha-Fe2O3, SiO2, and C60 ENMs over a wide range of suspension conditions. Simulations also demonstrated, in quantitative agreement with DLS measurements, that nanoparticle agglomerate size increased both with ionic strength (IS) and as the solution pH approached the isoelectric point (IEP). 
Additionally, evaluation of experimental DLS measurements for TiO2, CeO2, SiO2, and alpha-Fe2O3 (hematite) at high IS (up to 900 mM) or low |zeta-potential| (≤1.35 mV) revealed that the non-DLVO hydration repulsion energy can exceed the electrostatic repulsion energy, such that the increased overall repulsion energy (contributed by hydration repulsion) can significantly lower the agglomerate diameter relative to the classical DLVO prediction. The classical DLVO theory is reasonably applicable for agglomeration of NPs of high |zeta-potential| (~>35 mV) in suspensions of low IS (~1 mM) or under low-|zeta-potential| (~<40 mV) conditions. In summary, the MendNano-LearNano integrated modeling platform was implemented as a web-based software application that enables rapid "what-if?" scenario analysis in order to assess the response of environmental systems to various scenarios of ENM releases, investigate the impact of geographical and meteorological parameters on ENM distribution in the environment, compare the impact of ENM production and potential releases on different regions, and estimate source release rates based on monitored ENM concentrations. It is envisioned that the present integrated modeling platform can serve as a decision support tool to rapidly and critically assess the potential environmental implications of ENMs and thus help ensure that nanotechnology is developed in a productive and environmentally responsible manner.
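The agglomeration trends described above hinge on the balance between van der Waals attraction and screened electrostatic repulsion. As a minimal sketch (not the paper's DSMC code), the classical sphere-sphere DLVO pair energy in the Derjaguin approximation can be evaluated as follows; the particle radius, zeta potential, and Hamaker constant are illustrative assumptions:

```python
import math

KB = 1.380649e-23        # Boltzmann constant, J/K
EPS0 = 8.8541878128e-12  # vacuum permittivity, F/m
EC = 1.602176634e-19     # elementary charge, C
NA = 6.02214076e23       # Avogadro constant, 1/mol

def debye_kappa(is_mM, eps_r=78.5, temp=298.15):
    """Inverse Debye length (1/m) for a 1:1 electrolyte at ionic strength is_mM."""
    n_ions = is_mM * 1e-3 * 1e3 * NA  # number density per ion species, 1/m^3
    return math.sqrt(2.0 * n_ions * EC**2 / (EPS0 * eps_r * KB * temp))

def dlvo_energy_kt(gap, radius, zeta, is_mM, hamaker=1e-20,
                   eps_r=78.5, temp=298.15):
    """Total DLVO energy (units of kT) between two equal spheres at surface
    separation `gap` (m): unretarded van der Waals attraction plus
    constant-potential electrostatic repulsion (Derjaguin approximation)."""
    kappa = debye_kappa(is_mM, eps_r, temp)
    v_vdw = -hamaker * radius / (12.0 * gap)
    v_edl = (2.0 * math.pi * EPS0 * eps_r * radius * zeta**2
             * math.log(1.0 + math.exp(-kappa * gap)))
    return (v_vdw + v_edl) / (KB * temp)

# Energy barrier for a hypothetical 50 nm radius particle with |zeta| = 40 mV:
gaps = [i * 1e-10 for i in range(1, 1000)]  # separations of 0.1-99.9 nm
barrier_1mM = max(dlvo_energy_kt(h, 50e-9, 0.040, 1.0) for h in gaps)
barrier_900mM = max(dlvo_energy_kt(h, 50e-9, 0.040, 900.0) for h in gaps)
```

At 1 mM the repulsive barrier is tens of kT (slow agglomeration), while at 900 mM the electrostatic screening collapses it, consistent with the IS trend reported above.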

  20. Reuse and recycle--considering the soil below constructions.

    PubMed

    Suer, Pascal; Wik, Ola; Erlandsson, Martin

    2014-07-01

    The European Construction Products Regulation provides a life-cycle-based framework for the environmental assessment of construction products. Harmonised European standards for the assessment of the release of dangerous substances and for declaration of environmental performance are in progress. Risk-based limit values for the protection of soil and groundwater below construction works will still be set nationally. In this paper we review the possibilities for expanding the ongoing harmonisation to include risk assessment and life cycle assessment (LCA). Based on reviews of national European limit value models (LVMs) for assessment of release to soil and groundwater, two areas for harmonisation emerge: 1- The toxicological criteria. Toxicological endpoints to protect human health and the environment are similar, and data from the same toxicological data sets are used to establish acceptance criteria. 2- The emission part of LVMs. We extracted six generic construction works for granular materials. These encompass the most common choices and span the different release scenarios applied. Harmonised emission models would also facilitate LCA and environmental product declaration (EPD). The immission or transport part of the LVMs is less promising for harmonisation. Locating the acceptance-criteria point of compliance close to the construction works is advantageous from many aspects and would facilitate harmonisation of assessments. We have identified two different strategies to include recycling in the assessments: 1- A tiered procedure where assessment and declaration of performance are made for the intended primary use of the product only, and renewed assessments are made whenever the construction works are demolished and the product is recovered. 2- A scenario-based procedure where future recycling scenarios, into new products and construction works, are forecasted.
In this case the initial assessment and declaration of environmental performance of a construction product is performed both for the intended primary use of the product and for the recycling scenarios. Copyright © 2014 Elsevier B.V. All rights reserved.

  1. The biopharmaceutics of successful controlled release drug product: Segmental-dependent permeability of glipizide vs. metoprolol throughout the intestinal tract.

    PubMed

    Zur, Moran; Cohen, Noa; Agbaria, Riad; Dahan, Arik

    2015-07-15

    The purpose of this work was to study the challenges and prospects of regional-dependent absorption in a controlled-release scenario, through the oral biopharmaceutics of the sulfonylurea antidiabetic drug glipizide. The BCS solubility class of glipizide was determined, and its physicochemical properties and intestinal permeability were thoroughly investigated, both in-vitro (PAMPA and Caco-2) and in-vivo in rats. Metoprolol was used as the low/high permeability class boundary marker. Glipizide was found to be a low-solubility compound. All intestinal permeability experimental methods revealed a similar trend: a mirror-image small-intestinal permeability pattern with opposite regional/pH-dependency was obtained, with a downward trend for glipizide and an upward trend for metoprolol. Yet the lowest permeability of glipizide (terminal ileum) was comparable to the lowest permeability of metoprolol (proximal jejunum). At the colon, similar permeability was evident for glipizide and metoprolol, which was higher than metoprolol's jejunal permeability. We present an analysis that identifies metoprolol's jejunal permeability as the low/high permeability class benchmark anywhere throughout the intestinal tract; we show that the permeability of both glipizide and metoprolol matches/exceeds this threshold throughout the entire intestinal tract, accounting for their success as controlled-release dosage forms. This represents a key biopharmaceutical characteristic for a successful controlled-release dosage form. Copyright © 2015 Elsevier B.V. All rights reserved.

  2. Cost-Effectiveness of the Freeze-All Policy.

    PubMed

    Roque, Matheus; Valle, Marcello; Guimarães, Fernando; Sampaio, Marcos; Geber, Selmo

    2015-08-01

    To evaluate the cost-effectiveness of freeze-all cycles when compared to fresh embryo transfer. This was an observational study with a cost-effectiveness analysis of 530 intracytoplasmic sperm injection (ICSI) cycles performed in a private center in Brazil between January 2012 and December 2013 - 351 fresh embryo transfers and 179 freeze-all cycles - with a gonadotropin-releasing hormone (GnRH) antagonist protocol and day 3 embryo transfers. The pregnancy rate was 31.1% in the fresh group and 39.7% in the freeze-all group. We performed two scenario analyses for costs. In scenario 1, we included those costs associated with the ICSI cycle (monitoring during controlled ovarian stimulation [COS], oocyte retrieval, embryo transfer, IVF laboratory, and medical costs), embryo cryopreservation of supernumerary embryos, hormone measurements during COS and endometrial priming, medication use (during COS, endometrial priming, and luteal phase support), ultrasound scans for frozen-thawed embryo transfer (FET), obstetric ultrasounds, and miscarriage. The total cost (in USD) per pregnancy was statistically lower in the freeze-all cycles (19,156.73 ± 1,732.99) than in the fresh cycles (23,059.72 ± 2,347.02). Even in scenario 2, when charging all of the patients in the freeze-all group for cryopreservation (regardless of supernumerary embryos) and for FET, the fresh cycles had a statistically significant increase in treatment costs per ongoing pregnancy. The results presented in this study suggest that the freeze-all policy is a cost-effective strategy when compared to fresh embryo transfer.
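The comparison above reduces to expected cost per achieved pregnancy: per-cycle cost divided by the success rate. A minimal sketch, using the pregnancy rates reported in the abstract and hypothetical per-cycle costs chosen only to illustrate the arithmetic:

```python
def cost_per_pregnancy(cost_per_cycle, pregnancy_rate):
    """Expected treatment cost per pregnancy: per-cycle cost divided by
    the probability that a cycle yields a pregnancy."""
    return cost_per_cycle / pregnancy_rate

# Pregnancy rates (31.1% fresh, 39.7% freeze-all) are from the study;
# the per-cycle costs in USD are hypothetical.
fresh = cost_per_pregnancy(7_171.0, 0.311)
freeze_all = cost_per_pregnancy(7_605.0, 0.397)
```

Note that the freeze-all arm can cost more per cycle yet less per pregnancy, because the higher success rate spreads the cost over fewer failed attempts.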

  3. Review of unsaturated-zone transport and attenuation of volatile organic compound (VOC) plumes leached from shallow source zones

    NASA Astrophysics Data System (ADS)

    Rivett, Michael O.; Wealthall, Gary P.; Dearden, Rachel A.; McAlary, Todd A.

    2011-04-01

    Reliable prediction of the unsaturated zone transport and attenuation of dissolved-phase VOC (volatile organic compound) plumes leached from shallow source zones is a complex, multi-process, environmental problem. It is an important problem as sources, which include solid-waste landfills, aqueous-phase liquid discharge lagoons and NAPL releases partially penetrating the unsaturated zone, may persist for decades. Natural attenuation processes operating in the unsaturated zone that, uniquely for VOCs includes volatilisation, may, however, serve to protect underlying groundwater and potentially reduce the need for expensive remedial actions. Review of the literature indicates that only a few studies have focused upon the overall leached VOC source and plume scenario as a whole. These are mostly modelling studies that often involve high strength, non-aqueous phase liquid (NAPL) sources for which density-induced and diffusive vapour transport is significant. Occasional dissolved-phase aromatic hydrocarbon controlled infiltration field studies also exist. Despite this lack of focus on the overall problem, a wide range of process-based unsaturated zone — VOC research has been conducted that may be collated to build good conceptual model understanding of the scenario, particularly for the much studied aromatic hydrocarbons and chlorinated aliphatic hydrocarbons (CAHs). In general, the former group is likely to be attenuated in the unsaturated zone due to their ready aerobic biodegradation, albeit with rate variability across the literature, whereas the fate of the latter is far less likely to be dominated by a single mechanism and dependent upon the relative importance of the various attenuation processes within individual site — VOC scenarios. 
Analytical and numerical modelling tools permit effective process representation of the whole scenario, albeit with potential for inclusion of additional processes (e.g., multi-mechanistic sorption phase partitioning), and provide good opportunity for further sensitivity analysis and development for practitioner use. There remains a significant need to obtain intermediate laboratory-scale and particularly field-scale (actual site and controlled release) datasets that address the scenario as a whole and permit validation of the available models. Integrated assessment of the range of simultaneous processes that combine to influence leached plume generation, transport and attenuation in the unsaturated zone is required. Component process research is required across the problem scenario, including: the simultaneous volatilisation and dissolution of source zones; development of appropriate field-scale dispersion estimates for the unsaturated zone; assessment of transient VOC exchanges between aqueous, vapour and sorbed phases and their influence upon plume attenuation; development of improved field methods to recognise and quantify biodegradation of CAHs; establishment of the influence of co-contaminants; and, finally, translation of research findings into more robust practitioner practice.
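As an illustration of the kind of process representation such analytical tools provide, the steady-state concentration of a leached VOC below a persistent source can be sketched with a one-dimensional advection-dispersion equation with first-order biodegradation. All parameter values below are hypothetical, not drawn from the review:

```python
import math

def attenuated_concentration(z, c0, v, disp, lam):
    """Steady-state relative concentration at depth z (m) below a source.

    Bounded solution of D*C'' - v*C' - lam*C = 0 (downward seepage velocity v,
    dispersion coefficient D, first-order biodegradation rate lam) with
    C(0) = c0 and C finite at depth.
    """
    r = (v - math.sqrt(v * v + 4.0 * disp * lam)) / (2.0 * disp)
    return c0 * math.exp(r * z)

# Hypothetical parameters: recharge 0.3 m/yr through 3 m of unsaturated
# zone, dispersion 0.05 m^2/yr, aerobic-degradation half-life 0.5 yr.
lam = math.log(2) / 0.5
c_at_water_table = attenuated_concentration(3.0, 1.0, 0.3, 0.05, lam)
```

Under these assumptions the plume is attenuated by orders of magnitude before reaching the water table, which is the protective effect the review attributes to ready aerobic biodegradation of aromatic hydrocarbons.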

  4. Thermal thresholds as predictors of seed dormancy release and germination timing: altitude-related risks from climate warming for the wild grapevine Vitis vinifera subsp. sylvestris.

    PubMed

    Orrù, Martino; Mattana, Efisio; Pritchard, Hugh W; Bacchetta, Gianluigi

    2012-12-01

    The importance of thermal thresholds for predicting seed dormancy release and germination timing under the present climate conditions and simulated climate change scenarios was investigated. In particular, Vitis vinifera subsp. sylvestris was studied in four Sardinian populations over the full altitudinal range of the species (from approx. 100 to 800 m a.s.l.). Dried and fresh seeds from each population were incubated in the light at a range of temperatures (10-25 and 25/10 °C), without any pre-treatment and after a warm (3 months at 25 °C) or a cold (3 months at 5 °C) stratification. A thermal time approach was then applied to the germination results for dried seeds and the seed responses were modelled according to the present climate conditions and two simulated scenarios of the Intergovernmental Panel on Climate Change (IPCC): B1 (+1·8 °C) and A2 (+3·4 °C). Cold stratification released physiological dormancy, while very few seeds germinated without treatments or after warm stratification. Fresh, cold-stratified seeds germinated significantly better (>80 %) at temperatures ≥20 °C than at lower temperatures. A base temperature for germination (T(b)) of 9·0-11·3 °C and a thermal time requirement for 50 % of germination (θ(50)) ranging from 33·6 °Cd to 68·6 °Cd were identified for non-dormant cold-stratified seeds, depending on the populations. This complex combination of thermal requirements for dormancy release and germination allowed prediction of field emergence from March to May under the present climatic conditions for the investigated populations. The thermal thresholds for seed germination identified in this study (T(b) and θ(50)) explained the differences in seed germination detected among populations. Under the two simulated IPCC scenarios, an altitude-related risk from climate warming is identified, with lowland populations being more threatened due to a compromised seed dormancy release and a narrowed seed germination window.
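The thermal time approach can be sketched numerically: degree-days above the base temperature T(b) are accumulated until they reach θ(50), which predicts the 50%-germination date. The daily temperature series below is hypothetical; T(b) and θ(50) are chosen within the ranges reported above:

```python
def days_to_germination(daily_mean_temps, t_b, theta_50):
    """Day index (1-based) at which accumulated thermal time reaches
    theta_50 (°Cd). Thermal time accrues as max(0, T - t_b) degree-days
    per day; returns None if theta_50 is never reached."""
    accumulated = 0.0
    for day, temp in enumerate(daily_mean_temps, start=1):
        accumulated += max(0.0, temp - t_b)
        if accumulated >= theta_50:
            return day
    return None

# Hypothetical spring warming series; t_b = 10 °C and theta_50 = 40 °Cd
# lie within the reported ranges (9.0-11.3 °C and 33.6-68.6 °Cd).
temps = [8, 9, 10, 12, 14, 15, 16, 17, 18, 18]
baseline = days_to_germination(temps, 10.0, 40.0)
warmed = days_to_germination([t + 1.8 for t in temps], 10.0, 40.0)  # B1 scenario
```

Uniform warming accumulates degree-days faster and shifts predicted emergence earlier, the mechanism behind the altered germination windows discussed in the abstract.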

  5. Strategies to Mitigate Ammonia Release on the International Space Station

    NASA Technical Reports Server (NTRS)

    Macatangay, Ariel V.; Prokhorov, Kimberlee S.; Sweterlitsch, Jeffrey J.

    2007-01-01

    Maintaining a habitable atmosphere aboard the International Space Station (ISS) is crucial to its continuous operation. Off-nominal situations can arise from virtually any aspect of ISS operations. One situation of particular concern is the inadvertent release of a chemical into the ISS atmosphere. In sufficient quantities, a chemical release can render the ISS uninhabitable regardless of the chemical's toxicity, as a result of its effect on the hardware used to maintain the environment. This is certainly true of system chemicals, which are integral components to the function and purpose of the system. Safeguards, such as design for minimum risk, multiple containment, hazard assessments, rigorous safety reviews, and others, are in place to minimize the probability of a chemical release to the ISS environment, thereby allowing the benefits of system chemicals to outweigh the risks associated with them. The thermal control system is an example of such a system. Heat generated within the ISS is transferred from the internal thermal control system (ITCS) to the external thermal control system (ETCS) via two single-barrier interface heat exchangers (IFHXs). The ITCS and ETCS are closed-loop systems which utilize water and anhydrous ammonia, respectively, as heat-transfer fluids. There is approximately 1200 lbs. (208 gallons) of anhydrous ammonia in the ETCS circulating through the two heat exchangers, transferring heat from the ITCS water lines. At the amounts present in the ETCS, anhydrous ammonia is one system chemical that can easily overwhelm the station's atmosphere-scrubbing capabilities and render the ISS uninhabitable in the event of a catastrophic rupture. Although safeguards have certainly minimized the risk of an ammonia release into the station atmosphere, credible release scenarios and controls to manage these scenarios are examined.

  6. Waste management facility accident analysis (WASTE ACC) system: software for analysis of waste management alternatives

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kohout, E.F.; Folga, S.; Mueller, C.

    1996-03-01

    This paper describes the Waste Management Facility Accident Analysis (WASTE_ACC) software, which was developed at Argonne National Laboratory (ANL) to support the US Department of Energy's (DOE's) Waste Management (WM) Programmatic Environmental Impact Statement (PEIS). WASTE_ACC is a decision support and database system that is compatible with Microsoft® Windows™. It assesses potential atmospheric releases from accidents at waste management facilities. The software provides the user with an easy-to-use tool to determine the risk-dominant accident sequences for the many possible combinations of process technologies, waste and facility types, and alternative cases described in the WM PEIS. In addition, its structure will allow additional alternative cases and assumptions to be tested as part of the future DOE programmatic decision-making process. The WASTE_ACC system demonstrates one approach to performing a generic, systemwide evaluation of accident risks at waste management facilities. The advantages of WASTE_ACC are threefold. First, the software gets waste volume and radiological profile data that were used to perform other WM PEIS-related analyses directly from the WASTE_MGMT system. Second, the system allows for a consistent analysis across all sites and waste streams, which enables decision makers to understand more fully the trade-offs among various policy options and scenarios. Third, the system is easy to operate; even complex scenario runs are completed within minutes.

  7. Global Sensitivity Analysis for Identifying Important Parameters of Nitrogen Nitrification and Denitrification under Model and Scenario Uncertainties

    NASA Astrophysics Data System (ADS)

    Ye, M.; Chen, Z.; Shi, L.; Zhu, Y.; Yang, J.

    2017-12-01

    Nitrogen reactive transport modeling is subject to uncertainty in model parameters, structures, and scenarios. While global sensitivity analysis is a vital tool for identifying the parameters important to nitrogen reactive transport, conventional global sensitivity analysis only considers parametric uncertainty. This may result in inaccurate selection of important parameters, because parameter importance may vary under different models and modeling scenarios. By using a recently developed variance-based global sensitivity analysis method, this paper identifies important parameters with simultaneous consideration of parametric uncertainty, model uncertainty, and scenario uncertainty. In a numerical example of nitrogen reactive transport modeling, a combination of three scenarios of soil temperature and two scenarios of soil moisture leads to a total of six scenarios. Four alternative models are used to evaluate reduction functions used for calculating actual rates of nitrification and denitrification. The model uncertainty is tangled with scenario uncertainty, as the reduction functions depend on soil temperature and moisture content. The results of sensitivity analysis show that parameter importance varies substantially between different models and modeling scenarios, which may lead to inaccurate selection of important parameters if model and scenario uncertainties are not considered. This problem is avoided by using the new method of sensitivity analysis in the context of model averaging and scenario averaging. The new method of sensitivity analysis can be applied to other problems of contaminant transport modeling when model uncertainty and/or scenario uncertainty are present.
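Variance-based first-order Sobol indices of the kind used above, S_i = Var(E[Y|X_i]) / Var(Y), can be estimated with the classic Saltelli pick-freeze scheme. A minimal sketch with a hypothetical two-parameter rate surrogate (this is not the authors' reactive transport model):

```python
import random

def sobol_first_order(model, dim, n=4096, seed=0):
    """Monte Carlo estimate of first-order Sobol indices via the Saltelli
    estimator. `model` maps a list of `dim` inputs in [0, 1) to a scalar.
    Returns a list of S_i = Var(E[Y|X_i]) / Var(Y)."""
    rng = random.Random(seed)
    A = [[rng.random() for _ in range(dim)] for _ in range(n)]
    B = [[rng.random() for _ in range(dim)] for _ in range(n)]
    fA = [model(x) for x in A]
    fB = [model(x) for x in B]
    mean = sum(fA + fB) / (2 * n)
    var = sum((y - mean) ** 2 for y in fA + fB) / (2 * n)
    indices = []
    for i in range(dim):
        # AB_i: matrix A with column i replaced by column i of B.
        AB = [A[j][:i] + [B[j][i]] + A[j][i + 1:] for j in range(n)]
        fAB = [model(x) for x in AB]
        s_i = sum(fB[j] * (fAB[j] - fA[j]) for j in range(n)) / (n * var)
        indices.append(s_i)
    return indices

# Hypothetical surrogate: a rate dominated by a temperature reduction
# factor (x0) with a weaker moisture factor (x1).
rate = lambda x: x[0] + 0.1 * x[1]
s = sobol_first_order(rate, 2)
```

Model and scenario averaging, as in the paper, would repeat this calculation for each model-scenario pair and weight the resulting indices by model and scenario probabilities.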

  8. Environmental impact and risk assessments and key factors contributing to the overall uncertainties.

    PubMed

    Salbu, Brit

    2016-01-01

    There is a significant number of nuclear and radiological sources that have contributed, are still contributing, or have the potential to contribute to radioactive contamination of the environment in the future. To protect the environment from radioactive contamination, impact and risk assessments are performed prior to or during a release event, in the short or long term after deposition, or before and after implementation of countermeasures. When environmental impact and risks are assessed, however, a series of factors will contribute to the overall uncertainties. To provide environmental impact and risk assessments, information on processes, kinetics and a series of input variables is needed. Together with problems such as variability, questionable assumptions, gaps in knowledge, extrapolations and poor conceptual model structures, these factors contribute to large and often unacceptable uncertainties in impact and risk assessments. Information on the source term and the release scenario is an essential starting point in impact and risk models; the source determines activity concentrations and atom ratios of radionuclides released, while the release scenario determines the physico-chemical forms of released radionuclides, such as their particle size distribution, structure and density. Releases will most often contain other contaminants such as metals, and due to interactions, contaminated sites should be assessed as a multiple stressor scenario. Following deposition, a series of stressors, interactions and processes will influence the ecosystem transfer of radionuclide species and thereby influence biological uptake (toxicokinetics) and responses (toxicodynamics) in exposed organisms. Due to the variety of biological species, extrapolation is frequently needed to fill gaps in knowledge, e.g., from effects to no effects, from effects in one organism to others, from one stressor to mixtures.
Most toxicity tests are, however, performed as short-term exposures of adult organisms, ignoring sensitive life history stages and transgenerational effects. To link sources, ecosystem transfer and biological effects to future impact and risks, a series of models are usually interfaced, while uncertainty estimates are seldom given. The model predictions are, however, only valid within the boundaries of the overall uncertainties. Furthermore, model predictions are only useful and relevant when uncertainties are estimated, communicated and understood. Among the key factors contributing most to uncertainties, the present paper focuses especially on structure uncertainties (model bias or discrepancies), as aspects such as particle releases, ecosystem dynamics, mixed exposure, sensitive life history stages and transgenerational effects are usually ignored in assessment models. Research focus on these aspects should significantly reduce the overall uncertainties in the impact and risk assessment of radioactively contaminated ecosystems. Copyright © 2015 Elsevier Ltd. All rights reserved.

  9. Improved algorithms in the CE-QUAL-W2 water-quality model for blending dam releases to meet downstream water-temperature targets

    USGS Publications Warehouse

    Rounds, Stewart A.; Buccola, Norman L.

    2015-01-01

    Water-quality models allow water resource professionals to examine conditions under an almost unlimited variety of potential future scenarios. The two-dimensional (longitudinal, vertical) water-quality model CE-QUAL-W2, version 3.7, was enhanced and augmented with new features to help dam operators and managers explore and optimize potential solutions for temperature management downstream of thermally stratified reservoirs. Such temperature management often is accomplished by blending releases from multiple dam outlets that access water of different temperatures at different depths. The modified blending algorithm in version 3.7 of CE-QUAL-W2 allows the user to specify a time-series of target release temperatures, designate from 2 to 10 floating or fixed-elevation outlets for blending, impose minimum and maximum head and flow constraints for any blended outlet, and set priority designations for each outlet that allow the model to choose which outlets to use and how to balance releases among them. The modified model was tested with a variety of examples and against a previously calibrated model of Detroit Lake on the North Santiam River in northwestern Oregon, and the results compared well. These updates to the blending algorithms will allow more complicated dam-operation scenarios to be evaluated somewhat automatically with the model, with decreased need for multiple model runs or preprocessing of model inputs to fully characterize the operational constraints.
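The blending algorithm chooses outlet flows so that the flow-weighted mixture approaches a target temperature. For the simplest two-outlet case the split has a closed form; the sketch below illustrates that principle only and is not CE-QUAL-W2's actual routine, which additionally handles up to 10 outlets, head/flow constraints, and outlet priorities:

```python
def blend_releases(t_target, t_upper, t_lower, q_total):
    """Split q_total between a warm upper outlet and a cold lower outlet
    so the flow-weighted release temperature approaches t_target.

    Returns (q_upper, q_lower); the upper-outlet fraction is clipped to
    [0, 1] when the target lies outside the available temperature range.
    """
    if t_upper == t_lower:
        frac = 0.5  # outlets draw identical water; split evenly
    else:
        frac = (t_target - t_lower) / (t_upper - t_lower)
        frac = min(1.0, max(0.0, frac))
    return frac * q_total, (1.0 - frac) * q_total

# Hypothetical stratified reservoir: 18 °C surface water, 6 °C hypolimnion,
# 12 °C downstream target, 100 m3/s total release.
q_upper, q_lower = blend_releases(12.0, 18.0, 6.0, 100.0)
```

When the target is unreachable (e.g., warmer than the warmest outlet), the clipped fraction simply releases everything from the closest-temperature outlet, which mirrors how a constrained blend must behave.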

  10. Hydropower Production and Fish Habitat Suitability: Impact and Effectiveness of Environmental Flow Prescriptions

    NASA Astrophysics Data System (ADS)

    Ceola, Serena; Pugliese, Alessio; Galeati, Giorgio; Castellarin, Attilio

    2017-04-01

    The anthropogenic alteration of the natural flow regime of a river for hydropower production can significantly modify the processes and functions associated with fluvial ecosystems. In order to preserve the fluvial habitat downstream of dams and diversion structures, environmental flows are commonly defined. Such environmental flows are generally computed from empirical methodologies, which are seldom based on site-specific studies, and may not be representative of local ecological and hydraulic conditions. Here we present the results of a quantitative analysis on the effectiveness of two alternative environmental flow scenarios prescribed in Central Italy (time-invariant experimental and empirically-based flow release versus time-variant hydrogeomorphologically-based flow release) and their impact on hydropower production and fish habitat suitability. The latter is examined by means of several models of habitat suitability curve, which is a well-known approach capable of analysing fluvial species preferences as a function of key eco-hydraulic features, such as water depth, flow velocity and river substrate. The results show an evident loss of hydropower production moving from the time-invariant experimental flow release to the hydrogeomorphological one (nearly 20% at the annual scale). Concerning the effects in terms of fish habitat suitability, our outcomes are less obvious, since they are species- and life stage-specific. The proposed analysis, which can be easily adapted to different riparian habitats and hydrological contexts, is a useful tool to guide the derivation of optimal water resource management strategies in order to ensure both hydropower production and fluvial ecosystem protection.
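Habitat suitability curves of the kind used in this analysis map a hydraulic variable (depth, velocity) to a 0-1 preference, and a composite index then weights the usable area of each river cell. A minimal sketch with hypothetical curves and cells (not the study's species data), using the geometric mean as one common composite rule:

```python
def interp_suitability(value, curve):
    """Piecewise-linear habitat suitability (0-1) from sorted
    (value, suitability) control points; clamped at the curve ends."""
    pts = sorted(curve)
    if value <= pts[0][0]:
        return pts[0][1]
    if value >= pts[-1][0]:
        return pts[-1][1]
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if x0 <= value <= x1:
            return y0 + (y1 - y0) * (value - x0) / (x1 - x0)

def weighted_usable_area(cells, depth_curve, velocity_curve):
    """Sum of cell areas weighted by the geometric mean of the depth
    and velocity suitabilities."""
    total = 0.0
    for area, depth, velocity in cells:
        s_d = interp_suitability(depth, depth_curve)
        s_v = interp_suitability(velocity, velocity_curve)
        total += area * (s_d * s_v) ** 0.5
    return total

# Hypothetical suitability curves for one species/life stage.
depth_curve = [(0.0, 0.0), (0.4, 1.0), (1.5, 1.0), (3.0, 0.2)]
velocity_curve = [(0.0, 0.3), (0.3, 1.0), (1.0, 0.5), (2.0, 0.0)]
cells = [(10.0, 0.5, 0.4), (10.0, 2.5, 1.5)]  # (area m2, depth m, velocity m/s)
wua = weighted_usable_area(cells, depth_curve, velocity_curve)
```

Re-evaluating this weighted usable area under each environmental flow scenario is what makes the species- and life-stage-specific comparisons in the study possible.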

  11. Hydropower Production and Fish Habitat Suitability: Impact and Effectiveness of Environmental Flow Prescriptions

    NASA Astrophysics Data System (ADS)

    Castellarin, A.; Galeati, G.; Ceola, S.; Pugliese, A.; Ventura, M.; Montanari, A.

    2017-12-01

    The anthropogenic alteration of the natural flow regime of a river for hydropower production can significantly modify the processes and functions associated with fluvial ecosystems. In order to preserve the fluvial habitat downstream of dams and diversion structures, environmental flows are commonly defined. Such environmental flows are generally computed from empirical methodologies, which are seldom based on site-specific studies, and may not be representative of local ecological and hydraulic conditions. Here we present the results of a quantitative analysis on the effectiveness of two alternative environmental flow scenarios prescribed in Central Italy (time-invariant experimental and empirically-based flow release versus time-variant hydrogeomorphologically-based flow release) and their impact on hydropower production and fish habitat suitability. The latter is examined by means of several models of habitat suitability curve, which is a well-known approach capable of analysing fluvial species preferences as a function of key eco-hydraulic features, such as water depth, flow velocity and river substrate. The results show an evident loss of hydropower production moving from the time-invariant experimental flow release to the hydrogeomorphological one (nearly 20% at the annual scale). Concerning the effects in terms of fish habitat suitability, our outcomes are less obvious, since they are species- and life stage-specific. The proposed analysis, which can be easily adapted to different riparian habitats and hydrological contexts, is a useful tool to guide the derivation of optimal water resource management strategies in order to ensure both hydropower production and fluvial ecosystem protection.

  12. Predicting dredging-associated effects to coral reefs in Apra Harbor, Guam - Part 1: Sediment exposure modeling.

    PubMed

    Gailani, Joseph Z; Lackey, Tahirih C; King, David B; Bryant, Duncan; Kim, Sung-Chan; Shafer, Deborah J

    2016-03-01

    Model studies were conducted to investigate the potential coral reef sediment exposure from dredging associated with proposed development of a deepwater wharf in Apra Harbor, Guam. The Particle Tracking Model (PTM) was applied to quantify the exposure of coral reefs to material suspended by the dredging operations at two alternative sites. Key PTM features include the flexible capability of continuous multiple releases of sediment parcels, control of parcel/substrate interaction, and the ability to efficiently track vast numbers of parcels. This flexibility has facilitated simulating the combined effects of sediment released from clamshell dredging and chiseling within Apra Harbor. Because the rate of material released into the water column by some of the processes is not well understood or known a priori, the modeling approach was to bracket parameters within reasonable ranges to produce a suite of potential results from multiple model runs. Sensitivity analysis to model parameters is used to select the appropriate parameter values for bracketing. Data analysis results include mapping the time series and the maximum values of sedimentation, suspended sediment concentration, and deposition rate. Data were used to quantify various exposure processes that affect coral species in Apra Harbor. The goal of this research is to develop a robust methodology for quantifying and bracketing exposure mechanisms to coral (or other receptors) from dredging operations. These exposure values were utilized in an ecological assessment to predict effects (coral reef impacts) from various dredging scenarios. Copyright © 2015. Published by Elsevier Ltd.
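The core transport step of a particle tracking model can be sketched as a random walk: each parcel is advected by the local velocity and given a diffusive kick scaled by sqrt(2*D*dt) per coordinate. The velocities, diffusivity, and parcel counts below are hypothetical, not the Apra Harbor values, and parcel/substrate interaction is omitted:

```python
import math
import random

def track_parcels(n, steps, dt, u, v, diff, seed=0):
    """2-D random-walk particle tracking from a point release at the origin.

    Each step applies deterministic advection (u, v) in m/s plus an
    isotropic diffusive displacement sqrt(2*diff*dt)*N(0, 1) per axis.
    Returns the final (x, y) positions in metres.
    """
    rng = random.Random(seed)
    parcels = [(0.0, 0.0)] * n
    sigma = math.sqrt(2.0 * diff * dt)
    for _ in range(steps):
        parcels = [(x + u * dt + sigma * rng.gauss(0.0, 1.0),
                    y + v * dt + sigma * rng.gauss(0.0, 1.0))
                   for x, y in parcels]
    return parcels

# Hypothetical scenario: 0.05 m/s along-channel drift, D = 1 m2/s,
# 500 parcels released at the dredge head, tracked for 1 hour.
cloud = track_parcels(500, 60, 60.0, 0.05, 0.0, 1.0)
mean_x = sum(x for x, _ in cloud) / len(cloud)
mean_y = sum(y for _, y in cloud) / len(cloud)
```

Binning the final parcel positions over receptor polygons is the usual way such a cloud is converted into the exposure maps (deposition, suspended concentration) described above.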

  13. Release Kinetics of Paclitaxel and Cisplatin from Two and Three Layered Gold Nanoparticles

    PubMed Central

    England, Christopher G.; Miller, M. Clarke; Kuttan, Ashani; Trent, John O.; Frieboes, Hermann B.

    2015-01-01

    Gold nanoparticles functionalized with biologically-compatible layers may achieve stable drug release while avoiding adverse effects in cancer treatment. We study cisplatin and paclitaxel release from gold cores functionalized with hexadecanethiol (TL) and phosphatidylcholine (PC) to form two-layer nanoparticles, or TL, PC, and high density lipoprotein (HDL) to form three-layer nanoparticles. Drug release was monitored for 14 days to assess long term effects of the core surface modifications on release kinetics. Release profiles were fitted to previously developed kinetic models to differentiate possible release mechanisms. The hydrophilic drug (cisplatin) showed an initial (5-hr.) burst, followed by a steady release over 14 days. The hydrophobic drug (paclitaxel) showed a steady release over the same time period. Two layer nanoparticles released 64.0 ± 2.5% of cisplatin and 22.3 ± 1.5% of paclitaxel, while three layer nanoparticles released the entire encapsulated drug. The Korsmeyer-Peppas model best described each release scenario, while the simplified Higuchi model also adequately described paclitaxel release from the two layer formulation. We conclude that functionalization of gold nanoparticles with a combination of TL and PC may help to modulate both hydrophilic and hydrophobic drug release kinetics, while the addition of HDL may enhance long term release of hydrophobic drug. PMID:25753197
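The Korsmeyer-Peppas model, Mt/Minf = k * t^n, is linear in log-log space, so its parameters can be fitted by ordinary least squares on the logged data. A minimal sketch with synthetic release data (not the paper's measurements); n = 0.5 recovers the simplified Higuchi square-root-of-time profile:

```python
import math

def fit_korsmeyer_peppas(times, fractions):
    """Least-squares fit of Mt/Minf = k * t**n via linear regression of
    log(fraction) on log(time). Returns (k, n). In practice only the
    early portion of a release curve (fraction <= ~0.6) is used."""
    xs = [math.log(t) for t in times]
    ys = [math.log(f) for f in fractions]
    m = len(xs)
    x_bar = sum(xs) / m
    y_bar = sum(ys) / m
    n = (sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
         / sum((x - x_bar) ** 2 for x in xs))
    k = math.exp(y_bar - n * x_bar)
    return k, n

# Synthetic Higuchi-like data: fraction released = 0.1 * sqrt(t).
times = [1, 2, 4, 8, 16]
fractions = [0.1 * math.sqrt(t) for t in times]
k, n = fit_korsmeyer_peppas(times, fractions)
```

The fitted exponent n is what discriminates release mechanisms (Fickian diffusion vs. anomalous transport), which is why the model suited both release scenarios in the study.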

  14. Land use of drained peatlands: Greenhouse gas fluxes, plant production, and economics.

    PubMed

    Kasimir, Åsa; He, Hongxing; Coria, Jessica; Nordén, Anna

    2017-10-10

    Drained peatlands are hotspots for greenhouse gas (GHG) emissions, which could be mitigated by rewetting and land use change. We performed an ecological/economic analysis of rewetting drained fertile peatlands in a hemiboreal climate using different land use strategies over 80 years. Vegetation, soil processes, and total GHG emissions were modeled using the CoupModel for four scenarios: (1) business as usual: Norway spruce with an average soil water table of -40 cm; (2) willow with groundwater at -20 cm; (3) reed canary grass with groundwater at -10 cm; and (4) a fully rewetted peatland. The predictions were based on previous model calibrations with several high-resolution datasets consisting of water, heat, carbon, and nitrogen cycling. Spruce growth was calibrated by tree-ring data that extended the time period covered. The GHG balances of the four scenarios, including vegetation and soil, were 4.7, 7.1, 9.1, and 6.2 Mg CO2-eq ha-1 year-1, respectively. The total soil emissions (including litter and peat respiration CO2 + N2O + CH4) were 33.1, 19.3, 15.3, and 11.0 Mg CO2-eq ha-1 year-1, respectively, of which peat loss contributed 35%, 24%, and 7% of the soil emissions for the three drained scenarios, respectively. No peat was lost from the wet peatland. It was also found that draining increases vegetation growth, but not as drastically as peat respiration. The cost-benefit analysis (CBA) is sensitive to time frame, discount rate, and carbon price. Our results indicate that the net benefit was greater with a somewhat higher soil water table and when the peatland was vegetated with willow or reed canary grass (scenarios 2 and 3). We conclude that saving peat and avoiding methane release using fairly wet conditions can significantly reduce GHG emissions, and that this strategy should be considered for land use planning and policy-making. © 2017 John Wiley & Sons Ltd.
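The CO2-equivalent totals above aggregate per-gas fluxes using global warming potentials. A minimal sketch of that bookkeeping, using GWP100 factors from IPCC AR5 (CH4 = 28, N2O = 265) and purely hypothetical fluxes (not the CoupModel outputs):

```python
# GWP100 factors (IPCC AR5, without climate-carbon feedbacks).
GWP = {"CO2": 1.0, "CH4": 28.0, "N2O": 265.0}

def co2_eq(fluxes):
    """Total flux in Mg CO2-eq ha-1 yr-1 from per-gas fluxes given
    in kg of each gas ha-1 yr-1."""
    return sum(GWP[gas] * kg for gas, kg in fluxes.items()) / 1000.0

# Hypothetical fluxes: a drained soil dominated by CO2 from peat
# respiration vs. a rewetted soil trading CO2 for CH4.
drained = co2_eq({"CO2": 30_000.0, "N2O": 10.0, "CH4": 2.0})
rewetted = co2_eq({"CO2": 5_000.0, "N2O": 0.5, "CH4": 150.0})
```

Even with a large methane penalty from wet conditions, the avoided peat oxidation can dominate the balance, which is the trade-off the study quantifies.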

  15. Influence of Postprandial Intragastric Pressures on Drug Release from Gastroretentive Dosage Forms.

    PubMed

    Schneider, Felix; Hoppe, Melanie; Koziolek, Mirko; Weitschies, Werner

    2018-05-29

    Despite extensive research in the field of gastroretentive dosage forms, this "holy grail" of oral drug delivery has remained an unmet goal. Especially under fasting conditions, reproducible retention of dosage forms in the stomach seems to be an impossible task, which is why such systems are often advised to be taken together with food. However, postprandial motility can also contribute significantly to the failure of gastroretentive dosage forms. To investigate the influence of postprandial pressure conditions on drug release from such systems, we used a novel in vitro dissolution tool, the dissolution stress test device. With the aid of this device, we simulated three different intragastric pressure profiles that may occur after postprandial intake. These transit scenarios were based on recently obtained postprandial SmartPill® data. The tested systems, Glumetza® 1000 and Madopar® HBS 125, are marketed dosage forms based on different approaches to achieving gastric retention. All three transit scenarios revealed highly pressure-sensitive drug release behavior for both drugs. For Madopar® HBS 125, nearly complete drug release was observed even after early occurring pressures. Glumetza® 1000 seemed to be more resistant to these, most likely due to incomplete wetting of the system. In contrast to these findings, data from standard dissolution tests using the paddle apparatus showed controlled drug release from both systems for about 6 h. Based on these results, it is doubtful that established gastroretentive systems stay intact over a longer period of time, even under postprandial conditions.

  16. Greenhouse gas emissions from different municipal solid waste management scenarios in China: Based on carbon and energy flow analysis.

    PubMed

    Liu, Yili; Sun, Weixin; Liu, Jianguo

    2017-10-01

    Waste management is a major source of global greenhouse gas (GHG) emissions, and many opportunities exist to reduce these emissions. To identify the GHG emissions from waste management in China, given the characteristics of Chinese MSW and current and future treatment strategies, five typical management scenarios were modeled in EaseTech software following the principles of life cycle inventory and analyzed based on carbon and energy flows. Due to the high organic fraction (50-70%) and moisture content (>50%) of Chinese municipal solid waste (MSW), the net GHG emissions from waste management differed significantly from those in developed countries. Poor landfill gas (LFG) collection efficiency and low carbon storage made the landfilling-with-flaring and landfilling-with-biogas-recovery scenarios the largest GHG emitters (192 and 117 kg CO2-eq/t, respectively). In contrast, incineration had the best energy recovery rate (19%) and, through substitution of grid emissions, led to a substantial decrease in net GHG emissions (-124 kg CO2-eq/t). Due to the high energy consumption in operation, the unavoidable leakage of CH4 and N2O during treatment, and the further release of CH4 when disposing of the digested residue or composted product, the scenarios with biological treatment of the organic fractions after sorting, such as composting or anaerobic digestion (AD), did not deliver the outstanding GHG reductions expected (emissions of 32 and -36 kg CO2-eq/t, respectively). Copyright © 2017. Published by Elsevier Ltd.
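    The scenario comparison above reduces to an accounting identity: net GHG equals direct emissions minus credits for substituted grid energy. A toy sketch of that tally, with all figures invented rather than taken from the EaseTech inventories:

    ```python
    # Toy accounting of net GHG per scenario: direct emissions minus credits
    # for substituted grid energy. All figures are invented placeholders in
    # kg CO2-eq per tonne of waste, not EaseTech inventory results.

    def net_ghg(direct_emissions, energy_recovered_kwh, grid_factor):
        """Net GHG (kg CO2-eq/t) = direct emissions - avoided grid emissions."""
        return direct_emissions - energy_recovered_kwh * grid_factor

    GRID_FACTOR = 0.9   # kg CO2-eq per kWh of substituted electricity (assumed)

    scenarios = {
        "landfill + flaring": net_ghg(200.0, 10.0, GRID_FACTOR),
        "landfill + biogas":  net_ghg(180.0, 70.0, GRID_FACTOR),
        "incineration":       net_ghg(350.0, 530.0, GRID_FACTOR),
    }
    for name, value in scenarios.items():
        print(f"{name:20s} {value:+8.1f} kg CO2-eq/t")
    ```

    With a high enough energy recovery, the substitution credit can outweigh direct stack emissions, which is how incineration ends up net-negative in the study.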

  17. Bayesian analysis of the break in DAMPE lepton spectra

    NASA Astrophysics Data System (ADS)

    Niu, Jia-Shu; Li, Tianjun; Ding, Ran; Zhu, Bin; Xue, Hui-Fang; Wang, Yang

    2018-04-01

    Recently, DAMPE released its first results on high-energy cosmic-ray electrons and positrons (CREs) from about 25 GeV to 4.6 TeV, directly detecting a break at ~1 TeV. This result provides an excellent opportunity to study the source of the CRE excess. In this work, we used data on the proton and helium fluxes (from AMS-02 and CREAM), the p̄/p ratio (from AMS-02), the positron flux (from AMS-02), and the CRE flux (from DAMPE, without the peak signal point at ~1.4 TeV) in a simultaneous global fit, which precisely accounts for the influence of the propagation model, the nuclei and electron primary source injection, and the secondary lepton production. For an extra source to interpret the excess in the lepton spectrum, we consider two separate scenarios (a pulsar, and dark matter annihilation via leptonic channels) to construct the bump (≳100 GeV) and the break at ~1 TeV. The results show that (i) in the pulsar scenario, the spectral index of the injection should be ν_psr ~ 0.65 and the cutoff should be R_c ~ 650 GV; (ii) in the dark matter scenario, the dark matter particle's mass is m_χ ~ 1208 GeV and the annihilation cross section is ⟨σv⟩ ~ 1.48 × 10⁻²³ cm³ s⁻¹. Moreover, in the dark matter scenario, the ττ̄ annihilation channel is highly suppressed, and a DM model is built to satisfy the fitting results.
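    Spectral breaks like the ~1 TeV feature are commonly modeled with a smoothly broken power law. The sketch below shows that generic form only; it is not the paper's actual fit function, and the parameter values are assumed for illustration.

    ```python
    # Generic smoothly broken power law often used for spectra with a break;
    # not the paper's fit function, and the parameter values are assumed.

    import numpy as np

    def broken_power_law(E, N0, gamma1, gamma2, E_break, smooth=5.0):
        """Flux ~ E^-gamma1 below E_break, steepening to E^-gamma2 above it."""
        return (N0 * E ** -gamma1
                * (1.0 + (E / E_break) ** smooth) ** ((gamma1 - gamma2) / smooth))

    E = np.logspace(1.5, 3.7, 60)      # ~30 GeV to ~5 TeV, in GeV
    flux = broken_power_law(E, N0=1.0, gamma1=3.1, gamma2=3.9, E_break=1000.0)
    ```

    The log-slope transitions from −gamma1 to −gamma2 around `E_break`, with `smooth` controlling how sharp the transition is.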

  18. Waste activated sludge hydrolysis and acidification: A comparison between sodium hydroxide and steel slag addition.

    PubMed

    Zhang, Ying; Zhang, Chaojie; Zhang, Xuan; Feng, Leiyu; Li, Yongmei; Zhou, Qi

    2016-10-01

    Alkaline treatment with steel slag addition and with NaOH addition was investigated under different pH conditions for the fermentation of waste activated sludge. Better performance was achieved in the steel slag addition scenarios for both sludge hydrolysis and acidification. Greater solubilization of organic matter and higher VFA (volatile fatty acid) production in a shorter time were achieved at pH 10 when adjusted with steel slag. Higher enzyme activities were also observed in the steel slag addition scenarios under the same pH conditions. Phosphorus concentration in the supernatant increased with fermentation time and pH in the NaOH addition scenarios, whereas in the steel slag addition scenarios most phosphorus was released and simultaneously captured by the steel slag. These results suggest that steel slag can be used as a substitute for NaOH in sludge alkaline treatment. Copyright © 2016. Published by Elsevier B.V.

  19. Attitude Control Performance of IRVE-3

    NASA Technical Reports Server (NTRS)

    Dillman, Robert A.; Gsell, Valerie T.; Bowden, Ernest L.

    2013-01-01

    The Inflatable Reentry Vehicle Experiment 3 (IRVE-3) launched July 23, 2012, from NASA Wallops Flight Facility and successfully performed its mission, demonstrating both the survivability of a hypersonic inflatable aerodynamic decelerator in the reentry heating environment and the effect of an offset center of gravity on the aeroshell's flight L/D. The reentry vehicle separated from the launch vehicle, released and inflated its aeroshell, reoriented for atmospheric entry, and mechanically shifted its center of gravity before reaching atmospheric interface. Performance data from the entire mission was telemetered to the ground for analysis. This paper discusses the IRVE-3 mission scenario, reentry vehicle design, and as-flown performance of the attitude control system in the different phases of the mission.

  20. 2016 Billion-Ton Report: Advancing Domestic Resources for a Thriving Bioeconomy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Langholtz, M. H.; Stokes, B. J.; Eaton, L. M.

    This product builds on previous efforts, namely the 2005 Billion-Ton Study (BTS) and the 2011 U.S. Billion-Ton Update (BT2). With each report, greater perspective is gained on the potential of biomass resources to contribute to a national energy strategy. Similarly, each successive report introduces new questions regarding commercialization challenges. BTS quantified the broad biophysical potential of biomass nationally, and BT2 elucidated the potential economic availability of these resources. These reports clearly established the potential availability of up to one billion tons of biomass resources nationally. However, many questions remain, including but not limited to crop yields, climate change impacts, logistical operations, and systems integration across production, harvest, and conversion. The present report aims to address many of these questions through empirically modeled energy crop yields, scenario analysis of resources delivered to biorefineries, and the addition of new feedstocks. Volume 2 of the 2016 Billion-Ton Report is expected to be released by the end of 2016. It seeks to evaluate environmental sustainability indicators of select scenarios from volume 1 and potential climate change impacts on future supplies.

  1. Analysis and modeling of metals release from MBT wastes through batch and up-flow column tests.

    PubMed

    Pantini, Sara; Verginelli, Iason; Lombardi, Francesco

    2015-04-01

    The leaching behavior of wastes leaving Mechanical Biological Treatment (MBT) plants is still poorly investigated in the literature. This work attempts to provide deeper insight into contaminant release from this type of waste. To this end, results of several batch and up-flow percolation tests, carried out on different biologically treated waste samples collected from an Italian MBT plant, are reported. The results showed that, although MBT wastes are characterized by relatively high heavy metals content, only a limited amount was actually soluble and thus bioavailable. Namely, the released percentage was generally lower than 5% of the total content, with the exception of dissolved organic carbon (DOC), Zn, Ni and Co, whose release percentages reached up to 20%. The information provided by the different tests also highlighted some key factors governing the release kinetics of DOC and metals from this type of material. In particular, results of the up-flow column percolation tests showed that metals such as Cr, Mg, Ni and Zn essentially followed the leaching trend of DOC, suggesting that these elements were mainly released as organo-compounds. Indeed, a strong linear correlation (R² > 0.8) between DOC and metal concentrations in the eluates was observed, especially for Cr, Ni and Zn (R² > 0.94). Thus, by combining the results of the batch and up-flow column percolation tests, partition coefficients between DOC and metal concentrations were derived. These data, coupled with a simplified screening model for DOC release, gave a very good prediction of metal release during the different column tests. Finally, combining the experimental data with a simplified model provided useful indications for evaluating long-term emissions from this type of waste in landfill disposal scenarios. Copyright © 2014 Elsevier Ltd. All rights reserved.
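    The partition-coefficient step described above can be sketched as an ordinary least-squares fit of metal versus DOC concentrations, with the slope read off as an apparent DOC-metal partition coefficient. The data points below are synthetic, not the paper's eluate measurements.

    ```python
    # Sketch of the DOC-metal correlation step: fit metal concentration
    # against DOC in column eluates and take the slope as an apparent
    # partition coefficient. Data points are synthetic placeholders.

    import numpy as np

    doc = np.array([120.0, 90.0, 60.0, 40.0, 25.0, 15.0])       # mg/L, synthetic
    noise = np.array([0.02, -0.01, 0.01, 0.0, -0.005, 0.004])   # small scatter
    zn = 0.012 * doc + noise                                    # mg/L, synthetic

    k, b = np.polyfit(doc, zn, 1)           # least squares: zn ~ k*doc + b
    r2 = np.corrcoef(doc, zn)[0, 1] ** 2

    print(f"apparent DOC-Zn partition coefficient k = {k:.4f}")
    print(f"R^2 = {r2:.3f}")  # strong fits (R^2 > 0.94) were reported for Cr, Ni, Zn
    ```

    Feeding such a slope into a DOC-release model then predicts the metal eluate concentration directly from the predicted DOC, which is the coupling the abstract describes.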

  2. Sorption of Radionuclides to Building Materials and its ...

    EPA Pesticide Factsheets

    Journal article. Urban contamination via a number of radiological release scenarios may require simple decontamination methods that can be deployed for wide-area decontamination. This paper investigates a number of factors of importance for developing such decontamination methods, focusing on cesium.

  3. The ShakeOut scenario: A hypothetical Mw7.8 earthquake on the Southern San Andreas Fault

    USGS Publications Warehouse

    Porter, K.; Jones, L.; Cox, D.; Goltz, J.; Hudnut, K.; Mileti, D.; Perry, S.; Ponti, D.; Reichle, M.; Rose, A.Z.; Scawthorn, C.R.; Seligson, H.A.; Shoaf, K.I.; Treiman, J.; Wein, A.

    2011-01-01

    In 2008, an earthquake-planning scenario document was released by the U.S. Geological Survey (USGS) and the California Geological Survey that hypothesizes the occurrence and effects of a Mw 7.8 earthquake on the southern San Andreas Fault. It was created by more than 300 scientists and engineers. Fault offsets reach 13 m, and up to 8 m at lifeline crossings. Physics-based modeling was used to generate maps of shaking intensity, with peak ground velocities of 3 m/sec near the fault and exceeding 0.5 m/sec over 10,000 km². A custom HAZUS-MH analysis and 18 special studies were performed to characterize the effects of the earthquake on the built environment. The scenario posits 1,800 deaths and 53,000 injuries requiring emergency room care. Approximately 1,600 fires are ignited, resulting in the destruction of 200 million square feet of the building stock, the equivalent of 133,000 single-family homes. Fire contributes $87 billion in property and business interruption loss, out of the total $191 billion in economic loss, with most of the rest coming from shake-related building and content damage ($46 billion) and business interruption loss from water outages ($24 billion). Emergency response activities are depicted in detail in an innovative grid showing activities versus time, a new format introduced in this study. © 2011, Earthquake Engineering Research Institute.

  4. Health risk impacts analysis of fugitive aromatic compounds emissions from the working face of a municipal solid waste landfill in China.

    PubMed

    Liu, Yanjun; Liu, Yanting; Li, Hao; Fu, Xindi; Guo, Hanwen; Meng, Ruihong; Lu, Wenjing; Zhao, Ming; Wang, Hongtao

    2016-12-01

    Aromatic compounds (ACs) emitted from landfills have attracted considerable public attention due to their adverse impacts on the environment and human health. This study assessed the health risk impacts of fugitive ACs emitted from the working face of a municipal solid waste (MSW) landfill in China. The emission data were acquired through long-term in-situ sampling using a modified wind tunnel system. The uncertainty of the aromatic emissions was determined statistically, and emission factors were thus developed. Two scenarios, 'normal-case' and 'worst-case', were defined to evaluate the potential health risk under different weather conditions. For this typical large anaerobic landfill, toluene was the dominant species owing to its highest release rate (3.40 ± 3.79 g·m⁻²·d⁻¹). Despite posing negligible non-carcinogenic risk, the ACs might pose carcinogenic risks to humans in the nearby area. Ethylbenzene was the major health threat. The cumulative carcinogenic risk impact area extends as far as ~1.5 km downwind for the normal-case scenario, and nearly 4 km for the worst-case scenario. Health risks from fugitive AC emissions from active landfills should be a concern, especially for landfills still receiving mixed MSW. Copyright © 2016 Elsevier Ltd. All rights reserved.
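    A cumulative carcinogenic risk of this kind is typically assembled as a chronic exposure concentration times an inhalation unit risk for each species, summed. The sketch below uses that generic EPA-style form; all air concentrations, unit-risk factors, and exposure parameters are invented placeholders, not values measured at this landfill.

    ```python
    # Generic sketch of a cumulative inhalation cancer-risk tally,
    # Risk = EC x IUR summed over species. All air concentrations and
    # unit risk (IUR) values are assumed placeholders.

    def exposure_conc(ca, et=24.0, ef=350.0, ed=30.0, at_years=70.0):
        """Chronic exposure concentration: ca averaged over the assessment time.

        ca: air concentration; et: hours/day; ef: days/year;
        ed: exposure duration in years; at_years: averaging time in years.
        """
        return ca * et * ef * ed / (at_years * 365.0 * 24.0)

    # species: (air concentration in mg/m3, inhalation unit risk in (mg/m3)^-1)
    species = {
        "benzene":      (2.0e-3, 7.8e-3),   # assumed values
        "ethylbenzene": (5.0e-3, 2.5e-3),   # assumed values
    }

    cumulative = sum(exposure_conc(ca) * iur for ca, iur in species.values())
    print(f"cumulative carcinogenic risk ~ {cumulative:.2e}")
    ```

    Screening practice commonly flags cumulative risks above roughly 1e-6 for further assessment; the downwind impact distances in the abstract come from mapping where this quantity crosses such a threshold.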

  5. Modeling of thermo-mechanical and irradiation behavior of mixed oxide fuel for sodium fast reactors

    NASA Astrophysics Data System (ADS)

    Karahan, Aydın; Buongiorno, Jacopo

    2010-01-01

    An engineering code to model the irradiation behavior of UO2-PuO2 mixed oxide fuel pins in sodium-cooled fast reactors was developed. The code was named the fuel engineering and structural analysis tool (FEAST-OXIDE). FEAST-OXIDE has several modules working in coupled form with an explicit numerical algorithm. These modules describe: (1) fission gas release and swelling, (2) fuel chemistry and restructuring, (3) temperature distribution, (4) fuel-clad chemical interaction, and (5) fuel-clad mechanical analysis. Given the fuel pin geometry, composition, and irradiation history, FEAST-OXIDE can analyze fuel and cladding thermo-mechanical behavior in both steady-state and design-basis transient scenarios. The code was written in the Fortran 90 programming language. The mechanical analysis module implements the LIFE algorithm. Fission gas release and swelling behavior is described by the OGRES and NEFIG models; however, the original OGRES model has been extended to include the effects of joint oxyde-gaine (JOG) formation on fission gas release and swelling. A detailed fuel chemistry model describes radial cesium migration and JOG formation, the radial distribution of oxygen and plutonium, and the axial migration of cesium. The fuel restructuring model includes the effects of as-fabricated porosity migration, irradiation-induced fuel densification, grain growth, hot pressing, and fuel cracking and relocation. Finally, a kinetics model predicts clad wastage formation. FEAST-OXIDE predictions have been compared to the available FFTF, EBR-II and JOYO databases, as well as to LIFE-4 code predictions. The agreement was found to be satisfactory for steady-state and slow-ramp over-power accidents.

  6. Exploring NASA Human Spaceflight and Pioneering Scenarios

    NASA Technical Reports Server (NTRS)

    Zapata, Edgar; Wilhite, Alan

    2015-01-01

    The life cycle cost analysis of space exploration scenarios is explored via a merger of (1) scenario planning, which separates context, and (2) modeling and analysis of specific content. Numerous scenarios are presented, leading to cross-cutting recommendations addressing life cycle costs, productivity, and approaches applicable to any scenario. The approaches address technical and non-technical factors.

  7. Approach on environmental risk assessment of nanosilver released from textiles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Voelker, Doris, E-mail: doris.voelker@uba.de; Schlich, Karsten; Hohndorf, Lars

    2015-07-15

    Given the increased utilization of nanosilver (silver nanomaterials, AgNM) as an antibacterial agent, there is a strong need to assess the potential environmental implications associated with its new application areas. In this study an exemplary environmental risk assessment (ERA) of AgNM applied in textiles was performed. Environmental exposure scenarios (via a municipal sewage treatment plant (STP) with wastewater supply from domestic homes) were developed for three different types of textiles equipped with AgNM. Based on these scenarios, predicted environmental concentrations (PECs) were deduced for STPs and for the environmental compartments surface water, sediment, and soil. These PECs were related to PNECs (predicted no effect concentrations). PNECs were deduced from the results of ecotoxicity tests of a selected AgNM (NM-300K). Ecotoxicological data were derived from various tests with activated sludge, cyanobacteria, algae, daphnids, fish, duckweed, macrophytes, chironomids, earthworms, terrestrial plants, and soil microorganisms. Emission data for the AgNM NM-300K from textiles were derived from washing experiments. The ERA was based on the specifications defined in the ECHA Guidances on information requirements and chemical safety assessment. Based on the chosen scenarios and preconditions, no environmental risk of the AgNM NM-300K released from textiles was detected. Under conservative assumptions, a risk quotient for surface water close to 1 indicated that the aquatic compartment may be affected by increased emission of AgNM to the environment, due to the high sensitivity of aquatic organisms to silver. Given the successful retention of AgNM in sewage sludge and the continued application of sewage sludge on farmland, it is recommended to introduce a threshold for total silver content in sewage sludge into the respective regulations.
    Regarding potential risk mitigation measures, it is emphasized that AgNM should preferably be introduced directly into the textile fiber, since this strongly minimizes the release of AgNM during washing. If this is not possible due to technical limitations or other reasons, the introduction of a threshold level controlling the release of AgNM from textiles is suggested. It should be noted that this is a case study valid only for the investigated NM-300K and its potential application in textiles. - Highlights: • Washing: release rates of AgNM depend on how the AgNM is incorporated in the textiles. • Calculated PNECs for the aquatic compartment were low. • For the chosen scenarios, no environmental risk of AgNM from textiles was observed. • AgNM is retained in sewage sludge. • In WWTPs most of the silver adsorbs to sewage sludge and thus may reach farmland.
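    The ERA screening logic above rests on the risk quotient RQ = PEC/PNEC, where RQ ≥ 1 flags a potential risk. A minimal sketch with invented PEC/PNEC pairs (the abstract reports only that the surface-water quotient was "close to 1", so these numbers are placeholders):

    ```python
    # Minimal sketch of the PEC/PNEC screening step of an environmental
    # risk assessment (ERA): a risk quotient RQ = PEC/PNEC >= 1 flags
    # potential risk. PEC and PNEC values are invented placeholders.

    def risk_quotient(pec, pnec):
        return pec / pnec

    # compartment: (PEC, PNEC), same units within each pair (assumed values)
    compartments = {
        "surface water": (0.9e-3, 1.0e-3),   # mg/L
        "sediment":      (0.05, 0.4),        # mg/kg
        "soil":          (0.02, 0.5),        # mg/kg
    }

    for name, (pec, pnec) in compartments.items():
        rq = risk_quotient(pec, pnec)
        verdict = "potential risk" if rq >= 1.0 else "no risk indicated"
        print(f"{name:14s} RQ = {rq:5.2f} -> {verdict}")
    ```

    A quotient just below 1, as sketched for surface water, is exactly the borderline case that motivated the study's recommendation of emission thresholds.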

  8. 50 CFR 216.215 - Definitions, terms, and criteria

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... design employed in order to determine the appropriate marine mammal mitigation/monitoring measures. (b... energy released during the detonation of an explosive-severance charge. (2) Predetonation survey (required for all scenarios) means any marine mammal monitoring survey (e.g., surface, aerial, or acoustic...

  9. 50 CFR 216.215 - Definitions, terms, and criteria

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... design employed in order to determine the appropriate marine mammal mitigation/monitoring measures. (b... energy released during the detonation of an explosive-severance charge. (2) Predetonation survey (required for all scenarios) means any marine mammal monitoring survey (e.g., surface, aerial, or acoustic...

  10. 50 CFR 216.215 - Definitions, terms, and criteria

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... design employed in order to determine the appropriate marine mammal mitigation/monitoring measures. (b... energy released during the detonation of an explosive-severance charge. (2) Predetonation survey (required for all scenarios) means any marine mammal monitoring survey (e.g., surface, aerial, or acoustic...

  11. Scenario Analysis for the Safety Assessment of Nuclear Waste Repositories: A Critical Review.

    PubMed

    Tosoni, Edoardo; Salo, Ahti; Zio, Enrico

    2018-04-01

    A major challenge in scenario analysis for the safety assessment of nuclear waste repositories pertains to the comprehensiveness of the set of scenarios selected for assessing the safety of the repository. Motivated by this challenge, we discuss the aspects of scenario analysis relevant to comprehensiveness. Specifically, we note that (1) it is necessary to make it clear why scenarios usually focus on a restricted set of features, events, and processes; (2) there is not yet consensus on the interpretation of comprehensiveness for guiding the generation of scenarios; and (3) there is a need for sound approaches to the treatment of epistemic uncertainties. © 2017 Society for Risk Analysis.

  12. Leaked GeV CRs from a Broken Shell: Explaining 9 Years of Fermi-LAT Data of SNR W28

    NASA Astrophysics Data System (ADS)

    Cui, Yudong; Yeung, Paul K. H.; Tam, P. H. Thomas; Pühlhofer, Gerd

    2018-06-01

    Supernova remnant (SNR) W28 is well known for its classic hadronic scenario, in which the TeV cosmic rays (CRs) released at the early stage of this intermediate-age SNR illuminate nearby molecular clouds (MCs). Overwhelming evidence has shown that the northeastern part of the SNR (W28-North) has already encountered the MC clumps. Through this broken shell, W28-North, CRs with energies down to <1 GeV may be injected into nearby MCs. To further test this hadronic scenario, we first analyze nine years of Fermi-LAT data in/around W28 at energies down to 0.3 GeV. Our Fermi-LAT analysis yields a 10–200 GeV skymap that spatially matches the known TeV sources HESS J1801–233 (W28-North) and HESS J1800–240 A, B, and C (240 A, B, and C) well. At low energy bands, we have discovered a 0.5–1 GeV blob located to the south of 240 B and C, and a low 0.3–1 GeV flux at 240 A. A hadronic model is built to explain our analysis results and previous multiwavelength observations of W28. Our model consists of three CR sources: the runaway CRs that escaped from the strong shock, the leaked GeV CRs from the broken shell W28-North, and the local CR sea. By modeling the SNR evolution and the CR acceleration and release, we explain the GeV–TeV emission in/around SNR W28 (except for 240 A) with one model. The damping of the magnetic waves by neutrals and the decreased acceleration efficiency are both taken into account in our model, given the intermediate age of SNR W28.

  13. Development and Demonstration of a Method to Evaluate Bio-Sampling Strategies Using Building Simulation and Sample Planning Software

    PubMed Central

    Dols, W. Stuart; Persily, Andrew K.; Morrow, Jayne B.; Matzke, Brett D.; Sego, Landon H.; Nuffer, Lisa L.; Pulsipher, Brent A.

    2010-01-01

    In an effort to validate and demonstrate response and recovery sampling approaches and technologies, the U.S. Department of Homeland Security (DHS), along with several other agencies, has simulated a biothreat agent release within a facility at Idaho National Laboratory (INL) on two separate occasions, in the fall of 2007 and the fall of 2008. Because these events constitute only two realizations of many possible scenarios, increased understanding of sampling strategies can be obtained by virtually examining a wide variety of release and dispersion scenarios using computer simulations. This research effort demonstrates the use of two software tools, CONTAM, developed by the National Institute of Standards and Technology (NIST), and Visual Sample Plan (VSP), developed by Pacific Northwest National Laboratory (PNNL). The CONTAM modeling software was used to virtually contaminate a model of the INL test building under various release and dissemination scenarios as well as a range of building design and operation parameters. The results of these CONTAM simulations were then used to investigate the relevance and performance of various sampling strategies using VSP. One of the fundamental outcomes of this project was the demonstration of how CONTAM and VSP can be used together to effectively develop sampling plans to support the various stages of response to an airborne chemical, biological, radiological, or nuclear event. Following such an event (or prior to an event), incident details and the conceptual site model could be used to create an ensemble of CONTAM simulations that model contaminant dispersion within a building. These predictions could then be used to identify priority area zones within the building, and sampling designs and strategies could then be developed based on those zones. PMID:27134782

  15. Mitigation of negative ecological and socio-economic impacts of the Diama dam on the Senegal River Delta wetland (Mauritania), using a model based decision support system

    NASA Astrophysics Data System (ADS)

    Duvail, S.; Hamerlynck, O.

  16. Potential effects of the fire protection system sprays at Browns Ferry on fission product transport

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Niemczyk, S.J.

    1983-01-01

    The fire protection system (FPS) sprays within any nuclear plant are not intended to mitigate radioactive releases to the environment resulting from severe core-damage accidents. However, it is shown here that during certain postulated severe accident scenarios at the Browns Ferry Nuclear Plant, the functioning of FPS sprays could have a significant impact on the radioactive releases. Thus the effects of those sprays need to be taken into account for realistic estimation of source terms for some accident scenarios. The effects would include direct ones, such as cooling of the reactor building atmosphere and scrubbing of radioactivity from it, as well as indirect effects, such as an altered likelihood of hydrogen burning and flooding of various safety-related pumps in the reactor building basement. Some of the impacts of the sprays would thus be beneficial with respect to mitigating releases to the environment, but others might not be. The effects of the FPS would be very scenario dependent, with a wide range of potential effects often existing for a given accident sequence. Any generalization of the specific results presented here for Browns Ferry to other nuclear plants must be done cautiously, as a preliminary investigation indicates that the relevant physical and operational characteristics of FPS spray systems differ widely even among otherwise apparently similar plants. Likewise, the standby gas treatment systems, which substantially affect the impact of the FPS, differ significantly among plants. More work on both Mark I and other plants, BWRs and PWRs alike, is indicated so that the potential effects of FPS spray systems during severe accidents can be at least roughly estimated for more realistic accident analyses.

  17. Scenario-targeted toxicity assessment through multiple endpoint bioassays in a soil posing unacceptable environmental risk according to regulatory screening values.

    PubMed

    Rodriguez-Ruiz, A; Etxebarria, J; Boatti, L; Marigómez, I

    2015-09-01

    Lanestosa is a chronically polluted site (a derelict mine) where the soil (Lanestosa (LA) soil) exceeds the screening values (SVs) of the regulatory policies in force (Basque Country; Europe) for Zn, Pb and Cd. A scenario-targeted toxicity assessment was carried out on the basis of a multi-endpoint bioassay approach. Acute and chronic toxicity bioassays were conducted with selected test species (Vibrio fischeri, Dictyostelium discoideum, Lactuca sativa, Raphanus sativus and Eisenia fetida) in combination with chemical analysis of soils and elutriates and with bioaccumulation studies in earthworms. In addition, the toxicity profile was compared with that of the mine runoff (RO) soil and of a fresh artificially polluted soil (LAAPS) resembling the LA soil pollutant profile. Extractability studies in LA soil revealed that Pb, Zn and Cd were highly available for exchange and/or release into the environment. Indeed, Pb and Zn were accumulated in earthworms, and the LA soil proved to be toxic. Soil respiration, V. fischeri, the vegetative and developmental cycles of D. discoideum, and the survival and juvenile production of E. fetida were severely affected. These results confirmed that LA soil posed unacceptable environmental risk and demanded intervention. In contrast, although Pb and Zn concentrations in the RO soil also indicated unacceptable risk, both metal extractability and toxicity were much lower than in LA soil. Thus, within the polluted site, the need for intervention varied between areas posing dissimilar risk. Moreover, since LAAPS, with its high exchangeable metal fraction, was the most toxic, ageing under in situ natural conditions seemingly helped attenuate the LA soil risk. As a whole, combining multi-endpoint bioassays with scenario-targeted analysis (including leaching and ageing) provides reliable risk assessment in soils posing unacceptable environmental risk according to SVs, which is useful for optimising the required intervention measures.

  18. Strong atmospheric chemistry feedback to climate warming from Arctic methane emissions

    USGS Publications Warehouse

    Isaksen, Ivar S.A.; Gauss, Michael; Myhre, Gunnar; Walter Anthony, Katey M.; Ruppel, Carolyn

    2011-01-01

    The magnitude and feedbacks of future methane release from the Arctic region are unknown. Despite limited documentation of potential future releases associated with thawing permafrost and degassing methane hydrates, the large potential for future methane releases calls for improved understanding of the interaction of a changing climate with processes in the Arctic and chemical feedbacks in the atmosphere. Here we apply a “state of the art” atmospheric chemistry transport model to show that large emissions of CH4 would likely have an unexpectedly large impact on the chemical composition of the atmosphere and on radiative forcing (RF). The indirect contribution to RF of additional methane emission is particularly important. It is shown that if global methane emissions were to increase by factors of 2.5 and 5.2 above current emissions, the indirect contributions to RF would be about 250% and 400%, respectively, of the RF that can be attributed to directly emitted methane alone. Assuming several hypothetical scenarios of CH4 release associated with permafrost thaw, shallow marine hydrate degassing, and submarine landslides, we find a strong positive feedback on RF through atmospheric chemistry. In particular, the impact of CH4 is enhanced through increase of its lifetime, and of atmospheric abundances of ozone, stratospheric water vapor, and CO2 as a result of atmospheric chemical processes. Despite uncertainties in emission scenarios, our results provide a better understanding of the feedbacks in the atmospheric chemistry that would amplify climate warming.
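    The quoted indirect contributions can be illustrated with simple arithmetic: total forcing is the direct term scaled by one plus the indirect fraction. A minimal sketch, assuming a purely hypothetical direct-forcing value (the 0.5 W/m² below is a placeholder, not a number from the study; only the 250%/400% fractions come from the abstract):

```python
def total_rf(direct_rf, indirect_fraction):
    """Total radiative forcing when the indirect (chemistry-mediated)
    contribution is expressed as a fraction of the direct CH4 forcing."""
    return direct_rf * (1.0 + indirect_fraction)

# Hypothetical direct forcing of 0.5 W/m^2; per the abstract, a 2.5x
# emission increase yields an indirect contribution ~250% of the direct one.
print(total_rf(0.5, 2.5))  # -> 1.75
```

This is why the indirect term dominates: at 250% of the direct forcing, the chemistry feedback more than triples the total.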

  19. Delivery strategies to control inflammatory response: Modulating M1-M2 polarization in tissue engineering applications.

    PubMed

    Alvarez, Mario Moisés; Liu, Julie C; Trujillo-de Santiago, Grissel; Cha, Byung-Hyun; Vishwakarma, Ajaykumar; Ghaemmaghami, Amir M; Khademhosseini, Ali

    2016-10-28

    Macrophages are key players in many physiological scenarios including tissue homeostasis. In response to injury, typically the balance between macrophage sub-populations shifts from an M1 phenotype (pro-inflammatory) to an M2 phenotype (anti-inflammatory). In tissue engineering scenarios, after implantation of any device, it is desirable to exercise control on this M1-M2 progression and to ensure a timely and smooth transition from the inflammatory to the healing stage. In this review, we briefly introduce the current state of knowledge regarding macrophage function and nomenclature. Next, we discuss the use of controlled release strategies to tune the balance between the M1 and M2 phenotypes in the context of tissue engineering applications. We discuss recent literature related to the release of anti-inflammatory molecules (including nucleic acids) and the sequential release of cytokines to promote a timely M1-M2 shift. In addition, we describe the use of macrophages as controlled release agents upon stimulation by physical and/or mechanical cues provided by scaffolds. Moreover, we discuss current and future applications of "smart" implantable scaffolds capable of controlling the cascade of biochemical events related to healing and vascularization. Finally, we provide our opinion on the current challenges and the future research directions to improve our understanding of the M1-M2 macrophage balance and properly exploit it in tissue engineering and regenerative medicine applications. Copyright © 2016 Elsevier B.V. All rights reserved.

  20. Delivery strategies to control inflammatory response: Modulating M1-M2 polarization in tissue engineering applications

    PubMed Central

    Alvarez, Mario Moisés; Liu, Julie C.; Santiago, Grissel Trujillo-de; Cha, Byung-Hyun; Vishwakarma, Ajaykumar; Ghaemmaghami, Amir; Khademhosseini, Ali

    2016-01-01

    Macrophages are key players in many physiological scenarios including tissue homeostasis. In response to injury, typically the balance between macrophage sub-populations shifts from an M1 phenotype (pro-inflammatory) to an M2 phenotype (anti-inflammatory). In tissue engineering scenarios, after implantation of any device, it is desirable to exercise control on this M1-M2 progression and to ensure a timely and smooth transition from the inflammatory to the healing stage. In this review, we briefly introduce the current state of knowledge regarding macrophage function and nomenclature. Next, we discuss the use of controlled release strategies to tune the balance between the M1 and M2 phenotypes in the context of tissue engineering applications. We discuss recent literature related to the release of anti-inflammatory molecules (including nucleic acids) and the sequential release of cytokines to promote a timely M1-M2 shift. In addition, we describe the use of macrophages as controlled release agents upon stimulation by physical and/or mechanical cues provided by scaffolds. Moreover, we discuss current and future applications of “smart” implantable scaffolds capable of controlling the cascade of biochemical events related to healing and vascularization. Finally, we provide our opinion on the current challenges and the future research directions to improve our understanding of the M1-M2 macrophage balance and properly exploit it in tissue engineering and regenerative medicine applications. PMID:26778695

  1. Hypotheses for a Near-Surface Reservoir of Methane and Its Release on Mars

    NASA Astrophysics Data System (ADS)

    Hu, R.; Bloom, A. A.; Gao, P.; Miller, C. E.; Yung, Y. L.

    2015-12-01

    The Curiosity rover recently detected a background of 0.7 ppb and spikes of 7 ppb of methane on Mars. This in situ measurement reorients our understanding of the Martian environment and its potential for life, as the current theories do not entail any active source or sink of methane. In particular, the 10-fold elevation during the southern winter indicates episodic sources of methane that are yet to be discovered. Using the temperature and humidity measurements from the rover, we find that perchlorate salts in the regolith deliquesce to form liquid solutions, and deliquescence progresses deeper into the subsurface in the season of the methane spikes. We therefore formulate the following three testable hypotheses as an attempt to explain the apparent variability of the atmospheric methane abundance. The first scenario is that the regolith in Gale Crater adsorbs methane when dry and releases this methane to the atmosphere upon deliquescence. The adsorption energy needs to be 36 kJ mol-1 to explain the magnitude of the methane spikes, higher than laboratory measurements. The second scenario is that microorganisms exist and convert organic matter in the soil to methane when they are in liquid solutions. This scenario does not require regolith adsorption. The third scenario is that deep subsurface aquifers sealed by ice or clathrate produce bursts of methane as a result of freezing and thawing of the permafrost, as in the terrestrial Arctic tundra. Continued monitoring of methane by Curiosity will test the existence of the near-surface reservoir and its exchange with the atmosphere.
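    The temperature sensitivity implied by the quoted 36 kJ mol⁻¹ adsorption energy can be illustrated with the integrated van't Hoff relation. This is a generic sketch, not the paper's actual adsorption model; the two temperatures below are hypothetical illustrative values:

```python
import math

R = 8.314  # gas constant, J mol^-1 K^-1

def vant_hoff_ratio(delta_h, t1, t2):
    """K(t2)/K(t1) from the integrated van't Hoff equation:
    ln(K2/K1) = -(delta_h / R) * (1/t2 - 1/t1)."""
    return math.exp(-delta_h / R * (1.0 / t2 - 1.0 / t1))

# Exothermic adsorption with |dH| = 36 kJ/mol (value from the abstract);
# hypothetical Martian surface temperatures of 270 K vs 210 K.
ratio = vant_hoff_ratio(-36e3, 270.0, 210.0)
print(ratio)
```

The ratio is well above 1, i.e. adsorption is strongly favored at the colder temperature, which is the qualitative behavior a seasonal adsorb/release cycle would need.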

  2. Limitations and information needs for engineered nanomaterial-specific exposure estimation and scenarios: recommendations for improved reporting practices

    NASA Astrophysics Data System (ADS)

    Clark, Katherine; van Tongeren, Martie; Christensen, Frans M.; Brouwer, Derk; Nowack, Bernd; Gottschalk, Fadri; Micheletti, Christian; Schmid, Kaspar; Gerritsen, Rianda; Aitken, Rob; Vaquero, Celina; Gkanis, Vasileios; Housiadas, Christos; de Ipiña, Jesús María López; Riediker, Michael

    2012-09-01

    The aim of this paper is to describe the process and challenges in building exposure scenarios for engineered nanomaterials (ENM), using an exposure scenario format similar to that used for the European Chemicals regulation (REACH). Over 60 exposure scenarios were developed based on information from publicly available sources (literature, books, and reports), publicly available exposure estimation models, occupational sampling campaign data from partnering institutions, and industrial partners regarding their own facilities. The primary focus was on carbon-based nanomaterials, nano-silver (nano-Ag) and nano-titanium dioxide (nano-TiO2), and included occupational and consumer uses of these materials with consideration of the associated environmental release. The process of building exposure scenarios illustrated the availability and limitations of existing information and exposure assessment tools for characterizing exposure to ENM, particularly as it relates to risk assessment. This article describes the gaps in the information reviewed, recommends future areas of ENM exposure research, and proposes types of information that should, at a minimum, be included when reporting the results of such research, so that the information is useful in a wider context.

  3. Quantitative Risk Analysis on the Transport of Dangerous Goods Through a Bi-Directional Road Tunnel.

    PubMed

    Caliendo, Ciro; De Guglielmo, Maria Luisa

    2017-01-01

    A quantitative risk analysis (QRA) regarding dangerous goods vehicles (DGVs) running through road tunnels was set up. Peak hourly traffic volumes (VHP), percentage of heavy goods vehicles (HGVs), and failure of the emergency ventilation system were investigated in order to assess their impact on the risk level. The risk associated with an alternative route running completely in the open air and passing through a highly populated urban area was also evaluated. The results in terms of social risk, as F/N curves, show an increased risk level with an increase in the VHP, the percentage of HGVs, and a failure of the emergency ventilation system. The risk curves of the tunnel investigated were found to lie both above and below those of the alternative route running in the open air depending on the type of dangerous goods transported. In particular, risk was found to be greater in the tunnel for two fire scenarios (no explosion). In contrast, the risk level for the exposed population was found to be greater for the alternative route in three possible accident scenarios associated with explosions and toxic releases. Therefore, one should be wary before stating that for the transport of dangerous products an itinerary running completely in the open air might be used if the latter passes through a populated area. The QRA may help decision makers both to implement additional safety measures and to understand whether to allow, forbid, or limit circulation of DGVs. © 2016 Society for Risk Analysis.
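    The F/N curves used to express social risk plot, for each fatality count N, the cumulative frequency of accident scenarios causing at least N fatalities. A minimal sketch; the scenario frequencies and fatality counts below are hypothetical, not values from the study:

```python
def fn_curve(scenarios):
    """Build F/N points from a list of (frequency_per_year, fatalities)
    tuples: for each fatality count N appearing in the data, sum the
    frequencies of all scenarios causing >= N fatalities."""
    ns = sorted({n for _, n in scenarios})
    return [(n, sum(f for f, m in scenarios if m >= n)) for n in ns]

# Hypothetical DGV accident scenarios: (frequency per year, fatalities)
scenarios = [(1e-4, 5), (1e-5, 50), (1e-6, 200)]
print(fn_curve(scenarios))
```

Plotted on log-log axes, these points form the monotonically decreasing curve that is compared against tolerability criteria.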

  4. Gravity-Driven Flow of non-Newtonian Fluids in Heterogeneous Porous Media: a Theoretical and Experimental Analysis

    NASA Astrophysics Data System (ADS)

    Di Federico, V.; Longo, S.; Ciriello, V.; Chiapponi, L.

    2015-12-01

    A theoretical and experimental analysis of non-Newtonian gravity-driven flow in porous media with spatially variable properties is presented. The motivation for our study is the rheological complexity exhibited by several environmental contaminants (wastewater sludge, oil pollutants, waste produced by the minerals and coal industries) and remediation agents (suspensions employed to enhance the efficiency of in-situ remediation). Natural porous media are inherently heterogeneous, and this heterogeneity influences the extent and shape of the porous domain invaded by the contaminant or remediation agent. To grasp the combined effect of rheology and spatial heterogeneity, we consider: a) the release of a thin current of non-Newtonian power-law fluid into a 2-D, semi-infinite and saturated porous medium above a horizontal bed; b) perfectly stratified media, with permeability and porosity varying along the direction transverse (vertical) or parallel (horizontal) to the flow direction. This continuous variation of spatial properties is described by two additional parameters. In order to represent several possible spreading scenarios, we consider: i) instantaneous injection with constant mass; ii) continuous injection with time-variable mass; iii) instantaneous release of a mound of fluid, which can drain freely out of the formation at the origin (dipole flow). Under these assumptions, scalings for current length and thickness are derived in self similar form. An analysis of the conditions on model parameters required to avoid an unphysical or asymptotically invalid result is presented. Theoretical results are validated against multiple sets of experiments, conducted for different combinations of spreading scenarios and types of stratification. 
Two basic setups are employed for the experiments: I) direct flow simulation in an artificial porous medium constructed superimposing layers of glass beads of different diameter; II) a Hele-Shaw (HS) analogue made of two parallel plates set at an angle. The HS analogy is extended to power-law fluid flow in porous media with variable properties parallel or transverse to the flow direction. Comparison with experimental results shows that the proposed models capture the propagation of the current front and the current profile at intermediate and late time.

  5. Improved Algorithms for Blending Dam Releases to Meet Downstream Water-Temperature Targets in the CE-QUAL-W2 Water-Quality Model

    NASA Astrophysics Data System (ADS)

    Rounds, S. A.; Buccola, N. L.

    2014-12-01

    The two-dimensional (longitudinal, vertical) water-quality model CE-QUAL-W2, version 3.7, was enhanced with new features to help dam operators and managers efficiently explore and optimize potential solutions for temperature management downstream of thermally stratified reservoirs. Such temperature management often is accomplished by blending releases from multiple dam outlets that access water of different temperatures at different depths in the reservoir. The original blending algorithm in this version of the model was limited to mixing releases from two outlets at a time, and few constraints could be imposed. The new enhanced blending algorithm allows the user to (1) specify a time-series of target release temperatures, (2) designate from 2 to 10 floating or fixed-elevation outlets for blending, (3) impose maximum head constraints as well as minimum and maximum flow constraints for any blended outlet, and (4) set a priority designation for each outlet that allows the model to choose which outlets to use and how to balance releases among them. The modified model was tested against a previously calibrated model of Detroit Lake on the North Santiam River in northwestern Oregon, and the results compared well. The enhanced model code is being used to evaluate operational and structural scenarios at multiple dam/reservoir systems in the Willamette River basin in Oregon, where downstream temperature management for endangered fish is a high priority for resource managers and dam operators. These updates to the CE-QUAL-W2 blending algorithm allow scenarios involving complicated dam operations and/or hypothetical outlet structures to be evaluated more efficiently with the model, with decreased need for multiple/iterative model runs or preprocessing of model inputs to fully characterize the operational constraints.
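    For the simple two-outlet case, the core of such a blending scheme is a mixing (mass/energy) balance solved for the flow split. A minimal sketch under stated assumptions: the function name and constraint handling are illustrative, and the actual CE-QUAL-W2 algorithm additionally supports up to ten outlets, head constraints, and outlet priorities:

```python
def blend_two_outlets(t_target, q_total, t_upper, t_lower,
                      q_min=0.0, q_max=None):
    """Split a total release q_total between two outlets drawing water at
    temperatures t_upper and t_lower so the mix hits t_target. Per-outlet
    flows are clipped to [q_min, q_max] (illustrative constraint handling).
    Balance: q_u*t_upper + q_l*t_lower = q_total*t_target, q_u + q_l = q_total.
    """
    if q_max is None:
        q_max = q_total
    frac_upper = (t_target - t_lower) / (t_upper - t_lower)
    q_upper = min(max(frac_upper * q_total, q_min), q_max)
    q_lower = q_total - q_upper
    return q_upper, q_lower

# Target 12 deg C from a 16 deg C surface outlet and an 8 deg C deep outlet:
print(blend_two_outlets(12.0, 100.0, 16.0, 8.0))  # -> (50.0, 50.0)
```

When the clipping binds (e.g. the warm outlet hits its maximum flow), the achieved temperature deviates from the target, which is exactly the situation the priority and constraint logic in the enhanced algorithm is meant to manage.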

  6. eHive: an artificial intelligence workflow system for genomic analysis.

    PubMed

    Severin, Jessica; Beal, Kathryn; Vilella, Albert J; Fitzgerald, Stephen; Schuster, Michael; Gordon, Leo; Ureta-Vidal, Abel; Flicek, Paul; Herrero, Javier

    2010-05-11

    The Ensembl project produces updates to its comparative genomics resources with each of its several releases per year. During each release cycle approximately two weeks are allocated to generate all the genomic alignments and the protein homology predictions. The number of calculations required for this task grows approximately quadratically with the number of species. We currently support 50 species in Ensembl and we expect the number to continue to grow in the future. We present eHive, a new fault tolerant distributed processing system initially designed to support comparative genomic analysis, based on blackboard systems, network distributed autonomous agents, dataflow graphs and block-branch diagrams. In the eHive system a MySQL database serves as the central blackboard and the autonomous agent, a Perl script, queries the system and runs jobs as required. The system allows us to define dataflow and branching rules to suit all our production pipelines. We describe the implementation of three pipelines: (1) pairwise whole genome alignments, (2) multiple whole genome alignments and (3) gene trees with protein homology inference. Finally, we show the efficiency of the system in real case scenarios. eHive allows us to produce computationally demanding results in a reliable and efficient way with minimal supervision and high throughput. Further documentation is available at: http://www.ensembl.org/info/docs/eHive/.
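    The blackboard pattern described above can be sketched as a shared job table that autonomous workers poll, claim, and run. This is an illustrative stand-in using Python and SQLite; eHive itself uses a MySQL database as the blackboard and Perl worker agents with dataflow and branching rules:

```python
import sqlite3

# Shared "blackboard": a table of jobs that workers claim and complete.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE jobs (id INTEGER PRIMARY KEY, "
           "name TEXT, status TEXT DEFAULT 'READY')")
db.executemany("INSERT INTO jobs (name) VALUES (?)",
               [("align_pair",), ("align_multi",), ("gene_trees",)])

def claim_and_run(conn):
    """Worker loop: repeatedly claim one READY job and mark it DONE."""
    done = []
    while True:
        row = conn.execute("SELECT id, name FROM jobs "
                           "WHERE status='READY' ORDER BY id LIMIT 1").fetchone()
        if row is None:
            return done
        conn.execute("UPDATE jobs SET status='DONE' WHERE id=?", (row[0],))
        done.append(row[1])

print(claim_and_run(db))  # -> ['align_pair', 'align_multi', 'gene_trees']
```

Because all state lives in the database rather than in the workers, additional workers can join or die at any time, which is the source of the fault tolerance described in the abstract.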

  7. Simulations of a hypothetical temperature control structure at Detroit Dam on the North Santiam River, northwestern Oregon

    USGS Publications Warehouse

    Buccola, Norman L.; Stonewall, Adam J.; Rounds, Stewart A.

    2015-01-01

    Estimated egg-emergence days for endangered Upper Willamette River Chinook salmon (Oncorhynchus tshawytscha) and Upper Willamette River winter steelhead (Oncorhynchus mykiss) were assessed for all scenarios. Estimated spring Chinook fry emergence under SlidingWeir scenarios was 9 days later immediately downstream of Big Cliff Dam, and 4 days later at Greens Bridge compared with existing structural scenarios at Detroit Dam. Despite the inclusion of a hypothetical sliding weir at Detroit Dam, temperatures exceeded without-dams temperatures during November and December. These late-autumn exceedances likely represent the residual thermal effect of Detroit Lake operated to meet minimum dry-season release rates (supporting instream habitat and irrigation requirements) and lake levels specified by the current (2014) operating rules (supporting recreation and flood mitigation).

  8. Observation-based modelling of permafrost carbon fluxes with accounting for deep carbon deposits and thermokarst activity

    NASA Astrophysics Data System (ADS)

    Schneider von Deimling, T.; Grosse, G.; Strauss, J.; Schirrmeister, L.; Morgenstern, A.; Schaphoff, S.; Meinshausen, M.; Boike, J.

    2015-06-01

    High-latitude soils store vast amounts of perennially frozen and therefore inert organic matter. With rising global temperatures and consequent permafrost degradation, a part of this carbon stock will become available for microbial decay and eventual release to the atmosphere. We have developed a simplified, two-dimensional multi-pool model to estimate the strength and timing of future carbon dioxide (CO2) and methane (CH4) fluxes from newly thawed permafrost carbon (i.e. carbon thawed when temperatures rise above pre-industrial levels). We have especially simulated carbon release from deep deposits in Yedoma regions by describing abrupt thaw under newly formed thermokarst lakes. The computational efficiency of our model allowed us to run large, multi-centennial ensembles under various scenarios of future warming to express uncertainty inherent to simulations of the permafrost carbon feedback. Under moderate warming of the representative concentration pathway (RCP) 2.6 scenario, cumulated CO2 fluxes from newly thawed permafrost carbon amount to 20 to 58 petagrams of carbon (Pg-C) (68% range) by the year 2100 and reach 40 to 98 Pg-C in 2300. The much larger permafrost degradation under strong warming (RCP8.5) results in cumulated CO2 release of 42 to 141 Pg-C and 157 to 313 Pg-C (68% ranges) in the years 2100 and 2300, respectively. Our estimates only consider fluxes from newly thawed permafrost, not from soils already part of the seasonally thawed active layer under pre-industrial climate. Our simulated CH4 fluxes contribute a few percent to total permafrost carbon release yet they can cause up to 40% of total permafrost-affected radiative forcing in the 21st century (upper 68% range). We infer largest CH4 emission rates of about 50 Tg-CH4 per year around the middle of the 21st century when simulated thermokarst lake extent is at its maximum and when abrupt thaw under thermokarst lakes is taken into account. 
CH4 release from newly thawed carbon in wetland-affected deposits is only discernible in the 22nd and 23rd century because of the absence of abrupt thaw processes. We further show that release from organic matter stored in deep deposits of Yedoma regions crucially affects our simulated circumpolar CH4 fluxes. The additional warming through the release from newly thawed permafrost carbon proved only slightly dependent on the pathway of anthropogenic emission and amounts to about 0.03-0.14 °C (68% ranges) by end of the century. The warming increased further in the 22nd and 23rd century and was most pronounced under the RCP6.0 scenario, adding 0.16 to 0.39 °C (68% range) to simulated global mean surface air temperatures in the year 2300.
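    The multi-pool approach can be illustrated with first-order decay of already-thawed carbon stocks. This is a deliberately minimal sketch with hypothetical pool sizes and rate constants; the actual model adds the thaw dynamics, thermokarst-lake formation, and the CO2/CH4 partitioning described above:

```python
def permafrost_release(pools, rates, dt, steps):
    """First-order decay of thawed carbon pools.
    pools: initial carbon stocks (Pg-C); rates: decay constants (1/yr);
    dt: time step (yr); steps: number of steps.
    Returns cumulative released carbon and the remaining pool sizes."""
    pools = list(pools)
    released = 0.0
    for _ in range(steps):
        for i, k in enumerate(rates):
            flux = k * pools[i] * dt  # explicit Euler step
            pools[i] -= flux
            released += flux
    return released, pools

# Hypothetical fast (1%/yr) and slow (0.1%/yr) pools over 100 years:
released, remaining = permafrost_release([100.0, 50.0], [0.01, 0.001], 1.0, 100)
print(round(released, 2), [round(p, 2) for p in remaining])
```

A useful sanity check on any such scheme is carbon conservation: released plus remaining carbon must equal the initial stock.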

  9. Observation-based modelling of permafrost carbon fluxes with accounting for deep carbon deposits and thermokarst activity

    NASA Astrophysics Data System (ADS)

    Schneider von Deimling, T.; Grosse, G.; Strauss, J.; Schirrmeister, L.; Morgenstern, A.; Schaphoff, S.; Meinshausen, M.; Boike, J.

    2014-12-01

    High-latitude soils store vast amounts of perennially frozen and therefore inert organic matter. With rising global temperatures and consequent permafrost degradation, a part of this carbon store will become available for microbial decay and eventual release to the atmosphere. We have developed a simplified, two-dimensional multi-pool model to estimate the strength and timing of future carbon dioxide (CO2) and methane (CH4) fluxes from newly thawed permafrost carbon (i.e. carbon thawed when temperatures rise above pre-industrial levels). We have especially simulated carbon release from deep deposits in Yedoma regions by describing abrupt thaw under thermokarst lakes. The computational efficiency of our model allowed us to run large, multi-centennial ensembles under various scenarios of future warming to express uncertainty inherent to simulations of the permafrost-carbon feedback. Under moderate warming of the representative concentration pathway (RCP) 2.6 scenario, cumulated CO2 fluxes from newly thawed permafrost carbon amount to 20 to 58 petagrams of carbon (Pg-C) (68% range) by the year 2100 and reach 40 to 98 Pg-C in 2300. The much larger permafrost degradation under strong warming (RCP8.5) results in cumulated CO2 release of 42-141 and 157-313 Pg-C (68% ranges) in the years 2100 and 2300, respectively. Our estimates only consider fluxes from newly thawed permafrost, not from soils already part of the seasonally thawed active layer under preindustrial climate. Our simulated methane fluxes contribute a few percent to total permafrost carbon release yet they can cause up to 40% of total permafrost-affected radiative forcing in the 21st century (upper 68% range). We infer the largest methane emission rates of about 50 Tg-CH4 per year around the middle of the 21st century when simulated thermokarst lake extent is at its maximum and when abrupt thaw under thermokarst lakes is accounted for. 
CH4 release from newly thawed carbon in wetland-affected deposits is only discernible in the 22nd and 23rd century because of the absence of abrupt thaw processes. We further show that release from organic matter stored in deep deposits of Yedoma regions crucially affects our simulated circumpolar methane fluxes. The additional warming through the release from newly thawed permafrost carbon proved only slightly dependent on the pathway of anthropogenic emission and amounts to about 0.03-0.14 °C (68% ranges) by end of the century. The warming increased further in the 22nd and 23rd century and was most pronounced under the RCP6.0 scenario, adding 0.16-0.39 °C (68% range) to simulated global mean surface air temperatures in the year 2300.

  10. Security Analysis of Selected AMI Failure Scenarios Using Agent Based Game Theoretic Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abercrombie, Robert K; Schlicher, Bob G; Sheldon, Frederick T

    Information security analysis can be performed using game theory implemented in dynamic Agent Based Game Theoretic (ABGT) simulations. Such simulations can be verified with the results from game theory analysis and further used to explore larger scale, real world scenarios involving multiple attackers, defenders, and information assets. We concentrated our analysis on the Advanced Metering Infrastructure (AMI) functional domain, for which the National Electric Sector Cybersecurity Organization Resource (NESCOR) working group has currently documented 29 failure scenarios. The strategy for the game was developed by analyzing five electric sector representative failure scenarios contained in the AMI functional domain. We characterize these five selected scenarios into three specific threat categories affecting confidentiality, integrity, and availability (CIA). The analysis using our ABGT simulation demonstrates how to model the AMI functional domain using a set of rationalized game theoretic rules decomposed from the failure scenarios in terms of how those scenarios might impact the AMI network with respect to CIA.
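    A minimal version of such a game can be written as a normal-form payoff matrix over the three CIA threat categories, with the attacker choosing a best response to the defender's hardening choice. All payoff values below are hypothetical placeholders; the actual rules in the study are decomposed from the NESCOR failure scenarios:

```python
# Rows: the category the defender hardens; columns: the category the
# attacker targets; entries: attacker payoff (hypothetical values).
# The attacker gains least when targeting the hardened category.
ATTACKER_PAYOFF = {
    "confidentiality": {"confidentiality": 1, "integrity": 4, "availability": 3},
    "integrity":       {"confidentiality": 5, "integrity": 1, "availability": 3},
    "availability":    {"confidentiality": 5, "integrity": 4, "availability": 1},
}

def attacker_best_response(defended):
    """Attacker's best pure-strategy response to a defended category."""
    row = ATTACKER_PAYOFF[defended]
    return max(row, key=row.get)

print(attacker_best_response("confidentiality"))  # -> 'integrity'
```

An agent-based simulation repeats this interaction over time with stateful attacker and defender agents, which is what lets it scale past the one-shot analysis to the multi-attacker, multi-asset scenarios mentioned in the abstract.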

  11. From scenarios to domain models: processes and representations

    NASA Astrophysics Data System (ADS)

    Haddock, Gail; Harbison, Karan

    1994-03-01

    The domain specific software architectures (DSSA) community has defined a philosophy for the development of complex systems. This philosophy improves productivity and efficiency by increasing the user's role in the definition of requirements, increasing the systems engineer's role in the reuse of components, and decreasing the software engineer's role to the development of new components and component modifications only. The scenario-based engineering process (SEP), the first instantiation of the DSSA philosophy, has been adopted by the next generation controller project. It is also the chosen methodology of the trauma care information management system project, and the surrogate semi-autonomous vehicle project. SEP uses scenarios from the user to create domain models and define the system's requirements. Domain knowledge is obtained from a variety of sources including experts, documents, and videos. This knowledge is analyzed using three techniques: scenario analysis, task analysis, and object-oriented analysis. Scenario analysis results in formal representations of selected scenarios. Task analysis of the scenario representations results in descriptions of tasks necessary for object-oriented analysis and also subtasks necessary for functional system analysis. Object-oriented analysis of task descriptions produces domain models and system requirements. This paper examines the representations that support the DSSA philosophy, including reference requirements, reference architectures, and domain models. The processes used to create and use the representations are explained through use of the scenario-based engineering process. Selected examples are taken from the next generation controller project.

  12. Embedded CLIPS for SDI BM/C3 simulation and analysis

    NASA Technical Reports Server (NTRS)

    Gossage, Brett; Nanney, Van

    1990-01-01

    Nichols Research Corporation is developing the BM/C3 Requirements Analysis Tool (BRAT) for the U.S. Army Strategic Defense Command. BRAT uses embedded CLIPS/Ada to model the decision making processes used by the human commander of a defense system. Embedding CLIPS/Ada in BRAT allows the user to explore the role of the human in Command and Control (C2) and the use of expert systems for automated C2. BRAT models assert facts about the current state of the system, the simulated scenario, and threat information into CLIPS/Ada. A user-defined rule set describes the decision criteria for the commander. We have extended CLIPS/Ada with user-defined functions that allow the firing of a rule to invoke a system action such as weapons release or a change in strategy. The use of embedded CLIPS/Ada will provide a powerful modeling tool for our customer at minimal cost.

  13. Metal release from contaminated coastal sediments under changing pH conditions: Implications for metal mobilization in acidified oceans.

    PubMed

    Wang, Zaosheng; Wang, Yushao; Zhao, Peihong; Chen, Liuqin; Yan, Changzhou; Yan, Yijun; Chi, Qiaoqiao

    2015-12-30

    To investigate the impacts and processes of CO2-induced acidification on metal mobilization, laboratory-scale experiments were performed, simulating the scenarios where carbon dioxide was injected into sediment-seawater layers inside non-pressurized chambers. Coastal sediments were sampled from two sites with different contamination levels and subjected to pre-determined pH conditions. Sediment samples and overlying water were collected for metal analysis after 10 days. The results indicated that CO2-induced ocean acidification would provoke increased metal mobilization causing adverse side-effects on water quality. The mobility of metals from sediment to the overlying seawater was correlated with the reduction in pH. Results of sequential extractions of sediments illustrated that exchangeable metal forms were the dominant source of mobile metals. Collectively, our data revealed that high metal concentrations in overlying seawater released from contaminated sediments under acidic conditions may strengthen the existing contamination gradients in Maluan Bay and represent a potential risk to ecosystem health in coastal environments. Copyright © 2015 Elsevier Ltd. All rights reserved.

  14. An analysis of reservoir storage contents for the proposed enlargement of Bumping Lake in Washington

    USGS Publications Warehouse

    Bartells, John H.

    1981-01-01

    Month-end contents of the proposed Bumping Lake enlargement in Washington were determined for the period 1917-73 by using the historic streamflow record as a basis for inflow to the reservoir and imposing certain operational requirements on reservoir releases. The operational requirements used were those set forth in the Environmental Statement that was prepared for the project by the U.S. Water and Power Resources Service (formerly the U.S. Bureau of Reclamation). The computations show that after starting with an empty reservoir on October 1, 1917, the 458,000 acre-feet capacity of the enlarged lake could be nearly full (384,000 acre-feet) on November 30, 1925, which is the starting scenario given in the Environmental Statement. From 1925 to 1973 the computations compare favorably with those shown in the Environmental Statement, with minor differences due to releases of fish enhancement water. In all cases studied, all fishwater requirements and all existing irrigation requirements were met, and generally from 30,000 to 90,000 acre-feet of extra irrigation water could be available in most years. (USGS)

  15. Climate stabilization wedges in action: a systems approach to energy sustainability for Hawaii Island.

    PubMed

    Johnson, Jeremiah; Chertow, Marian

    2009-04-01

    Pacala and Socolow developed a framework to stabilize global greenhouse gas levels for the next fifty years using wedges of constant size representing an increasing use of existing technologies and approaches for energy efficiency, carbon free generation, renewables, and carbon storage. The research presented here applies their approach to Hawaii Island, with modifications to support local scale analysis and employing a "bottom-up" methodology that allows for wedges of various sizes. A discretely bounded spatial unit offers a testing ground for a holistic approach to improving the energy sector with the identification of local options and limitations to the implementation of a comprehensive energy strategy. Nearly 80% of total primary energy demand across all sectors for Hawaii Island is currently met using petroleum-based fuels. The Sustainable Energy Plan scenario included here presents an internally consistent set of recommendations bounded by local constraints in areas such as transportation efficiency, centralized renewable generation (e.g., geothermal, wind), reduction in transmission losses, and improved building efficiency. This scenario shows that the demand for primary energy in 2030 could be reduced by 23% through efficiency measures while 46% could be met by renewable generation, resulting in only 31% of the projected demand being met by fossil fuels. In 2030, the annual releases of greenhouse gases would be 3.2 Mt CO2-eq/year under the Baseline scenario, while the Sustainable Energy Plan would reduce this to 1.2 Mt CO2-eq/year--an annual emissions rate 40% below 2006 levels and 10% below 1990 levels. The total for greenhouse gas emissions during the 24-year study period (2007 to 2030) is 59.9 Mt CO2-eq under the Baseline scenario and 32.5 Mt CO2-eq under the Sustainable Energy Plan scenario. Numerous combinations of efficiency and renewable energy options can be employed in a manner that stabilizes the greenhouse gas emissions of Hawaii Island.

  16. Impact of Spatial Scales on the Intercomparison of Climate Scenarios

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luo, Wei; Steptoe, Michael; Chang, Zheng

    2017-01-01

    Scenario analysis has been widely applied in climate science to understand the impact of climate change on the future human environment, but intercomparison and similarity analysis of different climate scenarios based on multiple simulation runs remain challenging. Although spatial heterogeneity plays a key role in modeling climate and human systems, little research has been performed to understand the impact of spatial variations and scales on similarity analysis of climate scenarios. To address this issue, the authors developed a geovisual analytics framework that lets users perform similarity analysis of climate scenarios from the Global Change Assessment Model (GCAM) using a hierarchical clustering approach.
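    The similarity analysis described rests on hierarchical (agglomerative) clustering. Below is a minimal, self-contained sketch of single-linkage agglomerative clustering on toy scenario vectors; the scenario names and values are invented placeholders, not GCAM output:

```python
import math

# Toy "scenario output" vectors (e.g., regional anomalies); placeholders only.
scenarios = {
    "ref":      [2.1, 3.4, 5.0],
    "low_co2":  [1.1, 1.9, 2.8],
    "high_co2": [2.3, 3.6, 5.3],
}

def dist(a, b):
    # Euclidean distance between two scenario vectors
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Single-linkage agglomerative clustering: repeatedly merge the two closest
# clusters (closest pair of members) until one cluster remains.
clusters = [[name] for name in scenarios]
merges = []
while len(clusters) > 1:
    i, j = min(
        ((i, j) for i in range(len(clusters)) for j in range(i + 1, len(clusters))),
        key=lambda ij: min(
            dist(scenarios[a], scenarios[b])
            for a in clusters[ij[0]] for b in clusters[ij[1]]
        ),
    )
    merges.append((clusters[i], clusters[j]))
    clusters = [c for k, c in enumerate(clusters) if k not in (i, j)] + [
        clusters[i] + clusters[j]
    ]

print(merges[0])  # the two most similar scenarios merge first
```

    Single linkage is just one common linkage choice; the GCAM framework in the record may use a different linkage or distance.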

  17. Updated observational constraints on quintessence dark energy models

    NASA Astrophysics Data System (ADS)

    Durrive, Jean-Baptiste; Ooba, Junpei; Ichiki, Kiyotomo; Sugiyama, Naoshi

    2018-02-01

    The recent GW170817 measurement favors the simplest dark energy models, such as a single scalar field. Quintessence models can be classified into two classes, freezing and thawing, depending on whether the equation of state decreases towards -1 or departs from it. In this paper, we put observational constraints on the parameters governing the equations of state of tracking freezing, scaling freezing, and thawing models using updated data from the Planck 2015 release, the joint light-curve analysis, and baryonic acoustic oscillations. Because of the current tension in the value of the Hubble parameter H0, unlike previous authors, we let this parameter vary, which significantly modifies the results. Finally, we also derive constraints on neutrino masses in each of these scenarios.

  18. RedeR: R/Bioconductor package for representing modular structures, nested networks and multiple levels of hierarchical associations

    PubMed Central

    2012-01-01

    Visualization and analysis of molecular networks are both central to systems biology. However, there still exists a large technological gap between them, especially when assessing multiple network levels or hierarchies. Here we present RedeR, an R/Bioconductor package combined with a Java core engine for representing modular networks. The functionality of RedeR is demonstrated in two different scenarios: hierarchical and modular organization in gene co-expression networks and nested structures in time-course gene expression subnetworks. Our results demonstrate RedeR as a new framework to deal with the multiple network levels that are inherent to complex biological systems. RedeR is available from http://bioconductor.org/packages/release/bioc/html/RedeR.html. PMID:22531049

  19. Scenario design and basic analysis of the National Data Centre Preparedness Exercise 2013

    NASA Astrophysics Data System (ADS)

    Ross, Ole; Ceranna, Lars; Hartmann, Gernot; Gestermann, Nicolai; Bönneman, Christian

    2014-05-01

    The Comprehensive Nuclear-Test-Ban Treaty (CTBT) prohibits all kinds of nuclear explosions. For the detection of treaty violations, the International Monitoring System (IMS) operates stations observing seismic, hydroacoustic, and infrasound signals as well as radioisotopes in the atmosphere. While the IMS data are collected, processed, and technically analyzed in the International Data Center (IDC) of the CTBT Organization, National Data Centers (NDCs) provide interpretation and advice to their governments concerning suspicious detections occurring in IMS data. NDC Preparedness Exercises (NPEs) dealing with fictitious treaty violations are performed regularly to practice the combined analysis of the CTBT verification technologies and to foster the mutual exchange of information among NDCs and with the IDC. The NPE2010 and NPE2012 trigger scenarios were based on selected seismic events from the Reviewed Event Bulletin (REB) serving as the starting point for a fictitious radionuclide dispersion. The main task was the identification of the original REB event and the discrimination between earthquakes and explosions as the source. The scenario design of NPE2013 differs from those of previous NPEs. The waveform event selection is not constrained to events in the REB. The exercise trigger is a combination of a spatio-temporal indication pointing to a certain waveform event and simulated radionuclide concentrations generated by forward atmospheric transport modelling based on a fictitious release. For the waveform event, the date (4 Sept. 2013) is given and the region is communicated in a map showing the fictitious state of "Frisia" on the coast of the North Sea in Central Europe. The synthetic radionuclide detections start in Vienna (8 Sept, I-131) and Schauinsland (11 Sept, Xe-133) with rather low activity concentrations and are most prominent in Stockholm and Spitsbergen in mid-September 2013. Smaller concentrations in Asia follow later on. 
The potential connection between the waveform and radionuclide evidence remains unclear. The verification task is to identify the waveform event and to investigate potential sources of the radionuclide findings. Finally, the potential connection between the sources and the CTBT relevance of the whole picture has to be evaluated. The overall question is whether requesting an On-Site Inspection in "Frisia" would be justified. The poster presents the NPE2013 scenario and gives a basic analysis of the initial situation concerning both waveform detections and atmospheric dispersion conditions in Central Europe in early September 2013. The full NPE2013 scenario will be presented at the NDC Workshop in mid-May 2014.

  20. Scale-Independent Relational Query Processing

    ERIC Educational Resources Information Center

    Armbrust, Michael Paul

    2013-01-01

    An increasingly common pattern is for newly-released web applications to succumb to a "Success Disaster". In this scenario, overloaded database machines and resultant high response times destroy a previously good user experience, just as a site is becoming popular. Unfortunately, the data independence provided by a traditional relational…

  1. Chapter 9: Implications of Air Pollutant Emissions from Producing Agricultural and Forestry Feedstocks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Warner, Ethan; Zhang, Yi Min; Inman, Daniel J

    The 2016 Billion-Ton Report (BT16), Volume 2: Environmental Sustainability Effects of Select Scenarios from Volume 1, jointly released by the U.S. Department of Energy's Bioenergy Technologies Office (BETO) and Oak Ridge National Laboratory (ORNL), is a pioneering effort to analyze a range of potential environmental effects associated with illustrative near-term and long-term biomass-production scenarios from the 2016 Billion-Ton Report, Volume 1. This chapter of the 2016 Billion-Ton Report, Volume 2, was authored by NREL researchers Ethan Warner, Yimin Zhang, Danny Inman, Annika Eberle, Alberta Carpenter, Garvin Heath, and Dylan Hettinger.

  2. Applying Qualitative Hazard Analysis to Support Quantitative Safety Analysis for Proposed Reduced Wake Separation Conops

    NASA Technical Reports Server (NTRS)

    Shortle, John F.; Allocco, Michael

    2005-01-01

    This paper describes a scenario-driven hazard analysis process to identify, eliminate, and control safety-related risks. Within this process, we develop selective criteria to determine whether engineering modeling should be applied to hypothesized hazard scenarios. This provides a basis for evaluating and prioritizing the scenarios as candidates for further quantitative analysis. We have applied this methodology to proposed concepts of operations for reduced wake separation for closely spaced parallel runways. For arrivals, the process identified 43 core hazard scenarios. Of these, we classified 12 as appropriate for further quantitative modeling, 24 that should be mitigated through controls, recommendations, and/or procedures (that is, scenarios not appropriate for quantitative modeling), and 7 that have the lowest priority for further analysis.

  3. Toward a consistent model for strain accrual and release for the New Madrid Seismic Zone, central United States

    USGS Publications Warehouse

    Hough, S.E.; Page, M.

    2011-01-01

    At the heart of the conundrum of seismogenesis in the New Madrid Seismic Zone is the apparently substantial discrepancy between low strain rate and high recent seismic moment release. In this study we revisit the magnitudes of the four principal 1811–1812 earthquakes using intensity values determined from individual assessments from four experts. Using these values and the grid search method of Bakun and Wentworth (1997), we estimate magnitudes around 7.0 for all four events, values that are significantly lower than previously published magnitude estimates based on macroseismic intensities. We further show that the strain rate predicted from postglacial rebound is sufficient to produce a sequence with the moment release of one Mmax 6.8 event every 500 years, a rate that is much lower than previous estimates of late Holocene moment release. However, Mw 6.8 is at the low end of the uncertainty range inferred from analysis of intensities for the largest 1811–1812 event. We show that Mw 6.8 is also a reasonable value for the largest main shock given a plausible rupture scenario. One can also construct a range of consistent models that permit a somewhat higher Mmax, with a longer average recurrence rate. It is thus possible to reconcile predicted strain and seismic moment release rates with alternative models: one in which 1811–1812 sequences occur every 500 years, with the largest events being Mmax ∼6.8, or one in which sequences occur, on average, less frequently, with Mmax of ∼7.0. Both models predict that the late Holocene rate of activity will continue for the next few thousand to ten thousand years.
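    The trade-off between Mmax and recurrence interval in the two alternative models follows directly from seismic-moment bookkeeping. A short sketch using a common form of the Hanks-Kanamori moment-magnitude relation (the 500-year interval is from the abstract; the computed equivalence is illustrative):

```python
# Hanks-Kanamori relation, M0 in newton-meters: M0 = 10**(1.5*Mw + 9.05).
def seismic_moment(mw):
    return 10 ** (1.5 * mw + 9.05)   # N*m

# Model A: one Mw 6.8 sequence every 500 years.
rate_a = seismic_moment(6.8) / 500.0    # moment-release rate, N*m per year

# Recurrence interval at which Mw 7.0 sequences release moment at the same
# rate; the constant cancels, so only the 0.2-unit magnitude gap matters.
t_b = seismic_moment(7.0) / rate_a
print(f"Equivalent Mw 7.0 recurrence: {t_b:.0f} years")
```

    A 0.2-unit magnitude increase doubles the moment (10**0.3 ≈ 2), so Mw 7.0 events roughly every 1000 years balance the same strain budget as Mw 6.8 events every 500 years, consistent with the abstract's two models.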

  4. Consequence analysis in LPG installation using an integrated computer package.

    PubMed

    Ditali, S; Colombi, M; Moreschini, G; Senni, S

    2000-01-07

    This paper presents the prototype of the computer code Atlantide, developed to assess the consequences associated with accidental events that can occur in an LPG storage plant. Atlantide is designed to be simple yet adequate for consequence analysis as required by the Italian legislation implementing the Seveso Directive. The application of Atlantide is appropriate for LPG storage/transfer installations. The models and correlations implemented in the code are relevant to flashing liquid releases, heavy gas dispersion, and other typical phenomena such as BLEVE/fireball. The computer code allows, on the basis of the operating/design characteristics, the study of the relevant accidental events from the evaluation of the release rate (liquid, gaseous, and two-phase) in the unit involved, to the analysis of the subsequent evaporation and dispersion, up to the assessment of the final phenomena of fire and explosion. This is done with reference to simplified event trees which describe the evolution of accidental scenarios, taking into account the most likely meteorological conditions, the different release situations, and other features typical of an LPG installation. The limited input data required and the automatic linking between the single models, which are activated in a defined sequence depending on the accidental event selected, minimize both the time required for the risk analysis and the possibility of errors. Models and equations implemented in Atlantide have been selected from the public literature or in-house developed software and tailored with the aim of being easy to use and fast to run but, nevertheless, able to provide realistic simulation of the accidental event as well as reliable results in terms of physical effects and hazardous areas. 
The results have been compared with those of other internationally recognized codes and with the criteria adopted by the Italian authorities to verify the safety reports for LPG installations. A summary of the theoretical basis of each model implemented in Atlantide and an example of application are included in the paper.
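    As a flavor of the release-rate step such consequence codes implement, the following sketch applies the standard Bernoulli orifice equation for a liquid release; the hole size, pressure difference, density, and discharge coefficient below are assumed for illustration and are not taken from Atlantide:

```python
import math

# Liquid discharge through an orifice: m_dot = Cd * A * sqrt(2 * rho * dP).
C_d = 0.61                      # discharge coefficient, sharp-edged orifice
d = 0.025                       # hole diameter, m (assumed)
A = math.pi * d ** 2 / 4.0      # hole area, m^2
rho = 500.0                     # liquid LPG density, kg/m^3 (approximate)
dP = 6.0e5                      # pressure difference across the hole, Pa

m_dot = C_d * A * math.sqrt(2.0 * rho * dP)   # mass flow, kg/s
print(f"Liquid release rate: {m_dot:.1f} kg/s")
```

    A full code like Atlantide chains such a source term into flashing, dispersion, and fire/explosion models via its event trees; this is only the first link of that chain.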

  5. Atmospheric modeling of Mars CH4 subsurface clathrates releases mimicking SAM and 2003 Earth-based detections

    NASA Astrophysics Data System (ADS)

    Pla-Garcia, Jorge

    2017-10-01

    The aim of this work is to establish the amount of mixing during all martian seasons to test whether CH4 releases inside or outside of Gale crater are consistent with MSL-SAM observations. Several modeling scenarios were configured, including instantaneous and steady releases, both inside and outside the crater. A simulation to mimic the 2003 Earth-based detections (Mumma et al. 2009, or M09) was also performed. In the instantaneous release inside Gale experiments, Ls270 was shown to be the fastest-mixing season, when air within and outside the crater was well mixed: all tracer mass inside the crater is diluted after just 8 hours. The mixing of near-surface crater air with the external environment in the rest of the year is potentially rapid but slower than at Ls270. In the instantaneous release outside Gale (NW) experiment, in just 12 hours the CH4 that makes it to the MSL landing location is diluted by six orders of magnitude. The timescale of mixing in the model is on the order of 1 sol regardless of season, whereas the duration of the CH4 peak observed by SAM is 100 sols. Therefore, either there is a steady release inside the crater or there is a large-magnitude steady release outside the crater. In the steady release Gale experiments, the CH4 flux rate from the ground is prescribed at 1.8 kg m-2 s-1 (Gloesener et al. 2017) rather than predicted. In these experiments, CH4 values ~200 times lower than those detected by SAM are modeled around the MSL location. There are CH4 concentration variations of orders of magnitude depending on the hour, so the timing of SAM measurements is important. With a larger (but further away) outside-crater release area compared to inside, similar CH4 values around MSL are modeled, so distance to source is important. In the steady experiments mimicking the M09 detection release area, CH4 values only 12 times lower than those detected by SAM are modeled around MSL. The highest value in the M09 modeled scenario (0.6 ppbv) is reached at Ls270; this is the highest value of all modeled experiments. 
With our initial conditions, SAM should not be able to detect CH4, but if we multiply the flux by 12, increase the release area, or move it closer to MSL (or all of the above), it may be possible to get CH4 values that SAM could detect regardless of where the CH4 comes from.

  6. Implications of matrix diffusion on 1,4-dioxane persistence at contaminated groundwater sites.

    PubMed

    Adamson, David T; de Blanc, Phillip C; Farhat, Shahla K; Newell, Charles J

    2016-08-15

    Management of groundwater sites impacted by 1,4-dioxane can be challenging due to its migration potential and perceived recalcitrance. This study examined the extent to which 1,4-dioxane's persistence was subject to diffusion of mass into and out of lower-permeability zones relative to co-released chlorinated solvents. Two different release scenarios were evaluated within a two-layer aquifer system using an analytical modeling approach. The first scenario simulated a 1,4-dioxane and 1,1,1-TCA source zone where spent solvent was released. The period when 1,4-dioxane was actively loading the low-permeability layer within the source zone was estimated to be <3 years due to its high effective solubility. While this was approximately an order of magnitude shorter than the loading period for 1,1,1-TCA, the mass of 1,4-dioxane stored within the low-permeability zone at the end of the simulation period (26 kg) was larger than that predicted for 1,1,1-TCA (17 kg). Even 80 years after release, the aqueous 1,4-dioxane concentration was still several orders of magnitude higher than potentially applicable criteria. Within the downgradient plume, diffusion contributed to higher concentrations and enhanced penetration of 1,4-dioxane into the low-permeability zones relative to 1,1,1-TCA. In the second scenario, elevated 1,4-dioxane concentrations were predicted at a site impacted by migration of a weak source from an upgradient site. Plume cutoff was beneficial because it could be implemented in time to prevent further loading of the low-permeability zone at the downgradient site. Overall, this study documented that 1,4-dioxane within transmissive portions of the source zone is quickly depleted due to characteristics that favor both diffusion-based storage and groundwater transport, leaving little mass to treat using conventional means. 
Furthermore, the results highlight the differences between 1,4-dioxane and chlorinated solvent source zones, suggesting that back diffusion of 1,4-dioxane mass may be serving as the dominant long-term "secondary source" at many contaminated sites that must be managed using alternative approaches.
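    The square-root-of-time behavior underlying matrix diffusion can be illustrated with the classic result (Crank) for cumulative diffusive uptake by a semi-infinite layer held at a constant boundary concentration, M(t) = 2·C0·sqrt(D·t/π). The parameter values below are assumed for illustration and are not taken from the study:

```python
import math

def mass_loaded(c0, d_eff, t_seconds):
    # Cumulative mass per unit interface area diffused into a semi-infinite
    # low-permeability layer at constant boundary concentration c0.
    return 2.0 * c0 * math.sqrt(d_eff * t_seconds / math.pi)  # g/m^2

c0 = 100.0          # boundary concentration, g/m^3 (assumed)
d_eff = 1e-10       # effective diffusion coefficient, m^2/s (assumed)
year = 365.25 * 24 * 3600.0

m3 = mass_loaded(c0, d_eff, 3 * year)     # short loading period
m30 = mass_loaded(c0, d_eff, 30 * year)   # ten times longer contact

# Ten times the contact period stores only sqrt(10) ~ 3.2 times the mass:
# a short, high-solubility loading period can still leave a long-lived
# stored inventory that later back-diffuses as a "secondary source".
print(f"3-yr loading: {m3:.1f} g/m^2, 30-yr loading: {m30:.1f} g/m^2")
```
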

  7. Data-driven analysis of the effectiveness of evaporative emissions control systems of passenger cars in real world use condition: Time and spatial mapping

    NASA Astrophysics Data System (ADS)

    De Gennaro, Michele; Paffumi, Elena; Martini, Giorgio

    2016-03-01

    This paper assesses the effectiveness of the evaporative emissions control systems of European passenger cars on the basis of real-world activity data. The study relies on two large datasets of driving patterns from conventional fuel vehicles collected by means of on-board GPS systems, consisting of 4.5 million trips and parking events recorded by monitoring 28,000 vehicles over one month. Real world evaporative emissions are estimated using a model that associates a carbon canister desorption event to each trip and a fuel vapour generation event to each parking. The mass of volatile organic compounds released into the air is calculated taking into account the hot-soak, permeation and breathing emission mechanisms. The analysis is based on 36 scenarios, defined by varying the climate conditions, the fuel vapour pressure, the tank material, the tank headspace volume, the purging volume flow rate and the mass of the activated carbon contained in the canister. The results show that in May 4 out of the 18 scenarios considered for Modena and 6 out of the 18 scenarios considered for Firenze lead to evaporative emissions values above the current type approval limit (i.e. 2 [g/day] per vehicle). In July, these numbers increase to 10 out of the 18 scenarios for Modena and to 12 out of the 18 scenarios for Firenze. Looking at the fleet distribution a share of approximately 20% of the fleet is characterised by evaporative emissions higher than the limit in May, increasing to 48% in July, with a peak value of 98%. The emission peak value is estimated to be approximately 4 [g/day] in May and 8 [g/day] in July, while the time-dependent results show emission rates up to nearly 15 [g/s] in Modena and 30 [g/s] in Firenze, with a respective cumulative value in July up to 0.4 and 0.8 tons of VOCs per day. The space-dependent results show a value of the emissions in July of approximately 4-to-8 [kg/km2/day] in the city areas. 
These results confirm previous findings from the authors, highlighting that the evaporative emissions control systems currently used in passenger cars might not be effective under real-world use conditions, and calling for a revision of the type-approval test procedure.

  8. DIY Solar Market Analysis Webinar Series: Community Solar Scenario Tool |

    Science.gov Websites

    As part of the DIY Solar Market Analysis webinar series for state, local, and tribal governments, NREL presented a live webinar on Wednesday, August 13, 2014, titled "Community Solar Scenario Tool: Planning for a fruitful solar garden."

  9. Game theory of pre-emptive vaccination before bioterrorism or accidental release of smallpox.

    PubMed

    Molina, Chai; Earn, David J D

    2015-06-06

    Smallpox was eradicated in the 1970s, but new outbreaks could be seeded by bioterrorism or accidental release. Substantial vaccine-induced morbidity and mortality make pre-emptive mass vaccination controversial, and if vaccination is voluntary, then there is a conflict between self- and group interests. This conflict can be framed as a tragedy of the commons, in which herd immunity plays the role of the commons, and free-riding (i.e. not vaccinating pre-emptively) is analogous to exploiting the commons. This game has been analysed previously for a particular post-outbreak vaccination scenario. We consider several post-outbreak vaccination scenarios and compare the expected increase in mortality that results from voluntary versus imposed vaccination. Below a threshold level of post-outbreak vaccination effort, expected mortality is independent of the level of response effort. A lag between an outbreak starting and a response being initiated increases the post-outbreak vaccination effort necessary to reduce mortality. For some post-outbreak vaccination scenarios, even modest response lags make it impractical to reduce mortality by increasing post-outbreak vaccination effort. In such situations, if decreasing the response lag is impossible, the only practical way to reduce mortality is to make the vaccine safer (greater post-outbreak vaccination effort leads only to fewer people vaccinating pre-emptively).

  10. Radiological impact of 2016 operations at the Savannah River Site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Minter, K. L.; Jannik, G. T.; Dixon, K. M.

    This report presents the environmental dose assessment methods and the estimated potential doses to the offsite public from 2016 Savannah River Site (SRS) air and liquid radioactive releases. Also documented are potential doses from special-case exposure scenarios, such as the consumption of wildlife or goat milk.

  11. STEADY STATE FLAMMABLE GAS RELEASE RATE CALCULATION AND LOWER FLAMMABILITY LEVEL EVALUATION FOR HANFORD TANK WASTE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    HU TA

    2009-10-26

    This report assesses the steady-state flammability level under normal and off-normal ventilation conditions. The hydrogen generation rate was calculated for 177 tanks using the rate equation model, and flammability calculations based on hydrogen, ammonia, and methane were performed for the 177 tanks under various scenarios.
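    A mixture flammability estimate of the kind described is commonly done with Le Chatelier's mixing rule, 100 / LFL_mix = Σ (y_i / LFL_i), where y_i are the volume percentages of each fuel in the fuel portion of the gas. The sketch below uses standard handbook lower flammability limits for the three gases; the fuel split is assumed for illustration and is not Hanford tank data:

```python
# Pure-gas lower flammability limits in air, vol% (handbook values).
lfl = {"H2": 4.0, "NH3": 15.0, "CH4": 5.0}
# Assumed composition of the fuel portion of the gas mixture, vol%.
fuel_fraction = {"H2": 70.0, "NH3": 20.0, "CH4": 10.0}

# Le Chatelier's rule for the mixture LFL.
lfl_mix = 100.0 / sum(fuel_fraction[g] / lfl[g] for g in lfl)
print(f"Mixture LFL: {lfl_mix:.2f} vol%")
```

    Because hydrogen dominates this assumed mix, the mixture LFL sits close to hydrogen's 4 vol%; a flammability assessment then compares the steady-state headspace concentration against a fraction (e.g., 25%) of this limit.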

  12. Flexibility on storage-release based distributed hydrologic modeling with object-oriented approach

    USDA-ARS?s Scientific Manuscript database

    With the availability of advanced hydrologic data in the public domain such as remotely sensed and climate change scenario data, there is a need for a modeling framework that is capable of using these data to simulate and extend hydrologic processes with multidisciplinary approaches for sustainable ...

  13. A SCREENING MODEL FOR SIMULATING DNAPL FLOW AND TRANSPORT IN POROUS MEDIA: THEORETICAL DEVELOPMENT

    EPA Science Inventory

    There exists a need for a simple tool that will allow us to analyze a DNAPL contamination scenario from free-product release to transport of soluble constituents to downgradient receptor wells. The objective of this manuscript is to present the conceptual model and formulate the ...

  14. Monte Carlo simulation of single accident airport risk profile

    NASA Technical Reports Server (NTRS)

    1979-01-01

    A computer simulation model was developed for estimating the potential economic impacts of a carbon fiber release upon facilities within an 80 kilometer radius of a major airport. The model simulated the possible range of release conditions and the resulting dispersion of the carbon fibers. Each iteration of the model generated a specific release scenario, which would cause a specific amount of dollar loss to the surrounding community. By repeated iterations, a risk profile was generated, showing the probability distribution of losses from one accident. Using accident probability estimates, the risk profile for annual losses was derived. The mechanics of the simulation model, the required input data, and the risk profiles generated for the 26 large hub airports are described.
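    The iterate-and-aggregate structure of such a Monte Carlo risk profile can be sketched in a few lines; the lognormal loss model and its parameters below are invented placeholders, not the report's dispersion or loss model:

```python
import random

random.seed(42)  # fixed seed so the sketch is reproducible

def simulate_loss():
    # Placeholder for one iteration: draw a release scenario and convert it
    # to a community dollar loss (here, an assumed lognormal distribution).
    return random.lognormvariate(11.0, 1.5)

# Many iterations yield the loss distribution for a single accident.
losses = sorted(simulate_loss() for _ in range(10_000))

def prob_exceeding(threshold):
    # Exceedance probability: fraction of simulated accidents whose loss
    # exceeds the threshold -- one point on the risk profile curve.
    return sum(1 for x in losses if x > threshold) / len(losses)

for t in (1e4, 1e5, 1e6):
    print(f"P(loss > ${t:,.0f}) = {prob_exceeding(t):.3f}")
```

    Scaling each exceedance probability by an annual accident-probability estimate converts this single-accident profile into the annual-loss risk profile the report describes.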

  15. A White Paper on Global Wheat Health Based on Scenario Development and Analysis.

    PubMed

    Savary, S; Djurle, A; Yuen, J; Ficke, A; Rossi, V; Esker, P D; Fernandes, J M C; Del Ponte, E M; Kumar, J; Madden, L V; Paul, P; McRoberts, N; Singh, P K; Huber, L; Pope de Vallavielle, C; Saint-Jean, S; Willocquet, L

    2017-10-01

    Scenario analysis constitutes a useful approach to synthesize knowledge and derive hypotheses in the case of complex systems that are documented with mainly qualitative or very diverse information. In this article, a framework for scenario analysis is designed and then applied to global wheat health within a timeframe from today to 2050. Scenario analysis entails the choice of settings, the definition of scenarios of change, and the analysis of outcomes of these scenarios in the chosen settings. Three idealized agrosystems, representing a large fraction of the global diversity of wheat-based agrosystems, are considered, which represent the settings of the analysis. Several components of global change are considered in their consequences on global wheat health: climate change and climate variability, nitrogen fertilizer use, tillage, crop rotation, pesticide use, and the deployment of host plant resistances. Each idealized agrosystem is associated with a scenario of change that considers, first, a production situation and its dynamics, and second, the impacts of the evolving production situation on the evolution of crop health. Crop health is represented by six functional groups of wheat pathogens: the pathogens associated with Fusarium head blight, biotrophic fungi, Septoria-like fungi, necrotrophic fungi, soilborne pathogens, and insect-transmitted viruses. The analysis of scenario outcomes is conducted along a risk-analytical pattern, which involves risk probabilities represented by categorized probability levels of disease epidemics, and risk magnitudes represented by categorized levels of crop losses resulting from these levels of epidemics within each production situation. The results from this scenario analysis suggest an overall increase of risk probabilities and magnitudes in the three idealized agrosystems. Changes in risk probability or magnitude, however, vary with the agrosystem and the functional groups of pathogens. 
We discuss the effects of global changes on the six functional groups, in terms of their epidemiology and of the crop losses they cause. Scenario analysis enables qualitative analysis of complex systems, such as plant pathosystems that are evolving in response to global changes, including climate change and technology shifts. It also provides a useful framework for quantitative simulation modeling analysis for plant disease epidemiology.

  16. Characterization of U/Pu Particles Originating From the Nuclear Weapon Accidents at Palomares, Spain, 1966 And Thule, Greenland, 1968

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lind, O.C.; Salbu, B.; Janssens, K.

    2007-07-10

    Following the USAF B-52 bomber accidents at Palomares, Spain in 1966 and at Thule, Greenland in 1968, radioactive particles containing uranium (U) and plutonium (Pu) were dispersed into the environment. To improve long-term environmental impact assessments for the contaminated ecosystems, particles from the two sites have been isolated and characterized with respect to properties influencing particle weathering rates. Low 239Pu/235U (0.62-0.78) and 240Pu/239Pu (0.055-0.061) atom ratios in individual particles from both sites obtained by Inductively Coupled Plasma Mass Spectrometry (ICP-MS) show that the particles contain highly enriched U and weapon-grade Pu. Furthermore, results from electron microscopy with Energy Dispersive X-ray analysis (EDX) and synchrotron radiation (SR) based micrometer-scale X-ray fluorescence (µ-XRF) 2D mapping demonstrated that U and Pu coexist throughout the 1-50 µm sized particles, while surface heterogeneities were observed in EDX line scans. SR-based micrometer-scale X-ray Absorption Near Edge Structure spectroscopy (µ-XANES) showed that the particles consisted of an oxide mixture of U (predominantly UO2, with the presence of U3O8) and Pu ((III)/(IV), (IV)/(V), or (III), (IV) and (V)). Neither metallic U or Pu nor uranyl or Pu(VI) could be observed. Characteristics such as elemental distributions, morphology and oxidation states are remarkably similar for the Palomares and Thule particles, reflecting that they originate from similar source and release scenarios. Thus, these particle characteristics are more dependent on the original material from which the particles are derived (source) and the formation of particles (release scenario) than the environmental conditions to which the particles have been exposed since the late 1960s.

  17. Where are Low Mass X-ray Binaries Formed?

    NASA Astrophysics Data System (ADS)

    Kundu, A.; Maccarone, T. J.; Zepf, S. E.

    2004-08-01

    Chandra images of nearby galaxies reveal large numbers of low mass X-ray binaries (LMXBs). As in the Galaxy, a significant fraction of these are associated with globular clusters. We exploit the LMXB-globular cluster link in order to probe both the physical properties of globular clusters that promote the formation of LMXBs within clusters with specific characteristics, and to study whether the non-cluster field LMXB population was originally formed in clusters and then released into the field. The large population of globular clusters around nearby galaxies and the range of properties such as age, metallicity and host galaxy environment spanned by these objects enable us to identify and probe the link between these characteristics and the formation of LMXBs. We present the results of our study of a large sample of elliptical and S0 galaxies, which reveals, among other things, that bright LMXBs definitively prefer metal-rich cluster hosts and that this relationship is unlikely to be driven by age effects. The ancestry of the non-cluster field LMXBs is a matter of some debate, with suggestions that they might have formed in the field, or been created in globular clusters and subsequently released into the field, either by being ejected from clusters by dynamical processes or as remnants of dynamically destroyed clusters. Each of these scenarios has a specific spatial signature that can be tested by our combined optical and X-ray study. Furthermore, these scenarios predict additional statistical variations that may be driven by the specific host galaxy environment. We present a detailed analysis of our sample galaxies and comment on the probability that the field sources were actually formed in clusters.

  18. Understanding Anglers' Preferences for Fishing Tournament Characteristics and Policies

    NASA Astrophysics Data System (ADS)

    Oh, Chi-Ok; Ditton, Robert B.; Riechers, Robin

    2007-07-01

    Saltwater fishing tournaments in the United States are generally not regulated nor are there different fishing regulations for tournament and nontournament anglers. Although much is known about those who participate in fishing tournaments in terms of their fishing motivations, attitudes, and characteristics, much less is known at the angler population level regarding their preferences for tournament opportunities. Using a stated preference choice model with hypothetical scenarios to simulate participation choices and understand preferences, study objectives were to identify angler preferences for various types of tournament fishing “products.” Four tournament policy characteristics were investigated: promotion of catch and release, bait restrictions, whether a percentage of the tournament entrance fee should go to support fishery management activities, and whether a tournament should be a nonprofit or profit-making venture. Three expectation attributes were inserted: tournament size, trip cost per day, and whether a tournament is family friendly. We sent seven different versions of the mail questionnaire to 1,633 anglers. Of 795 returns, 648 were used for estimating conditional logit models. Analysis indicated that a scenario with no management interventions was not most preferred. Anglers most preferred a conservation-oriented option that introduced additional management measures. Overall, scenarios with management interventions were more favored than the status quo situation (with no management interventions). Although respondents showed reluctance to adopt other management-related options, results generally indicated they were increasingly concerned with sustainability of fish stocks and potential conflicts between tournament and nontournament users and preferred tournament products that reflect these concerns.

  19. [Scenario analysis on sustainable development of Sino-Singapore Tianjin Eco-city based on emergy and system dynamics].

    PubMed

    Li, Chun-fa; Cao, Ying-ying; Yang, Jian-cho; Yang, Qi-qi

    2015-08-01

    Dynamic evaluation of sustainable development is one of the key fundamental parts of the success of Sino-Singapore Tianjin Eco-city, which is the first eco-city in China constructed by international cooperation. Based on the analysis of nature and economy, function and structure, planning control indices and so on, we constructed a sustainable development evaluation index system and a system dynamics model of Sino-Singapore Tianjin Eco-city to explore dynamic trends of its population, material and currency by comprehensive utilization of emergy analysis and system dynamics methods. Five scenarios were set up and simulated: an inertial scenario, a scientific and technological scenario, an economic scenario, an environmental scenario and a harmonious development scenario. Then, the sustainability of the 5 scenarios was evaluated and compared. The results showed that in the economy and environment sustainable development scenario, there was a steady growth trend of GDP, accumulation of both emergy and currency, and relatively lower values in emergy waste ratio, emergy ratio of waste, and emergy loading ratio. Although sustainability evaluation indices such as ESI and UEI were relatively low, the economy and environment sustainable development scenario was still the best development scenario, which was more active than the others.
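    The system dynamics component of such a model integrates stocks (population, emergy, currency) forward in time under scenario-specific rates. A minimal single-stock sketch, with purely illustrative growth rates for two of the scenarios:

```python
def simulate_stock(initial, inflow_rate, outflow_rate, steps):
    """Euler-integrate one system-dynamics stock: dS/dt = (in - out) * S."""
    s = initial
    trajectory = [s]
    for _ in range(steps):
        s += s * (inflow_rate - outflow_rate)
        trajectory.append(s)
    return trajectory

# Hypothetical GDP-like stock under two scenarios (rates are assumptions):
economic = simulate_stock(100.0, 0.08, 0.02, 10)  # economic scenario
inertial = simulate_stock(100.0, 0.05, 0.02, 10)  # inertial scenario
```

    A full model would couple several such stocks (population, emergy, waste) through shared rates, which is where the scenario tradeoffs emerge.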

  20. Comparison of compression efficiency between HEVC/H.265 and VP9 based on subjective assessments

    NASA Astrophysics Data System (ADS)

    Řeřábek, Martin; Ebrahimi, Touradj

    2014-09-01

    The current increasing effort of broadcast providers to transmit UHD (Ultra High Definition) content is likely to increase demand for ultra high definition televisions (UHDTVs). To compress UHDTV content, several alternative encoding mechanisms exist. In addition to internationally recognized standards, openly available options, such as the VP9 video encoding scheme, have recently appeared and are gaining popularity. One of the main goals of these encoders is to efficiently compress video sequences beyond HDTV resolution for various scenarios, such as broadcasting or internet streaming. In this paper, a broadcast-scenario rate-distortion performance analysis and mutual comparison of one of the latest video coding standards, H.265/HEVC, with the recently released VP9 video coding scheme is presented. H.264/AVC, currently one of the most popular and widely deployed encoders, was also included in the evaluation to serve as a comparison baseline. The comparison is performed by means of subjective evaluations showing actual differences between encoding algorithms in terms of perceived quality. The results indicate a general dominance of the HEVC-based encoding algorithm in comparison to the other alternatives, with VP9 and AVC showing similar performance.

  1. Context Sensing System Analysis for Privacy Preservation Based on Game Theory.

    PubMed

    Wang, Shengling; Li, Luyun; Sun, Weiman; Guo, Junqi; Bie, Rongfang; Lin, Kai

    2017-02-10

    In a context sensing system in which a sensor-equipped mobile phone runs an unreliable context-aware application, the application can infer the user's contexts, based on which it provides personalized services. However, the application may sell the user's contexts to some malicious adversaries to earn extra profits, which will hinder its widespread use. In the real world, the actions of the user, the application and the adversary in the context sensing system affect each other, so their payoffs are mutually constrained. To figure out under which conditions they behave well (the user releases, the application does not leak and the adversary does not retrieve the context), we take advantage of game theory to analyze the context sensing system. We use the extensive form game and the repeated game, respectively, to analyze two typical scenarios, single interaction and multiple interactions among the three players, from which Nash equilibria and cooperation conditions are obtained. Our results show that the reputation mechanism for the context-sensing system in the former scenario is crucial to privacy preservation, as is the extent to which the participants are concerned about future payoffs in the latter one.
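    The repeated-game cooperation condition described here can be illustrated with the standard grim-trigger result: cooperation (e.g. the application not leaking contexts) is sustainable only if players value future payoffs strongly enough. A sketch with hypothetical payoffs, not the paper's actual payoff matrix:

```python
def cooperation_sustainable(R, T, P, delta):
    """Grim-trigger condition in an infinitely repeated game.

    Cooperating forever (payoff R each round) beats a one-shot deviation
    (T once, then punishment P forever) iff
        R / (1 - delta) >= T + delta * P / (1 - delta),
    which rearranges to delta >= (T - R) / (T - P).
    """
    return delta >= (T - R) / (T - P)

# Hypothetical payoffs: R = mutual cooperation (app keeps contexts private),
# T = temptation (app sells contexts once), P = punishment (user stops sharing).
R, T, P = 3.0, 5.0, 1.0
threshold = (T - R) / (T - P)  # minimum discount factor sustaining cooperation
```

    With these illustrative numbers, cooperation holds only when the discount factor is at least 0.5, matching the paper's qualitative finding that concern about future payoffs drives good behavior.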

  2. Using transportation accident databases to investigate ignition and explosion probabilities of flammable spills.

    PubMed

    Ronza, A; Vílchez, J A; Casal, J

    2007-07-19

    Risk assessment of hazardous material spill scenarios, and quantitative risk assessment in particular, makes use of event trees to account for the possible outcomes of hazardous releases. Using event trees entails defining probabilities of occurrence for events such as spill ignition and blast formation. This study comprises an extensive analysis of ignition and explosion probability data proposed in previous work. Subsequently, the results of a survey of two vast US federal spill databases (HMIRS, by the Department of Transportation, and MINMOD, by the US Coast Guard) are reported and commented on. Some tens of thousands of records of hydrocarbon spills were analysed. The general pattern of statistical ignition and explosion probabilities as a function of the amount and the substance spilled is discussed. Equations based on the statistical data are proposed that predict the ignition probability of hydrocarbon spills as a function of the amount and the substance spilled. Explosion probabilities are put forth as well. Two sets of probability data are proposed: it is suggested that figures deduced from HMIRS be used in land transportation risk assessment, and MINMOD results in maritime scenario assessment. Results are discussed and compared with the previous technical literature.
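    An event tree of the kind described turns conditional branch probabilities into unconditional outcome probabilities by multiplying along each branch. A minimal sketch (the probability values are illustrative placeholders, not the HMIRS/MINMOD figures):

```python
def event_tree_outcomes(p_ignition, p_explosion_given_ignition):
    """Unconditional outcome probabilities for a flammable-spill event tree."""
    p_explosion = p_ignition * p_explosion_given_ignition
    p_fire_only = p_ignition * (1.0 - p_explosion_given_ignition)
    p_no_ignition = 1.0 - p_ignition
    return {"explosion": p_explosion,
            "fire_only": p_fire_only,
            "no_ignition": p_no_ignition}

# Hypothetical values for a large hydrocarbon spill (assumptions, not data):
outcomes = event_tree_outcomes(p_ignition=0.20,
                               p_explosion_given_ignition=0.25)
```

    The three outcome probabilities partition the event space and sum to one; in the study, the branch probabilities themselves are regressed against spill size and substance.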

  3. Developing Health-Based Pre-Planning Clearance Goals for Airport Remediation Following Chemical Terrorist Attack: Introduction and Key Assessment Considerations

    PubMed Central

    Watson, Annetta; Hall, Linda; Raber, Ellen; Hauschild, Veronique D.; Dolislager, Fredrick; Love, Adam H.; Hanna, M. Leslie

    2011-01-01

    In the event of a chemical terrorist attack on a transportation hub, post-event remediation and restoration activities necessary to attain unrestricted facility reuse and re-entry could require hours to multiple days. While restoration timeframes are dependent on numerous variables, a primary controlling factor is the level of pre-planning and decision-making completed prior to chemical terrorist release. What follows is the first of a two-part analysis identifying key considerations, critical information, and decision criteria to facilitate post-attack and post-decontamination consequence management activities. A conceptual site model and human health-based exposure guidelines are developed and reported as an aid to site-specific pre-planning in the current absence of U.S. state or Federal values designated as compound-specific remediation or re-entry concentrations, and to safely expedite facility recovery to full operational status. Chemicals of concern include chemical warfare nerve and vesicant agents and the toxic industrial compounds phosgene, hydrogen cyanide, and cyanogen chloride. This work has been performed as a national case study conducted in partnership with the Los Angeles International Airport and The Bradley International Terminal. All recommended guidelines have been selected for consistency with airport scenario release parameters of a one-time, short-duration, finite airborne release from a single source followed by compound-specific decontamination. PMID:21390292

  4. Developing Health-Based Pre-Planning Clearance Goals for Airport Remediation Following Chemical Terrorist Attack: Introduction and Key Assessment Considerations.

    PubMed

    Watson, Annetta; Hall, Linda; Raber, Ellen; Hauschild, Veronique D; Dolislager, Fredrick; Love, Adam H; Hanna, M Leslie

    2011-02-13

    In the event of a chemical terrorist attack on a transportation hub, post-event remediation and restoration activities necessary to attain unrestricted facility reuse and re-entry could require hours to multiple days. While restoration timeframes are dependent on numerous variables, a primary controlling factor is the level of pre-planning and decision-making completed prior to chemical terrorist release. What follows is the first of a two-part analysis identifying key considerations, critical information, and decision criteria to facilitate post-attack and post-decontamination consequence management activities. A conceptual site model and human health-based exposure guidelines are developed and reported as an aid to site-specific pre-planning in the current absence of U.S. state or Federal values designated as compound-specific remediation or re-entry concentrations, and to safely expedite facility recovery to full operational status. Chemicals of concern include chemical warfare nerve and vesicant agents and the toxic industrial compounds phosgene, hydrogen cyanide, and cyanogen chloride. This work has been performed as a national case study conducted in partnership with the Los Angeles International Airport and The Bradley International Terminal. All recommended guidelines have been selected for consistency with airport scenario release parameters of a one-time, short-duration, finite airborne release from a single source followed by compound-specific decontamination.

  5. Developing health-based pre-planning clearance goals for airport remediation following chemical terrorist attack: Introduction and key assessment considerations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Watson, Annetta Paule; Raber, Ellen; Dolislager, Frederick

    2011-01-01

    In the event of a chemical terrorist attack on a transportation hub, post-event remediation and restoration activities necessary to attain unrestricted facility re-use and re-entry could require hours to multiple days. While restoration timeframes are dependent on numerous variables, a primary controlling factor is the level of pre-planning and decision-making completed prior to chemical terrorist release. What follows is the first of a two-part analysis identifying key considerations, critical information, and decision criteria to facilitate post-attack and post-decontamination consequence management activities. A conceptual site model and human health-based exposure guidelines are developed and reported as an aid to site-specific pre-planning in the current absence of U.S. state or Federal values designated as compound-specific remediation or re-entry concentrations, and to safely expedite facility recovery to full operational status. Chemicals of concern include chemical warfare nerve and vesicant agents and the toxic industrial compounds phosgene, hydrogen cyanide, and cyanogen chloride. This work has been performed as a national case study conducted in partnership with the Los Angeles International Airport and The Bradley International Terminal. All recommended guidelines have been selected for consistency with airport scenario release parameters of a one-time, short-duration, finite airborne release from a single source followed by compound-specific decontamination.

  6. A Comparison of Probabilistic and Deterministic Campaign Analysis for Human Space Exploration

    NASA Technical Reports Server (NTRS)

    Merrill, R. Gabe; Andraschko, Mark; Stromgren, Chel; Cirillo, Bill; Earle, Kevin; Goodliff, Kandyce

    2008-01-01

    Human space exploration is by its very nature an uncertain endeavor. Vehicle reliability, technology development risk, budgetary uncertainty, and launch uncertainty all contribute to stochasticity in an exploration scenario. However, traditional strategic analysis has been done in a deterministic manner, analyzing and optimizing the performance of a series of planned missions. History has shown that exploration scenarios rarely follow such a planned schedule. This paper describes a methodology to integrate deterministic and probabilistic analysis of scenarios in support of human space exploration. Probabilistic strategic analysis is used to simulate "possible" scenario outcomes, based upon the likelihood of occurrence of certain events and a set of pre-determined contingency rules. The results of the probabilistic analysis are compared to the nominal results from the deterministic analysis to evaluate the robustness of the scenario to adverse events and to test and optimize contingency planning.
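    The probabilistic side of such an analysis can be sketched as a Monte Carlo simulation in which launches fail stochastically and a simple contingency rule delays the campaign. The failure probability and stand-down rule below are assumptions for illustration, not NASA figures:

```python
import random

def simulate_campaign(n_missions, p_success, trials, seed=0):
    """Monte Carlo campaign sketch: each launch attempt succeeds with
    probability p_success; a failure triggers a one-slot stand-down
    (a pre-determined contingency rule). Returns the mean campaign
    duration in mission slots across all trials."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        done, t = 0, 0
        while done < n_missions:
            t += 1
            if rng.random() < p_success:
                done += 1
            else:
                t += 1  # stand-down penalty after a failure
        total += t
    return total / trials

deterministic = 10  # the planned schedule assumes every launch succeeds
expected = simulate_campaign(n_missions=10, p_success=0.9, trials=2000)
```

    Comparing `expected` against the deterministic schedule quantifies how much slack contingency events add, which is the robustness question the paper poses.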

  7. Multi-hazard Assessment and Scenario Toolbox (MhAST): A Framework for Analyzing Compounding Effects of Multiple Hazards

    NASA Astrophysics Data System (ADS)

    Sadegh, M.; Moftakhari, H.; AghaKouchak, A.

    2017-12-01

    Many natural hazards are driven by multiple forcing variables, and concurrent or consecutive extreme events significantly increase the risk of infrastructure/system failure. It is common practice to use univariate analysis based upon a perceived ruling driver to estimate design quantiles and/or return periods of extreme events. A multivariate analysis, however, permits modeling the simultaneous occurrence of multiple forcing variables. In this presentation, we introduce the Multi-hazard Assessment and Scenario Toolbox (MhAST), which comprehensively analyzes marginal and joint probability distributions of natural hazards. MhAST also offers a wide range of return period and design level scenarios and their likelihoods. The contribution of this study is four-fold: 1. comprehensive analysis of the marginal and joint probability of multiple drivers through 17 continuous distributions and 26 copulas, 2. multiple scenario analysis of concurrent extremes based upon the most likely joint occurrence, one ruling variable, and weighted random sampling of joint occurrences with similar exceedance probabilities, 3. weighted average scenario analysis based on an expected event, and 4. uncertainty analysis of the most likely joint occurrence scenario using a Bayesian framework.
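    The copula-based joint analysis can be illustrated by estimating the probability that two hazard drivers exceed their marginal quantiles at the same time. The sketch below simulates a Gaussian copula with an assumed correlation; it is a stand-in for, not one of, the toolbox's 26 fitted copulas:

```python
import math
import random

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def joint_exceedance(rho, u, v, n=50000, seed=1):
    """Empirical P(U > u AND V > v) under a Gaussian copula with
    correlation rho, estimated by simulation."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        z1, z2 = rng.gauss(0, 1), rng.gauss(0, 1)
        x = z1
        y = rho * z1 + math.sqrt(1.0 - rho * rho) * z2  # correlated normal
        if norm_cdf(x) > u and norm_cdf(y) > v:
            hits += 1
    return hits / n

# Hypothetical 10-year marginal events (u = v = 0.9) for two hazard drivers:
p_and = joint_exceedance(rho=0.7, u=0.9, v=0.9)
return_period_and = 1.0 / p_and  # AND-scenario return period (years)
```

    Under independence the joint exceedance would be 0.01 (a 100-year AND event); positive dependence between the drivers makes compound events markedly more frequent, which is exactly why a univariate "ruling driver" analysis can understate risk.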

  8. An integrated decision model for the application of airborne sensors for improved response to accidental and terrorist chemical vapor releases

    NASA Astrophysics Data System (ADS)

    Kapitan, Loginn

    This research created a new model which provides an integrated approach to planning the effective selection and employment of airborne sensor systems in response to accidental or intentional chemical vapor releases. The approach taken was to use systems engineering and decision analysis methods to construct a model architecture which produced a modular structure for integrating both new and existing components into a logical procedure to assess the application of airborne sensor systems to address chemical vapor hazards. The resulting integrated process model includes an internal aggregation model which allowed differentiation among alternative airborne sensor systems. Both models were developed and validated by experts and demonstrated using appropriate hazardous chemical release scenarios. The resultant prototype integrated process model or system fills a current gap in capability allowing improved planning, training and exercise for HAZMAT teams and first responders when considering the selection and employment of airborne sensor systems. Through the research process, insights into the current response structure and how current airborne capability may be most effectively used were generated. Furthermore, the resultant prototype system is tailorable for local, state, and federal application, and can potentially be modified to help evaluate investments in new airborne sensor technology and systems. Better planning, training and preparedness exercising holds the prospect for the effective application of airborne assets for improved response to large scale chemical release incidents. Improved response will result in fewer casualties and lives lost, reduced economic impact, and increased protection of critical infrastructure when faced with accidental and intentional terrorist release of hazardous industrial chemicals. 
With the prospect of more airborne sensor systems becoming available, this prototype system integrates existing and new tools into an effective process for the selection and employment of airborne sensors to better plan, train and exercise ahead of potential chemical release events.

  9. Increased PSA expression on prostate cancer exosomes in in vitro condition and in cancer patients.

    PubMed

    Logozzi, Mariantonia; Angelini, Daniela F; Iessi, Elisabetta; Mizzoni, Davide; Di Raimo, Rossella; Federici, Cristina; Lugini, Luana; Borsellino, Giovanna; Gentilucci, Alessandro; Pierella, Federico; Marzio, Vittorio; Sciarra, Alessandro; Battistini, Luca; Fais, Stefano

    2017-09-10

    Prostate specific antigen (PSA) test is the most common, clinically validated test for the diagnosis of prostate cancer (PCa). While neoplastic lesions of the prostate may cause aberrant levels of PSA in the blood, the quantitation of free or complexed PSA poorly discriminates cancer patients from those developing benign lesions, often leading to invasive and unnecessary surgical procedures. Microenvironmental acidity increases exosome release by cancer cells. In this study we evaluated whether acidity, a critical phenotype of malignancy, could influence exosome release and increase the PSA expression in nanovesicles released by PCa cells. To this aim, we exploited Nanoparticle Tracking Analysis (NTA), an immunocapture-based ELISA, and nanoscale flow-cytometry. The results show that microenvironmental acidity induces an increased release of nanovesicles expressing both PSA and the exosome marker CD81. In order to verify whether the changes induced by the local selective pressure of extracellular acidity may correspond to a clinical pathway we used the same approach to evaluate the levels of PSA-expressing exosomes in the plasma of PCa patients and controls, including subjects with benign prostatic hypertrophy (BPH). The results show that only PCa patients have high levels of nanovesicles expressing both CD81 and PSA. This study shows that tumor acidity exerts a selective pressure leading to the release of extracellular vesicles that express both PSA and exosome markers. A comparable scenario was shown in the plasma of prostate cancer patients as compared to both BPH and healthy controls. These results suggest that microenvironmental acidity may represent a key factor which determines qualitatively and quantitatively the release of extracellular vesicles by malignant tumors, including prostate cancer. 
This condition leads to the spill-over of nanovesicles into the peripheral blood of prostate cancer patients, where the levels of tumor biomarkers expressed by exosomes, such as PSA-exosomes, may represent a novel, non-invasive clinical tool for the screening and early diagnosis of prostate cancer. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. Assessing the joint impact of DNAPL source-zone behavior and degradation products on the probabilistic characterization of human health risk

    NASA Astrophysics Data System (ADS)

    Henri, Christopher V.; Fernàndez-Garcia, Daniel; de Barros, Felipe P. J.

    2016-02-01

    The release of industrial contaminants into the subsurface has led to a rapid degradation of groundwater resources. Contamination caused by Dense Non-Aqueous Phase Liquids (DNAPLs) is particularly severe owing to their limited solubility, slow dissolution and in many cases high toxicity. A greater insight into how the DNAPL source zone behavior and the contaminant release towards the aquifer impact human health risk is crucial for an appropriate risk management. Risk analysis is further complicated by the uncertainty in aquifer properties and contaminant conditions. This study focuses on the impact of the DNAPL release mode on the human health risk propagation along the aquifer under uncertain conditions. Contaminant concentrations released from the source zone are described using a screening approach with a set of parameters representing several scenarios of DNAPL architecture. The uncertainty in the hydraulic properties is systematically accounted for by high-resolution Monte Carlo simulations. We simulate the release and the transport of the chlorinated solvent perchloroethylene and its carcinogenic degradation products in randomly heterogeneous porous media. The human health risk posed by the chemical mixture of these contaminants is characterized by the low-order statistics and the probability density function of common risk metrics. We show that the zone of high risk (hot spot) is independent of the DNAPL mass release mode, and that the risk amplitude is mostly controlled by heterogeneities and by the source zone architecture. The risk is lower and less uncertain when the source zone is formed mostly by ganglia than by pools. We also illustrate how the source zone efficiency (intensity of the water flux crossing the source zone) affects the risk posed by an exposure to the chemical mixture. 
    Results show that high source zone efficiencies are, counter-intuitively, beneficial: they decrease the risk by reducing the time available for the production of the highly toxic subspecies.
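    The probabilistic risk characterization described here can be sketched as Monte Carlo propagation of lognormal uncertainties into low-order statistics of a risk metric. The distributions and parameters below are illustrative stand-ins, not the study's calibrated values:

```python
import math
import random

def risk_statistics(trials=20000, seed=2):
    """Sketch of probabilistic risk characterization: risk is the product
    of an uncertain receptor concentration factor and an uncertain
    dose-response slope, each lognormally distributed (stand-ins for
    hydraulic-conductivity and source-zone uncertainty)."""
    rng = random.Random(seed)
    samples = []
    for _ in range(trials):
        conc = math.exp(rng.gauss(0.0, 1.0))    # concentration factor
        slope = math.exp(rng.gauss(-2.0, 0.5))  # dose-response slope
        samples.append(conc * slope)
    samples.sort()
    mean = sum(samples) / trials
    p95 = samples[int(0.95 * trials)]           # 95th-percentile risk
    return mean, p95

mean_risk, p95_risk = risk_statistics()
```

    Reporting both the mean and an upper percentile mirrors the paper's use of low-order statistics and the full probability density of the risk metric, since heavy-tailed uncertainty makes the mean alone uninformative.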

  11. Impact of leaching conditions on constituents release from Flue Gas Desulfurization Gypsum (FGDG) and FGDG-soil mixture.

    PubMed

    Koralegedara, N H; Al-Abed, S R; Arambewela, M K J; Dionysiou, D D

    2017-02-15

    The interest in using Flue Gas Desulfurization Gypsum (FGDG) for land applications has increased recently. This study evaluates the leaching characteristics of trace elements in "modern" FGDG (produced after fly ash removal) and FGDG-mixed soil (SF) under different environmental conditions using recently approved EPA leaching methods (1313-1316). These methods employ various pH and liquid-solid (LS) ratios under batch leaching, column percolation and diffusion-controlled release scenarios. The Toxicity Characteristic Leaching Procedure (TCLP) and Synthetic Precipitation Leaching Procedure (SPLP) were used for comparison. The data obtained from the new EPA methods provide broad insight into constituent release from FGDG and SF when compared to TCLP and SPLP. The release of toxic elements such as Hg, As, Pb, Co, Cd and Cr from SF was negligible. High release of B from FGDG was observed under all tested conditions; however, its release from SF was low. Both FGDG and SF released Se under all pH conditions (2-13) and LS ratios (1-10) in low concentrations (0.02-0.2 mg/L). The data from this study could be used to investigate potential use of "modern" FGDG for new beneficial land applications. Published by Elsevier B.V.

  12. FORMAL SCENARIO DEVELOPMENT FOR ENVIRONMENTAL IMPACT ASSESSMENT STUDIES

    EPA Science Inventory

    Scenario analysis is a process of evaluating possible future events through the consideration of alternative plausible (though not equally likely) outcomes (scenarios). The analysis is designed to enable improved decision-making and assessment through a more rigorous evaluation o...

  13. Atmospheric modeling of Mars CH4 subsurface clathrates releases mimicking SAM and 2003 Earth-based detections

    NASA Astrophysics Data System (ADS)

    Pla-García, J.; Rafkin, S. C.

    2017-12-01

    The aim of this work is to establish the amount of mixing during all martian seasons to test whether CH4 releases inside or outside of Gale crater are consistent with MSL-SAM observations. Several modeling scenarios were configured, including instantaneous and steady releases, both inside and outside the crater. A simulation to mimic the 2003 Earth-based detections (Mumma et al. 2009, or M09) was also performed. In the instantaneous release inside Gale experiments, Ls270 was shown to be the fastest-mixing season, when air within and outside the crater was well mixed: all tracer mass inside the crater is diluted after just 8 hours. The mixing of near-surface crater air with the external environment during the rest of the year is potentially rapid, but slower than at Ls270. In the instantaneous release outside Gale (NW) experiment, in just 12 hours the CH4 that makes it to the MSL landing location is diluted by six orders of magnitude. The timescale of mixing in the MRAMS experiments is on the order of 1 sol regardless of season, whereas the duration of the CH4 peak observed by SAM is 100 sols. Therefore, either there is a steady release inside the crater, or there is a very large magnitude steady release outside the crater. In the steady release Gale experiments, the CH4 flux rate from the ground is 1.8 kg m-2 s-1 (derived from Gloesener et al. 2017 clathrates fluxes) and it is not predictive. In these experiments, CH4 values 200 times lower than those detected by SAM are modeled around the MSL location. There are CH4 concentration variations of orders of magnitude depending on the hour, so the timing of SAM measurements is important. With a larger (but further away) outside-crater release area compared to inside, similar CH4 values around MSL are modeled, so distance to the source is important. In the steady experiments mimicking the M09 detection release area, CH4 values only 12 times lower than those detected by SAM are modeled around MSL. The highest value in the M09 modeled scenario (0.6 ppbv) is reached at Ls270.
    This value is the highest of all modeled experiments. With our initial conditions (flux rates, release area size and distance to MSL), SAM should not be able (or would find it very difficult) to detect CH4, but if we multiply the flux by 12, increase the release area or move it closer to MSL (or all of the above), it may be possible to get CH4 values that SAM could detect regardless of where it comes from: inside, outside (close to) or far away from Gale.

  14. Security Analysis of Smart Grid Cyber Physical Infrastructures Using Modeling and Game Theoretic Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abercrombie, Robert K; Sheldon, Frederick T.

    Cyber physical computing infrastructures typically consist of a number of interconnected sites. Their operation critically depends on both cyber components and physical components. Both types of components are subject to attacks of different kinds and frequencies, which must be accounted for in the initial provisioning and subsequent operation of the infrastructure via information security analysis. Information security analysis can be performed using game theory implemented in dynamic Agent Based Game Theoretic (ABGT) simulations. Such simulations can be verified with the results from game theory analysis and further used to explore larger scale, real world scenarios involving multiple attackers, defenders, and information assets. We concentrated our analysis on the electric sector failure scenarios and impact analyses produced by the NESCOR Working Group study. From the Section 5 electric sector representative failure scenarios, we extracted four generic failure scenarios and grouped them into three specific threat categories (confidentiality, integrity, and availability) to the system. These specific failure scenarios serve as a demonstration of our simulation. The analysis using our ABGT simulation demonstrates how to model the electric sector functional domain using a set of rationalized game theoretic rules decomposed from the failure scenarios in terms of how those scenarios might impact the cyber physical infrastructure network with respect to CIA.
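    A game-theoretic impact analysis over the three threat categories can be sketched as an expected-loss calculation for the defender. The attack likelihoods, loss magnitudes, and mitigation levels below are hypothetical illustrations, not NESCOR scenario values:

```python
def defender_expected_loss(attack_probs, mitigation):
    """Expected defender loss across the three CIA threat categories,
    given attack likelihoods and mitigation effectiveness (0 to 1)."""
    base_loss = {"confidentiality": 10.0, "integrity": 8.0, "availability": 6.0}
    return sum(attack_probs[c] * base_loss[c] * (1.0 - mitigation[c])
               for c in base_loss)

# Hypothetical scenario weights (assumptions for illustration):
attacks = {"confidentiality": 0.3, "integrity": 0.5, "availability": 0.2}
no_defense = defender_expected_loss(attacks, {c: 0.0 for c in attacks})
defended = defender_expected_loss(attacks, {"confidentiality": 0.8,
                                            "integrity": 0.6,
                                            "availability": 0.5})
```

    In an agent-based game-theoretic simulation, attacker probabilities and defender mitigations would each be chosen strategically in response to the other, rather than fixed as here.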

  15. Evaluating management tradeoffs between economic fiber production and other ecosystem services in a Chinese-fir dominated forest plantation in Fujian Province.

    PubMed

    Kang, Haijun; Seely, Brad; Wang, Guangyu; Innes, John; Zheng, Dexiang; Chen, Pingliu; Wang, Tongli; Li, Qinglin

    2016-07-01

    Chinese fir (Cunninghamia lanceolata) is not only a valuable timber species, but also plays an important role in the provision of ecosystem services. Forest management decisions to increase the production of fiber for economic gain may have negative impacts on the long-term flow of ecosystem services from forest resources. Such tradeoffs should be taken into account to fulfill the requirements of sustainable forest management. Here we employed an established, ecosystem-based, stand-level model (FORECAST) in combination with a simplified harvest-scheduling model to evaluate the potential tradeoffs among indicators of provisioning, regulating and supporting ecosystem services in a Chinese-fir-dominated landscape located in Fujian Province as a case study. Indicators included: merchantable volume harvested, biomass harvested, ecosystem carbon storage, CO2 fixation, O2 release, biomass nitrogen content, pollutant absorption, and soil fertility. A series of alternative management scenarios, representing different combinations of rotation length and harvest intensity, were simulated to facilitate the analysis. Results from the analysis were summarized in the form of a decision matrix designed to provide a method for forest managers to evaluate management alternatives and tradeoffs in the context of key indicators of ecosystem services. The scenario analysis suggests that there are considerable tradeoffs in terms of ecosystem services associated with stand and landscape-level management decisions. Longer rotations and increased retention tended to favor regulating and supporting services, while the opposite was true for provisioning services. Copyright © 2016 Elsevier B.V. All rights reserved.
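    A decision matrix of the kind described can be reduced to a weighted-sum ranking once indicators are normalized. The scenario names, indicator values, and weights below are illustrative assumptions, not the FORECAST results:

```python
def score_scenarios(matrix, weights):
    """Weighted-sum decision matrix: indicator values are pre-normalized
    to [0, 1] with higher = better; returns (ranking best-first, scores)."""
    scores = {name: sum(w * v for w, v in zip(weights, vals))
              for name, vals in matrix.items()}
    ranking = sorted(scores, key=scores.get, reverse=True)
    return ranking, scores

# Hypothetical normalized indicators: [volume harvested, carbon storage,
# soil fertility] for two management scenarios.
matrix = {
    "short_rotation_heavy_harvest": [0.9, 0.3, 0.2],
    "long_rotation_high_retention": [0.4, 0.9, 0.8],
}
weights = [0.3, 0.4, 0.3]  # manager's priorities (assumed)
ranking, scores = score_scenarios(matrix, weights)
```

    Changing the weights shifts the ranking, which is how such a matrix exposes the provisioning-versus-regulating tradeoff to the decision maker rather than hiding it behind a single optimum.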

  16. A modelling framework for the transport, transformation and biouptake of manufactured nanoparticles in the aquatic environment

    NASA Astrophysics Data System (ADS)

    Lofts, Stephen; Keller, Virginie; Dumont, Egon; Williams, Richard; Praetorius, Antonia; von der Kammer, Frank

    2016-04-01

    The development of innovative new chemical products is a key aspect of the modern economy, yet society demands that such development is environmentally sustainable. Developing knowledge of how new classes of chemicals behave following release to the environment is key to understanding the hazards that will potentially result. Nanoparticles are a key example of a class of chemicals that have undergone a significant expansion in production and use in recent years, and so there is a need to develop tools to predict their potential hazard following their deliberate or incidental release to the environment. Generalising the understanding of the environmental behaviour of manufactured nanoparticles is challenging, as they are chemically and physically diverse (e.g. metals, metal oxides, carbon nanotubes, cellulose, quantum dots). Furthermore, nanoparticles may be manufactured with capping agents to modify their desired behaviour in industrial applications; such agents may also influence their environmental behaviour. Also, nanoparticles may become significantly modified from their as-manufactured forms both prior to and after the point of environmental release. Tools for predicting nanoparticle behaviour and hazard need to be able to consider a wide range of release scenarios and aspects of nanoparticle behaviour in the environment (e.g. dissolution, transformation of capping agents, agglomeration and aggregation behaviour), where such behaviours are not shared by all types of nanoparticle. This implies the need for flexible, future-proofed tools capable of being updated to take new understanding of behavioural processes into account as such knowledge emerges. This presentation will introduce the NanoFASE model system, a multimedia modelling framework for the transport, transformation and biouptake of manufactured nanoparticles.
    The complete system will comprise atmospheric, terrestrial and aquatic compartments to allow holistic simulation of nanoparticles; this presentation will focus on the aquatic compartment but will demonstrate the linkages with the other compartments. The system is intended for application at scales up to a large European river catchment at a resolution suitable for assessing spatially-resolved fate and biouptake. The model will be built around a transport framework which will deal with the bulk movement of water, sediments and nanoparticles through the system. Transformation processes, such as dissolution, capping agent transformation, sorption of environmental molecules to particle surfaces, and heteroaggregation with sediments, will be dealt with within 'reactors' within each grid cell. The reactor concept is intended to provide a separation between transport and transformation processes, to allow alternative formulations of transformation processes for different classes of nanoparticles, and to allow new and updated formulations to be readily incorporated. Using this structure we aim to develop a flexible system capable of simulating the environmental transformation and fate of diverse nanoparticle classes under varying release scenarios. We will show conceptual model structures for release scenarios most relevant to the aquatic environment.

  17. Energy Systems Integration News | Energy Systems Integration Facility |

    Science.gov Websites

    The newly released Eastern Renewable Energy Integration Study (ERGIS), a power-grid modeling study of four potential wind and PV futures and their associated operational impacts in the Eastern Interconnection, shows that the power grid of the eastern United States can support upwards of 30% wind and PV generation.

  18. Wall fluidization in two acts: from stiff to soft roughness.

    PubMed

    Derzsi, Ladislav; Filippi, Daniele; Lulli, Matteo; Mistura, Giampaolo; Bernaschi, Massimo; Garstecki, Piotr; Sbragaglia, Mauro; Pierno, Matteo

    2018-02-14

    Fluidization of soft glassy materials (SGMs) in microfluidic channels is affected by the wall roughness in the form of microtexturing. When SGMs flow across microgrooves, their constituents are likely trapped within the grooves' gap, and the way they are released locally modifies the fluidization close to the walls. By leveraging a suitable combination of experiments and numerical simulations on concentrated emulsions (a model SGM), we quantitatively report the existence of two physically different scenarios. When the gap is large compared to the droplets in the emulsion, the droplets hit the solid obstacles and easily escape scrambling with their neighbors. Conversely, as the gap spacing is reduced, droplets get trapped inside, creating a "soft roughness" layer, i.e. a complementary series of deformable posts from which overlying droplets are in turn released. In both cases, the induced fluidization scales with the grooves' density, although with a reduced prefactor for narrow gaps, accounting for the softness of the roughness. Both scenarios are also well distinguished via the statistics of the droplets displacement field close to the walls, with large deviations induced by the surface roughness, depending on its stiffness.

  19. Dendrochemistry of multiple releases of chlorinated solvents at a former industrial site

    USGS Publications Warehouse

    Balouet, Jean Christophe; Burken, Joel G.; Karg, Frank; Vroblesky, Don; Smith, Kevin T.; Grudd, Hakan; Rindby, Anders; Beaujard, Francois; Chalot, Michel

    2012-01-01

    Trees can take up and assimilate contaminants from the soil, subsurface, and groundwater. Contaminants in the transpiration stream can become bound or incorporated into the annual rings formed in trees of the temperate zones. The chemical analysis of precisely dated tree rings, called dendrochemistry, can be used to interpret past plant interactions with contaminants. This investigation demonstrates that dendrochemistry can be used to generate historical scenarios of past contamination of groundwater by chlorinated solvents at a site in Verl, Germany. Increment cores from trees at the Verl site were collected and analyzed by energy-dispersive X-ray fluorescence (EDXRF) line scanning. The EDXRF profiles showed four to six time periods where tree rings had anomalously high concentrations of chlorine (Cl) as an indicator of potential contamination by chlorinated solvents.

  20. Simulation of Reclaimed-Water Injection and Pumping Scenarios and Particle-Tracking Analysis near Mount Pleasant, South Carolina

    USGS Publications Warehouse

    Petkewich, Matthew D.; Campbell, Bruce G.

    2009-01-01

    The effect of injecting reclaimed water into the Middendorf aquifer beneath Mount Pleasant, South Carolina, was simulated using a groundwater-flow model of the Coastal Plain Physiographic Province of South Carolina and parts of Georgia and North Carolina. Reclaimed water, also known as recycled water, is wastewater or stormwater that has been treated to an appropriate level so that the water can be reused. The scenarios were simulated to evaluate potential changes in groundwater flow and groundwater-level conditions caused by injecting reclaimed water into the Middendorf aquifer. Simulations included a Base Case and two injection scenarios. Maximum pumping rates were simulated as 6.65, 8.50, and 10.5 million gallons per day for the Base Case, Scenario 1, and Scenario 2, respectively. The Base Case simulation represents a non-injection estimate of the year 2050 groundwater levels for comparison purposes for the two injection scenarios. For Scenarios 1 and 2, the simulated injection of reclaimed water at 3 million gallons per day begins in 2012 and continues through 2050. The flow paths and time of travel for the injected reclaimed water were simulated using particle-tracking analysis. The simulations indicated a general decline of groundwater altitudes in the Middendorf aquifer in the Mount Pleasant, South Carolina, area between 2004 and 2050 for the Base Case and two injection scenarios. For the Base Case, groundwater altitudes generally declined about 90 feet from the 2004 groundwater levels. For Scenarios 1 and 2, although groundwater altitudes initially increased in the Mount Pleasant area because of the simulated injection, these higher groundwater levels declined as Mount Pleasant Waterworks pumping increased over time. 
When compared to the Base Case simulation, 2050 groundwater altitudes for Scenario 1 range from 15 feet lower to 23 feet higher for production wells, are between 41 and 77 feet higher for the injection wells, and are between 9 and 23 feet higher for observation wells in the Mount Pleasant area. When compared to the Base Case simulation, 2050 groundwater altitudes for Scenario 2 are between 2 and 106 feet lower for production wells and observation wells and between 11 and 27 feet higher for the injection wells in the Mount Pleasant area. Water budgets for the model area immediately surrounding the Mount Pleasant area were calculated for 2011 and for 2050. The largest flow component of the 2050 water budget in the Mount Pleasant area is discharge through wells at rates between 7.1 and 10.9 million gallons of water per day. This groundwater is replaced predominantly by between 6.0 and 7.8 million gallons per day of lateral groundwater flow within the Middendorf aquifer for the Base Case and two scenarios and through reclaimed-water injection of 3 million gallons per day for Scenarios 1 and 2. In addition, between 175,000 and 319,000 gallons of groundwater are removed from this area per day because of the regional hydraulic gradient. Additional sources of water to this area are groundwater storage releases at rates between 86,800 and 116,000 gallons per day and vertical flow from over- and underlying confining units at rates between 69,100 and 150,000 gallons per day. Reclaimed water injected into the Middendorf aquifer at three hypothetical injection wells moved to the Mount Pleasant Waterworks production wells in 18 to 256 years, as indicated by particle-tracking simulations. Time of travel varied from 18 to 179 years for a simulated uniform aquifer porosity of 20 percent and from 25 to 256 years for a porosity of 30 percent.
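The porosity sensitivity of the travel times follows directly from the definition of average linear (seepage) velocity, v = q/n: for a fixed Darcy flux q, advective travel time along a flow path scales linearly with porosity n. A sketch with hypothetical flow-path numbers (not values from the report):

```python
def travel_time_years(distance_m: float, darcy_flux_m_per_yr: float,
                      porosity: float) -> float:
    """Advective travel time along a flow path: t = L / (q/n) = L * n / q."""
    seepage_velocity = darcy_flux_m_per_yr / porosity  # average linear velocity
    return distance_m / seepage_velocity

# Hypothetical 3 km flow path with the same Darcy flux, two porosities.
t20 = travel_time_years(3000.0, 10.0, 0.20)   # 60 years
t30 = travel_time_years(3000.0, 10.0, 0.30)   # 90 years
assert abs(t30 / t20 - 0.30 / 0.20) < 1e-12   # t scales linearly with porosity
```

The report's ranges (18 to 179 years at 20% porosity versus 25 to 256 years at 30%) are roughly consistent with this 1.5x scaling.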

  1. Estimating urban flood risk - uncertainty in design criteria

    NASA Astrophysics Data System (ADS)

    Newby, M.; Franks, S. W.; White, C. J.

    2015-06-01

    The design of urban stormwater infrastructure is generally performed assuming that climate is static. For engineering practitioners, stormwater infrastructure is designed using a peak flow method, such as the Rational Method outlined in the Australian Rainfall and Runoff (AR&R) guidelines, together with estimates of design rainfall intensities. Changes to Australian rainfall intensity design criteria have been made through updated releases of the AR&R77, AR&R87 and the recent 2013 AR&R Intensity Frequency Distributions (IFDs). The primary focus of this study is to compare the three IFD sets at 51 locations Australia-wide. Since the release of the AR&R77 IFDs, the duration and number of rainfall records have increased and techniques for data analysis have changed. Updated terminology coinciding with the 2013 IFD release has also resulted in a practical change to the design rainfall. For example, infrastructure designed for a 1:5 year ARI corresponds to an 18.13% AEP; however, for practical purposes, hydraulic guidelines have been updated with the more intuitive 20% AEP. The evaluation of design rainfall variation across Australia indicates that the changes depend upon location, recurrence interval and rainfall duration. The changes to design rainfall IFDs are due to the application of differing data analysis techniques, the length and number of data sets, and the change in terminology from ARI to AEP. Such changes mean that existing infrastructure has been designed to a range of different design criteria, indicating the likely inadequacy of earlier developments relative to current estimates of flood risk. In many cases, the under-design of infrastructure is greater than the expected impact of increased rainfall intensity under climate change scenarios.
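The quoted ARI-to-AEP conversion follows from assuming Poisson-distributed event arrivals, AEP = 1 - exp(-1/ARI). A quick check of the 1:5-year figure:

```python
import math

def ari_to_aep(ari_years: float) -> float:
    """Annual exceedance probability from average recurrence interval,
    assuming Poisson-distributed event arrivals: AEP = 1 - exp(-1/ARI)."""
    return 1.0 - math.exp(-1.0 / ari_years)

aep = ari_to_aep(5.0)
print(f"{aep:.2%}")  # prints 18.13%, the figure quoted for a 1-in-5-year ARI
```

This is why the practical rounding to "20% AEP" is a genuine (if small) change to the design criterion rather than a pure relabelling.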

  2. Indoor Air Nuclear, Biological, and Chemical Health Modeling and Assessment System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stenner, Robert D.; Hadley, Donald L.; Armstrong, Peter R.

    2001-03-01

    Indoor air quality effects on human health are of increasing concern to public health agencies and building owners. The prevention and treatment of 'sick building' syndrome and the spread of air-borne diseases in hospitals, for example, are well known priorities. However, increasing attention is being directed to the vulnerability of our public buildings/places, public security and national defense facilities to terrorist attack or the accidental release of air-borne biological pathogens, harmful chemicals, or radioactive contaminants. The Indoor Air Nuclear, Biological, and Chemical Health Modeling and Assessment System (IA-NBC-HMAS) was developed to serve as a health impact analysis tool for use in addressing these concerns. The overall goal was to develop a user-friendly fully functional prototype Health Modeling and Assessment system, which will operate under the PNNL FRAMES system for ease of use and to maximize its integration with other modeling and assessment capabilities accessible within the FRAMES system (e.g., ambient air fate and transport models, water borne fate and transport models, Physiologically Based Pharmacokinetic models, etc.). The prototype IA-NBC-HMAS is designed to serve as a functional Health Modeling and Assessment system that can be easily tailored to meet specific building analysis needs of a customer. The prototype system was developed and tested using an actual building (i.e., the Churchville Building located at the Aberdeen Proving Ground) and release scenario (i.e., the release and measurement of tracer materials within the building) to ensure realism and practicality in the design and development of the prototype system. A user-friendly "demo" accompanies this report to allow the reader the opportunity for a "hands on" review of the prototype system's capability.

  3. Environmental and economic life cycle assessment for sewage sludge treatment processes in Japan.

    PubMed

    Hong, Jinglan; Hong, Jingmin; Otaki, Masahiro; Jolliet, Olivier

    2009-02-01

    Life cycle assessment for sewage sludge treatment was carried out by estimating the environmental and economic impacts of the six alternative scenarios most often used in Japan: dewatering, composting, drying, incineration, incinerated ash melting and dewatered sludge melting, each with or without digestion. Three end-of-life treatments were also studied: landfilling, agricultural application and building material application. The results demonstrate that sewage sludge digestion can reduce the environmental load and cost through reduced dry matter volume. The global warming potential (GWP) generated from incineration and melting processes can be significantly reduced through the reuse of waste heat for electricity and/or heat generation. Equipment production in all scenarios except dewatering has an important effect on GWP, whereas the contribution of construction is negligible. In addition, the results show that the dewatering scenario has the highest impact on land use and cost, the drying scenario has the highest impact on GWP and acidification, and the incinerated ash melting scenario has the highest impact on human toxicity due to re-emissions of heavy metals from incinerated ash in the melting unit process. Conversely, the dewatering, composting and incineration scenarios generate the lowest impact on human toxicity, land use and acidification, respectively, and the incinerated ash melting scenario has the lowest impact on GWP and cost. Heavy metals released in atmospheric effluents generated the highest human toxicity impact, with the effect of dioxin emissions being significantly lower. This study showed that the dewatered sludge melting scenario is an environmentally optimal and economically affordable method.

  4. eHive: An Artificial Intelligence workflow system for genomic analysis

    PubMed Central

    2010-01-01

    Background The Ensembl project produces updates to its comparative genomics resources with each of its several releases per year. During each release cycle approximately two weeks are allocated to generate all the genomic alignments and the protein homology predictions. The number of calculations required for this task grows approximately quadratically with the number of species. We currently support 50 species in Ensembl and we expect the number to continue to grow in the future. Results We present eHive, a new fault tolerant distributed processing system initially designed to support comparative genomic analysis, based on blackboard systems, network distributed autonomous agents, dataflow graphs and block-branch diagrams. In the eHive system a MySQL database serves as the central blackboard and the autonomous agent, a Perl script, queries the system and runs jobs as required. The system allows us to define dataflow and branching rules to suit all our production pipelines. We describe the implementation of three pipelines: (1) pairwise whole genome alignments, (2) multiple whole genome alignments and (3) gene trees with protein homology inference. Finally, we show the efficiency of the system in real case scenarios. Conclusions eHive allows us to produce computationally demanding results in a reliable and efficient way with minimal supervision and high throughput. Further documentation is available at: http://www.ensembl.org/info/docs/eHive/. PMID:20459813
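The blackboard pattern described above (a central job database polled by autonomous worker agents) can be sketched as follows. This is an illustrative toy, not eHive itself: eHive uses a MySQL blackboard and Perl agents, whereas this sketch uses an in-memory SQLite table with invented column names.

```python
import sqlite3

# Toy 'blackboard': a job table that autonomous workers poll and claim.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE job (id INTEGER PRIMARY KEY, input TEXT, "
           "status TEXT DEFAULT 'READY')")
db.executemany("INSERT INTO job (input) VALUES (?)",
               [("genome_a",), ("genome_b",)])

def claim_and_run(conn):
    """One worker iteration: claim the next READY job, run it, mark it DONE.
    A real system would claim atomically to avoid races between workers."""
    row = conn.execute(
        "SELECT id, input FROM job WHERE status = 'READY' LIMIT 1").fetchone()
    if row is None:
        return None
    job_id, data = row
    conn.execute("UPDATE job SET status = 'CLAIMED' WHERE id = ?", (job_id,))
    result = data.upper()  # stand-in for the actual analysis step
    conn.execute("UPDATE job SET status = 'DONE' WHERE id = ?", (job_id,))
    return job_id, result

results = []
while (job := claim_and_run(db)) is not None:  # dataflow/branching rules could
    results.append(job)                        # enqueue follow-up jobs here
print(results)  # [(1, 'GENOME_A'), (2, 'GENOME_B')]
```

The appeal of the design, as the abstract notes, is that workers are stateless and interchangeable: throughput scales by starting more agents against the same blackboard, and a crashed worker simply leaves a claimable job behind.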

  5. On the siting of gases shock-emplaced from internal cavities in basalt

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wiens, R.C.

    1988-12-01

    Noble gases were extracted by stepped combustion and crushing from basalts which contained gas-filled cavities of controlled sizes prior to shock at 40 GPa. Analysis of fractions enriched and depleted in shock glass from a single sample gave a factor of 2 higher gas abundances in the glass-rich separate. Release patterns were nearly identical, suggesting similar siting (in glass) in both fractions. Crushing of a sample released ~45% of implanted noble gases, but only ~17% of N2, indicating that most or all of the noble gas was trapped in vesicles. Analysis by SEM/EDS confirmed the presence of vesicles in glassy areas, with an average diameter of ~10 µm. Samples with relatively large pre-shock cavities were found to consist of up to 70-80% glass locally and generally exhibit greater local shock effects than solid and densely-packed particulate targets at the same shock pressure, though the latter give higher glass emplacement efficiencies. The petrographic results indicate that in situ production of glassy pockets grossly similar to those in the shergottite EETA 79001 is possible from shock reverberations in the vicinity of a vug. However, the siting of the gases points to a more complex scenario, in which SPB gas and melt material were probably injected into EETA 79001.

  6. SERA Scenarios of Early Market Fuel Cell Electric Vehicle Introductions: Modeling Framework, Regional Markets, and Station Clustering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bush, B.; Melaina, M.; Penev, M.

    This report describes the development and analysis of detailed temporal and spatial scenarios for early market hydrogen fueling infrastructure clustering and fuel cell electric vehicle rollout using the Scenario Evaluation, Regionalization and Analysis (SERA) model. The report provides an overview of the SERA scenario development framework and discusses the approach used to develop the nationwide scenario.

  7. Recommended Parameter Values for GENII Modeling of Radionuclides in Routine Air and Water Releases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Snyder, Sandra F.; Arimescu, Carmen; Napier, Bruce A.

    The GENII v2 code is used to estimate dose to individuals or populations from the release of radioactive materials into air or water. Numerous parameter values are required as input to this code. User-defined parameters span chemical data, meteorological data, agricultural data, and behavioral data. This document is a summary of parameter values that reflect conditions in the United States. Reasonable regional and age-dependent data are summarized. Data availability and quality vary. The set of parameters described addresses scenarios for chronic air emissions or chronic releases to public waterways. Considerations for the special tritium and carbon-14 models are briefly addressed. GENII v2.10.0 is the current software version that this document supports.

  8. AN ALTERNATIVE FUTURES ANALYSIS OF FARMINGTON BAY WETLANDS IN THE GREAT SALT LAKE

    EPA Science Inventory

    An Alternative Futures Analysis (AFA) was conducted to evaluate tradeoffs between landscape design scenarios and ecological services for Farmington Bay, Great Salt Lake (GSL), wetlands. Model scenarios included plan trend and conservation "futures" scenarios projected to 2030. ...

  9. When virulence originates from non-agricultural hosts: new insights into plant breeding.

    PubMed

    Leroy, Thibault; Le Cam, Bruno; Lemaire, Christophe

    2014-10-01

    Monogenic plant resistance breakdown is a model for testing evolution in action in pathogens. As a rule, plant pathologists argue that virulence - the allele that allows pathogens to overcome resistance - is due to a new mutation at the avirulence locus within the native/endemic population that infects susceptible crops. In this article, we develop an alternative and neglected scenario where a given virulence pre-exists in a non-agricultural host and might be accidentally released or introduced on the matching resistant cultivar in the field. The main difference between the two scenarios is the divergence time expected between the avirulent and the virulent populations. As a consequence, population genetic approaches such as genome scans and Approximate Bayesian Computation methods allow explicit testing of the two scenarios by timing the divergence. This review then explores the fundamental implications of this alternative scenario for plant breeding, including the invasion of virulence or the evolution of more aggressive hybrids, and proposes concrete solutions to achieve a sustainable resistance. Copyright © 2014 Elsevier B.V. All rights reserved.

  10. [Scenario analysis--a method for long-term planning].

    PubMed

    Stavem, K

    2000-01-10

    The term 'scenario' comes from the film industry, where it denotes a detailed description of a film; this has given its name to scenario analysis, a method for long-term planning using descriptions of composite pictures of the future. This article is an introduction to the scenario method. Scenarios describe plausible, not necessarily probable, developments. They focus on problems and questions that decision makers must be aware of and prepare to deal with, and on the consequences of alternative decisions. Scenarios are used in corporate and governmental planning, and they can be a useful complement to traditional planning and extrapolation of past experience. The method is particularly useful in a rapidly changing world with shifting external conditions.

  11. The impact of high altitude aircraft on the ozone layer in the stratosphere

    NASA Technical Reports Server (NTRS)

    Tie, Xue XI; Brasseur, Guy; Lin, Xing; Friedlingstein, P.; Granier, Claire; Rasch, Philip

    1994-01-01

    The paper discusses the potential effects on the ozone layer of gases released by the engines of proposed high altitude supersonic aircraft. The major problem arises from the emissions of nitrogen oxides, which have the potential to destroy significant quantities of ozone in the stratosphere. The magnitude of the perturbation is highly dependent on the cruise altitude of the aircraft. Furthermore, the depletion of ozone is substantially reduced when heterogeneous conversion of nitrogen oxides into nitric acid on sulfate aerosol particles is taken into account in the calculation. The sensitivity of stratospheric ozone to the aerosol load is investigated. First, the model indicates that the aerosol load induced by the SO2 released by aircraft is increased by about 10-20% above the background aerosols at mid-to-high latitudes of the Northern Hemisphere at 15 km for the NASA emission scenario A (the NASA emission scenarios are explained in Tables I to III). This increase in aerosol has small effects on stratospheric ozone. Second, when the aerosol load is increased following a volcanic eruption similar to the eruption of El Chichon (Mexico, April 1982), the ozone column in spring increases by as much as 9% in response to the injection of NOx from the aircraft with the NASA emission scenario A. Finally, the model suggests that significant ozone depletion could result from the formation of additional polar stratospheric clouds produced by the injection of H2O and HNO3 by the aircraft engines.

  12. Evaluation of the impact of lime softening waste disposal in natural environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blaisi, Nawaf I.; Roessler, Justin; Cheng, Weizhi

    2015-09-15

    Highlights: • Leaching tests conducted on WTR to assess potential for trace element release. • Aluminum leaching found to be elevated with respect to risk threshold. • Release in anaerobic conditions evaluated with column test run in nitrogen chamber. • Increased release of certain elements seen from residues under anaerobic conditions. • Different leaching tests produced results on two sides of regulatory threshold. - Abstract: Drinking water treatment residues (WTR), generated from the lime softening process, are commonly reused or disposed of in a number of applications; these include use as a soil amendment or a subsurface fill. Recently, questions were posed by the Florida regulatory community on whether lime WTR that contained a small percentage of other treatment additives could appropriately be characterized as lime WTR, in terms of total element content and leachability. A study was done using a broad range of leaching tests, including a framework of tests recently adopted by the United States Environmental Protection Agency (EPA) and tests that were modified to account for scenario-specific conditions, such as the presence of natural organic matter (NOM). The results of these additional leaching tests demonstrated that certain applications, including disposal in a water body with NOM or placement in an anaerobic environment, did result in increased leaching of elements such as Fe, and that a site-specific assessment should be conducted prior to using WTR in these types of applications. This study illustrates the importance of leaching test selection when attempting to provide an estimation of release in practice. Although leaching tests are just one component in a beneficial use assessment, and other factors including aquifer and soil properties play a significant role in the outcome, leaching tests should be tailored to most appropriately represent the scenario or reuse application being evaluated.

  13. SOARCA Peach Bottom Atomic Power Station Long-Term Station Blackout Uncertainty Analysis: Knowledge Advancement.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gauntt, Randall O.; Mattie, Patrick D.; Bixler, Nathan E.

    2014-02-01

    This paper describes the knowledge advancements from the uncertainty analysis for the State-of-the-Art Reactor Consequence Analyses (SOARCA) unmitigated long-term station blackout accident scenario at the Peach Bottom Atomic Power Station. This work assessed key MELCOR and MELCOR Accident Consequence Code System, Version 2 (MACCS2) modeling uncertainties in an integrated fashion to quantify the relative importance of each uncertain input on potential accident progression, radiological releases, and off-site consequences. This quantitative uncertainty analysis provides measures of the effects on consequences of each of the selected uncertain parameters, both individually and in interaction with other parameters. The results measure the model response (e.g., variance in the output) to uncertainty in the selected input. Investigation into the important uncertain parameters in turn yields insights into important phenomena for accident progression and off-site consequences. This uncertainty analysis confirmed the known importance of some parameters, such as the failure rate of the safety relief valve in accident progression modeling and the dry deposition velocity in off-site consequence modeling. The analysis also revealed some new insights, such as the dependent effect of cesium chemical form for different accident progressions.
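One common way to obtain per-parameter importance measures of this kind is to sample all uncertain inputs jointly and rank-correlate each input with the model output. The sketch below is a generic illustration of that idea, not the SOARCA methodology: the two input names and the toy consequence model are invented.

```python
import random

def ranks(xs):
    """Rank transform (1 = smallest); ties ignored since inputs are continuous."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0] * len(xs)
    for rank, i in enumerate(order, start=1):
        r[i] = rank
    return r

def spearman(xs, ys):
    """Spearman rank correlation: Pearson correlation of the rank vectors.
    Both rank vectors are permutations of 1..n, so their variances are equal."""
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    mean = (n + 1) / 2
    num = sum((a - mean) * (b - mean) for a, b in zip(rx, ry))
    den = sum((a - mean) ** 2 for a in rx)
    return num / den

random.seed(1)
N = 2000
# Two hypothetical uncertain inputs, e.g. a valve failure rate multiplier
# and a deposition velocity multiplier (names invented for illustration).
x1 = [random.uniform(0.5, 2.0) for _ in range(N)]
x2 = [random.uniform(0.5, 2.0) for _ in range(N)]
# Toy consequence model: strongly driven by x1, weakly by x2, plus noise.
y = [3 * a + 0.3 * b + random.gauss(0, 0.2) for a, b in zip(x1, x2)]

for name, xs in [("input_1", x1), ("input_2", x2)]:
    print(name, round(spearman(xs, y), 2))
```

Running this shows a rank correlation near 1 for the dominant input and near 0 for the weak one, which is the kind of individual-importance measure the abstract refers to; interaction effects require additional techniques beyond this sketch.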

  14. Developing Scenarios: Linking Environmental Scanning and Strategic Planning.

    ERIC Educational Resources Information Center

    Whiteley, Meredith A.; And Others

    1990-01-01

    The multiple scenario analysis technique for organizational planning used by multinational corporations is adaptable for colleges and universities. Arizona State University launched a futures-based planning project using the Delphi technique and cross-impact analysis to produce three alternative scenarios (stable, turbulent, and chaotic) to expand…

  15. Tracking the MSL-SAM methane detection source location Through Mars Regional Atmospheric Modeling System (MRAMS)

    NASA Astrophysics Data System (ADS)

    Pla-García, Jorge

    2016-04-01

    1. Introduction: The putative in situ detection of methane by the Sample Analysis at Mars (SAM) instrument suite on Curiosity at Gale crater has garnered significant attention because of the potential implications for the presence of geological methane sources or indigenous Martian organisms [1, 2]. SAM reported detection of background levels of atmospheric methane with a mean value of 0.69±0.25 parts per billion by volume (ppbv) at the 95% confidence interval (CI). Additionally, in four sequential measurements spanning a 60-sol period, SAM observed elevated levels of methane of 7.2±2.1 ppbv (95% CI), implying that Mars is episodically producing methane from an additional unknown source. There are many major unresolved questions regarding this detection: 1) What are the potential sources of the methane release? 2) What causes the rapid decrease in concentration? 3) Where is the release location? 4) How spatially extensive is the release? 5) For how long is CH4 released? Regarding the first question, the source of the methane has so far not been identified. It could be related to geological processes such as methane release from clathrates [3], serpentinisation [4] and volcanism [5], or to biological activity from methanogenesis [6]. To answer the second question, the rapid decrease in concentration, it is important to note that the photochemical lifetime of methane is of order 100 years, much longer than the atmospheric mixing time scale, and thus the gas should tend to be well mixed except near a source or shortly after an episodic release. The observed spike of 7 ppb above the background of <1 ppb, and the rapid return to the background level, could be due to a sink (destruction) or to atmospheric mixing. A wind-mediated erosion process of ordinary quartz crystals was proposed to produce activated quartz grains, which sequester methane by forming covalent Si-C bonds.
If this process is operational on Mars today, which some recent preliminary studies on olivine indicate could be the case, then it might explain the observed fast destruction of methane [7]. In an effort to better address the potential mixing and the remaining questions, atmospheric circulation studies of Gale Crater were performed with the Mars Regional Atmospheric Modeling System (MRAMS). The model was focused on rover locations using nested grids with a spacing of 330 meters on the innermost grid, which is centered over the landing [8, 9]. MRAMS is ideally suited for this investigation; the model is explicitly designed to simulate Mars' atmospheric circulations at the mesoscale and smaller with realistic, high-resolution surface properties [10, 11]. In order to characterize seasonal mixing changes throughout the Martian year, simulations were conducted at Ls 0, 90, 180 and 270. Two additional simulations at Ls 225 and 315 were explored to better understand the unique meteorological setting centered around Ls 270. Ls 270 was shown to be an anomalous season when air within and outside the crater was well mixed by strong, flushing, northerly flow and large-amplitude breaking mountain waves: air flowing downslope at night is cold enough to penetrate all the way to the surface. At other seasons, the air in the crater is more isolated, but not completely, from the surrounding environment: mesoscale simulations indicate that the air flowing down the crater rims does not easily make it to the crater floor. Instead, the air encounters very cold and stable air pooled in the bottom of the crater, which forces the air to glide right over the colder, denser air below. Thus, the mixing of near-surface crater air with the external environment is potentially more limited than around Ls 270. 2.
Tracking methane source location: The rise in concentration was reported to start around sol 300 (~Ls 336), peaked shortly after sol 520 (~Ls 82), and then dropped to background values prior to sol 575 (~Ls 103). Two scenarios are considered in the context of the circulations predicted by MRAMS. The first scenario is the release of methane from somewhere outside the crater; the second is a release of methane within the crater. In both cases, the release is assumed to take place near the season when the rise of concentration was first noted (~Ls 336). This is a transitional time at Gale Crater, when the flushing winds are giving way to the more isolated crater scenario. In the situation where the release was outside the crater, the experiment assumes a uniform, elevated abundance of CH4 outside the crater, and mixing should be sufficient to bring the crater methane abundance to something close to the larger-scale environmental value. As the crater becomes more isolated with time, the methane abundance in the crater will begin to lag whatever the value is outside the crater. If the release was far from the crater, the external ~7 ppbv value might be expected to slowly decrease as the methane becomes increasingly well mixed on a global scale, and as some of that air mixes slowly into the crater. For the elevated methane levels in the crater to drop rapidly back to background levels, at least two things would need to happen. First, the external crater environment would have to drop at least as rapidly to the background levels. This seems possible only if there is very deep mixing that spreads the release through a very large volume of atmosphere, or if a rapid destruction mechanism is invoked. The second thing that would have to happen is that the crater air would have to mix nearly completely with the external crater air.
The model results at Ls 90, which bounds the period between the observed peak and the return to the background levels, may be supportive of this idea. However, while mixing seems limited, it may still be possible that the mixing degree and time scale is sufficient to affect the necessary change. In the second scenario, the release is assumed to be within the crater. In this case, some mixing of this air with external crater air at background values can be assumed. Depending on the rate of mixing, it is pos-sible that the value could decay to the background levels in the given time. Thus, from a mixing stand-point, the second scenario seems at least plausible. Some preliminary work, including tracer gases into the model, is being performed to establish the amount of mixing during the limited mixing epochs. Preliminary results may support the idea that during periods of limited mixing, there could be enough time for methane to bind to activated mineral surfac-es through wind erosion. More work is needed to establish the amount of mixing and to de-termine which scenario is more likely. References: [1] Webster et al. (2013), LPI contributions, 1719: 1366; [2] Webster et al. (2015), Science, vol. 347, no. 6220, 415-417; [3] Chastain and Chevrier (2007). Planet. Space Science, 55, 1246-1256; [4] Oze and Sharma (2005). Geophys. Res. Lett., 32, L10203; [5] Etiope et al. (2007), J. Volcanol. Geo-therm. Res., 165, 76-86; [6] Reid et al. (2006), Int. J. Astrobiol., 5, 89-97; [7] Jensen et al. (2014), Icarus, 236, 24-27; [8] Rafkin, S. C. R. et al. (2001), Icarus, 151, 228-256;?[9] Rafkin, S. C. R. et al. (2002), Na-ture, 419, 697-699. [10] Pla-Garcia et al. (2016), Icarus, Accepted; [11] Rafkin S.C.R. et al. (2016), Icarus, Accepted
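The bounding behaviour described above can be sketched with a single-box mixing model in which crater air relaxes toward the external background on an e-folding exchange timescale. The abundances and timescales below are purely hypothetical illustrations, not MRAMS results.

```python
import math

def crater_mixing(c0, c_bg, tau_sols, t_sols):
    """Single-box mixing model: the crater methane abundance relaxes
    exponentially toward the external background value.

    c0       -- initial crater abundance (ppbv)
    c_bg     -- external background abundance (ppbv)
    tau_sols -- e-folding air-exchange time of the crater (sols)
    t_sols   -- elapsed time (sols)
    """
    return c_bg + (c0 - c_bg) * math.exp(-t_sols / tau_sols)

# Hypothetical numbers: a ~7 ppbv enhancement decaying toward a
# ~0.7 ppbv background over the ~55 sols between sol 520 and sol 575.
for tau in (10.0, 25.0, 100.0):
    c = crater_mixing(7.0, 0.7, tau, 55.0)
    print(f"tau = {tau:5.1f} sols -> {c:.2f} ppbv after 55 sols")
```

The faster the assumed exchange (smaller tau), the easier it is to reconcile the rapid return to background without invoking a destruction mechanism.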

  16. Probabilistic dose assessment of normal operations and accident conditions for an assured isolation facility in Texas

    NASA Astrophysics Data System (ADS)

    Arno, Matthew Gordon

Texas is investigating building a long-term waste storage facility, also known as an Assured Isolation Facility. This is an above-ground low-level radioactive waste storage facility that is actively maintained and from which waste may be retrieved. A preliminary, scoping-level analysis has been extended to consider more complex scenarios of radiation streaming and skyshine by using the computer code Monte Carlo N-Particle (MCNP) to model the facility in greater detail. Accidental release scenarios have been studied in more depth to better assess the potential dose to off-site individuals. Using bounding source term assumptions, the projected radiation doses and dose rates are estimated to exceed applicable limits by an order of magnitude. By altering the facility design to fill in the hollow cores of the prefabricated concrete slabs used in the roof over the "high-gamma rooms," where the waste with the highest concentration of gamma-emitting radioactive material is stored, dose rates outside the facility decrease by an order of magnitude. With the modified design, the annual dose at the site fenceline is estimated at 86 mrem, below the 100 mrem annual limit for exposure of the public. Within the site perimeter, the dose rates are lowered sufficiently such that it is not necessary to categorize many workers and contractor personnel as radiation workers, saving on costs as well as being advisable under ALARA principles. A detailed analysis of bounding accidents incorporating information on the local meteorological conditions indicates that the maximum committed effective dose equivalent from the passage of a plume of material released in an accident at any of the cities near the facility is 59 μrem in the city of Eunice, NM based on the combined day and night meteorological conditions. Using the daytime meteorological conditions, the maximum dose at any city is 7 μrem, also in the city of Eunice.
The maximum dose at the site boundary was determined to be 230 mrem using the combined day and night meteorological conditions and 33 mrem using the daytime conditions.

  17. Radiological environmental dose assessment methods and compliance dose results for 2015 operations at the Savannah River Site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jannik, G. T.; Dixon, K. L.

    This report presents the environmental dose assessment methods and the estimated potential doses to the offsite public from 2015 Savannah River Site (SRS) atmospheric and liquid radioactive releases. Also documented are potential doses from special-case exposure scenarios - such as the consumption of deer meat, fish, and goat milk.

  18. 75 FR 8411 - Office of New Reactors: Interim Staff Guidance on Assessing the Consequences of an Accidental...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-24

    ... the public will be better served by being able to review and comment on both documents at this time... Construction Inspection and Operational Programs, Office of New Reactors, U.S. Nuclear Regulatory Commission... conditions for such releases and define acceptable assumptions to describe exposure scenarios and pathways to...

  19. The use of scenario analysis in local public health departments: alternative futures for strategic planning.

    PubMed Central

    Venable, J M; Ma, Q L; Ginter, P M; Duncan, W J

    1993-01-01

    Scenario analysis is a strategic planning technique used to describe and evaluate an organization's external environment. A methodology for conducting scenario analysis using the Jefferson County Department of Health and the national, State, and county issues confronting it is outlined. Key health care and organizational issues were identified using published sources, focus groups, questionnaires, and personal interviews. The most important of these issues were selected by asking health department managers to evaluate the issues according to their probability of occurrence and likely impact on the health department. The high-probability, high-impact issues formed the basis for developing scenario logics that constitute the story line holding the scenario together. The results were a set of plausible scenarios that aided in strategic planning, encouraged strategic thinking among managers, eliminated or reduced surprise about environmental changes, and improved managerial discussion and communication. PMID:8265754
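The probability/impact screening step described above can be sketched as a simple filter over scored issues. The issue names, scores, and cutoffs below are hypothetical illustrations, not values from the Jefferson County study.

```python
# Hypothetical issues scored by managers on probability of occurrence
# (0-1) and likely impact (1-10); names and numbers are illustrative.
issues = [
    {"issue": "State funding cuts",     "p": 0.8, "impact": 9},
    {"issue": "New reporting mandate",  "p": 0.7, "impact": 6},
    {"issue": "Local disease outbreak", "p": 0.3, "impact": 10},
    {"issue": "Staff turnover",         "p": 0.9, "impact": 4},
]

P_MIN, IMPACT_MIN = 0.6, 7  # cutoffs for "high-probability, high-impact"

# Issues passing both cutoffs would seed the scenario logics.
selected = [i["issue"] for i in issues
            if i["p"] >= P_MIN and i["impact"] >= IMPACT_MIN]
print(selected)  # -> ['State funding cuts']
```

In practice the scores would come from the questionnaires and interviews mentioned in the abstract; the filter simply formalizes the "high-probability, high-impact" selection rule.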

  20. Research on response spectrum of dam based on scenario earthquake

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaoliang; Zhang, Yushan

    2017-10-01

Taking a large hydropower station as an example, the response spectrum based on a scenario earthquake is determined. First, the potential seismic source zone with the greatest contribution to the site hazard is identified from the results of probabilistic seismic hazard analysis (PSHA). Second, the magnitude and epicentral distance of the scenario earthquake are calculated from the main faults and historical earthquakes of that potential seismic source zone. Finally, the response spectrum of the scenario earthquake is calculated using the Next Generation Attenuation (NGA) relations. The response spectrum based on the scenario-earthquake method is lower than the probability-consistent response spectrum obtained by the PSHA method. The analysis shows that the scenario-earthquake response spectrum accounts for both the probability level and structural factors, combining the advantages of the deterministic and probabilistic seismic hazard analysis methods. It is easy to interpret and provides a basis for the seismic design of hydraulic engineering works.
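The final step, evaluating an attenuation relation for the scenario magnitude and distance, can be sketched with a generic functional form. The coefficients below are hypothetical placeholders, not actual NGA coefficients.

```python
import math

def ln_sa(mag, r_km, c0=-2.0, c1=0.9, c2=-1.3, c3=10.0):
    """Generic attenuation-relation form ln(SA) = c0 + c1*M + c2*ln(R + c3).
    The coefficients are hypothetical placeholders, not NGA values."""
    return c0 + c1 * mag + c2 * math.log(r_km + c3)

# Scenario earthquake (magnitude and distance as would come from the
# PSHA deaggregation step); values are illustrative.
mag, r_km = 6.5, 20.0
sa_g = math.exp(ln_sa(mag, r_km))
print(f"median SA for M{mag} at {r_km} km: {sa_g:.3f} g")
```

A full scenario spectrum would repeat this evaluation across oscillator periods with period-dependent coefficients; the sketch shows only the basic magnitude/distance scaling.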

  1. Predicting induced radioactivity for the accelerator operations at the Taiwan Photon Source.

    PubMed

    Sheu, R J; Jiang, S H

    2010-12-01

    This study investigates the characteristics of induced radioactivity due to the operations of a 3-GeV electron accelerator at the Taiwan Photon Source (TPS). According to the beam loss analysis, the authors set two representative irradiation conditions for the activation analysis. The FLUKA Monte Carlo code has been used to predict the isotope inventories, residual activities, and remanent dose rates as a function of time. The calculation model itself is simple but conservative for the evaluation of induced radioactivity in a light source facility. This study highlights the importance of beam loss scenarios and demonstrates the great advantage of using FLUKA in comparing the predicted radioactivity with corresponding regulatory limits. The calculated results lead to the conclusion that, due to fairly low electron consumption, the radioactivity induced in the accelerator components and surrounding concrete walls of the TPS is rather moderate and manageable, while the possible activation of air and cooling water in the tunnel and their environmental releases are negligible.

  2. The potential of shifting recombination hotspots to increase genetic gain in livestock breeding.

    PubMed

    Gonen, Serap; Battagin, Mara; Johnston, Susan E; Gorjanc, Gregor; Hickey, John M

    2017-07-04

    This study uses simulation to explore and quantify the potential effect of shifting recombination hotspots on genetic gain in livestock breeding programs. We simulated three scenarios that differed in the locations of quantitative trait nucleotides (QTN) and recombination hotspots in the genome. In scenario 1, QTN were randomly distributed along the chromosomes and recombination was restricted to occur within specific genomic regions (i.e. recombination hotspots). In the other two scenarios, both QTN and recombination hotspots were located in specific regions, but differed in whether the QTN occurred outside of (scenario 2) or inside (scenario 3) recombination hotspots. We split each chromosome into 250, 500 or 1000 regions per chromosome of which 10% were recombination hotspots and/or contained QTN. The breeding program was run for 21 generations of selection, after which recombination hotspot regions were kept the same or were shifted to adjacent regions for a further 80 generations of selection. We evaluated the effect of shifting recombination hotspots on genetic gain, genetic variance and genic variance. Our results show that shifting recombination hotspots reduced the decline of genetic and genic variance by releasing standing allelic variation in the form of new allele combinations. This in turn resulted in larger increases in genetic gain. However, the benefit of shifting recombination hotspots for increased genetic gain was only observed when QTN were initially outside recombination hotspots. If QTN were initially inside recombination hotspots then shifting them decreased genetic gain. Shifting recombination hotspots to regions of the genome where recombination had not occurred for 21 generations of selection (i.e. recombination deserts) released more of the standing allelic variation available in each generation and thus increased genetic gain. 
However, whether and how much increase in genetic gain was achieved by shifting recombination hotspots depended on the distribution of QTN in the genome, the number of recombination hotspots, and whether QTN were initially inside or outside recombination hotspots. Our findings show future scope for targeted modification of recombination hotspots, e.g. through changes in zinc-finger motifs of the PRDM9 protein, to increase genetic gain in production species.

  3. Simulation of Hydrogen Distribution in Ignalina NPP ALS Compartments During BDBA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Babilas, Egidijus; Urbonavicius, Egidijus; Rimkevicius, Sigitas

    2006-07-01

The Accident Localisation System (ALS) of Ignalina NPP is a 'pressure suppression' type confinement, which protects the population, employees and environment from radiation hazards. According to the Safety Analysis Report for Ignalina NPP, ~110 m³ of hydrogen is released to ALS compartments during the Maximum Design Basis Accident. However, in the case of a beyond design basis accident, when the oxidation of zirconium starts, the amount of generated hydrogen could be significantly higher. If the volume concentration of hydrogen in a compartment reaches 4%, a combustible mixture can form. To prevent possible hydrogen accumulation in the ALS of the Ignalina NPP during an accident, an H₂ control system is installed. The performed analysis identified the locations of possible H₂ accumulation in the ALS compartments during the transient processes and assessed the mixture combustibility at these locations for a beyond design basis accident scenario. Such an analysis of H₂ distribution in the ALS of Ignalina NPP in case of BDBA had not been performed before. (authors)
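The 4% combustibility criterion reduces to a simple volume-fraction check under a well-mixed assumption. Only the ~110 m³ release and the 4 vol% limit come from the abstract; the compartment free volumes below are hypothetical.

```python
def h2_volume_fraction(v_h2_m3, v_compartment_m3):
    """Volume fraction of hydrogen assuming it mixes uniformly
    into a compartment of the given free volume."""
    return v_h2_m3 / v_compartment_m3

LFL = 0.04  # lower flammability limit of hydrogen in air (4 vol%)

# Hypothetical compartment free volumes; the 110 m3 release is the
# Maximum Design Basis Accident figure cited in the abstract.
for v_comp in (1000.0, 3000.0, 10000.0):
    frac = h2_volume_fraction(110.0, v_comp)
    flag = "combustible" if frac >= LFL else "below LFL"
    print(f"{v_comp:8.0f} m3 -> {frac:.1%} ({flag})")
```

The point of the compartment-level analysis in the paper is precisely that mixing is not uniform, so local concentrations can exceed this well-mixed estimate.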

  4. Scenario Analysis of Soil and Water Conservation in Xiejia Watershed Based on Improved CSLE Model

    NASA Astrophysics Data System (ADS)

    Liu, Jieying; Yu, Ming; Wu, Yong; Huang, Yao; Nie, Yawen

    2018-01-01

Based on existing research results and related data, the scenario analysis method is used to evaluate the effects of different soil and water conservation measures on soil erosion in a small watershed. Analysis of the soil erosion scenarios and model simulation budgets in the study area shows that the soil erosion rates simulated under all scenarios are lower than the observed soil erosion in 2013. Soil and water conservation engineering measures are more effective in reducing soil erosion than soil and water conservation biological measures and soil and water conservation tillage measures.
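A minimal sketch of a CSLE-style scenario comparison, assuming the standard multiplicative form A = R·K·L·S·B·E·T; every factor value below is hypothetical.

```python
# CSLE (Chinese Soil Loss Equation): A = R * K * L * S * B * E * T,
# where A is the soil loss rate, R rainfall erosivity, K soil
# erodibility, L and S slope length/steepness factors, and B, E, T
# the biological, engineering and tillage conservation factors.
def csle_soil_loss(R, K, L, S, B, E, T):
    return R * K * L * S * B * E * T

# Hypothetical factors: baseline (no measures) vs. an engineering-
# measures scenario that cuts the E factor.
baseline = csle_soil_loss(R=2500, K=0.03, L=1.2, S=2.0, B=1.0, E=1.0, T=1.0)
engineering = csle_soil_loss(R=2500, K=0.03, L=1.2, S=2.0, B=1.0, E=0.4, T=1.0)
print(baseline, engineering)  # the conservation scenario lowers the rate
```

Scenario analysis then amounts to varying the B, E, and T factors per scenario and comparing the resulting A values against the baseline year.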

  5. Approach on environmental risk assessment of nanosilver released from textiles.

    PubMed

    Voelker, Doris; Schlich, Karsten; Hohndorf, Lars; Koch, Wolfgang; Kuehnen, Ute; Polleichtner, Christian; Kussatz, Carola; Hund-Rinke, Kerstin

    2015-07-01

Based on the increased utilization of nanosilver (silver nanomaterials, AgNM) as an antibacterial agent, there is a strong need to assess the potential environmental implications associated with its new application areas. In this study an exemplary environmental risk assessment (ERA) of AgNM applied in textiles was performed. Environmental exposure scenarios (via a municipal sewage treatment plant (STP) with wastewater supply from domestic homes) were developed for three different types of textiles equipped with AgNM. Based on these scenarios, predicted environmental concentrations (PECs) were deduced for STPs and for the environmental compartments surface water, sediment and soil. These PECs were related to PNECs (predicted no effect concentrations). PNECs were deduced from results of ecotoxicity tests of a selected AgNM (NM-300K). Data on ecotoxicology were derived from various tests with activated sludge, cyanobacteria, algae, daphnids, fish, duckweed, macrophytes, chironomids, earthworms, terrestrial plants and soil microorganisms. Emission data for the AgNM NM-300K from textiles were derived from washing experiments. The performed ERA was based on the specifications defined in the ECHA Guidance on information requirements and chemical safety assessment. Based on the chosen scenarios and preconditions, no environmental risk of the AgNM NM-300K released from textiles was detected. Under conservative assumptions, a risk quotient for surface water close to 1 indicated that the aquatic compartment may be affected by an increased emission of AgNM to the environment, owing to the high sensitivity of aquatic organisms to silver. Given the successful retention of AgNM in sewage sludge and the ongoing application of sewage sludge on farmland, it is recommended to introduce a threshold for total silver content in sewage sludge into the respective regulations.
Regarding potential risk mitigation measures, it is recommended to introduce AgNM directly into the textile fiber, since this strongly minimizes the release of AgNM during washing. If this is not possible due to technical limitations or other reasons, the introduction of a threshold level controlling the release of AgNM from textiles is suggested. It has to be noted that this study is a case study which is only valid for the investigated NM-300K and its potential application in textiles. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
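The risk characterisation step described above reduces to a risk quotient RQ = PEC/PNEC per compartment, with RQ ≥ 1 flagging a potential risk. The concentrations below are hypothetical, chosen only to mirror the reported "close to 1" surface-water quotient.

```python
# Risk characterisation as in ECHA guidance: risk quotient RQ = PEC / PNEC.
# An RQ >= 1 flags a potential environmental risk. All values below are
# hypothetical illustrative concentrations (e.g. micrograms per litre).
def risk_quotient(pec, pnec):
    return pec / pnec

compartments = {
    "surface water": (0.9, 1.0),   # (PEC, PNEC): RQ close to 1
    "sediment":      (0.2, 2.0),
    "soil":          (0.05, 0.5),
}
for name, (pec, pnec) in compartments.items():
    rq = risk_quotient(pec, pnec)
    flag = "<- potential risk" if rq >= 1 else ""
    print(f"{name:13s} RQ = {rq:.2f} {flag}")
```

A quotient just below 1, as found for surface water here, is exactly the situation where conservative assumptions and emission trends drive the regulatory recommendations in the abstract.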

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perona, Ralph; Whicker, Jeffrey Jay; Mirenda, Richard J.

This report documents the calculation of Authorized Limits for radionuclides in soil to be used in the transfer of property by the Los Alamos National Laboratory (LANL). The Authorized Limits support the evaluation process to clear land for release under different uses even though the soil contains small residual amounts of radioactivity. The Authorized Limits are developed for four exposure scenarios: residential, commercial/industrial, construction worker, and recreational. Exposure to radionuclides in soil under these scenarios is assessed for exposure routes that include incidental ingestion of soil; inhalation of soil particulates; ingestion of homegrown produce (residential only); and external irradiation from soil. Inhalation and dermal absorption of tritiated water vapor in air are also assessed.

  7. Governance and decision making in complex socio-hydrological systems

    NASA Astrophysics Data System (ADS)

    Elshorbagy, Amin; Wheater, Howard; Gober, Patricia; Hassanzadeh, Elmira

    2017-04-01

The transboundary Saskatchewan River, originating in the Canadian Rockies in Alberta, flows through Saskatchewan and Manitoba and discharges its water into Lake Winnipeg. It supports irrigated agriculture, hydropower generation, flood protection, municipal water supplies, mining, recreation, and environmental services across a large area and in multiple administrative jurisdictions. Managing the region's water-based economic activities and environmental services requires decisions at a variety of scales to incorporate competing values and priorities about water use. Current inter-provincial allocations are based on the 1969 Master Agreement on Apportionment, whereby upstream Alberta must release one-half of the annual natural flow of the Saskatchewan River to Saskatchewan, which in turn must pass one-half of the residual natural flow to the Province of Manitoba. This analysis uses a hydro-economic simulation model, SWAMP, to examine risk-based tradeoffs in Saskatchewan for various types of water use, including agriculture, energy, and flood protection, under various scenarios of water availability. The eco-hydrological effects of the scenarios on the largest inland delta in North America, the Saskatchewan River Delta, are also shown. Results enable decision makers to weigh the costs and benefits of implementing particular sector-based future development strategies. Assuming net provincial benefit as a single monetary indicator of economic value, the effects of various scenarios of environmental and policy changes are quantified. Results show that improving irrigation technology and expanding irrigated lands in Alberta will positively affect the province's economic development and have compound effects downstream on hydropower generation, environmental flows, and the economies of Saskatchewan and Manitoba.
The implementation of similar policies in Saskatchewan will have different downstream impacts because of the large hydro-power capacity downstream in Manitoba. The model highlights the spatial tradeoffs across the three provinces and sectoral trade-offs among the differing water uses. These trade-offs represent challenging dilemmas for water management decisions in a complex system. The study reveals the need for a holistic framework of water resources analysis that can dynamically capture the feedback loops among hydrological, social, and administrative/political analysis units to support public discussion of critical water tradeoffs and a consensual water value framework to guide future development decisions.

  8. Complexity associated with the optimisation of capability options in military operations

    NASA Astrophysics Data System (ADS)

    Pincombe, A.; Bender, A.; Allen, G.

    2005-12-01

    In the context of a military operation, even if the intended actions, the geographic location, and the capabilities of the opposition are known, there are still some critical uncertainties that could have a major impact on the effectiveness of a given set of capabilities. These uncertainties include unpredictable events and the response alternatives that are available to the command and control elements of the capability set. They greatly complicate any a priori mathematical description. In a forecasting approach, the most likely future might be chosen and a solution sought that is optimal for that case. With scenario analysis, futures are proposed on the basis of critical uncertainties and the option that is most robust is chosen. We use scenario analysis but our approach is different in that we focus on the complexity and use the coupling between scenarios and options to create information on ideal options. The approach makes use of both soft and hard operations research methods, with subject matter expertise being used to define plausible responses to scenarios. In each scenario, uncertainty affects only a subset of the system-inherent variables and the variables that describe system-environment interactions. It is this scenario-specific reduction of variables that makes the problem mathematically tractable. The process we define is significantly different to existing scenario analysis processes, so we have named it adversarial scenario analysis. It can be used in conjunction with other methods, including recent improvements to the scenario analysis process. To illustrate the approach, we undertake a tactical level scenario analysis for a logistics problem that is defined by a network, expected throughputs to end users, the transport capacity available, the infrastructure at the nodes and the capacities of roads, stocks etc. The throughput capacity, e.g. the effectiveness, of the system relies on all of these variables and on the couplings between them. 
The system is initially in equilibrium for a given level of demand. However, different, and simpler, solutions emerge as the balance of couplings and the importance of variables change. The scenarios describe such changes in conditions. For each scenario it was possible to define measures that describe the differences between options. As with agent-based distillations, the solution is essentially qualitative and exploratory, bringing awareness of possible future difficulties and of the capabilities that are necessary if we are to deal successfully with those difficulties.
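The throughput capacity of a logistics network like the one described, with node infrastructure and road capacities feeding end users, can be computed as a maximum flow. The sketch below uses a standard Edmonds-Karp implementation on a hypothetical network, not the study's actual model.

```python
from collections import deque

def max_flow(cap, source, sink):
    """Edmonds-Karp maximum flow on a dict-of-dicts capacity graph."""
    flow = 0
    residual = {u: dict(edges) for u, edges in cap.items()}
    for u in cap:                      # add zero-capacity reverse edges
        for v in cap[u]:
            residual.setdefault(v, {}).setdefault(u, 0)
    while True:
        parent = {source: None}        # BFS for a shortest augmenting path
        queue = deque([source])
        while queue and sink not in parent:
            u = queue.popleft()
            for v, c in residual[u].items():
                if c > 0 and v not in parent:
                    parent[v] = u
                    queue.append(v)
        if sink not in parent:
            return flow
        path, v = [], sink             # recover the path, find the bottleneck
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        bottleneck = min(residual[u][v] for u, v in path)
        for u, v in path:              # push flow along the path
            residual[u][v] -= bottleneck
            residual[v][u] += bottleneck
        flow += bottleneck

# Hypothetical network: a port feeding end users over two roads.
capacities = {
    "port":  {"roadA": 40, "roadB": 25},
    "roadA": {"users": 30},
    "roadB": {"users": 25},
    "users": {},
}
print(max_flow(capacities, "port", "users"))  # -> 55
```

A scenario then corresponds to perturbing a subset of the edge capacities (a damaged road, a congested node) and recomputing the achievable throughput, which is the kind of coupling-sensitive measure the authors describe.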

  9. Potential effects of four Flaming Gorge Dam hydropower operational scenarios on the fishes of the Green River, Utah and Colorado

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hlohowskyj, I.; Hayse, J.W.

    1995-09-01

Aerial videography and modeling were used to evaluate the impacts of four hydropower operational scenarios at Flaming Gorge Dam, Utah, on trout and native fishes in the Green River, Utah and Colorado. The four operational scenarios studied were year-round high fluctuations, seasonally adjusted high fluctuations, seasonally adjusted moderate fluctuations, and seasonally adjusted steady flows. Impacts on trout were evaluated by examining differences among scenarios in the areas of inundated substrates that serve as spawning and feeding habitat. All scenarios would provide at least 23 acres per mile of habitat for spawning and food production; seasonally adjusted operations would provide additional areas during periods of sustained high release. Seasonally adjusted high fluctuations would increase inundated areas by 12 to 26% for a short period in winter and spring, but food production and reproduction would not be expected to increase. Seasonally adjusted moderate fluctuations and steady flows would produce similar increases in area, but the longer period of inundation could also result in increased food production and provide additional spawning sites for trout. Impacts on native fishes were assessed by examining daily changes in backwater nursery areas. Compared with year-round high fluctuations, the daily changes in backwater area would decrease by about 47, 89, and 100% under the seasonally adjusted high fluctuation, moderate fluctuation, and steady flow scenarios, respectively. Similarly, daily stage fluctuations during the nursery period would decrease by 72, 89, and 100% under the seasonally adjusted high fluctuation, moderate fluctuation, and steady flow scenarios, respectively. These reductions in daily fluctuations in backwater area and stage would improve conditions in nursery habitats and could in turn improve recruitment and overwinter survival. Introduced fish species could also benefit from the seasonally adjusted operational scenarios.
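The reported reductions in daily backwater-area change are straightforward percentage comparisons against the year-round high-fluctuation baseline. The baseline and scenario values below are hypothetical, chosen only to reproduce the quoted ~47%, ~89% and ~100% figures.

```python
# Percent reduction of a scenario's daily backwater-area change
# relative to the year-round high-fluctuation baseline.
def percent_reduction(baseline, alternative):
    return 100.0 * (baseline - alternative) / baseline

# Hypothetical daily backwater-area changes (acres).
baseline = 19.0
scenarios = {"seasonal high": 10.0, "moderate": 2.1, "steady": 0.0}
for name, change in scenarios.items():
    print(f"{name:13s}: {percent_reduction(baseline, change):.0f}% reduction")
```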

  10. Towards the new CH2018 climate scenarios for Switzerland

    NASA Astrophysics Data System (ADS)

    Fischer, Andreas; Schär, Christoph; Croci-Maspoli, Mischa; Knutti, Reto; Liniger, Mark; Strassmann, Kuno

    2017-04-01

There is a growing demand for regional assessments of future climate change and its impacts on society and ecosystems to inform and facilitate appropriate adaptation strategies. The basis for such assessments is a set of consistent and up-to-date climate change scenarios on the local to regional scale. In Switzerland, an important step was accomplished with the release of the national climate scenarios in 2011 ("CH2011"). Since then, new climate model simulations have become available and the scientific understanding has improved. It is hence desirable to update these national scenarios. The new CH2018 scenarios are developed in the framework of the recently founded National Centre for Climate Services (NCCS), a network consisting of several federal offices and academic partners. The CH2018 scenarios will build primarily upon the latest EURO-CORDEX regional climate model simulations, assuming different pathways of future greenhouse gas concentrations. Compared to CH2011, more emphasis will be put on changes in extremes and on putting the projected changes in the context of observed variability. Results of a recently conducted survey on end-user needs in Switzerland will guide the development process toward the CH2018 scenarios; this ensures that the scenarios are presented and communicated in a user-oriented format and find wide applicability across different sectors in Switzerland. In the presentation we will show the full methodological setup used to generate the CH2018 scenarios and how consistency across the methods and products is maximized. First results on mean changes and selected indices will be presented. In terms of dissemination, the results of the user survey show the need to address all the different user types of climate scenarios, especially non-experts. Compared to CH2011, this implies a stronger focus on consulting, condensing complex information, and providing tutorials.
In the presentation, we will outline our plans on dissemination in order to adequately address all relevant user groups of CH2018.

  11. A screening tool to prioritize public health risk associated with accidental or deliberate release of chemicals into the atmosphere

    PubMed Central

    2013-01-01

    The Chemical Events Working Group of the Global Health Security Initiative has developed a flexible screening tool for chemicals that present a risk when accidentally or deliberately released into the atmosphere. The tool is generic, semi-quantitative, independent of site, situation and scenario, encompasses all chemical hazards (toxicity, flammability and reactivity), and can be easily and quickly implemented by non-subject matter experts using freely available, authoritative information. Public health practitioners and planners can use the screening tool to assist them in directing their activities in each of the five stages of the disaster management cycle. PMID:23517410

  12. Industrial accidents triggered by lightning.

    PubMed

    Renni, Elisabetta; Krausmann, Elisabeth; Cozzani, Valerio

    2010-12-15

    Natural disasters can cause major accidents in chemical facilities where they can lead to the release of hazardous materials which in turn can result in fires, explosions or toxic dispersion. Lightning strikes are the most frequent cause of major accidents triggered by natural events. In order to contribute towards the development of a quantitative approach for assessing lightning risk at industrial facilities, lightning-triggered accident case histories were retrieved from the major industrial accident databases and analysed to extract information on types of vulnerable equipment, failure dynamics and damage states, as well as on the final consequences of the event. The most vulnerable category of equipment is storage tanks. Lightning damage is incurred by immediate ignition, electrical and electronic systems failure or structural damage with subsequent release. Toxic releases and tank fires tend to be the most common scenarios associated with lightning strikes. Oil, diesel and gasoline are the substances most frequently released during lightning-triggered Natech accidents. Copyright © 2010 Elsevier B.V. All rights reserved.

  13. Modelling of catastrophic flashing releases.

    PubMed

    Deaves, D M; Gilham, S; Mitchell, B H; Woodburn, P; Shepherd, A M

    2001-11-16

Several low-boiling-point materials are stored in closed vessels at ambient temperature, using their own vapour pressure to maintain a liquid state. These materials are often toxic, flammable, or both, and thus any uncontrolled release can have potentially disastrous consequences. There are many ways in which an accidental release can occur, the most severe being catastrophic vessel failure. Although not the most common, this mode of failure has the potential to result in an instantaneous loss of the entire vessel inventory in the form of a rapidly expanding, two-phase, vaporising cloud. This paper provides a comprehensive review of the physical processes involved, of existing models, and of available experimental and incident data for modelling such scenarios. This review has enabled the development of an improved methodology for characterising the source conditions following catastrophic vessel failure.

  14. 2016 Standard Scenarios Report: A U.S. Electricity Sector Outlook

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cole, Wesley; Mai, Trieu; Logan, Jeffrey

The Standard Scenarios and this associated report, which are now in their second year, present an examination of some of the key aspects of the change occurring, or anticipated to occur, in the power sector over the next several decades. The Standard Scenarios consist of 18 power sector scenarios which have been projected using the National Renewable Energy Laboratory's (NREL's) Regional Energy Deployment System (ReEDS) long-term capacity expansion model and the dGen rooftop PV diffusion model. The purpose of the Standard Scenarios and this associated report is to provide context, discussion, and data to inform stakeholder decision-making regarding the future direction of the U.S. power sector. As an extension to this report, the Standard Scenario outputs are presented in a downloadable format online using the Standard Scenarios' Results Viewer at http://en.openei.org/apps/reeds/. This report reflects high-level conclusions and analysis, whereas the Standard Scenarios' Results Viewer includes the scenario results that can be used for more in-depth analysis.

  15. Simulation of atmospheric dispersion of radionuclides using an Eulerian-Lagrangian modelling system.

    PubMed

    Basit, Abdul; Espinosa, Francisco; Avila, Ruben; Raza, S; Irfan, N

    2008-12-01

    In this paper we present an atmospheric dispersion scenario for a proposed nuclear power plant in Pakistan involving the hypothetical accidental release of radionuclides. For this, a Lagrangian stochastic particle model (LSPM) is coupled with an Eulerian regional atmospheric modelling system (RAMS). The atmospheric turbulent dispersion of radionuclides (represented by non-buoyant particles/neutral tracers) in the LSPM is modelled under non-homogeneous turbulence conditions. The mean wind velocities governed by the topography of the region and the surface fluxes of momentum and heat are calculated by the RAMS code. A moving least squares (MLS) technique is introduced to calculate the concentration of radionuclides at ground level. The numerically calculated vertical profiles of wind velocity and temperature are compared with observed data. The results demonstrate that in regions of complex terrain it is not sufficient to model the atmospheric dispersion of particles using a straight-line Gaussian plume model, and that coupling a Lagrangian stochastic particle model with a regional atmospheric modelling system yields a much more realistic estimate of the dispersion in such a hypothetical scenario. The particle dispersion results for a 12 h ground release show that a triangular area of about 400 km² situated in the north-west quadrant of the release is under radiological threat. The particle distribution shows that the use of a Gaussian plume model (GPM) in such situations would yield quite misleading results.
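
    The class of model described above can be illustrated with a minimal sketch. The following is a toy 1-D Lagrangian stochastic (Langevin-type) particle step, not the authors' LSPM; the timescale `tau`, the velocity scale `sigma_u`, and the homogeneous-turbulence assumption are simplifications introduced here for illustration only.

```python
import math
import random

# Toy 1-D Lagrangian stochastic particle step (Langevin-type). Illustrative
# sketch only: the LSPM in the paper is 3-D, handles non-homogeneous
# turbulence, and takes its mean winds from the RAMS code.

def step(x, u, dt, tau, sigma_u, rng):
    """Advance one particle: the turbulent velocity u relaxes toward zero
    with timescale tau and receives a random kick; position advects with u."""
    du = -u * dt / tau + sigma_u * math.sqrt(2.0 * dt / tau) * rng.gauss(0.0, 1.0)
    u += du
    return x + u * dt, u

# Release a single particle at the origin and track it for 1000 steps.
rng = random.Random(42)
x, u = 0.0, 0.0
for _ in range(1000):
    x, u = step(x, u, dt=1.0, tau=100.0, sigma_u=0.5, rng=rng)
```

    An ensemble of such trajectories, binned on a grid, would give a concentration field; the paper instead applies a moving least squares technique to obtain ground-level concentrations.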

  16. The need for a life-cycle based aging paradigm for nanomaterials: importance of real-world test systems to identify realistic particle transformations

    NASA Astrophysics Data System (ADS)

    Mitrano, Denise M.; Nowack, Bernd

    2017-02-01

    Assessing the risks of manufactured nanomaterials (MNM) has been almost exclusively focused on the pristine, as-produced materials, with far fewer studies delving into more complex, real-world scenarios. However, from a life-cycle perspective it is clear that MNM released from commercial products during manufacturing, use and disposal are far more relevant, both in terms of realistic environmental fate and transport and in terms of environmental risk. The quantity in which the particles are released and their (altered) physical and chemical form should be identified, and it is these metrics that should be used to assess the exposure and hazard the materials pose. The goals of this review are to (1) provide a rationale for using a life-cycle based approach when dealing with MNM transformations, (2) elucidate the different chemical and physical forces that age and transform MNM, and (3) assess the pros and cons of current analytical techniques as they pertain to the measurement of aged and transformed MNM in these complex release scenarios. Specifically, we describe the possible transformations common MNM may undergo during the use or disposal of nano-products, based on how these products will be used by the consumer and on a survey of the nano-enabled products currently on the market. Understanding the impact of these transformations may help forecast the benefits and/or risks associated with the use of products containing MNM.

  17. Contingency planning for a deliberate release of smallpox in Great Britain--the role of geographical scale and contact structure.

    PubMed

    House, Thomas; Hall, Ian; Danon, Leon; Keeling, Matt J

    2010-02-14

    In the event of a release of a pathogen such as smallpox, which is human-to-human transmissible and has high associated mortality, a key question is how best to deploy containment and control strategies. Given the general uncertainty surrounding this issue, mathematical modelling has played an important role in informing the likely optimal response, in particular defining the conditions under which mass-vaccination would be appropriate. In this paper, we consider two key questions currently unanswered in the literature: firstly, what is the optimal spatial scale for intervention; and secondly, how sensitive are results to the modelling assumptions made about the pattern of human contacts? Here we develop a novel mathematical model for smallpox that incorporates both information on individual contact structure (which is important if the effects of contact tracing are to be captured accurately) and large-scale patterns of movement across a range of spatial scales in Great Britain. Analysis of this model confirms previous work suggesting that a locally targeted 'ring' vaccination strategy is optimal, and that this conclusion is actually quite robust for different socio-demographic and epidemiological assumptions. Our method allows for intuitive understanding of the reasons why national mass vaccination is typically predicted to be suboptimal. As such, we present a general framework for fast calculation of expected outcomes during the attempted control of diverse emerging infections; this is particularly important given that parameters would need to be interactively estimated and modelled in any release scenario.

  18. Pulsed strain release on the Altyn Tagh fault, northwest China

    USGS Publications Warehouse

    Gold, Ryan D.; Cowgill, Eric; Arrowsmith, J. Ramón; Friedrich, Anke M.

    2017-01-01

    Earthquake recurrence models assume that major surface-rupturing earthquakes are followed by periods of reduced rupture probability as stress rebuilds. Although purely periodic, time- or slip-predictable rupture models are known to be oversimplifications, a paucity of long records of fault slip clouds understanding of fault behavior and earthquake recurrence over multiple ruptures. Here, we report a 16 kyr history of fault slip—including a pulse of accelerated slip from 6.4 to 6.0 ka—determined using a Monte Carlo analysis of well-dated offset landforms along the central Altyn Tagh strike-slip fault (ATF) in northwest China. This pulse punctuates a median rate of 8.1+1.2/−0.9 mm/a and likely resulted from either a flurry of temporally clustered ∼Mw 7.5 ground-rupturing earthquakes or a single large >Mw 8.2 earthquake. The clustered earthquake scenario implies rapid re-rupture of a fault reach >195 km long and indicates decoupled rates of elastic strain energy accumulation versus dissipation, conceptualized as a crustal stress battery. If the pulse reflects a single event, slip-magnitude scaling implies that it ruptured much of the ATF with slip similar to, or exceeding, the largest documented historical ruptures. Both scenarios indicate fault rupture behavior that deviates from classic time- or slip-predictable models.
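
    The Monte Carlo treatment of dated offset landforms can be sketched as follows. The offset, age, and uncertainty values below are placeholders chosen only so that the median lands near the ~8 mm/a scale reported in the abstract; they are not the ATF data, and this is not the authors' code.

```python
import random
import statistics

# Sketch of a Monte Carlo slip-rate estimate: propagate offset and dating
# uncertainties by sampling, then summarize the resulting rate distribution.
# Placeholder inputs: a ~130 m offset landform dated to ~16 ka.

def slip_rate_samples(offset_m, offset_sd, age_ka, age_sd, n=10000, seed=1):
    """Draw slip-rate samples (mm/a, since m/ka == mm/a) from Gaussian
    offset and age distributions, rejecting non-physical draws."""
    rng = random.Random(seed)
    rates = []
    while len(rates) < n:
        off = rng.gauss(offset_m, offset_sd)
        age = rng.gauss(age_ka, age_sd)
        if off > 0.0 and age > 0.0:
            rates.append(off / age)
    return rates

rates = slip_rate_samples(130.0, 10.0, 16.0, 1.0)
median_rate = statistics.median(rates)
# ~2.5th and ~97.5th percentiles as an uncertainty band
cuts = statistics.quantiles(rates, n=40)
lo, hi = cuts[0], cuts[-1]
```

    The asymmetric bounds quoted in such studies (e.g. +1.2/−0.9) fall out naturally because the rate is a ratio of two uncertain quantities.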

  19. [Online endpoint detection algorithm for blending process of Chinese materia medica].

    PubMed

    Lin, Zhao-Zhou; Yang, Chan; Xu, Bing; Shi, Xin-Yuan; Zhang, Zhi-Qiang; Fu, Jing; Qiao, Yan-Jiang

    2017-03-01

    Blending, an essential part of pharmaceutical preparation, has a direct influence on the homogeneity and stability of solid dosage forms. Since the official release of the Guidance for Industry PAT, online process analysis techniques have been increasingly reported in applications to blending processes, but research on endpoint detection algorithms is still at an early stage. By progressively increasing the window size of the moving block standard deviation (MBSD), a novel endpoint detection algorithm was proposed to extend plain MBSD from the off-line to the online scenario, and it was used to determine the endpoint of the blending process of Chinese medicine dispensing granules. Through online tuning of the window size, status changes of the materials during blending are reflected in the standard deviation calculation in real time. The proposed method was tested separately on the blending processes of dextrin and three extracts of traditional Chinese medicine. All results showed that, compared with the traditional MBSD method, the progressively increasing window size more clearly reflects the status changes of the materials during blending, making the method suitable for online application. Copyright© by the Chinese Pharmaceutical Association.
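
    As a concrete (and much simplified) illustration of the idea, the sketch below grows a block of readings and declares the blending endpoint once the block standard deviation stays below a threshold, restarting the block while the material is still changing. The threshold, minimum window size, and restart rule are assumptions made here for illustration, not the authors' algorithm.

```python
# Simplified moving block standard deviation (MBSD) endpoint detector with a
# progressively increasing window. Threshold and window settings are
# illustrative assumptions, not values from the paper.

def mbsd(block):
    """Population standard deviation of a block of process readings."""
    n = len(block)
    mean = sum(block) / n
    return (sum((v - mean) ** 2 for v in block) / n) ** 0.5

def detect_endpoint(readings, min_window=3, threshold=0.05):
    """Grow the block as readings arrive; if the block SD exceeds the
    threshold the material is still changing, so restart the block.
    Return the index at which blending is judged homogeneous."""
    block = []
    for i, r in enumerate(readings):
        block.append(r)                 # progressively increasing window
        if mbsd(block) >= threshold:
            block = [r]                 # still inhomogeneous: restart
        elif len(block) >= min_window:
            return i
    return None                         # endpoint not reached

# A noisy start followed by stable readings:
end = detect_endpoint([1.0, 0.2, 0.8, 0.50, 0.51, 0.50, 0.50])
```

    Here `end` is the index of the first reading at which the (growing) block of recent readings has stayed uniform, i.e. the detected blending endpoint.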

  20. Potential environmental hazards of photovoltaic panel disposal: Discussion of Tammaro et al. (2015).

    PubMed

    Sinha, Parikhit

    2017-02-05

    In their recent publication in Journal of Hazardous Materials (http://dx.doi.org/10.1016/j.jhazmat.2015.12.018), Tammaro et al. evaluate the potential environmental impacts of an illegal disposal scenario of photovoltaic panels in the European Union. Critical assumptions that underlie the study's conclusions would benefit from clarification. A scenario of photovoltaic panels finely crushed and abandoned in nature is not supported with field breakage data, in which photovoltaic panels remain largely intact with a number of glass fractures or cracks, as opposed to breakage into cm-scale pieces. Fate and transport analysis is necessary to evaluate how leachate transforms and disperses in moving from the point of emissions to the point of exposure, prior to making comparisons with drinking water limits. Some hazardous metal content has declined in both crystalline silicon and thin film panels, including a 50% decline in semiconductor material intensity in CdTe thin film panels (g CdTe/W) from 2009 to 2015. Waste laws, recycling requirements and minimum treatment standards under the EU WEEE Directive, and illegal disposal rates affect the accuracy of forecasts of releasable metal amounts from PV panels in Europe through 2050. Copyright © 2016 Elsevier B.V. All rights reserved.

  1. Analyses in Support of Risk-Informed Natural Gas Vehicle Maintenance Facility Codes and Standards: Phase II.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blaylock, Myra L.; LaFleur, Chris Bensdotter; Muna, Alice Baca

    Safety standards development for maintenance facilities of liquid and compressed natural gas fueled vehicles is required to ensure proper facility design and operating procedures. Standards development organizations are utilizing risk-informed concepts to develop natural gas vehicle (NGV) codes and standards so that maintenance facilities meet acceptable risk levels. The present report summarizes Phase II work for existing NGV repair facility code requirements and highlights inconsistencies that need quantitative analysis of their effectiveness. A Hazard and Operability study was performed to identify key scenarios of interest using risk ranking. Detailed simulations and modeling were performed to estimate the location and behavior of natural gas releases based on these scenarios. Specific code conflicts were identified, and ineffective code requirements were highlighted and resolutions proposed. These include basing the ventilation rate on area or volume, as well as a ceiling offset that seems ineffective at protecting against flammable gas concentrations. ACKNOWLEDGEMENTS The authors gratefully acknowledge Bill Houf (SNL, retired) for his assistance with the set-up and post-processing of the numerical simulations. The authors also acknowledge Doug Horne (retired) for his helpful discussions. We would also like to acknowledge the support of the Clean Cities program of DOE's Vehicle Technologies Office.

  2. Community Solar Scenario Tool | Integrated Energy Solutions | NREL

    Science.gov Websites

    The Community Solar Scenario Tool (CSST) provides a "first cut" analysis of different community or shared solar program options. The beta version is available as a Microsoft Excel file.

  3. Bioaccessibility studies of ferro-chromium alloy particles for a simulated inhalation scenario: a comparative study with the pure metals and stainless steel.

    PubMed

    Midander, Klara; de Frutos, Alfredo; Hedberg, Yolanda; Darrie, Grant; Wallinder, Inger Odnevall

    2010-07-01

    The European product safety legislation, REACH, requires that companies that manufacture, import, or use chemicals demonstrate safe use and a high level of protection of human health and the environment for products placed on the market. This process involves detailed assessment of potential hazards for various toxicity endpoints induced by the use of chemicals, with a minimum of animal testing. Such an assessment requires thorough understanding of the relevant exposure scenarios, including material characteristics and intrinsic properties, and of how physical and chemical properties change from the manufacturing phase, throughout use, to final disposal. Temporary or permanent adverse health effects induced by particles depend either on their shape or physical characteristics, and/or on chemical interactions with the particle surface upon human exposure. Significant knowledge gaps remain regarding the potential adverse effects caused by exposure to metal particles through the gastrointestinal system, the pulmonary system, or the skin, and regarding the subsequent particle dissolution and metal release in contact with biological media. In vitro bioaccessibility testing at conditions relevant for different exposure scenarios, combined with a detailed understanding of intrinsic material properties and surface characteristics, is in this context a useful approach to support accurate risk and hazard assessment of chemicals, including metals and alloys, while avoiding in vivo testing. Alloys are essential engineering materials in all kinds of applications in society, but their potential adverse effects on human health and the environment are very seldom assessed. Alloys are treated in REACH as mixtures of their constituent elements, an approach that is highly inappropriate because the intrinsic properties of alloys are generally quite different from those of their pure metal components.
A large research effort was therefore conducted to generate quantitative bioaccessibility data for particles of ferro-chromium alloys compared with particles of the pure metals and stainless steel exposed at in vitro conditions in synthetic biological media of relevance for particle inhalation and ingestion. All results are presented combining bioaccessibility data with aspects of particle characteristics, surface composition, and barrier properties of surface oxides. Iron and chromium were the main elements released from ferro-chromium alloys upon exposure in synthetic biological media. Both elements revealed time-dependent release processes. One week exposures resulted in very small released particle fractions being less than 0.3% of the particle mass at acidic conditions and less than 0.001% in near pH-neutral media. The extent of Fe released from ferro-chromium alloy particles was significantly lower compared with particles of pure Fe, whereas Cr was released to a very low and similar extent as from particles of pure Cr and stainless steel. Low release rates are a result of a surface oxide with passive properties predominantly composed of chromium(III)-rich oxides and silica and, to a lesser extent, of iron(II,III)oxides. Neither the relative bulk alloy composition nor the surface composition can be used to predict or assess the extent of metals released in different synthetic biological media. Ferro-chromium alloys cannot be assessed from the behavior of their pure metal constituents. (c) 2009 SETAC.

  4. High velocity impact on composite link of aircraft wing flap mechanism

    NASA Astrophysics Data System (ADS)

    Heimbs, Sebastian; Lang, Holger; Havar, Tamas

    2012-12-01

    This paper describes the numerical investigation of the mechanical behaviour of a structural component of an aircraft wing flap support impacted by a wheel rim fragment. The support link made of composite materials was modelled in the commercial finite element code Abaqus/Explicit, incorporating intralaminar and interlaminar failure modes by adequate material models and cohesive interfaces. Validation studies were performed step by step using quasi-static tensile test data and low velocity impact test data. Finally, high velocity impact simulations with a metallic rim fragment were performed for several load cases involving different impact angles, impactor rotation and pre-stress. The numerical rim release analysis turned out to be an efficient approach in the development process of such composite structures and for the identification of structural damage and worst case impact loading scenarios.

  5. The size of coronal hard X-ray sources in solar flares: How big are they?

    NASA Astrophysics Data System (ADS)

    Effenberger, F.; Krucker, S.; Rubio da Costa, F.

    2017-12-01

    Coronal hard X-ray sources are considered to be one of the key signatures of non-thermal particle acceleration and heating during the energy release in solar flares. In some cases, X-ray observations reveal multiple components spatially located near and above the loop top and even further up in the corona. Here, we combine a detailed RHESSI imaging analysis of near-limb solar flares with occulted footpoints and a multi-wavelength study of the flare loop evolution in SDO/AIA. We connect our findings to different current sheet formation and magnetic break-out scenarios and relate it to particle acceleration theory. We find that the upper and usually fainter emission regions can be underestimated in their size due to the majority of flux originating from the lower loops.

  6. Ion release from dental casting alloys as assessed by a continuous flow system: Nutritional and toxicological implications.

    PubMed

    López-Alías, José F; Martinez-Gomis, Jordi; Anglada, Josep M; Peraire, Maria

    2006-09-01

    The aims of this study were to quantify the metallic ions released by various dental alloys subjected to a continuous flow of saliva and to estimate the nutritional and toxicological implications of such a release. Four pieces each of three nickel-based, one noble, one high-noble and two copper-aluminum alloys were cast and then immersed in a continuous flow of artificial saliva for 15 days. To simulate three meals a day, the casts were subjected to thrice-daily episodes, lasting 30 min each and consisting of pH decreases and salinity increases. After 15 days, the metallic ions in the artificial saliva were analyzed. Data were expressed as an averaged release rate (µg/cm²/day of ion released) for each alloy. The highest value of the 95% CI for each ion was applied to a hypothetical worst-case scenario of a subject with 100 cm² of exposed metal surface. The results were compared with the tolerable upper daily intake level of each ion. The copper-aluminum alloys released copper, aluminum, nickel, manganese and iron. The nickel-based alloys essentially released nickel and chromium, while the beryllium-containing alloy released beryllium and significantly more nickel. The noble and high-noble alloys were very resistant to corrosion. The amount of ions released remained far below the tolerable upper intake level, with the exception of nickel released by the beryllium-containing nickel-based alloy, whose levels approached 50% of this threshold. The daily amount of ions released seems to be far below the tolerable upper intake levels for each ion.
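
    The worst-case comparison described above reduces to simple arithmetic: scale a release rate (µg/cm²/day) by the assumed 100 cm² of exposed metal and divide by the tolerable upper intake level. The rate and UL values in the example below are placeholders, not data from the study.

```python
# Hypothetical worst-case intake check in the spirit of the study: daily ion
# release scaled to 100 cm2 of exposed alloy surface, expressed as a fraction
# of a tolerable upper intake level (UL). All numbers are placeholders.

EXPOSED_AREA_CM2 = 100.0

def fraction_of_ul(release_rate_ug_per_cm2_day, ul_ug_per_day):
    """Daily intake from the exposed surface as a fraction of the UL."""
    daily_intake_ug = release_rate_ug_per_cm2_day * EXPOSED_AREA_CM2
    return daily_intake_ug / ul_ug_per_day

# e.g. a nickel release rate of 2.5 ug/cm2/day against a UL of 1000 ug/day
# gives 250 ug/day, i.e. 25% of the UL.
nickel_fraction = fraction_of_ul(2.5, 1000.0)
```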

  7. Strategic Analysis Overview

    NASA Technical Reports Server (NTRS)

    Cirillo, William M.; Earle, Kevin D.; Goodliff, Kandyce E.; Reeves, J. D.; Stromgren, Chel; Andraschko, Mark R.; Merrill, R. Gabe

    2008-01-01

    NASA's Constellation Program employs a strategic analysis methodology to provide an integrated analysis capability for Lunar exploration scenarios and to support strategic decision-making regarding those scenarios. The strategic analysis methodology integrates the assessment of the major contributors to strategic objective satisfaction (performance, affordability, and risk) and captures the linkages and feedbacks among all three components. Strategic analysis supports strategic decision-making by senior management through comparable analysis of alternative strategies, provision of a consistent set of high-level value metrics, and the enabling of cost-benefit analysis. The tools developed to implement the strategic analysis methodology are not element design and sizing tools. Rather, these models evaluate strategic performance using predefined elements, imported into a library from expert-driven design/sizing tools or expert analysis. Specific components of the strategic analysis tool set include scenario definition, requirements generation, mission manifesting, scenario lifecycle costing, crew time analysis, objective satisfaction benefit, risk analysis, and probabilistic evaluation. Results from all components of strategic analysis are evaluated against a set of pre-defined figures of merit (FOMs). These FOMs capture the high-level strategic characteristics of all scenarios and facilitate direct comparison of options. The strategic analysis methodology described in this paper has previously been applied to the Space Shuttle and International Space Station Programs and is now being used to support the development of the baseline Constellation Program lunar architecture. This paper presents an overview of the strategic analysis methodology and sample results from its application to the Constellation Program lunar architecture.

  8. The use of multi criteria analysis to compare the operating scenarios of the hybrid generation system of wind turbines, photovoltaic modules and a fuel cell

    NASA Astrophysics Data System (ADS)

    Ceran, Bartosz

    2017-11-01

    The paper presents the results of a multi-criteria analysis used to compare scenarios for the cooperation of a hybrid power generation system (HSW), consisting of wind turbines, photovoltaic modules and an energy storage path (electrolyzer and PEM-type fuel cell), with the electricity system. The following scenarios were examined: in the base scenario S-I, the hybrid system powers the receiver in off-grid mode; in scenarios S-II, S-III and S-IV, the electricity system covers 25%, 50% and 75% of the recipient's energy demand, respectively. The effect of the weights assigned to the criteria on the final result of the multi-criteria analysis was also examined.

  9. Using Bayesian networks to estimate the probability of "NATECH" scenario occurrence

    NASA Astrophysics Data System (ADS)

    Dobes, Pavel; Dlabka, Jakub; Jelšovská, Katarína; Polorecká, Mária; Baudišová, Barbora; Danihelka, Pavel

    2015-04-01

    In the twentieth century, Bayesian statistics and probability were little used (perhaps not a preferred approach) in the analysis and management of natural and industrial risk, nor in the analysis of so-called NATECH accidents (chemical accidents triggered by natural events such as earthquakes, floods, or lightning; ref. E. Krausmann, 2011, doi:10.5194/nhess-11-921-2011). From the beginning, the main role was played by so-called "classical" frequentist probability (ref. Neyman, 1937), which relies primarily on the outcomes of experiments and monitoring and does not accommodate experts' beliefs, expectations and judgements (which are, on the other hand, one of the well-known pillars of the Bayesian approach to probability). Over the last 20 or 30 years, publications and conferences have shown a renaissance of Bayesian statistics across many scientific disciplines, including various branches of the geosciences. Is a certain level of trust in expert judgment within risk analysis necessary once again? After several decades of development in this field, the following hypothesis can be proposed (to be checked): the probabilities of complex crisis situations and their TOP events (many NATECH events can be classified as crisis situations or emergencies) cannot be estimated by the classical frequentist approach alone, but also require a Bayesian approach (i.e. a pre-staged Bayesian network incorporating expert belief and expectation as well as classical frequentist inputs), because there is not always enough quantitative information from the monitoring of historical emergencies, several dependent or independent variables may need to be considered, and, in general, every emergency situation unfolds a little differently.
    On this topic, the authors present their proposal of a pre-staged, typified Bayesian network model for a specified NATECH scenario (heavy rainfall AND/OR melting snow OR earthquake -> landslides AND/OR floods -> major chemical accident), comparing it with a "black box" approach and with the so-called "bow-tie" approach (ref. C. A. Brebbia, Risk Analysis VIII, p. 103-111, WIT Press, 2012): a visualisation of the development of the scenario with the possibility of calculating frequencies (the TOP event of the scenario, developed both downwards to initiating events and upwards to end accidental events, using Fault Tree Analysis and Event Tree Analysis methods). The model can also include a possible terrorist attack on the chemical facility with the potential for a major release of chemicals into the environmental compartments (water, soil, air), with the goal of threatening environmental safety in the specific area. The study was supported by project no. VG20132015128, "Increasing of the Environmental Safety & Security by the Prevention of Industrial Chemicals Misuse to the Terrorism", supported by the Ministry of the Interior of the Czech Republic through the Security Research Programme, 2013-2015.
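
    A toy version of such a pre-staged network can be evaluated by full enumeration. The causal chain below (rain -> flood/landslide -> major accident) and every probability in it are invented for illustration only; a real model would combine frequentist data with expert-elicited priors, as the abstract proposes.

```python
# Toy Bayesian-network-style calculation for a NATECH chain. All conditional
# probabilities are invented placeholders, not calibrated values.

P_RAIN = 0.10
P_FLOOD_GIVEN = {True: 0.30, False: 0.01}    # P(flood | rain)
P_SLIDE_GIVEN = {True: 0.20, False: 0.005}   # P(landslide | rain)
# P(major chemical accident | flood, landslide)
P_ACC_GIVEN = {(True, True): 0.05, (True, False): 0.02,
               (False, True): 0.03, (False, False): 0.0001}

def p_accident():
    """Marginal probability of the TOP event by full enumeration over the
    joint states of rain, flood and landslide."""
    total = 0.0
    for rain in (True, False):
        p_r = P_RAIN if rain else 1.0 - P_RAIN
        for flood in (True, False):
            p_f = P_FLOOD_GIVEN[rain] if flood else 1.0 - P_FLOOD_GIVEN[rain]
            for slide in (True, False):
                p_s = P_SLIDE_GIVEN[rain] if slide else 1.0 - P_SLIDE_GIVEN[rain]
                total += p_r * p_f * p_s * P_ACC_GIVEN[(flood, slide)]
    return total
```

    The same enumeration extends naturally to further parent nodes (e.g. snowmelt, earthquake, or a deliberate attack), which is what makes the network form attractive for NATECH screening.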

  10. Atmospheric transport of radioactive debris to Norway in case of a hypothetical accident related to the recovery of the Russian submarine K-27.

    PubMed

    Bartnicki, Jerzy; Amundsen, Ingar; Brown, Justin; Hosseini, Ali; Hov, Øystein; Haakenstad, Hilde; Klein, Heiko; Lind, Ole Christian; Salbu, Brit; Szacinski Wendel, Cato C; Ytre-Eide, Martin Album

    2016-01-01

    The Russian nuclear submarine K-27 suffered a loss of coolant accident in 1968 and, with nuclear fuel in both reactors, was scuttled in 1981 in the outer part of Stepovogo Bay, located on the eastern coast of Novaya Zemlya. The inventory of spent nuclear fuel on board the submarine is of concern because it represents a potential source of radioactive contamination of the Kara Sea, and a criticality accident with potential for long-range atmospheric transport of radioactive particles cannot be ruled out. To address these concerns and to provide a better basis for evaluating possible radiological impacts of potential releases in case a salvage operation is initiated, we assessed the atmospheric transport of radionuclides and deposition in Norway from a hypothetical criticality accident on board the K-27. To achieve this, a long-term (33 years) meteorological database was prepared and used to select worst-case meteorological scenarios for each of three selected locations of the potential accident. Next, the dispersion model SNAP was run with the source term for the worst-case accident scenario and the selected meteorological scenarios. The results showed predictions to be very sensitive to the estimation of the source term for the worst-case accident and especially to the sizes and densities of released radioactive particles. The results indicated that a large area of Norway could be affected, but that the deposition in Northern Norway would be considerably higher than in other areas of the country. The simulations showed that deposition from the worst-case scenario of a hypothetical K-27 accident would be at least two orders of magnitude lower than the deposition observed in Norway following the Chernobyl accident. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  11. Getting the Word Out About the National Assessment: The National Assessment Governing Board and Changes in NAEP Reporting

    ERIC Educational Resources Information Center

    Stevens, John H.

    2009-01-01

    This paper describes the development of Board policy on the reporting, release, and dissemination of National Assessment results and other relevant information. Through selected scenarios, it also illustrates the kinds of issues that required action by the Board. These issues have included concerns about the testing and reporting of English…

  12. Overview of the Hydrogen Financial Analysis Scenario Tool (H2FAST); NREL (National Renewable Energy Laboratory)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Melaina, Marc; Bush, Brian; Penev, Michael

    This presentation provides an introduction to the Hydrogen Financial Analysis Scenario Tool (H2FAST) and includes an overview of each of the three versions of H2FAST: the Web tool, the Excel spreadsheet version, and the beta version of the H2FAST Business Case Scenario tool.

  13. Annual Technology Baseline and Standard Scenarios | Energy Analysis | NREL

    Science.gov Websites

    The Annual Technology Baseline and Standard Scenarios support electric sector analysis in the United States, and NREL analysts consistently apply the products of this work. The Standard Scenarios Annual Report, A U.S. Electricity Sector Outlook, presents an outlook of the U.S. electricity sector based on a suite of standard scenarios and their associated assumptions.

  14. SERA Scenarios of Early Market Fuel Cell Electric Vehicle Introductions: Modeling Framework, Regional Markets, and Station Clustering; NREL (National Renewable Energy Laboratory)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Melaina, M.

    This presentation provides an overview of the Scenario Evaluation and Regionalization Analysis (SERA) model, describes the methodology for developing scenarios for hydrogen infrastructure development, outlines an example "Hydrogen Success" scenario, and discusses detailed scenario metrics for a particular case study region, the Northeast Corridor.

  15. Self-consistent modeling of CFETR baseline scenarios for steady-state operation

    NASA Astrophysics Data System (ADS)

    Chen, Jiale; Jian, Xiang; Chan, Vincent S.; Li, Zeyu; Deng, Zhao; Li, Guoqiang; Guo, Wenfeng; Shi, Nan; Chen, Xi; CFETR Physics Team

    2017-07-01

    Integrated modeling of the core plasma is performed to increase confidence in the baseline scenario proposed in the 0D analysis for the China Fusion Engineering Test Reactor (CFETR). The steady-state scenarios are obtained through consistent iterative calculation of equilibrium, transport, and auxiliary heating and current drive (H&CD). Three combinations of H&CD schemes (NB + EC, NB + EC + LH, and EC + LH) are used to sustain scenarios with q_min > 2 and fusion power of ~70-150 MW. The predicted power is within the target range for CFETR Phase I, although the confinement based on physics models is lower than that assumed in the 0D analysis. Ideal MHD stability analysis shows that the scenarios are stable against n = 1-10 ideal modes, where n is the toroidal mode number. Optimization of the RF current drive for the RF-only scenario is also presented. The simulation workflow for the core plasma in this work provides a solid basis for a more extensive research and development effort on the physics design of CFETR.

  16. Telling better stories: strengthening the story in story and simulation

    NASA Astrophysics Data System (ADS)

    Kemp-Benedict, Eric

    2012-12-01

    The scenarios of the IPCC Special Report on Emissions Scenarios (SRES) (Nakicenovic and Swart 2000) are both widely cited and widely criticized. This combination of censure and regard reflects their importance, as they provide both a point of reference and a point of departure for those wishing to understand the long-term implications of policies and human activities for the climate and adaptive capacity. The paper by Schweizer and Kriegler in this issue (Schweizer and Kriegler 2012) reports a unique and interesting critique of the SRES scenarios. The authors find several results, including that the path the world may now be on (labeled by them 'coal-powered growth') is under-represented in the SRES scenarios. While such post-hoc critiques are easy to dismiss, Schweizer and Kriegler were careful to use only the information available to the SRES authors, and they applied a technique that (if it had been available) could have been carried out at that time. In this way they demonstrate that not only was coal-powered growth a clearly discernible possible future at the time of the SRES, but variants on the theme dominate the handful of highly consistent and robust scenarios as identified by their method. Their paper is well-timed because a new round of climate scenarios is now under development (Kriegler et al 2012, van Vuuren et al 2012), and it could learn from evaluations of the SRES process and scenarios. Schweizer and Kriegler (2012) construct a consistent scenario logic using a relatively new foresight technique, cross-impact balances (CIB) (Weimer-Jehle 2006). As explained above, to sharpen their critique and properly evaluate the method, they apply CIB to the information that the authors of the SRES had at their disposal at the time they constructed their scenarios. 
Their study is therefore anachronistic, in that the CIB method was not published when the SRES was released, but historically faithful in that Schweizer and Kriegler limit themselves to the information available at that time, based on statements that appear in the SRES itself. The CIB method is a technique for constructing internally consistent qualitative scenarios. Global-scale scenario exercises, in particular climate scenarios, typically include both qualitative (narrative) and quantitative (model) elements. As noted by Schweizer and Kriegler, the dominant method for such studies, which Alcamo (2001, 2008) formalized and named the 'story and simulation' (SAS) approach, relies at least in part on quantitative modeling to ensure consistency. Schweizer and Kriegler rightly criticize the idea that models alone can ensure consistency of a scenario narrative. By itself, this critique is not new. Indeed, if asked, both Alcamo and Raskin et al (Raskin et al 2005), whom Schweizer and Kriegler (2012) cite, would probably agree with them; both sources emphasize the need for qualitative storylines that go beyond what models can provide. However, Schweizer and Kriegler correctly point out that these sources provide little or no guidance to those responsible for the narratives beyond a dialog with the model outputs. The CIB method addresses this problem, and Schweizer and Kriegler's application of the method shows that even the best narrative-writing teams can benefit from this guidance. While the paper of Schweizer and Kriegler makes a compelling argument for using CIB in global scenarios, it should be used in combination with other methods. A scenario exercise has several aims, of which consistency is one. 
Another important goal is diversity: given a set of internally consistent scenarios, a diverse set covers the space of possibilities, and thereby helps users of the scenarios avoid underestimating or overestimating the potential for change in one or another key factor (e.g., see (Carlsen 2009)). From this point of view, the SRES authors could legitimately respond to Schweizer and Kriegler's finding that the SRES scenarios excluded interesting variants on coal-fueled growth by arguing that they did include some variants, and to include more would have conflicted with a legitimate goal of breadth. In this imagined dialog, Schweizer and Kriegler could concede the point, but then point out that several of the SRES scenarios were revealed to be either marginally or very inconsistent by their exercise. Thus, CIB and a technique that helps ensure breadth can usefully complement one another. The CIB method is also liable to a form of specification error, in that the worldviews of the people filling in the cross-impact table influence the results. This is a problem with many foresight techniques, but it is masked by the formalism of CIB, and there is a danger it will go unnoticed. For example, Schweizer and Kriegler's paper suggests that the A1T2 scenario is (marginally) internally consistent. It has relatively low carbon emissions, low rates of population growth, very high GDP per capita growth rates, low primary energy intensity, very low carbon intensity, high fossil-fuel availability, global economic policy focus, and mixed global and regional energy policy focus. It has been argued by Jackson (2009) and Victor (2008), among others, that the evidence is slim that we ever will decouple carbon emissions from GDP to any meaningful extent. Thus, they would presumably argue that this is an inconsistent scenario, and might very well have done so at the time the SRES was written. 
That is not by itself a reason to reject the scenario, but it suggests that a CIB exercise could be run assuming the qualitative models implied by different worldviews, and the results contrasted. Such an exercise would go beyond the sensitivity analysis that Schweizer and Kriegler report in their paper. The cross-impact balance method should be a useful tool for constructing the next round of climate scenarios. It will be even more useful if combined with techniques that ensure a diversity of scenarios. This could include formal techniques such as 'scenario diversity analysis', which maximizes a quantitative measure of the spread of a set of qualitative scenarios defined by states of driving forces (Carlsen 2009). It could also include a survey of different worldviews, and the qualitative models that they imply, such as that carried out by Sunderlin (Sunderlin 2003). Futures studies has moved forward from the time the SRES was published, and new techniques are now available that can help us to tell better stories of the future. References Alcamo J 2001 Scenarios as Tools for International Environmental Assessments (Copenhagen: European Environment Agency) Alcamo J 2008 The SAS approach: combining qualitative and quantitative knowledge in environmental scenarios Environmental Futures—The Practice of Environmental Scenario Analysis vol 2, ed J Alcamo (Amsterdam: Elsevier) pp 123-50 Carlsen H 2009 Climate change and the construction of scenario sets that span the range of societal uncertainties Paper for International Studies Association Annual Convention 2009 (New York City, February) Jackson T 2009 Prosperity Without Growth: Economics for a Finite Planet (London: Earthscan) Kriegler E, O'Neill B C, Hallegatte S, Kram T, Lempert R J, Moss R H and Wilbanks T 2012 The need for and use of socio-economic scenarios for climate change analysis: a new approach based on shared socio-economic pathways Glob. Environ.
Change 22 807-22 Nakicenovic N and Swart R (eds) 2000 Special Report on Emissions Scenarios (Cambridge: Cambridge University Press) Raskin P, Monks F, Ribeiro T, van Vuuren D and Zurek M 2005 Global scenarios in historical perspective Ecosystems and Human Well-Being: Scenarios: Findings of the Scenarios Working Group vol 2, ed S R Carpenter et al (Washington, DC: Island) pp 35-44 Schweizer V J and Kriegler E 2012 Improving environmental change research with systematic techniques for qualitative scenarios Environ. Res. Lett. 7 044011 Sunderlin W D 2003 Ideology, Social Theory, and the Environment (Lanham, MD: Rowman & Littlefield) van Vuuren D P et al 2012 A proposal for a new scenario framework to support research and assessment in different climate research communities Glob. Environ. Change 22 21-35 Victor P A 2008 Managing Without Growth: Slower by Design, Not Disaster (Advances in Ecological Economics Series) (Cheltenham: Edward Elgar) Weimer-Jehle W 2006 Cross-impact balances: a system-theoretical approach to cross-impact analysis Technol. Forecast. Social Change 73 334-61
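
The mechanics of a CIB consistency check can be sketched in a few lines of code. In the sketch below, the descriptors ('economy', 'energy'), their states, and the impact scores are invented purely for illustration; they are not taken from the SRES or from Schweizer and Kriegler's analysis.

```python
# Minimal sketch of a cross-impact balance (CIB) consistency check in the
# spirit of Weimer-Jehle (2006). All descriptors, states, and scores are
# hypothetical illustrations.

# impacts[(d1, s1)][(d2, s2)] is the promoting (+) or restricting (-)
# influence of state s1 of descriptor d1 on state s2 of descriptor d2.
impacts = {
    ("economy", "high growth"): {("energy", "coal"): 2, ("energy", "renewables"): -1},
    ("economy", "low growth"):  {("energy", "coal"): -1, ("energy", "renewables"): 1},
    ("energy", "coal"):         {("economy", "high growth"): 1, ("economy", "low growth"): -1},
    ("energy", "renewables"):   {("economy", "high growth"): -1, ("economy", "low growth"): 1},
}
states = {"economy": ["high growth", "low growth"], "energy": ["coal", "renewables"]}

def balance(scenario, descriptor, state):
    """Total impact received by `state` from the other descriptors' chosen states."""
    return sum(
        impacts.get((d, s), {}).get((descriptor, state), 0)
        for d, s in scenario.items() if d != descriptor
    )

def is_consistent(scenario):
    """A scenario is CIB-consistent if every chosen state scores at least as
    high as any alternative state of the same descriptor."""
    return all(
        balance(scenario, d, scenario[d]) >= max(balance(scenario, d, s) for s in states[d])
        for d in scenario
    )

print(is_consistent({"economy": "high growth", "energy": "coal"}))        # True
print(is_consistent({"economy": "high growth", "energy": "renewables"}))  # False
```

A full CIB exercise enumerates every combination of states this way and retains only the consistent ones. Here the first toy scenario passes, while the second fails because, given renewable-dominated energy, 'high growth' is no longer the best-supported economy state.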

  17. Life cycle analysis of pistachio production in Greece.

    PubMed

    Bartzas, Georgios; Komnitsas, Kostas

    2017-10-01

    In the present paper, a life cycle assessment (LCA) study regarding pistachio (Pistacia vera L.) cultivation on Aegina island, Greece, was performed to evaluate the energy use footprint and the associated environmental impacts. In this context, a detailed life cycle inventory was created based on site-survey data and used for a holistic cradle-to-farm-gate LCA analysis using the GaBi 6.5 software. The main impact categories assessed were acidification potential (AP), eutrophication potential (EP), global warming potential (GWP), ozone depletion potential (ODP), photochemical ozone creation potential (POCP) and cumulative energy demand (CED). In order to reveal the main environmental concerns pertinent to pistachio production, and in turn propose measures for the reduction of environmental and energetic impacts, three scenarios were compared, namely the Baseline scenario (BS), which involves current cultivation practices; the Green Energy (GE) scenario, which involves the use of biological fertilizers, i.e. compost; and the Waste Utilization (WU) scenario, which involves the production of biochar from pistachio and other agricultural wastes and its subsequent soil application to promote carbon sequestration and improve soil quality. Based on the results of this study, the use of compost for fertilization (GE scenario), which results in approximately 9% savings in terms of energy consumption and the five environmental impact categories studied compared to the BS scenario, is considered a promising alternative cultivation strategy. Slightly higher savings (10% on average) in terms of the five calculated environmental impact categories, compared to the BS scenario, were indicated when the WU scenario was considered. Regarding energy consumption, the WU scenario results in a minor increase of 3% compared to the BS scenario.
Results of uncertainty analysis performed using the Monte Carlo technique and contribution analysis showed that GE and WU scenarios offer reliable and significant eco-profile improvements for pistachio production in the study area compared to the current situation. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barr, G.E.; Borns, D.J.; Fridrich, C.

    A comprehensive collection of scenarios is presented that connect initiating tectonic events with radionuclide releases by logical and physically possible combinations or sequences of features, events and processes. The initiating tectonic events include both discrete faulting and distributed rock deformation developed through the repository and adjacent to it, as well as earthquake-induced ground motion and changes in tectonic stress at the site. The effects of these tectonic events include impacts on the engineered-barrier system, such as container rupture and failure of repository tunnels. These effects also include a wide range of hydrologic effects, such as changes in pathways and flow rates in the unsaturated and saturated zones, changes in the water-table configuration, and the development of perched-water systems. These scenarios are intended to guide performance-assessment analyses and to assist principal investigators in determining how essential field, laboratory, and calculational studies are used. This suite of scenarios will help ensure that all important aspects of system disturbance related to a tectonic scenario are captured in numerical analyses. It also provides a record of all options considered by project analysts, supplying documentation required for licensing. The final portion of this report discusses issues remaining to be addressed with respect to tectonic activity. 105 refs.

  19. Biochemical transport modeling, estimation, and detection in realistic environments

    NASA Astrophysics Data System (ADS)

    Ortner, Mathias; Nehorai, Arye

    2006-05-01

    Early detection and estimation of the spread of a biochemical contaminant are major issues for homeland security applications. We present an integrated approach combining the measurements given by an array of biochemical sensors with a physical model of the dispersion and statistical analysis to solve these problems and provide system performance measures. We approximate the dispersion model of the contaminant in a realistic environment through numerical simulations of reflected stochastic diffusions describing the microscopic transport phenomena due to wind and chemical diffusion, using the Feynman-Kac formula. We consider arbitrary complex geometries and account for wind turbulence. Localizing the dispersive sources is useful for decontamination purposes and for estimation of the cloud evolution. To solve the associated inverse problem, we propose a Bayesian framework based on a random field that is particularly powerful for localizing multiple sources with small amounts of measurements. We also develop a sequential detector using the numerical transport model we propose. Sequential detection allows on-line analysis and determination of whether a change has occurred. We first focus on the formulation of a suitable sequential detector that overcomes the presence of unknown parameters (e.g., release time, intensity and location). We compute a bound on the expected delay before false detection in order to set the threshold of the test. For a fixed false-alarm rate, we obtain the detection probability of a substance release as a function of its location and initial concentration. Numerical examples are presented for two real-world scenarios: an urban area and an indoor ventilation duct.
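
    The on-line change detection step described above can be illustrated with a standard CUSUM recursion. The Gaussian pre- and post-change model, the parameter values, and the threshold below are assumptions of this sketch; the paper's detector additionally handles unknown release time, intensity, and location.

```python
# Minimal CUSUM sequential change detector (illustrative sketch).
# Pre-change samples ~ N(mu0, sigma^2), post-change ~ N(mu1, sigma^2);
# these model choices and the threshold are assumptions of this example.

def cusum(samples, mu0=0.0, mu1=1.0, sigma=1.0, threshold=5.0):
    """Return the first index at which the CUSUM statistic exceeds
    `threshold`, or None if no change is declared."""
    stat = 0.0
    for i, x in enumerate(samples):
        # Log-likelihood ratio of one sample under N(mu1) vs N(mu0).
        llr = (mu1 - mu0) / sigma ** 2 * (x - (mu0 + mu1) / 2.0)
        stat = max(0.0, stat + llr)  # reset at zero, accumulate evidence
        if stat > threshold:
            return i
    return None

# Flat background, then a sustained concentration step at index 50:
obs = [0.0] * 50 + [1.5] * 10
print(cusum(obs))  # 55: alarm five samples after the change
```

    Raising the threshold lengthens the expected time before a false alarm at the cost of detection delay, which is precisely the trade-off the delay bound mentioned in the abstract is used to control.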

  20. Influence of Cracks in Cementitious Engineered Barriers in a Near-Surface Disposal System: Assessment Analysis of the Belgian Case

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perko, Janez; Seetharam, Suresh C.; Jacques, Diederik

    2013-07-01

    In large cement-based structures such as a near-surface disposal facility for radioactive waste, voids and cracks are inevitable. However, the pattern and nature of cracks are very difficult to predict reliably. Cracks facilitate preferential water flow through the facility because their saturated hydraulic conductivity is generally higher than that of the cementitious matrix. Moreover, sorption within a crack is expected to be lower than in the matrix, and hence cracks in engineered barriers can act as a bypass for radionuclides. Consequently, understanding the effects of crack characteristics on contaminant fluxes from the facility is of utmost importance in a safety assessment. In this paper we numerically studied radionuclide leaching from a crack-containing cementitious containment system. First, the effect of cracks on radionuclide fluxes is assessed for a single repository component which contains a radionuclide source (i.e. conditioned radwaste). These analyses reveal the influence of cracks on radionuclide release from the source. The second set of calculations deals with the safety assessment results for the planned near-surface disposal facility for low-level radioactive waste in Dessel (Belgium); our focus is on the analysis of total system behaviour with regard to the release of radionuclide fluxes from the facility. Simulation results are interpreted through a complementary safety indicator (radiotoxicity flux). We discuss the possible consequences of different scenarios of cracks and voids. (authors)

  1. Optimizing Characterization of Site Hydrology in Support of New Reactor Licensing at the U.S. Nuclear Regulatory Commission (Invited)

    NASA Astrophysics Data System (ADS)

    Nicholson, T. J.; Raione, R.; Ahn, H.; Barnhurst, D.; Giacinto, J.; McBride, M.; Tiruneh, N. D.

    2009-12-01

    The NRC regulates the civilian use of radioactive materials and facilities in an open and transparent manner. The NRC regulatory criteria are designed to protect human health and safety, and the environment, by regulating nuclear facilities. During review of new reactor licensing applications, NRC staff reviews and independently verifies hydrogeologic information submitted by the applicant in several topical areas, such as development and testing of Conceptual Site Models (CSM), which may involve perched aquifers; engineered water-level fluctuations of surface-water reservoirs; ground-water collector wells and local ground-water uses; design-basis ground-water levels for structural analysis; analysis of scenarios for potential release of radionuclides to the subsurface; deep-well injection of effluents; and monitoring to detect radionuclide releases. This information is reviewed in a systematic manner in accordance with NRC requirements and guidance to evaluate safety and environmental impacts and to reduce the uncertainties in these impacts. NRC licensing staff is reviewing 14 applications for siting new reactors. Experience gained through these licensing activities has shown the value of using site-specific data to evaluate the CSM and its use to assess design and operational issues. Optimizing the information flow through a systematic and thorough review process creates efficiencies. Through an iterative process of evaluating various geographical settings and associated ground-water conditions, NRC staff has developed methods to minimize prediction uncertainty through the use of confirmatory analyses performed under conservative, hierarchical approaches.

  2. Configurational entropy as a tool to select a physical thick brane model

    NASA Astrophysics Data System (ADS)

    Chinaglia, M.; Cruz, W. T.; Correa, R. A. C.; de Paula, W.; Moraes, P. H. R. S.

    2018-04-01

    We analyze braneworld scenarios via a configurational entropy (CE) formalism. Braneworld scenarios have drawn attention mainly because they can explain the hierarchy problem and unify the fundamental forces through a symmetry-breaking procedure. Those scenarios localize matter in a (3 + 1) hypersurface, the brane, which is inserted in a higher-dimensional space, the bulk. Novel analytical braneworld models, in which the warp factor depends on a free parameter n, were recently released in the literature. In this article we provide a way to constrain this parameter through the relation between information and the dynamics of a system described by the CE. We demonstrate that in some cases the CE is an important tool for identifying the most probable physical system among all the possibilities. In addition, we show that the highest CE is correlated with a tachyonic sector of the configuration, where the solutions for the corresponding model are dynamically unstable.
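
    The configurational-entropy calculation invoked above can be sketched numerically using the common discrete recipe (Fourier-transform a sampled energy density, form the modal fraction, and sum -f ln f). The Gaussian profiles and grid below are assumptions of this sketch, standing in for a warp-factor-derived energy density.

```python
# Sketch of a discrete configurational entropy (CE) computation:
# S = -sum_k f_k ln f_k with modal fraction f_k = |F_k|^2 / sum |F_k|^2,
# where F_k is the DFT of the sampled energy density. The profiles below
# are illustrative Gaussians, not solutions of a specific braneworld model.
import cmath, math

def config_entropy(rho):
    """Discrete CE of a sampled energy density via a direct O(n^2) DFT."""
    n = len(rho)
    power = []
    for k in range(n):
        F = sum(rho[j] * cmath.exp(-2j * math.pi * k * j / n) for j in range(n))
        power.append(abs(F) ** 2)
    total = sum(power)
    return -sum(p / total * math.log(p / total) for p in power if p > 0)

# Narrower spatial profiles spread power across more Fourier modes,
# so they carry a higher configurational entropy:
xs = [-5 + 10 * i / 63 for i in range(64)]
wide = [math.exp(-x ** 2 / 4) for x in xs]
narrow = [math.exp(-x ** 2 / 0.25) for x in xs]
print(config_entropy(narrow) > config_entropy(wide))  # True
```

    In a CE analysis of a brane model, one would evaluate this entropy as a function of the free parameter n and look for the extremum selecting the preferred configuration.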

  3. Scenarios of global mercury emissions from anthropogenic sources

    NASA Astrophysics Data System (ADS)

    Rafaj, P.; Bertok, I.; Cofala, J.; Schöpp, W.

    2013-11-01

    This paper discusses the impact of air quality and climate policies on global mercury emissions in the time horizon up to 2050. The evolution of mercury emissions is based on projections of energy consumption for a scenario without any global greenhouse gas mitigation efforts, and for a 2 °C climate policy scenario, which assumes internationally coordinated action to mitigate climate change. The assessment takes into account current air quality legislation in each country and provides estimates of maximum feasible reductions in mercury through 2050. Results indicate significant scope for co-benefits of climate policies for mercury emissions. Atmospheric releases of mercury from anthropogenic sources under the global climate mitigation regime are reduced in 2050 by 45% when compared to the case without climate measures. Around one third of the worldwide co-benefits for mercury emissions by 2050 occur in China. An annual Hg abatement of about 800 tons is estimated for coal combustion in the power sector if current air pollution legislation and climate policies are adopted in parallel.

  4. Orbit Determination Error Analysis Results for the Triana Sun-Earth L2 Libration Point Mission

    NASA Technical Reports Server (NTRS)

    Marr, G.

    2003-01-01

    Using the NASA Goddard Space Flight Center's Orbit Determination Error Analysis System (ODEAS), orbit determination error analysis results are presented for all phases of the Triana Sun-Earth L1 libration point mission and for the science data collection phase of a future Sun-Earth L2 libration point mission. The Triana spacecraft was nominally to be released by the Space Shuttle in a low Earth orbit, and this analysis focuses on that scenario. From the release orbit, a transfer trajectory insertion (TTI) maneuver performed using a solid stage would increase the velocity by approximately 3.1 km/sec, sending Triana on a direct trajectory to its mission orbit. The Triana mission orbit is a Sun-Earth L1 Lissajous orbit with a Sun-Earth-vehicle (SEV) angle between 4.0 and 15.0 degrees, which would be achieved after a Lissajous orbit insertion (LOI) maneuver at approximately launch plus 6 months. Because Triana was to be launched by the Space Shuttle, TTI could potentially occur over a 16-orbit range from low Earth orbit. This analysis was performed assuming TTI was performed from a low Earth orbit with an inclination of 28.5 degrees, and assuming support from a combination of three Deep Space Network (DSN) stations, Goldstone, Canberra, and Madrid, and four commercial Universal Space Network (USN) stations, Alaska, Hawaii, Perth, and Santiago. These ground stations would provide coherent two-way range and range rate tracking data usable for orbit determination. Larger range and range rate errors were assumed for the USN stations. Nominally, DSN support would end at TTI+144 hours assuming there were no USN problems. Post-TTI coverage for a range of TTI longitudes for a given nominal trajectory case was analyzed. The orbit determination error analysis after the first correction maneuver would be generally applicable to any libration point mission utilizing a direct trajectory.

  5. Nanodust released in interplanetary collisions

    NASA Astrophysics Data System (ADS)

    Lai, H. R.; Russell, C. T.

    2018-07-01

    The lifecycle of near-Earth objects (NEOs) involves a collisional cascade that produces ever smaller debris, ending with nanoscale particles that are removed from the solar system by radiation pressure and electromagnetic effects. It has been proposed that the nanodust clouds released in collisions perturb the background interplanetary magnetic field and create the interplanetary field enhancements (IFEs). Assuming that this IFE formation scenario is actually operating, we calculate the interplanetary collision rate, estimate the total debris mass carried by nanodust, and compare the collision rate with the IFE rate. We find that the collision rate required to release the corresponding amount of nanodust is comparable to the observed IFE rate. Besides quantitatively testing the association between collisions involving large objects and giant solar wind structures, such a study can be extended to smaller scales to investigate the source of moderate and small solar wind perturbations.

  6. A natural experiment of the consequences of concentrating former prisoners in the same neighborhoods.

    PubMed

    Kirk, David S

    2015-06-02

    More than 600,000 prisoners are released from incarceration each year in the United States, and most end up residing in metropolitan areas, clustered within a select few neighborhoods. Likely consequences of this concentration of returning prisoners include higher rates of subsequent crime and recidivism. In fact, one-half of released prisoners return to prison within only 3 y of release. The routine exposure to criminogenic influences and criminal opportunities portends a bleak future for individuals who reside in neighborhoods with numerous other ex-prisoners. Through a natural experiment focused on post-Hurricane Katrina Louisiana, I examine a counterfactual scenario: If instead of concentrating ex-prisoners in geographic space, what would happen to recidivism rates if ex-prisoners were dispersed across space? Findings reveal that a decrease in the concentration of parolees in a neighborhood leads to a significant decrease in the reincarceration rate of former prisoners.

  7. The influence of the energy emitted by solar flare soft X-ray bursts on the propagation of their associated interplanetary shock waves

    NASA Technical Reports Server (NTRS)

    Pinter, S.; Dryer, M.

    1985-01-01

    The relationship between the thermal energy released from 29 solar flares and the propagation features of their associated interplanetary shock waves detected at 1 AU is investigated. The 29 interplanetary shock waves were identified unambiguously, and their propagation from each solar flare was traced using their associated interplanetary type II radio emission. The thermal energy released in the solar flares was estimated from the time-intensity profiles of 1-8 Å soft X-ray bursts from each flare. A good relationship is found between the flares' thermal energy and the IP shock waves' transit velocity and arrival time at the earth; that is, the largest flare energy releases are associated with the fastest shock waves. Finally, a possible scenario for the formation of a shock wave during the early phase of the flare and its propagation features is discussed.

  8. Analyzing a bioterror attack on the food supply: the case of botulinum toxin in milk.

    PubMed

    Wein, Lawrence M; Liu, Yifan

    2005-07-12

    We developed a mathematical model of a cows-to-consumers supply chain associated with a single milk-processing facility that is the victim of a deliberate release of botulinum toxin. Because centralized storage and processing lead to substantial dilution of the toxin, a minimum amount of toxin is required for the release to do damage. Irreducible uncertainties regarding the dose-response curve prevent us from quantifying the minimum effective release. However, if terrorists can obtain enough toxin, and this may well be possible, then rapid distribution and consumption result in several hundred thousand poisoned individuals if detection from early symptomatics is not timely. Timely and specific in-process testing has the potential to eliminate the threat of this scenario at a cost of <1 cent per gallon and should be pursued aggressively. Investigation of improving the toxin inactivation rate of heat pasteurization without sacrificing taste or nutrition is warranted.
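
    The dilution argument at the heart of the model can be illustrated with a back-of-envelope sketch. All quantities below (release size, tank volume, serving size) are illustrative placeholders invented for this sketch; the paper deliberately refrains from quantifying the minimum effective release.

```python
# Back-of-envelope sketch of the dilution logic described above: toxin
# released upstream is mixed into a large storage silo, so the per-serving
# dose falls in proportion to tank volume. All numbers are hypothetical
# placeholders, not estimates from the paper.

def dose_per_serving(toxin_g, tank_liters, serving_liters=0.25):
    """Grams of toxin in one serving after uniform mixing in the tank."""
    return toxin_g * serving_liters / tank_liters

# A hypothetical 1 g release into a 200,000 L silo:
dose = dose_per_serving(1.0, 200_000)
print(f"{dose:.2e} g per 250 mL serving")  # 1.25e-06 g per 250 mL serving

# Larger centralized tanks dilute the same release further:
print(dose_per_serving(1.0, 800_000) < dose)  # True
```

    This is why a minimum release amount exists at all: below some threshold, dilution in centralized storage pushes every serving under an effective dose, whatever the exact dose-response curve turns out to be.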

  9. Executive Summary of the Guidelines for the Use of interferon-gamma Release Assays in the Diagnosis of Tuberculosis Infection.

    PubMed

    Santin, Miguel; García-García, José-María; Rigau, David; Altet, Neus; Anibarro, Luis; Casas, Irma; Díez, Nuria; García-Gasalla, Mercedes; Martínez-Lacasa, Xavier; Penas, Antón; Pérez-Escolano, Elvira; Sánchez, Francisca; Domínguez, José

    2016-09-01

    Interferon-gamma release assays are widely used for the diagnosis of tuberculosis infection in Spain. However, there is no consensus on their application in specific clinical scenarios. To develop a guideline for their use, a panel of experts comprising specialists in infectious diseases, respiratory diseases, microbiology, pediatrics and preventive medicine, together with a methodologist, conducted a systematic literature search, summarized the findings, rated the quality of the evidence, and formulated recommendations following the Grading of Recommendations, Assessment, Development and Evaluation methodology. This document provides evidence-based guidance on the use of interferon-gamma release assays for the diagnosis of tuberculosis infection in patients at risk of tuberculosis or suspected of having active disease. The guidelines will be applicable to specialist and primary care, and public health. Copyright © 2016 SEPAR. Published by Elsevier España, S.L.U. All rights reserved.

  10. A natural experiment of the consequences of concentrating former prisoners in the same neighborhoods

    PubMed Central

    Kirk, David S.

    2015-01-01

    More than 600,000 prisoners are released from incarceration each year in the United States, and most end up residing in metropolitan areas, clustered within a select few neighborhoods. Likely consequences of this concentration of returning prisoners include higher rates of subsequent crime and recidivism. In fact, one-half of released prisoners return to prison within only 3 y of release. The routine exposure to criminogenic influences and criminal opportunities portends a bleak future for individuals who reside in neighborhoods with numerous other ex-prisoners. Through a natural experiment focused on post-Hurricane Katrina Louisiana, I examine a counterfactual scenario: If instead of concentrating ex-prisoners in geographic space, what would happen to recidivism rates if ex-prisoners were dispersed across space? Findings reveal that a decrease in the concentration of parolees in a neighborhood leads to a significant decrease in the reincarceration rate of former prisoners. PMID:25976097

  11. Hazardous substances releases associated with Hurricanes Katrina and Rita in industrial settings, Louisiana and Texas.

    PubMed

    Ruckart, Perri Zeitz; Orr, Maureen F; Lanier, Kenneth; Koehler, Allison

    2008-11-15

    The scientific literature concerning the public health response to the unprecedented hurricanes striking the Gulf Coast in August and September 2005 has focused mainly on assessing health-related needs and surveillance of injuries, infectious diseases, and other illnesses. However, the hurricanes also resulted in unintended hazardous substances releases in the affected states. Data from two states (Louisiana and Texas) participating in the Hazardous Substances Emergency Events Surveillance (HSEES) system were analyzed to describe the characteristics of hazardous substances releases in industrial settings associated with Hurricanes Katrina and Rita. HSEES is an active multi-state Web-based surveillance system maintained by the Agency for Toxic Substances and Disease Registry (ATSDR). In 2005, 166 hurricane-related hazardous substances events in industrial settings in Louisiana and Texas were reported. Most (72.3%) releases were due to emergency shutdowns in preparation for the hurricanes and start-ups after the hurricanes. Emphasis is given to the contributing causal factors, hazardous substances released, and event scenarios. Recommendations are made to prevent or minimize acute releases of hazardous substances during future hurricanes, including installing backup power generation, securing equipment and piping to withstand high winds, establishing procedures to shut down process operations safely, following established and up-to-date start-up procedures and checklists, and carefully performing pre-start-up safety reviews.

  12. Thermal decomposition of nano-enabled thermoplastics: Possible environmental health and safety implications

    PubMed Central

    Sotiriou, Georgios A.; Singh, Dilpreet; Zhang, Fang; Chalbot, Marie-Cecile G.; Spielman-Sun, Eleanor; Hoering, Lutz; Kavouras, Ilias G.; Lowry, Gregory V.; Wohlleben, Wendel; Demokritou, Philip

    2015-01-01

    Nano-enabled products (NEPs) are currently part of our lives, prompting detailed investigation of potential nano-release across their life cycle. Particularly interesting is their end-of-life thermal decomposition scenario. Here, we examine the thermal decomposition of a widely used NEP, namely thermoplastic nanocomposites, and assess the properties of the byproducts (released aerosol and residual ash) and possible environmental health and safety implications. We focus on establishing a fundamental understanding of the effect of thermal decomposition parameters, such as polymer matrix, nanofiller properties, and decomposition temperature, on the properties of the byproducts, using a recently developed lab-based experimental integrated platform. Our results indicate that the thermoplastic polymer matrix strongly influences the size and morphology of the released aerosol, while there was minimal but detectable nano-release, especially when inorganic nanofillers were used. The chemical composition of the released aerosol was found not to be strongly influenced by the presence of nanofiller, at least for the low, industry-relevant loadings assessed here. Furthermore, the morphology and composition of the residual ash were found to be strongly influenced by the presence of nanofiller. The findings presented here on the thermal decomposition/incineration of NEPs raise important questions and concerns regarding the potential fate and transport of released engineered nanomaterials in environmental media and potential environmental health and safety implications. PMID:26642449

  13. Comprehensive Reactive Receiver Modeling for Diffusive Molecular Communication Systems: Reversible Binding, Molecule Degradation, and Finite Number of Receptors.

    PubMed

    Ahmadzadeh, Arman; Arjmandi, Hamidreza; Burkovski, Andreas; Schober, Robert

    2016-10-01

    This paper studies the problem of receiver modeling in molecular communication systems. We consider the diffusive molecular communication channel between a transmitter nano-machine and a receiver nano-machine in a fluid environment. The information molecules released by the transmitter nano-machine into the environment can degrade in the channel via a first-order degradation reaction and those that reach the receiver nano-machine can participate in a reversible bimolecular reaction with receiver receptor proteins. Thereby, we distinguish between two scenarios. In the first scenario, we assume that the entire surface of the receiver is covered by receptor molecules. We derive a closed-form analytical expression for the expected received signal at the receiver, i.e., the expected number of activated receptors on the surface of the receiver. Then, in the second scenario, we consider the case where the number of receptor molecules is finite and the uniformly distributed receptor molecules cover the receiver surface only partially. We show that the expected received signal for this scenario can be accurately approximated by the expected received signal for the first scenario after appropriately modifying the forward reaction rate constant. The accuracy of the derived analytical results is verified by Brownian motion particle-based simulations of the considered environment, where we also show the impact of the effect of receptor occupancy on the derived analytical results.

  14. Evaluation of the Hydropower Generation Potential of a Dam Using Optimization Techniques: Application to Doma Dam, Nassarawa, in North Central Nigeria

    NASA Astrophysics Data System (ADS)

    Salami, Adebayo Wahab; Sule, Bolaji Fatai; Adunkpe, Tope Lacroix; Ayanshola, Ayanniyi Mufutau; Bilewu, Solomon Olakunle

    2017-03-01

    Optimization models have been developed to maximize annual energy generation from the Doma dam, subject to constraints on releases for irrigation, ecological purposes, and the water supply, the maximum yield from the reservoir, and reservoir storage. The model was solved with LINGO software for various mean annual inflow exceedance probabilities. Two hydropower retrofitting scenarios were considered. Scenario 1, with reservoir inflows at 50%, 75%, and 90% probabilities of exceedance, gives total annual hydropower of 0.531 MW, 0.450 MW, and 0.291 MW, respectively. The corresponding values for scenario 2 were 0.615 MW, 0.507 MW, and 0.346 MW, respectively. The study also considered increasing the reservoir's live storage to 32.63 Mm3 by taking part of the flood storage, so that the maximum draft increases to 7 Mm3. With this upper limit of storage and draft, and reservoir inflows at 50%, 75%, and 90% probabilities of exceedance, the hydropower generated increased to 0.609 MW, 0.540 MW, and 0.347 MW, respectively, for the scenario 1 arrangement, while the scenario 2 values increased to 0.699 MW, 0.579 MW, and 0.406 MW, respectively. The results indicate that the Doma Dam is suitable for the production of hydroelectric power and that its generation potential is between 0.61 MW and 0.70 MW.
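    The abstract does not reproduce the LINGO formulation, but the physics underneath it is the standard hydraulic power relation P = ρgQHη combined with a reservoir mass balance under storage and draft constraints. A toy single-month sketch follows; the head, efficiency, inflow, and storage bounds are assumed values for illustration, not Doma Dam data:

```python
RHO_WATER, G = 1000.0, 9.81  # kg/m^3, m/s^2

def hydropower_mw(discharge_m3s, head_m, efficiency=0.85):
    """Hydraulic power P = rho * g * Q * H * eta, converted to megawatts."""
    return RHO_WATER * G * discharge_m3s * head_m * efficiency / 1e6

def monthly_release(storage, inflow, min_storage, max_storage, max_draft):
    """Greedy feasible release (all volumes in Mm^3): respect the draft cap
    and the minimum storage, and cap the carried-over storage at capacity."""
    available = storage + inflow - min_storage
    release = min(max_draft, max(0.0, available))
    new_storage = min(max_storage, storage + inflow - release)
    return release, new_storage

# One illustrative 30-day month with an assumed 20 m gross head
release, storage = monthly_release(storage=25.0, inflow=6.0,
                                   min_storage=3.0, max_storage=32.63,
                                   max_draft=7.0)
q = release * 1e6 / (30 * 86400)   # Mm^3 per month -> m^3/s
power = hydropower_mw(q, head_m=20.0)
```

    An actual optimization would decide the twelve monthly releases jointly (e.g., as a linear program) rather than greedily month by month; the sketch only shows how the constraints and the power relation interact.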

  15. What is the Value Added to Adaptation Planning by Probabilistic Projections of Climate Change?

    NASA Astrophysics Data System (ADS)

    Wilby, R. L.

    2008-12-01

    Probabilistic projections of climate change offer new sources of risk information to support regional impacts assessment and adaptation options appraisal. However, questions continue to surround how best to apply these scenarios in a practical context, and whether the added complexity and computational burden lead to more robust decision-making. This paper provides an overview of recent efforts in the UK to 'bench-test' frameworks for employing probabilistic projections ahead of the release of the next-generation UKCIP08 projections (in November 2008). This involves close collaboration between government agencies, research, and stakeholder communities. Three examples will be cited to illustrate how probabilistic projections are already informing decisions about future flood risk management in London, water resource planning in trial river basins, and assessments of risks from rising water temperatures to Atlantic salmon stocks in southern England. When compared with conventional deterministic scenarios, ensemble projections allow exploration of a wider range of management options and highlight timescales for implementing adaptation measures. Users of probabilistic scenarios must keep in mind that other uncertainties (e.g., due to impacts model structure and parameterisation) should be handled as rigorously as those arising from climate models and emission scenarios. Finally, it is noted that a commitment to long-term monitoring is also critical for tracking environmental change, testing model projections, and evaluating the success (or not) of any scenario-led interventions.

  16. Improving Conflict Alert Performance Using Moving Target Detector Data.

    DTIC Science & Technology

    1982-06-01

    [Scanned-document OCR fragments from report DOT/FAA/RD-82/47 (DOT/FAA/CT-81...), including list-of-figures entries: Differences for Stochastic Case; Illustration of Scenarios for Warning Time Tests; Illustration of Scenarios Used for Nuisance Alert Area Analysis; Nuisance Alert Area Analysis of Scenario 3 with a Target Velocity of 480 Knots and SPMB = SPPB = 2.8 nmi.]

  17. Energy crops on landfills: functional, environmental, and costs analysis of different landfill configurations.

    PubMed

    Pivato, Alberto; Garbo, Francesco; Moretto, Marco; Lavagnolo, Maria Cristina

    2018-02-09

    The cultivation of energy crops on landfills represents an important challenge for the near future, as the possibility of using devalued sites for energy production is very attractive. In this study, four scenarios were assessed and compared against a reference case defined for northern Italy. The scenarios were defined taking current energy crop issues into consideration. The first three scenarios were based on energy maximisation, phytotreatment ability, and environmental impact, respectively; the fourth scenario combined the characteristics emphasised by the previous three. A multi-criteria analysis, based on economic, energetic, and environmental aspects, was performed. The fourth scenario proved to be the best, owing to its ability to pursue several objectives simultaneously and its top scores on both the environmental and energetic criteria. By contrast, the economic criterion emerged as weak, as all the scenarios considered showed limitations in this respect. Important indications for future designs can be derived. The decrease in leachate production due to the presence of energy crops on the top cover, which enhances evapotranspiration, represents a favourable but critical aspect in the definition of the results.

  18. Prediction of the run out extents of the Slano Blato landslide for future debris flow events

    NASA Astrophysics Data System (ADS)

    Askarinejad, Amin; Leu, Pascal; Macek, Matej; Petkovsek, Ana; Springman, Sarah

    2013-04-01

    The Slano Blato landslide has a volume of about 1 million m3 and is located in the western part of Slovenia. It has been considered a potential natural hazard for the village of Lokavec for more than 200 years. Several mud flows, exhibiting a range of volumes and velocities, have originated from the landslide body since the year 2000, when the landslide was reactivated by an intense rainfall event. A series of obstacles, including safety dams and deposition ponds, has been constructed for the remediation of the landslide; these obstacles are designed to absorb and contain future debris flows. A prerequisite to any risk analysis is to establish the vulnerability to the hazard event. The aim of this work is to simulate possible future debris flow scenarios in order to predict the run out distances, flow heights, impact pressures, and potential effects on the downstream village buildings and infrastructure. The simulations were carried out using the RAMMS program (RApid Mass MovementS, www.ramms.slf.ch). A three-dimensional terrain model of the landslide area and the downstream zones, with or without the inclusion of the obstacles, was built for the simulations, and different scenarios concerning the released volume, the internal friction, and the viscosity of the sliding mass were studied. The results indicate that low-viscosity mudflows with a volume of 5,000 m3 endanger some parts of Lokavec village, while the simulations with volumes of 15,000 and 50,000 m3 predict catastrophic effects, in terms of either impact pressures or deposition heights, for the majority of houses. Moreover, the simulations confirmed that the choice of material properties (internal friction and viscosity), the characteristics of the release hydrograph, the event location, and natural or man-made obstacles play major roles in the run out distances and impact pressures.

  19. Issues Involving The OSI Concept of Operation For Noble Gas Radionuclide Detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carrigan, C R; Sun, Y

    2011-01-21

    The development of a technically sound protocol for detecting the subsurface release of noble gas radionuclides is critical to the successful operation of an on-site inspection (OSI) under the CTBT and has broad ramifications for all aspects of the OSI regime, including the setting of specifications for both the sampling and analysis equipment used during an OSI. With NA-24 support, we are investigating a variety of issues and concerns that have significant bearing on policy development and technical guidance regarding the detection of noble gases and the creation of a technically justifiable OSI concept of operation. The work at LLNL focuses on optimizing the ability to capture radioactive noble gases subject to the constraints of possible OSI scenarios. This focus results from recognizing the difficulty of detecting gas releases in geologic environments, a lesson we learned previously from the LLNL Non-Proliferation Experiment (NPE). Evaluation of a number of important noble gas detection issues, potentially affecting OSI policy, has awaited US re-engagement with the OSI technical community; thus, there have been numerous issues to address during the past 18 months. Most of our evaluations of a sampling or transport issue necessarily involve computer simulations. This is partly due to the lack of OSI-relevant field data, such as that provided by the NPE, and partly a result of the ability of LLNL computer-based models to test a range of geologic and atmospheric scenarios far beyond what could ever be studied in the field, making this approach highly cost-effective. We review some highlights of the transport and sampling issues we have investigated during the past year, and complete the discussion with a description of a preliminary design for subsurface sampling that is intended to be a practical solution to most, if not all, of the challenges addressed here.

  20. Evaluation of laser-induced breakdown spectroscopy analysis potential for addressing radiological threats from a distance

    NASA Astrophysics Data System (ADS)

    Gaona, I.; Serrano, J.; Moros, J.; Laserna, J. J.

    2014-06-01

    Although radioactive materials are nowadays valuable tools in nearly all fields of modern science and technology, the dangers stemming from the uncontrolled use of ionizing radiation are more than evident. Since preparedness is a key issue in facing the risks of a radiation dispersal event, the development of rapid and efficient monitoring technologies to control the contamination caused by radioactive materials is of crucial interest. Laser-induced breakdown spectroscopy (LIBS) exhibits appealing features for this application. This research focuses on the assessment of the potential of LIBS for the in-situ fingerprinting and identification of radioactive material surrogates from a safe distance. LIBS selectivity and sensitivity in detecting a variety of radioactive surrogates, namely 59Co, 88Sr, 130Ba, 133Cs, 193Ir, and 238U, on the surface of common urban materials at a distance of 30 m have been evaluated. The performance of the technique for nuclear forensics has also been studied on different model scenarios. The findings reveal the difficulty of detecting and identifying the analytes depending on the surface being interrogated. Nevertheless, as demonstrated, LIBS shows sufficient potential for the prompt and accurate gathering of essential evidence at a number of sites after the release, either accidental or intentional, of radioactive material. The capability of standoff analysis confers unique advantages on LIBS in terms of fast and safe inspection of forensic scenarios. The identity of the radioactive surrogates is easily assigned from a distance, and the detection sensitivity is in the range of a few hundred ng per square centimeter.

  1. Critical Assessment of Analytical Techniques in the Search for Biomarkers on Mars: A Mummified Microbial Mat from Antarctica as a Best-Case Scenario

    NASA Astrophysics Data System (ADS)

    Blanco, Yolanda; Gallardo-Carreño, Ignacio; Ruiz-Bermejo, Marta; Puente-Sánchez, Fernando; Cavalcante-Silva, Erika; Quesada, Antonio; Prieto-Ballesteros, Olga; Parro, Víctor

    2017-10-01

    The search for biomarkers of present or past life is one of the major challenges for in situ planetary exploration. Multiple constraints limit the performance and sensitivity of remote in situ instrumentation. In addition, the structure, chemical, and mineralogical composition of the sample may complicate the analysis and interpretation of the results. The aim of this work is to highlight the main constraints, performance, and complementarity of several techniques that have already been implemented or are planned to be implemented on Mars for detection of organic and molecular biomarkers on a best-case sample scenario. We analyzed a 1000-year-old desiccated and mummified microbial mat from Antarctica by Raman and IR (infrared) spectroscopies (near- and mid-IR), thermogravimetry (TG), differential thermal analysis, mass spectrometry (MS), and immunological detection with a life detector chip. In spite of the high organic content (ca. 20% wt/wt) of the sample, the Raman spectra only showed the characteristic spectral peaks of the remaining beta-carotene biomarker and faint peaks of phyllosilicates over a strong fluorescence background. IR spectra complemented the mineralogical information from Raman spectra and showed the main molecular vibrations of the humic acid functional groups. The TG-MS system showed the release of several volatile compounds attributed to biopolymers. An antibody microarray for detecting cyanobacteria (CYANOCHIP) detected biomarkers from Chroococcales, Nostocales, and Oscillatoriales orders. The results highlight limitations of each technique and suggest the necessity of complementary approaches in the search for biomarkers because some analytical techniques might be impaired by sample composition, presentation, or processing.

  2. 2nd Generation Reusable Launch Vehicle Potential Commercial Development Scenarios

    NASA Technical Reports Server (NTRS)

    Creech, Stephen D.; Rogacki, John R. (Technical Monitor)

    2001-01-01

    The presentation will discuss potential commercial development scenarios for a Second Generation Reusable Launch Vehicle. The analysis of potential scenarios will include commercial rates of return, government return on investment, and market considerations. The presentation will include policy considerations in addition to an analysis of Second Generation Reusable Launch Vehicle economics. The data discussed are being developed as part of NASA's Second Generation Reusable Launch Vehicle Program, for consideration as potential scenarios for enabling a next-generation system. The material will include potential scenarios not previously considered by NASA or presented at other conferences. The candidate paper has not been presented at a previous meeting, and conference attendance of the author has been approved by NASA.

  3. Potential risks of TiO2 and ZnO nanoparticles released from sunscreens into outdoor swimming pools.

    PubMed

    Jeon, Soo-Kyung; Kim, Eun-Ju; Lee, Jaesang; Lee, Seunghak

    2016-11-05

    The potential risks of nanoparticles (NPs) in sunscreens being released into swimming water were evaluated by a series of laboratory experiments simulating the fate and transport of NPs in outdoor swimming pools. NPs released from sunscreen-applied skin were estimated using pig skins covered with five different commercial sunscreens containing TiO2, ZnO, or both at various concentrations. Assuming that the swimming water treatment process consisted of filtration, UV irradiation, heating, and chlorination, the possible removal of the released NPs by each process was estimated. Generation of hydrogen peroxide (H2O2) by the NPs under sunlight and after UV photochemical treatment was measured, and the H2O2 concentration possibly present in the swimming pool was calculated based on specific scenarios for operating an outdoor swimming pool. It was found that a significant amount of the NPs in sunscreens could be released into the swimming water and accumulate during circulation through the treatment system. However, the concentration of H2O2 likely to be present in the swimming pool should be below the level of concern for adverse effects on bathers. Copyright © 2016 Elsevier B.V. All rights reserved.

  4. Benefit-Cost Analysis of Integrated Paratransit Systems : Volume 3. Scenario Analysis.

    DOT National Transportation Integrated Search

    1979-09-01

    This is the third volume of a six-volume report documenting the results of a study entitled "Benefit-Cost Analysis of Integrated Paratransit Systems." This volume provides detailed results of a series of scenario analyses designed to determine the im...

  5. Atmospheric Transport Modelling and Radionuclide Analysis for the NPE 2015 scenario

    NASA Astrophysics Data System (ADS)

    Ross, J. Ole; Bollhöfer, Andreas; Heidmann, Verena; Krais, Roman; Schlosser, Clemens; Gestermann, Nicolai; Ceranna, Lars

    2017-04-01

    The Comprehensive Nuclear-Test-Ban Treaty (CTBT) prohibits all kinds of nuclear explosions. The International Monitoring System (IMS) is in place and about 90% complete to verify compliance with the CTBT. The stations of the waveform technologies are capable of detecting seismic, hydro-acoustic, and infrasonic signals for the detection, localization, and characterization of explosions. For practicing CTBT verification procedures and the interplay between the International Data Centre (IDC) and National Data Centres (NDCs), preparedness exercises (NPEs) are regularly performed with selected events representing a fictitious CTBT violation. The German NDC's expertise in radionuclide analysis and operation of station RN33 is provided by the Federal Office for Radiation Protection (BfS), while Atmospheric Transport Modelling (ATM) for CTBT purposes is performed at the Federal Institute for Geosciences and Natural Resources (BGR) to combine the radionuclide findings with waveform evidence. The radionuclide part of the NPE 2015 scenario is tackled in a joint effort by BfS and BGR. First, the NPE 2015 spectra are analysed, fission products are identified, and the respective activity concentrations are derived. Special focus is on isotopic ratios, which allow for source characterization and event timing. For atmospheric backtracking, the binary coincidence method is applied both to SRS fields from the IDC and WMO-RSMCs and to in-house backward simulations at higher resolution for the first affected samples. Results are compared with the WebGrape PSR, and the spatio-temporal domain with high atmospheric release probability is determined. The ATM results, together with the radionuclide fingerprint, are used for the identification of waveform candidate events, and comparative forward simulations of atmospheric dispersion for the candidate events are performed. Finally, the overall consistency of the various source scenarios is assessed and a fictitious government briefing on the findings is given.

  6. Is there enough sand? Evaluating the fate of Grand Canyon sandbars

    USGS Publications Warehouse

    Wright, S.A.; Schmidt, J.C.; Meles, T.S.; Topping, D.J.; Rubin, D.M.

    2008-01-01

    Large dams have the potential to dramatically alter the flow regime, geomorphology, and aquatic ecosystem of downstream river reaches. Development of flow release regimes in order to meet multiple objectives is a challenge facing dam operators, resource managers, and scientists. Herein, we review previous work and present new analyses related to the effects of Glen Canyon Dam on the downstream reach of the Colorado River in Marble and Grand Canyons. The dam traps the entire incoming sediment load in Lake Powell and modulates the hydrologic regime by, for example, eliminating spring snowmelt floods, resulting in changes in the geomorphology of the river downstream. The primary geomorphic impact has been the erosion of sandbars along the banks of the river. Recognition of this impact has led to many scientific studies and a variety of experimental operations of Glen Canyon Dam with the goal of rebuilding the eroding sandbars. These efforts have thus far been generally unsuccessful and the question remains as to whether or not the dam can be operated such that sandbars can be rebuilt and maintained over extended periods with the existing sediment supply. We attempt to answer this question by evaluating a dam operation that may be considered a "best-case scenario" for rebuilding and maintaining eroded sandbars. Our analysis suggests that this best-case scenario may indeed have viability for rebuilding sandbars, and that the initial rate at which sandbars could be rebuilt is comparable to the rate at which sandbars have been eroded since dam construction. The question remains open as to the viability of operations that deviate from the best-case scenario that we have defined.

  7. Model Performance Evaluation and Scenario Analysis (MPESA)

    EPA Pesticide Factsheets

    Model Performance Evaluation and Scenario Analysis (MPESA) assesses how well models predict time series data. The tool was developed for the Hydrological Simulation Program-Fortran (HSPF) and the Stormwater Management Model (SWMM).

  8. Fuel treatment effects on tree-based forest carbon storage and emissions under modeled wildfire scenarios

    Treesearch

    M. Hurteau; M. North

    2009-01-01

    Forests are viewed as a potential sink for carbon (C) that might otherwise contribute to climate change. It is unclear, however, how to manage forests with frequent fire regimes to maximize C storage while reducing C emissions from prescribed burns or wildfire. We modeled the effects of eight different fuel treatments on tree-based C storage and release over a century,...

  9. Calculation note for an underground leak which remains underground

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldberg, H.J.

    1997-05-20

    This calculation note supports the subsurface leak accident scenario which remains subsurface. It is assumed that a single-walled pipe carrying waste from tank 106-C ruptures, releasing the liquid waste into the soil. In this scenario, the waste does not form a surface pool but remains subsurface. However, above the pipe is a berm, 0.762 m (2.5 ft) high and 2.44 m (8 ft) wide, and the liquid released from the leak rises into the berm. The slurry line, which transports a source term of higher activity than the sluice line, leaks into the soil at a rate of 5% of the maximum flow rate of 28.4 L/s (450 gpm) for twelve hours. The dose recipient was placed a perpendicular distance of 100 m from the pipe. Two source terms were considered, mitigated and unmitigated release, as described in section 3.4.1 of UANF-SD-WM-BIO-001, Addendum 1. The unmitigated release consisted of two parts AWF liquid and one part AWF solid; the mitigated release consisted of two parts SST liquid, eighteen parts AWF liquid, nine parts SST solid, and one part AWF solid. The isotopic breakdown of the release in these cases is presented. Two geometries were considered in preliminary investigations: a disk source and a rectangular source. Since the rectangular source results from the assumption that the contamination is wicked up into the berm, only six inches of shielding from uncontaminated earth is present, while the disk source, which remains six inches below the level of the surface of the land, is often shielded by a thick shield due to the slant path to the dose point. For this reason, only the rectangular source was considered in the final analysis. The source model was a rectangle 2.134 m (7 ft) thick, 0.6096 m (2 ft) high, and 130.899 m (131 ft) long. The top and sides of this rectangular source were covered with earth of density 1.6 g/cm3 to a thickness of 15.24 cm (6 in). This soil is modeled as 40% void space. The source consisted of earth of the same density with the void spaces filled with the liquid waste, which added 0.56 g/cm3 to the density. The dose point was 100 m (328 ft) away from the berm in a perpendicular direction off the center. The computer code MICROSKYSHINE was used to calculate the skyshine from the source; this code calculates the exposure rate at the receptor point. The photon spectrum from 2 MeV to 0.15 MeV, obtained from ISOSHLD, was used as input, although this did not differ substantially from the results obtained using Co, 137mBa, and 154Eu. However, this methodology allowed the bremsstrahlung contribution to be included in the skyshine calculation as well as in the direct radiation calculation.

  10. Comparing CMIP-3 and CMIP-5 climate projections on flooding estimation of Devils Lake of North Dakota, USA

    PubMed Central

    2018-01-01

    Background Water level fluctuations in endorheic lakes are highly susceptible to even slight changes in climate and land use. Devils Lake (DL) in North Dakota, USA is an endorheic system that has undergone multi-decade flooding driven by changes in regional climate. Flooding mitigation strategies have centered on the release of lake water to a nearby river system through artificial outlets, resulting in legal challenges and environmental concerns related to water quality, downstream flooding, species migration, stakeholder opposition, and transboundary water conflicts between the US and Canada. Despite these drawbacks, running the outlets would result in low overspill risks in the next 30 years. Methods In this study we evaluated the efficacy of this outlet-based mitigation strategy under scenarios based on the latest IPCC future climate projections. We used Coupled Model Intercomparison Project CMIP-5 weather patterns from 17 general circulation models (GCMs) obtained under four representative concentration pathway (RCP) scenarios and downscaled to the DL region. We then simulated the changes in lake water levels using a Soil and Water Assessment Tool (SWAT)-based hydrological model of the watershed, estimated the probability of future flood risks under those scenarios, and compared those with previously estimated overspill risks under the CMIP-3 climate. Results The CMIP-5 ensemble projected a mean annual temperature of 5.78 °C and mean daily precipitation of 1.42 mm/day; both are higher than the existing CMIP-3 future estimates of 4.98 °C and 1.40 mm/day, respectively. The increased precipitation and higher temperature resulted in a significant increase in DL’s overspill risks: 24.4–47.1% without release from outlets and 3.5–14.4% even if the outlets are operated at their combined full 17 m3/s capacity. Discussion The modeled increases in overspill risks indicate a greater frequency of water releases through the artificial outlets. 
Future risk mitigation management should include providing a flood warning signal to local resource managers, and tasking policy makers to identify additional solution measures such as land use management in the upper watershed to mitigate DL’s flooding. PMID:29736343
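    The overspill risks quoted above are, in effect, exceedance probabilities estimated across the GCM-driven ensemble of simulated lake levels. A minimal sketch of that estimate follows; the peak levels and spill elevation below are invented illustrative numbers, not Devils Lake values:

```python
def overspill_risk(peak_levels_m, spill_elevation_m):
    """Fraction of ensemble members whose simulated peak lake level
    exceeds the natural spill elevation."""
    if not peak_levels_m:
        raise ValueError("empty ensemble")
    exceed = sum(1 for level in peak_levels_m if level > spill_elevation_m)
    return exceed / len(peak_levels_m)

# 17 hypothetical GCM-driven peak levels (m) against an assumed spill elevation
peaks = [443.1, 444.9, 445.6, 444.2, 446.0, 443.8, 445.9, 444.7, 443.5,
         445.2, 444.0, 446.3, 443.9, 444.4, 445.8, 444.1, 443.6]
risk = overspill_risk(peaks, spill_elevation_m=445.0)  # 6 of 17 exceed
```

    In practice each ensemble member would itself be a multi-decade hydrological simulation, so the reported risk ranges also fold in the spread across RCP scenarios.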

  11. Comparing CMIP-3 and CMIP-5 climate projections on flooding estimation of Devils Lake of North Dakota, USA.

    PubMed

    Kharel, Gehendra; Kirilenko, Andrei

    2018-01-01

    Water level fluctuations in endorheic lakes are highly susceptible to even slight changes in climate and land use. Devils Lake (DL) in North Dakota, USA is an endorheic system that has undergone multi-decade flooding driven by changes in regional climate. Flooding mitigation strategies have centered on the release of lake water to a nearby river system through artificial outlets, resulting in legal challenges and environmental concerns related to water quality, downstream flooding, species migration, stakeholder opposition, and transboundary water conflicts between the US and Canada. Despite these drawbacks, running the outlets would result in low overspill risks in the next 30 years. In this study we evaluated the efficacy of this outlet-based mitigation strategy under scenarios based on the latest IPCC future climate projections. We used Coupled Model Intercomparison Project CMIP-5 weather patterns from 17 general circulation models (GCMs) obtained under four representative concentration pathway (RCP) scenarios and downscaled to the DL region. We then simulated the changes in lake water levels using a Soil and Water Assessment Tool (SWAT)-based hydrological model of the watershed, estimated the probability of future flood risks under those scenarios, and compared those with previously estimated overspill risks under the CMIP-3 climate. The CMIP-5 ensemble projected a mean annual temperature of 5.78 °C and mean daily precipitation of 1.42 mm/day; both are higher than the existing CMIP-3 future estimates of 4.98 °C and 1.40 mm/day, respectively. The increased precipitation and higher temperature resulted in a significant increase in DL's overspill risks: 24.4-47.1% without release from outlets and 3.5-14.4% even if the outlets are operated at their combined full 17 m3/s capacity. The modeled increases in overspill risks indicate a greater frequency of water releases through the artificial outlets. 
Future risk mitigation management should include providing a flood warning signal to local resource managers, and tasking policy makers to identify additional solution measures such as land use management in the upper watershed to mitigate DL's flooding.

  12. Scenario analysis of the future of medicines.

    PubMed Central

    Leufkens, H.; Haaijer-Ruskamp, F.; Bakker, A.; Dukes, G.

    1994-01-01

    Planning future policy for medicines poses difficult problems. The main players in the drug business have their own views as to how the world around them functions and how the future of medicines should be shaped. In this paper we show how a scenario analysis can provide a powerful teaching device to readjust people's preconceptions. Scenarios are plausible, not probable or preferable, portraits of alternative futures. A series of four alternative scenarios was constructed: "sobriety in sufficiency," "risk avoidance," "technology on demand," and "free market unfettered." Each scenario was drawn as a narrative, documented quantitatively wherever possible, that described the world as it might be if particular trends were to dominate development. The medical community and health policy makers may use scenarios to take a long-term view in order to be adequately prepared for the future. PMID:7987110

  13. Simulating future water temperatures in the North Santiam River, Oregon

    NASA Astrophysics Data System (ADS)

    Buccola, Norman L.; Risley, John C.; Rounds, Stewart A.

    2016-04-01

    A previously calibrated two-dimensional hydrodynamic and water-quality model (CE-QUAL-W2) of Detroit Lake in western Oregon was used in conjunction with inflows derived from Precipitation-Runoff Modeling System (PRMS) hydrologic models to examine in-lake and downstream water temperature effects under future climate conditions. Current and hypothetical operations and structures at Detroit Dam were imposed on boundary conditions derived from downscaled General Circulation Models in base (1990-1999) and future (2059-2068) periods. Compared with the base period, future air temperatures were about 2 °C warmer year-round. Higher air temperatures and lower precipitation under the future period resulted in a 23% reduction in mean annual PRMS-simulated discharge and a 1 °C increase in mean annual estimated stream temperatures flowing into the lake compared to the base period. Simulations incorporating current operational rules and minimum release rates at Detroit Dam to support downstream habitat, irrigation, and water supply during key times of year resulted in lower future lake levels; under that scenario, the lake level is above the dam's spillway crest on only about half as many days in the future as historically. Managing temperature downstream of Detroit Dam depends on the ability to blend warmer water from the lake's surface with cooler water from deep in the lake, and the spillway is an important release point near the lake's surface. Annual average in-lake and release temperatures from Detroit Lake warmed 1.1 °C and 1.5 °C, respectively, from the base to the future period under present-day dam operational rules and fill schedules. Simulated dam operations, such as beginning refill of the lake 30 days earlier or reducing minimum release rates (to keep more water in the lake and retain the use of the spillway), mitigated future warming, keeping temperatures 0.4 and 0.9 °C below those of the existing operational scenarios during the critical autumn spawning period for endangered salmonids. 
A hypothetical floating surface withdrawal at Detroit Dam improved temperature control in summer and autumn (0.6 °C warmer in summer, 0.6 °C cooler in autumn compared to existing structures) without altering release rates or lake level management rules.

  14. Automated source term and wind parameter estimation for atmospheric transport and dispersion applications

    NASA Astrophysics Data System (ADS)

    Bieringer, Paul E.; Rodriguez, Luna M.; Vandenberghe, Francois; Hurst, Jonathan G.; Bieberbach, George; Sykes, Ian; Hannan, John R.; Zaragoza, Jake; Fry, Richard N.

    2015-12-01

    Accurate simulations of the atmospheric transport and dispersion (AT&D) of hazardous airborne materials rely heavily on the source term parameters necessary to characterize the initial release and meteorological conditions that drive the downwind dispersion. In many cases the source parameters are not known and are consequently based on rudimentary assumptions. This is particularly true of accidental releases and the intentional releases associated with terrorist incidents. When available, meteorological observations are often not representative of the conditions at the location of the release, and the use of these non-representative meteorological conditions can result in significant errors in the hazard assessments downwind of the sensors, even when the other source parameters are accurately characterized. Here, we describe a computationally efficient methodology to characterize both the release source parameters and the low-level winds (e.g., winds near the surface) required to produce a refined downwind hazard. This methodology, known as the Variational Iterative Refinement Source Term Estimation (STE) Algorithm (VIRSA), consists of a combination of modeling systems. These systems include a back-trajectory based source inversion method, a forward Gaussian puff dispersion model, and a variational refinement algorithm that uses both a simple forward AT&D model that is a surrogate for the more complex Gaussian puff model and a formal adjoint of this surrogate model. The back-trajectory based method is used to calculate a "first guess" source estimate based on the available observations of the airborne contaminant plume and atmospheric conditions. The variational refinement algorithm is then used to iteratively refine the first guess STE parameters and meteorological variables. The algorithm has been evaluated across a wide range of scenarios of varying complexity.
It has been shown to improve source location estimates by several hundred percent (normalized by the distance from the source to the closest sampler) and to improve mass estimates by several orders of magnitude. Furthermore, it can also operate in scenarios with inconsistencies between the wind and airborne contaminant sensor observations, adjusting the wind to provide a better match between the hazard prediction and the observations.
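The two-stage structure described above (a back-trajectory "first guess" followed by iterative refinement against sensor data) can be illustrated with a much simpler stand-in. The sketch below is not VIRSA: it replaces the puff model with a toy steady-state Gaussian plume, the back-trajectory step with a coarse scan over candidate source locations, and the variational step with a closed-form least-squares fit of source strength. All geometry and parameter values are invented for illustration.

```python
import numpy as np

def plume_surrogate(x0, y0, q, sensors, u=5.0, sigma=40.0):
    """Toy steady-state Gaussian plume: concentration at each sensor for a
    source at (x0, y0) with strength q, wind u along +x, fixed spread sigma.
    A stand-in for a real puff model, not the operational code."""
    c = np.zeros(len(sensors))
    for i, (sx, sy) in enumerate(sensors):
        dx, dy = sx - x0, sy - y0
        if dx > 0:  # only sensors downwind of the source see the plume
            c[i] = q * np.exp(-dy**2 / (2 * sigma**2)) / (2 * np.pi * u * sigma**2)
    return c

# Synthetic observations from a "true" source (unknown to the solver)
sensors = [(x, y) for x in (200.0, 400.0, 600.0) for y in (-100.0, 0.0, 100.0)]
obs = plume_surrogate(0.0, 20.0, 5.0, sensors)

# Stage 1 (first-guess analog): scan candidate crosswind source locations.
# Stage 2 (refinement analog): for each candidate, fit the source strength q
# by least squares against the unit-strength prediction; keep the lowest misfit.
best = (np.inf, None, None)
for y0 in np.linspace(-100.0, 100.0, 41):
    unit = plume_surrogate(0.0, y0, 1.0, sensors)   # unit-strength prediction
    denom = unit @ unit
    q = (unit @ obs) / denom if denom > 0 else 0.0
    cost = float(np.sum((q * unit - obs) ** 2))
    if cost < best[0]:
        best = (cost, y0, q)
cost, y0_est, q_est = best   # recovers the true crosswind offset and mass
```

In this noise-free toy problem the scan recovers the true crosswind location (20 m) and source strength (5.0) exactly; a variational scheme like VIRSA's does the refinement by adjoint-based gradient steps rather than exhaustive scanning.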

  15. Modeling avian exposures to perfluoroalkyl substances in aquatic habitats impacted by historical aqueous film forming foam releases.

    PubMed

    Larson, Emily S; Conder, Jason M; Arblaster, Jennifer A

    2018-06-01

    Releases of Perfluoroalkyl and Polyfluoroalkyl Substances (PFASs) associated with Aqueous Film Forming Foams (AFFFs) have the potential to impact on-site and downgradient aquatic habitats. Dietary exposures of aquatic-dependent birds were modeled for seven PFASs (PFHxA, PFOA, PFNA, PFDA, PFHxS, PFOS, and PFDS) using five different scenarios based on measurements of PFASs obtained from five investigations of sites historically-impacted by AFFF. Exposure modeling was conducted for four avian receptors representing various avian feeding guilds: lesser scaup (Aythya affinis), spotted sandpiper (Actitis macularia), great blue heron (Ardea herodias), and osprey (Pandion haliaetus). For the receptor predicted to receive the highest PFAS exposure (spotted sandpiper), model-predicted exposure to PFOS exceeded a laboratory-based, No Observed Adverse Effect Level exposure benchmark in three of the five model scenarios, confirming that risks to aquatic-dependent avian wildlife should be considered for investigations of historic AFFF releases. Perfluoroalkyl sulfonic acids (PFHxS, PFOS, and PFDS) represented 94% (on average) of total PFAS exposures due to their prevalence in historical AFFF formulations, and increased bioaccumulation in aquatic prey items and partitioning to aquatic sediment relative to perfluoroalkyl carboxylic acids. Sediment-associated PFASs (rather than water-associated PFASs) were the source of the highest predicted PFAS exposures, and are likely to be very important for understanding and managing AFFF site-specific ecological risks. Additional considerations for research needs and site-specific ecological risk assessments are discussed with the goal of optimizing ecological risk-based decision making at AFFF sites and prioritizing research needs. Copyright © 2018 Elsevier Ltd. All rights reserved.
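A generic wildlife dietary-dose calculation of the kind used in such exposure models can be sketched as follows. This is not the paper's model: the equation is the standard dietary dose formula (prey concentration times diet fraction times food ingestion rate, plus incidental sediment ingestion, divided by body weight), and every parameter value below is hypothetical.

```python
def dietary_dose(prey_conc, diet_fractions, food_ingestion, sed_conc,
                 sed_ingestion, body_weight):
    """Daily dietary dose (ng/kg bw/day):
    (sum_i C_prey_i * f_i * IR_food + C_sed * IR_sed) / BW."""
    food = sum(c * f for c, f in zip(prey_conc, diet_fractions)) * food_ingestion
    return (food + sed_conc * sed_ingestion) / body_weight

# Hypothetical sandpiper-like receptor feeding on two invertebrate prey types
dose = dietary_dose(prey_conc=[100.0, 50.0],   # ng PFOS/g in each prey type
                    diet_fractions=[0.7, 0.3], # fraction of diet per prey type
                    food_ingestion=20.0,       # g food/day
                    sed_conc=200.0,            # ng/g in sediment
                    sed_ingestion=2.0,         # g sediment/day (incidental)
                    body_weight=0.05)          # kg
```

The resulting dose would then be compared against a no-observed-adverse-effect benchmark, as done for the spotted sandpiper in the study.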

  16. The nitrogen footprint tool network: a multi-institution program ...

    EPA Pesticide Factsheets

    Anthropogenic sources of reactive nitrogen have local and global impacts on air and water quality and detrimental effects on human and ecosystem health. This paper uses the nitrogen footprint tool (NFT) to determine the amount of nitrogen (N) released as a result of institutional consumption. The sectors accounted for include food (consumption and the upstream production), energy, transportation, fertilizer, research animals, and agricultural research. The NFT is then used for scenario analysis to manage and track reductions to institution N footprints, which are driven by the consumption behaviors of both the institution itself and its constituent individuals. In this paper, the first seven institution N footprint results are presented. The institution NFT network aims to develop footprints for many institutions to encourage widespread upper-level management strategies that will create significant reductions in reactive N released to the environment. Energy use and food purchases are the two largest contributors to institution N footprints. Ongoing efforts by institutions to reduce greenhouse gas emissions also help to reduce the N footprint, but the impact of food production on N pollution has not been directly addressed by the higher-ed sustainability community. The NFT Network found that institutions could reduce their N footprints by optimizing food purchasing to reduce consumption of animal products and minimize food waste, as well as reducing dependence o

  17. Anticipatory Water Management in Phoenix using Advanced Scenario Planning and Analyses: WaterSim 5

    NASA Astrophysics Data System (ADS)

    Sampson, D. A.; Quay, R.; White, D. D.; Gober, P.; Kirkwood, C.

    2013-12-01

    Complexity, uncertainty, and variability are inherent properties of linked social and natural processes; sustainable resource management must somehow consider all three. Typically, a decision support tool (using scenario analyses) is used to examine management alternatives under suspected trajectories in driver variables (e.g., climate forcings, growth or economic projections). This traditional planning focuses on a small set of envisioned scenarios whose outputs are compared against one another in order to evaluate their differing impacts on desired metrics. Human cognition typically limits this to three to five scenarios. However, complex and highly uncertain issues may require more, often many more, than five scenarios. In this case, advanced scenario analysis provides quantitative or qualitative methods that can reveal patterns and associations among scenario metrics for a large ensemble of scenarios. From this analysis, a smaller set of heuristics that describe the complexity and uncertainty revealed provides a basis to guide planning in an anticipatory fashion. Our water policy and management model, termed WaterSim, permits advanced scenario planning and analysis for the Phoenix Metropolitan Area. In this contribution we examine the concepts of advanced scenario analysis on a large-scale ensemble of scenarios using our work with WaterSim as a case study. For this case study we created a range of possible water futures by creating scenarios that encompass differences in water supplies (our surrogates for climate change, drought, and inherent variability in riverine flows), population growth, and per capita water consumption. We used IPCC estimates of plausible future alterations in riverine runoff, locally produced and vetted estimates of population growth projections, and empirical trends in per capita water consumption for metropolitan cities. This ensemble consisted of ~30,700 scenarios (~575k observations).
We compared and contrasted two metropolitan communities that exhibit differing growth projections and water portfolios: moderate growth with a diverse portfolio versus high growth with a more restrictive portfolio. Results illustrate that both communities exhibited an expanding envelope of possible future water outcomes even with rational water management trajectories. However, a more diverse portfolio resulted in a broad, time-insensitive decision space for management interventions. The reverse was true for the more restrictive water portfolio with high growth projections.

  18. Using Qualitative Hazard Analysis to Guide Quantitative Safety Analysis

    NASA Technical Reports Server (NTRS)

    Shortle, J. F.; Allocco, M.

    2005-01-01

    Quantitative methods can be beneficial in many types of safety investigations. However, there are many difficulties in using quantitative methods. For example, there may be little relevant data available. This paper proposes a framework for using qualitative hazard analysis to prioritize hazard scenarios most suitable for quantitative analysis. The framework first categorizes hazard scenarios by severity and likelihood. We then propose another metric, "modeling difficulty", that describes the complexity in modeling a given hazard scenario quantitatively. The combined metrics of severity, likelihood, and modeling difficulty help to prioritize hazard scenarios for which quantitative analysis should be applied. We have applied this methodology to proposed concepts of operations for reduced wake separation for airplane operations at closely spaced parallel runways.
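The prioritization step can be illustrated with a minimal sketch: rank scenarios by a risk score (severity times likelihood), breaking ties in favor of lower modeling difficulty. The scenario names and ordinal ratings below are invented, not taken from the paper.

```python
# Hypothetical ordinal ratings on 1-5 scales (illustrative values only)
scenarios = [
    {"name": "wake vortex encounter",  "severity": 5, "likelihood": 2, "difficulty": 2},
    {"name": "runway incursion",       "severity": 5, "likelihood": 2, "difficulty": 4},
    {"name": "go-around conflict",     "severity": 3, "likelihood": 4, "difficulty": 1},
    {"name": "comms misunderstanding", "severity": 2, "likelihood": 3, "difficulty": 5},
]

# Rank by risk (severity x likelihood) descending; ties go to the scenario
# that is easier to model quantitatively (lower modeling difficulty)
ranked = sorted(scenarios,
                key=lambda s: (-s["severity"] * s["likelihood"], s["difficulty"]))
top = [s["name"] for s in ranked]
```

Scenarios at the head of the list combine high risk with tractable modeling, making them the best candidates for quantitative follow-up.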

  19. Community Near-Port Modeling System (C-PORT): Briefing for ...

    EPA Pesticide Factsheets

    What C-PORT is: Screening level tool for assessing port activities and exploring the range of potential impacts that changes to port operations might have on local air quality; Analysis of decision alternatives through mapping of the likely pattern of potential pollutant dispersion and an estimated change in pollutant concentrations for user-designated scenarios; Designed primarily to evaluate the local air quality impacts of proposed port expansion or modernization, as well as to identify options for mitigating any impacts; Currently includes data from 21 US seaports and features a map-based interface similar to the widely used Google Earth; Still under development, C-PORT is designed as an easy-to-use computer modeling tool for users such as state air quality managers and planners. This is part of our product outreach prior to public release of the model and to solicit additional beta testers.

  20. Feasibility of using microencapsulated phase change materials as filler for improving low temperature performance of rubber sealing materials.

    PubMed

    Tiwari, Avinash; Shubin, Sergey N; Alcock, Ben; Freidin, Alexander B; Thorkildsen, Brede; Echtermeyer, Andreas T

    2017-11-01

    The feasibility of a novel composite rubber sealing material to improve sealing under transient cooling (in a so-called blowdown scenario) is investigated here. A composite of hydrogenated nitrile butadiene rubber (HNBR) filled with microencapsulated phase change materials (MEPCM) is described. The fillers contain phase change materials that release heat during the phase transformation from liquid to solid while cooling. This exotherm locally heats the rubber and may improve the function of the seal during a blowdown event. A representative HNBR-MEPCM composite was made, and its critical thermal and mechanical properties were obtained for simulating the temperature distribution during a blowdown event. Simulations predict that the MEPCM composites can delay the temperature decrease in a region of the seal during the transient blowdown. A sensitivity analysis of material properties is also presented, which highlights possible avenues of improvement of the MEPCMs for sealing applications.
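The thermal effect of such a phase-change filler is commonly modeled with the apparent-heat-capacity method, which folds the latent heat into an effective specific heat over the melting range. The sketch below assumes this standard technique; the property values (specific heat, latent heat, melting point) are illustrative, not the paper's measured data.

```python
def apparent_heat_capacity(T, cp_rubber=1.5, latent=180.0,
                           t_melt=-5.0, half_width=2.0):
    """Apparent-heat-capacity method for a phase-change filler: the MEPCM
    latent heat (J/g) is smeared uniformly over the melting range
    [t_melt - half_width, t_melt + half_width] and added to the baseline
    specific heat cp_rubber (J/(g*K)). All values are illustrative."""
    if t_melt - half_width <= T <= t_melt + half_width:
        return cp_rubber + latent / (2.0 * half_width)
    return cp_rubber
```

Inside the melting window the effective specific heat jumps sharply, which is exactly the mechanism that delays cooling of the seal in a transient simulation.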

  1. Setup, Validation and Quality Control of a Centralized WGS laboratory - Lessons Learned.

    PubMed

    Arnold, Cath; Edwards, Kirstin; Desai, Meeta; Platt, Steve; Green, Jonathan; Conway, David

    2018-04-25

    Routine whole genome sequencing (WGS) analysis for infectious diseases can enlighten various scenarios pertaining to public health, including identification of microbial pathogens; relating individual cases to an outbreak of infectious disease; establishing an association between an outbreak of food poisoning and a specific food vehicle; inferring drug susceptibility; source tracing of contaminants; and studying how variations in the genome affect pathogenicity/virulence. We describe the setup, validation and ongoing verification of a centralised WGS laboratory to carry out the sequencing for these public health functions for the National Infection Services, Public Health England in the UK. The performance characteristics and quality control metrics measured during validation and verification of the entire end-to-end process (accuracy, precision, reproducibility and repeatability) are described, including information regarding the automated pass and release of data to service users without intervention. © Crown copyright 2018.

  2. Dose Assessment of Los Alamos National Laboratory-Derived Residual Radionuclides in Soils within Tract A-18-2 for Land Conveyance and Transfer Decisions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruedig, Elizabeth; Whicker, Jeffrey Jay

    In 2017, soil sampling for radiological materials was conducted within Tract A-18-2 specifically for land conveyance decisions. Measurements of radionuclides in soil samples were evaluated against a recreational use scenario, and all measurements were below screening action levels for each radionuclide. The total estimated dose was less than 1 mrem/y (< 10 μSv/y) for a hypothetical recreational user (compared to a dose limit of 25 mrem/y (250 μSv/y)). Dose estimates were based on the 95% upper confidence limits for radionuclide concentrations within the Tract. Additionally, dose estimates less than 3 mrem/y are considered to be As Low As Reasonably Achievable, so no follow-up analysis was conducted. Release of this property is consistent with the requirements of DOE Order 458.1 and Policy 412.

  3. The vertical metallicity gradients of mono-age stellar populations in the Milky Way with the RAVE and Gaia data

    NASA Astrophysics Data System (ADS)

    Ciucǎ, Ioana; Kawata, Daisuke; Lin, Jane; Casagrande, Luca; Seabroke, George; Cropper, Mark

    2018-03-01

    We investigate the vertical metallicity gradients of five mono-age stellar populations between 0 and 11 Gyr for a sample of 18 435 dwarf stars selected from the cross-matched Tycho-Gaia Astrometric Solution and Radial Velocity Experiment (RAVE) Data Release 5. We find a correlation between the vertical metallicity gradients and age, with no vertical metallicity gradient in the youngest population and an increasingly steeper negative vertical metallicity gradient for the older stellar populations. The metallicity at the disc plane remains almost constant between 2 and 8 Gyr, and it becomes significantly lower for the 8 < τ ≤ 11 Gyr population. The current analysis also reveals that the intrinsic dispersion in metallicity increases steadily with age. We discuss that our results are consistent with a scenario in which (thin) disc stars formed from a flaring (thin) star-forming disc.
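Measuring a vertical metallicity gradient for one mono-age population amounts to a linear fit of [Fe/H] against height |z| above the disc plane. A minimal sketch on synthetic data (the gradient value, noise level, and sample size are illustrative, not the paper's results):

```python
import numpy as np

# Synthetic mono-age sample: [Fe/H] declining with height |z| above the plane
rng = np.random.default_rng(1)
z = rng.uniform(0.0, 1.5, 500)        # |z| in kpc
true_gradient = -0.25                 # dex/kpc, illustrative of an older population
feh = 0.0 + true_gradient * z + rng.normal(0.0, 0.05, 500)  # 0.05 dex scatter

# Least-squares linear fit recovers the vertical metallicity gradient
gradient, intercept = np.polyfit(z, feh, 1)
```

Repeating the fit per age bin yields the gradient-versus-age trend the paper reports; the residual scatter about each fit is an estimate of the intrinsic metallicity dispersion.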

  4. Insights into future air quality: a multipollutant analysis of future scenarios using the MARKAL model

    EPA Science Inventory

    In this presentation, we will provide an update on the development and evaluation of the Air Quality Futures (AQF) scenarios. These scenarios represent widely different assumptions regarding the evolution of the U.S. energy system over the next 40 years. The four AQF scenarios di...

  5. Scenario analysis and path selection of low-carbon transformation in China based on a modified IPAT model.

    PubMed

    Chen, Liang; Yang, Zhifeng; Chen, Bin

    2013-01-01

    This paper presents a forecast and analysis of population, economic development, energy consumption and CO2 emissions variation in China in the short- and long-term steps before 2020 with 2007 as the base year. The widely applied IPAT model, which is the basis for calculations, projections, and scenarios of greenhouse gases (GHGs) reformulated as the Kaya equation, is extended to analyze and predict the relations between human activities and the environment. Four scenarios of CO2 emissions are used including business as usual (BAU), energy efficiency improvement scenario (EEI), low carbon scenario (LC) and enhanced low carbon scenario (ELC). The results show that carbon intensity will be reduced by 40-45% as scheduled and economic growth rate will be 6% in China under LC scenario by 2020. The LC scenario, as the most appropriate and the most feasible scheme for China's low-carbon development in the future, can maximize the harmonious development of economy, society, energy and environmental systems. Assuming China's development follows the LC scenario, the paper further gives four paths of low-carbon transformation in China: technological innovation, industrial structure optimization, energy structure optimization and policy guidance.
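The Kaya decomposition underlying such scenario projections can be made concrete with a short sketch. The input values below are illustrative placeholders, not the paper's calibrated data; they simply show how a 25% energy-intensity improvement combined with 20% decarbonization of energy supply yields a 40% drop in CO2 per unit GDP, the bottom of the 40-45% target range.

```python
def kaya_co2(population, gdp_per_capita, energy_intensity, carbon_intensity):
    """Kaya identity: CO2 = P * (GDP/P) * (E/GDP) * (CO2/E)."""
    return population * gdp_per_capita * energy_intensity * carbon_intensity

# Illustrative BAU inputs (units arbitrary but consistent)
bau = dict(population=1.4e9, gdp_per_capita=1.0e4,
           energy_intensity=5.0e-3, carbon_intensity=2.5)

# Hypothetical LC scenario: 25% better energy intensity, 20% cleaner energy
lc = dict(bau,
          energy_intensity=bau["energy_intensity"] * 0.75,
          carbon_intensity=bau["carbon_intensity"] * 0.80)

# Carbon intensity of GDP (CO2/GDP) = energy_intensity * carbon_intensity,
# so the LC reduction is 1 - 0.75 * 0.80 = 40%
reduction = 1.0 - (kaya_co2(**lc) / kaya_co2(**bau))
```

Because population and GDP per capita cancel in the CO2/GDP ratio, the carbon-intensity target depends only on the energy-intensity and fuel-mix factors, which is why the paper's pathways focus on efficiency and energy-structure optimization.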

  6. Scenario Analysis and Path Selection of Low-Carbon Transformation in China Based on a Modified IPAT Model

    PubMed Central

    Chen, Liang; Yang, Zhifeng; Chen, Bin

    2013-01-01

    This paper presents a forecast and analysis of population, economic development, energy consumption and CO2 emissions variation in China in the short- and long-term steps before 2020 with 2007 as the base year. The widely applied IPAT model, which is the basis for calculations, projections, and scenarios of greenhouse gases (GHGs) reformulated as the Kaya equation, is extended to analyze and predict the relations between human activities and the environment. Four scenarios of CO2 emissions are used including business as usual (BAU), energy efficiency improvement scenario (EEI), low carbon scenario (LC) and enhanced low carbon scenario (ELC). The results show that carbon intensity will be reduced by 40–45% as scheduled and economic growth rate will be 6% in China under LC scenario by 2020. The LC scenario, as the most appropriate and the most feasible scheme for China’s low-carbon development in the future, can maximize the harmonious development of economy, society, energy and environmental systems. Assuming China's development follows the LC scenario, the paper further gives four paths of low-carbon transformation in China: technological innovation, industrial structure optimization, energy structure optimization and policy guidance. PMID:24204922

  7. Polyvinyl alcohol hydrogels for iontophoresis

    NASA Astrophysics Data System (ADS)

    Bera, Prasanta; Alam, Asif Ali; Arora, Neha; Tibarewala, Dewaki Nandan; Basak, Piyali

    2013-06-01

    Transdermal therapeutic systems propound controlled release of active ingredients through the skin into the systemic circulation in a predictive manner. Drugs administered through these systems escape first-pass metabolism and maintain a steady state similar to a continuous intravenous infusion for up to several days. Iontophoresis deals with the systemic delivery of bioactive agents (drugs) by applying an electric current; it is essentially an injection without the needle. The iontophoretic system requires a gel-based matrix to accommodate the bioactive agent. Hydrogels have been used by many investigators in controlled-release drug delivery systems because of their good tissue compatibility and easy manipulation of swelling level and, thereby, solute permeability. In this work we prepared polyvinyl alcohol (PVA) hydrogels by chemically cross-linking polyvinyl alcohol with glutaraldehyde at different wt%. An FTIR study reveals the chemical changes during cross-linking. Swelling in water was measured to estimate drug loading and drug release from the membrane. After loading the hydrogels, we studied their drug release properties using salicylic acid as a model drug.

  8. Accidental Release of Chlorine from a Storage Facility and an On-Site Emergency Mock Drill: A Case Study

    PubMed Central

    Soman, Ambalathumpara Raman; Sundararaj, Gopalswamy

    2015-01-01

    In the current industrial scenario there is a serious need for formulating strategies to handle hazardous substances in the safest way. Manufacture, storage, and use of hazardous substances pose a serious risk to industry, people, and the environment. Accidental release of toxic chemicals can lead to emergencies. An emergency response plan (ERP) is inevitable to minimize the adverse effects of such releases. The on-site emergency plan is an integral component of any process safety and risk management system. This paper deals with an on-site emergency response plan for a chlorine manufacturing industry. It was developed on the basis of a previous study on chlorine release and a full scale mock drill has been conducted for testing the plan. Results indicated that properly trained personnel can effectively handle each level of incidents occurring in the process plant. As an extensive guideline to the district level government authorities for off-site emergency planning, risk zone has also been estimated with reference to a chlorine exposure threshold of 3 ppm. PMID:26171416

  9. Steelhead Kelt Reconditioning and Reproductive Success, 2008 Annual Report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hatch, Douglas R.

    Iteroparity, the ability to repeat spawn, is a natural life history strategy that is expressed by some species from the family Salmonidae. Currently observed steelhead Oncorhynchus mykiss iteroparity rates in the Columbia River Basin are severely depressed due to anthropogenic development, which includes operation of the hydropower system and other habitat degradation. Artificial reconditioning, which is the process of culturing post-spawned fish (kelts) in a captive environment until they are able to reinitiate feeding, growth, and redevelop mature gonads, is evaluated in this study as a method to restore depressed steelhead populations. To test the efficacy of steelhead kelt reconditioning as a management and recovery tool, different scenarios were investigated ranging from very low intensity (collect and transport fish) to high intensity (collect and feed fish in captivity until rematuration). Examinations of gamete and progeny viability were performed for first-time spawners and reconditioned kelt steelhead. We have continued to examine reproductive success of reconditioned kelt steelhead in Omak Creek using microsatellite loci to perform parentage analysis on juvenile O. mykiss. The groundwork has also begun on developing a genetic analysis of the Yakima subbasin in order to determine steelhead kelt contribution by utilizing parentage analysis on a larger scale. A research and study plan has been developed cooperatively with the University of Idaho to determine the feasibility of a steelhead kelt reconditioning program in the Snake River Basin. Analysis of management scenarios indicated that while no-term and short-term reconditioned kelts continue to perform well outmigrating to the ocean, returns from these groups have been low, ranging from 0-12% during 2002-2008. Survival (56%) of fish in the long-term treatment in 2008 was the highest we have observed in this project.
Analyzing the three different management scenarios within the Yakima River subbasin, we determined that long-term reconditioning contributed the highest numbers of fish to the spawning run, and that short-term reconditioning was overall the best of the transport releases to the ocean. However, contributions to the spawning run by the no-term or short-term groups were low in all years. This is the second successful year of kelt gamete and progeny analysis. Initial limited results suggest that reconditioned kelts may show limited improvement in egg quantity and/or quality. There is further evidence to support the successful spawning of steelhead kelts in the wild at Omak Creek. Yakima kelts have been successfully identified to stream of origin using genotypes.

  10. Indirect health costs in ulcerative colitis and Crohn's disease: a systematic review and meta-analysis.

    PubMed

    Kawalec, Paweł; Malinowski, Krzysztof Piotr

    2015-04-01

    The aim of this systematic review was to collect all current data on indirect costs related to inflammatory bowel disease, assess their homogeneity and comparability, and conduct a meta-analysis. Costs were collected from the Medline, Embase and Centre for Reviews and Dissemination databases; the average annual cost per patient was then calculated and expressed in 2013 USD using the consumer price index and purchasing power parity (scenario 1) and then adjusted to specific gross domestic product (scenario 2) to make costs comparable. The studies were then included in quantitative synthesis using meta-analysis and bootstrap methods. This systematic review was carried out and reported in accordance with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses statement. Across 18 publications, overall annual indirect costs per patient resulting from the quantitative synthesis of all studies eligible for meta-analysis ranged from US$2425.01 to US$9622.15, depending on the scenario and model used for analysis. The cost of presenteeism was assessed in only two studies. Considering the heterogeneity among all identified studies, the random-effect model presented the most accurate results of the meta-analysis, equal to US$7189.27 and US$9622.15 per patient per year for scenario 1 and scenario 2, respectively. This systematic review revealed a relatively small number of studies reporting on the great economic burden of the disease upon society. A great variety of methodologies and cost components resulted in a very large discrepancy in indirect costs and made meta-analysis difficult to perform, so two scenarios were considered and meta-analysis was conducted in subgroups to make the data more comparable.
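The scenario 1 standardization (CPI inflation to 2013 price levels, then PPP conversion to USD) can be sketched as follows; the CPI and PPP figures below are invented for illustration, not the review's data.

```python
def to_2013_usd(cost_local, cpi_study_year, cpi_2013, ppp_lcu_per_usd_2013):
    """Standardize an annual cost reported in local currency:
    1) inflate to 2013 price levels using the local consumer price index,
    2) convert to USD using 2013 purchasing power parity
       (local currency units per USD). Inputs are illustrative."""
    inflated = cost_local * (cpi_2013 / cpi_study_year)
    return inflated / ppp_lcu_per_usd_2013

# Hypothetical example: a 5,000 (local currency) indirect cost from a 2008 study,
# with CPI rising from 100 to 110 by 2013 and a PPP of 0.80 LCU per USD
annual_cost = to_2013_usd(cost_local=5000.0, cpi_study_year=100.0,
                          cpi_2013=110.0, ppp_lcu_per_usd_2013=0.80)
```

Applying the same two-step conversion to every included study is what makes per-patient costs from different countries and years comparable in the pooled estimate.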

  11. Assessing green waste route by using Network Analysis

    NASA Astrophysics Data System (ADS)

    Hasmantika, I. H.; Maryono, M.

    2018-02-01

    Green waste, such as waste from parks, needs proper treatment. One of the main problems of green waste management is how to design an optimum collection system. This research aims to determine optimum green waste collection by finding the optimum route among parks. The route was assessed using the network analysis method, with the 20 parks of region five of Semarang city chosen as the case study. To enhance recycling of green waste, three treatment scenarios are proposed: scenario 1 uses one integrated treatment facility as a terminal, scenario 2 uses two sites, and scenario 3 uses three sites. According to the assessment, the route length of scenario 1 is 36.126 km with an estimated collection time of 46 minutes. In scenario 2, the route length is 36.471 km with a travel time of 47 minutes. The route length of scenario 3 is 46.934 km with a collection time of 60 minutes.
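The network analysis step reduces to a shortest-path computation over a weighted road graph. A minimal stdlib sketch using Dijkstra's algorithm on an invented park network (the study itself used a GIS network analysis tool; the nodes and distances below are hypothetical):

```python
import heapq

def route_length(graph, src, dst):
    """Dijkstra shortest path over a dict-of-dicts weighted graph.
    Edge weights are road distances in km."""
    pq = [(0.0, src)]
    best = {src: 0.0}
    done = set()
    while pq:
        d, u = heapq.heappop(pq)
        if u in done:
            continue
        if u == dst:
            return d
        done.add(u)
        for v, w in graph.get(u, {}).items():
            nd = d + w
            if nd < best.get(v, float("inf")):
                best[v] = nd
                heapq.heappush(pq, (nd, v))
    return float("inf")

# Toy network: park nodes with road distances (km), invented for illustration
parks = {
    "depot":  {"park_a": 2.0, "park_b": 4.5},
    "park_a": {"park_b": 1.5, "facility": 6.0},
    "park_b": {"facility": 2.5},
}
dist = route_length(parks, "depot", "facility")
```

Summing such shortest legs over a full collection tour, and dividing by an assumed travel speed, yields the per-scenario route lengths and collection times reported in the study.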

  12. Hydrogen Financial Analysis Scenario Tool (H2FAST); NREL (National Renewable Energy Laboratory)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Melaina, Marc

    This presentation describes the Hydrogen Financial Analysis Scenario Tool, H2FAST, and provides an overview of each of the three H2FAST formats: the H2FAST web tool, the H2FAST Excel spreadsheet, and the H2FAST Business Case Scenario (BCS) tool. Examples are presented to illustrate the types of questions that H2FAST can help answer.

  13. Scenario and multiple criteria decision analysis for energy and environmental security of military and industrial installations.

    PubMed

    Karvetski, Christopher W; Lambert, James H; Linkov, Igor

    2011-04-01

    Military and industrial facilities need secure and reliable power generation. Grid outages can result in cascading infrastructure failures as well as security breaches and should be avoided. Adding redundancy and increasing reliability can require additional environmental, financial, logistical, and other considerations and resources. Uncertain scenarios consisting of emergent environmental conditions, regulatory changes, growth of regional energy demands, and other concerns result in further complications. Decisions on selecting energy alternatives are made on an ad hoc basis. The present work integrates scenario analysis and multiple criteria decision analysis (MCDA) to identify combinations of impactful emergent conditions and to perform a preliminary benefits analysis of energy and environmental security investments for industrial and military installations. Application of a traditional MCDA approach would require significant stakeholder elicitations under multiple uncertain scenarios. The approach proposed in this study develops and iteratively adjusts a scoring function for investment alternatives to find the scenarios with the most significant impacts on installation security. A robust prioritization of investment alternatives can be achieved by integrating stakeholder preferences and focusing modeling and decision-analytical tools on a few key emergent conditions and scenarios. The approach is described and demonstrated for a campus of several dozen interconnected industrial buildings within a major installation. Copyright © 2010 SETAC.
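The iteratively adjusted scoring function described above is, at its core, a weighted additive value function evaluated under a given emergent-conditions scenario. A minimal sketch with invented criteria, weights, and scores (not the study's elicited values):

```python
# Hypothetical weighted-additive MCDA scoring; weights and scores are invented.
# Each alternative is scored 0-1 on each criterion under one scenario.
weights = {"reliability": 0.5, "cost": 0.3, "environment": 0.2}

scores = {
    "diesel backup":   {"reliability": 0.9, "cost": 0.6, "environment": 0.2},
    "solar + storage": {"reliability": 0.7, "cost": 0.5, "environment": 0.9},
    "grid upgrade":    {"reliability": 0.6, "cost": 0.8, "environment": 0.5},
}

def total_value(alternative):
    """Weighted additive value of one investment alternative."""
    return sum(w * scores[alternative][c] for c, w in weights.items())

# Prioritize alternatives for this scenario
ranking = sorted(scores, key=total_value, reverse=True)
```

Re-running the ranking while perturbing the scenario-dependent scores is the mechanism by which the approach identifies which emergent conditions most change the preferred investments.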

  14. A scenario-planning approach to human resources for health: the case of community pharmacists in Portugal.

    PubMed

    Gregório, João; Cavaco, Afonso; Velez Lapão, Luís

    2014-10-13

    Health workforce planning is especially important in a setting of political, social, and economic uncertainty. Portuguese community pharmacists are experiencing such conditions, as well as increasing patient empowerment, a shortage of primary care physicians, and primary health care reforms. This study aims to design three future scenarios for Portuguese community pharmacists, recognizing the changing environment as an opportunity to develop the role that community pharmacists may play in the Portuguese health system. The community pharmacist scenario design followed a three-stage approach. The first stage comprised formulating the relevant questions to be addressed and defining the scenario horizon. The second stage comprised two face-to-face scenario-building workshops, to which 10 experts from practice and academic settings were invited; academic and professional experience was the main selection criterion. The first workshop was meant for context analysis and design of draft scenarios, while the second was aimed at scenario analysis and validation. The final scenarios were built by merging the workshops' information with data collected from the scientific literature, followed by team consensus. The final stage involved scenario development carried out by the authors alone, developing the narratives behind each scenario. The analysis allowed the identification of critical factors expected to have particular influence in 2020 for Portuguese community pharmacists, leading to two critical uncertainties: the "Legislative environment" and the "Ability to innovate and develop services". Three final scenarios were built, namely "Pharmacy-Mall", "e-Pharmacist", and "Reorganize or Die". These scenarios provide possible trends for market needs, pharmacist workforce numbers, and expected qualifications to be developed by future professionals.
In all scenarios, it is clear that the future advancement of Portuguese community pharmacists will depend on the provision of pharmaceutical services beyond medicine dispensing. This innovative professional role will require acquiring or developing competencies in management, leadership, marketing, information technologies, teamwork, and behavioural and communication skills. To accomplish a sustainable evolution, legislative changes and adequate financial incentives will be beneficial. Scenario development proved valuable as a strategic planning tool, not only for understanding future community pharmacist needs in a complex and uncertain environment, but also for other health care professionals.

  15. Simulating post-wildfire forest trajectories under alternative climate and management scenarios.

    PubMed

    Tarancón, Alicia Azpeleta; Fulé, Peter Z; Shive, Kristen L; Sieg, Carolyn H; Meador, Andrew Sánchez; Strom, Barbara

    Post-fire predictions of forest recovery under future climate change and management actions are necessary for forest managers to make decisions about treatments. We applied the Climate-Forest Vegetation Simulator (Climate-FVS), a new version of a widely used forest management model, to compare alternative climate and management scenarios in a severely burned multispecies forest of Arizona, USA. The incorporation of seven combinations of General Circulation Models (GCM) and emissions scenarios altered long-term (100 years) predictions of future forest condition compared to a No Climate Change (NCC) scenario, which forecast a gradual increase to high levels of forest density and carbon stock. In contrast, emissions scenarios that included continued high greenhouse gas releases led to near-complete deforestation by 2111. GCM-emissions scenario combinations that were less severe reduced forest structure and carbon stock relative to NCC. Fuel reduction treatments that had been applied prior to the severe wildfire did have persistent effects, especially under NCC, but were overwhelmed by increasingly severe climate change. We tested six management strategies aimed at sustaining future forests: prescribed burning at 5, 10, or 20-year intervals, thinning 40% or 60% of stand basal area, and no treatment. Severe climate change led to deforestation under all management regimes, but important differences emerged under the moderate scenarios: treatments that included regular prescribed burning fostered low density, wildfire-resistant forests composed of the naturally dominant species, ponderosa pine. Non-fire treatments under moderate climate change were forecast to become dense and susceptible to severe wildfire, with a shift to dominance by sprouting species. Current U.S. forest management requires modeling of future scenarios but does not mandate consideration of climate change effects. 
However, this study showed substantial differences in model outputs depending on climate and management actions. Managers should incorporate climate change into the process of analyzing the environmental effects of alternative actions.

  16. SPECIAL ANALYSIS FOR SLIT TRENCH DISPOSAL OF THE REACTOR PROCESS HEAT EXCHANGERS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hamm, L.; Collard, L.; Aleman, S.

    2012-06-18

The Savannah River National Laboratory (SRNL), in response to a request from Solid Waste Management (SWM), conducted a Special Analysis (SA) to evaluate the performance of nineteen heat exchangers that are to be disposed in the E-Area low level waste facility Slit Trench 9 (ST 9). Although these nineteen heat exchangers were never decontaminated, the majority of the radionuclides in the heat exchanger inventory list were determined to be acceptable for burial because they are less than the 'generic' waste form inventory limits given in the 2008 Performance Assessment (PA) (WSRC, 2008). However, as generic waste, the H-3 and C-14 inventories resulted in unacceptable sum-of-fractions (SOFs). Initial scoping analyses performed by SRNL indicated that if alterations were made to certain external nozzles to mitigate various potential leak paths, acceptable SOFs could be achieved through the use of a 'Special' waste form. This SA provides the technical basis for this new 'Special' waste form and provides the inventory limits for H-3 and C-14 for these nineteen heat exchangers such that the nineteen heat exchangers can be disposed in ST 9. This 'Special' waste form is limited to these nineteen heat exchangers in ST 9 and applies for H-3 and C-14, which are designated as H-3X and C-14X, respectively. The SA follows the same methodology used in the 2008 PA and the 2008 SA except for the modeling enhancements noted below. Infiltration rates above the heat exchangers are identical to those used in the 2008 PA; however, flow through the heat exchangers is unique. Because it is unknown exactly how sealed heat exchanger openings will perform and how surface and embedded contaminants will be released, multiple base cases or scenarios were established to investigate a set of performances. Each scenario consists of flow options (based on the performance of sealed openings) and a near-field release of contaminants (based on corrosion and diffusion performance). 
Two disposal configurations were analyzed where heat exchangers were assumed to be disposed four across and five lengthwise (the 4x5 configuration, with one empty) and three across and seven lengthwise (the 3x7 configuration, with two empty). A large range of conditions was considered. For example, peak well concentrations at the 100-m boundary for H-3 are shown in Figure ES-1 for a wide range of configurations (i.e. release mechanism and degree of sealing options). The maximum contaminant level (MCL) and a 10% SOF goal for H-3 are also shown. The 10% goal was based on an estimated volume fraction that these nineteen heat exchangers would consume in ST 9 and was solely used for scoping purposes to assess disposal feasibility and sealing requirements. Because various line breaks and poor sealing greatly exceeded that 10% goal, the determination was made that mitigating activities were needed, such as protection from line breaks and better sealing. An initial set of scenarios was run to assess the requirements for sealing the heat exchanger openings and the need to ensure that the sealed heat exchangers stayed sealed during transit and disposal operations. After discovering that such mitigating activities were required, additional scenarios were run that included the mitigating activities. Scenarios deemed to have a very low probability of occurrence were excluded from consideration for calculating inventory limits (for example, those scenarios that assumed an instantaneous release of contaminants along with poor sealing). The SA used the most recent K{sub d} values for the C-14 analyses and the most recent Dose Conversion Factors for H-3 and C-14 which have been updated since the 2008 PA was issued. This SA took into account the location and the disposal timing of these heat exchangers. The disposal location is within a small area of the overall Slit Trench unit (about 6% of the total) and is behind a line that is 200 ft from the down-gradient edge of ST 9. 
The disposal timing is assumed to be after July 1, 2012 (because disposals cannot occur until this document is approved and mitigating activities are completed) which means that the disposal occurs after the first time period for the 2008 PA beta-gamma pathway (that time period is from December 1995 until December 2007), thus that pathway time period is not considered. Table ES-1 provides new 'Special' waste form groundwater pathway inventory limits for C-14X and H-3X in the heat exchangers. Inventory limits for generic C-14 and H-3 in the West Slit Trenches are included for comparison. The lowest limit for generic C-14 is 1.9E-1 Ci, while for C-14X it is 2.7E0, an increase of more than 14 times. Because time windows are employed, at later times C-14X exhibits lower limits than those for generic C-14 because with its smaller K{sub d} the C-14 moves much faster. The lowest limit for generic H-3 is 3.6E0 Ci, while for H-3X it is 1.7E3, an increase of almost 500 times.
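The limit increases quoted in this record can be reproduced with a quick calculation. This is a sketch using only the inventory-limit values given in the abstract; the variable names are illustrative, not from the report:

```python
# Inventory limits quoted in the abstract, in curies (Ci)
generic_c14 = 1.9e-1   # lowest generic C-14 limit
special_c14 = 2.7e0    # C-14X 'Special' waste form limit
generic_h3 = 3.6e0     # lowest generic H-3 limit
special_h3 = 1.7e3     # H-3X 'Special' waste form limit

# Ratios behind the quoted phrases "more than 14 times" and "almost 500 times"
c14_increase = special_c14 / generic_c14   # ~14.2x
h3_increase = special_h3 / generic_h3      # ~472x
print(f"C-14X/C-14: {c14_increase:.1f}x, H-3X/H-3: {h3_increase:.0f}x")
```

Running this prints `C-14X/C-14: 14.2x, H-3X/H-3: 472x`, consistent with the abstract's rounded figures.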

  17. Construction, calibration, and validation of the RBM10 water temperature model for the Trinity River, northern California

    USGS Publications Warehouse

    Jones, Edward C.; Perry, Russell W.; Risley, John C.; Som, Nicholas A.; Hetrick, Nicholas J.

    2016-03-31

Augmentation scenarios were based on historical hydrological and meteorological data, combined with prescribed flow and temperature releases from Lewiston Dam provided by the Bureau of Reclamation. Water releases were scheduled to achieve targeted flows of 2,500, 2,800, and 3,200 cubic feet per second in the lower Klamath River from mid-August through late September, coinciding with the upstream migration of adult fall-run Chinook salmon (Oncorhynchus tshawytscha). Water temperatures simulated at river mile 5.7 on the Klamath River showed a 5 °C decrease from the No Action historical baseline, which was near or greater than 23 °C when augmentation began in mid-August. Thereafter, an approximate 1 °C difference among augmentation scenarios emerged, with the decrease in water temperature commensurate with the level of augmentation. All augmentation scenarios simulated water temperatures equal to or less than 21 °C from mid-August through late September. Water temperatures equal to or greater than 23 °C are of particular interest because of a thermal threshold known to inhibit upstream migration of salmon. When temperatures exceed this approximate 23 °C threshold, Chinook salmon are known to congregate in high densities in thermal refugia and show extended residence times, which can potentially trigger epizootic outbreaks of pathogens such as Ichthyophthirius multifiliis (“Ich”) and Flavobacterium columnare (“Columnaris”), which were the causative factors of the Klamath River fish kill in 2002. A model with the ability to simulate water temperatures in response to management actions at the basin scale is a valuable asset for water managers who must make decisions about how best to use limited water resources, which directly affect the state of fisheries in the Klamath Basin.

  18. The K-PG boundary: how geological events lead to collapse of marine primary producers

    NASA Astrophysics Data System (ADS)

    Le Hir, Guillaume; Fluteau, Frédéric; Goddéris, Yves

    2017-04-01

    The cause(s) of the Cretaceous/Paleogene (K-Pg) mass extinction event have been a matter of debate for three decades. A first scenario connects the K-Pg crisis with the Chicxulub impact, while the second scenario invokes the emplacement of the Deccan traps in India as the cause of the K-Pg biodiversity collapse. Pierazzo et al. (1998) estimated that the extraterrestrial bolide led to a near-instantaneous degassing of 880 Gt to 2,960 Gt of CO2 into the atmosphere, together with a massive release of 150 to 460 Gt of SO2. Self et al. (2006, 2008) and Chenet et al. (2009) suggested that the emplacement of the Deccan traps released 15,000 Gt to 35,000 Gt of CO2 and 6,800 Gt to 17,000 Gt of SO2 over a 250 kyr-long period (Schoene et al., 2015). To decipher and quantify the long-term environmental consequences of both events, we tested different scenarios: a pulse-like magmatic degassing, a bolide impact, and a combination of both. To understand the environmental changes and quantify biodiversity responses, we improved GEOCLIM, a coupled climate-carbon numerical model, by implementing a biodiversity model in which marine species are described by specific death/birth rates, sensitivity to abiotic factors (temperature, pH, dissolved O2, calcite saturation state), and feeding relationships; each of these characteristics is assigned randomly. Preliminary simulations accounting for the eruption of the Deccan traps show that successive cooling events (the effect of sulphate aerosols) combined with a progressive acidification of surface water (caused by CO2 and SO2 injections) cause a major collapse of marine biomass. Additional simulations incorporating the Chicxulub impact and different community structures of primary producers will be discussed.

  19. Personal Breathing Zone Exposures among Hot-Mix Asphalt Paving Workers; Preliminary Analysis for Trends and Analysis of Work Practices That Resulted in the Highest Exposure Concentrations

    PubMed Central

    Osborn, Linda V.; Snawder, John E.; Kriech, Anthony J.; Cavallari, Jennifer M.; McClean, Michael D.; Herrick, Robert F.; Blackburn, Gary R.; Olsen, Larry D.

    2015-01-01

    An exposure assessment of hot-mix asphalt (HMA) paving workers was conducted to determine which of four exposure scenarios impacted worker exposure and dose. Goals of this report are to present the personal-breathing zone (PBZ) data, discuss the impact of substituting the releasing/cleaning agent, and discuss work practices that resulted in the highest exposure concentration for each analyte. One-hundred-seven PBZ samples were collected from HMA paving workers on days when diesel oil was used as a releasing/cleaning agent. An additional 36 PBZ samples were collected on days when B-100 (100% biodiesel, containing no petroleum-derived products) was used as a substitute releasing/cleaning agent. Twenty-four PBZ samples were collected from a reference group of concrete workers, who also worked in outdoor construction but had no exposure to asphalt emissions. Background and field blank samples were also collected daily. Total particulates and the benzene soluble fraction were determined gravimetrically. Total organic matter was determined using gas chromatography (GC) with flame ionization detection and provided qualitative information about other exposure sources contributing to worker exposure besides asphalt emissions. Thirty-three individual polycyclic aromatic compounds (PACs) were determined using GC with time-of-flight mass spectrometry; results were presented as either the concentration of an individual PAC or a summation of the individual PACs containing either 2- to 3-rings or 4- to 6-rings. Samples were also screened for PACs containing 4- to 6-rings using fluorescence spectroscopy. Arithmetic means, medians, and box plots of the PBZ data were used to evaluate trends in the data. Box plots illustrating the diesel oil results were more variable than the B-100. Also, the highest diesel oil results were much higher in concentration than the highest B-100 results. 
An analysis of the highest exposure results and field notes revealed a probable association between these exposures and the use of diesel oil, use of a diesel-powered screed, elevated HMA paving application temperatures, lubricating and working on broken-down equipment, and operation of a broom machine. PMID:24195533

  20. Personal breathing zone exposures among hot-mix asphalt paving workers; preliminary analysis for trends and analysis of work practices that resulted in the highest exposure concentrations.

    PubMed

    Osborn, Linda V; Snawder, John E; Kriech, Anthony J; Cavallari, Jennifer M; McClean, Michael D; Herrick, Robert F; Blackburn, Gary R; Olsen, Larry D

    2013-01-01

    An exposure assessment of hot-mix asphalt (HMA) paving workers was conducted to determine which of four exposure scenarios impacted worker exposure and dose. Goals of this report are to present the personal-breathing zone (PBZ) data, discuss the impact of substituting the releasing/cleaning agent, and discuss work practices that resulted in the highest exposure concentration for each analyte. One-hundred-seven PBZ samples were collected from HMA paving workers on days when diesel oil was used as a releasing/cleaning agent. An additional 36 PBZ samples were collected on days when B-100 (100% biodiesel, containing no petroleum-derived products) was used as a substitute releasing/cleaning agent. Twenty-four PBZ samples were collected from a reference group of concrete workers, who also worked in outdoor construction but had no exposure to asphalt emissions. Background and field blank samples were also collected daily. Total particulates and the benzene soluble fraction were determined gravimetrically. Total organic matter was determined using gas chromatography (GC) with flame ionization detection and provided qualitative information about other exposure sources contributing to worker exposure besides asphalt emissions. Thirty-three individual polycyclic aromatic compounds (PACs) were determined using GC with time-of-flight mass spectrometry; results were presented as either the concentration of an individual PAC or a summation of the individual PACs containing either 2- to 3-rings or 4- to 6-rings. Samples were also screened for PACs containing 4- to 6-rings using fluorescence spectroscopy. Arithmetic means, medians, and box plots of the PBZ data were used to evaluate trends in the data. Box plots illustrating the diesel oil results were more variable than the B-100. Also, the highest diesel oil results were much higher in concentration than the highest B-100 results. 
An analysis of the highest exposure results and field notes revealed a probable association between these exposures and the use of diesel oil, use of a diesel-powered screed, elevated HMA paving application temperatures, lubricating and working on broken-down equipment, and operation of a broom machine.

  1. Anatomy of the hyper-runaway star LP 40-365 with Gaia

    NASA Astrophysics Data System (ADS)

    Raddi, R.; Hollands, M. A.; Gänsicke, B. T.; Townsley, D. M.; Hermes, J. J.; Gentile Fusillo, N. P.; Koester, D.

    2018-06-01

    LP 40-365 (aka GD 492) is a nearby low-luminosity hyper-runaway star with an extremely unusual atmospheric composition, which has been proposed as the remnant of a white dwarf that survived a subluminous Type Ia supernova (SN Ia) in a single-degenerate scenario. Adopting the Gaia Data Release (DR2) parallax, ϖ = 1.58 ± 0.03 mas, we estimate a radius of 0.18 ± 0.01 R⊙, confirming LP 40-365 as a subluminous star that is ≃ 15 times larger than a typical white dwarf and is compatible with the SN Ia remnant scenario. We present an updated kinematic analysis, making use of the Gaia parallax and proper motion, and confirm that LP 40-365 is leaving the Milky Way at about 1.5 times the escape velocity of the Solar neighbourhood with a rest-frame velocity of 852 ± 10 km s-1. Integrating the past trajectories of LP 40-365, we confirm it crossed the Galactic disc 5.0 ± 0.3 Myr ago in the direction of Carina, likely coming from beneath the plane. Finally, we estimate that LP 40-365 was ejected from its progenitor binary with a velocity of at least 600 km s-1, which is compatible with theoretical predictions for close binaries containing a white dwarf and a helium-star donor.
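As a rough check on the figures in this record, the quoted Gaia DR2 parallax can be inverted to a distance. This is a sketch under the assumption that a simple 1/ϖ inversion with first-order error propagation is adequate, which is reasonable here because the fractional parallax error is only about 2%:

```python
# Gaia DR2 parallax quoted in the abstract
parallax_mas = 1.58       # milliarcseconds
parallax_err_mas = 0.03

# Distance in parsecs from the standard d [pc] = 1000 / parallax [mas] relation
d_pc = 1000.0 / parallax_mas                        # ~633 pc
# First-order propagated uncertainty: sigma_d = d * (sigma_parallax / parallax)
d_err_pc = d_pc * parallax_err_mas / parallax_mas   # ~12 pc
print(f"d = {d_pc:.0f} +/- {d_err_pc:.0f} pc")
```

This prints `d = 633 +/- 12 pc`, placing LP 40-365 well within the distance range where the quoted radius and kinematic constraints can be derived from the Gaia astrometry.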

  2. Possible impacts of sequestration on federal research

    NASA Astrophysics Data System (ADS)

    Showstack, Randy

    2012-10-01

    U.S. federal research and development (R&D) activities could be reduced by up to $57.5 billion, or 8.4%, through 2017 because of automatic reductions in U.S. federal funding, referred to as sequestration, that are set to begin in January 2013 under the 2011 Budget Control Act. That is according to a 27 September analysis by the American Association for the Advancement of Science (AAAS). If defense R&D is pulled from the equation, sequestration could cut nondefense R&D by $50.8 billion, or 17.2%, through that same time period, according to AAAS. Under an equal allocation scenario, the Department of Energy could lose $4.6 billion for R&D over that time period, the National Science Foundation could lose $2.1 billion for R&D, and NASA could lose $3.5 billion, according to the analysis, which also notes that states could be hit hard by decreased federal R&D spending. Congressional leaders currently are looking into how to avoid sequestration. For more information, see http://www.aaas.org/news/releases/2012/0928sequester.shtml.

  3. Minimizing makespan in a two-stage flow shop with parallel batch-processing machines and re-entrant jobs

    NASA Astrophysics Data System (ADS)

    Huang, J. D.; Liu, J. J.; Chen, Q. X.; Mao, N.

    2017-06-01

    Against a background of heat-treatment operations in mould manufacturing, a two-stage flow-shop scheduling problem is described for minimizing makespan with parallel batch-processing machines and re-entrant jobs. The weights and release dates of jobs are non-identical, but job processing times are equal. A mixed-integer linear programming model is developed and tested with small-scale scenarios. Given that the problem is NP-hard, three heuristic construction methods with polynomial complexity are proposed. The worst case of the new constructive heuristic is analysed in detail. A method for computing lower bounds is proposed to test heuristic performance. Heuristic efficiency is tested with sets of scenarios. Compared with the two improved heuristics, the performance of the new constructive heuristic is superior.

  4. KSC-2013-4494

    NASA Image and Video Library

    2013-12-20

    MORRO BAY, Calif. – The SpaceX Dragon test article tumbles over the Pacific Ocean, off the coast of Morro Bay, Calif., following its release from an Erickson Sky Crane helicopter. SpaceX engineers induced the tumble to evaluate the spacecraft's parachute deployment system in an emergency abort scenario. The test is part of a milestone under its Commercial Crew Integrated Capability agreement with NASA's Commercial Crew Program. Photo credit: NASA/Kim Shiflett

  5. KSC-2013-4492

    NASA Image and Video Library

    2013-12-20

    MORRO BAY, Calif. – An Erickson Sky Crane helicopter releases the SpaceX Dragon test article, inducing a tumble similar to what is expected in an emergency abort scenario, over the Pacific Ocean, off the coast of Morro Bay, Calif. The test allowed engineers to better evaluate the spacecraft's parachute deployment system as part of a milestone under its Commercial Crew Integrated Capability agreement with NASA's Commercial Crew Program. Photo credit: NASA/Kim Shiflett

  6. KSC-2013-4493

    NASA Image and Video Library

    2013-12-20

    MORRO BAY, Calif. – The SpaceX Dragon test article tumbles over the Pacific Ocean, off the coast of Morro Bay, Calif., following its release from an Erickson Sky Crane helicopter. SpaceX engineers induced the tumble to evaluate the spacecraft's parachute deployment system in an emergency abort scenario. The test is part of a milestone under its Commercial Crew Integrated Capability agreement with NASA's Commercial Crew Program. Photo credit: NASA/Kim Shiflett

  7. Effects of a Nutrient Enriched Beverage on Host Defense Mechanisms of Soldiers Completing the Special Forces Assessment and Selection School.

    DTIC Science & Technology

    2000-01-01

    Nutrition and Biochemistry Division, United States Army Research Institute of Environmental Medicine, Natick, Massachusetts 01760-5007, and *Ross Products Division of Abbott Laboratories, Medical Nutrition R&D, Columbus, OH 43215. DISTRIBUTION STATEMENT A: Approved for Public Release; Distribution...that specific nutritional supplements may enhance immune function, especially under stressful conditions or scenarios.

  8. Religious Accommodation for Military Members in the Twenty-First Century

    DTIC Science & Technology

    2012-02-01

    ...order and discipline is speculative. He presented a scenario where a female Airman had permission to wear her hijab indoors. When she transferred to the... Lieutenant Colonel, USAF. A Research Report Submitted to the Faculty in Partial Fulfillment of the Graduation Requirements. Advisor: Dr. Stefan Eisen, Jr., Colonel, USAF (Retired). Maxwell Air Force Base, Alabama, February 2012. DISTRIBUTION A: Approved for public release; distribution unlimited.

  9. Vapor tagging of electric blasting caps with perfluorinated compounds. [For crime detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Senum, G I; Gergley, R P; Greene, M

    Vapor tagging of electric blasting caps (EBC) is accomplished with the use of perfluorocarbon taggants. These taggants are absorbed either in the present EBC end closures or in substitute fluoroelastomeric end closures, to approximately 5 to 10% of the total weight of the end closure. The specific taggants have been chosen to allow a 0.5 to 5 nanoliter per minute vapor taggant emission rate from the tagged EBC over a 5-year lifetime. The taggant emission rates from tagged EBC have been experimentally observed to be well described by a taggant emission rate model. This model provides for experimental selection of the proper taggant for projected lifetimes of ten years based on just several months of observed emission measurements. Another model has been derived which can predict the taggant concentrations in various realistic scenarios such as rooms, buildings, lockers, etc. The model takes into consideration the effect of barriers such as boxes, suitcases, etc., in impeding the release of the taggant vapors from the tagged EBC into the scenario, and the dilution effect of the scenario's air circulation system. Taggant concentrations have been experimentally determined using a 425 liter sampling chamber with various barriers, and the results are used with the model to predict various scenario taggant concentrations.

  10. Environmental Risk Evaluation System – An Approach to Ranking Risk of Ocean Energy Development on Coastal and Estuarine Environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Copping, Andrea E.; Hanna, Luke A.; Van Cleve, Frances B.

    Deployment and operation of ocean energy devices does not represent the first foray into industrialization of the oceans; shipping, nearshore development, waste disposal, subsea mining, oil and gas extraction, and large-scale commercial fishing all coexist in various states of equilibrium with the marine environment. In most cases these industries were developed without a clear understanding of the likely outcomes of large-scale development. In virtually every country where the harvest of ocean energy is emerging, regulators and stakeholders require that the industry examine potential effects of devices, minimize the footprint of effects, and provide management measures that either avoid the impacts or mitigate to further reduce the residual impacts. The ERES analysis is based on scenarios that are consistent with sequences of events that lead to adverse impacts, distinguishing between episodic, intermittent, and chronic risks. In the context of ocean energy development, an episodic scenario might involve the exceedingly rare but potentially devastating event of an oil spill from vessels caused by the presence of the device, while vulnerable receptors are present; understanding the risk of such a scenario involves determining the probability of the occurrence by examining factors such as the petroleum content of ocean energy devices, the vessel traffic volume and the proximity of shipping lanes to the ocean energy devices, the reliability of the control measures to avoid an episodic event, and the likely presence of seabirds, marine mammals, or fish that may be affected by oil. In contrast, chronic risk scenarios involve events or circumstances that are continuous, so that risk characterization involves assessing only the severity of the consequences. 
An example of a chronic risk scenario might be the toxicity to marine organisms due to low-level chemical releases from anti-biofouling paints and coatings that may be used on devices, and the effect that the level of toxicity may have on marine flora and fauna. Between these two extremes are intermittent events, such as encounters between fish and rotating tidal turbine blades that will occur only when fish are present and the tidal device is turning. A key feature of understanding risk is describing the uncertainty associated with the occurrence of an episodic, intermittent, or chronic event, as well as the uncertainty of the resulting consequences.

  11. Model Performance Evaluation and Scenario Analysis (MPESA) Tutorial

    EPA Science Inventory

    This tool consists of two parts: model performance evaluation and scenario analysis (MPESA). The model performance evaluation consists of two components: model performance evaluation metrics and model diagnostics. These metrics provide modelers with statistical goodness-of-fit m...

  12. Climate-based archetypes for the environmental fate assessment of chemicals.

    PubMed

    Ciuffo, Biagio; Sala, Serenella

    2013-11-15

    Emissions of chemicals have been on the rise for years, and their impacts are greatly influenced by spatial differentiation. Chemicals are usually emitted locally but their impact can be felt both locally and globally, due to their chemical properties and persistence. The variability of environmental parameters in the emission compartment may affect the chemicals' fate and exposure by orders of magnitude. The assessment of the environmental fate of chemicals and the inherent spatial differentiation requires the use of multimedia models at various levels of complexity (from a simple box model to complex computational and high-spatial-resolution models). The objective of these models is to support ecological and human health risk assessment, by reducing the uncertainty of chemical impact assessments. The parameterisation of spatially resolved multimedia models is usually based on scenarios of evaluative environments, or on geographical resolutions related to administrative boundaries (e.g. countries/continents) or landscape areas (e.g. watersheds, eco-regions). The choice of the most appropriate scale and scenario is important from a management perspective, as a balance should be reached between a simplified approach and computationally intensive multimedia models. In this paper, which aims to go beyond the more traditional approach based on scale/resolution (cell, country, and basin), we propose and assess climate-based archetypes for the impact assessment of chemicals released into air. We define the archetypes based on the main drivers of spatial variability, which we systematically identify by adopting global sensitivity analysis techniques. A case study that uses the high resolution multimedia model MAPPE (Multimedia Assessment of Pollutant Pathways in the Environment) is presented. 
Results of the analysis showed that suitable archetypes should be both climate- and chemical-specific, as different chemicals (or groups of them) have different traits that influence their spatial variability. This hypothesis was tested by comparing the variability of the output of MAPPE for four different climatic zones on four different continents for four different chemicals (which represent different combinations of physical and chemical properties). Results showed the high suitability of climate-based archetypes in assessing the impacts of chemicals released in air. However, further research work is still necessary to test these findings. Copyright © 2013 Elsevier Ltd. All rights reserved.

  13. Wall conditioning in ASDEX Upgrade

    NASA Astrophysics Data System (ADS)

    Rohde, V.; Dux, R.; Kallenbach, A.; Krieger, K.; Neu, R.; ASDEX Upgrade Team

    2007-06-01

    An overview of wall conditioning in ASDEX Upgrade is presented. Helium glow discharges (HeGD) are needed mostly for plasma start-up after high density discharges, disruptions and disruption mitigation gas puffs. Boronisation is routinely applied. The reduction of the oxygen content is a minor effect. Strong variation of the wall pumping is observed for tungsten first wall materials. The uncoated tungsten surface stores and releases large amounts of He, which can disturb the plasma. The released He causes this variation in wall pumping. By reducing HeGD this effect could be minimized. Advanced and natural density scenarios are sensitive to the status of the wall coating. Accumulation of impurities at the pedestal influences the ELM frequency and finally causes radiation unstable discharges.

  14. The Role of Deposition in Limiting the Hazard Extent of Dense-Gas Plumes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dillon, M B

    2008-05-11

    Accidents that involve large (multi-ton) releases of toxic industrial chemicals and form dense-gas clouds often yield far fewer fatalities, casualties and environmental effects than standard assessment and emergency response models predict. This modeling study, which considers both dense-gas turbulence suppression and deposition to environmental objects (e.g. buildings), demonstrates that dry deposition to environmental objects may play a significant role in reducing the distance at which adverse impacts occur, particularly under low-wind, stable atmospheric conditions, which are often considered the worst case for these types of releases. The degree to which the released chemical sticks to (or reacts with) environmental surfaces is likely a key parameter controlling hazard extents. In all modeled cases, deposition to the vertical surfaces of environmental objects (e.g. building walls) was more efficient in reducing atmospheric chemical concentrations than deposition to the earth's surface. This study suggests that (1) hazard extents may vary widely by release environment (e.g. grasslands vs. suburbia) and release conditions (e.g. sunlight or humidity may change the rate at which chemicals react with a surface) and (2) greenbelts (or similar structures) may dramatically reduce the impacts of large-scale releases. While these results are demonstrated to be qualitatively consistent with the downwind extent of vegetation damage in two chlorine releases, critical knowledge gaps exist and this study provides recommendations for additional experimental studies.

  15. Modeling the effect of snow and ice on the global environmental fate and long-range transport potential of semivolatile organic compounds.

    PubMed

    Stocker, Judith; Scheringer, Martin; Wegmann, Fabio; Hungerbuhler, Konrad

    2007-09-01

    Snow and ice have been implemented in a global multimedia box model to investigate the influence of these media on the environmental fate and long-range transport (LRT) of semivolatile organic compounds (SOCs). Investigated compounds include HCB, PCB28, PCB180, PBDE47, PBDE209, alpha-HCH, and dacthal. In low latitudes, snow acts as a transfer medium taking up chemicals from air and releasing them to water or soil during snowmelt. In high latitudes, snow and ice shield water, soil, and vegetation from chemical deposition. In the model version including snow and ice (scenario 2), the mass of chemicals in soil in high latitudes is between 27% (HCB) and 97% (alpha-HCH) of the mass calculated with the model version without snow and ice (scenario 1). Amounts in Arctic seawater in scenario 2 are 8% (alpha-HCH) to 21% (dacthal) of the amounts obtained in scenario 1. For all investigated chemicals except alpha-HCH, presence of snow and ice in the model increases the concentration in air by a factor of 2 (HCB) to 10 (PBDE209). Because of reduced net deposition to snow-covered surfaces in high latitudes, LRT to the Arctic is reduced for most chemicals whereas transport to the south is more pronounced than in scenario 1 ("southward shift"). The presence of snow and ice thus considerably changes the environmental fate of SOCs.
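    The "transfer medium" role described above can be caricatured in a toy two-box mass balance. The uptake fraction, the number of winter steps, and the complete transfer at melt are all invented for illustration and are not the model's actual parameterisation:

```python
# Toy sketch: in winter the snowpack scavenges a fraction of the
# airborne chemical mass; at snowmelt the stored mass is transferred
# to soil/water -- the low-latitude "transfer medium" behaviour.

def winter_step(air, snow, uptake=0.3):
    """One winter time step: snow takes up a fraction of the air burden."""
    taken = uptake * air
    return air - taken, snow + taken

def snowmelt(snow, soil):
    """At melt, the entire stored burden is released to soil."""
    return 0.0, soil + snow

air, snow, soil = 100.0, 0.0, 0.0   # arbitrary mass units
for _ in range(3):                  # three winter time steps
    air, snow = winter_step(air, snow)
snow, soil = snowmelt(snow, soil)
print(round(air, 1), round(soil, 1))
```

In the high-latitude regime the same structure would instead shield soil, i.e. deposition to snow would be routed back to air on sublimation rather than delivered to soil.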

  16. Size-separated particle fractions of stainless steel welding fume particles - A multi-analytical characterization focusing on surface oxide speciation and release of hexavalent chromium.

    PubMed

    Mei, N; Belleville, L; Cha, Y; Olofsson, U; Odnevall Wallinder, I; Persson, K-A; Hedberg, Y S

    2018-01-15

    Welding fume of stainless steels is potentially health hazardous. The aim of this study was to investigate the manganese (Mn) and chromium (Cr) speciation of welding fume particles and their extent of metal release relevant for an inhalation scenario, as a function of particle size, welding method (manual metal arc welding, metal arc welding using an active shielding gas), different electrodes (solid wires and flux-cored wires) and shielding gases, and base alloy (austenitic AISI 304L and duplex stainless steel LDX2101). Metal release investigations were performed in phosphate buffered saline (PBS), pH 7.3, at 37 °C for 24 h. The particles were characterized by means of microscopic, spectroscopic, and electroanalytical methods. Cr was predominantly released from particles of the welding fume when exposed to PBS [3-96% of the total amount of Cr, of which up to 70% as Cr(VI)], followed by Mn, nickel, and iron. Duplex stainless steel welded with a flux-cored wire generated the welding fume that released the most Cr(VI). Nano-sized particles released a significantly higher amount of nickel compared with micron-sized particle fractions. The welding fume did not contain any solitary known chromate compounds, but multi-elemental highly oxidized oxide(s) (iron, Cr, and Mn, possibly bismuth and silicon). Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  17. Joint release rate estimation and measurement-by-measurement model correction for atmospheric radionuclide emission in nuclear accidents: An application to wind tunnel experiments.

    PubMed

    Li, Xinpeng; Li, Hong; Liu, Yun; Xiong, Wei; Fang, Sheng

    2018-03-05

    The release rate of atmospheric radionuclide emissions is a critical factor in the emergency response to nuclear accidents. However, there are unavoidable biases in radionuclide transport models, leading to inaccurate estimates. In this study, a method that simultaneously corrects these biases and estimates the release rate is developed. Our approach provides a more complete measurement-by-measurement correction of the biases with a coefficient matrix that accounts for both deterministic and stochastic deviations. This matrix and the release rate are jointly solved for by an alternating minimization algorithm. The proposed method is generic because it does not rely on specific features of transport models or scenarios. It is validated against wind tunnel experiments that simulate accidental releases at a heterogeneous and densely built nuclear power plant site. The sensitivities to the position, number, and quality of measurements, as well as the extensibility of the method, are also investigated. The results demonstrate that this method effectively corrects the model biases and therefore outperforms Tikhonov's method in both release rate estimation and model prediction. The proposed approach is robust to uncertainties and extensible with various center estimators, thus providing a flexible framework for robust source inversion in real accidents, even if large uncertainties exist in multiple factors. Copyright © 2017 Elsevier B.V. All rights reserved.
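    A minimal sketch of the joint-estimation idea, under the simplifying assumption of a scalar release rate and a linear source-receptor relation y ≈ diag(c)·A·q, with a quadratic penalty pulling the per-measurement corrections c toward 1 so the problem stays identifiable. The penalty, the synthetic data, and all parameter values are assumptions for illustration, not the paper's formulation:

```python
import numpy as np

# Synthetic setup: measurements y_i = c_i * A_i * q, where A_i are
# transport-model source-receptor coefficients, q the release rate,
# and c_i per-measurement multiplicative model biases.
rng = np.random.default_rng(0)
n_meas = 50
A = rng.uniform(0.1, 1.0, n_meas)                # transport coefficients
q_true = 5.0                                     # true release rate
c_true = 1.0 + 0.3 * rng.standard_normal(n_meas)  # hidden biases
y = c_true * A * q_true

# Alternating minimization of  sum_i (y_i - c_i A_i q)^2 + lam (c_i - 1)^2:
# fix c and solve the least-squares problem for q, then fix q and update
# each c_i in closed form (shrunk toward 1 by the penalty).
lam = 1.0
c = np.ones(n_meas)
q = 1.0
for _ in range(100):
    g = c * A
    q = float(g @ y / (g @ g))                   # q-step (least squares)
    m = A * q
    c = (y * m + lam) / (m * m + lam)            # c-step (closed form)

print(round(q, 2))
```

Without the penalty the scale of q and c would be jointly unidentifiable; any regularizer breaking that symmetry plays the role the shrinkage term plays here.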

  18. Designing Peace and Conflict Exercises: Level of Analysis, Scenario, and Role Specification

    ERIC Educational Resources Information Center

    Bartels, Elizabeth; McCown, Margaret; Wilkie, Timothy

    2013-01-01

    Attentiveness to and transparency about the methodological implications of the level of analysis selected for peace and conflict exercises constitute essential elements of good game design. The article explores the impact of level of analysis choices in the context of two key portions of exercises, scenario construction and role specification. It…

  19. Isotopic Composition of Carbon Dioxide Released from Confidence Hills Sediment as Measured by the Sample Analysis at Mars (SAM) Quadrupole Mass Spectrometer

    NASA Technical Reports Server (NTRS)

    Franz, H. B.; Mahaffy, P. R.; Stern, J.; Archer, P., Jr.; Conrad, P.; Eigenbrode, J.; Freissinet, C.; Glavin, D.; Grotzinger, J. P.; Jones, J.

    2015-01-01

    In October 2014, the Mars Science Laboratory (MSL) "Curiosity" rover drilled into the sediment at the base of Mount Sharp in a location named Confidence Hills (CH). CH marked the fifth sample processed by the Sample Analysis at Mars (SAM) instrument suite since Curiosity arrived in Gale Crater, with previous analyses performed at Rocknest (RN), John Klein (JK), Cumberland (CB), and Windjana (WJ). Evolved gas analysis (EGA) of all samples has indicated H2O as well as O-, C- and S-bearing phases in the samples, often at abundances that would be below the detection limit of the CheMin instrument. By examining the temperatures at which gases are evolved from samples, SAM EGA data can help provide clues to the mineralogy of volatile-bearing phases when their identities are unclear to CheMin. SAM may also detect gases evolved from amorphous material in solid samples, which is not suitable for analysis by CheMin. Finally, the isotopic composition of these gases may suggest possible formation scenarios and relationships between phases. We will discuss C isotope ratios of CO2 evolved from the CH sample as measured with SAM's quadrupole mass spectrometer (QMS) and draw comparisons to samples previously analyzed by SAM.

  20. Linking global scenarios to national assessments: Experiences from the Resources Planning Act (RPA) Assessment

    Treesearch

    Linda L. Langner; Peter J. Ince

    2012-01-01

    The Resources Planning Act (RPA) Assessment provides a nationally consistent analysis of the status and trends of the Nation's renewable forest resources. A global scenario approach was taken for the 2010 RPA Assessment to provide a shared world view of potential futures. The RPA Assessment scenarios were linked to the global scenarios and climate projections used...

  1. Inadvertent Intruder Analysis For The Portsmouth On-Site Waste Disposal Facility (OSWDF)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Frank G.; Phifer, Mark A.

    2014-01-22

    The inadvertent intruder analysis considers the radiological impacts to hypothetical persons who are assumed to inadvertently intrude on the Portsmouth OSWDF site after institutional control ceases 100 years after site closure. For the purposes of this analysis, we assume that the waste disposal in the OSWDF occurs at time zero, the site is under institutional control for the next 100 years, and inadvertent intrusion can occur over the following 1,000-year period. Disposal of low-level radioactive waste in the OSWDF must meet a requirement to assess impacts on such individuals and demonstrate that the effective dose equivalent to an intruder would not likely exceed 100 mrem per year for scenarios involving continuous (i.e. chronic) exposure or 500 mrem for scenarios involving a single acute exposure. The focus in developing exposure scenarios for inadvertent intruders was on selecting reasonable events that may occur, giving consideration to regional customs and construction practices. An important assumption in all scenarios is that an intruder has no prior knowledge of the existence of a waste disposal facility at the site. Results of the analysis show that a hypothetical inadvertent intruder at the OSWDF who, in the worst-case scenario, resides on the site and consumes vegetables from a garden established on the site using contaminated soil (chronic agriculture scenario) would receive a maximum chronic dose of approximately 7.0 mrem/yr during the 1,000-year period of assessment. This dose falls well below the DOE chronic dose limit of 100 mrem/yr. The analysis also showed that a hypothetical inadvertent intruder who, in the worst-case scenario, excavates a basement in the soil that reaches the waste (acute basement construction scenario) would receive a maximum acute dose of approximately 0.25 mrem during the 1,000-year period of assessment. This dose falls well below the DOE acute dose limit of 500 mrem. 
Disposal inventory constraints based on the intruder analysis are well above conservative estimates of the OSWDF inventory; based on the intruder disposal limits, only about 7% of the disposal capacity is reached with the estimated OSWDF inventory.

  2. Accuracy Analysis and Parameters Optimization in Urban Flood Simulation by PEST Model

    NASA Astrophysics Data System (ADS)

    Keum, H.; Han, K.; Kim, H.; Ha, C.

    2017-12-01

    The risk of urban flooding has been increasing due to heavy rainfall, flash flooding and rapid urbanization. Rainwater pumping stations and underground reservoirs are used to actively counter flooding; nevertheless, flood damage in lowland areas continues to occur. Inundation in urban areas has resulted in overflow of the sewer network. Accurate two-dimensional flood analysis therefore requires implementing the pipe network that is intricately entangled within a city close to the actual physical situation, together with accurate terrain, because buildings and roads affect the flow. The purpose of this study is to propose an optimal scenario-construction procedure for watershed partitioning and parameterization for urban runoff analysis and pipe-network analysis, and to increase the accuracy of flooded-area prediction through a coupled model. The procedure was verified by applying it to an actual drainage basin in Seoul. In this study, optimization was performed using four parameters: Manning's roughness coefficient for conduits, watershed width, Manning's roughness coefficient for impervious areas, and Manning's roughness coefficient for pervious areas. The calibration range of the parameters was determined using the SWMM manual and the ranges used in previous studies, and the parameters were estimated using PEST, an automatic calibration tool. The scenarios using PEST showed a high correlation coefficient, and the RPE and RMSE likewise indicated high accuracy. For the RPE, the error ranged from 13.9-28.9% in the scenarios without parameter estimation but was reduced to 6.8-25.7% in the scenarios using PEST. 
Based on these results, it can be concluded that more accurate flood analysis is possible when the optimum scenario is selected by determining an appropriate reference conduit for future urban flood analyses, and when the results are applied to various rainfall-event scenarios and parameter optimization. Keywords: parameter optimization; PEST model; urban area. Acknowledgement: This research was supported by a grant (17AWMP-B079625-04) from the Water Management Research Program funded by the Ministry of Land, Infrastructure and Transport of the Korean government.
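    The two goodness-of-fit measures named above, RMSE and relative peak error (RPE), might be computed as follows. The helper names and the sample hydrograph values are invented for illustration:

```python
import numpy as np

def rmse(obs, sim):
    """Root-mean-square error between observed and simulated series."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return float(np.sqrt(np.mean((obs - sim) ** 2)))

def relative_peak_error(obs, sim):
    """Relative error of the simulated peak value, in percent."""
    peak_obs = float(np.max(obs))
    peak_sim = float(np.max(sim))
    return abs(peak_sim - peak_obs) / peak_obs * 100.0

# toy observed vs. simulated hydrograph (arbitrary discharge units)
obs = [1.0, 3.0, 7.0, 5.0, 2.0]
sim = [1.2, 2.7, 6.3, 5.4, 2.1]
print(round(rmse(obs, sim), 3), round(relative_peak_error(obs, sim), 1))
```

In a calibration workflow these would be evaluated per scenario, and the PEST-calibrated parameter set would be preferred when both metrics improve.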

  3. Sea Level Rise Induced Arsenic Release from Historically Contaminated Coastal Soils.

    PubMed

    LeMonte, Joshua J; Stuckey, Jason W; Sanchez, Joshua Z; Tappero, Ryan; Rinklebe, Jörg; Sparks, Donald L

    2017-06-06

    Climate change-induced perturbations in the hydrologic regime are expected to impact biogeochemical processes, including contaminant mobility and cycling. Elevated levels of geogenic and anthropogenic arsenic are found along many coasts around the world, most notably in south and southeast Asia but also in the United States, particularly along the Mid-Atlantic coast. The mechanism by which, and the extent to which, arsenic may be released from contaminated coastal soils due to sea level rise are unknown. Here we present data from a coastal arsenic-contaminated soil exposed to sea and river waters in biogeochemical microcosm reactors across field-validated redox conditions. We find that reducing conditions lead to arsenic release from historically contaminated coastal soils through reductive dissolution of arsenic-bearing mineral oxides under both sea and river water inundation, with less arsenic release in the seawater scenarios than in the river water scenarios due to inhibition of oxide dissolution. For the first time, we systematically display the gradation of solid-phase soil arsenic speciation across defined redox windows from reducing to oxidizing conditions in natural waters by combining biogeochemical microcosm experiments and X-ray absorption spectroscopy. Our results demonstrate that sea level rise stands to impact arsenic release from contaminated coastal soils by changing redox conditions.

  4. Sea Level Rise Induced Arsenic Release from Historically Contaminated Coastal Soils

    DOE PAGES

    LeMonte, Joshua J.; Stuckey, Jason W.; Sanchez, Joshua Z.; ...

    2017-05-04

    Climate change-induced perturbations in the hydrologic regime are expected to impact biogeochemical processes, including contaminant mobility and cycling. Elevated levels of geogenic and anthropogenic arsenic are found along many coasts around the world, most notably in south and southeast Asia but also in the United States, particularly along the Mid-Atlantic coast. The mechanism by which, and the extent to which, arsenic may be released from contaminated coastal soils due to sea level rise are unknown. Here we present data from a coastal arsenic-contaminated soil exposed to sea and river waters in biogeochemical microcosm reactors across field-validated redox conditions. We find that reducing conditions lead to arsenic release from historically contaminated coastal soils through reductive dissolution of arsenic-bearing mineral oxides under both sea and river water inundation, with less arsenic release in the seawater scenarios than in the river water scenarios due to inhibition of oxide dissolution. For the first time, we systematically display the gradation of solid-phase soil arsenic speciation across defined redox windows from reducing to oxidizing conditions in natural waters by combining biogeochemical microcosm experiments and X-ray absorption spectroscopy. Our results demonstrate that sea level rise stands to impact arsenic release from contaminated coastal soils by changing redox conditions.

  5. Modelling Aedes aegypti mosquito control via transgenic and sterile insect techniques: endemics and emerging outbreaks.

    PubMed

    Seirin Lee, S; Baker, R E; Gaffney, E A; White, S M

    2013-08-21

    The invasion of pest insects often changes or destroys a native ecosystem, and can result in food shortages and disease endemics. Issues such as the environmental effects of chemical control methods, the economic burden of maintaining control strategies and the risk of pest resistance still remain, and mosquito-borne diseases such as malaria and dengue fever prevail in many countries, infecting over 100 million people worldwide in 2010. One environmentally friendly method for mosquito control is the Sterile Insect Technique (SIT). This species-specific method of insect control relies on the mass rearing, sterilization and release of large numbers of sterile insects. An alternative transgenic method is the Release of Insects carrying a Dominant Lethal (RIDL). Our objective is to consider contrasting control strategies via SIT and RIDL for two invasive scenarios: an endemic case and an emerging outbreak. We investigate how the release rate and the size of the release region influence both the potential for control success and the resources needed to achieve it, under a range of conditions and control strategies, and we discuss advantageous strategies with respect to reducing the release resources and strategy costs (in terms of control mosquito numbers) required to achieve complete eradication of wild-type mosquitoes. Copyright © 2013 Elsevier Ltd. All rights reserved.
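    The core SIT mechanism, wild matings diluted by released sterile males, can be sketched with a toy discrete-generation model. The logistic form, parameter values, and constant release rate below are assumptions for illustration, not the paper's model:

```python
# Toy SIT sketch: a wild female mates with a fertile male with
# probability W/(W + S), so per-capita reproduction is scaled by that
# dilution factor; growth is otherwise logistic.

def step(W, S, r=1.5, K=1000.0):
    """One generation of the wild population W under sterile releases S."""
    mating_success = W / (W + S) if (W + S) > 0 else 0.0
    return r * W * mating_success * max(0.0, 1.0 - W / K)

W = 500.0                          # initial wild population
for _ in range(60):
    W = step(W, S=2000.0)          # constant sterile release per generation
print(W < 1e-3)                    # sustained release drives W toward zero
```

The same skeleton, with a lethality acting only on offspring, would caricature RIDL; the interesting questions in the paper (release region size, endemic vs. outbreak timing) require the spatial version of such a model.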

  6. Sea Level Rise Induced Arsenic Release from Historically Contaminated Coastal Soils

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    LeMonte, Joshua J.; Stuckey, Jason W.; Sanchez, Joshua Z.

    Climate change-induced perturbations in the hydrologic regime are expected to impact biogeochemical processes, including contaminant mobility and cycling. Elevated levels of geogenic and anthropogenic arsenic are found along many coasts around the world, most notably in south and southeast Asia but also in the United States, particularly along the Mid-Atlantic coast. The mechanism by which, and the extent to which, arsenic may be released from contaminated coastal soils due to sea level rise are unknown. Here we present data from a coastal arsenic-contaminated soil exposed to sea and river waters in biogeochemical microcosm reactors across field-validated redox conditions. We find that reducing conditions lead to arsenic release from historically contaminated coastal soils through reductive dissolution of arsenic-bearing mineral oxides under both sea and river water inundation, with less arsenic release in the seawater scenarios than in the river water scenarios due to inhibition of oxide dissolution. For the first time, we systematically display the gradation of solid-phase soil arsenic speciation across defined redox windows from reducing to oxidizing conditions in natural waters by combining biogeochemical microcosm experiments and X-ray absorption spectroscopy. Our results demonstrate that sea level rise stands to impact arsenic release from contaminated coastal soils by changing redox conditions.

  7. Dopamine Release and Uptake Impairments and Behavioral Alterations Observed in Mice that Model Fragile X Mental Retardation Syndrome.

    PubMed

    Fulks, Jenny L; O'Bryhim, Bliss E; Wenzel, Sara K; Fowler, Stephen C; Vorontsova, Elena; Pinkston, Jonathan W; Ortiz, Andrea N; Johnson, Michael A

    2010-10-20

    In this study we evaluated the relationship between amphetamine-induced behavioral alterations and dopamine release and uptake characteristics in Fmr1 knockout (Fmr1 KO) mice, which model fragile X syndrome. The behavioral analyses, obtained at millisecond temporal resolution and 2 mm spatial resolution using a force-plate actometer, revealed that Fmr1 KO mice express a lower degree of focused stereotypy compared to wild type (WT) control mice after injection with 10 mg/kg (ip) amphetamine. To identify potentially related neurochemical mechanisms underlying this phenomenon, we measured electrically-evoked dopamine release and uptake using fast-scan cyclic voltammetry at carbon-fiber microelectrodes in striatal brain slices. At 10 weeks of age, dopamine release per pulse, which is dopamine release corrected for differences in uptake, was unchanged. However, at 15 (the age of behavioral testing) and 20 weeks of age, dopamine release per pulse and the maximum rate of dopamine uptake were diminished in Fmr1 KO mice compared to WT mice. Dopamine uptake measurements, obtained at different amphetamine concentrations, indicated that dopamine transporters in both genotypes have equal affinities for amphetamine. Moreover, dopamine release measurements from slices treated with quinpirole, a D2-family receptor agonist, rule out enhanced D2 autoreceptor sensitivity as a mechanism of release inhibition. However, dopamine release, uncorrected for uptake and normalized against the corresponding pre-drug release peaks, increased in Fmr1 KO mice, but not in WT mice. Collectively, these data are consistent with a scenario in which a decrease in extracellular dopamine levels in the striatum results in diminished expression of focused stereotypy in Fmr1 KO mice.

  8. An approach to the preliminary evaluation of Closed Ecological Life Support System (CELSS) scenarios and control strategies

    NASA Technical Reports Server (NTRS)

    Stahr, J. D.; Auslander, D. M.; Spear, R. C.; Young, G. E.

    1982-01-01

    Life support systems for manned space missions are discussed. A scenario analysis method was proposed for the initial step of comparing possible partial or total recycle scenarios. The method is discussed in detail.

  9. [Cost-effective analysis of rotation from sustained-release morphine tablet to transdermal fentanyl of matrix type or sustained-release oxycodone tablet].

    PubMed

    Ise, Yuya; Wako, Tetsuya; Miura, Yoshihiko; Katayama, Shirou; Shimizu, Hisanori

    2009-12-01

    The present study was undertaken to determine the pharmacoeconomics of switching from a sustained-release morphine tablet to matrix-type (MT) transdermal fentanyl or a sustained-release oxycodone tablet. Cost-effectiveness analysis was performed using a simulation model along with decision analysis. The analysis was done from the payer's perspective. The cost-effectiveness ratio per patient of transdermal MT fentanyl (22,539 yen) was lower than that of the sustained-release oxycodone tablet (23,630 yen), although a sensitivity analysis could not confirm that this result was robust. These results suggest that transdermal MT fentanyl may be less expensive than a sustained-release oxycodone tablet.

  10. The SCEC Broadband Platform: A Collaborative Open-Source Software Package for Strong Ground Motion Simulation and Validation

    NASA Astrophysics Data System (ADS)

    Silva, F.; Maechling, P. J.; Goulet, C. A.; Somerville, P.; Jordan, T. H.

    2014-12-01

    The Southern California Earthquake Center (SCEC) Broadband Platform is a collaborative software development project involving geoscientists, earthquake engineers, graduate students, and the SCEC Community Modeling Environment. The SCEC Broadband Platform (BBP) is open-source scientific software that can generate broadband (0-100 Hz) ground motions for earthquakes, integrating complex scientific modules that implement rupture generation, low- and high-frequency seismogram synthesis, non-linear site effects calculation, and visualization into a software system that supports easy on-demand computation of seismograms. The Broadband Platform operates in two primary modes: validation simulations and scenario simulations. In validation mode, the Platform runs earthquake rupture and wave propagation modeling software to calculate seismograms for a well-observed historical earthquake. Then, the BBP calculates a number of goodness-of-fit measurements that quantify how well the model-based broadband seismograms match the observed seismograms for a certain event. Based on these results, the Platform can be used to tune and validate different numerical modeling techniques. In scenario mode, the Broadband Platform can run simulations for hypothetical (scenario) earthquakes. In this mode, users input an earthquake description, a list of station names and locations, and a 1D velocity model for their region of interest, and the Broadband Platform software then calculates ground motions for the specified stations. Working in close collaboration with scientists and research engineers, the SCEC software development group continues to add new capabilities to the Broadband Platform and to release new versions as open-source scientific software distributions that can be compiled and run on many Linux computer systems.
Our latest release includes 5 simulation methods, 7 simulation regions covering California, Japan, and Eastern North America, the ability to compare simulation results against GMPEs, and several new data products, such as map and distance-based goodness of fit plots. As the number and complexity of scenarios simulated using the Broadband Platform increases, we have added batching utilities to substantially improve support for running large-scale simulations on computing clusters.

  11. Application of Frequency of Detection Methods in Design and Optimization of the INL Site Ambient Air Monitoring Network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rood, Arthur S.; Sondrup, A. Jeffrey

    This report presents an evaluation of a hypothetical INL Site monitoring network and the existing INL air monitoring network using frequency-of-detection methods. The hypothetical network was designed to address the requirement in 40 CFR Part 61, Subpart H (2006) that “emissions of radionuclides to ambient air from U.S. DOE facilities shall not exceed those amounts that would cause any member of the public to receive in any year an effective dose equivalent exceeding 10 mrem/year.” To meet the requirement for monitoring only, “radionuclide releases that would result in an effective dose of 10% of the standard shall be readily detectable and distinguishable from background.” Thus, the hypothetical network consists of air samplers placed at residence locations that surround INL and at other locations where onsite livestock grazing takes place. Two exposure scenarios were used in this evaluation: a resident scenario and a shepherd/rancher scenario. The resident was assumed to be continuously present at their residence, while the shepherd/rancher was assumed to be present 24 hours a day at a fixed location on the grazing allotment. Important radionuclides were identified from annual INL radionuclide National Emission Standards for Hazardous Air Pollutants (NESHAP) reports. Important radionuclides were defined as those that potentially contribute 1% or more of the annual total dose at the NESHAP maximally exposed individual location and include H-3, Am-241, Pu-238, Pu-239, Cs-137, Sr-90, and I-131. For this evaluation, the network performance objective was set at achieving a frequency of detection greater than or equal to 95%. Results indicated that the hypothetical network for the resident scenario met all performance objectives for H-3 and I-131 and most performance objectives for Cs-137 and Sr-90. However, all actinides failed to meet the performance objectives for most sources. 
The shepherd/rancher scenario showed that air samplers placed around the facilities every 22.5 degrees were very effective in detecting releases, but this arrangement is not practical or cost-effective. However, it was shown that a few air samplers placed in the prevailing wind direction around each facility could achieve the performance objective of a frequency of detection greater than or equal to 95% for the shepherd/rancher scenario. The results also indicate that some of the current sampler locations have little or no impact on the network frequency of detection and could be removed from the network with no appreciable deterioration of performance. Results show that with some slight modifications to the existing network (i.e., additional samplers added north and south of the Materials and Fuels Complex and ineffective samplers removed), the network would achieve performance objectives for all sources for both the resident and shepherd/rancher scenarios.
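    A frequency-of-detection criterion like the 95% objective above can be illustrated with a toy Monte Carlo estimate. The lognormal concentration surrogate, the detection limit, and the sampler count are invented for illustration and stand in for the report's actual dispersion modeling:

```python
import random

# Frequency of detection = fraction of simulated release events in
# which at least one sampler's concentration exceeds its minimum
# detectable concentration (MDC).
random.seed(1)

def detected(event_conc, mdc=1.0):
    """True if any sampler in the event exceeds the detection limit."""
    return any(c > mdc for c in event_conc)

n_events, n_samplers = 10_000, 4
hits = 0
for _ in range(n_events):
    # toy dispersion surrogate: lognormal concentration at each sampler
    conc = [random.lognormvariate(0.5, 1.0) for _ in range(n_samplers)]
    if detected(conc):
        hits += 1

freq = hits / n_events
print(freq >= 0.95)  # check against the 95% performance objective
```

Dropping a sampler from the list and re-running shows directly whether that sampler contributes to the network's frequency of detection, which is the logic behind removing ineffective sampler locations.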

  12. Emergy Analysis and Sustainability Efficiency Analysis of Different Crop-Based Biodiesel in Life Cycle Perspective

    PubMed Central

    Ren, Jingzheng; Manzardo, Alessandro; Mazzi, Anna; Fedele, Andrea; Scipioni, Antonio

    2013-01-01

    Biodiesel, a promising alternative energy resource, has become a focus of chemical engineering research, but its sustainability remains debated. In order to analyze the sustainability of biodiesel production systems and select the most sustainable scenario, various crop-based biodiesel options, including soybean-, rapeseed-, sunflower-, jatropha- and palm-based production, are studied by emergy analysis; the soybean-based scenario is identified as the most sustainable and should be chosen for further study in China. The DEA (data envelopment analysis) method is used to evaluate the sustainability efficiencies of these options: the production systems based on soybean, sunflower, and palm are DEA-efficient, whereas the rapeseed- and jatropha-based scenarios need improvement, and the specific improvements are also identified. PMID:23766723

  13. On-line Meteorology-Chemistry/Aerosols Modelling and Integration for Risk Assessment: Case Studies

    NASA Astrophysics Data System (ADS)

    Bostanbekov, Kairat; Mahura, Alexander; Nuterman, Roman; Nurseitov, Daniyar; Zakarin, Edige; Baklanov, Alexander

    2016-04-01

    On the regional level, and especially in areas with diverse potential sources of industrial pollutants, the risk assessment of impacts on the environment and population is critically important. During normal operations, the risk is minimal. During accidental situations, however, the risk is increased due to releases of harmful pollutants into different environments such as water, soil, and the atmosphere, where they undergo continuous transformation and transport. In this study, the Enviro-HIRLAM (Environment High Resolution Limited Area Model) was adapted and employed to assess scenarios with accidental and continuous emissions of sulphur dioxide (SO2) for selected case studies during January 2010. The following scenarios were considered: (i) a control reference run; (ii) an accidental release (a short-term, one-day fire at an oil storage facility) at the city of Atyrau (Kazakhstan) near the northern part of the Caspian Sea; and (iii) a doubling of the original continuous emissions from three metallurgical enterprises on the Kola Peninsula (Russia). The implemented aerosol microphysics module M7 treats five aerosol types (sulphate, sea salt, dust, black carbon, and organic carbon) distributed across seven size modes. Removal processes for aerosols include gravitational settling and wet deposition. As Enviro-HIRLAM is an on-line integrated model, meteorological and chemical processes are modelled simultaneously at each time step. The modelled spatio-temporal variations of meteorological and chemical patterns are analyzed for both the European and Kazakhstan domains. The results of the evaluation of sulphur dioxide concentration and deposition over major cities, selected regions, and countries are presented using GIS tools. As an outcome, the results of Enviro-HIRLAM modelling for the accidental release near the Caspian Sea are integrated into the RANDOM (Risk Assessment of Nature Detriment due to Oil spill Migration) system.

  14. Alternative Futures Analysis Of Farmington Bay Wetlands In The Great Salt Lake Ecosystem

    EPA Science Inventory

    An Alternative Futures Analysis (AFA) was conducted to evaluate tradeoffs between landscape design scenarios and ecological services for Farmington Bay, Great Salt Lake (GSL), wetlands. Model scenarios included both plan trend and conservation "futures" projected to 2030. Scena...

  15. Metabolomics analysis of 'Housui' Japanese pear flower buds during endodormancy reveals metabolic suppression by thermal fluctuation.

    PubMed

    Horikoshi, Humberto Mitio; Sekozawa, Yoshihiko; Kobayashi, Makoto; Saito, Kazuki; Kusano, Miyako; Sugaya, Sumiko

    2018-05-01

    Dormancy is a complex phenomenon that allows plants to survive the winter season. Studies of dormancy have recently attracted more attention due to the expansion of temperate fruit production in areas under mild winters and due to climate changes. This study aimed to identify and characterize the metabolic changes induced by chilling temperatures, as well as during thermal fluctuation conditions that simulate mild winter and/or climate change scenarios. To do this, we compared the metabolic profile of Japanese pear flower buds exposed to constant chilling at 6 °C and thermal fluctuations of 6 °C/18 °C (150 h/150 h) during endodormancy. We detected 91 metabolites by gas chromatography paired with time-of-flight mass spectrometry (GC-TOF-MS) that could be classified into eight groups: amino acids, amino acid derivatives, organic acids, sugars and polyols, fatty acids and sterols, phenol lipids, phenylpropanoids, and other compounds. Metabolomics analysis revealed that the level of several amino acids decreased during endodormancy. Sugar and polyol levels increased during endodormancy during constant chilling and might be associated with chilling stress tolerance and providing an energy supply for resuming growth. In contrast, thermal fluctuations produced low levels of metabolites related to the pentose phosphate pathway, energy production, and tricarboxylic acid (TCA) cycle in flower buds, which may be associated with failed endodormancy release. This metabolic profile contributes to our understanding of the biological mechanism of dormancy during chilling accumulation and clarifies the metabolic changes during mild winters and future climate change scenarios. Copyright © 2018 Elsevier Masson SAS. All rights reserved.

  16. Comparison and Validation of Hydrological E-Flow Methods through Hydrodynamic Modelling

    NASA Astrophysics Data System (ADS)

    Kuriqi, Alban; Rivaes, Rui; Sordo-Ward, Alvaro; Pinheiro, António N.; Garrote, Luis

    2017-04-01

    Flow regime determines physical habitat conditions and local biotic configuration. The development of environmental flow guidelines to support river integrity is becoming a major concern in water resources management. In this study, we analysed two sites located in the southern part of Portugal, on the Odelouca and Ocreza Rivers respectively, characterised by a Mediterranean climate. Both rivers are in nearly pristine condition, not regulated by dams or other diversion structures. This study presents an analysis of the effect of implementing different hydrological e-flow methods on fish habitat suitability. To conduct this study we employed hydrological e-flow methods recommended by the European Small Hydropower Association (ESHA). River hydrology assessment was based on approximately 30 years of mean daily flow data provided by the Portuguese Water Information System (SNIRH). The biological data, bathymetry, physical and hydraulic features, and the Habitat Suitability Index for fish species were collected through extensive fieldwork. We followed the Instream Flow Incremental Methodology (IFIM) to assess the flow-habitat relationship, taking into account the habitat suitability of different instream flow releases. Initially, we analysed fish habitat suitability under natural conditions and used it as the reference condition for the other scenarios considering the chosen hydrological e-flow methods. We accomplished the habitat modelling through hydrodynamic analysis using the River-2D model. The same methodology was applied to each scenario, taking as input the e-flows obtained from each of the hydrological methods employed in this study. This contribution shows the significance of ecohydrological studies in establishing a foundation for water resources management actions. Keywords: ecohydrology, e-flow, Mediterranean rivers, river conservation, fish habitat, River-2D, Hydropower.

  17. Forecasting volcanic unrest using seismicity: The good, the bad and the time consuming

    NASA Astrophysics Data System (ADS)

    Salvage, Rebecca; Neuberg, Jurgen W.

    2013-04-01

    Volcanic eruptions are inherently unpredictable in nature, with scientists struggling to forecast the type and timing of events, particularly in real-time scenarios. Current understanding suggests that statistical patterns in precursory seismicity datasets recorded prior to eruptive events could serve as real-time forecasting tools. These patterns allow us to identify times of clear deviation in the data, which might be indicative of volcanic unrest. The identification of low-frequency seismic swarms and the acceleration of this seismicity prior to observed volcanic unrest may be key in developing forecasting tools. The development of real-time forecasting models that can be implemented at volcano observatories is of particular importance, since the identification of early warning signals allows danger to the proximal population to be minimized. We concentrate on understanding the significance and development of these seismic swarms as unrest develops at the volcano. In particular, analysis of accelerations in event rate, amplitude, and the energy rates released by seismicity prior to eruption suggests that these are important indicators of developing unrest. Analysing these parameters simultaneously in real time allows possible improvements to forecasting models. Although more time- and computationally intensive, cross-correlation techniques applied to continuous seismicity prior to volcanic unrest allow all significant seismic events to be analysed, rather than only those detected by an automated identification system. This may allow a more accurate forecast, since all precursory seismicity can be taken into account. In addition, the classification of seismic events based on spectral characteristics may allow us to isolate the individual types of signals responsible for certain types of unrest. In this way, we may be able to better forecast the type of eruption that may ensue, or at least some of its prevailing characteristics.

  18. Critical Assessment of Analytical Techniques in the Search for Biomarkers on Mars: A Mummified Microbial Mat from Antarctica as a Best-Case Scenario

    PubMed Central

    Blanco, Yolanda; Gallardo-Carreño, Ignacio; Ruiz-Bermejo, Marta; Puente-Sánchez, Fernando; Cavalcante-Silva, Erika; Quesada, Antonio; Prieto-Ballesteros, Olga

    2017-01-01

    Abstract The search for biomarkers of present or past life is one of the major challenges for in situ planetary exploration. Multiple constraints limit the performance and sensitivity of remote in situ instrumentation. In addition, the structure, chemical, and mineralogical composition of the sample may complicate the analysis and interpretation of the results. The aim of this work is to highlight the main constraints, performance, and complementarity of several techniques that have already been implemented or are planned to be implemented on Mars for detection of organic and molecular biomarkers on a best-case sample scenario. We analyzed a 1000-year-old desiccated and mummified microbial mat from Antarctica by Raman and IR (infrared) spectroscopies (near- and mid-IR), thermogravimetry (TG), differential thermal analysis, mass spectrometry (MS), and immunological detection with a life detector chip. In spite of the high organic content (ca. 20% wt/wt) of the sample, the Raman spectra only showed the characteristic spectral peaks of the remaining beta-carotene biomarker and faint peaks of phyllosilicates over a strong fluorescence background. IR spectra complemented the mineralogical information from Raman spectra and showed the main molecular vibrations of the humic acid functional groups. The TG-MS system showed the release of several volatile compounds attributed to biopolymers. An antibody microarray for detecting cyanobacteria (CYANOCHIP) detected biomarkers from Chroococcales, Nostocales, and Oscillatoriales orders. The results highlight limitations of each technique and suggest the necessity of complementary approaches in the search for biomarkers because some analytical techniques might be impaired by sample composition, presentation, or processing. Key Words: Planetary exploration—Life detection—Microbial mat—Life detector chip—Thermogravimetry—Raman spectroscopy—NIR—DRIFTS. Astrobiology 17, 984–996. PMID:29016195

  19. Critical Assessment of Analytical Techniques in the Search for Biomarkers on Mars: A Mummified Microbial Mat from Antarctica as a Best-Case Scenario.

    PubMed

    Blanco, Yolanda; Gallardo-Carreño, Ignacio; Ruiz-Bermejo, Marta; Puente-Sánchez, Fernando; Cavalcante-Silva, Erika; Quesada, Antonio; Prieto-Ballesteros, Olga; Parro, Víctor

    2017-10-01

    The search for biomarkers of present or past life is one of the major challenges for in situ planetary exploration. Multiple constraints limit the performance and sensitivity of remote in situ instrumentation. In addition, the structure, chemical, and mineralogical composition of the sample may complicate the analysis and interpretation of the results. The aim of this work is to highlight the main constraints, performance, and complementarity of several techniques that have already been implemented or are planned to be implemented on Mars for detection of organic and molecular biomarkers on a best-case sample scenario. We analyzed a 1000-year-old desiccated and mummified microbial mat from Antarctica by Raman and IR (infrared) spectroscopies (near- and mid-IR), thermogravimetry (TG), differential thermal analysis, mass spectrometry (MS), and immunological detection with a life detector chip. In spite of the high organic content (ca. 20% wt/wt) of the sample, the Raman spectra only showed the characteristic spectral peaks of the remaining beta-carotene biomarker and faint peaks of phyllosilicates over a strong fluorescence background. IR spectra complemented the mineralogical information from Raman spectra and showed the main molecular vibrations of the humic acid functional groups. The TG-MS system showed the release of several volatile compounds attributed to biopolymers. An antibody microarray for detecting cyanobacteria (CYANOCHIP) detected biomarkers from Chroococcales, Nostocales, and Oscillatoriales orders. The results highlight limitations of each technique and suggest the necessity of complementary approaches in the search for biomarkers because some analytical techniques might be impaired by sample composition, presentation, or processing. Key Words: Planetary exploration-Life detection-Microbial mat-Life detector chip-Thermogravimetry-Raman spectroscopy-NIR-DRIFTS. Astrobiology 17, 984-996.

  20. Effects of obligatory training and prior training experience on attitudes towards performing basic life support: a questionnaire survey.

    PubMed

    Matsubara, Hiroki; Enami, Miki; Hirose, Keiko; Kamikura, Takahisa; Nishi, Taiki; Takei, Yutaka; Inaba, Hideo

    2015-04-01

    To determine the effect of Japanese obligatory basic life support training for new driver's license applicants on their willingness to carry out basic life support. We distributed a questionnaire to 9,807 participants of basic life support courses in authorized driving schools from May 2007 to April 2008, after the release of the 2006 Japanese guidelines. The questionnaire explored the participants' willingness to perform basic life support in four hypothetical scenarios: cardiopulmonary resuscitation on one's own initiative; compression-only cardiopulmonary resuscitation following telephone cardiopulmonary resuscitation; early emergency call; and use of an automated external defibrillator. The questionnaire was given at the beginning of the basic life support course in the first 6-month term and at the end in the second 6-month term. The 9,011 fully completed answer sheets were analyzed. The training significantly increased the proportion of respondents willing to use an automated external defibrillator and to perform cardiopulmonary resuscitation on their own initiative, both in those with and in those without prior basic life support training experience. It significantly increased the proportion of respondents willing to carry out favorable actions in all four scenarios. In multiple logistic regression analysis, basic life support training and prior training experience within 3 years were associated with favorable attitudes. The analysis of reasons for unwillingness suggested that the training reduced participants' lack of confidence in their skills but did not attenuate their lack of confidence in detecting arrest or in the clinical judgment to initiate a basic life support action. Obligatory basic life support training should be carried out periodically and modified to ensure that participants gain confidence in judging and detecting cardiac arrest.

  1. Smart Micro/Nano-robotic Systems for Gene Delivery.

    PubMed

    Pedram, Alireza; Pishkenari, Hossein Nejat

    2017-01-01

    Small-scale robotics has attracted growing attention for the prospect of targeting and accessing cell-sized sites, which is necessary for high-precision biomedical applications and drug/gene delivery. The lack of control in conventional gene therapy, which induces systemic side effects and reduces therapeutic efficiency, can be addressed using these intelligent carriers. Newly proposed solutions to the main challenges of control, power supply, gene release, and final carrier extraction/degradation have brought these smart miniature robots to the point of being employed, in the near future, for practical applications of transferring oligonucleotides (pDNA, siRNA, mRNA, etc.). In this paper, different scenarios and their endeavors to address the vital working demands and steps, in particular carrier attachment and release, cell internalization, manipulation concerns, and actuation systems, are discussed. This review highlights some promising experimental results showing controlled gene release by robotic systems in comparison with current non-specific gene delivery methods. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  2. Preliminary risks associated with postulated tritium release from production reactor operation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Kula, K.R.; Horton, W.H.

    1988-01-01

    The Probabilistic Risk Assessment (PRA) of Savannah River Plant (SRP) reactor operation is assessing the off-site risk due to tritium releases during postulated full or partial loss of heavy water moderator accidents. Other sources of tritium in the reactor are less likely to contribute to off-site risk in non-fuel-melting accident scenarios. Preliminary determination of the frequency of average partial moderator loss (including incidents with leaks as small as 0.5 kg) yields an estimate of approximately 1 per reactor-year. The full moderator loss frequency is conservatively chosen as 5 × 10⁻³ per reactor-year. Conditional consequences, determined with a version of the MACCS code modified to handle tritium, are found to be insignificant. The 95th percentile individual cancer risk is 4 × 10⁻⁸ per reactor-year within 16 km of the release point. The full moderator loss accident contributes about 75% of the evaluated risks. 13 refs., 4 figs., 5 tabs.

  3. Lunar base surface mission operations. Lunar Base Systems Study (LBSS) task 4.1

    NASA Technical Reports Server (NTRS)

    1987-01-01

    The purpose was to perform an analysis of the surface operations associated with a human-tended lunar base. Specifically, the study defined surface elements and developed mission manifests for a selected base scenario, determined the nature of surface operations associated with this scenario, generated a preliminary crew extravehicular and intravehicular activity (EVA/IVA) time resource schedule for conducting the missions, and proposed concepts for utilizing remotely operated equipment to perform repetitious or hazardous surface tasks. The operations analysis was performed on a 6 year period of human-tended lunar base operation prior to permanent occupancy. The baseline scenario was derived from a modified version of the civil needs database (CNDB) scenario. This scenario emphasizes achievement of a limited set of science and exploration objectives while emplacing the minimum habitability elements required for a permanent base.

  4. Molecular Diagnostic Analysis of Outbreak Scenarios

    ERIC Educational Resources Information Center

    Morsink, M. C.; Dekter, H. E.; Dirks-Mulder, A.; van Leeuwen, W. B.

    2012-01-01

    In the current laboratory assignment, technical aspects of the polymerase chain reaction (PCR) are integrated in the context of six different bacterial outbreak scenarios. The "Enterobacterial Repetitive Intergenic Consensus Sequence" (ERIC) PCR was used to analyze different outbreak scenarios. First, groups of 2-4 students determined optimal…

  5. Greenhouse gas emissions and reactive nitrogen releases during the life-cycles of staple food production in China and their mitigation potential.

    PubMed

    Xia, Longlong; Ti, Chaopu; Li, Bolun; Xia, Yongqiu; Yan, Xiaoyuan

    2016-06-15

    Life-cycle analysis of staple food (rice, flour, and corn-based fodder) production and assessments of the associated greenhouse gas (GHG) and reactive nitrogen (Nr) releases, from environmental and economic perspectives, help to develop effective mitigation options. However, such evaluations have rarely been executed in China. We evaluated the GHG and Nr releases per kilogram of staple food production (carbon and Nr footprints) and per unit of net economic benefit (CO2-NEB and Nr-NEB), and explored their mitigation potential. Carbon footprints of food production in China were markedly higher than those in some developed countries. There was high spatial variation in the footprints, primarily attributable to differences in synthetic N use (or CH4 emissions) per unit of food production. Provincial carbon footprints had a significant linear relationship with Nr footprints, attributed to the large contribution of N fertilizer use to both GHG and Nr releases. Synthetic N fertilizer applications and CH4 emissions dominated the carbon footprints, while NH3 volatilization and N leaching were the main contributors to the Nr footprints. About 564 (95% uncertainty range: 404-701) Tg CO2-eq of GHG and 10 (7.4-12.4) Tg of Nr-N were released every year during 2001-2010 from staple food production. This caused total damage costs of 325 (70-555) billion ¥, equivalent to nearly 1.44% of the Gross Domestic Product of China. Moreover, the combined damage costs and economic input costs accounted for 66%-80% of the gross economic benefit generated from food production. A reduction of 92.7 Tg CO2-eq yr⁻¹ and 2.2 Tg Nr-N yr⁻¹ could be achieved by reducing synthetic N inputs by 20%, increasing grain yields by 5%, and implementing off-season application of straw and mid-season drainage practices for rice cultivation. In order to realize these scenarios, an ecological compensation scheme should be established to incentivize farmers to gradually adopt knowledge-based management practices. Copyright © 2016 Elsevier B.V. All rights reserved.

  6. The development of English primary care group governance. A scenario analysis.

    PubMed

    Sheaff, R

    1999-01-01

    At present there is a policy vacuum about what English Primary Care Groups' (PCGs) governance will be when they develop into Primary Care Trusts (PCTs). Draft legislation leaves many options open, so PCT governance is likely to 'emerge' as PCTs are created. It also remains uncertain how general practitioners (GPs) will react to the formation of PCTs and how the UK government will then respond in turn. A scenario analysis suggests three possible lines of development. The base (likeliest) scenario predicts a mainly networked form of PCT governance. An alternative scenario is of PCT governance resembling the former National Health Service internal market. A third scenario predicts 'franchise model' PCTs employing some GPs and subcontracting others. To different degrees all three scenarios predict that PCTs will retain elements of networked governance. If it fails to make GPs as accountable to NHS management as the UK government wishes, networked governance may prove only a transitional stage before English PCTs adopt either quasi-market or hierarchical governance.

  7. Scenario analysis for integrated water resources planning and management under uncertainty in the Zayandehrud river basin

    NASA Astrophysics Data System (ADS)

    Safavi, Hamid R.; Golmohammadi, Mohammad H.; Sandoval-Solis, Samuel

    2016-08-01

    The goal of this study is to develop and analyze three scenarios in the Zayandehrud river basin in Iran using a model already built and calibrated by Safavi et al. (2015), which includes results for the baseline scenario. Results from the baseline scenario show that water demands will be supplied at the cost of depleting surface and ground water resources, making this scenario undesirable and unsustainable. Supply Management, Demand Management, and Meta (combined supply and demand management) scenarios are the scenarios selected in this study. They are developed and incorporated into the Zayandehrud model to assess and evaluate the imminent status of the basin. Certain strategies are employed for this purpose to improve and rectify the current management policies. Five performance criteria (time-based and volumetric reliability, resilience, vulnerability, and maximum deficit) are employed in the process of scenario analysis and evaluation. The results obtained from the performance criteria are summed up into a so-called 'Water Resources Sustainability Index' to facilitate comparison among the likely trade-offs. Uncertainties arising from historical data, management policies, the rainfall-runoff model, demand priorities, and performance criteria are considered in the proposed conceptual framework and modeled by appropriate approaches. Results show that the Supply Management scenario can improve the supply of demands but has no tangible effect on the improvement of the resources in the study region. In this regard, the Demand Management scenario is found to be more effective than the supply-side one, although it still remains unacceptable. Results of the Meta scenario indicate that both supply and demand management must be applied if the water resources are to be safeguarded against degradation and depletion. In other words, the supply management scenario is necessary but not sufficient; it must be coupled with the demand management scenario. Finally, it is shown that applying the Meta scenario improves water resources sustainability.

  8. A Multi-Objective Decision-Making Model for Resources Allocation in Humanitarian Relief

    DTIC Science & Technology

    2007-03-01

    Applied Mathematics and Computation 163, 2005, pp756 19. Malczewski, J., GIS and Multicriteria Decision Analysis, John Wiley and Sons, New York... used when interpreting the results of the analysis. (Raimo et al. 2002) (7) Sensitivity analysis Sensitivity analysis in a DA process answers...Budget Scenario Analysis The MILP is solved (using LINDO 6.1) for high, medium and low budget scenarios in both damage degree levels. Tables 17 and

  9. Gas Retention, Gas Release, and Fluidization of Spherical Resorcinol-Formaldehyde (sRF) Ion Exchange Resin

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gauglitz, Phillip A.; Rassat, Scot D.; Linn, Diana

    The Low-Activity Waste Pretreatment System (LAWPS) is being developed to provide treated supernatant liquid from the Hanford tank farms directly to the Low-Activity Waste (LAW) Vitrification Facility at the Hanford Tank Waste Treatment and Immobilization Plant. The design and development of the LAWPS is being conducted by Washington River Protection Solutions, LLC. A key process in LAWPS is the removal of radioactive Cs in ion exchange (IX) columns filled with spherical resorcinol-formaldehyde (sRF) resin. When loaded with radioactive Cs, radiolysis of water in the LAW liquid will generate hydrogen gas. In normal operations, the generated hydrogen is expected to remain dissolved in the liquid and be continuously removed by liquid flow. One accident scenario being evaluated is the loss of liquid flow through the sRF resin bed after it has been loaded with radioactive Cs while hydrogen gas is being generated by radiolysis. For an accident scenario with a loss of flow, hydrogen gas can be retained within the IX column both in the sRF resin bed and below the bottom screen that supports the resin within the column, which creates a hydrogen flammability hazard. Because there is a potential for a large fraction of the retained hydrogen to be released over a short duration as a gas release event, there is a need to quantify the size and rate of potential gas release events. Due to the potential for a large, rapid gas release event, an evaluation of mitigation methods to eliminate the hydrogen hazard is also needed. One method being considered for mitigating the hydrogen hazard during a loss-of-flow accident is to have a secondary flow system, with two redundant pumps operating in series, that re-circulates liquid upwards through the bed and into a vented break tank where hydrogen gas is released from the liquid and removed by venting the headspace of the break tank. The mechanism for inducing release of gas from the sRF bed is to fluidize the bed, which should allow retained bubbles to rise and be carried to the break tank. The overall conclusion of the testing is that fluidization is an effective method to remove hydrogen gas from a bed of sRF resin, but that a single fluidization velocity that is adequate to release gas in 55 °C water will over-fluidize sRF resin in most LAW liquids, including both the nominal and the high-limit LAW simulants used in testing. An upper packed bed can retain hydrogen gas and pose a flammability hazard. Using periodic on:off fluidization, such as 5:55 min. on:off cycles, is effective at releasing gas while not creating an upper packed bed. Note that lengthening the fluidization duration in a one-hour cycle did result in a stable upper packed bed in one case with the nominal LAW simulant, so testing focused on the shorter "on" periods that are needed for effective hydrogen release with periodic on:off fluidization.

  10. Life-Cycle Energy and GHG Emissions for New and Recovered Softwood Framing Lumber and Hardwood Flooring Considering End-of-Life Scenarios

    Treesearch

    Richard D. Bergman; Robert H. Falk; Hongmei Gu; Thomas R. Napier; Jamie Meil

    2013-01-01

    Within the green building field there is a growing movement to recover and reuse building materials in lieu of demolition and landfill disposal. However, life-cycle data to help quantify the associated environmental impacts are lacking. This study quantifies the primary energy and greenhouse gas (GHG) emissions released from the production of wood recovered from an old house and from new...

  11. A 21st Century National Public Health System

    DTIC Science & Technology

    2008-09-01

    Security (DHS) released fifteen national planning scenarios in 2004 and the Target Capabilities List: A Companion to the National Preparedness Goal in...no clinical samples available from the first SARS patient in China to test for the virus; however, the second identified SARS case was a chef, Huang...Xingchu, who worked at a restaurant and was reported to have atypical pneumonia. As a chef, he came into regular contact with several types of

  12. First Responders Immersive Training Simulation Environment (FRITSE): Scenario Development and Support

    DTIC Science & Technology

    2014-07-30

    consisted of three stages: a deliberate derailment of a train containing either methanol (flammable) or ammonia (caustic) cars at an overpass where...event would be the deliberate release of chemical agent from a van parked near the train derailment and the tertiary event would be a giant fertilizer...political targets is highest. The derailment would cause a major leak of either caustic (ammonia) or highly flammable (methanol) material from derailed

  13. Geothermal Impact Analysis | Geothermal Technologies | NREL

    Science.gov Websites

    on potential geothermal growth scenarios, jobs and economic impacts, clean energy manufacturing...geothermal resources. We: perform resource analysis; develop techno-economic models; quantify environmental...growth scenarios across multiple market sectors. Learn more about the GeoVision Study. Jobs and Economic

  14. Micro-Logistics Analysis for Human Space Exploration

    NASA Technical Reports Server (NTRS)

    Cirillo, William; Stromgren, Chel; Galan, Ricardo

    2008-01-01

    Traditionally, logistics analysis for space missions has focused on the delivery of elements and goods to a destination. This type of logistics analysis can be referred to as "macro-logistics". While the delivery of goods is a critical component of mission analysis, it captures only a portion of the constraints that logistics planning may impose on a mission scenario. The other component of logistics analysis concerns the local handling of goods at the destination, including storage, usage, and disposal. This type of logistics analysis, referred to as "micro-logistics", may also be a primary driver in the viability of a human lunar exploration scenario. With the rigorous constraints that will be placed upon a human lunar outpost, it is necessary to accurately evaluate micro-logistics operations in order to develop exploration scenarios that will result in an acceptable level of system performance.

  15. Trade-Off Analysis Report

    NASA Technical Reports Server (NTRS)

    Dhas, Chris

    2000-01-01

    NASA's Glenn Research Center (GRC) defines and develops advanced technology for high-priority national needs in communications technologies for application to aeronautics and space. GRC tasked Computer Networks and Software Inc. (CNS) to examine protocols and architectures for an In-Space Internet Node. CNS has developed a methodology for network reference models to support NASA's four mission areas: Earth Science, Space Science, Human Exploration and Development of Space (HEDS), and Aerospace Technology. CNS previously developed a report that applied the methodology to three space Internet-based communications scenarios for future missions. CNS conceptualized, designed, and developed space Internet-based communications protocols and architectures for each of the independent scenarios. GRC selected for further analysis the scenario that involved unicast communications between a Low-Earth-Orbit (LEO) International Space Station (ISS) and a ground terminal Internet node via a Tracking and Data Relay Satellite (TDRS) transfer. This report contains a trade-off analysis of the selected scenario. The analysis examines the performance characteristics of the various protocols and architectures. The trade-off analysis incorporates the results of a CNS-developed analytical model that examined performance parameters.

  16. An Exploration of Scenarios to Support Sustainable Land Management Using Integrated Environmental Socio-economic Models

    NASA Astrophysics Data System (ADS)

    Fleskens, L.; Nainggolan, D.; Stringer, L. C.

    2014-11-01

    Scenario analysis constitutes a valuable deployment method for scientific models to inform environmental decision-making, particularly for evaluating land degradation mitigation options, which are rarely based on formal analysis. In this paper we demonstrate such an assessment using the PESERA-DESMICE modeling framework with various scenarios for 13 global land degradation hotspots. Starting with an initial assessment representing land degradation and productivity under current conditions, options to combat instances of land degradation are explored by determining: (1) Which technologies are most biophysically appropriate and most financially viable in which locations; we term these the "technology scenarios"; (2) how policy instruments such as subsidies influence upfront investment requirements and financial viability and how they lead to reduced levels of land degradation; we term these the "policy scenarios"; and (3) how technology adoption affects development issues such as food production and livelihoods; we term these the "global scenarios". Technology scenarios help choose the best technology for a given area in biophysical and financial terms, thereby outlining where policy support may be needed to promote adoption; policy scenarios assess whether a policy alternative leads to a greater extent of technology adoption; while global scenarios demonstrate how implementing technologies may serve wider sustainable development goals. Scenarios are applied to assess spatial variation within study sites as well as to compare across different sites. Our results show significant scope to combat land degradation and raise agricultural productivity at moderate cost. We conclude that scenario assessment can provide informative input to multi-level land management decision-making processes.

  17. An exploration of scenarios to support sustainable land management using integrated environmental socio-economic models.

    PubMed

    Fleskens, L; Nainggolan, D; Stringer, L C

    2014-11-01

    Scenario analysis constitutes a valuable deployment method for scientific models to inform environmental decision-making, particularly for evaluating land degradation mitigation options, which are rarely based on formal analysis. In this paper we demonstrate such an assessment using the PESERA-DESMICE modeling framework with various scenarios for 13 global land degradation hotspots. Starting with an initial assessment representing land degradation and productivity under current conditions, options to combat instances of land degradation are explored by determining: (1) Which technologies are most biophysically appropriate and most financially viable in which locations; we term these the "technology scenarios"; (2) how policy instruments such as subsidies influence upfront investment requirements and financial viability and how they lead to reduced levels of land degradation; we term these the "policy scenarios"; and (3) how technology adoption affects development issues such as food production and livelihoods; we term these the "global scenarios". Technology scenarios help choose the best technology for a given area in biophysical and financial terms, thereby outlining where policy support may be needed to promote adoption; policy scenarios assess whether a policy alternative leads to a greater extent of technology adoption; while global scenarios demonstrate how implementing technologies may serve wider sustainable development goals. Scenarios are applied to assess spatial variation within study sites as well as to compare across different sites. Our results show significant scope to combat land degradation and raise agricultural productivity at moderate cost. We conclude that scenario assessment can provide informative input to multi-level land management decision-making processes.

  18. Agricultural conservation planning framework: 1. Developing multipractice watershed planning scenarios and assessing nutrient reduction potential.

    PubMed

    Tomer, M D; Porter, S A; Boomer, K M B; James, D E; Kostel, J A; Helmers, M J; Isenhart, T M; McLellan, E

    2015-05-01

Spatial data on soils, land use, and topography, combined with knowledge of conservation effectiveness, can be used to identify alternatives to reduce nutrient discharge from small (hydrologic unit code [HUC]12) watersheds. Databases comprising soil attributes, agricultural land use, and light detection and ranging-derived elevation models were developed for two glaciated midwestern HUC12 watersheds: Iowa's Beaver Creek watershed has an older dissected landscape, and Lime Creek in Illinois is young and less dissected. Subsurface drainage is common in both watersheds. We identified locations for conservation practices, including in-field practices (grassed waterways), edge-of-field practices (nutrient-removal wetlands, saturated buffers), and drainage-water management, by applying terrain analyses, geographic criteria, and cross-classifications to field- and watershed-scale geographic data. Cover crops were randomly distributed to fields without geographic prioritization. A set of alternative planning scenarios was developed to represent a variety of extents of implementation among these practices. The scenarios were assessed for nutrient reduction potential using a spreadsheet approach to calculate the average nutrient-removal efficiency required among the practices included in each scenario to achieve a 40% NO3-N reduction. Results were evaluated in the context of the Iowa Nutrient Reduction Strategy, which reviewed nutrient-removal efficiencies of practices and established the 40% NO3-N reduction as Iowa's target for Gulf of Mexico hypoxia mitigation by agriculture. In both test watersheds, planning scenarios that could potentially achieve the targeted NO3-N reduction but remove <5% of cropland from production were identified. Cover crops and nutrient-removal wetlands were common to these scenarios. This approach provides an interim technology to assist local watershed planning and could provide planning scenarios to evaluate using watershed simulation models.
A set of ArcGIS tools is being released to enable transfer of this mapping technology. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.
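The spreadsheet-style calculation the abstract describes can be sketched in a few lines: given the fraction of a watershed's NO3-N load intercepted by the practices in a planning scenario, solve for the average removal efficiency those practices would need to meet the 40% reduction target. The load fractions below are hypothetical illustrations, not data from the study.

```python
# Hedged sketch of the scenario-assessment arithmetic: a scenario meets the
# target when (fraction of NO3-N load treated) * (average removal efficiency)
# equals the target reduction, so the required efficiency is their quotient.

TARGET_REDUCTION = 0.40  # Iowa Nutrient Reduction Strategy NO3-N target

def required_efficiency(load_fraction_treated, target=TARGET_REDUCTION):
    """Average removal efficiency the scenario's practices must achieve,
    given the fraction of the watershed's NO3-N load they intercept."""
    if load_fraction_treated <= 0:
        raise ValueError("scenario treats none of the load")
    return target / load_fraction_treated

# Hypothetical scenario: wetlands + cover crops intercept 70% of the load,
# so on average the practices must remove ~57% of the nitrate they treat.
print(round(required_efficiency(0.70), 3))
```

If the required efficiency exceeds what the literature supports for the chosen practices, the scenario cannot reach the target and more load must be routed through practices.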

  19. The effects of country-level population policy for enhancing adaptation to climate change

    NASA Astrophysics Data System (ADS)

    Gunasekara, N. K.; Kazama, S.; Yamazaki, D.; Oki, T.

    2012-08-01

The effectiveness of population policy scenarios in reducing the combined impacts of population change and climate change on water resources is explored. One no-policy scenario and two scenarios with population policy assumptions are employed in combination with water availability under the SRES scenarios A1b, B1 and A2 for the impact analysis. The population data used are from the World Bank. The river discharges per grid of horizontal resolution 0.5° are obtained from the Total Runoff Integrating Pathways (TRIP) of the University of Tokyo, Japan. Unlike the population scenarios utilized in the SRES emission scenarios and the newest Representative Concentration Pathways, the scenarios employed in this research are based, even after 2050, on country-level rather than regional growth assumptions. Our analysis implies that in combination with a more heterogeneous pattern of population changes across the world, a more convergent, environmentally friendly emissions scenario, such as B1, can result in a high-impact climate scenario, similar to A2, for the already water-stressed low latitudes. However, the effect of population change supersedes the changes in the climate scenarios. In 2100, Africa, the Middle East and parts of Asia are in extreme water stress under all scenarios. For countries with high population momentum, the population policy scenario with fertility-reduction assumptions yielded up to 6.1 times the water availability in Niger and 5.3 times that in Uganda compared with the no-policy scenario. Most of these countries are in Sub-Saharan Africa. These countries represent 24.5% of the global population in the no-policy scenario, and the scenario with fertility-reduction assumptions reduces this to 8.7% by 2100. This scenario is also effective at reducing the area under extreme water stress in these countries.
However, the policy scenario with assumptions of population stabilization at the replacement fertility rate increases the water stress in high-latitude countries. Nevertheless, the impact is low due to the high per capita water availability in the region. This research is expected to widen the understanding of the combined impacts of climate change in the future and of the strategies needed to enhance the space for adaptation.
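The per-capita comparison behind these scenario results can be illustrated with a minimal sketch: water availability is discharge divided by projected population, so with discharge held fixed, the policy gain reduces to the ratio of the two projected populations. All numbers below are hypothetical placeholders, not the study's data.

```python
# Illustrative sketch (hypothetical numbers) of comparing per-capita water
# availability between a no-policy and a fertility-reduction scenario.

def per_capita_availability(discharge_m3_per_year, population):
    """Annual water availability per person for one country/scenario."""
    return discharge_m3_per_year / population

def policy_gain(discharge_m3_per_year, pop_no_policy, pop_policy):
    """Factor by which the policy scenario raises per-capita availability
    relative to the no-policy scenario (same climate, same discharge)."""
    return (per_capita_availability(discharge_m3_per_year, pop_policy)
            / per_capita_availability(discharge_m3_per_year, pop_no_policy))

# With discharge unchanged, a 6.1x availability gain simply reflects a
# projected population 6.1x smaller under the policy scenario.
print(policy_gain(1.0e9, pop_no_policy=6.1e7, pop_policy=1.0e7))  # ~6.1
```

This also makes the paper's caveat visible: where the policy raises population (stabilization at replacement fertility in low-fertility countries), the same ratio falls below 1 and water stress increases.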

  20. 14 CFR 417.227 - Toxic release hazard analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Toxic release hazard analysis. 417.227..., DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.227 Toxic release hazard analysis. A flight safety analysis must establish flight commit criteria that protect the public from any...

  1. Dynamic microvesicle release and clearance within the cardiovascular system: triggers and mechanisms.

    PubMed

    Ayers, Lisa; Nieuwland, Rienk; Kohler, Malcolm; Kraenkel, Nicolle; Ferry, Berne; Leeson, Paul

    2015-12-01

    Interest in cell-derived microvesicles (or microparticles) within cardiovascular diagnostics and therapeutics is rapidly growing. Microvesicles are often measured in the circulation at a single time point. However, it is becoming clear that microvesicle levels both increase and decrease rapidly in response to certain stimuli such as hypoxia, acute cardiac stress, shear stress, hypertriglyceridaemia and inflammation. Consequently, the levels of circulating microvesicles will reflect the balance between dynamic mechanisms for release and clearance. The present review describes the range of triggers currently known to lead to microvesicle release from different cellular origins into the circulation. Specifically, the published data are used to summarize the dynamic impact of these triggers on the degree and rate of microvesicle release. Secondly, a summary of the current understanding of microvesicle clearance via different cellular systems, including the endothelial cell and macrophage, is presented, based on reported studies of clearance in experimental models and clinical scenarios, such as transfusion or cardiac stress. Together, this information can be used to provide insights into potential underlying biological mechanisms that might explain the increases or decreases in circulating microvesicle levels that have been reported and help to design future clinical studies. © 2015 Authors; published by Portland Press Limited.

  2. Performance evaluation of various K- anonymity techniques

    NASA Astrophysics Data System (ADS)

    Maheshwarkar, Nidhi; Pathak, Kshitij; Chourey, Vivekanand

    2011-12-01

In today's advanced scenario, where any piece of information is available in one click, data security is a primary concern. Individual information that sometimes needs to be hidden is easily obtainable using some tricks. Medical information and income details need to be kept away from adversaries and so are stored in private tables. Some publicly released information contains zip code, sex, and birth date. When this released information is linked with the private table, an adversary can recover the confidential information of individuals or respondents, i.e., name and medical status. So to protect respondents' identity, a concept called k-anonymity is used, which means each released record has at least (k-1) other records in the release whose values are indistinct over those fields that appear in the external data. K-anonymity can be achieved easily in the case of a single sensitive attribute, i.e., name, salary, or medical status, but it is quite difficult when multiple sensitive attributes are present. Generalization and suppression are used to achieve k-anonymity. This paper provides a formal introduction to k-anonymity and some techniques used with it, l-diversity and t-closeness. The paper covers the k-anonymity model and a comparative study of these concepts, along with a newly proposed concept for multiple sensitive attributes.
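As a minimal illustration of the k-anonymity definition above, the sketch below computes the k of a released table as the size of its smallest equivalence class over the quasi-identifier columns (zip code, sex, birth date, shown already generalized). The records and column names are hypothetical, not from the paper.

```python
# Minimal k-anonymity check: group released records by their
# quasi-identifier values; k is the smallest group size, since every
# record must be indistinguishable from at least k-1 others.
from collections import Counter

def k_anonymity(records, quasi_identifiers):
    """Return the k of a table over the given quasi-identifier columns."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return min(groups.values())

# Zip codes generalized by suppressing trailing digits; birth dates
# generalized to year only.
table = [
    {"zip": "476**", "sex": "F", "birth": "1980"},
    {"zip": "476**", "sex": "F", "birth": "1980"},
    {"zip": "479**", "sex": "M", "birth": "1975"},
    {"zip": "479**", "sex": "M", "birth": "1975"},
]

print(k_anonymity(table, ["zip", "sex", "birth"]))  # 2: table is 2-anonymous
```

Note what k-anonymity alone does not guarantee: if both records in an equivalence class share the same sensitive value (say, the same diagnosis), the attribute is still disclosed, which is the gap l-diversity and t-closeness address.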

  3. Assessing the Responses of Streamflow to Pollution Release in South Carolina

    NASA Astrophysics Data System (ADS)

    Maze, G.; Chovancak, N. A.; Samadi, S. Z.

    2017-12-01

The purpose of this investigation was to examine the effects of various stream flows on the transport of a pollutant downstream and to evaluate the uncertainty associated with using a single stream flow value when the true flow is unknown in the model. The area used for this study was Horse Creek in South Carolina, where a chlorine pollutant spill occurred in the past, resulting from a train derailment in Graniteville, SC. In the example scenario used, the chlorine gas pollutant was released into the environment, where it killed plants, contaminated groundwater, and caused evacuation of the city. Tracking the movement and concentrations at various points downstream in the river system is crucial to understanding how a single accidental pollutant release can affect the surrounding areas. Because of the lack of real-time data, this emergency response model uses historical monthly averages; however, these monthly averages do not reflect how widely the flow can vary within that month. Therefore, the assumption to use the historical monthly average flow data may not be accurate, and this investigation aims at quantifying the uncertainty associated with using a single stream flow value when the true stream flow may vary greatly. For the purpose of this investigation, the event in Graniteville was used as a case study to evaluate the emergency response model. This investigation was conducted by adjusting the STREAM II V7 program developed by Savannah River National Laboratory (SRNL) to model a confluence at the Horse Creek and the Savannah River system. This adjusted program was utilized to track the progress of the chlorine pollutant release and examine how it was transported downstream.
By adjusting this program, the concentrations and times taken to reach various points downstream of the release were obtained and can be used not only to analyze this particular pollutant release in Graniteville but also, with further adjustment, as a technical tool for emergency responders in future accidents. Further, the program was run with monthly maximum, minimum, and average advective flows, and an uncertainty analysis was conducted to examine the error associated with the input data. These results underscore the profound influence that streamflow magnitudes (maximum, minimum, and average) have on shaping downstream water quality.
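The sensitivity the authors test can be illustrated with a plug-flow back-of-the-envelope sketch: travel time to a downstream point is distance divided by advective velocity (discharge over cross-sectional area), so choosing the monthly minimum, average, or maximum flow changes the predicted arrival time several-fold. Reach geometry and flow values below are hypothetical, not Horse Creek data or STREAM II output.

```python
# Plug-flow travel-time sketch: velocity = Q / A, time = distance / velocity.
# Used here only to show how strongly the assumed advective flow controls
# the predicted arrival of a pollutant plug downstream.

def travel_time_hours(distance_m, flow_m3s, cross_section_m2):
    """Hours for a plug of pollutant to travel distance_m at velocity Q/A."""
    velocity = flow_m3s / cross_section_m2  # m/s
    return distance_m / velocity / 3600.0

reach_m, area_m2 = 20_000.0, 50.0  # hypothetical 20 km reach, 50 m^2 section
for label, q in [("monthly min", 5.0), ("monthly avg", 20.0), ("monthly max", 60.0)]:
    print(f"{label} ({q} m3/s): {travel_time_hours(reach_m, q, area_m2):.1f} h")
```

With these placeholder values the arrival estimate spans roughly 5 to 56 hours, which is why running the model across the monthly flow range, rather than the average alone, matters for emergency response.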

  4. Technology assessment of future intercity passenger transportation systems. Volume 4: Scenarios

    NASA Technical Reports Server (NTRS)

    1976-01-01

    Four background scenarios that relate to alternative states of society in the next 25 to 50 years are described. The scenarios were developed for use in analyzing and evaluating alternative future intercity transportation technologies. The scenarios are based, in part, on discussions contained in the issue papers and, in part, on separate analysis of social and economic trends considered relevant for the evolution of intercity transportation.

  5. Assessing global fossil fuel availability in a scenario framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bauer, Nico; Hilaire, Jérôme; Brecha, Robert J.

This study assesses global, long-term economic availability of coal, oil and gas within the Shared Socio-economic Pathway (SSP) scenario framework, considering alternative assumptions as to highly uncertain future developments of technology, policy and the economy. Diverse sets of trajectories are formulated varying the challenges to mitigation and adaptation of climate change. The potential CO2 emissions from fossil fuels make their availability a crucial element subject to deep uncertainties. The analysis is based on a well-established data set of cost-quantity combinations that assumes favorable techno-economic developments but ignores additional constraints on the extraction sector. This study significantly extends that analysis to include alternative assumptions for the fossil fuel sector consistent with the SSP scenario families and applies these filters to the original data set, thus resulting in alternative cumulative fossil fuel availability curves. In a Middle-of-the-Road scenario, low-cost fossil fuels embody carbon consistent with a RCP6.0 emission profile, if all the CO2 were emitted freely during the 21st century. In scenarios with high challenges to mitigation, the assumed embodied carbon in low-cost fossil fuels can trigger a RCP8.5 scenario; low mitigation challenges scenarios are still consistent with a RCP4.5 scenario.
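A cumulative cost-quantity availability curve of the kind this study filters can be sketched as follows: sort resource grades by unit extraction cost and accumulate the quantity available at or below each cost. The grades and numbers are illustrative placeholders, not the study's data set.

```python
# Sketch of building a cumulative cost-quantity (supply) curve from
# resource grades. SSP-style "filters" would then rescale or drop grades
# before accumulation to reflect alternative extraction-sector assumptions.

def cumulative_availability(grades):
    """grades: list of (unit_cost, quantity) pairs. Returns the curve as
    (unit_cost, cumulative quantity available at <= that cost) pairs."""
    curve, total = [], 0.0
    for cost, qty in sorted(grades):  # cheapest grades first
        total += qty
        curve.append((cost, total))
    return curve

# Hypothetical coal grades: ($/t, Gt) available at that extraction cost.
coal_grades = [(60.0, 400.0), (20.0, 1000.0), (40.0, 800.0)]
print(cumulative_availability(coal_grades))
# [(20.0, 1000.0), (40.0, 1800.0), (60.0, 2200.0)]
```

Multiplying each cumulative quantity by an emission factor would give the embodied carbon available below each cost, which is how such curves map onto RCP-style emission ceilings.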

  6. Scenario analysis and strategic planning: practical applications for radiology practices.

    PubMed

    Lexa, Frank James; Chan, Stephen

    2010-05-01

Modern business science has many tools that can be of great value to radiologists and their practices. One of the most important and underused is long-term planning. Part of the problem has been the pace of change. Making a 5-year plan makes sense only if you develop robust scenarios of possible future conditions you will face. Scenario analysis is one of many highly regarded tools that can improve your predictive capability. However, as with many tools, it pays to have some training and to get practical tips on how to improve their value. It also helps to learn from other people's mistakes rather than your own. The authors discuss both theoretical and practical issues in using scenario analysis to improve your planning process. They discuss actionable ways this set of tools can be applied in a group meeting or retreat. Copyright (c) 2010 American College of Radiology. Published by Elsevier Inc. All rights reserved.

  7. Population and business exposure to twenty scenario earthquakes in the State of Washington

    USGS Publications Warehouse

    Wood, Nathan; Ratliff, Jamie

    2011-01-01

    This report documents the results of an initial analysis of population and business exposure to scenario earthquakes in Washington. This analysis was conducted to support the U.S. Geological Survey (USGS) Pacific Northwest Multi-Hazards Demonstration Project (MHDP) and an ongoing collaboration between the State of Washington Emergency Management Division (WEMD) and the USGS on earthquake hazards and vulnerability topics. This report was developed to help WEMD meet internal planning needs. A subsequent report will provide analysis to the community level. The objective of this project was to use scenario ground-motion hazard maps to estimate population and business exposure to twenty Washington earthquakes. In consultation with the USGS Earthquake Hazards Program and the Washington Division of Geology and Natural Resources, the twenty scenario earthquakes were selected by WEMD (fig. 1). Hazard maps were then produced by the USGS and placed in the USGS ShakeMap archive.

  8. Developing a database for pedestrians' earthquake emergency evacuation in indoor scenarios.

    PubMed

    Zhou, Junxue; Li, Sha; Nie, Gaozhong; Fan, Xiwei; Tan, Jinxian; Li, Huayue; Pang, Xiaoke

    2018-01-01

    With the booming development of evacuation simulation software, developing an extensive database in indoor scenarios for evacuation models is imperative. In this paper, we conduct a qualitative and quantitative analysis of the collected videotapes and aim to provide a complete and unitary database of pedestrians' earthquake emergency response behaviors in indoor scenarios, including human-environment interactions. Using the qualitative analysis method, we extract keyword groups and keywords that code the response modes of pedestrians and construct a general decision flowchart using chronological organization. Using the quantitative analysis method, we analyze data on the delay time, evacuation speed, evacuation route and emergency exit choices. Furthermore, we study the effect of classroom layout on emergency evacuation. The database for indoor scenarios provides reliable input parameters and allows the construction of real and effective constraints for use in software and mathematical models. The database can also be used to validate the accuracy of evacuation models.

  9. Quantification of physical and economic impacts of climate change on public infrastructure in Alaska and benefits of global greenhouse gas mitigation

    NASA Astrophysics Data System (ADS)

    Melvin, A. M.; Larsen, P.; Boehlert, B.; Martinich, J.; Neumann, J.; Chinowsky, P.; Schweikert, A.; Strzepek, K.

    2015-12-01

    Climate change poses many risks and challenges for the Arctic and sub-Arctic, including threats to infrastructure. The safety and stability of infrastructure in this region can be impacted by many factors including increased thawing of permafrost soils, reduced coastline protection due to declining arctic sea ice, and changes in inland flooding. The U.S. Environmental Protection Agency (EPA) is coordinating an effort to quantify physical and economic impacts of climate change on public infrastructure across the state of Alaska and estimate how global greenhouse gas (GHG) mitigation may avoid or reduce these impacts. This research builds on the Climate Change Impacts and Risk Analysis (CIRA) project developed for the contiguous U.S., which is described in an EPA report released in June 2015. We are using a multi-model analysis focused primarily on the impacts of changing permafrost, coastal erosion, and inland flooding on a range of infrastructure types, including transportation (e.g. roads, airports), buildings and harbors, energy sources and transmission, sewer and water systems, and others. This analysis considers multiple global GHG emission scenarios ranging from a business as usual future to significant global action. These scenarios drive climate projections through 2100 spanning a range of outcomes to capture variability amongst climate models. Projections are being combined with a recently developed public infrastructure database and integrated into a version of the Infrastructure Planning Support System (IPSS) we are modifying for use in the Arctic and sub-Arctic region. The IPSS tool allows for consideration of both adaptation and reactive responses to climate change. 
Results of this work will address a gap in our understanding of climate change impacts in Alaska, provide estimates of the physical and economic damages we may expect with and without global GHG mitigation, and produce important insights about infrastructure vulnerabilities in response to warming at northern latitudes.

  10. Women's Drinking Decisions in Heterosocial Situations: Development and Validation of Scenarios to Assess Influence of Attraction and Risk-Awareness.

    PubMed

    Noel, Nora E; Ogle, Richard L; Maisto, Stephen A; Jackson, Lee A; Loomis, Randi B; Heaton, Jennifer A

    2016-07-01

These three related studies created a set of ecologically valid scenarios for assessing relative associations of both attraction and sexual coercion risk-recognition in college women's heterosocial situational drinking decisions. The first study constructed nine scenarios using input from heterosexual drinking women in the age cohort (18-30) most likely to experience alcohol-related sexual coercion. In the second study, 50 female undergraduates (ages 18-25) assessed the salience of three important dimensions (attraction, risk, and realism) in these scenarios. The third study was a factor analysis (and a follow-up confirmatory factor analysis) of the elements of coercion risk as perceived by the target group, with two female samples recruited 1 year apart (Sample 1: N = 157, ages 18-29; Sample 2: N = 157, ages 18-30). Results confirmed that the scenarios could be a useful vehicle for assessing how women balance risk and attraction to make in-the-moment heterosocial drinking decisions. The factor analysis showed participants perceived two types of situations, based on whether the male character was "Familiar" or "Just Met", and perceived themselves as happier and more excited with Familiar males. However, in contrast to HIV risk studies, Familiar males were perceived as higher risk for unwanted sex. Future research will use the six scenarios that emerged from the factor analysis to study how attraction and risk perception differentially affect young adult women's social drinking decisions.

  11. 10 CFR 63.322 - Human intrusion scenario.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 2 2010-01-01 2010-01-01 false Human intrusion scenario. 63.322 Section 63.322 Energy... REPOSITORY AT YUCCA MOUNTAIN, NEVADA Postclosure Public Health and Environmental Standards Human Intrusion Standard § 63.322 Human intrusion scenario. For the purposes of the analysis of human intrusion, DOE must...

  12. Insights into future air quality: Analysis of future emissions scenarios using the MARKAL model

    EPA Science Inventory

    This presentation will provide an update on the development and evaluation of four Air Quality Futures (AQF) scenarios. These scenarios represent widely different assumptions regarding the evolution of the U.S. energy system over the next 40 years. The primary differences between...

  13. Evaluation Framework for Dependable Mobile Learning Scenarios

    ERIC Educational Resources Information Center

    Bensassi, Manel; Laroussi, Mona

    2014-01-01

    The goal of the dependability analysis is to predict inconsistencies and to reveal ambiguities and incompleteness in the designed learning scenario. Evaluation, in traditional learning design, is generally planned after the execution of the scenario. In mobile learning, this stage becomes too difficult and expensive to apply due to the complexity…

  14. F-106 tow cable attachment and release mechanism for Eclipse program

    NASA Technical Reports Server (NTRS)

    1997-01-01

View of the tow cable attachment and release mechanism forward of the cockpit on the QF-106 Eclipse aircraft. This mechanism held and then released the Vectran rope used to tow the QF-106 behind an Air Force C-141A. In 1997 and 1998, the Dryden Flight Research Center at Edwards, California, supported and hosted a Kelly Space & Technology, Inc. project called Eclipse, which sought to demonstrate the feasibility of a reusable tow-launch vehicle concept. The project goal was to successfully tow, in flight, a modified QF-106 delta-wing aircraft with an Air Force C-141A transport aircraft. This would demonstrate the possibility of towing and launching an actual launch vehicle from behind a tow plane. Dryden was the responsible test organization and had flight safety responsibility for the Eclipse project. Dryden provided engineering, instrumentation, simulation, modification, maintenance, range support, and research pilots for the test program. The Air Force Flight Test Center (AFFTC), Edwards, California, supplied the C-141A transport aircraft and crew and configured the aircraft as needed for the tests. The AFFTC also provided the concept and detail design and analysis as well as hardware for the tow system and QF-106 modifications. Dryden performed the modifications to convert the QF-106 drone into the piloted EXD-01 (Eclipse eXperimental Demonstrator-01) experimental aircraft. Kelly Space & Technology hoped to use the results gleaned from the tow test in developing a series of low-cost, reusable launch vehicles. These tests demonstrated the validity of towing a delta-wing aircraft having high wing loading, validated the tow simulation model, and demonstrated various operational procedures, such as ground processing of in-flight maneuvers and emergency abort scenarios.

  15. Closeup of QF-106 release hook for Eclipse program

    NASA Technical Reports Server (NTRS)

    1997-01-01

View of the release hook on the QF-106 that allowed the pilot to release the tow rope extending from the C-141A tow plane in the Eclipse project. In 1997 and 1998, the Dryden Flight Research Center at Edwards, California, supported and hosted a Kelly Space & Technology, Inc. project called Eclipse, which sought to demonstrate the feasibility of a reusable tow-launch vehicle concept. The project goal was to successfully tow, in flight, a modified QF-106 delta-wing aircraft with an Air Force C-141A transport aircraft. This would demonstrate the possibility of towing and launching an actual launch vehicle from behind a tow plane. Dryden was the responsible test organization and had flight safety responsibility for the Eclipse project. Dryden provided engineering, instrumentation, simulation, modification, maintenance, range support, and research pilots for the test program. The Air Force Flight Test Center (AFFTC), Edwards, California, supplied the C-141A transport aircraft and crew and configured the aircraft as needed for the tests. The AFFTC also provided the concept and detail design and analysis as well as hardware for the tow system and QF-106 modifications. Dryden performed the modifications to convert the QF-106 drone into the piloted EXD-01 (Eclipse eXperimental Demonstrator-01) experimental aircraft. Kelly Space & Technology hoped to use the results gleaned from the tow test in developing a series of low-cost, reusable launch vehicles. These tests demonstrated the validity of towing a delta-wing aircraft having high wing loading, validated the tow simulation model, and demonstrated various operational procedures, such as ground processing of in-flight maneuvers and emergency abort scenarios.

  16. An activity theory perspective of how scenario-based simulations support learning: a descriptive analysis.

    PubMed

    Battista, Alexis

    2017-01-01

    The dominant frameworks for describing how simulations support learning emphasize increasing access to structured practice and the provision of feedback which are commonly associated with skills-based simulations. By contrast, studies examining student participants' experiences during scenario-based simulations suggest that learning may also occur through participation. However, studies directly examining student participation during scenario-based simulations are limited. This study examined the types of activities student participants engaged in during scenario-based simulations and then analyzed their patterns of activity to consider how participation may support learning. Drawing from Engeström's first-, second-, and third-generation activity systems analysis, an in-depth descriptive analysis was conducted. The study drew from multiple qualitative methods, namely narrative, video, and activity systems analysis, to examine student participants' activities and interaction patterns across four video-recorded simulations depicting common motivations for using scenario-based simulations (e.g., communication, critical patient management). The activity systems analysis revealed that student participants' activities encompassed three clinically relevant categories, including (a) use of physical clinical tools and artifacts, (b) social interactions, and (c) performance of structured interventions. Role assignment influenced participants' activities and the complexity of their engagement. Importantly, participants made sense of the clinical situation presented in the scenario by reflexively linking these three activities together. Specifically, student participants performed structured interventions, relying upon the use of physical tools, clinical artifacts, and social interactions together with interactions between students, standardized patients, and other simulated participants to achieve their goals. 
When multiple student participants were present, such as in a team-based scenario, they distributed the workload to achieve their goals. The findings suggest that student participants learned as they engaged in these scenario-based simulations when they worked to make sense of the patient's clinical presentation. The findings may provide insight into how student participants' meaning-making efforts are mediated by the cultural artifacts (e.g., physical clinical tools) they access, the social interactions they engage in, the structured interventions they perform, and the roles they are assigned. The findings also highlight the complex and emergent properties of scenario-based simulations as well as how activities are nested. Implications for learning, instructional design, and assessment are discussed.

  17. Integrated assessment of future potential global change scenarios and their hydrological impacts in coastal aquifers - a new tool to analyse management alternatives in the Plana Oropesa-Torreblanca aquifer

    NASA Astrophysics Data System (ADS)

    Pulido-Velazquez, David; Renau-Pruñonosa, Arianna; Llopis-Albert, Carlos; Morell, Ignacio; Collados-Lara, Antonio-Juan; Senent-Aparicio, Javier; Baena-Ruiz, Leticia

    2018-05-01

    Any change in the components of the water balance in a coastal aquifer, whether natural or anthropogenic, can alter the freshwater-saltwater equilibrium. In this sense, climate change (CC) and land use and land cover (LULC) change might significantly influence the availability of groundwater resources in the future. These coastal systems demand an integrated analysis of quantity and quality issues to obtain an appropriate assessment of hydrological impacts using density-dependent flow solutions. The aim of this work is to perform an integrated analysis of future potential global change (GC) scenarios and their hydrological impacts in a coastal aquifer, the Plana Oropesa-Torreblanca aquifer. It is a Mediterranean aquifer that extends over 75 km2 in which important historical LULC changes have been produced and are planned for the future. Future CC scenarios will be defined by using an equi-feasible and non-feasible ensemble of projections based on the results of a multi-criteria analysis of the series generated from several regional climatic models with different downscaling approaches. The hydrological impacts of these CC scenarios combined with future LULC scenarios will be assessed with a chain of models defined by a sequential coupling of rainfall-recharge models, crop irrigation requirements and irrigation return models (for the aquifer and its neighbours that feed it), and a density-dependent aquifer approach. This chain of models, calibrated using the available historical data, allows testing of the conceptual approximation of the aquifer behaviour. The models are also fed with series representing potential global change scenarios in order to perform a sensitivity analysis regarding future scenarios of rainfall recharge, lateral flows coming from the hydraulically connected neighbouring aquifer, agricultural recharge (taking into account expected future LULC changes) and sea level rise (SLR).
The proposed analysis is valuable for improving our knowledge about the aquifer, and so comprises a tool to design sustainable adaptation management strategies taking into account the uncertainty in future GC conditions and their impacts. The results show that GC scenarios produce significant increases in the variability of flow budget components and in the salinity.

  18. Economic Analysis of Different Electric Vehicle Charging Scenarios

    NASA Astrophysics Data System (ADS)

    Ying, Li; Haiming, Zhou; Xiufan, Ma; Hao, Wang

    2017-05-01

    The influence of electric vehicles (EVs) on the grid cannot be ignored. Economic analysis of different charging scenarios helps guide users to charge or discharge in an orderly manner. EV charging models are built for disordered charging, valley charging, intelligent charging, and V2G (vehicle-to-grid); from these, the changes in charging load under each scenario can be examined to analyze the influence on the initial load curve, and users' average costs can be compared. The Monte Carlo method is used to simulate EV charging behavior, costs in the different charging scenarios are compared, social cost is introduced in the V2G scenario, and the relationship between users' average cost and social cost is analyzed. Tests show that users' cost is lowest in the V2G scenario, and the larger the vehicle fleet, the greater the social-cost savings.
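    The Monte Carlo step described above can be sketched minimally: draw each vehicle's plug-in time and charging duration at random and accumulate an hourly load curve for the uncoordinated ("disordered") case. The distributions and the 7 kW charging power are illustrative assumptions, not the paper's calibrated inputs.

```python
import random

def simulate_disordered_charging(n_vehicles, power_kw=7.0, seed=42):
    """Aggregate 24-h load (kW per hour slot) from uncoordinated EV charging."""
    rng = random.Random(seed)
    load = [0.0] * 24
    for _ in range(n_vehicles):
        # Plug-in time clustered around the evening commute (assumed).
        start = int(rng.gauss(18, 2)) % 24
        # Charging duration in hours, drawn from an assumed daily-mileage spread.
        duration = max(1, min(8, int(rng.gauss(4, 1.5))))
        for h in range(start, start + duration):
            load[h % 24] += power_kw
    return load

load = simulate_disordered_charging(1000)
peak_hour = load.index(max(load))  # hour slot with the highest added load
```

The same loop, with start times shifted into off-peak hours, would give a valley-charging scenario for comparison against the initial load curve.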

  19. Gaia and exoplanets: a revolution in the making

    NASA Astrophysics Data System (ADS)

    Sozzetti, Alessandro

    2017-09-01

    The Gaia global astrometry mission is now entering its fourth year of routine science operations. With the publication of the first data release in September 2016, it has begun to fulfil its promise for revolutionary science in countless aspects of Galactic astronomy and astrophysics. I briefly review the Gaia mission status of operations and the scenario for the upcoming intermediate data releases, focusing on important lessons learned. Then, I illustrate the Gaia exoplanet science case, and discuss how the field will be revolutionized by the power of microarcsecond (μas) astrometry that is about to be unleashed. I conclude by touching upon some of the synergy elements that will call for combination of Gaia data with other indirect and direct detection and characterization techniques, for much improved understanding of exoplanetary systems.

  20. European Union regulators and industry agree on improving specific environmental release categories: Report from the exchange network for exposure scenarios specific environmental release category workshop on May 13, 2016.

    PubMed

    Ahrens, Andreas; Moilanen, Marianne; Martin, Sara; Garcia-John, Enrique; Sättler, Daniel; Bakker, Joost; Reihlen, Antonia; Wind, Thorsten; Tolls, Johannes

    2017-09-01

    Specific environmental release categories (SPERCs) are an instrument for lower-tier environmental emissions assessments. They support chemical safety assessments under the European Union (EU) regulation Registration, Evaluation, Authorisation, and Restriction of Chemicals. SPERCs have been developed by industry and subjected to regulatory review. Within the framework of the Chemical Safety Report/Exposure Scenario Roadmap, the EU Chemicals Agency (ECHA), the EU Member State authorities, and European industry sector associations collaborate to improve the quality of the SPERCs. Following up on the outcome of ECHA's SPERC Best Practice Project, industry, together with ECHA, developed an updated SPERC factsheet template and guidance on how to fill it out. In addition, industry developed 2 sets of SPERC factsheet examples and the corresponding SPERC background documents. These documents were submitted to a multistakeholder review process. The comments from the review were discussed at a workshop in spring 2016. The workshop participants acknowledged the revised factsheet format including the corresponding guidance, the 2 SPERC factsheets, and the 2 SPERC background documents as best practice examples. The package is expected to support further improvement of the quality of the SPERCs. A common understanding was achieved of the need to match the level of detail of the use conditions description with the risk to be controlled (i.e., the emission intensity and hazard profile of the substances) and with the level of conservatism of SPERC release factors. The complete and transparent documentation of the derivation of the release factors and of their conservatism is conceived as crucial for the credibility of the SPERCs, such that they can be trusted by partners in the chemicals supply chain and by regulators. To that end, background documents will include a dedicated section describing the conservatism of SPERCs. 
    The workshop concluded with an outline of the practical way forward for the improvement of SPERC documentation. Integr Environ Assess Manag 2017;13:815-820. © 2017 The Authors. Integrated Environmental Assessment and Management published by Wiley Periodicals, Inc. on behalf of Society of Environmental Toxicology & Chemistry (SETAC).

  1. Simulating future water temperatures in the North Santiam River, Oregon

    USGS Publications Warehouse

    Buccola, Norman; Risley, John C.; Rounds, Stewart A.

    2016-01-01

    A previously calibrated two-dimensional hydrodynamic and water-quality model (CE-QUAL-W2) of Detroit Lake in western Oregon was used in conjunction with inflows derived from Precipitation-Runoff Modeling System (PRMS) hydrologic models to examine in-lake and downstream water temperature effects under future climate conditions. Current and hypothetical operations and structures at Detroit Dam were imposed on boundary conditions derived from downscaled General Circulation Models in base (1990–1999) and future (2059–2068) periods. Compared with the base period, future air temperatures were about 2 °C warmer year-round. Higher air temperature and lower precipitation under the future period resulted in a 23% reduction in mean annual PRMS-simulated discharge and a 1 °C increase in mean annual estimated stream temperatures flowing into the lake compared to the base period. Simulations incorporating current operational rules and minimum release rates at Detroit Dam to support downstream habitat, irrigation, and water supply during key times of year resulted in lower future lake levels. That scenario results in a lake level that is above the dam’s spillway crest only about half as many days in the future compared to historical frequencies. Managing temperature downstream of Detroit Dam depends on the ability to blend warmer water from the lake’s surface with cooler water from deep in the lake, and the spillway is an important release point near the lake’s surface. Annual average in-lake and release temperatures from Detroit Lake warmed 1.1 °C and 1.5 °C from base to future periods under present-day dam operational rules and fill schedules. Simulated dam operations such as beginning refill of the lake 30 days earlier or reducing minimum release rates (to keep more water in the lake to retain the use of the spillway) mitigated future warming to 0.4 and 0.9 °C below existing operational scenarios during the critical autumn spawning period for endangered salmonids. 
A hypothetical floating surface withdrawal at Detroit Dam improved temperature control in summer and autumn (0.6 °C warmer in summer, 0.6 °C cooler in autumn compared to existing structures) without altering release rates or lake level management rules.
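    The blending principle described above reduces, at back-of-envelope level, to a two-source mixing equation; the sketch below solves it for the surface-withdrawal fraction. The temperatures in the example are hypothetical, and CE-QUAL-W2 of course resolves the full two-dimensional dynamics rather than a single mixing step.

```python
def blend_fraction(t_surface, t_deep, t_target):
    """Fraction of the release drawn from the surface outlet so the blend
    hits a target temperature:
        T_mix = f*T_surface + (1-f)*T_deep
        =>  f = (T_target - T_deep) / (T_surface - T_deep)
    A back-of-envelope sketch with hypothetical temperatures."""
    f = (t_target - t_deep) / (t_surface - t_deep)
    return min(1.0, max(0.0, f))  # clamp to the physically possible range

# E.g. 20 °C surface water, 8 °C deep water, 11 °C target release.
f = blend_fraction(20.0, 8.0, 11.0)
```

When the target falls outside the range spanned by the two sources (as in autumn, when the lake surface may be too warm), the clamp shows why temperature control fails without an outlet such as the spillway or a floating withdrawal.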

  2. Health Physics Code System for Evaluating Accidents Involving Radioactive Materials.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2014-10-01

    Version 03 The HOTSPOT Health Physics codes were created to provide Health Physics personnel with a fast, field-portable calculational tool for evaluating accidents involving radioactive materials. HOTSPOT codes provide a first-order approximation of the radiation effects associated with the atmospheric release of radioactive materials. The developer's website is: http://www.llnl.gov/nhi/hotspot/. Four general programs, PLUME, EXPLOSION, FIRE, and RESUSPENSION, calculate a downwind assessment following the release of radioactive material resulting from a continuous or puff release, explosive release, fuel fire, or an area contamination event. Additional programs deal specifically with the release of plutonium, uranium, and tritium to expedite an initial assessment of accidents involving nuclear weapons. The FIDLER program can calibrate radiation survey instruments for ground survey measurements and initial screening of personnel for possible plutonium uptake in the lung. The HOTSPOT codes are fast, portable, easy to use, and fully documented in electronic help files. HOTSPOT supports color high-resolution monitors and printers for concentration plots and contours. The codes have been extensively used by the DOE community since 1985. Tables and graphical output can be directed to the computer screen, printer, or a disk file. The graphical output consists of dose and ground contamination as a function of plume centerline downwind distance, and radiation dose and ground contamination contours. Users have the option of displaying scenario text on the plots. HOTSPOT 3.0.1 fixes three significant Windows 7 issues: the executable now installs properly under "Program Files/HotSpot 3.0"; the installation package is smaller, having removed a dependency on older Windows DLL files that previously needed to be included; and forms now scale properly based on DPI instead of font for users who change their screen resolution to something other than 100%, which is more common in Windows 7. Previously, the Windows installer started every time most users started the program, even after HotSpot was already installed; now the installer may come up once for each new user, but only the first time they run HotSpot on a particular machine, so no user should see it more than once over many uses. GPS capability was updated to directly use a serial port through a USB connection (non-USB connections should still work), and table output inconsistencies for fire scenarios were fixed.
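    The "first-order approximation" that HOTSPOT provides is, at its core, Gaussian plume dispersion. The sketch below shows a generic textbook form of the ground-level centerline air concentration; HOTSPOT's own parameterization (dispersion coefficients, deposition, dose conversion) is more elaborate, and nothing here reproduces its outputs.

```python
import math

def centerline_concentration(q_release, u_wind, sigma_y, sigma_z, h_release=0.0):
    """Ground-level centerline concentration for a continuous release:
        C = Q / (pi * u * sigma_y * sigma_z) * exp(-H^2 / (2 * sigma_z^2))
    where Q is the release rate, u the wind speed, sigma_y/sigma_z the
    horizontal/vertical dispersion parameters at the downwind distance of
    interest, and H the effective release height.  Textbook form only."""
    return (q_release / (math.pi * u_wind * sigma_y * sigma_z)
            * math.exp(-h_release ** 2 / (2 * sigma_z ** 2)))

# Ground-level release versus an elevated release with the same dispersion.
c_ground = centerline_concentration(1.0, 2.0, 10.0, 5.0)
c_stack = centerline_concentration(1.0, 2.0, 10.0, 5.0, h_release=20.0)
```

In practice sigma_y and sigma_z grow with downwind distance according to the atmospheric stability class, which is how the downwind dose-versus-distance curves in the HOTSPOT output arise.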

  3. The Nitrogen Footprint Tool Network: A Multi-Institution Program To Reduce Nitrogen Pollution

    PubMed Central

    Leach, Allison M.; Leary, Neil; Baron, Jill; Compton, Jana E.; Galloway, James N.; Hastings, Meredith G.; Kimiecik, Jacob; Lantz-Trissel, Jonathan; de la Reguera, Elizabeth; Ryals, Rebecca

    2017-01-01

    Anthropogenic sources of reactive nitrogen have local and global impacts on air and water quality and detrimental effects on human and ecosystem health. This article uses the Nitrogen Footprint Tool (NFT) to determine the amount of nitrogen (N) released as a result of institutional consumption. The sectors accounted for include food (consumption and upstream production), energy, transportation, fertilizer, research animals, and agricultural research. The NFT is then used for scenario analysis to manage and track reductions, which are driven by the consumption behaviors of both the institution itself and its constituent individuals. In this article, the first seven completed institution nitrogen footprint results are presented. The Nitrogen Footprint Tool Network aims to develop footprints for many institutions to encourage widespread upper-level management strategies that will create significant reductions in reactive nitrogen released to the environment. Energy use and food purchases are the two largest sectors contributing to institution nitrogen footprints. Ongoing efforts by institutions to reduce greenhouse gas emissions also help to reduce the nitrogen footprint, but the impact of food production on nitrogen pollution has not been directly addressed by the higher education sustainability community. The Nitrogen Footprint Tool Network found that institutions could reduce their nitrogen footprints by optimizing food purchasing to reduce consumption of animal products and minimize food waste, as well as by reducing dependence on fossil fuels for energy. PMID:29350216

  4. The Nitrogen Footprint Tool network: A multi-institution program to reduce nitrogen pollution

    USGS Publications Warehouse

    Castner, Elizabeth A.; Leach, Allison M.; Leary, Neil; Baron, Jill S.; Compton, Jana E.; Galloway, James N.; Hastings, Meredith G.; Kimiecik, Jacob; Lantz-Trissel, Jonathan; de la Reguera, Elizabeth; Ryals, Rebecca

    2017-01-01

    Anthropogenic sources of reactive nitrogen have local and global impacts on air and water quality and detrimental effects on human and ecosystem health. This paper uses the nitrogen footprint tool (NFT) to determine the amount of nitrogen (N) released as a result of institutional consumption. The sectors accounted for include food (consumption and upstream production), energy, transportation, fertilizer, research animals, and agricultural research. The NFT is then used for scenario analysis to manage and track reductions, which are driven by the consumption behaviors of both the institution itself and its constituent individuals. In this paper, the first seven completed institution nitrogen footprint results are presented. The institution NFT network aims to develop footprints for many institutions to encourage widespread upper-level management strategies that will create significant reductions in reactive nitrogen released to the environment. Energy use and food purchases are the two largest sectors contributing to institution nitrogen footprints. Ongoing efforts by institutions to reduce greenhouse gas emissions also help to reduce the nitrogen footprint, but the impact of food production on nitrogen pollution has not been directly addressed by the higher-ed sustainability community. The NFT Network found that institutions could reduce their nitrogen footprints by optimizing food purchasing to reduce consumption of animal products and minimize food waste, as well as reducing dependence on fossil fuels for energy.

  5. Modeling cost of ultrasound versus nerve stimulator guidance for nerve blocks with sensitivity analysis.

    PubMed

    Liu, Spencer S; John, Raymond S

    2010-01-01

    Ultrasound guidance for regional anesthesia has increased in popularity. However, the cost of ultrasound versus nerve stimulator guidance is controversial, as multiple and varying cost inputs are involved. Sensitivity analysis allows modeling of different scenarios and determination of the relative importance of each cost input for a given scenario. We modeled cost per patient of ultrasound versus nerve stimulator using single-factor sensitivity analysis for 4 different clinical scenarios designed to span the expected financial impact of ultrasound guidance. The primary cost factors for ultrasound were revenue from billing for ultrasound (85% of variation in final cost), number of patients examined per ultrasound machine (10%), and block success rate (2.6%). In contrast, the most important input factors for nerve stimulator were the success rate of the nerve stimulator block (89%) and the amount of liability payout for failed airway due to rescue general anesthesia (9%). Depending on clinical scenario, ultrasound was either a profit or cost center. If revenue is generated, then ultrasound-guided blocks consistently become a profit center regardless of clinical scenario in our model. Without revenue, the clinical scenario dictates the cost of ultrasound. In an ambulatory setting, ultrasound is highly competitive with nerve stimulator and requires at least a 96% success rate with nerve stimulator before becoming more expensive. In a hospitalized scenario, ultrasound is consistently more expensive as the uniform use of general anesthesia and hospitalization negate any positive cost effects from greater efficiency with ultrasound.
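    Single-factor sensitivity analysis of the kind used here can be sketched as follows: vary one input at a time between its bounds while holding the rest at baseline, and rank inputs by the output swing they produce. The cost model, input names, and all numbers below are hypothetical stand-ins, not the published model.

```python
def cost_per_patient(inputs):
    """Toy per-patient cost of ultrasound guidance (hypothetical model)."""
    equipment = inputs["machine_cost"] / inputs["patients_per_machine"]
    failure = (1.0 - inputs["success_rate"]) * inputs["failure_cost"]
    return equipment + failure - inputs["billing_revenue"]

baseline = {"machine_cost": 40000.0, "patients_per_machine": 4000.0,
            "billing_revenue": 60.0, "success_rate": 0.95, "failure_cost": 200.0}

def single_factor_swing(model, base, bounds):
    """One-at-a-time sensitivity: output swing as each input moves lo -> hi."""
    swings = {}
    for name, (lo, hi) in bounds.items():
        outs = []
        for value in (lo, hi):
            trial = dict(base)   # hold everything else at baseline
            trial[name] = value
            outs.append(model(trial))
        swings[name] = abs(outs[1] - outs[0])
    return swings

swings = single_factor_swing(cost_per_patient, baseline, {
    "billing_revenue": (0.0, 100.0),
    "patients_per_machine": (1000.0, 8000.0),
    "success_rate": (0.90, 0.99),
})
most_influential = max(swings, key=swings.get)
```

With these toy bounds the billing-revenue input dominates the swing, mirroring the paper's finding that revenue from billing explains most of the variation in the final cost of ultrasound.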

  6. Impact on the Fe redox cycling of organic ligands released by Synechococcus PCC 7002, under different iron fertilization scenarios. Modeling approach

    NASA Astrophysics Data System (ADS)

    Samperio-Ramos, Guillermo; González-Dávila, Melchor; Santana-Casiano, J. Magdalena

    2018-06-01

    The kinetics of Fe redox transformations are of crucial importance in determining the bioavailability of iron, because inorganic Fe(II) and weakly organic Fe complexes are the species most easily assimilated by phytoplankton. The role played by the natural organic ligands excreted by the cyanobacterium Synechococcus PCC 7002 on the iron redox chemistry was studied at different stages of growth, considering changes in the organic exudation of the cyanobacteria associated with growth under two different scenarios of iron availability. The oxidation/reduction processes of iron were studied at nanomolar levels and under different physicochemical conditions of pH (7.2-8.2), temperature (5-35 °C) and salinity (10-37). The presence of natural organic exudates of Synechococcus affected the redox behavior of iron. A pH-dependent and photo-induced Fe(III) reduction process was detected in the presence of exudates produced under Fe-Low conditions. Photolytic reactions also modified the reactivity of those exudates with respect to Fe(II), increasing its lifetime in seawater. Without light-mediated processes, organic ligands excreted under iron-deficient conditions intensified the Fe(II) oxidation at pH < 7.5. The organic exudates released under High-Fe conditions retarded the Fe(II) oxidation rate, as a function of DOC produced. The changes in the apparent oxidation rate were fitted to polynomial functions for both of the Fe-scenarios considered. A kinetic modeling approach to describe the speciation and the contribution of individual Fe(II) species to the overall oxidation rate was applied, considering the experimental data and delimiting the equilibrium and redox constants between iron and the major ligands present in solution. Two organic-type ligands with different iron-chelation properties were included in the model for the exudates of Synechococcus PCC 7002. The Fe(II) speciation was radically affected when organic ligands were considered.
The individual contributions to the overall Fe(II) oxidation rate demonstrated that these organic ligands played a key role in the oxidation process, although their contributions were dependent on the prescribed iron conditions. The study, therefore, suggests that the variability in the composition and nature of organic exudates released, due to iron availability conditions, might determine the redox behaviour of iron in seawater.
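    The speciation-based kinetic approach can be illustrated with a minimal sketch: the apparent pseudo-first-order oxidation rate constant is the mole-fraction-weighted sum of the rate constants of the individual Fe(II) species, so introducing an organic ligand shifts the overall rate through the speciation. All fractions and rate constants below are illustrative, not the study's fitted values.

```python
import math

def apparent_rate_constant(species_fraction, species_k):
    """Apparent pseudo-first-order Fe(II) oxidation rate constant as the
    mole-fraction-weighted sum over individual Fe(II) species:
        k_app = sum_i  alpha_i * k_i
    Fractions must sum to 1; all values here are illustrative."""
    assert abs(sum(species_fraction.values()) - 1.0) < 1e-6
    return sum(species_fraction[s] * species_k[s] for s in species_fraction)

# Hypothetical species rate constants (min^-1) and speciations with and
# without an organic ligand L from the exudates.
k = {"Fe2+": 1e-3, "FeOH+": 1e2, "Fe(OH)2": 1e6, "FeL": 5e1}
inorganic = {"Fe2+": 0.97, "FeOH+": 0.0299, "Fe(OH)2": 1e-4, "FeL": 0.0}
with_ligand = {"Fe2+": 0.60, "FeOH+": 0.0199, "Fe(OH)2": 1e-4, "FeL": 0.38}

k_app_inorg = apparent_rate_constant(inorganic, k)
k_app_org = apparent_rate_constant(with_ligand, k)
half_life_min = math.log(2) / k_app_org  # Fe(II) half-life if k_app is in min^-1
```

Whether the ligand accelerates or retards the overall oxidation depends on its share of the speciation and its own rate constant, which is why the study found opposite effects for exudates produced under low-Fe and high-Fe conditions.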

  7. A Methodology for Robust Comparative Life Cycle Assessments Incorporating Uncertainty.

    PubMed

    Gregory, Jeremy R; Noshadravan, Arash; Olivetti, Elsa A; Kirchain, Randolph E

    2016-06-21

    We propose a methodology for conducting robust comparative life cycle assessments (LCA) by leveraging uncertainty. The method evaluates a broad range of the possible scenario space in a probabilistic fashion while simultaneously considering uncertainty in input data. The method is intended to ascertain which scenarios have a definitive environmentally preferable choice among the alternatives being compared and the significance of the differences given uncertainty in the parameters, which parameters have the most influence on this difference, and how we can identify the resolvable scenarios (where one alternative in the comparison has a clearly lower environmental impact). This is accomplished via an aggregated probabilistic scenario-aware analysis, followed by an assessment of which scenarios have resolvable alternatives. Decision-tree partitioning algorithms are used to isolate meaningful scenario groups. In instances where the alternatives cannot be resolved for scenarios of interest, influential parameters are identified using sensitivity analysis. If those parameters can be refined, the process can be iterated using the refined parameters. We also present definitions of uncertainty quantities that have not been applied in the field of LCA and approaches for characterizing uncertainty in those quantities. We then demonstrate the methodology through a case study of pavements.
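    A minimal sketch of the aggregated probabilistic scenario-aware analysis: for each scenario, draw the uncertain inputs many times, compute both alternatives' impacts, and declare the scenario resolved only when one alternative wins in at least a specified fraction of draws. The impact models, spreads, and threshold below are stand-ins, not the pavement case study.

```python
import random

def resolvable_scenarios(scenarios, n_draws=5000, threshold=0.9, seed=7):
    """For each scenario (mean impact of A, mean impact of B, input spread),
    Monte Carlo the comparison and label it 'A', 'B', or 'unresolved'."""
    rng = random.Random(seed)
    results = {}
    for name, (mean_a, mean_b, spread) in scenarios.items():
        wins_a = 0
        for _ in range(n_draws):
            a = rng.gauss(mean_a, spread)  # uncertain impact of alternative A
            b = rng.gauss(mean_b, spread)  # uncertain impact of alternative B
            if a < b:                      # lower environmental impact wins
                wins_a += 1
        frac = wins_a / n_draws
        results[name] = ("A" if frac >= threshold
                         else "B" if frac <= 1 - threshold
                         else "unresolved")
    return results

verdicts = resolvable_scenarios({
    "clear_gap": (100.0, 120.0, 5.0),    # separation >> uncertainty
    "overlapping": (100.0, 101.0, 10.0), # separation << uncertainty
})
```

Unresolved scenarios are where the methodology's next step applies: sensitivity analysis to find the influential parameters, refinement of those parameters, and iteration.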

  8. Effect of detention basin release rates on flood flows - Application of a model to the Blackberry Creek Watershed in Kane County, Illinois

    USGS Publications Warehouse

    Soong, David T.; Murphy, Elizabeth A.; Straub, Timothy D.

    2009-01-01

    The effects of stormwater detention basins with specified release rates are examined on the watershed scale with a Hydrological Simulation Program - FORTRAN (HSPF) continuous-simulation model. Modeling procedures for specifying release rates from detention basins with orifice and weir discharge configurations are discussed in this report. To facilitate future detention modeling as a tool for watershed management, a chart relating watershed impervious area to detention volume is presented. The report also presents a case study of the Blackberry Creek watershed in Kane County, Ill., a rapidly urbanizing area seeking to avoid future flood damages from increased urbanization, to illustrate the effects of various detention basin release rates on flood peaks and volumes and flood frequencies. The case study compares flows simulated with a 1996 land-use HSPF model to those simulated with four different 2020 projected land-use HSPF model scenarios - no detention, and detention basins with release rates of 0.08, 0.10, and 0.12 cubic feet per second per acre (ft3/s-acre), respectively. Results of the simulations for 15 locations, which included the downstream ends of all tributaries and various locations along the main stem, showed that a release rate of 0.10 ft3/s-acre, in general, can maintain postdevelopment 100-year peak-flood discharge at a similar magnitude to that of 1996 land-use conditions. Although the release rate is designed to reduce the 100-year peak flow, reduction of the 2-year peak flow is also achieved for a smaller proportion of the peak. Results also showed that the 0.10 ft3/s-acre release rate was less effective in watersheds with relatively high percentages of preexisting (1996) development than in watersheds with less preexisting development.
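    The orifice and weir outlet configurations discussed in the report follow standard hydraulic rating equations; a sketch of a combined low-flow orifice plus overflow weir is below. The discharge coefficients are typical textbook values, not values calibrated to Blackberry Creek or to the HSPF model.

```python
import math

G = 32.174  # gravitational acceleration, ft/s^2 (US customary units, as in the report)

def orifice_flow(head_ft, area_ft2, cd=0.6):
    """Orifice discharge Q = Cd * A * sqrt(2 g H), in ft^3/s."""
    return cd * area_ft2 * math.sqrt(2.0 * G * max(head_ft, 0.0))

def weir_flow(head_ft, length_ft, cw=3.0):
    """Sharp-crested weir discharge Q = Cw * L * H^1.5, in ft^3/s."""
    return cw * length_ft * max(head_ft, 0.0) ** 1.5

def basin_release(stage_ft, orifice_area_ft2, weir_length_ft, weir_crest_ft):
    """Combined outlet: low-flow orifice at all stages, plus an overflow
    weir once the stage exceeds the weir crest."""
    q = orifice_flow(stage_ft, orifice_area_ft2)
    if stage_ft > weir_crest_ft:
        q += weir_flow(stage_ft - weir_crest_ft, weir_length_ft)
    return q
```

Sizing the orifice area so that the peak of such a rating curve stays at or below, say, 0.10 ft3/s per tributary acre is what implementing a specified release rate in the watershed model amounts to.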

  9. Accelerated Evaluation of Automated Vehicles Safety in Lane-Change Scenarios Based on Importance Sampling Techniques

    PubMed Central

    Zhao, Ding; Lam, Henry; Peng, Huei; Bao, Shan; LeBlanc, David J.; Nobukawa, Kazutoshi; Pan, Christopher S.

    2016-01-01

    Automated vehicles (AVs) must be thoroughly evaluated before their release and deployment. A widely used evaluation approach is the Naturalistic-Field Operational Test (N-FOT), which tests prototype vehicles directly on the public roads. Due to the low exposure to safety-critical scenarios, N-FOTs are time consuming and expensive to conduct. In this paper, we propose an accelerated evaluation approach for AVs. The results can be used to generate motions of the other primary vehicles to accelerate the verification of AVs in simulations and controlled experiments. Frontal collision due to unsafe cut-ins is the target crash type of this paper. Human-controlled vehicles making unsafe lane changes are modeled as the primary disturbance to AVs based on data collected by the University of Michigan Safety Pilot Model Deployment Program. The cut-in scenarios are generated based on skewed statistics of collected human driver behaviors, which generate risky testing scenarios while preserving the statistical information so that the safety benefits of AVs in nonaccelerated cases can be accurately estimated. The cross-entropy method is used to recursively search for the optimal skewing parameters. The frequencies of the occurrences of conflicts, crashes, and injuries are estimated for a modeled AV, and the achieved accelerated rate is around 2000 to 20 000. In other words, in the accelerated simulations, driving for 1000 miles will expose the AV with challenging scenarios that will take about 2 to 20 million miles of real-world driving to encounter. This technique thus has the potential to greatly reduce the development and validation time for AVs. PMID:27840592
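    The skewed-statistics idea can be illustrated with a minimal importance-sampling sketch: sample a rare safety-critical event from a skewed density that makes it frequent, then reweight each sample by the likelihood ratio so the estimate remains unbiased under the original naturalistic density. The exponential gap-time model below is illustrative, not the Safety Pilot driver model, and the cross-entropy search for the optimal skew is omitted.

```python
import math
import random

def importance_sampling_tail(threshold, n, rate=1.0, skewed_rate=0.25, seed=1):
    """Estimate P(X > threshold) for X ~ Exponential(rate) by sampling from
    a heavier-tailed Exponential(skewed_rate) and reweighting each hit by
    the likelihood ratio f(x)/g(x), which keeps the estimator unbiased."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.expovariate(skewed_rate)  # accelerated: rare event made common
        if x > threshold:
            lr = (rate * math.exp(-rate * x)) / (skewed_rate * math.exp(-skewed_rate * x))
            total += lr
    return total / n

p_hat = importance_sampling_tail(threshold=8.0, n=20000)
p_true = math.exp(-8.0)  # exact tail probability under the original density
```

A crude-Monte-Carlo estimate with the same 20 000 draws would see the event only a handful of times; the reweighted skewed sampler sees it thousands of times, which is the source of the acceleration factors reported in the paper.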

  10. Accelerated Evaluation of Automated Vehicles Safety in Lane-Change Scenarios Based on Importance Sampling Techniques.

    PubMed

    Zhao, Ding; Lam, Henry; Peng, Huei; Bao, Shan; LeBlanc, David J; Nobukawa, Kazutoshi; Pan, Christopher S

    2017-03-01

    Automated vehicles (AVs) must be thoroughly evaluated before their release and deployment. A widely used evaluation approach is the Naturalistic-Field Operational Test (N-FOT), which tests prototype vehicles directly on the public roads. Due to the low exposure to safety-critical scenarios, N-FOTs are time consuming and expensive to conduct. In this paper, we propose an accelerated evaluation approach for AVs. The results can be used to generate motions of the other primary vehicles to accelerate the verification of AVs in simulations and controlled experiments. Frontal collision due to unsafe cut-ins is the target crash type of this paper. Human-controlled vehicles making unsafe lane changes are modeled as the primary disturbance to AVs based on data collected by the University of Michigan Safety Pilot Model Deployment Program. The cut-in scenarios are generated based on skewed statistics of collected human driver behaviors, which generate risky testing scenarios while preserving the statistical information so that the safety benefits of AVs in nonaccelerated cases can be accurately estimated. The cross-entropy method is used to recursively search for the optimal skewing parameters. The frequencies of the occurrences of conflicts, crashes, and injuries are estimated for a modeled AV, and the achieved accelerated rate is around 2000 to 20 000. In other words, in the accelerated simulations, driving for 1000 miles will expose the AV with challenging scenarios that will take about 2 to 20 million miles of real-world driving to encounter. This technique thus has the potential to greatly reduce the development and validation time for AVs.

  11. Nerve growth factor released from a novel PLGA nerve conduit can improve axon growth

    NASA Astrophysics Data System (ADS)

    Lin, Keng-Min; Shea, Jill; Gale, Bruce K.; Sant, Himanshu; Larrabee, Patti; Agarwal, Jay

    2016-04-01

    Nerve injury can occur due to penetrating wounds, compression, traumatic stretch, and cold exposure. Despite prompt repair, outcomes are dismal. In an attempt to help resolve this challenge, in this work, a poly-lactic-co-glycolic acid (PLGA) nerve conduit with associated biodegradable drug reservoir was designed, fabricated, and tested. Unlike current nerve conduits, this device is capable of fitting various clinical scenarios by delivering different drugs without reengineering the whole system. To demonstrate the potential of this device for nerve repair, a series of experiments were performed using nerve growth factor (NGF). First, an NGF dosage curve was developed to determine the minimum NGF concentration for optimal axonal outgrowth on chick dorsal root ganglia (DRG) cells. Next, PLGA devices loaded with NGF were evaluated for sustained drug release and axon growth enhancement with the released drug. A 20 d in vitro release test was conducted and the nerve conduit showed the ability to meet and maintain the minimum NGF requirement determined previously. Bioactivity assays of the released NGF showed that drug released from the device between the 15th and 20th day could still promote axon growth (76.6-95.7 μm) in chick DRG cells, which is in the range of maximum growth. These novel drug delivery conduits show the ability to deliver NGF at a dosage that efficiently promotes ex vivo axon growth and have the potential for in vivo application to help bridge peripheral nerve gaps.

  12. Modeling aquifer behaviour under climate change and high consumption: Case study of the Sfax region, southeast Tunisia

    NASA Astrophysics Data System (ADS)

    Boughariou, Emna; Allouche, Nabila; Jmal, Ikram; Mokadem, Naziha; Ayed, Bachaer; Hajji, Soumaya; Khanfir, Hafedh; Bouri, Salem

    2018-05-01

    Water resources are being exhausted by the increasing demand associated with population growth. They are also affected by climatic conditions, especially in arid and semi-arid regions, which already experience noticeable shortages and low annual precipitation. This paper presents a numerical model of the Sfax shallow aquifer system developed by coupling the geographical information system ArcGIS 9.3 with the groundwater modeling system GMS 6.5 interface to the groundwater flow model MODFLOW 2000. Located in a coastal city with an arid climate and high consumption rates, this aquifer is under hydraulic stress. The groundwater piezometric variations were therefore calibrated for the period 2003-2013 and simulated under two scenarios: first, constant and growing consumption; and second, the rainfall forecast from the climate change scenario released by the Tunisian Ministry of Agriculture and Water Resources and the German International Cooperation Agency "GIZ", using HadCM3 as the general circulation model. The piezometric simulations forecast an overall decrease of about 0.5 m by 2020 and 1 m by 2050; locally, the decrease is more pronounced in the "Chaffar" and "Djbeniana" regions and is more evident under the increasing-consumption scenario. Both scenarios point to a quantitative degradation of the groundwater by 2050, with alarming marine intrusion in the "Djbeniana" region.

  13. Dark scenarios

    NASA Astrophysics Data System (ADS)

    Ahonen, Pasi; Alahuhta, Petteri; Daskala, Barbara; Delaitre, Sabine; Hert, Paul De; Lindner, Ralf; Maghiros, Ioannis; Moscibroda, Anna; Schreurs, Wim; Verlinden, Michiel

    In this chapter, we present four "dark scenarios" that highlight the key socio-economic, legal, technological and ethical risks to privacy, identity, trust, security and inclusiveness posed by new AmI technologies. We call them dark scenarios because they show what could go wrong in an AmI world and present visions of the future that we do not want to become reality. The scenarios expose threats and vulnerabilities as a way of informing policy-makers and planners about issues they need to take into account when developing new policies or updating existing legislation. Before presenting the four scenarios and our analysis of each, we describe how we created the scenarios and the elements of our methodology for analysing them.

  14. A Re-Analysis of the Collaborative Knowledge Transcripts from a Noncombatant Evacuation Operation Scenario: The Next Phase in the Evolution of a Team Collaboration Model

    DTIC Science & Technology

    2008-04-15

    ... the unclassified scenario. Warner, Wroblewski, and Shuck (2004) also developed a scoring matrix for the final NEO plan (see appendix E, "Scoring Matrix for the NEO Scenario"). They did this with input from military operational personnel who had experience in actual NEO scenarios. The researchers created the matrix so that they ...

  15. Cultural narratives and the succession scenario: Slumdog Millionaire and other popular films and fictions.

    PubMed

    Paul, Robert A

    2011-04-01

    An approach to the analysis of cultural narratives is proposed, drawing inspiration from Lévi-Strauss's analysis of myths as fantasied resolutions of conflicts and contradictions in culture and of typical dilemmas of human life. An example of such an analysis revolves around contradictions in the Western cultural construction of the succession of generations. The logic of the structural analysis of cultural representations is explicated, the schema of the succession scenario is laid out, and the conflicts that generate it are identified. The movie Slumdog Millionaire is examined in some detail as an illustration of the succession scenario at work, and a comparative analysis shows how the same underlying schema accounts for otherwise obscure aspects of comparable contemporary popular narratives, including Harry Potter, The Lion King, and Star Wars. Copyright © 2011 Institute of Psychoanalysis.

  16. The Scenario-Based Engineering Process (SEP): a user-centered approach for the development of health care systems.

    PubMed

    Harbison, K; Kelly, J; Burnell, L; Silva, J

    1995-01-01

    The Scenario-based Engineering Process (SEP) is a user-focused methodology for large and complex system design. This process supports new application development from requirements analysis with domain models to component selection, design and modification, implementation, integration, and archival placement. It is built upon object-oriented methodologies, domain modeling strategies, and scenario-based techniques to provide an analysis process for mapping application requirements to available components. We are using SEP in the health care applications that we are developing. The process has already achieved success in the manufacturing and military domains and is being adopted by many organizations. SEP should prove viable in any domain containing scenarios that can be decomposed into tasks.

  17. A near death experience: Shigella manipulates host death machinery to silence innate immunity.

    PubMed

    Bronner, Denise N; O'Riordan, Mary Xd

    2014-10-01

    Release of mitochondrial contents often triggers inflammation and cell death, and modulating this process can be advantageous to invading pathogens. In this issue of The EMBO Journal, Andree and colleagues reveal that an intracellular bacterial pathogen exploits apoptotic machinery to suppress host immune signaling while avoiding cell death. This study emphasizes the need to expand our understanding of the roles played by pro-apoptotic proteins in non-death scenarios.

  18. A scenario analysis of the future residential requirements for people with mental health problems in Eindhoven

    PubMed Central

    2011-01-01

    Background: Despite large-scale investments in community mental health care since the 1990s, a trend towards reinstitutionalization has been visible since 2002. Since many mental health care providers regard this as an undesirable trend, the question arises: in the coming 5 years, what types of residence should be organized for people with mental health problems? The purpose of this article is to provide mental health care providers, public housing corporations, and local government with guidelines for planning organizational strategy concerning types of residence for people with mental health problems. Methods: A scenario analysis was performed in four steps: 1) an exploration of the external environment; 2) the identification of key uncertainties; 3) the development of scenarios; 4) the translation of scenarios into guidelines for planning organizational strategy. To explore the external environment, a document study was performed and 15 semi-structured interviews were conducted. During a workshop, a panel of experts identified two key uncertainties in the external environment and formulated four scenarios. Results: The study resulted in four scenarios: 1) integrated and independent living in the community with professional care; 2) responsible healthcare supported by society; 3) differentiated provision within the walls of the institution; 4) residence in large-scale institutions but unmet need for care. From the range of aspects within the different scenarios, the panel was able to work out concrete guidelines for planning organizational strategy. Conclusions: In the context of residence for people with mental health problems, the focus should be on investment in community care and re-integration into society. A joint effort is needed to achieve this goal. This study shows that scenario analysis leads to useful guidelines for planning organizational strategy in mental health care. PMID:21211015

  19. Connectivity patterns of coastal fishes following different dispersal scenarios across a transboundary marine protected area (Bonifacio strait, NW Mediterranean)

    NASA Astrophysics Data System (ADS)

    Koeck, Barbara; Gérigny, Olivia; Durieux, Eric Dominique Henri; Coudray, Sylvain; Garsi, Laure-Hélène; Bisgambiglia, Paul-Antoine; Galgani, François; Agostini, Sylvia

    2015-03-01

    The Strait of Bonifacio constitutes one of the rare transboundary Marine Protected Areas (MPAs) of the Mediterranean Sea (between Sardinia, Italy and Corsica, France). Based on the hypothesis that no-take zones produce more fish larvae than adjacent fished areas, we modeled the fate of larvae released by coastal fishes inside the no-take zones of the MPA in order to: (1) characterize the dispersal patterns across the Strait of Bonifacio; (2) identify the main potential settlement areas; (3) quantify the connectivity and the larval supply from the MPA to the surrounding areas. A high-resolution hydrodynamic model (MARS 3D, Corse 400 m) combined with an individual-based model (Ichthyop software) was used to model larval dispersal under various scenarios (pelagic larval duration, PLD, and release depth) over the main spawning period (i.e. between April and September). Dispersal model outputs were then compared with those obtained from an ichthyoplankton sampling cruise performed in August 2012. PLD had a significant influence on the connectivity between coastal areas, and the synchronization between spawning and hydrodynamic conditions appeared decisive for larval transport success. Biotic and abiotic parameters affecting the dispersal dynamics of fish larvae within the Strait of Bonifacio were identified, and synthesis maps were established as a tool for conservation planning.
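    The core idea of such an individual-based dispersal model can be sketched as a Lagrangian random walk: particles released at the MPA drift with a mean current plus a diffusive term for one pelagic larval duration. This is a minimal 1-D sketch, not the MARS 3D/Ichthyop configuration; the current speed, diffusivity, and settlement-zone bounds are assumptions.

    ```python
    import numpy as np

    # Minimal sketch of Lagrangian larval dispersal (not the study's
    # MARS 3D/Ichthyop setup): particles are advected by an assumed uniform
    # along-shore current u [m/s] plus a random-walk step representing
    # turbulent diffusion with diffusivity kh [m^2/s].
    rng = np.random.default_rng(42)

    def disperse(n=1000, pld_days=20, dt=3600.0, u=0.05, kh=10.0):
        """Final along-shore positions [m] after one PLD of drift."""
        steps = int(pld_days * 86400 / dt)
        x = np.zeros(n)                       # release point: the no-take zone
        for _ in range(steps):
            x += u * dt + rng.normal(0.0, np.sqrt(2.0 * kh * dt), n)
        return x

    x20 = disperse(pld_days=20)               # longer PLD -> farther transport
    x10 = disperse(pld_days=10)
    # Connectivity: fraction reaching a hypothetical settlement zone
    # 60-120 km downstream of the release site
    settled = np.mean((x20 > 60_000) & (x20 < 120_000))
    ```

    Even this toy model reproduces the abstract's qualitative finding that PLD controls connectivity: mean transport scales linearly with PLD, so which settlement zones are reachable changes with larval duration.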

  20. Global Xenon-133 Emission Inventory Caused by Medical Isotope Production and Derived from the Worldwide Technetium-99m Demand

    NASA Astrophysics Data System (ADS)

    Kalinowski, Martin B.; Grosch, Martina; Hebel, Simon

    2014-03-01

    Emissions from medical isotope production are the most important source of background for atmospheric radioxenon measurements, which are an essential part of nuclear explosion monitoring. This article presents a new approach for estimating the global annual radioxenon emission inventory caused by medical isotope production, using the number of Tc-99m applications in hospitals as the basis. Tc-99m is the most commonly used isotope in radiology and dominates medical isotope production. This paper presents the first estimate of its global production. Depending on the production and transport scenario, global xenon emissions of 11-45 PBq/year can be derived from the global isotope demand. The lower end of this estimate is in good agreement with other estimates that make use of reported releases and realistic process simulations, which supports the validity of the complementary assessment method proposed here. The method may be of relevance for future emission scenarios and for estimating the contribution to the global source term from countries and operators that do not make sufficient radioxenon release information available. It depends on sound data on medical treatments with radiopharmaceuticals and on technical information about the suppliers' production processes, and might help in understanding the apparent underestimation of the global emission inventory found by atmospheric transport modelling.
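    The scaling chain the abstract describes, from hospital Tc-99m demand up the supply chain to co-produced Xe-133 releases, can be sketched as a back-of-envelope calculation. All parameter values below are illustrative assumptions, not figures from the paper; they only demonstrate the structure of the estimate.

    ```python
    # Back-of-envelope sketch of the demand-driven emission estimate.
    # Every number here is an assumed, illustrative value.
    procedures_per_year = 30e6     # assumed global Tc-99m procedures/year
    dose_gbq = 0.7                 # assumed Tc-99m activity per procedure [GBq]
    supply_chain_factor = 10.0     # assumed overproduction to cover decay and
                                   # transport losses between reactor and hospital
    xe133_per_mo99 = 0.1           # assumed Xe-133 released per unit Mo-99 produced

    mo99_produced_gbq = procedures_per_year * dose_gbq * supply_chain_factor
    xe133_released_pbq = mo99_produced_gbq * xe133_per_mo99 / 1e6  # GBq -> PBq
    ```

    With these assumed inputs the chain yields 21 PBq/year, inside the 11-45 PBq/year range the paper derives; the point of the sketch is that the result is a simple product of demand, dose, supply-chain losses, and a per-unit release ratio, so uncertainty in any one factor propagates linearly.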
