Sample records for scenario methodologies direct

  1. A novel methodology for interpreting air quality measurements from urban streets using CFD modelling

    NASA Astrophysics Data System (ADS)

    Solazzo, Efisio; Vardoulakis, Sotiris; Cai, Xiaoming

    2011-09-01

    In this study, a novel computational fluid dynamics (CFD) based methodology has been developed to interpret long-term averaged measurements of pollutant concentrations collected at roadside locations. The methodology is applied to the analysis of pollutant dispersion in Stratford Road (SR), a busy street canyon in Birmingham (UK), where a one-year sampling campaign was carried out between August 2005 and July 2006. Firstly, a number of dispersion scenarios are defined by combining sets of synoptic wind velocity and direction. Assuming neutral atmospheric stability, CFD simulations are conducted for all the scenarios, by applying the standard k-ɛ turbulence model, with the aim of creating a database of normalised pollutant concentrations at specific locations within the street. Modelled concentrations for all wind scenarios were compared with hourly observed NOx data. In order to compare with long-term averaged measurements, a weighted average of the CFD-calculated concentration fields was derived, with the weighting coefficients being proportional to the frequency of each scenario observed during the examined period (either monthly or annually). In summary, the methodology consists of (i) identifying the main dispersion scenarios for the street based on wind speed and direction data, (ii) creating a database of CFD-calculated concentration fields for the identified dispersion scenarios, and (iii) combining the CFD results based on the frequency of occurrence of each dispersion scenario during the examined period. The methodology has been applied to calculate monthly and annually averaged benzene concentrations at several locations within the street canyon so that a direct comparison with observations could be made. The results of this study indicate that, within the simplifying assumption of non-buoyant flow, CFD modelling can aid the understanding of long-term air quality measurements, and help assess the representativeness of monitoring locations for population exposure studies.
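
    As a simple illustration of step (iii), the sketch below (Python, with invented receptor values and scenario frequencies, not data from the study) combines CFD-computed concentrations into a long-term average weighted by how often each wind scenario occurred:

    ```python
    # Hypothetical illustration of combining CFD scenario results into a
    # long-term average concentration, weighted by how often each wind
    # scenario (speed/direction class) was observed during the period.
    import numpy as np

    # Normalised concentrations at one receptor for each dispersion scenario
    # (values are invented for illustration only).
    scenario_concentration = np.array([12.0, 8.5, 21.3, 5.1])   # e.g. ug/m3
    # Observed frequency of each scenario over the averaging period.
    scenario_frequency = np.array([0.40, 0.25, 0.20, 0.15])

    assert np.isclose(scenario_frequency.sum(), 1.0)

    # Long-term average = frequency-weighted average of the scenario fields.
    long_term_average = np.dot(scenario_frequency, scenario_concentration)
    print(f"Weighted long-term concentration: {long_term_average:.2f} ug/m3")
    ```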

  2. Using the Simulated Patient Methodology to Assess Paracetamol-Related Counselling for Headache

    PubMed Central

    Horvat, Nejc; Koder, Marko; Kos, Mitja

    2012-01-01

    Objectives Firstly, to assess paracetamol-related counselling. Secondly, to evaluate the patient’s approach as a determinant of counselling and to test the acceptability of the simulated patient (SP) method in Slovenian pharmacies. Methods The simulated patient methodology was used in 17 community pharmacies. Three scenarios related to self-medication for headaches were developed and used in all participating pharmacies. Two scenarios were direct product requests: scenario 1: a patient with an uncomplicated short-term headache; scenario 2: a patient with a severe, long-duration headache who takes paracetamol for too long and concurrently drinks alcohol. Scenario 3 was a symptom-based request: a patient asking for medicine for a headache. Pharmacy visits were audio recorded and scored according to predetermined criteria arranged in two categories: counselling content and manner of counselling. The acceptability of the methodology used was evaluated by surveying the participating pharmacists. Results The symptom-based request was scored significantly better (a mean of 2.17 out of a possible 4 points) than the direct product requests (means of 1.64 and 0.67 out of a possible 4 points for scenarios 1 and 2, respectively). The most common information provided was dosage and adverse effects. Only the symptom-based request stimulated spontaneous counselling. No statistically significant differences in the duration of the consultation between the scenarios were found. There were also no significant differences in the quality of counselling between the Masters of Pharmacy and Pharmacy Technicians. The acceptability of the SP method was not as high as in other countries. Conclusion The assessment of paracetamol-related counselling demonstrates room for practice improvement. PMID:23300691

  3. Effects of Special Use Airspace on Economic Benefits of Direct Flights

    NASA Technical Reports Server (NTRS)

    Datta, Koushik; Barrington, Craig; Foster, John D. (Technical Monitor)

    1996-01-01

    A methodology for estimating the economic effects of Special Use Airspace (SUA) on direct route flights is presented in this paper. The methodology is based on evaluating operating costs of aircraft and analyzing the different ground-track distances traveled by flights under different air traffic scenarios. Using this methodology, the following objectives are evaluated: the optimistic bias of studies that assume accessible SUAs, the maximum economic benefit of dynamic use of SUAs, and the marginal economic benefit of the dynamic use of individual SUAs.
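
    A rough sketch of the cost comparison implied by this methodology, with hypothetical distances and a hypothetical per-mile operating cost (the paper's actual cost model is more detailed):

    ```python
    # Illustrative (not from the paper) estimate of the economic penalty of
    # routing around a Special Use Airspace instead of flying direct.
    def detour_cost(direct_nm: float, detour_nm: float,
                    cost_per_nm: float) -> float:
        """Extra operating cost incurred by the longer ground track."""
        return (detour_nm - direct_nm) * cost_per_nm

    # Hypothetical numbers: 42 extra nautical miles at $12/nm operating cost.
    extra = detour_cost(direct_nm=910.0, detour_nm=952.0, cost_per_nm=12.0)
    print(f"Added cost per flight: ${extra:.0f}")
    ```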

  4. Method for the technical, financial, economic and environmental pre-feasibility study of geothermal power plants by RETScreen - Ecuador's case study.

    PubMed

    Moya, Diego; Paredes, Juan; Kaparaju, Prasad

    2018-01-01

    RETScreen provides a proven methodology focused on pre-feasibility studies. Although this tool has been used to carry out a number of pre-feasibility studies of solar, wind, and hydropower projects, that is not the case for geothermal developments. This method paper proposes a systematic methodology to cover all the necessary inputs of the RETScreen-International Geothermal Project Model. As a case study, geothermal power plant developments in the Ecuadorian context were analysed with the RETScreen-International Geothermal Project Model. Three different scenarios were considered for analysis. Scenarios I and II considered incentives of 132.1 USD/MWh for electricity generation and grants of 3 million USD. Scenario III considered the geothermal project with an electricity export price of 49.3 USD/MWh. Scenario III was further divided into IIIA and IIIB case studies. Scenario IIIA considered a 3 million USD grant while Scenario IIIB considered an income of 8.9 USD/MWh for selling heat in direct applications. Modelling results showed that the binary power cycle was the most suitable geothermal technology to produce electricity along with aquaculture and greenhouse heating for direct use applications in all scenarios. Financial analyses showed that the debt payment would be 5.36 million USD/year in Scenarios I and III; the corresponding value for Scenario II was 7.06 million USD/year. Net Present Value (NPV) was positive for all studied scenarios except Scenario IIIA. Overall, Scenario II was identified as the most feasible project due to a positive NPV with a short payback period. Scenario IIIB could become financially attractive by selling heat for direct applications. The total initial investment for a 22 MW geothermal power plant was 114.3 million USD (at 2017 costs). Economic analysis showed annual savings of 24.3 million USD by avoiding fossil fuel electricity generation. More than 184,000 tCO2 eq. could be avoided annually.
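
    For illustration, a minimal sketch of the kind of cash-flow metrics a RETScreen-style pre-feasibility analysis reports; the discount rate, project life and annual net revenue below are placeholders, not the study's inputs:

    ```python
    # Hedged sketch of NPV and simple payback for a geothermal project.
    # Only the initial investment figure is taken from the abstract; the
    # annual net cash flow, project life and discount rate are assumptions.
    def npv(rate: float, cash_flows: list[float]) -> float:
        """Net present value of cash flows; cash_flows[0] is year 0."""
        return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

    initial_investment = -114.3e6          # USD, year 0 (from the abstract)
    annual_net_revenue = 24.3e6            # USD/year, placeholder net flow
    flows = [initial_investment] + [annual_net_revenue] * 25  # 25-year life
    print(f"NPV at 8% discount rate: {npv(0.08, flows) / 1e6:.1f} million USD")
    print(f"Simple payback: {-initial_investment / annual_net_revenue:.1f} years")
    ```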

  5. Strategic Analysis Overview

    NASA Technical Reports Server (NTRS)

    Cirillo, William M.; Earle, Kevin D.; Goodliff, Kandyce E.; Reeves, J. D.; Stromgren, Chel; Andraschko, Mark R.; Merrill, R. Gabe

    2008-01-01

    NASA's Constellation Program employs a strategic analysis methodology to provide an integrated analysis capability for Lunar exploration scenarios and to support strategic decision-making regarding those scenarios. The strategic analysis methodology integrates the assessment of the major contributors to strategic objective satisfaction (performance, affordability, and risk) and captures the linkages and feedbacks between all three components. Strategic analysis supports strategic decision making by senior management through comparable analysis of alternative strategies, provision of a consistent set of high-level value metrics, and the enabling of cost-benefit analysis. The tools developed to implement the strategic analysis methodology are not element design and sizing tools. Rather, these models evaluate strategic performance using predefined elements, imported into a library from expert-driven design/sizing tools or expert analysis. Specific components of the strategic analysis tool set include scenario definition, requirements generation, mission manifesting, scenario lifecycle costing, crew time analysis, objective satisfaction benefit, risk analysis, and probabilistic evaluation. Results from all components of strategic analysis are evaluated against a set of pre-defined figures of merit (FOMs). These FOMs capture the high-level strategic characteristics of all scenarios and facilitate direct comparison of options. The strategic analysis methodology described in this paper has previously been applied to the Space Shuttle and International Space Station Programs and is now being used to support the development of the baseline Constellation Program lunar architecture. This paper presents an overview of the strategic analysis methodology and sample results from its application to the Constellation Program lunar architecture.

  6. Using Rapid-Response Scenario-Building Methodology for Climate Change Adaptation Planning

    NASA Astrophysics Data System (ADS)

    Ludwig, K. A.; Stoepler, T. M.; Schuster, R.

    2015-12-01

    Rapid-response scenario-building methodology can be modified to develop scenarios for slow-onset disasters associated with climate change such as drought. Results of a collaboration between the Department of the Interior (DOI) Strategic Sciences Group (SSG) and the Southwest Colorado Social-Ecological Climate Resilience Project are presented in which SSG scenario-building methods were revised and applied to climate change adaptation planning in Colorado's Gunnison Basin, United States. The SSG provides the DOI with the capacity to rapidly assemble multidisciplinary teams of experts to develop scenarios of the potential environmental, social, and economic cascading consequences of environmental crises, and to analyze these chains to determine actionable intervention points. By design, the SSG responds to acute events of a relatively defined duration. As a capacity-building exercise, the SSG explored how its scenario-building methodology could be applied to outlining the cascading consequences of slow-onset events related to climate change. SSG staff facilitated two workshops to analyze the impacts of drought, wildfire, and insect outbreak in the sagebrush and spruce-fir ecosystems. Participants included local land managers, natural and social scientists, ranchers, and other stakeholders. Key findings were: 1) scenario framing must be adjusted to accommodate the multiple, synergistic components and longer time frames of slow-onset events; 2) the development of slow-onset event scenarios is likely influenced by participants having had more time to consider potential consequences, relative to acute events; 3) participants who are from the affected area may have a more vested interest in the outcome and/or may be able to directly implement interventions.

  7. [Using the simulated patient methodology to assess counselling for acute diarrhoea - evidence from Germany].

    PubMed

    Langer, Bernhard; Bull, Elisa; Burgsthaler, Tina; Glawe, Julia; Schwobeda, Monique; Simon, Karen

    2016-01-01

    First, to assess the quality of counselling for acute diarrhoea; second, to evaluate the patient's approach and different user groups as determinants of counselling. The simulated patient methodology was used in all 21 community pharmacies in a north-eastern German city with a population of about 63,000. Four scenarios related to self-medication for acute diarrhoea were developed and used in all pharmacies (total: 84 visits). Two scenarios were direct product-based requests for loperamide (scenario 1: a 74-year-old woman with diabetes and hypertension; scenario 3: a 30-year-old man with no primary disease). Scenarios 2 and 4 were symptom-based requests asking for medicine for acute diarrhoea (scenario 2: a 74-year-old woman with diabetes and hypertension; scenario 4: a 30-year-old man with no primary disease). The assessment sheet included 9 objective items relating to the pharmacological advice, in order to avoid subjective evaluation by the mystery shoppers (e.g., of the friendliness of the customer contact). Simulated patient visits were conducted covertly by five untrained female master students. After evaluation of the data, every pharmacy received individual performance feedback to encourage behavioural change and improve counselling quality. Overall, the quality of counselling was quite poor (277 out of 756 possible points). The most commonly provided information was dosage (86.9%); information on adverse effects was least commonly provided (3.6%). Furthermore, there was a huge difference in counselling quality between the pharmacies (minimum 4 points, maximum 20 points out of 36 possible points). The symptom-based requests scored significantly better (95 and 85 out of 189 possible points) than the direct product-based requests (42 and 55 out of 189 possible points). The symptom-based requests had a significantly better counselling quality for an older woman with primary disease than for a younger man without any primary disease. This difference was not observed with the direct product-based requests. Copyright © 2016. Published by Elsevier GmbH.

  8. SUDOQU, a new dose-assessment methodology for radiological surface contamination.

    PubMed

    van Dillen, Teun; van Dijk, Arjan

    2018-06-12

    A new methodology has been developed for the assessment of the annual effective dose resulting from removable and fixed radiological surface contamination. It is entitled SUDOQU (SUrface DOse QUantification) and it can, for instance, be used to derive criteria for surface contamination related to the import of non-food consumer goods, containers and conveyances, e.g., limiting values and operational screening levels. SUDOQU imposes mass (activity)-balance equations based on radioactive decay, removal and deposition processes in indoor and outdoor environments. This leads to time-dependent contamination levels that may be of particular importance in exposure scenarios dealing with one or a few contaminated items only (usually public exposure scenarios, therefore referred to as the 'consumer' model). Exposure scenarios with a continuous flow of freshly contaminated goods also fall within the scope of the methodology (typically occupational exposure scenarios, thus referred to as the 'worker' model). In this paper we describe SUDOQU, its applications, and its current limitations. First, we delineate the contamination issue, present the assumptions and explain the concepts. We describe the relevant removal, transfer, and deposition processes, and derive equations for the time evolution of the radiological surface-, air- and skin-contamination levels. These then serve as input for the subsequent evaluation of the annual effective dose, with possible contributions from external gamma radiation, inhalation, secondary ingestion (indirect, from hand to mouth), skin contamination, direct ingestion and skin-contact exposure. The limiting effective surface dose is introduced for issues involving the conservatism of dose calculations. SUDOQU can be used by radiation-protection scientists/experts and policy makers in fields such as emergency preparedness, trade and transport, exemption and clearance, waste management, and nuclear facilities. Several practical examples are worked out, demonstrating the potential applications of the methodology.
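
    A minimal, hypothetical activity-balance sketch in the spirit of the abstract: surface activity is fed by deposition and depleted by radioactive decay and removal. The single-compartment form and all parameter values are assumptions for illustration, not the SUDOQU equations themselves:

    ```python
    # Toy activity balance: dA/dt = deposition - (decay + removal) * A.
    # Parameter values are invented for illustration.
    import math

    half_life_days = 30.0
    lam_decay = math.log(2) / half_life_days      # 1/day
    lam_removal = 0.01                            # 1/day, e.g. cleaning
    deposition = 0.5                              # Bq/cm2 per day
    A0 = 100.0                                    # initial activity, Bq/cm2

    def surface_activity(t_days: float) -> float:
        """Analytical solution of dA/dt = deposition - (decay + removal) * A."""
        k = lam_decay + lam_removal
        return deposition / k + (A0 - deposition / k) * math.exp(-k * t_days)

    for t in (0, 30, 90, 365):
        print(f"day {t:4d}: {surface_activity(t):7.2f} Bq/cm2")
    ```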

  9. Performance-Driven Hybrid Full-Body Character Control for Navigation and Interaction in Virtual Environments

    NASA Astrophysics Data System (ADS)

    Mousas, Christos; Anagnostopoulos, Christos-Nikolaos

    2017-06-01

    This paper presents a hybrid character control interface that provides the ability to synthesize, in real time, a variety of actions based on the user's performance capture. The proposed methodology enables three different performance interaction modules: the performance animation control, which directly maps the user's pose to the character; the motion controller, which synthesizes the desired motion of the character based on an activity recognition methodology; and the hybrid control, which lies between the performance animation control and the motion controller. With the methodology presented, the user has the freedom to interact within the virtual environment, as well as the ability to manipulate the character and to synthesize a variety of actions that cannot be performed directly by him/her, but which the system synthesizes. Therefore, the user is able to interact with the virtual environment in a more sophisticated fashion. This paper presents examples of different scenarios based on the three full-body character control methodologies.

  10. ARAMIS project: a comprehensive methodology for the identification of reference accident scenarios in process industries.

    PubMed

    Delvosalle, Christian; Fievez, Cécile; Pipart, Aurore; Debray, Bruno

    2006-03-31

    In the frame of the Accidental Risk Assessment Methodology for Industries (ARAMIS) project, this paper presents the work carried out in the part of the project devoted to the definition of accident scenarios. This topic is a key point in risk assessment and serves as the basis for the whole risk quantification. The first result of the work is the building of a methodology for the identification of major accident hazards (MIMAH), which is carried out with the development of generic fault and event trees based on a typology of equipment and substances. The term "major accidents" must be understood as the worst accidents likely to occur on the equipment, assuming that no safety systems are installed. A second methodology, called the methodology for the identification of reference accident scenarios (MIRAS), takes into account the influence of safety systems on both the frequencies and possible consequences of accidents. This methodology leads to the identification of more realistic accident scenarios. The reference accident scenarios are chosen with the help of a tool called the "risk matrix", crossing the frequency and the consequences of accidents. This paper presents both methodologies and an application to an ethylene oxide storage.
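
    A toy sketch of the risk-matrix screening step described above; the frequency classes, consequence scores and selection band are invented for illustration:

    ```python
    # Hypothetical risk-matrix screening: scenarios are placed on
    # frequency/consequence classes and those above a chosen band are
    # retained as reference accident scenarios.
    FREQ_CLASSES = [1e-3, 1e-4, 1e-5, 1e-6]       # events/year thresholds

    def freq_class(freq: float) -> int:
        for i, threshold in enumerate(FREQ_CLASSES):
            if freq >= threshold:
                return len(FREQ_CLASSES) - i       # higher = more frequent
        return 0

    scenarios = {
        "flange leak, small release": (5e-4, 1),   # (frequency, consequence 1-4)
        "vessel rupture, toxic cloud": (2e-6, 4),
        "pump seal fire": (1e-4, 2),
    }

    for name, (freq, consequence) in scenarios.items():
        risk_index = freq_class(freq) + consequence
        selected = risk_index >= 5                 # arbitrary selection band
        print(f"{name:30s} risk index {risk_index}  reference: {selected}")
    ```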

  11. Assessment of Effectiveness of Geologic Isolation Systems: REFERENCE SITE INITIAL ASSESSMENT FOR A SALT DOME REPOSITORY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harwell, M. A.; Brandstetter, A.; Benson, G. L.

    1982-06-01

    As a methodology demonstration for the Office of Nuclear Waste Isolation (ONWI), the Assessment of Effectiveness of Geologic Isolation Systems (AEGIS) Program conducted an initial reference site analysis of the long-term effectiveness of a salt dome repository. The Hainesville Salt Dome in Texas was chosen to be representative of the Gulf Coast interior salt domes; however, the Hainesville Site has been eliminated as a possible nuclear waste repository site. The data used for this exercise are not adequate for an actual assessment, nor have all the parametric analyses been made that would adequately characterize the response of the geosystem surrounding the repository. Additionally, because this was the first exercise of the complete AEGIS and Waste Rock Interaction Technology (WRIT) methodology, this report provides the initial opportunity for the methodology, specifically applied to a site, to be reviewed by the community outside the AEGIS. The scenario evaluation, as a part of the methodology demonstration, involved consideration of a large variety of potentially disruptive phenomena, which alone or in concert could lead to a breach in a salt dome repository and to a subsequent transport of the radionuclides to the environment. Without waste- and repository-induced effects, no plausible natural geologic events or processes which would compromise the repository integrity could be envisioned over the one-million-year time frame after closure. Near-field (waste- and repository-induced) effects were excluded from consideration in this analysis, but they can be added in future analyses when that methodology development is more complete. The potential for consequential human intrusion into salt domes within a million-year time frame led to the consideration of a solution mining intrusion scenario. The AEGIS staff developed a specific human intrusion scenario at 100 years and 1000 years post-closure, which is one of a whole suite of possible scenarios. This scenario resulted in the delivery of radionuclide-contaminated brine to the surface, where a portion was diverted to culinary salt for direct ingestion by the existing population. Consequence analyses indicated calculated human doses that would be highly deleterious. Additional analyses indicated that doses well above background would occur from such a scenario even if it occurred a million years into the future. The way to preclude such an intrusion is continued control over the repository site, either through direct institutional control or through the effective passive transfer of information. A secondary aspect of the specific human intrusion scenario involved a breach through the side of the salt dome, through which radionuclides migrated via the ground-water system to the accessible environment. This provided a demonstration of the geotransport methodology that AEGIS can use in actual site evaluations, as well as the WRIT program's capabilities with respect to defining the source term and retardation rates of the radionuclides in the repository. This reference site analysis was initially published as a Working Document in December 1979. That version was distributed for a formal peer review by individuals and organizations not involved in its development. The present report represents a revision, based in part on the responses received from the external reviewers. Summaries of the comments from the reviewers and responses to these comments by the AEGIS staff are presented. The exercise of the AEGIS methodology was successful in demonstrating the methodology and thus in providing a basis for substantive peer review, in terms of further development of the AEGIS site-applications capability and in terms of providing insight into the potential for consequential human intrusion into a salt dome repository.

  13. Temperature - Emissivity Separation Assessment in a Sub-Urban Scenario

    NASA Astrophysics Data System (ADS)

    Moscadelli, M.; Diani, M.; Corsini, G.

    2017-10-01

    In this paper, a methodology that aims at evaluating the effectiveness of different temperature-emissivity separation (TES) strategies is presented. The methodology takes into account the specific material of interest in the monitored scenario, sensor characteristics, and errors in the atmospheric compensation step. The methodology is proposed in order to predict and analyse algorithm performance during the planning of a remote sensing mission aimed at discovering specific materials of interest in the monitored scenario. As a case study, the proposed methodology is applied to a real airborne data set of a suburban scenario. To address the TES problem, three state-of-the-art algorithms and a recently proposed one are investigated: the Temperature-Emissivity Separation '98 (TES-98) algorithm, the Stepwise Refining TES (SRTES) algorithm, the Linear Piecewise TES (LTES) algorithm, and the Optimized Smoothing TES (OSTES) algorithm. Finally, the accuracies obtained with real data and those predicted by means of the proposed methodology are compared and discussed.

  14. Effects of Competitive E-Learning Tools on Higher Education Students: A Case Study

    ERIC Educational Resources Information Center

    Regueras, L. M.; Verdu, E.; Munoz, M. F.; Perez, M. A.; de Castro, J. P.; Verdu, M. J.

    2009-01-01

    Over the last few years, most of the attempts to introduce active learning methodologies in the classroom have made use of Information and Communication Technology (ICT). Many of these efforts have been directed to collaborative scenarios used in remote, blended or face-to-face experiences, in order to take advantage of the flexibility provided by…

  15. Life cycle assessment of a packaging waste recycling system in Portugal

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferreira, S.; Cabral, M.; Cruz, N.F. da, E-mail: nunocruz@tecnico.ulisboa.pt

    Highlights: • We modeled a real packaging waste recycling system. • The analysis was performed using the life cycle assessment methodology. • The 2010 situation was compared with scenarios where the materials were not recycled. • The “Baseline” scenario seems to be more beneficial to the environment. - Abstract: Life Cycle Assessment (LCA) has been used to assess the environmental impacts associated with an activity or product life cycle. It has also been applied to assess the environmental performance related to waste management activities. This study analyses the packaging waste management system of a local public authority in Portugal. The operations of selective and refuse collection, sorting, recycling, landfilling and incineration of packaging waste were considered. The packaging waste management system in operation in 2010, which we called “Baseline” scenario, was compared with two hypothetical scenarios where all the packaging waste that was selectively collected in 2010 would undergo the refuse collection system and would be sent directly to incineration (called “Incineration” scenario) or to landfill (“Landfill” scenario). Overall, the results show that the “Baseline” scenario is more environmentally sound than the hypothetical scenarios.

  16. ESP v1.0: Methodology for Exploring Emission Impacts of Future Scenarios in the United States

    EPA Science Inventory

    This article presents a methodology for creating anthropogenic emission inventories that can be used to simulate future regional air quality. The Emission Scenario Projection (ESP) methodology focuses on energy production and use, the principal sources of many air pollutants. Emi...

  17. Land Use Explains the Distribution of Threatened New World Amphibians Better than Climate

    PubMed Central

    Brum, Fernanda Thiesen; Gonçalves, Larissa Oliveira; Cappelatti, Laura; Carlucci, Marcos Bergmann; Debastiani, Vanderlei Júlio; Salengue, Elisa Viana; dos Santos Seger, Guilherme Dubal; Both, Camila; Bernardo-Silva, Jorge Sebastião; Loyola, Rafael Dias; da Silva Duarte, Leandro

    2013-01-01

    Background We evaluated the direct and indirect influence of climate, land use, phylogenetic structure, species richness and endemism on the distribution of New World threatened amphibians. Methodology/Principal Findings We used the WWF's New World ecoregions, the WWF's amphibian distributional data and the IUCN Red List Categories to obtain the number of threatened species per ecoregion. We analyzed three different scenarios: urgent, moderate, and the most inclusive scenario. Using path analysis, we evaluated the direct and indirect effects of climate, type of land use, phylogenetic structure, richness and endemism on the number of threatened amphibians in New World ecoregions. In all scenarios we found strong support for direct influences of endemism, the cover of villages and species richness on the number of threatened species in each ecoregion. The proportion of wild area had indirect effects in the moderate and the most inclusive scenarios. Phylogenetic composition was important in determining the species richness and endemism in each ecoregion. Climate variables had complex and indirect effects on the number of threatened species. Conclusion/Significance Land use has a more direct influence than climate in determining the distribution of New World threatened amphibians. Independently of the scenario analyzed, the main variables influencing the distribution of threatened amphibians were consistent, with endemism having the largest-magnitude path coefficient. The importance of phylogenetic composition could indicate that some clades may be more threatened than others, and their presence increases the number of threatened species. Our results highlight the importance of man-made land transformation, which is a local variable, as a critical factor underlying the distribution of threatened amphibians at a biogeographic scale. PMID:23637764

  18. Evaluation of Structural Robustness against Column Loss: Methodology and Application to RC Frame Buildings.

    PubMed

    Bao, Yihai; Main, Joseph A; Noh, Sam-Young

    2017-08-01

    A computational methodology is presented for evaluating structural robustness against column loss. The methodology is illustrated through application to reinforced concrete (RC) frame buildings, using a reduced-order modeling approach for three-dimensional RC framing systems that includes the floor slabs. Comparisons with high-fidelity finite-element model results are presented to verify the approach. Pushdown analyses of prototype buildings under column loss scenarios are performed using the reduced-order modeling approach, and an energy-based procedure is employed to account for the dynamic effects associated with sudden column loss. Results obtained using the energy-based approach are found to be in good agreement with results from direct dynamic analysis of sudden column loss. A metric for structural robustness is proposed, calculated by normalizing the ultimate capacities of the structural system under sudden column loss by the applicable service-level gravity loading and by evaluating the minimum value of this normalized ultimate capacity over all column removal scenarios. The procedure is applied to two prototype 10-story RC buildings, one employing intermediate moment frames (IMFs) and the other employing special moment frames (SMFs). The SMF building, with its more stringent seismic design and detailing, is found to have greater robustness.
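
    The proposed robustness metric can be sketched as follows; the capacities below are placeholders, not the paper's pushdown results:

    ```python
    # Sketch of the robustness metric described above: ultimate capacity
    # under each sudden-column-loss scenario is normalized by the
    # service-level gravity load, and the minimum over all scenarios is taken.
    service_level_load = 1.0   # normalized gravity load combination

    # Ultimate capacity (as a multiple of the service-level load) from
    # energy-based pushdown analysis for each column-removal scenario
    # (hypothetical values).
    ultimate_capacity = {
        "corner column": 1.35,
        "perimeter column": 1.62,
        "interior column": 1.90,
    }

    normalized = {k: v / service_level_load for k, v in ultimate_capacity.items()}
    robustness_index = min(normalized.values())
    critical = min(normalized, key=normalized.get)
    print(f"Robustness index = {robustness_index:.2f} (governed by {critical})")
    ```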

  20. Validation of The Scenarios Designed For The EU Registration of Pesticides

    NASA Astrophysics Data System (ADS)

    Piñeros Garcet, J. D.; de Nie, D.; Vanclooster, M.; Tiktak, A.; Klein, M.; Jones, A.

    As part of recent efforts to harmonise registration procedures for pesticides within the EU, a set of uniform principles was developed, setting out the detailed evaluation and decision-making criteria for pesticide registration. The EU directive 91/414/EEC places great importance on the use of validated models to calculate Predicted Environmental Concentrations (PECs), as a basis for assessing the environmental risks and health effects. To be used in a harmonised registration process, the quality of PEC modelling needs to be assured. Quality assurance of mathematical modelling implies, amongst others, the validation of the environmental modelling scenarios. The FOrum for the CO-ordination of pesticide fate models and their USe (FOCUS) is the current platform where common modelling methodologies are designed and subjected for approval to the European authorities. In 2000, the FOCUS groundwater scenarios working group defined the procedures for realising tier 1 PEC groundwater calculations for the active substances of plant protection products at the pan-European level. The procedures and guidelines were approved by the Standing Committee on Plant Health, and are now recommended for tier 1 PEC groundwater calculations in the registration dossier. Yet, the working group also identified a range of uncertainties related to the validity of the present leaching scenarios. To mitigate some of these problems, the EU R&D project APECOP was designed and approved for support in the framework of the EU-FP5 Quality of Life Programme. One of the objectives of the project is to evaluate the appropriateness of the current Tier 1 groundwater scenarios. In this paper, we summarise the methodology and results of the scenario validation.

  1. Extreme Magnitude Earthquakes and their Economical Consequences

    NASA Astrophysics Data System (ADS)

    Chavez, M.; Cabrera, E.; Ashworth, M.; Perea, N.; Emerson, D.; Salazar, A.; Moulinec, C.

    2011-12-01

    The frequency of occurrence of extreme magnitude earthquakes varies from tens to thousands of years, depending on the considered seismotectonic region of the world. However, the human and economic losses when their hypocenters are located in the neighborhood of heavily populated and/or industrialized regions can be very large, as recently observed for the 1985 Mw 8.01 Michoacan, Mexico and the 2011 Mw 9 Tohoku, Japan, earthquakes. Herewith, a methodology is proposed in order to estimate the probability of exceedance of the intensities of extreme magnitude earthquakes (PEI) and of their direct economic consequences (PEDEC). The PEI are obtained by using supercomputing facilities to generate samples of the 3D propagation of plausible extreme earthquake scenarios, and by enlarging those samples by Monte Carlo simulation. The PEDEC are computed by using appropriate vulnerability functions combined with the scenario intensity samples, and Monte Carlo simulation. An example of the application of the methodology to the potential occurrence of extreme Mw 8.5 subduction earthquakes affecting Mexico City is presented.
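
    An illustrative Monte Carlo sketch of the two exceedance quantities described above (PEI and PEDEC); the intensity distribution, vulnerability curve and exposed value are invented for illustration:

    ```python
    # Toy exceedance-probability estimate: sampled ground-motion intensities
    # are passed through a vulnerability function to obtain direct losses.
    import numpy as np

    rng = np.random.default_rng(0)
    # Pretend these are intensities (e.g. PGA in g) from scenario simulations
    # enlarged by Monte Carlo sampling.
    intensities = rng.lognormal(mean=np.log(0.25), sigma=0.5, size=100_000)

    def vulnerability(pga: np.ndarray) -> np.ndarray:
        """Toy vulnerability curve: mean damage ratio as a function of PGA."""
        return np.clip(1.0 - np.exp(-3.0 * pga), 0.0, 1.0)

    exposed_value = 1.0e9                       # USD, hypothetical portfolio
    losses = vulnerability(intensities) * exposed_value

    for threshold in (0.2, 0.4, 0.6):
        pei = np.mean(intensities > threshold)
        print(f"P(intensity > {threshold:.1f} g) = {pei:.3f}")
    pedec = np.mean(losses > 0.5e9)
    print(f"P(direct loss > 0.5 billion USD) = {pedec:.3f}")
    ```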

  2. MEGASTAR: The meaning of growth. An assessment of systems, technologies, and requirements. [methodology for display and analysis of energy production and consumption

    NASA Technical Reports Server (NTRS)

    1974-01-01

    A methodology for the display and analysis of postulated energy futures for the United States is presented. A systems approach methodology including the methodology of technology assessment is used to examine three energy scenarios--the Westinghouse Nuclear Electric Economy, the Ford Technical Fix Base Case and a MEGASTAR generated Alternate to the Ford Technical Fix Base Case. The three scenarios represent different paths of energy consumption from the present to the year 2000. Associated with these paths are various mixes of fuels, conversion, distribution, conservation and end-use technologies. MEGASTAR presents the estimated times and unit requirements to supply the fuels, conversion and distribution systems for the postulated end uses for the three scenarios and then estimates the aggregate manpower, materials, and capital requirements needed to develop the energy system described by the particular scenario.

  3. Development of economic consequence methodology for process risk analysis.

    PubMed

    Zadakbar, Omid; Khan, Faisal; Imtiaz, Syed

    2015-04-01

    A comprehensive methodology for economic consequence analysis with appropriate models for risk analysis of process systems is proposed. This methodology uses loss functions to relate process deviations in a given scenario to economic losses. It consists of four steps: definition of a scenario, identification of losses, quantification of losses, and integration of losses. In this methodology, the process deviations that contribute to a given accident scenario are identified and mapped to assess potential consequences. Losses are assessed with an appropriate loss function (revised Taguchi, modified inverted normal) for each type of loss. The total loss is quantified by integrating different loss functions. The proposed methodology has been examined on two industrial case studies. Implementation of this new economic consequence methodology in quantitative risk assessment will provide better understanding and quantification of risk. This will improve design, decision making, and risk management strategies. © 2014 Society for Risk Analysis.
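
    A minimal sketch, assuming a standard inverted normal loss function (INLF) form, of how a process deviation might be mapped to an economic loss; the target, scale and maximum-loss values are illustrative only:

    ```python
    # Hypothetical inverted normal loss function mapping a process deviation
    # to an economic loss (zero at the target, approaching max_loss far away).
    import math

    def inverted_normal_loss(x: float, target: float,
                             max_loss: float, gamma: float) -> float:
        """Loss grows from 0 at the target towards max_loss as |x - target| grows."""
        return max_loss * (1.0 - math.exp(-((x - target) ** 2) / (2.0 * gamma ** 2)))

    # Deviation of a process variable (e.g. column pressure) from its target.
    for pressure in (10.0, 10.5, 11.5, 13.0):
        loss = inverted_normal_loss(pressure, target=10.0,
                                    max_loss=250_000.0, gamma=1.2)
        print(f"pressure {pressure:4.1f} bar -> estimated loss ${loss:,.0f}")
    ```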

  4. Quantifying construction and demolition waste: An analytical review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Zezhou; Yu, Ann T.W., E-mail: bsannyu@polyu.edu.hk; Shen, Liyin

    2014-09-15

    Highlights: • Prevailing C and D waste quantification methodologies are identified and compared. • One specific methodology cannot fulfill all waste quantification scenarios. • A relevance tree for appropriate quantification methodology selection is proposed. • More attention should be paid to civil and infrastructural works. • Classified information is suggested for making an effective waste management plan. - Abstract: Quantifying construction and demolition (C and D) waste generation is regarded as a prerequisite for the implementation of successful waste management. In the literature, various methods have been employed to quantify the C and D waste generation at both regional and project levels. However, an integrated review that systemically describes and analyses all the existing methods has yet to be conducted. To bridge this research gap, an analytical review is conducted. Fifty-seven papers are retrieved based on a set of rigorous procedures. The characteristics of the selected papers are classified according to the following criteria - waste generation activity, estimation level and quantification methodology. Six categories of existing C and D waste quantification methodologies are identified, including site visit method, waste generation rate method, lifetime analysis method, classification system accumulation method, variables modelling method and other particular methods. A critical comparison of the identified methods is given according to their characteristics and implementation constraints. Moreover, a decision tree is proposed for aiding the selection of the most appropriate quantification method in different scenarios. Based on the analytical review, limitations of previous studies and recommendations of potential future research directions are further suggested.
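
    A hypothetical sketch of the waste generation rate method mentioned in the review: waste is estimated as an activity measure (here gross floor area) multiplied by per-material generation rates; the rates below are placeholders, not values from the reviewed studies:

    ```python
    # Waste generation rate method, toy example: waste = floor area x rate.
    generation_rates_kg_per_m2 = {    # per m2 of gross floor area (invented)
        "concrete": 32.0,
        "brick": 18.5,
        "timber": 4.2,
        "metal": 1.9,
    }

    gross_floor_area_m2 = 12_000.0    # hypothetical new-build project

    waste_tonnes = {mat: rate * gross_floor_area_m2 / 1000.0
                    for mat, rate in generation_rates_kg_per_m2.items()}
    total = sum(waste_tonnes.values())
    for mat, tons in waste_tonnes.items():
        print(f"{mat:9s} {tons:8.1f} t")
    print(f"{'total':9s} {total:8.1f} t")
    ```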

  5. Bio-inspired algorithms applied to molecular docking simulations.

    PubMed

    Heberlé, G; de Azevedo, W F

    2011-01-01

    Nature as a source of inspiration has been shown to have a great beneficial impact on the development of new computational methodologies. In this scenario, analyses of the interactions between a protein target and a ligand can be simulated by biologically inspired algorithms (BIAs). These algorithms mimic biological systems to create new paradigms for computation, such as neural networks, evolutionary computing, and swarm intelligence. This review provides a description of the main concepts behind BIAs applied to molecular docking simulations. Special attention is devoted to evolutionary algorithms, guided-directed evolutionary algorithms, and Lamarckian genetic algorithms. Recent applications of these methodologies to protein targets identified in the Mycobacterium tuberculosis genome are described.
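
    A toy evolutionary search illustrating the bio-inspired optimisation idea; the scoring function below is a stand-in, not a real docking score, and the pose representation is deliberately simplified:

    ```python
    # Minimal evolutionary optimiser: candidate "poses" evolve towards a
    # lower stand-in binding score via selection and Gaussian mutation.
    import random

    def score(pose):
        """Stand-in for a docking scoring function (lower is better)."""
        x, y, z = pose
        return (x - 1.0) ** 2 + (y + 0.5) ** 2 + (z - 2.0) ** 2

    def mutate(pose, rng, step=0.2):
        return tuple(v + rng.gauss(0.0, step) for v in pose)

    rng = random.Random(7)
    population = [tuple(rng.uniform(-5, 5) for _ in range(3)) for _ in range(30)]

    for generation in range(100):
        population.sort(key=score)
        parents = population[:10]                        # selection (elitism)
        offspring = [mutate(rng.choice(parents), rng) for _ in range(20)]
        population = parents + offspring                 # next generation

    best = min(population, key=score)
    print(f"Best pose {tuple(round(v, 2) for v in best)}, score {score(best):.4f}")
    ```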

  6. Eutrophication assessment and management methodology of multiple pollution sources of a landscape lake in North China.

    PubMed

    Chen, Yanxi; Niu, Zhiguang; Zhang, Hongwei

    2013-06-01

    Landscape lakes in the city suffer a high eutrophication risk because of their special characteristics and functions in the water circulation system. Using HMLA, a landscape lake located in Tianjin City, North China, that receives a mixture of point source (PS) pollution and non-point source (NPS) pollution, we explored a methodology based on Fluent and AQUATOX to simulate and predict the state of HMLA, and a trophic index was used to assess the eutrophication state. We then used water compensation optimization and three scenarios to determine the optimal management methodology. The three scenarios include an ecological restoration scenario, a best management practices (BMPs) scenario, and a scenario combining both. Our results suggest that the maintenance of a healthy ecosystem with ecoremediation is necessary and that the BMPs have a far-reaching effect on water reuse and NPS pollution control. This study has implications for eutrophication control and management under ongoing urbanization in China.

  7. Greenhouse gas emissions and land use change from Jatropha curcas-based jet fuel in Brazil.

    PubMed

    Bailis, Robert E; Baka, Jennifer E

    2010-11-15

    This analysis presents a comparison of life-cycle GHG emissions from synthetic paraffinic kerosene (SPK) produced as a jet fuel substitute from Jatropha curcas feedstock cultivated in Brazil against a reference scenario of conventional jet fuel. Life cycle inventory data are derived from surveys of actual Jatropha growers and processors. A baseline scenario, which assumes a medium yield of 4 tons of dry fruit per hectare under drip irrigation with existing logistical conditions, energy-based coproduct allocation, a 20-year plantation lifetime and no direct land use change (dLUC), results in emissions of 40 kg CO₂e per GJ of fuel produced, a 55% reduction relative to conventional jet fuel. However, dLUC based on observations of land-use transitions leads to widely varying changes in carbon stocks, ranging from losses in excess of 50 tons of carbon per hectare when Jatropha is planted in native cerrado woodlands to gains of 10-15 tons of carbon per hectare when Jatropha is planted in former agro-pastoral land. Thus, aggregate emissions vary from a low of 13 kg CO₂e per GJ when Jatropha is planted in former agro-pastoral lands, an 85% decrease from the reference scenario, to 141 kg CO₂e per GJ when Jatropha is planted in cerrado woodlands, a 60% increase over the reference scenario. Additional sensitivities are also explored, including changes in yield, exclusion of irrigation, shortened supply chains, and alternative allocation methodologies.
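
    A hedged back-of-the-envelope sketch of how a one-off carbon-stock change from dLUC can be amortized over the plantation lifetime and added to per-GJ supply-chain emissions; the fuel yield and carbon-stock figures are illustrative assumptions, so the totals only roughly mirror the abstract's range:

    ```python
    # Toy dLUC amortization: carbon-stock change per hectare, converted to
    # CO2 and spread over lifetime fuel output, added to supply-chain emissions.
    CO2_PER_C = 44.0 / 12.0            # kg CO2 per kg C

    def fuel_emissions(supply_chain_kgco2e_per_gj: float,
                       carbon_stock_change_tc_per_ha: float,
                       lifetime_years: float,
                       fuel_yield_gj_per_ha_yr: float) -> float:
        """Total life-cycle emissions in kg CO2e per GJ of jet fuel."""
        dluc = (carbon_stock_change_tc_per_ha * 1000.0 * CO2_PER_C
                / (lifetime_years * fuel_yield_gj_per_ha_yr))
        return supply_chain_kgco2e_per_gj + dluc

    # No dLUC (baseline), planting on former agro-pastoral land (carbon gain),
    # and planting in cerrado woodland (carbon loss); yield is a placeholder.
    for label, stock_change in [("baseline", 0.0),
                                ("agro-pastoral", -15.0),
                                ("cerrado woodland", 55.0)]:
        total = fuel_emissions(40.0, stock_change, lifetime_years=20.0,
                               fuel_yield_gj_per_ha_yr=100.0)
        print(f"{label:17s} {total:6.1f} kg CO2e/GJ")
    ```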

  8. A Methodology for Robust Comparative Life Cycle Assessments Incorporating Uncertainty.

    PubMed

    Gregory, Jeremy R; Noshadravan, Arash; Olivetti, Elsa A; Kirchain, Randolph E

    2016-06-21

    We propose a methodology for conducting robust comparative life cycle assessments (LCA) by leveraging uncertainty. The method evaluates a broad range of the possible scenario space in a probabilistic fashion while simultaneously considering uncertainty in input data. The method is intended to ascertain which scenarios have a definitive environmentally preferable choice among the alternatives being compared and the significance of the differences given uncertainty in the parameters, which parameters have the most influence on this difference, and how we can identify the resolvable scenarios (where one alternative in the comparison has a clearly lower environmental impact). This is accomplished via an aggregated probabilistic scenario-aware analysis, followed by an assessment of which scenarios have resolvable alternatives. Decision-tree partitioning algorithms are used to isolate meaningful scenario groups. In instances where the alternatives cannot be resolved for scenarios of interest, influential parameters are identified using sensitivity analysis. If those parameters can be refined, the process can be iterated using the refined parameters. We also present definitions of uncertainty quantities that have not been applied in the field of LCA and approaches for characterizing uncertainty in those quantities. We then demonstrate the methodology through a case study of pavements.
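
    A minimal sketch of the comparative idea: sample parameter uncertainty for two alternatives under each scenario and report the probability that one has the lower impact; the distributions, scenarios and resolution threshold are invented for illustration:

    ```python
    # Toy probabilistic scenario-aware comparison of two alternatives.
    import numpy as np

    rng = np.random.default_rng(42)
    N = 20_000

    scenarios = {            # (mean impact A, mean impact B), e.g. kg CO2e
        "short service life": (120.0, 110.0),
        "long service life": (150.0, 190.0),
    }

    for name, (mu_a, mu_b) in scenarios.items():
        impact_a = rng.normal(mu_a, 15.0, N)     # parameter uncertainty
        impact_b = rng.normal(mu_b, 15.0, N)
        p_a_better = np.mean(impact_a < impact_b)
        resolved = p_a_better > 0.9 or p_a_better < 0.1
        print(f"{name:18s} P(A < B) = {p_a_better:.2f}  resolved: {resolved}")
    ```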

  9. Evaluation of Structural Robustness against Column Loss: Methodology and Application to RC Frame Buildings

    PubMed Central

    Bao, Yihai; Main, Joseph A.; Noh, Sam-Young

    2017-01-01

    A computational methodology is presented for evaluating structural robustness against column loss. The methodology is illustrated through application to reinforced concrete (RC) frame buildings, using a reduced-order modeling approach for three-dimensional RC framing systems that includes the floor slabs. Comparisons with high-fidelity finite-element model results are presented to verify the approach. Pushdown analyses of prototype buildings under column loss scenarios are performed using the reduced-order modeling approach, and an energy-based procedure is employed to account for the dynamic effects associated with sudden column loss. Results obtained using the energy-based approach are found to be in good agreement with results from direct dynamic analysis of sudden column loss. A metric for structural robustness is proposed, calculated by normalizing the ultimate capacities of the structural system under sudden column loss by the applicable service-level gravity loading and by evaluating the minimum value of this normalized ultimate capacity over all column removal scenarios. The procedure is applied to two prototype 10-story RC buildings, one employing intermediate moment frames (IMFs) and the other employing special moment frames (SMFs). The SMF building, with its more stringent seismic design and detailing, is found to have greater robustness. PMID:28890599

  10. Fuzzy risk analysis of a modern γ-ray industrial irradiator.

    PubMed

    Castiglia, F; Giardina, M

    2011-06-01

    Fuzzy fault tree analyses were used to investigate accident scenarios that involve radiological exposure to operators working in industrial γ-ray irradiation facilities. The HEART method, a first generation human reliability analysis method, was used to evaluate the probability of adverse human error in these analyses. This technique was modified on the basis of fuzzy set theory to more directly take into account the uncertainties in the error-promoting factors on which the methodology is based. Moreover, with regard to some identified accident scenarios, fuzzy radiological exposure risk, expressed in terms of potential annual death, was evaluated. The calculated fuzzy risks for the examined plant were determined to be well below the reference risk suggested by International Commission on Radiological Protection.
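
    A hedged sketch of a HEART-style calculation with a crude fuzzy treatment, where each quantity is a triangular number (low, mode, high) multiplied component-wise as a first approximation; the task and error-producing-condition values are illustrative, not those of the study:

    ```python
    # Toy HEART-style human error probability with triangular fuzzy numbers.
    def tri_mul(a, b):
        """Component-wise product of triangular numbers (approximation)."""
        return tuple(x * y for x, y in zip(a, b))

    # Generic task nominal human error probability (fuzzy, illustrative).
    nominal_hep = (0.001, 0.003, 0.010)

    # Error-producing conditions: (maximum multiplier EPC, assessed proportion
    # of affect APOA as a fuzzy number); values are invented.
    epcs = [
        (11.0, (0.2, 0.3, 0.4)),     # e.g. time shortage
        (4.0, (0.1, 0.2, 0.3)),      # e.g. poor feedback
    ]

    hep = nominal_hep
    for epc, apoa in epcs:
        factor = tuple((epc - 1.0) * p + 1.0 for p in apoa)
        hep = tri_mul(hep, factor)

    print("Fuzzy HEP (low, mode, high):", tuple(round(x, 4) for x in hep))
    ```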

  11. Comparative Analysis of Modeling Studies on China's Future Energy and Emissions Outlook

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zheng, Nina; Zhou, Nan; Fridley, David

    The past decade has seen the development of various scenarios describing long-term patterns of future Greenhouse Gas (GHG) emissions, with each new approach adding insights to our understanding of the changing dynamics of energy consumption and aggregate future energy trends. With the recent growing focus on China's energy use and emission mitigation potential, a range of Chinese outlook models have been developed across different institutions, including in China's Energy Research Institute's 2050 China Energy and CO2 Emissions Report, McKinsey & Co's China's Green Revolution report, the UK Sussex Energy Group and Tyndall Centre's China's Energy Transition report, and the China-specific section of the IEA World Energy Outlook 2009. At the same time, the China Energy Group at Lawrence Berkeley National Laboratory (LBNL) has developed a bottom-up, end-use energy model for China with scenario analysis of energy and emission pathways out to 2050. A robust and credible energy and emission model will play a key role in informing policymakers by assessing efficiency policy impacts and understanding the dynamics of future energy consumption and energy saving and emission reduction potential. This is especially true for developing countries such as China, where uncertainties are greater while the economy continues to undergo rapid growth and industrialization. A slightly different assumption or storyline could result in significant discrepancies among different model results. Therefore, it is necessary to understand the key models in terms of their scope, methodologies, key driver assumptions and the associated findings. A comparative analysis of LBNL's energy end-use model scenarios with the five above studies was thus conducted to examine similarities and divergences in methodologies, scenario storylines, macroeconomic drivers and assumptions as well as aggregate energy and emission scenario results. Besides directly tracing different energy and CO2 savings potential back to the underlying strategies and combination of efficiency and abatement policy instruments represented by each scenario, this analysis also had other important but often overlooked findings.

  12. Coupled ecosystem/supply chain modelling of fish products from sea to shelf: the Peruvian anchoveta case.

    PubMed

    Avadí, Angel; Fréon, Pierre; Tam, Jorge

    2014-01-01

    Sustainability assessment of food supply chains is relevant for global sustainable development. A framework is proposed for analysing fishfood (fish products for direct human consumption) supply chains with local or international scopes. It combines a material flow model (including an ecosystem dimension) of the supply chains, calculation of sustainability indicators (environmental, socio-economic, nutritional), and finally multi-criteria comparison of alternative supply chains (e.g. fates of landed fish) and future exploitation scenarios. The Peruvian anchoveta fishery is the starting point for various local and global supply chains, especially via reduction of anchoveta into fishmeal and oil, used worldwide as a key input in livestock and fish feeds. The Peruvian anchoveta supply chains are described, and the proposed methodology is used to model them. Three scenarios were explored: status quo of fish exploitation (Scenario 1), increase in anchoveta landings for food (Scenario 2), and radical decrease in total anchoveta landings to allow other fish stocks to prosper (Scenario 3). It was found that Scenario 2 provided the best balance of sustainability improvements among the three scenarios, but further refinement of the assessment is recommended. In the long term, the best opportunities for improving the environmental and socio-economic performance of Peruvian fisheries are related to sustainability-improving management and policy changes affecting the reduction industry. Our approach provides the tools and quantitative results to identify these best improvement opportunities.

  13. Coupled Ecosystem/Supply Chain Modelling of Fish Products from Sea to Shelf: The Peruvian Anchoveta Case

    PubMed Central

    Avadí, Angel; Fréon, Pierre; Tam, Jorge

    2014-01-01

    Sustainability assessment of food supply chains is relevant for global sustainable development. A framework is proposed for analysing fishfood (fish products for direct human consumption) supply chains with local or international scopes. It combines a material flow model (including an ecosystem dimension) of the supply chains, calculation of sustainability indicators (environmental, socio-economic, nutritional), and finally multi-criteria comparison of alternative supply chains (e.g. fates of landed fish) and future exploitation scenarios. The Peruvian anchoveta fishery is the starting point for various local and global supply chains, especially via reduction of anchoveta into fishmeal and oil, used worldwide as a key input in livestock and fish feeds. The Peruvian anchoveta supply chains are described, and the proposed methodology is used to model them. Three scenarios were explored: status quo of fish exploitation (Scenario 1), increase in anchoveta landings for food (Scenario 2), and radical decrease in total anchoveta landings to allow other fish stocks to prosper (Scenario 3). It was found that Scenario 2 provided the best balance of sustainability improvements among the three scenarios, but further refinement of the assessment is recommended. In the long term, the best opportunities for improving the environmental and socio-economic performance of Peruvian fisheries are related to sustainability-improving management and policy changes affecting the reduction industry. Our approach provides the tools and quantitative results to identify these best improvement opportunities. PMID:25003196

  14. Air quality impacts of distributed energy resources implemented in the northeastern United States.

    PubMed

    Carreras-Sospedra, Marc; Dabdub, Donald; Brouwer, Jacob; Knipping, Eladio; Kumar, Naresh; Darrow, Ken; Hampson, Anne; Hedman, Bruce

    2008-07-01

Emissions from the potential installation of distributed energy resources (DER) in the place of current utility-scale power generators have been introduced into an emissions inventory of the northeastern United States. A methodology for predicting future market penetration of DER that considers economics and emission factors was used to estimate the most likely implementation of DER. The methodology results in spatially and temporally resolved emission profiles of criteria pollutants that are subsequently introduced into a detailed atmospheric chemistry and transport model of the region. The DER technology determined by the methodology includes 62% reciprocating engines, 34% gas turbines, and 4% fuel cells and other emerging technologies. The introduction of DER leads to retirement of 2625 MW of existing power plants for which emissions are removed from the inventory. The air quality model predicts maximum differences in air pollutant concentrations that are located downwind from the central power plants that were removed from the domain. Maximum decreases in hourly peak ozone concentrations due to DER use are 10 ppb and are located over the state of New Jersey. Maximum decreases in 24-hr average fine particulate matter (PM2.5) concentrations reach 3 microg/m3 and are located off the coast of New Jersey and New York. The main contribution to decreased PM2.5 is the reduction of sulfate levels due to significant reductions in direct emissions of sulfur oxides (SO(x)) from the DER compared with the central power plants removed. The scenario presented here represents an accelerated DER penetration case with aggressive emission reductions due to removal of highly emitting power plants. Such a scenario provides an upper bound on the air quality benefits of DER implementation scenarios.

  15. An object-oriented approach to risk and reliability analysis : methodology and aviation safety applications.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dandini, Vincent John; Duran, Felicia Angelica; Wyss, Gregory Dane

    2003-09-01

This article describes how features of event tree analysis and Monte Carlo-based discrete event simulation can be combined with concepts from object-oriented analysis to develop a new risk assessment methodology, with some of the best features of each. The resultant object-based event scenario tree (OBEST) methodology enables an analyst to rapidly construct realistic models for scenarios for which an a priori discovery of event ordering is either cumbersome or impossible. Each scenario produced by OBEST is automatically associated with a likelihood estimate because probabilistic branching is integral to the object model definition. The OBEST methodology is then applied to an aviation safety problem that considers mechanisms by which an aircraft might become involved in a runway incursion incident. The resulting OBEST model demonstrates how a close link between human reliability analysis and probabilistic risk assessment methods can provide important insights into aviation safety phenomenology.
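
    Because probabilistic branching is built into each object's definition, every OBEST scenario carries a likelihood. A minimal event-tree-style sketch of that idea, with invented events and branch probabilities rather than the OBEST aviation model, is:

      # Simplified illustration (not OBEST itself): enumerate event-tree scenarios and
      # accumulate the probability of each path. Events and probabilities are invented.
      from itertools import product

      # Each event has mutually exclusive outcomes with branch probabilities.
      events = [
          ("clearance readback", {"correct": 0.98, "misheard": 0.02}),
          ("controller detects conflict", {"yes": 0.90, "no": 0.10}),
          ("crew detects conflict", {"yes": 0.80, "no": 0.20}),
      ]

      scenarios = []
      for outcomes in product(*(ev[1].items() for ev in events)):
          prob, path = 1.0, {}
          for (name, _), (outcome, p) in zip(events, outcomes):
              prob *= p
              path[name] = outcome
          scenarios.append((path, prob))

      # Rank scenarios by likelihood, flagging the ones that end in an undetected conflict
      for path, prob in sorted(scenarios, key=lambda s: -s[1]):
          undetected = path["controller detects conflict"] == "no" and path["crew detects conflict"] == "no"
          label = "INCURSION RISK" if undetected else "recovered"
          print(f"{prob:.4f}  {label:14s}  {path}")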

  16. Handling the difficult Brownfields issues: A case study of privately funded remediation to residential standards update 1997

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McLeod, D.P.; Ridley, A.P.

Most Brownfields projects are based on either direct or indirect government funding. This paper describes a more unusual scenario: the privately funded remediation of a contaminated industrial site for re-use as residential property. Using the ASTM RBCA risk assessment methodology and an innovative fixed-fee arrangement, Woodward-Clyde Consultants and the site owner developed and successfully implemented a plan to clean up the site to residential standards over a 12-month period.

  17. SERA Scenarios of Early Market Fuel Cell Electric Vehicle Introductions: Modeling Framework, Regional Markets, and Station Clustering; NREL (National Renewable Energy Laboratory)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Melaina, M.

    This presentation provides an overview of the Scenario Evaluation and Regionalization Analysis (SERA) model, describes the methodology for developing scenarios for hydrogen infrastructure development, outlines an example "Hydrogen Success" scenario, and discusses detailed scenario metrics for a particular case study region, the Northeast Corridor.

  18. Effects of Scenario Planning on Participant Mental Models

    ERIC Educational Resources Information Center

    Glick, Margaret B.; Chermack, Thomas J.; Luckel, Henry; Gauck, Brian Q.

    2012-01-01

    Purpose: The purpose of this paper is to assess the effects of scenario planning on participant mental model styles. Design/methodology/approach: The scenario planning literature is consistent with claims that scenario planning can change individual mental models. These claims are supported by anecdotal evidence and stories from the practical…

  19. Methodological issues in the choice among different drugs approved for the same therapeutic indication: a position paper by the Italian Association of Medical Oncology (AIOM)

    PubMed Central

    Bruzzi, Paolo; Perrone, Francesco; Torri, Valter; Montemurro, Filippo; Tiseo, Marcello; Vasile, Enrico

    2016-01-01

In oncology, as in other clinical fields, different treatments are often approved for the same therapeutic indication. In many cases, no direct comparisons are available to inform the choice in clinical practice. In 2015, the Italian Association of Medical Oncology (AIOM) instructed a working group, including both clinicians and methodologists, to discuss the issue of the best choice among different treatments available for the same indication. The working group discussed 3 different scenarios: (1) biosimilar drugs; (2) different drugs with the same mechanism of action; (3) different drugs with different mechanisms of action. For each scenario, methodological issues were discussed, along with the priority for investment of resources in the conduct of clinical trials testing direct comparison. As for biosimilar drugs, the panel recommended that, following a comparability exercise and approval by regulatory agencies, they should be widely used, considering that their use allows financial savings. As for different drugs (with either the same or a different mechanism of action), the panel agreed that indirect comparisons and network meta-analyses are associated with relevant risk of bias and imprecision, and direct comparisons should be encouraged. The priority of these direct comparisons should be higher when the potential differences in efficacy and/or toxicity are clinically relevant. The choice of the study design (superiority vs non-inferiority) depends on the toxicity profiles and also on the presumed difference in efficacy. Scientific societies should put pressure on public bodies to identify all the administrative and financial mechanisms useful to facilitate the conduct of trials testing direct comparisons, when needed. Decisions about therapeutic equivalence can have important consequences for innovation: the availability of drugs characterised by the same effectiveness, but at a lower cost, could enable non-negligible savings of economic resources that could be used to guarantee access to innovative, high-cost drugs. PMID:28255452

  20. Methodological issues in the choice among different drugs approved for the same therapeutic indication: a position paper by the Italian Association of Medical Oncology (AIOM).

    PubMed

    Di Maio, Massimo; Bruzzi, Paolo; Perrone, Francesco; Torri, Valter; Montemurro, Filippo; Tiseo, Marcello; Vasile, Enrico

    2016-01-01

In oncology, as in other clinical fields, different treatments are often approved for the same therapeutic indication. In many cases, no direct comparisons are available to inform the choice in clinical practice. In 2015, the Italian Association of Medical Oncology (AIOM) instructed a working group, including both clinicians and methodologists, to discuss the issue of the best choice among different treatments available for the same indication. The working group discussed 3 different scenarios: (1) biosimilar drugs; (2) different drugs with the same mechanism of action; (3) different drugs with different mechanisms of action. For each scenario, methodological issues were discussed, along with the priority for investment of resources in the conduct of clinical trials testing direct comparison. As for biosimilar drugs, the panel recommended that, following a comparability exercise and approval by regulatory agencies, they should be widely used, considering that their use allows financial savings. As for different drugs (with either the same or a different mechanism of action), the panel agreed that indirect comparisons and network meta-analyses are associated with relevant risk of bias and imprecision, and direct comparisons should be encouraged. The priority of these direct comparisons should be higher when the potential differences in efficacy and/or toxicity are clinically relevant. The choice of the study design (superiority vs non-inferiority) depends on the toxicity profiles and also on the presumed difference in efficacy. Scientific societies should put pressure on public bodies to identify all the administrative and financial mechanisms useful to facilitate the conduct of trials testing direct comparisons, when needed. Decisions about therapeutic equivalence can have important consequences for innovation: the availability of drugs characterised by the same effectiveness, but at a lower cost, could enable non-negligible savings of economic resources that could be used to guarantee access to innovative, high-cost drugs.

  1. Developing ecological scenarios for the prospective aquatic risk assessment of pesticides.

    PubMed

    Rico, Andreu; Van den Brink, Paul J; Gylstra, Ronald; Focks, Andreas; Brock, Theo Cm

    2016-07-01

The prospective aquatic environmental risk assessment (ERA) of pesticides is generally based on the comparison of predicted environmental concentrations in edge-of-field surface waters with regulatory acceptable concentrations derived from laboratory and/or model ecosystem experiments with aquatic organisms. New improvements in mechanistic effect modeling have allowed a better characterization of the ecological risks of pesticides through the incorporation of biological trait information and landscape parameters to assess individual, population and/or community-level effects and recovery. Similarly to exposure models, ecological models require scenarios that describe the environmental context in which they are applied. In this article, we propose a conceptual framework for the development of ecological scenarios that, when merged with exposure scenarios, will constitute environmental scenarios for prospective aquatic ERA. These "unified" environmental scenarios are defined as the combination of the biotic and abiotic parameters that are required to characterize exposure, (direct and indirect) effects, and recovery of aquatic nontarget species under realistic worst-case conditions. Ideally, environmental scenarios aim to avoid a potential mismatch between the parameter values and the spatial-temporal scales currently used in aquatic exposure and effect modeling. This requires a deeper understanding of the ecological entities we intend to protect, which can be preliminarily addressed by the formulation of ecological scenarios. In this article we present a methodological approach for the development of ecological scenarios and illustrate this approach by a case-study for Dutch agricultural ditches and the example focal species Sialis lutaria. Finally, we discuss the applicability of ecological scenarios in ERA and propose research needs and recommendations for their development and integration with exposure scenarios. Integr Environ Assess Manag 2016;12:510-521. © 2015 SETAC.

  2. Probabilistic Based Modeling and Simulation Assessment

    DTIC Science & Technology

    2010-06-01

different crash and blast scenarios. With the integration of the high-fidelity neck and head model, a methodology to calculate the probability of injury... variability, correlation, and multiple (often competing) failure metrics. Important scenarios include vehicular collisions, blast/fragment impact, and... first area of focus is to develop a methodology to integrate probabilistic analysis into finite element analysis of vehicle collisions and blast.

  3. Developing a stochastic conflict resolution model for urban runoff quality management: Application of info-gap and bargaining theories

    NASA Astrophysics Data System (ADS)

    Ghodsi, Seyed Hamed; Kerachian, Reza; Estalaki, Siamak Malakpour; Nikoo, Mohammad Reza; Zahmatkesh, Zahra

    2016-02-01

In this paper, two deterministic and stochastic multilateral, multi-issue, non-cooperative bargaining methodologies are proposed for urban runoff quality management. In the proposed methodologies, a calibrated Storm Water Management Model (SWMM) is used to simulate stormwater runoff quantity and quality for different urban stormwater runoff management scenarios, which have been defined considering several Low Impact Development (LID) techniques. In the deterministic methodology, the best management scenario, representing location and area of LID controls, is identified using the bargaining model. In the stochastic methodology, uncertainties of some key parameters of SWMM are analyzed using info-gap theory. For each water quality management scenario, robustness and opportuneness criteria are determined based on utility functions of different stakeholders. Then, to find the best solution, the bargaining model is performed considering a combination of robustness and opportuneness criteria for each scenario based on the utility function of each stakeholder. The results of applying the proposed methodology in the Velenjak urban watershed located in the northeastern part of Tehran, the capital city of Iran, illustrate its practical utility for conflict resolution in urban water quantity and quality management. It is shown that the solution obtained using the deterministic model cannot outperform the result of the stochastic model considering the robustness and opportuneness criteria. Therefore, it can be concluded that the stochastic model, which incorporates the main uncertainties, could provide more reliable results.
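
    Info-gap robustness and opportuneness, as used above, can be pictured for a single scenario with one uncertain parameter: robustness is the largest uncertainty horizon at which the worst-case utility still meets a critical requirement, and opportuneness is the smallest horizon at which a windfall becomes possible. A toy sketch (hypothetical utility function and thresholds, not the SWMM-based model):

      # Illustrative info-gap sketch (not the authors' model). For a scenario whose
      # utility depends on one uncertain parameter u with nominal value u0, robustness
      # is the largest horizon alpha such that the WORST utility over |u - u0| <= alpha*u0
      # still meets a critical requirement; opportuneness is the smallest alpha at which
      # the BEST utility reaches an aspiration level.
      import numpy as np

      def utility(u):                      # hypothetical utility of a runoff-quality scenario
          return 100.0 - 40.0 * u          # degrades as the uncertain parameter grows

      u0 = 1.0                             # nominal parameter value
      u_req, u_asp = 50.0, 75.0            # critical requirement and windfall aspiration

      robustness, opportuneness = None, None
      for a in np.linspace(0.0, 1.0, 201):
          lo, hi = u0 * (1 - a), u0 * (1 + a)
          worst = min(utility(lo), utility(hi))
          best = max(utility(lo), utility(hi))
          if worst >= u_req:
              robustness = a               # requirement still met at this horizon
          if opportuneness is None and best >= u_asp:
              opportuneness = a            # windfall first becomes possible here
      print(f"robustness alpha-hat = {robustness:.2f}")
      print(f"opportuneness beta-hat = {opportuneness:.2f}")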

  4. Eco-efficiency for greenhouse gas emissions mitigation of municipal solid waste management: a case study of Tianjin, China.

    PubMed

    Zhao, Wei; Huppes, Gjalt; van der Voet, Ester

    2011-06-01

The issue of municipal solid waste (MSW) management has been highlighted in China due to the continually increasing MSW volumes being generated and the limited capacity of waste treatment facilities. This article presents a quantitative eco-efficiency (E/E) analysis on MSW management in terms of greenhouse gas (GHG) mitigation. A methodology for E/E analysis has been proposed, with an emphasis on the consistent integration of life cycle assessment (LCA) and life cycle costing (LCC). The environmental and economic impacts derived from LCA and LCC have been normalized and defined as a quantitative E/E indicator. The proposed method was applied in a case study of Tianjin, China. The study assessed the current MSW management system, as well as a set of alternative scenarios, to investigate trade-offs between economy and GHG emissions mitigation. Additionally, contribution analysis was conducted on both LCA and LCC to identify key issues driving environmental and economic impacts. The results show that Tianjin's current MSW management system emits the most GHG and costs the least, whereas the situation is reversed in the integrated scenario. The key issues identified by the contribution analysis show no linear relationship between the global warming impact and the cost impact in the MSW management system. The landfill gas utilization scenario is indicated as a potential optimum scenario by the proposed E/E analysis, given the characteristics of MSW, technology levels, and chosen methodologies. The E/E analysis provides an attractive direction towards sustainable waste management, though some questions with respect to uncertainty need to be discussed further. Copyright © 2011 Elsevier Ltd. All rights reserved.
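
    One common way to turn LCA and LCC results into a single eco-efficiency figure is to relate the GHG mitigated by a scenario to its extra cost relative to the baseline. The sketch below does this with invented numbers; it illustrates the screening idea only, not the paper's normalisation procedure:

      # Illustrative eco-efficiency screening (not the paper's exact indicator).
      # GHG (kt CO2-eq/yr) and life-cycle cost (million RMB/yr) per scenario are invented.
      scenarios = {
          "current system":           {"ghg": 1200.0, "cost": 300.0},
          "landfill gas utilisation": {"ghg": 950.0,  "cost": 330.0},
          "integrated scenario":      {"ghg": 700.0,  "cost": 520.0},
      }
      base = scenarios["current system"]

      print(f"{'scenario':28s} {'dGHG':>8s} {'dCost':>8s} {'E/E':>8s}")
      for name, s in scenarios.items():
          d_ghg = base["ghg"] - s["ghg"]        # GHG mitigated relative to baseline
          d_cost = s["cost"] - base["cost"]     # extra cost relative to baseline
          ee = d_ghg / d_cost if d_cost > 0 else float("inf")   # mitigation per unit extra cost
          print(f"{name:28s} {d_ghg:8.0f} {d_cost:8.0f} {ee:8.2f}")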

  5. A Maritime Phase Zero Force for the Year 2020

    DTIC Science & Technology

    2009-06-01

mind, the team constructed maritime forces and then evaluated them against the same scenarios to determine which ones performed better. ... Project Methodology and Choice of Missions ... Missions and Scenarios construction methodology ... projects are designed to build tools that students in the Systems Engineering Analysis curriculum have learned over the 18-month enrollment in the program

  6. Benefit/cost comparison for utility SMES applications

    NASA Astrophysics Data System (ADS)

    Desteese, J. G.; Dagle, J. E.

    1991-08-01

    This paper summarizes eight case studies that account for the benefits and costs of superconducting magnetic energy storage (SMES) in system-specific utility applications. Four of these scenarios are hypothetical SMES applications in the Pacific Northwest, where relatively low energy costs impose a stringent test on the viability of the concept. The other four scenarios address SMES applications on high-voltage, direct-current (HVDC) transmission lines. While estimated SMES benefits are based on a previously reported methodology, this paper presents results of an improved cost-estimating approach that includes an assumed reduction in the cost of the power conditioning system (PCS) from approximately $160/kW to $80/kW. The revised approach results in all the SMES scenarios showing higher benefit/cost ratios than those reported earlier. However, in all but two cases, the value of any single benefit is still less than the unit's levelized cost. This suggests, as a general principle, that the total value of multiple benefits should always be considered if SMES is to appear cost effective in many utility applications. These results should offer utilities further encouragement to conduct more detailed analyses of SMES benefits in scenarios that apply to individual systems.

  7. Practical Applications for Earthquake Scenarios Using ShakeMap

    NASA Astrophysics Data System (ADS)

    Wald, D. J.; Worden, B.; Quitoriano, V.; Goltz, J.

    2001-12-01

In planning and coordinating emergency response, utilities, local government, and other organizations are best served by conducting training exercises based on realistic earthquake situations: ones that they are most likely to face. Scenario earthquakes can fill this role; they can be generated for any geologically plausible earthquake or for actual historic earthquakes. ShakeMap Web pages now display selected earthquake scenarios (www.trinet.org/shake/archive/scenario/html) and more events will be added as they are requested and produced. We will discuss the methodology and provide practical examples where these scenarios are used directly for risk reduction. Given a selected event, we have developed tools to make it relatively easy to generate a ShakeMap earthquake scenario using the following steps: 1) Assume a particular fault or fault segment will (or did) rupture over a certain length, 2) Determine the magnitude of the earthquake based on assumed rupture dimensions, 3) Estimate the ground shaking at all locations in the chosen area around the fault, and 4) Represent these motions visually by producing ShakeMaps and generating ground motion input for loss estimation modeling (e.g., FEMA's HAZUS). At present, ground motions are estimated using empirical attenuation relationships to estimate peak ground motions on rock conditions. We then correct the amplitude at that location based on the local site soil (NEHRP) conditions as we do in the general ShakeMap interpolation scheme. Finiteness is included explicitly, but directivity enters only through the empirical relations. Although current ShakeMap earthquake scenarios are empirically based, substantial improvements in numerical ground motion modeling have been made in recent years. However, loss estimation tools, HAZUS for example, typically require relatively high frequency (3 Hz) input for predicting losses, above the range of frequencies successfully modeled to date. Achieving full-synthetic ground motion estimates that will substantially improve over empirical relations at these frequencies will require developing cost-effective numerical tools for proper theoretical inclusion of known complex ground motion effects. Current efforts underway must continue in order to obtain site, basin, and deeper crustal structure, and to characterize and test 3D earth models (including attenuation and nonlinearity). In contrast, longer period synthetics (>2 sec) are currently being generated in a deterministic fashion to include 3D and shallow site effects, an improvement on empirical estimates alone. As progress is made, we will naturally incorporate such advances into the ShakeMap scenario earthquake and processing methodology. Our scenarios are currently used heavily in emergency response planning and loss estimation. Primary users include city, county, state and federal government agencies (e.g., the California Office of Emergency Services, FEMA, the County of Los Angeles) as well as emergency response planners and managers for utilities, businesses, and other large organizations. We have found the scenarios are also of fundamental interest to many in the media and the general community interested in the nature of the ground shaking likely experienced in past earthquakes as well as effects of rupture on known faults in the future.
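
    Steps 3 and 4 above amount to evaluating an empirical attenuation relationship on rock and then applying a NEHRP site correction at each grid point. The following sketch mimics that pipeline with a purely illustrative attenuation form and invented amplification factors; it is not a published ground-motion prediction equation or the ShakeMap code:

      # Toy scenario-shaking sketch. The attenuation form and site factors below are
      # illustrative placeholders, NOT a published ground-motion prediction equation.
      import numpy as np

      def pga_rock(magnitude, r_km):
          """Peak ground acceleration (g) on rock from a generic, illustrative attenuation form."""
          return np.exp(-1.0 + 0.5 * magnitude - 1.3 * np.log(r_km + 10.0))

      SITE_AMP = {"B": 1.0, "C": 1.3, "D": 1.6, "E": 2.0}   # invented NEHRP-class amplification factors

      def scenario_pga(magnitude, distances_km, site_classes):
          rock = pga_rock(magnitude, np.asarray(distances_km, dtype=float))
          amp = np.array([SITE_AMP[c] for c in site_classes])
          return rock * amp                                  # site-corrected peak motions

      # Hypothetical M6.8 rupture: three grid points at 5, 20 and 60 km on class C/D/E soils
      for d, c, g in zip([5.0, 20.0, 60.0], "CDE", scenario_pga(6.8, [5.0, 20.0, 60.0], "CDE")):
          print(f"{d:4.0f} km, class {c}: PGA ~ {g:.3f} g")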

  8. The Scenario-Based Engineering Process (SEP): a user-centered approach for the development of health care systems.

    PubMed

    Harbison, K; Kelly, J; Burnell, L; Silva, J

    1995-01-01

    The Scenario-based Engineering Process (SEP) is a user-focused methodology for large and complex system design. This process supports new application development from requirements analysis with domain models to component selection, design and modification, implementation, integration, and archival placement. It is built upon object-oriented methodologies, domain modeling strategies, and scenario-based techniques to provide an analysis process for mapping application requirements to available components. We are using SEP in the health care applications that we are developing. The process has already achieved success in the manufacturing and military domains and is being adopted by many organizations. SEP should prove viable in any domain containing scenarios that can be decomposed into tasks.

  9. A multi-criteria decision aid methodology to design electric vehicles public charging networks

    NASA Astrophysics Data System (ADS)

    Raposo, João; Rodrigues, Ana; Silva, Carlos; Dentinho, Tomaz

    2015-05-01

This article presents a new multi-criteria decision aid methodology, dynamic-PROMETHEE, here used to design electric vehicle charging networks. In applying this methodology to a Portuguese city, results suggest that it is effective in designing electric vehicle charging networks, generating time and policy based scenarios, considering supply and demand and the city's urban structure. Dynamic-PROMETHEE adds to PROMETHEE's already known characteristics other useful features, such as decision memory over time, versatility and adaptability. The case study, used here to present the dynamic-PROMETHEE, served as inspiration and basis for creating this new methodology. It can be used to model different problems and scenarios that may present similar requirement characteristics.
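
    For readers unfamiliar with PROMETHEE, the core calculation compares alternatives pairwise on each criterion with a preference function and aggregates the results into positive, negative and net outranking flows. A minimal PROMETHEE II sketch with invented charging-site data follows; it does not include the dynamic extension proposed in the paper:

      # Minimal PROMETHEE II ranking sketch (plain PROMETHEE, not the paper's dynamic
      # extension). Alternatives, criteria, weights and thresholds are invented.
      import numpy as np

      alts = ["site A", "site B", "site C"]
      # columns: expected demand (max), installation cost (min), grid headroom (max)
      X = np.array([
          [120.0,  80.0, 0.6],
          [200.0, 140.0, 0.3],
          [150.0,  95.0, 0.8],
      ])
      maximize = np.array([True, False, True])
      weights = np.array([0.5, 0.3, 0.2])
      p = np.array([80.0, 60.0, 0.5])        # linear (V-shape) preference thresholds per criterion

      n = len(alts)
      pi = np.zeros((n, n))                  # aggregated preference of a over b
      for a in range(n):
          for b in range(n):
              if a == b:
                  continue
              d = X[a] - X[b]
              d[~maximize] *= -1.0           # turn "min" criteria into gains
              pref = np.clip(d / p, 0.0, 1.0)   # linear V-shape preference function
              pi[a, b] = float(weights @ pref)

      phi_plus = pi.sum(axis=1) / (n - 1)    # positive outranking flow
      phi_minus = pi.sum(axis=0) / (n - 1)   # negative outranking flow
      net = phi_plus - phi_minus
      for name, f in sorted(zip(alts, net), key=lambda t: -t[1]):
          print(f"{name}: net flow = {f:+.3f}")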

  10. Transitioning from Software Requirements Models to Design Models

    NASA Technical Reports Server (NTRS)

    Lowry, Michael (Technical Monitor); Whittle, Jon

    2003-01-01

Summary: 1. Proof-of-concept of state machine synthesis from scenarios - CTAS case study. 2. The CTAS team wants to use the synthesis algorithm to validate trajectory generation. 3. Extending the synthesis algorithm towards requirements validation: (a) scenario relationships, (b) methodology for generalizing/refining scenarios, and (c) interaction patterns to control synthesis. 4. Initial ideas tested on conflict detection scenarios.

  11. The use of concept maps during knowledge elicitation in ontology development processes – the nutrigenomics use case

    PubMed Central

    Castro, Alexander Garcia; Rocca-Serra, Philippe; Stevens, Robert; Taylor, Chris; Nashar, Karim; Ragan, Mark A; Sansone, Susanna-Assunta

    2006-01-01

    Background Incorporation of ontologies into annotations has enabled 'semantic integration' of complex data, making explicit the knowledge within a certain field. One of the major bottlenecks in developing bio-ontologies is the lack of a unified methodology. Different methodologies have been proposed for different scenarios, but there is no agreed-upon standard methodology for building ontologies. The involvement of geographically distributed domain experts, the need for domain experts to lead the design process, the application of the ontologies and the life cycles of bio-ontologies are amongst the features not considered by previously proposed methodologies. Results Here, we present a methodology for developing ontologies within the biological domain. We describe our scenario, competency questions, results and milestones for each methodological stage. We introduce the use of concept maps during knowledge acquisition phases as a feasible transition between domain expert and knowledge engineer. Conclusion The contributions of this paper are the thorough description of the steps we suggest when building an ontology, example use of concept maps, consideration of applicability to the development of lower-level ontologies and application to decentralised environments. We have found that within our scenario conceptual maps played an important role in the development process. PMID:16725019

  12. Improved water resource management for a highly complex environment using three-dimensional groundwater modelling

    NASA Astrophysics Data System (ADS)

    Moeck, Christian; Affolter, Annette; Radny, Dirk; Dressmann, Horst; Auckenthaler, Adrian; Huggenberger, Peter; Schirmer, Mario

    2018-02-01

    A three-dimensional groundwater model was used to improve water resource management for a study area in north-west Switzerland, where drinking-water production is close to former landfills and industrial areas. To avoid drinking-water contamination, artificial groundwater recharge with surface water is used to create a hydraulic barrier between the contaminated sites and drinking-water extraction wells. The model was used for simulating existing and proposed water management strategies as a tool to ensure the utmost security for drinking water. A systematic evaluation of the flow direction between existing observation points using a developed three-point estimation method for a large number of scenarios was carried out. It is demonstrated that systematically applying the developed methodology helps to identify vulnerable locations which are sensitive to changing boundary conditions such as those arising from changes to artificial groundwater recharge rates. At these locations, additional investigations and protection are required. The presented integrated approach, using the groundwater flow direction between observation points, can be easily transferred to a variety of hydrological settings to systematically evaluate groundwater modelling scenarios.
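
    The three-point estimation of flow direction mentioned above is the classic construction of fitting a plane through the hydraulic heads of three observation wells and taking the down-gradient direction. A small sketch with invented well coordinates and heads (not the study's data or code):

      # Three-point estimate of groundwater flow direction (standard hydrogeology
      # construction, not the authors' code). Well coordinates and heads are invented.
      import numpy as np

      # (x, y) in metres and hydraulic head h in metres a.s.l. for three observation wells
      wells = np.array([
          [0.0,     0.0, 265.30],
          [400.0,  50.0, 264.85],
          [150.0, 380.0, 264.60],
      ])

      # Fit the plane h = a*x + b*y + c through the three heads
      A = np.column_stack([wells[:, 0], wells[:, 1], np.ones(3)])
      a, b, c = np.linalg.solve(A, wells[:, 2])

      gradient = np.hypot(a, b)                      # magnitude of the head gradient
      # Flow is down-gradient, i.e. along -(a, b); azimuth measured clockwise from north (+y)
      azimuth = (np.degrees(np.arctan2(-a, -b)) + 360.0) % 360.0
      print(f"head gradient = {gradient:.5f} m/m, flow azimuth = {azimuth:.1f} deg from north")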

  13. Participatory Development and Analysis of a Fuzzy Cognitive Map of the Establishment of a Bio-Based Economy in the Humber Region

    PubMed Central

    Penn, Alexandra S.; Knight, Christopher J. K.; Lloyd, David J. B.; Avitabile, Daniele; Kok, Kasper; Schiller, Frank; Woodward, Amy; Druckman, Angela; Basson, Lauren

    2013-01-01

    Fuzzy Cognitive Mapping (FCM) is a widely used participatory modelling methodology in which stakeholders collaboratively develop a ‘cognitive map’ (a weighted, directed graph), representing the perceived causal structure of their system. This can be directly transformed by a workshop facilitator into simple mathematical models to be interrogated by participants by the end of the session. Such simple models provide thinking tools which can be used for discussion and exploration of complex issues, as well as sense checking the implications of suggested causal links. They increase stakeholder motivation and understanding of whole systems approaches, but cannot be separated from an intersubjective participatory context. Standard FCM methodologies make simplifying assumptions, which may strongly influence results, presenting particular challenges and opportunities. We report on a participatory process, involving local companies and organisations, focussing on the development of a bio-based economy in the Humber region. The initial cognitive map generated consisted of factors considered key for the development of the regional bio-based economy and their directional, weighted, causal interconnections. A verification and scenario generation procedure, to check the structure of the map and suggest modifications, was carried out with a second session. Participants agreed on updates to the original map and described two alternate potential causal structures. In a novel analysis all map structures were tested using two standard methodologies usually used independently: linear and sigmoidal FCMs, demonstrating some significantly different results alongside some broad similarities. We suggest a development of FCM methodology involving a sensitivity analysis with different mappings and discuss the use of this technique in the context of our case study. Using the results and analysis of our process, we discuss the limitations and benefits of the FCM methodology in this case and in general. We conclude by proposing an extended FCM methodology, including multiple functional mappings within one participant-constructed graph. PMID:24244303
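
    In both the linear and sigmoidal analyses, an FCM is iterated by propagating a state vector through the weighted, directed graph and squashing the result with a transfer function until it settles. A generic sketch with an invented four-factor map (not the Humber map elicited in the workshops):

      # Generic FCM iteration sketch (illustrative factors and weights, not the Humber map).
      import numpy as np

      factors = ["feedstock supply", "biorefinery capacity", "policy support", "regional jobs"]
      # W[i, j] = signed influence of factor i on factor j, as elicited by participants
      W = np.array([
          [0.0, 0.6, 0.0, 0.2],
          [0.3, 0.0, 0.0, 0.7],
          [0.5, 0.4, 0.0, 0.0],
          [0.0, 0.0, 0.3, 0.0],
      ])

      def run_fcm(x0, transfer, steps=50):
          x = np.asarray(x0, dtype=float)
          for _ in range(steps):
              x = transfer(x @ W + x)       # incoming influence plus self-memory term
          return x

      sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
      linear = lambda z: np.clip(z, 0.0, 1.0)   # simple bounded "linear" mapping

      x0 = [0.5, 0.2, 0.8, 0.1]                 # scenario: strong policy-support push
      for name, f in [("sigmoidal", sigmoid), ("linear/clipped", linear)]:
          print(name, np.round(run_fcm(x0, f), 2), sep="  ")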

  14. Safety assessment methodology in management of spent sealed sources.

    PubMed

    Mahmoud, Narmine Salah

    2005-02-14

Environmental hazards can be caused by radioactive waste after its disposal. It was therefore important that safety assessment methodologies be developed and established to study and estimate the possible hazards, and to institute safety measures that prevent those hazards from evolving. Spent sealed sources are a specific type of radioactive waste. According to the IAEA definition, spent sealed sources are sources no longer used because of activity decay, damage, misuse, loss, or theft. Accidental exposure of humans to spent sealed sources can occur from the moment they become spent until their disposal. For that reason, safety assessment methodologies were tailored to suit the management of spent sealed sources. To provide understanding of and confidence in this study, a validation analysis was undertaken by considering the scenario of an accident that occurred in Egypt in June 2000 (the Meet-Halfa accident involving an iridium-192 source). The text of this work covers the safety assessment approach for spent sealed sources, which comprises the assessment context, the processes leading an active source to become spent, accident scenarios, mathematical models for dose calculations, and radiological consequences and regulatory criteria. The text also includes a validation study, carried out by comparing a theoretical scenario with the real scenario of the Meet-Halfa accident, drawing on the clinical assessment of the affected individuals.

  15. A scenario elicitation methodology to identify the drivers of electricity infrastructure cost in South America

    NASA Astrophysics Data System (ADS)

    Moksnes, Nandi; Taliotis, Constantinos; Broad, Oliver; de Moura, Gustavo; Howells, Mark

    2017-04-01

Developing a set of scenarios to assess a proposed policy or future development pathways requires a certain level of information, as well as establishing the socio-economic context. As the future is difficult to predict, great care in defining the selected scenarios is needed. Even so, it can be difficult to assess whether the selected scenarios cover the possible solution space. Instead, this paper's methodology develops a large set of scenarios (324) in OSeMOSYS using the SAMBA 2.0 (South America Model Base) model to assess long-term electricity supply scenarios and applies a scenario-discovery statistical data mining algorithm, the Patient Rule Induction Method (PRIM). By creating a multidimensional space, regions related to high and low cost can be identified, as well as their key drivers. The six key drivers are defined a priori with either three levels (high, medium, low) or two levels (high, low): 1) Demand projected from GDP, population, urbanization and transport, 2) Fossil fuel price, 3) Climate change impact on hydropower, 4) Renewable technology learning rate, 5) Discount rate, 6) CO2 emission targets.
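
    PRIM, as applied above, searches the ensemble of model runs for "boxes" of driver ranges in which the outcome of interest (e.g. high cost) is concentrated. The sketch below shows a heavily simplified peeling loop on synthetic data; production scenario-discovery tools also paste boxes back out, handle categorical levers, and report coverage/density trade-offs:

      # Highly simplified PRIM-style peeling (illustration only; driver names, data and
      # the outcome rule are synthetic, not the SAMBA 2.0 results).
      import numpy as np

      rng = np.random.default_rng(0)
      n = 2000
      X = rng.uniform(size=(n, 3))              # scenario ensemble over three drivers in [0, 1]
      names = ["demand growth", "fossil fuel price", "discount rate"]
      # "Cases of interest": high-cost outcomes, here driven mainly by demand and discount rate
      y = (0.6 * X[:, 0] + 0.3 * X[:, 2] + 0.1 * rng.uniform(size=n)) > 0.6

      box = np.array([[0.0, 1.0]] * 3)          # current box: full range of every driver
      alpha = 0.05                              # fraction of cases peeled per step

      def in_box(X, box):
          return np.all((X >= box[:, 0]) & (X <= box[:, 1]), axis=1)

      while in_box(X, box).sum() > 0.05 * n:    # stop once the box holds < 5% of cases
          best = None
          for d in range(3):
              for side in (0, 1):
                  trial = box.copy()
                  vals = X[in_box(X, box), d]
                  if side == 0:
                      trial[d, 0] = np.quantile(vals, alpha)
                  else:
                      trial[d, 1] = np.quantile(vals, 1 - alpha)
                  mask = in_box(X, trial)
                  if mask.sum() == 0:
                      continue
                  density = y[mask].mean()      # share of interesting cases inside the trial box
                  if best is None or density > best[0]:
                      best = (density, trial)
          box = best[1]

      mask = in_box(X, box)
      print("density %.2f, coverage %.2f" % (y[mask].mean(), y[mask].sum() / y.sum()))
      for d in range(3):
          print(f"  {names[d]}: [{box[d, 0]:.2f}, {box[d, 1]:.2f}]")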

  16. The space station assembly phase: Flight telerobotic servicer feasibility. Volume 2: Methodology and case study

    NASA Technical Reports Server (NTRS)

    Smith, Jeffrey H.; Gyamfi, Max A.; Volkmer, Kent; Zimmerman, Wayne F.

    1987-01-01

    A methodology is described for examining the feasibility of a Flight Telerobotic Servicer (FTS) using two assembly scenarios, defined at the EVA task level, for the 30 shuttle flights (beginning with MB-1) over a four-year period. Performing all EVA tasks by crew only is compared to a scenario in which crew EVA is augmented by FTS. A reference FTS concept is used as a technology baseline and life-cycle cost analysis is performed to highlight cost tradeoffs. The methodology, procedure, and data used to complete the analysis are documented in detail.

  17. Development of a new methodology for the creation of water temperature scenarios using frequency analysis tool.

    PubMed

    Val, Jonatan; Pino, María Rosa; Chinarro, David

    2018-03-15

Thermal quality in river ecosystems is a fundamental property for the development of biological processes and many of the human activities linked to the aquatic environment. In the future, this property is going to be threatened due to global change impacts, and basin managers will need useful tools to evaluate these impacts. Currently, future projections in temperature modelling are based on the historical data for air and water temperatures, and the relationship with past temperature scenarios; however, this represents a problem when evaluating future scenarios with new thermal impacts. Here, we analysed the thermal impacts produced by several human activities, and linked them with the decoupling degree of the thermal transfer mechanism from natural systems measured with frequency analysis tools (wavelet coherence). Once this relationship has been established, we develop a new methodology for simulating different thermal impact scenarios in order to project them into the future. Finally, we validate this methodology using a site that changed its thermal quality during the studied period due to human impacts. Results showed a high correlation (r2 = 0.84) between the decoupling degree of the thermal transfer mechanisms and the quantified human impacts, obtaining 3 thermal impact scenarios. Furthermore, the graphic representation of these thermal scenarios with their wavelet coherence spectra showed the impacts of an extreme drought period and the agricultural management. The inter-conversion between the scenarios gave high morphological similarities in the obtained wavelet coherence spectra, and the validation process clearly showed high efficiency of the developed model against old methodologies when compared using the Nash-Sutcliffe criterion. Although there is a need for further investigation with different climatic and anthropic management conditions, the developed frequency models could be useful in decision-making processes by managers when faced with future global change impacts. Copyright © 2017 Elsevier B.V. All rights reserved.
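
    The Nash-Sutcliffe criterion used in the validation is a standard goodness-of-fit measure: NSE = 1 - Σ(obs - sim)² / Σ(obs - mean(obs))², equal to 1 for a perfect model and below 0 when the model performs worse than the observed mean. A quick sketch with invented water-temperature series:

      # Nash-Sutcliffe efficiency of simulated vs observed water temperature (values invented).
      import numpy as np

      def nse(observed, simulated):
          observed, simulated = np.asarray(observed, float), np.asarray(simulated, float)
          return 1.0 - np.sum((observed - simulated) ** 2) / np.sum((observed - observed.mean()) ** 2)

      obs = [12.1, 13.4, 15.2, 18.0, 20.3, 19.1, 16.4]       # daily mean water temperature (deg C)
      sim_new = [12.4, 13.1, 15.6, 17.5, 20.0, 19.4, 16.0]   # hypothetical new-model output
      sim_old = [13.5, 14.8, 16.9, 19.8, 22.4, 21.0, 18.1]   # hypothetical old-methodology output
      print("new methodology NSE:", round(nse(obs, sim_new), 3))
      print("old methodology NSE:", round(nse(obs, sim_old), 3))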

  18. Using simulation to study difficult clinical issues: prenatal counseling at the threshold of viability across American and Dutch cultures.

    PubMed

    Geurtzen, Rosa; Hogeveen, Marije; Rajani, Anand K; Chitkara, Ritu; Antonius, Timothy; van Heijst, Arno; Draaisma, Jos; Halamek, Louis P

    2014-06-01

    Prenatal counseling at the threshold of viability is a challenging yet critically important activity, and care guidelines differ across cultures. Studying how this task is performed in the actual clinical environment is extremely difficult. In this pilot study, we used simulation as a methodology with 2 aims as follows: first, to explore the use of simulation incorporating a standardized pregnant patient as an investigative methodology and, second, to determine similarities and differences in content and style of prenatal counseling between American and Dutch neonatologists. We compared counseling practice between 11 American and 11 Dutch neonatologists, using a simulation-based investigative methodology. All subjects performed prenatal counseling with a simulated pregnant patient carrying a fetus at the limits of viability. The following elements of scenario design were standardized across all scenarios: layout of the physical environment, details of the maternal and fetal histories, questions and responses of the standardized pregnant patient, and the time allowed for consultation. American subjects typically presented several treatment options without bias, whereas Dutch subjects were more likely to explicitly advise a specific course of treatment (emphasis on partial life support). American subjects offered comfort care more frequently than the Dutch subjects and also discussed options for maximal life support more often than their Dutch colleagues. Simulation is a useful research methodology for studying activities difficult to assess in the actual clinical environment such as prenatal counseling at the limits of viability. Dutch subjects were more directive in their approach than their American counterparts, offering fewer options for care and advocating for less invasive interventions. American subjects were more likely to offer a wider range of therapeutic options without providing a recommendation for any specific option.

  19. An integrated methodology to forecast the efficiency of nourishment strategies in eroding deltas.

    PubMed

    Bergillos, Rafael J; López-Ruiz, Alejandro; Principal-Gómez, Daniel; Ortega-Sánchez, Miguel

    2018-02-01

Many deltas across the globe are retreating, and nearby beaches are undergoing strong erosion as a result. Among soft and prompt solutions, nourishments are the most heavily used. This paper presents an integrated methodology to forecast the efficiency of nourishment strategies by means of wave climate simulations, wave propagations with downscaling techniques, computation of longshore sediment transport rates and application of the one-line model. It was applied to an eroding deltaic beach (Guadalfeo, southern Spain), where different scenarios as a function of the nourished coastline morphology, input volume and grain size were tested. For that, the evolution of six scenarios of coastline geometry over a two-year period (lifetime of nourishment projects at the study site) was modelled and the uncertainty of the predictions was also quantified through Monte Carlo techniques. For the most efficient coastline shape in terms of gained dry beach area, eight sub-scenarios with different nourished volumes were defined and modelled. The results indicate that an input volume around 460,000 m3 is the best strategy since nourished morphologies with higher volumes are more exposed to the prevailing storm directions, inducing less efficient responses. After setting the optimum coastline morphology and input sediment volume, eleven different nourished grain sizes were modelled; the most efficient coastline responses were obtained for sediment sizes greater than 0.01 m. The availability of these sizes in the sediment accumulated upstream of a dam in the Guadalfeo River basin allows for the conclusion that this alternative would not only mitigate coastal erosion problems but also sedimentation issues in the reservoir. The methodology proposed in this work is extensible to other coastal areas across the world and can be helpful to support the decision-making process of artificial nourishment projects and other environmental management strategies. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Physicians' psychophysiological stress reaction in medical communication of bad news: A critical literature review.

    PubMed

    Studer, Regina Katharina; Danuser, Brigitta; Gomez, Patrick

    2017-10-01

    Stress is a common phenomenon in medical professions. Breaking bad news (BBN) is reported to be a particularly distressing activity for physicians. Traditionally, the stress experienced by physicians when BBN was assessed exclusively using self-reporting. Only recently, the field of difficult physician-patient communication has used physiological assessments to better understand physicians' stress reactions. This paper's goals are to (a) review current knowledge about the physicians' psychophysiological stress reactions in BBN situations, (b) discuss methodological aspects of these studies and (c) suggest directions for future research. The seven studies identified all used scenarios with simulated patients but were heterogeneous with regard to other methodological aspects, such as the psychophysiological parameters, time points and durations assessed, comparative settings, and operationalisation of the communication scenarios. Despite this heterogeneity, all the papers reported increases in psychological and/or physiological activation when breaking bad news in comparison to control conditions, such as history taking or breaking good news. Taken together, the studies reviewed support the hypothesis that BBN is a psychophysiologically arousing and stressful task for medical professionals. However, much remains to be done. We suggest several future directions to advance the field. These include (a) expanding and refining the conceptual framework, (b) extending assessments to include more diverse physiological parameters, (c) exploring the modulatory effects of physicians' personal characteristics (e.g. level of experience), (d) comparing simulated and real-life physician-patient encounters and (e) combining physiological assessment with a discourse analysis of physician-patient communication. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. A methodology for modeling barrier island storm-impact scenarios

    USGS Publications Warehouse

    Mickey, Rangley C.; Long, Joseph W.; Plant, Nathaniel G.; Thompson, David M.; Dalyander, P. Soupy

    2017-02-16

    A methodology for developing a representative set of storm scenarios based on historical wave buoy and tide gauge data for a region at the Chandeleur Islands, Louisiana, was developed by the U.S. Geological Survey. The total water level was calculated for a 10-year period and analyzed against existing topographic data to identify when storm-induced wave action would affect island morphology. These events were categorized on the basis of the threshold of total water level and duration to create a set of storm scenarios that were simulated, using a high-fidelity, process-based, morphologic evolution model, on an idealized digital elevation model of the Chandeleur Islands. The simulated morphological changes resulting from these scenarios provide a range of impacts that can help coastal managers determine resiliency of proposed or existing coastal structures and identify vulnerable areas within those structures.
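
    The event definition above (total water level exceeding a morphology-relevant elevation for long enough) can be sketched as a simple threshold-and-duration scan of an hourly TWL series. The example below uses synthetic data and a crude runup placeholder; it is not the parameterisation, elevations or durations used in the USGS methodology:

      # Sketch of threshold/duration storm-event extraction from an hourly total water
      # level series (synthetic data; the runup term is a crude placeholder).
      import numpy as np

      rng = np.random.default_rng(1)
      hours = 24 * 365
      tide = 0.4 * np.sin(2 * np.pi * np.arange(hours) / 12.42)               # semidiurnal tide (m)
      hs = np.convolve(rng.gamma(2.0, 0.6, hours), np.ones(24) / 24, "same")  # smoothed wave height (m)
      runup = 1.1 * np.sqrt(hs)                                               # placeholder runup scaling
      twl = tide + runup                                                      # total water level (m)

      threshold = np.quantile(twl, 0.98)   # stand-in for a morphology-relevant elevation (e.g. dune toe)
      min_duration = 3                     # hours above threshold required to count as an event

      above = twl > threshold
      events, start = [], None
      for t, flag in enumerate(above):
          if flag and start is None:
              start = t
          elif not flag and start is not None:
              if t - start >= min_duration:
                  events.append((start, t, twl[start:t].max()))
              start = None
      if start is not None and hours - start >= min_duration:
          events.append((start, hours, twl[start:].max()))

      print(f"threshold {threshold:.2f} m, {len(events)} candidate storm events")
      for s, e, peak in events[:5]:
          print(f"  hours {s}-{e} ({e - s} h), peak TWL {peak:.2f} m")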

  2. Sustainability Assessment of Future Scenarios: Methodology and Application to Mountain Areas of Europe

    NASA Astrophysics Data System (ADS)

    Sheate, William R.; Partidário, Maria Rosário Do; Byron, Helen; Bina, Olivia; Dagg, Suzan

    2008-02-01

BioScene (scenarios for reconciling biodiversity conservation with declining agricultural use in mountain areas in Europe) was a three-year project (2002-2005) funded by the European Union’s Fifth Framework Programme, and aimed to investigate the implications of agricultural restructuring and decline for biodiversity conservation in the mountain areas of Europe. The research took a case study approach to the analysis of the biodiversity processes and outcomes of different scenarios of agri-environmental change in six countries (France, Greece, Norway, Slovakia, Switzerland, and the United Kingdom) covering the major biogeographical regions of Europe. The project was coordinated by Imperial College London, and each study area had a multidisciplinary team including ecologists and social and economic experts, which sought a comprehensive understanding of the drivers for change and their implications for sustainability. A key component was the sustainability assessment (SA) of the alternative scenarios. This article discusses the development and application of the SA methodology developed for BioScene. While the methodology was objectives-led, it was also strongly grounded in baseline ecological and socio-economic data. This article also describes the engagement of stakeholder panels in each study area and the use of causal chain analysis for understanding the likely implications for land use and biodiversity of strategic drivers of change under alternative scenarios for agriculture and rural policy and for biodiversity management. Finally, this article draws conclusions for the application of SA more widely, its use with scenarios, and the benefits of stakeholder engagement in the SA process.

  3. Scenario driven data modelling: a method for integrating diverse sources of data and data streams

    DOEpatents

    Brettin, Thomas S.; Cottingham, Robert W.; Griffith, Shelton D.; Quest, Daniel J.

    2015-09-08

    A system and method of integrating diverse sources of data and data streams is presented. The method can include selecting a scenario based on a topic, creating a multi-relational directed graph based on the scenario, identifying and converting resources in accordance with the scenario and updating the multi-directed graph based on the resources, identifying data feeds in accordance with the scenario and updating the multi-directed graph based on the data feeds, identifying analytical routines in accordance with the scenario and updating the multi-directed graph using the analytical routines and identifying data outputs in accordance with the scenario and defining queries to produce the data outputs from the multi-directed graph.
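
    The claim language above describes building and repeatedly updating a multi-relational directed graph for a scenario. A toy sketch of that data structure using networkx (the entities, relations and query are invented; this is not the patented implementation):

      # Toy illustration (not the patented system): represent a scenario as a
      # multi-relational directed graph and attach resources, feeds and routines.
      import networkx as nx

      G = nx.MultiDiGraph(scenario="pathogen outbreak monitoring")   # hypothetical topic

      # Resources and data feeds identified for the scenario
      G.add_edge("scenario", "genome database", relation="uses_resource")
      G.add_edge("scenario", "hospital admissions feed", relation="ingests_feed")
      G.add_edge("scenario", "news stream", relation="ingests_feed")

      # Analytical routines and the outputs they produce
      G.add_edge("sequence clustering", "genome database", relation="reads")
      G.add_edge("sequence clustering", "variant report", relation="produces")
      G.add_edge("anomaly detection", "hospital admissions feed", relation="reads")
      G.add_edge("anomaly detection", "outbreak alert", relation="produces")

      # A simple "query" over the graph: which outputs can be produced, and from which inputs?
      for routine, output, data in G.edges(data=True):
          if data["relation"] == "produces":
              inputs = [v for _, v, d in G.out_edges(routine, data=True) if d["relation"] == "reads"]
              print(f"{output!r} <- {routine!r} <- {inputs}")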

  4. Protocol Architecture Model Report

    NASA Technical Reports Server (NTRS)

    Dhas, Chris

    2000-01-01

NASA's Glenn Research Center (GRC) defines and develops advanced technology for high priority national needs in communications technologies for application to aeronautics and space. GRC tasked Computer Networks and Software Inc. (CNS) to examine protocols and architectures for an In-Space Internet Node. CNS has developed a methodology for network reference models to support NASA's four mission areas: Earth Science, Space Science, Human Exploration and Development of Space (HEDS), Aerospace Technology. This report applies the methodology to three space Internet-based communications scenarios for future missions. CNS has conceptualized, designed, and developed space Internet-based communications protocols and architectures for each of the independent scenarios. The scenarios are: Scenario 1: Unicast communications between a Low-Earth-Orbit (LEO) spacecraft in-space Internet node and a ground terminal Internet node via a Tracking and Data Relay Satellite (TDRS) transfer; Scenario 2: Unicast communications between a Low-Earth-Orbit (LEO) International Space Station and a ground terminal Internet node via a TDRS transfer; Scenario 3: Multicast Communications (or "Multicasting"), 1 Spacecraft to N Ground Receivers, N Ground Transmitters to 1 Ground Receiver via a Spacecraft.

  5. Explosion/Blast Dynamics for Constellation Launch Vehicles Assessment

    NASA Technical Reports Server (NTRS)

    Baer, Mel; Crawford, Dave; Hickox, Charles; Kipp, Marlin; Hertel, Gene; Morgan, Hal; Ratzel, Arthur; Cragg, Clinton H.

    2009-01-01

    An assessment methodology is developed to guide quantitative predictions of adverse physical environments and the subsequent effects on the Ares-1 crew launch vehicle associated with the loss of containment of cryogenic liquid propellants from the upper stage during ascent. Development of the methodology is led by a team at Sandia National Laboratories (SNL) with guidance and support from a number of National Aeronautics and Space Administration (NASA) personnel. The methodology is based on the current Ares-1 design and feasible accident scenarios. These scenarios address containment failure from debris impact or structural response to pressure or blast loading from an external source. Once containment is breached, the envisioned assessment methodology includes predictions for the sequence of physical processes stemming from cryogenic tank failure. The investigative techniques, analysis paths, and numerical simulations that comprise the proposed methodology are summarized and appropriate simulation software is identified in this report.

  6. MEGASTAR: The Meaning of Energy Growth: An Assessment of Systems, Technologies, and Requirements

    NASA Technical Reports Server (NTRS)

    1974-01-01

A methodology for the display and analysis of postulated energy futures for the United States is presented. A systems approach that includes the methodology of technology assessment is used to examine three energy scenarios: the Westinghouse Nuclear Electric Economy, the Ford Technical Fix Base Case and a MEGASTAR-generated Alternate to the Ford Technical Fix Base Case. The three scenarios represent different paths of energy consumption from the present to the year 2000. Associated with these paths are various mixes of fuels, conversion, distribution, conservation and end-use technologies. MEGASTAR presents the estimated times and unit requirements to supply the fuels, conversion and distribution systems for the postulated end uses for the three scenarios and then estimates the aggregate manpower, materials, and capital requirements needed to develop the energy system described by the particular scenario. The total requirements and the energy subsystems for each scenario are assessed for their primary impacts in the areas of society, the environment, technology and the economy.

  7. Developing scenarios to assess future landslide risks: a model-based approach applied to mountainous regions

    NASA Astrophysics Data System (ADS)

    Vacquie, Laure; Houet, Thomas

    2016-04-01

In the last century, European mountain landscapes have experienced significant transformations. Natural and anthropogenic changes, climate changes, touristic and industrial development, socio-economic interactions, and their implications in terms of LUCC (land use and land cover changes) have directly influenced the spatial organization and vulnerability of mountain landscapes. This study is conducted as part of the SAMCO project funded by the French National Research Agency (ANR). It aims at developing a methodological approach, combining various tools, modelling platforms and methods, to identify regions vulnerable to landslide hazards, accounting for future LUCC. It presents an integrated approach combining participative scenarios and LULC change simulation models to assess the combined effects of LUCC and climate change on landslide risks in the Cauterets valley (French Pyrenees Mountains) up to 2100. Through vulnerability and risk mapping, the objective is to gather information to support landscape planning and implement land use strategies with local stakeholders for risk management. Four contrasting scenarios are developed and exhibit contrasting trajectories of socio-economic development. Prospective scenarios are based on national and international socio-economic contexts relying on existing assessment reports. The methodological approach integrates knowledge from local stakeholders to refine each scenario during its construction and to reinforce its plausibility and relevance by accounting for local specificities, e.g. logging and pastoral activities, touristic development, urban planning, etc. A process-based model, the Forecasting Scenarios for Mountains (ForeSceM) model, developed on the Dinamica Ego modelling platform, is used to spatially allocate future LUCC for each prospective scenario. Concurrently, a spatial decision support tool, i.e. the SYLVACCESS model, is used to identify accessible areas for forestry in the scenario projecting logging activities. The method results in the development of LULC maps providing insights into a range of alternative futures under a range of socio-economic and environmental conditions. A landslide assessment model, the ALICE model, is then used as a final tool to analyze the potential impacts of simulated LUCC on landslide risks and the consequences in terms of vulnerability, e.g. changes in disaster risk allocation or characterization, degree of perturbation. This assessment intends to provide insights into the potential future development of the valley to help identify areas at stake and to guide decision makers in risk management. Preliminary results show strong differences between future land use and land cover maps, which have a significant influence on landslide hazards.

  8. A novel procedure for generating solar irradiance TSYs

    NASA Astrophysics Data System (ADS)

    Fanego, Vicente Lara; Rubio, Jesús Pulgar; Peruchena, Carlos M. Fernández; Romeo, Martín Gastón; Tejera, Sara Moreno; Santigosa, Lourdes Ramírez; Balderrama, Rita X. Valenzuela; Tirado, Luis F. Zarzalejo; Pantaleón, Diego Bermejo; Pérez, Manuel Silva; Contreras, Manuel Pavón; García, Ana Bernardos; Anarte, Sergio Macías

    2017-06-01

    Typical Solar Years (TSYs) are key parameters for the solar energy industry. In particular, TSYs are mainly used for the design and bankability analysis of solar projects. In essence, a TSY intends to describe the expected long-term behavior of the solar resource (direct and/or global irradiance) into a condensed period of one year at the specific location of interest. A TSY differs from a conventional Typical Meteorological Year (TMY) by its absence of meteorological variables other than solar radiation. Concerning the probability of exceedance (Pe) needed for bankability, various scenarios are commonly used, with Pe90, Pe95 or even Pe99 being most usually required as unfavorable scenarios, along with the most widely used median scenario (Pe50). There is no consensus in the scientific community regarding the methodology for generating TSYs for any Pe scenario. Furthermore, the application of two different construction methods to the same original dataset could produce differing TSYs. Within this framework, a group of experts has been established by the Spanish Association for Standardization and Certification (AENOR) in order to propose a method that can be standardized. The method developed by this working group, referred to as the EVA method, is presented in this contribution. Its evaluation shows that it provides reasonable results for the two main irradiance components (direct and global), with low errors in the annual estimations for any given Pe. The EVA method also preserves the long-term statistics when the computed TSYs for a specific Pe are expanded from the monthly basis used in the generation of the TSY to higher time resolutions, such as 1 hour, which are necessary for the precise energy simulation of solar systems.
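
    A PeXX irradiation value is one that is exceeded in XX% of years, i.e. the (100 - XX)th percentile of the annual totals. The sketch below derives Pe50/Pe90/Pe99 annual DNI values from an invented multi-year record; this is only the exceedance step that precedes TSY construction, not the EVA method itself:

      # Sketch: annual probability-of-exceedance (Pe) values from a multi-year DNI record.
      # Annual totals are synthetic; this is only the exceedance step, not the EVA method.
      import numpy as np

      annual_dni = np.array([2210, 2145, 2320, 2280, 2190, 2075, 2255, 2160,
                             2300, 2235, 2120, 2265, 2185, 2240, 2090])   # kWh/m2/yr, invented

      for pe in (50, 90, 99):
          # A PeXX value is exceeded in XX% of years, i.e. it is the (100 - XX)th percentile
          value = np.percentile(annual_dni, 100 - pe)
          print(f"Pe{pe}: {value:.0f} kWh/m2/yr")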

  9. JEDI Methodology | Jobs and Economic Development Impact Models | NREL

    Science.gov Websites

JEDI Methodology: The intent of the Jobs and Economic Development Impact (JEDI) models ... to demonstrate the employment and economic impacts that will likely result ... the estimate of overall economic impacts from specific scenarios. Please see Limitations of JEDI Models for ...

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Philip R.; Athalye, Rahul A.; Xie, YuLong

    Moving to the ASHRAE Standard 90.1-2013 (ASHRAE 2013) edition from Standard 90.1-2010 (ASHRAE 2010) is cost-effective for the State of Arizona. The table below shows the state-wide economic impact of upgrading to Standard 90.1-2013 in terms of the annual energy cost savings in dollars per square foot, additional construction cost per square foot required by the upgrade, and life-cycle cost (LCC) per square foot. These results are weighted averages for all building types in all climate zones in the state, based on weightings shown in Table 4. The methodology used for this analysis is consistent with the methodology used in the national cost-effectiveness analysis. Additional results and details on the methodology are presented in the following sections. The report provides analysis of two LCC scenarios: Scenario 1, representing publicly-owned buildings, considers initial costs, energy costs, maintenance costs, and replacement costs—without borrowing or taxes. Scenario 2, representing privately-owned buildings, adds borrowing costs and tax impacts.
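    As a rough illustration of the two LCC scenarios described in these reports, the sketch below contrasts a simple present-value LCC without borrowing or taxes (scenario 1 style) with one that crudely adds a borrowing markup and a tax adjustment (scenario 2 style). All numbers are placeholders and the formulas are simplified; this is not the national cost-effectiveness methodology.

        # Simplified life-cycle cost comparison per square foot, mirroring the two report
        # scenarios in spirit only; all values below are placeholders, not report results.

        def lcc_public(added_first_cost, annual_savings, years=30, discount=0.03):
            """Scenario 1 style: first cost minus discounted energy savings (no loans/taxes)."""
            pv_savings = sum(annual_savings / (1 + discount) ** t for t in range(1, years + 1))
            return added_first_cost - pv_savings

        def lcc_private(added_first_cost, annual_savings, years=30, discount=0.05,
                        loan_rate=0.06, tax_rate=0.25):
            """Scenario 2 style: crude adjustment for borrowing cost and tax effects."""
            financed_cost = added_first_cost * (1 + loan_rate)    # rough borrowing markup
            after_tax_savings = annual_savings * (1 - tax_rate)   # tax reduces net savings
            pv_savings = sum(after_tax_savings / (1 + discount) ** t for t in range(1, years + 1))
            return financed_cost - pv_savings

        print(lcc_public(0.80, 0.10))   # negative => upgrade is cost-effective
        print(lcc_private(0.80, 0.10))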

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Philip R.; Athalye, Rahul A.; Xie, YuLong

    Moving to the ASHRAE Standard 90.1-2013 (ASHRAE 2013) edition from Standard 90.1-2010 (ASHRAE 2010) is cost-effective for the State of Hawaii. The table below shows the state-wide economic impact of upgrading to Standard 90.1-2013 in terms of the annual energy cost savings in dollars per square foot, additional construction cost per square foot required by the upgrade, and life-cycle cost (LCC) per square foot. These results are weighted averages for all building types in all climate zones in the state, based on weightings shown in Table 4. The methodology used for this analysis is consistent with the methodology used in the national cost-effectiveness analysis. Additional results and details on the methodology are presented in the following sections. The report provides analysis of two LCC scenarios: Scenario 1, representing publicly-owned buildings, considers initial costs, energy costs, maintenance costs, and replacement costs—without borrowing or taxes. Scenario 2, representing privately-owned buildings, adds borrowing costs and tax impacts.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Philip R.; Athalye, Rahul A.; Xie, YuLong

    Moving to the ASHRAE Standard 90.1-2013 (ASHRAE 2013) edition from Standard 90.1-2010 (ASHRAE 2010) is cost-effective for the State of Connecticut. The table below shows the state-wide economic impact of upgrading to Standard 90.1-2013 in terms of the annual energy cost savings in dollars per square foot, additional construction cost per square foot required by the upgrade, and life-cycle cost (LCC) per square foot. These results are weighted averages for all building types in all climate zones in the state, based on weightings shown in Table 4. The methodology used for this analysis is consistent with the methodology used in the national cost-effectiveness analysis. Additional results and details on the methodology are presented in the following sections. The report provides analysis of two LCC scenarios: Scenario 1, representing publicly-owned buildings, considers initial costs, energy costs, maintenance costs, and replacement costs—without borrowing or taxes. Scenario 2, representing privately-owned buildings, adds borrowing costs and tax impacts.

  13. Streamflow Prediction in Ungauged, Irrigated Basins

    NASA Astrophysics Data System (ADS)

    Zhang, M.; Thompson, S. E.

    2016-12-01

    The international "predictions in ungauged basins" or "PUB" effort has broadened and improved the tools available to support water resources management in sparsely observed regions. These tools have, however, been primarily focused on regions with limited diversion of surface or shallow groundwater resources. Incorporating anthropogenic activity into PUB methods is essential given the high level of development of many basins. We extended an existing stochastic framework used to predict the flow duration curve to explore the effects of irrigation on streamflow dynamics. Four canonical scenarios were considered in which irrigation water was (i) primarily sourced from water imports, (ii) primarily sourced from direct in-channel diversions, (iii) sourced from shallow groundwater with direct connectivity to stream channels, or (iv) sourced from deep groundwater that is indirectly connected to surface flow via a shallow aquifer. By comparing the predicted flow duration curves to those predicted by accounting for climate and geomorphic factors in isolation, specific "fingerprints" of human water withdrawals could be identified for the different irrigation scenarios, and shown to be sensitive to irrigation volumes and scheduling. The results provide a first insight into PUB methodologies that could be employed in heavily managed basins.
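    The idea of an irrigation "fingerprint" on the flow duration curve can be sketched with synthetic data: compare the empirical flow duration curve of a daily streamflow series with and without a constant in-channel diversion. The gamma-distributed flows and the diversion value below are invented; this is not the authors' stochastic framework.

        # Minimal sketch of an irrigation "fingerprint" on the flow duration curve (FDC):
        # compare the empirical FDC of a synthetic streamflow series with and without a
        # constant in-channel diversion. Purely illustrative.
        import numpy as np

        rng = np.random.default_rng(0)
        natural_q = rng.gamma(shape=2.0, scale=5.0, size=3650)   # daily flow, m3/s (synthetic)
        diversion = 3.0                                           # constant irrigation withdrawal
        managed_q = np.clip(natural_q - diversion, 0.0, None)

        def fdc(q):
            """Return (exceedance probability, sorted flows) for an empirical FDC."""
            q_sorted = np.sort(q)[::-1]
            p = np.arange(1, len(q_sorted) + 1) / (len(q_sorted) + 1)
            return p, q_sorted

        for label, q in (("natural", natural_q), ("diverted", managed_q)):
            p, qs = fdc(q)
            q50 = np.interp(0.5, p, qs)   # median-exceedance flow
            print(f"{label}: Q50 = {q50:.1f} m3/s, zero-flow fraction = {(q == 0).mean():.2f}")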

  14. FINDING A METHOD FOR THE MADNESS: A COMPARATIVE ANALYSIS OF STRATEGIC DESIGN METHODOLOGIES

    DTIC Science & Technology

    2017-06-01

    This work develops a comparative model for strategic design methodologies, focusing on the primary elements of vision, time, process, communication and collaboration, and risk assessment. The analysis dissects and compares three potential design methodologies, including net assessment, scenarios, and ...

  15. The use of scenarios for long-range planning by investor-owned electric utilities in the Pacific Northwest

    NASA Astrophysics Data System (ADS)

    Lyons, John V.

    Scenario planning is a method of organizing and understanding large amounts of quantitative and qualitative data so that leaders can make better strategic decisions. There is a lack of academic research on scenario planning and a consequent shortage of definitions and theories. This study utilized a case study methodology to analyze scenario planning by investor-owned electric utilities in the Pacific Northwest in their integrated resource planning (IRP) process. The cases include Avista Corporation, Idaho Power, PacifiCorp, Portland General Electric, and Puget Sound Energy. This study sought to determine how scenario planning was used, what scenario approach was used, the scenario outcomes, and the similarities and differences in the scenario planning processes. The literature review of this study covered the development of scenario planning, common definitions and theories, approaches to scenario development, and scenario outcomes. A research methodology was developed to classify the scenario development approach into intuitive, hybrid, or quantitative approaches, and the scenario outcomes into changed thinking, stories of plausible futures, improved decision making, and enhanced organizational learning. The study found all three scenario development approaches in the IRPs. All of the cases used a similar approach to IRP development. All of the cases had at least improved decision making as an outcome of scenario planning. Only one case demonstrated all four scenario outcomes. A critical finding was a correlation between the use of the intuitive approach and the presence of all four scenario outcomes. Another major finding was the unusual treatment of predetermined elements, which are normally consistent across scenarios but became critical uncertainties in some of the scenarios in the cases of this study. Future research will need to confirm whether this finding is unique to the industry or an aberration. An unusually high number of scenarios were found for cases using the hybrid approach, which was unexpected based on the literature. This work expanded the methods for studying scenario planning, enhanced the body of scholarly works on scenario planning, and provided a starting point for additional research concerning the use of scenario planning by electric utilities.

  16. The development of multi-objective optimization model for excess bagasse utilization: A case study for Thailand

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Buddadee, Bancha; Wirojanagud, Wanpen; Watts, Daniel J.

    In this paper, a multi-objective optimization model is proposed as a tool to assist in deciding on the proper utilization scheme for excess bagasse produced by the sugarcane industry. Two major scenarios for excess bagasse utilization are considered in the optimization. The first scenario is the typical situation in which excess bagasse is used for onsite electricity production. In the second scenario, excess bagasse is processed for offsite ethanol production. The ethanol is then blended with 91-octane gasoline at 10% and 90% by volume, respectively, and the mixture is used as an alternative fuel for gasoline vehicles in Thailand. The model proposed in this paper, called "Environmental System Optimization", comprises the life cycle impact assessment of global warming potential (GWP) and the associated cost, followed by the multi-objective optimization, which facilitates finding the optimal proportion of excess bagasse processed in each scenario. Basic mathematical expressions for the GWP and cost of the entire process of excess bagasse utilization are taken into account in the model formulation and optimization. The outcome of this study is a methodology for decision-making concerning the utilization of excess bagasse available in Thailand in view of GWP and economic effects. A demonstration example is presented to illustrate the advantage of the methodology, which may be used by policy makers. The methodology developed successfully satisfies both environmental and economic objectives over the whole life cycle of the system. It is shown in the demonstration example that the first scenario results in positive GWP while the second scenario results in negative GWP. A combination of the two scenarios results in positive or negative GWP depending on the weighting given to each objective. The economic results for all scenarios are satisfactory.
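    The weighted-sum structure of such a two-objective problem can be sketched as follows: for an allocation fraction x of excess bagasse sent to onsite electricity (scenario 1) versus offsite ethanol (scenario 2), a weighted combination of normalized GWP and cost is minimized. The GWP and cost coefficients below are invented for illustration and do not reproduce the paper's values.

        # Illustrative weighted-sum optimization over the fraction x of excess bagasse sent to
        # onsite electricity (scenario 1) vs offsite ethanol (scenario 2). All coefficients
        # are invented; the paper's values are not reproduced here.
        import numpy as np

        GWP_ELEC, GWP_ETOH = 40.0, -15.0      # kg CO2-eq per tonne bagasse (hypothetical)
        COST_ELEC, COST_ETOH = -8.0, 5.0      # USD per tonne (negative = net revenue), hypothetical

        def objective(x, w_env):
            """Weighted sum of normalized GWP and cost for allocation fraction x to electricity."""
            gwp = x * GWP_ELEC + (1 - x) * GWP_ETOH
            cost = x * COST_ELEC + (1 - x) * COST_ETOH
            return w_env * gwp / 40.0 + (1 - w_env) * cost / 8.0   # crude normalization

        for w_env in (0.2, 0.5, 0.8):
            xs = np.linspace(0, 1, 101)
            best = xs[np.argmin([objective(x, w_env) for x in xs])]
            print(f"weight on GWP = {w_env:.1f} -> optimal electricity share = {best:.2f}")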

  17. Technical, hygiene, economic, and life cycle assessment of full-scale moving bed biofilm reactors for wastewater treatment in India.

    PubMed

    Singh, Anju; Kamble, Sheetal Jaisingh; Sawant, Megha; Chakravarthy, Yogita; Kazmi, Absar; Aymerich, Enrique; Starkl, Markus; Ghangrekar, Makarand; Philip, Ligy

    2018-01-01

    Moving bed biofilm reactor (MBBR) is a highly effective biological treatment process applied to treat both urban and industrial wastewaters in developing countries. The present study investigated the technical performance of ten full-scale MBBR systems located across India. The biochemical oxygen demand, chemical oxygen demand, total suspended solids, pathogen, and nutrient removal efficiencies were low compared to the values reported in the literature. Plant 1 was considered for evaluation of environmental impacts using the life cycle assessment approach. The CML 2 baseline 2000 methodology was adopted, in which 11 impact categories were considered. The life cycle impact assessment results revealed that the main environmental hot spot of this system was energy consumption. Additionally, two scenarios were compared: scenario 1 (direct discharge of treated effluent, i.e., no reuse) and scenario 2 (effluent reuse and tap water replacement). The results showed that scenario 2 significantly reduces the environmental impact in all categories, ultimately decreasing the environmental burden. Moreover, significant economic and environmental benefits can be obtained in scenario 2 by replacing the freshwater demand for non-potable uses. To enhance the performance of the wastewater treatment plant (WWTP), there is a need to optimize energy consumption and increase wastewater collection efficiency so as to maximize the operating capacity of the plant and minimize the overall environmental footprint. It was concluded that MBBR can be a good alternative for upgrading and optimizing existing municipal wastewater treatment plants with appropriate tertiary treatment.

  18. Probabilistic-Based Modeling and Simulation Assessment

    DTIC Science & Technology

    2010-06-01

    ... developed to determine the relative importance of structural components of the vehicle under different crash and blast scenarios. With the integration of the high-fidelity neck and head model, a methodology to calculate the ... parameter variability, correlation, and multiple (often competing) failure metrics. Important scenarios include vehicular collisions and blast/fragment ...

  19. Non-contact versus contact-based sensing methodologies for in-home upper arm robotic rehabilitation.

    PubMed

    Howard, Ayanna; Brooks, Douglas; Brown, Edward; Gebregiorgis, Adey; Chen, Yu-Ping

    2013-06-01

    In recent years, robot-assisted rehabilitation has gained momentum as a viable means for improving outcomes of therapeutic interventions. Such therapy experiences allow controlled and repeatable trials and quantitative evaluation of mobility metrics. Typically, though, these robotic devices have been focused on rehabilitation within a clinical setting. In these traditional robot-assisted rehabilitation studies, participants are required to perform goal-directed movements with the robot during a therapy session. This requires physical contact between the participant and the robot to enable precise control of the task, as well as a means to collect relevant performance data. On the other hand, non-contact means of robot interaction can provide a safe methodology for extracting the control data needed for in-home rehabilitation. As such, in this paper we discuss contact-based and non-contact-based methods for upper-arm rehabilitation exercises that enable quantification of upper-arm movements. We evaluate our methodology on upper-arm abduction/adduction movements and discuss the advantages and limitations of each approach as applied to an in-home rehabilitation scenario.

  20. Adaptation to hydrological extremes through insurance: a financial fund simulation model under changing scenarios

    NASA Astrophysics Data System (ADS)

    Guzman, Diego; Mohor, Guilherme; Câmara, Clarissa; Mendiondo, Eduardo

    2017-04-01

    Research from around the world relates global environmental change to increased vulnerability to extreme events such as excess and scarce precipitation, i.e. floods and droughts. Hydrological disasters have caused increasing losses in recent years. Thus, risk transfer mechanisms, such as insurance, are being implemented to mitigate impacts, finance the recovery of the affected population, and promote the reduction of hydrological risks. However, the main problems in implementing these strategies are: first, partial knowledge of natural and anthropogenic climate change in terms of intensity and frequency; second, efficient risk reduction policies require accurate risk assessment, with careful consideration of costs; and third, the uncertainty associated with the numerical models and input data used. The objective of this document is to introduce and discuss the feasibility of applying Hydrological Risk Transfer Models (HRTMs) as a strategy for adaptation to global climate change. The article presents the development of a methodology for collective, multi-sectoral management of long-term hydrological vulnerability based on an insurance fund simulator. The methodology estimates the optimized premium as a function of willingness to pay (WTP) and the potential direct loss derived from hydrological risk. The proposed methodology structures the watershed insurance scheme in three analysis modules. First, the hazard module characterizes the hydrological threat from recorded series or from series modelled under IPCC/RCM-generated scenarios. Second, the vulnerability module calculates the potential economic loss for each sector evaluated, as a function of the return period (TR). Finally, the finance module determines the value of the optimal aggregate premium by evaluating equiprobable scenarios of water vulnerability, taking into account variables such as the maximum coverage limit, the deductible, reinsurance schemes, and incentives for risk reduction. The methodology, tested by members of the Integrated Nucleus of River Basins (NIBH) at the University of Sao Paulo (USP) School of Engineering of São Carlos (EESC), Brazil, offers an alternative for the analysis and planning of insurance funds aimed at mitigating the impacts of hydrological droughts and flash floods. The procedure is especially valuable when the information needed to develop and implement insurance funds is difficult to access and complex to evaluate. A series of academic applications has been carried out in Brazil, in a South American context where hydrological insurance has low market penetration compared with more established markets such as the United States and Europe, producing relevant information and demonstrating the potential of the methodology under development.
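    The premium-setting idea behind such an insurance fund can be sketched generically: simulate annual hydrological losses, apply a deductible and a coverage limit, and take the expected insured loss plus a loading factor as an indicative premium. The loss distribution, deductible, limit, and loading below are hypothetical; this is not the NIBH simulator.

        # Generic actuarial sketch of the premium idea: simulate annual hydrological losses,
        # apply a deductible and a coverage limit, and price the premium as the expected
        # insured loss plus a loading factor. All values are synthetic.
        import numpy as np

        rng = np.random.default_rng(42)
        annual_loss = rng.lognormal(mean=10.0, sigma=1.2, size=100_000)  # monetary units, synthetic

        deductible = 5_000.0
        coverage_limit = 500_000.0
        loading = 0.25   # administrative / risk loading on top of the expected loss

        insured_loss = np.clip(annual_loss - deductible, 0.0, coverage_limit)
        premium = (1 + loading) * insured_loss.mean()
        print(f"expected insured loss: {insured_loss.mean():,.0f}")
        print(f"indicative premium:    {premium:,.0f}")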

  1. Risk Mapping Case Study: Industrial Area Of Trinec Town (Czech Republic) potentially endangered by floods and landslides

    NASA Astrophysics Data System (ADS)

    Dobes, P.; Hrdina, P.; Kotatko, A.; Danihelka, P.; Bednarik, M.; Krejci, O.; Kasperakova, D.

    2009-04-01

    One of the current questions in natural and technological risk mapping, which has become important in recent years, is the analysis and assessment of selected types of multi-risk. This is evident from relevant R&D projects as well as from international workshops and conferences. Various surveys and reported activities show that plenty of data and methodological approaches exist for single risk categories, but there is a lack of tested methodological approaches for multi-risk. Within the framework of the working group, a literature search on multi-risk assessment methodologies and innovations was carried out. The idea of this relatively small, local-scale case study arose during the 3rd Risk Mapping Workshop, coordinated by EC DG JRC, IPSC, in November 2007. The proposal was based on a previous risk analysis and assessment project carried out for the Frydek-Mistek County area (Czech Republic) in 2002. Several industrial facilities in Trinec are partly situated in the inundation area of the river Olše and are partly protected by concrete barriers built on its banks. It has to be mentioned that these banks are unstable and in permanent slow movement. If the reinforced-concrete barriers were overtopped by water as the result of a sudden bank landslide or a flood wave, this could trigger several industrial accidents at steel and energy production facilities. The area is highly developed from a demographic and socioeconomic point of view, and the selected area has been extensively investigated geologically, engineering-geologically, and hydrogeologically. The most important accident scenarios for the area were developed by What-If analysis and black-box analysis (qualitative analysis generating several different scenarios). In the following years, additional QRA analyses of industrial risks were carried out separately, driven by District Office, public, and Seveso II Directive requirements. General scenarios of multi-hazard events were considered. In the case study, three methodologies were applied to assess hazard and risk: a qualitative approach based on the German risk-matrix compilation methodology; a quantitative approach based on statistical methods previously used for the area between the towns of Hlohovec and Sered in Slovakia; and a quantitative approach for flood modelling on the river Olše based on the HEC-RAS model. Expert assessment was also used to evaluate the impacts of selected scenarios on the facilities and the public, including an evaluation of the existing barriers. Based on the preliminary results, flooding of the industrial facilities appears less probable due to the existing barriers, but several useful recommendations for similarly prone areas could be derived. Acknowledgements: This work is partially supported by the Czech Ministry of the Environment in the frame of the R&D project "Comprehensive Interactions between Natural Processes and Industry with Regard to Major Accident Prevention and Emergency Planning" (Registration Number: SPII 1a10 45/07).
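    The qualitative risk-matrix approach mentioned above can be illustrated with a simple lookup of likelihood and consequence classes. The classes, matrix entries, and example scenarios below are invented for illustration and are not taken from the German methodology or from the case study.

        # Illustrative qualitative risk matrix (likelihood class x consequence class -> risk class),
        # in the spirit of the risk-matrix approach mentioned above; classes and entries are invented.

        RISK_MATRIX = {
            # (likelihood, consequence): risk class
            ("rare", "minor"): "low",        ("rare", "major"): "medium",
            ("rare", "severe"): "medium",
            ("possible", "minor"): "medium", ("possible", "major"): "high",
            ("possible", "severe"): "high",
            ("likely", "minor"): "medium",   ("likely", "major"): "high",
            ("likely", "severe"): "very high",
        }

        scenarios = [
            ("flood overtops barrier", "rare", "severe"),
            ("bank landslide displaces barrier", "possible", "severe"),
            ("minor flooding of access roads", "likely", "minor"),
        ]

        for name, likelihood, consequence in scenarios:
            print(f"{name}: {RISK_MATRIX[(likelihood, consequence)]}")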

  2. Evaluating methods to establish habitat suitability criteria: A case study in the upper Delaware River Basin, USA

    USGS Publications Warehouse

    Galbraith, Heather S.; Blakeslee, Carrie J.; Cole, Jeffrey C.; Talbert, Colin; Maloney, Kelly O.

    2016-01-01

    Defining habitat suitability criteria (HSC) of aquatic biota can be a key component to environmental flow science. HSC can be developed through numerous methods; however, few studies have evaluated the consistency of HSC developed by different methodologies. We directly compared HSC for depth and velocity developed by the Delphi method (expert opinion) and by two primary literature meta-analyses (literature-derived range and interquartile range) to assess whether these independent methods produce analogous criteria for multiple species (rainbow trout, brown trout, American shad, and shallow fast guild) and life stages. We further evaluated how these two independently developed HSC affect calculations of habitat availability under three alternative reservoir management scenarios in the upper Delaware River at a mesohabitat (main channel, stream margins, and flood plain), reach, and basin scale. In general, literature-derived HSC fell within the range of the Delphi HSC, with highest congruence for velocity habitat. Habitat area predicted using the Delphi HSC fell between the habitat area predicted using two literature-derived HSC, both at the basin and the site scale. Predicted habitat increased in shallow regions (stream margins and flood plain) using literature-derived HSC while Delphi-derived HSC predicted increased channel habitat. HSC generally favoured the same reservoir management scenario; however, no favoured reservoir management scenario was the most common outcome when applying the literature range HSC. The differences found in this study lend insight into how different methodologies can shape HSC and their consequences for predicted habitat and water management decisions. Published 2016. This article is a U.S. Government work and is in the public domain in the USA.
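    The comparison of literature-derived and expert-derived habitat suitability criteria can be sketched as follows: build an interquartile-range criterion from hypothetical literature depth values, compare it against a hypothetical expert (Delphi-style) range, and report their overlap. None of the numbers below come from the study.

        # Hedged sketch: two habitat suitability criteria (HSC) for depth, one from a
        # literature interquartile range and one from an expert-defined range, plus their
        # overlap. Values are invented; this is not the study's data.
        import numpy as np

        literature_depths = np.array([0.3, 0.4, 0.5, 0.5, 0.6, 0.7, 0.8, 1.0])  # m, hypothetical
        q25, q75 = np.percentile(literature_depths, [25, 75])
        literature_hsc = (q25, q75)        # literature interquartile-range HSC
        delphi_hsc = (0.35, 0.9)           # expert-opinion (Delphi-style) range, hypothetical

        overlap = (max(literature_hsc[0], delphi_hsc[0]), min(literature_hsc[1], delphi_hsc[1]))
        print("literature IQR HSC:", literature_hsc)
        print("Delphi HSC:        ", delphi_hsc)
        print("overlap:           ", overlap if overlap[0] < overlap[1] else None)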

  3. Benefit-Cost Analysis of Integrated Paratransit Systems : Volume 6. Technical Appendices.

    DOT National Transportation Integrated Search

    1979-09-01

    This last volume includes five technical appendices which document the methodologies used in the benefit-cost analysis. They are the following: Scenario analysis methodology; Impact estimation; Example of impact estimation; Sensitivity analysis; Agg...

  4. Computer-aided testing of pilot response to critical in-flight events

    NASA Technical Reports Server (NTRS)

    Giffin, W. C.; Rockwell, T. H.

    1984-01-01

    This research on pilot response to critical in-flight events employs a unique methodology including an interactive computer-aided scenario-testing system. Navigation displays, instrument-panel displays, and assorted textual material are presented on a touch-sensitive CRT screen. Problem diagnosis scenarios, destination-diversion scenarios and combined destination/diagnostic tests are available. A complete time history of all data inquiries and responses is maintained. Sample results of diagnosis scenarios obtained from testing 38 licensed pilots are presented and discussed.

  5. Recurrence quantification analysis of extremes of maximum and minimum temperature patterns for different climate scenarios in the Mesochora catchment in Central-Western Greece

    NASA Astrophysics Data System (ADS)

    Panagoulia, Dionysia; Vlahogianni, Eleni I.

    2018-06-01

    A methodological framework based on nonlinear recurrence analysis is proposed to examine the evolution of extremes of maximum and minimum daily mean areal temperature patterns over time under different climate scenarios. The methodology is based on both historical data and climate scenarios produced by an atmospheric General Circulation Model (GCM) for the periods 1961-2000 and 2061-2100, which correspond to the 1 × CO2 and 2 × CO2 scenarios. Historical data were derived from actual daily observations coupled with atmospheric circulation patterns (CPs). The temperature dynamics were reconstructed in phase space from the temperature time series. The statistical comparison of different temperature patterns was based on discriminating statistics obtained from Recurrence Quantification Analysis (RQA). Moreover, the bootstrap method of Schinkel et al. (2009) was adopted to calculate the confidence bounds of the RQA parameters based on structure-preserving resampling. The overall methodology was applied to the mountainous Mesochora catchment in Central-Western Greece. The results reveal substantial similarities between the historical maximum and minimum daily mean areal temperature statistical patterns and their confidence bounds, as well as between the evolving maximum and minimum temperature patterns under the 2 × CO2 scenario. Significant variability and non-stationary behaviour characterize all climate series analyzed. Fundamental differences emerge between the historical and the maximum 1 × CO2 scenarios, between the maximum and minimum 1 × CO2 scenarios, and between the confidence bounds of the two CO2 scenarios. The 2 × CO2 scenario reflects the strongest shifts in the intensity, duration and frequency of temperature patterns. Such transitions can help scientists and policy makers understand the effects of extreme temperature changes on water resources, economic development, and ecosystem health, and hence proceed to effective proactive management of extreme phenomena. The impacts of the findings on the predictability of the extreme daily mean areal temperature patterns are also discussed.
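    Two of the RQA measures typically used in such comparisons, recurrence rate and determinism, can be sketched for a univariate series as below. No phase-space embedding is applied and the main diagonal is not excluded, so this only illustrates the idea behind the statistics; it is not the study's implementation or the bootstrap procedure of Schinkel et al.

        # Minimal sketch of two RQA measures (recurrence rate and determinism) for a
        # univariate series; illustrative only, with no embedding and no exclusion of the
        # line of identity.
        import numpy as np

        rng = np.random.default_rng(1)
        x = np.sin(np.linspace(0, 20 * np.pi, 500)) + 0.1 * rng.standard_normal(500)

        eps = 0.2                                     # recurrence threshold
        dist = np.abs(x[:, None] - x[None, :])        # pairwise distances
        R = (dist <= eps).astype(int)                 # recurrence matrix

        recurrence_rate = R.mean()

        # Determinism: fraction of recurrent points lying on diagonal lines of length >= lmin
        lmin, diag_points, total_points = 2, 0, R.sum()
        n = len(x)
        for k in range(-(n - 1), n):
            diag = np.diagonal(R, offset=k)
            run = 0
            for v in np.append(diag, 0):              # trailing 0 flushes the last run
                if v:
                    run += 1
                else:
                    if run >= lmin:
                        diag_points += run
                    run = 0
        determinism = diag_points / total_points

        print(f"recurrence rate = {recurrence_rate:.3f}, determinism = {determinism:.3f}")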

  6. A model to calculate consistent atmospheric emission projections and its application to Spain

    NASA Astrophysics Data System (ADS)

    Lumbreras, Julio; Borge, Rafael; de Andrés, Juan Manuel; Rodríguez, Encarnación

    Global warming and air quality are headline environmental issues of our time, and policy must preempt negative international effects with forward-looking strategies. As part of the revision of the European National Emission Ceilings Directive, atmospheric emission projections for European Union countries are being calculated. These projections are useful to drive European air quality analyses and to support wide-scale decision-making. However, when evaluating specific policies and measures at the sectoral level, a more detailed approach is needed. This paper presents an original methodology to evaluate emission projections. Emission projections are calculated for each emitting activity under three scenarios: without measures (business as usual), with measures (baseline) and with additional measures (target). The methodology developed allows the estimation of highly disaggregated, consistent multi-pollutant emissions for a whole country or region. In order to assure consistency with past emissions included in atmospheric emission inventories and coherence among the individual activities, the consistent emission projection (CEP) model incorporates harmonization and integration criteria as well as quality assurance/quality control (QA/QC) procedures. This study includes a sensitivity analysis as a first approach to uncertainty evaluation. The aim of the model presented in this contribution is to support the decision-making process through the assessment of future emission scenarios, taking into account the effect of detailed technical and non-technical measures; it may also constitute the basis for air quality modelling. The system is designed to produce the information and formats required for international reporting, and it allows comparison of national results with lower-resolution models such as RAINS/GAINS. The methodology has been successfully applied and tested to evaluate Spanish emission projections up to 2020 for 26 pollutants, but it could be adapted to any particular region for different purposes, especially European countries.
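    The three-scenario structure (without measures, with measures, with additional measures) can be sketched for a single emitting activity as emissions = activity level x emission factor, with measures represented as reductions of the emission factor. The activity level, growth rate, emission factor, and reduction percentages below are invented and are not part of the CEP model.

        # Sketch of the three-scenario projection structure for one emitting activity:
        # emissions = activity level x emission factor, with measures expressed as
        # emission-factor reductions. All figures are invented.

        activity_2020 = 100.0          # e.g. kt of fuel burned (hypothetical)
        growth_per_year = 0.02         # assumed activity growth
        ef_base = 2.0                  # kg pollutant per unit activity (hypothetical)

        SCENARIOS = {
            "without measures": 0.00,           # no abatement
            "with measures": 0.20,              # 20% emission-factor cut from adopted policies
            "with additional measures": 0.40,   # 40% cut from planned extra policies
        }

        def projection(year, reduction):
            activity = activity_2020 * (1 + growth_per_year) ** (year - 2020)
            return activity * ef_base * (1 - reduction)

        for name, cut in SCENARIOS.items():
            print(f"2030 {name:>25}: {projection(2030, cut):7.1f} kg")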

  7. Theory of mind: mechanisms, methods, and new directions

    PubMed Central

    Byom, Lindsey J.; Mutlu, Bilge

    2013-01-01

    Theory of Mind (ToM) has received significant research attention. Traditional ToM research has provided important understanding of how humans reason about mental states by utilizing shared world knowledge, social cues, and the interpretation of actions; however, many current behavioral paradigms are limited to static, “third-person” protocols. Emerging experimental approaches such as cognitive simulation and simulated social interaction offer opportunities to investigate ToM in interactive, “first-person” and “second-person” scenarios while affording greater experimental control. The advantages and limitations of traditional and emerging ToM methodologies are discussed with the intent of advancing the understanding of ToM in socially mediated situations. PMID:23964218

  8. A changing climate: impacts on human exposures to O3 using ...

    EPA Pesticide Factsheets

    Predicting the impacts of changing climate on human exposure to air pollution requires future scenarios that account for changes in ambient pollutant concentrations, population sizes and distributions, and housing stocks. An integrated methodology to model changes in human exposures due to these impacts was developed by linking climate, air quality, land-use, and human exposure models. This methodology was then applied to characterize changes in predicted human exposures to O3 under multiple future scenarios. Regional climate projections for the U.S. were developed by downscaling global circulation model (GCM) scenarios for three of the Intergovernmental Panel on Climate Change's (IPCC's) Representative Concentration Pathways (RCPs) using the Weather Research and Forecasting (WRF) model. The regional climate results were in turn used to generate air quality (concentration) projections using the Community Multiscale Air Quality (CMAQ) model. For each of the climate change scenarios, future U.S. census-tract level population distributions from the Integrated Climate and Land Use Scenarios (ICLUS) model for four future scenarios based on the IPCC's Special Report on Emissions Scenarios (SRES) storylines were used. These climate, air quality, and population projections were used as inputs to EPA's Air Pollutants Exposure (APEX) model for 12 U.S. cities. Probability density functions show changes in the population distribution of 8 h maximum daily O3 exposures.

  9. Cost Effectiveness of ASHRAE Standard 90.1-2013 for the State of Texas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Philip R.; Athalye, Rahul A.; Xie, YuLong

    Moving to the ASHRAE Standard 90.1-2013 (ASHRAE 2013) edition from Standard 90.1-2010 (ASHRAE 2010) is cost-effective for the State of Texas. The table below shows the state-wide economic impact of upgrading to Standard 90.1-2013 in terms of the annual energy cost savings in dollars per square foot, additional construction cost per square foot required by the upgrade, and life-cycle cost (LCC) per square foot. These results are weighted averages for all building types in all climate zones in the state, based on weightings shown in Table 4. The methodology used for this analysis is consistent with the methodology used in the national cost-effectiveness analysis. Additional results and details on the methodology are presented in the following sections. The report provides analysis of two LCC scenarios: Scenario 1, representing publicly-owned buildings, considers initial costs, energy costs, maintenance costs, and replacement costs—without borrowing or taxes. Scenario 2, representing privately-owned buildings, adds borrowing costs and tax impacts.

  10. Cost Effectiveness of ASHRAE Standard 90.1-2013 for the State of Minnesota

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Philip R.; Athalye, Rahul A.; Xie, YuLong

    2015-12-01

    Moving to the ASHRAE Standard 90.1-2013 (ASHRAE 2013) edition from Standard 90.1-2010 (ASHRAE 2010) is cost-effective for the State of Minnesota. The table below shows the state-wide economic impact of upgrading to Standard 90.1-2013 in terms of the annual energy cost savings in dollars per square foot, additional construction cost per square foot required by the upgrade, and life-cycle cost (LCC) per square foot. These results are weighted averages for all building types in all climate zones in the state, based on weightings shown in Table 4. The methodology used for this analysis is consistent with the methodology used in the national cost-effectiveness analysis. Additional results and details on the methodology are presented in the following sections. The report provides analysis of two LCC scenarios: Scenario 1, representing publicly-owned buildings, considers initial costs, energy costs, maintenance costs, and replacement costs—without borrowing or taxes. Scenario 2, representing privately-owned buildings, adds borrowing costs and tax impacts.

  11. Cost Effectiveness of ASHRAE Standard 90.1-2013 for the State of Indiana

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Philip R.; Athalye, Rahul A.; Xie, YuLong

    Moving to the ASHRAE Standard 90.1-2013 (ASHRAE 2013) edition from Standard 90.1-2010 (ASHRAE 2010) is cost-effective for the State of Indiana. The table below shows the state-wide economic impact of upgrading to Standard 90.1-2013 in terms of the annual energy cost savings in dollars per square foot, additional construction cost per square foot required by the upgrade, and life-cycle cost (LCC) per square foot. These results are weighted averages for all building types in all climate zones in the state, based on weightings shown in Table 4. The methodology used for this analysis is consistent with the methodology used in the national cost-effectiveness analysis. Additional results and details on the methodology are presented in the following sections. The report provides analysis of two LCC scenarios: Scenario 1, representing publicly-owned buildings, considers initial costs, energy costs, maintenance costs, and replacement costs—without borrowing or taxes. Scenario 2, representing privately-owned buildings, adds borrowing costs and tax impacts.

  12. Cost Effectiveness of ASHRAE Standard 90.1-2013 for the State of Florida

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Philip R.; Athalye, Rahul A.; Xie, YuLong

    2015-12-01

    Moving to the ASHRAE Standard 90.1-2013 (ASHRAE 2013) edition from Standard 90.1-2010 (ASHRAE 2010) is cost-effective for the State of Florida. The table below shows the state-wide economic impact of upgrading to Standard 90.1-2013 in terms of the annual energy cost savings in dollars per square foot, additional construction cost per square foot required by the upgrade, and life-cycle cost (LCC) per square foot. These results are weighted averages for all building types in all climate zones in the state, based on weightings shown in Table 4. The methodology used for this analysis is consistent with the methodology used in the national cost-effectiveness analysis. Additional results and details on the methodology are presented in the following sections. The report provides analysis of two LCC scenarios: Scenario 1, representing publicly-owned buildings, considers initial costs, energy costs, maintenance costs, and replacement costs—without borrowing or taxes. Scenario 2, representing privately-owned buildings, adds borrowing costs and tax impacts.

  13. Cost Effectiveness of ASHRAE Standard 90.1-2013 for the State of Maine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Philip R.; Athalye, Rahul A.; Xie, YuLong

    2015-12-01

    Moving to the ASHRAE Standard 90.1-2013 (ASHRAE 2013) edition from Standard 90.1-2010 (ASHRAE 2010) is cost-effective for the State of Maine. The table below shows the state-wide economic impact of upgrading to Standard 90.1-2013 in terms of the annual energy cost savings in dollars per square foot, additional construction cost per square foot required by the upgrade, and life-cycle cost (LCC) per square foot. These results are weighted averages for all building types in all climate zones in the state, based on weightings shown in Table 4. The methodology used for this analysis is consistent with the methodology used in the national cost-effectiveness analysis. Additional results and details on the methodology are presented in the following sections. The report provides analysis of two LCC scenarios: Scenario 1, representing publicly-owned buildings, considers initial costs, energy costs, maintenance costs, and replacement costs—without borrowing or taxes. Scenario 2, representing privately-owned buildings, adds borrowing costs and tax impacts.

  14. Cost Effectiveness of ASHRAE Standard 90.1-2013 for the State of Vermont

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Philip R.; Athalye, Rahul A.; Xie, YuLong

    2015-12-01

    Moving to the ASHRAE Standard 90.1-2013 (ASHRAE 2013) edition from Standard 90.1-2010 (ASHRAE 2010) is cost-effective for the State of Vermont. The table below shows the state-wide economic impact of upgrading to Standard 90.1-2013 in terms of the annual energy cost savings in dollars per square foot, additional construction cost per square foot required by the upgrade, and life-cycle cost (LCC) per square foot. These results are weighted averages for all building types in all climate zones in the state, based on weightings shown in Table 4. The methodology used for this analysis is consistent with the methodology used in the national cost-effectiveness analysis. Additional results and details on the methodology are presented in the following sections. The report provides analysis of two LCC scenarios: Scenario 1, representing publicly-owned buildings, considers initial costs, energy costs, maintenance costs, and replacement costs—without borrowing or taxes. Scenario 2, representing privately-owned buildings, adds borrowing costs and tax impacts.

  15. Cost Effectiveness of ASHRAE Standard 90.1-2013 for the State of Michigan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Philip R.; Athalye, Rahul A.; Xie, YuLong

    Moving to the ASHRAE Standard 90.1-2013 (ASHRAE 2013) edition from Standard 90.1-2010 (ASHRAE 2010) is cost-effective for the State of Michigan. The table below shows the state-wide economic impact of upgrading to Standard 90.1-2013 in terms of the annual energy cost savings in dollars per square foot, additional construction cost per square foot required by the upgrade, and life-cycle cost (LCC) per square foot. These results are weighted averages for all building types in all climate zones in the state, based on weightings shown in Table 4. The methodology used for this analysis is consistent with the methodology used in the national cost-effectiveness analysis. Additional results and details on the methodology are presented in the following sections. The report provides analysis of two LCC scenarios: Scenario 1, representing publicly-owned buildings, considers initial costs, energy costs, maintenance costs, and replacement costs—without borrowing or taxes. Scenario 2, representing privately-owned buildings, adds borrowing costs and tax impacts.

  16. Cost Effectiveness of ASHRAE Standard 90.1-2013 for the State of Alabama

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Philip R.; Athalye, Rahul A.; Xie, YuLong

    2015-12-01

    Moving to the ASHRAE Standard 90.1-2013 (ASHRAE 2013) edition from Standard 90.1-2010 (ASHRAE 2010) is cost-effective for the State of Alabama. The table below shows the state-wide economic impact of upgrading to Standard 90.1-2013 in terms of the annual energy cost savings in dollars per square foot, additional construction cost per square foot required by the upgrade, and life-cycle cost (LCC) per square foot. These results are weighted averages for all building types in all climate zones in the state, based on weightings shown in Table 4. The methodology used for this analysis is consistent with the methodology used in the national cost-effectiveness analysis. Additional results and details on the methodology are presented in the following sections. The report provides analysis of two LCC scenarios: Scenario 1, representing publicly-owned buildings, considers initial costs, energy costs, maintenance costs, and replacement costs—without borrowing or taxes. Scenario 2, representing privately-owned buildings, adds borrowing costs and tax impacts.

  17. Cost Effectiveness of ASHRAE Standard 90.1-2013 for the State of New Hampshire

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Philip R.; Athalye, Rahul A.; Xie, YuLong

    2015-12-01

    Moving to the ASHRAE Standard 90.1-2013 (ASHRAE 2013) edition from Standard 90.1-2010 (ASHRAE 2010) is cost-effective for the State of New Hampshire. The table below shows the state-wide economic impact of upgrading to Standard 90.1-2013 in terms of the annual energy cost savings in dollars per square foot, additional construction cost per square foot required by the upgrade, and life-cycle cost (LCC) per square foot. These results are weighted averages for all building types in all climate zones in the state, based on weightings shown in Table 4. The methodology used for this analysis is consistent with the methodology used in the national cost-effectiveness analysis. Additional results and details on the methodology are presented in the following sections. The report provides analysis of two LCC scenarios: Scenario 1, representing publicly-owned buildings, considers initial costs, energy costs, maintenance costs, and replacement costs—without borrowing or taxes. Scenario 2, representing privately-owned buildings, adds borrowing costs and tax impacts.

  18. Cost Effectiveness of ASHRAE Standard 90.1-2013 for the State of New Mexico

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Philip R.; Athalye, Rahul A.; Xie, YuLong

    2015-12-01

    Moving to the ASHRAE Standard 90.1-2013 (ASHRAE 2013) edition from Standard 90.1-2010 (ASHRAE 2010) is cost-effective for the State of New Mexico. The table below shows the state-wide economic impact of upgrading to Standard 90.1-2013 in terms of the annual energy cost savings in dollars per square foot, additional construction cost per square foot required by the upgrade, and life-cycle cost (LCC) per square foot. These results are weighted averages for all building types in all climate zones in the state, based on weightings shown in Table 4. The methodology used for this analysis is consistent with the methodology used in the national cost-effectiveness analysis. Additional results and details on the methodology are presented in the following sections. The report provides analysis of two LCC scenarios: Scenario 1, representing publicly-owned buildings, considers initial costs, energy costs, maintenance costs, and replacement costs—without borrowing or taxes. Scenario 2, representing privately-owned buildings, adds borrowing costs and tax impacts.

  19. Cost Effectiveness of ASHRAE Standard 90.1-2013 for the State of Colorado

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Philip R.; Athalye, Rahul A.; Xie, YuLong

    2015-12-01

    Moving to the ASHRAE Standard 90.1-2013 (ASHRAE 2013) edition from Standard 90.1-2010 (ASHRAE 2010) is cost-effective for the State of Colorado. The table below shows the state-wide economic impact of upgrading to Standard 90.1-2013 in terms of the annual energy cost savings in dollars per square foot, additional construction cost per square foot required by the upgrade, and life-cycle cost (LCC) per square foot. These results are weighted averages for all building types in all climate zones in the state, based on weightings shown in Table 4. The methodology used for this analysis is consistent with the methodology used in the national cost-effectiveness analysis. Additional results and details on the methodology are presented in the following sections. The report provides analysis of two LCC scenarios: Scenario 1, representing publicly-owned buildings, considers initial costs, energy costs, maintenance costs, and replacement costs—without borrowing or taxes. Scenario 2, representing privately-owned buildings, adds borrowing costs and tax impacts.

  20. Cost Effectiveness of ASHRAE Standard 90.1-2013 for the State of Washington

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Philip R.; Athalye, Rahul A.; Xie, YuLong

    2015-12-01

    Moving to the ASHRAE Standard 90.1-2013 (ASHRAE 2013) edition from Standard 90.1-2010 (ASHRAE 2010) is cost-effective for the State of Washington. The table below shows the state-wide economic impact of upgrading to Standard 90.1-2013 in terms of the annual energy cost savings in dollars per square foot, additional construction cost per square foot required by the upgrade, and life-cycle cost (LCC) per square foot. These results are weighted averages for all building types in all climate zones in the state, based on weightings shown in Table 4. The methodology used for this analysis is consistent with the methodology used in the national cost-effectiveness analysis. Additional results and details on the methodology are presented in the following sections. The report provides analysis of two LCC scenarios: Scenario 1, representing publicly-owned buildings, considers initial costs, energy costs, maintenance costs, and replacement costs—without borrowing or taxes. Scenario 2, representing privately-owned buildings, adds borrowing costs and tax impacts.

  1. Cost Effectiveness of ASHRAE Standard 90.1-2013 for the State of Montana

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Philip R.; Athalye, Rahul A.; Xie, YuLong

    2015-12-01

    Moving to the ASHRAE Standard 90.1-2013 (ASHRAE 2013) edition from Standard 90.1-2010 (ASHRAE 2010) is cost-effective for the State of Montana. The table below shows the state-wide economic impact of upgrading to Standard 90.1-2013 in terms of the annual energy cost savings in dollars per square foot, additional construction cost per square foot required by the upgrade, and life-cycle cost (LCC) per square foot. These results are weighted averages for all building types in all climate zones in the state, based on weightings shown in Table 4. The methodology used for this analysis is consistent with the methodology used in the national cost-effectiveness analysis. Additional results and details on the methodology are presented in the following sections. The report provides analysis of two LCC scenarios: Scenario 1, representing publicly-owned buildings, considers initial costs, energy costs, maintenance costs, and replacement costs—without borrowing or taxes. Scenario 2, representing privately-owned buildings, adds borrowing costs and tax impacts.

  2. Cost Effectiveness of ASHRAE Standard 90.1-2013 for the District of Columbia

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Philip R.; Athalye, Rahul A.; Xie, YuLong

    2015-12-01

    Moving to the ASHRAE Standard 90.1-2013 (ASHRAE 2013) edition from Standard 90.1-2010 (ASHRAE 2010) is cost-effective for the District of Columbia. The table below shows the state-wide economic impact of upgrading to Standard 90.1-2013 in terms of the annual energy cost savings in dollars per square foot, additional construction cost per square foot required by the upgrade, and life-cycle cost (LCC) per square foot. These results are weighted averages for all building types in all climate zones in the state, based on weightings shown in Table 4. The methodology used for this analysis is consistent with the methodology used in the national cost-effectiveness analysis. Additional results and details on the methodology are presented in the following sections. The report provides analysis of two LCC scenarios: Scenario 1, representing publicly-owned buildings, considers initial costs, energy costs, maintenance costs, and replacement costs—without borrowing or taxes. Scenario 2, representing privately-owned buildings, adds borrowing costs and tax impacts.

  3. Cost Effectiveness of ASHRAE Standard 90.1-2013 for the State of Massachusetts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Philip R.; Athalye, Rahul A.; Xie, YuLong

    2015-12-01

    Moving to the ASHRAE Standard 90.1-2013 (ASHRAE 2013) edition from Standard 90.1-2010 (ASHRAE 2010) is cost-effective for the State of Massachusetts. The table below shows the state-wide economic impact of upgrading to Standard 90.1-2013 in terms of the annual energy cost savings in dollars per square foot, additional construction cost per square foot required by the upgrade, and life-cycle cost (LCC) per square foot. These results are weighted averages for all building types in all climate zones in the state, based on weightings shown in Table 4. The methodology used for this analysis is consistent with the methodology used in the national cost-effectiveness analysis. Additional results and details on the methodology are presented in the following sections. The report provides analysis of two LCC scenarios: Scenario 1, representing publicly-owned buildings, considers initial costs, energy costs, maintenance costs, and replacement costs—without borrowing or taxes. Scenario 2, representing privately-owned buildings, adds borrowing costs and tax impacts.

  4. Cost Effectiveness of ASHRAE Standard 90.1-2013 for the State of Oregon

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Philip R.; Athalye, Rahul A.; Xie, YuLong

    2015-12-01

    Moving to the ASHRAE Standard 90.1-2013 (ASHRAE 2013) edition from Standard 90.1-2010 (ASHRAE 2010) is cost-effective for the State of Oregon. The table below shows the state-wide economic impact of upgrading to Standard 90.1-2013 in terms of the annual energy cost savings in dollars per square foot, additional construction cost per square foot required by the upgrade, and life-cycle cost (LCC) per square foot. These results are weighted averages for all building types in all climate zones in the state, based on weightings shown in Table 4. The methodology used for this analysis is consistent with the methodology used in the national cost-effectiveness analysis. Additional results and details on the methodology are presented in the following sections. The report provides analysis of two LCC scenarios: Scenario 1, representing publicly-owned buildings, considers initial costs, energy costs, maintenance costs, and replacement costs—without borrowing or taxes. Scenario 2, representing privately-owned buildings, adds borrowing costs and tax impacts.

  5. Cost Effectiveness of ASHRAE Standard 90.1-2013 for the State of Wisconsin

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Philip R.; Athalye, Rahul A.; Xie, YuLong

    2015-12-01

    Moving to the ASHRAE Standard 90.1-2013 (ASHRAE 2013) edition from Standard 90.1-2010 (ASHRAE 2010) is cost-effective for the State of Wisconsin. The table below shows the state-wide economic impact of upgrading to Standard 90.1-2013 in terms of the annual energy cost savings in dollars per square foot, additional construction cost per square foot required by the upgrade, and life-cycle cost (LCC) per square foot. These results are weighted averages for all building types in all climate zones in the state, based on weightings shown in Table 4. The methodology used for this analysis is consistent with the methodology used in the national cost-effectiveness analysis. Additional results and details on the methodology are presented in the following sections. The report provides analysis of two LCC scenarios: Scenario 1, representing publicly-owned buildings, considers initial costs, energy costs, maintenance costs, and replacement costs—without borrowing or taxes. Scenario 2, representing privately-owned buildings, adds borrowing costs and tax impacts.

  6. Cost Effectiveness of ASHRAE Standard 90.1-2013 for the State of Ohio

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Philip R.; Athalye, Rahul A.; Xie, YuLong

    2015-12-01

    Moving to the ASHRAE Standard 90.1-2013 (ASHRAE 2013) edition from Standard 90.1-2010 (ASHRAE 2010) is cost-effective for the State of Ohio. The table below shows the state-wide economic impact of upgrading to Standard 90.1-2013 in terms of the annual energy cost savings in dollars per square foot, additional construction cost per square foot required by the upgrade, and life-cycle cost (LCC) per square foot. These results are weighted averages for all building types in all climate zones in the state, based on weightings shown in Table 4. The methodology used for this analysis is consistent with the methodology used in the national cost-effectiveness analysis. Additional results and details on the methodology are presented in the following sections. The report provides analysis of two LCC scenarios: Scenario 1, representing publicly-owned buildings, considers initial costs, energy costs, maintenance costs, and replacement costs—without borrowing or taxes. Scenario 2, representing privately-owned buildings, adds borrowing costs and tax impacts.

  7. Cost Effectiveness of ASHRAE Standard 90.1-2013 for the State of South Carolina

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Philip R.; Athalye, Rahul A.; Xie, YuLong

    2015-12-01

    Moving to the ASHRAE Standard 90.1-2013 (ASHRAE 2013) edition from Standard 90.1-2010 (ASHRAE 2010) is cost-effective for the State of South Carolina. The table below shows the state-wide economic impact of upgrading to Standard 90.1-2013 in terms of the annual energy cost savings in dollars per square foot, additional construction cost per square foot required by the upgrade, and life-cycle cost (LCC) per square foot. These results are weighted averages for all building types in all climate zones in the state, based on weightings shown in Table 4. The methodology used for this analysis is consistent with the methodology used in the national cost-effectiveness analysis. Additional results and details on the methodology are presented in the following sections. The report provides analysis of two LCC scenarios: Scenario 1, representing publicly-owned buildings, considers initial costs, energy costs, maintenance costs, and replacement costs—without borrowing or taxes. Scenario 2, representing privately-owned buildings, adds borrowing costs and tax impacts.

  8. Cost Effectiveness of ASHRAE Standard 90.1-2013 for the State of North Carolina

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Philip R.; Athalye, Rahul A.; Xie, YuLong

    2015-12-01

    Moving to the ASHRAE Standard 90.1-2013 (ASHRAE 2013) edition from Standard 90.1-2010 (ASHRAE 2010) is cost-effective for the State of North Carolina. The table below shows the state-wide economic impact of upgrading to Standard 90.1-2013 in terms of the annual energy cost savings in dollars per square foot, additional construction cost per square foot required by the upgrade, and life-cycle cost (LCC) per square foot. These results are weighted averages for all building types in all climate zones in the state, based on weightings shown in Table 4. The methodology used for this analysis is consistent with the methodology used in the national cost-effectiveness analysis. Additional results and details on the methodology are presented in the following sections. The report provides analysis of two LCC scenarios: Scenario 1, representing publicly-owned buildings, considers initial costs, energy costs, maintenance costs, and replacement costs—without borrowing or taxes. Scenario 2, representing privately-owned buildings, adds borrowing costs and tax impacts.

  9. Cost Effectiveness of ASHRAE Standard 90.1-2013 for the State of Iowa

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, Philip R.; Athalye, Rahul A.; Xie, YuLong

    2015-12-01

    Moving to the ASHRAE Standard 90.1-2013 (ASHRAE 2013) edition from Standard 90.1-2010 (ASHRAE 2010) is cost-effective for the State of Iowa. The table below shows the state-wide economic impact of upgrading to Standard 90.1-2013 in terms of the annual energy cost savings in dollars per square foot, additional construction cost per square foot required by the upgrade, and life-cycle cost (LCC) per square foot. These results are weighted averages for all building types in all climate zones in the state, based on weightings shown in Table 4. The methodology used for this analysis is consistent with the methodology used in the national cost-effectiveness analysis. Additional results and details on the methodology are presented in the following sections. The report provides analysis of two LCC scenarios: Scenario 1, representing publicly-owned buildings, considers initial costs, energy costs, maintenance costs, and replacement costs, without borrowing or taxes. Scenario 2, representing privately-owned buildings, adds borrowing costs and tax impacts.

  10. LCA of greywater management within a water circular economy restorative thinking framework.

    PubMed

    Dominguez, Sara; Laso, Jara; Margallo, María; Aldaco, Rubén; Rivero, Maria J; Irabien, Ángel; Ortiz, Inmaculada

    2018-04-15

    Greywater reuse is an attractive option for the sustainable management of water under water scarcity circumstances, within a water circular economy restorative thinking framework. Its successful deployment relies on the availability of low cost and environmentally friendly technologies. The life cycle assessment (LCA) approach provides the appropriate methodological tool for the evaluation of alternative treatments based on environmental decision criteria and, therefore, it is highly useful during the process conceptual design. This methodology should be employed in the early design phase to select those technologies with lower environmental impact. This work reports the comparative LCA of three scenarios for greywater reuse: photocatalysis, photovoltaic solar-driven photocatalysis and membrane biological reactor, in order to help select the most environmentally friendly technology. The study has been focused on the removal of the surfactant sodium dodecylbenzenesulfonate, which is used in the formulation of detergents and personal care products and, thus, widely present in greywater. LCA was applied using the Environmental Sustainability Assessment methodology to obtain two main environmental indicators that simplify the decision-making process: natural resources and environmental burdens. Energy consumption is the main contributor to both indicators, owing to the high energy consumption of the light source for the photocatalytic greywater treatment. To reduce these environmental burdens, the most desirable scenario would be the use of solar light for the photocatalytic transformation. However, until the technological challenge of using solar light directly is resolved, photovoltaic solar-driven photocatalysis has been shown to be environmentally suitable for greywater reuse, as it involves the smallest environmental impact among the three alternatives studied. Copyright © 2017 Elsevier B.V. All rights reserved.

  11. Advanced space system concepts and their orbital support needs (1980 - 2000). Volume 4: Detailed data. Part 2: Program plans and common support needs (a study of the commonality of space vehicle applications to future national needs)

    NASA Technical Reports Server (NTRS)

    Bekey, I.; Mayer, H. L.; Wolfe, M. G.

    1976-01-01

    The methodology of alternate world future scenarios is utilized for selecting a plausible, though not advocated, set of future scenarios each of which results in a program plan appropriate for the respective environment. Each such program plan gives rise to different building block and technology requirements, which are analyzed for common need between the NASA and the DoD for each of the alternate world scenarios. An essentially invariant set of system, building block, and technology development plans is presented at the conclusion, intended to allow protection of most of the options for system concepts regardless of what the actual future world environment turns out to be. Thus, building block and technology needs are derived which support: (1) each specific world scenario; (2) all the world scenarios identified in this study; or (3) generalized scenarios applicable to almost any future environment. The output included in this volume consists of the building blocks, i.e.: transportation vehicles, orbital support vehicles, and orbital support facilities; the technology required to support the program plans; identification of their features which could support the DoD and NASA in common; and a complete discussion of the planning methodology.

  12. Dark scenarios

    NASA Astrophysics Data System (ADS)

    Ahonen, Pasi; Alahuhta, Petteri; Daskala, Barbara; Delaitre, Sabine; Hert, Paul De; Lindner, Ralf; Maghiros, Ioannis; Moscibroda, Anna; Schreurs, Wim; Verlinden, Michiel

    In this chapter, we present four "dark scenarios" that highlight the key socio-economic, legal, technological and ethical risks to privacy, identity, trust, security and inclusiveness posed by new AmI technologies. We call them dark scenarios because they show things that could go wrong in an AmI world and present visions of the future that we do not want to become reality. The scenarios expose threats and vulnerabilities as a way to inform policy-makers and planners about issues they need to take into account in developing new policies or updating existing legislation. Before presenting the four scenarios and our analysis of each, we describe the process of how we created the scenarios as well as the elements in our methodology for analysing the scenarios.

  13. Future Scenarios for Mobile Science Learning

    ERIC Educational Resources Information Center

    Burden, Kevin; Kearney, Matthew

    2016-01-01

    This paper adopts scenario planning as a methodological approach and tool to help science educators reconceptualise their use of mobile technologies across various different futures. These "futures" are set out neither as predictions nor prognoses but rather as stimuli to encourage greater discussion and reflection around the use of…

  14. Game-Like Technology Innovation Education

    ERIC Educational Resources Information Center

    Magnussen, Rikke

    2011-01-01

    This paper examines the methodological challenges and perspectives of designing game-like scenarios for the implementation of innovation processes in school science education. This paper presents a design-based research study of a game-like innovation scenario designed for technology education for Danish public school students aged 13-15. Students…

  15. The Simulation of Daily Temperature Time Series from GCM Output. Part II: Sensitivity Analysis of an Empirical Transfer Function Methodology.

    NASA Astrophysics Data System (ADS)

    Winkler, Julie A.; Palutikof, Jean P.; Andresen, Jeffrey A.; Goodess, Clare M.

    1997-10-01

    Empirical transfer functions have been proposed as a means for 'downscaling' simulations from general circulation models (GCMs) to the local scale. However, subjective decisions made during the development of these functions may influence the ensuing climate scenarios. This research evaluated the sensitivity of a selected empirical transfer function methodology to 1) the definition of the seasons for which separate specification equations are derived, 2) adjustments for known departures of the GCM simulations of the predictor variables from observations, 3) the length of the calibration period, 4) the choice of function form, and 5) the choice of predictor variables. A modified version of the Climatological Projection by Model Statistics method was employed to generate control (1 × CO2) and perturbed (2 × CO2) scenarios of daily maximum and minimum temperature for two locations with diverse climates (Alcantarilla, Spain, and Eau Claire, Michigan). The GCM simulations used in the scenario development were from the Canadian Climate Centre second-generation model (CCC GCMII). Variations in the downscaling methodology were found to have a statistically significant impact on the 2 × CO2 climate scenarios, even though the 1 × CO2 scenarios for the different transfer function approaches were often similar. The daily temperature scenarios for Alcantarilla and Eau Claire were most sensitive to the decision to adjust for deficiencies in the GCM simulations, the choice of predictor variables, and the seasonal definitions used to derive the functions (i.e., fixed seasons, floating seasons, or no seasons). The scenarios were less sensitive to the choice of function form (i.e., linear versus nonlinear) and to an increase in the length of the calibration period. The results of Part I, which identified significant departures of the CCC GCMII simulations of two candidate predictor variables from observations, together with those presented here in Part II, 1) illustrate the importance of detailed comparisons of observed and GCM 1 × CO2 series of candidate predictor variables as an initial step in impact analysis, 2) demonstrate that decisions made when developing the transfer functions can have a substantial influence on the 2 × CO2 scenarios and their interpretation, 3) highlight the uncertainty in the appropriate criteria for evaluating transfer function approaches, and 4) suggest that automation of empirical transfer function methodologies is inappropriate because of differences in the performance of transfer functions between sites and because of spatial differences in the GCM's ability to adequately simulate the predictor variables used in the functions.
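
    To make the idea of a specification equation concrete, the sketch below fits a single linear transfer function to synthetic data and applies it to perturbed predictors. It is not the Climatological Projection by Model Statistics implementation; the predictors, noise level, and mean shift are invented, and no seasonal stratification or GCM bias adjustment is included.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: observed daily Tmax and two large-scale GCM-style
# predictors (e.g. lower-tropospheric temperature, sea level pressure) for a
# calibration period.
n_days = 3650
predictors_1xco2 = rng.normal(size=(n_days, 2))
observed_tmax = (15.0 + 4.0 * predictors_1xco2[:, 0] - 1.5 * predictors_1xco2[:, 1]
                 + rng.normal(scale=2.0, size=n_days))

# Fit one linear specification equation (ordinary least squares with intercept).
X = np.column_stack([np.ones(n_days), predictors_1xco2])
coeffs, *_ = np.linalg.lstsq(X, observed_tmax, rcond=None)

# Apply the fitted transfer function to perturbed-run (2xCO2) predictors to
# produce a downscaled scenario; here the perturbation is a simple mean shift.
predictors_2xco2 = predictors_1xco2 + np.array([0.8, 0.0])
X2 = np.column_stack([np.ones(n_days), predictors_2xco2])
downscaled_2xco2 = X2 @ coeffs

print("Scenario change in mean Tmax (deg C):",
      round(downscaled_2xco2.mean() - observed_tmax.mean(), 2))
```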

  16. Incorporating scenario-based simulation into a hospital nursing education program.

    PubMed

    Nagle, Beth M; McHale, Jeanne M; Alexander, Gail A; French, Brian M

    2009-01-01

    Nurse educators are challenged to provide meaningful and effective learning opportunities for both new and experienced nurses. Simulation as a teaching and learning methodology is being embraced by nursing in academic and practice settings to provide innovative educational experiences to assess and develop clinical competency, promote teamwork, and improve care processes. This article provides an overview of the historical basis for using simulation in education, simulation methodologies, and perceived advantages and disadvantages. It also provides a description of the integration of scenario-based programs using a full-scale patient simulator into nursing education programming at a large academic medical center.

  17. Mid-Twenty-First-Century Changes in Global Wave Energy Flux: Single-Model, Single-Forcing and Single-Scenario Ensemble Projections

    NASA Astrophysics Data System (ADS)

    Semedo, Alvaro; Lemos, Gil; Dobrynin, Mikhail; Behrens, Arno; Staneva, Joanna; Miranda, Pedro

    2017-04-01

    The knowledge of ocean surface wave energy fluxes (or wave power) is of utmost relevance, since wave power has a direct impact on coastal erosion, as well as on sediment transport and beach nourishment, ship design, and the design of coastal and offshore infrastructure. Changes in the global wave energy flux pattern can significantly alter the impact of waves on continental shelf and coastal areas. Until recently, the impact of climate change on the future global wave climate had received very little attention. Some single-model, single-scenario global wave climate projections, based on CMIP3 scenarios, were pursued under the auspices of the COWCLIP (Coordinated Ocean Wave Climate Projections) project, and received some attention in the IPCC (Intergovernmental Panel on Climate Change) AR5 (Fifth Assessment Report). In the present study, the impact of a warmer climate on the near-future global wave energy flux climate is investigated through a 4-member "coherent" ensemble of wave climate projections: single-model, single-forcing, and single-scenario. In this methodology, model variability is reduced, leaving only room for the climate change signal. The four ensemble members were produced with the wave model WAM, forced with wind speed and ice coverage from EC-Earth projections, following the high-emissions representative concentration pathway 8.5 (RCP8.5). The ensemble's present-climate reference period (the control run) was set to 1976-2005. The projected changes in the global wave energy flux climate are analyzed for the 2031-2060 period.
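
    For reference, the deep-water wave energy flux per unit crest length is commonly approximated from the significant wave height and the energy period. The snippet below implements only that textbook approximation; the input values are illustrative and are not taken from the ensemble described above.

```python
import math

def wave_energy_flux(hs, te, rho=1025.0, g=9.81):
    """Deep-water wave energy flux per unit crest length (W/m), using the
    common approximation P = rho * g^2 * Hs^2 * Te / (64 * pi)."""
    return rho * g**2 * hs**2 * te / (64.0 * math.pi)

# Illustrative values only (not model output): Hs = 2.5 m, Te = 8 s.
print(f"{wave_energy_flux(2.5, 8.0) / 1000.0:.1f} kW/m")
```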

  18. End State: The Fallacy of Modern Military Planning

    DTIC Science & Technology

    2017-04-06

    operational planning for non-linear, complex scenarios requires application of non-linear, advanced planning techniques such as design methodology ... cannot be approached in a linear, mechanistic manner by a universal planning methodology. Theater/global campaign plans and theater strategies offer no ... strategic environments, and instead prescribes a universal linear methodology that pays no mind to strategic complexity. This universal application

  19. SOCIOECONOMIC ANALYSIS OF HAZARDOUS WASTE MANAGEMENT ALTERNATIVES: METHODOLOGY AND DEMONSTRATION

    EPA Science Inventory

    A methodology for analyzing economic and social effects of alternatives in hazardous waste management is presented and demonstrated. The approach includes the use of environmental threat scenarios and evaluation of effects on and responses by parties-at-interest. The methodology ...

  20. A Scenario Approach to Assessment of New Communications Media.

    ERIC Educational Resources Information Center

    Spangler, Kathleen; And Others

    In a study supported by the Charles F. Kettering Foundation, a research team developed a methodology for illustrating the effective and ineffective uses of audio, video, and computer teleconferencing by developing scenarios for each medium. The group first invented a general situation--a conference involving participants with global, regional, and…

  1. Considerations in linking energy scenario modeling and Life Cycle Analysis

    EPA Science Inventory

    The U.S. EPA Office of Research and Development (ORD) has been exploring approaches for estimating U.S. anthropogenic air pollutant emissions through the mid-21st century. As a result, we have developed the Emission Scenario Projection methodology, or ESP. In this document, we pr...

  2. Using the scenario method in the context of health and health care--a scoping review.

    PubMed

    Vollmar, Horst Christian; Ostermann, Thomas; Redaèlli, Marcus

    2015-10-16

    The scenario technique is a method for future research and for strategic planning. Today, it includes both qualitative and quantitative elements. The aims of this scoping review are to give an overview of the application of the scenario method in the fields of health care and to make suggestions for better reporting in future scenario projects. Between January 2013 and October 2013 we conducted a systematic search in the databases Medline, Embase, PsycInfo, Eric, The Cochrane Library, Scopus, Web of Science, and Cinahl since inception for the term 'scenario(s)' in combination with other terms, e.g. method, model, and technique. Our search was not restricted by date or language. In addition, we screened the reference lists of the included articles. A total of 576 bibliographical records were screened. After removing duplicates and three rounds of screening, 41 articles covering 38 different scenario projects were included for the final analysis. Nine of the included articles addressed disease related issues, led by mental health and dementia (n = 4), and followed by cancer (n = 3). Five scenario projects focused on public health issues at an organizational level and five focused on the labor market for different health care professionals. In addition, four projects dealt with health care 'in general', four with the field of biotechnology and personalized medicine, and additional four with other technology developments. Some of the scenario projects suffered from poor reporting of methodological aspects. Despite its potential, use of the scenario method seems to be published rarely in comparison to other methods such as the Delphi-technique, at least in the field of health care. This might be due to the complexity of the methodological approach. Individual project methods and activities vary widely and are poorly reported. Improved criteria are required for reporting of scenario project methods. With improved standards and greater transparency, the scenario method will be a good tool for scientific health care planning and strategic decision-making in public health.

  3. Utility Estimation for Pediatric Vesicoureteral Reflux: Methodological Considerations Using an Online Survey Platform.

    PubMed

    Tejwani, Rohit; Wang, Hsin-Hsiao S; Lloyd, Jessica C; Kokorowski, Paul J; Nelson, Caleb P; Routh, Jonathan C

    2017-03-01

    The advent of online task distribution has opened a new avenue for efficiently gathering community perspectives needed for utility estimation. Methodological consensus for estimating pediatric utilities is lacking, with disagreement over whom to sample, what perspective to use (patient vs parent) and whether instrument induced anchoring bias is significant. We evaluated what methodological factors potentially impact utility estimates for vesicoureteral reflux. Cross-sectional surveys using a time trade-off instrument were conducted via the Amazon Mechanical Turk® (https://www.mturk.com) online interface. Respondents were randomized to answer questions from child, parent or dyad perspectives on the utility of a vesicoureteral reflux health state and 1 of 3 "warm-up" scenarios (paralysis, common cold, none) before a vesicoureteral reflux scenario. Utility estimates and potential predictors were fitted to a generalized linear model to determine what factors most impacted utilities. A total of 1,627 responses were obtained. Mean respondent age was 34.9 years. Of the respondents 48% were female, 38% were married and 44% had children. Utility values were uninfluenced by child/personal vesicoureteral reflux/urinary tract infection history, income or race. Utilities were affected by perspective and were higher in the child group (34% lower in parent vs child, p <0.001, and 13% lower in dyad vs child, p <0.001). Vesicoureteral reflux utility was not significantly affected by the presence or type of time trade-off warm-up scenario (p = 0.17). Time trade-off perspective affects utilities when estimated via an online interface. However, utilities are unaffected by the presence, type or absence of warm-up scenarios. These findings could have significant methodological implications for future utility elicitations regarding other pediatric conditions. Copyright © 2017 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.
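
    The arithmetic behind a time trade-off response is simple and worth keeping in view when reading the perspective effects reported above: if a respondent would accept x years in full health in place of a time horizon of t years in the health state, the implied utility is x/t. The snippet below encodes only that generic definition; it is not the survey instrument used in the study.

```python
def tto_utility(years_in_full_health, time_horizon):
    """Standard time trade-off utility: the fraction of the time horizon the
    respondent would accept in full health rather than live the full horizon
    in the health state being valued."""
    if not 0 <= years_in_full_health <= time_horizon:
        raise ValueError("years_in_full_health must lie within the time horizon")
    return years_in_full_health / time_horizon

# Illustrative response: accepting 9 of 10 years implies a utility of 0.90.
print(tto_utility(9.0, 10.0))
```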

  4. The ALMA CONOPS project: the impact of funding decisions on observatory performance

    NASA Astrophysics Data System (ADS)

    Ibsen, Jorge; Hibbard, John; Filippi, Giorgio

    2014-08-01

    In a time when every penny counts, many organizations are facing the question of how much scientific impact a budget cut can have or, putting it in more general terms, what the science impact of alternative (less costly) operational modes would be. In reply to such a question posed by the governing bodies, the ALMA project had to develop a methodology (ALMA Concepts for Operations, CONOPS) that attempts to measure the impact that alternative operational scenarios may have on the overall scientific production of the Observatory. Although the analysis and the results are ALMA specific, the developed approach is rather general and provides a methodology for a cost-performance analysis of alternatives before any radical alterations to the operations model are adopted. This paper describes the key aspects of the methodology: a) the definition of the Figures of Merit (FoMs) for the assessment of quantitative science performance impacts as well as qualitative impacts, and a methodology using these FoMs to evaluate the cost and impact of the different operational scenarios; b) the definition of a REFERENCE operational baseline; c) the identification of Alternative Scenarios, each replacing one or more concepts in the REFERENCE by a different concept that has a lower cost and some level of scientific and/or operational impact; d) the use of a Cost-Performance plane to graphically combine the effects that the alternative scenarios can have in terms of cost reduction and affected performance. Although this is a first-order assessment, we believe this approach is useful for comparing different operational models and for understanding the cost-performance impact of these choices. It can be used to make decisions to meet budget cuts as well as to evaluate possible new emerging opportunities.

  5. Applying the Verona coding definitions of emotional sequences (VR-CoDES) to code medical students' written responses to written case scenarios: Some methodological and practical considerations.

    PubMed

    Ortwein, Heiderose; Benz, Alexander; Carl, Petra; Huwendiek, Sören; Pander, Tanja; Kiessling, Claudia

    2017-02-01

    To investigate whether the Verona Coding Definitions of Emotional Sequences to code health providers' responses (VR-CoDES-P) can be used for assessment of medical students' responses to patients' cues and concerns provided in written case vignettes. Student responses in direct speech to patient cues and concerns were analysed in 21 different case scenarios using VR-CoDES-P. A total of 977 student responses were available for coding, and 857 responses were codable with the VR-CoDES-P. In 74.6% of responses, the students used either a "reducing space" statement only or a "providing space" statement immediately followed by a "reducing space" statement. Overall, the most frequent response was explicit information advice (ERIa) followed by content exploring (EPCEx) and content acknowledgement (EPCAc). VR-CoDES-P were applicable to written responses of medical students when they were phrased in direct speech. The application of VR-CoDES-P is reliable and feasible when using the differentiation of "providing" and "reducing space" responses. Communication strategies described by students in non-direct speech were difficult to code and produced many missing values. VR-CoDES-P are useful for analysis of medical students' written responses when focusing on emotional issues. Students need precise instructions for their response in the given test format. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  6. Contribution of economic evaluation to decision making in early phases of product development: a methodological and empirical review.

    PubMed

    Hartz, Susanne; John, Jürgen

    2008-01-01

    Economic evaluation as an integral part of health technology assessment is today mostly applied to established technologies. Evaluating healthcare innovations in their early stages of development has recently attracted attention. Although it offers several benefits, it also holds methodological challenges. The aim of our study was to investigate the possible contributions of economic evaluation to industry's decision making early in product development and to compare the results with the actual use of early data in economic assessments. We conducted a literature search to identify methodological contributions as well as economic evaluations that used data from early phases of product development. Economic analysis can be beneficially used in early phases of product development for various purposes including early market assessment, R&D portfolio management, and first estimations of pricing and reimbursement scenarios. Analytical tools available for these purposes have been identified. Numerous empirical works were identified, but most do not disclose any concrete decision context and could not be directly matched with the suggested applications. Industry can benefit from starting economic evaluation early in product development in several ways. Empirical evidence suggests that there is still potential left unused.

  7. Analyzing Effects of Turbulence on Power Generation Using Wind Plant Monitoring Data: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, J.; Chowdhury, S.; Hodge, B. M.

    2014-01-01

    In this paper, a methodology is developed to analyze how ambient and wake turbulence affects the power generation of a single wind turbine within an array of turbines. Using monitoring data from a wind power plant, we selected two sets of wind and power data for turbines on the edge of the wind plant that resemble (i) an out-of-wake scenario (i.e., when the turbine directly faces incoming winds) and (ii) an in-wake scenario (i.e., when the turbine is under the wake of other turbines). For each set of data, two surrogate models were then developed to represent the turbine power generation (i) as a function of the wind speed; and (ii) as a function of the wind speed and turbulence intensity. Support vector regression was adopted for the development of the surrogate models. Three types of uncertainties in the turbine power generation were also investigated: (i) the uncertainty in power generation with respect to the published/reported power curve, (ii) the uncertainty in power generation with respect to the estimated power response that accounts for only mean wind speed; and (iii) the uncertainty in power generation with respect to the estimated power response that accounts for both mean wind speed and turbulence intensity. Results show that (i) under the same wind conditions, the turbine generates different power between the in-wake and out-of-wake scenarios, (ii) a turbine generally produces more power under the in-wake scenario than under the out-of-wake scenario, (iii) the power generation is sensitive to turbulence intensity even when the wind speed is greater than the turbine rated speed, and (iv) there is relatively more uncertainty in the power generation under the in-wake scenario than under the out-of-wake scenario.
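
    The two surrogate models can be pictured with the short sketch below, which trains support vector regression models on synthetic wind speed, turbulence intensity, and power records. scikit-learn's SVR is used here as a stand-in; the kernel, hyperparameters, and data are assumptions for illustration and do not reproduce the plant monitoring analysis.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)

# Synthetic SCADA-like records: mean wind speed (m/s), turbulence intensity (-),
# and normalized power, standing in for one turbine in one wake scenario.
wind_speed = rng.uniform(3.0, 15.0, size=500)
turb_intensity = rng.uniform(0.05, 0.25, size=500)
power = (np.clip((wind_speed / 12.0) ** 3, 0, 1) * (1.0 - 0.5 * turb_intensity)
         + rng.normal(scale=0.02, size=500))

# Surrogate (i): power as a function of wind speed only.
model_ws = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.01))
model_ws.fit(wind_speed.reshape(-1, 1), power)

# Surrogate (ii): power as a function of wind speed and turbulence intensity.
X = np.column_stack([wind_speed, turb_intensity])
model_ws_ti = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.01))
model_ws_ti.fit(X, power)

print("Wind-speed-only surrogate at 10 m/s:         ",
      float(model_ws.predict([[10.0]])[0]))
print("Wind speed + TI surrogate at 10 m/s, TI=0.20:",
      float(model_ws_ti.predict([[10.0, 0.20]])[0]))
```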

  8. Risk in the mist? Deriving data to quantify microbial health risks associated with aerosol generation by water-efficient devices during typical domestic water-using activities.

    PubMed

    O'Toole, J; Keywood, M; Sinclair, M; Leder, K

    2009-01-01

    The aim of this study was to address existing data gaps and to determine the size distribution of aerosols associated with water-efficient devices during typical domestic activities. This information is important to assist in understanding infection spread during water-using activities and in designing water regulations. Three water-using scenarios were evaluated: i) showering using a water-efficient showerhead; ii) use of a high pressure spray unit for cleaning cars and iii) toilet flushing using a dual flush low volume flush device. For each scenario a control condition (conventional lower efficiency device) was selected for benchmarking purposes. Shower module results highlighted the complexity of particle generation and removal processes and showed that more than 90% of total particle mass in the breathing zone was attributed to particle diameters greater than 6 μm. Conversely, results for car washing experiments showed that particle diameters up to 6 μm constituted the major part of the total mass generated by both water-efficient and conventional devices. Even under worst-case scenario conditions for toilet flushing, particle measurements were at or below the level of detection of the measuring instrumentation. The data provide information that assists in health risk assessment and in determining future research directions, including methodological aspects.

  9. Prospects of light sterile neutrino oscillation and C P violation searches at the Fermilab Short Baseline Neutrino Facility

    NASA Astrophysics Data System (ADS)

    Cianci, D.; Furmanski, A.; Karagiorgi, G.; Ross-Lonergan, M.

    2017-09-01

    We investigate the ability of the short baseline neutrino (SBN) experimental program at Fermilab to test the globally-allowed (3+N) sterile neutrino oscillation parameter space. We explicitly consider the globally-allowed parameter space for the (3+1), (3+2), and (3+3) sterile neutrino oscillation scenarios. We find that SBN can probe with 5σ sensitivity more than 85%, 95% and 55% of the parameter space currently allowed at 99% confidence level for the (3+1), (3+2) and (3+3) scenarios, respectively, with the (3+N) allowed space used in these studies closely resembling that of previous studies [J. M. Conrad, C. M. Ignarra, G. Karagiorgi, M. H. Shaevitz, and J. Spitz, Adv. High Energy Phys. 2013, 1 (2013), 10.1155/2013/163897], calculated using the same methodology. In the case of the (3+2) and (3+3) scenarios, CP-violating phases appear in the oscillation probability terms, leading to observable differences in the appearance probabilities of neutrinos and antineutrinos. We explore SBN's sensitivity to those phases for the (3+2) scenario through the currently planned neutrino beam running, and investigate potential improvements through additional antineutrino beam running. We show that, if antineutrino exposure is considered, for maximal values of the (3+2) CP-violating phase ϕ54, SBN could be the first experiment to directly observe ~2σ hints of CP violation associated with an extended lepton sector.
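
    For orientation, the simplest case discussed above, the (3+1) scenario, has the standard two-flavor short-baseline appearance probability written below; no CP-violating phase survives in it, which is why the CP studies require the (3+2) and (3+3) scenarios. This is the textbook expression, not a result taken from the paper.

```latex
% Short-baseline (3+1) nu_mu -> nu_e appearance probability (standard form)
P_{\nu_\mu \rightarrow \nu_e} \simeq
  4\,\lvert U_{e4}\rvert^{2}\lvert U_{\mu 4}\rvert^{2}
  \sin^{2}\!\left(\frac{1.27\,\Delta m^{2}_{41}\,[\mathrm{eV}^{2}]\,L\,[\mathrm{km}]}{E\,[\mathrm{GeV}]}\right)
  \equiv \sin^{2}2\theta_{\mu e}\,
  \sin^{2}\!\left(\frac{1.27\,\Delta m^{2}_{41}\,L}{E}\right)
```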

  10. Mapping Oil and Gas Development Potential in the US Intermountain West and Estimating Impacts to Species

    PubMed Central

    Copeland, Holly E.; Doherty, Kevin E.; Naugle, David E.; Pocewicz, Amy; Kiesecker, Joseph M.

    2009-01-01

    Background Many studies have quantified the indirect effect of hydrocarbon-based economies on climate change and biodiversity, concluding that a significant proportion of species will be threatened with extinction. However, few studies have measured the direct effect of new energy production infrastructure on species persistence. Methodology/Principal Findings We propose a systematic way to forecast patterns of future energy development and calculate impacts to species using spatially-explicit predictive modeling techniques to estimate oil and gas potential and create development build-out scenarios by seeding the landscape with oil and gas wells based on underlying potential. We illustrate our approach for the greater sage-grouse (Centrocercus urophasianus) in the western US and translate the build-out scenarios into estimated impacts on sage-grouse. We project that future oil and gas development will cause a 7–19 percent decline from 2007 sage-grouse lek population counts and impact 3.7 million ha of sagebrush shrublands and 1.1 million ha of grasslands in the study area. Conclusions/Significance Maps of where oil and gas development is anticipated in the US Intermountain West can be used by decision-makers intent on minimizing impacts to sage-grouse. This analysis also provides a general framework for using predictive models and build-out scenarios to anticipate impacts to species. These predictive models and build-out scenarios allow tradeoffs to be considered between species conservation and energy development prior to implementation. PMID:19826472
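
    The build-out logic of seeding the landscape with wells in proportion to underlying development potential can be sketched as a weighted random draw over grid cells. The example below uses an invented potential surface and habitat mask purely to show the mechanics; it is not the published model or its data.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic oil and gas development-potential surface on a 50 x 50 grid
# (values in [0, 1]); a real surface would come from a predictive model.
potential = rng.random((50, 50)) ** 2

def seed_wells(potential, n_wells, rng):
    """Draw well locations with probability proportional to potential."""
    probs = potential.ravel() / potential.sum()
    cells = rng.choice(potential.size, size=n_wells, replace=False, p=probs)
    return np.unravel_index(cells, potential.shape)

rows, cols = seed_wells(potential, n_wells=200, rng=rng)

# Simple impact proxy: count wells falling inside a hypothetical habitat mask.
habitat = np.zeros_like(potential, dtype=bool)
habitat[10:30, 10:30] = True
print("Wells inside habitat:", int(habitat[rows, cols].sum()))
```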

  11. Jet aircraft engine exhaust emissions database development: Year 1990 and 2015 scenarios

    NASA Technical Reports Server (NTRS)

    Landau, Z. Harry; Metwally, Munir; Vanalstyne, Richard; Ward, Clay A.

    1994-01-01

    Studies relating to environmental emissions associated with the High Speed Civil Transport (HSCT) military jet and charter jet aircraft were conducted by McDonnell Douglas Aerospace Transport Aircraft. The report includes engine emission results for baseline 1990 charter and military scenario and the projected jet engine emissions results for a 2015 scenario for a Mach 1.6 HSCT charter and military fleet. Discussions of the methodology used in formulating these databases are provided.

  12. A changing climate: impacts on human exposures to O3 using an integrated modeling methodology

    EPA Science Inventory

    Predicting the impacts of changing climate on human exposure to air pollution requires future scenarios that account for changes in ambient pollutant concentrations, population sizes and distributions, and housing stocks. An integrated methodology to model changes in human exposu...

  13. The Effects of Scenario Planning on Participant Reports of Resilience

    ERIC Educational Resources Information Center

    Chermack, Thomas J.; Coons, Laura M.; O'barr, Gregory; Khatami, Shiva

    2017-01-01

    Purpose: The purpose of this research is to examine the effects of scenario planning on participant ratings of resilience. Design/methodology/approach: The research design is a quasi-experimental pretest/posttest with treatment and control groups. Random selection or assignment was not achieved. Findings: Results show a significant difference in…

  14. Teacher Education in Portugal: Analysing Changes using the ATEE-RDC19 Scenario Methodology.

    ERIC Educational Resources Information Center

    Sousa, Jesus Maria

    2003-01-01

    Reviews the development of teacher education in Portugal since the 1974 revolution, which brought the country to democracy. Using the Association for Teacher Education in Europe's scenario model, the paper describes the hidden philosophies underlying changes that are occurring and shows how teacher education has evolved from a romantic, idealistic…

  15. Designing Peace and Conflict Exercises: Level of Analysis, Scenario, and Role Specification

    ERIC Educational Resources Information Center

    Bartels, Elizabeth; McCown, Margaret; Wilkie, Timothy

    2013-01-01

    Attentiveness to and transparency about the methodological implications of the level of analysis selected for peace and conflict exercises constitute essential elements of good game design. The article explores the impact of level of analysis choices in the context of two key portions of exercises, scenario construction and role specification. It…

  16. Wikis in Collaborative Educational Scenarios: Integrated in LMS or Standalone Wikis?

    ERIC Educational Resources Information Center

    Forment, Marc Alier; De Pedro, Xavier; Casan, Maria Jose; Piguillem, Jordi; Galanis, Nikolas

    2012-01-01

    This article outlines a set of features that wiki engines require to successfully host collaborative educational scenarios. The authors explore multiple issues that deal with the use of wikis in learning activities. One of the first issues to solve is software support for assessment methodologies. The second is choosing between using an integrated…

  17. Meaningful Intuitions: The Evidential Role of Intuitions in the Study of Language

    ERIC Educational Resources Information Center

    Maynes, Jeffrey

    2012-01-01

    Philosophical theories are often repudiated, or taken to be repudiated, by identifying counter-examples. These counter-examples are typically based upon our natural response to a real or hypothetical scenario, also called our intuition about the scenario. This methodology of appealing to intuition has been the focus of recent debates about…

  18. Trade-Off Analysis Report

    NASA Technical Reports Server (NTRS)

    Dhas, Chris

    2000-01-01

    NASA's Glenn Research Center (GRC) defines and develops advanced technology for high priority national needs in communications technologies for application to aeronautics and space. GRC tasked Computer Networks and Software Inc. (CNS) to examine protocols and architectures for an In-Space Internet Node. CNS has developed a methodology for network reference models to support NASA's four mission areas: Earth Science, Space Science, Human Exploration and Development of Space (HEDS), and Aerospace Technology. CNS previously developed a report which applied the methodology to three space Internet-based communications scenarios for future missions. CNS conceptualized, designed, and developed space Internet-based communications protocols and architectures for each of the independent scenarios. GRC selected for further analysis the scenario that involved unicast communications between a Low-Earth-Orbit (LEO) International Space Station (ISS) and a ground terminal Internet node via a Tracking and Data Relay Satellite (TDRS) transfer. This report contains a tradeoff analysis on the selected scenario. The analysis examines the performance characteristics of the various protocols and architectures. The tradeoff analysis incorporates the results of a CNS-developed analytical model that examined performance parameters.

  19. Methodology for Generating Conflict Scenarios by Time Shifting Recorded Traffic Data

    NASA Technical Reports Server (NTRS)

    Paglione, Mike; Oaks, Robert; Bilimoria, Karl D.

    2003-01-01

    A methodology is presented for generating conflict scenarios that can be used as test cases to estimate the operational performance of a conflict probe. Recorded air traffic data is time shifted to create traffic scenarios featuring conflicts with characteristic properties similar to those encountered in typical air traffic operations. First, a reference set of conflicts is obtained from trajectories that are computed using birth points and nominal flight plans extracted from recorded traffic data. Distributions are obtained for several primary properties (e.g., encounter angle) that are most likely to affect the performance of a conflict probe. A genetic algorithm is then utilized to determine the values of time shifts for the recorded track data so that the primary properties of conflicts generated by the time shifted data match those of the reference set. This methodology is successfully demonstrated using recorded traffic data for the Memphis Air Route Traffic Control Center; a key result is that the required time shifts are less than 5 min for 99% of the tracks. It is also observed that close matching of the primary properties used in this study additionally provides a good match for some other secondary properties.
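
    A toy version of the genetic-algorithm search for per-track time shifts is sketched below. The encoding, fitness function, and parameters are placeholder assumptions: a real implementation would score a candidate set of shifts by comparing the induced conflict-property distributions (encounter angle, and so on) against the reference set, rather than matching a single summary statistic.

```python
import random

random.seed(42)

N_TRACKS = 40            # number of recorded tracks to shift
MAX_SHIFT = 300          # maximum shift in seconds (< 5 min, per the paper's finding)
TARGET_MEAN_SHIFT = 120  # toy stand-in for "match reference conflict properties"

def fitness(individual):
    """Toy objective: closeness of the shift distribution to a reference
    statistic (higher is better)."""
    mean_shift = sum(individual) / len(individual)
    return -abs(mean_shift - TARGET_MEAN_SHIFT)

def random_individual():
    return [random.uniform(0, MAX_SHIFT) for _ in range(N_TRACKS)]

def crossover(a, b):
    cut = random.randrange(1, N_TRACKS)
    return a[:cut] + b[cut:]

def mutate(ind, rate=0.1):
    return [random.uniform(0, MAX_SHIFT) if random.random() < rate else g for g in ind]

population = [random_individual() for _ in range(50)]
for generation in range(100):
    population.sort(key=fitness, reverse=True)
    parents = population[:10]                      # elitist selection
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(40)]
    population = parents + children

best = max(population, key=fitness)
print("Best fitness:", round(fitness(best), 2))
```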

  20. Appendix 2. Guide for Running AgMIP Climate Scenario Generation Tools with R in Windows, Version 2.3

    NASA Technical Reports Server (NTRS)

    Hudson, Nicholas; Ruane, Alexander Clark

    2013-01-01

    This Guide explains how to create climate series and climate change scenarios by using the AgMIP Climate team's methodology as outlined in the AgMIP Guide for Regional Assessment: Handbook of Methods and Procedures. It details how to: install R and the required packages to run the AgMIP Climate Scenario Generation scripts, and create climate scenarios from CMIP5 GCMs using a 30-year baseline daily weather dataset. The Guide also outlines a workflow that can be modified for application to your own climate data.

  1. Multi-frequency local wavenumber analysis and ply correlation of delamination damage.

    PubMed

    Juarez, Peter D; Leckey, Cara A C

    2015-09-01

    Wavenumber domain analysis through use of scanning laser Doppler vibrometry has been shown to be effective for non-contact inspection of damage in composites. Qualitative and semi-quantitative local wavenumber analysis of realistic delamination damage and quantitative analysis of idealized damage scenarios (Teflon inserts) have been performed previously in the literature. This paper presents a new methodology based on multi-frequency local wavenumber analysis for quantitative assessment of multi-ply delamination damage in carbon fiber reinforced polymer (CFRP) composite specimens. The methodology is presented and applied to a real world damage scenario (impact damage in an aerospace CFRP composite). The methodology yields delamination size and also correlates local wavenumber results from multiple excitation frequencies to theoretical dispersion curves in order to robustly determine the delamination ply depth. Results from the wavenumber based technique are validated against a traditional nondestructive evaluation method. Published by Elsevier B.V.
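
    The core step of estimating a dominant wavenumber in a small spatial window can be illustrated with a windowed 2D FFT, as in the sketch below. The wavefield is synthetic and essentially one-dimensional, and the window size, sampling, and wavenumbers are invented, so this only hints at the multi-frequency, dispersion-curve-correlating analysis described above.

```python
import numpy as np

# Synthetic single-frequency wavefield snapshot on a 128 x 128 mm scan grid,
# with a higher wavenumber inside a "delaminated" square (thinner sub-laminate).
nx = ny = 128
dx = 1e-3                              # 1 mm spatial sampling
x = np.arange(nx) * dx
y = np.arange(ny) * dx
X, Y = np.meshgrid(x, y, indexing="ij")
k_pristine, k_delam = 200.0, 500.0     # rad/m, illustrative values
k_map_true = np.where((X > 0.04) & (X < 0.08) & (Y > 0.04) & (Y < 0.08),
                      k_delam, k_pristine)
field = np.sin(np.cumsum(k_map_true * dx, axis=0))   # crude phase accumulation in x

def local_wavenumber(field, ix, iy, win=32, dx=1e-3):
    """Dominant wavenumber magnitude from a Hann-windowed 2D FFT patch."""
    patch = field[ix:ix + win, iy:iy + win] * np.outer(np.hanning(win), np.hanning(win))
    spec = np.abs(np.fft.fftshift(np.fft.fft2(patch)))
    k = 2 * np.pi * np.fft.fftshift(np.fft.fftfreq(win, d=dx))
    KX, KY = np.meshgrid(k, k, indexing="ij")
    spec[win // 2, win // 2] = 0.0                    # suppress the DC bin
    i, j = np.unravel_index(np.argmax(spec), spec.shape)
    return np.hypot(KX[i, j], KY[i, j])

print("Estimated k outside damage:", round(local_wavenumber(field, 4, 48), 1), "rad/m")
print("Estimated k inside damage: ", round(local_wavenumber(field, 56, 48), 1), "rad/m")
```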

  2. 3D and 4D Simulations for Landscape Reconstruction and Damage Scenarios: GIS Pilot Applications

    ERIC Educational Resources Information Center

    Pesaresi, Cristano; Van Der Schee, Joop; Pavia, Davide

    2017-01-01

    The project "3D and 4D Simulations for Landscape Reconstruction and Damage Scenarios: GIS Pilot Applications" has been devised with the intention to deal with the demand for research, innovation and applicative methodology on the part of the international programme, requiring concrete results to increase the capacity to know, anticipate…

  3. Designing Bilingual Scenarios to Promote English Language Learning at a Public School in Monteria

    ERIC Educational Resources Information Center

    Romero, Yanilis; Manjarres, Milton Pájaro

    2016-01-01

    This research study examines the assumptions of creating bilingual scenarios to promote English language learning for 384 students of ninth, tenth and eleventh grade of a public school in Monteria Colombia. An action research methodology was carried out in this study. The findings of this research suggested that the creation of bilingual scenarios…

  4. Operational Group Sandy technical progress report

    USGS Publications Warehouse

    ,

    2013-01-01

    This report documents results from the March 2013 deployment of the OGS. It includes background information on Hurricane Sandy and the federal response; the OGS methodology; scenarios for Hurricane Sandy’s impact on coastal communities and urban ecosystems; potential interventions to improve regional resilience to future major storms; a discussion of scenario results; and lessons learned about the OGS process.

  5. Using scenarios and personas to enhance the effectiveness of heuristic usability evaluations for older adults and their care team.

    PubMed

    Kneale, Laura; Mikles, Sean; Choi, Yong K; Thompson, Hilaire; Demiris, George

    2017-09-01

    Using heuristics to evaluate user experience is a common methodology for human-computer interaction studies. One challenge of this method is the inability to tailor results towards specific end-user needs. This manuscript reports on a method that uses validated scenarios and personas of older adults and care team members to enhance heuristics evaluations of the usability of commercially available personal health records for homebound older adults. Our work extends the Chisnell and Redish heuristic evaluation methodology by using a protocol that relies on multiple expert reviews of each system. It further standardizes the heuristic evaluation process through the incorporation of task-based scenarios. We were able to use the modified version of the Chisnell and Redish heuristic evaluation methodology to identify potential usability challenges of two commercially available personal health record systems. This allowed us to: (1) identify potential usability challenges for specific types of users, (2) describe improvements that would be valuable to all end-users of the system, and (3) better understand how the interactions of different users may vary within a single personal health record. The methodology described in this paper may help designers of consumer health information technology tools, such as personal health records, understand the needs of diverse end-user populations. Such methods may be particularly helpful when designing systems for populations that are difficult to recruit for end-user evaluations through traditional methods. Copyright © 2017 Elsevier Inc. All rights reserved.

  6. A GIS-based methodology for the estimation of potential volcanic damage and its application to Tenerife Island, Spain

    NASA Astrophysics Data System (ADS)

    Scaini, C.; Felpeto, A.; Martí, J.; Carniel, R.

    2014-05-01

    This paper presents a GIS-based methodology to estimate damages produced by volcanic eruptions. The methodology consists of four parts: definition and simulation of eruptive scenarios, exposure analysis, vulnerability assessment and estimation of expected damages. Multi-hazard eruptive scenarios are defined for the Teide-Pico Viejo active volcanic complex, and simulated through the VORIS tool. The exposure analysis identifies the elements exposed to the hazard at stake and focuses on the relevant assets for the study area. The vulnerability analysis is based on previous studies on the built environment and complemented with the analysis of transportation and urban infrastructures. Damage assessment is performed by associating a qualitative damage rating to each combination of hazard and vulnerability. This operation consists of a GIS-based overlay, performed for each hazardous phenomenon considered and for each element. The methodology is then automated into a GIS-based tool using an ArcGIS® program. Given the eruptive scenarios and the characteristics of the exposed elements, the tool produces expected damage maps. The tool is applied to the Icod Valley (North of Tenerife Island), which is likely to be affected by volcanic phenomena in case of eruption from both the Teide-Pico Viejo volcanic complex and the North-West basaltic rift. Results are thematic maps of vulnerability and damage that can be displayed at different levels of detail, depending on user preferences. The aim of the tool is to facilitate territorial planning and risk management in active volcanic areas.
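
    The damage-assessment step, associating a qualitative rating with each hazard-vulnerability combination, amounts to a matrix lookup applied cell by cell in the GIS overlay. The classes and ratings in the sketch below are invented for illustration and are not the ratings used for Tenerife.

```python
import numpy as np

# Illustrative qualitative damage matrix: rows = hazard class (0=none..3=high),
# columns = vulnerability class (0=low..2=high). Entries are damage ratings.
DAMAGE_MATRIX = np.array([
    ["none",     "none",     "none"],
    ["low",      "low",      "moderate"],
    ["low",      "moderate", "high"],
    ["moderate", "high",     "high"],
])

def expected_damage(hazard_class, vulnerability_class):
    """GIS-style overlay lookup for one raster cell or exposed element."""
    return DAMAGE_MATRIX[hazard_class, vulnerability_class]

# Example: a cell with the highest hazard class (3) containing a highly
# vulnerable building (vulnerability class 2).
print(expected_damage(3, 2))
```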

  7. Illustrative national scale scenarios of environmental and human health impacts of Carbon Capture and Storage.

    PubMed

    Tzanidakis, Konstantinos; Oxley, Tim; Cockerill, Tim; ApSimon, Helen

    2013-06-01

    Integrated Assessment, and the development of strategies to reduce the impacts of air pollution, has tended to focus only upon the direct emissions from different sources, with the indirect emissions associated with the full life-cycle of a technology often overlooked. Carbon Capture and Storage (CCS) reflects a number of new technologies designed to reduce CO2 emissions, but which may have much broader environmental implications than greenhouse gas emissions. This paper considers a wider range of pollutants from a full life-cycle perspective, illustrating a methodology for assessing environmental impacts using source-apportioned effects based impact factors calculated by the national scale UK Integrated Assessment Model (UKIAM). Contrasting illustrative scenarios for the deployment of CCS towards 2050 are presented which compare the life-cycle effects of air pollutant emissions upon human health and ecosystems of business-as-usual, deployment of CCS and widespread uptake of IGCC for power generation. Together with estimation of the transboundary impacts we discuss the benefits of an effects based approach to such assessments in relation to emissions based techniques. Copyright © 2013 Elsevier Ltd. All rights reserved.

  8. Solar thermal technology development: Estimated market size and energy cost savings. Volume 2: Assumptions, methodology and results

    NASA Astrophysics Data System (ADS)

    Gates, W. R.

    1983-02-01

    Estimated future energy cost savings associated with the development of cost-competitive solar thermal technologies (STT) are discussed. Analysis is restricted to STT in electric applications for 16 high-insolation/high-energy-price states. Three fuel price scenarios and three 1990 STT system costs are considered, reflecting uncertainty over future fuel prices and STT cost projections. Solar thermal technology research and development (R&D) is found to be unacceptably risky for private industry in the absence of federal support. Energy cost savings were projected to range from $0 to $10 billion (1990 values in 1981 dollars), depending on the system cost and fuel price scenario. Normal R&D investment risks are accentuated because the Organization of Petroleum Exporting Countries (OPEC) cartel can artificially manipulate oil prices and undercut growth of alternative energy sources. Federal participation in STT R&D to help capture the potential benefits of developing cost-competitive STT was found to be in the national interest. Analysis is also provided regarding two federal incentives currently in use: The Federal Business Energy Tax Credit and direct R&D funding.

  9. Solar thermal technology development: Estimated market size and energy cost savings. Volume 2: Assumptions, methodology and results

    NASA Technical Reports Server (NTRS)

    Gates, W. R.

    1983-01-01

    Estimated future energy cost savings associated with the development of cost-competitive solar thermal technologies (STT) are discussed. Analysis is restricted to STT in electric applications for 16 high-insolation/high-energy-price states. Three fuel price scenarios and three 1990 STT system costs are considered, reflecting uncertainty over future fuel prices and STT cost projections. Solar thermal technology research and development (R&D) is found to be unacceptably risky for private industry in the absence of federal support. Energy cost savings were projected to range from $0 to $10 billion (1990 values in 1981 dollars), depending on the system cost and fuel price scenario. Normal R&D investment risks are accentuated because the Organization of Petroleum Exporting Countries (OPEC) cartel can artificially manipulate oil prices and undercut growth of alternative energy sources. Federal participation in STT R&D to help capture the potential benefits of developing cost-competitive STT was found to be in the national interest. Analysis is also provided regarding two federal incentives currently in use: The Federal Business Energy Tax Credit and direct R&D funding.

  10. Exploring consensus in 21st century projections of climatically suitable areas for African vertebrates

    PubMed Central

    Garcia, Raquel A; Burgess, Neil D; Cabeza, Mar; Rahbek, Carsten; Araújo, Miguel B

    2012-01-01

    Africa is predicted to be highly vulnerable to 21st century climatic changes. Assessing the impacts of these changes on Africa's biodiversity is, however, plagued by uncertainties, and markedly different results can be obtained from alternative bioclimatic envelope models or future climate projections. Using an ensemble forecasting framework, we examine projections of future shifts in climatic suitability, and their methodological uncertainties, for over 2500 species of mammals, birds, amphibians and snakes in sub-Saharan Africa. To summarize a priori the variability in the ensemble of 17 general circulation models, we introduce a consensus methodology that combines co-varying models. Thus, we quantify and map the relative contribution to uncertainty of seven bioclimatic envelope models, three multi-model climate projections and three emissions scenarios, and explore the resulting variability in species turnover estimates. We show that bioclimatic envelope models contribute most to variability, particularly in projected novel climatic conditions over Sahelian and southern Saharan Africa. To summarize agreements among projections from the bioclimatic envelope models we compare five consensus methodologies, which generally increase or retain projection accuracy and provide consistent estimates of species turnover. Variability from emissions scenarios increases towards late-century and affects southern regions of high species turnover centred in arid Namibia. Twofold differences in median species turnover across the study area emerge among alternative climate projections and emissions scenarios. Our ensemble of projections underscores the potential bias when using a single algorithm or climate projection for Africa, and provides a cautious first approximation of the potential exposure of sub-Saharan African vertebrates to climatic changes. The future use and further development of bioclimatic envelope modelling will hinge on the interpretation of results in the light of methodological as well as biological uncertainties. Here, we provide a framework to address methodological uncertainties and contextualize results.
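
    Species turnover between baseline and projected climatic suitability is typically summarised per grid cell from gains and losses across the species stack. The sketch below uses the common formulation 100·(gains + losses)/(baseline richness + gains) on synthetic presence-absence data; it is an assumption that this matches the exact turnover metric used in the study.

```python
import numpy as np

def species_turnover(present, future):
    """Per-cell species turnover (%) between baseline and projected
    presence/absence stacks of shape (n_species, n_cells):
    100 * (gains + losses) / (baseline richness + gains)."""
    gains = ((future == 1) & (present == 0)).sum(axis=0)
    losses = ((future == 0) & (present == 1)).sum(axis=0)
    richness = present.sum(axis=0)
    with np.errstate(divide="ignore", invalid="ignore"):
        turnover = 100.0 * (gains + losses) / (richness + gains)
    return np.nan_to_num(turnover)

rng = np.random.default_rng(2)
present = rng.integers(0, 2, size=(2500, 10))   # 2500 species, 10 grid cells
future = present.copy()
flip = rng.random(present.shape) < 0.1          # 10% of suitabilities change
future[flip] = 1 - future[flip]
print(np.round(species_turnover(present, future), 1))
```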

  11. The Demand for Scientific and Technical Manpower in Selected Energy-Related Industries, 1970-85: A Methodology Applied to a Selected Scenario of Energy Output. A Summary.

    ERIC Educational Resources Information Center

    Gutmanis, Ivars; And Others

    The primary purpose of the study was to develop and apply a methodology for estimating the need for scientists and engineers by specialty in energy and energy-related industries. The projections methodology was based on the Case 1 estimates by the National Petroleum Council of the results of "maximum efforts" to develop domestic fuel sources by…

  12. Testing coastal DRR in current and climate change scenarios - Artificial winter dune system in a highly touristic beach in the Northern Adriatic.

    NASA Astrophysics Data System (ADS)

    Duo, Enrico; Armaroli, Clara

    2017-04-01

    Artificial dunes are common features built along the coast of the Emilia-Romagna region (Italy) that act as temporary protections during the stormy season in order to prevent damage and inundation to the structures located on the backshore. The RER coast is in fact characterised by low sandy beaches that are exploited for tourism and where beach huts are permanently present on the rear part of the beach. While scientists and regional managers have already provided proof of the capacity of the artificial dunes to lower the hazard component, no study has investigated their direct impacts under the current (CS) and climate change scenarios (CCS). The RISC-KIT project (www.risckit.eu) provided a methodology for testing DRRs at the local level, integrating hydro-morphological numerical modelling with a Bayesian Network to assess the consequences of extreme events for different scenarios. The approach was applied at the beach of Lido degli Estensi and Spina (Comacchio, Italy) on the Emilia-Romagna coast. It is a highly touristic area with concessions directly facing the sea, providing sun-and-beach tourism services during the summer, and private residences, commercial activities and hotels at the seafront. The flooding and erosion hazards were analyzed, along with their impacts. A 2DH XBeach model was built and forced with a large number of triangular storms, representative of different combinations of wave and total water level ranges observed at the regional level. Flooding and erosion results were input into a Bayesian Network which included, as input variable categories, deep-water boundary conditions (including the CCS trigger), receptors (type and location of assets at the coast), hazard intensity affecting the receptors, impacts and DRR. Therefore, it was possible to integrate a flood damage curve and an erosion potential damage function for the analyzed receptors (beach concessions and residential/commercial buildings), in order to calculate the direct impacts. The artificial dune system was implemented, as representative of the DRR scenario, by modifying the topography through the DuneMaker 2.0 Matlab tool. The CCS was implemented through a predicted relative sea level rise (RSLR) under RCP8.5 for 2050. The results showed that the DRR positively influenced both the flooding and erosion hazard intensity distributions. The impacts for the CS showed that, potentially: 20% of residential and commercial buildings and 90% of concessions will be preserved from flood impacts; more than 50% of concessions will be preserved from erosion impacts. The impacts for the CCS showed that, potentially: 65% of residential and commercial buildings and 95% of concessions will be preserved from flood impacts; more than 30% of concessions will be preserved from erosion. The positive effect of the artificial dunes on extreme coastal storm impacts was thus demonstrated and quantified, in comparison with current and climate change scenarios without any DRR implemented. Ongoing studies on the artificial winter dunes, comparing field drone observations and numerical modelling, have been under way since October 2016. Moreover, the methodology, if properly adapted, can be applied to any type of DRR, as demonstrated by the RISC-KIT project, and can help managers compare DRR solutions or strategic alternatives.

  13. USE OF CATEGORICAL REGRESSION IN THE DEFINITION OF THE DURATION/CONCENTRATION CURVE IN THE U.S. EPA'S ACUTE REFERENCE EXPOSURE (ARE) METHODOLOGY

    EPA Science Inventory

    The U.S. EPA's current draft ARE methodology offers three different approaches for derivation of health effects values for various chemicals and agents under inhalation exposure scenarios of < 24 hrs. These approaches, the NOAEL, benchmark concentration (BMC), and categorical ...

  14. Alice’s Delirium: A Theatre-based Simulation Scenario for Nursing

    PubMed Central

    Posner, Glenn D

    2018-01-01

    As an educational methodology, simulation has been used in nursing education at the academic level for many years and has started to gain traction in the onboarding education and professional development of practicing nurses. Simulation allows the learner to apply knowledge and skills in a safe environment where mistakes and learning can happen without an impact on patient safety. The development of a simulation scenario to demonstrate the benefits of simulation education methodologies to a large group of nurse educators was requested by nursing education leadership at The Ottawa Hospital (TOH). Since the demonstration of this scenario in the fall of 2016, there has been significant uptake and adaptation of this particular scenario within the nursing education departments of TOH. Originally written to be used with a simulated patient (SP), "Alice" has since been adapted for use with a high-fidelity manikin within an inpatient surgery department continuing professional development (CPD) program for practicing nurses, for orientation of nurses to a level 2 trauma unit, and at the corporate level of nursing orientation using an SP. Therefore, this scenario is applicable to nurses practicing in an area of inpatient surgery at varying levels, from novice to expert. It could easily be adapted for use in medicine nursing education programs. The case presented in this technical report is of the simulation scenario used for the inpatient surgery CPD program. Varying adaptations of the case are included in the appendices. PMID:29872592

  15. Regional Risk Assessment for the analysis of the risks related to storm surge extreme events in the coastal area of the North Adriatic Sea.

    NASA Astrophysics Data System (ADS)

    Rizzi, Jonathan; Torresan, Silvia; Gallina, Valentina; Critto, Andrea; Marcomini, Antonio

    2013-04-01

    Europe's coast faces a variety of climate change threats from extreme high tides, storm surges and rising sea levels. In particular, it is very likely that mean sea level rise will contribute to upward trends in extreme coastal high water levels, thus posing higher risks to coastal locations currently experiencing coastal erosion and inundation processes. In 2007 the European Commission approved the Flood Directive (2007/60/EC), whose main purpose is to establish a framework for the assessment and management of flood risks for inland and coastal areas, thus reducing the adverse consequences for human health, the environment, cultural heritage and economic activities. Improvements in scientific understanding are thus needed to inform decision-making about the best strategies for mitigating and managing storm surge risks in coastal areas. The CLIMDAT project is aimed at improving the understanding of the risks related to extreme storm surge events in the coastal area of the North Adriatic Sea (Italy), considering potential climate change scenarios. The project implements a Regional Risk Assessment (RRA) methodology developed in the FP7 KULTURisk project for the assessment of physical/environmental impacts posed by flood hazards and employs the DEcision support SYstem for Coastal climate change impact assessment (DESYCO) for the application of the methodology to the case study area. The proposed RRA methodology is aimed at the identification and prioritization of targets and areas at risk from water-related natural hazards in the considered region at the meso-scale. To this aim, it integrates information about extreme storm surges with bio-geophysical and socio-economic information (e.g. vegetation cover, slope, soil type, population density) of the analyzed receptors (i.e. people, economic activities, cultural heritage, natural and semi-natural systems). Extreme storm surge hazard scenarios are defined using tide gauge time series coming from 28 tide gauge stations located in the North Adriatic coastal areas from 1989 to 2011. These data, together with the sea-level rise scenarios for the considered future timeframe, represent the input for the application of the Joint Probability method (Pugh and Vassie, 1979), which allows the evaluation of the maximum height of extreme storm surge events with different return periods and the number of extreme events per year. The methodology uses Geographic Information Systems to manage, process, analyse, and visualize data and employs Multi-Criteria Decision Analysis to integrate stakeholders' preferences and experts' judgments into the analysis in order to obtain a total risk index in the considered region. The final outputs are GIS-based risk maps which allow the communication of the potential consequences of extreme storm surges to decision makers and stakeholders. Moreover, they can support the establishment of relative priorities for intervention through the identification of suitable areas for human settlements, infrastructures and economic activities. Finally, the produced output can represent a basis for the definition of storm surge hazard and storm surge risk management plans according to the Floods Directive. The preliminary results of the RRA application in the CLIMDAT project will be presented and discussed here.
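    The Joint Probability method cited above rests on the idea that, if tide and non-tidal residual (surge) are treated as independent, the distribution of the total water level is the convolution of their individual distributions, from which levels rarer than the observed record can be estimated. The sketch below is a simplified numerical illustration of that idea on synthetic stand-in series, not the exact formulation used in the project.

    ```python
    import numpy as np

    # Simplified illustration of the joint probability idea (Pugh and Vassie, 1979)
    # on synthetic tide and surge series; gauge data would be used in practice.
    rng = np.random.default_rng(0)
    n_hours = 200_000
    tide = 0.4 * np.sin(2 * np.pi * np.arange(n_hours) / 12.42)   # semi-diurnal tide [m]
    surge = rng.gamma(shape=2.0, scale=0.05, size=n_hours)        # non-tidal residual [m]

    edges = np.linspace(-1.0, 2.0, 301)
    dz = edges[1] - edges[0]
    p_tide, _ = np.histogram(tide, bins=edges, density=True)
    p_surge, _ = np.histogram(surge, bins=edges, density=True)

    # Density of tide + surge = convolution of the two densities (independence assumed)
    p_total = np.convolve(p_tide, p_surge) * dz
    total_levels = 2 * edges[0] + dz * (np.arange(p_total.size) + 1)   # approximate axis
    exceedance = 1.0 - np.cumsum(p_total) * dz                          # P(level > z) per hour

    hours_per_year = 365.25 * 24
    target = 1.0 / (100 * hours_per_year)            # hourly exceedance prob. of a 100-yr event
    idx = np.argmax(exceedance <= target)
    print(f"Approximate 100-year total water level: {total_levels[idx]:.2f} m")
    ```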

  16. Numerical 3D modelling of oil dispersion in the sea due to different accident scenarios

    NASA Astrophysics Data System (ADS)

    Guandalini, Roberto; Agate, Giordano; Moia, Fabio

    2017-04-01

    The purpose of the study has been the development of a methodology, based on a numerical 3D approach, for the analysis of oil dispersion in the sea, in order to simulate with a high level of accuracy the dynamic behavior of the oil plume and its displacement in the environment. Numerical simulation is, in fact, the only approach currently able to analyse in detail possible accident scenarios, even of a high degree of complexity and of different type and intensity, allowing their evolution to be followed in both time and space and the effectiveness of suggested prevention or recovery actions to be evaluated. The software for these calculations is therefore an essential tool for simulating the impact effects in the short, medium and long term, and must account for the complexity of the sea system involved in the dispersion process and its dependency on the local meteorological, marine and morphological conditions. Such software, generally based on 3D fluid dynamic simulators and modellers, is highly specialized and requires expertise for appropriate usage, but at the same time it allows detailed scenario analyses and design verifications. It takes into account parameters such as the sea current field and its turbulence, the wind acting on the sea surface, the salinity and temperature gradients, the local coastal morphology, the seabed bathymetry and the tide. The applied methodology is based on the Integrated Fluid Dynamic Simulation System HyperSuite developed by RSE. This simulation system accounts for all the parameters listed above in the frame of a 3D Eulerian finite element fluid dynamic model, whose accuracy is ensured by a very detailed spatial mesh and by automatically optimized time step management. In order to assess the features of the methodology, an area of more than 2500 km2, with a depth of 200 m, located in the middle Adriatic Sea has been modelled. The information required for the simulation under different environmental conditions has been collected from RSE proprietary and public databases directly connected to the model. Finally, the possible pollution source has been placed at the offshore drilling wells for the exploitation of the "Ombrina Mare" oil field, located 6 km from the coast; the project includes an FPSO unit. A number of different scenarios have been simulated using the 3D model created with HyperSuite, under different environmental conditions and considering emission events of low intensity and long duration or of high intensity and short duration, located near the sea surface or near the sea bottom. For each scenario, a preliminary initialization of the unperturbed fluid dynamic conditions at the starting date has been carried out, from which the emission period, followed by an appropriate diffusion period of the pollutant, has been simulated. The results made it possible to evaluate the relevance of the effects due to environmental parameters such as wind, sea current and tide, highlighting the capability of the methodology to support the safety requirements of offshore oil exploitation, provided that a dynamic characterization of the environmental parameters is accounted for in sufficient detail.
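    HyperSuite is a proprietary RSE system, so its internals cannot be reproduced here. As a generic illustration of how oil-spill models typically combine a mean current, surface wind drift and turbulent diffusion, the sketch below uses a simple 2D Lagrangian particle-tracking scheme with random-walk diffusion; all parameter values are hypothetical.

    ```python
    import numpy as np

    # Generic 2D Lagrangian particle-tracking sketch (not the RSE HyperSuite model):
    # mean current + fractional wind drift at the surface + random-walk diffusion.
    rng = np.random.default_rng(42)

    n_particles = 5000
    dt = 600.0                            # time step [s]
    n_steps = 144                         # 24 hours
    current = np.array([0.15, 0.05])      # mean current [m/s], east/north components
    wind = np.array([5.0, 0.0])           # wind at 10 m [m/s]
    wind_drift_factor = 0.03              # fraction of wind speed applied to surface oil
    diffusivity = 1.0                     # horizontal eddy diffusivity [m^2/s]

    pos = np.zeros((n_particles, 2))      # release at the well location (origin)
    for _ in range(n_steps):
        advection = (current + wind_drift_factor * wind) * dt
        # Random walk equivalent to Fickian diffusion: standard deviation sqrt(2*K*dt)
        diffusion = rng.normal(0.0, np.sqrt(2.0 * diffusivity * dt), size=pos.shape)
        pos += advection + diffusion

    centre = pos.mean(axis=0) / 1000.0    # plume centre of mass [km]
    spread = pos.std(axis=0) / 1000.0     # 1-sigma spread [km]
    print(f"Plume centre after 24 h: {centre} km; spread: {spread} km")
    ```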

  17. Applying Qualitative Hazard Analysis to Support Quantitative Safety Analysis for Proposed Reduced Wake Separation Conops

    NASA Technical Reports Server (NTRS)

    Shortle, John F.; Allocco, Michael

    2005-01-01

    This paper describes a scenario-driven hazard analysis process to identify, eliminate, and control safety-related risks. Within this process, we develop selection criteria to determine the applicability of engineering modeling to hypothesized hazard scenarios. This provides a basis for evaluating and prioritizing the scenarios as candidates for further quantitative analysis. We have applied this methodology to proposed concepts of operations for reduced wake separation for closely spaced parallel runways. For arrivals, the process identified 43 core hazard scenarios. Of these, we classified 12 as appropriate for further quantitative modeling, 24 that should be mitigated through controls, recommendations, and/or procedures (that is, scenarios not appropriate for quantitative modeling), and 7 that have the lowest priority for further analysis.

  18. Performance evaluation in full-mission simulation - Methodological advances and research challenges [in air transport operations]

    NASA Technical Reports Server (NTRS)

    Chidester, Thomas R.; Kanki, Barbara G.; Helmreich, Robert L.

    1989-01-01

    The crew-factors research program at NASA Ames has developed a methodology for studying the impact of a variety of variables on the effectiveness of crews flying realistic but high-workload simulated trips. The validity of investigations using the methodology is enhanced by careful design of full-mission scenarios, performance assessment using converging sources of data, and recruitment of representative subjects. Recently, portions of this methodology have been adapted for use in assessing the effectiveness of crew coordination among participants in line-oriented flight training.

  19. Agro-ecological analysis for the EU water framework directive: an applied case study for the river contract of the Seveso basin (Italy).

    PubMed

    Bocchi, Stefano; La Rosa, Daniele; Pileri, Paolo

    2012-10-01

    The innovative approach to the protection and management of water resources at the basin scale introduced by the European Union water framework directive (WFD) requires new scientific tools. WFD implementation also requires the participation of many stakeholders (administrators, farmers and citizens) with the aim of improving the quality of river waters and basin ecosystems through cooperative planning. This approach encompasses different issues, such as agro-ecology, land use planning and water management. This paper presents the results of a methodology suggested for implementing the WFD in the case of the Seveso river contract in Italy, one of the recent WFD applications. The Seveso basin in the Lombardy region has been one of the most rapidly urbanizing areas in Italy over the last 50 years. First, land use changes in the last 50 years are assessed with the use of historical aerial photos. Then, elements of an ecological network along the river corridor are outlined, and different scenarios for enhancing existing ecological connections are assessed using indicators from graph theory. These scenarios were discussed in technical workshops with involved stakeholders of the river contract. The results show a damaged rural landscape, where urbanization processes have decimated the system of linear green features (hedges/rows). Progressive reconnections of some of the identified network nodes may significantly increase the connectivity and circuitry of the study area.
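    The abstract above mentions assessing ecological connections "using indicators from graph theory" but does not list the exact metrics. As an illustrative sketch, the snippet below computes three classical planar-graph indices (beta, gamma connectivity and alpha circuitry) that are commonly used to compare connectivity and circuitry between network scenarios; the node and link counts are hypothetical.

    ```python
    def connectivity_indices(n_nodes: int, n_links: int) -> dict:
        """Classical planar-graph indices often used for ecological networks
        (assumed here; the paper does not list its exact indicators).
        beta  = links per node
        gamma = connectivity: links relative to the planar-graph maximum
        alpha = circuitry: independent circuits relative to the maximum possible
        """
        beta = n_links / n_nodes
        gamma = n_links / (3 * (n_nodes - 2))
        alpha = (n_links - n_nodes + 1) / (2 * n_nodes - 5)
        return {"beta": beta, "gamma": gamma, "alpha": alpha}

    # Example: baseline network vs. a scenario that adds reconnecting links
    print(connectivity_indices(n_nodes=25, n_links=30))   # baseline
    print(connectivity_indices(n_nodes=25, n_links=36))   # scenario with new links
    ```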

  20. What strategy is needed for attaining the EU air quality regulations under future climate change scenarios? A sensitivity analysis over Europe

    NASA Astrophysics Data System (ADS)

    Jiménez-Guerrero, P.; Baró, R.; Gómez-Navarro, J. J.; Lorente-Plazas, R.; García-Valero, J. A.; Hernández, Z.; Montávez, J. P.

    2012-04-01

    A large number of studies show that several areas over Europe exceed some of the air quality thresholds established in the legislation. These exceedances will become more frequent under future climate change scenarios, since the policies aimed at improving air quality in the EU directives have not accounted for variations in the climate. Climate change alone will influence the future concentrations of atmospheric pollutants through modifications of gas-phase chemistry, transport, removal, and natural emissions. In this sense, chemistry transport models (CTMs) play a key role in assessing and understanding emissions abatement plans through the use of sensitivity analysis strategies. These sensitivity analyses characterize the change in model output due to variations in model input parameters. Since the management of air pollutant emissions is one of the predominant factors for controlling urban air quality, this work assesses the impact of various emission reduction scenarios on air pollution levels over Europe under two climate change scenarios. The methodology includes the use of a climate version of the meteorological model MM5 coupled with the CHIMERE chemistry transport model. Experiments span the periods 1971-2000, as a reference, and 2071-2100, as two future enhanced greenhouse gas and aerosol scenarios (SRES A2 and B2). The atmospheric simulations have a horizontal resolution of 25 km and 23 vertical layers up to 100 hPa, and are driven by the global climate model ECHO-G. In order to represent the sensitivity of the chemistry and transport of aerosols, tropospheric ozone and other photochemical species, several hypothetical emission control scenarios have been implemented to quantify the influence of diverse emission sources in the area, such as on-road traffic, port and industrial emissions, among others. The modeling strategy relies on a sensitivity analysis to determine the emission reduction and strategy needed in the target area in order to attain the standards and thresholds set in the European Directive 2008/50/EC. Results show that the system is able to characterize the exceedances occurring in Europe, mainly related to the maximum 8-h moving average exceeding the target value of 120 μg/m3, mainly over southern Europe. Also, compliance with the PM10 daily limit value (50 μg/m3) is not achieved over wide areas of Europe. The sensitivity analysis indicates that large reductions of precursor emissions are needed in all the scenarios examined in order to attain the thresholds set in the European Directive. In most cases this abatement strategy is hard to put into practice (e.g. unrealistic percentages of emission reduction in on-road traffic, industry or harbour activity); however, ozone and particulate matter air pollution improve considerably in most of the scenarios included. Results also unveil the propagation of uncertainties from the meteorological projections into future air quality and call for future studies aimed at deepening knowledge of the parameterized processes, the definition of emissions and, lastly, the reduction of uncertainties.
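    To make the exceedance metrics mentioned above concrete, the sketch below counts, on synthetic hourly model output, days on which the maximum 8-hour running mean of ozone exceeds the 120 μg/m3 target and days on which the PM10 daily mean exceeds the 50 μg/m3 limit. It is a simplification: the Directive assigns 8-hour means spanning midnight to the day on which they end, whereas here only within-day windows are used.

    ```python
    import numpy as np

    def max_daily_8h_mean(hourly_o3):
        """hourly_o3: array of shape (n_days, 24); returns the per-day maximum 8-h mean."""
        daily_max = []
        for day in hourly_o3:
            running = np.convolve(day, np.ones(8) / 8.0, mode="valid")   # 17 windows per day
            daily_max.append(running.max())
        return np.array(daily_max)

    rng = np.random.default_rng(1)
    o3 = rng.normal(90, 25, size=(365, 24)).clip(min=0)   # synthetic hourly O3 [ug/m3]
    pm10 = rng.gamma(3.0, 12.0, size=365)                 # synthetic PM10 daily means [ug/m3]

    o3_exceedances = int((max_daily_8h_mean(o3) > 120).sum())
    pm10_exceedances = int((pm10 > 50).sum())
    print(o3_exceedances, "ozone target exceedances;", pm10_exceedances, "PM10 limit exceedances")
    ```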

  1. Combining Learning and Assessment in Assessment-Based Gaming Environments: A Case Study from a New York City School

    ERIC Educational Resources Information Center

    Zapata-Rivera, Diego; VanWinkle, Waverely; Doyle, Bryan; Buteux, Alyssa; Bauer, Malcolm

    2009-01-01

    Purpose: The purpose of this paper is to propose and demonstrate an evidence-based scenario design framework for assessment-based computer games. Design/methodology/approach: The evidence-based scenario design framework is presented and demonstrated by using BELLA, a new assessment-based gaming environment aimed at supporting student learning of…

  2. Teaching the Methodology of Science: The Utilization of Microbial Model Systems for Biometric Analysis.

    ERIC Educational Resources Information Center

    Adamo, Joseph A.

    Students set in their ways are usually reluctant to deal with open-ended investigative scenarios. In order to acquaint the student with the physical method and philosophical thought process of the discipline, the tone of the course must be set early on. The present study was conducted to develop scenarios and microbial model…

  3. Risk analysis of technological hazards: Simulation of scenarios and application of a local vulnerability index.

    PubMed

    Sanchez, E Y; Represa, S; Mellado, D; Balbi, K B; Acquesta, A D; Colman Lerner, J E; Porta, A A

    2018-06-15

    The potential impact of a technological accident can be assessed by risk estimation. Taking this into account, the latent or potential condition can be warned of and mitigated. In this work we propose a methodology to estimate the risk of technological hazards, focused on two components. The first is the processing of meteorological databases to define the most probable and conservative scenario of study, and the second is the application of a local social vulnerability index to classify the population. In this case study, the risk was estimated for a hypothetical release of liquefied ammonia in a meat-packing industry in the city of La Plata, Argentina. The method consists of integrating the toxic threat zone simulated with the ALOHA software and the layer of sociodemographic classification of the affected population. The results show the areas associated with higher risks of exposure to ammonia, which are worth addressing for the prevention of disasters in the region. Advantageously, this systemic approach is methodologically flexible, as it can be applied in various scenarios based on the available information on both the exposed population and the local meteorology. Furthermore, this methodology optimizes the processing of the input data and its calculation. Copyright © 2018 Elsevier B.V. All rights reserved.
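    A minimal sketch of the kind of overlay described above is given below: a gridded threat-zone class (as might be exported from an ALOHA run) is combined with a gridded social vulnerability class through a qualitative risk lookup matrix. The class values and the matrix are hypothetical and are not the authors' scheme.

    ```python
    import numpy as np

    # Combine a simulated toxic threat zone with a social vulnerability index to
    # obtain a qualitative risk class per grid cell (illustrative values only).
    hazard = np.array([[0, 1, 2],
                       [1, 2, 3],
                       [2, 3, 3]])          # 0 = outside plume ... 3 = highest threat
    vulnerability = np.array([[1, 1, 2],
                              [2, 3, 3],
                              [1, 2, 3]])   # 1 = low ... 3 = high social vulnerability

    # Risk lookup: rows = hazard class (0..3), columns = vulnerability class (1..3)
    risk_matrix = np.array([[0, 0, 0],      # no hazard -> negligible risk
                            [1, 1, 2],
                            [1, 2, 3],
                            [2, 3, 3]])

    risk = risk_matrix[hazard, vulnerability - 1]
    print(risk)    # 0 = negligible ... 3 = high priority for prevention planning
    ```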

  4. Future Sulfur Dioxide Emissions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Steven J.; Pitcher, Hugh M.; Wigley, Tom M.

    2005-12-01

    The importance of sulfur dioxide emissions for climate change is now established, although substantial uncertainties remain. This paper presents projections for future sulfur dioxide emissions using the MiniCAM integrated assessment model. A new income-based parameterization for future sulfur dioxide emissions controls is developed based on purchasing power parity (PPP) income estimates and historical trends related to the implementation of sulfur emissions limitations. This parameterization is then used to produce sulfur dioxide emissions trajectories for the set of scenarios developed for the Special Report on Emission Scenarios (SRES). We use the SRES methodology to produce harmonized SRES scenarios using the latest version of the MiniCAM model. The implications, and requirements, for IA modeling of sulfur dioxide emissions are discussed. We find that sulfur emissions eventually decline over the next century under a wide set of assumptions. These emission reductions result from a combination of emission controls, the adoption of advanced electric technologies, and a shift away from the direct end use of coal with increasing income levels. Only under a scenario where incomes in developing regions increase slowly do global emission levels remain at close to present levels over the next century. Under a climate policy that limits emissions of carbon dioxide, sulfur dioxide emissions fall in a relatively narrow range. In all cases, the relative climatic effect of sulfur dioxide emissions decreases dramatically to a point where sulfur dioxide is only a minor component of climate forcing by the end of the century. Ecological effects of sulfur dioxide, however, could be significant in some developing regions for many decades to come.

  5. Modelling regional land change scenarios to assess land abandonment and reforestation dynamics in the Pyrenees (France)

    USGS Publications Warehouse

    Vacquie, Laure; Houet, Thomas; Sohl, Terry L.; Reker, Ryan R.; Sayler, Kristi L.

    2015-01-01

    Over the last decades and centuries, European mountain landscapes have experienced substantial transformations. Natural and anthropogenic LULC changes (land use and land cover changes), especially agro-pastoral activities, have directly influenced the spatial organization and composition of European mountain landscapes. For the past sixty years, natural reforestation has been occurring due to a decline in both agricultural production activities and rural population. Stakeholders, to better anticipate future changes, need spatially and temporally explicit models to identify areas at risk of land change and possible abandonment. This paper presents an integrated approach combining forecasting scenarios and a LULC changes simulation model to assess where LULC changes may occur in the Pyrenees Mountains, based on historical LULC trends and a range of future socio-economic drivers. The proposed methodology considers local specificities of the Pyrenean valleys, sub-regional climate and topographical properties, and regional economic policies. Results indicate that some regions are projected to face strong abandonment, regardless of the scenario conditions. Overall, high rates of change are associated with administrative regions where land productivity is highly dependent on socio-economic drivers and climatic and environmental conditions limit intensive (agricultural and/or pastoral) production and profitability. The combination of the results for the four scenarios allows assessments of where encroachment (e.g. colonization by shrublands) and reforestation are the most probable. This assessment intends to provide insight into the potential future development of the Pyrenees, to help identify areas that are the most sensitive to change, and to guide decision makers in their management decisions.

  6. An open-access CMIP5 pattern library for temperature and precipitation: Description and methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lynch, Cary D.; Hartin, Corinne A.; Bond-Lamberty, Benjamin

    Pattern scaling is used to efficiently emulate general circulation models and explore uncertainty in climate projections under multiple forcing scenarios. Pattern scaling methods assume that local climate changes scale with a global mean temperature increase, allowing for spatial patterns to be generated for multiple models for any future emission scenario. For uncertainty quantification and probabilistic statistical analysis, a library of patterns with descriptive statistics for each file would be beneficial, but such a library does not presently exist. Of the possible techniques used to generate patterns, the two most prominent are the delta and least squares regression methods. We explore the differences and statistical significance between patterns generated by each method and assess performance of the generated patterns across methods and scenarios. Differences in patterns across seasons between methods and epochs were largest in high latitudes (60-90°N/S). Bias and mean errors between modeled and pattern-predicted output from the linear regression method were smaller than patterns generated by the delta method. Across scenarios, differences in the linear regression method patterns were more statistically significant, especially at high latitudes. We found that pattern generation methodologies were able to approximate the forced signal of change to within ≤ 0.5°C, but the choice of pattern generation methodology for pattern scaling purposes should be informed by user goals and criteria. As a result, this paper describes our library of least squares regression patterns from all CMIP5 models for temperature and precipitation on an annual and sub-annual basis, along with the code used to generate these patterns.
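    The least squares regression approach referenced above fits, at every grid cell, the local change against the global mean temperature change, so the slope map is the "pattern" that can later be multiplied by any global-mean trajectory. The sketch below illustrates this on synthetic data; it is not the authors' published generation code, which accompanies their pattern library.

    ```python
    import numpy as np

    # Minimal pattern-scaling sketch: per-cell least squares slope of local change
    # on global mean temperature change, using synthetic fields.
    n_years, n_lat, n_lon = 140, 36, 72
    rng = np.random.default_rng(0)
    t_global = np.linspace(0.0, 4.0, n_years)                        # global mean dT [K]
    true_pattern = 1.0 + rng.normal(0.0, 0.3, size=(n_lat, n_lon))   # local scaling factors
    local = (t_global[:, None, None] * true_pattern
             + rng.normal(0.0, 0.2, size=(n_years, n_lat, n_lon)))   # local dT fields

    # Least squares slope at each grid cell (single predictor: t_global)
    x = t_global - t_global.mean()
    y = local - local.mean(axis=0)
    pattern = np.tensordot(x, y, axes=(0, 0)) / (x ** 2).sum()        # shape (n_lat, n_lon)

    # Emulate a different scenario by scaling the pattern with its global-mean trajectory
    t_global_new = np.linspace(0.0, 2.0, n_years)
    emulated = t_global_new[:, None, None] * pattern
    print(pattern.shape, emulated.shape)
    ```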

  7. An open-access CMIP5 pattern library for temperature and precipitation: Description and methodology

    DOE PAGES

    Lynch, Cary D.; Hartin, Corinne A.; Bond-Lamberty, Benjamin; ...

    2017-05-15

    Pattern scaling is used to efficiently emulate general circulation models and explore uncertainty in climate projections under multiple forcing scenarios. Pattern scaling methods assume that local climate changes scale with a global mean temperature increase, allowing for spatial patterns to be generated for multiple models for any future emission scenario. For uncertainty quantification and probabilistic statistical analysis, a library of patterns with descriptive statistics for each file would be beneficial, but such a library does not presently exist. Of the possible techniques used to generate patterns, the two most prominent are the delta and least squares regression methods. We explore the differences and statistical significance between patterns generated by each method and assess performance of the generated patterns across methods and scenarios. Differences in patterns across seasons between methods and epochs were largest in high latitudes (60-90°N/S). Bias and mean errors between modeled and pattern-predicted output from the linear regression method were smaller than patterns generated by the delta method. Across scenarios, differences in the linear regression method patterns were more statistically significant, especially at high latitudes. We found that pattern generation methodologies were able to approximate the forced signal of change to within ≤ 0.5°C, but the choice of pattern generation methodology for pattern scaling purposes should be informed by user goals and criteria. As a result, this paper describes our library of least squares regression patterns from all CMIP5 models for temperature and precipitation on an annual and sub-annual basis, along with the code used to generate these patterns.

  8. Extreme risk assessment based on normalized historic loss data

    NASA Astrophysics Data System (ADS)

    Eichner, Jan

    2017-04-01

    Natural hazard risk assessment and risk management focus on the expected loss magnitudes of rare and extreme events. Such large-scale loss events typically comprise all aspects of compound events and accumulate losses from multiple sectors (including knock-on effects). Utilizing Munich Re's NatCatSERVICE direct economic loss data, we briefly recap a novel methodology of peril-specific loss data normalization which improves the stationarity properties of highly non-stationary historic loss data (due to socio-economic growth of assets prone to destructive forces), and perform extreme value analysis (peaks-over-threshold method) to derive return level estimates of, e.g., 100-year loss event scenarios for various types of perils, globally or per continent, and discuss uncertainty in the results.
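    The peaks-over-threshold step mentioned above is commonly implemented by fitting a generalized Pareto distribution to threshold exceedances and converting the fit into a return level. The sketch below does this with scipy on synthetic loss data; it is only an illustration of the technique, not Munich Re's actual analysis or thresholds.

    ```python
    import numpy as np
    from scipy import stats

    # Peaks-over-threshold sketch on synthetic (already normalized) event losses.
    rng = np.random.default_rng(7)
    years = 50
    losses = rng.pareto(2.5, size=years * 6) * 100.0    # synthetic per-event losses

    threshold = np.quantile(losses, 0.90)
    excesses = losses[losses > threshold] - threshold
    shape, loc, scale = stats.genpareto.fit(excesses, floc=0.0)

    # Return level: the loss exceeded on average once every T years, given a rate
    # `lam` of exceedances per year above the threshold.
    lam = excesses.size / years
    T = 100.0
    p = 1.0 / (lam * T)                                  # exceedance probability per peak
    return_level = threshold + stats.genpareto.ppf(1.0 - p, shape, loc=0.0, scale=scale)
    print(f"Estimated {T:.0f}-year loss: {return_level:.1f}")
    ```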

  9. Optimizing performance of hybrid FSO/RF networks in realistic dynamic scenarios

    NASA Astrophysics Data System (ADS)

    Llorca, Jaime; Desai, Aniket; Baskaran, Eswaran; Milner, Stuart; Davis, Christopher

    2005-08-01

    Hybrid Free Space Optical (FSO) and Radio Frequency (RF) networks promise highly available wireless broadband connectivity and quality of service (QoS), particularly suitable for emerging network applications involving extremely high data rate transmissions such as high quality video-on-demand and real-time surveillance. FSO links are prone to atmospheric obscuration (fog, clouds, snow, etc.) and are difficult to align over long distances due to the use of narrow laser beams and the effect of atmospheric turbulence. These problems can be mitigated by using adjunct directional RF links, which provide backup connectivity. In this paper, methodologies for modeling and simulation of hybrid FSO/RF networks are described. Individual link propagation models are derived using scattering theory, as well as experimental measurements. MATLAB is used to generate realistic atmospheric obscuration scenarios, including moving cloud layers at different altitudes. These scenarios are then imported into a network simulator (OPNET) to emulate mobile hybrid FSO/RF networks. This framework allows accurate analysis of the effects of node mobility, atmospheric obscuration and traffic demands on network performance, and precise evaluation of topology reconfiguration algorithms as they react to dynamic changes in the network. Results show how topology reconfiguration algorithms, together with enhancements to TCP/IP protocols which reduce the network response time, enable the network to rapidly detect and act upon link state changes in highly dynamic environments, ensuring optimized network performance and availability.

  10. Putting Order into Our Universe: The Concept of "Blended Learning"--A Methodology within the Concept-Based Terminology Framework

    ERIC Educational Resources Information Center

    Fernandes, Joana; Costa, Rute; Peres, Paula

    2016-01-01

    This paper aims at discussing the advantages of a methodology design grounded on a concept-based approach to Terminology applied to the most prominent scenario of current Higher Education: "blended learning." Terminology is a discipline that aims at representing, describing and defining specialized knowledge through language, putting…

  11. Communicative Language Teaching: A Practical Scenario in the Context of Bangladesh

    ERIC Educational Resources Information Center

    Ahmed, Md. Kawser

    2016-01-01

    Communicative Language Teaching, popularly known as CLT, has become a newly adopted methodology in the teaching and learning context of Bangladesh. This methodology, since its initiation, has encountered and is still encountering a number of hurdles that need to be addressed with the utmost care and a feasible strategy. Of all methods, most of the educational…

  12. The KULTURisk Regional Risk Assessment methodology for flood risk: the case of Sihl river in Zurich

    NASA Astrophysics Data System (ADS)

    Ronco, Paolo; Bullo, Martina; Gallina, Valentina; Torresan, Silvia; Critto, Andrea; Zabeo, Alex; Semenzin, Elena; Buchecker, Matthias; Marcomini, Antonio

    2014-05-01

    In recent years, the frequency of catastrophes induced by natural hazards has increased, and flood events in particular have been recognized as one of the most threatening water-related disasters. Severe floods have occurred in Europe over the last decade, causing loss of life, displacement of people and heavy economic losses. Flood disasters are growing as a consequence of many factors, both climatic and non-climatic. Indeed, the current increase in water-related disasters can be mainly attributed to the increase of exposure (elements potentially at risk in floodplain areas) and vulnerability (i.e. economic, social, geographic, cultural, and physical/environmental characteristics of the exposure). Besides these factors, the strong effect of climate change is projected to radically modify the usual pattern of the hydrological cycle by intensifying the frequency and severity of flood events at the local, regional and global scales. Within this context, it is necessary to develop effective and pro-active strategies, tools and actions which allow the risk of floods to be assessed and, possibly, reduced. In light of the recent European Flood Directive (FD), the KULTURisk FP7 project developed a state-of-the-art Regional Risk Assessment (RRA) methodology for assessing the risk posed by flood events. The KULTURisk RRA methodology is based on the concept of risk being a function of hazard, exposure and vulnerability. It is a flexible approach that can be adapted to different case studies (i.e. large rivers, alpine/mountain catchments, urban areas and coastal areas) and spatial scales (i.e. from the large river to the urban scale), and it integrates the outputs of various hydrodynamic models (hazard) with site-specific geophysical and socio-economic indicators (exposure and vulnerability factors such as land cover, slope, soil permeability, population density, economic activities, etc.). The main outputs of the methodology are GIS-based risk maps that identify and prioritize relative hot-spot areas and targets at risk (i.e. people, buildings, infrastructures, agriculture, natural and semi-natural systems, cultural heritage) in the considered region by comparing the baseline scenario with alternative scenarios, where different structural and/or non-structural mitigation measures are planned. Risk maps, along with related statistics, provide crucial information about the flood risk pattern and allow the development of relevant and strategic mitigation and prevention measures to minimize flood risk in urban areas. The present study applied and validated the KULTURisk RRA methodology on the Sihl river case study in Zurich (Switzerland). Through a tuning process of the methodology to the site-specific context and features, flood-related risks have been assessed for different receptors lying in the Sihl river valley, which represents a typical case of river flooding in an urban area. The total risk maps obtained under a 300-year return period scenario (selected as the reference) highlight that the area is associated with the lower class of risk. Moreover, the relative risk is higher in the Zurich city centre, in the few residential areas around the city centre and within the districts that lie just beside the Sihl river course.

  13. Visual performance-based image enhancement methodology: an investigation of contrast enhancement algorithms

    NASA Astrophysics Data System (ADS)

    Neriani, Kelly E.; Herbranson, Travis J.; Reis, George A.; Pinkus, Alan R.; Goodyear, Charles D.

    2006-05-01

    While vast numbers of image enhancing algorithms have already been developed, the majority of these algorithms have not been assessed in terms of their visual performance-enhancing effects using militarily relevant scenarios. The goal of this research was to apply a visual performance-based assessment methodology to evaluate six algorithms that were specifically designed to enhance the contrast of digital images. The image enhancing algorithms used in this study included three different histogram equalization algorithms, the Autolevels function, the Recursive Rational Filter technique described in Marsi, Ramponi, and Carrato [1], and the multiscale Retinex algorithm described in Rahman, Jobson and Woodell [2]. The methodology used in the assessment has been developed to acquire objective human visual performance data as a means of evaluating the contrast enhancement algorithms. Objective performance metrics, response time and error rate, were used to compare algorithm-enhanced images versus two baseline conditions: original non-enhanced images and contrast-degraded images. Observers completed a visual search task using a spatial forced-choice paradigm. Observers searched images for a target (a military vehicle) hidden among foliage and then indicated in which quadrant of the screen the target was located. Response time and percent correct were measured for each observer. Results of the study and future directions are discussed.
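    As a small illustration of one of the algorithm families evaluated above, the sketch below implements plain global histogram equalization via the normalized cumulative histogram. It is not the study's specific variants, nor the Autolevels, Recursive Rational Filter or Retinex implementations.

    ```python
    import numpy as np

    def histogram_equalize(img_u8: np.ndarray) -> np.ndarray:
        """Map 8-bit grey levels through the normalized cumulative histogram."""
        hist = np.bincount(img_u8.ravel(), minlength=256)
        cdf = hist.cumsum().astype(np.float64)
        cdf = (cdf - cdf.min()) / (cdf.max() - cdf.min())   # normalize to [0, 1]
        lut = np.round(cdf * 255).astype(np.uint8)           # lookup table
        return lut[img_u8]

    # Example on a synthetic low-contrast image
    rng = np.random.default_rng(3)
    low_contrast = rng.normal(120, 10, size=(64, 64)).clip(0, 255).astype(np.uint8)
    enhanced = histogram_equalize(low_contrast)
    print(low_contrast.std(), "->", enhanced.std())          # contrast (std) increases
    ```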

  14. The use of scenario analysis in local public health departments: alternative futures for strategic planning.

    PubMed Central

    Venable, J M; Ma, Q L; Ginter, P M; Duncan, W J

    1993-01-01

    Scenario analysis is a strategic planning technique used to describe and evaluate an organization's external environment. A methodology for conducting scenario analysis using the Jefferson County Department of Health and the national, State, and county issues confronting it is outlined. Key health care and organizational issues were identified using published sources, focus groups, questionnaires, and personal interviews. The most important of these issues were selected by asking health department managers to evaluate the issues according to their probability of occurrence and likely impact on the health department. The high-probability, high-impact issues formed the basis for developing scenario logics that constitute the story line holding the scenario together. The results were a set of plausible scenarios that aided in strategic planning, encouraged strategic thinking among managers, eliminated or reduced surprise about environmental changes, and improved managerial discussion and communication. PMID:8265754

  15. Virtual reality and live simulation: a comparison between two simulation tools for assessing mass casualty triage skills.

    PubMed

    Luigi Ingrassia, Pier; Ragazzoni, Luca; Carenzo, Luca; Colombo, Davide; Ripoll Gallardo, Alba; Della Corte, Francesco

    2015-04-01

    This study tested the hypothesis that virtual reality simulation is equivalent to live simulation for testing naive medical students' abilities to perform mass casualty triage using the Simple Triage and Rapid Treatment (START) algorithm in a simulated disaster scenario and to detect the improvement in these skills after a teaching session. Fifty-six students in their last year of medical school were randomized into two groups (A and B). The same scenario, a car accident, was developed identically on the two simulation methodologies: virtual reality and live simulation. On day 1, group A was exposed to the live scenario and group B was exposed to the virtual reality scenario, aiming to triage 10 victims. On day 2, all students attended a 2-h lecture on mass casualty triage, specifically the START triage method. On day 3, groups A and B were crossed over. The groups' abilities to perform mass casualty triage in terms of triage accuracy, intervention correctness, and speed in the scenarios were assessed. Triage and lifesaving treatment scores were assessed equally by virtual reality and live simulation on day 1 and on day 3. Both simulation methodologies detected an improvement in triage accuracy and treatment correctness from day 1 to day 3 (P<0.001). The time to complete each scenario and its decrease from day 1 to day 3 were detected equally in the two groups (P<0.05). Virtual reality simulation proved to be a valuable tool, equivalent to live simulation, to test medical students' abilities to perform mass casualty triage and to detect improvement in such skills.

  16. Agility through Automated Negotiation for C2 Services

    DTIC Science & Technology

    2014-06-01

    using this e-contract negotiation methodology in a C2 context in Brazil. We have modeled the operations of the Rio de Janeiro Command Center that will be in place for the World Cup (2014...through e-contracts. The scenario chosen to demonstrate this methodology is a security incident in Rio de Janeiro, host city of the next World Cup (2014

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ehlen, Mark Andrew; Vugrin, Eric D.; Warren, Drake E.

    In recent years, the nation has recognized that critical infrastructure protection should consider not only the prevention of disruptive events, but also the processes that infrastructure systems undergo to maintain functionality following disruptions. This more comprehensive approach has been termed critical infrastructure resilience (CIR). Given the occurrence of a particular disruptive event, the resilience of a system to that event is the system's ability to efficiently reduce both the magnitude and duration of the deviation from targeted system performance levels. Sandia National Laboratories (Sandia) has developed a comprehensive resilience assessment framework for evaluating the resilience of infrastructure and economic systems. The framework includes a quantitative methodology that measures resilience costs that result from a disruption to infrastructure function. The framework also includes a qualitative analysis methodology that assesses system characteristics that affect resilience in order to provide insight and direction for potential improvements to resilience. This paper describes the resilience assessment framework. This paper further demonstrates the utility of the assessment framework through application to a hypothetical scenario involving the disruption of a petrochemical supply chain by a hurricane.
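    One common way to make the "deviation from targeted performance" above quantitative is to integrate the shortfall between targeted and actual system performance over the recovery period (frameworks such as Sandia's also account for recovery-effort costs, which are omitted here). The sketch below computes that systemic-impact term for an illustrative disruption-and-recovery trajectory.

    ```python
    import numpy as np

    # Systemic impact as the time-integrated deviation of actual performance from
    # its targeted level following a disruption (illustrative data only).
    t = np.arange(0, 30, 1.0)                        # days since the disruption
    targeted = np.full_like(t, 100.0)                # targeted performance (e.g., % throughput)
    actual = np.where(t < 5, 40.0,                   # 5-day outage at 40%, then linear recovery
                      np.minimum(100.0, 40.0 + 6.0 * (t - 5)))

    systemic_impact = np.trapz(targeted - actual, t)  # performance-days lost
    print(f"Systemic impact: {systemic_impact:.0f} performance-days")
    ```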

  18. Estimation of number of fatalities caused by toxic gases due to fire in road tunnels.

    PubMed

    Qu, Xiaobo; Meng, Qiang; Liu, Zhiyuan

    2013-01-01

    The quantitative risk assessment (QRA) is one of the explicit requirements under the European Union (EU) Directive (2004/54/EC). As part of this, it is essential to be able to estimate the number of fatalities in different accident scenarios. In this paper, a tangible methodology is developed to estimate the number of fatalities caused by toxic gases due to fire in road tunnels by incorporating traffic flow and the spread of fire in tunnels. First, a deterministic queuing model is proposed to calculate the number of people at risk, by taking into account tunnel geometry, traffic flow patterns, and incident response plans for road tunnels. Second, the Fire Dynamics Simulator (FDS) is used to obtain the temperature and concentrations of CO, CO2, and O2. By taking advantage of the additivity of the fractional effective dose (FED) method, fatality rates for different locations in given time periods can be estimated. An illustrative case study is carried out to demonstrate the applicability of the proposed methodology. Copyright © 2012 Elsevier Ltd. All rights reserved.
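    The additivity that the FED method relies on means the dose fraction accumulated in each time interval (concentration times interval length, divided by the dose expected to cause incapacitation or death) can simply be summed over time. The sketch below shows that accumulation for a single-species CO exposure; the lethal exposure dose used is illustrative only and is not a value taken from the paper.

    ```python
    import numpy as np

    def fed(concentration_ppm, dt_min, lethal_ct_ppm_min):
        """Cumulative fractional effective dose; FED >= 1 is taken as a fatal exposure."""
        doses = np.asarray(concentration_ppm, dtype=float) * dt_min
        return np.cumsum(doses) / lethal_ct_ppm_min

    # CO concentration at one tunnel location, one value per minute (synthetic)
    co_ppm = np.array([500, 1500, 4000, 6000, 6000, 5000])
    fed_series = fed(co_ppm, dt_min=1.0, lethal_ct_ppm_min=45_000.0)   # illustrative Ct threshold
    print(fed_series)   # the first index where FED >= 1 gives the estimated time of fatality
    ```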

  19. Land use change modeling through scenario-based cellular automata Markov: improving spatial forecasting.

    PubMed

    Jahanishakib, Fatemeh; Mirkarimi, Seyed Hamed; Salmanmahiny, Abdolrassoul; Poodat, Fatemeh

    2018-05-08

    Efficient land use management requires awareness of past changes, present actions, and plans for future developments. Part of these requirements is achieved using scenarios that describe a future situation and the course of changes. This research aims to link scenario results with spatially explicit and quantitative forecasting of land use development. To develop land use scenarios, the SMIC PROB-EXPERT and MORPHOL methods were used; this revealed eight scenarios as the most probable. To apply the scenarios, we considered the population growth rate and used a cellular automata-Markov chain (CA-MC) model to implement the quantified changes described by each scenario. For each scenario, a set of landscape metrics was used to assess the ecological integrity of land use classes in terms of fragmentation and structural connectivity. The approach enabled us to develop spatial scenarios of land use change and detect their differences for choosing the most integrated landscape pattern in terms of landscape metrics. Finally, the comparison between paired forecasted scenarios based on landscape metrics indicates that scenarios 1-1, 2-2, 3-2, and 4-1 show more suitable integrity. The proposed methodology for developing spatial scenarios helps executive managers create scenarios with many repetitions and customize spatial patterns in real-world applications and policies.
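    In a CA-Markov model such as the one referenced above, the Markov chain projects how much of each land use class changes, while the cellular automaton allocates that change spatially. The sketch below illustrates only the Markov step with a hypothetical transition matrix; the spatial allocation and the study's actual classes and probabilities are not reproduced.

    ```python
    import numpy as np

    # Markov-chain projection of land use class proportions (hypothetical values).
    classes = ["urban", "agriculture", "forest", "rangeland"]
    P = np.array([[0.98, 0.01, 0.00, 0.01],     # transitions from urban
                  [0.06, 0.90, 0.02, 0.02],     # from agriculture
                  [0.01, 0.02, 0.95, 0.02],     # from forest
                  [0.03, 0.05, 0.04, 0.88]])    # from rangeland

    state = np.array([0.15, 0.40, 0.30, 0.15])  # current area proportions
    for step in range(3):                       # three projection periods
        state = state @ P
    print(dict(zip(classes, np.round(state, 3))))
    ```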

  20. Methodological Validation of Quality of Life Questionnaire for Coal Mining Groups-Indian Scenario

    ERIC Educational Resources Information Center

    Sen, Sayanti; Sen, Goutam; Tewary, B. K.

    2012-01-01

    Maslow's hierarchy-of-needs theory has been used to predict the development of Quality of Life (QOL) in countries over time. In this paper an attempt has been made to derive a methodological validation of the quality of life questionnaire which has been prepared for the study area. The objective of the study is to standardize a questionnaire tool to…

  1. Statistical, Graphical, and Learning Methods for Sensing, Surveillance, and Navigation Systems

    DTIC Science & Technology

    2016-06-28

    harsh propagation environments. Conventional filtering techniques fail to provide satisfactory performance in many important nonlinear or non-Gaussian scenarios. In addition, there is a lack of a unified methodology for the design and analysis of different filtering techniques. To address these problems, we have proposed a new filtering methodology called belief condensation (BC).

  2. Designing a Methodology for Future Air Travel Scenarios

    NASA Technical Reports Server (NTRS)

    Wuebbles, Donald J.; Baughcum, Steven L.; Gerstle, John H.; Edmonds, Jae; Kinnison, Douglas E.; Krull, Nick; Metwally, Munir; Mortlock, Alan; Prather, Michael J.

    1992-01-01

    The growing demand for air travel throughout the world has prompted several proposals for the development of commercial aircraft capable of transporting a large number of passengers at supersonic speeds. Emissions from a projected fleet of such aircraft, referred to as high-speed civil transports (HSCT's), are being studied because of their possible effects on the chemistry and physics of the global atmosphere, in particular on stratospheric ozone. At the same time, there is growing concern about the effects on ozone of emissions from current (primarily subsonic) aircraft. Evaluating the potential atmospheric impact of aircraft emissions from HSCT's requires a scientifically sound understanding of where the aircraft fly and under what conditions the aircraft effluents are injected into the atmosphere. A preliminary set of emissions scenarios is presented. These scenarios will be used to understand the sensitivity of environmental effects to a range of fleet operations, flight conditions, and aircraft specifications. The baseline specifications for the scenarios are provided: the criteria to be used for developing the scenarios are defined, the required database for initiating the development of the scenarios is established, and the state of the art for those scenarios that have already been developed is discussed. An important aspect of the assessment will be the evaluation of realistic projections of emissions as a function of both geographical distribution and altitude from an economically viable commercial HSCT fleet. With an assumed introduction date of around the year 2005, it is anticipated that there will be no HSCT aircraft in the global fleet at that time. However, projections show that, by 2015, the HSCT fleet could reach a significant size. We assume these projections of HSCT and subsonic fleets for about 2015 can then be used as input to global atmospheric chemistry models to evaluate the impact of the HSCT fleets relative to an all-subsonic future fleet. The methodology, procedures, and recommendations for the development of the future HSCT and subsonic fleet scenarios used for this evaluation are discussed.

  3. A pen-based system to support pre-operative data collection within an anaesthesia department.

    PubMed Central

    Sanz, M. F.; Gómez, E. J.; Trueba, I.; Cano, P.; Arredondo, M. T.; del Pozo, F.

    1993-01-01

    This paper describes the design and implementation of a pen-based computer system for remote pre-operative data collection. The system is envisaged to be used by anaesthesia staff in the different hospital scenarios where pre-operative data are generated. Pen-based technology offers important advantages in terms of portability and human-computer interaction, such as direct-manipulation interfaces through direct pointing and "notebook" user-interface metaphors. Since human factors analysis and user interface design are a vital stage in achieving appropriate user acceptability, a methodology that integrates "usability" evaluation from the earliest development stages was used. Additionally, the selection of a pen-based computer system as a portable device to be used by health care personnel makes it possible to evaluate the appropriateness of this new technology for remote data collection within the hospital environment. The work presented is currently being carried out under the Research Project "TANIT: Telematics in Anaesthesia and Intensive Care", within the "A.I.M.--Telematics in Health CARE" European Research Program. PMID:8130488

  4. Assessing reservoir operations risk under climate change

    USGS Publications Warehouse

    Brekke, L.D.; Maurer, E.P.; Anderson, J.D.; Dettinger, M.D.; Townsley, E.S.; Harrison, A.; Pruitt, T.

    2009-01-01

    Risk-based planning offers a robust way to identify strategies that permit adaptive water resources management under climate change. This paper presents a flexible methodology for conducting climate change risk assessments involving reservoir operations. Decision makers can apply this methodology to their systems by selecting future periods and risk metrics relevant to their planning questions and by collectively evaluating system impacts relative to an ensemble of climate projection scenarios (weighted or not). This paper shows multiple applications of this methodology in a case study involving California's Central Valley Project and State Water Project systems. Multiple applications were conducted to show how choices made in conducting the risk assessment, choices known as analytical design decisions, can affect assessed risk. Specifically, risk was reanalyzed for every choice combination of two design decisions: (1) whether to assume climate change will influence flood-control constraints on water supply operations (and how), and (2) whether to weight climate change scenarios (and how). Results show that assessed risk would motivate different planning pathways depending on decision-maker attitudes toward risk (e.g., risk neutral versus risk averse). Results also show that assessed risk at a given risk attitude is sensitive to the analytical design choices listed above, with the choice of whether to adjust flood-control rules under climate change having considerably more influence than the choice on whether to weight climate scenarios. Copyright 2009 by the American Geophysical Union.

  5. Development of a methodology to assess future trends in low flows at the watershed scale using solely climate data

    NASA Astrophysics Data System (ADS)

    Foulon, Étienne; Rousseau, Alain N.; Gagnon, Patrick

    2018-02-01

    Low flow conditions are governed by short-to-medium term weather conditions or long term climate conditions. This prompts the question: given climate scenarios, is it possible to assess future extreme low flow conditions from climate data indices (CDIs)? Or should we rely on the conventional approach of using outputs of climate models as inputs to a hydrological model? Several CDIs were computed using 42 climate scenarios over the years 1961-2100 for two watersheds located in Québec, Canada. The relationship between the CDIs and hydrological data indices (HDIs; 7- and 30-day low flows for two hydrological seasons) was examined through correlation analysis to identify the indices governing low flows. Results of the Mann-Kendall test, with a modification for autocorrelated data, clearly identified trends. A partial correlation analysis allowed the observed trends in HDIs to be attributed to trends in specific CDIs. Furthermore, results showed that, even during the spatial validation process, the methodological framework was able to assess trends in low flow series from (i) trends in the effective drought index (EDI) computed from rainfall plus snowmelt minus PET amounts over ten to twelve months of the hydrological snow cover season, or (ii) the cumulative difference between rainfall and potential evapotranspiration over five months of the snow-free season. For 80% of the climate scenarios, trends in HDIs were successfully attributed to trends in CDIs. Overall, this paper introduces an efficient methodological framework to assess future trends in low flows given climate scenarios. The outcome may prove useful to municipalities concerned with source water management under changing climate conditions.
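    The Mann-Kendall test used above is a rank-based trend test on an annual series. The sketch below implements the uncorrected version (no ties, normal approximation); the modification for autocorrelated data applied in the paper (e.g., a variance-inflation correction of the Hamed and Rao type) is not reproduced here.

    ```python
    import numpy as np
    from scipy import stats

    def mann_kendall(x):
        """Uncorrected Mann-Kendall trend test (no-ties variance, normal approximation)."""
        x = np.asarray(x, dtype=float)
        n = x.size
        s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
        var_s = n * (n - 1) * (2 * n + 5) / 18.0
        z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0   # continuity correction
        p = 2.0 * (1.0 - stats.norm.cdf(abs(z)))
        return s, z, p

    # Example: a synthetic declining annual low-flow (or CDI) series
    rng = np.random.default_rng(5)
    series = 10.0 - 0.05 * np.arange(40) + rng.normal(0, 0.5, 40)
    print(mann_kendall(series))   # negative S and Z with a small p indicate a downward trend
    ```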

  6. An open-access CMIP5 pattern library for temperature and precipitation: description and methodology

    NASA Astrophysics Data System (ADS)

    Lynch, Cary; Hartin, Corinne; Bond-Lamberty, Ben; Kravitz, Ben

    2017-05-01

    Pattern scaling is used to efficiently emulate general circulation models and explore uncertainty in climate projections under multiple forcing scenarios. Pattern scaling methods assume that local climate changes scale with a global mean temperature increase, allowing for spatial patterns to be generated for multiple models for any future emission scenario. For uncertainty quantification and probabilistic statistical analysis, a library of patterns with descriptive statistics for each file would be beneficial, but such a library does not presently exist. Of the possible techniques used to generate patterns, the two most prominent are the delta and least squares regression methods. We explore the differences and statistical significance between patterns generated by each method and assess performance of the generated patterns across methods and scenarios. Differences in patterns across seasons between methods and epochs were largest in high latitudes (60-90° N/S). Bias and mean errors between modeled and pattern-predicted output from the linear regression method were smaller than patterns generated by the delta method. Across scenarios, differences in the linear regression method patterns were more statistically significant, especially at high latitudes. We found that pattern generation methodologies were able to approximate the forced signal of change to within ≤ 0.5 °C, but the choice of pattern generation methodology for pattern scaling purposes should be informed by user goals and criteria. This paper describes our library of least squares regression patterns from all CMIP5 models for temperature and precipitation on an annual and sub-annual basis, along with the code used to generate these patterns. The dataset and netCDF data generation code are available at doi:10.5281/zenodo.495632.

  7. Exploring precipitation pattern scaling methodologies and robustness among CMIP5 models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kravitz, Ben; Lynch, Cary; Hartin, Corinne

    Pattern scaling is a well-established method for approximating modeled spatial distributions of changes in temperature by assuming a time-invariant pattern that scales with changes in global mean temperature. We compare two methods of pattern scaling for annual mean precipitation (regression and epoch difference) and evaluate which method is better in particular circumstances by quantifying their robustness to interpolation/extrapolation in time, inter-model variations, and inter-scenario variations. Both the regression and epoch-difference methods (the two most commonly used methods of pattern scaling) have good absolute performance in reconstructing the climate model output, measured as an area-weighted root mean square error. We decompose the precipitation response in the RCP8.5 scenario into a CO2 portion and a non-CO2 portion. Extrapolating RCP8.5 patterns to reconstruct precipitation change in the RCP2.6 scenario results in large errors due to violations of pattern scaling assumptions when this CO2-/non-CO2-forcing decomposition is applied. As a result, the methodologies discussed in this paper can help provide precipitation fields to be utilized in other models (including integrated assessment models or impacts assessment models) for a wide variety of scenarios of future climate change.

  8. Exploring precipitation pattern scaling methodologies and robustness among CMIP5 models

    DOE PAGES

    Kravitz, Ben; Lynch, Cary; Hartin, Corinne; ...

    2017-05-12

    Pattern scaling is a well-established method for approximating modeled spatial distributions of changes in temperature by assuming a time-invariant pattern that scales with changes in global mean temperature. We compare two methods of pattern scaling for annual mean precipitation (regression and epoch difference) and evaluate which method is better in particular circumstances by quantifying their robustness to interpolation/extrapolation in time, inter-model variations, and inter-scenario variations. Both the regression and epoch-difference methods (the two most commonly used methods of pattern scaling) have good absolute performance in reconstructing the climate model output, measured as an area-weighted root mean square error. We decompose the precipitation response in the RCP8.5 scenario into a CO2 portion and a non-CO2 portion. Extrapolating RCP8.5 patterns to reconstruct precipitation change in the RCP2.6 scenario results in large errors due to violations of pattern scaling assumptions when this CO2-/non-CO2-forcing decomposition is applied. As a result, the methodologies discussed in this paper can help provide precipitation fields to be utilized in other models (including integrated assessment models or impacts assessment models) for a wide variety of scenarios of future climate change.
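
    The area-weighted root mean square error used above as the performance measure is straightforward to compute; the sketch below scores a pattern-scaled reconstruction against a synthetic model field. The grid, array shapes, and variable names are illustrative assumptions, not the paper's code.

```python
# Minimal sketch of an area-weighted RMSE between a pattern-scaled
# reconstruction and the "actual" model output. Values are synthetic.
import numpy as np

def area_weighted_rmse(field_a, field_b, lat):
    """RMSE between two (nlat, nlon) fields, weighted by cos(latitude)."""
    w = np.cos(np.deg2rad(lat))[:, None] * np.ones_like(field_a)
    diff2 = (field_a - field_b) ** 2
    return np.sqrt(np.sum(w * diff2) / np.sum(w))

# Reconstruct precipitation change as pattern * global mean temperature change,
# then score it against the model output for a given year.
lat = np.linspace(-88.75, 88.75, 72)
rng = np.random.default_rng(1)
pattern = rng.standard_normal((72, 144))      # mm/day per K (illustrative)
dT = 2.5                                      # assumed global mean warming, K
model_output = pattern * dT + 0.1 * rng.standard_normal((72, 144))
print(area_weighted_rmse(pattern * dT, model_output, lat))
```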

  9. Exploring climate change vulnerability across sectors and scenarios using indicators of impacts and coping capacity.

    PubMed

    Dunford, R; Harrison, P A; Jäger, J; Rounsevell, M D A; Tinch, R

    Addressing climate change vulnerability requires an understanding of both the level of climate impacts and the capacity of the exposed population to cope. This study developed a methodology for allowing users to explore vulnerability to changes in ecosystem services as a result of climatic and socio-economic changes. It focuses on the vulnerability of Europe across multiple sectors by combining the outputs of a regional integrated assessment (IA) model, the CLIMSAVE IA Platform, with maps of coping capacity based on the five capitals approach. The presented methodology enables stakeholder-derived socio-economic futures to be represented within a quantitative integrated modelling framework in a way that changes spatially and temporally with the socio-economic storyline. Vulnerability was mapped for six key ecosystem services in 40 combined climate and socio-economic scenarios. The analysis shows that, whilst the north and west of Europe are generally better placed to cope with climate impacts than the south and east, coping could be improved in all areas. Furthermore, whilst the lack of coping capacity in dystopian scenarios often leads to greater vulnerability, there are complex interactions between sectors that lead to patterns of vulnerability that vary spatially, with scenario and by sector even within the more utopian futures.
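
    As a rough illustration of how impact and coping-capacity layers can be combined into a vulnerability map, the sketch below flags grid cells where a projected ecosystem-service loss coincides with low coping capacity. It is a schematic simplification under stated assumptions, not the CLIMSAVE IA Platform's actual procedure; thresholds, shapes, and names are invented for the example.

```python
# Schematic sketch: combine an impact map and a coping-capacity map into a
# boolean vulnerability map. Thresholds and data are illustrative assumptions.
import numpy as np

def vulnerability_map(impact, coping, impact_threshold=0.0, coping_threshold=0.5):
    """impact: change in an ecosystem service (negative = loss); coping: 0..1 index.
    A cell is vulnerable where impact is adverse and coping capacity is low."""
    return (impact < impact_threshold) & (coping < coping_threshold)

rng = np.random.default_rng(2)
impact = rng.normal(-0.1, 0.3, size=(50, 60))    # e.g. change in food provision
coping = rng.uniform(0.0, 1.0, size=(50, 60))    # five-capitals coping index
print(vulnerability_map(impact, coping).mean())  # share of cells flagged vulnerable
```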

  10. Comparative techno-economic assessment and LCA of selected integrated sugarcane-based biorefineries.

    PubMed

    Gnansounou, Edgard; Vaskan, Pavel; Pachón, Elia Ruiz

    2015-11-01

    This work addresses the economic and environmental performance of integrated biorefineries based on sugarcane juice and residues. Four multiproduct scenarios were considered; two from sugar mills and the others from ethanol distilleries. They are integrated biorefineries producing first (1G) and second (2G) generation ethanol, sugar, molasses (for animal feed) and electricity in the context of Brazil. The scenarios were analysed and compared using a techno-economic value-based approach and LCA methodology. The results show that the best economic configuration is provided by the scenario with the largest ethanol production, while the best environmental performance is presented by the scenario with full integration of sugar and 1G2G ethanol production. Copyright © 2015 Elsevier Ltd. All rights reserved.

  11. Using Qualitative Hazard Analysis to Guide Quantitative Safety Analysis

    NASA Technical Reports Server (NTRS)

    Shortle, J. F.; Allocco, M.

    2005-01-01

    Quantitative methods can be beneficial in many types of safety investigations. However, there are many difficulties in using quantitative methods. For example, there may be little relevant data available. This paper proposes a framework for using qualitative hazard analysis to prioritize hazard scenarios most suitable for quantitative analysis. The framework first categorizes hazard scenarios by severity and likelihood. We then propose another metric, "modeling difficulty", that describes the complexity of modeling a given hazard scenario quantitatively. The combined metrics of severity, likelihood, and modeling difficulty help to prioritize hazard scenarios for which quantitative analysis should be applied. We have applied this methodology to proposed concepts of operations for reduced wake separation for airplane operations at closely spaced parallel runways.
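
    A minimal sketch of the prioritization idea described above follows: each hazard scenario gets severity, likelihood, and modeling-difficulty scores, and scenarios are ranked so that high-risk, tractable-to-model cases surface first. The scoring scales, combination formula, and example scenarios are illustrative assumptions, not the paper's actual scheme.

```python
# Illustrative prioritization of hazard scenarios by severity, likelihood,
# and modeling difficulty. Scales and scenarios are invented for the example.
from dataclasses import dataclass

@dataclass
class HazardScenario:
    name: str
    severity: int      # 1 (minor) .. 5 (catastrophic)
    likelihood: int    # 1 (improbable) .. 5 (frequent)
    difficulty: int    # 1 (easy to model) .. 5 (very hard to model)

def priority(s: HazardScenario) -> float:
    # High risk (severity x likelihood) and low modeling difficulty rank highest.
    return (s.severity * s.likelihood) / s.difficulty

scenarios = [
    HazardScenario("wake encounter on parallel approach", 4, 3, 2),
    HazardScenario("simultaneous missed approaches", 5, 1, 4),
    HazardScenario("blunder into adjacent approach path", 5, 2, 3),
]
for s in sorted(scenarios, key=priority, reverse=True):
    print(f"{s.name}: priority {priority(s):.1f}")
```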

  12. Scenario-Based Specification and Evaluation of Architectures for Health Monitoring of Aerospace Structures

    NASA Technical Reports Server (NTRS)

    Mukkamala, Ravi; Sundaram, P.

    2001-01-01

    HUMS systems have been an area of increased research in recent times due to two main reasons: (a) an increase in the occurrence of accidents in aerospace, and (b) stricter FAA regulations on aircraft maintenance [2]. There are several problems associated with the maintenance of aircraft that HUMS systems can solve through the use of several monitoring technologies. This paper documents our methodology of employing scenarios in the specification and evaluation of an architecture for HUMS. Section 2 investigates related work that uses scenarios in software development. Section 3 describes how we use scenarios in our work, which is followed by a demonstration of our methods in the development of HUMS in section 4. The conclusion summarizes the results.

  13. Environmental assessment of anaerobically digested sludge reuse in agriculture: potential impacts of emerging micropollutants.

    PubMed

    Hospido, Almudena; Carballa, Marta; Moreira, Maite; Omil, Francisco; Lema, Juan M; Feijoo, Gumersindo

    2010-05-01

    Agricultural application of sewage sludge has been emotionally discussed in the last decades, because the latter contains organic micropollutants with unknown fate and risk potential. In this work, the reuse of anaerobically digested sludge in agriculture is evaluated from an environmental point of view by using Life Cycle Assessment methodology. More specifically, the potential impacts of emerging micropollutants, such as pharmaceuticals and personal care products, present in the sludge have been quantified. Four scenarios were considered according to the temperature of the anaerobic digestion (mesophilic or thermophilic) and the sludge retention time (20 or 10d), and they have been compared with the non-treated sludge. From an environmental point of view, the disposal of undigested sludge is not the most suitable alternative, except for global warming due to the dominance (65-85%) of the indirect emissions associated to the electricity use. Nutrient-related direct emissions dominate the eutrophication category impact in all the scenarios (>71.4%), although a beneficial impact related to the avoidance of industrial fertilisers production is also quantified (up to 6.7%). In terms of human and terrestrial toxicity, the direct emissions of heavy metals to soil dominate these two impact categories (>70%), and the contribution of other micropollutants is minimal. Moreover, only six (Galaxolide, Tonalide, Diazepam, Ibuprofen, Sulfamethoxazole and 17alpha-ethinyloestradiol) out of the 13 substances considered are really significant since they account for more than 95% of the overall micropollutants impact.

  14. Life Cycle Assessment of Mixed Municipal Solid Waste: Multi-input versus multi-output perspective.

    PubMed

    Fiorentino, G; Ripa, M; Protano, G; Hornsby, C; Ulgiati, S

    2015-12-01

    This paper analyses four strategies for managing Mixed Municipal Solid Waste (MMSW) in terms of their environmental impacts and potential advantages by means of the Life Cycle Assessment (LCA) methodology. To this aim, both a multi-input and a multi-output approach are applied to evaluate the effect of these perspectives on selected impact categories. The analyzed management options include direct landfilling with energy recovery (S-1), Mechanical-Biological Treatment (MBT) followed by Waste-to-Energy (WtE) conversion (S-2), a combination of an innovative MBT/MARSS (Material Advanced Recovery Sustainable Systems) process and landfill disposal (S-3), and finally a combination of the MBT/MARSS process with WtE conversion (S-4). The MARSS technology, developed within a European LIFE PLUS framework and currently implemented at pilot plant scale, is an innovative MBT plant whose main goal is to yield a Renewable Refined Biomass Fuel (RRBF) to be used for combined heat and power (CHP) production under the regulations enforced for biomass-based plants rather than Waste-to-Energy systems, for increased environmental performance. The four scenarios are characterized by different resource investments for plant and infrastructure construction and different quantities of matter, heat and electricity recovery and recycling. Results, calculated per unit mass of waste treated and per unit exergy delivered, under both multi-input and multi-output LCA perspectives, point out improved performance for scenarios characterized by increased matter and energy recovery. Although none of the investigated scenarios is capable of providing the best performance in all the analyzed impact categories, scenario S-4 shows the best LCA results in the human toxicity and freshwater eutrophication categories, i.e. the ones with the highest impacts in all waste management processes. Copyright © 2015 Elsevier Ltd. All rights reserved.

  15. Small-body deflection techniques using spacecraft: Techniques in simulating the fate of ejecta

    NASA Astrophysics Data System (ADS)

    Schwartz, Stephen R.; Yu, Yang; Michel, Patrick; Jutzi, Martin

    2016-04-01

    We define a set of procedures to numerically study the fate of ejecta produced by the impact of an artificial projectile with the aim of deflecting an asteroid. Here we develop a simplified, idealized model of impact conditions that can be adapted to fit the details of specific deflection-test scenarios, such as what is being proposed for the AIDA project. Ongoing studies based upon the methodology described here can be used to inform observational strategies and safety conditions for an observing spacecraft. To account for ejecta evolution, the numerical strategies we are employing are varied and include a large N-Body component, a smoothed-particle hydrodynamics (SPH) component, and an application of impactor scaling laws. Simulations that use SPH-derived initial conditions show high-speed ejecta escaping at low angles of inclination, and very slowly moving ejecta lofting off the surface at higher inclination angles, some of which reimpacts the small-body surface. We are currently investigating the realism of this and other models' behaviors. Next steps will include the addition of solar perturbations to the model and applying the protocol developed here directly to specific potential mission concepts such as the proposed AIDA scenario.

  16. Future methods in pharmacy practice research.

    PubMed

    Almarsdottir, A B; Babar, Z U D

    2016-06-01

    This article describes the current and future pharmacy practice scenario underpinning and guiding pharmacy practice research, and then suggests future directions and strategies for such research. First, it sets the scene by discussing the key drivers that could influence change in pharmacy practice research: demographics, technology and professional standards. Second, deriving from this, it seeks to predict and forecast future shifts in the use of methodologies. Third, new research areas and the availability of data impacting on future methods are discussed. These include the impact of aging information technology users on healthcare, understanding and responding to cultural and social disparities, implementing multidisciplinary initiatives to improve health care, medicines optimization and predictive risk analysis, and pharmacy as a business and health care institution. Finally, the implications of these trends for pharmacy practice research methods are discussed.

  17. Systems scenarios: a tool for facilitating the socio-technical design of work systems.

    PubMed

    Hughes, Helen P N; Clegg, Chris W; Bolton, Lucy E; Machon, Lauren C

    2017-10-01

    The socio-technical systems approach to design is well documented. Recognising the benefits of this approach, organisations are increasingly trying to work with systems, rather than their component parts. However, few tools attempt to analyse the complexity inherent in such systems, in ways that generate useful, practical outputs. In this paper, we outline the 'System Scenarios Tool' (SST), which is a novel, applied methodology that can be used by designers, end-users, consultants or researchers to help design or re-design work systems. The paper introduces the SST using examples of its application, and describes the potential benefits of its use, before reflecting on its limitations. Finally, we discuss potential opportunities for the tool, and describe sets of circumstances in which it might be used. Practitioner Summary: The paper presents a novel, applied methodological tool, named the 'Systems Scenarios Tool'. We believe this tool can be used as a point of reference by designers, end-users, consultants or researchers, to help design or re-design work systems. Included in the paper are two worked examples, demonstrating the tool's application.

  18. Proceedings of the tenth annual DOE low-level waste management conference: Session 2: Site performance assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1988-12-01

    This document contains twelve papers on various aspects of low-level radioactive waste management. Topics of this volume include: performance assessment methodology; remedial action alternatives; site selection and site characterization procedures; intruder scenarios; sensitivity analysis procedures; mathematical models for mixed waste environmental transport; and risk assessment methodology. Individual papers were processed separately for the database. (TEM)

  19. Methane emission estimation from landfills in Korea (1978-2004): quantitative assessment of a new approach.

    PubMed

    Kim, Hyun-Sun; Yi, Seung-Muk

    2009-01-01

    Quantifying methane emission from landfills is important to evaluating measures for reduction of greenhouse gas (GHG) emissions. To quantify GHG emissions and identify sensitive parameters for their measurement, a new assessment approach consisting of six different scenarios was developed using Tier 1 (mass balance method) and Tier 2 (the first-order decay method) methodologies for GHG estimation from landfills, suggested by the Intergovernmental Panel on Climate Change (IPCC). Methane emissions using Tier 1 correspond to trends in disposed waste amount, whereas emissions from Tier 2 gradually increase as disposed waste decomposes over time. The results indicate that the amount of disposed waste and the decay rate for anaerobic decomposition were decisive parameters for emission estimation using Tier 1 and Tier 2. As for the different scenarios, methane emissions were highest under Scope 1 (scenarios I and II), in which all landfills in Korea were regarded as one landfill. Methane emissions under scenarios III, IV, and V, which separated the dissimilated fraction of degradable organic carbon (DOC(F)) by waste type and/or revised the methane correction factor (MCF) by waste layer, were underestimated compared with scenarios II and III. This indicates that the methodology of scenario I, which has been used in most previous studies, may lead to an overestimation of methane emissions. Additionally, separate DOC(F) and revised MCF were shown to be important parameters for methane emission estimation from landfills, and revised MCF by waste layer played an important role in emission variations. Therefore, more precise information on each landfill and careful determination of parameter values and characteristics of disposed waste in Korea should be used to accurately estimate methane emissions from landfills.
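
    For readers unfamiliar with the Tier 2 approach mentioned above, the sketch below implements a simplified first-order decay calculation in which methane generated in a given year sums the decaying contributions from waste landfilled in all earlier years. Parameter values and the disposal series are illustrative defaults, not the study's calibrated inputs.

```python
# Simplified first-order decay (IPCC Tier 2 style) methane estimate.
# Parameters (k, DOC, DOCf, MCF, F) and the disposal series are illustrative.
import numpy as np

def fod_methane(disposed, k=0.05, doc=0.15, doc_f=0.5, mcf=1.0, f=0.5):
    """disposed: array of wet waste landfilled per year (Gg).
    Returns methane generated per year (Gg CH4) via first-order decay."""
    n = len(disposed)
    ch4 = np.zeros(n)
    for t in range(n):
        for x in range(t + 1):
            ddoc = disposed[x] * doc * doc_f * mcf          # decomposable carbon
            ch4[t] += ddoc * f * (16.0 / 12.0) * k * np.exp(-k * (t - x))
    return ch4

disposed = np.full(27, 1000.0)    # e.g. 1000 Gg/yr landfilled over 1978-2004
print(fod_methane(disposed)[-1])  # methane generated in the final year
```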

  20. Assessment of vulnerability to future marine processes of urbanized coastal environments by a GIS-based approach: expected scenario in the metropolitan area of Bari (Italy)

    NASA Astrophysics Data System (ADS)

    Mancini, F.; Ceppi, C.; Christopulos, V.

    2013-12-01

    Literature concerning risk assessment procedures after extreme meteorological events generally focuses on establishing relationships between actual severe weather conditions and the impacts detected over the affected zones. Such events are classified on the basis of measurements and observations able to assess the magnitude of the phenomena, or on the basis of their effects on the affected area, the latter being deeply connected with the overall physical vulnerability. However, such assessments almost never consider scenarios of expected extreme events and possible patterns of urbanization at the time of impact, nor are the spatial and temporal uncertainties of the phenomena taken into account. Drawing future scenarios of coastal vulnerability to marine processes is therefore difficult. This work focuses on the case study of the Metropoli Terra di Bari (metropolitan area of Bari, Apulia, Italy), where a coastal vulnerability analysis was carried out for climate changes expected on the basis of expert opinion from the scientific community. Several possible impacts on the coastal environment were considered, in particular sea-level-rise inundation, flooding due to storm surge and coastal erosion. For this purpose, the methodology based on the SRES (Special Report on Emissions Scenarios) produced by the IPCC (Intergovernmental Panel on Climate Change) was adopted, after a regionalization procedure as carried out by Verburg and others (2006) at the European scale. The open-source software SLEUTH, based on the cellular automaton principle, was used, and the reliability of the obtained scenarios was verified through the Monte Carlo method. Once these scenarios were produced, a GIS-based multicriteria methodology was implemented to evaluate the vulnerability of the urbanized coastal area of interest. Several vulnerability maps are therefore available for different scenarios, able to consider the degree of hazard and the potential development of the typology and extent of urban settlements. The vulnerability assessments under different scenarios could represent a suitable tool for designing risk mitigation strategies under uncertain hazard scenarios.

  1. Development of Real-time Tsunami Inundation Forecast Using Ocean Bottom Tsunami Networks along the Japan Trench

    NASA Astrophysics Data System (ADS)

    Aoi, S.; Yamamoto, N.; Suzuki, W.; Hirata, K.; Nakamura, H.; Kunugi, T.; Kubo, T.; Maeda, T.

    2015-12-01

    In the 2011 Tohoku earthquake, in which the huge tsunami claimed a great number of lives, the initial tsunami forecast, based on hypocenter information estimated using seismic data on land, was greatly underestimated. From this lesson, NIED is now constructing S-net (Seafloor Observation Network for Earthquakes and Tsunamis along the Japan Trench), which consists of 150 ocean bottom observatories with seismometers and pressure gauges (tsunamimeters) linked by fiber optic cables. To take full advantage of S-net, we develop a new methodology of real-time tsunami inundation forecast using ocean bottom observation data and construct a prototype system that implements the developed forecasting method for the Pacific coast of Chiba prefecture (Sotobo area). We employ a database-based approach because inundation is a strongly non-linear phenomenon and its calculation costs are rather heavy. We prepare a tsunami scenario bank in advance, by constructing the possible tsunami sources and calculating the tsunami waveforms at S-net stations, coastal tsunami heights and tsunami inundation on land. To calculate the inundation for the target Sotobo area, we construct a 10-m-mesh precise elevation model with coastal structures. Based on the sensitivity analyses, we construct a tsunami scenario bank that efficiently covers possible tsunami scenarios affecting the Sotobo area. A real-time forecast is carried out by selecting from the tsunami scenario bank several possible scenarios that can well explain the real-time tsunami data observed at S-net. An advantage of our method is that tsunami inundations are estimated directly from the actual tsunami data without any source information, which may have large estimation errors. In addition to the forecast system, we develop Web services, APIs, and smartphone applications and refine them through social experiments to provide real-time tsunami observation and forecast information in an easy-to-understand way, urging people to evacuate.
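
    The scenario-bank selection step can be pictured as a nearest-match search: observed waveforms are compared against precomputed waveforms for every candidate source, and the best-fitting scenarios are retained. The sketch below uses a simple RMS misfit over synthetic data; the data structures, misfit measure, and station counts are assumptions, not the operational system's implementation.

```python
# Illustrative selection of best-matching scenarios from a precomputed bank
# using RMS misfit against observed waveforms. All data here are synthetic.
import numpy as np

def select_scenarios(observed, bank, n_best=3):
    """observed: (nstations, ntimes) observed waveforms.
    bank: dict scenario_id -> (nstations, ntimes) precomputed waveforms.
    Returns the n_best scenario ids ranked by RMS misfit."""
    misfit = {sid: np.sqrt(np.mean((wave - observed) ** 2)) for sid, wave in bank.items()}
    return sorted(misfit, key=misfit.get)[:n_best]

rng = np.random.default_rng(3)
truth = rng.standard_normal((150, 120))                       # 150 stations (assumed)
bank = {f"src_{i:03d}": truth + rng.normal(0, 0.5 + 0.1 * i, truth.shape)
        for i in range(50)}
observed = truth + rng.normal(0, 0.05, truth.shape)
print(select_scenarios(observed, bank))
```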

  2. A Discussion of Virtual Reality As a New Tool for Training Healthcare Professionals.

    PubMed

    Fertleman, Caroline; Aubugeau-Williams, Phoebe; Sher, Carmel; Lim, Ai-Nee; Lumley, Sophie; Delacroix, Sylvie; Pan, Xueni

    2018-01-01

    Virtual reality technology is an exciting and emerging field with vast applications. Our study sets out the viewpoint that virtual reality software could be a new focus of direction in the development of training tools in medical education. We carried out a panel discussion at the Center for Behavior Change 3rd Annual Conference, prompted by the study, "The Responses of Medical General Practitioners to Unreasonable Patient Demand for Antibiotics--A Study of Medical Ethics Using Immersive Virtual Reality" (1). In Pan et al.'s study, 21 general practitioners (GPs) and GP trainees took part in a videoed, 15-min virtual reality scenario involving unnecessary patient demands for antibiotics. This paper was discussed in-depth at the Center for Behavior Change 3rd Annual Conference; the content of this paper is a culmination of findings and feedback from the panel discussion. The experts involved have backgrounds in virtual reality, general practice, medicines management, medical education and training, ethics, and philosophy. Virtual reality is an unexplored methodology to instigate positive behavioral change among clinicians where other methods have been unsuccessful, such as antimicrobial stewardship. There are several arguments in favor of use of virtual reality in medical education: it can be used for "difficult to simulate" scenarios and to standardize a scenario, for example, for use in exams. However, there are limitations to its usefulness because of the cost implications and the lack of evidence that it results in demonstrable behavior change.

  3. A Comparison of Probabilistic and Deterministic Campaign Analysis for Human Space Exploration

    NASA Technical Reports Server (NTRS)

    Merrill, R. Gabe; Andraschko, Mark; Stromgren, Chel; Cirillo, Bill; Earle, Kevin; Goodliff, Kandyce

    2008-01-01

    Human space exploration is by its very nature an uncertain endeavor. Vehicle reliability, technology development risk, budgetary uncertainty, and launch uncertainty all contribute to stochasticity in an exploration scenario. However, traditional strategic analysis has been done in a deterministic manner, analyzing and optimizing the performance of a series of planned missions. History has shown that exploration scenarios rarely follow such a planned schedule. This paper describes a methodology to integrate deterministic and probabilistic analysis of scenarios in support of human space exploration. Probabilistic strategic analysis is used to simulate "possible" scenario outcomes, based upon the likelihood of occurrence of certain events and a set of pre-determined contingency rules. The results of the probabilistic analysis are compared to the nominal results from the deterministic analysis to evaluate the robustness of the scenario to adverse events and to test and optimize contingency planning.
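
    As a schematic illustration of the probabilistic strategic analysis described above, the sketch below draws many Monte Carlo campaign outcomes from assumed event probabilities and a single contingency rule, and compares their spread with the deterministic (nominal) plan. All probabilities, rules, and mission counts are invented for the example.

```python
# Schematic Monte Carlo campaign simulation with one contingency rule
# (a single re-flight after a failure). Probabilities are illustrative.
import random

def simulate_campaign(n_missions=10, p_launch_success=0.97, p_vehicle_ok=0.95,
                      max_retries=1, seed=None):
    rng = random.Random(seed)
    delivered = 0
    for _ in range(n_missions):
        attempts = 0
        while attempts <= max_retries:          # contingency rule: one re-flight allowed
            attempts += 1
            if rng.random() < p_launch_success and rng.random() < p_vehicle_ok:
                delivered += 1
                break
    return delivered

runs = [simulate_campaign(seed=i) for i in range(10_000)]
deterministic_plan = 10                          # nominal plan: every mission succeeds
print(sum(runs) / len(runs), min(runs), deterministic_plan)
```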

  4. Towards an optimal adaptation of exposure to NOAA assessment methodology in Multi-Source Industrial Scenarios (MSIS): the challenges and the decision-making process

    NASA Astrophysics Data System (ADS)

    López de Ipiña, JM; Vaquero, C.; Gutierrez-Cañas, C.

    2017-06-01

    A progressive increase is expected in the number of industrial processes that manufacture intermediate (iNEPs) and end products (eNEPs) incorporating ENMs to bring about improved properties. Therefore, the assessment of occupational exposure to airborne NOAA will migrate from the simple and well-controlled exposure scenarios found in research laboratories and ENM production plants using innovative production technologies, to much more complex exposure scenarios located around processes for the manufacture of eNEPs that, in many cases, will be modified conventional production processes. Here we discuss some of the typical challenging situations in the process of risk assessment of inhalation exposure to NOAA in Multi-Source Industrial Scenarios (MSIS), on the basis of lessons learned when confronting those scenarios in the frame of several European and Spanish research projects.

  5. Development of nonproliferation and assessment scenarios.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Finley, Melissa; Barnett, Natalie Beth

    2005-10-01

    The overall objective of the Nonproliferation and Assessments Scenario Development project is to create and analyze potential and plausible scenarios that would lead to an adversary's ability to acquire and use a biological weapon. The initial three months of funding was intended to be used to develop a scenario to demonstrate the efficacy of this analysis methodology; however, it was determined that a substantial amount of preliminary data collection would be needed before a proof-of-concept scenario could be developed. We have dedicated substantial effort to determine the acquisition pathways for Foot and Mouth Disease Virus, and similar processes will be applied to all pathogens of interest. We have developed a biosecurity assessments database to capture information on adversary skill locales, available skill sets in specific regions, pathogen sources and regulations involved in pathogen acquisition from legitimate facilities. FY06 funding, once released, will be dedicated to data collection on acquisition, production and dissemination requirements on a pathogen basis. Once pathogen data has been collected, scenarios will be developed and scored.

  6. CIRMIS Data system. Volume 2. Program listings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Friedrichs, D.R.

    1980-01-01

    The Assessment of Effectiveness of Geologic Isolation Systems (AEGIS) Program is developing and applying the methodology for assessing the far-field, long-term post-closure safety of deep geologic nuclear waste repositories. AEGIS is being performed by Pacific Northwest Laboratory (PNL) under contract with the Office of Nuclear Waste Isolation (ONWI) for the Department of Energy (DOE). One task within AEGIS is the development of methodology for analysis of the consequences (water pathway) from loss of repository containment as defined by various release scenarios. Analysis of the long-term, far-field consequences of release scenarios requires the application of numerical codes which simulate the hydrologic systems, model the transport of released radionuclides through the hydrologic systems to the biosphere, and, where applicable, assess the radiological dose to humans. The various input parameters required in the analysis are compiled in data systems. The data are organized and prepared by various input subroutines for utilization by the hydrologic and transport codes. The hydrologic models simulate the groundwater flow systems and provide water flow directions, rates, and velocities as inputs to the transport models. Outputs from the transport models are basically graphs of radionuclide concentration in the groundwater plotted against time. After dilution in the receiving surface-water body (e.g., lake, river, bay), these data are the input source terms for the dose models, if dose assessments are required. The dose models calculate radiation dose to individuals and populations. The CIRMIS (Comprehensive Information Retrieval and Model Input Sequence) Data System is a storage and retrieval system for model input and output data, including graphical interpretation and display. This is the second of four volumes of the description of the CIRMIS Data System.

  7. Assessment of effectiveness of geologic isolation systems. CIRMIS data system. Volume 4. Driller's logs, stratigraphic cross section and utility routines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Friedrichs, D.R.

    1980-01-01

    The Assessment of Effectiveness of Geologic Isolation Systems (AEGIS) Program is developing and applying the methodology for assessing the far-field, long-term post-closure safety of deep geologic nuclear waste repositories. AEGIS is being performed by Pacific Northwest Laboratory (PNL) under contract with the Office of Nuclear Waste Isolation (ONWI) for the Department of Energy (DOE). One task within AEGIS is the development of methodology for analysis of the consequences (water pathway) from loss of repository containment as defined by various release scenarios. Analysis of the long-term, far-field consequences of release scenarios requires the application of numerical codes which simulate the hydrologic systems, model the transport of released radionuclides through the hydrologic systems to the biosphere, and, where applicable, assess the radiological dose to humans. The various input parameters required in the analysis are compiled in data systems. The data are organized and prepared by various input subroutines for use by the hydrologic and transport codes. The hydrologic models simulate the groundwater flow systems and provide water flow directions, rates, and velocities as inputs to the transport models. Outputs from the transport models are basically graphs of radionuclide concentration in the groundwater plotted against time. After dilution in the receiving surface-water body (e.g., lake, river, bay), these data are the input source terms for the dose models, if dose assessments are required. The dose models calculate radiation dose to individuals and populations. The CIRMIS (Comprehensive Information Retrieval and Model Input Sequence) Data System is a storage and retrieval system for model input and output data, including graphical interpretation and display. This is the fourth of four volumes of the description of the CIRMIS Data System.

  8. Ecological-economic assessment of the effects of freshwater flow in the Florida Everglades on recreational fisheries.

    PubMed

    Brown, Christina Estela; Bhat, Mahadev G; Rehage, Jennifer S; Mirchi, Ali; Boucek, Ross; Engel, Victor; Ault, Jerald S; Mozumder, Pallab; Watkins, David; Sukop, Michael

    2018-06-15

    This research develops an integrated methodology to determine the economic value to anglers of recreational fishery ecosystem services in Everglades National Park that could result from different water management scenarios. The study first used bio-hydrological models to link managed freshwater inflows to indicators of fishery productivity and ecosystem health, then linked those models to anglers' willingness-to-pay for various attributes of the recreational fishing experience and monthly fishing effort. This approach allowed us to estimate the foregone economic benefits of failing to meet monthly freshwater delivery targets. The study found that the managed freshwater delivery to the Park had declined substantially over the years and had fallen short of management targets. This shortage in the flow resulted in the decline of biological productivity of recreational fisheries in downstream coastal areas. This decline had in turn contributed to reductions in the overall economic value of recreational ecosystem services enjoyed by anglers. The study estimated the annual value of lost recreational services at $68.81 million. The losses were greater in the months of the dry season, when the water shortage was higher and the number of anglers fishing also was higher than in the wet season. The study also developed conservative estimates of the implicit price of water for recreation, which ranged from $11.88 per AF in November to $112.11 per AF in April. The annual average price was $41.54 per AF. Linking anglers' recreational preference directly to a decision variable such as water delivery is a powerful and effective way to make management decisions. This methodology has relevant applications to water resource management, serving as useful decision-support metrics, as well as for policy and restoration scenario analysis. Copyright © 2018 Elsevier B.V. All rights reserved.

  9. Hydraulic and Condition Assessment of Existing Sewerage Network: A Case Study of an Educational Institute

    NASA Astrophysics Data System (ADS)

    Sourabh, Nishant; Timbadiya, P. V.

    2018-04-01

    The hydraulic simulation of an existing sewerage network provides various information about critical points to assess the deteriorating condition and to help in the rehabilitation of the existing network and its future expansion. In the present study, hydraulic and condition assessment of the existing network of an educational institute (i.e. Sardar Vallabhbhai National Institute of Technology-Surat, Gujarat, India), having an area of 100 ha and ground levels in the range of 5.0-9.0 m above mean sea level, has been carried out using sewage flow simulation for existing and future scenario analysis in SewerGEMS V8i. The paper describes the features of the 4.79 km long sewerage network of the institute, followed by network model simulation for the aforesaid scenarios and recommendations on improvement of the existing network for future use. The total sewer loads for the present and future scenarios are 1.67 million litres per day (MLD) and 3.62 MLD, considering a peak factor of 3 on the basis of population. The hydraulic simulation of the existing scenario indicated a depth-by-diameter (d/D) ratio in the range of 0.02-0.48 and a velocity range of 0.08-0.53 m/s for the existing network under the present scenario. For the future scenario, the existing network needs to be modified, and it was found that a total of 11 conduits (length: 464.8 m) should be replaced with the next higher available diameter, i.e., 350 mm, to utilize the existing network for the future scenario. The present study provides the methodology for condition assessment of an existing network and its utilization as per the guidelines provided by the Central Public Health and Environmental Engineering Organization, 2013. The methodology presented in this paper can be used by municipal/public health engineers for the assessment of an existing sewerage network for its serviceability and improvement in future.
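
    The kind of check behind the reported d/D and velocity ranges can be sketched as follows: estimate the peak sewage flow from population, per-capita supply, and a peak factor, then compare it with the full-flow capacity of a circular sewer from Manning's equation. All input values below are illustrative assumptions, not the institute's survey data.

```python
# Illustrative peak sewage flow estimate and Manning full-flow capacity check.
# Population, per-capita supply, diameter, and slope are assumed values.
import math

def peak_sewage_flow_m3s(population, lpcd=135.0, peak_factor=3.0, sewage_fraction=0.8):
    """Peak flow in m3/s from population, per-capita water supply (L/day),
    the fraction reaching the sewer, and a peak factor."""
    avg_m3s = population * lpcd * sewage_fraction / (1000.0 * 86400.0)
    return avg_m3s * peak_factor

def manning_full_flow_m3s(diameter_m, slope, n=0.013):
    """Full-flow capacity of a circular pipe from Manning's equation."""
    area = math.pi * diameter_m ** 2 / 4.0
    rh = diameter_m / 4.0                        # hydraulic radius of a full circular pipe
    return (1.0 / n) * area * rh ** (2.0 / 3.0) * math.sqrt(slope)

qp = peak_sewage_flow_m3s(population=8000)
qf = manning_full_flow_m3s(diameter_m=0.35, slope=1 / 250)
print(f"peak flow {qp:.4f} m3/s, capacity {qf:.4f} m3/s, utilisation {qp / qf:.2f}")
```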

  10. Selection and Training of Field Artillery Forward Observers: Methodologies for Improving Target Acquisition Skills

    DTIC Science & Technology

    1979-07-01

    African scenario.) The training analysis revealed some discrepancies between the list of tasks taught in FAOBC and the list of tasks emerging from the... population density. (Refer to Figure 3-2). The African combat scenario, closely followed by the Middle Eastern scenario, was rated as being the most...

  11. Valuating Indonesian upstream oil management scenario through system dynamics modelling

    NASA Astrophysics Data System (ADS)

    Ketut Gunarta, I.; Putri, F. A.

    2018-04-01

    Under the existing regulation in Constitution Number 22 Year 2001 (UU No. 22 Tahun 2001), the Production Sharing Contract (PSC) continues to be the scenario for conducting oil and gas upstream mining activities, as under the previous regulation (UU No. 8 Tahun 1971). Because of the high costs and risks of upstream mining activities, the contractors are dominated by foreign companies, while the National Oil Company (NOC) does not play a large role. The domination of foreign contractor companies has also raised concerns in Indonesia over issues of energy independence and energy security. Therefore, to achieve the goals of energy independence and security, the scenario regulating upstream oil activities needs to be revised. The scenarios compared are the current scenario, PSC, and a "full concession" scenario for the National Oil Company (NOC) in managing oil upstream mining activities. Both scenarios are modelled using the System Dynamics methodology and assessed further using the financial valuation method of the income approach. Under the two scenarios, the authors compare which scenario is better for upstream oil management in reaching the goals mentioned above and more profitable in financial terms. The simulation shows that the concession scenario offers a better option than PSC for reaching energy independence and energy security.

  12. Alternative futures of proactive tools for a citizen's own wellbeing.

    PubMed

    Meristö, Tarja; Tuohimaa, Hanna; Leppimäki, Sami; Laitinen, Jukka

    2009-01-01

    The aim of this paper is to create the basis for a vision of an empowered citizen who can control his/her life, especially in relation to health and personal wellbeing, with the use of new ICT tools. The methods used in the study are based on futures studies, especially on scenario methodology. Alternative future paths, i.e. scenarios, are constructed using the scenario filter model that we have developed, with market, technology and society perspectives. Scenarios not resulting in the vision are described in a what-if analysis as well. The scenarios are combined with Viherä's model on citizens' skills, access and motivation to use new ICT tools. The COPER concept is targeted at different user groups with an adaptable user interface, and its development is user centered. We consider the effects and the appropriate elements of COPER in every scenario, as well as the possibilities and challenges nursing will confront. As a result, we gain information on the characteristics of COPER that advance the vision. For the future development of COPER, the alternative scenarios also give the basis for flexibility planning.

  13. Predicting trends of invasive plants richness using local socio-economic data: An application in North Portugal

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Santos, Mario, E-mail: mgsantoss@gmail.com; Freitas, Raul, E-mail: raulfreitas@portugalmail.com; Crespi, Antonio L., E-mail: aluis.crespi@gmail.com

    2011-10-15

    This study assesses the potential of an integrated methodology for predicting local trends in invasive exotic plant species (invasive richness) using indirect, regional information on human disturbance. The distribution of invasive plants was assessed in North Portugal using herbarium collections and local environmental, geophysical and socio-economic characteristics. Invasive richness response to anthropogenic disturbance was predicted using a dynamic model based on a sequential modeling process (stochastic dynamic methodology, StDM). Derived scenarios showed that invasive richness trends were clearly associated with ongoing socio-economic change. Simulations including scenarios of growing urbanization showed an increase in invasive richness, while simulations in municipalities with decreasing populations showed stable or decreasing levels of invasive richness. The model simulations demonstrate the interest and feasibility of using this methodology in disturbance ecology. Highlights: Socio-economic data indicate human-induced disturbances. Socio-economic development increases disturbance in ecosystems. Disturbance promotes opportunities for invasive plants. Increased opportunities promote richness of invasive plants. Increases in richness of invasive plants change natural ecosystems.

  14. Cognitive Task Analysis of Business Jet Pilots' Weather Flying Behaviors: Preliminary Results

    NASA Technical Reports Server (NTRS)

    Latorella, Kara; Pliske, Rebecca; Hutton, Robert; Chrenka, Jason

    2001-01-01

    This report presents preliminary findings from a cognitive task analysis (CTA) of business aviation piloting. Results describe challenging weather-related aviation decisions and the information and cues used to support these decisions. Further, these results demonstrate the role of expertise in business aviation decision-making in weather flying, and how weather information is acquired and assessed for reliability. The challenging weather scenarios and novice errors identified in the results provide the basis for experimental scenarios and dependent measures to be used in future flight simulation evaluations of candidate aviation weather information systems. Finally, we analyzed these preliminary results to recommend design and training interventions to improve business aviation decision-making with weather information. The primary objective of this report is to present these preliminary findings and to document the extended CTA methodology used to elicit and represent expert business aviator decision-making with weather information. These preliminary findings will be augmented with results from additional subjects using this methodology. A summary of the complete results, absent the detailed treatment of methodology provided in this report, will be documented in a separate publication.

  15. Improved water resource management using three dimensional groundwater modelling for a highly complex environment

    NASA Astrophysics Data System (ADS)

    Moeck, Christian; Affolter, Annette; Radny, Dirk; Auckenthaler, Adrian; Huggenberger, Peter; Schirmer, Mario

    2017-04-01

    Proper allocation and management of groundwater is an important and critical challenge under the rising water demands of various environmental sectors, but good groundwater quality is often limited because of urbanization and contamination of aquifers. Given the predictive capability of groundwater models, they are often the only viable means of providing input to water management decisions. However, modelling flow and transport processes can be difficult due to unknown subsurface heterogeneity and the typically unknown distribution of contaminants. As a result, water resource management tasks are based on uncertain assumptions about contaminant patterns, and this uncertainty is typically not incorporated into the assessment of risks associated with different proposed management scenarios. A three-dimensional groundwater model was used to improve water resource management for a study area where drinking water production is close to several former landfills and industrial areas. To avoid drinking water contamination, artificial groundwater recharge with surface water into the gravel aquifer is used to create a hydraulic barrier between contaminated sites and drinking water extraction wells. The model was used for simulating existing and proposed water management strategies as a tool to ensure the utmost security for drinking water. A systematic evaluation of the flow direction and magnitude between existing observation points, using a newly developed three-point estimation method for a large number of scenarios, was carried out. Owing to the numerous observation points, 32 triangles (three-point sets) were created that cover the entire area around the Hardwald. We demonstrated that systematically applying our developed methodology helps to identify important locations that are sensitive to changing boundary conditions and where additional protection is required, without computationally demanding transport modelling. The presented integrated approach using the flow direction between observation points can easily be transferred to a variety of hydrological settings to systematically evaluate groundwater modelling scenarios.
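
    The three-point estimation mentioned above can be illustrated by fitting a plane through the hydraulic heads of three observation wells to obtain the local gradient magnitude and the down-gradient flow direction. The sketch below uses invented coordinates and heads; it is not the study's implementation.

```python
# Illustrative three-point (three-well) estimate of hydraulic gradient and
# flow direction. Well coordinates and heads are invented values.
import numpy as np

def three_point_gradient(xy, heads):
    """xy: (3, 2) well coordinates in metres; heads: (3,) hydraulic heads in m.
    Returns (gradient magnitude, flow azimuth in degrees clockwise from north)."""
    A = np.column_stack([xy, np.ones(3)])
    a, b, _ = np.linalg.solve(A, heads)        # fit plane h = a*x + b*y + c
    flow = np.array([-a, -b])                  # groundwater flows down-gradient
    azimuth = np.degrees(np.arctan2(flow[0], flow[1])) % 360.0
    return np.hypot(a, b), azimuth

wells = np.array([[0.0, 0.0], [500.0, 50.0], [120.0, 430.0]])
heads = np.array([252.4, 251.1, 252.0])
print(three_point_gradient(wells, heads))
```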

  16. Assessing Consequential Scenarios in a Complex Operational Environment Using Agent Based Simulation

    DTIC Science & Technology

    2017-03-16

    ...(RWISE); Conflict Modeling, Planning, and Outcomes Experimentation Program (COMPOEX); Joint Non-Kinetic Effects Model (JNEM)/Athena... experimental design and testing. Types and Attributes of Agent-Based Model Design Patterns: using the aforementioned ABM flowchart design methodology... speed, or flexibility during tactical US Army wargaming. The report considers methodologies to improve analysis of the human domain, identifies...

  17. Assessing Hydrologic Impacts of Future Land Cover Change ...

    EPA Pesticide Factsheets

    Long‐term land‐use and land cover change and their associated impacts pose critical challenges to sustaining vital hydrological ecosystem services for future generations. In this study, a methodology was developed on the San Pedro River Basin to characterize hydrologic impacts from future urban growth through time. This methodology was then expanded and utilized to characterize the changing hydrology on the South Platte River Basin. Future urban growth is represented by housing density maps generated in decadal intervals from 2010 to 2100, produced by the U.S. Environmental Protection Agency (EPA) Integrated Climate and Land‐Use Scenarios (ICLUS) project. ICLUS developed future housing density maps by adapting the Intergovernmental Panel on Climate Change (IPCC) Special Report on Emissions Scenarios (SRES) social, economic, and demographic storylines to the conterminous United States. To characterize hydrologic impacts from future growth, the housing density maps were reclassified to National Land Cover Database (NLCD) 2006 land cover classes and used to parameterize the Soil and Water Assessment Tool (SWAT) using the Automated Geospatial Watershed Assessment (AGWA) tool. The objectives of this project were to 1) develop and describe a methodology for adapting the ICLUS data for use in AGWA as an approach to evaluate basin‐wide impacts of development on water‐quantity and ‐quality, 2) present initial results from the application of the methodology to

  18. A modelling approach for the assessment of the effects of Common Agricultural Policy measures on farmland biodiversity in the EU27.

    PubMed

    Overmars, Koen P; Helming, John; van Zeijts, Henk; Jansson, Torbjörn; Terluin, Ida

    2013-09-15

    In this paper we describe a methodology to model the impacts of policy measures within the Common Agricultural Policy (CAP) on farm production, income and prices, and on farmland biodiversity. Two stylised scenarios are used to illustrate how the method works. The effects of CAP measures, such as subsidies and regulations, are calculated and translated into changes in land use and land-use intensity. These factors are then used to model biodiversity with a species-based indicator on a 1 km scale in the EU27. The Common Agricultural Policy Regionalised Impact Modelling System (CAPRI) is used to conduct the economic analysis and Dyna-CLUE (Conversion of Land Use and its Effects) is used to model land use changes. An indicator that expresses the relative species richness was used as the indicator for biodiversity in agricultural areas. The methodology is illustrated with a baseline scenario and two scenarios that include a specific policy. The strength of the methodology is that impacts of economic policy instruments can be linked to changes in agricultural production, prices and incomes, on the one hand, and to biodiversity effects, on the other - with land use and land-use intensity as the connecting drivers. The method provides an overall assessment, but for detailed impact assessment at landscape, farm or field level, additional analysis would be required. Copyright © 2013 Elsevier Ltd. All rights reserved.

  19. Effects of clinical supervision on resident learning and patient care during simulated ICU scenarios.

    PubMed

    Piquette, Dominique; Tarshis, Jordan; Regehr, Glenn; Fowler, Robert A; Pinto, Ruxandra; LeBlanc, Vicki R

    2013-12-01

    Closer supervision of residents' clinical activities has been promoted to improve patient safety, but may additionally affect resident participation in patient care and learning. The objective of this study was to determine the effects of closer supervision on patient care, resident participation, and the development of resident ability to care independently for critically ill patients during simulated scenarios. This quantitative study represents a component of a larger mixed-methods study. Residents were randomized to one of three levels of supervision, defined by the physical proximity of the supervisor (distant, immediately available, and direct). Each resident completed a simulation scenario under the supervision of a critical care fellow, immediately followed by a modified scenario of similar content without supervision. The simulation center of a tertiary, university-affiliated academic center in a large urban city. Fifty-three residents completing a critical care rotation and 24 critical care fellows were recruited between April 2009 and June 2010. None. During the supervised scenarios, lower team performance checklist scores were obtained for distant supervision compared with immediately available and direct supervision (mean [SD], direct: 72% [12%] vs immediately available: 77% [10%] vs distant: 61% [11%]; p = 0.0013). The percentage of checklist items completed by the residents themselves was significantly lower during direct supervision (median [interquartile range], direct: 40% [21%] vs immediately available: 58% [16%] vs distant: 55% [11%]; p = 0.005). During unsupervised scenarios, no significant differences were found on the outcome measures. Care delivered in the presence of senior supervising physicians was more comprehensive than care delivered without access to a bedside supervisor, but was associated with lower resident participation. However, subsequent resident performance during unsupervised scenarios was not adversely affected. Direct supervision of residents leads to improved care process and does not diminish the subsequent ability of residents to function independently.

  20. Energy Efficiency Appliance Standards: Where do we stand, how far can we go and how do we get there? An analysis across several economies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Letschert, Virginie E.; de la Rue du Can, Stephane; McNeil, Michael A.

    This paper analyses several potential savings scenarios for minimum energy performance standards (MEPS) and comparable programs for governments participating in the Super-efficient Equipment and Appliance Deployment (SEAD) Initiative of the Clean Energy Ministerial, which represent over 60% of primary energy consumption in the world. We compare projected energy savings from the main end uses in the residential sector using three energy efficiency scenarios: (1) recent achievements, (2) cost-effective saving potential, and (3) energy efficiency technical potential. The recent achievement scenario (1) evaluates the future impact of MEPS enacted or under development between 2010 and 2012. The cost-effective potential scenario (2) identifies the maximum potential for energy efficiency that results in net benefits to the consumer. The best available technology scenario (3) represents the full potential of energy efficiency, considering best available technologies as candidates for MEPS and incentive programs. We use the Bottom Up Energy Analysis System (BUENAS), developed by Lawrence Berkeley National Laboratory in collaboration with the Collaborative Labelling and Appliance Standards Program (CLASP), to provide a consistent methodology to compare the different scenarios. This paper focuses on the main end uses in the residential sector. The comparison of the three scenarios for each economy identifies possible opportunities for scaling up current policies or implementing additional policies. This comparison across economies reveals country best practices as well as end uses that present the greatest additional potential savings. The paper describes areas where methodologies and additional policy instruments can increase the penetration of energy efficient technologies. First, we summarize the barriers and provide remedial policy tools/best practices, such as techno-economic analysis, in response to each barrier that prevents economies from capturing the full cost-effective potential of MEPS (Scenario 1 to 2). Then, we consider possible complementary policy options, such as incentive programs, to reach the full technical potential of energy efficiency in the residential sector (Scenario 2 to 3).
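
    The scenario comparison logic can be reduced to a simple stock-times-unit-savings calculation, sketched below for a single end use: annual savings are the appliance stock multiplied by the gap in unit energy consumption between the baseline and each scenario. The figures are illustrative assumptions, not BUENAS inputs or outputs.

```python
# Illustrative scenario comparison for one residential end use.
# Unit energy consumption (UEC) values and stock size are assumed numbers.
SCENARIOS = {                      # UEC, kWh/yr per appliance (illustrative)
    "baseline": 450.0,
    "recent MEPS": 380.0,
    "cost-effective potential": 310.0,
    "best available technology": 250.0,
}

def annual_savings_twh(stock_millions: float, scenario: str) -> float:
    """Annual savings relative to the baseline, in TWh/yr."""
    gap = SCENARIOS["baseline"] - SCENARIOS[scenario]
    return stock_millions * 1e6 * gap / 1e9    # kWh -> TWh

for name in SCENARIOS:
    print(name, round(annual_savings_twh(stock_millions=150, scenario=name), 1), "TWh/yr")
```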

  1. Alternative Fuel Transportation Optimization Tool : Description, Methodology, and Demonstration Scenarios.

    DOT National Transportation Integrated Search

    2015-09-01

    This report describes an Alternative Fuel Transportation Optimization Tool (AFTOT), developed by the U.S. Department of Transportation (DOT) Volpe National Transportation Systems Center (Volpe) in support of the Federal Aviation Administration (FAA)....

  2. Probabilistic Finite Elements (PFEM) structural dynamics and fracture mechanics

    NASA Technical Reports Server (NTRS)

    Liu, Wing-Kam; Belytschko, Ted; Mani, A.; Besterfield, G.

    1989-01-01

    The purpose of this work is to develop computationally efficient methodologies for assessing the effects of randomness in loads, material properties, and other aspects of a problem by a finite element analysis. The resulting group of methods is called probabilistic finite elements (PFEM). The overall objective of this work is to develop methodologies whereby the lifetime of a component can be predicted, accounting for the variability in the material and geometry of the component, the loads, and other aspects of the environment; and the range of response expected in a particular scenario can be presented to the analyst in addition to the response itself. Emphasis has been placed on methods which are not statistical in character; that is, they do not involve Monte Carlo simulations. The reason for this choice of direction is that Monte Carlo simulations of complex nonlinear response require a tremendous amount of computation. The focus of efforts so far has been on nonlinear structural dynamics. However, in the continuation of this project, emphasis will be shifted to probabilistic fracture mechanics so that the effect of randomness in crack geometry and material properties can be studied interactively with the effect of random load and environment.
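
    A minimal sketch of the non-Monte-Carlo idea behind PFEM follows: input randomness is propagated to the response through a first-order (perturbation) expansion about the mean, shown here for a single-degree-of-freedom spring rather than a full finite element model. The numbers and the Monte Carlo cross-check are illustrative assumptions, not results from the cited work.

```python
# First-order second-moment (perturbation) estimate of response statistics for
# u = F/k with random stiffness k, cross-checked against brute-force Monte Carlo
# (the costly approach PFEM seeks to avoid). All values are illustrative.
import numpy as np

def fosm_displacement(force, k_mean, k_std):
    """First-order second-moment estimate of the mean and std of u = F/k."""
    u_mean = force / k_mean
    du_dk = -force / k_mean ** 2           # sensitivity evaluated at the mean
    u_std = abs(du_dk) * k_std
    return u_mean, u_std

u_mean, u_std = fosm_displacement(force=10.0e3, k_mean=2.0e6, k_std=0.2e6)

rng = np.random.default_rng(4)
k_samples = rng.normal(2.0e6, 0.2e6, 200_000)
u_samples = 10.0e3 / k_samples
print(u_mean, u_std)
print(u_samples.mean(), u_samples.std())
```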

  3. Quantifying construction and demolition waste: an analytical review.

    PubMed

    Wu, Zezhou; Yu, Ann T W; Shen, Liyin; Liu, Guiwen

    2014-09-01

    Quantifying construction and demolition (C&D) waste generation is regarded as a prerequisite for the implementation of successful waste management. In literature, various methods have been employed to quantify the C&D waste generation at both regional and project levels. However, an integrated review that systemically describes and analyses all the existing methods has yet to be conducted. To bridge this research gap, an analytical review is conducted. Fifty-seven papers are retrieved based on a set of rigorous procedures. The characteristics of the selected papers are classified according to the following criteria - waste generation activity, estimation level and quantification methodology. Six categories of existing C&D waste quantification methodologies are identified, including site visit method, waste generation rate method, lifetime analysis method, classification system accumulation method, variables modelling method and other particular methods. A critical comparison of the identified methods is given according to their characteristics and implementation constraints. Moreover, a decision tree is proposed for aiding the selection of the most appropriate quantification method in different scenarios. Based on the analytical review, limitations of previous studies and recommendations of potential future research directions are further suggested. Copyright © 2014 Elsevier Ltd. All rights reserved.
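
    One of the methods identified above, the waste generation rate method, amounts to multiplying a gross floor area by an activity-specific generation rate; the sketch below shows that calculation with hypothetical rates and project areas (the values are assumptions, not figures from the reviewed studies).

```python
# Illustrative waste-generation-rate calculation for C&D waste.
# Rates (kg per m2 of gross floor area) and project areas are hypothetical.
GENERATION_RATES_KG_PER_M2 = {
    "new construction": 40.0,
    "renovation": 120.0,
    "demolition": 1000.0,
}

def cd_waste_tonnes(activity: str, floor_area_m2: float) -> float:
    """C&D waste in tonnes = floor area x activity-specific generation rate."""
    return GENERATION_RATES_KG_PER_M2[activity] * floor_area_m2 / 1000.0

projects = [("new construction", 12_000), ("demolition", 3_500)]
for activity, area in projects:
    print(activity, cd_waste_tonnes(activity, area), "t")
```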

  4. Power systems for future missions

    NASA Technical Reports Server (NTRS)

    Gill, S. P.; Frye, P. E.; Littman, Franklin D.; Meisl, C. J.

    1994-01-01

    A comprehensive scenario of future missions was developed and the applicability of different power technologies to these missions was assessed. Detailed technology development roadmaps for selected power technologies were generated. A simple methodology was developed to evaluate the economic benefits of current and future power system technologies by comparing the life cycle costs of potential missions. The methodology was demonstrated by comparing life cycle costs for different implementation strategies of DIPS/CBC technology applied to a selected set of missions.
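
    The life cycle cost comparison described here can be sketched as development plus production plus discounted operations cost for each implementation strategy. The cost figures and discount rate in the Python sketch below are hypothetical and are not the DIPS/CBC numbers from the study.

    # Discounted life-cycle-cost comparison of two power-technology strategies.
    # All costs (in $M) and the discount rate are hypothetical.
    def life_cycle_cost(dev_cost, unit_cost, ops_cost_per_year, n_units, years, rate=0.07):
        """Development + production + discounted operations cost."""
        ops = sum(ops_cost_per_year / (1.0 + rate) ** t for t in range(1, years + 1))
        return dev_cost + unit_cost * n_units + ops

    strategy_a = life_cycle_cost(dev_cost=500.0, unit_cost=80.0, ops_cost_per_year=10.0,
                                 n_units=5, years=15)
    strategy_b = life_cycle_cost(dev_cost=300.0, unit_cost=120.0, ops_cost_per_year=14.0,
                                 n_units=5, years=15)
    print(f"Strategy A LCC: {strategy_a:.0f} $M, Strategy B LCC: {strategy_b:.0f} $M")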

  5. A method for assessing carbon stocks, carbon sequestration, and greenhouse-gas fluxes in ecosystems of the United States under present conditions and future scenarios

    Treesearch

    Zhiliang Zhu; Brian Bergamaschi; Richard Bernknopf; David Clow; Dennis Dye; Stephen Faulkner; William Forney; Robert Gleason; Todd Hawbaker; Jinxun Liu; Shuguang Liu; Stephen Prisley; Bradley Reed; Matthew Reeves; Matthew Rollins; Benjamin Sleeter; Terry Sohl; Sarah Stackpoole; Stephen Stehman; Robert Striegl; Anne Wein

    2010-01-01

    This methodology was developed to fulfill a requirement by the Energy Independence and Security Act of 2007 (EISA). The EISA legislation mandates the U.S. Department of the Interior (DOI) to develop a methodology and conduct an assessment of carbon storage, carbon sequestration, and fluxes of three principal greenhouse gases (GHG) for the Nation's ecosystems. The...

  6. CACDA Jiffy War Game Technical Manual. Part 1: Methodology

    DTIC Science & Technology

    1977-03-01

    Only distribution-list and title-page fragments are recoverable from this record: Systems Analysis Office (Mr Tyburski), Fort Monmouth, NJ 07703; USAISD, ATTN: ATISE-TD-TS-CD (LT Boyer), Fort Devens, MASS 01433; Combat Developments Activity, Fort Leavenworth, Kansas 66027. The manual, CACDA Jiffy War Game Technical Manual, Part 1: Methodology, by Timothy J. Bailey and Gerald A. Martin, was prepared at the Combat Developments Activity (CACDA), Fort Leavenworth, Kansas, for scenario development and force structure evaluation. The Jiffy Game computer ...

  7. Laser Threat Analysis System (LTAS)

    NASA Astrophysics Data System (ADS)

    Pfaltz, John M.; Richardson, Christina E.; Ruiz, Abel; Barsalou, Norman; Thomas, Robert J.

    2002-11-01

    LTAS is a totally integrated modeling and simulation environment designed to ascertain the susceptibility of Air Force pilots and air crews to optical radiation threats. Using LTAS, mission planners can assess the operational impact of optically directed energy weapons and countermeasures. Through various scenarios, threat analysts are able to determine the capability of laser threats and their impact on operational missions, including the air crew's ability to complete their mission effectively. Additionally, LTAS allows the risk of laser use on training ranges and the requirement for laser protection to be evaluated. LTAS gives mission planners and threat analysts complete control of the threat environment, including threat parameter control and placement, terrain mapping (line-of-sight), atmospheric conditions, and laser eye protection (LEP) selection. This report summarizes the design of the final version of LTAS and the modeling methodologies implemented to accomplish the analysis.

  8. Hybrid Scenario Development Methodology and Tool: An Arctic-Oriented Scenario Example

    DTIC Science & Technology

    2011-07-01

    Only diagram fragments are recoverable from this record, including the labels "END STATE = SCENARIO TRIGGER POINT" and "SCENARIO DEVELOPMENT", and factor categories such as security (incl. defence), legal, and environmental.

  9. Paper-based and web-based intervention modeling experiments identified the same predictors of general practitioners' antibiotic-prescribing behavior.

    PubMed

    Treweek, Shaun; Bonetti, Debbie; Maclennan, Graeme; Barnett, Karen; Eccles, Martin P; Jones, Claire; Pitts, Nigel B; Ricketts, Ian W; Sullivan, Frank; Weal, Mark; Francis, Jill J

    2014-03-01

    To evaluate the robustness of the intervention modeling experiment (IME) methodology as a way of developing and testing behavioral change interventions before a full-scale trial by replicating an earlier paper-based IME. Web-based questionnaire and clinical scenario study. General practitioners across Scotland were invited to complete the questionnaire and scenarios, which were then used to identify predictors of antibiotic-prescribing behavior. These predictors were compared with the predictors identified in an earlier paper-based IME and used to develop a new intervention. Two hundred seventy general practitioners completed the questionnaires and scenarios. The constructs that predicted simulated behavior and intention were attitude, perceived behavioral control, risk perception/anticipated consequences, and self-efficacy, which match the targets identified in the earlier paper-based IME. The choice of persuasive communication as an intervention in the earlier IME was also confirmed. Additionally, a new intervention, an action plan, was developed. A web-based IME replicated the findings of an earlier paper-based IME, which provides confidence in the IME methodology. The interventions will now be evaluated in the next stage of the IME, a web-based randomized controlled trial. Copyright © 2014 Elsevier Inc. All rights reserved.

  10. Risk analysis based on hazards interactions

    NASA Astrophysics Data System (ADS)

    Rossi, Lauro; Rudari, Roberto; Trasforini, Eva; De Angeli, Silvia; Becker, Joost

    2017-04-01

    Despite an increasing need for open, transparent, and credible multi-hazard risk assessment methods, models, and tools, the availability of comprehensive risk information needed to inform disaster risk reduction is limited, and the level of interaction across hazards is not systematically analysed. Risk assessment methodologies for different hazards often produce risk metrics that are not comparable. Hazard interactions (the consecutive occurrence of two or more different events) are generally neglected, resulting in strongly underestimated risk in the most exposed areas. This study presents cases of interaction between different hazards, showing how subsidence can affect coastal and river flood risk (Jakarta and Bandung, Indonesia) or how flood risk is modified after a seismic event (Italy). The analysis of well-documented real study cases, based on a combination of Earth Observation and in-situ data, serves as a basis for the formalisation of a multi-hazard methodology, identifying gaps and research frontiers. Multi-hazard risk analysis is performed through the RASOR platform (Rapid Analysis and Spatialisation Of Risk). A scenario-driven query system allows users to simulate future scenarios based on existing and assumed conditions, to compare them with historical scenarios, and to model multi-hazard risk both before and during an event (www.rasor.eu).

  11. Assessing hydrologic impacts of future Land Change scenarios in the San Pedro River (U.S./Mexico)

    NASA Astrophysics Data System (ADS)

    Kepner, W. G.; Burns, S.; Sidman, G.; Levick, L.; Goodrich, D. C.; Guertin, P.; Yee, W.; Scianni, M.

    2012-12-01

    An approach was developed to characterize the hydrologic impacts of urban expansion through time for the San Pedro River, a watershed of immense international importance that straddles the U.S./Mexico border. Future urban growth is a key driving force altering local and regional hydrology and is represented by decadal changes in housing density maps from 2010 to 2100 derived from the Integrated Climate and Land-Use Scenarios (ICLUS) database. ICLUS developed future housing density maps by adapting the Intergovernmental Panel on Climate Change (IPCC) Special Report on Emissions Scenarios (SRES) social, economic, and demographic storylines to the conterminous United States. To characterize the hydrologic impacts of future growth, the housing density maps were reclassified to National Land Cover Database 2006 land cover classes and used to parameterize the Soil and Water Assessment Tool (SWAT) using the Automated Geospatial Watershed Assessment (AGWA) tool. The presentation will report 1) the methodology for adapting the ICLUS data for use in AGWA as an approach to evaluate basin-wide impacts of development on water quantity and quality, 2) initial results of the application of the methodology, and 3) implications of the analysis.
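
    The reclassification step can be illustrated with a short Python sketch that maps an ICLUS-style housing-density raster onto NLCD developed classes before model parameterization. The density thresholds are illustrative assumptions, not the ICLUS/AGWA lookup table.

    import numpy as np

    # Reclassify a housing-density raster (units/ha) into NLCD 2006 developed classes
    # prior to hydrologic-model parameterization. Thresholds are illustrative only.
    NLCD_OPEN_SPACE, NLCD_LOW, NLCD_MED, NLCD_HIGH = 21, 22, 23, 24

    def reclassify(housing_density):
        nlcd = np.full(housing_density.shape, NLCD_OPEN_SPACE, dtype=np.int16)
        nlcd[housing_density >= 0.5] = NLCD_LOW
        nlcd[housing_density >= 2.0] = NLCD_MED
        nlcd[housing_density >= 8.0] = NLCD_HIGH
        return nlcd

    density_2100 = np.array([[0.1, 0.7], [3.5, 12.0]])
    print(reclassify(density_2100))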

  12. Solving the muon g -2 anomaly in deflected anomaly mediated SUSY breaking with messenger-matter interactions

    NASA Astrophysics Data System (ADS)

    Wang, Fei; Wang, Wenyu; Yang, Jin Min

    2017-10-01

    We propose to introduce general messenger-matter interactions in the deflected anomaly mediated supersymmetry (SUSY) breaking (AMSB) scenario to explain the gμ-2 anomaly. Scenarios with complete or incomplete grand unified theory (GUT) multiplet messengers are discussed. The introduction of incomplete GUT multiplets can be advantageous in various aspects. We found that the gμ-2 anomaly can be solved in both scenarios under current constraints including the gluino mass bounds, while the scenarios with incomplete GUT representation messengers are more favored by the gμ-2 data. We also found that the gluino is upper bounded by about 2.5 TeV (2.0 TeV) in scenario A and 3.0 TeV (2.7 TeV) in scenario B if the generalized deflected AMSB scenarios are used to fully account for the gμ-2 anomaly at the 3σ (2σ) level. Such a gluino should be accessible in future LHC searches. Dark matter (DM) constraints, including DM relic density and direct detection bounds, favor scenario B with incomplete GUT multiplets. Much of the allowed parameter space for scenario B could be covered by future DM direct detection experiments.

  13. Assessment of seismic risk in Tashkent, Uzbekistan and Bishkek, Kyrgyz Republic

    USGS Publications Warehouse

    Erdik, M.; Rashidov, T.; Safak, E.; Turdukulov, A.

    2005-01-01

    The impact of earthquakes in urban centers prone to disastrous earthquakes necessitates the analysis of associated risk for rational formulation of contingency plans and mitigation strategies. In urban centers the seismic risk is best quantified and portrayed through the preparation of 'Earthquake Damage and Loss Scenarios'. For the development of earthquake risk scenarios in Tashkent, Uzbekistan, and Bishkek, Kyrgyzstan, an approach based on spectral displacements is utilized. This paper will present the important features of a comprehensive study, highlight the methodology, discuss the results and provide insights into future developments. Copyright © 2005 Elsevier Ltd. All rights reserved.

  14. Quantifying the daily economic impact of extreme space weather due to failure in electricity transmission infrastructure

    NASA Astrophysics Data System (ADS)

    Oughton, Edward J.; Skelton, Andrew; Horne, Richard B.; Thomson, Alan W. P.; Gaunt, Charles T.

    2017-01-01

    Extreme space weather due to coronal mass ejections has the potential to cause considerable disruption to the global economy by damaging the transformers required to operate electricity transmission infrastructure. However, expert opinion is split between the potential outcome being one of a temporary regional blackout and of a more prolonged event. The temporary blackout scenario proposed by some is expected to last the length of the disturbance, with normal operations resuming after a couple of days. On the other hand, others have predicted widespread equipment damage with blackout scenarios lasting months. In this paper we explore the potential costs associated with failure in the electricity transmission infrastructure in the U.S. due to extreme space weather, focusing on daily economic loss. This provides insight into the direct and indirect economic consequences of how an extreme space weather event may affect domestic production, as well as other nations, via supply chain linkages. By exploring the sensitivity of the blackout zone, we show that on average the direct economic cost incurred from disruption to electricity represents only 49% of the total potential macroeconomic cost. Therefore, if indirect supply chain costs are not considered when undertaking cost-benefit analysis of space weather forecasting and mitigation investment, the total potential macroeconomic cost is not correctly represented. The paper contributes to our understanding of the economic impact of space weather, as well as making a number of key methodological contributions relevant for future work. Further economic impact assessment of this threat must consider multiday, multiregional events.
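
    The reported split implies a simple scaling from direct to total macroeconomic loss: if direct losses average 49% of the total, the total is the direct loss divided by 0.49. The direct-loss figure in the Python sketch below is hypothetical and serves only to illustrate the arithmetic.

    # If direct electricity-disruption losses are on average 49% of the total
    # macroeconomic cost, the implied total is direct / 0.49. The direct-loss
    # figure below is hypothetical, not taken from the paper.
    DIRECT_SHARE = 0.49

    def total_daily_loss(direct_loss_busd):
        return direct_loss_busd / DIRECT_SHARE

    direct = 10.0  # hypothetical direct loss, $bn per day
    print(f"direct: {direct:.1f} $bn/day -> total incl. supply chains: "
          f"{total_daily_loss(direct):.1f} $bn/day")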

  15. Using Scenarios and Simulations to Plan Colleges

    ERIC Educational Resources Information Center

    McIntyre, Chuck

    2004-01-01

    Using a case study, this article describes a method by which higher education institutions construct and use multiple future scenarios and simulations to plan strategically: to create visions of their futures, chart broad directions (mission and goals), and select learning and delivery strategies so as to achieve those broad directions. The…

  16. Space telescope observatory management system preliminary test and verification plan

    NASA Technical Reports Server (NTRS)

    Fritz, J. S.; Kaldenbach, C. F.; Williams, W. B.

    1982-01-01

    The preliminary plan for the Space Telescope Observatory Management System Test and Verification (TAV) is provided. Methodology, test scenarios, test plans and procedure formats, schedules, and the TAV organization are included. Supporting information is provided.

  17. Biofuel transportation analysis tool: description, methodology, and demonstration scenarios

    DOT National Transportation Integrated Search

    2014-01-01

    This report describes a Biofuel Transportation Analysis Tool (BTAT), developed by the U.S. Department of Transportation (DOT) Volpe National Transportation Systems Center (Volpe) in support of the Department of Defense (DOD) Office of Naval Research ...

  18. Transient Region Coverage in the Propulsion IVHM Technology Experiment

    NASA Technical Reports Server (NTRS)

    Balaban, Edward; Sweet, Adam; Bajwa, Anupa; Maul, William; Fulton, Chris; Chicatelli, Amy

    2004-01-01

    Over the last several years, researchers at NASA Glenn and Ames Research Centers have developed a real-time fault detection and isolation system for propulsion subsystems of future space vehicles. The Propulsion IVHM Technology Experiment (PITEX), as it is called, follows the model-based diagnostic methodology and employs Livingstone, developed at NASA Ames, as its reasoning engine. The system has been tested on flight-like hardware through a series of nominal and fault scenarios. These scenarios have been developed using a highly detailed simulation of the X-34 flight demonstrator main propulsion system and include realistic failures involving valves, regulators, microswitches, and sensors. This paper focuses on one of the recent research and development efforts under PITEX: to provide more complete transient region coverage. It describes the development of the transient monitors, the corresponding modeling methodology, and the interface software responsible for coordinating the flow of information between the quantitative monitors and the qualitative, discrete representation used by Livingstone.

  19. A new scenario-based approach to damage detection using operational modal parameter estimates

    NASA Astrophysics Data System (ADS)

    Hansen, J. B.; Brincker, R.; López-Aenlle, M.; Overgaard, C. F.; Kloborg, K.

    2017-09-01

    In this paper, a vibration-based damage localization and quantification method, based on natural frequencies and mode shapes, is presented. The proposed technique is inspired by a damage assessment methodology based solely on the sensitivity of mass-normalized, experimentally determined mode shapes. The present method differs by being based on modal data extracted by means of Operational Modal Analysis (OMA), combined with a reasonable Finite Element (FE) representation of the test structure, and implemented in a scenario-based framework. Besides a review of the basic methodology, this paper addresses fundamental theoretical as well as practical considerations that are crucial to the applicability of a given vibration-based damage assessment configuration. Lastly, the technique is demonstrated on an experimental test case using automated OMA. Both the numerical study and the experimental test case presented in this paper are restricted to perturbations concerning mass change.

  20. A methodology for long-range prediction of air transportation

    NASA Technical Reports Server (NTRS)

    Ayati, M. B.; English, J. M.

    1980-01-01

    A framework and methodology for long-term projection of demand for aviation fuels is presented. The approach includes two basic components. The first is a new technique for establishing the socio-economic environment within which the future aviation industry is embedded: an overall societal objective is defined for the very long-run future, and within this framework a set of scenarios describing how the future will unfold is written. These scenarios provide the determinants of air transport industry operations and accordingly an assessment of future fuel requirements. The second is a model of the industry in terms of an abstracted set of variables representing overall industry performance at a macro scale. The model was validated by comparing its output variables with historical data over past decades.

  1. Assessment of climate change impacts on groundwater resources: the case study of Veneto and Friuli plain in Italy

    NASA Astrophysics Data System (ADS)

    Critto, Andrea; Pasini, Sara; Torresan, Silvia; Rizzi, Jonathan; Zabeo, Alex; Marcomini, Antonio

    2013-04-01

    Climate change will have different impacts on water resources and water-dependent services worldwide. In particular, climate-related risks for groundwater and related ecosystems pose great concern to scientists and water authorities involved in the protection of these valuable resources. Research is needed to better understand how climate change will impact groundwater resources in specific regions and places and to develop predictive tools for their sustainable management, coping with the envisaged effects of global climate change and the key principles of EU water policy. Within the European project Life+ TRUST (Tool for Regional-scale assessment of groundwater Storage improvement in adaptation to climaTe change), a Regional Risk Assessment (RRA) methodology was developed in order to identify impacts from climate change on groundwater and associated ecosystems (e.g. surface waters, agricultural areas, natural environments) and to rank areas and receptors at risk in the high and middle Veneto and Friuli Plain (Italy). Based on an integrated analysis of impacts, vulnerability and risks linked to climate change at the regional scale, an RRA framework complying with the Sources-Pathway-Receptor-Consequence (SPRC) approach was defined. Relevant impacts on groundwater and surface waters (i.e. groundwater level variations, changes in nitrate infiltration processes, changes in water availability for irrigation) were selected and analyzed through hazard scenario, exposure, susceptibility and risk assessment. The RRA methodology used hazard scenarios constructed through global and high-resolution model simulations for the 2071-2100 period, according to the IPCC A1B emission scenario, in order to produce useful indications for future risk prioritization and to support the addressing of adaptation measures, primarily Managed Artificial Recharge (MAR) techniques. Relevant outcomes from the described RRA application highlighted that potential climate change impacts will occur with different extension and magnitude in the case study area. In particular, qualitative and quantitative impacts on groundwater will occur with more severe consequences in the wettest and in the driest scenario (respectively), and on natural and anthropic systems through the reduction in quality and quantity of water available for agricultural and other uses (about 80% of agricultural areas and 27% of groundwater bodies at risk). Such impacts will likely have little direct effect on related ecosystems - croplands, forests and natural environments - lying along the spring area (about 12% of croplands and 2% of natural environments at risk). The major outcomes of the described RRA application are here presented and discussed.
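
    One way to picture the SPRC-style combination of hazard, exposure and susceptibility into a relative risk ranking is the short Python sketch below. The receptors, scores and multiplicative aggregation are illustrative assumptions, not the scoring functions used in the Life+ TRUST project.

    # Relative-risk scoring in the Source-Pathway-Receptor-Consequence spirit:
    # risk = hazard * exposure * susceptibility, normalised for ranking receptors.
    # Receptors and scores are illustrative, not Life+ TRUST values.
    receptors = {
        # receptor: (hazard, exposure, susceptibility), each scored in [0, 1]
        "groundwater_body_A": (0.8, 0.9, 0.6),
        "cropland_spring_belt": (0.5, 0.4, 0.3),
        "natural_wetland": (0.6, 0.2, 0.4),
    }

    scores = {name: h * e * s for name, (h, e, s) in receptors.items()}
    max_score = max(scores.values())
    for name, score in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
        print(f"{name}: relative risk = {score / max_score:.2f}")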

  2. Climate change impact assessment in Veneto and Friuli Plain groundwater. Part II: a spatially resolved regional risk assessment.

    PubMed

    Pasini, S; Torresan, S; Rizzi, J; Zabeo, A; Critto, A; Marcomini, A

    2012-12-01

    Climate change impact assessment on water resources has received high international attention over the last two decades, due to the observed global warming and its consequences from the global to the local scale. In particular, climate-related risks for groundwater and related ecosystems pose a great concern to scientists and water authorities involved in the protection of these valuable resources. The close link of global warming with water cycle alterations encourages research to deepen current knowledge on relationships between climate trends and the status of water systems, and to develop predictive tools for their sustainable management, coping with key principles of EU water policy. Within the European project Life+ TRUST (Tool for Regional-scale assessment of groundwater Storage improvement in adaptation to climaTe change), a Regional Risk Assessment (RRA) methodology was developed in order to identify impacts from climate change on groundwater and associated ecosystems (e.g. surface waters, agricultural areas, natural environments) and to rank areas and receptors at risk in the high and middle Veneto and Friuli Plain (Italy). Based on an integrated analysis of impacts, vulnerability and risks linked to climate change at the regional scale, an RRA framework complying with the Sources-Pathway-Receptor-Consequence (SPRC) approach was defined. Relevant impacts on groundwater and surface waters (i.e. groundwater level variations, changes in nitrate infiltration processes, changes in water availability for irrigation) were selected and analyzed through hazard scenario, exposure, susceptibility and risk assessment. The RRA methodology used hazard scenarios constructed through global and high-resolution model simulations for the 2071-2100 period, according to the IPCC A1B emission scenario, in order to produce useful indications for future risk prioritization and to support the addressing of adaptation measures, primarily Managed Artificial Recharge (MAR) techniques. Relevant outcomes from the described RRA application highlighted that potential climate change impacts will occur with different extension and magnitude in the case study area. In particular, qualitative and quantitative impacts on groundwater will occur with more severe consequences in the wettest and in the driest scenario (respectively). Moreover, such impacts will likely have little direct effect on related ecosystems - croplands, forests and natural environments - lying along the spring area (about 12% of croplands and 2% of natural environments at risk), while more severe consequences will indirectly occur on natural and anthropic systems through the reduction in quality and quantity of water available for agricultural and other uses (about 80% of agricultural areas and 27% of groundwater bodies at risk). Copyright © 2012 Elsevier B.V. All rights reserved.

  3. Future air pollution in the Shared Socio-economic Pathways

    DOE PAGES

    Rao, Shilpa; Klimont, Zbigniew; Smith, Steven J.; ...

    2016-07-15

    Emissions of air pollutants such as sulfur and nitrogen oxides and particulates have significant health impacts as well as effects on natural and anthropogenic ecosystems. These same emissions also can change atmospheric chemistry and the planetary energy balance, thereby impacting global and regional climate. Long-term scenarios for air pollutant emissions are needed as inputs to global climate and chemistry models, and for analysis linking air pollutant impacts across sectors. In this paper we present methodology and results for air pollutant emissions in Shared Socioeconomic Pathways (SSP) scenarios. We first present a set of three air pollution narratives that describe high, central, and low pollution control ambitions over the 21st century. These narratives are then translated into quantitative guidance for use in integrated assessment models. We provide an overview of pollutant emission trajectories under the SSP scenarios. Pollutant emissions in these scenarios cover a wider range than the scenarios used in previous international climate model comparisons. Furthermore, the SSP scenarios provide the opportunity to access a more comprehensive range of future global and regional air quality outcomes.

  4. Future air pollution in the Shared Socio-economic Pathways

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rao, Shilpa; Klimont, Zbigniew; Smith, Steven J.

    Emissions of air pollutants such as sulfur and nitrogen oxides and particulates have significant health impacts as well as effects on natural and anthropogenic ecosystems. These same emissions also can change atmospheric chemistry and the planetary energy balance, thereby impacting global and regional climate. Long-term scenarios for air pollutant emissions are needed as inputs to global climate and chemistry models, and for analysis linking air pollutant impacts across sectors. In this paper we present methodology and results for air pollutant emissions in Shared Socioeconomic Pathways (SSP) scenarios. We first present a set of three air pollution narratives that describe high, central, and low pollution control ambitions over the 21st century. These narratives are then translated into quantitative guidance for use in integrated assessment models. We provide an overview of pollutant emission trajectories under the SSP scenarios. Pollutant emissions in these scenarios cover a wider range than the scenarios used in previous international climate model comparisons. Furthermore, the SSP scenarios provide the opportunity to access a more comprehensive range of future global and regional air quality outcomes.

  5. Scenario Planning: A Phenomenological Examination of Influence on Organizational Learning and Decision-Making in a K-12 Public Education System

    ERIC Educational Resources Information Center

    Deklotz, Patricia F.

    2013-01-01

    Organizations commonly engage in long range planning to direct decisions. Scenario planning, one method of private sector planning, is recognized as useful when organizations are facing uncertainty. Scenario planning engages the organization in a process that produces plausible stories, called scenarios, describing the organization in several…

  6. Directed Energy Technology Working Group Report (IDA/OSD R&M (Institute for Defense Analyses/Office of the Secretary of Defense Reliability and Maintainability) Study).

    DTIC Science & Technology

    1983-08-01

    Only table-of-contents and scenario fragments are recoverable from this record: the report includes case studies of a Submarine-Launched Ballistic Missile (SLBM) Defense Scenario and a Space-Based Anti-Ballistic Missile (ABM) Defense Scenario. In the space-based ABM defense scenario, an orbiting battle station operates as an element of the GBMD system ...

  7. A new method for solving reachable domain of spacecraft with a single impulse

    NASA Astrophysics Data System (ADS)

    Chen, Qi; Qiao, Dong; Shang, Haibin; Liu, Xinfu

    2018-04-01

    This paper develops a new approach to solve the reachable domain of a spacecraft with a single maximum available impulse. First, the distance in a chosen direction, starting from a given position on the initial orbit, is formulated. Then, its extreme value is solved to obtain the maximum reachable distance in this direction. The envelope of the reachable domain in three-dimensional space is determined by solving the maximum reachable distance in all directions. Four scenarios are analyzed, including three typical scenarios (either the maneuver position or impulse direction is fixed, or both are arbitrary) and a new extended scenario (the maneuver position is restricted to an interval and the impulse direction is arbitrary). Moreover, the symmetry and the boundedness of the reachable domain are discussed in detail. The former is helpful to reduce the numerical computation, while the latter decides the maximum eccentricity of the initial orbit for a maximum available impulse. The numerical simulations verify the effectiveness of the proposed method for solving the reachable domain in all four scenarios. Especially, the reachable domain with a highly elliptical initial orbit can be determined successfully, which remains unsolved in the existing papers.
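
    The structure of the approach, maximizing the reachable distance in each direction and sweeping directions to trace the envelope, can be sketched numerically. The reach function in the Python sketch below is a deliberately simplified placeholder rather than the paper's two-body formulation; only the direction-sweep-plus-scalar-maximization structure is illustrated.

    import numpy as np
    from scipy.optimize import minimize_scalar

    # For each direction, maximise the reachable distance over the free parameter
    # (here, the impulse direction angle), then sweep directions to trace the
    # envelope. The reach model below is a placeholder, not orbital dynamics.
    def reach(direction_angle, impulse_angle, dv=0.3):
        # Placeholder "distance reached along direction_angle when the impulse is
        # applied at impulse_angle": largest when the impulse aligns with the direction.
        return dv * (1.0 + np.cos(impulse_angle - direction_angle))

    def max_reach(direction_angle):
        res = minimize_scalar(lambda a: -reach(direction_angle, a),
                              bounds=(0.0, 2.0 * np.pi), method="bounded")
        return -res.fun

    directions = np.linspace(0.0, 2.0 * np.pi, 36, endpoint=False)
    envelope = [max_reach(d) for d in directions]  # boundary of the reachable domain
    print(f"max reach over all directions: {max(envelope):.3f}")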

  8. Looking to the future: Framing the implementation of interprofessional education and practice with scenario planning.

    PubMed

    Forman, Dawn; Nicol, Pam; Nicol, Paul

    2015-01-01

    Adapting to interprofessional education and practice requires a change of perspective for many health professionals. We aimed to explore the potential of scenario planning to bridge the understanding gap and to frame strategic planning for interprofessional education (IPE) and practice (IPP), as well as to implement innovative techniques and technology for large-group scenario planning. A full-day scenario planning workshop incorporating innovative methodology was designed and offered to participants. The 71 participants included academics from nine universities, as well as service providers, government, students and consumer organisations. The outcomes were evaluated by statistical and thematic analysis of a mixed-method survey questionnaire. The scenario planning method resulted in a positive response as a means of collaboratively exploring current knowledge and broadening entrenched attitudes. It was perceived to be an effective instrument for framing strategy for the implementation of IPE/IPP, with 81 percent of respondents to a post-workshop survey indicating they would consider using scenario planning in their own organisations. The scenario planning method can be used by tertiary academic institutions as a strategy in developing, implementing and embedding IPE, and for the enculturation of IPP in practice settings.

  9. Deterministic seismogenic scenarios based on asperities spatial distribution to assess tsunami hazard on northern Chile (18°S to 24°S)

    NASA Astrophysics Data System (ADS)

    González-Carrasco, J. F.

    2016-12-01

    The coastal areas of southern Peru and northern Chile, extending between 12°S and 24°S, have been recognized as a mature seismic gap with high seismogenic potential associated with the seismic moment deficit accumulated since 1877. An important scientific question, relevant from a hazard assessment perspective, is what the rupture pattern of a future megathrust earthquake will be. During the last decade, the occurrence of three major subduction earthquakes has made it possible to acquire outstanding geophysical and geological information on the behavior of these phenomena. An interesting result is the relationship between the maximum slip areas and the spatial distribution of asperities in subduction zones. In this contribution, we propose a methodology to identify a regional pattern of main asperities in order to construct reliable seismogenic scenarios in a seismic gap. We follow a deterministic approach to explore the segmentation of asperities using geophysical and geodetic data such as trench-parallel gravity anomaly (TPGA), interseismic coupling (ISC), b-value, historical moment release, and residual bathymetric and gravity anomalies. The combined information represents physical constraints on regions suitable for future mega earthquakes in the short and long term. To illuminate the asperity distribution, we construct profiles of all proxies in fault coordinates, along-strike and down-dip, to define the boundaries of major asperities (> 100 km). The geometry of a major asperity is then used to define a finite set of deterministic seismogenic scenarios to evaluate tsunamigenic hazard in the main cities of northern Chile (18°S to 24°S).

  10. Long-term ensemble forecast of snowmelt inflow into the Cheboksary Reservoir under two different weather scenarios

    NASA Astrophysics Data System (ADS)

    Gelfan, Alexander; Moreydo, Vsevolod; Motovilov, Yury; Solomatine, Dimitri P.

    2018-04-01

    A long-term forecasting ensemble methodology, applied to water inflows into the Cheboksary Reservoir (Russia), is presented. The methodology is based on a version of the semi-distributed hydrological model ECOMAG (ECOlogical Model for Applied Geophysics) that allows for the calculation of an ensemble of inflow hydrographs using two different sets of weather ensembles for the lead time period: observed weather data, constructed on the basis of the Ensemble Streamflow Prediction methodology (ESP-based forecast), and synthetic weather data, simulated by a multi-site weather generator (WG-based forecast). We have studied the following: (1) whether there is any advantage of the developed ensemble forecasts in comparison with the currently issued operational forecasts of water inflow into the Cheboksary Reservoir, and (2) whether there is any noticeable improvement in probabilistic forecasts when using the WG-simulated ensemble compared to the ESP-based ensemble. We have found that for a 35-year period beginning from the reservoir filling in 1982, both continuous and binary model-based ensemble forecasts (issued in the deterministic form) outperform the operational forecasts of the April-June inflow volume actually used and, additionally, provide acceptable forecasts of additional water regime characteristics besides the inflow volume. We have also demonstrated that the model performance measures (in the verification period) obtained from the WG-based probabilistic forecasts, which are based on a large number of possible weather scenarios, appeared to be more statistically reliable than the corresponding measures calculated from the ESP-based forecasts based on the observed weather scenarios.
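
    An ESP-style ensemble can be sketched as running the same initial basin state forward under each historical weather trace for the lead-time window and summarizing the resulting inflow distribution. The toy water-balance model and synthetic weather traces in the Python sketch below are placeholders, not ECOMAG or the Cheboksary data.

    import numpy as np

    rng = np.random.default_rng(0)

    # Ensemble Streamflow Prediction (ESP) skeleton: the same initial state is run
    # forward under each historical weather trace for the lead-time window.
    # toy_model is a placeholder water-balance model, not ECOMAG.
    def toy_model(initial_storage, precip, temp):
        """Return seasonal inflow volume for one weather trace (arbitrary units)."""
        melt = np.clip(temp, 0.0, None) * 0.8   # degree-day style snowmelt
        runoff = 0.6 * precip + melt
        return initial_storage * 0.1 + float(runoff.sum())

    # Historical weather traces: 30 years x 90 days of precipitation and temperature.
    precip_hist = rng.gamma(shape=2.0, scale=1.5, size=(30, 90))
    temp_hist = rng.normal(loc=2.0, scale=5.0, size=(30, 90))

    initial_storage = 120.0  # current basin state shared by all ensemble members
    ensemble = np.array([toy_model(initial_storage, p, t)
                         for p, t in zip(precip_hist, temp_hist)])

    print("median inflow:", np.median(ensemble))
    print("10-90% range:", np.percentile(ensemble, [10, 90]))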

  11. A meta-analysis of the greenhouse gas abatement of bioenergy factoring in land use changes.

    PubMed

    El Akkari, M; Réchauchère, O; Bispo, A; Gabrielle, B; Makowski, D

    2018-06-04

    Non-food biomass production is developing rapidly to fuel the bioenergy sector and substitute dwindling fossil resources, which is likely to impact land-use patterns worldwide. Recent publications attempting to factor this effect into the climate mitigation potential of bioenergy chains have come to widely variable conclusions depending on their scope, data sources or methodology. Here, we conducted a first of its kind, systematic review of scientific literature on this topic and derived quantitative trends through a meta-analysis. We showed that second-generation biofuels and bioelectricity have a larger greenhouse gas (GHG) abatement potential than first generation biofuels, and stand the best chances (with a 80 to 90% probability range) of achieving a 50% reduction compared to fossil fuels. Conversely, directly converting forest ecosystems to produce bioenergy feedstock appeared as the worst-case scenario, systematically leading to negative GHG savings. On the other hand, converting grassland appeared to be a better option and entailed a 60% chance of halving GHG emissions compared to fossil energy sources. Since most climate mitigation scenarios assume still larger savings, it is critical to gain better insight into land-use change effects to provide a more realistic estimate of the mitigation potential associated with bioenergy.

  12. Risk-based maintenance of ethylene oxide production facilities.

    PubMed

    Khan, Faisal I; Haddara, Mahmoud R

    2004-05-20

    This paper discusses a methodology for the design of an optimum inspection and maintenance program. The methodology, called risk-based maintenance (RBM), is based on integrating a reliability approach and a risk assessment strategy to obtain an optimum maintenance schedule. First, the likely equipment failure scenarios are formulated. Out of many likely failure scenarios, those that are most probable are subjected to a detailed study. Detailed consequence analysis is done for the selected scenarios. Subsequently, these failure scenarios are subjected to a fault tree analysis to determine their probabilities. Finally, risk is computed by combining the results of the consequence and the probability analyses. The calculated risk is compared against known acceptable criteria. The frequencies of the maintenance tasks are obtained by minimizing the estimated risk. A case study involving an ethylene oxide production facility is presented. Out of the five most hazardous units considered, the pipeline used for the transportation of ethylene is found to have the highest risk. Using available failure data and a lognormal reliability distribution function, human health risk factors are calculated. Both societal risk factors and individual risk factors exceeded the acceptable risk criteria. To determine an optimal maintenance interval, a reverse fault tree analysis was used. The maintenance interval was determined such that the original high risk is brought down to an acceptable level. A sensitivity analysis is also undertaken to study the impact of changing the distribution of the reliability model, as well as the error in the distribution parameters, on the maintenance interval.
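
    The core risk-based maintenance logic, risk equals failure probability times consequence, with the inspection interval shortened until the risk criterion is met, can be sketched as follows. An exponential failure model stands in for the paper's lognormal reliability function, and all rates and criteria in this Python sketch are hypothetical.

    import math

    # Risk-based maintenance logic: risk(T) = P(failure within interval T) * consequence.
    # Shorten the maintenance interval T until risk drops below the acceptance criterion.
    # The exponential failure model and all numbers below are illustrative stand-ins.
    FAILURE_RATE = 1.0e-3      # failures per day (hypothetical)
    CONSEQUENCE = 5.0e-2       # fatalities per failure event (hypothetical)
    ACCEPTABLE_RISK = 1.0e-4   # fatalities per interval (hypothetical criterion)

    def failure_probability(interval_days):
        return 1.0 - math.exp(-FAILURE_RATE * interval_days)

    def risk(interval_days):
        return failure_probability(interval_days) * CONSEQUENCE

    interval = 365.0
    while risk(interval) > ACCEPTABLE_RISK and interval > 1.0:
        interval -= 1.0

    print(f"maintenance/inspection interval: {interval:.0f} days, risk = {risk(interval):.2e}")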

  13. Assessing Hydrologic Impacts of Future Land Cover Change Scenarios in the South Platte River Basin (CO, WY, & NE) and the San Pedro River Basin (U.S./Mexico).

    NASA Astrophysics Data System (ADS)

    Barlow, J. E.; Burns, I. S.; Guertin, D. P.; Kepner, W. G.; Goodrich, D. C.

    2016-12-01

    Long-term land-use and land cover change and their associated impacts pose critical challenges to sustaining vital hydrological ecosystem services for future generations. In this study, a methodology to characterize hydrologic impacts from future urban growth through time that was developed and applied on the San Pedro River Basin was expanded and utilized on the South Platte River Basin as well. Future urban growth is represented by housing density maps generated in decadal intervals from 2010 to 2100, produced by the U.S. Environmental Protection Agency (EPA) Integrated Climate and Land-Use Scenarios (ICLUS) project. ICLUS developed future housing density maps by adapting the Intergovernmental Panel on Climate Change (IPCC) Special Report on Emissions Scenarios (SRES) social, economic, and demographic storylines to the conterminous United States. To characterize hydrologic impacts from future growth, the housing density maps were reclassified to National Land Cover Database (NLCD) 2006 land cover classes and used to parameterize the Soil and Water Assessment Tool (SWAT) using the Automated Geospatial Watershed Assessment (AGWA) tool. The objectives of this project were to 1) develop and implement a methodology for adapting the ICLUS data for use in AGWA as an approach to evaluate impacts of development on water-quantity and -quality, 2) present, evaluate, and compare results from scenarios for watersheds in two different geographic and climatic regions, 3) determine watershed specific implications of this type of future land cover change analysis.

  14. Describing treatment effects to patients.

    PubMed

    Moxey, Annette; O'Connell, Dianne; McGettigan, Patricia; Henry, David

    2003-11-01

    To examine the impact of different presentations of equivalent information (framing) on treatment decisions faced by patients. A systematic review of the published literature was conducted. English language publications allocating participants to different frames were retrieved using electronic and bibliographic searches. Two reviewers examined each article for inclusion, and assessed methodological quality. Study characteristics were tabulated and where possible, relative risks (RR; 95% confidence intervals) were calculated to estimate intervention effects. Thirty-seven articles, yielding 40 experimental studies, were included. Studies examined treatment (N = 24), immunization (N = 5), or health behavior scenarios (N = 11). Overall, active treatments were preferred when outcomes were described in terms of relative rather than absolute risk reductions or number needed to treat. Surgery was preferred to other treatments when treatment efficacy was presented in a positive frame (survival) rather than a negative frame (mortality) (relative risk [RR] = 1.51, 95% confidence interval [CI], 1.39 to 1.64). Framing effects were less obvious for immunization and health behavior scenarios. Those with little interest in the behavior at baseline were influenced by framing, particularly when information was presented as gains. In studies judged to be of good methodological quality and/or examining actual decisions, the framing effect, although still evident, was less convincing compared to the results of all included studies. Framing effects varied with the type of scenario, responder characteristics, scenario manipulations, and study quality. When describing treatment effects to patients, expressing the information in more than one way may present a balanced view to patients and enable them to make informed decisions.

  15. How Can You Support RIDM/CRM/RM Through the Use of PRA

    NASA Technical Reports Server (NTRS)

    DoVemto. Tpmu

    2011-01-01

    Probabilistic Risk Assessment (PRA) is one of the key Risk Informed Decision Making (RIDM) tools. It is a scenario-based methodology aimed at identifying and assessing Safety and Technical Performance risks in complex technological systems.

  16. General RMP Guidance - Chapter 4: Offsite Consequence Analysis

    EPA Pesticide Factsheets

    This chapter provides basic compliance information, not modeling methodologies, for people who plan to do their own air dispersion modeling. OCA is a required part of the risk management program, and involves worst-case and alternative release scenarios.

  17. Target crashes and safety benefits estimation methodology for pedestrian crash avoidance/mitigation systems

    DOT National Transportation Integrated Search

    2014-04-01

    Through the analysis of national crash databases from the National Highway Traffic Safety Administration, pre-crash scenarios are identified, prioritized, and described for the development of objective tests for pedestrian crash avoidance/mitigation ...

  18. Designing the next generation of robotic controllers

    NASA Technical Reports Server (NTRS)

    Goldstein, David G.

    1994-01-01

    The use of scenario-based, object-oriented software engineering methodologies in the next generation of robotic controllers is discussed. The controllers are intended to supplant the decades-old technology currently embraced by the manufacturing industry of the United States.

  19. Public Review Draft: A Method for Assessing Carbon Stocks, Carbon Sequestration, and Greenhouse-Gas Fluxes in Ecosystems of the United States Under Present Conditions and Future Scenarios

    USGS Publications Warehouse

    Bergamaschi, Brian A.; Bernknopf, Richard; Clow, David; Dye, Dennis; Faulkner, Stephen; Forney, William; Gleason, Robert; Hawbaker, Todd; Liu, Jinxun; Liu, Shu-Guang; Prisley, Stephen; Reed, Bradley; Reeves, Matthew; Rollins, Matthew; Sleeter, Benjamin; Sohl, Terry; Stackpoole, Sarah; Stehman, Stephen; Striegl, Robert G.; Wein, Anne; Zhu, Zhi-Liang

    2010-01-01

    The Energy Independence and Security Act of 2007 (EISA), Section 712, authorizes the U.S. Department of the Interior to develop a methodology and conduct an assessment of the Nation's ecosystems focusing on carbon stocks, carbon sequestration, and emissions of three greenhouse gases (GHGs): carbon dioxide, methane, and nitrous oxide. The major requirements include (1) an assessment of all ecosystems (terrestrial systems, such as forests, croplands, wetlands, shrub and grasslands; and aquatic ecosystems, such as rivers, lakes, and estuaries), (2) an estimation of annual potential capacities of ecosystems to increase carbon sequestration and reduce net GHG emissions in the context of mitigation strategies (including management and restoration activities), and (3) an evaluation of the effects of controlling processes, such as climate change, land use and land cover, and wildfires. The purpose of this draft methodology for public review is to propose a technical plan to conduct the assessment. Within the methodology, the concepts of ecosystems, carbon pools, and GHG fluxes used for the assessment follow conventional definitions in use by major national and international assessment or inventory efforts. In order to estimate current ecosystem carbon stocks and GHG fluxes and to understand the potential capacity and effects of mitigation strategies, the method will use two time periods for the assessment: 2001 through 2010, which establishes a current ecosystem GHG baseline and will be used to validate the models; and 2011 through 2050, which will be used to assess future potential conditions based on a set of projected scenarios. The scenario framework is constructed using storylines of the Intergovernmental Panel on Climate Change (IPCC) Special Report on Emissions Scenarios (SRES), along with initial reference land-use and land-cover (LULC) and land-management scenarios. An additional three LULC and land-management mitigation scenarios will be constructed for each storyline to enhance carbon sequestration and reduce GHG fluxes in ecosystems. Input from regional experts and stakeholders will be solicited to construct realistic and meaningful scenarios. The methods for mapping the current LULC and ecosystem disturbances will require the extensive use of both remote-sensing data and in-situ data (for example, forest inventory data) to capture and characterize landscape-change events. For future potential LULC and ecosystem disturbances, key drivers such as socioeconomic, policy, and climate assumptions will be used in addition to biophysical data. The product of these analyses will be a series of maps for each future year for each scenario. These annual maps will form the basis for estimating carbon storage and GHG emissions. For terrestrial ecosystems, carbon storage, carbon-sequestration capacities, and GHG emissions under the current and projected future conditions will be assessed using the LULC and ecosystem-disturbance estimates in map format with a spatially explicit biogeochemical ensemble modeling system that incorporates properties of management activities (such as tillage or harvesting) and properties of individual ecosystems (such as elevation, vegetation characteristics, and soil attributes). For aquatic ecosystems, carbon burial in sediments and GHG fluxes are functions of the current and projected future stream flow and sediment transports, and therefore will be assessed using empirical modeling methods.
Validation and uncertainty analysis methods described in the methodology will follow established guidelines to assess the quality of the assessment results. The U.S. Environmental Protection Agency's Level II ecoregions map (which delineates 24 ecoregions for the Nation) will be the practical instrument for developing and delivering assessment results. Consequently, the ecoregion will be the reporting unit of the assessment because the mitigation scenarios, assessment results, validation, and uncertainty analysis will be

  20. Efficient design and inference for multistage randomized trials of individualized treatment policies.

    PubMed

    Dawson, Ree; Lavori, Philip W

    2012-01-01

    Clinical demand for individualized "adaptive" treatment policies in diverse fields has spawned development of clinical trial methodology for their experimental evaluation via multistage designs, building upon methods intended for the analysis of naturalistically observed strategies. Because often there is no need to parametrically smooth multistage trial data (in contrast to observational data for adaptive strategies), it is possible to establish direct connections among different methodological approaches. We show by algebraic proof that the maximum likelihood (ML) and optimal semiparametric (SP) estimators of the population mean of the outcome of a treatment policy and its standard error are equal under certain experimental conditions. This result is used to develop a unified and efficient approach to design and inference for multistage trials of policies that adapt treatment according to discrete responses. We derive a sample size formula expressed in terms of a parametric version of the optimal SP population variance. Nonparametric (sample-based) ML estimation performed well in simulation studies, in terms of achieved power, for scenarios most likely to occur in real studies, even though sample sizes were based on the parametric formula. ML outperformed the SP estimator; differences in achieved power predominately reflected differences in their estimates of the population mean (rather than estimated standard errors). Neither methodology could mitigate the potential for overestimated sample sizes when strong nonlinearity was purposely simulated for certain discrete outcomes; however, such departures from linearity may not be an issue for many clinical contexts that make evaluation of competitive treatment policies meaningful.

  1. Feasibility of "Standardized Clinician" Methodology for Patient Training on Hospital-to-Home Transitions.

    PubMed

    Wehbe-Janek, Hania; Hochhalter, Angela K; Castilla, Theresa; Jo, Chanhee

    2015-02-01

    Patient engagement in health care is increasingly recognized as essential for promoting the health of individuals and populations. This study pilot tested the standardized clinician (SC) methodology, a novel adaptation of standardized patient methodology, for teaching patient engagement skills for the complex health care situation of transitioning from a hospital back to home. Sixty-seven participants at heightened risk for hospitalization were randomly assigned to either simulation exposure-only or full-intervention group. Both groups participated in simulation scenarios with "standardized clinicians" around tasks related to hospital discharge and follow-up. The full-intervention group was also debriefed after scenario sets and learned about tools for actively participating in hospital-to-home transitions. Measures included changes in observed behaviors at baseline and follow-up and an overall program evaluation. The full-intervention group showed increases in observed tool possession (P = 0.014) and expression of their preferences and values (P = 0.043). The simulation exposure-only group showed improvement in worksheet scores (P = 0.002) and fewer engagement skills (P = 0.021). Both groups showed a decrease in telling an SC about their hospital admission (P < 0.05). Open-ended comments from the program evaluation were largely positive. Both groups benefited from exposure to the SC intervention. Program evaluation data suggest that simulation training is feasible and may provide a useful methodology for teaching patient skills for active engagement in health care. Future studies are warranted to determine if this methodology can be used to assess overall patient engagement and whether new patient learning transfers to health care encounters.

  2. A methodology to modify land uses in a transit oriented development scenario.

    PubMed

    Sahu, Akshay

    2018-05-01

    Developing nations are adopting transit oriented development (TOD) strategies to decongest their transportation systems. These strategies are often adopted after the preparation of land use plans. The goal of this study was to build a methodology to modify these land uses using soft computing, which can help to generate alternate land use plans relevant to TOD. The methodology incorporates TOD characteristics and objectives. Global TOD parameters (density, diversity, and distance to transit) were studied, and expert opinions provided weights and ranges for the parameters in an Indian TOD scenario. Rules to allocate land use were developed and objective functions were defined. Four objectives were used: first, to maximize employment density, residential density, and the percentage of mixed land use; second, to shape density and diversity with respect to distance to transit; third, to minimize the degree of land use change; and fourth, to increase the compactness of the land use allocation. The methodology was applied to two sectors of Naya Raipur, the new planned administrative capital of the state of Chhattisgarh, India. The city has implemented TOD in the form of a bus rapid transit system (BRTS) over an existing land use. One thousand random plans were generated through the methodology, and the top 30 plans were selected as the parent population for modification through a genetic algorithm (GA). Alternate plans were generated at the end of the GA cycle. The best alternate plan was compared with successful BRTS and TOD land uses for its merits and demerits, and with the initial land use plan for empirical validation. Copyright © 2017 Elsevier Ltd. All rights reserved.
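
    A genetic-algorithm skeleton of the kind described, random plans scored against weighted objectives, with the top plans kept as parents for crossover and mutation, is sketched below in Python. The parcel encoding, objective weights and GA settings are illustrative assumptions, not the study's formulation.

    import random

    random.seed(1)

    # Genetic-algorithm skeleton for land-use allocation on a small set of parcels.
    # Land-use codes, objective weights, and GA settings are illustrative only.
    LAND_USES = ["residential", "commercial", "mixed", "open_space"]
    N_PARCELS = 40

    def random_plan():
        return [random.choice(LAND_USES) for _ in range(N_PARCELS)]

    def fitness(plan, baseline):
        mix = len(set(plan)) / len(LAND_USES)                  # diversity proxy
        density = plan.count("residential") / N_PARCELS        # density proxy
        change = sum(a != b for a, b in zip(plan, baseline)) / N_PARCELS
        return 0.4 * density + 0.4 * mix - 0.2 * change        # weights are placeholders

    def crossover(a, b):
        cut = random.randrange(1, N_PARCELS)
        return a[:cut] + b[cut:]

    def mutate(plan, rate=0.05):
        return [random.choice(LAND_USES) if random.random() < rate else u for u in plan]

    baseline = random_plan()                     # stands in for the initial land use plan
    population = [random_plan() for _ in range(1000)]
    parents = sorted(population, key=lambda p: fitness(p, baseline), reverse=True)[:30]

    for _ in range(50):                          # GA generations
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(100)]
        parents = sorted(parents + children, key=lambda p: fitness(p, baseline),
                         reverse=True)[:30]

    print("best fitness:", round(fitness(parents[0], baseline), 3))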

  3. Fractal Risk Assessment of ISS Propulsion Module in Meteoroid and Orbital Debris Environments

    NASA Technical Reports Server (NTRS)

    Mog, Robert A.

    2001-01-01

    A unique and innovative risk assessment of the International Space Station (ISS) Propulsion Module is conducted using fractal modeling of the Module's response to the meteoroid and orbital debris environments. Both the environment models and structural failure modes due to the resultant hypervelocity impact phenomenology, as well as Module geometry, are investigated for fractal applicability. The fractal risk assessment methodology could produce a greatly simplified alternative to current methodologies, such as BUMPER analyses, while maintaining or increasing the number of complex scenarios that can be assessed. As a minimum, this innovative fractal approach will provide an independent assessment of existing methodologies in a unique way.

  4. Mitigating active shooter impact: Analysis for policy options based on agent/computer-based modeling.

    PubMed

    Anklam, Charles; Kirby, Adam; Sharevski, Filipo; Dietz, J Eric

    2015-01-01

    Active shooting violence in confined settings, such as educational institutions, poses serious security concerns to public safety. In studying the effects of active shooter scenarios, the common denominator associated with all events, regardless of the shooter's motives or the type of weapons used, was the location chosen and the time elapsed between the beginning of the event and its culmination. This in turn directly correlates with the number of casualties incurred in any given event: the longer the event protracts, the more casualties are incurred until law enforcement or another barrier can react and end the situation. Using AnyLogic technology, modeling scenarios were devised to test multiple hypotheses with free-agent modeling simulation and determine the best method of reducing casualties associated with active shooter scenarios. Four possible responses to an active shooter in a public school setting were tested using agent-based computer modeling techniques: scenario 1, a basic scenario with no access control or any type of security within the school; scenario 2, which assumes that concealed carry individuals (5-10 percent of the workforce) are present in the school; scenario 3, which assumes that the school has an assigned resource officer; and scenario 4, which assumes that the school has both an assigned resource officer and concealed carry individuals (5-10 percent) present. Statistical data from the modeling scenarios indicate which tested hypothesis resulted in fewer casualties and quicker culmination of the event. The use of AnyLogic proved the initial hypothesis that a decrease in response time to an active shooter scenario directly reduces victim casualties. Modeling tests show statistically significantly fewer casualties in scenarios where on-scene armed responders, such as resource officers and concealed carry personnel, were present.
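
    The study's central relationship, that shorter time to interdiction yields fewer casualties, can be illustrated with a stripped-down stochastic Python sketch comparing mean response times. The casualty rate and response times below are hypothetical, and the sketch is not the AnyLogic model used in the study.

    import random

    random.seed(2)

    # Casualties accumulate at a (hypothetical) rate until the shooter is interdicted;
    # on-scene responders act sooner than off-site law enforcement. Not the study's model.
    CASUALTY_RATE_PER_MIN = 0.8   # hypothetical casualties per minute while active

    def simulate(mean_response_min, n_runs=10_000):
        total = 0.0
        for _ in range(n_runs):
            response = random.expovariate(1.0 / mean_response_min)
            total += CASUALTY_RATE_PER_MIN * response
        return total / n_runs

    scenarios = {
        "no on-site security (police response ~10 min)": 10.0,
        "resource officer on site (~3 min)": 3.0,
        "resource officer + concealed carriers (~1.5 min)": 1.5,
    }
    for label, t in scenarios.items():
        print(f"{label}: mean casualties ~ {simulate(t):.1f}")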

  5. Development of optimization-based probabilistic earthquake scenarios for the city of Tehran

    NASA Astrophysics Data System (ADS)

    Zolfaghari, M. R.; Peyghaleh, E.

    2016-01-01

    This paper presents the methodology and a practical example for the application of an optimization process to select earthquake scenarios that best represent the probabilistic earthquake hazard in a given region. The method is based on simulation of a large dataset of potential earthquakes representing the long-term seismotectonic characteristics of that region. The simulation process uses Monte Carlo simulation and regional seismogenic source parameters to generate a synthetic earthquake catalogue consisting of a large number of earthquakes, each characterized by magnitude, location, focal depth and fault characteristics. Such a catalogue provides full distributions of events in time, space and size; however, it demands large computation power when used for risk assessment, particularly when other sources of uncertainty are involved in the process. To reduce the number of selected earthquake scenarios, a mixed-integer linear programming formulation is developed in this study. This approach yields a reduced set of optimization-based probabilistic earthquake scenarios while maintaining the shape of the hazard curves and the full probabilistic picture, by minimizing the error between hazard curves derived from the full and reduced sets of synthetic earthquake scenarios. To test the model, the regional seismotectonic and seismogenic characteristics of northern Iran are used to simulate 10,000 years' worth of events consisting of some 84,000 earthquakes. The optimization model is then run multiple times with various input data, taking the probabilistic seismic hazard for the city of Tehran as the main constraint. The sensitivity of the selected scenarios to the user-specified site/return-period error weight is also assessed. The methodology could substantially reduce the run time of full probabilistic earthquake studies such as seismic hazard and risk assessment: the reduced set represents the contributions of all possible earthquakes while requiring far less computation power. The authors have used this approach for risk assessment aimed at identifying the effectiveness and profitability of risk mitigation measures, using an optimization model for resource allocation. Based on the error-computation trade-off, 62 earthquake scenarios were chosen for this purpose.
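
    A minimal sketch of the scenario-reduction idea, assuming the PuLP solver and toy data: binary variables select at most K scenarios, continuous weights are redistributed over the selected ones, and the objective minimizes the deviation between reduced-set and full-set hazard (exceedance) values. The study's actual hazard-curve constraints and error weighting are richer than this.

        # Scenario reduction as a small mixed-integer linear program (toy data).
        import pulp, random

        n_scen, n_levels, K = 40, 5, 8
        exceed = [[random.random() for _ in range(n_levels)] for _ in range(n_scen)]  # toy exceedance contributions
        target = [sum(exceed[j][i] for j in range(n_scen)) / n_scen for i in range(n_levels)]  # full-set hazard

        prob = pulp.LpProblem("scenario_reduction", pulp.LpMinimize)
        w = [pulp.LpVariable(f"w{j}", lowBound=0) for j in range(n_scen)]        # scenario weights
        y = [pulp.LpVariable(f"y{j}", cat="Binary") for j in range(n_scen)]      # scenario selected?
        dplus = [pulp.LpVariable(f"dp{i}", lowBound=0) for i in range(n_levels)]
        dminus = [pulp.LpVariable(f"dm{i}", lowBound=0) for i in range(n_levels)]

        prob += pulp.lpSum(dplus) + pulp.lpSum(dminus)                  # total hazard-curve error
        for i in range(n_levels):
            prob += (pulp.lpSum(w[j] * exceed[j][i] for j in range(n_scen))
                     - target[i] == dplus[i] - dminus[i])
        for j in range(n_scen):
            prob += w[j] <= y[j]                                        # only selected scenarios carry weight
        prob += pulp.lpSum(w) == 1
        prob += pulp.lpSum(y) <= K                                      # keep at most K scenarios

        prob.solve(pulp.PULP_CBC_CMD(msg=False))
        kept = [j for j in range(n_scen) if y[j].value() > 0.5]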

  6. Performer-centric Interface Design.

    ERIC Educational Resources Information Center

    McGraw, Karen L.

    1995-01-01

    Describes performer-centric interface design and explains a model-based approach for conducting performer-centric analysis and design. Highlights include design methodology, including cognitive task analysis; creating task scenarios; creating the presentation model; creating storyboards; proof of concept screens; object models and icons;…

  7. A Front-End Analysis Of Rear-End Crashes

    DOT National Transportation Integrated Search

    1992-05-17

    This paper describes the application of a seven-step crash problem analysis methodology, as described in the preceding paper by Leasure (1), to rear-end crashes. The paper shows how modeling of rear-end crash scenarios and candidate countermeasure ac...

  8. SURVIAC Bulletin: RPG Encounter Modeling, Vol 27, Issue 1, 2012

    DTIC Science & Technology

    2012-01-01

    return a probability of hit (PHIT) for the scenario. In the model, PHIT depends on the presented area of the targeted system and a set of errors infl...simplifying assumptions, is data-driven, and uses simple yet proven methodologies to determine PHIT. The inputs to THREAT describe the target, the RPG, and...Point on 2-D Representation of a CH-47. The determination of PHIT by THREAT is performed using one of two possible methodologies. The first is a

  9. ImmunoScenarios: A Game for the Immune System.

    ERIC Educational Resources Information Center

    Taylor, Mark F.; Jackson, Sally W.

    1996-01-01

    Describes a board game, ImmunoScenarios, which was developed to reinforce the ideas about the immune system discussed in lecture classes. Emphasizes important characteristics of the body's specific defense system including specificity, cooperation among various cells, and memory. Includes directions for playing, student handouts, and scenarios.…

  10. Orion Handling Qualities During ISS Rendezvous and Docking

    NASA Technical Reports Server (NTRS)

    Hart, Jeremy J.; Stephens, J. P.; Spehar, P.; Bilimoria, K.; Foster, C.; Gonzalex, R.; Sullivan, K.; Jackson, B.; Brazzel, J.; Hart, J.

    2011-01-01

    The Orion spacecraft was designed to rendezvous with multiple vehicles in low earth orbit (LEO) and beyond. To perform the required rendezvous and docking task, Orion must provide enough control authority to perform coarse translational maneuvers while maintaining precision to perform the delicate docking corrections. While Orion has autonomous docking capabilities, it is expected that final approach and docking operations with the International Space Station (ISS) will initially be performed in a manual mode. A series of evaluations was conducted by NASA and Lockheed Martin at the Johnson Space Center to determine the handling qualities (HQ) of the Orion spacecraft during different docking and rendezvous conditions using the Cooper-Harper scale. This paper will address the specifics of the handling qualities methodology, vehicle configuration, scenarios flown, data collection tools, and subject ratings and comments. The initial Orion HQ assessment examined Orion docking to the ISS. This scenario demonstrates the Translational Hand Controller (THC) handling qualities of Orion. During this initial assessment, two different scenarios were evaluated. The first was a nominal docking approach to a stable ISS, with Orion initializing with relative position dispersions and a closing rate of approximately 0.1 ft/sec. The second docking scenario was identical to the first, except the attitude motion of the ISS was modeled to simulate a stress case ( 1 degree deadband per axis and 0.01 deg/sec rate deadband per axis). For both scenarios, subjects started each run on final approach at a docking port-to-port range of 20 ft. Subjects used the THC in pulse mode with cues from the docking camera image, window views, and range and range rate data displayed on the Orion display units. As in the actual design, the attitude of the Orion vehicle was held by the automated flight control system at 0.5 degree deadband per axis. Several error sources were modeled including Reaction Control System (RCS) jet angular and position misalignment, RCS thrust magnitude uncertainty, RCS jet force direction uncertainty due to self plume impingement, and Orion center of mass uncertainty.

  11. Journal of Air Transportation, Volume 9, No. 2. Volume 9, No. 2

    NASA Technical Reports Server (NTRS)

    Bowen, Brent (Editor); Kabashkin, Igor (Editor); Gudmundsson, Sveinn Vidar (Editor); Scarpellini, Nanette (Editor)

    2004-01-01

    The following articles from the "Journal of Air Transportation" were processed: Future Requirements and Concepts for Cabins of Blended Wing Body Configurations:A Scenario Approach; Future Scenarios for the European Airline Industry: A Marketing-Based Perspective; An Application of the Methodology for Assessment of the Sustainability of the Air Transport System; Modeling the Effect of Enlarged Seating Room on Passenger Preferences of Domestic Airlines in Taiwan; Developing a Fleet Standardization Index for Airline Pricing; and Future Airport Capacity Utilization in Germany: Peaked Congestion and/or Idle Capacity).

  12. Using operations research to plan the british columbia registered nurses' workforce.

    PubMed

    Lavieri, Mariel S; Regan, Sandra; Puterman, Martin L; Ratner, Pamela A

    2008-11-01

    The authors explore the power and flexibility of using an operations research methodology known as linear programming to support health human resources (HHR) planning. The model takes as input estimates of the future need for healthcare providers and, in contrast to simulation, compares all feasible strategies to identify a long-term plan for achieving a balance between supply and demand at the least cost to the system. The approach is illustrated by using it to plan the British Columbia registered nurse (RN) workforce over a 20-year horizon. The authors show how the model can be used for scenario analysis by investigating the impact of decreasing attrition from educational programs, changing RN-to-manager ratios in direct care and exploring how other changes might alter planning recommendations. In addition to HHR policy recommendations, their analysis also points to new research opportunities. Copyright © 2008 Longwoods Publishing.
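
    A minimal sketch of the least-cost supply-demand balancing idea using linear programming (here scipy's linprog). The horizon, attrition rate, demand growth and unit cost are invented, and the actual BC RN model distinguishes many more cohorts, roles and policy levers.

        # Choose education seats per year so projected RN supply meets demand at least cost.
        import numpy as np
        from scipy.optimize import linprog

        T, S0, attrition, cost_per_seat = 20, 30000.0, 0.04, 1.0
        demand = S0 * (1.03 ** np.arange(1, T + 1))          # hypothetical 3%/yr demand growth

        # supply_t = S0*(1-a)^t + sum_{k<=t} seats_k*(1-a)^(t-k)  >=  demand_t
        A_ub = np.zeros((T, T))
        for t in range(T):
            for k in range(t + 1):
                A_ub[t, k] = -((1 - attrition) ** (t - k))
        b_ub = S0 * (1 - attrition) ** np.arange(1, T + 1) - demand

        res = linprog(c=np.full(T, cost_per_seat), A_ub=A_ub, b_ub=b_ub,
                      bounds=[(0, None)] * T, method="highs")
        seats_per_year = res.x                               # least-cost education intake per year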

  13. Functional relationship-based alarm processing

    DOEpatents

    Corsberg, D.R.

    1987-04-13

    A functional relationship-based alarm processing system and method analyzes each alarm as it is activated and determines its relative importance with other currently activated alarms and signals in accordance with the relationships that the newly activated alarm has with other currently activated alarms. Once the initial level of importance of the alarm has been determined, that alarm is again evaluated if another related alarm is activated. Thus, each alarm's importance is continuously updated as the state of the process changes during a scenario. Four hierarchical relationships are defined by this alarm filtering methodology: (1) level precursor (usually occurs when there are two alarm settings on the same parameter); (2) direct precursor (based on causal factors between two alarms); (3) required action (system response or action expected within a specified time following activation of an alarm or combination of alarms and process signals); and (4) blocking condition (alarms that are normally expected and are not considered important). 11 figs.
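
    A hypothetical sketch of how the four relationships could be encoded to re-rank active alarms; the alarm names, relationship tables and scores are invented for illustration and do not reproduce the patented logic.

        # Re-rank active alarms using the four hierarchical relationships named above.
        level_precursor = {("TANK_LEVEL_HI", "TANK_LEVEL_HIHI")}   # two setpoints on the same parameter
        direct_precursor = {("PUMP_TRIP", "FLOW_LOW")}             # causal link between two alarms
        required_action = {"FLOW_LOW": 120}                        # operator response expected within 120 s
        blocking_condition = {"FLOW_LOW": {"MAINTENANCE_MODE"}}    # contexts where the alarm is expected

        def importance(alarm, active_alarms, active_signals):
            score = 1.0
            if any((alarm, other) in level_precursor for other in active_alarms):
                score -= 0.5        # a higher-level alarm on the same parameter is also active
            if any((other, alarm) in direct_precursor for other in active_alarms):
                score -= 0.3        # alarm is explained by an already-active precursor
            if alarm in required_action:
                score += 0.5        # action expected within a deadline -> raise importance
            if blocking_condition.get(alarm, set()) & active_signals:
                score = 0.0         # normally expected in this plant state -> suppress
            return score

        # Every alarm is re-evaluated whenever the set of active alarms/signals changes
        active = {"TANK_LEVEL_HI", "TANK_LEVEL_HIHI", "FLOW_LOW"}
        ranked = sorted(active, key=lambda a: importance(a, active - {a}, {"MAINTENANCE_MODE"}),
                        reverse=True)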

  14. A climate stress-test of the financial system

    NASA Astrophysics Data System (ADS)

    Battiston, Stefano; Mandel, Antoine; Monasterolo, Irene; Schütze, Franziska; Visentin, Gabriele

    2017-03-01

    The urgency of estimating the impact of climate risks on the financial system is increasingly recognized among scholars and practitioners. By adopting a network approach to financial dependencies, we look at how climate policy risk might propagate through the financial system. We develop a network-based climate stress-test methodology and apply it to large Euro Area banks in a `green' and a `brown' scenario. We find that direct and indirect exposures to climate-policy-relevant sectors represent a large portion of investors' equity portfolios, especially for investment and pension funds. Additionally, the portion of banks' loan portfolios exposed to these sectors is comparable to banks' capital. Our results suggest that climate policy timing matters. An early and stable policy framework would allow for smooth asset value adjustments and lead to potential net winners and losers. In contrast, a late and abrupt policy framework could have adverse systemic consequences.

  15. Industrial Accidents Triggered by Natural Hazards: an Emerging Risk Issue

    NASA Astrophysics Data System (ADS)

    Renni, Elisabetta; Krausmann, Elisabeth; Basco, Anna; Salzano, Ernesto; Cozzani, Valerio

    2010-05-01

    Natural disasters such as earthquakes, tsunamis, flooding or hurricanes have recently and dramatically hit several countries worldwide. Both direct and indirect consequences affected the population, causing on the one hand a high number of fatalities and on the other hand economic losses so large that the gross national product may be affected for many years. Loss of critical industrial infrastructures (electricity generation and distribution, gas pipelines, oil refineries, etc.) also occurred, causing further indirect damage to the population. In several cases, accident scenarios with large releases of hazardous materials were triggered by these natural events, causing so-called "Natech events", in which the overall damage resulted from the simultaneous consequences of the natural event and of the release of hazardous substances. Toxic releases, large fires and explosions, as well as possible long-term environmental pollution, economic losses, and overloading of emergency systems were recognised by post-event studies as the main issues of these Natech scenarios. In recent years the increasing frequency and severity of some natural hazards due to climate change have slowly increased awareness of Natech risk as an emerging risk among stakeholders. Indeed, the iNTeg-Risk project, co-funded by the European Commission within the 7th Framework Program, specifically addresses these scenarios among new technological issues on public safety. The present study, in part carried out within the iNTeg-Risk project, was aimed at the analysis and further development of methods and tools for the assessment and mitigation of Natech accidents. Available tools and knowledge gaps in the assessment of Natech scenarios were highlighted. The analysis mainly addressed the potential impact of flood, lightning and earthquake events on industrial installations where hazardous substances are present. Preliminary screening methodologies and more detailed methods based on quantitative risk analysis were developed. Strategies based on the use of multiple information layers aiming at the identification of mitigation and early warning systems were also explored. A case study in the Emilia-Romagna region is presented.

  16. Air traffic control resource management strategies and the small aircraft transportation system: A system dynamics perspective

    NASA Astrophysics Data System (ADS)

    Galvin, James J., Jr.

    The National Aeronautics and Space Administration (NASA) is leading a research effort to develop a Small Aircraft Transportation System (SATS) that will expand air transportation capabilities to hundreds of underutilized airports in the United States. Most of the research effort addresses the technological development of the small aircraft as well as the systems to manage airspace usage and surface activities at airports. The Federal Aviation Administration (FAA) will also play a major role in the successful implementation of SATS, however, the administration is reluctant to embrace the unproven concept. The purpose of the research presented in this dissertation is to determine if the FAA can pursue a resource management strategy that will support the current radar-based Air Traffic Control (ATC) system as well as a Global Positioning Satellite (GPS)-based ATC system required by the SATS. The research centered around the use of the System Dynamics modeling methodology to determine the future behavior of the principle components of the ATC system over time. The research included a model of the ATC system consisting of people, facilities, equipment, airports, aircraft, the FAA budget, and the Airport and Airways Trust Fund. The model generated system performance behavior used to evaluate three scenarios. The first scenario depicted the base case behavior of the system if the FAA continued its current resource management practices. The second scenario depicted the behavior of the system if the FAA emphasized development of GPS-based ATC systems. The third scenario depicted a combined resource management strategy that supplemented radar systems with GPS systems. The findings of the research were that the FAA must pursue a resource management strategy that primarily funds a radar-based ATC system and directs lesser funding toward a GPS-based supplemental ATC system. The most significant contribution of this research was the insight and understanding gained of how several resource management strategies and the presence of SATS aircraft may impact the future US Air Traffic Control system.

  17. Participatory modelling to support decision making in water management under uncertainty: two comparative case studies in the Guadiana river basin, Spain.

    PubMed

    Carmona, Gema; Varela-Ortega, Consuelo; Bromley, John

    2013-10-15

    A participatory modelling process has been conducted in two areas of the Guadiana river (the upper and the middle sub-basins), in Spain, with the aim of providing support for decision making in the water management field. The area has a semi-arid climate where irrigated agriculture plays a key role in the economic development of the region and accounts for around 90% of water use. Following the guidelines of the European Water Framework Directive, we promote stakeholder involvement in water management with the aim of achieving an improved understanding of the water system and encouraging the exchange of knowledge and views between stakeholders, in order to help build a shared vision of the system. At the same time, the resulting models, which integrate the different sectors and views, provide some insight into the impacts that different management options and possible future scenarios could have. The methodology is based on a Bayesian network combined with an economic model and, in the middle Guadiana sub-basin, with a crop model. The resulting integrated modelling framework is used to simulate possible water policy, market and climate scenarios to find out the impacts of those scenarios on farm income and on the environment. At the end of the modelling process, an evaluation questionnaire was completed by participants in both sub-basins. Results show that this type of process is found very helpful by stakeholders for improving system understanding, understanding each other's views and reducing conflict when it exists. In addition, they found the model an extremely useful tool to support management. The graphical interface, the quantitative output and the explicit representation of uncertainty helped stakeholders to better understand the implications of the scenarios tested. Finally, the combination of different types of models was also found very useful, as it allowed specific aspects of the water management problems to be explored in detail. Copyright © 2013 Elsevier Ltd. All rights reserved.
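
    A toy illustration of the coupling between a discrete, Bayesian-network-style scenario layer and an economic outcome layer: conditional probabilities of water allocation under two hypothetical policy scenarios are combined with an assumed income per allocation state to give expected farm income. All states, probabilities and incomes are invented.

        # Expected farm income under two hypothetical policy scenarios.
        import numpy as np

        # P(water allocation | policy): rows = policy, columns = allocation (low, medium, high)
        p_alloc = np.array([[0.6, 0.3, 0.1],      # restrictive water policy
                            [0.2, 0.5, 0.3]])     # current policy
        income_by_alloc = np.array([180., 260., 320.])   # farm income (EUR/ha) from the economic model

        for name, row in zip(["restrictive", "current"], p_alloc):
            print(f"{name:12s} expected farm income = {row @ income_by_alloc:.0f} EUR/ha")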

  18. BSM2 Plant-Wide Model construction and comparative analysis with other methodologies for integrated modelling.

    PubMed

    Grau, P; Vanrolleghem, P; Ayesa, E

    2007-01-01

    In this paper, a new methodology for integrated modelling of the WWTP has been used for the construction of the Benchmark Simulation Model No. 2 (BSM2). The transformations approach proposed in this methodology does not require the development of specific transformers to interface unit process models, and allows the construction of tailored models for a particular WWTP while guaranteeing mass and charge continuity for the whole model. The BSM2 PWM, constructed as a case study, is evaluated by means of simulations under different scenarios, and its validity in reproducing the water and sludge lines in a WWTP is demonstrated. Furthermore, the advantages that this methodology presents compared to other approaches for integrated modelling are verified in terms of flexibility and coherence.

  19. Developing spatially explicit footprints of plausible land-use scenarios in the Santa Cruz Watershed, Arizona and Sonora

    USGS Publications Warehouse

    Norman, Laura M.; Feller, Mark; Villarreal, Miguel L.

    2012-01-01

    The SLEUTH urban growth model is applied to a binational dryland watershed to envision and evaluate plausible future scenarios of land use change into the year 2050. Our objective was to create a suite of geospatial footprints portraying potential land use change that can be used to aid binational decision-makers in assessing the impacts relative to sustainability of natural resources and potential socio-ecological consequences of proposed land-use management. Three alternatives are designed to simulate different conditions: (i) a Current Trends Scenario of unmanaged exponential growth, (ii) a Conservation Scenario with managed growth to protect the environment, and (iii) a Megalopolis Scenario in which growth is accentuated around a defined international trade corridor. The model was calibrated with historical data extracted from a time series of satellite images. Model materials, methodology, and results are presented. Our Current Trends Scenario predicts the footprint of urban growth to approximately triple from 2009 to 2050, which is corroborated by local population estimates. The Conservation Scenario results in protecting 46% more of the Evergreen class (more than 150,000 acres) than the Current Trends Scenario and approximately 95,000 acres of Barren Land, Crops, Deciduous Forest (Mesquite Bosque), Grassland/Herbaceous, Urban/Recreational Grasses, and Wetlands classes combined. The Megalopolis Scenario results also depict the preservation of some of these land-use classes compared to the Current Trends Scenario, most notably in the environmentally important headwaters region. Connectivity and areal extent of land cover types that provide wildlife habitat were preserved under the alternative scenarios when compared to Current Trends.

  20. Modeling ecosystem service tradeoffs for alternative land use and climate scenarios

    EPA Science Inventory

    Scientists, policymakers, community planners and others have discussed ecosystem services for decades, however, society is still in the early stages of developing methodologies to quantify and value the goods and services that ecosystems provide. Essential to this goal are highly...

  1. Modeling of policies for reduction of GHG emissions in energy sector using ANN: case study-Croatia (EU).

    PubMed

    Bolanča, Tomislav; Strahovnik, Tomislav; Ukić, Šime; Stankov, Mirjana Novak; Rogošić, Marko

    2017-07-01

    This study describes the development of a tool for testing different policies for the reduction of greenhouse gas (GHG) emissions in the energy sector using artificial neural networks (ANNs). The case study of Croatia was elaborated. Two different energy consumption scenarios were used as a basis for calculations and predictions of GHG emissions: the business-as-usual (BAU) scenario and a sustainable scenario. Both are based on predicted energy consumption using different growth rates; the growth rates in the second scenario result from the implementation of corresponding energy efficiency measures in final energy consumption and an increasing share of renewable energy sources. Both the ANN architecture and the training methodology were optimized to produce a network able to successfully describe the existing data and to achieve reliable forward-in-time prediction of emissions. The BAU scenario was found to produce continuously increasing emissions of all GHGs. The sustainable scenario was found to decrease the GHG emission levels of all gases with respect to BAU. The observed decrease was attributed to the group of measures termed the reduction of final energy consumption through energy efficiency measures.
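
    A minimal sketch of the ANN idea using scikit-learn's MLPRegressor on synthetic data: emissions are learned as a function of year and energy consumption and then projected forward under an assumed growth rate. The study's inputs, network architecture and training procedure differ.

        # Fit a small neural network to synthetic emission data and project forward.
        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(0)
        years = np.arange(1995, 2016)
        energy = 300 + 5 * (years - 1995) + rng.normal(0, 5, years.size)     # PJ, synthetic
        emissions = 0.07 * energy + rng.normal(0, 0.5, years.size)           # Mt CO2-eq, synthetic

        X = np.column_stack([years, energy])
        model = MLPRegressor(hidden_layer_sizes=(8, 8), max_iter=5000,
                             random_state=0).fit(X, emissions)

        # Forward prediction under an assumed business-as-usual energy growth rate
        future_years = np.arange(2016, 2031)
        bau_energy = energy[-1] * 1.02 ** np.arange(1, future_years.size + 1)
        bau_emissions = model.predict(np.column_stack([future_years, bau_energy]))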

  2. Integrated watershed-scale response to climate change for selected basins across the United States

    USGS Publications Warehouse

    Markstrom, Steven L.; Hay, Lauren E.; Ward-Garrison, D. Christian; Risley, John C.; Battaglin, William A.; Bjerklie, David M.; Chase, Katherine J.; Christiansen, Daniel E.; Dudley, Robert W.; Hunt, Randall J.; Koczot, Kathryn M.; Mastin, Mark C.; Regan, R. Steven; Viger, Roland J.; Vining, Kevin C.; Walker, John F.

    2012-01-01

    A study by the U.S. Geological Survey (USGS) evaluated the hydrologic response to different projected carbon emission scenarios of the 21st century using a hydrologic simulation model. This study involved five major steps: (1) set up, calibrate, and evaluate the Precipitation Runoff Modeling System (PRMS) model in 14 basins across the United States, a task carried out by local USGS personnel; (2) acquire selected simulated carbon emission scenarios from the World Climate Research Programme's Coupled Model Intercomparison Project; (3) statistically downscale these scenarios to create PRMS input files that reflect their future climatic conditions; (4) generate PRMS projections for the carbon emission scenarios for the 14 basins; and (5) analyze the modeled hydrologic response. This report presents an overview of this study, details of the methodology, results from the 14 basin simulations, and interpretation of these results. A key finding is that the hydrological response of the different geographical regions of the United States to potential climate change may differ, depending on the dominant physical processes of each particular region. Also considered is the tremendous amount of uncertainty present in the carbon emission scenarios and how this uncertainty propagates through the hydrologic simulations.

  3. From scenarios to domain models: processes and representations

    NASA Astrophysics Data System (ADS)

    Haddock, Gail; Harbison, Karan

    1994-03-01

    The domain specific software architectures (DSSA) community has defined a philosophy for the development of complex systems. This philosophy improves productivity and efficiency by increasing the user's role in the definition of requirements, increasing the systems engineer's role in the reuse of components, and decreasing the software engineer's role to the development of new components and component modifications only. The scenario-based engineering process (SEP), the first instantiation of the DSSA philosophy, has been adopted by the next generation controller project. It is also the chosen methodology of the trauma care information management system project, and the surrogate semi-autonomous vehicle project. SEP uses scenarios from the user to create domain models and define the system's requirements. Domain knowledge is obtained from a variety of sources including experts, documents, and videos. This knowledge is analyzed using three techniques: scenario analysis, task analysis, and object-oriented analysis. Scenario analysis results in formal representations of selected scenarios. Task analysis of the scenario representations results in descriptions of tasks necessary for object-oriented analysis and also subtasks necessary for functional system analysis. Object-oriented analysis of task descriptions produces domain models and system requirements. This paper examines the representations that support the DSSA philosophy, including reference requirements, reference architectures, and domain models. The processes used to create and use the representations are explained through use of the scenario-based engineering process. Selected examples are taken from the next generation controller project.

  4. Cost-benefit analysis in occupational health: a comparison of intervention scenarios for occupational asthma and rhinitis among bakery workers.

    PubMed

    Meijster, Tim; van Duuren-Stuurman, Birgit; Heederik, Dick; Houba, Remko; Koningsveld, Ernst; Warren, Nicholas; Tielemans, Erik

    2011-10-01

    Use of cost-benefit analysis in occupational health increases insight into the intervention strategy that maximises the cost-benefit ratio. This study presents a methodological framework identifying the most important elements of a cost-benefit analysis for occupational health settings. One of the main aims of the methodology is to evaluate cost-benefit ratios for different stakeholders (employers, employees and society). The developed methodology was applied to two intervention strategies focused on reducing respiratory diseases. A cost-benefit framework was developed and used to set up a calculation spreadsheet containing the inputs and algorithms required to calculate the costs and benefits for all cost elements. Inputs from a large variety of sources were used to calculate total costs, total benefits, net costs and the benefit-to-cost ratio for both intervention scenarios. Implementation of a covenant intervention program resulted in a net benefit of €16 848 546 over 20 years for a population of 10 000 workers. Implementation was cost-effective for all stakeholders. For a health surveillance scenario, total benefits resulting from a decreased disease burden were estimated to be €44 659 352. The costs of the interventions could not be calculated. This study provides important insights for developing effective intervention strategies in the field of occupational medicine. Use of a model-based approach enables investigation of the parameters most likely to impact the effectiveness and costs of interventions for work-related diseases. Our case study highlights the importance of considering different perspectives (of employers, society and employees) in assessing and sharing the costs and benefits of interventions.
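
    The stakeholder-specific bookkeeping reduces to a few arithmetic identities; the toy figures below (invented, not the study's values) show how net benefit and the benefit-to-cost ratio would be reported per stakeholder.

        # Per-stakeholder net benefit and benefit-to-cost ratio over a 20-year horizon (toy numbers).
        costs = {"employer": 4_000_000, "employee": 250_000, "society": 1_500_000}
        benefits = {"employer": 9_000_000, "employee": 6_000_000, "society": 7_600_000}

        for who in costs:
            net = benefits[who] - costs[who]
            bcr = benefits[who] / costs[who]
            print(f"{who:9s} net benefit = EUR {net:>12,}  benefit-to-cost ratio = {bcr:.2f}")

        total_net = sum(benefits.values()) - sum(costs.values())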

  5. Thermophysics modeling of an infrared detector cryochamber for transient operational scenario

    NASA Astrophysics Data System (ADS)

    Singhal, Mayank; Singhal, Gaurav; Verma, Avinash C.; Kumar, Sushil; Singh, Manmohan

    2016-05-01

    An infrared (IR) detector is essentially a transducer capable of converting radiant energy in the infrared regime into a measurable form. The benefit of infrared radiation is that it facilitates viewing objects in the dark or through obscured conditions by detecting the infrared energy they emit. One of the most significant applications of IR detector systems is target acquisition and tracking of projectile systems. IR detectors also find widespread applications in industry and the commercial market. The performance of an IR detector is sensitive to temperature; the detector performs best when cooled to cryogenic temperatures of around 120 K. However, the necessity to operate in such cryogenic regimes increases the complexity of applying IR detectors. This creates a need for detailed thermophysics analysis to determine the actual cooling load, which is specific to the application and to the detector's interaction with the environment, and thereby enables the design of cooling methodologies best suited to specific scenarios. The focus of the present work is to develop a robust thermophysical numerical methodology for predicting IR cryochamber behavior under transient conditions, which is the most critical scenario, taking into account all relevant heat loads, including radiation in its original form. The advantage of the developed code over existing commercial software (COMSOL, ANSYS, etc.) is that it handles gas conduction together with radiation terms effectively, employing ubiquitous software such as MATLAB. It also requires much smaller computational resources and is significantly less time intensive. It provides physically correct results, enabling thermal characterization of the cryochamber geometry in conjunction with an appropriate cooling methodology. The code has subsequently been validated experimentally: the observed cooling characteristics are in close agreement with the results predicted using the developed model, proving its efficacy.

  6. Moisture content prediction in poultry litter using artificial intelligence techniques and Monte Carlo simulation to determine the economic yield from energy use.

    PubMed

    Rico-Contreras, José Octavio; Aguilar-Lasserre, Alberto Alfonso; Méndez-Contreras, Juan Manuel; López-Andrés, Jhony Josué; Cid-Chama, Gabriela

    2017-11-01

    The objective of this study is to determine the economic return of poultry litter combustion in boilers to produce bioenergy (thermal and electrical), as this biomass has a high-energy potential due to its component elements, using fuzzy logic to predict moisture and identify the high-impact variables. This is carried out using a proposed 7-stage methodology, which includes a statistical analysis of agricultural systems and practices to identify activities contributing to moisture in poultry litter (for example, broiler chicken management, number of air extractors, and avian population density), and thereby reduce moisture to increase the yield of the combustion process. Estimates of poultry litter production and heating value are made based on 4 different moisture content percentages (scenarios of 25%, 30%, 35%, and 40%), and then a risk analysis is proposed using the Monte Carlo simulation to select the best investment alternative and to estimate the environmental impact for greenhouse gas mitigation. The results show that dry poultry litter (25%) is slightly better for combustion, generating 3.20% more energy. Reducing moisture from 40% to 25% involves considerable economic investment due to the purchase of equipment to reduce moisture; thus, when calculating financial indicators, the 40% scenario is the most attractive, as it is the current scenario. Thus, this methodology proposes a technology approach based on the use of advanced tools to predict moisture and representation of the system (Monte Carlo simulation), where the variability and uncertainty of the system are accurately represented. Therefore, this methodology is considered generic for any bioenergy generation system and not just for the poultry sector, whether it uses combustion or another type of technology. Copyright © 2017 Elsevier Ltd. All rights reserved.
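
    A sketch of the Monte Carlo step under stated assumptions: the net present value of boiler energy revenue is sampled for a dried (25% moisture) and an as-received (40%) scenario, with the extra capital cost of drying equipment charged to the dried case. All distributions, prices and costs are invented, not the study's inputs.

        # Monte Carlo comparison of NPV distributions for two moisture scenarios (toy data).
        import numpy as np

        rng = np.random.default_rng(1)

        def npv_samples(moisture, n=10_000, years=10, rate=0.10):
            lhv = rng.normal(13.5 - 12.0 * moisture, 0.5, n)            # MJ/kg, falls with moisture
            litter = rng.normal(40_000, 3_000, n)                       # t/yr of poultry litter
            price = rng.normal(4.0, 0.8, n)                             # $/GJ of useful energy
            capex = 2_500_000 + (8_000_000 if moisture <= 0.25 else 0)  # drying equipment if dried
            annual = litter * lhv * price                               # t/yr * MJ/kg * $/GJ = $/yr
            discount = sum(1 / (1 + rate) ** t for t in range(1, years + 1))
            return annual * discount - capex

        for m in (0.25, 0.40):
            s = npv_samples(m)
            print(f"moisture {m:.0%}: mean NPV ${s.mean():,.0f}, P(NPV < 0) = {(s < 0).mean():.2f}")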

  7. Assessment of a French scenario with the INPRO methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vasile, A.; Fiorini, G.L.; Cazalet, J.

    2006-07-01

    This paper presents the French contribution to the Joint Study of the IAEA International Project on Innovative Nuclear Reactors and Fuel Cycles (INPRO). It concerns the application of the INPRO methodology to a French scenario, on the transition from present LWRs to EPRs in a first phase and to 4th-generation fast reactors in a second phase during the 21st century. The scenario also considers the renewal of the present fuel cycle facilities by third- and fourth-generation ones. The present practice of plutonium recycling in PWRs is replaced by the middle of the century by a global recycling of actinides, uranium, plutonium and minor actinides in fast reactors. The status and the evolution of the INPRO criteria and the corresponding indicators during the studied period are analyzed for each of the six considered areas: economics, safety, environment, waste management, proliferation resistance and infrastructure. Improvements in economics and safety are expected for both the EPR and the 4th-generation systems, which have these improvements among their basic goals. The use of fast reactors and global recycling of actinides leads to a significant improvement in environmental indicators and in particular in natural resource utilization. The envisaged waste management policy results in significant reductions in the mass, thermal load and radiotoxicity of the final waste, which only contains fission products. The use of fuels that do not rely on enriched uranium and separated plutonium increases the proliferation resistance characteristics of the future fuel cycle. The paper also summarizes some recommendations on the data, codes and methods used to support the continuous improvement of the INPRO methodology and help future assessors. (authors)

  8. The effects of land use change and precipitation change on direct runoff in Wei River watershed, China.

    PubMed

    Dong, Leihua; Xiong, Lihua; Lall, Upmanu; Wang, Jiwu

    2015-01-01

    The mechanisms by which, and degrees to which, land use change and climate change affect direct runoff generation are distinct. In this paper, based on MODIS land use data for 1992 and 2003, the impacts of land use and climate change are explored using the Soil Conservation Service Curve Number (SCS-CN) method under two defined scenarios. In the first scenario, the precipitation is assumed to be constant, so the consequence of land use change can be evaluated. In the second scenario, the land use condition is assumed to be constant, so the influence induced by climate change alone can be assessed. Combining the conclusions of the two scenarios, the effects of land use and climate change on direct runoff volume can be separated. Finally, it is concluded that, for the study basin, the land use types with the greatest effect on direct runoff generation are agricultural land and water bodies. For the large sub-basins, the effect of land use change is generally larger than that of climate change; most of the middle and small sub-basins are likewise affected more by land use change than by climate change.
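
    The SCS-CN relation applied in the study is a standard formula; in metric form, with the conventional initial-abstraction ratio of 0.2, it can be written as follows.

        # Standard SCS Curve Number direct-runoff relation (metric units).
        def scs_cn_runoff(p_mm: float, cn: float) -> float:
            """Direct runoff depth Q (mm) for storm rainfall P (mm) and curve number CN."""
            s = 25400.0 / cn - 254.0          # potential maximum retention (mm)
            ia = 0.2 * s                      # initial abstraction (conventional ratio)
            if p_mm <= ia:
                return 0.0
            return (p_mm - ia) ** 2 / (p_mm - ia + s)

        # Example: the same 60 mm storm on agricultural land vs. a less pervious land use
        print(scs_cn_runoff(60, cn=72), scs_cn_runoff(60, cn=85))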

  9. LCA of waste prevention activities: a case study for drinking water in Italy.

    PubMed

    Nessi, Simone; Rigamonti, Lucia; Grosso, Mario

    2012-10-15

    The strategic relevance of waste prevention has considerably increased worldwide during recent years, such that the current European legislation requires the preparation of national waste prevention programmes in which reduction objectives and measures are identified. In such a context, it is possible to recognise how, in order to correctly evaluate the environmental consequences of a prevention activity, a life cycle perspective should be employed. This allows us to go beyond the simple reduction of the generated waste which, alone, does not automatically imply achieving better overall environmental performance, especially when this reduction is not pursued through the simple reduction of consumption. In this study, the energetic and environmental performance of two waste prevention activities considered particularly meaningful for the Italian context were evaluated using life cycle assessment (LCA) methodology. The two activities were the utilisation of public network water (two scenarios) and of refillable bottled water (two scenarios) for drinking purposes, instead of one-way bottled water (three scenarios). The energy demand and specific potential impacts of the four waste prevention scenarios and of the three baseline scenarios were compared with the aim of evaluating whether, and under what conditions, the analysed prevention activities are actually associated with overall energetic and environmental benefits. In typical conditions, the use of public network water directly from the tap results in the best scenario, while if water is withdrawn from public fountains, its further transportation by private car can involve significant impacts. The use of refillable PET bottled water seems the preferable scenario for packaged water consumption, if refillable bottles are transported to local distributors along the same (or a lower) distance as one-way bottles to retailers. The use of refillable glass bottled water is preferable to one-way bottled water only if a distance beneath 150 km separates bottling plants from local distributors and retailers (except for eutrophication indicator which is always slightly worse). To reduce waste generation and to achieve meaningful potential savings of natural resources, energy and greenhouse gas emissions, a reduction in one-way bottled water consumption in Italy is recommended in favour of the use of public network water and of the establishment of short distance-PET bottles based refilling systems. The development of closed loop recycling of one-way PET bottles, and especially the reduction of the distance along which one-way bottled water is transported, are also important. Copyright © 2012 Elsevier Ltd. All rights reserved.

  10. Arbitrage model for optimal capital transactions in petroleum reserves

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ten Eyck, D.K.

    1986-01-01

    This dissertation provides a methodology for identifying price differentials in the market for petroleum reserves, enabling petroleum-producing firms to engage in a variation of classical arbitrage. This approach enables the petroleum-producing firm to evaluate and rank reserve-replacement projects from the three principal sources listed below in order to maximize the return on invested capital. The methodology is based on the discounted cash flow approach to valuation of the oil and gas reserves obtained (1) by exploration, (2) by direct purchase of reserves, and (3) by acquisition of an entire petroleum firm. The reserve-replacement projects are evaluated and ranked to determine an optimal portfolio of reserve-replacement projects. Cost per barrel alone is shown to be ineffective as an evaluation tool because it may lead to economic decisions that do not maximize the value of the firm. When used with other economic decision criteria, cost per barrel is useful as a downside economic indicator by showing which projects will fare better under unfavorable price scenarios. Important factors affecting the valuation of an acquisition (in addition to the oil and gas reserves) are shown by this study to be purchase price, other assets including cash, future tax savings from operating losses carried forward, and liabilities, primarily long-term debt.

  11. Experience with environmental issues in GM crop production and the likely future scenarios.

    PubMed

    Gaugitsch, Helmut

    2002-02-28

    In the Cartagena Protocol on Biosafety, standards for risk assessment of genetically modified organisms (GMOs) have been set. The criteria and information basis for the risk assessment of GMOs have been modified by the EU Directive 2001/18/EC. Various approaches to further improve the criteria for environmental risk assessment of GMOs are described in this study. Reports on the ecological impacts of the cultivation of certain non-transgenic crop plants with novel or improved traits as analogy models to transgenic plants showed that the effects of agricultural practice can be at least equally important as the effects of gene transfer and invasiveness, although the latter currently play a major role in risk assessment of transgenic crops. Based on these results the applicability of the methodology of 'Life Cycle Analysis (LCA)' for genetically modified plants in comparison with conventionally bred and organically grown crop plants was evaluated. The methodology was regarded as applicable with some necessary future improvements. In current projects, the assessment of toxicology and allergenicity of GM crops are analysed, and suggestions for standardization are developed. Based on results and recommendations from these efforts there are still the challenges of how to operationalize the precautionary principle and how to take into account ecologically sensitive ecosystems, including centres of origin and centres of genetic diversity.

  12. Multi crop model climate risk country-level management design: case study on the Tanzanian maize production system

    NASA Astrophysics Data System (ADS)

    Chavez, E.

    2015-12-01

    Future climate projections indicate that a very serious consequence of post-industrial anthropogenic global warming is the likelihood of greater frequency and intensity of extreme hydrometeorological events such as heat waves, droughts, storms, and floods. The design of national and international policies targeted at building more resilient and environmentally sustainable food systems needs to rely on access to robust and reliable data, which are largely absent. In this context, improving the modelling of current and future agricultural production losses using the unifying language of risk is paramount. In this study, we use a methodology that allows the integration of the current understanding of the various interacting systems of climate, agro-environment, crops, and the economy to determine short- to long-term risk estimates of crop production loss in different environmental, climate, and adaptation scenarios. This methodology is applied to Tanzania to assess optimum risk reduction and maize production increase paths in different climate scenarios. The simulations carried out use inputs from three different crop models (DSSAT, APSIM, WRSI) run in different technological scenarios, thus allowing the crop-model-driven bias in risk exposure estimates to be quantified. The results also distinguish region-specific optimum climate risk reduction policies under historical as well as RCP2.5 and RCP8.5 climate scenarios. The region-specific risk profiles obtained provide a simple framework to determine cost-effective risk management policies for Tanzania and allow investments in risk reduction and risk transfer to be combined optimally.

  13. Describing Treatment Effects to Patients

    PubMed Central

    Moxey, Annette; O'Connell, Dianne; McGettigan, Patricia; Henry, David

    2003-01-01

    OBJECTIVE To examine the impact of different presentations of equivalent information (framing) on treatment decisions faced by patients. DESIGN A systematic review of the published literature was conducted. English language publications allocating participants to different frames were retrieved using electronic and bibliographic searches. Two reviewers examined each article for inclusion, and assessed methodological quality. Study characteristics were tabulated and where possible, relative risks (RR; 95% confidence intervals) were calculated to estimate intervention effects. MEASUREMENTS AND MAIN RESULTS Thirty-seven articles, yielding 40 experimental studies, were included. Studies examined treatment (N = 24), immunization (N = 5), or health behavior scenarios (N = 11). Overall, active treatments were preferred when outcomes were described in terms of relative rather than absolute risk reductions or number needed to treat. Surgery was preferred to other treatments when treatment efficacy was presented in a positive frame (survival) rather than a negative frame (mortality) (relative risk [RR] = 1.51, 95% confidence interval [CI], 1.39 to 1.64). Framing effects were less obvious for immunization and health behavior scenarios. Those with little interest in the behavior at baseline were influenced by framing, particularly when information was presented as gains. In studies judged to be of good methodological quality and/or examining actual decisions, the framing effect, although still evident, was less convincing compared to the results of all included studies. CONCLUSIONS Framing effects varied with the type of scenario, responder characteristics, scenario manipulations, and study quality. When describing treatment effects to patients, expressing the information in more than one way may present a balanced view to patients and enable them to make informed decisions. PMID:14687282
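
    The framing alternatives discussed above are arithmetically equivalent descriptions of the same effect; for a hypothetical trial, the absolute risk reduction, relative risk reduction and number needed to treat are computed as follows.

        # Equivalent framings of one treatment effect (hypothetical 20% vs 15% event rates).
        control_risk, treated_risk = 0.20, 0.15

        arr = control_risk - treated_risk          # absolute risk reduction: 5 percentage points
        rrr = arr / control_risk                   # relative risk reduction: 25%
        nnt = 1 / arr                              # number needed to treat: 20 patients

        print(f"ARR = {arr:.0%}, RRR = {rrr:.0%}, NNT = {nnt:.0f}")
        # "25% relative reduction" and "treat 20 patients to prevent one event" describe
        # the same effect, yet tend to elicit different treatment preferences.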

  14. MisTec - A software application for supporting space exploration scenario options and technology development analysis and planning

    NASA Technical Reports Server (NTRS)

    Horsham, Gary A. P.

    1992-01-01

    The structure and composition of a new, emerging software application, which models and analyzes space exploration scenario options for feasibility based on technology development projections, are presented. The software application consists of four main components: a scenario generator for designing and inputting scenario options and constraints; a processor which performs algorithmic coupling and options analyses of mission activity requirements and technology capabilities; a results display which graphically and textually shows coupling and options analysis results; and a data/knowledge base which contains information on a variety of mission activities and (power and propulsion) technology system capabilities. The general long-range study process used by NASA to support recent studies is briefly introduced to provide the primary basis for comparison in discussing the potential advantages to be gained from developing and applying this kind of application. A hypothetical example of a scenario option is presented to facilitate a conceptual understanding of what the application is, how it works (the operating methodology), and when it might be applied.

  15. MisTec: A software application for supporting space exploration scenario options and technology development analysis and planning

    NASA Technical Reports Server (NTRS)

    Horsham, Gary A. P.

    1991-01-01

    The structure and composition of a new, emerging software application, which models and analyzes space exploration scenario options for feasibility based on technology development projections, are presented. The software application consists of four main components: a scenario generator for designing and inputting scenario options and constraints; a processor which performs algorithmic coupling and options analyses of mission activity requirements and technology capabilities; a results display which graphically and textually shows coupling and options analysis results; and a data/knowledge base which contains information on a variety of mission activities and (power and propulsion) technology system capabilities. The general long-range study process used by NASA to support recent studies is briefly introduced to provide the primary basis for comparison in discussing the potential advantages to be gained from developing and applying this kind of application. A hypothetical example of a scenario option is presented to facilitate a conceptual understanding of what the application is, how it works (the operating methodology), and when it might be applied.

  16. Flying into the future: aviation emissions scenarios to 2050.

    PubMed

    Owen, Bethan; Lee, David S; Lim, Ling

    2010-04-01

    This study describes the methodology and results for calculating future global aviation emissions of carbon dioxide and oxides of nitrogen from air traffic under four of the IPCC/SRES (Intergovernmental Panel on Climate Change/Special Report on Emissions Scenarios) marker scenarios: A1B, A2, B1, and B2. In addition, a mitigation scenario has been calculated for the B1 scenario, requiring rapid and significant technology development and transition. A global model of aircraft movements and emissions (FAST) was used to calculate fuel use and emissions to 2050 with a further outlook to 2100. The aviation emission scenarios presented are designed to interpret the SRES and have been developed to aid in the quantification of the climate change impacts of aviation. Demand projections are made for each scenario, determined by SRES economic growth factors and the SRES storylines. Technology trends are examined in detail and developed for each scenario providing plausible projections for fuel efficiency and emissions control technology appropriate to the individual SRES storylines. The technology trends that are applied are calculated from bottom-up inventory calculations and industry technology trends and targets. Future emissions of carbon dioxide are projected to grow between 2000 and 2050 by a factor in the range of 2.0 and 3.6 depending on the scenario. Emissions of oxides of nitrogen associated with aviation over the same period are projected to grow by between a factor of 1.2 and 2.7.

  17. The impact of municipal solid waste treatment methods on greenhouse gas emissions in Lahore, Pakistan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Batool, Syeda Adila; Chuadhry, Muhammad Nawaz

    2009-01-15

    The contribution of existing municipal solid waste management to emission of greenhouse gases and the alternative scenarios to reduce emissions were analyzed for Data Ganj Bukhsh Town (DGBT) in Lahore, Pakistan using the life cycle assessment methodology. DGBT has a population of 1,624,169 people living in 232,024 dwellings. Total waste generated is 500,000 tons per year with an average per capita rate of 0.84 kg per day. Alternative scenarios were developed and evaluated according to the environmental, economic, and social atmosphere of the study area. Solid waste management options considered include the collection and transportation of waste, collection of recyclables with single and mixed material bank container systems (SMBCS, MMBCS), material recovery facilities (MRF), composting, biogasification and landfilling. A life cycle inventory (LCI) of the six scenarios along with the baseline scenario was completed; this helped to quantify the CO2 equivalents, emitted and avoided, for energy consumption, production, fuel consumption, and methane (CH4) emissions. LCI results showed that the contribution of the baseline scenario to the global warming potential as CO2 equivalents was a maximum of 838,116 tons. The sixth scenario had a maximum reduction of GHG emissions in terms of CO2 equivalents of -33,773 tons, but the most workable scenario for the current situation in the study area is scenario 5. It saves 25% in CO2 equivalents compared to the baseline scenario.

  18. Assessment of the Risk of Ebola Importation to Australia

    PubMed Central

    Cope, Robert C.; Cassey, Phillip; Hugo, Graeme J.; Ross, Joshua V.

    2014-01-01

    Objectives: To assess the risk of Ebola importation to Australia during the first six months of 2015, based upon the current outbreak in West Africa. Methodology: We assessed the risk under two distinct scenarios: (i) assuming that significant numbers of cases of Ebola remain confined to Guinea, Liberia and Sierra Leone, and using historic passenger arrival data into Australia; and, (ii) assuming potential secondary spread based upon international flight data. A model appropriate to each scenario is developed, and parameterised using passenger arrival card or international flight data, and World Health Organisation case data from West Africa. These models were constructed based on WHO Ebola outbreak data as at 17 October 2014 and 3 December 2014. An assessment of the risk under each scenario is reported. On 27 October 2014 the Australian Government announced a policy change, that visas from affected countries would be refused/cancelled, and the predicted effect of this policy change is reported. Results: The current probability of at least one case entering Australia by 1 July 2015, having travelled directly from West Africa with historic passenger arrival rates into Australia, is 0.34. Under the new Australian Government policy of restricting visas from affected countries (as of 27 October 2014), the probability of at least one case entering Australia by 1 July 2015 is reduced to 0.16. The probability of at least one case entering Australia by 1 July 2015 via an outbreak from a secondary source country is approximately 0.12. Conclusions: Our models suggest that if the transmission of Ebola remains unchanged, it is possible that a case will enter Australia within the first six months of 2015, either directly from West Africa (even when current visa restrictions are considered), or via secondary outbreaks elsewhere. Government and medical authorities should be prepared to respond to this eventuality. Control measures within West Africa over recent months have contributed to a reduction in projected risk of a case entering Australia. A significant further reduction of the rate at which Ebola is proliferating in West Africa, and control of the disease if and when it proliferates elsewhere, will continue to result in substantially lower risk of the disease entering Australia. PMID:25685627
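
    A sketch of the direct-importation logic under a Poisson-arrival assumption: the probability of at least one imported case over the period is 1 - exp(-lambda), where lambda is the expected number of infectious arrivals. The arrival rate and per-traveller probability below are invented, not the study's parameters.

        # Probability of at least one imported case under a Poisson arrival assumption.
        import math

        monthly_arrivals = 120          # assumed arrivals into Australia from affected countries
        p_infected = 5e-4               # assumed probability an arriving traveller is infectious
        months = 6                      # first half of 2015

        expected_cases = monthly_arrivals * p_infected * months
        p_at_least_one = 1 - math.exp(-expected_cases)          # Poisson: P(N >= 1)
        print(f"P(at least one imported case) ~ {p_at_least_one:.2f}")

        # A visa restriction that removes, say, 60% of these arrivals scales the rate directly:
        p_restricted = 1 - math.exp(-0.4 * expected_cases)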

  19. Safety of railroad passenger vehicle dynamics : OMNISIM simulation and test correlations for passenger rail cars

    DOT National Transportation Integrated Search

    2002-07-01

    The purpose of the work is to validate the safety assessment methodology previously developed for passenger rail vehicle dynamics, which requires the application of simulation tools as well as testing of vehicles under different track scenarios. This...

  20. Commercial Demand Module - NEMS Documentation

    EIA Publications

    2017-01-01

    Documents the objectives, analytical approach and development of the National Energy Modeling System (NEMS) Commercial Sector Demand Module. The report catalogues and describes the model assumptions, computational methodology, parameter estimation techniques, model source code, and forecast results generated through the synthesis and scenario development based on these components.

  1. Optimal scheduling and its Lyapunov stability for advanced load-following energy plants with CO 2 capture

    DOE PAGES

    Bankole, Temitayo; Jones, Dustin; Bhattacharyya, Debangsu; ...

    2017-11-03

    In this study, a two-level control methodology consisting of an upper-level scheduler and a lower-level supervisory controller is proposed for an advanced load-following energy plant with CO 2 capture. With the use of an economic objective function that considers fluctuation in electricity demand and price at the upper level, optimal scheduling of energy plant electricity production and carbon capture with respect to several carbon tax scenarios is implemented. The optimal operational profiles are then passed down to corresponding lower-level supervisory controllers designed using a methodological approach that balances control complexity with performance. Finally, it is shown how optimal carbon capture and electricity production rate profiles for an energy plant such as the integrated gasification combined cycle (IGCC) plant are affected by electricity demand and price fluctuations under different carbon tax scenarios. The paper also presents a Lyapunov stability analysis of the proposed scheme.

  2. Deterministic propagation model for RFID using site-specific and FDTD

    NASA Astrophysics Data System (ADS)

    Cunha de Azambuja, Marcelo; Passuelo Hessel, Fabiano; Luís Berz, Everton; Bauermann Porfírio, Leandro; Ruhnke Valério, Paula; De Pieri Baladei, Suely; Jung, Carlos Fernando

    2015-06-01

    Conducting experiments to evaluate tag orientation and readability in a laboratory offers great potential for reducing time and costs for users. This article presents a novel methodology for developing simulation models for RFID (radio-frequency identification) environments. The main challenges in adopting this model are: (1) to find out how the properties of each material on which the tag is applied influence the read range, and to determine the power necessary for tag reading; and (2) to find out the power of the backscattered signal received by the tag when energised by the RF wave transmitted by the reader. Validation tests, performed in four different kinds of environments, with tags applied to six different materials, at six different distances and with a reader configured at three different power levels, achieved an average accuracy of 95.3% in the best scenario and 87.0% in the worst scenario. The methodology can easily be replicated to generate simulation models for other RFID environments.
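    The two challenges listed above amount to a link-budget question: how much power reaches the tag, and how much backscattered power returns to the reader. A minimal free-space sketch based on the Friis transmission equation is given below; the transmit power, antenna gains, tag sensitivity and backscatter loss are assumed values, and the site-specific and FDTD treatment of materials used in the paper is not reproduced here.

```python
import math

def friis_received_power_dbm(p_tx_dbm, g_tx_dbi, g_rx_dbi, freq_hz, distance_m):
    """Free-space received power (dBm) from the Friis transmission equation."""
    wavelength = 3e8 / freq_hz
    path_loss_db = 20 * math.log10(4 * math.pi * distance_m / wavelength)
    return p_tx_dbm + g_tx_dbi + g_rx_dbi - path_loss_db

# Illustrative UHF RFID parameters (not taken from the paper)
reader_power_dbm = 30.0      # reader transmit power
reader_gain_dbi = 6.0
tag_gain_dbi = 2.0
freq = 915e6                 # Hz
tag_sensitivity_dbm = -18.0  # minimum power needed to energise the tag chip (assumed)
backscatter_loss_db = 6.0    # modulation/backscatter loss (assumed)

for d in (1, 2, 4, 6):
    p_tag = friis_received_power_dbm(reader_power_dbm, reader_gain_dbi, tag_gain_dbi, freq, d)
    # The backscattered signal travels the same path back to the reader
    p_back = friis_received_power_dbm(p_tag - backscatter_loss_db, tag_gain_dbi, reader_gain_dbi, freq, d)
    print(f"{d} m: power at tag {p_tag:6.1f} dBm, backscatter at reader {p_back:6.1f} dBm, "
          f"readable={p_tag >= tag_sensitivity_dbm}")
```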

  3. Modeling Negotiation by a Participatory Approach

    NASA Astrophysics Data System (ADS)

    Torii, Daisuke; Ishida, Toru; Bousquet, François

    In the participatory approach used by social scientists, role-playing games (RPGs) are effective for understanding the real thinking and behavior of stakeholders, but RPGs are not sufficient to handle a dynamic process like negotiation. In this study, a participatory simulation in which user-controlled avatars and autonomous agents coexist is introduced into the participatory approach for modeling negotiation. To establish a modeling methodology for negotiation, we tackled the following two issues. First, to enable domain experts to concentrate on interaction design for participatory simulation, we adopted an architecture in which an interaction layer controls agents and defined three types of interaction descriptions to be written (interaction protocol, interaction scenario and avatar control scenario). Second, to enable domain experts and stakeholders to capitalize on participatory simulation, we established a four-step process for acquiring a negotiation model: 1) surveys of and interviews with stakeholders, 2) RPG, 3) interaction design, and 4) participatory simulation. Finally, we discuss our methodology through a case study in agricultural economics in northeast Thailand.

  4. Optimal scheduling and its Lyapunov stability for advanced load-following energy plants with CO2 capture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bankole, Temitayo; Jones, Dustin; Bhattacharyya, Debangsu

    In this study, a two-level control methodology consisting of an upper-level scheduler and a lower-level supervisory controller is proposed for an advanced load-following energy plant with CO2 capture. With the use of an economic objective function that considers fluctuation in electricity demand and price at the upper level, optimal scheduling of energy plant electricity production and carbon capture with respect to several carbon tax scenarios is implemented. The optimal operational profiles are then passed down to corresponding lower-level supervisory controllers designed using a methodological approach that balances control complexity with performance. Finally, it is shown how optimal carbon capture and electricity production rate profiles for an energy plant such as the integrated gasification combined cycle (IGCC) plant are affected by electricity demand and price fluctuations under different carbon tax scenarios. The paper also presents a Lyapunov stability analysis of the proposed scheme.

  5. Plastic solid waste utilization technologies: A Review

    NASA Astrophysics Data System (ADS)

    Awasthi, Arun Kumar; Shivashankar, Murugesh; Majumder, Suman

    2017-11-01

    Plastics are used in a growing number of applications worldwide and have become an essential part of daily life. In Indian cities and villages, plastics appear as carry bags for vegetables, drinking water bottles, household furniture, kitchen objects and utensils, and drums for packing and storing industrial chemicals, among many other uses. After use, plastics become part of the waste stream and create pollution due to the presence of toxic chemicals, spreading disease and giving rise to wider societal problems. In the current scenario the volume of plastic waste is increasing day by day and is very difficult to manage. Only a limited number of methodologies are available for reutilizing plastic waste, such as recycling, landfill, incineration, gasification and hydrogenation. In this paper we review the existing methodologies for plastic waste utilization in the current scenario.

  6. Radiation Transport Calculation of the UGXR Collimators for the Jules Horowitz Reactor (JHR)

    NASA Astrophysics Data System (ADS)

    Chento, Yelko; Hueso, César; Zamora, Imanol; Fabbri, Marco; Fuente, Cristina De La; Larringan, Asier

    2017-09-01

    The Jules Horowitz Reactor (JHR), a major infrastructure of European interest in the fission domain, will be built and operated in the framework of an international cooperation, including the development and qualification of materials and nuclear fuel used in the nuclear industry. For this purpose the UGXR Collimators, two multi-slit gamma and X-ray collimation mechatronic systems, will be installed at the JHR pool and at the Irradiated Components Storage pool. The expected amounts of radiation produced by the spent fuel and the X-ray accelerator imply that diverse aspects need to be verified to ensure adequate radiological zoning and personnel radiation protection. A computational methodology was devised to validate the Collimators' design by coupling different engineering codes. In summary, several assessments were performed with MCNP5v1.60 to fulfil all the radiological requirements in the Nominal scenario (TEDE < 25 µSv/h) and the Maintenance scenario (TEDE < 2 mSv/h), among others, detailing the methodology, hypotheses and assumptions employed.

  7. Modeling the Car Crash Crisis Management System Using HiLA

    NASA Astrophysics Data System (ADS)

    Hölzl, Matthias; Knapp, Alexander; Zhang, Gefei

    An aspect-oriented modeling approach to the Car Crash Crisis Management System (CCCMS) using the High-Level Aspect (HiLA) language is described. HiLA is a language for expressing aspects for UML static structures and UML state machines. In particular, HiLA supports both a static graph transformational and a dynamic approach of applying aspects. Furthermore, it facilitates methodologically turning use case descriptions into state machines: for each main success scenario, a base state machine is developed; all extensions to this main success scenario are covered by aspects. Overall, the static structure of the CCCMS is modeled in 43 classes, the main success scenarios in 13 base machines, the use case extensions in 47 static and 31 dynamic aspects, most of which are instantiations of simple aspect templates.

  8. A Bayesian beta distribution model for estimating rainfall IDF curves in a changing climate

    NASA Astrophysics Data System (ADS)

    Lima, Carlos H. R.; Kwon, Hyun-Han; Kim, Jin-Young

    2016-09-01

    The estimation of intensity-duration-frequency (IDF) curves for rainfall data comprises a classical task in hydrology studies to support a variety of water resources projects, including urban drainage and the design of flood control structures. In a changing climate, however, traditional approaches based on historical records of rainfall and on the stationarity assumption can be inadequate and lead to poor estimates of rainfall intensity quantiles. Climate change scenarios built on General Circulation Models offer a way to access and estimate future changes in spatial and temporal rainfall patterns at the daily scale at best, which is not as fine a temporal resolution as required (e.g. hours) to directly estimate IDF curves. In this paper we propose a novel methodology based on a four-parameter beta distribution to estimate IDF curves conditioned on the observed (or simulated) daily rainfall, which becomes the time-varying upper bound of the updated nonstationary beta distribution. The inference is conducted in a Bayesian framework that provides a better way to take into account the uncertainty in the model parameters when building the IDF curves. The proposed model is tested using rainfall data from four stations located in South Korea and projected climate change under Representative Concentration Pathway (RCP) scenarios 6 and 8.5 from the Met Office Hadley Centre HadGEM3-RA model. The results show that the developed model fits the historical data as well as the traditional Generalized Extreme Value (GEV) distribution but is able to produce future IDF curves that significantly differ from the historically based IDF curves. The proposed model predicts for the stations and RCP scenarios analysed in this work an increase in the intensity of extreme rainfalls of short duration with long return periods.
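    A stripped-down version of the idea, inferring the shape parameters of a beta distribution whose upper bound is the daily rainfall total, can be sketched with a simple grid approximation of the posterior. The synthetic data, flat priors and grid bounds below are illustrative only and do not reproduce the paper's four-parameter model or the Korean station records.

```python
import numpy as np
from scipy.stats import beta

rng = np.random.default_rng(0)

# Synthetic stand-in data: daily rainfall totals (upper bounds) and the largest
# 1-hour depth within each day, assumed to follow a beta law on (0, daily total).
daily_total = rng.gamma(shape=2.0, scale=20.0, size=200)        # mm/day
hourly_max = daily_total * rng.beta(a=2.5, b=6.0, size=200)     # mm

y = hourly_max / daily_total                                    # rescaled to (0, 1)

# Grid approximation of the posterior over the two beta shape parameters (flat priors)
a_grid = np.linspace(0.5, 8.0, 150)
b_grid = np.linspace(0.5, 12.0, 150)
A, B = np.meshgrid(a_grid, b_grid, indexing="ij")
log_like = np.array([[beta.logpdf(y, a, b).sum() for b in b_grid] for a in a_grid])
post = np.exp(log_like - log_like.max())
post /= post.sum()

a_hat = (A * post).sum()
b_hat = (B * post).sum()
print(f"posterior mean shape parameters: a = {a_hat:.2f}, b = {b_hat:.2f}")

# 0.99 quantile of the hourly maximum conditioned on a hypothetical 80 mm day
print("conditional 0.99 quantile (mm):", 80 * beta.ppf(0.99, a_hat, b_hat))
```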

  9. Tsunami risk zoning in south-central Chile

    NASA Astrophysics Data System (ADS)

    Lagos, M.

    2010-12-01

    The recent 2010 Chilean tsunami revealed the need to optimize methodologies for assessing the risk of disaster. In this context, modern techniques and criteria for the evaluation of the tsunami phenomenon were applied in the coastal zone of south-central Chile as a specific methodology for the zoning of tsunami risk. This methodology allows the identification and validation of a tsunami hazard scenario; the spatialization of factors that have an impact on the risk; and the zoning of the tsunami risk. For the hazard evaluation, different scenarios were modeled by means of numerical simulation techniques, selecting and validating the results that best fit the observed tsunami data. Hydrodynamic parameters of the inundation as well as physical and socioeconomic vulnerability aspects were considered for the spatialization of the factors that affect the tsunami risk. The tsunami risk zoning was integrated into a Geographic Information System (GIS) by means of multicriteria evaluation (MCE). The results of the tsunami risk zoning show that the local characteristics and their location, together with the concentration of poverty levels, establish spatially differentiated risk levels. This information forms the basis for future applied studies in land use planning that tend to minimize the risk levels associated with the tsunami hazard. This research is supported by Fondecyt 11090210.

  10. Stoffenmanager exposure model: company-specific exposure assessments using a Bayesian methodology.

    PubMed

    van de Ven, Peter; Fransman, Wouter; Schinkel, Jody; Rubingh, Carina; Warren, Nicholas; Tielemans, Erik

    2010-04-01

    The web-based tool "Stoffenmanager" was initially developed to assist small- and medium-sized enterprises in the Netherlands to make qualitative risk assessments and to provide advice on control at the workplace. The tool uses a mechanistic model to arrive at a "Stoffenmanager score" for exposure. In a recent study it was shown that variability in exposure measurements given a certain Stoffenmanager score is still substantial. This article discusses an extension to the tool that uses a Bayesian methodology for quantitative workplace/scenario-specific exposure assessment. This methodology allows for real exposure data observed in the company of interest to be combined with the prior estimate (based on the Stoffenmanager model). The output of the tool is a company-specific assessment of exposure levels for a scenario for which data is available. The Bayesian approach provides a transparent way of synthesizing different types of information and is especially preferred in situations where available data is sparse, as is often the case in small- and medium-sized enterprises. Real-world examples as well as simulation studies were used to assess how different parameters such as sample size, difference between prior and data, uncertainty in prior, and variance in the data affect the eventual posterior distribution of a Bayesian exposure assessment.
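    The core of such a Bayesian update can be sketched with a conjugate normal model on the log scale, in which a prior for the geometric mean exposure (standing in for the mechanistic Stoffenmanager estimate) is combined with company measurements. The prior geometric mean, its uncertainty and the measurement values below are invented for illustration and are not Stoffenmanager outputs.

```python
import numpy as np

def bayesian_log_update(prior_gm, prior_gsd_of_gm, measurements, measurement_gsd):
    """Conjugate normal update of the log geometric-mean exposure.

    prior_gm        : prior geometric mean exposure (e.g. from a mechanistic model)
    prior_gsd_of_gm : uncertainty in that geometric mean, expressed as a geometric SD
    measurements    : observed exposure values for the scenario (same units)
    measurement_gsd : assumed within-scenario geometric SD of the measurements
    """
    mu0, tau0 = np.log(prior_gm), np.log(prior_gsd_of_gm)
    sigma = np.log(measurement_gsd)
    x = np.log(np.asarray(measurements, dtype=float))
    n = x.size
    post_var = 1.0 / (1.0 / tau0**2 + n / sigma**2)
    post_mean = post_var * (mu0 / tau0**2 + x.sum() / sigma**2)
    return np.exp(post_mean), np.exp(np.sqrt(post_var))   # posterior GM and its GSD

# Illustrative numbers only: prior GM of 1.2 mg/m3 with a factor-3 uncertainty,
# updated with four workplace measurements.
gm, gsd = bayesian_log_update(1.2, 3.0, [0.4, 0.7, 0.3, 0.9], measurement_gsd=2.5)
print(f"posterior geometric mean: {gm:.2f} mg/m3 (uncertainty GSD {gsd:.2f})")
```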

  11. The Interpersonal Adaptiveness of Dispositional Guilt and Shame: A Meta-Analytic Investigation.

    PubMed

    Tignor, Stefanie M; Colvin, C Randall

    2017-06-01

    Despite decades of empirical research, conclusions regarding the adaptiveness of dispositional guilt and shame are mixed. We use meta-analysis to summarize the empirical literature and clarify these ambiguities. Specifically, we evaluate how guilt and shame are uniquely related to pro-social orientation and, in doing so, highlight the substantial yet under-acknowledged impact of researchers' methodological choices. A series of meta-analyses was conducted investigating the relationship between dispositional guilt (or shame) and pro-social orientation. Two main methodological moderators of interest were tested: test format (scenario vs. checklist) and statistical analysis (semi-partial vs. zero-order correlations). Among studies employing zero-order correlations, dispositional guilt was positively correlated with pro-social orientation (k = 63, Mr = .13, p < .001), whereas dispositional shame was negatively correlated, (k = 47, Mr = -.05, p = .07). Test format was a significant moderator for guilt studies only, with scenario measures producing significantly stronger effects. Semi-partial correlations resulted in significantly stronger effects among guilt and shame studies. Although dispositional guilt and shame are differentially related to pro-social orientation, such relationships depend largely on the methodological choices of the researcher, particularly in the case of guilt. Implications for the study of these traits are discussed. © 2016 Wiley Periodicals, Inc.

  12. Sustainable breeding objectives and possible selection response: Finding the balance between economics and breeders' preferences.

    PubMed

    Fuerst-Waltl, Birgit; Fuerst, Christian; Obritzhauser, Walter; Egger-Danner, Christa

    2016-12-01

    To optimize breeding objectives of Fleckvieh and Brown Swiss cattle, economic values were re-estimated using updated prices, costs, and population parameters. Subsequently, the expected selection responses for the total merit index (TMI) were calculated using previous and newly derived economic values. The responses were compared for alternative scenarios that consider breeders' preferences. A dairy herd with milk production, bull fattening, and rearing of replacement stock was modeled. The economic value of a trait was derived by calculating the difference in herd profit before and after genetic improvement. Economic values for each trait were derived while keeping all other traits constant. The traits considered were dairy, beef, and fitness traits, the latter including direct health traits. The calculation of the TMI and the expected selection responses was done using selection index methodology with estimated breeding values instead of phenotypic deviations. For the scenario representing the situation up to 2016, all traits included in the TMI were considered with their respective economic values before the update. Selection response was also calculated for newly derived economic values and some alternative scenarios, including the new trait vitality index (subindex comprising stillbirth and rearing losses). For Fleckvieh, the relative economic value for the trait groups milk, beef, and fitness were 38, 16, and 46%, respectively, up to 2016, and 39, 13, and 48%, respectively, for the newly derived economic values. Approximately the same selection response may be expected for the milk trait group, whereas the new weightings resulted in a substantially decreased response in beef traits. Within the fitness block, all traits, with the exception of fertility, showed a positive selection response. For Brown Swiss, the relative economic values for the main trait groups milk, beef, and fitness were 48, 5, and 47% before 2016, respectively, whereas for the newly derived scenario they were 40, 14, and 39%. For both Brown Swiss and Fleckvieh, the fertility complex was expected to further deteriorate, whereas all other expected selection responses for fitness traits were positive. Several additional and alternative scenarios were calculated as a basis for discussion with breeders. A decision was made to implement TMI with relative economic values for milk, beef, and fitness with 38, 18, and 44% for Fleckvieh and 50, 5, and 45% for Brown Swiss, respectively. In both breeds, no positive expected selection response was predicted for fertility, although this trait complex received a markedly higher weight than that derived economically. An even higher weight for fertility could not be agreed on due to the effect on selection response of other traits. Hence, breeders decided to direct more attention toward the preselection of bulls with regard to fertility. Copyright © 2016 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  13. The potential of using the Ecosystem Approach in the implementation of the EU Water Framework Directive.

    PubMed

    Vlachopoulou, M; Coughlin, D; Forrow, D; Kirk, S; Logan, P; Voulvoulis, N

    2014-02-01

    The Ecosystem Approach provides a framework for looking at whole ecosystems in decision making to ensure that society can maintain a healthy and resilient natural environment now and for future generations. Although not explicitly mentioned in the Water Framework Directive, the Ecosystem Approach appears to be a promising concept to help its implementation, on the basis that there is a connection between the aims and objectives of the Directive (including good ecological status) and the provision of ecosystem services. In this paper, methodological linkages between the Ecosystem Approach and the Water Framework Directive have been reviewed and a framework is proposed that links its implementation to the Ecosystem Approach taking into consideration all ecosystem services and water management objectives. Individual River Basin Management Plan objectives are qualitatively assessed as to how strong their link is with individual ecosystem services. The benefits of using this approach to provide a preliminary assessment of how it could support future implementation of the Directive have been identified and discussed. Findings also demonstrate its potential to encourage more systematic and systemic thinking as it can provide a consistent framework for identifying shared aims and evaluating alternative water management scenarios and options in decision making. Allowing for a broad consideration of the benefits, costs and tradeoffs that occur in each case, this approach can further improve the economic case for certain measures, and can also help shift the focus from strict legislative compliance towards a more holistic implementation that can deliver the wider aims and intentions of the Directive. © 2013.

  14. Simulated combined abnormal environment fire calculations for aviation impacts.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Alexander L.

    2010-08-01

    Aircraft impacts at flight speeds are relevant environments for aircraft safety studies. This type of environment pertains to normal environments such as wildlife impacts and rough landings, but also the abnormal environment that has more recently been evidenced in cases such as the Pentagon and World Trade Center events of September 11, 2001, and the FBI building impact in Austin. For more severe impacts, the environment is combined because it involves not just the structural mechanics, but also the release of the fuel and the subsequent fire. Impacts normally last on the order of milliseconds to seconds, whereas the fire dynamics may last for minutes to hours, or longer. This presents a serious challenge for physical models that employ discrete time stepping to model the dynamics with accuracy. Another challenge is that the capabilities to model the fire and structural impact are seldom found in a common simulation tool. Sandia National Labs maintains two codes under a common architecture that have been used to model the dynamics of aircraft impact and fire scenarios. Only recently have these codes been coupled directly to provide a fire prediction that is better informed on the basis of a detailed structural calculation. To enable this technology, several facilitating models are necessary, as is a methodology for determining and executing the transfer of information from the structural code to the fire code. A methodology has been developed and implemented. Previous test programs at the Sandia National Labs sled track provide unique data for the dynamic response of an aluminum tank of liquid water impacting a barricade at flight speeds. These data are used to validate the modeling effort, and suggest reasonable accuracy for the dispersion of a non-combustible fluid in an impact environment. The capability is also demonstrated with a notional impact of a fuel-filled container at flight speed. Both of these scenarios are used to evaluate numeric approximations, and help provide an understanding of the quantitative accuracy of the modeling methods.

  15. Introducing Hurst exponent in pair trading

    NASA Astrophysics Data System (ADS)

    Ramos-Requena, J. P.; Trinidad-Segovia, J. E.; Sánchez-Granero, M. A.

    2017-12-01

    In this paper we introduce a new methodology for pair trading. The new method is based on the calculation of the Hurst exponent of a pair. Our approach is inspired by the classical concepts of co-integration and mean reversion, but joins them under a single strategy. We show how the Hurst approach yields better results than the classical Distance Method and Correlation strategies in different scenarios. The results show that the new methodology is consistent and suitable, reducing the trading drawdown relative to the classical strategies and thereby achieving better performance.
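    One common estimator of the Hurst exponent of a spread uses the scaling of the standard deviation of lagged differences; values well below 0.5 indicate the anti-persistent, mean-reverting behaviour that pair trading looks for. The sketch below applies this estimator to a synthetic cointegrated pair; the series, hedge ratio and threshold are illustrative, and the paper's trading rules are not reproduced.

```python
import numpy as np

def hurst_exponent(series, max_lag=50):
    """Estimate H from the scaling law std(x[t+lag] - x[t]) ~ lag**H."""
    lags = np.arange(2, max_lag)
    tau = [np.std(series[lag:] - series[:-lag]) for lag in lags]
    slope, _ = np.polyfit(np.log(lags), np.log(tau), 1)
    return slope

rng = np.random.default_rng(1)
n = 2000
# Synthetic prices: two cointegrated random walks (illustrative, not market data)
common = np.cumsum(rng.normal(size=n))
a = common + rng.normal(scale=0.5, size=n)
b = 0.8 * common + rng.normal(scale=0.5, size=n)

spread = a - b / 0.8          # hedge ratio assumed known here for simplicity
h = hurst_exponent(spread)
print(f"Hurst exponent of the spread: {h:.2f}")
if h < 0.5:
    print("spread is anti-persistent (mean-reverting) -> candidate pair for trading")
```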

  16. THERP and HEART integrated methodology for human error assessment

    NASA Astrophysics Data System (ADS)

    Castiglia, Francesco; Giardina, Mariarosa; Tomarchio, Elio

    2015-11-01

    An integrated THERP and HEART methodology is proposed to investigate accident scenarios that involve operator errors during high-dose-rate (HDR) treatments. The new approach has been modified on the basis of fuzzy set concepts with the aim of prioritizing an exhaustive list of erroneous tasks that can lead to patient radiological overexposures. The results allow for the identification of human errors, which is necessary to achieve a better understanding of health hazards in the radiotherapy treatment process, so that it can be properly monitored and appropriately managed.

  17. A discussion on the methodology for calculating radiological and toxicological consequences for the spent nuclear fuel project at the Hanford Site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    RITTMANN, P.D.

    1999-07-14

    This report contains technical information used to determine accident consequences for the Spent Nuclear Fuel Project safety documents. It does not determine accident consequences or describe specific accident scenarios, but instead provides generic information.

  18. Energy Models and the Policy Process.

    ERIC Educational Resources Information Center

    De Man, Reinier

    1983-01-01

    Describes the function of econometric and technological models in the policy process, and shows how different positions in the Dutch energy discussion are reflected by the application of different model methodologies. Discussion includes the energy policy context, a conceptual framework for using energy models, and energy scenarios in policy…

  19. Ontology for E-Learning: A Case Study

    ERIC Educational Resources Information Center

    Colace, Francesco; De Santo, Massimo; Gaeta, Matteo

    2009-01-01

    Purpose: The development of adaptable and intelligent educational systems is widely considered one of the great challenges in scientific research. Among key elements for building advanced training systems, an important role is played by methodologies chosen for knowledge representation. In this scenario, the introduction of ontology formalism can…

  20. 12 CFR 252.146 - Methodologies and practices.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... SYSTEM (CONTINUED) ENHANCED PRUDENTIAL STANDARDS (REGULATION YY) Company-Run Stress Test Requirements for... stress test under sections 252.144 and 252.145, for each quarter of the planning horizon, a covered company must estimate the following for each scenario required to be used: (1) Losses, pre-provision net...

  1. Improving the spatial and temporal resolution with quantification of uncertainty and errors in earth observation data sets using Data Interpolating Empirical Orthogonal Functions methodology

    NASA Astrophysics Data System (ADS)

    El Serafy, Ghada; Gaytan Aguilar, Sandra; Ziemba, Alexander

    2016-04-01

    There is an increasing use of process-based models in the investigation of ecological systems and scenario predictions. The accuracy and quality of these models are improved when run with high spatial and temporal resolution data sets. However, ecological data can often be difficult to collect, which manifests itself through irregularities in the spatial and temporal domain of these data sets. Through the use of the Data INterpolating Empirical Orthogonal Functions (DINEOF) methodology, earth observation products can be improved to have full spatial coverage within the desired domain as well as increased temporal resolution, at daily and weekly time steps, which are frequently required by process-based models [1]. The DINEOF methodology results in a degree of error being affixed to the refined data product. In order to determine the degree of error introduced through this process, the suspended particulate matter and chlorophyll-a data from MERIS is used with DINEOF to produce high resolution products for the Wadden Sea. These new data sets are then compared with in-situ and other data sources to determine the error. Also, artificial cloud cover scenarios are conducted in order to substantiate the findings from MERIS data experiments. Secondly, the accuracy of DINEOF is explored to evaluate the variance of the methodology. The degree of accuracy is combined with the overall error produced by the methodology and reported in an assessment of the quality of DINEOF when applied to resolution refinement of chlorophyll-a and suspended particulate matter in the Wadden Sea. References [1] Sirjacobs, D.; Alvera-Azcárate, A.; Barth, A.; Lacroix, G.; Park, Y.; Nechad, B.; Ruddick, K.G.; Beckers, J.-M. (2011). Cloud filling of ocean colour and sea surface temperature remote sensing products over the Southern North Sea by the Data Interpolating Empirical Orthogonal Functions methodology. J. Sea Res. 65(1): 114-130. Dx.doi.org/10.1016/j.seares.2010.08.002
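    DINEOF-style gap filling can be sketched as an iteration in which missing values are repeatedly replaced by a truncated-SVD (EOF) reconstruction of the data matrix. The toy example below uses a synthetic space-time field with random cloud-like gaps; the number of retained modes is fixed by hand here, whereas the full method selects it by cross-validation.

```python
import numpy as np

def dineof_fill(data, n_modes=3, n_iter=50):
    """Fill NaN gaps by iterative truncated-SVD reconstruction (DINEOF-style)."""
    filled = np.where(np.isnan(data), np.nanmean(data), data)   # first guess: field mean
    mask = np.isnan(data)
    for _ in range(n_iter):
        u, s, vt = np.linalg.svd(filled, full_matrices=False)
        recon = (u[:, :n_modes] * s[:n_modes]) @ vt[:n_modes]
        filled[mask] = recon[mask]                              # update only the gaps
    return filled

rng = np.random.default_rng(2)
# Synthetic "chlorophyll" field: a few spatial modes times temporal signals plus noise
space, time = 60, 120
truth = rng.normal(size=(space, 3)) @ rng.normal(size=(3, time)) + 0.1 * rng.normal(size=(space, time))

observed = truth.copy()
observed[rng.random(truth.shape) < 0.3] = np.nan                # 30% missing, cloud-like

filled = dineof_fill(observed, n_modes=3)
gap = np.isnan(observed)
rmse = np.sqrt(np.mean((filled[gap] - truth[gap]) ** 2))
print(f"reconstruction RMSE over the gaps: {rmse:.3f}")
```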

  2. “Low road” to rehabilitation: a perspective on subliminal sensory neuroprosthetics

    PubMed Central

    Ghai, Shashank; Ghai, Ishan; Effenberg, Alfred O

    2018-01-01

    Fear can propagate in parallel through both cortical and subcortical pathways. It can instigate memory consolidation habitually and might allow internal simulation of movements independent of the cortical structures. This perspective suggests delivery of subliminal, aversive and kinematic audiovisual stimuli via neuroprosthetics in patients with neocortical dysfunctions. We suggest possible scenarios by which these stimuli might bypass damaged neocortical structures and possibly assist in motor relearning. Anticipated neurophysiological mechanisms and methodological scenarios have been discussed in this perspective. This approach introduces novel perspectives into neuropsychology as to how subcortical pathways might be used to induce motor relearning. PMID:29398914

  3. "Low road" to rehabilitation: a perspective on subliminal sensory neuroprosthetics.

    PubMed

    Ghai, Shashank; Ghai, Ishan; Effenberg, Alfred O

    2018-01-01

    Fear can propagate in parallel through both cortical and subcortical pathways. It can instigate memory consolidation habitually and might allow internal simulation of movements independent of the cortical structures. This perspective suggests delivery of subliminal, aversive and kinematic audiovisual stimuli via neuroprosthetics in patients with neocortical dysfunctions. We suggest possible scenarios by which these stimuli might bypass damaged neocortical structures and possibly assist in motor relearning. Anticipated neurophysiological mechanisms and methodological scenarios have been discussed in this perspective. This approach introduces novel perspectives into neuropsychology as to how subcortical pathways might be used to induce motor relearning.

  4. Energy-optimal path planning by stochastic dynamically orthogonal level-set optimization

    NASA Astrophysics Data System (ADS)

    Subramani, Deepak N.; Lermusiaux, Pierre F. J.

    2016-04-01

    A stochastic optimization methodology is formulated for computing energy-optimal paths from among time-optimal paths of autonomous vehicles navigating in a dynamic flow field. Based on partial differential equations, the methodology rigorously leverages the level-set equation that governs time-optimal reachability fronts for a given relative vehicle-speed function. To set up the energy optimization, the relative vehicle-speed and headings are considered to be stochastic and new stochastic Dynamically Orthogonal (DO) level-set equations are derived. Their solution provides the distribution of time-optimal reachability fronts and corresponding distribution of time-optimal paths. An optimization is then performed on the vehicle's energy-time joint distribution to select the energy-optimal paths for each arrival time, among all stochastic time-optimal paths for that arrival time. Numerical schemes to solve the reduced stochastic DO level-set equations are obtained, and accuracy and efficiency considerations are discussed. These reduced equations are first shown to be efficient at solving the governing stochastic level-sets, in part by comparisons with direct Monte Carlo simulations. To validate the methodology and illustrate its accuracy, comparisons with semi-analytical energy-optimal path solutions are then completed. In particular, we consider the energy-optimal crossing of a canonical steady front and set up its semi-analytical solution using a energy-time nested nonlinear double-optimization scheme. We then showcase the inner workings and nuances of the energy-optimal path planning, considering different mission scenarios. Finally, we study and discuss results of energy-optimal missions in a wind-driven barotropic quasi-geostrophic double-gyre ocean circulation.

  5. [The effects of a new model of hospital management on undergraduate teaching of urology].

    PubMed

    Bogado S, Justo; Bogado C, Mariana; López C, Ilse; Rosselot J, Eduardo

    2010-04-01

    Since January 2005, a new model of coordinated hospital care, known as Self-Managed Hospitals in a network, has been implemented in Chile to improve the effectiveness and efficiency of resource use. This new design changed both health care and teaching models. The aim was to analyze, understand and reflect on how teachers and students of the Urology Unit of the Eastern Campus of the Faculty of Medicine of the University of Chile perceive learning in this new hospital scenario. A qualitative methodology was used, including semi-structured interviews with chief teachers and focus groups of teachers and students. In addition, a written structured questionnaire was answered by a group of 5th-year students and interns. University teachers perceive that undergraduate learning is affected in the new hospital scenario. Students feel that they have fewer opportunities to interact directly with patients, and therefore fewer possibilities to take medical histories, perform physical examinations and discuss cases with their tutors. The new health system that runs hospitals as a network could jeopardize undergraduate teaching. This is the case for the Urology Service at the Hospital and the corresponding Department of Specialties, where the dominant perception of teachers and a number of students is that clinical learning is endangered by these innovations. To achieve the learning objectives of the undergraduate program in this subject, the reorientation of ambulatory practice and referral skills must be rationally planned to improve student accomplishment.

  6. Dynamic response of airborne infections to climate change: predictions for varicella

    NASA Astrophysics Data System (ADS)

    Baker, R.; Mahmud, A. S.; Metcalf, C. J. E.

    2017-12-01

    Characterizing how climate change will alter the burden of infectious diseases has clear applications for public health policy. Despite our uniquely detailed understanding of the transmission process for directly transmitted infections, the impact of climate variables on these infections remains understudied. We develop a novel methodology for estimating the causal relationship between climate and directly transmitted infections, which combines an epidemiological model of disease transmission with panel regression techniques. Our method allows us to move beyond correlational approaches to studying the link between climate and infectious diseases. Further, we can generate semi-mechanistic projections of incidence across climate scenarios. We illustrate our approach using 30 years of reported cases of varicella, a common airborne childhood infection, across 32 states in Mexico. We find significantly increased varicella transmission in drier conditions. We use this to map potential changes in the magnitude and variability of varicella incidence in Mexico as a result of projected changes in future climate conditions. Our results indicate that the predicted decrease in humidity in Mexico towards the end of the century will increase incidence of varicella, all else equal, and that these changes in incidence will be non-uniform across the year.

  7. Innovation strategies in a fruit growers association impacts assessment by using combined LCA and s-LCA methodologies.

    PubMed

    Tecco, Nadia; Baudino, Claudio; Girgenti, Vincenzo; Peano, Cristiana

    2016-10-15

    In the challenging world of territorial transformations within agriculture, there is an increasing need for an integrated methodological framework of assessment that is able to reconcile the demand for solutions that are both economically sustainable and contribute to environmental and social improvement. This study aims to assess the introduction of innovation into agro-food systems by combining an environmental life cycle assessment (LCA) and a social life cycle assessment (s-LCA) to support the decision making process of a fruit growers co-op for the adoption of mulching and covering in raspberry farming. LCA and s-LCA have been applied independently under specific consistency requirements, selecting two scenarios to compare the impact with (1) and without (2) the innovation, and then combined within a cause-effect chain. The interactions between the environment and socioeconomic components were considered within a nested frameset of business and territorial features. The total emissions from raspberry production in Scenario 1, according to the Global Warming Potential (GWP) impact category, amounted to 2.2840 kg of CO2 eq. In Scenario 2, the impact of production was associated with a GWP of 0.1682 kg of CO2 eq. The analysis of social repercussions in Scenario 1 compared with Scenario 2 indicates greater satisfaction with working conditions and the management of climate risks. The mulching and covering, implemented within a given framework of farm activity, created conditions for the preservation of a model in which raspberry production contributes to landscape protection, the business sustainability of farms and the creation of employment. The combined use of the two methods contributes to the development of strategic planning due to its ability to deliver, as well as specific analysis at a functional level, a wider framework for assessing the consistency of the impacts related to innovation in raspberry production. Copyright © 2016 Elsevier B.V. All rights reserved.

  8. Partitioning uncertainty in streamflow projections under nonstationary model conditions

    NASA Astrophysics Data System (ADS)

    Chawla, Ila; Mujumdar, P. P.

    2018-02-01

    Assessing the impacts of Land Use (LU) and climate change on future streamflow projections is necessary for efficient management of water resources. However, model projections are burdened with significant uncertainty arising from various sources. Most of the previous studies have considered climate models and scenarios as major sources of uncertainty, but uncertainties introduced by land use change and hydrologic model assumptions are rarely investigated. In this paper an attempt is made to segregate the contribution from (i) general circulation models (GCMs), (ii) emission scenarios, (iii) land use scenarios, (iv) stationarity assumption of the hydrologic model, and (v) internal variability of the processes, to overall uncertainty in streamflow projections using analysis of variance (ANOVA) approach. Generally, most of the impact assessment studies are carried out with unchanging hydrologic model parameters in future. It is, however, necessary to address the nonstationarity in model parameters with changing land use and climate. In this paper, a regression based methodology is presented to obtain the hydrologic model parameters with changing land use and climate scenarios in future. The Upper Ganga Basin (UGB) in India is used as a case study to demonstrate the methodology. The semi-distributed Variable Infiltration Capacity (VIC) model is set-up over the basin, under nonstationary conditions. Results indicate that model parameters vary with time, thereby invalidating the often-used assumption of model stationarity. The streamflow in UGB under the nonstationary model condition is found to reduce in future. The flows are also found to be sensitive to changes in land use. Segregation results suggest that model stationarity assumption and GCMs along with their interactions with emission scenarios, act as dominant sources of uncertainty. This paper provides a generalized framework for hydrologists to examine stationarity assumption of models before considering them for future streamflow projections and segregate the contribution of various sources to the uncertainty.
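    The variance-partitioning step can be illustrated with a small balanced ANOVA on a synthetic ensemble of streamflow changes indexed by GCM, emission scenario and land-use scenario. The factor levels and numbers below are invented, and interactions together with internal variability are lumped into a single residual term for brevity.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic ensemble: projected change in mean streamflow (%) for each combination of
# 4 GCMs x 3 emission scenarios x 2 land-use scenarios (invented numbers).
gcm_effect = np.array([-12.0, -4.0, 3.0, 8.0])[:, None, None]
rcp_effect = np.array([-2.0, 0.0, 5.0])[None, :, None]
lu_effect = np.array([-1.5, 1.5])[None, None, :]
flow_change = gcm_effect + rcp_effect + lu_effect + rng.normal(scale=1.0, size=(4, 3, 2))

grand = flow_change.mean()
ss_total = ((flow_change - grand) ** 2).sum()

def main_effect_ss(axis):
    """Sum of squares explained by one factor's marginal means (balanced design)."""
    other_axes = tuple(a for a in range(3) if a != axis)
    marginal = flow_change.mean(axis=other_axes)
    n_per_level = flow_change.size / flow_change.shape[axis]
    return n_per_level * ((marginal - grand) ** 2).sum()

ss = {name: main_effect_ss(ax) for ax, name in enumerate(["GCM", "emission scenario", "land use"])}
ss["interactions + internal variability"] = ss_total - sum(ss.values())
for name, value in ss.items():
    print(f"{name:35s}: {100 * value / ss_total:5.1f} % of total variance")
```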

  9. Methodology of risk assessment of loss of water resources due to climate changes

    NASA Astrophysics Data System (ADS)

    Israfilov, Yusif; Israfilov, Rauf; Guliyev, Hatam; Afandiyev, Galib

    2016-04-01

    For the sustainable development and rational management of the water resources of the Republic of Azerbaijan, it is important to forecast their changes under different climate change scenarios and to assess the possible risks of losing parts of these resources. The major part of the Azerbaijani territory lies in an arid climate, and the vast majority of water is used in national economic production. Optimal use of conditioned groundwater and surface water is of great strategic importance for the country's economy given the overall scarcity of water resources. Low annual precipitation, high evaporation and complex natural and hydrogeological conditions prevent the sustainable formation of conditioned ground and surface water resources. In addition, fresh water reserves are not evenly distributed across the Azerbaijani territory. The deficit in the overall water balance creates tension in the rational use of fresh water in various sectors of the national economy, especially in agriculture, and consequently in the food security of the republic. Moreover, the republic's fresh water resources depend directly on climatic factors: 75-85% of the stratum-pore groundwater of the piedmont plains and the fracture-vein water of the mountain regions is formed by the infiltration of rainfall and condensate water. Changes in climate parameters therefore alter the hydrological cycle and, as a rule, are reflected in these resources. Forecasting changes in water resources under different climate change scenarios using regional mathematical models allowed the strength of these relationships to be estimated and the quality of decisions to be improved. At the same time, additional data are needed for assessing and managing the risk of reduced water resources, for detailed analysis, for forecasting the quantitative and qualitative parameters of the resources, and for optimizing water use. In this regard, we have developed a risk assessment methodology that includes a statistical fuzzy analysis of the "probability-consequences" relationship and a classification of probabilities and consequences by degree of severity and risk. The methodology allows practical use of the results obtained and provides effective support for sustainable development, for reducing the risks associated with the optimal use of the republic's water resources and, as a consequence, for the national strategy of economic development.

  10. Evaluation of countermeasures for red light running by traffic simulator-based surrogate safety measures.

    PubMed

    Lee, Changju; So, Jaehyun Jason; Ma, Jiaqi

    2018-01-02

    The conflicts among motorists entering a signalized intersection with the red light indication have become a national safety issue. Because of its sensitivity, efforts have been made to investigate the possible causes and effectiveness of countermeasures using comparison sites and/or before-and-after studies. Nevertheless, these approaches are ineffective when comparison sites cannot be found, or crash data sets are not readily available or not reliable for statistical analysis. Considering the random nature of red light running (RLR) crashes, an inventive approach regardless of data availability is necessary to evaluate the effectiveness of each countermeasure face to face. The aims of this research are to (1) review erstwhile literature related to red light running and traffic safety models; (2) propose a practical methodology for evaluation of RLR countermeasures with a microscopic traffic simulation model and surrogate safety assessment model (SSAM); (3) apply the proposed methodology to actual signalized intersection in Virginia, with the most prevalent scenarios-increasing the yellow signal interval duration, installing an advance warning sign, and an RLR camera; and (4) analyze the relative effectiveness by RLR frequency and the number of conflicts (rear-end and crossing). All scenarios show a reduction in RLR frequency (-7.8, -45.5, and -52.4%, respectively), but only increasing the yellow signal interval duration results in a reduced total number of conflicts (-11.3%; a surrogate safety measure of possible RLR-related crashes). An RLR camera makes the greatest reduction (-60.9%) in crossing conflicts (a surrogate safety measure of possible angle crashes), whereas increasing the yellow signal interval duration results in only a 12.8% reduction of rear-end conflicts (a surrogate safety measure of possible rear-end crash). Although increasing the yellow signal interval duration is advantageous because this reduces the total conflicts (a possibility of total RLR-related crashes), each countermeasure shows different effects by RLR-related conflict types that can be referred to when making a decision. Given that each intersection has different RLR crash issues, evaluated countermeasures are directly applicable to enhance the cost and time effectiveness, according to the situation of the target intersection. In addition, the proposed methodology is replicable at any site that has a dearth of crash data and/or comparison sites in order to test any other countermeasures (both engineering and enforcement countermeasures) for RLR crashes.
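    Surrogate measures of the kind produced by SSAM are derived from simulated vehicle trajectories; a simplified example is counting rear-end conflict events in which the time-to-collision (TTC) between a leader and a follower drops below a threshold. The trajectories and the 1.5 s threshold below are illustrative assumptions, not outputs of the simulation/SSAM workflow used in the study.

```python
import numpy as np

TTC_THRESHOLD = 1.5   # seconds; conflict = TTC below this value (assumed threshold)

def rear_end_conflicts(lead_pos, follow_pos, lead_speed, follow_speed):
    """Count distinct rear-end conflict events where time-to-collision < threshold."""
    gap = lead_pos - follow_pos                  # m, spacing between vehicles
    closing = follow_speed - lead_speed          # m/s, positive when the follower closes in
    with np.errstate(divide="ignore", invalid="ignore"):
        ttc = np.where(closing > 0, gap / closing, np.inf)
    in_conflict = ttc < TTC_THRESHOLD
    # Each transition into the conflict state counts as one event
    return int(np.sum(in_conflict[1:] & ~in_conflict[:-1]) + in_conflict[0])

# Synthetic 20 s trajectories at 10 Hz: a follower approaches a decelerating leader
t = np.arange(0, 20, 0.1)
lead_speed = np.clip(15 - 0.8 * t, 3, None)
follow_speed = np.full_like(t, 14.0)
lead_pos = 40 + np.cumsum(lead_speed) * 0.1
follow_pos = np.cumsum(follow_speed) * 0.1

print("rear-end conflicts detected:",
      rear_end_conflicts(lead_pos, follow_pos, lead_speed, follow_speed))
```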

  11. Time-dependent neo-deterministic seismic hazard scenarios for the 2016 Central Italy earthquakes sequence

    NASA Astrophysics Data System (ADS)

    Peresan, Antonella; Kossobokov, Vladimir; Romashkova, Leontina; Panza, Giuliano F.

    2017-04-01

    Predicting earthquakes and related ground shaking is widely recognized among the most challenging scientific problems, both for societal relevance and intrinsic complexity of the problem. The development of reliable forecasting tools requires their rigorous formalization and testing, first in retrospect, and then in an experimental real-time mode, which imply a careful application of statistics to data sets of limited size and different accuracy. Accordingly, the operational issues of prospective validation and use of time-dependent neo-deterministic seismic hazard scenarios are discussed, reviewing the results in their application in Italy and surroundings. Long-term practice and results obtained for the Italian territory in about two decades of rigorous prospective testing, support the feasibility of earthquake forecasting based on the analysis of seismicity patterns at the intermediate-term middle-range scale. Italy is the only country worldwide where two independent, globally tested, algorithms are simultaneously applied, namely CN and M8S, which permit to deal with multiple sets of seismic precursors to allow for a diagnosis of the intervals of time when a strong event is likely to occur inside a given region. Based on routinely updated space-time information provided by CN and M8S forecasts, an integrated procedure has been developed that allows for the definition of time-dependent seismic hazard scenarios, through the realistic modeling of ground motion by the neo-deterministic approach (NDSHA). This scenario-based methodology permits to construct, both at regional and local scale, scenarios of ground motion for the time interval when a strong event is likely to occur within the alerted areas. CN and M8S predictions, as well as the related time-dependent ground motion scenarios associated with the alarmed areas, are routinely updated since 2006. The issues and results from real-time testing of the integrated NDSHA scenarios are illustrated, with special emphasis on the sequence of destructive earthquakes that struck Central Italy starting on August 2016. The results obtained so far evidence the validity of the proposed methodology in anticipating ground shaking from approaching strong earthquakes and prove that the information provided by time-dependent NDSHA can be useful in assigning priorities for timely and effective mitigation actions.

  12. Life cycle analyses of CO2, energy, and cost for four different routes of microalgal bioenergy conversion.

    PubMed

    Ventura, Jey-R S; Yang, Benqin; Lee, Yong-Woo; Lee, Kisay; Jahng, Deokjin

    2013-06-01

    With a target production of 1000 ton of dry algae/yr, lipid content of 30 wt.%, and productivity of 30 g/m(2)-d in a 340-day annual operation, four common scenarios of microalgae bioenergy routes were assessed in terms of cost, energy, and CO2 inputs and outputs. Scenario 1 (biodiesel production), Scenario 2 (Scenario 1 with integrated anaerobic digestion system), Scenario 3 (biogas production), and Scenario 4 (supercritical gasification) were evaluated. Scenario 4 outperformed other scenarios in terms of net energy production (1282.42 kWh/ton algae) and CO2 removal (1.32 ton CO2/ton algae) while Scenario 2 surpassed the other three scenarios in terms of net cost. Scenario 1 produced the lowest energy while Scenario 3 was the most expensive bioenergy system. This study evaluated critical parameters that could direct the proper design of the microalgae bioenergy system with an efficient energy production, CO2 removal, and economic feasibility. Copyright © 2013 Elsevier Ltd. All rights reserved.

  13. Mixed oxidizer hybrid propulsion system optimization under uncertainty using applied response surface methodology and Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Whitehead, James Joshua

    The analysis documented herein provides an integrated approach for the conduct of optimization under uncertainty (OUU) using Monte Carlo Simulation (MCS) techniques coupled with response surface-based methods for characterization of mixture-dependent variables. This novel methodology provides an innovative means of conducting optimization studies under uncertainty in propulsion system design. Analytic inputs are based upon empirical regression rate information obtained from design of experiments (DOE) mixture studies utilizing a mixed oxidizer hybrid rocket concept. Hybrid fuel regression rate was selected as the target response variable for optimization under uncertainty, with maximization of regression rate chosen as the driving objective. Characteristic operational conditions and propellant mixture compositions from experimental efforts conducted during previous foundational work were combined with elemental uncertainty estimates as input variables. Response surfaces for mixture-dependent variables and their associated uncertainty levels were developed using quadratic response equations incorporating single and two-factor interactions. These analysis inputs, response surface equations and associated uncertainty contributions were applied to a probabilistic MCS to develop dispersed regression rates as a function of operational and mixture input conditions within design space. Illustrative case scenarios were developed and assessed using this analytic approach including fully and partially constrained operational condition sets over all of design mixture space. In addition, optimization sets were performed across an operationally representative region in operational space and across all investigated mixture combinations. These scenarios were selected as representative examples relevant to propulsion system optimization, particularly for hybrid and solid rocket platforms. Ternary diagrams, including contour and surface plots, were developed and utilized to aid in visualization. The concept of Expanded-Durov diagrams was also adopted and adapted to this study to aid in visualization of uncertainty bounds. Regions of maximum regression rate and associated uncertainties were determined for each set of case scenarios. Application of response surface methodology coupled with probabilistic-based MCS allowed for flexible and comprehensive interrogation of mixture and operating design space during optimization cases. Analyses were also conducted to assess sensitivity of uncertainty to variations in key elemental uncertainty estimates. The methodology developed during this research provides an innovative optimization tool for future propulsion design efforts.
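    The coupling of a response surface with Monte Carlo simulation can be sketched in a few lines: a quadratic surface stands in for the fitted regression-rate model, uncertain inputs are sampled around each candidate design point, and a robust (lower-percentile) objective is maximized over the design grid. The surface coefficients, uncertainty magnitudes and bounds below are invented and are not the dissertation's DOE results.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical quadratic response surface for fuel regression rate (mm/s) as a
# function of two coded design factors x1 (mixture) and x2 (operating condition).
def regression_rate(x1, x2):
    return 1.8 + 0.45 * x1 + 0.30 * x2 - 0.20 * x1**2 - 0.10 * x2**2 + 0.15 * x1 * x2

def mcs_rate(x1, x2, n=2_000):
    """Monte Carlo dispersion of the response given assumed input and model uncertainty."""
    x1_s = x1 + rng.normal(scale=0.05, size=n)     # composition uncertainty (assumed)
    x2_s = x2 + rng.normal(scale=0.08, size=n)     # operating-condition uncertainty (assumed)
    model_err = rng.normal(scale=0.03, size=n)     # response-surface lack of fit (assumed)
    samples = regression_rate(x1_s, x2_s) + model_err
    return samples.mean(), samples.std()

# Grid search for the design point that maximizes a lower percentile of the rate
best = None
for x1 in np.linspace(-1, 1, 21):
    for x2 in np.linspace(-1, 1, 21):
        mean, std = mcs_rate(x1, x2)
        p05 = mean - 1.645 * std                   # ~5th percentile under normality
        if best is None or p05 > best[0]:
            best = (p05, x1, x2, mean, std)

p05, x1, x2, mean, std = best
print(f"robust optimum near x1={x1:.2f}, x2={x2:.2f}: mean {mean:.2f} mm/s, 5th pct {p05:.2f} mm/s")
```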

  14. Futures of elderly care in Iran: A protocol with scenario approach.

    PubMed

    Goharinezhad, Salime; Maleki, Mohammadreza; Baradaran, Hamid Reza; Ravaghi, Hamid

    2016-01-01

    Background: The number of people aged 60 and older is increasing faster than other age groups worldwide. Iran will experience a sharp aging population increase in the next decades, and this will pose new challenges to the healthcare system. Since providing high-quality aged-care services is a major concern of policymakers, the question arises as to what types of aged-care services should be organized in the coming 10 years. This protocol has been designed to develop a set of scenarios for the future of elderly care in Iran. Methods: In this study, the intuitive logics approach and the Global Business Network (GBN) model were used to develop scenarios for elderly care in Iran. The scenarios in this approach are normative in perspective, qualitative in methodology and deductive in the way the scenario-building process is constructed. The three phases of the GBN model are as follows: 1) Orientation: identifying strategic levels, stakeholders, participants and the time horizon; 2) Exploration: identifying the driving forces and key uncertainties; 3) Synthesis: defining the scenario logics and constructing the scenario storylines. Results: Presently, two phases have been completed and the results will be published in mid-2016. Conclusion: This study delivers a comprehensive framework for taking appropriate actions in providing care for the elderly in the future. Moreover, policy makers should specify and provide the full range of services for the elderly, and in doing so, the scenarios and key findings of this study could be of valuable help.

  15. A primary care Web-based Intervention Modeling Experiment replicated behavior changes seen in earlier paper-based experiment.

    PubMed

    Treweek, Shaun; Francis, Jill J; Bonetti, Debbie; Barnett, Karen; Eccles, Martin P; Hudson, Jemma; Jones, Claire; Pitts, Nigel B; Ricketts, Ian W; Sullivan, Frank; Weal, Mark; MacLennan, Graeme

    2016-12-01

    Intervention Modeling Experiments (IMEs) are a way of developing and testing behavior change interventions before a trial. We aimed to test this methodology in a Web-based IME that replicated the trial component of an earlier, paper-based IME. Three-arm, Web-based randomized evaluation of two interventions (persuasive communication and action plan) and a "no intervention" comparator. The interventions were designed to reduce the number of antibiotic prescriptions in the management of uncomplicated upper respiratory tract infection. General practitioners (GPs) were invited to complete an online questionnaire and eight clinical scenarios where an antibiotic might be considered. One hundred twenty-nine GPs completed the questionnaire. GPs receiving the persuasive communication did not prescribe an antibiotic in 0.70 more scenarios (95% confidence interval [CI] = 0.17-1.24) than those in the control arm. For the action plan, GPs did not prescribe an antibiotic in 0.63 (95% CI = 0.11-1.15) more scenarios than those in the control arm. Unlike the earlier IME, behavioral intention was unaffected by the interventions; this may be due to a smaller sample size than intended. A Web-based IME largely replicated the findings of an earlier paper-based study, providing some grounds for confidence in the IME methodology. Copyright © 2016 Elsevier Inc. All rights reserved.

  16. Development of a methodology for the assessment of sea level rise impacts on Florida's transportation modes and infrastructure : [summary].

    DOT National Transportation Integrated Search

    2012-01-01

    In Florida, low elevations can make transportation infrastructure in coastal and low-lying areas potentially vulnerable to sea level rise (SLR). Becuase global SLR forecasts lack precision at local or regional scales, SLR forecasts or scenarios for p...

  17. A Semantic Web-Based Methodology for Describing Scientific Research Efforts

    ERIC Educational Resources Information Center

    Gandara, Aida

    2013-01-01

    Scientists produce research resources that are useful to future research and innovative efforts. In a typical scientific scenario, the results created by a collaborative team often include numerous artifacts, observations and relationships relevant to research findings, such as programs that generate data, parameters that impact outputs, workflows…

  18. Laboratory Biosafety and Biosecurity Risk Assessment Technical Guidance Document

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Astuto-Gribble, Lisa M; Caskey, Susan Adele

    2014-07-01

    The purpose of this document is threefold: 1) to describe the laboratory biosafety and biosecurity risk assessment process and its conceptual framework; 2) to provide detailed guidance and suggested methodologies on how to conduct a risk assessment; and 3) to present some practical risk assessment process strategies using realistic laboratory scenarios.

  19. Recent Policy Developments in Green Education in the Netherlands

    ERIC Educational Resources Information Center

    Kupper, Hendrik; Laurentzen, Ramona; Mulder, Martin

    2012-01-01

    Purpose: To present a description of recent developments in the Dutch green educational system (agriculture, living environment, food). The article builds on a previous 2006 contribution to "JAEE" where different scenarios for changes in green education were suggested. Design/methodology/approach: An analysis of policy documents from…

  20. Assessing Hydrologic Impacts of Future Land Cover Change Scenarios in the San Pedro River (U.S./Mexico)

    EPA Science Inventory

    Long-term land-use and land cover change and their associated impacts pose critical challenges to sustaining vital hydrological ecosystem services for future generations. In this study, a methodology was developed to characterize hydrologic impacts from future urban growth throug...

  1. Assessing Hydrologic Impacts of Future Land Cover Change Scenarios in the South Platte River Basin (CO, WY, & NE)

    EPA Science Inventory

    Long-term land-use and land cover change and their associated impacts pose critical challenges to sustaining vital hydrological ecosystem services for future generations. In this study, a methodology was developed to characterize hydrologic impacts from future urban growth throug...

  2. Assessing Hydrologic Impacts of Future Land Cover Change Scenarios in the South Platte River Basin (CO, WY, & NE)

    EPA Science Inventory

    Long‐term land‐use and land cover change and their associated impacts pose critical challenges to sustaining vital hydrological ecosystem services for future generations. In this study, a methodology was developed on the San Pedro River Basin to characterize hydrologi...

  3. 12 CFR 252.146 - Methodologies and practices.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... SYSTEM (CONTINUED) ENHANCED PRUDENTIAL STANDARDS (REGULATION YY) Company-Run Stress Test Requirements for... stress test under §§ 252.144 and 252.145, for each quarter of the planning horizon, a covered company must estimate the following for each scenario required to be used: (1) Losses, pre-provision net...

  4. 12 CFR 1238.4 - Methodologies and practices.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... conducting a stress test under § 1238.3, each regulated entity shall calculate how each of the following is affected during each quarter of the stress test planning horizon, for each scenario: (1) Potential losses, pre-provision net revenues, allowance for loan losses, and future pro forma capital positions over the...

  5. The Role of Feedback in Young People's Academic Choices

    ERIC Educational Resources Information Center

    Skipper, Yvonne; Leman, Patrick J.

    2017-01-01

    Women are underrepresented in Science, Technology, Engineering and Mathematics subjects with more girls leaving these subjects at every stage in education. The current research used a scenario methodology to examine the impact of teacher feedback on girls' and boys' choices to study a specific science subject, engineering. British participants…

  6. A multivariate copula-based framework for dealing with hazard scenarios and failure probabilities

    NASA Astrophysics Data System (ADS)

    Salvadori, G.; Durante, F.; De Michele, C.; Bernardi, M.; Petrella, L.

    2016-05-01

    This paper is of a methodological nature and deals with the foundations of Risk Assessment. Several international guidelines have recently recommended selecting appropriate/relevant Hazard Scenarios in order to tame the consequences of (extreme) natural phenomena. In particular, the scenarios should be multivariate, i.e., they should take into account the fact that several variables, generally not independent, may be of interest. In this work, it is shown how a Hazard Scenario can be identified in terms of (i) a specific geometry and (ii) a suitable probability level. Several scenarios, as well as a Structural approach, are presented, and due comparisons are carried out. In addition, it is shown how the Hazard Scenario approach illustrated here is well suited to cope with the notion of Failure Probability, a tool traditionally used for design and risk assessment in engineering practice. All the results outlined throughout the work are based on Copula Theory, which turns out to be a fundamental theoretical apparatus for multivariate risk assessment: formulas for the calculation of the probability of Hazard Scenarios in the general multidimensional case (d≥2) are derived, and useful analytical relationships among the probabilities of occurrence of Hazard Scenarios are presented. In addition, the Extreme Value and Archimedean special cases are dealt with, relationships between dependence ordering and scenario levels are studied, and a counter-example concerning Tail Dependence is shown. Suitable indications for the practical application of the techniques outlined in the work are given, and two case studies illustrate the procedures discussed in the paper.
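
    A minimal numerical sketch of the kind of calculation the paper formalizes: for a bivariate Gumbel (Extreme Value/Archimedean) copula C, the "OR" hazard scenario {X>x or Y>y} has probability 1 - C(u,v) and the "AND" scenario {X>x and Y>y} has probability 1 - u - v + C(u,v), with u = F_X(x) and v = F_Y(y). The marginal probabilities and copula parameter below are arbitrary illustrative values, not case-study data.

        import math

        def gumbel_copula(u, v, theta):
            """Bivariate Gumbel copula C(u, v); theta >= 1 controls upper-tail dependence."""
            return math.exp(-(((-math.log(u)) ** theta + (-math.log(v)) ** theta) ** (1.0 / theta)))

        # Marginal non-exceedance probabilities of the two hazard variables (arbitrary example).
        u, v, theta = 0.95, 0.98, 2.0
        C = gumbel_copula(u, v, theta)

        p_or = 1.0 - C           # P(X > x or  Y > y): at least one variable exceeds its threshold
        p_and = 1.0 - u - v + C  # P(X > x and Y > y): joint exceedance
        print(f"OR-scenario probability  = {p_or:.4f}")
        print(f"AND-scenario probability = {p_and:.4f}")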

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Jie; Jain, Rishabh; Hodge, Bri-Mathias

    A data-driven methodology is developed to analyze how ambient and wake turbulence affect the power generation of wind turbine(s). Using supervisory control and data acquisition (SCADA) data from a wind plant, we select two sets of wind velocity and power data for turbines on the edge of the plant that resemble (i) an out-of-wake scenario and (ii) an in-wake scenario. For each set of data, two surrogate models are developed to represent the turbine(s) power generation as a function of (i) the wind speed and (ii) the wind speed and turbulence intensity. Three types of uncertainties in turbine(s) power generation are investigated: (i) the uncertainty in power generation with respect to the reported power curve; (ii) the uncertainty in power generation with respect to the estimated power response that accounts for only mean wind speed; and (iii) the uncertainty in power generation with respect to the estimated power response that accounts for both mean wind speed and turbulence intensity. Results show that (i) the turbine(s) generally produce more power under the in-wake scenario than under the out-of-wake scenario with the same wind speed; and (ii) there is relatively more uncertainty in the power generation under the in-wake scenario than under the out-of-wake scenario.
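
    A minimal sketch of the surrogate-modelling step described above: least-squares polynomial response surfaces for power as a function of wind speed alone, and of wind speed plus turbulence intensity. The SCADA-like records below are synthetic placeholders, not data from the plant studied.

        import numpy as np

        rng = np.random.default_rng(1)

        # Synthetic stand-ins for 10-minute SCADA records (not real plant data).
        wind_speed = rng.uniform(4.0, 12.0, 500)       # m/s
        turb_intensity = rng.uniform(0.05, 0.20, 500)  # dimensionless
        power = 30 * wind_speed**2 * (1 - 0.8 * turb_intensity) + rng.normal(0, 50, 500)  # kW

        # Surrogate 1: power as a quadratic function of wind speed only.
        X1 = np.column_stack([np.ones_like(wind_speed), wind_speed, wind_speed**2])
        beta1, *_ = np.linalg.lstsq(X1, power, rcond=None)

        # Surrogate 2: adds turbulence intensity and its interaction with wind speed.
        X2 = np.column_stack([X1, turb_intensity, turb_intensity * wind_speed])
        beta2, *_ = np.linalg.lstsq(X2, power, rcond=None)

        for name, X, beta in [("speed only", X1, beta1), ("speed + TI", X2, beta2)]:
            resid = power - X @ beta
            print(f"{name}: residual std = {resid.std():.1f} kW")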

  8. Biophysical impacts of climate-smart agriculture in the Midwest United States.

    PubMed

    Bagley, Justin E; Miller, Jesse; Bernacchi, Carl J

    2015-09-01

    The potential impacts of climate change in the Midwest United States present unprecedented challenges to regional agriculture. In response to these challenges, a variety of climate-smart agricultural methodologies have been proposed to retain or improve crop yields, reduce agricultural greenhouse gas emissions, retain soil quality and increase climate resilience of agricultural systems. One component that is commonly neglected when assessing the environmental impacts of climate-smart agriculture is the biophysical impacts, where changes in ecosystem fluxes and storage of moisture and energy lead to perturbations in local climate and water availability. Using a combination of observational data and an agroecosystem model, a series of climate-smart agricultural scenarios were assessed to determine the biophysical impacts these techniques have in the Midwest United States. The first scenario extended the growing season for existing crops using future temperature and CO2 concentrations. The second scenario examined the biophysical impacts of no-till agriculture and the impacts of annually retaining crop debris. Finally, the third scenario evaluated the potential impacts that the adoption of perennial cultivars had on biophysical quantities. Each of these scenarios was found to have significant biophysical impacts. However, the timing and magnitude of the biophysical impacts differed between scenarios. © 2014 John Wiley & Sons Ltd.

  9. The Application of High Energy Resolution Green's Functions to Threat Scenario Simulation

    NASA Astrophysics Data System (ADS)

    Thoreson, Gregory G.; Schneider, Erich A.

    2012-04-01

    Radiation detectors installed at key interdiction points provide defense against nuclear smuggling attempts by scanning vehicles and traffic for illicit nuclear material. These hypothetical threat scenarios may be modeled using radiation transport simulations. However, high-fidelity models are computationally intensive. Furthermore, the range of smuggler attributes and detector technologies create a large problem space not easily overcome by brute-force methods. Previous research has demonstrated that decomposing the scenario into independently simulated components using Green's functions can simulate photon detector signals with coarse energy resolution. This paper extends this methodology by presenting physics enhancements and numerical treatments which allow for an arbitrary level of energy resolution for photon transport. As a result, spectroscopic detector signals produced from full forward transport simulations can be replicated while requiring multiple orders of magnitude less computation time.
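
    To make the decomposition idea concrete, the toy sketch below chains pre-computed component responses (Green's-function-style matrices for source shielding, cargo and detector) by matrix multiplication rather than re-running a full transport calculation. All matrix sizes and values are arbitrary illustrations, not the paper's transport data.

        import numpy as np

        rng = np.random.default_rng(2)
        n_bins = 64  # photon energy bins; arbitrary resolution for illustration

        # Pre-computed component responses (Green's functions), each mapping an incident
        # spectrum to an emerging spectrum: source self-shielding, cargo, and detector.
        G_shield = np.abs(rng.normal(0, 0.02, (n_bins, n_bins))) + np.diag(rng.uniform(0.3, 0.6, n_bins))
        G_cargo = np.abs(rng.normal(0, 0.02, (n_bins, n_bins))) + np.diag(rng.uniform(0.5, 0.9, n_bins))
        R_detector = np.tril(np.abs(rng.normal(0, 0.01, (n_bins, n_bins)))) + np.eye(n_bins) * 0.4

        source = np.zeros(n_bins)
        source[40] = 1.0e4  # monoenergetic source line in bin 40 (photons/s), arbitrary

        # Composed scenario: source -> shielding -> cargo -> detector response.
        detected = R_detector @ (G_cargo @ (G_shield @ source))
        print(f"total detected count rate ~ {detected.sum():.1f} counts/s")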

  10. Forest carbon benefits, costs and leakage effects of carbon reserve scenarios in the United States

    Treesearch

    Prakash Nepal; Peter J. Ince; Kenneth E. Skog; Sun J. Chang

    2013-01-01

    This study evaluated the potential effectiveness of future carbon reserve scenarios, where U.S. forest landowners would hypothetically be paid to sequester carbon on their timberland and forego timber harvests for 100 years. Scenarios featured direct payments to landowners of $0 (baseline), $5, $10, or $15 per metric ton of additional forest carbon sequestered on the...

  11. Investigating the Sensitivity of Streamflow and Water Quality to Climate Change and Urbanization in 20 U.S. Watersheds

    NASA Astrophysics Data System (ADS)

    Johnson, T. E.; Weaver, C. P.; Butcher, J.; Parker, A.

    2011-12-01

    Watershed modeling was conducted in 20 large (15,000-60,000 km2) U.S. watersheds to address gaps in our knowledge of the sensitivity of U.S. streamflow, nutrient (N and P) and sediment loading to potential future climate change, and methodological challenges associated with integrating existing tools (e.g., climate models, watershed models) and datasets to address these questions. Climate change scenarios are based on dynamically downscaled (50x50 km2) output from four of the GCMs used in the Intergovernmental Panel on Climate Change (IPCC) 4th Assessment Report for the period 2041-2070 archived by the North American Regional Climate Change Assessment Program (NARCCAP). To explore the potential interaction of climate change and urbanization, model simulations also include urban and residential development scenarios for each of the 20 study watersheds. Urban and residential development scenarios were acquired from EPA's national-scale Integrated Climate and Land Use Scenarios (ICLUS) project. Watershed modeling was conducted using the Hydrologic Simulation Program-FORTRAN (HSPF) and Soil and Water Assessment Tool (SWAT) models. Here we present a summary of results for 5 of the study watersheds: the Minnesota River, the Susquehanna River, the Apalachicola-Chattahoochee-Flint, the Salt/Verde/San Pedro, and the Willamette River Basins. This set of results provides an overview of the response to climate change in different regions of the U.S. and of the different sensitivities of streamflow and water quality endpoints, and illustrates a number of methodological issues, including the sensitivities and uncertainties associated with the use of different watershed models, approaches for downscaling climate change projections, and the interaction between climate change and other forcing factors, specifically urbanization and changes in atmospheric CO2 concentration.

  12. Time value of emission and technology discounting rate for off-grid electricity generation in India using intermediate pyrolysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patel, Amit (E-mail: amitrp@iitrpr.ac.in; Faculty of Technology and Engineering, The Maharaja Sayajirao University of Baroda, Vadodara 390001, Gujarat); Sarkar, Prabir

    The environmental impact assessment of a process over its entire operational lifespan is an important issue. Estimation of life cycle emission helps in predicting the contribution of a given process to abating (or adding to) environmental emissions. Given the diminishing and time-dependent effect of emission, assessment of the overall effect of emissions is very complex. The paper presents a generalized methodology for arriving at a single emission discounting number for a process option, using the concept of the time value of carbon emission flow. This number incorporates the effect of the emissions resulting from the process over its entire operational lifespan. The advantage of this method is its quantitative aspect as well as its flexible nature; it can be applied to any process. The method is demonstrated with the help of an Intermediate Pyrolysis process used to generate off-grid electricity, with the biochar route chosen for disposing of straw residue. Scenarios ranging from very high net emission to very high net carbon sequestration are generated by careful selection of process parameters. For these different scenarios, the process discounting rate is determined and its implications are discussed. The paper also proposes a process-specific eco-label that states the discounting rate. - Highlights: • A methodology to obtain an emission discounting rate for a process is proposed. • The method includes all components of life cycle emission and converts them into a single time-dependent discounting number. • A case study of Intermediate Pyrolysis is used to obtain such numbers for a range of process configurations. • The method is useful to determine whether the operation of a process leads to a net absorption or a net accumulation of emissions in the environment.
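
    As an illustration of the "time value of emission" idea, the sketch below discounts an annual net-emission stream to a single present-value number and, conversely, finds the break-even discounting rate at which early emissions and later sequestration cancel. The flow profile and rates are invented for illustration, not taken from the pyrolysis case study.

        def discounted_emission(flows, rate):
            """Present value of an annual net-emission stream (t CO2e/yr); positive = emission."""
            return sum(e / (1.0 + rate) ** t for t, e in enumerate(flows, start=1))

        # Invented 10-year profile: net emissions early in life, net sequestration later.
        flows = [8.0, 6.0, 3.0, 0.0, -2.0, -3.0, -4.0, -4.0, -4.0, -4.0]

        print(f"PV at 0% : {discounted_emission(flows, 0.00):+.2f} t CO2e")
        print(f"PV at 5% : {discounted_emission(flows, 0.05):+.2f} t CO2e")

        # Break-even (IRR-like) discounting rate where the discounted stream sums to zero,
        # found by simple bisection on [0, 0.99]; PV is negative at 0% and positive at 99% here.
        lo, hi = 0.0, 0.99
        for _ in range(60):
            mid = 0.5 * (lo + hi)
            if discounted_emission(flows, mid) > 0:
                hi = mid
            else:
                lo = mid
        print(f"break-even discounting rate ~ {0.5 * (lo + hi):.3f}")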

  13. The KULTURisk Regional Risk Assessment methodology for water-related natural hazards - Part 1: Physical-environmental assessment

    NASA Astrophysics Data System (ADS)

    Ronco, P.; Gallina, V.; Torresan, S.; Zabeo, A.; Semenzin, E.; Critto, A.; Marcomini, A.

    2014-07-01

    In recent years, the frequency of catastrophes induced by natural hazards has increased, and flood events in particular have been recognized as one of the most threatening water-related disasters. Severe floods have occurred in Europe over the last decade, causing loss of life, displacement of people and heavy economic losses. Flood disasters are growing as a consequence of many factors, both climatic and non-climatic. Indeed, the current increase of water-related disasters can be mainly attributed to the increase of exposure (more elements potentially at risk in floodplain areas) and vulnerability (i.e. the economic, social, geographic, cultural and physical/environmental characteristics of the exposure). Besides these factors, climate change is projected to radically modify the usual pattern of the hydrological cycle by intensifying the frequency and severity of flood events at the local, regional and global scales. Within this context, the need to promote and develop effective, pro-active strategies, tools and actions which allow one to assess and (possibly) reduce the flood risks that threaten relevant receptors becomes urgent. Several methodologies to assess the risk posed by water-related natural hazards have been proposed so far, but very few of them can be adopted to implement the last European Flood Directive (FD). The present study introduces a state-of-the-art Regional Risk Assessment (RRA) methodology to evaluate the benefits of risk prevention in terms of reduced environmental risks due to floods. The methodology, developed within the recently completed FP7-KULTURisk Project (Knowledge-based approach to develop a cULTUre of Risk prevention - KR), is flexible and can be adapted to different case studies (i.e. large rivers, alpine/mountain catchments, urban areas and coastal areas) and spatial scales (i.e. from the large river to the urban scale). The FD-compliant KR-RRA methodology is based on the concept of risk being a function of hazard, exposure and vulnerability. It integrates the outputs of various hydrodynamic models (hazard) with site-specific bio-geophysical and socio-economic indicators (e.g. slope, land cover, population density, economic activities) to develop tailored risk indexes and GIS-based maps for each of the selected targets (i.e. people, buildings, infrastructure, agriculture, natural and semi-natural systems, cultural heritage) in the considered region, by comparing the baseline scenario with alternative scenarios, where different structural and/or non-structural mitigation measures are planned. As demonstrated in the companion paper (Part 2, Ronco et al., 2014), risk maps, along with related statistics, allow one to identify and prioritize relative hotspots and targets which are more likely to be affected by floods, and support the development of relevant and strategic adaptation and prevention measures to minimize flood impacts. Moreover, the outputs of the RRA methodology can be used for the economic evaluation of different damages (e.g. tangible costs, intangible costs) and for a social assessment considering the human dimension of vulnerability (i.e. adaptive and coping capacity).
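
    A minimal sketch of the risk-composition step (risk as a function of hazard, exposure and vulnerability): per-cell scores, each normalized to [0, 1], are combined into a relative risk index and the hotspot cell is flagged. The composition rule and values are illustrative assumptions, not the KULTURisk parameterization.

        import numpy as np

        rng = np.random.default_rng(3)
        n_cells = 6  # illustrative grid cells of a receptor map (e.g. buildings)

        # Normalized inputs in [0, 1]; in an RRA these would come from hydrodynamic model
        # output (hazard) and site-specific indicators (exposure, vulnerability).
        hazard = rng.uniform(0, 1, n_cells)
        exposure = rng.uniform(0, 1, n_cells)
        vulnerability = rng.uniform(0, 1, n_cells)

        # One simple composition rule (illustrative): geometric-mean style product,
        # so risk is zero whenever any of the three components is zero.
        risk = (hazard * exposure * vulnerability) ** (1.0 / 3.0)

        for i, r in enumerate(risk):
            print(f"cell {i}: relative risk index = {r:.2f}")
        print("hotspot cell:", int(np.argmax(risk)))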

  14. Dying scenarios improve recall as much as survival scenarios.

    PubMed

    Burns, Daniel J; Hart, Joshua; Kramer, Melanie E

    2014-01-01

    Merely contemplating one's death improves retention for entirely unrelated material learned subsequently. This "dying to remember" effect seems conceptually related to the survival processing effect, whereby processing items for their relevance to being stranded in the grasslands leads to recall superior to that of other deep processing control conditions. The present experiments directly compared survival processing scenarios with "death processing" scenarios. Results showed that when the survival and dying scenarios were closely matched on key dimensions, and possible congruency effects were controlled, the dying and survival scenarios produced equivalently high recall levels. We conclude that the available evidence (cf. Bell, Roer, & Buchner, 2013; Klein, 2012), while not definitive, is consistent with the possibility of overlapping mechanisms.

  15. Methodology of management of dredging operations II. Applications.

    PubMed

    Junqua, G; Abriak, N E; Gregoire, P; Dubois, V; Mac Farlane, F; Damidot, D

    2006-04-01

    This paper presents a new methodology for the management of dredging operations. Derived partly from existing methodologies (OECD, PNUE, AIPCN), it aims to be more comprehensive, combining the qualities and complementarities of the previous methodologies. The methodology was applied at the Port of Dunkirk (France). A characterization of the sediments of this site allowed the Port to be divided into zones of probable sediment homogeneity. Moreover, sources of pollution were identified with a view to prevention. Options for upgrading the dredged waste were also developed to meet regional needs, from the standpoint of competitive and territorial intelligence. Their development required a pooling of resources between professionals, research centres and local communities, according to the principles of industrial ecology. Lastly, a MultiCriteria Decision-Making Aid (MCDMA) tool was used to determine the most relevant scenario (or alternative, or action) for a dredging operation planned by the Port of Dunkirk. These applications have confirmed the relevance of this methodology for the management of dredging operations.
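
    A minimal sketch of a weighted-sum multicriteria scoring of dredging scenarios, of the kind an MCDMA tool would formalize. The criteria, weights and scores are invented placeholders, not the Port of Dunkirk data.

        # Hypothetical criteria weights (summing to 1) and 0-10 scores per scenario.
        weights = {"cost": 0.35, "environmental_impact": 0.40, "material_reuse": 0.25}

        scenarios = {
            "offshore disposal": {"cost": 8, "environmental_impact": 3, "material_reuse": 1},
            "confined disposal facility": {"cost": 5, "environmental_impact": 6, "material_reuse": 4},
            "treatment and reuse in civil works": {"cost": 3, "environmental_impact": 8, "material_reuse": 9},
        }

        # Weighted-sum score per scenario, ranked from best to worst.
        ranked = sorted(
            ((sum(weights[c] * s[c] for c in weights), name) for name, s in scenarios.items()),
            reverse=True,
        )
        for score, name in ranked:
            print(f"{score:.2f}  {name}")
        print("preferred scenario:", ranked[0][1])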

  16. Cumulative impact assessment: Application of a methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Witmer, G.W.; Bain, M.B.; Irving, J.S.

    We expanded upon the Federal Energy Regulatory Commission's (FERC) Cluster Impact Assessment Procedure (CIAP) to provide a practical methodology for assessing potential cumulative impacts from multiple hydroelectric projects within a river basin. The objectives in designing the methodology were to allow the evaluation of a large number of combinations of proposed projects and to minimize constraints on the use of ecological knowledge for planning and regulating hydroelectric development at the river basin level. Interactive workshops and evaluative matrices were used to identify preferred development scenarios in the Snohomish (Washington) and Salmon (Idaho) River Basins. Although the methodology achieved its basic objectives, some difficulties were encountered. These revolved around issues of (1) data quality and quantity, (2) alternatives analysis, (3) determination of project interactions, (4) determination of cumulative impact thresholds, and (5) the use of evaluative techniques to express degrees of impact. 8 refs., 1 fig., 2 tabs.

  17. Evaluation of Ecotoxicological Risks Related to the Discharge of Combined Sewer Overflows (CSOs) in a Periurban River

    PubMed Central

    Angerville, Ruth; Perrodin, Yves; Bazin, Christine; Emmanuel, Evens

    2013-01-01

    Discharges of Combined Sewer Overflows (CSOs) into periurban rivers present risks for the aquatic ecosystems concerned. In this work, a specific ecotoxicological risk assessment methodology has been developed as a management tool for municipalities equipped with CSOs. This methodology comprises a detailed description of the spatio-temporal system involved, the choice of ecological targets to be preserved, and the performance of bioassays adapted to each compartment of the river receiving CSOs. Once formulated, this methodology was applied to a river flowing through the outskirts of the city of Lyon in France. The results obtained for the scenario studied showed a moderate risk for organisms of the water column and a major risk for organisms of the benthic and hyporheic zones of the river. The methodology enabled the critical points of the spatio-temporal system studied to be identified, and proposals to be made for improving the management of CSOs. PMID:23812025

  18. Measuring individual differences in responses to date-rape vignettes using latent variable models.

    PubMed

    Tuliao, Antover P; Hoffman, Lesa; McChargue, Dennis E

    2017-01-01

    Vignette methodology can be a flexible and powerful way to examine individual differences in response to dangerous real-life scenarios. However, most studies underutilize the usefulness of such methodology by analyzing only one outcome, which limits the ability to track event-related changes (e.g., vacillation in risk perception). The current study was designed to illustrate the dynamic influence of risk perception on exit point from a date-rape vignette. Our primary goal was to provide an illustrative example of how to use latent variable models for vignette methodology, including latent growth curve modeling with piecewise slopes, as well as latent variable measurement models. Through the combination of a step-by-step exposition in this text and corresponding model syntax available electronically, we detail an alternative statistical "blueprint" to enhance future violence research efforts using vignette methodology. Aggr. Behav. 43:60-73, 2017. © 2016 Wiley Periodicals, Inc.

  19. Application of the Hardman methodology to the Army Remotely Piloted Vehicle (RPV)

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The application of the HARDMAN Methodology to the Remotely Piloted Vehicle (RPV) is described. The methodology was used to analyze the manpower, personnel, and training (MPT) requirements of the proposed RPV system design for a number of operating scenarios. The RPV system is defined as consisting of the equipment, personnel, and operational procedures needed to perform five basic artillery missions: reconnaissance, target acquisition, artillery adjustment, target designation and damage assessment. The RPV design evaluated includes an air vehicle (AV), a modular integrated communications and navigation system (MICNS), a ground control station (GCS), a launch subsystem (LS), a recovery subsystem (RS), and a number of ground support requirements. The HARDMAN Methodology is an integrated set of database management techniques and analytic tools, designed to provide timely and fully documented assessments of the human resource requirements associated with an emerging system's design.

  20. A methodology aimed at fostering and sustaining the development processes of an IE-based industry

    NASA Astrophysics Data System (ADS)

    Corallo, Angelo; Errico, Fabrizio; de Maggio, Marco; Giangreco, Enza

    In the current competitive scenario, where business relationships are fundamental in building successful business models and inter- and intra-organizational business processes are progressively digitalized, an end-to-end methodology is required that is capable of guiding business networks through the Internetworked Enterprise (IE) paradigm: a new and innovative organizational model able to leverage Internet technologies to perform real-time coordination of intra- and inter-firm activities, to create value by offering innovative and personalized products/services, and to reduce transaction costs. This chapter presents the TEKNE project Methodology of change that guides business networks, by means of a modular and flexible approach, towards the IE techno-organizational paradigm, taking into account the competitive environment of the network and how this environment influences its strategic, organizational and technological levels. Contingency, the business model, enterprise architecture and performance metrics are the key concepts that form the cornerstone of this methodological framework.

  1. The KULTURisk Regional Risk Assessment methodology for water-related natural hazards - Part 1: Physical-environmental assessment

    NASA Astrophysics Data System (ADS)

    Ronco, P.; Gallina, V.; Torresan, S.; Zabeo, A.; Semenzin, E.; Critto, A.; Marcomini, A.

    2014-12-01

    In recent years, the frequency of catastrophes induced by natural hazards has increased, and flood events in particular have been recognized as one of the most threatening water-related disasters. Severe floods have occurred in Europe over the last decade, causing loss of life, displacement of people and heavy economic losses. Flood disasters are growing in frequency as a consequence of many factors, both climatic and non-climatic. Indeed, the current increase of water-related disasters can be mainly attributed to the increase of exposure (elements potentially at risk in flood-prone areas) and vulnerability (i.e. economic, social, geographic, cultural and physical/environmental characteristics of the exposure). Besides these factors, the undeniable effect of climate change is projected to strongly modify the usual pattern of the hydrological cycle by intensifying the frequency and severity of flood events at the local, regional and global scale. Within this context, the need for developing effective and pro-active strategies, tools and actions which allow one to assess and (possibly) to reduce the flood risks that threaten different relevant receptors becomes urgent. Several methodologies to assess the risk posed by water-related natural hazards have been proposed so far, but very few of them can be adopted to implement the last European Flood Directive (FD). This paper is intended to introduce and present a state-of-the-art Regional Risk Assessment (RRA) methodology to appraise the risk posed by floods from a physical-environmental perspective. The methodology, developed within the recently completed FP7-KULTURisk Project (Knowledge-based approach to develop a cULTUre of Risk prevention - KR), is flexible and can be adapted to different case studies (i.e. plain rivers, mountain torrents, urban and coastal areas) and spatial scales (i.e. from catchment to the urban scale). The FD-compliant KR-RRA methodology is based on the concept of risk being a function of hazard, exposure and vulnerability. It integrates the outputs of various hydrodynamic models with site-specific bio-geophysical and socio-economic indicators (e.g. slope, land cover, population density, economic activities, etc.) to develop tailored risk indexes and GIS-based maps for each of the selected receptors (i.e. people, buildings, infrastructure, agriculture, natural and semi-natural systems, cultural heritage) in the considered region. It further compares the baseline scenario with alternative scenarios, where different structural and/or non-structural mitigation measures are planned and eventually implemented. As demonstrated in the companion paper (Part 2, Ronco et al., 2014), risk maps, along with related statistics, allow one to identify and classify, on a relative scale, areas at risk which are more likely to be affected by floods, and support the development of strategic adaptation and prevention measures to minimize flood impacts. In addition, the outcomes of the RRA can eventually be used for a further socio-economic assessment, considering the tangible and intangible costs as well as the human dimension of vulnerability.

  2. Managing uncertainty: a review of food system scenario analysis and modelling

    PubMed Central

    Reilly, Michael; Willenbockel, Dirk

    2010-01-01

    Complex socio-ecological systems like the food system are unpredictable, especially to long-term horizons such as 2050. In order to manage this uncertainty, scenario analysis has been used in conjunction with food system models to explore plausible future outcomes. Food system scenarios use a diversity of scenario types and modelling approaches determined by the purpose of the exercise and by technical, methodological and epistemological constraints. Our case studies do not suggest Malthusian futures for a projected global population of 9 billion in 2050; but international trade will be a crucial determinant of outcomes; and the concept of sustainability across the dimensions of the food system has been inadequately explored so far. The impact of scenario analysis at a global scale could be strengthened with participatory processes involving key actors at other geographical scales. Food system models are valuable in managing existing knowledge on system behaviour and ensuring the credibility of qualitative stories but they are limited by current datasets for global crop production and trade, land use and hydrology. Climate change is likely to challenge the adaptive capacity of agricultural production and there are important knowledge gaps for modelling research to address. PMID:20713402

  3. ‘Imagined guilt’ vs ‘recollected guilt’: implications for fMRI

    PubMed Central

    Mclatchie, Neil; Giner-Sorolla, Roger; Derbyshire, Stuart W. G.

    2016-01-01

    Guilt is thought to maintain social harmony by motivating reparation. This study compared two methodologies commonly used to identify the neural correlates of guilt. The first, imagined guilt, requires participants to read hypothetical scenarios and then imagine themselves as the protagonist. The second, recollected guilt, requires participants to reflect on times they personally experienced guilt. In the fMRI scanner, participants were presented with guilt/neutral memories and guilt/neutral hypothetical scenarios. Contrasts confirmed a priori predictions that guilt memories, relative to guilt scenarios, were associated with significantly greater activity in regions associated with affect [anterior cingulate cortex (ACC), caudate, insula, orbital frontal cortex (OFC)] and social cognition [temporal pole (TP), precuneus]. Similarly, results indicated that guilt memories, relative to neutral memories, were also associated with greater activity in affective (ACC, amygdala, insula, OFC) and social cognition (mPFC, TP, precuneus, temporo-parietal junction) regions. There were no significant differences between guilt hypothetical scenarios and neutral hypothetical scenarios in either affective or social cognition regions. The importance of distinguishing between different guilt inductions inside the scanner is discussed. We offer explanations of our results and discuss ideas for future research. PMID:26746179

  4. Life cycle assessment of wood wastes: A case study of ephemeral architecture.

    PubMed

    Rivela, Beatriz; Moreira, María Teresa; Muñoz, Iván; Rieradevall, Joan; Feijoo, Gumersindo

    2006-03-15

    One of the most commonly used elements in ephemeral architecture is the particleboard panel. These wood products are produced from wood wastes and are used in temporary constructions such as trade fairs. Once the event is over, they are usually disposed of in landfills. This paper assesses the environmental effects related to the use of these wood wastes in the end-of-life stage. A Life Cycle Assessment (LCA) of two scenarios was performed, considering the recycling of wood waste for particleboard manufacture and energy generation from non-renewable resources (Scenario 1) versus the production of energy from the combustion of wood waste and particleboard manufacture from conventional wood resources (Scenario 2). A sensitivity analysis was carried out taking into account the influence of the percentage of recycled material and the emissions data from wood combustion. According to the Ecoindicator 99 methodology, Damage to Human Health and Ecosystem Quality are more significant in Scenario 2, whereas Scenario 1 makes the largest contribution to Damage to Resources. Of the two proposed alternatives, the recycling of wood waste for particleboard manufacture appears to be more favorable from an environmental perspective.

  5. Integrating manufacturing softwares for intelligent planning execution: a CIIMPLEX perspective

    NASA Astrophysics Data System (ADS)

    Chu, Bei Tseng B.; Tolone, William J.; Wilhelm, Robert G.; Hegedus, M.; Fesko, J.; Finin, T.; Peng, Yun; Jones, Chris H.; Long, Junshen; Matthews, Mike; Mayfield, J.; Shimp, J.; Su, S.

    1997-01-01

    Recent developments have made it possible to interoperate complex business applications at much lower cost. Application interoperation, along with business process re-engineering, can result in significant savings by eliminating work created by disconnected business processes due to isolated business applications. However, we believe much greater productivity benefits can be achieved by facilitating timely decision-making that utilizes information from multiple enterprise perspectives. The CIIMPLEX enterprise integration architecture is designed to enable such productivity gains by helping people to carry out integrated enterprise scenarios. An enterprise scenario is typically triggered by some external event. The goal of an enterprise scenario is to make the right decisions considering the full context of the problem. Enterprise scenarios are difficult for people to carry out because of the interdependencies among various actions; one can easily be overwhelmed by the large amount of information. We propose the use of software agents to help gather relevant information and present it in the appropriate context of an enterprise scenario. The CIIMPLEX enterprise integration architecture is based on the FAIME methodology for application interoperation and plug-and-play. It also explores the use of software agents in application plug-and-play.

  6. 'Imagined guilt' vs 'recollected guilt': implications for fMRI.

    PubMed

    Mclatchie, Neil; Giner-Sorolla, Roger; Derbyshire, Stuart W G

    2016-05-01

    Guilt is thought to maintain social harmony by motivating reparation. This study compared two methodologies commonly used to identify the neural correlates of guilt. The first, imagined guilt, requires participants to read hypothetical scenarios and then imagine themselves as the protagonist. The second, recollected guilt, requires participants to reflect on times they personally experienced guilt. In the fMRI scanner, participants were presented with guilt/neutral memories and guilt/neutral hypothetical scenarios. Contrasts confirmed a priori predictions that guilt memories, relative to guilt scenarios, were associated with significantly greater activity in regions associated with affect [anterior cingulate cortex (ACC), caudate, insula, orbital frontal cortex (OFC)] and social cognition [temporal pole (TP), precuneus]. Similarly, results indicated that guilt memories, relative to neutral memories, were also associated with greater activity in affective (ACC, amygdala, insula, OFC) and social cognition (mPFC, TP, precuneus, temporo-parietal junction) regions. There were no significant differences between guilt hypothetical scenarios and neutral hypothetical scenarios in either affective or social cognition regions. The importance of distinguishing between different guilt inductions inside the scanner is discussed. We offer explanations of our results and discuss ideas for future research. © The Author (2016). Published by Oxford University Press.

  7. Industrial water resources management based on violation risk analysis of the total allowable target on wastewater discharge.

    PubMed

    Yue, Wencong; Cai, Yanpeng; Xu, Linyu; Yang, Zhifeng; Yin, Xin'An; Su, Meirong

    2017-07-11

    To improve the capabilities of conventional methodologies in facilitating industrial water allocation under uncertain conditions, an integrated approach was developed through the combination of operational research, uncertainty analysis, and violation risk analysis methods. The developed approach can (a) address complexities of industrial water resources management (IWRM) systems, (b) facilitate reflections of multiple uncertainties and risks of the system and incorporate them into a general optimization framework, and (c) manage robust actions for industrial productions in consideration of water supply capacity and wastewater discharge control. The developed method was then demonstrated in a water-stressed city (i.e., the City of Dalian), northeastern China. Three scenarios were proposed according to the city's industrial plans. The results indicated that in the planning year of 2020 (a) the production of civilian-use steel ships and machine-made paper & paperboard would decline significantly, (b) the violation risk of chemical oxygen demand (COD) discharge under scenario 1 would be the most prominent compared with those under scenarios 2 and 3, (c) the maximal total economic benefit under scenario 2 would be higher than the benefit under scenario 3, and (d) the production of rolling contact bearings, rail vehicles, and commercial vehicles would be promoted.
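
    A toy linear-programming analogue of the optimization framework described above: maximize economic benefit across industrial sectors subject to a water-supply cap and a total allowable COD discharge. The sectors, coefficients and caps are invented placeholders (not the Dalian data), and the sketch uses scipy's linprog.

        import numpy as np
        from scipy.optimize import linprog

        # Three hypothetical industrial sectors; decision variables are production levels.
        benefit = np.array([120.0, 90.0, 60.0])   # economic benefit per unit of production
        water_use = np.array([3.0, 2.0, 1.0])     # water demand per unit (10^4 m3)
        cod_load = np.array([0.8, 1.5, 0.4])      # COD discharge per unit (t)

        water_cap = 400.0  # available industrial water supply (10^4 m3)
        cod_cap = 180.0    # total allowable COD discharge target (t)

        # linprog minimizes, so negate the benefit vector to maximize total benefit.
        res = linprog(
            c=-benefit,
            A_ub=np.vstack([water_use, cod_load]),
            b_ub=np.array([water_cap, cod_cap]),
            bounds=[(0, None)] * 3,
            method="highs",
        )
        print("production plan:", np.round(res.x, 1))
        print("max total benefit:", round(-res.fun, 1))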

  8. Calculation of greenhouse gas emissions of jatropha oil and jatropha biodiesel as alternative fuels for electricity production in Côte d'Ivoire

    NASA Astrophysics Data System (ADS)

    Atta, Pascal Atta; N'guessan, Yao; Morin, Celine; Voirol, Anne Jaecker; Descombes, Georges

    2017-02-01

    Electricity in Côte d'Ivoire is mainly produced from fossil energy sources, which damages the environment through greenhouse gas (GHG) emissions. The aim of this paper is to calculate the GHG emissions of jatropha oil and jatropha biodiesel as alternative fuels for electricity production in Côte d'Ivoire using the Life Cycle Assessment (LCA) methodology. The functional unit in this LCA is defined as 1 kWh of electricity produced by the combustion of jatropha oil or jatropha biodiesel in the engine of a generator. Two scenarios, called short chain and long chain, were examined in this LCA. The results show that 0.132 kg CO2 equivalent is emitted for scenario 1, with jatropha oil as the alternative fuel, against 0.6376 kg CO2 equivalent for scenario 2, with jatropha biodiesel as the alternative fuel. Compared with diesel fuel, this corresponds to an 87% reduction in kg CO2 equivalent for scenario 1 and a 37% reduction for scenario 2.
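
    The percentage reductions quoted above follow from a simple ratio against the fossil baseline, reduction = 1 - E_alt / E_diesel. Back-calculating from the figures given, the implied diesel reference is roughly 1.01 kg CO2e per kWh; this reference value is an inference for illustration, not a value stated in the abstract.

        e_jatropha_oil = 0.132         # kg CO2e per kWh, scenario 1 (short chain)
        e_jatropha_biodiesel = 0.6376  # kg CO2e per kWh, scenario 2 (long chain)
        e_diesel = 1.01                # assumed diesel reference implied by the quoted reductions

        for name, e in [("jatropha oil", e_jatropha_oil), ("jatropha biodiesel", e_jatropha_biodiesel)]:
            reduction = 1.0 - e / e_diesel
            print(f"{name}: {reduction:.0%} reduction vs diesel")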

  9. Atmospheric circulation and hydroclimate impacts of alternative warming scenarios for the Eocene

    NASA Astrophysics Data System (ADS)

    Carlson, Henrik; Caballero, Rodrigo

    2017-08-01

    Recent work in modelling the warm climates of the early Eocene shows that it is possible to obtain a reasonable global match between model surface temperature and proxy reconstructions, but only by using extremely high atmospheric CO2 concentrations or more modest CO2 levels complemented by a reduction in global cloud albedo. Understanding the mix of radiative forcing that gave rise to Eocene warmth has important implications for constraining Earth's climate sensitivity, but progress in this direction is hampered by the lack of direct proxy constraints on cloud properties. Here, we explore the potential for distinguishing among different radiative forcing scenarios via their impact on regional climate changes. We do this by comparing climate model simulations of two end-member scenarios: one in which the climate is warmed entirely by CO2 (which we refer to as the greenhouse gas (GHG) scenario) and another in which it is warmed entirely by reduced cloud albedo (which we refer to as the low CO2-thin clouds or LCTC scenario). The two simulations have an almost identical global-mean surface temperature and equator-to-pole temperature difference, but the LCTC scenario has ~11% greater global-mean precipitation than the GHG scenario. The LCTC scenario also has cooler midlatitude continents and warmer oceans than the GHG scenario and a tropical climate which is significantly more El Niño-like. Extremely high warm-season temperatures in the subtropics are mitigated in the LCTC scenario, while cool-season temperatures are lower at all latitudes. These changes appear large enough to motivate further, more detailed study using other climate models and a more realistic set of modelling assumptions.

  10. The Importance of Measuring Strength-of-Preference Scores for Health Care Options in Preference-Sensitive Care

    PubMed Central

    Crump, R. Trafford; Llewellyn-Thomas, Hilary A.

    2012-01-01

    Objective The objective was to determine whether a paired-comparison/leaning scale method: a) could feasibly be used to elicit strength-of-preference scores for elective health care options in large community-based survey settings; and b) could reveal preferential sub-groups that would have been overlooked if only a categorical-response format had been used. Study Design Medicare beneficiaries in four different regions of the United States were interviewed in person. Participants considered 8 clinical scenarios, each with 2 to 3 different health care options. For each scenario, participants categorically selected their favored option, then indicated how strongly they favored that option relative to the alternative on a paired-comparison bi-directional Leaning Scale. Results Two hundred and two participants were interviewed. For 7 of the 8 scenarios, a clear majority (> 50%) indicated that, overall, they categorically favored one option over the alternative(s). However, the bi-directional strength-of-preference Leaning Scale scores revealed that, in 4 scenarios, for half of those participants, their preference for the favored option was actually “weak” or “neutral”. Conclusion Investigators aiming to assess population-wide preferential attitudes towards different elective health care scenarios should consider gathering ordinal-level strength-of-preference scores and could feasibly use the paired-comparison/bi-directional Leaning Scale to do so. PMID:22494579

  11. Landslide risk analysis: a multi-disciplinary methodological approach

    NASA Astrophysics Data System (ADS)

    Sterlacchini, S.; Frigerio, S.; Giacomelli, P.; Brambilla, M.

    2007-11-01

    This study describes an analysis carried out within the European community project "ALARM" (Assessment of Landslide Risk and Mitigation in Mountain Areas, 2004) on landslide risk assessment in the municipality of Corvara in Badia, Italy. This mountainous area, located in the central Dolomites (Italian Alps), poses a significant landslide hazard to several man-made structures and natural assets. Three parameters for determining risk were analysed as an aid to preparedness and mitigation planning: event occurrence probability, elements at risk, and the vulnerability of these elements. Initially, a landslide hazard scenario was defined; this step was followed by the identification of the potentially vulnerable elements, the estimation of the expected physical effects due to the occurrence of a damaging phenomenon, and the analysis of the social and economic features of the area. Finally, a potential risk scenario was defined, in which the relationships between the event, its physical effects, and its economic consequences were investigated. People and public administrators with training and experience in local landsliding and slope processes were involved in each step of the analysis. A "cause-effect" correlation was applied, derived from the "dose-response" equation initially used in the biological sciences and later adapted by economists for the assessment of environmental risks. The relationship was analysed from a physical point of view, and the cause (the natural event) was correlated with the physical effects, i.e. the aesthetic, functional, and structural damage. An economic evaluation of direct and indirect damage was carried out considering the assets in the affected area (i.e. tourist flows, goods, transport and the effects on other social and economic activities). This study shows the importance of indirect damage, which is as significant as direct damage. The total direct damage was estimated at €8,913,000; indirect damage, by contrast, ranged considerably, from €2,840,000 to €9,350,000, depending on the selected temporal scenario and the expected closing time of the potentially affected structures. The multi-disciplinary approach discussed in this study may assist local decision makers in determining the nature and magnitude of the expected losses due to a dangerous event that can be anticipated in a given study area during a specified time period. Moreover, advance knowledge of the prospective physical effects and economic consequences may help local decision makers to choose the best prevention and mitigation options and to decide how to allocate resources properly, so that potential benefits are maximised at an acceptable cost.

  12. Modeling and validation of directional reflectance for heterogeneous agro-forestry scenarios

    NASA Astrophysics Data System (ADS)

    Yelu, Z.; Jing, L.; Qinhuo, L.; Huete, A. R.

    2015-12-01

    Landscape heterogeneity is a common natural phenomenon but is seldom considered in current radiative transfer models for predicting surface reflectance. This paper develops an explicit analytical Radiative Transfer model for heterogeneous Agro-Forestry scenarios (RTAF) by dividing the scenario into non-boundary regions and boundary regions. The scattering contribution of the non-boundary regions, which are treated as homogeneous canopies, can be estimated from the SAILH model, whereas that of the boundary regions, which depends on the lengths, widths, canopy heights, and orientations of the field patches, is calculated from the bidirectional gap probability, taking into account the interactions and mutual shadowing effects among different patches. The hot spot factor is extended to heterogeneous scenarios, the Hapke model for soil anisotropy is incorporated, and the contributions of direct and diffuse radiation are calculated separately. Multi-angular airborne observations and Discrete Anisotropic Radiative Transfer (DART) model simulations were used to validate and evaluate the RTAF model over an agro-forestry scenario in the Heihe River Basin, China. The results indicate that the RTAF model can accurately simulate the hemispherical-directional reflectance factors (HDRFs) of the heterogeneous agro-forestry scenario, with RMSEs of 0.0016 and 0.0179 in the red and near-infrared (NIR) bands, respectively. The RTAF model was compared with two widely used models, the dominant cover type (DCT) model and the spectral linear mixture (SLM) model, which either neglect the interactions and mutual shadowing effects between the shelterbelts and crops or do not account for the contribution of the shelterbelts. The results suggest that the boundary effect can significantly influence the angular distribution of the HDRFs and consequently enlarges the HDRF variations between the backward and forward directions in the principal plane. The RTAF model reduced the maximum relative error from 25.7% (SLM) and 23.0% (DCT) to 9.8% in the red band, and from 19.6% (DCT) and 13.7% (SLM) to 8.7% in the NIR band. According to these findings, the RTAF model provides a promising way to improve the retrieval of biophysical parameters (e.g. leaf area index) from remote sensing data over heterogeneous agro-forestry scenarios.

  13. Modeled changes in extreme wave climates of the tropical Pacific over the 21st century: Implications for U.S. and U.S.-Affiliated atoll islands

    USGS Publications Warehouse

    Shope, J.B.; Storlazzi, Curt; Erikson, Li H.; Hegermiller, C.A.

    2015-01-01

    Wave heights, periods, and directions were forecast for 2081–2100 using output from four coupled atmosphere–ocean global climate models for representative concentration pathway scenarios RCP4.5 and RCP8.5. Global climate model wind fields were used to drive the global WAVEWATCH-III wave model to generate hourly time-series of bulk wave parameters for 25 islands in the mid to western tropical Pacific. December–February 95th percentile extreme significant wave heights under both climate scenarios decreased by 2100 compared to 1976–2010 historical values. Trends under both scenarios were similar, with the higher-emission RCP8.5 scenario displaying a greater decrease in extreme significant wave heights than where emissions are reduced in the RCP4.5 scenario. Central equatorial Pacific Islands displayed the greatest departure from historical values; significant wave heights decreased there by as much as 0.32 m during December–February and associated wave directions rotated approximately 30° clockwise during June–August compared to hindcast data.

  14. Climate change impact assessment on Veneto and Friuli Plain groundwater. Part I: an integrated modeling approach for hazard scenario construction.

    PubMed

    Baruffi, F; Cisotto, A; Cimolino, A; Ferri, M; Monego, M; Norbiato, D; Cappelletto, M; Bisaglia, M; Pretner, A; Galli, A; Scarinci, A; Marsala, V; Panelli, C; Gualdi, S; Bucchignani, E; Torresan, S; Pasini, S; Critto, A; Marcomini, A

    2012-12-01

    Climate change impacts on water resources, particularly groundwater, are a highly debated topic worldwide, triggering international attention and interest from both researchers and policy makers due to their relevant link with European water policy directives (e.g. 2000/60/EC and 2007/118/EC) and related environmental objectives. The understanding of long-term impacts of climate variability and change is therefore a key challenge in order to adopt effective protection measures and to implement sustainable management of water resources. This paper presents the modeling approach adopted within the Life+ project TRUST (Tool for Regional-scale assessment of groUndwater Storage improvement in adaptation to climaTe change) in order to provide climate change hazard scenarios for the shallow groundwater of the high Veneto and Friuli Plain, Northern Italy. Given the aim of evaluating potential impacts on water quantity and quality (e.g. groundwater level variation, decrease of water availability for irrigation, variations of nitrate infiltration processes), the modeling approach integrated an ensemble of climate, hydrologic and hydrogeologic models running from the global to the regional scale. Global and regional climate models and downscaling techniques were used to produce climate simulations for the reference period 1961-1990 and the projection period 2010-2100. The simulation of the recent climate was performed using observed radiative forcings, whereas the projections prescribed the radiative forcings according to the IPCC A1B emission scenario. The climate simulations and downscaling then provided the precipitation, temperature and evapotranspiration fields used for the impact analysis. Based on the downscaled climate projections, 3 reference scenarios for the period 2071-2100 (i.e. the driest, the wettest and the mild year) were selected and used to run a regional geomorphoclimatic and hydrogeological model. The final output of the model ensemble produced information about the potential variations of the water balance components (e.g. river discharge, groundwater level and volume) due to climate change. Such projections were used to develop potential hazard scenarios for the case study area, to be further applied within climate change risk assessment studies for groundwater resources and associated ecosystems. This paper describes the model chain and the methodological approach adopted in the TRUST project and analyzes the hazard scenarios produced in order to investigate climate change risks for the case study area. Copyright © 2012 Elsevier B.V. All rights reserved.

  15. Projecting land-use change and its consequences for biodiversity in northern Thailand.

    PubMed

    Trisurat, Yongyut; Alkemade, Rob; Verburg, Peter H

    2010-03-01

    Rapid deforestation has occurred in northern Thailand over the last few decades and it is expected to continue. The government has implemented conservation policies aimed at maintaining forest cover of 50% or more and promoting agribusiness, forestry, and tourism development in the region. The goal of this paper was to analyze the likely effects of various directions of development on the region. Specific objectives were (1) to forecast land-use change and land-use patterns across the region based on three scenarios, (2) to analyze the consequences for biodiversity, and (3) to identify areas most susceptible to future deforestation and high biodiversity loss. The study combined a dynamic land-use change model (Dyna-CLUE) with a model for biodiversity assessment (GLOBIO3). The Dyna-CLUE model was used to determine the spatial patterns of land-use change for the three scenarios. The methodology developed for the Global Biodiversity Assessment Model framework (GLOBIO 3) was used to estimate biodiversity intactness expressed as the remaining relative mean species abundance (MSA) of the original species relative to their abundance in the primary vegetation. The results revealed that forest cover in 2050 would mainly persist in the west and upper north of the region, which is rugged and not easily accessible. In contrast, the highest deforestation was expected to occur in the lower north. MSA values decreased from 0.52 in 2002 to 0.45, 0.46, and 0.48, respectively, for the three scenarios in 2050. In addition, the estimated area with a high threat to biodiversity (an MSA decrease >0.5) derived from the simulated land-use maps in 2050 was approximately 2.8% of the region for the trend scenario. In contrast, the high-threat areas covered 1.6 and 0.3% of the region for the integrated-management and conservation-oriented scenarios, respectively. Based on the model outcomes, conservation measures were recommended to minimize the impacts of deforestation on biodiversity. The model results indicated that only establishing a fixed percentage of forest was not efficient in conserving biodiversity. Measures aimed at the conservation of locations with high biodiversity values, limited fragmentation, and careful consideration of road expansion in pristine forest areas may be more efficient to achieve biodiversity conservation.
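
    As an illustration of the MSA aggregation behind figures of this kind, a regional MSA is commonly computed as the area-weighted mean of per-land-use MSA values. The land-use shares and MSA coefficients below are invented placeholders, not the GLOBIO3 values used for northern Thailand.

        # Hypothetical land-use composition (fraction of region) and per-class MSA values.
        land_use = {
            "primary forest": (0.35, 1.00),
            "secondary forest": (0.20, 0.60),
            "agricultural mosaic": (0.30, 0.30),
            "plantation": (0.10, 0.20),
            "urban": (0.05, 0.05),
        }

        # The shares must describe the whole region.
        assert abs(sum(share for share, _ in land_use.values()) - 1.0) < 1e-9

        # Area-weighted mean species abundance for the region.
        regional_msa = sum(share * msa for share, msa in land_use.values())
        print(f"area-weighted regional MSA = {regional_msa:.2f}")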

  16. Do abnormal responses show utilitarian bias?

    PubMed

    Kahane, Guy; Shackel, Nicholas

    2008-03-20

    Neuroscience has recently turned to the study of utilitarian and non-utilitarian moral judgement. Koenigs et al. examine the responses of normal subjects and those with ventromedial-prefrontal-cortex (VMPC) damage to moral scenarios drawn from functional magnetic resonance imaging studies by Greene et al., and claim that patients with VMPC damage have an abnormally "utilitarian" pattern of moral judgement. It is crucial to the claims of Koenigs et al. that the scenarios of Greene et al. pose a conflict between utilitarian consequence and duty: however, many of them do not meet this condition. Because of this methodological problem, it is too early to claim that VMPC patients have a utilitarian bias.

  17. Direct multitrait selection realizes the highest genetic response for ratio traits.

    PubMed

    Zetouni, L; Henryon, M; Kargo, M; Lassen, J

    2017-05-01

    For a number of traits the phenotype considered to be the goal trait is a combination of 2 or more traits, like methane (CH4) emission (CH4/kg of milk). Direct selection on CH4 emission defined as a ratio is problematic, because it is uncertain whether the improvement comes from an improvement in milk yield, a decrease in CH4 emission, or both. The goal was to test different strategies for selecting on 2 antagonistic traits: improving milk yield while decreasing methane emissions. The hypothesis was that to maximize genetic gain for a ratio trait, the best approach is to select directly for the component traits rather than using a ratio trait, or a trait where 1 trait is corrected for the other, as the selection criterion. Stochastic simulation was used to mimic a dairy cattle population. Three scenarios were tested, which differed in selection criteria but all selected for increased milk yield: 1) selection based on a multitrait approach using the correlation structure between the 2 traits, 2) the ratio of methane to milk, and 3) gross methane phenotypically corrected for milk. Four correlation sets were tested in all scenarios to assess the robustness of the results. An average genetic gain of 66 kg of milk per yr was obtained in all scenarios, but scenario 1 had the best response for decreased methane emissions, with a genetic gain of 24.8 l/yr, while scenarios 2 and 3 had genetic gains of 27.1 and 27.3 kg/yr. The results were consistent across correlation sets. These results confirm the hypothesis that, to obtain the highest genetic gain, multitrait selection is a better approach than selecting for the ratio directly. The results are exemplified for a methane and milk scenario but can be generalized to other situations where combined traits need to be improved.
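
    A minimal stochastic sketch of the comparison, with entirely hypothetical genetic parameters and a simplified index (not the authors' simulation), is given below: candidates are ranked either on an index of the component traits or directly on the ratio trait, and the mean breeding values of the selected group are reported.

        # Illustrative sketch: rank selection candidates on an index of the component
        # traits (milk up, methane down) or directly on the ratio (methane per kg milk).
        import numpy as np

        rng = np.random.default_rng(1)
        n, top = 10_000, 0.05                                   # candidates, selected fraction

        # Hypothetical true breeding values on the phenotypic scale, genetically correlated.
        corr = 0.4
        milk_sd, ch4_sd = 600.0, 40.0
        cov = np.array([[milk_sd**2, corr * milk_sd * ch4_sd],
                        [corr * milk_sd * ch4_sd, ch4_sd**2]])
        bv = rng.multivariate_normal([8000.0, 400.0], cov, size=n)   # kg milk/yr, kg CH4/yr
        milk, ch4 = bv[:, 0], bv[:, 1]

        def selected_means(score):
            sel = score >= np.quantile(score, 1.0 - top)
            return milk[sel].mean(), ch4[sel].mean()

        index_milk, index_ch4 = selected_means(milk / milk_sd - ch4 / ch4_sd)  # simple index
        ratio_milk, ratio_ch4 = selected_means(-(ch4 / milk))                  # low ratio is best

        print(f"index selection: milk {index_milk:.0f} kg, methane {index_ch4:.1f} kg")
        print(f"ratio selection: milk {ratio_milk:.0f} kg, methane {ratio_ch4:.1f} kg")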

  18. A Method for Evaluating the Safety Impacts of Air Traffic Automation

    NASA Technical Reports Server (NTRS)

    Kostiuk, Peter; Shapiro, Gerald; Hanson, Dave; Kolitz, Stephan; Leong, Frank; Rosch, Gene; Bonesteel, Charles

    1998-01-01

    This report describes a methodology for analyzing the safety and operational impacts of emerging air traffic technologies. The approach integrates traditional reliability models of the system infrastructure with models that analyze the environment within which the system operates, and models of how the system responds to different scenarios. Products of the analysis include safety measures such as predicted incident rates, predicted accident statistics, and false alarm rates; and operational availability data. The report demonstrates the methodology with an analysis of the operation of the Center-TRACON Automation System at Dallas-Fort Worth International Airport.

  19. Agile rediscovering values: Similarities to continuous improvement strategies

    NASA Astrophysics Data System (ADS)

    Díaz de Mera, P.; Arenas, J. M.; González, C.

    2012-04-01

    Research in the late 1980s on technological companies that develop products of high-value innovation, with sufficient speed and flexibility to adapt quickly to changing market conditions, gave rise to the new set of methodologies known as the Agile Management Approach. In the current changing economic scenario, we considered it very interesting to study the similarities of these Agile Methodologies with other practices whose effectiveness has been amply demonstrated in both the West and Japan. Strategies such as Kaizen, Lean, World Class Manufacturing, Concurrent Engineering, etc., are analyzed to check the values they have in common with the Agile Approach.

  20. Supervisory Control of Multiple Uninhabited Systems - Methodologies and Enabling Human-Robot Interface Technologies (Commande et surveillance de multiples systemes sans pilote - Methodologies et technologies habilitantes d’interfaces homme-machine)

    DTIC Science & Technology

    2012-12-01

    SMAART (2006-2008) and SUSIE (2009-2011) programmes, conducted in Brest, Nancy and Paris (France); the remainder of the indexed excerpt for this record is fragmentary (section headings and table captions from the report).

  1. Teachers' Assessments of Elements of Multimedia and Constructivist Didactics in School

    ERIC Educational Resources Information Center

    Matijevic, Milan; Topolovcan, Tomislav; Lapat, Goran

    2015-01-01

    Despite the understandings that constructivist and multimedia didactics, as well as curricular theory and multiple intelligences theory, have been providing for years, what happens in the classroom and in the teaching process is still mostly teacher-centred. The didactic and methodological scenarios that prevail in our classes are more suitable to…

  2. New Perspectives on Teaching and Working with Languages in the Digital Era

    ERIC Educational Resources Information Center

    Pareja-Lora, Antonio, Ed.; Calle-Martínez, Cristina, Ed.; Rodríguez-Arancón, Pilar, Ed.

    2016-01-01

    This volume offers a comprehensive, up-to-date, empirical and methodological view over the new scenarios and environments for language teaching and learning recently emerged (e.g. blended learning, e-learning, ubiquitous learning, social learning, autonomous learning or lifelong learning), and also over some of the new approaches to language…

  3. Quality Concerns in Technical Education in India: A Quantifiable Quality Enabled Model

    ERIC Educational Resources Information Center

    Gambhir, Victor; Wadhwa, N. C.; Grover, Sandeep

    2016-01-01

    Purpose: The paper aims to discuss current Technical Education scenarios in India. It proposes modelling the factors affecting quality in a technical institute and then applying a suitable technique for assessment, comparison and ranking. Design/methodology/approach: The paper chose graph theoretic approach for quantification of quality-enabled…

  4. Exploration of a Contextual Management Framework for Strategic Learning Alliances

    ERIC Educational Resources Information Center

    Dealtry, Richard

    2008-01-01

    Purpose: This article aims to take a further step forward in examining those important business factors that will shape the future of best practice in the quality management of internal and external strategic alliances. Design/methodology/approach: The article presents a speculative scenario on the future of strategic alliances in education,…

  5. MARC Data, the OPAC, and Library Professionals

    ERIC Educational Resources Information Center

    Williams, Jo

    2009-01-01

    Purpose: The purpose of this paper is to show that knowledge of the Machine-Readable Cataloguing (MARC) format is useful in all aspects of librarianship, not just for cataloguing, and how MARC knowledge can address indexing limitations of the online catalogue. Design/methodology/approach: The paper employs examples and scenarios to show the…

  6. Cartography As Language: An Argument and a Functional Application.

    ERIC Educational Resources Information Center

    Bosowski, Elaine Frances

    This paper justifies the teaching of cartography in secondary schools and expands graphic knowledge by providing a formal graphic language simulation lesson. The cartographer's task, decisions, and methodologies are approximated by the use of this role playing scenario. Students assume the roles of map authors who are contracted to draw up a set…

  7. Predicting Plausible Impacts of Sets of Climate and Land Use Change Scenarios on Water Resources

    EPA Science Inventory

    Global changes in climate and land use can affect the quantity and quality of water resources. Hence, we need a methodology to predict these ramifications. Using the Little Miami River (LMR) watershed as a case study, this paper describes a spatial analytical approach integrating...

  8. Exploring the Effectiveness of a Virtual Learning Methodology in Occupational Therapy Education

    ERIC Educational Resources Information Center

    Bebeau, Deborah

    2016-01-01

    This quantitative, randomly controlled study sought to find relationships between occupational therapy students' participation in a virtual situated-case scenario (VSCS) and enhanced perceived self-efficacy as well as academic performance when compared to participation in a text-based case study. To determine effects of participation in a virtual…

  9. Quantifying ecosystem service tradeoffs in response to alternative land use and climate scenarios: Pacific Northwest applications of the VELMA ecohydrological model

    EPA Science Inventory

    Scientists, policymakers, community planners and others have discussed ecosystem services for decades, however, society is still in the early stages of developing methodologies to quantify and value the goods and services that ecosystems provide. Essential to this goal are highl...

  10. Using MBTI for the Success Assessment of Engineering Teams in Project-Based Learning

    ERIC Educational Resources Information Center

    Rodríguez Montequín, V.; Mesa Fernández, J. M.; Balsera, J. Villanueva; García Nieto, A.

    2013-01-01

    Project-Based Learning (PBL) is a teaching and learning methodology that emphasizes student centered instruction by assigning projects. The students have to conduct significant projects and cope with realistic working conditions and scenarios. PBL is generally done by groups of students working together towards a common goal. Several factors play…

  11. Characterization of Radiation Hardened Bipolar Linear Devices for High Total Dose Missions

    NASA Technical Reports Server (NTRS)

    McClure, Steven S.; Harris, Richard D.; Rax, Bernard G.; Thorbourn, Dennis O.

    2012-01-01

    Radiation hardened linear devices are characterized for performance in combined total dose and displacement damage environments for a mission scenario with a high radiation level. Performance at low and high dose rate for both biased and unbiased conditions is compared and the impact to hardness assurance methodology is discussed.

  12. Discovering the Future of the Case Study Method in Evaluation Research.

    ERIC Educational Resources Information Center

    Yin, Robert K.

    1994-01-01

    It is assumed that evaluators of the future will still be interested in case study methodology. Scenarios that ignore a case study method, that look back to a distinctive case study method, and that see the case study method as an integrating force in the qualitative-quantitative debate are explored. (SLD)

  13. The Importance of Teaching Methodology in Moral Education of Sport Populations.

    ERIC Educational Resources Information Center

    Stoll, Sharon Kay; And Others

    Three approaches to teaching moral reasoning were implemented by expert teachers in classes at three small colleges and outcomes were compared. Teaching models included the following: Model A, a "good reasoned" approach in which students discussed scenarios and determined the best course of action; Model B, a teacher-centered lecture,…

  14. "PolyCAFe"--Automatic Support for the Polyphonic Analysis of CSCL Chats

    ERIC Educational Resources Information Center

    Trausan-Matu, Stefan; Dascalu, Mihai; Rebedea, Traian

    2014-01-01

    Chat conversations and other types of online communication environments are widely used within CSCL educational scenarios. However, there is a lack of theoretical and methodological background for the analysis of collaboration. Manual assessing of non-moderated chat discussions is difficult and time-consuming, having as a consequence that learning…

  15. Effect-size measures as descriptors of assay quality in high-content screening: A brief review of some available methodologies

    USDA-ARS?s Scientific Manuscript database

    The field of high-content screening (HCS) typically uses measures of screen quality conceived for fairly straightforward high-throughput screening (HTS) scenarios. However, in contrast to HTS, image-based HCS systems rely on multidimensional readouts reporting biological responses associated with co...

  16. Valuation of Transactive Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hammerstrom, Donald J.; Corbin, Charles D.; Fernandez, Nicholas

    2016-05-12

    This is a final report from a project funded by the U.S. Department of Energy to formulate and test a methodology for valuation of systems where transaction-based mechanisms coordinate the exchange of value between the system’s actors. Today, the principal commodity being exchanged is electrical energy, and such mechanisms are called transactive energy systems. The authors strove to lay a foundation for meaningful valuations of transactive systems in general, and transactive energy systems as a special case. The word valuation is used in many different ways. This report proposes a valuation methodology that is inclusive of many types of valuations. Many will be familiar with cost-benefit valuations, in which both costs and benefits are assessed to determine whether the assets are worth their cost. Another set of valuation methods attempts to optimize an outcome using available resources, as is the case with integrated resource planning. In the end, this report’s methodology was most influenced by and most resembles the integrated-resource-planning approach. Regardless, we wish to enforce the premise that all valuations are comparative and should clearly specify a baseline scenario. A long, annotated list of prior valuation studies and valuation methodologies that influenced this report has been appended to this report. Much research is being conducted today concerning transactive systems, but only a handful of transactive system mechanisms have been formulated and field tested. They are found to be quite diverse, and the documentation of the various mechanisms is uneven in breadth and quality. It is therefore not adequate to simply assert that a valuation scenario includes a transactive system; certain characteristics and qualities of the chosen transactive system mechanism must be defined and stated. The report lists and discusses most of the known transactive system mechanisms. It offers a set of questions that may be used to help specify important characteristics of the transactive system mechanisms, which should be conveyed along with other valuation results. A valuation methodology is proposed. Some abstraction is necessarily retained so that the methodology may be applied for the many purposes of today’s valuations and across grid, building, societal, and other domains. The report’s methodology advocates separation of operational timescales from long-term growth timescales. Operational models are defined as the models that inform impacts within the relatively short, often yearlong, operational time periods. Growth models define how the scenarios evolve from one operational period to the next (e.g., from year to year). We believe the recommended methodology is a critical step toward collaborative community platforms, where analysts and decision makers alike could contribute and borrow content within their expertise. The report then asks, what is unique about valuations when systems become coordinated by transactive systems? In answer, accurate valuations of transactive systems require careful adherence to the dynamic interaction between a system’s responsive elements and the system’s operational objectives. In every transactive system mechanism, elements respond to incentives that become revealed to them, and certain operational objectives become explicitly incentivized by the transactive system mechanism. The transactive system mechanisms define the important coupling between the responsive elements and the system’s objectives.

  17. Portable open-path optical remote sensing (ORS) Fourier transform infrared (FTIR) instrumentation miniaturization and software for point and click real-time analysis

    NASA Astrophysics Data System (ADS)

    Zemek, Peter G.; Plowman, Steven V.

    2010-04-01

    Advances in hardware have miniaturized the emissions spectrometer and associated optics, rendering them easily deployed in the field. Such systems are also suitable for vehicle mounting, and can provide high quality data and concentration information in minutes. Advances in software have accompanied this hardware evolution, enabling the development of portable point-and-click OP-FTIR systems that weigh less than 16 lbs. These systems are ideal for first-responders, military, law enforcement, forensics, and screening applications using optical remote sensing (ORS) methodologies. With canned methods and interchangeable detectors, the new generation of OP-FTIR technology is coupled to the latest forward reference-type model software to provide point-and-click technology. These software models have been established for some time. However, refined user-friendly models that use active, passive, and solar occultation methodologies now allow the user to quickly field-screen and quantify plumes, fence-lines, and combustion incident scenarios in high-temporal-resolution. Synthetic background generation is now redundant as the models use highly accurate instrument line shape (ILS) convolutions and several other parameters, in conjunction with radiative transfer model databases to model a single calibration spectrum to collected sample spectra. Data retrievals are performed directly on single beam spectra using non-linear classical least squares (NLCLS). Typically, the Hitran line database is used to generate the initial calibration spectrum contained within the software.

  18. Preliminary Comparison of Radioactive Waste Disposal Cost for Fusion and Fission Reactors

    NASA Astrophysics Data System (ADS)

    Seki, Yasushi; Aoki, Isao; Yamano, Naoki; Tabara, Takashi

    1997-09-01

    The environmental and economic impact of radioactive waste (radwaste) generated from fusion power reactors using five types of structural materials and a fission reactor has been evaluated and compared. A possible disposal scenario for fusion radwaste in Japan is considered. The exposure doses were evaluated for gamma-ray skyshine during the disposal operation, the groundwater migration scenario during the 300-year institutional control period, and the future site use scenario after the institutional period. The radwaste generated from a typical light water fission reactor was evaluated using the same methodology as for the fusion reactors. It is found that radwaste from the fusion reactors using F82H and SiC/SiC composites without impurities could be disposed of by the shallow land disposal presently applied to low-level waste in Japan. The disposal costs of radwaste from five fusion power reactors and a typical light water reactor were roughly evaluated and compared.

  19. Data Farming and the Exploration of Inter-Agency, Inter-Disciplinary, and International "What If?" Questions

    NASA Technical Reports Server (NTRS)

    Anderson, Steve; Horne, Gary; Meyer, Ted; Triola, Larry

    2012-01-01

    Data farming uses simulation modeling, high performance computing, and analysis to examine questions of interest with large possibility spaces. This methodology allows for the examination of whole landscapes of potential outcomes and provides the capability of executing enough experiments so that outliers might be captured and examined for insights. This capability may be quite informative when used to examine the plethora of "What if?" questions that result when examining potential scenarios that our forces may face in the uncertain world of the future. Many of these scenarios most certainly will be challenging, and solutions may depend on interagency and international collaboration as well as the need for inter-disciplinary scientific inquiry preceding these events. In this paper, we describe data farming and illustrate it in the context of application to questions inherent to military decision-making as we consider alternate future scenarios.
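
    The sketch below illustrates the basic data-farming loop under stated assumptions (a hypothetical toy outcome model and made-up parameter ranges): run the model over a large grid of inputs and replications, then flag outlier outcomes for closer examination.

        # Illustrative sketch of the data-farming idea with a hypothetical toy model.
        import itertools
        import numpy as np

        rng = np.random.default_rng(7)

        def toy_scenario(force_ratio, warning_time, rep):
            """Hypothetical stand-in for a simulation run returning one outcome measure."""
            noise = rng.normal(0.0, 0.1)
            return 1.0 / (1.0 + np.exp(-(force_ratio - 1.0) - 0.05 * warning_time)) + noise

        force_ratios = np.linspace(0.5, 2.0, 16)
        warning_times = np.linspace(0.0, 60.0, 13)
        replications = range(30)

        runs = [(fr, wt, toy_scenario(fr, wt, r))
                for fr, wt, r in itertools.product(force_ratios, warning_times, replications)]

        outcomes = np.array([o for _, _, o in runs])
        lo, hi = np.percentile(outcomes, [1, 99])
        outliers = [(fr, wt, o) for fr, wt, o in runs if o < lo or o > hi]
        print(f"{len(runs)} runs, {len(outliers)} outlier outcomes flagged for inspection")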

  20. Progress in modeling solidification in molten salt coolants

    NASA Astrophysics Data System (ADS)

    Tano, Mauricio; Rubiolo, Pablo; Doche, Olivier

    2017-10-01

    Molten salts have been proposed as heat-carrier media in nuclear and concentrating solar power plants. Due to their high melting temperature, solidification of the salts is expected to occur during routine and accidental scenarios. Furthermore, passive safety systems based on the solidification of these salts are being studied. The following article presents new developments in the modeling of eutectic molten salts by means of a multiphase, multicomponent, phase-field model. In addition, an application of this methodology to the eutectic solidification process of the ternary system LiF-KF-NaF is presented. The model predictions are compared with a newly developed semi-analytical solution for directional eutectic solidification at a stable growth rate. A good qualitative agreement is obtained between the two approaches. The results obtained with the phase-field model are then used for calculating the homogenized properties of the solid phase distribution. These properties can then be included in a mixture macroscale model, more suitable for industrial applications.

  1. Solar thermal technologies benefits assessment: Objectives, methodologies and results for 1981

    NASA Technical Reports Server (NTRS)

    Gates, W. R.

    1982-01-01

    The economic and social benefits of developing cost competitive solar thermal technologies (STT) were assessed. The analysis was restricted to STT in electric applications for 16 high insolation/high energy price states. Three fuel price scenarios and three 1990 STT system costs were considered, reflecting uncertainty over fuel prices and STT cost projections. After considering the numerous benefits of introducing STT into the energy market, three primary benefits were identified and evaluated: (1) direct energy cost savings were estimated to range from zero to $50 billion; (2) oil imports may be reduced by up to 9 percent, improving national security; and (3) significant environmental benefits can be realized in air basins where electric power plant emissions create substantial air pollution problems. STT research and development was found to be unacceptably risky for private industry in the absence of federal support. The normal risks associated with investments in research and development are accentuated because the OPEC cartel can artificially manipulate oil prices and undercut the growth of alternative energy sources.

  2. Enhanced Engine Performance During Emergency Operation Using a Model-Based Engine Control Architecture

    NASA Technical Reports Server (NTRS)

    Csank, Jeffrey T.; Connolly, Joseph W.

    2016-01-01

    This paper discusses the design and application of model-based engine control (MBEC) for use during emergency operation of the aircraft. The MBEC methodology is applied to the Commercial Modular Aero-Propulsion System Simulation 40k (CMAPSS40k) and features an optimal tuner Kalman Filter (OTKF) to estimate unmeasured engine parameters, which can then be used for control. During an emergency scenario, normally-conservative engine operating limits may be relaxed to increase the performance of the engine and overall survivability of the aircraft; this comes at the cost of additional risk of an engine failure. The MBEC architecture offers the advantage of estimating key engine parameters that are not directly measurable. Estimating the unknown parameters allows for tighter control over these parameters, and over the level of risk at which the engine will operate. This will allow the engine to achieve better performance than possible when operating to more conservative limits on a related, measurable parameter.
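
    The sketch below is a generic linear Kalman filter, not the CMAPSS40k/OTKF implementation; it only illustrates the underlying idea of estimating an unmeasured engine state from a noisy, related measurement so that control can act on the estimate. The two-state model and noise levels are hypothetical.

        # Minimal linear Kalman filter sketch: estimate an unmeasured state from a measured one.
        import numpy as np

        rng = np.random.default_rng(3)

        # Hypothetical 2-state model: x = [measured speed-like state, unmeasured temperature-like state]
        A = np.array([[0.98, 0.05],
                      [0.00, 0.97]])
        H = np.array([[1.0, 0.0]])          # only the first state is measured
        Q = np.diag([1e-4, 1e-4])           # process noise covariance
        R = np.array([[1e-2]])              # measurement noise covariance

        x_true = np.array([0.0, 1.0])
        x_hat = np.zeros(2)
        P = np.eye(2)

        for _ in range(200):
            # simulate the "real" plant and a noisy sensor
            x_true = A @ x_true + rng.multivariate_normal([0.0, 0.0], Q)
            z = H @ x_true + rng.multivariate_normal([0.0], R)

            # predict
            x_hat = A @ x_hat
            P = A @ P @ A.T + Q
            # update
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)
            x_hat = x_hat + K @ (z - H @ x_hat)
            P = (np.eye(2) - K @ H) @ P

        print(f"true unmeasured state {x_true[1]:.3f}, estimate {x_hat[1]:.3f}")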

  3. Enhanced Engine Performance During Emergency Operation Using a Model-Based Engine Control Architecture

    NASA Technical Reports Server (NTRS)

    Csank, Jeffrey T.; Connolly, Joseph W.

    2015-01-01

    This paper discusses the design and application of model-based engine control (MBEC) for use during emergency operation of the aircraft. The MBEC methodology is applied to the Commercial Modular Aero-Propulsion System Simulation 40,000 (CMAPSS40,000) and features an optimal tuner Kalman Filter (OTKF) to estimate unmeasured engine parameters, which can then be used for control. During an emergency scenario, normally-conservative engine operating limits may be relaxed to increase the performance of the engine and overall survivability of the aircraft; this comes at the cost of additional risk of an engine failure. The MBEC architecture offers the advantage of estimating key engine parameters that are not directly measurable. Estimating the unknown parameters allows for tighter control over these parameters, and over the level of risk at which the engine will operate. This will allow the engine to achieve better performance than possible when operating to more conservative limits on a related, measurable parameter.

  4. Comparing and validating models of driver steering behaviour in collision avoidance and vehicle stabilisation

    NASA Astrophysics Data System (ADS)

    Markkula, G.; Benderius, O.; Wahde, M.

    2014-12-01

    A number of driver models were fitted to a large data set of human truck driving, from a simulated near-crash, low-friction scenario, yielding two main insights: steering to avoid a collision was best described as an open-loop manoeuvre of predetermined duration, but with situation-adapted amplitude, and subsequent vehicle stabilisation could to a large extent be accounted for by a simple yaw rate nulling control law. These two phenomena, which could be hypothesised to generalise to passenger car driving, were found to determine the ability of four driver models adopted from the literature to fit the human data. Based on the obtained results, it is argued that the concept of internal vehicle models may be less valuable when modelling driver behaviour in non-routine situations such as near-crashes, where behaviour may be better described as direct responses to salient perceptual cues. Some methodological issues in comparing and validating driver models are also discussed.
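
    The two phenomena can be illustrated with a toy simulation (hypothetical gains, durations and vehicle dynamics; not the authors' fitted models): an open-loop steering pulse of fixed duration and situation-scaled amplitude, followed by a yaw-rate nulling law in which the steering rate is proportional to the negative yaw rate.

        # Illustrative sketch: open-loop avoidance pulse followed by yaw-rate nulling,
        # applied to a toy first-order yaw-rate model.
        dt, T = 0.01, 6.0
        steps = int(T / dt)

        gain, tau = 3.0, 0.4        # toy yaw-rate gain (1/s per rad) and time constant (s)
        pulse_dur = 0.8             # fixed open-loop manoeuvre duration (s), hypothetical
        pulse_amp = 0.15            # situation-adapted steering amplitude (rad), hypothetical
        k_null = 2.0                # yaw-rate nulling gain, hypothetical

        steer, yaw_rate = 0.0, 0.0
        log = []
        for i in range(steps):
            t = i * dt
            if t < pulse_dur:
                steer = pulse_amp                      # open-loop collision-avoidance phase
            else:
                steer += -k_null * yaw_rate * dt       # stabilisation: null the yaw rate
            yaw_rate += dt * (gain * steer - yaw_rate) / tau
            log.append((t, steer, yaw_rate))

        print(f"final yaw rate after stabilisation: {log[-1][2]:.4f} rad/s")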

  5. Solar thermal technologies benefits assessment: Objectives, methodologies and results for 1981

    NASA Astrophysics Data System (ADS)

    Gates, W. R.

    1982-07-01

    The economic and social benefits of developing cost competitive solar thermal technologies (STT) were assessed. The analysis was restricted to STT in electric applications for 16 high insolation/high energy price states. Three fuel price scenarios and three 1990 STT system costs were considered, reflecting uncertainty over fuel prices and STT cost projections. After considering the numerous benefits of introducing STT into the energy market, three primary benefits were identified and evaluated: (1) direct energy cost savings were estimated to range from zero to $50 billion; (2) oil imports may be reduced by up to 9 percent, improving national security; and (3) significant environmental benefits can be realized in air basins where electric power plant emissions create substantial air pollution problems. STT research and development was found to be unacceptably risky for private industry in the absence of federal support. The normal risks associated with investments in research and development are accentuated because the OPEC cartel can artificially manipulate oil prices and undercut the growth of alternative energy sources.

  6. Indoor Navigation from Point Clouds: 3d Modelling and Obstacle Detection

    NASA Astrophysics Data System (ADS)

    Díaz-Vilariño, L.; Boguslawski, P.; Khoshelham, K.; Lorenzo, H.; Mahdjoubi, L.

    2016-06-01

    In recent years, indoor modelling and navigation have become a topic of research interest because many stakeholders require navigation assistance in various application scenarios. Navigational assistance for blind people or wheelchair users, building crisis management such as fire protection, augmented reality for gaming, tourism, and the training of emergency assistance units are just some of the direct applications of indoor modelling and navigation. Navigational information is traditionally extracted from 2D drawings or layouts. The real state of indoor spaces, including the position and geometry of openings such as windows and doors, and the presence of obstacles, is commonly ignored. In this work, a real indoor-path planning methodology based on 3D point clouds is developed. The value and originality of the approach consist in considering point clouds not only for reconstructing semantically rich 3D indoor models, but also for detecting potential obstacles during route planning and using these to readapt the routes according to the real state of the indoor space depicted by the laser scanner.
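
    A minimal sketch of the obstacle-detection step, assuming a hypothetical point cloud and grid resolution (this is not the authors' pipeline), rasterises points in the walkable height band into a 2D occupancy grid that a route planner could treat as obstacle cells.

        # Illustrative sketch: point cloud to 2D occupancy grid for obstacle detection.
        import numpy as np

        rng = np.random.default_rng(5)
        # hypothetical point cloud: N x 3 array of (x, y, z) in metres
        points = rng.uniform([0, 0, 0], [10.0, 8.0, 2.5], size=(50_000, 3))

        cell = 0.1                                   # grid resolution (m)
        z_min, z_max = 0.1, 1.9                      # height band relevant for a walking person
        band = points[(points[:, 2] > z_min) & (points[:, 2] < z_max)]

        nx, ny = int(10.0 / cell), int(8.0 / cell)
        ix = np.clip((band[:, 0] / cell).astype(int), 0, nx - 1)
        iy = np.clip((band[:, 1] / cell).astype(int), 0, ny - 1)

        occupancy = np.zeros((nx, ny), dtype=bool)
        occupancy[ix, iy] = True                     # any point in the band marks the cell occupied

        print(f"{occupancy.sum()} of {occupancy.size} cells marked as potential obstacles")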

  7. A methodology for secure recovery of spacecrafts based on a trusted hardware platform

    NASA Astrophysics Data System (ADS)

    Juliato, Marcio; Gebotys, Catherine

    2017-02-01

    This paper proposes a methodology for the secure recovery of spacecraft and of their cryptographic capabilities in emergency scenarios resulting from major unintentional failures and malicious attacks. The proposed approach employs trusted modules to achieve higher reliability and security levels in space missions due to the presence of integrity check capabilities as well as secure recovery mechanisms. Additionally, several recovery protocols are thoroughly discussed and analyzed against a wide variety of attacks. Exhaustive-search attacks are analyzed in a wide variety of contexts and shown to be infeasible, independently of the computational power of attackers. Experimental results have shown that the proposed methodology allows for the fast and secure recovery of spacecraft, demanding minimal implementation area, power consumption and bandwidth.

  8. Digital Methodology to implement the ECOUTER engagement process.

    PubMed

    Wilson, Rebecca C; Butters, Oliver W; Clark, Tom; Minion, Joel; Turner, Andrew; Murtagh, Madeleine J

    2016-01-01

    ECOUTER (Employing COnceptUal schema for policy and Translation Engagement in Research; French for 'to listen') is a new stakeholder engagement method incorporating existing evidence to help participants draw upon their own knowledge of cognate issues and interact on a topic of shared concern. The results of an ECOUTER can form the basis of recommendations for research, governance, practice and/or policy. This paper describes the development of a digital methodology for the ECOUTER engagement process based on currently available mind mapping freeware software. The implementation of an ECOUTER process tailored to applications within health studies is outlined for both online and face-to-face scenarios. Limitations of the present digital methodology are discussed, highlighting the requirement for purpose-built software for ECOUTER research purposes.

  9. Mars Science Laboratory CHIMRA/IC/DRT Flight Software for Sample Acquisition and Processing

    NASA Technical Reports Server (NTRS)

    Kim, Won S.; Leger, Chris; Carsten, Joseph; Helmick, Daniel; Kuhn, Stephen; Redick, Richard; Trujillo, Diana

    2013-01-01

    The design methodologies of using sequence diagrams, multi-process functional flow diagrams, and hierarchical state machines were successfully applied in designing three MSL (Mars Science Laboratory) flight software modules responsible for handling actuator motions of the CHIMRA (Collection and Handling for In Situ Martian Rock Analysis), IC (Inlet Covers), and DRT (Dust Removal Tool) mechanisms. The methodologies were essential to specify complex interactions with other modules, support concurrent foreground and background motions, and handle various fault protections. Studying task scenarios with multi-process functional flow diagrams yielded great insight to overall design perspectives. Since the three modules require three different levels of background motion support, the methodologies presented in this paper provide an excellent comparison. All three modules are fully operational in flight.

  10. System Interdependency Modeling in the Design of Prognostic and Health Management Systems in Smart Manufacturing.

    PubMed

    Malinowski, M L; Beling, P A; Haimes, Y Y; LaViers, A; Marvel, J A; Weiss, B A

    2015-01-01

    The fields of risk analysis and prognostics and health management (PHM) have developed in a largely independent fashion. However, both fields share a common core goal. They aspire to manage future adverse consequences associated with prospective dysfunctions of the systems under consideration due to internal or external forces. This paper describes how two prominent risk analysis theories and methodologies - Hierarchical Holographic Modeling (HHM) and Risk Filtering, Ranking, and Management (RFRM) - can be adapted to support the design of PHM systems in the context of smart manufacturing processes. Specifically, the proposed methodologies will be used to identify targets - components, subsystems, or systems - that would most benefit from a PHM system in regards to achieving the following objectives: minimizing cost, minimizing production/maintenance time, maximizing system remaining usable life (RUL), maximizing product quality, and maximizing product output. HHM is a comprehensive modeling theory and methodology that is grounded on the premise that no system can be modeled effectively from a single perspective. It can also be used as an inductive method for scenario structuring to identify emergent forced changes (EFCs) in a system. EFCs connote trends in external or internal sources of risk to a system that may adversely affect specific states of the system. An important aspect of proactive risk management includes bolstering the resilience of the system for specific EFCs by appropriately controlling the states. Risk scenarios for specific EFCs can be the basis for the design of prognostic and diagnostic systems that provide real-time predictions and recognition of scenario changes. The HHM methodology includes visual modeling techniques that can enhance stakeholders' understanding of shared states, resources, objectives and constraints among the interdependent and interconnected subsystems of smart manufacturing systems. In risk analysis, HHM is often paired with Risk Filtering, Ranking, and Management (RFRM). The RFRM process provides the users, (e.g., technology developers, original equipment manufacturers (OEMs), technology integrators, manufacturers), with the most critical risks to the objectives, which can be used to identify the most critical components and subsystems that would most benefit from a PHM system. A case study is presented in which HHM and RFRM are adapted for PHM in the context of an active manufacturing facility located in the United States. The methodologies help to identify the critical risks to the manufacturing process, and the major components and subsystems that would most benefit from a developed PHM system.
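
    The filtering and ranking idea can be sketched as a simple likelihood-times-consequence scoring over weighted objectives; the subsystems, scores, weights and threshold below are hypothetical, and the sketch is not the HHM/RFRM procedure itself.

        # Illustrative sketch: score risk scenarios against weighted objectives,
        # then filter to the highest-ranked PHM candidates.
        from dataclasses import dataclass

        @dataclass
        class RiskScenario:
            subsystem: str
            likelihood: float          # 0..1, hypothetical expert estimate
            consequence: dict          # impact per objective, 0..5 scale, hypothetical

        scenarios = [
            RiskScenario("spindle drive", 0.30, {"cost": 4, "downtime": 5, "quality": 3}),
            RiskScenario("coolant pump", 0.10, {"cost": 2, "downtime": 2, "quality": 1}),
            RiskScenario("tool changer", 0.20, {"cost": 3, "downtime": 4, "quality": 2}),
        ]

        weights = {"cost": 0.4, "downtime": 0.4, "quality": 0.2}   # hypothetical priorities

        def score(s: RiskScenario) -> float:
            severity = sum(weights[k] * v for k, v in s.consequence.items())
            return s.likelihood * severity

        ranked = sorted(scenarios, key=score, reverse=True)
        threshold = 0.6                                            # filtering cut-off, hypothetical
        targets = [s.subsystem for s in ranked if score(s) >= threshold]
        print("PHM candidates:", targets)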

  11. System Interdependency Modeling in the Design of Prognostic and Health Management Systems in Smart Manufacturing

    PubMed Central

    Malinowski, M.L.; Beling, P.A.; Haimes, Y.Y.; LaViers, A.; Marvel, J.A.; Weiss, B.A.

    2017-01-01

    The fields of risk analysis and prognostics and health management (PHM) have developed in a largely independent fashion. However, both fields share a common core goal. They aspire to manage future adverse consequences associated with prospective dysfunctions of the systems under consideration due to internal or external forces. This paper describes how two prominent risk analysis theories and methodologies – Hierarchical Holographic Modeling (HHM) and Risk Filtering, Ranking, and Management (RFRM) – can be adapted to support the design of PHM systems in the context of smart manufacturing processes. Specifically, the proposed methodologies will be used to identify targets – components, subsystems, or systems – that would most benefit from a PHM system in regards to achieving the following objectives: minimizing cost, minimizing production/maintenance time, maximizing system remaining usable life (RUL), maximizing product quality, and maximizing product output. HHM is a comprehensive modeling theory and methodology that is grounded on the premise that no system can be modeled effectively from a single perspective. It can also be used as an inductive method for scenario structuring to identify emergent forced changes (EFCs) in a system. EFCs connote trends in external or internal sources of risk to a system that may adversely affect specific states of the system. An important aspect of proactive risk management includes bolstering the resilience of the system for specific EFCs by appropriately controlling the states. Risk scenarios for specific EFCs can be the basis for the design of prognostic and diagnostic systems that provide real-time predictions and recognition of scenario changes. The HHM methodology includes visual modeling techniques that can enhance stakeholders’ understanding of shared states, resources, objectives and constraints among the interdependent and interconnected subsystems of smart manufacturing systems. In risk analysis, HHM is often paired with Risk Filtering, Ranking, and Management (RFRM). The RFRM process provides the users, (e.g., technology developers, original equipment manufacturers (OEMs), technology integrators, manufacturers), with the most critical risks to the objectives, which can be used to identify the most critical components and subsystems that would most benefit from a PHM system. A case study is presented in which HHM and RFRM are adapted for PHM in the context of an active manufacturing facility located in the United States. The methodologies help to identify the critical risks to the manufacturing process, and the major components and subsystems that would most benefit from a developed PHM system. PMID:28664162

  12. Construction of Gridded Daily Weather Data and its Use in Central-European Agroclimatic Study

    NASA Astrophysics Data System (ADS)

    Dubrovsky, M.; Trnka, M.; Skalak, P.

    2013-12-01

    The regional-scale simulations of weather-sensitive processes (e.g. hydrology, agriculture and forestry) for the present and/or future climate often require high resolution meteorological inputs in terms of the time series of selected surface weather characteristics (typically temperature, precipitation, solar radiation, humidity, wind) for a set of stations or on a regular grid. As even the latest Global and Regional Climate Models (GCMs and RCMs) do not provide realistic representation of statistical structure of the surface weather, the model outputs must be postprocessed (downscaled) to achieve the desired statistical structure of the weather data before being used as an input to the follow-up simulation models. One of the downscaling approaches, which is employed also here, is based on a weather generator (WG), which is calibrated using the observed weather series, interpolated, and then modified according to the GCM- or RCM-based climate change scenarios. The present contribution, in which the parametric daily weather generator M&Rfi is linked to the high-resolution RCM output (ALADIN-Climate/CZ model) and GCM-based climate change scenarios, consists of two parts: The first part focuses on a methodology. Firstly, the gridded WG representing the baseline climate is created by merging information from observations and high resolution RCM outputs. In this procedure, WG is calibrated with RCM-simulated multi-variate weather series, and the grid specific WG parameters are then de-biased by spatially interpolated correction factors based on comparison of WG parameters calibrated with RCM-simulated weather series vs. spatially scarcer observations. To represent the future climate, the WG parameters are modified according to the 'WG-friendly' climate change scenarios. These scenarios are defined in terms of changes in WG parameters and include - apart from changes in the means - changes in WG parameters, which represent the additional characteristics of the weather series (e.g. probability of wet day occurrence and lag-1 autocorrelation of daily mean temperature). The WG-friendly scenarios for the present experiment are based on comparison of future vs baseline surface weather series simulated by GCMs from a CMIP3 database. The second part will present results of climate change impact study based on an above methodology applied to Central Europe. The changes in selected climatic (focusing on the extreme precipitation and temperature characteristics) and agroclimatic (including number of days during vegetation season with heat and drought stresses) characteristics will be analysed. In discussing the results, the emphasis will be put on 'added value' of various aspects of above methodology (e.g. inclusion of changes in 'advanced' WG parameters into the climate change scenarios). Acknowledgements: The present experiment is made within the frame of projects WG4VALUE (project LD12029 sponsored by the Ministry of Education, Youth and Sports of CR), ALARO-Climate (project P209/11/2405 sponsored by the Czech Science Foundation), and VALUE (COST ES 1102 action).
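
    The de-biasing step can be sketched as follows, with hypothetical stations, grids and parameter values (this is not the M&Rfi generator): station-level correction factors (observed versus RCM-calibrated parameter) are spread to the grid, here by nearest neighbour as a crude stand-in for spatial interpolation, and applied to the RCM-calibrated gridded parameters.

        # Illustrative sketch: de-bias a gridded weather-generator parameter with
        # station-based correction factors.
        import numpy as np

        rng = np.random.default_rng(11)

        # Hypothetical stations: coordinates and a WG parameter (e.g. wet-day probability)
        stations = rng.uniform(0.0, 100.0, size=(12, 2))
        p_obs = rng.uniform(0.25, 0.40, 12)                      # calibrated from observations
        p_rcm_at_station = p_obs * rng.uniform(0.8, 1.2, 12)     # calibrated from RCM series
        correction = p_obs / p_rcm_at_station

        # Hypothetical grid of RCM-calibrated parameters
        gx, gy = np.meshgrid(np.linspace(0, 100, 50), np.linspace(0, 100, 40))
        p_rcm_grid = rng.uniform(0.25, 0.45, gx.shape)

        # Nearest-station correction factor for every grid cell
        d2 = (gx[..., None] - stations[:, 0])**2 + (gy[..., None] - stations[:, 1])**2
        nearest = d2.argmin(axis=-1)
        p_debiased = p_rcm_grid * correction[nearest]

        print(f"mean parameter before {p_rcm_grid.mean():.3f}, after de-biasing {p_debiased.mean():.3f}")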

  13. Comparisons of Simulated Hydrodynamics and Water Quality for Projected Demands in 2046, Pueblo Reservoir, Southeastern Colorado

    USGS Publications Warehouse

    Ortiz, Roderick F.; Galloway, Joel M.; Miller, Lisa D.; Mau, David P.

    2008-01-01

    Pueblo Reservoir is one of southeastern Colorado's most valuable water resources. The reservoir provides irrigation, municipal, and industrial water to various entities throughout the region. The reservoir also provides flood control, recreational activities, sport fishing, and wildlife enhancement to the region. The Bureau of Reclamation is working to meet its goal to issue a Final Environmental Impact Statement (EIS) on the Southern Delivery System project (SDS). SDS is a regional water-delivery project that has been proposed to provide a safe, reliable, and sustainable water supply through the foreseeable future (2046) for Colorado Springs, Fountain, Security, and Pueblo West. Discussions with the Bureau of Reclamation and the U.S. Geological Survey led to a cooperative agreement to simulate the hydrodynamics and water quality of Pueblo Reservoir. This work has been completed and described in a previously published report, U.S. Geological Survey Scientific Investigations Report 2008-5056. Additionally, there was a need to make comparisons of simulated hydrodynamics and water quality for projected demands associated with the various EIS alternatives and plans by Pueblo West to discharge treated water into the reservoir. Plans by Pueblo West are fully independent of the SDS project. This report compares simulated hydrodynamics and water quality for projected demands in Pueblo Reservoir resulting from changes in inflow and water quality entering the reservoir, and from changes to withdrawals from the reservoir as projected for the year 2046. Four of the seven EIS alternatives were selected for scenario simulations. The four U.S. Geological Survey simulation scenarios were the No Action scenario (EIS Alternative 1), the Downstream Diversion scenario (EIS Alternative 2), the Upstream Return-Flow scenario (EIS Alternative 4), and the Upstream Diversion scenario (EIS Alternative 7). Additionally, the results of an Existing Conditions scenario (water years 2000 through 2002) were compared to the No Action scenario (projected demands in 2046) to assess changes in water quality over time. All scenario modeling used an external nutrient-decay model to simulate degradation and assimilation of nutrients along the riverine reach upstream from Pueblo Reservoir. Reservoir modeling was conducted using the U.S. Army Corps of Engineers CE-QUAL-W2 two-dimensional water-quality model. Lake hydrodynamics, water temperature, dissolved oxygen, dissolved solids, dissolved ammonia, dissolved nitrate, total phosphorus, algal biomass, and total iron were simulated. Two reservoir site locations were selected for comparison. Results of simulations at site 3B were characteristic of a riverine environment in the reservoir while results at site 7B (near the dam) were characteristic of the main body of the reservoir. Simulation results for the epilimnion and hypolimnion at these two sites also were evaluated and compared. The simulation results in the hypolimnion at site 7B were indicative of the water quality leaving the reservoir. Comparisons of the different scenario results were conducted to assess if substantial differences were observed between selected scenarios. Each of the scenarios was simulated for three contiguous years representing a wet, average, and dry annual hydrologic cycle (water years 2000 through 2002). Additionally, each selected simulation scenario was evaluated for differences in direct- and cumulative-effects on a particular scenario. 
Direct effects are intended to isolate the future effects of the scenarios. Cumulative effects are intended to evaluate the effects of the scenarios in conjunction with all reasonably foreseeable future activities in the study area. Comparisons between the direct- and cumulative-effects analyses indicated that there were not large differences in the results between most of the simulation scenarios and, as such, the focus of this report was on results for the direct-effects analysis. Addi

  14. Revised Comparisons of Simulated Hydrodynamics and Water Quality for Projected Demands in 2046, Pueblo Reservoir, Southeastern Colorado

    USGS Publications Warehouse

    Ortiz, Roderick F.; Miller, Lisa D.

    2009-01-01

    Pueblo Reservoir is one of southeastern Colorado's most valuable water resources. The reservoir provides irrigation, municipal, and industrial water to various entities throughout the region. The reservoir also provides flood control, recreational activities, sport fishing, and wildlife enhancement to the region. The Southern Delivery System (SDS) project is a regional water-delivery project that has been proposed to provide a safe, reliable, and sustainable water supply through the foreseeable future (2046) for Colorado Springs, Fountain, Security, and Pueblo West. Discussions with the Bureau of Reclamation and the U.S. Geological Survey led to a cooperative agreement to simulate the hydrodynamics and water quality of Pueblo Reservoir. This work has been completed and described in a previously published report, U.S. Geological Survey Scientific Investigations Report 2008-5056. Additionally, there was a need to make comparisons of simulated hydrodynamics and water quality for projected demands associated with the various Environmental Impact Statements (EIS) alternatives and plans by Pueblo West to discharge treated wastewater into the reservoir. Wastewater plans by Pueblo West are fully independent of the SDS project. This report compares simulated hydrodynamics and water quality for projected demands in Pueblo Reservoir resulting from changes in inflow and water quality entering the reservoir, and from changes to withdrawals from the reservoir as projected for the year 2046. Four of the seven EIS alternatives were selected for scenario simulations. The four U.S. Geological Survey simulation scenarios were the No Action scenario (EIS Alternative 1), the Downstream Diversion scenario (EIS Alternative 2), the Upstream Return-Flow scenario (EIS Alternative 4), and the Upstream Diversion scenario (EIS Alternative 7). Additionally, the results of an Existing Conditions scenario (year 2006 demand conditions) were compared to the No Action scenario (projected demands in 2046) to assess changes in water quality over time. All scenario modeling used an external nutrient-decay model to simulate degradation and assimilation of nutrients along the riverine reach upstream from Pueblo Reservoir. Reservoir modeling was conducted using the U.S. Army Corps of Engineers CE-QUAL-W2 two-dimensional water-quality model. Lake hydrodynamics, water temperature, dissolved oxygen, dissolved solids, dissolved ammonia, dissolved nitrate, total phosphorus, algal biomass, and total iron were simulated. Two reservoir site locations were selected for comparison. Results of simulations at site 3B were characteristic of a riverine environment in the reservoir, whereas results at site 7B (near the dam) were characteristic of the main body of the reservoir. Simulation results for the epilimnion and hypolimnion at these two sites also were evaluated and compared. The simulation results in the hypolimnion at site 7B were indicative of the water quality leaving the reservoir. Comparisons of the different scenario results were conducted to assess if substantial differences were observed between selected scenarios. Each of the scenarios was simulated for three contiguous years representing a wet, average, and dry annual hydrologic cycle (water years 2000 through 2002). Additionally, each selected simulation scenario was evaluated for differences in direct and cumulative effects on a particular scenario. Direct effects are intended to isolate the future effects of the scenarios. 
Cumulative effects are intended to evaluate the effects of the scenarios in conjunction with all reasonably foreseeable future activities in the study area. Comparisons between the direct- and cumulative-effects analyses indicated that there were not large differences in the results between most of the simulation scenarios, and, as such, the focus of this report was on results for the direct-effects analysis. Additionally, the differences between simulation results generally were

  15. Multi-criteria decision analysis of concentrated solar power with thermal energy storage and dry cooling.

    PubMed

    Klein, Sharon J W

    2013-12-17

    Decisions about energy backup and cooling options for parabolic trough (PT) concentrated solar power have technical, economic, and environmental implications. Although PT development has increased rapidly in recent years, energy policies do not address backup or cooling option requirements, and very few studies directly compare the diverse implications of these options. This is the first study to compare the annual capacity factor, levelized cost of energy (LCOE), water consumption, land use, and life cycle greenhouse gas (GHG) emissions of PT with different backup options (minimal backup (MB), thermal energy storage (TES), and fossil fuel backup (FF)) and different cooling options (wet (WC) and dry (DC)). Multicriteria decision analysis was used with five preference scenarios to identify the highest-scoring energy backup-cooling combination for each preference scenario. MB-WC had the highest score in the Economic and Climate Change-Economy scenarios, while FF-DC and FF-WC had the highest scores in the Equal and Availability scenarios, respectively. TES-DC had the highest score for the Environmental scenario. DC was ranked 1-3 in all preference scenarios. Direct comparisons between GHG emissions and LCOE and between GHG emissions and land use suggest a preference for TES if backup is required for PT plants to compete with baseload generators.
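
    A weighted-sum version of the multi-criteria scoring can be sketched as below; the options follow the abstract, but the normalised criterion values, weights and preference scenarios are hypothetical, so the printed 'best' options are illustrative only.

        # Illustrative sketch: weighted-sum multi-criteria scoring of backup/cooling options.
        import numpy as np

        options = ["MB-WC", "MB-DC", "TES-WC", "TES-DC", "FF-WC", "FF-DC"]
        criteria = ["capacity factor", "LCOE", "water use", "land use", "GHG"]

        # rows: options, columns: criteria (hypothetical normalised scores, 1.0 = best)
        scores = np.array([
            [0.30, 0.90, 0.40, 0.90, 0.85],
            [0.28, 0.85, 0.95, 0.88, 0.86],
            [0.80, 0.60, 0.35, 0.55, 0.80],
            [0.78, 0.55, 0.92, 0.52, 0.82],
            [0.95, 0.70, 0.30, 0.70, 0.20],
            [0.93, 0.65, 0.90, 0.68, 0.22],
        ])

        preference_scenarios = {
            "Equal":         np.full(5, 0.2),
            "Economic":      np.array([0.25, 0.45, 0.10, 0.10, 0.10]),
            "Environmental": np.array([0.10, 0.10, 0.30, 0.20, 0.30]),
        }

        for name, w in preference_scenarios.items():
            total = scores @ w
            best = options[int(total.argmax())]
            print(f"{name:>13}: best option {best} (score {total.max():.2f})")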

  16. Scenario driven data modelling: a method for integrating diverse sources of data and data streams

    PubMed Central

    2011-01-01

    Background Biology is rapidly becoming a data intensive, data-driven science. It is essential that data is represented and connected in ways that best represent its full conceptual content and allow both automated integration and data driven decision-making. Recent advancements in distributed multi-relational directed graphs, implemented in the form of the Semantic Web, make it possible to deal with complicated heterogeneous data in new and interesting ways. Results This paper presents a new approach, scenario driven data modelling (SDDM), that integrates multi-relational directed graphs with data streams. SDDM can be applied to virtually any data integration challenge with widely divergent types of data and data streams. In this work, we explored integrating genetics data with reports from traditional media. SDDM was applied to the New Delhi metallo-beta-lactamase gene (NDM-1), an emerging global health threat. The SDDM process constructed a scenario, created an RDF multi-relational directed graph that linked diverse types of data to the Semantic Web, implemented RDF conversion tools (RDFizers) to bring content into the Semantic Web, identified data streams and analytical routines to analyse those streams, and identified user requirements and graph traversals to meet end-user requirements. Conclusions We provided an example where SDDM was applied to a complex data integration challenge. The process created a model of the emerging NDM-1 health threat, identified and filled gaps in that model, and constructed reliable software that monitored data streams based on the scenario derived multi-relational directed graph. The SDDM process significantly reduced the software requirements phase by letting the scenario and resulting multi-relational directed graph define what is possible and then set the scope of the user requirements. Approaches like SDDM will be critical to the future of data intensive, data-driven science because they automate the process of converting massive data streams into usable knowledge. PMID:22165854
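
    A minimal sketch of the kind of multi-relational directed graph SDDM builds is shown below, assuming the third-party rdflib package and hypothetical URIs; it links a resistance gene to an organism and to a media report as RDF triples and runs a simple traversal.

        # Illustrative sketch (hypothetical URIs; requires the rdflib package).
        from rdflib import Graph, Literal, Namespace, RDF

        EX = Namespace("http://example.org/sddm/")

        g = Graph()
        g.bind("ex", EX)

        g.add((EX.NDM1, RDF.type, EX.ResistanceGene))
        g.add((EX.NDM1, EX.foundIn, EX.KlebsiellaPneumoniae))
        g.add((EX.report42, RDF.type, EX.MediaReport))
        g.add((EX.report42, EX.mentions, EX.NDM1))
        g.add((EX.report42, EX.publishedIn, Literal("New Delhi")))

        # Traverse the graph: which reports mention the gene?
        for report in g.subjects(EX.mentions, EX.NDM1):
            print(f"report {report} mentions NDM-1")

        print(g.serialize(format="turtle"))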

  17. Estimating human exposure to PFOS isomers and PFCA homologues: the relative importance of direct and indirect (precursor) exposure.

    PubMed

    Gebbink, Wouter A; Berger, Urs; Cousins, Ian T

    2015-01-01

    Contributions of direct and indirect (via precursors) pathways of human exposure to perfluorooctane sulfonic acid (PFOS) isomers and perfluoroalkyl carboxylic acids (PFCAs) are estimated using a Scenario-Based Risk Assessment (SceBRA) modelling approach. Monitoring data published since 2008 (including samples from 2007) are used. The estimated daily exposures (resulting from both direct and precursor intake) for the general adult population are highest for PFOS and perfluorooctanoic acid (PFOA), followed by perfluorohexanoic acid (PFHxA) and perfluorodecanoic acid (PFDA), while lower daily exposures are estimated for perfluorobutanoic acid (PFBA) and perfluorododecanoic acid (PFDoDA). The precursor contributions to the individual perfluoroalkyl acid (PFAA) daily exposures are estimated to be 11-33% for PFOS, 0.1-2.5% for PFBA, 3.7-34% for PFHxA, 13-64% for PFOA, 5.2-66% for PFDA, and 0.7-25% for PFDoDA (ranges represent estimated precursor contributions in a low- and high-exposure scenario). For PFOS, direct intake via diet is the major exposure pathway regardless of exposure scenario. For PFCAs, the dominant exposure pathway is dependent on perfluoroalkyl chain length and exposure scenario. Modelled PFOS and PFOA concentrations in human serum using the estimated intakes from an intermediate-exposure scenario are in agreement with measured concentrations in different populations. The isomer pattern of PFOS resulting from total intakes (direct and via precursors) is estimated to be enriched with linear PFOS (84%) relative to technical PFOS (70% linear). This finding appears to be contradictory to the observed enrichment of branched PFOS isomers in recent human serum monitoring studies and suggests that either external exposure is not fully understood (e.g. there are unknown precursors, missing or poorly quantified exposure pathways) and/or that there is an incomplete understanding of the isomer-specific human pharmacokinetic processes of PFOS, its precursors and intermediates. Copyright © 2014. Published by Elsevier Ltd.
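
    The pathway bookkeeping behind such estimates reduces to summing direct and precursor-derived intakes per scenario and reporting the precursor share; the sketch below uses hypothetical intake values, not the SceBRA model outputs.

        # Illustrative sketch: combine direct and precursor-derived intake for one PFAA
        # and report the precursor contribution under low- and high-exposure assumptions.
        daily_intake_ng_per_kg_bw = {
            # (direct intake, precursor-derived intake) per exposure scenario, hypothetical
            "low":  {"direct": 0.30, "precursor": 0.04},
            "high": {"direct": 2.00, "precursor": 1.00},
        }

        for scenario, intake in daily_intake_ng_per_kg_bw.items():
            total = intake["direct"] + intake["precursor"]
            share = 100.0 * intake["precursor"] / total
            print(f"{scenario:>4}-exposure scenario: total {total:.2f} ng/kg bw/day, "
                  f"precursor contribution {share:.0f}%")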

  18. Life cycle assessment of introducing an anaerobic digester in a municipal wastewater treatment plant in Spain.

    PubMed

    Blanco, David; Collado, Sergio; Laca, Adriana; Díaz, Mario

    2016-01-01

    Anaerobic digestion (AD) is being established as a standard technology to recover some of the energy contained in the sludge in wastewater treatment plants (WWTPs) as biogas, allowing savings in electricity and heating and a decrease in greenhouse gas emissions. The purpose of this study was to quantify the contributions to the total environmental impact of the plant using life cycle assessment methodology. In this work, data from real operation during 2012 of a municipal WWTP were utilized as the basis to determine the impact of including AD in the process. Climate change (human health) was the most important impact category when AD was included in the treatment (Scenario 1), especially due to fossil carbon dioxide emissions. Without AD (Scenario 2), increased emissions of greenhouse gases, mostly derived from the use of electricity, provoked a rise in the climate change categories. Biogas utilization was able to provide 47% of the energy required in the WWTP in Scenario 1. The results obtained make Scenario 1 by far the better environmental choice, mainly due to the use of the digested sludge as fertilizer.

  19. Evaluating uncertainty in environmental life-cycle assessment. A case study comparing two insulation options for a Dutch one-family dwelling.

    PubMed

    Huijbregts, Mark A J; Gilijamse, Wim; Ragas, Ad M J; Reijnders, Lucas

    2003-06-01

    The evaluation of uncertainty is relatively new in environmental life-cycle assessment (LCA). It provides useful information to assess the reliability of LCA-based decisions and to guide future research toward reducing uncertainty. Most uncertainty studies in LCA quantify only one type of uncertainty, i.e., uncertainty due to input data (parameter uncertainty). However, LCA outcomes can also be uncertain due to normative choices (scenario uncertainty) and the mathematical models involved (model uncertainty). The present paper outlines a new methodology that quantifies parameter, scenario, and model uncertainty simultaneously in environmental life-cycle assessment. The procedure is illustrated in a case study that compares two insulation options for a Dutch one-family dwelling. Parameter uncertainty was quantified by means of Monte Carlo simulation. Scenario and model uncertainty were quantified by resampling different decision scenarios and model formulations, respectively. Although scenario and model uncertainty were not quantified comprehensively, the results indicate that both types of uncertainty influence the case study outcomes. This stresses the importance of quantifying parameter, scenario, and model uncertainty simultaneously. The two insulation options studied were found to have significantly different impact scores for global warming, stratospheric ozone depletion, and eutrophication. The thickest insulation option has the lowest impact on global warming and eutrophication, and the highest impact on stratospheric ozone depletion.
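
    As a concrete illustration of quantifying the three uncertainty types simultaneously, the sketch below combines Monte Carlo sampling of uncertain parameters with random resampling of a scenario choice and a model formulation in each iteration. The distributions, scenario values and model variants are invented for illustration and are not those of the case study.

      # Illustrative sketch: parameter, scenario, and model uncertainty in one loop.
      # Distributions, scenarios, and model variants are invented placeholders.
      import random

      def impact_model_a(energy_use, emission_factor):
          return energy_use * emission_factor           # one model formulation

      def impact_model_b(energy_use, emission_factor):
          return energy_use * emission_factor * 1.05    # an alternative formulation

      MODELS = [impact_model_a, impact_model_b]                  # model uncertainty
      LIFETIMES = {"short_lifetime": 30, "long_lifetime": 50}    # scenario uncertainty (years)

      def one_iteration(rng):
          # parameter uncertainty: sample uncertain inputs from assumed distributions
          energy_use = rng.lognormvariate(2.0, 0.2)        # kWh per functional unit
          emission_factor = rng.normalvariate(0.5, 0.05)   # kg CO2-eq per kWh
          # scenario and model uncertainty: resample a normative choice and a model form
          lifetime = rng.choice(list(LIFETIMES.values()))
          model = rng.choice(MODELS)
          return model(energy_use, emission_factor) / lifetime   # kg CO2-eq per year

      rng = random.Random(42)
      results = sorted(one_iteration(rng) for _ in range(10_000))
      print("median:", round(results[5_000], 3),
            "90% interval:", (round(results[500], 3), round(results[9_499], 3)))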

  20. 2016 Standard Scenarios Report: A U.S. Electricity Sector Outlook

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cole, Wesley; Mai, Trieu; Logan, Jeffrey

    The Standard Scenarios and this associated report, which are now in their second year, present an examination of some of the key aspects of the change occurring, or anticipated to occur, in the power sector over the next several decades. The Standard Scenarios consist of 18 power sector scenarios which have been projected using the National Renewable Energy Laboratory's (NREL's) Regional Energy Deployment System (ReEDS) long-term capacity expansion model and the dGen rooftop PV diffusion model. The purpose of the Standard Scenarios and this associated report is to provide context, discussion, and data to inform stakeholder decision-making regarding the future direction of the U.S. power sector. As an extension to this report, the Standard Scenario outputs are presented in a downloadable format online using the Standard Scenarios' Results Viewer at http://en.openei.org/apps/reeds/. This report reflects high-level conclusions and analysis, whereas the Standard Scenarios' Results Viewer includes the scenario results that can be used for more in-depth analysis.

  1. Semantic integration of gene expression analysis tools and data sources using software connectors

    PubMed Central

    2013-01-01

    Background The study and analysis of gene expression measurements is the primary focus of functional genomics. Once expression data is available, biologists are faced with the task of extracting (new) knowledge associated to the underlying biological phenomenon. Most often, in order to perform this task, biologists execute a number of analysis activities on the available gene expression dataset rather than a single analysis activity. The integration of heterogeneous tools and data sources to create an integrated analysis environment represents a challenging and error-prone task. Semantic integration enables the assignment of unambiguous meanings to data shared among different applications in an integrated environment, allowing the exchange of data in a semantically consistent and meaningful way. This work aims at developing an ontology-based methodology for the semantic integration of gene expression analysis tools and data sources. The proposed methodology relies on software connectors to support not only the access to heterogeneous data sources but also the definition of transformation rules on exchanged data. Results We have studied the different challenges involved in the integration of computer systems and the role software connectors play in this task. We have also studied a number of gene expression technologies, analysis tools and related ontologies in order to devise basic integration scenarios and propose a reference ontology for the gene expression domain. Then, we have defined a number of activities and associated guidelines to prescribe how the development of connectors should be carried out. Finally, we have applied the proposed methodology in the construction of three different integration scenarios involving the use of different tools for the analysis of different types of gene expression data. Conclusions The proposed methodology facilitates the development of connectors capable of semantically integrating different gene expression analysis tools and data sources. The methodology can be used in the development of connectors supporting both simple and nontrivial processing requirements, thus assuring accurate data exchange and information interpretation from exchanged data. PMID:24341380

  2. Semantic integration of gene expression analysis tools and data sources using software connectors.

    PubMed

    Miyazaki, Flávia A; Guardia, Gabriela D A; Vêncio, Ricardo Z N; de Farias, Cléver R G

    2013-10-25

    The study and analysis of gene expression measurements is the primary focus of functional genomics. Once expression data is available, biologists are faced with the task of extracting (new) knowledge associated to the underlying biological phenomenon. Most often, in order to perform this task, biologists execute a number of analysis activities on the available gene expression dataset rather than a single analysis activity. The integration of heterogeneous tools and data sources to create an integrated analysis environment represents a challenging and error-prone task. Semantic integration enables the assignment of unambiguous meanings to data shared among different applications in an integrated environment, allowing the exchange of data in a semantically consistent and meaningful way. This work aims at developing an ontology-based methodology for the semantic integration of gene expression analysis tools and data sources. The proposed methodology relies on software connectors to support not only the access to heterogeneous data sources but also the definition of transformation rules on exchanged data. We have studied the different challenges involved in the integration of computer systems and the role software connectors play in this task. We have also studied a number of gene expression technologies, analysis tools and related ontologies in order to devise basic integration scenarios and propose a reference ontology for the gene expression domain. Then, we have defined a number of activities and associated guidelines to prescribe how the development of connectors should be carried out. Finally, we have applied the proposed methodology in the construction of three different integration scenarios involving the use of different tools for the analysis of different types of gene expression data. The proposed methodology facilitates the development of connectors capable of semantically integrating different gene expression analysis tools and data sources. The methodology can be used in the development of connectors supporting both simple and nontrivial processing requirements, thus assuring accurate data exchange and information interpretation from exchanged data.
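
    To make the connector idea more tangible, the sketch below shows one possible (hypothetical) shape of such a component: it wraps access to a data source and applies transformation rules that map the source's local field names onto a shared reference-ontology vocabulary. The ontology terms, rules and source are invented for illustration, not taken from the paper.

      # Hypothetical sketch of a software connector applying ontology-based
      # transformation rules to records from a heterogeneous data source.
      from typing import Callable, Dict, List

      class Connector:
          def __init__(self, fetch: Callable[[], List[Dict]], rules: Dict[str, str]):
              self._fetch = fetch    # access to the underlying data source
              self._rules = rules    # local field name -> reference-ontology term

          def records(self) -> List[Dict]:
              """Return records re-expressed in the shared vocabulary."""
              return [{self._rules.get(key, key): value for key, value in rec.items()}
                      for rec in self._fetch()]

      # invented microarray source using its own column names
      def microarray_source() -> List[Dict]:
          return [{"probe_id": "AFFX-1", "signal": 523.1},
                  {"probe_id": "AFFX-2", "signal": 88.4}]

      connector = Connector(microarray_source,
                            rules={"probe_id": "gene_expression:probe",
                                   "signal": "gene_expression:intensity"})
      print(connector.records())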

  3. Hospital influenza pandemic stockpiling needs: A computer simulation.

    PubMed

    Abramovich, Mark N; Hershey, John C; Callies, Byron; Adalja, Amesh A; Tosh, Pritish K; Toner, Eric S

    2017-03-01

    A severe influenza pandemic could overwhelm hospitals but planning guidance that accounts for the dynamic interrelationships between planning elements is lacking. We developed a methodology to calculate pandemic supply needs based on operational considerations in hospitals and then tested the methodology at Mayo Clinic in Rochester, MN. We upgraded a previously designed computer modeling tool and input carefully researched resource data from the hospital to run 10,000 Monte Carlo simulations using various combinations of variables to determine resource needs across a spectrum of scenarios. Of 10,000 iterations, 1,315 fell within the parameters defined by our simulation design and logical constraints. From these valid iterations, we projected supply requirements by percentile for key supplies, pharmaceuticals, and personal protective equipment needed in a severe pandemic. We projected supply needs for a range of scenarios that use up to 100% of Mayo Clinic-Rochester's surge capacity of beds and ventilators. The results indicate that there are diminishing patient care benefits for stockpiling on the high side of the range, but that having some stockpile of critical resources, even if it is relatively modest, is most important. We were able to display the probabilities of needing various supply levels across a spectrum of scenarios. The tool could be used to model many other hospital preparedness issues, but validation in other settings is needed. Copyright © 2017 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.
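
    The simulation logic described above can be illustrated with a toy Monte Carlo run: draw scenario parameters, discard iterations that violate logical constraints (here, surge-bed capacity), and report supply needs by percentile. All capacities, rates and the single supply item are hypothetical placeholders, not Mayo Clinic data.

      # Toy Monte Carlo sketch of pandemic stockpile estimation with a validity filter.
      # All parameters are hypothetical placeholders.
      import random
      import statistics

      SURGE_BEDS = 200            # assumed surge bed capacity
      MASKS_PER_PATIENT_DAY = 8   # assumed consumption rate
      CATCHMENT = 100_000         # assumed population served

      def simulate(rng):
          attack_rate = rng.uniform(0.05, 0.30)   # fraction of catchment infected
          admit_rate = rng.uniform(0.01, 0.05)    # fraction of infected admitted
          mean_stay = rng.uniform(5, 14)          # days in hospital
          patients = CATCHMENT * attack_rate * admit_rate
          peak_census = patients * mean_stay / 90          # crude 90-day wave assumption
          if peak_census > SURGE_BEDS:                     # logical constraint
              return None
          return patients * mean_stay * MASKS_PER_PATIENT_DAY   # total masks needed

      rng = random.Random(1)
      valid = [x for x in (simulate(rng) for _ in range(10_000)) if x is not None]
      pct = statistics.quantiles(valid, n=100)
      print(f"{len(valid)} valid iterations; "
            f"median {pct[49]:,.0f} masks, 95th percentile {pct[94]:,.0f} masks")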

  4. Global Change adaptation in water resources management: the Water Change project.

    PubMed

    Pouget, Laurent; Escaler, Isabel; Guiu, Roger; Mc Ennis, Suzy; Versini, Pierre-Antoine

    2012-12-01

    In recent years, water resources management has been facing new challenges due to increasing changes and their associated uncertainties, such as changes in climate, water demand or land use, which can be grouped under the term Global Change. The Water Change project (LIFE+ funding) developed a methodology and a tool to assess the Global Change impacts on water resources, thus helping river basin agencies and water companies in their long term planning and in the definition of adaptation measures. The main result of the project was the creation of a step-by-step methodology to assess Global Change impacts and define strategies of adaptation. This methodology was tested in the Llobregat river basin (Spain) with the objective of being applicable to any water system. It includes several steps such as setting up the problem with a DPSIR framework, developing Global Change scenarios, running river basin models and performing a cost-benefit analysis to define optimal strategies of adaptation. This methodology was supported by the creation of a flexible modelling system, which can link a wide range of models, such as hydrological, water quality, and water management models. The tool allows users to integrate their own models into the system, which can then exchange information among them automatically. This makes it possible to simulate the interactions among multiple components of the water cycle and to quickly run a large number of Global Change scenarios. The outcomes of this project make it possible to define and test different sets of adaptation measures for the basin that can be further evaluated through cost-benefit analysis. The integration of the results contributes to efficient decision-making on how to adapt to Global Change impacts. Copyright © 2012 Elsevier B.V. All rights reserved.

  5. The SAMCO Web-platform for resilience assessment in mountainous valleys impacted by landslide risks.

    NASA Astrophysics Data System (ADS)

    Grandjean, Gilles; Thomas, Loic; Bernardie, Severine

    2016-04-01

    The ANR-SAMCO project aims to develop a proactive resilience framework enhancing the overall resilience of societies to the impacts of mountain risks. The project aims to elaborate methodological tools to characterize and measure ecosystem and societal resilience from an operative perspective on three representative mountain case studies. To achieve this objective, the methodology is split into several points: (1) the definition of the potential impacts of global environmental changes (climate system, ecosystem e.g. land use, socio-economic system) on landslide hazards, (2) the analysis of these consequences in terms of vulnerability (e.g. changes in the location and characteristics of the impacted areas and level of their perturbation) and (3) the implementation of a methodology for quantitatively investigating and mapping indicators of mountain slope vulnerability exposed to several hazard types, and the development of a GIS-based demonstration platform available on the web. The strength and originality of the SAMCO project lies in the combination of different techniques, methodologies and models (multi-hazard assessment, risk evolution in time, vulnerability functional analysis, and governance strategies) that are implemented in a user-oriented web platform, currently in development. We present the first results of this development task: the architecture and functions of the web tools, and the case-study database showing the multi-hazard maps and the stakes at risk. Risk assessments over several areas of interest in Alpine or Pyrenean valleys are still in progress, but the first analyses are presented for current and future periods, for which climate change and land-use (economical, geographical and social aspects) scenarios are taken into account. This tool, dedicated to stakeholders, should ultimately be used to evaluate the resilience of mountainous regions, since multiple scenarios can be tested and compared.

  6. Risk-based economic decision analysis of remediation options at a PCE-contaminated site.

    PubMed

    Lemming, Gitte; Friis-Hansen, Peter; Bjerg, Poul L

    2010-05-01

    Remediation methods for contaminated sites cover a wide range of technical solutions with different remedial efficiencies and costs. Additionally, they may vary in their secondary impacts on the environment, i.e. the potential impacts generated due to emissions and resource use caused by the remediation activities. Increasing attention is being given to these secondary environmental impacts when evaluating remediation options. This paper presents a methodology for an integrated economic decision analysis which combines assessments of remediation costs, health risk costs and potential environmental costs. The health risk costs are associated with the residual contamination left at the site and its migration to groundwater used for drinking water. A probabilistic exposure model using first- and second-order reliability methods (FORM/SORM) is used to estimate the contaminant concentrations at a downstream groundwater well. Potential environmental impacts on the local, regional and global scales due to the site remediation activities are evaluated using life cycle assessment (LCA). The potential impacts on health and environment are converted to monetary units using a simplified cost model. A case study based upon the developed methodology is presented in which the following remediation scenarios are analyzed and compared: (a) no action, (b) excavation and off-site treatment of soil, (c) soil vapor extraction and (d) thermally enhanced soil vapor extraction by electrical heating of the soil. Ultimately, the developed methodology facilitates societal cost estimations of remediation scenarios which can be used for internal ranking of the analyzed options. Despite the inherent uncertainties of placing a value on health and environmental impacts, the presented methodology is believed to be valuable in supporting decisions on remedial interventions. Copyright 2010 Elsevier Ltd. All rights reserved.
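
    The integrated comparison described above boils down to adding three monetised terms per remediation option. The sketch below shows that aggregation with invented figures and unit costs; they are placeholders, not results from the paper.

      # Sketch of an integrated societal-cost comparison of remediation options.
      # Remediation costs, risk estimates and unit costs are invented placeholders.

      def societal_cost(remediation_eur, expected_health_cases, env_impact_points,
                        cost_per_case=1.0e6, cost_per_point=50.0):
          """Total societal cost = direct remediation cost + monetised health-risk
          cost + monetised environmental (LCA) cost."""
          return (remediation_eur
                  + expected_health_cases * cost_per_case
                  + env_impact_points * cost_per_point)

      options = {
          "no action":              societal_cost(0.0,   3e-3, 0),
          "excavation + off-site":  societal_cost(2.0e6, 1e-4, 40_000),
          "soil vapor extraction":  societal_cost(0.8e6, 5e-4, 15_000),
          "thermally enhanced SVE": societal_cost(1.2e6, 2e-4, 30_000),
      }

      for name, cost in sorted(options.items(), key=lambda item: item[1]):
          print(f"{name:24s} {cost:12,.0f} EUR")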

  7. CAMEO-SIM: a physics-based broadband scene simulation tool for assessment of camouflage, concealment, and deception methodologies

    NASA Astrophysics Data System (ADS)

    Moorhead, Ian R.; Gilmore, Marilyn A.; Houlbrook, Alexander W.; Oxford, David E.; Filbee, David R.; Stroud, Colin A.; Hutchings, G.; Kirk, Albert

    2001-09-01

    Assessment of camouflage, concealment, and deception (CCD) methodologies is not a trivial problem; conventionally the only method has been to carry out field trials, which are both expensive and subject to the vagaries of the weather. In recent years computing power has increased, such that there are now many research programs using synthetic environments for CCD assessments. Such an approach is attractive; the user has complete control over the environmental parameters and many more scenarios can be investigated. The UK Ministry of Defence is currently developing a synthetic scene generation tool for assessing the effectiveness of air vehicle camouflage schemes. The software is sufficiently flexible to allow it to be used in a broader range of applications, including full CCD assessment. The synthetic scene simulation system (CAMEO-SIM) has been developed, as an extensible system, to provide imagery within the 0.4 to 14 micrometer spectral band with as high a physical fidelity as possible. It consists of a scene design tool, an image generator that incorporates both radiosity and ray-tracing processes, and an experimental trials tool. The scene design tool allows the user to develop a 3D representation of the scenario of interest from a fixed viewpoint. Target(s) of interest can be placed anywhere within this 3D representation and may be either static or moving. Different illumination conditions and effects of the atmosphere can be modeled together with directional reflectance effects. The user has complete control over the level of fidelity of the final image. The output from the rendering tool is a sequence of radiance maps, which may be used by sensor models or for experimental trials in which observers carry out target acquisition tasks. The software also maintains an audit trail of all data selected to generate a particular image, both in terms of material properties used and the rendering options chosen. A range of verification tests has shown that the software computes the correct values for analytically tractable scenarios. Validation tests using simple scenes have also been undertaken. More complex validation tests using observer trials are planned. The current version of CAMEO-SIM is described, along with how its images are used for camouflage assessment. The verification and validation tests undertaken are discussed. In addition, example images will be used to demonstrate the significance of different effects, such as spectral rendering and shadows. Planned developments of CAMEO-SIM are also outlined.

  8. A systematic intercomparison of regional flood frequency analysis models in a simulation framework

    NASA Astrophysics Data System (ADS)

    Ganora, Daniele; Laio, Francesco; Claps, Pierluigi

    2015-04-01

    Regional frequency analysis (RFA) is a well-established methodology to provide an estimate of the flood frequency curve (or other discharge-related variables), based on the fundamental concept of substituting temporal information at a site (no data or short time series) by exploiting observations at other sites (spatial information). Different RFA paradigms exist, depending on the way the information is transferred to the site of interest. Despite the wide use of such methodology, a systematic comparison between these paradigms has not been performed. The aim of this study is to provide a framework within which to carry out the intercomparison: we thus synthetically generate data through Monte Carlo simulations for a number of (virtual) stations, following a GEV parent distribution; different scenarios can be created to represent different spatial heterogeneity patterns by manipulating the parameters of the parent distribution at each station (e.g. with a linear variation in space of the shape parameter of the GEV). A special case is the homogeneous scenario where each station record is sampled from the same parent distribution. For each scenario and each simulation, different regional models are applied to evaluate the 200-year growth factor at each station. Results are then compared to the exact growth factor of each station, which is known in our virtual world. Considered regional approaches include: (i) a single growth curve for the whole region; (ii) a multiple-region model based on cluster analysis which searches for an adequate number of homogeneous subregions; (iii) a Region-of-Influence model which defines a homogeneous subregion for each site; (iv) a spatially-smooth estimation procedure based on linear regressions. A further benchmark model is the at-site estimate based on the analysis of the local record. A comprehensive analysis of the results of the simulations shows that, if the scenario is homogeneous (no spatial variability), all the regional approaches have comparable performances. Moreover, as expected, regional estimates are much more reliable than the at-site estimates. If the scenario is heterogeneous, the performances of the regional models depend on the pattern of heterogeneity; in general, however, the spatially-smooth regional approach performs better than the others, and its performances improve for increasing record lengths. For heterogeneous scenarios, the at-site estimates appear to be comparably more efficient than in the homogeneous case, and in general less biased than the regional estimates.
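
    The virtual-world comparison outlined above can be reproduced in miniature: sample synthetic records from a known GEV parent at several stations, then compare the exact 200-year growth factor with an at-site estimate and with a crude pooled (single-growth-curve) regional estimate. The parent parameters, record length and station count below are arbitrary choices for illustration, not the paper's experimental design.

      # Miniature version of the simulation experiment: synthetic GEV records,
      # at-site versus pooled regional estimates of the 200-year growth factor.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)
      n_stations, record_len = 20, 30
      parent = stats.genextreme(c=-0.1, loc=1.0, scale=0.3)   # c < 0: heavy upper tail

      # exact growth factor of the homogeneous parent (quantile divided by the mean)
      exact_gf = parent.ppf(1 - 1 / 200) / parent.mean()

      samples = parent.rvs(size=(n_stations, record_len), random_state=rng)

      # at-site estimate: fit a GEV to one short record only
      c0, l0, s0 = stats.genextreme.fit(samples[0])
      at_site_gf = stats.genextreme.ppf(1 - 1 / 200, c0, loc=l0, scale=s0) / samples[0].mean()

      # simple regional estimate: rescale each record by its mean, pool, fit once
      pooled = (samples / samples.mean(axis=1, keepdims=True)).ravel()
      cr, lr, sr = stats.genextreme.fit(pooled)
      regional_gf = stats.genextreme.ppf(1 - 1 / 200, cr, loc=lr, scale=sr)

      print(f"exact {exact_gf:.2f}  at-site {at_site_gf:.2f}  regional {regional_gf:.2f}")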

  9. 3D technology of Sony Bloggie has no advantage in decision-making of tennis serve direction: A randomized placebo-controlled study.

    PubMed

    Liu, Sicong; Ritchie, Jason; Sáenz-Moncaleano, Camilo; Ward, Savanna K; Paulsen, Cody; Klein, Tyler; Gutierrez, Oscar; Tenenbaum, Gershon

    2017-06-01

    This study aimed at exploring whether 3D technology enhances tennis decision-making under the conceptual framework of the human performance model. A 3 (skill level: varsity, club, recreational) × 3 (experimental condition: placebo, weak 3D [W3D], strong 3D [S3D]) between-participant design was used. Allocated to experimental conditions by a skill-level stratified randomization, 105 tennis players judged tennis serve direction from video scenarios and rated their perceptions of enjoyment, flow, and presence during task performance. Results showed that varsity players made more accurate decisions than less skilled ones. Additionally, applying 3D technology to typical video displays reduced tennis players' decision-making accuracy, although wearing the 3D glasses led to a placebo effect that shortened the decision-making reaction time. The unexpected negative effect of 3D technology on decision-making was possibly due to participants being more familiar with W3D than with S3D, and relatedly, a suboptimal task-technology match. Future directions for advancing this area of research are offered. Highlights: 3D technology adds binocular depth cues to traditional video displays, and thus results in a more authentic visual representation. This process enhances task fidelity in researching perceptual-cognitive skills in sports. The paper clarified both conceptual and methodological difficulties in testing 3D technology in sports settings. Namely, the nomenclature of video footage (with/without 3D technology) and the possible placebo effect (arising from wearing glasses of 3D technology) merit researchers' attention. Participants varying in level of domain-specific expertise were randomized into viewing conditions using a placebo-controlled design. Measurement consisted of both participants' subjective experience (i.e., presence, flow, and enjoyment) and objective performance (i.e., accuracy and reaction time) in a decision-making task. Findings revealed that wearing glasses of 3D technology resulted in a placebo effect that shortened participants' reaction times in decision-making. Moreover, participants' decision-making accuracy decreased when viewing video scenarios using 3D technology. The findings generated meaningful implications regarding applying 3D technology to sports research.

  10. The Distributed Geothermal Market Demand Model (dGeo): Documentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCabe, Kevin; Mooney, Meghan E; Sigrin, Benjamin O

    The National Renewable Energy Laboratory (NREL) developed the Distributed Geothermal Market Demand Model (dGeo) as a tool to explore the potential role of geothermal distributed energy resources (DERs) in meeting thermal energy demands in the United States. The dGeo model simulates the potential for deployment of geothermal DERs in the residential and commercial sectors of the continental United States for two specific technologies: ground-source heat pumps (GHP) and geothermal direct use (DU) for district heating. To quantify the opportunity space for these technologies, dGeo leverages a highly resolved geospatial database and robust bottom-up, agent-based modeling framework. This design is consistent with others in the family of Distributed Generation Market Demand models (dGen; Sigrin et al. 2016), including the Distributed Solar Market Demand (dSolar) and Distributed Wind Market Demand (dWind) models. dGeo is intended to serve as a long-term scenario-modeling tool. It has the capability to simulate the technical potential, economic potential, market potential, and technology deployment of GHP and DU through the year 2050 under a variety of user-defined input scenarios. Through these capabilities, dGeo can provide substantial analytical value to various stakeholders interested in exploring the effects of various techno-economic, macroeconomic, financial, and policy factors related to the opportunity for GHP and DU in the United States. This report documents the dGeo modeling design, methodology, assumptions, and capabilities.

  11. Estimating the time and temperature relationship for causation of deep-partial thickness skin burns.

    PubMed

    Abraham, John P; Plourde, Brian; Vallez, Lauren; Stark, John; Diller, Kenneth R

    2015-12-01

    The objective of this study is to develop and present a simple procedure for evaluating the temperature and exposure-time conditions that lead to causation of a deep-partial thickness burn and the effect that the immediate post-burn thermal environment can have on the process. A computational model has been designed and applied to predict the time required for skin burns to reach a deep-partial thickness level of injury. The model includes multiple tissue layers including the epidermis, dermis, hypodermis, and subcutaneous tissue. Simulated exposure temperatures ranged from 62.8 to 87.8°C (145-190°F). Two scenarios were investigated. The first and worst case scenario was a direct exposure to water (characterized by a large convection coefficient) with the clothing left on the skin following the exposure. A second case consisted of a scald insult followed immediately by the skin being washed with cool water (20°C). For both cases, an Arrhenius injury model was applied whereby the extent and depth of injury were calculated and compared for the different post-burn treatments. In addition, injury values were compared with experiment data from the literature to assess verification of the numerical methodology. It was found that the clinical observations of injury extent agreed with the calculated values. Furthermore, inundation with cool water decreased skin temperatures more quickly than the clothing insulating case and led to a modest decrease in the burn extent. Copyright © 2015 Elsevier Ltd and ISBI. All rights reserved.
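
    For readers unfamiliar with Arrhenius injury models, the sketch below shows the usual form of the damage integral, Omega = integral of A*exp(-Ea/(R*T(t))) dt, accumulated over a temperature history; Omega >= 1 at a given tissue depth is commonly read as irreversible injury. The rate constants are of the order often quoted for skin (Henriques-type values) but, together with the temperature trace, should be treated as assumed illustration rather than the paper's inputs.

      # Sketch of an Arrhenius thermal-damage integral for a basal-layer temperature history.
      # Rate constants and the temperature trace are assumed for illustration.
      import math

      A = 3.1e98     # frequency factor, 1/s (assumed, Henriques-type order of magnitude)
      EA = 6.28e5    # activation energy, J/mol (assumed)
      R = 8.314      # gas constant, J/(mol K)

      def omega(temps_c, dt):
          """Accumulate the damage integral for temperatures in deg C sampled every dt seconds."""
          return sum(A * math.exp(-EA / (R * (t + 273.15))) * dt for t in temps_c)

      # hypothetical trace: skin heats from 37 to 60 deg C in 5 s, holds, then cools over 2 min
      trace = [37 + 23 * min(t / 5.0, 1.0) for t in range(30)]
      trace += [60 - 20 * min(t / 120.0, 1.0) for t in range(120)]

      damage = omega(trace, dt=1.0)
      print(f"Omega = {damage:.2f}" + ("  (injury threshold reached)" if damage >= 1 else ""))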

  12. Assessing the air quality impact of nitrogen oxides and benzene from road traffic and domestic heating and the associated cancer risk in an urban area of Verona (Italy)

    NASA Astrophysics Data System (ADS)

    Schiavon, Marco; Redivo, Martina; Antonacci, Gianluca; Rada, Elena Cristina; Ragazzi, Marco; Zardi, Dino; Giovannini, Lorenzo

    2015-11-01

    Simulations of emission and dispersion of nitrogen oxides (NOx) are performed in an urban area of Verona (Italy), characterized by street canyons and typical sources of urban pollutants. Two dominant source categories are considered: road traffic and, as an element of novelty, domestic heaters. Also, to assess the impact of urban air pollution on human health and, in particular, the cancer risk, simulations of emission and dispersion of benzene are carried out. Emissions from road traffic are estimated by the COPERT 4 algorithm, whilst NOx emission factors from domestic heaters are retrieved by means of criteria provided in the technical literature. Then maps of the annual mean concentrations of NOx and benzene are calculated using the AUSTAL2000 dispersion model, considering both scenarios representing the current situation and scenarios simulating the introduction of environmental strategies for air pollution mitigation. The simulations highlight potentially critical situations of human exposure that may not be detected by the conventional network of air quality monitoring stations. The proposed methodology provides support for air quality policies, such as planning targeted measurement campaigns, re-locating monitoring stations and adopting measures in favour of better air quality in urban planning. In particular, the estimation of the induced cancer risk is an important starting point to conduct zoning analyses and to detect the areas where the population is more directly exposed to potential risks for health.

  13. A hydro-sedimentary modeling system for flash flood propagation and hazard estimation under different agricultural practices

    NASA Astrophysics Data System (ADS)

    Kourgialas, N. N.; Karatzas, G. P.

    2014-03-01

    A modeling system for the estimation of flash flood flow velocity and sediment transport is developed in this study. The system comprises three components: (a) a modeling framework based on the hydrological model HSPF, (b) the hydrodynamic module of the hydraulic model MIKE 11 (quasi-2-D), and (c) the advection-dispersion module of MIKE 11 as a sediment transport model. An important parameter in hydraulic modeling is the Manning coefficient, an indicator of the channel resistance which is directly dependent on riparian vegetation changes. The effect of riparian vegetation on flood propagation parameters such as water depth (inundation), discharge, flow velocity, and sediment transport load is investigated in this study. Based on the obtained results, when the weed-cutting percentage is increased, the flood wave depth decreases while flow discharge, velocity and sediment transport load increase. The proposed modeling system is used to evaluate and illustrate the flood hazard for different riparian vegetation cutting scenarios. For the estimation of flood hazard, a combination of the flood propagation characteristics of water depth, flow velocity and sediment load was used. Next, a well-balanced selection of the most appropriate agricultural cutting practices of riparian vegetation was performed. Ultimately, the model results obtained for different agricultural cutting practice scenarios can be employed to create flood protection measures for flood-prone areas. The proposed methodology was applied to the downstream part of a small Mediterranean river basin in Crete, Greece.

  14. Ideal evolution of magnetohydrodynamic turbulence when imposing Taylor-Green symmetries.

    PubMed

    Brachet, M E; Bustamante, M D; Krstulovic, G; Mininni, P D; Pouquet, A; Rosenberg, D

    2013-01-01

    We investigate the ideal and incompressible magnetohydrodynamic (MHD) equations in three space dimensions for the development of potentially singular structures. The methodology consists in implementing the fourfold symmetries of the Taylor-Green vortex generalized to MHD, leading to substantial computer time and memory savings at a given resolution; we also use a regridding method that allows for lower-resolution runs at early times, with no loss of spectral accuracy. One magnetic configuration is examined at an equivalent resolution of 6144³ points and three different configurations on grids of 4096³ points. At the highest resolution, two different current and vorticity sheet systems are found to collide, producing two successive accelerations in the development of small scales. At the latest time, a convergence of magnetic field lines to the location of maximum current is probably leading locally to a strong bending and directional variability of such lines. A novel analytical method, based on sharp analysis inequalities, is used to assess the validity of the finite-time singularity scenario. This method allows one to rule out spurious singularities by evaluating the rate at which the logarithmic decrement of the analyticity-strip method goes to zero. The result is that the finite-time singularity scenario cannot be ruled out, and the singularity time could be somewhere between t=2.33 and t=2.70. More robust conclusions will require higher resolution runs and grid-point interpolation measurements of maximum current and vorticity.
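
    For context, the analyticity-strip diagnostic referred to above is conventionally applied in the following generic form (the notation here is the standard textbook one, not reproduced from the paper):

      % Analyticity-strip method (generic form, for illustration only)
      % At each time t, fit the high-wavenumber tail of the energy spectrum:
      \[
        E(k,t) \;\simeq\; C(t)\, k^{-n(t)}\, e^{-2\delta(t) k},
      \]
      % where \delta(t) is the logarithmic decrement, i.e. the width of the analyticity strip.
      % A finite-time singularity at t_* requires the strip width to vanish,
      \[
        \delta(t) \to 0 \quad \text{as} \quad t \to t_* ,
      \]
      % and the fit is trusted only while \delta(t) stays well above the smallest resolved
      % scale (roughly \delta(t)\,k_{\max} greater than a few), which is how spurious
      % singularities are ruled out.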

  15. Regional scenario building as a tool to support vulnerability assessment of food & water security and livelihood conditions under varying natural resources managements

    NASA Astrophysics Data System (ADS)

    Reinhardt, Julia; Liersch, Stefan; Dickens, Chris; Kabaseke, Clovis; Mulugeta Lemenih, Kassaye; Sghaier, Mongi; Hattermann, Fred

    2013-04-01

    Participatory regional scenario building was carried out with stakeholders and local researchers in four meso-scale case studies (CS) in Africa. In all CS the improvement of food and/or water security and livelihood conditions was identified as the focal issue. A major concern was to analyze the impacts of different plausible future developments on these issues. The process of scenario development is of special importance as it helps to identify main drivers, critical uncertainties and patterns of change. Opportunities and constraints of actors and actions become clearer and reveal adaptation capacities. Furthermore, effective strategies must be reasonable and accepted by local stakeholders to be implemented. Hence, developing scenarios and generating strategies require the integration of local knowledge. The testing of strategies shows how they play out in different scenarios and how robust they are. Reasons and patterns of social and natural vulnerability can thus be shown. The scenario building exercise applied in this study is inspired by the approach of Peter Schwartz. It aims at determining critical uncertainties and identifying the most important driving forces for a specific focal issue which are likely to shape future developments of a region. The most important and uncertain drivers were analyzed and systematized with ranking exercises during meetings with local researchers and stakeholders. Cause-effect relationships were drawn in the form of concept maps either during the meetings or by researchers based on available information. Past observations and the scenario building outcomes were used to conduct a trend analysis. Cross-comparisons were made to find similarities and differences between CS in terms of main driving forces, patterns of change, opportunities and constraints. Driving forces and trends which arose consistently across scenarios and CS were identified. First results indicate that livelihood conditions of people often rely directly on the state and availability of natural resources. Major concerns in all CS are the fast-growing populations and natural resource degradation because of unsustainable natural resource management. Land use and resource competition are a consequence of unclear land tenure systems and limited resource availability. Scarce rainfall with high annual variability causes food insecurity if yield failures cannot be compensated for, e.g. because of a lack of financial resources. In all case studies critical uncertainties were identified to be more or less related to "poor governance". A lack of governmental and political stability and effectiveness, as well as corruption, hampers the implementation of laws and policies related to natural resource management. Other critical uncertainties lie in the social domain. They are related either to demographic patterns, like emigration or immigration varying the pressure on natural resource use, or to society in general, like the evolution of people's environmental awareness or voice and accountability. A methodological outcome of the scenario building was that the complexity of the process requires reliable and powerful tools to support the communication process. Concept maps were found to be a useful tool in this regard.

  16. Faculty Response to Ethical Issues at an American University in the Middle-East

    ERIC Educational Resources Information Center

    Tabsh, Sami W.; El Kadi, Hany A.; Abdelfatah, Akmal S.

    2012-01-01

    Purpose: The objective of this study is to get feedback on faculty perception of ethical issues related to teaching, scholarship and service at a relatively new American-style university in the Middle-East. Design/methodology/approach: A questionnaire involving 21 scenarios with multiple choice answers was developed and distributed to all faculty…

  17. Evaluating hydrological response of future land cover change scenarios in the San Pedro river (U.S./Mexico) with the Automated Geospatial Watershed Assessment (AGWA) tool

    USDA-ARS?s Scientific Manuscript database

    Long-term land-use and land cover change and their associated impacts pose critical challenges to sustaining vital hydrological ecosystem services for future generations. In this study, a methodology was developed to characterize potential hydrologic impacts from future urban growth through time. Fu...

  18. Making Just Tenure and Promotion Decisions Using the Objective Knowledge Growth Framework

    ERIC Educational Resources Information Center

    Chitpin, Stephanie

    2015-01-01

    Purpose: The purpose of this paper is to utilize the Objective Knowledge Growth Framework (OKGF) to promote a better understanding of the evaluating tenure and promotion processes. Design/Methodology/Approach: A scenario is created to illustrate the concept of using OKGF. Findings: The framework aims to support decision makers in identifying the…

  19. Agent-Based Simulation of Learning Dissemination in a Project-Based Learning Context Considering the Human Aspects

    ERIC Educational Resources Information Center

    Seman, Laio Oriel; Hausmann, Romeu; Bezerra, Eduardo Augusto

    2018-01-01

    Contribution: This paper presents the "PBL classroom model," an agent-based simulation (ABS) that allows testing of several scenarios of a project-based learning (PBL) application by considering different levels of soft-skills, and students' perception of the methodology. Background: While the community has made great advances in…

  20. Evaluating Hydrological Response of Future Land Cover Change Scenarios in the San Pedro River (U.S./Mexico) with the Automated Geospatial Watershed Assessment (AGWA) Tool.

    EPA Science Inventory

    Long-term land-use and land cover change and their associated impacts pose critical challenges to sustaining vital hydrological ecosystem services for future generations. In this study, a methodology was developed to characterize potential hydrologic impacts from future urban gro...

  1. Projecting Long-Term Care Expenditure in Four European Union Member States: The Influence of Demographic Scenarios

    ERIC Educational Resources Information Center

    Costa-Font, Joan; Wittenberg, Raphael; Patxot, Concepcio; Comas-Herrera, Adelina; Gori, Cristiano; di Maio, Alessandra; Pickard, Linda; Pozzi, Alessandro; Rothgang, Heinz

    2008-01-01

    This study examines the sensitivity of future long-term care demand and expenditure estimates to official demographic projections in four selected European countries: Germany, Spain, Italy and the United Kingdom. It uses standardised methodology in the form of a macro-simulation exercise and finds evidence for significant differences in…

  2. Scenarios and Strategies for Web 2.0

    ERIC Educational Resources Information Center

    Martin, Graeme; Reddington, Martin; Kneafsey, Mary Beth; Sloman, Martyn

    2009-01-01

    Purpose: The aim of this article is to bring together ideas from the authors' review of the Web 2.0 literature, the data and their insights from this and other technology-related projects to produce a framework for strategies on Web 2.0 focusing on the implications for human resource professionals. Design/methodology/approach: The authors discuss…

  3. Satellite services system analysis study. Volume 2: Satellite and services user model

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Satellite services needs are analyzed. Topics include methodology: a satellite user model; representative servicing scenarios; potential service needs; manned, remote, and automated involvement; and inactive satellites/debris. Satellite and services user model development is considered. Groundrules and assumptions, servicing, events, and sensitivity analysis are included. Selection of references satellites is also discussed.

  4. Reaction-to-fire testing and modeling for wood products

    Treesearch

    Mark A. Dietenberger; Robert H. White

    2001-01-01

    In this review we primarily discuss our use of the oxygen consumption calorimeter (ASTM E1354 for cone calorimeter and ISO9705 for room/corner tests) and fire growth modeling to evaluate treated wood products. With recent development towards performance-based building codes, new methodology requires engineering calculations of various fire growth scenarios. The initial...

  5. Teaching an Application of Bayes' Rule for Legal Decision-Making: Measuring the Strength of Evidence

    ERIC Educational Resources Information Center

    Satake, Eiki; Murray, Amy Vashlishan

    2014-01-01

    Although Bayesian methodology has become a powerful approach for describing uncertainty, it has largely been avoided in undergraduate statistics education. Here we demonstrate that one can present Bayes' Rule in the classroom through a hypothetical, yet realistic, legal scenario designed to spur the interests of students in introductory- and…

  6. Supply Chain Simulator: A Scenario-Based Educational Tool to Enhance Student Learning

    ERIC Educational Resources Information Center

    Siddiqui, Atiq; Khan, Mehmood; Akhtar, Sohail

    2008-01-01

    Simulation-based educational products are excellent set of illustrative tools that proffer features like visualization of the dynamic behavior of a real system, etc. Such products have great efficacy in education and are known to be one of the first-rate student centered learning methodologies. These products allow students to practice skills such…

  7. Youth and the Ethics of Identity Play in Virtual Spaces

    ERIC Educational Resources Information Center

    Siyahhan, Sinem; Barab, Sasha; James, Carrie

    2011-01-01

    In this study, we explored a new experimental methodology for investigating children's (ages 10 to 14) stances with respect to the ethics of online identity play. We used a scenario about peer identity misrepresentation embedded in a 3D virtual game environment and randomly assigned 265 elementary students (162 female, 103 male) to three…

  8. Checking Equity: Why Differential Item Functioning Analysis Should Be a Routine Part of Developing Conceptual Assessments

    ERIC Educational Resources Information Center

    Martinková, Patricia; Drabinová, Adéla; Liaw, Yuan-Ling; Sanders, Elizabeth A.; McFarland, Jenny L.; Price, Rebecca M.

    2017-01-01

    We provide a tutorial on differential item functioning (DIF) analysis, an analytic method useful for identifying potentially biased items in assessments. After explaining a number of methodological approaches, we test for gender bias in two scenarios that demonstrate why DIF analysis is crucial for developing assessments, particularly because…

  9. Model documentation: Electricity Market Module, Electricity Fuel Dispatch Submodule

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    This report documents the objectives, analytical approach and development of the National Energy Modeling System Electricity Fuel Dispatch Submodule (EFD), a submodule of the Electricity Market Module (EMM). The report catalogues and describes the model assumptions, computational methodology, parameter estimation techniques, model source code, and forecast results generated through the synthesis and scenario development based on these components.

  10. Evaluating the ecological sustainability of a pinyon-juniper grassland ecosystem in northern Arizona

    Treesearch

    Reuben Weisz; Jack Triepke; Don Vandendriesche; Mike Manthei; Jim Youtz; Jerry Simon; Wayne Robbie

    2010-01-01

    In order to develop strategic land management plans, managers must assess current and future ecological conditions. Climate change has expanded the need to assess the sustainability of ecosystems and predict their conditions under different climate change and management scenarios using landscape dynamics simulation models. We present a methodology for developing a...

  11. Optimal Modality Selection for Cooperative Human-Robot Task Completion.

    PubMed

    Jacob, Mithun George; Wachs, Juan P

    2016-12-01

    Human-robot cooperation in complex environments must be fast, accurate, and resilient. This requires efficient communication channels where robots need to assimilate information using a plethora of verbal and nonverbal modalities such as hand gestures, speech, and gaze. However, even though hybrid human-robot communication frameworks and multimodal communication have been studied, a systematic methodology for designing multimodal interfaces does not exist. This paper addresses the gap by proposing a novel methodology to generate multimodal lexicons which maximizes multiple performance metrics over a wide range of communication modalities (i.e., lexicons). The metrics are obtained through a mixture of simulation and real-world experiments. The methodology is tested in a surgical setting where a robot cooperates with a surgeon to complete a mock abdominal incision and closure task by delivering surgical instruments. Experimental results show that predicted optimal lexicons significantly outperform predicted suboptimal lexicons (p < 0.05) in all metrics, validating the predictability of the methodology. The methodology is validated in two scenarios (with and without modeling the risk of a human-robot collision) and the differences in the lexicons are analyzed.
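
    One way to picture the lexicon-generation step is as a search over modality combinations scored against several performance metrics at once. The brute-force sketch below uses an invented metric table and a weighted-sum scalarisation purely as illustration; the paper's actual metrics and optimisation procedure may differ.

      # Toy sketch: pick the communication lexicon (set of modalities) that maximises
      # a weighted combination of performance metrics. Metric values are invented.
      from itertools import combinations

      MODALITIES = ["gesture", "speech", "gaze"]

      # assumed per-lexicon metrics: (task accuracy, speed = 1 / normalised latency)
      METRICS = {
          ("gesture",):                  (0.80, 0.50),
          ("speech",):                   (0.85, 0.70),
          ("gaze",):                     (0.60, 0.90),
          ("gesture", "speech"):         (0.93, 0.55),
          ("gesture", "gaze"):           (0.82, 0.60),
          ("speech", "gaze"):            (0.90, 0.75),
          ("gesture", "speech", "gaze"): (0.95, 0.45),
      }

      def score(lexicon, weights=(0.7, 0.3)):
          accuracy, speed = METRICS[lexicon]
          return weights[0] * accuracy + weights[1] * speed

      candidates = [combo for r in range(1, len(MODALITIES) + 1)
                    for combo in combinations(MODALITIES, r)]
      best = max(candidates, key=score)
      print("best lexicon:", best, "score:", round(score(best), 3))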

  12. The effects of active (hot-seat) versus observer roles during simulation-based training on stress levels and non-technical performance: a randomized trial.

    PubMed

    Bong, Choon Looi; Lee, Sumin; Ng, Agnes Suah Bwee; Allen, John Carson; Lim, Evangeline Hua Ling; Vidyarthi, Arpana

    2017-01-01

    Active 'hands-on' participation in the 'hot-seat' during immersive simulation-based training (SBT) induces stress for participants, which is believed to be necessary to improve performance. We hypothesized that observers of SBT can subsequently achieve an equivalent level of non-technical performance as 'hot-seat' participants despite experiencing lower stress. We randomized 37 anaesthesia trainees into two groups to undergo three consecutive SBT scenarios. Eighteen 'hot-seat' trainees actively participated in all three scenarios, and 19 'observer' trainees were directed to observe the first two scenarios and participated in the 'hot-seat' only in scenario 3. Salivary cortisol (SC) was measured at four time points during each scenario. The primary endpoint for stress response was the change in SC (ΔSC) from baseline. Performance was measured using the Anaesthetist's Non-Technical Skills (ANTS) Score. Mean SC increased in all participants whenever they were in the 'hot-seat' role, but not when in the observer role. Hot-seat ΔSC (mcg/dL) for scenarios 1, 2, and 3 were 0.122 (p = 0.001), 0.074 (p = 0.047), and 0.085 (p = 0.023), respectively. Observer ΔSC (mcg/dL) for scenarios 1, 2, and 3 were -0.062 (p = 0.091), 0.010 (p = 0.780), and 0.144 (p = 0.001), respectively. Mean ANTS scores were equivalent between the 'hot-seat' (40.0) and 'observer' (39.4) groups in scenario 3 (p = 0.733). Observers of SBT achieved an equivalent level of non-technical performance, while experiencing lower stress than trainees repeatedly trained in the 'hot-seat'. Our findings suggest that directed observers may benefit from immersive SBT even without repeated 'hands-on' experience and stress in the hot-seat. The directed observer role may offer a less stressful, practical alternative to the traditional 'hot-seat' role, potentially rendering SBT accessible to a wider audience. ClinicalTrials.gov Identifier NCT02211378, registered August 5, 2014, retrospectively registered.

  13. Landslide hazard assessment : LIFE+IMAGINE project methodology and Liguria region use case

    NASA Astrophysics Data System (ADS)

    Spizzichino, Daniele; Campo, Valentina; Congi, Maria Pia; Cipolloni, Carlo; Delmonaco, Giuseppe; Guerrieri, Luca; Iadanza, Carla; Leoni, Gabriele; Trigila, Alessandro

    2015-04-01

    The scope of this work is to present a methodology developed for the analysis of potential impacts in areas prone to landslide hazard in the framework of the EC project LIFE+IMAGINE. The project aims to implement a web-services-based infrastructure for environmental analysis that integrates, in its own architecture, specifications and results from INSPIRE, SEIS and GMES. Existing web services have been customized to provide functionalities for supporting integrated environmental management. The implemented infrastructure has been applied to landslide risk scenarios, developed in selected pilot areas, aiming at: i) application of standard procedures to implement a landslide risk analysis; ii) definition of a procedure for assessment of potential environmental impacts, based on a set of indicators to estimate the different exposed elements with their specific vulnerability in the pilot area. The landslide pilot and related scenario are focused on providing a simplified Landslide Risk Assessment (LRA) through: 1) a landslide inventory derived from available historical and recent databases and maps; 2) landslide susceptibility and hazard maps; 3) assessment of exposure and vulnerability on selected typologies of elements at risk; 4) implementation of a landslide risk scenario for different sets of exposed elements; 5) development of a use case; 6) definition of guidelines, best practices and production of thematic maps. The LRA has been implemented in Liguria region, Italy, in two different catchment areas located in the Cinque Terre National Park, characterized by a high landslide susceptibility and low resilience. The landslide risk impact analysis has been calibrated taking into account the socio-economic damage caused by landslides triggered by the October 2011 meteorological event. During this event, over 600 landslides were triggered in the selected pilot area. Most of the landslides affected the diffuse system of anthropogenic terraces and caused the direct disruption of the walls, as well as the transportation of a large amount of loose sediments along the slopes and channels as an induced consequence of the event. Application of a spatial analysis detected ca. 400 critical points along the road network with an average length of about 200 m. Over 1,000 buildings were affected and damaged by the event. The exposed population in the area involved by the event has been estimated at ca. 2,600 inhabitants. In the pilot area, 19 different typologies of Cultural Heritage were affected by landslide phenomena or located in zones classified as high landslide hazard. The final scope of the landslide scenario is to improve the awareness of hazard, exposure, vulnerability and landslide risk in the Cinque Terre National Park to the benefit of local authorities and population. In addition, the results of the application will be used for i) updating the land planning process in order to improve the resilience of local communities, ii) implementing cost-benefit analysis aimed at the definition of guidelines for sustainable landslide risk mitigation strategies, and iii) suggesting a general road map for the implementation of a local adaptation plan.

  14. Marine data and its potential for coastal and offshore applications

    NASA Astrophysics Data System (ADS)

    Meyer, Elke; Weisse, Ralf

    2013-04-01

    The coastDat data set is a compilation of coastal analyses and scenarios for the future from various sources. It contains no direct measurements but results from numerical models that have been driven either by observed data, in order to achieve the best possible representation of observed past conditions, or by climate change scenarios for the near future. One of the key objectives for developing coastDat was to derive a consistent and mostly homogeneous database for assessing marine weather statistics and long-term changes. Here, homogeneity refers to a data set which is free from effects caused by changes in instrumentation or measurement techniques. Contrary to direct measurements, which are often rare and incomplete, coastDat offers a unique combination of consistent atmospheric, oceanic, sea state and other parameters at high spatial and temporal detail, even for places and variables for which no measurements have been made. The backbones of coastDat are regional wind, wave and storm surge hindcasts and scenarios mainly for the North Sea and the Baltic Sea. Furthermore, hindcast simulations of temperature, salinity, water level, and u- and v-current components are available for the North Sea for the last 60 years. We will discuss the methodology to derive these data, their quality and limitations in comparison with observations. Long-term changes in the temperature, wind, wave and storm surge climate will be discussed and potential future changes will be assessed. We will conclude with a number of coastal and offshore applications (e.g. ship design, coastal protection, oil risk modelling and marine energy use) of coastDat demonstrating some of the potentials of the data set in hazard assessment. For example, data from coastDat have been used extensively for designing, planning and installation of offshore wind farms. Return periods of extreme wind speed, surge and wave heights are used by a variety of users involved in the design and construction of offshore wind parks. Moreover, planning of installation and maintenance requires the estimation of probabilities of weather windows; that is, for example, the probability of an extended period with wave heights below a given threshold to enable installation and/or maintenance. Data from coastDat were frequently used in such cases, as observational records are often too short to derive reliable statistics. www.coastdat.de

  15. The coastDat data set and its potential for coastal and offshore applications

    NASA Astrophysics Data System (ADS)

    Meyer, E.; Weisse, R.

    2012-12-01

    The coastDat data set is a compilation of coastal analyses and scenarios for the future from various sources. It contains no direct measurements but results from numerical models that have been driven either by observed data, in order to achieve the best possible representation of observed past conditions, or by climate change scenarios for the near future. One of the key objectives for developing coastDat was to derive a consistent and mostly homogeneous database for assessing marine weather statistics and long-term changes. Here, homogeneity refers to a data set which is free from effects caused by changes in instrumentation or measurement techniques. Contrary to direct measurements, which are often rare and incomplete, coastDat offers a unique combination of consistent atmospheric, oceanic, sea state and other parameters at high spatial and temporal detail, even for places and variables for which no measurements have been made. The backbones of coastDat are regional wind, wave and storm surge hindcasts and scenarios mainly for the North Sea and the Baltic Sea. Furthermore, hindcast simulations of temperature, salinity, water level, and u- and v-current components are available for the North Sea for the last 60 years. We will discuss the methodology to derive these data, their quality and limitations in comparison with observations. Long-term changes in the temperature, wind, wave and storm surge climate will be discussed and potential future changes will be assessed. We will conclude with a number of coastal and offshore applications (e.g. ship design, coastal protection, oil risk modelling and marine energy use) of coastDat demonstrating some of the potentials of the data set in hazard assessment. For example, data from coastDat have been used extensively for designing, planning and installation of offshore wind farms. Return periods of extreme wind speed, surge and wave heights are used by a variety of users involved in the design and construction of offshore wind parks. Moreover, planning of installation and maintenance requires the estimation of probabilities of weather windows; that is, for example, the probability of an extended period with wave heights below a given threshold to enable installation and/or maintenance. Data from coastDat were frequently used in such cases, as observational records are often too short to derive reliable statistics.
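
    The weather-window statistic mentioned in both coastDat records is straightforward to compute from an hourly hindcast series. The sketch below estimates, from a synthetic significant-wave-height series standing in for hindcast data, the probability that a window of a given length stays below an operational threshold; the series and thresholds are invented for illustration.

      # Sketch of a weather-window probability estimate from an hourly wave-height series.
      # The synthetic series stands in for hindcast data; thresholds are illustrative.
      import random

      random.seed(0)
      hs = [max(0.2, random.gauss(1.5, 0.8)) for _ in range(24 * 365)]  # fake hourly Hs (m)

      def window_probability(series, threshold, window_hours):
          """Fraction of start hours from which the next window_hours values all stay
          below the threshold."""
          n = len(series) - window_hours + 1
          ok = sum(1 for i in range(n)
                   if all(v < threshold for v in series[i:i + window_hours]))
          return ok / n

      print(f"P(Hs < 1.5 m for 12 h) = {window_probability(hs, 1.5, 12):.2f}")
      print(f"P(Hs < 2.0 m for 24 h) = {window_probability(hs, 2.0, 24):.2f}")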

  16. Flood damage in Italy: towards an assessment model of reconstruction costs

    NASA Astrophysics Data System (ADS)

    Sterlacchini, Simone; Zazzeri, Marco; Genovese, Elisabetta; Modica, Marco; Zoboli, Roberto

    2016-04-01

    Recent decades in Italy have seen a very rapid expansion of urbanisation in terms of physical assets, while demographics have remained stable. Both the characteristics of Italian soil and anthropic development, along with repeated global climatic stress, have made the country vulnerable to floods, the intensity of which is increasingly alarming. The combination of these trends will contribute to large financial losses due to property damage in the absence of specific mitigation strategies. The present study focuses on the province of Sondrio in Northern Italy (area of about 3,200 km²), which is home to more than 180,000 inhabitants, with a slightly growing population. It is clearly a hot spot for flood exposure, as it is primarily a mountainous area where floods and flash floods hit frequently. The model we use for assessing potential flood damage determines risk scenarios by overlaying flood hazard maps and economic asset data. In Italy, hazard maps are provided by Regional Authorities through the Hydrogeological System Management Plan (PAI) based on EU Flood Directive guidelines. The PAI in the study area includes both the large plain and the secondary river system and considers three hazard scenarios of High, Medium and Low Frequency associated with return periods of 20, 200 and 500 years, respectively, and related water levels. By an overlay of PAI maps and residential areas, visualized on a GIS, we determine which existing built-up areas are at risk of flooding under each scenario. Then we investigate the value of physical assets potentially affected by floods in terms of market values, using the database of the Italian Property Market Observatory (OMI), and in terms of reconstruction costs, by considering synthetic cost indexes of predominant building types (from census information) and PAI water height. This study illustrates a methodology to assess flood damage in urban settlements and aims to determine general guidelines that can be extended throughout Italy. The final objective will be to analyse how prospective losses can change when mitigation measures, including actions to reduce the flood hazard and strategies to prevent potential consequences, are implemented. Flood impacts and the corresponding value of mitigation measures will be assessed by means of a cost-benefit analysis in accordance with the EU Floods Directive.
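    As a purely illustrative sketch of the overlay arithmetic described above (exposed residential area per hazard scenario multiplied by a synthetic reconstruction cost index and a depth-dependent damage factor), the figures below are hypothetical placeholders, not PAI or OMI data.

      # Hypothetical exposed residential floor area (m2) per hazard scenario, an assumed
      # reconstruction cost index (EUR/m2) for the predominant building type, and assumed
      # depth-dependent damage factors.
      exposed_area_m2 = {"T20": 120_000, "T200": 310_000, "T500": 480_000}
      cost_index_eur_m2 = 950
      damage_factor = {"T20": 0.15, "T200": 0.35, "T500": 0.55}

      for scenario, area in exposed_area_m2.items():
          cost = area * cost_index_eur_m2 * damage_factor[scenario]
          print(f"{scenario}: estimated reconstruction cost ~ {cost / 1e6:.1f} MEUR")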

  17. A land-use and land-cover modeling strategy to support a national assessment of carbon stocks and fluxes

    USGS Publications Warehouse

    Sohl, Terry L.; Sleeter, Benjamin M.; Zhu, Zhiliang; Sayler, Kristi L.; Bennett, Stacie; Bouchard, Michelle; Reker, Ryan R.; Hawbaker, Todd J.; Wein, Anne M.; Liu, Shuguang; Kanengieter, Ronald L.; Acevedo, William

    2012-01-01

    Changes in land use, land cover, disturbance regimes, and land management have considerable influence on carbon and greenhouse gas (GHG) fluxes within ecosystems. Through targeted land-use and land-management activities, ecosystems can be managed to enhance carbon sequestration and mitigate fluxes of other GHGs. National-scale, comprehensive analyses of carbon sequestration potential by ecosystem are needed, with a consistent, nationally applicable land-use and land-cover (LULC) modeling framework a key component of such analyses. The U.S. Geological Survey has initiated a project to analyze current and projected future GHG fluxes by ecosystem and quantify potential mitigation strategies. We have developed a unique LULC modeling framework to support this work. Downscaled scenarios consistent with IPCC Special Report on Emissions Scenarios (SRES) were constructed for U.S. ecoregions, and the FORE-SCE model was used to spatially map the scenarios. Results for a prototype demonstrate our ability to model LULC change and inform a biogeochemical modeling framework for analysis of subsequent GHG fluxes. The methodology was then successfully used to model LULC change for four IPCC SRES scenarios for an ecoregion in the Great Plains. The scenario-based LULC projections are now being used to analyze potential GHG impacts of LULC change across the U.S.

  18. A land-use and land-cover modeling strategy to support a national assessment of carbon stocks and fluxes

    USGS Publications Warehouse

    Sohl, Terry L.; Sleeter, Benjamin M.; Zhu, Zhi-Liang; Sayler, Kristi L.; Bennett, Stacie; Bouchard, Michelle; Reker, Ryan R.; Hawbaker, Todd; Wein, Anne; Liu, Shu-Guang; Kanengleter, Ronald; Acevedo, William

    2012-01-01

    Changes in land use, land cover, disturbance regimes, and land management have considerable influence on carbon and greenhouse gas (GHG) fluxes within ecosystems. Through targeted land-use and land-management activities, ecosystems can be managed to enhance carbon sequestration and mitigate fluxes of other GHGs. National-scale, comprehensive analyses of carbon sequestration potential by ecosystem are needed, with a consistent, nationally applicable land-use and land-cover (LULC) modeling framework a key component of such analyses. The U.S. Geological Survey has initiated a project to analyze current and projected future GHG fluxes by ecosystem and quantify potential mitigation strategies. We have developed a unique LULC modeling framework to support this work. Downscaled scenarios consistent with IPCC Special Report on Emissions Scenarios (SRES) were constructed for U.S. ecoregions, and the FORE-SCE model was used to spatially map the scenarios. Results for a prototype demonstrate our ability to model LULC change and inform a biogeochemical modeling framework for analysis of subsequent GHG fluxes. The methodology was then successfully used to model LULC change for four IPCC SRES scenarios for an ecoregion in the Great Plains. The scenario-based LULC projections are now being used to analyze potential GHG impacts of LULC change across the U.S.

  19. Assessment of environmental public exposure from a hypothetical nuclear accident for Unit-1 Bushehr nuclear power plant.

    PubMed

    Sohrabi, M; Ghasemi, M; Amrollahi, R; Khamooshi, C; Parsouzi, Z

    2013-05-01

    Unit-1 of the Bushehr nuclear power plant (BNPP-1) is a VVER-type reactor with 1,000-MWe power constructed near Bushehr city on the coast of the Persian Gulf, Iran. The reactor has recently become operational at close to its full power. The radiological impact of nuclear power plant (NPP) accidents is of public concern, and the assessment of the radiological consequences of any hypothetical nuclear accident on public exposure is vital. The hypothetical accident scenario considered in this paper is a design-basis accident, that is, a primary coolant leakage to the secondary circuit. This scenario was selected in order to compare and verify the results obtained in the present paper with those reported in the Final Safety Analysis Report (FSAR 2007) of the BNPP-1 and to develop a well-proven methodology that can be used to study other and more severe hypothetical accident scenarios for this reactor. In the present study, version 2.01 of the PC COSYMA code was applied. Effective doses (from external and internal exposures) in the early phase of the accidental releases, as well as individual and collective doses due to the late phase of the releases, were evaluated. The surrounding area of the BNPP-1 within a radius of 80 km was subdivided into seven concentric rings and 16 sectors, and the distribution of population and agricultural products was calculated for this grid. The results show that during the first year following the modeled hypothetical accident, the effective doses do not exceed the limit of 5 mSv for the considered distances from the BNPP-1. The results obtained in this study are in good agreement with those in the FSAR-2007 report. This agreement was obtained despite the many inherent uncertainties and variables existing in the two modeling procedures applied, and it shows that the methodology applied here can also be used to model other, more severe hypothetical accident scenarios of the BNPP-1, such as small and large breaks in the reactor coolant system as well as beyond-design-basis accidents. Such scenarios are planned to be studied for this reactor in the near future.
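    A minimal sketch of the ring-and-sector bookkeeping described above: the 80 km surroundings are divided into 7 concentric rings and 16 sectors, and a collective dose follows from summing individual dose times population over the grid. The dose and population values below are random placeholders, not PC COSYMA output.

      import numpy as np

      n_rings, n_sectors = 7, 16
      rng = np.random.default_rng(1)

      # Placeholder grids: population per cell and first-year individual effective dose (mSv).
      population = rng.integers(0, 20_000, size=(n_rings, n_sectors))
      individual_dose_mSv = rng.uniform(0.01, 2.0, size=(n_rings, n_sectors))

      collective_dose_person_Sv = (population * individual_dose_mSv).sum() / 1000.0  # mSv -> Sv
      print(f"Maximum individual dose: {individual_dose_mSv.max():.2f} mSv (checked against the 5 mSv limit)")
      print(f"Collective dose: {collective_dose_person_Sv:.1f} person-Sv")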

  20. Mars base buildup scenarios

    NASA Technical Reports Server (NTRS)

    Blacic, J. D.

    1986-01-01

    Two Mars surface base build-up scenarios are presented in order to help visualize the mission and to serve as a basis for trade studies. In the first scenario, direct manned landings on the Martian surface occur early in the missions and scientific investigation is the main driver and rationale. In the second scenario, Earth development of an infrastructure to exploit the volatile resources of the Martian moons for economic purposes is emphasized. Scientific exploration of the surface is initially delayed in this scenario relative to the first but, once begun, develops rapidly, aided by the presence of a permanently manned orbital station.

  1. Fuzzy Traffic Control with Vehicle-to-Everything Communication.

    PubMed

    Salman, Muntaser A; Ozdemir, Suat; Celebi, Fatih V

    2018-01-27

    Traffic signal control (TSC) with vehicle-to-everything (V2X) communication can be a very efficient solution to the traffic congestion problem. The ratio of vehicles equipped with V2X communication capability to the total number of vehicles in the traffic (the penetration rate, PR) is still low; thus, V2X-based TSC systems need to be supported by other mechanisms. PR is the major factor that affects the quality of the TSC process, along with the evaluation interval. The quality of TSC in each direction is a function of the overall TSC quality of an intersection. Hence, quality evaluation of each direction should follow the evaluation of the overall intersection. Computational intelligence, more specifically a swarm algorithm, has recently been used in this field in a European Framework Programme (FP7) supported project called COLOMBO. In this paper, using the COLOMBO framework, further investigations have been carried out and two new methodologies using simple and fuzzy logic have been proposed. To evaluate the performance of our proposed methods, a comparison with COLOMBO's approach has been carried out. The results reveal that the TSC problem can be solved as a logical problem rather than an optimization problem. The performance of the proposed approaches is good enough for them to be suggested for future work under realistic scenarios, even at low PR.
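    As a generic illustration of how a fuzzy-logic rule base can turn an estimated queue length and the V2X penetration rate into a green-phase extension, the membership functions, rules, and thresholds below are invented for the example and are not the controllers proposed in the paper.

      def tri(x, a, b, c):
          """Triangular membership function with feet at a and c and peak at b."""
          if x <= a or x >= c:
              return 0.0
          return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

      def green_extension(queue_veh, penetration_rate):
          """Combine two fuzzy rules into a green extension (seconds) by a weighted average."""
          long_queue = tri(queue_veh, 5, 15, 25)
          short_queue = tri(queue_veh, -1, 0, 10)
          confident = min(1.0, penetration_rate / 0.3)  # more V2X vehicles -> more trust in the estimate
          w1 = min(long_queue, confident)   # rule 1: long queue AND confident estimate -> extend by 10 s
          w2 = short_queue                  # rule 2: short queue -> no extension (0 s)
          return 0.0 if (w1 + w2) == 0 else (w1 * 10.0 + w2 * 0.0) / (w1 + w2)

      print(f"Green extension: {green_extension(queue_veh=12, penetration_rate=0.2):.1f} s")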

  2. Fuzzy Traffic Control with Vehicle-to-Everything Communication

    PubMed Central

    Ozdemir, Suat; Celebi, Fatih V.

    2018-01-01

    Traffic signal control (TSC) with vehicle-to-everything (V2X) communication can be a very efficient solution to the traffic congestion problem. The ratio of vehicles equipped with V2X communication capability to the total number of vehicles in the traffic (the penetration rate, PR) is still low; thus, V2X-based TSC systems need to be supported by other mechanisms. PR is the major factor that affects the quality of the TSC process, along with the evaluation interval. The quality of TSC in each direction is a function of the overall TSC quality of an intersection. Hence, quality evaluation of each direction should follow the evaluation of the overall intersection. Computational intelligence, more specifically a swarm algorithm, has recently been used in this field in a European Framework Programme (FP7) supported project called COLOMBO. In this paper, using the COLOMBO framework, further investigations have been carried out and two new methodologies using simple and fuzzy logic have been proposed. To evaluate the performance of our proposed methods, a comparison with COLOMBO's approach has been carried out. The results reveal that the TSC problem can be solved as a logical problem rather than an optimization problem. The performance of the proposed approaches is good enough for them to be suggested for future work under realistic scenarios, even at low PR. PMID:29382053

  3. Hazard assessment of substances produced from the accidental heating of chemical compounds.

    PubMed

    Lunghi, A; Gigante, L; Cardillo, P; Stefanoni, V; Pulga, G; Rota, R

    2004-12-10

    Accidental events in the process industries can affect not only the staff working in the factory but also the environment and the people living next to it. For this reason, regulation is imposed by the European Community to prevent accidents that could pose a risk to the population and the environment. In particular, Directive 96/82/CE, the so-called 'Seveso II directive', requires a risk analysis that also covers the hazardous materials generated in accidental events. Therefore, it is necessary to develop simple and economical procedures to foresee the hazardous materials that can be produced in major accidents, among which the accidental heating of a chemical due to a fire or a runaway reaction is one of the most frequent. The procedure proposed in this work is based on an evolved gas analysis methodology that consists of coupling two instruments: a thermogravimetric analyzer or a flash pyrolyzer, employed to simulate accident conditions, and an FTIR spectrometer used to determine the evolved gas composition. More than 40 materials have been examined in various accident scenarios and the data obtained have been statistically analyzed in order to identify meaningful correlations between the presence of a chemical group in the molecule and the presence of a given hazardous species in the fumes produced.

  4. A high performance biometric signal and image processing method to reveal blood perfusion towards 3D oxygen saturation mapping

    NASA Astrophysics Data System (ADS)

    Imms, Ryan; Hu, Sijung; Azorin-Peris, Vicente; Trico, Michaël.; Summers, Ron

    2014-03-01

    Non-contact imaging photoplethysmography (PPG) is a recent development in the field of physiological data acquisition, currently undergoing a large amount of research to characterize and define the range of its capabilities. Contact-based PPG techniques have been broadly used in clinical scenarios for a number of years to obtain direct information about the degree of oxygen saturation of patients. With the advent of imaging techniques, there is strong potential to enable access to additional information such as multi-dimensional blood perfusion and saturation mapping. The further development of effective opto-physiological monitoring techniques is dependent upon novel modelling techniques coupled with improved sensor design and effective signal processing methodologies. The biometric signal and image processing platform (bSIPP) provides a comprehensive set of features for extraction and analysis of recorded iPPG data, enabling direct comparison with other biomedical diagnostic tools such as ECG and EEG. Additionally, utilizing information about the nature of tissue structure has enabled the generation of an engineering model describing the behaviour of light during its travel through biological tissue. This enables the relative oxygen saturation and blood perfusion in different layers of the tissue to be estimated, which has the potential to be a useful diagnostic tool.

  5. New generation of elastic network models.

    PubMed

    López-Blanco, José Ramón; Chacón, Pablo

    2016-04-01

    The intrinsic flexibility of proteins and nucleic acids can be grasped from remarkably simple mechanical models of particles connected by springs. In recent decades, Elastic Network Models (ENMs) combined with Normal Mode Analysis have widely confirmed their ability to predict biologically relevant motions of biomolecules and have become a popular methodology for revealing large-scale dynamics in multiple structural biology scenarios. The simplicity, robustness, low computational cost, and relatively high accuracy are the reasons behind the success of ENMs. This review focuses on recent advances in the development and application of ENMs, paying particular attention to combinations with experimental data. Successful application scenarios include large macromolecular machines, structural refinement, docking, and evolutionary conservation. Copyright © 2015 Elsevier Ltd. All rights reserved.
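    The particles-and-springs idea lends itself to a compact sketch: a Gaussian network model builds a Kirchhoff (connectivity) matrix from coordinates with a distance cutoff, and normal modes follow from its eigendecomposition. The toy coordinates and cutoff below are assumptions for illustration, not taken from the review.

      import numpy as np

      def gnm_modes(coords, cutoff=7.0):
          """Gaussian network model: build the Kirchhoff matrix from coordinates
          (N x 3, in angstrom) and return its eigenvalues/eigenvectors (normal modes)."""
          d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
          kirchhoff = -(d < cutoff).astype(float)   # -1 for contacts within the cutoff
          np.fill_diagonal(kirchhoff, 0.0)
          np.fill_diagonal(kirchhoff, -kirchhoff.sum(axis=1))  # diagonal = contact degree
          return np.linalg.eigh(kirchhoff)          # first eigenvalue ~ 0 is the trivial mode

      # Toy "structure": 50 pseudo-residues along a helix (placeholder coordinates).
      t = np.linspace(0, 8 * np.pi, 50)
      coords = np.c_[3 * np.cos(t), 3 * np.sin(t), 1.5 * t]
      evals, evecs = gnm_modes(coords)
      print("Lowest non-trivial eigenvalues:", np.round(evals[1:4], 3))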

  6. Source-Based Modeling Of Urban Stormwater Quality Response to the Selected Scenarios Combining Future Changes in Climate and Socio-Economic Factors

    NASA Astrophysics Data System (ADS)

    Borris, Matthias; Leonhardt, Günther; Marsalek, Jiri; Österlund, Heléne; Viklander, Maria

    2016-08-01

    The assessment of future trends in urban stormwater quality should be most helpful for ensuring the effectiveness of the existing stormwater quality infrastructure in the future and mitigating the associated impacts on receiving waters. Combined effects of expected changes in climate and socio-economic factors on stormwater quality were examined in two urban test catchments by applying a source-based computer model (WinSLAMM) for TSS and three heavy metals (copper, lead, and zinc) for various future scenarios. Generally, both catchments showed similar responses to the future scenarios, and pollutant loads were more sensitive to changes in socio-economic factors (i.e., increasing traffic intensities, growth and intensification of the individual land-uses) than in the climate. Specifically, for the selected Intermediate socio-economic scenario and two climate change scenarios (RCP 2.6 and 8.5), the TSS loads from both catchments increased by about 10 % on average, but when applying the Intermediate climate change scenario (RCP 4.5) for two SSPs, the Sustainability and Security scenarios (SSP1 and SSP3), the TSS loads increased on average by 70 %. Furthermore, it was observed that well-designed and maintained stormwater treatment facilities targeting local pollution hotspots exhibited the potential to significantly improve stormwater quality, however, at potentially high costs. In fact, it was possible to reduce pollutant loads from both catchments under the future Sustainability scenario (on average, e.g., TSS were reduced by 20 %), compared to the current conditions. The methodology developed in this study was found useful for planning climate change adaptation strategies in the context of local conditions.

  7. Source-Based Modeling Of Urban Stormwater Quality Response to the Selected Scenarios Combining Future Changes in Climate and Socio-Economic Factors.

    PubMed

    Borris, Matthias; Leonhardt, Günther; Marsalek, Jiri; Österlund, Heléne; Viklander, Maria

    2016-08-01

    The assessment of future trends in urban stormwater quality should be most helpful for ensuring the effectiveness of the existing stormwater quality infrastructure in the future and mitigating the associated impacts on receiving waters. Combined effects of expected changes in climate and socio-economic factors on stormwater quality were examined in two urban test catchments by applying a source-based computer model (WinSLAMM) for TSS and three heavy metals (copper, lead, and zinc) for various future scenarios. Generally, both catchments showed similar responses to the future scenarios, and pollutant loads were more sensitive to changes in socio-economic factors (i.e., increasing traffic intensities, growth and intensification of the individual land-uses) than in the climate. Specifically, for the selected Intermediate socio-economic scenario and two climate change scenarios (RCP 2.6 and 8.5), the TSS loads from both catchments increased by about 10 % on average, but when applying the Intermediate climate change scenario (RCP 4.5) for two SSPs, the Sustainability and Security scenarios (SSP1 and SSP3), the TSS loads increased on average by 70 %. Furthermore, it was observed that well-designed and maintained stormwater treatment facilities targeting local pollution hotspots exhibited the potential to significantly improve stormwater quality, however, at potentially high costs. In fact, it was possible to reduce pollutant loads from both catchments under the future Sustainability scenario (on average, e.g., TSS were reduced by 20 %), compared to the current conditions. The methodology developed in this study was found useful for planning climate change adaptation strategies in the context of local conditions.

  8. Human health risk assessment case study: an abandoned metal smelter site in Poland.

    PubMed

    Wcisło, Eleonora; Ioven, Dawn; Kucharski, Rafal; Szdzuj, Jerzy

    2002-05-01

    United States Environmental Protection Agency methodologies for human health risk assessment (HRA) were applied in a Brownfields Demonstration Project on the Warynski smelter site (WSS), an abandoned industrial site at Piekary Slaskie town, Upper Silesia, Poland. The HRA included the baseline risk assessment (BRA) and the development of risk-based preliminary remedial goals (RBPRGs). The HRA focused on surface area covered with waste materials, which were evaluated with regard to the potential risks they may pose to humans. Cadmium, copper, iron, manganese, lead, and zinc were proposed as the contaminants of potential concern (COPCs) at WSS based on archive data on chemical composition of waste located on WSS. For the defined future land use patterns, the industrial (I) and recreational (II) exposure scenarios were assumed and evaluated. The combined hazard index for all COPCs was 3.1E+00 for Scenario I and 3.2E+00 for Scenario II. Regarding potential carcinogenic risks associated with the inhalation route, only cadmium was a contributor, with risks of 1.6E-06 and 2.6E-07 for Scenario I and Scenario II, respectively. The results of the BRA indicated that the potential health risks at WSS were mainly associated with waste material exposure to cadmium (industrial and recreational scenarios) and lead (industrial scenario). RBPRGs calculated under the industrial scenario were 1.17E+03 and 1.62E+03 mg/kg for cadmium and lead, respectively. The RBPRG for cadmium was 1.18E+03 mg/kg under the recreational scenario. The BRA results, as well as RBCs, are comparable for both scenarios, so it is impossible to prioritize land use patterns for WSS based on these results. For choosing a future land use pattern or an appropriate redevelopment option, different factors would be decisive in the decision-making process, e.g., social, market needs, technical feasibility and costs of redevelopment actions or acceptance of local community.
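    A minimal sketch of the hazard-index arithmetic behind figures such as 3.1E+00 above: each contaminant's chronic daily intake is divided by its reference dose to give a hazard quotient, and the quotients are summed into a hazard index. The intakes and reference doses below are illustrative placeholders, not the WSS site data.

      # Illustrative (not site-specific) chronic daily intakes (mg/kg-day) and reference doses.
      cdi = {"cadmium": 1.2e-3, "copper": 2.5e-2, "zinc": 9.0e-2}
      rfd = {"cadmium": 1.0e-3, "copper": 4.0e-2, "zinc": 3.0e-1}

      hazard_quotients = {c: cdi[c] / rfd[c] for c in cdi}
      hazard_index = sum(hazard_quotients.values())

      for c, hq in hazard_quotients.items():
          print(f"HQ({c}) = {hq:.2f}")
      print(f"Hazard index = {hazard_index:.2f}  (HI above 1 flags potential non-carcinogenic risk)")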

  9. Towards the new CH2018 climate scenarios for Switzerland

    NASA Astrophysics Data System (ADS)

    Fischer, Andreas; Schär, Christoph; Croci-Maspoli, Mischa; Knutti, Reto; Liniger, Mark; Strassmann, Kuno

    2017-04-01

    There is a growing demand for regional assessments of future climate change and its impacts on society and ecosystems to inform and facilitate appropriate adaptation strategies. The basis for such assessments is a set of consistent and up-to-date climate change scenarios at the local to regional scale. In Switzerland, an important step has been accomplished by the release of the climate scenarios in 2011 ("CH2011"). Since then, new climate model simulations have become available and the scientific understanding has improved. It is hence desirable to update these national scenarios. The new CH2018 scenarios are developed in the framework of the recently founded National Center for Climate Services (NCCS), a network consisting of several federal offices and academic partners. The CH2018 scenarios will build primarily upon the latest Euro-CORDEX regional climate model simulations assuming different pathways of future greenhouse gas concentrations. Compared to CH2011, more emphasis will be put on changes in extremes and on placing the projected changes in the context of observed variability. Results of a recently conducted survey on end-user needs in Switzerland will guide the development process toward the CH2018 scenarios. This ensures that the scenarios are presented and communicated in a user-oriented format and find wide applicability across different sectors in Switzerland. In the presentation we will show the full methodological setup to generate the CH2018 scenarios and how consistency across the methods and products is maximized. First results on mean changes and selected indices will be presented. In terms of dissemination, the results of the user survey show the need to address all user types of climate scenarios, especially non-experts. Compared to CH2011, this implies a stronger focus on consulting, condensing complex information and providing tutorials. In the presentation, we will outline our plans for dissemination in order to adequately address all relevant user groups of CH2018.

  10. Medication dispensing errors in Palestinian community pharmacy practice: a formal consensus using the Delphi technique.

    PubMed

    Shawahna, Ramzi; Haddad, Aseel; Khawaja, Baraa; Raie, Rand; Zaneen, Sireen; Edais, Tasneem

    2016-10-01

    Background Medication dispensing errors (MDEs) are frequent in community pharmacy practice. A definition of MDEs and scenarios representing MDE situations in Palestinian community pharmacy practice had not previously been addressed using formal consensus techniques. Objective This study was conducted to achieve consensus on a definition of MDEs and a wide range of scenarios that should or should not be considered as MDEs in Palestinian community pharmacy practice by a panel of community pharmacists. Setting Community pharmacy practice in Palestine. Method This was a descriptive study using the Delphi technique. A panel of fifty community pharmacists was recruited from different geographical locations of the West Bank of Palestine. A three-round Delphi technique was followed to achieve consensus on a proposed definition of MDEs and 83 different scenarios representing potential MDEs using a nine-point scale. Main outcome measure Agreement or disagreement of a panel of community pharmacists on a proposed definition of MDEs and a series of scenarios representing potential MDEs. Results In the first Delphi round, views of key contact community pharmacists on MDEs were explored and situations representing potential MDEs were collected. In the second Delphi round, consensus was achieved to accept the proposed definition and to include 49 (59 %) of the 83 proposed scenarios as MDEs. In the third Delphi round, consensus was achieved to include a further 13 (15.7 %) scenarios as MDEs and to exclude 9 (10.8 %); the remaining 12 (14.5 %) scenarios were considered equivocal based on the opinions of the panelists. Conclusion Consensus on a definition of MDEs and scenarios representing MDE situations in Palestinian community pharmacy practice was achieved using a formal consensus technique. The use of consensual definitions and scenarios representing MDE situations in community pharmacy practice might minimize methodological variations and their significant effects on the number and rate of MDEs reported in different studies.
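    As a sketch of how nine-point Delphi ratings can be turned into include/exclude/equivocal decisions, assuming a simple median-based rule in the RAND style; the cut-offs and the example scenarios below are illustrative, not the exact rules or items used in the study.

      import statistics

      def classify(ratings, low=3, high=7):
          """Classify a scenario from panellists' 1-9 ratings by the panel median."""
          med = statistics.median(ratings)
          if med >= high:
              return "include as MDE"
          if med <= low:
              return "exclude"
          return "equivocal"

      panel_ratings = {
          "dispensed wrong strength": [9, 8, 9, 7, 8, 9],
          "generic substitution agreed with prescriber": [2, 3, 1, 2, 4, 2],
          "no verbal counselling given": [5, 6, 4, 7, 5, 6],
      }
      for scenario, ratings in panel_ratings.items():
          print(f"{scenario}: {classify(ratings)}")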

  11. Target-motion prediction for robotic search and rescue in wilderness environments.

    PubMed

    Macwan, Ashish; Nejat, Goldie; Benhabib, Beno

    2011-10-01

    This paper presents a novel modular methodology for predicting a lost person's (motion) behavior for autonomous coordinated multirobot wilderness search and rescue. The new concept of isoprobability curves is introduced and developed, which represents a unique mechanism for identifying the target's probable location at any given time within the search area while accounting for influences such as terrain topology, target physiology and psychology, clues found, etc. The isoprobability curves are propagated over time and space. The significant tangible benefit of the proposed target-motion prediction methodology is demonstrated through a comparison to a nonprobabilistic approach, as well as through a simulated realistic wilderness search scenario.

  12. Set-membership fault detection under noisy environment with application to the detection of abnormal aircraft control surface positions

    NASA Astrophysics Data System (ADS)

    El Houda Thabet, Rihab; Combastel, Christophe; Raïssi, Tarek; Zolghadri, Ali

    2015-09-01

    The paper develops a set membership detection methodology which is applied to the detection of abnormal positions of aircraft control surfaces. Robust and early detection of such abnormal positions is an important issue for early system reconfiguration and overall optimisation of aircraft design. In order to improve fault sensitivity while ensuring a high level of robustness, the method combines a data-driven characterisation of noise and a model-driven approach based on interval prediction. The efficiency of the proposed methodology is illustrated through simulation results obtained based on data recorded in several flight scenarios of a highly representative aircraft benchmark.
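    A minimal sketch of the interval-prediction idea: a model propagates bounds for the expected control-surface position, and a fault is flagged whenever the measurement leaves the predicted interval inflated by a data-driven noise bound. The model, uncertainty, and noise bound below are assumptions for illustration and do not reproduce the paper's observer.

      import numpy as np

      def detect_faults(measured, predicted, model_uncertainty, noise_bound):
          """Flag samples where the measurement leaves the predicted interval."""
          lower = predicted - model_uncertainty - noise_bound
          upper = predicted + model_uncertainty + noise_bound
          return (measured < lower) | (measured > upper)

      t = np.linspace(0, 10, 500)
      predicted = 5.0 * np.sin(0.8 * t)                              # nominal surface position (deg)
      measured = predicted + np.random.default_rng(2).normal(0, 0.1, t.size)
      measured[350:] += 1.5                                          # injected abnormal offset

      flags = detect_faults(measured, predicted, model_uncertainty=0.3, noise_bound=0.3)
      print(f"First flagged sample index: {np.argmax(flags)} (offset injected at index 350)")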

  13. Direct Administration of Nerve-Specific Contrast to Improve Nerve Sparing Radical Prostatectomy

    PubMed Central

    Barth, Connor W.; Gibbs, Summer L.

    2017-01-01

    Nerve damage remains a major morbidity following nerve sparing radical prostatectomy, significantly affecting quality of life post-surgery. Nerve-specific fluorescence guided surgery offers a potential solution by enhancing nerve visualization intraoperatively. However, the prostate is highly innervated and only the cavernous nerve structures require preservation to maintain continence and potency. Systemic administration of a nerve-specific fluorophore would lower nerve signal to background ratio (SBR) in vital nerve structures, making them difficult to distinguish from all nervous tissue in the pelvic region. A direct administration methodology to enable selective nerve highlighting for enhanced nerve SBR in a specific nerve structure has been developed herein. The direct administration methodology demonstrated equivalent nerve-specific contrast to systemic administration at optimal exposure times. However, the direct administration methodology provided a brighter fluorescent nerve signal, facilitating nerve-specific fluorescence imaging at video rate, which was not possible following systemic administration. Additionally, the direct administration methodology required a significantly lower fluorophore dose than systemic administration, one that, when scaled to a human equivalent dose, falls within the microdosing range. Furthermore, a dual fluorophore tissue staining method was developed that alleviates fluorescence background signal from adipose tissue accumulation using a spectrally distinct adipose tissue specific fluorophore. These results validate the use of the direct administration methodology for specific nerve visualization with fluorescence image-guided surgery, which would improve vital nerve structure identification and visualization during nerve sparing radical prostatectomy. PMID:28255352

  14. Direct Administration of Nerve-Specific Contrast to Improve Nerve Sparing Radical Prostatectomy.

    PubMed

    Barth, Connor W; Gibbs, Summer L

    2017-01-01

    Nerve damage remains a major morbidity following nerve sparing radical prostatectomy, significantly affecting quality of life post-surgery. Nerve-specific fluorescence guided surgery offers a potential solution by enhancing nerve visualization intraoperatively. However, the prostate is highly innervated and only the cavernous nerve structures require preservation to maintain continence and potency. Systemic administration of a nerve-specific fluorophore would lower nerve signal to background ratio (SBR) in vital nerve structures, making them difficult to distinguish from all nervous tissue in the pelvic region. A direct administration methodology to enable selective nerve highlighting for enhanced nerve SBR in a specific nerve structure has been developed herein. The direct administration methodology demonstrated equivalent nerve-specific contrast to systemic administration at optimal exposure times. However, the direct administration methodology provided a brighter fluorescent nerve signal, facilitating nerve-specific fluorescence imaging at video rate, which was not possible following systemic administration. Additionally, the direct administration methodology required a significantly lower fluorophore dose than systemic administration, one that, when scaled to a human equivalent dose, falls within the microdosing range. Furthermore, a dual fluorophore tissue staining method was developed that alleviates fluorescence background signal from adipose tissue accumulation using a spectrally distinct adipose tissue specific fluorophore. These results validate the use of the direct administration methodology for specific nerve visualization with fluorescence image-guided surgery, which would improve vital nerve structure identification and visualization during nerve sparing radical prostatectomy.

  15. Methodology for the optimal design of an integrated first and second generation ethanol production plant combined with power cogeneration.

    PubMed

    Bechara, Rami; Gomez, Adrien; Saint-Antonin, Valérie; Schweitzer, Jean-Marc; Maréchal, François

    2016-08-01

    The application of methodologies for the optimal design of integrated processes has seen increased interest in the literature. This article builds on previous works and applies a systematic methodology to an integrated first and second generation ethanol production plant with power cogeneration. The methodology comprises process simulation, heat integration, thermo-economic evaluation, multi-variable evolutionary optimization of exergy efficiency versus capital costs, and process selection via profitability maximization. Optimization generated Pareto solutions with exergy efficiency ranging between 39.2% and 44.4% and capital costs from 210M$ to 390M$. The Net Present Value was positive for only two scenarios and only for low-efficiency, low-hydrolysis points. The minimum cellulosic ethanol selling price was sought to obtain a maximum NPV of zero for high-efficiency, high-hydrolysis alternatives. The obtained optimal configuration presented maximum exergy efficiency, hydrolyzed bagasse fraction, capital costs and ethanol production rate, and minimum cooling water consumption and power production rate. Copyright © 2016 Elsevier Ltd. All rights reserved.
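    The profitability screening described above can be illustrated with a minimal net-present-value sketch; the capital cost, yearly cash flow, discount rate, and plant lifetime are invented placeholders, not figures from the paper.

      def npv(capital_cost, annual_cash_flow, discount_rate, years):
          """Net present value of a constant annual cash flow after an upfront investment."""
          return -capital_cost + sum(
              annual_cash_flow / (1 + discount_rate) ** t for t in range(1, years + 1)
          )

      # Hypothetical design alternative (all figures in million USD).
      print(f"NPV = {npv(capital_cost=300.0, annual_cash_flow=35.0, discount_rate=0.08, years=20):.1f} M$")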

  16. Simulated effects of proposed Arkansas Valley Conduit on hydrodynamics and water quality for projected demands through 2070, Pueblo Reservoir, southeastern Colorado

    USGS Publications Warehouse

    Ortiz, Roderick F.

    2013-01-01

    The purpose of the Arkansas Valley Conduit (AVC) is to deliver water for municipal and industrial use within the boundaries of the Southeastern Colorado Water Conservancy District. Water supplied through the AVC would serve two needs: (1) to supplement or replace existing poor-quality water to communities downstream from Pueblo Reservoir; and (2) to meet a portion of the AVC participants’ projected water demands through 2070. The Bureau of Reclamation (Reclamation) initiated an Environmental Impact Statement (EIS) to address the potential environmental consequences associated with constructing and operating the proposed AVC, entering into a conveyance contract for the Pueblo Dam north-south outlet works interconnect (Interconnect), and entering into a long-term excess capacity master contract (Master Contract). Operational changes, as a result of implementation of proposed EIS alternatives, could change the hydrodynamics and water-quality conditions in Pueblo Reservoir. An interagency agreement was initiated between Reclamation and the U.S. Geological Survey to accurately simulate hydrodynamics and water quality in Pueblo Reservoir for projected demands associated with four of the seven proposed EIS alternatives. The four alternatives submitted to the USGS for scenario simulation included various combinations (action or no action) of the proposed Arkansas Valley Conduit, Master Contract, and Interconnect options. The four alternatives were the No Action, Comanche South, Joint Use Pipeline North, and Master Contract Only. Additionally, scenario simulations were done that represented existing conditions (Existing Conditions scenario) in Pueblo Reservoir. Water-surface elevations, water temperature, dissolved oxygen, dissolved solids, dissolved ammonia, dissolved nitrate, total phosphorus, total iron, and algal biomass (measured as chlorophyll-a) were simulated. Each of the scenarios was simulated for three contiguous water years representing a wet, average, and dry annual hydrologic cycle. Each selected simulation scenario also was evaluated for differences in direct/indirect effects and cumulative effects on a particular scenario. Analysis of the results for the direct/indirect- and cumulative-effects analyses indicated that, in general, the results were similar for most of the scenarios and comparisons in this report focused on results from the direct/indirect-effects analyses. Scenario simulations that represented existing conditions in Pueblo Reservoir were compared to the No Action scenario to assess changes in water quality from current demands (2006) to projected demands in 2070. Overall, comparisons of the results between the Existing Conditions and the No Action scenarios for water-surface elevations, water temperature, and dissolved oxygen, dissolved solids, dissolved ammonia, dissolved nitrate, total phosphorus, and total iron concentrations indicated that the annual median values generally were similar for all three simulated years. Additionally, algal groups and chlorophyll-a concentrations (algal biomass) were similar for the Existing Conditions and the No Action scenarios at site 7B in the epilimnion for the simulated period (Water Year 2000 through 2002). The No Action scenario also was compared individually to the Comanche South, Joint Use Pipeline North, and Master Contract Only scenarios. 
These comparisons were made to describe changes in the annual median, 85th percentile, or 15th percentile concentration between the No Action scenario and each of the other three simulation scenarios. Simulated water-surface elevations, water temperature, dissolved oxygen, dissolved solids, dissolved ammonia, dissolved nitrate, total phosphorus, total iron, algal groups, and chlorophyll-a concentrations in Pueblo Reservoir generally were similar between the No Action scenario and each of the other three simulation scenarios.

  17. The IRYSS-COPD appropriateness study: objectives, methodology, and description of the prospective cohort

    PubMed Central

    2011-01-01

    Background Patients with chronic obstructive pulmonary disease (COPD) often experience exacerbations of the disease that require hospitalization. Current guidelines offer little guidance for identifying patients whose clinical situation is appropriate for admission to the hospital, and properly developed and validated severity scores for COPD exacerbations are lacking. To address these important gaps in clinical care, we created the IRYSS-COPD Appropriateness Study. Methods/Design The RAND/UCLA Appropriateness Methodology was used to identify appropriate and inappropriate scenarios for hospital admission for patients experiencing COPD exacerbations. These scenarios were then applied to a prospective cohort of patients attending the emergency departments (ED) of 16 participating hospitals. Information was recorded during the time the patient was evaluated in the ED, at the time a decision was made to admit the patient to the hospital or discharge home, and during follow-up after admission or discharge home. While complete data were generally available at the time of ED admission, data were often missing at the time of decision making. Predefined assumptions were used to impute much of the missing data. Discussion The IRYSS-COPD Appropriateness Study will validate the appropriateness criteria developed by the RAND/UCLA Appropriateness Methodology and thus better delineate the requirements for admission or discharge of patients experiencing exacerbations of COPD. The study will also provide a better understanding of the determinants of outcomes of COPD exacerbations, and evaluate the equity and variability in access and outcomes in these patients. PMID:22115318

  18. Neo-deterministic definition of earthquake hazard scenarios: a multiscale application to India

    NASA Astrophysics Data System (ADS)

    Peresan, Antonella; Magrin, Andrea; Parvez, Imtiyaz A.; Rastogi, Bal K.; Vaccari, Franco; Cozzini, Stefano; Bisignano, Davide; Romanelli, Fabio; Panza, Giuliano F.; Ashish, Mr; Mir, Ramees R.

    2014-05-01

    The development of effective mitigation strategies requires scientifically consistent estimates of seismic ground motion; recent analyses, however, have shown that the performance of the classical probabilistic approach to seismic hazard assessment (PSHA) is very unsatisfactory in anticipating ground shaking from future large earthquakes. Moreover, due to its basic heuristic limitations, standard PSHA estimates are by far unsuitable when dealing with the protection of critical structures (e.g. nuclear power plants) and cultural heritage, where it is necessary to consider extremely long time intervals. Nonetheless, the persistence in resorting to PSHA is often explained by the need to deal with uncertainties related to ground shaking and earthquake recurrence. We show that current computational resources and physical knowledge of seismic wave generation and propagation processes, along with the improving quantity and quality of geophysical data, nowadays allow for viable numerical and analytical alternatives to the use of PSHA. The advanced approach considered in this study, namely NDSHA (neo-deterministic seismic hazard assessment), is based on the physically sound definition of a wide set of credible scenario events and accounts for uncertainties and earthquake recurrence in a substantially different way. The expected ground shaking due to a wide set of potential earthquakes is defined by means of full waveform modelling, based on the possibility to efficiently compute synthetic seismograms in complex laterally heterogeneous anelastic media. In this way, a set of ground motion scenarios can be defined at both the national and local scales, the latter considering the 2D and 3D heterogeneities of the medium travelled by the seismic waves. The efficiency of the NDSHA computational codes allows for the fast generation of hazard maps at the regional scale even on a modern laptop computer. At the scenario scale, quick parametric studies can be easily performed to understand the influence of the model characteristics on the computed ground shaking scenarios. For massive parametric tests, or for the repeated generation of large-scale hazard maps, the methodology can take advantage of more advanced computational platforms, ranging from GRID computing infrastructures to dedicated HPC clusters and cloud computing. In such a way, scientists can deal efficiently with the variety and complexity of the potential earthquake sources, and perform parametric studies to characterize the related uncertainties. NDSHA provides realistic time series of expected ground motion readily applicable for seismic engineering analysis and other mitigation actions. The methodology has been successfully applied to strategic buildings, lifelines and cultural heritage sites, and for the purpose of seismic microzoning in several urban areas worldwide. A web application is currently being developed that facilitates access to the NDSHA methodology and the related outputs by end-users who are interested in reliable territorial planning and in the design and construction of buildings and infrastructures in seismic areas. At the same time, the web application is also shaping up as an advanced educational tool to explore interactively how seismic waves are generated at the source, propagate inside structural models, and build up ground shaking scenarios.
We illustrate the preliminary results obtained from a multiscale application of NDSHA approach to the territory of India, zooming from large scale hazard maps of ground shaking at bedrock, to the definition of local scale earthquake scenarios for selected sites in the Gujarat state (NW India). The study aims to provide the community (e.g. authorities and engineers) with advanced information for earthquake risk mitigation, which is particularly relevant to Gujarat in view of the rapid development and urbanization of the region.

  19. ACCF/ACR/SCCT/SCMR/ASNC/NASCI/SCAI/SIR 2006 appropriateness criteria for cardiac computed tomography and cardiac magnetic resonance imaging. A report of the American College of Cardiology Foundation Quality Strategic Directions Committee Appropriateness Criteria Working Group.

    PubMed

    2006-10-01

    Under the auspices of the American College of Cardiology Foundation (ACCF) together with key specialty and subspecialty societies, appropriateness reviews were conducted for 2 relatively new clinical cardiac imaging modalities, cardiac computed tomography (CCT) and cardiac magnetic resonance (CMR) imaging. The reviews assessed the risks and benefits of the imaging tests for several indications or clinical scenarios and scored them based on a scale of 1 to 9, where the upper range (7 to 9) implies that the test is generally acceptable and is a reasonable approach, and the lower range (1 to 3) implies that the test is generally not acceptable and is not a reasonable approach. The mid-range (4 to 6) indicates an uncertain clinical scenario. The indications for these reviews were drawn from common applications or anticipated uses, as few clinical practice guidelines currently exist for these techniques. These indications were reviewed by an independent group of clinicians and modified by the Working Group, and then panelists rated the indications based on the ACCF Methodology for Evaluating the Appropriateness of Cardiovascular Imaging, which blends scientific evidence and practice experience. A modified Delphi technique was used to obtain first and second round ratings of clinical indications after the panelists were provided with a set of literature reviews, evidence tables, and seminal references. The final ratings were evenly distributed among the 3 categories of appropriateness for both CCT and CMR. Use of tests for structure and function and for diagnosis in symptomatic, intermediate coronary artery disease (CAD) risk patients was deemed appropriate, while repeat testing and general screening uses were viewed less favorably. It is anticipated that these results will have a significant impact on physician decision making and performance, reimbursement policy, and future research directions.

  20. Financial and feasibility implications of the treatment of hepatitis C virus in Italy: scenarios and perspectives

    PubMed Central

    Croce, Davide; Bonfanti, Marzia; Restelli, Umberto

    2016-01-01

    Background Hepatitis C virus (HCV) affects an estimated 130 million to 210 million people worldwide. In the next few years, the Italian National Health Service will face a growing trend of patients requiring HCV antiviral treatments. The aim of the analysis was to estimate the time horizon in which it would be possible to treat HCV-infected patients and the related direct medical costs (antiviral treatment and monitoring activities) from the Italian National Health Service point of view. Methodology In order to estimate the number of HCV-infected patients in Italy, we considered a top-down approach (based on published data) and a bottom-up approach. The number of years needed for treatment and the related direct costs were estimated through the development of a static deterministic model. Results The estimated number of HCV-infected patients in Italy varies from 2.7 million (estimated through a top-down approach) to 0.6 million (estimated through a bottom-up approach) and 0.3 million (measured through a bottom-up approach). Considering the last two scenarios and the use of interferon-free therapies for 50,000 patients per year, treatment of HCV-infected patients could be completed by 2030 at a cost of €13.7 billion or by 2023 at a cost of €7.0 billion, respectively. Conclusion The treatment of HCV-infected patients in Italy is a challenging target because of the financial implications of patient care. HCV infection could be controlled or eliminated in a 10- to 15-year time horizon. The cost of treatment can hardly be dealt with using traditional economic tools but should be faced through multiyear investments, as health benefits are expected over the long term. National Health Service stakeholders (industry, government, insurance, and also patients) will have to identify suitable financial instruments to face the new expenditure required. PMID:27540306
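    A minimal sketch of the static deterministic reasoning above: with a fixed treatment capacity per year and an assumed cost per treatment course, the number of years needed and the cumulative budget follow directly. The per-course cost below is an illustrative assumption, so the outputs will not exactly reproduce the paper's figures.

      def treatment_horizon(patients, capacity_per_year=50_000, cost_per_course_eur=22_000):
          """Years needed at a fixed yearly capacity, and cumulative treatment cost."""
          years_needed = -(-patients // capacity_per_year)  # ceiling division
          return years_needed, patients * cost_per_course_eur

      for label, pool in [("bottom-up estimate", 600_000), ("bottom-up measurement", 300_000)]:
          years, cost = treatment_horizon(pool)
          print(f"{label}: about {years} years of treatment, roughly {cost / 1e9:.1f} billion EUR")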

  1. Cost assessment and ecological effectiveness of nutrient reduction options for mitigating Phaeocystis colony blooms in the Southern North Sea: an integrated modeling approach.

    PubMed

    Lancelot, Christiane; Thieu, Vincent; Polard, Audrey; Garnier, Josette; Billen, Gilles; Hecq, Walter; Gypens, Nathalie

    2011-05-01

    Nutrient reduction measures have already been taken by wealthier countries to decrease nutrient loads to coastal waters, in most cases, however, prior to a proper assessment of their ecological effectiveness and economic costs. In this paper we describe an original integrated impact assessment methodology to estimate the direct cost and the ecological performance of realistic nutrient reduction options to be applied in the Southern North Sea watershed to decrease eutrophication, visible as Phaeocystis blooms and foam deposits on the beaches. The mathematical tool couples the idealized biogeochemical GIS-based model of the river system (SENEQUE-RIVERSTRAHLER) implemented in the Eastern Channel/Southern North Sea watershed to the biogeochemical MIRO model describing Phaeocystis blooms in the marine domain. Model simulations explore how nutrient reduction options regarding diffuse and/or point sources in the watershed would affect the spreading of Phaeocystis colonies in the coastal area. The reference and prospective simulations are performed for the year 2000, characterized by mean meteorological conditions, and the nutrient reduction scenarios include and compare upgrading of wastewater treatment plants and changes in agricultural practices, including an idealized shift towards organic farming. A direct cost assessment is performed for each realistic nutrient reduction scenario. Further, the reduction obtained for Phaeocystis blooms is assessed against ecological indicators (bloom magnitude and duration), and the cost of reducing foam events on the beaches is estimated. The uncertainty introduced by the added effect of meteorological conditions (rainfall) on coastal eutrophication is discussed. It is concluded that the reduction obtained by implementing realistic environmental measures in the short term is costly and insufficient to restore well-balanced nutrient conditions in the coastal area, while the replacement of conventional agriculture by organic farming might be an option to consider in the near future. Copyright © 2011 Elsevier B.V. All rights reserved.

  2. Direct Operational Field Test Evaluation, Simulation And Modeling

    DOT National Transportation Integrated Search

    1998-08-01

    The purpose of the simulation evaluation is to assess the expected future impacts of the DIRECT technologies under scenarios of full deployment. This provided some indication of the level of benefits that can be expected from DIRECT in the future. Be...

  3. A methodology for the assessment of flood hazards at the regional scale

    NASA Astrophysics Data System (ADS)

    Gallina, Valentina; Torresan, Silvia; Critto, Andrea; Zabeo, Alex; Semenzin, Elena; Marcomini, Antonio

    2013-04-01

    In recent years, the frequency of water-related disasters has increased, and recent flood events in Europe (e.g. 2002 in Central Europe, 2007 in the UK, 2010 in Italy) caused physical-environmental and socio-economic damage. Specifically, floods are the most threatening water-related disaster that affects humans, their lives and property. Within the KULTURisk project (FP7) a Regional Risk Assessment (RRA) methodology is proposed to evaluate the benefits of risk prevention in terms of reduced environmental risks due to floods. The method is based on the KULTURisk framework and allows the identification and prioritization of targets (i.e. people, buildings, infrastructures, agriculture, natural and semi-natural systems, cultural heritages) and areas at risk from floods in the considered region by comparing the baseline scenario (i.e. current state) with alternative scenarios (i.e. where different structural and/or non-structural measures are planned). The RRA methodology is flexible and can be adapted to different case studies (i.e. large rivers, alpine/mountain catchments, urban areas and coastal areas) and spatial scales (i.e. from the large river to the urban scale). The final aim of RRA is to help decision-makers in examining the possible environmental risks associated with uncertain future flood hazards and in identifying which prevention scenario could be the most suitable one. The RRA methodology employs Multi-Criteria Decision Analysis (MCDA) functions in order to integrate stakeholder preferences and expert judgments into the analysis. Moreover, Geographic Information Systems (GISs) are used to manage, process, analyze, and map data to facilitate the analysis and the information sharing with different experts and stakeholders. In order to characterize flood risks, the proposed methodology integrates the output of hydrodynamic models with the analysis of site-specific bio-geophysical and socio-economic indicators (e.g. slope of the territory, land cover, population density, economic activities) of several case studies in order to develop risk maps that identify and prioritize relative hot-spot areas and targets at risk at the regional scale. The main outputs of the RRA are receptor-based risk maps useful for communicating the potential implications of floods in non-monetary terms to stakeholders and administrations. These maps can be a basis for the management of flood risks as they can provide information about the indicative number of inhabitants, the type of economic activities, natural systems and cultural heritages potentially affected by flooding. Moreover, they can provide suitable information about flood risk in the considered area in order to define priorities for prevention measures, for land use planning and management. Finally, the outputs of the RRA methodology can be used as data input in the Socio-Economic Regional Risk Assessment methodology for the economic evaluation of different damages (e.g. tangible costs, intangible costs) and for the social assessment considering the benefits of the human dimension of vulnerability (i.e. adaptive and coping capacity). Within the KULTURisk project, the methodology has been applied and validated in several European case studies. Moreover, its generalization to address other types of natural hazards (e.g. earthquakes, forest fires) will be evaluated. The preliminary results of the RRA application in the KULTURisk project will be here presented and discussed.
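    As an illustrative sketch of the MCDA aggregation step, normalised indicator values are combined with stakeholder weights into a relative risk score per area; the indicator names, values, and weights below are assumptions for the example (all indicators are treated as risk-increasing for simplicity) and are not the KULTURisk functions.

      import numpy as np

      # Hypothetical indicators for three areas: [population density, slope, fraction built-up].
      indicators = np.array([
          [1200.0, 2.0, 0.65],   # area A
          [ 300.0, 8.0, 0.20],   # area B
          [ 800.0, 1.0, 0.45],   # area C
      ])
      weights = np.array([0.5, 0.2, 0.3])   # assumed stakeholder/expert weights (sum to 1)

      # Min-max normalisation so every indicator lies in [0, 1] before weighting.
      norm = (indicators - indicators.min(axis=0)) / (indicators.max(axis=0) - indicators.min(axis=0))
      relative_risk = norm @ weights

      for name, score in zip(["area A", "area B", "area C"], relative_risk):
          print(f"{name}: relative risk score = {score:.2f}")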

  4. FEST-C 1.3 & 2.0 for CMAQ Bi-directional NH3, Crop Production, and SWAT Modeling

    EPA Science Inventory

    The Fertilizer Emission Scenario Tool for CMAQ (FEST-C) is developed in a Linux environment, a festc JAVA interface that integrates 14 tools and scenario management options facilitating land use/crop data processing for the Community Multiscale Air Quality (CMAQ) modeling system ...

  5. Factors that affect implementation of a nurse staffing directive: results from a qualitative multi-case evaluation.

    PubMed

    Robinson, Claire H; Annis, Ann M; Forman, Jane; Krein, Sarah L; Yankey, Nicholas; Duffy, Sonia A; Taylor, Beth; Sales, Anne E

    2016-08-01

    To assess implementation of the Veterans Health Administration staffing methodology directive. In 2010 the Veterans Health Administration promulgated a staffing methodology directive for inpatient nursing units to address staffing and budget forecasting. A qualitative multi-case evaluation approach assessed staffing methodology implementation. Semi-structured telephone interviews were conducted from March - June 2014 with Nurse Executives and their teams at 21 facilities. Interviews focused on the budgeting process, implementation experiences, use of data, leadership support, and training. An implementation score was created for each facility using a 4-point rating scale. The scores were used to select three facilities (low, medium and high implementation) for more detailed case studies. After analysing interview summaries, the evaluation team developed a four domain scoring structure: (1) integration of staffing methodology into budget development; (2) implementation of the Directive elements; (3) engagement of leadership and staff; and (4) use of data to support the staffing methodology process. The high implementation facility had leadership understanding and endorsement of staffing methodology, confidence in and ability to work with data, and integration of staffing methodology results into the budgeting process. The low implementation facility reported poor leadership engagement and little understanding of data sources and interpretation. Implementation varies widely across facilities. Implementing staffing methodology in facilities with complex and changing staffing needs requires substantial commitment at all organizational levels especially for facilities that have traditionally relied on historical levels to budget for staffing. Published 2016. This article is a U.S. Government work and is in the public domain in the USA.

  6. Assessing methane emission estimation methods based on atmospheric measurements from oil and gas production using LES simulations

    NASA Astrophysics Data System (ADS)

    Saide, P. E.; Steinhoff, D.; Kosovic, B.; Weil, J.; Smith, N.; Blewitt, D.; Delle Monache, L.

    2017-12-01

    There are a wide variety of methods that have been proposed and used to estimate methane emissions from oil and gas production by using air composition and meteorology observations in conjunction with dispersion models. Although there has been some verification of these methodologies using controlled releases and concurrent atmospheric measurements, it is difficult to assess the accuracy of these methods for more realistic scenarios considering factors such as terrain, emissions from multiple components within a well pad, and time-varying emissions representative of typical operations. In this work we use a large-eddy simulation (LES) to generate controlled but realistic synthetic observations, which can be used to test multiple source term estimation methods; this is also known as an Observing System Simulation Experiment (OSSE). The LES is based on idealized simulations of the Weather Research & Forecasting (WRF) model at 10 m horizontal grid spacing covering an 8 km by 7 km domain with terrain representative of a region located in the Barnett shale. Well pads are set up in the domain following a realistic distribution, and emissions are prescribed every second for the components of each well pad (e.g., chemical injection pump, pneumatics, compressor, tanks, and dehydrator) using a simulator driven by oil and gas production volume, composition and realistic operational conditions. The system is set up to allow assessments under different scenarios such as normal operations, liquids unloading events, or other prescribed operational upset events. Methane and meteorology model output are sampled following the specifications of the emission estimation methodologies and considering typical instrument uncertainties, resulting in realistic observations (see Figure 1). We will show the evaluation of several emission estimation methods including the EPA Other Test Method 33A and estimates using the EPA AERMOD regulatory model. We will also show source estimation results from advanced methods such as variational inverse modeling, Bayesian inference and stochastic sampling techniques. Future directions, including other types of observations, other hydrocarbons being considered, and the assessment of additional emission estimation methods, will be discussed.
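
    The OSSE logic described above can be illustrated with a much-reduced sketch: a "true" emission rate drives a linear forward operator standing in for the LES/WRF dispersion fields, instrument noise is added to produce synthetic observations, and the rate is recovered by least squares. All numbers, the footprint values and the linear operator are illustrative assumptions, not the study's setup.

      # Minimal OSSE-style sketch with a linear forward operator (assumed) in
      # place of LES dispersion fields; all values are illustrative.
      import numpy as np

      rng = np.random.default_rng(0)

      true_rate  = 2.5                                   # "true" emission rate (g/s), assumed
      footprint  = rng.uniform(0.05, 0.4, size=50)       # sensitivity of each sample to the source, assumed
      background = 1.9                                   # background concentration level, assumed

      truth = background + footprint * true_rate         # noise-free synthetic signal
      obs   = truth + rng.normal(0.0, 0.05, size=truth.size)   # add instrument noise (sigma assumed)

      # Linear least-squares estimate of the emission rate from the synthetic observations:
      est_rate = np.dot(footprint, obs - background) / np.dot(footprint, footprint)
      print(f"true = {true_rate:.2f} g/s, estimated = {est_rate:.2f} g/s")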

  7. Seismic Vulnerability Assessment for Montreal - An Application of HAZUS-MH4

    NASA Astrophysics Data System (ADS)

    Yu, Keyan

    2011-12-01

    Seismic loss estimation for Montreal, Canada is performed for a 2% in 50 years seismic hazard using the HAZUS-MH4 tool developed by the US Federal Emergency Management Agency. The software is adapted to accept a Canadian setting for the Montreal study region, which includes 522 census tracts. The accuracy of loss estimations using HAZUS depends on the quality and quantity of data collection and preparation. The data collected for the Montreal study region comprise: (1) the building inventory; (2) hazard maps regarding soil amplification, liquefaction, and landslides; (3) population distribution at three different times of the day; (4) census demographic information; and (5) synthetic ground motion contour maps using three different ground motion prediction equations. All these data are prepared and assembled into geodatabases that are compatible with the HAZUS software. The study estimated that roughly 5% of the building stock would be damaged, with direct economic losses evaluated at 1.4 billion dollars, for the 2% in 50 years scenario. The maximum number of casualties associated with this scenario corresponds to a time of occurrence of 2 p.m. and would result in approximately 500 people being injured. Epistemic uncertainty was considered by obtaining damage estimates for three attenuation functions that were developed for Eastern North America. The results indicate that loss estimates are highly sensitive to the choice of the attenuation function and suggest that epistemic uncertainty should be considered both in the definition of the hazard function and in loss estimation methodologies. The next steps in the study should be to increase the size of the survey area to Greater Montreal, which includes more than 3 million inhabitants, and to perform more targeted studies for critical areas such as downtown Montreal and the south-eastern tip of Montreal. The current study was performed mainly for the built environment; the next phase will need to include more information relative to lifelines and their impact on risks.
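
    The epistemic-uncertainty comparison described above can be sketched as combining loss estimates obtained with alternative attenuation (ground motion prediction) functions under logic-tree weights. The loss values and weights below are illustrative assumptions, not the Montreal study results.

      # Minimal sketch of combining loss estimates across attenuation functions
      # with assumed logic-tree weights.
      losses_by_gmpe = {      # direct economic loss in billions of dollars (assumed)
          "GMPE_A": 1.1,
          "GMPE_B": 1.4,
          "GMPE_C": 2.0,
      }
      weights = {"GMPE_A": 0.3, "GMPE_B": 0.4, "GMPE_C": 0.3}   # assumed weights

      weighted_loss = sum(weights[k] * losses_by_gmpe[k] for k in losses_by_gmpe)
      spread = max(losses_by_gmpe.values()) - min(losses_by_gmpe.values())
      print(f"weighted loss = {weighted_loss:.2f} B$, epistemic spread = {spread:.2f} B$")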

  8. Hyperspectral Vehicle BRDF Learning: An Exploration of Vehicle Reflectance Variation and Optimal Measures of Spectral Similarity for Vehicle Reacquisition and Tracking Algorithms

    NASA Astrophysics Data System (ADS)

    Svejkosky, Joseph

    The spectral signatures of vehicles in hyperspectral imagery exhibit temporal variations due to the preponderance of surfaces with material properties that display non-Lambertian bi-directional reflectance distribution functions (BRDFs). These temporal variations are caused by changing illumination conditions, changing sun-target-sensor geometry, changing road surface properties, and changing vehicle orientations. To quantify these variations and determine their relative importance in a sub-pixel vehicle reacquisition and tracking scenario, a hyperspectral vehicle BRDF sampling experiment was conducted in which four vehicles were rotated at different orientations and imaged over a six-hour period. The hyperspectral imagery was calibrated using novel in-scene methods and converted to reflectance imagery. The resulting BRDF-sampled time-series imagery showed a strong vehicle-level BRDF dependence on vehicle shape in off-nadir imaging scenarios and a strong dependence on vehicle color in simulated nadir imaging scenarios. The imagery also exhibited spectral features characteristic of sampling the BRDF of non-Lambertian targets, which were subsequently verified with simulations. In addition, the imagery demonstrated that the illumination contribution from vehicle-adjacent horizontal surfaces significantly altered the shape and magnitude of the vehicle reflectance spectrum. The results of the BRDF sampling experiment illustrate the need for a target vehicle BRDF model and detection scheme that incorporates non-Lambertian BRDFs. A new detection algorithm called Eigenvector Loading Regression (ELR) is proposed that learns a hyperspectral vehicle BRDF from a series of BRDF measurements using regression in a lower-dimensional space and then applies the learned BRDF to make test spectrum predictions. In cases of non-Lambertian vehicle BRDF, this detection methodology performs favorably when compared to subspace detection algorithms and graph-based detection algorithms that do not account for the target BRDF. The algorithms are compared using a test environment in which observed spectral reflectance signatures from the BRDF sampling experiment are implanted into aerial hyperspectral imagery that contains large quantities of vehicles.
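
    The "regression in a lower-dimensional space" idea named above can be sketched as follows: training spectra are decomposed with an SVD, the leading eigenvector loadings are regressed on the measurement geometry, and a spectrum is predicted for a new geometry. The data shapes, the synthetic data generator and the linear geometry model are illustrative assumptions, not the author's ELR implementation.

      # Minimal sketch of loading regression on synthetic spectra (assumptions noted above).
      import numpy as np

      rng = np.random.default_rng(1)
      n_train, n_bands, k = 40, 120, 3

      geometry = rng.uniform(0, 1, size=(n_train, 2))     # e.g. sun and view angles, normalised
      basis    = rng.normal(size=(k, n_bands))            # stand-in for true spectral eigenvectors
      spectra  = geometry @ rng.normal(size=(2, k)) @ basis + 0.01 * rng.normal(size=(n_train, n_bands))

      # 1) Low-dimensional spectral basis from the training spectra.
      mean_spec = spectra.mean(axis=0)
      U, S, Vt  = np.linalg.svd(spectra - mean_spec, full_matrices=False)
      loadings  = (spectra - mean_spec) @ Vt[:k].T        # (n_train, k) eigenvector loadings

      # 2) Regress loadings on geometry (with an intercept term).
      G = np.hstack([geometry, np.ones((n_train, 1))])
      coef, *_ = np.linalg.lstsq(G, loadings, rcond=None)

      # 3) Predict the spectrum for a new sun-target-sensor geometry [angle1, angle2, intercept].
      new_geom = np.array([[0.3, 0.7, 1.0]])
      predicted_spectrum = new_geom @ coef @ Vt[:k] + mean_spec
      print(predicted_spectrum.shape)                     # (1, 120)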

  9. Agricultural climate impacts assessment for economic modeling and decision support

    NASA Astrophysics Data System (ADS)

    Thomson, A. M.; Izaurralde, R. C.; Beach, R.; Zhang, X.; Zhao, K.; Monier, E.

    2013-12-01

    A range of approaches can be used in the application of climate change projections to agricultural impacts assessment. Climate projections can be used directly to drive crop models, which in turn can provide inputs for agricultural economic or integrated assessment models. These model applications, and the transfer of information between models, must be guided by the state of the science. But the methodology must also account for the specific needs of stakeholders and the intended use of model results beyond pure scientific inquiry, including meeting the requirements of agencies responsible for designing and assessing policies, programs, and regulations. Here we present the methodology and results of two climate impacts studies that applied climate model projections from CMIP3 and from the EPA Climate Impacts and Risk Analysis (CIRA) project in a crop model (EPIC - Environmental Policy Integrated Climate) in order to generate estimates of changes in crop productivity for use in an agricultural economic model for the United States (FASOM - Forest and Agricultural Sector Optimization Model). The FASOM model is a forward-looking dynamic model of the US forest and agricultural sector used to assess market responses to the changing productivity of alternative land uses. The first study, focused on climate change impacts on the USDA crop insurance program, was designed to use available daily climate projections from the CMIP3 archive. The decision to focus on daily data for this application limited the climate model and time period selection significantly; however, for the intended purpose of assessing impacts on crop insurance payments, consideration of extreme event frequency was critical for assessing periodic crop failures. In a second, coordinated impacts study designed to assess the relative difference in climate impacts under a no-mitigation policy and different future climate mitigation scenarios, the stakeholder specifically requested an assessment of a mitigation level of 3.7 W/m2, as well as consideration of different levels of climate sensitivity (2, 3, 4.5 and 6 °C) and different initial conditions for addressing uncertainty. Since the CMIP3 and CMIP5 protocols did not include this mitigation level or consider alternative levels of climate sensitivity, additional climate projections were required. These two cases will be discussed to illustrate some of the trade-offs made in the development of methodologies for climate impact assessments that are intended for a specific user or audience and oriented towards addressing a specific topic of interest and providing usable results. Involving stakeholders from the design phase of climate impacts methodology serves both to define the appropriate method for the question at hand and to engage and inform the stakeholders about the myriad options and uncertainties associated with different methodology choices. This type of engagement should benefit decision making in the long run through greater stakeholder understanding of the science of future climate model projections, scenarios, the climate impacts sector models and the types of outputs that can be generated by each, along with the respective uncertainties at each step of the climate impacts assessment process.
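
    The model-chain handoff described above, in which crop-model yield changes become productivity inputs for an economic model, can be reduced to a toy sketch. Both functions are illustrative stand-ins with assumed numbers; EPIC and FASOM are far more detailed and are not reproduced here.

      # Minimal sketch of passing relative yield changes into a supply adjustment.
      def yield_change(baseline_yield: float, scenario_yield: float) -> float:
          """Relative change in simulated crop yield between baseline and scenario."""
          return (scenario_yield - baseline_yield) / baseline_yield

      def adjust_supply(baseline_supply: float, rel_yield_change: float, elasticity: float = 1.0) -> float:
          """Scale regional supply by the yield change (linear pass-through assumed)."""
          return baseline_supply * (1.0 + elasticity * rel_yield_change)

      # Example handoff for one crop/region under a mitigation scenario (numbers assumed):
      delta = yield_change(baseline_yield=9.5, scenario_yield=8.7)   # t/ha
      print(f"yield change = {delta:+.1%}, adjusted supply = {adjust_supply(1_000_000, delta):,.0f} t")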

  10. General tradeoff relations of quantum nonlocality in the Clauser–Horne–Shimony–Holt scenario

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Su, Hong-Yi, E-mail: hongyisu@chonnam.ac.kr; Chen, Jing-Ling; Centre for Quantum Technologies, National University of Singapore, 3 Science Drive 2, Singapore 117543

    2017-02-15

    General tradeoff relations present in nonlocal correlations of bipartite systems are studied, regardless of any specific quantum states and measurement directions. Extensions to multipartite scenarios are possible and very promising. In particular, Tsirelson's bound can be derived from these relations. The close connection with uncertainty relations is also presented and discussed. - Highlights: • Quantum violation of the CHSH inequality is found to satisfy tradeoff relations. • Tsirelson's bound for quantum mechanics follows directly from these tradeoffs. • Tradeoff relations shed new light on uncertainty relations in summation forms.
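
    For context on the highlighted result, the block below gives the standard textbook route to Tsirelson's bound from the CHSH operator; it is background only, not the specific tradeoff relations derived in the paper.

      % Standard derivation of Tsirelson's bound (background; not the paper's tradeoffs).
      \documentclass{article}
      \usepackage{amsmath,amssymb}
      \begin{document}
      For dichotomic observables $A_1,A_2$ on Alice's side and $B_1,B_2$ on Bob's side
      ($A_i^2=B_j^2=\mathbb{I}$), the CHSH operator is
      \[
        \mathcal{B} = A_1\otimes B_1 + A_1\otimes B_2 + A_2\otimes B_1 - A_2\otimes B_2 .
      \]
      A direct computation gives
      \[
        \mathcal{B}^2 = 4\,\mathbb{I}\otimes\mathbb{I} - [A_1,A_2]\otimes[B_1,B_2],
      \]
      and since $\|[A_1,A_2]\|\le 2$ and $\|[B_1,B_2]\|\le 2$, we have $\|\mathcal{B}^2\|\le 8$ and hence
      \[
        |\langle \mathcal{B}\rangle| \le 2\sqrt{2}\qquad\text{(Tsirelson's bound).}
      \]
      \end{document}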

  11. Revised estimates for direct-effect recreational jobs in the interior Columbia River basin.

    Treesearch

    Lisa K. Crone; Richard W. Haynes

    1999-01-01

    This paper reviews the methodology used to derive the original estimates for direct employment associated with recreation on Federal lands in the interior Columbia River basin (the basin), and details the changes in methodology and data used to derive new estimates. The new analysis resulted in an estimate of 77,655 direct-effect jobs associated with recreational...

  12. A social choice-based methodology for treated wastewater reuse in urban and suburban areas.

    PubMed

    Mahjouri, Najmeh; Pourmand, Ehsan

    2017-07-01

    Reusing treated wastewater to supply water demands such as landscape and agricultural irrigation in urban and suburban areas has become a major water supply approach, especially in regions struggling with water shortage. Because the available treated wastewater is insufficient to satisfy all water demands, conflicts may arise in allocating treated wastewater to water users. Since there is usually more than one decision maker and more than one criterion for measuring the impact of each water allocation scenario, effective tools are needed to combine individual preferences to reach a collective decision. In this paper, a new social choice (SC) method, which can consider indifference thresholds for decision makers, is proposed for evaluating and ranking treated wastewater and urban runoff allocation scenarios for water users in urban and suburban areas. Several SC methods, namely plurality voting, Borda count, pairwise comparisons, the Hare system, dictatorship, and approval voting, are applied for comparing and evaluating the results. Different scenarios are proposed for allocating treated wastewater and urban runoff to landscape irrigation, agricultural lands and artificial recharge of the aquifer in the Tehran metropolitan area, Iran. The main stakeholders rank the proposed scenarios based on their utilities using two different approaches. The proposed method ranks the scenarios based on the stakeholders' utilities and the scores they assigned to each scenario. Comparing the results of the proposed method with those of six different SC methods shows that the obtained rankings are largely consistent with social welfare.
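
    One of the voting rules listed above, the Borda count, can be sketched directly on stakeholder rankings of allocation scenarios. The stakeholders, scenarios and rankings below are illustrative assumptions, not the Tehran case-study inputs.

      # Minimal sketch of the Borda count over assumed stakeholder rankings.
      from collections import defaultdict

      # Each stakeholder ranks scenarios from most to least preferred (assumed data).
      rankings = {
          "water_utility": ["S1", "S3", "S2"],
          "agriculture":   ["S3", "S1", "S2"],
          "municipality":  ["S1", "S2", "S3"],
      }

      def borda(rankings: dict[str, list[str]]) -> dict[str, int]:
          scores: dict[str, int] = defaultdict(int)
          for ranked in rankings.values():
              n = len(ranked)
              for position, scenario in enumerate(ranked):
                  scores[scenario] += n - 1 - position   # top choice gets n-1 points
          return dict(scores)

      print(sorted(borda(rankings).items(), key=lambda kv: -kv[1]))
      # [('S1', 5), ('S3', 3), ('S2', 1)]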

  13. Integrating ecosystem services analysis into scenario planning practice: accounting for street tree benefits with i-Tree valuation in Central Texas.

    PubMed

    Hilde, Thomas; Paterson, Robert

    2014-12-15

    Scenario planning continues to gain momentum in the United States as an effective process for building consensus on long-range community plans and creating regional visions for the future. However, efforts to integrate more sophisticated information into the analytical framework to help identify important ecosystem services have lagged in practice. This is problematic because understanding the tradeoffs that land consumption patterns impose on ecological integrity is central to mitigating the environmental degradation caused by land use change and new development. In this paper we describe how an ecosystem services valuation model, i-Tree, was integrated into a mainstream scenario planning software tool, Envision Tomorrow, to assess the benefits of public street trees for alternative future development scenarios. The tool is then applied to development scenarios from the City of Hutto, TX, a Central Texas Sustainable Places Project demonstration community. The integrated tool represents a methodological improvement for scenario planning practice, offers a way to incorporate ecosystem services analysis into mainstream planning processes, and serves as an example of how open source software tools can expand the range of issues available for community and regional planning consideration, even in cases where community resources are limited. The tool also offers room for future improvements; feasible options include canopy analysis of various future land use typologies, as well as a generalized street tree model for broader U.S. application. Copyright © 2014 Elsevier Ltd. All rights reserved.
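
    The scenario accounting step described above can be sketched as multiplying per-tree annual benefit values (which in practice would come from i-Tree) by the street-tree counts of each development scenario. The benefit rates, species classes and tree counts below are illustrative assumptions, not i-Tree outputs or Envision Tomorrow data.

      # Minimal sketch of street-tree benefit totals per scenario (all values assumed).
      per_tree_annual_benefit = {   # dollars per tree per year (assumed)
          "live_oak_medium": 85.0,
          "cedar_elm_small": 40.0,
      }

      scenarios = {
          "trend":   {"live_oak_medium": 1200, "cedar_elm_small": 800},
          "compact": {"live_oak_medium": 2100, "cedar_elm_small": 1500},
      }

      for name, trees in scenarios.items():
          total = sum(per_tree_annual_benefit[sp] * count for sp, count in trees.items())
          print(f"{name}: ${total:,.0f} per year in street tree benefits")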

  14. A comparison between the example reference biosphere model ERB 2B and a process-based model: simulation of a natural release scenario.

    PubMed

    Almahayni, T

    2014-12-01

    The BIOMASS methodology was developed with the objective of constructing defensible assessment biospheres for assessing the potential radiological impacts of radioactive waste repositories. To this end, a set of Example Reference Biospheres were developed to demonstrate the use of the methodology and to provide an international point of reference. In this paper, the performance of the Example Reference Biosphere model ERB 2B associated with the natural release scenario, discharge of contaminated groundwater to the surface environment, was evaluated by comparing its long-term projections of radionuclide dynamics and distribution in a soil-plant system to those of a process-based, transient advection-dispersion (AD) model. The models were parametrised with data characteristic of a typical rainfed winter wheat crop grown on a sandy loam soil under temperate climate conditions. Three safety-relevant radionuclides, (99)Tc, (129)I and (237)Np, with different degrees of sorption, were selected for the study. Although the models were driven by the same hydraulic (soil moisture content and water fluxes) and radiological (Kd) input data, their projections were remarkably different. On one hand, both models were able to capture short- and long-term variation in activity concentration in the subsoil compartment. On the other hand, the Reference Biosphere model did not project any radionuclide accumulation in the topsoil and crop compartments. This behaviour would underestimate the radiological exposure under natural release scenarios. The results highlight the potential role deep roots play in soil-to-plant transfer under a natural release scenario where radionuclides are released into the subsoil. When considering the relative activity and root depth profiles within the soil column, much of the radioactivity was taken up into the crop from the subsoil compartment. Further improvements were suggested to address the limitations of the Reference Biosphere model presented in this paper. Copyright © 2014 Elsevier Ltd. All rights reserved.
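
    A much-reduced sketch of the compartment-model style of calculation discussed above: a subsoil and a topsoil inventory exchange activity via first-order transfer coefficients, with a constant release into the subsoil. The rate constants, source term and time step are illustrative assumptions, not the ERB 2B parameterisation.

      # Minimal two-compartment (subsoil/topsoil) transfer sketch; all rates assumed.
      def simulate(years: int, source_to_subsoil: float,
                   k_up: float, k_down: float, k_loss: float):
          """Yearly explicit update of radionuclide inventories (arbitrary activity units)."""
          subsoil, topsoil = 0.0, 0.0
          for _ in range(years):
              up   = k_up * subsoil       # upward transfer towards the rooting zone
              down = k_down * topsoil     # downward transfer (leaching)
              subsoil += source_to_subsoil - up + down
              topsoil += up - down - k_loss * topsoil   # losses: decay, harvest removal, erosion
          return subsoil, topsoil

      print(simulate(years=200, source_to_subsoil=1000.0, k_up=0.02, k_down=0.05, k_loss=0.01))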

  15. Regional Risk Assessment for climate change impacts on coastal aquifers.

    PubMed

    Iyalomhe, F; Rizzi, J; Pasini, S; Torresan, S; Critto, A; Marcomini, A

    2015-12-15

    Coastal aquifers have been identified as particularly vulnerable to impacts on water quantity and quality due to the high density of socio-economic activities and human assets in coastal regions and to the projected rising sea levels, contributing to the process of saltwater intrusion. This paper proposes a Regional Risk Assessment (RRA) methodology integrated with a chain of numerical models to evaluate potential climate change-related impacts on coastal aquifers and linked natural and human systems (i.e., wells, river, agricultural areas, lakes, forests and semi-natural environments). The RRA methodology employs Multi Criteria Decision Analysis methods and Geographic Information Systems functionalities to integrate heterogeneous spatial data on hazard, susceptibility and risk for saltwater intrusion and groundwater level variation. The proposed approach was applied on the Esino River basin (Italy) using future climate hazard scenarios based on a chain of climate, hydrological, hydraulic and groundwater system models running at different spatial scales. Models were forced with the IPCC SRES A1B emission scenario for the period 2071-2100 over four seasons (i.e., winter, spring, summer and autumn). Results indicate that in future seasons, climate change will cause few impacts on the lower Esino River valley. Groundwater level decrease will have limited effects: agricultural areas, forests and semi-natural environments will be at risk only in a region close to the coastline which covers less than 5% of the total surface of the considered receptors; less than 3.5% of the wells will be exposed in the worst scenario. Saltwater intrusion impact in future scenarios will be restricted to a narrow region close to the coastline (only few hundred meters), and thus it is expected to have very limited effects on the Esino coastal aquifer with no consequences on the considered natural and human systems. Copyright © 2015 Elsevier B.V. All rights reserved.
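
    The hazard-susceptibility combination step described above can be sketched with ordinal classes and a simple lookup matrix per receptor. The class breaks and the matrix itself are illustrative assumptions, not the RRA scoring used in the Esino study.

      # Minimal sketch of combining hazard and susceptibility classes into risk classes.
      RISK_MATRIX = {              # (hazard class, susceptibility class) -> risk class (assumed)
          (1, 1): "low",    (1, 2): "low",    (1, 3): "medium",
          (2, 1): "low",    (2, 2): "medium", (2, 3): "high",
          (3, 1): "medium", (3, 2): "high",   (3, 3): "high",
      }

      def classify(score: float) -> int:
          """Map a normalised [0, 1] score to an ordinal class 1-3 (breaks assumed)."""
          return 1 if score < 0.33 else 2 if score < 0.66 else 3

      receptors = {"wells": (0.7, 0.5), "agricultural_areas": (0.2, 0.8), "forests": (0.1, 0.3)}
      for name, (hazard, susceptibility) in receptors.items():
          print(name, RISK_MATRIX[(classify(hazard), classify(susceptibility))])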

  16. Understanding and managing disaster evacuation on a transportation network.

    PubMed

    Lambert, James H; Parlak, Ayse I; Zhou, Qian; Miller, John S; Fontaine, Michael D; Guterbock, Thomas M; Clements, Janet L; Thekdi, Shital A

    2013-01-01

    Uncertain population behaviors in a regional emergency could potentially harm the performance of the region's transportation system and the subsequent evacuation effort. The integration of behavioral survey data with travel demand modeling enables an assessment of transportation system performance and the identification of operational and public health countermeasures. This paper analyzes transportation system demand and system performance for emergency management in three disaster scenarios. A two-step methodology first estimates the number of trips evacuating the region, thereby capturing behavioral aspects in a scientifically defensible manner based on survey results, and second, assigns these trips to a regional highway network using geographic information systems software, thereby making the methodology transferable to other locations. Performance measures are generated for each scenario, including maps of volume-to-capacity ratios, geographic contours of evacuation time from the center of the region, and link-specific metrics such as weighted average speed and traffic volume. The methods are demonstrated on a 600-segment transportation network in Washington, DC (USA) and are applied to three scenarios involving attacks from radiological dispersion devices (e.g., dirty bombs). The results suggest that: (1) a single detonation would degrade transportation system performance two to three times more than what occurs during a typical weekday afternoon peak hour; (2) volume on several critical arterials within the network would exceed capacity in the represented scenarios; and (3) the resulting travel times to reach intended destinations imply that unaided evacuation is impractical. These results assist decisions made by two categories of emergency responders: (1) transportation managers who provide traveler information and make operational adjustments to improve the network (e.g., signal retiming), and (2) public health officials who maintain shelters, food and water stations, or first aid centers along evacuation routes. This approach may also interest decision-makers who are in a position to influence the allocation of emergency resources, including healthcare providers, infrastructure owners, transit providers, and regional or local planning staff. Copyright © 2012 Elsevier Ltd. All rights reserved.
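
    The link-level performance measures described above can be sketched by turning assigned evacuation volumes into volume-to-capacity ratios and congested travel times. The standard Bureau of Public Roads (BPR) volume-delay function is used here as a stand-in for whatever volume-delay model the study's travel demand software applies; link attributes and volumes are illustrative assumptions, not the Washington, DC network data.

      # Minimal sketch of V/C ratios and BPR travel times for assumed links.
      links = [   # (name, assigned volume [veh/h], capacity [veh/h], free-flow time [min])
          ("arterial_NW", 5400, 3600, 12.0),
          ("bridge_S",    2800, 3200,  8.0),
          ("freeway_E",   7600, 6000, 15.0),
      ]

      def bpr_time(free_flow_time: float, vc: float, alpha: float = 0.15, beta: float = 4.0) -> float:
          """Bureau of Public Roads volume-delay function."""
          return free_flow_time * (1.0 + alpha * vc ** beta)

      for name, volume, capacity, t0 in links:
          vc = volume / capacity
          print(f"{name}: V/C = {vc:.2f}, travel time = {bpr_time(t0, vc):.1f} min")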

  17. Forecasting productivity in forest fire suppression operations: A methodological approach based on suppression difficulty analysis and documented experience

    Treesearch

    Francisco Rodríguez y Silva; Armando González-Cabán

    2013-01-01

    The abandonment of land, the high energy load generated and accumulated by vegetation cover, climate change, and interface scenarios in Mediterranean forest ecosystems demand serious attention to forest fire conditions. This is particularly true when dealing with the budget requirements for undertaking protection programs related to the state of current and...

  18. Matrix Game Methodology - Support to V2010 Olympic Marine Security Planners

    DTIC Science & Technology

    2011-02-01

    OMOC was called the Integrated Safety/Security Matrix Game – Marine III, and was held 16-17 June 2009. This was the most extensive and complex of... [The remainder of this excerpt consists of table-of-contents fragments: Protection Matrix Game – Marine Two; Integrated Safety/Security Matrix Game – Marine III; Integrated Safety/Security Matrix Game – Marine III Scenarios; ISSMG Marine III – Team Groupings.]

  19. A Guide for Setting the Cut-Scores to Minimize Weighted Classification Errors in Test Batteries

    ERIC Educational Resources Information Center

    Grabovsky, Irina; Wainer, Howard

    2017-01-01

    In this article, we extend the methodology of the Cut-Score Operating Function that we introduced previously and apply it to a testing scenario with multiple independent components and different testing policies. We derive analytically the overall classification error rate for a test battery under the policy when several retakes are allowed for…
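
    In the spirit of the operating-function idea described above (but not the authors' formulation), the sketch below chooses a cut-score to minimize a weighted sum of false-positive and false-negative classification rates under an assumed score model. The ability distribution, measurement-error standard deviation and error weights are all illustrative assumptions.

      # Minimal sketch of minimizing weighted classification error over cut-scores.
      import random

      random.seed(0)
      PASS_STANDARD = 0.0                                     # true ability at the pass/fail standard (assumed)

      abilities = [random.gauss(0, 1.0) for _ in range(5000)]               # true abilities (assumed)
      observed  = [(theta, theta + random.gauss(0, 0.4)) for theta in abilities]  # noisy test scores (assumed)

      def weighted_error(cut: float, w_fp: float = 1.0, w_fn: float = 2.0) -> float:
          fp = sum(1 for theta, x in observed if theta <  PASS_STANDARD and x >= cut)
          fn = sum(1 for theta, x in observed if theta >= PASS_STANDARD and x <  cut)
          return (w_fp * fp + w_fn * fn) / len(observed)

      cuts = [c / 100 for c in range(-100, 101, 5)]
      best = min(cuts, key=weighted_error)
      print(f"best cut-score = {best:.2f}, weighted error = {weighted_error(best):.3f}")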

  20. A Methodology to Assess UrbanSim Scenarios

    DTIC Science & Technology

    2012-09-01

    Turn-based strategy games and simulations are vital tools for military... The excerpt also mentions augmented reality simulations, increased automation and artificial intelligence simulation, and massively multiplayer online games (MMOG), among... [Acronym list fragments: LOE – Line of Effort; MMOG – Massively Multiplayer Online Game; MC3 – Maneuver Captain's Career Course; MSCCC – Maneuver Support...]
