Sample records for quantitative hazard assessment

  1. Integrating expert opinion with modelling for quantitative multi-hazard risk assessment in the Eastern Italian Alps

    NASA Astrophysics Data System (ADS)

    Chen, Lixia; van Westen, Cees J.; Hussin, Haydar; Ciurean, Roxana L.; Turkington, Thea; Chavarro-Rincon, Diana; Shrestha, Dhruba P.

    2016-11-01

    Extreme rainfall events are the main triggers of hydro-meteorological hazards in mountainous areas, where development is often constrained by the limited space suitable for construction. In these areas, hazard and risk assessments are fundamental for risk mitigation, especially for preventive planning, risk communication and emergency preparedness. Multi-hazard risk assessment in mountainous areas at local and regional scales remains a major challenge because of the lack of data on past events and causal factors, and on the interactions between different types of hazards. This lack of data leads to a high level of uncertainty in the application of quantitative methods for hazard and risk assessment. Therefore, a systematic approach is required to combine these quantitative methods with expert-based assumptions and decisions. In this study, a quantitative multi-hazard risk assessment was carried out in the Fella River valley in the north-eastern Italian Alps, an area prone to debris flows and floods. The main steps include data collection and development of inventory maps, definition of hazard scenarios, hazard assessment in terms of temporal and spatial probability calculation and intensity modelling, elements-at-risk mapping, estimation of asset values and the number of people, physical vulnerability assessment, the generation of risk curves and annual risk calculation. To compare the risk for each type of hazard, risk curves were generated for debris flows, river floods and flash floods. Uncertainties were expressed as minimum, average and maximum values of temporal and spatial probability, replacement costs of assets, population numbers, and physical vulnerability, resulting in minimum, average and maximum risk curves. To validate this approach, a back analysis was conducted using the extreme hydro-meteorological event that occurred in August 2003 in the Fella River valley. The results show good performance when compared to the historical damage reports.
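The risk-curve construction described in this record can be sketched in a few lines. This is a minimal illustration of the general idea (scenario probability × spatial probability × exposed value × vulnerability, with min/average/max vulnerability giving three curves), not the study's actual model; all scenario numbers and values below are invented for the example.

```python
# Sketch of scenario-based risk curves with minimum/average/maximum
# uncertainty bounds, loosely following the steps described above.
# All probabilities and values are illustrative assumptions.

scenarios = [
    # annual temporal probability, spatial probability, exposed value,
    # and (min, average, max) physical vulnerability
    {"p_temporal": 0.10, "p_spatial": 0.30, "value": 2.0e6, "vuln": (0.1, 0.3, 0.6)},
    {"p_temporal": 0.02, "p_spatial": 0.50, "value": 5.0e6, "vuln": (0.2, 0.5, 0.9)},
    {"p_temporal": 0.01, "p_spatial": 0.70, "value": 8.0e6, "vuln": (0.4, 0.7, 1.0)},
]

def risk_curve(scenarios, which):
    """Return (annual probability, loss) pairs sorted by probability.

    `which` selects the minimum (0), average (1) or maximum (2)
    vulnerability, giving the corresponding risk curve.
    """
    points = []
    for s in scenarios:
        loss = s["p_spatial"] * s["value"] * s["vuln"][which]
        points.append((s["p_temporal"], loss))
    return sorted(points, reverse=True)

def annual_risk(curve):
    """Annualised risk: expected loss summed over all scenarios."""
    return sum(p * loss for p, loss in curve)

low, avg, high = (risk_curve(scenarios, i) for i in range(3))
```

Plotting loss against annual probability for each curve gives the familiar risk-curve picture, with the min and max curves bracketing the average.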

  2. Genetic toxicology at the crossroads-from qualitative hazard evaluation to quantitative risk assessment.

    PubMed

    White, Paul A; Johnson, George E

    2016-05-01

    Applied genetic toxicology is undergoing a transition from qualitative hazard identification to quantitative dose-response analysis and risk assessment. To facilitate this change, the Health and Environmental Sciences Institute (HESI) Genetic Toxicology Technical Committee (GTTC) sponsored a workshop held in Lancaster, UK on July 10-11, 2014. The event included invited speakers from several institutions, and the content was divided into three themes: 1) Point-of-departure Metrics for Quantitative Dose-Response Analysis in Genetic Toxicology; 2) Measurement and Estimation of Exposures for Better Extrapolation to Humans; and 3) The Use of Quantitative Approaches in Genetic Toxicology for Human Health Risk Assessment (HHRA). A host of pertinent issues were discussed relating to the use of in vitro and in vivo dose-response data, the development of methods for in vitro to in vivo extrapolation, and approaches to using in vivo dose-response data to determine human exposure limits for regulatory evaluations and decision-making. This Special Issue, which was inspired by the workshop, contains a series of papers that collectively address topics related to the aforementioned themes. The Issue includes contributions that collectively evaluate, describe and discuss in silico, in vitro, in vivo and statistical approaches that are facilitating the shift from qualitative hazard evaluation to quantitative risk assessment. The use and application of the benchmark dose approach was a central theme in many of the workshop presentations and discussions, and the Special Issue includes several contributions that outline novel applications for the analysis and interpretation of genetic toxicity data. Although the contents of the Special Issue constitute an important step towards the adoption of quantitative methods for regulatory assessment of genetic toxicity, formal acceptance of quantitative methods for HHRA and regulatory decision-making will require consensus regarding the

  3. Databases applicable to quantitative hazard/risk assessment-Towards a predictive systems toxicology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Waters, Michael; Jackson, Marcus

    2008-11-15

    The Workshop on The Power of Aggregated Toxicity Data addressed the requirement for distributed databases to support quantitative hazard and risk assessment. The authors have conceived and constructed, with federal support, several databases that have been used in hazard identification and risk assessment. The first of these databases, the EPA Gene-Tox Database, was developed for the EPA Office of Toxic Substances by the Oak Ridge National Laboratory, and is currently hosted by the National Library of Medicine. This public resource is based on the collaborative evaluation, by government, academia, and industry, of short-term tests for the detection of mutagens and presumptive carcinogens. The two-phased evaluation process resulted in more than 50 peer-reviewed publications on test system performance and a qualitative database on thousands of chemicals. Subsequently, the graphic and quantitative EPA/IARC Genetic Activity Profile (GAP) Database was developed in collaboration with the International Agency for Research on Cancer (IARC). A chemical database driven by consideration of the lowest effective dose, GAP has served IARC for many years in support of hazard classification of potential human carcinogens. The Toxicological Activity Profile (TAP) prototype database was patterned after GAP and utilized acute, subchronic, and chronic data from the Office of Air Quality Planning and Standards. TAP demonstrated the flexibility of the GAP format for air toxics, water pollutants and other environmental agents. The GAP format was also applied to developmental toxicants and was modified to represent quantitative results from the rodent carcinogen bioassay. More recently, the authors have constructed: 1) the NIEHS Genetic Alterations in Cancer (GAC) Database, which quantifies specific mutations found in cancers induced by environmental agents, and 2) the NIEHS Chemical Effects in Biological Systems (CEBS) Knowledgebase, which integrates genomic and other biological data

  4. Probabilistic Volcanic Hazard and Risk Assessment

    NASA Astrophysics Data System (ADS)

    Marzocchi, W.; Neri, A.; Newhall, C. G.; Papale, P.

    2007-08-01

    Quantifying Long- and Short-Term Volcanic Hazard: Building Up a Common Strategy for Italian Volcanoes, Erice Italy, 8 November 2006 The term ``hazard'' can lead to some misunderstanding. In English, hazard has the generic meaning ``potential source of danger,'' but for more than 30 years [e.g., Fournier d'Albe, 1979], hazard has been also used in a more quantitative way, that reads, ``the probability of a certain hazardous event in a specific time-space window.'' However, many volcanologists still use ``hazard'' and ``volcanic hazard'' in purely descriptive and subjective ways. A recent meeting held in November 2006 at Erice, Italy, entitled ``Quantifying Long- and Short-Term Volcanic Hazard: Building up a Common Strategy for Italian Volcanoes'' (http://www.bo.ingv.it/erice2006) concluded that a more suitable term for the estimation of quantitative hazard is ``probabilistic volcanic hazard assessment'' (PVHA).

  5. Supplemental Hazard Analysis and Risk Assessment - Hydrotreater

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lowry, Peter P.; Wagner, Katie A.

    A supplemental hazard analysis was conducted and a quantitative risk assessment was performed in response to an independent review comment received by the Pacific Northwest National Laboratory (PNNL) from the U.S. Department of Energy Pacific Northwest Field Office (PNSO) against the Hydrotreater/Distillation Column Hazard Analysis Report issued in April 2013. The supplemental analysis used the hazardous conditions documented by the previous April 2013 report as a basis. The conditions were screened and grouped for the purpose of identifying whether additional prudent, practical hazard controls could be identified, using a quantitative risk evaluation to assess the adequacy of the controls and establish a lower level of concern for the likelihood of potential serious accidents. Calculations were performed to support conclusions where necessary.

  6. Landslide hazard assessment: recent trends and techniques.

    PubMed

    Pardeshi, Sudhakar D; Autade, Sumant E; Pardeshi, Suchitra S

    2013-01-01

    Landslide hazard assessment is an important step towards landslide hazard and risk management. There are several methods of Landslide Hazard Zonation (LHZ), viz. heuristic, semi-quantitative, quantitative, probabilistic and multi-criteria decision-making processes. However, no single method is universally accepted for effective assessment of landslide hazards. In recent years, several attempts have been made to apply different methods of LHZ and to compare results in order to find the best-suited model. This paper presents a review of research on landslide hazard mapping published in recent years. Advanced multivariate techniques have proved effective in spatial prediction of landslides with a high degree of accuracy. Physical process-based models also perform well in LHZ mapping, even in areas with a poor database. Multi-criteria decision-making approaches also play a significant role in determining the relative importance of landslide causative factors in the slope instability process. Remote sensing and Geographical Information Systems (GIS) are powerful tools for assessing landslide hazards and have been used extensively in landslide research over the last decade. Aerial photographs and high-resolution satellite data are useful for detecting, mapping and monitoring landslide processes. GIS-based LHZ models help not only to map and monitor landslides but also to predict future slope failures. Advances in geospatial technologies have opened the doors for detailed and accurate assessment of landslide hazards.

  7. Using Qualitative Hazard Analysis to Guide Quantitative Safety Analysis

    NASA Technical Reports Server (NTRS)

    Shortle, J. F.; Allocco, M.

    2005-01-01

    Quantitative methods can be beneficial in many types of safety investigations. However, there are many difficulties in using quantitative methods. For example, there may be little relevant data available. This paper proposes a framework for using qualitative hazard analysis to prioritize the hazard scenarios most suitable for quantitative analysis. The framework first categorizes hazard scenarios by severity and likelihood. We then propose another metric, "modeling difficulty," that describes the complexity of modeling a given hazard scenario quantitatively. The combined metrics of severity, likelihood, and modeling difficulty help to prioritize hazard scenarios for which quantitative analysis should be applied. We have applied this methodology to proposed concepts of operations for reduced wake separation for airplane operations at closely spaced parallel runways.
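The prioritization idea in this record (severity and likelihood raise a scenario's priority for quantitative analysis, modeling difficulty lowers it) can be sketched as follows. The scenarios, the 1-5 scales, and the combination rule are assumptions for illustration; the paper does not publish this exact formula.

```python
# Sketch of prioritising hazard scenarios by severity, likelihood, and
# "modeling difficulty" (all on an assumed 1 = low .. 5 = high scale).
# Scenario names and scores are invented for the example.

scenarios = {
    "wake encounter on parallel approach": {"severity": 5, "likelihood": 2, "difficulty": 2},
    "runway incursion":                    {"severity": 5, "likelihood": 1, "difficulty": 4},
    "communication breakdown":             {"severity": 3, "likelihood": 3, "difficulty": 5},
}

def priority(s):
    # High severity and likelihood raise priority; high modeling
    # difficulty lowers it, since quantitative analysis is less
    # tractable for such scenarios.
    return s["severity"] * s["likelihood"] / s["difficulty"]

ranked = sorted(scenarios, key=lambda name: priority(scenarios[name]), reverse=True)
```

Scenarios at the top of `ranked` are severe, likely, and tractable to model, so they are the natural first candidates for quantitative safety analysis.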

  8. GENETIC ACTIVITY PROFILES AND HAZARD ASSESSMENT

    EPA Science Inventory

    A methodology has been developed to display and evaluate quantitative information from multiple tests on genetic toxicants for purposes of hazard/risk assessment. Dose information is collected from the open literature: either the lowest effective dose (LED) or the highest ineffective do...

  9. Quantitative estimation of time-variable earthquake hazard by using fuzzy set theory

    NASA Astrophysics Data System (ADS)

    Deyi, Feng; Ichikawa, M.

    1989-11-01

    In this paper, the various methods of fuzzy set theory, called fuzzy mathematics, have been applied to the quantitative estimation of the time-variable earthquake hazard. The results obtained consist of the following. (1) Quantitative estimation of the earthquake hazard on the basis of seismicity data. By using some methods of fuzzy mathematics, seismicity patterns before large earthquakes can be studied more clearly and more quantitatively, highly active periods in a given region and quiet periods of seismic activity before large earthquakes can be recognized, similarities in temporal variation of seismic activity and seismic gaps can be examined and, on the other hand, the time-variable earthquake hazard can be assessed directly on the basis of a series of statistical indices of seismicity. Two methods of fuzzy clustering analysis, the method of fuzzy similarity, and the direct method of fuzzy pattern recognition have been studied in particular. One method of fuzzy clustering analysis is based on fuzzy netting, and the other is based on the fuzzy equivalent relation. (2) Quantitative estimation of the earthquake hazard on the basis of observational data for different precursors. The direct method of fuzzy pattern recognition has been applied to research on earthquake precursors of different kinds. On the basis of the temporal and spatial characteristics of recognized precursors, earthquake hazards in different terms can be estimated. This paper mainly deals with medium-short-term precursors observed in Japan and China.
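The fuzzy-similarity method mentioned in this record compares membership vectors describing seismicity patterns. A common textbook definition (the min/max similarity) can be sketched as follows; the index values are invented, and this is a generic illustration rather than the paper's specific formulation.

```python
# Sketch of fuzzy similarity between two seismicity-index sequences,
# using the classic min/max definition on membership values in [0, 1].
# The example patterns are illustrative assumptions.

def fuzzy_similarity(a, b):
    """Min/max similarity of two membership vectors in [0, 1]."""
    num = sum(min(x, y) for x, y in zip(a, b))
    den = sum(max(x, y) for x, y in zip(a, b))
    return num / den if den else 1.0

# Normalised seismicity indices (e.g. event counts scaled to [0, 1])
# observed before two large earthquakes:
pattern_before_quake_1 = [0.2, 0.6, 0.9, 0.3]
pattern_before_quake_2 = [0.1, 0.5, 0.8, 0.4]

s = fuzzy_similarity(pattern_before_quake_1, pattern_before_quake_2)
```

A value of `s` close to 1 indicates that the two pre-earthquake activity patterns are similar, which is the kind of evidence the fuzzy approach uses to recognise recurring precursory behaviour.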

  10. Assessment of the Casualty Risk of Multiple Meteorological Hazards in China

    PubMed Central

    Xu, Wei; Zhuo, Li; Zheng, Jing; Ge, Yi; Gu, Zhihui; Tian, Yugang

    2016-01-01

    A study of the frequency, intensity, and risk of extreme climatic events or natural hazards is important for assessing the impacts of climate change. Many models have been developed to assess the risk of multiple hazards; however, most of the existing approaches can only model the relative levels of risk. This paper reports the development of a method for the quantitative assessment of the risk of multiple hazards based on information diffusion. This method was used to assess the risks of loss of human lives from 11 types of meteorological hazards in China at the prefectural and provincial levels. Risk curves of multiple hazards were obtained for each province and the risks of 10-year, 20-year, 50-year, and 100-year return periods were mapped. The results show that the provinces (municipalities, autonomous regions) in southeastern China are at higher risk of multiple meteorological hazards as a result of their geographical location and topography. The results of this study can be used as references for the management of meteorological disasters in China. The model can be used to quantitatively calculate the risks of casualty, direct economic losses, building collapse, and agricultural losses for any hazards at different spatial scales. PMID:26901210
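The information-diffusion technique this record relies on turns a small sample of observed losses into a probability distribution by spreading each observation over a discrete grid with a Gaussian kernel. A minimal sketch of the normal diffusion estimator follows; the observations, grid, and bandwidth are invented for illustration and are not the paper's data.

```python
# Sketch of the normal information-diffusion estimator: each observed
# loss is "diffused" over a monitoring grid with a Gaussian kernel,
# yielding a probability distribution from a small sample. The sample
# values and the bandwidth choice are illustrative assumptions.
import math

observations = [12, 30, 7, 55, 21]     # e.g. annual casualties
grid = list(range(0, 101, 5))          # discrete monitoring points
h = 10.0                               # diffusion coefficient (bandwidth)

def diffuse(y, u, h):
    return math.exp(-((y - u) ** 2) / (2 * h * h)) / (h * math.sqrt(2 * math.pi))

# Diffuse each observation, normalising so each contributes unit mass.
density = [0.0] * len(grid)
for y in observations:
    weights = [diffuse(y, u, h) for u in grid]
    total = sum(weights)
    for j, w in enumerate(weights):
        density[j] += w / total
density = [d / len(observations) for d in density]  # probabilities over grid

def exceedance(threshold):
    """Probability that the loss reaches or exceeds the threshold."""
    return sum(p for u, p in zip(grid, density) if u >= threshold)

# Approximate return period for losses of at least 50:
p50 = exceedance(50)
return_period = 1.0 / p50 if p50 else float("inf")
```

Evaluating `exceedance` at a range of thresholds gives exactly the kind of risk curve (and 10/20/50/100-year return-period estimates) described in the abstract.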

  11. Assessment of the Casualty Risk of Multiple Meteorological Hazards in China.

    PubMed

    Xu, Wei; Zhuo, Li; Zheng, Jing; Ge, Yi; Gu, Zhihui; Tian, Yugang

    2016-02-17

    A study of the frequency, intensity, and risk of extreme climatic events or natural hazards is important for assessing the impacts of climate change. Many models have been developed to assess the risk of multiple hazards; however, most of the existing approaches can only model the relative levels of risk. This paper reports the development of a method for the quantitative assessment of the risk of multiple hazards based on information diffusion. This method was used to assess the risks of loss of human lives from 11 types of meteorological hazards in China at the prefectural and provincial levels. Risk curves of multiple hazards were obtained for each province and the risks of 10-year, 20-year, 50-year, and 100-year return periods were mapped. The results show that the provinces (municipalities, autonomous regions) in southeastern China are at higher risk of multiple meteorological hazards as a result of their geographical location and topography. The results of this study can be used as references for the management of meteorological disasters in China. The model can be used to quantitatively calculate the risks of casualty, direct economic losses, building collapse, and agricultural losses for any hazards at different spatial scales.

  12. Nanomaterials, and Occupational Health and Safety—A Literature Review About Control Banding and a Semi-Quantitative Method Proposed for Hazard Assessment.

    NASA Astrophysics Data System (ADS)

    Dimou, Kaotar; Emond, Claude

    2017-06-01

    In recent decades, the control banding (CB) approach has been recognised as a hazard assessment methodology because of its increased importance in the occupational safety, health and hygiene (OSHH) industry. According to the American Industrial Hygiene Association, this approach originates from the pharmaceutical industry in the United Kingdom. The aim of the CB approach is to protect the more than 90% (or approximately 2.7 billion) of the world's workers who do not have access to OSHH professionals and traditional quantitative risk assessment methods. In other words, CB is a qualitative or semi-quantitative tool designed to prevent occupational accidents by controlling worker exposures to potentially hazardous chemicals in the absence of comprehensive toxicological and exposure data. These criteria apply closely to the development and production of engineered nanomaterials (ENMs). Considering the significant lack of scientific knowledge about the work-related health risks posed by ENMs, CB is, in general, appropriate for these issues. Currently, CB can be adapted to the specificities of ENMs; hundreds of nanotechnology products containing ENMs are already on the market. In this context, this qualitative or semi-quantitative approach appears to be relevant for characterising and quantifying the degree of physico-chemical and biological reactivities of ENMs, leading towards better control of human health effects and the safe handling of ENMs in workplaces. The need to better understand the CB approach is important for further managing the risks related to handling hazardous substances, such as ENMs, without established occupational exposure limits. In recent years, this topic has garnered much interest, including discussions in many technical papers. Several CB models have been developed, and many countries have created their own nano-specific CB instruments. The aims of this research were to perform a literature review about CBs, to classify the main

  13. Assessing the risk posed by natural hazards to infrastructures

    NASA Astrophysics Data System (ADS)

    Eidsvig, Unni Marie K.; Kristensen, Krister; Vidar Vangelsten, Bjørn

    2017-03-01

    This paper proposes a model for assessing the risk posed by natural hazards to infrastructures, with a focus on the indirect losses and loss of stability for the population relying on the infrastructure. The model prescribes a three-level analysis with increasing level of detail, moving from qualitative to quantitative analysis. The focus is on a methodology for semi-quantitative analyses to be performed at the second level. The purpose of this type of analysis is to perform a screening of the scenarios of natural hazards threatening the infrastructures, identifying the most critical scenarios and investigating the need for further analyses (third level). The proposed semi-quantitative methodology considers the frequency of the natural hazard, different aspects of vulnerability, including the physical vulnerability of the infrastructure itself, and the societal dependency on the infrastructure. An indicator-based approach is applied, ranking the indicators on a relative scale according to pre-defined ranking criteria. The proposed indicators, which characterise conditions that influence the probability of an infrastructure malfunctioning caused by a natural event, are defined as (1) robustness and buffer capacity, (2) level of protection, (3) quality/level of maintenance and renewal, (4) adaptability and quality of operational procedures and (5) transparency/complexity/degree of coupling. Further indicators describe conditions influencing the socio-economic consequences of the infrastructure malfunctioning, such as (1) redundancy and/or substitution, (2) cascading effects and dependencies, (3) preparedness and (4) early warning, emergency response and measures. The aggregated risk estimate is a combination of the semi-quantitative vulnerability indicators, as well as quantitative estimates of the frequency of the natural hazard, the potential duration of the infrastructure malfunctioning (e.g. 
depending on the required restoration effort) and the number of users of

  14. Threat Assessment of Hazardous Materials Transportation in Aircraft Cargo Compartments.

    DOT National Transportation Integrated Search

    1999-12-01

    The Volpe National Transportation Systems Center of the U.S. Department of Transportation's (DOT's) Research and Special Programs Administration (RSPA) has conducted a quantitative threat assessment for RSPA's Office of Hazardous Materials Safety (OH...

  15. Assessing the risk posed by natural hazards to infrastructures

    NASA Astrophysics Data System (ADS)

    Eidsvig, Unni; Kristensen, Krister; Vidar Vangelsten, Bjørn

    2015-04-01

    Modern society is increasingly dependent on infrastructures to maintain its function, and disruption in one of the infrastructure systems may have severe consequences. The Norwegian municipalities have, according to legislation, a duty to carry out a risk and vulnerability analysis and to plan and prepare for emergencies in a short- and long-term perspective. Vulnerability analysis of the infrastructures and their interdependencies is an important part of this analysis. This paper proposes a model for assessing the risk posed by natural hazards to infrastructures. The model prescribes a three-level analysis with increasing level of detail, moving from qualitative to quantitative analysis. This paper focuses on the second level, which consists of a semi-quantitative analysis. The purpose of this analysis is to perform a screening of the scenarios of natural hazards threatening the infrastructures identified in the level 1 analysis and to investigate the need for further analyses, i.e. level 3 quantitative analyses. The proposed level 2 analysis considers the frequency of the natural hazard and different aspects of vulnerability, including the physical vulnerability of the infrastructure itself and the societal dependency on the infrastructure. An indicator-based approach is applied, ranking the indicators on a relative scale. The proposed indicators characterize the robustness of the infrastructure and the importance of the infrastructure, as well as interdependencies between society and infrastructure affecting the potential for cascading effects. Each indicator is ranked on a 1-5 scale based on pre-defined ranking criteria. The aggregated risk estimate is a combination of the semi-quantitative vulnerability indicators, as well as quantitative estimates of the frequency of the natural hazard and the number of users of the infrastructure. 
Case studies for two Norwegian municipalities are presented, where risk to primary road, water supply and power network threatened by storm

  16. Transportation of Hazardous Materials Emergency Preparedness Hazards Assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blanchard, A.

    This report documents the Emergency Preparedness Hazards Assessment (EPHA) for the Transportation of Hazardous Materials (THM) at the Department of Energy (DOE) Savannah River Site (SRS). This hazards assessment is intended to identify and analyze those transportation hazards significant enough to warrant consideration in the SRS Emergency Management Program.

  17. Quantitative rock-fall hazard and risk assessment for Yosemite Valley, Yosemite National Park, California

    USGS Publications Warehouse

    Stock, Greg M.; Luco, Nicolas; Collins, Brian D.; Harp, Edwin L.; Reichenbach, Paola; Frankel, Kurt L.

    2014-01-01

    Rock falls are common in Yosemite Valley, California, posing substantial hazard and risk to the approximately four million annual visitors to Yosemite National Park. Rock falls in Yosemite Valley over the past few decades have damaged structures and caused injuries within developed regions located on or adjacent to talus slopes, highlighting the need for additional investigations into rock-fall hazard and risk. This assessment builds upon previous investigations of rock-fall hazard and risk in Yosemite Valley and focuses on hazard and risk to structures posed by relatively frequent fragmental-type rock falls as large as approximately 100,000 cubic meters in volume.

  18. Quantitative rock-fall hazard and risk assessment for Yosemite Valley, Yosemite National Park, California

    USGS Publications Warehouse

    Stock, Greg M.; Luco, Nicolas; Collins, Brian D.; Harp, Edwin L.; Reichenbach, Paola; Frankel, Kurt L.

    2012-01-01

    caused injuries within developed regions located on or adjacent to talus slopes, highlighting the need for additional investigations into rock-fall hazard and risk. This assessment builds upon previous investigations of rock fall hazard and risk in Yosemite Valley (Wieczorek et al., 1998, 1999; Guzzetti et al., 2003; Wieczorek et al., 2008), and focuses on hazard and risk to structures posed by relatively frequent fragmental-type rock falls (Evans and Hungr, 1999), up to approximately 100,000 m3 in volume.

  19. Rock Slide Risk Assessment: A Semi-Quantitative Approach

    NASA Astrophysics Data System (ADS)

    Duzgun, H. S. B.

    2009-04-01

    Rock slides can be better managed by systematic risk assessments. Any risk assessment methodology for rock slides involves identification of the rock slide risk components, which are hazard, elements at risk and vulnerability. For a quantitative or semi-quantitative risk assessment of rock slides, a mathematical value of the risk has to be computed and evaluated. Quantitative evaluation of risk for rock slides enables comparison of the computed risk with the risk of other natural and/or human-made hazards, and provides better decision support and easier communication for decision makers. A quantitative/semi-quantitative risk assessment procedure involves danger identification, hazard assessment, elements-at-risk identification, vulnerability assessment, risk computation and risk evaluation. On the other hand, the steps of this procedure require adaptation of existing implementation methods, or development of new ones, depending on the type of landslide, data availability, investigation scale and nature of the consequences. In this study, a generic semi-quantitative risk assessment (SQRA) procedure for rock slides is proposed. The procedure has five consecutive stages: data collection and analyses, hazard assessment, analyses of elements at risk, vulnerability assessment and risk assessment. The implementation of the procedure is illustrated for a single rock slide case, a rock slope in Norway. Rock slides from Mount Ramnefjell into Lake Loen are considered to be one of the major geohazards in Norway. Lake Loen is located in the inner part of Nordfjord in western Norway. Mount Ramnefjell is heavily jointed, leading to the formation of vertical rock slices with heights between 400-450 m and widths between 7-10 m. These slices threaten the settlements around the Loen Valley and the tourists visiting the fjord during the summer season, as released slides have the potential to create a tsunami. Several rock slides were recorded from Mount Ramnefjell between 1905 and 1950. 
Among them

  20. An integrated environmental modeling framework for performing Quantitative Microbial Risk Assessments

    EPA Science Inventory

    Standardized methods are often used to assess the likelihood of a human-health effect from exposure to a specified hazard, and inform opinions and decisions about risk management and communication. A Quantitative Microbial Risk Assessment (QMRA) is specifically adapted to detail ...

  1. An integrated environmental modeling framework for performing quantitative microbial risk assessments

    USDA-ARS?s Scientific Manuscript database

    Standardized methods are often used to assess the likelihood of a human-health effect from exposure to a specified hazard, and inform opinions and decisions about risk management and communication. A Quantitative Microbial Risk Assessment (QMRA) is specifically adapted to detail potential human-heal...

  2. Aiding alternatives assessment with an uncertainty-tolerant hazard scoring method.

    PubMed

    Faludi, Jeremy; Hoang, Tina; Gorman, Patrick; Mulvihill, Martin

    2016-11-01

    This research developed a single-score system to simplify and clarify decision-making in chemical alternatives assessment, accounting for uncertainty. Today, assessing alternatives to hazardous constituent chemicals is a difficult task: rather than comparing alternatives by a single definitive score, many independent toxicological variables must be considered at once, and data gaps are rampant. Thus, most hazard assessments are comprehensible only to toxicologists, but business leaders and politicians need simple scores to make decisions. In addition, they must balance hazard against other considerations, such as product functionality, and they must be aware of the high degrees of uncertainty in chemical hazard data. This research proposes a transparent, reproducible method to translate eighteen hazard endpoints into a simple numeric score with quantified uncertainty, alongside a similar product functionality score, to aid decisions between alternative products. The scoring method uses Clean Production Action's GreenScreen as a guide, but with a different method of score aggregation. It provides finer differentiation between scores than GreenScreen's four-point scale, and it displays uncertainty quantitatively in the final score. Displaying uncertainty also illustrates which alternatives are early in product development versus well-defined commercial products. This paper tested the proposed assessment method through a case study in the building industry, assessing alternatives to spray polyurethane foam insulation containing methylene diphenyl diisocyanate (MDI). The new hazard scoring method successfully identified trade-offs between different alternatives, showing finer resolution than GreenScreen Benchmarking. Sensitivity analysis showed that different weighting schemes in hazard scores had almost no effect on alternatives ranking, compared to uncertainty from data gaps. Copyright © 2016 Elsevier Ltd. All rights reserved.
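One way to get a single hazard score with quantified uncertainty from endpoint data with gaps, in the spirit of this record, is to sample the missing endpoints in Monte Carlo runs and report the spread of the resulting score. This sketch is not the paper's actual aggregation method; the endpoint values, the uniform sampling of gaps, and the plain averaging are all assumptions for illustration.

```python
# Sketch of a single hazard score with quantified uncertainty: known
# endpoint scores are averaged, while data gaps (None) are sampled
# uniformly in Monte Carlo runs, so uncertainty from missing data shows
# up in the spread of the final score. Values are assumptions.
import random
import statistics

# 0 = low hazard, 1 = high hazard; None marks a data gap.
endpoint_scores = [0.2, 0.8, None, 0.5, None, 0.1]

def sample_score(scores, rng):
    filled = [s if s is not None else rng.uniform(0.0, 1.0) for s in scores]
    return sum(filled) / len(filled)

rng = random.Random(42)  # fixed seed for reproducibility
draws = [sample_score(endpoint_scores, rng) for _ in range(5000)]
score, spread = statistics.mean(draws), statistics.stdev(draws)
```

Reporting `score ± spread` makes the data-gap uncertainty visible to non-toxicologist decision makers, which matches the sensitivity-analysis finding above that data gaps, not weighting schemes, dominate the ranking.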

  3. Seismic hazard assessment: Issues and alternatives

    USGS Publications Warehouse

    Wang, Z.

    2011-01-01

    Seismic hazard and risk are two very important concepts in engineering design and other policy considerations. Although seismic hazard and risk have often been used interchangeably, they are fundamentally different. Furthermore, seismic risk is more important in engineering design and other policy considerations. Seismic hazard assessment is an effort by earth scientists to quantify seismic hazard and its associated uncertainty in time and space and to provide seismic hazard estimates for seismic risk assessment and other applications. Although seismic hazard assessment is more a scientific issue, it deserves special attention because of its significant implications for society. Two approaches, probabilistic seismic hazard analysis (PSHA) and deterministic seismic hazard analysis (DSHA), are commonly used for seismic hazard assessment. Although PSHA has been proclaimed as the best approach for seismic hazard assessment, it is scientifically flawed (i.e., the physics and mathematics that PSHA is based on are not valid). Use of PSHA could lead to either unsafe or overly conservative engineering design or public policy, each of which has dire consequences for society. On the other hand, DSHA is a viable approach for seismic hazard assessment even though it has been labeled as unreliable. The biggest drawback of DSHA is that the temporal characteristics (i.e., earthquake frequency of occurrence and the associated uncertainty) are often neglected. An alternative, seismic hazard analysis (SHA), utilizes earthquake science and statistics directly and provides a seismic hazard estimate that can be readily used for seismic risk assessment and other applications. © 2010 Springer Basel AG.
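The basic PSHA bookkeeping this record debates can be sketched in a few lines: the annual rate of exceeding a ground-motion level is summed over sources as rate × P(exceedance | event), and a Poisson assumption converts that rate into a probability over a design window. The sources, rates, and exceedance probabilities below are invented for illustration; a real PSHA integrates over magnitude, distance, and ground-motion models.

```python
# Sketch of the core PSHA rate calculation: annual exceedance rate
# summed over seismic sources, then a Poisson conversion to the
# probability of at least one exceedance in 50 years. All numbers
# are illustrative assumptions, not a real hazard model.
import math

# (annual rate of earthquakes, P(ground motion > a | event)) per source
sources = [(0.5, 0.01), (0.05, 0.20), (0.002, 0.80)]

def annual_exceedance_rate(sources):
    return sum(rate * p_exceed for rate, p_exceed in sources)

lam = annual_exceedance_rate(sources)
# Poisson probability of at least one exceedance in a 50-year window:
p_50yr = 1.0 - math.exp(-lam * 50.0)
```

Repeating this for a range of ground-motion levels `a` produces the hazard curve; the DSHA alternative discussed above would instead report the motion from a fixed worst-case scenario without this rate bookkeeping.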

  4. Identification of Potential Hazard using Hazard Identification and Risk Assessment

    NASA Astrophysics Data System (ADS)

    Sari, R. M.; Syahputri, K.; Rizkya, I.; Siregar, I.

    2017-03-01

    This research was conducted at a paper manufacturing company whose products are used as cigarette paper. Throughout the production process, the company provides machines and equipment operated by workers, and during operations all workers are potentially exposed to injury; this is known as a potential hazard. Hazard identification and risk assessment is one part of a safety and health program in the risk management stage, and it is very important for preventing occupational injuries and diseases resulting from work. The problem addressed by this research is that the potential hazards and risks faced by workers during the production process had not been identified. The purpose of this study was therefore to identify potential hazards using hazard identification and risk assessment methods. The risk assessment uses severity criteria and the probability of an accident. The research found 23 potential hazards with varying severity and probability. A Risk Assessment Code (RAC) was then determined for each potential hazard, yielding 3 extreme risks, 10 high risks, 6 medium risks and 3 low risks. We successfully identified the potential hazards using the RAC.
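A RAC assignment of the sort described reduces to a severity-by-probability lookup. The matrix below follows the common MIL-STD-882 style layout and is purely illustrative; the study's own severity/probability categories and cell values are not given in the abstract.

```python
# Hypothetical Risk Assessment Code (RAC) matrix: severity (I = catastrophic
# .. IV = negligible) x probability (A = frequent .. E = improbable) -> risk
# level. The cell values are assumed, in the spirit of MIL-STD-882.
RAC = {
    ("I", "A"): "extreme",  ("I", "B"): "extreme",  ("I", "C"): "high",
    ("I", "D"): "high",     ("I", "E"): "medium",
    ("II", "A"): "extreme", ("II", "B"): "high",    ("II", "C"): "high",
    ("II", "D"): "medium",  ("II", "E"): "low",
    ("III", "A"): "high",   ("III", "B"): "medium", ("III", "C"): "medium",
    ("III", "D"): "low",    ("III", "E"): "low",
    ("IV", "A"): "medium",  ("IV", "B"): "low",     ("IV", "C"): "low",
    ("IV", "D"): "low",     ("IV", "E"): "low",
}

def risk_code(severity: str, probability: str) -> str:
    """Look up the qualitative risk level for one identified hazard."""
    return RAC[(severity, probability)]

print(risk_code("I", "B"))  # prints "extreme"
```

Tallying `risk_code` over an inventory of identified hazards yields the kind of extreme/high/medium/low counts the study reports.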

  5. Evaluating a multi-criteria model for hazard assessment in urban design. The Porto Marghera case study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luria, Paolo; Aspinall, Peter A

    2003-08-01

    The aim of this paper is to describe a new approach to major industrial hazard assessment, which has been recently studied by the authors in conjunction with the Italian Environmental Protection Agency ('ARPAV'). The real opportunity for developing a different approach arose from the need of the Italian EPA to provide the Venice Port Authority with an appropriate estimation of major industrial hazards in Porto Marghera, an industrial estate near Venice (Italy). However, the standard model, the quantitative risk analysis (QRA), only provided a list of individual quantitative risk values, related to single locations. The experimental model is based on a multi-criteria approach, the Analytic Hierarchy Process, which introduces the use of expert opinions, complementary skills and expertise from different disciplines in conjunction with traditional quantitative analysis. This permitted the generation of quantitative data on risk assessment from a series of qualitative assessments, on the present situation and on three other future scenarios, and the use of this information as indirect quantitative measures, which could be aggregated to obtain the global risk rate. This approach is in line with the main concepts proposed by the latest European directive on Major Hazard Accidents, which recommends increasing the participation of operators, taking the other players into account and, moreover, paying more attention to the concepts of 'urban control', 'subjective risk' (risk perception) and intangible factors (factors not directly quantifiable).
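The Analytic Hierarchy Process step that turns expert pairwise judgments into quantitative weights can be sketched as follows. The three criteria and the comparison values are hypothetical, and the row-geometric-mean approximation is used in place of the full principal-eigenvector calculation.

```python
import math

# Hypothetical 3-criterion pairwise comparison matrix on Saaty's 1-9 scale
# (rows/columns: toxic release, fire, explosion). A[i][j] is the judged
# importance of criterion i relative to criterion j, so A[j][i] = 1/A[i][j].
A = [
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
]

def ahp_weights(matrix):
    """Approximate the AHP priority vector by normalized row geometric means."""
    gm = [math.prod(row) ** (1 / len(row)) for row in matrix]
    total = sum(gm)
    return [g / total for g in gm]

w = ahp_weights(A)
print([round(x, 3) for x in w])  # weights sum to 1; the first criterion dominates
```

The resulting weights are what let qualitative expert assessments of each scenario be aggregated into a single global risk rate.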

  6. Quantitative structure-activity relationships for predicting potential ecological hazard of organic chemicals for use in regulatory risk assessments.

    PubMed

    Comber, Mike H I; Walker, John D; Watts, Chris; Hermens, Joop

    2003-08-01

    The use of quantitative structure-activity relationships (QSARs) for deriving the predicted no-effect concentration of discrete organic chemicals for the purposes of conducting a regulatory risk assessment in Europe and the United States is described. In the United States, under the Toxic Substances Control Act (TSCA), the TSCA Interagency Testing Committee and the U.S. Environmental Protection Agency (U.S. EPA) use SARs to estimate the hazards of existing and new chemicals. Within the Existing Substances Regulation in Europe, QSARs may be used for data evaluation, test strategy indications, and the identification and filling of data gaps. To illustrate where and when QSARs may be useful and when their use is more problematic, an example, methyl tertiary-butyl ether (MTBE), is given and the predicted and experimental data are compared. Improvements needed for new QSARs and tools for developing and using QSARs are discussed.
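A minimal example of a QSAR of the classical baseline-toxicity form, predicting an acute aquatic endpoint from the octanol-water partition coefficient. The slope and intercept below are illustrative placeholders, not coefficients from any regulatory model.

```python
# Baseline-toxicity style QSAR: log(1/LC50) = a * logKow + b.
# The coefficients are assumed for this sketch only.
A_COEF, B_COEF = 0.85, -1.39

def predicted_lc50_mmol_per_l(log_kow: float) -> float:
    """Predict an acute LC50 (mmol/L) from logKow; lower LC50 = more toxic."""
    log_inv_lc50 = A_COEF * log_kow + B_COEF
    return 10 ** (-log_inv_lc50)

# MTBE has logKow of roughly 0.94; more hydrophobic chemicals predict
# lower LC50 values (higher toxicity) under this relation.
print(f"{predicted_lc50_mmol_per_l(0.94):.1f} mmol/L")
```

Comparing such a prediction against measured data, as the paper does for MTBE, is the standard check on whether a chemical falls within the QSAR's applicability domain.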

  7. Probabilistic Approaches for Multi-Hazard Risk Assessment of Structures and Systems

    NASA Astrophysics Data System (ADS)

    Kwag, Shinyoung

    Performance assessment of structures, systems, and components for multi-hazard scenarios has received significant attention in recent years. However, the concept of multi-hazard analysis is quite broad in nature and the focus of existing literature varies across a wide range of problems. In some cases, such studies focus on hazards that either occur simultaneously or are closely correlated with each other, for example seismically induced flooding or seismically induced fires. In other cases, multi-hazard studies relate to hazards that are not dependent or correlated but have a strong likelihood of occurrence at different times during the lifetime of a structure. The current approaches for risk assessment need enhancement to account for multi-hazard risks: an enhanced approach must be able to account for uncertainty propagation in a systems-level analysis, consider correlation among events or failure modes, and allow integration of newly available information from continually evolving simulation models, experimental observations, and field measurements. This dissertation presents a detailed study that proposes enhancements by incorporating Bayesian networks and Bayesian updating within a performance-based probabilistic framework. The performance-based framework allows propagation of risk as well as uncertainties in the risk estimates within a systems analysis. Unlike conventional risk assessment techniques such as fault-tree analysis, a Bayesian network can account for statistical dependencies and correlations among events and hazards. The proposed approach is extended to develop a risk-informed framework for quantitative validation and verification of high-fidelity system-level simulation tools. Validation of such simulations can be quite formidable within the context of a multi-hazard risk assessment in nuclear power plants. The efficiency of this approach lies in identification of critical events, components, and systems that contribute to the overall risk. Validation of any event or
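A toy illustration of why dependence matters, in the spirit of the fault-tree versus Bayesian-network contrast above: for a two-component series system, encoding a common-cause correlation changes the system failure probability relative to the independence assumption. All numbers are assumed for the sketch, not taken from the dissertation.

```python
# Series system of two components, each with marginal failure probability 0.1.
p_a = p_b = 0.1

# Fault-tree style estimate: assume A and B fail independently.
p_sys_indep = 1 - (1 - p_a) * (1 - p_b)

# Bayesian-network style: encode the dependence explicitly. If A has failed,
# B fails with probability 0.5; the conditional below keeps B's marginal
# failure probability at 0.1.
p_b_given_a = 0.5
p_b_given_not_a = (p_b - p_a * p_b_given_a) / (1 - p_a)

# P(system fails) = P(A fails) + P(A survives) * P(B fails | A survives)
p_sys_dep = p_a + (1 - p_a) * p_b_given_not_a
print(f"independent: {p_sys_indep:.2f}, correlated: {p_sys_dep:.2f}")
```

For this OR-type system, positive correlation clusters the failures together, so the dependent estimate (0.15) is lower than the independence-based one (0.19); ignoring the dependence would overstate the risk here, and for AND-type systems it errs in the opposite direction.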

  8. Los Angeles County Department of Public Health's Health Hazard Assessment: putting the "health" into hazard assessment.

    PubMed

    Dean, Brandon; Bagwell, Dee Ann; Dora, Vinita; Khan, Sinan; Plough, Alonzo

    2013-01-01

    All communities, explicitly or implicitly, assess and prepare for the natural and manmade hazards that they know could impact their community. The commonality of hazard-based threats in almost all communities does not usually result in standard or evidence-based preparedness practice and outcomes across those communities. Without specific efforts to build a shared perspective and prioritization, "all-hazards" preparedness can result in a random hodgepodge of priorities and preparedness strategies, resulting in diminished emergency response capabilities. Traditional risk assessments, with a focus on physical infrastructure, do not present the potential health and medical impacts of specific hazards and threats. With the implementation of the Centers for Disease Control and Prevention's capability-based planning, there is broad recognition that a health-focused hazard assessment process that engages the "Whole of Community" is needed. Los Angeles County's Health Hazard Assessment and Prioritization tool provides a practical and innovative approach to enhance existing planning capacities. Successful utilization of this tool can provide a way for local and state health agencies and officials to more effectively identify the health consequences related to hazard-specific threats and risks, determine priorities, and develop improved and better coordinated agency planning, including community engagement in prioritization.

  9. Slope Hazard and Risk Assessment in the Tropics: Malaysia's Experience

    NASA Astrophysics Data System (ADS)

    Mohamad, Zakaria; Azahari Razak, Khamarrul; Ahmad, Ferdaus; Manap, Mohamad Abdul; Ramli, Zamri; Ahmad, Azhari; Mohamed, Zainab

    2015-04-01

    The increasing number of geological hazards in Malaysia has often resulted in casualties and extensive devastation with high mitigation costs. Given the destructive capacity and high frequency of disasters, Malaysia has taken a step forward to address multi-scale landslide risk reduction, emphasizing pre-disaster action rather than post-disaster reaction. Quantitative slope hazard and risk assessment at regional and national scales remains challenging in Malaysia. This paper presents a comprehensive methodological framework and the operational needs, driven by modern and advanced geospatial technology, to address the aforementioned issues in the tropics. The Slope Hazard and Risk Mapping project, the first national project in Malaysia utilizing multi-sensor LIDAR, has been implemented with the support of multi- and trans-disciplinary partners. The methodological model has been formulated and evaluated given the complexity of risk scenarios in this knowledge-driven project. Slope instability problems in urban, mountainous and tectonic landscapes are among them, and their spatial information is crucial for regional landslide assessment. We develop standard procedures with optimal parameterization for susceptibility, hazard and risk assessment in the selected regions. Notably, we aim to produce a landslide inventory that is as complete as possible in both space and time. With updated, reliable terrain and landscape models, the landslide conditioning factor maps can be accurately derived depending on the landslide types and failure mechanisms, which is crucial for hazard and risk assessment. We also aim to improve the generation of elements at risk for landslides and promote integrated approaches for better disaster risk analysis. As a result, multi-sensor LIDAR technology has proven a very promising tool for an old geological problem, and its derivative data for hazard and risk analysis are an effective preventive measure in Malaysia.

  10. Quantitative meta-analytic approaches for the analysis of animal toxicology and epidemiologic data in human health risk assessments

    EPA Science Inventory

    Often, human health risk assessments have relied on qualitative approaches for hazard identification to integrate evidence across multiple studies to conclude whether particular hazards exist. However, quantitative approaches for evidence integration, including the application o...

  11. Hydrogen Hazards Assessment Protocol (HHAP): Approach and Methodology

    NASA Technical Reports Server (NTRS)

    Woods, Stephen

    2009-01-01

    This viewgraph presentation reviews the approach and methodology used to develop an assessment protocol for hydrogen hazards. Included in the presentation are the reasons to perform hazard assessments, the types of hazard assessments that exist, an analysis of hydrogen hazards, and specific information about the Hydrogen Hazards Assessment Protocol (HHAP). The assessment is specifically tailored to hydrogen behavior. The end product of the assessment is a compilation of hazards, mitigations and associated factors to facilitate decision making and achieve best practice.

  12. Beyond eruptive scenarios: assessing tephra fallout hazard from Neapolitan volcanoes

    PubMed Central

    Sandri, Laura; Costa, Antonio; Selva, Jacopo; Tonini, Roberto; Macedonio, Giovanni; Folch, Arnau; Sulpizio, Roberto

    2016-01-01

    Assessment of volcanic hazards is necessary for risk mitigation. Typically, hazard assessment is based on one or a few, subjectively chosen representative eruptive scenarios, which use a specific combination of eruptive sizes and intensities to represent a particular size class of eruption. While such eruptive scenarios use a range of representative members to capture a range of eruptive sizes and intensities in order to reflect a wider size class, a scenario approach neglects to account for the intrinsic variability of volcanic eruptions, and implicitly assumes that inter-class size variability (i.e. size difference between different eruptive size classes) dominates over intra-class size variability (i.e. size difference within an eruptive size class), the latter of which is treated as negligible. So far, no quantitative study has been undertaken to verify such an assumption. Here, we adopt a novel Probabilistic Volcanic Hazard Analysis (PVHA) strategy, which accounts for intrinsic eruptive variabilities, to quantify the tephra fallout hazard in the Campania area. We compare the results of the new probabilistic approach with the classical scenario approach. The results allow for determining whether a simplified scenario approach can be considered valid, and for quantifying the bias which arises when full variability is not accounted for. PMID:27067389

  13. Beyond eruptive scenarios: assessing tephra fallout hazard from Neapolitan volcanoes.

    PubMed

    Sandri, Laura; Costa, Antonio; Selva, Jacopo; Tonini, Roberto; Macedonio, Giovanni; Folch, Arnau; Sulpizio, Roberto

    2016-04-12

    Assessment of volcanic hazards is necessary for risk mitigation. Typically, hazard assessment is based on one or a few, subjectively chosen representative eruptive scenarios, which use a specific combination of eruptive sizes and intensities to represent a particular size class of eruption. While such eruptive scenarios use a range of representative members to capture a range of eruptive sizes and intensities in order to reflect a wider size class, a scenario approach neglects to account for the intrinsic variability of volcanic eruptions, and implicitly assumes that inter-class size variability (i.e. size difference between different eruptive size classes) dominates over intra-class size variability (i.e. size difference within an eruptive size class), the latter of which is treated as negligible. So far, no quantitative study has been undertaken to verify such an assumption. Here, we adopt a novel Probabilistic Volcanic Hazard Analysis (PVHA) strategy, which accounts for intrinsic eruptive variabilities, to quantify the tephra fallout hazard in the Campania area. We compare the results of the new probabilistic approach with the classical scenario approach. The results allow for determining whether a simplified scenario approach can be considered valid, and for quantifying the bias which arises when full variability is not accounted for.

  14. Applying Qualitative Hazard Analysis to Support Quantitative Safety Analysis for Proposed Reduced Wake Separation Conops

    NASA Technical Reports Server (NTRS)

    Shortle, John F.; Allocco, Michael

    2005-01-01

    This paper describes a scenario-driven hazard analysis process to identify, eliminate, and control safety-related risks. Within this process, we develop selective criteria to determine the applicability of applying engineering modeling to hypothesized hazard scenarios. This provides a basis for evaluating and prioritizing the scenarios as candidates for further quantitative analysis. We have applied this methodology to proposed concepts of operations for reduced wake separation for closely spaced parallel runways. For arrivals, the process identified 43 core hazard scenarios. Of these, we classified 12 as appropriate for further quantitative modeling, 24 that should be mitigated through controls, recommendations, and/or procedures (that is, scenarios not appropriate for quantitative modeling), and 7 that have the lowest priority for further analysis.

  15. Hydrothermal Liquefaction Treatment Hazard Analysis Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lowry, Peter P.; Wagner, Katie A.

    Hazard analyses were performed to evaluate the modular hydrothermal liquefaction treatment system. The hazard assessment process was performed in two stages. An initial assessment utilizing Hazard Identification and Preliminary Hazards Analysis (PHA) techniques identified areas with significant or unique hazards (process safety-related hazards) that fall outside of the normal operating envelope of PNNL and warranted additional analysis. The subsequent assessment was based on a qualitative What-If analysis. The analysis was augmented, as necessary, by additional quantitative analysis for scenarios involving a release of hazardous material or energy with the potential for affecting the public. The following selected hazardous scenarios received increased attention: for scenarios involving a release of hazardous material or energy, controls were identified in the What-If analysis table that prevent the occurrence or mitigate the effects of the release; for scenarios with significant consequences that could impact personnel outside the immediate operations area, quantitative analyses were performed to determine the potential magnitude of the scenario, and a set of "critical controls" was identified for these scenarios (see Section 4) which prevent the occurrence or mitigate the effects of events with significant consequences.

  16. Multi-hazard risk assessment applied to hydraulic fracturing operations

    NASA Astrophysics Data System (ADS)

    Garcia-Aristizabal, Alexander; Gasparini, Paolo; Russo, Raffaella; Capuano, Paolo

    2017-04-01

    Without exception, the exploitation of any energy resource produces impacts and intrinsically bears risks. Therefore, to make sound decisions about future energy resource exploitation, it is important to clearly understand the potential environmental impacts across the full life cycle of an energy development project, distinguishing between the specific impacts intrinsically related to exploiting a given energy resource and those shared with the exploitation of other energy resources. Technological advances such as directional drilling and hydraulic fracturing have led to a rapid expansion of unconventional resources (UR) exploration and exploitation; as a consequence, both public health and environmental concerns have risen. The main objective of a multi-hazard risk assessment applied to the development of UR is to assess the rate (or the likelihood) of occurrence of incidents and the potential impacts on the surrounding environment, considering different hazards and their interactions. Such analyses have to be performed considering the different stages of development of a project; however, the discussion in this paper is mainly focused on the analysis applied to the hydraulic fracturing stage of a UR development project. The multi-hazard risk assessment applied to the development of UR poses a number of challenges, making this a particularly complex problem. First, a number of external hazards might be considered as potential triggering mechanisms; such hazards can be either of natural origin or anthropogenic events caused by the same industrial activities. Second, failures might propagate through the industrial elements, leading to complex scenarios according to the layout of the industrial site. Third, there is a number of potential risk receptors, ranging from environmental elements (such as air, soil, surface water, or groundwater) to local communities and ecosystems. The multi-hazard risk approach for this problem is set by considering multiple hazards

  17. 10 CFR 850.21 - Hazard assessment.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    Hazard assessment. (a) If the baseline inventory establishes the presence of beryllium, the responsible employer must conduct a beryllium hazard assessment that includes an analysis of existing conditions, exposure...

  18. A review of multi-risk methodologies for natural hazards: Consequences and challenges for a climate change impact assessment.

    PubMed

    Gallina, Valentina; Torresan, Silvia; Critto, Andrea; Sperotto, Anna; Glade, Thomas; Marcomini, Antonio

    2016-03-01

    This paper presents a review of existing multi-risk assessment concepts and tools applied by organisations and projects, providing the basis for the development of a multi-risk methodology in a climate change perspective. Relevant initiatives were developed for the assessment of multiple natural hazards (e.g. floods, storm surges, droughts) affecting the same area in a defined timeframe (e.g. year, season, decade). Major research efforts were focused on the identification and aggregation of multiple hazard types (e.g. independent, correlated, cascading hazards) by means of quantitative and semi-quantitative approaches. Moreover, several methodologies aim to assess the vulnerability of multiple targets to specific natural hazards by means of vulnerability functions and indicators at the regional and local scale. The overall results of the review show that multi-risk approaches do not consider the effects of climate change and mostly rely on the analysis of static vulnerability (i.e. no time-dependent vulnerabilities, no changes among exposed elements). A relevant challenge is therefore to develop comprehensive formal approaches for the assessment of different climate-induced hazards and risks, including dynamic exposure and vulnerability. This requires the selection and aggregation of suitable hazard and vulnerability metrics to make a synthesis of information about multiple climate impacts, the spatial analysis and ranking of risks, including their visualization and communication to end-users. To face these issues, climate impact assessors should develop cross-sectorial collaborations among different areas of expertise (e.g. modellers, natural scientists, economists), integrating information on climate change scenarios with sectorial climate impact assessment, towards the development of a comprehensive multi-risk assessment process. Copyright © 2015 Elsevier Ltd. All rights reserved.

  19. Assessment and Prediction of Natural Hazards from Satellite Imagery

    PubMed Central

    Gillespie, Thomas W.; Chu, Jasmine; Frankenberg, Elizabeth; Thomas, Duncan

    2013-01-01

    Since 2000, there have been a number of spaceborne satellites that have changed the way we assess and predict natural hazards. These satellites are able to quantify physical geographic phenomena associated with the movements of the earth’s surface (earthquakes, mass movements), water (floods, tsunamis, storms), and fire (wildfires). Most of these satellites contain active or passive sensors that can be utilized by the scientific community for the remote sensing of natural hazards over a number of spatial and temporal scales. The most useful satellite imagery for the assessment of earthquake damage comes from high-resolution (0.6 m to 1 m pixel size) passive sensors and moderate resolution active sensors that can quantify the vertical and horizontal movement of the earth’s surface. High-resolution passive sensors have been used to successfully assess flood damage while predictive maps of flood vulnerability areas are possible based on physical variables collected from passive and active sensors. Recent moderate resolution sensors are able to provide near real time data on fires and provide quantitative data used in fire behavior models. Limitations currently exist due to atmospheric interference, pixel resolution, and revisit times. However, a number of new microsatellites and constellations of satellites will be launched in the next five years that contain increased resolution (0.5 m to 1 m pixel resolution for active sensors) and revisit times (daily ≤ 2.5 m resolution images from passive sensors) that will significantly improve our ability to assess and predict natural hazards from space. PMID:25170186

  20. Hydrothermal Liquefaction Treatment Preliminary Hazard Analysis Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lowry, Peter P.; Wagner, Katie A.

    A preliminary hazard assessment was completed during February 2015 to evaluate the conceptual design of the modular hydrothermal liquefaction treatment system. The hazard assessment was performed in two stages. An initial assessment utilizing Hazard Identification and Preliminary Hazards Analysis (PHA) techniques identified areas with significant or unique hazards (process safety-related hazards) that fall outside of the normal operating envelope of PNNL and warranted additional analysis. The subsequent assessment was based on a qualitative What-If analysis. This analysis was augmented, as necessary, by additional quantitative analysis for scenarios involving a release of hazardous material or energy with the potential for affecting the public.

  1. Comparison of landslide hazard and risk assessment practices in Europe

    NASA Astrophysics Data System (ADS)

    Corominas, J.; Mavrouli, O.

    2012-04-01

    An overview is made of the landslide hazard and risk assessment practices that are officially promoted or applied in Europe by administration offices, geological surveys, and decision makers (recommendations, regulations and codes). The reported countries are: Andorra, Austria, France, Italy (selected river basins), Romania, Spain (Catalonia), Switzerland and the United Kingdom. The objective here was to compare the different practices for hazard and risk evaluation with respect to the official policies, the methodologies used (qualitative and quantitative), the provided outputs and their contents, and the terminology and map symbols used. The main observations made are illustrated with examples, and the possibility of harmonization of the policies and the application of common practices to bridge the existing gaps is discussed. Some of the conclusions reached include the following: zoning maps are legally binding for public administrators and land owners only in some cases, and generally when referring to site-specific or local scales rather than regional or national ones; so far, information is mainly provided on landslide susceptibility, and hazard and risk assessment is performed only in a few countries; there is a variation in the use of scales between countries; the classification criteria for landslide types and mechanisms present large diversity even within the same country (in some cases no landslide mechanisms are specified while in others there is an exhaustive list); the techniques to obtain input data for the landslide inventory and susceptibility maps vary from basic to sophisticated, resulting in various levels of data quality and quantity; the procedures followed for hazard and risk assessment include analytical procedures supported by computer simulation, weighted indicators, expert judgment and field survey-based approaches, or a combination of all; there is an important variation between hazard and risk matrices with respect to the parameters used and the thresholds

  2. Multi Hazard Assessment: The Azores Archipelagos (PT) case

    NASA Astrophysics Data System (ADS)

    Aifantopoulou, Dorothea; Boni, Giorgio; Cenci, Luca; Kaskara, Maria; Kontoes, Haris; Papoutsis, Ioannis; Paralikidis, Sideris; Psichogyiou, Christina; Solomos, Stavros; Squicciarino, Giuseppe; Tsouni, Alexia; Xerekakis, Themos

    2016-04-01

    ) and earthquake (475 years return period) was used. Topography, lithology, soil moisture and LU/LC were also accounted for. Soil erosion risk was assessed through the empirical model RUSLE (Renard et al. 1991b). Rainfall erosivity, topography and vegetation cover are the main parameters used for predicting the proneness to soil loss. Expected maximum tsunami wave heights were estimated for a specific earthquake scenario at designated forecast points along the coasts. Deformation at the source was calculated by utilizing the Okada code (Okada, 1985). Tsunami wave generation and propagation are based on the SWAN model (JRC/IPSC modification). To estimate the wave height at the forecast points, the Green's Law function was used (JRC Tsunami Analysis Tool). Storm tracks' historical data indicate a return period of 17/41 years for H1/H2 hurricane categories respectively. NOAA WAVEWATCH III model hindcast reanalysis was used to estimate the maximum significant wave height (wind and swell) along the coastline during two major storms. The associated storm-surge risk assessment also accounted for the coastline morphology. Seven empirical (independent) indicators were used to express the erosion susceptibility of the coasts. Each indicator is evaluated according to a semi-quantitative score that represents a low, medium or high level of erosion risk or impact. The estimate of the coastal erosion hazard was derived by aggregating the indicators at a grid scale.
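The Green's Law step mentioned above amplifies an offshore tsunami wave height as the water depth shoals toward a forecast point, scaling the height by the fourth root of the depth ratio. A minimal sketch with illustrative depths:

```python
def greens_law_height(h_offshore: float, d_offshore: float, d_coast: float) -> float:
    """Green's Law shoaling: H2 = H1 * (d1 / d2) ** 0.25 (heights/depths in m)."""
    return h_offshore * (d_offshore / d_coast) ** 0.25

# A 0.5 m wave in 1000 m of water, shoaling toward a 10 m deep forecast point,
# grows by a factor of (1000/10) ** 0.25, i.e. about 3.16x.
print(f"{greens_law_height(0.5, 1000.0, 10.0):.2f} m")
```

This is the same approximation the JRC Tsunami Analysis Tool applies to translate deep-water model output to coastal forecast points; it neglects refraction and local bathymetric focusing.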

  3. A probabilistic tsunami hazard assessment for Indonesia

    NASA Astrophysics Data System (ADS)

    Horspool, N.; Pranantyo, I.; Griffin, J.; Latief, H.; Natawidjaja, D. H.; Kongko, W.; Cipta, A.; Bustaman, B.; Anugrah, S. D.; Thio, H. K.

    2014-05-01

    Probabilistic hazard assessments are a fundamental tool for assessing the threats posed by hazards to communities and are important for underpinning evidence-based decision-making on risk mitigation activities. Indonesia has been the focus of intense tsunami risk mitigation efforts following the 2004 Indian Ocean Tsunami, but this has been largely concentrated on the Sunda Arc, with little attention to other tsunami-prone areas of the country such as eastern Indonesia. We present the first nationally consistent Probabilistic Tsunami Hazard Assessment (PTHA) for Indonesia. This assessment produces time-independent forecasts of tsunami hazard at the coast from tsunamis generated by local, regional and distant earthquake sources. The methodology is based on the established Monte Carlo approach to probabilistic seismic hazard assessment (PSHA) and has been adapted to tsunami. We account for sources of epistemic and aleatory uncertainty in the analysis through the use of logic trees and through sampling probability density functions. For short return periods (100 years) the highest tsunami hazard is on the west coast of Sumatra, the south coast of Java and the north coast of Papua. For longer return periods (500-2500 years), the tsunami hazard is highest along the Sunda Arc, reflecting larger maximum magnitudes along the Sunda Arc. The annual probability of experiencing a tsunami with a height at the coast of > 0.5 m is greater than 10% for Sumatra, Java, the Sunda Islands (Bali, Lombok, Flores, Sumba) and north Papua. The annual probability of experiencing a tsunami with a height of > 3.0 m, which would cause significant inundation and fatalities, is 1-10% in Sumatra, Java, Bali, Lombok and north Papua, and 0.1-1% for north Sulawesi, Seram and Flores. The results of this national-scale hazard assessment provide evidence for disaster managers to prioritise regions for risk mitigation activities and/or more detailed hazard or risk assessment.

  4. A probabilistic tsunami hazard assessment for Indonesia

    NASA Astrophysics Data System (ADS)

    Horspool, N.; Pranantyo, I.; Griffin, J.; Latief, H.; Natawidjaja, D. H.; Kongko, W.; Cipta, A.; Bustaman, B.; Anugrah, S. D.; Thio, H. K.

    2014-11-01

    Probabilistic hazard assessments are a fundamental tool for assessing the threats posed by hazards to communities and are important for underpinning evidence-based decision-making regarding risk mitigation activities. Indonesia has been the focus of intense tsunami risk mitigation efforts following the 2004 Indian Ocean tsunami, but this has been largely concentrated on the Sunda Arc with little attention to other tsunami-prone areas of the country such as eastern Indonesia. We present the first nationally consistent probabilistic tsunami hazard assessment (PTHA) for Indonesia. This assessment produces time-independent forecasts of tsunami hazards at the coast using data from tsunamis generated by local, regional and distant earthquake sources. The methodology is based on the established Monte Carlo approach to probabilistic seismic hazard assessment (PSHA) and has been adapted to tsunami. We account for sources of epistemic and aleatory uncertainty in the analysis through the use of logic trees and sampling probability density functions. For short return periods (100 years) the highest tsunami hazard is on the west coast of Sumatra, the south coast of Java and the north coast of Papua. For longer return periods (500-2500 years), the tsunami hazard is highest along the Sunda Arc, reflecting the larger maximum magnitudes. The annual probability of experiencing a tsunami with a height of > 0.5 m at the coast is greater than 10% for Sumatra, Java, the Sunda islands (Bali, Lombok, Flores, Sumba) and north Papua. The annual probability of experiencing a tsunami with a height of > 3.0 m, which would cause significant inundation and fatalities, is 1-10% in Sumatra, Java, Bali, Lombok and north Papua, and 0.1-1% for north Sulawesi, Seram and Flores. The results of this national-scale hazard assessment provide evidence for disaster managers to prioritise regions for risk mitigation activities and/or more detailed hazard or risk assessment.
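The Monte Carlo logic underlying a PTHA of this kind can be sketched end to end: sample event magnitudes from a truncated Gutenberg-Richter source model, map each to a coastal wave height, and convert the conditional exceedance fraction into an annual probability with a Poisson occurrence model. Every parameter below (b-value, magnitude bounds, event rate, height scaling) is assumed for illustration and is not from the study.

```python
import math
import random

rng = random.Random(0)
B_VALUE, M_MIN, M_MAX = 1.0, 7.0, 9.0
ANNUAL_RATE = 0.05  # assumed rate of M >= M_MIN events per year
BETA = B_VALUE * math.log(10)
C = 1 - math.exp(-BETA * (M_MAX - M_MIN))

def sample_magnitude() -> float:
    """Inverse-transform sample from a truncated Gutenberg-Richter law."""
    return M_MIN - math.log(1 - rng.random() * C) / BETA

def wave_height(m: float) -> float:
    """Hypothetical magnitude-to-coastal-wave-height relation (metres)."""
    return 10 ** (0.5 * m - 4.0)

n = 100_000
p_event = sum(wave_height(sample_magnitude()) > 0.5 for _ in range(n)) / n
annual_p = 1 - math.exp(-ANNUAL_RATE * p_event)  # Poisson occurrence model
print(f"P(height > 0.5 m | event) = {p_event:.2f}, annual P = {annual_p:.3f}")
```

A full PTHA repeats this over many sources and logic-tree branches (capturing epistemic uncertainty) and tabulates exceedance rates for a grid of height thresholds to build hazard curves per coastal point.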

  5. Developing International Guidelines on Volcanic Hazard Assessments for Nuclear Facilities

    NASA Astrophysics Data System (ADS)

    Connor, Charles

    2014-05-01

    Worldwide, tremendous progress has been made in recent decades in forecasting volcanic events, such as episodes of volcanic unrest, eruptions, and the potential impacts of eruptions. Generally these forecasts are divided into two categories. Short-term forecasts are prepared in response to unrest at volcanoes, rely on geophysical monitoring and related observations, and have the goal of forecasting events on timescales of hours to weeks to provide time for evacuation of people, shutdown of facilities, and implementation of related safety measures. Long-term forecasts are prepared to better understand the potential impacts of volcanism in the future and to plan for potential volcanic activity. Long-term forecasts are particularly useful for understanding and communicating the potential consequences of volcanic events for populated areas around volcanoes and for siting critical infrastructure, such as nuclear facilities. Recent work by an international team, under the auspices of the International Atomic Energy Agency, has focused on developing guidelines for long-term volcanic hazard assessments. These guidelines have now been implemented for hazard assessment for nuclear facilities in nations including Indonesia, the Philippines, Armenia, Chile, and the United States. On any time scale, all volcanic hazard assessments rely on a geologically reasonable conceptual model of volcanism. Such conceptual models are usually built upon years or decades of geological studies of specific volcanic systems and analogous systems, and on the development of a process-level understanding of volcanic activity. Conceptual models are used to bound potential rates of volcanic activity and potential magnitudes of eruptions, and to understand temporal and spatial trends in volcanic activity. It is these conceptual models that provide essential justification for the assumptions made in statistical model development and in the application of numerical models to generate quantitative forecasts. It is a

  6. Probabilistic versus deterministic hazard assessment in liquefaction susceptible zones

    NASA Astrophysics Data System (ADS)

    Daminelli, Rosastella; Gerosa, Daniele; Marcellini, Alberto; Tento, Alberto

    2015-04-01

    Probabilistic seismic hazard assessment (PSHA), usually adopted in the framework of seismic code drafting, is based on a Poissonian description of temporal occurrence, a negative exponential distribution of magnitude, and attenuation relationships with a log-normal distribution of PGA or response spectrum. The main strength of this approach is that it is presently the standard in the majority of countries, but it has weak points, in particular regarding the physical description of the earthquake phenomenon. Factors that could significantly influence the expected motion at a site, such as site effects and source characteristics like strong-motion duration and directivity, are not taken into account by PSHA. Deterministic models can better evaluate the ground motion at a site from a physical point of view, but their prediction reliability depends on the degree of knowledge of the source, wave propagation and soil parameters. We compare these two approaches at selected sites affected by the May 2012 Emilia-Romagna and Lombardia earthquake, which caused widespread liquefaction phenomena, unusual for magnitudes less than 6. We focus on sites liquefiable because of their soil mechanical parameters and water table level. Our analysis shows that the choice between deterministic and probabilistic hazard analysis is strongly dependent on site conditions. The looser the soil and the higher the liquefaction potential, the more suitable is the deterministic approach. Source characteristics, in particular the duration of strong ground motion, have long been recognized as relevant to inducing liquefaction; unfortunately a quantitative prediction of these parameters appears very unlikely, dramatically reducing the possibility of their adoption in hazard assessment. Last but not least, economic factors are relevant in the choice of approach. The case history of the 2012 Emilia-Romagna and Lombardia earthquake, with an officially estimated cost of 6 billions
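
    The Poissonian occurrence model underlying PSHA reduces, at a single site, to a simple relation between annual exceedance rate and exposure time. A minimal sketch (the 475-year / 50-year pairing below is the familiar 10%-in-50-years code convention, not a value from this study):

```python
import math

def poisson_exceedance(annual_rate, years):
    """P(at least one exceedance in `years`) under a Poisson occurrence model."""
    return 1.0 - math.exp(-annual_rate * years)

def rate_from_return_period(return_period_years):
    return 1.0 / return_period_years

# Classic building-code check: a 475-year return period gives ~10% in 50 years.
p = poisson_exceedance(rate_from_return_period(475.0), 50.0)
print(f"{p:.3f}")   # -> 0.100
```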

  7. Uncertainty on shallow landslide hazard assessment: from field data to hazard mapping

    NASA Astrophysics Data System (ADS)

    Trefolini, Emanuele; Tolo, Silvia; Patelli, Eduardo; Broggi, Matteo; Disperati, Leonardo; Le Tuan, Hai

    2015-04-01

    Shallow landsliding that involves Hillslope Deposits (HD), the surficial soil that covers the bedrock, is an important process of erosion, transport and deposition of sediment along hillslopes. Although shallow landslides generally mobilize relatively small volumes of material, they represent the most hazardous factor in mountain regions due to their high velocity and the common absence of warning signs. Moreover, increasing urbanization and likely climate change make shallow landslides a source of widespread risk; the interest of the scientific community in this process has therefore grown over the last three decades. One of the main aims of research projects on this topic is to perform robust shallow landslide hazard assessment for wide areas (regional assessment), in order to support sustainable spatial planning. Currently, three main methodologies may be implemented to assess regional shallow landslide hazard: expert evaluation, probabilistic (or data mining) methods, and methods based on physical models. The aim of this work is to evaluate the uncertainty of shallow landslide hazard assessment based on physical models, taking into account spatial variables such as geotechnical and hydrogeologic parameters as well as hillslope morphometry. To achieve this goal, a wide dataset of geotechnical properties (shear strength, permeability, depth and unit weight) of HD was gathered by integrating field survey, in situ and laboratory tests. This spatial database was collected from a study area of about 350 km2 including different bedrock lithotypes and geomorphological features. The uncertainty associated with each step of the hazard assessment process (e.g. field data collection, regionalization of site-specific information and numerical modelling of hillslope stability) was carefully characterized. The most appropriate probability density function (PDF) was chosen for each numerical variable and we assessed the uncertainty propagation on HD strength parameters obtained by
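
    One common way to propagate parameter PDFs of the kind described above is Monte Carlo sampling through a slope stability model. The sketch below uses the classic infinite-slope factor of safety with entirely hypothetical distributions; it is not the study's model or data.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000

# Hypothetical PDFs for hillslope-deposit properties (illustrative only).
phi = np.radians(rng.normal(32.0, 3.0, N))      # friction angle [rad]
c = rng.lognormal(np.log(5.0), 0.4, N)          # cohesion [kPa]
gamma = rng.normal(18.0, 1.0, N)                # unit weight [kN/m^3]
z = rng.uniform(0.5, 2.0, N)                    # soil depth [m]
m = rng.uniform(0.0, 1.0, N)                    # water-table ratio
slope = np.radians(35.0)
GAMMA_W = 9.81                                   # unit weight of water [kN/m^3]

# Infinite-slope factor of safety (standard Skempton & DeLory form).
fs = (c + (gamma - m * GAMMA_W) * z * np.cos(slope) ** 2 * np.tan(phi)) / (
    gamma * z * np.sin(slope) * np.cos(slope)
)

p_failure = np.mean(fs < 1.0)
print(f"P(FS < 1) = {p_failure:.3f}")
```

    Running the same sampler cell by cell over a DEM, with regionalized parameter PDFs, turns this point estimate into a probabilistic susceptibility map.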

  8. PyBetVH: A Python tool for probabilistic volcanic hazard assessment and for generation of Bayesian hazard curves and maps

    NASA Astrophysics Data System (ADS)

    Tonini, Roberto; Sandri, Laura; Thompson, Mary Anne

    2015-06-01

    PyBetVH is a completely new, free, open-source and cross-platform software implementation of the Bayesian Event Tree for Volcanic Hazard (BET_VH), a tool for estimating the probability of any magmatic hazardous phenomenon occurring in a selected time frame, accounting for all the uncertainties. New capabilities of this implementation include the ability to calculate hazard curves which describe the distribution of the exceedance probability as a function of intensity (e.g., tephra load) on a grid of points covering the target area. The computed hazard curves are (i) absolute (accounting for the probability of eruption in a given time frame, and for all the possible vent locations and eruptive sizes) and (ii) Bayesian (computed at different percentiles, in order to quantify the epistemic uncertainty). Such curves allow representation of the full information contained in the probabilistic volcanic hazard assessment (PVHA) and are well suited to become a main input to quantitative risk analyses. PyBetVH allows for interactive visualization of both the computed hazard curves, and the corresponding Bayesian hazard/probability maps. PyBetVH is designed to minimize the efforts of end users, making PVHA results accessible to people who may be less experienced in probabilistic methodologies, e.g. decision makers. The broad compatibility of Python language has also allowed PyBetVH to be installed on the VHub cyber-infrastructure, where it can be run online or downloaded at no cost. PyBetVH can be used to assess any type of magmatic hazard from any volcano. Here we illustrate how to perform a PVHA through PyBetVH using the example of analyzing tephra fallout from the Okataina Volcanic Centre (OVC), New Zealand, and highlight the range of outputs that the tool can generate.
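
    A Bayesian hazard curve of the kind described above can be illustrated with Beta posteriors over exceedance probabilities at a set of intensity thresholds. The scenario counts below are invented for illustration and are not OVC results or PyBetVH output.

```python
import numpy as np

rng = np.random.default_rng(1)

# Tephra-load thresholds [kPa] and hypothetical counts of eruption scenarios
# exceeding each threshold at one grid point (assumed data).
thresholds = np.array([0.1, 0.5, 1.0, 2.0, 5.0])
exceed_counts = np.array([40, 22, 12, 5, 1])
n_scenarios = 50

# Bayesian exceedance-probability samples: Beta posterior with a uniform prior.
samples = rng.beta(exceed_counts + 1, n_scenarios - exceed_counts + 1,
                   size=(10_000, thresholds.size))

# Percentile hazard curves quantify the epistemic uncertainty band.
for pct in (10, 50, 90):
    curve = np.percentile(samples, pct, axis=0)
    print(pct, np.round(curve, 3))
```

    Each percentile curve decreases with intensity, as a hazard curve must; the spread between the 10th and 90th percentiles is the epistemic uncertainty PyBetVH visualizes.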

  9. A methodology for physically based rockfall hazard assessment

    NASA Astrophysics Data System (ADS)

    Crosta, G. B.; Agliardi, F.

    Rockfall hazard assessment is not simple to achieve in practice, and sound, physically based assessment methodologies are still missing. The mobility of rockfalls implies a more difficult hazard definition with respect to other slope instabilities with minimal runout. Rockfall hazard assessment involves complex definitions for "occurrence probability" and "intensity". This paper is an attempt to evaluate rockfall hazard using the results of 3-D numerical modelling on a topography described by a DEM. Maps portraying the maximum frequency of passages, velocity and height of blocks at each model cell are easily combined in a GIS in order to produce physically based rockfall hazard maps. Different methods are suggested and discussed for rockfall hazard mapping at regional and local scales, both along linear features and within exposed areas. An objective approach based on three-dimensional matrices providing both a positional "Rockfall Hazard Index" and a "Rockfall Hazard Vector" is presented. The opportunity of combining different parameters in the 3-D matrices has been evaluated to better express the relative increase in hazard. Furthermore, the sensitivity of the hazard index with respect to the included variables and their combinations is preliminarily discussed in order to constrain assessment criteria that are as objective as possible.
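
    The GIS combination step, reclassifying the frequency, velocity and height rasters and merging them into a positional index, might look like the following sketch. The class boundaries and the simple additive index are assumptions for illustration, not the authors' actual matrices.

```python
import numpy as np

# Toy 3x3 rasters standing in for the model outputs at each DEM cell:
# transit frequency [counts], velocity [m/s], and fly height [m].
freq = np.array([[0, 2, 8], [1, 5, 20], [0, 0, 3]])
vel = np.array([[0.0, 3.0, 12.0], [1.0, 6.0, 25.0], [0.0, 0.0, 4.0]])
height = np.array([[0.0, 0.5, 2.0], [0.2, 1.0, 6.0], [0.0, 0.0, 0.8]])

def classify(raster, bins):
    """Reclassify a raster into hazard classes 1..3 (0 where no rockfall transits)."""
    return np.where(raster > 0, np.digitize(raster, bins) + 1, 0)

# Hypothetical class boundaries (three classes per parameter).
f_cls = classify(freq, [3, 10])
v_cls = classify(vel, [5.0, 15.0])
h_cls = classify(height, [1.0, 3.0])

# A simple positional index: sum of the three class values (range 0-9).
rhi = f_cls + v_cls + h_cls
print(rhi)
```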

  10. Assessing homeland chemical hazards outside the military gates: industrial hazard threat assessments for department of defense installations.

    PubMed

    Kirkpatrick, Jeffrey S; Howard, Jacqueline M; Reed, David A

    2002-04-08

    As part of comprehensive joint medical surveillance measures outlined by the Department of Defense, the US Army Center for Health Promotion and Preventive Medicine (USACHPPM) is beginning to assess environmental health threats to continental US military installations. A common theme in comprehensive joint medical surveillance, in support of Force Health Protection, is the identification and assessment of potential environmental health hazards, and the evaluation and documentation of actual exposures in both continental US and outside-continental-US settings. For the continental US assessments, the USACHPPM has utilized the US Environmental Protection Agency (EPA) database of risk management plans in accordance with Public Law 106-40, and the toxic release inventory database, in a state-of-the-art geographic information system (GIS) based program, termed the Consequence Assessment and Management Tool Set, or CATS, for assessing homeland industrial chemical hazards outside the military gates. As an example, the US EPA toxic release inventory and risk management plans databases are queried to determine the types and locations of industries surrounding a continental US military installation. Contaminants of concern are then ranked with respect to known toxicological and physical hazards, and are then subjected to applicable downwind hazard simulations using applicable meteorological and climatological data sets. The composite downwind hazard areas are mapped in relation to emergency response planning guidelines (ERPG), which were developed by the American Industrial Hygiene Association to assist emergency response personnel planning for catastrophic chemical releases. In addition, other geographically referenced data such as transportation routes, satellite imagery and population data are included in the operational, equipment, and morale risk assessment and management process. These techniques have been developed to assist military medical planners and operations

  11. 10 CFR 851.21 - Hazard identification and assessment.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    .... Procedures must include methods to: (1) Assess worker exposure to chemical, physical, biological, or safety workplace hazards through appropriate workplace monitoring; (2) Document assessment for chemical, physical... hazards; (6) Perform routine job activity-level hazard analyses; (7) Review site safety and health...

  12. 10 CFR 851.21 - Hazard identification and assessment.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    .... Procedures must include methods to: (1) Assess worker exposure to chemical, physical, biological, or safety workplace hazards through appropriate workplace monitoring; (2) Document assessment for chemical, physical... hazards; (6) Perform routine job activity-level hazard analyses; (7) Review site safety and health...

  13. 10 CFR 851.21 - Hazard identification and assessment.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    .... Procedures must include methods to: (1) Assess worker exposure to chemical, physical, biological, or safety..., biological, and safety workplace hazards using recognized exposure assessment and testing methodologies and... hazards and the established controls within 90 days after identifying such hazards. The Head of DOE Field...

  14. 10 CFR 851.21 - Hazard identification and assessment.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    .... Procedures must include methods to: (1) Assess worker exposure to chemical, physical, biological, or safety..., biological, and safety workplace hazards using recognized exposure assessment and testing methodologies and... hazards and the established controls within 90 days after identifying such hazards. The Head of DOE Field...

  15. 10 CFR 851.21 - Hazard identification and assessment.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    .... Procedures must include methods to: (1) Assess worker exposure to chemical, physical, biological, or safety..., biological, and safety workplace hazards using recognized exposure assessment and testing methodologies and... hazards and the established controls within 90 days after identifying such hazards. The Head of DOE Field...

  16. Quantitative risk assessment for skin sensitization: Success or failure?

    PubMed

    Kimber, Ian; Gerberick, G Frank; Basketter, David A

    2017-02-01

    Skin sensitization is unique in the world of toxicology. There is a combination of reliable, validated predictive test methods for the identification of skin-sensitizing chemicals, a clearly documented and transparent approach to risk assessment, and effective feedback from dermatology clinics around the world delivering evidence of the success or failure of the hazard identification/risk assessment/management process. Recent epidemics of contact allergy, particularly to preservatives, have raised the question of whether the safety/risk assessment process is working in an optimal manner (or indeed is working at all!). This review has as its focus skin sensitization quantitative risk assessment (QRA). The core toxicological principles of QRA are reviewed, and evidence of use and misuse examined. What becomes clear is that skin sensitization QRA will only function adequately if two essential criteria are met. The first is that QRA is applied rigorously, and the second is that potential exposure to the sensitizing substance is assessed adequately. This conclusion will come as no surprise to any toxicologist who appreciates the basic premise that "risk = hazard x exposure". Accordingly, use of skin sensitization QRA is encouraged, not least because the essential feedback from dermatology clinics can be used as a tool to refine QRA in situations where this risk assessment tool has not been properly used.
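
    The core QRA arithmetic the review refers to, dividing a no-expected-sensitization-induction level (NESIL) by a product of sensitization assessment factors (SAFs) and comparing the result with exposure, can be sketched as below. The NESIL, SAF values and consumer exposure level are hypothetical numbers for illustration only.

```python
def acceptable_exposure_level(nesil_ug_cm2, safs):
    """AEL = NESIL divided by the product of the sensitization assessment factors."""
    saf_total = 1.0
    for f in safs:
        saf_total *= f
    return nesil_ug_cm2 / saf_total

# Hypothetical inputs (illustrative, not from any dossier):
NESIL = 250.0                       # µg/cm², e.g. an LLNA-derived induction level
SAFS = [10.0, 3.0, 10.0]            # inter-individual, matrix, use-pattern factors

ael = acceptable_exposure_level(NESIL, SAFS)
cel = 0.5                           # assumed consumer exposure level, µg/cm²

print(f"AEL = {ael:.3f} µg/cm²; acceptable: {cel <= ael}")
```

    The review's point about rigorous application translates here to choosing the SAFs and the exposure estimate defensibly; the arithmetic itself is trivial.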

  17. FDA-iRISK--a comparative risk assessment system for evaluating and ranking food-hazard pairs: case studies on microbial hazards.

    PubMed

    Chen, Yuhuan; Dennis, Sherri B; Hartnett, Emma; Paoli, Greg; Pouillot, Régis; Ruthman, Todd; Wilson, Margaret

    2013-03-01

    Stakeholders in the food safety system, in particular federal agencies, need evidence-based, transparent, and rigorous approaches to estimate and compare the risk of foodborne illness from microbial and chemical hazards and the public health impact of interventions. FDA-iRISK (referred to here as iRISK), a Web-based quantitative risk assessment system, was developed to meet this need. The modeling tool enables users to assess, compare, and rank the risks posed by multiple food-hazard pairs at all stages of the food supply system, from primary production, through manufacturing and processing, to retail distribution and, ultimately, to the consumer. Using standard data entry templates, built-in mathematical functions, and Monte Carlo simulation techniques, iRISK integrates data and assumptions from seven components: the food, the hazard, the population of consumers, process models describing the introduction and fate of the hazard up to the point of consumption, consumption patterns, dose-response curves, and health effects. Beyond risk ranking, iRISK enables users to estimate and compare the impact of interventions and control measures on public health risk. iRISK provides estimates of the impact of proposed interventions in various ways, including changes in the mean risk of illness and burden of disease metrics, such as losses in disability-adjusted life years. Case studies for Listeria monocytogenes and Salmonella were developed to demonstrate the application of iRISK for the estimation of risks and the impact of interventions for microbial hazards. iRISK was made available to the public at http://irisk.foodrisk.org in October 2012.
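
    A per-serving risk estimate of the kind such a tool chains together, contamination, consumption and a dose-response model via Monte Carlo simulation, can be sketched as follows. The exponential dose-response form and every parameter below are invented for illustration; this is not iRISK code or an iRISK-endorsed value.

```python
import numpy as np

rng = np.random.default_rng(7)
N = 100_000

# Hypothetical exponential dose-response: P(ill | dose) = 1 - exp(-r * dose).
R_PARAM = 2.5e-7                       # illustrative dose-response parameter

# Monte Carlo over serving-level contamination (CFU/g) and serving size (g).
log10_conc = rng.normal(-1.0, 1.0, N)  # contamination, lognormal on dose scale
serving_g = rng.triangular(20, 50, 150, N)
dose = (10.0 ** log10_conc) * serving_g

p_ill = 1.0 - np.exp(-R_PARAM * dose)
mean_risk = p_ill.mean()
print(f"mean risk of illness per serving ~ {mean_risk:.2e}")
```

    Ranking food-hazard pairs then amounts to repeating this calculation per pair and comparing the resulting mean risks or burden-of-disease metrics.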

  18. Distance education course on spatial multi-hazard risk assessment, using Open Source software

    NASA Astrophysics Data System (ADS)

    van Westen, C. J.; Frigerio, S.

    2009-04-01

    As part of the capacity building activities of the United Nations University - ITC School on Disaster Geo-Information Management (UNU-ITC DGIM), the International Institute for Geoinformation Science and Earth Observation (ITC) has developed a distance education course on the application of Geographic Information Systems for multi-hazard risk assessment. This course is designed for academic staff, as well as for professionals working in (non-)governmental organizations where knowledge of disaster risk management is essential. The course guides the participants through the entire process of risk assessment, on the basis of a case study of a city in a developing country exposed to multiple hazards. The course consists of eight modules, each with a guide book explaining the theoretical background, and guiding the participants through spatial data requirements for risk assessment, hazard assessment procedures, generation of elements-at-risk databases, vulnerability assessment, qualitative and quantitative risk assessment methods, risk evaluation and risk reduction. Linked to the theory is a large set of exercises, with exercise descriptions, answer sheets, demos and GIS data. The exercises deal with four different types of hazards: earthquakes, flooding, technological hazards, and landslides. One important consideration in designing the course is that people from developing countries should not be restricted in using it due to the financial burden of software acquisition. Therefore the aim was to use Open Source software as a basis. The GIS exercises are written for the ILWIS software. All exercises have also been integrated into a WebGIS, using the open-source software CartoWeb (GNU-licensed). It is modular and customizable thanks to its object-oriented architecture, and is based on a hierarchical structure (to manage and organize every package of information for every step required in risk assessment).
Different switches for every component of the risk assessment

  19. Long-term multi-hazard assessment for El Misti volcano (Peru)

    NASA Astrophysics Data System (ADS)

    Sandri, Laura; Thouret, Jean-Claude; Constantinescu, Robert; Biass, Sébastien; Tonini, Roberto

    2014-02-01

    Although this study does not intend to replace the current El Misti hazard map, the quantitative results of this probabilistic multi-hazard assessment can be incorporated into a multi-risk analysis, to support decision makers in any future improvement of the current hazard evaluation, such as further land-use planning and possible emergency management.

  20. A new remote hazard and risk assessment framework for glacial lakes in the Nepal Himalaya

    NASA Astrophysics Data System (ADS)

    Rounce, David R.; McKinney, Daene C.; Lala, Jonathan M.; Byers, Alton C.; Watson, C. Scott

    2016-08-01

    Glacial lake outburst floods (GLOFs) pose a significant threat to downstream communities and infrastructure due to their potential to rapidly unleash stored lake water. The most common triggers of these GLOFs are mass movement entering the lake and/or the self-destruction of the terminal moraine due to hydrostatic pressures or a buried ice core. This study initially uses previous qualitative and quantitative assessments to understand the hazards associated with eight glacial lakes in the Nepal Himalaya that are widely considered to be highly dangerous. The previous assessments yield conflicting classifications with respect to each glacial lake, which spurred the development of a new holistic, reproducible, and objective approach based solely on remotely sensed data. This remote hazard assessment analyzes mass movement entering the lake, the stability of the moraine, and lake growth in conjunction with a geometric GLOF to determine the downstream impacts such that the present and future risk associated with each glacial lake may be quantified. The new approach is developed within a hazard, risk, and management action framework with the aim that this remote assessment may guide future field campaigns, modeling efforts, and ultimately risk-mitigation strategies. The remote assessment was found to provide valuable information regarding the hazards faced by each glacial lake and results were discussed within the context of the current state of knowledge to help guide future efforts.

  1. Earthquake Hazard Assessment: Basics of Evaluation

    NASA Astrophysics Data System (ADS)

    Kossobokov, Vladimir

    2016-04-01

    Seismic hazard assessment (SHA) is not an easy task: it implies a delicate application of statistics to data of limited size and varying accuracy. Earthquakes follow the Unified Scaling Law, which generalizes the Gutenberg-Richter relationship by taking into account the naturally fractal distribution of their sources. Moreover, earthquakes, including great and mega events, are clustered in time, and their sequences have irregular recurrence intervals. Furthermore, earthquake-related observations are limited to the most recent decades (or, in a few rare cases, centuries). Evidently, all this complicates reliable assessment of seismic hazard and the associated risks. Making SHA claims, either termless or time-dependent (so-called t-DASH), quantitatively probabilistic within the most popular objectivist viewpoint on probability requires a long series of "yes/no" trials, which cannot be obtained without extended rigorous testing of the method's predictions against real observations. Therefore, we reiterate the necessity and possibility of applying the modified tools of Earthquake Prediction Strategies, in particular the Error Diagram, introduced by G.M. Molchan in the early 1990s for the evaluation of SHA, and the Seismic Roulette null hypothesis as a measure of the alerted space. The set of errors, i.e. the rates of failure and of the alerted space-time volume, compared to those obtained in the same number of random-guess trials, permits evaluating the effectiveness of an SHA method and determining the optimal choice of its parameters with regard to specified cost-benefit functions. This and other information obtained in such testing supplies us with a realistic estimate of confidence in SHA results and related recommendations on the level of risk for decision making with regard to engineering design, insurance, and emergency management.
    These basics of SHA evaluation are briefly exemplified with a few examples, which are analysed in more detail in a poster of
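
    The Error Diagram mentioned above plots the miss rate (fraction of events not covered by an alert) against the alerted fraction of space or space-time; random guessing lies on the diagonal where the two sum to one. A toy sketch with an entirely random alert score, so the points should scatter near that diagonal, under assumed data:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy setting: 100 spatial cells, an alert score per cell, and 20 observed
# event cells (hypothetical data, purely to illustrate the diagram).
n_cells = 100
score = rng.random(n_cells)                 # SHA-derived alert level per cell
true_cells = rng.choice(n_cells, 20, replace=False)

def molchan_point(threshold):
    """Return (alerted-space fraction tau, miss rate nu) for one alert threshold."""
    alerted = score >= threshold
    tau = alerted.mean()
    nu = np.mean(~alerted[true_cells])      # fraction of events missed
    return tau, nu

# Sweep thresholds to trace the diagram; random guessing follows nu = 1 - tau.
for thr in (0.2, 0.5, 0.8):
    tau, nu = molchan_point(thr)
    print(f"thr={thr}: tau={tau:.2f}, nu={nu:.2f}, random-guess nu={1 - tau:.2f}")
```

    An effective SHA method produces points well below the diagonal; the random score here is exactly the Seismic Roulette null hypothesis against which real methods are tested.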

  2. Exploration of resilience assessments for natural hazards

    NASA Astrophysics Data System (ADS)

    Lo Jacomo, Anna; Han, Dawei; Champneys, Alan

    2017-04-01

    The occurrence of extreme events due to natural hazards is difficult to predict. Extreme events are stochastic in nature, there is a lack of long term data on their occurrence, and there are still gaps in our understanding of their physical processes. This difficulty in prediction will be exacerbated by climate change and human activities. Yet traditional risk assessments measure risk as the probability of occurrence of a hazard, multiplied by the consequences of the hazard occurring, which ignores the recovery process. In light of the increasing concerns on disaster risks and the related system recovery, resilience assessments are being used as an approach which complements and builds on traditional risk assessments and management. In mechanical terms, resilience refers to the amount of energy per unit volume that a material can absorb while maintaining its ability to return to its original shape. Resilience was first applied in the fields of psychology and ecology, and more recently has been used in areas such as social sciences, economics, and engineering. A common metaphor for understanding resilience is the stability landscape. The landscape consists of a surface of interconnected basins, where each basin represents different states of a system, which is a point on the stability landscape. The resilience of the system is its capacity and tendency to remain within a particular basin. This depends on the topology of the landscape, on the system's current position, and on its reaction to different shocks and stresses. In practical terms, resilience assessments have been conducted for various purposes in different sectors. These assessments vary in their required inputs, the methodologies applied, and the output they produce. Some measures used for resilience assessments are hazard independent. These focus on the intrinsic capabilities of a system, for example the insurance coverage of a community, or the buffer capacity of a water storage reservoir. Other
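
    The stability-landscape metaphor has a direct mathematical reading: basins are attractors of a potential surface, and resilience is the system's capacity to settle back into its basin after a shock. A toy double-well illustration (the potential and shock sizes are assumptions, not drawn from any cited assessment):

```python
def potential(x):
    """Double-well stability landscape: two basins separated by a barrier at x = 0."""
    return x ** 4 - 2.0 * x ** 2          # minima at x = -1 and x = +1

def basin(x0, steps=1000, dt=0.01):
    """Follow the gradient downhill to find which basin the state settles into."""
    x = x0
    for _ in range(steps):
        grad = 4.0 * x ** 3 - 4.0 * x     # derivative of the potential
        x -= dt * grad
    return -1.0 if x < 0 else 1.0

# A shock displaces the system from its equilibrium at x = 1: does it return?
for shock in (0.5, 1.5, 2.5):
    print(f"shock {shock}: settles in basin {basin(1.0 - shock)}")
```

    A shock of 0.5 leaves the state inside its original basin, while larger shocks push it over the barrier into the other regime: the qualitative distinction resilience assessments try to quantify.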

  3. Volcanic-hazards assessments; past, present, and future

    USGS Publications Warehouse

    Crandell, D.R.

    1991-01-01

    Worldwide interest in volcanic-hazards assessments was greatly stimulated by the 1980 eruption of Mount St. Helens, just 2 years after a hazards assessment of the volcano was published in U.S. Geological Survey Bulletin 1383-C. Many of the hazards anticipated in that assessment were realized during the climactic eruption on May 18, although the extent of the unprecedented and devastating lateral blast was not anticipated.

  4. Risk assessment of major hazards and its application in urban planning: a case study.

    PubMed

    Zhou, Yafei; Liu, Mao

    2012-03-01

    With the rapid development of industry in China, the number of establishments proposed or under construction is increasing year by year, and many handle flammable, explosive, toxic, harmful, and otherwise dangerous substances. Accidents such as fire, explosion, and toxic dispersion inevitably happen. Accidents at these major hazards in cities cause large numbers of casualties and property losses. It is increasingly important to analyze the risk of major hazards in cities realistically and to plan and utilize the surrounding land on the basis of the risk analysis results, thereby reducing the hazards. A theoretical system for risk assessment of major hazards in cities is proposed in this article, and the major hazard risk for an entire city is analyzed quantitatively. Risks of various major accidents are considered together, their superposition effect is analyzed, individual risk contours for the entire city are drawn, and the level of risk in the city is assessed using "as low as reasonably practicable" (ALARP) guidelines. After the entire city's individual risk distribution is obtained, risk zones are divided according to the corresponding HSE individual risk criteria, and land-use planning suggestions are proposed. Finally, a city in China is used as an example to illustrate the risk assessment process for a city's major hazards and its application in urban land-use planning. The proposed method has theoretical and practical significance for establishing and improving risk analysis of major hazards and urban land-use planning: major urban public risk is avoided, while the land is utilized in the best possible way in order to obtain the maximum benefit from its use.
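
    The superposition of individual risk from several major hazards, followed by ALARP-style zoning, can be sketched on a toy grid as below. The distance-decay model, installation locations and zoning thresholds are assumptions for illustration (the thresholds are not the HSE's published criteria):

```python
import numpy as np

# Toy city grid (20x20 cells). Each major hazard contributes an individual
# risk field that decays with distance from its site (assumed Gaussian decay).
n = 20
yy, xx = np.mgrid[0:n, 0:n]

def risk_field(x0, y0, peak, scale):
    """Hypothetical distance-decay annual individual risk around one installation."""
    d2 = (xx - x0) ** 2 + (yy - y0) ** 2
    return peak * np.exp(-d2 / (2 * scale ** 2))

# Two hazardous installations; annual individual risk per cell.
r1 = risk_field(5, 5, 1e-4, 2.0)
r2 = risk_field(14, 12, 5e-5, 3.0)

# Superposition: P(harm from any hazard) = 1 - prod(1 - P_i); ~ the sum when small.
total = 1.0 - (1.0 - r1) * (1.0 - r2)

# ALARP-style zoning against illustrative criteria.
unacceptable = total > 1e-4
broadly_ok = total < 1e-6
print(f"cells unacceptable: {unacceptable.sum()}, broadly acceptable: {broadly_ok.sum()}")
```

    Contouring `total` at the two thresholds yields the individual-risk contours and the intermediate ALARP band used for land-use planning suggestions.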

  5. 10 CFR 850.21 - Hazard assessment.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 4 2010-01-01 2010-01-01 false Hazard assessment. 850.21 Section 850.21 Energy DEPARTMENT OF ENERGY CHRONIC BERYLLIUM DISEASE PREVENTION PROGRAM Specific Program Requirements § 850.21 Hazard... with the greatest risks of exposure are evaluated first. (b) The responsible employer must ensure that...

  6. GRC Payload Hazard Assessment: Supporting the STS-107 Accident Investigation

    NASA Technical Reports Server (NTRS)

    Schoren, William R.; Zampino, Edward J.

    2004-01-01

    A hazard assessment was conducted on the GRC managed payloads in support of a NASA Headquarters Code Q request to examine STS-107 payloads and determine if they were credible contributors to the Columbia accident. This assessment utilized each payload's Final Flight Safety Data Package for hazard identification. An applicability assessment was performed and most of the hazards were eliminated because they dealt with payload operations or crew interactions. A Fault Tree was developed for all the hazards deemed applicable and the safety verification documentation was reviewed for these applicable hazards. At the completion of this hazard assessment, it was concluded that none of the GRC managed payloads were credible contributors to the Columbia accident.

  7. Methodologies For A Physically Based Rockfall Hazard Assessment

    NASA Astrophysics Data System (ADS)

    Agliardi, F.; Crosta, G. B.; Guzzetti, F.; Marian, M.

    Rockfall hazard assessment is an important land planning tool in alpine areas, where settlements progressively expand across rockfall-prone areas, raising the vulnerability of the elements at risk, the worth of potential losses and the restoration costs. Nevertheless, hazard definition is not simple to achieve in practice and sound, physically based assessment methodologies are still missing. In addition, the high mobility of rockfalls implies a more difficult hazard definition with respect to other slope instabilities for which runout is minimal. When coping with rockfalls, hazard assessment involves complex definitions for "occurrence probability" and "intensity". The local occurrence probability must derive from the combination of the triggering probability (related to the geomechanical susceptibility of rock masses to fail) and the transit or impact probability at a given location (related to the motion of falling blocks). The intensity (or magnitude) of a rockfall is a complex function of the mass, velocity and fly height of the involved blocks, and can be defined in many different ways depending on the adopted physical description and "destructiveness" criterion. This work is an attempt to evaluate rockfall hazard using the results of numerical modelling performed by an original 3D rockfall simulation program. This is based on a kinematic algorithm and allows the spatially distributed simulation of rockfall motion on a three-dimensional topography described by a DTM. The code provides raster maps portraying the maximum frequency of transit, velocity and height of blocks at each model cell, easily combined in a GIS in order to produce physically based rockfall hazard maps. The results of some three-dimensional rockfall models, performed at both regional and local scale in areas where rockfall-related problems are well known, have been used to assess rockfall hazard, by adopting an objective approach based on three-dimensional matrixes providing a positional

  8. Progress in NTHMP Hazard Assessment

    USGS Publications Warehouse

    Gonzalez, F.I.; Titov, V.V.; Mofjeld, H.O.; Venturato, A.J.; Simmons, R.S.; Hansen, R.; Combellick, Rodney; Eisner, R.K.; Hoirup, D.F.; Yanagi, B.S.; Yong, S.; Darienzo, M.; Priest, G.R.; Crawford, G.L.; Walsh, T.J.

    2005-01-01

    The Hazard Assessment component of the U.S. National Tsunami Hazard Mitigation Program has completed 22 modeling efforts covering 113 coastal communities with an estimated population of 1.2 million residents at risk. Twenty-three evacuation maps have also been completed. Important improvements in organizational structure have been made with the addition of two State geotechnical agency representatives to the Steering Group membership, and progress has been made on other improvements suggested by program reviewers. © Springer 2005.

  9. Remotely Sensed Quantitative Drought Risk Assessment in Vulnerable Agroecosystems

    NASA Astrophysics Data System (ADS)

    Dalezios, N. R.; Blanta, A.; Spyropoulos, N. V.

    2012-04-01

    Hazard may be defined as a potential threat to humans and their welfare, and risk (or consequence) as the probability of a hazard occurring and creating loss. Drought is considered one of the major natural hazards, with significant impact on agriculture, environment, economy and society. This paper deals with drought risk assessment, the first step of which is designed to find out what the problems are; it comprises three distinct steps, namely risk identification, risk estimation and risk evaluation. Risk management is not covered in this paper, and there should be a fourth step to address the need for feedback and to take post-audits of all risk assessment exercises. In particular, quantitative drought risk assessment is attempted by using statistical methods. For the quantification of drought, the Reconnaissance Drought Index (RDI) is employed, which is a new index based on hydrometeorological parameters, such as precipitation and potential evapotranspiration. The remotely sensed estimation of RDI is based on NOAA-AVHRR satellite data for a period of 20 years (1981-2001). The study area is Thessaly, central Greece, which is a drought-prone agricultural region characterized by vulnerable agriculture. Specifically, the undertaken drought risk assessment processes are specified as follows: 1. Risk identification: This step involves drought quantification and monitoring based on remotely sensed RDI and the extraction of several features such as severity, duration, areal extent, onset and end time. Moreover, it involves a drought early warning system based on the above parameters. 2. Risk estimation: This step includes an analysis of drought severity, frequency and their relationships. 3. Risk evaluation: This step covers drought evaluation based on analysis of RDI images before and after each drought episode, which usually lasts one hydrological year (12 months). The results of these three-step drought assessment processes are considered quite satisfactory in a drought-prone region such as Thessaly in central
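
As a rough illustration of the RDI computation described above, the sketch below evaluates the initial index, the ratio of cumulative precipitation to cumulative potential evapotranspiration over a reference period, and a standardized form based on the logarithm of that ratio. The monthly values and climatology are invented, not the Thessaly data.

```python
# Illustrative sketch of the Reconnaissance Drought Index (RDI).
# Initial RDI = sum(P) / sum(PET); the standardized form compares
# ln(alpha) against a multi-year climatology of alpha values.
import math
from statistics import mean, stdev

def rdi_initial(precip, pet):
    """Initial RDI: cumulative precipitation over cumulative PET."""
    return sum(precip) / sum(pet)

def rdi_standardized(alpha, alphas_climatology):
    """Standardize ln(alpha) against a climatology of alphas."""
    y = [math.log(a) for a in alphas_climatology]
    return (math.log(alpha) - mean(y)) / stdev(y)

# Hypothetical hydrological year: monthly P and PET (mm)
p   = [45, 60, 80, 70, 50, 30, 10, 5, 5, 15, 25, 40]
pet = [30, 25, 20, 25, 40, 80, 120, 130, 110, 70, 45, 35]
alpha = rdi_initial(p, pet)
print(round(alpha, 3))
```

Negative standardized values indicate drier-than-normal years, which is what the severity and duration features would be extracted from.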

  10. Debris flow hazards mitigation--Mechanics, prediction, and assessment

    USGS Publications Warehouse

    Chen, C.-L.; Major, J.J.

    2007-01-01

    These proceedings contain papers presented at the Fourth International Conference on Debris-Flow Hazards Mitigation: Mechanics, Prediction, and Assessment held in Chengdu, China, September 10-13, 2007. The papers cover a wide range of topics on debris-flow science and engineering, including the factors triggering debris flows, geomorphic effects, mechanics of debris flows (e.g., rheology, fluvial mechanisms, erosion and deposition processes), numerical modeling, various debris-flow experiments, landslide-induced debris flows, assessment of debris-flow hazards and risk, field observations and measurements, monitoring and alert systems, structural and non-structural countermeasures against debris-flow hazards and case studies. The papers reflect the latest developments and advances in debris-flow research. Several studies discuss the development and application of Geographic Information System (GIS) and Remote Sensing (RS) technologies in debris-flow hazard/risk assessment. Timely topics presented in a few papers also include the development of new or innovative techniques for debris-flow monitoring and alert systems, especially an infrasound acoustic sensor for detecting debris flows. Many case studies illustrate a wide variety of debris-flow hazards and related phenomena as well as their hazardous effects on human activities and settlements.

  11. Spatially resolved hazard and exposure assessments: an example of lead in soil at Lavrion, Greece.

    PubMed

    Tristán, E; Demetriades, A; Ramsey, M H; Rosenbaum, M S; Stavrakis, P; Thornton, I; Vassiliades, E; Vergou, K

    2000-01-01

    Spatially resolved hazard assessment (SRHA) and spatially resolved exposure assessment (SREA) are methodologies that have been devised for assessing child exposure to soil containing environmental pollutants. These are based on either a quantitative or a semiquantitative approach. The feasibility of the methodologies has been demonstrated in a study assessing child exposure to Pb accessible in soil at the town of Lavrion in Greece. Using a quantitative approach, both measured and kriged concentrations of Pb in soil are compared with an "established" statutory threshold value. The probabilistic approach gives a refined classification of the contaminated land, since it takes into consideration the uncertainty in both the actual measurement and estimated kriged values. Two exposure assessment models (i.e., IEUBK and HESP) are used as the basis of the quantitative SREA methodologies. The significant correlation between the blood-Pb predictions, using the IEUBK model, and measured concentrations provides a partial validation of the method, because it allows for the uncertainty in the measurements and the lack of some site-specific measurements. The semiquantitative applications of SRHA and SREA incorporate both qualitative information (e.g., land use and dustiness of waste) and quantitative information (e.g., distance from wastes and distance from industry). The significant correlation between the results of these assessments and the measured blood-Pb levels confirms the robust nature of this approach. Successful application of these methodologies could reduce the cost of the assessment and allow areas to be prioritized for further investigation, remediation, or risk management.
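
The probabilistic refinement described above can be made concrete: given a kriged estimate and its kriging standard deviation, the probability that the true soil Pb concentration exceeds the statutory threshold follows from a normal assumption. The sketch below illustrates this; the concentrations and threshold are invented, not the Lavrion survey values.

```python
# Sketch of probabilistic classification of contaminated land: probability
# that the true concentration exceeds a threshold, given a kriged estimate
# and its uncertainty (normal assumption). Values are illustrative.
import math

def prob_exceeds(kriged_mean, kriging_sd, threshold):
    """P(true concentration > threshold) under a normal model."""
    z = (threshold - kriged_mean) / kriging_sd
    # P(X > threshold) = 1 - Phi(z), expressed via the complementary erf
    return 0.5 * math.erfc(z / math.sqrt(2))

# Kriged estimate of 450 mg/kg with sd 100 against a 500 mg/kg threshold
print(round(prob_exceeds(450.0, 100.0, 500.0), 3))  # 0.309
```

A cell whose point estimate sits below the threshold can still carry a ~31% exceedance probability, which is exactly the refinement over a hard threshold comparison.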

  12. Conceptual geoinformation model of natural hazards risk assessment

    NASA Astrophysics Data System (ADS)

    Kulygin, Valerii

    2016-04-01

    Natural hazards are the major threat to safe interactions between nature and society. The assessment of natural hazard impacts and their consequences is important in spatial planning and resource management. Today there is a challenge to advance our understanding of how socio-economic and climate changes will affect the frequency and magnitude of hydro-meteorological hazards and associated risks. However, the impacts of different types of natural hazards on various marine and coastal economic activities are not of the same type. In this study, a conceptual geomodel of risk assessment is presented to highlight the differentiation by type of economic activity in extreme event risk assessment. The marine and coastal ecosystems are considered as the objects of management, on the one hand, and as the place of natural hazards' origin, on the other hand. One of the key elements in describing such systems is the spatial characterization of their components. Assessment of ecosystem state is based on ecosystem indicators (indexes), which are used to identify changes over time. The scenario approach is utilized to account for the spatio-temporal dynamics and uncertainty factors. Two types of scenarios are considered: scenarios of ecosystem service use by economic activities, and scenarios of extreme events and related hazards. The reported study was funded by RFBR, according to the research project No. 16-35-60043 mol_a_dk.

  13. A Life Cycle Based Approach to Multi-Hazard Risk Assessment

    NASA Astrophysics Data System (ADS)

    Keen, A. S.; Lynett, P. J.

    2017-12-01

    Small craft harbors are important assets for many coastal communities, providing a transition from land to ocean. Because of the damage resulting from the 2010 Chile and 2011 Japanese tele-tsunamis, the tsunami risk to the small craft marinas in California has become an important concern. However, tsunamis represent only one of many hazards a harbor is likely to experience in California. Other natural hazards, including wave attack, storm surge and sea level rise, can all damage a harbor but are not typically addressed in traditional risk studies. Existing approaches to assessing small craft harbor vulnerability typically look at single events, assigning likely damage levels to each event. However, a harbor will likely experience damage from several different types of hazards over its service life, with each event contributing proportionally to the total damage state. A new, fully probabilistic risk method will be presented which considers the distribution of return periods for various hazards over a harbor's service life. The likelihood of failure is connected to each hazard via vulnerability curves. By simply tabulating the expected damage levels from each event, the method provides a quantitative measure of a harbor's risk from various types of hazards as well as the likelihood of failure (i.e. cumulative risk) during the service life. Crescent City Harbor in Northern California and Kings Harbor in Southern California have been chosen as case studies. The harbors are dynamically different and were chosen to highlight the strengths and weaknesses of the method. Findings of each study will focus on assisting the stakeholders and decision makers to better understand the relative risk to each harbor, with the goal of providing them with a tool to better plan for the future maritime environment.
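
The life-cycle tabulation described above can be sketched as a small Monte Carlo: sample hazard occurrences from their return periods over the service life, convert each occurrence to damage through a vulnerability value, and accumulate per hazard. The hazards, return periods and damage fractions below are invented placeholders, not the calibrated values of the study.

```python
# Minimal life-cycle risk sketch: expected cumulative damage per hazard
# over a harbor's service life. All numbers are illustrative assumptions.
import random

random.seed(1)

SERVICE_LIFE_YEARS = 50
# hazard -> (annual occurrence probability = 1/return period,
#            damage fraction per event)
HAZARDS = {
    "tsunami":     (1 / 100, 0.60),
    "storm_surge": (1 / 25,  0.10),
    "wave_attack": (1 / 10,  0.03),
}

def simulate_damage(n_trials=10000):
    """Average cumulative damage per hazard over one service life."""
    totals = {h: 0.0 for h in HAZARDS}
    for _ in range(n_trials):
        for hazard, (p_annual, dmg) in HAZARDS.items():
            for _ in range(SERVICE_LIFE_YEARS):
                if random.random() < p_annual:
                    totals[hazard] += dmg
    return {h: t / n_trials for h, t in totals.items()}

expected = simulate_damage()
for hazard, dmg in sorted(expected.items(), key=lambda kv: -kv[1]):
    print(hazard, round(dmg, 2))
```

With these placeholder numbers the rare-but-severe tsunami dominates the expected life-cycle damage even though the frequent hazards occur far more often, which is the kind of comparison the method is meant to expose.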

  14. Flood hazard assessment in areas prone to flash flooding

    NASA Astrophysics Data System (ADS)

    Kvočka, Davor; Falconer, Roger A.; Bray, Michaela

    2016-04-01

    Contemporary climate projections suggest that there will be an increase in the occurrence of high-intensity rainfall events in the future. These precipitation extremes are usually the main cause of extreme flooding, such as flash flooding. Flash floods are among the most unpredictable, violent and fatal natural hazards in the world. Furthermore, flash flooding is expected to occur even more frequently in the future due to the more frequent occurrence of extreme weather events, which will greatly increase the danger it poses to people. This being the case, there will be a need for high-resolution flood hazard maps in areas susceptible to flash flooding. This study investigates what type of flood hazard assessment methods should be used for assessing the flood hazard to people caused by flash flooding. Two different types of flood hazard assessment methods were tested: (i) a widely used method based on an empirical analysis, and (ii) a new, physically based and experimentally calibrated method. Two flash flood events were considered herein, namely the 2004 Boscastle flash flood and the 2007 Železniki flash flood. The results obtained in this study suggest that in areas susceptible to extreme flooding, the flood hazard assessment should be conducted using methods based on a mechanics-based analysis. In comparison to standard flood hazard assessment methods, these physically based methods: (i) take into account all of the physical forces which act on a human body in floodwater, (ii) successfully adapt to abrupt changes in the flow regime, which often occur in flash flood events, and (iii) rapidly assess a flood hazard index in a relatively short period of time.
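
The contrast between the two method families can be sketched numerically. The empirical rating below follows the widely used UK formula HR = d(v + 0.5) + DF (depth d, velocity v, debris factor DF); the "mechanics-based" check is a simplified toppling criterion comparing the hydrodynamic moment on a standing person against a stability threshold. The mechanics coefficients and critical moment are invented for illustration, not the calibrated method of the paper.

```python
# Empirical vs simplified mechanics-based hazard-to-people checks.
# The mechanics parameters below are illustrative assumptions.

def empirical_hazard_rating(depth_m, velocity_ms, debris_factor=0.5):
    """UK-style flood hazard rating HR = d*(v + 0.5) + DF."""
    return depth_m * (velocity_ms + 0.5) + debris_factor

def mechanics_unstable(depth_m, velocity_ms,
                       drag_coeff=1.1, width_m=0.4,
                       rho=1000.0, critical_moment_nm=250.0):
    """Toppling if the moment of the drag force about the feet exceeds
    a critical value (illustrative threshold, not a calibrated one)."""
    drag = 0.5 * rho * drag_coeff * width_m * depth_m * velocity_ms ** 2
    moment = drag * depth_m / 2  # resultant force taken at mid-depth
    return moment > critical_moment_nm

# Shallow but fast flash-flood flow
print(empirical_hazard_rating(0.5, 4.0))  # 2.75
print(mechanics_unstable(0.5, 4.0))       # True
```

The force-based check responds directly to the velocity-squared drag term, which is why such methods adapt better to the abrupt regime changes typical of flash floods.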

  15. Methodologies for the assessment of earthquake-triggered landslides hazard. A comparison of Logistic Regression and Artificial Neural Network models.

    NASA Astrophysics Data System (ADS)

    García-Rodríguez, M. J.; Malpica, J. A.; Benito, B.

    2009-04-01

    In recent years, interest in landslide hazard assessment studies has increased substantially. They are appropriate for evaluation and mitigation plan development in landslide-prone areas. There are several techniques available for landslide hazard research at a regional scale. Generally, they can be classified into two groups: qualitative and quantitative methods. Most qualitative methods tend to be subjective, since they depend on expert opinion and represent hazard levels in descriptive terms. On the other hand, quantitative methods are objective and are commonly used due to the correlation between instability factors and the location of landslides. Within this group, statistical approaches and newer heuristic techniques based on artificial intelligence (artificial neural networks (ANN), fuzzy logic, etc.) provide rigorous analyses for assessing landslide hazard over large regions. However, they depend on the qualitative and quantitative data, scale, types of movements and characteristic factors used. We analysed and compared an approach for assessing earthquake-triggered landslide hazard using logistic regression (LR) and artificial neural networks (ANN) with a back-propagation learning algorithm. An application has been developed for El Salvador, a country in Central America where earthquake-triggered landslides are common phenomena. In a first phase, we analysed the susceptibility and hazard associated with the seismic scenario of the 13 January 2001 earthquake. We calibrated the models using data from the landslide inventory for this scenario. These analyses require input variables representing physical parameters that contribute to the initiation of slope instability, for example slope gradient, elevation, aspect, mean annual precipitation, lithology, land use, and terrain roughness, while the occurrence or non-occurrence of landslides is considered as the dependent variable.
The results of the landslide susceptibility analysis are checked using landslide
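
The logistic-regression formulation described above reduces, per map cell, to a sigmoid of a linear combination of the instability factors. The toy sketch below illustrates this with three of the listed variables; the coefficients are invented for illustration, whereas in the study they would be calibrated against the landslide inventory.

```python
# Toy logistic-regression susceptibility model: P(landslide) as a sigmoid
# of a weighted sum of instability factors. Coefficients are invented.
import math

def susceptibility(slope_deg, elevation_m, precip_mm, b0=-6.0,
                   b_slope=0.12, b_elev=0.001, b_precip=0.001):
    """Probability of landslide occurrence for one map cell."""
    z = b0 + b_slope * slope_deg + b_elev * elevation_m + b_precip * precip_mm
    return 1.0 / (1.0 + math.exp(-z))  # probability in (0, 1)

# Steep, high, wet cell vs gentle, low, dry cell
print(round(susceptibility(40, 1500, 1800), 3))
print(round(susceptibility(5, 200, 600), 3))
```

An ANN replaces the single linear term z with learned nonlinear combinations of the same inputs, which is the essence of the comparison the paper makes.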

  16. Multi-hazards risk assessment at different levels

    NASA Astrophysics Data System (ADS)

    Frolova, N.; Larionov, V.; Bonnin, J.

    2012-04-01

    Natural and technological disasters are becoming more frequent and devastating. Social and economic losses due to these events increase annually, which is clearly related to the evolution of society. Natural hazard identification and analysis, as well as natural risk assessment taking into account secondary technological accidents, are the first steps in a prevention strategy aimed at saving lives and protecting property against future events. The paper addresses methodological issues of integrated natural and technological risk assessment and mapping at different levels [1, 2]. At the country level, the most hazardous natural processes which may result in fatalities, injuries and economic loss in the Russian Federation are considered: earthquakes, landslides, mud flows, floods, storms and avalanches. A special GIS environment for the country's territory was developed which includes information about hazard levels and recurrence, impact databases for the last 20 years, as well as models for estimating damage and casualties caused by these hazards. Federal maps of individual and collective seismic risk, as well as multi-hazard natural risk maps, are presented. Examples of regional seismic risk assessment taking into account secondary accidents at fire, explosion and chemically hazardous facilities, and of regional integrated risk assessment, are given for the earthquake-prone areas of the Russian Federation. The paper also gives examples of loss computations due to scenario earthquakes taking into account accidents triggered by strong events at critical facilities: fire and chemically hazardous facilities, including oil pipeline routes located in the earthquake-prone areas. The estimations of individual seismic risk obtained are used by EMERCOM of the Russian Federation, as well as by other federal and local authorities, for planning and implementing preventive measures aimed at saving lives and protecting property against future disastrous events. The

  17. Current Knowledge on the Use of Computational Toxicology in Hazard Assessment of Metallic Engineered Nanomaterials.

    PubMed

    Chen, Guangchao; Peijnenburg, Willie; Xiao, Yinlong; Vijver, Martina G

    2017-07-12

    As listed by the European Chemicals Agency, the three elements in evaluating the hazards of engineered nanomaterials (ENMs) include the integration and evaluation of toxicity data, categorization and labeling of ENMs, and derivation of hazard threshold levels for human health and the environment. Assessing the hazards of ENMs solely based on laboratory tests is time-consuming, resource intensive, and constrained by ethical considerations. The adoption of computational toxicology into this task has recently become a priority. Alternative approaches such as (quantitative) structure-activity relationships ((Q)SAR) and read-across are of significant help in predicting nanotoxicity and filling data gaps, and in classifying the hazards of ENMs to individual species. Building on this, the species sensitivity distribution (SSD) approach can be used to establish ENM hazard thresholds that sufficiently protect the ecosystem. This article critically reviews the current knowledge on the development of in silico models for predicting and classifying the hazard of metallic ENMs, and the development of SSDs for metallic ENMs. Further discussion includes the significance of well-curated experimental datasets and the interpretation of toxicity mechanisms of metallic ENMs based on reported models. An outlook is also given on future directions of research in this frontier.
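
The SSD step mentioned above is often implemented by fitting a log-normal distribution to per-species toxicity endpoints and reading off the HC5, the concentration hazardous to 5% of species. The sketch below illustrates that calculation; the endpoint values are invented placeholders, not data from the review.

```python
# Sketch of a log-normal species sensitivity distribution (SSD) and the
# HC5 derivation. The EC50 values are invented for illustration.
import math
from statistics import mean, stdev

def hc5(endpoints_mg_l):
    """HC5: 5th percentile of a log-normal SSD fitted to the endpoints."""
    logs = [math.log10(x) for x in endpoints_mg_l]
    z_05 = -1.645  # 5th percentile of the standard normal
    return 10 ** (mean(logs) + z_05 * stdev(logs))

# Hypothetical EC50s (mg/L) for several species exposed to a metallic ENM
ec50s = [0.8, 1.5, 3.2, 6.0, 12.5, 25.0]
print(round(hc5(ec50s), 3))
```

The HC5 sits well below the most sensitive tested species here, reflecting the extrapolation the SSD makes from a small species sample to the wider ecosystem.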

  18. Assessing Surface Fuel Hazard in Coastal Conifer Forests through the Use of LiDAR Remote Sensing

    NASA Astrophysics Data System (ADS)

    Koulas, Christos

    This thesis examines a method of predicting conventional fire hazards using data drawn from specific regions, namely the Sooke and Goldstream watershed regions in coastal British Columbia. It investigates whether LiDAR data can be used to describe conventional forest stand fire hazard classes. Three objectives guided this thesis: to discuss the variables associated with fire hazard, specifically the distribution and makeup of fuel; to examine the relationship between derived LiDAR biometrics and forest attributes related to hazard assessment factors defined by the Capital Regional District (CRD); and to assess the viability of the LiDAR biometric decision tree in the CRD based on current frameworks for use. The research method uses quantitative datasets to assess the optimal generalization of these types of fire hazard data through discriminant analysis. Findings illustrate significant limitations of the LiDAR-derived data, and reflect the literature in that flawed field application of data modelling techniques has led to a disconnect between the ways in which fire hazard models have been intended to be used by scholars and the ways in which they are used by those tasked with the prevention of forest fires. It can be concluded that a significant trade-off exists between the computational requirements of wildfire simulation models and the algorithms commonly used by field teams to apply these models with remote sensing data, and that CRD forest management practices would need to change to incorporate a decision tree model in order to decrease risk.

  19. Challenges in assessing seismic hazard in intraplate Europe

    NASA Astrophysics Data System (ADS)

    Brooks, Edward; Stein, Seth; Liu, Mian; Camelbeeck, Thierry; Merino, Miguel; Landgraf, Angela; Hintersberger, Esther; Kübler, Simon

    2016-04-01

    Intraplate seismicity is often characterized by episodic, clustered and migrating earthquakes and extended aftershock sequences. Can these observations - primarily from North America, China and Australia - usefully be applied to seismic hazard assessment for intraplate Europe? Existing assessments are based on instrumental and historical seismicity of the past c. 1000 years, as well as some data for active faults. This time span probably fails to capture typical large-event recurrence intervals of the order of tens of thousands of years. Palaeoseismology helps to lengthen the observation window, but preferentially produces data in regions suspected to be seismically active. Thus the expected maximum magnitudes of future earthquakes are fairly uncertain, possibly underestimated, and earthquakes are likely to occur in unexpected locations. These issues particularly arise in considering the hazards posed by low-probability events to both heavily populated areas and critical facilities. For example, are the variations in seismicity (and thus assumed seismic hazard) along the Rhine Graben a result of short sampling or are they real? In addition to a better assessment of hazards with new data and models, it is important to recognize and communicate uncertainties in hazard estimates. The more users know about how much confidence to place in hazard maps, the more effectively the maps can be used.

  20. Multi-hazard national-level risk assessment in Africa using global approaches

    NASA Astrophysics Data System (ADS)

    Fraser, Stuart; Jongman, Brenden; Simpson, Alanna; Murnane, Richard

    2016-04-01

    In recent years Sub-Saharan Africa has been characterized by unprecedented opportunity for transformation and sustained growth. However, natural disasters such as droughts, floods, cyclones, earthquakes, landslides, volcanic eruptions and extreme temperatures cause significant economic and human losses, and major development challenges. Quantitative disaster risk assessments are an important basis for governments to understand disaster risk in their country, and to develop effective risk management and risk financing solutions. However, the data-scarce nature of many Sub-Saharan African countries as well as a lack of financing for risk assessments has long prevented detailed analytics. Recent advances in globally applicable disaster risk modelling practices and data availability offer new opportunities. In December 2013 the European Union approved a €60 million contribution to support the development of an analytical basis for risk financing and to accelerate the effective implementation of comprehensive disaster risk reduction. The World Bank's Global Facility for Disaster Reduction and Recovery (GFDRR) was selected as the implementing partner for Result Area 5 of the Program: the "Africa Disaster Risk Assessment and Financing Program." As part of this effort, the GFDRR is overseeing the production of national-level multi-hazard risk profiles for a range of countries in Sub-Saharan Africa, using a combination of national and global datasets and state-of-the-art hazard and risk assessment methodologies. In this presentation, we will highlight the analytical approach behind these assessments, and show results for the first five countries for which the assessment has been completed (Kenya, Uganda, Senegal, Niger and Ethiopia). The presentation will also demonstrate the visualization of the risk assessments in understandable and visually attractive risk profile documents.

  1. A Probabilistic Tsunami Hazard Study of the Auckland Region, Part II: Inundation Modelling and Hazard Assessment

    NASA Astrophysics Data System (ADS)

    Lane, E. M.; Gillibrand, P. A.; Wang, X.; Power, W.

    2013-09-01

    Regional source tsunamis pose a potentially devastating hazard to communities and infrastructure on the New Zealand coast, but major events are very uncommon. This dichotomy of infrequent but potentially devastating hazards makes realistic assessment of the risk challenging. Here, we describe a method to determine a probabilistic assessment of the hazard posed by regional source tsunamis with an "Average Recurrence Interval" of 2,500 years. The method is applied to the east Auckland region of New Zealand. From an assessment of potential regional tsunamigenic events over 100,000 years, the inundation of the Auckland region by the worst 100 events was modelled using a hydrodynamic model, and probabilistic inundation depths on a 2,500-year time scale were determined. Tidal effects on the potential inundation were included by coupling the predicted wave heights with the probability density function of tidal heights at the inundation site. Results show that the more exposed northern section of the east coast and the outer islands in the Hauraki Gulf face the greatest hazard from regional tsunamis in the Auckland region. Incorporating tidal effects into predictions of inundation reduced the predicted hazard compared to modelling all the tsunamis arriving at high tide, giving a more accurate hazard assessment on the specified time scale. This study presents the first probabilistic analysis of dynamic modelling of tsunami inundation for the New Zealand coast and as such provides the most comprehensive assessment of tsunami inundation of the Auckland region from regional source tsunamis available to date.
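
The tidal-coupling step described above amounts to weighting the conditional exceedance of a total water level by the tide-height probability density. The sketch below discretizes that idea; the tide distribution and wave height are invented illustrations, not the Auckland model outputs.

```python
# Sketch of coupling a modelled tsunami wave height with a discretized
# tidal-height probability distribution. All values are illustrative.

def p_exceed(total_level, wave_height_m, tide_pdf):
    """P(wave + tide > total_level); tide_pdf is a list of
    (tide_height_m, probability) pairs summing to 1."""
    return sum(p for tide, p in tide_pdf if wave_height_m + tide > total_level)

# Coarse, symmetric tide distribution (illustrative weights)
tide_pdf = [(-1.0, 0.2), (-0.5, 0.2), (0.0, 0.2), (0.5, 0.2), (1.0, 0.2)]

wave = 3.0  # modelled tsunami wave height at the shore, metres
print(p_exceed(3.2, wave, tide_pdf))  # exceeded only at the higher tides
```

Assuming every tsunami arrives at high tide corresponds to putting all the probability mass at the highest tide level, which is why that assumption inflates the hazard relative to the coupled estimate.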

  2. Hazard Screening Methods for Nanomaterials: A Comparative Study

    PubMed Central

    Murphy, Finbarr; Mullins, Martin; Furxhi, Irini; Costa, Anna L.; Simeone, Felice C.

    2018-01-01

    Hazard identification is the key step in risk assessment and management of manufactured nanomaterials (NM). However, the rapid commercialisation of nano-enabled products continues to outpace the development of a prudent risk management mechanism that is widely accepted by the scientific community and enforced by regulators. Nevertheless, a growing body of academic literature is developing promising quantitative methods. Two approaches have gained significant currency. Bayesian networks (BN) are a probabilistic, machine learning approach, while the weight of evidence (WoE) statistical framework is based on expert elicitation. This comparative study investigates the efficacy of quantitative WoE and Bayesian methodologies in ranking the potential hazard of metal and metal-oxide NMs (TiO2, Ag, and ZnO). This research finds that the hazard ranking is consistent across both risk assessment approaches. The BN and WoE models both utilize physico-chemical, toxicological, and study type data to infer the hazard potential. The BN exhibits more stability when the models are perturbed with new data. The BN has the significant advantage of self-learning with new data; however, this assumes all input data are equally valid. This research finds that a combination of WoE, used to rank input data, with the BN is the optimal hazard assessment framework. PMID:29495342

  3. Multi-scale landslide hazard assessment: Advances in global and regional methodologies

    NASA Astrophysics Data System (ADS)

    Kirschbaum, Dalia; Peters-Lidard, Christa; Adler, Robert; Hong, Yang

    2010-05-01

    Several rainfall infiltration and hydrological flow models have been developed to model slope instability at small spatial scales. This research investigates the potential of applying a more quantitative hydrological model to larger spatial scales, utilizing satellite and surface data inputs that are obtainable over different geographic regions. Due to the significant role that data and methodological uncertainties play in the effectiveness of landslide hazard assessment outputs, the methodology and data inputs are considered within an ensemble uncertainty framework in order to better resolve the contribution and limitations of model inputs and to more effectively communicate the model skill for improved landslide hazard assessment.

  4. Methodology for environmental assessments of oil and hazardous substance spills

    NASA Astrophysics Data System (ADS)

    Davis, W. P.; Scott, G. I.; Getter, C. D.; Hayes, M. O.; Gundlach, E. R.

    1980-03-01

    Scientific assessment of the complex environmental consequences of large spills of oil or other hazardous substances has stimulated development of improved strategies for rapid and valid collection and processing of ecological data. The combination of coastal processes and geological measurements developed by Hayes & Gundlach (1978), together with selected field biological and chemical observations/measurements, provide an ecosystem impact assessment approach which is termed “integrated zonal method of ecological impact assessment.” Ecological assessment of oil and hazardous material spills has been divided into three distinct phases: (1) first-order response studies — conducted at the time of the initial spill event, which gather data to document acute impacts and assist decision-makers in prioritization of cleanup efforts and protection of ecologically sensitive habitats, (2) second-order response studies — conducted two months to one year post-spill, which document any delayed mortality and attempt to identify potential sublethal impacts in sensitive species, and (3) third-order response studies — conducted one to three years post-spill, to document chronic impacts (both lethal and sublethal) to specific indicator species. Data collected during first-order response studies are gathered in a quantitative manner so that the initial assessment may become a baseline for later, more detailed, post-spill scientific efforts. First- and second-order response studies of the “Peck Slip” oil spill in Puerto Rico illustrate the usefulness of this method. The need for contingency planning before a spill has been discussed along with the use of the Vulnerability Index, a method in which coastal environments are classified on a scale of 1 to 10, based upon their potential susceptibility to oiling. A study of the lower Cook Inlet section of the Alaskan coast illustrates the practical application of this method.

  5. Correlating regional natural hazards for global reinsurance risk assessment

    NASA Astrophysics Data System (ADS)

    Steptoe, Hamish; Maynard, Trevor; Economou, Theo; Fox, Helen; Wallace, Emily; Maisey, Paul

    2016-04-01

    Concurrent natural hazards represent an uncertainty in assessing exposure for the insurance industry. The recently implemented Solvency II Directive requires EU insurance companies to fully understand and justify their capital reserving and portfolio decisions. Lloyd's, the London insurance and reinsurance market, commissioned the Met Office to investigate the dependencies between different global extreme weather events (known to the industry as perils), and the mechanisms for these dependencies, with the aim of helping them assess their compound risk to the exposure of multiple simultaneous hazards. In this work, we base the analysis of hazard-to-hazard dependency on the interaction of different modes of global and regional climate variability. Lloyd's defined 16 key hazard regions, including Australian wildfires, flooding in China and EU windstorms, and we investigate the impact of 10 key climate modes on these areas. We develop a statistical model that facilitates rapid risk assessment whilst allowing for both temporal auto-correlation and, crucially, interdependencies between drivers. The simulator itself is built conditionally using autoregressive regression models for each driver conditional on the others. Whilst the baseline assumption within the (re)insurance industry is that different natural hazards are independent of each other, the assumption of independence of meteorological risks requires greater justification. Although our results suggest that most of the 120 hazard-hazard connections considered are likely to be independent of each other, 13 have significant dependence arising from one or more global modes of climate variability. This allows us to create a matrix of linkages describing the hazard dependency structure that Lloyd's can use to inform their understanding of risk.
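
The conditional autoregressive structure described above can be illustrated with a toy pair of coupled AR(1) drivers: each index retains its own temporal autocorrelation while its mean also depends on the current value of the other, producing the cross-driver dependence the matrix of linkages encodes. The coefficients below are invented for illustration, not the fitted values of the Lloyd's study.

```python
# Toy coupled-driver simulator: two AR(1) climate indices with a positive
# cross-dependence term. Coefficients and noise level are illustrative.
import random

random.seed(42)

def simulate(n_years, phi=0.5, cross=0.3, noise_sd=1.0):
    """Two coupled drivers, e.g. an ENSO-like and an NAO-like index."""
    x, y = 0.0, 0.0
    series = []
    for _ in range(n_years):
        x = phi * x + cross * y + random.gauss(0.0, noise_sd)
        y = phi * y + cross * x + random.gauss(0.0, noise_sd)
        series.append((x, y))
    return series

sims = simulate(5000)

# Sample covariance between the drivers should be clearly positive
xs = [s[0] for s in sims]
ys = [s[1] for s in sims]
n = len(sims)
mx, my = sum(xs) / n, sum(ys) / n
cov = sum((a - mx) * (b - my) for a, b in sims) / n
print(cov > 0)
```

Hazard regions conditioned on such correlated drivers inherit the dependence, which is exactly what breaks the baseline assumption that perils are independent.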

  6. Kauai Test Facility hazards assessment document

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Swihart, A

    1995-05-01

    The Department of Energy Order 5500.3A requires that facility-specific hazards assessments be prepared, maintained, and used for emergency planning purposes. This hazards assessment document describes the chemical and radiological hazards associated with the Kauai Test Facility, Barking Sands, Kauai, Hawaii. The Kauai Test Facility's chemical and radiological inventories were screened according to potential airborne impact to onsite and offsite individuals. The air dispersion model, ALOHA, estimated pollutant concentrations downwind from the source of a release, taking into consideration the toxicological and physical characteristics of the release site, the atmospheric conditions, and the circumstances of the release. The greatest distance to the Early Severe Health Effects threshold is 4.2 kilometers. The highest emergency classification is a General Emergency at the "Main Complex" and a Site Area Emergency at the Kokole Point Launch Site. The Emergency Planning Zone for the "Main Complex" is 5 kilometers. The Emergency Planning Zone for the Kokole Point Launch Site is the Pacific Missile Range Facility's site boundary.
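
    ALOHA is a specific EPA/NOAA dispersion package; purely as an illustration of the kind of screening calculation involved, the sketch below computes a textbook ground-level Gaussian-plume centerline concentration. The function name and all inputs are hypothetical and are not taken from the hazards assessment document.

```python
import math

def plume_centerline_conc(q_g_s, u_m_s, sigma_y, sigma_z, h_m=0.0):
    """Ground-level centerline concentration (g/m^3) of a continuous point
    release: q_g_s = emission rate (g/s), u_m_s = wind speed (m/s),
    sigma_y / sigma_z = horizontal / vertical dispersion parameters (m)
    at the downwind distance of interest, h_m = effective release height (m)."""
    return (q_g_s / (math.pi * u_m_s * sigma_y * sigma_z)
            * math.exp(-h_m ** 2 / (2.0 * sigma_z ** 2)))
```

    Comparing such concentrations against health-effect thresholds downwind is what yields screening distances like the 4.2 km figure quoted above.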

  7. The application of quantitative risk assessment to microbial food safety risks.

    PubMed

    Jaykus, L A

    1996-01-01

    Regulatory programs and guidelines for the control of foodborne microbial agents have existed in the U.S. for nearly 100 years. However, increased awareness of the scope and magnitude of foodborne disease, as well as the emergence of previously unrecognized human pathogens transmitted via the foodborne route, have prompted regulatory officials to consider new and improved strategies to reduce the health risks associated with pathogenic microorganisms in foods. Implementation of these proposed strategies will involve definitive costs for a finite level of risk reduction. While regulatory decisions regarding the management of foodborne disease risk have traditionally been made with the aid of the scientific community, a formal conceptual framework for the evaluation of health risks from pathogenic microorganisms in foods is warranted. Quantitative risk assessment (QRA), which is formally defined as the technical assessment of the nature and magnitude of a risk caused by a hazard, provides such a framework. Reproducing microorganisms in foods present a particular challenge to QRA because both their introduction and numbers may be affected by numerous factors within the food chain, with all of these factors representing significant stages in food production, handling, and consumption, in a farm-to-table type of approach. The process of QRA entails four designated phases: (1) hazard identification, (2) exposure assessment, (3) dose-response assessment, and (4) risk characterization. Specific analytical tools are available to accomplish the analyses required for each phase of the QRA. The purpose of this paper is to provide a description of the conceptual framework for quantitative microbial risk assessment within the standard description provided by the National Academy of Sciences (NAS) paradigm.
Each of the sequential steps in QRA are discussed in detail, providing information on current applications, tools for conducting the analyses, and methodological and/or data
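
    The dose-response phase of QRA is often carried out with simple parametric models. The sketch below shows two standard forms from the microbial risk literature; the parameter values in the example calls are illustrative placeholders, not pathogen-specific fits.

```python
import math

def p_infection_exponential(dose, r):
    """Exponential dose-response: P = 1 - exp(-r * dose), where r is a
    pathogen-specific single-organism infection probability."""
    return 1.0 - math.exp(-r * dose)

def p_infection_beta_poisson(dose, alpha, beta):
    """Approximate beta-Poisson dose-response:
    P = 1 - (1 + dose / beta) ** (-alpha)."""
    return 1.0 - (1.0 + dose / beta) ** (-alpha)

# Illustrative exposure of 100 organisms with made-up parameters:
p_exp = p_infection_exponential(100.0, r=0.005)
p_bp = p_infection_beta_poisson(100.0, alpha=0.25, beta=40.0)
```

    In a full QRA, the dose fed into these curves comes from the exposure-assessment phase, and the resulting probabilities feed the risk-characterization phase.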

  8. Hazard assessment of selenium to endangered razorback suckers (Xyrauchen texanus)

    USGS Publications Warehouse

    Hamilton, S.J.; Holley, K.M.; Buhl, K.J.

    2002-01-01

    A hazard assessment was conducted based on information derived from two reproduction studies conducted with endangered razorback suckers (Xyrauchen texanus) at three sites near Grand Junction, CO, USA. Selenium contamination of the upper and lower Colorado River basin has been documented in water, sediment, and biota in studies by US Department of the Interior agencies and academia. Concern has been raised that this selenium contamination may be adversely affecting endangered fish in the upper Colorado River basin. The reproduction studies with razorback suckers revealed that adults readily accumulated selenium in various tissues including eggs, and that 4.6 μg/g of selenium in food organisms caused increased mortality of larvae. The selenium hazard assessment protocol resulted in a moderate hazard at the Horsethief site and high hazards at the Adobe Creek and North Pond sites. The selenium hazard assessment was considered conservative because an on-site toxicity test with razorback sucker larvae using 4.6 μg/g selenium in zooplankton caused nearly complete mortality, in spite of the moderate hazard at Horsethief. Using the margin of uncertainty ratio also suggested a high hazard for effects on razorback suckers from selenium exposure. Both assessment approaches suggested that selenium in the upper Colorado River basin adversely affects the reproductive success of razorback suckers.

  9. Hazard assessment of selenium to endangered razorback suckers (Xyrauchen texanus).

    PubMed

    Hamilton, Steven J; Holley, Kathleen M; Buhl, Kevin J

    2002-05-27

    A hazard assessment was conducted based on information derived from two reproduction studies conducted with endangered razorback suckers (Xyrauchen texanus) at three sites near Grand Junction, CO, USA. Selenium contamination of the upper and lower Colorado River basin has been documented in water, sediment, and biota in studies by US Department of the Interior agencies and academia. Concern has been raised that this selenium contamination may be adversely affecting endangered fish in the upper Colorado River basin. The reproduction studies with razorback suckers revealed that adults readily accumulated selenium in various tissues including eggs, and that 4.6 microg/g of selenium in food organisms caused increased mortality of larvae. The selenium hazard assessment protocol resulted in a moderate hazard at the Horsethief site and high hazards at the Adobe Creek and North Pond sites. The selenium hazard assessment was considered conservative because an on-site toxicity test with razorback sucker larvae using 4.6 microg/g selenium in zooplankton caused nearly complete mortality, in spite of the moderate hazard at Horsethief. Using the margin of uncertainty ratio also suggested a high hazard for effects on razorback suckers from selenium exposure. Both assessment approaches suggested that selenium in the upper Colorado River basin adversely affects the reproductive success of razorback suckers.

  10. Toward uniform probabilistic seismic hazard assessments for Southeast Asia

    NASA Astrophysics Data System (ADS)

    Chan, C. H.; Wang, Y.; Shi, X.; Ornthammarath, T.; Warnitchai, P.; Kosuwan, S.; Thant, M.; Nguyen, P. H.; Nguyen, L. M.; Solidum, R., Jr.; Irsyam, M.; Hidayati, S.; Sieh, K.

    2017-12-01

    Although most Southeast Asian countries have seismic hazard maps, differing methodologies and quality result in appreciable mismatches at national boundaries. We aim to conduct a uniform assessment across the region through standardized earthquake and fault databases, ground-shaking scenarios, and regional hazard maps. Our earthquake database contains earthquake parameters obtained from global and national seismic networks, harmonized by removal of duplicate events and the use of moment magnitude. Our active-fault database includes fault parameters from previous studies and from the databases implemented for national seismic hazard maps. Another crucial input for seismic hazard assessment is proper evaluation of ground-shaking attenuation. Since few ground-motion prediction equations (GMPEs) have used local observations from this region, we evaluated attenuation by comparing instrumental observations and felt intensities for recent earthquakes with the ground shaking predicted by published GMPEs. We then incorporate the best-fitting GMPEs and site conditions into our seismic hazard assessments. Based on this database and the selected GMPEs, we have constructed regional probabilistic seismic hazard maps. The assessment shows the highest seismic hazard levels near faults with high slip rates, including the Sagaing Fault in central Myanmar, the Sumatran Fault in Sumatra, the Palu-Koro, Matano and Lawanopo Faults in Sulawesi, and the Philippine Fault across several islands of the Philippines. In addition, our assessment demonstrates that regions with low earthquake probability may well have a higher aggregate probability of future earthquakes, since they encompass much larger areas than the areas of high probability. The significant irony, then, is that in areas of low to moderate probability, where building codes usually provide less seismic resilience, seismic risk is likely to be greater. Infrastructural damage in East Malaysia during the 2015
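
    The GMPE-ranking step described above, comparing observed shaking with model predictions, can be sketched with a generic attenuation form ln(PGA) = a + b*M - c*ln(R + d). The coefficients below are hypothetical placeholders, not any published GMPE.

```python
import math

def predicted_ln_pga(magnitude, distance_km, a=-3.5, b=1.0, c=1.5, d=10.0):
    """Generic attenuation form: ln(PGA) = a + b*M - c*ln(R + d)."""
    return a + b * magnitude - c * math.log(distance_km + d)

def log_residual(observed_pga, magnitude, distance_km, **coeffs):
    """Observed-minus-predicted log residual; a smaller spread of residuals
    across many records indicates a better-fitting candidate GMPE."""
    return math.log(observed_pga) - predicted_ln_pga(magnitude, distance_km, **coeffs)
```

    Ranking candidate GMPEs by their residual statistics against regional observations is one common way to choose the "best-fitting" equations mentioned above.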

  11. Maritime Tsunami Hazard Assessment in California

    NASA Astrophysics Data System (ADS)

    Lynett, P. J.; Borrero, J. C.; Wilson, R. I.; Miller, K. M.

    2012-12-01

    The California tsunami program, in cooperation with NOAA and FEMA, has begun implementing a plan to increase awareness of tsunami-generated hazards to the maritime community (both ships and harbor infrastructure) through the development of in-harbor hazard maps, offshore safety zones for boater evacuation, and associated guidance for harbors and marinas before, during and following tsunamis. The hope is that the maritime guidance and associated education and outreach program will help save lives and reduce damage to boats and harbor infrastructure. An important step in this process is to understand the causative mechanism for damage in ports and harbors, and then ensure that the models used to generate hazard maps are able to accurately simulate these processes. Findings will be used to develop maps, guidance documents, and consistent policy recommendations for emergency managers and port authorities and provide information critical to real-time decisions required when responding to tsunami alert notifications. Basin resonance and geometric amplification are two reasonably well-understood mechanisms for local magnification of tsunami impact in harbors, and are generally the mechanisms investigated when estimating the tsunami hazard potential in a port or harbor. On the other hand, our understanding of and predictive ability for currents is lacking. When a free-surface flow is forced through a geometric constriction, it is readily expected that the enhanced potential gradient will drive strong, possibly unstable currents and the associated turbulent coherent structures such as "jets" and "whirlpools"; a simple example would be tidal flow through an inlet channel. However, these fundamentals have not been quantitatively connected with respect to understanding tsunami hazards in ports and harbors.
A plausible explanation for this oversight is the observation that these features are turbulent phenomena with spatial and temporal scales much smaller than that

  12. Qualitative and Quantitative Assessment of Naturals Hazards in the Caldera of Mount Bambouto (West Cameroon)

    NASA Astrophysics Data System (ADS)

    Zangmo Tefogoum, G.; Kagou Dongmo, A.; Nkouathio, D. G.; Wandji, P.

    2009-04-01

    Mount Bambouto is a polygenic stratovolcano of the Cameroon Volcanic Line, built between 21 Ma and 4.5 Ma (Nkouathio et al., 2008). It is situated about 200 km NE of Mount Cameroon, between 09°55' and 10°15' East and between 05°25' and 05°50' North. The volcano covers an area of 500 km², culminates at 2740 m at Meletan hill, and bears a collapse caldera (13 x 8 km). Fissural, extrusive and explosive dynamism are responsible for the construction of this volcano in three main stages, including the edification of a large summit caldera. The structure of Mount Bambouto gives rise to different natural hazards of volcanological and meteorological origin. In the past, landslides, floods, bush fires and block collapses took place in this area with catastrophic impacts on the population. A new research program has been carried out in the caldera concerning the qualitative and quantitative evaluation of natural risks and catastrophes. The main factors of instability are rain, structure of the basement, slopes, lithology and anthropic activities; in particular, the occurrence of exceptional rainfall due to global change is relevant. This makes it possible to draw a landslide hazard zonation map of the Bambouto caldera, landslides being the main risk in this area. We evaluated the financial potential of the caldera based on the average income from breeding and farming, school fees, and the cost of houses and equipment for each family. The calculation revealed that the yearly economy of the Mount Bambouto caldera represents about 2 billion FCFA. Some recommendations have been made in order to prevent and reduce the potential losses and the number of victims, in particular through better land-use planning. These help us to estimate the importance of destruction of the environment and biodiversity in case of catastrophe. We conclude that in the Bambouto caldera there is a moderate to high probability that destructive phenomena due to landslides will occur within the upcoming years with enormous

  13. USGS Training in Afghanistan: Modern Earthquake Hazards Assessments

    NASA Astrophysics Data System (ADS)

    Medlin, J. D.; Garthwaite, M.; Holzer, T.; McGarr, A.; Bohannon, R.; Bergen, K.; Vincent, T.

    2007-05-01

    Afghanistan is located in a tectonically active region where ongoing deformation has generated rugged mountainous terrain, and where large earthquakes occur frequently. These earthquakes can present a significant hazard, not only from strong ground shaking, but also from liquefaction and extensive land sliding. The magnitude 6.1 earthquake of March 25, 2002 highlighted the vulnerability of Afghanistan to such hazards, and resulted in over 1000 fatalities. The USGS has provided the first of a series of Earth Science training courses to the Afghan Geological Survey (AGS). This course was concerned with modern earthquake hazard assessments, and is an integral part of a larger USGS effort to provide a comprehensive seismic-hazard assessment for Afghanistan. Funding for these courses is provided by the US Agency for International Development Afghanistan Reconstruction Program. The particular focus of this training course, held December 2-6, 2006 in Kabul, was on providing a background in the seismological and geological methods relevant to preparing for future earthquakes. Topics included identifying active faults, modern tectonic theory, geotechnical measurements of near-surface materials, and strong-motion seismology. With this background, participants may now be expected to educate other members of the community and be actively involved in earthquake hazard assessments themselves. The December, 2006, training course was taught by four lecturers, with all lectures and slides being presented in English and translated into Dari. Copies of the lectures were provided to the students in both hardcopy and digital formats. Class participants included many of the section leaders from within the AGS who have backgrounds in geology, geophysics, and engineering. Two additional training sessions are planned for 2007, the first entitled "Modern Concepts in Geology and Mineral Resource Assessments," and the second entitled "Applied Geophysics for Mineral Resource Assessments."

  14. Assessment of social vulnerability to natural hazards in Nepal

    NASA Astrophysics Data System (ADS)

    Gautam, Dipendra

    2017-12-01

    This paper investigates district-wide social vulnerability to natural hazards in Nepal. Disasters such as earthquakes, floods, landslides, epidemics, and droughts are common in Nepal. Every year thousands of people are killed and huge economic and environmental losses occur in Nepal due to various natural hazards. Although natural hazards are well recognized, quantitative and qualitative social vulnerability mapping has not existed until now in Nepal. This study aims to quantify social vulnerability on a local scale, considering all 75 districts using the available census. To perform district-level vulnerability mapping, 13 variables were selected and aggregated indexes were plotted in an ArcGIS environment. The results show that only 4 districts in Nepal have a very low social vulnerability index, whereas 46 districts (61 %) are at moderate to high social vulnerability levels. Vulnerability mapping highlights the immediate need for decentralized frameworks to tackle natural hazards at the district level; additionally, the results of this study can contribute to preparedness, planning and resource management, inter-district coordination, contingency planning, and public awareness efforts.
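
    The aggregation step described above, combining normalized district-level variables into a single index, can be sketched as follows. The equal weighting, min-max normalization, and variable names are assumptions for illustration; the paper's exact 13-variable weighting scheme is not reproduced here.

```python
def min_max_normalize(values):
    """Rescale a list of raw values to the [0, 1] range."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) if hi > lo else 0.0 for v in values]

def vulnerability_index(variables):
    """variables: dict mapping variable name -> list of raw values, one per
    district (all oriented so higher = more vulnerable).
    Returns an equal-weight aggregated index per district."""
    cols = [min_max_normalize(col) for col in variables.values()]
    n_districts = len(cols[0])
    return [sum(col[i] for col in cols) / len(cols) for i in range(n_districts)]

# Hypothetical two-variable, three-district example:
idx = vulnerability_index({"illiteracy_rate": [10.0, 30.0, 50.0],
                           "population_density": [100.0, 300.0, 500.0]})
```

    Districts can then be binned from "very low" to "high" vulnerability by thresholding the aggregated index, as in the district counts reported above.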

  15. Assessing hazards along our Nation's coasts

    USGS Publications Warehouse

    Hapke, Cheryl J.; Brenner, Owen; Henderson, Rachel E.; Reynolds, B.J.

    2013-01-01

    Coastal areas are essential to the economic, cultural, and environmental health of the Nation, yet by nature coastal areas are constantly changing due to a variety of events and processes. Extreme storms can cause dramatic changes to our shorelines in a matter of hours, while sea-level rise can profoundly alter coastal environments over decades. These changes can have a devastating impact on coastal communities, such as the loss of homes built on retreating sea cliffs or protective dunes eroded by storm waves. Sometimes, however, the changes can be positive, such as new habitat created by storm deposits. The U.S. Geological Survey (USGS) is meeting the need for scientific understanding of how our coasts respond to different hazards with continued assessments of current and future changes along U.S. coastlines. Through the National Assessment of Coastal Change Hazards (NACCH), the USGS carries out the unique task of quantifying coastal change hazards along open-ocean coasts in the United States and its territories. Residents of coastal communities, emergency managers, and other stakeholders can use science-based data, tools, models, and other products to improve planning and enhance resilience.

  16. Improving the Linkages between Air Pollution Epidemiology and Quantitative Risk Assessment

    PubMed Central

    Bell, Michelle L.; Walker, Katy; Hubbell, Bryan

    2011-01-01

    Background: Air pollution epidemiology plays an integral role in both identifying the hazards of air pollution as well as supplying the risk coefficients that are used in quantitative risk assessments. Evidence from both epidemiology and risk assessments has historically supported critical environmental policy decisions. The extent to which risk assessors can properly specify a quantitative risk assessment and characterize key sources of uncertainty depends in part on the availability, and clarity, of data and assumptions in the epidemiological studies. Objectives: We discuss the interests shared by air pollution epidemiology and risk assessment communities in ensuring that the findings of epidemiological studies are appropriately characterized and applied correctly in risk assessments. We highlight the key input parameters for risk assessments and consider how modest changes in the characterization of these data might enable more accurate risk assessments that better represent the findings of epidemiological studies. Discussion: We argue that more complete information regarding the methodological choices and input data used in epidemiological studies would support more accurate risk assessments—to the benefit of both disciplines. In particular, we suggest including additional details regarding air quality, demographic, and health data, as well as certain types of data-rich graphics. Conclusions: Relatively modest changes to the data reported in epidemiological studies will improve the quality of risk assessments and help prevent the misinterpretation and mischaracterization of the results of epidemiological studies. Such changes may also benefit epidemiologists undertaking meta-analyses. We suggest workshops as a way to improve the dialogue between the two communities. PMID:21816702

  17. Hazard assessment of hydraulic fracturing chemicals using an indexing method.

    PubMed

    Hu, Guangji; Liu, Tianyi; Hager, James; Hewage, Kasun; Sadiq, Rehan

    2018-04-01

    The rapid expansion of unconventional natural gas production has triggered considerable public concerns, particularly regarding environmental and human health (EHH) risks posed by various chemical additives used in hydraulic fracturing (HF) operations. There is a need to assess the potential EHH hazards of additives used in real-world HF operations. In this study, HF additive and fracturing fluid data was acquired, and EHH hazards were assessed using an indexing approach. The indexing system analyzed chemical toxicological data of different ingredients contained within additives and produced an aggregated EHH safety index for each additive, along with an indicator describing the completeness of the chemical toxicological data. The results show that commonly used additives are generally associated with medium-level EHH hazards. In each additive category, ingredients of high EHH concern were identified, and the high hazard designation was primarily attributed to ingredients' high aquatic toxicity and carcinogenic effects. Among all assessed additive categories, iron control agents were identified as the greatest EHH hazards. Lack of information, such as undisclosed ingredients and chemical toxicological data gaps, has resulted in different levels of assessment uncertainties. In particular, friction reducers show the highest data incompleteness with regards to EHH hazards. This study reveals the potential EHH hazards associated with chemicals used in current HF field operations and can provide decision makers with valuable information to facilitate sustainable and responsible unconventional gas production. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. Models of volcanic eruption hazards

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wohletz, K.H.

    1992-01-01

    Volcanic eruptions pose an ever present but poorly constrained hazard to life and property for geothermal installations in volcanic areas. Because eruptions occur sporadically and may limit field access, quantitative and systematic field studies of eruptions are difficult to complete. Circumventing this difficulty, laboratory models and numerical simulations are pivotal in building our understanding of eruptions. For example, the results of fuel-coolant interaction experiments show that magma-water interaction controls many eruption styles. Applying these results, increasing numbers of field studies now document and interpret the role of external water in eruptions. Similarly, numerical simulations solve the fundamental physics of high-speed fluid flow and give quantitative predictions that elucidate the complexities of pyroclastic flows and surges. A primary goal of these models is to guide geologists in searching for critical field relationships and making their interpretations. Coupled with field work, modeling is beginning to allow more quantitative and predictive volcanic hazard assessments.

  19. Models of volcanic eruption hazards

    NASA Astrophysics Data System (ADS)

    Wohletz, K. H.

    Volcanic eruptions pose an ever present but poorly constrained hazard to life and property for geothermal installations in volcanic areas. Because eruptions occur sporadically and may limit field access, quantitative and systematic field studies of eruptions are difficult to complete. Circumventing this difficulty, laboratory models and numerical simulations are pivotal in building our understanding of eruptions. For example, the results of fuel-coolant interaction experiments show that magma-water interaction controls many eruption styles. Applying these results, increasing numbers of field studies now document and interpret the role of external water in eruptions. Similarly, numerical simulations solve the fundamental physics of high-speed fluid flow and give quantitative predictions that elucidate the complexities of pyroclastic flows and surges. A primary goal of these models is to guide geologists in searching for critical field relationships and making their interpretations. Coupled with field work, modeling is beginning to allow more quantitative and predictive volcanic hazard assessments.

  20. [Application of three risk assessment models in occupational health risk assessment of dimethylformamide].

    PubMed

    Wu, Z J; Xu, B; Jiang, H; Zheng, M; Zhang, M; Zhao, W J; Cheng, J

    2016-08-20

    Objective: To investigate the application of the United States Environmental Protection Agency (EPA) inhalation risk assessment model, the Singapore semi-quantitative risk assessment model, and the occupational hazards risk assessment index method to occupational health risk in enterprises using dimethylformamide (DMF) in an area of Jiangsu, China, and to put forward related risk control measures. Methods: The industries involving DMF exposure in Jiangsu province were chosen as the evaluation objects in 2013 and three risk assessment models were used in the evaluation. EPA inhalation risk assessment model: HQ = EC/RfC; Singapore semi-quantitative risk assessment model: Risk = (HR × ER)^(1/2); occupational hazards risk assessment index: Index = 2^(health effect level) × 2^(exposure ratio) × operation condition level. Results: The hazard quotients (HQ > 1) from the EPA inhalation risk assessment model suggested that all the workshops (dry method, wet method and printing) and work positions (pasting, burdening, unreeling, rolling, assisting) were high risk. The Singapore semi-quantitative risk assessment model indicated that the workshop risk levels of the dry method, wet method and printing were 3.5 (high), 3.5 (high) and 2.8 (general), and the position risk levels of pasting, burdening, unreeling, rolling and assisting were 4 (high), 4 (high), 2.8 (general), 2.8 (general) and 2.8 (general). The occupational hazards risk assessment index method gave position risk indexes for pasting, burdening, unreeling, rolling and assisting of 42 (high), 33 (high), 23 (middle), 21 (middle) and 22 (middle). The results of the Singapore semi-quantitative risk assessment model and the occupational hazards risk assessment index method were similar, while the EPA inhalation risk assessment model indicated that all the workshops and positions were high risk. Conclusion: The occupational hazards risk assessment index method fully considers health effects, exposure, and operating conditions
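
    The three indices quoted in this record (HQ = EC/RfC; Risk = (HR × ER)^(1/2); Index = 2^(health effect level) × 2^(exposure ratio) × operation condition level) can be expressed directly. The example inputs in the docstrings are illustrative placeholders, not measurements from the study.

```python
import math

def epa_hazard_quotient(ec, rfc):
    """EPA inhalation model: HQ = EC / RfC; HQ > 1 flags high risk."""
    return ec / rfc

def singapore_risk(hazard_rating, exposure_rating):
    """Singapore semi-quantitative model: Risk = sqrt(HR * ER)."""
    return math.sqrt(hazard_rating * exposure_rating)

def occupational_risk_index(health_effect_level, exposure_ratio,
                            operation_condition_level):
    """Index = 2**health_effect_level * 2**exposure_ratio * operation level."""
    return (2 ** health_effect_level) * (2 ** exposure_ratio) * operation_condition_level
```

    Each model then maps its numeric output onto an ordinal band (e.g. general / high, or low / middle / high) using the thresholds defined by the respective methodology.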

  1. Probabilistic Seismic Hazard Assessment for Northeast India Region

    NASA Astrophysics Data System (ADS)

    Das, Ranjit; Sharma, M. L.; Wason, H. R.

    2016-08-01

    Northeast India, bounded by latitudes 20°-30°N and longitudes 87°-98°E, is one of the most seismically active areas in the world. This region has experienced several moderate-to-large-sized earthquakes, including the 12 June, 1897 Shillong earthquake (Mw 8.1) and the 15 August, 1950 Assam earthquake (Mw 8.7), which caused loss of human lives and significant damage to buildings, highlighting the importance of seismic hazard assessment for the region. Probabilistic seismic hazard assessment of the region has been carried out using a unified moment magnitude catalog prepared by an improved General Orthogonal Regression methodology (Geophys J Int, 190:1091-1096, 2012; Probabilistic seismic hazard assessment of Northeast India region, Ph.D. Thesis, Department of Earthquake Engineering, IIT Roorkee, Roorkee, 2013) with events compiled from various databases (ISC, NEIC, GCMT, IMD) and other available catalogs. The study area has been subdivided into nine seismogenic source zones to account for local variation in tectonics and seismicity characteristics. The seismicity parameters are estimated for each of these source zones, which are input variables into seismic hazard estimation of a region. The seismic hazard analysis of the study region has been performed by dividing the area into grids of size 0.1° × 0.1°. Peak ground acceleration (PGA) and spectral acceleration (Sa) values (for periods of 0.2 and 1 s) have been evaluated at bedrock level corresponding to probabilities of exceedance (PE) of 50, 20, 10, 2 and 0.5 % in 50 years. These exceedance values correspond to return periods of 100, 225, 475, 2475, and 10,000 years, respectively. The seismic hazard maps have been prepared at the bedrock level, and it is observed that the seismic hazard estimates show a significant local variation in contrast to the uniform hazard value suggested by the Indian standard seismic code [Indian standard, criteria for earthquake-resistant design of structures, fifth edition, Part
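
    The correspondence between probability of exceedance (PE) and return period quoted above follows from a Poisson occurrence assumption, T = -t / ln(1 - PE). This is a standard identity in probabilistic seismic hazard analysis, not something specific to this study.

```python
import math

def return_period(pe, t_years=50.0):
    """Mean return period (years) for exceedance probability pe over t_years,
    assuming Poisson occurrence: T = -t / ln(1 - pe)."""
    return -t_years / math.log(1.0 - pe)

# 10% in 50 years -> ~475 years; 2% in 50 years -> ~2475 years.
# (Some quoted levels are rounded conventions: for 50% in 50 years the
# formula gives ~72 years, though ~100 years is often cited.)
```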

  2. Probabilistic Tsunami Hazard Assessment: the Seaside, Oregon Pilot Study

    NASA Astrophysics Data System (ADS)

    Gonzalez, F. I.; Geist, E. L.; Synolakis, C.; Titov, V. V.

    2004-12-01

    A pilot study of Seaside, Oregon is underway, to develop methodologies for probabilistic tsunami hazard assessments that can be incorporated into Flood Insurance Rate Maps (FIRMs) developed by FEMA's National Flood Insurance Program (NFIP). Current NFIP guidelines for tsunami hazard assessment rely on the science, technology and methodologies developed in the 1970s; although generally regarded as groundbreaking and state-of-the-art for its time, this approach is now superseded by modern methods that reflect substantial advances in tsunami research achieved in the last two decades. In particular, post-1990 technical advances include: improvements in tsunami source specification; improved tsunami inundation models; better computational grids by virtue of improved bathymetric and topographic databases; a larger database of long-term paleoseismic and paleotsunami records and short-term, historical earthquake and tsunami records that can be exploited to develop improved probabilistic methodologies; better understanding of earthquake recurrence and probability models. The NOAA-led U.S. National Tsunami Hazard Mitigation Program (NTHMP), in partnership with FEMA, USGS, NSF and Emergency Management and Geotechnical agencies of the five Pacific States, incorporates these advances into site-specific tsunami hazard assessments for coastal communities in Alaska, California, Hawaii, Oregon and Washington. NTHMP hazard assessment efforts currently focus on developing deterministic, "credible worst-case" scenarios that provide valuable guidance for hazard mitigation and emergency management. The NFIP focus, on the other hand, is on actuarial needs that require probabilistic hazard assessments such as those that characterize 100- and 500-year flooding events. There are clearly overlaps in NFIP and NTHMP objectives. NTHMP worst-case scenario assessments that include an estimated probability of occurrence could benefit the NFIP; NFIP probabilistic assessments of 100- and 500-yr

  3. Are seismic hazard assessment errors and earthquake surprises unavoidable?

    NASA Astrophysics Data System (ADS)

    Kossobokov, Vladimir

    2013-04-01

    Why do earthquake occurrences bring us so many surprises? The answer seems evident if we review the relationships that are commonly used to assess seismic hazard. The time-span of physically reliable Seismic History is yet a small portion of a rupture recurrence cycle at an earthquake-prone site, which makes premature any kind of reliable probabilistic statement about narrowly localized seismic hazard. Moreover, seismic evidence accumulated to date demonstrates clearly that most of the empirical relations commonly accepted in the early history of instrumental seismology can be proved erroneous when tests of statistical significance are applied. Seismic events, including mega-earthquakes, cluster, displaying behaviors that are far from independent or periodic. Their distribution in space is possibly fractal and definitely far from uniform, even in a single segment of a fault zone. Such a situation contradicts the generally accepted assumptions used for analytically tractable models or computer simulations and complicates the design of reliable methodologies for realistic earthquake hazard assessment, as well as the search for and definition of precursory behaviors to be used for forecast/prediction purposes. As a result, the conclusions drawn from such simulations and analyses can LEAD TO SCIENTIFICALLY GROUNDLESS APPLICATIONS, which is unwise and extremely dangerous in assessing expected societal risks and losses. For example, a systematic comparison of the GSHAP peak ground acceleration estimates with those related to actual strong earthquakes, unfortunately, discloses the gross inadequacy of this "probabilistic" product, which appears UNACCEPTABLE FOR ANY KIND OF RESPONSIBLE SEISMIC RISK EVALUATION AND KNOWLEDGEABLE DISASTER PREVENTION.
    The self-evident shortcomings and failures of GSHAP appeal to all earthquake scientists and engineers for an urgent revision of the global seismic hazard maps from first principles, including the background methodologies involved, such that there becomes: (a) a

  4. Urban Heat Wave Hazard Assessment

    NASA Astrophysics Data System (ADS)

    Quattrochi, D. A.; Jedlovec, G.; Crane, D. L.; Meyer, P. J.; LaFontaine, F.

    2016-12-01

    Heat waves are one of the largest causes of environmentally-related deaths globally and are likely to become more numerous as a result of climate change. The intensification of heat waves by the urban heat island effect and elevated humidity, combined with urban demographics, are key elements leading to these disasters. Better warning of the potential hazards may help lower risks associated with heat waves. Moderate resolution thermal data from NASA satellites are used to derive high-spatial-resolution estimates of apparent temperature (heat index) over urban regions. These data, combined with demographic data, are used to produce a daily heat hazard/risk map for selected cities. MODIS data are used to derive daily composite maximum and minimum land surface temperature (LST) fields to represent the amplitude of the diurnal temperature cycle and to identify extreme heat days. Compositing routines are used to generate representative daily maximum and minimum LSTs for the urban environment. The limited effect of relative humidity on the apparent temperature (typically 10-15%) allows the use of modeled moisture fields to convert LST to apparent temperature without loss of spatial variability. The daily max/min apparent temperature fields are used to identify abnormally extreme heat days relative to climatological values in order to produce a heat wave hazard map. Reference to climatological values normalizes the hazard for a particular region (e.g., the impact of an extreme heat day). A heat wave hazard map has been produced for several case study periods and then computed on a quasi-operational basis during the summer of 2016 for Atlanta, GA, Chicago, IL, St. Louis, MO, and Huntsville, AL. A hazard does not become a risk until someone or something is exposed to that hazard at a level that might do harm. Demographic information is used to assess the urban risk associated with the heat wave hazard. Collectively, the heat wave hazard product can warn people in urban
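    The conversion step described above (temperature plus humidity to apparent temperature) can be sketched with the standard NWS Rothfusz regression; the abstract does not say which heat-index formulation the authors use, so this is an illustrative assumption:

```python
def heat_index_f(temp_f, rh_pct):
    """Apparent temperature (heat index, deg F) via the NWS Rothfusz
    regression; valid roughly for temp_f >= 80 F and rh_pct >= 40%."""
    t, r = temp_f, rh_pct
    return (-42.379 + 2.04901523 * t + 10.14333127 * r
            - 0.22475541 * t * r - 6.83783e-3 * t * t
            - 5.481717e-2 * r * r + 1.22874e-3 * t * t * r
            + 8.5282e-4 * t * r * r - 1.99e-6 * t * t * r * r)

# A hot, humid day: 90 F at 70% RH feels like roughly 105 F.
hi = heat_index_f(90.0, 70.0)
```

    Applied to a composited maximum-LST field, the same formula would be evaluated per grid cell against a modeled relative-humidity field.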

  5. Development of new geomagnetic storm ground response scaling factors for utilization in hazard assessments

    NASA Astrophysics Data System (ADS)

    Pulkkinen, A. A.; Bernabeu, E.; Weigel, R. S.; Kelbert, A.; Rigler, E. J.; Bedrosian, P.; Love, J. J.

    2017-12-01

    Development of realistic storm scenarios that can be played through the exposed systems is one of the key requirements for carrying out quantitative space weather hazards assessments. In the geomagnetically induced currents (GIC) and power grids context, these scenarios have to quantify the spatiotemporal evolution of the geoelectric field that drives the potentially hazardous currents in the system. In response to the Federal Energy Regulatory Commission (FERC) order 779, a team of scientists and engineers working under the auspices of the North American Electric Reliability Corporation (NERC) has developed extreme geomagnetic storm and geoelectric field benchmark(s) that use various scaling factors to account for the geomagnetic latitude and ground structure of the locations of interest. These benchmarks, together with the information generated in the National Space Weather Action Plan, are the foundation for the hazards assessments that the industry will be carrying out in response to the FERC order and under the auspices of the National Science and Technology Council. While the scaling factors developed in past work were based on the best available information, significant new information is now available for parts of the U.S. pertaining to the ground response to external geomagnetic field excitation. This new information includes the results of magnetotelluric surveys conducted over the past few years across the contiguous U.S. and results from previous surveys that have been made available in a combined online database. In this paper, we distill this new information in the framework of the NERC benchmark and in terms of updated ground response scaling factors, thereby allowing straightforward utilization in hazard assessments. We also outline the path forward for improving the overall extreme event benchmark scenario(s), including generalization of the storm waveforms and geoelectric field spatial patterns.

  6. Risk assessment of debris flow hazards in natural slope

    NASA Astrophysics Data System (ADS)

    Choi, Junghae; Chae, Byung-gon; Liu, Kofei; Wu, Yinghsin

    2016-04-01

    The study area is located in the north-eastern part of South Korea. According to the landslide susceptibility map (KIGAM, 2009) from the Korea Institute of Geoscience and Mineral Resources (KIGAM), there are large areas with a high probability of landslides on the mountain slopes near the study area. In addition, several severe landslide-induced debris flow hazards have recently occurred in this area, so the site is considered prone to debris flow hazards. To mitigate the influence of these hazards, assessment of the potential debris flow hazards is essential. In this assessment, we use Debris-2D, a debris flow numerical program, to assess the potential debris flow hazards. The worst-case scenario is considered for simulation. The input mass sources are determined using the landslide susceptibility map, and the water input is based on the daily accumulated rainfall of the past debris flow event in the study area. The only input material property, the yield stress, is obtained through a calibration test. The simulation results show that the study area has the potential to be impacted by debris flow. Therefore, based on the simulation results, we can propose countermeasures to mitigate debris flow hazards, including building check dams, constructing a protection wall in the study area, and installing instruments for active monitoring of debris flows. Acknowledgements: This research was supported by the Public Welfare & Safety Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Science, ICT & Future Planning (NRF-2012M3A2A1050983)

  7. Rockfall Hazard Process Assessment : Implementation Report

    DOT National Transportation Integrated Search

    2017-10-01

    The Montana Department of Transportation (MDT) commissioned a new research program to improve assessment and management of its rock slope assets. The Department implemented a Rockfall Hazard Rating System (RHRS) program in 2005 and wished to add valu...

  8. Wicked Problems in Natural Hazard Assessment and Mitigation

    NASA Astrophysics Data System (ADS)

    Stein, S.; Steckler, M. S.; Rundle, J. B.; Dixon, T. H.

    2017-12-01

    Social scientists have defined "wicked" problems that are "messy, ill-defined, more complex than we fully grasp, and open to multiple interpretations based on one's point of view... No solution to a wicked problem is permanent or wholly satisfying, which leaves every solution open to easy polemical attack." These contrast with "tame" problems in which necessary information is available and solutions - even if difficult and expensive - are straightforward to identify and execute. Updating the U.S.'s aging infrastructure is a tame problem, because what is wrong and how to fix it are clear. In contrast, addressing climate change is a wicked problem because its effects are uncertain and the best strategies to address them are unclear. An analogous approach can be taken to natural hazard problems. In tame problems, we have a good model of the process, good information about past events, and data implying that the model should predict future events. In such cases, we can make a reasonable assessment of the hazard that can be used to develop mitigation strategies. Earthquake hazard mitigation for San Francisco is a relatively tame problem. We understand how the earthquakes result from known plate motions, have information about past earthquakes, and have geodetic data implying that future similar earthquakes will occur. As a result, it is straightforward to develop and implement mitigation strategies. However, in many cases, hazard assessment and mitigation is a wicked problem. How should we prepare for a great earthquake on plate boundaries where tectonics favor such events but we have no evidence that they have occurred and hence how large they may be or how often to expect them? How should we assess the hazard within plates, for example in the New Madrid seismic zone, where large earthquakes have occurred but we do not understand their causes and geodetic data show no strain accumulating? How can we assess the hazard and make sensible policy when the recurrence of

  9. Quantitative electroencephalography analysis in university students with hazardous alcohol consumption, but not alcohol dependence.

    PubMed

    Núñez-Jaramillo, Luis; Vega-Perera, Paulo; Ramírez-Lugo, Leticia; Reyes-López, Julián V; Santiago-Rodríguez, Efraín; Herrera-Morales, Wendy V

    2015-07-08

    Hazardous alcohol consumption is a pattern of consumption that leads to a higher risk of harmful consequences either for the user or for others. This pattern of alcohol consumption has been linked to risky behaviors, accidents, and injuries. Individuals with hazardous alcohol consumption do not necessarily present alcohol dependence; thus, a study of the particular neurophysiological correlates of this alcohol consumption pattern needs to be carried out in nondependent individuals. Here, we carried out a quantitative electroencephalography analysis in health sciences university students with hazardous alcohol consumption, but not alcohol dependence (HAC), and control participants without hazardous alcohol consumption or alcohol dependence (NHAC). We analyzed absolute power (AP), relative power (RP), and mean frequency (MF) for beta and theta frequency bands under both eyes-closed and eyes-open conditions. We found that participants in the HAC group presented higher beta AP in the centroparietal region, as well as lower beta MF in frontal and centroparietal regions, in the eyes-closed condition. Interestingly, participants did not present any change in theta activity (AP, RP, or MF), whereas previous reports indicate an increase in theta AP in alcohol-dependent individuals. Our results partially resemble those found in alcohol-dependent individuals, although they are not completely identical, suggesting a possible difference in the underlying neuronal mechanisms behind alcohol dependence and hazardous alcohol consumption. The similarities could be explained by considering that both hazardous alcohol consumption and alcohol dependence are manifestations of behavioral disinhibition.

  10. Earthquake Hazard Mitigation Using a Systems Analysis Approach to Risk Assessment

    NASA Astrophysics Data System (ADS)

    Legg, M.; Eguchi, R. T.

    2015-12-01

    The earthquake hazard mitigation goal is to reduce losses due to severe natural events. The first step is to conduct a seismic risk assessment consisting of (1) hazard estimation, (2) vulnerability analysis, and (3) exposure compilation. Seismic hazards include ground deformation, shaking, and inundation. The hazard estimation may be probabilistic or deterministic. Probabilistic Seismic Hazard Assessment (PSHA) is generally applied to site-specific risk assessments, but may involve large areas as in a National Seismic Hazard Mapping program. Deterministic hazard assessments are needed for geographically distributed exposure such as lifelines (infrastructure), but may also be important for large communities. Vulnerability evaluation includes quantification of fragility for construction or components, including personnel. Exposure represents the existing or planned construction, facilities, infrastructure, and population in the affected area. Risk (expected loss) is the product of the quantified hazard, vulnerability (damage algorithm), and exposure, and may be used to prepare emergency response plans, retrofit existing construction, or guide community planning to avoid hazards. The risk estimate provides data needed to acquire earthquake insurance to assist with effective recovery following a severe event. Earthquake scenarios used in deterministic risk assessments provide detailed information on where hazards may be most severe and what system components are most susceptible to failure, and they allow evaluation of the combined effects of a severe earthquake on a whole system or community. Casualties (injuries and deaths) have been the primary factor in defining building codes for seismic-resistant construction. Economic losses may be equally significant factors that can influence proactive hazard mitigation. Large urban earthquakes may produce catastrophic losses due to a cascading of effects often missed in PSHA. 
Economic collapse may ensue if damaged workplaces, disruption of utilities, and
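    The risk product described in this record (hazard x vulnerability x exposure) can be sketched as a simple expected-annual-loss sum; the asset values and damage ratios below are hypothetical:

```python
def expected_annual_loss(annual_hazard_prob, vulnerability, exposure_value):
    """Risk (expected annual loss) as the product of the annual hazard
    probability, the vulnerability (mean damage ratio, 0-1), and the
    exposure (replacement cost of the asset)."""
    return annual_hazard_prob * vulnerability * exposure_value

# A small hypothetical portfolio, summed over assets for one scenario
# with an annual exceedance probability of 1/475:
assets = [
    {"p": 1 / 475, "v": 0.30, "value": 2_000_000},   # fragile building
    {"p": 1 / 475, "v": 0.05, "value": 10_000_000},  # retrofitted lifeline
]
eal = sum(expected_annual_loss(a["p"], a["v"], a["value"]) for a in assets)
```

    A full assessment would repeat this sum over many scenarios and hazard levels, which is where deterministic scenario studies and PSHA-based risk curves diverge.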

  11. Nationwide tsunami hazard assessment project in Japan

    NASA Astrophysics Data System (ADS)

    Hirata, K.; Fujiwara, H.; Nakamura, H.; Osada, M.; Ohsumi, T.; Morikawa, N.; Kawai, S.; Aoi, S.; Yamamoto, N.; Matsuyama, H.; Toyama, N.; Kito, T.; Murashima, Y.; Murata, Y.; Inoue, T.; Saito, R.; Akiyama, S.; Korenaga, M.; Abe, Y.; Hashimoto, N.

    2014-12-01

    In 2012, we began a project of nationwide Probabilistic Tsunami Hazard Assessment (PTHA) in Japan to support various measures (Fujiwara et al., 2013, JpGU; Hirata et al., 2014, AOGS). The most important strategy in the nationwide PTHA is to let aleatory uncertainty dominate the assessment while limiting the use of epistemic uncertainty to a minimum, because the number of possible combinations of epistemic uncertainties diverges quickly as their number increases; we therefore consider only the type of earthquake occurrence probability distribution as an epistemic uncertainty. The outline of the nationwide PTHA is as follows: (i) we consider all possible earthquakes in the future, including those that the Headquarters for Earthquake Research Promotion (HERP) of the Japanese Government has already assessed. (ii) We construct a set of simplified earthquake fault models, called "Characterized Earthquake Fault Models (CEFMs)", for all of the earthquakes by following prescribed rules (Toyama et al., 2014, JpGU; Korenaga et al., 2014, JpGU). (iii) For all initial water-surface distributions caused by the CEFMs, we calculate tsunamis by solving a nonlinear long wave equation with a finite-difference method, including runup calculation, over a nesting grid system with a minimum grid size of 50 meters. (iv) Finally, we integrate the information about the tsunamis calculated from the numerous CEFMs to obtain nationwide tsunami hazard assessments. One of the most popular representations of the integrated information is a tsunami hazard curve for coastal tsunami heights, incorporating uncertainties inherent in tsunami simulation and earthquake fault slip heterogeneity (Abe et al., 2014, JpGU). We will show a PTHA along the eastern coast of Honshu, Japan, based on approximately 1,800 tsunami sources located within the subduction zone along the Japan Trench, as a prototype of the nationwide PTHA. This study is supported by part of the research
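    Step (iv), aggregating many scenario tsunamis into a hazard curve, can be sketched as a total-probability sum over sources; the rates and heights below are illustrative, not the project's CEFM values:

```python
import math

def hazard_curve(sources, heights):
    """Exceedance rate lambda(h): the sum of annual occurrence rates of
    all sources whose computed coastal tsunami height reaches at least h
    (total-probability aggregation over scenario sources)."""
    return [sum(rate for rate, h_src in sources if h_src >= h)
            for h in heights]

# (annual rate, computed coastal height in m) per scenario, illustrative:
sources = [(1 / 100, 2.0), (1 / 500, 5.0), (1 / 1000, 8.0)]
rates = hazard_curve(sources, heights=[1.0, 4.0, 6.0])

# Poisson probability of at least one exceedance within 50 years:
p50 = [1 - math.exp(-lam * 50) for lam in rates]
```

    In the real assessment each scenario also carries a slip-heterogeneity and simulation-error distribution, so each source contributes a fractional rate to every height bin rather than an all-or-nothing term.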

  12. Robot-assisted home hazard assessment for fall prevention: a feasibility study.

    PubMed

    Sadasivam, Rajani S; Luger, Tana M; Coley, Heather L; Taylor, Benjamin B; Padir, Taskin; Ritchie, Christine S; Houston, Thomas K

    2014-01-01

    We examined the feasibility of using a remotely manoeuvrable robot to make home hazard assessments for fall prevention. We employed use-case simulations to compare robot assessments with in-person assessments. We screened the homes of nine elderly patients (aged 65 years or more) for fall risks using the HEROS screening assessment. We also assessed the participants' perspectives of the remotely-operated robot in a survey. The nine patients had a median Short Blessed Test score of 8 (interquartile range, IQR 2-20) and a median Life-Space Assessment score of 46 (IQR 27-75). Compared to the in-person assessment (mean = 4.2 hazards identified per participant), significantly more home hazards were perceived in the robot video assessment (mean = 7.0). Only two checklist items (adequate bedroom lighting and a clear path from bed to bathroom) had more than 60% agreement between the in-person and robot video assessments. Participants were enthusiastic about the robot and did not think it violated their privacy. The study found little agreement between the in-person and robot video hazard assessments. However, it identified several research questions about how best to use remotely-operated robots.

  13. A novel hazard assessment method for biomass gasification stations based on extended set pair analysis

    PubMed Central

    Yan, Fang; Xu, Kaili; Li, Deshun; Cui, Zhikai

    2017-01-01

    Biomass gasification stations face many hazard factors, so hazard assessment is necessary for them. In this study, a novel hazard assessment method called extended set pair analysis (ESPA) is proposed based on set pair analysis (SPA). In SPA, calculating the connection degree (CD) requires hazard grades and their corresponding thresholds to be classified in advance. For hazard assessment with ESPA, a novel algorithm for calculating the CD is worked out for the case where hazard grades and their corresponding thresholds are unknown. The CD can then be converted into a Euclidean distance (ED) by a simple and concise calculation, and the hazard of each sample is ranked by the value of its ED. In this paper, six biomass gasification stations are assessed using ESPA and general set pair analysis (GSPA), respectively. Comparison of the hazard assessment results obtained from ESPA and GSPA demonstrates the availability and validity of ESPA for the hazard assessment of biomass gasification stations. The reasonability of ESPA is also justified by a sensitivity analysis of the hazard assessment results obtained by ESPA and GSPA. PMID:28938011
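    A minimal sketch of the ranking step (connection-degree vector to Euclidean distance to hazard order), assuming a three-component CD vector and an ideal fully-"identical" reference point; the paper's exact CD construction and ordering convention may differ:

```python
import math

def rank_by_distance(cd_vectors):
    """Rank samples by the Euclidean distance from each sample's
    connection-degree vector to the ideal vector (1, 0, ..., 0);
    a larger distance is read here as a higher hazard. This is a
    sketch of the ESPA idea, not the paper's exact algorithm."""
    ideal = [1.0] + [0.0] * (len(cd_vectors[0]) - 1)
    eds = [math.dist(v, ideal) for v in cd_vectors]
    # Indices ordered from most to least hazardous.
    return sorted(range(len(eds)), key=lambda i: eds[i], reverse=True)

# Three hypothetical stations' CD vectors (identity, discrepancy, contrary):
order = rank_by_distance([(0.9, 0.1, 0.0), (0.4, 0.3, 0.3), (0.6, 0.2, 0.2)])
```

    The second vector sits farthest from the ideal point, so that station ranks as the most hazardous of the three.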

  14. Probabilistic seismic hazard assessment of southern part of Ghana

    NASA Astrophysics Data System (ADS)

    Ahulu, Sylvanus T.; Danuor, Sylvester Kojo; Asiedu, Daniel K.

    2018-05-01

    This paper presents a seismic hazard map for the southern part of Ghana prepared using the probabilistic approach, together with seismic hazard assessment results for six cities. The seismic hazard map was prepared for a 10% probability of exceedance of peak ground acceleration in 50 years. The input parameters used for the hazard computations were obtained from a catalogue that was compiled and homogenised to moment magnitude (Mw). The catalogue covered a period of over a century (1615-2009). The hazard assessment is based on the Poisson model for earthquake occurrence; hence, dependent events were identified and removed from the catalogue. The following attenuation relations were adopted and used in this study: Allen (for south and eastern Australia), Silva et al. (for central and eastern North America), Campbell and Bozorgnia (for worldwide active-shallow-crust regions) and Chiou and Youngs (for worldwide active-shallow-crust regions). Logic-tree formalism was used to account for possible uncertainties associated with the attenuation relationships. The OpenQuake software package was used for the hazard calculation. The highest level of seismic hazard is found in the Accra and Tema seismic zones, with estimated peak ground acceleration close to 0.2 g. The level of seismic hazard in the southern part of Ghana diminishes with distance away from the Accra/Tema region to a value of 0.05 g at a distance of about 140 km.
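    The map's design level follows directly from the Poisson occurrence model the catalogue was declustered for: a 10% probability of exceedance in 50 years corresponds to the familiar ~475-year return period.

```python
import math

def return_period(p_exceed, t_years):
    """Return period implied by a Poisson occurrence model for an
    exceedance probability p_exceed over an exposure time t_years:
    p = 1 - exp(-t / T)  =>  T = -t / ln(1 - p)."""
    return -t_years / math.log(1.0 - p_exceed)

T = return_period(0.10, 50)  # ~474.6 years
```

    The same relation gives ~2,475 years for the 2% in 50 years level used in some of the other records above.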

  15. Quantitative risk assessment using empirical vulnerability functions from debris flow event reconstruction

    NASA Astrophysics Data System (ADS)

    Luna, Byron Quan; Blahut, Jan; Camera, Corrado; van Westen, Cees; Sterlacchini, Simone; Apuani, Tiziana; Akbas, Sami

    2010-05-01

    For a quantitative risk assessment framework it is essential to assess not only the hazardous process itself but also its consequences. This quantitative assessment should include the expected monetary losses as the product of the probability of occurrence of a hazard of a given magnitude and its vulnerability. A quantifiable integrated approach to both hazard and risk is becoming required practice in risk reduction management. Dynamic run-out models for debris flows are able to calculate physical outputs (extension, depths, velocities, impact pressures) and to determine the zones where the elements at risk could suffer an impact. These results are then applied to vulnerability and risk calculations. The risk assessment was conducted in the Valtellina Valley, a typical alpine valley in northern Italy (Lombardy Region). On 13 July 2008, after more than two days of intense rainfall, several debris and mud flows were released in the central part of the valley between Morbegno and Berbenno. One of the largest debris flows occurred in Selvetta. The debris flow event was reconstructed after extensive field work and interviews with local inhabitants and civil protection teams. Also in the Valtellina valley, on 22-23 May 1983, two debris flows occurred in Tresenda (Teglio municipality), causing casualties and considerable economic damage. At the same location, on 26 November 2002, another debris flow occurred that caused significant damage. For the quantification of a new scenario, the results obtained from the Selvetta event were applied to Tresenda. The Selvetta and Tresenda events were modelled with the FLO2D program. FLO2D uses an Eulerian formulation with a finite-difference numerical scheme that requires the specification of an input hydrograph. The internal stresses are isotropic and the basal shear stresses are calculated using a quadratic model. The significance of
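    The quadratic basal-stress model used by FLO2D-type codes is commonly written as a three-term friction slope (yield, viscous, and turbulent-dispersive contributions, in the O'Brien-Julien form). A sketch with illustrative, uncalibrated parameters:

```python
def quadratic_friction_slope(v, h, tau_y, eta, gamma_m, n_td, K=2285.0):
    """Friction slope of the quadratic rheology used by FLO2D-type
    debris flow models: yield + viscous + turbulent-dispersive terms.
    v: depth-averaged velocity (m/s); h: flow depth (m); tau_y: yield
    stress (Pa); eta: dynamic viscosity (Pa*s); gamma_m: mixture specific
    weight (N/m^3); n_td: equivalent Manning coefficient; K: laminar flow
    resistance parameter. Values below are illustrative only."""
    yield_term = tau_y / (gamma_m * h)
    viscous_term = K * eta * v / (8.0 * gamma_m * h * h)
    turbulent_term = (n_td ** 2) * v * v / h ** (4.0 / 3.0)
    return yield_term + viscous_term + turbulent_term

# Hypothetical flow state and material, for illustration only:
sf = quadratic_friction_slope(v=2.0, h=1.0, tau_y=500.0, eta=50.0,
                              gamma_m=18_000.0, n_td=0.1)
```

    In the back-analysis described above, the calibrated rheological parameters from the Selvetta reconstruction would be the values substituted here for the Tresenda scenario.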

  16. Simulation Technology Laboratory Building 970 hazards assessment document

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wood, C.L.; Starr, M.D.

    1994-11-01

    Department of Energy Order 5500.3A requires that facility-specific hazards assessments be prepared, maintained, and used for emergency planning purposes. This hazards assessment document describes the chemical and radiological hazards associated with the Simulation Technology Laboratory, Building 970. The entire inventory was screened according to the potential airborne impact on onsite and offsite individuals. The air dispersion model ALOHA estimated pollutant concentrations downwind from the source of a release, taking into consideration the toxicological and physical characteristics of the release site, the atmospheric conditions, and the circumstances of the release. The greatest distances at which a postulated facility event will produce consequences exceeding the ERPG-2 and Early Severe Health Effects thresholds are 78 and 46 meters, respectively. The highest emergency classification is a Site Area Emergency. The Emergency Planning Zone is 100 meters.

  17. The Spatial Assessment of the Current Seismic Hazard State for Hard Rock Underground Mines

    NASA Astrophysics Data System (ADS)

    Wesseloo, Johan

    2018-06-01

    Mining-induced seismic hazard assessment is an important component in the management of safety and financial risk in mines. As the seismic hazard is a response to the mining activity, it is non-stationary and variable both in space and time. This paper presents an approach for implementing a probabilistic seismic hazard assessment to assess the current hazard state of a mine. Each of the components of the probabilistic seismic hazard assessment is considered within the context of hard rock underground mines. The focus of this paper is the assessment of the in-mine hazard distribution and does not consider the hazard to nearby public or structures. A rating system and methodologies to present hazard maps, for the purpose of communicating to different stakeholders in the mine, i.e. mine managers, technical personnel and the work force, are developed. The approach allows one to update the assessment with relative ease and within short time periods as new data become available, enabling the monitoring of the spatial and temporal change in the seismic hazard.

  18. Teaching quantitative biology: goals, assessments, and resources

    PubMed Central

    Aikens, Melissa L.; Dolan, Erin L.

    2014-01-01

    More than a decade has passed since the publication of BIO2010, calling for an increased emphasis on quantitative skills in the undergraduate biology curriculum. In that time, relatively few papers have been published that describe educational innovations in quantitative biology or provide evidence of their effects on students. Using a “backward design” framework, we lay out quantitative skill and attitude goals, assessment strategies, and teaching resources to help biologists teach more quantitatively. Collaborations between quantitative biologists and education researchers are necessary to develop a broader and more appropriate suite of assessment tools, and to provide much-needed evidence on how particular teaching strategies affect biology students' quantitative skill development and attitudes toward quantitative work. PMID:25368425

  19. Review of Natural Phenomena Hazard (NPH) Assessments for the DOE Hanford Site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Snow, Robert L.; Ross, Steven B.

    2011-09-15

    The purpose of this review is to assess the need for updating Natural Phenomena Hazard (NPH) assessments for the DOE's Hanford Site, as required by DOE Order 420.1B Chapter IV, Natural Phenomena Hazards Mitigation, based on significant changes in state-of-the-art NPH assessment methodology or site-specific information. This review is an update and expansion of the September 2010 review PNNL-19751, Review of Natural Phenomena Hazard (NPH) Assessments for the Hanford 200 Areas (Non-Seismic).

  20. Multi scenario seismic hazard assessment for Egypt

    NASA Astrophysics Data System (ADS)

    Mostafa, Shaimaa Ismail; Abd el-aal, Abd el-aziz Khairy; El-Eraki, Mohamed Ahmed

    2018-01-01

    Egypt is located in the northeastern corner of Africa within a sensitive seismotectonic location. Earthquakes are concentrated along the active tectonic boundaries of the African, Eurasian, and Arabian plates. The study area is characterized by northward-increasing sediment thickness, leading to more damage to structures in the north due to multiple reflections of seismic waves. Unfortunately, man-made constructions in Egypt were not designed to resist earthquake ground motions, so it is important to evaluate the seismic hazard to reduce social and economic losses and preserve lives. Probabilistic seismic hazard assessment is used to evaluate the hazard using alternative seismotectonic models within a logic-tree framework. Alternative seismotectonic models, magnitude-frequency relations, and various indigenous attenuation relationships were combined within the logic-tree formulation to compute the regional exposure and develop a set of hazard maps. Hazard contour maps are constructed for peak ground acceleration as well as 0.1-, 0.2-, 0.5-, 1-, and 2-s spectral periods for 100- and 475-year return periods for ground motion on rock. The results illustrate that Egypt is characterized by very low to high seismic activity, grading from the western to the eastern part of the country. The uniform hazard spectra are estimated at some important cities distributed all over Egypt. The deaggregation of seismic hazard is estimated at some cities to identify the scenario events that contribute to a selected seismic hazard level. The results of this study can be used for seismic microzonation, risk mitigation, and earthquake engineering purposes.

  2. Afghanistan Multi-Risk Assessment to Natural Hazards

    NASA Astrophysics Data System (ADS)

    Diermanse, Ferdinand; Daniell, James; Pollino, Maurizio; Glover, James; Bouwer, Laurens; de Bel, Mark; Schaefer, Andreas; Puglisi, Claudio; Winsemius, Hessel; Burzel, Andreas; Ammann, Walter; Aliparast, Mojtaba; Jongman, Brenden; Ranghieri, Federica; Fallesen, Ditte

    2017-04-01

    The geographical location of Afghanistan and years of environmental degradation in the country make Afghanistan highly prone to intense and recurring natural hazards such as flooding, earthquakes, snow avalanches, landslides, and droughts. These occur in addition to man-made disasters, resulting in the frequent loss of life, livelihoods, and property. Since 1980, disasters caused by natural hazards have affected 9 million people and caused over 20,000 fatalities in Afghanistan. The creation, understanding and accessibility of hazard, exposure, vulnerability and risk information is key for effective management of disaster risk. This is especially true in Afghanistan, where reconstruction after recent natural disasters and military conflicts is ongoing and will continue over the coming years. So far, limited disaster risk information has been produced in Afghanistan, and the information that does exist typically lacks a standard methodology and does not have uniform geo-spatial coverage. There are currently no available risk assessment studies that cover all major natural hazards in Afghanistan and that could be used to assess the costs and benefits of different resilient reconstruction and disaster risk reduction strategies. As a result, the Government of Afghanistan has limited information regarding current and future disaster risk and the effectiveness of policy options on which to base its reconstruction and risk reduction decisions. To better understand natural hazard and disaster risk, the World Bank and the Global Facility for Disaster Reduction and Recovery (GFDRR) are supporting the development of new fluvial flood, flash flood, drought, landslide, avalanche and seismic risk information in Afghanistan, as well as a first-order analysis of the costs and benefits of resilient reconstruction and risk reduction strategies undertaken by the authors. The hazard component is the combination of the probability and magnitude of natural hazards. 
Hazard analyses were carried out

  3. Considering potential seismic sources in earthquake hazard assessment for Northern Iran

    NASA Astrophysics Data System (ADS)

    Abdollahzadeh, Gholamreza; Sazjini, Mohammad; Shahaky, Mohsen; Tajrishi, Fatemeh Zahedi; Khanmohammadi, Leila

    2014-07-01

    Located on the Alpine-Himalayan earthquake belt, Iran is one of the seismically active regions of the world. Northern Iran, south of the Caspian Basin, a hazardous subduction zone, is a densely populated and developing area of the country. Historical and instrumental documented seismicity indicates the occurrence of severe earthquakes leading to many deaths and large losses in the region. With the growth of seismological and tectonic data, an updated seismic hazard assessment is a worthwhile undertaking for emergency management programs and long-term development plans in urban and rural areas of this region. In the present study, probabilistic seismic hazard assessment is carried out for the region using three recent ground motion prediction equations, drawing on the up-to-date information required for seismic hazard assessment: geological data and the active tectonic setting for a thorough investigation of the active and potential seismogenic sources, and historical and instrumental events for compiling the earthquake catalogue. The logic tree method is utilized to capture the epistemic uncertainty of the seismic hazard assessment in the delineation of seismic sources and the selection of attenuation relations. The results are compared with a recent code-prescribed seismic hazard study of the region and are discussed in detail to explore their variation in each branch of the logic tree approach. Also, seismic hazard maps of peak ground acceleration at rock sites for 475- and 2,475-year return periods are provided for the region.
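    The logic-tree treatment described above can be sketched as a weighted combination of hazard curves, one per ground-motion-prediction-equation branch. The GMPE names, exceedance rates and branch weights below are illustrative assumptions, not values from the study:

```python
# Illustrative logic-tree combination of seismic hazard curves.
# Each branch is a (hypothetical) GMPE giving annual exceedance rates
# at a set of PGA levels; epistemic branch weights must sum to 1.
pga_levels = [0.1, 0.2, 0.3, 0.4]  # PGA in g

branches = {
    "GMPE_A": ([2e-2, 6e-3, 2e-3, 8e-4], 0.5),
    "GMPE_B": ([3e-2, 8e-3, 3e-3, 1e-3], 0.3),
    "GMPE_C": ([1e-2, 4e-3, 1e-3, 5e-4], 0.2),
}

def weighted_mean_curve(branches):
    # Weighted mean exceedance rate at each ground-motion level.
    n = len(next(iter(branches.values()))[0])
    curve = [0.0] * n
    for rates, weight in branches.values():
        for i, rate in enumerate(rates):
            curve[i] += weight * rate
    return curve

mean_curve = weighted_mean_curve(branches)
```

The mean curve is only one summary of the logic tree; keeping the individual branch curves also allows fractile (percentile) hazard estimates.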

  4. Fish acute toxicity syndromes and their use in the QSAR approach to hazard assessment.

    PubMed Central

    McKim, J M; Bradbury, S P; Niemi, G J

    1987-01-01

    Implementation of the Toxic Substances Control Act of 1977 creates the need to reliably establish testing priorities because laboratory resources are limited and the number of industrial chemicals requiring evaluation is overwhelming. The use of quantitative structure activity relationship (QSAR) models as rapid and predictive screening tools to select the more potentially hazardous chemicals for in-depth laboratory evaluation has been proposed. Further implementation and refinement of quantitative structure-toxicity relationships in aquatic toxicology and hazard assessment requires the development of a "mode-of-action" database. With such a database, a qualitative structure-activity relationship can be formulated to assign the proper mode of action, and respective QSAR, to a given chemical structure. In this review, the development of fish acute toxicity syndromes (FATS), which are toxic-response sets based on various behavioral and physiological-biochemical measurements, and their projected use in the mode-of-action database are outlined. Using behavioral parameters monitored in the fathead minnow during acute toxicity testing, FATS associated with acetylcholinesterase (AChE) inhibitors and narcotics could be reliably predicted. However, compounds classified as oxidative phosphorylation uncouplers or stimulants could not be resolved. Refinement of this approach using respiratory-cardiovascular responses in the rainbow trout enabled FATS associated with AChE inhibitors, convulsants, narcotics, respiratory blockers, respiratory membrane irritants, and uncouplers to be correctly predicted. PMID:3297660

  5. National-Level Multi-Hazard Risk Assessments in Sub-Saharan Africa

    NASA Astrophysics Data System (ADS)

    Murnane, R. J.; Balog, S.; Fraser, S. A.; Jongman, B.; Van Ledden, M.; Phillips, E.; Simpson, A.

    2017-12-01

    National-level risk assessments can provide important baseline information for decision-making on risk management and risk financing strategies. In this study, multi-hazard risk assessments were undertaken for 9 countries in Sub-Saharan Africa: Cape Verde, Ethiopia, Kenya, Niger, Malawi, Mali, Mozambique, Senegal and Uganda. The assessment was part of the Building Disaster Resilience in Sub-Saharan Africa Program and aimed at supporting the development of multi-risk financing strategies to help African countries make informed decisions to mitigate the socio-economic, fiscal and financial impacts of disasters. The assessments considered hazards and exposures consistent with the years 2010 and 2050. We worked with multiple firms to develop the hazard, exposure and vulnerability data and the risk results. The hazards include: coastal flood, drought, earthquake, landslide, riverine flood, tropical cyclone wind and storm surge, and volcanoes. For hazards expected to vary with climate, the 2050 hazard is based on the IPCC RCP 6.0. Geolocated exposure data for 2010 and 2050 at a 15 arc second (~0.5 km) resolution include: structures as a function of seven development patterns; transportation networks including roads, bridges, tunnels and rail; critical facilities such as schools, hospitals, energy facilities and government buildings; crops; population; and gross domestic product (GDP). The 2050 exposure values for population are based on the IPCC SSP 2. Values for other exposure data are a function of population change. Vulnerability was based on openly available vulnerability functions. Losses were based on replacement values (e.g., cost/m2 or cost/km). Risk results are provided in terms of annual average loss and a variety of return periods at the national and Admin 1 levels. Assessments of recent historical events are used to validate the model results. In the future, it would be useful to use hazard footprints of historical events for validation purposes. The
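    The annual average loss reported in such assessments is the area under the loss exceedance curve. A minimal sketch of that integration, using made-up return periods and losses rather than the study's results:

```python
# Annual average loss (AAL) as the area under a loss exceedance curve,
# approximated here with the trapezoidal rule. The return periods and
# losses are illustrative values, not results from the assessment.
return_periods = [10, 50, 100, 250, 500]   # years
losses = [5.0, 20.0, 35.0, 60.0, 80.0]     # loss per event, e.g. USD millions

# Return periods convert to annual exceedance probabilities (AEP).
aep = [1.0 / rp for rp in return_periods]

def annual_average_loss(aep, losses):
    # Integrate loss over exceedance probability (trapezoidal rule),
    # with points sorted so probability increases along the x-axis.
    pts = sorted(zip(aep, losses))
    aal = 0.0
    for (p0, l0), (p1, l1) in zip(pts, pts[1:]):
        aal += 0.5 * (l0 + l1) * (p1 - p0)
    return aal

aal = annual_average_loss(aep, losses)
```

The truncation below the shortest return period is ignored here; real catastrophe models usually extend the curve toward frequent, small losses as well.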

  6. Bayesian network learning for natural hazard assessments

    NASA Astrophysics Data System (ADS)

    Vogel, Kristin

    2016-04-01

    Even though quite different in occurrence and consequences, from a modelling perspective many natural hazards share similar properties and challenges. Their complex nature as well as lacking knowledge about their driving forces and potential effects make their analysis demanding. On top of the uncertainty about the modelling framework, inaccurate or incomplete event observations and the intrinsic randomness of the natural phenomenon add up to different interacting layers of uncertainty, which require careful handling. Thus, for reliable natural hazard assessments it is crucial not only to capture and quantify the involved uncertainties, but also to express and communicate them in an intuitive way. Decision-makers, who often find it difficult to deal with uncertainties, might otherwise return to familiar (mostly deterministic) proceedings. Within the scope of the DFG research training group "NatRiskChange" we apply the probabilistic framework of Bayesian networks for diverse natural hazard and vulnerability studies. The great potential of Bayesian networks has already been shown in previous natural hazard assessments. Treating each model component as a random variable, Bayesian networks aim at capturing the joint distribution of all considered variables. Hence, each conditional distribution of interest (e.g. the effect of precautionary measures on damage reduction) can be inferred. The (in-)dependencies between the considered variables can be learned purely data-driven or be given by experts; even a combination of both is possible. By translating the (in-)dependencies into a graph structure, Bayesian networks provide direct insights into the workings of the system and allow us to learn about the underlying processes. Despite numerous studies on the topic, learning Bayesian networks from real-world data remains challenging. In previous studies, e.g. on earthquake-induced ground motion and flood damage assessments, we tackled the problems arising with continuous variables
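    The joint-distribution idea can be illustrated with a toy two-node network (precautionary measures -> damage). The structure and all probabilities below are invented for illustration, not learned from the studies mentioned:

```python
# Toy two-node Bayesian network: Precaution -> Damage.
# The joint distribution factorizes as P(P, D) = P(P) * P(D | P);
# conditioning on Precaution then exposes its effect on damage, as in
# the abstract's example. All probabilities are illustrative.
p_precaution = {True: 0.3, False: 0.7}
p_damage_given = {True: {"high": 0.1, "low": 0.9},
                  False: {"high": 0.4, "low": 0.6}}

def joint(prec, dmg):
    # P(Precaution = prec, Damage = dmg) via the network factorization.
    return p_precaution[prec] * p_damage_given[prec][dmg]

def p_damage(dmg):
    # Marginal P(Damage = dmg), summing out Precaution.
    return sum(joint(prec, dmg) for prec in (True, False))

def p_damage_given_precaution(dmg, prec):
    # Conditional of interest: damage distribution given precaution.
    return p_damage_given[prec][dmg]
```

With real data, the conditional tables (and possibly the graph itself) would be learned rather than specified by hand.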

  7. Use of quantified risk assessment techniques in relation to major hazard installations

    NASA Astrophysics Data System (ADS)

    Elliott, M. J.

    Over the past decade, industry and regulatory authorities have expressed interest in the development and use of hazard assessment techniques, particularly in relation to the control of major hazards. However, misconceptions about the methodology and role of quantified hazard assessment techniques in decision-making have hindered productive dialogue on the use and value of these techniques, both within industry and between industry and regulatory authorities. This paper outlines the nature, role and current uses of hazard assessment as perceived by the author, and identifies and differentiates between those areas and types of decisions where quantification should prove beneficial, and those where it is unwarranted and should be discouraged.

  8. Neo-Deterministic and Probabilistic Seismic Hazard Assessments: a Comparative Analysis

    NASA Astrophysics Data System (ADS)

    Peresan, Antonella; Magrin, Andrea; Nekrasova, Anastasia; Kossobokov, Vladimir; Panza, Giuliano F.

    2016-04-01

    Objective testing is the key issue for any reliable seismic hazard assessment (SHA). Different earthquake hazard maps must demonstrate their capability in anticipating ground shaking from future strong earthquakes before they can appropriately be used for purposes such as engineering design, insurance, and emergency management. Quantitative assessment of map performance is also an essential step in the scientific process of their revision and possible improvement. Cross-checking of probabilistic models against available observations and independent physics-based models is recognized as a major validation procedure. The existing maps from classical probabilistic seismic hazard analysis (PSHA), as well as those from neo-deterministic analysis (NDSHA), which have already been developed for several regions worldwide (including Italy, India and North Africa), are considered to exemplify the possibilities of cross-comparative analysis in spotting the limits and advantages of the different methods. Where the data permit, a comparative analysis against the documented seismic activity actually observed is carried out, showing how available observations about past earthquakes can contribute to assessing the performance of the different methods. Neo-deterministic refers to a scenario-based approach, which allows for consideration of a wide range of possible earthquake sources as the starting point for scenarios constructed via full waveform modeling. The method does not make use of empirical attenuation models (i.e. Ground Motion Prediction Equations, GMPE) and naturally supplies realistic time series of ground shaking (i.e. complete synthetic seismograms), readily applicable to complete engineering analysis and other mitigation actions. The standard NDSHA maps provide reliable envelope estimates of maximum seismic ground motion from a wide set of possible scenario earthquakes, including the largest deterministically or historically defined credible earthquake. 
In addition

  9. A quantitative framework for assessing ecological resilience

    EPA Science Inventory

    Quantitative approaches to measure and assess resilience are needed to bridge gaps between science, policy, and management. In this paper, we suggest a quantitative framework for assessing ecological resilience. Ecological resilience as an emergent ecosystem phenomenon can be de...

  10. Geomorphological hazards and environmental impact: Assessment and mapping

    NASA Astrophysics Data System (ADS)

    Panizza, Mario

    In five sections the author develops the methods for the integration of geomorphological concepts into Environmental Impact and Mapping. The first section introduces the concepts of Impact and Risk through the relationships between Geomorphological Environment and Anthropical Element. The second section proposes a methodology for the determination of Geomorphological Hazard and the identification of Geomorphological Risk. The third section synthesizes the procedure for the compilation of a Geomorphological Hazards Map. The fourth section outlines the concepts of Geomorphological Resource Assessment for the analysis of the Environmental Impact. The fifth section considers the contribution of geomorphological studies and mapping in the procedure for Environmental Impact Assessment.

  11. Assessment and prediction of debris-flow hazards

    USGS Publications Warehouse

    Wieczorek, Gerald F.; ,

    1993-01-01

    Study of debris-flow geomorphology and initiation mechanism has led to better understanding of debris-flow processes. This paper reviews how this understanding is used in current techniques for assessment and prediction of debris-flow hazards.

  12. Development of a Probabilistic Tsunami Hazard Analysis in Japan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Toshiaki Sakai; Tomoyoshi Takeda; Hiroshi Soraoka

    2006-07-01

    It is meaningful for tsunami assessment to evaluate phenomena beyond the design basis, as is done in seismic design, because even once a design-basis tsunami height is set, the actual tsunami height may still exceed it owing to uncertainties in tsunami phenomena. Probabilistic tsunami risk assessment consists of estimating the tsunami hazard and the fragility of structures, and executing a system analysis. In this report, we apply a method for probabilistic tsunami hazard analysis (PTHA). We introduce a logic tree approach to estimate tsunami hazard curves (relationships between tsunami height and probability of exceedance) and present an example for Japan. Examples of tsunami hazard curves are illustrated, and uncertainty in the tsunami hazard is displayed by 5-, 16-, 50-, 84- and 95-percentile and mean hazard curves. The results of PTHA will be used for quantitative assessment of the tsunami risk to important facilities located in coastal areas. Tsunami hazard curves are reasonable input data for structure and system analysis. However, the evaluation method for estimating the fragility of structures and the procedure for system analysis are still being developed. (authors)
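    The percentile hazard curves mentioned above can be sketched as weighted fractiles taken across logic-tree branches at each tsunami height. The branch rates and weights below are illustrative, not the report's values:

```python
# Fractile (percentile) tsunami hazard curves from logic-tree branches.
# At each tsunami height, branch exceedance rates are sorted and the
# weighted q-th fractile is taken. All numbers are illustrative.
heights = [1.0, 2.0, 4.0]  # tsunami height, m

# Each branch: (annual exceedance rate per height, branch weight).
branches = [
    ([1e-2, 3e-3, 5e-4], 0.2),
    ([2e-2, 5e-3, 1e-3], 0.5),
    ([4e-2, 9e-3, 2e-3], 0.3),
]

def fractile_curve(branches, q):
    # For each height, pick the smallest rate whose cumulative branch
    # weight reaches the requested fractile q.
    curve = []
    for i in range(len(branches[0][0])):
        pairs = sorted((rates[i], w) for rates, w in branches)
        cum = 0.0
        for rate, w in pairs:
            cum += w
            if cum >= q:
                curve.append(rate)
                break
    return curve

median = fractile_curve(branches, 0.5)   # 50th-percentile hazard curve
p84 = fractile_curve(branches, 0.84)     # 84th-percentile hazard curve
```

The spread between low and high fractile curves is one way to display the epistemic uncertainty the abstract refers to.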

  13. Assessment of volcanic hazards, vulnerability, risk and uncertainty (Invited)

    NASA Astrophysics Data System (ADS)

    Sparks, R. S.

    2009-12-01

    A volcanic hazard is any phenomenon that threatens communities. These hazards include volcanic events like pyroclastic flows, explosions, ash fall and lavas, and secondary effects such as lahars and landslides. Volcanic hazards are described by the physical characteristics of the phenomena, by the assessment of the areas that they are likely to affect and by the magnitude-dependent return period of events. Volcanic hazard maps are generated by mapping past volcanic events and by modelling the hazardous processes. Both of these methods have their strengths and limitations, and a robust map should use both approaches in combination. Past records, studied through stratigraphy, the distribution of deposits and age dating, are typically incomplete and may be biased. Very significant volcanic hazards, such as surge clouds and volcanic blasts, for example, are not well preserved in the geological record. Models of volcanic processes are very useful to help identify hazardous areas that do not have any geological evidence. They are, however, limited by simplifications and incomplete understanding of the physics. Many practical volcanic hazard mapping tools are also very empirical. Hazard maps are typically abstracted into hazard zone maps, which are sometimes called threat or risk maps. Their aim is to identify areas at high levels of threat, and the boundaries between zones may take account of other factors such as roads, escape routes during evacuation, and infrastructure. These boundaries may change with time due to new knowledge of the hazards or changes in volcanic activity levels. Alternatively, they may remain static while the implications of the zones change as volcanic activity changes. Zone maps are used for planning purposes and for management of volcanic crises. Volcanic hazard maps are depictions of the likelihood of future volcanic phenomena affecting places and people. Volcanic phenomena are naturally variable, often complex and not fully understood. There are

  14. Accuracy of quantitative visual soil assessment

    NASA Astrophysics Data System (ADS)

    van Leeuwen, Maricke; Heuvelink, Gerard; Stoorvogel, Jetse; Wallinga, Jakob; de Boer, Imke; van Dam, Jos; van Essen, Everhard; Moolenaar, Simon; Verhoeven, Frank; Stoof, Cathelijne

    2016-04-01

    Visual soil assessment (VSA) is a method to assess soil quality visually while standing in the field. VSA is increasingly used by farmers, farm organisations and companies, because it is rapid and cost-effective, and because looking at soil provides understanding about soil functioning. VSA is often regarded as subjective, so there is a need to verify it. Also, many VSAs have not been fine-tuned for contrasting soil types, which could lead to wrong interpretations of soil quality and soil functioning when contrasting sites are compared to each other. We wanted to assess the accuracy of VSA while taking soil type into account. The first objective was to test whether quantitative visual field observations, which form the basis of many VSAs, could be validated with standardized field or laboratory measurements. The second objective was to assess whether quantitative visual field observations are reproducible when used by observers with contrasting backgrounds. For the validation study, we made quantitative visual observations at 26 cattle farms. Farms were located on sand, clay and peat soils in the North Friesian Woodlands, the Netherlands. The quantitative visual observations evaluated were grass cover, number of biopores, number of roots, soil colour, soil structure, number of earthworms, number of gley mottles and soil compaction. Linear regression analysis showed that four out of eight quantitative visual observations could be well validated with standardized field or laboratory measurements. The following quantitative visual observations correlated well with standardized field or laboratory measurements: grass cover with classified images of surface cover; number of roots with root dry weight; amount of large structure elements with mean weight diameter; and soil colour with soil organic matter content. Correlation coefficients were greater than 0.3, and half of the correlations were significant. 
For the reproducibility study, a group of 9 soil scientists and 7
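    The validation step above amounts to correlating visual scores with laboratory measurements. A minimal sketch of that correlation, using hypothetical paired data rather than the study's measurements:

```python
import math

# Pearson correlation between a visual field score and a laboratory
# measurement, as used to validate VSA observations. The paired data
# below are hypothetical, not the study's measurements.
visual_score = [2, 3, 3, 4, 5, 5, 6, 7]                # e.g. root count class
lab_value = [0.8, 1.1, 1.0, 1.6, 1.9, 2.2, 2.4, 3.0]   # e.g. root dry weight, g

def pearson_r(x, y):
    # Sample Pearson correlation coefficient.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

r = pearson_r(visual_score, lab_value)
```

In the study a correlation above 0.3 was taken as a useful signal; significance would additionally be tested against the sample size.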

  15. Towards quantitative assessment of calciphylaxis

    NASA Astrophysics Data System (ADS)

    Deserno, Thomas M.; Sárándi, István.; Jose, Abin; Haak, Daniel; Jonas, Stephan; Specht, Paula; Brandenburg, Vincent

    2014-03-01

    Calciphylaxis is a rare disease with devastating complications and high morbidity and mortality. It is characterized by systemic medial calcification of the arteries yielding necrotic skin ulcerations. In this paper, we aim at supporting the installation of multi-center registries for calciphylaxis, which include a photographic documentation of skin necrosis. However, photographs acquired in different centers under different conditions, using different equipment and photographers, cannot be compared quantitatively. For normalization, we use a simple color pad that is placed into the field of view and segmented from the image, and its color fields are analyzed. In total, 24 colors are printed on the pad. A least-squares approach is used to determine the affine color transform. Furthermore, the card allows scale normalization. We provide a case study for qualitative assessment. In addition, the method is evaluated quantitatively using 10 images of two sets of different captures of the same necrosis. The variability of quantitative measurements based on free-hand photography is assessed with regard to geometric and color distortions before and after our simple calibration procedure. Using automated image processing, the standard deviation of measurements is significantly reduced. The coefficients of variation yield 5-20% and 2-10% for geometry and color, respectively. Hence, quantitative assessment of calciphylaxis becomes practicable and will support a better understanding of this rare but fatal disease.
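    A reduced sketch of the color-pad normalization: the paper fits an affine color transform by least squares; here, for illustration only, each channel is corrected independently with a gain/offset fit, and the pad values are invented:

```python
# Simplified color-pad calibration: fit a per-channel gain/offset (a, b)
# by least squares so that a*observed + b matches the known reference
# colors of the pad. The paper fits a full affine transform across
# channels; this per-channel fit is a reduced sketch with made-up values.
reference = [20.0, 80.0, 140.0, 200.0, 240.0]  # known pad values, one channel
observed = [30.0, 84.0, 138.0, 192.0, 228.0]   # same fields as photographed

def fit_gain_offset(obs, ref):
    # Closed-form least squares for a 1-D linear model ref ≈ a*obs + b.
    n = len(obs)
    mo, mr = sum(obs) / n, sum(ref) / n
    a = sum((o - mo) * (r - mr) for o, r in zip(obs, ref)) / \
        sum((o - mo) ** 2 for o in obs)
    b = mr - a * mo
    return a, b

a, b = fit_gain_offset(observed, reference)
corrected = [a * o + b for o in observed]
```

Applying the fitted gain/offset to the rest of the photograph puts images from different cameras and lighting conditions on a common color scale.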

  16. Source processes for the probabilistic assessment of tsunami hazards

    USGS Publications Warehouse

    Geist, Eric L.; Lynett, Patrick J.

    2014-01-01

    The importance of tsunami hazard assessment has increased in recent years as a result of catastrophic consequences from events such as the 2004 Indian Ocean and 2011 Japan tsunamis. In particular, probabilistic tsunami hazard assessment (PTHA) methods have been emphasized to include all possible ways a tsunami could be generated. Owing to the scarcity of tsunami observations, a computational approach is used to define the hazard. This approach includes all relevant sources that may cause a tsunami to impact a site and all quantifiable uncertainty. Although only earthquakes were initially considered for PTHA, recent efforts have also attempted to include landslide tsunami sources. Including these sources into PTHA is considerably more difficult because of a general lack of information on relating landslide area and volume to mean return period. The large variety of failure types and rheologies associated with submarine landslides translates to considerable uncertainty in determining the efficiency of tsunami generation. Resolution of these and several other outstanding problems are described that will further advance PTHA methodologies leading to a more accurate understanding of tsunami hazard.

  17. A global probabilistic tsunami hazard assessment from earthquake sources

    USGS Publications Warehouse

    Davies, Gareth; Griffin, Jonathan; Lovholt, Finn; Glimsdal, Sylfest; Harbitz, Carl; Thio, Hong Kie; Lorito, Stefano; Basili, Roberto; Selva, Jacopo; Geist, Eric L.; Baptista, Maria Ana

    2017-01-01

    Large tsunamis occur infrequently but have the capacity to cause enormous numbers of casualties, damage to the built environment and critical infrastructure, and economic losses. A sound understanding of tsunami hazard is required to underpin management of these risks, and while tsunami hazard assessments are typically conducted at regional or local scales, globally consistent assessments are required to support international disaster risk reduction efforts, and can serve as a reference for local and regional studies. This study presents a global-scale probabilistic tsunami hazard assessment (PTHA), extending previous global-scale assessments based largely on scenario analysis. Only earthquake sources are considered, as they represent about 80% of the recorded damaging tsunami events. Globally extensive estimates of tsunami run-up height are derived at various exceedance rates, and the associated uncertainties are quantified. Epistemic uncertainties in the exceedance rates of large earthquakes often lead to large uncertainties in tsunami run-up. Deviations between modelled tsunami run-up and event observations are quantified, and found to be larger than suggested in previous studies. Accounting for these deviations in PTHA is important, as it leads to a pronounced increase in predicted tsunami run-up for a given exceedance rate.

  18. Probabilistic Seismic Hazard Assessment for Iraq

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Onur, Tuna; Gok, Rengin; Abdulnaby, Wathiq

    Probabilistic Seismic Hazard Assessments (PSHA) form the basis for most contemporary seismic provisions in building codes around the world. The current building code of Iraq was published in 1997, and an update to this edition is in the process of being released. However, there are no national PSHA studies in Iraq for the new building code to refer to for seismic loading in terms of spectral accelerations. As an interim solution, the new draft building code considered referring to PSHA results produced in the late 1990s as part of the Global Seismic Hazard Assessment Program (GSHAP; Giardini et al., 1999). However, these results are: a) more than 15 years out of date, b) PGA-based only, necessitating rough conversion factors to calculate spectral accelerations at 0.3s and 1.0s for seismic design, and c) at a probability level of 10% chance of exceedance in 50 years, not the 2% that the building code requires. Hence there is a pressing need for a new, updated PSHA for Iraq.

  19. Preparing for Euro 2012: developing a hazard risk assessment.

    PubMed

    Wong, Evan G; Razek, Tarek; Luhovy, Artem; Mogilevkina, Irina; Prudnikov, Yuriy; Klimovitskiy, Fedor; Yutovets, Yuriy; Khwaja, Kosar A; Deckelbaum, Dan L

    2015-04-01

    Risk assessment is a vital step in the disaster-preparedness continuum, as it is the foundation of subsequent phases, including mitigation, response, and recovery. The objective was to develop a risk assessment tool geared specifically towards the Union of European Football Associations (UEFA) Euro 2012. In partnership with the Donetsk National Medical University, Donetsk Research and Development Institute of Traumatology and Orthopedics, Donetsk Regional Public Health Administration, and the Ministry of Emergency of Ukraine, a table-based tool was created which, based on historical evidence, identifies relevant potential threats, evaluates their impacts and likelihoods on graded scales using previously available data, identifies potential shortcomings in mitigation, and recommends further mitigation measures. This risk assessment tool was applied in the vulnerability assessment phase of the UEFA Euro 2012. Twenty-three sub-types of potential hazards were identified and analyzed. Ten specific hazards were recognized as likely to very likely to occur, including natural disasters, bombing and blast events, road traffic collisions, and disorderly conduct. Preventative measures, such as increased stadium security and zero tolerance for impaired driving, were recommended. Mitigating factors were suggested, including clear, incident-specific preparedness plans and enhanced inter-agency communication. This hazard risk assessment tool is a simple aid in vulnerability assessment, essential for disaster preparedness and response, and may be applied broadly to future international events.
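    A table-based impact-likelihood tool of the kind described can be sketched as follows; the hazards and 1-5 scores are illustrative assumptions, not values from the Euro 2012 assessment:

```python
# Illustrative hazard risk matrix: each hazard is scored on graded
# impact and likelihood scales (here 1-5), and the product ranks
# hazards for mitigation planning. Hazards and scores are examples.
hazards = {
    "road traffic collision": {"impact": 3, "likelihood": 5},
    "bombing/blast event": {"impact": 5, "likelihood": 2},
    "disorderly conduct": {"impact": 2, "likelihood": 4},
}

def rank_hazards(hazards):
    # Risk score = impact * likelihood; highest score first.
    scored = [(name, v["impact"] * v["likelihood"]) for name, v in hazards.items()]
    return sorted(scored, key=lambda item: item[1], reverse=True)

ranking = rank_hazards(hazards)
```

The ranking then feeds the mitigation recommendations: the highest-scoring hazards get preparedness plans and preventative measures first.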

  20. [The key problems in the population exposure assessment of hazardous chemicals accidents].

    PubMed

    Pan, L J; Liu, F P; Zhang, X; Bai, X T; Shi, X M

    2016-07-06

    Serious accidents involving hazardous chemicals can cause a variety of acute or chronic impairments in human health. The effects of hazardous chemicals on human health can be identified by carrying out population exposure assessments. Through analyzing domestic and overseas population exposure assessment cases related to hazardous chemicals accidents, we conclude that the basis and key steps of population exposure assessment are to identify the characteristics of the chemicals, delimit the area and the population exposed to the chemicals, and collect data on the monitored chemicals and on population health in the polluted area.

  1. Review of Natural Phenomena Hazard (NPH) Assessments for the Hanford 200 Areas (Non-Seismic)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Snow, Robert L.; Ross, Steven B.; Sullivan, Robin S.

    2010-09-24

    The purpose of this review is to assess the need for updating Natural Phenomena Hazard (NPH) assessments for the Hanford 200 Areas, as required by DOE Order 420.1B Chapter IV, Natural Phenomena Hazards Mitigation, based on significant changes in state-of-the-art NPH assessment methodology or site-specific information. The review includes all natural phenomena hazards with the exception of seismic/earthquake hazards, which are being addressed under a separate effort. It was determined that the existing non-seismic NPH assessments are consistent with current design methodology and site-specific data.

  2. U.S. states and territories national tsunami hazard assessment, historic record and sources for waves

    NASA Astrophysics Data System (ADS)

    Dunbar, P. K.; Weaver, C.

    2007-12-01

    In 2005, the U.S. National Science and Technology Council (NSTC) released a joint report by the Subcommittee on Disaster Reduction and the U.S. Group on Earth Observations titled Tsunami Risk Reduction for the United States: A Framework for Action (Framework). The Framework outlines the President's strategy for reducing the United States tsunami risk. The first specific action called for in the Framework is to "Develop standardized and coordinated tsunami hazard and risk assessments for all coastal regions of the United States and its territories." Since NOAA is the lead agency for providing tsunami forecasts and warnings and NOAA's National Geophysical Data Center (NGDC) catalogs information on global historic tsunamis, NOAA/NGDC was asked to take the lead in conducting the first national tsunami hazard assessment. Earthquakes or earthquake-generated landslides caused more than 85% of the tsunamis in the NGDC tsunami database. Since the United States Geological Survey (USGS) conducts research on earthquake hazards facing all of the United States and its territories, NGDC and USGS partnered to conduct the first tsunami hazard assessment for the United States and its territories. A complete tsunami hazard and risk assessment consists of a hazard assessment, an exposure and vulnerability assessment of buildings and people, and a loss assessment. This report is an interim step towards a tsunami risk assessment. The goal of this report is to provide a qualitative assessment of the United States tsunami hazard at the national level. Two different methods are used to assess the U.S. tsunami hazard. The first method involves a careful examination of the NGDC historical tsunami database. This resulted in a qualitative national tsunami hazard assessment based on the distribution of runup heights and the frequency of runups. 
Although tsunami deaths are a measure of risk rather than hazard, the known tsunami deaths found in the NGDC database search were compared with the

  3. Long-term volcanic hazard assessment on El Hierro (Canary Islands)

    NASA Astrophysics Data System (ADS)

    Becerril, L.; Bartolini, S.; Sobradelo, R.; Martí, J.; Morales, J. M.; Galindo, I.

    2014-07-01

    Long-term hazard assessment, one of the bastions of risk-mitigation programs, is required for land-use planning and for developing emergency plans. To ensure quality and representative results, long-term volcanic hazard assessment requires several sequential steps to be completed, which include the compilation of geological and volcanological information, the characterisation of past eruptions, spatial and temporal probabilistic studies, and the simulation of different eruptive scenarios. Although the Canary Islands are a densely populated active volcanic region that receives millions of visitors per year, no systematic hazard assessment has ever been conducted there. In this paper we focus our attention on El Hierro, the youngest of the Canary Islands and the most recently affected by an eruption. We analyse the past eruptive activity to determine the spatial and temporal probability, and the likely style, of a future eruption on the island, i.e. the where, when and how. By studying the past eruptive behaviour of the island and assuming that future eruptive patterns will be similar, we aim to identify the most likely volcanic scenarios and corresponding hazards, which include lava flows, pyroclastic fallout and pyroclastic density currents (PDCs). Finally, we estimate their probability of occurrence. The end result, through the combination of the most probable scenarios (lava flows, pyroclastic density currents and ashfall), is the first qualitative integrated volcanic hazard map of the island.

  4. Space vehicle propulsion systems: Environmental space hazards

    NASA Technical Reports Server (NTRS)

    Disimile, P. J.; Bahr, G. K.

    1990-01-01

    The hazards that exist in geolunar space which may degrade, disrupt, or terminate the performance of space-based LOX/LH2 rocket engines are evaluated. Accordingly, a summary of the open literature pertaining to the geolunar space hazards is provided. Approximately 350 citations and about 200 documents and abstracts were reviewed; the documents selected give current and quantitative detail. The methodology was to categorize the various space hazards in relation to their importance in specified regions of geolunar space. Additionally, the effects of the various space hazards on spacecraft and their systems were investigated. It was found that further investigation of the literature would be required to assess the effects of these hazards on propulsion systems per se; in particular, possible degrading effects on exterior nozzle structure, directional gimbals, and internal combustion chamber integrity and geometry.

  5. Assessment of hazards and risks for landscape protection planning in Sicily.

    PubMed

    La Rosa, Daniele; Martinico, Francesco

    2013-09-01

    Landscape protection planning is a complex task that requires an integrated assessment and involves heterogeneous issues. These issues include not only the management of a considerable amount of data to describe landscape features but also the choice of appropriate tools to evaluate the hazards and risks. The landscape assessment phase can provide fundamental information for the definition of a Landscape Protection Plan, in which the selection of norms for protection or rehabilitation is strictly related to the hazards, values and risks that are found. This paper describes a landscape assessment methodology conducted by using GIS, concerning landscape hazards, values and risks. Four hazard categories concerning urban sprawl and erosion are introduced and assessed: landscape transformations by new planned developments, intensification of urban sprawl patterns, loss of agricultural land, and erosion. Landscape value is evaluated by using different thematic layers overlaid with GIS geoprocessing. The risk of loss of landscape value is evaluated with reference to the potential occurrence of the previously assessed hazards. The case study is the Province of Enna (Sicily), where landscape protection is a relevant issue because of the importance of cultural and natural heritage. Results show that high value landscape features have a low risk of loss of landscape value. For this reason, landscape protection policies assume a relevant role in landscapes with low-medium values and they should be addressed to control the urban sprawl processes that are beginning in the area. Copyright © 2012 Elsevier Ltd. All rights reserved.

  6. Assessing qualitative long-term volcanic hazards at Lanzarote Island (Canary Islands)

    NASA Astrophysics Data System (ADS)

    Becerril, Laura; Martí, Joan; Bartolini, Stefania; Geyer, Adelina

    2017-07-01

    Conducting long-term hazard assessment in active volcanic areas is of primary importance for land-use planning and for defining emergency plans that can be applied in case of a crisis. The definition of scenario hazard maps helps to mitigate the consequences of future eruptions by anticipating the events that may occur. Lanzarote is an active volcanic island that has hosted the largest (> 1.5 km³ DRE) and longest (6 years) eruption on the Canary Islands in historical times (the last 600 years): the Timanfaya eruption (1730-1736). This eruption brought severe economic losses and forced local people to migrate. In spite of all these facts, no comprehensive hazard assessment or hazard maps have been developed for the island. In this work, we present an integrated long-term volcanic hazard evaluation using a systematic methodology that includes spatial analysis and simulations of the most probable eruptive scenarios.

  7. Lost in translation? The hazards of applying social constructionism to quantitative research on sexual orientation development.

    PubMed

    Robboy, Caroline Alex

    2002-01-01

    This article explores the hazards faced by social constructionists who attempt to conduct quantitative research on sexual orientation development. By critically reviewing two quantitative research studies, this article explores the ways in which the very nature of social constructionist arguments may be incongruous with the methodological requirements of quantitative studies. I suggest this conflict is a result of the differing natures of these two modes of scholarly inquiry. While research requires the acceptance of certain analytical categories, the strength of social constructionism comes from its reflexive scrutiny and problematization of those very categories. Ultimately, social constructionists who try to apply their theories/perspectives must necessarily conform to the methodological constraints of quantitative research. The intent of this article is not to suggest that it is futile or self-contradictory for social constructionists to attempt empirical research, but that these are two distinct modes of scholarly inquiry which can, and should, co-exist in a dialectical relationship to each other.

  8. Assessing Perceptions AbouT Hazardous Substances (PATHS): The PATHS questionnaire

    PubMed Central

    Amlôt, Richard; Page, Lisa; Pearce, Julia; Wessely, Simon

    2013-01-01

    How people perceive the nature of a hazardous substance may determine how they respond when potentially exposed to it. We tested a new Perceptions AbouT Hazardous Substances (PATHS) questionnaire. In Study 1 (N = 21), we assessed the face validity of items concerning perceptions about eight properties of a hazardous substance. In Study 2 (N = 2030), we tested the factor structure, reliability and validity of the PATHS questionnaire across four qualitatively different substances. In Study 3 (N = 760), we tested the impact of information provision on Perceptions AbouT Hazardous Substances scores. Our results showed that our eight measures demonstrated good reliability and validity when used for non-contagious hazards. PMID:23104995

  9. Seismic hazard assessment of the Province of Murcia (SE Spain): analysis of source contribution to hazard

    NASA Astrophysics Data System (ADS)

    García-Mayordomo, J.; Gaspar-Escribano, J. M.; Benito, B.

    2007-10-01

    A probabilistic seismic hazard assessment of the Province of Murcia in terms of peak ground acceleration (PGA) and spectral accelerations [SA(T)] is presented in this paper. In contrast to most of the previous studies in the region, which were performed for PGA making use of intensity-to-PGA relationships, hazard is here calculated in terms of magnitude and using European spectral ground-motion models. Moreover, we have considered the most important faults in the region as specific seismic sources, and also comprehensively reviewed the earthquake catalogue. Hazard calculations are performed following the Probabilistic Seismic Hazard Assessment (PSHA) methodology using a logic tree, which accounts for three different seismic source zonings and three different ground-motion models. Hazard maps in terms of PGA and SA(0.1, 0.2, 0.5, 1.0 and 2.0 s) and coefficient of variation (COV) for the 475-year return period are shown. Subsequent analysis is focused on three sites of the province, namely, the cities of Murcia, Lorca and Cartagena, which are important industrial and tourism centres. Results at these sites have been analysed to evaluate the influence of the different input options. The most important factor affecting the results is the choice of the attenuation relationship, whereas the influence of the selected seismic source zonings appears strongly site dependent. Finally, we have performed an analysis of source contribution to hazard at each of these cities to provide preliminary guidance in devising specific risk scenarios. We have found that local source zones control the hazard for PGA and SA(T ≤ 1.0 s), although contribution from specific fault sources and long-distance north Algerian sources becomes significant from SA(0.5 s) onwards.
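The logic-tree combination described in this record can be sketched in miniature. The snippet below is illustrative only: the branch weights and PGA values are hypothetical, not those of the Murcia study, and a full PSHA would weight entire hazard curves rather than scalar PGA values.

```python
# Sketch of a PSHA logic-tree combination (illustrative values only):
# each branch pairs one seismic source zoning with one ground-motion
# model, and the branch weights must sum to 1.

def combine_logic_tree(branches):
    """Weighted mean hazard over logic-tree branches.

    branches: list of (weight, pga) tuples; weights must sum to 1.
    """
    total_weight = sum(w for w, _ in branches)
    assert abs(total_weight - 1.0) < 1e-9, "branch weights must sum to 1"
    return sum(w * pga for w, pga in branches)

# Three hypothetical attenuation-model branches for the 475-year
# return period at one site (PGA in g):
branches = [(0.5, 0.18), (0.3, 0.22), (0.2, 0.15)]
mean_pga = combine_logic_tree(branches)
```

The spread of the branch values around this weighted mean is what a COV map, like the one mentioned above, summarises site by site.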

  10. Hazard assessment for small torrent catchments - lessons learned

    NASA Astrophysics Data System (ADS)

    Eisl, Julia; Huebl, Johannes

    2013-04-01

    The documentation of extreme events as a part of the integral risk management cycle is an important basis for the analysis and assessment of natural hazards. In July 2011 a flood event occurred in the Wölzer valley in the province of Styria, Austria. For this event at the "Wölzerbach" a detailed event documentation was carried out, gathering data about rainfall, runoff and sediment transport as well as information on damaged objects, infrastructure and crops from various sources. The flood was triggered by heavy rainfall in two tributaries of the Wölzer river. Though rain and discharge gauging stations exist for the Wölzer river, the torrents affected by the high-intensity rainfall are ungauged. For these ungauged torrent catchments the common methods for hazard assessment were evaluated. The back-calculation of the rainfall event was done using a new approach for precipitation analysis. In torrent catchments, small-scale, high-intensity rainfall events in particular are responsible for extreme events. Austria's weather surveillance radar is operated by the air traffic service "AustroControl". The usually available dataset is an interpreted product and shows divergences, especially for high-intensity rainfall, so for this study the raw radar data were requested and analysed. Furthermore, the event was back-calculated with different rainfall-runoff models, hydraulic models and sediment transport models to obtain calibration parameters for future use in hazard assessment in this region. Since woody debris often causes problems, different scenarios were simulated. The calibrated and plausible results from the runoff models were used for comparison with empirical approaches used in practice. For the planning of mitigation measures at the Schöttl torrent, one of the affected tributaries of the Wölzer river, a physical scale model was used in addition to the insights of the event analysis to design a check dam

  11. Health Risk Assessment on Hazardous Ingredients in Household Deodorizing Products

    PubMed Central

    Lee, Minjin; Kim, Joo-Hyon; Lee, Daeyeop; Kim, Jaewoo; Lim, Hyunwoo; Seo, Jungkwan; Park, Young-Kwon

    2018-01-01

    The inhalation of a water aerosol from a humidifier containing disinfectants has led to serious lung injuries in Korea. To promote the safe use of products, the Korean government enacted regulations on the chemicals in various consumer products that could have adverse health effects. Given the concern over the potential health risks associated with the hazardous ingredients in deodorizing consumer products, 17 ingredients in 47 deodorizing products were analyzed, and their health risks were assessed for three groups defined by application type. The risk assessment study followed a stepwise procedure (e.g., collecting toxicological information, hazard identification/exposure assessment, and screening and detailed assessment for inhalation and dermal routes). The worst-case scenario and maximum concentration determined by the product purpose and application type were used for the screening assessment. In the detailed assessment, the 75th-percentile exposure factor values were used to estimate a reasonable exposure to the ingredients. The exposure concentrations of seven ingredients were calculated. Due to limited toxicity information, a detailed assessment was conducted only for butylated hydroxytoluene, for consumer exposure via the dermal route. This study showed that the assessed ingredients pose no health risks at their maximum concentrations in deodorizing products. This approach can be used to establish guidelines for ingredients that may pose inhalation and dermal hazards. PMID:29652814
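Screening-level risk characterisation of this kind typically compares an estimated exposure with a toxicological reference value, often expressed as a hazard quotient. The sketch below uses that generic approach with hypothetical numbers; it is not the paper's exact procedure.

```python
# Screening-level risk characterisation as a hazard quotient (HQ):
# HQ = estimated exposure / reference dose.  HQ < 1 suggests no
# appreciable health risk.  All numbers below are hypothetical.

def hazard_quotient(exposure_mg_per_kg_day, reference_dose_mg_per_kg_day):
    """Ratio of estimated daily exposure to the toxicological reference dose."""
    return exposure_mg_per_kg_day / reference_dose_mg_per_kg_day

# Hypothetical dermal exposure to one ingredient vs. its reference dose:
hq = hazard_quotient(0.002, 0.05)   # HQ = 0.04 -> below the level of concern
```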

  12. Hazard Identification, Risk Assessment, and Control Measures as an Effective Tool of Occupational Health Assessment of Hazardous Process in an Iron Ore Pelletizing Industry.

    PubMed

    Rout, B K; Sikdar, B K

    2017-01-01

    With the growing number of iron ore pelletization industries in India, various impacts on environment and health in relation to the workplace will rise. Therefore, understanding the hazardous process is crucial in the development of effective control measures. Hazard Identification, Risk Assessment, and Control measures (HIRAC) acts as an effective tool of Occupational Health Assessment. The aim of the study was to identify all the possible hazards at different workplaces of an iron ore pelletizing industry, to conduct an occupational health risk assessment, to calculate the risk rating based on the risk matrix, and to compare the risk rating before and after the control measures. The research was a cross-sectional study done from March to December 2015 in an iron ore pelletizing industry located in Odisha, India. Data from the survey were collected by inspecting the workplace, recording responses of employees regarding possible hazards in their workplace, and reviewing the department procedure manual, work instructions, standard operating procedures, previous incident reports, material safety data sheets, the first aid/injury register, and the health records of employees. A total of 116 hazards were identified. Results of the paired-samples t-test showed that the mean risk rating differed before control measures (M = 9.13, SD = 5.99) and after control measures (M = 2.80, SD = 1.38) at the 0.0001 level of significance (t = 12.6428, df = 115, N = 116, P < 0.0001, 95% CI for mean difference 5.34 to 7.32). On average, the risk rating was about 6.33 points lower after control measures were taken. The hazards with a high risk rating and above were reduced to a level considered As Low As Reasonably Practicable (ALARP) when the control measures were applied, thereby reducing the occurrence of injury or disease in the workplace.
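The before/after comparison in this record can be reproduced with a paired-samples t-test. A minimal pure-Python sketch, using made-up risk ratings rather than the study's data:

```python
import math

def paired_t(before, after):
    """Paired-samples t statistic and degrees of freedom for
    before/after risk ratings (likelihood x severity scores)."""
    assert len(before) == len(after)
    n = len(before)
    diffs = [b - a for b, a in zip(before, after)]
    mean_d = sum(diffs) / n
    # sample standard deviation of the paired differences
    sd = math.sqrt(sum((d - mean_d) ** 2 for d in diffs) / (n - 1))
    return mean_d / (sd / math.sqrt(n)), n - 1

# Hypothetical risk ratings for three hazards, before and after controls:
before = [12, 9, 16]
after = [4, 2, 6]
t, df = paired_t(before, after)
```

A large positive t with small P, as in the study, indicates that the drop in risk ratings after controls is unlikely to be due to chance.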

  13. Volcanic hazard assessment in western Europe

    NASA Astrophysics Data System (ADS)

    Chester, David K.; Dibben, Christopher J. L.; Duncan, Angus M.

    2002-06-01

    Volcanology has been in the past and in many respects remains a subject dominated by pure research grounded in the earth sciences. Over the past 30 years a paradigm shift has occurred in hazard assessment which has been aided by significant changes in the social theory of natural hazards and the first-hand experience gained in the 1990s by volcanologists working on projects conceived during the International Decade for Natural Disaster Reduction (IDNDR). Today much greater stress is placed on human vulnerability, the potential for marginalisation of disadvantaged individuals and social groups, and the requirement to make applied volcanology sensitive to the characteristics of local demography, economy, culture and politics. During the IDNDR a methodology, broadly similar to environmental impact analysis, has emerged as the preferred method for studying human vulnerability and risk assessment in volcanically active regions. The characteristics of this new methodology are discussed and the progress which has been made in innovating it on the European Union laboratory volcanoes located in western Europe is reviewed. Furnas (São Miguel, Azores) and Vesuvius in Italy are used as detailed case studies.

  14. Elevation uncertainty in coastal inundation hazard assessments

    USGS Publications Warehouse

    Gesch, Dean B.; Cheval, Sorin

    2012-01-01

    Coastal inundation has been identified as an important natural hazard that affects densely populated and built-up areas (Subcommittee on Disaster Reduction, 2008). Inundation, or coastal flooding, can result from various physical processes, including storm surges, tsunamis, intense precipitation events, and extreme high tides. Such events cause quickly rising water levels. When rapidly rising water levels overwhelm flood defenses, especially in heavily populated areas, the potential of the hazard is realized and a natural disaster results. Two noteworthy recent examples of such natural disasters resulting from coastal inundation are the Hurricane Katrina storm surge in 2005 along the Gulf of Mexico coast in the United States, and the tsunami in northern Japan in 2011. Longer term, slowly varying processes such as land subsidence (Committee on Floodplain Mapping Technologies, 2007) and sea-level rise also can result in coastal inundation, although such conditions do not have the rapid water level rise associated with other flooding events. Geospatial data are a critical resource for conducting assessments of the potential impacts of coastal inundation, and geospatial representations of the topography in the form of elevation measurements are a primary source of information for identifying the natural and human components of the landscape that are at risk. Recently, the quantity and quality of elevation data available for the coastal zone have increased markedly, and this availability facilitates more detailed and comprehensive hazard impact assessments.

  15. A spatiotemporal multi-hazard exposure assessment based on property data

    NASA Astrophysics Data System (ADS)

    Fuchs, Sven; Keiler, Margreth; Zischg, Andreas

    2016-04-01

    The paper presents a nation-wide spatially explicit object-based assessment of buildings and citizens exposed to natural hazards in Austria, including river flooding, torrential flooding, and snow avalanches. The assessment was based on two different datasets: (a) hazard information providing input to the exposure of elements at risk, and (b) information on the building stock combined from different spatial data available on the national level. Hazard information was compiled from two different sources. For torrential flooding and snow avalanches available local-scale hazard maps were used, and for river flooding the results of the countrywide flood modelling eHORA were available. Information on the building stock contained the location and size of each building, as well as the building category and the construction period. Additional information related to the individual floors, such as their height and net area, main purpose and configuration, was included for each property. Moreover, this dataset has an interface to the population register and therefore allowed retrieving the number of primary residents for each building. With the exception of sacral buildings, an economic module was used to compute the monetary value of buildings using (a) the information of the building register such as building type, number of storeys and utilisation, and (b) regionally averaged construction costs. It is shown that the repeatedly stated assumption of increasing exposure due to continued population growth and the related increase in assets has to be carefully evaluated against the local development of the building stock. While some regions have shown a clearly above-average increase in assets, other regions were characterised by a below-average development. This mirrors the topography of the country, but also the different economic activities. While hotels and hostels are extraordinarily prone to torrential flooding, commercial buildings as well as buildings used for
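The economic module described in this record can be sketched as follows; the floor area, number of storeys and unit construction cost are hypothetical values, not figures from the Austrian building register.

```python
# Sketch of the economic module: monetary (replacement) value of a
# building from register attributes and a regionally averaged
# construction cost.  All inputs below are hypothetical.

def building_value(net_area_m2, storeys, unit_cost_eur_m2):
    """Replacement value = net floor area per storey x number of
    storeys x regionally averaged construction cost (EUR/m2).
    Sacral buildings were excluded in the study."""
    return net_area_m2 * storeys * unit_cost_eur_m2

# Hypothetical two-storey residential building:
value = building_value(120.0, 2, 1800.0)
```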

  16. A critical analysis of hazard resilience measures within sustainability assessment frameworks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matthews, Elizabeth C., E-mail: echiso1@lsu.edu; Sattler, Meredith, E-mail: msattler@lsu.edu; Friedland, Carol J., E-mail: friedland@lsu.edu

    Today, numerous sustainability assessment frameworks (SAFs) exist to guide designers in achieving sustainable performance in the design of structures and communities. SAFs are beneficial in educating users and are useful tools for incorporating sustainability strategies into planning, design, and construction; however, there is currently a substantial gap in the ability of existing SAFs to incorporate hazard resistance and hazard mitigation in the broader context of sustainable design. This paper analyzes the incorporation of hazard resistant design and hazard mitigation strategies within SAFs via a multi-level analysis of eleven SAFs. The SAFs analyzed range in scale of application (i.e. building, site, community). Three levels of analysis are presented: (1) macro-level analysis comparing the number of measures strictly addressing resilience versus sustainability, (2) meso-level analysis of the coverage of types of hazards within SAFs (e.g. flood, fire), and (3) micro-level analysis of SAF measures connected to flood-related hazard resilience. The results demonstrate that hazard resistance and hazard mitigation do not figure prominently in the intent of SAFs and that weaknesses in resilience coverage exist that have the potential to lead to the design of structures and communities that are still highly vulnerable to the impacts of extreme events.
    Highlights:
    • Sustainability assessment frameworks (SAFs) were analyzed for resilience coverage.
    • Hazard resistance and mitigation do not figure prominently in the intent of SAFs.
    • Approximately 75% of SAFs analyzed address three or fewer hazards.
    • Lack of economic measures within SAFs could impact resilience and sustainability.
    • Resilience measures for flood hazards are not consistently included in SAFs.

  17. Combining heuristic and statistical techniques in landslide hazard assessments

    NASA Astrophysics Data System (ADS)

    Cepeda, Jose; Schwendtner, Barbara; Quan, Byron; Nadim, Farrokh; Diaz, Manuel; Molina, Giovanni

    2014-05-01

    As a contribution to the Global Assessment Report 2013 - GAR2013, coordinated by the United Nations International Strategy for Disaster Reduction - UNISDR, a drill-down exercise for landslide hazard assessment was carried out by entering the results of both heuristic and statistical techniques into a new but simple combination rule. The data available for this evaluation included landslide inventories, both historical and event-based. In addition to the application of a heuristic method used in the previous editions of GAR, the availability of inventories motivated the use of statistical methods. The heuristic technique is largely based on the Mora & Vahrson method, which estimates hazard as the product of susceptibility and triggering factors, where classes are weighted based on expert judgment and experience. Two statistical methods were also applied: the landslide index method, which estimates weights of the classes for the susceptibility and triggering factors based on the evidence provided by the density of landslides in each class of the factors; and the weights of evidence method, which extends the previous technique to include both positive and negative evidence of landslide occurrence in the estimation of weights for the classes. One key aspect during the hazard evaluation was the decision on the methodology to be chosen for the final assessment. Instead of opting for a single methodology, it was decided to combine the results of the three implemented techniques using a combination rule based on a normalization of the results of each method. The hazard evaluation was performed for both earthquake- and rainfall-induced landslides. The country chosen for the drill-down exercise was El Salvador. The results indicate that highest hazard levels are concentrated along the central volcanic chain and at the centre of the northern mountains.
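The landslide index method mentioned in this record weights each factor class by comparing its landslide density with the map-wide density. A minimal sketch with a hypothetical inventory (the counts and areas are invented for illustration):

```python
import math

def landslide_index_weight(slides_in_class, area_in_class,
                           slides_total, area_total):
    """Landslide index weight for one factor class:
    ln(landslide density in the class / map-wide landslide density).
    Positive weights mark classes with above-average landslide density."""
    class_density = slides_in_class / area_in_class
    map_density = slides_total / area_total
    return math.log(class_density / map_density)

# Hypothetical inventory: 30 of 100 landslides fall in a slope class
# covering 10% of the study area, i.e. 3x the map-average density.
w = landslide_index_weight(30, 10.0, 100, 100.0)
```

The weights-of-evidence method extends this idea by also scoring the absence of landslides outside the class; summing class weights over all factors gives a susceptibility score per map unit.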

  18. Probabilistic seismic hazard assessment for northern Southeast Asia

    NASA Astrophysics Data System (ADS)

    Chan, C. H.; Wang, Y.; Kosuwan, S.; Nguyen, M. L.; Shi, X.; Sieh, K.

    2016-12-01

    We assess seismic hazard for northern Southeast Asia by constructing an earthquake and fault database, conducting a series of ground-shaking scenarios and proposing regional seismic hazard maps. Our earthquake database contains earthquake parameters from global and local seismic catalogues, including the ISC, ISC-GEM and global ANSS Comprehensive Catalogues, the Seismological Bureau of the Thai Meteorological Department, Thailand, and the Institute of Geophysics, Vietnam Academy of Science and Technology, Vietnam. To harmonize the earthquake parameters from the various catalogue sources, we remove duplicate events and unify the magnitudes into the same scale. Our active fault database includes active fault data from previous studies, e.g. the active fault parameters determined by Wang et al. (2014), the Department of Mineral Resources, Thailand, and the Institute of Geophysics, Vietnam Academy of Science and Technology, Vietnam. Based on the parameters from analysis of the databases (i.e., the Gutenberg-Richter relationship, slip rate, maximum magnitude and time elapsed since the last events), we determined the earthquake recurrence models of the seismogenic sources. To evaluate ground-shaking behaviour in different tectonic regimes, we conducted a series of tests matching the felt intensities of historical earthquakes to the ground motions modelled using ground motion prediction equations (GMPEs). Incorporating the best-fitting GMPEs and site conditions, we assessed the probabilistic seismic hazard. The highest seismic hazard is in the region close to the Sagaing Fault, which cuts through some major cities in central Myanmar. The northern segment of the Sunda megathrust, which could potentially cause an M8-class earthquake, brings significant hazard along the western coast of Myanmar and eastern Bangladesh. In addition, we find a notable hazard level in northern Vietnam and at the boundary between Myanmar, Thailand and Laos, due to a series of strike-slip faults, which could
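The recurrence-modelling step in this record can be illustrated with the Gutenberg-Richter relationship and a Poisson exceedance probability. The a and b values below are hypothetical, not parameters from the study.

```python
import math

def annual_rate(a, b, m):
    """Annual rate of events with magnitude >= m from the
    Gutenberg-Richter relationship: log10 N = a - b*m."""
    return 10 ** (a - b * m)

def poisson_exceedance(rate, years):
    """Probability of at least one event in the given time window,
    assuming a Poisson (memoryless) occurrence process."""
    return 1.0 - math.exp(-rate * years)

# Hypothetical source with a = 4.0, b = 1.0, i.e. one M>=6 event
# per 100 years on average:
rate_m6 = annual_rate(4.0, 1.0, 6.0)      # 0.01 per year
p50 = poisson_exceedance(rate_m6, 50.0)   # ~0.39 in 50 years
```

Time-dependent recurrence models, which use the elapsed time since the last event as the study does for faults, replace the memoryless Poisson assumption with a renewal process.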

  19. Risk Assessment Methodology for Hazardous Waste Management (1998)

    EPA Pesticide Factsheets

    A methodology is described for systematically assessing and comparing the risks to human health and the environment of hazardous waste management alternatives. The methodology selects and links appropriate models and techniques for performing the process.

  20. Debris flows: behavior and hazard assessment

    USGS Publications Warehouse

    Iverson, Richard M.

    2014-01-01

    Debris flows are water-laden masses of soil and fragmented rock that rush down mountainsides, funnel into stream channels, entrain objects in their paths, and form lobate deposits when they spill onto valley floors. Because they have volumetric sediment concentrations that exceed 40 percent, maximum speeds that surpass 10 m/s, and sizes that can range up to ~10⁹ m³, debris flows can denude slopes, bury floodplains, and devastate people and property. Computational models can accurately represent the physics of debris-flow initiation, motion and deposition by simulating evolution of flow mass and momentum while accounting for interactions of debris' solid and fluid constituents. The use of physically based models for hazard forecasting can be limited by imprecise knowledge of initial and boundary conditions and material properties, however. Therefore, empirical methods continue to play an important role in debris-flow hazard assessment.

  1. Updating Parameters for Volcanic Hazard Assessment Using Multi-parameter Monitoring Data Streams And Bayesian Belief Networks

    NASA Astrophysics Data System (ADS)

    Odbert, Henry; Aspinall, Willy

    2014-05-01

    Evidence-based hazard assessment at volcanoes assimilates knowledge about the physical processes of hazardous phenomena and observations that indicate the current state of a volcano. Incorporating both these lines of evidence can inform our belief about the likelihood (probability) and consequences (impact) of possible hazardous scenarios, forming a basis for formal quantitative hazard assessment. However, such evidence is often uncertain, indirect or incomplete. Approaches to volcano monitoring have advanced substantially in recent decades, increasing the variety and resolution of multi-parameter timeseries data recorded at volcanoes. Interpreting these multiple strands of parallel, partial evidence thus becomes increasingly complex. In practice, interpreting many timeseries requires an individual to be familiar with the idiosyncrasies of the volcano, monitoring techniques, configuration of recording instruments, observations from other datasets, and so on. In making such interpretations, an individual must consider how different volcanic processes may manifest as measurable observations, and then infer from the available data what can or cannot be deduced about those processes. We examine how parts of this process may be synthesised algorithmically using Bayesian inference. Bayesian Belief Networks (BBNs) use probability theory to treat and evaluate uncertainties in a rational and auditable scientific manner, but only to the extent warranted by the strength of the available evidence. The concept is a suitable framework for marshalling multiple strands of evidence (e.g. observations, model results and interpretations) and their associated uncertainties in a methodical manner. BBNs are usually implemented in graphical form and could be developed as a tool for near real-time, ongoing use in a volcano observatory, for example. We explore the application of BBNs in analysing volcanic data from the long-lived eruption at Soufriere Hills Volcano, Montserrat. We discuss
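The core of a BBN node update is Bayes' rule. A single-node sketch with a hypothetical prior and likelihoods (not values from the Soufriere Hills analysis):

```python
# One Bayesian update, the building block of a BBN node: revise the
# probability of a hazardous state given one monitoring observation.
# All probabilities below are hypothetical.

def bayes_update(prior, p_obs_given_h, p_obs_given_not_h):
    """Posterior P(H | observation) for a binary hypothesis H,
    via Bayes' rule with the law of total probability as evidence."""
    evidence = prior * p_obs_given_h + (1 - prior) * p_obs_given_not_h
    return prior * p_obs_given_h / evidence

# Hypothetical: prior belief of escalating unrest is 10%; an elevated
# gas flux is seen in 80% of escalation episodes but only 20% of
# quiet periods.
posterior = bayes_update(0.10, 0.8, 0.2)
```

A full BBN chains such updates across many nodes, so that several partial, uncertain monitoring strands jointly revise the belief in each scenario.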

  2. Occurrence and quantitative microbial risk assessment of Cryptosporidium and Giardia in soil and air samples.

    PubMed

    Balderrama-Carmona, Ana Paola; Gortáres-Moroyoqui, Pablo; Álvarez-Valencia, Luis Humberto; Castro-Espinoza, Luciano; Mondaca-Fernández, Iram; Balderas-Cortés, José de Jesús; Chaidez-Quiroz, Cristóbal; Meza-Montenegro, María Mercedes

    2014-09-01

    Cryptosporidium oocysts and Giardia cysts can be transmitted by the fecal-oral route and may cause gastrointestinal parasitic zoonoses. These zoonoses are common in rural zones due to the parasites being harbored in fecally contaminated soil. This study assessed the risk of illness (giardiasis and cryptosporidiosis) from inhaling and/or ingesting soil and/or airborne dust in Potam, Mexico. To assess the risk of infection, Quantitative Microbial Risk Assessment (QMRA) was employed, with the following steps: (1) hazard identification, (2) hazard exposure, (3) dose-response, and (4) risk characterization. Cryptosporidium oocysts and Giardia cysts were observed in 52% and 57%, respectively, of total soil samples (n = 21), and in 60% and 80%, respectively, of air samples (n = 12). The calculated annual risks were higher than 9.9 × 10⁻¹ for both parasites in both types of sample. Soil and air inhalation and/or ingestion are important vehicles for these parasites. To our knowledge, the results obtained in the present study represent the first QMRAs for cryptosporidiosis and giardiasis due to soil and air inhalation/ingestion in Mexico. In addition, this is the first evidence of the microbial air quality around these parasites in rural zones. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.
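Steps (3) and (4) of the QMRA above can be sketched with the exponential dose-response model often used for these parasites. The daily dose and the parameter r = 0.0199 (a value commonly cited for Giardia in the QMRA literature) are illustrative assumptions, not the study's parameters.

```python
import math

def exponential_dose_response(r, dose):
    """Single-exposure infection probability under the exponential
    dose-response model: P = 1 - exp(-r * dose)."""
    return 1.0 - math.exp(-r * dose)

def annual_risk(daily_risk, exposures_per_year=365):
    """Annual infection risk from repeated independent daily exposures."""
    return 1.0 - (1.0 - daily_risk) ** exposures_per_year

# Hypothetical exposure: 2 Giardia cysts ingested/inhaled per day,
# with r = 0.0199 assumed for the dose-response parameter.
daily = exponential_dose_response(0.0199, 2.0)
yearly = annual_risk(daily)
```

Even a modest daily risk compounds to a near-certain annual risk, which is consistent with the very high (> 9.9 × 10⁻¹) annual risks reported in the record.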

  3. Application of a Cloud Model-Set Pair Analysis in Hazard Assessment for Biomass Gasification Stations.

    PubMed

    Yan, Fang; Xu, Kaili

    2017-01-01

    Because a biomass gasification station involves various hazard factors, hazard assessment is both necessary and significant. In this article, the cloud model (CM) is employed to improve set pair analysis (SPA), and a novel hazard assessment method for a biomass gasification station is proposed based on the cloud model-set pair analysis (CM-SPA). In this method, cloud weight is proposed as the weight of each index. In contrast to the index weights of other methods, cloud weight is expressed by cloud descriptors; hence, the randomness and fuzziness of cloud weight make it effective in reflecting the linguistic variables of experts. Then, the cloud connection degree (CCD) is proposed to replace the connection degree (CD), and the calculation algorithm of the CCD is worked out. By utilizing the CCD, the hazard assessment results are expressed by normal clouds reflected by cloud descriptors, and the hazard grade is confirmed by analyzing the cloud descriptors. After that, two biomass gasification stations undergo hazard assessment via CM-SPA and AHP-based SPA, respectively. The comparison of assessment results illustrates that CM-SPA is suitable and effective for the hazard assessment of a biomass gasification station and makes the assessment results more reasonable and scientific.

  5. Applying high resolution remote sensing image and DEM to falling boulder hazard assessment

    NASA Astrophysics Data System (ADS)

    Huang, Changqing; Shi, Wenzhong; Ng, K. C.

    2005-10-01

    Boulder fall hazard assessment generally requires information about the boulders themselves. The conventional approach, extensive mapping and surveying fieldwork, is time-consuming, laborious and dangerous. This paper therefore proposes applying image processing technology to extract boulders from high resolution remote sensing imagery and to assess boulder fall hazard. The method can replace the conventional approach and extract boulder information with high accuracy, including boulder size, shape and height, as well as the slope and aspect of each boulder's position. With this boulder information, the requirements for assessing, preventing and mitigating boulder fall hazard can be satisfied.
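
    The slope and aspect of a boulder's position can be derived from a DEM with finite differences. A minimal sketch on a synthetic tilted plane (the grid's y axis is assumed to point north):

```python
import numpy as np

def slope_aspect(dem, cell_size=1.0):
    """Slope (degrees) and aspect (degrees clockwise from north) from a
    DEM raster, using finite-difference gradients."""
    dz_dy, dz_dx = np.gradient(dem.astype(float), cell_size)
    slope = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))
    # Aspect is the azimuth of the downslope direction, -grad(z).
    aspect = np.degrees(np.arctan2(-dz_dx, -dz_dy)) % 360.0
    return slope, aspect

# Synthetic plane rising 0.5 m per cell eastward: every cell should have
# the same slope and a west-facing (270 degree) aspect.
y, x = np.mgrid[0:50, 0:50]
dem = 0.5 * x
slope, aspect = slope_aspect(dem, cell_size=1.0)
```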

  6. Prioritization of reproductive toxicants in unconventional oil and gas operations using a multi-country regulatory data-driven hazard assessment.

    PubMed

    Inayat-Hussain, Salmaan H; Fukumura, Masao; Muiz Aziz, A; Jin, Chai Meng; Jin, Low Wei; Garcia-Milian, Rolando; Vasiliou, Vasilis; Deziel, Nicole C

    2018-08-01

    Recent trends have witnessed the global growth of unconventional oil and gas (UOG) production. Epidemiologic studies have suggested associations between proximity to UOG operations with increased adverse birth outcomes and cancer, though specific potential etiologic agents have not yet been identified. To perform effective risk assessment of chemicals used in UOG production, the first step of hazard identification followed by prioritization specifically for reproductive toxicity, carcinogenicity and mutagenicity is crucial in an evidence-based risk assessment approach. To date, there is no single hazard classification list based on the United Nations Globally Harmonized System (GHS), with countries applying the GHS standards to generate their own chemical hazard classification lists. A current challenge for chemical prioritization, particularly for a multi-national industry, is inconsistent hazard classification which may result in misjudgment of the potential public health risks. We present a novel approach for hazard identification followed by prioritization of reproductive toxicants found in UOG operations using publicly available regulatory databases. GHS classification for reproductive toxicity of 157 UOG-related chemicals identified as potential reproductive or developmental toxicants in a previous publication was assessed using eleven governmental regulatory agency databases. If there was discordance in classifications across agencies, the most stringent classification was assigned. Chemicals in the category of known or presumed human reproductive toxicants were further evaluated for carcinogenicity and germ cell mutagenicity based on government classifications. A scoring system was utilized to assign numerical values for reproductive health, cancer and germ cell mutation hazard endpoints. Using a Cytoscape analysis, both qualitative and quantitative results were presented visually to readily identify high priority UOG chemicals with evidence of multiple
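
    The "most stringent classification wins" rule and the additive endpoint scoring described above can be sketched as follows; the category ordering and score values are hypothetical stand-ins, not the paper's actual scoring system.

```python
# Hypothetical severity ordering and scores for GHS reproductive-toxicity
# categories ("NC" = not classified); lower order index = more stringent.
GHS_ORDER = {"1A": 0, "1B": 1, "2": 2, "NC": 3}
SCORES = {"1A": 3, "1B": 2, "2": 1, "NC": 0}

def most_stringent(classifications):
    """Resolve discordant agency classifications to the most stringent one."""
    return min(classifications, key=lambda c: GHS_ORDER[c])

def hazard_score(repro, cancer, mutagen):
    """Combine endpoint classifications into an additive priority score."""
    return SCORES[repro] + SCORES[cancer] + SCORES[mutagen]

repro_class = most_stringent(["2", "1B", "NC"])   # three agencies disagree
priority = hazard_score(repro_class, "2", "NC")
```

    Taking the minimum over the severity ordering implements the stringency rule; chemicals with high combined scores across the three endpoints would surface as priorities.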

  7. Volcanic hazards at distant critical infrastructure: A method for bespoke, multi-disciplinary assessment

    NASA Astrophysics Data System (ADS)

    Odbert, H. M.; Aspinall, W.; Phillips, J.; Jenkins, S.; Wilson, T. M.; Scourse, E.; Sheldrake, T.; Tucker, P.; Nakeshree, K.; Bernardara, P.; Fish, K.

    2015-12-01

    Societies rely on critical services such as power, water, transport networks and manufacturing. Infrastructure may be sited to minimise exposure to natural hazards but not all can be avoided. The probability of long-range transport of a volcanic plume to a site is comparable to other external hazards that must be considered to satisfy safety assessments. Recent advances in numerical models of plume dispersion and stochastic modelling provide a formalized and transparent approach to probabilistic assessment of hazard distribution. To understand the risks to critical infrastructure far from volcanic sources, it is necessary to quantify their vulnerability to different hazard stressors. However, infrastructure assets (e.g. power plants and operational facilities) are typically complex systems in themselves, with interdependent components that may differ in susceptibility to hazard impact. Usually, such complexity means that risk either cannot be estimated formally or that unsatisfactory simplifying assumptions are prerequisite to building a tractable risk model. We present a new approach to quantifying risk by bridging expertise of physical hazard modellers and infrastructure engineers. We use a joint expert judgment approach to determine hazard model inputs and constrain associated uncertainties. Model outputs are chosen on the basis of engineering or operational concerns. The procedure facilitates an interface between physical scientists, with expertise in volcanic hazards, and infrastructure engineers, with insight into vulnerability to hazards. The result is a joined-up approach to estimating risk from low-probability hazards to critical infrastructure. We describe our methodology and show preliminary results for vulnerability to volcanic hazards at a typical UK industrial facility. We discuss our findings in the context of developing bespoke assessment of hazards from distant sources in collaboration with key infrastructure stakeholders.

  8. A Probabilistic Tsunami Hazard Study of the Auckland Region, Part I: Propagation Modelling and Tsunami Hazard Assessment at the Shoreline

    NASA Astrophysics Data System (ADS)

    Power, William; Wang, Xiaoming; Lane, Emily; Gillibrand, Philip

    2013-09-01

    Regional source tsunamis represent a potentially devastating threat to coastal communities in New Zealand, yet are infrequent events for which little historical information is available. It is therefore essential to develop robust methods for quantitatively estimating the hazards posed, so that effective mitigation measures can be implemented. We develop a probabilistic model for the tsunami hazard posed to the Auckland region of New Zealand from the Kermadec Trench and the southern New Hebrides Trench subduction zones. An innovative feature of our model is the systematic analysis of uncertainty regarding the magnitude-frequency distribution of earthquakes in the source regions. The methodology is first used to estimate the tsunami hazard at the coastline, and then used to produce a set of scenarios that can be applied to produce probabilistic maps of tsunami inundation for the study region; the production of these maps is described in part II. We find that the 2,500 year return period regional source tsunami hazard for the densely populated east coast of Auckland is dominated by events originating in the Kermadec Trench, while the equivalent hazard to the sparsely populated west coast is approximately equally due to events on the Kermadec Trench and the southern New Hebrides Trench.
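
    The magnitude-frequency distributions analyzed above are typically Gutenberg-Richter laws, and return periods follow from the annual event rate under a Poisson assumption. A minimal sketch with illustrative a and b values (not fitted to the Kermadec or New Hebrides sources):

```python
import math

def annual_rate(m, a, b):
    """Gutenberg-Richter magnitude-frequency law: annual rate of events
    with magnitude >= m, from log10 N(m) = a - b*m."""
    return 10.0 ** (a - b * m)

def prob_at_least_one(rate, years):
    """Poisson probability of at least one event in a time window."""
    return 1.0 - math.exp(-rate * years)

# Illustrative parameters only.
lam = annual_rate(8.0, a=4.5, b=1.0)   # M>=8 events per year
return_period = 1.0 / lam              # mean recurrence, in years
p_50yr = prob_at_least_one(lam, 50)    # chance of one or more in 50 years
```

    Uncertainty in a and b propagates directly into the rate and hence the return-period hazard, which is why the paper treats the magnitude-frequency distribution systematically.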

  9. Developing a methodology for the national-scale assessment of rainfall-induced landslide hazard in a changing climate

    NASA Astrophysics Data System (ADS)

    Jurchescu, Marta; Micu, Dana; Sima, Mihaela; Bălteanu, Dan; Bojariu, Roxana; Dumitrescu, Alexandru; Dragotă, Carmen; Micu, Mihai; Senzaconi, Francisc

    2017-04-01

    Landslides, together with earthquakes and floods, represent the main natural hazards in Romania, causing major impacts on human activities. The RO-RISK (Disaster Risk Evaluation at a National Level) project is a flagship project aimed at strengthening risk prevention and management in Romania by evaluating - among the specific risks in the country - landslide hazard and risk at a national level. Landslide hazard is defined as "the probability of occurrence within a specified period of time and within a given area of a landslide of a given magnitude" (Varnes 1984; Guzzetti et al. 1999). Nevertheless, most landslide 'hazard' maps consist only of susceptibility (i.e. spatial probability) zonations, without considering temporal or magnitude information on the hazard. This study proposes a methodology for the assessment of landslide hazard at the national scale on a scenario basis, while also considering changes in hazard patterns and levels under climate change conditions. A national landslide database consisting of more than 3,000 records has been analyzed against a meteorological observation dataset in order to assess the relationship between precipitation and landslides. Various extreme climate indices were computed in order to account for the different rainfall patterns able to prepare/trigger landslides (e.g. extreme levels of seasonal rainfall, 3-day rainfall or number of consecutive rainy days with different return periods). In order to derive national rainfall thresholds, i.e. thresholds valid for the diverse climatic environments across the country, values in the parameter maps were rendered comparable by means of normalization with the mean annual precipitation and the rainy-day normal. The hazard assessment builds on a frequency-magnitude relationship. In the current hazard scenario approach, frequency was kept constant for each single map, while the magnitude of the expected geomorphic event was modeled in relation to the distributed magnitude of the triggering factor. Given
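
    The normalization step described above can be sketched in a few lines; the event total, mean annual precipitation (MAP) and rainy-day count below are invented for illustration.

```python
def rainy_day_normal(map_mm, rainy_days_per_year):
    """Average rainfall per rainy day (the 'rainy-day normal')."""
    return map_mm / rainy_days_per_year

def normalize_event(event_mm, map_mm, rainy_days_per_year):
    """Express an event total as a fraction of MAP and in rainy-day-normal
    units, so thresholds derived in different climates become comparable."""
    rdn = rainy_day_normal(map_mm, rainy_days_per_year)
    return event_mm / map_mm, event_mm / rdn

# Illustrative: a 120 mm 3-day event in a 600 mm/yr climate with
# 100 rainy days per year.
frac_of_map, rdn_units = normalize_event(120.0, map_mm=600.0,
                                         rainy_days_per_year=100)
```

    The same absolute rainfall then registers as a far more extreme event in a dry climate than in a wet one, which is the point of the normalization.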

  10. Hazard Identification and Risk Assessment in Water Treatment Plant considering Environmental Health and Safety Practice

    NASA Astrophysics Data System (ADS)

    Falakh, Fajrul; Setiani, Onny

    2018-02-01

    A Water Treatment Plant (WTP) is an important piece of infrastructure for ensuring human health and protecting the environment, and aspects of environmental safety and health are of concern in its development. This case study was conducted at a Water Treatment Plant company in Semarang, Central Java, Indonesia. Hazard identification and risk assessment are one part of the occupational safety and health program at the risk management stage. The purpose of this study was to identify potential hazards using hazard identification methods and to evaluate them using risk assessment methods. The risk assessment uses criteria of accident severity and probability. The assessment identified 22 potential hazards in the water purification process. The extreme categories in the risk assessment are chlorine leakage and industrial fire. Chlorine leakage and fire receive the highest rating because their impact threatens many things: such industrial disasters could endanger human life and the environment. Control measures undertaken to avoid the potential hazards are to apply the use of personal protective equipment; hazards will also be better managed in accordance with the hierarchy of hazard controls and with occupational safety and health programs such as issuing work permits and providing emergency response training, which are very useful in overcoming the potential hazards that have been identified.
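
    A severity-probability risk assessment of this kind reduces to a simple matrix lookup. A generic sketch, with category cut-offs that are illustrative rather than the plant's actual criteria:

```python
def risk_rating(severity, likelihood):
    """Score a hazard on a 5x5 severity-likelihood matrix.
    severity, likelihood: integers 1-5; returns (score, category).
    Cut-offs are generic illustration values."""
    score = severity * likelihood
    if score >= 15:
        category = "extreme"
    elif score >= 8:
        category = "high"
    elif score >= 4:
        category = "moderate"
    else:
        category = "low"
    return score, category

chlorine_leak = risk_rating(severity=5, likelihood=3)
slip_on_stairs = risk_rating(severity=2, likelihood=2)
```

    High-severity, non-negligible-likelihood events such as a chlorine leak land in the extreme cell even when more frequent but minor hazards score lower.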

  11. Seismic Hazard Assessment at Esfarayen‒Bojnurd Railway, North‒East of Iran

    NASA Astrophysics Data System (ADS)

    Haerifard, S.; Jarahi, H.; Pourkermani, M.; Almasian, M.

    2018-01-01

    The objective of this study is to evaluate the seismic hazard at the Esfarayen-Bojnurd railway using the probabilistic seismic hazard assessment (PSHA) method. The assessment was carried out on a recent data set so as to take into account both historic seismicity and updated instrumental seismicity. A homogeneous earthquake catalogue was compiled and a proposed seismic source model was presented. Attenuation equations recently recommended by experts, developed from earthquake data obtained in tectonic environments similar to those in and around the studied area, were weighted and used for the assessment of seismic hazard within a logic tree framework. Considering a grid of 1.2 × 1.2 km covering the study area, ground acceleration was calculated for every node. Hazard maps at bedrock conditions were produced for peak ground acceleration for return periods of 74, 475 and 2475 years.
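
    Mapping a return period onto ground acceleration at each grid node amounts to reading a hazard curve at the annual exceedance rate 1/T. A sketch with an invented hazard curve (the interpolation scheme is a common choice, not necessarily the one used in the study):

```python
import math

def pga_at_return_period(hazard_curve, return_period):
    """Log-log interpolate a hazard curve, given as (pga, annual rate of
    exceedance) pairs, at the target rate 1/return_period."""
    target = 1.0 / return_period
    pts = sorted(hazard_curve)   # pga ascending, hence rate descending
    for (x0, r0), (x1, r1) in zip(pts, pts[1:]):
        if r1 <= target <= r0:
            t = (math.log(target) - math.log(r0)) / (math.log(r1) - math.log(r0))
            return math.exp(math.log(x0) + t * (math.log(x1) - math.log(x0)))
    raise ValueError("return period outside the range of the curve")

# Invented hazard curve for one grid node (PGA in g).
curve = [(0.05, 1e-1), (0.1, 1e-2), (0.2, 1e-3), (0.4, 1e-4)]
pga_475 = pga_at_return_period(curve, 475)
```

    Repeating this lookup at every node for T = 74, 475 and 2475 years yields the three hazard maps.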

  12. Identification of potentially hazardous human gene products in GMO risk assessment.

    PubMed

    Bergmans, Hans; Logie, Colin; Van Maanen, Kees; Hermsen, Harm; Meredyth, Michelle; Van Der Vlugt, Cécile

    2008-01-01

    Genetically modified organisms (GMOs), e.g. viral vectors, could threaten the environment if by their release they spread hazardous gene products. Even in contained use, to prevent adverse consequences, viral vectors carrying genes from mammals or humans should be especially scrutinized as to whether gene products that they synthesize could be hazardous in their new context. Examples of such potentially hazardous gene products (PHGPs) are: protein toxins, products of dominant alleles that have a role in hereditary diseases, gene products and sequences involved in genome rearrangements, gene products involved in immunomodulation or with an endocrine function, gene products involved in apoptosis, activated proto-oncogenes. For contained use of a GMO that carries a construct encoding a PHGP, the precautionary principle dictates that safety measures should be applied on a "worst case" basis, until the risks of the specific case have been assessed. The potential hazard of cloned genes can be estimated before empirical data on the actual GMO become available. Preliminary data may be used to focus hazard identification and risk assessment. Both predictive and empirical data may also help to identify what further information is needed to assess the risk of the GMO. A two-step approach, whereby a PHGP is evaluated for its conceptual dangers, then checked by data bank searches, is delineated here.

  13. Flood hazard, vulnerability, and risk assessment for human life

    NASA Astrophysics Data System (ADS)

    Pan, T.; Chang, T.; Lai, J.; Hsieh, M.; Tan, Y.; Lin, Y.

    2011-12-01

    Flood risk assessment is an important issue for countries suffering tropical cyclones and monsoons. Taiwan is located in the hot zone of typhoon tracks in the Western Pacific, and three to five typhoons make landfall in Taiwan every year. Typhoons and heavy rainfalls often cause inundation disasters, whose impact rises with the increase of population and the development of the social economy. The purpose of this study is to assess flood hazard, vulnerability and risk in terms of human life. Based on the concept that flood risk is composed of flood hazard and vulnerability, an inundation simulation is performed to evaluate the factors of flood hazard for human life according to the base flood (100-year return period). The flood depth, velocity and rising ratio are the three flood hazard factors. Furthermore, the factors of flood vulnerability in terms of human life are classified into two main groups, residents and environment. The sub-factors related to residents are the density of population and the density of vulnerable people, including the elderly, the young and disabled persons. The sub-factors related to environment include the number of building floors, the locations of buildings, and the distance to the rescue center. The analytic hierarchy process (AHP) is adopted to determine the weights of these factors. A risk matrix is applied to show the risk from low to high based on the evaluation of flood hazards and vulnerabilities. The Tseng-Wen River watershed is selected as the case study because a serious flood was induced there by Typhoon Morakot in 2009, which produced a record-breaking rainfall of 2,361 mm in 48 hours, the largest in the last 50 years. The results of assessing flood hazard, vulnerability and risk in terms of human life could improve emergency operations for flood disasters, allowing enough relief goods and materials to be prepared before typhoon landfall.
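
    The AHP weighting step works from a pairwise-comparison matrix whose principal eigenvector gives the factor weights. A minimal sketch with hypothetical judgments among the three hazard factors (depth judged twice as important as velocity and four times as important as rising ratio):

```python
import numpy as np

def ahp_weights(pairwise, iters=100):
    """Priority weights from an AHP pairwise-comparison matrix via power
    iteration toward the principal eigenvector."""
    a = np.asarray(pairwise, dtype=float)
    w = np.full(a.shape[0], 1.0 / a.shape[0])
    for _ in range(iters):
        w = a @ w         # one power-iteration step
        w /= w.sum()      # renormalize to sum to 1
    return w

# Hypothetical reciprocal judgment matrix: depth vs velocity vs rising ratio.
matrix = [[1.0, 2.0, 4.0],
          [0.5, 1.0, 2.0],
          [0.25, 0.5, 1.0]]
weights = ahp_weights(matrix)   # depth receives the largest weight
```

    For a perfectly consistent matrix like this one the weights are exactly proportional to any column; in practice a consistency ratio check would accompany the computation.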

  14. Quantitative Microbial Risk Assessment Tutorial - Primer

    EPA Science Inventory

    This document provides a Quantitative Microbial Risk Assessment (QMRA) primer that organizes QMRA tutorials. The tutorials describe functionality of a QMRA infrastructure, guide the user through software use and assessment options, provide step-by-step instructions for implementi...

  15. Flood hazard assessment for french NPPs

    NASA Astrophysics Data System (ADS)

    Rebour, Vincent; Duluc, Claire-Marie; Guimier, Laurent

    2015-04-01

    This paper presents the approach to flood hazard assessment for NPPs that is ongoing in France in the framework of post-Fukushima activities. These activities were initially defined considering both the European "stress tests" of NPPs pursuant to the request of the European Council and the French safety audit of civilian nuclear facilities in the light of the Fukushima Daiichi accident. The main actors in that process are the utility (EDF is, to date, the only NPP operator in France), the regulatory authority (ASN) and its technical support organization (IRSN). This paper was prepared by IRSN, considering the official positions of the other main actors in the current review process; it was not officially endorsed by them. In France, the flood hazard to be considered for design basis definition (for new NPPs, and for existing NPPs in periodic safety reviews conducted every 10 years) was revised before the Fukushima Daiichi accident, following the Blayais NPP experience of December 1999 (partial site flooding and loss of some safety-classified systems). The first part of the paper presents an overview of the revised guidance for the design basis flood. In order to address design extension conditions (conditions that could result from natural events exceeding the design basis events), a set of flooding scenarios has been defined by adding margins to the scenarios that are considered for the design. Due to the diversity of phenomena to be considered for flooding hazard, the margin assessment is specific to each flooding scenario, in terms of the parameter to be penalized and the degree of variation of this parameter. The general approach to addressing design extension conditions is presented in the second part of the paper. The following parts present the approach for five flooding scenarios, including the design basis scenario and the additional margins defining design extension scenarios.

  16. Challenges in Assessing Seismic Hazard in Intraplate Europe

    NASA Astrophysics Data System (ADS)

    Hintersberger, E.; Kuebler, S.; Landgraf, A.; Stein, S. A.

    2014-12-01

    Intraplate regions are often characterized by scattered, clustered and migrating seismicity and by the occurrence of low-strain areas next to high-strain ones. Increasing evidence for large paleoearthquakes in such regions, together with population growth and the development of critical facilities, calls for better assessments of earthquake hazards. Existing seismic hazard assessment for intraplate Europe is based on instrumental and historical seismicity of the past 1000 years, as well as some active fault data. These observations face important limitations due to the quantity and quality of the available data bases. Even considering the long record of historical events in some populated areas of Europe, this time-span of a thousand years likely fails to capture some faults' typical large-event recurrence intervals, which are on the order of tens of thousands of years. Paleoseismology helps lengthen the observation window, but only produces point measurements, and preferentially in regions already suspected to be seismically active. As a result, the expected maximum magnitudes of future earthquakes are quite uncertain and likely to be underestimated, and earthquakes are likely to occur in unexpected locations. These issues arise in particular in the heavily populated Rhine Graben and Vienna Basin areas, and in considering the hazard posed by low-probability events to critical facilities like nuclear power plants.

  17. Landslide hazard assessment of the Black sea coastline (Caucasus, Russia) via drones

    NASA Astrophysics Data System (ADS)

    Kazeev, Andrey; Postoev, German; Fedotova, Ksenia

    2017-04-01

    Landslide hazard assessment of the slopes of Sochi was performed along the railway between the cities of Tuapse and Adler (total length 103 km). The railway passes through territory with active development of hazardous geological processes such as landslides, rock falls and debris flows. By the beginning of 2016, 36 landslide sites had been discovered along the railway (total length 34 km), along with 48 rock-fall sites (length 31 km) and 5 debris-flow sites (length 0.14 km). In recent years an intensification of deformations was observed. For instance, during the previous decade (1996-2005), 28 sudden deformations occurred due to slope processes, causing interruptions in traffic, while in the present decade (2006-2015), 72 deformations were recorded. High landslide activity and economic loss determined the necessity of complex investigations of the engineering-geological conditions of landslide development and the causes of its intensification. A protection strategy needed to be developed to minimize negative consequences. Thus, the investigations of the landslide situation along the "Tuapse - Adler" railway included the categorization of landslide sites by level of hazard, with risk assessment based on numerical criteria. A preliminary evaluation of landslide hazard for the railway was conducted via analysis of archived engineering-geological documents. 13 of the 36 landslide sites (total length 13 km) were selected, reflecting the variety and peculiarities of landslide displacements on slopes (both active and inactive sites). Visual field observations of landslide slopes using the drone "DJI Phantom 4" were completed during the second stage of this investigation. High-resolution photographs of landslide cirques, cracks, scarp walls and vegetation features were obtained via drone, which would have been impossible to obtain from the ground in conditions of dense subtropical vegetation cover. Possible approaches to the landslide activity and hazard assessment were evaluated: slope stability

  18. Tsunami hazard and risk assessment in El Salvador

    NASA Astrophysics Data System (ADS)

    González, M.; González-Riancho, P.; Gutiérrez, O. Q.; García-Aguilar, O.; Aniel-Quiroga, I.; Aguirre, I.; Alvarez, J. A.; Gavidia, F.; Jaimes, I.; Larreynaga, J. A.

    2012-04-01

    Tsunamis are relatively infrequent phenomena, yet they represent a greater threat than earthquakes, hurricanes and tornadoes, causing the loss of thousands of human lives and extensive damage to coastal infrastructure around the world. Several works have attempted to study these phenomena in order to understand their origin, causes, evolution and consequences and the magnitude of their damage, and finally to propose mechanisms to protect coastal societies. Advances in the understanding and prediction of tsunami impacts allow the development of adaptation and mitigation strategies to reduce risk in coastal areas. This work - Tsunami Hazard and Risk Assessment in El Salvador -, funded by AECID during the period 2009-12, examines the state of the art and presents a comprehensive methodology for assessing the risk of tsunamis at any coastal area worldwide, applying it to the coast of El Salvador. The conceptual framework is based on the definition of risk as the probability of harmful consequences or expected losses resulting from a given hazard to a given element at danger or peril, over a specified time period (European Commission, Schneiderbauer et al., 2004). The HAZARD assessment (Phase I of the project) is based on propagation models for earthquake-generated tsunamis, developed through the characterization of tsunamigenic sources (seismotectonic faults) and other dynamics under study (tsunami waves, sea level, etc.). The study area lies in a zone of high seismic activity and was hit by 11 tsunamis between 1859 and 1997, nine of them recorded in the twentieth century and all generated by earthquakes. Simulations of historical and potential tsunamis with greater or lesser impact on the country's coast have been performed, including distant, intermediate and near sources. Deterministic analyses of the threats under study (coastal flooding) have been carried out, resulting in different hazard maps (maximum wave height elevation, maximum water depth, minimum tsunami

  19. Remote sensing and landslide hazard assessment

    NASA Technical Reports Server (NTRS)

    Mckean, J.; Buechel, S.; Gaydos, L.

    1991-01-01

    Remotely acquired multispectral data are used to improve landslide hazard assessments at all scales of investigation. A vegetation map produced from automated interpretation of TM data is used in a GIS context to explore the effect of vegetation type on debris flow occurrence in preparation for inclusion in debris flow hazard modeling. Spectral vegetation indices map spatial patterns of grass senescence which are found to be correlated with soil thickness variations on hillslopes. Grassland senescence is delayed over deeper, wetter soils that are likely debris flow source areas. Prediction of actual soil depths using vegetation indices may be possible up to some limiting depth greater than the grass rooting zone. On forested earthflows, the slow slide movement disrupts the overhead timber canopy, exposes understory vegetation and soils, and alters site spectral characteristics. Both spectral and textural measures from broad band multispectral data are successful at detecting an earthflow within an undisturbed old-growth forest.
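
    The spectral vegetation indices mentioned above are typically ratios of near-infrared and red reflectance, such as the NDVI. A minimal sketch with illustrative reflectance values showing why senescent grass over deep soils separates from vigorous vegetation:

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index from near-infrared and red
    reflectance: (NIR - red) / (NIR + red)."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red)

# Illustrative reflectances: senescent grass loses the strong NIR/red
# contrast of vigorous vegetation, so its NDVI drops.
vigorous = float(ndvi(0.50, 0.08))
senescent = float(ndvi(0.30, 0.20))
```

    Applied per pixel to TM bands, the resulting index map traces the delayed-senescence patterns that correlate with soil thickness.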

  20. RiskScape Volcano: Development of a risk assessment tool for volcanic hazards

    NASA Astrophysics Data System (ADS)

    Deligne, Natalia; King, Andrew; Jolly, Gill; Wilson, Grant; Wilson, Tom; Lindsay, Jan

    2013-04-01

    RiskScape is a multi-hazard risk assessment tool developed by GNS Science and the National Institute of Water and Atmospheric Research Ltd. (NIWA) in New Zealand that models the risk and impact of various natural hazards on a given built environment. RiskScape has a modular structure: the hazard module models hazard exposure (e.g., ash thickness at a given location), the asset module catalogues assets (built environment, infrastructure, and people) and their attributes exposed to the hazard, and the vulnerability module models the consequences of asset exposure to the hazard. Hazards presently included in RiskScape are earthquakes, river floods, tsunamis, windstorms, and ash from volcanic eruptions (specifically from Ruapehu). Here we present our framework for incorporating other volcanic hazards (e.g., pyroclastic density currents, lava flows, lahars, ground deformation) into RiskScape along with our approach for assessing asset vulnerability. We also will discuss the challenges of evaluating risk for 'point source' (e.g., stratovolcanoes) vs 'diffuse' (e.g., volcanic fields) volcanism using Ruapehu and the Auckland volcanic field as examples. Once operational, RiskScape Volcano will be a valuable resource both in New Zealand and internationally as a practical tool for evaluating risk and also as an example for how to predict the consequences of volcanic eruptions on both rural and urban environments.
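
    The hazard/asset/vulnerability module structure described above composes naturally as a pipeline. A minimal sketch of that structure; the functions, sites and numbers are invented for illustration and are not RiskScape's API.

```python
def hazard_module(site):
    """Hazard exposure: ash thickness (mm) at a site, standing in for a
    plume-dispersion model."""
    return {"school": 12.0, "farm": 3.0}[site]

def vulnerability_module(ash_mm):
    """Consequence of exposure: damage ratio as a simple saturating
    function of ash load."""
    return min(1.0, ash_mm / 100.0)

# Asset module: replacement values of the exposed built environment.
assets = {"school": 2_000_000.0, "farm": 500_000.0}

# Impact = asset value x vulnerability(hazard intensity at the asset).
losses = {site: value * vulnerability_module(hazard_module(site))
          for site, value in assets.items()}
```

    Keeping the three modules separate is what lets new hazards (lahars, lava flows, ground deformation) be added by swapping in a new hazard and vulnerability pair while the asset catalogue stays fixed.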

  1. Assessing natural hazard risk using images and data

    NASA Astrophysics Data System (ADS)

    Mccullough, H. L.; Dunbar, P. K.; Varner, J. D.; Mungov, G.

    2012-12-01

    Photographs and other visual media provide valuable pre- and post-event data for natural hazard assessment. Scientific research, mitigation, and forecasting rely on visual data for risk analysis, inundation mapping and historic records. Instrumental data reveal only a portion of the whole story; photographs explicitly illustrate the physical and societal impacts of an event. Visual data are increasing rapidly as portable high-resolution cameras and video recorders become more attainable. Incorporating these data into archives ensures a more complete historical account of events. Integrating natural hazards data - such as tsunami, earthquake and volcanic eruption events, socio-economic information, and tsunami deposits and runups - along with images and photographs enhances event comprehension. Global historic databases at NOAA's National Geophysical Data Center (NGDC) consolidate these data, providing the user with easy access to a network of information. NGDC's Natural Hazards Image Database (ngdc.noaa.gov/hazardimages) was recently improved to provide a more efficient and dynamic user interface. It uses the Google Maps API and Keyhole Markup Language (KML) to provide geographic context to the images and events. Descriptive tags, or keywords, have been applied to each image, enabling easier navigation and discovery. In addition, the Natural Hazards Map Viewer (maps.ngdc.noaa.gov/viewers/hazards) provides the ability to search and browse data layers on a Mercator-projection globe with a variety of map backgrounds. This combination of features creates a simple and effective way to enhance our understanding of hazard events and risks using imagery.

  2. Probabilistic Seismic Hazard Assessment of the Chiapas State (SE Mexico)

    NASA Astrophysics Data System (ADS)

    Rodríguez-Lomelí, Anabel Georgina; García-Mayordomo, Julián

    2015-04-01

    The Chiapas State, in southeastern Mexico, is a very seismically active region due to the interaction of three tectonic plates: North America, Cocos and Caribbean. We present a probabilistic seismic hazard assessment (PSHA) specifically performed to evaluate seismic hazard in the Chiapas State. The PSHA was based on a composite seismic catalogue homogenized to Mw and used a logic tree procedure for the consideration of different seismogenic source models and ground motion prediction equations (GMPEs). The results were obtained in terms of peak ground acceleration as well as spectral accelerations. The earthquake catalogue was compiled from the International Seismological Centre and the Servicio Sismológico Nacional de México. Two different seismogenic source zone (SSZ) models were devised based on a revision of the tectonics of the region and the available geomorphological and geological maps; the SSZ were finally defined through the analysis of geophysical data, resulting in two main SSZ models. The Gutenberg-Richter parameters for each SSZ were calculated from the declustered and homogenized catalogue, while the maximum expected earthquake was assessed from both the catalogue and geological criteria. Several worldwide and regional GMPEs for subduction and crustal zones were revised, and for each SSZ model we considered four possible combinations of GMPEs. Finally, hazard was calculated in terms of PGA and SA for 500-, 1000- and 2500-year return periods for each branch of the logic tree using the CRISIS2007 software. The final hazard maps represent the mean values obtained from the two seismogenic and four attenuation models considered in the logic tree. For the three return periods analyzed, the maps locate the most hazardous areas in the Chiapas Central Pacific Zone, the Pacific Coastal Plain and the Motagua and Polochic Fault Zone; intermediate hazard values occur in the Chiapas Batholith Zone and the Strike-Slip Faults Province. The hazard decreases
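
    The final step of combining logic-tree branches into a mean hazard map is a weighted average per site. A minimal sketch; the branch weights and PGA values are invented for illustration.

```python
def logic_tree_mean(branches):
    """Weighted mean hazard over logic-tree branches, given as
    (weight, value) pairs whose weights sum to 1."""
    total = sum(w for w, _ in branches)
    if abs(total - 1.0) > 1e-9:
        raise ValueError("branch weights must sum to 1")
    return sum(w * v for w, v in branches)

# Two SSZ models x four GMPE combinations = eight branches, here equally
# weighted; PGA values (g) at one site are invented for illustration.
pgas = (0.18, 0.20, 0.22, 0.24, 0.19, 0.21, 0.23, 0.25)
mean_pga = logic_tree_mean([(0.125, v) for v in pgas])
```

    Repeating this average at every map cell, for each return period, produces the mean hazard maps described above.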

  3. Seismic hazard assessment and pattern recognition of earthquake prone areas in the Po Plain (Italy)

    NASA Astrophysics Data System (ADS)

    Gorshkov, Alexander; Peresan, Antonella; Soloviev, Alexander; Panza, Giuliano F.

    2014-05-01

    A systematic and quantitative assessment, capable of providing first-order consistent information about the sites where large earthquakes may occur, is crucial for knowledgeable seismic hazard evaluation. The methodology for the pattern recognition of areas prone to large earthquakes is based on the morphostructural zoning method (MSZ), which employs topographic data and present-day tectonic structures to map earthquake-controlling structures (i.e. the nodes formed around lineament intersections) and does not require knowledge of past seismicity. The nodes are assumed to be characterized by a uniform set of topographic, geologic, and geophysical parameters; on the basis of these parameters the pattern recognition algorithm defines a classification rule to discriminate seismogenic from non-seismogenic nodes. This methodology has been successfully applied since the early 1970s in a number of regions worldwide, including California, where it permitted the identification of areas that were subsequently struck by strong events and had not previously been considered prone to strong earthquakes. Recent studies on the Iberian Peninsula and the Rhone Valley have demonstrated the applicability of MSZ to basins with relatively flat topography. In this study, the analysis is applied to the Po Plain (Northern Italy), an area characterized by flat topography, to allow for the systematic identification of nodes prone to earthquakes with magnitude larger than or equal to M = 5.0. The MSZ method differs from standard morphostructural analysis, in which the term "lineament" denotes the complex of alignments detectable on topographic maps or satellite images; according to that definition the lineament is locally defined and its existence does not depend on the surrounding areas. 
In MSZ, the primary element is the block, a relatively homogeneous area, while the lineament is a secondary element of the morphostructure

  4. Screening guide for rapid assessment of liquefaction hazard at highway bridge sites

    DOT National Transportation Integrated Search

    1998-06-16

    As an aid to seismic hazard assessment, this report provides a "screening guide" for systematic evaluation of liquefaction hazard at bridge sites and a guide for prioritizing sites for further investigation or mitigation. The guide presents a systemat...

  5. Quantification of tsunami hazard on Canada's Pacific Coast; implications for risk assessment

    NASA Astrophysics Data System (ADS)

    Evans, Stephen G.; Delaney, Keith B.

    2015-04-01

    Our assessment of tsunami hazard on Canada's Pacific Coast (i.e., the coast of British Columbia) begins with a review of the 1964 tsunami generated by the Great Alaska Earthquake (M9.2), which caused significant damage to coastal communities and infrastructure. In particular, the tsunami waves swept up inlets on the west coast of Vancouver Island and damaged several communities; Port Alberni suffered upwards of 5M worth of damage. At Port Alberni, the maximum tsunami wave height was estimated at 8.2 m above mean sea level and was recorded on the stream gauge on the Somass River, located at about 7 m a.s.l., 6 km upstream from its mouth. The highest wave (9.75 m above tidal datum) was reported from Shields Bay, Graham Island, Queen Charlotte Islands (Haida Gwaii). In addition, the 1964 tsunami was recorded on tide gauges at a number of locations on the BC coast. The 1964 signal and the magnitude and frequency of traces of other historical Pacific tsunamis (both far-field and local) are analysed in the Tofino tide gauge records and compared to tsunami traces in other tide gauges in the Pacific Basin (e.g., Miyako, Japan). Together with a review of the geological evidence for tsunami occurrence along Vancouver Island's west coast, we use these tide gauge data to develop a quantitative framework for tsunami hazard on Canada's Pacific coast. On longer time scales, tsunamis are a major component of the hazard from Cascadia megathrust events. From sedimentological evidence and seismological considerations, the recurrence interval of megathrust events on the Cascadia Subduction Zone has been estimated by others at roughly 500 years. We therefore assume that the hazard associated with a high-magnitude destructive tsunami has an annual frequency of roughly 1/500. Compared to other major natural hazards in western Canada this represents a very high annual probability of a potentially destructive hazard that, in some coastal communities, translates into high levels of local risk
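The quoted ~1/500 annual frequency can be translated into an exceedance probability over an exposure window with the usual Poisson-process assumption; the 50-year window below is illustrative, not taken from the record:

```python
import math

def exceedance_probability(annual_rate, years):
    """P(at least one event in `years`) for a Poisson process
    with the given mean annual occurrence rate."""
    return 1.0 - math.exp(-annual_rate * years)

# ~1/500 per year over a 50-year exposure window
p50 = exceedance_probability(1 / 500, 50)  # about 0.095
```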

  6. Map Your Hazards! - an Interdisciplinary, Place-Based Educational Approach to Assessing Natural Hazards, Social Vulnerability, Risk and Risk Perception.

    NASA Astrophysics Data System (ADS)

    Brand, B. D.; McMullin-Messier, P. A.; Schlegel, M. E.

    2014-12-01

    'Map your Hazards' is an educational module developed within the NSF Interdisciplinary Teaching about Earth for a Sustainable Future program (InTeGrate). The module engages students in place-based explorations of natural hazards, social vulnerability, and the perception of natural hazards and risk. Students integrate geoscience and social science methodologies to (1) identify and assess hazards, vulnerability and risk within their communities; (2) distribute, collect and evaluate survey data (designed by the authors) on the knowledge, risk perception and preparedness within their social networks; and (3) deliver a PPT presentation to local stakeholders detailing their findings and recommendations for the development of a prepared, resilient community. 'Map your Hazards' underwent four rigorous assessments by a team of geoscience educators and external review before being piloted in our classrooms. The module was piloted in a 300-level 'Volcanoes and Society' course at Boise State University, a 300-level 'Environmental Sociology' course at Central Washington University, and a 100-level 'Natural Disasters and Environmental Geology' course at the College of Western Idaho. In all courses students reported a fascination with learning about the hazards around them and identifying the high-risk areas in their communities. They were also surprised at the low level of knowledge, inaccurate risk perception and lack of preparedness of their social networks. This successful approach to engaging students in an interdisciplinary, place-based learning environment also has the broader benefit of raising awareness of natural hazards (survey participants are provided links to local hazard and preparedness information). The data and preparedness suggestions can be shared with local emergency managers, who are encouraged to attend the students' final presentations. All module materials are published at serc.carleton.edu/integrate/ and are appropriate to a wide range of classrooms.

  7. Seismic Hazard Assessment of Tehran Based on Arias Intensity

    NASA Astrophysics Data System (ADS)

    Amiri, G. Ghodrati; Mahmoodi, H.; Amrei, S. A. Razavian

    2008-07-01

    In this paper, a probabilistic seismic hazard assessment of Tehran for the Arias intensity parameter is presented. Tehran is the capital and most populated city of Iran and, from economic, political and social points of view, its most significant city. Since catastrophic earthquakes have occurred in Tehran and its vicinity in previous centuries, a probabilistic seismic hazard assessment of this city in terms of Arias intensity is useful. Iso-intensity contour maps of Tehran, based on different attenuation relationships and different earthquake return periods, are plotted. Maps of iso-intensity points in the Tehran region are presented using attenuation relationships for rock and soil sites for two hazard levels of 10% and 2% probability of exceedance in 50 years. Seismicity parameters, based on historical and instrumental earthquakes for a period extending from the 4th century BC to the present, are calculated using two methods. For the calculation of seismicity parameters, an earthquake catalogue within a radius of 200 km around Tehran has been used, and the SEISRISK III software has been employed. The effects of different inputs, such as seismicity parameters, fault rupture length relationships and attenuation relationships, are considered using a logic tree.
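The two hazard levels quoted (10% and 2% probability of exceedance in 50 years) correspond, under the standard Poisson assumption, to return periods of roughly 475 and 2475 years; a sketch of that conversion:

```python
import math

def return_period(p_exceed, years):
    """Return period T such that P(exceedance in `years`) = p_exceed,
    assuming Poissonian occurrence: p = 1 - exp(-years / T)."""
    return -years / math.log(1.0 - p_exceed)

t10 = return_period(0.10, 50)  # roughly 475 years
t02 = return_period(0.02, 50)  # roughly 2475 years
```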

  8. Identification and assessment of hazardous compounds in drinking water.

    PubMed

    Fawell, J K; Fielding, M

    1985-12-01

    The identification of organic chemicals in drinking water and their assessment in terms of potential hazardous effects are two very different but closely associated tasks. In relation to both continuous low-level background contamination and specific, often high-level, contamination due to pollution incidents, the identification of contaminants is a prerequisite to the evaluation of significant hazards. Even in the case of the rapidly developing short-term bio-assays applied to water to indicate a potential genotoxic hazard (for example Ames tests), identification of the active chemicals is becoming a major factor in the further assessment of the response. Techniques for the identification of low concentrations of organic chemicals in drinking water have developed remarkably since the early 1970s, and methods based upon gas chromatography-mass spectrometry (GC-MS) have revolutionised qualitative analysis of water. Such techniques are limited to "volatile" chemicals, which usually constitute a small fraction of the total organic material in water. However, in recent years there have been promising developments in techniques for "non-volatile" chemicals in water. These include combined high-performance liquid chromatography-mass spectrometry (HPLC-MS) and a variety of MS methods involving, for example, field desorption, fast atom bombardment and thermospray ionisation techniques. In this paper, identification techniques are reviewed in general and likely future developments are outlined. The assessment of hazards associated with chemicals identified in drinking and related waters usually centres upon toxicology, an applied science which involves numerous disciplines. The paper examines the toxicological information needed and the quality and deployment of such information, and discusses future research needs. 
Application of short-term bio-assays to drinking water is a developing area and one which is closely involved with, and to some extent dependent on

  9. Setting the Stage for Harmonized Risk Assessment by Seismic Hazard Harmonization in Europe (SHARE)

    NASA Astrophysics Data System (ADS)

    Woessner, Jochen; Giardini, Domenico; SHARE Consortium

    2010-05-01

    Probabilistic seismic hazard assessment (PSHA) is arguably one of the most useful products that seismology can offer to society. PSHA characterizes the best available knowledge of the seismic hazard of a study area, ideally taking into account all sources of uncertainty. Results form the baseline for informed decisions, such as building codes or insurance rates, and provide essential input to every risk assessment application. Several large-scale national and international projects have recently been launched with the aim of improving and harmonizing PSHA standards around the globe. SHARE (www.share-eu.org) is a European Commission-funded Framework Programme 7 (FP7) project that will create an updated, living seismic hazard model for the Euro-Mediterranean region. SHARE is a regional component of the Global Earthquake Model (GEM, www.globalquakemodel.org), a public/private partnership initiated and approved by the Global Science Forum of the OECD-GSF. GEM aims to be the uniform, independent and open-access standard for calculating and communicating earthquake hazard and risk worldwide. SHARE itself will deliver measurable progress in all steps leading to a harmonized assessment of seismic hazard: in the definition of engineering requirements, in the collection of input data, in procedures for hazard assessment, and in engineering applications. SHARE scientists will create a unified framework and computational infrastructure for seismic hazard assessment and produce an integrated European PSHA model and specific scenario-based modeling tools. The results will deliver long-lasting structural impact in areas of societal and economic relevance; they will serve as a reference for Eurocode 8 (EC8) application and will provide homogeneous input for the correct seismic safety assessment of critical industry, such as energy infrastructures and the re-insurance sector. SHARE will cover the whole European territory, the

  10. Summary information of human health hazard assessment of existing chemical substances (I).

    PubMed

    Matsumoto, Mariko; Kobayashi, Katsumi; Takahashi, Mika; Hirata-Koizumi, Mutsuko; Ono, Atsushi; Hirose, Akihiko

    2015-01-01

    Under the Chemical Substances Control Law (CSCL) in Japan, initial hazard information for existing chemical substances has been collected by the Ministry of Health, Labour and Welfare, Japan (MHLW) to assess potential initial risks to human health. We have reviewed all collected toxicity information pertaining to acute toxicity, repeated dose toxicity, genotoxicity, and/or reproductive/developmental toxicity and performed hazard assessments. Approximately 150 substances are currently undergoing review and assessment. For clarification and evaluation of each toxicity study, we have created a dossier (a collection of study data containing a detailed summary of the methods, results, and conclusions of each study) in English using the International Uniform Chemical Information Database (IUCLID) version 5. The IUCLID dossier format is widely used and has been accepted as one of the most beneficial formats for providing summarized chemical substance toxicity assessments. In this report, as a contribution to our ongoing hazard assessment activity, we present summary hazard information related to the potential human health effects of the following 5 chemical substances: 4-chlorobenzoyl chloride (CAS: 122-01-0); benzenesulfonic acid, 4-hydroxy-, tin (2+) salt (CAS: 70974-33-3); chlorocyclohexane (CAS: 542-18-7); 1,3-cyclohexanedimethanamine (CAS: 2579-20-6); and 1,3,5-triazine-2,4,6(1H,3H,5H)-trithione (CAS: 638-16-4). The IUCLID dossiers created for these 5 chemical substances will be made available via the Japan Existing Chemical Data Base (JECDB). Additional human health hazard information on existing chemical substances will be provided using the same methodology and website when it is available.

  11. Alternatives Assessment Frameworks: Research Needs for the Informed Substitution of Hazardous Chemicals

    PubMed Central

    Jacobs, Molly M.; Malloy, Timothy F.; Tickner, Joel A.; Edwards, Sally

    2015-01-01

    Background: Given increasing pressures for hazardous chemical replacement, there is growing interest in alternatives assessment to avoid substituting a toxic chemical with another of equal or greater concern. Alternatives assessment is a process for identifying, comparing, and selecting safer alternatives to chemicals of concern (including those used in materials, processes, or technologies) on the basis of their hazards, performance, and economic viability. Objectives: The purposes of this substantive review of alternatives assessment frameworks are to identify consistencies and differences in methods and to outline needs for research and collaboration to advance science policy practice. Methods: This review compares methods used in six core components of these frameworks: hazard assessment, exposure characterization, life-cycle impacts, technical feasibility evaluation, economic feasibility assessment, and decision making. Alternatives assessment frameworks published from 1990 to 2014 were included. Results: Twenty frameworks were reviewed. The frameworks were consistent in terms of general process steps, but some differences were identified in the end points addressed. Methodological gaps were identified in the exposure characterization, life-cycle assessment, and decision-analysis components. Methods for addressing data gaps remain an issue. Discussion: Greater consistency in methods and evaluation metrics is needed but with sufficient flexibility to allow the process to be adapted to different decision contexts. Conclusion: Although alternatives assessment is becoming an important science policy field, there is a need for increased cross-disciplinary collaboration to refine methodologies in support of the informed substitution and design of safer chemicals, materials, and products. Case studies can provide concrete lessons to improve alternatives assessment. Citation: Jacobs MM, Malloy TF, Tickner JA, Edwards S. 2016. Alternatives assessment frameworks: research

  12. A tiered asthma hazard characterization and exposure assessment approach for evaluation of consumer product ingredients.

    PubMed

    Maier, Andrew; Vincent, Melissa J; Parker, Ann; Gadagbui, Bernard K; Jayjock, Michael

    2015-12-01

    Asthma is a complex syndrome with significant consequences for those affected. The number of individuals affected is growing, although the reasons for the increase are uncertain. Ensuring the effective management of potential exposures follows from substantial evidence that exposure to some chemicals can increase the likelihood of asthma responses. We have developed a safety assessment approach tailored to the screening of asthma risks from residential consumer product ingredients as a proactive risk management tool. Several key features of the proposed approach advance the assessment resources often used for asthma issues. First, a quantitative health benchmark for asthma or related endpoints (irritation and sensitization) is provided that extends qualitative hazard classification methods. Second, a parallel structure is employed to include dose-response methods for asthma endpoints and methods for scenario specific exposure estimation. The two parallel tracks are integrated in a risk characterization step. Third, a tiered assessment structure is provided to accommodate different amounts of data for both the dose-response assessment (i.e., use of existing benchmarks, hazard banding, or the threshold of toxicological concern) and exposure estimation (i.e., use of empirical data, model estimates, or exposure categories). Tools building from traditional methods and resources have been adapted to address specific issues pertinent to asthma toxicology (e.g., mode-of-action and dose-response features) and the nature of residential consumer product use scenarios (e.g., product use patterns and exposure durations). A case study for acetic acid as used in various sentinel products and residential cleaning scenarios was developed to test the safety assessment methodology. In particular, the results were used to refine and verify relationships among tiered approaches such that each lower data tier in the approach provides a similar or greater margin of safety for a given

  13. Assessment of liquefaction-induced hazards using Bayesian networks based on standard penetration test data

    NASA Astrophysics Data System (ADS)

    Tang, Xiao-Wei; Bai, Xu; Hu, Ji-Lei; Qiu, Jiang-Nan

    2018-05-01

    Liquefaction-induced hazards such as sand boils, ground cracks, settlement, and lateral spreading are responsible for considerable damage to engineering structures during major earthquakes. Presently, there is no effective empirical approach that can assess different liquefaction-induced hazards in one model. This is because of the uncertainties and complexity of the factors related to seismic liquefaction and liquefaction-induced hazards. In this study, Bayesian networks (BNs) are used to integrate multiple factors related to seismic liquefaction, sand boils, ground cracks, settlement, and lateral spreading into a model based on standard penetration test data. The constructed BN model can assess four different liquefaction-induced hazards together. In a case study, the BN method outperforms an artificial neural network and Ishihara and Yoshimine's simplified method in terms of accuracy, Brier score, recall, precision, and area under the curve (AUC) of the receiver operating characteristic (ROC). This demonstrates that the BN method is a good alternative tool for the risk assessment of liquefaction-induced hazards. Furthermore, the performance of the BN model in estimating liquefaction-induced hazards in Japan's 2011 Tōhoku earthquake confirms its correctness and reliability compared with the liquefaction potential index approach. The proposed BN model can also predict whether the soil becomes liquefied after an earthquake and can deduce the chain reaction process of liquefaction-induced hazards and perform backward reasoning. The assessment results from the proposed model provide informative guidelines for decision-makers to detect the damage state of a field following liquefaction.
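The evaluation metrics named in the record (Brier score, precision, recall) can be computed from probabilistic predictions as sketched below; the toy labels and probabilities are invented for illustration and are not the study's data:

```python
def brier_score(y_true, p_pred):
    """Mean squared difference between predicted probability and outcome."""
    return sum((p - y) ** 2 for y, p in zip(y_true, p_pred)) / len(y_true)

def precision_recall(y_true, p_pred, threshold=0.5):
    """Precision and recall after thresholding the predicted probabilities."""
    y_hat = [1 if p >= threshold else 0 for p in p_pred]
    tp = sum(1 for y, h in zip(y_true, y_hat) if y == 1 and h == 1)
    fp = sum(1 for y, h in zip(y_true, y_hat) if y == 0 and h == 1)
    fn = sum(1 for y, h in zip(y_true, y_hat) if y == 1 and h == 0)
    return tp / (tp + fp), tp / (tp + fn)

# Toy data: 1 = hazard observed (e.g. sand boils), 0 = not observed
y = [1, 0, 1, 1, 0]
p = [0.9, 0.2, 0.6, 0.4, 0.1]
bs = brier_score(y, p)
prec, rec = precision_recall(y, p)
```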

  14. Hazard Assessment in a Big Data World

    NASA Astrophysics Data System (ADS)

    Kossobokov, Vladimir; Nekrasova, Anastasia

    2017-04-01

    Open data in a Big Data World provide unprecedented opportunities for enhancing scientific studies and better understanding of the Earth System. At the same time, they open wide avenues for deceptive associations in inter- and transdisciplinary data, leading to erroneous predictions that are unacceptable for implementation. Even advanced tools of data analysis may lead to wrong assessments when inappropriately used to describe the phenomenon under consideration. A (self-)deceptive conclusion can be avoided only by verification of candidate models in experiments on empirical data. Seismology is no exception. Moreover, the seismic evidence accumulated to date clearly demonstrates that most of the empirical relations commonly accepted in the early history of instrumental seismology can be proved erroneous when subjected to objective hypothesis testing. In many cases of seismic hazard assessment (SHA), either probabilistic or deterministic, term-less or short-term, claims of a model's high forecasting potential are based on a flawed application of statistics and are therefore hardly suitable for communication to decision makers; this situation creates numerous points of deception and has resulted in controversies. So far, most, if not all, standard probabilistic methods to assess seismic hazard and associated risks are based on subjective, commonly unrealistic, and even erroneous assumptions about seismic recurrence, and none of the proposed short-term precursory signals has shown sufficient evidence to be used as a reliable precursor of catastrophic earthquakes. Accurate testing against real observations must be done before claiming areas and/or times to be seismically hazardous. The set of errors of the first and second kind in such a comparison permits evaluating the effectiveness of an SHA method and determining the optimal choice of parameters with regard to a user-defined cost-benefit function. 
The information obtained in testing experiments may supply
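The errors of the first and second kind mentioned above come from comparing alarms against observed target earthquakes over space-time cells; a minimal sketch with invented outcomes (not real test results):

```python
def error_rates(alarmed, occurred):
    """alarmed[i], occurred[i]: booleans per space-time cell.
    Miss rate (error of the 2nd kind) = fraction of target events
    not covered by alarms; the alarm fraction approximates the cost
    side of a user-defined cost-benefit trade-off."""
    misses = sum(1 for a, o in zip(alarmed, occurred) if o and not a)
    events = sum(occurred)
    miss_rate = misses / events
    alarm_fraction = sum(alarmed) / len(alarmed)
    return miss_rate, alarm_fraction

miss, tau = error_rates([True, True, False, False, True],
                        [True, False, True, False, False])
```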

  15. 78 FR 33894 - Proposed Information Collection (Open Burn Pit Registry Airborne Hazard Self-Assessment...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-05

    ... Burn Pit Registry Airborne Hazard Self-Assessment Questionnaire) Activity: Comment Request AGENCY... ascertain and monitor the health effects of the exposure of members of the Armed Forces to toxic airborne... to ``OMB Control No. 2900-NEW, Open Burn Pit Registry Airborne Hazard Self-Assessment Questionnaire...

  16. Assessing the long-term probabilistic volcanic hazard for tephra fallout in Reykjavik, Iceland: a preliminary multi-source analysis

    NASA Astrophysics Data System (ADS)

    Tonini, Roberto; Barsotti, Sara; Sandri, Laura; Tumi Guðmundsson, Magnús

    2015-04-01

    Icelandic volcanism is largely dominated by basaltic magma. Nevertheless, the presence of glaciers over many Icelandic volcanic systems results in frequent phreatomagmatic eruptions and associated tephra production, making explosive eruptions the most common type of volcanic activity. Jökulhlaups are commonly considered the major volcanic hazard in Iceland because of their high frequency and potentially very devastating local impact. Tephra fallout is also frequent and can impact larger areas. It is driven by the wind direction, which can change with both altitude and season, making it impossible to predict a priori where tephra will be deposited during the next eruptions. Most of the volcanic activity in Iceland occurs in the central eastern part, over 100 km to the east of the main population centre around the capital Reykjavík. Therefore, the hazard from tephra fallout in Reykjavík is expected to be smaller than for communities settled near the main volcanic systems. However, within the framework of quantitative hazard and risk analyses, less frequent and/or less intense phenomena should not be neglected, since their risk evaluation depends on the effects suffered by the selected target. This is particularly true if the target is highly vulnerable, such as a large urban area or important infrastructure. In this work we present a preliminary analysis aimed at performing a Probabilistic Volcanic Hazard Assessment (PVHA) for tephra fallout, focused on a target area that includes the municipality of Reykjavík and the Keflavík international airport. This approach inverts the more common perspective in which the hazard analysis is focused on the source (the volcanic system) and instead follows a multi-source approach: the idea is to quantify, homogeneously, the hazard due to the main hazardous volcanoes that could pose a tephra fallout threat to the municipality of Reykjavík and the Keflavík airport. PVHA for each volcanic system is calculated independently and the results
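Combining independently calculated per-volcano exceedance probabilities into a single multi-source hazard at the target can be sketched as below; the per-source probabilities are invented for illustration, and independence between volcanic systems is an assumption:

```python
def combined_probability(p_sources):
    """P(tephra load exceeds a threshold at the target from at least
    one source), assuming the volcanic systems act independently."""
    q = 1.0
    for p in p_sources:
        q *= (1.0 - p)
    return 1.0 - q

# Hypothetical per-source PVHA results for the same time window
p_total = combined_probability([0.02, 0.01, 0.005])
```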

  17. Quantitative methods in assessment of neurologic function.

    PubMed

    Potvin, A R; Tourtellotte, W W; Syndulko, K; Potvin, J

    1981-01-01

    Traditionally, neurologists have emphasized qualitative techniques for assessing results of clinical trials. However, in recent years qualitative evaluations have been increasingly augmented by quantitative tests for measuring neurologic functions pertaining to mental state, strength, steadiness, reactions, speed, coordination, sensation, fatigue, gait, station, and simulated activities of daily living. Quantitative tests have long been used by psychologists for evaluating asymptomatic function, assessing human information processing, and predicting proficiency in skilled tasks; however, their methodology has never been directly assessed for validity in a clinical environment. In this report, relevant contributions from the literature on asymptomatic human performance and that on clinical quantitative neurologic function are reviewed and assessed. While emphasis is focused on tests appropriate for evaluating clinical neurologic trials, evaluations of tests for reproducibility, reliability, validity, and examiner training procedures, and for effects of motivation, learning, handedness, age, and sex are also reported and interpreted. Examples of statistical strategies for data analysis, scoring systems, data reduction methods, and data display concepts are presented. Although investigative work still remains to be done, it appears that carefully selected and evaluated tests of sensory and motor function should be an essential factor for evaluating clinical trials in an objective manner.

  18. RESTORING HAZARDOUS SPILL-DAMAGED AREAS: TECHNIQUE IDENTIFICATION/ASSESSMENT

    EPA Science Inventory

    The goal of this study was to identify and assess methods that could be used to accelerate the restoration of lands damaged by spills of hazardous materials. The literature was reviewed to determine what response methods had been used in the past to clean up spills on land and id...

  19. Multi-Hazard Advanced Seismic Probabilistic Risk Assessment Tools and Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coleman, Justin L.; Bolisetti, Chandu; Veeraraghavan, Swetha

    Design of nuclear power plant (NPP) facilities to resist natural hazards has been a part of the regulatory process from the beginning of the NPP industry in the United States (US), but has evolved substantially over time. The original set of approaches and methods was entirely deterministic in nature and focused on a traditional engineering margins-based approach. However, over time probabilistic and risk-informed approaches were also developed and implemented in US Nuclear Regulatory Commission (NRC) guidance and regulation. A defense-in-depth framework has also been incorporated into US regulatory guidance over time. As a result, today, the US regulatory framework incorporates deterministic and probabilistic approaches for a range of different applications and for a range of natural hazard considerations. This framework will continue to evolve as a result of improved knowledge and newly identified regulatory needs and objectives, most notably in response to the NRC activities developed in response to the 2011 Fukushima accident in Japan. Although the US regulatory framework has continued to evolve over time, the tools, methods and data available to the US nuclear industry to meet the changing requirements have not kept pace. Notably, there is significant room for improvement in the tools and methods available for external event probabilistic risk assessment (PRA), which is the principal assessment approach used in risk-informed regulations and risk-informed decision-making applied to natural hazard assessment and design. This is particularly true if PRA is applied to natural hazards other than seismic loading. Development of a new set of tools and methods that incorporate current knowledge, modern best practice, and state-of-the-art computational resources would lead to more reliable assessment of facility risk and risk insights (e.g., the SSCs and accident sequences that are most risk-significant), with less uncertainty and reduced conservatisms.

  20. Sinkhole hazard assessment in Lesina Marina area (Apulia, Italy)

    NASA Astrophysics Data System (ADS)

    Canora, F.; Caporale, F.; D'Angella, A.; Fidelibus, D.; Gutierrez, F.; Pellicani, R.; Spilotro, G.

    2012-04-01

    In the "Lesina Marina" area, located in the north-western part of the Apulia region (Italy), near the Adriatic coast, sinkhole phenomena are particularly widespread and constitute a risk for the built-up area. These phenomena are due to the structure of the evaporitic rocks located in the study area and to the groundwater regime, influenced by the presence of a channel that connects the sea to the lagoon. The complex sea-channel-lagoon system produces an inland flow towards the channel, modulated by the tide, with a variable width according to the rules of coastal aquifers. Further studies have been carried out in order to clarify the context and causes of this instability phenomenon. A procedure for sinkhole susceptibility and hazard assessment has been performed in order to evaluate the spatial distribution of the most unstable areas and the potential spatio-temporal evolution of the phenomenon. The sinkhole susceptibility model has been created in GIS by assessing the spatial relationship between the sinkhole inventory map and a series of thematic maps of instability factors. Nine thematic layers were selected for the study, covering geometrical features of the surface, the gypsum rockhead and the incoherent soil cover, as well as groundwater and its daily and seasonal level variations. Daily groundwater variation in a semiconfined coastal aquifer can be related to the permeability and void structures of the evaporitic mass. Since 1980, when the presence of sinkholes was first reported, the evolution of these instabilities in terms of their number and extent has been monitored with repeated surveys. These data were used for susceptibility model validation and to define the hazard model. The selected layers proved very useful for describing and mapping the hazard from suffusion sinkholes in the study area. 
The sinkhole hazard assessment is carried out, according to
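The abstract does not name the statistic used to relate the sinkhole inventory to the thematic layers; the frequency-ratio method is one common choice for this kind of GIS susceptibility model. The sketch below uses made-up class counts for a single hypothetical layer (depth to the gypsum rockhead), purely for illustration.

```python
# Frequency-ratio susceptibility sketch (illustrative class counts, not the paper's data).
# FR > 1 means sinkholes are over-represented in that class of the thematic layer.

def frequency_ratio(sinkhole_cells, total_cells):
    """Per-class FR: (% of sinkhole cells in class) / (% of all cells in class)."""
    n_sink = sum(sinkhole_cells.values())
    n_tot = sum(total_cells.values())
    return {c: (sinkhole_cells[c] / n_sink) / (total_cells[c] / n_tot)
            for c in total_cells}

# Hypothetical classes of one layer, e.g. depth to the gypsum rockhead.
total = {"0-5 m": 4000, "5-10 m": 3000, ">10 m": 3000}   # all cells per class
sinks = {"0-5 m": 60, "5-10 m": 30, ">10 m": 10}         # sinkhole cells per class

fr = frequency_ratio(sinks, total)
print(fr)
```

The susceptibility index of a cell would then be the sum (or product) of its FR values across all nine layers; only one layer is shown here.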

  1. Development and Evaluation of Reproductive and Developmental Toxicity Tests for Assessing the Hazards of Environmental Contaminants

    DTIC Science & Technology

    1997-08-01

    AL/EQ-TR-1997-0050, Development and Evaluation of Reproductive and Developmental Toxicity Tests for Assessing the Hazards of Environmental Contaminants. FETAX can be used in testing toxicity in surface waters, ground waters and sediments, and in hazard assessment when used in conjunction with other tests.

  2. Cost assessment of natural hazards in Europe - state-of-the-art, knowledge gaps and recommendations

    NASA Astrophysics Data System (ADS)

    Meyer, V.; Becker, N.; Markantonis, V.; Schwarze, R.; van den Bergh, J. C. J. M.; Bouwer, L. M.; Bubeck, P.; Ciavola, P.; Thieken, A. H.; Genovese, E.; Green, C.; Hallegatte, S.; Kreibich, H.; Lequeux, Q.; Viavattenne, C.; Logar, I.; Papyrakis, E.; Pfurtscheller, C.; Poussin, J.; Przyluski, V.

    2012-04-01

    Effective and efficient reduction of natural hazard risks requires a thorough understanding of the costs of natural hazards in order to develop sustainable risk management strategies. The current methods that assess the costs of different natural hazards employ a diversity of terminologies and approaches for different hazards and impacted sectors. This makes it difficult to arrive at robust, comprehensive and comparable cost figures. The CONHAZ (Costs of Natural Hazards) project aimed to compile and synthesise current knowledge on cost assessment methods in order to strengthen the role of cost assessments in the development of integrated natural hazard management and adaptation planning. In order to achieve this, CONHAZ has adopted a comprehensive approach, considering natural hazards ranging from droughts, floods and coastal hazards to Alpine hazards, as well as different impacted sectors and cost types. Its specific objectives have been 1) to compile the state-of-the-art methods for cost assessment; 2) to analyse and assess these methods in terms of technical aspects, as well as terminology, data quality and availability, and research gaps; and 3) to synthesise resulting knowledge into recommendations and to identify further research needs. This presentation summarises the main results of CONHAZ. CONHAZ differentiates between direct tangible damages, losses due to business interruption, indirect damages, intangible effects, and costs of risk mitigation. It is shown that the main focus of cost assessment methods and their application in practice is on direct costs, while existing methods for assessing intangible and indirect effects are rather rarely applied and methods for assessing indirect effects often cannot be used on the scale of interest (e.g. the regional scale). Furthermore, methods often focus on single sectors and/or hazards, and only very few are able to reflect several sectors or multiple hazards. 
Process understanding and its use in cost assessment

  3. Disability adjusted life year (DALY): a useful tool for quantitative assessment of environmental pollution.

    PubMed

    Gao, Tingting; Wang, Xiaochang C; Chen, Rong; Ngo, Huu Hao; Guo, Wenshan

    2015-04-01

    Disability adjusted life year (DALY) has been widely used since the 1990s for evaluating the global and/or regional burden of diseases. As many environmental pollutants are hazardous to human health, DALY is also recognized as an indicator for quantifying the health impact of environmental pollution in terms of disease burden. Based on a literature review, this article gives an overview of the applicable methodologies and research directions for using DALY as a tool for the quantitative assessment of environmental pollution. After an introduction to the methodological framework of DALY, the requirements on data collection and manipulation for quantifying disease burdens are summarized. For environmental pollutants hazardous to human beings, health effect/risk evaluation is indispensable for transforming pollution data into disease data through exposure and dose-response analyses, which require careful selection of models and determination of parameters. Following the methodological discussions, real cases are analyzed, with attention paid to chemical pollutants and pathogens usually encountered in environmental pollution. Existing studies show that DALY is advantageous over conventional environmental impact assessment for the quantification and comparison of risks resulting from environmental pollution. However, further studies are still required to standardize the methods of health effect evaluation for varied pollutants under varied circumstances before DALY calculation.
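The DALY framework the abstract describes combines years of life lost to mortality (YLL) with years lived with disability (YLD). A minimal sketch, with illustrative numbers rather than values from the literature:

```python
# DALY sketch: DALY = YLL + YLD. All inputs are illustrative.

def daly(deaths, life_expectancy_lost, cases, disability_weight, duration_yr):
    yll = deaths * life_expectancy_lost            # years of life lost to mortality
    yld = cases * disability_weight * duration_yr  # years lived with disability
    return yll + yld

# Hypothetical burden of a pollution-related disease outbreak:
burden = daly(deaths=2, life_expectancy_lost=30.0,
              cases=5000, disability_weight=0.067, duration_yr=0.02)
print(round(burden, 1))  # 2*30 + 5000*0.067*0.02 = 66.7 DALYs
```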

  4. Quantitative landslide risk assessment and mapping on the basis of recent occurrences

    NASA Astrophysics Data System (ADS)

    Remondo, Juan; Bonachea, Jaime; Cendrero, Antonio

    A quantitative procedure for mapping landslide risk is developed from considerations of hazard, vulnerability and the valuation of exposed elements. The approach, based on earlier work by the authors, is applied in the Bajo Deba area (northern Spain), where a detailed study of landslide occurrence and damage in the recent past (the last 50 years) was carried out. Analyses and mapping are implemented in a Geographic Information System (GIS). The method is based on a susceptibility model developed previously from statistical relationships between past landslides and terrain parameters related to instability. Extrapolations based on past landslide behaviour were used to calculate failure frequency for the next 50 years. A detailed inventory of direct damage due to landslides during the study period was carried out, and the main elements at risk in the area were identified and mapped. Past direct (monetary) losses per type of element were estimated and expressed as an average 'specific loss' for events of a given magnitude (corresponding to a specified scenario). Vulnerability was assessed by comparing losses with the actual value of the elements affected and expressed as a fraction of that value (0-1). From hazard, vulnerability and monetary value, risk was computed for each element considered. Direct risk maps (€/pixel/year) were obtained, and indirect losses from the disruption of economic activities due to landslides were assessed. The final result is a risk map and table combining all losses per pixel for a 50-year period. The total monetary value at risk for the Bajo Deba area over the next 50 years is about 2.4 × 10⁶ euros.
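The hazard × vulnerability × value decomposition described above can be sketched per pixel as follows; the probabilities, vulnerability fractions and element values are invented for illustration, not taken from the Bajo Deba study.

```python
# Specific-risk sketch per pixel: R = H * V * E (annual failure probability x
# vulnerability fraction x exposed monetary value). Illustrative numbers only.

def pixel_risk(h_annual, vulnerability, value_eur):
    """Expected annual direct loss for one pixel, in EUR/year."""
    return h_annual * vulnerability * value_eur

pixels = [
    # (annual failure probability, vulnerability 0-1, element value in EUR)
    (0.002, 0.30, 150_000.0),   # hypothetical house on a moderately hazardous slope
    (0.010, 0.05, 40_000.0),    # hypothetical road segment, low specific loss
]

annual = sum(pixel_risk(*p) for p in pixels)
print(round(annual, 2))          # EUR/year over the mapped pixels
print(round(annual * 50, 2))     # naive 50-year total, ignoring discounting
```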

  5. International Collaboration for Strengthening Capacity to Assess Earthquake Hazard in Indonesia

    NASA Astrophysics Data System (ADS)

    Cummins, P. R.; Hidayati, S.; Suhardjono, S.; Meilano, I.; Natawidjaja, D.

    2012-12-01

    Indonesia has experienced a dramatic increase in earthquake risk due to rapid population growth in the 20th century, much of it occurring in areas near the subduction zone plate boundaries that are prone to earthquake occurrence. While recent seismic hazard assessments have resulted in better building codes that can inform safer building practices, many of the fundamental parameters controlling earthquake occurrence and ground shaking - e.g., fault slip rates, earthquake scaling relations, ground motion prediction equations, and site response - could still be better constrained. In recognition of the need to improve the level of information on which seismic hazard assessments are based, the Australian Agency for International Development (AusAID) and Indonesia's National Agency for Disaster Management (BNPB), through the Australia-Indonesia Facility for Disaster Reduction, have initiated a 4-year project designed to strengthen the Government of Indonesia's capacity to reliably assess earthquake hazard. This project is a collaboration of Australian institutions, including Geoscience Australia and the Australian National University, with Indonesian government agencies and universities, including the Agency for Meteorology, Climatology and Geophysics, the Geological Agency, the Indonesian Institute of Sciences, and Bandung Institute of Technology. Effective earthquake hazard assessment requires input from many different types of research, ranging from geological studies of active faults and seismological studies of crustal structure, earthquake sources and ground motion, to PSHA methodology and geodetic studies of crustal strain rates. The project is a large and diverse one that spans all these components, and these will be briefly reviewed in this presentation.

  6. Quantitative risk assessment system (QRAS)

    NASA Technical Reports Server (NTRS)

    Tan, Zhibin (Inventor); Mosleh, Ali (Inventor); Weinstock, Robert M (Inventor); Smidts, Carol S (Inventor); Chang, Yung-Hsien (Inventor); Groen, Francisco J (Inventor); Swaminathan, Sankaran (Inventor)

    2001-01-01

    A quantitative risk assessment system (QRAS) builds a risk model of a system for which risk of failure is being assessed, then analyzes the risk of the system corresponding to the risk model. The QRAS performs sensitivity analysis of the risk model by altering fundamental components and quantifications built into the risk model, then re-analyzes the risk of the system using the modifications. More particularly, the risk model is built by building a hierarchy, creating a mission timeline, quantifying failure modes, and building/editing event sequence diagrams. Multiplicities, dependencies, and redundancies of the system are included in the risk model. For analysis runs, a fixed baseline is first constructed and stored. This baseline contains the lowest level scenarios, preserved in event tree structure. The analysis runs, at any level of the hierarchy and below, access this baseline for risk quantitative computation as well as ranking of particular risks. A standalone Tool Box capability exists, allowing the user to store application programs within QRAS.
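The event-sequence quantification that QRAS-style tools perform can be illustrated in miniature: each scenario is a path through an event tree, its probability is the product of the branch probabilities along that path, and scenarios are ranked by likelihood. This is a generic sketch with hypothetical numbers, not QRAS's actual model or data.

```python
# Minimal event-sequence quantification sketch: initiating failure mode,
# then a mitigation system that either works or fails. Illustrative only.

p_initiator = 1e-3                                   # hypothetical failure-mode rate
branches = {"mitigation works": 0.95, "mitigation fails": 0.05}

# End-state probability = product of branch probabilities along the path.
scenarios = {outcome: p_initiator * p for outcome, p in branches.items()}

# Rank scenarios, as a baseline risk model would for sensitivity analysis.
ranked = sorted(scenarios.items(), key=lambda kv: kv[1], reverse=True)
for outcome, p in ranked:
    print(f"{outcome}: {p:.2e}")
```

Sensitivity analysis then amounts to perturbing the branch quantifications and re-ranking.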

  7. Assessment of and standardization for quantitative nondestructive test

    NASA Technical Reports Server (NTRS)

    Neuschaefer, R. W.; Beal, J. B.

    1972-01-01

    Present capabilities and limitations of nondestructive testing (NDT) as applied to aerospace structures during the design, development, production, and operational phases are assessed, to help determine what useful quantitative and qualitative structural data may be provided from raw materials through vehicle refurbishment. This assessment considers metal alloy systems and bonded composites presently applied in active NASA programs or strong contenders for future use. Quantitative and qualitative data have been summarized from recent literature and in-house information and are presented along with a description of the structures or standards from which the information was obtained. Examples, in tabular form, of NDT technique capabilities and limitations are provided. The NDT techniques discussed and assessed were radiography, ultrasonics, penetrants, thermal, acoustic, and electromagnetic methods. Quantitative data are sparse; therefore, obtaining statistically reliable flaw detection data must be strongly emphasized. The new requirements for reusable space vehicles have resulted in highly efficient design concepts operating in severe environments. This increases the need for quantitative NDT evaluation of selected structural components, the end-item structure, and refurbishment operations.

  8. Induction Hazard Assessment: The Variability of Geoelectric Responses During Geomagnetic Storms Within Common Hazard Zones

    NASA Astrophysics Data System (ADS)

    Cuttler, S. W.; Love, J. J.; Swidinsky, A.

    2017-12-01

    Geomagnetic field data obtained through the INTERMAGNET program are convolved with four validated EarthScope USArray impedances to estimate geoelectric variations throughout the duration of a geomagnetic storm. A four-day geomagnetic storm beginning on June 22, 2016 was recorded at the Brandon (BRD), Manitoba and Fredericksburg (FRD), Virginia magnetic observatories. Impedance tensors corresponding to sites near each magnetic observatory produce markedly different responses, despite their close geographical proximity. Estimated time series of the geoelectric field throughout the duration of the storm were calculated, providing an understanding of how the geoelectric field differs across small geographic distances within the same geomagnetic hazard zones derived from prior hazard assessments. We show that the geoelectric response of two sites within 200 km of one another can differ by up to two orders of magnitude (4245 mV/km at one location and 38 mV/km at another location 125 km away). In addition, we compare these results with estimates of the geoelectric field generated from the synthetic 1-dimensional resistivity models commonly used to represent large geographic regions when assessing geomagnetically induced current (GIC) hazards. This comparison shows that geoelectric field estimates from these models differ greatly from those produced from EarthScope USArray sites (1205 mV/km in the 1-D case and 4245 mV/km in the 3-D case in one example). This study demonstrates that applying uniform 1-dimensional resistivity models of the subsurface to wide geographic regions is insufficient to predict the geoelectric hazard at a given location; an evaluation of the 3-dimensional resistivity distribution at a given location is necessary to produce a reliable estimate of how the geoelectric field evolves over the course of a geomagnetic storm.
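The convolution the abstract describes is usually done in the frequency domain as E(f) = Z(f)·B(f). The study uses full 2×2 impedance tensors and real magnetometer data; the sketch below is a scalar toy version with a synthetic magnetic signal and a made-up half-space-like impedance, only to show the mechanics.

```python
# Frequency-domain sketch of E(f) = Z(f) * B(f). Scalar toy impedance and a
# synthetic 100 nT hourly oscillation stand in for real tensor impedances
# and observatory data; magnitudes are illustrative only.
import numpy as np

t = np.arange(0, 24 * 3600, 60)                  # one synthetic day, 60 s cadence
b = 100e-9 * np.sin(2 * np.pi * t / 3600.0)      # magnetic field in tesla

B = np.fft.rfft(b)
f = np.fft.rfftfreq(b.size, d=60.0)

# Synthetic scalar "impedance": |Z| ~ sqrt(f), as over a uniform half-space,
# with a 45-degree phase; scale chosen so E comes out in V/m.
Z = 1.0e3 * np.sqrt(f) * np.exp(1j * np.pi / 4)

e = np.fft.irfft(Z * B, n=b.size)                # geoelectric field estimate (V/m)
peak_mv_km = 1e6 * np.abs(e).max()               # 1 V/m = 1e6 mV/km
print(f"peak E ~ {peak_mv_km:.2f} mV/km")
```

Swapping in a different Z (e.g. a site 125 km away) while keeping b fixed is exactly the comparison the study makes between neighbouring USArray sites.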

  9. Earthquake hazard assessment after Mexico (1985).

    PubMed

    Degg, M R

    1989-09-01

    The 1985 Mexican earthquake ranks foremost amongst the major earthquake disasters of the twentieth century. One of the few positive aspects of the disaster is that it provided massive quantities of data that would otherwise have been unobtainable. Every opportunity should be taken to incorporate the findings from these data in earthquake hazard assessments. The purpose of this paper is to provide a succinct summary of some of the more important lessons from Mexico. It stems from detailed field investigations, and subsequent analyses, conducted by the author on behalf of reinsurance companies.

  10. Application of disease burden to quantitative assessment of health hazards for a decentralized water reuse system.

    PubMed

    Gao, Tingting; Chen, Rong; Wang, Xiaochang; Ngo, Huu Hao; Li, Yu-You; Zhou, Jinhong; Zhang, Lu

    2016-05-01

    The aim of this article is to introduce the methodology of disease burden (DB) to quantify the health impact of microbial regrowth during wastewater reuse, using the case study of a decentralized water reuse system at Xi'an Si-yuan University in Xi'an, China. Based on field investigation findings, Escherichia coli (E. coli), Salmonella and rotavirus were selected as typical regrowth pathogens causing potential health hazards during the reuse of reclaimed water. Subsequently, the major exposure routes, including sprinkler irrigation, landscape fountains and toilet flushing, were identified. Mathematical models were established to relate exposure dose to disease burden by calculating disability adjusted life years (DALYs). Results for this case study show that DALYs attributed to E. coli were significantly greater than those caused by the other pathogens, and DALYs associated with sprinkler irrigation were higher than those originating from the other routes. A correlation between exposure dose and disease burden was obtained by introducing a modified calculation of morbidity, which extends the assessment endpoint of health risk from the conventional infection rate to disease burden.
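The dose-to-burden chain described above (exposure dose → infection probability → annual risk → DALYs via morbidity) can be sketched as follows. An exponential dose-response model is used here for simplicity; the study's pathogen-specific models and all parameter values below are assumptions, not the paper's.

```python
# Sketch of an exposure-to-burden chain. Every parameter is illustrative.
import math

def p_infection(dose, r=0.01):
    """Exponential dose-response: P = 1 - exp(-r * dose)."""
    return 1.0 - math.exp(-r * dose)

def annual_risk(p_event, events_per_year):
    """Annual infection probability from independent exposure events."""
    return 1.0 - (1.0 - p_event) ** events_per_year

def dalys_per_person_year(p_annual, morbidity, daly_per_case):
    """Burden = annual infection risk x P(illness | infection) x DALYs per case."""
    return p_annual * morbidity * daly_per_case

p1 = p_infection(dose=5.0)                 # e.g. one sprinkler-irrigation exposure
pa = annual_risk(p1, events_per_year=50)
burden = dalys_per_person_year(pa, morbidity=0.3, daly_per_case=0.005)
print(f"{burden:.2e} DALYs per person-year")
```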

  11. Role of beach morphology in wave overtopping hazard assessment

    NASA Astrophysics Data System (ADS)

    Phillips, Benjamin; Brown, Jennifer; Bidlot, Jean-Raymond; Plater, Andrew

    2017-04-01

    Understanding the role of beach morphology in controlling wave overtopping volume will further minimise uncertainties in flood risk assessments at coastal locations defended by engineered structures worldwide. XBeach is used to model wave overtopping volume for a 1:200 yr joint probability distribution of waves and water levels with measured, pre- and post-storm beach profiles. The simulation with measured bathymetry is repeated with and without morphological evolution enabled during the modelled storm event. This research assesses the role of morphology in controlling wave overtopping volumes for hazardous events that meet the typical design level of coastal defence structures. Results show that disabling storm-driven morphology under-represents modelled wave overtopping volumes by up to 39% under high Hs conditions, and has a greater impact on the wave overtopping rate than the variability applied within the boundary conditions due to the range of wave-water level combinations that meet the 1:200 yr joint probability criterion. Accounting for morphology in flood modelling is therefore critical for accurately predicting wave overtopping volumes and the resulting flood hazard, and for assessing economic losses.

  12. Quantitative risk assessment of human campylobacteriosis associated with thermophilic Campylobacter species in chickens.

    PubMed

    Rosenquist, Hanne; Nielsen, Niels L; Sommer, Helle M; Nørrung, Birgit; Christensen, Bjarke B

    2003-05-25

    A quantitative risk assessment comprising the elements hazard identification, hazard characterization, exposure assessment, and risk characterization has been prepared to assess the effect of different mitigation strategies on the number of human cases in Denmark associated with thermophilic Campylobacter spp. in chickens. To estimate the human exposure to Campylobacter from a chicken meal and the number of human cases associated with this exposure, a mathematical risk model was developed. The model details the spread and transfer of Campylobacter in chickens from slaughter to consumption and the relationship between ingested dose and the probability of developing campylobacteriosis. Human exposure was estimated in two successive mathematical modules. Module 1 addresses changes in prevalence and numbers of Campylobacter on chicken carcasses throughout the processing steps of a slaughterhouse. Module 2 covers the transfer of Campylobacter during food handling in private kitchens. The age and sex of consumers were included in this module to introduce variable hygiene levels during food preparation and variable sizes and compositions of meals. Finally, the outcome of the exposure assessment modules was integrated with a Beta-Poisson dose-response model to provide a risk estimate. Simulations designed to predict the effect of different mitigation strategies showed that the incidence of campylobacteriosis associated with consumption of chicken meals could be reduced 30-fold by introducing a 2-log reduction in the number of Campylobacter on chicken carcasses. To obtain a similar reduction in incidence, the flock prevalence would have to be reduced approximately 30-fold or kitchen hygiene improved approximately 30-fold. Cross-contamination from positive to negative flocks during slaughter had almost no effect on the human Campylobacter incidence, which indicates that implementation of logistic slaughter will have only a minor influence on the risk. Finally, the
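The approximate Beta-Poisson dose-response model the abstract integrates with its exposure modules has the form P = 1 - (1 + dose/β)^(-α). The parameter values below are commonly cited for Campylobacter in the QMRA literature, not necessarily those fitted in this study, so treat them as illustrative.

```python
# Approximate Beta-Poisson dose-response sketch. Parameters are commonly
# cited Campylobacter values, used here for illustration only.

def beta_poisson(dose, alpha=0.145, beta=7.59):
    """P(infection | ingested dose) = 1 - (1 + dose/beta)^(-alpha)."""
    return 1.0 - (1.0 + dose / beta) ** (-alpha)

# Effect of a 2-log (100x) reduction in Campylobacter per ingested serving:
for dose in (1000.0, 10.0):
    print(f"dose {dose:7.1f} CFU -> P(infection) = {beta_poisson(dose):.3f}")
```

Note the strong non-linearity: a 100-fold dose reduction cuts the per-meal infection probability by far less than 100-fold, which is why the population-level effect of carcass decontamination has to be evaluated through the full exposure model.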

  13. Seismic Hazard Assessment of Tehran Based on Arias Intensity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Amiri, G. Ghodrati; Mahmoodi, H.; Amrei, S. A. Razavian

    2008-07-08

    In this paper, a probabilistic seismic hazard assessment of Tehran for the Arias intensity parameter is carried out. Tehran is the capital and most populated city of Iran and, from economic, political and social points of view, its most significant city. Since catastrophic earthquakes have occurred in Tehran and its vicinity in previous centuries, a probabilistic seismic hazard assessment of this city for the Arias intensity parameter is useful. Iso-intensity contour maps of Tehran, based on different attenuation relationships for different earthquake return periods, are plotted. Maps of iso-intensity points in the Tehran region are presented using proportional attenuation relationships for rock and soil beds for two hazard levels of 10% and 2% in 50 years. Seismicity parameters, based on historical and instrumental earthquakes for a period beginning in the 4th century BC and ending at the present, are calculated using two methods. For the calculation of seismicity parameters, an earthquake catalogue with a radius of 200 km around Tehran has been used, and the SEISRISK III software has been employed. The effects of different parameters such as seismicity parameters, fault rupture length relationships and attenuation relationships are considered using a logic tree.

  14. Integrated Environmental Modeling: Quantitative Microbial Risk Assessment

    EPA Science Inventory

    The presentation discusses the need for microbial assessments and presents a road map associated with quantitative microbial risk assessments, through an integrated environmental modeling approach. A brief introduction and the strengths of the current knowledge are illustrated. W...

  15. New Multi-HAzard and MulTi-RIsk Assessment MethodS for Europe (MATRIX): A research program towards mitigating multiple hazards and risks in Europe

    NASA Astrophysics Data System (ADS)

    Fleming, K. M.; Zschau, J.; Gasparini, P.; Modaressi, H.; Matrix Consortium

    2011-12-01

    Scientists, engineers, civil protection and disaster managers typically treat natural hazards and risks individually. This leads to a situation where the frequent causal relationships between different hazards and risks, e.g., earthquakes and volcanoes, or floods and landslides, are ignored. Such an oversight may potentially lead to inefficient mitigation planning. As part of their efforts to confront this issue, the European Union, under its FP7 program, is supporting the New Multi-HAzard and MulTi-RIsK Assessment MethodS for Europe or MATRIX project. The focus of MATRIX is on natural hazards, in particular earthquakes, landslides, volcanoes, wildfires, storms and fluvial and coastal flooding. MATRIX will endeavour to develop methods and tools to tackle multi-type natural hazards and risks within a common framework, focusing on methodologies that are suited to the European context. The work will involve an assessment of current single-type hazard and risk assessment methodologies, including a comparison and quantification of uncertainties and harmonization of single-type methods, an examination of the consequences of cascade effects within a multi-hazard environment, time-dependent vulnerability, decision making and support for multi-hazard mitigation and adaptation, and a series of test cases. Three test sites are being used to assess the methods developed within the project (Naples, Cologne, and the French West Indies), as well as a "virtual city" based on a comprehensive IT platform that will allow scenarios not represented by the test cases to be examined. In addition, a comprehensive dissemination program that will involve national platforms for disaster management, as well as various outreach activities, will be undertaken. The MATRIX consortium consists of ten research institutions (nine European and one Canadian), an end-user (i.e., one of the European national platforms for disaster reduction) and a partner from industry.

  16. Geospatial Data Integration for Assessing Landslide Hazard on Engineered Slopes

    NASA Astrophysics Data System (ADS)

    Miller, P. E.; Mills, J. P.; Barr, S. L.; Birkinshaw, S. J.

    2012-07-01

    Road and rail networks are essential components of national infrastructures, underpinning the economy, and facilitating the mobility of goods and the human workforce. Earthwork slopes such as cuttings and embankments are primary components, and their reliability is of fundamental importance. However, instability and failure can occur, through processes such as landslides. Monitoring the condition of earthworks is a costly and continuous process for network operators, and currently, geospatial data is largely underutilised. The research presented here addresses this by combining airborne laser scanning and multispectral aerial imagery to develop a methodology for assessing landslide hazard. This is based on the extraction of key slope stability variables from the remotely sensed data. The methodology is implemented through numerical modelling, which is parameterised with the slope stability information, simulated climate conditions, and geotechnical properties. This allows determination of slope stability (expressed through the factor of safety) for a range of simulated scenarios. Regression analysis is then performed in order to develop a functional model relating slope stability to the input variables. The remotely sensed raster datasets are robustly re-sampled to two-dimensional cross-sections to facilitate meaningful interpretation of slope behaviour and mapping of landslide hazard. Results are stored in a geodatabase for spatial analysis within a GIS environment. For a test site located in England, UK, results have shown the utility of the approach in deriving practical hazard assessment information. Outcomes were compared to the network operator's hazard grading data, and show general agreement. The utility of the slope information was also assessed with respect to auto-population of slope geometry, and found to deliver significant improvements over the network operator's existing field-based approaches.
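The "factor of safety" the methodology computes comes out of geotechnical slope-stability analysis. The paper's numerical model is more elaborate, but the classical infinite-slope formulation conveys the idea; all soil parameters below are hypothetical.

```python
# Infinite-slope factor of safety sketch:
# FS = (c' + (gamma - m*gamma_w) * z * cos^2(theta) * tan(phi))
#      / (gamma * z * sin(theta) * cos(theta))
# FS < 1 indicates instability. Parameter values are illustrative.
import math

def factor_of_safety(c, phi_deg, theta_deg, z, gamma=19.0, gamma_w=9.81, m=0.5):
    """c' in kPa, friction angle phi and slope angle theta in degrees, slab
    depth z in m, unit weights in kN/m^3, m = saturated fraction of the slab."""
    th, ph = math.radians(theta_deg), math.radians(phi_deg)
    resisting = c + (gamma - m * gamma_w) * z * math.cos(th) ** 2 * math.tan(ph)
    driving = gamma * z * math.sin(th) * math.cos(th)
    return resisting / driving

# Hypothetical embankment slope: 25-degree cutting, 2 m failure depth.
fs = factor_of_safety(c=5.0, phi_deg=30.0, theta_deg=25.0, z=2.0)
print(round(fs, 2))
```

Re-evaluating FS across simulated pore-pressure (m) scenarios is what lets the slope geometry extracted from the remote sensing be turned into a hazard grading.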

  17. Earthquake Hazard Assessment: an Independent Review

    NASA Astrophysics Data System (ADS)

    Kossobokov, Vladimir

    2016-04-01

    Seismic hazard assessment (SHA), from term-less (probabilistic PSHA or deterministic DSHA) to time-dependent (t-DASH), including short-term earthquake forecast/prediction (StEF), is not an easy task; it implies a delicate application of statistics to data of limited size and varying accuracy. Regretfully, in many cases of SHA, t-DASH, and StEF, the claims of a high potential and efficiency of the methodology are based on a flawed application of statistics and are hardly suitable for communication to decision makers. The necessity and possibility of applying the modified tools of Earthquake Prediction Strategies - in particular, the Error Diagram, introduced by G.M. Molchan in the early 1990s for the evaluation of SHA, and the Seismic Roulette null-hypothesis as a measure of the alerted space - is evident, and such testing must be done in advance of claiming hazardous areas and/or times. The set of errors, i.e. the rates of failure and of the alerted space-time volume, compared to those obtained in the same number of random-guess trials, permits evaluating the effectiveness of an SHA method and determining the optimal choice of parameters with regard to specified cost-benefit functions. This and other information obtained in such testing may supply us with a realistic estimate of confidence in SHA results and related recommendations on the level of risk for decision making with regard to engineering design, insurance, and emergency management. These basics of SHA evaluation are exemplified with a few cases of misleading "seismic hazard maps", "precursors", and "forecast/prediction methods".
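The Molchan error diagram described above plots, for each alarm threshold, the miss rate ν against the fraction τ of space-time on alarm; the diagonal ν = 1 − τ is the random-guess ("Seismic Roulette") baseline. A toy sketch with invented hazard scores:

```python
# Molchan error-diagram sketch with toy data. A useful method produces points
# well below the random-guess diagonal nu = 1 - tau.

def molchan_points(alarm_scores, event_scores, thresholds):
    """alarm_scores: hazard score of every space-time cell;
    event_scores: scores of the cells where target events occurred."""
    points = []
    for thr in thresholds:
        tau = sum(s >= thr for s in alarm_scores) / len(alarm_scores)  # alerted fraction
        nu = sum(s < thr for s in event_scores) / len(event_scores)    # miss rate
        points.append((tau, nu))
    return points

cells = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0]  # toy cell scores
events = [0.9, 0.8, 0.4]                                     # toy event-cell scores

pts = molchan_points(cells, events, thresholds=[0.25, 0.55, 0.85])
for tau, nu in pts:
    print(f"tau = {tau:.2f}, miss rate = {nu:.2f}")
```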

  18. The Active Fault Parameters for Time-Dependent Earthquake Hazard Assessment in Taiwan

    NASA Astrophysics Data System (ADS)

    Lee, Y.; Cheng, C.; Lin, P.; Shao, K.; Wu, Y.; Shih, C.

    2011-12-01

    Taiwan is located at the boundary between the Philippine Sea Plate and the Eurasian Plate, with a convergence rate of ~ 80 mm/yr in a ~N118E direction. The plate motion is so active that earthquakes are very frequent. In the Taiwan area, disaster-inducing earthquakes often result from active faults, so understanding the activity and hazard of active faults is an important subject. The active faults in Taiwan are mainly located in the Western Foothills and the Eastern Longitudinal Valley. The active fault distribution map published by the Central Geological Survey (CGS) in 2010 shows that there are 31 active faults on the island of Taiwan, some of which are related to past earthquakes. Many researchers have investigated these active faults and continuously update new data and results, but few have integrated them for time-dependent earthquake hazard assessment. In this study, we gather previous research and field work results and integrate these data into a table of active fault parameters for time-dependent earthquake hazard assessment. We combine seismic profiles or relocated earthquakes for each fault with its surface trace to establish a 3D fault geometry model in a GIS system. We collect studies of fault source scaling in Taiwan and estimate the maximum magnitude from fault length or fault area. We use the characteristic earthquake model to evaluate the earthquake recurrence interval of each active fault. For the other parameters, we collect previous studies or historical references to complete our parameter table of active faults in Taiwan. The WG08 project carried out a time-dependent earthquake hazard assessment of active faults in California, establishing fault models, deformation models, earthquake rate models, and probability models, and then computing rupture probabilities for faults in California. 
Following these steps, we present a preliminary evaluation of the probability of earthquake-related hazards in certain
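Two of the ingredients assembled above can be sketched quantitatively: a maximum magnitude from fault-rupture scaling and a conditional time-dependent rupture probability for a characteristic earthquake. The scaling coefficients follow the widely used Wells & Coppersmith (1994)-style all-fault-type area relation, and a simple lognormal renewal model stands in for the Brownian passage time model that WG08-style studies often use; the fault parameters are hypothetical.

```python
# Sketch: magnitude from rupture area, plus conditional probability of a
# characteristic earthquake under a lognormal renewal model. Coefficients are
# Wells & Coppersmith-style; the fault itself is hypothetical.
import math
from statistics import NormalDist

def mw_from_area(area_km2, a=4.07, b=0.98):
    """Mw = a + b * log10(A), rupture-area scaling."""
    return a + b * math.log10(area_km2)

def conditional_probability(t_elapsed, dt, mean_ri, cov=0.5):
    """P(event in [t, t+dt] | no event by t), lognormal recurrence interval."""
    sigma = math.sqrt(math.log(1.0 + cov ** 2))
    mu = math.log(mean_ri) - 0.5 * sigma ** 2
    nd = NormalDist(mu, sigma)
    cdf = lambda x: nd.cdf(math.log(x))
    return (cdf(t_elapsed + dt) - cdf(t_elapsed)) / (1.0 - cdf(t_elapsed))

# Hypothetical fault: 60 km x 15 km rupture area, last event 150 yr ago,
# mean recurrence interval 250 yr; probability of rupture in the next 50 yr.
mw = mw_from_area(60 * 15)
p50 = conditional_probability(150.0, 50.0, 250.0)
print(f"Mw ~ {mw:.1f}, P(50 yr) = {p50:.2f}")
```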

  19. Physiologic Basis for Understanding Quantitative Dehydration Assessment

    DTIC Science & Technology

    2012-01-01

    Perspective: Physiologic basis for understanding quantitative dehydration assessment. Samuel N Cheuvront, Robert W Kenefick, Nisha Charkoudian, and Michael N Sawka. ABSTRACT: Dehydration (body water deficit) is a physiologic state that can have profound implications for human health and performance...review the physiologic basis for understanding quantitative dehydration assessment. We highlight how phenomenologic interpretations of dehydration

  20. U.S. States and Territories National Tsunami Hazard Assessment: Historical record and sources for waves – Update

    USGS Publications Warehouse

    Dunbar, Paula K.; Weaver, Craig S.

    2015-01-01

    The first U.S. Tsunami Hazard Assessment (Dunbar and Weaver, 2008) was prepared at the request of the National Tsunami Hazard Mitigation Program (NTHMP). The NTHMP is a partnership formed between federal and state agencies to reduce the impact of tsunamis through hazard assessment, warning guidance, and mitigation. The assessment was conducted in response to a 2005 joint report by the Sub-Committee on Disaster Reduction and the U.S. Group on Earth Observations entitled Tsunami Risk Reduction for the United States: A Framework for Action. The first specific action called for in the Framework was to “develop standardized and coordinated tsunami hazard and risk assessments for all coastal regions of the United States and its territories.” Since the first assessment, there have been a number of very significant tsunamis, including the 2009 Samoa, 2010 Chile, and 2011 Japan tsunamis. As a result, the NTHMP requested an update of the U.S. tsunami hazard assessment.

  1. All Hazards Risk Assessment Transition Project: Report on Capability Assessment Management System (CAMS) Automation

    DTIC Science & Technology

    2014-04-01

    All Hazards Risk Assessment Transition Project: Report on Capability Assessment Management System (CAMS) Automation. Prepared by: George Giroux, Computer Applications Specialist, Modis, 155 Queen Street, Suite 1206, Ottawa, ON K1P 6L1. Contract # THS 2335474-2. … Under a Canadian Safety and Security Program (CSSP) targeted investigation (TI) project (CSSP-2012-TI-1108), Defence Research and Development

  2. Multi-hazard risk assessment of the Republic of Mauritius

    NASA Astrophysics Data System (ADS)

    Mysiak, Jaroslav; Galli, Alberto; Amadio, Mattia; Teatini, Chiara

    2013-04-01

    The Republic of Mauritius (ROM) is a small island developing state (SIDS), part of the Mascarene Islands in the western Indian Ocean, comprising the islands of Mauritius, Rodrigues, Agalega and St. Brandon and several islets. ROM is exposed to many natural hazards, notably cyclones, tsunamis, torrential precipitation, landslides, and droughts, and is highly vulnerable to sea level rise (SLR) driven by human-induced climate change. The multi-hazard risk assessment presented in this paper is aimed at identifying the areas prone to flood, inundation and landslide hazard, and at informing the development of a strategy for disaster risk reduction (DRR) and climate change adaptation (CCA). The climate risk analysis, a central component of the study, is one of the first comprehensive climate modelling studies conducted for the country. Climate change may raise temperatures by 1-2 degrees Celsius by 2060-2070 and sizably increase the intensity and frequency of extreme precipitation events. According to the IPCC Fourth Assessment Report (AR4), the expected sea level rise ranges between 16 and 49 cm. Individually or in combination, the inland flood, coastal inundation and landslide hazards affect a large proportion of the country. Sea level rise and changes in precipitation regimes will amplify existing vulnerabilities and create new ones. The paper outlines an Action Plan for Disaster Risk Reduction that takes into account the likely effects of climate change. The Action Plan calls on the government to establish a National Platform for Disaster Risk Reduction, as recommended by the Hyogo Framework for Action (HFA) 2005-2015. It consists of nine recommendations which, if put into practice, will significantly reduce the annual damage due to natural hazards and produce additional (ancillary) benefits in economic, social and environmental terms.

  3. Benefits Assessment of Two California Hazardous Waste Disposal Facilities (1983)

    EPA Pesticide Factsheets

    The purpose of this study was to assess the benefits of RCRA regulations, comparing results before and after the new regulations at two existing hazardous waste sites previously regulated under California state law.

  4. Regional landslide-hazard assessment for Seattle, Washington, USA

    USGS Publications Warehouse

    Baum, R.L.; Coe, J.A.; Godt, J.W.; Harp, E.L.; Reid, M.E.; Savage, W.Z.; Schulz, W.H.; Brien, D.L.; Chleborad, A.F.; McKenna, J.P.; Michael, J.A.

    2005-01-01

    Landslides are a widespread, frequent, and costly hazard in Seattle and the Puget Sound area of Washington State, USA. Shallow earth slides triggered by heavy rainfall are the most common type of landslide in the area; many transform into debris flows and cause significant property damage or disrupt transportation. Large rotational and translational slides, though less common, also cause serious property damage. The hundreds of landslides that occurred during the winters of 1995-96 and 1996-97 stimulated renewed interest by Puget Sound communities in identifying landslide-prone areas and taking actions to reduce future landslide losses. Informal partnerships between the U.S. Geological Survey (USGS), the City of Seattle, and private consultants are focusing on the problem of identifying and mapping areas of landslide hazard as well as characterizing temporal aspects of the hazard. We have developed GIS-based methods to map the probability of landslide occurrence as well as empirical rainfall thresholds and physically based methods to forecast times of landslide occurrence. Our methods for mapping landslide hazard zones began with field studies and physically based models to assess relative slope stability, including the effects of material properties, seasonal groundwater levels, and rainfall infiltration. We have analyzed the correlation between historic landslide occurrence and relative slope stability to map the degree of landslide hazard. The City of Seattle is using results of the USGS studies in storm preparedness planning for emergency access and response, planning for development or redevelopment of hillsides, and municipal facility planning and prioritization. Methods we have developed could be applied elsewhere to suit local needs and available data.
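    The empirical rainfall thresholds mentioned above are commonly expressed in the Caine (1980) intensity-duration form I = c·D^(-b). A minimal sketch of such a threshold check, using Caine's global coefficients purely for illustration (they are not the Seattle-specific values from the USGS studies):

    ```python
    def exceeds_threshold(intensity_mm_h: float, duration_h: float,
                          c: float = 14.82, b: float = 0.39) -> bool:
        """Check a Caine-type intensity-duration threshold I = c * D**(-b).

        Defaults are Caine's (1980) global coefficients (mm/h, hours),
        used for illustration only; operational thresholds are fitted
        to local landslide-triggering storm records.
        """
        return intensity_mm_h >= c * duration_h ** (-b)

    # A 24-hour storm at 10 mm/h exceeds the global threshold (~4.3 mm/h).
    storm_flagged = exceeds_threshold(10.0, 24.0)
    ```

    In practice the coefficients c and b are refit from the historical inventory, which is why the multi-year Seattle landslide record matters for forecasting.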

  5. Hazardous Materials Flow by Rail

    DOT National Transportation Integrated Search

    1990-03-01

    The report presents a quantitative overview of the movement of hazardous materials by rail in the United States. The data used is a hazardous materials rail waybill sample developed at TSC from the 1983 Rail Waybill Sample. The report examines (1) th...

  6. An Overview of GIS-Based Modeling and Assessment of Mining-Induced Hazards: Soil, Water, and Forest

    PubMed Central

    Kim, Sung-Min; Yi, Huiuk; Choi, Yosoon

    2017-01-01

    In this study, current geographic information system (GIS)-based methods and their application to the modeling and assessment of mining-induced hazards were reviewed. Various types of mining-induced hazard, including soil contamination, soil erosion, water pollution, and deforestation, were considered in the discussion of the strength and role of GIS as a viable problem-solving tool for mining-induced hazards. Each type of mining-induced hazard was classified into two or three subtopics according to the steps involved in the reclamation procedure or the elements of the hazard of interest. Because GIS is well suited to handling geospatial data related to mining-induced hazards, the application of GIS-based modeling and assessment of mining-induced hazards within the mining industry could be expanded further. PMID:29186922

  7. An Overview of GIS-Based Modeling and Assessment of Mining-Induced Hazards: Soil, Water, and Forest.

    PubMed

    Suh, Jangwon; Kim, Sung-Min; Yi, Huiuk; Choi, Yosoon

    2017-11-27

    In this study, current geographic information system (GIS)-based methods and their application to the modeling and assessment of mining-induced hazards were reviewed. Various types of mining-induced hazard, including soil contamination, soil erosion, water pollution, and deforestation, were considered in the discussion of the strength and role of GIS as a viable problem-solving tool for mining-induced hazards. Each type of mining-induced hazard was classified into two or three subtopics according to the steps involved in the reclamation procedure or the elements of the hazard of interest. Because GIS is well suited to handling geospatial data related to mining-induced hazards, the application of GIS-based modeling and assessment of mining-induced hazards within the mining industry could be expanded further.

  8. Decreased pain sensitivity due to trimethylbenzene exposure: case study on quantitative approaches for hazard identification

    EPA Science Inventory

    Traditionally, human health risk assessments have relied on qualitative approaches for hazard identification, often using the Hill criteria and weight of evidence determinations to integrate data from multiple studies. Recently, the National Research Council has recommended the ...

  9. Quality-of-life-adjusted hazard of death: a formulation of the quality-adjusted life-years model of use in benefit-risk assessment.

    PubMed

    Garcia-Hernandez, Alberto

    2014-03-01

    Although the quality-adjusted life-years (QALY) model is standard in health technology assessment, quantitative methods are less frequent but increasingly used for benefit-risk assessment (BRA) at earlier stages of drug development. A frequent challenge when implementing metrics for BRA is to weigh the importance of effects on a chronic condition against the risk of severe events during the trial. The lifetime component of the QALY model has a counterpart in the BRA context, namely, the risk of dying during the study. A new concept is presented, the hazard of death function that a subject is willing to accept instead of the baseline hazard to improve his or her chronic health status, which we have called the quality-of-life-adjusted hazard of death. It has been proven that if assumptions of the linear QALY model hold, the excess mortality rate tolerated by a subject for a chronic health improvement is inversely proportional to the mean residual life. This result leads to a new representation of the linear QALY model in terms of hazard rate functions and allows utilities obtained by using standard methods involving trade-offs of life duration to be translated into thresholds of tolerated mortality risk during a short period of time, thereby avoiding direct trade-offs using small probabilities of events during the study, which is known to lead to bias and variability. Copyright © 2014 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
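    The paper's central result, that the excess mortality rate a subject tolerates for a chronic health improvement is inversely proportional to mean residual life, can be sketched as follows. The constant-hazard (exponential) survival model and the free proportionality constant k are our simplifying assumptions; the paper's exact expressions are not reproduced here:

    ```python
    def mean_residual_life(hazard: float) -> float:
        """Mean residual life under a constant-hazard (exponential)
        survival model: MRL = 1 / hazard. The exponential assumption is
        ours, for tractability; the paper treats general hazard functions."""
        return 1.0 / hazard

    def tolerated_excess_hazard(k: float, mrl: float) -> float:
        """Excess mortality rate tolerated for a chronic health
        improvement, taken as inversely proportional to mean residual
        life (the paper's result). The constant k, which encodes the
        utility trade-off, is left as an input because its exact form
        is not reproduced here."""
        return k / mrl

    # e.g. baseline hazard 0.02/yr -> MRL = 50 yr; with a hypothetical
    # k = 0.25, the tolerated excess hazard is 0.005/yr.
    mrl = mean_residual_life(0.02)
    excess = tolerated_excess_hazard(0.25, mrl)
    ```

    The design point is that utilities elicited with long-horizon trade-offs translate into short-term mortality-risk thresholds without asking subjects to trade off small event probabilities directly.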

  10. RF number as a new index for assessing combustion hazard of flammable gases.

    PubMed

    Kondo, Shigeo; Takahashi, Akifumi; Tokuhashi, Kazuaki; Sekiya, Akira

    2002-08-05

    A new index called the RF number is proposed for assessing the combustion hazard of all sorts of flammable gases and their mixtures. The RF number represents the total expected combustion hazard in terms of flammability limits and heat of combustion, and can be evaluated for both known and unknown compounds. The advantage of the RF number over other indices, such as the R-index and F-number, for classifying combustion hazard is highlighted.

  11. A spatiotemporal multi-hazard exposure assessment based on property data

    NASA Astrophysics Data System (ADS)

    Fuchs, S.; Keiler, M.; Zischg, A.

    2015-09-01

    The paper presents a nation-wide, spatially explicit, object-based assessment of buildings and citizens exposed to natural hazards in Austria, including river flooding, torrential flooding, and snow avalanches. The assessment was based on two different data sets: (a) hazard information providing input to the exposure of elements at risk, and (b) information on the building stock combined from different spatial data available at the national level. Hazard information was compiled from two different sources: for torrential flooding and snow avalanches, available local-scale hazard maps were used, and for river flooding, the results of the countrywide flood modelling eHORA were available. Information on the building stock covered the location and size of each building, as well as the building category and construction period. Additional information on the individual floors, such as their height and net area, main purpose and configuration, was included for each property. Moreover, this data set has an interface to the population register and therefore allowed the number of primary residents of each building to be retrieved. With the exception of sacral buildings, an economic module was used to compute the monetary value of buildings using (a) information from the building register such as building type, number of storeys and utilisation, and (b) regionally averaged construction costs. It is shown that the repeatedly stated assumption of increasing exposure due to continued population growth and the related increase in assets has to be carefully evaluated against the local development of the building stock. While some regions have shown a clearly above-average increase in assets, other regions were characterised by below-average development. This mirrors the topography of the country, but also the different economic activities. While hotels and hostels are extraordinarily prone to torrential flooding, commercial buildings as well as buildings used

  12. Environmental Hazards Assessment Program. Quarterly report, July--September 1995

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    This report describes activities and reports on progress for the first quarter (July--September) of the fourth year of the grant to support the Environmental Hazards Assessment Program (EHAP) at the Medical University of South Carolina. It reports progress against the grant objectives and the Program Implementation Plan published at the end of the first year of the grant. The objectives of EHAP stated in the proposal to DOE are to: (1) develop a holistic, national basis for risk assessment, risk management, and risk communication that recognizes the direct impact of environmental hazards on the health and well-being of all; (2) develop a pool of talented scientists and experts in cleanup activities, especially in human health aspects; and (3) identify needs and develop programs addressing the critical shortage of well-educated, highly-skilled technical and scientific personnel to address the health-oriented aspects of environmental restoration and waste management.

  13. Natural Hazard Susceptibility Assessment for Road Planning Using Spatial Multi-Criteria Analysis

    NASA Astrophysics Data System (ADS)

    Karlsson, Caroline S. J.; Kalantari, Zahra; Mörtberg, Ulla; Olofsson, Bo; Lyon, Steve W.

    2017-11-01

    Inadequate infrastructure networks can be detrimental to society if transport between locations becomes hindered or delayed, especially due to natural hazards, which are difficult to control. Determining natural-hazard-susceptible areas and incorporating them in the initial planning process may therefore reduce infrastructure damage in the long run. The objective of this study was to evaluate the usefulness of expert judgments for assessing natural hazard susceptibility through a spatial multi-criteria analysis approach using hydrological, geological, and land use factors. To utilize spatial multi-criteria analysis for decision support, an analytic hierarchy process was adopted in which expert judgments were evaluated individually and in aggregate. The estimates of susceptible areas were then compared with those from weighted linear combination using equal weights and from the factor interaction method. Results showed that inundation received the highest susceptibility. Expert judgment performed almost the same as equal weighting; the difference in inundation susceptibility between the two was around 4%. The results also showed that downscaling could negatively affect the susceptibility assessment and be highly misleading. Susceptibility assessment through spatial multi-criteria analysis is useful for decision support in early road planning, despite its sensitivity to the selection and use of decision rules and criteria. A natural hazard spatial multi-criteria analysis could be used to indicate areas where more investigation is needed from a natural hazard point of view, and to identify areas of higher susceptibility along existing roads where mitigation measures could be targeted after in-situ investigations.
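    The analytic hierarchy process step can be sketched as follows: criterion weights are taken from the principal eigenvector of a pairwise comparison matrix, with a consistency check. The matrix values below are hypothetical, not the study's actual expert judgments:

    ```python
    import numpy as np

    # Hypothetical pairwise comparisons for three factors (hydrology,
    # geology, land use) on Saaty's 1-9 scale; illustration only.
    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])

    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)              # principal eigenvalue index
    w = np.abs(eigvecs[:, k].real)
    weights = w / w.sum()                    # normalised criterion weights

    # Consistency ratio (CR < 0.1 is conventionally acceptable);
    # RI = 0.58 is Saaty's random index for a 3x3 matrix.
    n = A.shape[0]
    CI = (eigvals.real[k] - n) / (n - 1)
    CR = CI / 0.58
    ```

    The resulting weights then multiply the normalised factor maps cell by cell, which is how the expert-judgment variant differs from the equal-weights combination it was compared against.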

  14. Natural Hazard Susceptibility Assessment for Road Planning Using Spatial Multi-Criteria Analysis.

    PubMed

    Karlsson, Caroline S J; Kalantari, Zahra; Mörtberg, Ulla; Olofsson, Bo; Lyon, Steve W

    2017-11-01

    Inadequate infrastructure networks can be detrimental to society if transport between locations becomes hindered or delayed, especially due to natural hazards, which are difficult to control. Determining natural-hazard-susceptible areas and incorporating them in the initial planning process may therefore reduce infrastructure damage in the long run. The objective of this study was to evaluate the usefulness of expert judgments for assessing natural hazard susceptibility through a spatial multi-criteria analysis approach using hydrological, geological, and land use factors. To utilize spatial multi-criteria analysis for decision support, an analytic hierarchy process was adopted in which expert judgments were evaluated individually and in aggregate. The estimates of susceptible areas were then compared with those from weighted linear combination using equal weights and from the factor interaction method. Results showed that inundation received the highest susceptibility. Expert judgment performed almost the same as equal weighting; the difference in inundation susceptibility between the two was around 4%. The results also showed that downscaling could negatively affect the susceptibility assessment and be highly misleading. Susceptibility assessment through spatial multi-criteria analysis is useful for decision support in early road planning, despite its sensitivity to the selection and use of decision rules and criteria. A natural hazard spatial multi-criteria analysis could be used to indicate areas where more investigation is needed from a natural hazard point of view, and to identify areas of higher susceptibility along existing roads where mitigation measures could be targeted after in-situ investigations.

  15. QUANTITATIVE PROCEDURES FOR NEUROTOXICOLOGY RISK ASSESSMENT

    EPA Science Inventory

    In this project, previously published information on biologically based dose-response model for brain development was used to quantitatively evaluate critical neurodevelopmental processes, and to assess potential chemical impacts on early brain development. This model has been ex...

  16. Integrated Risk Assessment to Natural Hazards in Motozintla, Chiapas, Mexico

    NASA Astrophysics Data System (ADS)

    Novelo-Casanova, D. A.

    2012-12-01

    An integrated risk assessment includes the analysis of all components of the individual constituents of risk, such as a baseline study, hazard identification and categorization, hazard exposure, and vulnerability. Vulnerability refers to the inability of people, organizations, and societies to withstand adverse impacts from the multiple stressors to which they are exposed. These impacts are due to characteristics inherent in social interactions, institutions, and systems of cultural values. Thus, social vulnerability is a pre-existing condition that affects a society's ability to prepare for and recover from a disruptive event. Risk is the probability of a loss, and this loss depends on three elements: hazard, exposure, and vulnerability. Thus, risk is the estimated impact that a hazard event would have on people, services, facilities, structures and assets in a community. In this work we assess the risk from natural hazards in the community of Motozintla, located in southern Mexico in the state of Chiapas (15.37N, 92.25W), with a population of about 20,000 inhabitants. Due to its geographical and geological location, this community is continuously exposed to many different natural hazards (earthquakes, landslides, volcanic eruptions, and floods). To determine the level of exposure of the community to natural hazards, we developed integrated studies and analyses of seismic microzonation, landslide and flood susceptibility, as well as volcanic impact, using standard methodologies. Social vulnerability was quantified from data obtained from interviews with local families. Five variables were considered: household structure quality and design, availability of basic public services, family economic conditions, existing family plans for disaster preparedness, and risk perception. The number of families surveyed was determined to give a statistically significant sample. The families interviewed were selected using the simple random sampling technique with replacement. With these

  17. Assessing community vulnerabilities to natural hazards on the Island of Hawaii

    NASA Astrophysics Data System (ADS)

    Nishioka, Chris; Delparte, Donna

    2010-05-01

    The island of Hawaii is susceptible to numerous natural hazards such as tsunamis, flooding, lava flow, earthquakes, hurricanes, landslides, wildfires and storm surge. The impact of a natural disaster on the island's communities has the potential to endanger peoples' lives and threaten critical infrastructure, homes, businesses and economic drivers such as tourism. A Geographic Information System (GIS) has the ability to assess community vulnerabilities by examining the spatial relationships between hazard zones, socioeconomic infrastructure and demographic data. By drawing together existing datasets, GIS was used to examine a number of community vulnerabilities. Key areas of interest were government services, utilities, property assets, industry and transportation. GIS was also used to investigate population dynamics in hazard zones. Identification of community vulnerabilities from GIS analysis can support mitigation measures and assist planning and response measures to natural hazards.

  18. Assessment of the Microscreen phage-induction assay for screening hazardous wastes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Houk, V.S.; DeMarini, D.M.

    1987-09-01

    The Microscreen phage-induction assay, which quantitatively measures the induction of prophage lambda in Escherichia coli WP2s(lambda), was used to test 14 crude (unfractionated) hazardous industrial waste samples for genotoxic activity in the presence and absence of metabolic activation. Eleven of the 14 wastes induced prophage, and induction was observed at concentrations as low as 0.4 picograms per ml. Comparisons between the mutagenicity of these waste samples in Salmonella and their ability to induce prophage lambda indicate that the Microscreen phage-induction assay detected genotoxic activity in all but one of the wastes that were mutagenic in Salmonella. Moreover, the Microscreen assay detected as genotoxic 5 additional wastes that were not detected in the Salmonella assay. The applicability of the Microscreen phage-induction assay for screening hazardous wastes for genotoxic activity is discussed along with some of the problems associated with screening highly toxic wastes containing toxic volatile compounds.

  19. Landslide hazard in Bukavu (DR Congo): a geomorphological assessment in a data-poor context

    NASA Astrophysics Data System (ADS)

    Dewitte, Olivier; Mugaruka Bibentyo, Toussaint; Kulimushi Matabaro, Sylvain; Balegamire, Clarisse; Basimike, Joseph; Delvaux, Damien; Dille, Antoine; Ganza Bamulezi, Gloire; Jacobs, Liesbet; Michellier, Caroline; Monsieurs, Elise; Mugisho Birhenjira, Espoir; Nshokano, Jean-Robert; Nzolang, Charles; Kervyn, François

    2017-04-01

    Many cities in the Global South are experiencing a marked increase in population. Many of them are struggling with the sprawl of new settlements, and urban planning and sustainable management policies are often limited, if not non-existent. When these cities are set in landslide-prone environments, the situation is even more problematic. Despite these environmental constraints, landslide hazard assessments relevant for landscape planning remain rare. The objective of this research is to assess the landslide hazard in Bukavu, a city in DR Congo facing such a situation. We used a geomorphological approach (adapted from Cardinali et al., 2002) taking into account the data-poor context and the impact of anthropogenic activities. First, we built a multi-temporal historical inventory covering a period of 60 years. A total of 151 landslides were mapped (the largest covering 1.5 km2). Their cumulative area covers 29% of the urban territory, and several types of processes are identified. Changes in the distribution and pattern of landslides then allowed us to infer the possible evolution of the slopes, the most probable types of failure, and their expected frequency of occurrence and intensity. Despite this comprehensive inventory, the hazard linked to the occurrence of new large deep-seated slides cannot be assessed due to a scarcity of reliable data on the environmental factors controlling their occurrence. In addition, age estimates place some of the largest landslides at the beginning of the Holocene, when climatic and seismic conditions were probably different. Therefore, based on the inventory, we propose four hazard scenarios that coincide with today's environment. Hazard assessment was done for (1) reactivation of deep-seated slides, (2) occurrence of new small shallow slides, (3) rock falls, and (4) movements within existing landslides. Based on these assessments, we produced four hazard maps that indicate the

  20. Predictive models in hazard assessment of Great Lakes contaminants for fish

    USGS Publications Warehouse

    Passino, Dora R. May

    1986-01-01

    A hazard assessment scheme was developed and applied to predict potential harm to aquatic biota of nearly 500 organic compounds detected by gas chromatography/mass spectrometry (GC/MS) in Great Lakes fish. The frequency of occurrence and estimated concentrations of compounds found in lake trout (Salvelinus namaycush) and walleyes (Stizostedion vitreum vitreum) were compared with available manufacturing and discharge information. Bioconcentration potential of the compounds was estimated from available data or from calculations of quantitative structure-activity relationships (QSAR). Investigators at the National Fisheries Research Center-Great Lakes also measured the acute toxicity (48-h EC50's) of 35 representative compounds to Daphnia pulex and compared the results with acute toxicity values generated by QSAR. The QSAR-derived toxicities for several chemicals underestimated the actual acute toxicity by one or more orders of magnitude. A multiple regression of log EC50 on log water solubility and molecular volume proved to be a useful predictive model. Additional models providing insight into toxicity incorporate solvatochromic parameters that measure dipolarity/polarizability, hydrogen bond acceptor basicity, and hydrogen bond donor acidity of the solute (toxicant).
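    A multiple regression of log EC50 on log water solubility and molecular volume, of the kind used in the predictive model above, can be sketched with ordinary least squares. All data below are synthetic and the coefficients are invented; they are not the study's fitted values:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic illustration: 35 compounds with made-up log10 water
    # solubility, molecular volume, and log10 EC50 values.
    log_sol = rng.uniform(-6, 0, 35)
    mol_vol = rng.uniform(100, 400, 35)
    log_ec50 = 0.8 * log_sol - 0.004 * mol_vol + 1.0 + rng.normal(0, 0.1, 35)

    # Design matrix with intercept; fit by least squares.
    X = np.column_stack([np.ones_like(log_sol), log_sol, mol_vol])
    coef, *_ = np.linalg.lstsq(X, log_ec50, rcond=None)
    intercept, b_sol, b_vol = coef

    pred = X @ coef   # fitted log EC50 values for the 35 compounds
    ```

    A regression of this form gives a quick screening-level toxicity estimate for compounds that lack measured EC50 values, which is the role QSAR played in the hazard assessment scheme.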

  1. THE ROLE OF RISK ASSESSMENT IN ADDRESSING HAZARDOUS WASTE ISSUES

    EPA Science Inventory

    Risk assessment plays many important roles in addressing hazardous waste issues. In addition to providing a scientific framework and a common health metric for evaluating risks, Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA or "Superfund") risk assessm...

  2. Application of quantitative microbial risk assessments for estimation of risk management metrics: Clostridium perfringens in ready-to-eat and partially cooked meat and poultry products as an example.

    PubMed

    Crouch, Edmund A; Labarre, David; Golden, Neal J; Kause, Janell R; Dearfield, Kerry L

    2009-10-01

    The U.S. Department of Agriculture, Food Safety and Inspection Service is exploring quantitative risk assessment methodologies that incorporate the Codex Alimentarius' newly adopted risk management metrics (e.g., food safety objectives and performance objectives). It is suggested that use of these metrics would more closely tie the results of quantitative microbial risk assessments (QMRAs) to public health outcomes. By estimating the food safety objective (the maximum frequency and/or concentration of a hazard in a food at the time of consumption) and the performance objective (the maximum frequency and/or concentration of a hazard in a food at a specified step in the food chain before the time of consumption), risk managers will have a better understanding of the appropriate level of protection (ALOP) from microbial hazards for public health protection. We here demonstrate a general methodology that allows identification of an ALOP and evaluation of corresponding metrics at appropriate points in the food chain. It requires a two-dimensional probabilistic risk assessment, the example used being the Monte Carlo QMRA for Clostridium perfringens in ready-to-eat and partially cooked meat and poultry products, with minor modifications to evaluate and abstract the required measures. For demonstration purposes, the QMRA model was applied specifically to hot dogs produced and consumed in the United States. Evaluation of the cumulative uncertainty distribution for illness rate allows specification of an ALOP that, with defined confidence, corresponds to current industry practices.
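    A two-dimensional (uncertainty over variability) Monte Carlo of the kind described above can be sketched as follows. Every distribution and parameter value here is invented for illustration; this is not the FSIS C. perfringens model:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n_unc, n_var = 200, 1000   # outer: uncertainty; inner: variability

    illness_rates = np.empty(n_unc)
    for i in range(n_unc):
        # Outer loop: draw uncertain parameters (hypothetical priors).
        mean_log_dose = rng.normal(2.0, 0.5)     # log10 CFU at consumption
        r = 10 ** rng.uniform(-8, -6)            # dose-response slope
        # Inner loop: variability across individual servings.
        dose = 10 ** rng.normal(mean_log_dose, 1.0, n_var)
        p_ill = 1.0 - np.exp(-r * dose)          # exponential dose-response
        illness_rates[i] = p_ill.mean()

    # 95th percentile of the uncertainty distribution: one candidate
    # basis for an ALOP held "with defined confidence".
    alop_candidate = np.quantile(illness_rates, 0.95)
    ```

    Separating the two loops is what lets a risk manager read off an illness rate that is exceeded in only, say, 5% of the uncertainty realisations, rather than a single blended estimate.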

  3. Resilience to Interacting multi-natural hazards

    NASA Astrophysics Data System (ADS)

    Zhuo, Lu; Han, Dawei

    2016-04-01

    temporal changes in hazards and vulnerability during successive hazards; 2) hazard monitoring, forecasting and early warning systems have not fully utilised the domain knowledge of physical processes and the statistical information in the observations; 3) uncertainties have not been well recognised in current risk management practice, and ignoring them could pose a major threat to society and lead to poorly considered, inefficient or unsustainable choices among options; 4) there is increasing recognition that so-called 'natural' disasters are not just the consequences of nature-related processes alone, but are attributable to various social, economic, historical, political and cultural causes. Despite this recognition, however, current hazard and risk assessments are fragmented, with a weakness in holistically combining quantitative and qualitative information from a variety of sources; 5) successful disaster risk management must be relevant and useful to all stakeholders involved. Efforts should enable the essential common purpose, collective learning and entrepreneurial collaborations that underpin effective and efficient resilience. There is therefore an urgent need for a systems-thinking framework and decision support tools for adequate scenario assessment and resilience development from a harmonised and transdisciplinary perspective. The aforementioned issues should be tackled with a joint effort from a multidisciplinary team spanning social science, natural science, engineering and systems.

  4. Rockfall hazard and risk assessments along roads at a regional scale: example in Swiss Alps

    NASA Astrophysics Data System (ADS)

    Michoud, C.; Derron, M.-H.; Horton, P.; Jaboyedoff, M.; Baillifard, F.-J.; Loye, A.; Nicolet, P.; Pedrazzini, A.; Queyrel, A.

    2012-03-01

    Unlike fragmental rockfall runout assessment, there are only a few robust methods to quantify rock-mass-failure susceptibilities at the regional scale. A detailed slope angle analysis of recent Digital Elevation Models (DEM) can be used to detect potential rockfall source areas, thanks to the Slope Angle Distribution procedure. However, this method does not provide any information on block-release frequencies inside the identified areas. The present paper adds to the Slope Angle Distribution of the cliff unit its normalized cumulative distribution function. This improvement amounts to a quantitative weighting of slope angles, introducing rock-mass-failure susceptibilities inside the rockfall source areas previously detected. Rockfall runout assessment is then performed using the GIS- and process-based software Flow-R, providing relative frequencies for runout. Taking both susceptibility results into consideration, this approach can be used to establish, after calibration, hazard and risk maps at the regional scale. As an example, a risk analysis of vehicle traffic exposed to rockfalls is performed along the main roads of the Swiss alpine valley of Bagnes.
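    The normalized cumulative distribution of slope angles used as a susceptibility weight can be sketched as an empirical CDF over the cliff unit's slope population. The slope values below are hypothetical, and the paper's actual procedure is more involved:

    ```python
    import numpy as np

    def susceptibility_weight(unit_slopes_deg, cell_slopes_deg):
        """Weight source cells by the normalised cumulative distribution
        of slope angles in the cliff unit, so steeper (rarer) cells get
        weights approaching 1. A sketch of the idea only."""
        slopes_sorted = np.sort(unit_slopes_deg)
        # Empirical CDF evaluated at each candidate cell's slope angle.
        rank = np.searchsorted(slopes_sorted, cell_slopes_deg, side="right")
        return rank / slopes_sorted.size

    # Hypothetical cliff-unit slope population and three candidate cells.
    unit = np.random.default_rng(2).uniform(35, 80, 5000)
    w = susceptibility_weight(unit, np.array([40.0, 60.0, 75.0]))
    ```

    The weight can then scale the relative release frequency fed into the runout model, so that steeper source cells contribute more to the propagated hazard.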

  5. The hostel or the warehouse? Spatiotemporal exposure assessment for natural hazards

    NASA Astrophysics Data System (ADS)

    Fuchs, S.; Keiler, M.; Zischg, A.

    2015-04-01

    A spatially explicit, object-based temporal assessment of buildings and citizens exposed to natural hazards in Austria is presented, covering elements at risk to river flooding, torrential flooding, and snow avalanches. It is shown that the repeatedly stated assumption of increasing losses due to continued population growth and the related increase in assets must be weighed against the local development of the building stock. While some regions showed a clearly above-average increase in assets, others were characterised by below-average development. This mirrors the topography of the country, but also the different economic activities. While hotels and hostels are extraordinarily prone to mountain hazards, commercial buildings as well as buildings used for recreational purposes are considerably exposed to river flooding. Residential buildings showed average exposure relative to the share of this building type in the overall building stock. In sum, around 5% of all buildings are exposed to mountain hazards and around 9% to river flooding, with around 1% of the building stock being multi-exposed. It is shown that the dynamics of exposed elements at risk lag in time once land-use regulations are enforced, and it is concluded that an object-based assessment has clear advantages over assessment using aggregated land-use data.

  6. Values of Flood Hazard Mapping for Disaster Risk Assessment and Communication

    NASA Astrophysics Data System (ADS)

    Sayama, T.; Takara, K. T.

    2015-12-01

    Flood plains provide tremendous benefits for human settlements. Since ancient times people have lived with floods and attempted to control them when necessary. Modern engineering works such as embankments have enabled people to live even in flood-prone areas, and over time population and economic assets have concentrated there. In developing countries, too, rapid land-use change alters exposure and vulnerability to floods and consequently increases disaster risk. Flood hazard mapping is an essential step for any countermeasure. It serves various objectives, including raising residents' awareness, finding effective evacuation routes and estimating potential damages through flood risk mapping. Depending on the objectives and data availability, there are also many possible approaches to hazard mapping, including simulation-based, community-based and remote-sensing-based approaches. In addition to traditional paper-based hazard maps, Information and Communication Technology (ICT) enables more interactive hazard mapping, such as movable hazard maps that demonstrate scenario simulations for risk communication, and real-time hazard mapping for effective disaster response and safe evacuation. This presentation first summarizes recent advances in flood hazard mapping, focusing on Japanese experiences and other examples from Asian countries. It then introduces a flood simulation tool suitable for hazard mapping at the river-basin scale even in data-limited regions. In the past few years, the tool has been used in practice by local officers responsible for disaster management in Asian countries. Through these training activities in hazard mapping and risk assessment, we conduct comparative analysis to identify similarities and unique features of estimated economic damages depending on topographic and land-use conditions.

  7. Transparent Global Seismic Hazard and Risk Assessment

    NASA Astrophysics Data System (ADS)

    Smolka, Anselm; Schneider, John; Pinho, Rui; Crowley, Helen

    2013-04-01

    Vulnerability to earthquakes is increasing, yet advanced, reliable risk assessment tools and data are inaccessible to most, despite being a critical basis for managing risk. Also, there are few, if any, global standards that allow us to compare risk between various locations. The Global Earthquake Model (GEM) is a unique collaborative effort that aims to provide organizations and individuals with tools and resources for transparent assessment of earthquake risk anywhere in the world. By pooling data, knowledge and people, GEM acts as an international forum for collaboration and exchange, and leverages the knowledge of leading experts for the benefit of society. Sharing data, risk information, best practices, and approaches across the globe is key to assessing risk more effectively. Through global projects, open-source IT development and collaborations with more than 10 regions, leading experts are collaboratively developing unique global datasets, best practices, and open tools and models for seismic hazard and risk assessment. Guided by the needs and experiences of governments, companies and citizens at large, they work in continuous interaction with the wider community. A continuously expanding public-private partnership constitutes the GEM Foundation, which drives the collaborative GEM effort. An integrated and holistic approach to risk is key to GEM's risk assessment platform, OpenQuake, which integrates all the above-mentioned contributions and will become available towards the end of 2014. Stakeholders worldwide will be able to calculate, visualise and investigate earthquake risk, capture new data and share their findings for joint learning. Homogenized information on hazard can be combined with data on exposure (buildings, population) and on their vulnerability, for loss assessment around the globe. Furthermore, for a truly integrated view of seismic risk, users can add social vulnerability and resilience indices to maps and estimate the costs and benefits

  8. GIS thematic layers for assessing karst hazard in Murgia region (Italy)

    NASA Astrophysics Data System (ADS)

    Canora, Filomena; D'Angella, Annachiara; Fidelibus, Dolores; Lella, Angela; Pellicani, Roberta; Spilotro, Giuseppe

    2013-04-01

    The assessment of karst hazard in a carbonate area may be rather complex, owing to the multiplicity of factors involved (geological, hydrological, morphological, anthropogenic, etc.), their history and the slow rate of evolution of the processes. In coastal areas, moreover, long-term sea-level variations and short-term oscillations generally influence the generation and evolution of the karst process. Another peculiarity of karst hazard assessment lies in the difficulty of identifying the location of subsurface forms, which may develop over very large areas without any kind of surface signal. Karst processes and landforms often require specific methods of investigation and mitigation, due to the unique and highly variable character of karst environments. In addition, the hidden character of karst processes, often accelerated by human activity, is an issue with significant economic impact, affecting many regions of the world. The assessment of karst hazard in the Murgia plateau (in the central-western Apulia region) is the main goal of this research. To this end, the typologies of karst phenomena able to produce hazard in the study area were identified and collected in a specific database. The hazard was evaluated on the basis of the probability of occurrence of a phenomenon of instability, active (produced by human activities) or passive (natural evolution of the karst process), in relation to the presence, evolution or generation of karst forms at the surface or at a critical distance from it. The critical distance from the surface is defined as the distance at which the local or general destructive evolution of a karst process can change the usability of the area or the value of the elements involved in the instability. The thematic layers for the factors influencing karst processes and landforms (dolines, sinkholes, poljes, lame, gravine, caves) were elaborated and managed in a GIS.
The archives of the main karst

  9. Assessment of a Tsunami Hazard for Mediterranean Coast of Egypt

    NASA Astrophysics Data System (ADS)

    Zaytsev, Andrey; Babeyko, Andrey; Yalciner, Ahmet; Pelinovsky, Efim

    2017-04-01

    An analysis of the tsunami hazard for Egypt, based on historical data and numerical modelling of historical and prognostic events, is given. There are 13 historical events over 4000 years, including one instrumental record (1956). The tsunami database includes 12 earthquake tsunamis and 1 event of volcanic origin (the Santorini eruption). The intensity of four events (365, 881, 1303, 1870) is estimated at I = 3, corresponding to tsunami wave heights of more than 6 m. Numerical simulation of several possible tsunami scenarios of seismic and landslide origin is performed with the NAMI-DANCE software, which solves the shallow-water equations. The PTHA method (Probabilistic Tsunami Hazard Assessment) for the Mediterranean Sea, developed in (Sorensen M.B., Spada M., Babeyko A., Wiemer S., Grunthal G. Probabilistic tsunami hazard in the Mediterranean Sea. J. Geophysical Research, 2012, vol. 117, B01305), is used to evaluate the probability of tsunami occurrence on the Egyptian coast. The synthetic catalogue of prognostic tsunamis of seismic origin with magnitudes above 6.5 includes 84,920 events over 100,000 years. For wave heights above 1 m, the exceedance probability versus tsunami height curve can be approximated by an exponential Gumbel function with two parameters, which are determined for each coastal location in Egypt (24 points in total). The most extreme prognostic events, with probabilities below 10^-4 (approximately 10 events), do not fit the Gumbel function and require special analysis. Acknowledgements: This work was supported by the EU FP7 ASTARTE Project [603839] and, for EP, by NS6637.2016.5.
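    The two-parameter Gumbel-type approximation of the exceedance curve can be sketched as follows; the tail form P(H > h) ≈ exp(-(h - a)/b) and the synthetic data below are illustrative assumptions, not values from the study's 24 coastal points:

```python
import numpy as np

def fit_gumbel_tail(heights, exceed_prob):
    """Fit the two-parameter exponential (Gumbel-type) tail
    P(H > h) ~= exp(-(h - a) / b) by linear regression of
    ln P against h; intended for heights above ~1 m."""
    h = np.asarray(heights, float)
    logp = np.log(np.asarray(exceed_prob, float))
    slope, intercept = np.polyfit(h, logp, 1)   # ln P = -h/b + a/b
    b = -1.0 / slope
    a = intercept * b
    return a, b

# synthetic illustration with known parameters a = 0.5 m, b = 2.0 m
h = np.array([1.0, 2.0, 3.0, 4.0, 6.0])
p = np.exp(-(h - 0.5) / 2.0)
a, b = fit_gumbel_tail(h, p)   # recovers a ≈ 0.5, b ≈ 2.0
```

    Once a and b are fitted per coastal point, wave heights for a target exceedance probability follow by inverting the tail expression; as the abstract notes, the rarest events (probability below 10^-4) fall outside this approximation.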

  10. Landslide Hazard Zonation and Risk Assessment of Ramganga Basin in Garhwal Himalaya

    NASA Astrophysics Data System (ADS)

    Wasini Pandey, Bindhy; Roy, Nikhil

    2016-04-01

    The Himalaya, unique in its physiographic, tectonic and climatic characteristics and affected by many natural and man-made factors, is inherently prone to landslides. These landslides cause massive losses of property and lives every year in the Himalayas. Hence, landslide hazard zonation is important for taking quick and safe mitigation measures and for strategic planning of future development. The present study explores the causes of landslides in the Ramganga Basin in the Garhwal Himalaya, an area with an established history of, and inherent susceptibility to, massive landslides, chosen here for landslide hazard zonation and risk assessment. Satellite imagery from LANDSAT, IRS P6 and ASTER, along with Survey of India (SOI) topographical sheets, formed the basis for deriving baseline information on parameters such as slope, aspect, relative relief, drainage density, geology/lithology and land use/land cover. The weighted parametric method will be used to determine the degree of susceptibility to landslides. Finally, a risk map will be prepared from the landslide probability values and classified into no risk, very low to moderate, high, and very high to severe landslide hazard risk zones. Keywords: Landslides, Hazard Zonation, Risk Assessment
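    A weighted parametric scheme of the kind mentioned can be sketched as a linear weighted overlay; the layer weights, ratings and zone thresholds below are hypothetical placeholders, since the abstract does not give the study's actual values:

```python
# Hypothetical layer weights (sum to 1) and normalized 0-1 class ratings;
# the study's calibrated values are not given in the abstract.
LAYER_WEIGHTS = {"slope": 0.30, "relative_relief": 0.20,
                 "drainage_density": 0.15, "lithology": 0.20,
                 "land_use": 0.15}

# lower bound of each zone, using the zone names from the study
ZONES = [(0.0, "no risk"), (0.2, "very low to moderate"),
         (0.5, "high"), (0.8, "very high to severe")]

def hazard_index(ratings):
    """Weighted linear sum of normalized parameter ratings per map cell."""
    return sum(LAYER_WEIGHTS[k] * ratings[k] for k in LAYER_WEIGHTS)

def classify(index):
    """Map a hazard index onto the named risk zones."""
    label = ZONES[0][1]
    for lower, name in ZONES:
        if index >= lower:
            label = name
    return label

idx = hazard_index({"slope": 0.9, "relative_relief": 0.8,
                    "drainage_density": 0.6, "lithology": 0.7,
                    "land_use": 0.5})    # 0.735
zone = classify(idx)                     # "high"
```

    In a GIS workflow the same sum is evaluated per raster cell, and the classified index becomes the zonation map.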

  11. Physiologic basis for understanding quantitative dehydration assessment.

    PubMed

    Cheuvront, Samuel N; Kenefick, Robert W; Charkoudian, Nisha; Sawka, Michael N

    2013-03-01

    Dehydration (body water deficit) is a physiologic state that can have profound implications for human health and performance. Unfortunately, dehydration can be difficult to assess, and there is no single, universal gold standard for decision making. In this article, we review the physiologic basis for understanding quantitative dehydration assessment. We highlight how phenomenologic interpretations of dehydration depend critically on the type (dehydration compared with volume depletion) and magnitude (moderate compared with severe) of dehydration, which in turn influence the osmotic (plasma osmolality) and blood volume-dependent compensatory thresholds for antidiuretic and thirst responses. In particular, we review new findings regarding the biological variation in osmotic responses to dehydration and discuss how this variation can help provide a quantitative and clinically relevant link between the physiology and phenomenology of dehydration. Practical measures with empirical thresholds are provided as a starting point for improving the practice of dehydration assessment.

  12. Landslides in Nicaragua - Mapping, Inventory, Hazard Assessment, Vulnerability Reduction, and Forecasting Attempts

    NASA Astrophysics Data System (ADS)

    Dévoli, G.; Strauch, W.; Álvarez, A.; Muñoz, A.; Kjekstad, O.

    2009-04-01

    A successful landslide hazard and risk assessment requires awareness and a good understanding of the potential landslide problems within the geographic area involved. However, this requirement is not always met in developing countries, where the population, the scientific community, and the government may not be aware of the landslide threat. Landslide hazard assessment is often neglected or based on sparse and poorly documented technical information. In Nicaragua (Central America), the basic conditions for landslide hazard and risk assessment were first created after the catastrophic landslides triggered by Hurricane Mitch in October 1998. A single landslide at Casita volcano took the lives of thousands of people, forcing entire communities to be evacuated or relocated; furthermore, thousands of smaller landslides caused loss of fertile soils and pasture lands and serious damage to infrastructure. Since those events, public awareness has increased, and the country now relies on new local and national governmental laws and policies, a number of landslide investigations, and educational and training programs. Dozens of geologists have been trained to investigate landslide-prone areas. The Instituto Nicaragüense de Estudios Territoriales (INETER), a governmental geo-scientific institution, has assumed the responsibility of helping land-use planners and public officials reduce losses from geological hazards. It is committed to working cooperatively with national, international, and local agencies, universities and the private sector to provide scientific information and improve public safety through forecasting and warnings. However, in order to provide successful long-term landslide hazard assessment, the institutions must face challenges related to the scarcity and varied quality of available landslide information; the collection of and access to dispersed data and documents; and the organization of landslide information in a form that can be easy to

  13. Introduction: Hazard mapping

    USGS Publications Warehouse

    Baum, Rex L.; Miyagi, Toyohiko; Lee, Saro; Trofymchuk, Oleksandr M

    2014-01-01

    Twenty papers were accepted into the session on landslide hazard mapping for oral presentation. The papers presented susceptibility and hazard analysis based on approaches ranging from field-based assessments to statistically based models to assessments that combined hydromechanical and probabilistic components. Many of the studies have taken advantage of increasing availability of remotely sensed data and nearly all relied on Geographic Information Systems to organize and analyze spatial data. The studies used a range of methods for assessing performance and validating hazard and susceptibility models. A few of the studies presented in this session also included some element of landslide risk assessment. This collection of papers clearly demonstrates that a wide range of approaches can lead to useful assessments of landslide susceptibility and hazard.

  14. Damage assessment of bridge infrastructure subjected to flood-related hazards

    NASA Astrophysics Data System (ADS)

    Michalis, Panagiotis; Cahill, Paul; Bekić, Damir; Kerin, Igor; Pakrashi, Vikram; Lapthorne, John; Morais, João Gonçalo Martins Paulo; McKeogh, Eamon

    2017-04-01

    Transportation assets represent a critical component of society's infrastructure systems. Flood-related hazards are considered one of the main climate change impacts on highway and railway infrastructure, threatening the security and functionality of transportation systems. Of these hazards, flood-induced scour is a primary cause of bridge collapses worldwide and one of the most complex and challenging water-flow and erosion phenomena, leading to structural instability and ultimately catastrophic failure. Evaluating scour risk under severe flood events is particularly challenging, considering that the depth of foundations is very difficult to determine underwater. The continual inspection, assessment and maintenance of bridges and other hydraulic structures under extreme flood events requires a multidisciplinary approach, including knowledge and expertise in hydraulics, hydrology, structural engineering, geotechnics and infrastructure management. The large number of bridges under a single management unit also highlights the need for efficient management, information sharing and self-informing systems to provide reliable, cost-effective flood and scour risk management. The "Intelligent Bridge Assessment Maintenance and Management System" (BRIDGE SMS) is an EU/FP7-funded project which aims to couple state-of-the-art scientific expertise in multidisciplinary engineering sectors with industrial knowledge in infrastructure management. This involves the application of integrated low-cost structural health monitoring systems to provide real-time information towards the development of an intelligent decision support tool and a web-based platform to assess and efficiently manage bridge assets. This study documents the technological experience and presents results obtained from the application of sensing systems, focusing on the assessment of water-hazard damage at bridges over watercourses in Ireland.
The applied instrumentation is interfaced with an open

  15. Long-range hazard assessment of volcanic ash dispersal for a Plinian eruptive scenario at Popocatépetl volcano (Mexico): implications for civil aviation safety

    NASA Astrophysics Data System (ADS)

    Bonasia, Rosanna; Scaini, Chiara; Capra, Lucia; Nathenson, Manuel; Siebe, Claus; Arana-Salinas, Lilia; Folch, Arnau

    2014-01-01

    Popocatépetl is one of Mexico's most active volcanoes, threatening a densely populated area that includes Mexico City with more than 20 million inhabitants. The destructive potential of this volcano is demonstrated by its Late Pleistocene-Holocene eruptive activity, which has been characterized by recurrent Plinian eruptions of large magnitude, the last two of which destroyed human settlements in pre-Hispanic times. Popocatépetl's reawakening in 1994 produced a crisis that culminated in the evacuation of two villages on the northeastern flank of the volcano. Shortly after, a monitoring system and a civil protection contingency plan based on a hazard zone map were implemented. The current volcanic hazards map considers the potential occurrence of different volcanic phenomena, including pyroclastic density currents and lahars. However, no quantitative assessment of the tephra hazard, especially related to atmospheric dispersal, has been performed. The presence of airborne volcanic ash at low and jet-cruise atmospheric levels compromises the safety of aircraft operations and forces re-routing of aircraft to prevent encounters with volcanic ash clouds. Given the high number of important airports in the surroundings of Popocatépetl volcano, and considering the potential threat posed to civil aviation in Mexico and adjacent regions in case of a Plinian eruption, a hazard assessment for tephra dispersal is required. In this work, we present the first probabilistic tephra dispersal hazard assessment for Popocatépetl volcano. We compute probabilistic hazard maps for critical thresholds of airborne ash concentrations at different flight levels, corresponding to the situation defined in Europe during 2010, and still under discussion. Tephra dispersal modelling is performed using the FALL3D numerical model. Probabilistic hazard maps are built for a Plinian eruptive scenario defined on the basis of geological field data for the "Ochre Pumice" Plinian eruption (4965 14C yr BP

  16. Integrated hazard assessment of Cirenmaco glacial lake in Zhangzangbo valley, Central Himalayas

    NASA Astrophysics Data System (ADS)

    Wang, Weicai; Gao, Yang; Iribarren Anacona, Pablo; Lei, Yanbin; Xiang, Yang; Zhang, Guoqing; Li, Shenghai; Lu, Anxin

    2018-04-01

    Glacial lake outburst floods (GLOFs) have recently become one of the primary natural hazards in the Himalayas. There is therefore an urgent need to assess GLOF hazards in the region. Cirenmaco, a moraine-dammed lake located in the upstream portion of Zhangzangbo valley, Central Himalayas, has received public attention after its damaging 1981 outburst flood. Here, by combining remote sensing methods, bathymetric survey and 2D hydraulic modeling, we assessed the hazard posed by Cirenmaco in its current state. Inter-annual variation of the Cirenmaco lake area indicates rapid lake expansion, from 0.10 ± 0.08 km2 in 1988 to 0.39 ± 0.04 km2 in 2013. The bathymetric survey shows that the maximum water depth of the lake in 2012 was 115 ± 2 m, and the lake volume was calculated to be 1.8 × 107 m3. Field geomorphic analysis shows that Cirenmaco glacial lake is prone to GLOFs, as mass movements and ice and snow avalanches can impact the lake and the melting of dead ice in the moraine can lower the dam level. The HEC-RAS 2D model was then used to simulate moraine dam failure of the Cirenmaco and assess GLOF impacts downstream. Reconstruction of the 1981 Cirenmaco GLOF shows that HEC-RAS can produce reasonable flood extents and water depths, thus demonstrating its ability to effectively model complex GLOFs. The GLOF modeling results presented here can be used as a basis for the implementation of disaster prevention and mitigation measures. As a case study, this work shows how different methods can be integrated into GLOF hazard assessment.

  17. Application of a new methodology for coastal multi-hazard-assessment & management on the state of Karnataka, India.

    PubMed

    Rosendahl Appelquist, Lars; Balstrøm, Thomas

    2015-04-01

    This paper presents the application of a new methodology for coastal multi-hazard assessment & management under a changing global climate to the state of Karnataka, India. The recently published methodology, termed the Coastal Hazard Wheel (CHW), is designed for local, regional and national hazard screening in areas with limited data availability, and covers the hazards of ecosystem disruption, gradual inundation, salt water intrusion, erosion and flooding. The application makes use of published geophysical data and remote sensing information, and showcases how the CHW framework can be applied at a scale relevant for regional planning purposes. It uses a GIS approach to develop regional and sub-regional hazard maps as well as to produce relevant hazard risk data, and includes a discussion of uncertainties, limitations and management perspectives. The hazard assessment shows that 61 percent of Karnataka's coastline has a high or very high inherent hazard of erosion, making erosion the most prevalent coastal hazard. The hazards of flooding and salt water intrusion are also relatively widespread, as 39 percent of Karnataka's coastline has a high or very high inherent hazard for both of these hazard types. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.

  18. Errors in Seismic Hazard Assessment are Creating Huge Human Losses

    NASA Astrophysics Data System (ADS)

    Bela, J.

    2015-12-01

    The current practice of representing earthquake hazards to the public based upon their perceived likelihood or probability of occurrence is proven now by the global record of actual earthquakes to be not only erroneous and unreliable, but also too deadly! Earthquake occurrence is sporadic, and therefore assumptions of earthquake frequency and return period are not only misleading but categorically false. More than 700,000 people have now lost their lives (2000-2011), wherein 11 of the world's deadliest earthquakes have occurred in locations where probability-based seismic hazard assessments had predicted only low seismic hazard. Unless seismic hazard assessment and the setting of minimum earthquake design safety standards for buildings and bridges are based on a more realistic deterministic recognition of "what can happen" rather than on what mathematical models suggest is "most likely to happen", such huge human losses can only be expected to continue! The actual earthquake events that did occur were at or near the maximum potential-size event that either had already occurred in the past or was geologically known to be possible. Haiti's M7 earthquake of 2010 (with > 222,000 fatalities) meant the dead could not even be buried with dignity. Japan's catastrophic Tohoku earthquake of 2011, a M9 megathrust earthquake, unleashed a tsunami that not only obliterated coastal communities along the northern Japanese coast, but also claimed > 20,000 lives. This tsunami flooded nuclear reactors at Fukushima, causing 4 explosions and 3 reactor meltdowns. But while this history of huge human losses due to erroneous and misleading seismic hazard estimates, despite its wrenching pain, cannot be unlived, if faced with courage and a more realistic deterministic estimate of "what is possible", it need not be lived again. An objective testing of the results of global probability-based seismic hazard maps against real occurrences has never been done by the

  19. The application of the geography census data in seismic hazard assessment

    NASA Astrophysics Data System (ADS)

    Yuan, Shen; Ying, Zhang

    2017-04-01

    The basic data in the Sichuan Province earthquake emergency database suffer from limited timeliness, so there is a certain gap between post-earthquake disaster assessment results and the actual damage. In 2015, Sichuan completed its first provincial geography census, covering topography, transport networks, vegetation coverage, water areas, desert and bare ground, residential areas and facilities, geographical units and geological hazards, as well as town planning, construction and ecological restoration in the Lushan earthquake-stricken area. On this basis, combined with existing basic geographic information data and high-resolution imagery, and supplemented by remote sensing image interpretation and geological survey, we carried out statistical analysis and information extraction of the distribution and changes of hazard-affected elements, such as surface coverage, roads and infrastructure, in Lushan County before 2013 and after 2015. At the same time, we achieved the transformation and updating from geography census data to earthquake emergency basic data by studying their data types, structures and relationships. Finally, based on multi-source disaster information, including hazard-affected-body change data and the coseismic displacement field of the Lushan M7.0 earthquake from the CORS network, intensity control points were obtained through information fusion. The seismic influence field was then corrected and the earthquake disaster re-assessed through the Sichuan earthquake relief headquarters technology platform. Comparison of the new assessment result, the original assessment result and the actual earthquake disaster loss shows that the revised evaluation is closer to the actual loss. In the future, geography census data can be routinely transformed into updated earthquake emergency basic data, ensuring the timeliness of the earthquake emergency database while improving the

  20. The influence of hazard models on GIS-based regional risk assessments and mitigation policies

    USGS Publications Warehouse

    Bernknopf, R.L.; Rabinovici, S.J.M.; Wood, N.J.; Dinitz, L.B.

    2006-01-01

    Geographic information systems (GIS) are important tools for understanding and communicating the spatial distribution of risks associated with natural hazards in regional economies. We present a GIS-based decision support system (DSS) for assessing community vulnerability to natural hazards and evaluating potential mitigation policy outcomes. The Land Use Portfolio Modeler (LUPM) integrates earth science and socioeconomic information to predict the economic impacts of loss-reduction strategies. However, the potential use of such systems in decision making may be limited when multiple but conflicting interpretations of the hazard are available. To explore this problem, we conduct a policy comparison using the LUPM to test the sensitivity of three available assessments of earthquake-induced lateral-spread ground failure susceptibility in a coastal California community. We find that the uncertainty regarding the interpretation of the science inputs can influence the development and implementation of natural hazard management policies. Copyright © 2006 Inderscience Enterprises Ltd.

  1. Assessing the reporting of categorised quantitative variables in observational epidemiological studies.

    PubMed

    Mabikwa, Onkabetse V; Greenwood, Darren C; Baxter, Paul D; Fleming, Sarah J

    2017-03-14

    One aspect to consider when reporting the results of observational studies in epidemiology is how quantitative risk factors are analysed. The STROBE (Strengthening the Reporting of Observational Studies in Epidemiology) guidelines recommend that researchers describe how they handle quantitative variables when analysing data. For categorised quantitative variables, authors are required to provide the reasons and justifications informing their practice. We investigated and assessed the practices and reporting of categorised quantitative variables in epidemiology. The assessment was based on five medical journals that publish epidemiological research. Observational studies published between April and June 2015 and investigating the relationships between quantitative exposures (or risk factors) and outcomes were considered for assessment. A standard form was used to collect the data, and the reporting patterns amongst eligible studies were quantified and described. Out of 61 articles assessed for eligibility, 23 observational studies were included in the assessment. Categorisation of quantitative exposures occurred in 61% of these studies, and the reasons informing the practice were rarely provided. Only one article explained the choice of categorisation in the analysis. Transformation of quantitative exposures into four or five groups was common and dominant amongst studies using equally spaced categories. Dichotomisation was not popular; the practice featured in one article. Overall, the majority (86%) of the studies preferred ordered or arbitrary group categories. Other criteria used to decide category boundaries were based on established guidelines, such as consensus statements and WHO standards. Categorisation of continuous variables remains a dominant practice in epidemiological studies. The reasons informing the practice of categorisation within published work are limited and remain unknown in most articles.
The existing STROBE guidelines could provide stronger
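    As a concrete illustration of the dominant practice the review identifies, transforming a quantitative exposure into equally spaced categories can be done as follows (a minimal sketch, not code from the reviewed studies):

```python
def equal_width_categories(values, k=4):
    """Split a quantitative exposure into k equally spaced categories
    (the dominant practice reported in the review); returns 0-based
    category indices."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / k or 1.0   # guard against a constant variable
    cats = []
    for v in values:
        idx = min(int((v - lo) / width), k - 1)  # clamp the maximum into the top bin
        cats.append(idx)
    return cats

cats = equal_width_categories([1, 2, 5, 7, 9], k=4)  # [0, 0, 2, 3, 3]
```

    Quantile-based or guideline-based (e.g. WHO cut-point) boundaries, the other criteria the review mentions, would replace the equal-width `width` computation with the chosen cut points.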

  2. Seismic Hazard Assessment for a Characteristic Earthquake Scenario: Probabilistic-Deterministic Method

    NASA Astrophysics Data System (ADS)

    Mouloud, Hamidatou

    2016-04-01

    The objective of this paper is to analyze the seismic activity and the statistical treatment of the seismicity catalogue of the Constantine region between 1357 and 2014, comprising 7007 seismic events. Our research contributes to improved seismic risk management by evaluating the seismic hazard in north-east Algeria. In the present study, earthquake hazard maps for the Constantine region are calculated. Probabilistic seismic hazard analysis (PSHA) is classically performed through the Cornell approach, using a uniform earthquake distribution over the source area and a given magnitude range. This study aims at extending the PSHA approach to the case of a characteristic earthquake scenario associated with an active fault. The approach integrates PSHA with a high-frequency deterministic technique for the prediction of peak and spectral ground-motion parameters in a characteristic earthquake. The method is based on the site-dependent evaluation of the probability of exceedance for the chosen strong-motion parameter. We propose five seismotectonic zones. Five steps are necessary: (i) identification of potential sources of future earthquakes, (ii) assessment of their geological, geophysical and geometric characteristics, (iii) identification of the attenuation pattern of seismic motion, (iv) calculation of the hazard at a site, and finally (v) hazard mapping for a region. In this study, the procedure for earthquake hazard evaluation developed by Kijko and Sellevoll (1992) is used to estimate seismic hazard parameters in the northern part of Algeria.
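    The probabilistic core of the Cornell-style PSHA step can be illustrated with a minimal sketch combining a Gutenberg-Richter recurrence rate with a Poisson occurrence model; the a and b values below are illustrative placeholders, not parameters estimated from the Constantine catalogue:

```python
import math

def gr_rate(m, a=4.0, b=1.0):
    """Annual rate of events with magnitude >= m from the
    Gutenberg-Richter relation log10 N = a - b*m (illustrative a, b)."""
    return 10.0 ** (a - b * m)

def poisson_exceedance(m, years=50, a=4.0, b=1.0):
    """Probability of at least one event of magnitude >= m during the
    exposure time, assuming Poissonian (memoryless) occurrence."""
    lam = gr_rate(m, a, b)
    return 1.0 - math.exp(-lam * years)

p = poisson_exceedance(6.5, years=50)   # ~0.146 with these illustrative a, b
```

    A full PSHA additionally integrates over source zones and a ground-motion attenuation model to express the hazard in terms of the strong-motion parameter rather than magnitude.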

  3. Contribution of physical modelling to climate-driven landslide hazard mapping: an alpine test site

    NASA Astrophysics Data System (ADS)

    Vandromme, R.; Desramaut, N.; Baills, A.; Hohmann, A.; Grandjean, G.; Sedan, O.; Mallet, J. P.

    2012-04-01

    The aim of this work is to develop a methodology for integrating climate change scenarios, and especially their precipitation component, into quantitative hazard assessment. The effects of climate change will differ depending on both the location of the site and the type of landslide considered; indeed, mass movements can be triggered by different factors. This paper describes a methodology to address this issue and shows an application to an alpine test site. Mechanical approaches are one solution for quantitative landslide susceptibility and hazard modeling. However, as the quantity and quality of data are generally very heterogeneous at a regional scale, it is necessary to take the uncertainty into account in the analysis. In this perspective, a new hazard modeling method was developed and integrated into a program named ALICE. The program integrates mechanical stability analysis within GIS software while taking data uncertainty into account. The method proposes a quantitative classification of landslide hazard and offers a useful tool to save time and gain efficiency in hazard mapping. However, an expert-based approach is still necessary to finalize the maps: it is the only way to take into account some influential factors in slope stability, such as the heterogeneity of geological formations or the effects of human interventions. To go further, the alpine test site (Barcelonnette area, France) is being used to integrate climate change scenarios, and especially their precipitation component, into the ALICE program with the help of a hydrological model (GARDENIA) and the regional climate model REMO (Jacob, 2001). From a DEM, a land-cover map, geology, geotechnical data and so forth, the program classifies hazard zones depending on geotechnical properties and on different hydrological contexts varying in time.
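The abstract does not specify ALICE's stability formulation; a common choice for this kind of regional mechanical analysis is the infinite-slope model, sketched here with illustrative soil parameters:

```python
import math

def factor_of_safety(c_eff, phi_deg, gamma, depth, slope_deg, m,
                     gamma_w=9.81):
    """Infinite-slope factor of safety (FS > 1 means stable).

    c_eff     effective cohesion [kPa]
    phi_deg   effective friction angle [deg]
    gamma     soil unit weight [kN/m^3]
    depth     failure-surface depth [m]
    slope_deg slope angle [deg]
    m         saturated fraction of the soil column (0..1)
    """
    beta = math.radians(slope_deg)
    phi = math.radians(phi_deg)
    normal = gamma * depth * math.cos(beta) ** 2
    pore = gamma_w * m * depth * math.cos(beta) ** 2
    resisting = c_eff + (normal - pore) * math.tan(phi)
    driving = gamma * depth * math.sin(beta) * math.cos(beta)
    return resisting / driving

# Wetting the same slope (m: 0.2 -> 1.0), e.g. under a wetter
# precipitation scenario, lowers the factor of safety.
dry = factor_of_safety(5.0, 30.0, 19.0, 2.0, 30.0, m=0.2)
wet = factor_of_safety(5.0, 30.0, 19.0, 2.0, 30.0, m=1.0)
```

Varying `m` over the hydrological contexts supplied by a rainfall model is one way such a program can propagate precipitation scenarios into hazard classes.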
This communication, realized within the framework of Safeland project, is supported by the European Commission under the 7th Framework Programme for Research and Technological

  4. Assessment of Rip-Current Hazards Using Alongshore Topographic Anisotropy at Bondi Beach, Australia

    NASA Astrophysics Data System (ADS)

    Hartman, K.; Trimble, S. M.; Bishop, M. P.; Houser, C.

    2016-12-01

    Rip currents are relatively high-velocity flows of water away from the beach that are common in coastal environments. As beach morphology adapts to sediment fluxes and wave climate, it is essential to be able to assess rip-current hazard conditions and to characterize the scale-dependent bathymetric morphology that governs the extent and magnitude of a rip current. Consequently, our primary objective is to assess the alongshore distribution of topographic anisotropy in order to identify rip-current hazard locations. Specifically, we utilized multi-band satellite imagery to generate a bathymetric digital elevation model (DEM) for Bondi Beach, Australia, and collected field data to support our analysis. Scale-dependent spatial analysis of the DEM was conducted to assess the directional dependence of topographic relief, the magnitude of topographic anisotropy, and the degree of anisotropic symmetry. We displayed anisotropy parameters as images and false-color composites to visualize morphological conditions associated with rip channels. Our preliminary results indicate that rip channels generally have a higher anisotropy index and a more orthogonal orientation than dissipative or reflective sections of the beach. Scale-dependent variations in anisotropy can be used to assess the spatial extent of rip currents. Furthermore, well-defined rip channels exhibit positive symmetry, while variations in the distribution of symmetry reflect alongshore sediment-flux variations. These results clearly reveal that a well-developed rip channel can be identified and assessed using topographic anisotropy, as scale-dependent anisotropy patterns are unique compared to the surrounding bathymetry and terrain. In this way, it is possible to evaluate the alongshore distribution of rip currents.
Alongshore topographic anisotropy data will be extremely important as input into hazard assessment studies and the development of hazard decision support

  5. Assessment of Nearshore Hazard due to Tsunami-Induced Currents (Invited)

    NASA Astrophysics Data System (ADS)

    Lynett, P. J.; Borrero, J. C.; Son, S.; Wilson, R. I.; Miller, K.

    2013-12-01

    The California Tsunami Program, coordinated by CalOES and CGS in cooperation with NOAA and FEMA, has begun implementing a plan to increase awareness of tsunami-generated hazards in the maritime community (both ships and harbor infrastructure) through the development of in-harbor hazard maps, offshore safety zones for boater evacuation, and associated guidance for harbors and marinas before, during and following tsunamis. The hope is that the maritime guidance and the associated education and outreach program will help save lives and reduce the exposure of boats and harbor infrastructure to damage. An important step in this process is to understand the causative mechanism for damage in ports and harbors, and then ensure that the models used to generate hazard maps are able to accurately simulate these processes. Findings will be used to develop maps, guidance documents, and consistent policy recommendations for emergency managers and port authorities, and to provide information critical to the real-time decisions required when responding to tsunami alert notifications. The goals of the study are to (1) evaluate the effectiveness and sensitivity of existing numerical models for assessing maritime tsunami hazards, (2) find a relationship between current speeds and expected damage levels, (3) evaluate California ports and harbors in terms of tsunami-induced hazards by identifying regions that are prone to higher current speeds and damage, and identify regions of relatively lower impact that may be used for evacuation of maritime assets, and (4) determine 'safe depths' for evacuation of vessels from ports and harbors during a tsunami event. This presentation will focus on the results from five California ports and harbors, and will include feedback we have received from initial discussions with local harbor masters and port authorities. This work in California will form the basis for tsunami hazard reduction for all U.S. maritime communities through the National Tsunami Hazard

  6. Towards an integrated approach to natural hazards risk assessment using GIS: with reference to bushfires.

    PubMed

    Chen, Keping; Blong, Russell; Jacobson, Carol

    2003-04-01

    This paper develops a GIS-based integrated approach to risk assessment in natural hazards, with reference to bushfires. The challenges for undertaking this approach have three components: data integration, risk assessment tasks, and risk decision-making. First, data integration in GIS is a fundamental step for subsequent risk assessment tasks and risk decision-making. A series of spatial data integration issues within GIS such as geographical scales and data models are addressed. Particularly, the integration of both physical environmental data and socioeconomic data is examined with an example linking remotely sensed data and areal census data in GIS. Second, specific risk assessment tasks, such as hazard behavior simulation and vulnerability assessment, should be undertaken in order to understand complex hazard risks and provide support for risk decision-making. For risk assessment tasks involving heterogeneous data sources, the selection of spatial analysis units is important. Third, risk decision-making concerns spatial preferences and/or patterns, and a multicriteria evaluation (MCE)-GIS typology for risk decision-making is presented that incorporates three perspectives: spatial data types, data models, and methods development. Both conventional MCE methods and artificial intelligence-based methods with GIS are identified to facilitate spatial risk decision-making in a rational and interpretable way. Finally, the paper concludes that the integrated approach can be used to assist risk management of natural hazards, in theory and in practice.
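As a concrete illustration of a conventional MCE method, a weighted linear combination aggregates normalized criterion scores into a single decision score; the criteria and weights below are hypothetical, not taken from the paper:

```python
def weighted_linear_combination(scores, weights):
    """Conventional MCE: combine normalized criterion scores (0..1)
    into a single risk/suitability score using criterion weights."""
    if abs(sum(weights) - 1.0) > 1e-9:
        raise ValueError("weights must sum to 1")
    return sum(s * w for s, w in zip(scores, weights))

# Hypothetical bushfire-risk criteria for one map cell, already
# rescaled to 0..1: fuel load, slope, proximity to dwellings.
risk = weighted_linear_combination([0.8, 0.5, 0.9], [0.5, 0.2, 0.3])
```

In a GIS this function would be applied cell by cell across the integrated data layers; the artificial-intelligence-based methods the paper mentions replace the fixed weighted sum with learned combination rules.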

  7. Using Remotely Sensed Information for Near Real-Time Landslide Hazard Assessment

    NASA Technical Reports Server (NTRS)

    Kirschbaum, Dalia; Adler, Robert; Peters-Lidard, Christa

    2013-01-01

    The increasing availability of remotely sensed precipitation and surface products provides a unique opportunity to explore how landslide susceptibility and hazard assessment may be approached at larger spatial scales with higher resolution remote sensing products. A prototype global landslide hazard assessment framework has been developed to evaluate how landslide susceptibility and satellite-derived precipitation estimates can be used to identify potential landslide conditions in near-real time. Preliminary analysis of this algorithm suggests that forecasting errors are geographically variable due to the resolution and accuracy of the current susceptibility map and the application of satellite-based rainfall estimates. This research is currently working to improve the algorithm through considering higher spatial and temporal resolution landslide susceptibility information and testing different rainfall triggering thresholds, antecedent rainfall scenarios, and various surface products at regional and global scales.

  8. Globe of Natural Hazard - A new assessment tool for risk managers

    NASA Astrophysics Data System (ADS)

    Siebert, A. C.

    2009-04-01

    A large number of tropical cyclones and the earthquake in Sichuan made 2008 one of the most devastating years on record. Throughout the world, more than 220,000 people died as a result of natural catastrophes that year. Overall losses totaled some US$ 200bn (2007: US$ 82bn). Insured losses in 2008 rose to US$ 45bn, about 50% higher than in the previous year. Mainly driven by high losses from weather-related natural catastrophes, 2008 was - on the basis of figures adjusted for inflation - even the third most expensive year on record for the insurance industry, exceeded only by the hurricane year of 2005 and by 1995, the year of the Kobe earthquake. Munich Re, a worldwide operating reinsurance company, is a world leader in investigating risks from natural hazards of all kinds. 2008 again showed the insurance industry how important it is to analyse risks like natural hazards and climate change in all their facets and to manage the insurance business accordingly. An excellent example of the wealth of knowledge Munich Re has developed in natural hazard assessment is the DVD "Globe of Natural Hazards". It combines the geoscientific data and findings Munich Re has accumulated over a period of 35 years. First devised as a wall map in 1978, the product has established itself as a standard work for the identification, exposure assessment and risk management of natural hazards. Over 80,000 copies of the CD-ROM version of 2000 have been provided to clients - a mark achieved by no other service product in Munich Re's history. Since the beginning of 2009, the fully updated fourth-generation version has been available. The bilingual DVD (German and English) shows natural hazards and climate effects at a glance: the global maps are presented on a 3D globe, underlaid with satellite images. The hazard complexes of hail, tornado and winter storms have been completely revised, and flood has been incorporated as a new hazard.
Users can intuitively home in on and enlarge any location on

  9. Systematic analysis of natural hazards along infrastructure networks using a GIS-tool for risk assessment

    NASA Astrophysics Data System (ADS)

    Baruffini, Mirko

    2010-05-01

    Due to the topographical conditions in Switzerland, highways and railway lines are frequently exposed to natural hazards such as rockfalls, debris flows, landslides, avalanches and others. With the rising incidence of these natural hazards, protection measures have become an important political issue. However, they are costly, and maximal protection is most probably not economically feasible. Furthermore, risks are distributed in space and time. Consequently, important decision problems arise for public-sector decision makers. This calls for a high level of surveillance and preservation along the transalpine lines. Efficient protection alternatives can consequently be obtained by considering the concept of integral risk management. Risk analysis, as the central part of risk management, has gradually become a generally accepted approach for the assessment of current and future scenarios (Loat & Zimmermann 2004). The procedure aims at risk reduction, which can be reached by conventional mitigation on the one hand and the implementation of land-use planning on the other: a combination of active and passive mitigation measures is applied to prevent damage to buildings, people and infrastructure. With a Geographical Information System coupled to a tool developed for risk analysis, it is possible to survey the data in time and space, obtaining an important system for managing natural risks. As a framework, we adopt the Swiss system for risk analysis of gravitational natural hazards (BUWAL 1999). It offers a complete framework for the analysis and assessment of risks due to natural hazards, ranging from hazard assessment for gravitational natural hazards, such as landslides, collapses, rockfalls, floods, debris flows and avalanches, to vulnerability assessment and risk analysis, and the integration into land-use planning at the cantonal and municipal level. The scheme is limited to the direct consequences of natural hazards. Thus, we develop a
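In risk analyses of this kind (e.g. the Swiss BUWAL framework), the annual risk for a single object is commonly computed as the product of scenario probability, exposure, vulnerability and value; the numbers below are purely illustrative:

```python
def object_risk(p_hazard, p_exposure, vulnerability, value):
    """Annual expected loss for one object under one hazard scenario:
    scenario probability x probability the object is present/exposed
    x degree of loss (0..1) x monetary value of the object."""
    return p_hazard * p_exposure * vulnerability * value

# Hypothetical rockfall scenario on a road segment: a 1-in-100-year
# event, permanently exposed asset, 30% expected damage, CHF 2m value.
r = object_risk(0.01, 1.0, 0.3, 2_000_000)  # annual expected loss, CHF
```

Summing such contributions over all objects and scenarios along a line yields the collective risk used to rank protection alternatives.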

  10. Landslide and flood hazard assessment in urban areas of Levoča region (Eastern Slovakia)

    NASA Astrophysics Data System (ADS)

    Magulova, Barbora; Caporali, Enrica; Bednarik, Martin

    2010-05-01

    The case study presents the use of statistical methods and analysis tools, implemented in a Geographic Information Systems (GIS) environment, for the hazard assessment of "urbanization units". As a case study, the Levoča region (Slovakia) is selected. The region, with a total area of about 351 km2, is widely affected by landslides and floods. For small urbanization areas the problem is nowadays particularly significant from the socio-economic point of view, and it is considered an increasing one, mainly because of climate change and more frequent extreme rainfall events. The geo-hazards are evaluated using a multivariate analysis. The landslide hazard assessment is based on the comparison and subsequent statistical elaboration of the territorial dependence among different input factors influencing the instability of slopes. Particularly, five factors influencing slope stability are evaluated, i.e. lithology, slope aspect, slope angle, hypsographic level and present land use. As a result, a new landslide susceptibility map is compiled and different zones of stable, dormant and non-stable areas are defined. For the flood hazard map, a detailed digital elevation model is created, and a composite index of flood hazard is derived from topography, land cover and pedology-related data. To estimate flood discharge, time series of stream flow and precipitation measurements are used. The assessment results are prognostic maps of landslide hazard and flood hazard, which present an optimal basis for urbanization planning.

  11. Rockfall Hazard Process Assessment: [Project Summary]

    DOT National Transportation Integrated Search

    2017-10-01

    The Montana Department of Transportation (MDT) implemented its Rockfall Hazard Rating System (RHRS) between 2003 and 2005, obtaining information on the state's rock slopes and their associated hazards. The RHRS data facilitated decision-making in an ...

  12. Multi-hazard Assessment and Scenario Toolbox (MhAST): A Framework for Analyzing Compounding Effects of Multiple Hazards

    NASA Astrophysics Data System (ADS)

    Sadegh, M.; Moftakhari, H.; AghaKouchak, A.

    2017-12-01

    Many natural hazards are driven by multiple forcing variables, and concurrent/consecutive extreme events significantly increase the risk of infrastructure/system failure. It is common practice to use univariate analysis based upon a perceived ruling driver to estimate design quantiles and/or return periods of extreme events. A multivariate analysis, however, permits modeling the simultaneous occurrence of multiple forcing variables. In this presentation, we introduce the Multi-hazard Assessment and Scenario Toolbox (MhAST), which comprehensively analyzes marginal and joint probability distributions of natural hazards. MhAST also offers a wide range of return-period and design-level scenarios and their likelihoods. The contribution of this study is four-fold: 1. comprehensive analysis of the marginal and joint probability of multiple drivers through 17 continuous distributions and 26 copulas; 2. multiple scenario analysis of concurrent extremes based upon the most likely joint occurrence, one ruling variable, and weighted random sampling of joint occurrences with similar exceedance probabilities; 3. weighted-average scenario analysis based on an expected event; and 4. uncertainty analysis of the most likely joint occurrence scenario using a Bayesian framework.
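The abstract does not detail MhAST's copula calculations; the idea of a joint ("AND") return period can be sketched with a Gumbel copula, one of the standard families such a toolbox would include (the quantiles and dependence parameter below are illustrative):

```python
import math

def gumbel_copula(u, v, theta):
    """Gumbel copula C(u, v); theta >= 1, with theta = 1 giving
    independence (C(u, v) = u * v)."""
    return math.exp(-(((-math.log(u)) ** theta
                       + (-math.log(v)) ** theta) ** (1.0 / theta)))

def and_return_period(u, v, theta, mu=1.0):
    """Return period (in units of mu, e.g. years) of BOTH drivers
    simultaneously exceeding their u- and v-quantiles."""
    p_joint = 1.0 - u - v + gumbel_copula(u, v, theta)
    return mu / p_joint

# Two positively dependent drivers (theta = 2), both at their 99th
# percentile: joint exceedance is far more frequent than under
# independence, which would give a 10,000-year return period.
t_dep = and_return_period(0.99, 0.99, theta=2.0)
t_ind = and_return_period(0.99, 0.99, theta=1.0)
```

This is exactly the failure mode of the univariate habit the abstract criticizes: treating dependent drivers as independent can overstate the joint return period by orders of magnitude.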

  13. Hazard Response Modeling Uncertainty (A Quantitative Method)

    DTIC Science & Technology

    1988-10-01

    Trials involving instantaneous releases of 2000 ... were conducted by the National Maritime Institute under contract to the United Kingdom Health and Safety Executive, with the sponsorship of numerous international ...

  14. Assessing the earthquake hazards in urban areas

    USGS Publications Warehouse

    Hays, W.W.; Gori, P.L.; Kockelman, W.J.

    1988-01-01

    Major urban areas in widely scattered geographic locations across the United States are at varying degrees of risk from earthquakes. The locations of these urban areas include Charleston, South Carolina; Memphis, Tennessee; St. Louis, Missouri; Salt Lake City, Utah; Seattle-Tacoma, Washington; Portland, Oregon; and Anchorage, Alaska; even Boston, Massachusetts, and Buffalo, New York, have a history of large earthquakes. Cooperative research during the past decade has focused on assessing the nature and degree of the risk or seismic hazard in the broad geographic regions around each urban area. The strategy since the 1970's has been to bring together local, State, and Federal resources to solve the problem of assessing seismic risk. Successful cooperative programs have been launched in the San Francisco Bay and Los Angeles regions in California and the Wasatch Front region in Utah.

  15. A comprehensive Probabilistic Tsunami Hazard Assessment for the city of Naples (Italy)

    NASA Astrophysics Data System (ADS)

    Anita, G.; Tonini, R.; Selva, J.; Sandri, L.; Pierdominici, S.; Faenza, L.; Zaccarelli, L.

    2012-12-01

    A comprehensive Probabilistic Tsunami Hazard Assessment (PTHA) should consider different tsunamigenic sources (seismic events, slide failures, volcanic eruptions) to calculate the hazard at given target sites. This implies a multi-disciplinary analysis of all natural tsunamigenic sources, in a multi-hazard/risk framework that also considers the effects of interaction/cascade events. Our approach shows the ongoing effort to develop a comprehensive PTHA for the city of Naples (Italy), including all types of sources located in the Tyrrhenian Sea, as developed within the Italian project ByMuR (Bayesian Multi-Risk Assessment). The project combines a multi-hazard/risk approach to treat the interactions among different hazards with a Bayesian approach to handle the uncertainties. The natural potential tsunamigenic sources analyzed are: 1) submarine seismic sources located on active faults in the Tyrrhenian Sea and close to the southern Italian shoreline (we also consider the effects of onshore seismic sources and the associated active faults, for which we provide rupture properties); 2) mass failures and collapses around the target area (spatially identified on the basis of their propensity to failure); and 3) volcanic sources, mainly pyroclastic flows and collapses from the volcanoes of the Neapolitan area (Vesuvius, Campi Flegrei and Ischia). All these natural sources are here preliminarily analyzed and combined in order to provide a complete picture of a PTHA for the city of Naples. In addition, the treatment of interaction/cascade effects is formally discussed in the case of significant temporary variations in the short-term PTHA due to an earthquake.

  16. Seismic Hazard and Risk Assessments for Beijing-Tianjin-Tangshan, China, Area

    USGS Publications Warehouse

    Xie, F.; Wang, Z.; Liu, J.

    2011-01-01

    Seismic hazard and risk in the Beijing-Tianjin-Tangshan, China, area were estimated from 500 years of intensity observations. First, we digitized the intensity observations (maps) using ArcGIS with a cell size of 0.1° × 0.1°. Second, we performed a statistical analysis on the digitized intensity data, determined an average b value (0.39), and derived the intensity-frequency relationship (hazard curve) for each cell. Finally, based on a Poisson model for earthquake occurrence, we calculated seismic risk in terms of the probability of I ≥ 7, 8, or 9 in 50 years. We also calculated the intensities corresponding to a 10 percent probability of exceedance in 50 years. The advantages of assessing seismic hazard and risk from intensity records are that (1) fewer assumptions (i.e., about earthquake sources and ground-motion attenuation) are made, and (2) site effects are included. Our study shows that the area has high seismic hazard and risk. Our study also suggests that the current design peak ground acceleration or intensity for the area may not be adequate. © 2010 Birkhäuser / Springer Basel AG.
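The cell-by-cell risk calculation described above can be sketched as follows; the log-linear intensity-frequency relation and the Poisson step mirror the study's setup, but the cell parameter `a` is a made-up example (only the average b value of 0.39 is reported):

```python
import math

def annual_rate(a, b, intensity):
    """Intensity-frequency (hazard-curve) relation log10(N) = a - b*I:
    annual rate of shaking at or above `intensity` in one cell."""
    return 10.0 ** (a - b * intensity)

def prob_in_window(a, b, intensity, years=50.0):
    """Poisson probability of at least one exceedance in `years`."""
    return 1.0 - math.exp(-annual_rate(a, b, intensity) * years)

# Hypothetical cell: probability of intensity >= 7 within 50 years,
# using the study's average b value of 0.39.
p7 = prob_in_window(a=0.1, b=0.39, intensity=7.0)
```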

  17. A Comprehensive Probabilistic Tsunami Hazard Assessment: Multiple Sources and Short-Term Interactions

    NASA Astrophysics Data System (ADS)

    Anita, G.; Selva, J.; Laura, S.

    2011-12-01

    We develop a comprehensive and total probabilistic tsunami hazard assessment (TotPTHA), in which many different possible source types concur to define the total tsunami hazard at given target sites. In a multi-hazard and multi-risk perspective, such an innovative approach makes it possible, in principle, to consider all possible tsunamigenic sources, from seismic events to slides, asteroid impacts, volcanic eruptions, etc. In this respect, we also formally introduce and discuss the treatment of interaction/cascade effects in the TotPTHA analysis. We demonstrate how external triggering events may induce significant temporary variations in the tsunami hazard. Because of this, such effects should always be considered, at least in short-term applications, to obtain unbiased analyses. Finally, we prove the feasibility of the TotPTHA and of the treatment of interaction/cascade effects by applying this methodology to an ideal region with realistic characteristics (Neverland).

  18. Global Assessment of Volcanic Debris Hazards from Space

    NASA Technical Reports Server (NTRS)

    Watters, Robert J.

    2003-01-01

    Hazard (slope stability) assessments for different sectors of volcano edifices were successfully obtained for volcanoes in North and South America. The assessment used Hyperion images to locate portions of each volcano that were hydrothermally altered to clay-rich rocks, with zones that were also rich in alunite and other minerals. The identified altered rock zones were field checked and sampled, and the rock strength of these zones was calculated from the field and laboratory measurements. Volcano modeling utilizing the distinct element method and the limit equilibrium technique, with the calculated strength data, was used to assess the stability and deformation of the edifice. Modeling results give indications of possible failure volumes, velocities and directions. The models show the crucial role hydrothermally weakened rock plays in reducing the strength of the volcano edifice, and demonstrate that such weak rock can be rapidly identified through remote sensing techniques. Volcanoes were assessed in the Cascade Range (USA), Mexico, and Chile (ongoing).

  19. OpenQuake, a platform for collaborative seismic hazard and risk assessment

    NASA Astrophysics Data System (ADS)

    Henshaw, Paul; Burton, Christopher; Butler, Lars; Crowley, Helen; Danciu, Laurentiu; Nastasi, Matteo; Monelli, Damiano; Pagani, Marco; Panzeri, Luigi; Simionato, Michele; Silva, Vitor; Vallarelli, Giuseppe; Weatherill, Graeme; Wyss, Ben

    2013-04-01

    Sharing of data and risk information, best practices, and approaches across the globe is key to assessing risk more effectively. Through global projects, open-source IT development and collaborations with more than 10 regions, leading experts are collaboratively developing unique global datasets, best practice, tools and models for global seismic hazard and risk assessment, within the context of the Global Earthquake Model (GEM). Guided by the needs and experiences of governments, companies and international organisations, all contributions are being integrated into OpenQuake: a web-based platform that - together with other resources - will become accessible in 2014. With OpenQuake, stakeholders worldwide will be able to calculate, visualize and investigate earthquake hazard and risk, capture new data and share findings for joint learning. The platform is envisaged as a collaborative hub for earthquake risk assessment, used at global and local scales, around which an active network of users has formed. OpenQuake will comprise both online and offline tools, many of which can also be used independently. One of the first steps in OpenQuake development was the creation of open-source software for advanced seismic hazard and risk calculations at any scale, the OpenQuake Engine. Although in continuous development, a command-line version of the software is already being test-driven and used by hundreds worldwide; from non-profits in Central Asia, seismologists in sub-Saharan Africa and companies in South Asia to the European seismic hazard harmonization programme (SHARE). In addition, several technical trainings were organized with scientists from different regions of the world (sub-Saharan Africa, Central Asia, Asia-Pacific) to introduce the engine and other OpenQuake tools to the community, something that will continue to happen over the coming years. 
Other tools that are being developed of direct interest to the hazard community are: • OpenQuake Modeller; fundamental

  20. Geological hazards, vulnerability, and risk assessment using GIS: model for Glenwood Springs, Colorado

    NASA Astrophysics Data System (ADS)

    Mejía-Navarro, Mario; Wohl, Ellen E.; Oaks, Sherry D.

    1994-08-01

    Glenwood Springs, Colorado, lies at the junction of the Roaring Fork and Colorado Rivers, surrounded by the steep peaks of the Colorado Rocky Mountains. Large parts of the region have had intensive sheet erosion, debris flows, and hyperconcentrated floods triggered by landslides and slumps. The latter come from unstable slopes in the many tributary channels on the mountainsides, causing concentration of debris in channels and a large accumulation of sediment in colluvial wedges and debris fans that line the river valleys. Many of the landslide and debris-flow deposits exist in a state resembling suspended animation, ready to be destabilized by intense precipitation and/or seismic activity. During this century urban development in the Roaring Fork River valley has increased rapidly. The city of Glenwood Springs continues to expand over unstable debris fans without any construction of hazard mitigation structures. Since 1900, Glenwood Springs has had at least 21 damaging debris flows and floods; on July 24, 1977 a heavy thunderstorm spread a debris flow over more than 80 ha of the city. This paper presents a method that uses Geographic Information Systems (GIS) to assess geological hazards, vulnerability, and risk in the Glenwood Springs area. The hazards evaluated include subsidence, rockfall, debris flows, and floods; in this paper we focus on debris flows and subsidence. Information on topography, hydrology, precipitation, geomorphic processes, bedrock and surficial geology, structural geology, soils, vegetation, and land use was processed for hazard assessment using a series of algorithms. The ARC/INFO and GRASS GIS software packages were used to produce maps and tables in a format accessible to urban planners. After geological hazards were defined for the study area, we estimated the vulnerability (Ve) of various elements for an event of intensity i. Risk is assessed as a function of hazard and vulnerability. We categorized the study area into 14 classes for planning

  1. Rockfall hazard and risk assessment in the Yosemite Valley, California, USA

    USGS Publications Warehouse

    Guzzetti, F.; Reichenbach, P.; Wieczorek, G.F.

    2003-01-01

    Rock slides and rock falls are the most frequent types of slope movements in Yosemite National Park, California. In historical time (1857-2002), 392 rock falls and rock slides have been documented in the valley, and some of them have been mapped in detail. We present the results of an attempt to assess rock fall hazards in the Yosemite Valley, considering both spatial and temporal aspects of the hazard. A detailed inventory of slope movements covering the 145-year period from 1857 to 2002 is used to determine the frequency-volume statistics of rock falls and to estimate their annual frequency, providing the temporal component of rock fall hazard. The extent of the areas potentially subject to rock fall hazards in the Yosemite Valley was obtained using STONE, a physically based rock fall simulation computer program. The software computes 3-dimensional rock fall trajectories starting from a digital elevation model (DEM), the location of rock fall release points, and maps of the dynamic rolling friction coefficient and of the coefficients of normal and tangential energy restitution. For each DEM cell the software calculates the number of rock falls passing through the cell, the maximum rock fall velocity and the maximum flying height. For the Yosemite Valley, a DEM with a ground resolution of 10 × 10 m was prepared using topographic contour lines from the U.S. Geological Survey 1:24 000-scale maps. Rock fall release points were identified as DEM cells having a slope steeper than 60°, an assumption based on the location of historical rock falls. Maps of the normal and tangential energy restitution coefficients and of the rolling friction coefficient were produced from a surficial geologic map. The availability of historical rock falls mapped in detail allowed us to check the computer program's performance and to calibrate the model parameters. Visual and statistical comparison of the model results with the mapped rock falls confirmed the accuracy of
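The frequency-volume step can be sketched with the power-law model commonly fitted to rock fall inventories; the fitted constants below are hypothetical, not the Yosemite values:

```python
def annual_frequency(volume, rate_ref, exponent, v_ref=1.0):
    """Power-law frequency-volume statistics for rock falls: annual
    rate of events with volume >= `volume` (m^3), given the rate
    `rate_ref` of events >= the reference volume `v_ref`."""
    return rate_ref * (volume / v_ref) ** (-exponent)

# Hypothetical inventory fit: 2.5 events/yr of at least 1 m^3,
# with scaling exponent 1.1; large events are correspondingly rare.
f_small = annual_frequency(1.0, rate_ref=2.5, exponent=1.1)
f_large = annual_frequency(1000.0, rate_ref=2.5, exponent=1.1)
```

Combining such annual rates with the spatial reach computed by a trajectory model like STONE yields the hazard per DEM cell.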

  2. Hazard Interactions and Interaction Networks (Cascades) within Multi-Hazard Methodologies

    NASA Astrophysics Data System (ADS)

    Gill, Joel; Malamud, Bruce D.

    2016-04-01

    Here we combine research and commentary to reinforce the importance of integrating hazard interactions and interaction networks (cascades) into multi-hazard methodologies. We present a synthesis of the differences between 'multi-layer single hazard' approaches and 'multi-hazard' approaches that integrate such interactions. This synthesis suggests that ignoring interactions could distort management priorities, increase vulnerability to other spatially relevant hazards or underestimate disaster risk. We proceed to present an enhanced multi-hazard framework, through the following steps: (i) describe and define three groups (natural hazards, anthropogenic processes and technological hazards/disasters) as relevant components of a multi-hazard environment; (ii) outline three types of interaction relationship (triggering, increased probability, and catalysis/impedance); and (iii) assess the importance of networks of interactions (cascades) through case-study examples (based on literature, field observations and semi-structured interviews). We further propose visualisation frameworks to represent these networks of interactions. Our approach reinforces the importance of integrating interactions between natural hazards, anthropogenic processes and technological hazards/disasters into enhanced multi-hazard methodologies. Multi-hazard approaches support the holistic assessment of hazard potential, and consequently disaster risk. We conclude by describing three ways by which understanding networks of interactions contributes to the theoretical and practical understanding of hazards, disaster risk reduction and Earth system management. Understanding interactions and interaction networks helps us to better (i) model the observed reality of disaster events, (ii) constrain potential changes in physical and social vulnerability between successive hazards, and (iii) prioritise resource allocation for mitigation and disaster risk reduction.

  3. Some suggested future directions of quantitative resource assessments

    USGS Publications Warehouse

    Singer, D.A.

    2001-01-01

    Future quantitative assessments will be expected to estimate quantities, values, and locations of undiscovered mineral resources in a form that conveys both economic viability and the uncertainty associated with the resources. Historically, declining metal prices point to the need for larger deposits over time. Sensitivity analysis demonstrates that the greatest opportunity for reducing uncertainty in assessments lies in lowering the uncertainty associated with tonnage estimates. Of all errors possible in assessments, those affecting tonnage estimates are by far the most important. Selecting the correct deposit model is the most important way of controlling errors, because deposit type is the best-known predictor of tonnage. In many large regions, much of the surface is covered with apparently barren rocks and sediments. Because many exposed mineral deposits are believed to have been found, a prime concern is the presence of possible mineralized rock under cover. Assessments of areas with resources under cover must rely on extrapolation from surrounding areas, new geologic maps of rocks under cover, or analogy with other well-explored areas that can serve as training tracts. Cover has a profound effect on uncertainty and on assessment methods and procedures, because the geology is seldom known and geophysical methods typically have attenuated responses. Many earlier assessment methods were based on relationships of geochemical and geophysical variables to deposits learned from deposits exposed at the surface; these will need to be relearned from covered deposits. Mineral-deposit models are important in quantitative resource assessments for two reasons: (1) grades and tonnages of most deposit types are significantly different, and (2) deposit types occur in different geologic settings that can be identified from geologic maps. Mineral-deposit models are the keystone in combining the diverse geoscience information on geology, mineral
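    The tonnage-uncertainty argument can be made concrete with a small simulation. The sketch below is illustrative only: the deposit-count distribution and the lognormal tonnage parameters are invented, not taken from the paper, but the structure (an uncertain number of undiscovered deposits, each with a lognormally distributed tonnage) follows the usual form of quantitative resource assessments.

```python
import math
import random
import statistics

def simulate_total_tonnage(n_trials, deposit_counts, count_probs,
                           log_mean, log_sd, seed=0):
    """Monte Carlo total tonnage: draw a number of undiscovered deposits
    from a discrete distribution, then a lognormal tonnage for each."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n_trials):
        n = rng.choices(deposit_counts, weights=count_probs)[0]
        totals.append(sum(rng.lognormvariate(log_mean, log_sd)
                          for _ in range(n)))
    return totals

# Hypothetical deposit model: 0-3 undiscovered deposits, median tonnage 1 Mt.
totals = simulate_total_tonnage(
    20_000, deposit_counts=[0, 1, 2, 3], count_probs=[0.2, 0.4, 0.3, 0.1],
    log_mean=math.log(1.0), log_sd=1.0)  # tonnage in Mt
print(f"mean total tonnage: {statistics.mean(totals):.2f} Mt")
```

    The wide spread of `totals` relative to its mean is the point of the paper's sensitivity argument: the lognormal tonnage term dominates the overall uncertainty.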

  4. Long-range hazard assessment of volcanic ash dispersal for a Plinian eruptive scenario at Popocatépetl volcano (Mexico): implications for civil aviation safety

    USGS Publications Warehouse

    Bonasia, Rosanna; Scaini, Chirara; Capra, Lucia; Nathenson, Manuel; Siebe, Claus; Arana-Salinas, Lilia; Folch, Arnau

    2013-01-01

    Popocatépetl is one of Mexico’s most active volcanoes, threatening a densely populated area that includes Mexico City with more than 20 million inhabitants. The destructive potential of this volcano is demonstrated by its Late Pleistocene–Holocene eruptive activity, which has been characterized by recurrent Plinian eruptions of large magnitude, the last two of which destroyed human settlements in pre-Hispanic times. Popocatépetl’s reawakening in 1994 produced a crisis that culminated with the evacuation of two villages on the northeastern flank of the volcano. Shortly after, a monitoring system and a civil protection contingency plan based on a hazard zone map were implemented. The current volcanic hazards map considers the potential occurrence of different volcanic phenomena, including pyroclastic density currents and lahars. However, no quantitative assessment of the tephra hazard, especially related to atmospheric dispersal, has been performed. The presence of airborne volcanic ash at low and jet-cruise atmospheric levels compromises the safety of aircraft operations and forces re-routing of aircraft to prevent encounters with volcanic ash clouds. Given the high number of important airports in the surroundings of Popocatépetl volcano and considering the potential threat posed to civil aviation in Mexico and adjacent regions in case of a Plinian eruption, a hazard assessment for tephra dispersal is required. In this work, we present the first probabilistic tephra dispersal hazard assessment for Popocatépetl volcano. We compute probabilistic hazard maps for critical thresholds of airborne ash concentrations at different flight levels, corresponding to the thresholds defined in Europe during 2010, which are still under discussion. Tephra dispersal modelling is performed using the FALL3D numerical model. Probabilistic hazard maps are built for a Plinian eruptive scenario defined on the basis of geological field data for the “Ochre Pumice” Plinian eruption (4965 14C
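    Per grid cell, a probabilistic hazard map of this kind reduces to an exceedance frequency across an ensemble of dispersal simulations. A minimal sketch follows, using a synthetic ensemble rather than FALL3D output; the 2 mg/m³ threshold is one of the concentration values discussed in Europe in 2010.

```python
import random

def exceedance_probability(ensemble, threshold):
    """Fraction of ensemble members in which the simulated ash
    concentration at each grid cell exceeds a critical threshold."""
    return [sum(run[i] > threshold for run in ensemble) / len(ensemble)
            for i in range(len(ensemble[0]))]

# Hypothetical ensemble: 100 runs x 4 grid cells (mg/m^3 at one flight level)
rng = random.Random(1)
ensemble = [[rng.expovariate(1 / 2.0) for _ in range(4)] for _ in range(100)]
probs = exceedance_probability(ensemble, threshold=2.0)  # 2 mg/m^3
print(probs)
```

    Mapping `probs` back onto the grid, one map per threshold and flight level, gives the probabilistic hazard maps described above.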

  5. Toward Quantitative Small Animal Pinhole SPECT: Assessment of Quantitation Accuracy Prior to Image Compensations

    PubMed Central

    Chen, Chia-Lin; Wang, Yuchuan; Lee, Jason J. S.; Tsui, Benjamin M. W.

    2011-01-01

    Purpose We assessed the quantitation accuracy of small animal pinhole single photon emission computed tomography (SPECT) under the current preclinical settings, where image compensations are not routinely applied. Procedures The effects of several common image-degrading factors and imaging parameters on quantitation accuracy were evaluated using Monte-Carlo simulation methods. Typical preclinical imaging configurations were modeled, and quantitative analyses were performed based on image reconstructions without compensating for attenuation, scatter, and limited system resolution. Results Using mouse-sized phantom studies as examples, attenuation effects alone degraded quantitation accuracy by up to −18% (Tc-99m or In-111) or −41% (I-125). The inclusion of scatter effects changed the above numbers to −12% (Tc-99m or In-111) and −21% (I-125), respectively, indicating the significance of scatter in quantitative I-125 imaging. Region-of-interest (ROI) definitions have greater impacts on regional quantitation accuracy for small sphere sources as compared to attenuation and scatter effects. For the same ROI, SPECT acquisitions using pinhole apertures of different sizes could significantly affect the outcome, whereas the use of different radii-of-rotation yielded negligible differences in quantitation accuracy for the imaging configurations simulated. Conclusions We have systematically quantified the influence of several factors affecting the quantitation accuracy of small animal pinhole SPECT. In order to consistently achieve accurate quantitation within 5% of the truth, comprehensive image compensation methods are needed. PMID:19048346
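    The degradation figures quoted above are relative quantitation errors with respect to the known truth. A trivial helper makes the convention explicit; the measured value 82.0 is hypothetical, chosen to reproduce the −18% attenuation-only case.

```python
def percent_bias(measured, true_activity):
    """Quantitation error relative to the true activity, in percent.
    Negative values mean underestimation (e.g. uncompensated attenuation)."""
    return 100.0 * (measured - true_activity) / true_activity

# Hypothetical ROI activity estimate (arbitrary units) vs. known truth
true_activity = 100.0
print(percent_bias(82.0, true_activity))  # attenuation-only example
```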

  6. Probabilistic short-term volcanic hazard in phases of unrest: A case study for tephra fallout

    NASA Astrophysics Data System (ADS)

    Selva, Jacopo; Costa, Antonio; Sandri, Laura; Macedonio, Giovanni; Marzocchi, Warner

    2014-12-01

    During volcanic crises, volcanologists usually estimate the impact of possible imminent eruptions through deterministic modelling of the effects of one or a few preestablished scenarios. Although such an approach may provide important information to decision makers, the sole use of deterministic scenarios does not allow scientists to properly account for all uncertainties, and it cannot be used to assess risk quantitatively, because risk assessment unavoidably requires a probabilistic approach. We present a model based on the concept of the Bayesian event tree (hereinafter named BET_VH_ST, standing for Bayesian event tree for short-term volcanic hazard) for short-term, near-real-time probabilistic volcanic hazard analysis, formulated for any potentially hazardous phenomenon accompanying an eruption. The specific goal of BET_VH_ST is to produce a quantitative assessment of the probability of exceedance of any potential level of intensity for a given volcanic hazard due to eruptions within restricted time windows (hours to days) in any area surrounding the volcano, accounting for all natural and epistemic uncertainties. BET_VH_ST properly assesses the conditional probability at each level of the event tree, accounting for any relevant information derived from the monitoring system, theoretical models, and the past history of the volcano, and propagating any relevant epistemic uncertainty underlying these assessments. As an application example, we apply BET_VH_ST to assess the short-term volcanic hazard related to tephra loading during the Major Emergency Simulation Exercise, a major exercise at Mount Vesuvius that took place from 19 to 23 October 2006, consisting of a blind simulation of Vesuvius reactivation, from the early warning phase up to the final eruption, including the evacuation of a sample of about 2000 people from the area at risk.
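    At its core, an event-tree hazard estimate multiplies conditional probabilities along a branch of the tree. The sketch below shows only that arithmetic; the node probabilities are invented for illustration and are not BET_VH_ST outputs.

```python
def branch_probability(conditional_probs):
    """Probability of the terminal outcome of one event-tree branch:
    the product of the conditional probabilities at each node."""
    p = 1.0
    for node_p in conditional_probs:
        p *= node_p
    return p

# Hypothetical short-term branch: unrest -> magmatic unrest -> eruption ->
# vent sector -> tephra load exceeding a threshold at a given site.
p_exceed = branch_probability([1.0, 0.6, 0.3, 0.25, 0.4])
print(f"P(exceedance) = {p_exceed:.4f}")
```

    In a Bayesian event tree each node probability would itself carry a distribution (epistemic uncertainty) rather than a point value; the product structure is the same.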
The results show that BET_VH_ST is able to produce short-term forecasts of the impact of tephra fall during a rapidly

  7. Seismic hazard assessment of the cultural heritage sites: A case study in Cappadocia (Turkey)

    NASA Astrophysics Data System (ADS)

    Seyrek, Evren; Orhan, Ahmet; Dinçer, İsmail

    2014-05-01

    Turkey is one of the most seismically active regions in the world. Major earthquakes with the potential to threaten life and property occur frequently here. In the last decade, over 50,000 residents lost their lives, commonly as a result of building failures in seismic events. The Cappadocia region is one of the most important tourist sites in Turkey. At the same time, the region was added to the World Heritage List by UNESCO in 1985 for its natural, historical and cultural values. The region is adversely affected by several environmental conditions, which have been the subject of many previous studies, but few studies have addressed the seismic evaluation of the region. Some of the important historical and cultural heritage sites are Goreme Open Air Museum, Uchisar Castle, Ortahisar Castle, Derinkuyu Underground City and Ihlara Valley. According to the seismic hazard zonation map published by the Ministry of Reconstruction and Settlement, these heritage sites fall in Zone III, Zone IV and Zone V. This map shows peak ground acceleration with a 10 percent probability of exceedance in 50 years for bedrock. In this connection, the seismic hazard at these heritage sites has to be evaluated. In this study, seismic hazard calculations are performed with both deterministic and probabilistic approaches, taking local site conditions into account. A catalog of historical and instrumental earthquakes was prepared and used in this study. The seismic sources were identified for the hazard assessment based on geological, seismological and geophysical information. Peak ground acceleration (PGA) at bedrock level is calculated for different seismic sources using attenuation relationships applicable to Turkey. The results of the present study reveal that the seismic hazard at these sites closely matches the seismic zonation map published by the Ministry of Reconstruction and Settlement. Keywords: Seismic Hazard Assessment, Probabilistic Approach
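    A probabilistic hazard number like "10 percent probability of exceedance in 50 years" comes from a Cornell-type hazard integral: sum the annual rates of magnitude bins weighted by the probability that each event exceeds the target PGA, then convert the annual rate to a window probability with a Poisson assumption. The sketch below is a toy version: the ground-motion relation and source rates are invented, not the attenuation relationships used in the study.

```python
import math

def normal_sf(x):
    """Survival function of the standard normal distribution."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def annual_exceedance_rate(a_target, mags, rate_per_mag, dist_km,
                           gmpe_sigma=0.6):
    """Classical (Cornell-type) hazard sum for a single point source:
    sum over magnitude bins of (annual rate) x P(PGA > target | m, r)."""
    lam = 0.0
    for m, nu in zip(mags, rate_per_mag):
        # Toy ground-motion relation (an assumption, not a published GMPE):
        # ln PGA[g] = -3.5 + 0.9*m - 1.2*ln(r + 10)
        ln_median = -3.5 + 0.9 * m - 1.2 * math.log(dist_km + 10.0)
        eps = (math.log(a_target) - ln_median) / gmpe_sigma
        lam += nu * normal_sf(eps)
    return lam

mags = [5.0, 5.5, 6.0, 6.5, 7.0]
rates = [0.10, 0.032, 0.010, 0.0032, 0.0010]  # Gutenberg-Richter-like, b ~ 1
lam = annual_exceedance_rate(0.2, mags, rates, dist_km=15.0)
p50 = 1.0 - math.exp(-50.0 * lam)             # Poisson, 50-year window
print(f"P(PGA > 0.2 g in 50 yr) = {p50:.3f}")
```

    Repeating the calculation for a range of target accelerations and reading off the one with `p50` = 0.10 gives the design PGA shown on zonation maps.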

  8. An open framework for automated chemical hazard assessment based on GreenScreen for Safer Chemicals: A proof of concept.

    PubMed

    Wehage, Kristopher; Chenhansa, Panan; Schoenung, Julie M

    2017-01-01

    GreenScreen® for Safer Chemicals is a framework for comparative chemical hazard assessment. It is the first transparent, open and publicly accessible framework of its kind, allowing manufacturers and governmental agencies to make informed decisions about the chemicals and substances used in consumer products and buildings. In the GreenScreen® benchmarking process, chemical hazards are assessed and classified based on 18 hazard endpoints from up to 30 different sources. The result is a simple numerical benchmark score and an accompanying assessment report that allow users to flag chemicals of concern and identify safer alternatives. Although the screening process is straightforward, aggregating and sorting hazard data is tedious, time-consuming, and prone to human error. In light of these challenges, the present work demonstrates the use of automation to cull chemical hazard data from publicly available internet resources, assign metadata, and perform a GreenScreen® hazard assessment using the GreenScreen® "List Translator." The automated technique, written as a module in the Python programming language, generates GreenScreen® List Translation data for over 3000 chemicals in approximately 30 s. Discussion of the potential benefits and limitations of automated techniques is provided. By embedding the library into a web-based graphical user interface, the extensibility of the library is demonstrated. The accompanying source code is made available to the hazard assessment community. Integr Environ Assess Manag 2017;13:167-176. © 2016 SETAC.
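    The core of list translation is mapping authoritative-list classifications for a chemical to a screening score and keeping the highest-concern result. The sketch below is a heavily simplified assumption about that logic, not the cited module's actual code; the real List Translator draws on many lists and endpoint-specific rules, though LT-1, LT-P1 and LT-UNK are genuine List Translator score names.

```python
# Simplified score ordering (assumption): LT-1 = likely Benchmark-1 chemical
# of high concern, LT-P1 = possible Benchmark-1, LT-UNK = insufficient data.
PRIORITY = {"LT-1": 2, "LT-P1": 1, "LT-UNK": 0}

def list_translate(hits):
    """Return the highest-concern score among all list hits for a chemical.
    `hits` maps a source list name to the score that listing implies."""
    if not hits:
        return "LT-UNK"
    return max(hits.values(), key=lambda s: PRIORITY[s])

# Hypothetical hits for one CAS number
hits = {"EU CMR Category 1B": "LT-1", "IARC Group 2B": "LT-P1"}
print(list_translate(hits))  # highest-concern score wins
```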

  9. Distinguishing nanomaterial particles from background airborne particulate matter for quantitative exposure assessment

    NASA Astrophysics Data System (ADS)

    Ono-Ogasawara, Mariko; Serita, Fumio; Takaya, Mitsutoshi

    2009-10-01

    As the production of engineered nanomaterials expands, the chance that workers involved in the manufacturing process will be exposed to nanoparticles also increases. A risk management system based on the precautionary principle is needed for workplaces in the nanomaterial industry. One of the problems in such a risk management system is the difficulty of exposure assessment. In this article, examples of exposure assessment in nanomaterial industries are reviewed with a focus on distinguishing engineered nanomaterial particles from background nanoparticles in the workplace atmosphere. An approach by JNIOSH (Japan National Institute of Occupational Safety and Health) to quantitatively measure exposure to carbonaceous nanomaterials is also introduced. In addition to real-time measurements and qualitative analysis by electron microscopy, quantitative chemical analysis is necessary for assessing exposure to nanomaterials quantitatively. Chemical analysis is suitable for quantitative exposure measurement, especially at facilities with high levels of background nanoparticles.

  10. Earth reencounter probabilities for aborted space disposal of hazardous nuclear waste

    NASA Technical Reports Server (NTRS)

    Friedlander, A. L.; Feingold, H.

    1977-01-01

    A quantitative assessment is made of the long-term risk of earth reencounter and reentry associated with aborted disposal of hazardous material in the space environment. Numerical results are presented for 10 candidate disposal options covering a broad spectrum of disposal destinations and deployment propulsion systems. Based on representative models of system failure, the probability that a single payload will return and collide with earth within a period of 250,000 years is found to lie in the range 0.0002-0.006. Proportionately smaller risk attaches to shorter time intervals. Risk-critical factors related to trajectory geometry and system reliability are identified as possible mechanisms of hazard reduction.

  11. Seismic hazard assessment over time: Modelling earthquakes in Taiwan

    NASA Astrophysics Data System (ADS)

    Chan, Chung-Han; Wang, Yu; Wang, Yu-Ju; Lee, Ya-Ting

    2017-04-01

    To assess the seismic hazard with temporal change in Taiwan, we develop a new approach that combines the Brownian Passage Time (BPT) model with the Coulomb stress change and implements the seismogenic source parameters of the Taiwan Earthquake Model (TEM). The BPT model is adopted to describe the rupture recurrence intervals of specific fault sources, together with the time elapsed since the last fault rupture, to derive their long-term rupture probability. We also evaluate the short-term seismicity rate change based on the static Coulomb stress interaction between seismogenic sources. By considering the above time-dependent factors, our new combined model suggests an increased long-term seismic hazard in the vicinity of active faults along the western Coastal Plain and the Longitudinal Valley, where active faults have short recurrence intervals and long elapsed times since their last ruptures, and/or short-term elevated hazard levels right after the occurrence of large earthquakes due to the stress-triggering effect. The stress enhanced by the February 6th, 2016, Meinong ML 6.6 earthquake also significantly increased the rupture probabilities of several neighbouring seismogenic sources in southwestern Taiwan and raised the hazard level in the near future. Our approach draws on the advantages of incorporating long- and short-term models to provide time-dependent earthquake probability constraints. Our time-dependent model considers more detailed information than other published models. It thus offers decision-makers and public officials an adequate basis for rapid evaluation of and response to future emergency scenarios such as victim relocation and sheltering.
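    The long-term part of such a model is a conditional rupture probability under the BPT (inverse Gaussian) renewal distribution: given that a fault has been quiet for a known elapsed time, what is the probability of rupture in the next few decades? The sketch below implements the standard inverse Gaussian CDF; the mean recurrence and aperiodicity values are hypothetical, not TEM parameters.

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bpt_cdf(t, mu, alpha):
    """CDF of the Brownian Passage Time (inverse Gaussian) distribution
    with mean recurrence interval `mu` and aperiodicity `alpha`."""
    if t <= 0:
        return 0.0
    u = math.sqrt(t / mu)
    return (norm_cdf((u - 1.0 / u) / alpha)
            + math.exp(2.0 / alpha ** 2) * norm_cdf(-(u + 1.0 / u) / alpha))

def conditional_rupture_prob(elapsed, window, mu, alpha):
    """P(rupture within `window` years | quiet for `elapsed` years)."""
    f_t = bpt_cdf(elapsed, mu, alpha)
    return (bpt_cdf(elapsed + window, mu, alpha) - f_t) / (1.0 - f_t)

# Hypothetical fault: mean recurrence 200 yr, aperiodicity 0.5
p = conditional_rupture_prob(elapsed=150.0, window=30.0, mu=200.0, alpha=0.5)
print(f"30-yr conditional probability: {p:.3f}")
```

    The short-term Coulomb term would then scale the background seismicity rate after a stress-changing event; only the renewal half is sketched here.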

  12. An information diffusion technique to assess integrated hazard risks.

    PubMed

    Huang, Chongfu; Huang, Yundong

    2018-02-01

    An integrated risk is a scene in the future associated with some adverse incident caused by multiple hazards. An integrated probability risk is the expected value of disaster. Due to the difficulty of assessing an integrated probability risk with a small sample, weighting methods and copulas are employed to avoid this obstacle. To resolve the problem, in this paper, we develop the information diffusion technique to construct a joint probability distribution and a vulnerability surface. Then, an integrated risk can be directly assessed by using a small sample. A case of an integrated risk caused by flood and earthquake is given to show how the suggested technique is used to assess the integrated risk of annual property loss. Copyright © 2017 Elsevier Inc. All rights reserved.
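    The normal information diffusion technique turns a small sample into a discrete probability distribution by spreading each observation over a set of monitoring points with a Gaussian kernel. The sketch below shows the one-dimensional version; the sample, monitoring points, and bandwidth are invented for illustration (the paper's bandwidth selection rule and the joint, multi-hazard construction are not reproduced here).

```python
import math

def diffuse(sample, points, h):
    """Normal information diffusion: spread each observation over the
    discrete monitoring points with a Gaussian kernel of width h, so the
    diffused mass forms a probability distribution over the points."""
    mass = [0.0] * len(points)
    for x in sample:
        weights = [math.exp(-((x - u) ** 2) / (2.0 * h * h)) for u in points]
        total = sum(weights)
        for j, w in enumerate(weights):
            mass[j] += w / total      # each observation contributes mass 1
    n = len(sample)
    return [m / n for m in mass]      # normalise to probabilities

# Small sample of annual flood peaks (hypothetical, arbitrary units)
sample = [2.1, 3.4, 3.9, 5.2, 6.8]
points = [0.5 * k for k in range(17)]  # monitoring points 0.0 .. 8.0
probs = diffuse(sample, points, h=0.8)
print(f"sum of probabilities = {sum(probs):.6f}")
```

    Extending the kernel to two dimensions over (flood, earthquake) intensities gives the joint distribution that, combined with a vulnerability surface, yields the integrated expected loss.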

  13. Probabilistic Volcanic Multi-Hazard Assessment at Somma-Vesuvius (Italy): coupling Bayesian Belief Networks with a physical model for lahar propagation

    NASA Astrophysics Data System (ADS)

    Tierz, Pablo; Woodhouse, Mark; Phillips, Jeremy; Sandri, Laura; Selva, Jacopo; Marzocchi, Warner; Odbert, Henry

    2017-04-01

    and, finally, assess the probability of occurrence of lahars of different volumes. The information utilized to parametrize the BBNs includes: (1) datasets of lahar observations; (2) numerical modelling of tephra fallout and PDCs; and (3) literature data. The BBN framework provides an opportunity to quantitatively combine these different types of evidence and use them to derive a rational approach to lahar forecasting. Lastly, we couple the BBN assessments with a shallow-water physical model for lahar propagation in order to attach probabilities to the simulated hazard footprints. We develop our methodology at Somma-Vesuvius (Italy), an explosive volcano prone to rain-triggered lahars or debris flows both right after an eruption and during inter-eruptive periods. Accounting for the variability in tephra-fallout and dense-PDC propagation and the main geomorphological features of the catchments around Somma-Vesuvius, the areas most likely to form medium-large lahars are the flanks of the volcano and the Sarno mountains towards the east.

  14. Hazard interactions and interaction networks (cascades) within multi-hazard methodologies

    NASA Astrophysics Data System (ADS)

    Gill, Joel C.; Malamud, Bruce D.

    2016-08-01

    This paper combines research and commentary to reinforce the importance of integrating hazard interactions and interaction networks (cascades) into multi-hazard methodologies. We present a synthesis of the differences between multi-layer single-hazard approaches and multi-hazard approaches that integrate such interactions. This synthesis suggests that ignoring interactions between important environmental and anthropogenic processes could distort management priorities, increase vulnerability to other spatially relevant hazards or underestimate disaster risk. In this paper we proceed to present an enhanced multi-hazard framework through the following steps: (i) description and definition of three groups (natural hazards, anthropogenic processes and technological hazards/disasters) as relevant components of a multi-hazard environment, (ii) outlining of three types of interaction relationship (triggering, increased probability, and catalysis/impedance), and (iii) assessment of the importance of networks of interactions (cascades) through case study examples (based on the literature, field observations and semi-structured interviews). We further propose two visualisation frameworks to represent these networks of interactions: hazard interaction matrices and hazard/process flow diagrams. Our approach reinforces the importance of integrating interactions between different aspects of the Earth system, together with human activity, into enhanced multi-hazard methodologies. Multi-hazard approaches support the holistic assessment of hazard potential and consequently disaster risk. We conclude by describing three ways by which understanding networks of interactions contributes to the theoretical and practical understanding of hazards, disaster risk reduction and Earth system management. 
Understanding interactions and interaction networks helps us to better (i) model the observed reality of disaster events, (ii) constrain potential changes in physical and social vulnerability
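    One of the visualisation frameworks named above, the hazard interaction matrix, is easy to represent in code: rows are potential triggering hazards, columns are potential triggered hazards, and each cell records the interaction type (or no link). The hazards and links below are hypothetical examples, not the paper's case studies.

```python
# Hypothetical hazards and interaction links for illustration.
hazards = ["earthquake", "landslide", "flood"]
interactions = {
    ("earthquake", "landslide"): "triggering",
    ("landslide", "flood"): "increased probability",  # e.g. channel damming
}

def matrix_rows(hazards, interactions):
    """Render the interaction matrix as rows of cells ('-' = no link)."""
    return [[interactions.get((row, col), "-") for col in hazards]
            for row in hazards]

rows = matrix_rows(hazards, interactions)
for name, row in zip(hazards, rows):
    print(f"{name:>10}: {row}")
```

    Chaining non-empty cells (earthquake triggers landslide, landslide raises flood probability) traces exactly the interaction networks, or cascades, the paper describes.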

  15. ST-HASSET for volcanic hazard assessment: A Python tool for evaluating the evolution of unrest indicators

    NASA Astrophysics Data System (ADS)

    Bartolini, Stefania; Sobradelo, Rosa; Martí, Joan

    2016-08-01

    Short-term hazard assessment is an important part of the volcanic management cycle, above all at the onset of an episode of volcanic agitation (unrest). For this reason, one of the main tasks of modern volcanology is to use monitoring data to identify and analyse precursory signals and so determine where and when an eruption might occur. This work follows from Sobradelo and Martí [Short-term volcanic hazard assessment through Bayesian inference: retrospective application to the Pinatubo 1991 volcanic crisis. Journal of Volcanology and Geothermal Research 290, 111, 2015], who defined a new methodology for conducting short-term hazard assessment at volcanoes in unrest. Using the same case study, the 15 June 1991 eruption of Pinatubo, this work introduces a new free Python tool, ST-HASSET, which implements the Sobradelo and Martí (2015) methodology to track the time evolution of unrest indicators in short-term volcanic hazard assessment. Moreover, this tool is designed to complement long-term hazard assessment with continuous monitoring data when the volcano goes into unrest. It is based on Bayesian inference and transforms different pre-eruptive monitoring parameters into a common probabilistic scale for comparison among unrest episodes from the same volcano or from similar ones. This makes it possible to identify common pre-eruptive behaviours and patterns. ST-HASSET is especially designed to assist experts and decision makers as a crisis unfolds, and allows sudden changes in the activity of a volcano to be detected. It therefore makes an important contribution to the analysis and interpretation of relevant data for understanding the evolution of volcanic unrest.
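    Transforming monitoring parameters onto "a common probabilistic scale" via Bayesian inference can be illustrated with a conjugate Beta-Binomial update, in which each monitoring window reports how often an indicator exceeded its threshold. This is a generic sketch of Bayesian updating under an invented prior and invented counts, not the ST-HASSET API or the Sobradelo and Martí formulation.

```python
def beta_update(alpha, beta, exceedances, observations):
    """Conjugate Beta-Binomial update: prior Beta(alpha, beta) on the
    probability that an unrest indicator reflects a pre-eruptive state,
    updated with the number of threshold exceedances in a window."""
    return alpha + exceedances, beta + (observations - exceedances)

def beta_mean(alpha, beta):
    """Posterior mean of a Beta(alpha, beta) distribution."""
    return alpha / (alpha + beta)

# Weakly informative prior, then two successive monitoring windows
a, b = 1.0, 9.0                       # prior mean 0.1 (hypothetical)
a, b = beta_update(a, b, exceedances=4, observations=10)
a, b = beta_update(a, b, exceedances=7, observations=10)
print(f"posterior mean = {beta_mean(a, b):.3f}")
```

    Because every indicator ends up as a posterior probability on the same 0-1 scale, episodes from the same volcano, or from analogue volcanoes, become directly comparable.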

  16. Approach to the assessment of the hazard. [fire released carbon fiber electrical effects

    NASA Technical Reports Server (NTRS)

    Huston, R. J.

    1980-01-01

    An overview of the carbon fiber hazard assessment is presented. The potential risk to the civil sector associated with the accidental release of carbon fibers from aircraft having composite structures was assessed along with the need for protection of civil aircraft from carbon fibers.

  17. Quantitative Microbial Risk Assessment for Clostridium perfringens in Natural and Processed Cheeses

    PubMed Central

    Lee, Heeyoung; Lee, Soomin; Kim, Sejeong; Lee, Jeeyeon; Ha, Jimyeong; Yoon, Yohan

    2016-01-01

    This study evaluated the risk of Clostridium perfringens (C. perfringens) foodborne illness from natural and processed cheeses. Microbial risk assessment in this study was conducted according to four steps: hazard identification, hazard characterization, exposure assessment, and risk characterization. Hazard identification for C. perfringens on cheese was carried out through a literature review, and dose-response models were utilized for hazard characterization of the pathogen. For exposure assessment, the prevalence of C. perfringens, storage temperatures, storage times, and annual amounts of cheese consumption were surveyed. Finally, a simulation model was developed using the collected data, and the simulation result was used to estimate the probability of C. perfringens foodborne illness from cheese consumption with @RISK. C. perfringens was determined to be a low risk on cheese based on hazard identification, and the exponential model (r = 1.82×10−11) was deemed appropriate for hazard characterization. Annual amounts of natural and processed cheese consumption were 12.40±19.43 g and 19.46±14.39 g, respectively. Since the contamination levels of C. perfringens on natural (0.30 Log CFU/g) and processed cheeses (0.45 Log CFU/g) were below the detection limit, the initial contamination levels of natural and processed cheeses were estimated by beta distribution (α1 = 1, α2 = 91; α1 = 1, α2 = 309)×uniform distribution (a = 0, b = 2; a = 0, b = 2.8) to be −2.35 and −2.73 Log CFU/g, respectively. Moreover, no growth of C. perfringens was observed for exposure assessment under simulated conditions of distribution and storage. These data were used for risk characterization by a simulation model, and the mean values of the probability of C. perfringens foodborne illness by cheese consumption per person per day for natural and processed cheeses were 9.57×10−14 and 3.58×10−14, respectively. These results indicate that probability of C. perfringens foodborne illness
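    The final step couples the exponential dose-response model, P = 1 − exp(−r·dose), with Monte Carlo sampling of exposure. The sketch below uses the study's r and its estimated contamination level for natural cheese, but fixes contamination at that single value and samples only serving size; it omits prevalence, storage and growth modelling (all handled in the study with @RISK), so its output is illustrative rather than a reproduction of the reported 9.57×10⁻¹⁴ figure.

```python
import math
import random

R = 1.82e-11  # exponential dose-response parameter from the study

def illness_probability(conc_log_cfu_per_g, grams_consumed):
    """Exponential dose-response model P = 1 - exp(-r * dose),
    with dose = (CFU/g) * grams eaten."""
    dose = (10.0 ** conc_log_cfu_per_g) * grams_consumed
    return 1.0 - math.exp(-R * dose)

# Monte Carlo over serving size only; contamination fixed at the estimated
# initial level for natural cheese (-2.35 Log CFU/g). Negative draws from
# the consumption distribution are truncated to zero.
rng = random.Random(0)
risks = [illness_probability(-2.35, max(rng.gauss(12.40, 19.43), 0.0))
         for _ in range(10_000)]
mean_risk = sum(risks) / len(risks)
print(f"mean per-serving risk ≈ {mean_risk:.2e}")
```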

  18. A life cycle hazard assessment (LCHA) framework to address fire hazards at the wildland-urban interface

    NASA Astrophysics Data System (ADS)

    Lindquist, Eric; Pierce, Jen; Wuerzer, Thomas; Glenn, Nancy; Dialani, Jijay; Gibble, Katie; Frazier, Tim; Strand, Eva

    2015-04-01

    up with an assessment of the impact of the product on the environment over time, and is being considered beyond the business and logistics communities in such areas as biodiversity and ecosystem impacts. From our perspective, we consider wildfire as the "product" and want to understand how it impacts the environment (spatially, temporally, across the bio-physical and social domains). Through development of this LCHA we adapt the LCA approach with a focus on the inputs (from fire and pre-fire efforts), the outputs (from post-fire conditions), and how they evolve and are responded to by the responsible agencies and stakeholders. A Life Cycle Hazard Assessment (LCHA) approach extends and integrates the understanding of hazards over much longer periods of time than previously considered. The LCHA also provides an integrated platform for the necessary interdisciplinary approach to understanding decisions and environmental change across the life cycle of the fire event. This presentation will discuss our theoretical and empirical framework for developing a longitudinal LCHA and contribute to the overall goals of the NH7.1 session.

  19. Mathematical modelling and quantitative methods.

    PubMed

    Edler, L; Poirier, K; Dourson, M; Kleiner, J; Mileson, B; Nordmann, H; Renwick, A; Slob, W; Walton, K; Würtzen, G

    2002-01-01

    The present review reports on the mathematical methods and statistical techniques presently available for hazard characterisation. The state of the art of mathematical modelling and quantitative methods used currently for regulatory decision-making in Europe and additional potential methods for risk assessment of chemicals in food and diet are described. Existing practices of JECFA, FDA, EPA, etc., are examined for their similarities and differences. A framework is established for the development of new and improved quantitative methodologies. Areas for refinement, improvement and increase of efficiency of each method are identified in a gap analysis. Based on this critical evaluation, needs for future research are defined. It is concluded from our work that mathematical modelling of the dose-response relationship would improve the risk assessment process. An adequate characterisation of the dose-response relationship by mathematical modelling clearly requires the use of a sufficient number of dose groups to achieve a range of different response levels. This need not necessarily lead to an increase in the total number of animals in the study if an appropriate design is used. Chemical-specific data relating to the mode or mechanism of action and/or the toxicokinetics of the chemical should be used for dose-response characterisation whenever possible. It is concluded that a single method of hazard characterisation would not be suitable for all kinds of risk assessments, and that a range of different approaches is necessary so that the method used is the most appropriate for the data available and for the risk characterisation issue. Future refinements to dose-response characterisation should incorporate more clearly the extent of uncertainty and variability in the resulting output.
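    One concrete output of dose-response modelling of the kind reviewed here is a benchmark dose (BMD): the dose producing a specified extra risk under the fitted model. For an exponential dose-response model P(d) = 1 − exp(−β·d), the BMD has a closed form. The slope β below is a hypothetical fitted value, not from the review.

```python
import math

def bmd_exponential(beta, bmr=0.10):
    """Benchmark dose for an exponential dose-response model
    P(d) = 1 - exp(-beta * d): the dose giving extra risk `bmr`,
    i.e. the solution of 1 - exp(-beta * d) = bmr."""
    return -math.log(1.0 - bmr) / beta

beta = 0.02  # hypothetical fitted slope, extra risk per mg/kg-day
print(f"BMD10 = {bmd_exponential(beta):.2f} mg/kg-day")
```

    In regulatory practice the lower confidence bound on the BMD (the BMDL) is used rather than the point estimate, which is one way the review's call to report uncertainty explicitly is met.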

  20. From leaves to landscape: A multiscale approach to assess fire hazard in wildland-urban interface areas.

    PubMed

    Ghermandi, Luciana; Beletzky, Natacha A; de Torres Curth, Mónica I; Oddi, Facundo J

    2016-12-01

    The overlapping zone between urbanization and wildland vegetation, known as the wildland urban interface (WUI), is often at high risk of wildfire. Human activities increase the likelihood of wildfires, which can have disastrous consequences for property and land use, and can pose a serious threat to lives. Fire hazard assessments depend strongly on the spatial scale of analysis. We assessed the fire hazard in a WUI area of a Patagonian city by working at three scales: landscape, community and species. Fire is a complex phenomenon, so we used a large number of variables that correlate a priori with the fire hazard. Consequently, we analyzed environmental variables together with fuel load and leaf flammability variables and integrated all the information in a fire hazard map with four fire hazard categories. The Nothofagus dombeyi forest had the highest fire hazard while grasslands had the lowest. Our work highlights the vulnerability of the wildland-urban interface to fire in this region and our suggested methodology could be applied in other wildland-urban interface areas. Particularly in high hazard areas, our work could help in spatial delimitation policies, urban planning and development of plans for the protection of human lives and assets. Copyright © 2016 Elsevier Ltd. All rights reserved.

  1. The hidden costs of coastal hazards: Implications for risk assessment and mitigation

    USGS Publications Warehouse

    Kunreuther, H.; Platt, R.; Baruch, S.; Bernknopf, R.L.; Buckley, M.; Burkett, V.; Conrad, D.; Davidson, T.; Deutsch, K.; Geis, D.; Jannereth, M.; Knap, A.; Lane, H.; Ljung, G.; McCauley, M.; Mileti, D.; Miller, T.; Morrow, B.; Meyers, J.; Pielke, R.; Pratt, A.; Tripp, J.

    2000-01-01

    Society has limited hazard mitigation dollars to invest. Which actions will be most cost effective, considering the true range of impacts and costs incurred? In 1997, the H. John Heinz III Center for Science, Economics and the Environment began a two-year study with a panel of experts to help develop new strategies to identify and reduce the costs of weather-related hazards associated with rapidly increasing coastal development activities. The Hidden Costs of Coastal Hazards presents the panel's findings, offering the first in-depth study that considers the costs of coastal hazards to natural resources, social institutions, business, and the built environment. Using Hurricane Hugo, which struck South Carolina in 1989, as a case study, it provides for the first time information on the full range of economic costs caused by a major coastal hazard event. The book: describes and examines unreported, undocumented, and hidden costs such as losses due to business interruption, reduction in property values, interruption of social services, psychological trauma, damage to natural systems, and others; examines the concepts of risk and vulnerability, and discusses conventional approaches to risk assessment and the emerging area of vulnerability assessment; recommends a comprehensive framework for developing and implementing mitigation strategies; and documents the human impact of Hurricane Hugo and provides insight from those who lived through it. The Hidden Costs of Coastal Hazards takes a structured approach to the problem of coastal hazards, offering a new framework for community-based hazard mitigation along with specific recommendations for implementation. Decisionmakers -- both policymakers and planners -- who are interested in coastal hazard issues will find the book a unique source of new information and insight, as will private-sector decisionmakers including lenders, investors, developers, and insurers of coastal property.

  2. Use of Bayesian event trees in semi-quantitative volcano eruption forecasting and hazard analysis

    NASA Astrophysics Data System (ADS)

    Wright, Heather; Pallister, John; Newhall, Chris

    2015-04-01

    Use of Bayesian event trees to forecast eruptive activity during volcano crises is an increasingly common practice for the USGS-USAID Volcano Disaster Assistance Program (VDAP) in collaboration with foreign counterparts. This semi-quantitative approach combines conceptual models of volcanic processes with current monitoring data and patterns of occurrence to reach consensus probabilities. This approach allows a response team to draw upon global datasets, local observations, and expert judgment, where the relative influence of these data depends upon the availability and quality of monitoring data and the degree to which the volcanic history is known. The construction of such event trees additionally relies upon the existence and use of relevant global databases and documented past periods of unrest. Because relevant global databases may be underpopulated or nonexistent, uncertainty in probability estimations may be large. Our 'hybrid' approach of combining local and global monitoring data and expert judgment facilitates discussion and constructive debate across disciplines, including seismology, gas geochemistry, geodesy, petrology, physical volcanology, and technology/engineering, where differences in opinion between response team members contribute to definition of the uncertainty in the probability estimations. In collaboration with foreign colleagues, we have created event trees for numerous areas experiencing volcanic unrest. Event trees are created for a specified time frame and are updated, revised, or replaced as the crisis proceeds. Creation of an initial tree is often prompted by a change in monitoring data, such that rapid assessment of probability is needed. These trees are intended as a vehicle for discussion and a way to document relevant data and models, where the target audience is the scientists themselves.
However, the probabilities derived through the event-tree analysis can also be used to help inform communications with emergency managers and the
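    The branch-multiplication arithmetic behind such an event tree can be sketched in a few lines. The node names and the (min, best, max) conditional probabilities below are purely illustrative assumptions, not values from any actual VDAP analysis:

```python
# Sketch of an event-tree probability calculation. Each node carries
# (min, best, max) conditional probabilities elicited from a response team;
# multiplying along a branch gives bounds on the terminal outcome.
branches = {
    "unrest continues":    (0.9, 1.0, 1.0),
    "magmatic origin":     (0.4, 0.6, 0.8),   # vs. hydrothermal/tectonic
    "eruption occurs":     (0.2, 0.4, 0.6),   # given magmatic unrest
    "explosive (VEI>=3)":  (0.1, 0.2, 0.4),   # given an eruption
}

def tree_probability(nodes):
    """Multiply conditional probabilities along the branch for each bound."""
    lo = best = hi = 1.0
    for pmin, pbest, pmax in nodes.values():
        lo, best, hi = lo * pmin, best * pbest, hi * pmax
    return lo, best, hi

lo, best, hi = tree_probability(branches)
print(f"P(explosive eruption in window): {lo:.4f} / {best:.3f} / {hi:.3f}")
```

The spread between the minimum and maximum products is one simple way to document the uncertainty that the abstract attributes to underpopulated global databases.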

  3. Probabilistic tsunami hazard assessment at Seaside, Oregon, for near- and far-field seismic sources

    USGS Publications Warehouse

    Gonzalez, F.I.; Geist, E.L.; Jaffe, B.; Kanoglu, U.; Mofjeld, H.; Synolakis, C.E.; Titov, V.V.; Areas, D.; Bellomo, D.; Carlton, D.; Horning, T.; Johnson, J.; Newman, J.; Parsons, T.; Peters, R.; Peterson, C.; Priest, G.; Venturato, A.; Weber, J.; Wong, F.; Yalciner, A.

    2009-01-01

    The first probabilistic tsunami flooding maps have been developed. The methodology, called probabilistic tsunami hazard assessment (PTHA), integrates tsunami inundation modeling with methods of probabilistic seismic hazard assessment (PSHA). Application of the methodology to Seaside, Oregon, has yielded estimates of the spatial distribution of 100- and 500-year maximum tsunami amplitudes, i.e., amplitudes with 1% and 0.2% annual probability of exceedance. The 100-year tsunami is generated most frequently by far-field sources in the Alaska-Aleutian Subduction Zone and is characterized by maximum amplitudes that do not exceed 4 m, with an inland extent of less than 500 m. In contrast, the 500-year tsunami is dominated by local sources in the Cascadia Subduction Zone and is characterized by maximum amplitudes in excess of 10 m and an inland extent of more than 1 km. The primary sources of uncertainty in these results include those associated with interevent time estimates, modeling of background sea level, and accounting for temporal changes in bathymetry and topography. Nonetheless, PTHA represents an important contribution to tsunami hazard assessment techniques; viewed in the broader context of risk analysis, PTHA provides a method for quantifying estimates of the likelihood and severity of the tsunami hazard, which can then be combined with vulnerability and exposure to yield estimates of tsunami risk. Copyright 2009 by the American Geophysical Union.
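    The return-period language above maps onto exceedance probabilities in a standard way; a minimal sketch, assuming Poissonian event occurrence:

```python
import math

def annual_exceedance(return_period_years):
    """Annual probability of exceedance for a given mean return period."""
    return 1.0 / return_period_years

def exceedance_in_window(return_period_years, window_years):
    """Poissonian probability of at least one exceedance in a time window."""
    return 1.0 - math.exp(-window_years / return_period_years)

print(annual_exceedance(100))   # the "100-year" tsunami: 1% per year
print(annual_exceedance(500))   # the "500-year" tsunami: 0.2% per year
# chance of seeing the 500-year event at least once in a 50-year design life:
print(f"{exceedance_in_window(500, 50):.3f}")
```

This is only the occurrence-rate bookkeeping; the PTHA study itself combines such rates with inundation modeling over many sources.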

  4. Spatial earthquake hazard assessment of Evansville, Indiana

    USGS Publications Warehouse

    Rockaway, T.D.; Frost, J.D.; Eggert, D.L.; Luna, R.

    1997-01-01

    The earthquake hazard has been evaluated for a 150-square-kilometer area around Evansville, Indiana. GIS-QUAKE, a system that combines liquefaction and ground motion analysis routines with site-specific geological, geotechnical, and seismological information, was used for the analysis. The hazard potential was determined by using 586 SPT borings, 27 CPT soundings, 39 shear-wave velocity profiles, and synthesized acceleration records for body-wave magnitude 6.5 and 7.3 mid-continental earthquakes, occurring at distances of 50 km and 250 km, respectively. The results of the GIS-QUAKE hazard analyses for Evansville identify areas with a high hazard potential that had not previously been identified in earthquake zonation studies. The Pigeon Creek area specifically is identified as having significant potential for liquefaction-induced damage. Damage as a result of ground motion amplification is determined to be a moderate concern throughout the area. Differences between the findings of this zonation study and previous work are attributed to the size and range of the database, the hazard evaluation methodologies, and the geostatistical interpolation techniques used to estimate the hazard potential. Further, assumptions regarding groundwater elevations made in previous studies are also considered to have had a significant effect on the results.

  5. Modeling Compound Flood Hazards in Coastal Embayments

    NASA Astrophysics Data System (ADS)

    Moftakhari, H.; Schubert, J. E.; AghaKouchak, A.; Luke, A.; Matthew, R.; Sanders, B. F.

    2017-12-01

    Coastal cities around the world are built on lowland topography adjacent to coastal embayments and river estuaries, where multiple factors threaten increasing flood hazards (e.g. sea level rise and river flooding). Quantitative risk assessment is required for administration of flood insurance programs and the design of cost-effective flood risk reduction measures. This demands a characterization of extreme water levels, such as 100- and 500-year return-period events. Furthermore, hydrodynamic flood models are routinely used to characterize localized flood level intensities (i.e., local depth and velocity) based on boundary forcing sampled from extreme value distributions. For example, extreme flood discharges in the U.S. are estimated from measured flood peaks using the Log-Pearson Type III distribution. However, configuring hydrodynamic models for coastal embayments is challenging because of compound extreme flood events: events caused by a combination of extreme sea levels, extreme river discharges, and possibly other factors such as extreme waves and precipitation causing pluvial flooding in urban developments. Here, we present an approach for flood risk assessment that coordinates multivariate extreme analysis with hydrodynamic modeling of coastal embayments. First, we evaluate the significance of the correlation structure between terrestrial freshwater inflow and oceanic variables; second, this correlation structure is described using copula functions in the unit joint probability domain; and third, we choose a series of compound design scenarios for hydrodynamic modeling based on their occurrence likelihood. The design scenarios include the most likely compound event (with the highest joint probability density), the preferred marginal scenario, and reproduced time series ensembles based on Monte Carlo sampling of the bivariate hazard domain.
The comparison between the resulting extreme water dynamics under the compound hazard scenarios explained above provides insight into the
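    The copula step can be sketched with a bivariate Gaussian copula. The marginal distributions and the dependence parameter below are placeholder assumptions for illustration, not the fitted values from the study:

```python
import numpy as np
from scipy import stats

def gaussian_copula_density(u, v, rho):
    """Density of a bivariate Gaussian copula at marginal CDF values (u, v)."""
    x, y = stats.norm.ppf(u), stats.norm.ppf(v)
    det = 1.0 - rho**2
    return (1.0 / np.sqrt(det)) * np.exp(
        -(rho**2 * (x**2 + y**2) - 2.0 * rho * x * y) / (2.0 * det))

# Hypothetical marginals: river discharge ~ lognormal, storm surge ~ Gumbel.
discharge = stats.lognorm(s=0.8, scale=300.0)   # m^3/s (assumed)
surge = stats.gumbel_r(loc=0.5, scale=0.2)      # m (assumed)
rho = 0.5                                       # assumed dependence strength

def joint_density(q, s):
    """Joint density of a compound (discharge, surge) event:
    copula density in the unit domain times the marginal densities."""
    u, v = discharge.cdf(q), surge.cdf(s)
    return gaussian_copula_density(u, v, rho) * discharge.pdf(q) * surge.pdf(s)

print(joint_density(400.0, 0.9))
```

Maximizing `joint_density` over (q, s) would yield the "most likely compound event" scenario mentioned above; with rho = 0 the copula density is 1 everywhere and the variables are treated as independent.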

  6. Environmental hazard assessment of a marine mine tailings deposit site and potential implications for deep-sea mining.

    PubMed

    Mestre, Nélia C; Rocha, Thiago L; Canals, Miquel; Cardoso, Cátia; Danovaro, Roberto; Dell'Anno, Antonio; Gambi, Cristina; Regoli, Francesco; Sanchez-Vidal, Anna; Bebianno, Maria João

    2017-09-01

    Portmán Bay is a heavily contaminated area resulting from decades of metal mine tailings disposal, and is considered a suitable shallow-water analogue to investigate the potential ecotoxicological impact of deep-sea mining. Resuspension plumes were artificially created by removing the top layer of the mine tailings deposit by bottom trawling. Mussels were deployed at three sites: i) off the mine tailings deposit area; ii) on the mine tailings deposit beyond the influence of the resuspension plumes; iii) under the influence of the artificially generated resuspension plumes. Surface sediment samples were collected at the same sites for metal analysis and ecotoxicity assessment. Metal concentrations and a battery of biomarkers (oxidative stress, metal exposure, biotransformation and oxidative damage) were measured in different mussel tissues. The environmental hazard posed by the resuspension plumes was investigated by a quantitative weight of evidence (WOE) model that integrated all the data. The resuspension of sediments loaded with metal mine tailings demonstrated that chemical contaminants were released by trawling, subsequently inducing ecotoxicological impacts on mussel health. Considering as sediment quality guidelines (SQGs) those indicated in Spanish action level B for the disposal of dredged material at sea, the WOE model indicates that the hazard is slight off the mine tailings deposit, moderate on the mine tailings deposit without the influence of the resuspension plumes, and major under the influence of the resuspension plumes. The Portmán Bay mine tailings deposit is a by-product of sulphide mining, and despite differences in environmental setting, it can reflect the potential ecotoxic effects on marine fauna of the resuspension plumes created by deep-sea mining of polymetallic sulphides. A similar approach as in this study could be applied in other areas affected by sediment resuspension and for testing future deep-sea mining sites in

  7. Assessment of oil slick hazard and risk at vulnerable coastal sites.

    PubMed

    Melaku Canu, Donata; Solidoro, Cosimo; Bandelj, Vinko; Quattrocchi, Giovanni; Sorgente, Roberto; Olita, Antonio; Fazioli, Leopoldo; Cucco, Andrea

    2015-05-15

    This work assesses the hazard faced by the Sicilian coast from potential offshore surface oil spill events and provides a risk assessment for Sites of Community Importance (SCI) and Special Protection Areas (SPA). A Lagrangian module, coupled with a high-resolution, three-dimensional finite element hydrodynamic model, was used to track the ensemble of a large number of surface trajectories followed by particles released over six selected areas located inside the Sicily Channel. The analysis was carried out under multiple scenarios of meteorological conditions. Oil evaporation, oil weathering, and shore stranding are also considered. Seasonal hazard maps for different stranding times and seasonal risk maps were then produced for the whole Sicilian coastline. The results highlight that, depending on the meteo-marine conditions, particles can reach different areas of the Sicilian coast, including its northern side, and illustrate how impacts can be greatly reduced through prompt implementation of mitigation strategies. Copyright © 2015 Elsevier Ltd. All rights reserved.

  8. Critical asset and portfolio risk analysis: an all-hazards framework.

    PubMed

    Ayyub, Bilal M; McGill, William L; Kaminskiy, Mark

    2007-08-01

    This article develops a quantitative all-hazards framework for critical asset and portfolio risk analysis (CAPRA) that considers both natural and human-caused hazards. Following a discussion on the nature of security threats, the need for actionable risk assessments, and the distinction between asset and portfolio-level analysis, a general formula for all-hazards risk analysis is obtained that resembles the traditional model based on the notional product of consequence, vulnerability, and threat, though with clear meanings assigned to each parameter. Furthermore, a simple portfolio consequence model is presented that yields first-order estimates of interdependency effects following a successful attack on an asset. Moreover, depending on the needs of the decisions being made and available analytical resources, values for the parameters in this model can be obtained at a high level or through detailed systems analysis. Several illustrative examples of the CAPRA methodology are provided.
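    The notional product of threat, vulnerability, and consequence described above can be sketched directly; the asset values below are hypothetical and the portfolio sum deliberately ignores the interdependency effects the article models separately:

```python
def asset_risk(threat, vulnerability, consequence):
    """First-order all-hazards risk: expected annual loss for one asset.
    threat:        annual probability that the hazard event (or attack) occurs
    vulnerability: probability of damage given occurrence
    consequence:   loss given damage (e.g. dollars)
    """
    return threat * vulnerability * consequence

def portfolio_risk(assets):
    """Portfolio risk as a plain sum of per-asset risks (no interdependency)."""
    return sum(asset_risk(*a) for a in assets)

# hypothetical two-asset portfolio: (threat, vulnerability, consequence)
assets = [(0.01, 0.5, 2_000_000), (0.002, 0.9, 10_000_000)]
print(portfolio_risk(assets))
```

As the abstract notes, each parameter can be set at a high level, as here, or replaced by the output of a detailed systems analysis.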

  9. Assessment of seismic hazard in the North Caucasus

    NASA Astrophysics Data System (ADS)

    Ulomov, V. I.; Danilova, T. I.; Medvedeva, N. S.; Polyakova, T. P.; Shumilina, L. S.

    2007-07-01

    The seismicity of the North Caucasus is the highest in the European part of Russia. The detection of potential seismic sources here and long-term prediction of earthquakes are extremely important for the assessment of seismic hazard and seismic risk in this densely populated and industrially developed region of the country. The seismogenic structures of the Iran-Caucasus-Anatolia and Central Asia regions, adjacent to European Russia, are the subjects of this study. These structures are responsible for the specific features of regional seismicity and for the geodynamic interaction with adjacent areas of the Scythian and Turan platforms. The most probable potential sources of earthquakes with magnitudes M = 7.0 ± 0.2 and 7.5 ± 0.2 in the North Caucasus are located. The possible macroseismic effect of one of them is assessed.

  10. Regional coseismic landslide hazard assessment without historical landslide inventories: A new approach

    NASA Astrophysics Data System (ADS)

    Kritikos, Theodosios; Robinson, Tom R.; Davies, Tim R. H.

    2015-04-01

    Currently, regional coseismic landslide hazard analyses require comprehensive historical landslide inventories as well as detailed geotechnical data. Consequently, such analyses have not been possible where these data are not available. A new approach is proposed herein to assess coseismic landslide hazard at regional scale for specific earthquake scenarios in areas without historical landslide inventories. The proposed model employs fuzzy logic and geographic information systems to establish relationships between causative factors and coseismic slope failures in regions with well-documented and substantially complete coseismic landslide inventories. These relationships are then utilized to estimate the relative probability of landslide occurrence in regions with neither historical landslide inventories nor detailed geotechnical data. Statistical analyses of inventories from the 1994 Northridge and 2008 Wenchuan earthquakes reveal that shaking intensity, topography, and distance from active faults and streams are the main controls on the spatial distribution of coseismic landslides. Average fuzzy memberships for each factor are developed and aggregated to model the relative coseismic landslide hazard for both earthquakes. The predictive capabilities of the models are assessed and show good-to-excellent model performance for both events. These memberships are then applied to the 1999 Chi-Chi earthquake, using only a digital elevation model, active fault map, and isoseismal data, replicating prediction of a future event in a region lacking historic inventories and/or geotechnical data. This similarly results in excellent model performance, demonstrating the model's predictive potential and confirming it can be meaningfully applied in regions where previous methods could not. 
For such regions, this method may enable a greater ability to analyze coseismic landslide hazard from specific earthquake scenarios, allowing for mitigation measures and emergency response plans
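    The membership aggregation described above can be sketched as follows. The membership function shapes and the use of the fuzzy "gamma" operator (a standard compromise between the fuzzy product and the fuzzy algebraic sum) are illustrative assumptions, not the calibrated functions derived from the Northridge and Wenchuan inventories:

```python
import numpy as np

# Hypothetical membership functions mapping each causative factor to [0, 1].
def mu_slope(deg):      return np.clip((deg - 10) / 30.0, 0, 1)   # steeper -> higher
def mu_intensity(mmi):  return np.clip((mmi - 6) / 4.0, 0, 1)     # stronger shaking
def mu_fault_dist(km):  return np.clip(1 - km / 50.0, 0, 1)       # closer -> higher

def hazard_index(slope_deg, mmi, fault_km, gamma=0.9):
    """Aggregate per-factor memberships with the fuzzy gamma operator:
    product**(1-gamma) * algebraic_sum**gamma, yielding a relative
    coseismic landslide hazard index in [0, 1]."""
    mus = np.array([mu_slope(slope_deg), mu_intensity(mmi),
                    mu_fault_dist(fault_km)])
    product = np.prod(mus)
    alg_sum = 1.0 - np.prod(1.0 - mus)
    return product ** (1.0 - gamma) * alg_sum ** gamma

# steep slope, MMI IX shaking, 5 km from the causative fault
print(f"{hazard_index(35, 9, 5):.2f}")
```

Applied cell-by-cell over a DEM, isoseismal map, and fault map, this yields a relative hazard surface without requiring a local landslide inventory.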

  11. Undersampling power-law size distributions: effect on the assessment of extreme natural hazards

    USGS Publications Warehouse

    Geist, Eric L.; Parsons, Thomas E.

    2014-01-01

    The effect of undersampling on estimating the size of extreme natural hazards from historical data is examined. Tests using synthetic catalogs indicate that the tail of an empirical size distribution sampled from a pure Pareto probability distribution can range from having one to several unusually large events to appearing depleted, relative to the parent distribution. Both of these effects are artifacts caused by limited catalog length. It is more difficult to diagnose the artificially depleted empirical distributions, since one expects that a pure Pareto distribution is physically limited in some way. Using maximum likelihood methods and the method of moments, we estimate the power-law exponent and the corner size parameter of tapered Pareto distributions for several natural hazard examples: tsunamis, floods, and earthquakes. Each of these examples has varying catalog lengths and measurement thresholds, relative to the largest event sizes. In many cases where there are only several orders of magnitude between the measurement threshold and the largest events, joint two-parameter estimation techniques are necessary to account for estimation dependence between the power-law scaling exponent and the corner size parameter. Results indicate that whereas the corner size parameter of a tapered Pareto distribution can be estimated, its upper confidence bound cannot be determined and the estimate itself is often unstable with time. Correspondingly, one cannot statistically reject a pure Pareto null hypothesis using natural hazard catalog data. Although physical limits to the hazard source size and attenuation mechanisms from source to site constrain the maximum hazard size, historical data alone often cannot reliably determine the corner size parameter. Probabilistic assessments incorporating theoretical constraints on source size and propagation effects are preferred over deterministic assessments of extreme natural hazards based on historical data.
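    The joint two-parameter maximum likelihood estimation can be sketched for the tapered Pareto, whose density for x >= t (threshold t, exponent beta, corner size theta) is f(x) = (beta/x + 1/theta) * (t/x)**beta * exp((t - x)/theta). The synthetic catalog below stands in for real hazard data; it exploits the fact that a tapered Pareto variate is the minimum of a pure Pareto variate and an exponentially tapered one, since their survival functions multiply:

```python
import numpy as np
from scipy.optimize import minimize

def neg_loglik(params, x, t):
    """Negative log-likelihood of the tapered Pareto above threshold t."""
    beta, theta = params
    if beta <= 0 or theta <= 0:
        return np.inf
    return -np.sum(np.log(beta / x + 1.0 / theta)
                   + beta * np.log(t / x)
                   + (t - x) / theta)

rng = np.random.default_rng(42)
t, beta_true, theta_true = 1.0, 1.0, 100.0
u = 1.0 - rng.random(2000)                 # uniform on (0, 1]
y = t * u ** (-1.0 / beta_true)            # pure Pareto sample
z = t + rng.exponential(theta_true, 2000)  # exponential taper
x = np.minimum(y, z)                       # tapered Pareto catalog

fit = minimize(neg_loglik, x0=[0.5, 50.0], args=(x, t), method="Nelder-Mead")
beta_hat, theta_hat = fit.x
print(f"beta = {beta_hat:.2f}, corner size = {theta_hat:.1f}")
```

Repeating the fit on shorter catalogs illustrates the abstract's point: the exponent stabilizes quickly, while the corner-size estimate remains unstable.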

  12. Linking rainfall-induced landslides with debris flows runout patterns towards catchment scale hazard assessment

    NASA Astrophysics Data System (ADS)

    Fan, Linfeng; Lehmann, Peter; McArdell, Brian; Or, Dani

    2017-03-01

    Debris flows and landslides induced by heavy rainfall represent a ubiquitous and destructive natural hazard in steep mountainous regions. For debris flows initiated by shallow landslides, the prediction of the resulting pathways and associated hazard is often hindered by uncertainty in determining initiation locations, volumes, and the mechanical state of the mobilized debris (and by model parameterization). We propose a framework for linking a simplified physically-based debris flow runout model with a novel Landslide Hydro-mechanical Triggering (LHT) model to obtain a coupled landslide-debris flow susceptibility and hazard assessment. We first compared the simplified debris flow model of Perla (1980) with a state-of-the-art continuum-based model (RAMMS) and with the empirical model of Rickenmann (1999) at the catchment scale. The results indicate that runout distances predicted by the Perla model are in reasonable agreement with inventory measurements and with the other models. Predictions of localized shallow landslides by the LHT model provide information on the water content of the released mass. To incorporate the effects of water content and flow viscosity provided by LHT on debris flow runout, we adapted the Perla model. The proposed integral link between landslide triggering susceptibility quantified by LHT and subsequent debris flow runout hazard calculation using the adapted Perla model provides a spatially and temporally resolved framework for real-time hazard assessment at the catchment scale or along critical infrastructure (roads, railroad lines).
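    For a feel of the empirical benchmark mentioned above, the Rickenmann (1999) total-travel-distance relation is often quoted as L = 1.9 * V**0.16 * H**0.83; treat the coefficients here as an assumption and verify them against the original paper before any real use:

```python
def rickenmann_runout(volume_m3, elevation_drop_m):
    """Empirical total travel distance L (m) of a debris flow from its
    volume V (m^3) and elevation drop H (m), in the commonly cited form
    of Rickenmann (1999): L = 1.9 * V**0.16 * H**0.83.
    Coefficients are assumed, not verified against the source."""
    return 1.9 * volume_m3 ** 0.16 * elevation_drop_m ** 0.83

# hypothetical event: 10,000 m^3 released over a 500 m elevation drop
print(f"{rickenmann_runout(10_000, 500):.0f} m")
```

The weak volume exponent (0.16) is why runout is controlled far more by relief than by released volume, which is also why physically based models like the adapted Perla model add value where water content matters.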

  13. QUANTITATION OF MOLECULAR ENDPOINTS FOR THE DOSE-RESPONSE COMPONENT OF CANCER RISK ASSESSMENT

    EPA Science Inventory

    Cancer risk assessment involves the steps of hazard identification, dose-response assessment, exposure assessment and risk characterization. The rapid advances in the use of molecular biology approaches has had an impact on all four components, but the greatest overall current...

  14. Clinical Outcome of Degenerative Mitral Regurgitation: Critical Importance of Echocardiographic Quantitative Assessment in Routine Practice.

    PubMed

    Antoine, Clemence; Benfari, Giovanni; Michelena, Hector I; Malouf, Joseph F; Nkomo, Vuyisile T; Thapa, Prabin; Enriquez-Sarano, Maurice

    2018-05-31

    Background: Echocardiographic quantitation of degenerative mitral regurgitation (DMR) is recommended whenever possible in clinical guidelines, but it is criticized and its scalability to routine clinical practice doubted. We hypothesized that echocardiographic DMR quantitation, performed in routine clinical practice by multiple practitioners, independently predicts long-term survival and thus is essential to DMR management. Methods: We included patients diagnosed with isolated mitral valve prolapse in 2003-2011 and any degree of MR quantified by any physician/sonographer in routine clinical practice. Clinical/echocardiographic data acquired at diagnosis were retrieved electronically. The endpoint was mortality under medical treatment, analyzed by the Kaplan-Meier method and proportional-hazards models. Results: The cohort included 3914 patients (55% male) aged 62±17 years, with left ventricular ejection fraction (LVEF) 63±8% and routinely measured effective regurgitant orifice area (EROA) 19 [0-40] mm². During follow-up (6.7±3.1 years), 696 patients died under medical management and 1263 underwent mitral surgery. In multivariate analysis, routinely measured EROA was associated with mortality (adjusted hazard ratio 1.19 [1.13-1.24] per 10 mm², p<0.0001) independently of LVEF and end-systolic diameter, symptoms, and age/comorbidities. The association between routinely measured EROA and mortality persisted with competing-risk modeling (adjusted hazard ratio 1.15 [1.10-1.20] per 10 mm², p<0.0001), in patients without guideline-based Class I/II surgical triggers (adjusted hazard ratio 1.19 [1.10-1.28] per 10 mm², p<0.0001), and in all subgroups examined (all p<0.01). Spline curve analysis showed that, compared with general population mortality, excess mortality appears for moderate DMR (EROA ≥20 mm²), becomes notable at EROA ≥30 mm², and steadily increases with higher EROA levels above the 40 mm² threshold. Conclusions: Echocardiographic DMR quantitation is scalable to routine practice and is
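    The Kaplan-Meier method used for the mortality endpoint reduces to a simple product over event times; a minimal sketch on a toy cohort (the data below are illustrative, not from the study):

```python
import numpy as np

def kaplan_meier(time, event):
    """Minimal Kaplan-Meier survival estimator.
    time:  follow-up time for each patient
    event: 1 if death observed, 0 if censored at that time
    Returns (event times, survival probability just after each)."""
    time, event = np.asarray(time, float), np.asarray(event, int)
    times, surv, s = [], [], 1.0
    for t in np.unique(time):
        d = np.sum((time == t) & (event == 1))  # deaths at time t
        n = np.sum(time >= t)                   # at risk just before t
        if d > 0:
            s *= 1.0 - d / n                    # product-limit update
            times.append(t)
            surv.append(s)
    return np.array(times), np.array(surv)

# toy cohort: (years of follow-up, death indicator)
t, s = kaplan_meier([1, 2, 2, 3, 5, 8], [1, 1, 0, 1, 0, 1])
print(dict(zip(t, np.round(s, 3))))
```

Censored patients (event = 0) leave the risk set without forcing a survival step down, which is the essential difference from a naive death fraction; the hazard ratios quoted in the abstract come from proportional-hazards regression layered on top of this machinery.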

  15. A framework for organizing and selecting quantitative approaches for benefit-harm assessment.

    PubMed

    Puhan, Milo A; Singh, Sonal; Weiss, Carlos O; Varadhan, Ravi; Boyd, Cynthia M

    2012-11-19

    Several quantitative approaches for benefit-harm assessment of health care interventions exist but it is unclear how the approaches differ. Our aim was to review existing quantitative approaches for benefit-harm assessment and to develop an organizing framework that clarifies differences and aids selection of quantitative approaches for a particular benefit-harm assessment. We performed a review of the literature to identify quantitative approaches for benefit-harm assessment. Our team, consisting of clinicians, epidemiologists, and statisticians, discussed the approaches and identified their key characteristics. We developed a framework that helps investigators select quantitative approaches for benefit-harm assessment that are appropriate for a particular decisionmaking context. Our framework for selecting quantitative approaches requires a concise definition of the treatment comparison and population of interest, identification of key benefit and harm outcomes, and determination of the need for a measure that puts all outcomes on a single scale (which we call a benefit and harm comparison metric). We identified 16 quantitative approaches for benefit-harm assessment. These approaches can be categorized into those that consider single or multiple key benefit and harm outcomes, and those that use a benefit-harm comparison metric or not. Most approaches use aggregate data and can be used in the context of single studies or systematic reviews. Although the majority of approaches provides a benefit and harm comparison metric, only four approaches provide measures of uncertainty around the benefit and harm comparison metric (such as a 95 percent confidence interval). None of the approaches considers the actual joint distribution of benefit and harm outcomes, but one approach considers competing risks when calculating profile-specific event rates. Nine approaches explicitly allow incorporating patient preferences. The choice of quantitative approaches depends on the

  16. A framework for organizing and selecting quantitative approaches for benefit-harm assessment

    PubMed Central

    2012-01-01

    Background Several quantitative approaches for benefit-harm assessment of health care interventions exist but it is unclear how the approaches differ. Our aim was to review existing quantitative approaches for benefit-harm assessment and to develop an organizing framework that clarifies differences and aids selection of quantitative approaches for a particular benefit-harm assessment. Methods We performed a review of the literature to identify quantitative approaches for benefit-harm assessment. Our team, consisting of clinicians, epidemiologists, and statisticians, discussed the approaches and identified their key characteristics. We developed a framework that helps investigators select quantitative approaches for benefit-harm assessment that are appropriate for a particular decisionmaking context. Results Our framework for selecting quantitative approaches requires a concise definition of the treatment comparison and population of interest, identification of key benefit and harm outcomes, and determination of the need for a measure that puts all outcomes on a single scale (which we call a benefit and harm comparison metric). We identified 16 quantitative approaches for benefit-harm assessment. These approaches can be categorized into those that consider single or multiple key benefit and harm outcomes, and those that use a benefit-harm comparison metric or not. Most approaches use aggregate data and can be used in the context of single studies or systematic reviews. Although the majority of approaches provides a benefit and harm comparison metric, only four approaches provide measures of uncertainty around the benefit and harm comparison metric (such as a 95 percent confidence interval). None of the approaches considers the actual joint distribution of benefit and harm outcomes, but one approach considers competing risks when calculating profile-specific event rates. Nine approaches explicitly allow incorporating patient preferences. Conclusion The choice of

  17. Evaluation of Seismicity West of the Lut Block for Deterministic Seismic Hazard Assessment of Shahdad, Iran

    NASA Astrophysics Data System (ADS)

    Ney, B.; Askari, M.

    2009-04-01

    A deterministic seismic hazard assessment has been carried out for the city of Shahdad in eastern Iran, and four map sheets (Kerman, Bam, Nakhil Ab, and Allah Abad) have been prepared to indicate the deterministic estimate of peak ground acceleration (PGA) in this area. The assessment is based on the available geological, seismological, and geophysical information, and a seismic zoning map of the region has been constructed. First, a seismotectonic map of the study region within a radius of 100 km was prepared using geological maps, the distribution of historical and instrumental earthquake data, and focal mechanism solutions; it was used as the base map for delineation of potential seismic sources. The minimum distance from each seismic source to the site (Shahdad) and the maximum magnitude for each source were then determined. According to the results, the peak ground acceleration at Shahdad, estimated using the Fukushima & Tanaka (1990) attenuation relationship, is 0.58 g, associated with movement of the Nayband fault at a distance of 2.4 km from the site and a maximum magnitude Ms = 7.5.
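    The quoted PGA can be checked against the commonly quoted form of the Fukushima & Tanaka (1990) attenuation relation, log10 A = 0.41*Ms - log10(R + 0.032*10**(0.41*Ms)) - 0.0034*R + 1.30 with A in cm/s²; treat these coefficients as an assumption to be verified against the original paper:

```python
import math

def pga_fukushima_tanaka(ms, r_km):
    """Peak ground acceleration (in g) from the Fukushima & Tanaka (1990)
    attenuation relation in its commonly quoted form (A in cm/s^2);
    coefficients assumed, not verified against the source."""
    log_a = (0.41 * ms
             - math.log10(r_km + 0.032 * 10 ** (0.41 * ms))
             - 0.0034 * r_km
             + 1.30)
    return 10 ** log_a / 981.0   # convert cm/s^2 to g

# the scenario from the abstract: Ms 7.5 on the Nayband fault at 2.4 km
print(f"{pga_fukushima_tanaka(7.5, 2.4):.2f} g")
```

With Ms = 7.5 and R = 2.4 km this reproduces the abstract's 0.58 g estimate to within rounding, which supports the quoted coefficient set.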

  18. Capturing spatiotemporal variation in wildfires for improving postwildfire debris-flow hazard assessments: Chapter 20

    USGS Publications Warehouse

    Haas, Jessica R.; Thompson, Matthew P.; Tillery, Anne C.; Scott, Joe H.

    2017-01-01

    Wildfires can increase the frequency and magnitude of catastrophic debris flows. Integrated, proactive natural hazard assessment would therefore characterize landscapes based on the potential for the occurrence and interactions of wildfires and postwildfire debris flows. This chapter presents a new modeling effort that can quantify the variability surrounding a key input to postwildfire debris-flow modeling, the amount of watershed burned at moderate to high severity, in a prewildfire context. The use of stochastic wildfire simulation captures variability surrounding the timing and location of ignitions, fire weather patterns, and ultimately the spatial patterns of watershed area burned. Model results provide for enhanced estimates of postwildfire debris-flow hazard in a prewildfire context, and multiple hazard metrics are generated to characterize and contrast hazards across watersheds. Results can guide mitigation efforts by allowing planners to identify which factors may be contributing the most to the hazard rankings of watersheds.

  19. Coastal flooding hazard assessment on potentially vulnerable coastal sectors at Varna regional coast

    NASA Astrophysics Data System (ADS)

    Eftimova, Petya; Valchev, Nikolay; Andreeva, Nataliya

    2017-04-01

    Storm-induced flooding is one of the most significant threats that coastal communities face, and in light of climate change it is expected to gain even more importance. Therefore, adequate assessment of this hazard could increase the capability to mitigate environmental, social, and economic impacts. The study was accomplished within the framework of the Coastal Risk Assessment Framework (CRAF) developed within the FP7 RISC-KIT Project (Resilience-Increasing Strategies for Coasts - toolkit). The hazard assessment was applied to three potentially vulnerable coastal sectors located on the regional coast of Varna, on the Bulgarian Black Sea coast. The potential "hotspot" candidates were selected during the initial phase of CRAF, which evaluated the coastal risks at regional level. The area of interest comprises different coastal types, from natural beaches and rocky cliffs to man-modified environments represented by coastal and port defense structures such as the Varna Port breakwater, groynes, jetties, and beaches formed by the presence of coastal structures. The assessment of coastal flooding was done using a combination of models, the XBeach model and the LISFLOOD inundation model, applied consecutively. The XBeach model was employed to calculate the hazard intensities at the coast up to the berm crest, while the LISFLOOD model was used to calculate the intensity and extent of flooding in the hinterland. At the first stage, 75 extreme storm events were simulated using the XBeach model run in "non-hydrostatic" mode to obtain series of flood depth, depth-velocity, and overtopping discharges at the predefined coastal cross-shore transects. Extreme value analysis was applied to the calculated hazard parameter series in order to determine their probability distribution functions. This is the so-called response approach, which focuses on the onshore impact rather than on the deep-water boundary conditions. It allows calculation of the probability distribution of hazard extremes induced by a
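    The extreme value analysis step can be sketched by fitting a generalized extreme value (GEV) distribution to a simulated hazard parameter series and reading off return levels; the synthetic depths below are placeholder data, not the study's XBeach output:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# stand-in for the 75 simulated storm events: flood depths at one transect (m)
depths = stats.gumbel_r(loc=0.8, scale=0.3).rvs(75, random_state=rng)

# fit a GEV distribution to the hazard parameter series
shape, loc, scale = stats.genextreme.fit(depths)
gev = stats.genextreme(shape, loc, scale)

def return_level(return_period_years):
    """Depth exceeded on average once per return period (annual maxima)."""
    return gev.ppf(1.0 - 1.0 / return_period_years)

print(f"100-year flood depth: {return_level(100):.2f} m")
```

Because this is applied to the onshore response (depth, depth-velocity, overtopping) rather than to offshore boundary conditions, the resulting return levels directly describe the hazard in the hinterland.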

  20. Climatic Changes and Consequences on the French West Indies (C3AF), Hurricane and Tsunami Hazards Assessment

    NASA Astrophysics Data System (ADS)

    Arnaud, G.; Krien, Y.; Zahibo, N.; Dudon, B.

    2017-12-01

    Coastal hazards are among the most worrying threats of our time. In a context of climate change coupled with a large population increase, tropical areas could be the most exposed zones of the globe. In such circumstances, understanding the underlying processes can help to better predict storm surges and the associated risks. Here we present the preliminary results of a multidisciplinary project focused on the effects of climate change on coastal threats in the French West Indies, funded by the European Regional Development Fund. The study aims to provide a coastal hazard assessment based on hurricane surge and tsunami modeling, including several aspects of climate change that can affect hazards, such as sea level rise, crustal subsidence/uplift, and coastline changes. Several tsunami scenarios have been simulated, including tele-tsunamis, to cover a large range of tsunami hazards. Hurricane surge levels have been calculated using a large number of synthetic hurricanes covering the current and forecasted climate over the tropical Atlantic Ocean. This hazard assessment will later be coupled with stakes assessed over the territory to provide risk maps.

  1. KSC VAB Aeroacoustic Hazard Assessment

    NASA Technical Reports Server (NTRS)

    Oliveira, Justin M.; Yedo, Sabrina; Campbell, Michael D.; Atkinson, Joseph P.

    2010-01-01

    NASA Kennedy Space Center (KSC) carried out an analysis of the aeroacoustic effects produced by stationary solid rocket motors in processing areas at KSC. In the current paper, attention is directed toward the acoustic effects of a motor burning within the Vehicle Assembly Building (VAB). The analysis was carried out with support from ASRC Aerospace, which modeled transmission effects into surrounding facilities. Calculations were done using semi-analytical models for both aeroacoustics and transmission. From the results it was concluded that acoustic hazards in proximity to the ignition source and plume can be severe, while acoustic hazards in the far field are significantly lower.

  2. Final Report: Seismic Hazard Assessment at the PGDP

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Zhinmeng

    2007-06-01

    Selecting a level of seismic hazard at the Paducah Gaseous Diffusion Plant for policy considerations and engineering design is not an easy task because it depends not only on seismic hazard, but also on seismic risk and other related environmental, social, and economic issues. Seismic hazard is the main focus here. There is no question that there are seismic hazards at the Paducah Gaseous Diffusion Plant because of its proximity to several known seismic zones, particularly the New Madrid Seismic Zone. The issues in estimating seismic hazard are (1) the methods being used and (2) the difficulty of characterizing the uncertainties of seismic sources, earthquake occurrence frequencies, and ground-motion attenuation relationships. This report summarizes how the input data were derived, which methodologies were used, and what the hazard estimates at the Paducah Gaseous Diffusion Plant are.

  3. From tsunami hazard assessment to risk management in Guadeloupe (F.W.I.)

    NASA Astrophysics Data System (ADS)

    Zahibo, Narcisse; Dudon, Bernard; Krien, Yann; Arnaud, Gaël; Mercado, Aurelio; Roger, Jean

    2017-04-01

    The Caribbean region is prone to numerous natural hazards such as earthquakes, landslides, storm surges, tsunamis, coastal erosion and hurricanes. All these threats may cause great human and economic losses and are thus of prime interest for applied research. One of the main challenges for the scientific community is to conduct state-of-the-art research to assess hazards and share the results with coastal planners and decision makers so that they can regulate land use and develop mitigation strategies. We present here the results of a collaborative scientific project between Guadeloupe and Puerto Rico which aimed at bringing decision-making support to the authorities regarding tsunami hazards. This project led us to build a database of potential extreme events and to study their impacts on Guadeloupe in order to investigate storm surge and tsunami hazards. The results were used by local authorities to develop safeguarding and mitigation measures in coastal areas. This project is thus a good example of the benefits of inter-Caribbean scientific collaboration for natural risk management.

  4. Geomorphologic flood-hazard assessment of alluvial fans and piedmonts

    USGS Publications Warehouse

    Field, J.J.; Pearthree, P.A.

    1997-01-01

    Geomorphologic studies are an excellent means of flood-hazard assessment on alluvial fans and piedmonts in the southwestern United States. Inactive, flood-free alluvial fans display well-developed soils, desert pavement, rock varnish, and tributary drainage networks. These areas are easily distinguished from flood-prone active alluvial fans on aerial photographs and in the field. The distribution of flood-prone areas associated with alluvial fans is strongly controlled by fanhead trenches dissecting the surface. Where fanhead trenches are permanent features cut in response to long-term conditions such as tectonic quiescence, flood-prone surfaces are situated down-slope from the mountain front and their positions are stable for thousands of years. Since the length and permanency of fanhead trenches can vary greatly between adjacent drainages, it is not appropriate to use regional generalizations to evaluate the distribution and stability of flood-hazard zones. Site-specific geomorphologic studies must be carried out if piedmont areas with a high risk of flooding are to be correctly identified and losses due to alluvial-fan flooding minimized. To meet the growing demand for trained professionals to complete geomorphologic maps of desert piedmonts, undergraduate and graduate geomorphology courses should adopt an instructional unit on alluvial-fan flood hazards that includes: 1) a review of the geomorphologic characteristics that vary with surface age; 2) a basic mapping exercise; and 3) a discussion of the causes of fanhead trenching.

  5. Assessing quantities and disposal routes for household hazardous products in the United Kingdom.

    PubMed

    Slack, Rebecca J; Zerva, Panagoula; Gronow, Jan R; Voulvoulis, Nikolaos

    2005-03-15

    The disposal of household products containing hazardous substances (household hazardous wastes; HHW) is of concern due to possible health and environmental effects as a consequence of environmental pollution. The potential risks of disposal are proportional to the amounts of products used and waste generated, but much of the data relating to quantities are old, inconsistent, or nonexistent; hence, full-scale risk assessment is not yet feasible. This pilot study aimed at an initial assessment of the amounts of hazardous products used or stored within the household and the potential disposal routes. Representatives of 400 households from southeast England were interviewed about socio-demographic factors, perception of the risks associated with the use and disposal of hazardous waste generated in households, quantities of particular products currently in use or stored within the household, and times and methods of disposal of such products. The estimates of quantities obtained were compared with sales figures and waste estimates to improve understanding of product flow through to the HHW stream. The disposal routes investigated demonstrated that most householders claim to use the entire product prior to disposal in the general refuse bin. The relationship with socio-demographic factors demonstrated differences with neighborhood size and length of residence in a household with regard to the product quantities possessed and the disposal habits adopted.

  6. Tsunami Hazard, Vulnerability and Risk assessment for the coast of Oman

    NASA Astrophysics Data System (ADS)

    Gonzalez, Mauricio; Aniel-Quiroga, Íñigo; Aguirre-Ayerbe, Ignacio; Álvarez-Gómez, José Antonio; Martínez, Jara; Gonzalez-Riancho, Pino; Fernandez, Felipe; Medina, Raúl; Al-Yahyai, Sultan

    2016-04-01

    Tsunamis are relatively infrequent phenomena representing a greater threat than earthquakes, hurricanes and tornadoes, causing the loss of thousands of human lives and extensive damage to coastal infrastructure around the world. Advances in the understanding and prediction of tsunami impacts allow the development of new methodologies in this field. This work presents the methodology followed in developing the tsunami hazard, vulnerability and risk assessment for the coast of Oman, including maps containing the results of the process. Oman is located in the southeastern corner of the Arabian Peninsula and of the Arabian plate, facing the Makran Subduction Zone (MSZ), which is the major source of earthquakes at the eastern border of the Arabian plate and Oman (Al-Shaqsi, 2012). There are at least three historical tsunamis assigned to a seismic origin in the MSZ (Heidarzadeh et al., 2008; Jordan, 2008). These events show the high potential for tsunami generation of the MSZ, one of the most tsunamigenic zones in the Indian Ocean. For the tsunami hazard assessment, worst potential cases have been selected, as well as the historical case of 1945, when a magnitude 8.1 earthquake generated a tsunami affecting the coast of Oman and causing 4,000 casualties in the countries of the area. These scenarios have been computationally simulated to obtain tsunami hazard maps, including flooding maps. The calculations have been carried out at national and local scales, in 9 municipalities along the coast of Oman, including the cities of Sohar, Wudam, Sawadi, Muscat, Quriyat, Sur, Masirah, Al Duqm, and Salalah. Using the hazard assessment as input, this work also presents an integrated framework for the tsunami vulnerability and risk assessment carried out in the Sultanate of Oman. This framework considers different dimensions (human, structural) and is developed at two different spatial resolutions, national and local scale. The national

  7. Index based regional vulnerability assessment to cyclones hazards of coastal area of Bangladesh

    NASA Astrophysics Data System (ADS)

    Mohammad, Q. A.; Kervyn, M.; Khan, A. U.

    2016-12-01

    Cyclones, storm surges, coastal flooding, salinity intrusion, tornadoes, nor'westers, and thunderstorms are the listed natural hazards in the coastal areas of Bangladesh. Bangladesh was hit by devastating cyclones in 1970, 1991, 2007, 2009, and 2016. The intensity and frequency of natural hazards in the coastal area are likely to increase in the future due to climate change. Risk assessment is one of the most important steps of disaster risk reduction. As a climate change victim nation, Bangladesh claims compensation from the Green Climate Fund and has also created its own climate funds. It is therefore very important to assess the vulnerability of the coast of Bangladesh to natural hazards for efficient allocation of financial investment to support national risk reduction. This study aims at identifying the spatial variations in factors contributing to the vulnerability of the coastal inhabitants of Bangladesh to natural hazards. An exploratory factor analysis method has been used to assess vulnerability at each local administrative unit. The initially selected 141 socio-economic indicators were reduced to 41 by converting some of them to meaningful, widely accepted indicators and removing highly correlated indicators. Principal component analysis further reduced the 41 indicators to 13 dimensions, which explained 79% of the total variation. The PCA dimensions show three types of characteristics that may lead people towards vulnerability: (a) demographics, education and job opportunities, (b) access to basic needs and facilities, and (c) special-needs people. Vulnerability maps of the study area have been prepared by weighted overlay of the dimensions. The study revealed that 29 and 8 percent of the total coastal area are very highly and highly vulnerable to natural hazards, respectively. These areas are distributed along the sea boundary and major rivers. Comparison of this spatial distribution with the capacities to face disaster shows that highly vulnerable areas are well covered by cyclone
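
    The weighted-overlay step described above can be illustrated with a small sketch: normalized dimension scores per administrative unit are combined with weights and classed into vulnerability levels. All scores, weights, and class breaks below are hypothetical, not the study's actual values.

```python
def minmax(values):
    """Normalize a list of scores to the [0, 1] range."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) if hi > lo else 0.0 for v in values]

def vulnerability_index(scores_by_dim, weights):
    """Weighted overlay: composite score per unit from normalized dimensions."""
    norm = [minmax(dim) for dim in scores_by_dim]
    n_units = len(scores_by_dim[0])
    return [sum(w * norm[d][u] for d, w in enumerate(weights))
            for u in range(n_units)]

def classify(score):
    """Hypothetical class breaks for the composite score."""
    if score >= 0.75: return "very high"
    if score >= 0.50: return "high"
    if score >= 0.25: return "moderate"
    return "low"

# Hypothetical scores for 4 administrative units on 3 reduced dimensions
dims = [[0.2, 0.8, 0.5, 0.9], [0.1, 0.7, 0.4, 0.95], [0.3, 0.6, 0.2, 0.85]]
weights = [0.5, 0.3, 0.2]   # illustrative weights, summing to 1
composite = vulnerability_index(dims, weights)
labels = [classify(s) for s in composite]
```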

  8. Use of cloud computing technology in natural hazard assessment and emergency management

    NASA Astrophysics Data System (ADS)

    Webley, P. W.; Dehn, J.

    2015-12-01

    During a natural hazard event, the most up-to-date data need to be in the hands of those on the front line. Decision support tools can be developed to provide access to pre-made outputs to quickly assess the hazard and potential risk. However, with the ever-growing availability of new satellite data, as well as ground and airborne data generated in real time, there is a need to analyze the large volumes of data in an easy-to-access and effective environment. With the growth in the use of cloud computing, where the analysis and visualization system can grow with the needs of the user, these facilities can be used to provide this real-time analysis. Think of a central command center uploading the data to the cloud computing system, and researchers in the field connecting to a web-based tool to view the newly acquired data. New data can be added by any user and then viewed instantly by anyone else in the organization through the cloud computing interface. This provides an ideal tool for collaborative data analysis, hazard assessment and decision making. We present the rationale for developing a cloud computing system and illustrate how this tool can be developed for use in real-time environments. Users would have access to an interactive online image analysis tool without the need for specific remote sensing software on their local systems, thereby increasing their understanding of the ongoing hazard and helping mitigate its impact on the surrounding region.

  9. An Approach for Rapid Assessment of Seismic Hazards in Turkey by Continuous GPS Data

    PubMed Central

    Ozener, Haluk; Dogru, Asli; Unlutepe, Ahmet

    2009-01-01

    The Earth is being monitored every day by all kinds of sensors, which nowadays leads to an overflow of data in all branches of science, especially the Earth sciences. Data storage and data processing are problems to be solved by current technologies, as is accessing and analyzing these large data sources. Once solutions have been created for collecting, storing and accessing data, the challenge becomes how to effectively share data, applications and processing resources across many locations. Global Positioning System (GPS) sensors are used as geodetic instruments to precisely detect crustal motion of the Earth's surface. Rapid access to data provided by GPS sensors is becoming increasingly important for deformation monitoring and rapid hazard assessment. Today, reliable and fast collection and distribution of data is a challenge, and advances in Internet technologies have made it easier to provide the needed data. This study describes a system that will be able to generate strain maps using data from continuous GPS stations for seismic hazard analysis. Strain rates are a key factor in seismic hazard analyses. Turkey is a country prone to earthquakes, with a long history of seismic hazards and disasters. This situation has led Earth scientists to focus their studies on Turkey in order to improve understanding of the Earth's crustal structure and seismic hazards. Nevertheless, the construction of models, data access and analysis are often not as fast as expected, but the combination of Internet technologies with continuous GPS sensors can be a solution to this problem. This system would have the potential to answer many important questions for assessing seismic hazards, such as how much stretching, squashing and shearing is taking place in different parts of Turkey, and how velocities change from place to place. Seismic hazard estimation is the most effective way to reduce earthquake losses.
It is clear that reliability
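
    As a rough illustration of how continuous GPS velocities translate into strain rates, the sketch below computes the extension rate along the baseline between two stations. The station positions and velocities are hypothetical; a real strain-map system would solve for the full 2D strain-rate tensor across many stations.

```python
import math

def baseline_strain_rate(p1, v1, p2, v2):
    """Extension rate (strain per year) along the baseline between two stations.

    p: position (east, north) in km; v: velocity (east, north) in mm/yr.
    Positive means lengthening (extension); negative means shortening.
    """
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    length_km = math.hypot(dx, dy)
    length_mm = length_km * 1e6                   # km -> mm
    ux, uy = dx / length_km, dy / length_km       # unit vector along baseline
    dv_along = (v2[0] - v1[0]) * ux + (v2[1] - v1[1]) * uy
    return dv_along / length_mm

# Hypothetical stations 100 km apart converging at 20 mm/yr
rate = baseline_strain_rate((0, 0), (0, 0), (100, 0), (-20, 0))
# rate is negative: the baseline is shortening (compression)
```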

  10. Introductory Talk: Adding Up Chemicals: Component-Based Risk Assessment of Chemical Mixtures and Primary Talk: Grouping Chemicals for Assessment and Conducting Assessments with the Hazard Index and Related Methods

    EPA Science Inventory

    Dr. Simmons will provide a concise overview of established and emerging methods to group chemicals for component-based mixture risk assessments. This will be followed by introduction to several important component-based methods, the Hazard Index, Target Organ Hazard Index, Multi...

  11. Integrating Hazardous Materials Characterization and Assessment Tools to Guide Pollution Prevention in Electronic Products and Manufacturing

    NASA Astrophysics Data System (ADS)

    Lam, Carl

    Due to technology proliferation, the environmental burden attributed to the production, use, and disposal of hazardous materials in electronics has become a worldwide concern. The major theme of this dissertation is to develop and apply hazardous materials assessment tools to systematically guide pollution prevention opportunities in the context of electronic product design, manufacturing and end-of-life waste management. To this end, a comprehensive review is first provided describing hazard traits and current assessment methods for evaluating hazardous materials. As a case study at the manufacturing level, life cycle impact assessment (LCIA)-based and risk-based screening methods are used to quantify chemical and geographic environmental impacts in the U.S. printed wiring board (PWB) industry. Results from this industrial assessment clarify the priority waste streams and states where impact can be most effectively mitigated. With further knowledge of PWB manufacturing processes, selected alternative chemical processes (e.g., spent copper etchant recovery) and material options (e.g., lead-free etch resist) are discussed. In addition, an investigation of technology transition effects for computers and televisions in the U.S. market is performed by linking dynamic materials flow and environmental assessment models. The analysis forecasts the quantities of waste units generated and maps shifts in environmental impact potentials associated with metal composition changes due to product substitutions. This insight is important for understanding the timing and quantities of waste expected, and the emerging toxic elements that need to be addressed as a consequence of technology transition. At the product level, electronic utility meter devices are evaluated to eliminate hazardous materials within product components. Development and application of a component Toxic Potential Indicator (TPI) assessment methodology highlights priority components requiring material alternatives. Alternative

  12. Developing a global tsunami propagation database and its application for coastal hazard assessments in China

    NASA Astrophysics Data System (ADS)

    Wang, N.; Tang, L.; Titov, V.; Newman, J. C.; Dong, S.; Wei, Y.

    2013-12-01

    The tragedies of the 2004 Indian Ocean and 2011 Japan tsunamis have increased awareness of tsunami hazards for many nations, including China. The low land level and high population density of China's coastal areas place it at high risk for tsunami hazards. Recent research (Komatsubara and Fujiwara, 2007) highlighted concerns of a magnitude 9.0 earthquake on the Nankai trench, which may affect China's coasts not only in the South China Sea, but also in the East China Sea and the Yellow Sea. Here we present our work in progress towards developing a global tsunami propagation database that can be used for hazard assessments by many countries. The propagation scenarios are computed using NOAA's MOST numerical model. Each scenario represents a typical Mw 7.5 earthquake with predefined earthquake parameters (Gica et al., 2008). The model grid was interpolated from ETOPO1 at 4 arc-min resolution, covering -80° to 72°N and 0° to 360°E. We use this database for a preliminary tsunami hazard assessment along China's coastlines.

  13. Towards quantitative condition assessment of biodiversity outcomes: Insights from Australian marine protected areas.

    PubMed

    Addison, Prue F E; Flander, Louisa B; Cook, Carly N

    2017-08-01

    Protected area management effectiveness (PAME) evaluation is increasingly undertaken to evaluate governance, assess conservation outcomes and inform evidence-based management of protected areas (PAs). Within PAME, quantitative approaches to assessing biodiversity outcomes are now emerging, in which biological monitoring data are directly assessed against quantitative (numerically defined) condition categories (termed quantitative condition assessments). More commonly, however, qualitative condition assessments are employed in PAME; these use descriptive condition categories and are evaluated largely with expert judgement, which can be subject to a range of biases, such as linguistic uncertainty and overconfidence. Despite the benefits of increased transparency and repeatability of evaluations, quantitative condition assessments are rarely used in PAME. To understand why, we interviewed practitioners from all Australian marine protected area (MPA) networks, which have access to long-term biological monitoring data and are developing or conducting PAME evaluations. Our research revealed that there is a desire within management agencies to implement quantitative condition assessment of biodiversity outcomes in Australian MPAs. However, practitioners report many challenges in transitioning from qualitative to quantitative condition assessments of biodiversity outcomes, which are hampering progress. Challenges include a lack of agency capacity (staff numbers and money), knowledge gaps, and diminishing public and political support for PAs. We point to opportunities for targeted strategies that will assist agencies in overcoming these challenges, including new decision support tools, approaches to better finance conservation efforts, and promotion of more management-relevant science. While a single solution is unlikely to achieve fully evidence-based conservation, we suggest ways for agencies to target strategies and advance PAME evaluations toward best practice.

  14. Assessment of the Acute and Chronic Health Hazards of Hydraulic Fracturing Fluids.

    PubMed

    Wattenberg, Elizabeth V; Bielicki, Jeffrey M; Suchomel, Ashley E; Sweet, Jessica T; Vold, Elizabeth M; Ramachandran, Gurumurthy

    2015-01-01

    There is growing concern about how hydraulic fracturing affects public health, because this activity involves handling large volumes of fluids that contain toxic and carcinogenic constituents, which are injected under high pressure through wells into the subsurface to release oil and gas from tight shale formations. The constituents of hydraulic fracturing fluids (HFFs) present occupational health risks, because workers may be directly exposed to them, and general public health risks because of potential air and water contamination. Hazard identification, which focuses on the types of toxicity that substances may cause, is an important step in the complex health risk assessment of hydraulic fracturing. This article presents a practical and adaptable tool for the hazard identification of HFF constituents, and its use in the analysis of HFF constituents reported to be used in 2,850 wells in North Dakota between December 2009 and November 2013. Of the 569 reported constituents, 347 could be identified by a Chemical Abstracts Service Registry Number (CASRN) and matching constituent name. The remainder could not be identified, either because of trade secret labeling (210) or because of an invalid CASRN (12). Eleven public databases were searched for health hazard information on thirteen health hazard endpoints for the 168 identifiable constituents that had at least 25 reports of use. Health hazard counts were generated for chronic and acute endpoints, including those associated with oral, inhalation, ocular, and dermal exposure. Eleven of the constituents listed in the top 30 by total health hazard count were also listed in the top 30 by reports of use. This includes naphthalene, which, along with benzyl chloride, has the highest health hazard count. The top 25 constituents reportedly used in North Dakota largely overlap with those reported for Texas and Pennsylvania, despite different geologic formations, target resources (oil vs. gas), and disclosure requirements
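
    The hazard-count tally described above (one count per hazard-endpoint hit per constituent) can be sketched as follows. The records here are hypothetical stand-ins for database query results, not the study's actual data.

```python
from collections import Counter

# Hypothetical (constituent, endpoint) hits found across hazard databases
hits = [
    ("naphthalene", "carcinogenicity"), ("naphthalene", "inhalation toxicity"),
    ("naphthalene", "aquatic toxicity"), ("benzyl chloride", "carcinogenicity"),
    ("benzyl chloride", "dermal toxicity"), ("methanol", "oral toxicity"),
]

def hazard_counts(records):
    """Total health-hazard count per constituent (one count per endpoint hit)."""
    return Counter(constituent for constituent, _ in records)

counts = hazard_counts(hits)
ranked = counts.most_common()   # constituents ranked by total hazard count
```

    Cross-referencing such a ranking against a ranking by reports of use is what surfaces high-priority constituents like naphthalene.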

  15. Evaluating the Use of Declustering for Induced Seismicity Hazard Assessment

    NASA Astrophysics Data System (ADS)

    Llenos, A. L.; Michael, A. J.

    2016-12-01

    The recent dramatic seismicity rate increase in the central and eastern US (CEUS) has motivated the development of seismic hazard assessments for induced seismicity (e.g., Petersen et al., 2016). Standard probabilistic seismic hazard assessment (PSHA) relies fundamentally on the assumption that seismicity is Poissonian (Cornell, BSSA, 1968); therefore, the earthquake catalogs used in PSHA are typically declustered (e.g., Petersen et al., 2014), even though this may remove earthquakes that can cause damage or concern (Petersen et al., 2015; 2016). In some induced earthquake sequences in the CEUS, standard declustering can remove up to 90% of the sequence, reducing the estimated seismicity rate by a factor of 10 compared to estimates from the complete catalog; in tectonic regions the reduction is often only about a factor of 2. We investigate how three declustering methods treat induced seismicity: the window-based Gardner-Knopoff (GK) algorithm, often used for PSHA (Gardner and Knopoff, BSSA, 1974); the link-based Reasenberg algorithm (Reasenberg, JGR, 1985); and a stochastic declustering method based on a space-time Epidemic-Type Aftershock Sequence model (Ogata, JASA, 1988; Zhuang et al., JASA, 2002). We apply these methods to three catalogs that likely contain some induced seismicity. For the Guy-Greenbrier, AR earthquake swarm from 2010-2013, declustering reduces the seismicity rate by factors of 6-14, depending on the algorithm. In northern Oklahoma and southern Kansas from 2010-2015, the reduction varies from factors of 1.5-20. In the Salton Trough of southern California from 1975-2013, the rate is reduced by factors of 3-20. Stochastic declustering tends to remove the most events, followed by the GK method, while the Reasenberg method removes the fewest.
Given that declustering and choice of algorithm have such a large impact on the resulting seismicity rate estimates, we suggest that more accurate hazard assessments may be found using the complete catalog.
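
    As an illustration of the window-based approach, a simplified Gardner-Knopoff-style declustering can be sketched as below. The window formulas follow commonly cited fits to the Gardner-Knopoff (1974) windows; this toy version removes only smaller events occurring after a larger one within its space-time window, whereas production implementations also handle foreshocks and other details.

```python
import math

def gk_windows(mag):
    """Approximate Gardner-Knopoff windows: distance (km) and time (days)."""
    dist = 10 ** (0.1238 * mag + 0.983)
    if mag >= 6.5:
        time = 10 ** (0.032 * mag + 2.7389)
    else:
        time = 10 ** (0.5409 * mag - 0.547)
    return dist, time

def decluster(catalog):
    """Remove events falling inside the window of a larger, earlier event.

    catalog: list of (t_days, x_km, y_km, mag), assumed sorted by time.
    Returns the declustered (mainshock-only) catalog.
    """
    keep = [True] * len(catalog)
    for i, (ti, xi, yi, mi) in enumerate(catalog):
        if not keep[i]:
            continue
        dmax, tmax = gk_windows(mi)
        for j, (tj, xj, yj, mj) in enumerate(catalog):
            if (j != i and keep[j] and mj < mi
                    and 0 <= tj - ti <= tmax
                    and math.hypot(xj - xi, yj - yi) <= dmax):
                keep[j] = False
    return [ev for ev, k in zip(catalog, keep) if k]

# Toy catalog: an M5.0 mainshock, an aftershock the next day 5 km away,
# and an unrelated M4.0 event 500 days later and 200 km away
catalog = [(0.0, 0.0, 0.0, 5.0), (1.0, 3.0, 4.0, 3.0), (500.0, 200.0, 0.0, 4.0)]
declustered = decluster(catalog)   # the aftershock is removed
```

    Comparing `len(declustered)` to `len(catalog)` for an induced sequence gives exactly the kind of rate-reduction factor discussed above.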

  16. Rockfall Hazard Process Assessment : Final Project Report

    DOT National Transportation Integrated Search

    2017-10-01

    After a decade of using the Rockfall Hazard Rating System (RHRS), the Montana Department of Transportation (MDT) sought a reassessment of their rockfall hazard evaluation process. Their prior system was a slightly modified version of the RHRS and was...

  17. Approaches to advancing quantitative human health risk assessment of environmental chemicals in the post-genomic era.

    PubMed

    Chiu, Weihsueh A; Euling, Susan Y; Scott, Cheryl Siegel; Subramaniam, Ravi P

    2013-09-15

    The contribution of genomics and associated technologies to human health risk assessment for environmental chemicals has focused largely on elucidating mechanisms of toxicity, as discussed in other articles in this issue. However, there is interest in moving beyond hazard characterization to making more direct impacts on quantitative risk assessment (QRA)--i.e., the determination of toxicity values for setting exposure standards and cleanup values. We propose that the evolution of QRA of environmental chemicals in the post-genomic era will involve three, somewhat overlapping phases in which different types of approaches begin to mature. The initial focus (in Phase I) has been and continues to be on "augmentation" of weight of evidence--using genomic and related technologies qualitatively to increase the confidence in and scientific basis of the results of QRA. Efforts aimed towards "integration" of these data with traditional animal-based approaches, in particular quantitative predictors, or surrogates, for the in vivo toxicity data to which they have been anchored are just beginning to be explored now (in Phase II). In parallel, there is a recognized need for "expansion" of the use of established biomarkers of susceptibility or risk of human diseases and disorders for QRA, particularly for addressing the issues of cumulative assessment and population risk. Ultimately (in Phase III), substantial further advances could be realized by the development of novel molecular and pathway-based biomarkers and statistical and in silico models that build on anticipated progress in understanding the pathways of human diseases and disorders. Such efforts would facilitate a gradual "reorientation" of QRA towards approaches that more directly link environmental exposures to human outcomes. Published by Elsevier Inc.

  18. Probabilistic tsunami hazard assessment in Greece for seismic sources along the segmented Hellenic Arc

    NASA Astrophysics Data System (ADS)

    Novikova, Tatyana; Babeyko, Andrey; Papadopoulos, Gerassimos

    2017-04-01

    Greece and adjacent coastal areas are characterized by a high population exposure to tsunami hazard. The Hellenic Arc is the most active geotectonic structure in the region for the generation of earthquakes and tsunamis. We performed probabilistic tsunami hazard assessment for selected locations along Greek coastlines, namely the forecasting points officially used in tsunami warning operations by the Hellenic National Tsunami Warning Center and the NEAMTWS/IOC/UNESCO. In our analysis we considered seismic sources for tsunami generation along the western, central and eastern segments of the Hellenic Arc. We first created a 10,000-year synthetic catalog of all significant earthquakes with magnitudes in the range from 6.0 to 8.5, real events being included in this catalog. For each event in the synthetic catalog, a tsunami was generated and propagated using a Boussinesq model. The probability of occurrence of each event was determined by a Gutenberg-Richter magnitude-frequency distribution. The results of our study are expressed as hazard curves and hazard maps. The hazard curves were obtained for the selected sites and present the annual probability of exceedance as a function of peak coastal tsunami amplitude. The hazard maps represent the distribution of peak coastal tsunami amplitudes corresponding to a fixed annual probability. In these forms, our results can be easily compared with those obtained in other studies and further employed for the development of tsunami risk management plans. This research is a contribution to the EU-FP7 tsunami research project ASTARTE (Assessment, Strategy And Risk Reduction for Tsunamis in Europe), grant agreement no: 603839, 2013-10-30.
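
    The catalog-to-hazard-curve chain described above can be sketched in simplified form: sample magnitudes from a truncated Gutenberg-Richter distribution, map each event to a modeled coastal amplitude, and count annual exceedances. The amplitude relation and all parameters below are hypothetical stand-ins for the Boussinesq modeling, and the Poisson event count is approximated by a Gaussian for brevity.

```python
import math
import random

def synthetic_catalog(years, a, b, m_min, m_max, rng):
    """Draw magnitudes from a truncated Gutenberg-Richter law.

    The annual rate of events with M >= m_min is 10**(a - b*m_min).
    """
    rate = 10 ** (a - b * m_min)
    # Gaussian approximation of a Poisson count (adequate for large counts)
    n = max(int(rng.gauss(rate * years, math.sqrt(rate * years))), 0)
    beta = b * math.log(10)
    trunc = 1 - math.exp(-beta * (m_max - m_min))
    mags = []
    for _ in range(n):
        u = rng.random()
        # inverse-CDF sampling of the truncated exponential magnitude law
        mags.append(m_min - math.log(1 - u * trunc) / beta)
    return mags

def hazard_curve(amplitudes, years, thresholds):
    """Annual rate of exceedance of each peak-amplitude threshold."""
    return [sum(a > t for a in amplitudes) / years for t in thresholds]

rng = random.Random(42)
mags = synthetic_catalog(10_000, a=4.5, b=1.0, m_min=6.0, m_max=8.5, rng=rng)
# Hypothetical stand-in for the modeled peak coastal amplitude (m) per event
amps = [0.05 * math.exp(1.2 * (m - 6.0)) for m in mags]
curve = hazard_curve(amps, 10_000, [0.1, 0.5, 1.0])
```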

  19. National Environmental Policy Act Hazards Assessment for the TREAT Alternative

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boyd D. Christensen; Annette L. Schafer

    2013-11-01

    This document provides an assessment of hazards as required by the National Environmental Policy Act for the alternative of restarting the reactor at the Transient Reactor Test (TREAT) facility under the Resumption of Transient Testing Program. Potential hazards have been identified, and screening-level calculations have been conducted to provide estimates of the unmitigated dose consequences that could be incurred through this alternative. Consequences considered include those related to use of the TREAT Reactor, experiment assembly handling, and combined events involving both the reactor and experiments. In addition, potential safety structures, systems, and components for processes associated with operating TREAT and onsite handling of nuclear fuels and experiments are listed. If this alternative is selected, a safety basis will be prepared in accordance with 10 CFR 830, “Nuclear Safety Management,” Subpart B, “Safety Basis Requirements.”

  20. National Environmental Policy Act Hazards Assessment for the TREAT Alternative

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Christensen, Boyd D.; Schafer, Annette L.

    2014-02-01

    This document provides an assessment of hazards as required by the National Environmental Policy Act for the alternative of restarting the reactor at the Transient Reactor Test (TREAT) facility by the Resumption of Transient Testing Program. Potential hazards have been identified and screening level calculations have been conducted to provide estimates of unmitigated dose consequences that could be incurred through this alternative. Consequences considered include those related to use of the TREAT Reactor, experiment assembly handling, and combined events involving both the reactor and experiments. In addition, potential safety structures, systems, and components for processes associated with operating TREAT and onsite handling of nuclear fuels and experiments are listed. If this alternative is selected, a safety basis will be prepared in accordance with 10 CFR 830, “Nuclear Safety Management,” Subpart B, “Safety Basis Requirements.”

  1. 24 CFR 35.1320 - Lead-based paint inspections, paint testing, risk assessments, lead-hazard screens, and...

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... POISONING PREVENTION IN CERTAIN RESIDENTIAL STRUCTURES Methods and Standards for Lead-Paint Hazard... 24 Housing and Urban Development 1 2010-04-01 2010-04-01 false Lead-based paint inspections, paint testing, risk assessments, lead-hazard screens, and reevaluations. 35.1320 Section 35.1320 Housing and...

  2. 24 CFR 35.1320 - Lead-based paint inspections, paint testing, risk assessments, lead-hazard screens, and...

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... POISONING PREVENTION IN CERTAIN RESIDENTIAL STRUCTURES Methods and Standards for Lead-Paint Hazard... 24 Housing and Urban Development 1 2014-04-01 2014-04-01 false Lead-based paint inspections, paint testing, risk assessments, lead-hazard screens, and reevaluations. 35.1320 Section 35.1320 Housing and...

  3. 24 CFR 35.1320 - Lead-based paint inspections, paint testing, risk assessments, lead-hazard screens, and...

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... POISONING PREVENTION IN CERTAIN RESIDENTIAL STRUCTURES Methods and Standards for Lead-Paint Hazard... 24 Housing and Urban Development 1 2013-04-01 2013-04-01 false Lead-based paint inspections, paint testing, risk assessments, lead-hazard screens, and reevaluations. 35.1320 Section 35.1320 Housing and...

  4. 24 CFR 35.1320 - Lead-based paint inspections, paint testing, risk assessments, lead-hazard screens, and...

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... POISONING PREVENTION IN CERTAIN RESIDENTIAL STRUCTURES Methods and Standards for Lead-Paint Hazard... 24 Housing and Urban Development 1 2012-04-01 2012-04-01 false Lead-based paint inspections, paint testing, risk assessments, lead-hazard screens, and reevaluations. 35.1320 Section 35.1320 Housing and...

  5. 24 CFR 35.1320 - Lead-based paint inspections, paint testing, risk assessments, lead-hazard screens, and...

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... POISONING PREVENTION IN CERTAIN RESIDENTIAL STRUCTURES Methods and Standards for Lead-Paint Hazard... 24 Housing and Urban Development 1 2011-04-01 2011-04-01 false Lead-based paint inspections, paint testing, risk assessments, lead-hazard screens, and reevaluations. 35.1320 Section 35.1320 Housing and...

  6. Environmental Hazards Assessment Program annual report, July 1, 1993--June 30, 1994

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    On June 23, 1992, the US Department of Energy (DOE) signed Assistance Instrument Number DE-FG01-92EW50625 with the Medical University of South Carolina (MUSC) to support the Environmental Hazards Assessment Program (EHAP). The objectives of EHAP stated in the proposal to DOE are to: (1) develop a holistic, national basis for risk assessment, risk management, and risk communication which recognizes the direct impact of environmental hazards on the health and well-being of all; (2) develop a pool of talented scientists and experts in cleanup activities, especially in human health aspects; and (3) identify needs and develop programs addressing the critical shortage of well-educated, highly-skilled technical and scientific personnel to address the health-oriented aspects of environmental restoration and waste management. This report describes activities and reports on progress for the second year of the grant.

  7. Interaction Potency of Single-Walled Carbon Nanotubes with DNAs: A Novel Assay for Assessment of Hazard Risk

    PubMed Central

    Yao, Chunhe; Carlisi, Cristina; Li, Yuning; Chen, Da; Ding, Jianfu; Feng, Yong-Lai

    2016-01-01

    Increasing use of single-walled carbon nanotubes (SWCNTs) necessitates a novel method for hazard risk assessment. In this work, we investigated the interaction of several types of commercial SWCNTs with single-stranded (ss) and double-stranded (ds) DNA oligonucleotides (20-mer and 20 bp). Based on these results, we proposed a novel assay that employs DNA interaction potency to assess the hazard risk of SWCNTs. SWCNTs of different sizes, or different batches of the same product number, showed dramatically different potency of interaction with DNAs. In addition, the same SWCNTs exerted strikingly different interaction potency with ss- versus ds-DNAs. The interaction rates of SWCNTs with DNAs were investigated and could be utilized as an indicator of potential hazard for acute exposure. Compared to solid SWCNTs, SWCNTs dispersed in a liquid medium (2% sodium cholate solution) exhibited dramatically different interaction potency with DNAs, indicating that the exposure medium may greatly influence the subsequent toxicity and hazard risk produced by SWCNTs. Based on the observed dose- and time-dependence of the interactions between SWCNTs and DNAs, a new chemistry-based assay for hazard risk assessment of nanomaterials, including SWCNTs, has been presented. PMID:27936089

  8. Interaction Potency of Single-Walled Carbon Nanotubes with DNAs: A Novel Assay for Assessment of Hazard Risk.

    PubMed

    Yao, Chunhe; Carlisi, Cristina; Li, Yuning; Chen, Da; Ding, Jianfu; Feng, Yong-Lai

    2016-01-01

    Increasing use of single-walled carbon nanotubes (SWCNTs) necessitates a novel method for hazard risk assessment. In this work, we investigated the interaction of several types of commercial SWCNTs with single-stranded (ss) and double-stranded (ds) DNA oligonucleotides (20-mer and 20 bp). Based on these results, we proposed a novel assay that employs DNA interaction potency to assess the hazard risk of SWCNTs. SWCNTs of different sizes, or different batches of the same product number, showed dramatically different potency of interaction with DNAs. In addition, the same SWCNTs exerted strikingly different interaction potency with ss- versus ds-DNAs. The interaction rates of SWCNTs with DNAs were investigated and could be utilized as an indicator of potential hazard for acute exposure. Compared to solid SWCNTs, SWCNTs dispersed in a liquid medium (2% sodium cholate solution) exhibited dramatically different interaction potency with DNAs, indicating that the exposure medium may greatly influence the subsequent toxicity and hazard risk produced by SWCNTs. Based on the observed dose- and time-dependence of the interactions between SWCNTs and DNAs, a new chemistry-based assay for hazard risk assessment of nanomaterials, including SWCNTs, has been presented.

  9. Hazard identification and risk assessment for biologics targeting the immune system.

    PubMed

    Weir, Andrea B

    2008-01-01

    Biologic pharmaceuticals include a variety of products, such as monoclonal antibodies, fusion proteins and cytokines. Products in those classes include immunomodulatory biologics, which are intended to enhance or diminish the activity of the immune system. Immunomodulatory biologics have been approved by the U.S. FDA for a variety of indications, including cancer and inflammatory conditions. Prior to gaining approval for marketing, sponsoring companies for all types of products must demonstrate a product's safety in toxicology studies conducted in animals and show safety and efficacy in clinical trials conducted in patients. The overall goal of toxicology studies, which applies to immunomodulatory and other product types, is to identify the hazards that products pose to humans. Because biologics are generally highly selective for specific targets (receptors/epitopes), conducting toxicology studies in animal models with the target is essential. Such animals are referred to as pharmacologically relevant. Endpoints routinely included in toxicology studies, such as hematology, organ weight and histopathology, can be used to assess the effect of a product on the structure of the immune system. Additionally, specialized endpoints, such as immunophenotyping and immune function tests, can be used to define effects of immunomodulatory products on the immune system. Following hazard identification, risks posed to patients are assessed and managed. Risks can be managed through clinical trial design and risk communication, a practice that applies to immunomodulatory and other product types. Examples of risk management in clinical trial design include establishing a safe starting dose, defining the appropriate patient population and establishing appropriate patient monitoring. Risk communication starts during clinical trials and continues after product approval. A combination of hazard identification, risk assessment and risk management allows for drug development to proceed

  10. Multi-Hazard Vulnerability Assessment Along the Coast of Visakhapatnam, North-East Coast of India

    NASA Astrophysics Data System (ADS)

    Vivek, G.; Grinivasa Kumar, T.

    2016-08-01

    The current study area is the coastal zone of Visakhapatnam district of Andhra Pradesh, along the east coast of India. This area is vulnerable to many disasters such as storms, cyclones, floods, tsunamis and erosion, and is considered cyclone-prone because of the frequent occurrence of cyclones. Recently, two tropical cyclones that formed in the Bay of Bengal, Hudhud (October 13, 2014) and Phailin (October 11, 2013), caused devastating impacts on the eastern coast and showed the country's lack of preparedness for cyclones, storm surge and related natural hazards. The present study aims to develop a methodology for coastal multi-hazard vulnerability assessment, carried out using parameters such as coastal slope, tsunami arrival height, future sea level rise, coastal erosion and tidal range. The multi-hazard vulnerability maps prepared here are a blended and combined overlay of the multiple hazards affecting the coastal zone, and are further reproduced as risk maps with land use information. The decision-making tools presented here can provide useful information during a disaster for the evacuation process and help evolve a management strategy.

  11. A procedure for NEPA assessment of selenium hazards associated with mining

    Treesearch

    Dennis A. Lemly

    2007-01-01

    This paper gives step-by-step instructions for assessing aquatic selenium hazards associated with mining. The procedure was developed to provide the U.S. Forest Service with a proactive capability for determining the risk of selenium pollution when it reviews mine permit applications in accordance with the National Environmental Policy Act (NEPA). The procedural...

  12. Application of Gumbel I and Monte Carlo methods to assess seismic hazard in and around Pakistan

    NASA Astrophysics Data System (ADS)

    Rehman, Khaista; Burton, Paul W.; Weatherill, Graeme A.

    2018-05-01

    A proper assessment of seismic hazard is of considerable importance in order to achieve suitable building construction criteria. This paper presents probabilistic seismic hazard assessment in and around Pakistan (23° N-39° N; 59° E-80° E) in terms of peak ground acceleration (PGA). Ground motion is calculated in terms of PGA for a return period of 475 years using a seismogenic-free zone method of Gumbel's first asymptotic distribution of extreme values and Monte Carlo simulation. Appropriate attenuation relations of universal and local types have been used in this study. The results show that for many parts of Pakistan, the expected seismic hazard is relatively comparable with the level specified in the existing PGA maps.
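The Gumbel I part of such an analysis reduces to evaluating the inverse of the Type I extreme-value distribution of annual maximum ground motion at the desired return period. A minimal sketch, with location and scale parameters chosen purely for illustration (they are not fitted values from this study):

```python
import math

# Gumbel Type I distribution of annual maximum PGA:
#   F(x) = exp(-exp(-(x - u) / alpha))
# Location (u_loc) and scale (alpha) below are illustrative assumptions.
u_loc = 0.05   # g
alpha = 0.03   # g

def pga_for_return_period(T_years: float) -> float:
    """PGA with annual exceedance probability 1/T under Gumbel I."""
    p_non_exceed = 1.0 - 1.0 / T_years
    return u_loc - alpha * math.log(-math.log(p_non_exceed))

# A 475-year return period corresponds to the standard 10% probability
# of exceedance in 50 years used for building codes.
pga_475 = pga_for_return_period(475.0)
print(f"PGA(475 yr) = {pga_475:.3f} g")
```

The Monte Carlo alternative mentioned in the abstract would instead simulate many synthetic catalogs, attenuate each event to the site, and read the 475-year PGA off the empirical distribution of annual maxima.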

  13. Probabilistic assessment of landslide tsunami hazard for the northern Gulf of Mexico

    NASA Astrophysics Data System (ADS)

    Pampell-Manis, A.; Horrillo, J.; Shigihara, Y.; Parambath, L.

    2016-01-01

    The devastating consequences of recent tsunamis affecting Indonesia and Japan have prompted a scientific response to better assess unexpected tsunami hazards. Although much uncertainty exists regarding the recurrence of large-scale tsunami events in the Gulf of Mexico (GoM), geological evidence indicates that a tsunami is possible and would most likely come from a submarine landslide triggered by an earthquake. This study customizes for the GoM a first-order probabilistic landslide tsunami hazard assessment. Monte Carlo Simulation (MCS) is employed to determine landslide configurations based on distributions obtained from observational submarine mass failure (SMF) data. Our MCS approach incorporates a Cholesky decomposition method for correlated landslide size parameters to capture correlations seen in the data as well as uncertainty inherent in these events. Slope stability analyses are performed using landslide and sediment properties and regional seismic loading to determine landslide configurations which fail and produce a tsunami. The probability of each tsunamigenic failure is calculated based on the joint probability of slope failure and probability of the triggering earthquake. We are thus able to estimate sizes and return periods for probabilistic maximum credible landslide scenarios. We find that the Cholesky decomposition approach generates landslide parameter distributions that retain the trends seen in observational data, improving the statistical validity and relevancy of the MCS technique in the context of landslide tsunami hazard assessment. Estimated return periods suggest that probabilistic maximum credible SMF events in the north and northwest GoM have a recurrence of 5000-8000 years, in agreement with age dates of observed deposits.
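The Cholesky step described above turns independent standard normal draws into samples whose correlations match an observed correlation matrix. A minimal sketch with made-up landslide size statistics (the log-space means, spreads and correlation matrix are assumptions, not the study's SMF data):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative log-space statistics for landslide length, width, thickness.
mean_log = np.array([np.log(10.0), np.log(4.0), np.log(0.1)])  # km
std_log = np.array([0.5, 0.4, 0.6])

# Assumed correlation matrix between the three size parameters
corr = np.array([[1.0, 0.8, 0.6],
                 [0.8, 1.0, 0.5],
                 [0.6, 0.5, 1.0]])

# Cholesky factor L satisfies L @ L.T == corr, so z @ L.T has the
# target correlation structure when z is independent standard normal.
L = np.linalg.cholesky(corr)
z = rng.standard_normal((10_000, 3))
correlated = z @ L.T

samples = np.exp(mean_log + correlated * std_log)  # lognormal sizes
sample_corr = np.corrcoef(np.log(samples), rowvar=False)
print(np.round(sample_corr, 2))
```

Each sampled (length, width, thickness) triple would then feed a slope-stability check and, if it fails and is tsunamigenic, a tsunami simulation.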

  14. Use of a modified GreenScreen tool to conduct a screening-level comparative hazard assessment of conventional silver and two forms of nanosilver.

    PubMed

    Sass, Jennifer; Heine, Lauren; Hwang, Nina

    2016-11-08

    Increased concern for potential health and environmental impacts of chemicals, including nanomaterials, in consumer products is driving demand for greater transparency regarding potential risks. Chemical hazard assessment is a powerful tool to inform product design, development and procurement and has been integrated into alternative assessment frameworks. The extent to which assessment methods originally designed for conventionally-sized materials can be used for nanomaterials, which have size-dependent physical and chemical properties, has not been well established. We contracted with a certified GreenScreen profiler to conduct three GreenScreen hazard assessments, for conventional silver and two forms of nanosilver. The contractor summarized publicly available literature, and used defined GreenScreen hazard criteria and expert judgment to assign and report hazard classification levels, along with indications of confidence in those assignments. Where data were not available, a data gap (DG) was assigned. Using the individual endpoint scores, an aggregated benchmark score (BM) was applied. Conventional silver and low-soluble nanosilver were assigned the highest possible hazard score, and a silica-silver nanocomposite called AGS-20 could not be scored due to data gaps. AGS-20 is approved for use as an antimicrobial by the US Environmental Protection Agency. An existing method for chemical hazard assessment and communication can be used, with minor adaptations, to compare hazards across conventional and nano forms of a substance. The differences in data gaps and in hazard profiles support the argument that each silver form should be considered unique and subjected to hazard assessment to inform regulatory decisions and decisions about product design and development. A critical limitation of hazard assessments for nanomaterials is the lack of nano-specific hazard data; where data are available, we demonstrate that existing hazard assessment systems can work.
The work

  15. Tsunami hazard assessment for the island of Rhodes, Greece

    NASA Astrophysics Data System (ADS)

    Pagnoni, Gianluca; Armigliato, Alberto; Zaniboni, Filippo; Tinti, Stefano

    2013-04-01

    The island of Rhodes is part of the Dodecanese archipelago and is one of the many islands found in the Aegean Sea. The tectonics of the Rhodes area is rather complex, involving both strike-slip and dip-slip (mainly thrust) processes. Tsunami catalogues (e.g. Papadopoulos et al., 2007) show the relatively high frequency of occurrence of tsunamis in this area, some also destructive, in particular between the coasts of Rhodes and Turkey. The town of Rhodes, the capital and also the largest and most populated city, is located in this part of the island. Rhodes is historically famous for the Colossus of Rhodes, which collapsed following an earthquake, and is nowadays a popular tourist destination. This work focuses on hazard assessment, with research performed in the frame of the European project NearToWarn. The hazard is assessed using the worst-credible-case scenario, a method introduced and used to study local tsunami hazard in coastal towns such as Catania, Italy, and Alexandria, Egypt (Tinti et al., 2012). Three tsunami sources were chosen for building scenarios: two local sources located in the sea area in front of the Turkish coasts, where the events are more frequent, and one distant source. The first source is taken from Ebeling et al. (2012), modified by UNIBO, and models the earthquake and small tsunami that occurred on 25th April 1957. The second source is a landslide derived from the TRANSFER Project "Database of Tsunamigenic Non-Seismic Sources" and coincides with the so-called "Northern Rhodes Slide", possibly responsible for the 24th March 2002 tsunami. The last source is the fault located close to the island of Crete believed to be responsible for the tsunami event of 1303, which was reported to have caused damage in the city of Rhodes. The simulations are carried out using the finite difference code UBO-TSUFD that

  16. A transparent and data-driven global tectonic regionalization model for seismic hazard assessment

    NASA Astrophysics Data System (ADS)

    Chen, Yen-Shin; Weatherill, Graeme; Pagani, Marco; Cotton, Fabrice

    2018-05-01

    A key concept that is common to many assumptions inherent within seismic hazard assessment is that of tectonic similarity. This recognizes that certain regions of the globe may display similar geophysical characteristics, such as in the attenuation of seismic waves, the magnitude scaling properties of seismogenic sources or the seismic coupling of the lithosphere. Previous attempts at tectonic regionalization, particularly within a seismic hazard assessment context, have often been based on expert judgements; in most of these cases, the process for delineating tectonic regions is neither reproducible nor consistent from location to location. In this work, the regionalization process is implemented in a scheme that is reproducible, comprehensible from a geophysical rationale, and revisable when new relevant data are published. A spatial classification-scheme is developed based on fuzzy logic, enabling the quantification of concepts that are approximate rather than precise. Using the proposed methodology, we obtain a transparent and data-driven global tectonic regionalization model for seismic hazard applications as well as the subjective probabilities (e.g. degree of being active/degree of being cratonic) that indicate the degree to which a site belongs in a tectonic category.
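The fuzzy-logic idea can be illustrated with trapezoidal membership functions over a single geophysical indicator. The indicator choice (lithosphere age) and all breakpoints below are hypothetical stand-ins, not the model's actual inputs:

```python
# Toy fuzzy classification of a site into tectonic categories from one
# indicator (lithosphere age in Myr). Values are illustrative assumptions.

def trapezoid(x, a, b, c, d):
    """Membership rising on [a, b], flat at 1 on [b, c], falling on [c, d]."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (d - x) / (d - c)

def memberships(lith_age_myr):
    """Degree to which a site belongs to each tectonic category (0..1)."""
    return {
        "active":   trapezoid(lith_age_myr, -1, 0, 100, 500),
        "stable":   trapezoid(lith_age_myr, 100, 500, 1500, 2000),
        "cratonic": trapezoid(lith_age_myr, 1500, 2000, 4000, 4500),
    }

print(memberships(300))  # a site partly active, partly stable
```

The key property, as in the abstract, is that a site receives graded degrees of membership in several categories rather than a single crisp label.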

  17. Probabilistic volcanic hazard assessments of Pyroclastic Density Currents: ongoing practices and future perspectives

    NASA Astrophysics Data System (ADS)

    Tierz, Pablo; Sandri, Laura; Ramona Stefanescu, Elena; Patra, Abani; Marzocchi, Warner; Costa, Antonio; Sulpizio, Roberto

    2014-05-01

    Explosive volcanoes and, especially, Pyroclastic Density Currents (PDCs) pose an enormous threat to populations living in the surroundings of volcanic areas. Difficulties in the modeling of PDCs are related to (i) the very complex and stochastic physical processes intrinsic to their occurrence, and (ii) a lack of knowledge about how these processes actually form and evolve. This means that there are deep uncertainties (namely, of aleatory nature due to point (i) above, and of epistemic nature due to point (ii) above) associated with the study and forecast of PDCs. Consequently, the assessment of their hazard is better described by probabilistic approaches than by deterministic ones. What is actually done to assess probabilistic hazard from PDCs is to couple deterministic simulators with statistical techniques that can, eventually, supply probabilities and inform about the uncertainties involved. In this work, some examples of both PDC numerical simulators (Energy Cone and TITAN2D) and uncertainty quantification techniques (Monte Carlo sampling -MC-, Polynomial Chaos Quadrature -PCQ- and Bayesian Linear Emulation -BLE-) are presented, and their advantages, limitations and future potential are underlined. The key point in choosing a specific method rests on the balance between its computational cost, the physical reliability of the simulator and the pursued target of the hazard analysis (type of PDCs considered, time-scale selected for the analysis, particular guidelines received from decision-making agencies, etc.). Although current numerical and statistical techniques have brought important advances in probabilistic volcanic hazard assessment from PDCs, some of them may be further applicable to more sophisticated simulators. In addition, forthcoming improvements could be focused on three main multidisciplinary directions: 1) Validate the simulators frequently used (through comparison with PDC deposits and other simulators), 2) Decrease

  18. Probabilistic Seismic Hazard Assessment for a NPP in the Upper Rhine Graben, France

    NASA Astrophysics Data System (ADS)

    Clément, Christophe; Chartier, Thomas; Jomard, Hervé; Baize, Stéphane; Scotti, Oona; Cushing, Edward

    2015-04-01

    The southern part of the Upper Rhine Graben (URG), straddling the border between eastern France and western Germany, presents relatively significant seismic activity for an intraplate area. An earthquake of magnitude 5 or greater shakes the URG every 25 years, and in 1356 an earthquake of magnitude greater than 6.5 struck the city of Basel. Several potentially active faults have been identified in the area and documented in the French Active Fault Database (web site under construction). These faults are located along the Graben boundaries and also inside the Graben itself, beneath heavily populated areas and critical facilities (including the Fessenheim Nuclear Power Plant), and are prone to produce earthquakes of magnitude 6 and above. Published regional models and preliminary geomorphological investigations provided provisional assessments of slip rates for the individual faults (0.1-0.001 mm/a), resulting in recurrence times of 10,000 years or greater for magnitude 6+ earthquakes. Using a fault model, ground motion response spectra are calculated for annual frequencies of exceedance (AFE) ranging from 10^-4 to 10^-8 per year, typical for design basis and probabilistic safety analyses of NPPs. A logic tree is implemented to evaluate uncertainties in the seismic hazard assessment. The choice of ground motion prediction equations (GMPEs) and the range of slip rate uncertainty are the main sources of seismic hazard variability at the NPP site. In fact, the hazard for AFE lower than 10^-4 is mostly controlled by the potentially active nearby Rhine River fault. Compared with areal source zone models, a fault model localizes the hazard around the active faults and changes the shape of the Uniform Hazard Spectrum at the site. Seismic hazard deaggregations are performed to identify the earthquake scenarios (including magnitude, distance and the number of standard deviations from the median ground motion as predicted by GMPEs) that contribute to the exceedance of spectral acceleration for the different AFE

  19. A human health assessment of hazardous air pollutants in Portland, OR.

    PubMed

    Tam, B N; Neumann, C M

    2004-11-01

    Ambient air samples collected from five monitoring sites in Portland, OR during July 1999 to August 2000 were analyzed for 43 hazardous air pollutants (HAP). HAP concentrations were compared to carcinogenic and non-carcinogenic benchmark levels. Carcinogenic benchmark concentrations were set at a risk level of one in one million (1×10^-6). Hazard ratios of 1.0 were used when comparing HAP concentrations to non-carcinogenic benchmarks. Emission sources (point, area, and mobile) were identified, and a cumulative cancer risk and total hazard index were calculated for HAPs exceeding these health benchmark levels. Seventeen HAPs exceeded a cancer risk level of 1×10^-6 at all five monitoring sites; nineteen exceeded this level at one or more sites. Carbon tetrachloride, 1,3-butadiene, formaldehyde, and 1,1,2,2-tetrachloroethane contributed more than 50% to the upper-bound lifetime cumulative cancer risk of 2.47×10^-4. Acrolein was the only non-carcinogenic HAP with hazard ratios exceeding 1.0 at all five sites. Mobile sources contributed the greatest percentage (68%) of HAP emissions. Additional monitoring and health assessments for HAPs in Portland, OR are warranted, including addressing issues that may have overestimated or underestimated risks in this study. Abatement strategies for HAPs that exceeded health benchmarks should be implemented to reduce potential adverse health risks.
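The screening arithmetic used in such assessments is simple: each carcinogen's risk is its concentration times a unit risk factor, summed into a cumulative cancer risk, and the hazard index is the sum of concentration-to-reference-concentration ratios. A sketch with invented concentrations and toxicity values (not the Portland monitoring data):

```python
# Each entry: (measured concentration ug/m3,
#              cancer unit risk per ug/m3 or None,
#              non-cancer reference concentration ug/m3).
# All numbers are made-up illustrative values.
haps = {
    "carbon tetrachloride": (0.6,  6.0e-6, 100.0),
    "1,3-butadiene":        (0.2,  3.0e-5, 2.0),
    "formaldehyde":         (2.5,  1.3e-5, 9.8),
    "acrolein":             (0.05, None,   0.02),  # non-carcinogen here
}

# Cumulative lifetime cancer risk: sum of conc * unit risk over carcinogens
cum_cancer_risk = sum(c * ur for c, ur, _ in haps.values() if ur is not None)

# Total hazard index: sum of conc / reference concentration
hazard_index = sum(c / rfc for c, _, rfc in haps.values())

print(f"cumulative lifetime cancer risk = {cum_cancer_risk:.2e}")
print(f"total hazard index = {hazard_index:.2f}")
```

A cumulative risk above the 1×10^-6 benchmark, or a hazard index above 1.0, is what flags a site for the further monitoring and abatement discussed in the abstract.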

  20. Autonomous Soil Assessment System: A Data-Driven Approach to Planetary Mobility Hazard Detection

    NASA Astrophysics Data System (ADS)

    Raimalwala, K.; Faragalli, M.; Reid, E.

    2018-04-01

    The Autonomous Soil Assessment System predicts mobility hazards for rovers. Its development and performance are presented, with focus on its data-driven models, machine learning algorithms, and real-time sensor data fusion for predictive analytics.

  1. Next generation microbiological risk assessment-Potential of omics data for hazard characterisation.

    PubMed

    Haddad, Nabila; Johnson, Nick; Kathariou, Sophia; Métris, Aline; Phister, Trevor; Pielaat, Annemarie; Tassou, Chrysoula; Wells-Bennik, Marjon H J; Zwietering, Marcel H

    2018-04-12

    According to World Health Organization estimates in 2015, 600 million people fall ill every year from contaminated food and 420,000 die. Microbial risk assessment (MRA) was developed as a tool to reduce and prevent risks presented by pathogens and/or their toxins. MRA is organized in four steps to analyse information and assist in both designing appropriate control options and implementing regulatory decisions and programs. Among the four steps, hazard characterisation is performed to establish the probability and severity of a disease outcome, which is determined as a function of the dose of toxin and/or pathogen ingested. This dose-response relationship is subject to both variability and uncertainty. The purpose of this review/opinion article is to discuss how next-generation omics can impact hazard characterisation and, more precisely, how it can improve our understanding of variability and limit the uncertainty in the dose-response relation. The expansion of omics tools (e.g. genomics, transcriptomics, proteomics and metabolomics) allows for a better understanding of pathogenicity mechanisms and virulence levels of bacterial strains. Detection and identification of virulence genes, comparative genomics, analyses of mRNA and protein levels and the development of biomarkers can help in building a mechanistic dose-response model to predict disease severity. In this respect, systems biology can help to identify critical system characteristics that confer virulence and explain variability between strains. Despite challenges in the integration of omics into risk assessment, some omics methods have already been used by regulatory agencies for hazard identification. Standardized methods, reproducibility and datasets obtained from realistic conditions remain a challenge, and are needed to improve the accuracy of hazard characterisation. When these improvements are realized, they will allow the health authorities and government policy makers to prioritize hazards more

  2. Quantitative Risk Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Helms, J.

    2017-02-10

    The US energy sector is vulnerable to multiple hazards including both natural disasters and malicious attacks from an intelligent adversary. The question that utility owners, operators and regulators face is how to prioritize their investments to mitigate the risks from a hazard that can have the most impact on the asset of interest. In order to be able to understand their risk landscape and develop a prioritized mitigation strategy, they must quantify risk in a consistent way across all hazards their asset is facing. Without being able to quantitatively measure risk, it is not possible to defensibly prioritize security investments or evaluate trade-offs between security and functionality. Development of a methodology that will consistently measure and quantify risk across different hazards is needed.

  3. STRUCTURE-ACTIVITY APPROACHES AND DATA EXPLORATION TOOLS FOR PRIORITIZING AND ASSESSING THE TOXICITY OF HAZARDOUS AIR POLLUTANTS

    EPA Science Inventory


    Hazardous Air Pollutants (HAPs) refers to a set of structurally diverse environmental chemicals, many with limited toxicity data, that have...

  4. The role of models in estimating consequences as part of the risk assessment process.

    PubMed

    Forde-Folle, K; Mitchell, D; Zepeda, C

    2011-08-01

    The degree of disease risk represented by the introduction, spread, or establishment of one or several diseases through the importation of animals and animal products is assessed by importing countries through an analysis of risk. The components of a risk analysis include hazard identification, risk assessment, risk management, and risk communication. A risk assessment starts with identification of the hazard(s) and then continues with four interrelated steps: release assessment, exposure assessment, consequence assessment, and risk estimation. Risk assessments may be either qualitative or quantitative. This paper describes how, through the integration of epidemiological and economic models, the potential adverse biological and economic consequences of exposure can be quantified.

  5. Seismic hazard assessment based on the Unified Scaling Law for Earthquakes: the Greater Caucasus

    NASA Astrophysics Data System (ADS)

    Nekrasova, A.; Kossobokov, V. G.

    2015-12-01

    Losses from natural disasters continue to increase, mainly due to poor understanding, by the majority of the scientific community, decision makers and the public, of the three components of Risk: Hazard, Exposure, and Vulnerability. Contemporary Science has not coped with the challenging changes in Exposure and Vulnerability inflicted by a growing population and its concentration, which result in a steady increase of losses from natural hazards. Science owes Society better knowledge, education, and communication; in fact, Contemporary Science can do a better job in disclosing natural hazards, assessing risks, and delivering such knowledge in advance of catastrophic events. We continue applying the general concept of seismic risk analysis in a number of seismic regions worldwide by constructing regional seismic hazard maps based on the Unified Scaling Law for Earthquakes (USLE), i.e. log N(M,L) = A - B·(M-6) + C·log L, where N(M,L) is the expected annual number of earthquakes of a certain magnitude M within a seismically prone area of linear dimension L. The parameters A, B, and C of the USLE are used to estimate, first, the expected maximum magnitude in a time interval at each seismically prone cell of a uniform grid that covers the region of interest, and then the corresponding expected ground shaking parameters, including macroseismic intensity. After rigorous testing against the available seismic evidence of the past (e.g., historically reported macroseismic intensity), such a seismic hazard map is used to generate maps of specific earthquake risks (e.g., those based on the density of exposed population). The methodology of seismic hazard and risk assessment based on the USLE is illustrated by application to the seismic region of the Greater Caucasus.
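
The USLE recurrence relation quoted above lends itself to a direct numerical sketch. The coefficients used here are illustrative placeholders, not values fitted for the Greater Caucasus:

```python
import math

def usle_annual_rate(magnitude, length_km, A, B, C):
    """Expected annual number of earthquakes of magnitude >= M within a
    seismically prone area of linear dimension L (km), per the USLE:
    log10 N(M, L) = A - B*(M - 6) + C*log10 L."""
    return 10 ** (A - B * (magnitude - 6.0) + C * math.log10(length_km))

# Illustrative coefficients for a hypothetical grid cell:
A, B, C = -1.0, 0.9, 1.2
n = usle_annual_rate(6.0, 100.0, A, B, C)  # M>=6 events per year in a ~100 km cell
```

Mapping A, B and C over a grid and inverting N(M,L)·T = 1 for M would then give the expected maximum magnitude per cell for a chosen time interval T, as described in the abstract.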

  6. A European perspective on alternatives to animal testing for environmental hazard identification and risk assessment.

    PubMed

    Scholz, Stefan; Sela, Erika; Blaha, Ludek; Braunbeck, Thomas; Galay-Burgos, Malyka; García-Franco, Mauricio; Guinea, Joaquin; Klüver, Nils; Schirmer, Kristin; Tanneberger, Katrin; Tobor-Kapłon, Marysia; Witters, Hilda; Belanger, Scott; Benfenati, Emilio; Creton, Stuart; Cronin, Mark T D; Eggen, Rik I L; Embry, Michelle; Ekman, Drew; Gourmelon, Anne; Halder, Marlies; Hardy, Barry; Hartung, Thomas; Hubesch, Bruno; Jungmann, Dirk; Lampi, Mark A; Lee, Lucy; Léonard, Marc; Küster, Eberhard; Lillicrap, Adam; Luckenbach, Till; Murk, Albertinka J; Navas, José M; Peijnenburg, Willie; Repetto, Guillermo; Salinas, Edward; Schüürmann, Gerrit; Spielmann, Horst; Tollefsen, Knut Erik; Walter-Rohde, Susanne; Whale, Graham; Wheeler, James R; Winter, Matthew J

    2013-12-01

    Tests with vertebrates are an integral part of environmental hazard identification and risk assessment of chemicals, plant protection products, pharmaceuticals, biocides, feed additives and effluents. These tests raise ethical and economic concerns and are considered as inappropriate for assessing all of the substances and effluents that require regulatory testing. Hence, there is a strong demand for replacement, reduction and refinement strategies and methods. However, until now alternative approaches have only rarely been used in regulatory settings. This review provides an overview on current regulations of chemicals and the requirements for animal tests in environmental hazard and risk assessment. It aims to highlight the potential areas for alternative approaches in environmental hazard identification and risk assessment. Perspectives and limitations of alternative approaches to animal tests using vertebrates in environmental toxicology, i.e. mainly fish and amphibians, are discussed. Free access to existing (proprietary) animal test data, availability of validated alternative methods and a practical implementation of conceptual approaches such as the Adverse Outcome Pathways and Integrated Testing Strategies were identified as major requirements towards the successful development and implementation of alternative approaches. Although this article focusses on European regulations, its considerations and conclusions are of global relevance. Copyright © 2013 Elsevier Inc. All rights reserved.

  7. Hazards in volcanic arcs

    NASA Astrophysics Data System (ADS)

    Sparks, S. R.

    2008-12-01

    Volcanic eruptions in arcs are complex natural phenomena, involving the movement of magma to the Earth's surface and interactions with the surrounding crust during ascent and with the surface environment during eruption, resulting in secondary hazards. Magma changes its properties profoundly during ascent and eruption, and many of the underlying processes of heat and mass transfer and physical property changes that govern volcanic flows and magmatic interactions with the environment are highly non-linear. Major direct hazards include tephra fall, pyroclastic flows from explosions and dome collapse, volcanic blasts, lahars, debris avalanches and tsunamis. There are also health hazards related to emissions of gases and very fine volcanic ash. These hazards and progress in their assessment are illustrated mainly from the ongoing eruption of the Soufriere Hills volcano, Montserrat. There are both epistemic and aleatory uncertainties in the assessment of volcanic hazards, which can be large, making precise prediction a formidable objective. Indeed, in certain respects volcanic systems and hazardous phenomena may be intrinsically unpredictable. As with other natural phenomena, predictions and hazards inevitably have to be expressed in probabilistic terms that take account of these uncertainties. Despite these limitations, significant progress is being made in the ability to anticipate volcanic activity in volcanic arcs and, in favourable circumstances, make robust hazard assessments and predictions. Improvements in monitoring ground deformation, gas emissions and seismicity are being combined with more advanced models of volcanic flows and their interactions with the environment. In addition, more structured and systematic methods for assessing hazards and risk are emerging that allow impartial advice to be given to authorities during volcanic crises. There remain significant issues of how scientific advice and associated uncertainties are communicated to provide effective...

  8. Preliminary Volcano-Hazard Assessment for Redoubt Volcano, Alaska

    USGS Publications Warehouse

    Waythomas, Christopher F.; Dorava, Joseph M.; Miller, Thomas P.; Neal, Christina A.; McGimsey, Robert G.

    1997-01-01

    Redoubt Volcano is a stratovolcano located within a few hundred kilometers of more than half of the population of Alaska. This volcano has erupted explosively at least six times since historical observations began in 1778. The most recent eruption occurred in 1989-90 and similar eruptions can be expected in the future. The early part of the 1989-90 eruption was characterized by explosive emission of substantial volumes of volcanic ash to altitudes greater than 12 kilometers above sea level and widespread flooding of the Drift River valley. Later, the eruption became less violent, as developing lava domes collapsed, forming short-lived pyroclastic flows associated with low-level ash emission. Clouds of volcanic ash had significant effects on air travel as they drifted across Alaska, over Canada, and over parts of the conterminous United States causing damage to jet aircraft. Economic hardships were encountered by the people of south-central Alaska as a result of ash fallout. Based on new information gained from studies of the 1989-90 eruption, an updated assessment of the principal volcanic hazards is now possible. Volcanic hazards from a future eruption of Redoubt Volcano require public awareness and planning so that risks to life and property are reduced as much as possible.

  9. Some relevant parameters for assessing fire hazards of combustible mine materials using laboratory scale experiments

    PubMed Central

    Litton, Charles D.; Perera, Inoka E.; Harteis, Samuel P.; Teacoach, Kara A.; DeRosa, Maria I.; Thomas, Richard A.; Smith, Alex C.

    2018-01-01

    When combustible materials ignite and burn, the potential for fire growth and flame spread represents an obvious hazard, but during these processes of ignition and flaming, other life hazards present themselves and should be included to ensure an effective overall analysis of the relevant fire hazards. In particular, the gases and smoke produced both during the smoldering stages of fires leading to ignition and during the advanced flaming stages of a developing fire serve to contaminate the surrounding atmosphere, potentially producing elevated levels of toxicity and high levels of smoke obscuration that render the environment untenable. In underground mines, these hazards may be exacerbated by the existing forced ventilation that can carry the gases and smoke to locations far-removed from the fire location. Clearly, materials that require high temperatures (above 1400 K) and that exhibit low mass loss during thermal decomposition, or that require high heat fluxes or heat transfer rates to ignite represent less of a hazard than materials that decompose at low temperatures or ignite at low levels of heat flux. In order to define and quantify some possible parameters that can be used to assess these hazards, small-scale laboratory experiments were conducted in a number of configurations to measure: 1) the toxic gases and smoke produced both during non-flaming and flaming combustion; 2) mass loss rates as a function of temperature to determine ease of thermal decomposition; and 3) mass loss rates and times to ignition as a function of incident heat flux. This paper describes the experiments that were conducted, their results, and the development of a set of parameters that could possibly be used to assess the overall fire hazard of combustible materials using small scale laboratory experiments. PMID:29599565

  10. Some relevant parameters for assessing fire hazards of combustible mine materials using laboratory scale experiments.

    PubMed

    Litton, Charles D; Perera, Inoka E; Harteis, Samuel P; Teacoach, Kara A; DeRosa, Maria I; Thomas, Richard A; Smith, Alex C

    2018-04-15

    When combustible materials ignite and burn, the potential for fire growth and flame spread represents an obvious hazard, but during these processes of ignition and flaming, other life hazards present themselves and should be included to ensure an effective overall analysis of the relevant fire hazards. In particular, the gases and smoke produced both during the smoldering stages of fires leading to ignition and during the advanced flaming stages of a developing fire serve to contaminate the surrounding atmosphere, potentially producing elevated levels of toxicity and high levels of smoke obscuration that render the environment untenable. In underground mines, these hazards may be exacerbated by the existing forced ventilation that can carry the gases and smoke to locations far-removed from the fire location. Clearly, materials that require high temperatures (above 1400 K) and that exhibit low mass loss during thermal decomposition, or that require high heat fluxes or heat transfer rates to ignite represent less of a hazard than materials that decompose at low temperatures or ignite at low levels of heat flux. In order to define and quantify some possible parameters that can be used to assess these hazards, small-scale laboratory experiments were conducted in a number of configurations to measure: 1) the toxic gases and smoke produced both during non-flaming and flaming combustion; 2) mass loss rates as a function of temperature to determine ease of thermal decomposition; and 3) mass loss rates and times to ignition as a function of incident heat flux. This paper describes the experiments that were conducted, their results, and the development of a set of parameters that could possibly be used to assess the overall fire hazard of combustible materials using small scale laboratory experiments.

  11. Lava flow hazards and risk assessment on Mauna Loa Volcano, Hawaii

    NASA Astrophysics Data System (ADS)

    Trusdell, Frank A.

    "It is profoundly significant that the Hawaiians of Ka'u did not fear or cringe before, or hate, the power and destructive violence of Mauna Loa. They took unto them this huge mountain as their mother, and measured their personal dignity and powers in terms of its majesty and drama." (Pukui and Handy, 1952) The Island of Hawai'i is the fastest-growing region in the State of Hawai'i, with over 100,000 residents. Because the population continues to grow at a rate of 3% per annum, more and more construction will occur on the flanks of active volcanoes. Since the last eruption of Mauna Loa in 1984, $2.3 billion have been invested in new construction on the volcano's flanks, posing an inevitable hazard to the people living there. Part of the mission of the U.S. Geological Survey's Hawaiian Volcano Observatory is to make the public aware of these hazards. Recent mapping has shown that lava flows on Mauna Loa have covered its surface area at a rate of 30-40% every 1000 years. Average effusion rates of up to 12 million cubic meters per day during eruptions, combined with slopes >10 degrees, increase the risk for the population of South Kona. Studies of Mauna Loa's long-term eruptive history will lead to more accurate volcanic hazard assessments and enable us to refine the boundaries between the hazard zones. Our work thus serves as a guide for land-use planners and developers to make more informed decisions for the future. Land-use planning is a powerful way to minimize risk in hazardous areas.

  12. Changing tides: Adaptive monitoring, assessment, and management of pharmaceutical hazards in the environment through time.

    PubMed

    Gaw, Sally; Brooks, Bryan W

    2016-04-01

    Pharmaceuticals are ubiquitous contaminants in aquatic ecosystems. Adaptive monitoring, assessment, and management programs will be required to reduce the environmental hazards of pharmaceuticals of concern. Potentially underappreciated factors that drive the environmental dose of pharmaceuticals include regulatory approvals, marketing campaigns, pharmaceutical subsidies and reimbursement schemes, and societal acceptance. Sales data for 5 common antidepressants (duloxetine [Cymbalta], escitalopram [Lexapro], venlafaxine [Effexor], bupropion [Wellbutrin], and sertraline [Zoloft]) in the United States from 2004 to 2008 were modeled to explore how environmental hazards in aquatic ecosystems changed after patents were obtained or expired. Therapeutic hazard ratios for Effexor and Lexapro did not exceed 1; however, the therapeutic hazard ratio for Zoloft declined whereas the therapeutic hazard ratio for Cymbalta increased as a function of patent protection and sale patterns. These changes in therapeutic hazard ratios highlight the importance of considering current and future drivers of pharmaceutical use when prioritizing pharmaceuticals for water quality monitoring programs. When urban systems receiving discharges of environmental contaminants are examined, water quality efforts should identify, prioritize, and select target analytes presently in commerce for effluent monitoring and surveillance. © 2015 SETAC.

  13. Tsunami Hazard Assessment: Source regions of concern to U.S. interests derived from NOAA Tsunami Forecast Model Development

    NASA Astrophysics Data System (ADS)

    Eble, M. C.; Uslu, B. U.; Wright, L.

    2013-12-01

    Synthetic tsunamis generated from source regions around the Pacific Basin are analyzed in terms of their relative impact on United States coastal locations. The region of tsunami origin is as important as the expected magnitude and the predicted inundation for understanding tsunami hazard. The NOAA Center for Tsunami Research has developed high-resolution tsunami models capable of predicting tsunami arrival time and wave amplitude at each location. These models have been used to conduct tsunami hazard assessments of maximum impact and tsunami inundation for use by local communities in education and evacuation map development. Hazard assessment studies conducted for Los Angeles, San Francisco, Crescent City, Hilo, and Apra Harbor are combined with results of tsunami forecast model development at each of seventy-five locations. A complete hazard assessment identifies every possible tsunami variation from a pre-computed propagation database. Study results indicate that the Eastern Aleutian Islands and Alaska are the most likely regions to produce the largest impact on the West Coast of the United States, while the East Philippines and Mariana trench regions impact Apra Harbor, Guam. Hawaii appears to be impacted equally from South America, Alaska and the Kuril Islands.
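
Ranking source regions by their worst-case modeled impact at a site, as described above, amounts to a scan over a pre-computed scenario database. The region names and amplitudes below are illustrative values, not NOAA model output:

```python
def worst_source_region(amplitudes_by_region):
    """Return the source region producing the largest modeled wave amplitude
    at a coastal site, scanning per-region lists of scenario amplitudes (m)."""
    return max(amplitudes_by_region.items(), key=lambda kv: max(kv[1]))

# Hypothetical maximum amplitudes at one site, per scenario and source region:
db = {
    "Eastern Aleutians": [1.8, 2.4, 3.1],
    "Kuril Islands": [1.1, 2.0],
    "South America": [0.9, 1.6, 2.2],
}
region, amps = worst_source_region(db)
```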

  14. Geological, geomechanical and geostatistical assessment of rockfall hazard in San Quirico Village (Abruzzo, Italy)

    NASA Astrophysics Data System (ADS)

    Chiessi, Vittorio; D'Orefice, Maurizio; Scarascia Mugnozza, Gabriele; Vitale, Valerio; Cannese, Christian

    2010-07-01

    This paper describes the results of a rockfall hazard assessment for the village of San Quirico (Abruzzo region, Italy) based on an engineering-geological model. After the collection of geological, geomechanical, and geomorphological data, the rockfall hazard assessment was performed using two separate approaches: i) simulation of the detachment of rock blocks and their downhill movement using a GIS; and ii) application of geostatistical techniques to the analysis of georeferenced observations of previously fallen blocks, in order to assess the probability of arrival of blocks from potential future collapses. The results show that the trajectographic analysis is significantly influenced by the input parameters, with particular reference to the coefficients of restitution. To address this problem, the model was calibrated against repeated field observations. The geostatistical approach is useful because it gives the best estimation of point-source phenomena such as rockfalls; however, the sensitivity of the results to basic assumptions, e.g. the assessment of variograms and the choice of a threshold value, may be problematic. Consequently, interpolations derived from different variograms were computed and compared, and those showing the lowest errors were adopted. The data sets that were statistically analysed relate to both kinetic energy and surveyed rock blocks in the accumulation area. The obtained maps highlight areas susceptible to rock block arrivals, and show that the area accommodating the new settlement of S. Quirico Village has the highest level of hazard according to both probabilistic and deterministic methods.
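
The variogram assessment step mentioned above rests on the empirical semivariance. A minimal sketch for a single lag, using hypothetical point coordinates and values rather than the San Quirico data:

```python
import math

def empirical_semivariance(points, values, lag, tol):
    """Empirical semivariance at one lag distance:
    gamma(h) = (1 / 2N) * sum (z_i - z_j)^2 over the N point pairs whose
    separation distance falls within lag +/- tol."""
    sq_diffs = []
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            d = math.dist(points[i], points[j])
            if abs(d - lag) <= tol:
                sq_diffs.append((values[i] - values[j]) ** 2)
    return sum(sq_diffs) / (2 * len(sq_diffs)) if sq_diffs else float("nan")

# Hypothetical observation points (e.g. block locations) and a measured value:
pts = [(0, 0), (1, 0), (2, 0), (3, 0)]
z = [1.0, 1.5, 2.5, 2.0]
gamma_1 = empirical_semivariance(pts, z, lag=1.0, tol=0.1)
```

Repeating this over a set of lags yields the experimental variogram, to which candidate model variograms are fitted and compared, as in the study.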

  15. Impact of earthquake source complexity and land elevation data resolution on tsunami hazard assessment and fatality estimation

    NASA Astrophysics Data System (ADS)

    Muhammad, Ario; Goda, Katsuichiro

    2018-03-01

    This study investigates the impact of model complexity in source characterization and digital elevation model (DEM) resolution on the accuracy of tsunami hazard assessment and fatality estimation through a case study in Padang, Indonesia. Two types of earthquake source models, i.e. complex and uniform slip models, are adopted by considering three resolutions of DEMs, i.e. 150 m, 50 m, and 10 m. For each of the three grid resolutions, 300 complex source models are generated using new statistical prediction models of earthquake source parameters developed from extensive finite-fault models of past subduction earthquakes, whilst 100 uniform slip models are constructed with variable fault geometry without slip heterogeneity. The results highlight that significant changes to tsunami hazard and fatality estimates are observed with regard to earthquake source complexity and grid resolution. Coarse resolution (i.e. 150 m) leads to inaccurate tsunami hazard prediction and fatality estimation, whilst 50-m and 10-m resolutions produce similar results. However, velocity and momentum flux are sensitive to the grid resolution and hence, at least 10-m grid resolution needs to be implemented when considering flow-based parameters for tsunami hazard and risk assessments. In addition, the results indicate that the tsunami hazard parameters and fatality number are more sensitive to the complexity of earthquake source characterization than the grid resolution. Thus, the uniform models are not recommended for probabilistic tsunami hazard and risk assessments. Finally, the findings confirm that uncertainties of tsunami hazard level and fatality in terms of depth, velocity and momentum flux can be captured and visualized through the complex source modeling approach. From tsunami risk management perspectives, this indeed creates big data, which are useful for making effective and robust decisions.

  16. Quantitative Procedures for the Assessment of Quality in Higher Education Institutions.

    ERIC Educational Resources Information Center

    Moran, Tom; Rowse, Glenwood

    The development of procedures designed to provide quantitative assessments of quality in higher education institutions are reviewed. These procedures employ a systems framework and utilize quantitative data to compare institutions or programs of similar types with one another. Three major elements essential in the development of models focusing on…

  17. Assessment of existing and potential landslide hazards resulting from the April 25, 2015 Gorkha, Nepal earthquake sequence

    USGS Publications Warehouse

    Collins, Brian D.; Jibson, Randall W.

    2015-07-28

    This report provides a detailed account of assessments performed in May and June 2015 and focuses on valley-blocking landslides because they have the potential to pose considerable hazard to many villages in Nepal. First, we provide a seismological background of Nepal and then detail the methods used for both external and in-country data collection and interpretation. Our results consist of an overview of landsliding extent, a characterization of all valley-blocking landslides identified during our work, and a description of video resources that provide high resolution coverage of approximately 1,000 kilometers (km) of river valleys and surrounding terrain affected by the Gorkha earthquake sequence. This is followed by a description of site-specific landslide-hazard assessments conducted while in Nepal and includes detailed descriptions of five noteworthy case studies. Finally, we assess the expectation for additional landslide hazards during the 2015 summer monsoon season.

  18. Integrated flood hazard assessment based on spatial ordered weighted averaging method considering spatial heterogeneity of risk preference.

    PubMed

    Xiao, Yangfan; Yi, Shanzhen; Tang, Zhongqian

    2017-12-01

    Flood is the most common natural hazard in the world and has caused serious loss of life and property. Assessment of flood-prone areas is of great importance for watershed management and for reducing potential losses of life and property. In this study, a framework of multi-criteria analysis (MCA) incorporating a geographic information system (GIS), the fuzzy analytic hierarchy process (AHP) and the spatial ordered weighted averaging (OWA) method was developed for flood hazard assessment. Factors associated with the geographical, hydrological and flood-resistance characteristics of the basin were selected as evaluation criteria. The relative importance of the criteria was estimated through the fuzzy AHP method. The OWA method was utilized to analyze the effects of different risk attitudes of the decision maker on the assessment result. The spatial OWA method, with spatially variable risk preference, was implemented in the GIS environment to integrate the criteria. The advantage of the proposed method is that it considers spatial heterogeneity in assigning risk preference in the decision-making process. The presented methodology has been applied to an area comprising Hanyang, Caidian and Hannan in Wuhan, China, where flood events occur frequently. The resulting flood hazard distribution shows a tendency towards high risk in populated and developed areas, especially the northeastern part of Hanyang, which has suffered frequent floods in history. The result indicates where enhancement projects should be carried out first under conditions of limited resources. Finally, the sensitivity of the criteria weights was analyzed to measure the stability of the results with respect to variation of the weights. The flood hazard assessment method presented in this paper is adaptable to hazard assessment of similar basins, which is of great significance for establishing countermeasures to mitigate losses of life and property. Copyright © 2017 Elsevier B.V. All rights reserved.
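
The OWA operator at the core of the method above is compact: order weights attach to the rank of each criterion score rather than to a specific criterion, so shifting weight toward the top ranks encodes a pessimistic (risk-averse) attitude. A minimal sketch with illustrative hazard scores:

```python
def owa(values, order_weights):
    """Ordered weighted averaging: sort the criterion scores in descending
    order and apply the j-th order weight to the j-th largest score.
    Order weights must sum to 1."""
    assert abs(sum(order_weights) - 1.0) < 1e-9
    ranked = sorted(values, reverse=True)
    return sum(w * v for w, v in zip(order_weights, ranked))

scores = [0.2, 0.9, 0.5]  # hypothetical normalized criterion scores for one cell
pessimistic = owa(scores, [0.7, 0.2, 0.1])  # weight on the worst (highest) scores
optimistic = owa(scores, [0.1, 0.2, 0.7])   # weight on the best (lowest) scores
```

Making the order-weight vector vary from cell to cell is what the abstract calls spatially variable risk preference.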

  19. Simulation-Based Probabilistic Tsunami Hazard Analysis: Empirical and Robust Hazard Predictions

    NASA Astrophysics Data System (ADS)

    De Risi, Raffaele; Goda, Katsuichiro

    2017-08-01

    Probabilistic tsunami hazard analysis (PTHA) is the prerequisite for rigorous risk assessment and thus for decision-making regarding risk mitigation strategies. This paper proposes a new simulation-based methodology for tsunami hazard assessment for a specific site of an engineering project along the coast, or, more broadly, for a wider tsunami-prone region. The methodology incorporates numerous uncertain parameters that are related to geophysical processes by adopting new scaling relationships for tsunamigenic seismic regions. Through the proposed methodology it is possible to obtain either a tsunami hazard curve for a single location, that is the representation of a tsunami intensity measure (such as inundation depth) versus its mean annual rate of occurrence, or tsunami hazard maps, representing the expected tsunami intensity measures within a geographical area, for a specific probability of occurrence in a given time window. In addition to the conventional tsunami hazard curve that is based on an empirical statistical representation of the simulation-based PTHA results, this study presents a robust tsunami hazard curve, which is based on a Bayesian fitting methodology. The robust approach allows a significant reduction of the number of simulations and, therefore, a reduction of the computational effort. Both methods produce a central estimate of the hazard as well as a confidence interval, facilitating the rigorous quantification of the hazard uncertainties.
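
The empirical (non-Bayesian) form of the hazard curve described above can be sketched as a rate-weighted exceedance count over simulated intensity measures. The depths and event rate below are illustrative values, not results from the paper:

```python
def empirical_hazard_curve(intensities, event_rate, thresholds):
    """Mean annual rate of exceedance for each intensity threshold, estimated
    from simulated tsunami intensity measures (e.g. inundation depths, m).
    event_rate: mean annual rate of the simulated tsunamigenic events."""
    n = len(intensities)
    return [event_rate * sum(1 for x in intensities if x > t) / n
            for t in thresholds]

# Hypothetical simulated inundation depths at one site, 8 scenarios:
depths = [0.1, 0.4, 0.4, 1.2, 2.5, 0.0, 0.8, 3.1]
curve = empirical_hazard_curve(depths, event_rate=0.1, thresholds=[0.5, 1.0, 2.0])
```

The robust (Bayesian-fitted) curve in the paper replaces this raw empirical estimate with a fitted model, reducing the number of simulations required.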

  20. Considering the ranges of uncertainties in the New Probabilistic Seismic Hazard Assessment of Germany - Version 2016

    NASA Astrophysics Data System (ADS)

    Grunthal, Gottfried; Stromeyer, Dietrich; Bosse, Christian; Cotton, Fabrice; Bindi, Dino

    2017-04-01

    The seismic load parameters for the upcoming National Annex to Eurocode 8 result from the reassessment of seismic hazard supported by the German Institution for Civil Engineering. This 2016 version of the hazard assessment for Germany as the target area was based on a comprehensive incorporation of all accessible uncertainties in models and parameters, and on the provision of a rational framework for treating these uncertainties in a transparent way. The developed seismic hazard model represents significant improvements: it is based on updated and extended databases, comprehensive ranges of models, robust methods and a selection of a set of ground motion prediction equations of the latest generation. The output specifications were designed according to user-oriented needs as suggested by the two review teams supervising the entire project. In particular, seismic load parameters were calculated for rock conditions with a vS30 of 800 m/s for three hazard levels (10%, 5% and 2% probability of occurrence or exceedance within 50 years), in the form of, e.g., uniform hazard spectra (UHS) based on 19 spectral periods in the range of 0.01-3 s, and seismic hazard maps of spectral response accelerations for different spectral periods or of macroseismic intensities. The developed hazard model consists of a logic tree with 4040 end branches and essential innovations employed to capture epistemic uncertainties and aleatory variabilities. The computation scheme enables the sound calculation of the mean and any quantile of the required seismic load parameters. Mean, median and 84th percentiles of the load parameters were provided together with the full calculation model to clearly illustrate the uncertainties of such a probabilistic assessment for a region of low-to-moderate seismicity. The regional variations of these uncertainties (e.g. ratios between the mean and median hazard estimates) were analyzed and discussed.
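
Computing the mean and quantiles of a load parameter over logic-tree end branches reduces to weighted statistics over branch results. A minimal sketch with hypothetical branch values (not results from the 2016 German model):

```python
def weighted_mean(values, weights):
    """Branch-weight-weighted mean of a hazard parameter."""
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)

def weighted_quantile(values, weights, q):
    """Quantile over logic-tree branches: sort branch values and return the
    first value whose cumulative normalized branch weight reaches q."""
    total = sum(weights)
    cum = 0.0
    pairs = sorted(zip(values, weights))
    for v, w in pairs:
        cum += w / total
        if cum >= q:
            return v
    return pairs[-1][0]

# Hypothetical PGA estimates (g) from four end branches with their weights:
pga = [0.08, 0.10, 0.12, 0.20]
wts = [0.2, 0.4, 0.3, 0.1]
mean = weighted_mean(pga, wts)            # mean hazard estimate
median = weighted_quantile(pga, wts, 0.5) # median (50th percentile)
p84 = weighted_quantile(pga, wts, 0.84)   # 84th percentile
```

The mean/median ratio computed this way per site is the kind of uncertainty measure whose regional variation the abstract discusses.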

  1. Natural hazard risk assessment and management in the Matter valley, Swiss Alps

    NASA Astrophysics Data System (ADS)

    Herz, T.; King, L.; Philippi, S.

    2003-04-01

    The Matter valley has a length of about 40 km and is surrounded by some of the highest peaks of the Alps resulting in extreme altitudinal differences and a continental character of the climate. These climatic conditions cause a high glacier equilibrium line and therefore a periglacial belt of a large vertical extend. Due to the high relief energy, all kinds of natural hazards typical for high mountain environments occur. The steep western slopes are dominated by rockfalls, slope instabilities in bedrock and avalanches. A widespread cover of unconsolidated sediments on the eastern slopes induces landslides and debris flows, which often reach down to the valley bottom where they can dam up the river. Increasing population and modern land use forms required a more and more sensitive attitude towards natural hazard potentials in this endangered area. Assessment and management of natural hazard risks have been much improved during the last fifteen years and increasing amounts of money are spent each year in order to safeguard settlements, traffic lines, and other objects of the technical infrastructure. Numerous investigations concerning natural hazard risks have been made and the results are considered in the actual land use planning of the Canton. The planning law of the Canton Valais defines risk zones as areas, which are endangered by natural hazards like avalanches, rockfalls, landslides and floodings. Risk assessment is done by overview maps (scale 1:25,000) which are specified by detailed risk analyses consisting of registers and detailed maps (scale 1:2,000 to 1:10,000). These analyses are integrated in the land zoning by defining zones of high, medium and low danger, associated with corresponding prohibitions, restrictions and conditions for utilisation. At present, the incorporation of the avalanche and rockfall register in local zoning plans is completed in most communities of the Canton Valais. An additional inventory of 200 slope instabilities was

  2. Assessment of Three Flood Hazard Mapping Methods: A Case Study of Perlis

    NASA Astrophysics Data System (ADS)

    Azizat, Nazirah; Omar, Wan Mohd Sabki Wan

    2018-03-01

    Flood is a common natural disaster that affects all states in Malaysia. According to the Drainage and Irrigation Department (DID), in 2007 about 29,270 km2, or 9 percent of the area of the country, was prone to flooding. Floods can be devastating catastrophes affecting people, the economy and the environment. Flood hazard mapping is an important part of flood assessment, used to define high-risk areas prone to flooding. The purposes of this study are to prepare a flood hazard map of Perlis and to evaluate flood hazard using the frequency ratio, statistical index and Poisson methods. Six factors affecting the occurrence of flood, including elevation, distance from the drainage network, rainfall, soil texture, geology and erosion, were prepared using ArcGIS 10.1 software. The flood location map in this study was generated from areas flooded in 2010, based on DID records. These parameters and the flood location map were analysed to prepare a flood hazard map representing the probability of flooding. The results of the analysis were verified using flood location data for 2013, 2014 and 2015. The comparison showed that the statistical index method predicts flood areas better than the frequency ratio and Poisson methods.
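
Of the three methods compared, the frequency ratio is the simplest to state: for each class of a conditioning factor, it is the share of flooded cells falling in that class divided by the share of the study area that the class occupies. A minimal sketch with hypothetical cell counts:

```python
def frequency_ratio(flooded_in_class, total_flooded, class_cells, total_cells):
    """FR = (share of flooded cells in a factor class) /
            (share of the study area occupied by that class).
    FR > 1 means the class is over-represented among flooded cells."""
    return (flooded_in_class / total_flooded) / (class_cells / total_cells)

# Hypothetical low-elevation class: 60 of 100 flooded cells,
# occupying 2,000 of 10,000 study-area cells (20%):
fr_low = frequency_ratio(60, 100, 2000, 10000)  # ~3.0, strongly flood-associated
```

Summing the FR values of the classes a cell falls into, across all six factors, gives the cell's hazard index in this method.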

  3. [Research progress and development trend of quantitative assessment techniques for urban thermal environment].

    PubMed

    Sun, Tie Gang; Xiao, Rong Bo; Cai, Yun Nan; Wang, Yao Wu; Wu, Chang Guang

    2016-08-01

    Quantitative assessment of the urban thermal environment has been a focus of urban climate and environmental science since the concept of the urban heat island was proposed. With the continual development of spatial information and computer simulation technology, substantial progress has been made in quantitative assessment techniques and methods for the urban thermal environment. These techniques have developed from statistical analysis of the urban-scale thermal environment based on historical weather-station data towards dynamic simulation and forecasting of the thermal environment at various scales. This study reviews the progress of ground meteorological observation, thermal infrared remote sensing and numerical simulation. The potential advantages and disadvantages, applicability and development trends of these techniques are also summarized, aiming to provide fundamental knowledge for understanding urban thermal environment assessment and optimization.

  4. Rockfall hazard assessment integrating probabilistic physically based rockfall source detection (Norddal municipality, Norway).

    NASA Astrophysics Data System (ADS)

    Yugsi Molina, F. X.; Oppikofer, T.; Fischer, L.; Hermanns, R. L.; Taurisano, A.

    2012-04-01

    Traditional techniques to assess rockfall hazard are partially based on probabilistic analysis. Stochastic methods have been used for run-out analysis of rock blocks to estimate the trajectories that a detached block will follow during its fall until it stops due to kinetic energy loss. However, rockfall source areas are usually defined either by multivariate analysis or by field observations; in either case, a physically based approach is not used for source area detection. We present an example of rockfall hazard assessment that integrates a probabilistic rockfall run-out analysis with a stochastic assessment of the rockfall source areas using kinematic stability analysis in a GIS environment. The method has been tested on a steep, more than 200 m high rock wall in the municipality of Norddal (Møre og Romsdal county, Norway), where a large number of people are exposed to snow avalanches, rockfalls or debris flows. The area was selected following the recently published hazard mapping plan of Norway. The cliff is formed by medium- to coarse-grained quartz-dioritic to granitic gneisses of Proterozoic age. Scree deposits, the product of recent rockfall activity, are found at the bottom of the rock wall. Large blocks can be found several tens of meters away from the cliff in Sylte, the main locality in the Norddal municipality. Structural characterization of the rock wall was done using terrestrial laser scanning (TLS) point clouds in the software Coltop3D (www.terranum.ch), and the results were validated with field data. Orientation data sets from the structural characterization were analyzed separately to assess best-fit probability density functions (PDFs) for both the dip angle and the dip direction of each discontinuity set. A GIS-based stochastic kinematic analysis was then carried out using the discontinuity set orientations and the friction angle as random variables.
An airborne laser scanning digital elevation model (ALS-DEM) with 1 m
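A stochastic kinematic test of the kind described above can be sketched as follows. This is an illustrative Monte Carlo planar-sliding check, not the authors' implementation; all distribution parameters and the ±20° daylighting tolerance are assumptions.

```python
import random

# Monte Carlo kinematic test for planar sliding: a sampled discontinuity
# is kinematically feasible when its dip exceeds the sampled friction
# angle, is shallower than the slope face, and its dip direction lies
# within ~20 degrees of the slope aspect. Orientation and friction-angle
# PDFs below are invented for illustration.

def planar_failure_probability(slope_dip, slope_aspect, n=20000, seed=1):
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        dip = rng.gauss(55, 8)         # assumed discontinuity dip PDF (deg)
        dip_dir = rng.gauss(110, 15)   # assumed dip-direction PDF (deg)
        phi = rng.gauss(30, 3)         # assumed friction-angle PDF (deg)
        # angular difference wrapped to [-180, 180]
        daylights = abs((dip_dir - slope_aspect + 180) % 360 - 180) <= 20
        if daylights and phi < dip < slope_dip:
            hits += 1
    return hits / n

# per-cell failure probability for a hypothetical slope facet
p = planar_failure_probability(slope_dip=70, slope_aspect=100)
```

Run per DEM cell, such a test yields the probabilistic source-area map that feeds the run-out analysis.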

  5. Technical Guidance for Hazardous Analysis, Emergency Planning for Extremely Hazardous Substances

    EPA Pesticide Factsheets

    This guide supplements NRT-1 by providing technical assistance to LEPCs in assessing the lethal hazards related to potential airborne releases of extremely hazardous substances (EHSs) as designated under Section 302 of Title III of SARA.

  6. Assessment of Uncertainties Related to Seismic Hazard Using Fuzzy Analysis

    NASA Astrophysics Data System (ADS)

    Jorjiashvili, N.; Yokoi, T.; Javakhishvili, Z.

    2013-05-01

    Seismic hazard analysis has become a very important issue over the last few decades. New technologies and improved data availability have helped many scientists to understand where and why earthquakes happen, the physics of earthquakes, and so on, and to recognize the role of uncertainty in seismic hazard analysis. However, how to handle this uncertainty remains a significant problem, and the same lack of information makes it difficult to quantify uncertainty accurately. Attenuation curves are usually obtained statistically, by regression analysis, and statistical and probabilistic analyses show overlapping results for the site coefficients. This overlap occurs not only at the border between two neighboring classes but also among more than three classes. Although the analysis starts by classifying sites in geological terms, the resulting site coefficients are not cleanly separated by class. In the present study, this problem is addressed using fuzzy set theory: with membership functions, the ambiguity at the border between neighboring classes can be avoided. Fuzzy set theory is applied to southern California alongside the conventional approach, and the standard deviations that show the variation within each site class obtained by fuzzy set theory and by the classical approach are compared. The results show that, when data for hazard assessment are insufficient, site classification based on fuzzy set theory yields smaller standard deviations than the classical approach, which is direct evidence of reduced uncertainty.
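The membership-function idea can be sketched as follows. This is a generic fuzzy-classification illustration, not the study's actual functions; the Vs30-based class breakpoints are invented for the example.

```python
# Triangular membership functions over a site parameter (here a
# hypothetical Vs30 in m/s) let a site near a class boundary belong
# partially to two classes instead of being forced into one -- this is
# how fuzzy sets remove the hard border between neighbouring classes.

def triangular(x, a, b, c):
    """Membership rising from a to a peak at b, falling to zero at c."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def site_memberships(vs30):
    # breakpoints are assumptions for illustration only
    return {
        "soft soil":  triangular(vs30, 100, 180, 360),
        "stiff soil": triangular(vs30, 180, 360, 760),
        "rock":       triangular(vs30, 360, 760, 1500),
    }

m = site_memberships(300)   # a site near the soft/stiff boundary
# the site belongs partly to "soft soil" and partly to "stiff soil"
```

Site coefficients weighted by these memberships, rather than by a hard class label, are what reduce the spread within each class.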

  7. Skin exposure: Assessing the hazard in the workplace

    NASA Technical Reports Server (NTRS)

    Cummins, Kevin

    1994-01-01

    An outline of the Occupational Safety and Health Administration's concerns about skin exposure to hazardous chemicals is presented, followed by the corresponding slide narrations. Specifically, dermatitis and skin absorption as compared to lung absorption are addressed. Lung versus skin exposure is examined for glycol ethers and acrylamide. Examples of skin exposure include PCBs in transformers, toluene and xylene from autobody work, polynuclear aromatics (PNAs) among coke oven workers, toluene diisocyanate (TDI), and occupational chemical exposures in an academic medical center. Permeation through gloves in the semiconductor industry is addressed as evidence of the need to assess the effectiveness of PPE (Personal Protective Equipment). This leads to the revisions of the PPE standard and the Safety and Health Program standard.

  8. Web processing service for landslide hazard assessment

    NASA Astrophysics Data System (ADS)

    Sandric, I.; Ursaru, P.; Chitu, D.; Mihai, B.; Savulescu, I.

    2012-04-01

    Hazard analysis requires heavy computation and specialized software, and web processing services can offer complex solutions accessible through a light client (web or desktop). This paper presents a web processing service (both WPS and Esri Geoprocessing Service) for landslide hazard assessment. The service was built with the Esri ArcGIS Server solution and Python, developed using ArcPy, GDAL Python and NumPy. A complex model for landslide hazard analysis, combining predisposing and triggering factors in a Bayesian temporal network with uncertainty propagation, was built and published as a WPS and Geoprocessing service using ArcGIS Standard Enterprise 10.1. The model uses as predisposing factors the first and second derivatives of the DEM, effective precipitation, runoff, lithology and land use. All these parameters can be served to the client from other WFS services or by uploading and processing the data on the server. The user can choose to have the first and second derivatives of the DEM created automatically on the server or to upload already-calculated data. One of the main dynamic factors in the landslide analysis model is the leaf area index (LAI). The LAI offers the advantage of modelling not just changes between periods expressed in years, but also seasonal changes in land use throughout a year. The LAI can be derived from various satellite images or downloaded as a product; uploading such time series is possible using the NetCDF file format. The model is run at a monthly time step, and for each time step all parameter values and the a priori, conditional and posterior probabilities are obtained and stored in a log file. The validation process uses landslides that occurred during the period up to the active time step and checks the recorded probabilities and parameter values for those time steps against the values of the active time step. Each time a landslide has been positive

  9. The Impact Hazard in the Context of Other Natural Hazards and Predictive Science

    NASA Astrophysics Data System (ADS)

    Chapman, C. R.

    1998-09-01

    The hazard due to impact of asteroids and comets has been recognized as analogous, in some ways, to other infrequent but consequential natural hazards (e.g. floods and earthquakes). Yet, until recently, astronomers and space agencies have felt no need to do what their colleagues and analogous agencies must do in order to assess, quantify, and communicate predictions to those with a practical interest in them (e.g. public officials who must assess the threats, prepare for mitigation, etc.). Recent heightened public interest in the impact hazard, combined with increasing numbers of "near misses" (certain to increase as Spaceguard is implemented), requires that astronomers accept the responsibility to frame their predictions and assessments in terms that can be appropriately considered. I will report on preliminary results of a multi-year GSA/NCAR study of "Prediction in the Earth Sciences: Use and Misuse in Policy Making", in which I have represented the impact hazard while others have treated earthquakes, floods, weather, global climate change, nuclear waste disposal, acid rain, etc. The impact hazard is an end-member example of a natural hazard, helping those dealing with more prosaic issues to learn from an extreme. On the other hand, I bring to the astronomical community some lessons long adopted in other fields: the need to understand the policy purposes of impact predictions, the need to assess potential societal impacts, the requirement to assess prediction uncertainties very carefully, considerations of potential public uses of the predictions, awareness of ethical considerations (e.g. conflicts of interest) that affect predictions and their acceptance, awareness of appropriate means for publicly communicating predictions, and considerations of the international context (especially for a hazard that knows no national boundaries).

  10. A Guide to the Application of Probability Risk Assessment Methodology and Hazard Risk Frequency Criteria as a Hazard Control for the Use of the Mobile Servicing System on the International Space Station

    NASA Astrophysics Data System (ADS)

    D'silva, Oneil; Kerrison, Roger

    2013-09-01

    A key goal for the increased utilization of space robotics is to automate extra-vehicular manned space activities, significantly reducing the potential for catastrophic hazards while simultaneously minimizing the overall costs associated with manned spaceflight. The principal scope of the paper is to evaluate the use of industry-accepted probabilistic risk/safety assessment (PRA/PSA) methodologies and hazard risk frequency criteria as a hazard control. The paper illustrates how the selected probabilistic risk assessment methodology and hazard risk frequency criteria can be combined to apply the safety controls necessary to allow increased use of the Mobile Servicing System (MSS) robotic system on the International Space Station. It considers factors such as component failure rate reliability, software reliability, periods of operation and dormancy, and fault tree analyses, and their effects on the probabilistic risk assessments. The paper concludes with suggestions for incorporating existing industry risk/safety plans into an applicable safety process for future activities and programs.

  11. Relationship between Plaque Echo, Thickness and Neovascularization Assessed by Quantitative and Semi-quantitative Contrast-Enhanced Ultrasonography in Different Stenosis Groups.

    PubMed

    Song, Yan; Feng, Jun; Dang, Ying; Zhao, Chao; Zheng, Jie; Ruan, Litao

    2017-12-01

    The aim of this study was to determine the relationship between plaque echo, thickness and neovascularization in different stenosis groups using quantitative and semi-quantitative contrast-enhanced ultrasound (CEUS) in patients with carotid atherosclerosis plaque. A total of 224 plaques were divided into mild stenosis (<50%; 135 plaques, 60.27%), moderate stenosis (50%-69%; 39 plaques, 17.41%) and severe stenosis (70%-99%; 50 plaques, 22.32%) groups. Quantitative and semi-quantitative methods were used to assess plaque neovascularization and determine its relationship with plaque echo and thickness. Correlation analysis revealed no relationship of neovascularization with plaque echo in any group using either the quantitative or the semi-quantitative method. Furthermore, there was no correlation of neovascularization with plaque thickness using the semi-quantitative method. The ratio of areas under the curve (RAUC) was negatively correlated with plaque thickness (r = -0.317, p = 0.001) in the mild stenosis group. With the quartile method, plaque thickness of the mild stenosis group was divided into four groups, with significant differences between the 1.5-2.2 mm and ≥3.5 mm groups (p = 0.002), the 2.3-2.8 mm and ≥3.5 mm groups (p < 0.001) and the 2.9-3.4 mm and ≥3.5 mm groups (p < 0.001). Both semi-quantitative and quantitative CEUS methods characterizing plaque neovascularization are equivalent with respect to assessing relationships between neovascularization, echogenicity and thickness. However, the quantitative method can fail for plaques <3.5 mm because of motion artifacts.

  12. The Use of Quantitative SPECT/CT Imaging to Assess Residual Limb Health

    DTIC Science & Technology

    2016-10-01

    AWARD NUMBER: W81XWH-15-1-0669. TITLE: The Use of Quantitative SPECT/CT Imaging to Assess Residual Limb Health. DATES COVERED: 30 Sep 2015 - 29 Sep 2016. The project evaluates the utility of non-invasive imaging for assessing the impact of next-generation socket technologies on the health of the residual limb following amputation.

  13. Why aftershock duration matters for probabilistic seismic hazard assessment

    USGS Publications Warehouse

    Toda, Shinji; Stein, Ross S.

    2018-01-01

    Most hazard assessments assume that high background seismicity rates indicate a higher probability of large shocks and, therefore, of strong shaking. However, in slowly deforming regions, such as eastern North America, Australia, and inner Honshu, this assumption breaks down if the seismicity clusters are instead aftershocks of historic and prehistoric mainshocks. Here, therefore, we probe the circumstances under which aftershocks can last for 100–1000 years. Basham and Adams (1983) and Ebel et al. (2000) proposed that intraplate seismicity in eastern North America could consist of aftershocks of mainshocks that struck hundreds of years beforehand, a view consonant with rate–state friction (Dieterich, 1994), in which aftershock duration varies inversely with fault-stressing rate. To test these hypotheses, we estimate aftershock durations of the 2011 Mw 9 Tohoku-Oki rupture at 12 sites up to 250 km from the source, as well as for the near-fault aftershocks of eight large Japanese mainshocks, sampling faults slipping 0.01 to 80 mm/yr. Whereas aftershock productivity increases with mainshock magnitude, we find that aftershock duration, the time until the aftershock rate decays to the premainshock rate, does not. Instead, aftershock sequences lasted a month on the fastest-slipping faults and are projected to persist for more than 2000 years on the slowest. Thus, long aftershock sequences can misguide and inflate hazard assessments in intraplate regions if misinterpreted as background seismicity, whereas areas between seismicity clusters may instead harbor a higher chance of large mainshocks, the opposite of what is assumed today.
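The inverse scaling between aftershock duration and stressing rate in rate-state friction can be sketched as follows. In Dieterich's (1994) framework the duration is t_a = Aσ / τ̇; the Aσ value and the slip-rate-to-stressing-rate conversion below are assumed round numbers for illustration, not values from the paper.

```python
# Rate-state aftershock duration: t_a = A*sigma / tau_dot, so faults
# loaded 8000x more slowly host aftershock sequences 8000x longer.
# a_sigma and the stiffness factor are illustrative assumptions.

def aftershock_duration_years(slip_rate_mm_yr, a_sigma_mpa=0.2,
                              stiffness_mpa_per_mm=0.01):
    """Duration t_a = A*sigma / tau_dot, with tau_dot from the slip rate."""
    stressing_rate = slip_rate_mm_yr * stiffness_mpa_per_mm  # MPa/yr
    return a_sigma_mpa / stressing_rate

fast = aftershock_duration_years(80)    # fast-slipping fault: ~3 months
slow = aftershock_duration_years(0.01)  # intraplate fault: ~2000 years
```

With these assumed numbers the sketch reproduces the abstract's qualitative contrast: months-long sequences on the fastest faults versus millennia on the slowest.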

  14. Validation Of TRMM For Hazard Assessment In The Remote Context Of Tropical Africa

    NASA Astrophysics Data System (ADS)

    Monsieurs, E.; Kirschbaum, D.; Tan, J.; Jacobs, L.; Kervyn, M.; Demoulin, A.; Dewitte, O.

    2017-12-01

    Accurate rainfall data are fundamental for understanding and mitigating the disastrous effects of many rainfall-triggered hazards, especially given the challenges arising from climate change and rainfall variability. In tropical Africa in particular, the sparse operational rainfall gauging network hampers the ability to understand these hazards, so satellite rainfall estimates (SRE) can be of great value. Yet rigorous validation is required to identify the uncertainties of using SRE for hazard applications. We evaluated the Tropical Rainfall Measuring Mission (TRMM) Multi-satellite Precipitation Analysis (TMPA) 3B42 Research Derived Daily Product from 1998 to 2017, at 0.25° x 0.25° spatial and 24 h temporal resolution. The validation was done over the western branch of the East African Rift, with regional landslide hazard assessment in mind. Even though we collected an unprecedented dataset of 47 gauges with a minimum temporal resolution of 24 h, the sparse and heterogeneous temporal coverage in a region with high rainfall variability poses challenges for validation. In addition, the discrepancy between local-scale gauge data and spatially averaged (about 775 km²) TMPA data in the context of local convective storms and orographic rainfall is a crucial source of uncertainty. We adopted a flexible framework for SRE validation that fosters explorative research in a remote context. Results show that TMPA performs reasonably well during the rainy seasons for rainfall intensities <20 mm/day. TMPA systematically underestimates rainfall, but most problematic is the decreasing probability of detection of high-intensity rainfall. We suggest that landslide hazard might be efficiently assessed if we take account of the systematic biases in TMPA data and determine rainfall thresholds modulated by the controls on, and uncertainties of, TMPA revealed in this study. Moreover, it is found relevant in mapping regional-scale rainfall

  15. [A preliminary mapping methodology for occupational hazards and biomechanical risk evaluation: presentation of a simple, computerized tool kit for ergonomic hazards identification and risk assessment].

    PubMed

    Colombini, Daniela; Occhipinti, E; Di Leone, G

    2011-01-01

    During the last Congress of the International Ergonomics Association (IEA), Beijing, August 2009, an international group was founded with the task of developing a "toolkit for MSD prevention" under the IEA and in collaboration with the World Health Organization. The possible users of such toolkits are: members of health and safety committees; health and safety representatives; line supervisors; foremen; workers; government representatives; health workers providing basic occupational health services; and occupational health and safety specialists. In accordance with the ISO 11228 standard series and the new Draft CD ISO 12259-2009 (application document guides for the potential user), our group developed a preliminary "mapping" methodology for occupational hazards in the craft industry, supported by software (Excel). The proposed methodology, using specific key enters and quick assessment criteria, allows simple ergonomic hazard identification and risk estimation. It is thus possible to decide for which occupational hazards a more exhaustive risk assessment is necessary and which occupational consultant should be involved (occupational physician, safety engineer, industrial hygienist, etc.).

  16. Automated Hazard Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Riddle, F. J.

    2003-06-26

    The Automated Hazard Analysis (AHA) application is a software tool used to conduct job hazard screening and analysis of tasks to be performed in Savannah River Site facilities. The AHA application provides a systematic approach to the assessment of safety and environmental hazards associated with specific tasks, and the identification of controls, regulations, and other requirements needed to perform those tasks safely. AHA is to be integrated into existing Savannah River Site work control and job hazard analysis processes. Utilization of AHA will improve the consistency and completeness of hazard screening and analysis, and increase the effectiveness of the work planning process.

  17. Seismic Hazard and risk assessment for Romania -Bulgaria cross-border region

    NASA Astrophysics Data System (ADS)

    Simeonova, Stela; Solakov, Dimcho; Alexandrova, Irena; Vaseva, Elena; Trifonova, Petya; Raykova, Plamena

    2016-04-01

    Among the many kinds of natural and man-made disasters, earthquakes dominate with regard to their social and economic impact on the urban environment. Global seismic hazard and vulnerability to earthquakes are steadily increasing as urbanization and development occupy more areas prone to the effects of strong earthquakes. The assessment of seismic hazard and risk is particularly important because it provides valuable information for seismic safety and disaster mitigation, and it supports decision making for the benefit of society. Romania and Bulgaria, situated in the Balkan Region as part of the Alpine-Himalayan seismic belt, are characterized by high seismicity and are exposed to high seismic risk. Over the centuries, both countries have experienced strong earthquakes. The cross-border region encompassing northern Bulgaria and southern Romania is a territory prone to the effects of strong earthquakes. The area is significantly affected by earthquakes occurring in both countries: on the one hand, events generated by the Vrancea intermediate-depth seismic source in Romania, and on the other hand, crustal seismicity originating in the Shabla (SHB), Dulovo and Gorna Orjahovitza (GO) seismic sources in Bulgaria. The Vrancea seismogenic zone of Romania is a very peculiar seismic source, often described as unique in the world, and it represents a major concern for most of the northern part of Bulgaria as well. In the present study, the seismic hazard for the Romania-Bulgaria cross-border region is assessed on the basis of integrated basic geo-datasets. The hazard results are obtained by applying two alternative approaches - probabilistic and deterministic. The MSK64 intensity (the MSK64 scale is practically equal to the new EMS98) is used as the output parameter for the hazard maps. We prefer to use the macroseismic intensity instead of PGA because it is directly related to the degree of damage and, moreover, the epicentral intensity is the original

  18. Lessons from the conviction of the L'Aquila seven: The standard probabilistic earthquake hazard and risk assessment is ineffective

    NASA Astrophysics Data System (ADS)

    Wyss, Max

    2013-04-01

    An earthquake of M6.3 killed 309 people in L'Aquila, Italy, on 6 April 2009. Subsequently, a judge in L'Aquila convicted seven people who had participated in an emergency meeting on 30 March, assessing the probability that a major event would follow the ongoing earthquake swarm. The sentence was six years in prison, a combined fine of 2 million Euros, loss of job, loss of retirement pension, and lawyers' costs. The judge followed the prosecution's accusation that the review by the Commission of Great Risks had conveyed a false sense of security to the population, which consequently did not take its usual precautionary measures before the deadly earthquake. He did not consider the facts that (1) one of the convicted was not a member of the commission and had merely obeyed orders to bring the latest seismological facts to the discussion, (2) another was an engineer not required to have any expertise regarding the probability of earthquakes, and (3) two others were seismologists who were not invited to speak to the public at a TV interview and a press conference. This exaggerated judgment was the consequence of an uproar in a population that felt misinformed and even misled. Faced with a population worried by an earthquake swarm, the head of the Italian Civil Defense is on record ordering that the population be calmed, and the vice head executed this order in a TV interview one hour before the meeting of the Commission by stating "the scientific community continues to tell me that the situation is favorable and that there is a discharge of energy." The first lesson to be learned is that communications to the public about earthquake hazard and risk must not be left in the hands of someone with gross misunderstandings of seismology; they must be carefully prepared by experts. The more significant lesson is that the approach of calming the population and the standard probabilistic hazard and risk assessment, as practiced by GSHAP, are misleading. The latter has been criticized as

  19. Assessing framing assumptions in quantitative health impact assessments: a housing intervention example.

    PubMed

    Mesa-Frias, Marco; Chalabi, Zaid; Foss, Anna M

    2013-09-01

    Health impact assessment (HIA) is often used to determine ex ante the health impact of an environmental policy or an environmental intervention. Underpinning any HIA is the framing assumption, which defines the causal pathways mapping environmental exposures to health outcomes. The sensitivity of the HIA to the framing assumptions is often ignored. A novel method based on fuzzy cognitive maps (FCM) is developed to quantify the framing assumptions in the assessment stage of an HIA, and is then applied to a housing intervention (tightening insulation) as a case study. Framing assumptions of the case study were identified through a literature search of Ovid Medline (1948-2011). The FCM approach was used to identify the key variables that have the most influence in an HIA. Changes in air-tightness, ventilation, indoor air quality and mould/humidity were identified as having the most influence on health. The FCM approach is widely applicable and can be used to inform the formulation of the framing assumptions in any quantitative HIA of environmental interventions. We argue that it is necessary to explore and quantify framing assumptions prior to conducting a detailed quantitative HIA during the assessment stage.
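A fuzzy cognitive map of the kind used above can be sketched as follows. The concepts and edge weights are invented for illustration and are not the case-study's map; only the iteration scheme (squashed weighted sums run to a fixed point) is the FCM technique itself.

```python
import math

# Minimal FCM iteration: each concept's activation is the sigmoid of the
# weighted sum of incoming activations, iterated until it stabilizes.
# The converged activations indicate which variables carry influence.

def fcm_run(weights, state, steps=50):
    squash = lambda x: 1 / (1 + math.exp(-x))   # sigmoid in (0, 1)
    n = len(state)
    for _ in range(steps):
        state = [squash(sum(weights[j][i] * state[j] for j in range(n)))
                 for i in range(n)]
    return state

# hypothetical concepts: 0 air-tightness, 1 ventilation,
# 2 indoor air quality, 3 health (weights are assumptions)
W = [[0, -0.6, -0.4, 0],   # tighter envelope reduces ventilation and IAQ
     [0, 0, 0.7, 0],       # ventilation improves indoor air quality
     [0, 0, 0, 0.8],       # indoor air quality improves health
     [0, 0, 0, 0]]
final = fcm_run(W, [1.0, 0.5, 0.5, 0.5])
```

Sensitivity of `final` to perturbations of individual weights is one simple way to rank the framing assumptions by influence.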

  20. Secondary impact hazard assessment

    NASA Technical Reports Server (NTRS)

    1986-01-01

    A series of light gas gun shots (4 to 7 km/sec) was performed with 5 mg nylon and aluminum projectiles to determine the size, mass, velocity, and spatial distribution of spall and ejecta from a number of graphite/epoxy targets. Similar determinations were also performed on a few aluminum targets. Target thickness and material were chosen to be representative of the proposed Space Station structure. The data from these shots and other information were used to predict the hazard to Space Station elements from secondary particles resulting from impacts of micrometeoroids and orbital debris on the Space Station. This hazard was quantified as an additional flux, over and above the primary micrometeoroid and orbital debris flux, that must be considered in the design process. To simplify the calculations, ejecta and spall mass were assumed to scale directly with the energy of the projectile; other scaling laws may be closer to reality. The secondary particles considered are only those that may impact other structure immediately after the primary impact. The contribution of these primary impacts to the orbital debris population was not addressed; data from this study should be fed into the orbital debris model to see whether Space Station secondaries make a significant contribution to orbital debris. The hazard to a Space Station element from secondary particles, above and beyond the micrometeoroid and orbital debris hazard, is categorized in terms of two factors: (1) the "view factor" of the element to other Space Station structure, i.e., the geometry of placement of the element, and (2) the sensitivity to damage, stated in terms of energy. Several example cases were chosen: the Space Station module windows, the windows of a Shuttle docked to the Space Station, the habitat module walls, and the photovoltaic solar cell arrays. For the examples chosen, the secondary flux contributed no more than 10 percent to the total flux (primary and secondary) above a given calculated

  1. Computational toxicology as implemented by the U.S. EPA: providing high throughput decision support tools for screening and assessing chemical exposure, hazard and risk.

    PubMed

    Kavlock, Robert; Dix, David

    2010-02-01

    available through the Aggregated Computational Toxicology Resource (ACToR), the Distributed Structure-Searchable Toxicity (DSSTox) Database Network, and other U.S. EPA websites. While initially focused on improving the hazard identification process, the CTRP is placing increasing emphasis on using high-throughput bioactivity profiling data in systems modeling to support quantitative risk assessments, and in developing complementary higher throughput exposure models. This integrated approach will enable analysis of life-stage susceptibility, and understanding of the exposures, pathways, and key events by which chemicals exert their toxicity in developing systems (e.g., endocrine-related pathways). The CTRP will be a critical component in next-generation risk assessments utilizing quantitative high-throughput data and providing a much higher capacity for assessing chemical toxicity than is currently available.

  2. A qualitative and quantitative assessment for a bone marrow harvest simulator.

    PubMed

    Machado, Liliane S; Moraes, Ronei M

    2009-01-01

    Several approaches to performing assessment in training simulators based on virtual reality have been proposed. There are two kinds of assessment methods: offline and online. The main requirements for online training assessment methodologies applied to virtual reality systems are low computational complexity and high accuracy. Several approaches for general cases that satisfy these requirements can be found in the literature. A drawback of those approaches is that they handle specific cases unsatisfactorily, as in some medical procedures where both quantitative and qualitative information is available to perform the assessment. In this paper, we present an approach to online training assessment based on a Modified Naive Bayes classifier which can manipulate qualitative and quantitative variables simultaneously. A specific medical case was simulated in a bone marrow harvest simulator. The results obtained were satisfactory and demonstrated the applicability of the method.
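Mixing qualitative and quantitative variables in a naive Bayes classifier can be sketched as follows. This is a toy illustration in the spirit of the method described, not the authors' Modified Naive Bayes: the classes ("good"/"poor"), the features, and all probabilities are invented.

```python
import math

# A categorical variable (hypothetical needle-placement region) uses
# frequency estimates; a continuous one (hypothetical applied force)
# uses a Gaussian likelihood; they combine under the naive independence
# assumption into one posterior over performance classes.

def gaussian(x, mean, std):
    return math.exp(-((x - mean) ** 2) / (2 * std ** 2)) / (std * math.sqrt(2 * math.pi))

# per-class model: prior, P(region | class), force mean/std -- all assumed
MODEL = {
    "good": {"prior": 0.5, "region": {"correct": 0.9, "wrong": 0.1},
             "force": (10.0, 2.0)},
    "poor": {"prior": 0.5, "region": {"correct": 0.3, "wrong": 0.7},
             "force": (18.0, 5.0)},
}

def classify(region, force):
    scores = {c: m["prior"] * m["region"][region] * gaussian(force, *m["force"])
              for c, m in MODEL.items()}
    total = sum(scores.values())
    return {c: s / total for c, s in scores.items()}

posterior = classify("correct", 11.0)   # likely a "good" performance
```

Because each observation updates the posterior with a handful of multiplications, the classifier meets the low-complexity requirement for online assessment.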

  3. Dig Hazard Assessment Using a Stereo Pair of Cameras

    NASA Technical Reports Server (NTRS)

    Rankin, Arturo L.; Trebi-Ollennu, Ashitey

    2012-01-01

    This software evaluates the terrain within reach of a lander's robotic arm for dig hazards using a stereo pair of cameras that are part of the lander's sensor system. A relative level of risk is calculated for a set of dig sectors. There are two versions of this software: one is designed to run onboard a lander as part of the flight software, and the other runs on a PC under Linux as a ground tool that produces the same results generated on the lander, given stereo images acquired by the lander and downlinked to Earth. Onboard dig hazard assessment is accomplished by executing a workspace panorama command sequence. This sequence acquires a set of stereo pairs of images of the terrain the arm can reach, generates a set of candidate dig sectors, and assesses the dig hazard of each candidate dig sector. The 3D perimeter points of candidate dig sectors are generated using configurable parameters. A 3D reconstruction of the terrain in front of the lander is generated using a set of stereo images acquired from the mast cameras, and is used to evaluate the dig goodness of each candidate dig sector based on eight metrics: 1. the maximum change in elevation in each sector; 2. the elevation standard deviation in each sector; 3. the forward tilt of each sector with respect to the payload frame; 4. the side tilt of each sector with respect to the payload frame; 5. the maximum size of missing-data regions in each sector; 6. the percentage of a sector that has missing data; 7. the roughness of each sector; and 8. the monochrome intensity standard deviation of each sector. Each of the eight metrics forms a goodness image layer where the goodness value of each sector ranges from 0 to 1; goodness values of 0 and 1 correspond to high and low risk, respectively. For each dig sector, the eight goodness values are merged by selecting the lowest one. Including the merged goodness image layer, there are nine goodness image layers for each
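The conservative min-merge of the metric layers can be sketched as follows. The sector names and goodness values are invented for illustration; only the merge rule (take the minimum across metrics) follows the description above.

```python
# Each metric yields a per-sector goodness in [0, 1] (0 = high risk,
# 1 = low risk); a sector's merged goodness is the minimum across the
# metric layers, so any single high-risk metric flags the whole sector.

def merge_goodness(layers):
    """layers: list of {sector_id: goodness} dicts, one per metric."""
    sectors = layers[0].keys()
    return {s: min(layer[s] for layer in layers) for s in sectors}

# three of the eight metrics, two candidate dig sectors (invented numbers)
elevation_change = {"A": 0.9, "B": 0.4}
roughness        = {"A": 0.8, "B": 0.9}
missing_data     = {"A": 0.7, "B": 0.95}

merged = merge_goodness([elevation_change, roughness, missing_data])
# sector A merges to 0.7, sector B to 0.4 -> A is the safer dig target
```

Taking the minimum rather than, say, the mean ensures a sector cannot hide one dangerous property behind several benign ones.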

  4. Using the Auditory Hazard Assessment Algorithm for Humans (AHAAH) Software, Beta Release W93e

    DTIC Science & Technology

    2009-09-01

    The Auditory Hazard Assessment Algorithm for Humans (AHAAH) is an electro-acoustic model of the ear used to evaluate the hazard of impulse sounds…format is commonly used for recording music; thus, these are typically stereo files and contain a "right" and a "left" channel as well as a header…acoustic data (sometimes deliberately induced in recording to maximize the digitizer's dynamic range), it must be removed. When Set Baseline is

  5. Chemical incidents resulted in hazardous substances releases in the context of human health hazards.

    PubMed

    Pałaszewska-Tkacz, Anna; Czerczak, Sławomir; Konieczko, Katarzyna

    2017-02-21

    The research purpose was to analyze data concerning chemical incidents in Poland collected in 1999-2009 in terms of health hazards. The data was obtained, using a multimodal information technology (IT) system, from chemical incident reports prepared by rescuers at the scene. The final analysis covered sudden events associated with the uncontrolled release of hazardous chemical substances or mixtures which may potentially lead to human exposure. Releases of unidentified substances where emergency services took action to protect human health or the environment were also included. The number of analyzed chemical incidents in 1999-2009 was 2930, with more than 200 different substances released. The substances were classified into 13 groups of substances and mixtures posing analogous risks. The most common releases involved non-flammable corrosive liquids, including: hydrochloric acid (199 cases), sulfuric(VI) acid (131 cases), sodium and potassium hydroxides (69 cases), ammonia solution (52 cases) and butyric acid (32 cases). The next group was gases hazardous only due to their physico-chemical properties, including extremely flammable propane-butane (249 cases) and methane (79 cases). There was no statistically significant trend in the total number of incidents; only for incidents involving flammable corrosive, toxic and/or harmful liquids did the regression analysis reveal a statistically significant downward trend. The number of victims reported was 1997, including 1092 children and 18 fatalities. The number of people injured, the number of incidents, Poland's high 9th place in terms of the number of Seveso establishments, and a 4-times higher number of hazardous industrial establishments not covered by the Seveso Directive justify the need for systematic analysis of hazards and their proper identification.
It is advisable to enhance health risk assessment, both qualitative and quantitative, by slight modification of the data collection system so as

  6. Assessing the impact of hazardous waste on children's health: The exposome paradigm.

    PubMed

    Sarigiannis, D A

    2017-10-01

    Assessment of the health impacts related to hazardous waste is a major scientific challenge with multiple societal implications. Most studies of associations between hazardous waste and public health do not establish mechanistic links between environmental exposure and disease burden, resulting in ineffective waste management options. The exposome concept comes to overhaul the nature vs. nurture paradigm and embraces a world of dynamic interactions between environmental exposures, endogenous exposures and genetic expression in humans. In this context, the exposome paradigm provides a novel tool for holistic hazardous waste management. Waste streams and the related contamination of environmental media are not viewed in isolation, but rather as components of the expotype, the vector of exposures an individual encounters over time. Thus, a multi-route and multi-pathway exposure estimation can be performed, setting a realistic basis for integrated health risk assessment. Waste management practices are thus assessed not only with regard to their technological edge and efficacy but also their effects on human health at the individual and community level, considering intra-subject variability in the affected population. The effectiveness of the exposome approach is demonstrated in the case of Athens, the capital of Greece, where the health effects associated with long-term and short-term exposure to two major waste management facilities (landfill and plastic recycling) are presented. Using the exposome analysis tools, we confirmed that proximity to a landfill is critical for children's neurodevelopment. However, this effect is significantly modified by parameters such as parental education level, socioeconomic status and nutrition. Proximity to a plastics recycling plant does not pose significant threats under normal operating conditions; yet, in the case of an accidental fire, release of persistent carcinogenic compounds (dioxins and furans) even for a

  7. Quantitative risk assessment of human salmonellosis in Canadian broiler chicken breast from retail to consumption.

    PubMed

    Smadi, Hanan; Sargeant, Jan M

    2013-02-01

    The current quantitative risk assessment model followed the framework proposed by the Codex Alimentarius to provide an estimate of the risk of human salmonellosis due to consumption of chicken breasts bought from Canadian retail stores and prepared in Canadian domestic kitchens. The model simulated the level of Salmonella contamination on chicken breasts throughout the retail-to-table pathway, using Canadian input parameter values, where available, to represent the risk of salmonellosis. From retail until consumption, changes in the concentration of Salmonella on each chicken breast were modeled using equations for growth and inactivation. The model predicted an average of 318 cases of salmonellosis per 100,000 consumers per year; potential reasons for this overestimation were discussed. A sensitivity analysis showed that the concentration of Salmonella on chicken breasts at retail, together with food hygiene practices in private kitchens, such as cross-contamination due to not washing cutting boards (or utensils) and hands after handling raw meat, along with inadequate cooking, contributed most significantly to the risk of human salmonellosis. The outcome from this model emphasizes that protection from the Salmonella hazard on chicken breasts is a shared responsibility. Data needed for a comprehensive Canadian Salmonella risk assessment were identified for future research. © 2012 Society for Risk Analysis.
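    The retail-to-table simulation outlined above (stochastic growth during storage, inactivation during cooking, then a dose-response step) can be sketched as a short Monte Carlo run; all distributions and parameter values below are illustrative assumptions, not the Canadian inputs used in the study:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000  # simulated chicken breasts

# Illustrative (not the study's) retail contamination: log10 CFU per breast.
log_conc = rng.normal(loc=1.0, scale=1.0, size=n)

# Growth during home storage: log-linear increase with random
# temperature-abuse time (hours) and a fixed growth rate (log10 CFU/h).
abuse_hours = rng.exponential(scale=1.0, size=n)
log_conc += 0.05 * abuse_hours

# Cooking inactivation: random log10 reduction; undercooking leaves survivors.
log_reduction = rng.normal(loc=6.0, scale=1.5, size=n)
log_dose = log_conc - log_reduction

# Dose-response: exponential model, P(ill) = 1 - exp(-r * dose).
r = 2.0e-4  # hypothetical dose-response parameter
dose = 10.0 ** log_dose
p_ill = 1.0 - np.exp(-r * dose)

cases_per_100k = 1e5 * p_ill.mean()
print(f"simulated mean risk: {cases_per_100k:.1f} cases per 100,000 servings")
```

In a full assessment each step's parameters would come from surveillance and consumer-behaviour data, which is exactly where the sensitivity analysis in the record identified the dominant contributors.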

  8. Neo-deterministic seismic hazard assessment in North Africa

    NASA Astrophysics Data System (ADS)

    Mourabit, T.; Abou Elenean, K. M.; Ayadi, A.; Benouar, D.; Ben Suleman, A.; Bezzeghoud, M.; Cheddadi, A.; Chourak, M.; ElGabry, M. N.; Harbi, A.; Hfaiedh, M.; Hussein, H. M.; Kacem, J.; Ksentini, A.; Jabour, N.; Magrin, A.; Maouche, S.; Meghraoui, M.; Ousadou, F.; Panza, G. F.; Peresan, A.; Romdhane, N.; Vaccari, F.; Zuccolo, E.

    2014-04-01

    North Africa is one of the most earthquake-prone areas of the Mediterranean. Many devastating earthquakes, some of them tsunami-triggering, have inflicted heavy loss of life and considerable economic damage on the region. In order to mitigate the destructive impact of earthquakes, the regional seismic hazard in North Africa is assessed using the neo-deterministic, multi-scenario methodology (NDSHA), based on the computation of synthetic seismograms with the modal summation technique at a regular grid of 0.2° × 0.2°. This is the first study aimed at producing NDSHA maps of North Africa covering five countries: Morocco, Algeria, Tunisia, Libya, and Egypt. The key input data for the NDSHA algorithm are earthquake sources, seismotectonic zonation, and structural models. In preparing the input data, it was essential to go beyond national borders and to adopt a coherent strategy over the whole area. Thanks to the collaborative efforts of the teams involved, it was possible to properly merge the earthquake catalogues available for each country and to define with homogeneous criteria the seismogenic zones, the characteristic focal mechanism associated with each of them, and the structural models used to model wave propagation from the sources to the sites. As a result, reliable seismic hazard maps are produced in terms of maximum displacement (Dmax), maximum velocity (Vmax), and design ground acceleration.
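    The multi-scenario logic of NDSHA (compute a ground-motion value at every node of a regular grid for each earthquake scenario, then map the maximum over all scenarios) can be sketched as below. The placeholder attenuation function stands in for the modal-summation synthetic seismograms, and the grid extent and source parameters are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Regular 0.2-degree grid (illustrative extent roughly spanning North Africa).
lons = np.arange(-10.0, 35.0, 0.2)
lats = np.arange(20.0, 38.0, 0.2)
grid_lon, grid_lat = np.meshgrid(lons, lats)

# Hypothetical scenario sources: columns are (lon, lat, magnitude).
sources = np.column_stack([
    rng.uniform(-10, 35, 50), rng.uniform(20, 38, 50), rng.uniform(5.0, 7.5, 50)
])

def peak_velocity(src, glon, glat):
    """Placeholder for a synthetic-seismogram-derived peak value (cm/s)."""
    d_km = np.hypot(glon - src[0], glat - src[1]) * 111.0 + 10.0  # rough distance
    return 10.0 ** (src[2] - 4.0) / d_km

# Maximum over all scenarios at each node -> a Vmax-style hazard map.
vmax = np.max([peak_velocity(s, grid_lon, grid_lat) for s in sources], axis=0)
assert vmax.shape == grid_lon.shape
```

The deterministic flavour of the method is visible here: no recurrence probabilities enter the map, only the envelope of scenario ground motions.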

  9. UK Hazard Assessment for a Laki-type Volcanic Eruption

    NASA Astrophysics Data System (ADS)

    Witham, Claire; Felton, Chris; Daud, Sophie; Aspinall, Willy; Braban, Christine; Loughlin, Sue; Hort, Matthew; Schmidt, Anja; Vieno, Massimo

    2014-05-01

    Following the impacts of the Eyjafjallajokull eruption in 2010, two types of volcanic eruption have been added to the UK Government's National Risk Register for Civil Emergencies. One of these, a large gas-rich volcanic eruption, was identified as a high-impact natural hazard, one of the three highest priority natural hazards faced by the UK. This eruption scenario is typified by the Laki eruption in Iceland in 1783-1784. The Civil Contingencies Secretariat (CCS) of the UK's Cabinet Office, responsible for civil protection in the UK, has since been working on quantifying the risk and better understanding its potential impacts. This involves cross-cutting work across UK Government departments and the wider scientific community in order to identify the capabilities needed to respond to an effusive eruption, to exercise the response and to develop increased resilience where possible. As part of its current work, CCS has been working closely with the UK Met Office and other UK agencies and academics (represented by the co-authors and others) to generate and assess the impacts of a 'reasonable worst case scenario', which can be used for decision making and preparation in advance of an eruption. Information from the literature and the findings of an expert elicitation have been synthesised to determine appropriate eruption source term parameters and associated uncertainties. This scenario is then being used to create a limited ensemble of model simulations of the dispersion and chemical conversion of the emissions of volcanic gases during such an eruption. The UK Met Office's NAME Lagrangian dispersion model and the Centre for Ecology and Hydrology's EMEP4UK Eulerian model are both being used. Modelling outputs will address the likelihood of near-surface concentrations of sulphur and halogen species being above specified health thresholds. Concentrations at aviation-relevant altitudes will also be evaluated, as well as the effects of acid deposition of volcanic species on

  10. A Hazard Assessment and Proposed Risk Index for Art, Architecture, Archive and Artifact Protection: Case Studies for Assorted International Museums

    NASA Astrophysics Data System (ADS)

    Kirk, Clara J.

    This study proposes a hazard/risk index for environmental, technological, and social hazards that may threaten a museum or other place of cultural storage and accession. This index can be implemented to measure the risk at the locations of these storage facilities in relation to their geologic, geographic, environmental, and social settings. A model case study of the 1966 flood of the Arno River and its impact on the city of Florence and the Uffizi Gallery was used as the index focus. From this focus an additional eleven museums and their related risk were assessed. Each index addressed a diverse range of hazards based on past frequency and magnitude. It was found that locations nearest a hazard had exceptionally high levels of risk; however, more distant locations could have influences that would increase their risk to levels similar to those of locations near the hazard. Locations not normally associated with a given natural hazard can be susceptible should the right conditions be met, and this research identified, compiled and assessed the factors found to influence natural hazard risk at these research sites.

  11. Hazards and hazard combinations relevant for the safety of nuclear power plants

    NASA Astrophysics Data System (ADS)

    Decker, Kurt; Brinkman, Hans; Raimond, Emmanuel

    2017-04-01

    The potential of the contemporaneous impact of different, yet causally related, hazardous events and event cascades on nuclear power plants is a major contributor to the overall risk of nuclear installations. In the aftermath of the Fukushima accident, which was caused by a combination of severe ground shaking by an earthquake, an earthquake-triggered tsunami and the disconnection of the plants from the electrical grid by a seismically induced landslide, hazard combinations and hazard cascades moved into the focus of nuclear safety research. We therefore developed an exhaustive list of external hazards and hazard combinations which pose potential threats to nuclear installations in the framework of the European project ASAMPSAE (Advanced Safety Assessment: Extended PSA). The project gathers 31 partners from Europe, North America and Japan. The list comprises exhaustive inventories of natural hazards and external man-made hazards, together with a cross-correlation matrix of these hazards. The hazard list is regarded as comprehensive, since it includes all types of hazards previously cited in documents by the IAEA, the Western European Nuclear Regulators Association (WENRA), and others. 73 natural hazards and 24 man-made external hazards are included. Natural hazards are grouped into seismotectonic hazards, flooding and hydrological hazards, extreme values of meteorological phenomena, rare meteorological phenomena, biological hazards / infestation, geological hazards, and forest fire / wild fire. The list of external man-made hazards includes industry accidents, military accidents, transportation accidents, pipeline accidents and other man-made external events. The large number of different hazards results in the extremely large number of 5151 theoretically possible hazard combinations (not considering hazard cascades). In principle all of these combinations can occur by random coincidence, except for 82 hazard combinations that - depending on the time scale - are mutually

  12. Geoethical and socio-political aspects of seismic and tsunami hazard assessment, quantification and mapping

    NASA Astrophysics Data System (ADS)

    Tinti, Stefano; Armigliato, Alberto

    2016-04-01

    Seismic hazard and, more recently, tsunami hazard assessments have been undertaken in several countries of the world, and globally for the whole planet, with the aim of providing a scientifically sound basis to engineers, technicians, urban and industrial planners, politicians, civil protection operators and, in general, to the authorities for devising rational risk mitigation strategies and corresponding adequate policies. The main point of this presentation is that the chief value of all seismic and tsunami hazard studies (including theory, concepts, quantification and mapping) resides in the social and political value of the provided products, a standpoint entailing a number of relevant geoethical implications. The most relevant implication regards geoscientists, who are the subjects mainly involved in carrying out hazard evaluations. Viewed from the classical perspective, the main ethical obligations of geoscientists are restricted to performing hazard estimations in the best possible way from a scientific point of view, which means selecting the "best" available data, adopting sound theoretical models, making use of rigorous methods… What is outlined here is that this is an insufficient, minimalistic position, since it overlooks the basic socio-political and therefore practical value of the final products of hazard analysis. In other words, if one views hazard assessment as a production process leading from data and theories (raw data and production means) to hazard maps (products), the criterion to judge whether it is good or bad needs also to include the usability factor. Seismic and tsunami hazard reports and maps are products that should be usable, which means that they should meet user needs and requirements, and therefore they should be evaluated according to how clearly understandable they are to, and how appropriate they are for, decision-making users. In the traditional view of a science serving society, one could represent the interaction

  13. A comprehensive risk assessment framework for offsite transportation of inflammable hazardous waste.

    PubMed

    Das, Arup; Gupta, A K; Mazumder, T N

    2012-08-15

    A framework for risk assessment due to offsite transportation of hazardous wastes is designed based on the type of event that can be triggered by an accident of a hazardous waste carrier. The objective of this study is to design a framework for computing the risk to population associated with offsite transportation of inflammable and volatile wastes. The framework is based on the traditional definition of risk and is designed for conditions where accident databases are not available. The probability-based variable in the risk assessment framework is substituted by a composite accident index proposed in this study. The framework computes the impacts due to a volatile cloud explosion based on the TNO multi-energy model. The methodology also estimates the vulnerable population in terms of disability adjusted life years (DALY), which takes into consideration the demographic profile of the population and the degree of injury in terms of mortality and morbidity sustained. The methodology is illustrated using a case study of a pharmaceutical industry in the Kolkata metropolitan area. Copyright © 2012 Elsevier B.V. All rights reserved.
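    The DALY measure mentioned above combines mortality and morbidity into a single burden figure (DALY = years of life lost + years lived with disability). A minimal sketch, with illustrative weights and counts rather than values from the cited framework:

```python
# Disability-adjusted life years: YLL (mortality) + YLD (morbidity).
# All inputs below are hypothetical illustration values.

def daly(deaths, life_exp_remaining, injured, disability_weight, duration):
    yll = deaths * life_exp_remaining             # years of life lost
    yld = injured * disability_weight * duration  # years lived with disability
    return yll + yld

# Hypothetical impact zone: 3 fatalities (35 expected remaining years each),
# 40 injured with disability weight 0.2 lasting 5 years on average.
burden = daly(deaths=3, life_exp_remaining=35.0,
              injured=40, disability_weight=0.2, duration=5.0)
print(burden)  # 3*35 + 40*0.2*5 = 145.0
```

Weighting injuries by severity and duration is what lets the framework fold the demographic profile and degree of injury into one comparable number.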

  14. The evolution of global disaster risk assessments: from hazard to global change

    NASA Astrophysics Data System (ADS)

    Peduzzi, Pascal

    2013-04-01

    The perception of disaster risk as a dynamic process interlinked with global change is a fairly recent concept. It gradually emerged as an evolution from new scientific theories, currents of thinking and lessons learned from large disasters since the 1970s. The interest was further heightened, in the mid-1980s, by the Chernobyl nuclear accident and the discovery of the ozone layer hole, both bringing awareness that dangerous hazards can generate global impacts. The creation of the UN International Decade for Natural Disaster Reduction (IDNDR) and the publication of the first IPCC report in 1990 reinforced the interest in global risk assessment. The first global risk models including hazard, exposure and vulnerability components became available in the mid-2000s. Since then, increased computational power and more refined dataset resolutions have led to more numerous and sophisticated global risk models. This article presents a recent history of global disaster risk models, the current status of research for the Global Assessment Report on Disaster Risk Reduction (GAR 2013), and future challenges and limitations for the development of next-generation global disaster risk models.

  15. Quantitative assessment of direct and indirect landslide risk along transportation lines in southern India

    NASA Astrophysics Data System (ADS)

    Jaiswal, P.; van Westen, C. J.; Jetten, V.

    2010-06-01

    A quantitative approach for landslide risk assessment along transportation lines is presented and applied to a road and a railway alignment in the Nilgiri hills in southern India. The method allows estimating the direct risk affecting the alignments, vehicles and people, and the indirect risk resulting from the disruption of economic activities. The data required for the risk estimation were obtained from historical records. A total of 901 landslides were catalogued, initiating from cut slopes along the railway and road alignments. The landslides were grouped into three magnitude classes based on the landslide type, volume, scar depth, run-out distance, etc., and their probability of occurrence was obtained using a frequency-volume distribution. Hazard, for a given return period, expressed as the number of landslides of a given magnitude class per kilometre of cut slopes, was obtained using a Gumbel distribution and the probability of landslide magnitude. In total, 18 specific hazard scenarios were generated using the three magnitude classes and six return periods (1, 3, 5, 15, 25, and 50 years). The assessment of the vulnerability of the road and railway line was based on damage records, whereas the vulnerability of different types of vehicles and people was subjectively assessed based on limited historic incidents. Direct specific loss for the alignments (railway line and road) and vehicles (train, bus, lorry, car and motorbike) was expressed in monetary value (US$), and direct specific loss of life of commuters was expressed as an annual probability of death. Indirect specific loss (US$) derived from traffic interruption was evaluated considering alternative driving routes, and includes losses resulting from additional fuel consumption, additional travel cost, loss of income to local business, and loss of revenue to the railway department.
The results indicate that the total loss, including both direct and indirect loss, from 1- to 50-year return periods, varies from US$ 90 840 to US$
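    The hazard step described above (a Gumbel model for the number of landslides at a given return period, weighted by the probability of each magnitude class) can be sketched as follows. The Gumbel parameters and magnitude-class probabilities are illustrative assumptions, not the fitted Nilgiri values, and the 1-year return period is omitted because the Gumbel quantile is undefined at exceedance probability 1:

```python
import math

def gumbel_ppf(p, loc, scale):
    """Inverse CDF of the Gumbel (maxima) distribution: loc - scale*ln(-ln p)."""
    return loc - scale * math.log(-math.log(p))

# Hypothetical Gumbel fit to annual landslide counts per km of cut slope,
# and magnitude-class probabilities from a frequency-volume distribution.
LOC, SCALE = 2.0, 1.5
p_magnitude = {"M1": 0.70, "M2": 0.25, "M3": 0.05}

def hazard(return_period_years, magnitude_class):
    """Landslides of the given class per km for the given return period."""
    count = gumbel_ppf(1.0 - 1.0 / return_period_years, LOC, SCALE)
    return count * p_magnitude[magnitude_class]

# Hazard grows with return period, e.g. for magnitude class M2:
for T in (3, 5, 15, 25, 50):
    print(T, round(hazard(T, "M2"), 2))
```

Crossing the three magnitude classes with the return periods reproduces the kind of scenario matrix (3 × 6 = 18) used in the record.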

  16. Risk analysis for roadways subjected to multiple landslide-related hazards

    NASA Astrophysics Data System (ADS)

    Corominas, Jordi; Mavrouli, Olga

    2014-05-01

    Roadways through mountainous terrain often involve cuts and landslide areas whose stability is precarious and which require protection and stabilization works. To optimize the allocation of resources, government and technical offices are increasingly interested in both risk analysis and risk assessment. Risk analysis has to consider the hazard occurrence and the consequences. The consequences can be both direct and indirect. The former include the costs of repairing the roadway, damage to vehicles and potential fatalities, while the latter refer to the costs related to the diversion of vehicles, the extra distance travelled, time differences, and tolls. The types of slope instability that may affect a roadway vary, and so do their effects. Most current approaches either consider a single hazardous phenomenon at a time or, if applied at a small (for example national) scale, do not take into account local conditions at each section of the roadway. The objective of this work is the development of a simple and comprehensive methodology for the assessment of the risk due to multiple hazards along roadways, integrating different landslide types, including rockfalls and debris flows, and considering as well the potential failure of retaining walls. To quantify risk, all hazards are expressed with a common term: their probability of occurrence. The methodology takes into consideration the specific local conditions along the roadway. For rockfalls and debris flows, a variety of methods for assessing the probability of occurrence exist. To assess the annual probability of failure of retaining walls, we use an indicator-based model that provides a hazard index. The model parameters consist of the design safety factor and further anchorage design and construction parameters. The probability of failure is evaluated as a function of the hazard index and then corrected (in terms of order of magnitude) according to in situ observations for increase of two

  17. Assessing Natural Hazard Vulnerability Through Marmara Region Using GIS

    NASA Astrophysics Data System (ADS)

    Sabuncu, A.; Garagon Dogru, A.; Ozener, H.

    2013-12-01

    Natural hazards are natural phenomena occurring in the Earth system that include geological and meteorological events such as earthquakes, floods, landslides, droughts, fires and tsunamis. Metropolitan cities are vulnerable to natural hazards due to their population densities, industrial facilities and properties. The urban layout of megacities is complex, since industrial facilities are intermixed with residential areas. The Marmara region, located in north-western Turkey, has suffered from natural hazards (earthquakes, floods etc.) for years. After the 1999 Kocaeli and Duzce earthquakes and the 2009 Istanbul flash floods, dramatic numbers of casualties and economic losses were reported by the authorities. Geographic information systems (GIS) have substantial capacity to support natural disaster management, as they provide more efficient and reliable analysis and evaluation of the data, as well as convenient and better solutions for decision making before, during and after natural hazards. Earth science data and socio-economic data can be integrated into a GIS as different layers, and satellite data are used to understand changes before and after natural hazards. GIS is powerful software for combining different types of digital data, and a natural hazard database for the Marmara region provides all these different types of digital data to users. Proper data collection, processing and analysis are critical to evaluate and identify hazards. The natural hazard database allows users to monitor, analyze and query past and recent disasters in the Marmara region. The long-term aim of this study is to develop a geodatabase and identify the natural hazard vulnerabilities of the metropolitan cities.

  18. The evolution of a health hazard assessment database management system for military weapons, equipment, and materiel.

    PubMed

    Murnyak, George R; Spencer, Clark O; Chaney, Ann E; Roberts, Welford C

    2002-04-01

    During the 1970s, the Army health hazard assessment (HHA) process developed as a medical program to minimize hazards in military materiel during the development process. The HHA Program characterizes health hazards that soldiers and civilians may encounter as they interact with military weapons and equipment. Thus, it is a resource that medical planners and advisors can use to identify and estimate potential hazards soldiers may encounter as they train and conduct missions. The U.S. Army Center for Health Promotion and Preventive Medicine administers the program, which is integrated with the Army's Manpower and Personnel Integration program. As the HHA Program has matured, an electronic database has been developed to record and monitor the health hazards associated with military equipment and systems. The current database tracks the results of HHAs and provides reporting designed to assist the HHA Program manager in daily activities.

  19. Toward Probabilistic Risk Analyses - Development of a Probabilistic Tsunami Hazard Assessment of Crescent City, CA

    NASA Astrophysics Data System (ADS)

    González, F. I.; Leveque, R. J.; Hatheway, D.; Metzger, N.

    2011-12-01

    Risk is defined in many ways, but most definitions are consistent with Crichton's [1999] definition based on the "risk triangle" concept and the explicit identification of three risk elements: "Risk is the probability of a loss, and this depends on three elements: hazard, vulnerability, and exposure. If any of these three elements in risk increases or decreases, then the risk increases or decreases respectively." The World Meteorological Organization, for example, cites Crichton [1999] and then defines risk as [WMO, 2008] Risk = function(Hazard × Vulnerability × Exposure), while the Asian Disaster Reduction Center adopts the more general expression [ADRC, 2005] Risk = function(Hazard, Vulnerability, Exposure). In practice, probabilistic concepts are invariably invoked, and at least one of the three factors is specified as probabilistic in nature. The Vulnerability and Exposure factors are defined in multiple ways in the relevant literature; but the Hazard factor, which is the focus of our presentation, is generally understood to deal only with the physical aspects of the phenomena and, in particular, the ability of the phenomena to inflict harm [Thywissen, 2006]. A Hazard factor can be estimated by a methodology known as Probabilistic Tsunami Hazard Assessment (PTHA) [González, et al., 2009]. We will describe the PTHA methodology and provide an example: the results of a previous application to Seaside, OR. We will also present preliminary results for a PTHA of Crescent City, CA, a pilot project and coastal modeling/mapping effort funded by the Federal Emergency Management Agency (FEMA) Region IX office as part of the new California Coastal Analysis and Mapping Project (CCAMP). CCAMP and the PTHA in Crescent City are being conducted under the nationwide FEMA Risk Mapping, Assessment, and Planning (Risk MAP) Program, which focuses on providing communities with flood information and tools they can use to enhance their mitigation plans and better protect their citizens.
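    A minimal sketch of the multiplicative risk-triangle expression quoted above, interpreting risk as an expected annual loss; the hazard probability, vulnerability fraction, and exposed value are illustrative assumptions:

```python
# Risk = function(Hazard x Vulnerability x Exposure), read here as an
# expected annual loss. All inputs are hypothetical illustration values.

def annualized_risk(hazard_prob, vulnerability, exposure_value):
    """Expected annual loss = P(event) * fraction damaged * value at risk."""
    return hazard_prob * vulnerability * exposure_value

# Hypothetical coastal community: 1% annual tsunami probability, 40% of
# exposed value lost per event, US$ 250 million exposed.
loss = annualized_risk(0.01, 0.40, 250e6)
print(f"{loss:.0f}")  # roughly US$1 million per year under these assumptions
```

Raising or lowering any of the three factors raises or lowers the risk, which is exactly the monotonic behaviour Crichton's definition demands.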

  20. Ensemble tropical-extratropical cyclone coastal flood hazard assessment with climate change

    NASA Astrophysics Data System (ADS)

    Orton, P. M.; Lin, N.; Colle, B.

    2016-12-01

    A challenge with quantifying future changes in coastal flooding for the U.S. East Coast is that climate change has varying effects on different types of storms, in addition to raising mean sea levels. Moreover, future flood hazard uncertainties are large and come from many sources. Here, a new coastal flood hazard assessment approach is demonstrated that separately evaluates and then combines probabilities of storm tide generated from tropical cyclones (TCs) and extratropical cyclones (ETCs). The separation enables us to incorporate climate change impacts on both types of storms. The assessment accounts for epistemic storm tide uncertainty using an ensemble of different prior studies and methods of assessment, merged with uncertainty in climate change effects on storm tides and sea levels. The assessment is applied for New York Harbor, under the auspices of the New York City Panel on Climate Change (NPCC). In the New York Bight region and much of the U.S. East Coast, differing flood exceedance curve slopes for TCs and ETCs arise due to their differing physics. It is demonstrated how errors can arise for this region from mixing together storm types in an extreme value statistical analysis, a common practice when using observations. The effects of climate change on TC and ETC flooding have recently been assessed for this region, for TCs using a Global Climate Model (GCM) driven hurricane model with hydrodynamic modeling, and for ETCs using a GCM-driven multilinear regression-based storm surge model. The results of these prior studies are applied to our central estimates of the flood exceedance curve probabilities, transforming them for climate change effects. The results are useful for decision-makers because they highlight the large uncertainty in present-day and future flood risk, and also for scientists because they identify the areas where further research is most needed.
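    The separate-then-combine step described above can be sketched as follows; the two exceedance curves are illustrative assumptions (the ETC curve is made steeper than the TC curve, echoing the differing slopes noted in the abstract), and independence of the two storm populations is assumed when merging:

```python
import numpy as np

# Storm tide levels (m) and hypothetical annual exceedance probabilities
# for tropical cyclones (TC) and extratropical cyclones (ETC).
levels = np.linspace(1.0, 4.0, 7)
p_tc = 0.05 * np.exp(-(levels - 1.0))         # rare but heavy-tailed
p_etc = 0.50 * np.exp(-2.0 * (levels - 1.0))  # frequent, steeper curve

# Combined annual exceedance probability, assuming the two storm
# populations are independent: P = 1 - (1 - P_tc)(1 - P_etc).
p_combined = 1.0 - (1.0 - p_tc) * (1.0 - p_etc)

for z, p in zip(levels, p_combined):
    print(f"{z:.1f} m: {p:.4f}")
```

Keeping the curves separate until this final step is what allows climate-change adjustments to be applied to each storm type independently before merging, and avoids the mixed-population extreme value fitting errors the abstract warns about.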

  1. Repeatability Assessment by ISO 11843-7 in Quantitative HPLC for Herbal Medicines.

    PubMed

    Chen, Liangmian; Kotani, Akira; Hakamata, Hideki; Tsutsumi, Risa; Hayashi, Yuzuru; Wang, Zhimin; Kusu, Fumiyo

    2015-01-01

    We have proposed an assessment method to estimate the measurement relative standard deviation (RSD) of chromatographic peaks in quantitative HPLC for herbal medicines using the methodology of ISO 11843 Part 7 (ISO 11843-7:2012), which provides detection limits stochastically. In quantitative HPLC with UV detection (HPLC-UV) of Scutellaria Radix for the determination of baicalin, the measurement RSD of baicalin obtained stochastically by ISO 11843-7:2012 was within the 95% confidence interval of the RSD obtained statistically from repetitive measurements (n = 6). Thus, our findings show that the method is applicable to estimating the repeatability of HPLC-UV for determining baicalin without repeated measurements. In addition, the allowable limit of the "System repeatability" in "Liquid Chromatography" regulated in a pharmacopoeia can be obtained by the present assessment method. Moreover, the present assessment method was also successfully applied to estimate the measurement RSDs of quantitative three-channel liquid chromatography with electrochemical detection (LC-3ECD) of Chrysanthemi Flos for determining caffeoylquinic acids and flavonoids. By the present repeatability assessment method, reliable measurement RSDs were obtained stochastically, and the experimental time was remarkably reduced.
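    The repetitive-measurement benchmark in this record (the statistically obtained RSD from n = 6 injections, against which the ISO 11843-7 stochastic estimate is compared) reduces to a short computation; the peak areas below are illustrative, not data from the study:

```python
import statistics

# Hypothetical peak areas from six repeated injections of the same sample.
peak_areas = [10210.0, 10185.0, 10240.0, 10198.0, 10225.0, 10170.0]

mean = statistics.mean(peak_areas)
sd = statistics.stdev(peak_areas)  # sample SD (n - 1 denominator)
rsd_percent = 100.0 * sd / mean

print(f"RSD = {rsd_percent:.2f}%")
```

The ISO 11843-7 route estimates the same quantity stochastically from the noise characteristics of a single chromatogram, which is why it can replace the six repeated injections and shorten the experiment.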

  2. Considerations of Environmentally Relevant Test Conditions for Improved Evaluation of Ecological Hazards of Engineered Nanomaterials.

    PubMed

    Holden, Patricia A; Gardea-Torresdey, Jorge L; Klaessig, Fred; Turco, Ronald F; Mortimer, Monika; Hund-Rinke, Kerstin; Cohen Hubal, Elaine A; Avery, David; Barceló, Damià; Behra, Renata; Cohen, Yoram; Deydier-Stephan, Laurence; Ferguson, P Lee; Fernandes, Teresa F; Herr Harthorn, Barbara; Henderson, W Matthew; Hoke, Robert A; Hristozov, Danail; Johnston, John M; Kane, Agnes B; Kapustka, Larry; Keller, Arturo A; Lenihan, Hunter S; Lovell, Wess; Murphy, Catherine J; Nisbet, Roger M; Petersen, Elijah J; Salinas, Edward R; Scheringer, Martin; Sharma, Monita; Speed, David E; Sultan, Yasir; Westerhoff, Paul; White, Jason C; Wiesner, Mark R; Wong, Eva M; Xing, Baoshan; Steele Horan, Meghan; Godwin, Hilary A; Nel, André E

    2016-06-21

    Engineered nanomaterials (ENMs) are increasingly entering the environment with uncertain consequences including potential ecological effects. Research communities differ on whether ecotoxicological testing of ENMs should be conducted at environmentally relevant concentrations, where observing outcomes is difficult, or at higher ENM doses, where responses are observable. What exposure conditions are typically used in assessing ENM hazards to populations? What conditions are used to test ecosystem-scale hazards? What is known regarding actual ENMs in the environment, via measurements or modeling simulations? How should exposure conditions, ENM transformation, dose, and body burden be used in interpreting biological and computational findings for assessing risks? These questions were addressed in the context of this critical review. As a result, three main recommendations emerged. First, researchers should improve the ecotoxicology of ENMs by choosing test end points, duration, and study conditions, including ENM test concentrations, that align with realistic exposure scenarios. Second, testing should proceed via tiers with iterative feedback that informs experiments at other levels of biological organization. Finally, environmental realism in ENM hazard assessments should involve greater coordination among ENM quantitative analysts, exposure modelers, and ecotoxicologists, across government, industry, and academia.

  3. History of Modern Earthquake Hazard Mapping and Assessment in California Using a Deterministic or Scenario Approach

    NASA Astrophysics Data System (ADS)

    Mualchin, Lalliana

    2011-03-01

    Modern earthquake ground motion hazard mapping in California began following the 1971 San Fernando earthquake in the Los Angeles metropolitan area of southern California. Earthquake hazard assessment followed a traditional approach, later called Deterministic Seismic Hazard Analysis (DSHA) in order to distinguish it from the newer Probabilistic Seismic Hazard Analysis (PSHA). In DSHA, seismic hazard is assessed for the Maximum Credible Earthquake (MCE) magnitude on each of the known seismogenic faults within and near the state. The occurrence of the MCE has been assumed qualitatively, based on late Quaternary and younger faults presumed to be seismogenic, without specifying when or within what time interval the MCE may occur. The MCE is the largest or upper-bound potential earthquake in moment magnitude, and it supersedes and automatically considers all other possible earthquakes on that fault. That moment magnitude is used for estimating ground motions by applying it to empirical attenuation relationships, and for calculating ground motions as in neo-DSHA (Zuccolo et al., 2008). The first deterministic California earthquake hazard map was published in 1974 by the California Division of Mines and Geology (CDMG), known since 2002 as the California Geological Survey (CGS), using the best available fault information and ground motion attenuation relationships at that time. The California Department of Transportation (Caltrans) later assumed responsibility for printing the refined and updated peak acceleration contour maps, which were heavily utilized by geologists, seismologists, and engineers for many years. Some engineers involved in the siting process of large important projects, for example, dams and nuclear power plants, continued to challenge the map(s). 
The second edition map was completed in 1985 incorporating more faults, improving MCE's estimation method, and using new ground motion attenuation relationships from the latest published

  4. Volcano Hazards Program

    USGS Publications Warehouse

    Venezky, Dina Y.; Myers, Bobbie; Driedger, Carolyn

    2008-01-01

    Diagram of common volcano hazards. The U.S. Geological Survey Volcano Hazards Program (VHP) monitors unrest and eruptions at U.S. volcanoes, assesses potential hazards, responds to volcanic crises, and conducts research on how volcanoes work. When conditions change at a monitored volcano, the VHP issues public advisories and warnings to alert emergency-management authorities and the public. See http://volcanoes.usgs.gov/ to learn more about volcanoes and find out what's happening now.

  5. Nanomaterial characterization: considerations and needs for hazard assessment and safety evaluation.

    PubMed

    Boverhof, Darrell R; David, Raymond M

    2010-02-01

    Nanotechnology is a rapidly emerging field of great interest and promise. As new materials are developed and commercialized, hazard information also needs to be generated to reassure regulators, workers, and consumers that these materials can be used safely. The biological properties of nanomaterials are closely tied to the physical characteristics, including size, shape, dissolution rate, agglomeration state, and surface chemistry, to name a few. Furthermore, these properties can be altered by the medium used to suspend or disperse these water-insoluble particles. However, the current toxicology literature lacks much of the characterization information that allows toxicologists and regulators to develop "rules of thumb" that could be used to assess potential hazards. To effectively develop these rules, toxicologists need to know the characteristics of the particle that interacts with the biological system. This void leaves the scientific community with no options other than to evaluate all materials for all potential hazards. Lack of characterization could also lead to different laboratories reporting discordant results on seemingly the same test material because of subtle differences in the particle or differences in the dispersion medium used that resulted in altered properties and toxicity of the particle. For these reasons, good characterization using a minimal characterization data set should accompany and be required of all scientific publications on nanomaterials.

  6. Using a Geographic Information System to Assess the Risk of Hurricane Hazards on the Maya Civilization

    NASA Astrophysics Data System (ADS)

    Weigel, A. M.; Griffin, R.; Sever, T.

    2014-12-01

    The extent of the Maya civilization spanned portions of modern-day Mexico, Belize, Guatemala, El Salvador and Honduras. Paleoclimatic studies suggest this region has been affected by strong hurricanes for the past six thousand years, reinforced by archeological evidence from Mayan records indicating they experienced strong storms. It is theorized that hurricanes aided in the collapse of the Maya by damaging building structures and agriculture and halting industrial activities. Today, this region is known for its active tropical climatology, having been hit by numerous strong storms including Hurricanes Dean, Iris, Keith, and Mitch. This research uses a geographic information system (GIS) to model hurricane hazards and assess the risk posed to the Maya civilization. GIS can handle various layer components, making it well suited to combining the parameters necessary for assessing the risk of hurricane-related hazards. For this analysis, high winds, storm surge flooding, non-storm-surge-related flooding, and rainfall-triggered landslides were selected as the primary hurricane hazards. Data sets used in this analysis include the National Climatic Data Center International Best Track Archive for Climate Stewardship (IBTrACS) hurricane tracks, the Shuttle Radar Topography Mission Digital Elevation Model, WorldClim monthly accumulated precipitation, USGS HydroSHEDS river locations, Harmonized World Soil Database soil types, and known Maya site locations from the Electronic Atlas of Ancient Maya Sites. ArcGIS and ENVI software were utilized to process data and model hurricane hazards. To assess locations at risk of experiencing high winds, a model was created using ArcGIS Model Builder to map each storm's temporal wind profile, and adapted to simulate forward storm velocity and storm frequency. Modeled results were then combined with physical land characteristics and meteorological and hydrologic data to identify areas likely affected. 
Certain areas along the eastern

  7. NecroQuant: quantitative assessment of radiological necrosis

    NASA Astrophysics Data System (ADS)

    Hwang, Darryl H.; Mohamed, Passant; Varghese, Bino A.; Cen, Steven Y.; Duddalwar, Vinay

    2017-11-01

    Clinicians can now objectively quantify tumor necrosis by Hounsfield units and enhancement characteristics from multiphase contrast-enhanced CT imaging. NecroQuant has been designed to work as part of a radiomics pipeline. The software is a departure from conventional qualitative assessment of tumor necrosis, as it provides the user (radiologists and researchers) with a simple interface to precisely and interactively define and measure necrosis in contrast-enhanced CT images. Although the software is tested here on renal masses, it can be re-configured to assess tumor necrosis across a variety of tumors from different body sites, providing a generalized, open, portable, and extensible quantitative analysis platform that is widely applicable across cancer types to quantify tumor necrosis.
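    A minimal sketch of the kind of quantitative measure described: thresholding tumor voxels by Hounsfield units and reporting the necrotic fraction. The HU distributions and the 40 HU cutoff below are invented for illustration and are not NecroQuant's actual parameters.

```python
import numpy as np

# Toy contrast-enhanced CT tumor region (Hounsfield units); values are invented.
rng = np.random.default_rng(0)
enhancing = rng.normal(90, 15, size=800)   # viable, enhancing tissue
necrotic = rng.normal(20, 10, size=200)    # low-attenuation necrosis
tumor_hu = np.concatenate([enhancing, necrotic])

# Assumed rule: voxels below 40 HU on the enhanced phase count as necrosis.
necrosis_mask = tumor_hu < 40.0
necrosis_fraction = necrosis_mask.mean()
print(f"necrotic fraction = {necrosis_fraction:.2f}")
```

    In practice the threshold and the enhancement criteria would be set interactively per case, which is the workflow the software's interface supports.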

  8. Assessment of tsunami hazard for coastal areas of Shandong Province, China

    NASA Astrophysics Data System (ADS)

    Feng, Xingru; Yin, Baoshu

    2017-04-01

    Shandong Province is located on the east coast of China and has a coastline of about 3100 km. Only a few tsunami events are recorded in the history of Shandong Province, but tsunami hazard assessment is still necessary given the rapid economic development and growing population of the area. The objective of this study was to evaluate the potential danger posed by tsunamis for Shandong Province. The numerical simulation method was adopted to assess the tsunami hazard for coastal areas of Shandong Province. The Cornell multi-grid coupled tsunami numerical model (COMCOT) was used and its efficacy was verified by comparison with three historical tsunami events. The simulated maximum tsunami wave height agreed well with the observational data. Based on previous studies and statistical analyses, multiple earthquake scenarios in eight seismic zones were designed, the magnitudes of which were set as the potential maximum values. Then, the tsunamis they induced were simulated using the COMCOT model to investigate their impact on the coastal areas of Shandong Province. The numerical results showed that the maximum tsunami wave height, which was caused by the earthquake scenario located in the sea area of the Mariana Islands, could reach up to 1.39 m off the eastern coast of Weihai city. The tsunamis from the seismic zones of the Bohai Sea, Okinawa Trough, and Manila Trench could also reach heights of >1 m in some areas, meaning that earthquakes in these zones should not be ignored. The inundation hazard was distributed primarily in some northern coastal areas near Yantai and southeastern coastal areas of the Shandong Peninsula. When considering both the magnitude and arrival time of tsunamis, it is suggested that greater attention be paid to earthquakes that occur in the Bohai Sea. In conclusion, the tsunami hazard facing the coastal area of Shandong Province is not very serious; however, disasters could occur if such events coincided with spring tides or other

  9. Deterministic Seismic Hazard Assessment of Center-East IRAN (55.5-58.5˚ E, 29-31˚ N)

    NASA Astrophysics Data System (ADS)

    Askari, Mina; Neyestani, Behnoosh

    2009-04-01

    Deterministic Seismic Hazard Assessment of Center-East Iran (55.5-58.5˚E, 29-31˚N); Mina Askari and Behnoosh Neyestani, Science and Research University, Iran. Deterministic seismic hazard assessment has been performed for Center-East Iran, covering Kerman and adjacent regions within a radius of 100 km. A catalogue of earthquakes in the region, including historical and instrumental earthquakes, was compiled. A total of 25 potential seismic source zones in the region were delineated as area sources for seismic hazard assessment based on geological, seismological and geophysical information. The minimum distance from each seismic source to the site (Kerman) and the maximum magnitude for each source were then determined. Using the Abrahamson and Litehiser (1989) attenuation relationship, the maximum acceleration is estimated to be 0.38g, associated with movement of a blind fault whose maximum magnitude is Ms = 5.5.
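    The deterministic procedure described (maximum magnitude per source, minimum source-to-site distance, empirical attenuation, controlling maximum) can be sketched as follows. The attenuation form and its coefficients are illustrative placeholders, not the Abrahamson and Litehiser (1989) relationship, and the source list is invented.

```python
import math

def pga_g(magnitude, distance_km, a=-1.0, b=0.35, c=1.0, h=10.0):
    """Generic attenuation form ln(PGA) = a + b*M - c*ln(R + h).
    Coefficients are placeholders, NOT Abrahamson & Litehiser (1989)."""
    return math.exp(a + b * magnitude - c * math.log(distance_km + h))

# Deterministic step: for each source zone, pair its maximum magnitude with
# the minimum source-to-site distance, then keep the largest ground motion.
sources = [(5.5, 5.0), (7.0, 60.0), (6.2, 25.0)]   # (Ms, closest distance km)
pga = max(pga_g(m, r) for m, r in sources)
print(f"controlling PGA = {pga:.2f} g")
```

    Note how a moderate nearby source can control the hazard over a larger distant one, which is consistent with the study's result that a nearby blind fault of Ms = 5.5 governs.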

  10. Landslide Hazard Probability Derived from Inherent and Dynamic Determinants

    NASA Astrophysics Data System (ADS)

    Strauch, Ronda; Istanbulluoglu, Erkan

    2016-04-01

    Landslide hazard research has typically been conducted independently from hydroclimate research. We unify these two lines of research to provide regional scale landslide hazard information for risk assessments and resource management decision-making. Our approach combines an empirical inherent landslide probability with a numerical dynamic probability, generated by combining routed recharge from the Variable Infiltration Capacity (VIC) macro-scale land surface hydrologic model with a finer resolution probabilistic slope stability model run in a Monte Carlo simulation. Landslide hazard mapping is advanced by adjusting the dynamic model of stability with an empirically-based scalar representing the inherent stability of the landscape, creating a probabilistic quantitative measure of geohazard prediction at a 30-m resolution. Climatology, soil, and topography control the dynamic nature of hillslope stability and the empirical information further improves the discriminating ability of the integrated model. This work will aid resource management decision-making in current and future landscape and climatic conditions. The approach is applied as a case study in North Cascade National Park Complex, a rugged terrain with nearly 2,700 m (9,000 ft) of vertical relief, covering 2757 sq km (1064 sq mi) in northern Washington State, U.S.A.
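    The probabilistic slope-stability step can be sketched with an infinite-slope factor of safety evaluated in a Monte Carlo loop. The parameter distributions below are invented stand-ins for the study's VIC-derived recharge and soil data, not its actual inputs.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Infinite-slope factor of safety with parameter uncertainty:
#   FS = [c + (gamma - m*gamma_w) * z * cos^2(theta) * tan(phi)]
#        / [gamma * z * sin(theta) * cos(theta)]
theta = np.radians(35.0)                      # slope angle
z = 1.5                                       # soil depth (m)
gamma, gamma_w = 18.0, 9.81                   # unit weights (kN/m^3)
c = rng.uniform(2.0, 8.0, n)                  # cohesion (kPa)
phi = np.radians(rng.uniform(28.0, 38.0, n))  # friction angle
m = rng.uniform(0.0, 1.0, n)                  # relative water-table height

fs = (c + (gamma - m * gamma_w) * z * np.cos(theta) ** 2 * np.tan(phi)) \
     / (gamma * z * np.sin(theta) * np.cos(theta))
print(f"P(failure) = P(FS < 1) = {np.mean(fs < 1.0):.3f}")
```

    Run per 30-m grid cell with hydrologically routed water-table heights, this yields the dynamic probability that the empirical inherent-stability scalar then adjusts.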

  11. Combined visual and semi-quantitative assessment of 123I-FP-CIT SPECT for the diagnosis of dopaminergic neurodegenerative diseases.

    PubMed

    Ueda, Jun; Yoshimura, Hajime; Shimizu, Keiji; Hino, Megumu; Kohara, Nobuo

    2017-07-01

    Visual and semi-quantitative assessments of 123I-FP-CIT single-photon emission computed tomography (SPECT) are useful for the diagnosis of dopaminergic neurodegenerative diseases (dNDD), including Parkinson's disease, dementia with Lewy bodies, progressive supranuclear palsy, multiple system atrophy, and corticobasal degeneration. However, the diagnostic value of combined visual and semi-quantitative assessment in dNDD remains unclear. Among 239 consecutive patients with a newly diagnosed possible parkinsonian syndrome who underwent 123I-FP-CIT SPECT in our medical center, 114 patients with a disease duration less than 7 years were diagnosed as dNDD with the established criteria or as non-dNDD according to clinical judgment. We retrospectively examined their clinical characteristics and visual and semi-quantitative assessments of 123I-FP-CIT SPECT. The striatal binding ratio (SBR) was used as a semi-quantitative measure of 123I-FP-CIT SPECT. We calculated the sensitivity and specificity of visual assessment alone, semi-quantitative assessment alone, and combined visual and semi-quantitative assessment for the diagnosis of dNDD. SBR was correlated with visual assessment. Some dNDD patients with a normal visual assessment had an abnormal SBR, and vice versa. There was no statistically significant difference between sensitivity of the diagnosis with visual assessment alone and semi-quantitative assessment alone (91.2 vs. 86.8%, respectively, p = 0.29). Combined visual and semi-quantitative assessment demonstrated superior sensitivity (96.7%) to visual assessment (p = 0.03) or semi-quantitative assessment (p = 0.003) alone with equal specificity. Visual and semi-quantitative assessments of 123I-FP-CIT SPECT are helpful for the diagnosis of dNDD, and combined visual and semi-quantitative assessment shows superior sensitivity with equal specificity.
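    The gain from combining tests follows from simple counting: if a patient is flagged when either the visual read or the SBR is abnormal, combined sensitivity can only match or exceed either test alone. A toy example with invented patient records:

```python
# Invented per-patient results: (has_dNDD, visual_abnormal, sbr_abnormal).
patients = [
    (True, True, True),
    (True, True, False),    # caught by visual assessment only
    (True, False, True),    # caught by SBR only
    (True, False, False),   # missed by both
    (False, False, False),
    (False, False, True),   # false positive on SBR
]

def sensitivity(predict):
    cases = [p for p in patients if p[0]]           # true dNDD cases
    return sum(predict(p) for p in cases) / len(cases)

vis = sensitivity(lambda p: p[1])
sbr = sensitivity(lambda p: p[2])
both = sensitivity(lambda p: p[1] or p[2])          # abnormal on either test
print(vis, sbr, both)
```

    The trade-off is that an "either test" rule can also lower specificity; the study's finding of equal specificity is the empirically interesting part.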

  12. Guideline for assessing the performance of electric power systems in natural hazard and human threat events

    USGS Publications Warehouse

    Savage, W.U.; Nishenko, S.P.; Honegger, D.G.; Kempner, L.

    2006-01-01

    Electric power utilities are familiar with and skilled in preparing for and responding to almost-routine natural hazard events such as strong wind and ice storms and seasonal floods, as well as intentional human acts such as vandalism. Recent extreme weather (hurricanes Katrina and Rita), extremely destructive international earthquakes (in Sumatra and Pakistan), and nation-wide concerns regarding future terrorist attacks have increased the pressure on utilities to take appropriate steps to avoid being overwhelmed by such infrequent and exceedingly severe events. Determining what constitutes the appropriate steps to take requires various levels of understanding of the specific hazards and the risks faced by the utility. The American Lifelines Alliance (www.americanlifelinesalliance.org) has prepared a Guideline that provides clear, concise, and nationally-applicable guidance on determining the scope and level of effort necessary to assess power system performance in the wide range of natural hazard or human threat events. Included in this Guideline are specific procedures to follow and information to consider in performing standardized assessments. With the results of such assessments, utility owners can effectively establish and carry out risk management programs that will lead to achieving appropriate levels of performance in future events. The Guideline incorporates an inquiry-driven process with a two-phase performance assessment that can be applied to power systems of any size. The screening phase enables systems or components that are clearly not at risk to be screened out early. The subsequent analysis phase uses results from the screening phase to prioritize and allocate resources for more detailed assessments of hazard, vulnerability, and system performance. This process helps assure that the scope of the assessment meets the specific performance objectives of the inquiry. A case history is presented to illustrate the type of experience with an inquiry

  13. Objective, Quantitative, Data-Driven Assessment of Chemical Probes.

    PubMed

    Antolin, Albert A; Tym, Joseph E; Komianou, Angeliki; Collins, Ian; Workman, Paul; Al-Lazikani, Bissan

    2018-02-15

    Chemical probes are essential tools for understanding biological systems and for target validation, yet selecting probes for biomedical research is rarely based on objective assessment of all potential compounds. Here, we describe the Probe Miner: Chemical Probes Objective Assessment resource, capitalizing on the plethora of public medicinal chemistry data to empower quantitative, objective, data-driven evaluation of chemical probes. We assess >1.8 million compounds for their suitability as chemical tools against 2,220 human targets and dissect the biases and limitations encountered. Probe Miner represents a valuable resource to aid the identification of potential chemical probes, particularly when used alongside expert curation. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  14. Development of vulnerability curves to typhoon hazards based on insurance policy and claim dataset

    NASA Astrophysics Data System (ADS)

    Mo, Wanmei; Fang, Weihua; Li, Xinze; Wu, Peng; Tong, Xingwei

    2016-04-01

    Vulnerability refers to the characteristics and circumstances of an exposure that make it susceptible to the damaging effects of a hazard. It can be divided into physical, social, economic, and environmental vulnerability. Physical vulnerability indicates the potential physical damage to exposure caused by natural hazards. Vulnerability curves, which quantify the loss ratio against hazard intensity with the intensity on the horizontal axis and the Mean Damage Ratio (MDR) on the vertical axis, are essential to vulnerability assessment and the quantitative evaluation of disasters. Fragility refers to the probability of diverse damage states under different hazard intensities, a characteristic of the exposure itself. Fragility curves are often used to quantify the probability that a given set of exposure reaches or exceeds a certain damage state. The development of quantitative fragility and vulnerability curves is the basis of catastrophe modeling. Generally, methods for quantitative fragility and vulnerability assessment can be categorized as empirical, analytical, or expert opinion (judgment)-based. The empirical method is one of the most popular, but it relies heavily on the availability and quality of historical hazard and loss datasets, which has always been a great challenge. The analytical method is usually based on engineering experiments; it is time-consuming, lacks built-in validation, and its credibility is therefore sometimes widely criticized. The expert opinion or judgment-based method is effective in the absence of data, but the results can be too subjective, so the uncertainty is likely to be underestimated. In this study, we present fragility and vulnerability curves developed with the empirical method, based on simulated historical typhoon wind, rainfall, and induced flood, together with insurance policy and claim datasets of more than 100 historical typhoon events. 
Firstly, an insurance exposure
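    The empirical route to a vulnerability curve amounts to binning claims by hazard intensity and averaging the damage ratio per bin. The wind speeds and loss ratios below are invented, not the insurance dataset of the study.

```python
import numpy as np

# Invented claims sample: wind speed (m/s) and loss / insured value per claim.
wind = np.array([18, 22, 25, 28, 31, 33, 36, 40, 44, 48, 52, 57])
damage_ratio = np.array([0.001, 0.002, 0.004, 0.01, 0.02, 0.03,
                         0.06, 0.10, 0.18, 0.28, 0.40, 0.55])

# Empirical vulnerability curve: Mean Damage Ratio (MDR) per intensity bin.
bins = np.array([15, 25, 35, 45, 60])
idx = np.digitize(wind, bins) - 1
mdr = [damage_ratio[idx == k].mean() for k in range(len(bins) - 1)]
for lo, hi, r in zip(bins[:-1], bins[1:], mdr):
    print(f"{lo}-{hi} m/s: MDR = {r:.3f}")
```

    Real datasets additionally require matching each claim's location to the simulated hazard footprint before binning, which is where most of the effort lies.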

  15. Seismic hazard and risk assessment for large Romanian dams situated in the Moldavian Platform

    NASA Astrophysics Data System (ADS)

    Moldovan, Iren-Adelina; Popescu, Emilia; Otilia Placinta, Anica; Petruta Constantin, Angela; Toma Danila, Dragos; Borleanu, Felix; Emilian Toader, Victorin; Moldoveanu, Traian

    2016-04-01

    Besides periodical technical inspections and the monitoring and surveillance of dams' structures and infrastructures, there are further seismic-specific requirements for dam safety. The most important is seismic risk assessment, which can be accomplished by rating dams into seismic risk classes using the theory of Bureau and Ballentine (2002) and Bureau (2003), taking into account the maximum expected peak ground motions at the dam sites - values obtained using probabilistic hazard assessment approaches (Moldovan et al., 2008) - the structures' vulnerability, and the downstream risk characteristics (human, economic, historic and cultural heritage, etc.) in the areas that might be flooded in the case of a dam failure. Probabilistic seismic hazard (PSH), vulnerability and risk studies will be carried out for dams situated in the Moldavian Platform, starting from the Izvorul Muntelui Dam, down the Bistrita and then along the Siret River and their tributaries. The most vulnerable dams will be studied in detail and flooding maps will be drawn to find the most exposed downstream localities, both for risk assessment studies and for warnings. GIS maps that clearly indicate areas that are potentially flooded are sufficient for these studies, giving information on the number of inhabitants and goods that may be destroyed. Topography included in geospatial servers is sufficient to produce them; further detailed studies are not necessary for downstream risk assessment. The results will consist of local and regional seismic information, dam-specific characteristics and locations, seismic hazard maps and risk classes for all dam sites (more than 30 dams), inundation maps (for the most vulnerable dams of the region) and possibly affected localities. 
The final goal of the studies presented here is to provide the local emergency services with warnings of a potential dam failure and ensuing flood as a result of a large earthquake, allowing further

  16. Multiparametric Quantitative Ultrasound Imaging in Assessment of Chronic Kidney Disease.

    PubMed

    Gao, Jing; Perlman, Alan; Kalache, Safa; Berman, Nathaniel; Seshan, Surya; Salvatore, Steven; Smith, Lindsey; Wehrli, Natasha; Waldron, Levi; Kodali, Hanish; Chevalier, James

    2017-11-01

    To evaluate the value of multiparametric quantitative ultrasound imaging in assessing chronic kidney disease (CKD) using kidney biopsy pathologic findings as reference standards. We prospectively measured multiparametric quantitative ultrasound markers with grayscale, spectral Doppler, and acoustic radiation force impulse imaging in 25 patients with CKD before kidney biopsy and 10 healthy volunteers. Based on all pathologic (glomerulosclerosis, interstitial fibrosis/tubular atrophy, arteriosclerosis, and edema) scores, the patients with CKD were classified into mild (no grade 3 and <2 of grade 2) and moderate to severe (at least 2 of grade 2 or 1 of grade 3) CKD groups. Multiparametric quantitative ultrasound parameters included kidney length, cortical thickness, pixel intensity, parenchymal shear wave velocity, intrarenal artery peak systolic velocity (PSV), end-diastolic velocity (EDV), and resistive index. We tested the difference in quantitative ultrasound parameters among mild CKD, moderate to severe CKD, and healthy controls using analysis of variance, analyzed correlations of quantitative ultrasound parameters with pathologic scores and the estimated glomerular filtration rate (GFR) using Pearson correlation coefficients, and examined the diagnostic performance of quantitative ultrasound parameters in determining moderate CKD and an estimated GFR of less than 60 mL/min/1.73 m² using receiver operating characteristic curve analysis. There were significant differences in cortical thickness, pixel intensity, PSV, and EDV among the 3 groups (all P < .01). Among quantitative ultrasound parameters, the top areas under the receiver operating characteristic curves for PSV and EDV were 0.88 and 0.97, respectively, for determining pathologic moderate to severe CKD, and 0.76 and 0.86 for an estimated GFR of less than 60 mL/min/1.73 m². Moderate to good correlations were found for PSV, EDV, and pixel intensity with pathologic scores and estimated GFR. The
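    Computing an area under the ROC curve needs no special library: it equals the Mann-Whitney probability that a randomly chosen positive case ranks on the correct side of a randomly chosen negative case. A sketch with invented EDV values, assuming lower EDV indicates more severe disease:

```python
# Invented EDV measurements (cm/s) for moderate-to-severe vs mild CKD.
severe = [4.1, 4.8, 5.2, 5.9, 6.3]
mild = [7.0, 7.4, 8.1, 8.8, 9.5]

def auc_lower_is_positive(pos, neg):
    """AUC via the Mann-Whitney U statistic, treating LOWER values as
    indicating the positive (severe) class; ties count as half."""
    pairs = [(p, n) for p in pos for n in neg]
    wins = sum(1.0 if p < n else 0.5 if p == n else 0.0 for p, n in pairs)
    return wins / len(pairs)

print(auc_lower_is_positive(severe, mild))   # 1.0: perfect separation here
```

    With overlapping distributions the value drops toward 0.5, which is the chance-level baseline against which the study's AUCs of 0.86-0.97 should be read.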

  17. Mapping basin-wide subaquatic slope failure susceptibility as a tool to assess regional seismic and tsunami hazards

    NASA Astrophysics Data System (ADS)

    Strasser, Michael; Hilbe, Michael; Anselmetti, Flavio S.

    2010-05-01

    With increasing awareness of oceanic geohazards, submarine landslides are gaining wide attention because of their catastrophic impacts on both offshore infrastructures (e.g. pipelines, cables and platforms) and coastal areas (e.g. landslide-induced tsunamis). They also are of great interest because they can be directly related to primary trigger mechanisms including earthquakes, rapid sedimentation, gas release, glacial and tidal loading, wave action, or clathrate dissociation, many of which represent potential geohazards themselves. In active tectonic environments, for instance, subaquatic landslide deposits can be used to make inferences regarding the hazard derived from seismic activity. Enormous scientific and economic efforts are thus being undertaken to better determine and quantify causes and effects of natural hazards related to subaquatic landslides. In order to achieve this fundamental goal, the detailed study of past events, the assessment of their recurrence intervals and the quantitative reconstruction of magnitudes and intensities of both causal and subsequent processes and impacts are key requirements. Here we present data and results from a study using fjord-type Lake Lucerne in central Switzerland as a "model ocean" to test a new concept for the assessment of regional seismic and tsunami hazard by basin-wide mapping of critical slope stability conditions for subaquatic landslide initiation. Previously acquired high-resolution bathymetry and reflection seismic data as well as sedimentological and in situ geotechnical data, provide a comprehensive data base to investigate subaquatic landslides and related geohazards. Available data are implemented into a basin-wide slope model. In a Geographic Information System (GIS)-framework, a pseudo-static limit equilibrium infinite slope stability equation is solved for each model point representing reconstructed slope conditions at different times in the past, during which earthquake-triggered landslides

  18. The Quantitative Reasoning for College Science (QuaRCS) Assessment: Emerging Themes from 5 Years of Data

    NASA Astrophysics Data System (ADS)

    Follette, Katherine; Dokter, Erin; Buxner, Sanlyn

    2018-01-01

    The Quantitative Reasoning for College Science (QuaRCS) Assessment is a validated assessment instrument that was designed to measure changes in students' quantitative reasoning skills, attitudes toward mathematics, and ability to accurately assess their own quantitative abilities. It has been administered to more than 5,000 students at a variety of institutions at the start and end of a semester of general education college science instruction. I will begin by briefly summarizing our published work surrounding validation of the instrument and identification of underlying attitudinal factors (composite variables identified via factor analysis) that predict 50% of the variation in students' scores on the assessment. I will then discuss more recent unpublished work, including: (1) Development and validation of an abbreviated version of the assessment (The QuaRCS Light), which results in marked improvements in students' ability to maintain a high effort level throughout the assessment and has broad implications for quantitative reasoning assessments in general, and (2) Our efforts to revise the attitudinal portion of the assessment to better assess math anxiety level, another key factor in student performance on numerical assessments.

  19. Transportation of hazardous materials

    DOT National Transportation Integrated Search

    1986-07-01

    This report discusses transportation of all hazardous materials (commodities, radioactive materials including spent nuclear fuel, and hazardous wastes) that travel by truck, rail, water, or air. The Office of Technology Assessment (OTA) has ide...

  20. Participatory health impact assessment for the development of local government regulation on hazard control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Inmuong, Uraiwan, E-mail: uraiwan@kku.ac.t; Faculty of Public Health, Khon Kaen University, Thailand 123 Mittrapharb Road, Khon Kaen 40002; Rithmak, Panee, E-mail: panrit@kku.ac.t

    The Thai Public Health Act 1992 required Thai local governments to issue regulations to control any activities with possible health-hazard implications, from both commercial and noncommercial sources. Since 1999, power has been decentralized to a new form of local government, namely the Sub-district Administrative Organization (SAO). The SAO is a small-scale local governing structure whose legitimate function is community services, including control of activities with health impacts. Most elected SAO administrators and officers are new and have little experience with public health codes of practice, particularly health-hazard control. This action research attempted to introduce and apply a participatory health impact assessment (HIA) tool for the development of SAO health-hazard control regulation. The study sites were the Ban Meang and Kok See SAOs, Khon Kaen Province, Thailand, and all intervention activities were conducted during May 2005-April 2006. A set of cooperative activities between researchers and community representatives was planned and organized: surveying and identifying places and services causing local environmental health problems, organizing community participatory workshops for drafting and proposing the health-hazard control regulation, and establishing appropriate practices for health-hazard control measures. This action research ultimately enabled the SAO administrators and officers to understand local environment-related health problems, and to develop the health-hazard control regulation imposed for the local community.

  1. Assessing the validity of prospective hazard analysis methods: a comparison of two techniques

    PubMed Central

    2014-01-01

    Background Prospective hazard analysis techniques such as Healthcare Failure Modes and Effects Analysis (HFMEA) and the Structured What If Technique (SWIFT) have the potential to increase safety by identifying risks before an adverse event occurs. Published accounts of their application in healthcare have identified benefits, but the reliability of some methods has been found to be low. The aim of this study was to examine the validity of SWIFT and HFMEA by comparing their outputs in the process of risk assessment, and by comparing the results with risks identified by retrospective methods. Methods The setting was a community-based anticoagulation clinic, in which risk assessment activities had previously been performed and were available. A SWIFT and an HFMEA workshop were conducted consecutively on the same day by experienced experts. Participants were a mixture of pharmacists, administrative staff and software developers. Both methods produced lists of risks scored according to the method’s procedure. Participants’ views about the value of the workshops were elicited with a questionnaire. Results SWIFT identified 61 risks and HFMEA identified 72 risks. For both methods, fewer than half the hazards were identified by the other method. There was also little overlap between the results of the workshops and the risks identified by prior root cause analysis, staff interviews or clinical governance board discussions. Participants’ feedback indicated that the workshops were viewed as useful. Conclusions Although there was limited overlap, both methods raised important hazards. Scoping of the problem area had a considerable influence on the outputs. The opportunity for teams to discuss their work from a risk perspective is valuable, but these methods cannot be relied upon in isolation to provide a comprehensive description. Multiple methods for identifying hazards should be used, and data from different sources should be integrated to give a comprehensive view of risk in a system.
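    The overlap comparison reported above can be sketched with simple set arithmetic. The hazard labels below are hypothetical (the paper reports only counts and the degree of overlap), so this is an illustration of the measure, not the study's data:

```python
# Overlap between hazard lists produced by two prospective methods.
# Hazard labels are illustrative placeholders, not from the study.

def overlap_stats(a, b):
    """Return (shared, only_a, only_b, jaccard) for two hazard sets."""
    a, b = set(a), set(b)
    shared = a & b
    union = a | b
    jaccard = len(shared) / len(union) if union else 0.0
    return len(shared), len(a - b), len(b - a), jaccard

swift = {"wrong dose entered", "missed INR result", "duplicate record"}
hfmea = {"missed INR result", "late clinic letter", "duplicate record",
         "illegible fax"}

shared, only_swift, only_hfmea, j = overlap_stats(swift, hfmea)
print(shared, only_swift, only_hfmea, round(j, 2))  # 2 1 2 0.4
```

    A Jaccard index well below 0.5, as here, is the kind of result the study describes: each method surfaces hazards the other misses.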

  2. Inundation Mapping and Hazard Assessment of Tectonic and Landslide Tsunamis in Southeast Alaska

    NASA Astrophysics Data System (ADS)

    Suleimani, E.; Nicolsky, D.; Koehler, R. D., III

    2014-12-01

    The Alaska Earthquake Center conducts tsunami inundation mapping for coastal communities in Alaska, and is currently focused on the southeastern region and the communities of Yakutat, Elfin Cove, Gustavus and Hoonah. This activity provides local emergency officials with tsunami hazard assessment, planning, and mitigation tools. At-risk communities are distributed along several segments of the Alaska coastline, each with a unique seismic history and potential tsunami hazard. Thus, a critical component of our project is accurate identification and characterization of potential tectonic and landslide tsunami sources. The primary tectonic element of Southeast Alaska is the Fairweather-Queen Charlotte fault system, which has ruptured in five large strike-slip earthquakes in the past 100 years. The 1958 "Lituya Bay" earthquake triggered a large landslide into Lituya Bay that generated a 540-m-high wave. The M7.7 Haida Gwaii earthquake of October 28, 2012 occurred along the same fault but was associated with dominantly vertical motion, generating a local tsunami. Communities in Southeast Alaska are also vulnerable to hazards related to locally generated waves, owing to the proximity of communities to landslide-prone fjords and frequent earthquakes. The primary mechanisms for local tsunami generation are failure of steep rock slopes due to relaxation of internal stresses after deglaciation, and failure of thick unconsolidated sediments accumulated on underwater delta fronts at river mouths. We numerically model potential tsunami waves and the inundation extent that may result from future hypothetical far- and near-field earthquakes and landslides. We perform simulations for each source scenario using the Alaska Tsunami Model, which is validated through a set of analytical benchmarks and tested against laboratory and field data. Results of numerical modeling combined with historical observations are compiled on inundation maps and used for site-specific tsunami hazard assessment.

  3. The use of Near-surface Geophysics in Evaluating and Assessing Natural Hazards

    NASA Astrophysics Data System (ADS)

    Pellerin, L.

    2007-12-01

    The list of natural hazards that transform the physical environment is extensive: earthquakes, tsunamis, floods, volcanoes, lahars, landslides and debris flows, avalanches, karst/cavern collapse, heavy-metal contamination, permafrost, liquefaction, and magnetic storms. Because these events or conditions can have a significant negative impact on health and infrastructure, knowledge about and education on natural hazards are important. Near-surface geophysics can contribute in significant ways to both the knowledge base and the wider understanding of these hazards. The discipline encompasses a wide range of methodologies, some of which are described below. A post-tsunami helicopter electromagnetic (EM) survey along the coasts of Aceh, northern Sumatra, was used to discriminate between freshwater and saltwater aquifers. Saltwater intrusion occurred close to the coast as a result of the tsunami, and deep saltwater occurrences, particularly around 30 m depth, were mapped up to several kilometers inland. Based on the survey results, recommendations were made on locating shallow hand-dug wells and medium-depth (60 m) water wells. Utilizing airborne EM and magnetic measurements, a detailed assessment of the internal distribution of altered zones within an active volcano, Mount Rainier (NW USA), showed that alteration is much more restricted than had been inferred from surficial exposures alone. The study also suggested that collapse of fresh, unaltered portions of the volcano is possible, and that no flank of the volcano can be considered immune from lahars during an eruption. Ground penetrating radar (GPR) has been used worldwide in a variety of applications, from geotechnical investigations to those related to geologic hazards. These include assessment of transportation infrastructure that may be damaged by a natural hazard, study of the movement of rock glaciers in the Swiss Alps, and search and recovery of avalanche victims. 
    Permafrost is widespread in polar areas and cold

  4. Hazardous Waste: Cleanup and Prevention.

    ERIC Educational Resources Information Center

    Vandas, Steve; Cronin, Nancy L.

    1996-01-01

    Discusses hazardous waste, waste disposal, unsafe exposure, movement of hazardous waste, and the Superfund clean-up process that consists of site discovery, site assessment, clean-up method selection, site clean up, and site maintenance. Argues that proper disposal of hazardous waste is everybody's responsibility. (JRH)

  5. Quantitative assessment of medical waste generation in the capital city of Bangladesh

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patwary, Masum A.; O'Hare, William Thomas; Street, Graham

    2009-08-15

    There is a concern that mismanagement of medical waste in developing countries may be a significant risk factor for disease transmission. Quantitative estimation of medical waste generation is needed to estimate the potential risk and as a basis for any waste management plan. Dhaka City, the capital of Bangladesh, is an example of a major city in a developing country where there had been no rigorous estimation of medical waste generation based upon a thorough scientific study. This study used a statistically designed sampling of waste generation in a broad range of Health Care Establishments (HCEs) to estimate the amount of waste produced in Dhaka at 37 ± 5 tonnes per day. The estimate was obtained by stringent weighing of waste in a carefully chosen, representative sample of HCEs, including non-residential diagnostic centres. The proportion of this waste that would be classified as hazardous under World Health Organisation (WHO) guidelines was found to be approximately 21%. The amount of waste, and the proportion of hazardous waste, varied significantly with the size and type of HCE.
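    The scaling step behind such an estimate can be sketched as a stratified sum. The per-facility weights and facility counts below are purely illustrative assumptions (the paper reports only the final 37 ± 5 tonne/day figure and the ~21% hazardous fraction):

```python
# Hedged sketch: scale sampled per-facility waste weights to a city-wide
# daily total. All strata figures are hypothetical, not the study's data.

def citywide_estimate(strata):
    """strata: list of (mean_kg_per_facility_per_day, n_facilities)."""
    total_kg = sum(mean * n for mean, n in strata)
    return total_kg / 1000.0  # tonnes per day

strata = [
    (120.0, 150),  # large hospitals (hypothetical)
    (25.0, 400),   # clinics (hypothetical)
    (12.0, 750),   # diagnostic centres (hypothetical)
]
total = citywide_estimate(strata)
hazardous = total * 0.21  # ~21% hazardous per WHO classification
print(round(total, 1), round(hazardous, 1))  # 37.0 7.8
```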

  6. The effect of the sea on hazard assessment for tephra fallout at Campi Flegrei: a preliminary approach through the use of pyPHaz, an open tool to analyze and visualize probabilistic hazards

    NASA Astrophysics Data System (ADS)

    Tonini, Roberto; Sandri, Laura; Costa, Antonio; Selva, Jacopo

    2014-05-01

    Campi Flegrei (CF) is a large volcanic field located west of the Gulf of Naples, characterized by a wide and almost circular caldera which is partially submerged beneath the Gulf of Pozzuoli. Magma-water interaction is known to be a key element in determining the character of submarine eruptions and their impact on the surrounding areas, but this phenomenon is still not well understood and is rarely considered in hazard assessment. The aim of the present work is to present a preliminary study of the effect of the sea on the tephra fall hazard from CF in the municipality of Naples, introducing a variability in the probability of tephra production according to the eruptive scale (defined on the basis of the erupted volume) and the depth of the opening submerged vents. Four different Probabilistic Volcanic Hazard Assessment (PVHA) models have been defined through the application of the BET_VH model at CF, accounting for different modelling procedures and assumptions for the submerged part of the caldera. In particular, we take into account: 1) the effect of the sea as null, i.e. as if the water were not present; 2) the effect of the sea as a cap that totally blocks the explosivity of eruptions and consequently the tephra production; 3) an ensemble model of the two models described in points 1) and 2); 4) a variable probability of tephra production depending on the depth of the submerged vent. The PVHA models are then input to pyPHaz, a tool developed and designed at INGV to visualize, analyze and merge into ensemble models the results of PVHA and, potentially, of any other kind of probabilistic hazard assessment, both natural and anthropogenic, in order to evaluate the importance of considering the variability between subaerial and submerged vents for the tephra fallout hazard from CF in Naples. 
    The analysis is preliminary and does not claim to be exhaustive; on the one hand it represents a starting point for future work, and on the other hand it is a good
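    The ensemble step (point 3 above) amounts to a weighted average of per-cell exceedance probabilities from the alternative models. The cell values and equal weights below are illustrative assumptions, not BET_VH or pyPHaz output:

```python
import numpy as np

# Hedged sketch of an ensemble PVHA: per-cell exceedance probabilities
# from alternative models are combined with (possibly unequal) weights.

def ensemble(prob_maps, weights=None):
    """prob_maps: (n_models, n_cells) array of exceedance probabilities."""
    prob_maps = np.asarray(prob_maps, dtype=float)
    if weights is None:  # default: equal credibility for every model
        weights = np.full(prob_maps.shape[0], 1.0 / prob_maps.shape[0])
    return np.asarray(weights, dtype=float) @ prob_maps

no_sea  = [0.30, 0.10, 0.05]  # model 1: water ignored (hypothetical cells)
sea_cap = [0.10, 0.02, 0.00]  # model 2: sea blocks tephra production
print(np.round(ensemble([no_sea, sea_cap]), 3))
```

    Unequal weights would express differing credibility of the two end-member assumptions about the sea's effect.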

  7. Quantitative assessment of changes in landslide risk using a regional scale run-out model

    NASA Astrophysics Data System (ADS)

    Hussin, Haydar; Chen, Lixia; Ciurean, Roxana; van Westen, Cees; Reichenbach, Paola; Sterlacchini, Simone

    2015-04-01

    The risk from landslide hazard changes continuously in time and space and is rarely a static or constant phenomenon in an affected area. However, one of the main challenges of quantitatively assessing changes in landslide risk is the availability of multi-temporal data for the different components of risk. Furthermore, a truly "quantitative" landslide risk analysis requires modelling of the landslide intensity (e.g. flow depth, velocities or impact pressures) affecting the elements at risk. Such a quantitative approach is often lacking in medium- to regional-scale studies in the scientific literature, or is left out altogether. In this research we modelled the temporal and spatial changes of debris flow risk in a narrow alpine valley in the north-eastern Italian Alps. The debris flow inventory from 1996 to 2011 and multi-temporal digital elevation models (DEMs) were used to assess the susceptibility of debris flow triggering areas and to simulate debris flow run-out using the Flow-R regional scale model. To determine debris flow intensities, we used a linear relationship found between back-calibrated, physically based Flo-2D simulations (local-scale models of five debris flows from 2003) and the probability values of the Flow-R software. This made it possible to assign flow depths to a total of 10 separate classes at a regional scale. Debris flow vulnerability curves from the literature, and one curve derived specifically for our case study area, were used to determine the damage for the different material and building types associated with the elements at risk. Building values were obtained from the Italian Revenue Agency (Agenzia delle Entrate) and were classified per cadastral zone according to the Real Estate Observatory data (Osservatorio del Mercato Immobiliare, Agenzia Entrate - OMI). The minimum and maximum market value for each building was obtained by multiplying the corresponding land-use value (€/m²) by the building area and number of floors.
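    The quantitative chain described above can be sketched as scenario probability × vulnerability(intensity) × value, summed over the elements at risk. The depth-to-vulnerability curve and building values below are hypothetical stand-ins, not the study's calibrated curves or OMI data:

```python
# Minimal sketch of quantitative debris-flow risk for one scenario:
# risk = P(scenario) * sum_buildings vulnerability(flow depth) * value.

def vulnerability(depth_m):
    """Illustrative vulnerability curve (0..1) vs debris-flow depth."""
    return min(1.0, 0.25 * depth_m)

def scenario_risk(p_scenario, buildings):
    """buildings: list of (flow_depth_m, building_value_eur)."""
    return p_scenario * sum(vulnerability(d) * v for d, v in buildings)

# Hypothetical buildings hit by modelled flow depths (m) with € values.
buildings = [(0.5, 200_000), (2.0, 350_000), (5.0, 150_000)]
print(scenario_risk(0.01, buildings))  # expected annual loss for p=0.01
```

    Repeating this for minimum, average and maximum input values yields the family of risk curves the study describes.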

  8. Environmental and health hazard ranking and assessment of plastic polymers based on chemical composition.

    PubMed

    Lithner, Delilah; Larsson, Ake; Dave, Göran

    2011-08-15

    Plastics constitute a large material group with a global annual production that has doubled in 15 years (245 million tonnes in 2008). Plastics are present everywhere in society and the environment, especially the marine environment, where large amounts of plastic waste accumulate. Knowledge of the human and environmental hazards and risks from chemicals associated with the diversity of plastic products is very limited. Most chemicals used for producing plastic polymers are derived from non-renewable crude oil, and several are hazardous. These may be released during the production, use and disposal of the plastic product. In this study the environmental and health hazards of chemicals used in 55 thermoplastic and thermosetting polymers were identified and compiled. A hazard ranking model was developed for the hazard classes and categories in the EU classification and labelling (CLP) regulation, which is based on the UN Globally Harmonized System. The polymers were ranked based on monomer hazard classifications, and initial assessments were made. The polymers that ranked as most hazardous are made of monomers classified as mutagenic and/or carcinogenic (category 1A or 1B). These belong to the polymer families of polyurethanes, polyacrylonitriles, polyvinyl chloride, epoxy resins, and styrenic copolymers. All have a large global annual production (1-37 million tonnes). A considerable number of polymers (31 out of 55) are made of monomers that belong to the two worst of the ranking model's five hazard levels, i.e. levels IV-V. The polymers that are made of level IV monomers and have a large global annual production (1-5 million tonnes) are phenol formaldehyde resins, unsaturated polyesters, polycarbonate, polymethyl methacrylate, and urea-formaldehyde resins. This study has identified hazardous substances used in polymer production for which the risks should be evaluated for decisions on the need for risk reduction measures, substitution, or even phase-out.
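    The monomer-based ranking idea can be sketched as a worst-case lookup: a polymer inherits the hazard level of its most hazardous monomer. The mapping from classification strings to levels I-V below is a simplified assumption, not the paper's full CLP-based model:

```python
# Sketch of worst-case monomer ranking (level 5 = worst, per a five-level
# scale like the paper's I-V). The classification-to-level map is a
# simplified, hypothetical subset of CLP hazard categories.

LEVEL = {"acute toxicity 4": 2, "skin sensitiser 1": 3,
         "carcinogen 1B": 5, "mutagen 1B": 5}

def polymer_level(monomer_classifications):
    """Worst level over all monomers' hazard classifications."""
    return max((LEVEL.get(c, 1)
                for cs in monomer_classifications for c in cs),
               default=1)

pvc = [["carcinogen 1B"]]         # vinyl chloride monomer
pet = [["acute toxicity 4"], []]  # hypothetical, more benign monomers
print(polymer_level(pvc), polymer_level(pet))  # 5 2
```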

  9. Best Practices in Physics-Based Fault Rupture Models for Seismic Hazard Assessment of Nuclear Installations

    NASA Astrophysics Data System (ADS)

    Dalguer, Luis A.; Fukushima, Yoshimitsu; Irikura, Kojiro; Wu, Changjiang

    2017-09-01

    Inspired by the first workshop on Best Practices in Physics-Based Fault Rupture Models for Seismic Hazard Assessment of Nuclear Installations (BestPSHANI), conducted by the International Atomic Energy Agency (IAEA) on 18-20 November 2015 in Vienna (http://www-pub.iaea.org/iaeameetings/50896/BestPSHANI), this PAGEOPH topical volume collects several extended articles from the workshop as well as several new contributions. A total of 17 papers have been selected, on topics ranging from the seismological aspects of earthquake cycle simulations for source-scaling evaluation, seismic source characterization, source inversion and ground motion modeling (based on finite fault rupture using dynamic, kinematic, stochastic and empirical Green's function approaches) to the engineering application of simulated ground motion for the analysis of the seismic response of structures. These contributions include applications to real earthquakes and descriptions of current practice in assessing seismic hazard in terms of nuclear safety in low-seismicity areas, as well as proposals for physics-based hazard assessment for critical structures near large earthquakes. Collectively, the papers of this volume highlight the usefulness of physics-based models for evaluating and understanding the physical causes of observed and empirical data, as well as for predicting ground motion beyond the range of recorded data. Particular importance is given to the validation and verification of the models by comparing synthetic results with observed data and empirical models.

  10. Fast Risk Assessment Software For Natural Hazard Phenomena Using Georeference Population And Infrastructure Data Bases

    NASA Astrophysics Data System (ADS)

    Marrero, J. M.; Pastor Paz, J. E.; Erazo, C.; Marrero, M.; Aguilar, J.; Yepes, H. A.; Estrella, C. M.; Mothes, P. A.

    2015-12-01

    Disaster Risk Reduction (DRR) requires an integrated multi-hazard assessment approach to natural hazard mitigation. In the case of volcanic risk, long-term hazard maps are generally developed on the basis of the most probable scenarios (likelihood of occurrence) or worst cases. In the short term, however, expected scenarios may vary substantially depending on monitoring data or new knowledge. In this context, the time required to obtain and process data is critical for optimum decision making. Availability of up-to-date volcanic scenarios is as crucial as having these data accompanied by efficient estimations of their impact on populations and infrastructure. To address this impact estimation during volcanic crises, or other natural hazards, a web interface has been developed to execute an ANSI C application. This application allows one to compute, in a matter of seconds, the demographic and infrastructure impact that any natural hazard may cause, employing an overlay-layer approach. The web interface is tailored to users involved in the crisis management of Cotopaxi volcano (Ecuador). The population database and the cartographic base used are in the public domain, published by the National Office of Statistics of Ecuador (INEC, by its Spanish acronym). To run the application and obtain results, the user uploads a raster file containing information related to the volcanic hazard, or any other natural hazard, and defines categories to group the population or infrastructure potentially affected. The results are displayed in a user-friendly report.
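    The overlay-layer approach can be sketched as: classify the hazard raster into user-defined categories, then sum the co-registered population raster per category. The tiny grids and bin edges below are illustrative, not Cotopaxi or INEC data:

```python
import numpy as np

# Sketch of an overlay-layer impact count: population summed per
# hazard category. Grids are tiny hypothetical arrays.

def impact_by_category(hazard, population, bins):
    """bins: category edges for np.digitize; returns people per category."""
    cat = np.digitize(hazard, bins)  # categories 0..len(bins)
    return [int(population[cat == k].sum()) for k in range(len(bins) + 1)]

hazard = np.array([[0.0, 0.2], [0.6, 0.9]])  # e.g. ashfall load (arbitrary)
pop = np.array([[120, 80], [40, 10]])        # co-registered population
print(impact_by_category(hazard, pop, bins=[0.1, 0.5]))  # [120, 80, 50]
```

    With pre-loaded rasters, this per-category sum is a vectorized pass over the grid, which is consistent with the seconds-scale response times the abstract describes.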

  11. The KULTURisk Regional Risk Assessment methodology for water-related natural hazards - Part 1: Physical-environmental assessment

    NASA Astrophysics Data System (ADS)

    Ronco, P.; Gallina, V.; Torresan, S.; Zabeo, A.; Semenzin, E.; Critto, A.; Marcomini, A.

    2014-07-01

    In recent years the frequency of catastrophes induced by natural hazards has increased, and flood events in particular have been recognized as one of the most threatening water-related disasters. Severe floods have occurred in Europe over the last decade, causing loss of life, displacement of people and heavy economic losses. Flood disasters are growing as a consequence of many factors, both climatic and non-climatic. Indeed, the current increase in water-related disasters can be mainly attributed to increased exposure (more elements potentially at risk in floodplain areas) and vulnerability (i.e. the economic, social, geographic, cultural, and physical/environmental characteristics of the exposed elements). Besides these factors, climate change is projected to radically modify the usual pattern of the hydrological cycle by intensifying the frequency and severity of flood events at local, regional and global scales. In this context, there is an urgent need to promote and develop effective, pro-active strategies, tools and actions that allow the flood risks threatening different receptors to be assessed and, where possible, reduced. Several methodologies to assess the risk posed by water-related natural hazards have been proposed so far, but very few of them can be adopted to implement the recent European Floods Directive (FD). The present study introduces a state-of-the-art Regional Risk Assessment (RRA) methodology to evaluate the benefits of risk prevention in terms of reduced environmental risks due to floods. The methodology, developed within the recently concluded FP7 KULTURisk project (Knowledge-based approach to develop a cULTUre of Risk prevention - KR), is flexible and can be adapted to different case studies (i.e. large rivers, alpine/mountain catchments, urban areas and coastal areas) and spatial scales (i.e. from the large river to the urban scale). 
The FD compliant

  12. 78 FR 44625 - Proposed Information Collection (Open Burn Pit Registry Airborne Hazard Self-Assessment...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-24

    ... DEPARTMENT OF VETERANS AFFAIRS Proposed Information Collection (Open Burn Pit Registry Airborne... to ``OMB Control No. 2900--NEW, Open Burn Pit Registry Airborne Hazard Self-Assessment Questionnaire... health effects of service members' exposure to toxic airborne chemicals and fumes caused by open burn...

  13. Use of the microscreen phage-induction assay to assess the genotoxicity of 14 hazardous industrial wastes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Houk, V.S.; DeMarini, D.M.

    1988-01-01

    The Microscreen phage-induction assay, which quantitatively measures the induction of prophage lambda in Escherichia coli WP2s(lambda), was used to test 14 crude (unfractionated) hazardous industrial waste samples for genotoxic activity in the presence and absence of metabolic activation. Eleven of the 14 wastes induced prophage, and induction was observed at concentrations as low as 0.4 pg per ml. Comparisons between the ability of these waste samples to induce prophage and their mutagenicity in the Salmonella reverse mutation assay indicate that the phage-induction assay detected genotoxic activity in all but one of the wastes that were mutagenic in Salmonella. Moreover, the Microscreen assay detected as genotoxic five additional wastes that were not detected in the Salmonella assay. The applicability of the Microscreen phage-induction assay for screening hazardous wastes for genotoxic activity is discussed, as are some of the problems associated with screening highly toxic wastes containing toxic volatile compounds.

  15. Risk analysis based on hazards interactions

    NASA Astrophysics Data System (ADS)

    Rossi, Lauro; Rudari, Roberto; Trasforini, Eva; De Angeli, Silvia; Becker, Joost

    2017-04-01

    Despite an increasing need for open, transparent, and credible multi-hazard risk assessment methods, models, and tools, the availability of comprehensive risk information needed to inform disaster risk reduction is limited, and the level of interaction across hazards is not systematically analysed. Risk assessment methodologies for different hazards often produce risk metrics that are not comparable. Hazard interactions (the consecutive occurrence of two or more different events) are generally neglected, resulting in strongly underestimated risk in the most exposed areas. This study presents cases of interaction between different hazards, showing how subsidence can affect coastal and river flood risk (Jakarta and Bandung, Indonesia) or how flood risk is modified after a seismic event (Italy). The analysis of well-documented real study cases, based on a combination of Earth Observation and in-situ data, serves as a basis for formalising a multi-hazard methodology and for identifying gaps and research frontiers. Multi-hazard risk analysis is performed through the RASOR platform (Rapid Analysis and Spatialisation Of Risk). A scenario-driven query system allows users to simulate future scenarios based on existing and assumed conditions, to compare them with historical scenarios, and to model multi-hazard risk both before and during an event (www.rasor.eu).

  16. Methylmercury Poisoning—An Assessment of the Sportfish Hazard in California

    PubMed Central

    Dales, Loring; Kahn, Ephraim; Wei, Eddie

    1971-01-01

    A quantitative assessment of the methylmercury risk in California entails measurement of the contamination distribution, the probability of methylmercury intake and knowledge of the toxicological properties of methylmercury. This article reviews the scientific basis for the California State Task Force's decision to warn the public against excessive consumption of sport fish contaminated by methylmercury. PMID:5544687
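    The intake side of such an assessment reduces to simple unit arithmetic: weekly methylmercury intake per kilogram of body weight, compared against a tolerable limit. The 1.6 µg/kg bw/week threshold used below is the later JECFA provisional tolerable weekly intake, not a figure from this 1971 article, and the concentration and consumption values are illustrative:

```python
# Back-of-envelope methylmercury exposure sketch. All inputs are
# illustrative; the 1.6 µg/kg bw/week limit is the later JECFA PTWI.

def weekly_intake_ug_per_kg(conc_mg_per_kg, fish_g_per_week, body_kg):
    """Weekly MeHg intake normalised to body weight (µg/kg bw/week)."""
    intake_ug = conc_mg_per_kg * 1000.0 * (fish_g_per_week / 1000.0)
    return intake_ug / body_kg

# Hypothetical: 0.5 mg MeHg per kg fish, 400 g fish/week, 70 kg adult.
intake = weekly_intake_ug_per_kg(0.5, 400.0, 70.0)
print(round(intake, 2), intake > 1.6)  # 2.86 True
```

    A result above the limit is the kind of finding that motivates consumption advisories like the one described above.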

  17. Local to global: a collaborative approach to volcanic risk assessment

    NASA Astrophysics Data System (ADS)

    Calder, Eliza; Loughlin, Sue; Barsotti, Sara; Bonadonna, Costanza; Jenkins, Susanna

    2017-04-01

    Volcanic risk assessments at all scales present challenges related to the multitude of volcanic hazards, data gaps (in hazards and vulnerability in particular), model representation and resources. Volcanic hazards include lahars, pyroclastic density currents, lava flows, tephra fall, ballistics, gas dispersal, and also earthquakes, debris avalanches, tsunamis and more; they can occur in different combinations and interact in different ways throughout the unrest, eruption and post-eruption period. Volcanoes and volcanic hazards also interact with other natural hazards (e.g. intense rainfall). Currently, many hazard assessments consider the hazards from a single volcano, but at national to regional scales the potential impacts of multiple volcanoes over time become important. The hazards with the greatest tendency to affect large areas, up to the global scale, are those transported in the atmosphere: volcanic particles and gases. Volcanic ash dispersal has the greatest potential to directly or indirectly affect the largest number of people worldwide, and it is currently the only volcanic hazard for which a global assessment exists. The quantitative framework used (primarily at a regional scale) considers the hazard at a given location from any volcano. Flow hazards such as lahars and floods can have devastating impacts tens of kilometres from a source volcano, and lahars can be devastating decades after an eruption has ended. Quantitative assessment of impacts is increasingly undertaken after eruptions to identify thresholds for damage and reduced functionality. Some hazards, such as lava flows, could be considered binary (totally destructive), but others (e.g. ash fall) have varying degrees of impact. Such assessments are needed to enhance available impact and vulnerability data. Currently, most studies focus on physical vulnerability, but there is a growing emphasis on social vulnerability, showing that it is highly variable and dynamic with pre-eruption socio

  18. Biointerfaces for Two-way Communication to Assess Hazards in the Aquatic Environment.

    DTIC Science & Technology

    1999-11-01

    This report addresses approaches for using aquatic organisms (sentinel species) as biointerfaces to provide timely information on contaminants in freshwater and marine environments, and discusses some of the associated research challenges. (Technical Report 0001, U.S. Army Center for Environmental Health Research.)

  19. Combining probabilistic hazard assessment with cost-benefit analysis to support decision making in a volcanic crisis from the Auckland Volcanic Field, New Zealand

    NASA Astrophysics Data System (ADS)

    Sandri, Laura; Jolly, Gill; Lindsay, Jan; Howe, Tracy; Marzocchi, Warner

    2010-05-01

    One of the main challenges of modern volcanology is to provide the public with robust and useful information for decision making in land-use planning and in emergency management. From the scientific point of view, this translates into reliable and quantitative long- and short-term volcanic hazard assessment and eruption forecasting. Because of the complexity of characterizing volcanic events, and the natural variability of volcanic processes, a probabilistic approach is more suitable than deterministic modeling. In recent years, two probabilistic codes have been developed for quantitative short- and long-term eruption forecasting (BET_EF) and volcanic hazard assessment (BET_VH). Both are based on a Bayesian event tree, in which volcanic events are seen as a chain of logical steps of increasing detail. At each node of the tree, the probability is computed by taking into account different sources of information, such as geological and volcanological models, past occurrences, expert opinion and numerical modeling of volcanic phenomena. Since it is a Bayesian tool, the output probability is not a single number but a probability distribution accounting for aleatory and epistemic uncertainty. In this study, we apply BET_VH to quantify the long-term volcanic hazard due to base surge invasion in the region around Auckland, New Zealand's most populous city. Here, small basaltic eruptions from monogenetic cones pose a considerable risk to the city in the case of phreatomagmatic activity: evidence for base surges is not uncommon in deposits from past events. Currently, we are focusing in particular on the scenario simulated during Exercise Ruaumoko, a national disaster exercise based on the build-up to an eruption in the Auckland Volcanic Field. Based on recent papers by Marzocchi and Woo, we suggest a possible quantitative strategy to link probabilistic scientific output and Boolean decision making. 
It is based on cost-benefit analysis, in which all costs
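    The cost-benefit decision rule referenced above (after Marzocchi and Woo) can be sketched as: act when the expected avoided loss exceeds the cost of acting, i.e. when the event probability p exceeds the cost/loss ratio C/L. The monetary figures below are illustrative assumptions:

```python
# Sketch of a cost-benefit threshold for Boolean decisions (e.g. call an
# evacuation). Rule: act when p * L > C, i.e. p > C / L. Numbers are
# hypothetical, not from the Auckland study.

def should_evacuate(p_event, cost_evacuation, loss_if_no_evacuation):
    """True when expected avoided loss exceeds the evacuation cost."""
    return p_event * loss_if_no_evacuation > cost_evacuation

# Hypothetical: evacuation costs 5 M, unmitigated loss would be 200 M,
# so the probability threshold is 5e6 / 200e6 = 0.025.
print(should_evacuate(0.05, 5e6, 200e6),
      should_evacuate(0.01, 5e6, 200e6))  # True False
```

    This is where a probabilistic output such as a BET_VH probability distribution plugs directly into a yes/no decision.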

  20. Hydrogen quantitative risk assessment workshop proceedings.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Groth, Katrina M.; Harris, Aaron P.

    2013-09-01

    The Quantitative Risk Assessment (QRA) Toolkit Introduction Workshop was held at Energetics on June 11-12. The workshop was co-hosted by Sandia National Laboratories (Sandia) and HySafe, the International Association for Hydrogen Safety. The objective of the workshop was twofold: (1) present a hydrogen-specific methodology and toolkit (currently under development) for conducting QRA to support the development of codes and standards and safety assessments of hydrogen-fueled vehicles and fueling stations, and (2) obtain feedback on the needs of early-stage users (hydrogen as well as potential leveraging for Compressed Natural Gas [CNG] and Liquefied Natural Gas [LNG]) and set priorities for "Version 1" of the toolkit in the context of the commercial evolution of hydrogen fuel cell electric vehicles (FCEV). The workshop consisted of an introduction and three technical sessions: Risk-Informed Development and Approach; CNG/LNG Applications; and Introduction of a Hydrogen-Specific QRA Toolkit.

  1. Use of Bedrock and Geomorphic Mapping Compilations in Assessing Geologic Hazards at Recreation Sites on National Forests in NW California

    NASA Astrophysics Data System (ADS)

    de La Fuente, J. A.; Bell, A.; Elder, D.; Mowery, R.; Mikulovsky, R.; Klingel, H.; Stevens, M.

    2010-12-01

    Geologic hazards on US Forest Service lands have a long history of producing catastrophic events. In 1890 (prior to the establishment of the Forest Service), the China Mine landslide buried a miners' camp along the Trinity River in NW California, killing a number of miners. An earthquake in southwestern Montana triggered a massive landslide which killed 28 people in a US Forest Service campground in 1959. In 1980, Mount St. Helens erupted in Washington State, killing 57 people. Debris flows from a winter storm in 2003 on the burned hillslopes of the San Bernardino National Forest in California killed 14 people at the St. Sophia Youth Camp. A rockfall in the summer of 2009 in Lassen National Park killed a 9-year-old boy. The most recent catastrophe occurred on June 11, 2010, when 20 people died in a flash flood at the Albert Pike Campground on the Ouachita National Forest. These and other disasters point to the need for geologic hazard mapping and assessments on the National Forests. The US Forest Service (USFS) is currently assessing geologic hazards in the Northern Province of USFS Region 5 (Pacific Southwest Region), which includes the Klamath, Mendocino, Shasta-Trinity, and Six Rivers National Forests. The most common geologic hazards (relatively short return intervals) in this area include landslides, rock falls, debris flows, flooding, temporary dam failures (landslide or woody debris), naturally occurring hazardous materials (asbestos, radon, etc.), and, rarely, karst subsidence. Seismic and volcanic hazards are also important at longer return intervals. This assessment will be conducted in three phases and is patterned after a process developed by Region 8 of the US Forest Service. The first phase is a reconnaissance-level assessment based on existing information such as spatial databases, aerial photos, Digital Elevation Models, State of California Alquist-Priolo Earthquake Fault Zone maps, previous investigations and anecdotal accounts of past events. The bedrock

  2. RAPID-N: Assessing and mapping the risk of natural-hazard impact at industrial installations

    NASA Astrophysics Data System (ADS)

    Girgin, Serkan; Krausmann, Elisabeth

    2015-04-01

    Natural hazard-triggered technological accidents (so-called Natech accidents) at hazardous installations can have major consequences due to the potential for release of hazardous materials, fires and explosions. Effective Natech risk reduction requires the identification of areas where this risk is high. However, recent studies have shown that there are hardly any methodologies and tools that would allow authorities to identify these areas. To work towards closing this gap, the European Commission's Joint Research Centre has developed the rapid Natech risk assessment and mapping framework RAPID-N. The tool, which is implemented in an online web-based environment, is unique in that it contains all functionalities required for running a full Natech risk analysis simulation (natural hazards severity estimation, equipment damage probability and severity calculation, modeling of the consequences of loss-of-containment scenarios) and for visualizing its results. The outputs of RAPID-N are risk summary reports and interactive risk maps which can be used for decision making. Currently, the tool focuses on Natech risk due to earthquakes at industrial installations. However, it will be extended to also analyse and map Natech risk due to floods in the near future. RAPID-N is available at http://rapidn.jrc.ec.europa.eu. This presentation will discuss the results of case-study calculations performed for selected flammable and toxic substances to test the capabilities of RAPID-N both for single- and multi-site earthquake Natech risk assessment. For this purpose, an Istanbul earthquake scenario provided by the Turkish government was used. The results of the exercise show that RAPID-N is a valuable decision-support tool that assesses the Natech risk and maps the consequence end-point distances.
These end-point distances are currently defined by 7 kPa overpressure for vapour cloud explosions and 2nd-degree burns for pool fires (equivalent to a heat radiation of 5 kW/m2 for 40 s
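The heat-radiation end-point mentioned above can be illustrated with a simple point-source model, in which radiated power spreads over a sphere and the end-point distance is where the flux falls to the threshold. This is only a sketch: the 50 MW radiated power below is a hypothetical figure, and consequence models in tools such as RAPID-N are considerably more detailed.

```python
import math

def endpoint_distance(radiated_power_w: float, threshold_w_m2: float) -> float:
    """Distance at which a point-source fire's heat flux falls to a threshold.

    Point-source model: q(r) = Q_r / (4*pi*r^2)  =>  r = sqrt(Q_r / (4*pi*q)).
    """
    return math.sqrt(radiated_power_w / (4.0 * math.pi * threshold_w_m2))

# Hypothetical pool fire radiating 50 MW, against the 5 kW/m2 burn threshold
r = endpoint_distance(50e6, 5e3)
print(f"end-point distance ~ {r:.1f} m")
```

Doubling the radiated power increases the distance only by a factor of sqrt(2), which is why end-point distances grow slowly with fire size in this model.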

  3. Air Monitoring for Hazardous Gas Detection

    NASA Technical Reports Server (NTRS)

    Arkin, C. Richard; Griffin, Timothy P.; Adams, Frederick W.; Naylor, Guy; Haskell, William; Floyd, David; Curley, Charles; Follistein, Duke W.

    2004-01-01

    The Hazardous Gas Detection Lab (HGDL) at Kennedy Space Center is involved in the design and development of instrumentation that can detect and quantify various hazardous gases. Traditionally these systems are designed for leak detection of the cryogenic gases used for the propulsion of the Shuttle and other vehicles. Mass spectrometers are the basis of these systems, which provide excellent quantitation, sensitivity, selectivity, response times, and detection limits. A table lists the common gases monitored in aerospace applications. The first five, hydrogen, helium, nitrogen, oxygen, and argon, have historically been the focus of the HGDL.

  4. 29 CFR Appendix A to Subpart I of... - Non-Mandatory Guidelines for Hazard Assessment, Personal Protective Equipment (PPE) Selection...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... conjunction with engineering controls, guards, and safe work practices and procedures. 2. Assessment and... example, splash protection, and impact protection; (b) compare the hazards associated with the environment... contact lenses must also wear appropriate eye and face protection devices in a hazardous environment. It...

  5. The 3D Elevation Program—Landslide recognition, hazard assessment, and mitigation support

    USGS Publications Warehouse

    Lukas, Vicki; Carswell, Jr., William J.

    2017-01-27

    The U.S. Geological Survey (USGS) Landslide Hazards Program conducts landslide hazard assessments, pursues landslide investigations and forecasts, provides technical assistance to respond to landslide emergencies, and engages in outreach. All of these activities benefit from the availability of high-resolution, three-dimensional (3D) elevation information in the form of light detection and ranging (lidar) data and interferometric synthetic aperture radar (IfSAR) data. Research on landslide processes addresses critical questions of where and when landslides are likely to occur as well as their size, speed, and effects. This understanding informs the development of methods and tools for hazard assessment and situational awareness used to guide efforts to avoid or mitigate landslide impacts. Such research is essential for the USGS to provide improved information on landslide potential associated with severe storms, earthquakes, volcanic activity, coastal wave erosion, and wildfire burn areas. Decisionmakers in government and the private sector increasingly depend on information the USGS provides before, during, and following disasters so that communities can live, work, travel, and build safely. The USGS 3D Elevation Program (3DEP) provides the programmatic infrastructure to generate and supply superior lidar-derived terrain data to address landslide applications and a wide range of other urgent needs nationwide. By providing data to users, 3DEP reduces users' costs and risks and allows them to concentrate on their mission objectives. 3DEP includes (1) data acquisition partnerships that leverage funding, (2) contracts with experienced private mapping firms, (3) technical expertise, lidar data standards, and specifications, and (4) most important, public access to high-quality 3D elevation data.

  6. Space Shuttle Main Engine Quantitative Risk Assessment: Illustrating Modeling of a Complex System with a New QRA Software Package

    NASA Technical Reports Server (NTRS)

    Smart, Christian

    1998-01-01

    During 1997, a team from Hernandez Engineering, MSFC, Rocketdyne, Thiokol, Pratt & Whitney, and USBI completed the first phase of a two-year Quantitative Risk Assessment (QRA) of the Space Shuttle. The models for the Shuttle systems were entered and analyzed by a new QRA software package. This system, termed the Quantitative Risk Assessment System (QRAS), was designed by NASA and programmed by the University of Maryland. The software is a groundbreaking PC-based risk assessment package that allows the user to model complex systems in a hierarchical fashion. Features of the software include the ability to easily select quantifications of failure modes, draw Event Sequence Diagrams (ESDs) interactively, perform uncertainty and sensitivity analysis, and document the modeling. This paper illustrates both the approach used in modeling and the particular features of the software package. The software is general and can be used in a QRA of any complex engineered system. The author is the project lead for the modeling of the Space Shuttle Main Engines (SSMEs), and this paper focuses on the modeling completed for the SSMEs during 1997. In particular, the groundrules for the study, the databases used, the way in which ESDs were used to model catastrophic failure of the SSMEs, the methods used to quantify the failure rates, and how QRAS was used in the modeling effort are discussed. Groundrules were necessary to limit the scope of such a complex study, especially with regard to a liquid rocket engine such as the SSME, which can be shut down after ignition either on the pad or in flight. The SSME was divided into its constituent components and subsystems. These were ranked on the basis of the possibility of being upgraded and risk of catastrophic failure. Once this was done, the Shuttle program Hazard Analysis and Failure Modes and Effects Analysis (FMEA) were used to create a list of potential failure modes to be modeled. The groundrules and other criteria were used to screen
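The ESD quantification described above can be sketched in miniature: an initiating event branches through pivotal events (detection, safe shutdown), and the probabilities of the paths ending in each end state are summed. All numbers below are illustrative, not actual SSME reliability figures.

```python
# Toy event sequence diagram: an initiating failure followed by two pivotal
# events. The values are hypothetical stand-ins, not SSME data.
P_INIT = 1.0e-3      # initiating engine anomaly, per flight
P_DETECT = 0.95      # probability the redline system detects the anomaly
P_SHUTDOWN = 0.98    # probability of a safe shutdown, given detection

# Catastrophic end state: detection fails, OR detection succeeds but shutdown fails
p_catastrophic = P_INIT * ((1 - P_DETECT) + P_DETECT * (1 - P_SHUTDOWN))
# Benign end state: detection and shutdown both succeed
p_safe_abort = P_INIT * P_DETECT * P_SHUTDOWN

print(f"P(catastrophic) = {p_catastrophic:.2e}, P(safe abort) = {p_safe_abort:.2e}")
```

Because the branches partition the outcome space, the two end-state probabilities sum back to the initiating-event probability, which is a useful sanity check on any ESD quantification.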

  7. The cryosphere as a resource and hazard - Integrated framework for the assessment of future water resource vulnerability and glacial hazard risk in the Kullu district, Himachal Pradesh, India.

    NASA Astrophysics Data System (ADS)

    Allen, Simon; Awasthi, Kirtiman; Ballesteros, Juan Antonio; Frey, Holger; Huggel, Christian; Kahn, Mustafa; Linsbauer, Andreas; Rohrer, Mario; Ruiz-Villanueva, Virginia; Salzmann, Nadine; Schauwecker, Simone; Stoffel, Markus

    2014-05-01

    High mountain environments are particularly susceptible to changes in atmospheric temperature and precipitation patterns, owing to the sensitivity of cryospheric components to melting conditions, and the importance of rainfall and river runoff for sustaining crops and livelihoods. The Himalayan state of Himachal Pradesh (population ca. 6 million) is the initial focus of a joint program between the governments of India and Switzerland aiming to build scientific capacity to understand the threat, and plan for adaptation to climate change in the Himalaya. Here we focus on the cryosphere, and provide an overview of the integrated framework we will follow to assess future water resource vulnerability from changes in runoff, and assess future disaster risk from mass movement and flood hazards. At this early stage of our project, we aim to identify key methodological steps, data requirements, and related challenges. The initial implementation of our framework will be centered on the Kullu district. Core and integrative components of both the traditional climate vulnerability framework (e.g., IPCC AR4) and the vulnerability and risk concepts of the disaster risk management community (e.g., IPCC SREX 2012) include the assessment of sensitivity, exposure, and adaptive capacity. Sensitivity to water vulnerability in the Kullu district requires the quantification of current and future water resource usage at the block or community level, using metrics such as total irrigated land area, total electricity usage, population density and birth rates. Within the disaster risk framework, sensitivity to mass movement and flood hazards will be determined based on factors such as population density and demographics (notably age and gender), strength of building materials, etc.
Projected temperature and precipitation data from regional climate model output will be used to model changes in melt water runoff and streamflow, determining the exposure of communities and natural systems to future

  8. Assessing crown fire potential by linking models of surface and crown fire behavior

    Treesearch

    Joe H. Scott; Elizabeth D. Reinhardt

    2001-01-01

    Fire managers are increasingly concerned about the threat of crown fires, yet only now are quantitative methods for assessing crown fire hazard being developed. Links among existing mathematical models of fire behavior are used to develop two indices of crown fire hazard: the Torching Index and the Crowning Index. These indices can be used to ordinate different forest...

  9. MC3196 Detonator Shipping Package Hazard Classification Assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, Robert B.

    1979-05-31

    An investigation was made to determine whether the MC3196 detonator should be assigned a DOT hazard classification of Detonating Fuze, Class C Explosives per 49 CFR 173.113. This study covers the Propagation Test and the External Heat Test as approved by the DOE Albuquerque Operations Office. Test data led to the recommended hazard classification of detonating fuze, Class C explosives.

  10. Fault2SHA- A European Working group to link faults and Probabilistic Seismic Hazard Assessment communities in Europe

    NASA Astrophysics Data System (ADS)

    Scotti, Oona; Peruzza, Laura

    2016-04-01

    The key questions we ask are: What is the best strategy to fill in the gap in knowledge and know-how in Europe when considering faults in seismic hazard assessments? Are field geologists providing the relevant information for seismic hazard assessment? Are seismic hazard analysts interpreting field data appropriately? Is the full range of uncertainties associated with the characterization of faults correctly understood and propagated in the computations? How can fault-modellers contribute to a better representation of the long-term behaviour of fault-networks in seismic hazard studies? Providing answers to these questions is fundamental in order to reduce the consequences of future earthquakes and improve the reliability of seismic hazard assessments. An informal working group was thus created at a meeting in Paris in November 2014, partly financed by the Institute of Radioprotection and Nuclear Safety, with the aim of motivating exchanges between field geologists, fault modellers and seismic hazard practitioners. A variety of approaches were presented at the meeting, and a clear gap emerged between some field geologists, who are not necessarily familiar with probabilistic seismic hazard assessment methods and needs, and practitioners, who do not necessarily propagate the "full" uncertainty associated with the characterization of faults. The group thus decided to meet again a year later in Chieti (Italy), to share concepts and ideas through a specific exercise on a test case study. Some solutions emerged, but many problems of seismic source characterization remained, both for people working in the field and for people tackling models of interacting faults. Now, in Vienna, we want to open the group and launch a call for the European community at large to contribute to the discussion. The 2016 EGU session Fault2SHA is motivated by this urgency to increase the number of round tables on this topic and debate the peculiarities of using faults in seismic hazard

  11. Fuzzy Cognitive Maps for Glacier Hazards Assessment: Application to Predicting the Potential for Glacier Lake Outbursts

    NASA Astrophysics Data System (ADS)

    Furfaro, R.; Kargel, J. S.; Fink, W.; Bishop, M. P.

    2010-12-01

    Glaciers and ice sheets are among the largest unstable parts of the solid Earth. Generally, glaciers are devoid of resources (other than water), are dangerous and unstable, and no infrastructure is normally built directly on their surfaces. Areas down valley from large alpine glaciers are also commonly unstable due to landslide potential of moraines, debris flows, snow avalanches, outburst floods from glacier lakes, and other dynamical alpine processes; yet there exists much development and human occupation of some disaster-prone areas. Satellite remote sensing can be extremely effective in providing cost-effective and time-critical information. Space-based imagery can be used to monitor glacier outlines and their lakes, including processes such as iceberg calving and debris accumulation, as well as changing thicknesses and flow speeds. Such images can also be used to make preliminary identifications of specific hazardous spots and allow preliminary assessment of possible modes of future disaster occurrence. Autonomous assessment of glacier conditions and their potential for hazards would present a major advance and permit systematized analysis of more data than humans can assess. This technical leap will require the design and implementation of Artificial Intelligence (AI) algorithms specifically designed to mimic glacier experts' reasoning. Here, we introduce the theory of Fuzzy Cognitive Maps (FCM) as an AI tool for predicting and assessing natural hazards in alpine glacier environments. FCM techniques are employed to represent expert knowledge of glaciers' physical processes. A cognitive model embedded in a fuzzy logic framework is constructed via the synergistic interaction between glaciologists and AI experts. To verify the effectiveness of the proposed AI methodology as applied to predicting hazards in glacier environments, we designed and implemented a FCM that addresses the challenging problem of autonomously assessing the Glacier Lake Outburst Flow
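The FCM formalism itself is compact: concepts hold activation values in [0, 1], a weight matrix encodes expert-assigned causal influences, and the state is iterated through a squashing function until it settles. The concepts and weights below are hypothetical illustrations, not values from the study.

```python
import numpy as np

# Hypothetical concepts and causal weight matrix W, where W[i, j] is the
# expert-assigned influence of concept i on concept j (illustrative values only)
concepts = ["lake_volume", "moraine_instability", "ice_avalanche", "outburst_potential"]
W = np.array([
    [0.0, 0.3, 0.0, 0.6],
    [0.0, 0.0, 0.0, 0.7],
    [0.5, 0.2, 0.0, 0.4],
    [0.0, 0.0, 0.0, 0.0],
])

def step(state, W, lam=1.0):
    # Standard FCM update: accumulate causal influence, squash with a sigmoid
    return 1.0 / (1.0 + np.exp(-lam * (state + state @ W)))

state = np.array([0.8, 0.2, 0.6, 0.0])  # initial activations from (mock) observations
for _ in range(50):                      # iterate toward a fixed point
    state = step(state, W)

print(dict(zip(concepts, np.round(state, 2))))
```

The converged activation of the outcome concept (here "outburst_potential") is what an FCM-based hazard assessor would read off as the qualitative risk level.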

  12. Use of Archival Sources to Improve Water-Related Hazard Assessments at Volcán de Agua, Guatemala

    NASA Astrophysics Data System (ADS)

    Hutchison, A. A.; Cashman, K. V.; Rust, A.; Williams, C. A.

    2013-12-01

    This interdisciplinary study focuses on the use of archival sources from the 18th-century Spanish Empire to develop a greater understanding of mudflow trigger mechanisms at Volcán de Agua in Guatemala. Currently, hazard assessments of debris flows at Volcán de Agua are largely based on studies of analogous events, such as the mudflow at Casita Volcano in 1998 caused by excessive rainfall generated by Hurricane Mitch. A preliminary investigation of Spanish archival sources, however, indicates that a damaging mudflow from the volcano in 1717 may have been triggered by activity at the neighbouring Volcán de Fuego. A VEI 4 eruption of Fuego in late August 1717 was followed by 33 days of localized 'retumbos' and then a major local earthquake with accompanying mudflows from several 'bocas' on the southwest flank of Agua. Of particular importance for this study is an archival source from Archivos Generales de Centro América (AGCA) that consists of a series of letters, petitions and witness statements that were written and gathered following the catastrophic events of 1717. Their purpose was to argue for royal permission to relocate the capital city, which at the time was located on the lower flanks of Volcán de Agua. Within these documents there are accounts of steaming 'avenidas' of water with sulphurous smells, and quantitative descriptions that suggest fissure formation related to volcanic activity at Volcán de Fuego. Clear evidence for volcano-tectonic activity at the time, combined with the fact that there is no mention of rainfall in the documents, suggests that outbursts of mud from Agua's south flank may have been caused by a volcanic perturbation of a hydrothermal system. This single example suggests that further analysis of archival documents will provide a more accurate and robust assessment of water-related hazards at Volcán de Agua than currently exists.

  13. Development and Assessment of Modules to Integrate Quantitative Skills in Introductory Biology Courses

    ERIC Educational Resources Information Center

    Hoffman, Kathleen; Leupen, Sarah; Dowell, Kathy; Kephart, Kerrie; Leips, Jeff

    2016-01-01

    Redesigning undergraduate biology courses to integrate quantitative reasoning and skill development is critical to prepare students for careers in modern medicine and scientific research. In this paper, we report on the development, implementation, and assessment of stand-alone modules that integrate quantitative reasoning into introductory…

  14. Fire Hazard Assessment in Supporting Fire Protection System Design of a Chemical Process Facility

    DTIC Science & Technology

    1996-08-01

    FIRE HAZARD ASSESSMENT IN SUPPORTING FIRE PROTECTION SYSTEM DESIGN OF A CHEMICAL PROCESS FACILITY. Ali Pezeshk ... Joseph Chang, Dwight Hunt, and Peter Jahn. Parsons Infrastructure & Technology Group, Inc., Pasadena, California 91124. ABSTRACT: Because fires in a chemical ...

  15. A simple tool for preliminary hazard identification and quick assessment in craftwork and small/medium enterprises (SME).

    PubMed

    Colombini, Daniela; Occhipinti, E; Di Leone, G

    2012-01-01

    During the last Congress of the International Ergonomics Association (IEA), Beijing, August 2009, an international group was founded with the aim of developing a "toolkit for MSD prevention" within IEA and in collaboration with the World Health Organization (WHO). Possible users of toolkits are: members of health and safety committees; health and safety representatives; line supervisors; labor inspectors; health workers implementing basic occupational health services; and occupational health and safety specialists. According to the ISO 11228 standard series and the new Draft CD ISO 12259-2009 (an application guide for the potential user), computer software (in Excel®) was created for hazard "mapping" in craftwork. The proposed methodology, using specific key entries and quick assessment criteria, allows simple ergonomic hazard identification and risk estimation. It thus makes it possible to decide for which professional hazards a more exhaustive risk assessment is necessary and which professional consultant should be involved (occupational physician, safety engineer, industrial hygienist, etc.).

  16. Is there a place for quantitative risk assessment?

    PubMed

    Hall, Eric J

    2009-06-01

    The use of ionising radiations is so well established, especially in the practice of medicine, that it is impossible to imagine contemporary life without them. At the same time, ionising radiations are a known and proven human carcinogen. Exposure to radiation in some contexts elicits fear and alarm (nuclear power for example) while in other situations, until recently at least, it was accepted with alacrity (diagnostic x-rays for example). This non-uniform reaction to the potential hazards of radiation highlights the importance of quantitative risk estimates, which are necessary to help put things into perspective. Three areas will be discussed where quantitative risk estimates are needed and where uncertainties and limitations are a problem. First, the question of diagnostic x-rays. CT usage over the past quarter of a century has increased about 12-fold in the UK and more than 20-fold in the US. In both countries, more than 90% of the collective population dose from diagnostic x-rays comes from the few high dose procedures, such as interventional radiology, CT scans, lumbar spine x-rays and barium enemas. These all involve doses close to the lower limit at which there are credible epidemiological data for an excess cancer incidence. This is a critical question: what is the lowest dose at which there is good evidence of an elevated cancer incidence? Without low dose risk estimates the risk-benefit ratio of diagnostic procedures cannot be assessed. Second, the use of new techniques in radiation oncology. IMRT is widely used to obtain a more conformal dose distribution, particularly in children. It results in a larger total body dose, due to an increased number of monitor units and to the application of more radiation fields. The Linacs used today were not designed for IMRT and are based on leakage standards that were decided decades ago.
It will be difficult and costly to reduce leakage from treatment machines, and a necessary first step is to refine the available

  17. Application of Fault Management Theory to the Quantitative Selection of a Launch Vehicle Abort Trigger Suite

    NASA Technical Reports Server (NTRS)

    Lo, Yunnhon; Johnson, Stephen B.; Breckenridge, Jonathan T.

    2014-01-01

    This paper describes the quantitative application of the theory of System Health Management and its operational subset, Fault Management, to the selection of abort triggers for a human-rated launch vehicle, the United States' National Aeronautics and Space Administration's (NASA) Space Launch System (SLS). The results demonstrate the efficacy of the theory to assess the effectiveness of candidate failure detection and response mechanisms to protect humans from time-critical and severe hazards. The quantitative method was successfully used on the SLS to aid selection of its suite of abort triggers.


  19. 75 FR 58346 - Hazardous Waste Management System; Identification and Listing of Hazardous Waste

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-24

    ... Chemical Company-Texas Operations (Eastman) to exclude (or delist) certain solid wastes generated by its Longview, Texas, facility from the lists of hazardous wastes. EPA used the Delisting Risk Assessment... Waste Management System; Identification and Listing of Hazardous Waste AGENCY: Environmental Protection...

  20. Challenges Ahead for Nuclear Facility Site-Specific Seismic Hazard Assessment in France: The Alternative Energies and the Atomic Energy Commission (CEA) Vision

    NASA Astrophysics Data System (ADS)

    Berge-Thierry, C.; Hollender, F.; Guyonnet-Benaize, C.; Baumont, D.; Ameri, G.; Bollinger, L.

    2017-09-01

    Seismic analysis in the context of nuclear safety in France is currently guided by a pure deterministic approach based on Basic Safety Rule (Règle Fondamentale de Sûreté) RFS 2001-01 for seismic hazard assessment, and on the ASN/2/01 Guide that provides design rules for nuclear civil engineering structures. After the 2011 Tohoku earthquake, nuclear operators worldwide were asked to estimate the ability of their facilities to sustain extreme seismic loads. The French licensees then defined the `hard core seismic levels', which are higher than those considered for design or re-assessment of the safety of a facility. These were initially established on a deterministic basis, and they have been finally justified through state-of-the-art probabilistic seismic hazard assessments. The appreciation and propagation of uncertainties when assessing seismic hazard in France have changed considerably over the past 15 years. This evolution provided the motivation for the present article, the objectives of which are threefold: (1) to provide a description of the current practices in France to assess seismic hazard in terms of nuclear safety; (2) to discuss and highlight the sources of uncertainties and their treatment; and (3) to use a specific case study to illustrate how extended source modeling can help to constrain the key assumptions or parameters that impact upon seismic hazard assessment. This article discusses in particular seismic source characterization, strong ground motion prediction, and maximum magnitude constraints, according to the practice of the French Atomic Energy Commission. Due to increases in strong motion databases in terms of the number and quality of the records, their metadata, and the uncertainty characterization, several recently published empirical ground motion prediction models are eligible for seismic hazard assessment in France. We show that propagation of epistemic and aleatory uncertainties is feasible in a deterministic approach, as in a

  1. Multi-factor evaluation indicator method for the risk assessment of atmospheric and oceanic hazard group due to the attack of tropical cyclones

    NASA Astrophysics Data System (ADS)

    Qi, Peng; Du, Mei

    2018-06-01

    China's southeast coastal areas frequently suffer from storm surge due to the attack of tropical cyclones (TCs) every year. Hazards induced by TCs are complex, including strong winds, huge waves, storm surge, heavy rain, and floods. These atmospheric and oceanic hazards cause serious disasters and substantial economic losses. This paper, from the perspective of a hazard group, sets up a multi-factor evaluation method for the risk assessment of TC hazards using historical extreme-value data for the relevant atmospheric and oceanic elements. Based on the natural hazard dynamic process, the multi-factor indicator system is composed of nine natural hazard factors representing intensity and frequency, respectively. Contributing to the indicator system, in order of importance, are the maximum TC wind speed, attack frequency of TCs, maximum surge height, maximum wave height, frequency of gusts ≥ Scale 8, rainstorm intensity, maximum tidal range, rainstorm frequency, and sea-level rise rate. The first four factors are the most important; their weights each exceed 10% of the indicator system. After normalization, the single-hazard factors are multiplied by their weights and superposed to generate a combined TC hazard. The multi-factor evaluation indicator method was applied to the risk assessment of the typhoon-induced atmospheric and oceanic hazard group in the typhoon-prone southeast coastal cities of China.
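The normalize-weight-superpose step can be sketched as follows. The numeric weights are hypothetical stand-ins that merely respect the stated ranking and the >10% threshold for the first four factors (the abstract does not give the actual values), and the city data are synthetic.

```python
import numpy as np

# Hypothetical weights mirroring the stated ordering; first four each exceed 10%
weights = {
    "max_tc_wind_speed":   0.18,
    "tc_attack_frequency": 0.16,
    "max_surge_height":    0.14,
    "max_wave_height":     0.12,
    "gust_ge_scale8_freq": 0.09,
    "rainstorm_intensity": 0.09,
    "max_tidal_range":     0.08,
    "rainstorm_frequency": 0.08,
    "sea_level_rise_rate": 0.06,
}

def min_max(x):
    x = np.asarray(x, dtype=float)
    return (x - x.min()) / (x.max() - x.min())  # normalize each factor to [0, 1]

rng = np.random.default_rng(42)
cities = ["City A", "City B", "City C"]
raw = {f: rng.uniform(0.0, 100.0, len(cities)) for f in weights}  # synthetic data

# Superpose the normalized single-hazard factors, weighted by importance
hazard = sum(w * min_max(raw[f]) for f, w in weights.items())
for city, h in zip(cities, hazard):
    print(f"{city}: superposed TC hazard index = {h:.2f}")
```

Because the weights sum to one and each normalized factor lies in [0, 1], the superposed index is itself bounded in [0, 1], which makes cities directly comparable.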

  2. Hydrometeorological Hazards: Monitoring, Forecasting, Risk Assessment, and Socioeconomic Responses

    NASA Technical Reports Server (NTRS)

    Wu, Huan; Huang, Maoyi; Tang, Qiuhong; Kirschbaum, Dalia B.; Ward, Philip

    2017-01-01

    Hydrometeorological hazards are caused by extreme meteorological and climate events, such as floods, droughts, hurricanes, tornadoes, or landslides. They account for a dominant fraction of natural hazards and occur in all regions of the world, although the frequency and intensity of certain hazards, and societies' vulnerability to them, differ between regions. Severe storms, strong winds, floods, and droughts develop at different spatial and temporal scales, but all can become disasters that cause significant infrastructure damage and claim hundreds of thousands of lives annually worldwide. Oftentimes, multiple hazards can occur simultaneously or trigger cascading impacts from one extreme weather event. For example, in addition to causing injuries, deaths, and material damage, a tropical storm can also result in flooding and mudslides, which can disrupt water purification and sewage disposal systems, cause overflow of toxic wastes, and increase propagation of mosquito-borne diseases.

  3. Assessing natural hazards in NE Colombia using Sentinel-1 interferometry

    NASA Astrophysics Data System (ADS)

    Olen, Stephanie; Bookhagen, Bodo

    2017-04-01

    The DIGENTI project (Digitaler Entscheidertisch für das Naturgefahrenmanagement auf Basis von Satellitendaten und VGI (Volunteered Geographic Information); a digital decision-makers' desk for natural hazard management based on satellite data and VGI) aims to assess the natural hazard threat to the Cesar and La Guajira departments of northeast Colombia as guidance for decision makers and disaster relief workers. As members of the DIGENTI project, we use Sentinel-1 synthetic aperture radar (SAR) interferometry to detect hillslope movements, delineate settlements, and monitor damage to urban areas. Our study area, located in the remote Serranía del Perijá mountain range on the border of Colombia and Venezuela, is mountainous, highly vegetated, and experiences high and spatially variable rainfall (between 1 and 4 m a-1). The remote nature of the region, coupled with the favorable conditions for mass movements and other hillslope instabilities, makes it an ideal location to employ remote sensing techniques to monitor potential natural hazards. In the highly vegetated Serranía del Perijá mountain range, traditional damage proxy mapping is complicated by vegetation-related coherence loss between SAR scenes. Cross-referencing existing maps, we define regions of consistently high coherence as settled or urban areas. Using the spatial extent of settled or urban areas as a mask, we establish an algorithm that uses coherence loss only in these regions as a damage proxy, targeting the urban areas where the local population will be most affected. Outside of settlements, hillslope instabilities and movements are quantified and mapped using a two-pronged approach: (1) horizontal ground displacement is calculated by dense amplitude cross-correlation using the topsOffsetApp in the InSAR Scientific Computing Environment (ISCE), which allows the location, direction, and magnitude of mass movements and hillslope instabilities to be identified and mapped; (2) we use a time series of interferograms to quantify vertical ground deformation (e.g., as caused by landsliding) during the Sentinel-1
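
    The urban damage-proxy step described above can be sketched as a per-pixel rule: a pixel that stays coherent across a stack of pre-event interferograms is treated as settled, and only within that mask does a sharp co-event coherence drop flag possible damage. The coherence thresholds below are illustrative assumptions, not the project's calibrated values.

    ```python
    def settlement_mask(pre_event_coherence, threshold=0.6):
        """A pixel counts as settled/urban if its interferometric coherence
        stays consistently high across the pre-event interferogram stack."""
        return all(c >= threshold for c in pre_event_coherence)

    def damage_proxy(pre_event_coherence, co_event_coherence, min_drop=0.3):
        """Flag a settled pixel as potentially damaged when the co-event
        coherence drops sharply relative to the pre-event mean."""
        pre_mean = sum(pre_event_coherence) / len(pre_event_coherence)
        return (settlement_mask(pre_event_coherence)
                and (pre_mean - co_event_coherence) >= min_drop)
    ```

    Restricting the proxy to the settlement mask is what keeps vegetation-related coherence loss from producing false damage detections.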

  4. Spatially explicit risk approach for multi-hazard assessment and management in marine environment: The case study of the Adriatic Sea.

    PubMed

    Furlan, Elisa; Torresan, Silvia; Critto, Andrea; Marcomini, Antonio

    2018-03-15

    In the last few decades, the health of marine ecosystems has been progressively endangered by anthropogenic presence. Natural and human-made pressures, as well as climate change effects, pose increasing threats to marine areas, triggering alteration of biological, chemical and physical processes. Planning of marine areas has become a challenge for decision makers involved in the design of sustainable management options. In order to address threats posed by climate drivers in combination with local-to-regional anthropogenic pressures affecting marine ecosystems and activities, a multi-hazard assessment methodology was developed and applied to the Adriatic Sea for the reference scenario 2000-2015. Through a four-stage process based on the consecutive analysis of hazard, exposure, vulnerability and risk, the methodology allows a semi-quantitative evaluation of the relative risk from anthropogenic and natural sources to multiple endpoints, thus supporting the identification and ranking of areas and targets more likely to be at risk. The resulting output showed that the higher relative hazard scores are linked to exogenic pressures (e.g. sea surface temperature variation) while the lower ones resulted from endogenic and more localized stressors (e.g. abrasion, nutrient input). Relatively very high vulnerability scores were observed over the whole case study for almost all the considered pressures, showing seagrass meadows, maërl and coral beds to be the most susceptible targets. The approach outlined in this study provides planners and decision makers with a quick-screening tool to evaluate progress towards attaining good environmental status and to identify marine areas where management actions and adaptation strategies would be best targeted. Moreover, by focusing on risks induced by land-based drivers, the resulting output can support the design of infrastructures for reducing pressures on the sea, contributing to improved land-sea interface management
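
    The four-stage hazard → exposure → vulnerability → risk chain described above can be illustrated as a semi-quantitative scoring rule. The multiplicative combination, score ranges, endpoint names, and numbers below are illustrative assumptions, not the paper's calibrated method.

    ```python
    def relative_risk(hazard, exposure, vulnerability):
        """Combine normalized hazard, exposure, and vulnerability scores
        (each in [0, 1]) into a relative risk score via their product."""
        for s in (hazard, exposure, vulnerability):
            if not 0.0 <= s <= 1.0:
                raise ValueError("scores must be normalized to [0, 1]")
        return hazard * exposure * vulnerability

    # Rank hypothetical endpoints (made-up scores) from highest to lowest risk.
    endpoints = {
        "seagrass meadows": relative_risk(0.8, 0.9, 0.9),
        "coral beds": relative_risk(0.7, 0.6, 0.9),
        "fish nurseries": relative_risk(0.5, 0.6, 0.4),
    }
    ranked = sorted(endpoints, key=endpoints.get, reverse=True)
    ```

    A multiplicative rule makes risk vanish when any one component is zero (no hazard, nothing exposed, or nothing vulnerable), which matches the intuition behind ranking areas and targets.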

  5. SETAC Pellston WorkshopTM: Environmental hazard and risk assessment approaches for endocrine-active chemicals (EHRA)

    EPA Science Inventory

    Suspected endocrine disrupting substances (EDS) are now being evaluated by several regulatory authorities. A debate is in progress about whether or not EDS can be adequately assessed by following the standard approach involving identification of intrinsic hazards, prediction of e...

  6. Assessment of occupational health and safety hazard exposures among working college students.

    PubMed

    Balanay, Jo Anne G; Adesina, Adepeju; Kearney, Gregory D; Richards, Stephanie L

    2014-01-01

    Adolescents and young adults have higher injury rates than their adult counterparts in similar jobs. This study used the working college student population to assess health and safety hazards in the workplace, characterize related occupational diseases and injuries, and describe worker health/safety activities provided by employers. College students (≥17 years old) were assessed via online surveys about work history, workplace exposure to hazards, occupational diseases/injuries, and workplace health/safety activities. Approximately half (51%) of participants (n = 1,147) were currently employed at the time of the survey or had been employed while enrolled in college. Restaurants (other than fast food) were the most frequently reported work setting. The most reported workplace hazards included noise exposure and contact with hot liquids/surfaces. Twenty percent of working students experienced injury at work; some injuries were severe enough to limit students' normal activities for >3 days (30%) or require medical attention (44%). Men had significantly higher prevalence of injuries (P = 0.05) and near-misses (P < 0.01) at work than women. Injury occurrence was associated with near-misses (AOR = 5.08, P < 0.01) and co-worker injuries (AOR = 3.19, P < 0.01) after gender and age adjustments. Most (77%) received worker safety training and half were given personal protective equipment (PPE) by their employers. Risk reduction from workplace injuries and illnesses among working college students may be achieved by implementing occupational health and safety (OHS) strategies including incorporation of OHS in the college curriculum, promotion of OHS by university/college student health services, and improving awareness of OHS online resources among college students, employers, and educators. © 2013 Wiley Periodicals, Inc.

  7. Application-driven ground motion prediction equation for seismic hazard assessments in non-cratonic moderate-seismicity areas

    NASA Astrophysics Data System (ADS)

    Bindi, D.; Cotton, F.; Kotha, S. R.; Bosse, C.; Stromeyer, D.; Grünthal, G.

    2017-09-01

    We present a ground motion prediction equation (GMPE) for probabilistic seismic hazard assessments (PSHA) in low-to-moderate-seismicity areas, such as Germany. Starting from the NGA-West2 flat-file (Ancheta et al. in Earthquake Spectra 30:989-1005, 2014), we develop a model tailored to the hazard application in terms of data selection and implemented functional form. In light of this hazard application, the GMPE is derived for hypocentral distance (along with the Joyner-Boore distance), selecting recordings at sites with vs30 ≥ 360 m/s, distances within 300 km, and magnitudes in the range 3 to 8 (7.4 being the maximum magnitude for the PSHA in the target area). Moreover, the complexity of the chosen functional form reflects the availability of information in the target area. The median predictions are compared with those from the NGA-West2 models and with one recent European model, using Sammon's maps constructed for different scenarios. Despite the simplification in the functional form, the assessed epistemic uncertainty in the GMPE median is of the order of that affecting the NGA-West2 models for the magnitude range of interest for the hazard application. On the other hand, the simplification of the functional form led to an increase in the apparent aleatory variability. In conclusion, the GMPE developed in this study is tailored to the needs of applications in low-to-moderate-seismicity areas and for short return periods (e.g., 475 years); its application in studies where the hazard involves magnitudes above 7.4 and long return periods is not advised.
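
    For orientation, a GMPE of this kind maps magnitude and distance to a median ground motion. The sketch below shows a generic simplified functional form, ln(PGA) = c0 + c1·M + c2·ln(sqrt(R² + h²)); the coefficients and the pseudo-depth term h are placeholder assumptions, not the fitted model from this study.

    ```python
    import math

    def gmpe_median_ln_pga(magnitude, r_hypo_km,
                           c0=-4.0, c1=1.2, c2=-1.4, h=6.0):
        """Median ln(PGA) from magnitude and hypocentral distance (km).
        The pseudo-depth term h keeps the distance metric away from zero
        near the source. All coefficients are illustrative placeholders."""
        r_eff = math.sqrt(r_hypo_km ** 2 + h ** 2)
        return c0 + c1 * magnitude + c2 * math.log(r_eff)
    ```

    Whatever the coefficients, a physically sensible GMPE should predict motion that grows with magnitude and decays with distance, which is easy to sanity-check on any candidate parameter set.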

  8. Spatial analysis and hazard assessment on soil total nitrogen in the middle subtropical zone of China

    NASA Astrophysics Data System (ADS)

    Lu, Peng; Lin, Wenpeng; Niu, Zheng; Su, Yirong; Wu, Jinshui

    2006-10-01

    Nitrogen (N) is one of the main factors affecting environmental pollution. In recent years, non-point-source pollution and water-body eutrophication have become increasing concerns for both scientists and policy-makers. In order to assess the environmental hazard of soil total N pollution, a typical ecological unit was selected as the experimental site. This paper showed that the Box-Cox transformation achieved normality in the data set and dampened the effect of outliers. The best theoretical model of soil total N was a Gaussian model. Spatial variability of soil total N in the NE60° and NE150° directions showed a strip anisotropic structure. The ordinary kriging estimate of soil total N concentration was mapped. The spatial distribution pattern of soil total N in the NE150° direction displayed a strip-shaped structure. Kriging standard deviations (KSD) provided valuable information that increases the accuracy of total N mapping. The probability kriging method is useful for assessing the hazard of N pollution by providing the conditional probability that the N concentration exceeds a threshold value, here taken as soil total N > 2.0 g/kg. The probability distribution of soil total N will be helpful for conducting hazard assessment, optimizing fertilization, and developing management practices to control non-point sources of N pollution.
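
    The exceedance probability that probability kriging targets can be illustrated under a Gaussian assumption on the estimation error at each location, using the kriged mean and the kriging standard deviation (KSD). This is a generic sketch of the idea, not the paper's exact estimator.

    ```python
    import math

    def prob_exceed(kriged_mean, kriging_sd, threshold=2.0):
        """P(soil total N > threshold) at a location, assuming the kriging
        error is Gaussian with standard deviation equal to the KSD.
        The 2.0 g/kg default mirrors the threshold used in the abstract."""
        z = (threshold - kriged_mean) / kriging_sd
        # Survival function of the standard normal, via erfc.
        return 0.5 * math.erfc(z / math.sqrt(2.0))
    ```

    Mapping this probability over the study area gives the conditional-probability surface used for hazard assessment: cells where the kriged mean sits well above 2.0 g/kg approach probability 1, and cells well below approach 0.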

  9. New Elements To Consider When Modeling the Hazards Associated with Botulinum Neurotoxin in Food.

    PubMed

    Ihekwaba, Adaoha E C; Mura, Ivan; Malakar, Pradeep K; Walshaw, John; Peck, Michael W; Barker, G C

    2016-01-15

    Botulinum neurotoxins (BoNTs) produced by the anaerobic bacterium Clostridium botulinum are the most potent biological substances known to mankind. BoNTs are the agents responsible for botulism, a rare condition affecting the neuromuscular junction and causing a spectrum of diseases ranging from mild cranial nerve palsies to acute respiratory failure and death. BoNTs are a potential biowarfare threat and a public health hazard, since outbreaks of foodborne botulism are caused by the ingestion of preformed BoNTs in food. Currently, mathematical models relating to the hazards associated with C. botulinum, which are largely empirical, make major contributions to botulinum risk assessment. Evaluated using statistical techniques, these models simulate the response of the bacterium to environmental conditions. Though empirical models have been successfully incorporated into risk assessments to support food safety decision making, this process includes significant uncertainties, so that the resulting decision making is frequently conservative and inflexible. Progress involves encoding cellular processes into the models at the molecular level, especially the details of the genetic and molecular machinery. This addition strengthens the connection between biological mechanisms and botulism risk assessment and hazard management strategies. This review brings together elements currently described in the literature that will be useful in building quantitative models of C. botulinum neurotoxin production. Subsequently, it outlines how the established form of modeling could be extended to include these new elements. Ultimately, this can offer further contributions to risk assessments to support food safety decision making. Copyright © 2015 Ihekwaba et al.

  10. Prevalence and patterns of hazardous and harmful alcohol consumption assessed using the AUDIT among Bhutanese refugees in Nepal.

    PubMed

    Luitel, Nagendra P; Jordans, Mark; Murphy, Adrianna; Roberts, Bayard; McCambridge, Jim

    2013-01-01

    This study sought to ascertain the prevalence of hazardous and harmful alcohol consumption among Bhutanese refugees in Nepal and to identify predictors of elevated risk in order to better understand intervention needs. Hazardous and harmful alcohol consumption was assessed using the Alcohol Use Disorders Identification Test (AUDIT), administered in a face-to-face interview in a census of two camps comprising ∼8000 refugees. Approximately one in five men and one in fourteen women drank alcohol, and the prevalence of hazardous drinking among current drinkers was high and comparable to that seen in Western countries with longstanding alcohol cultures. Harmful drinking was particularly associated with the use of other substances, including tobacco. Assessment of the alcohol-related needs of Bhutanese refugees has permitted the design of interventions. This study adds to the small international literature on substance use in forced-migration populations, about which there is growing concern.

  11. Satellite-Based Assessment of Rainfall-Triggered Landslide Hazard for Situational Awareness

    NASA Astrophysics Data System (ADS)

    Kirschbaum, Dalia; Stanley, Thomas

    2018-03-01

    Determining the time, location, and severity of natural disaster impacts is fundamental to formulating mitigation strategies, appropriate and timely responses, and robust recovery plans. A Landslide Hazard Assessment for Situational Awareness (LHASA) model was developed to indicate potential landslide activity in near real-time. LHASA combines satellite-based precipitation estimates with a landslide susceptibility map derived from information on slope, geology, road networks, fault zones, and forest loss. Precipitation data from the Global Precipitation Measurement (GPM) mission are used to identify rainfall conditions from the past 7 days. When rainfall is considered to be extreme and susceptibility values are moderate to very high, a "nowcast" is issued to indicate the times and places where landslides are more probable. When LHASA nowcasts were evaluated with a Global Landslide Catalog, the probability of detection (POD) ranged from 8% to 60%, depending on the evaluation period, precipitation product used, and the size of the spatial and temporal window considered around each landslide point. Applications of the LHASA system are also discussed, including how LHASA is used to estimate long-term trends in potential landslide activity at a nearly global scale and how it can be used as a tool to support disaster risk assessment. LHASA is intended to provide situational awareness of landslide hazards in near real-time, providing a flexible, open-source framework that can be adapted to other spatial and temporal scales based on data availability.
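
    The nowcast decision described above reduces to a conjunction of a rainfall criterion and a susceptibility criterion. The percentile threshold and category labels below are illustrative assumptions, not LHASA's exact configuration.

    ```python
    def lhasa_style_nowcast(rainfall_percentile, susceptibility):
        """Issue a landslide nowcast when the past 7 days' rainfall is
        extreme relative to the local historical record AND the static
        susceptibility is moderate to very high. Thresholds illustrative."""
        extreme_rainfall = rainfall_percentile >= 95.0
        susceptible = susceptibility in ("moderate", "high", "very high")
        return extreme_rainfall and susceptible
    ```

    Because both criteria must hold, extreme rain over low-susceptibility terrain produces no nowcast, which is what keeps the alert rate manageable at near-global scale.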

  12. Preschool Temperament Assessment: A Quantitative Assessment of the Validity of Behavioral Style Questionnaire Data

    ERIC Educational Resources Information Center

    Huelsman, Timothy J.; Gagnon, Sandra Glover; Kidder-Ashley, Pamela; Griggs, Marissa Swaim

    2014-01-01

    Research Findings: Child temperament is an important construct, but its measurement has been marked by a number of weaknesses that have diminished the frequency with which it is assessed in practice. We address this problem by presenting the results of a quantitative construct validation study. We calculated validity indices by hypothesizing the…

  13. Environmental compatibility of closed landfills - assessing future pollution hazards.

    PubMed

    Laner, David; Fellner, Johann; Brunner, Paul H

    2011-01-01

    Municipal solid waste landfills need to be managed after closure. This so-called aftercare comprises the treatment and monitoring of residual emissions as well as the maintenance and control of landfill elements. These measures can be terminated once a landfill no longer poses a threat to the environment. Consequently, the evaluation of landfill environmental compatibility includes an estimation of future pollution hazards as well as an assessment of the vulnerability of the affected environment. An approach to assessing future emission rates is presented and discussed in view of long-term environmental compatibility. The suggested method consists of (a) a continuous model to predict emissions under the assumption of constant landfill conditions, and (b) different scenarios to evaluate the effects of changing conditions within and around the landfill. The model takes into account the actual status of the landfill; hence, different methods to gain information about landfill characteristics have to be applied. Finally, assumptions, uncertainties, and limitations of the methodology are discussed, and the need for future research is outlined.
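
    Continuous emission models of the kind referenced in (a) are commonly approximated by first-order decay under constant landfill conditions. The sketch below uses that common assumption with illustrative parameters; it is not the paper's fitted model.

    ```python
    import math

    def emission_rate(e0, k, t_years):
        """First-order-decay emission model E(t) = E0 * exp(-k * t),
        valid only while landfill conditions remain constant."""
        return e0 * math.exp(-k * t_years)

    def years_until_compatible(e0, k, e_acceptable):
        """Time until predicted emissions fall below an acceptable level,
        solved from E0 * exp(-k * t) = E_acceptable."""
        return math.log(e0 / e_acceptable) / k

    # Illustrative numbers: a load decaying at k = 0.05/yr (half-life ~14 yr),
    # starting at 100 units with an acceptable level of 10 units.
    t_end = years_until_compatible(e0=100.0, k=0.05, e_acceptable=10.0)
    ```

    The scenario analysis in (b) then amounts to re-running such a model with perturbed parameters (e.g. a higher k after liner failure increases leaching) and comparing the resulting aftercare durations.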

  14. Global Seismic Hazard Assessment Program (GSHAP) in continental Asia

    USGS Publications Warehouse

    Zhang, Peizhen; Yang, Zhi-xian; Gupta, Harsh K.; Bhatia, Satish C.; Shedlock, Kaye M.

    1999-01-01

    The regional hazard mapping for the whole of Eastern Asia was coordinated by the SSB Regional Centre in Beijing, originating from the expansion of the test area initially established in the border region of China-India-Nepal-Myanmar-Bangladesh, in coordination with the other Regional Centres (JIPE, Moscow, and AGSO, Canberra) and with the direct assistance of the USGS. All Eastern Asian countries participated directly in this regional effort, with the addition of Japan, for which an existing national hazard map was incorporated. The regional hazard map depicts the expected peak ground acceleration with a 10% probability of exceedance in 50 years.
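
    The "10% probability of exceedance in 50 years" convention maps to a return period via the standard Poisson-occurrence assumption, p = 1 − exp(−t/T). A quick sketch of that standard formula (not specific to GSHAP's computation):

    ```python
    import math

    def return_period(p_exceed, window_years):
        """Return period T implied by exceedance probability p over a time
        window of t years, assuming Poisson occurrence: p = 1 - exp(-t/T)."""
        return -window_years / math.log(1.0 - p_exceed)

    # The 10%-in-50-years convention corresponds to roughly a 475-year event.
    t_ref = return_period(0.10, 50.0)
    ```

    This is why hazard maps built on the 10%-in-50-years criterion are often described as "475-year" maps.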

  15. Hazard Assessment of Chemical Air Contaminants Measured in Residences

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Logue, J.M.; McKone, T.E.; Sherman, M. H.

    2010-05-10

    Identifying air pollutants that pose a potential hazard indoors can facilitate exposure mitigation. In this study, we compiled summary results from 77 published studies reporting measurements of chemical pollutants in residences in the United States and in countries with similar lifestyles. These data were used to calculate representative mid-range and upper-bound concentrations relevant to chronic exposures for 267 pollutants and representative peak concentrations relevant to acute exposures for 5 activity-associated pollutants. Representative concentrations are compared to available chronic and acute health standards for 97 pollutants. Fifteen pollutants appear to exceed chronic health standards in a large fraction of homes. Nine other pollutants are identified as potential chronic health hazards in a substantial minority of homes, and an additional nine are identified as potential hazards in a very small percentage of homes. Nine pollutants are identified as priority hazards based on the robustness of measured concentration data and the fraction of residences that appear to be impacted: acetaldehyde; acrolein; benzene; 1,3-butadiene; 1,4-dichlorobenzene; formaldehyde; naphthalene; nitrogen dioxide; and PM2.5. Activity-based emissions are shown to pose potential acute health hazards for PM2.5, formaldehyde, CO, chloroform, and NO2.

  16. Use of raster-based data layers to model spatial variation of seismotectonic data in probabilistic seismic hazard assessment

    NASA Astrophysics Data System (ADS)

    Zolfaghari, Mohammad R.

    2009-07-01

    Recent achievements in computer and information technology have provided the necessary tools to extend the application of probabilistic seismic hazard mapping from its traditional engineering use to many other applications. Examples of such applications are risk mitigation, disaster management, post-disaster recovery planning, catastrophe loss estimation and risk management. Due to incomplete knowledge of the factors controlling seismic hazards, there are always uncertainties associated with all steps involved in developing and using seismic hazard models. While some of these uncertainties can be controlled by more accurate and reliable input data, the majority of the data and assumptions used in seismic hazard studies carry high uncertainties that contribute to the uncertainty of the final results. In this paper a new methodology for the assessment of seismic hazard is described. The proposed approach provides a practical facility for better capturing spatial variations of seismological and tectonic characteristics, which allows better treatment of their uncertainties. In the proposed approach, GIS raster-based data models are used to represent geographical features in a cell-based system. The cell-based source model proposed in this paper provides a framework for implementing many geographically referenced seismotectonic factors into seismic hazard modelling. Examples of such components are seismic source boundaries, rupture geometry, seismic activity rate, focal depth and the choice of attenuation functions. The proposed methodology improves several aspects of the standard analytical tools currently used for the assessment and mapping of regional seismic hazard, and is well structured for implementation with conventional GIS tools, making the best use of recent advancements in computer hardware and software.

  17. Hazardous materials accidents: initial scene assessment and patient care.

    PubMed

    Leonard, R B

    1993-06-01

    Hazardous materials, i.e., chemicals that are toxic, corrosive, flammable, or explosive, are a ubiquitous aspect of modern life. They are manufactured throughout the United States, shipped by truck, train, barge, and pipeline, and stored at a wide variety of locations, including factories, military bases, and warehouses. Accidents involving hazardous materials present an added dimension of danger to emergency personnel arriving first at the scene, and have the potential to produce chemically contaminated patients who require special medical treatment. Personnel arriving first at the scene must understand how to evaluate the scene for fast and safe mitigation without endangering themselves. Chemically contaminated patients require prompt treatment, which, for optimal outcome, must begin at the scene. Although frequently the identification of the hazardous materials involved is not known initially, emergency personnel may safely provide medical care to the victims by understanding and following the principles of hazardous materials accidents and the pathophysiology of chemical injuries as presented in this paper.

  18. Deterministic approach for multiple-source tsunami hazard assessment for Sines, Portugal

    NASA Astrophysics Data System (ADS)

    Wronna, M.; Omira, R.; Baptista, M. A.

    2015-11-01

    In this paper, we present a deterministic approach to tsunami hazard assessment for the city and harbour of Sines, Portugal, one of the test sites of project ASTARTE (Assessment, STrategy And Risk Reduction for Tsunamis in Europe). Sines has one of the most important deep-water ports, with oil-bearing, petrochemical, liquid-bulk, coal, and container terminals. The port and its industrial infrastructures face the ocean southwest towards the main seismogenic sources. This work considers two different seismic zones: the Southwest Iberian Margin and the Gloria Fault. Within these two regions, we selected a total of six scenarios to assess the tsunami impact at the test site. The tsunami simulations are computed using NSWING, a Non-linear Shallow Water model wIth Nested Grids. In this study, the static effect of tides is analysed for three different tidal stages: MLLW (mean lower low water), MSL (mean sea level), and MHHW (mean higher high water). For each scenario, the tsunami hazard is described by maximum values of wave height, flow depth, drawback, maximum inundation area and run-up. Synthetic waveforms are computed at virtual tide gauges at specific locations outside and inside the harbour. The final results describe the impact at the Sines test site considering the single scenarios at mean sea level, the aggregate scenario, and the influence of the tide on the aggregate scenario. The results confirm the composite source of the Horseshoe and Marques de Pombal faults (HSMPF) as the worst-case scenario, with wave heights of over 10 m reaching the coast approximately 22 min after the rupture. It dominates the aggregate scenario, accounting for about 60% of the impact area at the test site in terms of maximum wave height and maximum flow depth. The HSMPF scenario inundates a total area of 3.5 km2.

  19. Evaluation of the ToxRTool's ability to rate the reliability of toxicological data for human health hazard assessments

    EPA Science Inventory

    Regulatory agencies often utilize results from peer-reviewed publications for hazard assessments. A problem in doing so is the lack of well-accepted tools to objectively, efficiently and systematically assess the quality of published toxicological studies. Herein, we evaluated the...

  20. Hazard assessment in geothermal exploration: The case of Mt. Parker, Southern Philippines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Delfin, F.G. Jr.; Salonga, N.D.; Bayon, F.E.B.

    1996-12-31

    Hazard assessment of the Mt. Parker geothermal prospect, conducted in parallel with surface exploration from 1992 to 1994, was undertaken to determine the long-term suitability of the prospect for development. By comparison with other acidic magmatic-hydrothermal systems in the Philippines, the geochemical data indicated minimal input of acidic magmatic fluids into Mt. Parker's hydrothermal system. This system was regarded as a neutral-pH, high-enthalpy chloride reservoir with a temperature of at least 200-250°C. These favorable geochemical indications contrasted sharply with the C-14 and volcanological data indicating a shallow magmatic body with a potential for future eruption. This hazard led PNOC EDC to discontinue the survey and abandon the prospect by late 1994. On September 6, 1995, a flash flood of non-volcanic origin from the caldera lake killed nearly 100 people on the volcano's northwestern flank.