NASA Astrophysics Data System (ADS)
Weatherill, Graeme; Burton, Paul W.
2010-09-01
The Aegean is the most seismically active and tectonically complex region in Europe. Damaging earthquakes have occurred here throughout recorded history, often resulting in considerable loss of life. The Monte Carlo method of probabilistic seismic hazard analysis (PSHA) is used to determine the level of ground motion likely to be exceeded in a given time period. Multiple random simulations of seismicity are generated to calculate, directly, the ground motion for a given site. Within the seismic hazard analysis we explore the impact of different seismic source models, incorporating both uniform zones and distributed seismicity. A new, simplified seismic source model, derived from seismotectonic interpretation, is presented for the Aegean region. This is combined into the epistemic uncertainty analysis alongside existing source models for the region, and models derived by a K-means cluster analysis approach. Seismic source models derived using the K-means approach bring a degree of objectivity and reproducibility to the otherwise subjective approach of delineating seismic sources using expert judgment. A similar review and analysis is undertaken for the selection of peak ground acceleration (PGA) attenuation models, incorporating into the epistemic analysis Greek-specific models, European models and a Next Generation Attenuation (NGA) model. Hazard maps for PGA on a "rock" site with a 10% probability of being exceeded in 50 years are produced, and different source and attenuation models are compared. These indicate that Greek-specific attenuation models, with their smaller aleatory variability terms, produce lower PGA hazard, whilst recent European models and the NGA model produce similar results. The Monte Carlo method is extended further to assimilate epistemic uncertainty into the hazard calculation, thus integrating across several appropriate source and PGA attenuation models. Site condition and fault type are also integrated into the hazard mapping calculations. These hazard maps are in general agreement with previous maps for the Aegean, recognising the highest hazard in the Ionian Islands, Gulf of Corinth and Hellenic Arc. Peak ground accelerations for some sites in these regions reach as high as 500-600 cm s⁻² using European/NGA attenuation models, and 400-500 cm s⁻² using Greek attenuation models.
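As a concrete illustration of the Monte Carlo PSHA idea described above, here is a minimal sketch: synthetic annual earthquake catalogs are drawn from a truncated Gutenberg-Richter law, a ground-motion relation with aleatory scatter converts each event to a site PGA, and the 10%-in-50-years (roughly 475-year return period) level is read off the simulated annual maxima. The rate, magnitude bounds, distance range, and GMPE coefficients below are invented placeholders, not the models used in the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_annual_pga(n_years, rate=5.0, b=1.0, m_min=4.5, m_max=7.5,
                        site_dist_km=(10.0, 150.0)):
    """Crude Monte Carlo PSHA: one maximum-PGA value (g) per simulated year.

    Magnitudes follow a truncated Gutenberg-Richter law; the ground-motion
    relation below is an illustrative placeholder, not a published GMPE.
    """
    beta = b * np.log(10.0)
    annual_max = np.zeros(n_years)
    for y in range(n_years):
        n_eq = rng.poisson(rate)
        if n_eq == 0:
            continue
        # Inverse-CDF sampling of truncated-exponential magnitudes
        u = rng.uniform(size=n_eq)
        m = m_min - np.log(1 - u * (1 - np.exp(-beta * (m_max - m_min)))) / beta
        r = rng.uniform(*site_dist_km, size=n_eq)        # epicentral distance
        eps = rng.normal(0.0, 0.6, size=n_eq)            # aleatory variability
        ln_pga = -3.5 + 1.0 * m - 1.2 * np.log(r) + eps  # placeholder GMPE
        annual_max[y] = np.exp(ln_pga).max()
    return annual_max

pga = simulate_annual_pga(50_000)
# PGA with 10% probability of exceedance in 50 years ~ 475-year return period
print(np.quantile(pga, 1 - 1.0 / 475.0))
```

A smaller aleatory term (here 0.6) directly lowers the tail quantiles, which is the mechanism behind the lower hazard from the Greek-specific attenuation models noted above.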
Rasmussen's legacy: A paradigm change in engineering for safety.
Leveson, Nancy G
2017-03-01
This paper describes three applications of Rasmussen's ideas to systems engineering practice. The first is the application of the abstraction hierarchy to engineering specifications, particularly requirements specification. The second is the use of Rasmussen's ideas in safety modeling and analysis to create a new, more powerful type of accident causation model that extends traditional models to better handle human-operated, software-intensive, sociotechnical systems. Because this new model has a formal, mathematical foundation built on systems theory (as was Rasmussen's original model), new modeling and analysis tools become possible. The third application is to engineering hazard analysis. Engineers have traditionally either omitted humans from consideration in system hazard analysis or have treated them rather superficially, for example, by assuming that they behave randomly. Applying Rasmussen's model of human error to a powerful new hazard analysis technique allows human behavior to be included in engineering hazard analysis. Copyright © 2016 Elsevier Ltd. All rights reserved.
Using Qualitative Hazard Analysis to Guide Quantitative Safety Analysis
NASA Technical Reports Server (NTRS)
Shortle, J. F.; Allocco, M.
2005-01-01
Quantitative methods can be beneficial in many types of safety investigations. However, there are many difficulties in using quantitative methods. For example, there may be little relevant data available. This paper proposes a framework for using qualitative hazard analysis to prioritize hazard scenarios most suitable for quantitative analysis. The framework first categorizes hazard scenarios by severity and likelihood. We then propose another metric, "modeling difficulty", that describes the complexity of modeling a given hazard scenario quantitatively. The combined metrics of severity, likelihood, and modeling difficulty help to prioritize hazard scenarios for which quantitative analysis should be applied. We have applied this methodology to proposed concepts of operations for reduced wake separation for airplane operations at closely spaced parallel runways.
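A minimal sketch of the prioritization step described above, assuming simple ordinal scales for severity, likelihood, and modeling difficulty. The scales, scenario names, and the combining rule are illustrative assumptions, not the paper's:

```python
from dataclasses import dataclass

@dataclass
class HazardScenario:
    name: str
    severity: int    # 1 (minor) .. 5 (catastrophic); assumed ordinal scale
    likelihood: int  # 1 (improbable) .. 5 (frequent)
    difficulty: int  # 1 (easy to model) .. 5 (intractable)

def quant_priority(s: HazardScenario) -> float:
    # Higher risk and lower modeling difficulty make a scenario a better
    # candidate for quantitative analysis; this weighting is illustrative.
    return s.severity * s.likelihood / s.difficulty

scenarios = [
    HazardScenario("wake encounter on parallel approach", 5, 2, 2),
    HazardScenario("blunder during simultaneous approach", 4, 3, 5),
]
for s in sorted(scenarios, key=quant_priority, reverse=True):
    print(f"{s.name}: priority {quant_priority(s):.1f}")
```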
A Bimodal Hybrid Model for Time-Dependent Probabilistic Seismic Hazard Analysis
NASA Astrophysics Data System (ADS)
Yaghmaei-Sabegh, Saman; Shoaeifar, Nasser; Shoaeifar, Parva
2018-03-01
The evaluation of evidence provided by geological studies and historical catalogs indicates that in some seismic regions and faults, multiple large earthquakes occur in clusters. The occurrences of large earthquakes are then followed by quiescence, during which only small-to-moderate earthquakes take place. Clustering of large earthquakes is the most distinguishable departure from the assumption, made in conventional seismic hazard analysis, of a constant hazard of random earthquake occurrence. In the present study, a time-dependent recurrence model is proposed to represent series of large earthquakes that occur in clusters. The model is flexible enough to reflect the quasi-periodic behavior of large earthquakes with long-term clustering, and it can be used in time-dependent probabilistic seismic hazard analysis for engineering purposes. In this model, the time-dependent hazard is estimated by a hazard function which comprises three parts: a decreasing hazard associated with the last large-earthquake cluster, an increasing hazard associated with the next large-earthquake cluster, and a constant hazard for the random occurrence of small-to-moderate earthquakes. In the final part of the paper, the time-dependent seismic hazard of the New Madrid Seismic Zone at different time intervals is calculated for illustrative purposes.
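One plausible way to write the three-part hazard function described above. The specific functional forms (an exponential decay and a saturating growth term) are assumptions for illustration; the paper's parameterization may differ:

```latex
% Sketch of the three-part hazard: constant background term, decaying term
% for the last cluster, rising term for the next cluster (forms assumed).
\[
  h(t) \;=\; \underbrace{\lambda_0}_{\text{small-to-moderate events}}
  \;+\; \underbrace{a\, e^{-t/\tau_d}}_{\text{last cluster, decreasing}}
  \;+\; \underbrace{b\left(1 - e^{-(t/\tau_r)^{k}}\right)}_{\text{next cluster, increasing}}
\]
```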
Agent-based simulation for human-induced hazard analysis.
Bulleit, William M; Drewek, Matthew W
2011-02-01
Terrorism could be treated as a hazard for design purposes. For instance, the terrorist hazard could be analyzed in a manner similar to the way that seismic hazard is handled. No matter how terrorism is dealt with in the design of systems, predictions of the frequency and magnitude of the hazard will be required. And if the human-induced hazard is to be designed for in a manner analogous to natural hazards, then the predictions should be probabilistic in nature. The model described in this article is a prototype that uses agent-based modeling (ABM) to analyze terrorist attacks. The basic approach of using ABM to model human-induced hazards has been preliminarily validated in the sense that the attack magnitudes appear to be power-law distributed and attacks occur mostly in regions through which high levels of wealth pass, such as transit routes and markets. The model developed in this study indicates that ABM is a viable approach to modeling socioeconomic-based infrastructure systems for engineering design to deal with human-induced hazards. © 2010 Society for Risk Analysis.
NASA Technical Reports Server (NTRS)
Shortle, John F.; Allocco, Michael
2005-01-01
This paper describes a scenario-driven hazard analysis process to identify, eliminate, and control safety-related risks. Within this process, we develop selection criteria to determine the applicability of applying engineering modeling to hypothesized hazard scenarios. This provides a basis for evaluating and prioritizing the scenarios as candidates for further quantitative analysis. We have applied this methodology to proposed concepts of operations for reduced wake separation for closely spaced parallel runways. For arrivals, the process identified 43 core hazard scenarios. Of these, we classified 12 as appropriate for further quantitative modeling, 24 as scenarios that should be mitigated through controls, recommendations, and/or procedures (that is, scenarios not appropriate for quantitative modeling), and 7 as having the lowest priority for further analysis.
2012-01-01
Background: We explore the benefits of applying a new proportional hazard model to analyze survival of breast cancer patients. As a parametric model, the hypertabastic survival model offers a closer fit to experimental data than Cox regression, and furthermore provides explicit survival and hazard functions which can be used as additional tools in the survival analysis. In addition, one of our main concerns is the utilization of multiple gene expression variables. Our analysis treats the important issue of interaction of different gene signatures in the survival analysis. Methods: The hypertabastic proportional hazards model was applied in survival analysis of breast cancer patients. This model was compared, using statistical measures of goodness of fit, with models based on the semi-parametric Cox proportional hazards model and the parametric log-logistic and Weibull models. The explicit functions for hazard and survival were then used to analyze the dynamic behavior of hazard and survival functions. Results: The hypertabastic model provided the best fit among all the models considered. Use of multiple gene expression variables also provided a considerable improvement in the goodness of fit of the model, as compared to use of only one. By utilizing the explicit survival and hazard functions provided by the model, we were able to determine the magnitude of the maximum rate of increase in hazard, and the maximum rate of decrease in survival, as well as the times when these occurred. We explore the influence of each gene expression variable on these extrema. Furthermore, in the cases of continuous gene expression variables, represented by a measure of correlation, we were able to investigate the dynamics with respect to changes in gene expression. Conclusions: We observed that use of three different gene signatures in the model provided a greater combined effect and allowed us to assess the relative importance of each in determination of outcome in this data set. These results point to the potential to combine gene signatures to a greater effect in cases where each gene signature represents some distinct aspect of the cancer biology. Furthermore we conclude that the hypertabastic survival models can be an effective survival analysis tool for breast cancer patients. PMID:23241496
Probabilistic wind/tornado/missile analyses for hazard and fragility evaluations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Park, Y.J.; Reich, M.
Detailed analysis procedures and examples are presented for the probabilistic evaluation of hazard and fragility against high wind, tornado, and tornado-generated missiles. In the tornado hazard analysis, existing risk models are modified to incorporate various uncertainties including modeling errors. A significant feature of this paper is the detailed description of the Monte-Carlo simulation analyses of tornado-generated missiles. A simulation procedure, which includes the wind field modeling, missile injection, solution of flight equations, and missile impact analysis, is described with application examples.
Karin Riley; Matthew Thompson; Peter Webley; Kevin D. Hyde
2017-01-01
Modeling has been used to characterize and map natural hazards and hazard susceptibility for decades. Uncertainties are pervasive in natural hazards analysis, including a limited ability to predict where and when extreme events will occur, with what consequences, and driven by what contributing factors. Modeling efforts are challenged by the intrinsic...
Natural hazard modeling and uncertainty analysis [Chapter 2]
Matthew Thompson; Jord J. Warmink
2017-01-01
Modeling can play a critical role in assessing and mitigating risks posed by natural hazards. These modeling efforts generally aim to characterize the occurrence, intensity, and potential consequences of natural hazards. Uncertainties surrounding the modeling process can have important implications for the development, application, evaluation, and interpretation of...
Research on the spatial analysis method of seismic hazard for island
NASA Astrophysics Data System (ADS)
Jia, Jing; Jiang, Jitong; Zheng, Qiuhong; Gao, Huiying
2017-05-01
Seismic hazard analysis (SHA) is a key component of earthquake disaster prevention for island engineering: microscopically, its results provide parameters for seismic design, and macroscopically it is requisite work for the earthquake and comprehensive disaster prevention components of island conservation planning, in the exploitation and construction of both inhabited and uninhabited islands. The existing seismic hazard analysis methods are compared in their application, and their applicability and limitations for islands are analysed. A specialized spatial analysis method of seismic hazard for islands (SAMSHI) is then given to support further work in earthquake disaster prevention planning, based on spatial analysis tools in GIS and a fuzzy comprehensive evaluation model. The basic spatial database of SAMSHI includes fault data, historical earthquake records, geological data and Bouguer gravity anomaly data, which are the data sources for the 11 indices of the fuzzy comprehensive evaluation model; these indices are calculated by the spatial analysis model constructed in ArcGIS's Model Builder platform.
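A minimal sketch of the fuzzy comprehensive evaluation step at the core of SAMSHI, assuming 11 index weights and a membership matrix over four hazard grades. All numbers are invented; the real indices come from the fault, seismicity, geology, and gravity layers named above:

```python
import numpy as np

# Index weights, one per indicator (normalized to sum to 1); invented values.
w = np.array([0.12, 0.10, 0.08, 0.10, 0.09, 0.08,
              0.09, 0.11, 0.08, 0.07, 0.08])
w = w / w.sum()

# Membership matrix R: rows = 11 indices, columns = hazard grades
# (e.g. low / moderate / high / very high), memberships in [0, 1].
rng = np.random.default_rng(0)
R = rng.dirichlet(np.ones(4), size=11)

b = w @ R                                   # weighted fuzzy synthesis
grades = ["low", "moderate", "high", "very high"]
print(grades[int(b.argmax())], b.round(3))  # maximum-membership principle
```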
Validation of a heteroscedastic hazards regression model.
Wu, Hong-Dar Isaac; Hsieh, Fushing; Chen, Chen-Hsin
2002-03-01
A Cox-type regression model accommodating heteroscedasticity, with a power factor applied to the baseline cumulative hazard, is investigated for analyzing data with crossing-hazards behavior. Since the partial likelihood approach cannot eliminate the baseline hazard, an overidentified estimating equation (OEE) approach is introduced in the estimation procedure. Its by-product, a model checking statistic, is presented to test the overall adequacy of the heteroscedastic model. Further, under the heteroscedastic model setting, we propose two statistics to test the proportional hazards assumption. Implementation of this model is illustrated in a data analysis of a cancer clinical trial.
NASA Astrophysics Data System (ADS)
Faruk, Alfensi
2018-03-01
Survival analysis is a branch of statistics focused on the analysis of time-to-event data. In multivariate survival analysis, the proportional hazards (PH) model is the most popular approach for analyzing the effects of several covariates on survival time. However, the PH model's assumption of a constant hazard ratio is not always satisfied by the data. Violation of the PH assumption leads to misinterpretation of the estimation results and decreases the power of the related statistical tests. Accelerated failure time (AFT) models, on the other hand, do not rest on this assumption and can be used as an alternative to the PH model when it is violated. The objective of this research was to compare the performance of the PH model and the AFT models in analyzing the significant factors affecting first birth interval (FBI) data in Indonesia. In this work, the discussion was limited to three AFT models, based on the Weibull, exponential, and log-normal distributions. Analysis using a graphical approach and a statistical test showed that non-proportional hazards exist in the FBI data set. Based on the Akaike information criterion (AIC), the log-normal AFT model was the most appropriate among the models considered. Results of the best fitted model (log-normal AFT) showed that covariates such as women's educational level, husband's educational level, contraceptive knowledge, access to mass media, wealth index, and employment status were among the factors affecting the FBI in Indonesia.
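A minimal sketch of the model-comparison workflow described above, using the Python lifelines package as a stand-in (the paper does not name its software, and the toy DataFrame below is invented). An exponential AFT model is a Weibull AFT with the shape fixed at one, so fitting Weibull and log-normal suffices for illustration:

```python
import pandas as pd
from lifelines import CoxPHFitter, WeibullAFTFitter, LogNormalAFTFitter

# Toy stand-in for the FBI data: duration = months to first birth,
# event = birth observed (1) or censored (0), educ = education level.
df = pd.DataFrame({
    "duration": [24, 36, 18, 60, 30, 48, 12, 72, 27, 54, 21, 40],
    "event":    [1,  1,  0,  1,  1,  0,  1,  1,  1,  0,  1,  1],
    "educ":     [2,  3,  1,  3,  2,  1,  0,  2,  3,  1,  2,  0],
})

for name, m in [("weibull", WeibullAFTFitter()),
                ("lognormal", LogNormalAFTFitter())]:
    m.fit(df, duration_col="duration", event_col="event")
    print(name, "AIC:", round(m.AIC_, 1))   # smaller AIC = better fit

cph = CoxPHFitter().fit(df, duration_col="duration", event_col="event")
cph.check_assumptions(df)  # Schoenfeld-residual check of the PH assumption
```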
Critical asset and portfolio risk analysis: an all-hazards framework.
Ayyub, Bilal M; McGill, William L; Kaminskiy, Mark
2007-08-01
This article develops a quantitative all-hazards framework for critical asset and portfolio risk analysis (CAPRA) that considers both natural and human-caused hazards. Following a discussion of the nature of security threats, the need for actionable risk assessments, and the distinction between asset- and portfolio-level analysis, a general formula for all-hazards risk analysis is obtained that resembles the traditional model based on the notional product of consequence, vulnerability, and threat, though with clear meanings assigned to each parameter. Furthermore, a simple portfolio consequence model is presented that yields first-order estimates of interdependency effects following a successful attack on an asset. Depending on the needs of the decisions being made and the available analytical resources, values for the parameters in this model can be obtained at a high level or through detailed systems analysis. Several illustrative examples of the CAPRA methodology are provided.
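A one-line worked example of the notional consequence-vulnerability-threat product mentioned above (all numbers invented for illustration):

```python
# Notional all-hazards asset risk, in the product form described above.
p_threat = 0.02      # annual probability a hazard event / attack occurs
p_given = 0.30       # probability the event defeats defenses (vulnerability)
consequence = 5.0e6  # loss in dollars given a successful event

annual_risk = p_threat * p_given * consequence
print(f"expected annual loss: ${annual_risk:,.0f}")  # $30,000
```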
Risk analysis based on hazards interactions
NASA Astrophysics Data System (ADS)
Rossi, Lauro; Rudari, Roberto; Trasforini, Eva; De Angeli, Silvia; Becker, Joost
2017-04-01
Despite an increasing need for open, transparent, and credible multi-hazard risk assessment methods, models, and tools, the availability of comprehensive risk information needed to inform disaster risk reduction is limited, and the level of interaction across hazards is not systematically analysed. Risk assessment methodologies for different hazards often produce risk metrics that are not comparable. Hazard interactions (the consecutive occurrence of two or more different events) are generally neglected, resulting in strongly underestimated risk in the most exposed areas. This study presents cases of interaction between different hazards, showing how subsidence can affect coastal and river flood risk (Jakarta and Bandung, Indonesia) or how flood risk is modified after a seismic event (Italy). The analysis of well-documented real study cases, based on a combination of Earth observation and in-situ data, serves as a basis for the formalisation of a multi-hazard methodology, identifying gaps and research frontiers. Multi-hazard risk analysis is performed through the RASOR platform (Rapid Analysis and Spatialisation Of Risk). A scenario-driven query system allows users to simulate future scenarios based on existing and assumed conditions, to compare them with historical scenarios, and to model multi-hazard risk both before and during an event (www.rasor.eu).
[Hazard function and life table: an introduction to the failure time analysis].
Matsushita, K; Inaba, H
1987-04-01
Failure time analysis has become popular in demographic studies. It can be viewed as part of regression analysis with limited dependent variables, as well as a special case of event history analysis and multistate demography. The idea of the hazard function and of failure time analysis, however, has not been properly introduced to, nor commonly discussed by, demographers in Japan. The concept of the hazard function is briefly described in comparison with life tables, where the force of mortality is interchangeable with the hazard rate. The basic idea of failure time analysis is summarized for the cases of the exponential distribution, the normal distribution, and proportional hazards models. The multiple decrement life table is also introduced as an example of lifetime data analysis with cause-specific hazard rates.
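The standard definitions the abstract alludes to, relating the hazard rate to the life-table force of mortality:

```latex
% For lifetime T with density f and survival function S:
\[
  h(t) \;=\; \lim_{\Delta t \to 0}
  \frac{\Pr(t \le T < t + \Delta t \mid T \ge t)}{\Delta t}
  \;=\; \frac{f(t)}{S(t)} \;=\; -\frac{d}{dt}\log S(t),
  \qquad
  S(t) = \exp\!\Big(-\!\int_0^t h(u)\,du\Big),
\]
% which is exactly the force of mortality $\mu(t)$ in the life-table setting.
```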
Petersen, Mark D.; Mueller, Charles S.; Moschetti, Morgan P.; Hoover, Susan M.; Rubinstein, Justin L.; Llenos, Andrea L.; Michael, Andrew J.; Ellsworth, William L.; McGarr, Arthur F.; Holland, Austin A.; Anderson, John G.
2015-01-01
The U.S. Geological Survey National Seismic Hazard Model for the conterminous United States was updated in 2014 to account for new methods, input models, and data necessary for assessing the seismic ground shaking hazard from natural (tectonic) earthquakes. The U.S. Geological Survey National Seismic Hazard Model project uses probabilistic seismic hazard analysis to quantify the rate of exceedance for earthquake ground shaking (ground motion). For the 2014 National Seismic Hazard Model assessment, the seismic hazard from potentially induced earthquakes was intentionally not considered, because we had not determined how to properly treat these earthquakes in the seismic hazard analysis. The phrases “potentially induced” and “induced” are used interchangeably in this report; it is acknowledged, however, that this classification is based on circumstantial evidence and scientific judgment. For the 2014 National Seismic Hazard Model update, the potentially induced earthquakes were removed from the NSHM’s earthquake catalog, and the documentation states that we would consider alternative models for including induced seismicity in a future version of the National Seismic Hazard Model. As part of the process of incorporating induced seismicity into the seismic hazard model, we evaluate the sensitivity of the seismic hazard from induced seismicity to five parts of the hazard model: (1) the earthquake catalog, (2) earthquake rates, (3) earthquake locations, (4) earthquake Mmax (maximum magnitude), and (5) earthquake ground motions. We describe alternative input models for each of the five parts that represent differences in scientific opinion on induced seismicity characteristics. In this report, however, we do not weight these input models to arrive at a preferred final model. Instead, we present a sensitivity study showing uniform seismic hazard maps obtained by applying the alternative input models for induced seismicity. The final model will be released after further consideration of the reliability and scientific acceptability of each alternative input model. Forecasting the seismic hazard from induced earthquakes is fundamentally different from forecasting the seismic hazard from natural, tectonic earthquakes, because the spatio-temporal patterns of induced earthquakes depend on economic forces and public policy decisions regarding extraction and injection of fluids. As such, the rates of induced earthquakes are inherently variable and nonstationary. Therefore, we only make maps based on an annual rate of exceedance rather than the 50-year rates calculated for previous U.S. Geological Survey hazard maps.
Ng'andu, N H
1997-03-30
In the analysis of survival data using the Cox proportional hazards (PH) model, it is important to verify that the explanatory variables analysed satisfy the proportional hazards assumption of the model. This paper presents results of a simulation study that compares five test statistics for checking the proportional hazards assumption of Cox's model. The test statistics were evaluated under proportional hazards and under the following types of departure from the proportional hazards assumption: increasing relative hazards, decreasing relative hazards, crossing hazards, diverging hazards, and non-monotonic hazards. The test statistics compared include those based on partitioning of failure time and those that do not require partitioning of failure time. The simulation results demonstrate that the time-dependent covariate test, the weighted residuals score test, and the linear correlation test have equally good power for detecting non-proportionality in the varieties of non-proportional hazards studied. Using illustrative data from the literature, these test statistics performed similarly.
Automation for System Safety Analysis
NASA Technical Reports Server (NTRS)
Malin, Jane T.; Fleming, Land; Throop, David; Thronesbery, Carroll; Flores, Joshua; Bennett, Ted; Wennberg, Paul
2009-01-01
This presentation describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis and simulation to identify and evaluate possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations and scenarios; and 4) identify resulting candidate scenarios for software integration testing. There has been significant technical progress in model extraction from Orion program text sources, architecture model derivation (components and connections) and documentation of extraction sources. Models have been derived from Internal Interface Requirements Documents (IIRDs) and FMEA documents. Linguistic text processing is used to extract model parts and relationships, and the Aerospace Ontology also aids automated model development from the extracted information. Visualizations of these models assist analysts in requirements overview and in checking consistency and completeness.
Information Extraction for System-Software Safety Analysis: Calendar Year 2007 Year-End Report
NASA Technical Reports Server (NTRS)
Malin, Jane T.
2008-01-01
This annual report describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis on the models to identify possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations; 4) perform discrete-time-based simulation on the models to investigate scenarios where these paths may play a role in failures and mishaps; and 5) identify resulting candidate scenarios for software integration testing. This paper describes new challenges in a NASA abort system case, and enhancements made to develop the integrated tool set.
Yan, Fang; Xu, Kaili
2017-01-01
Because a biomass gasification station involves various hazard factors, hazard assessment is necessary and significant. In this article, the cloud model (CM) is employed to improve set pair analysis (SPA), and a novel hazard assessment method for biomass gasification stations is proposed based on cloud model-set pair analysis (CM-SPA). In this method, cloud weight is proposed as the weight of each index. In contrast to the index weights of other methods, cloud weight is expressed by cloud descriptors; hence, the randomness and fuzziness of cloud weight make it effective in reflecting the linguistic variables of experts. Then, the cloud connection degree (CCD) is proposed to replace the connection degree (CD), and the calculation algorithm of CCD is worked out. By utilizing the CCD, the hazard assessment results are expressed by normal clouds, which are characterized by their cloud descriptors; meanwhile, the hazard grade is confirmed by analyzing the cloud descriptors. After that, two biomass gasification stations undergo hazard assessment via CM-SPA and AHP-based SPA, respectively. The comparison of assessment results illustrates that CM-SPA is suitable and effective for the hazard assessment of a biomass gasification station and makes the assessment results more reasonable and scientific. PMID:28076440
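A minimal sketch of the forward normal cloud generator that underlies the cloud descriptors mentioned above (the descriptor values are invented; the paper's CCD construction adds set-pair machinery on top of this):

```python
import numpy as np

def normal_cloud(Ex, En, He, n=1000, seed=1):
    """Forward normal cloud generator: n drops (x, membership).

    Ex/En/He are the cloud descriptors (expectation, entropy,
    hyper-entropy); the example values below are invented.
    """
    rng = np.random.default_rng(seed)
    En_prime = rng.normal(En, He, size=n)              # randomized dispersion
    x = rng.normal(Ex, np.abs(En_prime))               # cloud drops
    mu = np.exp(-(x - Ex) ** 2 / (2 * En_prime ** 2))  # membership degree
    return x, mu

drops, membership = normal_cloud(Ex=0.6, En=0.1, He=0.02)
print(drops[:3].round(3), membership[:3].round(3))
```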
Application of systems and control theory-based hazard analysis to radiation oncology.
Pawlicki, Todd; Samost, Aubrey; Brown, Derek W; Manger, Ryan P; Kim, Gwe-Ya; Leveson, Nancy G
2016-03-01
Both humans and software are notoriously challenging to account for in traditional hazard analysis models. The purpose of this work is to investigate and demonstrate the application of a new, extended accident causality model, called the systems-theoretic accident model and processes (STAMP), to radiation oncology. Specifically, a hazard analysis technique based on STAMP, system-theoretic process analysis (STPA), is used to perform a hazard analysis. The STPA procedure starts with the definition of high-level accidents for radiation oncology at the medical center and the hazards leading to those accidents. From there, the hierarchical safety control structure of the radiation oncology clinic is modeled, i.e., the controls that are used to prevent accidents and provide effective treatment. Using STPA, unsafe control actions (behaviors) are identified that can lead to the hazards, along with causal scenarios that can lead to the identified unsafe control actions. This information can be used to eliminate or mitigate potential hazards. The STPA procedure is demonstrated on a new online adaptive cranial radiosurgery procedure that omits the CT simulation step and uses CBCT for localization and planning, with a surface imaging system during treatment. The STPA procedure generated a comprehensive set of causal scenarios that are traced back to system hazards and accidents. Ten control loops were created for the new SRS procedure, covering the areas of hospital and department management, treatment design and delivery, and vendor service. Eighty-three unsafe control actions were identified, as well as 472 causal scenarios that could lead to those unsafe control actions. STPA provides a method for understanding the role of management decisions and hospital operations in system safety and for generating process design requirements to prevent hazards and accidents. The interaction of people, hardware, and software is highlighted. The method of STPA produces results that can be used to improve safety and prevent accidents, and it warrants further investigation.
Semi-parametric regression model for survival data: graphical visualization with R
2016-01-01
The Cox proportional hazards model is a semi-parametric model that leaves its baseline hazard function unspecified. The rationale for using the Cox proportional hazards model is that (I) any assumed parametric form of the underlying hazard function would be stringent and possibly unrealistic, and (II) researchers are often interested only in estimating how the hazard changes with covariates (the relative hazard). A Cox regression model can be easily fit with the coxph() function in the survival package. A stratified Cox model may be used for a covariate that violates the proportional hazards assumption. The relative importance of covariates in the population can be examined with the rankhazard package in R. Hazard ratio curves for continuous covariates can be visualized using the smoothHR package; such curves help to better understand the effects that each continuous covariate has on the outcome. The population attributable fraction is a classic quantity in epidemiology for evaluating the impact of a risk factor on the occurrence of an event in the population. In survival analysis, the adjusted/unadjusted attributable fraction can be plotted against survival time to obtain the attributable fraction function. PMID:28090517
Probabilistic analysis of tsunami hazards
Geist, E.L.; Parsons, T.
2006-01-01
Determining the likelihood of a disaster is a key component of any comprehensive hazard assessment. This is particularly true for tsunamis, even though most tsunami hazard assessments have in the past relied on scenario or deterministic models. We discuss probabilistic tsunami hazard analysis (PTHA) from the standpoint of integrating computational methods with empirical analysis of past tsunami runup. PTHA is derived from probabilistic seismic hazard analysis (PSHA), with the main difference being that PTHA must account for far-field sources. The computational methods rely on numerical tsunami propagation models, rather than on the empirical attenuation relationships used in PSHA, in determining ground motions. Because a number of source parameters affect local tsunami runup height, PTHA can become complex and computationally intensive. Empirical analysis can function in one of two ways, depending on the length and completeness of the tsunami catalog. For site-specific studies where sufficient tsunami runup data are available, hazard curves can primarily be derived from empirical analysis, with computational methods used to highlight deficiencies in the tsunami catalog. For region-wide analyses and for sites with little to no tsunami data, a computationally based method such as Monte Carlo simulation is the primary means of establishing tsunami hazards. Two case studies that describe how computational and empirical methods can be integrated are presented for Acapulco, Mexico (site-specific) and the U.S. Pacific Northwest coastline (region-wide analysis).
Software for occupational health and safety risk analysis based on a fuzzy model.
Stefanovic, Miladin; Tadic, Danijela; Djapan, Marko; Macuzic, Ivan
2012-01-01
Risk and safety management are very important issues in healthcare systems. These are complex systems with many entities, hazards and uncertainties. In such an environment, it is very hard to introduce a system for evaluating and simulating significant hazards. In this paper, we analyze different types of hazards in healthcare systems and introduce a new fuzzy model for evaluating and ranking hazards. Finally, we present a developed software solution, based on the suggested fuzzy model, for evaluating and monitoring risk.
Sun, Jianguo; Feng, Yanqin; Zhao, Hui
2015-01-01
Interval-censored failure time data occur in many fields, including epidemiological and medical studies as well as financial and sociological studies, and many authors have investigated their analysis (Sun, The statistical analysis of interval-censored failure time data, 2006; Zhang, Stat Modeling 9:321-343, 2009). In particular, a number of procedures have been developed for regression analysis of interval-censored data arising from the proportional hazards model (Finkelstein, Biometrics 42:845-854, 1986; Huang, Ann Stat 24:540-568, 1996; Pan, Biometrics 56:199-203, 2000). One drawback of most of these procedures, however, is that they involve estimation of both the regression parameters and the baseline cumulative hazard function. In this paper, we propose two simple estimation approaches that do not require estimation of the baseline cumulative hazard function. The asymptotic properties of the resulting estimates are given, and an extensive simulation study indicates that they work well in practical situations.
A fluvial and pluvial probabilistic flood hazard analysis for Can Tho city, Vietnam
NASA Astrophysics Data System (ADS)
Apel, Heiko; Martinez, Oriol; Thi Chinh, Do; Viet Dung, Nguyen
2014-05-01
Can Tho city is the largest city and the economic heart of the Mekong Delta, Vietnam. Due to its economic importance and envisaged development goals, the city grew rapidly in population size and extent over the last two decades. Large parts of the city are located in flood-prone areas, and the central parts of the city have recently experienced an increasing number of flood events, both of fluvial and of pluvial nature. As the economic power and asset values are constantly increasing, this poses a considerable risk for the city. The aim of this study is to perform a flood hazard analysis considering both fluvial and pluvial floods and to derive probabilistic flood hazard maps. This requires, as a first step, an understanding of the typical flood mechanisms. Fluvial floods are triggered by the coincidence of high water levels during the annual flood period in the Mekong Delta with high tidal levels, which in combination cause short-term inundations in Can Tho. Pluvial floods are triggered by typical tropical convective rain storms during the monsoon season. These two flood pathways are essentially independent in their sources and can thus be treated accordingly in the hazard analysis. For the fluvial hazard analysis we propose a bivariate frequency analysis of the Mekong flood characteristics: the annual maximum flood discharge Q and the annual flood volume V at the upper boundary of the Mekong Delta, the gauging station Kratie. This defines probabilities of exceedance of different Q-V pairs, which are transferred into synthetic flood hydrographs. The synthetic hydrographs are routed through a quasi-2D hydrodynamic model of the entire Mekong Delta in order to provide boundary conditions for a detailed hazard mapping of Can Tho. This downscaling step is necessary because the huge complexity of the river and channel network does not allow for a proper definition of boundary conditions for Can Tho city by gauge data alone. In addition, the available gauge records around Can Tho are too short for a meaningful frequency analysis. The detailed hazard mapping is performed with a 2D hydrodynamic model of Can Tho city. As the scenarios are derived in a Monte Carlo framework, the final flood hazard maps are probabilistic, i.e. they show the median flood hazard along with uncertainty estimates for each defined level of probability of exceedance. For the pluvial flood hazard, a frequency analysis of the hourly rain gauge data of Can Tho is performed implementing a peak-over-threshold procedure. Based on this frequency analysis, synthetic rain storms are generated in a Monte Carlo framework for the same probabilities of exceedance as in the fluvial flood hazard analysis. Probabilistic flood hazard maps are then generated with the same 2D hydrodynamic model of the city. In a last step, the fluvial and pluvial scenarios are combined assuming independence of the events, and these combined scenarios are also transferred into hazard maps by the 2D hydrodynamic model, finally yielding combined fluvial-pluvial probabilistic flood hazard maps for Can Tho. The derived set of maps may be used for improved city planning or a flood risk analysis.
Wang, Wei; Albert, Jeffrey M
2017-08-01
An important problem within the social, behavioral, and health sciences is how to partition an exposure effect (e.g. treatment or risk factor) among specific pathway effects and to quantify the importance of each pathway. Mediation analysis based on the potential outcomes framework is an important tool to address this problem and we consider the estimation of mediation effects for the proportional hazards model in this paper. We give precise definitions of the total effect, natural indirect effect, and natural direct effect in terms of the survival probability, hazard function, and restricted mean survival time within the standard two-stage mediation framework. To estimate the mediation effects on different scales, we propose a mediation formula approach in which simple parametric models (fractional polynomials or restricted cubic splines) are utilized to approximate the baseline log cumulative hazard function. Simulation study results demonstrate low bias of the mediation effect estimators and close-to-nominal coverage probability of the confidence intervals for a wide range of complex hazard shapes. We apply this method to the Jackson Heart Study data and conduct sensitivity analysis to assess the impact on the mediation effects inference when the no unmeasured mediator-outcome confounding assumption is violated.
Information Extraction for System-Software Safety Analysis: Calendar Year 2008 Year-End Report
NASA Technical Reports Server (NTRS)
Malin, Jane T.
2009-01-01
This annual report describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis and simulation to identify and evaluate possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations and scenarios; and 4) identify resulting candidate scenarios for software integration testing. There has been significant technical progress in model extraction from Orion program text sources, architecture model derivation (components and connections) and documentation of extraction sources. Models have been derived from Internal Interface Requirements Documents (IIRDs) and FMEA documents. Linguistic text processing is used to extract model parts and relationships, and the Aerospace Ontology also aids automated model development from the extracted information. Visualizations of these models assist analysts in requirements overview and in checking consistency and completeness.
Revealing the underlying drivers of disaster risk: a global analysis
NASA Astrophysics Data System (ADS)
Peduzzi, Pascal
2017-04-01
Disaster events are perfect examples of compound events. Disaster risk lies at the intersection of several independent components, such as hazard, exposure and vulnerability. Understanding the weight of each component requires extensive standardisation. Here, I show how footprints of past disastrous events were generated using GIS modelling techniques and used to extract population and economic exposure based on distribution models. Using past event losses, it was possible to identify and quantify a wide range of socio-politico-economic drivers associated with human vulnerability. The analysis was applied to about nine thousand individual past disastrous events covering earthquakes, floods and tropical cyclones. Using multiple regression analysis on these individual events, it was possible to quantify each risk component and assess how vulnerability is influenced by various hazard intensities. The results show that hazard intensity, exposure, poverty and governance, as well as other underlying factors (e.g. remoteness), can explain the magnitude of past disasters. Analysis was also performed to highlight the role of future trends in population and climate change, and how these may impact exposure to tropical cyclones in the future. GIS models combined with statistical multiple regression analysis provide a powerful methodology to identify, quantify and model disaster risk, taking into account its various components. The same methodology can be applied to various types of risk at local to global scales. This method was applied and developed for the Global Risk Analysis of the Global Assessment Report on Disaster Risk Reduction (GAR). It was first applied to mortality risk in GAR 2009 and GAR 2011. New models, ranging from global asset exposure to global flood hazard models, were also recently developed to improve the resolution of the risk analysis and were applied through the CAPRA software to provide probabilistic economic risk assessments, such as Average Annual Losses (AAL) and Probable Maximum Losses (PML), in GAR 2013 and GAR 2015. In parallel, similar methodologies were developed to highlight the role of ecosystems for climate change adaptation (CCA) and disaster risk reduction (DRR). New developments may include slow hazards (such as soil degradation and droughts) and natech hazards (by intersecting with georeferenced critical infrastructure). The various global hazard, exposure and risk models can be visualized and downloaded through the PREVIEW Global Risk Data Platform.
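A schematic of the multiple-regression step described above, in which log losses are explained by hazard intensity, exposure, and a vulnerability proxy. All data below are synthetic and the GAR models use many more drivers:

```python
import numpy as np

# Synthetic events: log(losses) as a linear function of log hazard
# intensity, log exposed population, and a poverty proxy.
rng = np.random.default_rng(7)
n = 500
log_intensity = rng.normal(0, 1, n)
log_exposure = rng.normal(10, 1, n)
poverty = rng.uniform(0, 1, n)
log_loss = (0.8 * log_intensity + 1.1 * log_exposure
            + 2.0 * poverty + rng.normal(0, 0.5, n))

X = np.column_stack([np.ones(n), log_intensity, log_exposure, poverty])
coef, *_ = np.linalg.lstsq(X, log_loss, rcond=None)
print(coef.round(2))  # recovers roughly [0, 0.8, 1.1, 2.0]
```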
Modeling and Hazard Analysis Using STPA
NASA Astrophysics Data System (ADS)
Ishimatsu, Takuto; Leveson, Nancy; Thomas, John; Katahira, Masa; Miyamoto, Yuko; Nakao, Haruka
2010-09-01
A joint research project between MIT and JAXA/JAMSS is investigating the application of a new hazard analysis to the system and software in the HTV. Traditional hazard analysis focuses on component failures, but software does not fail in this way. Software most often contributes to accidents by commanding the spacecraft into an unsafe state (e.g., turning off the descent engines prematurely) or by not issuing required commands. That makes the standard hazard analysis techniques of limited usefulness on software-intensive systems, which describes most spacecraft built today. STPA is a new hazard analysis technique based on systems theory rather than reliability theory. It treats safety as a control problem rather than a failure problem. The goal of STPA, which is to create a set of scenarios that can lead to a hazard, is the same as that of FTA, but STPA includes a broader set of potential scenarios, including those in which no failures occur but problems arise due to unsafe and unintended interactions among the system components. STPA also provides more guidance to the analysts than traditional fault tree analysis. Functional control diagrams are used to guide the analysis. In addition, JAXA uses a model-based system engineering development environment (created originally by Leveson and called SpecTRM) which also assists in the hazard analysis. One of the advantages of STPA is that it can be applied early in the system engineering and development process, in a safety-driven design process where hazard analysis drives the design decisions rather than waiting until reviews identify problems that are then costly or difficult to fix. It can also be applied in an after-the-fact analysis and hazard assessment, which is what we did in this case study. This paper describes the experimental application of STPA to the JAXA HTV in order to determine the feasibility and usefulness of the new hazard analysis technique. Because the HTV was originally developed using fault tree analysis and following the NASA standards for safety-critical systems, the results of our experimental application of STPA can be compared with these more traditional safety engineering approaches in terms of the problems identified and the resources required to use it.
NASA Technical Reports Server (NTRS)
Butler, David R.; Walsh, Stephen J.; Brown, Daniel G.
1991-01-01
Methods are described for using Landsat Thematic Mapper digital data and digital elevation models for the display of natural hazard sites in a mountainous region of northwestern Montana, USA. Hazard zones can be easily identified on the three-dimensional images. Proximity of facilities such as highways and building locations to hazard sites can also be easily displayed. A temporal sequence of Landsat TM (or similar) satellite data sets could also be used to display landscape changes associated with dynamic natural hazard processes.
NASA Astrophysics Data System (ADS)
Avital, Matan; Kamai, Ronnie; Davis, Michael; Dor, Ory
2018-02-01
We present a full probabilistic seismic hazard analysis (PSHA) sensitivity analysis for two sites in southern Israel: one in the near field of a major fault system and one farther away. The PSHA is conducted for alternative source representations, using alternative model parameters for the main seismic sources, such as slip rate and Mmax, among others. The analysis also considers the effect of the ground motion prediction equation (GMPE) on the hazard results. In this way, the two types of epistemic uncertainty - modelling uncertainty and parametric uncertainty - are treated and addressed. We quantify the uncertainty propagation by testing its influence on the final calculated hazard, such that the controlling knowledge gaps are identified and can be treated in future studies. We find that current practice in Israel, as represented by the current version of the building code, grossly underestimates the hazard: by approximately 40 % for short return periods (e.g. 10 % in 50 years) and by as much as 150 % for long return periods (e.g. 10E-5). The analysis shows that this underestimation is most probably due to a combination of factors, including the source definitions as well as the GMPE used for the analysis.
Ensemble of ground subsidence hazard maps using fuzzy logic
NASA Astrophysics Data System (ADS)
Park, Inhye; Lee, Jiyeong; Saro, Lee
2014-06-01
Hazard maps of ground subsidence around abandoned underground coal mines (AUCMs) in Samcheok, Korea, were constructed using fuzzy ensemble techniques and a geographic information system (GIS). To evaluate the factors related to ground subsidence, a spatial database was constructed from topographic, geologic, mine tunnel, land use, groundwater, and ground subsidence maps. Spatial data, topography, geology, and various ground-engineering data for the subsidence area were collected and compiled in a database for mapping ground-subsidence hazard (GSH). The subsidence area was randomly split 70/30 for training and validation of the models. The relationships between the detected ground-subsidence area and the factors were identified and quantified by frequency ratio (FR), logistic regression (LR), and artificial neural network (ANN) models. The relationships were used as factor ratings in the overlay analysis to create ground-subsidence hazard indexes and maps. The three GSH maps were then used as new input factors and integrated using fuzzy ensemble methods to produce improved hazard maps. All of the hazard maps were validated by comparison with known subsidence areas that were not used directly in the analysis. As a result, the ensemble model was found to be more effective in terms of prediction accuracy than the individual models.
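A minimal sketch of one common fuzzy ensemble rule, the fuzzy gamma operator, applied to three hazard-index maps. The abstract does not specify which fuzzy operators were used, so this operator and the gamma value are assumptions:

```python
import numpy as np

def fuzzy_gamma(maps, gamma=0.8):
    """Combine hazard-index maps in [0, 1] with the fuzzy gamma operator.

    maps: array of shape (n_models, rows, cols), e.g. FR, LR and ANN
    indexes rescaled to [0, 1]. gamma trades off between fuzzy algebraic
    sum (gamma=1) and product (gamma=0); 0.8 is only an illustrative choice.
    """
    maps = np.asarray(maps, dtype=float)
    alg_sum = 1.0 - np.prod(1.0 - maps, axis=0)
    alg_prod = np.prod(maps, axis=0)
    return alg_sum ** gamma * alg_prod ** (1.0 - gamma)

rng = np.random.default_rng(3)
fr, lr, ann = rng.uniform(size=(3, 4, 4))  # stand-ins for the three GSH maps
print(fuzzy_gamma([fr, lr, ann]).round(2))
```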
Hazard function theory for nonstationary natural hazards
NASA Astrophysics Data System (ADS)
Read, L.; Vogel, R. M.
2015-12-01
Studies from the natural hazards literature indicate that many natural processes, including wind speeds, landslides, wildfires, precipitation, streamflow and earthquakes, show evidence of nonstationary behavior such as trends in magnitudes through time. Traditional probabilistic analysis of natural hazards based on partial duration series (PDS) generally assumes stationarity in the magnitudes and arrivals of events, i.e. that the probability of exceedance is constant through time. Given evidence of trends and the consequent expected growth in devastating impacts from natural hazards across the world, new methods are needed to characterize their probabilistic behavior. The field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (x) with its failure time series (t), enabling computation of corresponding average return periods and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose PDS magnitudes are assumed to follow the widely applied Poisson-GP model. We derive a 2-parameter Generalized Pareto hazard model and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard event series x, with corresponding failure time series t, should have application to a wide class of natural hazards.
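For a Generalized Pareto magnitude model, the hazard function over magnitude has a simple closed form, which illustrates the kind of algebra involved (the paper's 2-parameter GP hazard model over failure time may be parameterized differently):

```latex
% With scale sigma > 0 and shape xi, survival S(x) = (1 + xi x/sigma)^{-1/xi}:
\[
  f(x) = \frac{1}{\sigma}\Big(1 + \frac{\xi x}{\sigma}\Big)^{-1/\xi - 1},
  \qquad
  h(x) = \frac{f(x)}{S(x)} = \frac{1}{\sigma + \xi x},
\]
% so the exceedance hazard decreases in x for xi > 0 (heavy tails) and is
% constant in the exponential limit xi -> 0.
```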
Cascading disaster models in postburn flash flood
Fred May
2007-01-01
A useful method of modeling threats from hazards and documenting their disaster causation sequences is called "cascading threat modeling." This type of modeling enables emergency planners to address hazard and risk assessments systematically. This paper describes a cascading threat modeling and analysis process. Wildfire and an associated postburn flash flood disaster...
Wake vortex separation standards : analysis methods
DOT National Transportation Integrated Search
1997-01-01
Wake vortex separation standards are used to prevent hazardous wake vortex encounters. A "safe" separation model can be used to assess the safety of proposed changes in the standards. A safe separation model can be derived from an encounter hazard mo...
NASA Astrophysics Data System (ADS)
Apel, H.; Trepat, O. M.; Hung, N. N.; Chinh, D. T.; Merz, B.; Dung, N. V.
2015-08-01
Many urban areas experience both fluvial and pluvial floods, because locations next to rivers are preferred settlement areas and the predominantly sealed urban surface prevents infiltration and facilitates surface inundation. The latter problem is enhanced in cities with insufficient or non-existent sewer systems. While there are a number of approaches to analyse either fluvial or pluvial flood hazard, studies of combined fluvial and pluvial flood hazard are scarce. Thus this study aims at the analysis of fluvial and pluvial flood hazard individually, but also at developing a method for the analysis of combined pluvial and fluvial flood hazard. This combined fluvial-pluvial flood hazard analysis is performed taking Can Tho city, the largest city in the Vietnamese part of the Mekong Delta, as an example. In this tropical environment the annual monsoon-triggered floods of the Mekong River can coincide with heavy local convective precipitation events, causing both fluvial and pluvial flooding at the same time. Fluvial flood hazard was estimated with a copula-based bivariate extreme value statistic for the gauge Kratie at the upper boundary of the Mekong Delta and a large-scale hydrodynamic model of the Mekong Delta. This provided the boundaries for 2-dimensional hydrodynamic inundation simulation for Can Tho city. Pluvial hazard was estimated by a peak-over-threshold frequency estimation based on local rain gauge data and a stochastic rain storm generator. Inundation was simulated by a 2-dimensional hydrodynamic model implemented on a graphics processing unit (GPU) for time-efficient flood propagation modelling. All hazards - fluvial, pluvial and combined - were accompanied by an uncertainty estimation considering the natural variability of the flood events. This resulted in probabilistic flood hazard maps showing the maximum inundation depths for a selected set of probabilities of occurrence, with maps showing the median (expectation) and percentile maps quantifying the uncertainty. The results are critically discussed and ways for their usage in flood risk management are outlined.
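A minimal sketch of the peak-over-threshold frequency estimation used for the pluvial hazard, with a Generalized Pareto fit to threshold exceedances. The data are synthetic, and in practice threshold selection requires diagnostic checks such as mean-residual-life plots:

```python
import numpy as np
from scipy.stats import genpareto

# Synthetic stand-in for ~20 years of hourly rain gauge data (mm/h).
rng = np.random.default_rng(11)
rain = rng.gamma(shape=0.4, scale=6.0, size=20 * 365 * 24)
u = np.quantile(rain, 0.999)          # threshold (assumed choice)
exc = rain[rain > u] - u

xi, _, sigma = genpareto.fit(exc, floc=0.0)  # fix location at 0
rate = len(exc) / 20.0                       # exceedances per year

# Return level for a T-year event: GPD quantile at 1 - 1/(rate*T)
T = 100.0
rl = u + genpareto.ppf(1 - 1.0 / (rate * T), xi, loc=0.0, scale=sigma)
print(f"{T:.0f}-yr hourly rain ~ {rl:.1f} mm/h (synthetic data)")
```

Sampling synthetic storms from the fitted distribution in a Monte Carlo loop, as the study describes, then yields the probabilistic inputs for the 2D inundation model.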
Occupational hazard evaluation model underground coal mine based on unascertained measurement theory
NASA Astrophysics Data System (ADS)
Deng, Quanlong; Jiang, Zhongan; Sun, Yaru; Peng, Ya
2017-05-01
In order to comprehensively evaluate the influence of several occupational hazards on miners' physical and mental health, an occupational hazard evaluation indicator system was established, based on unascertained measurement theory, to support quantitative and qualitative analysis. Each indicator weight was determined by information entropy, and the occupational hazard level was estimated by credible-degree recognition criteria. The evaluation model was programmed in Visual Basic and applied to the comprehensive occupational hazard evaluation of six posts in an underground coal mine, and the occupational hazard degree was graded; the evaluation results are consistent with the actual situation. The results show that dust and noise are the most significant occupational hazard factors in coal mines. Excavation face support workers are the most affected, followed by heading machine drivers, coal cutter drivers, and coalface move support workers; the occupational hazard degree of these four types of workers is level II (mild). The occupational hazard degree of ventilation workers and safety inspection workers is level I. The evaluation model can evaluate underground coal mines objectively and accurately, and can be employed in actual engineering.
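A minimal sketch of the information-entropy weighting step mentioned above. The indicator matrix is invented for illustration; rows are posts and columns are hazard indicators such as dust and noise exposure:

```python
import numpy as np

def entropy_weights(X):
    """Standard information-entropy weights for an indicator matrix X
    (rows = evaluated posts, columns = hazard indicators)."""
    P = X / X.sum(axis=0)                 # column-normalized proportions
    n = X.shape[0]
    plogp = np.where(P > 0, P * np.log(P, where=P > 0), 0.0)
    e = -plogp.sum(axis=0) / np.log(n)    # entropy of each indicator
    d = 1.0 - e                           # degree of divergence
    return d / d.sum()                    # higher weight = more informative

X = np.array([[8.2, 85.0, 0.3],
              [6.1, 92.0, 0.2],
              [9.5, 88.0, 0.4],
              [3.0, 78.0, 0.1]])
print(entropy_weights(X).round(3))
```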
NASA Technical Reports Server (NTRS)
1972-01-01
The Accident Model Document is one of three documents of the Preliminary Safety Analysis Report (PSAR) - Reactor System as applied to a Space Base Program. Potential terrestrial nuclear hazards involving the zirconium hydride reactor-Brayton power module are identified for all phases of the Space Base program. The accidents/events that give rise to the hazards are defined, and abort sequence trees are developed to determine the sequence of events leading to each hazard and the associated probabilities of occurrence. Source terms are calculated to determine the magnitude of the hazards. These data are used in the mission accident analysis to determine the most probable and significant accidents/events in each mission phase. The only significant hazards during the prelaunch and launch ascent phases of the mission are those which arise from criticality accidents. Fission product inventories during this time period were found to be very low due to very limited low-power acceptance testing.
Smoothing spline ANOVA frailty model for recurrent event data.
Du, Pang; Jiang, Yihua; Wang, Yuedong
2011-12-01
Gap time hazard estimation is of particular interest in recurrent event data. This article proposes a fully nonparametric approach for estimating the gap time hazard. Smoothing spline analysis of variance (ANOVA) decompositions are used to model the log gap time hazard as a joint function of gap time and covariates, and general frailty is introduced to account for between-subject heterogeneity and within-subject correlation. We estimate the nonparametric gap time hazard function and parameters in the frailty distribution using a combination of the Newton-Raphson procedure, the stochastic approximation algorithm (SAA), and the Markov chain Monte Carlo (MCMC) method. The convergence of the algorithm is guaranteed by decreasing the step size of the parameter update and/or increasing the MCMC sample size along iterations. A model selection procedure is also developed to identify negligible components in a functional ANOVA decomposition of the log gap time hazard. We evaluate the proposed methods with simulation studies and illustrate their use through the analysis of bladder tumor data. © 2011, The International Biometric Society.
WE-G-BRA-06: Application of Systems and Control Theory-Based Hazard Analysis to Radiotherapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pawlicki, T; Samost, A; Leveson, N
Purpose: The process of delivering radiation occurs in a complex socio-technical system heavily reliant on human operators. Furthermore, both humans and software are notoriously challenging to account for in traditional hazard analysis models. High-reliability industries such as aviation have approached this problem by using hazard analysis techniques grounded in systems and control theory. The purpose of this work is to apply the Systems-Theoretic Accident Model and Processes (STAMP) hazard model to radiotherapy. In particular, the System-Theoretic Process Analysis (STPA) approach is used to perform a hazard analysis of a proposed on-line adaptive cranial radiosurgery procedure that omits the CT simulation step and uses only CBCT for planning, localization, and treatment. Methods: The STPA procedure first requires the definition of high-level accidents and the hazards leading to those accidents. From there, hierarchical control structures were created, followed by the identification and description of control actions for each control structure. Utilizing these control structures, unsafe states of each control action were identified. Scenarios contributing to unsafe control action states were then identified and translated into system requirements to constrain process behavior within safe boundaries. Results: Ten control structures were created for this new CBCT-only process, covering the areas of hospital and department management, treatment design and delivery, and vendor service. Twenty-three control actions were identified that contributed to over 80 unsafe states of those control actions, resulting in over 220 failure scenarios. Conclusion: The interaction of people, hardware, and software is highlighted through the STPA approach. STPA provides a hierarchical model for understanding the role of management decisions in impacting system safety, so that a process design requirement can be traced back to the hazard and accident it is intended to mitigate. Varian Medical Systems, Inc.
Analysis and design of randomised clinical trials involving competing risks endpoints.
Tai, Bee-Choo; Wee, Joseph; Machin, David
2011-05-19
In randomised clinical trials involving time-to-event outcomes, the failures concerned may be events of an entirely different nature and as such define a classical competing risks framework. In designing and analysing clinical trials involving such endpoints, it is important to account for the competing events, and evaluate how each contributes to the overall failure. An appropriate choice of statistical model is important for adequate determination of sample size. We describe how competing events may be summarised in such trials using cumulative incidence functions and Gray's test. The statistical modelling of competing events using proportional cause-specific and subdistribution hazard functions, and the corresponding procedures for sample size estimation are outlined. These are illustrated using data from a randomised clinical trial (SQNP01) of patients with advanced (non-metastatic) nasopharyngeal cancer. In this trial, treatment has no effect on the competing event of loco-regional recurrence. Thus the effects of treatment on the hazard of distant metastasis were similar via both the cause-specific (unadjusted csHR = 0.43, 95% CI 0.25 - 0.72) and subdistribution (unadjusted subHR 0.43; 95% CI 0.25 - 0.76) hazard analyses, in favour of concurrent chemo-radiotherapy followed by adjuvant chemotherapy. Adjusting for nodal status and tumour size did not alter the results. The results of the logrank test (p = 0.002) comparing the cause-specific hazards and the Gray's test (p = 0.003) comparing the cumulative incidences also led to the same conclusion. However, the subdistribution hazard analysis requires many more subjects than the cause-specific hazard analysis to detect the same magnitude of effect. The cause-specific hazard analysis is appropriate for analysing competing risks outcomes when treatment has no effect on the cause-specific hazard of the competing event. It requires fewer subjects than the subdistribution hazard analysis for a similar effect size. However, if the main and competing events are influenced in opposing directions by an intervention, a subdistribution hazard analysis may be warranted.
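For readers wanting to reproduce the cumulative incidence component of such an analysis, a minimal sketch using the Aalen-Johansen estimator from the Python lifelines package follows. The data and event coding are hypothetical, and Gray's test itself is not shown (it is available in R's cmprsk package rather than in lifelines).

```python
# Sketch: cumulative incidence of an event of interest in the presence of a
# competing event. Hypothetical codes: 0 = censored, 1 = distant metastasis,
# 2 = loco-regional recurrence (the competing event).
import numpy as np
from lifelines import AalenJohansenFitter

rng = np.random.default_rng(1)
time = rng.exponential(40, size=200)                     # stand-in follow-up times
event = rng.choice([0, 1, 2], size=200, p=[0.3, 0.4, 0.3])

ajf = AalenJohansenFitter()
ajf.fit(time, event, event_of_interest=1)  # competing events are NOT censored
print(ajf.cumulative_density_.tail())      # estimated cumulative incidence
```

Unlike naive Kaplan-Meier complements, the Aalen-Johansen estimator accounts for the competing event, which is why cumulative incidence functions are the recommended summary in this setting.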
Teamwork tools and activities within the hazard component of the Global Earthquake Model
NASA Astrophysics Data System (ADS)
Pagani, M.; Weatherill, G.; Monelli, D.; Danciu, L.
2013-05-01
The Global Earthquake Model (GEM) is a public-private partnership aimed at supporting and fostering a global community of scientists and engineers working in the fields of seismic hazard and risk assessment. In the hazard sector, in particular, GEM recognizes the importance of local ownership and leadership in the creation of seismic hazard models. For this reason, over the last few years, GEM has been promoting different activities in the context of seismic hazard analysis, ranging, for example, from regional projects targeted at the creation of updated seismic hazard studies to the development of a new open-source seismic hazard and risk calculation software called the OpenQuake-engine (http://globalquakemodel.org). In this communication we will provide a tour of the various activities completed, such as the new ISC-GEM Global Instrumental Catalogue, and of ongoing initiatives such as the development of a suite of tools for building PSHA input models. Discussion, comments and criticism from colleagues in the audience will be highly appreciated.
Guo, Xuezhen; Claassen, G D H; Oude Lansink, A G J M; Saatkamp, H W
2014-06-01
Economic analysis of hazard surveillance in livestock production chains is essential for surveillance organizations (such as food safety authorities) when making scientifically based decisions on optimization of resource allocation. To enable this, quantitative decision support tools are required at two levels of analysis: (1) single-hazard surveillance system and (2) surveillance portfolio. This paper addresses the first level by presenting a conceptual approach for the economic analysis of single-hazard surveillance systems. The concept includes objective and subjective aspects of single-hazard surveillance system analysis: (1) a simulation part to derive an efficient set of surveillance setups based on the technical surveillance performance parameters (TSPPs) and the corresponding surveillance costs, i.e., objective analysis, and (2) a multi-criteria decision making model to evaluate the impacts of the hazard surveillance, i.e., subjective analysis. The conceptual approach was checked for (1) conceptual validity and (2) data validity. Issues regarding the practical use of the approach, particularly the data requirement, were discussed. We concluded that the conceptual approach is scientifically credible for economic analysis of single-hazard surveillance systems and that the practicability of the approach depends on data availability. Copyright © 2014 Elsevier B.V. All rights reserved.
KSC VAB Aeroacoustic Hazard Assessment
NASA Technical Reports Server (NTRS)
Oliveira, Justin M.; Yedo, Sabrina; Campbell, Michael D.; Atkinson, Joseph P.
2010-01-01
NASA Kennedy Space Center (KSC) carried out an analysis of the effects of aeroacoustics produced by stationary solid rocket motors in processing areas at KSC. In the current paper, attention is directed toward the acoustic effects of a motor burning within the Vehicle Assembly Building (VAB). The analysis was carried out with support from ASRC Aerospace who modeled transmission effects into surrounding facilities. Calculations were done using semi-analytical models for both aeroacoustics and transmission. From the results it was concluded that acoustic hazards in proximity to the source of ignition and plume can be severe; acoustic hazards in the far-field are significantly lower.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sezen, Halil; Aldemir, Tunc; Denning, R.
Probabilistic risk assessment of nuclear power plants initially focused on events initiated by internal faults at the plant, rather than external hazards including earthquakes and flooding. Although the importance of external hazards risk analysis is now well recognized, the methods for analyzing low probability external hazards rely heavily on subjective judgment of specialists, often resulting in substantial conservatism. This research developed a framework to integrate the risk of seismic and flooding events using realistic structural models and simulation of response of nuclear structures. The results of four application case studies are presented.
The Global Tsunami Model (GTM)
NASA Astrophysics Data System (ADS)
Thio, H. K.; Løvholt, F.; Harbitz, C. B.; Polet, J.; Lorito, S.; Basili, R.; Volpe, M.; Romano, F.; Selva, J.; Piatanesi, A.; Davies, G.; Griffin, J.; Baptista, M. A.; Omira, R.; Babeyko, A. Y.; Power, W. L.; Salgado Gálvez, M.; Behrens, J.; Yalciner, A. C.; Kanoglu, U.; Pekcan, O.; Ross, S.; Parsons, T.; LeVeque, R. J.; Gonzalez, F. I.; Paris, R.; Shäfer, A.; Canals, M.; Fraser, S. A.; Wei, Y.; Weiss, R.; Zaniboni, F.; Papadopoulos, G. A.; Didenkulova, I.; Necmioglu, O.; Suppasri, A.; Lynett, P. J.; Mokhtari, M.; Sørensen, M.; von Hillebrandt-Andrade, C.; Aguirre Ayerbe, I.; Aniel-Quiroga, Í.; Guillas, S.; Macias, J.
2016-12-01
The large tsunami disasters of the last two decades have highlighted the need for a thorough understanding of the risk posed by relatively infrequent but disastrous tsunamis and the importance of a comprehensive and consistent methodology for quantifying the hazard. In the last few years, several methods for probabilistic tsunami hazard analysis have been developed and applied to different parts of the world. In an effort to coordinate and streamline these activities and make progress towards implementing the Sendai Framework of Disaster Risk Reduction (SFDRR) we have initiated a Global Tsunami Model (GTM) working group with the aim of i) enhancing our understanding of tsunami hazard and risk on a global scale and developing standards and guidelines for it, ii) providing a portfolio of validated tools for probabilistic tsunami hazard and risk assessment at a range of scales, and iii) developing a global tsunami hazard reference model. This GTM initiative has grown out of the tsunami component of the Global Assessment of Risk (GAR15), which has resulted in an initial global model of probabilistic tsunami hazard and risk. Started as an informal gathering of scientists interested in advancing tsunami hazard analysis, the GTM is currently in the process of being formalized through letters of interest from participating institutions. The initiative has now been endorsed by the United Nations International Strategy for Disaster Reduction (UNISDR) and the World Bank's Global Facility for Disaster Reduction and Recovery (GFDRR). We will provide an update on the state of the project and the overall technical framework, and discuss the technical issues that are currently being addressed, including earthquake source recurrence models, the use of aleatory variability and epistemic uncertainty, and preliminary results for a probabilistic global hazard assessment, which is an update of the model included in UNISDR GAR15.
The Global Tsunami Model (GTM)
NASA Astrophysics Data System (ADS)
Lorito, S.; Basili, R.; Harbitz, C. B.; Løvholt, F.; Polet, J.; Thio, H. K.
2017-12-01
The tsunamis that have occurred worldwide in the last two decades have highlighted the need for a thorough understanding of the risk posed by relatively infrequent but often disastrous tsunamis and the importance of a comprehensive and consistent methodology for quantifying the hazard. In the last few years, several methods for probabilistic tsunami hazard analysis have been developed and applied to different parts of the world. In an effort to coordinate and streamline these activities and make progress towards implementing the Sendai Framework of Disaster Risk Reduction (SFDRR) we have initiated a Global Tsunami Model (GTM) working group with the aim of i) enhancing our understanding of tsunami hazard and risk on a global scale and developing standards and guidelines for it, ii) providing a portfolio of validated tools for probabilistic tsunami hazard and risk assessment at a range of scales, and iii) developing a global tsunami hazard reference model. This GTM initiative has grown out of the tsunami component of the Global Assessment of Risk (GAR15), which has resulted in an initial global model of probabilistic tsunami hazard and risk. Started as an informal gathering of scientists interested in advancing tsunami hazard analysis, the GTM is currently in the process of being formalized through letters of interest from participating institutions. The initiative has now been endorsed by the United Nations International Strategy for Disaster Reduction (UNISDR) and the World Bank's Global Facility for Disaster Reduction and Recovery (GFDRR). We will provide an update on the state of the project and the overall technical framework, and discuss the technical issues that are currently being addressed, including earthquake source recurrence models, the use of aleatory variability and epistemic uncertainty, and preliminary results for a probabilistic global hazard assessment, which is an update of the model included in UNISDR GAR15.
The Global Tsunami Model (GTM)
NASA Astrophysics Data System (ADS)
Løvholt, Finn
2017-04-01
The large tsunami disasters of the last two decades have highlighted the need for a thorough understanding of the risk posed by relatively infrequent but disastrous tsunamis and the importance of a comprehensive and consistent methodology for quantifying the hazard. In the last few years, several methods for probabilistic tsunami hazard analysis have been developed and applied to different parts of the world. In an effort to coordinate and streamline these activities and make progress towards implementing the Sendai Framework of Disaster Risk Reduction (SFDRR) we have initiated a Global Tsunami Model (GTM) working group with the aim of i) enhancing our understanding of tsunami hazard and risk on a global scale and developing standards and guidelines for it, ii) providing a portfolio of validated tools for probabilistic tsunami hazard and risk assessment at a range of scales, and iii) developing a global tsunami hazard reference model. This GTM initiative has grown out of the tsunami component of the Global Assessment of Risk (GAR15), which has resulted in an initial global model of probabilistic tsunami hazard and risk. Started as an informal gathering of scientists interested in advancing tsunami hazard analysis, the GTM is currently in the process of being formalized through letters of interest from participating institutions. The initiative has now been endorsed by the United Nations International Strategy for Disaster Reduction (UNISDR) and the World Bank's Global Facility for Disaster Reduction and Recovery (GFDRR). We will provide an update on the state of the project and the overall technical framework, and discuss the technical issues that are currently being addressed, including earthquake source recurrence models, the use of aleatory variability and epistemic uncertainty, and preliminary results for a probabilistic global hazard assessment, which is an update of the model included in UNISDR GAR15.
NASA Astrophysics Data System (ADS)
Wilson, R. I.; Eble, M. C.
2013-12-01
The U.S. National Tsunami Hazard Mitigation Program (NTHMP) is composed of representatives from coastal states and federal agencies who, under the guidance of NOAA, work together to develop protocols and products to help communities prepare for and mitigate tsunami hazards. Within the NTHMP are several subcommittees responsible for complementary aspects of tsunami assessment, mitigation, education, warning, and response. The Mapping and Modeling Subcommittee (MMS) is composed of state and federal scientists who specialize in tsunami source characterization, numerical tsunami modeling, inundation map production, and warning forecasting. Until September 2012, much of the work of the MMS was authorized through the Tsunami Warning and Education Act, an Act that has since expired but the spirit of which is being adhered to in parallel with reauthorization efforts. Over the past several years, the MMS has developed guidance and best practices for states and territories to produce accurate and consistent tsunami inundation maps for community-level evacuation planning, and has conducted benchmarking of numerical inundation models. Recent tsunami events have highlighted the need for other types of tsunami hazard analyses and products for improving evacuation planning, vertical evacuation, maritime planning, land-use planning, building construction, and warning forecasts. As the program responsible for producing accurate and consistent tsunami products nationally, the NTHMP-MMS is initiating a multi-year plan to accomplish the following: 1) Create and build on existing demonstration projects that explore new tsunami hazard analysis techniques and products, such as maps identifying areas of strong currents and potential damage within harbors as well as probabilistic tsunami hazard analysis for land-use planning. 2) Develop benchmarks for validating new numerical modeling techniques related to current velocities and landslide sources. 3) Generate guidance and protocols for the production and use of new tsunami hazard analysis products. 4) Identify multistate collaborations and funding partners interested in these new products. Application of these new products will improve the overall safety and resilience of coastal communities exposed to tsunami hazards.
Software System Architecture Modeling Methodology for Naval Gun Weapon Systems
2010-12-01
[Only fragments of the report's front matter survive here: an acronym list (HAR - Hazard Action Report; HERO - Hazards of Electromagnetic Radiation to Ordnance; IOC - Initial Operational Capability; NDI - Non-Development Item; OPEVAL - Operational Evaluation; ORDALTS - Ordnance Alterations; O&SHA - Operating and Support Hazard Analysis; PDA) and a partial definition of safety-critical items as "equipment, systems, or procedures and processes whose malfunction would hazard the safe manufacturing...".]
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harper, Bryan; Thomas, Dennis G.; Chikkagoudar, Satish
The integration of rapid assays, large data sets, informatics and modeling can overcome current barriers in understanding nanomaterial structure-toxicity relationships by providing a weight-of-the-evidence mechanism to generate hazard rankings for nanomaterials. Here we present the use of a rapid, low-cost assay to perform screening-level toxicity evaluations of nanomaterials in vivo. Calculated EZ Metric scores, a combined measure of morbidity and mortality, were established at realistic exposure levels and used to develop a predictive model of nanomaterial toxicity. Hazard ranking and clustering analysis of 68 diverse nanomaterials revealed distinct patterns of toxicity related to both core composition and outermost surface chemistry of nanomaterials. The resulting clusters guided the development of a predictive model of gold nanoparticle toxicity to embryonic zebrafish. In addition, our findings suggest that risk assessments based on the size and core composition of nanomaterials alone may be wholly inappropriate, especially when considering complex engineered nanomaterials. These findings reveal the need to expeditiously increase the availability of quantitative measures of nanomaterial hazard and broaden the sharing of that data and knowledge to support predictive modeling. In addition, research should continue to focus on methodologies for developing predictive models of nanomaterial hazard based on sub-lethal responses to low dose exposures.
Application of an adaptive neuro-fuzzy inference system to ground subsidence hazard mapping
NASA Astrophysics Data System (ADS)
Park, Inhye; Choi, Jaewon; Jin Lee, Moung; Lee, Saro
2012-11-01
We constructed hazard maps of ground subsidence around abandoned underground coal mines (AUCMs) in Samcheok City, Korea, using an adaptive neuro-fuzzy inference system (ANFIS) and a geographical information system (GIS). To evaluate the factors related to ground subsidence, a spatial database was constructed from topographic, geologic, mine tunnel, land use, and ground subsidence maps. An attribute database was also constructed from field investigations and reports on existing ground subsidence areas at the study site. Five major factors causing ground subsidence were extracted: (1) depth of drift; (2) distance from drift; (3) slope gradient; (4) geology; and (5) land use. The adaptive ANFIS model with different types of membership functions (MFs) was then applied for ground subsidence hazard mapping in the study area. Two ground subsidence hazard maps were prepared using the different MFs. Finally, the resulting ground subsidence hazard maps were validated using the ground subsidence test data which were not used for training the ANFIS. The validation results showed 95.12% accuracy using the generalized bell-shaped MF model and 94.94% accuracy using the Sigmoidal2 MF model. These accuracy results show that an ANFIS can be an effective tool in ground subsidence hazard mapping. Analysis of ground subsidence with the ANFIS model suggests that quantitative analysis of ground subsidence near AUCMs is possible.
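The generalized bell-shaped membership function named in the validation results has a simple closed form; the sketch below evaluates it over a hypothetical "depth of drift" factor. The parameter values (a = width, b = slope, c = center) are illustrative assumptions, not values from the study.

```python
# Sketch of the generalized bell-shaped membership function used in ANFIS.
import numpy as np

def gbell_mf(x, a, b, c):
    """Generalized bell MF: 1 / (1 + |(x - c) / a|^(2b))."""
    return 1.0 / (1.0 + np.abs((x - c) / a) ** (2 * b))

depth_of_drift = np.linspace(0, 300, 7)                 # e.g. metres below surface
print(gbell_mf(depth_of_drift, a=60.0, b=2.0, c=100.0))  # membership in [0, 1]
```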
DEVELOPMENT AND ANALYSIS OF AIR QUALITY MODELING SIMULATIONS FOR HAZARDOUS AIR POLLUTANTS
The concentrations of five hazardous air pollutants were simulated using the Community Multi Scale Air Quality (CMAQ) modeling system. Annual simulations were performed over the continental United States for the entire year of 2001 to support human exposure estimates. Results a...
NASA Astrophysics Data System (ADS)
Harper, Bryan; Thomas, Dennis; Chikkagoudar, Satish; Baker, Nathan; Tang, Kaizhi; Heredia-Langner, Alejandro; Lins, Roberto; Harper, Stacey
2015-06-01
The integration of rapid assays, large datasets, informatics, and modeling can overcome current barriers in understanding nanomaterial structure-toxicity relationships by providing a weight-of-the-evidence mechanism to generate hazard rankings for nanomaterials. Here, we present the use of a rapid, low-cost assay to perform screening-level toxicity evaluations of nanomaterials in vivo. Calculated EZ Metric scores, a combined measure of morbidity and mortality in developing embryonic zebrafish, were established at realistic exposure levels and used to develop a hazard ranking of diverse nanomaterial toxicity. Hazard ranking and clustering analysis of 68 diverse nanomaterials revealed distinct patterns of toxicity related to both the core composition and outermost surface chemistry of nanomaterials. The resulting clusters guided the development of a surface chemistry-based model of gold nanoparticle toxicity. Our findings suggest that risk assessments based on the size and core composition of nanomaterials alone may be wholly inappropriate, especially when considering complex engineered nanomaterials. Research should continue to focus on methodologies for determining nanomaterial hazard based on multiple sub-lethal responses following realistic, low-dose exposures, thus increasing the availability of quantitative measures of nanomaterial hazard to support the development of nanoparticle structure-activity relationships.
Using a 3D CAD plant model to simplify process hazard reviews
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tolpa, G.
A Hazard and Operability (HAZOP) review is a formal predictive procedure used to identify potential hazard and operability problems associated with certain processes and facilities. The HAZOP procedure takes place several times during the life cycle of the facility. Replacing plastic models, layout and detail drawings with a 3D CAD electronic model provides access to process safety information and a detailed level of plant topology that approaches the visualization capability of the imagination. This paper describes the process used for adding a 3D CAD model to flowsheets and proven computer programs for the conduct of hazard and operability reviews. Using flowsheets and study nodes as a road map for the review, the need for layout and other detail drawings is all but eliminated. Using the 3D CAD model again for a post-P&ID HAZOP supports conformance to layout and safety requirements, provides superior visualization of the plant configuration and preserves the owner's equity in the design. The responses from the review teams are overwhelmingly in favor of this type of review over a review that uses only drawings. Over the long term the plant model serves more than just process hazards analysis. Ongoing use of the model can satisfy the required access to process safety information, OSHA documentation and other legal requirements. In this paper extensive instructions address the logic for the process hazards analysis and the preparation required to assist anyone who wishes to add the use of a 3D model to their review.
CHANGES TO THE CHEMICAL MECHANISMS FOR HAZARDOUS AIR POLLUTANTS IN CMAQ VERSION 4.6
The extended abstract describes a presentation to the 2006 conference of the Community Modeling and Analysis System. The presentation introduces two new mechanisms for the atmospheric photochemistry of Hazardous Air Pollutants (HAPs) to be used in regional air quality models. It ...
Two models for evaluating landslide hazards
Davis, J.C.; Chung, C.-J.; Ohlmacher, G.C.
2006-01-01
Two alternative procedures for estimating landslide hazards were evaluated using data on topographic digital elevation models (DEMs) and bedrock lithologies in an area adjacent to the Missouri River in Atchison County, Kansas, USA. The two procedures are based on the likelihood ratio model but utilize different assumptions. The empirical likelihood ratio model is based on non-parametric empirical univariate frequency distribution functions under an assumption of conditional independence, while the multivariate logistic discriminant model assumes that likelihood ratios can be expressed in terms of logistic functions. The relative hazards of occurrence of landslides were estimated by an empirical likelihood ratio model and by multivariate logistic discriminant analysis. Predictor variables consisted of grids containing topographic elevations, slope angles, and slope aspects calculated from a 30-m DEM. An integer grid of coded bedrock lithologies taken from digitized geologic maps was also used as a predictor variable. Both statistical models yield relative estimates in the form of the proportion of total map area predicted to already contain or to be the site of future landslides. The stabilities of estimates were checked by cross-validation of results from random subsamples, using each of the two procedures. Cell-by-cell comparisons of hazard maps made by the two models show that the two sets of estimates are virtually identical. This suggests that the empirical likelihood ratio and the logistic discriminant analysis models are robust with respect to the conditional independence assumption and the logistic function assumption, respectively, and that either model can be used successfully to evaluate landslide hazards. © 2006.
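A minimal sketch of the logistic-discriminant variant is given below, assuming hypothetical flattened grids for the four predictor types; it is meant only to show the shape of the computation, not the authors' exact procedure.

```python
# Sketch: logistic model of landslide occurrence from DEM-derived predictors.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000
elev = rng.uniform(200, 400, n)                      # elevation (m)
slope = rng.uniform(0, 35, n)                        # slope angle (deg)
aspect = rng.uniform(0, 360, n)                      # slope aspect (deg)
litho = rng.integers(0, 4, n)                        # coded bedrock unit
y = (slope + rng.normal(0, 8, n) > 25).astype(int)   # stand-in inventory label

# Dummy-code the categorical lithology grid; encode circular aspect via cosine.
litho_dummies = (litho[:, None] == np.arange(4)).astype(float)
X = np.column_stack([elev, slope, np.cos(np.radians(aspect)), litho_dummies])

model = LogisticRegression(max_iter=1000).fit(X, y)
susceptibility = model.predict_proba(X)[:, 1]        # relative hazard per cell
print(susceptibility[:5])
```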
An Integrated GIS-Expert System Framework for Live Hazard Monitoring and Detection.
McCarthy, James D; Graniero, Phil A; Rozic, Steven M
2008-02-08
In the context of hazard monitoring, using sensor web technology to monitor and detect hazardous conditions in near-real-time can result in large amounts of spatial data that can be used to drive analysis at an instrumented site. These data can be used for decision making and problem solving, however as with any analysis problem the success of analyzing hazard potential is governed by many factors such as: the quality of the sensor data used as input; the meaning that can be derived from those data; the reliability of the model used to describe the problem; the strength of the analysis methods; and the ability to effectively communicate the end results of the analysis. For decision makers to make use of sensor web data these issues must be dealt with to some degree. The work described in this paper addresses all of these areas by showing how raw sensor data can be automatically transformed into a representation which matches a predefined model of the problem context. This model can be understood by analysis software that leverages rule-based logic and inference techniques to reason with, and draw conclusions about, spatial data. These tools are integrated with a well known Geographic Information System (GIS) and existing geospatial and sensor web infrastructure standards, providing expert users with the tools needed to thoroughly explore a problem site and investigate hazards in any domain.
The role of building models in the evaluation of heat-related risks
NASA Astrophysics Data System (ADS)
Buchin, Oliver; Jänicke, Britta; Meier, Fred; Scherer, Dieter; Ziegler, Felix
2016-04-01
Hazard-risk relationships in epidemiological studies are generally based on the outdoor climate, despite the fact that most of humans' lifetime is spent indoors. By coupling indoor and outdoor climates with a building model, the risk concept developed can still be based on the outdoor conditions but also includes exposure to the indoor climate. The influence of non-linear building physics and the impact of air conditioning on heat-related risks can be assessed in a plausible manner using this risk concept. For proof of concept, the proposed risk concept is compared to a traditional risk analysis. As an example, daily and city-wide mortality data of the age group 65 and older in Berlin, Germany, for the years 2001-2010 are used. Four building models with differing complexity are applied in a time-series regression analysis. This study shows that indoor hazard better explains the variability in the risk data compared to outdoor hazard, depending on the kind of building model. Simplified parameter models include the main non-linear effects and are proposed for the time-series analysis. The concept shows that the definitions of heat events, lag days, and acclimatization in a traditional hazard-risk relationship are influenced by the characteristics of the prevailing building stock.
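One way to picture a "building model" that couples outdoor and indoor climate is a lumped-capacitance (1R1C) model. The sketch below is a toy version under stated assumptions (the time constant, gain term, and synthetic outdoor series are all hypothetical); it is not one of the four models used in the study, but it shows why indoor heat exposure lags and damps the outdoor signal.

```python
# Minimal 1R1C sketch: indoor temperature relaxes toward outdoor temperature
# plus a constant internal/solar gain term.
import numpy as np

def indoor_temperature(t_out, tau=48.0, gains=2.0, dt=1.0, t0=22.0):
    """tau: building time constant (h); gains: steady offset (K); dt: step (h)."""
    t_in = np.empty_like(t_out)
    t_in[0] = t0
    for i in range(1, len(t_out)):
        # Explicit Euler step of dT_in/dt = (T_out + gains - T_in) / tau.
        t_in[i] = t_in[i - 1] + (dt / tau) * (t_out[i] + gains - t_in[i - 1])
    return t_in

hours = np.arange(0, 24 * 7)
t_out = 25.0 + 8.0 * np.sin(2 * np.pi * hours / 24)  # stand-in heat-wave week
t_in = indoor_temperature(t_out)
print(t_in.max())  # indoor peak is damped and lags the outdoor peak
```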
Madadizadeh, Farzan; Ghanbarnejad, Amin; Ghavami, Vahid; Zare Bandamiri, Mohammad; Mohammadianpanah, Mohammad
2017-04-01
Introduction: Colorectal cancer (CRC) is a commonly fatal cancer that ranks third worldwide, and third and fifth among Iranian women and men, respectively. There are several methods for analyzing time-to-event data. Additive hazards regression models are preferable to the popular Cox proportional hazards model if the absolute hazard (risk) change, rather than the hazard ratio, is of primary concern, or if the proportionality assumption is not made. Methods: This study used data gathered from the medical records of 561 colorectal cancer patients who were admitted to Namazi Hospital, Shiraz, Iran, during 2005 to 2010 and followed until December 2015. The nonparametric Aalen additive hazards model, the semiparametric Lin and Ying additive hazards model and the Cox proportional hazards model were applied for data analysis. The proportionality assumption for the Cox model was evaluated with a test based on the Schoenfeld residuals, and goodness of fit of the additive models was assessed with Cox-Snell residual plots. Analyses were performed with SAS 9.2 and R 3.2 software. Results: The median follow-up time was 49 months. The five-year survival rate and the mean survival time after cancer diagnosis were 59.6% and 68.1±1.4 months, respectively. Multivariate analyses using the Lin and Ying additive model and the Cox proportional hazards model indicated that age at diagnosis, site of tumor, stage, proportion of positive lymph nodes, lymphovascular invasion and type of treatment were factors affecting survival of the CRC patients. Conclusion: Additive models are suitable alternatives to the Cox proportional hazards model when there is interest in evaluating absolute hazard change, or when no proportionality assumption is made.
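Both model families are available in the Python lifelines package; the sketch below, on synthetic data with hypothetical column names, mirrors the workflow described above: fit a Cox model, check proportionality with a Schoenfeld-residual-based test, then fit Aalen's additive model as the alternative. (The study itself used SAS and R; this Python version is illustrative only.)

```python
# Sketch: Cox PH fit with a proportionality check, plus Aalen's additive fit.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter, AalenAdditiveFitter
from lifelines.statistics import proportional_hazard_test

rng = np.random.default_rng(7)
df = pd.DataFrame({
    "time": rng.exponential(60, 300),     # follow-up (months), stand-in data
    "event": rng.integers(0, 2, 300),     # 1 = death
    "age": rng.normal(60, 10, 300),
    "stage": rng.integers(1, 5, 300),
})

cph = CoxPHFitter().fit(df, duration_col="time", event_col="event")
# Schoenfeld-residual-based test of the PH assumption, per covariate.
print(proportional_hazard_test(cph, df, time_transform="rank").summary)

aaf = AalenAdditiveFitter().fit(df, duration_col="time", event_col="event")
print(aaf.cumulative_hazards_.tail())     # time-varying additive effects
```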
Robust inference in discrete hazard models for randomized clinical trials.
Nguyen, Vinh Q; Gillen, Daniel L
2012-10-01
Time-to-event data in which failures are only assessed at discrete time points are common in many clinical trials. Examples include oncology studies where events are observed through periodic screenings such as radiographic scans. When the survival endpoint is acknowledged to be discrete, common methods for the analysis of observed failure times include the discrete hazard models (e.g., the discrete-time proportional hazards and the continuation ratio model) and the proportional odds model. In this manuscript, we consider estimation of a marginal treatment effect in discrete hazard models where the constant treatment effect assumption is violated. We demonstrate that the estimator resulting from these discrete hazard models is consistent for a parameter that depends on the underlying censoring distribution. An estimator that removes the dependence on the censoring mechanism is proposed and its asymptotic distribution is derived. Basing inference on the proposed estimator allows for statistical inference that is scientifically meaningful and reproducible. Simulation is used to assess the performance of the presented methodology in finite samples.
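A discrete-time hazard model of the kind discussed above can be fit as a logistic regression on the person-period expansion of the data. The sketch below uses synthetic scan-time data and hypothetical column names, and deliberately omits the paper's censoring-robust correction; it only shows the basic model being corrected.

```python
# Sketch: discrete-time hazard via logistic regression on person-period data.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
n, K = 200, 6
fail_period = rng.integers(1, K + 2, n)   # value K+1 means censored at scan K
treat = rng.integers(0, 2, n)

# Expand each subject into one row per period at risk.
rows = []
for i in range(n):
    for t in range(1, min(fail_period[i], K) + 1):
        rows.append({"period": t, "treat": treat[i],
                     "event": int(t == fail_period[i])})
pp = pd.DataFrame(rows)

# Period dummies give a nonparametric baseline hazard; `treat` is the effect.
X = pd.get_dummies(pp["period"], prefix="p").assign(treat=pp["treat"])
fit = LogisticRegression(max_iter=1000).fit(X, pp["event"])
print(dict(zip(X.columns, fit.coef_[0])))  # treat coef ~ log hazard-odds ratio
```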
Mousavi, S. Mostafa; Beroza, Gregory C.; Hoover, Susan M.
2018-01-01
Probabilistic seismic hazard analysis (PSHA) characterizes ground-motion hazard from earthquakes. Typically, the time horizon of a PSHA forecast is long, but in response to induced seismicity related to hydrocarbon development, the USGS developed one-year PSHA models. In this paper, we present a display of the variability in USGS hazard curves due to epistemic uncertainty in its informed submodel using a simple bootstrapping approach. We find that variability is highest in low-seismicity areas. On the other hand, areas of high seismic hazard, such as the New Madrid seismic zone or Oklahoma, exhibit relatively lower variability simply because of more available data and a better understanding of the seismicity. Comparing areas of high hazard, New Madrid, which has a history of large naturally occurring earthquakes, has lower forecast variability than Oklahoma, where the hazard is driven mainly by suspected induced earthquakes since 2009. Overall, the mean hazard obtained from bootstrapping is close to the published model, and variability increased in the 2017 one-year model relative to the 2016 model. Comparing the relative variations caused by individual logic-tree branches, we find that the highest hazard variation (as measured by the 95% confidence interval of bootstrapping samples) in the final model is associated with different ground-motion models and maximum magnitudes used in the logic tree, while the variability due to the smoothing distance is minimal. It should be pointed out that this study is not looking at the uncertainty in the hazard in general, but only as it is represented in the USGS one-year models.
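The bootstrapping idea is straightforward to sketch: resample branch-level hazard curves and read off percentile bands. The array shapes and values below are synthetic stand-ins, not the USGS logic-tree outputs.

```python
# Sketch: bootstrap variability of a mean hazard curve over logic-tree branches.
import numpy as np

rng = np.random.default_rng(11)
# Hypothetical curves: rows = logic-tree branches, columns = ground-motion bins.
branches = rng.lognormal(mean=-5, sigma=0.5, size=(200, 20))

n_boot = 1000
idx = rng.integers(0, branches.shape[0], size=(n_boot, branches.shape[0]))
boot_means = branches[idx].mean(axis=1)        # mean curve per resample

lo, med, hi = np.percentile(boot_means, [2.5, 50, 97.5], axis=0)
print(med[:5], (hi - lo)[:5])                  # central estimate, 95% CI width
```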
Probabilistic seismic hazard study based on active fault and finite element geodynamic models
NASA Astrophysics Data System (ADS)
Kastelic, Vanja; Carafa, Michele M. C.; Visini, Francesco
2016-04-01
We present a probabilistic seismic hazard analysis (PSHA) that is based exclusively on active faults and geodynamic finite element input models, with seismic catalogues used only in a posterior comparison. We applied the developed model in the External Dinarides, a slowly deforming thrust-and-fold belt at the contact between Adria and Eurasia. Our method consists of establishing two earthquake rupture forecast models: (i) a geological active fault input (GEO) model and (ii) a finite element (FEM) model. The GEO model is based on an active fault database that provides information on fault location and its geometric and kinematic parameters, together with estimates of its slip rate. By default, in this model all deformation is set to be released along the active faults. The FEM model is based on a numerical geodynamic model developed for the region of study. In this model the deformation is released not only along the active faults but also in the volumetric continuum elements. From both models we calculated the corresponding activity rates, earthquake rates and final expected peak ground accelerations. We investigated both the source model and the earthquake model uncertainties by varying the main active fault and earthquake rate calculation parameters through corresponding branches of the seismic hazard logic tree. Hazard maps and UHS curves have been produced for horizontal ground motion on bedrock conditions (VS30 ≥ 800 m/s), thereby not considering local site amplification effects. The hazard was computed over a 0.2° spaced grid considering 648 branches of the logic tree and the mean value of the 10% probability of exceedance in 50 years hazard level, while the 5th and 95th percentiles were also computed to investigate the model limits. We conducted a sensitivity analysis to determine which of the input parameters influence the final hazard results, and to what degree. The results of this comparison indicate that the deformation model, with its internal variability, and the choice of the ground motion prediction equations (GMPEs) are the most influential parameters; both have a significant effect on the hazard results. Thus, good knowledge of the existence of active faults and of their geometric and activity characteristics is of key importance. We also show that PSHA models based exclusively on active faults and geodynamic inputs, which are thus not dependent on past earthquake occurrences, provide a valid method for seismic hazard calculation.
Baum, Rex L.; Miyagi, Toyohiko; Lee, Saro; Trofymchuk, Oleksandr M
2014-01-01
Twenty papers were accepted into the session on landslide hazard mapping for oral presentation. The papers presented susceptibility and hazard analysis based on approaches ranging from field-based assessments to statistically based models to assessments that combined hydromechanical and probabilistic components. Many of the studies have taken advantage of increasing availability of remotely sensed data and nearly all relied on Geographic Information Systems to organize and analyze spatial data. The studies used a range of methods for assessing performance and validating hazard and susceptibility models. A few of the studies presented in this session also included some element of landslide risk assessment. This collection of papers clearly demonstrates that a wide range of approaches can lead to useful assessments of landslide susceptibility and hazard.
Causal Mediation Analysis of Survival Outcome with Multiple Mediators.
Huang, Yen-Tsung; Yang, Hwai-I
2017-05-01
Mediation analyses have been a popular approach to investigate the effect of an exposure on an outcome through a mediator. Mediation models with multiple mediators have been proposed for continuous and dichotomous outcomes. However, development of multimediator models for survival outcomes is still limited. We present methods for multimediator analyses using three survival models: Aalen additive hazards models, Cox proportional hazards models, and semiparametric probit models. Effects through mediators can be characterized by path-specific effects, for which definitions and identifiability assumptions are provided. We derive closed-form expressions for path-specific effects for the three models, which are intuitively interpreted using a causal diagram. Mediation analyses using Cox models under the rare-outcome assumption and Aalen additive hazards models consider effects on log hazard ratio and hazard difference, respectively; analyses using semiparametric probit models consider effects on difference in transformed survival time and survival probability. The three models were applied to a hepatitis study where we investigated effects of hepatitis C on liver cancer incidence mediated through baseline and/or follow-up hepatitis B viral load. The three methods show consistent results on respective effect scales, which suggest an adverse estimated effect of hepatitis C on liver cancer not mediated through hepatitis B, and a protective estimated effect mediated through the baseline (and possibly follow-up) hepatitis B viral load. Causal mediation analyses of survival outcome with multiple mediators are thus developed for additive hazards, proportional hazards, and probit models, with utility demonstrated in a hepatitis study.
Royston, Patrick; Parmar, Mahesh K B
2014-08-07
Most randomized controlled trials with a time-to-event outcome are designed and analysed under the proportional hazards assumption, with a target hazard ratio for the treatment effect in mind. However, the hazards may be non-proportional. We address how to design a trial under such conditions, and how to analyse the results. We propose to extend the usual approach, a logrank test, to also include the Grambsch-Therneau test of proportional hazards. We test the resulting composite null hypothesis using a joint test for the hazard ratio and for time-dependent behaviour of the hazard ratio. We compute the power and sample size for the logrank test under proportional hazards, and from that we compute the power of the joint test. For the estimation of relevant quantities from the trial data, various models could be used; we advocate adopting a pre-specified flexible parametric survival model that supports time-dependent behaviour of the hazard ratio. We present the mathematics for calculating the power and sample size for the joint test. We illustrate the methodology in real data from two randomized trials, one in ovarian cancer and the other in treating cellulitis. We show selected estimates and their uncertainty derived from the advocated flexible parametric model. We demonstrate in a small simulation study that when a treatment effect either increases or decreases over time, the joint test can outperform the logrank test in the presence of both patterns of non-proportional hazards. Those designing and analysing trials in the era of non-proportional hazards need to acknowledge that a more complex type of treatment effect is becoming more common. Our method for the design of the trial retains the tools familiar in the standard methodology based on the logrank test, and extends it to incorporate a joint test of the null hypothesis with power against non-proportional hazards. For the analysis of trial data, we propose the use of a pre-specified flexible parametric model that can represent a time-dependent hazard ratio if one is present.
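Under the simplifying assumption that the logrank and Grambsch-Therneau-type chi-square statistics can be treated as independent 1-df components, a joint test in the spirit described above can be sketched with the Python lifelines package as follows; the data and column names are hypothetical, and this is an illustrative construction rather than the authors' exact procedure.

```python
# Sketch: combine a logrank chi-square with a PH-test chi-square and refer the
# sum to a chi-square distribution with 2 degrees of freedom.
import numpy as np
import pandas as pd
from scipy.stats import chi2
from lifelines import CoxPHFitter
from lifelines.statistics import logrank_test, proportional_hazard_test

rng = np.random.default_rng(5)
df = pd.DataFrame({"time": rng.exponential(24, 400),   # stand-in trial data
                   "event": rng.integers(0, 2, 400),
                   "arm": rng.integers(0, 2, 400)})

lr = logrank_test(df.loc[df.arm == 0, "time"], df.loc[df.arm == 1, "time"],
                  df.loc[df.arm == 0, "event"], df.loc[df.arm == 1, "event"])

cph = CoxPHFitter().fit(df, duration_col="time", event_col="event")
ph = proportional_hazard_test(cph, df, time_transform="rank")

joint_stat = lr.test_statistic + ph.summary["test_statistic"].iloc[0]
print("joint p =", chi2.sf(joint_stat, df=2))
```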
Multi-hazard risk analysis related to hurricanes
NASA Astrophysics Data System (ADS)
Lin, Ning
Hurricanes present major hazards to the United States. Associated with extreme winds, heavy rainfall, and storm surge, landfalling hurricanes often cause enormous structural damage to coastal regions. Hurricane damage risk assessment provides the basis for loss mitigation and related policy-making. Current hurricane risk models, however, often oversimplify the complex processes of hurricane damage. This dissertation aims to improve existing hurricane risk assessment methodology by coherently modeling the spatial-temporal processes of storm landfall, hazards, and damage. Numerical modeling technologies are used to investigate the multiplicity of hazards associated with landfalling hurricanes. The application and effectiveness of current weather forecasting technologies to predict hurricane hazards are investigated. In particular, the Weather Research and Forecasting model (WRF), with the Geophysical Fluid Dynamics Laboratory (GFDL) hurricane initialization scheme, is applied to the simulation of the wind and rainfall environment during hurricane landfall. The WRF model is further coupled with the Advanced Circulation (ADCIRC) model to simulate storm surge in coastal regions. A case study examines the multiple hazards associated with Hurricane Isabel (2003). Also, a risk assessment methodology is developed to estimate the probability distribution of hurricane storm surge heights along the coast, particularly for data-scarce regions, such as New York City. This methodology makes use of relatively simple models, specifically a statistical/deterministic hurricane model and the Sea, Lake and Overland Surges from Hurricanes (SLOSH) model, to simulate large numbers of synthetic surge events, on which statistical analysis is conducted. The estimates of hurricane landfall probability and hazards are combined with structural vulnerability models to estimate hurricane damage risk. Wind-induced damage mechanisms are extensively studied. An innovative windborne debris risk model is developed based on the theory of Poisson random measure, substantiated by a large amount of empirical data. An advanced vulnerability assessment methodology is then developed, by integrating this debris risk model and a component-based pressure damage model, to predict storm-specific or annual damage to coastal residential neighborhoods. The uniqueness of this vulnerability model lies in its detailed description of the interaction between wind pressure and windborne debris effects over periods of strong winds, which is a major mechanism leading to structural failures during hurricanes.
Spatial prediction of landslide hazard using discriminant analysis and GIS
Peter V. Gorsevski; Paul Gessler; Randy B. Foltz
2000-01-01
Environmental attributes relevant for spatial prediction of landslides triggered by rain and snowmelt events were derived from a digital elevation model (DEM). Those data, in conjunction with statistics and a geographic information system (GIS), provided a detailed basis for spatial prediction of landslide hazard. The spatial prediction of landslide hazard in this paper is...
Paukatong, K V; Kunawasen, S
2001-01-01
Nham is a traditional Thai fermented pork sausage. The major ingredients of Nham are ground pork meat and shredded pork rind. Nham has been reported to be contaminated with Salmonella spp., Staphylococcus aureus, and Listeria monocytogenes. Therefore, it is a potential cause of foodborne disease for consumers. A Hazard Analysis and Critical Control Points (HACCP) generic model has been developed for the Nham process. Nham processing plants were observed and a generic flow diagram of Nham processes was constructed. A hazard analysis was then conducted. In addition to the microbial hazards posed by the pathogens previously found in Nham, sodium nitrite and metal were identified as chemical and physical hazards in this product, respectively. Four steps in the Nham process have been identified as critical control points: the weighing of the nitrite compound, stuffing, fermentation, and labeling. The chemical hazard of nitrite must be controlled during the weighing step. The critical limit of nitrite levels in the Nham mixture has been set at 100-200 ppm, which is high enough to control Clostridium botulinum but does not pose a chemical hazard to the consumer. The physical hazard from metal clips can be prevented by visual inspection of every Nham product during stuffing. The microbiological hazard in Nham can be reduced in the fermentation process, for which the critical limit of pH was set at lower than 4.6. Finally, since this product is not cooked during processing, educating the consumer by providing information on the label, such as "safe if cooked before consumption", could be an alternative way to prevent the microbiological hazards of this product.
The role of models in estimating consequences as part of the risk assessment process.
Forde-Folle, K; Mitchell, D; Zepeda, C
2011-08-01
The degree of disease risk represented by the introduction, spread, or establishment of one or several diseases through the importation of animals and animal products is assessed by importing countries through an analysis of risk. The components of a risk analysis include hazard identification, risk assessment, risk management, and risk communication. A risk assessment starts with identification of the hazard(s) and then continues with four interrelated steps: release assessment, exposure assessment, consequence assessment, and risk estimation. Risk assessments may be either qualitative or quantitative. This paper describes how, through the integration of epidemiological and economic models, the potential adverse biological and economic consequences of exposure can be quantified.
A conflict model for the international hazardous waste disposal dispute.
Hu, Kaixian; Hipel, Keith W; Fang, Liping
2009-12-15
A multi-stage conflict model is developed to analyze international hazardous waste disposal disputes. More specifically, the ongoing toxic waste conflicts are divided into two stages consisting of the dumping prevention and dispute resolution stages. The modeling and analyses, based on the methodology of graph model for conflict resolution (GMCR), are used in both stages in order to grasp the structure and implications of a given conflict from a strategic viewpoint. Furthermore, a specific case study is investigated for the Ivory Coast hazardous waste conflict. In addition to the stability analysis, sensitivity and attitude analyses are conducted to capture various strategic features of this type of complicated dispute.
NASA Astrophysics Data System (ADS)
Weigel, A. M.; Griffin, R.; Sever, T.
2014-12-01
The extent of the Maya civilization spanned portions of modern-day Mexico, Belize, Guatemala, El Salvador and Honduras. Paleoclimatic studies suggest this region has been affected by strong hurricanes for the past six thousand years, reinforced by archeological evidence from Mayan records indicating they experienced strong storms. It is theorized that hurricanes contributed to the collapse of the Maya, damaging building structures and agriculture and halting industrial activities. Today, this region is known for its active tropical climatology, having been hit by numerous strong storms including Hurricanes Dean, Iris, Keith, and Mitch. This research uses a geographic information system (GIS) to model hurricane hazards and assess the risk posed to the Maya civilization. GIS can combine the diverse data layers necessary for assessing the risk of hurricane-related hazards. For this analysis, high winds, storm surge flooding, non-storm-surge flooding, and rainfall-triggered landslides were selected as the primary hurricane hazards. Data sets used in this analysis include the National Climatic Data Center International Best Track Archive for Climate Stewardship (IBTrACS) hurricane tracks, the Shuttle Radar Topography Mission Digital Elevation Model, WorldClim monthly accumulated precipitation, USGS HydroSHEDS river locations, Harmonized World Soil Database soil types, and known Maya site locations from the Electronic Atlas of Ancient Maya Sites. ArcGIS and ENVI software were utilized to process data and model hurricane hazards. To assess locations at risk of experiencing high winds, a model was created using ArcGIS Model Builder to map each storm's temporal wind profile, adapted to simulate forward storm velocity and storm frequency. Modeled results were then combined with physical land characteristics and meteorological and hydrologic data to identify areas likely affected. Certain areas along the eastern edge of the Yucatan peninsula were found to be more prone to wind- and flood-related hurricane hazards. The methodologies developed in this analysis can be adapted for further hurricane risk assessment of archeological sites.
Seismic Hazard Analysis — Quo vadis?
NASA Astrophysics Data System (ADS)
Klügel, Jens-Uwe
2008-05-01
The paper is dedicated to the review of methods of seismic hazard analysis currently in use, analyzing the strengths and weaknesses of different approaches. The review is performed from the perspective of a user of the results of seismic hazard analysis for different applications such as the design of critical and general (non-critical) civil infrastructures, and technical and financial risk analysis. A set of criteria is developed for, and applied to, an objective assessment of the capabilities of different analysis methods. It is demonstrated that traditional probabilistic seismic hazard analysis (PSHA) methods have significant deficiencies, thus limiting their practical applications. These deficiencies have their roots in the use of inadequate probabilistic models and an insufficient understanding of modern concepts of risk analysis, as has been revealed in some recent large-scale studies. These deficiencies result in an inability to treat dependencies between physical parameters correctly and, finally, in an incorrect treatment of uncertainties. As a consequence, results of PSHA studies have been found to be unrealistic in comparison with empirical information from the real world. The attempt to compensate for these problems by a systematic use of expert elicitation has, so far, not resulted in any improvement of the situation. It is also shown that scenario-earthquakes developed by disaggregation from the results of a traditional PSHA may not be conservative with respect to energy conservation and should not be used for the design of critical infrastructures without validation. Because the assessment of technical as well as financial risks associated with potential earthquake damage requires a risk analysis, current methods are based on a probabilistic approach with its unresolved deficiencies. Traditional deterministic or scenario-based seismic hazard analysis methods provide a reliable and in general robust design basis for applications such as the design of critical infrastructures, especially with systematic sensitivity analyses based on validated phenomenological models. Deterministic seismic hazard analysis incorporates uncertainties in the safety factors. These factors are derived from experience as well as from expert judgment. Deterministic methods associated with high safety factors may lead to overly conservative results, especially if applied to generally short-lived civil structures. Scenarios used in deterministic seismic hazard analysis have a clear physical basis. They are related to seismic sources discovered by geological, geomorphologic, geodetic and seismological investigations or derived from historical references. Scenario-based methods can be expanded for risk analysis applications with an extended data analysis providing the frequency of seismic events. Such an extension provides a better informed risk model that is suitable for risk-informed decision making.
Extended Cox regression model: The choice of time function
NASA Astrophysics Data System (ADS)
Isik, Hatice; Tutkun, Nihal Ata; Karasoy, Durdu
2017-07-01
The Cox regression model (CRM), which takes into account the effect of censored observations, is one of the most widely applied models in survival analysis for evaluating the effects of covariates. Proportional hazards (PH), which requires a constant hazard ratio over time, is the key assumption of the CRM. The extended CRM provides a test of the PH assumption, by including a time-dependent covariate, and serves as an alternative model in the case of non-proportional hazards. In this study, different types of real data sets are used to choose the time function, and the differences between time functions are analyzed and discussed.
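As an illustration of the PH check and time-function choice discussed above, here is a hedged sketch using the Python lifelines package. The data set (lifelines' bundled recidivism data) and the log time transform are illustrative choices, not the paper's.

```python
from lifelines import CoxPHFitter
from lifelines.datasets import load_rossi
from lifelines.statistics import proportional_hazard_test

df = load_rossi()                      # built-in recidivism data set
cph = CoxPHFitter()
cph.fit(df, duration_col="week", event_col="arrest")

# Scaled Schoenfeld residual test of PH against a chosen time function;
# time_transform="log" corresponds to interacting covariates with log(t).
result = proportional_hazard_test(cph, df, time_transform="log")
result.print_summary()

# A covariate that fails the test can then be given a time-dependent
# coefficient (an extended Cox model) via episode splitting and
# lifelines' time-varying fitter.
```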
Ng, S K; McLachlan, G J
2003-04-15
We consider a mixture model approach to the regression analysis of competing-risks data. Attention is focused on inference concerning the effects of factors on both the probability of occurrence and the hazard rate conditional on each of the failure types. These two quantities are specified in the mixture model using the logistic model and the proportional hazards model, respectively. We propose a semi-parametric mixture method to estimate the logistic and regression coefficients jointly, whereby the component-baseline hazard functions are completely unspecified. Estimation is based on maximum likelihood on the basis of the full likelihood, implemented via an expectation-conditional maximization (ECM) algorithm. Simulation studies are performed to compare the performance of the proposed semi-parametric method with a fully parametric mixture approach. The results show that when the component-baseline hazard is monotonic increasing, the semi-parametric and fully parametric mixture approaches are comparable for mildly and moderately censored samples. When the component-baseline hazard is not monotonic increasing, the semi-parametric method consistently provides less biased estimates than a fully parametric approach and is comparable in efficiency in the estimation of the parameters for all levels of censoring. The methods are illustrated using a real data set of prostate cancer patients treated with different dosages of the drug diethylstilbestrol. Copyright 2003 John Wiley & Sons, Ltd.
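To make the mixture formulation concrete, below is a hedged, fully parametric stand-in: logistic mixing proportions with exponential component hazards, fit jointly by maximum likelihood on simulated data. The paper's semi-parametric method leaves the component-baseline hazards unspecified and uses an ECM algorithm; the exponential baselines, simulated data, and variable names here are all illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_lik(theta, x, t, cause):
    # theta = [a, b, log_lam1, beta1, log_lam2, beta2]
    a, b, ll1, b1, ll2, b2 = theta
    p1 = 1.0 / (1.0 + np.exp(-(a + b * x)))        # P(failure type 1 | x)
    lam1 = np.exp(ll1 + b1 * x)                    # type-1 exponential rate
    lam2 = np.exp(ll2 + b2 * x)                    # type-2 exponential rate
    s1, s2 = np.exp(-lam1 * t), np.exp(-lam2 * t)  # component survivals
    f1, f2 = lam1 * s1, lam2 * s2                  # component densities
    surv = p1 * s1 + (1.0 - p1) * s2               # overall survival
    ll = np.where(cause == 1, np.log(p1 * f1),
         np.where(cause == 2, np.log((1.0 - p1) * f2), np.log(surv)))
    return -ll.sum()

rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)
p1 = 1.0 / (1.0 + np.exp(-(0.5 + 1.0 * x)))        # true mixing proportions
typ = np.where(rng.uniform(size=n) < p1, 1, 2)     # true failure type
lam = np.where(typ == 1, np.exp(-1.0 + 0.5 * x), np.exp(-0.5 - 0.3 * x))
t = rng.exponential(1.0 / lam)
cens = rng.exponential(2.0, size=n)                # independent censoring
cause = np.where(t <= cens, typ, 0)                # 0 = censored
t = np.minimum(t, cens)

fit = minimize(neg_log_lik, x0=np.zeros(6), args=(x, t, cause),
               method="Nelder-Mead", options={"maxiter": 5000})
print(fit.x)   # estimates of the logistic and hazard coefficients
```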
Hazard Regression Models of Early Mortality in Trauma Centers
Clark, David E; Qian, Jing; Winchell, Robert J; Betensky, Rebecca A
2013-01-01
Background Factors affecting early hospital deaths after trauma may be different from factors affecting later hospital deaths, and the distribution of short and long prehospital times may vary among hospitals. Hazard regression (HR) models may therefore be more useful than logistic regression (LR) models for analysis of trauma mortality, especially when treatment effects at different time points are of interest. Study Design We obtained data for trauma center patients from the 2008–9 National Trauma Data Bank (NTDB). Cases were included if they had complete data for prehospital times, hospital times, survival outcome, age, vital signs, and severity scores. Cases were excluded if pulseless on admission, transferred in or out, or ISS<9. Using covariates proposed for the Trauma Quality Improvement Program and an indicator for each hospital, we compared LR models predicting survival at 8 hours after injury to HR models with survival censored at 8 hours. HR models were then modified to allow time-varying hospital effects. Results 85,327 patients in 161 hospitals met inclusion criteria. Crude hazards peaked initially, then steadily declined. When hazard ratios were assumed constant in HR models, they were similar to odds ratios in LR models associating increased mortality with increased age, firearm mechanism, increased severity, more deranged physiology, and estimated hospital-specific effects. However, when hospital effects were allowed to vary by time, HR models demonstrated that hospital outliers were not the same at different times after injury. Conclusions HR models with time-varying hazard ratios reveal inconsistencies in treatment effects, data quality, and/or timing of early death among trauma centers. HR models are generally more flexible than LR models, can be adapted for censored data, and potentially offer a better tool for analysis of factors affecting early death after injury. PMID:23036828
NASA Astrophysics Data System (ADS)
Sadegh, M.; Moftakhari, H.; AghaKouchak, A.
2017-12-01
Many natural hazards are driven by multiple forcing variables, and concurrent/consecutive extreme events significantly increase the risk of infrastructure/system failure. It is common practice to use univariate analysis based upon a perceived ruling driver to estimate design quantiles and/or return periods of extreme events. A multivariate analysis, however, permits modeling the simultaneous occurrence of multiple forcing variables. In this presentation, we introduce the Multi-hazard Assessment and Scenario Toolbox (MhAST) that comprehensively analyzes marginal and joint probability distributions of natural hazards. MhAST also offers a wide range of scenarios of return periods and design levels and their likelihoods. The contribution of this study is four-fold: 1. comprehensive analysis of the marginal and joint probability of multiple drivers through 17 continuous distributions and 26 copulas, 2. multiple scenario analysis of concurrent extremes based upon the most likely joint occurrence, one ruling variable, and weighted random sampling of joint occurrences with similar exceedance probabilities, 3. weighted average scenario analysis based on an expected event, and 4. uncertainty analysis of the most likely joint occurrence scenario using a Bayesian framework.
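A small sketch of the copula-based joint analysis this kind of toolbox performs: the joint exceedance probability and "AND-scenario" return period of two drivers under an assumed Gumbel copula. The copula family, theta value, and quantiles are illustrative, not MhAST outputs.

```python
import numpy as np

def gumbel_copula(u, v, theta):
    """Gumbel copula C(u, v); theta >= 1, theta = 1 means independence."""
    return np.exp(-(((-np.log(u)) ** theta + (-np.log(v)) ** theta)
                    ** (1.0 / theta)))

def and_return_period(u, v, theta, mu_years=1.0):
    """Return period of {U > u AND V > v}, events every mu_years on average."""
    p_joint = 1.0 - u - v + gumbel_copula(u, v, theta)
    return mu_years / p_joint

# both drivers at their marginal 0.99 quantiles
print(and_return_period(0.99, 0.99, theta=2.0))   # upper-tail dependence
print(and_return_period(0.99, 0.99, theta=1.0))   # independence: much longer
```

The comparison shows why a univariate analysis of a single "ruling" driver can badly misstate the rarity of compound events when the drivers are tail-dependent.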
Assessing the risk posed by natural hazards to infrastructures
NASA Astrophysics Data System (ADS)
Eidsvig, Unni; Kristensen, Krister; Vidar Vangelsten, Bjørn
2015-04-01
Modern society is increasingly dependent on infrastructures to maintain its function, and disruption in one infrastructure system may have severe consequences. Norwegian municipalities have, according to legislation, a duty to carry out a risk and vulnerability analysis and to plan and prepare for emergencies in a short- and long-term perspective. Vulnerability analysis of the infrastructures and their interdependencies is an important part of this analysis. This paper proposes a model for assessing the risk posed by natural hazards to infrastructures. The model prescribes a three-level analysis with increasing level of detail, moving from qualitative to quantitative analysis. This paper focuses on the second level, which consists of a semi-quantitative analysis. The purpose of this analysis is to screen the scenarios of natural hazards threatening the infrastructures identified in the level 1 analysis and to investigate the need for further analyses, i.e. level 3 quantitative analyses. The proposed level 2 analysis considers the frequency of the natural hazard and different aspects of vulnerability, including the physical vulnerability of the infrastructure itself and the societal dependency on the infrastructure. An indicator-based approach is applied, ranking the indicators on a relative scale. The proposed indicators characterize the robustness of the infrastructure, the importance of the infrastructure, and the interdependencies between society and infrastructure affecting the potential for cascading effects. Each indicator is ranked on a 1-5 scale based on pre-defined ranking criteria. The aggregated risk estimate is a combination of the semi-quantitative vulnerability indicators with quantitative estimates of the frequency of the natural hazard and the number of users of the infrastructure. Case studies for two Norwegian municipalities are presented, in which the risk posed by storms and landslides to a primary road, water supply and power network is assessed. The application examples show that the proposed model provides a useful tool for screening undesirable events, with the ultimate goal of reducing societal vulnerability.
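A minimal sketch of how such a level-2 semi-quantitative aggregation might be computed: indicators ranked 1-5 are weighted into a vulnerability index and combined with hazard frequency and number of users. The weights and the multiplicative aggregation rule are illustrative assumptions, not the paper's calibrated scheme.

```python
indicators = {            # 1 = favourable, 5 = unfavourable
    "robustness": 4,
    "importance": 5,
    "interdependency": 3,
}
weights = {"robustness": 0.4, "importance": 0.4, "interdependency": 0.2}

# normalized vulnerability index in (0, 1]
vulnerability = sum(weights[k] * indicators[k] for k in indicators) / 5.0

frequency = 0.05     # expected hazard events per year (hypothetical)
users = 12000        # persons depending on the infrastructure (hypothetical)

risk_score = frequency * vulnerability * users   # expected affected users/year
print(f"vulnerability: {vulnerability:.2f}, risk score: {risk_score:.0f}")
```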
Broad-Scale Assessment of Fuel Treatment Opportunities
Patrick D. Miles; Kenneth E. Skog; Wayne D. Shepperd; Elizabeth D. Reinhardt; Roger D. Fight
2006-01-01
The Forest Inventory and Analysis (FIA) program has produced estimates of the extent and composition of the Nation's forests for several decades. FIA data have been used with a flexible silvicultural thinning option, a fire hazard model for preharvest and postharvest fire hazard assessment, a harvest economics model, and geospatial data to produce a Web-based tool to...
NASA Astrophysics Data System (ADS)
Egozcue, J. J.; Pawlowsky-Glahn, V.; Ortego, M. I.
2005-03-01
Standard practice in wave-height hazard analysis often pays little attention to the uncertainty of assessed return periods and occurrence probabilities. This fact favors the opinion that, when large events happen, the hazard assessment should change accordingly. However, the uncertainty of the hazard estimates is normally able to hide the effect of those large events. This is illustrated using data from the Mediterranean coast of Spain, where the last years have been extremely disastrous. Thus, it is possible to compare the hazard assessment based on data prior to those years with the analysis including them. With our approach, no significant change is detected when the statistical uncertainty is taken into account. The hazard analysis is carried out with a standard model. Time-occurrence of events is assumed Poisson distributed. The wave height of each event is modelled as a random variable whose upper tail follows a Generalized Pareto Distribution (GPD). Moreover, wave heights are assumed independent from event to event and also independent of their occurrence in time. A threshold for excesses is assessed empirically. The other three parameters (Poisson rate, and the shape and scale parameters of the GPD) are jointly estimated using Bayes' theorem. The prior distribution accounts for physical features of ocean waves in the Mediterranean sea and experience with these phenomena. The posterior distribution of the parameters yields posterior distributions of derived parameters such as occurrence probabilities and return periods. Predictive distributions are also available. Computations are carried out using the program BGPE v2.0.
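A sketch of the return-level calculation underlying this Poisson-GPD model: for an exceedance rate lam (events/year) over threshold u, with GPD shape xi and scale sigma, the N-year return level is z_N = u + (sigma/xi)((lam*N)^xi - 1) for xi != 0. The parameter values below are illustrative placeholders, not the paper's posteriors.

```python
import numpy as np

def return_level(n_years, u, sigma, xi, lam):
    """N-year return level for a Poisson(lam)-GPD(u, sigma, xi) model."""
    if abs(xi) < 1e-9:                        # xi -> 0 (Gumbel) limit
        return u + sigma * np.log(lam * n_years)
    return u + (sigma / xi) * ((lam * n_years) ** xi - 1.0)

print(return_level(50, u=3.0, sigma=0.8, xi=0.05, lam=3.0))  # 50-yr Hs (m)
```

In the Bayesian version, (lam, sigma, xi) are drawn from their joint posterior and this function is applied to each draw, giving a posterior distribution, and hence an uncertainty band, for the return level itself; that band is what can hide the apparent effect of a few recent large events.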
Interpretation of laser/multi-sensor data for short range terrain modeling and hazard detection
NASA Technical Reports Server (NTRS)
Messing, B. S.
1980-01-01
A terrain modeling algorithm that reconstructs the sensed ground images formed by the triangulation scheme and classifies as unsafe any terrain feature that would pose a hazard to a roving vehicle is described. This modeler greatly reduces quantization errors inherent in a laser/sensing system through the use of a thinning algorithm. Dual filters are employed to separate terrain steps from the general landscape, simplifying the analysis of terrain features. A crosspath analysis is utilized to detect and avoid obstacles that would adversely affect the roll of the vehicle. Computer simulations of the rover on various terrains examine the performance of the modeler.
The credibility challenge for global fluvial flood risk analysis
NASA Astrophysics Data System (ADS)
Trigg, M. A.; Birch, C. E.; Neal, J. C.; Bates, P. D.; Smith, A.; Sampson, C. C.; Yamazaki, D.; Hirabayashi, Y.; Pappenberger, F.; Dutra, E.; Ward, P. J.; Winsemius, H. C.; Salamon, P.; Dottori, F.; Rudari, R.; Kappes, M. S.; Simpson, A. L.; Hadzilacos, G.; Fewtrell, T. J.
2016-09-01
Quantifying flood hazard is an essential component of resilience planning, emergency response, and mitigation, including insurance. Flood hazard quantification has traditionally been undertaken at catchment and national scales; recently, efforts have intensified to estimate flood risk globally, to better allow consistent and equitable decision making. Global flood hazard models are now a practical reality, thanks to improvements in numerical algorithms, global datasets, computing power, and coupled modelling frameworks. Outputs of these models are vital for consistent quantification of global flood risk and in projecting the impacts of climate change. However, the urgency of these tasks means that outputs are being used as soon as they are made available and before such methods have been adequately tested. To address this, we compare multi-probability flood hazard maps for Africa from six global models and show wide variation in their flood hazard, economic loss and exposed population estimates, which has serious implications for model credibility. While there is around 30%-40% agreement in flood extent, our results show that even at continental scales, there are significant differences in hazard magnitude and spatial pattern between models, notably in deltas, arid/semi-arid zones and wetlands. This study is an important step towards a better understanding of modelling global flood hazard, which is urgently required for both current risk and climate change projections.
NASA Astrophysics Data System (ADS)
García-Mayordomo, J.; Gaspar-Escribano, J. M.; Benito, B.
2007-10-01
A probabilistic seismic hazard assessment of the Province of Murcia in terms of peak ground acceleration (PGA) and spectral accelerations [SA(T)] is presented in this paper. In contrast to most previous studies in the region, which were performed for PGA making use of intensity-to-PGA relationships, hazard is here calculated in terms of magnitude and using European spectral ground-motion models. Moreover, we have considered the most important faults in the region as specific seismic sources, and have comprehensively reviewed the earthquake catalogue. Hazard calculations are performed following the probabilistic seismic hazard assessment (PSHA) methodology using a logic tree, which accounts for three different seismic source zonings and three different ground-motion models. Hazard maps in terms of PGA and SA(0.1, 0.2, 0.5, 1.0 and 2.0 s) and coefficient of variation (COV) for the 475-year return period are shown. Subsequent analysis is focused on three sites of the province, namely, the cities of Murcia, Lorca and Cartagena, which are important industrial and tourism centres. Results at these sites have been analysed to evaluate the influence of the different input options. The most important factor affecting the results is the choice of the attenuation relationship, whereas the influence of the selected seismic source zoning appears strongly site dependent. Finally, we have performed an analysis of source contribution to hazard at each of these cities to provide preliminary guidance in devising specific risk scenarios. We have found that local source zones control the hazard for PGA and SA(T ≤ 1.0 s), although the contribution from specific fault sources and long-distance north Algerian sources becomes significant from SA(0.5 s) onwards.
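A minimal sketch of the logic-tree mechanics described above: hazard curves computed for each branch (3 source zonings x 3 ground-motion models) are combined with branch weights into a mean hazard curve and a coefficient of variation. The weights and placeholder curves are illustrative, not the study's values.

```python
import numpy as np

pga = np.array([0.05, 0.1, 0.2, 0.3, 0.5])          # PGA levels (g)
w_zoning = np.array([0.4, 0.3, 0.3])                # branch weights
w_gmpe = np.array([0.5, 0.3, 0.2])
weights = np.outer(w_zoning, w_gmpe).ravel()        # 9 weights, sum to 1

# placeholder annual-exceedance curves, one per branch: a power-law shape
# whose level varies by branch (stand-ins for real PSHA output)
scale = np.linspace(0.8, 1.2, 9)[:, None]
branch_curves = scale * 1e-3 * (0.1 / pga) ** 1.8   # shape (9, n_levels)

mean_curve = weights @ branch_curves                # weighted mean hazard
std_curve = np.sqrt(weights @ (branch_curves - mean_curve) ** 2)
print(mean_curve)
print(std_curve / mean_curve)                       # COV across branches
```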
Mueller, C.S.
2010-01-01
I analyze the sensitivity of seismic-hazard estimates in the central and eastern United States (CEUS) to maximum magnitude (mmax) by exercising the U.S. Geological Survey (USGS) probabilistic hazard model with several mmax alternatives. Seismicity-based sources control the hazard in most of the CEUS, but data seldom provide an objective basis for estimating mmax. The USGS uses preferred mmax values of moment magnitude 7.0 and 7.5 for the CEUS craton and extended margin, respectively, derived from data in stable continental regions worldwide. Other approaches, for example analysis of local seismicity or judgment about a source's seismogenic potential, often lead to much smaller mmax. Alternative models span the mmax ranges from the 1980s Electric Power Research Institute/Seismicity Owners Group (EPRI/SOG) analysis. Results are presented as hazard ratios relative to the USGS national seismic hazard maps. One alternative model specifies mmax equal to moment magnitude 5.0 and 5.5 for the craton and margin, respectively, similar to EPRI/SOG for some sources. For 2% probability of exceedance in 50 years (about 0.0004 annual probability), the strong mmax truncation produces hazard ratios equal to 0.35-0.60 for 0.2-sec spectral acceleration, and 0.15-0.35 for 1.0-sec spectral acceleration. Hazard-controlling earthquakes interact with mmax in complex ways. There is a relatively weak dependence on probability level: hazard ratios increase 0-15% for 0.002 annual exceedance probability and decrease 5-25% for 0.00001 annual exceedance probability. Although differences at some sites are tempered when faults are added, mmax clearly accounts for some of the discrepancies that are seen in comparisons between USGS-based and EPRI/SOG-based hazard results.
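A sketch of how an mmax cap enters a seismicity-based source: annual rates from a doubly truncated Gutenberg-Richter relation, showing that lowering mmax removes the large-magnitude events that drive hazard. The a- and b-values are illustrative, not the USGS model's.

```python
import numpy as np

def annual_rate_above(m, a=4.0, b=1.0, m_min=5.0, m_max=7.5):
    """Annual rate of events >= m under a doubly truncated GR relation."""
    beta = b * np.log(10.0)
    n_min = 10.0 ** (a - b * m_min)          # total rate of m >= m_min
    num = np.exp(-beta * (m - m_min)) - np.exp(-beta * (m_max - m_min))
    den = 1.0 - np.exp(-beta * (m_max - m_min))
    return n_min * np.clip(num / den, 0.0, None)

for mmax in (7.5, 5.5):                      # USGS-style vs strong truncation
    print(mmax, annual_rate_above(6.5, m_max=mmax))
```

With mmax = 5.5 the rate of M >= 6.5 events drops to zero, which is the mechanism behind the hazard ratios well below 1 reported above.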
Landslide risk models for decision making.
Bonachea, Jaime; Remondo, Juan; de Terán, José Ramón Díaz; González-Díez, Alberto; Cendrero, Antonio
2009-11-01
This contribution presents a quantitative procedure for landslide risk analysis and zoning considering hazard, exposure (or value of elements at risk), and vulnerability. The method provides the means to obtain landslide risk models (expressing expected damage due to landslides on material elements and economic activities in monetary terms, according to different scenarios and periods) useful to identify areas where mitigation efforts will be most cost effective. It allows identifying priority areas for the implementation of actions to reduce vulnerability (elements) or hazard (processes). The procedure proposed can also be used as a preventive tool, through its application to strategic environmental impact analysis (SEIA) of land-use plans. The underlying hypothesis is that reliable predictions about hazard and risk can be made using models based on a detailed analysis of past landslide occurrences in connection with conditioning factors and data on past damage. The results show that the approach proposed and the hypothesis formulated are essentially correct, providing estimates of the order of magnitude of expected losses for a given time period. Uncertainties, strengths, and shortcomings of the procedure and results obtained are discussed and potential lines of research to improve the models are indicated. Finally, comments and suggestions are provided to generalize this type of analysis.
Safety assessment of plant varieties using transcriptomics profiling and a one-class classifier.
van Dijk, Jeroen P; de Mello, Carla Souza; Voorhuijzen, Marleen M; Hutten, Ronald C B; Arisi, Ana Carolina Maisonnave; Jansen, Jeroen J; Buydens, Lutgarde M C; van der Voet, Hilko; Kok, Esther J
2014-10-01
An important part of the current hazard identification of novel plant varieties is comparative targeted analysis of the novel and reference varieties. Comparative analysis will become much more informative with unbiased analytical approaches, e.g. omics profiling. Data analysis estimating the similarity of new varieties to a reference baseline class of known safe varieties would subsequently greatly facilitate hazard identification. Further biological and eventually toxicological analysis would then only be necessary for varieties that fall outside this reference class. For this purpose, a one-class classifier tool was explored to assess and classify transcriptome profiles of potato (Solanum tuberosum) varieties in a model study. Profiles of six different varieties, from two locations of growth and two years of harvest, and including biological and technical replication, were used to build the model. Two scenarios were applied, representing evaluation of a 'different' variety and a 'similar' variety. Within the model, higher class distances resulted for the 'different' test set compared with the 'similar' test set. The present study may contribute to a more global hazard identification of novel plant varieties. Copyright © 2014 Elsevier Inc. All rights reserved.
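A hedged sketch of the general one-class idea: train on a reference class of known safe profiles and flag new profiles that fall outside it. The paper's actual classifier and distance measure may differ; scikit-learn's OneClassSVM, the PCA step, and the synthetic data below are illustrative stand-ins for high-dimensional transcriptome profiles.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
X_ref = rng.normal(0.0, 1.0, (48, 2000))     # reference varieties' profiles
X_similar = rng.normal(0.0, 1.0, (6, 2000))  # 'similar' test variety
X_diff = rng.normal(0.8, 1.0, (6, 2000))     # 'different' test variety

# reduce dimensionality before fitting the one-class boundary, as is
# common for transcriptome data where features far outnumber samples
clf = make_pipeline(StandardScaler(), PCA(n_components=10),
                    OneClassSVM(nu=0.1, kernel="rbf", gamma="scale"))
clf.fit(X_ref)
print(clf.predict(X_similar))   # mostly +1: inside the reference class
print(clf.predict(X_diff))      # mostly -1: flagged for further analysis
```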
NASA Astrophysics Data System (ADS)
Weatherill, Graeme; Garcia, Julio; Poggi, Valerio; Chen, Yen-Shin; Pagani, Marco
2016-04-01
The Global Earthquake Model (GEM) has, since its inception in 2009, made many contributions to the practice of seismic hazard modeling in different regions of the globe. The OpenQuake-engine (hereafter referred to simply as OpenQuake), GEM's open-source software for calculation of earthquake hazard and risk, has found application in many countries, spanning a diversity of tectonic environments. GEM itself has produced a database of national and regional seismic hazard models, harmonizing into OpenQuake's own definition the varied seismogenic sources found therein. The characterization of active faults in probabilistic seismic hazard analysis (PSHA) is at the centre of this process, motivating many of the developments in OpenQuake and presenting hazard modellers with the challenge of reconciling seismological, geological and geodetic information for the different regions of the world. Faced with these challenges, and from the experience gained in the process of harmonizing existing models of seismic hazard, four critical issues are addressed. The challenge GEM has faced in the development of software is how to define a representation of an active fault (both in terms of geometry and earthquake behaviour) that is sufficiently flexible to adapt to different tectonic conditions and levels of data completeness. By exploring the different fault typologies supported by OpenQuake we illustrate how seismic hazard calculations can, and do, take into account complexities such as geometrical irregularity of faults in the prediction of ground motion, highlighting some of the potential pitfalls and inconsistencies that can arise. This exploration leads to the second main challenge in active fault modeling, what elements of the fault source model impact most upon the hazard at a site, and when does this matter? Through a series of sensitivity studies we show how different configurations of fault geometry, and the corresponding characterisation of near-fault phenomena (including hanging wall and directivity effects) within modern ground motion prediction equations, can have an influence on the seismic hazard at a site. Yet we also illustrate the conditions under which these effects may be partially tempered when considering the full uncertainty in rupture behaviour within the fault system. The third challenge is the development of efficient means for representing both aleatory and epistemic uncertainties from active fault models in PSHA. In implementing state-of-the-art seismic hazard models into OpenQuake, such as those recently undertaken in California and Japan, new modeling techniques are needed that redefine how we treat interdependence of ruptures within the model (such as mutual exclusivity), and the propagation of uncertainties emerging from geology. Finally, we illustrate how OpenQuake, and GEM's additional toolkits for model preparation, can be applied to address long-standing issues in active fault modeling in PSHA. These include constraining the seismogenic coupling of a fault and the partitioning of seismic moment between the active fault surfaces and the surrounding seismogenic crust. We illustrate some of the possible roles that geodesy can play in the process, but highlight where this may introduce new uncertainties and potential biases into the seismic hazard process, and how these can be addressed.
Bowie, Paul; Price, Julie; Hepworth, Neil; Dinwoodie, Mark; McKay, John
2015-11-27
To analyse a medical protection organisation's database to identify hazards related to general practice systems for ordering laboratory tests, managing test results and communicating test result outcomes to patients, and to integrate these data with other published evidence sources to inform the design of a systems-based conceptual model of related hazards. A retrospective database analysis. General practices in the UK and Ireland. 778 UK and Ireland general practices participating in a medical protection organisation's clinical risk self-assessment (CRSA) programme from January 2008 to December 2014. Proportion of practices with system risks; categorisation of identified hazards; most frequently occurring hazards; development of a conceptual model of hazards; and potential impacts on health, well-being and organisational performance. CRSA visits were undertaken at 778 UK and Ireland general practices, and a range of system hazards was recorded across the laboratory test ordering and results management systems in 647 practices (83.2%). A total of 45 discrete hazard categories were identified, with a mean of 3.6 per practice (SD=1.94). The most frequently occurring hazard was an inadequate process for matching test requests and results received (n=350, 54.1%). Of the 1604 instances where hazards were recorded, the most frequent was at the 'postanalytical test stage' (n=702, 43.8%), followed closely by 'communication outcomes issues' (n=628, 39.1%). Based on arguably the largest data set currently available on the subject matter, our study findings shed new light on the scale and nature of hazards related to test results handling systems, which can inform future efforts to research and improve the design and reliability of these systems. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
“Smooth” Semiparametric Regression Analysis for Arbitrarily Censored Time-to-Event Data
Zhang, Min; Davidian, Marie
2008-01-01
Summary A general framework for regression analysis of time-to-event data subject to arbitrary patterns of censoring is proposed. The approach is relevant when the analyst is willing to assume that distributions governing model components that are ordinarily left unspecified in popular semiparametric regression models, such as the baseline hazard function in the proportional hazards model, have densities satisfying mild “smoothness” conditions. Densities are approximated by a truncated series expansion that, for fixed degree of truncation, results in a “parametric” representation, which makes likelihood-based inference coupled with adaptive choice of the degree of truncation, and hence flexibility of the model, computationally and conceptually straightforward with data subject to any pattern of censoring. The formulation allows popular models, such as the proportional hazards, proportional odds, and accelerated failure time models, to be placed in a common framework; provides a principled basis for choosing among them; and renders useful extensions of the models straightforward. The utility and performance of the methods are demonstrated via simulations and by application to data from time-to-event studies. PMID:17970813
NASA Astrophysics Data System (ADS)
Cramer, C. H.; Dhar, M. S.
2017-12-01
The influence of the deep sediment deposits of the Mississippi Embayment (ME) on the propagation of seismic waves is poorly understood and remains a major source of uncertainty for site response analysis. Many researchers have studied the effects of these deposits on the seismic hazard of the area using the information available at the time. In this study, we have used updated and newly available resources for seismic and liquefaction hazard analyses of the ME. We have developed an improved 3D geological model. Additionally, we used surface geological maps from Cupples and Van Arsdale (2013) to prepare liquefaction hazard maps. Both equivalent linear and nonlinear site response codes were used to develop site amplification distributions for use in generating hazard maps. The site amplification distributions are created using the Monte Carlo approach of Cramer et al. (2004, 2006) on a 0.1-degree grid. The 2014 National Seismic Hazard model and attenuation relations (Petersen et al., 2014) are used to prepare seismic hazard maps. Liquefaction hazard maps are then generated using liquefaction probability curves from Holzer (2011) and Cramer et al. (2015). Equivalent linear response (with increased precision and restricted nonlinear behavior with depth) shows hazard for the ME similar to the nonlinear analysis (without pore pressure) results. At short periods nonlinear deamplification dominates the hazard, but at long periods resonance amplification dominates. The liquefaction hazard tends to be high in Holocene and late Pleistocene lowland sediments, even with lowered ground water levels, and low in the Pleistocene loess of the uplands. Considering pore pressure effects in nonlinear site response analysis at a test site on the lowlands shows amplification of ground motion at short periods. PGA estimates from ME liquefaction and MMI observations are in the 0.25 to 0.4 g range. Our estimated M7.5 PGA hazard within 10 km of the fault can exceed this. Ground motion observations from liquefaction sites in New Zealand and Japan support PGAs below 0.4 g, except at sites within 20 km exhibiting pore-pressure-induced acceleration spikes due to cyclic mobility, where PGA ranges from 0.5 to 1.5 g. This study is being extended to more detailed seismic and liquefaction hazard studies in five western Tennessee counties under a five-year grant from HUD.
Wind shear modeling for aircraft hazard definition
NASA Technical Reports Server (NTRS)
Frost, W.; Camp, D. W.; Wang, S. T.
1978-01-01
Mathematical models of wind profiles were developed for use in fast-time and manned flight simulation studies aimed at defining and eliminating wind shear hazards. A set of wind profiles and associated wind shear characteristics for stable and neutral boundary layers, thunderstorms, and frontal winds potentially encounterable by aircraft in the terminal area is given. Engineering models of wind shear for direct hazard analysis are presented as mathematical formulae, graphs, tables, and computer lookup routines. The wind profile data utilized to establish the models are described as to location, how they were obtained, time of observation, and number of data points up to 500 m. Recommendations, engineering interpretations and guidelines for use of the data are given, and the range of applicability of the wind shear models is described.
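A sketch of one of the simplest engineering wind-profile models of the kind referred to above: the logarithmic profile for a neutral boundary layer, with its implied shear. The friction velocity and roughness length below are illustrative values, not the report's tabulated parameters.

```python
import numpy as np

KAPPA = 0.4                  # von Karman constant

def log_wind_profile(z, u_star=0.5, z0=0.1):
    """Mean wind speed (m/s) at height z (m) in a neutral surface layer."""
    return (u_star / KAPPA) * np.log(z / z0)

def wind_shear(z, u_star=0.5):
    """Vertical shear du/dz (1/s) of the logarithmic profile."""
    return u_star / (KAPPA * z)

z = np.array([10.0, 50.0, 100.0, 300.0])
print(log_wind_profile(z))   # mean winds for a fast-time simulation input
print(wind_shear(z))         # shear is largest near the ground
```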
Estimating piecewise exponential frailty model with changing prior for baseline hazard function
NASA Astrophysics Data System (ADS)
Thamrin, Sri Astuti; Lawi, Armin
2016-02-01
Piecewise exponential models provide a very flexible framework for modelling univariate survival data. They can be used to estimate the effects of different covariates that influence the survival times. Although in a strict sense it is a parametric model, a piecewise exponential hazard can approximate any shape of parametric baseline hazard. In the parametric baseline hazard, the hazard function for each individual may depend on a set of risk factors or explanatory variables. In practice, however, this set rarely includes all the relevant variables, whether known or measurable. This unknown and unobservable risk factor of the hazard function is often termed the individual's heterogeneity, or frailty. This paper analyses the effects of unobserved population heterogeneity on patients' survival times. The issue of model choice through variable selection is also considered. A sensitivity analysis is conducted to assess the influence of the prior for each parameter. We used the Markov chain Monte Carlo method to compute the Bayesian estimator on kidney infection data. The results obtained show that sex and frailty are substantially associated with survival in this study and that the models are quite sensitive to the choice of the two different priors.
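A sketch of the piecewise exponential building block: constant hazards on a grid of intervals give a flexible baseline, and a multiplicative frailty term scales the cumulative hazard per individual. The interval cuts and rates below are illustrative; in the paper these rates receive (changing) priors and are estimated by MCMC.

```python
import numpy as np

cuts = np.array([0.0, 30.0, 90.0, 180.0])     # interval boundaries (days)
lam = np.array([0.010, 0.004, 0.002, 0.001])  # hazard in each interval

def cum_hazard(t):
    """H(t) = sum of lam_k * (time spent in interval k)."""
    t = np.atleast_1d(np.asarray(t, dtype=float))
    upper = np.append(cuts[1:], np.inf)
    exposure = np.clip(t[:, None], None, upper) - cuts  # time past each cut
    return (lam * np.clip(exposure, 0.0, None)).sum(axis=1)

def survival(t, frailty=1.0):
    """S(t | z) = exp(-z * H(t)) with multiplicative frailty z."""
    return np.exp(-frailty * cum_hazard(t))

print(survival([15, 60, 365]))                # baseline individual (z = 1)
print(survival([15, 60, 365], frailty=2.0))   # frailer individual fails sooner
```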
NASA Astrophysics Data System (ADS)
Sturm, Michael; Gems, Bernhard; Fuchs, Sven; Mazzorana, Bruno; Papathoma-Köhle, Maria; Aufleger, Markus
2016-04-01
In European mountain regions, losses due to torrential hazards are still considerably high, despite the ongoing debate on an overall increasing or decreasing trend. Recent events in Austria clearly revealed that, for technical and economic reasons, an overall protection of settlements in the alpine environment against torrential hazards is not feasible. On the side of the hazard process, events with unpredictable intensities may represent overload scenarios for existing protection structures in the torrent catchments. They bear a particular risk of significant losses in the living space. Although the importance of vulnerability is widely recognised, there is still a research gap concerning its assessment. Currently, potential losses at buildings due to torrential hazards, and their comparison with reinstatement costs, are determined by the use of empirical functions. Hence, relations between process intensities and the extent of losses, gathered by the analysis of historic hazard events and the information of object-specific restoration values, are used. This approach does not represent a physics-based and integral concept, since relevant and often crucial processes, such as the intrusion of the fluid-sediment mixture into elements at risk, are not considered. Our work is therefore targeted at extending the findings and models of present risk research in the context of an integral, more physics-based vulnerability analysis concept. Fluviatile torrential hazard processes and their impacts on the building envelope are experimentally modelled. Material intrusion processes are thereby explicitly considered. Dynamic impacts are gathered quantitatively and spatially distributed by the use of a large set of force transducers. The experimental tests are accomplished with artificial, vertical and skewed plates, including also openings for material intrusion. Further, the impacts on specific buildings within the test site of the work, the fan apex of the Schnannerbach torrent in Tyrol (Austria), are analysed in detail. A couple of buildings are entirely reconstructed within the physical scale model at a scale of 1:30. They include basement and first floor and thereby all relevant openings on the building envelopes. The results from the experimental modelling represent the data basis for further physics-based vulnerability analysis. Hence, the applied vulnerability analysis concept significantly extends the methods presently used in flood risk assessment. The results of the study are of basic importance for practical application, as they provide extensive information to support hazard zone mapping and management, as well as the planning of local technical protection measures.
NASA Astrophysics Data System (ADS)
Chen, Yuh-Ing; Huang, Chi-Shen; Liu, Jann-Yenq
2015-12-01
We investigated the temporal-spatial hazard of the earthquakes after the 1999 September 21 MW = 7.7 Chi-Chi shock in a continental region of Taiwan. The Reasenberg-Jones (RJ) model (Reasenberg and Jones, 1989, 1994), which combines the frequency-magnitude distribution (Gutenberg and Richter, 1944) and a time-decaying occurrence rate (Utsu et al., 1995), is conventionally employed for assessing the earthquake hazard after a large shock. However, it is found that the b values in the frequency-magnitude distribution of the earthquakes in the study region dramatically decreased from background values after the Chi-Chi shock, and then gradually increased. The observation of a time-dependent frequency-magnitude distribution motivated us to propose a modified RJ model (MRJ) to assess the earthquake hazard. To see how the models perform in assessing short-term earthquake hazard, the RJ and MRJ models were separately used to sequentially forecast earthquakes in the study region. To depict the potential rupture area for future earthquakes, we further constructed relative hazard (RH) maps based on the two models. The Receiver Operating Characteristics (ROC) curves (Swets, 1988) finally demonstrated that the RH map based on the MRJ model was, in general, superior to the one based on the original RJ model for exploring the spatial hazard of earthquakes in a short time after the Chi-Chi shock.
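A sketch of the Reasenberg-Jones aftershock rate the modified model builds on: a Gutenberg-Richter magnitude term combined with Omori-Utsu time decay. In the MRJ variant the b value itself becomes time-dependent; the parameter values here are generic aftershock-sequence values, not the Chi-Chi estimates.

```python
import numpy as np

def rj_rate(t_days, m, mainshock_mag, a=-1.67, b=0.91, c=0.05, p=1.08):
    """Rate (events/day) of aftershocks with magnitude >= m at time t."""
    return 10.0 ** (a + b * (mainshock_mag - m)) / (t_days + c) ** p

t = np.array([1.0, 10.0, 100.0])
print(rj_rate(t, m=5.0, mainshock_mag=7.7))        # constant-b RJ model

# MRJ idea (illustrative): let b recover from a post-mainshock drop
b_t = 0.91 - 0.25 * np.exp(-t / 30.0)
print(10.0 ** (-1.67 + b_t * (7.7 - 5.0)) / (t + 0.05) ** 1.08)
```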
A New Seismic Hazard Model for Mainland China
NASA Astrophysics Data System (ADS)
Rong, Y.; Xu, X.; Chen, G.; Cheng, J.; Magistrale, H.; Shen, Z. K.
2017-12-01
We are developing a new seismic hazard model for Mainland China by integrating historical earthquake catalogs, geological faults, geodetic GPS data, and geology maps. To build the model, we construct an Mw-based homogeneous historical earthquake catalog spanning from 780 B.C. to present, create fault models from active fault data, and derive a strain rate model based on the most complete GPS measurements and a new strain derivation algorithm. We divide China and the surrounding regions into about 20 large seismic source zones. For each zone, a tapered Gutenberg-Richter (TGR) magnitude-frequency distribution is used to model the seismic activity rates. The a- and b-values of the TGR distribution are calculated using observed earthquake data, while the corner magnitude is constrained independently using the seismic moment rate inferred from the geodetically-based strain rate model. Small and medium sized earthquakes are distributed within the source zones following the location and magnitude patterns of historical earthquakes. Some of the larger earthquakes are distributed onto active faults, based on their geological characteristics such as slip rate, fault length, down-dip width, and various paleoseismic data. The remaining larger earthquakes are then placed into the background. A new set of magnitude-rupture scaling relationships is developed based on earthquake data from China and vicinity. We evaluate and select appropriate ground motion prediction equations by comparing them with observed ground motion data and performing residual analysis. To implement the modeling workflow, we develop a tool that builds upon the functionalities of GEM's Hazard Modeler's Toolkit. The GEM OpenQuake software is used to calculate seismic hazard at various ground motion periods and various return periods. To account for site amplification, we construct a site condition map based on geology. The resulting new seismic hazard maps can be used for seismic risk analysis and management.
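A sketch of the tapered Gutenberg-Richter (TGR) rate model used for the source zones: a power law in seismic moment tapered by an exponential corner-moment term (Kagan-style). The beta value and corner magnitude below are illustrative; in the model described above the corner magnitude is constrained by the geodetically inferred moment rate.

```python
import numpy as np

def moment(mw):
    """Seismic moment (N*m) from moment magnitude."""
    return 10.0 ** (1.5 * mw + 9.05)

def tgr_rate_above(mw, mw_t=5.0, rate_t=2.0, beta=0.65, mw_corner=8.0):
    """Annual rate of events >= mw, given rate_t events/yr >= mw_t."""
    m, m_t, m_c = moment(mw), moment(mw_t), moment(mw_corner)
    return rate_t * (m_t / m) ** beta * np.exp((m_t - m) / m_c)

for mw in (6.0, 7.0, 8.0, 8.5):
    print(mw, tgr_rate_above(mw))   # taper suppresses rates near the corner
```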
Dynamic wake prediction and visualization with uncertainty analysis
NASA Technical Reports Server (NTRS)
Holforty, Wendy L. (Inventor); Powell, J. David (Inventor)
2005-01-01
A dynamic wake avoidance system utilizes aircraft and atmospheric parameters readily available in flight to model and predict airborne wake vortices in real time. A novel combination of algorithms allows for a relatively simple yet robust wake model to be constructed based on information extracted from a broadcast. The system predicts the location and movement of the wake based on the nominal wake model and correspondingly performs an uncertainty analysis on the wake model to determine a wake hazard zone (no fly zone), which comprises a plurality of wake planes, each moving independently from another. The system selectively adjusts dimensions of each wake plane to minimize spatial and temporal uncertainty, thereby ensuring that the actual wake is within the wake hazard zone. The predicted wake hazard zone is communicated in real time directly to a user via a realistic visual representation. In an example, the wake hazard zone is visualized on a 3-D flight deck display to enable a pilot to visualize or see a neighboring aircraft as well as its wake. The system substantially enhances the pilot's situational awareness and allows for a further safe decrease in spacing, which could alleviate airport and airspace congestion.
Detecting Spatial Patterns of Natural Hazards from the Wikipedia Knowledge Base
NASA Astrophysics Data System (ADS)
Fan, J.; Stewart, K.
2015-07-01
The Wikipedia database is a data source of immense richness and variety. Included in this database are thousands of geotagged articles, including, for example, almost real-time updates on current and historic natural hazards. This includes user-contributed information about the location of natural hazards, the extent of the disasters, and many details relating to response, impact, and recovery. In this research, a computational framework is proposed to detect spatial patterns of natural hazards from the Wikipedia database by combining topic modeling methods with spatial analysis techniques. The computation is performed on the Neon Cluster, a high-performance computing cluster at the University of Iowa. This work uses wildfires as the exemplar hazard, but the framework is easily generalizable to other types of hazards, such as hurricanes or flooding. Latent Dirichlet Allocation (LDA) modeling is first employed to train on the entire English Wikipedia dump, transforming the database dump into a 500-dimension topic model. Over 230,000 geotagged articles are then extracted from the Wikipedia database, spatially covering the contiguous United States. The geotagged articles are converted into the LDA topic space based on the topic model, with each article represented as a weighted multidimensional topic vector. By treating each article's topic vector as an observed point in geographic space, a probability surface is calculated for each of the topics. In this work, Wikipedia articles about wildfires are extracted from the Wikipedia database, forming a wildfire corpus and creating a basis for the topic vector analysis. The spatial distribution of wildfire outbreaks in the US is estimated by calculating the weighted sum of the topic probability surfaces using a map algebra approach, and mapped using GIS. To evaluate the approach, the estimation is compared to wildfire hazard potential maps created by the USDA Forest Service.
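A hedged sketch of the LDA step: train a topic model, convert an article into its topic vector, and then weight per-topic probability surfaces. The gensim library is assumed here, and the toy corpus stands in for the full Wikipedia dump; the framework's actual preprocessing and 500-topic configuration are far larger.

```python
from gensim.corpora import Dictionary
from gensim.models import LdaModel

docs = [["wildfire", "forest", "smoke", "evacuation"],
        ["hurricane", "flood", "storm", "surge"],
        ["wildfire", "drought", "fire", "forest"]]
dictionary = Dictionary(docs)
corpus = [dictionary.doc2bow(d) for d in docs]

lda = LdaModel(corpus=corpus, id2word=dictionary, num_topics=2,
               random_state=0, passes=10)

# topic vector of a geotagged article: sparse (topic_id, weight) pairs
article = dictionary.doc2bow(["forest", "fire", "smoke"])
topic_vec = lda.get_document_topics(article, minimum_probability=0.0)
print(topic_vec)

# spatial step (conceptually): estimate = sum_k weight_k * surface_k,
# where surface_k is a probability surface for topic k built from the
# coordinates of all geotagged articles (the map algebra step above)
```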
NASA Astrophysics Data System (ADS)
Iskandar, I.
2018-03-01
The exponential distribution is the most widely used distribution in reliability analysis. It is suitable for representing the lifetimes of many kinds of units and is available in a simple statistical form. Its characteristic is a constant hazard rate, and it is a special case of the Weibull family of distributions. In this paper, our effort is to introduce the basic notions that constitute an exponential competing risks model in reliability analysis using a Bayesian approach, and to present the corresponding analytic methods. The cases are limited to models with independent causes of failure. A non-informative prior distribution is used in our analysis. We describe the likelihood function, followed by the posterior distribution and the point, interval, hazard function, and reliability estimates. The net probability of failure if only one specific risk is present, the crude probability of failure due to a specific risk in the presence of other causes, and partial crude probabilities are also included.
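A sketch of the conjugate machinery behind a Bayesian exponential model: with a (near-)non-informative Gamma(eps, eps) prior, the posterior of the failure rate lambda given n failures in total exposure T is Gamma(eps + n, eps + T). The data values are illustrative, and the single-rate case shown here is the building block that a competing-risks model applies per cause.

```python
import numpy as np

rng = np.random.default_rng(0)
n_fail, total_time = 12, 480.0          # failures and total exposure (hours)
eps = 1e-3                              # near-flat Gamma(eps, eps) prior

# posterior samples of the constant hazard rate lambda
lam_draws = rng.gamma(shape=eps + n_fail, scale=1.0 / (eps + total_time),
                      size=100_000)

t0 = 50.0
rel_draws = np.exp(-lam_draws * t0)     # posterior of reliability R(t0)
print(np.mean(lam_draws), np.percentile(lam_draws, [2.5, 97.5]))
print(np.mean(rel_draws), np.percentile(rel_draws, [2.5, 97.5]))
# for the exponential model the hazard function is constant, equal to lambda
```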
NASA Astrophysics Data System (ADS)
Lane, E. M.; Gillibrand, P. A.; Wang, X.; Power, W.
2013-09-01
Regional source tsunamis pose a potentially devastating hazard to communities and infrastructure on the New Zealand coast, but major events are very uncommon. This dichotomy of infrequent but potentially devastating hazards makes realistic assessment of the risk challenging. Here, we describe a method to determine a probabilistic assessment of the hazard posed by regional source tsunamis with an "average recurrence interval" of 2,500 years. The method is applied to the east Auckland region of New Zealand. From an assessment of potential regional tsunamigenic events over 100,000 years, the inundation of the Auckland region from the worst 100 events is modelled using a hydrodynamic model, and probabilistic inundation depths on a 2,500-year time scale are determined. Tidal effects on the potential inundation were included by coupling the predicted wave heights with the probability density function of tidal heights at the inundation site. Results show that the more exposed northern section of the east coast and the outer islands in the Hauraki Gulf face the greatest hazard from regional tsunamis in the Auckland region. Incorporating tidal effects into predictions of inundation reduced the predicted hazard compared with modelling all tsunamis arriving at high tide, giving a more accurate hazard assessment on the specified time scale. This study presents the first probabilistic analysis of dynamic modelling of tsunami inundation for the New Zealand coast and as such provides the most comprehensive assessment of tsunami inundation of the Auckland region from regional source tsunamis available to date.
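A sketch of the tide-coupling step described above: the probability that the total water level exceeds a height z is the tsunami-only exceedance integrated over the tidal-stage density, rather than assuming every event arrives at high tide. Both the tide density and the exceedance curve below are artificial stand-ins for the study's modelled distributions.

```python
import numpy as np

tide_levels = np.linspace(-1.5, 1.5, 61)             # tidal stage (m)
tide_pdf = np.exp(-0.5 * (tide_levels / 0.7) ** 2)   # stand-in tide density
tide_pdf /= np.trapz(tide_pdf, tide_levels)

def p_tsunami_exceeds(h):
    """Stand-in exceedance curve for tsunami-only wave height h (m)."""
    return np.exp(-h / 1.2)

z = 3.0                                               # total level of interest
p_with_tide = np.trapz(p_tsunami_exceeds(np.clip(z - tide_levels, 0, None))
                       * tide_pdf, tide_levels)
p_high_tide = p_tsunami_exceeds(z - tide_levels.max())
print(p_with_tide, p_high_tide)  # the high-tide-only assumption overstates hazard
```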
Multi-hazard risk analysis for management strategies
NASA Astrophysics Data System (ADS)
Kappes, M.; Keiler, M.; Bell, R.; Glade, T.
2009-04-01
Risk management very often operates in a reactive way, responding to an event, instead of proactively starting with risk analysis and building up the whole process of risk evaluation, prevention, event management and regeneration. Since damage and losses from natural hazards rise continuously, more and more studies, concepts (e.g. in Switzerland or South Tyrol-Bolzano) and software packages (e.g. ARMAGEDOM, HAZUS or RiskScape) are being developed to guide, standardize and facilitate risk analysis. But these approaches focus on different aspects and are mostly closely adapted to the situation (legislation, organization of the administration, specific processes, etc.) of the specific country or region. In this study we propose the development of a flexible methodology for multi-hazard risk analysis, identifying the stakeholders and their needs, the processes and their characteristics, and the modeling approaches, as well as the incoherencies that occur when combining all these different aspects. Based on this concept, a flexible software package will be established, with ArcGIS as the central base complemented by various modules for hazard modeling, vulnerability assessment and risk calculation. Not all modules will be developed from scratch; some will be taken from the current state of the art and connected to or integrated into ArcGIS. For this purpose two study sites, Valtellina in Italy and Barcelonnette in France, were chosen, and the hazard types debris flow, rockfall, landslide, avalanche and flood are planned to be included in the tool for a regional multi-hazard risk analysis. Since the central idea of this tool is its flexibility, this will only be a first step; in the future further processes and scales can be included and the instrument thus adapted to any study site.
Soong, David T.; Straub, Timothy D.; Murphy, Elizabeth A.
2006-01-01
Results of the hydrologic modeling, flood-frequency, hydraulic modeling, and flood-hazard analyses of the Blackberry Creek watershed in Kane County, Illinois, indicate that the 100-year and 500-year flood plains range from approximately 25 acres in the tributary F watershed (a headwater subbasin at the northeastern corner of the watershed) to almost 1,800 acres along the Blackberry Creek main stem. Based on 1996 land-cover data, most of the land in the 100-year and 500-year flood plains was cropland, forested and wooded land, and grassland. A relatively small percentage of urban land was in the flood plains. The Blackberry Creek watershed has undergone rapid urbanization in recent decades. The population and urbanized lands in the watershed are projected to double from the 1990 condition by 2020. Recently, flood-induced damage has occurred more frequently in urbanized areas of the watershed. There are concerns about the effect of urbanization on flood peaks and volumes, future flood-mitigation plans, and potential effects on water quality and stream habitats. This report describes the procedures used in developing the hydrologic models, estimating the flood-peak discharge magnitudes and recurrence intervals for flood-hazard analysis, and developing the hydraulic model, and presents the results of the analysis in graphical and tabular form. The hydrologic model, Hydrological Simulation Program-FORTRAN (HSPF), was used to simulate continuous water movements through the various patterns of land use in the watershed. Flood-frequency analysis was applied to an annual maximum series to determine flood quantiles in subbasins for flood-hazard analysis. The Hydrologic Engineering Center-River Analysis System (HEC-RAS) hydraulic model was used to determine the 100-year and 500-year flood elevations, and to determine the 100-year floodway. The hydraulic model was calibrated and verified using high-water marks and observed inundation maps for the July 17-18, 1996, flood event. Digital maps of the 100-year and 500-year flood plains and the 100-year floodway for each tributary and the main stem of Blackberry Creek were compiled.
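A sketch of the flood-frequency step: fit a distribution to an annual maximum series of peak discharges and read off the 100-year and 500-year quantiles. A Gumbel fit is used here for brevity, and the series is synthetic; the study would have followed standard federal flood-frequency procedures rather than this exact recipe.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
ams = stats.gumbel_r.rvs(loc=900.0, scale=300.0, size=45,
                         random_state=rng)      # stand-in annual maxima (cfs)

loc, scale = stats.gumbel_r.fit(ams)
for T in (100, 500):
    q = stats.gumbel_r.ppf(1.0 - 1.0 / T, loc=loc, scale=scale)
    print(f"{T}-year peak discharge: {q:,.0f} cfs")
```

Quantiles like these are the inflow boundary conditions that a hydraulic model such as HEC-RAS converts into the 100-year and 500-year flood elevations.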
Regression analysis of informative current status data with the additive hazards model.
Zhao, Shishun; Hu, Tao; Ma, Ling; Wang, Peijie; Sun, Jianguo
2015-04-01
This paper discusses regression analysis of current status failure time data arising from the additive hazards model in the presence of informative censoring. Many methods have been developed for regression analysis of current status data under various regression models when the censoring is noninformative, and there also exists a large literature on parametric analysis of informative current status data in the context of tumorigenicity experiments. In this paper, a semiparametric maximum likelihood estimation procedure is presented in which a copula model is employed to describe the relationship between the failure time of interest and the censoring time. Furthermore, I-splines are used to approximate the nonparametric functions involved, and the asymptotic consistency and normality of the proposed estimators are established. A simulation study is conducted and indicates that the proposed approach works well for practical situations. An illustrative example is also provided.
Lin Receives 2010 Natural Hazards Focus Group Award for Graduate Research
NASA Astrophysics Data System (ADS)
2010-11-01
Ning Lin has been awarded the Natural Hazards Focus Group Award for Graduate Research, given annually to a recent Ph.D. recipient for outstanding contributions to natural hazards research. Lin's thesis is entitled “Multi-hazard risk analysis related to hurricanes.” She is scheduled to present an invited talk in the Extreme Natural Events: Modeling, Prediction, and Mitigation session (NH20) during the 2010 AGU Fall Meeting, held 13-17 December in San Francisco, Calif. Lin will be formally presented with the award at the Natural Hazards focus group reception on 14 December 2010.
Landslide Hazard from Coupled Inherent and Dynamic Probabilities
NASA Astrophysics Data System (ADS)
Strauch, R. L.; Istanbulluoglu, E.; Nudurupati, S. S.
2015-12-01
Landslide hazard research has typically been conducted independently from hydroclimate research. We sought to unify these two lines of research to provide regional scale landslide hazard information for risk assessments and resource management decision-making. Our approach couples an empirical inherent landslide probability, based on a frequency ratio analysis, with a numerical dynamic probability, generated by combining subsurface water recharge and surface runoff from the Variable Infiltration Capacity (VIC) macro-scale land surface hydrologic model with a finer resolution probabilistic slope stability model. Landslide hazard mapping is advanced by combining static and dynamic models of stability into a probabilistic measure of geohazard prediction in both space and time. This work will aid resource management decision-making in current and future landscape and climatic conditions. The approach is applied as a case study in North Cascade National Park Complex in northern Washington State.
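A sketch of the frequency-ratio calculation behind the "inherent" probability layer: for each class of a factor map (e.g. a slope band), FR = (share of landslide cells in the class) / (share of all cells in the class), with values above 1 marking classes disproportionately prone to failure. The counts below are illustrative.

```python
import numpy as np

classes = ["0-15 deg", "15-30 deg", "30-45 deg", ">45 deg"]
cells = np.array([50000, 30000, 15000, 5000])     # cells per slope class
slide_cells = np.array([40, 260, 540, 160])       # landslide cells per class

fr = (slide_cells / slide_cells.sum()) / (cells / cells.sum())
for c, r in zip(classes, fr):
    print(f"{c}: FR = {r:.2f}")

# A relative inherent probability can be formed by normalizing summed FR
# scores across factor maps; the dynamic (hydroclimate-driven) layer then
# modulates that static picture in time.
```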
DesignSafe-CI: A Cyberinfrastructure for Natural Hazard Simulation and Data
NASA Astrophysics Data System (ADS)
Dawson, C.; Rathje, E.; Stanzione, D.; Padgett, J.; Pinelli, J. P.
2017-12-01
DesignSafe is the web-based research platform of the Natural Hazards Engineering Research Infrastructure (NHERI) network that provides the computational tools needed to manage and analyze critical data for natural hazards research, with wind- and storm-surge-related hazards being a primary focus. One of the simulation tools under DesignSafe is the Advanced Circulation (ADCIRC) model, a coastal ocean model used in storm surge analysis. ADCIRC is an unstructured finite element model with high-resolution capabilities for studying storm surge impacts, and has long been used in storm surge hindcasting and forecasting. In this talk, we will demonstrate the use of ADCIRC within the DesignSafe platform and its use for forecasting Hurricane Harvey. We will also demonstrate how to analyze, visualize and archive critical storm-surge-related data within DesignSafe.
An advanced terrain modeler for an autonomous planetary rover
NASA Technical Reports Server (NTRS)
Hunter, E. L.
1980-01-01
A roving vehicle capable of autonomously exploring the surface of an alien world is under development and an advanced terrain modeler to characterize the possible paths of the rover as hazardous or safe is presented. This advanced terrain modeler has several improvements over the Troiani modeler that include: a crosspath analysis, better determination of hazards on slopes, and methods for dealing with missing returns at the extremities of the sensor field. The results from a package of programs to simulate the roving vehicle are then examined and compared to results from the Troiani modeler.
NASA Astrophysics Data System (ADS)
Zosseder, K.; Post, J.; Steinmetz, T.; Wegscheider, S.; Strunz, G.
2009-04-01
Indonesia is located at one of the most active geological subduction zones in the world. Following the most recent seaquakes and their subsequent tsunamis in December 2004 and July 2006, it is expected that tsunamis will also occur in the near future, due to increased tectonic tensions leading to abrupt vertical seafloor alterations after a century of relative tectonic silence. To face this devastating threat, tsunami hazard maps are very important as a base for evacuation planning and mitigation strategies. For a tsunami impact, the hazard assessment is mostly covered by numerical modelling, because the model results normally offer the most precise database for a hazard analysis: they include spatially distributed data and their influence on the hydraulic dynamics. Generally, a model result gives a probability for the intensity distribution of a tsunami at the coast (or run-up) and the spatial distribution of the maximum inundation area, depending on the location and magnitude of the tsunami source used. The boundary condition of the source used for the model is mostly chosen by a worst-case approach: the location and magnitude which are likely to occur and which are assumed to generate the worst impact are used to predict the impact at a specific area. But for a tsunami hazard assessment covering a large coastal area, as demanded in the GITEWS (German Indonesian Tsunami Early Warning System) project in which the present work is embedded, this approach is not practicable, because many tsunami sources can cause an impact at the coast and must be considered. Thus, a multi-scenario tsunami model approach is developed to provide a reliable hazard assessment covering large areas. For the Indonesian early warning system, many tsunami scenarios were modelled by the Alfred Wegener Institute (AWI) for different probable tsunami sources and different magnitudes along the Sunda Trench. Every modelled scenario delivers the spatial distribution of the inundation for a specific area, the wave height at the coast in this area, and the estimated times of arrival (ETAs) of the waves, caused by one tsunamigenic source with a specific magnitude. These parameters from the several scenarios can overlap each other along the coast and must be combined to obtain one comprehensive hazard assessment for all possible future tsunamis in the region under observation. The simplest way to derive the inundation probability along the coast using the multi-scenario approach is to overlay all scenario inundation results and to determine how often a point on land will be significantly inundated across the various scenarios. But this does not take into account that the tsunamigenic sources used for the modelled scenarios have different likelihoods of causing a tsunami. Hence, a statistical analysis of historical data and of geophysical investigation results based on numerical modelling is added to the hazard assessment, which clearly improves its significance. For this purpose the present method is developed; it contains a logical combination of the diverse probabilities assessed, such as the probability of occurrence of different earthquake magnitudes at different localities, the probability of occurrence of a specific wave height at the coast, and the probability for every point on land of being hit by a tsunami. These values are combined by a logic tree technique and quantified by statistical analysis of historical data and of the tsunami modelling results, as mentioned before.
This results in a tsunami inundation probability map covering the south-west coast of Indonesia which shows significant spatial diversity, offering a sound basis for evacuation planning and mitigation strategies. Keywords: tsunami hazard assessment, tsunami modelling, probabilistic analysis, early warning
Contribution of physical modelling to climate-driven landslide hazard mapping: an alpine test site
NASA Astrophysics Data System (ADS)
Vandromme, R.; Desramaut, N.; Baills, A.; Hohmann, A.; Grandjean, G.; Sedan, O.; Mallet, J. P.
2012-04-01
The aim of this work is to develop a methodology for integrating climate change scenarios, and especially their precipitation component, into quantitative hazard assessment. The effects of climate change will differ depending on both the location of the site and the type of landslide considered, since mass movements can be triggered by different factors. This paper describes a methodology to address this issue and shows an application at an alpine test site. Mechanical approaches represent one solution for quantitative landslide susceptibility and hazard modeling. However, as the quantity and quality of data are generally very heterogeneous at a regional scale, it is necessary to take the uncertainty into account in the analysis. In this perspective, a new hazard modeling method was developed and integrated into a program named ALICE. The program integrates mechanical stability analysis within GIS software, taking data uncertainty into account. The method proposes a quantitative classification of landslide hazard and offers a useful tool for faster, more efficient hazard mapping. However, expert judgment is still necessary to finalize the maps, as it is the only way to take into account some influential factors in slope stability, such as the heterogeneity of the geological formations or the effects of human intervention. To go further, the alpine test site (Barcelonnette area, France) is being used to integrate climate change scenarios, and especially their precipitation component, into the ALICE program with the help of a hydrological model (GARDENIA) and the regional climate model REMO (Jacob, 2001). From a DEM, land-cover map, geology, geotechnical data and so forth, the program classifies hazard zones depending on geotechnics and on different hydrological contexts varying in time. This communication, realized within the framework of the SafeLand project, is supported by the European Commission under the 7th Framework Programme for Research and Technological Development, Area "Environment", Activity 1.3.3.1 "Prediction of triggering and risk assessment for landslides".
NASA Astrophysics Data System (ADS)
García-Rodríguez, M. J.; Malpica, J. A.; Benito, B.
2009-04-01
In recent years, interest in landslide hazard assessment studies has increased substantially. Such studies are appropriate for evaluation and for developing mitigation plans in landslide-prone areas. Several techniques are available for landslide hazard research at a regional scale. Generally, they can be classified into two groups: qualitative and quantitative methods. Most qualitative methods tend to be subjective, since they depend on expert opinion and represent hazard levels in descriptive terms. Quantitative methods, on the other hand, are objective and commonly used because of the correlation between instability factors and the locations of landslides. Within this group, statistical approaches and heuristic techniques based on artificial intelligence (artificial neural networks (ANN), fuzzy logic, etc.) provide rigorous analysis for assessing landslide hazard over large regions, although results depend on the qualitative and quantitative data, the scale, the types of movement and the characteristic factors used. We analysed and compared an approach for assessing earthquake-triggered landslide hazard using logistic regression (LR) and artificial neural networks (ANN) with a back-propagation learning algorithm. An application was developed for El Salvador, a country in Central America where earthquake-triggered landslides are a common phenomenon. In a first phase, we analysed the susceptibility and hazard associated with the seismic scenario of the earthquake of 13 January 2001, calibrating the models with data from the landslide inventory for this scenario. These analyses require input variables representing physical parameters that contribute to the initiation of slope instability, such as slope gradient, elevation, aspect, mean annual precipitation, lithology, land use and terrain roughness, while the occurrence or non-occurrence of landslides is the dependent variable. The results of the landslide susceptibility analysis were checked against landslide location data and show high concordance between the landslide inventory and the estimated high-susceptibility zones, with an agreement of 95.1% for the ANN model and 89.4% for the LR model. In addition, we compared the two techniques using the Receiver Operating Characteristic (ROC) curve, a plot of sensitivity vs. (1 - specificity) for a binary classifier as a function of its discrimination threshold, and calculated the Area Under the ROC curve (AUROC) for each model. Finally, the models were used to develop a new probabilistic landslide hazard map for future events, obtained by combining the expected triggering factor (calculated earthquake ground motion) for a return period of 475 years with the susceptibility map.
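A minimal sketch of this kind of LR-versus-ANN susceptibility comparison, assuming synthetic stand-in data; the study's actual predictors and landslide inventory are not reproduced, and scikit-learn's MLPClassifier stands in for the back-propagation network:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# X: stand-in physical predictors (slope, elevation, aspect, precipitation, ...)
# y: landslide occurrence (1) / non-occurrence (0), as in an inventory
X = rng.normal(size=(2000, 7))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=2000) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
scaler = StandardScaler().fit(X_train)

lr = LogisticRegression().fit(scaler.transform(X_train), y_train)
ann = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000,
                    random_state=0).fit(scaler.transform(X_train), y_train)

# compare via AUROC, the summary statistic named in the abstract
for name, model in [("LR", lr), ("ANN", ann)]:
    p = model.predict_proba(scaler.transform(X_test))[:, 1]
    print(name, "AUROC =", round(roc_auc_score(y_test, p), 3))
```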
Modeling Compound Flood Hazards in Coastal Embayments
NASA Astrophysics Data System (ADS)
Moftakhari, H.; Schubert, J. E.; AghaKouchak, A.; Luke, A.; Matthew, R.; Sanders, B. F.
2017-12-01
Coastal cities around the world are built on lowland topography adjacent to coastal embayments and river estuaries, where multiple factors, such as sea level rise and river flooding, threaten to increase flood hazards. Quantitative risk assessment is required for the administration of flood insurance programs and the design of cost-effective flood risk reduction measures. This demands a characterization of extreme water levels, such as 100- and 500-year return period events. Furthermore, hydrodynamic flood models are routinely used to characterize localized flood intensities (i.e., local depth and velocity) based on boundary forcing sampled from extreme value distributions. For example, extreme flood discharges in the U.S. are estimated from measured flood peaks using the Log-Pearson Type III distribution. However, configuring hydrodynamic models for coastal embayments is challenging because of compound extreme flood events: events caused by a combination of extreme sea levels, extreme river discharges, and possibly other factors such as extreme waves and precipitation causing pluvial flooding in urban developments. Here, we present an approach to flood risk assessment that coordinates multivariate extreme value analysis with hydrodynamic modeling of coastal embayments. First, we evaluate the significance of the correlation structure between terrestrial freshwater inflow and oceanic variables; second, this correlation structure is described using copula functions in the unit joint probability domain; and third, we choose a series of compound design scenarios for hydrodynamic modeling based on their occurrence likelihood. The design scenarios include the most likely compound event (the one with the highest joint probability density), preferred marginal scenarios, and reproduced ensemble time series based on Monte Carlo sampling of the bivariate hazard domain. Comparing the resulting extreme water dynamics under these compound hazard scenarios provides insight into the strengths and weaknesses of each approach and helps modelers choose the scenario that best fits the needs of their project. The proposed risk assessment approach can help flood hazard modeling practitioners achieve a more reliable estimate of risk by cautiously reducing the dimensionality of the hazard analysis.
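A sketch of the copula step described above, under stated assumptions: GEV marginals for river discharge and sea level, a Gaussian copula for the dependence, and selection of the "most likely" design event on a joint-exceedance isoline. All distributions and parameter values are illustrative, and scipy's bivariate normal CDF stands in for a dedicated copula library:

```python
import numpy as np
from scipy import stats

rho = 0.6  # assumed dependence between river discharge and coastal sea level
f_q = stats.genextreme(c=-0.1, loc=800.0, scale=200.0)  # assumed discharge marginal (m3/s)
f_s = stats.genextreme(c=-0.05, loc=1.2, scale=0.3)     # assumed sea-level marginal (m)
biv = stats.multivariate_normal(mean=[0.0, 0.0], cov=[[1.0, rho], [rho, 1.0]])

q = np.linspace(800, 3000, 100)
s = np.linspace(1.2, 3.5, 100)
Q, S = np.meshgrid(q, s)
u = np.clip(f_q.cdf(Q), 1e-9, 1 - 1e-9)   # transform to unit domain
v = np.clip(f_s.cdf(S), 1e-9, 1 - 1e-9)
z1, z2 = stats.norm.ppf(u), stats.norm.ppf(v)

# Gaussian copula density, then the joint density of (Q, S)
cop = np.exp(-(rho**2 * (z1**2 + z2**2) - 2 * rho * z1 * z2)
             / (2 * (1 - rho**2))) / np.sqrt(1 - rho**2)
joint_pdf = cop * f_q.pdf(Q) * f_s.pdf(S)

# AND-exceedance probability P(Q > q, S > s) = 1 - u - v + C(u, v)
C = biv.cdf(np.dstack([z1, z2]))
p_and = 1 - u - v + C

# "most likely" compound event: highest joint density near the 1% isoline
dev = np.abs(p_and - 0.01)
band = dev <= np.quantile(dev, 0.01)      # grid cells closest to the isoline
i = np.argmax(np.where(band, joint_pdf, -np.inf))
print(f"design event: Q = {Q.flat[i]:.0f} m3/s, S = {S.flat[i]:.2f} m")
```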
Lin, Feng-Chang; Zhu, Jun
2012-01-01
We develop continuous-time models for the analysis of environmental or ecological monitoring data in which subjects are observed at multiple monitoring time points across space. Of particular interest are additive hazards regression models where the baseline hazard function can take flexible forms. We consider time-varying covariates and take into account spatial dependence via autoregression in space and time. We develop statistical inference for the regression coefficients via partial likelihood. Asymptotic properties, including consistency and asymptotic normality, are established for the parameter estimates under suitable regularity conditions. Feasible algorithms utilizing existing statistical software packages are developed for computation. We also consider a simpler additive hazards model with a homogeneous baseline hazard and develop hypothesis testing for homogeneity. A simulation study demonstrates that the statistical inference using partial likelihood has sound finite-sample properties and offers a viable alternative to maximum likelihood estimation. For illustration, we analyze data from an ecological study that monitors bark beetle colonization of red pines in a plantation in Wisconsin.
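For orientation, a minimal additive-hazards fit in Python using the Aalen additive model from the lifelines package; this is a generic stand-in that omits the paper's spatio-temporal autoregression and partial-likelihood machinery:

```python
from lifelines import AalenAdditiveFitter
from lifelines.datasets import load_regression_dataset

# bundled example data: covariates var1..var3, duration T, event indicator E
df = load_regression_dataset()

aaf = AalenAdditiveFitter(coef_penalizer=0.5)
aaf.fit(df, duration_col="T", event_col="E")

# time-varying cumulative regression functions, one column per covariate
print(aaf.cumulative_hazards_.head())
```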
Recent changes in flood damage in the United States from observations and ACME model
NASA Astrophysics Data System (ADS)
Leng, G.; Leung, L. R.
2017-12-01
Despite efforts to mitigate flood hazards in flood-prone areas, survey- and report-based flood databases show that flood damage has increased and has emerged as one of the most costly types of disaster in the United States since the 1990s. Understanding the mechanisms driving the changes in flood damage is therefore critical for reducing flood risk. In this study, we first conduct a comprehensive analysis of the changing characteristics of flood damage at the local, state and national levels. Results show a significant increasing trend in the number of flood hazards, causing economic losses of up to $7 billion per year. The ratio of flood events causing tangible economic cost to total flood events exhibited a non-significant increasing trend before 2007, followed by a significant decrease, indicating a changing vulnerability to floods. The analysis also reveals distinct spatial and temporal patterns in the threshold intensity of flood hazards with tangible economic cost. To understand the mechanisms behind the increasing flood damage, we develop a flood damage economic model coupled with the integrated hydrological modeling system of ACME, which features a river routing model with an inundation parameterization and a water use and regulation model. The model is evaluated against historical records over the whole country. Several numerical experiments are then designed to explore the mechanisms behind the recent changes in flood damage from the perspectives of flood hazard, exposure and vulnerability, which together constitute flood damage. The role of human activities, such as reservoir operations and water use, in modifying regional floods is also explored using the new tool, with the goal of improving the understanding and modeling of vulnerability to flood hazards.
Topography- and nightlight-based national flood risk assessment in Canada
NASA Astrophysics Data System (ADS)
Elshorbagy, Amin; Bharath, Raja; Lakhanpal, Anchit; Ceola, Serena; Montanari, Alberto; Lindenschmidt, Karl-Erich
2017-04-01
In Canada, flood analysis and water resource management, in general, are tasks conducted at the provincial level; therefore, unified national-scale approaches to water-related problems are uncommon. In this study, a national-scale flood risk assessment approach is proposed and developed. The study focuses on using global and national datasets available with various resolutions to create flood risk maps. First, a flood hazard map of Canada is developed using topography-based parameters derived from digital elevation models, namely, elevation above nearest drainage (EAND) and distance from nearest drainage (DFND). This flood hazard mapping method is tested on a smaller area around the city of Calgary, Alberta, against a flood inundation map produced by the city using hydraulic modelling. Second, a flood exposure map of Canada is developed using a land-use map and the satellite-based nightlight luminosity data as two exposure parameters. Third, an economic flood risk map is produced, and subsequently overlaid with population density information to produce a socioeconomic flood risk map for Canada. All three maps of hazard, exposure, and risk are classified into five classes, ranging from very low to severe. A simple way to include flood protection measures in hazard estimation is also demonstrated using the example of the city of Winnipeg, Manitoba. This could be done for the entire country if information on flood protection across Canada were available. The evaluation of the flood hazard map shows that the topography-based method adopted in this study is both practical and reliable for large-scale analysis. Sensitivity analysis regarding the resolution of the digital elevation model is needed to identify the resolution that is fine enough for reliable hazard mapping, but coarse enough for computational tractability. The nightlight data are found to be useful for exposure and risk mapping in Canada; however, uncertainty analysis should be conducted to investigate the effect of the overglow phenomenon on flood risk mapping.
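A small sketch of how the two topographic indices named above can be computed from a DEM with a distance transform; the toy DEM, drainage mask, and the weighting used to combine the indices into five classes are illustrative assumptions, not the paper's method:

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(1)
dem = np.cumsum(rng.normal(size=(200, 200)), axis=0)   # toy DEM (m)
drainage = dem < np.percentile(dem, 5)                 # toy drainage-network mask

# distance_transform_edt measures distance to the nearest zero cell, so invert
# the mask; return_indices also gives the location of that nearest drainage cell
dfnd, inds = ndimage.distance_transform_edt(~drainage, return_indices=True)
ii, jj = inds
eand = dem - dem[ii, jj]    # elevation above the nearest drainage cell

# five classes from an illustrative combined index; low values (low-lying,
# close to drainage) correspond to the most flood-prone cells
index = eand + 0.1 * dfnd   # weighting is an assumption, not the paper's rule
hazard_class = np.digitize(index, np.quantile(index, [0.2, 0.4, 0.6, 0.8]))
```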
Hydrology Analysis and Modelling for Klang River Basin Flood Hazard Map
NASA Astrophysics Data System (ADS)
Sidek, L. M.; Rostam, N. E.; Hidayah, B.; Roseli, ZA; Majid, W. H. A. W. A.; Zahari, N. Z.; Salleh, S. H. M.; Ahmad, R. D. R.; Ahmad, M. N.
2016-03-01
Flooding, a common environmental hazard worldwide, has increased in recent times as a result of climate change and urbanization, with the effects felt more strongly in developing countries. The exposure of Tenaga Nasional Berhad (TNB) substations to flooding has grown rapidly because many existing substations are located in flood-prone areas. To understand the impact of floods on its substations, TNB has adopted non-structural mitigation by integrating flood hazard maps with its substation network. Hydrological analysis is an important part of this work, providing the runoff input for the hydraulic modelling.
NASA Astrophysics Data System (ADS)
Apel, Heiko; Martínez Trepat, Oriol; Nghia Hung, Nguyen; Thi Chinh, Do; Merz, Bruno; Viet Dung, Nguyen
2016-04-01
Many urban areas experience both fluvial and pluvial floods, because locations next to rivers are preferred settlement areas and the predominantly sealed urban surface prevents infiltration and facilitates surface inundation. The latter problem is exacerbated in cities with insufficient or non-existent sewer systems. While there are a number of approaches to analysing either fluvial or pluvial flood hazard, studies of combined fluvial and pluvial flood hazard are hardly available. Thus this study aims to analyse fluvial and pluvial flood hazard individually, but also to develop a method for the analysis of combined pluvial and fluvial flood hazard. This combined fluvial-pluvial flood hazard analysis is performed for Can Tho city, the largest city in the Vietnamese part of the Mekong Delta, as an example. In this tropical environment, the annual monsoon-triggered floods of the Mekong River can coincide with heavy local convective precipitation events, causing both fluvial and pluvial flooding at the same time. The fluvial flood hazard was estimated with a copula-based bivariate extreme value statistic for the gauge Kratie at the upper boundary of the Mekong Delta and a large-scale hydrodynamic model of the Mekong Delta. This provided the boundary conditions for two-dimensional hydrodynamic inundation simulations of Can Tho city. The pluvial hazard was estimated by a peak-over-threshold frequency estimation based on local rain gauge data and a stochastic rainstorm generator. Inundation for all flood scenarios was simulated by a two-dimensional hydrodynamic model implemented on a Graphics Processing Unit (GPU) for time-efficient flood propagation modelling. The combined fluvial-pluvial flood scenarios were derived by adding rainstorms to the fluvial flood events during the highest fluvial water levels. The probabilities of occurrence of the combined events were determined assuming independence of the two flood types and taking the seasonality and probability of coincidence into account. All hazards - fluvial, pluvial and combined - were accompanied by an uncertainty estimation accounting for the natural variability of the flood events. This resulted in probabilistic flood hazard maps showing the maximum inundation depths for a selected set of probabilities of occurrence, with the expectation (median) shown alongside percentile maps quantifying the uncertainty. The results are critically discussed and their use in flood risk management is outlined.
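As a rough illustration of the probability-combination step described above (independence of the two flood types plus a seasonal coincidence window), here is a back-of-envelope sketch; all numbers are illustrative assumptions, not values from the study:

```python
# annual probability of a compound event = P(fluvial event) x P(at least one
# heavy rainstorm falling inside the window of highest fluvial water levels)
p_fluvial = 1 / 50            # e.g. a 50-year Mekong flood
n_storms_per_season = 12      # assumed heavy convective storms per monsoon season
window_days, season_days = 14, 150

p_storm_in_window = 1 - (1 - window_days / season_days) ** n_storms_per_season
p_compound = p_fluvial * p_storm_in_window
print(f"annual compound probability ~ {p_compound:.4f} "
      f"(return period ~ {1 / p_compound:.0f} years)")
```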
Dust Hazard Management in the Outer Solar System
NASA Technical Reports Server (NTRS)
Seal, David A.
2012-01-01
Most robotic missions to the outer solar system must grapple with the hazards posed by the dusty rings of the gas giants. Early assessments of these hazards led simply to ring avoidance, owing to insufficient data and high uncertainties on the dust population present in such rings. Recent approaches, principal among them the Cassini dust hazard management strategy, provide useful results from detailed modeling of spacecraft vulnerabilities and dust hazard regions, which, along with the range of mission trajectories, are used to assess the risks posed by each passage through a zone of potential hazard. This paper presents the general approach used to implement the analysis for Cassini, with recommendations for future outer planet missions.
NASA Astrophysics Data System (ADS)
Chapman, Martin Colby
1998-12-01
The design earthquake selection problem is fundamentally probabilistic. Disaggregation of a probabilistic model of the seismic hazard offers a rational and objective approach that can identify the most likely earthquake scenario(s) contributing to hazard. An ensemble of time series can be selected on the basis of the modal earthquakes derived from the disaggregation. This gives a useful time-domain realization of the seismic hazard, to the extent that a single motion parameter captures the important time-domain characteristics. A possible limitation to this approach arises because most currently available motion prediction models for peak ground motion or oscillator response are essentially independent of duration, and modal events derived using the peak motions may not represent the optimal characterization of the hazard. The elastic input energy spectrum is an alternative to the elastic response spectrum for these types of analyses. The input energy combines the elements of amplitude and duration into a single-parameter description of the ground motion that can be readily incorporated into standard probabilistic seismic hazard analysis methodology. This use of the elastic input energy spectrum is examined here. Regression analysis is performed using strong motion data from western North America and consistent data processing procedures for both the absolute input energy equivalent velocity, V_ea, and the elastic pseudo-relative velocity response, PSV, in the frequency range 0.5 to 10 Hz. The results show that the two parameters can be successfully fit with identical functional forms. The dependence of V_ea and PSV upon (NEHRP) site classification is virtually identical. The variance of V_ea is uniformly less than that of PSV, indicating that V_ea can be predicted with slightly less uncertainty as a function of magnitude, distance and site classification. The effects of site class are important at frequencies less than a few Hertz; the regression modeling does not resolve significant site-class effects at frequencies greater than approximately 5 Hz. Disaggregation of general seismic hazard models using V_ea indicates that the modal magnitudes for the higher frequency oscillators tend to be larger, and vary less with oscillator frequency, than those derived using PSV. Insofar as the elastic input energy may be a better parameter for quantifying the damage potential of ground motion, its use in probabilistic seismic hazard analysis could provide an improved means for selecting earthquake scenarios and establishing design earthquakes for many types of engineering analyses.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tang, A; Samost, A; Viswanathan, A
Purpose: To investigate the hazards in cervical-cancer HDR brachytherapy using a novel hazard-analysis technique, System Theoretic Process Analysis (STPA). The applicability and benefit of STPA to the field of radiation oncology is demonstrated. Methods: We analyzed the tandem and ring HDR procedure through observations, discussions with physicists and physicians, and the use of a previously developed process map. Controllers and their respective control actions were identified and arranged into a hierarchical control model of the system, modeling the workflow from applicator insertion through initiating treatment delivery. We then used the STPA process to identify potentially unsafe control actions. Scenarios were then generated from the identified unsafe control actions and used to develop recommendations for system safety constraints. Results: 10 controllers were identified and included in the final model. From these controllers 32 potentially unsafe control actions were identified, leading to more than 120 potential accident scenarios, including both clinical errors (e.g., using outdated imaging studies for planning) and managerial-based incidents (e.g., unsafe equipment, budget, or staffing decisions). Constraints identified from those scenarios include common themes, such as the need for appropriate feedback to give the controllers an adequate mental model to maintain safe boundaries of operations. As an example, one finding was that the likelihood of the potential accident scenario of the applicator breaking during insertion might be reduced by establishing a feedback loop of equipment-usage metrics and equipment-failure reports to the management controller. Conclusion: The utility of STPA in analyzing system hazards in a clinical brachytherapy system was demonstrated. This technique, rooted in system theory, identified scenarios both technical/clinical and managerial in nature. These results suggest that STPA can be successfully used to analyze safety in brachytherapy and may prove to be an alternative to other hazard analysis techniques.
NASA Astrophysics Data System (ADS)
Garcia-Fernandez, Mariano; Assatourians, Karen; Jimenez, Maria-Jose
2018-01-01
Extreme natural hazard events have the potential to cause significant disruption to critical infrastructure (CI) networks. Among them, earthquakes represent a major threat: sudden-onset events with limited, if any, capability of forecast, and high damage potential. In recent years, the increased exposure of interdependent systems has heightened concern, motivating the need for a framework for the management of these increased hazards. The seismic performance level and resilience of existing non-nuclear CIs can be analyzed by identifying the ground motion input values leading to failure of selected key elements. The main interest focuses on ground motions exceeding the original design values, which should correspond to low-probability occurrences. A seismic hazard methodology has been specifically developed to consider low-probability ground motions affecting elongated CI networks. The approach is based on Monte Carlo simulation, which allows long-duration synthetic earthquake catalogs to be built and low-probability amplitudes to be derived. This approach does not affect the mean hazard values and yields a representation of maximum amplitudes that follow a general extreme-value distribution. This facilitates the analysis of the occurrence of extremes, i.e., very low probabilities of exceedance from unlikely combinations, for the development of, e.g., stress tests, among other applications. Following this methodology, extreme ground-motion scenarios have been developed for selected combinations of modeling inputs, including seismic activity models (source model and magnitude-recurrence relationship), ground motion prediction equations (GMPEs), hazard levels, and fractiles of extreme ground motion. The results provide an overview of the effects of the different hazard modeling inputs on the generated extreme motion hazard scenarios. This approach to seismic hazard is at the core of the risk analysis procedure developed and applied to European CI transport networks within the framework of the European-funded INFRARISK project. Such an operational seismic hazard framework can be used to provide timely insight for making informed risk-management or regulatory decisions on the required level of detail or on the adoption of measures, the cost of which can be balanced against their benefits.
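A compact Monte Carlo sketch of the long-synthetic-catalog idea described above: sample yearly event counts and magnitudes from a truncated Gutenberg-Richter law, attach a toy attenuation relation with lognormal scatter, and read extreme fractiles from the annual maxima. The GMPE form and all parameter values are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(42)
b, m_min, m_max = 1.0, 5.0, 8.0   # assumed magnitude-recurrence parameters
rate = 0.2                        # events/yr above m_min in the source zone
years = 100_000                   # long catalog -> stable extreme fractiles

beta = b * np.log(10)

def sample_magnitudes(n):
    # inverse-CDF sampling of the doubly truncated exponential (GR) law
    u = rng.uniform(size=n)
    c = 1 - np.exp(-beta * (m_max - m_min))
    return m_min - np.log(1 - u * c) / beta

def gmpe_pga(m, r_km):
    # toy attenuation: ln PGA(g) = -4.0 + 1.1*m - 1.3*ln(r) + 0.6*eps
    return np.exp(-4.0 + 1.1 * m - 1.3 * np.log(r_km)
                  + 0.6 * rng.normal(size=m.size))

annual_max = np.zeros(years)
n_events = rng.poisson(rate, size=years)
for yr in range(years):
    if n_events[yr]:
        m = sample_magnitudes(n_events[yr])
        r = rng.uniform(10, 100, size=n_events[yr])  # site-source distances (km)
        annual_max[yr] = gmpe_pga(m, r).max()

print("median annual max PGA (g):", np.quantile(annual_max, 0.5))
print("99.9th percentile (extreme fractile):", np.quantile(annual_max, 0.999))
```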
Lee, Saro; Park, Inhye
2013-09-30
Subsidence of the ground caused by underground mining poses hazards to human life and property. This study analyzed ground-subsidence hazard using factors that can affect ground subsidence and a decision tree approach in a geographic information system (GIS). The study area was Taebaek, Gangwon-do, Korea, where many abandoned underground coal mines exist. Spatial data on topography, geology, and various ground-engineering properties of the subsidence area were collected and compiled in a database for mapping ground-subsidence hazard (GSH). The subsidence area was randomly split 50/50 for training and validation of the models. A data-mining classification technique was applied to the GSH mapping, and decision trees were constructed using the chi-squared automatic interaction detector (CHAID) and the quick, unbiased, and efficient statistical tree (QUEST) algorithms. The frequency ratio model was also applied to the GSH mapping for comparison with the probabilistic model. The resulting GSH maps were validated using area-under-the-curve (AUC) analysis with the subsidence-area data that had not been used for training the model. The highest accuracy was achieved by the decision tree model using the CHAID algorithm (94.01%), compared with the QUEST algorithm (90.37%) and the frequency ratio model (86.70%). These accuracies are higher than previously reported results for decision trees. Decision tree methods can therefore be used efficiently for GSH analysis and might be widely applicable to the prediction of various spatial events. Copyright © 2013. Published by Elsevier Ltd.
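For orientation, a hedged sketch of the decision-tree hazard-mapping workflow: CHAID and QUEST are not available in scikit-learn, so a CART tree stands in for illustration, while the 50/50 split and AUC validation mirror the procedure described above. The data are synthetic assumptions:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(3)
# stand-in predictors, e.g. depth of workings, slope, geology class, groundwater
X = rng.normal(size=(1500, 6))
y = (X[:, 0] - 0.8 * X[:, 2] + rng.normal(size=1500) > 0.5).astype(int)

# 50/50 training/validation split, as in the study
X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.5, random_state=0)
tree = DecisionTreeClassifier(max_depth=5, random_state=0).fit(X_tr, y_tr)

p = tree.predict_proba(X_va)[:, 1]   # per-cell ground-subsidence hazard score
print("validation AUC:", round(roc_auc_score(y_va, p), 3))
```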
Hazard assessment for small torrent catchments - lessons learned
NASA Astrophysics Data System (ADS)
Eisl, Julia; Huebl, Johannes
2013-04-01
The documentation of extreme events, as part of the integral risk management cycle, is an important basis for the analysis and assessment of natural hazards. In July 2011 a flood event occurred in the Wölzer valley in the province of Styria, Austria. For this event at the Wölzerbach a detailed event documentation was carried out, gathering data on rainfall, runoff and sediment transport, as well as information on damaged objects, infrastructure and crops, using various sources. The flood was triggered by heavy rainfall in two tributaries of the Wölzer river. Though both a rain gauge and a discharge gauging station exist on the Wölzer river, the torrents affected by the high-intensity rainfall are ungauged. For these ungauged torrent catchments the common methods of hazard assessment were evaluated. The back-calculation of the rainfall event was done using a new approach for precipitation analysis, since in torrent catchments it is mainly small-scale, high-intensity rainfall events that are responsible for extreme events. Austria's weather surveillance radar is operated by the air traffic service AustroControl. The routinely available dataset is pre-interpreted and shows discrepancies, especially for high-intensity rainfall, so for this study the raw radar data were requested and analysed. The event was then back-calculated with different rainfall-runoff models, hydraulic models and sediment transport models to obtain calibration parameters for future use in hazard assessment for this region. Since woody debris often causes problems, different scenarios were simulated. The calibrated and plausible results from the runoff models were compared with the empirical approaches used in practice. For the planning of mitigation measures in the Schöttl torrent, one of the affected tributaries of the Wölzer river, a physical scale model was used, in addition to the insights of the event analysis, to design a check dam for sediment retention. Because the transport capacity of the lower reaches is limited, a balance had to be found between protection on the one hand and sediment connectivity to the Wölzer river on the other. The lessons learned initiated discussions on future hazard assessment, especially concerning the use of rainfall data and design precipitation values for small torrent catchments. The comparison with empirical values also showed the need for differentiated concepts of hazard analysis. Therefore, recommendations for the use of spatial rainfall reduction factors, as well as for the demarcation of hazard maps using different event scenarios, are proposed.
A Tutorial on Multilevel Survival Analysis: Methods, Models and Applications
Austin, Peter C.
2017-01-01
Data that have a multilevel structure occur frequently across a range of disciplines, including epidemiology, health services research, public health, education and sociology. We describe three families of regression models for the analysis of multilevel survival data. First, Cox proportional hazards models with mixed effects incorporate cluster-specific random effects that modify the baseline hazard function. Second, piecewise exponential survival models partition the duration of follow-up into mutually exclusive intervals and fit a model that assumes that the hazard function is constant within each interval. This is equivalent to a Poisson regression model that incorporates the duration of exposure within each interval. By incorporating cluster-specific random effects, generalised linear mixed models can be used to analyse these data. Third, after partitioning the duration of follow-up into mutually exclusive intervals, one can use discrete time survival models that use a complementary log-log generalised linear model to model the occurrence of the outcome of interest within each interval. Random effects can be incorporated to account for within-cluster homogeneity in outcomes. We illustrate the application of these methods using data consisting of patients hospitalised with a heart attack. We illustrate the application of these methods using three statistical programming languages (R, SAS and Stata). PMID:29307954
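As a sketch of the piecewise-exponential equivalence described above (interval splitting plus a Poisson model with a log-exposure offset), assuming simulated data; the tutorial's languages are R, SAS and Stata, but the same idea in Python with statsmodels looks like this (the cluster-specific random effects are omitted):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 500
age = rng.normal(65, 10, n)
time = rng.exponential(scale=np.exp(5 - 0.02 * (age - 65)))  # survival times (days)
event = (time < 365).astype(int)        # observed within one year?
time = np.minimum(time, 365)            # administrative censoring at 365 days

# split each subject's follow-up into mutually exclusive intervals
cuts = [0, 30, 90, 180, 365]
rows = []
for t, e, a in zip(time, event, age):
    for lo, hi in zip(cuts[:-1], cuts[1:]):
        if t <= lo:
            break
        rows.append({"interval": f"[{lo},{hi})", "age": a,
                     "events": int(e and t <= hi),       # event in this interval?
                     "exposure": min(t, hi) - lo})       # time at risk here
df = pd.DataFrame(rows)

# Poisson GLM with log(exposure) offset = piecewise exponential survival model
glm = sm.GLM.from_formula("events ~ C(interval) + age", data=df,
                          family=sm.families.Poisson(),
                          offset=np.log(df["exposure"]))
print(glm.fit().summary().tables[1])
```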
Vulnerability Assessment Using LIDAR Data in Silang-Sta Rosa Subwatershed, Philippines
NASA Astrophysics Data System (ADS)
Bragais, M. A.; Magcale-Macandog, D. B.; Arizapa, J. L.; Manalo, K. M.
2016-10-01
Silang-Sta. Rosa Subwatershed is experiencing rapid urbanization. Its downstream area is already urbanized and development is moving fast upstream. With the rapid conversion of pervious to impervious areas and the increased frequency of intense rainfall events, the downstream part of the watershed is at risk of flood hazard. The widely used freeware HEC-RAS (Hydrologic Engineering Center - River Analysis System) model was used to implement a 2D unsteady flow analysis and develop a flood hazard map. The LiDAR-derived digital elevation model (DEM) with 1 m resolution provided the detailed terrain that is vital for producing a reliable flood extent map for use in an early warning system. With detailed information from the simulation, such as the areas to be flooded and the predicted depth and duration, specific flood forecasting and mitigation plans can now be provided even at the community level. The methodology of using 2D unsteady flow modelling and a high-resolution DEM can be replicated in neighbouring watersheds, especially those not yet urbanized, so that their development can be guided to be flood-hazard resilient. Local government units (LGUs) all over the country will benefit from having a high-resolution flood hazard map.
Kernel Smoothing Methods for Non-Poissonian Seismic Hazard Analysis
NASA Astrophysics Data System (ADS)
Woo, Gordon
2017-04-01
For almost fifty years, the mainstay of probabilistic seismic hazard analysis has been the methodology developed by Cornell, which assumes that earthquake occurrence is a Poisson process and that the spatial distribution of epicentres can be represented by a set of polygonal source zones within which seismicity is uniform. Building on Vere-Jones' use of kernel smoothing methods for earthquake forecasting, these methods were adapted in 1994 by the author for application to probabilistic seismic hazard analysis. There is then no need for ambiguous boundaries of polygonal source zones, nor for the hypothesis of time independence of earthquake sequences. In Europe, there are many regions where seismotectonic zones are not well delineated and where there is dynamic stress interaction between events, so that they cannot be described as independent. After the Amatrice earthquake of 24 August 2016, the damaging earthquakes that struck Central Italy over the following months were not independent events. Removing foreshocks and aftershocks is not only an ill-defined task; it has a material effect on seismic hazard computation. Because of the spatial dispersion of epicentres, and the clustering of magnitudes for the largest events in a sequence, which might all be around magnitude 6, the specific event causing the highest ground motion can vary from one site location to another. Where significant active faults have been clearly identified geologically, they should be modelled as individual seismic sources. The remaining background seismicity should be modelled as non-Poissonian using statistical kernel smoothing methods. This approach was first applied for seismic hazard analysis at a UK nuclear power plant two decades ago, and should be included within logic trees for future probabilistic seismic hazard analyses at critical installations within Europe. In this paper, various salient European applications are given.
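A minimal sketch of the kernel-smoothed activity rate that replaces polygonal source zones, using a fixed-bandwidth Gaussian kernel from scipy as a stand-in for the adaptive kernels used in practice; the toy catalogue is invented for illustration:

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(11)
# toy catalogue: two clusters of epicentres (lon, lat) observed over 100 years
lon = np.concatenate([rng.normal(13.2, 0.3, 150), rng.normal(14.0, 0.2, 80)])
lat = np.concatenate([rng.normal(42.7, 0.3, 150), rng.normal(42.0, 0.2, 80)])

kde = gaussian_kde(np.vstack([lon, lat]))   # smoothed epicentre density
rate_per_year = len(lon) / 100.0            # total activity rate of the catalogue

def activity_rate(x, y):
    """Events per year per unit area at (x, y), with no zone boundaries."""
    return rate_per_year * kde([[x], [y]])[0]

print(activity_rate(13.2, 42.7))   # high, inside a cluster
print(activity_rate(15.0, 44.0))   # low, far from the seismicity
```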
Hazard function theory for nonstationary natural hazards
NASA Astrophysics Data System (ADS)
Read, L. K.; Vogel, R. M.
2015-11-01
Impact from natural hazards is a shared global problem that causes tremendous loss of life and property, economic cost, and damage to the environment. Increasingly, many natural processes show evidence of nonstationary behavior including wind speeds, landslides, wildfires, precipitation, streamflow, sea levels, and earthquakes. Traditional probabilistic analysis of natural hazards based on peaks over threshold (POT) generally assumes stationarity in the magnitudes and arrivals of events, i.e. that the probability of exceedance of some critical event is constant through time. Given increasing evidence of trends in natural hazards, new methods are needed to characterize their probabilistic behavior. The well-developed field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (X) with its failure time series (T), enabling computation of corresponding average return periods, risk and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose POT magnitudes are assumed to follow the widely applied Generalized Pareto (GP) model. We derive the hazard function for this case and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard event series X, with corresponding failure time series T, should have application to a wide class of natural hazards with rich opportunities for future extensions.
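The paper's derivation is not reproduced in the abstract, but for orientation, the hazard rate of the generalized Pareto magnitude model itself has a simple closed form (a standard result, shown as a sketch; note the paper's hazard function is defined over failure time T rather than magnitude X):

```latex
% GP model with scale sigma > 0 and shape xi:
\[
  f(x) = \frac{1}{\sigma}\left(1 + \frac{\xi x}{\sigma}\right)^{-1/\xi - 1},
  \qquad
  S(x) = \left(1 + \frac{\xi x}{\sigma}\right)^{-1/\xi},
\]
\[
  h(x) = \frac{f(x)}{S(x)} = \frac{1}{\sigma + \xi x},
\]
% so the rate reduces to the constant exponential rate 1/sigma at xi = 0,
% decreases in x for xi > 0, and increases in x for xi < 0.
```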
NASA Astrophysics Data System (ADS)
Grasso, S.; Maugeri, M.
After the Summit held in Washington on August 20-22, 2001 to plan the first World Conference on the mitigation of Natural Hazards, a Group for the analysis of Natural Hazards within the Mediterranean area was formed. The Group has so far identified the following hazards: (1) seismic hazard (including the hazard to historical buildings); (2) hazard linked to the quantity and quality of water; (3) landslide hazard; (4) volcanic hazard. The analysis of such hazards implies the creation and management of data banks, which can only be used if the data are properly geo-referenced to allow their combined use. The results obtained must therefore be represented on geo-referenced maps. The present study is part of a research programme, namely "Detailed Scenarios and Actions for Seismic Prevention of Damage in the Urban Area of Catania", financed by the National Department for Civil Protection and the National Research Council-National Group for the Defence Against Earthquakes (CNR-GNDT). The south-eastern area of Sicily, the "Iblea" seismic area, is considered one of the most intense seismic zones in Italy, based on its past and current seismic history and on the typology of its civil buildings. Safety against earthquake hazards has two aspects: structural safety against potentially destructive dynamic forces, and site safety related to geotechnical phenomena such as amplification, landsliding and soil liquefaction. The correct evaluation of seismic hazard is therefore highly affected by risk factors due to the geological nature and geotechnical properties of soils. The effect of local geotechnical conditions on the damage suffered by buildings under seismic conditions has been widely recognized, as demonstrated by the Manual for Zonation on Seismic Geotechnical Hazards edited by the International Society for Soil Mechanics and Geotechnical Engineering (TC4, 1999). The evaluation of local amplification effects may be carried out by means of either rigorous complex methods of analysis or qualitative procedures. A semi-quantitative procedure based on the definition of a geotechnical hazard index has been applied for the zonation of the seismic geotechnical hazard of the city of Catania. In particular, this procedure has been applied to define the influence of the geotechnical properties of the soil in a central area of the city, where several historical buildings of great importance are sited. An investigation was also carried out based on the inspection of more than one hundred historically important ecclesiastical buildings located in the city. Then, in order to identify the amplification effects due to site conditions, a geotechnical survey form was prepared to allow a semi-quantitative evaluation of the seismic geotechnical hazard for all these historical buildings. In addition, to evaluate the time-history response of the foundation soil, a 1-D dynamic soil model was employed for all these buildings, considering the non-linearity of soil behaviour. Using a GIS, a map of the seismic geotechnical hazard, a map of the liquefaction hazard and a preliminary map of the seismic hazard for the city of Catania have been obtained. From the analysis of the results it may be noticed that the high-hazard zones are mainly clayey sites.
SCEC Earthquake System Science Using High Performance Computing
NASA Astrophysics Data System (ADS)
Maechling, P. J.; Jordan, T. H.; Archuleta, R.; Beroza, G.; Bielak, J.; Chen, P.; Cui, Y.; Day, S.; Deelman, E.; Graves, R. W.; Minster, J. B.; Olsen, K. B.
2008-12-01
The SCEC Community Modeling Environment (SCEC/CME) collaboration performs basic scientific research using high performance computing, with the goal of developing a predictive understanding of earthquake processes and seismic hazards in California. SCEC/CME research areas include dynamic rupture modeling, wave propagation modeling, probabilistic seismic hazard analysis (PSHA), and full 3D tomography. SCEC/CME computational capabilities are organized around the development and application of robust, re-usable, well-validated simulation systems we call computational platforms. The SCEC earthquake system science research program includes a wide range of numerical modeling efforts, and we continue to extend our numerical modeling codes to include more realistic physics and to run at ever higher resolution. During this year, the SCEC/USGS OpenSHA PSHA computational platform was used to calculate PSHA hazard curves and hazard maps using the new UCERF2.0 ERF and new 2008 attenuation relationships. Three SCEC/CME modeling groups ran 1 Hz ShakeOut simulations using different codes and computer systems and carefully compared the results. The DynaShake Platform was used to calculate several dynamic rupture-based source descriptions equivalent in magnitude and final surface slip to the ShakeOut 1.2 kinematic source description. A SCEC/CME modeler produced 10 Hz synthetic seismograms for the ShakeOut 1.2 scenario rupture by combining 1 Hz deterministic simulation results with 10 Hz stochastic seismograms. SCEC/CME modelers ran an ensemble of seven ShakeOut-D simulations to investigate the variability of ground motions produced by dynamic rupture-based source descriptions. The CyberShake Platform was used to calculate more than 15 new PSHA hazard curves using full 3D waveform modeling and the new UCERF2.0 ERF. The SCEC/CME group has also produced significant computer science results this year. Large-scale SCEC/CME high performance codes were run on NSF TeraGrid sites, including simulations that used the full PSC Big Ben supercomputer (4096 cores) and simulations that ran on more than 10K cores at TACC Ranger. The SCEC/CME group used scientific workflow tools and grid computing to run more than 1.5 million jobs at NCSA for the CyberShake project. Visualizations produced by a SCEC/CME researcher of the 10 Hz ShakeOut 1.2 scenario simulation data were used by USGS in ShakeOut publications and public outreach efforts. OpenSHA was ported onto an NSF supercomputer and was used to produce very high resolution PSHA hazard maps containing more than 1.6 million hazard curves.
Attrition in Psychotherapy: A Survival Analysis
ERIC Educational Resources Information Center
Roseborough, David John; McLeod, Jeffrey T.; Wright, Florence I.
2016-01-01
Purpose: Attrition is a common problem in psychotherapy and can be defined as clients ending treatment before achieving an optimal response. Method: This longitudinal, archival study utilized data for 3,728 clients, using the Outcome Questionnaire 45.2. A Cox regression proportional hazards (hazard ratios) model was used in order to better…
Statistical analysis of the uncertainty related to flood hazard appraisal
NASA Astrophysics Data System (ADS)
Notaro, Vincenza; Freni, Gabriele
2015-12-01
The estimation of flood hazard frequency statistics for an urban catchment is of great interest in practice: it provides the evaluation of potential flood risk and related damage and supports decision-making for flood risk management. Flood risk is usually defined as a function of the probability that a system deficiency causes flooding (hazard) and the expected damage due to the flooding magnitude (damage), taking into account both the exposure and the vulnerability of the goods at risk. The expected flood damage can be evaluated by an a priori estimation of the potential damage caused by flooding or by interpolating real damage data. With regard to flood hazard appraisal, several procedures propose identifying a hazard indicator (HI), such as flood depth or the combination of flood depth and velocity, and assessing the flood hazard of the analysed area by comparing the HI variables with user-defined threshold values or curves (penalty curves or matrices). However, flooding data are usually unavailable or too piecemeal to allow a reliable flood hazard analysis, so the hazard analysis is often performed by means of mathematical simulations aimed at evaluating water levels and flow velocities over the catchment surface. As a result, a great part of the uncertainty intrinsic to flood risk appraisal can be related to the hazard evaluation, due to the uncertainty inherent in the modelling results and to the subjectivity of the user-defined hazard thresholds that link flood depth to a hazard level. In the present work, a statistical methodology is proposed for evaluating and reducing the uncertainties connected with hazard level estimation. The methodology has been applied to a real urban watershed as a case study.
NASA Astrophysics Data System (ADS)
Power, William; Wang, Xiaoming; Lane, Emily; Gillibrand, Philip
2013-09-01
Regional source tsunamis represent a potentially devastating threat to coastal communities in New Zealand, yet are infrequent events for which little historical information is available. It is therefore essential to develop robust methods for quantitatively estimating the hazards posed, so that effective mitigation measures can be implemented. We develop a probabilistic model for the tsunami hazard posed to the Auckland region of New Zealand from the Kermadec Trench and the southern New Hebrides Trench subduction zones. An innovative feature of our model is the systematic analysis of uncertainty regarding the magnitude-frequency distribution of earthquakes in the source regions. The methodology is first used to estimate the tsunami hazard at the coastline, and then used to produce a set of scenarios that can be applied to produce probabilistic maps of tsunami inundation for the study region; the production of these maps is described in part II. We find that the 2,500 year return period regional source tsunami hazard for the densely populated east coast of Auckland is dominated by events originating in the Kermadec Trench, while the equivalent hazard to the sparsely populated west coast is approximately equally due to events on the Kermadec Trench and the southern New Hebrides Trench.
A general framework for parametric survival analysis.
Crowther, Michael J; Lambert, Paul C
2014-12-30
Parametric survival models are being increasingly used as an alternative to the Cox model in biomedical research. Through direct modelling of the baseline hazard function, we can gain greater understanding of the risk profile of patients over time, obtaining absolute measures of risk. Commonly used parametric survival models, such as the Weibull, make restrictive assumptions about the baseline hazard function, such as monotonicity, which is often violated in clinical datasets. In this article, we extend the general framework of parametric survival models proposed by Crowther and Lambert (Journal of Statistical Software 53:12, 2013) to incorporate relative survival, and robust and cluster robust standard errors. We describe the general framework through three applications to clinical datasets, in particular illustrating the use of restricted cubic splines, modelled on the log hazard scale, to provide a highly flexible survival modelling framework. Through the use of restricted cubic splines, we can derive the cumulative hazard function analytically beyond the boundary knots, resulting in a combined analytic/numerical approach, which substantially improves the estimation process compared with using numerical integration alone. User-friendly Stata software is provided, which significantly extends the parametric survival models available in standard software. Copyright © 2014 John Wiley & Sons, Ltd.
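A rough numerical sketch of the core idea, assuming an invented piecewise-linear log-hazard in log time as a crude stand-in for restricted cubic splines; the actual framework derives the cumulative hazard analytically beyond the boundary knots, which this toy version does not:

```python
import numpy as np
from scipy.integrate import quad

knot = np.log(2.0)            # single interior knot on the log-time scale
b0, b1, b2 = -2.0, 0.8, -0.3  # illustrative coefficients (assumed, not fitted)

def log_hazard(t):
    # crude piecewise-linear "spline" in log time, standing in for
    # restricted cubic splines on the log-hazard scale
    x = np.log(t)
    return b0 + b1 * x + b2 * max(x - knot, 0.0)

def survival(t):
    # S(t) = exp(-H(t)), with the cumulative hazard obtained numerically
    H, _ = quad(lambda u: np.exp(log_hazard(u)), 1e-8, t)
    return np.exp(-H)

for t in (0.5, 1.0, 2.0, 5.0):
    print(f"S({t}) = {survival(t):.3f}")
```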
Murphy, Elizabeth A.; Straub, Timothy D.; Soong, David T.; Hamblen, Christopher S.
2007-01-01
Results of the hydrologic model, flood-frequency, hydraulic model, and flood-hazard analyses of the Blackberry Creek watershed in Kendall County, Illinois, indicate that the 100-year and 500-year flood plains cover approximately 3,699 and 3,762 acres of land, respectively. On the basis of land-cover data for 2003, most of the land in the flood plains was cropland and residential land. Although many acres of residential land were included in the flood plain, this land was mostly lawns, with 25 homes within the 100-year flood plain and 41 homes within the 500-year flood plain in the 2003 aerial photograph. This report describes the data collection activities undertaken to refine the hydrologic and hydraulic models used in an earlier study of the Kane County part of the Blackberry Creek watershed and to extend the flood-frequency analysis through water year 2003. The results of the flood-hazard analysis are presented in graphical and tabular form. The hydrologic model, Hydrological Simulation Program - FORTRAN (HSPF), was used to simulate continuous water movement through the various land-use patterns in the watershed. Flood-frequency analysis was applied to an annual maximum series to determine flood quantiles in subbasins for the flood-hazard analysis. The Hydrologic Engineering Center - River Analysis System (HEC-RAS) hydraulic model was used to determine the 100-year and 500-year flood elevations and the 100-year floodway. The hydraulic model was calibrated and verified using observations during three storms at two crest-stage gages and at the U.S. Geological Survey streamflow-gaging station near Yorkville. Digital maps of the 100-year and 500-year flood plains and the 100-year floodway were compiled for each tributary and the main stem of Blackberry Creek.
Quantification of Treatment Effect Modification on Both an Additive and Multiplicative Scale
Girerd, Nicolas; Rabilloud, Muriel; Pibarot, Philippe; Mathieu, Patrick; Roy, Pascal
2016-01-01
Background: In both observational and randomized studies, associations with overall survival are by and large assessed on a multiplicative scale using the Cox model. However, clinicians and clinical researchers have an ardent interest in assessing the absolute benefit associated with treatments. In older patients, some studies have reported a lower relative treatment effect, which might translate into a similar or even greater absolute treatment effect given their high baseline hazard for clinical events. Methods: The effect of treatment and the effect modification of treatment were respectively assessed using a multiplicative and an additive hazard model in an analysis adjusted for propensity score in the context of coronary surgery. Results: The multiplicative model yielded a lower relative hazard reduction with bilateral internal thoracic artery grafting in older patients (hazard ratio for interaction/year = 1.03, 95% CI: 1.00 to 1.06, p = 0.05), whereas the additive model reported a similar absolute hazard reduction with increasing age (delta for interaction/year = 0.10, 95% CI: -0.27 to 0.46, p = 0.61). The number needed to treat derived from the propensity score-adjusted multiplicative model was remarkably similar at the end of follow-up in patients aged ≤60 and in patients >70. Conclusions: The present example demonstrates that a lower treatment effect in older patients on a relative scale can conversely translate into a similar treatment effect on an additive scale, due to large baseline hazard differences. Importantly, absolute risk reduction, either crude or adjusted, can be calculated from multiplicative survival models. We advocate a wider use of the absolute scale, especially using additive hazard models, to assess treatment effect and treatment effect modification. PMID:27045168
Probabilistic seismic hazard analysis for a nuclear power plant site in southeast Brazil
NASA Astrophysics Data System (ADS)
de Almeida, Andréia Abreu Diniz; Assumpção, Marcelo; Bommer, Julian J.; Drouet, Stéphane; Riccomini, Claudio; Prates, Carlos L. M.
2018-05-01
A site-specific probabilistic seismic hazard analysis (PSHA) has been performed for the only nuclear power plant site in Brazil, located 130 km southwest of Rio de Janeiro at Angra dos Reis. Logic trees were developed for both the seismic source characterisation and ground-motion characterisation models, in both cases seeking to capture the appreciable ranges of epistemic uncertainty with relatively few branches. This logic-tree structure allowed the hazard calculations to be performed efficiently while obtaining results that reflect the inevitable uncertainty in long-term seismic hazard assessment in this tectonically stable region. An innovative feature of the study is an additional seismic source zone added to capture the potential contributions of characteristic earthquakes associated with geological faults in the region surrounding the coastal site.
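A minimal sketch of how such a logic tree collapses to a mean hazard curve: each branch (source model x GMPE combination) contributes a hazard curve weighted by its branch weight. The curves and weights below are illustrative assumptions, not values from the Angra dos Reis study:

```python
import numpy as np

pga = np.logspace(-2, 0, 50)   # PGA grid (g)

def toy_hazard_curve(a0, slope):
    # annual exceedance frequency falling off as a power law in PGA
    return a0 * (pga / 0.01) ** (-slope)

branches = [                   # (weight, hazard curve) per logic-tree branch
    (0.4, toy_hazard_curve(1e-2, 2.0)),   # e.g. source model A + GMPE 1
    (0.3, toy_hazard_curve(2e-2, 2.4)),   # e.g. source model A + GMPE 2
    (0.3, toy_hazard_curve(5e-3, 1.8)),   # e.g. source model B + GMPE 1
]
assert abs(sum(w for w, _ in branches) - 1.0) < 1e-9

mean_curve = sum(w * c for w, c in branches)   # weight-averaged hazard

# e.g. PGA with a 1e-4 annual exceedance frequency (10,000-year motion)
i = np.argmin(np.abs(mean_curve - 1e-4))
print(f"~1e-4/yr PGA on the mean curve: {pga[i]:.3f} g")
```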
Seismic hazard assessment of Syria using seismicity, DEM, slope, active tectonic and GIS
NASA Astrophysics Data System (ADS)
Ahmad, Raed; Adris, Ahmad; Singh, Ramesh
2016-07-01
In the present work, we discuss the use of integrated remote sensing and Geographical Information System (GIS) techniques for evaluation of seismic hazard areas in Syria. The present study is the first effort to create a seismic hazard map of Syria with the help of GIS. In the proposed approach, we have used Aster satellite data, digital elevation data (30 m resolution), earthquake data, and active tectonic maps. Many important factors for evaluation of seismic hazard were identified and corresponding thematic data layers (past earthquake epicenters, active faults, digital elevation model, and slope) were generated. A numerical rating scheme was developed for spatial data analysis using GIS to rank the parameters included in the evaluation of seismic hazard. The resulting earthquake potential map delineates the area into different relative susceptibility classes: high, moderate, low and very low. The potential earthquake map was validated by correlating the obtained classes with the local probability produced using conventional analysis of observed earthquakes. Earthquake data for Syria and peak ground acceleration (PGA) data are then introduced into the model to develop the final seismic hazard map, based on Gutenberg-Richter (a and b value) parameters and the concepts of local probability and recurrence time. The application of the proposed technique to the Syrian region indicates that this method provides a good estimate of seismic hazard compared to maps developed using traditional techniques (deterministic (DSHA) and probabilistic (PSHA) seismic hazard analysis). For the first time, numerous remote sensing and GIS parameters have been used in the preparation of a seismic hazard map, which is found to be realistic.
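The Gutenberg-Richter step can be sketched as follows, assuming a declustered catalogue for a single zone; the magnitudes are synthetic, and the maximum-likelihood b-value estimator (Aki, 1965, with the usual binning correction) is one common choice, not necessarily the authors'.

```python
import numpy as np

rng = np.random.default_rng(0)
mc, dm, years = 4.0, 0.1, 50.0  # completeness magnitude, bin width, duration
mags = mc + rng.exponential(scale=0.45, size=300)  # synthetic catalogue

# Maximum-likelihood b-value with binning correction, then the a-value.
b = np.log10(np.e) / (mags.mean() - (mc - dm / 2))
a = np.log10(len(mags) / years) + b * mc

# Annual rate and mean recurrence interval for M >= 6.
rate_m6 = 10 ** (a - b * 6.0)
print(f"b = {b:.2f}, a = {a:.2f}, recurrence T(M>=6) ~ {1 / rate_m6:.0f} yr")
```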
Large Scale Flood Risk Analysis using a New Hyper-resolution Population Dataset
NASA Astrophysics Data System (ADS)
Smith, A.; Neal, J. C.; Bates, P. D.; Quinn, N.; Wing, O.
2017-12-01
Here we present the first national-scale flood risk analyses, using high-resolution Facebook Connectivity Lab population data and data from a hyper-resolution flood hazard model. In recent years the field of large-scale hydraulic modelling has been transformed by new remotely sensed datasets, improved process representation, highly efficient flow algorithms and increases in computational power. These developments have allowed flood risk analysis to be undertaken in previously unmodeled territories and from continental to global scales. Flood risk analyses are typically conducted via the integration of modelled water depths with an exposure dataset. Over large scales and in data-poor areas, these exposure data typically take the form of a gridded population dataset, estimating population density using remotely sensed data and/or locally available census data. The local nature of flooding dictates that for robust flood risk analysis to be undertaken, both hazard and exposure data should sufficiently resolve local-scale features. Global flood frameworks are enabling flood hazard data to be produced at 90 m resolution, resulting in a mismatch with available population datasets, which are typically more coarsely resolved. Moreover, these exposure data are typically focused on urban areas and struggle to represent rural populations. In this study we integrate a new population dataset with a global flood hazard model. The population dataset was produced by the Connectivity Lab at Facebook, providing gridded population data at 5 m resolution, representing a resolution increase over previous countrywide datasets of multiple orders of magnitude. Flood risk analyses undertaken over a number of developing countries are presented, along with a comparison of flood risk analyses undertaken using pre-existing population datasets.
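The core risk integration described here, overlaying gridded hazard on gridded population, can be sketched in a few lines; both grids below are synthetic stand-ins for a co-registered 100-year flood-depth layer and a high-resolution population layer.

```python
import numpy as np

rng = np.random.default_rng(1)
depth = rng.gamma(shape=0.5, scale=0.8, size=(1000, 1000))   # flood depth (m)
pop = rng.poisson(lam=0.3, size=(1000, 1000)).astype(float)  # persons per cell

# Exposed population at two depth thresholds (any water vs. hazardous depth).
for thr in (0.0, 0.5):
    exposed = pop[depth > thr].sum()
    print(f"population exposed above {thr:.1f} m: {exposed:,.0f}")
```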
Survivorship analysis when cure is a possibility: a Monte Carlo study.
Goldman, A I
1984-01-01
Parametric survivorship analysis of clinical trials commonly involves the assumption of a hazard function constant with time. When the empirical curve obviously levels off, one can modify the hazard function model by use of a Gompertz or Weibull distribution with hazard decreasing over time. Some cancer treatments are thought to cure some patients within a short time of initiation. Then, instead of all patients having the same hazard, decreasing over time, a biologically more appropriate model assumes that an unknown proportion (1 - pi) have a constant high risk whereas the remaining proportion (pi) have essentially no risk. This paper discusses the maximum likelihood estimation of pi and the power curves of the likelihood ratio test. Monte Carlo studies provide results for a variety of simulated trials; empirical data illustrate the methods.
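The mixture ("cure") model described here has survival S(t) = pi + (1 - pi) e^(-lambda t). A minimal sketch of the maximum likelihood estimation on simulated, censored data (parameter values and the censoring scheme are illustrative):

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)
n, true_pi, true_lam = 500, 0.4, 0.5
cured = rng.random(n) < true_pi
t_event = np.where(cured, np.inf, rng.exponential(1 / true_lam, size=n))
t_cens = rng.uniform(0, 8, size=n)           # administrative censoring times
t = np.minimum(t_event, t_cens)
d = (t_event <= t_cens).astype(float)        # 1 = failure observed

def negloglik(theta):
    pi, lam = theta
    surv = pi + (1 - pi) * np.exp(-lam * t)   # cured, or not yet failed
    dens = (1 - pi) * lam * np.exp(-lam * t)  # failure density
    return -np.sum(d * np.log(dens) + (1 - d) * np.log(surv))

res = minimize(negloglik, x0=[0.5, 1.0],
               bounds=[(1e-4, 1 - 1e-4), (1e-4, None)])
print("pi_hat = %.3f, lambda_hat = %.3f" % tuple(res.x))
```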
Experimental Concepts for Testing Seismic Hazard Models
NASA Astrophysics Data System (ADS)
Marzocchi, W.; Jordan, T. H.
2015-12-01
Seismic hazard analysis is the primary interface through which useful information about earthquake rupture and wave propagation is delivered to society. To account for the randomness (aleatory variability) and limited knowledge (epistemic uncertainty) of these natural processes, seismologists must formulate and test hazard models using the concepts of probability. In this presentation, we will address the scientific objections that have been raised over the years against probabilistic seismic hazard analysis (PSHA). Owing to the paucity of observations, we must rely on expert opinion to quantify the epistemic uncertainties of PSHA models (e.g., in the weighting of individual models from logic-tree ensembles of plausible models). The main theoretical issue is a frequentist critique: subjectivity is immeasurable; ergo, PSHA models cannot be objectively tested against data; ergo, they are fundamentally unscientific. We have argued (PNAS, 111, 11973-11978) that the Bayesian subjectivity required for casting epistemic uncertainties can be bridged with the frequentist objectivity needed for pure significance testing through "experimental concepts." An experimental concept specifies collections of data, observed and not yet observed, that are judged to be exchangeable (i.e., with a joint distribution independent of the data ordering) when conditioned on a set of explanatory variables. We illustrate, through concrete examples, experimental concepts useful in the testing of PSHA models for ontological errors in the presence of aleatory variability and epistemic uncertainty. In particular, we describe experimental concepts that lead to exchangeable binary sequences that are statistically independent but not identically distributed, showing how the Bayesian concept of exchangeability generalizes the frequentist concept of experimental repeatability. We also address the issue of testing PSHA models using spatially correlated data.
Comparison of methods for estimating the attributable risk in the context of survival analysis.
Gassama, Malamine; Bénichou, Jacques; Dartois, Laureen; Thiébaut, Anne C M
2017-01-23
The attributable risk (AR) measures the proportion of disease cases that can be attributed to an exposure in the population. Several definitions and estimation methods have been proposed for survival data. Using simulations, we compared four methods for estimating the AR defined in terms of survival functions: two nonparametric methods based on the Kaplan-Meier estimator, one semiparametric based on the Cox model, and one parametric based on the piecewise constant hazards model, as well as one simpler method based on the estimated exposure prevalence at baseline and the Cox model hazard ratio. We considered a fixed binary exposure with varying exposure probabilities and strengths of association, and generated event times from a proportional hazards model with constant or monotonic (decreasing or increasing) Weibull baseline hazard, as well as from a nonproportional hazards model. We simulated 1,000 independent samples of size 1,000 or 10,000. The methods were compared in terms of mean bias, mean estimated standard error, empirical standard deviation and 95% confidence interval coverage probability at four equally spaced time points. Under proportional hazards, all five methods yielded unbiased results regardless of sample size. Nonparametric methods displayed greater variability than other approaches. All methods showed satisfactory coverage, except for the nonparametric methods, especially at the end of follow-up for a sample size of 1,000. With nonproportional hazards, nonparametric methods yielded similar results to those under proportional hazards, whereas the semiparametric and parametric approaches, which both relied on the proportional hazards assumption, performed poorly. These methods were applied to estimate the AR of breast cancer due to menopausal hormone therapy in 38,359 women of the E3N cohort. In practice, our study suggests using the semiparametric or parametric approaches to estimate AR as a function of time in cohort studies if the proportional hazards assumption appears appropriate.
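For orientation, one common way to write the AR as a function of time in terms of survival functions, consistent with the abstract's description but given here as a hedged reconstruction rather than the authors' exact estimand, is:

```latex
\[
\mathrm{AR}(t) \;=\; \frac{\bigl[1 - S(t)\bigr] - \bigl[1 - S_0(t)\bigr]}{1 - S(t)}
\;=\; 1 - \frac{1 - S_0(t)}{1 - S(t)},
\]
```

where S(t) is the disease-free survival in the population as observed and S_0(t) the counterfactual survival with the exposure removed; the estimation methods compared in the paper differ in how S(t) and S_0(t) are obtained (Kaplan-Meier, Cox, or piecewise constant hazards).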
NASA Astrophysics Data System (ADS)
Simicevic, Aleksandra; Bonadonna, Costanza; di Traglia, Federico; Rosi, Mauro
2010-05-01
Volcanic eruptions are accompanied by numerous hazards which pose short- and long-term threats to people and property. Recent experiences have shown that successful responses to hazard events correlate strongly with the degree to which proactive policies of risk reduction are already in place before an eruption occurs. Effective proactive risk-reduction strategies require contributions from numerous disciplines. A volcanic eruption is not a hazard, per se, but rather an event capable of producing a variety of hazards (e.g. earthquakes, pyroclastic density currents, lava flows, tephra fall, lahars, landslides, gas release, and tsunamis) that can affect the built environment in a variety of ways, over different time scales and with different degrees of intensity. Our proposed model for the assessment and mitigation of exposure-based volcanic risk is mainly based on the compilation of three types of maps: hazard maps, hazard-specific vulnerability maps and exposure-based risk maps. Hazard maps identify the spatial distribution of each individual volcanic hazard and include both event analysis and impact analysis. Hazard-specific vulnerability maps represent the systematic evaluation of the physical vulnerability of the built environment to a range of volcanic phenomena, i.e. the spatial distribution of buildings vulnerable to a given hazard based on the analysis of selected building elements. Buildings are classified on the basis of their major components that are relevant for different volcanic hazards, their strength and their construction materials, and classes are defined taking into account the potential damage that each group of building elements (e.g. walls, roof, load-bearing structure) will suffer under a volcanic hazard. All these factors are enumerated in a checklist and are used for the building survey. Hazard-specific vulnerability maps are then overlaid with hazard maps in order to compile exposure-based risk maps and so quantify the potential damage. Such quantification is the starting point for the identification of suitable mitigation measures, which will be analyzed through a cost-benefit analysis to assess their financial feasibility. Information about public networks is also recorded in order to give an overall idea of the built-environment condition of the island. The vulnerability assessment of the technical systems describes the potential damage that could stress systems like electricity supply, water distribution, communication networks or transport systems. These damages can also be described as functional disruption of the system. The important aspect is not only the physical capacity of a system to resist, but also its capacity to continue functioning. The model will be tested on the island of Vulcano in southern Italy. Vulcano is characterized by clear signs of volcanic unrest and is the type locality for a deadly style of eruption. The main active system of Vulcano Island (La Fossa cone) is known to produce a variety of eruption styles and intensities, each posing their own hazards and threats. Six different hazard scenarios have been identified based on detailed stratigraphic work. The urbanization of Vulcano took place in the 1980s with no real planning, and its population mostly subsists on tourism. Our preliminary results show that Vulcano is not characterized by a great variability of architectural typologies and construction materials.
Three main types of buildings are present (masonry with concrete frame, masonry with manufactured stone units, masonry with hollow clay bricks) and no statistically significant trends were found between physical and morphological characteristics. The recent signs of volcanic unrest combined with a complex vulnerability of the island due to an uncontrolled urban development and a significant seasonal variation of the exposed population in summer months result in a high volcanic risk. As a result, Vulcano represents the ideal environment to test a multi-hazard based risk model and to study the transition between micro (building) and macro (urban environment) scale of analysis, which is still an unexplored field in the study of volcanic risk. Different levels of vulnerability need to be analyzed in order to increase the level of preparedness, plan a potential evacuation, manage a potential volcanic crisis and assess the best mitigation measures to put in place and reduce the volcanic risk.
Critical load analysis in hazard assessment of metals using a Unit World Model.
Gandhi, Nilima; Bhavsar, Satyendra P; Diamond, Miriam L
2011-09-01
A Unit World approach has been used extensively to rank chemicals for their hazards and to understand differences in chemical behavior. Whereas the fate and effects of an organic chemical in a Unit World Model (UWM) analysis vary systematically according to one variable (fraction of organic carbon), and the chemicals have a singular ranking regardless of environmental characteristics, metals can change their hazard ranking according to freshwater chemistry, notably pH and dissolved organic carbon (DOC). Consequently, developing a UWM approach for metals requires selecting a series of representative freshwater chemistries, based on an understanding of the sensitivity of model results to this chemistry. Here we analyze results from a UWM for metals with the goal of informing the selection of appropriate freshwater chemistries for a UWM. The UWM loosely couples the biotic ligand model (BLM) to a geochemical speciation model (Windermere Humic Adsorption Model [WHAM]) and then to the multi-species fate transport-speciation (Transpec) model. The UWM is applied to estimate the critical load (CL) of the cationic metals Cd, Cu, Ni, Pb, and Zn, using three lake chemistries that vary in trophic status, pH, and other parameters. The model results indicated a difference of four orders of magnitude in particle-to-total dissolved partitioning (Kd) that translated into minimal differences in fate because of the short water residence time used. However, a maximum 300-fold difference was calculated in Cu toxicity among the three chemistries and three aquatic organisms. Critical loads were lowest (greatest hazard) in the oligotrophic water chemistry and highest (least hazard) in the eutrophic water chemistry, despite the highest fraction of free metal ion as a function of total metal occurring in the mesotrophic system, where toxicity was ameliorated by competing cations. Water hardness, DOC, and pH had the greatest influence on CL, because of the influence of these factors on aquatic toxicity. Copyright © 2011 SETAC.
Trimming the UCERF2 hazard logic tree
Porter, Keith A.; Field, Edward H.; Milner, Kevin
2012-01-01
The Uniform California Earthquake Rupture Forecast 2 (UCERF2) is a fully time‐dependent earthquake rupture forecast developed with sponsorship of the California Earthquake Authority (Working Group on California Earthquake Probabilities [WGCEP], 2007; Field et al., 2009). UCERF2 contains 480 logic‐tree branches reflecting choices among nine modeling uncertainties in the earthquake rate model shown in Figure 1. For seismic hazard analysis, it is also necessary to choose a ground‐motion‐prediction equation (GMPE) and set its parameters. Choosing among four next‐generation attenuation (NGA) relationships results in a total of 1920 hazard calculations per site. The present work is motivated by a desire to reduce the computational effort involved in a hazard analysis without understating uncertainty. We set out to assess which branching points of the UCERF2 logic tree contribute most to overall uncertainty, and which might be safely ignored (set to only one branch) without significantly biasing results or affecting some useful measure of uncertainty. The trimmed logic tree will have all of the original choices from the branching points that contribute significantly to uncertainty, but only one arbitrarily selected choice from the branching points that do not.
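A minimal sketch of the trimming logic on a toy three-branch-point tree (weights and ground-motion values are made up): compute the weighted variance of the hazard metric over all leaves, then see how much variance survives when one branching point is collapsed to a single branch.

```python
import itertools
import numpy as np

weights = {"A": [0.6, 0.4], "B": [0.5, 0.5], "C": [0.7, 0.3]}  # branch weights
rng = np.random.default_rng(3)

# Hypothetical hazard metric (e.g. PGA with 10% PoE in 50 yr) per leaf branch.
pga = {c: 0.4 + 0.10 * c[0] + 0.02 * c[1] + 0.15 * c[2] + rng.normal(0, 0.005)
       for c in itertools.product([0, 1], repeat=3)}

def weighted_stats(fix=None):
    """Weighted mean/variance over leaves; fix=(index, branch) pins one node."""
    vals, ws = [], []
    for combo, x in pga.items():
        if fix is not None and combo[fix[0]] != fix[1]:
            continue
        vals.append(x)
        ws.append(np.prod([weights[k][c] for k, c in zip("ABC", combo)]))
    ws = np.array(ws) / np.sum(ws)
    mu = np.dot(ws, vals)
    return mu, np.dot(ws, (np.array(vals) - mu) ** 2)

_, var_full = weighted_stats()
for i, name in enumerate("ABC"):
    _, var_trim = weighted_stats(fix=(i, 0))
    print(f"branch point {name}: fraction of variance retained = "
          f"{var_trim / var_full:.2f}")
```

Branch points whose collapse leaves the retained-variance fraction near 1 are candidates for trimming; those whose collapse removes most of the variance must keep all of their branches.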
Seismic Hazard Analysis for Armenia and its Surrounding Areas
NASA Astrophysics Data System (ADS)
Klein, E.; Shen-Tu, B.; Mahdyiar, M.; Karakhanyan, A.; Pagani, M.; Weatherill, G.; Gee, R. C.
2017-12-01
The Republic of Armenia is located within the central part of a large, 800 km wide, intracontinental collision zone between the Arabian and Eurasian plates. Active deformation occurs along numerous structures in the form of faulting, folding, and volcanism distributed throughout the entire zone from the Bitlis-Zagros suture belt to the Greater Caucasus Mountains and between the relatively rigid Black Sea and Caspian Sea blocks, without any single structure that can be claimed as predominant. In recent years, significant work has been done on mapping active faults and compiling and reviewing historic and paleoseismological studies in the region, especially in Armenia; these recent research contributions have greatly improved our understanding of the seismogenic sources and their characteristics. In this study we performed a seismic hazard analysis for Armenia and its surrounding areas using the latest detailed geological and paleoseismological information on active faults, strain rates estimated from kinematic modeling of GPS data, and all available historic earthquake data. The seismic source model uses a combination of characteristic earthquake and gridded seismicity models to take advantage of the detailed knowledge of the known faults while acknowledging the distributed deformation and regional tectonic environment of the collision zone. In addition, the fault model considers earthquake ruptures that include single and multi-segment fault rupture scenarios, with earthquakes that can rupture any part of a multiple-segment fault zone. The ground motion model uses a set of ground motion prediction equations (GMPEs) selected from a pool of GMPEs based on the assessment of each GMPE against the available strong motion data in the region. The hazard is computed in GEM's OpenQuake engine. We will present the final hazard results and discuss the uncertainties associated with various input data and their impact on the hazard at various locations.
Hazard function theory for nonstationary natural hazards
Read, Laura K.; Vogel, Richard M.
2016-04-11
Impact from natural hazards is a shared global problem that causes tremendous loss of life and property, economic cost, and damage to the environment. Increasingly, many natural processes show evidence of nonstationary behavior including wind speeds, landslides, wildfires, precipitation, streamflow, sea levels, and earthquakes. Traditional probabilistic analysis of natural hazards based on peaks over threshold (POT) generally assumes stationarity in the magnitudes and arrivals of events, i.e., that the probability of exceedance of some critical event is constant through time. Given increasing evidence of trends in natural hazards, new methods are needed to characterize their probabilistic behavior. The well-developed field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (X) with its failure time series (T), enabling computation of corresponding average return periods, risk, and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose POT magnitudes are assumed to follow the widely applied generalized Pareto model. We derive the hazard function for this case and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. As a result, our theoretical analysis linking the hazard random variable X with the corresponding failure time series T should have application to a wide class of natural hazards with opportunities for future extensions.
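For the generalized Pareto case named above, the algebra is short. With scale sigma > 0 and shape xi, a hedged reconstruction of the hazard function of the POT magnitude variable X (standard GPD algebra, not copied from the paper) is:

```latex
\[
S(x) = \Bigl(1 + \frac{\xi x}{\sigma}\Bigr)^{-1/\xi}, \qquad
f(x) = \frac{1}{\sigma}\Bigl(1 + \frac{\xi x}{\sigma}\Bigr)^{-1/\xi - 1}
\quad\Longrightarrow\quad
h(x) = \frac{f(x)}{S(x)} = \frac{1}{\sigma + \xi x},
\]
```

so the hazard decreases in x for xi > 0, is constant (the exponential case) as xi tends to 0, and increases for xi < 0.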
Seismic hazard analysis for Jayapura city, Papua
NASA Astrophysics Data System (ADS)
Robiana, R.; Cipta, A.
2015-04-01
Jayapura city experienced a destructive earthquake on June 25, 1976, with a maximum intensity of VII on the MMI scale. Probabilistic methods are used to determine the earthquake hazard by considering all possible earthquakes that can occur in this region. Three types of earthquake source models are used: a subduction model for the New Guinea Trench subduction zone (North Papuan Thrust); fault models derived from the Yapen, Tarera-Aiduna, Wamena, Memberamo, Waipago, Jayapura, and Jayawijaya faults; and 7 background models to accommodate unknown earthquakes. Amplification factors derived using geomorphological approaches are corrected with measurement data related to rock type and depth of soft soil. Site classes in Jayapura city can be grouped into classes B, C, D and E, with amplification factors between 0.5 and 6. Hazard maps are presented with a 10% probability of earthquake occurrence within a period of 500 years for the dominant periods of 0.0, 0.2, and 1.0 seconds.
Assessing Surface Fuel Hazard in Coastal Conifer Forests through the Use of LiDAR Remote Sensing
NASA Astrophysics Data System (ADS)
Koulas, Christos
The research problem that this thesis examines is a method of predicting conventional fire hazards using data drawn from specific regions, namely the Sooke and Goldstream watershed regions in coastal British Columbia. This thesis investigates whether LiDAR data can be used to describe conventional forest stand fire hazard classes. Three objectives guided this thesis: to discuss the variables associated with fire hazard, specifically the distribution and makeup of fuel; to examine the relationship between derived LiDAR biometrics and forest attributes related to the hazard assessment factors defined by the Capitol Regional District (CRD); and to assess the viability of the LiDAR biometric decision tree in the CRD based on current frameworks for use. The research method uses quantitative datasets to assess the optimal generalization of these types of fire hazard data through discriminant analysis. Findings illustrate significant limitations of the LiDAR-derived data, and reflect the literature in that flawed field application of data modelling techniques has led to a disconnect between the ways in which fire hazard models were intended to be used by scholars and the ways in which they are used by those tasked with preventing forest fires. It can be concluded that a significant trade-off exists between the computational requirements of wildfire simulation models and the algorithms commonly used by field teams to apply these models with remote sensing data, and that CRD forest management practices would need to change to incorporate a decision tree model in order to decrease risk.
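The discriminant-analysis step might look like the following sketch, using scikit-learn; the LiDAR biometrics (mean return height, canopy cover, near-ground return fraction) and the three hazard classes are hypothetical placeholders for the thesis's actual variables.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
n = 120
X = np.column_stack([
    rng.normal(18, 6, n),        # mean return height (m)
    rng.uniform(0.2, 0.95, n),   # canopy cover fraction
    rng.uniform(0.0, 0.5, n),    # near-ground (<2 m) return fraction
])
# Hazard class (0=low, 1=moderate, 2=high), loosely tied to surface returns.
y = np.digitize(X[:, 2] + rng.normal(0, 0.05, n), [0.15, 0.30])

lda = LinearDiscriminantAnalysis()
scores = cross_val_score(lda, X, y, cv=5)
print("cross-validated accuracy: %.2f +/- %.2f" % (scores.mean(), scores.std()))
```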
Inter-model analysis of tsunami-induced coastal currents
NASA Astrophysics Data System (ADS)
Lynett, Patrick J.; Gately, Kara; Wilson, Rick; Montoya, Luis; Arcas, Diego; Aytore, Betul; Bai, Yefei; Bricker, Jeremy D.; Castro, Manuel J.; Cheung, Kwok Fai; David, C. Gabriel; Dogan, Gozde Guney; Escalante, Cipriano; González-Vida, José Manuel; Grilli, Stephan T.; Heitmann, Troy W.; Horrillo, Juan; Kânoğlu, Utku; Kian, Rozita; Kirby, James T.; Li, Wenwen; Macías, Jorge; Nicolsky, Dmitry J.; Ortega, Sergio; Pampell-Manis, Alyssa; Park, Yong Sung; Roeber, Volker; Sharghivand, Naeimeh; Shelby, Michael; Shi, Fengyan; Tehranirad, Babak; Tolkova, Elena; Thio, Hong Kie; Velioğlu, Deniz; Yalçıner, Ahmet Cevdet; Yamazaki, Yoshiki; Zaytsev, Andrey; Zhang, Y. J.
2017-06-01
To help produce accurate and consistent maritime hazard products, the National Tsunami Hazard Mitigation Program organized a benchmarking workshop to evaluate the numerical modeling of tsunami currents. Thirteen teams of international researchers, using a set of tsunami models currently utilized for hazard mitigation studies, presented results for a series of benchmarking problems; these results are summarized in this paper. Comparisons focus on physical situations where the currents are shear and separation driven, and are thus de-coupled from the incident tsunami waveform. In general, we find that models of increasing physical complexity provide better accuracy, and that low-order three-dimensional models are superior to high-order two-dimensional models. Inside separation zones and in areas strongly affected by eddies, the magnitude of both model-data errors and inter-model differences can be the same as the magnitude of the mean flow. Thus, we make arguments for the need of an ensemble modeling approach for areas affected by large-scale turbulent eddies, where deterministic simulation may be misleading. As a result of the analyses presented herein, we expect that tsunami modelers now have a better awareness of their ability to accurately capture the physics of tsunami currents, and therefore a better understanding of how to use these simulation tools for hazard assessment and mitigation efforts.
Development of hazard-compatible building fragility and vulnerability models
Karaca, E.; Luco, N.
2008-01-01
We present a methodology for transforming the structural and non-structural fragility functions in HAZUS into a format that is compatible with conventional seismic hazard analysis information. The methodology makes use of the building capacity (or pushover) curves and related building parameters provided in HAZUS. Instead of the capacity spectrum method applied in HAZUS, building response is estimated by inelastic response history analysis of corresponding single-degree-of-freedom systems under a large number of earthquake records. Statistics of the building response are used with the damage state definitions from HAZUS to derive fragility models conditioned on spectral acceleration values. Using the developed fragility models for structural and nonstructural building components, with corresponding damage state loss ratios from HAZUS, we also derive building vulnerability models relating spectral acceleration to repair costs. Whereas in HAZUS the structural and nonstructural damage states are treated as if they are independent, our vulnerability models are derived assuming "complete" nonstructural damage whenever the structural damage state is complete. We show the effects of considering this dependence on the final vulnerability models. The use of spectral acceleration (at selected vibration periods) as the ground motion intensity parameter, coupled with the careful treatment of uncertainty, makes the new fragility and vulnerability models compatible with conventional seismic hazard curves and hence useful for extensions to probabilistic damage and loss assessment.
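The final step, turning binary damage-state exceedance results from the response-history runs into a fragility curve conditioned on spectral acceleration, is commonly done by fitting a lognormal CDF by maximum likelihood. A minimal sketch on synthetic data (the median theta and dispersion beta are the fit parameters; all values are illustrative):

```python
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(5)
sa = rng.lognormal(mean=np.log(0.4), sigma=0.6, size=200)  # Sa of records (g)
p_true = stats.norm.cdf(np.log(sa / 0.5) / 0.45)           # hidden truth
exceed = rng.random(200) < p_true   # did the SDOF reach the damage state?

def negloglik(params):
    theta, beta = params
    p = np.clip(stats.norm.cdf(np.log(sa / theta) / beta), 1e-12, 1 - 1e-12)
    return -np.sum(np.where(exceed, np.log(p), np.log(1 - p)))

res = optimize.minimize(negloglik, x0=[0.4, 0.4],
                        bounds=[(1e-3, None), (1e-3, None)])
print("median theta = %.2f g, dispersion beta = %.2f" % tuple(res.x))
```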
Sensitivity test and ensemble hazard assessment for tephra fallout at Campi Flegrei, Italy
NASA Astrophysics Data System (ADS)
Selva, J.; Costa, A.; De Natale, G.; Di Vito, M. A.; Isaia, R.; Macedonio, G.
2018-02-01
We present the results of a statistical study on tephra dispersal in the case of a reactivation of the Campi Flegrei volcano. To represent the spectrum of possible eruptive sizes, four classes of eruptions were considered. Excluding the lava emission, three classes are explosive (Small, Medium, and Large) and can produce a significant quantity of volcanic ash. Hazard assessments were made through simulations of atmospheric dispersion of ash and lapilli, considering the full variability of winds and eruptive vents. The results are presented in the form of conditional hazard curves given the occurrence of specific eruptive sizes, computed for representative members of each size class, and then combined to quantify the conditional hazard given an eruption of any size. The main focus of this analysis was to constrain the epistemic uncertainty (i.e. that associated with the level of scientific knowledge of the phenomena), in order to provide unbiased hazard estimations. The epistemic uncertainty in the estimation of hazard curves was quantified, making use of scientifically acceptable alternatives to be aggregated in the final results. The choice of such alternative models was made after a comprehensive sensitivity analysis which considered different weather databases, alternative modelling of submarine eruptive vents, tephra total grain-size distributions (TGSD) with different relative mass fractions of fine ash, and the effect of ash aggregation. The results showed that the dominant uncertainty is related to the combined effect of the uncertainty in the fraction of fine particles with respect to the total mass and of how ash aggregation is modelled. The latter is particularly relevant in the case of magma-water interactions during explosive eruptive phases, when a large fraction of fine ash can form accretionary lapilli that might contribute significantly to increasing the tephra load in the proximal areas. The variability induced by the use of different meteorological databases and by the selected approach to modelling offshore eruptions was relatively insignificant. The uncertainty arising from the alternative implementations, which would have been neglected in standard (Bayesian) quantifications, was finally quantified by ensemble modelling, and represented by hazard and probability maps produced at different confidence levels.
Uncertainty on shallow landslide hazard assessment: from field data to hazard mapping
NASA Astrophysics Data System (ADS)
Trefolini, Emanuele; Tolo, Silvia; Patelli, Eduardo; Broggi, Matteo; Disperati, Leonardo; Le Tuan, Hai
2015-04-01
Shallow landsliding that involves Hillslope Deposits (HD), the surficial soil that covers the bedrock, is an important process of erosion, transport and deposition of sediment along hillslopes. Although shallow landslides generally mobilize relatively small volumes of material, they represent the most hazardous factor in mountain regions due to their high velocity and the common absence of warning signs. Moreover, increasing urbanization and likely climate change make shallow landslides a source of widespread risk, and the interest of the scientific community in this process has therefore grown in the last three decades. One of the main aims of research projects on this topic is to perform robust shallow landslide hazard assessment for wide areas (regional assessment), in order to support sustainable spatial planning. Currently, three main methodologies may be implemented to assess regional shallow landslide hazard: expert evaluation, probabilistic (or data mining) methods, and methods based on physical models. The aim of this work is to evaluate the uncertainty of shallow landslide hazard assessment based on physical models, taking into account spatial variables such as geotechnical and hydrogeologic parameters as well as hillslope morphometry. To achieve this goal, a wide dataset of geotechnical properties (shear strength, permeability, depth and unit weight) of HD was gathered by integrating field survey, in situ and laboratory tests. This spatial database was collected from a study area of about 350 km2 including different bedrock lithotypes and geomorphological features. The uncertainty associated with each step of the hazard assessment process (e.g. field data collection, regionalization of site-specific information and numerical modelling of hillslope stability) was carefully characterized. The most appropriate probability density function (PDF) was chosen for each numerical variable, and we assessed the uncertainty propagation on HD strength parameters obtained by empirical relations with geotechnical index properties. Site-specific information was regionalized at map scale by (hard and fuzzy) clustering analysis taking into account spatial variables such as geology, geomorphology and hillslope morphometric variables (longitudinal and transverse curvature, flow accumulation and slope), the latter derived from a DEM with 10 m cell size. In order to map shallow landslide hazard, Monte Carlo simulation was performed for some common physically based models available in the literature (e.g. SINMAP, SHALSTAB, TRIGRS). Furthermore, a new approach based on the use of Bayesian Networks was proposed and validated. Different models, such as Intervals, Convex Models and Fuzzy Sets, were adopted for the modelling of input parameters. Finally, an accuracy assessment was carried out on the resulting maps, and the propagation of the uncertainty of input parameters into the final shallow landslide hazard estimation was estimated. The outcomes of the analysis are compared and discussed in terms of discrepancy among map pixel values and related estimated error. The novelty of the proposed method lies in the estimation of the confidence of shallow landslide hazard mapping at the regional level. This allows i) discrimination of regions where the hazard assessment is robust from areas where more data are necessary to increase the confidence level, and ii) assessment of the reliability of the procedure used for hazard assessment.
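For the Monte Carlo step, the physically based models named above (SINMAP, SHALSTAB) are built around the infinite-slope limit equilibrium. A minimal sketch of a per-map-unit failure probability under assumed parameter PDFs (all distributions below are illustrative, not the paper's calibrated ones):

```python
import numpy as np

rng = np.random.default_rng(6)
N = 100_000

c = rng.lognormal(np.log(5.0), 0.4, N)   # effective cohesion (kPa)
phi = np.radians(rng.normal(32, 3, N))   # friction angle (deg -> rad)
z = rng.uniform(0.5, 2.0, N)             # HD thickness (m)
m = rng.uniform(0.0, 1.0, N)             # saturated fraction of thickness
gamma, gamma_w = 18.0, 9.81              # soil / water unit weight (kN/m3)
theta = np.radians(30.0)                 # slope angle

# Infinite-slope factor of safety with slope-parallel seepage.
fs = (c + (gamma - m * gamma_w) * z * np.cos(theta) ** 2 * np.tan(phi)) \
     / (gamma * z * np.sin(theta) * np.cos(theta))
print("P(FS < 1) =", np.round(np.mean(fs < 1.0), 3))
```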
Mars Exploration Rovers Landing Dispersion Analysis
NASA Technical Reports Server (NTRS)
Knocke, Philip C.; Wawrzyniak, Geoffrey G.; Kennedy, Brian M.; Desai, Prasun N.; Parker, Timothy J.; Golombek, Matthew P.; Duxbury, Thomas C.; Kass, David M.
2004-01-01
Landing dispersion estimates for the Mars Exploration Rover missions were key elements in the site targeting process and in the evaluation of landing risk. This paper addresses the process and results of the landing dispersion analyses performed for both Spirit and Opportunity. The several contributors to landing dispersions (navigation and atmospheric uncertainties, spacecraft modeling, winds, and margins) are discussed, as are the analysis tools used. JPL's MarsLS program, a MATLAB-based landing dispersion visualization and statistical analysis tool, was used to calculate the probability of landing within hazardous areas. By convolving this with the probability of landing within flight system limits (in-spec landing) for each hazard area, a single overall measure of landing risk was calculated for each landing ellipse. In-spec probability contours were also generated, allowing a more synoptic view of site risks, illustrating the sensitivity to changes in landing location, and quantifying the possible consequences of anomalies such as incomplete maneuvers. Data and products required to support these analyses are described, including the landing footprints calculated by NASA Langley's POST program and JPL's AEPL program, cartographically registered base maps and hazard maps, and flight system estimates of in-spec landing probabilities for each hazard terrain type. Various factors encountered during operations, including evolving navigation estimates and changing atmospheric models, are discussed and final landing points are compared with approach estimates.
NASA Astrophysics Data System (ADS)
Samphutthanon, R.; Tripathi, N. K.; Ninsawat, S.; Duboz, R.
2014-12-01
The main objective of this research was the development of an HFMD hazard zonation (HFMD-HZ) model by applying AHP and Fuzzy AHP (FAHP) methodologies for weighting each spatial factor, such as disease incidence, socio-economic and physical factors. The outputs of AHP and FAHP were input into a Geographic Information Systems (GIS) process for spatial analysis. 14 criteria were selected as important factors for analysis: disease incidence over 10 years from 2003 to 2012, population density, road density, land use and physical features. The results showed a consistency ratio (CR) value for these main criteria of 0.075427 for AHP, while the CR for the FAHP results was 0.092436. As both remained below the threshold of 0.1, the CR values were acceptable. After linking to actual geospatial data (disease incidence 2013) through spatial analysis by GIS for validation, the results of the FAHP approach were found to match more accurately than those of the AHP approach. The zones with the highest hazard of HFMD outbreaks were located in two main areas: central Muang Chiang Mai district, including its suburbs, and Muang Chiang Rai district and its vicinity. The resulting hazard maps may be useful for organizing HFMD protection plans.
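The consistency check quoted above follows from the principal eigenvalue of the pairwise comparison matrix: CI = (lambda_max - n) / (n - 1) and CR = CI / RI, where RI is Saaty's random index. A minimal sketch on a made-up 4x4 comparison matrix:

```python
import numpy as np

# Hypothetical pairwise comparison matrix (Saaty 1-9 scale) for four criteria.
A = np.array([[1.0, 3.0, 5.0, 7.0],
              [1/3, 1.0, 3.0, 5.0],
              [1/5, 1/3, 1.0, 3.0],
              [1/7, 1/5, 1/3, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
lam_max = eigvals.real[k]
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                              # criterion weights

n = A.shape[0]
ci = (lam_max - n) / (n - 1)              # consistency index
ri = 0.90                                 # Saaty random index for n = 4
cr = ci / ri                              # CR < 0.1 is conventionally acceptable
print("weights:", np.round(w, 3), "CR = %.3f" % cr)
```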
Airborne Forward-Looking Interferometer for the Detection of Terminal-Area Hazards
NASA Technical Reports Server (NTRS)
West, Leanne; Gimmestad, Gary; Lane, Sarah; Smith, Bill L.; Kireev, Stanislav; Daniels, Taumi S.; Cornman, Larry; Sharman, Bob
2014-01-01
The Forward Looking Interferometer (FLI) program was a multi-year cooperative research effort to investigate the use of imaging radiometers with high spectral resolution, using both modeling/simulation and field experiments, along with sophisticated data analysis techniques that were originally developed for analysis of data from space-based radiometers and hyperspectral imagers. This investigation has advanced the state of knowledge in this technical area, and the FLI program developed a greatly improved understanding of the radiometric signal strength of aviation hazards in a wide range of scenarios, in addition to a much better understanding of the real-world functionality requirements for hazard detection instruments. The project conducted field experiments on three hazards (turbulence, runway conditions, and wake vortices) and analytical studies on several others, including volcanic ash, reduced-visibility conditions, and in-flight icing conditions.
Assessing the seismic risk potential of South America
Jaiswal, Kishor; Petersen, Mark D.; Harmsen, Stephen; Smoczyk, Gregory M.
2016-01-01
We present here a simplified approach to quantifying regional seismic risk. The seismic risk for a given region can be inferred in terms of the average annual loss (AAL), which represents the long-term value of earthquake losses in any one year caused by the long-term seismic hazard. AAL is commonly measured in the form of earthquake shaking-induced deaths, direct economic impacts, or indirect losses due to loss of functionality. In the context of the South American subcontinent, the analysis makes use of readily available public data on seismicity, population exposure, and the hazard and vulnerability models for the region. The seismic hazard model was derived using available seismic catalogs, fault databases, and hazard methodologies that are analogous to the U.S. Geological Survey's national seismic hazard mapping process. The Prompt Assessment of Global Earthquakes for Response (PAGER) system's direct empirical vulnerability functions in terms of fatality and economic impact were used for performing exposure and risk analyses. The broad findings presented and the risk maps produced herein are preliminary, yet they do offer important insights into the underlying zones of high and low seismic risk in the South American subcontinent. A more detailed analysis of risk may be warranted by engaging local experts, especially in some of the high-risk zones identified through the present investigation.
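The AAL arithmetic reduces to integrating a loss metric against the site hazard curve. A minimal sketch for one site, with a made-up hazard curve and a made-up vulnerability (mean damage ratio) function standing in for the PAGER loss models:

```python
import numpy as np

pga = np.array([0.05, 0.1, 0.2, 0.3, 0.5, 0.8])             # g
rate = np.array([2e-1, 6e-2, 1.5e-2, 6e-3, 1.5e-3, 3e-4])   # annual P(exceed)
mdr = np.array([0.00, 0.01, 0.05, 0.12, 0.30, 0.55])        # mean damage ratio

exposed_value = 10e6  # exposed value at the site (USD)

# Occurrence rate in each intensity bin times the bin-average damage ratio
# (losses above the last level are neglected in this sketch).
occ = rate[:-1] - rate[1:]
aal = exposed_value * np.sum(occ * 0.5 * (mdr[:-1] + mdr[1:]))
print("AAL ~ $%.0f per year" % aal)
```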
National assessment of hurricane-induced coastal erosion hazards--Gulf of Mexico
Stockdon, Hilary F.; Doran, Kara S.; Thompson, David M.; Sopkin, Kristin L.; Plant, Nathaniel G.; Sallenger, Asbury H.
2012-01-01
Sandy beaches provide a natural barrier between the ocean and inland communities, ecosystems, and resources. However, these dynamic environments move and change in response to winds, waves, and currents. During a hurricane, these changes can be large and sometimes catastrophic. High waves and storm surge act together to erode beaches and inundate low-lying lands, putting inland communities at risk. A decade of USGS research on storm-driven coastal change hazards has provided the data and modeling capabilities to identify areas of our coastline that are likely to experience extreme and potentially hazardous erosion during a hurricane. This report defines hurricane-induced coastal erosion hazards for sandy beaches along the U.S. Gulf of Mexico coastline. The analysis is based on a storm-impact scaling model that uses observations of beach morphology combined with sophisticated hydrodynamic models to predict how the coast will respond to the direct landfall of category 1-5 hurricanes. Hurricane-induced water levels, due to both surge and waves, are compared to beach and dune elevations to determine the probabilities of three types of coastal change: collision (dune erosion), overwash, and inundation. As new beach morphology observations and storm predictions become available, this analysis will be updated to describe how coastal vulnerability to storms will vary in the future.
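The three coastal-change regimes map onto simple elevation comparisons in the storm-impact scaling model (Sallenger, 2000) that the report builds on. A hedged sketch, with hypothetical dune and water-level values:

```python
def storm_impact_regime(r_high, r_low, d_high, d_low):
    """Classify coastal response following the storm-impact scaling model.
    r_high / r_low: high / low extremes of storm water level incl. runup (m).
    d_high / d_low: dune crest / dune toe elevation (m, same datum)."""
    if r_low > d_high:
        return "inundation"
    if r_high > d_high:
        return "overwash"
    if r_high > d_low:
        return "collision (dune erosion)"
    return "swash (no dune impact)"

# Hypothetical profile: dune toe 2.1 m, crest 3.8 m; storm levels 3.2 / 1.5 m.
print(storm_impact_regime(r_high=3.2, r_low=1.5, d_high=3.8, d_low=2.1))
```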
Blanton, Brian; Dresback, Kendra; Colle, Brian; Kolar, Randy; Vergara, Humberto; Hong, Yang; Leonardo, Nicholas; Davidson, Rachel; Nozick, Linda; Wachtendorf, Tricia
2018-04-25
Hurricane track and intensity can change rapidly in unexpected ways, thus making predictions of hurricanes and related hazards uncertain. This inherent uncertainty often translates into suboptimal decision-making outcomes, such as unnecessary evacuation. Representing this uncertainty is thus critical in evacuation planning and related activities. We describe a physics-based hazard modeling approach that (1) dynamically accounts for the physical interactions among hazard components and (2) captures hurricane evolution uncertainty using an ensemble method. This loosely coupled model system provides a framework for probabilistic water inundation and wind speed levels for a new, risk-based approach to evacuation modeling, described in a companion article in this issue. It combines the Weather Research and Forecasting (WRF) meteorological model, the Coupled Routing and Excess STorage (CREST) hydrologic model, and the ADvanced CIRCulation (ADCIRC) storm surge, tide, and wind-wave model to compute inundation levels and wind speeds for an ensemble of hurricane predictions. Perturbations to WRF's initial and boundary conditions and different model physics/parameterizations generate an ensemble of storm solutions, which are then used to drive the coupled hydrologic + hydrodynamic models. Hurricane Isabel (2003) is used as a case study to illustrate the ensemble-based approach. The inundation, river runoff, and wind hazard results are strongly dependent on the accuracy of the mesoscale meteorological simulations, which improves with decreasing lead time to hurricane landfall. The ensemble envelope brackets the observed behavior while providing "best-case" and "worst-case" scenarios for the subsequent risk-based evacuation model. © 2018 Society for Risk Analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kienhuis, Anne S., E-mail: anne.kienhuis@rivm.nl; RIKILT, Institute of Food Safety, Wageningen UR, PO Box 230, 6700 AE, Wageningen; Netherlands Toxicogenomics Centre
Hepatic systems toxicology is the integrative analysis of toxicogenomic technologies, e.g., transcriptomics, proteomics, and metabolomics, in combination with traditional toxicology measures to improve the understanding of mechanisms of hepatotoxic action. Hepatic toxicology studies that have employed toxicogenomic technologies to date have already provided a proof of principle for the value of hepatic systems toxicology in hazard identification. In the present review, acetaminophen is used as a model compound to discuss the application of toxicogenomics in hepatic systems toxicology for its potential role in the risk assessment process, to progress from hazard identification towards hazard characterization. The toxicogenomics-based parallelogram is used to identify current achievements and limitations of acetaminophen toxicogenomic in vivo and in vitro studies for in vitro-to-in vivo and interspecies comparisons, with the ultimate aim to extrapolate animal studies to humans in vivo. This article provides a model for comparison of more species and more in vitro models enhancing the robustness of common toxicogenomic responses and their relevance to human risk assessment. To progress to quantitative dose-response analysis needed for hazard characterization, in hepatic systems toxicology studies, generation of toxicogenomic data of multiple doses/concentrations and time points is required. Newly developed bioinformatics tools for quantitative analysis of toxicogenomic data can aid in the elucidation of dose-responsive effects. The challenge herein is to assess which toxicogenomic responses are relevant for induction of the apical effect and whether perturbations are sufficient for the induction of downstream events, eventually causing toxicity.
Dinitz, Laura B.
2008-01-01
With costs of natural disasters skyrocketing and populations increasingly settling in areas vulnerable to natural hazards, society is challenged to better allocate its limited risk-reduction resources. In 2000, Congress passed the Disaster Mitigation Act, amending the Robert T. Stafford Disaster Relief and Emergency Assistance Act (Robert T. Stafford Disaster Relief and Emergency Assistance Act, Pub. L. 93-288, 1988; Federal Emergency Management Agency, 2002, 2008b; Disaster Mitigation Act, 2000), mandating that State, local, and tribal communities prepare natural-hazard mitigation plans to qualify for pre-disaster mitigation grants and post-disaster aid. The Federal Emergency Management Agency (FEMA) was assigned to coordinate and implement hazard-mitigation programs, and it published information about specific mitigation-plan requirements and the mechanisms (through the Hazard Mitigation Grant Program-HMGP) for distributing funds (Federal Emergency Management Agency, 2002). FEMA requires that each community develop a mitigation strategy outlining long-term goals to reduce natural-hazard vulnerability, mitigation objectives and specific actions to reduce the impacts of natural hazards, and an implementation plan for those actions. The implementation plan should explain methods for prioritizing, implementing, and administering the actions, along with a 'cost-benefit review' justifying the prioritization. FEMA, along with the National Institute of Building Sciences (NIBS), supported the development of HAZUS ('Hazards U.S.'), a geospatial natural-hazards loss-estimation tool, to help communities quantify potential losses and to aid in the selection and prioritization of mitigation actions. HAZUS was expanded to a multiple-hazard version, HAZUS-MH, that combines population, building, and natural-hazard science and economic data and models to estimate physical damages, replacement costs, and business interruption for specific natural-hazard scenarios. HAZUS-MH currently performs analyses for earthquakes, floods, and hurricane wind. HAZUS-MH loss estimates, however, do not account for some uncertainties associated with the specific natural-hazard scenarios, such as the likelihood of occurrence within a particular time horizon or the effectiveness of alternative risk-reduction options. Because of the uncertainties involved, it is challenging to make informative decisions about how to cost-effectively reduce risk from natural-hazard events. Risk analysis is one approach that decision-makers can use to evaluate alternative risk-reduction choices when outcomes are unknown. The Land Use Portfolio Model (LUPM), developed by the U.S. Geological Survey (USGS), is a geospatial scenario-based tool that incorporates hazard-event uncertainties to support risk analysis. The LUPM offers an approach to estimate and compare risks and returns from investments in risk-reduction measures. This paper describes and demonstrates a hypothetical application of the LUPM for Ventura County, California, and examines the challenges involved in developing decision tools that provide quantitative methods to estimate losses and analyze risk from natural hazards.
NASA Technical Reports Server (NTRS)
Deckert, George
2010-01-01
This viewgraph presentation reviews the NASA hazard analysis process. The contents include: 1) Significant Incidents and Close Calls in Human Spaceflight; 2) Subsystem Safety Engineering Through the Project Life Cycle; 3) The Risk Informed Design Process; 4) Types of NASA Hazard Analysis; 5) Preliminary Hazard Analysis (PHA); 6) Hazard Analysis Process; 7) Identify Hazardous Conditions; 8) Consider All Interfaces; 9) Work a Preliminary Hazard List; 10) NASA Generic Hazards List; and 11) Final Thoughts.
Deterministic seismic hazard macrozonation of India
NASA Astrophysics Data System (ADS)
Kolathayar, Sreevalsa; Sitharam, T. G.; Vipin, K. S.
2012-10-01
Earthquakes are known to have occurred in the Indian subcontinent from ancient times. This paper presents the results of a seismic hazard analysis of India (6°-38°N and 68°-98°E) based on the deterministic approach using the latest seismicity data (up to 2010). The hazard analysis was done using two different source models (linear sources and point sources) and 12 well-recognized attenuation relations considering the varied tectonic provinces in the region. The earthquake data obtained from different sources were homogenized and declustered, and a total of 27,146 earthquakes of moment magnitude 4 and above were listed in the study area. The seismotectonic map of the study area was prepared by considering the faults, lineaments and shear zones which are associated with earthquakes of magnitude 4 and above. A new program was developed in MATLAB for smoothing of the point sources. For assessing the seismic hazard, the study area was divided into small grids of size 0.1° × 0.1° (approximately 10 × 10 km), and the hazard parameters were calculated at the center of each of these grid cells by considering all the seismic sources within a radius of 300 to 400 km. Rock-level peak horizontal acceleration (PHA) and spectral accelerations for periods of 0.1 and 1 s have been calculated for all the grid points with a deterministic approach using a code written in MATLAB. Epistemic uncertainty in the hazard definition has been tackled within a logic-tree framework considering two types of sources and three attenuation models for each grid point. A hazard evaluation without the logic-tree approach has also been performed for comparison of the results. The contour maps showing the spatial variation of hazard values are presented in the paper.
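The deterministic grid computation reduces to taking, at each grid point, the maximum ground motion over all nearby sources at their assigned magnitudes. A minimal sketch with a toy attenuation function (illustrative only, not one of the 12 published relations used in the paper):

```python
import numpy as np

def toy_atten(m, r_km, c0=-3.5, c1=0.9, c2=1.2):
    """Illustrative attenuation: ln PGA(g) = c0 + c1*M - c2*ln(R + 10)."""
    return np.exp(c0 + c1 * m - c2 * np.log(r_km + 10.0))

# Hypothetical point sources: (lon, lat) and maximum magnitude.
sources = [((78.0, 20.0), 7.5), ((80.5, 18.5), 6.8)]

lon, lat = np.meshgrid(np.arange(77, 82, 0.1), np.arange(17, 22, 0.1))
pha = np.zeros_like(lon)
for (sx, sy), mmax in sources:
    r = np.hypot(lon - sx, lat - sy) * 111.0   # rough degrees -> km
    pha = np.maximum(pha, toy_atten(mmax, r))  # deterministic envelope
print("peak rock-level PHA on grid: %.2f g" % pha.max())
```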
Extended GTST-MLD for aerospace system safety analysis.
Guo, Chiming; Gong, Shiyu; Tan, Lin; Guo, Bo
2012-06-01
The hazards caused by complex interactions in aerospace systems have become a problem that urgently needs to be addressed. This article introduces a method for aerospace system hazard interaction identification based on extended GTST-MLD (goal tree-success tree-master logic diagram) during the design stage. GTST-MLD is a functional modeling framework with a simple architecture. Ontology is used to extend the system-interaction description capability of GTST-MLD by adding system design knowledge and past accident experience. From the levels of functionality and equipment, respectively, this approach can help the technician detect potential hazard interactions. Finally, a case study is used to demonstrate the method. © 2011 Society for Risk Analysis.
Assessment and Control of Spacecraft Charging Risks on the International Space Station
NASA Technical Reports Server (NTRS)
Koontz, Steve; Valentine, Mark; Keeping, Thomas; Edeen, Marybeth; Spetch, William; Dalton, Penni
2004-01-01
The International Space Station (ISS) operates in the F2 region of Earth's ionosphere, orbiting at altitudes ranging from 350 to 450 km at an inclination of 51.6 degrees. The relatively dense, cool F2 ionospheric plasma suppresses surface charging processes much of the time, and the flux of relativistic electrons is low enough to preclude deep dielectric charging processes. The most important spacecraft charging processes in the ISS orbital environment are: 1) ISS electrical power system interactions with the F2 plasma, 2) magnetic induction processes resulting from flight through the geomagnetic field, and 3) charging processes that result from interaction with auroral electrons at high latitude. Recently, the continuing review and evaluation of putative ISS charging hazards required by the ISS Program Office revealed that ISS charging could produce an electrical shock hazard to the ISS crew during extravehicular activity (EVA). ISS charging risks are being evaluated in an ongoing measurement and analysis campaign. The results of ISS charging measurements are combined with a recently developed model of ISS charging (the Plasma Interaction Model) and an exhaustive analysis of historical ionospheric variability data (ISS Ionospheric Specification) to evaluate ISS charging risks using Probabilistic Risk Assessment (PRA) methods. The PRA combines estimates of the frequency of occurrence and severity of the charging hazards with estimates of the reliability of various hazard control systems, as required by NASA's safety and risk management programs, to enable design and selection of a hazard control approach that minimizes overall programmatic and personnel risk. The PRA provides a quantitative methodology for incorporating the results of the ISS charging measurement and analysis campaigns into the necessary hazard reports, EVA procedures, and ISS flight rules required for operating ISS in a safe and productive manner.
2011-03-01
Stillwell, Lisa; Blanchard-Montgomery, Margaret; Blanton, Brian (Renaissance Computing Institute)
... Flood Insurance Studies in the study area, and serve as the basis for new coastal hazard analysis and ultimately updated Flood Insurance Rate Maps (FIRMs). ... hazard zones in coastal areas of the United States. Under Task Order HSFE03-06-X-0023, the U.S. Army Corps of Engineers (USACE) and project partners are ...
NASA Astrophysics Data System (ADS)
Convertito, Vincenzo; Zollo, Aldo
2011-08-01
In this study, we address the issue of short-term to medium-term probabilistic seismic hazard analysis for two volcanic areas, Campi Flegrei caldera and Mt. Vesuvius in the Campania region of southern Italy. Two different phases of the volcanic activity are considered. The first, which we term the pre-crisis phase, concerns the present quiescent state of the volcanoes that is characterized by low-to-moderate seismicity. The second phase, syn-crisis, concerns the unrest phase that can potentially lead to eruption. For the Campi Flegrei case study, we analyzed the pattern of seismicity during the 1982-1984 ground uplift episode (bradyseism). For Mt. Vesuvius, two different time-evolutionary models for seismicity were adopted, corresponding to different ways in which the volcano might erupt. We performed a site-specific analysis, linked with the hazard map, to investigate the effects of input parameters, in terms of source geometry, mean activity rate, periods of data collection, and return periods, for the syn-crisis phase. The analysis in the present study of the pre-crisis phase allowed a comparison of the results of probabilistic seismic hazard analysis for the two study areas with those provided in the Italian national hazard map. For the Mt. Vesuvius area in particular, the results show that the hazard can be greater than that reported in the national hazard map when information at a local scale is used. For the syn-crisis phase, the main result is that the data recorded during the early months of the unrest phase are substantially representative of the seismic hazard during the whole duration of the crisis.
Neo-Deterministic and Probabilistic Seismic Hazard Assessments: a Comparative Analysis
NASA Astrophysics Data System (ADS)
Peresan, Antonella; Magrin, Andrea; Nekrasova, Anastasia; Kossobokov, Vladimir; Panza, Giuliano F.
2016-04-01
Objective testing is the key issue for any reliable seismic hazard assessment (SHA). Different earthquake hazard maps must demonstrate their capability in anticipating ground shaking from future strong earthquakes before they can appropriately be used for different purposes - such as engineering design, insurance, and emergency management. Quantitative assessment of map performance is also an essential step in the scientific process of their revision and possible improvement. Cross-checking of probabilistic models with available observations and independent physics-based models is recognized as a major validation procedure. The existing maps from the classical probabilistic seismic hazard analysis (PSHA), as well as those from the neo-deterministic analysis (NDSHA), which have already been developed for several regions worldwide (including Italy, India and North Africa), are considered to exemplify the possibilities of cross-comparative analysis in spotting the limits and advantages of different methods. Where the data permit, a comparative analysis against the documented seismic activity observed in reality is carried out, showing how available observations about past earthquakes can contribute to assessing the performance of the different methods. Neo-deterministic refers to a scenario-based approach, which allows for consideration of a wide range of possible earthquake sources as the starting point for scenarios constructed via full waveform modeling. The method does not make use of empirical attenuation models (i.e., Ground Motion Prediction Equations, GMPE) and naturally supplies realistic time series of ground shaking (i.e., complete synthetic seismograms), readily applicable to complete engineering analysis and other mitigation actions. The standard NDSHA maps provide reliable envelope estimates of maximum seismic ground motion from a wide set of possible scenario earthquakes, including the largest deterministically or historically defined credible earthquake. In addition, the flexibility of NDSHA allows for the generation of ground shaking maps at specified long-term return times, which may permit a straightforward comparison between NDSHA and PSHA maps in terms of average rates of exceedance for specified time windows. The comparison of NDSHA and PSHA maps, particularly for very long recurrence times, may indicate to what extent probabilistic ground shaking estimates are consistent with those from physical models of seismic wave propagation. A systematic comparison over the territory of Italy is carried out by exploiting the uniqueness of the Italian earthquake catalogue, a data set covering more than a millennium (a time interval about ten times longer than that available in most regions worldwide) with a satisfactory completeness level for M>5, which underpins the results of the analysis. By analysing in some detail seismicity in the Vrancea region, we show that well constrained macroseismic field information for individual earthquakes may provide useful information about the reliability of ground shaking estimates. Finally, in order to generalise the observations, the comparative analysis is extended to further regions where both standard NDSHA and PSHA maps are available (e.g., the State of Gujarat, India). The final Global Seismic Hazard Assessment Program (GSHAP) results and the most recent version of the Seismic Hazard Harmonization in Europe (SHARE) project maps, along with other national-scale probabilistic maps, all obtained by PSHA, are considered for this comparative analysis.
Clark, Renee M; Besterfield-Sacre, Mary E
2009-03-01
We take a novel approach to analyzing hazardous materials transportation risk in this research. Previous studies analyzed this risk from an operations research (OR) or quantitative risk assessment (QRA) perspective by minimizing or calculating risk along a transport route. Further, even though the majority of incidents occur when containers are unloaded, the research has not focused on transportation-related activities, including container loading and unloading. In this work, we developed a decision model of a hazardous materials release during unloading using actual data and an exploratory data modeling approach. Previous studies have had a theoretical perspective in terms of identifying and advancing the key variables related to this risk, and there has not been a focus on probability and statistics-based approaches for doing this. Our decision model empirically identifies the critical variables using an exploratory methodology for a large, highly categorical database involving latent class analysis (LCA), loglinear modeling, and Bayesian networking. Our model identified the most influential variables and countermeasures for two consequences of a hazmat incident, dollar loss and release quantity, and is one of the first models to do this. The most influential variables were found to be related to the failure of the container. In addition to analyzing hazmat risk, our methodology can be used to develop data-driven models for strategic decision making in other domains involving risk.
Predicting System Accidents with Model Analysis During Hybrid Simulation
NASA Technical Reports Server (NTRS)
Malin, Jane T.; Fleming, Land D.; Throop, David R.
2002-01-01
Standard discrete event simulation is commonly used to identify system bottlenecks and starving and blocking conditions in resources and services. The CONFIG hybrid discrete/continuous simulation tool can simulate such conditions in combination with inputs external to the simulation. This provides a means for evaluating the vulnerability to system accidents of a system's design, operating procedures, and control software. System accidents are brought about by complex unexpected interactions among multiple system failures, faulty or misleading sensor data, and inappropriate responses of human operators or software. The flows of resource and product materials play a central role in the hazardous situations that may arise in fluid transport and processing systems. We describe the capabilities of CONFIG for simulation-time linear circuit analysis of fluid flows in the context of model-based hazard analysis. We focus on how CONFIG simulates the static stresses in systems of flow. Unlike other flow-related properties, static stresses (or static potentials) cannot be represented by a set of state equations. The distribution of static stresses is dependent on the specific history of operations performed on a system. We discuss the use of this type of information in hazard analysis of system designs.
Crowther, Michael J; Look, Maxime P; Riley, Richard D
2014-09-28
Multilevel mixed effects survival models are used in the analysis of clustered survival data, such as repeated events, multicenter clinical trials, and individual participant data (IPD) meta-analyses, to investigate heterogeneity in baseline risk and covariate effects. In this paper, we extend parametric frailty models including the exponential, Weibull and Gompertz proportional hazards (PH) models and the log-logistic, log-normal, and generalized gamma accelerated failure time models to allow any number of normally distributed random effects. Furthermore, we extend the flexible parametric survival model of Royston and Parmar, modeled on the log-cumulative hazard scale using restricted cubic splines, to include random effects while also allowing for non-PH (time-dependent effects). Maximum likelihood is used to estimate the models utilizing adaptive or nonadaptive Gauss-Hermite quadrature. The methods are evaluated through simulation studies representing clinically plausible scenarios of a multicenter trial and IPD meta-analysis, showing good performance of the estimation method. The flexible parametric mixed effects model is illustrated using a dataset of patients with kidney disease and repeated times to infection and an IPD meta-analysis of prognostic factor studies in patients with breast cancer. User-friendly Stata software is provided to implement the methods. Copyright © 2014 John Wiley & Sons, Ltd.
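The central computational step described above - integrating normally distributed random effects out of the likelihood by Gauss-Hermite quadrature - can be illustrated for the simplest case, an exponential PH model with a normal random intercept. A minimal numerical sketch (not the authors' Stata implementation; all names are illustrative):

```python
import numpy as np
from numpy.polynomial.hermite import hermgauss

def cluster_loglik(t, d, x, beta, lam0, sigma, n_nodes=15):
    """Marginal log-likelihood of one cluster: exponential PH model with a
    N(0, sigma^2) random intercept, integrated out by Gauss-Hermite quadrature.

    t : times, d : event indicators (1/0), x : covariate values (1-D arrays).
    """
    z, w = hermgauss(n_nodes)            # nodes/weights for integral of exp(-u^2) f(u)
    u = np.sqrt(2.0) * sigma * z         # change of variables to N(0, sigma^2)
    ll = np.empty(n_nodes)
    for k in range(n_nodes):
        lam = lam0 * np.exp(beta * x + u[k])       # subject-level hazards at node k
        ll[k] = np.sum(d * np.log(lam) - lam * t)  # exponential model log-likelihood
    m = ll.max()                                   # log-sum-exp for stability
    return m + np.log(np.sum(w * np.exp(ll - m)) / np.sqrt(np.pi))
```

Summing this quantity over clusters and maximizing over (beta, lam0, sigma) gives the maximum likelihood estimates; adaptive quadrature re-centres the nodes per cluster but follows the same idea.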
Syed, Hamzah; Jorgensen, Andrea L; Morris, Andrew P
2016-06-01
The aim of this study was to evaluate the power to detect associations between SNPs and time-to-event outcomes across a range of pharmacogenomic study designs, while comparing alternative regression approaches. Simulations were conducted to compare Cox proportional hazards modeling accounting for censoring with logistic regression modeling of a dichotomized outcome at the end of the study. The Cox proportional hazards model was demonstrated to be more powerful than the logistic regression analysis. The difference in power between the approaches was highly dependent on the rate of censoring. Initial evaluation of single-nucleotide polymorphism association signals using computationally efficient software with dichotomized outcomes provides an effective screening tool for some design scenarios, and thus has important implications for the development of analytical protocols in pharmacogenomic studies.
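A minimal simulation sketch of the comparison described above, using the lifelines and statsmodels Python packages rather than the software of the study (sample size, effect size and censoring rate are assumed values):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)

def one_trial(n=1000, maf=0.3, log_hr=0.3, censor_rate=0.5):
    g = rng.binomial(2, maf, n)                      # SNP genotype, additive coding
    t_event = rng.exponential(1.0 / np.exp(log_hr * g))
    t_cens = rng.exponential(1.0 / censor_rate, n)   # independent censoring
    df = pd.DataFrame({'g': g,
                       'time': np.minimum(t_event, t_cens),
                       'event': (t_event <= t_cens).astype(int)})
    p_cox = CoxPHFitter().fit(df, 'time', 'event').summary.loc['g', 'p']
    # dichotomized outcome: event observed by end of study (censoring ignored)
    p_log = sm.Logit(df['event'], sm.add_constant(df['g'])).fit(disp=0).pvalues['g']
    return p_cox < 0.05, p_log < 0.05

power = np.mean([one_trial() for _ in range(200)], axis=0)
print(f'power: Cox={power[0]:.2f}, logistic={power[1]:.2f}')
```

Raising censor_rate in this sketch widens the gap between the two approaches, mirroring the study's finding that the power difference depends strongly on the rate of censoring.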
Artigas, Francisco; Bosits, Stephanie; Kojak, Saleh; Elefante, Dominador; Pechmann, Ildiko
2016-10-01
The accurate forecast of the Hurricane Sandy sea surge was the result of integrating the most sophisticated environmental monitoring technology available. This stands in contrast to the limited information and technology that exists at the community level to translate such forecasts into flood hazard levels on the ground, at scales that are meaningful to property owners. Appropriately scaled maps with high levels of certainty can be effectively used to convey exposure to flood hazard at the community level. This paper explores the most basic analysis and data required to generate a relatively accurate flood hazard map to convey inundation risk due to sea surge. A Boolean overlay analysis of four input layers - elevation and slope derived from LiDAR data, and distances from streams and catch basins derived from aerial photography and field reconnaissance - was used to create a spatial model that explained 55 % of the extent and depth of the flood during Hurricane Sandy. When a ponding layer was added to account for depressions that would fill and spill over to nearby areas, the new model explained almost 70 % of the extent and depth of the flood. The study concludes that fairly accurate maps can be created with readily available information, and that a great deal about risk of inundation at the property level can be inferred from flood hazard maps. The study further concludes that, although local communities are encouraged to prepare for disasters, the existing Federal emergency management framework in reality provides very little incentive to do so.
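A Boolean overlay of co-registered raster layers, as used above, reduces to element-wise logical operations on arrays. A minimal Python sketch (the thresholds are assumed for illustration; the paper's cut-off values are not reproduced here):

```python
import numpy as np

# assumed thresholds, for illustration only
ELEV_MAX_M = 1.5          # low-lying ground
SLOPE_MAX_DEG = 2.0       # nearly flat terrain
DIST_STREAM_MAX_M = 50.0  # close to a stream
DIST_BASIN_MAX_M = 30.0   # close to a catch basin

def flood_hazard_mask(elev, slope, d_stream, d_basin):
    """Boolean overlay of four co-registered 2-D raster layers."""
    return ((elev <= ELEV_MAX_M) &
            (slope <= SLOPE_MAX_DEG) &
            ((d_stream <= DIST_STREAM_MAX_M) | (d_basin <= DIST_BASIN_MAX_M)))
```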
NASA Astrophysics Data System (ADS)
Moncoulon, D.; Labat, D.; Ardon, J.; Onfroy, T.; Leblois, E.; Poulard, C.; Aji, S.; Rémy, A.; Quantin, A.
2013-07-01
The analysis of flood exposure at the national scale for the French insurance market must combine the generation of a probabilistic event set of all possible, but not yet observed, flood situations with hazard and damage modeling. In this study, the hazard and damage models are calibrated against a 1995-2012 historical event set, both for hazard results (river flow, flooded areas) and for loss estimations. Thus, the uncertainties in the deterministic estimation of a single event loss are known before a probabilistic event set is simulated. To take into account at least 90% of the insured flood losses, the probabilistic event set must combine river overflow (small and large catchments) with surface runoff due to heavy rainfall on the slopes of the watershed. Indeed, internal studies of the CCR claims database have shown that approximately 45% of insured flood losses are located inside the floodplains and 45% outside; the remaining 10% are due to sea-surge floods and groundwater rise. In this approach, two independent probabilistic methods are combined to create a single flood loss distribution: the generation of fictive river flows based on the historical records of the river gauge network, and the generation of fictive rain fields on small catchments, calibrated on the 1958-2010 Météo-France rain database SAFRAN. All the events in the probabilistic event sets are simulated with the deterministic model. This hazard and damage distribution is used to simulate flood losses at the national scale for an insurance company (MACIF) and to generate flood areas associated with hazard return periods. The flood maps concern river overflow and surface water runoff. Validation of these maps is conducted by comparison with address-located claim data on a small catchment (downstream Argens).
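The combination of two independent probabilistic loss mechanisms into a single annual loss distribution can be sketched with a simple Monte Carlo simulation. All occurrence probabilities and loss distributions below are hypothetical placeholders, not CCR's calibrated models:

```python
import numpy as np

rng = np.random.default_rng(0)
n_years = 100_000                 # number of simulated fictive years

# hypothetical annual-loss generators for the two independent mechanisms;
# in the study these come from fictive river flows and fictive rain fields
occ_over = rng.random(n_years) < 0.30          # years with river-overflow losses
occ_run = rng.random(n_years) < 0.50           # years with surface-runoff losses
loss_over = occ_over * rng.lognormal(16.0, 1.2, n_years)
loss_run = occ_run * rng.lognormal(15.0, 1.0, n_years)

annual_loss = loss_over + loss_run             # combined single loss distribution

# loss level with a 100-year return period
print(np.quantile(annual_loss, 1.0 - 1.0 / 100.0))
```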
Demonstration of a Safety Analysis on a Complex System
NASA Technical Reports Server (NTRS)
Leveson, Nancy; Alfaro, Liliana; Alvarado, Christine; Brown, Molly; Hunt, Earl B.; Jaffe, Matt; Joslyn, Susan; Pinnell, Denise; Reese, Jon; Samarziya, Jeffrey;
1997-01-01
For the past 17 years, Professor Leveson and her graduate students have been developing a theoretical foundation for safety in complex systems and building a methodology upon that foundation. The methodology includes special management structures and procedures, system hazard analyses, software hazard analysis, requirements modeling and analysis for completeness and safety, special software design techniques including the design of human-machine interaction, verification, operational feedback, and change analysis. The Safeware methodology is based on system safety techniques that are extended to deal with software and human error. Automation is used to enhance our ability to cope with complex systems. Identification, classification, and evaluation of hazards are done using modeling and analysis. To be effective, the models and analysis tools must consider the hardware, software, and human components in these systems. They also need to include a variety of analysis techniques and orthogonal approaches: there exists no single safety analysis or evaluation technique that can handle all aspects of complex systems. Applying only one or two may make us feel satisfied, but will produce limited results. We report here on a demonstration, performed as part of a contract with NASA Langley Research Center, of the Safeware methodology on the Center-TRACON Automation System (CTAS) portion of the air traffic control (ATC) system and procedures currently employed at the Dallas/Fort Worth (DFW) TRACON (Terminal Radar Approach CONtrol). CTAS is an automated system to assist controllers in handling arrival traffic in the DFW area. Safety is a system property, not a component property, so our safety analysis considers the entire system and not simply the automated components. Because safety analysis of a complex system is an interdisciplinary effort, our team included system engineers, software engineers, human factors experts, and cognitive psychologists.
Rockfall hazard analysis using LiDAR and spatial modeling
NASA Astrophysics Data System (ADS)
Lan, Hengxing; Martin, C. Derek; Zhou, Chenghu; Lim, Chang Ho
2010-05-01
Rockfalls have been significant geohazards along the Canadian Class 1 Railways (CN Rail and CP Rail) since their construction in the late 1800s. These rockfalls cause damage to infrastructure, interruption of business, and environmental impacts, and their occurrence varies both spatially and temporally. The proactive management of these rockfall hazards requires enabling technologies. This paper discusses a hazard assessment strategy for rockfalls along a section of a Canadian railway using LiDAR and spatial modeling. LiDAR provides accurate topographical information of the source area of rockfalls and along their paths. Spatial modeling was conducted using Rockfall Analyst, a three dimensional extension to GIS, to determine the characteristics of the rockfalls in terms of travel distance, velocity and energy. Historical rockfall records were used to calibrate the physical characteristics of the rockfall processes. The results based on a high-resolution digital elevation model from a LiDAR dataset were compared with those based on a coarse digital elevation model. A comprehensive methodology for rockfall hazard assessment is proposed which takes into account the characteristics of source areas, the physical processes of rockfalls and the spatial attribution of their frequency and energy.
Identification and delineation of flood hazard areas using high-accuracy DEM data
NASA Astrophysics Data System (ADS)
Riadi, B.; Barus, B.; Widiatmaka; Yanuar, M. J. P.; Pramudya, B.
2018-05-01
Flood incidents that often occur in Karawang regency need to be mitigated. Expectations rest on technologies that can predict, anticipate and reduce disaster risks. Flood modeling techniques using Digital Elevation Model (DEM) data can be applied in mitigation activities. High-accuracy DEM data used in modeling will result in better flood models. Processing high-accuracy DEM data yields information about surface morphology that can be used to identify indications of flood hazard areas. The purpose of this study was to identify and delineate flood hazard areas by identifying wetland areas using DEM data and Landsat-8 images. High-resolution TerraSAR-X data are used to detect wetlands from the landscape, while land cover is identified from Landsat image data. The Topographic Wetness Index (TWI) method is used to detect and identify wetland areas from the underlying DEM data, while land cover is analyzed with the Tasseled Cap Transformation (TCT) method. The TWI modeling yields information about potentially flood-prone land. Overlaying the TWI map with the land cover map shows that, in Karawang regency, the areas most vulnerable to flooding are rice fields. The spatial accuracy of the flood hazard area in this study was 87%.
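The TWI itself is ln(a / tan β), where a is the specific catchment area and β the local slope. A minimal Python sketch, assuming the flow accumulation raster has already been computed with any flow-routing tool:

```python
import numpy as np

def twi(dem, flow_acc, cell=10.0):
    """Topographic Wetness Index, TWI = ln(a / tan(beta)).

    dem : 2-D elevation array (m); flow_acc : upslope contributing cells
    (assumed precomputed elsewhere); cell : pixel size (m).
    """
    dzdy, dzdx = np.gradient(dem, cell)
    slope = np.arctan(np.hypot(dzdx, dzdy))             # slope angle beta (radians)
    a = (flow_acc + 1) * cell                           # specific catchment area (m)
    return np.log(a / np.tan(np.maximum(slope, 1e-6)))  # guard against flat cells
```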
Vithanage Receives 2009 Natural Hazards Focus Group Award for Graduate Research
NASA Astrophysics Data System (ADS)
2010-04-01
Meththika Vithanage has been awarded the Natural Hazards Focus Group Award for Graduate Research, given annually to recent Ph.D. recipients for outstanding contributions to natural hazards research. Vithanage’s thesis is entitled “Effect of tsunami on coastal aquifers: Field studies and tank experiments.” She was formally presented with the award at the Natural Hazards Focus Group reception during the 2009 AGU Fall Meeting, held 14-18 December in San Francisco, Calif. Vithanage received her B.S. in natural resources from Sabaragamuwa University of Sri Lanka in 2002 and an M.S. in environmental science from the University of Peradeniya, Sri Lanka, in 2005. In 2009, she attained a Ph.D. in hydrogeology under the supervision of Karsten Jensen and Peter Engesgaard in the Department of Geology and Geography at University of Copenhagen, Denmark. Her research interests include groundwater flow modeling, density-dependent flow and solute transport modeling, and water quality analysis.
Quantile Regression with Censored Data
ERIC Educational Resources Information Center
Lin, Guixian
2009-01-01
The Cox proportional hazards model and the accelerated failure time model are frequently used in survival data analysis. They are powerful, yet have limitation due to their model assumptions. Quantile regression offers a semiparametric approach to model data with possible heterogeneity. It is particularly powerful for censored responses, where the…
USEPA EXAMPLE EXIT LEVEL ANALYSIS RESULTS
Developed by NERL/ERD for the Office of Solid Waste, the enclosed product provides an example uncertainty analysis (UA) and initial process-based sensitivity analysis (SA) of hazardous waste "exit" concentrations for 7 chemicals and metals using the 3MRA Version 1.0 Modeling Syst...
Seismic Landslide Hazard for the City of Berkeley, California
Miles, Scott B.; Keefer, David K.
2001-01-01
This map describes the possible hazard from earthquake-induced landslides for the city of Berkeley, CA. The hazard depicted by this map was modeled for a scenario corresponding to an M=7.1 earthquake on the Hayward, CA fault. This scenario magnitude is associated with complete rupture of the northern and southern segments of the Hayward fault, an event that has an estimated return period of about 500 years. The modeled hazard also corresponds to completely saturated ground-water conditions resulting from an extreme storm event or series of storm events. This combination of earthquake and ground-water scenarios represents a particularly severe state of hazard for earthquake-induced landslides. For dry ground-water conditions, overall hazard will be less, while relative patterns of hazard are likely to change. Purpose: The map is intended as a tool for regional planning. Any site-specific planning or analysis should be undertaken with the assistance of a qualified geotechnical engineer. This hazard map should not be used as a substitute for the State of California Seismic Hazard Zones map for the same area (see California Department of Conservation, Division of Mines and Geology, 1999). As previously noted for maps of this type by Wieczorek and others (1985), this map should not be used as a basis to determine the absolute risk from seismically triggered landslides at any locality, as the sole justification for zoning or rezoning any parcel, for detailed design of any lifeline, for site-specific hazard-reduction planning, or for setting or modifying insurance rates.
21 CFR 120.7 - Hazard analysis.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 21 Food and Drugs 2 2012-04-01 2012-04-01 false Hazard analysis. 120.7 Section 120.7 Food and... hazards. The written hazard analysis shall consist of at least the following: (1) Identification of food..., including food hazards that can occur before, during, and after harvest. The hazard analysis shall be...
21 CFR 120.7 - Hazard analysis.
Code of Federal Regulations, 2013 CFR
2013-04-01
... 21 Food and Drugs 2 2013-04-01 2013-04-01 false Hazard analysis. 120.7 Section 120.7 Food and... hazards. The written hazard analysis shall consist of at least the following: (1) Identification of food..., including food hazards that can occur before, during, and after harvest. The hazard analysis shall be...
21 CFR 120.7 - Hazard analysis.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 21 Food and Drugs 2 2011-04-01 2011-04-01 false Hazard analysis. 120.7 Section 120.7 Food and... hazards. The written hazard analysis shall consist of at least the following: (1) Identification of food..., including food hazards that can occur before, during, and after harvest. The hazard analysis shall be...
21 CFR 120.7 - Hazard analysis.
Code of Federal Regulations, 2014 CFR
2014-04-01
... 21 Food and Drugs 2 2014-04-01 2014-04-01 false Hazard analysis. 120.7 Section 120.7 Food and... hazards. The written hazard analysis shall consist of at least the following: (1) Identification of food..., including food hazards that can occur before, during, and after harvest. The hazard analysis shall be...
Specifying the ISS Plasma Environment
NASA Technical Reports Server (NTRS)
Minow, Joseph I.; Diekmann, Anne; Neergaard, Linda; Bui, Them; Mikatarian, Ronald; Barsamian, Hagop; Koontz, Steven
2002-01-01
Quantifying the spacecraft charging risks and corresponding hazards for the International Space Station (ISS) requires a plasma environment specification describing the natural variability of ionospheric temperature (Te) and density (Ne). Empirical ionospheric specification and forecast models such as the International Reference Ionosphere (IRI) model typically only provide estimates of long term (seasonal) mean Te and Ne values for the low Earth orbit environment. Knowledge of the Te and Ne variability, as well as the likelihood of extreme deviations from the mean values, is required to estimate both the magnitude and frequency of occurrence of potentially hazardous spacecraft charging environments for a given ISS construction stage and flight configuration. This paper describes the statistical analysis of historical ionospheric low Earth orbit plasma measurements used to estimate Ne, Te variability in the ISS flight environment. The statistical variability analysis of Ne and Te enables calculation of the expected frequency of occurrence of any particular values of Ne and Te, especially those that correspond to possibly hazardous spacecraft charging environments. The database used in the original analysis included measurements from the AE-C, AE-D, and DE-2 satellites. Recent work on the database has added additional satellites as well as ground-based incoherent scatter radar observations. Deviations of the data values from the IRI-estimated Ne, Te parameters for each data point provide a statistical basis for modeling the deviations of the plasma environment from the IRI model output.
NASA Astrophysics Data System (ADS)
Huttenlau, Matthias; Stötter, Johann
2010-05-01
Reinsurance companies have reported a marked increase in natural-hazard-related losses, both insured and economic, within the last decades on a global scale. This ongoing trend can be described as a product of the dynamics in the natural sphere and in the anthroposphere. To analyze the potential impact of natural hazard processes on a certain insurance portfolio, or on society in general, reinsurance companies and risk management consultants have developed loss models. However, those models generally do not fit the scale-dependent demands of regional-scale analyses, as would be appropriate (i) for analyses at the scale of a specific province or (ii) for portfolio analyses of regional insurance companies. Moreover, the scientific basis of most of the models is not transparently documented, and scientific evaluation of the methodological concepts is therefore not possible (black box). This is contrary to the scientific principles of transparency and traceability. Especially in mountain regions like the European Alps, analyses must adequately consider the specific circumstances: (i) the specific small-scale characteristics, (ii) the generally high process dynamics, (iii) the occurrence of gravitational mass movements, which are related to high relief energy and thus exist only in mountain regions, (iv) the small proportion of the area of permanent settlement relative to the overall area, (v) the high concentration of values on the valley floors, and (vi) the exposure of important infrastructure and lifelines, among others. Risk-based analyses therefore estimate the potential consequences of hazard processes for the built environment in a standardized way using the risk components (i) hazard, (ii) elements at risk, and (iii) vulnerability. However, most research and progress has been made in the field of hazard analysis, whereas the other two components have not been developed accordingly. Since these three components enter the risk concept as influencing factors without any weighting, this has significant implications for the results of risk analyses. Thus, an equal and scale-appropriate balance of these risk components is a fundamental key factor for effective natural hazard risk analyses. The results of such analyses inform decision makers in the insurance industry, the administration, and politics about potential consequences, and are the basis for appropriate risk management strategies. Thereby, results (i) of an annual or probabilistic risk assessment have to be distinguished from (ii) scenario-based analyses. The former are based on statistics of periodically or episodically occurring events, whereas the latter approach is applied especially to extreme, non-linear, stochastic events. Focusing on the needs of insurance companies in particular, the former approach is appropriate for premium pricing and reinsurance strategies with an annual perspective, whereas the latter focuses on events with extreme loss burdens under worst-case criteria to guarantee adequate reinsurance coverage. Moreover, the demand for adequate loss model approaches and methods is strengthened by the risk-based requirements of the upcoming capital requirement directive Solvency II. The present study estimates the potential elements at risk, their corresponding damage potentials, and the Probable Maximum Losses (PMLs) of extreme natural hazard events in Tyrol (Austria), and adequately considers the scale dependency and balanced application of the introduced risk components.
In addition to the analysis introduced above, a portfolio analysis of a regional insurance company was carried out. The geocoded insurance contracts of this portfolio analysis were the basis for estimating spatially, socio-economically and functionally differentiated mean insurance values for the different risk categories of (i) buildings, (ii) contents or inventory, (iii) vehicles, and (iv) persons in the study area. The estimated mean insurance values were combined with additional GIS and statistical data into a comprehensive property-by-property geodatabase of the existing elements and values. This stock of elements and values geodatabase is furthermore the consistent basis for all natural hazard analyses and enables comparison of the results. The study follows the generally accepted modules of (i) hazard analysis, (ii) exposure analysis, and (iii) consequence analysis, whereby the exposure analysis estimates the elements at risk with their corresponding damage potentials and the consequence analysis estimates the PMLs. This multi-hazard analysis focuses on process types with a high to extreme potential for negative consequences at a regional scale. In this context, (i) floods, (ii) rockslides with the potential for corresponding consequence effects (backwater ponding and outburst floods), (iii) earthquakes, (iv) hail events, and (v) winter storms were considered as hazard processes. Based on general hazard analyses (hazard maps), concrete scenarios and the areas they affect were determined. For the different hazard processes, different vulnerability approaches were considered to demonstrate their sensitivity and their implications for the results. Thus, no absolute loss values but probable loss ranges were estimated. It can be shown that the most serious losses would arise from extreme earthquake events, with loss burdens of more than € 7 bn. on buildings and inventory alone. Possible extreme flood events could lead to losses between € 2 and 2.5 bn., whereas a severe hail swath affecting the central Inn valley could result in losses of ca. € 455 mill. (of which € 285 mill. on vehicles). The potentially most serious rockslide with additional consequence effects would result in losses of up to ca. € 185 mill., and extreme winter storms can induce losses between € 100 mill. and € 150 mill.
Covariate measurement error correction methods in mediation analysis with failure time data.
Zhao, Shanshan; Prentice, Ross L
2014-12-01
Mediation analysis is important for understanding the mechanisms whereby one variable causes changes in another. Measurement error could obscure the ability of the potential mediator to explain such changes. This article focuses on developing correction methods for measurement error in the mediator with failure time outcomes. We consider a broad definition of measurement error, including technical error, and error associated with temporal variation. The underlying model with the "true" mediator is assumed to be of the Cox proportional hazards model form. The induced hazard ratio for the observed mediator no longer has a simple form independent of the baseline hazard function, due to the conditioning event. We propose a mean-variance regression calibration approach and a follow-up time regression calibration approach, to approximate the partial likelihood for the induced hazard function. Both methods demonstrate value in assessing mediation effects in simulation studies. These methods are generalized to multiple biomarkers and to both case-cohort and nested case-control sampling designs. We apply these correction methods to the Women's Health Initiative hormone therapy trials to understand the mediation effect of several serum sex hormone measures on the relationship between postmenopausal hormone therapy and breast cancer risk. © 2014, The International Biometric Society.
Seismic hazards in Thailand: a compilation and updated probabilistic analysis
NASA Astrophysics Data System (ADS)
Pailoplee, Santi; Charusiri, Punya
2016-06-01
A probabilistic seismic hazard analysis (PSHA) for Thailand was performed and compared to those of previous works. This PSHA was based upon (1) the most up-to-date paleoseismological data (slip rates), (2) the seismic source zones, (3) the seismicity parameters (a and b values), and (4) the strong ground-motion attenuation models suggested as being suitable for Thailand. For the PSHA mapping, both the ground shaking and the probability of exceedance (POE) were analyzed and mapped using various methods of presentation. In addition, site-specific PSHAs were demonstrated for ten major provinces within Thailand. For instance, ground shaking of 0.1-0.4 g and 0.1-0.2 g was found for western Thailand at 2 and 10 % POE in the next 50 years, respectively, defining this area as the most earthquake-prone region evaluated in Thailand. In a comparison between the ten selected provinces, the Kanchanaburi and Tak provinces had comparatively high seismic hazards, and effective mitigation plans for these areas should therefore be made. Although Bangkok was defined as lying within a low seismic hazard area in this PSHA, a further study of seismic wave amplification due to the soft soil beneath Bangkok is required.
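The chain from recurrence parameters (the a and b values) and an attenuation model to a POE estimate can be sketched for a single source. All coefficients below are toy placeholders, not the models adopted in the study:

```python
import numpy as np
from scipy.stats import norm

def hazard_curve(pga_levels, a=4.0, b=1.0, m_min=4.5, m_max=7.5, r_km=30.0, sigma=0.6):
    """Annual exceedance rates at one site from a single source zone.

    Truncated Gutenberg-Richter recurrence plus a toy lognormal GMPE,
    ln PGA[g] = -3.5 + 0.9*M - 1.1*ln(R); all coefficients are assumed.
    """
    edges = np.linspace(m_min, m_max, 101)
    mid = 0.5 * (edges[:-1] + edges[1:])
    cum = 10.0**(a - b * edges)                 # annual rate of M >= edge
    rate_m = cum[:-1] - cum[1:]                 # rate within each magnitude bin
    ln_med = -3.5 + 0.9 * mid - 1.1 * np.log(r_km)
    out = []
    for y in pga_levels:
        p_exc = norm.sf((np.log(y) - ln_med) / sigma)   # P(PGA > y | M, R)
        out.append(np.sum(rate_m * p_exc))
    return np.array(out)

# POE in 50 years under a Poisson occurrence model
lam = hazard_curve(np.array([0.1, 0.2, 0.4]))
print(1.0 - np.exp(-lam * 50.0))
```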
Lindley frailty model for a class of compound Poisson processes
NASA Astrophysics Data System (ADS)
Kadilar, Gamze Özel; Ata, Nihal
2013-10-01
The Lindley distribution gains importance in survival analysis through its similarity to the exponential distribution and its allowance for different shapes of the hazard function. Frailty models provide an alternative to the proportional hazards model in which misspecified or omitted covariates are described by an unobservable random variable. Although the distribution of the frailty is generally assumed to be continuous, it is appropriate in some circumstances to consider discrete frailty distributions. In this paper, frailty models with a discrete compound Poisson process for Lindley-distributed failure times are introduced. Survival functions are derived, and maximum likelihood estimation procedures for the parameters are studied. Then, the fit of the models to an earthquake data set from Turkey is examined.
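For reference, the Lindley density f(t) = θ²/(θ+1)·(1+t)·e^(−θt) gives closed-form survival and hazard functions; a small sketch of both:

```python
import numpy as np

def lindley_survival(t, theta):
    """S(t) = (1 + theta + theta*t) / (1 + theta) * exp(-theta*t)."""
    return (1.0 + theta + theta * t) / (1.0 + theta) * np.exp(-theta * t)

def lindley_hazard(t, theta):
    """h(t) = theta**2 * (1 + t) / (1 + theta + theta*t); increases from
    theta**2/(1+theta) at t=0 towards theta as t grows."""
    return theta**2 * (1.0 + t) / (1.0 + theta + theta * t)
```

The increasing hazard shape is what distinguishes the Lindley model from the constant-hazard exponential distribution mentioned above.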
Grove, Erik L; Hansen, Peter Riis; Olesen, Jonas B; Ahlehoff, Ole; Selmer, Christian; Lindhardsen, Jesper; Madsen, Jan Kyst; Køber, Lars; Torp-Pedersen, Christian; Gislason, Gunnar H
2011-01-01
Objective: To examine the effect of proton pump inhibitors on adverse cardiovascular events in aspirin treated patients with first time myocardial infarction. Design: Retrospective nationwide propensity score matched study based on administrative data. Setting: All hospitals in Denmark. Participants: All aspirin treated patients surviving 30 days after a first myocardial infarction from 1997 to 2006, with follow-up for one year. Patients treated with clopidogrel were excluded. Main outcome measures: The risk of the combined end point of cardiovascular death, myocardial infarction, or stroke associated with use of proton pump inhibitors was analysed using Kaplan-Meier analysis, Cox proportional hazard models, and propensity score matched Cox proportional hazard models. Results: 3366 of 19 925 (16.9%) aspirin treated patients experienced recurrent myocardial infarction, stroke, or cardiovascular death. The hazard ratio for the combined end point in patients receiving proton pump inhibitors based on the time dependent Cox proportional hazard model was 1.46 (1.33 to 1.61; P<0.001) and for the propensity score matched model based on 8318 patients it was 1.61 (1.45 to 1.79; P<0.001). A sensitivity analysis showed no increase in risk related to use of H2 receptor blockers (1.04, 0.79 to 1.38; P=0.78). Conclusion: In aspirin treated patients with first time myocardial infarction, treatment with proton pump inhibitors was associated with an increased risk of adverse cardiovascular events. PMID:21562004
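A generic sketch of propensity-score matched survival analysis of this kind, using scikit-learn for the propensity model and lifelines for the Cox fit (column names and the 1:1 greedy matching are illustrative, not the paper's exact algorithm):

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from lifelines import CoxPHFitter

def ps_matched_cox(df, covariates):
    """1:1 greedy propensity-score matching, then a Cox model on the matched set.

    df needs columns 'ppi' (0/1 exposure), 'time', 'event', plus the covariates;
    all names are illustrative. Assumes at least as many controls as treated.
    """
    lr = LogisticRegression(max_iter=1000).fit(df[covariates], df['ppi'])
    df = df.assign(ps=lr.predict_proba(df[covariates])[:, 1])
    treated = df[df['ppi'] == 1]
    controls = df[df['ppi'] == 0].copy()
    picks = []
    for _, row in treated.iterrows():
        j = (controls['ps'] - row['ps']).abs().idxmin()  # nearest remaining control
        picks.append(j)
        controls = controls.drop(index=j)                # match without replacement
    cohort = pd.concat([treated, df.loc[picks]])
    return CoxPHFitter().fit(cohort[['ppi', 'time', 'event']], 'time', 'event')
```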
Local soil effects on the Ground Motion Prediction model for the Racha region in Georgia
NASA Astrophysics Data System (ADS)
Jorjiashvili, N.; Shengelia, I.; Otinashvili, M.; Tvaliashvili, A.
2016-12-01
The Caucasus is a region of numerous natural hazards and ensuing disasters. Analysis of the losses due to past disasters indicates that those most catastrophic in the region have historically been due to strong earthquakes. Estimation of expected ground motion is fundamental to earthquake hazard assessment. The most commonly used parameter in attenuation relations is peak ground acceleration, which was selected for the analysis because it gives useful information for seismic hazard assessment. Site ground conditions, the main issue of this study, are among the most important factors influencing earthquake records: the same earthquake recorded at the same distance may cause different damage depending on ground conditions. Earthquake records were selected for the Racha region in Georgia, which has the highest seismic activity in the region. Next, new ground motion prediction (GMP) models were obtained based on new digital data recorded in the same area. After removing the site effect, the earthquake records for the rock site were obtained. Thus, two GMP models were derived: one for the ground surface and the other for the rock site. Finally, the two models were compared in order to analyze the influence of local soil conditions on the GMP model.
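Fitting an attenuation model of the common form ln PGA = c0 + c1·M + c2·ln R to a set of records is a linear least-squares problem, with the residual standard deviation as the aleatory term. A minimal sketch (the functional form is generic, not necessarily the paper's):

```python
import numpy as np

def fit_gmpe(mag, r_km, pga):
    """Least-squares fit of ln PGA = c0 + c1*M + c2*ln R.

    mag, r_km, pga : 1-D arrays of magnitudes, distances and observed PGA.
    Returns the coefficients and the standard deviation of the residuals.
    """
    A = np.column_stack([np.ones_like(mag), mag, np.log(r_km)])
    coef, *_ = np.linalg.lstsq(A, np.log(pga), rcond=None)
    resid = np.log(pga) - A @ coef
    return coef, resid.std(ddof=3)   # ddof = number of fitted coefficients
```

Fitting once to surface records and once to the site-effect-corrected rock records, as described above, yields the two models whose difference reflects the local soil conditions.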
Canister Storage Building (CSB) Hazard Analysis Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
POWERS, T.B.
2000-03-16
This report describes the methodology used in conducting the Canister Storage Building (CSB) hazard analysis to support the CSB final safety analysis report (FSAR) and documents the results. The hazard analysis process identified hazardous conditions and material-at-risk, determined causes for potential accidents, identified preventive and mitigative features, and qualitatively estimated the frequencies and consequences of specific occurrences. The hazard analysis was performed by a team of cognizant CSB operations and design personnel, safety analysts familiar with the CSB, and technical experts in specialty areas. The material included in this report documents the final state of a nearly two-year long process involving formal facilitated group sessions and independent hazard and accident analysis work. Attachment A provides two lists of hazard analysis team members and describes the background and experience of each: the first is a complete list of the team members involved over the two-year process; the second is the subset of members who reviewed and agreed to the final hazard analysis documentation. The hazard analysis process led to the selection of candidate accidents for further quantitative analysis. New information relative to the hazards, discovered during the accident analysis, was incorporated into the hazard analysis data in order to compile a complete profile of facility hazards. Through this process, the results of the hazard and accident analyses led directly to the identification of safety structures, systems, and components, technical safety requirements, and other controls required to protect the public, workers, and environment.
Applications of statistics to medical science, IV survival analysis.
Watanabe, Hiroshi
2012-01-01
The fundamental principles of survival analysis are reviewed. In particular, the Kaplan-Meier method and a proportional hazard model are discussed. This work is the last part of a series in which medical statistics are surveyed.
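The Kaplan-Meier estimator reviewed above multiplies, over the distinct event times, the fraction of at-risk subjects surviving each time. A minimal self-contained sketch:

```python
import numpy as np

def kaplan_meier(time, event):
    """Kaplan-Meier estimate of S(t) at each distinct event time.

    time : observed times; event : 1 if the event occurred, 0 if censored.
    """
    time, event = np.asarray(time, float), np.asarray(event, int)
    t_uniq = np.unique(time[event == 1])
    surv, s = [], 1.0
    for t in t_uniq:
        n_at_risk = np.sum(time >= t)           # still under observation just before t
        d = np.sum((time == t) & (event == 1))  # events exactly at t
        s *= 1.0 - d / n_at_risk
        surv.append(s)
    return t_uniq, np.array(surv)
```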
A hazards-model analysis of the covariates of infant and child mortality in Sri Lanka.
Trussell, J; Hammerslough, C
1983-02-01
The purpose of this paper is twofold: (a) to provide a complete self-contained exposition of estimating life tables with covariates through the use of hazards models, and (b) to illustrate this technique with a substantive analysis of child mortality in Sri Lanka, thereby demonstrating that World Fertility Survey data are a valuable source for the study of child mortality. We show that life tables with covariates can be easily estimated with standard computer packages designed for analysis of contingency tables. The substantive analysis confirms and supplements an earlier study of infant and child mortality in Sri Lanka by Meegama. Those factors found to be strongly associated with mortality are mother's and father's education, time period of birth, urban/rural/estate residence, ethnicity, sex, birth order, age of the mother at the birth, and type of toilet facility.
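The contingency-table trick referred to above is the piecewise-exponential equivalence: splitting each child's follow-up into age intervals and fitting a Poisson regression with a log-exposure offset reproduces a proportional-hazards life table. A sketch assuming illustrative column names (here only mother's education as a covariate):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

def piecewise_exponential_fit(df, cuts):
    """Proportional-hazards life table via the Poisson-regression equivalence.

    df needs columns 'time' (age at death/censoring, months), 'event' (1/0)
    and 'mother_edu' (illustrative covariate); cuts are interval boundaries.
    """
    rows = []
    for _, r in df.iterrows():
        for j, (lo, hi) in enumerate(zip(cuts[:-1], cuts[1:])):
            if r['time'] <= lo:
                break                                   # no exposure past this age
            expo = min(r['time'], hi) - lo              # months lived in interval j
            died = int(r['event'] == 1 and r['time'] <= hi)
            rows.append({'interval': j, 'expo': expo, 'died': died,
                         'mother_edu': r['mother_edu']})
    d = pd.DataFrame(rows)
    X = pd.get_dummies(d['interval'], prefix='age').astype(float)  # baseline per interval
    X['mother_edu'] = d['mother_edu']
    return sm.GLM(d['died'], X, family=sm.families.Poisson(),
                  offset=np.log(d['expo'])).fit()

# usage, e.g. with infant age intervals in months: cuts = [0, 1, 6, 12, 24, 60]
```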
Seismic hazard analysis for Jayapura city, Papua
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robiana, R., E-mail: robiana-geo104@yahoo.com; Cipta, A.
Jayapura city experienced a destructive earthquake on June 25, 1976, with a maximum intensity of VII on the MMI scale. Probabilistic methods are used to determine the earthquake hazard by considering all possible earthquakes that can occur in this region. Three types of earthquake source model are used: a subduction model, from the New Guinea Trench subduction zone (North Papuan Thrust); fault models, derived from the Yapen, Tarera-Aiduna, Wamena, Memberamo, Waipago, Jayapura, and Jayawijaya faults; and 7 background models to accommodate unknown earthquakes. Amplification factors estimated using geomorphological approaches are corrected with measurement data related to rock type and depth of soft soil. Site classes in Jayapura city can be grouped into classes B, C, D and E, with amplification between 0.5 and 6. Hazard maps are presented for a 10% probability of earthquake occurrence within a period of 500 years for the dominant periods of 0.0, 0.2, and 1.0 seconds.
Risk analysis for roadways subjected to multiple landslide-related hazards
NASA Astrophysics Data System (ADS)
Corominas, Jordi; Mavrouli, Olga
2014-05-01
Roadways through mountainous terrain often involve cuts and landslide areas whose stability is precarious and which require protection and stabilization works. To optimize the allocation of resources, government and technical offices are increasingly interested in both risk analysis and risk assessment. Risk analysis has to consider the hazard occurrence and the consequences. The consequences can be both direct and indirect: the former include the costs of repairing the roadway, the damage to vehicles and the potential fatalities, while the latter refer to the costs related to the diversion of vehicles, the excess distance travelled, the time differences, and tolls. The types of slope instability that may affect a roadway vary, and so do their effects. Most current approaches either consider a single hazardous phenomenon at a time or, if applied at a small (for example national) scale, do not take into account the local conditions at each section of the roadway. The objective of this work is the development of a simple and comprehensive methodology for assessing the risk due to multiple hazards along roadways, integrating different landslide types including rockfalls and debris flows, and considering as well the potential failure of retaining walls. To quantify risk, all hazards are expressed in a common term: their probability of occurrence. The methodology takes into consideration the specific local conditions along the roadway. For rockfalls and debris flows, a variety of methods for assessing the probability of occurrence exists. To assess the annual probability of failure of retaining walls, we use an indicator-based model that provides a hazard index. The model parameters consist of the design safety factor and further anchorage design and construction parameters. The probability of failure is evaluated as a function of the hazard index and then corrected (in terms of order of magnitude) according to in situ observations of increases in two dynamic factors: the service load and the wall deformation. The consequences are then calculated for each hazard type according to its characteristics (mechanism, magnitude, frequency). The difference between this method and other methodologies for landslide-related hazards lies in the hazard scenarios and consequence profiles that are investigated. The depth of the analysis makes it possible to account for local conditions concerning both the hazard and the consequences (the latter with respect to the very particular characteristics of the roadway such as traffic, number of lanes, velocity…). Furthermore, it provides an extensive list of quantitative risk descriptors, including both individual and collective ones. The methodology was automated using Microsoft Excel data sheets. The results can be used to support decision-making in the planning of protection measures. Gaps in knowledge and restrictions are discussed as well.
Modeling Information Accumulation in Psychological Tests Using Item Response Times
ERIC Educational Resources Information Center
Ranger, Jochen; Kuhn, Jörg-Tobias
2015-01-01
In this article, a latent trait model is proposed for the response times in psychological tests. The latent trait model is based on the linear transformation model and subsumes popular models from survival analysis, like the proportional hazards model and the proportional odds model. Core of the model is the assumption that an unspecified monotone…
Imamura, Fumiaki; Lichtenstein, Alice H; Dallal, Gerard E; Meigs, James B; Jacques, Paul F
2009-07-01
The ability to interpret epidemiologic observations is limited because of potential residual confounding by correlated dietary components. Dietary pattern analyses by factor analysis or partial least squares may overcome the limitation. To examine confounding by dietary pattern as well as standard risk factors and selected nutrients, the authors modeled the longitudinal association between alcohol consumption and 7-year risk of type 2 diabetes mellitus in 2,879 healthy adults enrolled in the Framingham Offspring Study (1991-2001) by Cox proportional hazard models. After adjustment for standard risk factors, consumers of > or =9.0 drinks/week had a significantly lower risk of type 2 diabetes mellitus compared with abstainers (hazard ratio = 0.47, 95% confidence interval (CI): 0.27, 0.81). Adjustment for selected nutrients had little effect on the hazard ratio, whereas adjustment for dietary pattern variables by factor analysis significantly shifted the hazard ratio away from null (hazard ratio = 0.33, 95% CI: 0.17, 0.64) by 40.0% (95% CI: 16.8, 57.0; P = 0.002). Dietary pattern variables by partial least squares showed similar results. Therefore, the observed inverse association, consistent with past studies, was confounded by dietary patterns, and this confounding was not captured by individual nutrient adjustment. The data suggest that alcohol intake, not dietary patterns associated with alcohol intake, is responsible for the observed inverse association with type 2 diabetes mellitus risk.
NASA Astrophysics Data System (ADS)
Moser, M.
2009-04-01
The catchment of the Gadeinerbach in the district of Lungau/Salzburg/Austria is prone to debris flows. Large debris flow events date back to the years 1934 and 1953. In the upper catchment, large mass movements represent debris sources. A field study shows the debris potential, and the catchment looks like a "sleeping torrential giant". To plan mitigation measures, a detailed risk management concept was developed, based on a risk assessment combining historical analysis, field study and numerical modeling on the alluvial fan. Human activities have partly altered the surface of the alluvial fan of the Gadeinerbach, but nevertheless some important hazard indicators could be found. With these hazard indicators and photo analysis from the large debris flow event of 1934, the character of the catchment could be established. With the help of these historical data sets (hazard indicators, sediment and debris amounts, etc.) it is possible to calibrate the numerical models and to gain useful knowledge of the pros and cons of their application. The results were used to simulate the design event and, furthermore, to derive mitigation measures. The most effective protection against debris flows proved to be a reduction of the high energy level to a lower level through controlled energy conversion, in combination with a debris/bedload deposition area, and this has been implemented. Expert opinion, the study of historical data and field work are, in addition to numerical simulation techniques, essential for work in the field of natural hazard management.
CyberShake: Running Seismic Hazard Workflows on Distributed HPC Resources
NASA Astrophysics Data System (ADS)
Callaghan, S.; Maechling, P. J.; Graves, R. W.; Gill, D.; Olsen, K. B.; Milner, K. R.; Yu, J.; Jordan, T. H.
2013-12-01
As part of its program of earthquake system science research, the Southern California Earthquake Center (SCEC) has developed a simulation platform, CyberShake, to perform physics-based probabilistic seismic hazard analysis (PSHA) using 3D deterministic wave propagation simulations. CyberShake performs PSHA by simulating a tensor-valued wavefield of Strain Green Tensors, and then using seismic reciprocity to calculate synthetic seismograms for about 415,000 events per site of interest. These seismograms are processed to compute ground motion intensity measures, which are then combined with probabilities from an earthquake rupture forecast to produce a site-specific hazard curve. Seismic hazard curves for hundreds of sites in a region can be used to calculate a seismic hazard map, representing the seismic hazard for a region. We present a recently completed PSHA study in which we calculated four CyberShake seismic hazard maps for the Southern California area to compare how CyberShake hazard results are affected by different SGT computational codes (AWP-ODC and AWP-RWG) and different community velocity models (Community Velocity Model - SCEC (CVM-S4) v11.11 and Community Velocity Model - Harvard (CVM-H) v11.9). We present our approach to running workflow applications on distributed HPC resources, including systems without support for remote job submission. We show how our approach extends the benefits of scientific workflows, such as job and data management, to large-scale applications on Track 1 and Leadership class open-science HPC resources. We used our distributed workflow approach to perform CyberShake Study 13.4 on two new NSF open-science HPC computing resources, Blue Waters and Stampede, executing over 470 million tasks to calculate physics-based hazard curves for 286 locations in the Southern California region. For each location, we calculated seismic hazard curves with two different community velocity models and two different SGT codes, resulting in over 1100 hazard curves. We will report on the performance of this CyberShake study, four times larger than previous studies. Additionally, we will examine the challenges we face applying these workflow techniques to additional open-science HPC systems and discuss whether our workflow solutions continue to provide value to our large-scale PSHA calculations.
Long term volcanic hazard analysis in the Canary Islands
NASA Astrophysics Data System (ADS)
Becerril, L.; Galindo, I.; Laín, L.; Llorente, M.; Mancebo, M. J.
2009-04-01
Historic volcanism in Spain is restricted to the Canary Islands, a volcanic archipelago formed by seven volcanic islands. Several historic eruptions have been registered in the last five hundred years. However, despite the large number of residents and tourists in the archipelago, only a few volcanic hazard studies have been carried out. These studies are mainly focused on the development of hazard maps for Lanzarote and Tenerife, especially for land use planning. The main handicap for such studies in the Canary Islands is the lack of well reported historical eruptions, but also the lack of geochronological, geochemical and structural data. In recent years, the use of Geographical Information Systems (GIS) and improvements in the modelling of volcanic processes have provided important tools for volcanic hazard assessment. Although these sophisticated programs are very useful, they must be fed with a large amount of data that, as in the case of the Canary Islands, is not always available. For this reason, the Spanish Geological Survey (IGME) is developing a complete geo-referenced database for long term volcanic analysis in the Canary Islands. The Canarian Volcanic Hazard Database (HADA) is based on a GIS that helps organize and manage volcanic information efficiently. HADA includes the following groups of information: (1) 1:25,000 scale geologic maps, (2) 1:25,000 topographic maps, (3) geochronologic data, (4) geochemical data, (5) structural information, (6) climatic data. Data must pass a quality control before they are included in the database, and new data are easily integrated. With the HADA database the IGME has started a systematic organization of the existing data. In the near future, the IGME will generate new information to be included in HADA, such as volcanological maps of the islands, structural information and geochronological data, to support long term volcanic hazard analysis. HADA will provide enough quality information to map volcanic hazards and to run more reliable models of volcanic hazards, but it also aims to become a sharing system, improving communication between researchers, reducing redundant work, and becoming the reference for geological research in the Canary Islands.
St. Louis area earthquake hazards mapping project; seismic and liquefaction hazard maps
Cramer, Chris H.; Bauer, Robert A.; Chung, Jae-won; Rogers, David; Pierce, Larry; Voigt, Vicki; Mitchell, Brad; Gaunt, David; Williams, Robert; Hoffman, David; Hempen, Gregory L.; Steckel, Phyllis; Boyd, Oliver; Watkins, Connor M.; Tucker, Kathleen; McCallister, Natasha
2016-01-01
We present probabilistic and deterministic seismic and liquefaction hazard maps for the densely populated St. Louis metropolitan area that account for the expected effects of surficial geology on earthquake ground shaking. Hazard calculations were based on a map grid of 0.005°, or about every 500 m, and are thus higher in resolution than any earlier studies. To estimate ground motions at the surface of the model (e.g., site amplification), we used a new detailed near‐surface shear‐wave velocity model in a 1D equivalent‐linear response analysis. When compared with the 2014 U.S. Geological Survey (USGS) National Seismic Hazard Model, which uses a uniform firm‐rock‐site condition, the new probabilistic seismic‐hazard estimates document much more variability. Hazard levels for upland sites (consisting of bedrock and weathered bedrock overlain by loess‐covered till and drift deposits) show up to twice the ground‐motion values for peak ground acceleration (PGA), and similar ground‐motion values for 1.0 s spectral acceleration (SA). Probabilistic ground‐motion levels for lowland alluvial floodplain sites (generally the 20–40‐m‐thick modern Mississippi and Missouri River floodplain deposits overlying bedrock) exhibit up to twice the ground‐motion levels for PGA, and up to three times the ground‐motion levels for 1.0 s SA. Liquefaction probability curves were developed from available standard penetration test data assuming typical lowland and upland water table levels. A simplified liquefaction hazard map was created from the 5%‐in‐50‐year probabilistic ground‐shaking model. The liquefaction hazard ranges from low (<1% of area expected to liquefy) in the uplands to severe (>60% of area expected to liquefy) in the lowlands. Because many transportation routes, power and gas transmission lines, and population centers exist in or on the highly susceptible lowland alluvium, these areas in the St. Louis region are at significant potential risk from seismically induced liquefaction and associated ground deformation.
Ryu, Hyeuk; Luco, Nicolas; Baker, Jack W.; Karaca, Erdem
2008-01-01
A methodology was recently proposed for the development of hazard-compatible building fragility models using parameters of capacity curves and damage state thresholds from HAZUS (Karaca and Luco, 2008). In that methodology, HAZUS curvilinear capacity curves were used to define nonlinear dynamic SDOF models that were subjected to nonlinear time history analysis instead of the capacity spectrum method. In this study, we construct a multilinear capacity curve with negative stiffness after an ultimate (capping) point for nonlinear time history analysis, as an alternative to the curvilinear model provided in HAZUS. As an illustration, we propose parameter values of the multilinear capacity curve for a moderate-code low-rise steel moment resisting frame building (labeled S1L in HAZUS). To determine the final parameter values, we perform nonlinear time history analyses of SDOF systems with various parameter values and investigate their effects on the resulting fragility functions through sensitivity analysis. The findings improve capacity curves, and thereby fragility and/or vulnerability models, for generic types of structures.
Hydrothermal Liquefaction Treatment Preliminary Hazard Analysis Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lowry, Peter P.; Wagner, Katie A.
A preliminary hazard assessment was completed during February 2015 to evaluate the conceptual design of the modular hydrothermal liquefaction treatment system. The hazard assessment was performed in two stages. An initial assessment utilizing Hazard Identification and Preliminary Hazards Analysis (PHA) techniques identified areas with significant or unique hazards (process safety-related hazards) that fall outside of the normal operating envelope of PNNL and warranted additional analysis. The subsequent assessment was based on a qualitative What-If analysis. This analysis was augmented, as necessary, by additional quantitative analysis for scenarios involving a release of hazardous material or energy with the potential for affecting the public.
A LiDAR based analysis of hydraulic hazard mapping
NASA Astrophysics Data System (ADS)
Cazorzi, F.; De Luca, A.; Checchinato, A.; Segna, F.; Dalla Fontana, G.
2012-04-01
Mapping hydraulic hazard is a delicate procedure, as it involves technical and socio-economic aspects. On the one hand, no dangerous areas should be excluded; on the other hand, it is important not to extend, beyond what is necessary, the surface subject to use limitations. The availability of a high resolution topographic survey makes it possible nowadays to face this task with innovative procedures, both in the planning (mapping) phase and in the map validation phase. The latter is the object of the present work. It should be stressed that the described procedure is proposed purely as a preliminary analysis based on topography only, and therefore does not intend in any way to replace more sophisticated analysis methods based on hydraulic modelling. The reference elevation model is a combination of the digital terrain model and the digital building model (DTM+DBM). The option of using the standard surface model (DSM) is not viable, as the DSM represents the vegetation canopy as a solid volume, with the consequence of unrealistically treating the vegetation as a geometric obstacle to water flow. In some cases the construction of the topographic model requires the identification and digitization of the principal breaklines, such as river banks, ditches and similar natural or artificial structures. The geometrical and topological procedure for the validation of the hydraulic hazard maps consists of two steps. In the first step the whole area is subdivided into fluvial segments, with length chosen as a reasonable trade-off between the need to keep the hydrographic unit as complete as possible and the need to separate sections of the river bed with significantly different morphology. Each of these segments is a single elongated polygon, whose shape can be quite complex, especially for meandering river sections, where the flow direction (i.e. the potential energy gradient associated with the talweg) is often inverted. In the second step the segments are analysed one by one. Each segment is split into many reaches, so that within any of them the slope of the piezometric line can be approximated to zero. As a consequence, the hydraulic profile (open channel flow) in every reach is assumed horizontal both downslope and across the cross-section. Each reach can be seen as a polygon, delimited laterally by the hazard mapping boundaries and longitudinally by two successive cross sections, usually orthogonal to the talweg line. Simulating the progressive increase of the river stage, with a horizontal piezometric line, allows the definition of the stage-area and stage-volume relationships. Such relationships are obtained exclusively from the geometric information provided by the high resolution elevation model. The maximum flooded area resulting from the simulation is finally compared to the potentially floodable area described by the hazard maps, to give a flooding index for every reach. Index values lower than 100% show that the mapped hazard area exceeds the maximum floodable area. Very low index values identify spots where there is a significant incongruity between the hazard map and the topography, and where a specific verification is probably needed. The procedure was successfully used for the validation of many hazard maps across Italy.
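Because the per-reach computation reduces to geometry, the stage-area and stage-volume relationships can be derived directly from the elevation model. The sketch below illustrates that step under the paper's horizontal-water-surface assumption; the array contents and the mapped hazard-area figure in the final comparison are invented stand-ins, not the authors' data or code.

```python
import numpy as np

def stage_area_volume(dem, stages, cell_area):
    """Stage-area and stage-volume curves for one reach, assuming a
    horizontal water surface within the reach (piezometric slope ~ 0).
    dem: DTM+DBM elevations clipped to the reach polygon (NaN outside)."""
    areas, volumes = [], []
    for h in stages:
        depth = h - dem
        wet = depth > 0                               # cells under water at stage h
        areas.append(np.count_nonzero(wet) * cell_area)
        volumes.append(np.nansum(np.where(wet, depth, 0.0)) * cell_area)
    return np.array(areas), np.array(volumes)

dem = np.random.rand(200, 200) * 3 + 100.0            # stand-in reach topography, m
areas, volumes = stage_area_volume(dem,
                                   stages=np.arange(100.0, 104.0, 0.1),
                                   cell_area=1.0)     # 1 m LiDAR grid
flooding_index = 100.0 * areas.max() / 45000.0        # vs. an invented mapped hazard area, m^2
```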
Condition Assessment Modeling for Distribution Systems Using Shared Frailty Analysis
Condition Assessment (CA) modeling is drawing increasing interest as a methodology for managing drinking water infrastructure. This paper develops a Cox Proportional Hazard (PH)/shared frailty model and applies it to the problem of investment in the repair and replacement of dri...
Risk Presentation Using the Three Dimensions of Likelihood, Severity, and Level of Control
NASA Technical Reports Server (NTRS)
Watson, Clifford
2010-01-01
Traditional hazard analysis techniques utilize a two-dimensional representation of the results determined by relative likelihood and severity of the residual risk. These matrices present a quick-look at the Likelihood (Y-axis) and Severity (X-axis) of the probable outcome of a hazardous event. A three-dimensional method, described herein, utilizes the traditional X and Y axes, while adding a new, third dimension, shown as the Z-axis, and referred to as the Level of Control. The elements of the Z-axis are modifications of the Hazard Elimination and Control steps (also known as the Hazard Reduction Precedence Sequence). These steps are: 1. Eliminate risk through design. 2. Substitute less risky materials for more hazardous materials. 3. Install safety devices. 4. Install caution and warning devices. 5. Develop administrative controls (to include special procedures and training.) 6. Provide protective clothing and equipment. When added to the two-dimensional models, the level of control adds a visual representation of the risk associated with the hazardous condition, creating a tall-pole for the least-well-controlled failure while establishing the relative likelihood and severity of all causes and effects for an identified hazard. Computer modeling of the analytical results, using spreadsheets and three-dimensional charting, gives a visual confirmation of the relationship between causes and their controls.
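The abstract does not give a formula for combining the three axes, but a toy scoring scheme shows how the Z-axis changes the picture: two hazards with identical likelihood and severity separate once the level of control is factored in. The 1-6 control ranks below follow the hazard reduction precedence sequence listed above; the multiplicative combination is an assumption for illustration only.

```python
def risk_index(likelihood: int, severity: int, control: int) -> int:
    """Toy 3D risk score: likelihood and severity ranked 1-5, level of
    control ranked 1 (eliminated by design) to 6 (PPE only)."""
    return likelihood * severity * control

# Same 2D cell (L=3, S=4), very different residual risk once the
# level of control is considered:
well_controlled = risk_index(3, 4, 1)     # 12: designed out
poorly_controlled = risk_index(3, 4, 6)   # 72: the "tall pole"
```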
Landslide and flood hazard assessment in urban areas of Levoča region (Eastern Slovakia)
NASA Astrophysics Data System (ADS)
Magulova, Barbora; Caporali, Enrica; Bednarik, Martin
2010-05-01
The case study presents the use of statistical methods and analysis tools for hazard assessment of "urbanization units", implemented in a Geographic Information Systems (GIS) environment. The Levoča region (Slovakia), with a total area of about 351 km2, is selected as the case study; the region is widely affected by landslides and floods. For small urbanized areas the problem is nowadays particularly significant from the socio-economic point of view, and it is growing, mainly because of climate change and more frequent extreme rainfall events. The geo-hazards are evaluated using a multivariate analysis. The landslide hazard assessment is based on the comparison and subsequent statistical elaboration of the territorial dependence among different input factors influencing the instability of the slopes. In particular, five factors influencing slope stability are evaluated, i.e. lithology, slope aspect, slope angle, hypsographic level and present land use. As a result a new landslide susceptibility map is compiled, and different zones of stable, dormant and unstable areas are defined. For the flood hazard map a detailed digital elevation model is created, and a composite index of flood hazard is derived from topography, land cover and pedology related data. To estimate flood discharge, time series of stream flow and precipitation measurements are used. The assessment results are prognostic maps of landslide hazard and flood hazard, which provide an optimal basis for urbanization planning.
INTERNAL HAZARDS ANALYSIS FOR LICENSE APPLICATION
DOE Office of Scientific and Technical Information (OSTI.GOV)
R.J. Garrett
2005-02-17
The purpose of this internal hazards analysis is to identify and document the internal hazards and potential initiating events associated with preclosure operations of the repository at Yucca Mountain. Internal hazards are those hazards presented by the operation of the facility and by its associated processes that can potentially lead to a radioactive release or cause a radiological hazard. In contrast to external hazards, internal hazards do not involve natural phenomena and external man-made hazards. This internal hazards analysis was performed in support of the preclosure safety analysis and the License Application for the Yucca Mountain Project. The methodology for this analysis provides a systematic means to identify internal hazards and potential initiating events that may result in a radiological hazard or radiological release during the repository preclosure period. These hazards are documented in tables of potential internal hazards and potential initiating events (Section 6.6) for input to the repository event sequence categorization process. The results of this analysis will undergo further screening and analysis based on the criteria that apply to the performance of event sequence analyses for the repository preclosure period. The evolving design of the repository will be re-evaluated periodically to ensure that internal hazards that have not been previously evaluated are identified.
NASA Astrophysics Data System (ADS)
Pradhan, Biswajeet
2010-05-01
This paper presents the results of the cross-validation of a multivariate logistic regression model using remote sensing data and GIS for landslide hazard analysis in the Penang, Cameron, and Selangor areas in Malaysia. Landslide locations in the study areas were identified by interpreting aerial photographs and satellite images, supported by field surveys. SPOT 5 and Landsat TM satellite imagery were used to map land cover and vegetation index, respectively. Maps of topography, soil type, lineaments and land cover were constructed from the spatial datasets. Ten factors which influence landslide occurrence, i.e., slope, aspect, curvature, distance from drainage, lithology, distance from lineaments, soil type, land cover, rainfall precipitation, and normalized difference vegetation index (NDVI), were extracted from the spatial database, and the logistic regression coefficient of each factor was computed. The landslide hazard was then analysed using the multivariate logistic regression coefficients derived not only from the data for the respective area but also using the logistic regression coefficients calculated from each of the other two areas (nine hazard maps in all) as a cross-validation of the model. For verification, the results of the analyses were compared with the field-verified landslide locations. Among the three cases in which the logistic regression coefficients were applied to the same study area, Selangor based on the Selangor coefficients showed the highest accuracy (94%), whereas Penang based on the Penang coefficients showed the lowest (86%). Similarly, among the six cases of cross-application of the logistic regression coefficients to the other two areas, Selangor based on the Cameron coefficients showed the highest prediction accuracy (90%), whereas Penang based on the Selangor coefficients showed the lowest (79%). Qualitatively, the cross-application model yields reasonable results which can be used for preliminary landslide hazard mapping.
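A compact version of this workflow, fitting logistic regression coefficients in one area and then scoring another area with them, can be expressed as follows. The data here are synthetic stand-ins for the ten factor grids; only the cross-application logic mirrors the study.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic stand-ins: one row per map cell, ten columns for the factors
# (slope, aspect, curvature, distance from drainage, lithology, ...).
X_penang = rng.normal(size=(1000, 10))
y_penang = (X_penang[:, 0] + rng.normal(size=1000) > 1).astype(int)   # mapped landslides
X_cameron = rng.normal(size=(500, 10))                                # second study area
y_cameron = (X_cameron[:, 0] + rng.normal(size=500) > 1).astype(int)

# Fit coefficients in one area, then cross-apply them to the other:
model = LogisticRegression(max_iter=1000).fit(X_penang, y_penang)
p = model.predict_proba(X_cameron)[:, 1]             # hazard map values, P(landslide)
accuracy = ((p > 0.5) == y_cameron).mean()           # verification against known locations
```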
NASA Astrophysics Data System (ADS)
Tonini, Roberto; Sandri, Laura; Costa, Antonio; Selva, Jacopo
2014-05-01
Campi Flegrei (CF) is a large volcanic field located west of the Gulf of Naples, characterized by a wide and almost circular caldera which is partially submerged beneath the Gulf of Pozzuoli. It is known that magma-water interaction is a key element in determining the character of submarine eruptions and their impact on the surrounding areas, but this phenomenon is still not well understood and is rarely considered in hazard assessment. The aim of the present work is to present a preliminary study of the effect of the sea on the tephra fall hazard from CF in the municipality of Naples, by introducing a variability in the probability of tephra production according to the eruptive scale (defined on the basis of the erupted volume) and the depth of the opening submerged vents. Four different Probabilistic Volcanic Hazard Assessment (PVHA) models have been defined through the application of the BET_VH model at CF, accounting for different modeling procedures and assumptions for the submerged part of the caldera. In particular, we consider: 1) the effect of the sea as null, i.e. as if the water were not present; 2) the effect of the sea as a cap that totally blocks the explosivity of eruptions and consequently the tephra production; 3) an ensemble model between the two models described in points 1) and 2); 4) a variable probability of tephra production depending on the depth of the submerged vent. The PVHA models are then input to pyPHaz, a tool developed and designed at INGV to visualize, analyze and merge into ensemble models PVHA results and, potentially, any other kind of probabilistic hazard assessment, both natural and anthropogenic, in order to evaluate the importance of considering the variability between subaerial and submerged vents in the tephra fallout hazard from CF in Naples. The analysis is preliminary and does not pretend to be exhaustive, but on one hand it represents a starting point for future work; on the other hand, it is a good case study to show the potential of the pyPHaz tool which, thanks to a dedicated Graphical User Interface (GUI), allows users to interactively manage and visualize the results of probabilistic hazard assessments (hazard curves together with probability and hazard maps for different levels of uncertainty), and to compare or merge different hazard models producing ensemble models. This work has been developed in the framework of two Italian projects, "ByMuR (Bayesian Multi-Risk Assessment: a case study for natural risks in the city of Naples)" funded by the Italian Ministry of Education, Universities and Research (MIUR), and "V1: Probabilistic Volcanic Hazard Assessments" funded by the Italian Department of Civil Protection (DPC).
Southwestern Oregon's Biscuit Fire: An Analysis of Forest Resources, Fire Severity, and Fire Hazard
David L. Azuma; Glenn A. Christensen
2005-01-01
This study compares pre-fire field inventory data (collected from 1993 to 1997) in relation to post-fire mapped fire severity classes and the Fire and Fuels Extension of the Forest Vegetation Simulator growth and yield model measures of fire hazard for the portion of the Siskiyou National Forest in the 2002 Biscuit fire perimeter of southwestern Oregon. Post-fire...
Assessing the risk posed by natural hazards to infrastructures
NASA Astrophysics Data System (ADS)
Eidsvig, Unni Marie K.; Kristensen, Krister; Vidar Vangelsten, Bjørn
2017-03-01
This paper proposes a model for assessing the risk posed by natural hazards to infrastructures, with a focus on the indirect losses and loss of stability for the population relying on the infrastructure. The model prescribes a three-level analysis with increasing level of detail, moving from qualitative to quantitative analysis. The focus is on a methodology for semi-quantitative analyses to be performed at the second level. The purpose of this type of analysis is to perform a screening of the scenarios of natural hazards threatening the infrastructures, identifying the most critical scenarios and investigating the need for further analyses (third level). The proposed semi-quantitative methodology considers the frequency of the natural hazard, different aspects of vulnerability, including the physical vulnerability of the infrastructure itself, and the societal dependency on the infrastructure. An indicator-based approach is applied, ranking the indicators on a relative scale according to pre-defined ranking criteria. The proposed indicators, which characterise conditions that influence the probability of an infrastructure malfunctioning caused by a natural event, are defined as (1) robustness and buffer capacity, (2) level of protection, (3) quality/level of maintenance and renewal, (4) adaptability and quality of operational procedures and (5) transparency/complexity/degree of coupling. Further indicators describe conditions influencing the socio-economic consequences of the infrastructure malfunctioning, such as (1) redundancy and/or substitution, (2) cascading effects and dependencies, (3) preparedness and (4) early warning, emergency response and measures. The aggregated risk estimate is a combination of the semi-quantitative vulnerability indicators, as well as quantitative estimates of the frequency of the natural hazard, the potential duration of the infrastructure malfunctioning (e.g. depending on the required restoration effort) and the number of users of the infrastructure. Case studies for two Norwegian municipalities are presented for demonstration purposes, where risk posed by adverse weather and natural hazards to primary road, water supply and power networks is assessed. The application examples show that the proposed model provides a useful tool for screening of potential undesirable events, contributing to a targeted reduction of the risk.
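The paper defines the aggregation qualitatively; a minimal numerical reading of the level-2 screening might look like the sketch below, where the indicator ranks, weights and consequence proxy are all invented for illustration rather than taken from the methodology.

```python
import numpy as np

def semi_quantitative_risk(freq, vuln_ranks, vuln_weights, duration, users):
    """Screening-level risk estimate: hazard frequency x weighted
    vulnerability rank x a societal-consequence proxy."""
    vulnerability = np.average(vuln_ranks, weights=vuln_weights)  # relative 1-5 scale
    consequence = duration * users                                # person-hours of lost service
    return freq * vulnerability * consequence

risk = semi_quantitative_risk(freq=0.05,                   # events/year (invented)
                              vuln_ranks=[2, 4, 3, 2, 5],  # the five vulnerability indicators
                              vuln_weights=[1, 1, 1, 1, 1],
                              duration=48.0,               # restoration time, hours
                              users=12_000)                # users of the infrastructure
```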
NASA Astrophysics Data System (ADS)
Komjathy, A.; Yang, Y. M.; Meng, X.; Verkhoglyadova, O. P.; Mannucci, A. J.; Langley, R. B.
2015-12-01
Natural hazards, including earthquakes, volcanic eruptions, and tsunamis, have been significant threats to humans throughout recorded history. The Global Positioning System satellites have become primary sensors to measure signatures associated with such natural hazards. These signatures typically include GPS-derived seismic deformation measurements, co-seismic vertical displacements, and real-time GPS-derived ocean buoy positioning estimates. Another way to use GPS observables is to compute the ionospheric total electron content (TEC) to measure and monitor post-seismic ionospheric disturbances caused by earthquakes, volcanic eruptions, and tsunamis. Research at the University of New Brunswick (UNB) laid the foundations to model the three-dimensional ionosphere at NASA's Jet Propulsion Laboratory by ingesting ground- and space-based GPS measurements into the state-of-the-art Global Assimilative Ionosphere Modeling (GAIM) software. As an outcome of the UNB and NASA research, new and innovative GPS applications have been invented, including the use of ionospheric measurements to detect tiny fluctuations in the GPS signals between the spacecraft and GPS receivers caused by natural hazards occurring on or near the Earth's surface. We will show examples of early detection of natural-hazard-generated ionospheric signatures using ground-based and space-borne GPS receivers. We will also discuss recent results from the U.S. Real-time Earthquake Analysis for Disaster Mitigation Network (READI) exercises utilizing our algorithms. By studying the propagation properties of ionospheric perturbations generated by natural hazards, along with applying sophisticated first-principles physics-based modeling, we are on track to develop new technologies that can potentially save human lives and minimize property damage. It is also expected that ionospheric monitoring of TEC perturbations might become an integral part of existing natural hazard warning systems.
Hazard Analysis Guidelines for Transit Projects
DOT National Transportation Integrated Search
2000-01-01
These hazard analysis guidelines discuss safety critical systems and subsystems, types of hazard analyses, when hazard analyses should be performed, and the hazard analysis philosophy. These guidelines are published by FTA to assist the transit indus...
Criminal Justice Outcomes after Engagement in Outpatient Substance Abuse Treatment
Garnick, Deborah W.; Horgan, Constance M.; Acevedo, Andrea; Lee, Margaret T.; Panas, Lee; Ritter, Grant A.; Dunigan, Robert; Bidorini, Alfred; Campbell, Kevin; Haberlin, Karin; Huber, Alice; Lambert-Wacey, Dawn; Leeper, Tracy; Reynolds, Mark; Wright, David
2013-01-01
The relationship between engagement in outpatient treatment facilities in the public sector and subsequent arrest is examined for clients in Connecticut, New York, Oklahoma and Washington. Engagement is defined as receiving another treatment service within 14 days of beginning a new episode of specialty treatment and at least two additional services within the next 30 days. Data are from 2008, and survival analysis modeling is used. Survival analyses express the effects of model covariates in terms of "hazard ratios," which reflect the change in the likelihood of the outcome attributable to the covariate. Engaged clients had a significantly lower hazard of any arrest than non-engaged clients in all four states. In NY and OK, engaged clients also had a lower hazard of arrest for substance-related crimes. In CT, NY, and OK engaged clients had a lower hazard of arrest for violent crime. Clients in facilities with higher engagement rates had a lower hazard of any arrest in NY and OK. Engaging clients in outpatient treatment is a promising approach to decreasing their subsequent criminal justice involvement. PMID:24238717
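For readers unfamiliar with hazard ratios, a small Cox proportional-hazards fit shows the mechanics. This toy example uses invented data and the lifelines library rather than the authors' software; it estimates the hazard ratio for an engagement covariate.

```python
import pandas as pd
from lifelines import CoxPHFitter

# One row per client: weeks to first arrest (or censoring at 52 weeks),
# arrest indicator, and whether the client met the engagement criterion.
df = pd.DataFrame({
    "weeks":    [12, 40, 52, 8, 52, 30, 52, 20, 45, 52],
    "arrested": [1,  1,  0,  1, 0,  1,  0,  1,  1,  0],
    "engaged":  [0,  1,  1,  0, 1,  0,  1,  0,  1,  1],
})
cph = CoxPHFitter().fit(df, duration_col="weeks", event_col="arrested")
print(cph.hazard_ratios_)   # a value < 1 for "engaged" means a lower hazard of arrest
```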
Modeling the Risk of Fire/Explosion Due to Oxidizer/Fuel Leaks in the Ares I Interstage
NASA Technical Reports Server (NTRS)
Ring, Robert W.; Stott, James E.; Hales, Christy
2008-01-01
A significant flight hazard associated with liquid propellants, such as those used in the upper stage of NASA's new Ares I launch vehicle, is the possibility of leakage of hazardous fluids resulting in a catastrophic fire/explosion. The enclosed and vented interstage of the Ares I contains numerous oxidizer and fuel supply lines as well as ignition sources. The potential for fire/explosion due to leaks during ascent depends on the relative concentrations of hazardous and inert fluids within the interstage, along with other variables such as pressure, temperature, leak rates, and fluid outgassing rates. This analysis improves on previous NASA Probabilistic Risk Assessment (PRA) estimates of the probability of deflagration, in which many of the variables pertinent to the problem were not explicitly modeled as a function of time. This paper presents the modeling methodology developed to analyze these risks.
14 CFR 437.29 - Hazard analysis.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Hazard analysis. 437.29 Section 437.29... Documentation § 437.29 Hazard analysis. (a) An applicant must perform a hazard analysis that complies with § 437.55(a). (b) An applicant must provide to the FAA all the results of each step of the hazard analysis...
14 CFR 437.29 - Hazard analysis.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 14 Aeronautics and Space 4 2011-01-01 2011-01-01 false Hazard analysis. 437.29 Section 437.29... Documentation § 437.29 Hazard analysis. (a) An applicant must perform a hazard analysis that complies with § 437.55(a). (b) An applicant must provide to the FAA all the results of each step of the hazard analysis...
Specification of the ISS Plasma Environment Variability
NASA Technical Reports Server (NTRS)
Minow, Joseph I.; Neergaard, Linda F.; Bui, Them H.; Mikatarian, Ronald R.; Barsamian, H.; Koontz, Steven L.
2002-01-01
Quantifying the spacecraft charging risks and corresponding hazards for the International Space Station (ISS) requires a plasma environment specification describing the natural variability of ionospheric temperature (Te) and density (Ne). Empirical ionospheric specification and forecast models such as the International Reference Ionosphere (IRI) model typically provide only estimates of long term (seasonal) mean Te and Ne values for the low Earth orbit environment. Knowledge of the Te and Ne variability, as well as the likelihood of extreme deviations from the mean values, is required to estimate both the magnitude and frequency of occurrence of potentially hazardous spacecraft charging environments for a given ISS construction stage and flight configuration. This paper describes the statistical analysis of historical ionospheric low Earth orbit plasma measurements used to estimate Ne and Te variability in the ISS flight environment. The statistical variability analysis of Ne and Te enables calculation of the expected frequency of occurrence of any particular values of Ne and Te, especially those that correspond to potentially hazardous spacecraft charging environments. The database used in the original analysis included measurements from the AE-C, AE-D, and DE-2 satellites. Recent work on the database has added additional satellites as well as ground based incoherent scatter radar observations. Deviations of the data values from the IRI estimated Ne and Te parameters for each data point provide a statistical basis for modeling the deviations of the plasma environment from the IRI model output. This technique, while developed specifically for the Space Station analysis, can also be generalized to provide ionospheric plasma environment risk specification models for low Earth orbit over an altitude range of 200 km through approximately 1000 km.
NASA Astrophysics Data System (ADS)
Castellanos Abella, Enrique A.; Van Westen, Cees J.
Geomorphological information can be combined with decision-support tools to assess landslide hazard and risk. A heuristic model was applied to a rural municipality in eastern Cuba. The study is based on a terrain mapping units (TMU) map, generated at 1:50,000 scale by interpretation of aerial photos, satellite images and field data. Information describing 603 terrain units was collected in a database. Landslide areas were mapped in detail to classify the different failure types and parts. Three major landslide regions are recognized in the study area: coastal hills with rockfalls, shallow debris flows and old rotational rockslides; denudational slopes in limestone, with very large deep-seated rockslides related to tectonic activity; and the Sierra de Caujerí scarp, with large rockslides. The Caujerí scarp presents the highest hazard, with recent landslides and various signs of active processes. The different landforms and the causative factors for landslides were analyzed and used to develop the heuristic model. The model is based on weights assigned by expert judgment and organized in a number of components such as slope angle, internal relief, slope shape, geological formation, active faults, distance to drainage, distance to springs, geomorphological subunits and existing landslide zones. From these variables a hierarchical heuristic model was applied in which three levels of weights were designed for classes, variables, and criteria. The model combines all weights into a single hazard value for each pixel of the landslide hazard map. The hazard map was then classified at two scales, one with three classes for disaster managers and one with 10 detailed hazard classes for technical staff. The range of weight values and the number of existing landslides are registered for each class. The increasing landslide density with higher hazard classes indicates that the output map is reliable. The landslide hazard map was used in combination with existing information on buildings and infrastructure to prepare a qualitative risk map. The complete lack of historical landslide information and geotechnical data precludes the development of quantitative deterministic or probabilistic models.
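The three-level weighting scheme lends itself to a simple per-pixel combination. The sketch below assumes one weight per factor class (looked up per pixel), one per variable, and one per criterion, multiplied down the hierarchy and summed; all numbers are invented, and the real model's expert-assigned weights would replace them.

```python
import numpy as np

rng = np.random.default_rng(0)

# Per-pixel class weights for 9 factors (slope angle, internal relief,
# slope shape, geology, active faults, distance to drainage, distance
# to springs, geomorphological subunit, existing landslide zones):
class_w = rng.integers(0, 4, size=(500, 500, 9)).astype(float)

variable_w = np.array([3, 2, 1, 3, 2, 1, 1, 2, 3], dtype=float)  # per-variable weights
variable_w /= variable_w.sum()
criterion_w = 1.0                         # single criterion in this toy setup

hazard = criterion_w * (class_w * variable_w).sum(axis=-1)       # one value per pixel
# Reclassify for the two audiences, e.g. three classes for disaster managers:
classes_3 = np.digitize(hazard, np.quantile(hazard, [1 / 3, 2 / 3]))
```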
Correlating regional natural hazards for global reinsurance risk assessment
NASA Astrophysics Data System (ADS)
Steptoe, Hamish; Maynard, Trevor; Economou, Theo; Fox, Helen; Wallace, Emily; Maisey, Paul
2016-04-01
Concurrent natural hazards represent an uncertainty in assessing exposure for the insurance industry. The recently implemented Solvency II Directive requires EU insurance companies to fully understand and justify their capital reserving and portfolio decisions. Lloyd's, the London insurance and reinsurance market, commissioned the Met Office to investigate the dependencies between different global extreme weather events (known to the industry as perils), and the mechanisms for these dependencies, with the aim of helping them assess their compound risk to the exposure of multiple simultaneous hazards. In this work, we base the analysis of hazard-to-hazard dependency on the interaction of different modes of global and regional climate variability. Lloyd's defined 16 key hazard regions, including Australian wildfires, flooding in China and EU windstorms, and we investigate the impact of 10 key climate modes on these areas. We develop a statistical model that facilitates rapid risk assessment whilst allowing for both temporal auto-correlation and, crucially, interdependencies between drivers. The simulator itself is built conditionally using autoregressive regression models for each driver conditional on the others. Whilst the baseline assumption within the (re)insurance industry is that different natural hazards are independent of each other, the assumption of independence of meteorological risks requires greater justification. Although our results suggest that most of the 120 hazard-hazard connections considered are likely to be independent of each other, 13 have significant dependence arising from one or more global modes of climate variability. This allows us to create a matrix of linkages describing the hazard dependency structure that Lloyd's can use to inform their understanding of risk.
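A toy version of such a conditional simulator, two AR(1) mode indices with one regressed on the other's current state, is sketched below. The coefficients are invented, and the real model conditions each driver on all the others rather than just one.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_drivers(n_years, phi=(0.5, 0.3), coupling=0.4):
    """Two coupled climate-mode indices: each AR(1) in time, with the
    second also regressed on the first's current value."""
    x = np.zeros((n_years, 2))
    for t in range(1, n_years):
        x[t, 0] = phi[0] * x[t - 1, 0] + rng.normal()
        x[t, 1] = phi[1] * x[t - 1, 1] + coupling * x[t, 0] + rng.normal()
    return x

drivers = simulate_drivers(10_000)
# Regional peril occurrences can then be simulated conditional on these
# shared drivers, inducing the hazard-hazard dependence described above.
```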
Nowcast model for hazardous material spill prevention and response, San Francisco Bay, California
Cheng, Ralph T.; Wilmot, Wayne L.; Galt, Jerry A.
1997-01-01
The National Oceanic and Atmospheric Administration (NOAA) installed the Physical Oceanographic Real-time System (PORTS) in San Francisco Bay, California, to provide real-time observations of tides, tidal currents, and meteorological conditions to, among other purposes, guide hazardous material spill prevention and response. The emerging technologies used in PORTS for real-time data collection, integrated with nowcast modeling techniques and with dissemination of the real-time data and nowcasting results through the Internet, form a nowcast modeling system. Users can download tides and tidal current distributions in San Francisco Bay for their specific applications and/or for further analysis.
Probabilistic Approaches for Multi-Hazard Risk Assessment of Structures and Systems
NASA Astrophysics Data System (ADS)
Kwag, Shinyoung
Performance assessment of structures, systems, and components for multi-hazard scenarios has received significant attention in recent years. However, the concept of multi-hazard analysis is quite broad in nature, and the focus of the existing literature varies across a wide range of problems. In some cases, such studies focus on hazards that either occur simultaneously or are closely correlated with each other, for example, seismically induced flooding or seismically induced fires. In other cases, multi-hazard studies relate to hazards that are not dependent or correlated but have a strong likelihood of occurrence at different times during the lifetime of a structure. Current approaches for risk assessment need enhancement to account for multi-hazard risks: they must be able to account for uncertainty propagation in a systems-level analysis, consider correlation among events or failure modes, and allow integration of newly available information from continually evolving simulation models, experimental observations, and field measurements. This dissertation presents a detailed study that proposes enhancements by incorporating Bayesian networks and Bayesian updating within a performance-based probabilistic framework. The performance-based framework allows propagation of risk, as well as uncertainties in the risk estimates, within a systems analysis. Unlike conventional risk assessment techniques such as fault-tree analysis, a Bayesian network can account for statistical dependencies and correlations among events/hazards. The proposed approach is extended to develop a risk-informed framework for quantitative validation and verification of high-fidelity system-level simulation tools. Validation of such simulations can be quite formidable within the context of a multi-hazard risk assessment in nuclear power plants. The efficiency of this approach lies in the identification of critical events, components, and systems that contribute to the overall risk. Validation of any event or component on the critical path is relatively more important in a risk-informed environment. The significance of multi-hazard risk is also illustrated for the uncorrelated hazards of earthquakes and high winds, which may result in competing design objectives. It is also shown that the number of computationally intensive nonlinear simulations needed in performance-based risk assessment for external hazards can be significantly reduced by using the power of Bayesian updating in conjunction with the concept of equivalent limit-states.
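The contrast with fault-tree independence assumptions can be made concrete with a two-event common-cause example. The probabilities below are invented, and a real application would use a Bayesian-network library and many more nodes; the point is only that marginalizing over a shared hazard gives a different system failure probability than multiplying independent marginals.

```python
# Failure events A and B share a common-cause hazard H.
p_H = 0.10
p_A = {True: 0.30, False: 0.02}   # P(A fails | H)
p_B = {True: 0.25, False: 0.03}   # P(B fails | H)

def p_or(pa, pb):
    """P(A or B) for events that are independent given the same H."""
    return pa + pb - pa * pb

# Bayesian-network answer: marginalize the union over H.
p_sys = sum(p_or(p_A[h], p_B[h]) * (p_H if h else 1 - p_H)
            for h in (True, False))

# Fault-tree shortcut: treat A and B as fully independent marginals.
p_A_m = sum(p_A[h] * (p_H if h else 1 - p_H) for h in (True, False))
p_B_m = sum(p_B[h] * (p_H if h else 1 - p_H) for h in (True, False))
p_ft = p_or(p_A_m, p_B_m)   # differs from p_sys once H correlates the events
```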
Lunar mission safety and rescue: Hazards analysis and safety requirements
NASA Technical Reports Server (NTRS)
1971-01-01
The results of the hazards analysis are presented; the analysis was concerned only with hazards to personnel, not with loss of equipment or property. Hazards characterization includes the definition of a hazard, the hazard levels, and the hazard groups. The analysis methodology is described in detail. The methodology was used to prepare the top level functional flow diagrams, to perform the first level hazards assessment, and to develop a list of conditions and situations requiring individual hazard studies. The 39 individual hazard study results are presented in full.
NASA Astrophysics Data System (ADS)
England, John F.; Julien, Pierre Y.; Velleux, Mark L.
2014-03-01
Traditionally, deterministic flood procedures such as the Probable Maximum Flood have been used for critical infrastructure design. Some Federal agencies now use hydrologic risk analysis to assess potential impacts of extreme events on existing structures such as large dams. Extreme flood hazard estimates and distributions are needed for these efforts, with very low annual exceedance probabilities (≤10^-4; return periods >10,000 years). An integrated data-modeling hydrologic hazard framework for physically-based extreme flood hazard estimation is presented. Key elements include: (1) a physically-based runoff model (TREX) coupled with a stochastic storm transposition technique; (2) hydrometeorological information from radar and an extreme storm catalog; and (3) streamflow and paleoflood data for independently testing and refining runoff model predictions at internal locations. This new approach requires full integration of collaborative work in hydrometeorology, flood hydrology and paleoflood hydrology. An application on the 12,000 km2 Arkansas River watershed in Colorado demonstrates that the size and location of extreme storms are critical factors in the analysis of basin-average rainfall frequency and flood peak distributions. Runoff model results are substantially improved by the availability and use of paleoflood nonexceedance data spanning the past 1000 years at critical watershed locations.
NASA Astrophysics Data System (ADS)
Anderson, E. R.; Griffin, R.; Markert, K. N.
2017-12-01
Scientists, practitioners, policymakers, and citizen groups, share a role in ensuring "that all sectors have access to, understand and can use scientific information for better informed decision-making" (Sendai Framework 2015-2030). When it comes to understanding hazards and exposure, inventories on disaster events are often limited. Thus, there are many opportunities for citizen scientists to engage in improving the collective understanding—and ultimately reduction—of disaster risk. Landslides are very difficult to forecast on spatial and temporal scales meaningful for early warning and evacuation. Heuristic hazard mapping methods are very common in regional hazard zonation and rely on expert knowledge of previous events and local conditions, but they often lack a temporal component. As new data analysis packages are becoming more open and accessible, probabilistic approaches that consider high resolution spatial and temporal dimensions are becoming more common, but this is only possible when rich inventories of landslide events exist. The work presented offers a proof of concept on incorporating crowd-sourced data to improve landslide hazard model performance. Starting with a national inventory of 90 catalogued landslides in El Salvador for a study period of 1998 to 2011, we simulate the addition of over 600 additional crowd-sourced landslide events that would have been identified through human interpretation of high resolution imagery in the Google Earth time slider feature. There is a noticeable improvement in performance statistics between static heuristic hazard models and probabilistic models that incorporate the events identified by the "crowd." Such a dynamic incorporation of crowd-sourced data on hazard events is not so far-fetched. Given the engagement of "local observers" in El Salvador who augment in situ hydro-meteorological measurements, the growing access to Earth observation data to the lay person, and immense interest behind connecting citizen scientists to remote sensing data through hackathons such as the NASA Space Apps Challenges, we envision a much more dynamic, collective understanding of landslide hazards. Here we present a better scenario of what we could have known had data from the crowd been incorporated into probabilistic hazard models on a regular basis.
Pate, M L; Dai, X
2014-04-01
The purpose of this study was to assess how selected variables affect the confined-space hazard perceptions of farmers in Utah. A confined space was defined as "any space found in an agricultural workplace that was not designed or intended as a regular workstation, has limited or restricted means of entry or exit, and contains potential physical and toxic hazards to workers who intentionally or unintentionally enter the space" (proposed by NCERA-197, 18 May 2011, draft copy). A total of 303 out of 327 farm owner/operators provided complete surveys that were used in the analysis. The state of Utah was grouped into five regions in this study: central, east, northeast, northwest, and southwest. Grain and dairy production comprised 48.7% of the operations responding to the survey. The general linear modeling (GLM) procedure in SAS 9.3 was used to select the models of hazard perception scores for the five studied regions. Predictors of interest included response type, production type, safety planning, and injury concerns. Animal production operations had the highest average number of confined spaces (μ = 4, SD = 2.7). Regionally, the northwest region had the highest average number of confined spaces (μ = 4, SD = 2.5). The variables contributing most to confined-space hazard perceptions were injury and death concerns while working alone in confined spaces. Three factors were generated using principal factor analysis (PFA) with orthogonal varimax rotation. Results suggested that the factors affect hazard perceptions differently by region. We conclude that outreach and educational efforts to change safety behaviors regarding confined-space hazards should be strategically targeted to each region based on the predicting factors. The results can assist agricultural safety and health professionals in targeting agricultural producers' social networks to address human factors such as worker attitudes and/or lack of skills or knowledge that affect hazard perceptions of confined spaces in agriculture.
21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 21 Food and Drugs 2 2012-04-01 2012-04-01 false Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan. 123.6 Section 123.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF... processor shall have and implement a written HACCP plan whenever a hazard analysis reveals one or more food...
21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 21 Food and Drugs 2 2010-04-01 2010-04-01 false Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan. 123.6 Section 123.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF... processor shall have and implement a written HACCP plan whenever a hazard analysis reveals one or more food...
NASA Astrophysics Data System (ADS)
Tonini, R.; Lorito, S.; Orefice, S.; Graziani, L.; Brizuela, B.; Smedile, A.; Volpe, M.; Romano, F.; De Martini, P. M.; Maramai, A.; Selva, J.; Piatanesi, A.; Pantosti, D.
2016-12-01
Site-specific probabilistic tsunami hazard analyses demand very high computational effort, which is often reduced by introducing approximations on tsunami sources and/or tsunami modeling. On one hand, the large variability of source parameters implies the definition of a huge number of potential tsunami scenarios, whose omission could easily lead to important bias in the analysis. On the other hand, detailed inundation maps computed by tsunami numerical simulations require very long running times. When tsunami effects are calculated at regional scale, a common practice is to propagate tsunami waves in deep water (up to 50-100 m depth), neglecting non-linear effects and using coarse bathymetric meshes. Maximum wave heights at the coast are then empirically extrapolated, saving a significant amount of computational time. Moving to local scale, however, such assumptions drop out, and tsunami modeling requires much greater computational resources. In this work, we perform a local Seismic Probabilistic Tsunami Hazard Analysis (SPTHA) for the 50 km long coastal segment between Augusta and Siracusa, a touristic and commercial area located along the south-eastern Sicily coast, Italy. The procedure consists in using the outcomes of a regional SPTHA as input for a two-step filtering method to select and substantially reduce the number of scenarios contributing to the specific target area. These selected scenarios are modeled using high resolution topo-bathymetry to produce detailed inundation maps. Results are presented as probabilistic hazard curves and maps, with the goal of analyzing, comparing and highlighting the different results provided by regional and local hazard assessments. Moreover, the analysis is enriched by the use of local observed tsunami data, both geological and historical. Indeed, the tsunami data-sets available for the selected target areas are particularly rich with respect to the scarce and heterogeneous data-sets usually available elsewhere; therefore, they can represent valuable benchmarks for testing and strengthening the results of such studies. The work is funded by the Italian Flagship Project RITMARE, the two EC FP7 projects ASTARTE (Grant agreement 603839) and STREST (Grant agreement 603389), and the INGV-DPC Agreement.
Probabilistic Seismic Hazard Assessment of the Chiapas State (SE Mexico)
NASA Astrophysics Data System (ADS)
Rodríguez-Lomelí, Anabel Georgina; García-Mayordomo, Julián
2015-04-01
The Chiapas State, in southeastern Mexico, is a very seismically active and tectonically complex region due to the interaction of three tectonic plates: North America, Cocos and Caribbean. We present a probabilistic seismic hazard assessment (PSHA) specifically performed to evaluate seismic hazard in the Chiapas state. The PSHA was based on a composite seismic catalogue homogenized to Mw, and a logic tree procedure was used for the consideration of different seismogenic source models and ground motion prediction equations (GMPEs). The results were obtained in terms of peak ground acceleration as well as spectral accelerations. The earthquake catalogue was compiled from the International Seismological Centre and the Servicio Sismológico Nacional de México. Two different seismogenic source zone (SSZ) models were devised based on a revision of the tectonics of the region and the available geomorphological and geological maps. The SSZ were finally defined by the analysis of geophysical data, resulting in two main SSZ models. The Gutenberg-Richter parameters for each SSZ were calculated from the declustered and homogenized catalogue, while the maximum expected earthquake was assessed from both the catalogue and geological criteria. Several worldwide and regional GMPEs for subduction and crustal zones were revised. For each SSZ model we considered four possible combinations of GMPEs. Finally, hazard was calculated in terms of PGA and SA for 500-, 1000-, and 2500-year return periods for each branch of the logic tree using the CRISIS2007 software. The final hazard maps represent the mean values obtained from the two seismogenic and four attenuation models considered in the logic tree. For the three return periods analyzed, the maps locate the most hazardous areas in the Chiapas Central Pacific Zone, the Pacific Coastal Plain and the Motagua and Polochic Fault Zone; intermediate hazard values occur in the Chiapas Batholith Zone and in the Strike-Slip Faults Province. The hazard decreases towards the northeast, across the Reverse Faults Province and up to the Yucatan Platform, where the lowest values are reached. We also produced uniform hazard spectra (UHS) for the three main cities of Chiapas. Tapachula presents the highest spectral accelerations, while Tuxtla Gutierrez and San Cristobal de las Casas show similar values. We conclude that seismic hazard in Chiapas is chiefly controlled by the subduction of the Cocos plate beneath the North American and Caribbean plates, which makes the coastal areas the most hazardous. Additionally, the Motagua and Polochic Fault Zones are also important, increasing the hazard particularly in southeastern Chiapas.
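The final step, averaging branch results into a mean hazard map, is straightforward once each branch's hazard curve is available. The sketch below uses stand-in curves in place of CRISIS2007 output and assumes equal branch weights, which the abstract does not specify.

```python
import numpy as np

rng = np.random.default_rng(3)

pga = np.logspace(-2, 0, 50)                  # PGA grid, g
# 2 source-zone models x 4 GMPE combinations = 8 logic-tree branches;
# stand-in monotonically decreasing annual exceedance curves:
branches = np.sort(rng.random((8, 50)), axis=1)[:, ::-1] / 50.0
weights = np.full(8, 1 / 8)                   # equal weights assumed here

mean_curve = np.average(branches, axis=0, weights=weights)

# PGA with a 10% probability of exceedance in 50 years (~475-yr return),
# read off the mean curve (arrays reversed so np.interp sees ascending x):
pga_475 = np.interp(1 / 475, mean_curve[::-1], pga[::-1])
```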
NASA Astrophysics Data System (ADS)
Neighbors, C.; Noriega, G. R.; Caras, Y.; Cochran, E. S.
2010-12-01
HAZUS-MH MR4 (HAZards U.S. Multi-Hazard Maintenance Release 4) is risk-estimation software developed by FEMA to calculate potential losses due to natural disasters. Federal, state, regional, and local governments use the HAZUS-MH Earthquake Model for earthquake risk mitigation, preparedness, response, and recovery planning (FEMA, 2003). In this study, we examine several parameters used by the HAZUS-MH Earthquake Model methodology to understand how modifying the user-defined settings affects ground motion analysis, seismic risk assessment and earthquake loss estimates. This analysis focuses on both shallow crustal and deep intraslab events in the American Pacific Northwest. Specifically, the historic 1949 Mw 6.8 Olympia, 1965 Mw 6.6 Seattle-Tacoma and 2001 Mw 6.8 Nisqually normal-fault intraslab events and scenario large-magnitude Seattle reverse-fault crustal events are modeled. Inputs analyzed include variations of deterministic event scenarios combined with hazard maps and USGS ShakeMaps. This approach utilizes the capacity of the HAZUS-MH Earthquake Model to define landslide- and liquefaction-susceptibility hazards with local groundwater level and slope stability information. Where ShakeMap inputs are not used, events are run in combination with NEHRP soil classifications to determine site amplification effects. The earthquake component of HAZUS-MH applies a series of empirical ground motion attenuation relationships developed from source parameters of both regional and global historical earthquakes to estimate strong ground motion. Ground motion and the resulting ground failure due to earthquakes are then used to calculate direct physical damage for general building stock, essential facilities, and lifelines, including transportation and utility systems. Earthquake losses are expressed in structural, economic and social terms. Where available, comparisons between recorded earthquake losses and HAZUS-MH earthquake losses are used to determine how region coordinators can most effectively utilize their resources for earthquake risk mitigation. This study is being conducted in collaboration with King County, WA officials to determine the best model inputs necessary to generate robust HAZUS-MH models for the Pacific Northwest.
A stochastic automata network for earthquake simulation and hazard estimation
NASA Astrophysics Data System (ADS)
Belubekian, Maya Ernest
1998-11-01
This research develops a model for simulation of earthquakes on seismic faults with available earthquake catalog data. The model allows estimation of the seismic hazard at a site of interest and assessment of the potential damage and loss in a region. There are two approaches to studying earthquakes: mechanistic and stochastic. In the mechanistic approach, seismic processes, such as changes in stress or slip on faults, are studied in detail. In the stochastic approach, earthquake occurrences are simulated as realizations of a certain stochastic process. In this dissertation, a stochastic earthquake occurrence model is developed that uses results from dislocation theory for the estimation of slip released in earthquakes. The slip accumulation and release laws and the event scheduling mechanism adopted in the model result in a memoryless Poisson process for small and moderate events and in a time- and space-dependent process for large events. The minimum and maximum of the hazard are estimated by the model when the initial conditions along the faults correspond to the situation right after the largest event and after a long seismic gap, respectively. These estimates are compared with those obtained from a Poisson model. The Poisson model overestimates the hazard after the maximum event and underestimates it during a long seismic quiescence. The earthquake occurrence model is formulated as a stochastic automata network. Each fault is divided into cells, or automata, that interact by means of information exchange. The model uses a statistical method called the bootstrap for the evaluation of confidence bounds on its results. The parameters of the model are adjusted to the target magnitude patterns obtained from the catalog. A case study is presented for the city of Palo Alto, where the hazard is controlled by the San Andreas, Hayward and Calaveras faults. The results of the model are used to evaluate the damage and loss distribution in Palo Alto. A sensitivity analysis shows that the maximum magnitude has the most significant impact on the hazard, especially for long forecast periods.
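For the Poisson part of the comparison above, the exceedance probability of a memoryless occurrence model depends only on the mean rate. A minimal sketch, with an illustrative rate rather than the dissertation's calibrated values:

```python
import math

def poisson_exceedance(rate_per_year: float, window_years: float) -> float:
    """Probability of at least one event in the window under a
    memoryless Poisson occurrence model."""
    return 1.0 - math.exp(-rate_per_year * window_years)

# Illustrative rate: one damaging event per 200 years on average.
for t in (10, 50, 100):
    print(t, "years:", round(poisson_exceedance(1 / 200, t), 3))
```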
The Role of Deposition in Limiting the Hazard Extent of Dense-Gas Plumes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dillon, M B
2008-01-29
Accidents involving release of large (multi-ton) quantities of toxic industrial chemicals often yield far fewer fatalities and casualties than standard, widely used assessment and emergency response models predict. While recent work has suggested that models should incorporate the protection provided by buildings, more refined health-effect methodologies, and more detailed consideration of the release process, investigations into the role of deposition onto outdoor surfaces have been lacking. In this paper, we examine the conditions under which dry deposition may significantly reduce the extent of the downwind hazard zone. We provide theoretical arguments that in congested environments (e.g., suburbs, forests), deposition to vertical surfaces (such as building walls) may play a significant role in reducing the hazard zone extent, particularly under low-wind, stable atmospheric conditions, which are often considered the worst-case scenario for these types of releases. Our analysis suggests that in these urban or suburban environments, the amount of toxic chemical lost to the earth's surface is typically a small fraction of overall depositional losses. For isothermal gases such as chlorine, the degree to which the chemicals stick to (or react with) surfaces (i.e., the surface resistance) is demonstrated to be a key parameter controlling hazard extent (the maximum distance from the release at which hazards to human health are expected). This analysis does not consider the depositional effects associated with particulate matter or gases that undergo significant thermal change in the atmosphere. While no controlled experiments were available to validate our hypothesis, our analysis results are qualitatively consistent with the observed downwind extent of vegetation damage in two chlorine accidents.
Failure analysis and modeling of a VAXcluster system
NASA Technical Reports Server (NTRS)
Tang, Dong; Iyer, Ravishankar K.; Subramani, Sujatha S.
1990-01-01
This paper discusses the results of a measurement-based analysis of real error data collected from a DEC VAXcluster multicomputer system. In addition to evaluating basic system dependability characteristics such as error and failure distributions and hazard rates for both individual machines and for the VAXcluster, reward models were developed to analyze the impact of failures on the system as a whole. The results show that more than 46 percent of all failures were due to errors in shared resources, despite the fact that these errors have a recovery probability greater than 0.99. The hazard rate calculations show that not only errors but also failures occur in bursts. Approximately 40 percent of all failures occurred in bursts and involved multiple machines, indicating that correlated failures are significant. Analysis of rewards shows that software errors have the lowest reward (0.05 versus 0.74 for disk errors). The expected reward rate (a reliability measure) of the VAXcluster drops to 0.5 in 18 hours for the 7-out-of-7 model and in 80 days for the 3-out-of-7 model.
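The k-out-of-n configurations mentioned above can be illustrated with a textbook reliability formula. The sketch below assumes independent machines with a constant (exponential) failure rate and no repair, which is much simpler than the paper's measurement-based reward models; the failure rate is invented:

```python
from math import comb, exp

def k_of_n_reliability(k: int, n: int, lam: float, t: float) -> float:
    """Reliability of a k-out-of-n system of independent machines,
    each with exponential failure rate lam (per hour), no repair."""
    p = exp(-lam * t)  # single-machine survival probability at time t
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

lam = 0.002  # illustrative per-machine failure rate (per hour)
for k in (7, 3):
    print(f"{k}-out-of-7 reliability at 18 h:",
          round(k_of_n_reliability(k, 7, lam, 18.0), 3))
```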
Petersen, Mark D.; Mueller, Charles S.; Moschetti, Morgan P.; Hoover, Susan M.; Llenos, Andrea L.; Ellsworth, William L.; Michael, Andrew J.; Rubinstein, Justin L.; McGarr, Arthur F.; Rukstales, Kenneth S.
2016-03-28
The U.S. Geological Survey (USGS) has produced a 1-year seismic hazard forecast for 2016 for the Central and Eastern United States (CEUS) that includes contributions from both induced and natural earthquakes. The model assumes that earthquake rates calculated from several different time windows will remain relatively stationary and can be used to forecast earthquake hazard and damage intensity for the year 2016. This assessment is the first step in developing an operational earthquake forecast for the CEUS, and the analysis could be revised with updated seismicity and model parameters. Consensus input models consider alternative earthquake catalog durations, smoothing parameters, maximum magnitudes, and ground motion estimates, and represent uncertainties in earthquake occurrence and diversity of opinion in the science community. Ground shaking seismic hazard for 1-percent probability of exceedance in 1 year reaches 0.6 g (as a fraction of standard gravity [g]) in northern Oklahoma and southern Kansas, and about 0.2 g in the Raton Basin of Colorado and New Mexico, in central Arkansas, and in north-central Texas near Dallas. Near some areas of active induced earthquakes, hazard is higher than in the 2014 USGS National Seismic Hazard Model (NSHM) by more than a factor of 3; the 2014 NSHM did not consider induced earthquakes. In some areas, previously observed induced earthquakes have stopped, so the seismic hazard reverts to the 2014 NSHM. Increased seismic activity, whether defined as induced or natural, produces high hazard. Conversion of ground shaking to seismic intensity indicates that some places in Oklahoma, Kansas, Colorado, New Mexico, Texas, and Arkansas may experience damage if the induced seismicity continues unabated. The chance of having Modified Mercalli Intensity (MMI) VI or greater (damaging earthquake shaking) is 5–12 percent per year in north-central Oklahoma and southern Kansas, similar to the chance of damage caused by natural earthquakes at sites in parts of California.
Expanding CyberShake Physics-Based Seismic Hazard Calculations to Central California
NASA Astrophysics Data System (ADS)
Silva, F.; Callaghan, S.; Maechling, P. J.; Goulet, C. A.; Milner, K. R.; Graves, R. W.; Olsen, K. B.; Jordan, T. H.
2016-12-01
As part of its program of earthquake system science, the Southern California Earthquake Center (SCEC) has developed a simulation platform, CyberShake, to perform physics-based probabilistic seismic hazard analysis (PSHA) using 3D deterministic wave propagation simulations. CyberShake performs PSHA by first simulating a tensor-valued wavefield of Strain Green Tensors. CyberShake then takes an earthquake rupture forecast and extends it by varying the hypocenter location and slip distribution, resulting in about 500,000 rupture variations. Seismic reciprocity is used to calculate synthetic seismograms for each rupture variation at each computation site. These seismograms are processed to obtain intensity measures, such as spectral acceleration, which are then combined with probabilities from the earthquake rupture forecast to produce a hazard curve. Hazard curves are calculated at seismic frequencies up to 1 Hz for hundreds of sites in a region and the results interpolated to obtain a hazard map. In developing and verifying CyberShake, we have focused our modeling in the greater Los Angeles region. We are now expanding the hazard calculations into Central California. Using workflow tools running jobs across two large-scale open-science supercomputers, NCSA Blue Waters and OLCF Titan, we calculated 1-Hz PSHA results for over 400 locations in Central California. For each location, we produced hazard curves using both a 3D central California velocity model created via tomographic inversion, and a regionally averaged 1D model. These new results provide low-frequency exceedance probabilities for the rapidly expanding metropolitan areas of Santa Barbara, Bakersfield, and San Luis Obispo, and lend new insights into the effects of directivity-basin coupling associated with basins juxtaposed to major faults such as the San Andreas. Particularly interesting are the basin effects associated with the deep sediments of the southern San Joaquin Valley. We will compare hazard estimates from the 1D and 3D models, summarize the challenges of expanding CyberShake to a new geographic region, and describe our future CyberShake plans.
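A hazard curve of the kind CyberShake produces can be sketched in miniature: each rupture variation contributes its annual probability to every intensity threshold its ground motion exceeds. The values below are invented, and a real CyberShake run aggregates hundreds of thousands of variations; summing annual probabilities as rates is the usual small-probability approximation.

```python
import numpy as np

# Illustrative inputs: spectral accelerations (g) computed from synthetic
# seismograms for five rupture variations, and the annual probability of
# each variation from the earthquake rupture forecast.
sa_values = np.array([0.05, 0.12, 0.30, 0.08, 0.22])
annual_probs = np.array([1e-3, 5e-4, 1e-4, 2e-3, 3e-4])

def hazard_curve(sa, probs, thresholds):
    """Annual exceedance rate at each SA threshold, treating rupture
    occurrences as independent so their rates can be summed."""
    return [probs[sa > x].sum() for x in thresholds]

thresholds = np.array([0.05, 0.1, 0.2, 0.3])
print(hazard_curve(sa_values, annual_probs, thresholds))
```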
A first hazard analysis of the Harrat Ash Shamah volcanic field, Syria-Jordan Borderline
NASA Astrophysics Data System (ADS)
Cagnan, Zehra; Akkar, Sinan; Moghimi, Saed
2017-04-01
The northernmost part of the Saudi Cenozoic Volcanic Fields, the 100,000 km2 Harrat Ash Shamah, has hosted some of the most recent volcanic eruptions along the Syria-Jordan borderline. With the rapid growth of the cities in this region, exposure to any potential renewed volcanism has increased considerably. We present here a first-order probabilistic hazard analysis related to new vent formation and subsequent lava flow from Harrat Ash Shamah. The 733 visible eruption vent sites were utilized to develop a probability density function for new eruption sites using Gaussian kernel smoothing. This revealed a NNW-striking zone of high spatial hazard surrounding the cities of Amman and Irbid in Jordan. The temporal eruption recurrence rate is estimated to be approximately one vent per 3500 years, but the temporal record of the field is so poorly constrained that the recurrence interval can only be bounded between 70 yr and 17,700 yr. A Poisson temporal model is employed within the scope of this study. In order to treat the uncertainties associated with the spatio-temporal models, as well as the size of the area affected by the lava flow, a logic tree approach is adopted. For the Syria-Jordan borderline, the spatial variation of volcanic hazard is computed, as well as the uncertainty associated with these estimates.
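A minimal sketch of the spatial part of such an analysis, using a Gaussian kernel density over vent locations. The coordinates here are synthetic stand-ins for the 733 mapped vents, and the bandwidth is SciPy's default rule rather than the study's choice:

```python
import numpy as np
from scipy.stats import gaussian_kde

# Hypothetical vent coordinates (km, in a local projection), elongated
# NNW-SSE to mimic a vent alignment; 733 points as in the study.
rng = np.random.default_rng(0)
vents = rng.normal(loc=[[0.0], [0.0]], scale=[[30.0], [80.0]], size=(2, 733))

kde = gaussian_kde(vents)  # bandwidth from Scott's rule by default

# Spatial density (per km^2) at a site of interest; multiplying by a
# temporal rate (about 1 vent per 3500 yr here) would give a crude
# vent-opening hazard at that site.
site = np.array([[10.0], [20.0]])
print("spatial density at site:", kde(site))
```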
Liu, Xueqin; Li, Ning; Yuan, Shuai; Xu, Ning; Shi, Wenqin; Chen, Weibin
2015-12-15
As random events, natural disasters have complex occurrence mechanisms. Comprehensive analysis of multiple hazard factors is important in disaster risk assessment. To improve the accuracy of risk analysis and forecasting, the formation mechanism of a disaster should be considered in the analysis and calculation of multiple factors. Based on the importance and current deficiencies of multivariate analysis of dust storm disasters, 91 severe dust storm disasters in Inner Mongolia from 1990 to 2013 were selected as study cases. Main hazard factors from the 500-hPa atmospheric circulation system, the near-surface meteorological system, and underlying surface conditions were selected to simulate and calculate the multidimensional joint return periods. After comparing the simulation results with actual dust storm events over 54 years, we found that the two-dimensional Frank copula function fit better at the lower tail of the hazard factors, while the three-dimensional Frank copula function fit better at the middle and upper tails. However, for dust storm disasters with short return periods, three-dimensional joint return period simulation shows no obvious advantage; if the return period is longer than 10 years, it shows significant advantages in extreme-value fitting. We therefore suggest that the multivariate analysis method be adopted in forecasting and risk analysis of serious disasters with longer return periods, such as earthquakes and tsunamis. Furthermore, the exploration of this method lays a foundation for the prediction and warning of other natural disasters. Copyright © 2015 Elsevier B.V. All rights reserved.
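For readers unfamiliar with copula-based return periods, the sketch below evaluates a bivariate Frank copula and the "AND" joint return period (both factors exceeding their thresholds). The dependence parameter and quantiles are illustrative, not fitted to the Inner Mongolia data:

```python
import math

def frank_copula(u: float, v: float, theta: float) -> float:
    """Bivariate Frank copula C(u, v); theta controls dependence."""
    num = (math.exp(-theta * u) - 1) * (math.exp(-theta * v) - 1)
    return -math.log(1 + num / (math.exp(-theta) - 1)) / theta

def joint_return_period_and(u: float, v: float, theta: float,
                            mu_years: float = 1.0) -> float:
    """Return period of both factors exceeding their u- and v-quantiles
    (the 'AND' case), with mu_years the mean interarrival time."""
    p_exceed_both = 1 - u - v + frank_copula(u, v, theta)
    return mu_years / p_exceed_both

# Illustrative: 90th-percentile thresholds for two hazard factors,
# moderate positive dependence (theta = 3).
print(round(joint_return_period_and(0.9, 0.9, 3.0), 1), "years")
```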
NASA Astrophysics Data System (ADS)
Yurkovich, E. S.; Howell, D. G.
2002-12-01
Exploding population and unprecedented urban development within the last century have helped fuel an increase in the severity of natural disasters. Not only has the world become more populated, but people, information and commodities now travel greater distances to service larger concentrations of people. While many of the earth's natural hazards remain relatively constant, understanding the risk to increasingly interconnected and large populations requires an expanded analysis. To improve mitigation planning we propose a model that is accessible to planners and implemented with public-domain data and industry-standard GIS software. The model comprises 1) the potential impact of five significant natural hazards (earthquake, flood, tropical storm, tsunami and volcanic eruption) assessed by a comparative index of risk, 2) population density, 3) infrastructure distribution represented by a proxy, 4) the vulnerability of the elements at risk (population density and infrastructure distribution) and 5) the connections and dependencies of our increasingly 'globalized' world, portrayed by a relative linkage index. We depict this model with the equation Risk = f(H, E, V, I), where H is an index normalizing the impact of five major categories of natural hazards; E is one element at risk, population or infrastructure; V is a measure of the vulnerability of the elements at risk; and I is a measure of the interconnectivity of the elements at risk as a result of economic and social globalization. We propose that future risk analysis include the variable I to better define and quantify risk. Each assessment reflects different repercussions from natural disasters: losses of life or economic activity. Because population and infrastructure are distributed heterogeneously across the Pacific region, two contrasting representations of risk emerge from this study.
Multi Hazard Assessment: The Azores Archipelagos (PT) case
NASA Astrophysics Data System (ADS)
Aifantopoulou, Dorothea; Boni, Giorgio; Cenci, Luca; Kaskara, Maria; Kontoes, Haris; Papoutsis, Ioannis; Paralikidis, Sideris; Psichogyiou, Christina; Solomos, Stavros; Squicciarino, Giuseppe; Tsouni, Alexia; Xerekakis, Themos
2016-04-01
The Copernicus EMS Risk & Recovery Mapping (RRM) activity offers services to support efficient design and implementation of mitigation measures and recovery planning based on EO data exploitation. The Azores Archipelago case was realized in the context of the FWC 259811 Copernicus EMS RRM and provides potential impact information for a number of natural disasters. The analysis identified population and assets at risk (infrastructure and environment). The risk assessment was based on hazard and on the vulnerability of structural elements, road network characteristics, etc. The integration of the different hazards and risks was taken into account in establishing the necessary first-response/first-aid infrastructure. EO data (Pleiades and WV-2) were used to establish detailed background information common to the assessment of all the risks. A qualitative flood hazard level was established through a "Flood Susceptibility Index" that accounts for upstream drainage area and local slope along the drainage network (Manfreda et al. 2014). Indicators representing different vulnerability typologies were accounted for, and the risk was established by intersecting hazard and vulnerability through a risk-specific lookup table. Probabilistic seismic hazard maps (PGA) were obtained by applying the Cornell (1968) methodology as implemented in CRISIS2007 (Ordaz et al. 2007). The approach relied on the identification of potential sources, the assessment of earthquake recurrence and magnitude distribution, the selection of a ground motion model, and a mathematical model to calculate seismic hazard. Lava eruption areas and a volcanic-activity-related coefficient were established from available historical data. Lava flow paths and their convergence were estimated by applying a cellular-automata-based Lava Flow Hazard numerical model (Gestur Leó Gislason, 2013). The Landslide Hazard Index of NGI (Norwegian Geotechnical Institute) for heavy rainfall (100-year extreme monthly rainfall) and earthquake (475-year return period) was used; topography, lithology, soil moisture and LU/LC were also accounted for. Soil erosion risk was assessed through the empirical model RUSLE (Renard et al. 1991b), with rainfall erosivity, topography and vegetation cover as the main parameters used for predicting proneness to soil loss. Expected maximum tsunami wave heights were estimated for a specific earthquake scenario at designated forecast points along the coasts. Deformation at the source was calculated using the Okada code (Okada, 1985). Tsunami wave generation and propagation are based on the SWAN model (JRC/IPSC modification). To estimate the wave height at the forecast points, the Green's Law function was used (JRC Tsunami Analysis Tool). Storm track historical data indicate return periods of 17 and 41 years for H1 and H2 hurricane categories, respectively. The NOAA WAVEWATCH III model hindcast reanalysis was used to estimate the maximum significant wave height (wind and swell) along the coastline during two major storms; the associated storm-surge risk assessment also accounted for coastline morphology. Seven empirical (independent) indicators were used to express the erosion susceptibility of the coasts. Each indicator is evaluated according to a semi-quantitative score that represents a low, medium or high level of erosion risk or impact. The coastal erosion hazard estimate was derived by aggregating the indicators on a grid scale.
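Of the methods listed, Green's Law is compact enough to sketch directly: under energy-flux conservation, wave height scales with the fourth root of the depth ratio as a tsunami shoals. The offshore height and depths below are illustrative:

```python
def greens_law_height(h_offshore: float, d_offshore: float,
                      d_coastal: float) -> float:
    """Green's Law: wave height scales as (d_offshore/d_coastal)**(1/4)
    as the wave moves into shallower water."""
    return h_offshore * (d_offshore / d_coastal) ** 0.25

# Illustrative: a 0.5 m wave in 2000 m of water reaching 10 m depth.
print(round(greens_law_height(0.5, 2000.0, 10.0), 2), "m")
```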
Nadeau-Fredette, Annie-Claire; Hawley, Carmel M.; Pascoe, Elaine M.; Chan, Christopher T.; Clayton, Philip A.; Polkinghorne, Kevan R.; Boudville, Neil; Leblanc, Martine
2015-01-01
Background and objectives Home dialysis is often recognized as a first-choice therapy for patients initiating dialysis. However, studies comparing clinical outcomes between peritoneal dialysis and home hemodialysis have been very limited. Design, setting, participants, & measurements This Australia and New Zealand Dialysis and Transplantation Registry study assessed all Australian and New Zealand adult patients receiving home dialysis on day 90 after initiation of RRT between 2000 and 2012. The primary outcome was overall survival. The secondary outcomes were on-treatment survival, patient and technique survival, and death-censored technique survival. All results were adjusted with three prespecified models: multivariable Cox proportional hazards model (main model), propensity score quintile–stratified model, and propensity score–matched model. Results The study included 10,710 patients on incident peritoneal dialysis and 706 patients on incident home hemodialysis. Treatment with home hemodialysis was associated with better patient survival than treatment with peritoneal dialysis (5-year survival: 85% versus 44%, respectively; log-rank P<0.001). Using multivariable Cox proportional hazards analysis, home hemodialysis was associated with superior patient survival (hazard ratio for overall death, 0.47; 95% confidence interval, 0.38 to 0.59) as well as better on-treatment survival (hazard ratio for on-treatment death, 0.34; 95% confidence interval, 0.26 to 0.45), composite patient and technique survival (hazard ratio for death or technique failure, 0.34; 95% confidence interval, 0.29 to 0.40), and death-censored technique survival (hazard ratio for technique failure, 0.34; 95% confidence interval, 0.28 to 0.41). Similar results were obtained with the propensity score models as well as sensitivity analyses using competing risks models and different definitions for technique failure and lag period after modality switch, during which events were attributed to the initial modality. Conclusions Home hemodialysis was associated with superior patient and technique survival compared with peritoneal dialysis. PMID:26068181
Toward Building a New Seismic Hazard Model for Mainland China
NASA Astrophysics Data System (ADS)
Rong, Y.; Xu, X.; Chen, G.; Cheng, J.; Magistrale, H.; Shen, Z.
2015-12-01
At present, the only publicly available seismic hazard model for mainland China is the one generated by the Global Seismic Hazard Assessment Program in 1999. We are building a new seismic hazard model by integrating historical earthquake catalogs, geological faults, geodetic GPS data, and geology maps. To build the model, we construct an Mw-based homogeneous historical earthquake catalog spanning 780 B.C. to the present, create fault models from active fault data using the methodology recommended by the Global Earthquake Model (GEM), and derive a strain rate map based on the most complete GPS measurements available and a new strain derivation algorithm. We divide China and the surrounding regions into about 20 large seismic source zones based on seismotectonics. For each zone, we use the tapered Gutenberg-Richter (TGR) relationship to model the seismicity rates. We estimate the TGR a- and b-values from the historical earthquake data and constrain the corner magnitude using the seismic moment rate derived from the strain rate. From the TGR distributions, 10,000 to 100,000 years of synthetic earthquakes are simulated. Then, we distribute small and medium earthquakes according to the locations and magnitudes of historical earthquakes. Some large earthquakes are distributed on active faults based on characteristics of the faults, including slip rate, fault length and width, and paleoseismic data; the rest are assigned to the background based on the distributions of historical earthquakes and strain rate. We evaluate available ground motion prediction equations (GMPEs) by comparison with observed ground motions. To apply appropriate GMPEs, we divide the region into active and stable tectonic domains. The seismic hazard will be calculated using the OpenQuake software developed by GEM. To account for site amplification, we construct a site condition map based on geology maps. The resulting new seismic hazard map can be used for seismic risk analysis and management, and for business and land-use planning.
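The magnitude-simulation step can be illustrated with inverse-transform sampling. For brevity the sketch uses a doubly truncated Gutenberg-Richter distribution rather than the tapered (TGR) form the model actually employs, and all parameters are illustrative:

```python
import numpy as np

def sample_truncated_gr(n: int, b: float, m_min: float, m_max: float,
                        seed: int = 0) -> np.ndarray:
    """Inverse-transform sampling of magnitudes from a doubly truncated
    Gutenberg-Richter distribution (a simpler stand-in for the tapered
    GR relationship used in the study)."""
    rng = np.random.default_rng(seed)
    u = rng.random(n)
    beta = b * np.log(10.0)
    c = 1.0 - np.exp(-beta * (m_max - m_min))  # truncation constant
    return m_min - np.log(1.0 - u * c) / beta

mags = sample_truncated_gr(100_000, b=1.0, m_min=5.0, m_max=8.5)
print(mags.min().round(2), mags.max().round(2),
      "fraction M>=7:", (mags >= 7).mean())
```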
Christenson, Elizabeth; Elliott, Mark; Banerjee, Ovik; Hamrick, Laura; Bartram, Jamie
2014-01-01
Global climate change (GCC) has led to increased focus on the occurrence of, and preparation for, climate-related extremes and hazards. Population exposure, the relative likelihood that a person in a given location was exposed to a given hazard event(s) in a given period of time, was the outcome for this analysis. Our objectives were to develop a method for estimating the population exposure at the country level to the climate-related hazards cyclone, drought, and flood; develop a method that readily allows the addition of better datasets to an automated model; differentiate population exposure of urban and rural populations; and calculate and present the results of exposure scores and ranking of countries based on the country-wide, urban, and rural population exposures to cyclone, drought, and flood. Gridded global datasets on cyclone, drought and flood occurrence as well as population density were combined and analysis was carried out using ArcGIS. Results presented include global maps of ranked country-level population exposure to cyclone, drought, flood and multiple hazards. Analyses by geography and human development index (HDI) are also included. The results and analyses of this exposure assessment have implications for country-level adaptation. It can also be used to help prioritize aid decisions and allocation of adaptation resources between countries and within a country. This model is designed to allow flexibility in applying cyclone, drought and flood exposure to a range of outcomes and adaptation measures. PMID:24566046
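A minimal sketch of the overlay logic, assuming gridded hazard-occurrence and population arrays are already co-registered; the data are synthetic and the score definition is a simplified stand-in for the paper's exposure metric:

```python
import numpy as np

# Illustrative 4x4 grids: hazard occurrences per cell over the study
# period, and population per cell (both invented).
rng = np.random.default_rng(1)
hazard_events = rng.poisson(lam=0.5, size=(4, 4))
population = rng.integers(0, 10_000, size=(4, 4))

# Population exposure as person-events, normalized by total population
# to approximate the relative likelihood that a person was exposed.
exposure_score = (hazard_events * population).sum() / population.sum()
print(round(float(exposure_score), 3))
```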
NASA Astrophysics Data System (ADS)
Dalguer, Luis A.; Fukushima, Yoshimitsu; Irikura, Kojiro; Wu, Changjiang
2017-09-01
Inspired by the first workshop on Best Practices in Physics-Based Fault Rupture Models for Seismic Hazard Assessment of Nuclear Installations (BestPSHANI), held by the International Atomic Energy Agency (IAEA) on 18-20 November 2015 in Vienna (http://www-pub.iaea.org/iaeameetings/50896/BestPSHANI), this PAGEOPH topical volume collects several extended articles from the workshop as well as several new contributions. A total of 17 papers have been selected, on topics ranging from the seismological aspects of earthquake cycle simulations for source-scaling evaluation, seismic source characterization, source inversion and ground motion modeling (based on finite fault rupture using dynamic, kinematic, stochastic and empirical Green's function approaches) to the engineering application of simulated ground motion for the analysis of the seismic response of structures. These contributions include applications to real earthquakes and descriptions of current practice for assessing seismic hazard in terms of nuclear safety in low-seismicity areas, as well as proposals for physics-based hazard assessment for critical structures near large earthquakes. Collectively, the papers of this volume highlight the usefulness of physics-based models for evaluating and understanding the physical causes of observed and empirical data, as well as for predicting ground motion beyond the range of recorded data. Particular importance is given to the validation and verification of the models by comparing synthetic results with observed data and empirical models.
NASA Astrophysics Data System (ADS)
Kolokolov, Yury; Monovskaya, Anna
2016-06-01
The paper continues the application of bifurcation analysis to research on local climate dynamics based on processing historically observed data on daily average land surface air temperature. Since the analyzed data come from instrumental measurements, this is an experimental bifurcation analysis. In particular, we focus on the question: where is the boundary between the normal dynamics of local climate systems (norms) and situations with the potential to create damage (hazards)? We illustrate that the criteria for hazards (violent and unfavorable weather factors) perhaps derive mainly from empirical human judgment rather than from natural qualitative changes of climate dynamics. To build the bifurcation diagrams, we rely on an unconventional conceptual model (the HDS-model), which originates from the hysteresis regulator with double synchronization. The HDS-model is characterized by a variable structure with competition between amplitude quantization and time quantization. Intermittency between three periodic processes is then considered as the typical behavior of local climate systems, instead of chaos or quasi-periodicity, in order to account for the variety of local climate dynamics. From the known regularities of HDS-model dynamics, we seek a way to decompose local behaviors into homogeneous units within time sections of homogeneous dynamics. Here, we present the first results of such a decomposition, where quasi-homogeneous sections (QHS) are determined on the basis of modified bifurcation diagrams and the units are reconstructed within limits set by the problem of shape defects. The proposed analysis of local climate dynamics (QHS-analysis) shows how comparatively modest temperature differences between the units at an annual scale can expand, step by step, into large temperature differences in daily variability at a centennial scale. The norms and the hazards then correspond to fundamentally different viewpoints, where time sections of months and, especially, seasons distort the causal effects of natural dynamical processes. The specific circumstances under which qualitative changes of local climate dynamics are realized are summarized by the notion of a likely periodicity. This, in particular, helps explain why 30-year averaging remains the most common rule so far, even as decadal averaging begins to substitute for it. We believe that the QHS-analysis can be considered the joint between the norms and the hazards from a bifurcation analysis viewpoint, where the causal effects of local climate dynamics are projected onto the customary timescale only at the last step. The results could be of interest for fields connected with climate change and risk assessment.
44 CFR 65.7 - Floodway revisions.
Code of Federal Regulations, 2013 CFR
2013-10-01
... HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program IDENTIFICATION AND... described below: (i) The floodway analysis must be performed using the hydraulic computer model used to... output data from the original and modified computer models must be submitted. (5) Delineation of the...
44 CFR 65.7 - Floodway revisions.
Code of Federal Regulations, 2010 CFR
2010-10-01
... HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program IDENTIFICATION AND... described below: (i) The floodway analysis must be performed using the hydraulic computer model used to... output data from the original and modified computer models must be submitted. (5) Delineation of the...
44 CFR 65.7 - Floodway revisions.
Code of Federal Regulations, 2012 CFR
2012-10-01
... HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program IDENTIFICATION AND... described below: (i) The floodway analysis must be performed using the hydraulic computer model used to... output data from the original and modified computer models must be submitted. (5) Delineation of the...
44 CFR 65.7 - Floodway revisions.
Code of Federal Regulations, 2011 CFR
2011-10-01
... HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program IDENTIFICATION AND... described below: (i) The floodway analysis must be performed using the hydraulic computer model used to... output data from the original and modified computer models must be submitted. (5) Delineation of the...
44 CFR 65.7 - Floodway revisions.
Code of Federal Regulations, 2014 CFR
2014-10-01
... HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program IDENTIFICATION AND... described below: (i) The floodway analysis must be performed using the hydraulic computer model used to... output data from the original and modified computer models must be submitted. (5) Delineation of the...
Up-to-date Probabilistic Earthquake Hazard Maps for Egypt
NASA Astrophysics Data System (ADS)
Gaber, Hanan; El-Hadidy, Mahmoud; Badawy, Ahmed
2018-04-01
An up-to-date earthquake hazard analysis has been performed for Egypt using a probabilistic seismic hazard approach. In the current study, we use a complete and homogeneous earthquake catalog covering the period from 2200 BC to 2015 AD. Three seismotectonic models representing the seismic activity in and around Egypt are used. A logic-tree framework is applied to allow for epistemic uncertainty in the declustering parameters, minimum magnitude, seismotectonic setting and ground-motion prediction equations. The hazard analysis is performed on a 0.5° × 0.5° grid for rock site conditions in terms of peak ground acceleration (PGA) and spectral acceleration at 0.2-, 0.5-, 1.0- and 2.0-s periods. The hazard is estimated for three return periods (72, 475 and 2475 years), corresponding to 50, 10 and 2% probability of exceedance in 50 years. The uniform hazard spectra for the cities of Cairo, Alexandria, Aswan and Nuwbia are constructed. The hazard maps show that the highest ground acceleration values are expected in the northeastern part of Egypt around the Gulf of Aqaba (PGA up to 0.4 g for a 475-year return period) and in southern Egypt around the city of Aswan (PGA up to 0.2 g for a 475-year return period). The Western Desert of Egypt is characterized by the lowest level of hazard (PGA lower than 0.1 g for a 475-year return period).
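The correspondence between the quoted return periods and exceedance probabilities follows from the Poisson assumption, as the short check below shows:

```python
import math

def exceedance_probability(return_period: float, years: float) -> float:
    """Probability of exceedance in a time window, assuming Poissonian
    occurrence: P = 1 - exp(-t / T)."""
    return 1.0 - math.exp(-years / return_period)

# The three return periods used in the study, over a 50-year window.
for T in (72, 475, 2475):
    print(T, "yr ->", f"{exceedance_probability(T, 50):.0%}")
```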
NASA Astrophysics Data System (ADS)
Su, Xing; Meng, Xingmin; Ye, Weilin; Wu, Weijiang; Liu, Xingrong; Wei, Wanhong
2018-03-01
Tianshui City is one of the mountainous cities threatened by severe geohazards in Gansu Province, China. Statistical probability models have been widely used in analyzing and evaluating geohazards such as landslides. In this research, three approaches (the Certainty Factor, Weight of Evidence and Information Quantity methods) were adopted to quantitatively analyze the relationship between the causative factors and the landslides. The source data used in this study include the SRTM DEM and local geological maps at the 1:200,000 scale. Twelve causative factors (altitude, slope, aspect, curvature, plan curvature, profile curvature, roughness, relief amplitude, distance to rivers, distance to faults, distance to roads, and stratum lithology) were selected for correlation analysis after thorough investigation of geological conditions and historical landslides. The results indicate that the outcomes of the three models are fairly consistent.
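A common formulation of the Certainty Factor in such landslide studies compares the conditional landslide probability within a factor class to the prior over the whole study area. The sketch below uses this standard (Heckerman-style) formulation with invented probabilities; the paper's exact parametrization may differ:

```python
def certainty_factor(ppa: float, pps: float) -> float:
    """Certainty Factor of a factor class: ppa is the landslide
    probability within the class, pps the prior probability over the
    whole study area. Positive values favor landsliding."""
    if ppa >= pps:
        return (ppa - pps) / (ppa * (1.0 - pps))
    return (ppa - pps) / (pps * (1.0 - ppa))

# Illustrative: 8% of cells in a slope class host landslides versus a
# 3% prior across the area -> positive CF.
print(round(certainty_factor(0.08, 0.03), 3))
```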
DOE Office of Scientific and Technical Information (OSTI.GOV)
Luria, Paolo; Aspinall, Peter A
2003-08-01
The aim of this paper is to describe a new approach to major industrial hazard assessment, recently studied by the authors in conjunction with the Italian Environmental Protection Agency ('ARPAV'). The real opportunity for developing a different approach arose from the need of the Italian EPA to provide the Venice Port Authority with an appropriate estimation of major industrial hazards in Porto Marghera, an industrial estate near Venice (Italy). However, the standard model, quantitative risk analysis (QRA), only provided a list of individual quantitative risk values related to single locations. The experimental model is based on a multi-criteria approach, the Analytic Hierarchy Process, which introduces the use of expert opinions, complementary skills and expertise from different disciplines in conjunction with traditional quantitative analysis. This permitted the generation of quantitative data on risk assessment from a series of qualitative assessments, of the present situation and of three future scenarios, and the use of this information as indirect quantitative measures that could be aggregated to obtain a global risk rate. This approach is in line with the main concepts proposed by the latest European directive on major hazard accidents, which recommends increasing the participation of operators, taking the other players into account and paying more attention to the concepts of 'urban control', 'subjective risk' (risk perception) and intangible factors (factors not directly quantifiable).
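The core AHP computation, deriving priority weights from a pairwise comparison matrix via its principal eigenvector, can be sketched as follows. The matrix entries are illustrative, not the ARPAV panel's judgments:

```python
import numpy as np

# Illustrative pairwise comparison matrix for three risk criteria on
# Saaty's 1-9 scale (a_ij = importance of criterion i relative to j).
A = np.array([
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
])

# The principal eigenvector gives the priority weights.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Consistency check: CI = (lambda_max - n)/(n - 1); RI = 0.58 for n = 3.
ci = (eigvals.real[k] - 3) / (3 - 1)
print("weights:", w.round(3), "consistency ratio:", round(ci / 0.58, 3))
```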
NASA Astrophysics Data System (ADS)
Spence, C. M.; Brown, C.; Doss-Gollin, J.
2016-12-01
Climate model projections are commonly used for water resources management and planning under nonstationarity, but they do not reliably reproduce intense short-term precipitation, being more skilled at broader spatial scales. To provide a credible estimate of flood trend that reflects climate uncertainty, we present a framework that exploits the connections between synoptic-scale oceanic and atmospheric patterns and local-scale flood-producing meteorological events to develop long-term flood hazard projections. We demonstrate the method for the Iowa River, where high-flow episodes have been found to correlate with tropical moisture exports associated with a pressure dipole across the eastern continental United States. We characterize the relationship between flooding on the Iowa River and this pressure dipole through a nonstationary Pareto-Poisson peaks-over-threshold probability distribution estimated from the historic record. We then combine, through a Bayesian framework, a trend analysis of the dipole index in the historic record with a trend analysis of the dipole index as simulated by General Circulation Models (GCMs) under climate change conditions. The resulting nonstationary posterior distribution of the dipole index, combined with the dipole-conditioned peaks-over-threshold flood frequency model, connects local flood hazard to changes in large-scale atmospheric pressure and circulation patterns in a process-driven framework. The Iowa River example demonstrates that the resulting nonstationary, probabilistic flood hazard projection may be used to inform risk-based flood adaptation decisions.
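A minimal sketch of the nonstationary peaks-over-threshold idea: the annual Poisson rate of threshold exceedances is made log-linear in the dipole index. The coefficients here are invented; the paper estimates its model from the historic record and GCM-informed posteriors:

```python
import math

def annual_exceedance_rate(dipole_index: float,
                           a: float = -1.0, b: float = 0.8) -> float:
    """Poisson rate of peaks-over-threshold flood episodes per year,
    log-linear in the dipole index (coefficients are illustrative)."""
    return math.exp(a + b * dipole_index)

def prob_at_least_one(rate: float) -> float:
    """Probability of one or more episodes in a year."""
    return 1.0 - math.exp(-rate)

for x in (-1.0, 0.0, 1.0):
    lam = annual_exceedance_rate(x)
    print("dipole index", x, "-> rate", round(lam, 2),
          "P(>=1 episode)", round(prob_at_least_one(lam), 2))
```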
Lin, Ting; Harmsen, Stephen C.; Baker, Jack W.; Luco, Nicolas
2013-01-01
The conditional spectrum (CS) is a target spectrum (with conditional mean and conditional standard deviation) that links seismic hazard information with ground-motion selection for nonlinear dynamic analysis. Probabilistic seismic hazard analysis (PSHA) estimates the ground-motion hazard by incorporating the aleatory uncertainties in all earthquake scenarios and resulting ground motions, as well as the epistemic uncertainties in ground-motion prediction models (GMPMs) and seismic source models. Typical CS calculations to date are produced for a single earthquake scenario using a single GMPM, but more precise use requires consideration of at least multiple causal earthquakes and multiple GMPMs that are often considered in a PSHA computation. This paper presents the mathematics underlying these more precise CS calculations. Despite requiring more effort to compute than approximate calculations using a single causal earthquake and GMPM, the proposed approach produces an exact output that has a theoretical basis. To demonstrate the results of this approach and compare the exact and approximate calculations, several example calculations are performed for real sites in the western United States. The results also provide some insights regarding the circumstances under which approximate results are likely to closely match more exact results. To facilitate these more precise calculations for real applications, the exact CS calculations can now be performed for real sites in the United States using new deaggregation features in the U.S. Geological Survey hazard mapping tools. Details regarding this implementation are discussed in this paper.
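The single-scenario, single-GMPM form of the CS that the paper generalizes can be written in a few lines: the conditional mean of ln Sa at each period shifts by the period-to-period correlation times the conditioning epsilon, and the conditional standard deviation shrinks by a factor sqrt(1 - rho^2). All numeric values below are illustrative:

```python
import math

def conditional_spectrum_point(mu_i: float, sigma_i: float,
                               rho: float, eps_star: float):
    """Conditional mean and standard deviation of ln Sa at period Ti,
    given epsilon at the conditioning period T* (single-scenario,
    single-GMPM form of the CS)."""
    mean_c = mu_i + rho * eps_star * sigma_i
    sd_c = sigma_i * math.sqrt(1.0 - rho**2)
    return mean_c, sd_c

# Illustrative values: median Sa 0.2 g at Ti, sigma_ln = 0.6,
# correlation 0.7 with the conditioning period, epsilon* = 1.5.
m, s = conditional_spectrum_point(math.log(0.2), 0.6, 0.7, 1.5)
print(f"conditional median Sa: {math.exp(m):.3f} g, sigma_ln: {s:.3f}")
```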
Chronic Use of Theophylline and Mortality in Chronic Obstructive Pulmonary Disease: A Meta-analysis.
Horita, Nobuyuki; Miyazawa, Naoki; Kojima, Ryota; Inoue, Miyo; Ishigatsubo, Yoshiaki; Kaneko, Takeshi
2016-05-01
Theophylline has been shown to improve respiratory function and oxygenation in patients with chronic obstructive pulmonary disease (COPD). However, the impact of theophylline on mortality in COPD patients has not been sufficiently evaluated. Two investigators independently searched for eligible articles in 4 databases. The eligibility criterion for this meta-analysis was an original research article that provided a hazard ratio for theophylline for all-cause mortality of COPD patients. Both randomized controlled trials and observational studies were accepted. After we confirmed no substantial heterogeneity (I(2)<50%), the fixed-effect model with generic inverse variance was used to estimate the pooled hazard ratio. We screened 364 potentially eligible articles; 259 were excluded on the basis of title and abstract, and 99 were excluded after examination of the full text. Our final analysis included 6 observational studies and no randomized controlled trials. One study reported 2 cohorts. The number of patients in each cohort ranged from 47 to 46,403. Heterogeneity (I(2)=42%, P=.11) and publication bias (Begg's test r=0.21, P=.662) were not substantial. Fixed-effect meta-analysis yielded a pooled hazard ratio for theophylline for all-cause death of 1.07 (95% confidence interval: 1.02-1.13, P=.003). This meta-analysis of 7 observational cohorts suggests that theophylline slightly increases all-cause death in COPD patients. Copyright © 2014 SEPAR. Published by Elsevier Espana. All rights reserved.
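The generic inverse-variance, fixed-effect pooling used here is straightforward to reproduce. In the sketch below, log hazard ratios are weighted by inverse variance, with standard errors recovered from confidence-interval widths; the per-study numbers are invented, not the cohorts analyzed in the paper:

```python
import math

# Illustrative per-study hazard ratios and 95% CIs (not the paper's data).
studies = [(1.05, 0.98, 1.13), (1.10, 1.01, 1.20), (1.06, 0.99, 1.14)]

num = den = 0.0
for hr, lo, hi in studies:
    log_hr = math.log(hr)
    se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE from CI width
    w = 1.0 / se**2                                   # inverse variance
    num += w * log_hr
    den += w

pooled = math.exp(num / den)
se_pooled = math.sqrt(1.0 / den)
ci = (math.exp(num / den - 1.96 * se_pooled),
      math.exp(num / den + 1.96 * se_pooled))
print(f"pooled HR: {pooled:.3f}, 95% CI: ({ci[0]:.3f}, {ci[1]:.3f})")
```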
NASA Astrophysics Data System (ADS)
Moncoulon, D.; Labat, D.; Ardon, J.; Leblois, E.; Onfroy, T.; Poulard, C.; Aji, S.; Rémy, A.; Quantin, A.
2014-09-01
The analysis of flood exposure at a national scale for the French insurance market must combine the generation of a probabilistic event set of all possible (but not yet observed) flood situations with hazard and damage modeling. In this study, hazard and damage models are calibrated against a 1995-2010 historical event set, both for hazard results (river flow, flooded areas) and loss estimations. Thus, uncertainties in the deterministic estimation of a single event's loss are known before simulating a probabilistic event set. To take into account at least 90% of insured flood losses, the probabilistic event set must combine river overflow (small and large catchments) with surface runoff due to heavy rainfall on the slopes of the watershed. Indeed, internal studies of the CCR (Caisse Centrale de Réassurance) claim database have shown that approximately 45% of insured flood losses are located inside floodplains and 45% outside; another 10% is due to sea surge floods and groundwater rise. In this approach, two independent probabilistic methods are combined to create a single flood loss distribution: the generation of fictive river flows based on historical records of the river gauge network, and the generation of fictive rain fields on small catchments, calibrated against the 1958-2010 Météo-France rain database SAFRAN. All events in the probabilistic event sets are simulated with the deterministic model. This hazard and damage distribution is used to simulate flood losses at the national scale for an insurance company (Macif) and to generate flood areas associated with hazard return periods. The flood maps concern river overflow and surface water runoff. Validation of these maps is conducted by comparison with address-located claim data on a small catchment (the downstream Argens).
Casellas, J; Bach, R
2012-06-01
Lambing interval is a relevant reproductive indicator for sheep populations under continuous mating systems, although there is a shortage of selection programs accounting for this trait in the sheep industry. Both the historical assumption of small genetic background and its unorthodox distribution pattern have limited its implementation as a breeding objective. In this manuscript, statistical performances of 3 alternative parametrizations [i.e., symmetric Gaussian mixed linear (GML) model, skew-Gaussian mixed linear (SGML) model, and piecewise Weibull proportional hazard (PWPH) model] have been compared to elucidate the preferred methodology to handle lambing interval data. More specifically, flock-by-flock analyses were performed on 31,986 lambing interval records (257.3 ± 0.2 d) from 6 purebred Ripollesa flocks. Model performances were compared in terms of deviance information criterion (DIC) and Bayes factor (BF). For all flocks, PWPH models were clearly preferred; they generated a reduction of 1,900 or more DIC units and provided BF estimates larger than 100 (i.e., PWPH models against linear models). These differences were reduced when comparing PWPH models with different number of change points for the baseline hazard function. In 4 flocks, only 2 change points were required to minimize the DIC, whereas 4 and 6 change points were needed for the 2 remaining flocks. These differences demonstrated a remarkable degree of heterogeneity across sheep flocks that must be properly accounted for in genetic evaluation models to avoid statistical biases and suboptimal genetic trends. Within this context, all 6 Ripollesa flocks revealed substantial genetic background for lambing interval with heritabilities ranging between 0.13 and 0.19. This study provides the first evidence of the suitability of PWPH models for lambing interval analysis, clearly discarding previous parametrizations focused on mixed linear models.
Development/Modernization of an Advanced Non-Light Water Reactor Probabilistic Risk Assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Henneke, Dennis W.; Robinson, James
In 2015, GE Hitachi Nuclear Energy (GEH) teamed with Argonne National Laboratory (Argonne) to perform research and development (R&D) of next-generation Probabilistic Risk Assessment (PRA) methodologies for the modernization of an advanced non-Light Water Reactor (non-LWR) PRA. This effort built upon a PRA developed in the early 1990s for GEH's Power Reactor Inherently Safe Module (PRISM) Sodium Fast Reactor (SFR). The work had four main tasks: internal events development, modeling the risk from the reactor for hazards occurring at-power internal to the plant; an all-hazards scoping review, analyzing the risk at a high level from external hazards such as earthquakes and high winds; an all-modes scoping review, assessing the risk at a high level from operating modes other than at-power; and risk insights, integrating the results from the three phases above. To achieve these objectives, GEH and Argonne used and adapted proven PRA methodologies and techniques to build a modern non-LWR all-hazards/all-modes PRA. The teams also advanced non-LWR PRA methodologies, an important outcome of this work. This report summarizes the project outcomes in two major phases. The first presents the methodologies developed for non-LWR PRAs, grouped by scope, from Internal Events At-Power (IEAP) to hazards analysis to modes analysis. The second presents details of the PRISM PRA model, which was developed as a validation of the non-LWR methodologies. The PRISM PRA was performed in detail for IEAP, and at a broader level for hazards and modes. In addition to contributing methodologies, this project developed risk insights applicable to non-LWR PRA, including focus areas for future R&D and conclusions about the PRISM design.
Caraviello, D Z; Weigel, K A; Gianola, D
2004-05-01
Predicted transmitting abilities (PTA) of US Jersey sires for daughter longevity were calculated using a Weibull proportional hazards sire model and compared with predictions from a conventional linear animal model. Culling data from 268,008 Jersey cows with first calving from 1981 to 2000 were used. The proportional hazards model included time-dependent effects of herd-year-season contemporary group and parity by stage of lactation interaction, as well as time-independent effects of sire and age at first calving. Sire variances and parameters of the Weibull distribution were estimated, providing heritability estimates of 4.7% on the log scale and 18.0% on the original scale. The PTA of each sire was expressed as the expected risk of culling relative to daughters of an average sire. Risk ratios (RR) ranged from 0.7 to 1.3, indicating that the risk of culling for daughters of the best sires was 30% lower than for daughters of average sires and nearly 50% lower than for daughters of the poorest sires. Sire PTA from the proportional hazards model were compared with PTA from a linear model similar to that used for routine national genetic evaluation of length of productive life (PL) using cross-validation in independent samples of herds. Models were compared using logistic regression of daughters' stayability to second, third, fourth, or fifth lactation on their sires' PTA values, with alternative approaches for weighting the contribution of each sire. Models were also compared using logistic regression of daughters' stayability to 36, 48, 60, 72, and 84 mo of life. The proportional hazards model generally yielded more accurate predictions according to these criteria, but differences in predictive ability between methods were smaller when using a Kullback-Leibler distance than with other approaches. Results of this study suggest that survival analysis methodology may provide more accurate predictions of genetic merit for longevity than conventional linear models.
Survival analysis in hematologic malignancies: recommendations for clinicians
Delgado, Julio; Pereira, Arturo; Villamor, Neus; López-Guillermo, Armando; Rozman, Ciril
2014-01-01
The widespread availability of statistical packages has undoubtedly helped hematologists worldwide in the analysis of their data, but has also led to the inappropriate use of statistical methods. In this article, we review some basic concepts of survival analysis and make recommendations about how and when to perform each particular test using SPSS, Stata and R. In particular, we describe a simple way of defining cut-off points for continuous variables and the appropriate and inappropriate uses of the Kaplan-Meier method and Cox proportional hazards regression models. We also provide practical advice on how to check the proportional hazards assumption and briefly review the role of relative survival and multiple imputation. PMID:25176982
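In the same spirit as the article's recommendations, a minimal Python analogue (using the third-party lifelines package and a toy dataset, not the article's worked examples) fits a Kaplan-Meier curve and a Cox model, then runs the built-in proportional hazards diagnostic:

```python
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

# Toy dataset: follow-up time (months), event indicator, one covariate.
df = pd.DataFrame({
    "time": [6, 12, 18, 24, 30, 36, 42, 48, 54, 60],
    "event": [1, 0, 1, 1, 0, 1, 0, 1, 1, 0],
    "age": [61, 55, 70, 66, 48, 72, 59, 68, 74, 52],
})

# Nonparametric survival estimate.
kmf = KaplanMeierFitter()
kmf.fit(df["time"], event_observed=df["event"])
print(kmf.survival_function_.tail(3))

# Semiparametric regression on the covariate.
cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
cph.print_summary()

# Diagnostic for the proportional hazards assumption.
cph.check_assumptions(df)
```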
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kevin J. Coppersmith; Lawrence A. Salomone; Chris W. Fuller
2012-01-31
This report describes a new seismic source characterization (SSC) model for the Central and Eastern United States (CEUS). It will replace the Seismic Hazard Methodology for the Central and Eastern United States, EPRI Report NP-4726 (July 1986), and the Seismic Hazard Characterization of 69 Nuclear Plant Sites East of the Rocky Mountains, Lawrence Livermore National Laboratory Model (Bernreuter et al., 1989). The objective of the CEUS SSC Project is to develop a new seismic source model for the CEUS using a Senior Seismic Hazard Analysis Committee (SSHAC) Level 3 assessment process. The goal of the SSHAC process is to represent the center, body, and range of technically defensible interpretations of the available data, models, and methods. Input to a probabilistic seismic hazard analysis (PSHA) consists of both seismic source characterization and ground motion characterization. These two components are used to calculate probabilistic hazard results (or seismic hazard curves) at a particular site. This report provides a new seismic source model.
Results and Findings: The product of this report is a regional CEUS SSC model. This model includes consideration of an updated database, full assessment and incorporation of uncertainties, and the range of diverse technical interpretations from the larger technical community. The SSC model is applicable to the entire CEUS, so this project uses a ground motion model that includes generic variations to allow for a range of representative site conditions (deep soil, shallow soil, hard rock). Hazard and sensitivity calculations were conducted at seven test sites representative of different CEUS hazard environments.
Challenges and Objectives: The regional CEUS SSC model will be of value to readers who are involved in PSHA work and who wish to use an updated SSC model. The model is based on a comprehensive and traceable process, in accordance with SSHAC guidelines in NUREG/CR-6372, Recommendations for Probabilistic Seismic Hazard Analysis: Guidance on Uncertainty and Use of Experts. It will be used to assess the present-day composite distribution of seismic sources in the CEUS, along with their characterization and uncertainty. In addition, the model is in a form suitable for use in PSHA evaluations for regulatory activities, such as Early Site Permits (ESPs) and Combined Operating License Applications (COLAs).
Applications, Values, and Use: Development of a regional CEUS seismic source model will provide value to those who (1) have submitted an ESP or COLA for Nuclear Regulatory Commission (NRC) review before 2011; (2) will submit an ESP or COLA for NRC review after 2011; (3) must respond to safety issues resulting from NRC Generic Issue 199 (GI-199) for existing plants; and (4) will prepare PSHAs to meet design and periodic review requirements for current and future nuclear facilities. This work replaces a study performed approximately 25 years ago. Since that study was completed, substantial work has been done to improve the understanding of seismic sources and their characterization in the CEUS. A new regional SSC model thus provides a consistent, stable basis for computing PSHA going forward, and reduces the risk of delays in new plant licensing due to more conservative interpretations in the existing and future literature.
Perspective: The purpose of this study, jointly sponsored by EPRI, the U.S. Department of Energy (DOE), and the NRC, was to develop a new CEUS SSC model. The team assembled to accomplish this purpose was composed of distinguished subject matter experts from industry, government, and academia. The resulting model is unique, and because this project solicited input from the present-day larger technical community, significant revision is unlikely to be needed for a number of years. See also the Sponsors' Perspective for more details. The goal of this project was to implement the CEUS SSC work plan for developing a regional CEUS SSC model. The work plan, formulated by the project manager and a technical integration team, consists of a series of tasks designed to meet the project objectives. This report was reviewed by a participatory peer review panel (PPRP), sponsor reviewers, the NRC, the U.S. Geological Survey, and other stakeholders. Comments from the PPRP and other reviewers were considered when preparing the report. The SSC model was completed at the end of 2011.
Pan, Hongwei; Lei, Hongjun; Liu, Xin; Wei, Huaibin; Liu, Shufang
2017-09-01
A large number of simple and informal landfills exist in developing countries, posing tremendous soil and groundwater pollution threats. Early warning and monitoring of landfill leachate pollution status are therefore of great importance, yet affordable and effective tools and methods are in short supply. In this study, a soil column experiment was performed to simulate the pollution status of leachate using three-dimensional excitation-emission fluorescence (3D-EEMF) and parallel factor analysis (PARAFAC) models. The sum of squared residuals (SSR) and principal component analysis (PCA) were used to determine the optimal number of components for PARAFAC. A one-way analysis of variance showed that the component scores of the soil column leachate were significantly influenced by landfill leachate (p<0.05). Therefore, the ratio of the component scores of the soil under the landfill to those of natural soil can be used to evaluate the leakage status of landfill leachate. Furthermore, a hazard index (HI) and a hazard evaluation standard were established. A case study of the Kaifeng landfill indicated a low hazard (level 5) using the HI. In sum, the HI is presented as a tool to evaluate landfill pollution status and to guide municipal solid waste management. Copyright © 2017 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusof, Z.; Tehrany, M. S.
2014-10-01
Modeling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modeling. Bivariate statistical analysis (BSA) assists in hazard modeling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time-consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, BSM (bivariate statistical modeler), for the BSA technique is proposed. Three popular BSA techniques, namely frequency ratio, weights-of-evidence, and evidential belief function models, are applied in the newly proposed ArcMAP tool. This tool is programmed in Python and created with a simple graphical user interface, which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. Area under curve is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.
NASA Astrophysics Data System (ADS)
Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusoff, Z. M.; Tehrany, M. S.
2015-03-01
Modelling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modelling. Bivariate statistical analysis (BSA) assists in hazard modelling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually in Microsoft Excel or other programs, a process that is time-consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, the bivariate statistical modeler (BSM), for the BSA technique is proposed. Three popular BSA techniques (frequency ratio, weight-of-evidence (WoE), and evidential belief function (EBF) models) are applied in the newly proposed ArcMAP tool. This tool is programmed in Python and provides a simple graphical user interface (GUI), which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. The area under the curve (AUC) is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.
The U.S. EPA, under its ExpoCast program, is developing high-throughput near-field modeling methods to estimate human chemical exposure and to provide real-world context to high-throughput screening (HTS) hazard data. These novel modeling methods include reverse methods to infer ...
Social assistance and conjugal union dissolution in Canada: a dynamic analysis.
Lefebvre, P; Merrigan, P
1997-02-01
"Using Statistics Canada's General Social Survey on Family and Friends, carried out in 1990, we piece together the matrimonial and conjugal life history of a large sample of Canadian men and women. We then estimate duration models (Cox's proportional hazard models) describing the evolutionary laws of marriages and unions, which depend on various economic or socio-demographic explanatory variables. The empirical modelling focuses primarily on estimating the impact of couples' earned incomes and of provincial welfare programs on the dissolution rate of first marriages and unions....We...find that welfare benefits do not have an impact on the hazard of union dissolution and that earned incomes have a positive effect on conjugal stability." (EXCERPT)
Yan, Fang; Xu, Kaili; Li, Deshun; Cui, Zhikai
2017-01-01
Biomass gasification stations face many hazard factors, and it is therefore necessary to assess their hazards. In this study, a novel hazard assessment method called extended set pair analysis (ESPA) is proposed based on set pair analysis (SPA). In SPA-based hazard assessment, calculation of the connection degree (CD) requires hazard grades and their corresponding thresholds to be classified. For hazard assessment using ESPA, a novel algorithm is therefore worked out to calculate the CD when hazard grades and their corresponding thresholds are unknown. The CD can then be converted into a Euclidean distance (ED) by a simple and concise calculation, and the hazard of each sample is ranked based on the value of ED. In this paper, six biomass gasification stations are introduced, and hazard assessments are made using ESPA and general set pair analysis (GSPA), respectively. Comparison of the hazard assessment results obtained from ESPA and GSPA demonstrates the availability and validity of ESPA for the hazard assessment of biomass gasification stations. The reasonability of ESPA is also justified by a sensitivity analysis of the hazard assessment results obtained by ESPA and GSPA. PMID:28938011
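An illustrative sketch only: assuming each station's connection degree is a triple (a, b, c) with a + b + c = 1 (identity, discrepancy, and contrary degrees, as in standard SPA), one simple CD-to-ED conversion is the Euclidean distance from the ideal state (1, 0, 0). The exact ESPA formula may differ; both the mapping and the station values below are assumptions.

```python
# Sketch: rank stations by Euclidean distance of their connection degree
# (a, b, c) from the assumed ideal state (1, 0, 0).
import numpy as np

cd = np.array([  # hypothetical (a, b, c) triples for six stations
    [0.70, 0.20, 0.10],
    [0.55, 0.30, 0.15],
    [0.80, 0.15, 0.05],
    [0.40, 0.35, 0.25],
    [0.65, 0.25, 0.10],
    [0.50, 0.20, 0.30],
])
ed = np.sqrt(((cd - np.array([1.0, 0.0, 0.0])) ** 2).sum(axis=1))
ranking = np.argsort(-ed)  # larger distance from the ideal = higher hazard
for rank, idx in enumerate(ranking, 1):
    print(f"rank {rank}: station {idx + 1} (ED = {ed[idx]:.3f})")
```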
Web processing service for landslide hazard assessment
NASA Astrophysics Data System (ADS)
Sandric, I.; Ursaru, P.; Chitu, D.; Mihai, B.; Savulescu, I.
2012-04-01
Hazard analysis requires heavy computation and specialized software. Web processing services can offer complex solutions that can be accessed through a light client (web or desktop). This paper presents a web processing service (both WPS and Esri Geoprocessing Service) for landslide hazard assessment. The web processing service was built with the Esri ArcGIS Server solution and Python, developed using ArcPy, GDAL Python and NumPy. A complex model for landslide hazard analysis, using both predisposing and triggering factors combined into a Bayesian temporal network with uncertainty propagation, was built and published as a WPS and Geoprocessing service using ArcGIS Standard Enterprise 10.1. The model uses as predisposing factors the first and second derivatives of the DEM, effective precipitation, runoff, lithology and land use. All these parameters can be served to the client from other WFS services or by uploading and processing the data on the server. The user can choose to have the first and second derivatives of the DEM created automatically on the server or to upload data already calculated. One of the main dynamic factors in the landslide analysis model is the leaf area index (LAI). The LAI offers the advantage of modelling not just the changes between different time periods expressed in years, but also the seasonal changes in land use throughout a year. The LAI can be derived from various satellite images or downloaded as a product, and the upload of such time series data is possible using the NetCDF file format. The model is run in a monthly time step, and for each time step all the parameter values and the a priori, conditional and posterior probabilities are obtained and stored in a log file. The validation process uses landslides that have occurred during the period up to the active time step and checks the recorded probabilities and parameter values for those time steps against the values of the active time step. Each time a landslide is positively identified, new a priori probabilities are recorded for each parameter. A complete log for the entire model is saved and used for statistical analysis, and a NetCDF file is created that can be downloaded from the server together with the log file.
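A minimal sketch of the monthly Bayesian update cycle described above, collapsed to one binary landslide node and a single observed trigger (effective precipitation exceeding a threshold); all probabilities are illustrative, not the service's calibrated values.

```python
# Sketch: monthly Bayesian update where each posterior becomes the next
# month's prior, as in the model's logged a priori/posterior sequence.
prior = 0.05                      # a priori P(landslide) for the cell
p_obs_given_ls = 0.80             # assumed P(heavy rain | landslide month)
p_obs_given_no = 0.20             # assumed P(heavy rain | quiet month)

log = []
for month, heavy_rain in enumerate([True, False, True, True], start=1):
    like = p_obs_given_ls if heavy_rain else 1 - p_obs_given_ls
    like_no = p_obs_given_no if heavy_rain else 1 - p_obs_given_no
    posterior = like * prior / (like * prior + like_no * (1 - prior))
    log.append((month, prior, posterior))
    prior = posterior             # posterior feeds the next time step
for month, pri, post in log:
    print(f"month {month}: prior={pri:.3f} posterior={post:.3f}")
```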
NASA Astrophysics Data System (ADS)
Zahmatkesh, Zahra; Karamouz, Mohammad; Nazif, Sara
2015-09-01
Simulation of the rainfall-runoff process in urban areas is of great importance considering the consequences and damages of extreme runoff events and floods. The first issue in flood hazard analysis is rainfall simulation. Large scale climate signals have proved effective in rainfall simulation and prediction. In this study, an integrated scheme is developed for rainfall-runoff modeling considering different sources of uncertainty. This scheme includes three main steps: rainfall forecasting, rainfall-runoff simulation and future runoff prediction. In the first step, data driven models are developed and used to forecast rainfall using large scale climate signals as rainfall predictors. Because different sources of uncertainty strongly affect the output of hydrologic models, in the second step uncertainty associated with input data, model parameters and model structure is incorporated in rainfall-runoff modeling and simulation. Three rainfall-runoff simulation models are developed for consideration of model conceptual (structural) uncertainty in real time runoff forecasting. To analyze the uncertainty of the model structure, streamflows generated by alternative rainfall-runoff models are combined through a weighting method based on K-means clustering. Model parameter and input uncertainty are investigated using an adaptive Markov Chain Monte Carlo method. Finally, the calibrated rainfall-runoff models are driven using the forecasted rainfall to predict future runoff for the watershed. The proposed scheme is employed in the case study of the Bronx River watershed, New York City. Results of the uncertainty analysis of rainfall-runoff modeling reveal that simultaneous estimation of model parameter and input uncertainty significantly changes the probability distribution of the model parameters. It is also observed that by combining the outputs of the hydrological models using the proposed clustering scheme, the accuracy of runoff simulation in the watershed is improved by up to 50% in comparison with the simulations by the individual models. Results indicate that the developed methodology provides not only reliable tools for rainfall and runoff modeling, but also adequate time for incorporating required mitigation measures in dealing with potentially extreme runoff events and flood hazard. Results of this study can be used in identification of the main factors affecting flood hazard analysis.
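An illustrative sketch of cluster-based model combination: time steps are grouped by flow regime with K-means, and within each cluster the competing models are weighted by inverse error against observations. This is a plausible reading of the weighting scheme, not the paper's exact algorithm; all data are synthetic.

```python
# Sketch: combine three model simulations via cluster-wise inverse-error weights.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
obs = np.abs(rng.normal(10, 4, 200))                                      # observed runoff
sims = np.stack([obs + rng.normal(0, s, 200) for s in (1.0, 2.0, 3.0)])  # 3 models

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(obs.reshape(-1, 1))
combined = np.empty_like(obs)
for c in range(3):
    m = labels == c
    err = ((sims[:, m] - obs[m]) ** 2).mean(axis=1)     # per-model MSE in cluster
    w = (1 / err) / (1 / err).sum()                     # inverse-error weights
    combined[m] = np.tensordot(w, sims[:, m], axes=1)
print("RMSE best single model:", min(np.sqrt(((s - obs) ** 2).mean()) for s in sims))
print("RMSE combined         :", np.sqrt(((combined - obs) ** 2).mean()))
```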
14 CFR 437.55 - Hazard analysis.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 14 Aeronautics and Space 4 2013-01-01 2013-01-01 false Hazard analysis. 437.55 Section 437.55... TRANSPORTATION LICENSING EXPERIMENTAL PERMITS Safety Requirements § 437.55 Hazard analysis. (a) A permittee must... safety of property resulting from each permitted flight. This hazard analysis must— (1) Identify and...
14 CFR 437.55 - Hazard analysis.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 14 Aeronautics and Space 4 2012-01-01 2012-01-01 false Hazard analysis. 437.55 Section 437.55... TRANSPORTATION LICENSING EXPERIMENTAL PERMITS Safety Requirements § 437.55 Hazard analysis. (a) A permittee must... safety of property resulting from each permitted flight. This hazard analysis must— (1) Identify and...
Hydrotreater/Distillation Column Hazard Analysis Report Rev. 2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lowry, Peter P.; Wagner, Katie A.
This project Hazard and Risk Analysis Report contains the results of several hazard analyses and risk assessments. An initial assessment was conducted in 2012, which included a multi-step approach ranging from design reviews to a formal What-If hazard analysis. A second What-If hazard analysis was completed during February 2013 to evaluate the operation of the hydrotreater/distillation column processes to be installed in a process enclosure within the Process Development Laboratory West (PDL-West) facility located on the PNNL campus. The qualitative analysis included participation of project and operations personnel and applicable subject matter experts. The analysis identified potential hazardous scenarios, each based on an initiating event coupled with a postulated upset condition. The unmitigated consequences of each hazardous scenario were generally characterized as a process upset; the exposure of personnel to steam, vapors or hazardous material; a spray or spill of hazardous material; the creation of a flammable atmosphere; or an energetic release from a pressure boundary.
NASA Astrophysics Data System (ADS)
Barsotti, Sara; Duncan, Melanie; Loughlin, Susan; Gísladóttir, Bryndis; Roberts, Matthew; Karlsdóttir, Sigrún; Scollo, Simona; Salerno, Giuseppe; Corsaro, Rosa Anna; Charalampakis, Marinos; Papadopoulos, Gerassimos
2017-04-01
The demand for timely analysis and advice on global volcanic activity from scientists is growing. At the same time, decision-makers require more than an understanding of hazards; they need to know what impacts to expect from ongoing and future events. ARISTOTLE (All Risk Integrated System TOwards Trans-boundary hoListic Early-warning) is a two-year EC-funded pilot project designed to do just that. The Emergency Response Coordination Centre (ERCC) works to support and coordinate response to disasters both inside and outside Europe using resources from the countries participating in the European Union Civil Protection Mechanism. Led by INGV and ZAMG, the ARISTOTLE consortium comprises 15 institutions across Europe and aims to deliver multi-hazard advice on natural events, including their potential interactions and impact, both inside and outside of Europe to the ERCC. Where possible, the ERCC would like a pro-active provision of scientific advice by the scientific group. The Iceland Met Office leads the volcanic hazards work, with BGS, INGV and NOA comprising the volcano observatory team. At this stage, the volcanology component of the project comprises mainly volcanic ash and gas dispersal and their potential impact on populations and ground-based critical infrastructure. We approach it by relying upon available and official volcano monitoring institutions' reporting of activity, existing assessments and global databases of past events, modelling tools, remote-sensing observational systems and official VAAC advisories. We also make use of global assessments of volcanic hazards, country profiles, exposure and proxy indicators of threat to livelihoods, infrastructure and economic assets (e.g. Global Volcano Model outputs). Volcanic ash fall remains the only hazard modelled at the global scale. Volcanic risk assessments remain in their infancy, owing to challenges related to the multitude of hazards, data availability and model representation. We therefore face a number of challenges in delivering pro-active scientific advice to ARISTOTLE, in addition to the main challenge of working within a multi-hazard framework. Here we present our methods for analysis and advice, along with the challenges we face, and hope to stimulate interesting discussion and receive constructive feedback, as well as explore how the global community can address the demand for scientific advice at the international level. The role of international networks and collaboration is clear, as is the critical role of volcano observatories, which are embedded in local communities and connected to the international community. We aim to enhance our approaches through the Global Volcano Model network (including IAVCEI, WOVO, GVP and VHub) and directly with volcano observatories, VAACs and civil protection agencies.
Application of Catastrophe Risk Modelling to Evacuation Public Policy
NASA Astrophysics Data System (ADS)
Woo, G.
2009-04-01
The decision by civic authorities to evacuate an area threatened by a natural hazard is especially fraught when the population in harm's way is extremely large, and where there is considerable uncertainty in the spatial footprint, scale, and strike time of a hazard event. Traditionally, evacuation has been viewed as a hazard forecasting issue, with civil authorities turning to scientists for advice on a potentially imminent dangerous event. However, the level of scientific confidence varies enormously from one peril and crisis situation to another. With superior observational data, meteorological and hydrological hazards are generally better forecast than geological hazards. But even with Atlantic hurricanes, the track and intensity of a hurricane can change significantly within a few hours. This complicated and delayed the decision to call an evacuation of New Orleans when threatened by Hurricane Katrina, and would present a severe dilemma if a major hurricane appeared to be heading for New York. Evacuation needs to be perceived as a risk issue, requiring the expertise of catastrophe risk modellers as well as geoscientists. Faced with evidence of a great earthquake in the Indian Ocean in December 2004, seismologists were reluctant to give a tsunami warning without more direct sea observations. Yet, from a risk perspective, the risk to coastal populations would have warranted attempts at tsunami warning, even though there was significant uncertainty in the hazard forecast and a chance of a false alarm. A systematic, coherent risk-based framework for evacuation decision-making exists, which weighs the advantages of an evacuation call against the disadvantages. Implicitly and qualitatively, such a cost-benefit analysis is undertaken by civic authorities whenever an evacuation is considered. With the progress in catastrophe risk modelling, such an analysis can be made explicit and quantitative, providing a transparent audit trail for the decision process. A stochastic event set, the core of a catastrophe risk model, is required to explore the casualty implications of different possible hazard scenarios, to assess the proportion of an evacuated population who would owe their lives to an evacuation, and to estimate the economic loss associated with an unnecessary evacuation. This paper will review the developing methodology for applying catastrophe risk modelling to support public policy in evacuation decision-making, and provide illustrations from across the range of natural hazards. Evacuation during volcanic crises is a prime example, recognizing the improving forecasting skill of volcanologists, now able to account probabilistically for precursory seismological, geodetic, and geochemical monitoring data. This methodology will be shown to help civic authorities make sounder risk-informed decisions on the timing and population segmentation of evacuation from both volcanoes and calderas, such as Vesuvius and Campi Flegrei, which are in densely populated urban regions.
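A minimal sketch of the explicit cost-benefit calculus described above, evaluated over a stochastic event set; every figure (strike probability, casualty distribution, value of a statistical life, evacuation cost, fraction saved) is an illustrative placeholder, not a value from the paper.

```python
# Sketch: expected benefit of evacuating vs. the cost of calling it,
# averaged over a stochastic event set.
import numpy as np

rng = np.random.default_rng(4)
n_events = 10_000
strike = rng.random(n_events) < 0.15                 # does the hazard strike?
casualties = np.where(strike, rng.lognormal(5, 1, n_events), 0.0)

value_per_life = 10e6        # assumed monetised value of a statistical life
evac_cost = 50e6             # assumed cost of calling the evacuation
frac_saved = 0.9             # assumed fraction of casualties an evacuation averts

benefit = (casualties * frac_saved * value_per_life).mean()
print(f"expected benefit: {benefit/1e6:.1f} M  vs  cost: {evac_cost/1e6:.1f} M")
print("call evacuation" if benefit > evac_cost else "do not evacuate")
```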
Safety Analysis of Soybean Processing for Advanced Life Support
NASA Technical Reports Server (NTRS)
Hentges, Dawn L.
1999-01-01
Soybean (cv. Hoyt) is one of the crops planned for food production within the Advanced Life Support System Integration Testbed (ALSSIT), a proposed habitat simulation for long duration lunar/Mars missions. Soybeans may be processed into a variety of food products, including soymilk, tofu, and tempeh. Due to the closed environmental system and the importance of crew health maintenance, food safety is a primary concern on long duration space missions. Identification of the food safety hazards and critical control points associated with the closed ALSSIT system is essential for the development of safe food processing techniques and equipment. A Hazard Analysis Critical Control Point (HACCP) model was developed to reflect proposed production and processing protocols for ALSSIT soybeans. Soybean processing was placed in the type III risk category. During the processing of ALSSIT-grown soybeans, critical control points were identified to control microbiological hazards, particularly mycotoxins, and chemical hazards from antinutrients. Critical limits were suggested at each CCP. Food safety recommendations regarding the hazards and risks associated with growing, harvesting, and processing soybeans; biomass management; and the use of multifunctional equipment were made in consideration of the limitations and constraints of the closed ALSSIT.
The Typhoon Disaster Analysis Based on Weibo Topic Heat
NASA Astrophysics Data System (ADS)
Yuan, J.; Gong, A.; Wang, J.; Li, J.
2018-04-01
Can social media data be utilized in hazard evaluation? Typhoon disasters, among the most costly, have become a devastating threat to humans. Moreover, social media have changed the way people communicate, and citizens turn to these platforms to express disaster-related information in real time. Social media thus improve situational awareness and widen the means of acquiring hazard information. More and more studies have investigated the relationship between social media response and degree of damage, and a strong correlation has been demonstrated. Weibo, one of the most popular social media platforms in China, provides data with posted text, location, user identification and other additional information. Combining 10 tropical cyclones with Weibo data from 2013, we perform a quantitative analysis of the relationship between the grade of the hazard situation and the heat of related Weibo topics at the province scale. We provide a new model of Weibo topic heat to evaluate Weibo activity in the study area, and we find the hazard assessment formula H = 1.8845 ln(α) + 15.636 for tropical cyclone disasters. The high goodness of fit of the curve also suggests that this equation can be used for rapid assessment of the hazard caused by tropical cyclones.
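A minimal sketch of recovering a log-linear relation of the form H = a ln(α) + b from topic-heat data, using scipy's curve fitting on synthetic data generated around the study's reported coefficients (a = 1.8845, b = 15.636); the data are not the study's.

```python
# Sketch: fit H = a*ln(alpha) + b to synthetic topic-heat observations.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(5)
alpha = rng.uniform(1, 1000, 50)                                  # Weibo topic heat
hazard = 1.8845 * np.log(alpha) + 15.636 + rng.normal(0, 0.5, 50)

popt, _ = curve_fit(lambda x, a, b: a * np.log(x) + b, alpha, hazard)
print(f"fitted a = {popt[0]:.4f}, b = {popt[1]:.3f}")             # ~ (1.8845, 15.636)
```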
Economic Impact Analyses of Interdisciplinary Multi-hazard Scenarios: ShakeOut and ARkStorm
NASA Astrophysics Data System (ADS)
Wein, A. M.; Rose, A.; Sue Wing, I.; Wei, D.
2011-12-01
U. S. Geological Survey (USGS) scientists are using an interdisciplinary strategy to develop and analyze multi-hazard scenarios to help communities enhance resilience to natural hazard disasters. Two such scenarios are the southern California ShakeOut earthquake and the California ARkStorm winter storm. Both scenarios are multi-hazard: ShakeOut ground motions trigger landslides and liquefaction, and ARkStorm involves wind, flood, landslide, and coastal hazards. A collaborative scenario process engages partners and stakeholders throughout the development and use of the scenarios. In doing so, community resilience is enhanced by educating communities about hazards and hazard interdependencies, building networks from scientists to decision makers, exercising emergency management strategies, identifying emergency management issues, and motivating solutions prior to an event. In addition, interdisciplinary scenarios stimulate research on the various steps of analysis (e.g., natural hazard processes, physical damages, societal consequences, and policy connections). In particular, USGS scientists have collaborated with economists to advance methods to estimate the economic impacts (business interruption losses) of disasters. Our economic impact analyses evolved from the economic module in the Federal Emergency Management Agency's loss-estimation tool, HAZUS-MH, to a more encompassing input-output analysis for ShakeOut, to a more sophisticated Computable General Equilibrium model for ARkStorm. The analyses depend on physical damage and restoration time estimates from engineers and geographic analyses of economic assets in hazard zones. Economic resilience strategies are incorporated to represent resourcefulness and ingenuity that avoids potential losses during and after an event. Such strategies operate at three levels of the economy: micro (e.g., ability to catch up on lost production time), meso (e.g., coordination within a sector to share resources), and macro (e.g., price adjustments to redistribute scarce resources). A sensitivity analysis of the ARkStorm economic impact model explores the effects of 1) the magnitude of the shocks (e.g., flood damages to buildings and infrastructure, agricultural productivity, and lifeline service disruptions), 2) the sustainability of the economic resilience strategies, and 3) the amount, timing, and source of reconstruction funds. The inclusion of an economic analysis in ShakeOut and ARkStorm broadens the range of interest in the scenario results. For example, the relative contribution of ShakeOut economic shocks to business interruption losses emphasized the need to reduce the impacts of fire following earthquake and water service disruption. Based on the magnitude and duration of the economic impacts for the ARkStorm scenario, policy experts surmised that business interruption policy time elements would be exceeded and business interruptions would be largely unfunded, calling attention to the need for innovative funding solutions. Finally, economic impact analyses inform the question of paying now to mitigate or paying more later to recover.
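A minimal sketch of the input-output step behind loss tools of the HAZUS-MH kind: total output x satisfies x = Ax + d, so a shock to final demand d propagates through the Leontief inverse, capturing indirect as well as direct losses. The 3-sector economy below is a toy example, not data from either scenario.

```python
# Sketch: Leontief input-output estimate of total output loss from a demand shock.
import numpy as np

A = np.array([[0.10, 0.20, 0.05],     # technical coefficients (toy values)
              [0.15, 0.05, 0.10],
              [0.05, 0.10, 0.15]])
d0 = np.array([100.0, 80.0, 60.0])    # pre-event final demand
d1 = d0 * np.array([0.7, 0.9, 1.0])   # event shock: sector 1 loses 30%, sector 2 10%

L = np.linalg.inv(np.eye(3) - A)      # Leontief inverse
loss = L @ d0 - L @ d1                # total (direct + indirect) output loss
print("output loss by sector:", np.round(loss, 1), " total:", round(float(loss.sum()), 1))
```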
DOE Office of Scientific and Technical Information (OSTI.GOV)
Riddle, F. J.
2003-06-26
The Automated Hazard Analysis (AHA) application is a software tool used to conduct job hazard screening and analysis of tasks to be performed in Savannah River Site facilities. The AHA application provides a systematic approach to the assessment of safety and environmental hazards associated with specific tasks, and the identification of controls, regulations, and other requirements needed to perform those tasks safely. AHA is to be integrated into existing Savannah River Site work control and job hazard analysis processes. Utilization of AHA will improve the consistency and completeness of hazard screening and analysis, and increase the effectiveness of the work planning process.
Hydrothermal Liquefaction Treatment Hazard Analysis Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lowry, Peter P.; Wagner, Katie A.
Hazard analyses were performed to evaluate the modular hydrothermal liquefaction treatment system. The hazard assessment process was performed in two stages. An initial assessment utilizing Hazard Identification and Preliminary Hazards Analysis (PHA) techniques identified areas with significant or unique hazards (process safety-related hazards) that fall outside of the normal operating envelope of PNNL and warranted additional analysis. The subsequent assessment was based on a qualitative What-If analysis. The analysis was augmented, as necessary, by additional quantitative analysis for scenarios involving a release of hazardous material or energy with the potential for affecting the public. The following selected hazardous scenarios received increased attention:
• For scenarios involving a release of hazardous material or energy, controls were identified in the What-If analysis table that prevent the occurrence or mitigate the effects of the release.
• For scenarios with significant consequences that could impact personnel outside the immediate operations area, quantitative analyses were performed to determine the potential magnitude of the scenario. A set of "critical controls" was identified for these scenarios (see Section 4) which prevent the occurrence or mitigate the effects of events with significant consequences.
Hazard Analysis for Building 34 Vacuum Glove Box Assembly
NASA Technical Reports Server (NTRS)
Meginnis, Ian
2014-01-01
One of the characteristics of an effective safety program is the recognition and control of hazards before mishaps or failures occur. Conducting potentially hazardous tests necessitates a thorough hazard analysis in order to prevent injury to personnel, and to prevent damage to facilities and equipment. The primary purpose of this hazard analysis is to define and address the potential hazards and controls associated with the Building 34 Vacuum Glove Box Assembly, and to provide the applicable team of personnel with the documented results. It is imperative that each member of the team be familiar with the hazards and controls associated with his/her particular tasks, assignments and activities while interfacing with facility test systems, equipment and hardware. In fulfillment of the stated purposes, the goal of this hazard analysis is to identify all hazards that have the potential to harm personnel, damage the facility or its test systems or equipment, test articles, Government or personal property, or the environment. This analysis may also assess the significance and risk, when applicable, of lost test objectives when substantial monetary value is involved. The hazards, causes, controls, verifications, and risk assessment codes have been documented on the hazard analysis work sheets in Appendix A of this document. The preparation and development of this report is in accordance with JPR 1700.1, "JSC Safety and Health Handbook" and JSC 17773 Rev D "Instructions for Preparation of Hazard Analysis for JSC Ground Operations".
The Handling of Hazard Data on a National Scale: A Case Study from the British Geological Survey
NASA Astrophysics Data System (ADS)
Royse, Katherine R.
2011-11-01
This paper reviews how hazard data and geological map data have been combined by the British Geological Survey (BGS) to produce a set of GIS-based national-scale hazard susceptibility maps for the UK. This work has been carried out over the last 9 years and as such reflects the combined outputs of a large number of researchers at BGS. The paper details the inception of these datasets from the development of the seamless digital geological map in 2001 through to the deterministic 2D hazard models produced today. These datasets currently include landslides, shrink-swell, soluble rocks, compressible and collapsible deposits, groundwater flooding, geological indicators of flooding, radon potential and potentially harmful elements in soil. These models have been created using a combination of expert knowledge (from both within BGS and from outside bodies such as the Health Protection Agency), national databases (which contain data collected over the past 175 years), multi-criteria analysis within geographical information systems and a flexible rule-based approach for each individual geohazard. By using GIS in this way, it has been possible to model the distribution and degree of geohazards across the whole of Britain.
Johnson, Branden B; Hallman, William K; Cuite, Cara L
2015-03-01
Perceptions of institutions that manage hazards are important because they can affect how the public responds to hazard events. Antecedents of trust judgments have received far more attention than antecedents of attributions of responsibility for hazard events. We build upon a model of retrospective attribution of responsibility to individuals to examine these relationships regarding five classes of institutions that bear responsibility for food safety: producers (e.g., farmers), processors (e.g., packaging firms), watchdogs (e.g., government agencies), sellers (e.g., supermarkets), and preparers (e.g., restaurants). A nationally representative sample of 1,200 American adults completed an Internet-based survey in which a hypothetical scenario involving contamination of diverse foods with Salmonella served as the stimulus event. Perceived competence and good intentions of the institution moderately decreased attributions of responsibility. A stronger factor was whether an institution was deemed (potentially) aware of the contamination and free to act to prevent or mitigate it. Responsibility was rated higher the more aware and free the institution. This initial model for attributions of responsibility to impersonal institutions (as opposed to individual responsibility) merits further development. © 2014 Society for Risk Analysis.
Automatic Hazard Detection for Landers
NASA Technical Reports Server (NTRS)
Huertas, Andres; Cheng, Yang; Matthies, Larry H.
2008-01-01
Unmanned planetary landers to date have landed 'blind'; that is, without the benefit of onboard landing hazard detection and avoidance systems. This constrains landing site selection to very benign terrain, which in turn constrains the scientific agenda of missions. State-of-the-art Entry, Descent, and Landing (EDL) technology can land a spacecraft on Mars somewhere within a 20-100 km landing ellipse. Landing ellipses are very likely to contain hazards such as craters, discontinuities, steep slopes, and large rocks that can cause mission-fatal damage. We briefly review sensor options for landing hazard detection and identify a perception approach based on stereo vision and shadow analysis that addresses the broadest set of missions. Our approach fuses stereo vision and monocular shadow-based rock detection to maximize spacecraft safety. We summarize performance models for slope estimation and rock detection within this approach and validate those models experimentally. Instantiating our model of rock detection reliability for Mars predicts that this approach can reduce the probability of a failed landing by at least a factor of 4 in any given terrain. We also describe a rock detector/mapper applied to large, high-resolution images from the Mars Reconnaissance Orbiter (MRO) for landing site characterization and selection for Mars missions.
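A deliberately simplified sketch of the geometry behind shadow-based rock detection: dark pixels are segmented as shadows, and a shadow's length along the sun direction gives an approximate rock height via height = length x tan(sun elevation). The flight algorithm is far more sophisticated; everything here (the synthetic image, pixel size, thresholds, and the assumption that the sun azimuth aligns with the image x-axis) is illustrative.

```python
# Sketch: segment shadows and size rocks from shadow length and sun elevation.
import numpy as np
from scipy.ndimage import label

rng = np.random.default_rng(6)
img = rng.uniform(0.4, 0.9, (64, 64))      # synthetic terrain brightness
img[30:34, 20:28] = 0.05                   # one dark "shadow" patch

sun_elev_deg = 20.0
pixel_m = 0.3                              # assumed ground sample distance (m/px)
shadows, n = label(img < 0.15)             # connected dark regions
for k in range(1, n + 1):
    cols = np.where(shadows == k)[1]
    length_m = (cols.max() - cols.min() + 1) * pixel_m   # extent along assumed sun azimuth (+x)
    height = length_m * np.tan(np.radians(sun_elev_deg))
    print(f"shadow {k}: est. rock height ~ {height:.2f} m")
```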
Semiparametric Time-to-Event Modeling in the Presence of a Latent Progression Event
Rice, John D.; Tsodikov, Alex
2017-01-01
In cancer research, interest frequently centers on factors influencing a latent event that must precede a terminal event. In practice it is often impossible to observe the latent event precisely, making inference about this process difficult. To address this problem, we propose a joint model for the unobserved time to the latent and terminal events, with the two events linked by the baseline hazard. Covariates enter the model parametrically as linear combinations that multiply, respectively, the hazard for the latent event and the hazard for the terminal event conditional on the latent one. We derive the partial likelihood estimators for this problem assuming the latent event is observed, and propose a profile likelihood-based method for estimation when the latent event is unobserved. The baseline hazard in this case is estimated nonparametrically using the EM algorithm, which allows for closed-form Breslow-type estimators at each iteration, bringing improved computational efficiency and stability compared with maximizing the marginal likelihood directly. We present simulation studies to illustrate the finite-sample properties of the method; its use in practice is demonstrated in the analysis of a prostate cancer data set. PMID:27556886
Semiparametric time-to-event modeling in the presence of a latent progression event.
Rice, John D; Tsodikov, Alex
2017-06-01
In cancer research, interest frequently centers on factors influencing a latent event that must precede a terminal event. In practice it is often impossible to observe the latent event precisely, making inference about this process difficult. To address this problem, we propose a joint model for the unobserved time to the latent and terminal events, with the two events linked by the baseline hazard. Covariates enter the model parametrically as linear combinations that multiply, respectively, the hazard for the latent event and the hazard for the terminal event conditional on the latent one. We derive the partial likelihood estimators for this problem assuming the latent event is observed, and propose a profile likelihood-based method for estimation when the latent event is unobserved. The baseline hazard in this case is estimated nonparametrically using the EM algorithm, which allows for closed-form Breslow-type estimators at each iteration, bringing improved computational efficiency and stability compared with maximizing the marginal likelihood directly. We present simulation studies to illustrate the finite-sample properties of the method; its use in practice is demonstrated in the analysis of a prostate cancer data set. © 2016, The International Biometric Society.
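A minimal sketch of the data structure the joint model targets: a latent progression time and a terminal time share one baseline hazard, with covariates multiplying each hazard separately, and only the terminal event (possibly censored) is observed. An exponential baseline is assumed here purely for tractability; the paper estimates the baseline nonparametrically, so this is a simulation of the setting, not the estimator.

```python
# Sketch: simulate latent and terminal times sharing a baseline hazard.
import numpy as np

rng = np.random.default_rng(12)
n, lam0 = 1000, 0.05                     # assumed exponential baseline hazard rate
x = rng.normal(size=n)
beta_latent, beta_term = 0.5, 0.8        # covariate multipliers on each hazard

t_latent = rng.exponential(1 / (lam0 * np.exp(beta_latent * x)))
# Terminal event can only follow progression; its hazard reuses the baseline.
t_term = t_latent + rng.exponential(1 / (lam0 * np.exp(beta_term * x)))
censor = rng.uniform(10, 60, n)
observed_term = np.minimum(t_term, censor)            # t_latent itself is unobserved
event = (t_term <= censor).astype(int)
print("terminal events observed:", event.sum(), "of", n)
```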
Maps Showing Seismic Landslide Hazards in Anchorage, Alaska
Jibson, Randall W.; Michael, John A.
2009-01-01
The devastating landslides that accompanied the great 1964 Alaska earthquake showed that seismically triggered landslides are one of the greatest geologic hazards in Anchorage. Maps quantifying seismic landslide hazards are therefore important for planning, zoning, and emergency-response preparation. The accompanying maps portray seismic landslide hazards for the following conditions: (1) deep, translational landslides, which occur only during great subduction-zone earthquakes that have return periods of ~300-900 yr; (2) shallow landslides for a peak ground acceleration (PGA) of 0.69 g, which has a return period of 2,475 yr, or a 2 percent probability of exceedance in 50 yr; and (3) shallow landslides for a PGA of 0.43 g, which has a return period of 475 yr, or a 10 percent probability of exceedance in 50 yr. Deep, translational landslide hazard zones were delineated based on previous studies of such landslides, with some modifications based on field observations of locations of deep landslides. Shallow-landslide hazards were delineated using a Newmark-type displacement analysis for the two probabilistic ground motions modeled.
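A minimal sketch of the Newmark-type rigid-block displacement analysis named above: the block slides only while ground acceleration exceeds a critical (yield) acceleration, and sliding stops when its velocity returns to zero. The acceleration time history and yield acceleration below are synthetic; the published maps rest on calibrated ground motions and regressions.

```python
# Sketch: Newmark rigid-block sliding displacement from an acceleration record.
import numpy as np

dt = 0.01
t = np.arange(0, 20, dt)
a_ground = 0.69 * 9.81 * np.exp(-0.1 * t) * np.sin(2 * np.pi * 1.5 * t)  # toy record, PGA 0.69 g
a_crit = 0.15 * 9.81                                                     # assumed yield acceleration

v = 0.0   # sliding velocity (m/s)
u = 0.0   # cumulative displacement (m)
for a in a_ground:
    if a > a_crit or v > 0.0:              # sliding initiates or continues
        v = max(v + (a - a_crit) * dt, 0.0)  # decelerates once a < a_crit, stops at v = 0
        u += v * dt
print(f"Newmark displacement ~ {u * 100:.1f} cm")
```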
Maps showing seismic landslide hazards in Anchorage, Alaska
Jibson, Randall W.
2014-01-01
The devastating landslides that accompanied the great 1964 Alaska earthquake showed that seismically triggered landslides are one of the greatest geologic hazards in Anchorage. Maps quantifying seismic landslide hazards are therefore important for planning, zoning, and emergency-response preparation. The accompanying maps portray seismic landslide hazards for the following conditions: (1) deep, translational landslides, which occur only during great subduction-zone earthquakes that have return periods of ~300-900 yr; (2) shallow landslides for a peak ground acceleration (PGA) of 0.69 g, which has a return period of 2,475 yr, or a 2 percent probability of exceedance in 50 yr; and (3) shallow landslides for a PGA of 0.43 g, which has a return period of 475 yr, or a 10 percent probability of exceedance in 50 yr. Deep, translational landslide hazards were delineated based on previous studies of such landslides, with some modifications based on field observations of locations of deep landslides. Shallow-landslide hazards were delineated using a Newmark-type displacement analysis for the two probabilistic ground motions modeled.
Proposed method for hazard mapping of landslide propagation zone
NASA Astrophysics Data System (ADS)
Serbulea, Manole-Stelian; Gogu, Radu; Manoli, Daniel-Marcel; Gaitanaru, Dragos Stefan; Priceputu, Adrian; Andronic, Adrian; Anghel, Alexandra; Liviu Bugea, Adrian; Ungureanu, Constantin; Niculescu, Alexandru
2013-04-01
Sustainable development of communities situated in areas with landslide potential requires a full understanding of the mechanisms that govern the triggering of the phenomenon as well as the propagation of the sliding mass, which can have catastrophic consequences for nearby inhabitants and the environment. Modern analysis methods for areas affected by the movement of soil bodies are presented in this work, together with a new procedure to assess landslide hazard. Classical soil mechanics offers sufficient numerical models to assess the landslide triggering zone, such as limit equilibrium methods (Fellenius, Janbu, Morgenstern-Price, Bishop, Spencer etc.), block models or progressive mobilization models, and the Lagrange-based finite element method. The computation methods for assessing the propagation zones are quite recent and have high computational requirements, and thus have not been used sufficiently in practice to confirm their feasibility. The proposed procedure aims to assess not only the landslide hazard factor, but also the affected areas, by means of simple mathematical operations. The method can easily be employed in GIS software, without requiring engineering training. The result is obtained by computing the first and second derivatives of the digital terrain model (slope and curvature maps). Using the curvature maps, it is shown that one can assess the areas most likely to be affected by the propagation of the sliding masses. The procedure is first applied to a simple theoretical model and then used on a representative section of a high exposure area in Romania. The method is described by comparison with Romanian legislation for risk and vulnerability assessment, which specifies that landslide hazard is to be assessed using an average hazard factor Km, obtained from various other factors. In the presented example, it is observed that the Km factor yields an inconsistent distribution of the polygonal surfaces corresponding to different landslide potentials. For small values of Km (0.00-0.10), the polygonal surfaces have reduced dimensions along the slopes belonging to the main rivers. This can be corrected by including in the analysis the potential areas to be affected by soil instability. Finally, it is shown that the proposed procedure can be used to better assess these areas and to produce more reliable landslide hazard maps. This work was supported by a grant of the Romanian National Authority for Scientific Research, Program for research - Space Technology and Advanced Research - STAR, project number 30/2012.
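A minimal sketch of the core operation the procedure relies on: first-derivative (slope) and second-derivative (curvature) maps computed from a digital terrain model. The DEM below is synthetic, the cell size is assumed, and the curvature is approximated by the Laplacian.

```python
# Sketch: slope and curvature maps from a DEM via finite differences.
import numpy as np

cell = 2.0                                                   # assumed cell size (m)
x, y = np.meshgrid(np.linspace(0, 100, 51), np.linspace(0, 100, 51))
dem = 50 * np.exp(-((x - 50) ** 2 + (y - 50) ** 2) / 800)    # synthetic hill

dzdy, dzdx = np.gradient(dem, cell)                          # first derivatives
slope_deg = np.degrees(np.arctan(np.hypot(dzdx, dzdy)))      # slope map
d2y = np.gradient(dzdy, cell, axis=0)
d2x = np.gradient(dzdx, cell, axis=1)
curvature = d2x + d2y                                        # Laplacian approximation
# Concave cells (curvature > 0 in this convention) downslope of steep,
# convex source areas are candidate propagation zones.
print("max slope:", round(float(slope_deg.max()), 1), "deg")
```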
NASA Astrophysics Data System (ADS)
Su, Pengcheng; Sun, Zhengchao; Li, Yong
2017-04-01
The Luding-Kangding highway crosses the eastern edge of the Qinghai-Tibet Plateau, one of the deepest canyon areas of the plateau and mountains of western Sichuan, with high mountains and steep slopes. The area lies at the intersection of the Xianshuihe, Longmenshan and Anninghe fault zones, which are among the best known in Sichuan Province. In this region, earthquakes are frequent and strong, neotectonic movement is active, rock masses are fractured, and abundant loose solid material is available. Debris flow disasters are well developed under the combined effects of earthquakes, strong rainfall and human activity, and pose a great threat to the life and property of the local population. This paper therefore takes Kangding and Luding as the study area for debris flow hazard assessment, based on an in-depth analysis of the development characteristics and formation mechanisms of debris flows. The results can provide important evidence for local disaster assessment and early warning, and have scientific and practical value for safeguarding people's lives and property and for the secure implementation of major national projects. In this article, the occurrence mechanism of debris flow disasters in the study area is explored, evaluation factors with high impact on debris flow hazard are identified, and a database of initial evaluation factors is built with the basin as the evaluation unit. Factors with high impact on hazard occurrence are selected by the stepwise regression method of the logistic regression model, while factors with low impact are eliminated; the hazard evaluation factor system for debris flows in the study area is thus determined. Each factor of the evaluation system is then quantified, and the weights of all evaluation factors are determined by stepwise regression analysis. Debris flow hazard assessment and regionalization for the whole study area are achieved after establishing the hazard assessment model. A regional debris flow hazard assessment method with strong generality and reliable evaluation results is presented. The whole study area is divided into 1674 units by automatic extraction and manual identification, and 11 factors are selected as the initial assessment factors for debris flow hazard assessment. The factors of the evaluation index system are quantified using the method of standardized watershed unit amount ratio. The relationship between debris flow occurrence and each evaluation factor is simulated using the logistic regression model, the weights of the evaluation factors are determined, and a debris flow hazard assessment model is established for the study area. The hazard assessment results were applied to route optimization and engineering disaster mitigation for the Sichuan-Tibet highway (Luding-Kangding section).
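An illustrative sketch of stepwise logistic selection over basin-unit factors, as described above, using statsmodels on synthetic data (the study used 11 candidate factors and 1674 units; here 5 factors, of which only two truly matter). The forward-by-p-value rule is one common stepwise variant, assumed here rather than taken from the paper.

```python
# Sketch: forward stepwise logistic regression; factors enter while
# their coefficients remain significant at p < 0.05.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 1674
X = rng.normal(size=(n, 5))                             # 5 candidate factors
logit_p = 1.2 * X[:, 0] - 0.8 * X[:, 2] - 0.5           # only factors 0 and 2 matter
y = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

selected = []
for _ in range(X.shape[1]):
    best, best_p = None, 0.05
    for j in set(range(X.shape[1])) - set(selected):
        cols = sm.add_constant(X[:, selected + [j]])
        p = sm.Logit(y, cols).fit(disp=0).pvalues[-1]   # p-value of the candidate
        if p < best_p:
            best, best_p = j, p
    if best is None:
        break
    selected.append(best)
print("selected factors:", selected)                    # expected: [0, 2]
```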
Sensitivity analysis of seismic hazard for the northwestern portion of the state of Gujarat, India
Petersen, M.D.; Rastogi, B.K.; Schweig, E.S.; Harmsen, S.C.; Gomberg, J.S.
2004-01-01
We test the sensitivity of seismic hazard to three fault source models for the northwestern portion of Gujarat, India. The models incorporate different characteristic earthquake magnitudes on three faults with individual recurrence intervals of either 800 or 1600 years. These recurrence intervals imply that large earthquakes occur on one of these faults every 266-533 years, similar to the rate of historic large earthquakes in this region during the past two centuries and for earthquakes in intraplate environments like the New Madrid region in the central United States. If one assumes a recurrence interval of 800 years for large earthquakes on each of three local faults, the peak ground accelerations (PGA; horizontal) and 1-Hz spectral acceleration ground motions (5% damping) are greater than 1 g over a broad region for a 2% probability of exceedance in 50 years' hazard level. These probabilistic PGAs at this hazard level are similar to median deterministic ground motions. The PGAs for 10% in 50 years' hazard level are considerably lower, generally ranging between 0.2 g and 0.7 g across northwestern Gujarat. Ground motions calculated from our models that consider fault interevent times of 800 years are considerably higher than other published models even though they imply similar recurrence intervals. These higher ground motions are mainly caused by the application of intraplate attenuation relations, which account for less severe attenuation of seismic waves when compared to the crustal interplate relations used in these previous studies. For sites in Bhuj and Ahmedabad, magnitude (M) 7 3/4 earthquakes contribute most to the PGA and the 0.2- and 1-s spectral acceleration ground motion maps at the two considered hazard levels. © 2004 Elsevier B.V. All rights reserved.
Geological hazard zonation in a marble exploitation area (Apuan Alps, Italy)
NASA Astrophysics Data System (ADS)
Francioni, M.; Salvini, R.; Riccucci, S.
2011-12-01
The present paper describes the hazard mapping of an exploitation area sited in the Apuan Alps marble district (Italy), carried out by the integration of various survey and analysis methodologies. The research, supported by the Massa and Carrara Local Sanitary Agency responsible for workplace health and safety activities, aimed to reduce the high rock fall hazard caused by the presence of potentially unstable blocks located on slopes overhanging the marble quarries. The study of rocky fronts is based on knowledge of both the structural setting and the physical-mechanical properties of the intact material and its discontinuities. In this work the main difficulty in obtaining this information was the inaccessibility of the slope overhanging the area (up to 500 meters high). For this reason, the structural and geological-engineering surveys were integrated with outcomes from digital photogrammetry carried out through terrestrial stereoscopic photos acquired from an aerostatic balloon and a helicopter. In this way, it was possible to derive the geometrical characteristics of joints (such as discontinuity dip, dip direction, spacing and persistence), block volumes and slope morphology even in inaccessible areas. This information, combined with data coming from the geological-engineering survey, was used to perform the stability analysis of the slope. Subsequently, using the topographic map at the scale of 1:2,000, the Digital Terrain Model (DTM) of the slopes and several topographic profiles along it were produced. Assuming that there is a good correspondence between travelling paths and the maximum down-slope angle, probable rock fall trajectories along the slope were calculated on the DTM by means of a GIS procedure which utilizes the ArcHydro module of Esri ArcMap software. When performing such 2D numerical modelling of rock falls, lateral dispersion of trajectories has often been hampered by the a priori choice of the travelling path. Such a choice is largely subjective and can lead to errors. Thus, rock fall hazard zonation needs spatially distributed analyses, including reliable modelling of lateral dispersion. In this research Conefall, a freeware Quanterra code that estimates the potential run-out areas by means of a so-called cone method, was used to compute the spatial distribution of rock fall frequency, velocities and kinetic energies. In this way, a modelling approach based on local morphologies was employed to assess the accuracy of the 2D analysis by profiles created a priori along the maximum down-slope angle. Final results on slope stability and run-out analysis made it possible to create a rock fall hazard map and to advise on the most suitable protection works to mitigate the hazard at the most risky sites.
Shao, Liyang; Zhang, Lianjun; Zhen, Zhen
2017-01-01
Children's blood lead concentrations have been closely monitored over the last two decades in the United States. The bio-monitoring surveillance data collected by local agencies reflect the local temporal trends of children's blood lead levels (BLLs). However, analysis and modeling of long-term time series of BLLs have rarely been reported. We attempted to quantify the long-term trends of children's BLLs in the city of Syracuse, New York, and evaluate the impacts of local lead poisoning prevention programs and the Lead Hazard Control Program on reducing children's BLLs. We applied interrupted time series analysis to the monthly time series of BLL surveillance data and used ARMA (autoregressive and moving average) models to measure the average blood lead level shift and detect changes in the seasonal pattern. Our results showed that there were three intervention stages over the past 20 years to reduce children's BLLs in the city of Syracuse, NY. The average of children's BLLs decreased significantly after the interventions, declining from 8.77 μg/dL to 3.94 μg/dL from 1992 to 2011. The seasonal variation diminished over the past decade, though more short-term influences appeared in the variation. The lead hazard control treatment intervention proved effective in reducing children's blood lead levels in Syracuse, NY. Also, the reduction of the seasonal variation of children's BLLs reflected the impacts of the local lead-based paint mitigation program. Window and door replacement was the major cost of residential lead abatement. However, soil lead was not considered a major source of lead hazard in our analysis. PMID:28182688
Shao, Liyang; Zhang, Lianjun; Zhen, Zhen
2017-01-01
Children's blood lead concentrations have been closely monitored over the last two decades in the United States. The bio-monitoring surveillance data collected by local agencies reflect the local temporal trends of children's blood lead levels (BLLs). However, analysis and modeling of long-term time series of BLLs have rarely been reported. We attempted to quantify the long-term trends of children's BLLs in the city of Syracuse, New York, and evaluate the impacts of local lead poisoning prevention programs and the Lead Hazard Control Program on reducing children's BLLs. We applied interrupted time series analysis to the monthly time series of BLL surveillance data and used ARMA (autoregressive and moving average) models to measure the average blood lead level shift and detect changes in the seasonal pattern. Our results showed that there were three intervention stages over the past 20 years to reduce children's BLLs in the city of Syracuse, NY. The average of children's BLLs decreased significantly after the interventions, declining from 8.77 μg/dL to 3.94 μg/dL from 1992 to 2011. The seasonal variation diminished over the past decade, though more short-term influences appeared in the variation. The lead hazard control treatment intervention proved effective in reducing children's blood lead levels in Syracuse, NY. Also, the reduction of the seasonal variation of children's BLLs reflected the impacts of the local lead-based paint mitigation program. Window and door replacement was the major cost of residential lead abatement. However, soil lead was not considered a major source of lead hazard in our analysis.
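A minimal sketch of an interrupted time series analysis with an ARMA error structure, as used above: a step dummy marks the intervention and its coefficient estimates the level shift in mean BLL. The monthly series below is synthetic, generated around the study's reported 8.77 to 3.94 μg/dL drop; the intervention month and ARMA order are assumptions.

```python
# Sketch: interrupted time series with an ARMA(1,1) error and a step dummy.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(8)
n, t0 = 240, 120                                   # 20 years of months; intervention at month 120
step = (np.arange(n) >= t0).astype(float)          # 0 before, 1 after the intervention
season = 1.5 * np.sin(2 * np.pi * np.arange(n) / 12)
y = 8.77 - 4.8 * step + season + rng.normal(0, 0.6, n)   # level drop ~ 8.77 -> ~3.97

res = ARIMA(y, exog=step, order=(1, 0, 1)).fit()
print(res.summary().tables[1])                     # 'x1' coefficient = estimated level shift
```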
Development of a methodology to assess man-made risks in Germany
NASA Astrophysics Data System (ADS)
Borst, D.; Jung, D.; Murshed, S. M.; Werner, U.
2006-09-01
Risk is a concept used to describe future potential outcomes of certain actions or events. Within the project "CEDIM - Risk Map Germany - Man-made Hazards" it is intended to develop methods for assessing and mapping the risk due to different human-induced hazards, a task that has not been successfully performed for Germany so far. Concepts of catastrophe modelling are employed, including the spatial modelling of hazard, the compilation of different kinds of exposed elements, the estimation of their vulnerability, and the direct loss potential in terms of human life and health. The paper is divided into two sections: first, an analytic framework for assessing the broad spectrum of human-induced risks is introduced. This approach is then applied to three important types of human-induced hazards that are representative of a whole class of hazards: accidents due to nuclear power plants (NPP) or air traffic, and terrorism. For the analysis of accidents, risk is measured with respect to being injured or dying when living in certain buffer zones around hazard locations. For NPP hazard, expert knowledge is used and supplemented with observations on aging effects, leading to a proprietary index value for the risk. Air traffic risk is modelled as an area-related phenomenon based on available accident statistics, leading to an expected value of risk. Terrorism risk is assessed by the attractiveness that certain elements (like embassies in the case of conventional threats) display in the eyes of potential aggressors. For non-conventional targets like football games, a detailed approach measuring their susceptibility to different kinds of attacks within predefined scenarios was developed; this also allows a ranking of attack modes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Augustoni, Arnold L.
A laser hazard analysis and safety assessment was performed for the LH-40 IR Laser Rangefinder based on the 2000 version of the American National Standards Institute's Standard Z136.1, for the Safe Use of Lasers, and Z136.6, for the Safe Use of Lasers Outdoors. The LH-40 IR Laser is central to the Long Range Reconnaissance and Observation System (LORROS). The LORROS is being evaluated by the Department 4149 Group to determine its capability as a long-range assessment tool. The manufacturer lists the laser rangefinder as 'eye safe' (Class 1 laser classified under the CDRH Compliance Guide for Laser Products and 21 CFR 1040 Laser Product Performance Standard). It was necessary that SNL validate this prior to its use involving the general public. A formal laser hazard analysis is presented for the typical mode of operation.
A Multihazard Regional Level Impact Assessment for South Asia
NASA Astrophysics Data System (ADS)
Amarnath, Giriraj; Alahacoon, Niranga; Aggarwal, Pramod; Smakhtin, Vladimir
2016-04-01
To prioritize climate adaptation strategies, there is a need for quantitative and systematic regional-level assessments that are comparable across multiple climatic hazard regimes. Assessing which countries in a region are most vulnerable to climate change requires analysis of multiple climatic hazards, including droughts, floods, extreme temperature, extreme rainfall and sea-level rise. These five climatic hazards, along with population densities, were modelled using GIS, which enabled a summary of associated human exposure and agricultural losses. A combined index based on hazard, exposure and adaptive capacity is introduced to identify areas of extreme risk. The analysis results in population climate hazard exposure, defined as the relative likelihood that a person in a given location was exposed to a given climate-hazard event in a given period of time. The study presents a detailed and coherent approach to fine-scale climate hazard mapping and identification of risk areas for the regions of South Asia that, for the first time, combines the following unique features: (a) methodological consistency across different climate-related hazards, (b) assessment of total exposure in terms of population and agricultural losses, (c) regional-level spatial coverage, and (d) development of customized tools using an ArcGIS toolbox that allow assessment of changes in exposure over time and easy replacement of existing datasets with newly released or superior datasets. The resulting maps enable comparison of the most vulnerable regions in South Asia to climate-related hazards, addressing one of the most urgent policy needs. Subnational areas (regions/districts/provinces) most vulnerable to climate change impacts in South Asia are documented. The approach involves overlaying climate hazard maps, sensitivity maps, and adaptive capacity maps following the vulnerability assessment framework of the United Nations' Intergovernmental Panel on Climate Change (IPCC). The study used data on the spatial distribution of various climate-related hazards in 1,398 subnational areas of Bangladesh, Bhutan, India, Nepal, Pakistan and Sri Lanka. An analysis of country-level population exposure showed that approximately 750 million people are affected by combined climate hazards. Of the affected population, 72% are in India, followed by 12% each in Bangladesh and Pakistan. Due in part to its economic importance, agriculture was found to be the sector most vulnerable and exposed to climate extremes. An analysis of individual hazards indicates that floods and droughts are the dominant hazards impacting agricultural areas, followed by extreme rainfall, extreme temperature and sea-level rise. Based on this vulnerability assessment, all the regions of Bangladesh; the Indian states of Andhra Pradesh, Bihar, Maharashtra, Karnataka and Orissa; Ampara, Puttalam, Trincomalee, Mannar and Batticaloa in Sri Lanka; Sind and Baluchistan in Pakistan; central and east Nepal; and the transboundary river basins of the Indus, Ganges and Brahmaputra are among the most vulnerable regions in South Asia.
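An illustrative sketch of the combined-index overlay: normalised hazard and exposure raise risk, adaptive capacity lowers it. The multiplicative form below is one common IPCC-style choice, assumed here rather than taken from the study, and all grids are synthetic.

```python
# Sketch: combined risk index from hazard, exposure and adaptive capacity grids.
import numpy as np

rng = np.random.default_rng(9)
hazard = rng.random((50, 50))            # combined multi-hazard score, normalised 0-1
exposure = rng.random((50, 50))          # population/agriculture exposure, 0-1
adaptive = rng.random((50, 50))          # adaptive capacity, 0-1

risk = hazard * exposure * (1.0 - adaptive)   # assumed multiplicative combination
hotspots = risk > np.quantile(risk, 0.95)     # flag top 5% of cells as extreme risk
print("extreme-risk cells:", int(hotspots.sum()))
```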
A Model for Generating Multi-hazard Scenarios
NASA Astrophysics Data System (ADS)
Lo Jacomo, A.; Han, D.; Champneys, A.
2017-12-01
Communities in mountain areas are often subject to risk from multiple hazards, such as earthquakes, landslides, and floods. Each hazard has its own rate of onset, duration, and return period, and multiple hazards tend to complicate the combined risk through their interactions. Prioritising interventions for minimising risk in this context is challenging. We developed a probabilistic multi-hazard model to help inform decision making in multi-hazard areas. The model is applied to a case study region in the Sichuan province in China, using information from satellite imagery and in-situ data. The model is not intended as a predictive model, but rather as a tool which takes stakeholder input and can be used to explore plausible hazard scenarios over time. By using a Monte Carlo framework and varying uncertain parameters for each of the hazards, the model can be used to explore the effect of different mitigation interventions aimed at reducing the disaster risk within an uncertain hazard context.
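A minimal sketch of the Monte Carlo scenario-generation idea follows: sample uncertain per-hazard parameters (annual occurrence rate, severity) many times and summarize the aggregate outcome. The hazard list, distributions and parameters are illustrative assumptions, not the authors' calibrated model.

```python
# Monte Carlo multi-hazard sketch: Poisson event counts per simulated
# year, lognormal severities, summed into an annual aggregate loss.
import numpy as np

rng = np.random.default_rng(42)
N = 10_000                      # number of simulated years
hazards = {                     # (annual rate, (lognormal mu, sigma))
    "earthquake": (0.05, (1.0, 0.8)),
    "landslide":  (0.20, (0.2, 0.6)),
    "flood":      (0.50, (0.1, 0.5)),
}

total_loss = np.zeros(N)
for rate, (mu, sigma) in hazards.values():
    counts = rng.poisson(rate, size=N)       # events per simulated year
    for i, k in enumerate(counts):
        if k:
            total_loss[i] += rng.lognormal(mu, sigma, size=k).sum()

print("mean annual loss:", total_loss.mean())
print("99th percentile :", np.percentile(total_loss, 99))
```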
Uncertainties in Earthquake Loss Analysis: A Case Study From Southern California
NASA Astrophysics Data System (ADS)
Mahdyiar, M.; Guin, J.
2005-12-01
Probabilistic earthquake hazard and loss analyses play important roles in many areas of risk management, including earthquake related public policy and insurance ratemaking. Rigorous loss estimation for portfolios of properties is difficult since there are various types of uncertainties in all aspects of modeling and analysis. It is the objective of this study to investigate the sensitivity of earthquake loss estimation to uncertainties in regional seismicity, earthquake source parameters, ground motions, and sites' spatial correlation on typical property portfolios in Southern California. Southern California is an attractive region for such a study because it has a large population concentration exposed to significant levels of seismic hazard. During the last decade, there have been several comprehensive studies of most regional faults and seismogenic sources. There have also been detailed studies on regional ground motion attenuations and regional and local site responses to ground motions. This information has been used by engineering seismologists to conduct regional seismic hazard and risk analysis on a routine basis. However, one of the more difficult tasks in such studies is the proper incorporation of uncertainties in the analysis. From the hazard side, there are uncertainties in the magnitudes, rates and mechanisms of the seismic sources and local site conditions and ground motion site amplifications. From the vulnerability side, there are considerable uncertainties in estimating the state of damage of buildings under different earthquake ground motions. From the analytical side, there are challenges in capturing the spatial correlation of ground motions and building damage, and integrating thousands of loss distribution curves with different degrees of correlation. In this paper we propose to address some of these issues by conducting loss analyses of a typical small portfolio in southern California, taking into consideration various source and ground motion uncertainties. The approach is designed to integrate loss distribution functions with different degrees of correlation for portfolio analysis. The analysis is based on the USGS 2002 regional seismicity model.
A methodology for modeling regional terrorism risk.
Chatterjee, Samrat; Abkowitz, Mark D
2011-07-01
Over the past decade, terrorism risk has become a prominent consideration in protecting the well-being of individuals and organizations. More recently, there has been interest in not only quantifying terrorism risk, but also placing it in the context of an all-hazards environment in which consideration is given to accidents and natural hazards, as well as intentional acts. This article discusses the development of a regional terrorism risk assessment model designed for this purpose. The approach taken is to model terrorism risk as a dependent variable, expressed in expected annual monetary terms, as a function of attributes of population concentration and critical infrastructure. This allows for an assessment of regional terrorism risk in and of itself, as well as in relation to man-made accident and natural hazard risks, so that mitigation resources can be allocated in an effective manner. The adopted methodology incorporates elements of two terrorism risk modeling approaches (event-based models and risk indicators), producing results that can be utilized at various jurisdictional levels. The validity, strengths, and limitations of the model are discussed in the context of a case study application within the United States. © 2011 Society for Risk Analysis.
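A minimal sketch of the expected-annual-loss framing this record describes, in which terrorism risk is a monetary dependent variable driven by population and infrastructure attributes; the functional form and all parameter values below are hypothetical, not the article's calibrated model.

```python
# Expected annual loss = (expected attacks) x (loss per attack).
# Rates and damage fractions are invented for illustration only.
def expected_annual_loss(population, infrastructure_value,
                         attack_rate_per_capita=1e-7,
                         damage_fraction=0.05):
    expected_attacks = attack_rate_per_capita * population
    loss_per_attack = damage_fraction * infrastructure_value
    return expected_attacks * loss_per_attack

print(expected_annual_loss(population=2_000_000,
                           infrastructure_value=5e10))
```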
Greenberg, Michael; Lioy, Paul; Ozbas, Birnur; Mantell, Nancy; Isukapalli, Sastry; Lahr, Michael; Altiok, Tayfur; Bober, Joseph; Lacy, Clifton; Lowrie, Karen; Mayer, Henry; Rovito, Jennifer
2013-11-01
We built three simulation models that can assist rail transit planners and operators to evaluate high and low probability rail-centered hazard events that could lead to serious consequences for rail-centered networks and their surrounding regions. Our key objective is to provide these models to users who, through planning with these models, can prevent events or more effectively react to them. The first of the three models is an industrial systems simulation tool that closely replicates rail passenger traffic flows between New York Penn Station and Trenton, New Jersey. Second, we built and used a line source plume model to trace chemical plumes released by a slow-moving freight train that could impact rail passengers, as well as people in surrounding areas. Third, we crafted an economic simulation model that estimates the regional economic consequences of a variety of rail-related hazard events through the year 2020. Each model can work independently of the others. However, used together they help provide a coherent story about what could happen and set the stage for planning that should make rail-centered transport systems more resistant and resilient to hazard events. We highlight the limitations and opportunities presented by using these models individually or in sequence. © 2013 Society for Risk Analysis.
Measuring Community Resilience to Coastal Hazards along the Northern Gulf of Mexico
Lam, Nina S. N.; Reams, Margaret; Li, Kenan; Li, Chi; Mata, Lillian P.
2016-01-01
The abundant research examining aspects of social-ecological resilience, vulnerability, and hazards and risk assessment has yielded insights into these concepts and suggested the importance of quantifying them. Quantifying resilience is complicated by several factors including the varying definitions of the term applied in the research, difficulties involved in selecting and aggregating indicators of resilience, and the lack of empirical validation for the indices derived. This paper applies a new model, called the resilience inference measurement (RIM) model, to quantify resilience to climate-related hazards for 52 U.S. counties along the northern Gulf of Mexico. The RIM model uses three elements (exposure, damage, and recovery indicators) to denote two relationships (vulnerability and adaptability), and employs both K-means clustering and discriminant analysis to derive the resilience rankings, thus enabling validation and inference. The results yielded a classification accuracy of 94.2% with 28 predictor variables. The approach is theoretically sound and can be applied to derive resilience indices for other study areas at different spatial and temporal scales. PMID:27499707
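A sketch of the RIM model's two statistical steps follows, using scikit-learn: K-means proposes resilience groups from exposure/damage/recovery indicators, and discriminant analysis validates the grouping from predictor variables. The data below are synthetic stand-ins; the study itself used 28 predictors for 52 counties.

```python
# K-means clustering followed by linear discriminant analysis, the
# RIM model's derive-then-validate pattern, on synthetic data.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
indicators = rng.normal(size=(52, 3))    # exposure, damage, recovery
predictors = rng.normal(size=(52, 28))   # socioeconomic predictors

labels = KMeans(n_clusters=4, n_init=10,
                random_state=0).fit_predict(indicators)

lda = LinearDiscriminantAnalysis().fit(predictors, labels)
accuracy = lda.score(predictors, labels)  # the paper reports 94.2% on real data
print(f"classification accuracy: {accuracy:.1%}")
```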
Vulnerabilities to Rock-Slope Failure Impacts from Christchurch, NZ Case History Analysis
NASA Astrophysics Data System (ADS)
Grant, A.; Wartman, J.; Massey, C. I.; Olsen, M. J.; Motley, M. R.; Hanson, D.; Henderson, J.
2015-12-01
Rock-slope failures during the 2010/11 Canterbury (Christchurch), New Zealand Earthquake Sequence resulted in 5 fatalities and caused an estimated US$400 million of damage to buildings and infrastructure. Reducing losses from rock-slope failures requires consideration of both hazard (i.e. likelihood of occurrence) and risk (i.e. likelihood of losses given an occurrence). Risk assessment thus requires information on the vulnerability of structures to rock or boulder impacts. Here we present 32 case histories of structures impacted by boulders triggered during the 2010/11 Canterbury earthquake sequence in the Port Hills region of Christchurch, New Zealand. The consequences of rock fall impacts, taken as penetration distance into the structure, are shown to follow a power-law distribution with impact energy. Detailed mapping of rock fall sources and paths from field mapping, aerial lidar digital elevation model (DEM) data, and high-resolution aerial imagery produced 32 well-constrained runout paths of boulders that impacted structures. Impact velocities used for structural analysis were developed using lumped-mass 2-D rock fall runout models with 1-m resolution lidar elevation data. Model inputs were based on calibrated surface parameters from 198 additional mapped boulder runout paths. Terrestrial lidar scans and structure-from-motion (SfM) imagery generated 3-D point cloud data used to measure structural damage and the impacting boulders. Combining velocity distributions from the 2-D analysis with high-precision boulder dimensions, kinetic energy distributions were calculated for all impacts. Calculated impact energy versus penetration distance for all cases suggests a power-law relationship between damage and impact energy. These case histories and the resulting fragility curve should serve as a foundation for future risk analysis of rock fall hazards by linking vulnerability data to the predicted energy distributions from the hazard analysis.
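A power law D = a * E^b between impact energy E and penetration distance D is linear in log space, so it can be fitted with an ordinary least-squares line. The data points below are invented purely to illustrate the fitting step, not the study's measurements.

```python
# Fit a power law by linear regression in log-log space.
import numpy as np

energy = np.array([5e2, 2e3, 8e3, 3e4, 1e5])       # impact energy (assumed)
penetration = np.array([0.3, 0.8, 1.6, 3.5, 7.0])  # penetration, m (assumed)

b, log_a = np.polyfit(np.log(energy), np.log(penetration), 1)
a = np.exp(log_a)
print(f"D = {a:.3g} * E^{b:.2f}")
```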
Survival analysis: Part I — analysis of time-to-event
2018-01-01
Length of time is a variable often encountered during data analysis. Survival analysis provides simple, intuitive results concerning time-to-event for events of interest, which are not confined to death. This review introduces methods of analyzing time-to-event. The Kaplan-Meier survival analysis, log-rank test, and Cox proportional hazards regression modeling method are described with examples of hypothetical data. PMID:29768911
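A minimal sketch of the three methods named in this review, using the Python lifelines package on invented data (the review itself likewise uses hypothetical data); the column names and values are assumptions.

```python
# Kaplan-Meier curve, log-rank test between two groups, and a Cox
# proportional hazards fit on a tiny synthetic dataset.
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter
from lifelines.statistics import logrank_test

df = pd.DataFrame({
    "time":  [5, 8, 12, 20, 22, 30, 33, 40],  # follow-up time
    "event": [1, 1, 0, 1, 0, 1, 1, 0],        # 1 = event, 0 = censored
    "group": [0, 0, 0, 0, 1, 1, 1, 1],        # treatment arm
})

km = KaplanMeierFitter().fit(df["time"], df["event"])  # survival curve
a, b = df[df.group == 0], df[df.group == 1]
lr = logrank_test(a["time"], b["time"], a["event"], b["event"])
cox = CoxPHFitter().fit(df, duration_col="time", event_col="event")
print(lr.p_value, cox.hazard_ratios_)
```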
A joint frailty-copula model between tumour progression and death for meta-analysis.
Emura, Takeshi; Nakatochi, Masahiro; Murotani, Kenta; Rondeau, Virginie
2017-12-01
Dependent censoring often arises in biomedical studies when time to tumour progression (e.g., relapse of cancer) is censored by an informative terminal event (e.g., death). For meta-analysis combining existing studies, a joint survival model between tumour progression and death has been considered under semicompeting risks, which induces dependence through the study-specific frailty. Our paper utilizes copulas to generalize the joint frailty model by introducing an additional source of dependence arising from intra-subject association between tumour progression and death. The practical value of the new model is particularly evident for meta-analyses in which only a few covariates are consistently measured across studies and hence residual dependence exists. The covariate effects are formulated through the Cox proportional hazards model, and the baseline hazards are nonparametrically modeled on the basis of splines. The estimator is then obtained by maximizing a penalized log-likelihood function. We also show that the present methodologies are easily modified for competing risks or recurrent event data, and are generalized to accommodate left-truncation. Simulations are performed to examine the performance of the proposed estimator. The method is applied to a meta-analysis assessing CXCL12, a recently suggested biomarker, for survival in ovarian cancer patients. We implement our proposed methods in the R joint.Cox package.
Soong, David T.; Murphy, Elizabeth A.; Straub, Timothy D.; Zeeb, Hannah L.
2016-11-22
Results of a flood-hazard analysis conducted by the U.S. Geological Survey, in cooperation with the Argonne National Laboratory, for four headwater streams within the Argonne National Laboratory property indicate that the 1-percent and 0.2-percent annual exceedance probability floods would cause multiple roads to be overtopped. Results indicate that most of the effects on the infrastructure would be from flooding of Freund Brook. Flooding on the Northeast and Southeast Drainage Ways would be limited to overtopping of one road crossing for each of those streams. The Northwest Drainage Way would be the least affected, with flooding expected to occur in open grass or forested areas. The Argonne Site Sustainability Plan outlined the development of hydrologic and hydraulic models and the creation of flood-plain maps of the existing site conditions as a first step in addressing resiliency to possible climate change impacts as required by Executive Order 13653 “Preparing the United States for the Impacts of Climate Change.” The Hydrological Simulation Program-FORTRAN is the hydrologic model used in the study, and the Hydrologic Engineering Center‒River Analysis System (HEC–RAS) is the hydraulic model. The model results were verified by comparing simulated water-surface elevations to observed water-surface elevations measured at a network of five crest-stage gages on the four study streams. The comparison between crest-stage gage and simulated elevations resulted in an average absolute difference of 0.06 feet and a maximum difference of 0.19 feet. In addition to the flood-hazard model development and mapping, a qualitative stream assessment was conducted to evaluate stream channel and substrate conditions in the study reaches. This information can be used to evaluate erosion potential.
Risk-Informed External Hazards Analysis for Seismic and Flooding Phenomena for a Generic PWR
DOE Office of Scientific and Technical Information (OSTI.GOV)
Parisi, Carlo; Prescott, Steve; Ma, Zhegang
This report describes the activities performed during FY2017 for the US-DOE Light Water Reactor Sustainability Risk-Informed Safety Margin Characterization (LWRS-RISMC) Industry Application #2. The scope of Industry Application #2 is to deliver a risk-informed external hazards safety analysis for a representative nuclear power plant. Following the advancements made during the previous FYs (toolkit identification, model development), FY2017 focused on increasing the level of realism of the analysis and improving the tools and the coupling methodologies. In particular the following objectives were achieved: calculation of building pounding and its effects on component seismic fragility; development of SAPHIRE code PRA models for a 3-loop Westinghouse PWR; set-up of a methodology for performing static-dynamic PRA coupling between the SAPHIRE and EMRALD codes; coupling of RELAP5-3D/RAVEN for performing Best-Estimate Plus Uncertainty analysis and automatic limit surface search; and execution of sample calculations demonstrating the capabilities of the toolkit in performing risk-informed external hazards safety analyses.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-20
... Manufacturing Practice and Hazard Analysis and Risk- Based Preventive Controls for Human Food; Extension of... Hazard Analysis and Risk- Based Preventive Controls for Human Food'' and its information collection... Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food.'' IV. How To...
Probabilistic seismic hazard analysis (PSHA) for Ethiopia and the neighboring region
NASA Astrophysics Data System (ADS)
Ayele, Atalay
2017-10-01
Seismic hazard calculation is carried out for the Horn of Africa region (0°-20°N and 30°-50°E) based on the probabilistic seismic hazard analysis (PSHA) method. Earthquake catalogue data obtained from different sources were compiled, homogenized to the Mw magnitude scale and declustered to remove dependent events, as required by the Poisson earthquake source model. The seismotectonic map of the study area, available from recent studies, is used for area source zonation. For assessing the seismic hazard, the study area was divided into small grid cells of size 0.5° × 0.5°, and the hazard parameters were calculated at the center of each grid cell by considering contributions from all seismic sources. Peak Ground Acceleration (PGA) values corresponding to 10% and 2% probability of exceedance in 50 years were calculated for all grid points assuming a generic rock site with Vs = 760 m/s. The obtained values vary from 0.0 to 0.18 g and from 0.0 to 0.35 g for return periods of 475 and 2475 years, respectively. The corresponding contour maps showing the spatial variation of PGA values for the two return periods are presented here. Uniform hazard response spectra (UHRS) for 10% and 2% probability of exceedance in 50 years, and hazard curves for PGA and 0.2 s spectral acceleration (Sa), all at a rock site, are developed for the city of Addis Ababa. The hazard map of this study corresponding to the 475-year return period has already been used to update and produce the 3rd generation building code of Ethiopia.
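The return periods quoted above follow directly from the Poisson occurrence model used for the declustered catalogue: P(exceedance in T years) = 1 - exp(-T/RP). A quick numeric check that 10% and 2% in 50 years correspond to roughly 475- and 2475-year return periods:

```python
# Convert exceedance probability over T years into a return period
# under the Poisson assumption.
import math

def return_period(p_exceed, t_years):
    return -t_years / math.log(1.0 - p_exceed)

print(return_period(0.10, 50))   # ~474.6 years
print(return_period(0.02, 50))   # ~2474.9 years
```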
Landslide hazard mapping with selected dominant factors: A study case of Penang Island, Malaysia
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tay, Lea Tien; Alkhasawneh, Mutasem Sh.; Ngah, Umi Kalthum
Landslides are among the most destructive natural geohazards in Malaysia. In addition to rainfall as a triggering factor for landslides in Malaysia, topographical and geological factors play an important role in landslide susceptibility analysis. Conventional topographic factors such as elevation, slope angle, slope aspect, plan curvature and profile curvature have been considered as landslide causative factors in many research works. However, other topographic factors such as diagonal length, surface area, surface roughness and rugosity have not been considered, especially in landslide hazard analysis in Malaysia. This paper presents landslide hazard mapping using the Frequency Ratio (FR) method, with Penang Island of Malaysia as the study area. The frequency ratio approach is a variant of probabilistic methods that is based on the observed relationships between the distribution of landslides and each landslide-causative factor. The landslide hazard map of Penang Island is produced by considering twenty-two (22) landslide causative factors. Among these, fourteen (14) are topographic factors: elevation, slope gradient, slope aspect, plan curvature, profile curvature, general curvature, tangential curvature, longitudinal curvature, cross section curvature, total curvature, diagonal length, surface area, surface roughness and rugosity. These topographic factors are extracted from the digital elevation model of Penang Island. The other eight (8) non-topographic factors considered are land cover, vegetation cover, distance from road, distance from stream, distance from fault line, geology, soil texture and rainfall precipitation. After considering all twenty-two factors for landslide hazard mapping, the analysis is repeated with fourteen dominant factors selected from the twenty-two. The landslide hazard map was segregated into four risk categories: highly hazardous area, hazardous area, moderately hazardous area and not hazardous area. The map was assessed using the receiver operating characteristic (ROC) based on the area under the curve (AUC) method. The result indicates an increase of accuracy from 77.76% (with all 22 factors) to 79.00% (with 14 dominant factors) in the prediction of landslide occurrence.
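The frequency ratio for one class of a causative factor is the share of landslide pixels falling in that class divided by the share of all pixels in that class; values above 1 indicate a positive association with landsliding. A minimal sketch with invented pixel counts:

```python
# Frequency ratio for a single factor class.
def frequency_ratio(landslide_px_in_class, landslide_px_total,
                    class_px, all_px):
    return ((landslide_px_in_class / landslide_px_total)
            / (class_px / all_px))

# e.g., a hypothetical slope class of 25-35 degrees
print(frequency_ratio(landslide_px_in_class=420, landslide_px_total=1500,
                      class_px=90_000, all_px=1_200_000))   # ~3.7
```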
NASA Technical Reports Server (NTRS)
2012-01-01
One of the characteristics of an effective safety program is the recognition and control of hazards before mishaps or failures occur. Conducting potentially hazardous tests necessitates a thorough hazard analysis in order to protect our personnel from injury and our equipment from damage. The purpose of this hazard analysis is to define and address the potential hazards and controls associated with the Z1 Suit Port Test in Chamber B located in building 32, and to provide the applicable team of personnel with the documented results. It is imperative that each member of the team be familiar with the hazards and controls associated with his/her particular tasks, assignments, and activities while interfacing with facility test systems, equipment, and hardware. The goal of this hazard analysis is to identify all hazards that have the potential to harm personnel and/or damage facility equipment, flight hardware, property, or harm the environment. This analysis may also assess the significance and risk, when applicable, of lost test objectives when substantial monetary value is involved. The hazards, causes, controls, verifications, and risk assessment codes have been documented on the hazard analysis work sheets in appendix A of this document. The preparation and development of this report is in accordance with JPR 1700.1, JSC Safety and Health Handbook.
National Workshop on Human Resource Innovations in Shipbuilding/Ship Repairs
1991-03-01
[Fragmentary proceedings text] The recoverable content describes a way in which all potential hazards of a worksite can be identified and prevented or controlled, so that worksites can become models for their industry. The safety program elements listed are: management commitment and employee participation; worksite analysis; hazard prevention and control; and safety and health training.
Empirical Data Fusion for Convective Weather Hazard Nowcasting
NASA Astrophysics Data System (ADS)
Williams, J.; Ahijevych, D.; Steiner, M.; Dettling, S.
2009-09-01
This paper describes a statistical analysis approach to developing an automated convective weather hazard nowcast system suitable for use by aviation users in strategic route planning and air traffic management. The analysis makes use of numerical weather prediction model fields and radar, satellite, and lightning observations and derived features along with observed thunderstorm evolution data, which are aligned using radar-derived motion vectors. Using a dataset collected during the summers of 2007 and 2008 over the eastern U.S., the predictive contributions of the various potential predictor fields are analyzed for various spatial scales, lead-times and scenarios using a technique called random forests (RFs). A minimal, skillful set of predictors is selected for each scenario requiring distinct forecast logic, and RFs are used to construct an empirical probabilistic model for each. The resulting data fusion system, which ran in real-time at the National Center for Atmospheric Research during the summer of 2009, produces probabilistic and deterministic nowcasts of the convective weather hazard and assessments of the prediction uncertainty. The nowcasts' performance and results for several case studies are presented to demonstrate the value of this approach. This research has been funded by the U.S. Federal Aviation Administration to support the development of the Consolidated Storm Prediction for Aviation (CoSPA) system, which is intended to provide convective hazard nowcasts and forecasts for the U.S. Next Generation Air Transportation System (NextGen).
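A sketch of the random-forest data fusion step this record describes: fused predictor fields at each grid point become feature vectors, and the forest's class-probability output serves as the probabilistic hazard nowcast. The features, labels and parameters below are synthetic assumptions, not the CoSPA configuration.

```python
# Random-forest probabilistic nowcast on synthetic fused predictors.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
X = rng.normal(size=(5000, 8))   # 8 fused predictor fields per grid point
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=5000)) > 1.0

rf = RandomForestClassifier(n_estimators=200, random_state=1).fit(X, y)
prob_hazard = rf.predict_proba(X[:5])[:, 1]   # probability of convection
print(prob_hazard)
```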
Shallow landslide hazard map of Seattle, Washington
Harp, Edwin L.; Michael, John A.; Laprade, William T.
2008-01-01
Landslides, particularly debris flows, have long been a significant cause of damage and destruction to people and property in the Puget Sound region. Following the years of 1996 and 1997, the Federal Emergency Management Agency designated Seattle as a “Project Impact” city with the goal of encouraging the city to become more disaster resistant to landslides and other natural hazards. A major recommendation of the Project Impact council was that the city and the U.S. Geological Survey collaborate to produce a landslide hazard map. An exceptional data set archived by the city containing more than 100 yr of landslide data from severe storm events allowed comparison of actual landslide locations with those predicted by slope-stability modeling. We used an infinite-slope analysis, which models slope segments as rigid friction blocks, to estimate the susceptibility of slopes to debris flows, which are water-laden slurries that can form from shallow failures of soil and weathered bedrock and can travel at high velocities down steep slopes. Data used for the analysis consisted of a digital slope map derived from recent light detection and ranging (LiDAR) imagery of Seattle, recent digital geologic mapping of the city, and shear-strength test data for the geologic units found in the surrounding area. The combination of these data layers within a geographic information system (GIS) platform allowed us to create a shallow landslide hazard map for Seattle.
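The infinite-slope analysis named above treats each slope segment as a rigid friction block on a planar failure surface. A minimal sketch of the standard factor-of-safety form follows; the parameter values are illustrative, not the study's calibrated shear-strength data.

```python
# Infinite-slope factor of safety:
# FS = [c' + (gamma - m*gamma_w) * z * cos^2(beta) * tan(phi')]
#      / [gamma * z * sin(beta) * cos(beta)]
import math

def infinite_slope_fs(c, phi_deg, slope_deg, z,
                      gamma=18.0, gamma_w=9.81, m=0.5):
    """c: cohesion (kPa); phi: friction angle; z: failure depth (m);
    gamma: soil unit weight (kN/m^3); m: saturated fraction of z."""
    beta = math.radians(slope_deg)
    phi = math.radians(phi_deg)
    resisting = (c + (gamma - m * gamma_w) * z
                 * math.cos(beta) ** 2 * math.tan(phi))
    driving = gamma * z * math.sin(beta) * math.cos(beta)
    return resisting / driving

print(infinite_slope_fs(c=4.0, phi_deg=33.0, slope_deg=40.0, z=1.2))  # <1: unstable
```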
Physically-based failure analysis of shallow layered soil deposits over large areas
NASA Astrophysics Data System (ADS)
Cuomo, Sabatino; Castorino, Giuseppe Claudio; Iervolino, Aniello
2014-05-01
In the last decades, the analysis of slope stability conditions over large areas has become popular among scientists and practitioners (Cascini et al., 2011; Cuomo and Della Sala, 2013). This is due to the availability of new computational tools (Baum et al., 2002; Godt et al., 2008; Baum and Godt, 2012; Salciarini et al., 2012) - implemented in GIS (Geographic Information System) platforms - which allow taking into account the major hydraulic and mechanical issues related to slope failure, even for unsaturated soils, as well as the spatial variability of both topography and soil properties. However, the effectiveness of the above methods for landslide forecasting is still controversial (Sorbino et al., 2010), depending especially on the accuracy of the DTM (Digital Terrain Model) and on the chance that distinct triggering mechanisms may occur over a large area. Among the major uncertainties, the layering of soil deposits is of primary importance due to conductivity contrasts between soil layers and differences in shear strength. This work deals with the hazard analysis of shallow landslides over large areas, considering two distinct schematizations of soil stratigraphy, i.e. homogeneous or layered. To this purpose, the physically-based model TRIGRS (Baum et al., 2002) is first used, then extended to the case of a layered deposit: specifically, a unique set of hydraulic properties is assumed while distinct soil unit weight and shear strength are considered for each soil layer. Both models are applied to a significant study area of Southern Italy, about 4 km2 in extent, where shallow deposits of air-fall volcanic (pyroclastic) soils have been affected by several landslides, causing casualties, damage and economic losses. The achieved results highlight that the soil volume mobilized over the study area depends strongly on the local stratigraphy of the shallow deposits. This relates to the depth of the critical slip surface, which rarely corresponds to the bedrock contact where cohesionless coarse materials lie on deeper soil layers with small effective cohesion. It is also shown that, due to a more realistic assessment of soil stratigraphy, the success of the model may increase when performing a back-analysis of a recent real event. References: Baum, R.L., Savage, W.Z., Godt, J.W. (2002). TRIGRS - A Fortran program for transient rainfall infiltration and grid-based regional slope-stability analysis. U.S. Geological Survey, Open-file report 02-424, 35 p. Baum, R.L., Godt, J.W. (2012). Assessment of shallow landslide potential using 1-D and 3-D slope stability analysis. In: Landslides and Engineered Slopes: Protecting Society through Improved Understanding, Eberhardt et al. (eds), Taylor & Francis Group, London, ISBN 978-0-415-62123-6, 1667-1672. Cascini, L., Cuomo, S., Della Sala, M. (2011). Spatial and temporal occurrence of rainfall-induced shallow landslides of flow type: A case of Sarno-Quindici, Italy. Geomorphology, 126(1-2), 148-158. Cuomo, S., Della Sala, M. (2013). Spatially distributed analysis of shallow landslides and soil erosion induced by rainfall. (submitted to Natural Hazards). Godt, J.W., Baum, R.L., Savage, W.Z., Salciarini, D., Schulz, W.H., Harp, E.L. (2008). Transient deterministic shallow landslide modeling: requirements for susceptibility and hazard assessments in a GIS framework. Engineering Geology 102, 214-226. Salciarini, D., Tamagnini, C., Conversini, P., Rapinesi, S. (2012). Spatially distributed rainfall thresholds for the initiation of shallow landslides. Natural Hazards 61, 229-245.
Sorbino G., Sica C., Cascini L. (2010). Susceptibility analysis of shallow landslides source areas using physically based models. Natural Hazards, 53(2), 313-332.
Atmospheric reentry of the in-core thermionic SP-100 reactor system
NASA Technical Reports Server (NTRS)
Stamatelatos, M. G.; Barsell, A. W.; Harris, P. A.; Francisco, J.
1987-01-01
Presumed end-of-life atmospheric reentry of the GA SP-100 system was studied to assess dispersal feasibility and associated hazards. Reentry was studied by sequential use of an orbital trajectory and a heat analysis computer program. Two heating models were used. The first model assumed a thermal equilibrium condition between the stagnation point aerodynamic heating and the radiative cooling of the skin material surface. The second model allowed for infinite conductivity of the skin material. Four reentering configurations were studied, representing stages of increased SP-100 breakup: (1) radiator, shield and reactor, (2) shield and reactor, (3) reactor with control drums, and (4) reactor without control drums. Each reentering configuration was started from a circular orbit at 116 km having an inertial velocity near Mach 25. The assumed failure criterion was the attainment of the melting temperature of a critical system component. The reentry analysis revealed breakup of the vessel in the neighborhood of 61 km altitude and scattering of the fuel elements. Subsequent breakup of the fuel elements was not predicted. Oxidation of the niobium skin material was calculated to cause an increase in surface temperature of less than ten percent. The concept of thermite analogs for enhancing reactor reentry dispersal was assessed and found to be feasible in principle. A conservative worst-case hazards analysis was performed for radioactive and nonradioactive toxic SP-100 materials assumed to be dispersed during end-of-life reentry. The hazards associated with this phase of the SP-100 mission were calculated to be insignificant.
NASA Technical Reports Server (NTRS)
Huang, Zhao-Feng; Fint, Jeffry A.; Kuck, Frederick M.
2005-01-01
This paper addresses the in-flight reliability of a liquid propulsion engine system for a launch vehicle. We first establish a comprehensive list of system and sub-system reliability drivers for any liquid propulsion engine system. We then build a reliability model to parametrically analyze the impact of some reliability parameters, and present sensitivity analysis results for a selected subset of the key reliability drivers using the model. Reliability drivers identified include: number of engines for the liquid propulsion stage, single engine total reliability, engine operation duration, engine thrust size, reusability, engine de-rating or up-rating, engine-out design (including engine-out switching reliability, catastrophic fraction, preventable failure fraction, unnecessary shutdown fraction), propellant specific hazards, engine start and cutoff transient hazards, engine combustion cycles, vehicle and engine interface and interaction hazards, engine health management system, engine modification, engine ground start hold down with launch commit criteria, engine altitude start (1 in. start), multiple altitude restart (less than 1 restart), component, subsystem and system design, manufacturing/ground operation support/pre- and post-flight checkouts and inspection, and extensiveness of the development program. We present sensitivity analysis results for the following subset of the drivers: number of engines for the propulsion stage, single engine total reliability, engine operation duration, engine de-rating or up-rating requirements, engine-out design, catastrophic fraction, preventable failure fraction, unnecessary shutdown fraction, and engine health management system implementation (basic redlines and more advanced health management systems).
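A minimal sketch of how two of the listed drivers interact: stage reliability with N engines and an engine-out allowance, splitting engine failures into benign (tolerable up to the allowance) and catastrophic. All numerical values are hypothetical, not the paper's parameters.

```python
# Stage success = sum over k tolerated benign failures of
# C(n, k) * p_benign^k * r^(n-k); catastrophic failures are excluded
# implicitly because each engine must either succeed or fail benignly.
from math import comb

def stage_reliability(n_engines, r_engine, engines_out_allowed,
                      catastrophic_fraction):
    p_fail = 1.0 - r_engine
    p_benign = p_fail * (1.0 - catastrophic_fraction)
    total = 0.0
    for k in range(engines_out_allowed + 1):
        total += comb(n_engines, k) * p_benign**k * r_engine**(n_engines - k)
    return total

# 4 engines, one-engine-out design, 20% of failures catastrophic (assumed)
print(stage_reliability(4, 0.995, 1, 0.2))
```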
Gagnon, B; Abrahamowicz, M; Xiao, Y; Beauchamp, M-E; MacDonald, N; Kasymjanova, G; Kreisman, H; Small, D
2010-03-30
C-reactive protein (CRP) is gaining credibility as a prognostic factor in different cancers. Cox's proportional hazards (PH) model is usually used to assess prognostic factors. However, this model imposes a priori assumptions, which are rarely tested, that (1) the hazard ratio associated with each prognostic factor remains constant across the follow-up (PH assumption) and (2) the relationship between a continuous predictor and the logarithm of the mortality hazard is linear (linearity assumption). We tested these two assumptions of Cox's PH model for CRP, using a flexible statistical model, while adjusting for other known prognostic factors, in a cohort of 269 patients newly diagnosed with non-small cell lung cancer (NSCLC). In Cox's PH model, high CRP increased the risk of death (HR=1.11 per doubling of CRP value, 95% CI: 1.03-1.20, P=0.008). However, both the PH assumption (P=0.033) and the linearity assumption (P=0.015) were rejected for CRP, measured at the initiation of chemotherapy, which kept its prognostic value for approximately 18 months. Our analysis shows that flexible modeling provides new insights regarding the value of CRP as a prognostic factor in NSCLC and that Cox's PH model underestimates early risks associated with high CRP.
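A sketch of testing the two Cox assumptions questioned in this study, using the Python lifelines package on synthetic data; the study itself used a flexible extension of Cox's model, not this exact code, and the column names and values are invented.

```python
# Fit Cox's model, report the HR per CRP doubling, then run the
# Schoenfeld-residual-based proportional hazards check.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 300
df = pd.DataFrame({
    "log2_crp": rng.normal(3.0, 1.0, n),   # CRP on a doubling scale
    "time": rng.exponential(18.0, n),      # months of follow-up
    "event": rng.integers(0, 2, n),
})

cph = CoxPHFitter().fit(df, duration_col="time", event_col="event")
print(cph.summary[["exp(coef)", "p"]])             # HR per CRP doubling
cph.check_assumptions(df, p_value_threshold=0.05)  # PH assumption test
```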
NASA Technical Reports Server (NTRS)
Daniledes, J.; Koch, J. R.
1980-01-01
The risk associated with the accidental release of carbon/graphite fibers (CF) from fires on commercial transport aircraft incorporating composite materials was assessed. Data are developed to evaluate the potential for CF damage to electrical and electronic equipment, assess the cost risk, and evaluate the hazard to continued operation. The subjects covered include identification of susceptible equipment, determination of infiltration transfer functions, analysis of airport operations, calculation of probabilities of equipment failures, assessment of the cost risk, and evaluation of the hazard to continued operation. The results show the risks associated with CF contamination are negligible through 1993.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-26
... Manufacturing Practice and Hazard Analysis and Risk- Based Preventive Controls for Human Food; Extension of... Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food'' that appeared in... Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food'' with a 120-day...
Probabilistic earthquake hazard analysis for Cairo, Egypt
NASA Astrophysics Data System (ADS)
Badawy, Ahmed; Korrat, Ibrahim; El-Hadidy, Mahmoud; Gaber, Hanan
2016-04-01
Cairo is the capital of Egypt, the largest city in the Arab world and Africa, and the sixteenth largest metropolitan area in the world. It was founded in the tenth century (969 AD) and is 1046 years old. It has long been a center of the region's political and cultural life, so earthquake risk assessment for Cairo is of great importance. The present work aims to analyze the earthquake hazard of Cairo as a key input element for the risk assessment. The regional seismotectonic setting shows that Cairo could be affected by both far- and near-field seismic sources. The seismic hazard of Cairo has been estimated using the probabilistic seismic hazard approach. A logic tree framework was used during the calculations, and epistemic uncertainties were taken into account by using alternative seismotectonic models and alternative ground motion prediction equations. Seismic hazard values have been estimated within a grid of 0.1° × 0.1° spacing for all of Cairo's districts at different spectral periods and four return periods (224, 615, 1230, and 4745 years). Moreover, uniform hazard spectra have been calculated at the same return periods. The pattern of the contour maps shows that the highest values of peak ground acceleration are concentrated in the districts of the eastern zone (e.g., El Nozha) and the lowest values in the districts of the northern and western zones (e.g., El Sharabiya and El Khalifa).
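A sketch of the logic-tree combination used for the epistemic uncertainty: hazard is computed on each branch (seismotectonic model paired with a ground motion prediction equation) and the branch results are combined with weights summing to 1. The weights and PGA values below are invented for illustration.

```python
# Weighted logic-tree mean for one grid cell and one return period.
branches = [
    # (weight, PGA in g, hypothetical branch)
    (0.3, 0.21),   # source model A + GMPE 1
    (0.2, 0.25),   # source model A + GMPE 2
    (0.3, 0.17),   # source model B + GMPE 1
    (0.2, 0.19),   # source model B + GMPE 2
]
assert abs(sum(w for w, _ in branches) - 1.0) < 1e-9
weighted_pga = sum(w * pga for w, pga in branches)
print(f"logic-tree mean PGA: {weighted_pga:.3f} g")
```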
NASA Astrophysics Data System (ADS)
Odbert, Henry; Aspinall, Willy
2014-05-01
Evidence-based hazard assessment at volcanoes assimilates knowledge about the physical processes of hazardous phenomena and observations that indicate the current state of a volcano. Incorporating both these lines of evidence can inform our belief about the likelihood (probability) and consequences (impact) of possible hazardous scenarios, forming a basis for formal quantitative hazard assessment. However, such evidence is often uncertain, indirect or incomplete. Approaches to volcano monitoring have advanced substantially in recent decades, increasing the variety and resolution of multi-parameter timeseries data recorded at volcanoes. Interpreting these multiple strands of parallel, partial evidence thus becomes increasingly complex. In practice, interpreting many timeseries requires an individual to be familiar with the idiosyncrasies of the volcano, monitoring techniques, configuration of recording instruments, observations from other datasets, and so on. In making such interpretations, an individual must consider how different volcanic processes may manifest as measurable observations, and then infer from the available data what can or cannot be deduced about those processes. We examine how parts of this process may be synthesised algorithmically using Bayesian inference. Bayesian Belief Networks (BBNs) use probability theory to treat and evaluate uncertainties in a rational and auditable scientific manner, but only to the extent warranted by the strength of the available evidence. The concept is a suitable framework for marshalling multiple strands of evidence (e.g. observations, model results and interpretations) and their associated uncertainties in a methodical manner. BBNs are usually implemented in graphical form and could be developed as a tool for near real-time, ongoing use in a volcano observatory, for example. We explore the application of BBNs in analysing volcanic data from the long-lived eruption at Soufriere Hills Volcano, Montserrat. We discuss the uncertainty of inferences, and how our method provides a route to formal propagation of uncertainties in hazard models. Such approaches provide an attractive route to developing an interface between volcano monitoring analyses and probabilistic hazard scenario analysis. We discuss the use of BBNs in hazard analysis as a tractable and traceable tool for fast, rational assimilation of complex, multi-parameter data sets in the context of timely volcanic crisis decision support.
NASA Astrophysics Data System (ADS)
Odbert, Henry; Hincks, Thea; Aspinall, Willy
2015-04-01
Volcanic hazard assessments must combine information about the physical processes of hazardous phenomena with observations that indicate the current state of a volcano. Incorporating both these lines of evidence can inform our belief about the likelihood (probability) and consequences (impact) of possible hazardous scenarios, forming a basis for formal quantitative hazard assessment. However, such evidence is often uncertain, indirect or incomplete. Approaches to volcano monitoring have advanced substantially in recent decades, increasing the variety and resolution of multi-parameter timeseries data recorded at volcanoes. Interpreting these multiple strands of parallel, partial evidence thus becomes increasingly complex. In practice, interpreting many timeseries requires an individual to be familiar with the idiosyncrasies of the volcano, monitoring techniques, configuration of recording instruments, observations from other datasets, and so on. In making such interpretations, an individual must consider how different volcanic processes may manifest as measurable observations, and then infer from the available data what can or cannot be deduced about those processes. We examine how parts of this process may be synthesised algorithmically using Bayesian inference. Bayesian Belief Networks (BBNs) use probability theory to treat and evaluate uncertainties in a rational and auditable scientific manner, but only to the extent warranted by the strength of the available evidence. The concept is a suitable framework for marshalling multiple strands of evidence (e.g. observations, model results and interpretations) and their associated uncertainties in a methodical manner. BBNs are usually implemented in graphical form and could be developed as a tool for near real-time, ongoing use in a volcano observatory, for example. We explore the application of BBNs in analysing volcanic data from the long-lived eruption at Soufriere Hills Volcano, Montserrat. We show how our method provides a route to formal propagation of uncertainties in hazard models. Such approaches provide an attractive route to developing an interface between volcano monitoring analyses and probabilistic hazard scenario analysis. We discuss the use of BBNs in hazard analysis as a tractable and traceable tool for fast, rational assimilation of complex, multi-parameter data sets in the context of timely volcanic crisis decision support.
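A single-node illustration of the Bayesian updating that a BBN chains together at scale: revising the probability of one hidden state after one observation. The node names and probabilities below are invented for illustration, not values from the Montserrat analysis.

```python
# Bayes' rule for one node: P(state | obs) =
# P(obs | state) * P(state) / P(obs).
prior = 0.10                  # P(magma ascending) before the observation
p_obs_given_ascent = 0.80     # P(elevated seismicity | ascent)
p_obs_given_quiet = 0.15      # P(elevated seismicity | no ascent)

evidence = prior * p_obs_given_ascent + (1 - prior) * p_obs_given_quiet
posterior = prior * p_obs_given_ascent / evidence
print(f"P(ascent | elevated seismicity) = {posterior:.2f}")   # ~0.37
```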
An assessment of gas emanation hazard using a geographic information system and geostatistics.
Astorri, F; Beaubien, S E; Ciotoli, G; Lombardi, S
2002-03-01
This paper describes the use of geostatistical analysis and GIS techniques to assess gas emanation hazards. The Mt. Vulsini volcanic district was selected for this study because of the wide range of natural phenomena locally present that affect gas migration in the near surface. In addition, soil gas samples collected in this area should allow for a calibration between the generated risk/hazard models and the measured distribution of toxic gas species at the surface. The approach used during this study consisted of three general stages. First, data were digitally organized into thematic layers; next, software functions in the GIS program "ArcView" were used to compare and correlate these layers; finally, the resulting "potential-risk" map was compared with radon soil gas data in order to validate the model and/or to select zones for further, more detailed soil gas investigations.
Complete hazard ranking to analyze right-censored data: An ALS survival study.
Huang, Zhengnan; Zhang, Hongjiu; Boss, Jonathan; Goutman, Stephen A; Mukherjee, Bhramar; Dinov, Ivo D; Guan, Yuanfang
2017-12-01
Survival analysis represents an important outcome measure in clinical research and clinical trials; further, survival ranking may offer additional advantages in clinical trials. In this study, we developed GuanRank, a non-parametric ranking-based technique to transform patients' survival data into a linear space of hazard ranks. The transformation enables the utilization of machine learning base-learners, including Gaussian process regression, Lasso, and random forest, on survival data. The method was submitted to the DREAM Amyotrophic Lateral Sclerosis (ALS) Stratification Challenge. Ranked in first place, the model gave more accurate ranking predictions on the PRO-ACT ALS dataset in comparison to the Cox proportional hazards model. By utilizing right-censored data in its training process, the method demonstrated its state-of-the-art predictive power in ALS survival ranking. Its feature selection identified multiple important factors, some of which conflict with previous studies.
Regression dilution in the proportional hazards model.
Hughes, M D
1993-12-01
The problem of regression dilution arising from covariate measurement error is investigated for survival data using the proportional hazards model. The naive approach to parameter estimation is considered, whereby observed covariate values are used, inappropriately, in the usual analysis instead of the underlying covariate values. A relationship between the estimated parameter in large samples and the true parameter is obtained, showing that the bias does not depend on the form of the baseline hazard function when the errors are normally distributed. With high censorship, adjustment of the naive estimate by the factor 1 + lambda, where lambda is the ratio of within-person variability about an underlying mean level to the variability of these levels in the population sampled, removes the bias. As censorship decreases, the required adjustment increases; when there is no censorship it is markedly higher than 1 + lambda and depends also on the true risk relationship.
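A numeric illustration of the high-censorship correction described above: inflate the naive log hazard ratio by 1 + lambda, where lambda is the within-person to between-person variance ratio. All values are invented for illustration.

```python
# Regression-dilution adjustment in the high-censorship case.
import math

beta_naive = 0.25        # naive log HR from the observed covariate
sigma2_within = 0.30     # within-person variability about the mean level
sigma2_between = 0.60    # variability of underlying levels in the population

lam = sigma2_within / sigma2_between
beta_adjusted = beta_naive * (1 + lam)
print(f"lambda = {lam:.2f}, adjusted HR = {math.exp(beta_adjusted):.3f}")
```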
A fault tree model to assess probability of contaminant discharge from shipwrecks.
Landquist, H; Rosén, L; Lindhe, A; Norberg, T; Hassellöv, I-M; Lindgren, J F; Dahllöf, I
2014-11-15
Shipwrecks on the sea floor around the world may contain hazardous substances that can cause harm to the marine environment. Today there are no comprehensive methods for environmental risk assessment of shipwrecks, and thus there is poor support for decision-making on prioritization of mitigation measures. The purpose of this study was to develop a tool for quantitative risk estimation of potentially polluting shipwrecks, and in particular an estimation of the annual probability of hazardous substance discharge. The assessment of the probability of discharge is performed using fault tree analysis, facilitating quantification of the probability with respect to a set of identified hazardous events. This approach enables a structured assessment providing transparent uncertainty and sensitivity analyses. The model facilitates quantification of risk, quantification of the uncertainties in the risk calculation and identification of parameters to be investigated further in order to obtain a more reliable risk calculation. Copyright © 2014 Elsevier Ltd. All rights reserved.
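The arithmetic behind the fault-tree approach is the combination of independent basic-event probabilities through OR and AND gates up to the top event (here, annual probability of hazardous-substance discharge). The event names and probabilities below are illustrative assumptions, not the paper's data.

```python
# OR gate: at least one input event occurs; AND gate: all occur.
def or_gate(*p):
    prob = 1.0
    for pi in p:
        prob *= (1.0 - pi)
    return 1.0 - prob

def and_gate(*p):
    prob = 1.0
    for pi in p:
        prob *= pi
    return prob

hull_corrosion = 0.02     # annual basic-event probabilities (assumed)
trawling_impact = 0.005
anchor_strike = 0.001
containment_fails = or_gate(hull_corrosion, trawling_impact, anchor_strike)
substance_present = 0.6   # wreck still holds hazardous cargo/fuel (assumed)
print("P(discharge per year) =", and_gate(containment_fails, substance_present))
```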
Towards a probabilistic tsunami hazard analysis for the Gulf of Cadiz
NASA Astrophysics Data System (ADS)
Løvholt, Finn; Urgeles, Roger
2017-04-01
Landslides and volcanic flank collapses constitute a significant portion of all known tsunami sources, and they are less constrained geographically than earthquakes as they are not tied to large fault zones. While landslides have mostly produced local tsunamis historically, prehistoric evidence shows that landslides can also produce ocean-wide tsunamis. Because the landslide-induced tsunami probability is more difficult to quantify than that induced by earthquakes, the landslide tsunami hazard is less well understood. To improve our understanding and methodologies to deal with this hazard, we here present results and methods for a preliminary landslide probabilistic tsunami hazard assessment (LPTHA) for submerged landslides in the Gulf of Cadiz. The literature on LPTHA is sparse, and studies have so far been separated into two groups: the first based on observed magnitude-frequency distributions (MFDs), the second based on simplified geotechnical slope stability analysis. We argue that the MFD-based approach is best suited when a sufficient amount of data covering a wide range of volumes is available, although uncertainties in the dating of the landslides often represent a potentially large source of bias. To this end, the relatively rich availability of landslide data in the Gulf of Cadiz makes this area suitable for developing and testing LPTHA models. In the presentation, we will first explore the landslide data and statistics, including spatial factors such as slope versus volume relationships, faults, etc. Examples of how random realizations can be used to distribute tsunami sources over the study area will be demonstrated. Furthermore, computational strategies for simulating both the landslide and the tsunami generation in a simplified way will be described. To this end, we use a depth-averaged viscoplastic landslide model coupled to a numerical tsunami model to represent a set of idealized tsunami sources, which are in turn put into a regional tsunami model for computing the tsunami propagation. We devote attention to discussing the epistemic uncertainty and sensitivity of the landslide input parameters, and how these may affect the hazard assessment. As the full variability of the landslide parameters cannot be captured, we show that there is a considerable challenge related to the variability of multiple landslide parameters. Finally, we discuss some logical next steps in the analysis, as well as possible sources of error.
NASA Astrophysics Data System (ADS)
Themistocleous, K.
2017-09-01
PROTHEGO (PROTection of European Cultural HEritage from GeO-hazards) is a collaborative research project funded in the framework of the Joint Programming Initiative on Cultural Heritage and Global Change (JPICH) - Heritage Plus in 2015-2018 (www.prothego.eu). PROTHEGO aims to make an innovative contribution towards the analysis of geo-hazards in areas of cultural heritage, and uses novel space technology for the management of sites and world heritage monuments located throughout Europe, using specialized remote sensing techniques. The methodology will cover the 395 UNESCO monuments in Europe, with case studies conducted in 4 UNESCO sites in England, Spain, Italy and Cyprus. For the Cyprus case study in Choirokoitia, Unmanned Aerial Vehicles (UAVs) are used to monitor and assess the risk from natural hazards at the archaeological site and to evaluate deformation of the cultural heritage site. The UAVs were flown over the study area to produce time-series data, including orthoimages, 3D models and digital elevation models of the Choirokoitia site, in order to identify changes in the area caused by natural hazards.
Modelling tsunami inundation for risk analysis at the Andaman Sea Coast of Thailand
NASA Astrophysics Data System (ADS)
Kaiser, G.; Kortenhaus, A.
2009-04-01
The mega-tsunami of Dec. 26, 2004 strongly impacted the Andaman Sea coast of Thailand and devastated coastal ecosystems as well as towns, settlements and tourism resorts. In addition to the tragic loss of many lives, the destruction or damage of life-supporting infrastructure, such as buildings, roads, water and power supply, caused high economic losses in the region. To mitigate future tsunami impacts there is a need to assess the tsunami hazard and vulnerability in flood-prone areas of the Andaman Sea coast, in order to determine the spatial distribution of risk and to develop risk management strategies. In the bilateral German-Thai project TRAIT, research is performed on integrated risk assessment for the Provinces Phang Nga and Phuket in southern Thailand, including a hazard analysis, i.e. modelling tsunami propagation to the coast, tsunami wave breaking and inundation characteristics, as well as vulnerability analysis of the socio-economic and the ecological system, in order to determine the scenario-based, specific risk for the region. In this presentation, results of the hazard analysis and the inundation simulation are presented and discussed. Numerical modelling of tsunami propagation and inundation is an indispensable tool for risk analysis, risk management and evacuation planning. While numerous investigations have modelled tsunami wave generation and propagation in the Indian Ocean, detailed inundation patterns, i.e. water depth and flow dynamics, are still lacking; for risk management and evacuation planning this knowledge is essential. As the accuracy of the inundation simulation depends strongly on the available bathymetric and topographic data, a multi-scale approach is chosen in this work. The ETOPO Global Relief Model as a bathymetric basis and the Shuttle Radar Topography Mission (SRTM90) have been widely applied in tsunami modelling approaches, as these data are free and available almost worldwide. However, to model tsunami-induced inundation for risk analysis and management purposes, the accuracy of these data is not sufficient: the processes in the near-shore zone cannot be modelled accurately enough, and the spatial resolution of the topography is too coarse. Moreover, the SRTM data provide a digital surface model which includes vegetation and buildings in the surface description. To improve the data basis, additional bathymetric data were used in the near-shore zone of the Phang Nga and Phuket coastlines, and various remote sensing techniques as well as additional GPS measurements were applied to derive a high-resolution topography from satellite and airborne data. Land use classifications and filter methods were developed to correct the digital surface models to digital elevation models. Simulations were then performed with a non-linear shallow water model to reproduce the 2004 Asian Tsunami and to simulate possible future ones. Results of water elevation near the coast were compared with field measurements and observations, and the influence of the resolution of the topography on inundation patterns such as water depth, velocity, dispersion and duration of the flood was analysed. The inundation simulation provides detailed hazard maps and is considered a reliable basis for risk assessment and risk zone mapping. The results are regarded as vital for the estimation of tsunami-induced damages and for evacuation planning. Results of the aforementioned simulations will be discussed during the conference.
Differences in the numerical results obtained using topographic data of different scales, and modified by different post-processing techniques, will be analysed and explained. Further use of the results with respect to tsunami risk analysis and management will also be demonstrated.
NASA Astrophysics Data System (ADS)
Baruffini, Mirko
2010-05-01
Due to the topographical conditions in Switzerland, the highways and the railway lines are frequently exposed to natural hazards such as rockfalls, debris flows, landslides, avalanches and others. With the rising incidence of these natural hazards, protection measures become an important political issue. However, they are costly, and maximal protection is most probably not economically feasible. Furthermore, risks are distributed in space and time. Consequently, important decision problems arise for public sector decision makers. This calls for a high level of surveillance and preservation along the transalpine lines. Efficient protection alternatives can consequently be obtained by considering the concept of integral risk management. Risk analysis, as the central part of risk management, has gradually become a generally accepted approach for the assessment of current and future scenarios (Loat & Zimmermann 2004). The procedure aims at risk reduction, which can be reached by conventional mitigation on the one hand and the implementation of land-use planning on the other: a combination of active and passive mitigation measures is applied to prevent damage to buildings, people and infrastructures. With a Geographical Information System adapted to run a tool developed for risk analysis, it is possible to survey the data in time and space, obtaining an important system for managing natural risks. As a framework, we adopt the Swiss system for risk analysis of gravitational natural hazards (BUWAL 1999). It offers a complete framework for the analysis and assessment of risks due to natural hazards, ranging from hazard assessment for gravitational natural hazards, such as landslides, collapses, rockfalls, floods, debris flows and avalanches, to vulnerability assessment and risk analysis, and the integration into land use planning at the cantonal and municipality level. The scheme is limited to the direct consequences of natural hazards. Thus, we develop a system which integrates the procedures for a complete risk analysis in a Geographic Information System (GIS) toolbox, in order to be applied to our testbed, the Alps-crossing corridor of St. Gotthard. The simulation environment is developed within ArcObjects, the development platform for ArcGIS. The topic of ArcObjects usually emerges when users realize that programming ArcObjects can actually reduce the amount of repetitive work, streamline the workflow, and even produce functionalities that are not easily available in ArcGIS. We have adopted Visual Basic for Applications (VBA) for programming ArcObjects; because VBA is already embedded within ArcMap and ArcCatalog, it is convenient for ArcGIS users to program ArcObjects in VBA. Our tool visualises data obtained from analysis of historical records (aerial photo imagery, field surveys, documentation of past events) or from environmental modelling (estimates of the area affected by a given event), together with event attributes such as route number and route position, and thematic maps. As a result of this step, the record appears in the WebGIS. The user can select a specific area to overview previous hazards in the region. After performing the analysis, a double click on the visualised infrastructure opens the corresponding results. The constantly updated risk maps show all sites that require more protection against natural hazards. The final goal of our work is to offer a versatile tool for risk analysis which can be applied to different situations.
Today our GIS application mainly centralises the documentation of natural hazards. Additionally, the system offers information about natural hazards along the Gotthard line. It is very flexible and can be used as a simple program to model the expansion of natural hazards, as a program to quantitatively estimate risks, or for detailed analysis at the municipality level. The tool is extensible and can be expanded with additional modules. The initial results of the experimental case study show how useful a GIS-based system can be for effective and efficient disaster response management. In the coming years our GIS application will become a database containing all information needed for the evaluation of risk sites along the Gotthard line. It can also help technical management to decide about protection measures because, in addition to the visualisation, tools for spatial data analysis will be available. REFERENCES Bründl M. (Ed.) 2009: Risikokonzept für Naturgefahren - Leitfaden. Nationale Plattform für Naturgefahren PLANAT, Bern. 416 S. BUWAL 1999: Risikoanalyse bei gravitativen Naturgefahren - Methode, Fallbeispiele und Daten (Risk analyses for gravitational natural hazards). Bundesamt für Umwelt, Wald und Landschaft (BUWAL). Umwelt-Materialien Nr. 107, 1-244. Loat, R. & Zimmermann, M. 2004: La gestion des risques en Suisse (Risk Management in Switzerland). In: Veyret, Y., Garry, G., Meschinet de Richemont, N. & Armand Colin (eds) 2002: Colloque Arche de la Défense 22-24 octobre 2002, dans Risques naturels et aménagement en Europe, 108-120. Maggi R. et al, 2009: Evaluation of the optimal resilience for vulnerable infrastructure networks. An interdisciplinary pilot study on the transalpine transportation corridors, NRP 54 "Sustainable Development of the Built Environment", Projekt Nr. 405 440, Final Scientific Report, Lugano
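To make the risk-analysis step concrete, the sketch below computes the annual risk for a single object along the line in the multiplicative form common to BUWAL-style risk concepts (risk = scenario probability x spatial probability x exposure x vulnerability x value). All numbers are invented for illustration and are not taken from the study.

    # Illustrative annual risk for one infrastructure object; every
    # number below is an assumption, not a value from the paper.
    p_scenario = 1 / 100      # annual probability of a 100-year rockfall scenario
    p_spatial = 0.3           # probability the object lies within the runout area
    p_exposure = 0.05         # fraction of time a train occupies the section
    vulnerability = 0.5       # expected degree of loss given an impact
    value = 2_000_000         # value at risk (CHF)

    annual_risk = p_scenario * p_spatial * p_exposure * vulnerability * value
    print(f"{annual_risk:,.0f} CHF/year")  # expected annual damage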
Integrated hazard assessment of Cirenmaco glacial lake in Zhangzangbo valley, Central Himalayas
NASA Astrophysics Data System (ADS)
Wang, Weicai; Gao, Yang; Iribarren Anacona, Pablo; Lei, Yanbin; Xiang, Yang; Zhang, Guoqing; Li, Shenghai; Lu, Anxin
2018-04-01
Glacial lake outburst floods (GLOFs) have recently become one of the primary natural hazards in the Himalayas. There is therefore an urgent need to assess GLOF hazards in the region. Cirenmaco, a moraine-dammed lake located in the upstream portion of Zhangzangbo valley, Central Himalayas, has received public attention after its damaging 1981 outburst flood. Here, by combining remote sensing methods, bathymetric survey and 2D hydraulic modeling, we assessed the hazard posed by Cirenmaco in its current state. Inter-annual variation of the lake area indicates rapid expansion from 0.10 ± 0.08 km² in 1988 to 0.39 ± 0.04 km² in 2013. A bathymetric survey shows the maximum water depth of the lake in 2012 was 115 ± 2 m, and the lake volume was calculated to be 1.8 × 10⁷ m³. Field geomorphic analysis shows that Cirenmaco glacial lake is prone to GLOFs, as mass movements and ice and snow avalanches can impact the lake and melting of the dead ice in the moraine can lower the dam level. The HEC-RAS 2D model was then used to simulate moraine dam failure of Cirenmaco and assess GLOF impacts downstream. Reconstruction of the 1981 Cirenmaco GLOF shows that HEC-RAS can reproduce reasonable flood extents and water depths, demonstrating its ability to model complex GLOFs effectively. The GLOF modeling results presented here can be used as a basis for the implementation of disaster prevention and mitigation measures. As a case study, this work shows how different methods can be integrated into GLOF hazard assessment.
NASA Astrophysics Data System (ADS)
Panzera, Francesco; Lombardo, Giuseppe; Rigano, Rosaria
2010-05-01
The seismic hazard assessment (SHA) can be performed using either deterministic or probabilistic approaches. In the present study a probabilistic analysis was carried out for the towns of Catania and Siracusa using two different procedures: the 'site' (Albarello and Mucciarelli, 2002) and the 'seismotectonic' (Cornell, 1968; Esteva, 1967) methodologies. The SASHA code (D'Amico and Albarello, 2007) was used to calculate seismic hazard through the 'site' approach, whereas the CRISIS2007 code (Ordaz et al., 2007) was adopted for the Esteva-Cornell procedure. According to current international conventions for PSHA (SSHAC, 1997), a logic tree approach was followed to consider and reduce the epistemic uncertainties for both the seismotectonic and site methods. The SASHA code handles intensity data, taking into account the macroseismic information of past earthquakes. The CRISIS2007 code needs, as input elements, a seismic catalogue tested for completeness, a seismogenetic zonation and ground motion prediction equations. Data concerning the characterization of regional seismic sources and ground motion attenuation properties were taken from the literature. Special care was devoted to defining source zone models, taking into account the most recent studies on regional seismotectonic features and, in particular, the possibility of considering the Malta escarpment as a potential source. The combined use of the above-mentioned approaches allowed us to obtain useful elements for defining the site seismic hazard in Catania and Siracusa. The results point out that the choice of the probabilistic model plays a fundamental role. It is indeed observed that when the site intensity data are used, the town of Catania shows hazard values higher than those found for Siracusa, for each considered return period. On the contrary, when the Esteva-Cornell method is used, the Siracusa urban area shows higher hazard than Catania for return periods greater than one hundred years. The higher hazard observed for the Catania area through the site approach can be interpreted in terms of the greater damage historically observed in this town and its smaller distance from the seismogenic structures. On the other hand, the higher level of hazard found for Siracusa through the Esteva-Cornell approach could be a consequence of the features of this method, which spreads the intensities over a wide area. However, in SHA the use of a combined approach is recommended for mutual validation of the obtained results, and any choice between the two approaches is strictly linked to the knowledge of the local seismotectonic features. References Albarello D. and Mucciarelli M.; 2002: Seismic hazard estimates using ill-defined macroseismic data at site. Pure Appl. Geophys., 159, 1289-1304. Cornell C.A.; 1968: Engineering seismic risk analysis. Bull. Seism. Soc. Am., 58(5), 1583-1606. D'Amico V. and Albarello D.; 2007: Codice per il calcolo della pericolosità sismica da dati di sito (freeware). Progetto DPC-INGV S1, http://esse1.mi.ingv.it/d12.html Esteva L.; 1967: Criterios para la construcción de espectros para diseño sísmico. Proceedings of XII Jornadas Sudamericanas de Ingeniería Estructural y III Simposio Panamericano de Estructuras, Caracas, 1967. Published later in Boletín del Instituto de Materiales y Modelos Estructurales, Universidad Central de Venezuela, No. 19. Ordaz M., Aguilar A. and Arboleda J.; 2007: CRISIS2007, Program for computing seismic hazard. Version 5.4, Mexico City: UNAM.
SSHAC (Senior Seismic Hazard Analysis Committee); 1997: Recommendations for probabilistic seismic hazard analysis: guidance on uncertainty and use of experts. NUREG/CR-6372.
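As a rough illustration of how a logic tree combines the two probabilistic procedures, the sketch below averages two hypothetical annual-exceedance curves with branch weights; the curves and the equal weights are invented, not values from the study.

    import numpy as np

    # Intensity thresholds at which annual exceedance probability is evaluated
    intensity = np.array([6, 7, 8, 9, 10])        # e.g. macroseismic intensity

    # Hypothetical hazard curves from the two branches: annual P[I > i]
    curve_site = np.array([2e-2, 8e-3, 2e-3, 5e-4, 1e-4])     # 'site' approach
    curve_cornell = np.array([3e-2, 1e-2, 3e-3, 9e-4, 2e-4])  # Esteva-Cornell

    w = np.array([0.5, 0.5])                      # logic-tree branch weights
    mean_curve = w[0] * curve_site + w[1] * curve_cornell
    print(np.round(1.0 / mean_curve))             # return periods (years)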
Big Data Toolsets to Pharmacometrics: Application of Machine Learning for Time‐to‐Event Analysis
Gong, Xiajing; Hu, Meng
2018-01-01
Additional value can be potentially created by applying big data tools to address pharmacometric problems. The performances of machine learning (ML) methods and the Cox regression model were evaluated based on simulated time-to-event data synthesized under various preset scenarios, i.e., with linear vs. nonlinear and dependent vs. independent predictors in the proportional hazard function, or with high-dimensional data featured by a large number of predictor variables. Our results showed that ML-based methods outperformed the Cox model in prediction performance as assessed by concordance index and in identifying the preset influential variables for high-dimensional data. The prediction performances of ML-based methods are also less sensitive to data size and censoring rates than the Cox regression model. In conclusion, ML-based methods provide a powerful tool for time-to-event analysis, with a built-in capacity for high-dimensional data and better performance when the predictor variables assume nonlinear relationships in the hazard function. PMID:29536640
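For readers unfamiliar with the concordance index used to compare the methods, a minimal O(n²) implementation of Harrell's C is sketched below on simulated inputs; ties in event times are ignored for brevity.

    import numpy as np

    def concordance_index(time, event, risk):
        """Harrell's C: share of comparable pairs in which the subject
        with the higher risk score experiences the event first."""
        concordant, comparable = 0.0, 0
        for i in range(len(time)):
            if not event[i]:
                continue                      # i must have an observed event
            for j in range(len(time)):
                if time[i] < time[j]:         # pair (i, j) is comparable
                    comparable += 1
                    if risk[i] > risk[j]:
                        concordant += 1.0
                    elif risk[i] == risk[j]:
                        concordant += 0.5
        return concordant / comparable

    rng = np.random.default_rng(0)
    t = rng.exponential(1.0, 200)
    r = -t + rng.normal(0, 0.5, 200)          # risk anti-correlated with time
    print(round(concordance_index(t, np.ones(200, bool), r), 2))  # well above 0.5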
Nasejje, Justine B; Mwambi, Henry
2017-09-07
Uganda, like many other sub-Saharan African countries, has a high under-five child mortality rate. To inform policy on intervention strategies, sound statistical methods are required to critically identify factors strongly associated with under-five child mortality rates. The Cox proportional hazards model has been a common choice for analysing data to understand factors strongly associated with high child mortality rates, taking age as the time-to-event variable. However, due to its restrictive proportional hazards (PH) assumption, covariates of interest which do not satisfy the assumption are often excluded from the analysis to avoid mis-specifying the model; otherwise, using covariates that clearly violate the assumption would yield invalid results. Survival trees and random survival forests are increasingly popular for analysing survival data, particularly large survey data, and could be attractive alternatives to models with the restrictive PH assumption. In this article, we adopt random survival forests, which have never been used to understand factors affecting under-five child mortality rates in Uganda using Demographic and Health Survey data. The first part of the analysis is based on the classical Cox PH model and the second part on random survival forests in the presence of covariates that do not necessarily satisfy the PH assumption. Random survival forests and the Cox proportional hazards model agree that the sex of the household head, the sex of the child, and the number of births in the past year are strongly associated with under-five child mortality in Uganda; all three covariates satisfy the PH assumption. Random survival forests further demonstrated that covariates originally excluded from the earlier analysis due to violation of the PH assumption were important in explaining under-five child mortality rates. These covariates include the number of children under the age of five in a household, the number of births in the past 5 years, wealth index, total number of children ever born, and the child's birth order. The results further indicated that the predictive performance of random survival forests built using covariates including those that violate the PH assumption was higher than that of random survival forests built using only covariates that satisfy the PH assumption. Random survival forests are appealing methods for analysing public health data to understand factors strongly associated with under-five child mortality rates, especially in the presence of covariates that violate the proportional hazards assumption.
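A minimal random survival forest fit in the spirit of the analysis is sketched below using scikit-survival; the package choice and the simulated data are my assumptions (the study used Demographic and Health Survey data).

    import numpy as np
    from sksurv.ensemble import RandomSurvivalForest
    from sksurv.util import Surv

    rng = np.random.default_rng(0)
    n = 2000
    X = rng.normal(size=(n, 6))                  # covariates, some violating PH
    true_t = rng.exponential(5 * np.exp(-0.5 * X[:, 0] - np.sin(X[:, 1])))
    cens = rng.exponential(8, n)                 # random censoring times
    y = Surv.from_arrays(event=true_t <= cens, time=np.minimum(true_t, cens))

    rsf = RandomSurvivalForest(n_estimators=200, min_samples_leaf=15,
                               random_state=0).fit(X, y)
    print(round(rsf.score(X, y), 2))             # Harrell's concordance index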
Markov chains and semi-Markov models in time-to-event analysis.
Abner, Erin L; Charnigo, Richard J; Kryscio, Richard J
2013-10-25
A variety of statistical methods are available to investigators for analysis of time-to-event data, often referred to as survival analysis. Kaplan-Meier estimation and Cox proportional hazards regression are commonly employed tools but are not appropriate for all studies, particularly in the presence of competing risks and when multiple or recurrent outcomes are of interest. Markov chain models can accommodate censored data, competing risks (informative censoring), multiple outcomes, recurrent outcomes, frailty, and non-constant survival probabilities. Markov chain models, though often overlooked by investigators in time-to-event analysis, have long been used in clinical studies and have widespread application in other fields.
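A toy discrete-time Markov chain of the kind the authors describe, with an illness state as a competing risk and death as an absorbing state, is sketched below; the transition probabilities are invented.

    import numpy as np

    # States: 0 = event-free, 1 = ill (competing outcome), 2 = dead (absorbing)
    P = np.array([[0.90, 0.07, 0.03],
                  [0.00, 0.85, 0.15],
                  [0.00, 0.00, 1.00]])      # one-cycle transition probabilities

    occupancy = np.array([1.0, 0.0, 0.0])   # whole cohort starts event-free
    for _ in range(10):                     # propagate ten cycles
        occupancy = occupancy @ P
    print(occupancy.round(3))               # last entry = cumulative mortality

Unlike a Kaplan-Meier curve, the occupancy vector tracks all states at once, which is how such chains accommodate competing risks and recurrent outcomes.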
Markov chains and semi-Markov models in time-to-event analysis
Abner, Erin L.; Charnigo, Richard J.; Kryscio, Richard J.
2014-01-01
A variety of statistical methods are available to investigators for analysis of time-to-event data, often referred to as survival analysis. Kaplan-Meier estimation and Cox proportional hazards regression are commonly employed tools but are not appropriate for all studies, particularly in the presence of competing risks and when multiple or recurrent outcomes are of interest. Markov chain models can accommodate censored data, competing risks (informative censoring), multiple outcomes, recurrent outcomes, frailty, and non-constant survival probabilities. Markov chain models, though often overlooked by investigators in time-to-event analysis, have long been used in clinical studies and have widespread application in other fields. PMID:24818062
Jonkman, Nini H; Westland, Heleen; Groenwold, Rolf H H; Ågren, Susanna; Atienza, Felipe; Blue, Lynda; Bruggink-André de la Porte, Pieta W F; DeWalt, Darren A; Hebert, Paul L; Heisler, Michele; Jaarsma, Tiny; Kempen, Gertrudis I J M; Leventhal, Marcia E; Lok, Dirk J A; Mårtensson, Jan; Muñiz, Javier; Otsu, Haruka; Peters-Klimm, Frank; Rich, Michael W; Riegel, Barbara; Strömberg, Anna; Tsuyuki, Ross T; van Veldhuisen, Dirk J; Trappenburg, Jaap C A; Schuurmans, Marieke J; Hoes, Arno W
2016-03-22
Self-management interventions are widely implemented in the care for patients with heart failure (HF). However, trials show inconsistent results, and whether specific patient groups respond differently is unknown. This individual patient data meta-analysis assessed the effectiveness of self-management interventions in patients with HF and whether subgroups of patients respond differently. A systematic literature search identified randomized trials of self-management interventions. Data from 20 studies, representing 5624 patients, were included and analyzed with the use of mixed-effects models and Cox proportional-hazard models, including interaction terms. Self-management interventions reduced the risk of time to the combined end point of HF-related hospitalization or all-cause death (hazard ratio, 0.80; 95% confidence interval [CI], 0.71-0.89), time to HF-related hospitalization (hazard ratio, 0.80; 95% CI, 0.69-0.92), and improved 12-month HF-related quality of life (standardized mean difference, 0.15; 95% CI, 0.00-0.30). Subgroup analysis revealed a protective effect of self-management on the number of HF-related hospital days in patients <65 years of age (mean, 0.70 versus 5.35 days; interaction P=0.03). Patients without depression did not show an effect of self-management on survival (hazard ratio for all-cause mortality, 0.86; 95% CI, 0.69-1.06), whereas in patients with moderate/severe depression, self-management reduced survival (hazard ratio, 1.39; 95% CI, 1.06-1.83, interaction P=0.01). This study shows that self-management interventions had a beneficial effect on time to HF-related hospitalization or all-cause death and HF-related hospitalization alone and elicited a small increase in HF-related quality of life. The findings do not endorse limiting self-management interventions to subgroups of patients with HF, but increased mortality in depressed patients warrants caution in applying self-management strategies in these patients. © 2016 American Heart Association, Inc.
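The depression-by-intervention effect modification can be probed with an interaction term in a Cox model; a simulated sketch using lifelines is below. The coefficients, data and crude censoring mechanism are invented, not the trial's.

    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(0)
    n = 5000
    df = pd.DataFrame({"self_mgmt": rng.binomial(1, 0.5, n),
                       "depressed": rng.binomial(1, 0.25, n)})
    df["sm_x_dep"] = df.self_mgmt * df.depressed      # interaction term
    lp = -0.22 * df.self_mgmt + 0.55 * df.sm_x_dep    # assumed log-hazard effects
    df["time"] = rng.exponential(5.0 * np.exp(-lp))
    df["event"] = rng.random(n) < 0.7                 # random non-informative censoring

    cph = CoxPHFitter().fit(df, duration_col="time", event_col="event")
    cph.print_summary(decimals=2)   # HR in the depressed arm = exp(b_sm + b_int)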
77 FR 55371 - System Safety Program
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-07
...-based rule and FRA seeks comments on all aspects of the proposed rule. An SSP would be implemented by a... SSP would be the risk-based hazard management program and risk-based hazard analysis. A properly implemented risk-based hazard management program and risk-based hazard analysis would identify the hazards and...
Introduction to “Global tsunami science: Past and future, Volume I”
Geist, Eric L.; Fritz, Hermann; Rabinovich, Alexander B.; Tanioka, Yuichiro
2016-01-01
Twenty-five papers on the study of tsunamis are included in Volume I of the PAGEOPH topical issue “Global Tsunami Science: Past and Future”. Six papers examine various aspects of tsunami probability and uncertainty analysis related to hazard assessment. Three papers relate to deterministic hazard and risk assessment. Five more papers present new methods for tsunami warning and detection. Six papers describe new methods for modeling tsunami hydrodynamics. Two papers investigate tsunamis generated by non-seismic sources: landslides and meteorological disturbances. The final three papers describe important case studies of recent and historical events. Collectively, this volume highlights contemporary trends in global tsunami research, both fundamental and applied toward hazard assessment and mitigation.
Introduction to "Global Tsunami Science: Past and Future, Volume I"
NASA Astrophysics Data System (ADS)
Geist, Eric L.; Fritz, Hermann M.; Rabinovich, Alexander B.; Tanioka, Yuichiro
2016-12-01
Twenty-five papers on the study of tsunamis are included in Volume I of the PAGEOPH topical issue "Global Tsunami Science: Past and Future". Six papers examine various aspects of tsunami probability and uncertainty analysis related to hazard assessment. Three papers relate to deterministic hazard and risk assessment. Five more papers present new methods for tsunami warning and detection. Six papers describe new methods for modeling tsunami hydrodynamics. Two papers investigate tsunamis generated by non-seismic sources: landslides and meteorological disturbances. The final three papers describe important case studies of recent and historical events. Collectively, this volume highlights contemporary trends in global tsunami research, both fundamental and applied toward hazard assessment and mitigation.
Multi-scale landslide hazard assessment: Advances in global and regional methodologies
NASA Astrophysics Data System (ADS)
Kirschbaum, Dalia; Peters-Lidard, Christa; Adler, Robert; Hong, Yang
2010-05-01
The increasing availability of remotely sensed surface and precipitation data provides a unique opportunity to explore how smaller-scale landslide susceptibility and hazard assessment methodologies may be applicable at larger spatial scales. This research first considers an emerging satellite-based global algorithm framework, which evaluates how landslide susceptibility and satellite-derived rainfall estimates can forecast potential landslide conditions. An analysis of this algorithm using a newly developed global landslide inventory catalog suggests that forecasting errors are geographically variable due to improper weighting of surface observables, the resolution of the current susceptibility map, and limitations in the availability of landslide inventory data. These methodological and data limitation issues can be more thoroughly assessed at the regional level, where available higher-resolution landslide inventories can be applied to empirically derive relationships between surface variables and landslide occurrence. The regional empirical model shows improvement over the global framework in advancing near real-time landslide forecasting efforts; however, many uncertainties and assumptions surrounding such a methodology decrease the functionality and utility of this system. This research seeks to improve upon this initial concept by exploring the potential opportunities and methodological structure needed to advance larger-scale landslide hazard forecasting and make it more of an operational reality. Sensitivity analysis of the surface and rainfall parameters in the preliminary algorithm indicates that surface data resolution and the interdependency of variables must be more appropriately quantified at local and regional scales. Additionally, integrating available surface parameters must be approached in a more theoretical, physically based manner to better represent the physical processes underlying slope instability and landslide initiation. Several rainfall infiltration and hydrological flow models have been developed to model slope instability at small spatial scales. This research investigates the potential of applying a more quantitative hydrological model at larger spatial scales, utilizing satellite and surface data inputs that are obtainable over different geographic regions. Because data and methodological uncertainties play a significant role in the effectiveness of landslide hazard assessment outputs, the methodology and data inputs are considered within an ensemble uncertainty framework, in order to better resolve the contributions and limitations of model inputs and to more effectively communicate model skill for improved landslide hazard assessment.
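Algorithms of this kind typically compare rainfall estimates against an intensity-duration threshold over susceptible terrain; a bare-bones check of that idea is sketched below. The coefficients follow the classic Caine-style global curve I = a * D^b and serve purely as placeholders, not as the framework's calibrated values.

    # Illustrative rainfall intensity-duration threshold check; the
    # coefficients below are assumed placeholders, not the algorithm's.
    a, b = 14.82, -0.39

    def exceeds_threshold(intensity_mm_h, duration_h):
        """True if a rainfall episode lies above the I-D threshold curve."""
        return intensity_mm_h > a * duration_h ** b

    # 10 mm/h sustained for 6 h exceeds the curve -> potential landslide alert
    print(exceeds_threshold(10.0, 6.0))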
ERIC Educational Resources Information Center
Reardon, Sean F.; Brennan, Robert T.; Buka, Stephen L.
2002-01-01
Developed procedures for constructing a retrospective person-period data set from cross-sectional data and discusses modeling strategies for estimating multilevel discrete-time event history models. Applied the methods to the analysis of cigarette use by 1,979 urban adolescents. Results show the influence of the racial composition of the…
9 CFR 417.2 - Hazard Analysis and HACCP Plan.
Code of Federal Regulations, 2012 CFR
2012-01-01
... more food safety hazards that are reasonably likely to occur, based on the hazard analysis conducted in... 9 Animals and Animal Products 2 2012-01-01 2012-01-01 false Hazard Analysis and HACCP Plan. 417.2 Section 417.2 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE...
9 CFR 417.2 - Hazard Analysis and HACCP Plan.
Code of Federal Regulations, 2014 CFR
2014-01-01
... more food safety hazards that are reasonably likely to occur, based on the hazard analysis conducted in... 9 Animals and Animal Products 2 2014-01-01 2014-01-01 false Hazard Analysis and HACCP Plan. 417.2 Section 417.2 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE...
9 CFR 417.2 - Hazard Analysis and HACCP Plan.
Code of Federal Regulations, 2011 CFR
2011-01-01
... more food safety hazards that are reasonably likely to occur, based on the hazard analysis conducted in... 9 Animals and Animal Products 2 2011-01-01 2011-01-01 false Hazard Analysis and HACCP Plan. 417.2 Section 417.2 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE...
9 CFR 417.2 - Hazard Analysis and HACCP Plan.
Code of Federal Regulations, 2013 CFR
2013-01-01
... more food safety hazards that are reasonably likely to occur, based on the hazard analysis conducted in... 9 Animals and Animal Products 2 2013-01-01 2013-01-01 false Hazard Analysis and HACCP Plan. 417.2 Section 417.2 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE...
9 CFR 417.2 - Hazard Analysis and HACCP Plan.
Code of Federal Regulations, 2010 CFR
2010-01-01
... more food safety hazards that are reasonably likely to occur, based on the hazard analysis conducted in... 9 Animals and Animal Products 2 2010-01-01 2010-01-01 false Hazard Analysis and HACCP Plan. 417.2 Section 417.2 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE...
Seibert, Tyler M; Fan, Chun Chieh; Wang, Yunpeng; Zuber, Verena; Karunamuni, Roshan; Parsons, J Kellogg; Eeles, Rosalind A; Easton, Douglas F; Kote-Jarai, ZSofia; Al Olama, Ali Amin; Garcia, Sara Benlloch; Muir, Kenneth; Grönberg, Henrik; Wiklund, Fredrik; Aly, Markus; Schleutker, Johanna; Sipeky, Csilla; Tammela, Teuvo Lj; Nordestgaard, Børge G; Nielsen, Sune F; Weischer, Maren; Bisbjerg, Rasmus; Røder, M Andreas; Iversen, Peter; Key, Tim J; Travis, Ruth C; Neal, David E; Donovan, Jenny L; Hamdy, Freddie C; Pharoah, Paul; Pashayan, Nora; Khaw, Kay-Tee; Maier, Christiane; Vogel, Walther; Luedeke, Manuel; Herkommer, Kathleen; Kibel, Adam S; Cybulski, Cezary; Wokolorczyk, Dominika; Kluzniak, Wojciech; Cannon-Albright, Lisa; Brenner, Hermann; Cuk, Katarina; Saum, Kai-Uwe; Park, Jong Y; Sellers, Thomas A; Slavov, Chavdar; Kaneva, Radka; Mitev, Vanio; Batra, Jyotsna; Clements, Judith A; Spurdle, Amanda; Teixeira, Manuel R; Paulo, Paula; Maia, Sofia; Pandha, Hardev; Michael, Agnieszka; Kierzek, Andrzej; Karow, David S; Mills, Ian G; Andreassen, Ole A; Dale, Anders M
2018-01-10
To develop and validate a genetic tool to predict age of onset of aggressive prostate cancer (PCa) and to guide decisions of whom to screen and at what age. Analysis of genotype, PCa status, and age to select single nucleotide polymorphisms (SNPs) associated with diagnosis. These polymorphisms were incorporated into a survival analysis to estimate their effects on age at diagnosis of aggressive PCa (that is, not eligible for surveillance according to National Comprehensive Cancer Network guidelines; any of Gleason score ≥7, stage T3-T4, PSA (prostate specific antigen) concentration ≥10 ng/mL, nodal metastasis, distant metastasis). The resulting polygenic hazard score is an assessment of individual genetic risk. The final model was applied to an independent dataset containing genotype and PSA screening data. The hazard score was calculated for these men to test prediction of survival free from PCa. Multiple institutions that were members of the international PRACTICAL consortium. All consortium participants of European ancestry with known age, PCa status, and quality assured custom (iCOGS) array genotype data. The development dataset comprised 31 747 men; the validation dataset comprised 6411 men. Prediction with hazard score of age of onset of aggressive cancer in the validation set. In the independent validation set, the hazard score calculated from 54 single nucleotide polymorphisms was a highly significant predictor of age at diagnosis of aggressive cancer (z = 11.2, P < 10⁻¹⁶). When men in the validation set with high scores (>98th centile) were compared with those with average scores (30th-70th centile), the hazard ratio for aggressive cancer was 2.9 (95% confidence interval 2.4 to 3.4). Inclusion of family history in a combined model did not improve prediction of onset of aggressive PCa (P=0.59), and polygenic hazard score performance remained high when family history was accounted for. Additionally, the positive predictive value of PSA screening for aggressive PCa was increased with increasing polygenic hazard score. Polygenic hazard scores can be used for personalised genetic risk estimates that can predict age at onset of aggressive PCa. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
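The polygenic hazard score itself is just a weighted allele count; a schematic computation with simulated genotypes is sketched below. The weights and data are invented, and the study's 54 SNP weights are not reproduced here.

    import numpy as np

    rng = np.random.default_rng(0)
    n_snps, n_men = 54, 10_000
    beta = rng.normal(0, 0.05, n_snps)            # per-allele log-hazard weights
    geno = rng.integers(0, 3, (n_men, n_snps))    # risk-allele counts (0/1/2)

    phs = geno @ beta                             # polygenic hazard score
    top = phs > np.quantile(phs, 0.98)            # >98th centile group
    avg = (phs >= np.quantile(phs, 0.30)) & (phs <= np.quantile(phs, 0.70))

    # Under proportional hazards, the group hazard ratio is exp(mean difference)
    print(round(np.exp(phs[top].mean() - phs[avg].mean()), 2))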
Geophysical Hazards and Preventive Disaster Management of Extreme Natural Events
NASA Astrophysics Data System (ADS)
Ismail-Zadeh, A.; Takeuchi, K.
2007-12-01
A geophysical hazard is a potentially damaging natural event and/or phenomenon, which may cause loss of life or injury, property damage, social and economic disruption, or environmental degradation. Extreme natural hazards are a key manifestation of the complex, hierarchical, nonlinear Earth system. Understanding, accurately modeling and forecasting extreme hazards are among the most important scientific challenges. Several recent extreme natural events (e.g., the 2004 Great Indian Ocean Earthquake and Tsunami and the violent 2005 Hurricane Katrina) demonstrated strong coupling between the solid Earth and the ocean, and between the ocean and the atmosphere. These events resulted in great humanitarian tragedies because of weak preventive disaster management. The less often natural events occur (and extreme events are rare by definition), the more readily disaster managers postpone preparedness for them. The tendency to reduce funding for preventive disaster management of natural catastrophes seldom follows the rules of responsible stewardship for future generations, whether in developing countries or in highly developed economies, where it must be considered next to malfeasance. Protecting human life and property against earthquake disasters requires an uninterrupted chain of tasks: from (i) understanding the physics of the events, analysis and monitoring, through (ii) interpretation, modeling, hazard assessment, and prediction, to (iii) public awareness, preparedness, and preventive disaster management.
Liu, Xiang; Saat, Mohd Rapik; Barkan, Christopher P L
2014-07-15
Railroads play a key role in the transportation of hazardous materials in North America. Rail transport differs from highway transport in several aspects, an important one being that rail transport involves trains in which many railcars carrying hazardous materials travel together. In contrast to truck accidents, a train accident may involve multiple hazardous materials cars derailing and releasing contents, with consequently greater potential impact on human health, property and the environment. In this paper, a probabilistic model is developed to estimate the probability distribution of the number of tank cars releasing contents in a train derailment. Principal operational characteristics considered include train length, derailment speed, accident cause, position of the first car derailed, number and placement of tank cars in a train, and tank car safety design. The effects of train speed, tank car safety design and tank car position in a train on the number of cars that release their contents in a derailment were evaluated. This research provides insights regarding the circumstances affecting multiple-tank-car release incidents and potential strategies to reduce their occurrence. The model can be incorporated into a larger risk management framework to enable better local, regional and national safety management of hazardous materials transportation by rail. Copyright © 2014 Elsevier B.V. All rights reserved.
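A stripped-down Monte Carlo version of such a model is sketched below; the derailment-length law, tank-car placement and conditional release probability are all invented placeholders for the paper's calibrated inputs.

    import numpy as np

    rng = np.random.default_rng(42)
    train_len, n_tank = 100, 20
    tank_pos = rng.choice(train_len, n_tank, replace=False)  # tank-car positions

    def cars_releasing(speed):
        first = rng.integers(0, train_len)        # position of first car derailed
        n_derailed = rng.poisson(0.4 * speed)     # assumed speed-length relation
        derailed = np.arange(first, min(first + n_derailed, train_len))
        n_tank_hit = np.isin(tank_pos, derailed).sum()
        return (rng.random(n_tank_hit) < 0.25).sum()  # assumed release probability

    sims = np.array([cars_releasing(speed=40) for _ in range(20_000)])
    print(np.bincount(sims)[:6] / sims.size)      # P(0, 1, 2, ... cars release)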
A model based on crowdsourcing for detecting natural hazards
NASA Astrophysics Data System (ADS)
Duan, J.; Ma, C.; Zhang, J.; Liu, S.; Liu, J.
2015-12-01
Remote sensing technology provides a new method for the detection, early warning, mitigation and relief of natural hazards. Given the suddenness and unpredictable location of natural hazards, as well as the practical demands of hazards work, this article proposes an evaluation model for remote sensing detection of natural hazards based on crowdsourcing. Firstly, using a crowdsourcing model and drawing on the Internet and the power of hundreds of millions of Internet users, the model provides visual interpretation of high-resolution remote sensing images of hazard areas and collects massive amounts of valuable disaster data; secondly, it adopts a strategy of dynamic voting consistency to evaluate the disaster data provided by the crowdsourcing workers; thirdly, it pre-estimates disaster severity with a pre-evaluation model based on regional buffers; lastly, it activates the corresponding expert system according to the forecast results. The idea of this model breaks the boundaries between geographic information professionals and the public, realizes public participation and citizen science, and improves the accuracy and timeliness of hazard assessment results.
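The "dynamic voting consistency" step can be read as accepting a crowdsourced label only once enough votes agree; a minimal sketch of that idea follows, with the vote counts and thresholds invented.

    from collections import Counter

    def dynamic_vote(labels, min_votes=3, threshold=0.7):
        """Return the accepted label for one image tile, or None if the
        crowd has not yet reached the required level of agreement."""
        if len(labels) < min_votes:
            return None                      # keep collecting votes
        top, count = Counter(labels).most_common(1)[0]
        return top if count / len(labels) >= threshold else None

    print(dynamic_vote(["flood", "flood", "landslide", "flood"]))  # -> flood
    print(dynamic_vote(["flood", "landslide"]))                    # -> None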
Cole, Stephen R.; Hudgens, Michael G.; Tien, Phyllis C.; Anastos, Kathryn; Kingsley, Lawrence; Chmiel, Joan S.; Jacobson, Lisa P.
2012-01-01
To estimate the association of antiretroviral therapy initiation with incident acquired immunodeficiency syndrome (AIDS) or death while accounting for time-varying confounding in a cost-efficient manner, the authors combined a case-cohort study design with inverse probability-weighted estimation of a marginal structural Cox proportional hazards model. A total of 950 adults who were positive for human immunodeficiency virus type 1 were followed in 2 US cohort studies between 1995 and 2007. In the full cohort, 211 AIDS cases or deaths occurred during 4,456 person-years. In an illustrative 20% random subcohort of 190 participants, 41 AIDS cases or deaths occurred during 861 person-years. Accounting for measured confounders and determinants of dropout by inverse probability weighting, the full cohort hazard ratio was 0.41 (95% confidence interval: 0.26, 0.65) and the case-cohort hazard ratio was 0.47 (95% confidence interval: 0.26, 0.83). Standard multivariable-adjusted hazard ratios were closer to the null, regardless of study design. The precision lost with the case-cohort design was modest given the cost savings. Results from Monte Carlo simulations demonstrated that the proposed approach yields approximately unbiased estimates of the hazard ratio with appropriate confidence interval coverage. Marginal structural model analysis of case-cohort study designs provides a cost-efficient design coupled with an accurate analytic method for research settings in which there is time-varying confounding. PMID:22302074
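The inverse probability weights at the core of the marginal structural model can be sketched as follows, using a logistic propensity model on simulated confounders; in the actual analysis the weights are time-varying and also include dropout terms.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(1)
    X = rng.normal(size=(950, 3))                    # measured confounders
    a = rng.binomial(1, 1 / (1 + np.exp(-X[:, 0])))  # therapy initiation

    ps = LogisticRegression().fit(X, a).predict_proba(X)[:, 1]
    p_marg = a.mean()
    # stabilized inverse-probability-of-treatment weights
    w = np.where(a == 1, p_marg / ps, (1 - p_marg) / (1 - ps))
    print(round(w.mean(), 2))   # stabilized weights should average about 1

A weighted Cox fit (e.g., lifelines' CoxPHFitter with its weights_col argument) would then estimate the marginal hazard ratio.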
Cystic Fibrosis Associated with Worse Survival After Liver Transplantation.
Black, Sylvester M; Woodley, Frederick W; Tumin, Dmitry; Mumtaz, Khalid; Whitson, Bryan A; Tobias, Joseph D; Hayes, Don
2016-04-01
Survival in cystic fibrosis patients after liver transplantation and liver-lung transplantation is not well studied. To discern survival rates after liver transplantation and liver-lung transplantation in patients with and without cystic fibrosis. The United Network for Organ Sharing database was queried from 1987 to 2013. Univariate Cox proportional hazards, multivariate Cox models, and propensity score matching were performed. Liver transplant and liver-lung transplant were performed in 212 and 53 patients with cystic fibrosis, respectively. Univariate Cox proportional hazards regression identified lower survival in cystic fibrosis after liver transplant compared to a reference non-cystic fibrosis liver transplant cohort (HR 1.248; 95 % CI 1.012, 1.541; p = 0.039). Supplementary analysis found graft survival was similar across the 3 recipient categories (log-rank test: χ² = 2.68; p = 0.262). Multivariate Cox models identified increased mortality hazard among cystic fibrosis patients undergoing liver transplantation (HR 2.439; 95 % CI 1.709, 3.482; p < 0.001) and liver-lung transplantation (HR 2.753; 95 % CI 1.560, 4.861; p < 0.001). Propensity score matching of cystic fibrosis patients undergoing liver transplantation to non-cystic fibrosis controls identified a greater mortality hazard in the cystic fibrosis cohort using a Cox proportional hazards model stratified on matched pairs (HR 3.167; 95 % CI 1.265, 7.929, p = 0.014). Liver transplantation in cystic fibrosis is associated with poorer long-term patient survival compared to non-cystic fibrosis patients, although the difference is not due to graft survival.
Doubly Robust Additive Hazards Models to Estimate Effects of a Continuous Exposure on Survival.
Wang, Yan; Lee, Mihye; Liu, Pengfei; Shi, Liuhua; Yu, Zhi; Abu Awad, Yara; Zanobetti, Antonella; Schwartz, Joel D
2017-11-01
The effect of an exposure on survival can be biased when the regression model is misspecified. The hazard difference is easier to use in risk assessment than the hazard ratio and has a clearer interpretation in the assessment of effect modification. We proposed two doubly robust additive hazards models to estimate the causal hazard difference of a continuous exposure on survival. The first model is an inverse probability-weighted additive hazards regression. The second model is an extension of the doubly robust estimator for binary exposures, obtained by categorizing the continuous exposure. We compared these with the marginal structural model and outcome regression with correct and incorrect model specifications using simulations. We applied the doubly robust additive hazards models to the estimation of the hazard difference of long-term exposure to PM2.5 (particulate matter with an aerodynamic diameter less than or equal to 2.5 microns) on survival, using a large cohort of 13 million older adults residing in seven states of the Southeastern United States. We showed that the proposed approaches are doubly robust. We found that each 1 μg/m³ increase in annual PM2.5 exposure was associated with a causal hazard difference in mortality of 8.0 × 10 (95% confidence interval 7.4 × 10, 8.7 × 10), which was modified by age, medical history, socioeconomic status, and urbanicity. The overall hazard difference translates to approximately 5.5 (5.1, 6.0) thousand deaths per year in the study population. The proposed approaches improve the robustness of the additive hazards model and produce a novel additive causal estimate of PM2.5 on survival and several additive effect modifications, including social inequality.
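For a continuous exposure, the inverse probability weights take the form of a density ratio (a generalized propensity score); a simulated sketch follows, with the exposure model assumed Gaussian and fitted by least squares.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    x = rng.normal(size=50_000)                      # a confounder
    pm25 = 10 + 2 * x + rng.normal(0, 1, 50_000)     # continuous exposure

    # Conditional exposure density f(A|X), fitted here by least squares
    slope, intercept = np.polyfit(x, pm25, 1)
    resid = pm25 - (intercept + slope * x)
    f_cond = stats.norm.pdf(pm25, intercept + slope * x, resid.std())
    f_marg = stats.norm.pdf(pm25, pm25.mean(), pm25.std())

    w = f_marg / f_cond        # stabilized weights for a weighted (IPW)
    print(round(w.mean(), 2))  # additive hazards regression; mean should be ~1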
NASA Astrophysics Data System (ADS)
CHEN, Huali; Tokunaga, Tomochika; Ito, Yuka; Sawamukai, Marie
2014-05-01
Floods, the most common natural disaster in the world, cause serious loss of life and economic damage. Flooding is one of the major disasters in the coastal lowland of the Kujukuri Plain, Chiba Prefecture, Japan. Many natural and human activities have changed the surface environment of the Plain. These include agricultural development, urban and industrial development, changes to the drainage patterns of the land surface, deposition and/or erosion of the river valleys, and so on. In addition, widespread land subsidence has been caused by the abstraction of natural gas dissolved in groundwater. The groundwater extraction sites include locations near the coast, which may increase the flood risk. Hence, it is very important to evaluate flood hazard by taking into account the temporal change of land elevation caused by land subsidence, and to develop hazard maps for protecting the surface environment and for land-use planning. Multicriteria decision analysis (MCDA) provides methodology and techniques for analyzing complex decision problems, which often involve incommensurable data or criteria. Geographical Information Systems (GIS) are a powerful tool in this setting, since they manage the large amounts of spatial data involved in MCDA. The purpose of this study is to present a flood hazard model using MCDA techniques with GIS support in a region where primary data are scarce. The model incorporates six parameters: river system, topography, land use, flood control projects, passing floods from the coast, and precipitation. The main data sources used are 10-m-resolution topography data, airborne laser scanning data, leveling data, Landsat-TM data, two 1:30,000-scale river watershed maps, and precipitation data from observation stations around the study area. The river system map was created by merging the river order, line density, and river sink point density layers. Land-use data were derived from Landsat-TM images. A final hazard map for 2004, as an example, was obtained using an algorithm that combines the factors in a weighted linear combination. The assignment of the weight/rank values and their analysis were realized by applying the Analytic Hierarchy Process (AHP) method, as sketched below. This study is preliminary work to investigate the flood hazard at the Kujukuri Plain. Flood hazard maps for other years will be analyzed to investigate the temporal change of the flood-hazard area, and more data will be collected and added to improve the assessment.
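The AHP weighting step amounts to taking the principal eigenvector of a pairwise comparison matrix; the sketch below uses an invented 3-factor judgment matrix, not the study's weights.

    import numpy as np

    # Pairwise comparisons among three of the six factors (illustrative only):
    # river system vs. topography vs. precipitation
    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])

    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                              # AHP weights: principal eigenvector

    ci = (eigvals.real[k] - 3) / (3 - 1)      # consistency index
    print(w.round(3), round(ci / 0.58, 3))    # CR = CI/RI, with RI = 0.58 for n = 3

Each cell's hazard score is then the weighted linear combination of the standardized factor layers using these weights.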
Savonitto, Stefano; Morici, Nuccia; Nozza, Anna; Cosentino, Francesco; Perrone Filardi, Pasquale; Murena, Ernesto; Morocutti, Giorgio; Ferri, Marco; Cavallini, Claudio; Eijkemans, Marinus Jc; Stähli, Barbara E; Schrieks, Ilse C; Toyama, Tadashi; Lambers Heerspink, H J; Malmberg, Klas; Schwartz, Gregory G; Lincoff, A Michael; Ryden, Lars; Tardif, Jean Claude; Grobbee, Diederick E
2018-01-01
To define the predictors of long-term mortality in patients with type 2 diabetes mellitus and recent acute coronary syndrome. A total of 7226 patients from a randomized trial, testing the effect on cardiovascular outcomes of the dual peroxisome proliferator-activated receptor agonist aleglitazar in patients with type 2 diabetes mellitus and recent acute coronary syndrome (AleCardio trial), were analysed. Median follow-up was 2 years. The independent mortality predictors were defined using Cox regression analysis. The predictive information provided by each variable was calculated as percent of total chi-square of the model. All-cause mortality was 4.0%, with cardiovascular death contributing for 73% of mortality. The mortality prediction model included N-terminal proB-type natriuretic peptide (adjusted hazard ratio = 1.68; 95% confidence interval = 1.51-1.88; 27% of prediction), lack of coronary revascularization (hazard ratio = 2.28; 95% confidence interval = 1.77-2.93; 18% of prediction), age (hazard ratio = 1.04; 95% confidence interval = 1.02-1.05; 15% of prediction), heart rate (hazard ratio = 1.02; 95% confidence interval = 1.01-1.03; 10% of prediction), glycated haemoglobin (hazard ratio = 1.11; 95% confidence interval = 1.03-1.19; 8% of prediction), haemoglobin (hazard ratio = 1.01; 95% confidence interval = 1.00-1.02; 8% of prediction), prior coronary artery bypass (hazard ratio = 1.61; 95% confidence interval = 1.11-2.32; 7% of prediction) and prior myocardial infarction (hazard ratio = 1.40; 95% confidence interval = 1.05-1.87; 6% of prediction). In patients with type 2 diabetes mellitus and recent acute coronary syndrome, mortality prediction is largely dominated by markers of cardiac, rather than metabolic, dysfunction.
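One common reading of "percent of total chi-square" is each predictor's Wald chi-square divided by their sum; a schematic computation under that assumed interpretation, with invented coefficients, is below.

    import numpy as np

    # Invented Cox coefficients and standard errors for three predictors
    coefs = np.array([0.52, 0.82, 0.04])
    ses = np.array([0.06, 0.13, 0.01])

    wald = (coefs / ses) ** 2                  # Wald chi-square per predictor
    print((100 * wald / wald.sum()).round(1))  # percent of total prediction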
Abdel Raheem, Ali; Shin, Tae Young; Chang, Ki Don; Santok, Glen Denmer R; Alenzi, Mohamed Jayed; Yoon, Young Eun; Ham, Won Sik; Han, Woong Kyu; Choi, Young Deuk; Rha, Koon Ho
2018-06-19
To develop a predictive nomogram for chronic kidney disease-free survival probability in the long term after partial nephrectomy. A retrospective analysis was carried out of 698 patients with T1 renal tumors undergoing partial nephrectomy at a tertiary academic institution. A multivariable Cox regression analysis was carried out based on parameters proven to have an impact on postoperative renal function. Patients with incomplete data, <12 months follow up and preoperative chronic kidney disease stage III or greater were excluded. The study end-points were to identify independent risk factors for new-onset chronic kidney disease development, as well as to construct a predictive model for chronic kidney disease-free survival probability after partial nephrectomy. The median age was 52 years, median tumor size was 2.5 cm and mean warm ischemia time was 28 min. A total of 91 patients (13.1%) developed new-onset chronic kidney disease at a median follow up of 60 months. The chronic kidney disease-free survival rates at 1, 3, 5 and 10 year were 97.1%, 94.4%, 85.3% and 70.6%, respectively. On multivariable Cox regression analysis, age (1.041, P = 0.001), male sex (hazard ratio 1.653, P < 0.001), diabetes mellitus (hazard ratio 1.921, P = 0.046), tumor size (hazard ratio 1.331, P < 0.001) and preoperative estimated glomerular filtration rate (hazard ratio 0.937, P < 0.001) were independent predictors for new-onset chronic kidney disease. The C-index for chronic kidney disease-free survival was 0.853 (95% confidence interval 0.815-0.895). We developed a novel nomogram for predicting the 5-year chronic kidney disease-free survival probability after on-clamp partial nephrectomy. This model might have an important role in partial nephrectomy decision-making and follow-up plan after surgery. External validation of our nomogram in a larger cohort of patients should be considered. © 2018 The Japanese Urological Association.
NASA Astrophysics Data System (ADS)
Spanoudaki, Katerina; Nikiforakis, Ioannis K.; Kampanis, Nikolaos A.
2017-04-01
Developing effective early warning and coordination systems can save thousands of lives and protect people, property and the environment in the event of natural and man-made disasters. In its document "Towards Better Protection of Citizens against Disaster Risks: Strengthening Early Warning Systems in Europe", the Commission points out that it seeks to follow a multi-hazard approach, to develop near real time alert systems, to ensure a near real time dissemination of alerts to Participating States, and to improve its rapid analytical capacity. In this context, the EU project DECATASTROPHIZE (http://decatastrophize.eu/project/), co-financed by the EU Humanitarian Aid and Civil Protection, aims to develop a Geospatial Early warning Decision Support System (GE-DSS) to assess, prepare for and respond to multiple and/or simultaneous natural and man-made hazards, disasters, and environmental incidents by using existing models/systems in each partner country (Cyprus, France, Greece, Italy and Spain) in a synergistic way on ONE multi-platform, called DECAT. Specifically, project partners will establish appropriate geo-databases for test areas and use existing hazard models to produce hazard and vulnerability geo-spatial information for earthquakes, landslides, tsunamis, floods, forest fires and marine oil spills. The GE-DSS will consist of one source code with six geodatabases, i.e., one for each partner, holding the risk data for the respective test area. Each partner organization will be able to manage and monitor its own data/database and the results using Multi-Criteria Decision Analysis (MCDA). The GE-DSS will be demonstrated at the local, regional and national levels through a set of Command Post and Table Top Disaster Exercises. As part of the DECAT GE-DSS, the GIS-based geo-database and assessment of marine oil spill hazard and environmental vulnerability for the coasts of Crete in the South Aegean Sea are presented here. Environmental Sensitivity Index (ESI) maps are produced comprising shoreline classification - ranked according to a scale relating to sensitivity, natural persistence of oil, and ease of cleanup (Adler and Inbar, 2007; Alves et al., 2014) - biological resources and human-use resources, i.e. specific areas that have added sensitivity and value because of their use, such as beaches, water intakes, and archaeological sites (NOAA, 2002). Seasonal hazard maps (surface oil slick, beached oil) are produced employing a modified version of the open source Lagrangian oil spill fate and transport model MEDSLIK-II (http://medslikii.bo.ingv.it/) coupled with a high-resolution 3D hydrodynamic model. The model predicts the transport and weathering of oil spills following a Lagrangian approach for the solution of the advection-diffusion equation. Transport is governed by the 3D sea currents and wave field. In addition to advective and diffusive displacements, the model simulates several physical and chemical processes that transform the oil (evaporation, emulsification, dispersion in the water column, biodegradation, adhesion to coast). The analysis is carried out under multiple oil spill scenarios accounting for the busiest ship lanes and meteorological conditions, using multiple-year hydrodynamics. The results highlight the hazard faced by coastal areas of Crete with high ESI. Acknowledgement This work has been co-financed by the EU Humanitarian Aid and Civil Protection under Grant Agreement No.
ECHO/SUB/2015/713788/PREP - Project "DECATASTROPHIZE: UsE of SDSS and MCDA To prepAre for diSasTeRs Or Plan for multIplE HaZards". References Adler, E. & Inbar, M. (2007). Shoreline sensitivity to oil spills, the Mediterranean coast of Israel: Assessment and analysis. Ocean Coast. Manage., 50 (1-2), 24-34. Alves, T. M., Kokinou, E. & Zodiatis, G. (2014). A three-step model to assess shoreline and offshore susceptibility to oil spills: The South Aegean (Crete) as an analogue for confined marine basins. Mar. Poll. Bull., 86, 443-457. NOAA (2002). Environmental Sensitivity Index Guidelines, version 3.0. NOAA Technical Memorandum NOS OR&R 11. Seattle: Hazardous Response and Assessment Division, National Oceanic and Atmospheric Administration, 129p.
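The Lagrangian transport scheme underlying such models advects particles with the current and adds a random-walk diffusion step; a bare-bones 2D sketch of that scheme follows, with a constant current, constant diffusivity and no weathering assumed.

    import numpy as np

    rng = np.random.default_rng(7)
    n, dt, steps = 2000, 600.0, 144       # particles, time step (s), 24 h total
    u, v = 0.2, -0.05                     # assumed constant current (m/s)
    K = 1.0                               # assumed horizontal diffusivity (m^2/s)

    xy = np.zeros((n, 2))                 # all particles start at the spill site
    for _ in range(steps):
        xy += np.array([u, v]) * dt                         # advection
        xy += rng.normal(0.0, np.sqrt(2 * K * dt), (n, 2))  # diffusion

    print(xy.mean(axis=0))                # slick centroid after 24 h (m)
    print(xy.std(axis=0))                 # spread of the slick (m)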
NASA Astrophysics Data System (ADS)
Chen, Lixia; van Westen, Cees J.; Hussin, Haydar; Ciurean, Roxana L.; Turkington, Thea; Chavarro-Rincon, Diana; Shrestha, Dhruba P.
2016-11-01
Extreme rainfall events are the main triggers of hydro-meteorological hazards in mountainous areas, where development is often constrained by the limited space suitable for construction. In these areas, hazard and risk assessments are fundamental for risk mitigation, especially for preventive planning, risk communication and emergency preparedness. Multi-hazard risk assessment in mountainous areas at local and regional scales remains a major challenge because of the lack of data related to past events and causal factors, and the interactions between different types of hazards. The lack of data leads to a high level of uncertainty in the application of quantitative methods for hazard and risk assessment. Therefore, a systematic approach is required to combine these quantitative methods with expert-based assumptions and decisions. In this study, a quantitative multi-hazard risk assessment was carried out in the Fella River valley, prone to debris flows and floods, in the north-eastern Italian Alps. The main steps include data collection and development of inventory maps, definition of hazard scenarios, hazard assessment in terms of temporal and spatial probability calculation and intensity modelling, elements-at-risk mapping, estimation of asset values and numbers of people, physical vulnerability assessment, the generation of risk curves and annual risk calculation. To compare the risk for each type of hazard, risk curves were generated for debris flows, river floods and flash floods. Uncertainties were expressed as minimum, average and maximum values of temporal and spatial probability, replacement costs of assets, population numbers, and physical vulnerability, resulting in minimum, average and maximum risk curves. To validate this approach, a back analysis was conducted using the extreme hydro-meteorological event that occurred in August 2003 in the Fella River valley. The results show good performance when compared with historical damage reports.
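Annual risk is then obtained by integrating each risk curve; a minimal sketch with an invented loss-exceedance curve is below. Repeating it for the minimum, average and maximum curves yields the corresponding risk bounds.

    import numpy as np

    # Illustrative risk curve: annual exceedance probability vs. loss (EUR)
    loss = np.array([0.1e6, 0.5e6, 2e6, 10e6, 50e6])
    aep = np.array([0.2, 0.05, 0.01, 0.002, 0.0002])

    # Expected annual loss = area under the loss-exceedance curve
    eal = np.trapz(aep, loss)
    print(f"EAL = {eal:,.0f} EUR/year")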
NASA Technical Reports Server (NTRS)
Chirico, Peter G.
2007-01-01
This viewgraph presentation provides USGS/USAID natural resource assessments in Afghanistan through the mapping of coal, oil and natural gas, minerals, hydrologic resources and earthquake and flood hazards.
2013-01-01
Background: An inverse relationship between experience and risk of injury has been observed in many occupations. Due to statistical challenges, however, it has been difficult to characterize the role of experience in the hazard of injury. In particular, because the time observed up to injury is equivalent to the amount of experience accumulated, the baseline hazard of injury becomes the main parameter of interest, excluding Cox proportional hazards models as applicable methods for consideration. Methods: Using a data set of 81,301 hourly production workers of a global aluminum company at 207 US facilities, we compared competing parametric models for the baseline hazard to assess whether experience affected the hazard of injury at hire and after later job changes. Specific models considered included the exponential, Weibull, and two two-piece exponential models (one hypothesis-driven and one data-driven) to formally test the null hypothesis that experience does not impact the hazard of injury. Results: We highlighted the advantages of our comparative approach and the interpretability of our selected model: a two-piece exponential model that allowed the baseline hazard of injury to change with experience. Our findings suggested a 30% increase in the hazard in the first year after job initiation and/or change. Conclusions: Piecewise exponential models may be particularly useful in modeling risk of injury as a function of experience and have the additional benefit of interpretability over other similarly flexible models. PMID:23841648
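The two-piece exponential model has a closed-form MLE: the hazard in each piece is the number of events divided by the person-time accrued in that piece. A sketch with simulated tenure data follows; the change point and censoring scheme are invented.

    import numpy as np

    def two_piece_rates(time, event, cut=1.0):
        """MLE hazards for a two-piece exponential model with a change
        point at `cut`: events / person-time within each piece."""
        pt_early = np.minimum(time, cut).sum()        # person-time before cut
        pt_late = np.maximum(time - cut, 0.0).sum()   # person-time after cut
        d_early = ((time <= cut) & event).sum()
        d_late = ((time > cut) & event).sum()
        return d_early / pt_early, d_late / pt_late

    rng = np.random.default_rng(3)
    t = rng.exponential(2.0, 50_000)         # toy times-to-injury (years)
    event = t < 4.0                          # administrative censoring
    t = np.minimum(t, 4.0)                   # at four years of tenure
    print([round(r, 3) for r in two_piece_rates(t, event)])  # both ~0.5 here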
Examining social, physical, and environmental dimensions of tornado vulnerability in Texas.
Siebeneck, Laura
2016-01-01
To develop a vulnerability model that captures the social, physical, and environmental dimensions of tornado vulnerability of Texas counties. Guided by previous research and methodologies proposed in the hazards and emergency management literature, a principal components analysis is used to create a tornado vulnerability index. Data were gathered from open source information available through the US Census Bureau, American Community Surveys, and the Texas Natural Resources Information System. Texas counties. The results of the model yielded three indices that highlight the geographic variability of social vulnerability, built-environment vulnerability, and tornado hazard throughout Texas. Further analyses suggest that the counties with the highest tornado vulnerability include those with high population densities and high tornado risk. This article demonstrates one method for assessing statewide tornado vulnerability and presents how the results of this type of analysis can be applied by emergency managers towards the reduction of tornado vulnerability in their communities.
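The index construction can be sketched as standardizing the county indicators and combining principal component scores; the sketch below uses random stand-ins for the census variables and one possible weighting, not the article's.

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    X = rng.normal(size=(254, 5))   # 254 Texas counties x 5 hypothetical indicators

    Z = StandardScaler().fit_transform(X)        # PCA needs standardized inputs
    pca = PCA(n_components=2).fit(Z)
    scores = pca.transform(Z)

    # One possible index: component scores weighted by explained variance
    index = scores @ pca.explained_variance_ratio_
    print(pca.explained_variance_ratio_.round(2), index[:3].round(2))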
NASA Astrophysics Data System (ADS)
Daniell, James; Simpson, Alanna; Gunasekara, Rashmin; Baca, Abigail; Schaefer, Andreas; Ishizawa, Oscar; Murnane, Rick; Tijssen, Annegien; Deparday, Vivien; Forni, Marc; Himmelfarb, Anne; Leder, Jan
2015-04-01
Over the past few decades, a plethora of open access software packages for the calculation of earthquake, volcanic, tsunami, storm surge, wind and flood risk have been produced globally. As part of the World Bank GFDRR Review released at the Understanding Risk 2014 Conference, over 80 such open access risk assessment software packages were examined. Commercial software was not considered in the evaluation. A preliminary analysis was used to determine whether the 80 models were currently supported and whether they were open access. This process was used to select a subset of 31 models that include 8 earthquake models, 4 cyclone models, 11 flood models, and 8 storm surge/tsunami models for more detailed analysis. By using multi-criteria decision analysis (MCDA) and simple descriptions of the software uses, the review allows users to select a few relevant software packages for their own testing and development. The detailed analysis evaluated the models on the basis of over 100 criteria and provides a synopsis of available open access natural hazard risk modelling tools. In addition, volcano software packages have since been added, making the compendium of risk software tools in excess of 100. There has been a huge increase in the quality and availability of open access/source software over the past few years. For example, private entities such as Deltares now have an open source policy regarding some flood models (NGHS). In addition, leaders in developing risk models in the public sector, such as Geoscience Australia (EQRM, TCRM, TsuDAT, AnuGA) or CAPRA (ERN-Flood, Hurricane, CRISIS2007 etc.), are launching and/or helping many other initiatives. As we achieve greater interoperability between modelling tools, we will also achieve a future wherein different open source and open access modelling tools will be increasingly connected and adapted towards unified multi-risk model platforms and highly customised solutions. Many software tools could be improved by enabling user-defined exposure and vulnerability; without this function, many tools can only be used regionally and not at global or continental scale. It is becoming increasingly easy to use multiple packages for a single region and/or hazard to characterize the uncertainty in the risk, or to use them as checks on the sensitivities in the analysis. There is potential for valuable synergy between existing software: a number of open source software packages could be combined to generate a multi-risk model with multiple views of a hazard. This extensive review has attempted to provide a platform for dialogue between all open source and open access software packages and to inspire collaboration between developers, given the great work done by all open access and open source developers.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Babendreier, Justin E.; Castleton, Karl J.
2005-08-01
Elucidating uncertainty and sensitivity structures in environmental models can be a difficult task, even for low-order, single-medium constructs driven by a unique set of site-specific data. Quantitative assessment of integrated, multimedia models that simulate hundreds of sites, spanning multiple geographical and ecological regions, will ultimately require a comparative approach using several techniques, coupled with sufficient computational power. The Framework for Risk Analysis in Multimedia Environmental Systems - Multimedia, Multipathway, and Multireceptor Risk Assessment (FRAMES-3MRA) is an important software model being developed by the United States Environmental Protection Agency for use in risk assessment of hazardous waste management facilities. The 3MRA modeling system includes a set of 17 science modules that collectively simulate release, fate and transport, exposure, and risk associated with hazardous contaminants disposed of in land-based waste management units (WMU).
Site-specific seismic probabilistic tsunami hazard analysis: performances and potential applications
NASA Astrophysics Data System (ADS)
Tonini, Roberto; Volpe, Manuela; Lorito, Stefano; Selva, Jacopo; Orefice, Simone; Graziani, Laura; Brizuela, Beatriz; Smedile, Alessandra; Romano, Fabrizio; De Martini, Paolo Marco; Maramai, Alessandra; Piatanesi, Alessio; Pantosti, Daniela
2017-04-01
Seismic Probabilistic Tsunami Hazard Analysis (SPTHA) provides probabilities to exceed different thresholds of tsunami hazard intensity, at a specific site or region and in a given time span, for tsunamis caused by seismic sources. Results obtained by SPTHA (i.e., probabilistic hazard curves and inundation maps) represent a very important input to risk analyses and land use planning. However, the large variability of source parameters implies the definition of a huge number of potential tsunami scenarios, whose omission could lead to a biased analysis. Moreover, tsunami propagation from source to target requires the use of very expensive numerical simulations. At regional scale, the computational cost can be reduced using assumptions on the tsunami modeling (i.e., neglecting non-linear effects, using coarse topo-bathymetric meshes, empirically extrapolating maximum wave heights on the coast). On the other hand, moving to local scale, a much higher resolution is required and such assumptions no longer hold, since detailed inundation maps require significantly greater computational resources. In this work we apply a multi-step method to perform a site-specific SPTHA, which can be summarized in the following steps: i) perform a regional hazard assessment to account for both the aleatory and epistemic uncertainties of the seismic source, by combining the use of an event tree and an ensemble modeling technique; ii) apply a filtering procedure which uses a cluster analysis to define a significantly reduced number of representative scenarios contributing to the hazard of a specific target site; iii) perform high-resolution numerical simulations only for these representative scenarios and for a subset of near-field sources placed in very shallow waters and/or whose coseismic displacements induce ground uplift or subsidence at the target. The method is applied to three target areas in the Mediterranean, located around the cities of Milazzo (Italy), Thessaloniki (Greece) and Siracusa (Italy). The latter target analysis is enriched by the use of local observed tsunami data, both geological and historical. Indeed, the tsunami data sets available for Siracusa are particularly rich with respect to the scarce and heterogeneous data sets usually available elsewhere. Therefore, they can represent a further valuable source of information to benchmark and strengthen the results of such studies. The work is funded by the Italian Flagship Project RITMARE, the two EC FP7 ASTARTE (Grant agreement 603839) and STREST (Grant agreement 603389) projects, the TSUMAPS-NEAM (Grant agreement ECHO/SUB/2015/718568/PREV26) project and the INGV-DPC Agreement.
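A minimal sketch of the filtering idea in step ii), assuming the regional results are stored as a scenarios-by-offshore-points matrix of maximum wave amplitudes: cluster the scenarios and keep one representative per cluster for the expensive inundation runs. The array shapes, cluster count, and use of k-means are assumptions for illustration, not the study's exact procedure.

```python
import numpy as np
from sklearn.cluster import KMeans

# Toy regional SPTHA output: 5000 scenarios x 40 offshore points (assumed).
rng = np.random.default_rng(0)
offshore_hmax = rng.lognormal(0.0, 1.0, size=(5000, 40))

kmeans = KMeans(n_clusters=100, n_init=10, random_state=0).fit(offshore_hmax)

# For each cluster, pick the member closest to the centroid as representative.
representatives = []
for k in range(kmeans.n_clusters):
    members = np.where(kmeans.labels_ == k)[0]
    d = np.linalg.norm(offshore_hmax[members] - kmeans.cluster_centers_[k],
                       axis=1)
    representatives.append(members[np.argmin(d)])

print(len(representatives),
      "scenarios selected for high-resolution inundation modeling")
```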
14 CFR 417.227 - Toxic release hazard analysis.
Code of Federal Regulations, 2010 CFR
2010-01-01
Title 14, Aeronautics and Space, Department of Transportation, Licensing, Launch Safety, Flight Safety Analysis, § 417.227 Toxic release hazard analysis: A flight safety analysis must establish flight commit criteria that protect the public from any...
Dynamic frailty models based on compound birth-death processes.
Putter, Hein; van Houwelingen, Hans C
2015-07-01
Frailty models are used in survival analysis to model unobserved heterogeneity. They accommodate such heterogeneity by the inclusion of a random term, the frailty, which is assumed to multiply the hazard of a subject (individual frailty) or the hazards of all subjects in a cluster (shared frailty). Typically, the frailty term is assumed to be constant over time. This is a restrictive assumption and extensions to allow for time-varying or dynamic frailties are of interest. In this paper, we extend the auto-correlated frailty models of Henderson and Shimakura and of Fiocco, Putter and van Houwelingen, developed for longitudinal count data and discrete survival data, to continuous survival data. We present a rigorous construction of the frailty processes in continuous time based on compound birth-death processes. When the frailty processes are used as mixtures in models for survival data, we derive the marginal hazards and survival functions and the marginal bivariate survival functions and cross-ratio function. We derive distributional properties of the processes, conditional on observed data, and show how to obtain the maximum likelihood estimators of the parameters of the model using a (stochastic) expectation-maximization algorithm. The methods are applied to a publicly available data set. © The Author 2015. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
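For reference, the time-constant frailty setup that the paper generalizes can be written as follows. These are standard textbook results for a gamma-distributed frailty with unit mean and variance θ; the notation is ours, not taken from the paper.

```latex
% Conditional hazard given frailty Z, and the marginal survival under a
% gamma frailty with mean 1 and variance \theta (standard results):
\lambda(t \mid Z) = Z\,\lambda_0(t), \qquad
S(t) = \mathbb{E}\!\left[e^{-Z\Lambda_0(t)}\right]
     = \bigl(1 + \theta\,\Lambda_0(t)\bigr)^{-1/\theta},
\qquad \Lambda_0(t) = \int_0^t \lambda_0(u)\,du .
```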
Austin, Peter C; Wagner, Philippe; Merlo, Juan
2017-03-15
Multilevel data occurs frequently in many research areas like health services research and epidemiology. A suitable way to analyze such data is through the use of multilevel regression models (MLRM). MLRM incorporate cluster-specific random effects which allow one to partition the total individual variance into between-cluster variation and between-individual variation. Statistically, MLRM account for the dependency of the data within clusters and provide correct estimates of uncertainty around regression coefficients. Substantively, the magnitude of the effect of clustering provides a measure of the General Contextual Effect (GCE). When outcomes are binary, the GCE can also be quantified by measures of heterogeneity like the Median Odds Ratio (MOR) calculated from a multilevel logistic regression model. Time-to-event outcomes within a multilevel structure occur commonly in epidemiological and medical research. However, the Median Hazard Ratio (MHR) that corresponds to the MOR in multilevel (i.e., 'frailty') Cox proportional hazards regression is rarely used. Analogously to the MOR, the MHR is the median relative change in the hazard of the occurrence of the outcome when comparing identical subjects from two randomly selected different clusters that are ordered by risk. We illustrate the application and interpretation of the MHR in a case study analyzing the hazard of mortality in patients hospitalized for acute myocardial infarction at hospitals in Ontario, Canada. We provide R code for computing the MHR. The MHR is a useful and intuitive measure for expressing cluster heterogeneity in the outcome and, thereby, estimating general contextual effects in multilevel survival analysis. © 2016 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd. © 2016 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd.
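For orientation, the closed-form expression commonly given for the MHR, assuming normally distributed cluster random effects (log-frailties) with variance σ², parallels the MOR; this is the standard form, and readers should check the paper for the exact estimator used.

```latex
% Median hazard ratio under normally distributed random effects with
% variance \sigma^2; \Phi^{-1} is the standard normal quantile function:
\mathrm{MHR} = \exp\!\left(\sqrt{2\sigma^{2}}\;\Phi^{-1}(0.75)\right).
```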
Farré, Núria; Aranyó, Júlia; Enjuanes, Cristina; Verdú-Rotellar, José María; Ruiz, Sonia; Gonzalez-Robledo, Gina; Meroño, Oona; de Ramon, Marta; Moliner, Pedro; Bruguera, Jordi; Comin-Colet, Josep
2015-02-15
Obese patients with chronic heart failure (HF) have better outcomes than their lean counterparts, although little is known about the pathophysiology of this obesity paradox. Our aim was to evaluate the hypothesis that patients with chronic HF and obesity (defined as body mass index (BMI) ≥30 kg/m²) may have attenuated neurohormonal activation in comparison with non-obese patients. The present study is a post-hoc analysis of a cohort of 742 chronic HF patients from a single-center study evaluating sympathetic activation by measuring baseline levels of norepinephrine (NE). Obesity was present in 33% of patients. Higher BMI and obesity were significantly associated with lower NE levels in multivariable linear regression models adjusted for covariates (p<0.001). Adding NE to multivariate Cox proportional hazards models attenuated the prognostic impact of BMI on outcomes. Finally, when we explored the prognostic impact of raised NE levels (>70th percentile) in separate analyses of obese and non-obese patients, we found that NE remained a significant independent predictor of poorer outcomes in both groups, despite the lower NE levels in patients with chronic HF and obesity: all-cause mortality hazard ratio=2.37 (95% confidence interval, 1.14-4.94) and hazard ratio=1.59 (95% confidence interval, 1.05-2.4) in obese and non-obese patients, respectively; and cardiovascular mortality hazard ratio=3.08 (95% confidence interval, 1.05-9.01) in obese patients and hazard ratio=2.08 (95% confidence interval, 1.42-3.05) in non-obese patients. Patients with chronic HF and obesity have significantly lower sympathetic activation. This finding may partially explain the obesity paradox described in chronic HF patients. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Geospatial Data Integration for Assessing Landslide Hazard on Engineered Slopes
NASA Astrophysics Data System (ADS)
Miller, P. E.; Mills, J. P.; Barr, S. L.; Birkinshaw, S. J.
2012-07-01
Road and rail networks are essential components of national infrastructures, underpinning the economy, and facilitating the mobility of goods and the human workforce. Earthwork slopes such as cuttings and embankments are primary components, and their reliability is of fundamental importance. However, instability and failure can occur, through processes such as landslides. Monitoring the condition of earthworks is a costly and continuous process for network operators, and currently, geospatial data is largely underutilised. The research presented here addresses this by combining airborne laser scanning and multispectral aerial imagery to develop a methodology for assessing landslide hazard. This is based on the extraction of key slope stability variables from the remotely sensed data. The methodology is implemented through numerical modelling, which is parameterised with the slope stability information, simulated climate conditions, and geotechnical properties. This allows determination of slope stability (expressed through the factor of safety) for a range of simulated scenarios. Regression analysis is then performed in order to develop a functional model relating slope stability to the input variables. The remotely sensed raster datasets are robustly re-sampled to two-dimensional cross-sections to facilitate meaningful interpretation of slope behaviour and mapping of landslide hazard. Results are stored in a geodatabase for spatial analysis within a GIS environment. For a test site located in England, UK, results have shown the utility of the approach in deriving practical hazard assessment information. Outcomes were compared to the network operator's hazard grading data, and show general agreement. The utility of the slope information was also assessed with respect to auto-population of slope geometry, and found to deliver significant improvements over the network operator's existing field-based approaches.
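For readers unfamiliar with the stability measure used here, the generic factor-of-safety definition underlying this kind of modelling is given below. This is standard geotechnical usage, not the paper's specific formulation.

```latex
% Generic factor of safety for a slope:
\mathrm{FS} = \frac{\tau_f}{\tau}
            = \frac{\text{available shear strength}}
                   {\text{mobilised shear stress}},
\qquad \mathrm{FS} \le 1 \;\Rightarrow\; \text{failure predicted.}
```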
Conceptual Development of a National Volcanic Hazard Model for New Zealand
NASA Astrophysics Data System (ADS)
Stirling, Mark; Bebbington, Mark; Brenna, Marco; Cronin, Shane; Christophersen, Annemarie; Deligne, Natalia; Hurst, Tony; Jolly, Art; Jolly, Gill; Kennedy, Ben; Kereszturi, Gabor; Lindsay, Jan; Neall, Vince; Procter, Jonathan; Rhoades, David; Scott, Brad; Shane, Phil; Smith, Ian; Smith, Richard; Wang, Ting; White, James D. L.; Wilson, Colin J. N.; Wilson, Tom
2017-06-01
We provide a synthesis of a workshop held in February 2016 to define the goals, challenges and next steps for developing a national probabilistic volcanic hazard model for New Zealand. The workshop involved volcanologists, statisticians, and hazards scientists from GNS Science, Massey University, University of Otago, Victoria University of Wellington, University of Auckland, and University of Canterbury. We also outline key activities that will develop the model components, define procedures for periodic update of the model, and effectively articulate the model to end-users and stakeholders. The development of a National Volcanic Hazard Model is a formidable task that will require long-term stability in terms of team effort, collaboration and resources. Development of the model in stages or editions that are modular will make the process a manageable one that progressively incorporates additional volcanic hazards over time, and additional functionalities (e.g. short-term forecasting). The first edition is likely to be limited to updating and incorporating existing ashfall hazard models, with the other hazards associated with lahar, pyroclastic density currents, lava flow, ballistics, debris avalanche, and gases/aerosols being considered in subsequent updates.
Rossitsa River Basin: Flood Hazard and Risk Identification
NASA Astrophysics Data System (ADS)
Mavrova-Guirguinova, Maria; Pencheva, Denislava
2017-04-01
Flood risk management planning and the adoption of measures for flood risk reduction, such as early warning, call for surveys addressing risk identification. This project presents risk identification combining two lines of analysis: (1) creation of a mathematical model of rainfall-runoff processes in a watershed based on a limited number of observed input and output variables; and (2) procedures for determining critical thresholds, i.e. discharges/water levels corresponding to certain consequences. The pilot region is the Rossitsa river basin, Sevlievo, Bulgaria. The first line of analysis follows these steps: (a) creation and calibration of Unit Hydrograph Models (UHM) based on a limited number of observed discharge and precipitation data; the survey for the selected region comprises 22 observations of excess rainfall and discharge. (b) Statistical determination of the relations between the UHM coefficients and the input parameters, except for the runoff coefficient, which is modelled as a function of three parameters (amount of precipitation two days before, soil condition, and rainfall intensity) using a feedforward artificial neural network (ANN), as sketched below. (c) Additional simulations with the UHM to generate synthetic rainfall-runoff events that extend the range of observed data. (d) Training, validation and testing of a generalized regional ANN model for discharge forecasting with four input parameters, where the training data set consists of synthetic data and the validation and testing data sets consist of observations. The second line of analysis, concerning the determination of critical hazard levels, establishes a function between consequences and discharges. Unsteady simulations with the hydraulic model, using three typical hydrographs, determine the time available for reaction from one critical threshold to the next. The critical thresholds are then corrected to provide the necessary reaction time between thresholds, and a probability analysis of the finally determined thresholds is performed. The result of the described method is a catalogue for off-line flood hazard and risk identification, which can be used as an interactive computer system based on simulations of the ANN "Catalogue". Flood risk identification for a future rainfall event is made in a multi-dimensional space for each kind of soil condition (dry, average wet and wet) and the observed amount of precipitation two days before. Rainfall-runoff scenarios for intensive rainfall and for sustained rainfall (more than 6 hours) are taken into account. Critical thresholds and hazard zones requiring specific operative activities (rescue and recovery), corresponding to each of the regulated flood protection levels (unit, municipality, regional or national), are presented. The Catalogue supports the extraction of flood hazard scenarios; it is therefore useful at the prevention stage of flood protection planning (planning of emergency operations, measures and resources for their implementation) and for creating scenarios for exercising the emergency plans. For early warning applications, it gives an approximate forecast of flood hazard and supplies a reaction time of about 24 hours, enabling early warning of the responsible authorities, all parts of the Unified Rescue System, and members of the relevant headquarters for disaster protection (at municipality, regional or national level).
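A minimal sketch of the feedforward-ANN idea from step (b): predict a runoff coefficient from antecedent precipitation, a soil-condition index, and rainfall intensity. The synthetic data, target relationship, and network size are illustrative assumptions, not the study's values.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
n = 500
X = np.column_stack([
    rng.uniform(0, 50, n),    # precipitation in the two preceding days (mm)
    rng.integers(0, 3, n),    # soil condition: 0 dry, 1 average wet, 2 wet
    rng.uniform(1, 30, n),    # rainfall intensity (mm/h)
])
# Toy target: wetter soils and heavier rain give higher runoff coefficients.
y = np.clip(0.1 + 0.005 * X[:, 0] + 0.15 * X[:, 1] + 0.01 * X[:, 2]
            + rng.normal(0, 0.02, n), 0, 1)

model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=1)
model.fit(X[:400], y[:400])                                   # training set
print("test R^2:", round(model.score(X[400:], y[400:]), 2))   # held-out test
```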
Effect and clinical prediction of worsening renal function in acute decompensated heart failure.
Breidthardt, Tobias; Socrates, Thenral; Noveanu, Markus; Klima, Theresia; Heinisch, Corinna; Reichlin, Tobias; Potocki, Mihael; Nowak, Albina; Tschung, Christopher; Arenja, Nisha; Bingisser, Roland; Mueller, Christian
2011-03-01
We aimed to establish the prevalence and effect of worsening renal function (WRF) on survival among patients with acute decompensated heart failure. Furthermore, we sought to establish a risk score for the prediction of WRF and to externally validate the previously established Forman risk score. A total of 657 consecutive patients with acute decompensated heart failure presenting to the emergency department and undergoing serial creatinine measurements were enrolled. The potential of the clinical parameters at admission to predict WRF was assessed as the primary end point. The secondary end point was all-cause mortality at 360 days. Of the 657 patients, 136 (21%) developed WRF, and 220 patients died during the first year. WRF was more common in the nonsurvivors (30% vs 41%, p = 0.03). Multivariate regression analysis found WRF to independently predict mortality (hazard ratio 1.92, p <0.01). In a single-parameter model, previously diagnosed chronic kidney disease was the only independent predictor of WRF and achieved an area under the receiver operating characteristic curve of 0.60. After the inclusion of the blood gas analysis parameters into the model, history of chronic kidney disease (hazard ratio 2.13, p = 0.03), outpatient diuretics (hazard ratio 5.75, p <0.01), and bicarbonate (hazard ratio 0.91, p <0.01) were all predictive of WRF. A risk score was developed using these predictors. On receiver operating characteristic curve analysis, the Forman and Basel prediction rules achieved areas under the curve of 0.65 and 0.71, respectively. In conclusion, WRF was common in patients with acute decompensated heart failure and was linked to significantly worse outcomes. However, the clinical parameters failed to adequately predict its occurrence, making a tailored therapy approach impossible. Copyright © 2011 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Volpe, M.; Selva, J.; Tonini, R.; Romano, F.; Lorito, S.; Brizuela, B.; Argyroudis, S.; Salzano, E.; Piatanesi, A.
2016-12-01
Seismic Probabilistic Tsunami Hazard Analysis (SPTHA) is a methodology to assess the exceedance probability for different thresholds of tsunami hazard intensity, at a specific site or region in a given time period, due to seismic sources. A large number of high-resolution inundation simulations is typically required to take into account the full variability of potential seismic sources and their slip distributions. Starting from regional SPTHA offshore results, the computational cost can be reduced by considering only a subset of 'important' scenarios for the inundation calculations. We here use a method based on an event tree for the treatment of the aleatory variability of the seismic source, a cluster analysis of the offshore results to define the important sources, and an ensemble modeling approach for the treatment of epistemic uncertainty. We consider two target sites in the Mediterranean (Milazzo, Italy, and Thessaloniki, Greece) where coastal (non-nuclear) critical infrastructures (CIs) are located. After performing a regional SPTHA covering the whole Mediterranean, for each target site a few hundred representative scenarios are filtered out of all the potential seismic sources and the tsunami inundation is explicitly modeled, yielding a site-specific SPTHA with a complete characterization of the tsunami hazard in terms of flow depth and velocity time histories. Moreover, we also explore the variability of SPTHA at the target site accounting for coseismic deformation (i.e. uplift or subsidence) due to near-field sources located in very shallow water. The results are suitable for, and will be applied to, subsequent multi-hazard risk analysis for the CIs. These applications have been developed in the framework of the Italian Flagship Project RITMARE, the EC FP7 ASTARTE (Grant agreement 603839) and STREST (Grant agreement 603389) projects, and the INGV-DPC Agreement.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Nathanael J. K.; Gearhart, Jared Lee; Jones, Dean A.
Currently, much of protection planning is conducted separately for each infrastructure and hazard. Limited funding requires a balance of expenditures between terrorism and natural hazards based on potential impacts. This report documents the results of a Laboratory Directed Research & Development (LDRD) project that created a modeling framework for investment planning in interdependent infrastructures focused on multiple hazards, including terrorism. To develop this framework, three modeling elements were integrated: natural hazards, terrorism, and interdependent infrastructures. For natural hazards, a methodology was created for specifying events consistent with regional hazards. For terrorism, we modeled the terrorists' actions based on assumptions regarding their knowledge, goals, and target identification strategy. For infrastructures, we focused on predicting post-event performance due to specific terrorist attacks and natural hazard events, tempered by appropriate infrastructure investments. We demonstrate the utility of this framework with various examples, including protection of electric power, roadway, and hospital networks.
Qiao, Yuanhua; Keren, Nir; Mannan, M Sam
2009-08-15
Risk assessment and management of transportation of hazardous materials (HazMat) require the estimation of accident frequency. This paper presents a methodology to estimate hazardous materials transportation accident frequency by utilizing publicly available databases and expert knowledge. The estimation process addresses route-dependent and route-independent variables. Negative binomial regression is applied to an analysis of the Department of Public Safety (DPS) accident database to derive basic accident frequency as a function of route-dependent variables, while the effects of route-independent variables are modeled by fuzzy logic. The integrated methodology provides the basis for an overall transportation risk analysis, which can be used later to develop a decision support system.
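A minimal sketch of the route-dependent part of this methodology: a negative binomial regression of accident counts on route attributes. The covariates and synthetic data are illustrative assumptions; the paper fits the DPS accident database with its own variable set.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 300
length_km = rng.uniform(1, 100, n)     # segment length (assumed covariate)
aadt = rng.uniform(1e3, 5e4, n)        # traffic volume (assumed covariate)
X = sm.add_constant(np.column_stack([np.log(length_km), np.log(aadt)]))

# Overdispersed synthetic counts: Poisson rate with gamma (mean-1) mixing.
mu = np.exp(-8 + 1.0 * np.log(length_km) + 0.6 * np.log(aadt))
counts = rng.poisson(mu * rng.gamma(2.0, 0.5, n))

nb_model = sm.GLM(counts, X, family=sm.families.NegativeBinomial(alpha=0.5))
result = nb_model.fit()
print(result.params)  # intercept and elasticities w.r.t. length and traffic
```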
NASA Astrophysics Data System (ADS)
Wood, N. J.; Jones, J.; Spielman, S.
2013-12-01
Near-field tsunami hazards are credible threats to many coastal communities throughout the world. Along the U.S. Pacific Northwest coast, low-lying areas could be inundated by a series of catastrophic tsunami waves that begin to arrive in a matter of minutes following a Cascadia subduction zone (CSZ) earthquake. This presentation summarizes analytical efforts to classify communities with similar characteristics of community vulnerability to tsunami hazards. This work builds on past State-focused inventories of community exposure to CSZ-related tsunami hazards in northern California, Oregon, and Washington. Attributes used in the classification, or cluster analysis, include demography of residents, spatial extent of the developed footprint based on mid-resolution land cover data, distribution of the local workforce, and the number and type of public venues, dependent-care facilities, and community-support businesses. Population distributions also are characterized by a function of travel time to safety, based on anisotropic, path-distance, geospatial modeling. We used an unsupervised-model-based clustering algorithm and a v-fold, cross-validation procedure (v=50) to identify the appropriate number of community types. We selected class solutions that provided the appropriate balance between parsimony and model fit. The goal of the vulnerability classification is to provide emergency managers with a general sense of the types of communities in tsunami hazard zones based on similar characteristics instead of only providing an exhaustive list of attributes for individual communities. This classification scheme can be then used to target and prioritize risk-reduction efforts that address common issues across multiple communities. The presentation will include a discussion of the utility of proposed place classifications to support regional preparedness and outreach efforts.
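As a hedged stand-in for the unsupervised, model-based clustering described above, the sketch below uses a Gaussian mixture with BIC-based selection of the number of community types; the study's actual algorithm, attributes, and v-fold validation differ, and the attribute matrix here is synthetic.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(3)
# Toy attribute matrix: communities x (residents, workforce, venues,
# travel time to safety) - three synthetic community types.
X = np.vstack([rng.normal(m, 1.0, size=(50, 4)) for m in (0.0, 3.0, 6.0)])

best_k, best_bic, best_model = None, np.inf, None
for k in range(1, 8):
    gm = GaussianMixture(n_components=k, n_init=5, random_state=0).fit(X)
    if gm.bic(X) < best_bic:
        best_k, best_bic, best_model = k, gm.bic(X), gm

print("selected number of community types:", best_k)
labels = best_model.predict(X)  # community-type assignment for each place
```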
NASA Astrophysics Data System (ADS)
Khan, K. M.; Rashid, S.; Yaseen, M.; Ikram, M.
2016-12-01
The Karakoram Highway (KKH), sometimes called the 'eighth wonder of the world', was constructed and completed by agreement between Pakistan and China in 1979 as a friendship highway. It connects Gilgit-Baltistan, a strategically prominent region of Pakistan, with the Xinjiang region of China. Owing to its manifold geology/geomorphology, soil formation, steep slopes, and climate change, as well as unsustainable anthropogenic activities, the KKH remains remarkably vulnerable to natural hazards, i.e. land subsidence, landslides, erosion, rock fall, floods, debris flows, cyclical torrential rainfall and snowfall, lake outbursts, etc. The damaging effects of these geohazards repeatedly jeopardize life in the region. To ascertain the nature and frequency of the hazards and to delineate vulnerability zones, a rating and logistic-regression analysis was made of the spatiotemporal distribution of natural hazards. The substantial dynamics of the physiography, geology, geomorphology, soils and climate were carefully characterized, while slope, aspect, elevation, profile curvature and rock hardness were calculated by different techniques. Geospatial analyses were conducted to assess the nature and intensity of hazards, the magnitude of every factor was gauged using logistic regression, and every relevant variable was integrated into the evaluation process. Logistic regression and geospatial techniques were used to map geohazard vulnerability zoning (GVZ). The GVZ model findings were endorsed by reviews of hazards documented in recent years, and a precision of more than 88.1% was achieved. The study validated the model by demonstrating the good agreement between the vulnerability mapping and past documented hazards; judged by a receiver operating characteristic curve, the logistic regression model gave satisfactory results. The outcomes will be useful for sustainable land use and infrastructure planning, mainly in high-risk zones, to reduce economic damage and benefit communities.
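A minimal sketch of the logistic-regression susceptibility step, assuming each map cell carries terrain factors and a binary record of past hazard occurrence. The factor values, coefficients, and data are synthetic; the study's raster inputs differ.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(5)
n = 2000
slope = rng.uniform(0, 60, n)            # degrees (assumed factor)
elevation = rng.uniform(1000, 5000, n)   # metres (assumed factor)
curvature = rng.normal(0, 1, n)          # profile curvature (assumed factor)
X = np.column_stack([slope, elevation, curvature])

# Toy labels: steeper slopes raise the odds of a recorded hazard.
logit = -4 + 0.08 * slope + 0.3 * curvature
y = rng.random(n) < 1 / (1 + np.exp(-logit))

clf = LogisticRegression(max_iter=1000).fit(X, y)
susceptibility = clf.predict_proba(X)[:, 1]   # vulnerability-zoning surface
print("ROC AUC:", round(roc_auc_score(y, susceptibility), 3))
```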
Development and analysis of air quality modeling simulations for hazardous air pollutants
NASA Astrophysics Data System (ADS)
Luecken, D. J.; Hutzell, W. T.; Gipson, G. L.
The concentrations of five hazardous air pollutants were simulated using the community multi-scale air quality (CMAQ) modeling system. Annual simulations were performed over the continental United States for the entire year of 2001 to support human exposure estimates. Results are shown for formaldehyde, acetaldehyde, benzene, 1,3-butadiene and acrolein. Photochemical production in the atmosphere is predicted to dominate ambient formaldehyde and acetaldehyde concentrations, and to account for a significant fraction of ambient acrolein concentrations. Spatial and temporal variations are large throughout the domain over the year. Predicted concentrations are compared with observations for formaldehyde, acetaldehyde, benzene and 1,3-butadiene. Although the modeling results indicate an overall slight tendency towards underprediction, they reproduce episodic and seasonal behavior of pollutant concentrations at many monitors with good skill.
NASA Astrophysics Data System (ADS)
Chiessi, Vittorio; D'Orefice, Maurizio; Scarascia Mugnozza, Gabriele; Vitale, Valerio; Cannese, Christian
2010-07-01
This paper describes the results of a rockfall hazard assessment for the village of San Quirico (Abruzzo region, Italy) based on an engineering-geological model. After the collection of geological, geomechanical, and geomorphological data, the rockfall hazard assessment was performed using two separate approaches: i) simulation of the detachment of rock blocks and their downhill movement using a GIS; and ii) application of geostatistical techniques to the analysis of georeferenced observations of previously fallen blocks, in order to assess the probability of arrival of blocks from potential future collapses. The results show that the trajectographic analysis is significantly influenced by the input parameters, with particular reference to the coefficient of restitution values. To address this, the model was calibrated against repeated field observations. The geostatistical approach is useful because it gives the best estimation of point-source phenomena such as rockfalls; however, the sensitivity of the results to basic assumptions, e.g. the assessment of variograms and the choice of a threshold value, may be problematic. Consequently, interpolations derived from different variograms were compared, and those showing the lowest errors were adopted. The data sets that were statistically analysed comprise both kinetic energy and surveyed rock blocks in the accumulation area. The resulting maps highlight areas susceptible to rock block arrivals, and show that the area accommodating the new settlement of San Quirico village has the highest level of hazard according to both the probabilistic and deterministic methods.
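For context on the restitution coefficients that drive the trajectographic analysis, the standard rebound model used by rockfall codes is given below; this is the generic formulation, not the paper's own notation.

```latex
% Impact velocity split into normal (n) and tangential (t) components,
% each scaled by a coefficient of restitution on rebound:
v_n' = R_n\, v_n, \qquad v_t' = R_t\, v_t, \qquad 0 \le R_n, R_t \le 1 .
```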
Shallow-landslide hazard map of Seattle, Washington
Harp, Edwin L.; Michael, John A.; Laprade, William T.
2006-01-01
Landslides, particularly debris flows, have long been a significant cause of damage and destruction to people and property in the Puget Sound region. Following the years of 1996 and 1997, the Federal Emergency Management Agency (FEMA) designated Seattle a 'Project Impact' city, with the goal of encouraging the city to become more resistant to the effects of landslides and other natural hazards. A major recommendation of the Project Impact council was that the city and the U.S. Geological Survey (USGS) collaborate to produce a landslide hazard map of the city. An exceptional data set archived by the city, containing more than 100 years of landslide data from severe storm events, allowed comparison of actual landslide locations with those predicted by slope-stability modeling. We used an infinite-slope analysis, which models slope segments as rigid friction blocks, to estimate the susceptibility of slopes to shallow landslides. Such landslides often mobilize into debris flows: water-laden slurries that can form from shallow failures of soil and weathered bedrock and can travel at high velocities down steep slopes. Data used for the analysis consisted of a digital slope map derived from recent Light Detection and Ranging (LIDAR) imagery of Seattle, recent digital geologic mapping, and shear-strength test data for the geologic units in the surrounding area. The combination of these data layers within a Geographic Information System (GIS) platform allowed the preparation of a shallow-landslide hazard map for the entire city of Seattle.
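One common form of the infinite-slope factor of safety for a slope-parallel failure plane is shown below; this is the standard formulation, with parameter symbols of our choosing rather than the map's.

```latex
% Infinite-slope factor of safety for a failure plane at depth z with
% saturated thickness z_w above it:
\mathrm{FS} =
\frac{c' + \left(\gamma z - \gamma_w z_w\right)\cos^{2}\beta \,\tan\phi'}
     {\gamma z \sin\beta \cos\beta},
% c' effective cohesion, \phi' friction angle, \beta slope angle,
% \gamma and \gamma_w unit weights of soil and water.
```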
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blanton, M.L.; Cooper, A.T.; Castleton, K.J.
1995-11-01
The Pacific Northwest Surface Environmental Surveillance Project (SESP) is an ongoing effort to design, review, and conduct monitoring on and off the Hanford Site. Chemicals of concern that were selected are listed. Using modeled exposure pathways, the offsite cancer incidence and hazard quotient were calculated, and a retrospective pathway analysis was performed to estimate the onsite soil concentrations of each chemical of concern, and of other detected chemicals, that would be required to produce an estimated offsite human-health risk of 1.0E-06 cancer incidence or a 1.0 hazard quotient. This analysis indicates that current nonradiological chemical contamination occurring on the site does not pose a significant offsite human-health risk; the highest cancer incidence for the offsite maximally exposed individual was from arsenic (1.76E-10), and the highest hazard quotient was from chromium(VI) (1.48E-04). The most sensitive pathways of exposure were surface-water and aquatic food consumption. The combined total offsite excess cancer incidence was 2.09E-10 and the estimated hazard quotient was 2.40E-04. Of the 17 identified chemicals of concern, the SESP does not currently (routinely) monitor arsenic, benzo(a)pyrene, bis(2-ethylhexyl)phthalate (BEHP), and chrysene. Only 3 of the chemicals of concern (arsenic, BEHP, chloroform) could actually occur in onsite soil at concentrations high enough to cause a 1.0E-06 excess cancer incidence or a 1.0 hazard index for a given offsite exposure pathway. During the retrospective analysis, 20 other chemicals were also evaluated; only vinyl chloride and thallium could reach the targeted offsite risk values.
Gagnon, B; Abrahamowicz, M; Xiao, Y; Beauchamp, M-E; MacDonald, N; Kasymjanova, G; Kreisman, H; Small, D
2010-01-01
Background: C-reactive protein (CRP) is gaining credibility as a prognostic factor in different cancers. Cox's proportional hazards (PH) model is usually used to assess prognostic factors. However, this model imposes a priori assumptions, which are rarely tested, that (1) the hazard ratio associated with each prognostic factor remains constant across the follow-up (PH assumption) and (2) the relationship between a continuous predictor and the logarithm of the mortality hazard is linear (linearity assumption). Methods: We tested these two assumptions of the Cox PH model for CRP, using a flexible statistical model, while adjusting for other known prognostic factors, in a cohort of 269 patients newly diagnosed with non-small cell lung cancer (NSCLC). Results: In the Cox PH model, high CRP increased the risk of death (HR=1.11 per doubling of CRP value, 95% CI: 1.03–1.20, P=0.008). However, both the PH assumption (P=0.033) and the linearity assumption (P=0.015) were rejected for CRP, measured at the initiation of chemotherapy, which kept its prognostic value for approximately 18 months. Conclusion: Our analysis shows that flexible modeling provides new insights regarding the value of CRP as a prognostic factor in NSCLC and that Cox's PH model underestimates early risks associated with high CRP. PMID:20234363
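A minimal sketch of the first diagnostic applied here, testing the PH assumption via scaled Schoenfeld residuals, using the lifelines library; the bundled Rossi recidivism data stands in for the NSCLC cohort, and the authors' flexible model is not reproduced.

```python
from lifelines import CoxPHFitter
from lifelines.datasets import load_rossi
from lifelines.statistics import proportional_hazard_test

# Illustrative data; the paper's cohort and covariates differ.
df = load_rossi()
cph = CoxPHFitter().fit(df, duration_col="week", event_col="arrest")

# Schoenfeld-residual-based test of the PH assumption.
results = proportional_hazard_test(cph, df, time_transform="rank")
print(results.summary)  # small p-values flag covariates whose HR drifts
```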
NASA Astrophysics Data System (ADS)
Lan, Hengxing; Derek Martin, C.; Lim, C. H.
2007-02-01
Geographic information system (GIS) modeling is used in combination with three-dimensional (3D) rockfall process modeling to assess rockfall hazards. A GIS extension, RockFall Analyst (RA), which is capable of effectively handling large amounts of geospatial information relative to rockfall behaviors, has been developed in ArcGIS using ArcObjects and C#. The 3D rockfall model considers dynamic processes on a cell plane basis. It uses inputs of distributed parameters in terms of raster and polygon features created in GIS. Two major components are included in RA: particle-based rockfall process modeling and geostatistics-based rockfall raster modeling. Rockfall process simulation results, 3D rockfall trajectories and their velocity features either for point seeders or polyline seeders are stored in 3D shape files. Distributed raster modeling, based on 3D rockfall trajectories and a spatial geostatistical technique, represents the distribution of spatial frequency, the flying and/or bouncing height, and the kinetic energy of falling rocks. A distribution of rockfall hazard can be created by taking these rockfall characteristics into account. A barrier analysis tool is also provided in RA to aid barrier design. An application of these modeling techniques to a case study is provided. The RA has been tested in ArcGIS 8.2, 8.3, 9.0 and 9.1.
NASA Astrophysics Data System (ADS)
Odbert, H. M.; Aspinall, W.; Phillips, J.; Jenkins, S.; Wilson, T. M.; Scourse, E.; Sheldrake, T.; Tucker, P.; Nakeshree, K.; Bernardara, P.; Fish, K.
2015-12-01
Societies rely on critical services such as power, water, transport networks and manufacturing. Infrastructure may be sited to minimise exposure to natural hazards, but not all hazards can be avoided. The probability of long-range transport of a volcanic plume to a site is comparable to other external hazards that must be considered to satisfy safety assessments. Recent advances in numerical models of plume dispersion and stochastic modelling provide a formalized and transparent approach to the probabilistic assessment of hazard distribution. To understand the risks to critical infrastructure far from volcanic sources, it is necessary to quantify their vulnerability to different hazard stressors. However, infrastructure assets (e.g. power plants and operational facilities) are typically complex systems in themselves, with interdependent components that may differ in susceptibility to hazard impact. Usually, such complexity means that risk either cannot be estimated formally or that unsatisfactory simplifying assumptions are prerequisite to building a tractable risk model. We present a new approach to quantifying risk by bridging the expertise of physical hazard modellers and infrastructure engineers. We use a joint expert judgment approach to determine hazard model inputs and constrain the associated uncertainties. Model outputs are chosen on the basis of engineering or operational concerns. The procedure facilitates an interface between physical scientists, with expertise in volcanic hazards, and infrastructure engineers, with insight into vulnerability to hazards. The result is a joined-up approach to estimating risk from low-probability hazards to critical infrastructure. We describe our methodology and show preliminary results for vulnerability to volcanic hazards at a typical UK industrial facility. We discuss our findings in the context of developing bespoke assessments of hazards from distant sources in collaboration with key infrastructure stakeholders.
EFEHR - the European Facilities for Earthquake Hazard and Risk: beyond the web-platform
NASA Astrophysics Data System (ADS)
Danciu, Laurentiu; Wiemer, Stefan; Haslinger, Florian; Kastli, Philipp; Giardini, Domenico
2017-04-01
The European Facilities for Earthquake Hazard and Risk (EFEHR) represent the sustainable community resource for seismic hazard and risk in Europe. The EFEHR web platform is the main gateway to access data, models and tools, and provides expertise relevant for the assessment of seismic hazard and risk. The main services (databases and web platform) are hosted at ETH Zurich and operated by the Swiss Seismological Service (Schweizerischer Erdbebendienst, SED). The EFEHR web portal (www.efehr.org) collects and displays: (i) harmonized datasets necessary for hazard and risk modeling, e.g. seismic catalogues, fault compilations, site amplifications, vulnerabilities, inventories; (ii) extensive seismic hazard products, namely hazard curves, uniform hazard spectra and maps for national and regional assessments; (iii) standardized configuration files for re-computing the regional seismic hazard models; and (iv) relevant documentation of harmonized datasets, models and web services. Today, EFEHR distributes the full output of the 2013 European Seismic Hazard Model, ESHM13, as developed within the SHARE project (http://www.share-eu.org/); the latest results of the 2014 Earthquake Model of the Middle East (EMME14), derived within the EMME Project (www.emme-gem.org); the 2001 Global Seismic Hazard Assessment Project (GSHAP) results; and the 2015 update of the Swiss seismic hazard. New datasets related to either seismic hazard or risk will be incorporated as they become available. We present the current status of the EFEHR platform, with focus on the challenges, summaries of the up-to-date datasets, user experience and feedback, as well as the roadmap to future technological innovation beyond the web platform development. We also show the new services foreseen to fully integrate with the seismological core services of the European Plate Observing System (EPOS).
NASA Astrophysics Data System (ADS)
Hung, Hung-Chih; Wu, Ju-Yu; Hung, Chih-Hsuan
2017-04-01
1. Background The Asia-Pacific region is one of the most vulnerable areas of the world to climate-related hazards and extremes, due to rapid urbanization and over-development in hazard-prone areas. It is thus increasingly recognized that the management of land use and the reduction of hazard risk are inextricably linked. This is especially critical from the perspective of integrated river basin management. A range of studies has addressed vulnerability assessment. However, limited attention has been paid to the cumulative effects of multiple vulnerability factors, and to their dynamics, faced by local communities. This study proposes a novel methodology to assess changing cumulative vulnerability to climate-related hazards and to examine the relationship between the attraction factors relevant to the general process of urbanization and vulnerability change, with a focus on a river basin management unit. 2. Methods and data The methods applied in this study comprise three steps. First, using the Intergovernmental Panel on Climate Change (IPCC) approach, a Cumulative Vulnerability Assessment Framework (CVAF) is built to characterize and compare vulnerability to climate-related hazards within river basin regions, based on a composite of multiple indicators. We organize these indicator metrics into three categories: (1) hazard exposure; (2) socioeconomic sensitivity; and (3) adaptive capacity. Second, the CVAF is applied by combining a geographical information system (GIS)-based spatial statistics technique with multicriteria decision analysis (MCDA) to assess and map changing cumulative vulnerability, comparing conditions in 1996 and 2006 in the Danshui River Basin, Taiwan. Third, to examine the factors affecting vulnerability change, we develop a Vulnerability Changing Model (VCM) using four attraction factors to reflect how urban development leads to vulnerability change: transport networks, land uses, production values of industries, and infrastructures. We then conduct a regression analysis to test the VCM. To illustrate the proposed methodology, data are collected from the National Science and Technology Center for Disaster Reduction, Taiwan, as well as the National Land Use Investigation and official census statistics. 3. Results and policy implications The CVAF analysis demonstrates heterogeneous patterns of vulnerability in the region and highlights trends of long-term change. The vulnerable areas unfold as clustered patterns and spatial analogues across regions, rather than being randomly distributed. The highest cumulative vulnerability is concentrated in the densely populated, downstream reaches of the Danshui River (such as Taipei City) in both time periods. The VCM indicates that, while upstream and more remote areas generally show low vulnerability, increases are observed in some areas between 1996 and 2006 due to land use intensification and industrial and infrastructure expansion. These findings suggest that land use planning should consider the socioeconomic progression and infrastructure investment factors that contribute to urban sprawl, and address current as well as future urban developments vulnerable to hazard risk transmission. The cumulative vulnerability assessment, mapping methods and modelling presented here can be applied to other climate change and hazard risks to highlight priority areas for further investigation and contribute towards improving river basin management.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ely, Geoffrey P.
2013-10-31
This project uses dynamic rupture simulations to investigate high-frequency seismic energy generation. The relevant phenomena (frictional breakdown, shear heating, effective normal-stress fluctuations, material damage, etc.) controlling rupture are strongly interacting and span many orders of magnitude in spatial scale, requiring high-resolution simulations that couple disparate physical processes (e.g., elastodynamics, thermal weakening, pore-fluid transport, and heat conduction). Compounding the computational challenge, we know that natural faults are not planar, but instead have roughness that can be approximated by power laws, potentially leading to large, multiscale fluctuations in normal stress. The capacity to perform 3D rupture simulations that couple these processes will provide guidance for constructing appropriate source models for high-frequency ground motion simulations. The improved rupture models from our multi-scale dynamic rupture simulations will be used to conduct physics-based (3D waveform modeling based) probabilistic seismic hazard analysis (PSHA) for California. These calculations will provide numerous important seismic hazard results, including a state-wide extended earthquake rupture forecast with rupture variations for all significant events, a synthetic seismogram catalog for thousands of scenario events, and more than 5000 physics-based seismic hazard curves for California.
Tectonic models for Yucca Mountain, Nevada
O'Leary, Dennis W.
2006-01-01
Performance of a high-level nuclear waste repository at Yucca Mountain hinges partly on long-term structural stability of the mountain, its susceptibility to tectonic disruption that includes fault displacement, seismic ground motion, and igneous intrusion. Because of the uncertainty involved with long-term (10,000 yr minimum) prediction of tectonic events (e.g., earthquakes) and the incomplete understanding of the history of strain and its mechanisms in the Yucca Mountain region, a tectonic model is needed. A tectonic model should represent the structural assemblage of the mountain in its tectonic setting and account for that assemblage through a history of deformation in which all of the observed deformation features are linked in time and space. Four major types of tectonic models have been proposed for Yucca Mountain: a caldera model; simple shear (detachment fault) models; pure shear (planar fault) models; and lateral shear models. Most of the models seek to explain local features in the context of well-accepted regional deformation mechanisms. Evaluation of the models in light of site characterization shows that none of them completely accounts for all the known tectonic features of Yucca Mountain or is fully compatible with the deformation history. The Yucca Mountain project does not endorse a preferred tectonic model. However, most experts involved in the probabilistic volcanic hazards analysis and the probabilistic seismic hazards analysis preferred a planar fault type model. © 2007 Geological Society of America. All rights reserved.
NASA Astrophysics Data System (ADS)
Rahman, M. Moklesur; Bai, Ling; Khan, Nangyal Ghani; Li, Guohui
2018-02-01
The Himalayan-Tibetan region has a long history of devastating earthquakes with wide-spread casualties and socio-economic damage. Here, we conduct a probabilistic seismic hazard analysis by incorporating incomplete historical earthquake records along with instrumental earthquake catalogs for the Himalayan-Tibetan region. Historical earthquake records dating back more than 1000 years and an updated, homogenized and declustered instrumental earthquake catalog since 1906 are utilized. The essential seismicity parameters, namely the mean seismicity rate γ, the Gutenberg-Richter b value, and the maximum expected magnitude M max, are estimated using a maximum likelihood algorithm that accounts for the incompleteness of the catalog. To compute the hazard, three seismogenic source models (smoothed gridded, linear, and areal sources) and two sets of ground motion prediction equations are combined by means of a logic tree to account for epistemic uncertainties. The peak ground acceleration (PGA) and spectral acceleration (SA) at 0.2 and 1.0 s are predicted for 2 and 10% probabilities of exceedance over 50 years, assuming bedrock conditions. The resulting PGA and SA maps show a significant spatio-temporal variation in the hazard values. In general, hazard values are found to be much higher than in previous studies for regions where great earthquakes have actually occurred. The use of the historical and instrumental earthquake catalogs in combination with multiple seismogenic source models provides better seismic hazard constraints for the Himalayan-Tibetan region.
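For reference, the Gutenberg-Richter recurrence law and the Aki maximum-likelihood estimate of the b value are given below in their standard forms; the study's likelihood, which handles incomplete historical records, is more general than this.

```latex
% Gutenberg-Richter law and Aki (1965) maximum-likelihood b estimate,
% where \bar{M} is the mean magnitude above the completeness level M_c:
\log_{10} N(m \ge M) = a - b\,M, \qquad
\hat{b} = \frac{\log_{10} e}{\bar{M} - M_c}.
```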
Opinion: The use of natural hazard modeling for decision making under uncertainty
David E. Calkin; Mike Mentis
2015-01-01
Decision making to mitigate the effects of natural hazards is a complex undertaking fraught with uncertainty. Models to describe risks associated with natural hazards have proliferated in recent years. Concurrently, there is a growing body of work focused on developing best practices for natural hazard modeling and to create structured evaluation criteria for complex...
Probabilistic short-term volcanic hazard in phases of unrest: A case study for tephra fallout
NASA Astrophysics Data System (ADS)
Selva, Jacopo; Costa, Antonio; Sandri, Laura; Macedonio, Giovanni; Marzocchi, Warner
2014-12-01
During volcanic crises, volcanologists estimate the impact of possible imminent eruptions, usually through deterministic modeling of the effects of one or a few preestablished scenarios. Although such an approach may bring important information to decision makers, the sole use of deterministic scenarios does not allow scientists to properly take all uncertainties into consideration, and it cannot be used to assess the risk quantitatively, because the latter unavoidably requires a probabilistic approach. We present a model based on the concept of the Bayesian event tree (hereinafter named BET_VH_ST, standing for Bayesian event tree for short-term volcanic hazard), for short-term, near-real-time probabilistic volcanic hazard analysis formulated for any potentially hazardous phenomenon accompanying an eruption. The specific goal of BET_VH_ST is to produce a quantitative assessment of the probability of exceedance of any potential level of intensity for a given volcanic hazard due to eruptions within restricted time windows (hours to days) in any area surrounding the volcano, accounting for all natural and epistemic uncertainties. BET_VH_ST properly assesses the conditional probability at each level of the event tree, accounting for any relevant information derived from the monitoring system, theoretical models, and the past history of the volcano, and propagating any relevant epistemic uncertainty underlying these assessments. As an application example of the model, we apply BET_VH_ST to assess short-term volcanic hazard related to tephra loading during the Major Emergency Simulation Exercise, a major exercise at Mount Vesuvius that took place from 19 to 23 October 2006, consisting of a blind simulation of Vesuvius reactivation, from the early warning phase up to the final eruption, including the evacuation of a sample of about 2000 people from the area at risk. The results show that BET_VH_ST is able to produce short-term forecasts of the impact of tephra fall during a rapidly evolving crisis, accurately accounting for and propagating all uncertainties and enabling rational decision making under uncertainty.
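A minimal sketch of the event-tree arithmetic: the exceedance probability for a tephra-load threshold is the product of conditional probabilities along the tree, with epistemic uncertainty carried here by sampling each node from a Beta distribution. The node values are invented for illustration, not BET_VH_ST's calibrated distributions.

```python
import numpy as np

rng = np.random.default_rng(11)
n_samples = 10_000  # epistemic samples (assumed representation)

# Conditional node probabilities, each uncertain (Beta parameters invented).
p_eruption = rng.beta(2, 18, n_samples)  # eruption within the time window
p_vent = rng.beta(5, 15, n_samples)      # vent opens in the sector of interest
p_size = rng.beta(3, 7, n_samples)       # eruption reaches the size class
p_load = rng.beta(4, 6, n_samples)       # tephra load exceeds threshold at site

# Probability of exceedance = product of conditionals along the branch.
p_exceed = p_eruption * p_vent * p_size * p_load
print("mean hazard:", p_exceed.mean())
print("10th-90th percentile:", np.percentile(p_exceed, [10, 90]))
```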
Mesh versus bathtub - effects of flood models on exposure analysis in Switzerland
NASA Astrophysics Data System (ADS)
Röthlisberger, Veronika; Zischg, Andreas; Keiler, Margreth
2016-04-01
In Switzerland, two main types of maps indicating potential flood zones are available for flood exposure analyses: 1) Aquaprotect, a nationwide overview provided by the Federal Office for the Environment, and 2) communal flood hazard maps available from the 26 cantons. The model used to produce Aquaprotect can be described as a bathtub approach, or linear superposition method, with three main parameters: the horizontal and vertical distance of a point to water features and the size of the river sub-basin. Whereas the determination of flood zones in Aquaprotect is based on a uniform, nationwide model, the communal flood hazard maps are less homogeneous, as they have been elaborated either at communal or cantonal level. Yet their basic content (i.e. indication of potential flood zones for three recurrence periods, with differentiation of at least three inundation depths) is described in national directives, and the vast majority of communal flood hazard maps are based on 2D inundation simulations using meshes. Apart from the methodical differences between Aquaprotect and the communal flood hazard maps (and among different communal flood hazard maps), all of these maps include a layer with a similar recurrence period (i.e. 250 years for Aquaprotect, 300 years for the flood hazard maps) beyond the intended protection level of installed structural systems. In our study, we compare the resulting exposure by overlaying the two types of flood maps with a complete, harmonized, and nationwide dataset of building polygons. We assess exposure at the national level and also consider differences among the 26 cantons and the six biogeographically unique regions. While the nationwide exposure rates for both types of flood maps are similar, the differences within certain cantons and biogeographical regions are remarkable. We conclude that flood maps based on bathtub models are appropriate for assessments at national level, while maps based on 2D simulations are preferable at sub-national levels.
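A toy bathtub-style exposure overlay in the spirit of the Aquaprotect parameters: a cell is flagged as potentially flooded when it lies low enough above the river and close enough to it, and buildings are counted against the mask. The thresholds, grids, and building locations are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
elevation_above_river = rng.uniform(0, 30, size=(100, 100))  # metres (toy)
distance_to_river = rng.uniform(0, 2000, size=(100, 100))    # metres (toy)
subbasin_size_factor = 1.5   # larger basins allow higher inundation (assumed)

# Bathtub rule: low and near cells are flagged as potentially flooded.
flooded = (elevation_above_river < 8.0 * subbasin_size_factor) & \
          (distance_to_river < 500.0)
print("share of cells flagged:", flooded.mean())

# Exposure: overlay building locations (row, col indices) on the mask.
buildings = rng.integers(0, 100, size=(500, 2))
exposed = flooded[buildings[:, 0], buildings[:, 1]]
print("exposed buildings:", int(exposed.sum()), "of", len(buildings))
```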
Choudhary, Gaurav; Jankowich, Matthew; Wu, Wen-Chih
2014-07-01
Although elevated pulmonary artery systolic pressure (PASP) is associated with heart failure (HF), whether PASP measurement can help predict future HF admissions is not known, especially in African Americans who are at increased risk for HF. We hypothesized that elevated PASP is associated with increased risk of HF admission and improves HF prediction in African American population. We conducted a longitudinal analysis using the Jackson Heart Study cohort (n=3125; 32.2% men) with baseline echocardiography-derived PASP and follow-up for HF admissions. Hazard ratio for HF admission was estimated using Cox proportional hazard model adjusted for variables in the Atherosclerosis Risk in Community (ARIC) HF prediction model. During a median follow-up of 3.46 years, 3.42% of the cohort was admitted for HF. Subjects with HF had a higher PASP (35.6±11.4 versus 27.6±6.9 mm Hg; P<0.001). The hazard of HF admission increased with higher baseline PASP (adjusted hazard ratio per 10 mm Hg increase in PASP: 2.03; 95% confidence interval, 1.67-2.48; adjusted hazard ratio for highest [≥33 mm Hg] versus lowest quartile [<24 mm Hg] of PASP: 2.69; 95% confidence interval, 1.43-5.06) and remained significant irrespective of history of HF or preserved/reduced ejection fraction. Addition of PASP to the ARIC model resulted in a significant improvement in model discrimination (area under the curve=0.82 before versus 0.84 after; P=0.03) and improved net reclassification index (11-15%) using PASP as a continuous or dichotomous (cutoff=33 mm Hg) variable. Elevated PASP predicts HF admissions in African Americans and may aid in early identification of at-risk subjects for aggressive risk factor modification. © 2014 American Heart Association, Inc.
Development of a Probabilistic Tsunami Hazard Analysis in Japan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Toshiaki Sakai; Tomoyoshi Takeda; Hiroshi Soraoka
2006-07-01
Just as for seismic design, it is meaningful in tsunami assessment to evaluate phenomena beyond the design basis: even once a design basis tsunami height has been set, the actual tsunami height may exceed it because of uncertainties in tsunami phenomena. Probabilistic tsunami risk assessment consists of estimating the tsunami hazard and the fragility of structures, and executing a system analysis. In this report, we apply a method for probabilistic tsunami hazard analysis (PTHA). We introduce a logic tree approach to estimate tsunami hazard curves (relationships between tsunami height and probability of exceedance) and present an example for Japan. Examples of tsunami hazard curves are illustrated, and uncertainty in the tsunami hazard is displayed by 5-, 16-, 50-, 84- and 95-percentile and mean hazard curves. The results of the PTHA will be used for quantitative assessment of the tsunami risk to important facilities located in coastal areas. Tsunami hazard curves are reasonable input data for structural and system analysis. However, the evaluation method for estimating the fragility of structures and the procedure for system analysis are still being developed.
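A minimal sketch of the logic tree idea on simulated branches: each branch carries a weight and its own hazard curve, and the weighted family of curves yields mean and fractile (5-, 16-, 50-, 84-, 95-percentile) curves. The branch rates and the exponential height distribution are illustrative assumptions, not the study's source models:

```python
import numpy as np

heights = np.linspace(0.5, 15.0, 30)              # tsunami height (m)

# Each logic-tree branch: (weight, annual event rate, height scale)
branches = [(0.4, 0.01, 3.0), (0.35, 0.005, 5.0), (0.25, 0.002, 8.0)]

curves, weights = [], []
for w, rate, scale in branches:
    lam = rate * np.exp(-heights / scale)         # exceedance rate (1/yr)
    curves.append(1.0 - np.exp(-lam))             # 1-yr exceedance probability
    weights.append(w)
curves, weights = np.array(curves), np.array(weights)

mean_curve = np.average(curves, axis=0, weights=weights)
# Weighted fractiles across branches: replicate each branch in
# proportion to its weight, then take percentiles at every height.
rep = np.repeat(curves, (weights * 100).astype(int), axis=0)
fractiles = np.percentile(rep, [5, 16, 50, 84, 95], axis=0)
print(mean_curve[:3], fractiles[:, :3], sep="\n")
```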
Earthquake Rate Models for Evolving Induced Seismicity Hazard in the Central and Eastern US
NASA Astrophysics Data System (ADS)
Llenos, A. L.; Ellsworth, W. L.; Michael, A. J.
2015-12-01
Injection-induced earthquake rates can vary rapidly in space and time, which presents significant challenges to traditional probabilistic seismic hazard assessment methodologies that are based on a time-independent model of mainshock occurrence. To help society cope with rapidly evolving seismicity, the USGS is developing one-year hazard models for areas of induced seismicity in the central and eastern US to forecast the shaking due to all earthquakes, including aftershocks, which are generally omitted from hazard assessments (Petersen et al., 2015). However, the spatial and temporal variability of the earthquake rates makes them difficult to forecast even on time scales as short as one year. An initial approach is to use the previous year's seismicity rate to forecast the next year's seismicity rate. However, in places such as northern Oklahoma the rates vary so rapidly over time that a simple linear extrapolation does not accurately forecast the future, even when the variability in the rates is modeled with simulations based on an Epidemic-Type Aftershock Sequence (ETAS) model (Ogata, JASA, 1988) to account for earthquake clustering. Instead of relying on a fixed time period for rate estimation, we explore another way to determine when the earthquake rate should be updated. This approach could also objectively identify new areas where the induced seismicity hazard model should be applied. We will estimate the background seismicity rate by optimizing a single set of ETAS aftershock triggering parameters across the most active induced seismicity zones (Oklahoma, Guy-Greenbrier, the Raton Basin, and the Azle-Dallas-Fort Worth area), with individual background rate parameters in each zone. The full seismicity rate, with uncertainties, can then be estimated using ETAS simulations, and changes in rate can be detected by applying change point analysis in ETAS-transformed time with methods already developed for Poisson processes.
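For reference, the ETAS conditional intensity that underlies such simulations can be sketched in a few lines; the catalog and parameter values below are hypothetical, and a real application would fit mu, K, c, p and alpha to the regional catalog:

```python
import numpy as np

def etas_intensity(t, catalog, mu, K, c, p, alpha, m0):
    """Conditional intensity of the ETAS model (Ogata, 1988):
    lambda(t) = mu + sum_i K * exp(alpha*(m_i - m0)) / (t - t_i + c)**p
    over all events (t_i, m_i) that occurred before time t."""
    past = catalog[catalog[:, 0] < t]
    ti, mi = past[:, 0], past[:, 1]
    return mu + np.sum(K * np.exp(alpha * (mi - m0)) / (t - ti + c) ** p)

# Hypothetical catalog: columns are (time in days, magnitude)
catalog = np.array([[1.0, 4.2], [3.5, 3.1], [10.2, 5.0], [10.9, 3.4]])
params = dict(mu=0.2, K=0.05, c=0.01, p=1.1, alpha=1.0, m0=3.0)
print(etas_intensity(t=12.0, catalog=catalog, **params))
```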
Project of Near-Real-Time Generation of ShakeMaps and a New Hazard Map in Austria
NASA Astrophysics Data System (ADS)
Jia, Yan; Weginger, Stefan; Horn, Nikolaus; Hausmann, Helmut; Lenhardt, Wolfgang
2016-04-01
Target-orientated prevention and effective crisis management can reduce or avoid damage and save lives in the case of a strong earthquake. To achieve this goal, a project for automatically generated ShakeMaps (maps of ground motion and shaking intensity) and for updating the Austrian hazard map was started at ZAMG (Zentralanstalt für Meteorologie und Geodynamik) in 2015. The first goal of the project is the near-real-time generation of ShakeMaps following strong earthquakes in Austria, to provide rapid, accurate and official information in support of governmental crisis management. Using methods and software newly developed by SHARE (Seismic Hazard Harmonization in Europe) and GEM (Global Earthquake Model), which allow a transnational analysis at the European level, a new generation of Austrian hazard maps will ultimately be calculated. This presentation gives more information and the current status of the project.
Structural Health Monitoring Analysis for the Orbiter Wing Leading Edge
NASA Technical Reports Server (NTRS)
Yap, Keng C.
2010-01-01
This viewgraph presentation reviews Structural Health Monitoring Analysis for the Orbiter Wing Leading Edge. The Wing Leading Edge Impact Detection System (WLE IDS) and the Impact Analysis Process are also described to monitor WLE debris threats. The contents include: 1) Risk Management via SHM; 2) Hardware Overview; 3) Instrumentation; 4) Sensor Configuration; 5) Debris Hazard Monitoring; 6) Ascent Response Summary; 7) Response Signal; 8) Distribution of Flight Indications; 9) Probabilistic Risk Analysis (PRA); 10) Model Correlation; 11) Impact Tests; 12) Wing Leading Edge Modeling; 13) Ascent Debris PRA Results; and 14) MM/OD PRA Results.
Hansen, Richard A.; Khodneva, Yulia; Glasser, Stephen P.; Qian, Jingjing; Redmond, Nicole; Safford, Monika M.
2018-01-01
Background: Mixed evidence suggests second-generation antidepressants may increase risk of cardiovascular and cerebrovascular events. Objective: Assess whether antidepressant use is associated with acute coronary heart disease, stroke, cardiovascular disease death, and all-cause mortality. Methods: Secondary analyses of the Reasons for Geographic and Racial Differences in Stroke (REGARDS) longitudinal cohort study were conducted. Use of selective serotonin reuptake inhibitors, serotonin and norepinephrine reuptake inhibitors, bupropion, nefazodone, and trazodone was measured during the baseline (2003-2007) in-home visit. Outcomes of coronary heart disease, stroke, cardiovascular disease death, and all-cause mortality were assessed every 6 months and adjudicated by medical record review. Cox proportional hazards time-to-event analysis followed patients until their first event on or before December 31, 2011, iteratively adjusting for covariates. Results: Among 29,616 participants, 3,458 (11.7%) used an antidepressant of interest. Intermediate models adjusting for everything but physical and mental health found an increased risk of acute coronary heart disease (Hazard Ratio=1.21; 95% CI 1.04-1.41), stroke (Hazard Ratio=1.28; 95% CI 1.02-1.60), cardiovascular disease death (Hazard Ratio=1.29; 95% CI 1.09-1.53), and all-cause mortality (Hazard Ratio=1.27; 95% CI 1.15-1.41) for antidepressant users. Risk estimates trended in this direction for all outcomes in the fully adjusted model, but only remained statistically associated with increased risk of all-cause mortality (Hazard Ratio=1.12; 95% CI 1.01-1.24). This risk estimate was larger in sensitivity analyses censoring follow-up time at 2 years (Hazard Ratio=1.37; 95% CI 1.11-1.68). Conclusions: In fully adjusted models antidepressant use was associated with a small increase in all-cause mortality. PMID:26783360
A critical analysis of hazard resilience measures within sustainability assessment frameworks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Matthews, Elizabeth C., E-mail: echiso1@lsu.edu; Sattler, Meredith, E-mail: msattler@lsu.edu; Friedland, Carol J., E-mail: friedland@lsu.edu
Today, numerous sustainability assessment frameworks (SAFs) exist to guide designers in achieving sustainable performance in the design of structures and communities. SAFs are beneficial in educating users and are useful tools for incorporating sustainability strategies into planning, design, and construction; however, there is currently a substantial gap in the ability of existing SAFs to incorporate hazard resistance and hazard mitigation in the broader context of sustainable design. This paper analyzes the incorporation of hazard resistant design and hazard mitigation strategies within SAFs via a multi-level analysis of eleven SAFs. The SAFs analyzed range in scale of application (i.e. building, site, community). Three levels of analysis are presented: (1) macro-level analysis comparing the number of measures strictly addressing resilience versus sustainability, (2) meso-level analysis of the coverage of types of hazards within SAFs (e.g. flood, fire), and (3) micro-level analysis of SAF measures connected to flood-related hazard resilience. The results demonstrate that hazard resistance and hazard mitigation do not figure prominently in the intent of SAFs and that weaknesses in resilience coverage exist that have the potential to lead to the design of structures and communities that are still highly vulnerable to the impacts of extreme events. Highlights: • Sustainability assessment frameworks (SAFs) were analyzed for resilience coverage. • Hazard resistance and mitigation do not figure prominently in the intent of SAFs. • Approximately 75% of SAFs analyzed address three or fewer hazards. • Lack of economic measures within SAFs could impact resilience and sustainability. • Resilience measures for flood hazards are not consistently included in SAFs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
PECH, S.H.
This report describes the methodology used in conducting the K Basins Hazard Analysis, which provides the foundation for the K Basins Final Safety Analysis Report. This hazard analysis was performed in accordance with guidance provided by DOE-STD-3009-94, Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report.
Li, Huixia; Luo, Miyang; Luo, Jiayou; Zheng, Jianfei; Zeng, Rong; Du, Qiyun; Fang, Junqun; Ouyang, Na
2016-11-23
A risk prediction model of non-syndromic cleft lip with or without cleft palate (NSCL/P) was established by a discriminant analysis to predict the individual risk of NSCL/P in pregnant women. A hospital-based case-control study was conducted with 113 cases of NSCL/P and 226 controls without NSCL/P. The cases and the controls were obtained from 52 birth defect surveillance hospitals in Hunan Province, China. A questionnaire was administered in face-to-face interviews to collect the variables relevant to NSCL/P. Logistic regression models were used to analyze the influencing factors of NSCL/P, and a stepwise Fisher discriminant analysis was subsequently used to construct the prediction model. In the univariate analysis, 13 influencing factors were related to NSCL/P, of which the following 8 were retained as predictors in the discriminant prediction model: family income, maternal occupational hazards exposure, premarital medical examination, housing renovation, milk/soymilk intake in the first trimester of pregnancy, paternal occupational hazards exposure, paternal strong tea drinking, and family history of NSCL/P. The model was statistically significant (lambda = 0.772, chi-square = 86.044, df = 8, P < 0.001). Self-verification showed that 83.8% of the participants were correctly predicted to be NSCL/P cases or controls, with a sensitivity of 74.3% and a specificity of 88.5%. The area under the receiver operating characteristic curve (AUC) was 0.846. The prediction model that was established using the risk factors of NSCL/P can be useful for predicting the risk of NSCL/P. Further research is needed to improve the model and confirm its validity and reliability.
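A hedged sketch of the discriminant-analysis workflow with scikit-learn on simulated binary predictors (standing in for the eight predictors above); the exposure rates are invented for illustration, and real validation would use held-out data rather than the apparent (self-verification) statistics shown here:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)
n = 339                                   # 113 cases + 226 controls
y = np.r_[np.ones(113), np.zeros(226)]
# Eight binary predictors (e.g. occupational hazard exposure, family
# history); cases are simulated with slightly higher exposure rates.
X = rng.binomial(1, np.where(y[:, None] == 1, 0.35, 0.15), size=(n, 8))

lda = LinearDiscriminantAnalysis()
lda.fit(X, y)
scores = lda.predict_proba(X)[:, 1]
print("apparent AUC:", round(roc_auc_score(y, scores), 3))
print("apparent accuracy:", round(lda.score(X, y), 3))
```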
Application of hazard and effects management tools and links to the HSE case
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gower-Jones, A.D.; Graaf, G.C. van der; Milne, D.J.
1996-12-31
Many tools and techniques are promoted for the analysis and management of hazards and their effects. The proliferation of these tools in the last 5-6 years has resulted in an overload on designers, engineers and operators of E&P activities and assets, to the extent that they are unsure what to do, when, and how it all fits together. This paper starts from the basic E&P business (a business model) and the basic structure of any accidental event (bow-tie), and maps the tools and techniques used to analyze the hazards and effects for both asset and activity HSE management. The links to developing an HSE case within the HSE-MS for assets and activities are given.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bolton, P.
The purpose of this task was to support ESH-3 in providing Airborne Release Fraction and Respirable Fraction training to safety analysts at LANL who perform accident analysis, hazard analysis, safety analysis, and/or risk assessments at nuclear facilities. The task included preparation of materials for, and the conduct of, two 3-day training courses covering the following topics: the safety analysis process; the calculation model; aerosol physics concepts for safety analysis; and an overview of empirically derived airborne release fractions and respirable fractions.
Warning Triggers in Environmental Hazards: Who Should Be Warned to Do What and When?
Cova, Thomas J; Dennison, Philip E; Li, Dapeng; Drews, Frank A; Siebeneck, Laura K; Lindell, Michael K
2017-04-01
Determining the most effective public warnings to issue during a hazardous environmental event is a complex problem. Three primary questions need to be answered: Who should take protective action? What is the best action? and When should this action be initiated? Warning triggers provide a proactive means for emergency managers to simultaneously answer these questions by recommending that a target group take a specified protective action if a preset environmental trigger condition occurs (e.g., warn a community to evacuate if a wildfire crosses a proximal ridgeline). Triggers are used to warn the public across a wide variety of environmental hazards, and an improved understanding of their nature and role promises to: (1) advance protective action theory by unifying the natural, built, and social themes in hazards research into one framework, (2) reveal important information about emergency managers' risk perception, situational awareness, and threat assessment regarding threat behavior and public response, and (3) advance spatiotemporal models for representing the geography and timing of disaster warning and response (i.e., a coupled natural-built-social system). We provide an overview and research agenda designed to advance our understanding and modeling of warning triggers. © 2016 Society for Risk Analysis.
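The trigger concept can be expressed as a small data structure pairing a target group and protective action with a preset condition; the wildfire example below (community name, ridgeline distance) is hypothetical:

```python
from dataclasses import dataclass

@dataclass
class WarningTrigger:
    """A preset trigger: if the hazard condition is met, recommend
    that a target group take a specified protective action."""
    target_group: str
    protective_action: str
    condition: callable  # hazard state -> bool

def evaluate(triggers, hazard_state):
    """Return (group, action) pairs for every trigger whose preset
    environmental condition is satisfied by the current state."""
    return [(t.target_group, t.protective_action)
            for t in triggers if t.condition(hazard_state)]

# Hypothetical wildfire example: warn a community to evacuate once the
# fire front crosses a proximal ridgeline (here, distance <= 0 km).
triggers = [WarningTrigger("Canyon View residents", "evacuate",
                           lambda s: s["fire_dist_to_ridge_km"] <= 0.0)]
print(evaluate(triggers, {"fire_dist_to_ridge_km": -0.3}))
```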
Loss Estimations due to Earthquakes and Secondary Technological Hazards
NASA Astrophysics Data System (ADS)
Frolova, N.; Larionov, V.; Bonnin, J.
2009-04-01
Expected loss and damage assessment due to natural and technological disasters is of primary importance for emergency management just after a disaster, as well as for the development and implementation of preventive measures. The paper addresses procedures and simulation models for loss estimation due to strong earthquakes and secondary technological accidents. Mathematical models for shaking intensity distribution, damage to buildings and structures, debris volume, and the number of fatalities and injuries due to earthquakes and technological accidents at fire- and chemically-hazardous facilities are considered; these are used in geographical information systems designed for these purposes. Criteria for the occurrence of technological accidents are developed on the basis of engineering analysis of the consequences of past events. The paper provides the results of scenario earthquake consequence estimation and individual seismic risk assessment taking into account secondary technological hazards at regional and urban levels. Individual risk is understood as the probability of death (or injury) due to a possible hazardous event within one year in a given territory. It is determined through the mathematical expectation of social losses, taking into account the number of inhabitants in the considered settlement and the probability of natural and/or technological disaster.
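As a worked illustration of the individual risk definition, assuming invented scenario probabilities and fatality expectations for a settlement of 50,000 inhabitants:

```python
# Individual risk as the mathematical expectation of social losses:
# R = sum over scenarios of P(scenario per year) * E[deaths | scenario] / N,
# where N is the number of inhabitants of the settlement.
scenarios = [
    # (annual probability, expected fatalities) -- illustrative numbers
    (0.002, 120.0),   # strong earthquake
    (0.0005, 40.0),   # earthquake-triggered chemical release
    (0.001, 15.0),    # earthquake-triggered fire
]
inhabitants = 50_000
individual_risk = sum(p * deaths for p, deaths in scenarios) / inhabitants
print(f"individual risk: {individual_risk:.2e} per year")
```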
Heat wave hazard classification and risk assessment using artificial intelligence fuzzy logic.
Keramitsoglou, Iphigenia; Kiranoudis, Chris T; Maiheu, Bino; De Ridder, Koen; Daglis, Ioannis A; Manunta, Paolo; Paganini, Marc
2013-10-01
The average summer temperatures as well as the frequency and intensity of hot days and heat waves are expected to increase due to climate change. Motivated by this consequence, we propose a methodology to evaluate the monthly heat wave hazard and risk and its spatial distribution within large cities. A simple urban climate model with assimilated satellite-derived land surface temperature images was used to generate a historic database of urban air temperature fields. Heat wave hazard was then estimated from the analysis of these hourly air temperatures distributed at a 1-km grid over Athens, Greece, by identifying the areas that are more likely to suffer higher temperatures in the case of a heat wave event. Innovation lies in the artificial intelligence fuzzy logic model that was used to classify the heat waves from mild to extreme by taking into consideration their duration, intensity and time of occurrence. The monthly hazard was subsequently estimated as the cumulative effect from the individual heat waves that occurred at each grid cell during a month. Finally, monthly heat wave risk maps were produced integrating geospatial information on the population vulnerability to heat waves calculated from socio-economic variables.
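A minimal sketch of fuzzy heat wave classification in plain NumPy: triangular membership functions for duration and intensity, a weight for time of occurrence, and a defuzzified severity score mapped onto mild-to-extreme labels. All membership breakpoints and rule weights are illustrative assumptions, not the model of the study:

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular fuzzy membership function with support [a, c], peak b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def heat_wave_severity(duration_days, intensity_degC, night_event):
    """Fuzzy classification of a heat wave from 'mild' to 'extreme'
    using duration, intensity and time of occurrence (illustrative
    membership functions and rule weights only)."""
    long_dur = tri(duration_days, 3, 8, 14)
    hot = tri(intensity_degC, 2, 6, 10)        # deg C above threshold
    timing = 1.0 if night_event else 0.5       # hot nights weigh more
    # Mamdani-style aggregation, defuzzified to a 0..1 severity score
    severity = timing * max(min(long_dur, hot), 0.3 * max(long_dur, hot))
    labels = ["mild", "moderate", "severe", "extreme"]
    return labels[min(int(severity * len(labels)), len(labels) - 1)], severity

print(heat_wave_severity(duration_days=6, intensity_degC=7, night_event=True))
```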
Relocating San Miguel Volcanic Seismic Events for Receiver Functions and Tomographic Models
NASA Astrophysics Data System (ADS)
Patlan, E.; Velasco, A. A.; Konter, J.
2009-12-01
The San Miguel volcano lies near the city of San Miguel, El Salvador (13.43° N, 88.26° W). San Miguel, an active stratovolcano, presents a significant natural hazard for the city. Furthermore, the internal state and activity of a volcano remain important components in understanding volcanic hazard. The main technology for addressing volcanic hazards and processes is the analysis of data collected from deployed seismic sensors that record ground motion. Six UTEP seismic stations were deployed around San Miguel volcano from 2007-2008 to define the magma chamber and assess the seismic and volcanic hazard. We utilize these data to develop images of the earth structure beneath the volcano, to study the volcanic processes by identifying different sources, and to investigate the role of earthquakes and faults in controlling the volcanic processes. We will calculate receiver functions to determine the thickness of San Miguel volcano's internal structure within the Caribbean plate. Crustal thicknesses will be modeled using receiver functions calculated from both theoretical and hand-picked P-wave arrivals. We will use this information derived from receiver functions, along with P-wave delay times, to map the location of the magma chamber.
Polinski, Jennifer M.; Schneeweiss, Sebastian; Glynn, Robert J.; Lii, Joyce; Rassen, Jeremy
2012-01-01
Purpose: Under Medicare Part D, patient characteristics influence plan choice, which in turn influences Part D coverage gap entry. We compared pre-defined propensity score (PS) and high-dimensional propensity score (hdPS) approaches to address such 'confounding by health system use' in assessing whether coverage gap entry is associated with cardiovascular events or death. Methods: We followed 243,079 Medicare patients aged 65+ with linked prescription, medical, and plan-specific data in 2005–2007. Patients reached the coverage gap and were followed until an event or year's end. Exposed patients were responsible for drug costs in the gap; unexposed patients (patients with non-Part D drug insurance and Part D patients receiving a low-income subsidy (LIS)) received financial assistance. Exposed patients were 1:1 PS- or hdPS-matched to unexposed patients. The PS model included 52 predefined covariates; the hdPS model added 400 empirically identified covariates. Hazard ratios for death and any of five cardiovascular outcomes were compared. In sensitivity analyses, we explored residual confounding using only LIS patients in the unexposed group. Results: In unadjusted analyses, exposed patients had no greater hazard of death (HR=1.00; 95% CI, 0.84–1.20) or other outcomes. PS- (HR=1.29; 0.99–1.66) and hdPS- (HR=1.11; 0.86–1.42) matched analyses showed elevated but non-significant hazards of death. In sensitivity analyses, the PS analysis showed a protective effect (HR=0.78; 0.61–0.98), while the hdPS analysis (HR=1.06; 0.82–1.37) confirmed the main hdPS findings. Conclusion: Although the PS-matched analysis suggested elevated though non-significant hazards of death among patients with no financial assistance during the gap, the hdPS analysis produced lower estimates that were stable across sensitivity analyses. PMID:22552984
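A hedged sketch of the two stages on simulated data, using scikit-learn: a logistic propensity model followed by 1:1 nearest-neighbour matching (with replacement, for simplicity). The covariates and exposure model are invented:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(3)
n = 2000
X = rng.normal(size=(n, 5))                             # baseline covariates
treated = rng.binomial(1, 1 / (1 + np.exp(-X[:, 0])))   # exposure model

# Stage 1: estimate the propensity score P(exposed | covariates)
ps = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]

# Stage 2: 1:1 nearest-neighbour matching on the propensity score
t_idx, c_idx = np.where(treated == 1)[0], np.where(treated == 0)[0]
nn = NearestNeighbors(n_neighbors=1).fit(ps[c_idx].reshape(-1, 1))
_, match = nn.kneighbors(ps[t_idx].reshape(-1, 1))
matched_controls = c_idx[match.ravel()]
print("matched pairs:", len(t_idx))
print("mean PS (treated vs matched controls):",
      ps[t_idx].mean().round(3), ps[matched_controls].mean().round(3))
# An hdPS analysis would proceed identically, but with hundreds of
# empirically identified covariates appended to X before stage 1.
```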
Back analysis of Swiss flood danger map to define local flood hazards
NASA Astrophysics Data System (ADS)
Choffet, Marc; Derron, Marc-Henri; Jaboyedoff, Michel; Leroi, Eric; Mayis, Arnaud
2010-05-01
The flood hazard maps for all of Switzerland will be available at the end of 2011. Furthermore, the Swiss territory has been covered by aerial laser scanning (ALS), providing a high-resolution digital elevation model (HR-DEM). This paper describes the development of a method for analyzing the local flood hazard based on Swiss hazard maps and the HR-DEM. In their original state, Swiss hazard maps are constructed by aggregating information in an intensity-frequency matrix. The degree of danger represented by the yellow, blue and red zones gives no information on the water level at each point of the territory. The developed method is based on a superposition of the danger map with the HR-DEM to determine the water level in a hazard area. To perform this method, (1) a triangulation is built on the intersection of the hazard map with the HR-DEM, using the limits of the areas where the information is constrained; the hazard map perimeter and the boundaries of hazard areas give information on the widest possible overflow in case of flooding, and can also be associated with a return period. (2) Based on these areas and the difference with the DEM, it is possible to calibrate the highest flood level and to extract water levels for the entire area. This analysis of existing documents opens up interesting perspectives for understanding how infrastructure is threatened by flood hazard, by predicting water levels and potential damage to buildings while proposing remedial measures. Indeed, this method allows estimating the water level at each point of a building in case of flooding. It is designed to provide spatial information on water levels, offering a different view of buildings in danger zones. It becomes possible to discern several elements, such as areas of water accumulation implying longer flood duration, possible structural damage to buildings due to high hydrostatic pressure, determination of a local hazard, or the display of water levels in 3D.
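The core of step (2) can be sketched in one dimension: calibrate the flood surface to the highest terrain elevation on the mapped zone boundary, then take the difference with the DEM inside the zone. The cross-section values are invented:

```python
import numpy as np

# Toy 1-D cross-section: DEM elevations along a line crossing a mapped
# flood hazard zone; `in_zone` flags cells inside the mapped zone.
dem = np.array([402.0, 401.2, 400.1, 399.3, 399.0, 399.4, 400.6, 402.3])
in_zone = np.array([0, 1, 1, 1, 1, 1, 1, 0], dtype=bool)

# The zone boundary marks the widest overflow: the flood surface is
# calibrated to the highest terrain elevation on the zone's edge.
edge = in_zone & ~np.roll(in_zone, 1) | in_zone & ~np.roll(in_zone, -1)
flood_surface = dem[edge].max()

water_depth = np.where(in_zone, np.maximum(flood_surface - dem, 0.0), 0.0)
print("flood surface:", flood_surface, "m; depths:", water_depth.round(1))
```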
Plenary: Progress in Regional Landslide Hazard Assessment—Examples from the USA
Baum, Rex L.; Schulz, William; Brien, Dianne L.; Burns, William J.; Reid, Mark E.; Godt, Jonathan W.
2014-01-01
Landslide hazard assessment at local and regional scales contributes to mitigation of landslides in developing and densely populated areas by providing information for (1) land development and redevelopment plans and regulations, (2) emergency preparedness plans, and (3) economic analysis to (a) set priorities for engineered mitigation projects and (b) define areas of similar levels of hazard for insurance purposes. US Geological Survey (USGS) research on landslide hazard assessment has explored a range of methods that can be used to estimate temporal and spatial landslide potential and probability for various scales and purposes. Cases taken primarily from our work in the U.S. Pacific Northwest illustrate and compare a sampling of methods, approaches, and progress. For example, landform mapping using high-resolution topographic data resulted in identification of about four times more landslides in Seattle, Washington, than previous efforts using aerial photography. Susceptibility classes based on the landforms captured 93 % of all historical landslides (all types) throughout the city. A deterministic model for rainfall infiltration and shallow landslide initiation, TRIGRS, was able to identify locations of 92 % of historical shallow landslides in southwest Seattle. The potentially unstable areas identified by TRIGRS occupied only 26 % of the slope areas steeper than 20°. Addition of an unsaturated infiltration model to TRIGRS expands the applicability of the model to areas of highly permeable soils. Replacement of the single cell, 1D factor of safety with a simple 3D method of columns improves accuracy of factor of safety predictions for both saturated and unsaturated infiltration models. A 3D deterministic model for large, deep landslides, SCOOPS, combined with a three-dimensional model for groundwater flow, successfully predicted instability in steep areas of permeable outwash sand and topographic reentrants. These locations are consistent with locations of large, deep, historically active landslides. For an area in Seattle, a composite of the three maps illustrates how maps produced by different approaches might be combined to assess overall landslide potential. Examples from Oregon, USA, illustrate how landform mapping and deterministic analysis for shallow landslide potential have been adapted into standardized methods for efficiently producing detailed landslide inventory and shallow landslide susceptibility maps that have consistent content and format statewide.
Integrated survival analysis using an event-time approach in a Bayesian framework
Walsh, Daniel P.; Dreitz, VJ; Heisey, Dennis M.
2015-01-01
Event-time or continuous-time statistical approaches have been applied throughout the biostatistical literature and have led to numerous scientific advances. However, these techniques have traditionally relied on knowing failure times. This has limited the application of these analyses, particularly within the ecological field, where the fates of marked animals may be unknown. To address these limitations, we developed an integrated approach within a Bayesian framework to estimate hazard rates in the face of unknown fates. We combine failure/survival times from individuals whose fates are known (times that may be interval-censored) with information from those whose fates are unknown, and model the process of detecting animals with unknown fates. This provides the foundation for our integrated model and permits necessary parameter estimation. We provide the Bayesian model, its derivation, and use simulation techniques to investigate the properties and performance of our approach under several scenarios. Lastly, we apply our estimation technique using a piece-wise constant hazard function to investigate the effects of year, age, chick size and sex, sex of the tending adult, and nesting habitat on mortality hazard rates of endangered mountain plover (Charadrius montanus) chicks. Traditional models were inappropriate for this analysis because the fates of some individual chicks were unknown due to failed radio transmitters. Simulations revealed that biases of posterior mean estimates were minimal (≤ 4.95%), and posterior distributions behaved as expected, with RMSE of the estimates decreasing as sample sizes, detection probability, and survival increased. We determined that mortality hazard rates for plover chicks were highest at <5 days old and were lower for chicks with larger birth weights and/or whose nest was within agricultural habitats. Based on its performance, our approach greatly expands the range of problems for which event-time analyses can be used by eliminating the need for completely known fate data.
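For orientation, a piecewise-constant hazard reduces to simple occurrence/exposure rates when all fates are known; the sketch below shows that known-fate baseline (the paper's contribution, handling unknown fates in a Bayesian framework, sits on top of this). Times, fates and cut points are hypothetical:

```python
import numpy as np

def piecewise_hazard_mle(times, events, cuts):
    """MLE of a piecewise-constant hazard with fully known fates:
    within each interval the estimate is deaths / total time at risk."""
    edges = np.r_[0.0, cuts, np.inf]
    rates = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        exposure = np.clip(times, lo, hi) - lo    # time at risk in interval
        deaths = np.sum(events & (times >= lo) & (times < hi))
        rates.append(deaths / exposure.sum())
    return np.array(rates)

# Hypothetical chick survival times (days) and fate indicators
times = np.array([2.0, 4.5, 6.0, 9.0, 12.0, 20.0, 25.0, 30.0])
events = np.array([1, 1, 0, 1, 0, 1, 0, 0], dtype=bool)
print(piecewise_hazard_mle(times, events, cuts=[5.0, 15.0]))
```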
Preliminary design package for Sunair SEC-601 solar collector
NASA Technical Reports Server (NTRS)
1978-01-01
The preliminary design of the Owens-Illinois model Sunair SEC-601 tubular air solar collector is presented. Information in this package includes the subsystem design and development approaches, hazard analysis, and detailed drawings available at the preliminary design review.
Evaluating the influence of gully erosion on landslide hazard analysis triggered by heavy rainfall
NASA Astrophysics Data System (ADS)
Ruljigaljig, Tjuku; Tsai, Ching-Jun; Peng, Wen-Fei; Yu, Teng-To
2017-04-01
During rainstorm periods, such as typhoons or heavy rain, gully development can induce large-scale landslides. The purpose of this study is to assess and quantify the existence and development of gullies and their role in triggering landslides within landslide hazard analysis. Firstly, based on multi-scale DEM data, this study uses the wavelet transform to construct an automatic algorithm. The 1-meter DEM is used to evaluate the location and type of gullies, and to establish an evaluation model for predicting erosion development. Routes in the Chai-Yi area were studied to clarify the damage potential to roadways from local gullies. The presence of a gully is treated as a parameter that locally reduces the strength parameters. The resulting distribution of the factor of safety (F.S.) is compared with the landslide inventory map. The results of this research could be used to increase the prediction accuracy of landslide hazard analysis for heavy rainfall.
Empirical calibration of a roadside hazardousness index for Spanish two-lane rural roads.
Pardillo-Mayora, José M; Domínguez-Lira, Carlos A; Jurado-Piña, Rafael
2010-11-01
Crash records and roadside data from Spanish two-lane rural roads were analyzed to study the effect of roadside configuration on safety. Four indicators were used to characterize the main roadside features that have an influence on the consequences of roadway departures: roadside slope, distance of non-traversable obstacles from the roadway edge, safety barrier installation, and alignment. Based on the analysis of the effect of roadside configuration on the frequency and severity of run-off-road injury crashes, a categorical roadside hazardousness scale was defined. Cluster analysis was applied to group the combinations of the four indicators into categories with homogeneous effects on run-off-road injury crash frequency and severity. As a result, a 5-level Roadside Hazardousness Index (RHI) was defined. The RHI can be used as a reference to normalize the collection of roadside safety-related information. The index can also be used as a variable for the inclusion of roadside condition information in multivariate crash prediction models. Copyright © 2010 Elsevier Ltd. All rights reserved.
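A hedged sketch of the clustering step with scikit-learn: segments described by the four indicators are grouped into five clusters, which are then ordered into RHI levels by a severity proxy. The indicator codings and the proxy are invented for illustration:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(4)
n = 500
# Four roadside indicators per segment (all hypothetical codings):
# slope class, obstacle distance (m), barrier present, alignment class
X = np.column_stack([rng.integers(1, 4, n),
                     rng.uniform(0, 15, n),
                     rng.integers(0, 2, n),
                     rng.integers(1, 4, n)]).astype(float)

km = KMeans(n_clusters=5, n_init=10, random_state=0)
cluster = km.fit_predict(StandardScaler().fit_transform(X))

# Order clusters by a crash-severity proxy so labels 1..5 form the index
severity_proxy = X[:, 0] - 0.2 * X[:, 1] - X[:, 2]   # illustrative only
order = np.argsort([severity_proxy[cluster == k].mean() for k in range(5)])
rhi = np.empty(n, dtype=int)
for rank, k in enumerate(order):
    rhi[cluster == k] = rank + 1
print(np.bincount(rhi)[1:])   # segments per RHI level 1..5
```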
Using the CABLES model to assess and minimize risk in research: control group hazards.
Koocher, G P
2002-01-01
CABLES is both an acronym and metaphor for conceptualizing research participation risk by considering 6 distinct domains in which risks of harm to research participants may exist: cognitive, affective, biological, legal, economic, and social/cultural. These domains are described and illustrated, along with suggestions for minimizing or eliminating the potential hazards to human participants in biomedical and behavioral science research. Adoption of a thoughtful ethical analysis addressing all 6 CABLES strands in designing research provides a strong protective step toward safeguarding and promoting the well-being of study participants.
Nath, D C; Land, K C; Singh, K K
1994-04-01
This paper investigates the effects of continued breast-feeding after the resumption of menses on fertility, using data from two retrospective surveys in India together with single-decrement life table and multivariate time-dependent hazards analyses. Breast-feeding even after the return of menses is found to be associated with longer birth intervals. The interaction of breast-feeding duration after the resumption of menses and postpartum amenorrhoea has a significant effect on the risk of conception after the return of menses.
The Uniform California Earthquake Rupture Forecast, Version 2 (UCERF 2)
2008-01-01
California's 35 million people live among some of the most active earthquake faults in the United States. Public safety demands credible assessments of the earthquake hazard to maintain appropriate building codes for safe construction and earthquake insurance for loss protection. Seismic hazard analysis begins with an earthquake rupture forecast: a model of probabilities that earthquakes of specified magnitudes, locations, and faulting types will occur during a specified time interval. This report describes a new earthquake rupture forecast for California developed by the 2007 Working Group on California Earthquake Probabilities (WGCEP 2007).
Aviation Safety Program Atmospheric Environment Safety Technologies (AEST) Project
NASA Technical Reports Server (NTRS)
Colantonio, Ron
2011-01-01
Engine Icing Characterization and Simulation Capability: Develop knowledge bases, analysis methods, and simulation tools needed to address the problem of engine icing, in particular ice-crystal icing.
Airframe Icing Simulation and Engineering Tool Capability: Develop and demonstrate 3-D capability to simulate and model airframe ice accretion and related aerodynamic performance degradation for current and future aircraft configurations in an expanded icing environment that includes freezing drizzle/rain.
Atmospheric Hazard Sensing and Mitigation Technology Capability: Improve and expand remote sensing and mitigation of hazardous atmospheric environments and phenomena.
Seismic Hazard Analysis as a Controlling Technique of Induced Seismicity in Geothermal Systems
NASA Astrophysics Data System (ADS)
Convertito, V.; Sharma, N.; Maercklin, N.; Emolo, A.; Zollo, A.
2011-12-01
The effects of induced seismicity in geothermal systems during stimulation and fluid circulation can range from light and unfelt to severe and damaging. If a modern geothermal system is to achieve the greatest efficiency while remaining acceptable from the social point of view, the system must be manageable in a way that reduces possible impacts in advance. In this framework, automatic control of the seismic response of the stimulated reservoir is nowadays mandatory, particularly in proximity to densely populated areas. Techniques have recently been proposed for this purpose, mainly based on the concept of the traffic light. This system provides a tool to decide the level of the stimulation rate based on real-time analysis of the induced seismicity and the ongoing ground motion values. However, in some cases the induced effects can be delayed with respect to the time when the reservoir is stimulated. Thus, a controlling technique able to estimate ground motion levels over different time scales can help to better manage the geothermal system. Here we present an adaptation of classical probabilistic seismic hazard analysis to the case where the seismicity rate as well as the propagation medium properties are not constant with time. We use a non-homogeneous seismicity model, in which the seismicity rate and the b-value of the recurrence relationship change with time. Additionally, as a further controlling procedure, we propose a moving time-window analysis of the recorded peak ground-motion values aimed at monitoring changes in the propagation medium. In fact, for the same set of magnitude values recorded at the same stations, we expect that on average peak ground motion values attenuate in the same way; residual differences can therefore reasonably be ascribed to changes in medium properties. These changes can be modeled and directly introduced into the hazard integral. We applied the proposed technique to a training dataset of induced earthquakes recorded by the Berkeley-Geysers network, installed in The Geysers geothermal area in Northern California. The reliability of the technique is then tested on a different dataset by performing seismic hazard analysis in a time-evolving approach, which provides ground-motion values having fixed probabilities of exceedance. These values are finally compared with the observations using appropriate statistical tests.
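The time-dependent hazard computation can be sketched as a Poisson exceedance probability built from a magnitude-binned rate model and a ground-motion prediction term; re-evaluating it with updated rates and updated sigma (reflecting detected medium changes) gives the evolving hazard. The GMPE form, its coefficients, and the rates below are illustrative placeholders:

```python
import numpy as np
from scipy.stats import norm

def exceedance_prob(z, rates_by_mag, mags, dist_km, window_yr,
                    b0=-4.0, b1=0.9, b2=-1.3, sigma=0.5):
    """Time-dependent hazard: P(PGA > z in window) =
    1 - exp(-sum_m rate_m * P(PGA > z | m, r) * window), with a toy
    log-linear GMPE whose coefficients and sigma are placeholders.
    Updating rates_by_mag and sigma through time yields the evolving
    hazard used by the controlling procedure."""
    ln_med = b0 + b1 * mags + b2 * np.log(dist_km)
    p_exceed = 1.0 - norm.cdf((np.log(z) - ln_med) / sigma)
    return 1.0 - np.exp(-np.sum(rates_by_mag * p_exceed) * window_yr)

mags = np.array([1.5, 2.0, 2.5, 3.0, 3.5])
rates = np.array([200.0, 60.0, 18.0, 5.0, 1.5])    # events/yr per bin
print(exceedance_prob(z=0.1, rates_by_mag=rates, mags=mags,
                      dist_km=5.0, window_yr=0.25))
```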
Models and analysis for multivariate failure time data
NASA Astrophysics Data System (ADS)
Shih, Joanna Huang
The goal of this research is to develop and investigate models and analytic methods for multivariate failure time data. We compare models in terms of direct modeling of the margins, flexibility of dependency structure, local vs. global measures of association, and ease of implementation. In particular, we study copula models, and models produced by right neutral cumulative hazard functions and right neutral hazard functions. We examine the changes of association over time for families of bivariate distributions induced from these models by displaying their density contour plots, conditional density plots, correlation curves of Doksum et al., and local cross ratios of Oakes. Bivariate distributions with the same margins can exhibit quite different dependency structures. In addition to modeling, we study estimation procedures. For copula models, we investigate three estimation procedures. The first procedure is full maximum likelihood. The second procedure is two-stage maximum likelihood: at stage 1, we estimate the parameters in the margins by maximizing the marginal likelihood; at stage 2, we estimate the dependency structure with the margins fixed at their estimates. The third procedure is two-stage partially parametric maximum likelihood; it is similar to the second procedure, but the margins are estimated by the Kaplan-Meier estimate. We derive asymptotic properties for these three estimation procedures and compare their efficiency by Monte Carlo simulations and direct computations. For models produced by right neutral cumulative hazards and right neutral hazards, we derive the likelihood and investigate the properties of the maximum likelihood estimates. Finally, we develop goodness-of-fit tests for the dependency structure in the copula models. We derive a test statistic and its asymptotic properties based on the test of homogeneity of Zelterman and Chen (1988), and a graphical diagnostic procedure based on the empirical Bayes approach. We study the performance of these two methods using actual and computer-generated data.
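A minimal sketch of the two-stage procedure for a Clayton copula on simulated uncensored data: stage 1 fits parametric exponential margins (the Kaplan-Meier estimate would replace this in the partially parametric variant), and stage 2 maximizes the copula log-likelihood with the margins held fixed:

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(5)
n = 1000
# Simulate bivariate failure times from a Clayton copula (theta = 2)
# with unit-exponential margins, via conditional sampling.
u, w = rng.uniform(size=n), rng.uniform(size=n)
theta_true = 2.0
v = ((w ** (-theta_true / (1 + theta_true)) - 1) * u ** (-theta_true)
     + 1) ** (-1 / theta_true)
t1, t2 = -np.log(1 - u), -np.log(1 - v)          # exponential margins

# Stage 1: estimate the margins (parametric exponential MLE here)
u_hat = 1 - np.exp(-t1 / t1.mean())
v_hat = 1 - np.exp(-t2 / t2.mean())

# Stage 2: maximize the Clayton copula log-likelihood, margins fixed
def neg_loglik(theta):
    lc = (np.log(1 + theta)
          - (1 + theta) * (np.log(u_hat) + np.log(v_hat))
          - (2 + 1 / theta) * np.log(u_hat ** -theta + v_hat ** -theta - 1))
    return -lc.sum()

res = minimize_scalar(neg_loglik, bounds=(0.05, 10), method="bounded")
print("two-stage estimate of theta:", round(res.x, 2))
```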
Sustainable regional development and natural hazard impacts
NASA Astrophysics Data System (ADS)
Petrova, Elena; Svetlosanov, Vladimir; Kudin, Valery
2016-04-01
During the last decades, natural hazard impacts on social and economic development in many countries have been increasing, due to the expansion of human activities into areas prone to natural risks as well as to an increase in the number and severity of hazardous natural events caused by climate change and other natural phenomena. The escalation of severe disasters (such as the 2011 Tohoku earthquake and tsunami in Japan) triggered by natural hazards and related natural-technological and environmental events is increasingly threatening sustainable development at different levels, from regional to global. In our study, we develop a model of ecological, economic and social sustainable development for the European part of Russia and the Republic of Belarus. The model consists of six blocks: 1) population, 2) environment, 3) mineral resources, 4) geographic space, 5) investments, and 6) food production and import. These blocks were created based on an analysis of the main processes at the regional level; all the blocks are closely interrelated. Reaching the limit values of block parameters corresponds to a sharp deterioration of the system; as a result, the system can lose its stability. Natural and natural-technological risks impact each block and should be taken into account in the model of regional development. Natural hazards can cause both strong influences and small but permanent perturbations; in both cases, a system can become unstable. A criterion for sustainable development is proposed. The Russian Foundation for Humanities and the Belorussian Republican Foundation for Fundamental Research supported the study (project 15-22-01008).
Determining the Financial Impact of Flood Hazards in Ungaged Basins
NASA Astrophysics Data System (ADS)
Cotterman, K. A.; Gutenson, J. L.; Pradhan, N. R.; Byrd, A.
2017-12-01
Many portions of the Earth lack the adequate authoritative or in situ data that are of great value in determining natural hazard vulnerability from both anthropogenic and physical perspectives. Such locations include the majority of developing nations, which do not possess adequate warning systems and protective infrastructure. The lack of warning and protection from natural hazards makes these nations vulnerable to the destructive power of events such as floods. The goal of this research is to demonstrate an initial workflow with which to characterize flood financial hazards using global datasets and crowd-sourced, non-authoritative data in ungaged river basins. This workflow includes the hydrologic and hydraulic response of the watershed to precipitation, characterized by the physics-based Gridded Surface-Subsurface Hydrologic Analysis (GSSHA) model. In addition, data infrastructure and resources are available to approximate the human impact of flooding. Open source, volunteered geographic information (VGI) data can provide global coverage of elements at risk of flooding. Additional valuation mechanisms can then translate flood exposure into percentage and financial damage to each building. The combination of these tools allows the authors to remotely assess flood hazards with minimal computational, temporal, and financial overhead. This combination of deterministic and stochastic modeling provides the means to quickly characterize watershed flood vulnerability and will allow emergency responders and planners to better understand the implications of flooding, both spatially and financially. In a planning, real-time, or forecasting scenario, the system will assist the user in understanding basin flood vulnerability and increasing community resiliency and preparedness.
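The valuation step can be sketched as a depth-damage curve applied to building values; the curve breakpoints, depths, and values below are invented, and real applications would use locally calibrated curves and VGI-derived footprints:

```python
import numpy as np

def damage_fraction(depth_m):
    """Illustrative depth-damage curve: fraction of building value lost
    as a function of flood depth (piecewise linear, capped at 60%)."""
    return np.interp(depth_m, [0.0, 0.5, 1.5, 3.0], [0.0, 0.1, 0.35, 0.6])

# Hypothetical flood exposure: depths at buildings from a hydraulic
# model (e.g. GSSHA output sampled at building footprints) and
# building replacement values.
depths = np.array([0.0, 0.3, 1.0, 2.2, 4.0])       # m
values = np.array([80, 120, 95, 150, 60]) * 1e3    # USD

losses = damage_fraction(depths) * values
print("per-building loss:", losses.round(0))
print("basin total: $", int(losses.sum()))
```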
Mace Firebaugh, Casey; Moyes, Simon; Jatrana, Santosh; Rolleston, Anna; Kerse, Ngaire
2018-01-18
The relationship between physical activity, function, and mortality is not established in advanced age. Physical activity, function, and mortality were followed in a cohort of Māori and non-Māori adults living in advanced age for a period of six years. Generalised linear regression models were used to analyse the association between physical activity and NEADL, while Kaplan-Meier survival analysis and Cox proportional hazards models were used to assess the association between physical activity and mortality. The hazard ratio for mortality for those in the least active physical activity quartile was 4.1 for Māori and 1.8 for non-Māori compared with the most active quartile. There was an inverse relationship between physical activity and mortality, with lower hazard ratios for mortality at all levels of physical activity. Higher levels of physical activity were associated with lower mortality and higher functional status in adults of advanced age.
An interdisciplinary perspective on social and physical determinants of seismic risk
NASA Astrophysics Data System (ADS)
Lin, K.-H.; Chang, Y.-C.; Liu, G.-Y.; Chan, C.-H.; Lin, T.-H.; Yeh, C.-H.
2015-01-01
While disaster studies researchers usually view risk as a function of hazard, exposure, and vulnerability, few studies have systematically examined the relationships among the various physical and socioeconomic determinants underlying disasters, and fewer have done so through seismic risk analysis. In the context of the 1999 Chi-Chi earthquake in Taiwan, this study constructs five hypothetical models to test different determinants that affect disaster fatality at the village level, namely seismic hazard intensity, population, building fragility, demographics, and socioeconomics. A Poisson regression model is used to estimate the impact of natural hazards and social factors on fatality. Results indicate that although all of the determinants have an impact on specific dimensions of seismic fatality, some indicators of social inequality, such as gender ratio, dependency ratio, and income and its standard deviation, are driving determinants that worsen vulnerability to seismic risk. These findings have strong social implications for policy interventions to mitigate such disasters. This study presents an interdisciplinary investigation into the social and physical determinants of seismic risk.
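A hedged sketch of the village-level Poisson regression with statsmodels, using simulated covariates and log-population as an exposure offset; all coefficients are invented for illustration:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
n_villages = 300
# Hypothetical village-level covariates
intensity = rng.uniform(5, 7, n_villages)        # seismic intensity
log_pop = rng.normal(7, 1, n_villages)           # log population
fragility = rng.uniform(0, 1, n_villages)        # building fragility index
dependency = rng.uniform(0.2, 0.8, n_villages)   # dependency ratio

X = sm.add_constant(np.column_stack([intensity, fragility, dependency]))
mu = np.exp(-16 + 1.6 * intensity + 1.2 * fragility
            + 0.8 * dependency + log_pop)        # simulated fatality mean
deaths = rng.poisson(mu)

# Poisson regression with population as an exposure offset
model = sm.GLM(deaths, X, family=sm.families.Poisson(), offset=log_pop)
print(model.fit().summary())
```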
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hwang, H.H.M.; Chen, C.H.S.
1990-04-16
The report assesses the seismic hazard along the major crude oil pipeline running through the New Madrid seismic zone from southeastern Louisiana to Patoka, Illinois. An 1811-1812 type New Madrid earthquake with moment magnitude 8.2 is assumed to occur at three locations where large historical earthquakes have occurred. Six pipeline crossings of the major rivers in West Tennessee are chosen as the sites for hazard evaluation because of the liquefaction potential at these sites. A seismologically based model is used to predict the bedrock accelerations. Uncertainties in three model parameters, i.e., stress parameter, cutoff frequency, and strong-motion duration, are included in the analysis. Each parameter is represented by three typical values. From the combination of these typical values, a total of 27 earthquake time histories can be generated for each selected site due to an 1811-1812 type New Madrid earthquake occurring at a postulated seismic source.
NASA Astrophysics Data System (ADS)
Werren, G.; Balin, D.; Reynard, E.; Lane, S. N.
2012-04-01
Flood modelling is essential for flood hazard assessment. Modelling becomes a challenge in small, ungauged watersheds prone to flash floods, like the ones draining the town of Beni Mellal (Morocco). Four temporary streams meet in the urban area of Beni Mellal, producing sheet floods every year that are harmful to infrastructure and to people. Here, statistical analysis may not give realistic results, but the study of these repeated, real flash flood events can provide a better understanding of watershed-specific hydrology. This study is part of a larger cooperation project between Switzerland and Morocco aimed at knowledge transfer in disaster risk reduction, especially through hazard mapping and land-use planning related to the implementation of hazard maps. Hydrologic and hydraulic modelling was carried out to obtain hazard maps. An important point was to find open-source data and methods that could still produce a realistic model for the area concerned, in order to provide easy-to-use, cost-effective tools for risk management in developing countries like Morocco, where routine data collection is largely lacking. The data used for modelling are the web-available TRMM 3-hour, 0.25-degree rainfall data provided by the Tropical Rainfall Measuring Mission (TRMM). Hydrologic modelling for discharge estimation was undertaken using methods available in the HEC-HMS software provided by the US Army Corps of Engineers (USACE). Several transfer models were used, so as to choose the best-suited method available. As no measured flow data were available for model calibration, a one-at-a-time sensitivity analysis was performed on the chosen parameters in order to detect their influence on the results. The most important verification method, however, remained field observation, through post-flood field campaigns aimed at mapping water surfaces and depths in the flooded areas, as well as river section monitoring, where rough discharge estimates could be obtained using empirical equations. Another information source was local knowledge, as people could give a rough estimate of the time of concentration by describing flood evolution. Finally, hydraulic modelling of the flooded areas in the urban perimeter was performed using the USACE HEC-RAS software. A specific challenge at this stage was field morphology, as the flooded areas form large alluvial fans, with very different flood behaviour compared to flood plains. Model "calibration" at this stage was undertaken using the mapped water surfaces and depths. Great care was taken in designing the field geometry, where field observations, measured cross sections and field images were used to improve the existing DTM data. The model included protection dikes already built by local authorities in their flood-fighting effort. Because of flash-flood-specific behaviour, only maximal flooded surfaces and flow velocities were simulated, through steady flow analysis in HEC-RAS. The discharge estimates obtained for the chosen event were comparable to 10-year return periods as estimated by the watershed authorities. Times of concentration correspond to this previous estimation and to local people's descriptions. The modelled water surfaces reflect field reality. Flash-flood modelling demands extensive knowledge of the studied field in order to compensate for data scarcity. However, more precise data, like the radar rainfall estimates available in Morocco, would definitely improve the outputs. In this perspective, better data access at the local level and good use of the available methods could benefit the disaster risk reduction effort as a whole.
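As an illustration of the kind of transform method involved, the SCS Curve Number relation (one of the standard options in HEC-HMS) converts a storm depth into direct runoff, from which a crude peak discharge can be scaled; the storm depth, curve number, basin area, and time of concentration below are hypothetical stand-ins for the Beni Mellal inputs:

```python
def scs_runoff_depth(p_mm, cn):
    """SCS Curve Number direct-runoff depth (mm) for storm depth p_mm,
    a standard loss/transform relation also available in HEC-HMS."""
    s = 25400.0 / cn - 254.0            # potential retention (mm)
    ia = 0.2 * s                        # initial abstraction
    return 0.0 if p_mm <= ia else (p_mm - ia) ** 2 / (p_mm - ia + s)

# Hypothetical flash-flood scenario: a 60 mm storm (e.g. from TRMM
# 3-hourly estimates) on a steep, sparsely vegetated basin (CN = 85)
# of 25 km2, concentrated over tc = 2 h.
q_mm = scs_runoff_depth(60.0, 85.0)
area_km2, tc_h = 25.0, 2.0
peak_m3s = q_mm / 1000.0 * area_km2 * 1e6 / (tc_h * 3600.0)  # crude peak
print(f"runoff depth {q_mm:.1f} mm, rough peak {peak_m3s:.0f} m3/s")
```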
Cramer, C.H.
2006-01-01
The Mississippi embayment, located in the central United States, and its thick deposits of sediments (over 1 km in places) have a large effect on earthquake ground motions. Several previous studies have addressed how these thick sediments might modify probabilistic seismic-hazard maps. The high seismic hazard associated with the New Madrid seismic zone makes it particularly important to quantify the uncertainty in modeling site amplification to better represent earthquake hazard in seismic-hazard maps. The methodology of the Memphis urban seismic-hazard-mapping project (Cramer et al., 2004) is combined with the reference profile approach of Toro and Silva (2001) to better estimate seismic hazard in the Mississippi embayment. Improvements over previous approaches include using the 2002 national seismic-hazard model, fully probabilistic hazard calculations, calibration of site amplification with improved nonlinear soil-response estimates, and estimates of uncertainty. Comparisons are made with the results of several previous studies, and estimates of uncertainty inherent in site-amplification modeling for the upper Mississippi embayment are developed. I present new seismic-hazard maps for the upper Mississippi embayment with the effects of site geology incorporating these uncertainties.
Interdisciplinary approach for disaster risk reduction in Valtellina Valley, northern Italy
NASA Astrophysics Data System (ADS)
Garcia, Carolina; Blahut, Jan; Luna, Byron Quan; Poretti, Ilaria; Camera, Corrado; de Amicis, Mattia; Sterlacchini, Simone
2010-05-01
Within the framework of the European research network Mountain Risks, an interdisciplinary research group has been working in the Consortium of Mountain Municipalities of Valtellina di Tirano (northern Italy). This area has been continuously affected by mountain hazards such as landslides, debris flows and floods that directly affect the population and, in some cases, have caused several deaths and millions of euros in losses. An aim of the interdisciplinary work in this study area is to integrate the different scientific products of the research group, in the areas of risk assessment, management and governance, in order to generate, among others, risk reduction tools addressed to the general public and stakeholders. Two types of phenomena have been particularly investigated: debris flows and floods. The scientific products range from modeling and mapping of hazard and risk, to emergency planning based on real-time decision support systems, to surveys for the evaluation of risk perception and preparedness, among others. Outputs from medium-scale hazard and risk modeling could be used by decision makers and spatial planners as well as civil protection authorities to get a general overview of the area and identify hot spots for further detailed analysis. Subsequently, local-scale analysis is necessary to define possible events and risk scenarios for emergency planning. For the modeling of past events and new scenarios of debris flows, the physical outputs of dynamic runout models were used as inputs for physical vulnerability assessment and quantitative risk analysis. In a pilot zone, the physical damage was quantified for each affected structure within the context of physical vulnerability, and different empirical vulnerability curves were obtained. Prospective direct economic losses were estimated. For flood hazard assessment, different approaches and models are being tested in order to produce flood maps for various return periods, related to recorded rainfalls. Regarding civil protection, the main aim is to set up and manage contingency plans in advance; that is, to identify and prepare the people in charge of taking action, to define the activities to be performed, to be aware of available resources, and to optimize the communication system among the people involved, in order to efficiently face a prospective crisis phase. For this purpose, a real-time emergency plan has been developed based on GIS (Geographical Information Systems), DSS (Decision Support Systems), and ICT (Information & Communication Technology).
Probabilistic seismic vulnerability and risk assessment of stone masonry structures
NASA Astrophysics Data System (ADS)
Abo El Ezz, Ahmad
Earthquakes represent major natural hazards that regularly impact the built environment in seismically prone areas worldwide and cause considerable social and economic losses. The high losses incurred following past destructive earthquakes have promoted the need for assessment of the seismic vulnerability and risk of existing buildings. Many historic buildings in the old urban centers of Eastern Canada, such as Old Quebec City, are built of stone masonry and represent immeasurable architectural and cultural heritage. These buildings were built to resist gravity loads only and generally offer poor resistance to lateral seismic loads. Seismic vulnerability assessment of stone masonry buildings is therefore the first necessary step in developing seismic retrofitting and pre-disaster mitigation plans. The objective of this study is to develop a set of probability-based analytical tools for efficient seismic vulnerability and uncertainty analysis of stone masonry buildings. A simplified probabilistic analytical methodology for vulnerability modelling of stone masonry buildings, with systematic treatment of uncertainties throughout the modelling process, is developed in the first part of this study. Building capacity curves are developed using a simplified mechanical model. A displacement-based procedure is used to develop damage state fragility functions in terms of spectral displacement response, based on drift thresholds of stone masonry walls. A simplified probabilistic seismic demand analysis is proposed to capture the combined uncertainty in capacity and demand on fragility functions. In the second part, a robust analytical procedure for the development of seismic hazard compatible fragility and vulnerability functions is proposed. The results are given by sets of seismic hazard compatible vulnerability functions in terms of a structure-independent intensity measure (e.g. spectral acceleration) that can be used for seismic risk analysis. The procedure is very efficient for conducting rapid vulnerability assessment of stone masonry buildings; with modification of the input structural parameters, it can be adapted and applied to any other building class. A sensitivity analysis of the seismic vulnerability modelling is conducted to quantify the uncertainties associated with each of the input parameters. The proposed methodology was validated in a scenario-based seismic risk assessment of existing buildings in Old Quebec City. The procedure for hazard compatible vulnerability modelling was used to develop seismic fragility functions in terms of spectral acceleration representative of the inventoried buildings. A total of 1220 buildings were considered. The assessment was performed for a scenario event of magnitude 6.2 at a distance of 15 km, with a probability of exceedance of 2% in 50 years. The study showed that most of the expected damage is concentrated in the old brick and stone masonry buildings.
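Fragility functions of the kind described are commonly lognormal in the intensity measure; the sketch below evaluates such curves for a set of damage states. The medians and dispersions are invented placeholders, not the study's fitted values:

```python
import numpy as np
from scipy.stats import norm

def fragility(im, median, beta):
    """Lognormal fragility: probability of reaching or exceeding a
    damage state given intensity measure `im` (e.g. spectral
    acceleration), with median capacity and total dispersion beta
    combining capacity and demand uncertainty."""
    return norm.cdf(np.log(im / median) / beta)

# Hypothetical damage states for a stone masonry class (Sa in g)
states = {"slight": (0.10, 0.55), "moderate": (0.22, 0.60),
          "extensive": (0.40, 0.65), "complete": (0.65, 0.70)}

sa_scenario = 0.30   # scenario spectral acceleration (g)
for name, (med, beta) in states.items():
    print(f"P(>= {name:9s} | Sa={sa_scenario} g) = "
          f"{fragility(sa_scenario, med, beta):.2f}")
```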
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-19
... Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food; Extension of... Analysis and Risk-Based Preventive Controls for Human Food.'' FOR FURTHER INFORMATION CONTACT: Domini Bean... Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food'' with a 120-day comment...
Gallina, Valentina; Torresan, Silvia; Critto, Andrea; Sperotto, Anna; Glade, Thomas; Marcomini, Antonio
2016-03-01
This paper presents a review of existing multi-risk assessment concepts and tools applied by organisations and projects, providing the basis for the development of a multi-risk methodology in a climate change perspective. Relevant initiatives were developed for the assessment of multiple natural hazards (e.g. floods, storm surges, droughts) affecting the same area in a defined timeframe (e.g. year, season, decade). Major research efforts were focused on the identification and aggregation of multiple hazard types (e.g. independent, correlated, cascading hazards) by means of quantitative and semi-quantitative approaches. Moreover, several methodologies aim to assess the vulnerability of multiple targets to specific natural hazards by means of vulnerability functions and indicators at the regional and local scale. The overall results of the review show that multi-risk approaches do not consider the effects of climate change and mostly rely on the analysis of static vulnerability (i.e. no time-dependent vulnerabilities, no changes among exposed elements). A relevant challenge is therefore to develop comprehensive formal approaches for the assessment of different climate-induced hazards and risks, including dynamic exposure and vulnerability. This requires the selection and aggregation of suitable hazard and vulnerability metrics to synthesize information about multiple climate impacts, and the spatial analysis and ranking of risks, including their visualization and communication to end-users. To address these issues, climate impact assessors should develop cross-sectorial collaborations among different areas of expertise (e.g. modellers, natural scientists, economists), integrating information on climate change scenarios with sectorial climate impact assessments, towards the development of a comprehensive multi-risk assessment process. Copyright © 2015 Elsevier Ltd. All rights reserved.
Probabilistic tsunami hazard assessment at Seaside, Oregon, for near- and far-field seismic sources
Gonzalez, F.I.; Geist, E.L.; Jaffe, B.; Kanoglu, U.; Mofjeld, H.; Synolakis, C.E.; Titov, V.V.; Arcas, D.; Bellomo, D.; Carlton, D.; Horning, T.; Johnson, J.; Newman, J.; Parsons, T.; Peters, R.; Peterson, C.; Priest, G.; Venturato, A.; Weber, J.; Wong, F.; Yalciner, A.
2009-01-01
The first probabilistic tsunami flooding maps have been developed. The methodology, called probabilistic tsunami hazard assessment (PTHA), integrates tsunami inundation modeling with methods of probabilistic seismic hazard assessment (PSHA). Application of the methodology to Seaside, Oregon, has yielded estimates of the spatial distribution of 100- and 500-year maximum tsunami amplitudes, i.e., amplitudes with 1% and 0.2% annual probability of exceedance. The 100-year tsunami is generated most frequently by far-field sources in the Alaska-Aleutian Subduction Zone and is characterized by maximum amplitudes that do not exceed 4 m, with an inland extent of less than 500 m. In contrast, the 500-year tsunami is dominated by local sources in the Cascadia Subduction Zone and is characterized by maximum amplitudes in excess of 10 m and an inland extent of more than 1 km. The primary sources of uncertainty in these results include those associated with interevent time estimates, modeling of background sea level, and accounting for temporal changes in bathymetry and topography. Nonetheless, PTHA represents an important contribution to tsunami hazard assessment techniques; viewed in the broader context of risk analysis, PTHA provides a method for quantifying estimates of the likelihood and severity of the tsunami hazard, which can then be combined with vulnerability and exposure to yield estimates of tsunami risk. Copyright 2009 by the American Geophysical Union.
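The return-period/probability correspondences quoted above (100-year ≈ 1% annual, 500-year ≈ 0.2%) follow from assuming Poissonian event occurrence; a quick check:

```python
# Probability of at least one exceedance in an exposure window, assuming
# Poissonian event occurrence with annual rate 1/return_period.
import math

def p_exceedance(return_period_yr, window_yr):
    rate = 1.0 / return_period_yr
    return 1.0 - math.exp(-rate * window_yr)

print(p_exceedance(100, 1))    # ~0.010 -> 1% annual probability
print(p_exceedance(500, 1))    # ~0.002 -> 0.2% annual probability
print(p_exceedance(500, 50))   # ~0.095 -> roughly 10% in 50 years
```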
Chen, Keping; Blong, Russell; Jacobson, Carol
2003-04-01
This paper develops a GIS-based integrated approach to risk assessment in natural hazards, with reference to bushfires. The challenges for undertaking this approach have three components: data integration, risk assessment tasks, and risk decision-making. First, data integration in GIS is a fundamental step for subsequent risk assessment tasks and risk decision-making. A series of spatial data integration issues within GIS such as geographical scales and data models are addressed. Particularly, the integration of both physical environmental data and socioeconomic data is examined with an example linking remotely sensed data and areal census data in GIS. Second, specific risk assessment tasks, such as hazard behavior simulation and vulnerability assessment, should be undertaken in order to understand complex hazard risks and provide support for risk decision-making. For risk assessment tasks involving heterogeneous data sources, the selection of spatial analysis units is important. Third, risk decision-making concerns spatial preferences and/or patterns, and a multicriteria evaluation (MCE)-GIS typology for risk decision-making is presented that incorporates three perspectives: spatial data types, data models, and methods development. Both conventional MCE methods and artificial intelligence-based methods with GIS are identified to facilitate spatial risk decision-making in a rational and interpretable way. Finally, the paper concludes that the integrated approach can be used to assist risk management of natural hazards, in theory and in practice.
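The quantitative core of many MCE-GIS workflows is a weighted linear combination of normalized criterion rasters. A minimal numpy sketch under that assumption; the layer names and weights are hypothetical, not from the paper:

```python
# Weighted linear combination of normalized criterion rasters, the simplest
# MCE operator. Layer names and weights are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
shape = (100, 100)
criteria = {                     # raster, weight (weights sum to 1)
    "fuel_load":     (rng.random(shape), 0.5),
    "slope":         (rng.random(shape), 0.3),
    "house_density": (rng.random(shape), 0.2),
}

def normalize(x):
    """Min-max rescale a raster to [0, 1]."""
    return (x - x.min()) / (x.max() - x.min())

risk = sum(w * normalize(r) for r, w in criteria.values())
print(risk.shape, float(risk.min()), float(risk.max()))
```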
Log-normal frailty models fitted as Poisson generalized linear mixed models.
Hirsch, Katharina; Wienke, Andreas; Kuss, Oliver
2016-12-01
The equivalence of a survival model with a piecewise constant baseline hazard function and a Poisson regression model has been known for decades. As shown in recent studies, this equivalence carries over to clustered survival data: a frailty model with a log-normal frailty term can be interpreted and estimated as a generalized linear mixed model with a binary response, a Poisson likelihood, and a specific offset. Proceeding this way, statistical theory and software for generalized linear mixed models are readily available for fitting frailty models. This gain in flexibility comes at the small price of (1) having to fix the number of pieces for the baseline hazard in advance and (2) having to "explode" the data set by the number of pieces. In this paper we extend the simulations of former studies by using a more realistic baseline hazard (Gompertz) and by comparing the model under consideration with competing models. Furthermore, the SAS macro %PCFrailty is introduced to apply the Poisson generalized linear mixed approach to frailty models. The simulations show good results for the shared frailty model; the new %PCFrailty macro provides proper estimates, especially in the case of 4 events per piece. The suggested Poisson generalized linear mixed approach for log-normal frailty models based on the %PCFrailty macro provides several advantages in the analysis of clustered survival data: more flexible modelling of fixed and random effects, exact (in the sense of non-approximate) maximum likelihood estimation, and standard errors and different types of confidence intervals for all variance parameters. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
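The "explode, then fit Poisson with an offset" recipe is easy to sketch outside SAS. Below is a Python stand-in for the fixed-effects part of the approach; the log-normal frailty would enter as a normally distributed random intercept per cluster, which this sketch omits, and all data are synthetic:

```python
# Piecewise-exponential survival fit as a Poisson GLM: split each subject's
# follow-up at the piece boundaries, then regress event indicators on piece
# dummies and covariates with log(exposure) as the offset.
import numpy as np
import pandas as pd
import statsmodels.api as sm

def explode(df, cuts):
    """Expand (time, event, x) records into one row per piece at risk."""
    rows = []
    for _, r in df.iterrows():
        for j, (lo, hi) in enumerate(zip(cuts[:-1], cuts[1:])):
            if r.time <= lo:
                break
            rows.append({"piece": j, "x": r.x,
                         "exposure": min(r.time, hi) - lo,
                         "event": int(r.event and r.time <= hi)})
    return pd.DataFrame(rows)

rng = np.random.default_rng(1)
n = 500
x = rng.binomial(1, 0.5, n)
t = rng.weibull(1.5, n) * np.exp(-0.5 * x)            # synthetic survival times
df = pd.DataFrame({"time": np.minimum(t, 2.0),        # censor at t = 2
                   "event": (t < 2.0).astype(int), "x": x})

long = explode(df, cuts=[0.0, 0.5, 1.0, 1.5, 2.0])
X = pd.get_dummies(long["piece"], prefix="piece", dtype=float).assign(x=long["x"])
fit = sm.GLM(long["event"], X, family=sm.families.Poisson(),
             offset=np.log(long["exposure"])).fit()
print(fit.params)  # piece dummies ~ log baseline hazard; x ~ log hazard ratio
```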
A hazard rate analysis of fertility using duration data from Malaysia.
Chang, C
1988-01-01
Data from the Malaysia Fertility and Family Planning Survey (MFLS) of 1974 were used to investigate the effects of biological and socioeconomic variables on fertility based on the hazard rate model. Another study objective was to investigate the robustness of the findings of Trussell et al. (1985) by comparing the findings of this study with theirs. The hazard rate of conception for the jth fecundable spell of the ith woman, hij, is determined by duration dependence, tij, measured by the waiting time to conception; unmeasured heterogeneity, HETi; time-invariant variables, Yi (race, cohort, education, age at marriage); and time-varying variables, Xij (age, parity, opportunity cost, income, child mortality, child sex composition). In this study, all the time-varying variables were constant over a spell. An asymptotic chi-square test for the equality of constant hazard rates across birth orders, allowing for time-invariant variables and heterogeneity, showed the importance of time-varying variables and duration dependence. Under the assumption of fixed-effects heterogeneity and a Weibull distribution for the duration of waiting time to conception, the empirical results revealed a negative parity effect, a negative impact from male children, and a positive effect from child mortality on the hazard rate of conception. The estimates of step functions for the hazard rate of conception showed parity-dependent fertility control, evidence of heterogeneity, and the possibility of nonmonotonic duration dependence. In a hazard rate model with piecewise-linear-segment duration dependence, socioeconomic variables such as cohort, child mortality, income, and race had significant effects after controlling for the length of the preceding birth interval. The duration dependence was consistent with the common finding, i.e., first increasing and then decreasing at a slow rate. The effects of education and opportunity cost on fertility were insignificant.
An Overview of GIS-Based Modeling and Assessment of Mining-Induced Hazards: Soil, Water, and Forest.
Suh, Jangwon; Kim, Sung-Min; Yi, Huiuk; Choi, Yosoon
2017-11-27
In this study, current geographic information system (GIS)-based methods and their application to the modeling and assessment of mining-induced hazards were reviewed. Various types of mining-induced hazard, including soil contamination, soil erosion, water pollution, and deforestation, were considered in the discussion of the strength and role of GIS as a viable problem-solving tool. The various types of mining-induced hazard were classified into two or three subtopics according to the steps involved in the reclamation procedure, or the elements of the hazard of interest. Because GIS is well suited to handling the geospatial data associated with mining-induced hazards, the application of GIS-based modeling and assessment of such hazards within the mining industry could be expanded further.
Coastal flooding hazard assessment on potentially vulnerable coastal sectors at Varna regional coast
NASA Astrophysics Data System (ADS)
Eftimova, Petya; Valchev, Nikolay; Andreeva, Nataliya
2017-04-01
Storm-induced flooding is one of the most significant threats that coastal communities face, and in the light of climate change it is expected to gain even more importance. The adequate assessment of this hazard can therefore increase the capability to mitigate environmental, social and economic impacts. The study was carried out within the Coastal Risk Assessment Framework (CRAF) developed in the FP7 RISC-KIT project (Resilience-Increasing Strategies for Coasts - toolkit). The hazard assessment was applied to three potentially vulnerable coastal sectors located on the regional coast of Varna, on the Bulgarian Black Sea coast. The potential "hotspot" candidates were selected during the initial phase of CRAF, which evaluated coastal risks at the regional level. The area of interest comprises different coastal types, from natural beaches and rocky cliffs to man-modified environments represented by coastal and port defense structures such as the Varna Port breakwater, groynes, jetties, and beaches formed by the presence of coastal structures. Coastal flooding was assessed using a combination of models, the XBeach model and the LISFLOOD inundation model, applied consecutively. The XBeach model was employed to calculate the hazard intensities at the coast up to the berm crest, while the LISFLOOD model was used to calculate the intensity and extent of flooding in the hinterland. At the first stage, 75 extreme storm events were simulated using the XBeach model run in "non-hydrostatic" mode to obtain series of flood depth, depth-velocity and overtopping discharges at predefined coastal cross-shore transects. Extreme value analysis was applied to the calculated hazard parameter series in order to determine their probability distribution functions. This is the so-called response approach, which focuses on the onshore impact rather than on the deep-water boundary conditions; it allows calculation of the probability distribution of hazard extremes induced by a variety of combinations of waves and surges. The considered return periods were 20, 50 and 100 years. Subsequently, the overtopping volumes corresponding to the preferred return periods were fed into the LISFLOOD model to calculate the intensity and extent of the hinterland flooding. For beaches with fast-rising slopes backed by cliffs, a combination of XBeach and LISFLOOD output was applied in order to properly map the spatial distribution of flood depth and depth-velocity.
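The extreme value step, fitting a distribution to the storm-maxima series of a hazard parameter and reading off the 20-, 50- and 100-year levels, can be sketched with scipy. The GEV family and all numbers here are illustrative assumptions; the study does not state its distribution choice:

```python
# Fit a GEV to a storm-maxima series of a hazard parameter (e.g. overtopping
# discharge) and compute return levels for the study's return periods.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(7)
maxima = genextreme.rvs(c=-0.1, loc=10.0, scale=3.0, size=75)  # synthetic series

params = genextreme.fit(maxima)                 # shape, loc, scale
for T in (20, 50, 100):
    level = genextreme.ppf(1.0 - 1.0 / T, *params)
    print(f"{T:3d}-yr return level: {level:.1f}")
```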
Elghafghuf, Adel; Dufour, Simon; Reyher, Kristen; Dohoo, Ian; Stryhn, Henrik
2014-12-01
Mastitis is a complex disease affecting dairy cows and is considered the most costly disease of dairy herds. The hazard of mastitis is a function of many factors, both managerial and environmental, making its control a difficult issue for milk producers. Observational studies of clinical mastitis (CM) often generate datasets with a number of characteristics which influence the analysis of those data: the outcome of interest may be the time to occurrence of a case of mastitis, predictors may change over time (time-dependent predictors), the effects of factors may change over time (time-dependent effects), there are usually multiple hierarchical levels, and datasets may be very large. Analysis of such data often requires expansion of the data into the counting-process format - leading to larger datasets - thus complicating the analysis and requiring excessive computing time. In this study, a nested frailty Cox model with time-dependent predictors and effects was applied to Canadian Bovine Mastitis Research Network data in which 10,831 lactations of 8035 cows from 69 herds were followed through lactation until the first occurrence of CM. The model was fit to the data as a Poisson model with nested normally distributed random effects at the cow and herd levels. Risk factors associated with the hazard of CM during the lactation were identified, such as parity, calving season, herd somatic cell score, pasture access, fore-stripping, and proportion of treated cases of CM in a herd. The analysis showed that most of the predictors had a strong effect early in lactation and also demonstrated substantial variation in the baseline hazard among cows and between herds. A small simulation study for a setting similar to the real data was conducted to evaluate the Poisson maximum likelihood estimation approach with both the Gaussian quadrature method and the Laplace approximation. Further, the performance of the two methods was compared with the performance of a widely used estimation approach for frailty Cox models based on the penalized partial likelihood. The simulation study showed good performance for the Poisson maximum likelihood approach with Gaussian quadrature, and biased variance component estimates for both the Poisson maximum likelihood with Laplace approximation and the penalized partial likelihood approaches. Copyright © 2014. Published by Elsevier B.V.
A stereo-vision hazard-detection algorithm to increase planetary lander autonomy
NASA Astrophysics Data System (ADS)
Woicke, Svenja; Mooij, Erwin
2016-05-01
For future landings on any celestial body, increasing lander autonomy and decreasing risk are primary objectives. Both can be achieved by including hazard detection and avoidance in the guidance, navigation, and control loop. One of the main challenges in hazard detection and avoidance is the reconstruction of accurate elevation models, as well as slope and roughness maps. Multiple methods for acquiring the inputs for hazard maps are available; the main distinction can be made between active and passive methods. Passive methods (cameras) have budgetary advantages compared to active sensors (radar, light detection and ranging), but it is necessary to prove that they deliver sufficiently good maps. Therefore, this paper discusses hazard detection using stereo vision. To facilitate a successful landing, no more than 1% missed detections (hazards that are not identified) are allowed. Based on a sensitivity analysis, it was found that a stereo set-up with a baseline of ≤ 2 m is feasible at altitudes of ≤ 200 m, while keeping false positives below 1%. It was thus shown that stereo-based hazard detection is an effective means of decreasing landing risk and increasing lander autonomy. In conclusion, the proposed algorithm is a promising candidate for future landers.
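Stereo depth reconstruction reduces to triangulation over disparity, which also explains the baseline/altitude trade-off: for a fixed baseline, range error grows roughly quadratically with range. A sketch with hypothetical camera parameters:

```python
# Depth from stereo disparity: z = f * B / d. The range error for a matching
# error of one pixel grows ~ z^2 / (f * B), which is why a small baseline
# limits the usable altitude. Camera parameters below are hypothetical.
f_px = 1000.0   # focal length in pixels
B = 2.0         # stereo baseline, m

def depth(disparity_px):
    return f_px * B / disparity_px

def range_error(z, match_err_px=1.0):
    return match_err_px * z**2 / (f_px * B)

for z in (50.0, 100.0, 200.0):
    d = f_px * B / z
    print(f"z={z:5.0f} m  disparity={d:5.1f} px  1-px error={range_error(z):5.2f} m")
```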
Young, Gary S.; Fox, Mary A.; Trush, Michael; Kanarek, Norma; Glass, Thomas A.; Curriero, Frank C.
2012-01-01
Population exposure to multiple chemicals in air presents significant challenges for environmental public health. Air quality regulations distinguish criteria air pollutants (CAPs) (e.g., ozone, PM2.5) from hazardous air pollutants (HAPs)—187 chemicals which include carcinogens and others that are associated with respiratory, cardiovascular, neurological and numerous other non-cancer health effects. Evidence of the public’s cumulative exposure and the health effects of HAPs are quite limited. A multilevel model is used to assess differential exposure to HAP respiratory, neurological, and cancer hazards (2005) related to the Townsend Index of Socioeconomic Deprivation (TSI), after adjustment for regional population size and economic activity, and local population density. We found significant positive associations between tract TSI and respiratory and cancer HAP exposure hazards, and smaller effects for neurological HAPs. Tracts in the top quintile of TSI have between 38%–60% higher HAP exposure than the bottom quintile; increasing population size from the bottom quintile to the top quintile modifies HAP exposure hazard related to TSI, increasing cancer HAP exposure hazard by 6% to 20% and increasing respiratory HAP exposure hazard by 12% to 27%. This study demonstrates the value of social epidemiological methods for analyzing differential exposure and advancing cumulative risk assessment. PMID:22829799
Data Model for Multi Hazard Risk Assessment Spatial Support Decision System
NASA Astrophysics Data System (ADS)
Andrejchenko, Vera; Bakker, Wim; van Westen, Cees
2014-05-01
The goal of the CHANGES Spatial Decision Support System is to support end-users in making decisions related to risk reduction measures for areas at risk from multiple hydro-meteorological hazards. The crucial parts in the design of the system are the user requirements, the data model, the data storage and management, and the relationships between the objects in the system. The implementation of the data model is carried out entirely with an open source database management system with a spatial extension. The web application is implemented using open source geospatial technologies, with PostGIS as the database, Python for scripting, and Geoserver and JavaScript libraries for visualization and the client-side user interface. The model can handle information from different study areas (currently, study areas from France, Romania, Italy and Poland are considered). Furthermore, the data model handles information about administrative units, projects accessible by different types of users, user-defined hazard types (floods, snow avalanches, debris flows, etc.), hazard intensity maps for different return periods, spatial probability maps, elements-at-risk maps (buildings, land parcels, linear features, etc.), and economic and population vulnerability information dependent on the hazard type and the type of element at risk, in the form of vulnerability curves. The system has an inbuilt database of vulnerability curves, but users can also add their own. Included in the model is the management of combinations of different scenarios (e.g. related to climate change, land use change or population change) and alternatives (possible risk-reduction measures), as well as data structures for saving the calculated economic or population loss or exposure per element at risk, aggregating the loss and exposure using the administrative unit maps, and finally producing the risk maps. The risk data can be used for cost-benefit analysis (CBA) and spatial multi-criteria evaluation (SMCE), and the data model includes data structures for both. The model is at the stage where risk and cost-benefit calculations can be stored, while the remaining parts are currently under development; multi-criteria information, user management and their relation to the rest of the model are our next steps. A carefully designed data model plays a crucial role in the development of the whole system: it enables rapid development, keeps the data consistent and, in the end, supports the end-user in making good decisions on risk-reduction measures related to multiple natural hazards. This work is part of the EU FP7 Marie Curie ITN "CHANGES" project (www.changes-itn.edu).
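The loss chain the data model stores, a vulnerability curve applied to hazard intensity per element at risk and then aggregated by administrative unit, is conceptually a lookup plus a group-by. A pandas sketch with hypothetical values:

```python
# Economic loss per element at risk = value * vulnerability(intensity),
# aggregated per administrative unit. All numbers are hypothetical.
import numpy as np
import pandas as pd

# Vulnerability curve: mean damage ratio vs. hazard intensity (e.g. flood depth, m).
curve_x = np.array([0.0, 0.5, 1.0, 2.0, 4.0])
curve_y = np.array([0.0, 0.1, 0.3, 0.7, 1.0])

elements = pd.DataFrame({
    "unit":      ["A", "A", "B", "B"],
    "value_eur": [200_000, 150_000, 300_000, 90_000],
    "intensity": [0.4, 1.5, 0.0, 2.5],
})
elements["loss_eur"] = elements["value_eur"] * np.interp(
    elements["intensity"], curve_x, curve_y)

print(elements.groupby("unit")["loss_eur"].sum())   # loss per administrative unit
```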
Comprehensive Analysis of Drought Persistence, Hazard, and Recovery across the CONUS
NASA Astrophysics Data System (ADS)
Zarekarizi, M.; Ahmadi, B.; Moradkhani, H.
2017-12-01
Drought is a creeping, intertwined natural hazard, affecting society more than any other natural disaster and causing enormous damage to economies and ecosystems. A better understanding of potential drought hazard can help water managers and stakeholders devise mitigation plans to minimize the adverse effects of droughts. In this study, soil moisture, simulated by the Variable Infiltration Capacity (VIC) land surface model, is used to analyze the probability of agricultural drought with different severities across the CONUS. Owing to the persistence of soil moisture, a drought episode at a particular time is affected by its earlier status; therefore, this study utilizes a copula function to model the selected hydrologic variable over time. The probability of drought intensity for each unit is presented spatially. If a unit remains in drought at the same or lower intensity, the drought persists; if it improves above a pre-defined threshold, the unit recovers. Results show that the western US is more vulnerable to drought persistence in summer and spring, while the Midwest and Northeast experience drought persistence in fall and winter. In addition, the analysis reveals that as the intensity of drought in a given season decreases, the following season has a higher chance of recovery.
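The persistence question reduces to the conditional probability that drought continues next season given drought now, which a bivariate copula of consecutive-season soil-moisture percentiles supplies directly. A sketch with a Gaussian copula; the abstract does not name the copula family, and rho and the drought threshold are assumptions:

```python
# P(drought persists) = P(U2 <= u | U1 <= u) = C(u, u) / u, where C is the
# copula linking soil-moisture percentiles of consecutive seasons.
# Gaussian copula shown for illustration only.
from scipy.stats import norm, multivariate_normal

def persistence_prob(u, rho):
    z = norm.ppf(u)
    mvn = multivariate_normal(mean=[0.0, 0.0], cov=[[1.0, rho], [rho, 1.0]])
    c_uu = mvn.cdf([z, z])          # Gaussian copula value C(u, u)
    return c_uu / u

u = 0.2          # drought = soil moisture below its 20th percentile
for rho in (0.3, 0.6, 0.9):
    print(f"rho={rho:.1f}  P(persist)={persistence_prob(u, rho):.2f}")
```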
An Approach to Addressing Selection Bias in Survival Analysis
Carlin, Caroline S.; Solid, Craig A.
2014-01-01
This work proposes a frailty model that accounts for non-random treatment assignment in survival analysis. Using Monte Carlo simulation, we found that estimated treatment parameters from our proposed endogenous selection survival model (esSurv) closely parallel the consistent two-stage residual inclusion (2SRI) results, while offering computational and interpretive advantages. The esSurv method greatly enhances computational speed relative to 2SRI by eliminating the need for bootstrapped standard errors, and generally results in smaller standard errors than those estimated by 2SRI. In addition, esSurv explicitly estimates the correlation of unobservable factors contributing to both treatment assignment and the outcome of interest, providing an interpretive advantage over the residual parameter estimate in the 2SRI method. Comparisons with commonly used propensity score methods and with a model that does not account for non-random treatment assignment show clear bias in these methods that is not mitigated by increased sample size. We illustrate using actual dialysis patient data comparing mortality of patients with mature arteriovenous grafts for venous access to mortality of patients with grafts placed but not yet ready for use at the initiation of dialysis. We find strong evidence of endogeneity (with estimate of correlation in unobserved factors ρ̂ = 0.55), and estimate a mature-graft hazard ratio of 0.197 in our proposed method, with a similar 0.173 hazard ratio using 2SRI. The 0.630 hazard ratio from a frailty model without a correction for the non-random nature of treatment assignment illustrates the importance of accounting for endogeneity. PMID:24845211
Anderson, Carl A; McRae, Allan F; Visscher, Peter M
2006-07-01
Standard quantitative trait loci (QTL) mapping techniques commonly assume that the trait is both fully observed and normally distributed. When considering survival or age-at-onset traits these assumptions are often incorrect. Methods have been developed to map QTL for survival traits; however, they are both computationally intensive and not available in standard genome analysis software packages. We propose a grouped linear regression method for the analysis of continuous survival data. Using simulation we compare this method to both the Cox and Weibull proportional hazards models and a standard linear regression method that ignores censoring. The grouped linear regression method is of equivalent power to both the Cox and Weibull proportional hazards methods and is significantly better than the standard linear regression method when censored observations are present. The method is also robust to the proportion of censored individuals and the underlying distribution of the trait. On the basis of linear regression methodology, the grouped linear regression model is computationally simple and fast and can be implemented readily in freely available statistical software.
Tsunamis: Global Exposure and Local Risk Analysis
NASA Astrophysics Data System (ADS)
Harbitz, C. B.; Løvholt, F.; Glimsdal, S.; Horspool, N.; Griffin, J.; Davies, G.; Frauenfelder, R.
2014-12-01
The 2004 Indian Ocean tsunami led to a better understanding of the likelihood of tsunami occurrence and potential tsunami inundation, and the Hyogo Framework for Action (HFA) was one direct result of this event. The United Nations International Strategy for Disaster Risk Reduction (UN-ISDR) adopted the HFA in January 2005 in order to reduce disaster risk. As an instrument to compare the risk due to different natural hazards, an integrated worldwide study was implemented and published in several Global Assessment Reports (GAR) by UN-ISDR. The results of the global earthquake-induced tsunami hazard and exposure analysis for a return period of 500 years are presented. Both deterministic and probabilistic (PTHA) methods are used, and the resulting hazard levels for both methods are compared quantitatively for selected areas. The comparison demonstrates that the analysis is rather rough, which is expected for a study aiming at average trends at a country level across the globe. It is shown that populous Asian countries account for the largest absolute number of people living in tsunami-prone areas; more than 50% of the total exposed population lives in Japan. Smaller nations like Macao and the Maldives are among the most exposed by population count. Exposed nuclear power plants are limited to Japan, China, India, Taiwan, and the USA. In contrast, a local tsunami vulnerability and risk analysis applies information on population, building types, infrastructure, inundation and flow depth for a certain tsunami scenario with a corresponding return period, combined with empirical data on tsunami damage and mortality. Results and validation of a GIS tsunami vulnerability and risk assessment model are presented. The GIS model is adapted for optimal use of the data available for each study. Finally, the importance of including landslide sources in the tsunami analysis is also discussed.
He, Ye; Lin, Huazhen; Tu, Dongsheng
2018-06-04
In this paper, we introduce a single-index threshold Cox proportional hazard model to select and combine biomarkers to identify patients who may be sensitive to a specific treatment. A penalized smoothed partial likelihood is proposed to estimate the parameters in the model. A simple, efficient, and unified algorithm is presented to maximize this likelihood function. The estimators based on this likelihood function are shown to be consistent and asymptotically normal. Under mild conditions, the proposed estimators also achieve the oracle property. The proposed approach is evaluated through simulation analyses and application to the analysis of data from two clinical trials, one involving patients with locally advanced or metastatic pancreatic cancer and one involving patients with resectable lung cancer. Copyright © 2018 John Wiley & Sons, Ltd.
Lee, A H; Yau, K K
2001-01-01
The aims of this study were to identify factors associated with hospital length of stay (LOS) and to model variations in LOS within Diagnosis Related Groups (DRGs). A proportional hazards frailty modelling approach is proposed that accounts for patient transfers and the inherent correlation of patients clustered within hospitals. The investigation is based on patient discharge data extracted for a group of obstetrical DRGs. Application of the frailty approach highlighted several significant factors after adjustment for patient casemix and random hospital effects. In particular, patients admitted for childbirth with private medical insurance coverage have a higher risk of prolonged hospitalization than public patients. The determination of pertinent factors provides important information to hospital management and clinicians in assessing the risk of prolonged hospitalization. The analysis also enables the comparison of inter-hospital variations across adjacent DRGs.
Prognostic factors in multiple myeloma: selection using Cox's proportional hazard model.
Pasqualetti, P; Collacciani, A; Maccarone, C; Casale, R
1996-01-01
The pretreatment characteristics of 210 patients with multiple myeloma, observed between 1980 and 1994, were evaluated as potential prognostic factors for survival. Multivariate analysis according to Cox's proportional hazard model identified, among 26 candidate single prognostic variables in the 160 patients who died, the following factors in order of importance: beta 2-microglobulin, bone marrow plasma cell percentage, hemoglobinemia, degree of lytic bone lesions, serum creatinine, and serum albumin. From these variables a prognostic index (PI), which combines the regression coefficients of all significant factors derived from Cox's model, was obtained. Using this index it was possible to separate the whole patient group into three stages: stage I (PI < 1.485, 67 patients), stage II (PI: 1.485-2.090, 76 patients), and stage III (PI > 2.090, 67 patients), with median survivals of 68, 36 and 13 months (P < 0.0001), respectively. The responses to therapy (P < 0.0001) and the survival curves (P < 0.00001) also differed significantly among the three subgroups. Knowledge of these factors could be of value in predicting prognosis and in planning therapy in patients with multiple myeloma.
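The prognostic index construction, a linear combination of covariates weighted by Cox regression coefficients and then cut into stages, can be reproduced with the lifelines library. A sketch on synthetic data; the variable names echo the abstract, but the data, coefficients and cut-points are not the study's:

```python
# Prognostic index PI = sum(beta_i * x_i) from a fitted Cox model, then
# tertile cut-points to form stages. Entirely synthetic illustration.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 210
df = pd.DataFrame({
    "beta2_microglobulin": rng.lognormal(1.0, 0.5, n),
    "marrow_plasma_pct":   rng.uniform(5, 90, n),
    "hemoglobin":          rng.normal(11, 2, n),
})
risk = (0.4 * df["beta2_microglobulin"] + 0.02 * df["marrow_plasma_pct"]
        - 0.1 * df["hemoglobin"])
df["months"] = rng.exponential(36 * np.exp(-(risk - risk.mean())))
df["dead"] = rng.binomial(1, 0.76, n)

cph = CoxPHFitter().fit(df, duration_col="months", event_col="dead")
pi = np.log(cph.predict_partial_hazard(df)).squeeze()   # PI up to a constant
stage = pd.qcut(pi, 3, labels=["I", "II", "III"])
print(df.groupby(stage)["months"].median())             # survival by stage
```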
Big Data Toolsets to Pharmacometrics: Application of Machine Learning for Time-to-Event Analysis.
Gong, Xiajing; Hu, Meng; Zhao, Liang
2018-05-01
Additional value can potentially be created by applying big data tools to address pharmacometric problems. The performances of machine learning (ML) methods and the Cox regression model were evaluated based on simulated time-to-event data synthesized under various preset scenarios, i.e., with linear vs. nonlinear and dependent vs. independent predictors in the proportional hazard function, or with high-dimensional data featuring a large number of predictor variables. Our results showed that ML-based methods outperformed the Cox model in prediction performance as assessed by the concordance index, and in identifying the preset influential variables for high-dimensional data. The prediction performance of ML-based methods is also less sensitive to data size and censoring rates than the Cox regression model. In conclusion, ML-based methods provide a powerful tool for time-to-event analysis, with a built-in capacity for high-dimensional data and better performance when the predictor variables assume nonlinear relationships in the hazard function. © 2018 The Authors. Clinical and Translational Science published by Wiley Periodicals, Inc. on behalf of American Society for Clinical Pharmacology and Therapeutics.
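The concordance index used as the comparison metric above is the fraction of comparable pairs whose predicted risk ordering matches the observed event ordering. A sketch using lifelines, with placeholder scores:

```python
# Concordance index on synthetic survival data. lifelines expects a score
# where larger values mean longer predicted survival, so a risk score is
# negated before scoring. All data are placeholders.
import numpy as np
from lifelines.utils import concordance_index

rng = np.random.default_rng(5)
t = rng.exponential(10, 200)                 # observed times
e = rng.binomial(1, 0.7, 200)                # event indicators (1 = event)
risk = -t + rng.normal(0, 2, 200)            # a noisy risk score (higher = riskier)

print(concordance_index(t, -risk, e))        # ~0.5 is random, 1.0 is perfect
```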
Seismic hazard in the eastern United States
Mueller, Charles; Boyd, Oliver; Petersen, Mark D.; Moschetti, Morgan P.; Rezaeian, Sanaz; Shumway, Allison
2015-01-01
The U.S. Geological Survey seismic hazard maps for the central and eastern United States were updated in 2014. We analyze results and changes for the eastern part of the region. Ratio maps are presented, along with tables of ground motions and deaggregations for selected cities. The Charleston fault model was revised, and a new fault source for Charlevoix was added. Background seismicity sources utilized an updated catalog, revised completeness and recurrence models, and a new adaptive smoothing procedure. Maximum-magnitude models and ground motion models were also updated. Broad, regional hazard reductions of 5%–20% are mostly attributed to new ground motion models with stronger near-source attenuation. The revised Charleston fault geometry redistributes local hazard, and the new Charlevoix source increases hazard in northern New England. Strong increases in mid- to high-frequency hazard at some locations—for example, southern New Hampshire, central Virginia, and eastern Tennessee—are attributed to updated catalogs and/or smoothing.
Shakiba, Maryam; Mansournia, Mohammad Ali; Salari, Arsalan; Soori, Hamid; Mansournia, Nasrin; Kaufman, Jay S
2018-06-01
In longitudinal studies, standard analysis may yield biased estimates of exposure effect in the presence of time-varying confounders that are also intermediate variables. We aimed to quantify the relationship between obesity and coronary heart disease (CHD) by appropriately adjusting for time-varying confounders. This study was performed in a subset of participants from the Atherosclerosis Risk in Communities (ARIC) Study (1987-2010), a US study designed to investigate risk factors for atherosclerosis. General obesity was defined as body mass index (weight (kg)/height (m)²) ≥30, and abdominal obesity (AOB) was defined according to either waist circumference (≥102 cm in men and ≥88 cm in women) or waist:hip ratio (≥0.9 in men and ≥0.85 in women). The association of obesity with CHD was estimated by G-estimation and compared with results from accelerated failure-time models using 3 specifications. The first model, which adjusted for baseline covariates, excluding metabolic mediators of obesity, showed increased risk of CHD for all obesity measures. Further adjustment for metabolic mediators in the second model and time-varying variables in the third model produced negligible changes in the hazard ratios. The hazard ratios estimated by G-estimation were 1.15 (95% confidence interval (CI): 0.83, 1.47) for general obesity, 1.65 (95% CI: 1.35, 1.92) for AOB based on waist circumference, and 1.38 (95% CI: 1.13, 1.99) for AOB based on waist:hip ratio, suggesting that AOB increased the risk of CHD. The G-estimated hazard ratios for both measures were further from the null than those derived from standard models.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Veeramany, Arun; Coles, Garill A.; Unwin, Stephen D.
2017-08-25
The Pacific Northwest National Laboratory developed a risk framework for modeling high-impact, low-frequency power grid events to support risk-informed decisions. In this paper, we briefly recap the framework and demonstrate its implementation for seismic and geomagnetic hazards using a benchmark reliability test system. We describe integration of a collection of models implemented to perform hazard analysis, fragility evaluation, consequence estimation, and postevent restoration. We demonstrate the value of the framework as a multihazard power grid risk assessment and management tool. As a result, the research will benefit transmission planners and emergency planners by improving their ability to maintain a resilient grid infrastructure against impacts from major events.
Development of FB-MultiPier dynamic vessel-collision analysis models, phase 2 : [summary].
DOT National Transportation Integrated Search
2014-07-01
When collisions between large vessels and bridge supports occur, they can result in significant damage to bridge and vessel. These collisions are extremely hazardous, often taking lives on the vessel and the bridge. Direct costs of repair a...
Analysis of collision safety associated with CEM and conventional cars mixed within a consist
DOT National Transportation Integrated Search
2003-11-16
A collision dynamics model of a passenger train-to-passenger train collision has been developed to simulate the potential safety hazards and benefits associated with mixing conventional and crash energy management (CEM) cars within a consist. This pape...
NASA Technical Reports Server (NTRS)
Roberts, Dar A.; Church, Richard; Ustin, Susan L.; Brass, James A. (Technical Monitor)
2001-01-01
Large urban wildfires throughout southern California have caused billions of dollars of damage and significant loss of life over the last few decades. Rapid urban growth along the wildland interface, high fuel loads and a potential increase in the frequency of large fires due to climatic change suggest that the problem will worsen in the future. Improved fire spread prediction and reduced uncertainty in assessing fire hazard would be significant, both economically and socially. Current problems in the modeling of fire spread include the role of plant community differences, spatial heterogeneity in fuels, and spatio-temporal changes in fuels. In this research, we evaluated the potential of Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) and Airborne Synthetic Aperture Radar (AIRSAR) data for providing improved maps of wildfire fuel properties. Analysis concentrated on two areas of Southern California, the Santa Monica Mountains and the Santa Barbara Front Range. Wildfire fuel information can be divided into four basic categories: fuel type, fuel load (live green and woody biomass), fuel moisture, and fuel condition (live vs. senesced fuels). To map fuel type, AVIRIS data were used to map vegetation species using Multiple Endmember Spectral Mixture Analysis (MESMA) and binary decision trees. Green live biomass and canopy moisture were mapped using AVIRIS through analysis of the 980 nm liquid water absorption feature and compared to alternative measures of moisture and field measurements. Woody biomass was mapped using L- and P-band cross-polarimetric data acquired in 1998 and 1999. Fuel condition was mapped using spectral mixture analysis to map green vegetation (green leaves), non-photosynthetic vegetation (NPV; stems, wood and litter), shade, and soil. Summaries describing the potential of hyperspectral and SAR data for fuel mapping are provided by Roberts et al. and Dennison et al. To utilize remotely sensed data to assess fire hazard, fuel-type maps were translated into standard fuel models accessible to the FARSITE fire spread simulator; the FARSITE and BEHAVE models are considered industry standards for fire behavior analysis. Anderson-level fuels maps, generated using a binary decision tree classifier, are available for multiple dates in the Santa Monica Mountains and for at least one date in Santa Barbara. Fuel maps to fill in the areas between the Santa Barbara and Santa Monica Mountains study sites are in progress as part of a NASA Regional Earth Science Application Center, the Southern California Wildfire Hazard Center. Species-level maps were supplied to fire management agencies (Los Angeles County Fire, California Department of Forestry). Research results were published extensively in the refereed and non-refereed literature. Educational outreach included funding of several graduate students, undergraduate intern training, and an article featured in the California Alliance for Minorities Program (CAMP) Quarterly Journal.
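The core of spectral mixture analysis is solving each pixel's reflectance as a nonnegative combination of endmember spectra; MESMA additionally iterates over candidate endmember sets. A minimal sketch of the per-pixel unmixing step with toy spectra, not AVIRIS data:

```python
# Linear spectral unmixing: solve reflectance ~= E @ f with f >= 0, where
# the columns of E are endmember spectra (e.g. green vegetation, NPV, soil).
import numpy as np
from scipy.optimize import nnls

E = np.array([          # bands x endmembers: [green veg, NPV, soil]
    [0.05, 0.25, 0.20],
    [0.45, 0.30, 0.25],
    [0.30, 0.45, 0.35],
    [0.10, 0.40, 0.30],
])
pixel = 0.6 * E[:, 0] + 0.3 * E[:, 1] + 0.1 * E[:, 2]   # synthetic mixed pixel

fractions, resid = nnls(E, pixel)
print(fractions.round(3), f"residual={resid:.4f}")       # ~[0.6, 0.3, 0.1]
```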
NASA Astrophysics Data System (ADS)
Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas
2014-03-01
GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, and hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable, and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis: we assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster-Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA), implemented in GIS. The methodology is composed of three phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility are analyzed as a function of weights using Monte Carlo simulation and global sensitivity analysis. Finally, the results are validated using a landslide inventory database and by applying DST. Comparison of the obtained landslide susceptibility maps of both MCDA techniques with known landslides shows that AHP outperforms OWA; however, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty-sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights.
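The AHP weighting step referred to above derives the criteria weights from the principal eigenvector of a reciprocal pairwise comparison matrix. A numpy sketch with hypothetical comparison values:

```python
# AHP criterion weights: principal eigenvector of a reciprocal pairwise
# comparison matrix, plus the consistency ratio check. Values hypothetical.
import numpy as np

A = np.array([          # e.g. slope vs. lithology vs. land cover (Saaty scale)
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])
vals, vecs = np.linalg.eig(A)
k = np.argmax(vals.real)
w = np.abs(vecs[:, k].real)
w /= w.sum()                        # normalized criterion weights

n = A.shape[0]
ci = (vals.real[k] - n) / (n - 1)   # consistency index
cr = ci / 0.58                      # Saaty random index for n = 3
print("weights:", w.round(3), " CR:", round(cr, 3))  # CR < 0.1 -> consistent
```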
Dynamic safety assessment of natural gas stations using Bayesian network.
Zarei, Esmaeil; Azadeh, Ali; Khakzad, Nima; Aliabadi, Mostafa Mirzaei; Mohammadfam, Iraj
2017-01-05
Pipelines are one of the most popular and effective ways of transporting hazardous materials, especially natural gas. However, the rapid development of gas pipelines and stations in urban areas has introduced a serious threat to public safety and assets. Although different methods have been developed for risk analysis of gas transportation systems, a comprehensive methodology for risk analysis is still lacking, especially for natural gas stations. The present work is aimed at developing a dynamic and comprehensive quantitative risk analysis (DCQRA) approach for accident scenario and risk modeling of natural gas stations. In this approach, an FMEA is used for hazard analysis, while a bow-tie diagram and a Bayesian network are employed to model the worst-case accident scenario and to assess the risks. The results indicated that failure of the regulator system was the worst-case accident scenario, with human error as the main contributing factor. Thus, in the risk management plan of natural gas stations, priority should be given to the most probable root events and main contributing factors identified in the present study, in order to reduce the occurrence probability of the accident scenarios and thus alleviate the risks. Copyright © 2016 Elsevier B.V. All rights reserved.
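The quantitative kernel of a bow-tie is a logic-gate computation from root-event probabilities to the top event; the Bayesian network generalizes it by allowing dependence and evidence updating. A simplified sketch under independence assumptions, with hypothetical events and probabilities, not the study's:

```python
# Bow-tie top-event probability from independent root events through
# AND/OR gates. Events and probabilities are hypothetical.
def p_or(*ps):        # at least one of the inputs occurs
    out = 1.0
    for p in ps:
        out *= (1.0 - p)
    return 1.0 - out

def p_and(*ps):       # all inputs occur
    out = 1.0
    for p in ps:
        out *= p
    return out

p_human_error    = 1e-2
p_spring_fail    = 5e-4
p_diaphragm_fail = 8e-4
p_regulator_fail = p_or(p_and(p_human_error, p_spring_fail), p_diaphragm_fail)
print(f"P(regulator system failure) = {p_regulator_fail:.2e}")
```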
NASA Astrophysics Data System (ADS)
Schaub, Y.; Huggel, C.; Serraino, M.; Haeberli, W.
2012-04-01
The changes in high-mountain environments are increasingly fast and complex. GIS-based models of the Swiss Alps show that numerous topographic overdeepenings are likely to appear on progressively exposed glacier beds, which are considered potential sites of future lake formation. In many cases these newly forming lakes will be situated in an over-steepened and destabilized high-mountain environment and are, therefore, prone to impact waves from landslides. The risk of glacier lake outburst floods, endangering infrastructure, residential areas and persons further down-valley, is increasing with further lake formation and glacier recession, and this risk may persist for many decades if not centuries. Future-oriented hazard assessments have to be integrative and must deal with all possible process chains; reference studies and methodologies, however, are still scarce. We present an approach to compare the risks resulting from high-mountain lakes in the Swiss Alps among each other, with existing lakes included in the analysis alongside future ones. The presented risk assessment approach integrates the envisaged high-mountain hazard process chain with present and future socio-economic conditions. Applying the concept of integral risk management, the hazard and damage potentials have to be analyzed. The areas that feature the topographic potential for rock/ice avalanches to reach a lake were analyzed regarding their susceptibility to slope failure, including the factors slope inclination, permafrost occurrence, glacier recession and bedrock lithology. Together with the analysis of the lakes (volume and runout path of potential outburst floods), this completed the hazard analysis of the process chain. As an example, high long-term hazard potentials in the Swiss Alps are to be expected in the area of the Great Aletsch glacier. A methodology for the assessment of the damage potential was elaborated and will be presented. In order to estimate the location of the largest damage potentials, driving forces of different spatial development scenarios will be implemented in a land allocation model for the Swiss Alps. By bringing together hazard, exposure and vulnerability analyses, a risk assessment regarding lake-outburst floods triggered by impacts of rock/ice avalanches can be conducted for the entire Swiss Alps, for today, the middle of the century and beyond.
Robert E. Keane; Stacy A. Drury; Eva C. Karau; Paul F. Hessburg; Keith M. Reynolds
2010-01-01
This paper presents modeling methods for mapping fire hazard and fire risk using a research model called FIREHARM (FIRE Hazard and Risk Model), which computes common measures of fire behavior, fire danger, and fire effects to portray fire hazard spatially. FIREHARM can compute a measure of risk associated with the distribution of these measures over time using...
NASA Astrophysics Data System (ADS)
Andreeva, Nataliya; Eftimova, Petya; Valchev, Nikolay; Prodanov, Bogdan
2017-04-01
The study presents an evaluation and comparative analysis of storm-induced flooding impacts on different coastal receptors at the scale of the Varna region using the INtegrated DisRuption Assessment (INDRA) model. The model was developed within the FP7 RISC-KIT project as part of the Coastal Risk Assessment Framework (CRAF), which consists of two phases. CRAF Phase 1 is a screening process that evaluates coastal risk at a regional scale by means of a coastal index approach, which helps to identify potentially vulnerable coastal sectors: hotspots (HS). CRAF Phase 2 assesses and ranks the identified hotspots through detailed risk analysis, jointly performing a hazard assessment and an impact evaluation on different categories (population, businesses, ecosystems, transport and utilities) using the INDRA model at a regional level. The model assesses the shock of events by estimating the impact on receptors of different vulnerability directly exposed to the flooding hazard, as well as the potential ripple effects during an event, in order to assess the "indirect" impacts, which occur outside the hazard area and/or continue after the event, for all considered categories. The potential impacts are expressed in terms of uniform "impact indicators", which independently score the indirect impacts of these categories, assessing disruption and recovery of the receptors. The ultimate hotspot ranking is obtained through a multi-criteria analysis (MCA) incorporated in the model, considering the preferences of stakeholders. The case study area, the Varna regional coast, is located on the western Black Sea, Bulgaria. The coastline, with a length of about 70 km, stretches from Cape Ekrene to Cape St. Atanas and includes Varna Bay. After application of CRAF Phase 1, three hotspots were selected for further analysis: Kabakum beach (HS1), Varna Central beach plus the Port wall (HS2), and the Artificial Island (HS3). For the first two hotspots, beaches and associated infrastructure are the assets that attract holiday-makers and tourists in the summer season; for HS3, the exposed area is occupied by storage premises for industrial goods and oil/fuel tanks. Flooding hazard was assessed through coupled use of the XBeach 1D and LISFLOOD 2D inundation models at the selected hotspots. The "response" approach was adopted: 75 extreme storm events were simulated to obtain storm-maxima series of overtopping discharges, flood depth, depth-velocity and berm retreat. The return periods selected for the extreme value analysis were 20, 50 and 100 years. For the impact evaluation with the INDRA model, the categories "Population" and "Business" were considered. Impacts on population were addressed by three impact indicators, "Risk to Life", "Household Displacement Time" and "Household Financial Recovery", while the business category was addressed only by "Business Financial Recovery". Hotspot ranking was done using MCA by weighting the evaluated indicators, with one weighting profile focused on Risk to Life (F1) and one on Business Financial Recovery (F2). MCA scoring focused on household displacement/recovery was not evaluated because the modelling results revealed quite a low number of flooded household receptors. Results show that, for both F1 and F2 and for all considered return periods, HS2 has the highest scores, which makes it the final hotspot.
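The final MCA step, normalizing the impact indicators and combining them under a stakeholder weighting profile, fits in a few pandas lines. A sketch with hypothetical indicator scores and weights:

```python
# Multi-criteria ranking of hotspots: min-max normalize each impact
# indicator, then weight-and-sum under a stakeholder weighting profile.
# Indicator scores and weights are hypothetical.
import pandas as pd

scores = pd.DataFrame({
    "risk_to_life":      [0.2, 0.8, 0.4],
    "business_recovery": [0.5, 0.9, 0.7],
}, index=["HS1", "HS2", "HS3"])

norm = (scores - scores.min()) / (scores.max() - scores.min())
weights_F1 = {"risk_to_life": 0.7, "business_recovery": 0.3}  # focus: risk to life
rank = (norm * pd.Series(weights_F1)).sum(axis=1).sort_values(ascending=False)
print(rank)   # highest score first -> top-ranked hotspot
```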
DOT National Transportation Integrated Search
1985-10-01
This report summarizes the findings from the second phase of a two-part analysis of hazardous materials truck routes in the Dallas-Fort Worth area. Phase II of this study analyzes the risk of transporting hazardous materials on freeways and arterial ...
21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.
Code of Federal Regulations, 2013 CFR
2013-04-01
... identified food safety hazards, including as appropriate: (i) Critical control points designed to control... control points designed to control food safety hazards introduced outside the processing plant environment... Control Point (HACCP) plan. 123.6 Section 123.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF...
21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.
Code of Federal Regulations, 2011 CFR
2011-04-01
... identified food safety hazards, including as appropriate: (i) Critical control points designed to control... control points designed to control food safety hazards introduced outside the processing plant environment... Control Point (HACCP) plan. 123.6 Section 123.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF...
21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.
Code of Federal Regulations, 2014 CFR
2014-04-01
... identified food safety hazards, including as appropriate: (i) Critical control points designed to control... control points designed to control food safety hazards introduced outside the processing plant environment... Control Point (HACCP) plan. 123.6 Section 123.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF...
A Probabilistic Tsunami Hazard Assessment Methodology and Its Application to Crescent City, CA
NASA Astrophysics Data System (ADS)
Gonzalez, F. I.; Leveque, R. J.; Waagan, K.; Adams, L.; Lin, G.
2012-12-01
A PTHA methodology, based in large part on Probabilistic Seismic Hazard Assessment methods (e.g., Cornell, 1968; SSHAC, 1997; Geist and Parsons, 2005), was previously applied to Seaside, OR (González et al., 2009). This initial version of the method has been updated to include: a revised method to estimate tidal uncertainty; an improved method for generating stochastic realizations to estimate slip distribution uncertainty (Mai and Beroza, 2002; Blair et al., 2011); additional near-field sources in the Cascadia Subduction Zone, based on the work of Goldfinger et al. (2012); far-field sources in Japan, based on information updated since the 11 March 2011 Tohoku tsunami (Japan Earthquake Research Committee, 2011). The GeoClaw tsunami model (Berger et al., 2011) is used to simulate generation, propagation and inundation. We will discuss this revised PTHA methodology and the results of its application to Crescent City, CA. Berger, M.J., D. L. George, R. J. LeVeque, and K. T. Mandli, The GeoClaw software for depth-averaged flows with adaptive refinement, Adv. Water Res. 34 (2011), pp. 1195-1206. Blair, J.L., McCrory, P.A., Oppenheimer, D.H., and Waldhauser, F. (2011): A Geo-referenced 3D model of the Juan de Fuca Slab and associated seismicity: U.S. Geological Survey Data Series 633, v.1.0, available at http://pubs.usgs.gov/ds/633/. Cornell, C. A. (1968): Engineering seismic risk analysis, Bull. Seismol. Soc. Am., 58, 1583-1606. Geist, E. L., and T. Parsons (2005): Probabilistic Analysis of Tsunami Hazards, Nat. Hazards, 37 (3), 277-314. Goldfinger, C., Nelson, C.H., Morey, A.E., Johnson, J.E., Patton, J.R., Karabanov, E., Gutiérrez-Pastor, J., Eriksson, A.T., Gràcia, E., Dunhill, G., Enkin, R.J., Dallimore, A., and Vallier, T. (2012): Turbidite event history—Methods and implications for Holocene paleoseismicity of the Cascadia subduction zone: U.S. Geological Survey Professional Paper 1661-F, 170 p. (Available at http://pubs.usgs.gov/pp/pp1661f/). González, F.I., E.L. Geist, B. Jaffe, U. Kânoglu, H. Mofjeld, C.E. Synolakis, V.V. Titov, D. Arcas, D. Bellomo, D. Carlton, T. Horning, J. Johnson, J. Newman, T. Parsons, R. Peters, C. Peterson, G. Priest, A. Venturato, J. Weber, F. Wong, and A. Yalciner (2009): Probabilistic Tsunami Hazard Assessment at Seaside, Oregon, for Near- and Far-Field Seismic Sources, J. Geophys. Res., 114, C11023, doi:10.1029/2008JC005132. Japan Earthquake Research Committee (2011): http://www.jishin.go.jp/main/p_hyoka02.htm Mai, P. M., and G. C. Beroza (2002): A spatial random field model to characterize complexity in earthquake slip, J. Geophys. Res., 107(B11), 2308, doi:10.1029/2001JB000588. SSHAC (Senior Seismic Hazard Analysis Committee) (1997): Recommendations for Probabilistic Seismic Hazard Analysis: Guidance on Uncertainty and Use of Experts, Main Report Rep. NUREG/CR-6372 UCRL-ID-122160 Vol. 1, 256 pp, U.S. Nuclear Regulatory Commission.
Seismic Hazard and Risk Assessments for Beijing-Tianjin-Tangshan, China, Area
Xie, F.; Wang, Z.; Liu, J.
2011-01-01
Seismic hazard and risk in the Beijing-Tianjin-Tangshan, China, area were estimated from 500-year intensity observations. First, we digitized the intensity observations (maps) using ArcGIS with a cell size of 0.1° × 0.1°. Second, we performed a statistical analysis on the digitized intensity data, determined an average b value (0.39), and derived the intensity-frequency relationship (hazard curve) for each cell. Finally, based on a Poisson model for earthquake occurrence, we calculated seismic risk in terms of a probability of I ≥ 7, 8, or 9 in 50 years. We also calculated the corresponding 10 percent probability of exceedance of these intensities in 50 years. The advantages of assessing seismic hazard and risk from intensity records are that (1) fewer assumptions (i.e., about earthquake sources and ground motion attenuation) are made, and (2) site effects are included. Our study shows that the area has high seismic hazard and risk. Our study also suggests that the current design peak ground acceleration or intensity for the area may not be adequate. © 2010 Birkhäuser / Springer Basel AG.
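For readers unfamiliar with the Poisson occurrence model used here, the sketch below shows the standard exceedance calculation; the annual rate is a hypothetical placeholder, not a value from the study.

```python
# Poisson exceedance: given an annual rate of shaking at intensity >= I from
# the hazard curve, P(at least one such event in t years) = 1 - exp(-rate*t).
import math

def prob_exceedance(annual_rate, years=50.0):
    """Probability of at least one event in `years` under a Poisson model."""
    return 1.0 - math.exp(-annual_rate * years)

rate_I7 = 0.002  # hypothetical annual rate of intensity >= 7 for one cell
print(f"P(I >= 7 in 50 yr) = {prob_exceedance(rate_I7):.3f}")
```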
Kammerer, A.M.; ten Brink, Uri S.; Titov, V.V.
2017-01-01
In response to the 2004 Indian Ocean Tsunami, the United States Nuclear Regulatory Commission (US NRC) initiated a long-term research program to improve understanding of tsunami hazard levels for nuclear facilities in the United States. For this effort, the US NRC organized a collaborative research program with the United States Geological Survey (USGS) and the National Oceanic and Atmospheric Administration (NOAA) with the goal of assessing tsunami hazard on the Atlantic and Gulf Coasts of the United States. Necessarily, the US NRC research program includes both seismic and landslide tsunamigenic sources in both the near and far fields. The inclusion of tsunamigenic landslides, an important category of sources affecting tsunami hazard levels for the Atlantic and Gulf Coasts, is a key difference between this program and most other tsunami hazard assessment programs. The initial phase of this work consisted of collection, interpretation, and analysis of available offshore data, with significant effort focused on characterizing offshore near-field landslides and analyzing their tsunamigenic potential and properties. In the next phase of research, additional field investigations will be conducted in key locations of interest and additional analysis will be undertaken. Simultaneously, the MOST tsunami generation and propagation model used by NOAA will first be enhanced to include landslide-based initiation mechanisms and then used to investigate the impact of the tsunamigenic sources identified and characterized by the USGS. The potential for probabilistic tsunami hazard assessment will also be explored in the final phases of the program.
NASA Astrophysics Data System (ADS)
Lee, Ching-Fang; Huang, Wei-Kai; Chang, Yu-Lin; Chi, Shu-Yeong; Liao, Wu-Chang
2018-01-01
Typhoons Megi (2010) and Saola (2012) brought torrential rainfall that triggered regional landslides and flooding hazards along Provincial Highway No. 9 in northeastern Taiwan. To reduce property loss and save lives, this study combines multi-hazard susceptibility assessment with environmental geology maps, a rock mass rating (RMR) system, remote sensing analysis, and micro-topography interpretation to develop an integrated landslide hazard assessment approach that reflects the intrinsic state of the slopeland from the past toward the future. First, the degree of hazard indicated by historical landslides was used to delineate past landslide regions. Second, geo-mechanical classification of rock outcroppings was performed by in-situ investigation along the vulnerable road sections. Finally, a high-resolution digital elevation model extracted from airborne LiDAR, together with multi-temporal remote sensing images, was analyzed to identify possible catastrophic landslide hotspots in the near future. The results of the analysis showed that 37% of the road sections in the study area were highly susceptible to landslide hazards. The spatial distribution of the road sections revealed that those characterized by high susceptibility were located near the boundaries of fault zones and in areas of lithologic dissimilarity. Headward erosion of gullies and concave-shaped topographic features had an adverse effect and were the dominant factors triggering landslides. Hazard statistics indicate that regional landslide reactivation along this coastal highway is closely related to past landslide regions. Field validation demonstrated that an accuracy of 91% could be achieved in forecasting geohazards following intense rainfall events and typhoons.
Bowden, Jack; Seaman, Shaun; Huang, Xin; White, Ian R
2016-04-30
In randomised controlled trials of treatments for late-stage cancer, it is common for control arm patients to receive the experimental treatment around the point of disease progression. This treatment switching can dilute the estimated treatment effect on overall survival and affect the assessment of a treatment's benefit in health economic evaluations. The rank-preserving structural failure time model of Robins and Tsiatis (Comm. Stat., 20:2609-2631) offers a potential solution to this problem and is typically implemented using the logrank test. However, in the presence of substantial switching, this test can have low power because the hazard ratio is not constant over time. Schoenfeld (Biometrika, 68:316-319) showed that when the hazard ratio is not constant, weighted versions of the logrank test become optimal. We present a weighted logrank test statistic for the late-stage cancer trial context, given the treatment switching pattern and working assumptions about the underlying hazard function in the population. Simulations suggest that the weighted approach can lead to large efficiency gains in either an intention-to-treat or a causal rank-preserving structural failure time model analysis compared with the unweighted approach. Furthermore, violation of the working assumptions used in the derivation of the weights only affects the efficiency of the estimates and does not induce bias or inflate the type I error rate. The weighted logrank test statistic should therefore be considered for use as part of a careful secondary, exploratory analysis of trial data affected by substantial treatment switching. ©2015 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd.
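As a rough sketch of the two-sample weighted logrank machinery discussed above (not the authors' specific weights), the following computes a weighted logrank Z statistic from first principles; `weights` is any user-supplied guess at the time-varying effect.

```python
# Two-sample weighted logrank Z statistic (minimal sketch, numpy only).
import numpy as np

def weighted_logrank(time, event, group, weights=None):
    """time: follow-up times; event: 1=event, 0=censored; group: 1=experimental,
    0=control; weights: callable w(t), defaults to 1 (ordinary logrank)."""
    if weights is None:
        weights = lambda t: 1.0
    event_times = np.sort(np.unique(time[event == 1]))
    U, V = 0.0, 0.0                              # weighted score and variance
    for t in event_times:
        at_risk = time >= t                      # risk set at t
        n = at_risk.sum()
        n1 = (at_risk & (group == 1)).sum()      # at risk in experimental arm
        d = ((time == t) & (event == 1)).sum()   # events at t
        d1 = ((time == t) & (event == 1) & (group == 1)).sum()
        if n < 2:
            continue
        w = weights(t)
        U += w * (d1 - d * n1 / n)               # observed minus expected
        V += w**2 * d * (n1 / n) * (1 - n1 / n) * (n - d) / (n - 1)
    return U / np.sqrt(V)
```

For example, `weights=lambda t: 1.0 if t < 12 else 0.5` would down-weight event times after a hypothetical month-12 switching point.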
NASA Astrophysics Data System (ADS)
Zarekarizi, M.; Moradkhani, H.
2015-12-01
Extreme events have been shown to be affected by climate change, influencing hydrologic simulations for which stationarity is usually a core assumption. Studies have argued that this assumption leads to large bias in model estimates and, consequently, to misjudged flood hazard. Motivated by the importance of non-stationarity, we determined how exceedance probabilities have changed over time in Johnson Creek, Oregon. This can help estimate the probability of failure of a structure that was originally designed, according to common practice, to resist less likely floods. We therefore built a climate-informed Bayesian hierarchical model in which non-stationarity is built into the modeling framework. Principal component analysis shows that the North Atlantic Oscillation (NAO), the Western Pacific Index (WPI) and the Eastern Asia (EA) pattern most strongly affect streamflow in this river. We modeled flood extremes using the peaks-over-threshold (POT) method rather than the conventional annual maximum flood (AMF) approach, mainly because the model can then be based on more information. We used available threshold selection methods to select a suitable threshold for the study area. Accounting for non-stationarity, the model parameters vary through time with the climate indices. We developed several model scenarios and chose the one that best explained the variation in the data based on performance measures. We also estimated return periods under the non-stationary condition. Results show that ignoring non-stationarity could understate the flood hazard by up to a factor of four, increasing the probability of an in-stream structure being overtopped.
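A minimal stationary POT sketch, assuming scipy and a hypothetical daily discharge file, is given below; the study's climate-informed, non-stationary Bayesian hierarchical model is considerably richer than this.

```python
# Peaks-over-threshold with a generalized Pareto fit (stationary sketch).
# Threshold choice and declustering are simplified for illustration.
import numpy as np
from scipy import stats

q = np.loadtxt("johnson_creek_daily_cms.txt")   # hypothetical input file (m^3/s)
threshold = np.quantile(q, 0.98)                # simple high-quantile threshold
excesses = q[q > threshold] - threshold

shape, loc, scale = stats.genpareto.fit(excesses, floc=0.0)  # GPD fit, loc fixed at 0

# T-year return level under stationarity:
# x_T = u + (scale/shape) * ((rate * T)^shape - 1), rate = exceedances per year
years = len(q) / 365.25
rate = len(excesses) / years
T = 100.0
if abs(shape) > 1e-6:
    x_T = threshold + scale / shape * ((rate * T) ** shape - 1.0)
else:                                           # exponential limit as shape -> 0
    x_T = threshold + scale * np.log(rate * T)
print(f"Estimated {T:.0f}-year flood: {x_T:.1f} m^3/s")
```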
NASA Astrophysics Data System (ADS)
Garcia, J.; Arcila, M.; Benito, B.; Eraso, J.; García, R.; Gomez Capera, A.; Pagani, M.; Pinho, R.; Rendon, H.; Torres, Y.
2013-05-01
Latin America is a seismically active region with complex tectonic settings that make the creation of hazard models challenging. Over the past two decades, PSHA studies have been completed for this region in the context of global (Shedlock, 1999), regional (Dimaté et al., 1999) and national initiatives. Currently, different research groups are developing new models for various nations. The Global Earthquake Model (GEM), an initiative aiming at the creation of a large global community working collaboratively on building hazard and risk models using open standards and tools, is promoting collaboration between national projects and groups so as to facilitate the creation of harmonized regional models. The creation of a harmonized hazard model can follow different approaches, varying from a simple patching together of available models to a complete homogenisation of basic information and the subsequent creation of an entirely new PSHA model. In this contribution we describe the process and results of a first attempt at creating a community-based model covering the Caribbean and Central America regions. It consists of five main steps: 1- identification and collection of available PSHA input models; 2- analysis of the consistency, transparency and reproducibility of each model; 3- selection (if more than one model exists for the same region); 4- representation of the models in a standardized format and incorporation of new knowledge from recent studies; 5- proposal(s) for harmonization. We consider PSHA studies completed over the last twenty years in the region comprising the Caribbean (CAR), Central America (CAM) and northern South America (SA), illustrate a tentative harmonization of the seismic source geometry models, and discuss the steps needed toward a complete harmonisation of the models. Our aim is a model based on best practices and high standards, created through a combination of knowledge and competences from the scientific community and incorporating national and regional institutions. This is an ambitious goal that can be pursued only through intense and open cooperation between all the interested parties.
Oh, Eric J; Shepherd, Bryan E; Lumley, Thomas; Shaw, Pamela A
2018-04-15
For time-to-event outcomes, a rich literature exists on the bias introduced by covariate measurement error in regression models, such as the Cox model, and on methods of analysis to address this bias. By comparison, less attention has been given to understanding the impact of, or addressing, errors in the failure time outcome. For many diseases, the timing of an event of interest (such as progression-free survival or time to AIDS progression) can be difficult to assess or reliant on self-report and is therefore prone to measurement error. For linear models, it is well known that random errors in the outcome variable do not bias regression estimates. With nonlinear models, however, even random error or misclassification can introduce bias into estimated parameters. We compare the performance of two common regression models, the Cox and Weibull models, in the setting of measurement error in the failure time outcome. We introduce an extension of the SIMEX method to correct for bias in hazard ratio estimates from the Cox model and discuss other analysis options to address measurement error in the response. A formula to estimate the bias induced in the hazard ratio by classical measurement error in the event time for a log-linear survival model is presented. Detailed numerical studies examine the performance of the proposed SIMEX method under varying levels and parametric forms of the error in the outcome. We further illustrate the method with observational data on HIV outcomes from the Vanderbilt Comprehensive Care Clinic. Copyright © 2017 John Wiley & Sons, Ltd.
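The sketch below illustrates the general SIMEX recipe adapted to error in the event time, assuming the lifelines package; the additive-normal error model on log time, the lambda grid and the quadratic extrapolant are illustrative assumptions, not the authors' exact formulation.

```python
# SIMEX-style correction for classical error in log event times (sketch).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)

def cox_log_hr(df):
    """Fit a Cox PH model and return the log hazard ratio for covariate x."""
    cph = CoxPHFitter()
    cph.fit(df, duration_col="time", event_col="event")
    return cph.params_["x"]

def simex_log_hr(df, sigma2, lambdas=(0.0, 0.5, 1.0, 1.5, 2.0), B=50):
    """Inflate outcome error by lam*sigma2, then extrapolate back to lam = -1."""
    means = []
    for lam in lambdas:
        est = []
        for _ in range(B):
            noisy = df.copy()
            # add extra error with variance lam*sigma2 on the log-time scale
            noisy["time"] = np.exp(np.log(noisy["time"])
                                   + rng.normal(0, np.sqrt(lam * sigma2), len(df)))
            est.append(cox_log_hr(noisy))
        means.append(np.mean(est))
    coef = np.polyfit(lambdas, means, deg=2)   # quadratic extrapolant
    return np.polyval(coef, -1.0)              # SIMEX estimate at lambda = -1
```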
Petersen, Mark D.; Zeng, Yuehua; Haller, Kathleen M.; McCaffrey, Robert; Hammond, William C.; Bird, Peter; Moschetti, Morgan; Shen, Zhengkang; Bormann, Jayne; Thatcher, Wayne
2014-01-01
The 2014 National Seismic Hazard Maps for the conterminous United States incorporate additional uncertainty in the fault slip-rate parameters that control earthquake-activity rates, beyond what was applied in previous versions of the hazard maps. This additional uncertainty is accounted for by new geodesy- and geology-based slip-rate models for the Western United States. The models considered include an updated geologic model based on expert opinion and four combined inversion models informed by both geologic and geodetic input. The two block models considered indicate significantly higher slip rates than the expert-opinion model and the two fault-based combined inversion models. For the hazard maps, we apply 20 percent weight, with equal weighting, to the two fault-based models. Off-fault geodetic-based models were not considered in this version of the maps. The resulting changes to the hazard maps are generally less than 0.05 g (acceleration of gravity). Future research will improve the maps and interpret differences between the new models.
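As an illustration of how branch weights enter such maps, the sketch below combines annual exceedance rates from alternative deformation models into a weighted-mean hazard curve; all rates and weights are hypothetical placeholders, not the values adopted for the 2014 maps.

```python
# Logic-tree weighting of alternative deformation models (hypothetical sketch).
import numpy as np

pga = np.array([0.05, 0.1, 0.2, 0.4, 0.8])            # ground-motion levels (g)
# hypothetical annual exceedance rates from three alternative models
rate_geologic = np.array([2.0e-2, 8.0e-3, 2.0e-3, 4.0e-4, 5.0e-5])
rate_block    = np.array([3.0e-2, 1.2e-2, 3.0e-3, 6.0e-4, 8.0e-5])
rate_combined = np.array([2.4e-2, 9.0e-3, 2.4e-3, 4.8e-4, 6.0e-5])

weights = {"geologic": 0.6, "block": 0.2, "combined": 0.2}  # hypothetical branch weights
mean_rate = (weights["geologic"] * rate_geologic
             + weights["block"] * rate_block
             + weights["combined"] * rate_combined)
print(dict(zip(pga.round(2), mean_rate)))             # weighted-mean hazard curve
```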
Lupo, Philip J; Symanski, Elaine
2009-11-01
Often, in studies evaluating the health effects of hazardous air pollutants (HAPs), researchers rely on ambient air levels to estimate exposure. Two potential data sources are modeled estimates from the U.S. Environmental Protection Agency (EPA) Assessment System for Population Exposure Nationwide (ASPEN) and ambient air pollutant measurements from monitoring networks. The goal was to conduct comparisons of modeled and monitored estimates of HAP levels in the state of Texas using traditional approaches and a previously unexploited method, concordance correlation analysis, to better inform decisions regarding agreement. Census tract-level ASPEN estimates and monitoring data for all HAPs throughout Texas, available from the EPA Air Quality System, were obtained for 1990, 1996, and 1999. Monitoring sites were mapped to census tracts using U.S. Census data. Exclusions were applied to restrict the monitored data to measurements collected using a common sampling strategy with minimal missing values over time. Comparisons were made for 28 HAPs in 38 census tracts located primarily in urban areas throughout Texas. For each pollutant and by year of assessment, modeled and monitored air pollutant annual levels were compared using standard methods (i.e., ratios of model-to-monitor annual levels). Concordance correlation analysis was also used, which assesses linearity and agreement while providing a formal method of statistical inference. Forty-eight percent of the median model-to-monitor values fell between 0.5 and 2, whereas only 17% of concordance correlation coefficients were significant and greater than 0.5. On the basis of concordance correlation analysis, the findings indicate there is poorer agreement when compared with the previously applied ad hoc methods to assess comparability between modeled and monitored levels of ambient HAPs.
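Lin's concordance correlation coefficient used in this comparison has a closed form that is straightforward to compute; the sketch below uses hypothetical values in place of the ASPEN and monitoring data.

```python
# Lin's concordance correlation coefficient (CCC), a measure of agreement.
import numpy as np

def concordance_ccc(x, y):
    """CCC = 2*cov(x,y) / (var(x) + var(y) + (mean_x - mean_y)^2)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()                 # population variances
    cov = ((x - mx) * (y - my)).mean()
    return 2.0 * cov / (vx + vy + (mx - my) ** 2)

modeled   = np.array([0.42, 1.10, 0.75, 2.30, 0.15])  # hypothetical annual levels
monitored = np.array([0.50, 0.95, 0.80, 1.90, 0.20])
print(f"CCC = {concordance_ccc(modeled, monitored):.3f}")
```

Unlike a model-to-monitor ratio, the CCC penalizes both lack of correlation and systematic departure from the 1:1 line, which is why it gives a stricter view of agreement.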
NASA Astrophysics Data System (ADS)
Mullens, E.; Mcpherson, R. A.
2016-12-01
This work develops detailed trend analyses of climate hazards affecting the Department of Transportation's Region 6 in the South Central U.S. First, a survey was developed to gather information regarding weather and climate hazards in the region from the transportation community, identifying key phenomena and thresholds to evaluate. Statistically downscaled datasets were obtained from the Multivariate Adaptive Constructed Analogues (MACA) project and the Asynchronous Regional Regression Model (ARRM), for a total of 21 model projections, two coupled model intercomparisons (CMIP3 and CMIP5), and four emissions pathways (A1Fi, B1, RCP8.5, RCP4.5). Specific hazards investigated include winter weather, freeze-thaw cycles, hot and cold extremes, and heavy precipitation. Projections for each of these variables were calculated for the region using spatial mapping and time-series analysis at the climate-division level. The results indicate that cold-season phenomena such as winter weather, freeze-thaw and cold extremes decrease in intensity and frequency, particularly under the higher emissions pathways. Nonetheless, the specific model and downscaling method yield variability in the magnitudes, with the most notable decreasing trends late in the 21st century. Hot days show a pronounced increase, particularly with greater emissions, producing annual mean 100°F-day frequencies by the late 21st century analogous to the 2011 heatwave over the central Southern Plains. Heavy precipitation, evidenced by return-period estimates and counts over thresholds, also shows notable increasing trends, particularly between the recent past and the mid-21st century. Conversely, mean precipitation does not show significant trends and is regionally variable. Precipitation hazards (e.g., winter weather, extremes) diverge between downscaling methods and their associated model samples much more substantially than temperature, suggesting that the choice of global model and downscaled data is particularly important when considering region-specific impacts for precipitation. These results are intended to inform regional transportation professionals of the susceptibility of the area to climate extremes and to serve as a resource for assessing and incorporating changing risk probabilities into their planning processes.
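A counts-over-thresholds diagnostic like the 100°F-day frequency described above reduces to a simple grouped count; the file and column names in the sketch below are hypothetical.

```python
# Annual counts of days at or above a temperature threshold (sketch).
import pandas as pd

tmax = pd.read_csv("downscaled_tmax_degF.csv",          # hypothetical input
                   index_col="date", parse_dates=True)["tmax"]

hot_days = (tmax >= 100.0).groupby(tmax.index.year).sum()  # 100°F days per year

# Compare a late-21st-century window with the recent past
print("1981-2010 mean:", hot_days.loc[1981:2010].mean())
print("2070-2099 mean:", hot_days.loc[2070:2099].mean())
```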
BYMUR software: a free and open source tool for quantifying and visualizing multi-risk analyses
NASA Astrophysics Data System (ADS)
Tonini, Roberto; Selva, Jacopo
2013-04-01
The BYMUR software aims to provide an easy-to-use open source tool both for computing multi-risk and for managing, visualizing and comparing all the inputs (e.g. hazard, fragilities and exposure) as well as the corresponding results (e.g. risk curves, risk indexes). For all inputs, a complete management of inter-model epistemic uncertainty is considered. The BYMUR software will be one of the final products of the homonymous ByMuR project (http://bymur.bo.ingv.it/) funded by the Italian Ministry of Education, Universities and Research (MIUR), which focuses on (i) providing a quantitative and objective general method for a comprehensive long-term multi-risk analysis in a given area, accounting for inter-model epistemic uncertainty through Bayesian methodologies, and (ii) applying the methodology to seismic, volcanic and tsunami risks in Naples (Italy). More specifically, the BYMUR software will separately account for the probabilistic hazard assessment of different kinds of hazardous phenomena, the relative (time-dependent/independent) vulnerabilities and exposure data, and their possible (predefined) interactions: the software will analyze these inputs and use them to estimate both the single and multi-risk associated with a specific target area. In addition, it will be possible to connect the software to further tools (e.g., a full hazard analysis), allowing dynamic I/O of results. The use of the Python programming language guarantees that the final software will be open source and platform independent. Moreover, thanks to the integration of some of the most popular and feature-rich Python scientific modules (NumPy, Matplotlib, SciPy) with the wxPython graphical user toolkit, the final tool will be equipped with a comprehensive Graphical User Interface (GUI) able to control and visualize (in the form of tables, maps and/or plots) any stage of the multi-risk analysis. The additional features of importing/exporting data in MySQL databases and/or standard XML formats (for instance, the global standards defined in the frame of the GEM project for seismic hazard and risk) will ensure interoperability with other FOSS software and tools and, at the same time, keep the tool readily available to the geo-scientific community. An already available example of connection is the BET_VH(**) tool, whose probabilistic volcanic hazard outputs will be used as input for BYMUR. Finally, the prototype version of BYMUR will be applied to the case study of the municipality of Naples, considering three different natural hazards (volcanic eruptions, earthquakes and tsunamis) and assessing the consequent long-term risk. (**)BET_VH (Bayesian Event Tree for Volcanic Hazard) is a probabilistic tool for long-term volcanic hazard assessment, recently re-designed and adjusted to run on the Vhub cyber-infrastructure, a free web-based collaborative tool for volcanology research (see http://vhub.org/resources/betvh).
Bayesian network learning for natural hazard assessments
NASA Astrophysics Data System (ADS)
Vogel, Kristin
2016-04-01
Even though quite different in occurrence and consequences, from a modelling perspective many natural hazards share similar properties and challenges. Their complex nature, as well as lacking knowledge about their driving forces and potential effects, makes their analysis demanding. On top of the uncertainty about the modelling framework, inaccurate or incomplete event observations and the intrinsic randomness of the natural phenomenon add up to different interacting layers of uncertainty, which require careful handling. Thus, for reliable natural hazard assessments it is crucial not only to capture and quantify the uncertainties involved, but also to express and communicate them in an intuitive way. Decision-makers, who often find it difficult to deal with uncertainties, might otherwise return to familiar (mostly deterministic) procedures. Within the scope of the DFG research training group "NatRiskChange" we apply the probabilistic framework of Bayesian networks to diverse natural hazard and vulnerability studies. The great potential of Bayesian networks has already been shown in previous natural hazard assessments. Treating each model component as a random variable, Bayesian networks aim at capturing the joint distribution of all considered variables. Hence, any conditional distribution of interest (e.g. the effect of precautionary measures on damage reduction) can be inferred. The (in-)dependencies between the considered variables can be learned purely data-driven or be given by experts; even a combination of both is possible. By translating the (in-)dependencies into a graph structure, Bayesian networks provide direct insights into the workings of the system and allow one to learn about the underlying processes. Despite numerous studies on the topic, learning Bayesian networks from real-world data remains challenging. In previous studies, e.g. on earthquake-induced ground motion and flood damage assessments, we tackled the problems arising with continuous variables and incomplete observations. Further studies raise the challenge of relying on very small data sets. Since parameter estimates for complex models based on few observations are unreliable, it is necessary to focus on simplified, yet still meaningful models. A so-called Markov blanket approach is developed to identify the most relevant model components and to construct a simple Bayesian network based on those findings. Since the procedure is completely data-driven, it can easily be transferred to various applications in natural hazard domains. This study is funded by the Deutsche Forschungsgemeinschaft (DFG) within the research training programme GRK 2043/1 "NatRiskChange - Natural hazards and risks in a changing world" at Potsdam University.
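A minimal sketch of data-driven structure learning and Markov blanket extraction, assuming the pgmpy library (whose class names may vary across versions) and hypothetical column names:

```python
# Score-based Bayesian network structure learning with pgmpy (sketch).
import pandas as pd
from pgmpy.estimators import HillClimbSearch, BicScore

data = pd.read_csv("flood_damage_observations.csv")  # hypothetical file with
# discretized columns, e.g.: water_depth, duration, precaution, building_type, damage

# Hill climbing over DAGs using a BIC score
search = HillClimbSearch(data)
model = search.estimate(scoring_method=BicScore(data))
print(sorted(model.edges()))

# The Markov blanket of the target (parents, children, co-parents) identifies
# the most relevant components for a simplified network around "damage".
print(model.get_markov_blanket("damage"))
```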
Analysis of underlying and multiple-cause mortality data.
Moussa, M A; El Sayed, A M; Sugathan, T N; Khogali, M M; Verma, D
1992-01-01
"A variety of life table models were used for the analysis of the (1984-86) Kuwaiti cause-specific mortality data. These models comprised total mortality, multiple-decrement, cause-elimination, cause-delay and disease dependency. The models were illustrated by application to a set of four chronic diseases: hypertensive, ischaemic heart, cerebrovascular and diabetes mellitus. The life table methods quantify the relative weights of different diseases as hazards to mortality after adjustment for other causes. They can also evaluate the extent of dependency between underlying cause of death and other causes mentioned on [the] death certificate using an extended underlying-cause model." (SUMMARY IN FRE AND ITA) excerpt
NASA Technical Reports Server (NTRS)
Tai, H.; Wilson, J. W.; Maiden, D. L.
2003-01-01
The atmospheric ionizing radiation (AIR) ER-2 preflight analysis, one of the first attempts to obtain a relatively complete measurement set of the high-altitude radiation environment, is described in this paper. The primary thrust is to characterize the atmospheric radiation and to define dose levels at high-altitude flight. A secondary thrust is to develop and validate dosimetric techniques and monitoring devices for protecting aircrews. With a few chosen routes, the experimental results can be compared against, and used to validate, the AIR model predictions. Eventually, as more measurements are made, we gain more understanding of the hazardous radiation environment and more confidence in the prediction models.