Using Qualitative Hazard Analysis to Guide Quantitative Safety Analysis
NASA Technical Reports Server (NTRS)
Shortle, J. F.; Allocco, M.
2005-01-01
Quantitative methods can be beneficial in many types of safety investigations. However, there are many difficulties in using quantitative methods. For example, there may be little relevant data available. This paper proposes a framework for using qualitative hazard analysis to prioritize hazard scenarios most suitable for quantitative analysis. The framework first categorizes hazard scenarios by severity and likelihood. We then propose another metric, "modeling difficulty," that describes the complexity in modeling a given hazard scenario quantitatively. The combined metrics of severity, likelihood, and modeling difficulty help to prioritize hazard scenarios for which quantitative analysis should be applied. We have applied this methodology to proposed concepts of operations for reduced wake separation for airplane operations at closely spaced parallel runways.
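The triage this abstract describes can be sketched in a few lines. This is a hypothetical illustration only: the 1-5 ordinal scales, the scenario names, and the cutoff rule are invented, since the paper's actual criteria are not given in the abstract.

```python
# Hypothetical prioritization of hazard scenarios by severity,
# likelihood, and modeling difficulty (all on illustrative 1-5 scales).
from dataclasses import dataclass

@dataclass
class Scenario:
    name: str
    severity: int             # 1 = negligible ... 5 = catastrophic
    likelihood: int           # 1 = improbable ... 5 = frequent
    modeling_difficulty: int  # 1 = easy to model ... 5 = intractable

def triage(s: Scenario) -> str:
    """Assign a scenario to one of the three groups named in the abstract."""
    risk = s.severity * s.likelihood
    if risk >= 12 and s.modeling_difficulty <= 3:
        return "quantitative modeling"
    if risk >= 12:
        return "mitigate via controls/procedures"
    return "low priority"

scenarios = [
    Scenario("wake encounter on parallel approach", 5, 3, 2),
    Scenario("simultaneous missed approaches", 5, 3, 5),
    Scenario("radio phraseology confusion", 2, 2, 4),
]
for s in scenarios:
    print(f"{s.name}: {triage(s)}")
```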
NASA Technical Reports Server (NTRS)
Shortle, John F.; Allocco, Michael
2005-01-01
This paper describes a scenario-driven hazard analysis process to identify, eliminate, and control safety-related risks. Within this process, we develop selective criteria to determine the applicability of engineering modeling to hypothesized hazard scenarios. This provides a basis for evaluating and prioritizing the scenarios as candidates for further quantitative analysis. We have applied this methodology to proposed concepts of operations for reduced wake separation for closely spaced parallel runways. For arrivals, the process identified 43 core hazard scenarios. Of these, we classified 12 as appropriate for further quantitative modeling, 24 that should be mitigated through controls, recommendations, and/or procedures (that is, scenarios not appropriate for quantitative modeling), and 7 that have the lowest priority for further analysis.
Supplemental Hazard Analysis and Risk Assessment - Hydrotreater
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lowry, Peter P.; Wagner, Katie A.
A supplemental hazard analysis was conducted and quantitative risk assessment performed in response to an independent review comment received by the Pacific Northwest National Laboratory (PNNL) from the U.S. Department of Energy Pacific Northwest Field Office (PNSO) against the Hydrotreater/Distillation Column Hazard Analysis Report issued in April 2013. The supplemental analysis used the hazardous conditions documented by the previous April 2013 report as a basis. The conditions were screened and grouped for the purpose of identifying whether additional prudent, practical hazard controls could be identified, using a quantitative risk evaluation to assess the adequacy of the controls and establish a lower level of concern for the likelihood of potential serious accidents. Calculations were performed to support conclusions where necessary.
Hydrothermal Liquefaction Treatment Hazard Analysis Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lowry, Peter P.; Wagner, Katie A.
Hazard analyses were performed to evaluate the modular hydrothermal liquefaction treatment system. The hazard assessment process was performed in 2 stages. An initial assessment utilizing Hazard Identification and Preliminary Hazards Analysis (PHA) techniques identified areas with significant or unique hazards (process safety-related hazards) that fall outside of the normal operating envelope of PNNL and warranted additional analysis. The subsequent assessment was based on a qualitative What-If analysis. The analysis was augmented, as necessary, by additional quantitative analysis for scenarios involving a release of hazardous material or energy with the potential for affecting the public. The following selected hazardous scenarios received increased attention: • For scenarios involving a release of hazardous material or energy, controls were identified in the What-If analysis table that prevent the occurrence or mitigate the effects of the release. • For scenarios with significant consequences that could impact personnel outside the immediate operations area, quantitative analyses were performed to determine the potential magnitude of the scenario. The set of “critical controls” that prevent the occurrence or mitigate the effects of such events was identified for these scenarios (see Section 4).
Hydrothermal Liquefaction Treatment Preliminary Hazard Analysis Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lowry, Peter P.; Wagner, Katie A.
A preliminary hazard assessment was completed during February 2015 to evaluate the conceptual design of the modular hydrothermal liquefaction treatment system. The hazard assessment was performed in 2 stages. An initial assessment utilizing Hazard Identification and Preliminary Hazards Analysis (PHA) techniques identified areas with significant or unique hazards (process safety-related hazards) that fall outside of the normal operating envelope of PNNL and warranted additional analysis. The subsequent assessment was based on a qualitative What-If analysis. This analysis was augmented, as necessary, by additional quantitative analysis for scenarios involving a release of hazardous material or energy with the potential for affecting the public.
Often, human health risk assessments have relied on qualitative approaches for hazard identification to integrate evidence across multiple studies to conclude whether particular hazards exist. However, quantitative approaches for evidence integration, including the application o...
Quantitative estimation of time-variable earthquake hazard by using fuzzy set theory
NASA Astrophysics Data System (ADS)
Deyi, Feng; Ichikawa, M.
1989-11-01
In this paper, the various methods of fuzzy set theory, called fuzzy mathematics, have been applied to the quantitative estimation of the time-variable earthquake hazard. The results obtained consist of the following. (1) Quantitative estimation of the earthquake hazard on the basis of seismicity data. By using some methods of fuzzy mathematics, seismicity patterns before large earthquakes can be studied more clearly and more quantitatively, highly active periods in a given region and quiet periods of seismic activity before large earthquakes can be recognized, similarities in temporal variation of seismic activity and seismic gaps can be examined and, on the other hand, the time-variable earthquake hazard can be assessed directly on the basis of a series of statistical indices of seismicity. Two methods of fuzzy clustering analysis, the method of fuzzy similarity, and the direct method of fuzzy pattern recognition have been studied in particular. One method of fuzzy clustering analysis is based on fuzzy netting, and another is based on the fuzzy equivalent relation. (2) Quantitative estimation of the earthquake hazard on the basis of observational data for different precursors. The direct method of fuzzy pattern recognition has been applied to research on earthquake precursors of different kinds. On the basis of the temporal and spatial characteristics of recognized precursors, earthquake hazards over different time frames can be estimated. This paper mainly deals with medium-short-term precursors observed in Japan and China.
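As a minimal illustration of the fuzzy-set building blocks named here (the abstract does not detail the specific clustering and pattern-recognition algorithms), the sketch below computes a triangular membership function for a seismicity index and a standard min/max fuzzy similarity between two membership vectors. All parameters are invented.

```python
import numpy as np

def triangular(x, a, b, c):
    """Triangular fuzzy membership: 0 at a and c, 1 at the peak b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

# Membership of annual earthquake counts in the fuzzy set "highly active".
counts = np.array([5, 12, 20, 35])
u = triangular(counts, a=10, b=30, c=50)
print("membership:", np.round(u, 2))

def fuzzy_similarity(u, v):
    """A common similarity: sum of element-wise minima over sum of maxima."""
    return np.minimum(u, v).sum() / np.maximum(u, v).sum()

# Compare with a second, slightly shifted seismicity pattern.
v = triangular(counts + 5, 10, 30, 50)
print("similarity:", round(fuzzy_similarity(u, v), 3))
```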
Assessing the risk posed by natural hazards to infrastructures
NASA Astrophysics Data System (ADS)
Eidsvig, Unni; Kristensen, Krister; Vidar Vangelsten, Bjørn
2015-04-01
Modern society is increasingly dependent on infrastructures to maintain its function, and disruption in one of the infrastructure systems may have severe consequences. The Norwegian municipalities have, according to legislation, a duty to carry out a risk and vulnerability analysis and plan and prepare for emergencies in a short- and long-term perspective. Vulnerability analysis of the infrastructures and their interdependencies is an important part of this analysis. This paper proposes a model for assessing the risk posed by natural hazards to infrastructures. The model prescribes a three-level analysis with increasing level of detail, moving from qualitative to quantitative analysis. This paper focuses on the second level, which consists of a semi-quantitative analysis. The purpose of this analysis is to perform a screening of the scenarios of natural hazards threatening the infrastructures identified in the level 1 analysis and investigate the need for further analyses, i.e. level 3 quantitative analyses. The proposed level 2 analysis considers the frequency of the natural hazard and different aspects of vulnerability, including the physical vulnerability of the infrastructure itself and the societal dependency on the infrastructure. An indicator-based approach is applied, ranking the indicators on a relative scale. The proposed indicators characterize the robustness of the infrastructure, the importance of the infrastructure, and interdependencies between society and infrastructure affecting the potential for cascading effects. Each indicator is ranked on a 1-5 scale based on pre-defined ranking criteria. The aggregated risk estimate is a combination of the semi-quantitative vulnerability indicators, as well as quantitative estimates of the frequency of the natural hazard and the number of users of the infrastructure. Case studies for two Norwegian municipalities are presented, where risk to a primary road, water supply and power network threatened by storms and landslides is assessed. The application examples show that the proposed model provides a useful tool for screening of undesirable events, with the ultimate goal of reducing societal vulnerability.
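A toy version of this level-2 screening appears below. The abstract names the indicator types and the 1-5 scale but not the exact aggregation formula, so the weights and the multiplicative combination are assumptions for illustration.

```python
# Illustrative level-2 screening: vulnerability indicators on a 1-5 scale,
# combined with hazard frequency and number of users. Indicator names follow
# the abstract; the weights and the aggregation rule are assumptions.
indicators = {"robustness": 4, "importance": 5, "interdependency": 3}
weights = {"robustness": 0.4, "importance": 0.4, "interdependency": 0.2}

vulnerability = sum(indicators[k] * weights[k] for k in indicators)  # 1..5
frequency_per_year = 0.1   # storms severe enough to threaten the road (invented)
users = 20_000             # people relying on the infrastructure (invented)

relative_risk = frequency_per_year * vulnerability * users
print(f"vulnerability score {vulnerability:.1f}, relative risk {relative_risk:.0f}")
```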
Renjith, V R; Madhu, G; Nayagam, V Lakshmana Gomathi; Bhasi, A B
2010-11-15
The hazards associated with major accident hazard (MAH) industries are fire, explosion and toxic gas releases. Of these, toxic gas release is the worst, as it has the potential to cause extensive fatalities. Qualitative and quantitative hazard analyses are essential for the identification and quantification of these hazards related to chemical industries. Fault tree analysis (FTA) is an established technique in hazard identification. This technique has the advantage of being both qualitative and quantitative, if the probabilities and frequencies of the basic events are known. This paper outlines the estimation of the probability of release of chlorine from the storage and filling facility of a chlor-alkali plant using FTA. An attempt has also been made to arrive at the probability of chlorine release using expert elicitation and a proven fuzzy logic technique for Indian conditions. Sensitivity analysis has been done to evaluate the percentage contribution of each basic event that could lead to chlorine release. Two-dimensional fuzzy fault tree analysis (TDFFTA) has been proposed for balancing the hesitation factor involved in expert elicitation. Copyright © 2010 Elsevier B.V. All rights reserved.
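The crisp (non-fuzzy) core of such an FTA calculation can be sketched as follows, assuming independent basic events and an invented two-gate tree; the paper's actual chlorine-release tree, event probabilities, and TDFFTA machinery are not reproduced here.

```python
# Generic fault tree evaluation assuming independent basic events.
def AND(*probs):
    """P(all events occur) for independent events."""
    out = 1.0
    for p in probs:
        out *= p
    return out

def OR(*probs):
    """P(at least one event occurs) for independent events."""
    out = 1.0
    for p in probs:
        out *= (1.0 - p)
    return 1.0 - out

# Illustrative basic-event probabilities (not from the paper).
p_valve_leak = 1e-3
p_gasket_fail = 5e-4
p_operator_error = 1e-2
p_alarm_fail = 1e-2

# Toy tree: a release occurs if any containment element fails AND the
# alarm/interlock fails to stop the transfer in time.
p_containment = OR(p_valve_leak, p_gasket_fail, p_operator_error)
p_release = AND(p_containment, p_alarm_fail)
print(f"P(top event: chlorine release) = {p_release:.2e}")
```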
Quantitative risk analysis of oil storage facilities in seismic areas.
Fabbrocino, Giovanni; Iervolino, Iunio; Orlando, Francesca; Salzano, Ernesto
2005-08-31
Quantitative risk analysis (QRA) of industrial facilities has to take into account multiple hazards threatening critical equipment. Nevertheless, engineering procedures able to evaluate quantitatively the effect of seismic action are not well established. Indeed, relevant industrial accidents may be triggered by loss of containment following ground shaking or other relevant natural hazards, either directly or through cascade effects ('domino effects'). The issue of integrating structural seismic risk into quantitative probabilistic seismic risk analysis (QpsRA) is addressed in this paper by a representative study case regarding an oil storage plant with a number of atmospheric steel tanks containing flammable substances. Empirical seismic fragility curves and probit functions, properly defined both for building-like and non building-like industrial components, have been crossed with outcomes of probabilistic seismic hazard analysis (PSHA) for a test site located in south Italy. Once the seismic failure probabilities have been quantified, consequence analysis has been performed for those events which may be triggered by the loss of containment following seismic action. Results are combined by means of a specifically developed code in terms of local risk contour plots, i.e. the contour line for the probability of fatal injuries at any point (x, y) in the analysed area. Finally, a comparison with QRA obtained by considering only process-related top events is reported for reference.
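The central step, crossing a fragility curve with a hazard curve to obtain an annual failure probability, can be sketched as a discretized risk integral. The hazard and fragility parameters below are illustrative placeholders, not the paper's south-Italy results.

```python
import numpy as np
from scipy.stats import lognorm

# Discretized seismic risk integral: annual probability of tank failure
# from a hazard curve combined with a fragility curve.
pga = np.linspace(0.05, 1.0, 96)                   # peak ground acceleration (g)
annual_exceedance = 1e-2 * (pga / 0.05) ** -2.0    # toy hazard curve

# P(loss of containment | PGA): lognormal fragility, median 0.4 g, beta 0.5.
fragility = lognorm(s=0.5, scale=0.4).cdf(pga)

# Convert the exceedance curve into occurrence probabilities per PGA bin.
occurrence = -np.diff(annual_exceedance, append=0.0)
p_fail_annual = np.sum(fragility * occurrence)
print(f"annual probability of seismic loss of containment = {p_fail_annual:.2e}")
```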
Safety evaluation methodology for advanced coal extraction systems
NASA Technical Reports Server (NTRS)
Zimmerman, W. F.
1981-01-01
Qualitative and quantitative evaluation methods for coal extraction systems were developed. The analysis examines the soundness of the design, whether or not the major hazards have been eliminated or reduced, and how the reduction would be accomplished. The quantitative methodology establishes the approximate impact of hazards on injury levels. The results are weighted by peculiar geological elements, specialized safety training, peculiar mine environmental aspects, and reductions in labor force. The outcome is compared with injury level requirements based on similar, safer industries to get a measure of the new system's success in reducing injuries. This approach provides a more detailed and comprehensive analysis of hazards and their effects than existing safety analyses.
ERIC Educational Resources Information Center
Sedwick, Victoria; Leal, Anne; Turner, Dea; Kanu, A. Bakarr
2018-01-01
The monitoring of metals in commercial products is essential for protecting public health against the hazards of metal toxicity. This article presents a guided inquiry (GI) experimental lab approach in a quantitative analysis lab class that enabled students to determine the levels of aluminum in deodorant brands. The utility of a GI experimental…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Luria, Paolo; Aspinall, Peter A
2003-08-01
The aim of this paper is to describe a new approach to major industrial hazard assessment, which has been recently studied by the authors in conjunction with the Italian Environmental Protection Agency ('ARPAV'). The real opportunity for developing a different approach arose from the need of the Italian EPA to provide the Venice Port Authority with an appropriate estimation of major industrial hazards in Porto Marghera, an industrial estate near Venice (Italy). However, the standard model, the quantitative risk analysis (QRA), only provided a list of individual quantitative risk values, related to single locations. The experimental model is based on a multi-criteria approach--the Analytic Hierarchy Process--which introduces the use of expert opinions, complementary skills and expertise from different disciplines in conjunction with quantitative traditional analysis. This permitted the generation of quantitative data on risk assessment from a series of qualitative assessments, on the present situation and on three other future scenarios, and use of this information as indirect quantitative measures, which could be aggregated for obtaining the global risk rate. This approach is in line with the main concepts proposed by the last European directive on Major Hazard Accidents, which recommends increasing the participation of operators, taking the other players into account and, moreover, paying more attention to the concepts of 'urban control', 'subjective risk' (risk perception) and intangible factors (factors not directly quantifiable).
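The Analytic Hierarchy Process step can be illustrated with a small pairwise comparison matrix whose principal eigenvector gives the criteria weights. The criteria and judgments below are invented for illustration, not those elicited for Porto Marghera.

```python
import numpy as np

# AHP in miniature: derive criteria weights from a pairwise comparison
# matrix via its principal eigenvector (hypothetical 3-criterion example).
A = np.array([
    [1.0, 3.0, 5.0],   # e.g. societal risk vs. risk perception vs. urban control
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()
print("criteria weights:", np.round(w, 3))

# Saaty's consistency index/ratio (random index RI = 0.58 for a 3x3 matrix).
n = A.shape[0]
CI = (eigvals.real[k] - n) / (n - 1)
print("consistency ratio:", round(CI / 0.58, 3))
```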
Assessing the risk posed by natural hazards to infrastructures
NASA Astrophysics Data System (ADS)
Eidsvig, Unni Marie K.; Kristensen, Krister; Vidar Vangelsten, Bjørn
2017-03-01
This paper proposes a model for assessing the risk posed by natural hazards to infrastructures, with a focus on the indirect losses and loss of stability for the population relying on the infrastructure. The model prescribes a three-level analysis with increasing level of detail, moving from qualitative to quantitative analysis. The focus is on a methodology for semi-quantitative analyses to be performed at the second level. The purpose of this type of analysis is to perform a screening of the scenarios of natural hazards threatening the infrastructures, identifying the most critical scenarios and investigating the need for further analyses (third level). The proposed semi-quantitative methodology considers the frequency of the natural hazard, different aspects of vulnerability, including the physical vulnerability of the infrastructure itself, and the societal dependency on the infrastructure. An indicator-based approach is applied, ranking the indicators on a relative scale according to pre-defined ranking criteria. The proposed indicators, which characterise conditions that influence the probability of an infrastructure malfunctioning caused by a natural event, are defined as (1) robustness and buffer capacity, (2) level of protection, (3) quality/level of maintenance and renewal, (4) adaptability and quality of operational procedures and (5) transparency/complexity/degree of coupling. Further indicators describe conditions influencing the socio-economic consequences of the infrastructure malfunctioning, such as (1) redundancy and/or substitution, (2) cascading effects and dependencies, (3) preparedness and (4) early warning, emergency response and measures. The aggregated risk estimate is a combination of the semi-quantitative vulnerability indicators, as well as quantitative estimates of the frequency of the natural hazard, the potential duration of the infrastructure malfunctioning (e.g. depending on the required restoration effort) and the number of users of the infrastructure. Case studies for two Norwegian municipalities are presented for demonstration purposes, where risk posed by adverse weather and natural hazards to primary road, water supply and power networks is assessed. The application examples show that the proposed model provides a useful tool for screening of potential undesirable events, contributing to a targeted reduction of the risk.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Helms, J.
2017-02-10
The US energy sector is vulnerable to multiple hazards including both natural disasters and malicious attacks from an intelligent adversary. The question that utility owners, operators and regulators face is how to prioritize their investments to mitigate the risks from a hazard that can have the most impact on the asset of interest. In order to be able to understand their risk landscape and develop a prioritized mitigation strategy, they must quantify risk in a consistent way across all hazards their asset is facing. Without being able to quantitatively measure risk, it is not possible to defensibly prioritize security investments or evaluate trade-offs between security and functionality. Development of a methodology that will consistently measure and quantify risk across different hazards is needed.
Development of a Probabilistic Tsunami Hazard Analysis in Japan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Toshiaki Sakai; Tomoyoshi Takeda; Hiroshi Soraoka
2006-07-01
For tsunami assessment, as for seismic design, it is meaningful to evaluate phenomena beyond the design basis: even once a design-basis tsunami height is set, the actual tsunami height may exceed it due to uncertainties regarding tsunami phenomena. Probabilistic tsunami risk assessment consists of estimating the tsunami hazard, estimating the fragility of structures, and executing system analysis. In this report, we apply a method for probabilistic tsunami hazard analysis (PTHA). We introduce a logic tree approach to estimate tsunami hazard curves (relationships between tsunami height and probability of exceedance) and present an example for Japan. Examples of tsunami hazard curves are illustrated, and uncertainty in the tsunami hazard is displayed by 5-, 16-, 50-, 84- and 95-percentile and mean hazard curves. The result of the PTHA will be used for quantitative assessment of the tsunami risk for important facilities located in coastal areas. Tsunami hazard curves are reasonable input data for structure and system analysis. However, the evaluation method for estimating the fragility of structures and the procedure for system analysis are still being developed. (authors)
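A minimal logic-tree sketch: each branch carries a weight and produces its own exceedance curve, and the mean and fractile curves summarize the epistemic uncertainty. The branch weights and the toy exponential exceedance model are assumptions for illustration.

```python
import numpy as np

# Logic-tree hazard curves in miniature: each branch is one modeling choice.
heights = np.linspace(0.5, 15.0, 30)   # tsunami height (m)
branches = [                           # (weight, scale of a toy exceedance model)
    (0.2, 2.0), (0.5, 3.0), (0.3, 4.5),
]
curves = np.array([np.exp(-heights / s) * 1e-2 for _, s in branches])
weights = np.array([w for w, _ in branches])

mean_curve = weights @ curves          # weighted mean hazard curve

def weighted_fractile(values, weights, q):
    """Fractile of branch values under the logic-tree weights."""
    order = np.argsort(values)
    cum = np.cumsum(weights[order])
    return values[order][np.searchsorted(cum, q)]

p16 = [weighted_fractile(curves[:, j], weights, 0.16) for j in range(len(heights))]
p84 = [weighted_fractile(curves[:, j], weights, 0.84) for j in range(len(heights))]
print(f"P(height > 5 m): mean {mean_curve[9]:.1e}, "
      f"16-84% range {p16[9]:.1e} - {p84[9]:.1e}")
```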
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cui, Yi; Song, Jie
Purpose: To identify prognostic biomarkers in pancreatic cancer using high-throughput quantitative image analysis. Methods and Materials: In this institutional review board–approved study, we retrospectively analyzed images and outcomes for 139 locally advanced pancreatic cancer patients treated with stereotactic body radiation therapy (SBRT). The overall population was split into a training cohort (n=90) and a validation cohort (n=49) according to the time of treatment. We extracted quantitative imaging characteristics from pre-SBRT 18F-fluorodeoxyglucose positron emission tomography, including statistical, morphologic, and texture features. A Cox proportional hazard regression model was built to predict overall survival (OS) in the training cohort using 162 robust image features. To avoid over-fitting, we applied the elastic net to obtain a sparse set of image features, whose linear combination constitutes a prognostic imaging signature. Univariate and multivariate Cox regression analyses were used to evaluate the association with OS, and concordance index (CI) was used to evaluate the survival prediction accuracy. Results: The prognostic imaging signature included 7 features characterizing different tumor phenotypes, including shape, intensity, and texture. On the validation cohort, univariate analysis showed that this prognostic signature was significantly associated with OS (P=.002, hazard ratio 2.74), which improved upon conventional imaging predictors including tumor volume, maximum standardized uptake value, and total lesion glycolysis (P=.018-.028, hazard ratio 1.51-1.57). On multivariate analysis, the proposed signature was the only significant prognostic index (P=.037, hazard ratio 3.72) when adjusted for conventional imaging and clinical factors (P=.123-.870, hazard ratio 0.53-1.30). In terms of CI, the proposed signature scored 0.66 and was significantly better than competing prognostic indices (CI 0.48-0.64, Wilcoxon rank sum test P<1e-6). Conclusion: Quantitative analysis identified novel 18F-fluorodeoxyglucose positron emission tomography image features that showed improved prognostic value over conventional imaging metrics. If validated in large, prospective cohorts, the new prognostic signature might be used to identify patients for individualized risk-adaptive therapy.
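The modeling pipeline, an elastic-net-penalized Cox regression over many candidate image features, can be sketched with the lifelines library on synthetic data. This stands in for, and is not, the paper's 162-feature PET pipeline.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# Synthetic survival data standing in for PET image features.
rng = np.random.default_rng(0)
n, p = 90, 20
X = rng.normal(size=(n, p))
risk = 0.8 * X[:, 0] - 0.5 * X[:, 1]     # two "true" prognostic features
time = rng.exponential(np.exp(-risk))    # shorter survival at higher risk
event = rng.random(n) < 0.7              # ~30% censoring

df = pd.DataFrame(X, columns=[f"feat{i}" for i in range(p)])
df["T"], df["E"] = time, event.astype(int)

# penalizer > 0 with 0 < l1_ratio < 1 gives an elastic-net penalty, shrinking
# most coefficients toward zero so a sparse "signature" remains.
cph = CoxPHFitter(penalizer=0.5, l1_ratio=0.5)
cph.fit(df, duration_col="T", event_col="E")
print("concordance index:", round(cph.concordance_index_, 3))
print(cph.params_[cph.params_.abs() > 1e-3])   # the surviving features
```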
NASA Astrophysics Data System (ADS)
Chen, Lixia; van Westen, Cees J.; Hussin, Haydar; Ciurean, Roxana L.; Turkington, Thea; Chavarro-Rincon, Diana; Shrestha, Dhruba P.
2016-11-01
Extreme rainfall events are the main triggering causes for hydro-meteorological hazards in mountainous areas, where development is often constrained by the limited space suitable for construction. In these areas, hazard and risk assessments are fundamental for risk mitigation, especially for preventive planning, risk communication and emergency preparedness. Multi-hazard risk assessment in mountainous areas at local and regional scales remains a major challenge because of the lack of data related to past events and causal factors, and the interactions between different types of hazards. The lack of data leads to a high level of uncertainty in the application of quantitative methods for hazard and risk assessment. Therefore, a systematic approach is required to combine these quantitative methods with expert-based assumptions and decisions. In this study, a quantitative multi-hazard risk assessment was carried out in the Fella River valley in the north-eastern Italian Alps, an area prone to debris flows and floods. The main steps include data collection and development of inventory maps, definition of hazard scenarios, hazard assessment in terms of temporal and spatial probability calculation and intensity modelling, elements-at-risk mapping, estimation of asset values and the number of people, physical vulnerability assessment, the generation of risk curves and annual risk calculation. To compare the risk for each type of hazard, risk curves were generated for debris flows, river floods and flash floods. Uncertainties were expressed as minimum, average and maximum values of temporal and spatial probability, replacement costs of assets, population numbers, and physical vulnerability. These result in minimum, average and maximum risk curves. To validate this approach, a back analysis was conducted using the extreme hydro-meteorological event that occurred in August 2003 in the Fella River valley. The results show good performance when compared to the historical damage reports.
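Risk curves of the kind described, loss versus annual exceedance probability evaluated at minimum, average, and maximum input values, can be sketched from a small scenario set. All probabilities and losses below are invented, not the Fella River results.

```python
import numpy as np

# Risk curves from a toy scenario set, evaluated with min / avg / max input
# values to express uncertainty, as the abstract describes.
scenarios = [  # (annual probability, loss in M EUR as (min, avg, max))
    (0.10, (0.5, 1.0, 2.0)),      # frequent small debris flow
    (0.02, (3.0, 6.0, 12.0)),     # river flood
    (0.005, (10.0, 20.0, 45.0)),  # extreme flash flood
]

for case, label in enumerate(["min", "avg", "max"]):
    pts = sorted(((loss[case], p) for p, loss in scenarios), reverse=True)
    losses = [l for l, _ in pts]
    exceed = np.cumsum([p for _, p in pts])   # P(loss >= L) per loss level
    print(label, list(zip(losses, np.round(exceed, 3))))
```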
Canister Storage Building (CSB) Hazard Analysis Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
POWERS, T.B.
2000-03-16
This report describes the methodology used in conducting the Canister Storage Building (CSB) hazard analysis to support the CSB final safety analysis report (FSAR) and documents the results. The hazard analysis process identified hazardous conditions and material-at-risk, determined causes for potential accidents, identified preventive and mitigative features, and qualitatively estimated the frequencies and consequences of specific occurrences. The hazard analysis was performed by a team of cognizant CSB operations and design personnel, safety analysts familiar with the CSB, and technical experts in specialty areas. Attachment A provides two lists of hazard analysis team members and describes the background and experience of each. The first list is a complete list of the hazard analysis team members that have been involved over the two-year long process. The second list is a subset of the first list and consists of those hazard analysis team members that reviewed and agreed to the final hazard analysis documentation. The material included in this report documents the final state of a nearly two-year long process involving formal facilitated group sessions and independent hazard and accident analysis work. The hazard analysis process led to the selection of candidate accidents for further quantitative analysis. New information relative to the hazards, discovered during the accident analysis, was incorporated into the hazard analysis data in order to compile a complete profile of facility hazards. Through this process, the results of the hazard and accident analyses led directly to the identification of safety structures, systems, and components, technical safety requirements, and other controls required to protect the public, workers, and environment.
Guo, Xuezhen; Claassen, G D H; Oude Lansink, A G J M; Saatkamp, H W
2014-06-01
Economic analysis of hazard surveillance in livestock production chains is essential for surveillance organizations (such as food safety authorities) when making scientifically based decisions on optimization of resource allocation. To enable this, quantitative decision support tools are required at two levels of analysis: (1) single-hazard surveillance system and (2) surveillance portfolio. This paper addresses the first level by presenting a conceptual approach for the economic analysis of single-hazard surveillance systems. The concept includes objective and subjective aspects of single-hazard surveillance system analysis: (1) a simulation part to derive an efficient set of surveillance setups based on the technical surveillance performance parameters (TSPPs) and the corresponding surveillance costs, i.e., objective analysis, and (2) a multi-criteria decision making model to evaluate the impacts of the hazard surveillance, i.e., subjective analysis. The conceptual approach was checked for (1) conceptual validity and (2) data validity. Issues regarding the practical use of the approach, particularly the data requirement, were discussed. We concluded that the conceptual approach is scientifically credible for economic analysis of single-hazard surveillance systems and that the practicability of the approach depends on data availability. Copyright © 2014 Elsevier B.V. All rights reserved.
White, Paul A; Johnson, George E
2016-05-01
Applied genetic toxicology is undergoing a transition from qualitative hazard identification to quantitative dose-response analysis and risk assessment. To facilitate this change, the Health and Environmental Sciences Institute (HESI) Genetic Toxicology Technical Committee (GTTC) sponsored a workshop held in Lancaster, UK on July 10-11, 2014. The event included invited speakers from several institutions and the content was divided into three themes: (1) Point-of-departure Metrics for Quantitative Dose-Response Analysis in Genetic Toxicology; (2) Measurement and Estimation of Exposures for Better Extrapolation to Humans; and (3) The Use of Quantitative Approaches in Genetic Toxicology for Human Health Risk Assessment (HHRA). A host of pertinent issues were discussed relating to the use of in vitro and in vivo dose-response data, the development of methods for in vitro to in vivo extrapolation and approaches to use in vivo dose-response data to determine human exposure limits for regulatory evaluations and decision-making. This Special Issue, which was inspired by the workshop, contains a series of papers that collectively address topics related to the aforementioned themes. The Issue includes contributions that collectively evaluate, describe and discuss in silico, in vitro, in vivo and statistical approaches that are facilitating the shift from qualitative hazard evaluation to quantitative risk assessment. The use and application of the benchmark dose approach was a central theme in many of the workshop presentations and discussions, and the Special Issue includes several contributions that outline novel applications for the analysis and interpretation of genetic toxicity data. Although the contents of the Special Issue constitute an important step towards the adoption of quantitative methods for regulatory assessment of genetic toxicity, formal acceptance of quantitative methods for HHRA and regulatory decision-making will require consensus regarding the relationships between genetic damage and disease, and the concomitant ability to use genetic toxicity results per se. © Her Majesty the Queen in Right of Canada 2016. Reproduced with the permission of the Minister of Health.
Critical asset and portfolio risk analysis: an all-hazards framework.
Ayyub, Bilal M; McGill, William L; Kaminskiy, Mark
2007-08-01
This article develops a quantitative all-hazards framework for critical asset and portfolio risk analysis (CAPRA) that considers both natural and human-caused hazards. Following a discussion on the nature of security threats, the need for actionable risk assessments, and the distinction between asset and portfolio-level analysis, a general formula for all-hazards risk analysis is obtained that resembles the traditional model based on the notional product of consequence, vulnerability, and threat, though with clear meanings assigned to each parameter. Furthermore, a simple portfolio consequence model is presented that yields first-order estimates of interdependency effects following a successful attack on an asset. Moreover, depending on the needs of the decisions being made and available analytical resources, values for the parameters in this model can be obtained at a high level or through detailed systems analysis. Several illustrative examples of the CAPRA methodology are provided.
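The notional product at the heart of the framework, risk as threat likelihood times vulnerability times consequence summed over hazard-asset pairs, can be sketched directly. The assets, hazards, and numbers below are placeholders, not CAPRA's worked examples.

```python
# Notional all-hazards risk model: risk = threat x vulnerability x consequence,
# summed over hazard/asset pairs. All inputs are illustrative placeholders.
assets = {
    "substation": {
        "flood": (0.02, 0.6, 40.0),    # (P(hazard)/yr, P(damage|hazard),
        "attack": (0.001, 0.9, 60.0),  #  consequence in M USD)
    },
    "control center": {
        "attack": (0.002, 0.5, 120.0),
    },
}

portfolio_risk = 0.0
for asset, hazards in assets.items():
    for hazard, (threat, vulnerability, consequence) in hazards.items():
        r = threat * vulnerability * consequence
        portfolio_risk += r
        print(f"{asset}/{hazard}: expected loss {r:.3f} M USD/yr")
print(f"portfolio total: {portfolio_risk:.3f} M USD/yr")
```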
Núñez-Jaramillo, Luis; Vega-Perera, Paulo; Ramírez-Lugo, Leticia; Reyes-López, Julián V; Santiago-Rodríguez, Efraín; Herrera-Morales, Wendy V
2015-07-08
Hazardous alcohol consumption is a pattern of consumption that leads to a higher risk of harmful consequences either for the user or for others. This pattern of alcohol consumption has been linked to risky behaviors, accidents, and injuries. Individuals with hazardous alcohol consumption do not necessarily present alcohol dependence; thus, a study of particular neurophysiological correlates of this alcohol consumption pattern needs to be carried out in nondependent individuals. Here, we carried out a quantitative electroencephalography analysis in health sciences university students with hazardous alcohol consumption, but not alcohol dependence (HAC), and control participants without hazardous alcohol consumption or alcohol dependence (NHAC). We analyzed Absolute Power (AP), Relative Power (RP), and Mean Frequency (MF) for beta and theta frequency bands under both eyes closed and eyes open conditions. We found that participants in the HAC group presented higher beta AP at centroparietal region, as well as lower beta MF at frontal and centroparietal regions in the eyes closed condition. Interestingly, participants did not present any change in theta activity (AP, RP, or MF), whereas previous reports indicate an increase in theta AP in alcohol-dependent individuals. Our results partially resemble those found in alcohol-dependent individuals, although are not completely identical, suggesting a possible difference in the underlying neuronal mechanism behind alcohol dependence and hazardous alcohol consumption. Similarities could be explained considering that both hazardous alcohol consumption and alcohol dependence are manifestations of behavioral disinhibition.
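The qEEG measures used in the study, absolute power (AP), relative power (RP), and mean frequency (MF) per band, can be computed from a Welch power spectrum. The sketch below uses a synthetic signal in place of real EEG, and the band limits are standard conventions assumed here, not taken from the paper.

```python
import numpy as np
from scipy.signal import welch

# Synthetic 30 s "EEG" with theta (6 Hz) and beta (20 Hz) components plus noise.
fs = 256
t = np.arange(0, 30, 1 / fs)
rng = np.random.default_rng(1)
eeg = (np.sin(2 * np.pi * 6 * t) + 0.8 * np.sin(2 * np.pi * 20 * t)
       + 0.5 * rng.normal(size=t.size))

f, pxx = welch(eeg, fs=fs, nperseg=2 * fs)
df = f[1] - f[0]   # frequency resolution of the spectrum

def band_measures(lo, hi):
    """AP, RP, and MF of the band [lo, hi) from the Welch spectrum."""
    band = (f >= lo) & (f < hi)
    ap = pxx[band].sum() * df                            # absolute power
    rp = ap / (pxx.sum() * df)                           # relative power
    mf = (f[band] * pxx[band]).sum() / pxx[band].sum()   # mean frequency
    return ap, rp, mf

for name, lo, hi in [("theta", 4, 8), ("beta", 13, 30)]:
    ap, rp, mf = band_measures(lo, hi)
    print(f"{name}: AP={ap:.3f}, RP={rp:.2%}, MF={mf:.1f} Hz")
```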
Contribution of physical modelling to climate-driven landslide hazard mapping: an alpine test site
NASA Astrophysics Data System (ADS)
Vandromme, R.; Desramaut, N.; Baills, A.; Hohmann, A.; Grandjean, G.; Sedan, O.; Mallet, J. P.
2012-04-01
The aim of this work is to develop a methodology for integrating climate change scenarios, especially their precipitation component, into quantitative hazard assessment. The effects of climate change will be different depending on both the location of the site and the type of landslide considered. Indeed, mass movements can be triggered by different factors. This paper describes a methodology to address this issue and shows an application on an alpine test site. Mechanical approaches represent a solution for quantitative landslide susceptibility and hazard modeling. However, as the quantity and the quality of data are generally very heterogeneous at a regional scale, it is necessary to take into account the uncertainty in the analysis. In this perspective, a new hazard modeling method is developed and integrated in a program named ALICE. This program integrates mechanical stability analysis through GIS software, taking into account data uncertainty. This method proposes a quantitative classification of landslide hazard and offers a useful tool to gain time and efficiency in hazard mapping. However, an expert-based approach is still necessary to finalize the maps. Indeed, it is the only way to take into account some influential factors in slope stability such as heterogeneity of the geological formations or effects of anthropic interventions. To go further, the alpine test site (Barcelonnette area, France) is being used to integrate climate change scenarios, and especially their precipitation component, into the ALICE program with the help of a hydrological model (GARDENIA) and the regional climate model REMO (Jacob, 2001). From a DEM, land-cover map, geology, geotechnical data, and so forth, the program classifies hazard zones depending on geotechnics and different hydrological contexts varying in time. This communication, realized within the framework of the Safeland project, is supported by the European Commission under the 7th Framework Programme for Research and Technological Development, Area "Environment", Activity 1.3.3.1 "Prediction of triggering and risk assessment for landslides".
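ALICE's internals are not given in the abstract; a common choice for this kind of regional mechanical analysis is the infinite-slope factor-of-safety model with Monte Carlo sampling of uncertain soil parameters, sketched below with invented parameter distributions as a stand-in.

```python
import numpy as np

# Infinite-slope factor of safety with parameter uncertainty handled by
# Monte Carlo sampling (a generic stand-in for ALICE's stability model).
rng = np.random.default_rng(42)
n = 100_000
slope = np.radians(30.0)   # slope angle
depth = 2.0                # failure surface depth (m)
gamma = 19_000.0           # soil unit weight (N/m^3)
gamma_w = 9_810.0          # water unit weight (N/m^3)

cohesion = rng.normal(5_000.0, 1_500.0, n).clip(min=0)  # Pa (invented)
phi = np.radians(rng.normal(32.0, 3.0, n))              # friction angle (invented)
m = rng.uniform(0.0, 1.0, n)   # water table ratio: the hydrological context

sigma = gamma * depth * np.cos(slope) ** 2              # normal stress
tau = gamma * depth * np.sin(slope) * np.cos(slope)     # driving shear stress
fos = (cohesion
       + (sigma - m * gamma_w * depth * np.cos(slope) ** 2) * np.tan(phi)) / tau

print(f"P(FoS < 1) = {np.mean(fos < 1):.3f}")
```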
Wang, Jiawei; Liu, Ruimin; Wang, Haotian; Yu, Wenwen; Xu, Fei; Shen, Zhenyao
2015-12-01
In this study, positive matrix factorization (PMF) and principal components analysis (PCA) were combined to identify and apportion pollution-based sources of hazardous elements in the surface sediments in the Yangtze River estuary (YRE). Source identification analysis indicated that PC1, including Al, Fe, Mn, Cr, Ni, As, Cu, and Zn, can be defined as a sewage component; PC2, including Pb and Sb, can be considered as an atmospheric deposition component; and PC3, containing Cd and Hg, can be considered as an agricultural nonpoint component. To better identify the sources and quantitatively apportion the concentrations to their sources, eight sources were identified with PMF: agricultural/industrial sewage mixed (18.6 %), mining wastewater (15.9 %), agricultural fertilizer (14.5 %), atmospheric deposition (12.8 %), agricultural nonpoint (10.6 %), industrial wastewater (9.8 %), marine activity (9.0 %), and nickel plating industry (8.8 %). Overall, the hazardous element content seems to be more connected to anthropogenic activity instead of natural sources. The PCA results laid the foundation for the PMF analysis by providing a general classification of sources. PMF resolves more factors with a higher explained variance than PCA; PMF provided both the internal analysis and the quantitative analysis. The combination of the two methods can provide more reasonable and reliable results.
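A compact analogue of the PCA-then-factorization workflow can be written with scikit-learn, using NMF as a stand-in for PMF (true PMF additionally weights the fit by per-measurement uncertainties, which NMF does not). The data are synthetic.

```python
import numpy as np
from sklearn.decomposition import PCA, NMF

# Synthetic sediment data: 3 hidden source profiles over 12 elements,
# mixed into 60 samples with a little noise.
rng = np.random.default_rng(0)
profiles = rng.random((3, 12))
contributions = rng.random((60, 3))
X = contributions @ profiles + 0.01 * rng.random((60, 12))

# PCA for a first classification of source components.
pca = PCA(n_components=3).fit(X)
print("explained variance:", np.round(pca.explained_variance_ratio_, 2))

# Non-negative factorization for quantitative apportionment (PMF analogue).
nmf = NMF(n_components=3, init="nndsvda", max_iter=1000, random_state=0)
W = nmf.fit_transform(X)          # sample-by-source contributions
share = W.sum(axis=0) / W.sum()   # overall share of each source
print("source shares:", np.round(100 * share, 1), "%")
```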
The role of models in estimating consequences as part of the risk assessment process.
Forde-Folle, K; Mitchell, D; Zepeda, C
2011-08-01
The degree of disease risk represented by the introduction, spread, or establishment of one or several diseases through the importation of animals and animal products is assessed by importing countries through an analysis of risk. The components of a risk analysis include hazard identification, risk assessment, risk management, and risk communication. A risk assessment starts with identification of the hazard(s) and then continues with four interrelated steps: release assessment, exposure assessment, consequence assessment, and risk estimation. Risk assessments may be either qualitative or quantitative. This paper describes how, through the integration of epidemiological and economic models, the potential adverse biological and economic consequences of exposure can be quantified.
A Model of Risk Analysis in Analytical Methodology for Biopharmaceutical Quality Control.
Andrade, Cleyton Lage; Herrera, Miguel Angel De La O; Lemes, Elezer Monte Blanco
2018-01-01
One key quality control parameter for biopharmaceutical products is the analysis of residual cellular DNA. To determine small amounts of DNA (around 100 pg) that may be in a biologically derived drug substance, an analytical method should be sensitive, robust, reliable, and accurate. In principle, three techniques have the ability to measure residual cellular DNA: radioactive dot-blot, a type of hybridization; threshold analysis; and quantitative polymerase chain reaction. Quality risk management is a systematic process for evaluating, controlling, and reporting of risks that may affect method capabilities and supports a scientific and practical approach to decision making. This paper evaluates, by quality risk management, an alternative approach to assessing the performance risks associated with quality control methods used with biopharmaceuticals, using the tool hazard analysis and critical control points. This tool provides the possibility to find the steps in an analytical procedure with higher impact on method performance. By applying these principles to DNA analysis methods, we conclude that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction, and threshold analysis. From the analysis of hazards (i.e., points of method failure) and the associated method procedure critical control points, we conclude that the analytical methodology with the lowest risk for performance failure for residual cellular DNA testing is quantitative polymerase chain reaction. LAY ABSTRACT: In order to mitigate the risk of adverse events by residual cellular DNA that is not completely cleared from downstream production processes, regulatory agencies have required the industry to guarantee a very low level of DNA in biologically derived pharmaceutical products. The technique historically used was radioactive blot hybridization. However, the technique is a challenging method to implement in a quality control laboratory: It is laborious, time-consuming, semi-quantitative, and requires a radioisotope. Along with dot-blot hybridization, two alternative techniques were evaluated: threshold analysis and quantitative polymerase chain reaction. Quality risk management tools were applied to compare the techniques, taking into account the uncertainties, the possibility of circumstances or future events, and their effects upon method performance. By illustrating the application of these tools with DNA methods, we provide an example of how they can be used to support a scientific and practical approach to decision making and can assess and manage method performance risk using such tools. This paper discusses, considering the principles of quality risk management, an additional approach to the development and selection of analytical quality control methods using the risk analysis tool hazard analysis and critical control points. This tool provides the possibility to find the method procedural steps with higher impact on method reliability (called critical control points). Our model concluded that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. Quantitative polymerase chain reaction is shown to be the better alternative analytical methodology in residual cellular DNA analysis. © PDA, Inc. 2018.
Gallina, Valentina; Torresan, Silvia; Critto, Andrea; Sperotto, Anna; Glade, Thomas; Marcomini, Antonio
2016-03-01
This paper presents a review of existing multi-risk assessment concepts and tools applied by organisations and projects, providing the basis for the development of a multi-risk methodology in a climate change perspective. Relevant initiatives were developed for the assessment of multiple natural hazards (e.g. floods, storm surges, droughts) affecting the same area in a defined timeframe (e.g. year, season, decade). Major research efforts were focused on the identification and aggregation of multiple hazard types (e.g. independent, correlated, cascading hazards) by means of quantitative and semi-quantitative approaches. Moreover, several methodologies aim to assess the vulnerability of multiple targets to specific natural hazards by means of vulnerability functions and indicators at the regional and local scale. The overall results of the review show that multi-risk approaches do not consider the effects of climate change and mostly rely on the analysis of static vulnerability (i.e. no time-dependent vulnerabilities, no changes among exposed elements). A relevant challenge is therefore to develop comprehensive formal approaches for the assessment of different climate-induced hazards and risks, including dynamic exposure and vulnerability. This requires the selection and aggregation of suitable hazard and vulnerability metrics to make a synthesis of information about multiple climate impacts, the spatial analysis and ranking of risks, including their visualization and communication to end-users. To face these issues, climate impact assessors should develop cross-sectorial collaborations among different areas of expertise (e.g. modellers, natural scientists, economists), integrating information on climate change scenarios with sectorial climate impact assessment, towards the development of a comprehensive multi-risk assessment process. Copyright © 2015 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
García-Rodríguez, M. J.; Malpica, J. A.; Benito, B.
2009-04-01
In recent years, interest in landslide hazard assessment studies has increased substantially. They are appropriate for evaluation and mitigation plan development in landslide-prone areas. There are several techniques available for landslide hazard research at a regional scale. Generally, they can be classified in two groups: qualitative and quantitative methods. Most qualitative methods tend to be subjective, since they depend on expert opinions and represent hazard levels in descriptive terms. On the other hand, quantitative methods are objective and they are commonly used due to the correlation between the instability factors and the location of the landslides. Within this group, statistical approaches and new heuristic techniques based on artificial intelligence (artificial neural network (ANN), fuzzy logic, etc.) provide rigorous analysis to assess landslide hazard over large regions. However, they depend on qualitative and quantitative data, scale, types of movements and characteristic factors used. We analysed and compared an approach for assessing earthquake-triggered landslide hazard using logistic regression (LR) and artificial neural networks (ANN) with a back-propagation learning algorithm. One application has been developed in El Salvador, a country of Central America where earthquake-triggered landslides are a usual phenomenon. In a first phase, we analysed the susceptibility and hazard associated with the seismic scenario of the 2001 January 13th earthquake. We calibrated the models using data from the landslide inventory for this scenario. These analyses require input variables representing physical parameters that contribute to the initiation of slope instability, for example, slope gradient, elevation, aspect, mean annual precipitation, lithology, land use, and terrain roughness, while the occurrence or non-occurrence of landslides is considered as the dependent variable. The results of the landslide susceptibility analysis are checked using landslide location data. These results show a high concordance between the landslide inventory and the estimated high-susceptibility zone, with an adjustment of 95.1% for the ANN model and 89.4% for the LR model. In addition, we make a comparative analysis of both techniques using the Receiver Operating Characteristic (ROC) curve, a graphical plot of sensitivity vs. (1 - specificity) for a binary classifier system as a function of its discrimination threshold, calculating the Area Under the ROC (AUROC) value for each model. Finally, the previous models are used to develop a new probabilistic landslide hazard map for future events, obtained by combining the expected triggering factor (calculated earthquake ground motion) for a return period of 475 years with the susceptibility map.
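The model comparison described, logistic regression versus a back-propagation neural network scored by AUROC, can be sketched with scikit-learn on synthetic conditioning factors standing in for the El Salvador dataset.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the instability factors (slope, elevation,
# lithology, ...) and the binary landslide inventory.
rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 7))   # 7 conditioning factors
logit = 1.5 * X[:, 0] + X[:, 1] - 0.5 * X[:, 2] * X[:, 3]
y = (rng.random(2000) < 1 / (1 + np.exp(-logit))).astype(int)

Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)
lr = LogisticRegression(max_iter=1000).fit(Xtr, ytr)
ann = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                    random_state=0).fit(Xtr, ytr)   # back-propagation MLP

for name, model in [("LR", lr), ("ANN", ann)]:
    auc = roc_auc_score(yte, model.predict_proba(Xte)[:, 1])
    print(f"{name}: AUROC = {auc:.3f}")
```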
NASA Astrophysics Data System (ADS)
Ciurean, R. L.; Glade, T.
2012-04-01
Decision under uncertainty is a constant of everyday life and an important component of risk management and governance. Recently, experts have emphasized the importance of quantifying uncertainty in all phases of landslide risk analysis. Due to its multi-dimensional and dynamic nature, (physical) vulnerability is inherently complex and the "degree of loss" estimates imprecise and to some extent even subjective. Uncertainty analysis introduces quantitative modeling approaches that allow for a more explicitly objective output, improving the risk management process as well as enhancing communication between various stakeholders for better risk governance. This study presents a review of concepts for uncertainty analysis in vulnerability of elements at risk to landslides. Different semi-quantitative and quantitative methods are compared based on their feasibility in real-world situations, hazard dependency, process stage in vulnerability assessment (i.e. input data, model, output), and applicability within an integrated landslide hazard and risk framework. The resulting observations will help to identify current gaps and future needs in vulnerability assessment, including estimation of uncertainty propagation, transferability of the methods, and development of visualization tools, but also address basic questions such as what uncertainty is and how it can be quantified or treated in a reliable and reproducible way.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kienhuis, Anne S.
Hepatic systems toxicology is the integrative analysis of toxicogenomic technologies, e.g., transcriptomics, proteomics, and metabolomics, in combination with traditional toxicology measures to improve the understanding of mechanisms of hepatotoxic action. Hepatic toxicology studies that have employed toxicogenomic technologies to date have already provided a proof of principle for the value of hepatic systems toxicology in hazard identification. In the present review, acetaminophen is used as a model compound to discuss the application of toxicogenomics in hepatic systems toxicology for its potential role in the risk assessment process, to progress from hazard identification towards hazard characterization. The toxicogenomics-based parallelogram is used to identify current achievements and limitations of acetaminophen toxicogenomic in vivo and in vitro studies for in vitro-to-in vivo and interspecies comparisons, with the ultimate aim to extrapolate animal studies to humans in vivo. This article provides a model for comparison of more species and more in vitro models enhancing the robustness of common toxicogenomic responses and their relevance to human risk assessment. To progress to quantitative dose-response analysis needed for hazard characterization, in hepatic systems toxicology studies, generation of toxicogenomic data of multiple doses/concentrations and time points is required. Newly developed bioinformatics tools for quantitative analysis of toxicogenomic data can aid in the elucidation of dose-responsive effects. The challenge herein is to assess which toxicogenomic responses are relevant for induction of the apical effect and whether perturbations are sufficient for the induction of downstream events, eventually causing toxicity.
Recommended health and safety guidelines for coal gasification pilot plants
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1978-01-01
The greatest toxic hazards in coal conversion are presented by the known and suspected carcinogens, because they are hazardous at low levels of exposure, have delayed action with no immediate warning, and have grave consequences. As for routes of entry, it is to be noted that various solids and liquids may reach the body by inhalation of particles, deposition of particles, or indirectly by contact with dirty surfaces. Other toxicants are most likely to enter the body by inhalation. The overall carcinogenic hazard cannot be precisely estimated from chemical analysis alone, because the possible interactions are far too complex. Further, the hazard cannot at present be quantitatively defined by available biological tests. The same limitations probably apply to toxic effects other than carcinogenesis, with the possible exception of some immediate responses (e.g., chemical asphyxia, primary respiratory irritation). It is not practical to recommend comprehensive workplace exposure limits on a basis similar to those for individual toxicants; however, a limit for one important kind of hazard (high-boiling suspected carcinogens) can be recommended. The carcinogenic hazards associated with airborne particles and surface contamination are the most crucial of the whole spectrum and offer a practical target for control, if not for quantitative evaluation. The only direct quantitative evidence now available is from epidemiology in analogous circumstances, and there are severe limitations on the comprehensiveness and reliability of such evidence. Some specific targets for control through industrial hygiene practices can be identified. The presence of any strong irritant of the respiratory mucosa, other mucous surfaces, and the skin should be regarded as a danger signal because of possible potentiation of carcinogens and other toxicants.
NASA Astrophysics Data System (ADS)
Grasso, S.; Maugeri, M.
After the Summit held in Washington on August 20-22 2001 to plan the first World Conference on the mitigation of Natural Hazards, a Group for the analysis of Natural Hazards within the Mediterranean area has been formed. The Group has so far identified the following hazards: (1) Seismic hazard (hazard for historical buildings included); (2) Hazard linked to the quantity and quality of water; (3) Landslide hazard; (4) Volcanic hazard. The analysis of such hazards implies the creation and the management of data banks, which can only be used if the data are properly geo-referenced to allow cross-referenced use of them. The obtained results must therefore be represented on geo-referenced maps. The present study is part of a research programme, namely "Detailed Scenarios and Actions for Seismic Prevention of Damage in the Urban Area of Catania", financed by the National Department for the Civil Protection and the National Research Council-National Group for the Defence Against Earthquakes (CNR-GNDT). Nowadays the south-eastern area of Sicily, called the "Iblea" seismic area, is considered one of the most intense seismic zones in Italy, based on the past and current seismic history and on the typology of civil buildings. Safety against earthquake hazards has two aspects: structural safety against potentially destructive dynamic forces and site safety related to geotechnical phenomena such as amplification, landsliding and soil liquefaction. So the correct evaluation of seismic hazard is highly affected by risk factors due to the geological nature and geotechnical properties of soils. The effect of local geotechnical conditions on damage suffered by buildings under seismic conditions has been widely recognized, as demonstrated by the Manual for Zonation on Seismic Geotechnical Hazards edited by the International Society for Soil Mechanics and Geotechnical Engineering (TC4, 1999). The evaluation of local amplification effects may be carried out by means of either rigorous complex methods of analysis or qualitative procedures. A semi-quantitative procedure based on the definition of the geotechnical hazard index has been applied for the zonation of the seismic geotechnical hazard of the city of Catania. In particular this procedure has been applied to define the influence of geotechnical properties of soil in a central area of the city of Catania, where some historical buildings of great importance are sited. An investigation was also performed based on the inspection of more than one hundred historical ecclesiastical buildings of great importance located in the city. Then, in order to identify the amplification effects due to the site conditions, a geotechnical survey form was prepared, to allow a semi-quantitative evaluation of the seismic geotechnical hazard for all these historical buildings. In addition, to evaluate the foundation soil time-history response, a 1-D dynamic soil model was employed for all these buildings, considering the non-linearity of soil behaviour. Using a GIS, a map of the seismic geotechnical hazard, a map of the liquefaction hazard and a preliminary map of the seismic hazard for the city of Catania have been obtained. From the analysis of the obtained results it may be noticed that high-hazard zones are mainly clayey sites.
Concerns related to Safety Management of Engineered Nanomaterials in research environment
NASA Astrophysics Data System (ADS)
Groso, A.; Meyer, Th
2013-04-01
Since the rise of occupational safety and health research on nanomaterials, considerable progress has been made in generating health-effects and exposure data. However, when detailed quantitative risk analysis is in question, more research is needed, especially quantitative measures of workers' exposure and standards to categorize toxicity/hazardousness data. In the absence of dose-response relationships and quantitative exposure measurements, control banding (CB) has been widely adopted by the OHS community as a pragmatic tool for implementing a risk management strategy based on a precautionary approach. Being in charge of health and safety in a Swiss university where nanomaterials are widely used and produced, we are also faced with the challenges of nanomaterials' occupational safety. In this work, we discuss the field application of an in-house risk management methodology similar to CB, as well as some other methodologies. The challenges and issues related to the process will be discussed. Since exact data on nanomaterials' hazardousness are missing for most situations, we deduce that the outcome of the analysis for a particular process is essentially the same with a simple methodology that determines only exposure potential as with one taking into account the hazardousness of ENPs. It is evident that when reliable data on hazardousness factors (such as surface chemistry, solubility, carcinogenicity, toxicity, etc.) become available, more differentiation will be possible in determining the risk for different materials. On the protective measures side, all CB methodologies lean toward overprotection, although some of them suggest comprehensive protective/preventive measures while others offer only basic advice. The implementation and control of protective measures in a research environment will also be discussed.
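A control-banding decision ultimately reduces to a lookup from hazard and exposure bands to a control level. The sketch below is a generic illustration in that spirit, with invented bands and measures; it is not the authors' in-house methodology.

```python
# Generic control-banding lookup with invented bands and control measures.
EXPOSURE_BANDS = {"low": 1, "medium": 2, "high": 3}   # amount, dustiness, ...
HAZARD_BANDS = {"low": 1, "medium": 2, "high": 3,
                "unknown": 3}   # precautionary: unknown is treated as high

CONTROLS = {
    1: "good lab practice, gloves, lab coat",
    2: "+ fume hood or enclosed handling",
    3: "+ dedicated room, ventilated enclosure, respirator program",
    4: "expert review before work proceeds",
}

def control_band(exposure: str, hazard: str) -> str:
    """Map exposure and hazard bands to a recommended control level."""
    score = EXPOSURE_BANDS[exposure] + HAZARD_BANDS[hazard]
    return CONTROLS[min(4, max(1, score - 1))]

print(control_band("medium", "unknown"))
```

Note how the precautionary stance the abstract describes appears twice: unknown hazardousness maps to the highest hazard band, and the aggregation errs toward stricter controls.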
Determination of Sulfate by Conductometric Titration: An Undergraduate Laboratory Experiment
ERIC Educational Resources Information Center
Garcia, Jennifer; Schultz, Linda D.
2016-01-01
The classic technique for sulfate analysis in an undergraduate quantitative analysis lab involves precipitation as the barium salt with barium chloride, collection of the precipitate by gravity filtration using ashless filter paper, and removal of the filter paper by charring over a Bunsen burner. The entire process is time-consuming, hazardous,…
Framework Analysis for Determining Mode of Action & Human Relevance
The overall aim of a cancer risk assessment is to characterize the risk to humans from environmental exposures. This includes a qualitative and quantitative risk characterization that relies on the development of separate hazard, dose-response and exposure...
Probabilistic Volcanic Hazard and Risk Assessment
NASA Astrophysics Data System (ADS)
Marzocchi, W.; Neri, A.; Newhall, C. G.; Papale, P.
2007-08-01
Quantifying Long- and Short-Term Volcanic Hazard: Building Up a Common Strategy for Italian Volcanoes, Erice, Italy, 8 November 2006. The term "hazard" can lead to some misunderstanding. In English, hazard has the generic meaning "potential source of danger," but for more than 30 years [e.g., Fournier d'Albe, 1979], hazard has also been used in a more quantitative sense, namely, "the probability of a certain hazardous event in a specific time-space window." However, many volcanologists still use "hazard" and "volcanic hazard" in purely descriptive and subjective ways. A recent meeting held in November 2006 at Erice, Italy, entitled "Quantifying Long- and Short-Term Volcanic Hazard: Building Up a Common Strategy for Italian Volcanoes" (http://www.bo.ingv.it/erice2006) concluded that a more suitable term for quantitative hazard estimation is "probabilistic volcanic hazard assessment" (PVHA).
Dewettinck, T; Van Houtte, E; Geenens, D; Van Hege, K; Verstraete, W
2001-01-01
To obtain a sustainable water catchment in the dune area of the Flemish west coast, the integration of treated domestic wastewater in the existing potable water production process is planned. The hygienic hazards associated with the introduction of treated domestic wastewater into the water cycle are well recognised. Therefore, the concept of HACCP (Hazard Analysis and Critical Control Points) was used to guarantee hygienically safe drinking water production. Taking into account the literature data on the removal efficiencies of the proposed advanced treatment steps with regard to enteric viruses and protozoa and after setting high quality limits based on the recent progress in quantitative risk assessment, the critical control points (CCPs) and points of attention (POAs) were identified. Based on the HACCP analysis a specific monitoring strategy was developed which focused on the control of these CCPs and POAs.
Safety analysis, risk assessment, and risk acceptance criteria
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jamali, K.; Stack, D.W.; Sullivan, L.H.
1997-08-01
This paper discusses a number of topics that relate safety analysis as documented in Department of Energy (DOE) safety analysis reports (SARs), probabilistic risk assessment (PRA) as characterized primarily in the context of the techniques that have assumed some level of formality in commercial nuclear power plant applications, and risk acceptance criteria as an outgrowth of PRA applications. DOE SARs of interest are those prepared for DOE facilities under DOE Order 5480.23 and the implementing guidance in DOE STD-3009-94. It must be noted that the primary area of application for DOE STD-3009 is existing DOE facilities and that certain modifications of the STD-3009 approach are necessary in SARs for new facilities. Moreover, it is the hazard analysis (HA) and accident analysis (AA) portions of these SARs that are relevant to the present discussion. Although PRAs can be qualitative in nature, PRA as used in this paper refers more generally to all quantitative risk assessments and their underlying methods. HA as used in this paper refers more generally to all qualitative risk assessments and their underlying methods that have been in use in hazardous facilities other than nuclear power plants. This discussion includes both quantitative and qualitative risk assessment methods. PRA has been used, improved, developed, and refined since the Reactor Safety Study (WASH-1400) was published in 1975 by the Nuclear Regulatory Commission (NRC). Much debate has ensued since WASH-1400 on exactly what the role of PRA should be in plant design, reactor licensing, "ensuring" plant and process safety, and a large number of other decisions that must be made for potentially hazardous activities. Of particular interest in this area is whether the risks quantified using PRA should be compared with numerical risk acceptance criteria (RACs) to determine whether a facility is "safe." Use of RACs requires quantitative estimates of consequence frequency and magnitude.
Anderson, Carl A; McRae, Allan F; Visscher, Peter M
2006-07-01
Standard quantitative trait loci (QTL) mapping techniques commonly assume that the trait is both fully observed and normally distributed. When considering survival or age-at-onset traits these assumptions are often incorrect. Methods have been developed to map QTL for survival traits; however, they are both computationally intensive and not available in standard genome analysis software packages. We propose a grouped linear regression method for the analysis of continuous survival data. Using simulation we compare this method to both the Cox and Weibull proportional hazards models and a standard linear regression method that ignores censoring. The grouped linear regression method is of equivalent power to both the Cox and Weibull proportional hazards methods and is significantly better than the standard linear regression method when censored observations are present. The method is also robust to the proportion of censored individuals and the underlying distribution of the trait. On the basis of linear regression methodology, the grouped linear regression model is computationally simple and fast and can be implemented readily in freely available statistical software.
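The censoring problem described here can be reproduced in a few lines: simulate a survival trait at a biallelic marker, then compare a Cox proportional hazards fit against a naive regression that ignores censoring. A minimal sketch, assuming the lifelines package is available; the marker, effect size, and censoring scheme are invented, and the paper's grouped linear regression itself is not reproduced:

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter  # assumed available; pip install lifelines

rng = np.random.default_rng(1)
n = 500
genotype = rng.integers(0, 2, n)            # hypothetical biallelic QTL marker
true_hr = 1.5                               # hazard ratio carried by the risk allele
event_time = rng.exponential(10.0, n) / np.where(genotype == 1, true_hr, 1.0)
censor_time = rng.exponential(15.0, n)      # random right-censoring
df = pd.DataFrame({
    "duration": np.minimum(event_time, censor_time),
    "event": (event_time <= censor_time).astype(int),
    "genotype": genotype,
})

cph = CoxPHFitter()
cph.fit(df, duration_col="duration", event_col="event")
print(cph.summary.loc["genotype", ["coef", "p"]])  # recovers roughly log(1.5) ~ 0.41

# Naive least squares on censored durations is biased toward a weaker effect:
slope = np.polyfit(df["genotype"], df["duration"], 1)[0]
print("naive LS slope on censored durations:", round(slope, 2))
```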
Occupational hazard evaluation model underground coal mine based on unascertained measurement theory
NASA Astrophysics Data System (ADS)
Deng, Quanlong; Jiang, Zhongan; Sun, Yaru; Peng, Ya
2017-05-01
In order to study how to comprehensively evaluate the influence of several occupational hazards on miners' physical and mental health, an occupational hazard evaluation indicator system based on unascertained measurement theory was established for quantitative and qualitative analysis. Each indicator weight was determined by information entropy, and the occupational hazard level was estimated by the credible-degree recognition criterion. The evaluation model was programmed in Visual Basic and applied to the comprehensive occupational hazard evaluation of six posts in an underground coal mine, and the occupational hazard degree was graded; the evaluation results are consistent with the actual situation. The results show that dust and noise are the most significant occupational hazard factors in coal mines. Excavation-face support workers are the most affected, followed by heading machine drivers, coal cutter drivers, and coalface support-moving workers; the occupational hazard degree of these four types of workers is level II (mild). The occupational hazard degree of ventilation workers and safety inspection workers is level I. The evaluation model can evaluate underground coal mines objectively and accurately, and can be employed in actual engineering.
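The weighting step in evaluations of this kind is commonly the entropy weight method. A minimal sketch in Python of entropy weighting over a hypothetical post-by-indicator matrix (the paper's actual indicators and data are not reproduced, and the unascertained-measure recognition step is omitted):

```python
import numpy as np

# Hypothetical indicator matrix: rows = posts (e.g., coal cutter driver),
# columns = hazard indicators (e.g., dust, noise, heat); higher = worse.
X = np.array([[0.8, 0.6, 0.3],
              [0.5, 0.9, 0.4],
              [0.3, 0.2, 0.7],
              [0.6, 0.5, 0.2]])

P = X / X.sum(axis=0)                      # normalize each indicator column
n = X.shape[0]
plogp = np.where(P > 0, P * np.log(P), 0.0)
e = -plogp.sum(axis=0) / np.log(n)         # information entropy per indicator
w = (1 - e) / (1 - e).sum()                # entropy weights
score = X @ w                              # weighted hazard score per post
print("weights:", w.round(3))
print("hazard scores by post:", score.round(3))
```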
NASA Astrophysics Data System (ADS)
Harper, Bryan; Thomas, Dennis; Chikkagoudar, Satish; Baker, Nathan; Tang, Kaizhi; Heredia-Langner, Alejandro; Lins, Roberto; Harper, Stacey
2015-06-01
The integration of rapid assays, large datasets, informatics, and modeling can overcome current barriers in understanding nanomaterial structure-toxicity relationships by providing a weight-of-the-evidence mechanism to generate hazard rankings for nanomaterials. Here, we present the use of a rapid, low-cost assay to perform screening-level toxicity evaluations of nanomaterials in vivo. Calculated EZ Metric scores, a combined measure of morbidity and mortality in developing embryonic zebrafish, were established at realistic exposure levels and used to develop a hazard ranking of diverse nanomaterial toxicity. Hazard ranking and clustering analysis of 68 diverse nanomaterials revealed distinct patterns of toxicity related to both the core composition and outermost surface chemistry of nanomaterials. The resulting clusters guided the development of a surface chemistry-based model of gold nanoparticle toxicity. Our findings suggest that risk assessments based on the size and core composition of nanomaterials alone may be wholly inappropriate, especially when considering complex engineered nanomaterials. Research should continue to focus on methodologies for determining nanomaterial hazard based on multiple sub-lethal responses following realistic, low-dose exposures, thus increasing the availability of quantitative measures of nanomaterial hazard to support the development of nanoparticle structure-activity relationships.
Antoine, Clemence; Benfari, Giovanni; Michelena, Hector I; Malouf, Joseph F; Nkomo, Vuyisile T; Thapa, Prabin; Enriquez-Sarano, Maurice
2018-05-31
Background: Echocardiographic quantitation of degenerative mitral regurgitation (DMR) is recommended whenever possible in clinical guidelines but has been criticized, and its scalability to routine clinical practice doubted. We hypothesized that echocardiographic DMR quantitation, performed in routine clinical practice by multiple practitioners, independently predicts long-term survival and thus is essential to DMR management. Methods: We included patients diagnosed with isolated mitral valve prolapse 2003-2011 and any degree of MR quantified by any physician/sonographer in routine clinical practice. Clinical/echocardiographic data acquired at diagnosis were retrieved electronically. The endpoint was mortality under medical treatment, analyzed by the Kaplan-Meier method and proportional-hazards models. Results: The cohort included 3914 patients (55% male) aged 62±17 years, with left ventricular ejection fraction (LVEF) 63±8% and routinely measured effective regurgitant orifice area (EROA) 19 [0-40] mm². During follow-up (6.7±3.1 years), 696 patients died under medical management and 1263 underwent mitral surgery. In multivariate analysis, routinely measured EROA was associated with mortality (adjusted hazard ratio 1.19 [1.13-1.24] per 10 mm², p<0.0001) independently of LVEF, end-systolic diameter, symptoms, and age/comorbidities. The association between routinely measured EROA and mortality persisted with competing-risk modeling (adjusted hazard ratio 1.15 [1.10-1.20] per 10 mm², p<0.0001), in patients without guideline-based Class I/II surgical triggers (adjusted hazard ratio 1.19 [1.10-1.28] per 10 mm², p<0.0001), and in all subgroups examined (all p<0.01). Spline curve analysis showed that, compared with general-population mortality, excess mortality appears for moderate DMR (EROA ≥20 mm²), becomes notable at EROA ≥30 mm², and steadily increases with higher EROA levels beyond the 40 mm² threshold. Conclusions: Echocardiographic DMR quantitation is scalable to routine practice and is independently associated with clinical outcome. Routinely measured EROA is strongly associated with long-term survival under medical treatment. Excess mortality vs. the general population appears in the "moderate" DMR range and steadily increases with higher EROA. Hence, individual EROA values should be integrated into therapeutic considerations, in addition to categorical DMR grading.
Gianassi, S; Bisin, S; Bindi, B; Spitaleri, I; Bambi, F
2010-01-01
The collection and handling of hematopoietic stem cells (HSCs) must meet high quality requirements. Integrated quality risk management can help to identify and contain potential risks related to HSC production. Risk analysis techniques allow one to "weigh" identified hazards, considering the seriousness of their effects, frequency, and detectability, seeking to prevent the most harmful hazards. Hazard Analysis and Critical Control Points (HACCP), recognized as the most appropriate technique to identify risks associated with physical, chemical, and biological hazards for cellular products, consists of classifying finished-product specifications and limits of acceptability, identifying all off-specifications, defining the activities that can cause them, and finally establishing both a monitoring system for each Critical Control Point and corrective actions for deviations. The severity of possible effects on patients, as well as the occurrence and detectability of critical parameters, are measured on quantitative scales (Risk Priority Number [RPN]). Risk analysis was performed with this technique on the HSC manipulation process carried out at our blood center. The analysis showed that the hazards with the highest RPN values, and thus the greatest impact on the process, are loss of dose and loss of tracking; the technical skill of operators and manual transcription of data were the most critical parameters. Problems related to operator skills are handled by defining targeted training programs, while the other critical parameters can be mitigated with continuous control systems. The blood center management software was completed by a labeling system, with forms designed to comply with the standards in force, and by starting implementation of a cryopreservation management module. Copyright 2010 Elsevier Inc. All rights reserved.
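The RPN scoring described here multiplies severity, occurrence, and detectability for each hazard. A minimal sketch, assuming conventional 1-10 scales; the hazards and scores below are invented for illustration, not the center's actual ratings:

```python
from dataclasses import dataclass

@dataclass
class Hazard:
    name: str
    severity: int       # 1 (negligible) .. 10 (catastrophic)
    occurrence: int     # 1 (rare) .. 10 (frequent)
    detectability: int  # 1 (always detected) .. 10 (undetectable)

    @property
    def rpn(self) -> int:
        # Risk Priority Number: product of the three ratings
        return self.severity * self.occurrence * self.detectability

hazards = [
    Hazard("loss of dose", 9, 4, 6),
    Hazard("loss of tracking", 8, 3, 7),
    Hazard("labeling error", 7, 2, 3),
]
for h in sorted(hazards, key=lambda h: h.rpn, reverse=True):
    print(f"{h.name:20s} RPN = {h.rpn}")
```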
Application of an adaptive neuro-fuzzy inference system to ground subsidence hazard mapping
NASA Astrophysics Data System (ADS)
Park, Inhye; Choi, Jaewon; Jin Lee, Moung; Lee, Saro
2012-11-01
We constructed hazard maps of ground subsidence around abandoned underground coal mines (AUCMs) in Samcheok City, Korea, using an adaptive neuro-fuzzy inference system (ANFIS) and a geographical information system (GIS). To evaluate the factors related to ground subsidence, a spatial database was constructed from topographic, geologic, mine tunnel, land use, and ground subsidence maps. An attribute database was also constructed from field investigations and reports on existing ground subsidence areas at the study site. Five major factors causing ground subsidence were extracted: (1) depth of drift; (2) distance from drift; (3) slope gradient; (4) geology; and (5) land use. The ANFIS model with different types of membership functions (MFs) was then applied for ground subsidence hazard mapping in the study area. Two ground subsidence hazard maps were prepared using the different MFs. Finally, the resulting ground subsidence hazard maps were validated using ground subsidence test data that were not used for training the ANFIS. The validation showed 95.12% accuracy using the generalized bell-shaped MF model and 94.94% accuracy using the Sigmoidal2 MF model. These results show that an ANFIS can be an effective tool in ground subsidence hazard mapping. Analysis of ground subsidence with the ANFIS model suggests that quantitative analysis of ground subsidence near AUCMs is possible.
The Typhoon Disaster Analysis Based on Weibo Topic Heat
NASA Astrophysics Data System (ADS)
Yuan, J.; Gong, A.; Wang, J.; Li, J.
2018-04-01
Can social media data be utilized in hazard evaluation? Typhoon disasters, among the most costly natural disasters, pose a devastating threat to humans. Moreover, social media have changed the way people communicate, and citizens can turn to these platforms to express disaster-related information in real time. Social media therefore improve situational awareness and widen the methods of acquiring hazard information. With more and more studies investigating the relationship between social media response and degree of damage, a strong correlation has been demonstrated. Weibo, one of the most popular social media platforms in China, provides data with posted text, location, user identification, and other additional information. Combining 10 tropical cyclones with Weibo data from 2013, we perform a quantitative analysis of the relationship between the grade of the hazard situation and Weibo-related topic heat at the province scale. We provide a new model of Weibo topic heat to evaluate Weibo activity in the study area. We also derive the hazard-assessment formula H = 1.8845 ln(α) + 15.636 for tropical cyclone disasters. The high goodness of fit also suggests that this equation can be used for rapid assessment of the hazard caused by tropical cyclones.
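The fitted relation can be evaluated directly as a quick check. A minimal sketch, where α is the province-scale topic-heat value in the authors' normalization; the sample inputs below are invented:

```python
import math

def typhoon_hazard(topic_heat: float) -> float:
    """Hazard grade from Weibo topic heat, using the relation reported
    in the abstract: H = 1.8845 * ln(alpha) + 15.636."""
    return 1.8845 * math.log(topic_heat) + 15.636

# alpha values here are placeholders; units follow the authors' model.
for alpha in (10, 100, 1000):
    print(f"alpha = {alpha:5d}  ->  H = {typhoon_hazard(alpha):.2f}")
```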
Data gap filling techniques are commonly used to predict hazard in the absence of empirical data. The most established techniques are read-across, trend analysis and quantitative structure-activity relationships (QSARs). Toxic equivalency factors (TEFs) are less frequently used d...
Risk assessment of major hazards and its application in urban planning: a case study.
Zhou, Yafei; Liu, Mao
2012-03-01
With the rapid development of industry in China, the number of establishments that are proposed or under construction is increasing year by year, and many are industries that handle flammable, explosive, toxic, harmful, and dangerous substances. Accidents such as fire, explosion, and toxic diffusion inevitably happen. Accidents resulting from these major hazards in cities cause large numbers of casualties and property losses. It is increasingly important to analyze the risk of major hazards in cities realistically and to suitably plan and utilize the surrounding land based on the risk analysis results, thereby reducing the hazards. A theoretical system for risk assessment of major hazards in cities is proposed in this article, and the major hazard risk for the entire city is analyzed quantitatively. Risks of various major accidents are considered together, the superposition effect is analyzed, individual risk contours for the entire city are drawn, and the level of risk in the city is assessed using "as low as reasonably practicable" guidelines. After the entire city's individual risk distribution is obtained, risk zones are delineated according to the corresponding HSE individual risk values, and land-use planning suggestions are proposed. Finally, a city in China is used as an example to illustrate the risk assessment process for a city's major hazards and its application in urban land-use planning. The proposed method has theoretical and practical significance in establishing and improving risk analysis of major hazards and urban land-use planning. On the one hand, major urban public risk is avoided; on the other, the land is utilized in the best possible way in order to obtain the maximum benefit from its use. © 2011 Society for Risk Analysis.
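The superposition step combines the individual risk contributions of all major hazard installations at each map location. A minimal sketch, assuming independent accident scenarios and illustrative ALARP-style zoning thresholds; all grids and criteria values are invented:

```python
import numpy as np

# Hypothetical per-source individual risk fields on a small grid (per year).
ir_fire = np.array([[1e-6, 5e-7], [2e-6, 1e-7]])
ir_explosion = np.array([[4e-7, 3e-6], [1e-6, 2e-7]])
ir_toxic = np.array([[2e-7, 1e-6], [5e-7, 4e-6]])

# Superposition assuming independent accident scenarios:
ir_total = 1.0 - (1.0 - ir_fire) * (1.0 - ir_explosion) * (1.0 - ir_toxic)
# For risks << 1 this is approximately the simple sum:
print(np.allclose(ir_total, ir_fire + ir_explosion + ir_toxic, rtol=1e-3))

# Zone classification against example thresholds (values illustrative only):
zones = np.digitize(ir_total, [1e-6, 1e-4])  # 0: acceptable, 1: ALARP, 2: unacceptable
print(ir_total)
print(zones)
```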
NASA Astrophysics Data System (ADS)
Cao, An-ye; Dou, Lin-ming; Wang, Chang-bin; Yao, Xiao-xiao; Dong, Jing-yuan; Gu, Yu
2016-11-01
Identification of precursory characteristics is a key issue for rock burst prevention. The aim of this research is to provide a reference for assessing rock burst risk and determining potential rock burst risk areas in coal mining. In this work, microseismic multidimensional information for the identification of rock bursts and spatial-temporal pre-warning was investigated at a specific coalface subject to high rock burst risk in a mining area near a large residual coal pillar. Firstly, microseismicity evolution prior to a disastrous rock burst was qualitatively analysed; abnormal clustering of seismic sources, abnormal variations in daily total energy release, and event counts can be regarded as precursors to rock burst. Secondly, passive tomographic imaging was used to locate high seismic activity zones and assess rock burst hazard when the coalface passes through residual pillar areas. The results show that high-velocity or velocity-anomaly regions correlated well with strong seismic activity in subsequent mining periods and that passive tomography has the potential to describe, both quantitatively and periodically, hazardous regions and to assess rock burst risk. Finally, the bursting strain energy index was further used for short-term spatial-temporal pre-warning of rock bursts. The temporal sequence curve and spatial contour nephograms indicate the danger status and the specific hazardous zones, and levels of rock burst risk can be quantitatively and rapidly analysed in time and space. The multidimensional precursory characteristic identification of rock bursts, including qualitative analysis and intermediate- and short-term quantitative predictions, can guide the choice of measures implemented to control rock bursts in the field, and provides a new approach to monitoring and forecasting rock bursts in space and time.
NASA Technical Reports Server (NTRS)
Baker, V. R. (Principal Investigator); Holz, R. K.; Hulke, S. D.; Patton, P. C.; Penteado, M. M.
1975-01-01
The author has identified the following significant results. Development of a quantitative hydrogeomorphic approach to flood hazard evaluation was hindered by (1) problems of resolution and definition of the morphometric parameters which have hydrologic significance, and (2) mechanical difficulties in creating the necessary volume of data for meaningful analysis. Measures of network resolution such as drainage density and basin Shreve magnitude indicated that large scale topographic maps offered greater resolution than small scale suborbital imagery and orbital imagery. The disparity in network resolution capabilities between orbital and suborbital imagery formats depends on factors such as rock type, vegetation, and land use. The problem of morphometric data analysis was approached by developing a computer-assisted method for network analysis. The system allows rapid identification of network properties which can then be related to measures of flood response.
Frikha, Youssef; Fellner, Johann; Zairi, Moncef
2017-09-01
Despite initiatives for enhanced recycling and waste utilization, landfill still represents the dominant disposal path for municipal solid waste (MSW). The environmental impacts of landfills depend on several factors, including waste composition, technical barriers, landfill operation, and climatic conditions. A profound evaluation of all factors and their impact is necessary in order to evaluate the environmental hazards emanating from landfills. The present paper investigates a sanitary landfill located in a semi-arid climate (Tunisia) and highlights major differences in quantitative and qualitative leachate characteristics compared to landfills situated in moderate climates. Besides the qualitative analysis of leachate samples, a quantitative analysis including the simulation of leachate generation (using the HELP model) has been conducted. The results of the analysis indicate a high load of salts (Cl, Na, inorganic nitrogen) in the leachate compared to other landfills. Furthermore, the simulations with the HELP model highlight that a major part of the leachate generated originates from the water content of the waste.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harper, Bryan; Thomas, Dennis G.; Chikkagoudar, Satish
The integration of rapid assays, large data sets, informatics and modeling can overcome current barriers in understanding nanomaterial structure-toxicity relationships by providing a weight-of-the-evidence mechanism to generate hazard rankings for nanomaterials. Here we present the use of a rapid, low-cost assay to perform screening-level toxicity evaluations of nanomaterials in vivo. Calculated EZ Metric scores, a combined measure of morbidity and mortality, were established at realistic exposure levels and used to develop a predictive model of nanomaterial toxicity. Hazard ranking and clustering analysis of 68 diverse nanomaterials revealed distinct patterns of toxicity related to both core composition and outermost surface chemistry of nanomaterials. The resulting clusters guided the development of a predictive model of gold nanoparticle toxicity to embryonic zebrafish. In addition, our findings suggest that risk assessments based on the size and core composition of nanomaterials alone may be wholly inappropriate, especially when considering complex engineered nanomaterials. These findings reveal the need to expeditiously increase the availability of quantitative measures of nanomaterial hazard and broaden the sharing of that data and knowledge to support predictive modeling. In addition, research should continue to focus on methodologies for developing predictive models of nanomaterial hazard based on sub-lethal responses to low dose exposures.
Hazardous Materials Flow by Rail
DOT National Transportation Integrated Search
1990-03-01
The report presents a quantitative overview of the movement of hazardous materials by rail in the United States. The data used is a hazardous materials rail waybill sample developed at TSC from the 1983 Rail Waybill Sample. The report examines (1) th...
Landslide risk models for decision making.
Bonachea, Jaime; Remondo, Juan; de Terán, José Ramón Díaz; González-Díez, Alberto; Cendrero, Antonio
2009-11-01
This contribution presents a quantitative procedure for landslide risk analysis and zoning considering hazard, exposure (or value of elements at risk), and vulnerability. The method provides the means to obtain landslide risk models (expressing expected damage due to landslides on material elements and economic activities in monetary terms, according to different scenarios and periods) useful to identify areas where mitigation efforts will be most cost effective. It allows identifying priority areas for the implementation of actions to reduce vulnerability (elements) or hazard (processes). The procedure proposed can also be used as a preventive tool, through its application to strategic environmental impact analysis (SEIA) of land-use plans. The underlying hypothesis is that reliable predictions about hazard and risk can be made using models based on a detailed analysis of past landslide occurrences in connection with conditioning factors and data on past damage. The results show that the approach proposed and the hypothesis formulated are essentially correct, providing estimates of the order of magnitude of expected losses for a given time period. Uncertainties, strengths, and shortcomings of the procedure and results obtained are discussed and potential lines of research to improve the models are indicated. Finally, comments and suggestions are provided to generalize this type of analysis.
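The risk model described here multiplies hazard (scenario probability), vulnerability, and exposed value, then sums over scenarios. A minimal sketch for one mapping unit; all scenario probabilities, vulnerabilities, and values are invented for illustration:

```python
# Expected annual loss per mapping unit: R = sum_s P_s * V_s * E,
# where P_s is the annual probability of landslide scenario s, V_s the
# vulnerability (0-1 damage fraction), and E the exposed value.
scenarios = [
    {"name": "shallow slide, ~50 yr return", "annual_prob": 0.02, "vulnerability": 0.15},
    {"name": "deep slide, ~500 yr return", "annual_prob": 0.002, "vulnerability": 0.70},
]
exposed_value_eur = 2_500_000  # hypothetical value of elements at risk in the unit

risk = sum(s["annual_prob"] * s["vulnerability"] * exposed_value_eur for s in scenarios)
print(f"expected annual loss: {risk:,.0f} EUR/yr")
```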
Probabilistic Approaches for Multi-Hazard Risk Assessment of Structures and Systems
NASA Astrophysics Data System (ADS)
Kwag, Shinyoung
Performance assessment of structures, systems, and components for multi-hazard scenarios has received significant attention in recent years. However, the concept of multi-hazard analysis is quite broad in nature, and the focus of the existing literature varies across a wide range of problems. In some cases, such studies focus on hazards that either occur simultaneously or are closely correlated with each other, for example, seismically induced flooding or seismically induced fires. In other cases, multi-hazard studies relate to hazards that are not dependent or correlated but have a strong likelihood of occurrence at different times during the lifetime of a structure. The current approaches for risk assessment need enhancement to account for multi-hazard risks. They must be able to account for uncertainty propagation in a systems-level analysis, consider correlation among events or failure modes, and allow integration of newly available information from continually evolving simulation models, experimental observations, and field measurements. This dissertation presents a detailed study that proposes enhancements by incorporating Bayesian networks and Bayesian updating within a performance-based probabilistic framework. The performance-based framework allows propagation of risk, as well as uncertainties in the risk estimates, within a systems analysis. Unlike conventional risk assessment techniques such as fault-tree analysis, a Bayesian network can account for statistical dependencies and correlations among events/hazards. The proposed approach is extended to develop a risk-informed framework for quantitative validation and verification of high-fidelity system-level simulation tools. Validation of such simulations can be quite formidable within the context of a multi-hazard risk assessment in nuclear power plants. The efficiency of this approach lies in the identification of critical events, components, and systems that contribute to the overall risk. Validation of any event or component on the critical path is relatively more important in a risk-informed environment. The significance of multi-hazard risk is also illustrated for the uncorrelated hazards of earthquakes and high winds, which may result in competing design objectives. It is also illustrated that the number of computationally intensive nonlinear simulations needed in performance-based risk assessment for external hazards can be significantly reduced by using the power of Bayesian updating in conjunction with the concept of an equivalent limit state.
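The practical effect of modeling dependence between hazards, as a Bayesian network does, can be seen in a two-hazard toy calculation. A minimal sketch with invented probabilities, not drawn from the dissertation:

```python
# Two hazards A (earthquake) and B (flood). If B can be seismically
# induced, P(B|A) > P(B), and assuming independence understates the
# probability that both hazards load the structure in the same year.
p_a = 0.01          # annual probability of A (hypothetical)
p_b = 0.02          # marginal annual probability of B (hypothetical)
p_b_given_a = 0.30  # seismically induced flooding (hypothetical)

p_joint_indep = p_a * p_b          # independence assumption
p_joint_dep = p_a * p_b_given_a    # conditional (network-style) model
print(f"P(A and B), independence : {p_joint_indep:.2e}")
print(f"P(A and B), conditional  : {p_joint_dep:.2e}")  # 15x larger here
```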
Marahatta, Sujan B.; Gautam, Sanju; Paudel, Grish; Yadav, Uday N.
2018-01-01
Background: Occupational hazards contribute to severe health problems among workers due to unhygienic conditions and lack of safety at the workplace. Economic growth and productivity can be well achieved by promoting health, safety, and a high-quality workplace. Aims: To assess the knowledge of occupational hazards, the use of personal protective equipment (PPE), and the factors associated with awareness of occupational hazards among automobile repair artisans in Kathmandu, Nepal. Settings and Design: A quantitative descriptive cross-sectional study of 400 automobile repair artisans from Kathmandu metropolitan city was conducted between March and September 2015. Materials and Methods: Simple random sampling was adopted for recruiting study participants. A pre-tested, validated semi-structured questionnaire in line with the study objectives was used to collect the data. Statistical Analysis: Logistic regression analysis was employed to identify factors associated with awareness of occupational hazards and PPE use. Results: Of the total, 56% had awareness of occupational hazards and 44.3% of artisans were using personal protective equipment. Being educated, having a job duration of ≥6 years, and having pre-service training for work were significantly associated with awareness of occupational hazards. Notably, we found that those who were aware of occupational hazards (OR = 3.01, 95% CI: 1.98–4.57) were three times more likely to use safety measures than those who were unaware. Conclusions: Awareness of occupational hazards and use of PPE are very low among automobile repair artisans in Kathmandu, Nepal. Therefore, interventions are needed in areas such as the provision of pre-service training, the promotion of safety advocacy, and the enforcement of appropriate workplace regulations. PMID:29743786
Risk assessment in the North Caucasus ski resorts
NASA Astrophysics Data System (ADS)
Komarov, Anton Y.; Seliverstov, Yury G.; Glazovskaya, Tatyana G.; Turchaninova, Alla S.
2016-10-01
Avalanches pose a significant problem in most mountain regions of Russia. The constant growth of economic activity, and therefore of avalanche hazard, in the North Caucasus region leads to demand for the development of large-scale avalanche risk assessment methods. Such methods are needed for the determination of appropriate avalanche protection measures as well as for economic assessments. The requirement for natural hazard risk assessments is established by the Federal Law of the Russian Federation (Federal Law 21.12.1994 N 68-FZ, 2016). However, the Russian guidelines (SNIP 11-02-96, 2013; SNIP 22-02-2003, 2012) are not clear concerning avalanche risk assessment calculations. We therefore discuss these problems by presenting a new avalanche risk assessment approach, using the example of developing but poorly researched ski resort areas. The suggested method includes formulas to calculate collective and individual avalanche risk. The results of the risk analysis are expressed as quantitative data that can be used to determine levels of avalanche risk (appropriate, acceptable, and inappropriate) and to suggest methods to decrease the individual risk to an acceptable level or better. The analysis makes it possible to compare quantitative risk data obtained from different regions, analyze them, and evaluate the economic feasibility of protection measures.
Khan, F I; Iqbal, A; Ramesh, N; Abbasi, S A
2001-10-12
As conventionally done, strategies for incorporating accident-prevention measures in any hazardous chemical process industry are developed on the basis of input from risk assessment. However, the two steps, risk assessment and hazard reduction (or safety) measures, are not linked interactively in the existing methodologies. This prevents a quantitative assessment of the impacts of safety measures on risk control. We have made an attempt to develop a methodology in which the risk assessment steps are interactively linked with the implementation of safety measures. The resultant system indicates the extent of risk reduction achieved by each successive safety measure. It also indicates, based on maximum credible accident analysis (MCAA) and probabilistic fault tree analysis (PFTA), whether a given unit can ever be made 'safe'. The application of the methodology is illustrated with a case study.
Risk assessment of airborne pesticides in a Mediterranean region of Spain.
López, Antonio; Yusà, Vicent; Muñoz, Amalia; Vera, Teresa; Borràs, Esther; Ródenas, Milagros; Coscollà, Clara
2017-01-01
A risk assessment strategy based on quantitative target analysis and semi-quantitative retrospective screening determination of pesticides in PM10 has been developed. The proposed methodology was applied to 345 real samples from nine stations in a Mediterranean area of Spain, and the risk was assessed for adults, children, and infants. Forty pesticides were detected, with average concentrations ranging from 8 to 30,000 pg m⁻³. Each station showed its specific pesticide profile, which is linked to the different types of crops around each station. For adults, children, and infants, the estimated chronic inhalation risk, expressed as the Hazard Quotient (HQ), was <1 for all pesticides. The cumulative exposures for organophosphate, neonicotinoid, benzimidazole, carbamate, micro-organism and triazole pesticides (Hazard Index, HI) were <1 for the three population groups assessed. For infants, the cancer risk estimated for the detected pesticides classified as possible and potential carcinogens was lower than 1.0E-06, except for carbendazim and hexythiazox. Copyright © 2016 Elsevier B.V. All rights reserved.
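The HQ/HI screening logic is straightforward to compute: each pesticide's exposure concentration is divided by a toxicological reference value, and quotients are summed within a chemical group. A minimal sketch, assuming hypothetical concentrations and reference values (not the study's data):

```python
# Chronic inhalation hazard quotient: HQ = exposure concentration / reference
# concentration; hazard index for a pesticide group: HI = sum of its HQs.
detected = {                    # pesticide: (air conc. pg/m3, reference conc. pg/m3)
    "chlorpyrifos": (1200.0, 2.4e6),
    "carbendazim": (300.0, 6.0e5),
    "tebuconazole": (80.0, 1.2e6),
}
hq = {p: conc / rfc for p, (conc, rfc) in detected.items()}
hi = sum(hq.values())
for p, q in hq.items():
    print(f"HQ({p}) = {q:.2e}")
print(f"HI (cumulative) = {hi:.2e} -> {'<1: acceptable' if hi < 1 else '>=1: concern'}")
```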
Radiological Hazard of Spallation Products in Accelerator-Driven System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saito, M.; Stankovskii, A.; Artisyuk, V.
The central issue underlying this paper is related to elucidating the hazard of radioactive spallation products that might be an important factor affecting the design option of accelerator-driven systems (ADSs). Hazard analysis based on the concept of Annual Limit on Intake identifies alpha-emitting isotopes of rare earths (REs) (dysprosium, gadolinium, and samarium) as the dominant contributors to the overall toxicity of traditional (W, Pb, Pb-Bi) targets. The matter is addressed from several points of view: code validation to simulate their yields, choice of material for the neutron-producing targets, and challenging the beam type. The paper quantitatively determines the domain in which the toxicity of REs exceeds that of polonium activation products, broadly discussed now in connection with advertising lead-bismuth technology for the needs of ADSs.
NASA Astrophysics Data System (ADS)
Coca-Domínguez, Oswaldo; Ricaurte-Villota, Constanza
2015-04-01
The analysis of hazard and vulnerability in coastal areas affected by erosion is based on studies of coastal dynamics, since these provide the level of detail useful for decision-making in prevention, mitigation, disaster reduction, and integrated risk management. The town of La Bocana, located in Buenaventura (Colombian Pacific), was selected for a coastal erosion hazard assessment based on three components: i) magnitude, ii) occurrence, and iii) susceptibility. Vulnerability is likewise composed of three main components: i) exposure, ii) fragility, and iii) resilience, which in turn are evaluated in six dimensions of vulnerability: physical, social, economic, ecological, institutional, and cultural. The hazard analysis used a semi-quantitative approach and an index of variables such as type of geomorphological unit, type of beach, wave exposure of the coast, and occurrence, among others. Quantitative data on coastal retreat were measured using DSAS (Digital Shoreline Analysis System), an ArcGIS application, together with digital elevation models of the beach and six beach profiles strategically located on the coast, obtained with GNSS technology. Sediment samples collected from these beaches and mean wave height and direction were used as complementary data. The information was integrated along the coastline in segments of 250 x 250 meters. Four sectors make up the coastal area of La Bocana: Pianguita, Vistahermosa, Downtown, and Shangay. The six vulnerability dimensions were evaluated for these populations, together with population density for exposure, which was analyzed through a multi-matrix method including variables such as land use, population, type of structure, education, and basic services to measure fragility, each with its respective indicator of resilience. The hazard analysis results indicate that Vistahermosa is under very high threat, while Downtown and Pianguita are under medium hazard. These two sectors have the highest population density and the largest hotel development and services infrastructure; Shangay, meanwhile, was scored as low hazard because wave action has no direct impact on it. The vulnerability analysis suggests that Shangay has very high vulnerability because it lacks basic services and has low levels of schooling, while Downtown, Vistahermosa, and Pianguita show average vulnerability. Additionally, it was determined that in recent years erosion rates in Vistahermosa have reached up to -xx m yr-1, while in the other sectors the regression of the coastline can be associated with local tidal peaks that occur during April and October, the other months of the year typically showing recovery and stability processes.
NASA Astrophysics Data System (ADS)
Runge, Melody G.; Bebbington, Mark S.; Cronin, Shane J.; Lindsay, Jan M.; Moufti, Mohammed R.
2016-02-01
During probabilistic volcanic hazard analysis of volcanic fields, a greater variety of spatial data on crustal features should help improve forecasts of future vent locations. Without further examination, however, geophysical estimates of crustal or other features may be non-informative. Here, we present a new, robust, non-parametric method to quantitatively determine the existence of any relationship between natural phenomena (e.g., volcanic eruptions) and a variety of geophysical data. This provides a new validation tool for incorporating a range of potentially hazard-diagnostic observable data into recurrence rate estimates and hazard analyses. Through this study it is shown that the locations of Cenozoic volcanic fields across the Arabian Shield appear to be related to the locations of major and minor faults, to higher elevations, and to regions where gravity anomaly values are between -125 mGal and 0 mGal. These findings support earlier hypotheses that the western shield uplift was related to Cenozoic volcanism. At the harrat (volcanic field) scale, higher vent-density regions are related to both elevation and gravity anomaly values. A by-product of this work is the collection of existing data on volcanism across Saudi Arabia, with all vent locations provided herein, as well as updated maps for Harrats Kura, Khaybar, Ithnayn, Kishb, and Rahat. This work also highlights the potential dangers of assuming relationships between observed data and the occurrence of a natural phenomenon without quantitative assessment or proper consideration of the effects of data resolution.
NASA Astrophysics Data System (ADS)
Power, William; Wang, Xiaoming; Lane, Emily; Gillibrand, Philip
2013-09-01
Regional source tsunamis represent a potentially devastating threat to coastal communities in New Zealand, yet are infrequent events for which little historical information is available. It is therefore essential to develop robust methods for quantitatively estimating the hazards posed, so that effective mitigation measures can be implemented. We develop a probabilistic model for the tsunami hazard posed to the Auckland region of New Zealand from the Kermadec Trench and the southern New Hebrides Trench subduction zones. An innovative feature of our model is the systematic analysis of uncertainty regarding the magnitude-frequency distribution of earthquakes in the source regions. The methodology is first used to estimate the tsunami hazard at the coastline, and then used to produce a set of scenarios that can be applied to produce probabilistic maps of tsunami inundation for the study region; the production of these maps is described in part II. We find that the 2,500 year return period regional source tsunami hazard for the densely populated east coast of Auckland is dominated by events originating in the Kermadec Trench, while the equivalent hazard to the sparsely populated west coast is approximately equally due to events on the Kermadec Trench and the southern New Hebrides Trench.
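A common way to make such hazard estimates concrete is Monte Carlo sampling of a magnitude-frequency law for each source zone, propagated to a coastal intensity measure. A minimal sketch, assuming a truncated Gutenberg-Richter source and a toy amplitude scaling; all parameter values are invented, not those of the study:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical magnitude-frequency model for one source zone:
annual_rate = 0.01           # rate of Mw >= 8.0 events on the trench segment
mw_min, mw_max, b = 8.0, 9.4, 1.0

def sample_mw(size):
    """Truncated Gutenberg-Richter sampling by inverse-CDF."""
    u = rng.random(size)
    a = 10 ** (-b * mw_min) - u * (10 ** (-b * mw_min) - 10 ** (-b * mw_max))
    return -np.log10(a) / b

def coastal_amplitude(mw):
    """Toy source-to-coast scaling; a real study uses hydrodynamic model runs."""
    return 0.5 * 10 ** (0.6 * (mw - 8.0))

years, sims = 2500, 5_000
max_amp = np.zeros(sims)
for i in range(sims):
    n = rng.poisson(annual_rate * years)   # number of events in the window
    if n:
        max_amp[i] = coastal_amplitude(sample_mw(n)).max()

print("median 2500-yr maximum amplitude (m):", np.median(max_amp).round(2))
```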
Assessment of the Casualty Risk of Multiple Meteorological Hazards in China
Xu, Wei; Zhuo, Li; Zheng, Jing; Ge, Yi; Gu, Zhihui; Tian, Yugang
2016-01-01
A study of the frequency, intensity, and risk of extreme climatic events or natural hazards is important for assessing the impacts of climate change. Many models have been developed to assess the risk of multiple hazards; however, most of the existing approaches can only model relative levels of risk. This paper reports the development of a method for the quantitative assessment of the risk of multiple hazards based on information diffusion. This method was used to assess the risks of loss of human lives from 11 types of meteorological hazards in China at the prefectural and provincial levels. Risk curves of multiple hazards were obtained for each province and the risks of 10-year, 20-year, 50-year, and 100-year return periods were mapped. The results show that the provinces (municipalities, autonomous regions) in southeastern China are at higher risk of multiple meteorological hazards as a result of their geographical location and topography. The results of this study can be used as references for the management of meteorological disasters in China. The model can be used to quantitatively calculate the risks of casualty, direct economic losses, building collapse, and agricultural losses for any hazards at different spatial scales. PMID:26901210
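Information diffusion spreads each scarce observation over a discrete grid with a kernel, so that a usable exceedance curve can be estimated from few data points. A minimal sketch after the normal-diffusion idea, with a simplified rule-of-thumb bandwidth and invented loss data; the paper's exact estimator details are not reproduced:

```python
import numpy as np

def information_diffusion(samples, grid, h=None):
    """Normal information diffusion estimate of an exceedance curve from a
    small sample of annual losses (bandwidth rule simplified in this sketch)."""
    samples = np.asarray(samples, float)
    if h is None:  # rule-of-thumb bandwidth, an assumption of this sketch
        h = 1.06 * samples.std(ddof=1) * len(samples) ** -0.2
    # Diffuse each observation over the grid with a Gaussian kernel.
    q = np.exp(-((grid[None, :] - samples[:, None]) ** 2) / (2 * h * h))
    q /= q.sum(axis=1, keepdims=True)      # each observation carries unit mass
    density = q.sum(axis=0) / len(samples)
    return density[::-1].cumsum()[::-1]    # P(loss >= grid value)

losses = [12, 30, 7, 55, 18, 9, 41, 3]    # hypothetical annual casualty counts
grid = np.linspace(0, 80, 81)
p = information_diffusion(losses, grid)
print("P(loss >= 50):", p[50].round(3))
```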
Assessment and Control of Spacecraft Charging Risks on the International Space Station
NASA Technical Reports Server (NTRS)
Koontz, Steve; Valentine, Mark; Keeping, Thomas; Edeen, Marybeth; Spetch, William; Dalton, Penni
2004-01-01
The International Space Station (ISS) operates in the F2 region of Earth's ionosphere, orbiting at altitudes ranging from 350 to 450 km at an inclination of 51.6 degrees. The relatively dense, cool F2 ionospheric plasma suppresses surface charging processes much of the time, and the flux of relativistic electrons is low enough to preclude deep dielectric charging processes. The most important spacecraft charging processes in the ISS orbital environment are: 1) ISS electrical power system interactions with the F2 plasma, 2) magnetic induction processes resulting from flight through the geomagnetic field, and 3) charging processes that result from interaction with auroral electrons at high latitude. Recently, the continuing review and evaluation of putative ISS charging hazards required by the ISS Program Office revealed that ISS charging could produce an electrical shock hazard to the ISS crew during extravehicular activity (EVA). ISS charging risks are being evaluated in an ongoing measurement and analysis campaign. The results of ISS charging measurements are combined with a recently developed model of ISS charging (the Plasma Interaction Model) and an exhaustive analysis of historical ionospheric variability data (ISS Ionospheric Specification) to evaluate ISS charging risks using Probabilistic Risk Assessment (PRA) methods. The PRA combines estimates of the frequency of occurrence and severity of the charging hazards with estimates of the reliability of various hazard control systems, as required by NASA's safety and risk management programs, to enable design and selection of a hazard control approach that minimizes overall programmatic and personnel risk. The PRA provides a quantitative methodology for incorporating the results of the ISS charging measurement and analysis campaigns into the necessary hazard reports, EVA procedures, and ISS flight rules required for operating ISS in a safe and productive manner.
NASA Astrophysics Data System (ADS)
Maruthi, Y. A.; Das, N. Lakshmana; Ramprasad, S.; Ram, S. S.; Sudarshan, M.
2015-08-01
The present study focuses on the quantitative analysis of elements in school chalk to ensure the safety of its use. Elements such as calcium (Ca), aluminum (Al), iron (Fe), silicon (Si), and chromium (Cr) were analyzed in settled chalk dust samples collected from five classrooms (CD-1) and in another set of unused chalk samples collected from the local market (CD-2), using Energy-Dispersive X-Ray Fluorescence (ED-XRF) spectroscopy. The presence of these elements in significant concentrations in school chalk confirmed that it is an irritant and an occupational hazard. It is suggested to use protective equipment such as filter masks for the mouth and nose, and chalk holders. This study also suggests using more advanced techniques such as digital boards, marker boards, and PowerPoint presentations to mitigate the occupational hazard of classroom chalk.
Landslide inventories: The essential part of seismic landslide hazard analyses
Harp, E.L.; Keefer, D.K.; Sato, H.P.; Yagi, H.
2011-01-01
A detailed and accurate landslide inventory is an essential part of seismic landslide hazard analysis. An ideal inventory would cover the entire area affected by an earthquake and include all of the landslides that are possible to detect, down to sizes of 1-5 m in length. The landslides must also be located accurately and mapped as polygons depicting their true shapes. Such mapped landslide distributions can then be used to perform seismic landslide hazard analysis and other quantitative analyses. Detailed inventory maps of landslides triggered by earthquakes began in the early 1960s with the use of aerial photography. In recent years, advances in technology have resulted in the accessibility of satellite imagery with sufficiently high resolution to identify and map all but the smallest of landslides triggered by a seismic event. With this ability to view any area of the globe, we can acquire imagery for any earthquake that triggers significant numbers of landslides. However, a common problem of incomplete coverage of the full distributions of landslides has emerged along with the advent of high-resolution satellite imagery.
Models of volcanic eruption hazards
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wohletz, K.H.
1992-01-01
Volcanic eruptions pose an ever present but poorly constrained hazard to life and property for geothermal installations in volcanic areas. Because eruptions occur sporadically and may limit field access, quantitative and systematic field studies of eruptions are difficult to complete. Circumventing this difficulty, laboratory models and numerical simulations are pivotal in building our understanding of eruptions. For example, the results of fuel-coolant interaction experiments show that magma-water interaction controls many eruption styles. Applying these results, increasing numbers of field studies now document and interpret the role of external water in eruptions. Similarly, numerical simulations solve the fundamental physics of high-speed fluid flow and give quantitative predictions that elucidate the complexities of pyroclastic flows and surges. A primary goal of these models is to guide geologists in searching for critical field relationships and making their interpretations. Coupled with field work, modeling is beginning to allow more quantitative and predictive volcanic hazard assessments.
NASA Astrophysics Data System (ADS)
Septiadi, Deni; S, Yarianto Sugeng B.; Sriyana; Anzhar, Kurnia; Suntoko, Hadi
2018-03-01
The potential sources of meteorological phenomena in the Nuclear Power Plant (NPP) area of interest are identified, and the extreme values of the possible resulting hazards associated with such phenomena are evaluated to derive the appropriate design bases for the NPP. The appropriate design bases shall be determined according to the applicable regulations of the Nuclear Energy Regulatory Agency (Bapeten), which presently do not indicate quantitative criteria for determining the design bases for meteorological hazards. These meteorological investigations are also carried out to evaluate the regional and site-specific meteorological parameters which affect the transport and dispersion of radioactive effluents in the environment of the region around the NPP site. The meteorological hazards are to be monitored and assessed periodically over the lifetime of the plant to ensure that consistency with the design assumptions is maintained throughout the full lifetime of the facility.
A fault tree model to assess probability of contaminant discharge from shipwrecks.
Landquist, H; Rosén, L; Lindhe, A; Norberg, T; Hassellöv, I-M; Lindgren, J F; Dahllöf, I
2014-11-15
Shipwrecks on the sea floor around the world may contain hazardous substances that can cause harm to the marine environment. Today there are no comprehensive methods for environmental risk assessment of shipwrecks, and thus there is poor support for decision-making on prioritization of mitigation measures. The purpose of this study was to develop a tool for quantitative risk estimation of potentially polluting shipwrecks, and in particular an estimation of the annual probability of hazardous substance discharge. The assessment of the probability of discharge is performed using fault tree analysis, facilitating quantification of the probability with respect to a set of identified hazardous events. This approach enables a structured assessment providing transparent uncertainty and sensitivity analyses. The model facilitates quantification of risk, quantification of the uncertainties in the risk calculation and identification of parameters to be investigated further in order to obtain a more reliable risk calculation. Copyright © 2014 Elsevier Ltd. All rights reserved.
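Fault-tree quantification of this kind reduces to combining basic-event probabilities through OR and AND gates. A minimal sketch, assuming independent basic events; the tree structure and probabilities below are invented for illustration, not the paper's estimates:

```python
# Minimal fault-tree evaluation of an annual discharge probability.
def p_or(*ps):
    """OR gate: probability that at least one independent event occurs."""
    out = 1.0
    for p in ps:
        out *= (1.0 - p)
    return 1.0 - out

def p_and(*ps):
    """AND gate: probability that all independent events occur."""
    out = 1.0
    for p in ps:
        out *= p
    return out

p_corrosion_breach = 0.03   # hull opening from corrosion, per year (hypothetical)
p_physical_impact = 0.005   # trawling/anchor strike, per year (hypothetical)
p_substance_left = 0.6      # hazardous substance still contained (hypothetical)

p_opening = p_or(p_corrosion_breach, p_physical_impact)
p_discharge = p_and(p_opening, p_substance_left)
print(f"annual discharge probability: {p_discharge:.4f}")
```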
Multi-hazard risk assessment applied to hydraulic fracturing operations
NASA Astrophysics Data System (ADS)
Garcia-Aristizabal, Alexander; Gasparini, Paolo; Russo, Raffaella; Capuano, Paolo
2017-04-01
Without exception, the exploitation of any energy resource produces impacts and intrinsically bears risks. Therefore, to make sound decisions about future energy resource exploitation, it is important to clearly understand the potential environmental impacts over the full life cycle of an energy development project, distinguishing between the impacts intrinsically related to exploiting a given energy resource and those shared with the exploitation of other energy resources. Technological advances such as directional drilling and hydraulic fracturing have led to a rapid expansion of unconventional resources (UR) exploration and exploitation; as a consequence, both public health and environmental concerns have risen. The main objective of a multi-hazard risk assessment applied to the development of UR is to assess the rate (or the likelihood) of occurrence of incidents and the relative potential impacts on the surrounding environment, considering different hazards and their interactions. Such analyses have to be performed considering the different stages of development of a project; however, the discussion in this paper is mainly focused on the analysis applied to the hydraulic fracturing stage of a UR development project. The multi-hazard risk assessment applied to the development of UR poses a number of challenges, making this a particularly complex problem. First, a number of external hazards might be considered as potential triggering mechanisms. Such hazards can be either of natural origin or anthropogenic events caused by the same industrial activities. Second, failures might propagate through the industrial elements, leading to complex scenarios according to the layout of the industrial site. Third, there is a range of potential risk receptors, from environmental elements (such as air, soil, surface water, or groundwater) to local communities and ecosystems. The multi-hazard risk approach for this problem is set by considering multiple hazards (and their possible interactions) as possible sources of system perturbation that might lead to an incidental event. Given the complexity of the problem, we adopt a multi-level approach: first, we perform a qualitative analysis oriented to the identification of a wide range of possible scenarios; this process is based on a review of potential impacts on different risk receptors reported in the literature, condensed in a number of causal diagrams created for the different stages of a UR development project. Second, the most important scenarios are selected for quantitative multi-hazard risk analysis. This selection is based on the identification of major risks, i.e., those related to the occurrence of low-probability/high-impact extreme events. The general framework for the quantitative multi-hazard risk analysis is represented using a so-called bow-tie structure, composed of a fault tree on the left-hand side, identifying the possible events causing the critical (or top) event, and an event tree on the right-hand side, showing the possible consequences of the critical event. This work was supported under SHEER: "Shale Gas Exploration and Exploitation Induced Risks" project n.640896, funded from the Horizon 2020 R&I Framework Programme, call H2020-LCE-2014-1
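The consequence side of such a bow-tie multiplies the top-event frequency by event-tree branch probabilities. A minimal sketch, with a hypothetical top event and invented branch values, not SHEER project results:

```python
# Right-hand (event-tree) side of a bow-tie: propagate a critical-event
# frequency through branch probabilities to consequence scenarios.
f_top = 1e-3                        # well-barrier failure, per well-year (hypothetical)
branches = {
    "contained on pad": 0.90,
    "soil contamination": 0.07,
    "groundwater contamination": 0.03,
}
assert abs(sum(branches.values()) - 1.0) < 1e-9  # branches are exhaustive

for outcome, p in branches.items():
    print(f"{outcome:28s} {f_top * p:.2e} per well-year")
```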
Probabilistic short-term volcanic hazard in phases of unrest: A case study for tephra fallout
NASA Astrophysics Data System (ADS)
Selva, Jacopo; Costa, Antonio; Sandri, Laura; Macedonio, Giovanni; Marzocchi, Warner
2014-12-01
During volcanic crises, volcanologists usually estimate the impact of possible imminent eruptions through deterministic modeling of the effects of one or a few pre-established scenarios. Although such an approach may provide important information to decision makers, the sole use of deterministic scenarios does not allow scientists to properly take all uncertainties into consideration, and it cannot be used to assess risk quantitatively, because the latter unavoidably requires a probabilistic approach. We present a model based on the concept of the Bayesian event tree (hereinafter named BET_VH_ST, standing for Bayesian event tree for short-term volcanic hazard) for short-term, near-real-time probabilistic volcanic hazard analysis, formulated for any potentially hazardous phenomenon accompanying an eruption. The specific goal of BET_VH_ST is to produce a quantitative assessment of the probability of exceedance of any potential level of intensity for a given volcanic hazard due to eruptions within restricted time windows (hours to days) in any area surrounding the volcano, accounting for all natural and epistemic uncertainties. BET_VH_ST assesses the conditional probability at each level of the event tree, accounting for any relevant information derived from the monitoring system, theoretical models, and the past history of the volcano, and propagating any relevant epistemic uncertainty underlying these assessments. As an application example, we apply BET_VH_ST to assess short-term volcanic hazard related to tephra loading during the Major Emergency Simulation Exercise, a major exercise at Mount Vesuvius that took place from 19 to 23 October 2006, consisting of a blind simulation of Vesuvius reactivation, from the early warning phase up to the final eruption, including the evacuation of a sample of about 2000 people from the area at risk. The results show that BET_VH_ST is able to produce short-term forecasts of the impact of tephra fall during a rapidly evolving crisis, accurately accounting for and propagating all uncertainties and enabling rational decision making under uncertainty.
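A minimal numerical sketch of the event-tree logic underlying BET-style models follows: the probability of a hazardous outcome is the product of conditional probabilities along a branch, and epistemic uncertainty can be propagated by sampling each node from a distribution. The node structure and Beta parameters below are illustrative assumptions, not the model's calibrated values.

```python
# Sketch of Bayesian event-tree branch probability with epistemic
# uncertainty. Node names and Beta parameters are invented for illustration.
import numpy as np

rng = np.random.default_rng(42)
n_samples = 10_000

# Conditional nodes: unrest -> magmatic -> eruption -> tephra load > L.
p_magmatic = rng.beta(8, 2, n_samples)   # P(magmatic | unrest)
p_eruption = rng.beta(3, 7, n_samples)   # P(eruption | magmatic unrest)
p_exceed = rng.beta(2, 18, n_samples)    # P(load > L at site | eruption)

p_hazard = p_magmatic * p_eruption * p_exceed  # product along the branch

print(f"mean P = {p_hazard.mean():.4f}")
print(f"10th-90th percentile: {np.percentile(p_hazard, [10, 90])}")
```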
A Multihazard Regional Level Impact Assessment for South Asia
NASA Astrophysics Data System (ADS)
Amarnath, Giriraj; Alahacoon, Niranga; Aggarwal, Pramod; Smakhtin, Vladimir
2016-04-01
To prioritize climate adaptation strategies, quantitative and systematic regional-level assessments are needed that are comparable across multiple climatic hazard regimes. Assessing which countries in a region are most vulnerable to climate change requires analysis of multiple climatic hazards, including droughts, floods, extreme temperature, extreme rainfall, and sea-level rise. These five climatic hazards, along with population densities, were modelled using GIS, enabling a summary of associated human exposure and agricultural losses. A combined index based on hazard, exposure, and adaptive capacity is introduced to identify areas of extreme risk. The analysis yields population climate-hazard exposure, defined as the relative likelihood that a person in a given location was exposed to a given climate-hazard event in a given period of time. The study presents a detailed and coherent approach to fine-scale climate hazard mapping and identification of risk areas for the regions of South Asia that, for the first time, combines the following features: (a) methodological consistency across different climate-related hazards, (b) assessment of total exposure in terms of population and agricultural losses, (c) regional-level spatial coverage, and (d) development of customized tools using an ArcGIS toolbox that allow assessment of changes in exposure over time and easy replacement of existing datasets with newly released or superior datasets. The resulting maps enable comparison of the most vulnerable regions in South Asia to climate-related hazards, which is among the most urgent of policy needs. Subnational areas (regions/districts/provinces) most vulnerable to climate change impacts in South Asia are documented. The approach involves overlaying climate hazard maps, sensitivity maps, and adaptive capacity maps following the vulnerability assessment framework of the United Nations' Intergovernmental Panel on Climate Change (IPCC). The study used data on the spatial distribution of various climate-related hazards in 1,398 subnational areas of Bangladesh, Bhutan, India, Nepal, Pakistan and Sri Lanka. An analysis of country-level population exposure showed that approximately 750 million people are affected by combined climate hazards; of the affected population, 72% are in India, followed by 12% each from Bangladesh and Pakistan. Due in part to its economic importance, agriculture was found to be the sector most vulnerable and exposed to climate extremes. An analysis of individual hazards indicates that floods and droughts are the dominant hazards impacting agricultural areas, followed by extreme rainfall, extreme temperature and sea-level rise. Based on this vulnerability assessment, all the regions of Bangladesh; the Indian states of Andhra Pradesh, Bihar, Maharashtra, Karnataka and Orissa; Ampara, Puttalam, Trincomalee, Mannar and Batticaloa in Sri Lanka; Sind and Baluchistan in Pakistan; Central and East Nepal; and the transboundary river basins of the Indus, Ganges and Brahmaputra are among the most vulnerable regions in South Asia.
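The hazard-exposure-capacity combination described above can be sketched as a simple index calculation. The district names, scores, and the exact functional form (multiplicative, normalized for ranking) are assumptions for illustration, not the study's calibrated index.

```python
# Sketch of a combined risk index: risk = hazard x exposure / capacity,
# normalized across districts for ranking. All values are invented.
import numpy as np

districts = ["district A", "district B", "district C"]
hazard = np.array([0.8, 0.5, 0.3])    # combined multi-hazard score, 0-1
exposure = np.array([0.9, 0.4, 0.7])  # population/agriculture exposure, 0-1
capacity = np.array([0.3, 0.6, 0.5])  # adaptive capacity, 0-1

risk = hazard * exposure / capacity
risk /= risk.max()                    # normalize to [0, 1] for ranking

for d, r in sorted(zip(districts, risk), key=lambda t: -t[1]):
    print(f"{d}: relative risk index {r:.2f}")
```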
Chemical Warfare Agent Operational Exposure Hazard Assessment Research: FY07 Report and Analysis
2010-07-01
of the nerve agents sarin, soman, cyclohexylsarin, VX, and Russian VX in human urine using isotope-dilution gas chromatography-tandem mass... Needham L.L.; Barr, D.B. Quantitation of organophosphorus nerve agent metabolites in human urine using isotope dilution gas chromatography-tandem mass... nerve agents, VX, GB, or GF, and to determine lethal percutaneous (PC) levels of VX. Calibration of physiologically-based pharmacokinetic biomarkers
Landslide hazard assessment: recent trends and techniques.
Pardeshi, Sudhakar D; Autade, Sumant E; Pardeshi, Suchitra S
2013-01-01
Landslide hazard assessment is an important step towards landslide hazard and risk management. There are several methods of Landslide Hazard Zonation (LHZ), viz. heuristic, semi-quantitative, quantitative, probabilistic and multi-criteria decision-making processes. However, no single method is universally accepted for effective assessment of landslide hazards. In recent years, several attempts have been made to apply different LHZ methods and to compare the results in order to find the best-suited model. This paper reviews research on landslide hazard mapping published in recent years. Advanced multivariate techniques have proved effective in the spatial prediction of landslides with a high degree of accuracy. Physical process-based models also perform well in LHZ mapping, even in areas with a poor database. Multi-criteria decision-making approaches also play a significant role in determining the relative importance of landslide causative factors in the slope instability process. Remote sensing and Geographical Information Systems (GIS) are powerful tools for assessing landslide hazards and have been used extensively in landslide research over the last decade. Aerial photographs and high-resolution satellite data are useful for detecting, mapping and monitoring landslide processes. GIS-based LHZ models help not only to map and monitor landslides but also to predict future slope failures. Advances in geospatial technologies have opened the door to detailed and accurate assessment of landslide hazards.
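Of the LHZ families listed above, the semi-quantitative multi-criteria approach is the easiest to sketch: each causative factor gets a rating and a weight, and the weighted sum places a cell in a hazard zone. The factors, weights, ratings, and zone thresholds below are illustrative assumptions, not values from any reviewed study.

```python
# Weighted-overlay LHZ sketch for a single grid cell. All numbers invented.
factor_weights = {"slope": 0.30, "lithology": 0.25, "land_cover": 0.15,
                  "distance_to_drainage": 0.15, "rainfall": 0.15}

# Ratings on a 1 (stable) to 5 (highly susceptible) scale for one cell.
cell_ratings = {"slope": 4, "lithology": 3, "land_cover": 2,
                "distance_to_drainage": 5, "rainfall": 4}

lhz_score = sum(w * cell_ratings[f] for f, w in factor_weights.items())

# Map the continuous score onto hazard-zone labels.
zones = [(1.5, "very low"), (2.5, "low"), (3.5, "moderate"), (4.5, "high")]
label = next((z for t, z in zones if lhz_score < t), "very high")
print(f"LHZ score {lhz_score:.2f} -> {label} hazard zone")
```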
Beyond eruptive scenarios: assessing tephra fallout hazard from Neapolitan volcanoes.
Sandri, Laura; Costa, Antonio; Selva, Jacopo; Tonini, Roberto; Macedonio, Giovanni; Folch, Arnau; Sulpizio, Roberto
2016-04-12
Assessment of volcanic hazards is necessary for risk mitigation. Typically, hazard assessment is based on one or a few, subjectively chosen representative eruptive scenarios, which use a specific combination of eruptive sizes and intensities to represent a particular size class of eruption. While such eruptive scenarios use a range of representative members to capture a range of eruptive sizes and intensities in order to reflect a wider size class, a scenario approach neglects to account for the intrinsic variability of volcanic eruptions, and implicitly assumes that inter-class size variability (i.e. size difference between different eruptive size classes) dominates over intra-class size variability (i.e. size difference within an eruptive size class), the latter of which is treated as negligible. So far, no quantitative study has been undertaken to verify such an assumption. Here, we adopt a novel Probabilistic Volcanic Hazard Analysis (PVHA) strategy, which accounts for intrinsic eruptive variabilities, to quantify the tephra fallout hazard in the Campania area. We compare the results of the new probabilistic approach with the classical scenario approach. The results allow for determining whether a simplified scenario approach can be considered valid, and for quantifying the bias which arises when full variability is not accounted for.
Logistic Regression in the Identification of Hazards in Construction
NASA Astrophysics Data System (ADS)
Drozd, Wojciech
2017-10-01
The construction site and its elements create circumstances that are conducive to safety risks during the execution of works. Analysis indicates the critical importance of these factors in the set of characteristics that describe the causes of accidents in the construction industry. This article attempts to analyse the characteristics related to the construction site in order to indicate their importance in defining the circumstances of accidents at work. The study includes sites inspected in 2014-2016 by the employees of the District Labour Inspectorate in Krakow (Poland). The analysed set of detailed (disaggregated) data includes both quantitative and qualitative characteristics. The substantive task focused on classification modelling for the identification of hazards in construction, identifying those of the analysed characteristics that are important in an accident. Methodologically, the data were analysed with statistical classifiers in the form of logistic regression.
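A hedged sketch of the classification modelling described above, using logistic regression on mixed quantitative and qualitative site characteristics. The feature names and records are invented; the study's actual variables come from labour-inspectorate records.

```python
# Logistic-regression classifier on toy construction-site data.
# Features and labels are invented for illustration only.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder

data = pd.DataFrame({
    "site_area_m2":    [500, 1200, 300, 2500, 800, 1500],
    "workers_on_site": [10, 45, 5, 80, 20, 60],
    "scaffolding":     ["yes", "yes", "no", "yes", "no", "yes"],
    "accident":        [0, 1, 0, 1, 0, 1],  # target: accident occurred
})

# One-hot encode the qualitative feature; pass numeric ones through.
pre = ColumnTransformer(
    [("cat", OneHotEncoder(drop="first"), ["scaffolding"])],
    remainder="passthrough")

model = Pipeline([("pre", pre), ("clf", LogisticRegression())])
model.fit(data.drop(columns="accident"), data["accident"])

# Predicted accident probabilities per site.
print(model.predict_proba(data.drop(columns="accident"))[:, 1])
```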
NASA Astrophysics Data System (ADS)
Su, Xing; Meng, Xingmin; Ye, Weilin; Wu, Weijiang; Liu, Xingrong; Wei, Wanhong
2018-03-01
Tianshui City is one of the mountainous cities threatened by severe geo-hazards in Gansu Province, China. Statistical probability models have been widely used in analyzing and evaluating geo-hazards such as landslides. In this research, three approaches (the Certainty Factor Method, the Weight of Evidence Method and the Information Quantity Method) were adopted to quantitatively analyze the relationship between the causative factors and the landslides. The source data used in this study include the SRTM DEM and local geological maps at the scale of 1:200,000. Twelve causative factors (altitude, slope, aspect, curvature, plan curvature, profile curvature, roughness, relief amplitude, distance to rivers, distance to faults, distance to roads, and stratum lithology) were selected for correlation analysis after thorough investigation of the geological conditions and historical landslides. The results indicate that the outcomes of the three models are fairly consistent.
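As an illustration of one of the three methods, the weight-of-evidence calculation for a single factor class can be written in a few lines; the cell counts below are invented, and the same W+/W- contrast would be computed for every class of every causative factor.

```python
# Weight-of-evidence sketch for one binary factor class
# (e.g. "slope 20-30 degrees"). Cell counts are invented.
import math

n_total = 100_000    # grid cells in the study area
n_landslide = 500    # cells containing mapped landslides
n_class = 20_000     # cells inside the factor class
n_class_ls = 250     # landslide cells inside the factor class

# Conditional probabilities of lying in the class given (non)landslide.
p_b_given_l = n_class_ls / n_landslide
p_b_given_nl = (n_class - n_class_ls) / (n_total - n_landslide)

w_plus = math.log(p_b_given_l / p_b_given_nl)
w_minus = math.log((1 - p_b_given_l) / (1 - p_b_given_nl))
contrast = w_plus - w_minus  # overall spatial association of the class

print(f"W+ = {w_plus:.3f}, W- = {w_minus:.3f}, contrast C = {contrast:.3f}")
```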
NASA Astrophysics Data System (ADS)
Poluyan, L. V.; Syutkina, E. V.; Guryev, E. S.
2017-11-01
A comparative analysis of the key features of the software systems TOXI+Risk and ALOHA is presented. The authors compare the domestic (TOXI+Risk) and foreign (ALOHA) software systems, both of which provide quantitative assessments of impact areas (pressure, thermal, toxic) for hypothetical emergencies at potentially hazardous facilities in the oil, gas, chemical, petrochemical and oil-processing industries. The two systems use different mathematical models to estimate the release rate of a chemically hazardous substance from a storage tank and its evaporation. A comparison of the impact areas computed by both systems on verification examples shows good convergence between the two products. The results of the analysis show that the ALOHA software can be actively used for forecasting and immediate assessment of emergency situations, and for assessing damage resulting from emergencies on the territories of municipalities.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maruthi, Y. A.; Das, N. Lakshmana; Ramprasad, S.
The present study focuses on the quantitative analysis of elements in school chalk to ensure the safety of its use. Elements such as calcium (Ca), aluminum (Al), iron (Fe), silicon (Si) and chromium (Cr) were analyzed in settled chalk dust samples collected from five classrooms (CD-1) and in a set of unused chalk samples collected from the local market (CD-2) using energy-dispersive X-ray fluorescence (ED-XRF) spectroscopy. The presence of these elements in significant concentrations confirms that school chalk is an irritant and an occupational hazard. The use of protective equipment, such as filtered masks for the mouth and nose and chalk holders, is suggested. The study also suggests using alternatives such as digital boards, marker boards and PowerPoint presentations to mitigate the occupational hazard of classroom chalk.
NASA Astrophysics Data System (ADS)
Tonini, Roberto; Barsotti, Sara; Sandri, Laura; Tumi Guðmundsson, Magnús
2015-04-01
Icelandic volcanism is largely dominated by basaltic magma. Nevertheless, the presence of glaciers over many Icelandic volcanic systems results in frequent phreatomagmatic eruptions and associated tephra production, making explosive eruptions the most common type of volcanic activity. Jökulhlaups are commonly considered the major volcanic hazard in Iceland because of their high frequency and potentially devastating local impact. Tephra fallout is also frequent and can impact larger areas; it is driven by the wind direction, which can change with both altitude and season, making it impossible to predict a priori where tephra will be deposited during the next eruption. Most of the volcanic activity in Iceland occurs in the central eastern part of the island, over 100 km to the east of the main population centre around the capital Reykjavík. The hazard from tephra fallout in Reykjavík is therefore expected to be smaller than for communities settled near the main volcanic systems. However, within the framework of quantitative hazard and risk analyses, less frequent and/or less intense phenomena should not be neglected, since their risk evaluation depends on the effects suffered by the selected target. This is particularly true if the target is highly vulnerable, such as a large urban area or important infrastructure. In this work we present a preliminary analysis aiming to perform a Probabilistic Volcanic Hazard Assessment (PVHA) for tephra fallout focused on a target area that includes the municipality of Reykjavík and Keflavík international airport. This approach inverts the more common perspective, in which the hazard analysis is focused on the source (the volcanic system), and follows a multi-source approach: the idea is to quantify, homogeneously, the hazard due to the main hazardous volcanoes that could pose a tephra fallout threat to the municipality of Reykjavík and Keflavík airport. The PVHA for each volcanic system is calculated independently, and the results from all the PVHAs can be combined at the end. This makes it possible to: 1) add the contribution of new volcanic systems, 2) compare and hierarchically rank the tephra fallout risk among the considered volcanoes and, possibly, against other kinds of risk, and 3) quantitatively assess the overall tephra fallout hazard over the target area. As a practical application, we selected a first subset consisting of the five volcanic systems posing the greatest tephra fallout hazard to the target area: those with the highest number of eruptions in the last 1100 years (Katla, Hekla, Grímsvötn) and those located closest to the target area (Reykjanes and Snæfellsjökull). The PVHA is computed using the PyBetVH tool (an improvement of the Bayesian Event Tree for Volcanic Hazard, BET_VH, model) and tephra dispersal is modelled by means of the VOL-CALPUFF numerical code. The Katla volcanic system is used as a pilot case study because its eruptive history and behaviour are well known and documented. We found that some considerations and results derived from the study of Katla may be general and applicable to the other considered volcanoes and, more generally, to other Icelandic volcanic systems. The work was financially supported by the European Science Foundation (ESF), in the framework of the Research Networking Programme MeMoVolc.
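The independent-source combination step described above reduces to the complement rule when the per-volcano exceedance probabilities refer to the same threshold, site, and time window. The probabilities below are placeholders, not results of the study.

```python
# Combining independent per-volcano PVHA results at one target site.
# Exceedance probabilities are invented for illustration.
volcano_p_exceed = {"Katla": 3e-3, "Hekla": 2e-3, "Grimsvotn": 4e-3,
                    "Reykjanes": 5e-4, "Snaefellsjokull": 1e-4}

p_none = 1.0
for p in volcano_p_exceed.values():
    p_none *= (1.0 - p)
p_combined = 1.0 - p_none

print(f"P(exceedance from any source) = {p_combined:.4e}")

# Rank the sources by their individual contribution.
for name, p in sorted(volcano_p_exceed.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {p:.1e}")
```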
Cantley, Linda F; Taiwo, Oyebode A; Galusha, Deron; Barbour, Russell; Slade, Martin D; Tessier-Sherman, Baylah; Cullen, Mark R
2014-01-01
This study aimed to examine the effect of an ergonomic hazard control (HC) initiative, undertaken as part of a company ergonomics standard, on worker injury risk. Using the company's ergonomic hazards database to identify jobs with and without ergonomic HC implementation and linking to individual job and injury histories, injury risk among person-jobs with HC implementation (the HC group) was compared to those without HC (NoHC group) using random coefficient models. Further analysis of the HC group was conducted to determine the effect of additional ergonomic hazards controlled on injury risk. Among 123 jobs at 17 plant locations, 347 ergonomic hazards were quantitatively identified during the study period. HC were implemented for 204 quantified ergonomic hazards in 84 jobs, impacting 10 385 persons (12 967 person-jobs). No HC were implemented for quantified ergonomic hazards in the remaining 39 jobs affecting 4155 persons (5046 person-jobs). Adjusting for age, sex, plant origin, and year to control for any temporal trend in injury risk, the relative risk (RR) for musculoskeletal disorder (MSD) was 0.85 and the RR for any injury or MSD was 0.92 in the HC compared to NoHC group. Among the HC group, each ergonomic hazard controlled was associated with risk reduction for MSD and acute injury outcomes (RR 0.93). Systematic ergonomic HC through participatory ergonomics, as part of a mandatory company ergonomics standard, is associated with MSD and injury risk reduction among workers in jobs with HC implemented.
Wu, Z J; Xu, B; Jiang, H; Zheng, M; Zhang, M; Zhao, W J; Cheng, J
2016-08-20
Objective: To investigate the application of the United States Environmental Protection Agency (EPA) inhalation risk assessment model, the Singapore semi-quantitative risk assessment model, and the occupational hazards risk assessment index method to occupational health risk in enterprises using dimethylformamide (DMF) in an area of Jiangsu, China, and to put forward related risk control measures. Methods: Industries involving DMF exposure in Jiangsu Province were chosen as the evaluation objects in 2013 and three risk assessment models were applied: the EPA inhalation risk assessment model, HQ = EC/RfC; the Singapore semi-quantitative risk assessment model, Risk = (HR × ER)^(1/2); and the occupational hazards risk assessment index, Index = 2^(health effect level) × 2^(exposure ratio) × operation condition level. Results: The hazard quotients (HQ > 1) from the EPA inhalation risk assessment model suggested that all the workshops (dry method, wet method and printing) and work positions (pasting, burdening, unreeling, rolling, assisting) were high risk. The Singapore semi-quantitative risk assessment model indicated that the risk levels of the dry method, wet method and printing workshops were 3.5 (high), 3.5 (high) and 2.8 (general), and the position risk levels of pasting, burdening, unreeling, rolling and assisting were 4 (high), 4 (high), 2.8 (general), 2.8 (general) and 2.8 (general). The occupational hazards risk assessment index method gave position risk indices for pasting, burdening, unreeling, rolling and assisting of 42 (high), 33 (high), 23 (middle), 21 (middle) and 22 (middle). The results of the Singapore semi-quantitative risk assessment model and the occupational hazards risk assessment index method were similar, while the EPA inhalation risk assessment model indicated that all workshops and positions were high risk. Conclusion: The occupational hazards risk assessment index method fully considers health effects, exposure, and operating conditions, and can comprehensively and accurately evaluate the occupational health risk caused by DMF.
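The three formulas as reconstructed above, coded with illustrative inputs for a single DMF-exposed work position; the numeric levels and concentrations are assumptions, not the study's measurements.

```python
# The three risk models side by side, with invented inputs.
import math

# 1) EPA inhalation model: hazard quotient (HQ > 1 flags high risk).
EC, RfC = 0.45, 0.03  # exposure and reference concentrations, mg/m3
HQ = EC / RfC

# 2) Singapore semi-quantitative model: geometric mean of hazard (HR)
#    and exposure (ER) ratings, each on an ordinal scale.
HR, ER = 4, 3
risk_sg = math.sqrt(HR * ER)

# 3) Occupational-hazards risk assessment index, as reconstructed above.
health_effect_level, exposure_ratio, operation_condition = 3, 2, 1.5
risk_index = (2 ** health_effect_level) * (2 ** exposure_ratio) * operation_condition

print(f"HQ = {HQ:.1f}, Singapore risk = {risk_sg:.1f}, index = {risk_index:.0f}")
```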
Threat Assessment of Hazardous Materials Transportation in Aircraft Cargo Compartments.
DOT National Transportation Integrated Search
1999-12-01
The Volpe National Transportation Systems Center of the U.S. Department of Transportation's (DOT's) Research and Special Programs Administration (RSPA) has conducted a quantitative threat assessment for RSPA's Office of Hazardous Materials Safety (OH...
NASA Astrophysics Data System (ADS)
Coca-Domínguez, Oswaldo; Ricaurte-Villota, Constanza; Morales-Giraldo, David; Rangel-Buitrago, Nelson
2014-05-01
Analysis of the hazard and vulnerability associated with coastal erosion along coastlines is a first step towards establishing plans for adaptation to climate change in coastal areas. La Barra Town, Buenaventura (Pacific coast of Colombia) and the Providence - Santa Catalina Islands (Colombian Caribbean) were selected for a detailed analysis of coastal erosion hazard and vulnerability from different perspectives: i) physical (hazard), ii) social, iii) conservation and iv) cultural heritage (Raizal). The analysis was made with a semi-quantitative approximation method, applying variables associated with the intrinsic properties of the coastal zone (i.e., type of beach, exposure of the coast to waves, etc.). Coastal erosion data and associated variables, as well as land use, conservation and heritage data, were used to carry out a further detailed analysis of human and structural vulnerability and exposure to hazards. The data show erosion rates close to -17 m yr-1 in La Barra Town (highlighting its critical condition and the urgency of relocation), while in some sectors of Providence Island, such as Old Town, the erosion rate was -5 m yr-1. The observed erosion process directly affects land use and the local and regional economy. The differences between the indices, the structural and physical vulnerability, and the use of the methodological variables are presented in the context of each region. All the information was managed in a GIS environment, which allows it to be edited and updated continuously. The application of this methodology generates useful information for promoting risk management as well as prevention, mitigation and reduction plans. In both areas adaptation must be a priority strategy, including relocation alternatives and sustainable protection, supported by studies of coastal uses and future outlooks. The methodology is framed within the use of GIS tools and highlights their benefits in the analysis of information.
Burgoon, Lyle D; Druwe, Ingrid L; Painter, Kyle; Yost, Erin E
2017-02-01
Today there are more than 80,000 chemicals in commerce and the environment. The potential human health risks are unknown for the vast majority of these chemicals as they lack human health risk assessments, toxicity reference values, and risk screening values. We aim to use computational toxicology and quantitative high-throughput screening (qHTS) technologies to fill these data gaps, and begin to prioritize these chemicals for additional assessment. In this pilot, we demonstrate how we were able to identify that benzo[k]fluoranthene may induce DNA damage and steatosis using qHTS data and two separate adverse outcome pathways (AOPs). We also demonstrate how bootstrap natural spline-based meta-regression can be used to integrate data across multiple assay replicates to generate a concentration-response curve. We used this analysis to calculate an in vitro point of departure of 0.751 μM and risk-specific in vitro concentrations of 0.29 μM and 0.28 μM for 1:1,000 and 1:10,000 risk, respectively, for DNA damage. Based on the available evidence, and considering that only a single HSD17B4 assay is available, we have low overall confidence in the steatosis hazard identification. This case study suggests that coupling qHTS assays with AOPs and ontologies will facilitate hazard identification. Combining this with quantitative evidence integration methods, such as bootstrap meta-regression, may allow risk assessors to identify points of departure and risk-specific internal/in vitro concentrations. These results are sufficient to prioritize the chemicals; however, in the longer term we will need to estimate external doses for risk screening purposes, such as through margin of exposure methods.
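A hedged sketch of the bootstrap concentration-response step follows, using simple linear interpolation as a stand-in for the paper's natural-spline meta-regression: replicate curves are resampled, each bootstrap mean curve is inverted at a benchmark response, and the resulting point-of-departure distribution is summarized. The data are simulated, not the paper's qHTS measurements.

```python
# Bootstrap point-of-departure sketch (linear interpolation stands in
# for natural-spline meta-regression). Data are simulated.
import numpy as np

rng = np.random.default_rng(0)
conc = np.array([0.01, 0.1, 0.3, 1.0, 3.0, 10.0])  # uM
# Three simulated assay replicates (fraction of maximal response).
replicates = np.array([
    [0.02, 0.05, 0.15, 0.45, 0.80, 0.95],
    [0.00, 0.08, 0.20, 0.50, 0.75, 0.90],
    [0.03, 0.04, 0.10, 0.40, 0.85, 0.97],
])
benchmark = 0.10  # benchmark response defining the point of departure

pods = []
for _ in range(2000):
    # Resample replicates with replacement and average them.
    sample = replicates[rng.integers(0, len(replicates), len(replicates))]
    mean_curve = sample.mean(axis=0)
    # Invert the (monotone) mean curve at the benchmark, on a log scale.
    pods.append(np.interp(benchmark, mean_curve, np.log10(conc)))

pods = 10 ** np.array(pods)
print(f"POD median = {np.median(pods):.3f} uM, "
      f"90% CI = ({np.percentile(pods, 5):.3f}, "
      f"{np.percentile(pods, 95):.3f}) uM")
```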
NASA Astrophysics Data System (ADS)
Hetényi, G.; Diehl, T.; Singer, J.; Kissling, E. H.; Clinton, J. F.; Wiemer, S.
2015-12-01
The Eastern Himalayas are home to a seemingly complex seismo-tectonic evolution. The rate of instrumental seismicity is lower than the average along the orogen and there is no record of large historical events, yet both paleoseismology and GPS studies point to potentially large (M>8) earthquakes. Due to the lack of a permanent seismic monitoring system in the area, our current level of understanding is insufficient to create a reliable quantitative seismic hazard model for the region. Existing maps are based on questionable hypotheses and show major inconsistencies when compared to each other. Here we present results on national and regional scales from a 38-station broadband seismological network we operated for almost 2 years in the Kingdom of Bhutan. A thorough, state-of-the-art analysis of local and regional earthquakes builds a comprehensive catalogue that reveals significantly (2 to 3 orders of magnitude) more events than detected from global networks. The seismotectonic analysis reveals new patterns of seismic activity as well as striking differences over relatively short distances within the Himalayas, only partly explained by surface observations such as geology. We compare a priori and a posteriori (BMC) magnitude-of-completeness maps and show that our network was able to detect all felt events during its operation; some of these events could be felt at surprisingly large distances. Based on our experiment and experience, we draft the pillars on which a permanent seismological observatory for Bhutan could be constructed. Such a continuous monitoring system of seismic activity could then lead to a reliable quantitative seismic hazard model for Bhutan and the surrounding regions, and serve as a basis for improving building codes and general preparedness.
NON-TRADITIONAL RESPONSES TO PHARMACEUTICALS IN AQUATIC ECOSYSTEMS
Quantitation of human and veterinary pharmaceuticals in environmental matrices has resulted in pharmaceuticals in the environment receiving unprecedented attention from the scientific community. Aquatic hazard assessments often use quantitative structure activity relationships an...
Total Risk Integrated Methodology (TRIM) - TRIM.Risk
TRIM.Risk is used to integrate the information on exposure received from TRIM.FaTE or TRIM.Expo with that on dose-response or hazard assessment and to provide quantitative descriptions of risk or hazard and some of the attendant uncertainties.
GENETIC ACTIVITY PROFILES AND HAZARD ASSESSMENT
A methodology has been developed to display and evaluate multiple test quantitative information on genetic toxicants for purposes of hazard/risk assessment. Dose information is collected from the open literature: either the lowest effective dose (LED) or the highest ineffective do...
Comparative analysis of hazardous household waste in two Mexican regions.
Delgado, Otoniel Buenrostro; Ojeda-Benítez, Sara; Márquez-Benavides, Liliana
2007-01-01
Household hazardous waste (HHW) generation in two Mexican regions was examined: a northern region (bordering the USA) and a central region. The aim of this work was to determine the dynamics of solid waste generation and to compare the results of both regions regarding consumption patterns and solid waste generation rates. In the northern region, household solid waste was analysed quantitatively; for this analysis, the population was categorized into three socioeconomic strata (lower, middle, upper). Waste characterization revealed the presence of products that give rise to household hazardous waste. In the northern region (Mexicali city), household hazardous waste comprised 3.7% of municipal solid waste; the largest categories in this fraction were home care products (29.2%), cleaning products (19.5%) and batteries and electronic equipment (15.7%). In the central region, HHW comprised 1.03% of municipal solid waste; the main categories in this fraction were cleaning products (39%), self-care products (27.3%), and insecticides (14.4%). In Mexicali, the socioeconomic study demonstrated that the production of HHW is independent of income level. Furthermore, the composition of the solid waste stream in both regions suggested the influence of another set of variables, such as local climate, migration patterns and marketing coverage. Further research is needed to establish the effect of low quantities of HHW on the environment and public health.
10 CFR Appendix A to Part 70 - Reportable Safety Events
Code of Federal Regulations, 2013 CFR
2013-01-01
... quantitative standards established to satisfy the requirements in § 70.61(b)(4). (4) An event or condition such... material or hazardous chemicals produced from licensed materials that exceeds the quantitative standards...
10 CFR Appendix A to Part 70 - Reportable Safety Events
Code of Federal Regulations, 2012 CFR
2012-01-01
... quantitative standards established to satisfy the requirements in § 70.61(b)(4). (4) An event or condition such... material or hazardous chemicals produced from licensed materials that exceeds the quantitative standards...
10 CFR Appendix A to Part 70 - Reportable Safety Events
Code of Federal Regulations, 2010 CFR
2010-01-01
... quantitative standards established to satisfy the requirements in § 70.61(b)(4). (4) An event or condition such... material or hazardous chemicals produced from licensed materials that exceeds the quantitative standards...
10 CFR Appendix A to Part 70 - Reportable Safety Events
Code of Federal Regulations, 2011 CFR
2011-01-01
... quantitative standards established to satisfy the requirements in § 70.61(b)(4). (4) An event or condition such... material or hazardous chemicals produced from licensed materials that exceeds the quantitative standards...
10 CFR Appendix A to Part 70 - Reportable Safety Events
Code of Federal Regulations, 2014 CFR
2014-01-01
... quantitative standards established to satisfy the requirements in § 70.61(b)(4). (4) An event or condition such... material or hazardous chemicals produced from licensed materials that exceeds the quantitative standards...
Initiating Event Analysis of a Lithium Fluoride Thorium Reactor
NASA Astrophysics Data System (ADS)
Geraci, Nicholas Charles
The primary purpose of this study is to perform an Initiating Event Analysis for a Lithium Fluoride Thorium Reactor (LFTR) as the first step of a Probabilistic Safety Assessment (PSA). The major objective of the research is to compile a list of key initiating events capable of resulting in failure of safety systems and release of radioactive material from the LFTR. Due to the complex interactions between engineering design, component reliability and human reliability, probabilistic safety assessments are most useful when the scope is limited to a single reactor plant. Thus, this thesis will study the LFTR design proposed by Flibe Energy. An October 2015 Electric Power Research Institute report on the Flibe Energy LFTR asked "what-if?" questions of subject matter experts and compiled a list of key hazards with the most significant consequences to the safety or integrity of the LFTR. The potential exists for unforeseen hazards to pose additional risk for the LFTR, but the scope of this thesis is limited to evaluation of those key hazards already identified by Flibe Energy. These key hazards are the starting point for the Initiating Event Analysis performed in this thesis. Engineering evaluation and technical study of the plant using a literature review and comparison to reference technology revealed four hazards with high potential to cause reactor core damage. To determine the initiating events resulting in realization of these four hazards, reference was made to previous PSAs and existing NRC and EPRI initiating event lists. Finally, fault tree and event tree analyses were conducted, completing the logical classification of initiating events. Results are qualitative as opposed to quantitative due to the early stages of system design descriptions and lack of operating experience or data for the LFTR. In summary, this thesis analyzes initiating events using previous research and inductive and deductive reasoning through traditional risk management techniques to arrive at a list of key initiating events that can be used to address vulnerabilities during the design phases of LFTR development.
Robboy, Caroline Alex
2002-01-01
This article explores the hazards faced by social constructionists who attempt to conduct quantitative research on sexual orientation development. By critically reviewing two quantitative research studies, this article explores the ways in which the very nature of social constructionist arguments may be incongruous with the methodological requirements of quantitative studies. I suggest this conflict is a result of the differing natures of these two modes of scholarly inquiry. While research requires the acceptance of certain analytical categories, the strength of social constructionism comes from its reflexive scrutiny and problematization of those very categories. Ultimately, social constructionists who try to apply their theories/perspectives must necessarily conform to the methodological constraints of quantitative research. The intent of this article is not to suggest that it is futile or self-contradictory for social constructionists to attempt empirical research, but that these are two distinct modes of scholarly inquiry which can, and should, co-exist in a dialectical relationship to each other.
Seismic hazard assessment and pattern recognition of earthquake prone areas in the Po Plain (Italy)
NASA Astrophysics Data System (ADS)
Gorshkov, Alexander; Peresan, Antonella; Soloviev, Alexander; Panza, Giuliano F.
2014-05-01
A systematic and quantitative assessment, capable of providing first-order consistent information about the sites where large earthquakes may occur, is crucial for knowledgeable seismic hazard evaluation. The methodology for the pattern recognition of areas prone to large earthquakes is based on the morphostructural zoning method (MSZ), which employs topographic data and present-day tectonic structures for the mapping of earthquake-controlling structures (i.e., the nodes formed around lineament intersections) and does not require knowledge of past seismicity. The nodes are assumed to be characterized by a uniform set of topographic, geologic, and geophysical parameters; on the basis of such parameters, the pattern recognition algorithm defines a classification rule to discriminate seismogenic from non-seismogenic nodes. This methodology has been successfully applied since the early 1970s in a number of regions worldwide, including California, where it permitted the identification of areas that were subsequently struck by strong events and had not previously been considered prone to strong earthquakes. Recent studies on the Iberian Peninsula and the Rhone Valley have demonstrated the applicability of MSZ to basins with relatively flat topography. In this study, the analysis is applied to the Po Plain (Northern Italy), an area characterized by flat topography, to allow for the systematic identification of the nodes prone to earthquakes with magnitude greater than or equal to M=5.0. The MSZ method differs from standard morphostructural analysis, in which the term "lineament" is used to define the complex of alignments detectable on topographic maps or satellite images; under that definition the lineament is locally defined and its existence does not depend on the surrounding areas. In MSZ, the primary element is the block, a relatively homogeneous area, while the lineament is a secondary element of the morphostructure. The identified earthquake-prone areas provide first-order systematic information that may significantly contribute to seismic hazard assessment in the Italian territory. The information about the possible location of strong earthquakes provided by the morphostructural analysis can, in fact, be naturally incorporated in the neo-deterministic procedure for seismic hazard assessment (NDSHA), so as to fill in possible gaps in known seismicity. Moreover, the spatial information about earthquake-prone areas can be fruitfully combined with the space-time information provided by the quantitative analysis of the seismic flow, so as to identify the priority areas (with linear dimensions of a few tens of kilometers), where the probability of a strong earthquake is relatively high, for detailed local-scale studies. The new indications about seismogenic potential obtained from this study, although less accurate than detailed fault studies, have the advantage of being independent of past seismicity information, since they rely on the systematic and quantitative analysis of the available geological and morphostructural data. Thus, this analysis appears particularly useful in areas where historical information is scarce; special attention should be paid to seismogenic nodes that are not related to known active faults or past earthquakes.
Stock, Greg M.; Luco, Nicolas; Collins, Brian D.; Harp, Edwin L.; Reichenbach, Paola; Frankel, Kurt L.
2012-01-01
caused injuries within developed regions located on or adjacent to talus slopes, highlighting the need for additional investigations into rock-fall hazard and risk. This assessment builds upon previous investigations of rock fall hazard and risk in Yosemite Valley (Wieczorek et al., 1998, 1999; Guzzetti et al., 2003; Wieczorek et al., 2008), and focuses on hazard and risk to structures posed by relatively frequent fragmental-type rock falls (Evans and Hungr, 1999), up to approximately 100,000 m3 in volume.
[Clinical research XXIII. From clinical judgment to meta-analyses].
Rivas-Ruiz, Rodolfo; Castelán-Martínez, Osvaldo D; Pérez-Rodríguez, Marcela; Palacios-Cruz, Lino; Noyola-Castillo, Maura E; Talavera, Juan O
2014-01-01
Systematic reviews (SR) are studies designed to answer clinical questions on the basis of original articles. A meta-analysis (MTA) is the mathematical analysis of an SR. These analyses fall into two groups: those which evaluate quantitative variables (for example, body mass index, BMI) and those which evaluate qualitative variables (for example, whether a patient is alive or dead, or cured or not). Quantitative variables are generally analysed with mean differences, while qualitative variables can be analysed with several measures: the odds ratio (OR), relative risk (RR), absolute risk reduction (ARR) and hazard ratio (HR). These analyses are represented through forest plots, which allow the evaluation of each individual study as well as the heterogeneity between studies and the overall effect of the intervention. The underlying analyses are mainly based on Student's t test and chi-squared. To make appropriate decisions based on an MTA, it is important to understand the characteristics of the statistical methods in order to avoid misinterpretations.
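The qualitative-outcome measures listed above all derive from a 2x2 table of events by treatment arm. A minimal sketch follows, with invented counts; the number needed to treat (NNT) is added for context as the reciprocal of the ARR.

```python
# Effect measures from a hypothetical 2x2 trial table.
a, b = 30, 70   # treated: events, non-events
c, d = 50, 50   # control: events, non-events

risk_t, risk_c = a / (a + b), c / (c + d)

RR = risk_t / risk_c        # relative risk
OR = (a * d) / (b * c)      # odds ratio
ARR = risk_c - risk_t       # absolute risk reduction
NNT = 1 / ARR               # number needed to treat (derived from ARR)

print(f"RR = {RR:.2f}, OR = {OR:.2f}, ARR = {ARR:.2f}, NNT = {NNT:.1f}")
```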
Risk analysis for roadways subjected to multiple landslide-related hazards
NASA Astrophysics Data System (ADS)
Corominas, Jordi; Mavrouli, Olga
2014-05-01
Roadways through mountainous terrain often involve cuts and landslide areas whose stability is precarious and which require protection and stabilization works. To optimize the allocation of resources, government and technical offices are increasingly interested in risk analysis and assessment. Risk analysis has to consider both the hazard occurrence and the consequences. The consequences can be both direct and indirect: the former include the costs of repairing the roadway, damage to vehicles and potential fatalities, while the latter refer to the costs related to the diversion of vehicles, the excess distance travelled, time differences, and tolls. The types of slope instability that may affect a roadway vary, and so do their effects. Most current approaches either consider a single hazardous phenomenon at a time or, if applied at a small (for example, national) scale, do not take into account local conditions at each section of the roadway. The objective of this work is the development of a simple and comprehensive methodology for assessing the risk due to multiple hazards along roadways, integrating different landslide types, including rockfalls and debris flows, and considering as well the potential failure of retaining walls. To quantify risk, all hazards are expressed in a common term: their probability of occurrence. The methodology takes into consideration the specific local conditions along the roadway. For rockfalls and debris flows, a variety of methods for assessing the probability of occurrence exist. To assess the annual probability of failure of retaining walls, we use an indicator-based model that provides a hazard index. The model parameters consist of the design safety factor and further anchorage design and construction parameters. The probability of failure is evaluated as a function of the hazard index and then corrected (in terms of order of magnitude) according to in situ observations of increases in two dynamic factors: the service load and the wall deformation. The consequences are then calculated for each hazard type according to its characteristics (mechanism, magnitude, frequency). This method differs from other methodologies for landslide-related hazards in the hazard scenarios and consequence profiles that are investigated. The depth of analysis makes it possible to account for local conditions concerning either the hazard or the consequences (the latter with respect to the particular characteristics of the roadway, such as traffic, number of lanes, velocity…). Furthermore, it provides an extensive list of quantitative risk descriptors, including both individual and collective ones. The methodology was automated using Microsoft Excel spreadsheets. The results can be used to support decision-making in the planning of protection measures. Gaps in knowledge and restrictions are discussed as well.
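One way to read the "common term" aggregation described above is sketched below: each hazard on a road section contributes probability times consequence to an expected annual loss, and the complement rule gives the probability of at least one event per year. The hazard list and all numbers are illustrative assumptions, not the paper's descriptors.

```python
# Multi-hazard risk descriptors for one road section. Numbers invented.
hazards = {
    # hazard: (annual probability, direct cost, indirect cost)
    "rockfall":               (0.20, 15_000, 4_000),
    "debris flow":            (0.05, 120_000, 30_000),
    "retaining-wall failure": (0.01, 300_000, 80_000),
}

expected_annual_loss = sum(p * (cd + ci) for p, cd, ci in hazards.values())

p_any = 1.0
for p, _, _ in hazards.values():
    p_any *= (1.0 - p)
p_any = 1.0 - p_any  # probability of at least one hazard event per year

print(f"P(at least one hazard event per year) = {p_any:.3f}")
print(f"Expected annual loss = {expected_annual_loss:,.0f} (currency units)")
```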
NASA Astrophysics Data System (ADS)
Sturm, Michael; Gems, Bernhard; Fuchs, Sven; Mazzorana, Bruno; Papathoma-Köhle, Maria; Aufleger, Markus
2016-04-01
In European mountain regions, losses due to torrential hazards are still considerably high, despite the ongoing debate on an overall increasing or decreasing trend. Recent events in Austria clearly revealed that, for technical and economic reasons, complete protection of settlements in the alpine environment against torrential hazards is not feasible. On the hazard-process side, events with unpredictable intensities may represent overload scenarios for existing protection structures in the torrent catchments and bear a particular risk of significant losses in the living space. Although the importance of vulnerability is widely recognised, there is still a research gap concerning its assessment. Currently, potential losses at buildings due to torrential hazards, and their comparison with reinstatement costs, are determined using empirical functions: relations between process intensities and the extent of losses, gathered by analysing historic hazard events together with object-specific restoration values. This approach is neither physics-based nor integral, since relevant and often crucial processes, such as the intrusion of the fluid-sediment mixture into elements at risk, are not considered. Based on these findings, our work aims to extend the findings and models of present risk research towards an integral, more physics-based vulnerability analysis concept. Fluviatile torrential hazard processes and their impacts on the building envelope are modelled experimentally, with material intrusion processes explicitly considered. Dynamic impacts are recorded quantitatively and spatially distributed using a large set of force transducers. The experimental tests are carried out with artificial, vertical and skewed plates, including openings for material intrusion. Further, the impacts on specific buildings within the test site of this work, the fan apex of the Schnannerbach torrent in Tyrol (Austria), are analysed in detail. Several buildings are entirely reconstructed within the physical scale model at a scale of 1:30, including the basement, the first floor and all relevant openings in the building envelopes. The results from the experimental modelling represent the data basis for further physics-based vulnerability analysis. Hence, the applied vulnerability analysis concept significantly extends the methods presently used in flood risk assessment. The results of the study are of basic importance for practical application, as they provide extensive information to support hazard zone mapping and management, as well as the planning of local technical protection measures.
SSHAC Level 1 Probabilistic Seismic Hazard Analysis for the Idaho National Laboratory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Payne, Suzette; Coppersmith, Ryan; Coppersmith, Kevin
A Probabilistic Seismic Hazard Analysis (PSHA) was completed for the Materials and Fuels Complex (MFC), Naval Reactors Facility (NRF), and the Advanced Test Reactor (ATR) at Idaho National Laboratory (INL). The PSHA followed the approaches and procedures appropriate for a Study Level 1 provided in the guidance advanced by the Senior Seismic Hazard Analysis Committee (SSHAC) in U.S. Nuclear Regulatory Commission (NRC) NUREG/CR-6372 and NUREG-2117 (NRC, 1997; 2012a). The SSHAC Level 1 PSHAs for MFC and ATR were conducted as part of the Seismic Risk Assessment (SRA) project (INL Project number 31287) to develop and apply a new risk-informed methodology. The SSHAC Level 1 PSHA was conducted for NRF to provide guidance on the potential use of a design margin above rock hazard levels. The SRA project is developing a new risk-informed methodology that will provide a systematic approach for evaluating the need for an update of an existing PSHA. The new methodology proposes criteria to be employed at specific analysis, decision, or comparison points in its evaluation process. The first four of seven criteria address changes in the inputs and results of the PSHA and are given in U.S. Department of Energy (DOE) Standard DOE-STD-1020-2012 (DOE, 2012a) and American National Standards Institute/American Nuclear Society (ANSI/ANS) 2.29 (ANS, 2008a). The last three criteria address the evaluation of quantitative hazard- and risk-focused information for an existing nuclear facility. The seven criteria and decision points are applied to Seismic Design Categories (SDC) 3, 4, and 5, which are defined in American Society of Civil Engineers/Structural Engineers Institute (ASCE/SEI) 43-05 (ASCE, 2005). The application of the criteria and decision points could lead to an update or could determine that such an update is not necessary.
Aiding alternatives assessment with an uncertainty-tolerant hazard scoring method.
Faludi, Jeremy; Hoang, Tina; Gorman, Patrick; Mulvihill, Martin
2016-11-01
This research developed a single-score system to simplify and clarify decision-making in chemical alternatives assessment, accounting for uncertainty. Today, assessing alternatives to hazardous constituent chemicals is a difficult task: rather than comparing alternatives by a single definitive score, many independent toxicological variables must be considered at once, and data gaps are rampant. Thus, most hazard assessments are comprehensible only to toxicologists, but business leaders and politicians need simple scores to make decisions. In addition, they must balance hazard against other considerations, such as product functionality, and they must be aware of the high degree of uncertainty in chemical hazard data. This research proposes a transparent, reproducible method to translate eighteen hazard endpoints into a simple numeric score with quantified uncertainty, alongside a similar product functionality score, to aid decisions between alternative products. The scoring method uses Clean Production Action's GreenScreen as a guide, but with a different method of score aggregation. It provides finer differentiation between scores than GreenScreen's four-point scale, and it displays uncertainty quantitatively in the final score. Displaying uncertainty also illustrates which alternatives are early in product development and which are well-defined commercial products. This paper tested the proposed assessment method through a case study in the building industry, assessing alternatives to spray polyurethane foam insulation containing methylene diphenyl diisocyanate (MDI). The new hazard scoring method successfully identified trade-offs between different alternatives, showing finer resolution than GreenScreen Benchmarking. Sensitivity analysis showed that different weighting schemes in the hazard scores had almost no effect on the ranking of alternatives compared to the uncertainty from data gaps.
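A hedged sketch of a single-score aggregation with quantified uncertainty follows: each endpoint carries a score, data gaps are drawn uniformly over the possible range in a Monte Carlo loop, and the spread of the totals quantifies the uncertainty. The endpoint values, equal weighting, and uniform gap-filling are illustrative assumptions, not the paper's scheme.

```python
# Monte Carlo aggregation of 18 endpoint scores with data gaps.
# Scores, weights, and the gap-filling distribution are invented.
import numpy as np

rng = np.random.default_rng(1)

# 18 endpoint scores in [0, 1]; None marks a data gap.
endpoint_scores = [0.8, 0.2, None, 0.5, 0.9, None, 0.1, 0.4, 0.6,
                   0.3, None, 0.7, 0.2, 0.5, None, 0.9, 0.1, 0.3]

n_mc = 5000
totals = np.empty(n_mc)
for i in range(n_mc):
    filled = [s if s is not None else rng.uniform(0, 1)
              for s in endpoint_scores]
    totals[i] = np.mean(filled)  # equal weights for the sketch

print(f"hazard score = {totals.mean():.2f} "
      f"+/- {totals.std():.2f} (1 sigma, driven by data gaps)")
```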
Chemical incidents resulted in hazardous substances releases in the context of human health hazards.
Pałaszewska-Tkacz, Anna; Czerczak, Sławomir; Konieczko, Katarzyna
2017-02-21
The purpose of this research was to analyze data on chemical incidents in Poland collected in 1999-2009 in terms of health hazards. The data were obtained, using a multimodal information technology (IT) system, from chemical incident reports prepared by rescuers at the scene. The final analysis covered sudden events associated with the uncontrolled release of hazardous chemical substances or mixtures that may potentially lead to human exposure; releases of unidentified substances where emergency services took action to protect human health or the environment were also included. The number of chemical incidents analyzed for 1999-2009 was 2930, with more than 200 different substances released. The substances were classified into 13 groups of substances and mixtures posing analogous risks. The most common releases involved non-flammable corrosive liquids, including hydrochloric acid (199 cases), sulfuric(VI) acid (131 cases), sodium and potassium hydroxides (69 cases), ammonia solution (52 cases) and butyric acid (32 cases). The next group comprised gases hazardous only due to their physico-chemical properties, including extremely flammable propane-butane (249 cases) and methane (79 cases). There was no statistically significant trend in the total number of incidents; only for incidents involving flammable corrosive, toxic and/or harmful liquids did the regression analysis reveal a statistically significant downward trend. The number of victims reported was 1997, including 1092 children and 18 fatalities. The number of people injured, the number of incidents, Poland's high 9th place in terms of the number of Seveso establishments, and the four-times-higher number of hazardous industrial establishments not covered by the Seveso Directive justify the need for systematic analysis of hazards and their proper identification. It is advisable to enhance health risk assessment, both qualitative and quantitative, by slightly modifying the data collection system so as to enable the determination of released chemical concentrations and exposed populations. Int J Occup Med Environ Health 2017;30(1):95-110.
Traditionally, human health risk assessments have relied on qualitative approaches for hazard identification, often using the Hill criteria and weight of evidence determinations to integrate data from multiple studies. Recently, the National Research Council has recommended the ...
Tsunami Catalogues for the Eastern Mediterranean - Revisited.
NASA Astrophysics Data System (ADS)
Ambraseys, N.; Synolakis, C. E.
2008-12-01
We critically examine catalogues of tsunamis in the Eastern Mediterranean published in the last decade, by reference to the original sources; see Ambraseys (2008). Such catalogues have been widely used in the aftermath of the 2004 Boxing Day tsunami for probabilistic hazard analysis, even to make projections for a ten-year time frame. On occasion, such predictions have caused panic and have reduced the credibility of the scientific community in making hazard assessments. We correct classification and other spurious errors in earlier catalogues and posit a new list. We conclude that for some historic events, any assignment of magnitude, even on a six-point intensity scale, is inappropriate due to lack of information. Further, we assert that any tsunami catalogue, including ours, can only be used in conjunction with sedimentologic evidence to quantitatively infer the return period of larger events. Statistical analyses correlating numbers of tsunami events derived solely from catalogues with their inferred or imagined intensities are meaningless, at least when focusing on specific locales where only a handful of tsunamis are known to have been historically reported. Quantitative hazard assessments based on scenario events of historic tsunamis for which, at best, only the size and approximate location of the parent earthquake is known should be undertaken with extreme caution and only with the benefit of geologic studies to enhance the understanding of the local tectonics. Ambraseys N. (2008) Earthquakes in the Eastern Mediterranean and the Middle East: multidisciplinary study of 2000 years of seismicity, Cambridge Univ. Press, Cambridge (ISBN 9780521872928).
Social transformation in transdisciplinary natural hazard management
NASA Astrophysics Data System (ADS)
Attems, Marie-Sophie; Fuchs, Sven; Thaler, Thomas
2017-04-01
Due to annual increases in natural hazard losses, there is a discussion among authorities and communities in Europe on innovative solutions to increase resilience, and consequently business-as-usual risk management practices are often questioned. The current situation therefore calls for a societal transformation to respond adequately and effectively to new global dynamics. An emerging concept is the implementation of multiple-use mitigation systems against hazards such as floods, avalanches and landslides. A key aspect is the involvement of knowledge outside academic research: transdisciplinary knowledge can be used to identify the vital factors needed to upscale the implementation of multiple-use mitigation measures. The method used in this contribution is an explorative scenario analysis applied in Austria, which processes the knowledge gained in transdisciplinary workshops. The scenario analysis combines qualitative data with quantitative relations in order to generate a set of plausible future outcomes. The goal is to establish a small set of consistent scenarios that are efficient, representative, and significantly different from each other. The discussions among relevant stakeholders within the workshops and a subsequent quantitative analysis showed that the vital variables influencing the multiple use of mitigation measures are (1) current legislation, (2) risk acceptance among authorities and the public, (3) land-use pressure, (4) the demand for innovative solutions, (5) the available technical standards and possibilities and (6) policy entrepreneurship. Four different scenarios were the final result of the analysis. In conclusion, for multiple-use mitigation systems to become possible, contemporary risk management strategies will have to change: legislation, and thereby current barriers, must be altered to create room for innovative solutions. If the state of the art in engineering allows constructions with limited additional risk, multiple-use structures are an option in risk management. Present and future land-use pressure also intensifies the economic interest in finding and accepting such measures.
Hazard Screening Methods for Nanomaterials: A Comparative Study
Murphy, Finbarr; Mullins, Martin; Furxhi, Irini; Costa, Anna L.; Simeone, Felice C.
2018-01-01
Hazard identification is the key step in risk assessment and management of manufactured nanomaterials (NM). However, the rapid commercialisation of nano-enabled products continues to outpace the development of a prudent risk management mechanism that is widely accepted by the scientific community and enforced by regulators. Nevertheless, a growing body of academic literature is developing promising quantitative methods, and two approaches have gained significant currency. Bayesian networks (BN) are a probabilistic, machine-learning approach, while the weight-of-evidence (WoE) statistical framework is based on expert elicitation. This comparative study investigates the efficacy of quantitative WoE and Bayesian methodologies in ranking the potential hazard of metal and metal-oxide NMs (TiO2, Ag, and ZnO). This research finds that hazard ranking is consistent across both risk assessment approaches. The BN and WoE models both utilize physico-chemical, toxicological, and study-type data to infer the hazard potential. The BN exhibits more stability when the models are perturbed with new data. The BN has the significant advantage of self-learning with new data; however, this assumes all input data are equally valid. This research finds that a combination of WoE, which would rank input data, along with the BN is the optimal hazard assessment framework. PMID:29495342
Prognostic value of long noncoding RNA MALAT1 in digestive system malignancies.
Zhai, Hui; Li, Xiao-Mei; Maimaiti, Ailifeire; Chen, Qing-Jie; Liao, Wu; Lai, Hong-Mei; Liu, Fen; Yang, Yi-Ning
2015-01-01
MALAT1, a newly discovered long noncoding RNA (lncRNA), has been reported to be highly expressed in many types of cancers. This meta-analysis summarizes its potential prognostic value in digestive system malignancies. A quantitative meta-analysis was performed through a systematic search in PubMed, Cochrane Library, Web of Science and Chinese National Knowledge Infrastructure (CNKI) for eligible papers on the prognostic impact of MALAT1 in digestive system malignancies from inception to Apr. 25, 2015. Pooled hazard ratios (HRs) with 95% confidence intervals (95% CIs) were calculated to summarize the effect. Five studies were included, with a total of 527 patients. A significant association was observed between MALAT1 abundance and poor overall survival (OS) of patients with digestive system malignancies, with a pooled HR of 7.68 (95% CI: 4.32-13.66, P<0.001). Meta sensitivity analysis suggested the reliability of our findings. No publication bias was observed. MALAT1 abundance may serve as a novel predictive factor for poor prognosis in patients with digestive system malignancies.
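The pooling step in such a meta-analysis is standard inverse-variance aggregation on the log hazard-ratio scale; a minimal sketch follows. The three study-level HRs and confidence intervals below are invented for illustration and are not the five studies actually analyzed.

```python
import numpy as np

# Hypothetical study-level hazard ratios with 95% CIs (hr, lower, upper).
# The arithmetic is the standard fixed-effect inverse-variance method
# applied to log hazard ratios; the SE is recovered from the CI width.
studies = [(6.5, 2.9, 14.6), (8.1, 3.5, 18.7), (7.0, 2.6, 18.9)]

log_hr = np.array([np.log(hr) for hr, lo, hi in studies])
se = np.array([(np.log(hi) - np.log(lo)) / (2 * 1.96) for hr, lo, hi in studies])
w = 1.0 / se**2  # inverse-variance weights

pooled = np.sum(w * log_hr) / np.sum(w)
pooled_se = np.sqrt(1.0 / np.sum(w))
lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
print(f"pooled HR = {np.exp(pooled):.2f} "
      f"(95% CI {np.exp(lo):.2f}-{np.exp(hi):.2f})")
```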
Grinchuk, Oleg V; Yenamandra, Surya P; Iyer, Ramakrishnan; Singh, Malay; Lee, Hwee Kuan; Lim, Kiat Hon; Chow, Pierce Kah-Hoe; Kuznetsov, Vladamir A
2018-01-01
Currently, molecular markers are not used when determining the prognosis and treatment strategy for patients with hepatocellular carcinoma (HCC). In the present study, we proposed that the identification of common pro-oncogenic pathways in primary tumors (PT) and adjacent non-malignant tissues (AT), typically used to predict HCC patient risks, may result in HCC biomarker discovery. We examined the genome-wide mRNA expression profiles of paired PT and AT samples from 321 HCC patients. The workflow integrated differentially expressed gene selection, gene ontology enrichment, computational classification, survival prediction, image analysis and experimental validation methods. We developed a 24-ribosomal-gene-based HCC classifier (RGC), which is prognostically significant in both PT and AT. Overexpression of the RGC genes in PT was associated with a poor prognosis in the training (hazard ratio = 8.2, P = 9.4 × 10^-6) and cross-cohort validation (hazard ratio = 2.63, P = 0.004) datasets. The multivariate survival analysis demonstrated the significant and independent prognostic value of the RGC. The RGC displayed a significant prognostic value in AT of the training (hazard ratio = 5.0, P = 0.03) and cross-validation (hazard ratio = 1.9, P = 0.03) HCC groups, confirming the accuracy and robustness of the RGC. Our experimental and bioinformatics analyses suggested a key role for c-MYC in the pro-oncogenic pattern of ribosomal biogenesis co-regulation in PT and AT. Microarray, quantitative RT-PCR and quantitative immunohistochemical studies of the PT showed that DKK1 in PT is a promising biomarker for poor HCC outcomes. The common co-transcriptional pattern of ribosome biogenesis genes in PT and AT from HCC patients suggests a new scalable prognostic system, as supported by the model of tumor-like metabolic redirection/assimilation in non-malignant AT. The RGC, comprising 24 ribosomal genes, is introduced as a robust and reproducible prognostic model for stratifying HCC patient risks. The adjacent non-malignant liver tissue alone, or in combination with HCC tissue biopsy, could be an important target for developing predictive and monitoring strategies, as well as evidence-based therapeutic interventions, that aim to reduce the risk of post-surgery relapse in HCC patients. © 2017 The Authors. Published by FEBS Press and John Wiley & Sons Ltd.
Orbital Debris Quarterly News. Volume 13; No. 1
NASA Technical Reports Server (NTRS)
Liou, J.-C. (Editor); Shoots, Debi (Editor)
2009-01-01
Topics discussed include: new debris from a decommissioned satellite with a nuclear power source; debris from the destruction of the Fengyun-1C meteorological satellite; quantitative analysis of the European Space Agency's Automated Transfer Vehicle 'Jules Verne' reentry event; microsatellite impact tests; solar cycle 24 predictions and other long-term projections; and the geosynchronous (GEO) environment for the Orbital Debris Engineering Model (ORDEM2008). Abstracts from the NASA Orbital Debris Program Office, examining satellite reentry risk assessments and statistical issues for uncontrolled reentry hazards, are also included.
Carducci, Annalaura; Alfani, Simona; Sassi, Manuela; Cinini, Alessandra; Calamusa, Andrea
2011-03-01
This paper describes the methods followed by the Pisa University OCS for collecting, storing and analyzing all health-related articles and database contents; an example population survey on the topic of food safety based on such analysis is also presented. Articles published each day since 1999 in Italy's three most popular newspapers are collected and stored in a text database, on which quantitative and qualitative analyses were conducted. On the basis of these results, as well as epidemiological data, a questionnaire survey was carried out on citizens' sources of information, knowledge and risk perception regarding food safety. Of a total of 24,434 articles on all health topics, 18% regarded food-related hazards; their evolution over time showed peaks on BSE, avian flu and dioxin. A large proportion of the people surveyed declared having changed their food habits, at least temporarily, as a consequence of media information. Most get their information on food safety mainly from television. Most respondents remembered having previously heard news on BSE, avian flu and dioxin, but did not recall having heard of listeriosis, brucellosis or typhoid fever. Newspaper articles covering food-related hazards tend to be alarming, thus affecting citizens' risk perception; on the other hand, people often do not know how to manage their own food safety in a practical way. Analysis of media messages can help to evaluate and correct the negative effects that may result from wrong information. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
Dynamic safety assessment of natural gas stations using Bayesian network.
Zarei, Esmaeil; Azadeh, Ali; Khakzad, Nima; Aliabadi, Mostafa Mirzaei; Mohammadfam, Iraj
2017-01-05
Pipelines are one of the most popular and effective ways of transporting hazardous materials, especially natural gas. However, the rapid development of gas pipelines and stations in urban areas has introduced a serious threat to public safety and assets. Although different methods have been developed for risk analysis of gas transportation systems, a comprehensive methodology for risk analysis is still lacking, especially for natural gas stations. The present work is aimed at developing a dynamic and comprehensive quantitative risk analysis (DCQRA) approach for accident scenario and risk modeling of natural gas stations. In this approach, an FMEA is used for hazard analysis, while a bow-tie diagram and Bayesian network are employed to model the worst-case accident scenario and to assess the risks. The results indicated that failure of the regulator system was the worst-case accident scenario, with human error as the most important contributing factor. Thus, the risk management plan of a natural gas station should give priority to the most probable root events and main contributing factors identified in the present study, in order to reduce the occurrence probability of the accident scenarios and thus alleviate the risks. Copyright © 2016 Elsevier B.V. All rights reserved.
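The quantification behind a bow-tie-plus-Bayesian approach can be sketched in a few lines. The basic events, their probabilities, the OR-gate structure, and the rare-event posterior below are all illustrative assumptions, not the failure data or network of the cited study.

```python
# Minimal sketch of the bow-tie quantification step: basic events feed an
# OR gate for the top event (loss of pressure regulation), and Bayes' rule
# ranks root causes given that the top event has occurred. All probabilities
# are invented for illustration.
basic_events = {
    "regulator_mechanical_failure":  1e-3,
    "human_error_during_maintenance": 5e-3,
    "instrument_drift":               2e-3,
}

# P(top event) under an OR gate with independent basic events
p_no_top = 1.0
for p in basic_events.values():
    p_no_top *= (1.0 - p)
p_top = 1.0 - p_no_top

# Posterior contribution of each root cause given the top event
# (rare-event approximation: at most one cause occurs at a time)
for name, p in basic_events.items():
    print(f"{name}: prior={p:.1e}, posterior given top event={p / p_top:.2f}")
print(f"P(top event) = {p_top:.2e}")
```

In this toy setup, human error dominates the posterior, which mirrors the kind of conclusion the abstract reports for the regulator-failure scenario.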
Stock, Greg M.; Luco, Nicolas; Collins, Brian D.; Harp, Edwin L.; Reichenbach, Paola; Frankel, Kurt L.
2014-01-01
Rock falls are common in Yosemite Valley, California, posing substantial hazard and risk to the approximately four million annual visitors to Yosemite National Park. Rock falls in Yosemite Valley over the past few decades have damaged structures and caused injuries within developed regions located on or adjacent to talus slopes, highlighting the need for additional investigations into rock-fall hazard and risk. This assessment builds upon previous investigations of rock-fall hazard and risk in Yosemite Valley and focuses on the hazard and risk to structures posed by relatively frequent fragmental-type rock falls as large as approximately 100,000 cubic meters in volume.
SAMCO: Society Adaptation for coping with Mountain risks in a global change COntext
NASA Astrophysics Data System (ADS)
Grandjean, Gilles; Bernardie, Severine; Malet, Jean-Philippe; Puissant, Anne; Houet, Thomas; Berger, Frederic; Fort, Monique; Pierre, Daniel
2013-04-01
The SAMCO project aims to develop a proactive framework enhancing the overall resilience of societies to the impacts of mountain risks. The project elaborates methodological tools to characterize and measure ecosystem and societal resilience from an operative perspective on three representative mountain case studies. To achieve this objective, the methodology is split into several parts: (1) the definition of the potential impacts of global environmental changes (climate system, ecosystem, e.g. land use, socio-economic system) on landslide hazards, (2) the analysis of these consequences in terms of vulnerability (e.g. changes in the location and characteristics of the impacted areas and the level of their perturbation) and (3) the implementation of a methodology for quantitatively investigating and mapping indicators of mountain slope vulnerability exposed to several hazard types, and the development of a GIS-based demonstration platform. The strength and originality of the SAMCO project lie in combining different techniques, methodologies and models (multi-hazard assessment, risk evolution in time, vulnerability functional analysis, and governance strategies) and in gathering interdisciplinary expertise in earth sciences, environmental sciences, and social sciences. The multidisciplinary background of the members could potentially lead to the development of new concepts and emerging strategies for mountain hazard/risk adaptation. The research areas, characterized by a variety of environmental, economic and social settings, are severely affected by landslides, and have experienced significant land use modifications (reforestation, abandonment of traditional agricultural practices) and human interferences (urban expansion, ski resort construction) over the last century.
Assessing Surface Fuel Hazard in Coastal Conifer Forests through the Use of LiDAR Remote Sensing
NASA Astrophysics Data System (ADS)
Koulas, Christos
The research problem that this thesis examines is a method of predicting conventional fire hazards using data drawn from specific regions, namely the Sooke and Goldstream watershed regions in coastal British Columbia. This thesis investigates whether LiDAR data can be used to describe conventional forest stand fire hazard classes. Three objectives guided this thesis: to discuss the variables associated with fire hazard, specifically the distribution and makeup of fuel; to examine the relationship between derived LiDAR biometrics and forest attributes related to hazard assessment factors defined by the Capital Regional District (CRD); and to assess the viability of the LiDAR biometric decision tree in the CRD based on current frameworks for use. The research method uses quantitative datasets to assess the optimal generalization of these types of fire hazard data through discriminant analysis. Findings illustrate significant limitations in LiDAR-derived data, and reflect the literature in that flawed field application of data modelling techniques has led to a disconnect between the ways in which fire hazard models have been intended to be used by scholars and the ways in which they are used by those tasked with forest fire prevention. It can be concluded that a significant trade-off exists between the computational requirements of wildfire simulation models and the algorithms commonly used by field teams to apply these models with remote sensing data, and that CRD forest management practices would need to change to incorporate a decision tree model in order to decrease risk.
Arikawa, Hisashi; Sato, Shunsuke; Fujiki, Tetsuya; Matsumoto, Keiji
2017-08-01
We developed a new method for isolation and quantitation of polyhydroxyalkanoate (PHA) from culture broth. In this method, the cells were sonicated in sodium dodecyl sulfate (SDS) solution and centrifuged to recover PHA. The recovered PHA was rinsed with deionized water and ethanol, and then weighed after drying. Hazardous chemicals such as chloroform, methanol, and sulfuric acid were not used, and no expensive analytical instruments were needed. We applied this method to Cupriavidus necator culture broths that included various amounts of poly(3-hydroxybutyrate) (PHB) or poly(3-hydroxybutyrate-co-3-hydroxyhexanoate) (PHBHHx) from flasks and jar fermentors. The quantitation by this method was practical for use with a wide range of production amounts and PHA monomer compositions compared to the conventional whole-cell methanolysis method with gas chromatographic analysis, and besides, the recovered PHAs were adequately pure (≥96% purity). Therefore, this new method would be valuable not only for quantitation of PHA but also for preparation of samples to characterize their mechanical properties. Copyright © 2017 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.
Slope Hazard and Risk Assessment in the Tropics: Malaysia's Experience
NASA Astrophysics Data System (ADS)
Mohamad, Zakaria; Azahari Razak, Khamarrul; Ahmad, Ferdaus; Manap, Mohamad Abdul; Ramli, Zamri; Ahmad, Azhari; Mohamed, Zainab
2015-04-01
The increasing number of geological hazards in Malaysia has often resulted in casualties and extensive devastation with high mitigation costs. Given the destructive capacity and high frequency of disasters, Malaysia has taken a step forward to address multi-scale landslide risk reduction, emphasizing pre-disaster action rather than post-disaster reaction. Quantitative slope hazard and risk assessment at regional and national scales remains challenging in Malaysia. This paper presents a comprehensive methodological framework and the operational needs, driven by modern and advanced geospatial technology, to address these issues in the tropics. The Slope Hazard and Risk Mapping project, the first national project in Malaysia utilizing multi-sensor LIDAR, has been implemented with the support of multi- and trans-disciplinary partners. The methodological model has been formulated and evaluated given the complexity of risk scenarios in this knowledge-driven project. Slope instability problems in urban, mountainous and tectonic landscapes are among them, and their spatial information is crucial for regional landslide assessment. We develop standard procedures with optimal parameterization for susceptibility, hazard and risk assessment in the selected regions. We aim to produce a landslide inventory that is as complete as possible in both space and time. With updated, reliable terrain and landscape models, landslide conditioning factor maps can be accurately derived depending on the landslide types and failure mechanisms, which is crucial for hazard and risk assessment. We also aim to improve the generation of elements at risk for landslides and to promote integrated approaches for better disaster risk analysis. As a result, multi-sensor LIDAR technology proves a very promising tool for an old geological problem, and its derivative data for hazard and risk analysis support effective preventive measures in Malaysia. Geological, morphological, and physical factors coupled with anthropogenic activities make the spatiotemporal prediction of possible slope failures very challenging. Changing climate and land use and land cover require a dynamic geo-system approach for assessing multi-hazards in Malaysia, and this remains a great challenge. We also critically discuss the capability, limitations and future directions of geo-information tools, particularly active sensors, for systematically providing spatial input for landslide hazard and risk assessment. The costs and benefits of the developed methods compared to traditional mapping techniques are also elaborated. This paper puts forth a critical and practical framework ranging from updating the landslide inventory to mitigating landslide risk, as an attempt to support the establishment of comprehensive landslide risk management in Malaysia. The advancement of a multistage processing sequence based on airborne and ground-based laser remote sensing, coupled with sophisticated satellite positioning, advanced geographical information systems and expert knowledge, leads to a better understanding of landslide processes and their dynamics in time and space.
Given the state of the art of multi-sensor LIDAR and the complexity of the tropical environment, this first landslide project carried out at the national scale provides a better indication of, and recommendations on, the use of modern and advanced mapping technology for assessing tropical landslide geomorphology in an objective, reproducible and quantitative manner.
Dinitz, Laura B.
2008-01-01
With costs of natural disasters skyrocketing and populations increasingly settling in areas vulnerable to natural hazards, society is challenged to better allocate its limited risk-reduction resources. In 2000, Congress passed the Disaster Mitigation Act, amending the Robert T. Stafford Disaster Relief and Emergency Assistance Act (Robert T. Stafford Disaster Relief and Emergency Assistance Act, Pub. L. 93-288, 1988; Federal Emergency Management Agency, 2002, 2008b; Disaster Mitigation Act, 2000), mandating that State, local, and tribal communities prepare natural-hazard mitigation plans to qualify for pre-disaster mitigation grants and post-disaster aid. The Federal Emergency Management Agency (FEMA) was assigned to coordinate and implement hazard-mitigation programs, and it published information about specific mitigation-plan requirements and the mechanisms (through the Hazard Mitigation Grant Program-HMGP) for distributing funds (Federal Emergency Management Agency, 2002). FEMA requires that each community develop a mitigation strategy outlining long-term goals to reduce natural-hazard vulnerability, mitigation objectives and specific actions to reduce the impacts of natural hazards, and an implementation plan for those actions. The implementation plan should explain methods for prioritizing, implementing, and administering the actions, along with a 'cost-benefit review' justifying the prioritization. FEMA, along with the National Institute of Building Sciences (NIBS), supported the development of HAZUS ('Hazards U.S.'), a geospatial natural-hazards loss-estimation tool, to help communities quantify potential losses and to aid in the selection and prioritization of mitigation actions. HAZUS was expanded to a multiple-hazard version, HAZUS-MH, that combines population, building, and natural-hazard science and economic data and models to estimate physical damages, replacement costs, and business interruption for specific natural-hazard scenarios. HAZUS-MH currently performs analyses for earthquakes, floods, and hurricane wind. HAZUS-MH loss estimates, however, do not account for some uncertainties associated with the specific natural-hazard scenarios, such as the likelihood of occurrence within a particular time horizon or the effectiveness of alternative risk-reduction options. Because of the uncertainties involved, it is challenging to make informative decisions about how to cost-effectively reduce risk from natural-hazard events. Risk analysis is one approach that decision-makers can use to evaluate alternative risk-reduction choices when outcomes are unknown. The Land Use Portfolio Model (LUPM), developed by the U.S. Geological Survey (USGS), is a geospatial scenario-based tool that incorporates hazard-event uncertainties to support risk analysis. The LUPM offers an approach to estimate and compare risks and returns from investments in risk-reduction measures. This paper describes and demonstrates a hypothetical application of the LUPM for Ventura County, California, and examines the challenges involved in developing decision tools that provide quantitative methods to estimate losses and analyze risk from natural hazards.
A Software Tool for Quantitative Seismicity Analysis - ZMAP
NASA Astrophysics Data System (ADS)
Wiemer, S.; Gerstenberger, M.
2001-12-01
Earthquake catalogs are probably the most basic product of seismology, and remain arguably the most useful for tectonic studies. Modern seismograph networks can locate up to 100,000 earthquakes annually, providing a continuous and sometimes overwhelming stream of data. ZMAP is a set of tools driven by a graphical user interface (GUI), designed to help seismologists analyze catalog data. ZMAP is primarily a research tool suited to the evaluation of catalog quality and to addressing specific hypotheses; however, it can also be useful in routine network operations. Examples of ZMAP features include catalog quality assessment (artifacts, completeness, explosion contamination), interactive data exploration, mapping of transients in seismicity (rate changes, b-values, p-values), fractal dimension analysis and stress tensor inversions. Roughly 100 scientists worldwide have used the software at least occasionally, and about 30 peer-reviewed publications have made use of ZMAP. The ZMAP code is open source, written in Matlab (The MathWorks), a commercial language widely used in the natural sciences. ZMAP was first published in 1994 and has continued to grow over the past 7 years; recently, we released ZMAP v.6. The poster will introduce the features of ZMAP. We will specifically focus on ZMAP features related to time-dependent probabilistic hazard assessment. We are currently implementing a ZMAP-based system that computes probabilistic hazard maps, which combine the stationary background hazard as well as aftershock and foreshock hazard into a comprehensive time-dependent probabilistic hazard map. These maps will be displayed in near real time on the Internet. This poster is also intended as a forum for ZMAP users to provide feedback and discuss the future of ZMAP.
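One of the catalog analyses mentioned above, b-value mapping, rests on the Gutenberg-Richter relation. The sketch below shows the standard Aki/Utsu maximum-likelihood estimator applied to a synthetic catalog; it is a generic textbook implementation, not ZMAP's own Matlab code.

```python
import numpy as np

def b_value_mle(mags, mc, dm=0.1):
    """Aki/Utsu maximum-likelihood b-value for magnitudes >= mc,
    with the standard correction for magnitude binning width dm."""
    m = np.asarray(mags)
    m = m[m >= mc]
    return np.log10(np.e) / (m.mean() - (mc - dm / 2.0))

# Synthetic Gutenberg-Richter catalog with true b = 1.0; magnitudes are
# simulated from half a bin below Mc and rounded to 0.1 to mimic binning.
rng = np.random.default_rng(0)
mags = np.round(1.95 + rng.exponential(scale=1.0 / np.log(10), size=5000), 1)
print(f"estimated b-value: {b_value_mle(mags, mc=2.0):.2f}")  # ~1.0
```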
NASA Astrophysics Data System (ADS)
Odbert, Henry; Aspinall, Willy
2014-05-01
Evidence-based hazard assessment at volcanoes assimilates knowledge about the physical processes of hazardous phenomena and observations that indicate the current state of a volcano. Incorporating both these lines of evidence can inform our belief about the likelihood (probability) and consequences (impact) of possible hazardous scenarios, forming a basis for formal quantitative hazard assessment. However, such evidence is often uncertain, indirect or incomplete. Approaches to volcano monitoring have advanced substantially in recent decades, increasing the variety and resolution of multi-parameter timeseries data recorded at volcanoes. Interpreting these multiple strands of parallel, partial evidence thus becomes increasingly complex. In practice, interpreting many timeseries requires an individual to be familiar with the idiosyncrasies of the volcano, monitoring techniques, configuration of recording instruments, observations from other datasets, and so on. In making such interpretations, an individual must consider how different volcanic processes may manifest as measurable observations, and then infer from the available data what can or cannot be deduced about those processes. We examine how parts of this process may be synthesised algorithmically using Bayesian inference. Bayesian Belief Networks (BBNs) use probability theory to treat and evaluate uncertainties in a rational and auditable scientific manner, but only to the extent warranted by the strength of the available evidence. The concept is a suitable framework for marshalling multiple strands of evidence (e.g. observations, model results and interpretations) and their associated uncertainties in a methodical manner. BBNs are usually implemented in graphical form and could be developed as a tool for near real-time, ongoing use in a volcano observatory, for example. We explore the application of BBNs in analysing volcanic data from the long-lived eruption at Soufriere Hills Volcano, Montserrat. We discuss the uncertainty of inferences, and how our method provides a route to formal propagation of uncertainties in hazard models. Such approaches provide an attractive route to developing an interface between volcano monitoring analyses and probabilistic hazard scenario analysis. We discuss the use of BBNs in hazard analysis as a tractable and traceable tool for fast, rational assimilation of complex, multi-parameter data sets in the context of timely volcanic crisis decision support.
NASA Astrophysics Data System (ADS)
Odbert, Henry; Hincks, Thea; Aspinall, Willy
2015-04-01
Volcanic hazard assessments must combine information about the physical processes of hazardous phenomena with observations that indicate the current state of a volcano. Incorporating both these lines of evidence can inform our belief about the likelihood (probability) and consequences (impact) of possible hazardous scenarios, forming a basis for formal quantitative hazard assessment. However, such evidence is often uncertain, indirect or incomplete. Approaches to volcano monitoring have advanced substantially in recent decades, increasing the variety and resolution of multi-parameter timeseries data recorded at volcanoes. Interpreting these multiple strands of parallel, partial evidence thus becomes increasingly complex. In practice, interpreting many timeseries requires an individual to be familiar with the idiosyncrasies of the volcano, monitoring techniques, configuration of recording instruments, observations from other datasets, and so on. In making such interpretations, an individual must consider how different volcanic processes may manifest as measurable observations, and then infer from the available data what can or cannot be deduced about those processes. We examine how parts of this process may be synthesised algorithmically using Bayesian inference. Bayesian Belief Networks (BBNs) use probability theory to treat and evaluate uncertainties in a rational and auditable scientific manner, but only to the extent warranted by the strength of the available evidence. The concept is a suitable framework for marshalling multiple strands of evidence (e.g. observations, model results and interpretations) and their associated uncertainties in a methodical manner. BBNs are usually implemented in graphical form and could be developed as a tool for near real-time, ongoing use in a volcano observatory, for example. We explore the application of BBNs in analysing volcanic data from the long-lived eruption at Soufriere Hills Volcano, Montserrat. We show how our method provides a route to formal propagation of uncertainties in hazard models. Such approaches provide an attractive route to developing an interface between volcano monitoring analyses and probabilistic hazard scenario analysis. We discuss the use of BBNs in hazard analysis as a tractable and traceable tool for fast, rational assimilation of complex, multi-parameter data sets in the context of timely volcanic crisis decision support.
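A BBN's core operation, updating belief in an unobserved volcanic state as monitoring evidence arrives, can be illustrated with a toy discrete example. The states, observation categories, priors, and likelihoods below are invented for illustration and have no connection to the Soufriere Hills analysis.

```python
# Toy two-state belief update of the kind a BBN node performs: the hidden
# state is "escalating unrest" vs "background", and two monitoring streams
# (seismicity rate, SO2 flux) each report a discretized observation.
prior = {"escalating": 0.10, "background": 0.90}
likelihood = {
    # P(observation | state), all values invented
    ("high_seismicity", "escalating"): 0.7, ("high_seismicity", "background"): 0.1,
    ("high_so2",        "escalating"): 0.6, ("high_so2",        "background"): 0.2,
}

observations = ["high_seismicity", "high_so2"]
posterior = dict(prior)
for obs in observations:  # evidence assumed conditionally independent
    for state in posterior:
        posterior[state] *= likelihood[(obs, state)]
    z = sum(posterior.values())
    posterior = {s: p / z for s, p in posterior.items()}

print(posterior)  # {'escalating': 0.7, 'background': 0.3}
```

The point of the formalism is visible even at this scale: a weak prior belief in unrest is revised upward only as far as the combined evidence warrants, and each step of the revision is auditable.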
Industrial Accidents Triggered by Natural Hazards: an Emerging Risk Issue
NASA Astrophysics Data System (ADS)
Renni, Elisabetta; Krausmann, Elisabeth; Basco, Anna; Salzano, Ernesto; Cozzani, Valerio
2010-05-01
Natural disasters such as earthquakes, tsunamis, flooding or hurricanes have recently and dramatically hit several countries worldwide. Both direct and indirect consequences involved the population, causing, on the one hand, a high number of fatalities and, on the other, economic losses so severe that gross national product may be affected for many years. Loss of critical industrial infrastructure (electricity generation and distribution, gas pipelines, oil refineries, etc.) also occurred, causing further indirect damage to the population. In several cases, accident scenarios with large releases of hazardous materials were triggered by these natural events, causing so-called "Natech events", in which the overall damage resulted from the simultaneous consequences of the natural event and of the release of hazardous substances. Toxic releases, large fires and explosions, as well as possible long-term environmental pollution, economic losses, and overloading of emergency systems were recognised by post-event studies as the main issues of these Natech scenarios. In recent years, the increasing frequency and severity of some natural hazards due to climate change has slowly increased awareness of Natech risk as an emerging risk among stakeholders. Indeed, the iNTeg-Risk project, co-funded by the European Commission within the 7th Framework Programme, specifically addresses these scenarios among new technological issues in public safety. The present study, carried out in part within the iNTeg-Risk project, was aimed at the analysis and further development of methods and tools for the assessment and mitigation of Natech accidents. Available tools and knowledge gaps in the assessment of Natech scenarios were highlighted. The analysis mainly addressed the potential impact of flood, lightning and earthquake events on industrial installations where hazardous substances are present. Preliminary screening methodologies and more detailed methods based on quantitative risk analysis were developed. Strategies based on the use of multiple information layers, aimed at the identification of mitigation and early-warning systems, were also explored. A case study in the Emilia-Romagna region is presented.
Characterizing crown fuel distribution for conifers in the interior western United States
Seth Ex; Frederick W. Smith; Tara Keyser
2015-01-01
Canopy fire hazard evaluation is essential for prioritizing fuel treatments and for assessing potential risk to firefighters during suppression activities. Fire hazard is usually expressed as predicted potential fire behavior, which is sensitive to the methodology used to quantitatively describe fuel profiles: methodologies that assume that fuel is distributed...
USDA-ARS's Scientific Manuscript database
Standardized methods are often used to assess the likelihood of a human-health effect from exposure to a specified hazard, and inform opinions and decisions about risk management and communication. A Quantitative Microbial Risk Assessment (QMRA) is specifically adapted to detail potential human-heal...
NASA Astrophysics Data System (ADS)
Flanigan, Katherine A.; Johnson, Nephi R.; Hou, Rui; Ettouney, Mohammed; Lynch, Jerome P.
2017-04-01
The ability to quantitatively assess the condition of railroad bridges facilitates objective evaluation of their robustness in the face of hazard events. Of particular importance is the need to assess the condition of railroad bridges in networks that are exposed to multiple hazards. Data collected from structural health monitoring (SHM) can be used to better maintain a structure by prompting preventative (rather than reactive) maintenance strategies and supplying quantitative information to aid in recovery. To that end, a wireless monitoring system was validated and installed on the Harahan Bridge, a hundred-year-old long-span railroad truss bridge that crosses the Mississippi River near Memphis, TN. This bridge is exposed to multiple hazards including scour, vehicle/barge impact, seismic activity, and aging. The instrumented sensing system targets non-redundant structural components and areas of the truss and floor system that bridge managers are most concerned about based on previous inspections and structural analysis. This paper details the monitoring system and the analytical method for the assessment of bridge condition based on automated data-driven analyses. Two primary objectives of monitoring system performance are discussed: 1) monitoring fatigue accumulation in critical tensile truss elements; and 2) monitoring the reliability index values associated with sub-system limit states of these members. Moreover, since the reliability index is a scalar indicator of component safety, quantifiable condition assessment can be used as an objective metric so that bridge owners can devise informed damage-mitigation strategies and optimize resource management at the single-bridge or network level.
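For a single limit state, the reliability index mentioned above has a simple first-order form when resistance and load effect are treated as independent normal variables. The numbers below are invented for illustration and are not values from the Harahan Bridge study.

```python
import math

def reliability_index(mu_r, sigma_r, mu_s, sigma_s):
    """First-order reliability index for the limit state g = R - S with
    independent, normally distributed resistance R and load effect S."""
    return (mu_r - mu_s) / math.sqrt(sigma_r**2 + sigma_s**2)

# Illustrative numbers for a tensile member (stress units, e.g. MPa);
# monitoring data would update the load-effect statistics over time.
beta = reliability_index(mu_r=350.0, sigma_r=25.0, mu_s=210.0, sigma_s=30.0)
print(f"beta = {beta:.2f}")  # ~3.59; higher beta means a greater safety margin
```

In an SHM setting, the value of continuous monitoring is that mu_s and sigma_s stop being design-stage assumptions and become measured quantities, so beta can be tracked as the loading environment changes.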
Remotely Sensed Quantitative Drought Risk Assessment in Vulnerable Agroecosystems
NASA Astrophysics Data System (ADS)
Dalezios, N. R.; Blanta, A.; Spyropoulos, N. V.
2012-04-01
Hazard may be defined as a potential threat to humans and their welfare, and risk (or consequence) as the probability of a hazard occurring and creating loss. Drought is considered one of the major natural hazards, with significant impacts on agriculture, the environment, the economy and society. This paper deals with drought risk assessment, which comprises three distinct steps: risk identification, risk estimation and risk evaluation. Risk management is not covered in this paper; a fourth step could address the need for feedback and post-audits of all risk assessment exercises. In particular, quantitative drought risk assessment is attempted by using statistical methods. For the quantification of drought, the Reconnaissance Drought Index (RDI) is employed, which is a new index based on hydrometeorological parameters, such as precipitation and potential evapotranspiration. The remotely sensed estimation of RDI is based on NOAA-AVHRR satellite data for a period of 20 years (1981-2001). The study area is Thessaly, central Greece, which is a drought-prone agricultural region characterized by vulnerable agriculture. Specifically, the undertaken drought risk assessment processes are as follows: 1. Risk identification: this step involves drought quantification and monitoring based on remotely sensed RDI and the extraction of several features such as severity, duration, areal extent, onset and end time. Moreover, it involves a drought early warning system based on the above parameters. 2. Risk estimation: this step includes an analysis of drought severity, frequency and their relationships. 3. Risk evaluation: this step covers drought evaluation based on analysis of RDI images before and after each drought episode, which usually lasts one hydrological year (12 months). The results of this three-step drought assessment process are considered quite satisfactory in a drought-prone region such as Thessaly in central Greece. Moreover, remote sensing has proven very effective in delineating spatial variability and features in drought monitoring and assessment.
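The RDI calculation itself is compact: an initial value is the ratio of accumulated precipitation to potential evapotranspiration over a reference period, and a standardized form is obtained from the log-transformed ratio. The sketch below follows that published formulation on a synthetic record; the values are invented, not the Thessaly data.

```python
import numpy as np

def rdi_standardized(precip, pet):
    """Standardized Reconnaissance Drought Index over a set of years.
    precip, pet: arrays of shape (n_years,) holding annual (e.g. 12-month)
    accumulated precipitation and potential evapotranspiration."""
    alpha = np.asarray(precip) / np.asarray(pet)  # initial RDI value
    y = np.log(alpha)                             # assumes lognormal alpha
    return (y - y.mean()) / y.std(ddof=1)         # standardized RDI

# Synthetic 20-year record; strongly negative values indicate drought years
rng = np.random.default_rng(1)
p = rng.normal(500, 120, 20).clip(min=100)  # mm/year precipitation
pet = rng.normal(900, 60, 20)               # mm/year potential ET
print(np.round(rdi_standardized(p, pet), 2))
```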
Zhu, H B; Su, C J; Tang, H F; Ruan, Z; Liu, D H; Wang, H; Qian, Y L
2017-10-20
Objective: To establish a method for rapid determination of 47 volatile organic compounds in workplace air using a portable gas chromatograph-mass spectrometer (GC-MS). Methods: Mixed standard gases at different concentration levels were prepared by the static gas distribution method, with high-purity nitrogen as the dilution gas. The samples were injected into the GC-MS via a hand-held probe. Retention time and characteristic ions were used for qualitative analysis, and the internal standard method was used for quantitation. Results: The 47 toxic substances were well separated and determined. The linear range of the method was 0.2-16.0 mg/m(3), and the relative standard deviation for 45 volatile organic compounds was 3.8%-15.8%. The average recovery was 79.3%-119.0%. Conclusion: The method is simple, accurate and sensitive, with good separation and a short analysis period; it can be used for qualitative and quantitative analysis of volatile organic compounds in the workplace and supports the rapid identification and detection of occupational hazards.
MetaPathways v2.5: quantitative functional, taxonomic and usability improvements.
Konwar, Kishori M; Hanson, Niels W; Bhatia, Maya P; Kim, Dongjae; Wu, Shang-Ju; Hahn, Aria S; Morgan-Lang, Connor; Cheung, Hiu Kan; Hallam, Steven J
2015-10-15
Next-generation sequencing is producing vast amounts of sequence information from natural and engineered ecosystems. Although this data deluge has an enormous potential to transform our lives, knowledge creation and translation need software applications that scale with increasing data processing and analysis requirements. Here, we present improvements to MetaPathways, an annotation and analysis pipeline for environmental sequence information that expedites this transformation. We specifically address pathway prediction hazards through integration of a weighted taxonomic distance and enable quantitative comparison of assembled annotations through a normalized read-mapping measure. Additionally, we improve LAST homology searches through BLAST-equivalent E-values and output formats that are natively compatible with prevailing software applications. Finally, an updated graphical user interface allows for keyword annotation query and projection onto user-defined functional gene hierarchies, including the Carbohydrate-Active Enzyme database. MetaPathways v2.5 is available on GitHub: http://github.com/hallamlab/metapathways2. shallam@mail.ubc.ca Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.
Analyses in support of risk-informed natural gas vehicle maintenance facility codes and standards
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ekoto, Isaac W.; Blaylock, Myra L.; LaFleur, Angela Christine
2014-03-01
Safety standards development for maintenance facilities of liquid and compressed gas fueled large-scale vehicles is required to ensure proper facility design and operation envelopes. Standards development organizations are utilizing risk-informed concepts to develop natural gas vehicle (NGV) codes and standards so that maintenance facilities meet acceptable risk levels. The present report summarizes Phase I work on existing NGV repair facility code requirements and highlights inconsistencies that need quantitative analysis of their effectiveness. A Hazard and Operability (HAZOP) study was performed to identify key scenarios of interest. Finally, scenario analyses were performed using detailed simulations and modeling to estimate the overpressure hazards from HAZOP-defined scenarios. The results from Phase I will be used to identify significant risk contributors at NGV maintenance facilities, and are expected to form the basis for follow-on quantitative risk analysis work to address specific code requirements and identify effective accident prevention and mitigation strategies.
Xu, Wang; Ren, Changliang; Teoh, Chai Lean; Peng, Juanjuan; Gadre, Shubhankar Haribhau; Rhee, Hyun-Woo; Lee, Chi-Lik Ken; Chang, Young-Tae
2014-09-02
Herein, a small-molecule fluorescent sensor array for rapid identification of seven heavy metal ions was designed and synthesized, with its sensing mechanism mimicking that of a tongue. Photoinduced electron transfer and intramolecular charge transfer mechanisms result in combinatorial interactions between the sensor array and heavy metal ions, which lead to diversified fluorescence wavelength shifts and emission intensity changes. Upon principal component analysis (PCA), this result renders clear identification of each heavy metal ion on a 3D spatial dispersion graph. Further exploration provides a concentration-dependent pattern, allowing both qualitative and quantitative measurement of heavy metal ions. On the basis of this information, a "safe-zone" concept was proposed, which provides rapid exclusion of various hazardous species from clean water samples based on toxicity characteristic leaching procedure standards. This type of small-molecule fluorescent sensor array could open a new avenue for multiple heavy metal ion detection and simplified water quality analysis.
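The PCA step that turns combinatorial sensor responses into a 3D dispersion graph is easy to sketch. The response patterns, ion set, and noise level below are synthetic stand-ins for the real fluorescence data, chosen only to show the workflow.

```python
import numpy as np
from sklearn.decomposition import PCA

# Synthetic stand-in for the sensor-array readout: rows are water samples,
# columns are per-dye responses (wavelength shift or intensity change).
# Three ions, each with an invented characteristic response pattern.
rng = np.random.default_rng(7)
patterns = {"Hg2+": [1.0, -0.4, 0.2, 0.8],
            "Pb2+": [-0.6, 0.9, 0.5, -0.1],
            "Cd2+": [0.2, 0.3, -0.8, 0.4]}
X, labels = [], []
for ion, base in patterns.items():
    X.append(np.array(base) + rng.normal(0, 0.05, (20, 4)))
    labels += [ion] * 20
X = np.vstack(X)

scores = PCA(n_components=3).fit_transform(X)  # axes of the 3D dispersion graph
for ion in patterns:
    pts = scores[[i for i, l in enumerate(labels) if l == ion]]
    print(ion, np.round(pts.mean(axis=0), 2))  # well-separated centroids
```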
Modeling Compound Flood Hazards in Coastal Embayments
NASA Astrophysics Data System (ADS)
Moftakhari, H.; Schubert, J. E.; AghaKouchak, A.; Luke, A.; Matthew, R.; Sanders, B. F.
2017-12-01
Coastal cities around the world are built on lowland topography adjacent to coastal embayments and river estuaries, where multiple drivers (e.g., sea-level rise and river flooding) are increasing flood hazards. Quantitative risk assessment is required for administration of flood insurance programs and the design of cost-effective flood risk reduction measures. This demands a characterization of extreme water levels such as 100- and 500-year return period events. Furthermore, hydrodynamic flood models are routinely used to characterize localized flood intensities (i.e., local depth and velocity) based on boundary forcing sampled from extreme value distributions. For example, extreme flood discharges in the U.S. are estimated from measured flood peaks using the Log-Pearson Type III distribution. However, configuring hydrodynamic models for coastal embayments is challenging because of compound extreme flood events: events caused by a combination of extreme sea levels, extreme river discharges, and possibly other factors such as extreme waves and precipitation causing pluvial flooding in urban developments. Here, we present an approach for flood risk assessment that coordinates multivariate extreme analysis with hydrodynamic modeling of coastal embayments. First, we evaluate the significance of the correlation structure between terrestrial freshwater inflow and oceanic variables; second, this correlation structure is described using copula functions in the unit joint probability domain; and third, we choose a series of compound design scenarios for hydrodynamic modeling based on their occurrence likelihood. The design scenarios include the most likely compound event (with the highest joint probability density), the preferred marginal scenario, and reproduced time series of ensembles based on Monte Carlo sampling of the bivariate hazard domain. The comparison of the resulting extreme water dynamics under the compound hazard scenarios provides insight into the strengths and weaknesses of each approach and helps modelers choose the scenario that best fits the needs of their project. The proposed risk assessment approach can help flood hazard modeling practitioners achieve a more reliable estimate of risk by cautiously reducing the dimensionality of the hazard analysis.
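The effect of dependence between ocean level and river discharge can be made concrete with a small copula sketch. The Gumbel family and the parameter theta = 2.0 below are illustrative choices, not the copula fitted in the study; the code contrasts the joint exceedance probability of two "100-year" marginals with and without dependence.

```python
import numpy as np

def gumbel_copula_cdf(u, v, theta):
    """Gumbel copula C(u, v); theta >= 1 controls upper-tail dependence."""
    return np.exp(-(((-np.log(u)) ** theta
                     + (-np.log(v)) ** theta) ** (1.0 / theta)))

def joint_exceedance(u, v, theta):
    """P(U > u, V > v) via inclusion-exclusion on the copula."""
    return 1.0 - u - v + gumbel_copula_cdf(u, v, theta)

# Marginal non-exceedance probabilities of the "100-year" river discharge
# and coastal water level (0.99 each), with and without dependence.
u = v = 0.99
print(f"independent:  {(1 - u) * (1 - v):.2e}")          # 1.00e-04
print(f"theta = 2.0:  {joint_exceedance(u, v, 2.0):.2e}")  # ~5.9e-03
```

Under this illustrative dependence, the joint extreme is roughly 60 times more frequent than the independence assumption suggests, which is exactly why compound scenarios need the correlation structure evaluated in step one.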
Global quantitative indices reflecting provider process-of-care: data-base derivation.
Moran, John L; Solomon, Patricia J
2010-04-19
Controversy has attended the relationship between risk-adjusted mortality and process-of-care. There would be advantage in the establishment, at the database level, of global quantitative indices subsuming the diversity of process-of-care. A retrospective cohort study of patients identified in the Australian and New Zealand Intensive Care Society Adult Patient Database, 1993-2003, at the level of geographic and ICU-level descriptors (n = 35), for both hospital survivors and non-survivors. Process-of-care indices were established by analysis of: (i) the smoothed time-hazard curve of individual patient discharge, determined by pharmacokinetic methods as the area under the hazard curve (AUC), reflecting the integrated experience of the discharge process, and the time-to-peak-hazard (TMAX, in days), reflecting the time to the maximum rate of hospital discharge; and (ii) individual patient ability to optimize output (as length-of-stay) for recorded database physiological inputs, estimated as technical production-efficiency (TE, scaled [0,(maximum)1]) via the econometric technique of stochastic frontier analysis. For each descriptor, multivariate correlation-relationships between indices and summed mortality probability were determined. The data-set consisted of 223129 patients from 99 ICUs with mean (SD) age and APACHE III score of 59.2(18.9) years and 52.7(30.6) respectively; 41.7% were female and 45.7% were mechanically ventilated within the first 24 hours post-admission. For survivors, AUC was maximal in rural and for-profit ICUs, whereas TMAX (>or= 7.8 days) and TE (>or= 0.74) were maximal in tertiary ICUs. For non-survivors, AUC was maximal in tertiary ICUs, but TMAX (>or= 4.2 days) and TE (>or= 0.69) were maximal in for-profit ICUs. Across descriptors, significant differences in indices were demonstrated (analysis-of-variance, P
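The two discharge-curve indices reduce to simple numerical operations once a smoothed hazard curve is available. The curve shape below is invented for illustration, not derived from the ANZICS data.

```python
import numpy as np

# Minimal sketch of the two discharge-process indices: given a smoothed
# time-hazard curve h(t) for hospital discharge, AUC integrates the curve
# and TMAX is the time of the peak discharge rate.
t = np.linspace(0, 30, 301)          # days since ICU admission
h = 0.08 * t * np.exp(-t / 5.0)      # illustrative hazard-curve shape

auc = np.trapz(h, t)                 # area under the hazard curve
tmax = t[np.argmax(h)]               # time-to-peak-hazard (days)
print(f"AUC = {auc:.2f}, TMAX = {tmax:.1f} days")
```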
Gwinn, Maureen R; Craig, Jeneva; Axelrad, Daniel A; Cook, Rich; Dockins, Chris; Fann, Neal; Fegley, Robert; Guinnup, David E; Helfand, Gloria; Hubbell, Bryan; Mazur, Sarah L; Palma, Ted; Smith, Roy L; Vandenberg, John; Sonawane, Babasaheb
2011-01-01
Quantifying the benefits of reducing hazardous air pollutants (HAPs, or air toxics) has been limited by gaps in toxicological data, uncertainties in extrapolating results from high-dose animal experiments to estimate human effects at lower doses, limited ambient and personal exposure monitoring data, and insufficient economic research to support valuation of the health impacts often associated with exposure to individual air toxics. To address some of these issues, the U.S. Environmental Protection Agency held the Workshop on Estimating the Benefits of Reducing Hazardous Air Pollutants (HAPs) in Washington, DC, from 30 April to 1 May 2009. Experts from multiple disciplines discussed how best to move forward on air toxics benefits assessment, with a focus on developing near-term capability to conduct quantitative benefits assessment. Proposed methodologies involved analysis of data-rich pollutants and application of this analysis to other pollutants, using dose-response modeling of animal data for estimating benefits to humans, determining dose-equivalence relationships for different chemicals with similar health effects, and analysis similar to that used for criteria pollutants. Limitations and uncertainties in economic valuation of benefits assessment for HAPS were discussed as well. These discussions highlighted the complexities in estimating the benefits of reducing air toxics, and participants agreed that alternative methods for benefits assessment of HAPs are needed. Recommendations included clearly defining the key priorities of the Clean Air Act air toxics program to identify the most effective approaches for HAPs benefits analysis, focusing on susceptible and vulnerable populations, and improving dose-response estimation for quantification of benefits.
MiR-221, a potential prognostic biomarker for recurrence in papillary thyroid cancer.
Dai, Lei; Wang, Yaozong; Chen, Liangliang; Zheng, Jueru; Li, Jianjun; Wu, Xianjiang
2017-01-07
Many studies have reported several transcriptionally deregulated microRNAs (miRNAs) in papillary thyroid cancer (PTC) tissue in comparison with benign thyroid nodules and normal thyroid tissue. However, the correlation between miRNA expression and PTC recurrence remains unclear. PTC patients who were scheduled to undergo total thyroidectomy by the same surgical team at Ningbo No. 2 Hospital from March 1998 to March 2008 were enrolled in this study. The clinical and pathological characteristics of each patient were recorded in detail. The selected miRNA expressions were detected using quantitative reverse transcriptase-polymerase chain reaction (qRT-PCR). Potential predictive factors for cancer recurrence were evaluated by univariate and multivariate Cox proportional hazards analysis. A total of 78 patients were enrolled, with 49 females, at a mean age of 45.8 years. Enrolled patients were divided into two groups: a nonrecurrent group (n = 54) and a recurrent group (n = 24). The univariate Cox proportional hazards analysis revealed that primary tumor size, TNM stage, extrathyroid extension, and miR-221 and miR-222 expression were significantly associated with PTC recurrence (P < 0.05). The tissue expression of miR-221 was the only independent risk factor for PTC recurrence (HR 1.41; 95% CI 1.14-1.95, P = 0.007) by multivariate Cox proportional hazards analysis. This study identified the potential role of miR-221 as a prognostic biomarker for recurrence in PTC.
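The Cox workflow described here is straightforward to reproduce with a standard survival library; a minimal sketch using the lifelines package follows. The data frame, column names, and values are all hypothetical stand-ins for the study's patient records.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical recurrence data: months to recurrence or censoring, an event
# flag, and covariates (relative miR-221 expression, tumor size in cm).
df = pd.DataFrame({
    "months":     [12, 34, 56, 7, 89, 24, 61, 15, 40, 72],
    "recurred":   [1, 0, 0, 1, 0, 1, 0, 1, 1, 0],
    "mir221":     [3.1, 1.2, 0.8, 4.0, 0.5, 2.7, 1.0, 3.6, 2.2, 0.9],
    "tumor_size": [2.5, 1.1, 1.4, 3.2, 0.9, 2.0, 1.2, 2.8, 1.9, 1.0],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="recurred")
cph.print_summary()  # exp(coef) column gives the hazard ratio per covariate
```

In a multivariate fit like this, a covariate whose confidence interval for exp(coef) excludes 1 after adjustment for the others plays the role miR-221 plays in the abstract: an independent risk factor.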
Baker, Valerie A; Harries, Helen M; Waring, Jeff F; Duggan, Colette M; Ni, Hong A; Jolly, Robert A; Yoon, Lawrence W; De Souza, Angus T; Schmid, Judith E; Brown, Roger H; Ulrich, Roger G; Rockett, John C
2004-01-01
Microarrays have the potential to significantly impact our ability to identify toxic hazards by the identification of mechanistically relevant markers of toxicity. To be useful for risk assessment, however, microarray data must be challenged to determine reliability and interlaboratory reproducibility. As part of a series of studies conducted by the International Life Sciences Institute Health and Environmental Science Institute Technical Committee on the Application of Genomics to Mechanism-Based Risk Assessment, the biological response in rats to the hepatotoxin clofibrate was investigated. Animals were treated with high (250 mg/kg/day) or low (25 mg/kg/day) doses for 1, 3, or 7 days in two laboratories. Clinical chemistry parameters were measured, livers removed for histopathological assessment, and gene expression analysis was conducted using cDNA arrays. Expression changes in genes involved in fatty acid metabolism (e.g., acyl-CoA oxidase), cell proliferation (e.g., topoisomerase II-Alpha), and fatty acid oxidation (e.g., cytochrome P450 4A1), consistent with the mechanism of clofibrate hepatotoxicity, were detected. Observed differences in gene expression levels correlated with the level of biological response induced in the two in vivo studies. Generally, there was a high level of concordance between the gene expression profiles generated from pooled and individual RNA samples. Quantitative real-time polymerase chain reaction was used to confirm modulations for a number of peroxisome proliferator marker genes. Though the results indicate some variability in the quantitative nature of the microarray data, this appears due largely to differences in experimental and data analysis procedures used within each laboratory. In summary, this study demonstrates the potential for gene expression profiling to identify toxic hazards by the identification of mechanistically relevant markers of toxicity. PMID:15033592
Expert agreements and disagreements on induced seismicity by Enhanced Geothermal Systems
NASA Astrophysics Data System (ADS)
Trutnevyte, E.; Azevedo, I. L.
2016-12-01
Enhanced or Engineered Geothermal Systems (EGS) are at an early stage of development and only a handful of projects exist worldwide. In the face of limited empirical evidence on EGS-induced seismicity, expert elicitation provides a complementary view to quantitative assessments and basic science. We present the results of an international expert elicitation exercise with 14 experts from 6 countries. The elicitation aimed at evaluating induced seismicity hazard and risk for EGS and characterizing associated uncertainty. A state-of-the-art expert elicitation method was used: it combines technical analysis with behavioral science-informed elicitation of expert judgement in order to minimize subjectivity. The experts assessed a harmonized scenario of an EGS plant, its operational characteristics, geological context, and surrounding buildings and infrastructures. The experts provided quantitative estimates of exceedance probabilities of induced events of magnitude M ≥ 3 and M ≥ 5, the maximum magnitudes that could be observed, and made judgements on economic loss, injuries, and fatalities in the case of M = 3 and M = 5 events. The experts also rated the importance of factors that influence induced seismicity hazard and risk (e.g., reservoir depth, injected volumes, exposed building stock) and the potential uncertainty reductions through future research. We present the findings of this elicitation and highlight the points of expert agreements and disagreements.
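One simple way such elicited probabilities are often pooled, not necessarily the aggregation this study used, is equal-weight averaging across experts, with the spread retained as a crude measure of disagreement. A minimal sketch with invented numbers:

```python
import numpy as np

# Hypothetical elicited probabilities of exceeding M = 3 for the harmonized
# EGS scenario, one value per expert (14 experts, numbers invented).
p_m3 = np.array([0.30, 0.05, 0.12, 0.50, 0.08, 0.20, 0.15,
                 0.40, 0.10, 0.25, 0.06, 0.18, 0.35, 0.09])

print(f"equal-weight pool (mean): {p_m3.mean():.2f}")
print(f"median expert:            {np.median(p_m3):.2f}")
iqr = np.percentile(p_m3, 75) - np.percentile(p_m3, 25)
print(f"disagreement (IQR):       {iqr:.2f}")
```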
Occupational asthma from tafenoquine in the pharmaceutical industry: implications for QSAR.
Cannon, J; Fitzgerald, B; Seed, M; Agius, R; Jiwany, A; Cullinan, P
2015-04-01
We report occupational asthma and rhinitis in a formulation pharmacist employed in the development of tafenoquine. Tafenoquine is a new anti-malarial drug in development; the pure drug substance has an asthma hazard index of zero and previously was not known to be a respiratory sensitizing agent. The implications of this finding for the refinement of quantitative structure-activity relationship (QSAR) analysis of asthmagenic chemicals are discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cadwallader, L.C.
1997-03-01
This report presents safety information about powered industrial trucks. The basic lift truck, the counterbalanced sit-down rider truck, is the primary focus of the report. Lift truck engineering is briefly described, then a hazard analysis is performed on the lift truck. Case histories and accident statistics are also given. Rules and regulations about lift trucks, such as the US Occupational Safety and Health Administration laws and the Underwriters Laboratories standards, are discussed. Safety issues with lift trucks are reviewed, and lift truck safety and reliability are discussed. Some quantitative reliability values are given.
NASA Technical Reports Server (NTRS)
Deckert, George
2010-01-01
This viewgraph presentation reviews The NASA Hazard Analysis process. The contents include: 1) Significant Incidents and Close Calls in Human Spaceflight; 2) Subsystem Safety Engineering Through the Project Life Cycle; 3) The Risk Informed Design Process; 4) Types of NASA Hazard Analysis; 5) Preliminary Hazard Analysis (PHA); 6) Hazard Analysis Process; 7) Identify Hazardous Conditions; 8) Consider All Interfaces; 9) Work a Preliminary Hazard List; 10) NASA Generic Hazards List; and 11) Final Thoughts
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year 1990. The ACL has four technical groups -- Chemical Analysis, Instrumental Analysis, Organic Analysis, and Environmental Analysis. The Chemical Analysis Group uses wet-chemical and instrumental methods for elemental, compositional, and isotopic analyses of solid, liquid, and gaseous samples and provides specialized analytical services. The Instrumental Analysis Group uses nuclear counting techniques in radiochemical analyses over a wide range of sample types from low-level environmental samples to samples of high radioactivity. The Organic Analysis Group uses a number of complementary techniques to separate and to quantitatively and qualitatively analyze complex organic mixtures and compounds at the trace level, including synthetic fuels, toxic substances, fossil-fuel residues and emissions, pollutants, biologically active compounds, pesticides, and drugs. The Environmental Analysis Group performs analyses of inorganic environmental and hazardous waste and coal samples.
Regional Analysis of the Hazard Level of Glacial Lakes in the Cordillera Blanca, Peru
NASA Astrophysics Data System (ADS)
Chisolm, Rachel E.; Jhon Sanchez Leon, Walter; McKinney, Daene C.; Cochachin Rapre, Alejo
2016-04-01
The Cordillera Blanca mountain range is the highest in Peru and contains many of the world's tropical glaciers. This region is severely impacted by climate change causing accelerated glacier retreat. Secondary impacts of climate change on glacier retreat include stress on water resources and the risk of glacial lake outburst floods (GLOFs) from the many lakes that are forming and growing at the base of glaciers. A number of GLOFs originating from lakes in the Cordillera Blanca have occurred over the last century, several of which have had catastrophic impacts on cities and communities downstream. Glaciologists and engineers in Peru have been studying the lakes of the Cordillera Blanca for many years and have identified several lakes that are considered dangerous. However, a systematic analysis of all the lakes in the Cordillera Blanca has never before been attempted. Some methodologies for this type of systematic analysis have been proposed (e.g., Emmer and Vilímek 2014; Wang et al. 2011), but as yet they have only been applied to a few select lakes in the Cordillera Blanca. This study uses remotely sensed data to study all of the lakes of the Glacial Lake Inventory published by the Glaciology and Water Resources Unit of Peru's National Water Authority (UGRH 2011). The objective of this study is to assign a level of potential hazard to each glacial lake in the Cordillera Blanca and to ascertain whether any of the lakes beyond those that have already been studied might pose a danger to nearby populations. A number of parameters of analysis, both quantitative and qualitative, have been selected to assess the hazard level of each glacial lake in the Cordillera Blanca using digital elevation models, satellite imagery, and glacier outlines. These parameters are then combined to produce a preliminary assessment of the hazard level of each lake; the equation weighting each parameter draws on previously published methodologies but is tailored to the regional characteristics of glacial lakes and their hazard potential. This phase of glacial lake hazard assessment aims to be geographically comprehensive in order to identify potentially dangerous lakes that may have previously been overlooked. A second phase of analysis that includes site visits will be necessary for a thorough analysis at each lake to determine the potential hazard for downstream communities. The objective of the work presented here is to identify potentially dangerous lakes that warrant further study rather than to provide a final hazard assessment for each lake of the glacial lake inventory in the Cordillera Blanca. References: Emmer, A. and Vilímek, V.: New method for assessing the potential hazardousness of glacial lakes in the Cordillera Blanca, Peru, Hydrol. Earth Syst. Sci. Discuss., 11, 2391-2439, 2014. UGRH - Unidad de Glaciologia y Recursos Hidricos: Inventario de Lagunas Glaciares del Peru, Ministerio de Agricultura y Riego, Autoridad Nacional del Agua, Direccion de Conservacion y Planeamiento de Recursos Hidricos, Huaraz, Peru, 2014. Wang, W., Yao, T., Gao, Y., Yang, X., and Kattel, D. B.: A first-order method to identify potentially dangerous glacial lakes in a region of the southeastern Tibetan Plateau, Mountain Res. Develop., 31, 122-130, 2011.
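The combination step described above amounts to a weighted sum of normalized hazard parameters per lake. A minimal sketch; the parameter names, scores, and weights are invented for illustration, not the study's calibrated scheme:

```python
import numpy as np

# Each row is one lake; columns are hazard parameters scored on [0, 1] from
# DEMs, satellite imagery, and glacier outlines. All values are hypothetical.
lakes = ["Lake A", "Lake B", "Lake C"]
params = np.array([
    # growth  dam_condition  slope_above  glacier_contact
    [0.8,     0.9,           0.7,         1.0],
    [0.2,     0.1,           0.4,         0.0],
    [0.5,     0.6,           0.9,         1.0],
])
weights = np.array([0.3, 0.3, 0.2, 0.2])  # illustrative weights summing to 1

hazard_index = params @ weights  # one preliminary hazard score per lake
for name, h in sorted(zip(lakes, hazard_index), key=lambda t: -t[1]):
    print(f"{name}: preliminary hazard index {h:.2f}")
```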
Tsunamis: Global Exposure and Local Risk Analysis
NASA Astrophysics Data System (ADS)
Harbitz, C. B.; Løvholt, F.; Glimsdal, S.; Horspool, N.; Griffin, J.; Davies, G.; Frauenfelder, R.
2014-12-01
The 2004 Indian Ocean tsunami led to a better understanding of the likelihood of tsunami occurrence and potential tsunami inundation, and the Hyogo Framework for Action (HFA) was one direct result of this event. The United Nations International Strategy for Disaster Risk Reduction (UN-ISDR) adopted the HFA in January 2005 in order to reduce disaster risk. As an instrument to compare the risk due to different natural hazards, an integrated worldwide study was implemented and published in several Global Assessment Reports (GAR) by UN-ISDR. The results of the global earthquake-induced tsunami hazard and exposure analysis for a return period of 500 years are presented. Both deterministic and probabilistic tsunami hazard analysis (PTHA) methods are used, and the resulting hazard levels for both methods are compared quantitatively for selected areas. The comparison demonstrates that the analysis is rather rough, which is expected for a study aiming at average trends on a country level across the globe. It is shown that populous Asian countries account for the largest absolute number of people living in tsunami-prone areas; more than 50% of the total exposed population lives in Japan. Smaller nations like Macao and the Maldives are among the most exposed by population count. Exposed nuclear power plants are limited to Japan, China, India, Taiwan, and the USA. In contrast, a local tsunami vulnerability and risk analysis applies information on population, building types, infrastructure, inundation, and flow depth for a certain tsunami scenario with a corresponding return period, combined with empirical data on tsunami damage and mortality. Results and validation of a GIS tsunami vulnerability and risk assessment model are presented. The GIS model is adapted for optimal use of the data available for each study. Finally, the importance of including landslide sources in the tsunami analysis is also discussed.
Flight in low-level wind shear
NASA Technical Reports Server (NTRS)
Frost, W.
1983-01-01
Results of studies of wind shear hazard to aircraft operation are summarized. Existing wind shear profiles currently used in computer and flight simulator studies are reviewed. The governing equations of motion for an aircraft are derived incorporating the variable wind effects. Quantitative discussions of the effects of wind shear on aircraft performance are presented. These are followed by a review of mathematical solutions to both the linear and nonlinear forms of the governing equations. Solutions with and without control laws are presented. The application of detailed analysis to develop warning and detection systems based on Doppler radar measuring wind speed along the flight path is given. A number of flight path deterioration parameters are defined and evaluated. Comparison of computer-predicted flight paths with those measured in a manned flight simulator is made. Some proposed airborne and ground-based wind shear hazard warning and detection systems are reviewed. The advantages and disadvantages of both types of systems are discussed.
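For context, one compact quantitative index that later became standard in the wind shear literature, the F-factor, usually attributed to Bowles, and offered here only as background rather than as a quantity derived in this report, expresses shear as a loss of aircraft specific-energy rate:

$$F = \frac{\dot{W}_x}{g} - \frac{W_h}{V}$$

where \(\dot{W}_x\) is the rate of change of the horizontal wind component along the flight path (increasing tailwind positive), \(W_h\) the vertical wind (updraft positive), \(V\) the airspeed, and \(g\) the gravitational acceleration. Positive F erodes the available climb gradient; sustained values on the order of 0.1 are generally treated as hazardous.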
DOE Office of Scientific and Technical Information (OSTI.GOV)
Babendreier, Justin E.; Castleton, Karl J.
2005-08-01
Elucidating uncertainty and sensitivity structures in environmental models can be a difficult task, even for low-order, single-medium constructs driven by a unique set of site-specific data. Quantitative assessment of integrated, multimedia models that simulate hundreds of sites, spanning multiple geographical and ecological regions, will ultimately require a comparative approach using several techniques, coupled with sufficient computational power. The Framework for Risk Analysis in Multimedia Environmental Systems - Multimedia, Multipathway, and Multireceptor Risk Assessment (FRAMES-3MRA) is an important software model being developed by the United States Environmental Protection Agency for use in risk assessment of hazardous waste management facilities. The 3MRA modeling system includes a set of 17 science modules that collectively simulate release, fate and transport, exposure, and risk associated with hazardous contaminants disposed of in land-based waste management units (WMUs).
Megas, Georgios; Chrisofos, Michael; Anastasiou, Ioannis; Tsitlidou, Aida; Choreftaki, Theodosia; Deliveliotis, Charalampos
2015-01-01
The objective of this study was to evaluate the expression of estrogen receptors (ER(α) and ER(β)) and androgen receptors (ARs) as prognostic factors for biochemical recurrence, disease progression and survival in patients with pT3N0M0 prostate cancer (PCa) in an urban Greek population. A total of 100 consecutive patients with pT3N0M0 PCa treated with radical prostatectomy participated in the study. The mean age and follow-up were 64.2 and 6 years, respectively. The HSCORE was used for semi-quantitative analysis of the immunoreactivity of the receptors. The prognostic value of the ER(α) and ER(β) and AR was assessed in terms of recurrence, progression, and survival. AR expression was not associated with any of the above parameters; however, both ERs correlated with the prognosis. A univariate Cox regression analysis showed that ER(α) positive staining was significantly associated with a greater hazard for all outcomes. Increased ER(β) staining was significantly associated with a lower hazard for all outcomes in the univariate analysis. When both ER HSCORES were used for the analysis, it was found that patients with high ER(α) or low ER(β) HSCORES compared with patients with negatively stained ER(α) and >1.7 hSCORE ER(β) had 6.03, 10.93, and 10.53 times greater hazard for biochemical disease recurrence, progression of disease and death, respectively. Multiple Cox proportional hazard analyses showed that the age, preoperative prostate specific antigen, Gleason score and ERs were independent predictors of all outcomes. ER expression is an important prognosticator after radical prostatectomy in patients with pT3N0M0 PCa. By contrast, AR expression has limited prognostic value. PMID:25219910
Analysis of shallow landslides and soil erosion induced by rainfall over large areas
NASA Astrophysics Data System (ADS)
Cuomo, Sabatino; Della Sala, Maria
2014-05-01
Due to heavy rainstorms, steep hillslopes may be affected by either shallow landslides or soil superficial erosion (Acharya et al., 2011), which originate different flow-like mass movements in adjacent or overlapping source areas (Cascini et al., 2013). Triggering analysis (Cascini et al., 2011) is a relevant issue for hazard assessment that is, in turn, the first step of risk analysis procedures (Fell et al., 2008). Nevertheless, the available approaches separately consider shallow landslides and soil erosion. Specifically, quantitative models for landslides triggering analysis allow simulating the physical processes leading to failure such as pore water pressure increase and soil shear mobilization and provide estimates of the amount of material potentially involved; however, success of quantitative methods must be carefully evaluated in complex geological setting as recently outlined (Sorbino et al., 2010) and further applications to real case histories are straightforward. On the other hand, a wide range of models exist for soil erosion analysis, which differ in terms of complexity, processes considered and data required for the model calibration and practical applications; in particular, quantitative models can estimate the source areas and the amount of eroded soil through empirical relationships or mathematical equations describing the main physical processes governing soil erosion (Merritt et al., 2003). In this work a spatially distributed analysis is proposed for testing the potentialities of two available models to respectively investigate the spatial occurrence of first-time shallow landslides and superficial soil erosion repeatedly occurring in a large test area of the Southern Italy. Both analyses take into account the seasonal variation of soil suction, rainfall characteristics and soil cover use (Cuomo and Della Sala, 2013). The achieved results show that the source areas of shallow landslides strongly depend on rainfall intensity and duration and soil initial suction. On the other hand, the source areas for erosion phenomena depend on rainfall characteristics and soil cover, with simulated eroded areas larger in autumn season. In addition, for a past event, the simulated source areas of shallow landslides are smaller than those observed in the field while the simulated eroded areas with thickness greater than 5 cm are comparable with the in-situ evidences if the analysis takes into account high rainfall intensity and a spatially variable soil cover use, thus providing a consistent interpretation of the event. References Acharya, G., Cochrane, T., Davies, T., Bowman, E. (2011). Quantifying and modeling postfailure sediment yields from laboratory-scale soil erosion and shallow landslide experiments with silty loess. Geomorphology 129, 49-58. Cascini L., Cuomo S., Della Sala M. (2011). Spatial and temporal occurrence of rainfall-induced shallow landslides of flow type: A case of Sarno-Quindici, Italy. Geomorphology, 126(1-2), 148-158. Cascini, L., Sorbino, G., Cuomo, S., Ferlisi, S. (2013). Seasonal effects of rainfall on the shallow pyroclastic deposits of the Campania region (southern Italy). Landslides, 1-14, DOI: 10.1007/s10346-013-0395-3. Cuomo S., Della Sala M. (2013). Spatially distributed analysis of shallow landslides and soil erosion induced by rainfall. (submitted to Natural Hazards). Fell, R., Corominas J., Bonnard, C., Cascini, L., Leroi E., Savage, W.Z., on behalf of the JTC-1 Joint Technical Committee on Landslides and Engineered Slopes (2008). 
Guidelines for landslide susceptibility, hazard and risk zoning for land use planning. Engineering Geology, 102(3-4):85-98. Merritt, W.S., Letcher, R.A., Jakeman, A.J. (2003). A review of erosion and sediment transport models. Environmental Modelling and Software 18, 761-799. Sorbino G., Sica C., Cascini L. (2010). Susceptibility analysis of shallow landslides source areas using physically based models. Natural Hazards, 53(2), 313-332.
Mathematical modelling and quantitative methods.
Edler, L; Poirier, K; Dourson, M; Kleiner, J; Mileson, B; Nordmann, H; Renwick, A; Slob, W; Walton, K; Würtzen, G
2002-01-01
The present review reports on the mathematical methods and statistical techniques presently available for hazard characterisation. The state of the art of mathematical modelling and quantitative methods used currently for regulatory decision-making in Europe and additional potential methods for risk assessment of chemicals in food and diet are described. Existing practices of JECFA, FDA, EPA, etc., are examined for their similarities and differences. A framework is established for the development of new and improved quantitative methodologies. Areas for refinement, improvement and increase of efficiency of each method are identified in a gap analysis. Based on this critical evaluation, needs for future research are defined. It is concluded from our work that mathematical modelling of the dose-response relationship would improve the risk assessment process. An adequate characterisation of the dose-response relationship by mathematical modelling clearly requires the use of a sufficient number of dose groups to achieve a range of different response levels. This need not necessarily lead to an increase in the total number of animals in the study if an appropriate design is used. Chemical-specific data relating to the mode or mechanism of action and/or the toxicokinetics of the chemical should be used for dose-response characterisation whenever possible. It is concluded that a single method of hazard characterisation would not be suitable for all kinds of risk assessments, and that a range of different approaches is necessary so that the method used is the most appropriate for the data available and for the risk characterisation issue. Future refinements to dose-response characterisation should incorporate more clearly the extent of uncertainty and variability in the resulting output.
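As a concrete instance of the dose-response modelling this review advocates, the sketch below fits a log-logistic curve to hypothetical dose-group data and inverts it for a benchmark dose at a 10%-of-maximum response; the functional form, doses, and responses are illustrative only:

```python
import numpy as np
from scipy.optimize import curve_fit

def loglogistic(dose, top, ec50, slope):
    """Fraction responding as a function of dose."""
    return top / (1.0 + (ec50 / np.maximum(dose, 1e-12)) ** slope)

# Hypothetical study: six dose groups, fraction of animals responding.
dose = np.array([0.1, 1.0, 3.0, 10.0, 30.0, 100.0])  # mg/kg/day
resp = np.array([0.02, 0.05, 0.15, 0.40, 0.70, 0.90])

(top, ec50, slope), _ = curve_fit(loglogistic, dose, resp, p0=[1.0, 10.0, 1.0])

# Benchmark dose for a 10%-of-maximum response, by inverting the model:
# r = top / (1 + (ec50/d)^slope)  =>  d = ec50 / (top/r - 1)^(1/slope)
bmr = 0.10 * top
bmd10 = ec50 / (top / bmr - 1.0) ** (1.0 / slope)
print(f"EC50 = {ec50:.1f} mg/kg/day, BMD10 = {bmd10:.2f} mg/kg/day")
```

A design with enough dose groups to span the response range, as the review recommends, is what makes this inversion well conditioned.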
NASA Astrophysics Data System (ADS)
Schmith, Johanne; Höskuldsson, Ármann; Holm, Paul Martin; Larsen, Guðrún
2018-04-01
Katla volcano in Iceland produces hazardous large explosive basaltic eruptions on a regular basis, but very little quantitative data for future hazard assessments exist. Here, details on fragmentation mechanism and eruption dynamics are derived from a study of deposit stratigraphy with detailed granulometry and grain morphology analysis, granulometric modeling, componentry, and the new quantitative regularity index model of fragmentation mechanism. We show that magma/water interaction is important in the ash generation process, but to a variable extent. By investigating the large explosive basaltic eruptions of 1755 and 1625, we document that eruptions of similar size and magma geochemistry can have very different fragmentation dynamics. Our models show that fragmentation in the 1755 eruption was a combination of magmatic degassing and magma/water interaction, with the most magma/water interaction at the beginning of the eruption. The fragmentation of the 1625 eruption was initially also a combination of both magmatic and phreatomagmatic processes, but magma/water interaction diminished progressively during the later stages of the eruption. However, intense magma/water interaction was reintroduced during the final stages of the eruption, dominating the fine fragmentation at the end. This detailed study of fragmentation changes documents that subglacial eruptions have highly variable interaction with the melt water, showing that the amount of and access to melt water change significantly during eruptions. While it is often difficult to reconstruct the progression of eruptions that have no quantitative observational record, this study shows that integrating field observations and granulometry with the new regularity index can form a coherent model of eruption evolution.
Rockfall hazard and risk assessments along roads at a regional scale: example in Swiss Alps
NASA Astrophysics Data System (ADS)
Michoud, C.; Derron, M.-H.; Horton, P.; Jaboyedoff, M.; Baillifard, F.-J.; Loye, A.; Nicolet, P.; Pedrazzini, A.; Queyrel, A.
2012-03-01
Unlike fragmental rockfall runout assessments, there are only a few robust methods to quantify rock-mass-failure susceptibility at the regional scale. A detailed slope angle analysis of recent Digital Elevation Models (DEMs) can be used to detect potential rockfall source areas, thanks to the Slope Angle Distribution procedure. However, this method does not provide any information on block-release frequencies inside the identified areas. The present paper adds to the Slope Angle Distribution of the cliffs unit its normalized cumulative distribution function. This improvement amounts to a quantitative weighting of slope angles, introducing rock-mass-failure susceptibilities inside the rockfall source areas previously detected. Rockfall runout assessment is then performed using the GIS- and process-based software Flow-R, providing relative frequencies for runout. Taking into consideration both susceptibility results, this approach can be used to establish, after calibration, hazard and risk maps at the regional scale. As an example, a risk analysis of vehicle traffic exposed to rockfalls is performed along the main roads of the Swiss alpine valley of Bagnes.
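The paper's key addition, using the normalized cumulative distribution function of slope angles as a relative rock-mass-failure susceptibility inside detected source areas, can be sketched on synthetic slope values (the real procedure operates on DEM cells of the cliffs morphological unit):

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical slope angles (degrees) of DEM cells already flagged as
# potential rockfall sources by the Slope Angle Distribution procedure.
slopes = np.clip(rng.normal(55, 8, 10_000), 30, 89)

def susceptibility(angle, sample=slopes):
    # Normalized cumulative distribution function: the fraction of source
    # cells with slope angle <= angle serves as a relative susceptibility.
    return float(np.mean(sample <= angle))

for s in (40, 55, 70):
    print(f"slope {s} deg -> relative susceptibility {susceptibility(s):.2f}")
```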
Contents of microscopic fungi in dusts coming from cereal analysis laboratories.
Szwajkowska-Michalek, Lidia; Stuper, Kinga; Lakomy, Piotr; Matysiak, Anna; Perkowski, Juliusz
2010-01-01
Microscopic fungi - components of the bioaerosol found in the workplace environment of individuals employed in the agricultural sector - constitute a considerable hazard for their health. This study includes quantitative and qualitative analyses of the mycobionta contained in 20 samples of dusts collected from laboratories conducting analyses of cereals. A total of 27 species of viable microscopic fungi were isolated. The most frequently isolated genera were Penicillium and Aspergillus, accounting for 27% and 26% of the analyzed isolates, respectively. The content of fungal biomass was determined quantitatively using a fungal marker, ergosterol (ERG). Concentrations of this metabolite ranged from 0.48 to 212.36 mg/kg across the samples. Based on the analyses, it may be stated that the concentration of microfungi in settled dust from laboratories conducting analyses of cereals was varied, and in several cases markedly exceeded admissible concentration levels.
NASA Astrophysics Data System (ADS)
Sandri, Laura; Jolly, Gill; Lindsay, Jan; Howe, Tracy; Marzocchi, Warner
2010-05-01
One of the main challenges of modern volcanology is to provide the public with robust and useful information for decision-making in land-use planning and in emergency management. From the scientific point of view, this translates into reliable and quantitative long- and short-term volcanic hazard assessment and eruption forecasting. Because of the complexity in characterizing volcanic events, and of the natural variability of volcanic processes, a probabilistic approach is more suitable than deterministic modeling. In recent years, two probabilistic codes have been developed for quantitative short- and long-term eruption forecasting (BET_EF) and volcanic hazard assessment (BET_VH). Both of them are based on a Bayesian Event Tree, in which volcanic events are seen as a chain of logical steps of increasing detail. At each node of the tree, the probability is computed by taking into account different sources of information, such as geological and volcanological models, past occurrences, expert opinion and numerical modeling of volcanic phenomena. Since it is a Bayesian tool, the output probability is not a single number, but a probability distribution accounting for aleatory and epistemic uncertainty. In this study, we apply BET_VH in order to quantify the long-term volcanic hazard due to base surge invasion in the region around Auckland, New Zealand's most populous city. Here, small basaltic eruptions from monogenetic cones pose a considerable risk to the city in case of phreatomagmatic activity: evidence for base surges is not uncommon in deposits from past events. Currently, we are particularly focussing on the scenario simulated during Exercise Ruaumoko, a national disaster exercise based on the build-up to an eruption in the Auckland Volcanic Field. Based on recent papers by Marzocchi and Woo, we suggest a possible quantitative strategy to link probabilistic scientific output and Boolean decision making. It is based on cost-benefit analysis, in which all costs and benefits of mitigation actions have to be evaluated and compared, weighting them with the probability of occurrence of a specific threatening volcanic event. An action should be taken when the benefit of that action outweighs the costs. It is worth remarking that this strategy does not guarantee to recommend a decision that we would have taken with the benefit of hindsight. However, this strategy will be successful over the long term. Furthermore, it has the overwhelming advantage of providing a quantitative decision rule that is set before any emergency, and thus it will be justifiable at any stage of the process. In our present application, we are trying to set up a cost-benefit scheme for the call of an evacuation to protect people in the Auckland Volcanic Field against base surge invasion. Considering the heterogeneity of the urban environment and the size of the region at risk, we propose a cost-benefit scheme that is space dependent, to take into account higher costs when an eruption threatens sensitive sites for the city and/or the nation, such as the international airport or the harbour. Finally, we compare our findings with the present Contingency Plan for Auckland.
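The decision rule referenced here (after Marzocchi and Woo) reduces to: take a mitigation action when probability times prevented loss exceeds the action's cost, i.e., evacuate when the event probability exceeds the cost/loss ratio. A minimal space-dependent sketch; all figures are invented placeholders, not Auckland estimates:

```python
# Evacuate a district when p * L > C, i.e. when p > C / L.
# All figures below are invented placeholders, not Auckland estimates.
districts = {
    # name: (P(base surge reaches district), loss prevented, evacuation cost)
    "residential":  (0.020, 5.0e9,  2.0e7),
    "airport":      (0.004, 2.0e10, 5.0e8),  # sensitive site: higher cost/loss
    "rural_fringe": (0.001, 1.0e8,  5.0e6),
}

for name, (p, loss, cost) in districts.items():
    threshold = cost / loss  # probability above which evacuation pays off
    action = "evacuate" if p > threshold else "hold"
    print(f"{name}: p = {p:.3f}, threshold = {threshold:.4f} -> {action}")
```

Because the rule is fixed before any emergency, the recommendation it produces is defensible even when hindsight would have favoured a different call.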
Clark, Renee M; Besterfield-Sacre, Mary E
2009-03-01
We take a novel approach to analyzing hazardous materials transportation risk in this research. Previous studies analyzed this risk from an operations research (OR) or quantitative risk assessment (QRA) perspective by minimizing or calculating risk along a transport route. Further, even though the majority of incidents occur when containers are unloaded, the research has not focused on transportation-related activities, including container loading and unloading. In this work, we developed a decision model of a hazardous materials release during unloading using actual data and an exploratory data modeling approach. Previous studies have had a theoretical perspective in terms of identifying and advancing the key variables related to this risk, and there has not been a focus on probability and statistics-based approaches for doing this. Our decision model empirically identifies the critical variables using an exploratory methodology for a large, highly categorical database involving latent class analysis (LCA), loglinear modeling, and Bayesian networking. Our model identified the most influential variables and countermeasures for two consequences of a hazmat incident, dollar loss and release quantity, and is one of the first models to do this. The most influential variables were found to be related to the failure of the container. In addition to analyzing hazmat risk, our methodology can be used to develop data-driven models for strategic decision making in other domains involving risk.
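The end product of the exploratory modeling described, whether via latent class analysis, loglinear models, or a Bayesian network, is a set of conditional dependencies such as P(high loss | container failure). A toy sketch of that final step on synthetic categorical records (field names are invented; fitting an actual LCA would need a dedicated package):

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
n = 5_000
container_failed = rng.random(n) < 0.3
# Make a high-loss outcome more likely when the container itself failed.
high_loss = rng.random(n) < np.where(container_failed, 0.45, 0.10)

df = pd.DataFrame({"container_failed": container_failed,
                   "high_loss": high_loss})

# Conditional distribution of loss severity given container failure - the
# kind of dependency a Bayesian network encodes in a conditional table.
print(pd.crosstab(df["container_failed"], df["high_loss"], normalize="index"))
```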
Space vehicle propulsion systems: Environmental space hazards
NASA Technical Reports Server (NTRS)
Disimile, P. J.; Bahr, G. K.
1990-01-01
The hazards that exist in geolunar space which may degrade, disrupt, or terminate the performance of space-based LOX/LH2 rocket engines are evaluated. Accordingly, a summary of the open literature pertaining to the geolunar space hazards is provided. Approximately 350 citations and about 200 documents and abstracts were reviewed; the documents selected give current and quantitative detail. The methodology was to categorize the various space hazards in relation to their importance in specified regions of geolunar space. Additionally, the effect of the various space hazards in relation to spacecraft and their systems were investigated. It was found that further investigation of the literature would be required to assess the effects of these hazards on propulsion systems per se; in particular, possible degrading effects on exterior nozzle structure, directional gimbals, and internal combustion chamber integrity and geometry.
Deng, Fang-Ming; Donin, Nicholas M; Pe Benito, Ruth; Melamed, Jonathan; Le Nobin, Julien; Zhou, Ming; Ma, Sisi; Wang, Jinhua; Lepor, Herbert
2016-08-01
The risk of biochemical recurrence (BCR) following radical prostatectomy for pathologic Gleason 7 prostate cancer varies according to the proportion of Gleason 4 component. We sought to explore the value of several novel quantitative metrics of Gleason 4 disease for the prediction of BCR in men with Gleason 7 disease. We analyzed a cohort of 2630 radical prostatectomy cases from 1990-2007. All pathologic Gleason 7 cases were identified and assessed for quantity of Gleason pattern 4. Three methods were used to quantify the extent of Gleason 4: a quantitative Gleason score (qGS) based on the proportion of tumor composed of Gleason pattern 4, a size-weighted score (swGS) incorporating the overall quantity of Gleason 4, and a size index (siGS) incorporating the quantity of Gleason 4 based on the index lesion. Associations between the above metrics and BCR were evaluated using Cox proportional hazards regression analysis. qGS, swGS, and siGS were significantly associated with BCR on multivariate analysis when adjusted for traditional Gleason score, age, prostate specific antigen, surgical margin, and stage. Using Harrell's c-index to compare the scoring systems, qGS (0.83), swGS (0.84), and siGS (0.84) all performed better than the traditional Gleason score (0.82). Quantitative measures of Gleason pattern 4 predict BCR better than the traditional Gleason score. In men with Gleason 7 prostate cancer, quantitative analysis of the proportion of Gleason pattern 4 (quantitative Gleason score), as well as size-weighted measurement of Gleason 4 (size-weighted Gleason score), and a size-weighted measurement of Gleason 4 based on the largest tumor nodule significantly improve the predicted risk of biochemical recurrence compared with the traditional Gleason score.
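One plausible formulation of the three metrics and the c-index comparison, the paper's exact definitions may differ, can be sketched on simulated data with lifelines:

```python
import numpy as np
from lifelines.utils import concordance_index

rng = np.random.default_rng(3)
n = 200  # simulated Gleason-7 cohort; every quantity below is synthetic

frac_g4 = rng.uniform(0.05, 0.95, n)               # proportion of pattern 4
tumor_vol = rng.lognormal(0.0, 0.5, n)             # total tumor volume, cc
index_vol = tumor_vol * rng.uniform(0.5, 1.0, n)   # index-lesion volume

# Assumed formulations of the three metrics (illustrative, not verbatim):
qGS = 4.0 + frac_g4           # quantitative score rising with pattern 4
swGS = frac_g4 * tumor_vol    # size-weighted score
siGS = frac_g4 * index_vol    # size index based on the index lesion

# Simulated time to biochemical recurrence, worse with more pattern 4.
time = rng.exponential(10.0 / (0.5 + frac_g4))
event = rng.random(n) < 0.6

for name, score in [("qGS", qGS), ("swGS", swGS), ("siGS", siGS)]:
    # concordance_index expects higher scores to mean longer survival,
    # so the risk score is negated.
    print(name, "c-index =", round(concordance_index(time, -score, event), 3))
```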
Assessing crown fire potential by linking models of surface and crown fire behavior
Joe H. Scott; Elizabeth D. Reinhardt
2001-01-01
Fire managers are increasingly concerned about the threat of crown fires, yet only now are quantitative methods for assessing crown fire hazard being developed. Links among existing mathematical models of fire behavior are used to develop two indices of crown fire hazard-the Torching Index and Crowning Index. These indices can be used to ordinate different forest...
Joseph, Paul; Tretsiakova-McNally, Svetlana
2015-01-01
Polymeric materials often exhibit complex combustion behaviours encompassing several stages and involving solid phase, gas phase and interphase. A wide range of qualitative, semi-quantitative and quantitative testing techniques are currently available, both at the laboratory scale and for commercial purposes, for evaluating the decomposition and combustion behaviours of polymeric materials. They include, but are not limited to, techniques such as: thermo-gravimetric analysis (TGA), oxygen bomb calorimetry, limiting oxygen index measurements (LOI), Underwriters Laboratory 94 (UL-94) tests, cone calorimetry, etc. However, none of the above mentioned techniques are capable of quantitatively deciphering the underpinning physiochemical processes leading to the melt flow behaviour of thermoplastics. Melt-flow of polymeric materials can constitute a serious secondary hazard in fire scenarios, for example, if they are present as component parts of a ceiling in an enclosure. In recent years, more quantitative attempts to measure the mass loss and melt-drip behaviour of some commercially important chain- and step-growth polymers have been accomplished. The present article focuses, primarily, on the experimental and some theoretical aspects of melt-flow behaviours of thermoplastics under heat/fire conditions. PMID:28793746
NASA Astrophysics Data System (ADS)
Guillard-Gonçalves, C.; Zêzere, J. L.; Pereira, S.; Garcia, R. A. C.
2016-02-01
This study offers a semi-quantitative assessment of the physical vulnerability of buildings to landslides in a Portuguese municipality (Loures), as well as the quantitative landslide risk analysis computed as the product of the landslide hazard by the vulnerability and the economic value of the buildings. The hazard was assessed by combining the spatiotemporal probability and the frequency-magnitude relationship of the landslides. The physical vulnerability assessment was based on an inquiry of a pool of European landslide experts and a sub-pool of landslide experts who know the study area, and the answers' variability was assessed with the standard deviation. The average vulnerability of the basic geographic entities was compared by changing the map unit and applying the vulnerability to all the buildings of a test site, the inventory of which was compiled in the field. The economic value was calculated using an adaptation of the Portuguese Tax Services approach, and the risk was computed for different landslide magnitudes and different spatiotemporal probabilities. As a rule, the vulnerability values given by the sub-pool of experts who know the study area are higher than those given by the European experts, namely for the high-magnitude landslides. The obtained vulnerabilities vary from 0.2 to 1 as a function of the structural building types and the landslide magnitude, and are maximal for 10 and 20 m landslide depths. However, the highest risk was found for the landslides that are 3 m deep, because these landslides combine a relatively high frequency in the Loures municipality with a substantial potential damage.
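The risk computation stated above, risk = hazard x vulnerability x economic value, can be written directly; the sketch below uses invented hazard and vulnerability figures, and echoes the finding that frequent shallow slides can dominate expected loss:

```python
# Risk = hazard (annual probability that a landslide of a given magnitude
# reaches the building) x vulnerability (0-1) x economic value.
# All numbers are illustrative, not Loures municipality figures.
building_value_eur = 250_000.0

scenarios = {
    # landslide depth: (annual spatiotemporal hazard, vulnerability)
    "3 m":  (1e-3, 0.35),   # frequent, moderate damage
    "10 m": (5e-5, 1.00),   # rare, full loss
    "20 m": (1e-5, 1.00),
}

for depth, (hazard, vulnerability) in scenarios.items():
    risk = hazard * vulnerability * building_value_eur
    print(f"{depth} slide: expected annual loss ~ {risk:.2f} EUR")
# The shallow, frequent scenario dominates, as the abstract reports.
```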
21 CFR 120.7 - Hazard analysis.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 21 Food and Drugs 2 2012-04-01 2012-04-01 false Hazard analysis. 120.7 Section 120.7 Food and... hazards. The written hazard analysis shall consist of at least the following: (1) Identification of food..., including food hazards that can occur before, during, and after harvest. The hazard analysis shall be...
21 CFR 120.7 - Hazard analysis.
Code of Federal Regulations, 2013 CFR
2013-04-01
... 21 Food and Drugs 2 2013-04-01 2013-04-01 false Hazard analysis. 120.7 Section 120.7 Food and... hazards. The written hazard analysis shall consist of at least the following: (1) Identification of food..., including food hazards that can occur before, during, and after harvest. The hazard analysis shall be...
21 CFR 120.7 - Hazard analysis.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 21 Food and Drugs 2 2011-04-01 2011-04-01 false Hazard analysis. 120.7 Section 120.7 Food and... hazards. The written hazard analysis shall consist of at least the following: (1) Identification of food..., including food hazards that can occur before, during, and after harvest. The hazard analysis shall be...
21 CFR 120.7 - Hazard analysis.
Code of Federal Regulations, 2014 CFR
2014-04-01
... 21 Food and Drugs 2 2014-04-01 2014-04-01 false Hazard analysis. 120.7 Section 120.7 Food and... hazards. The written hazard analysis shall consist of at least the following: (1) Identification of food..., including food hazards that can occur before, during, and after harvest. The hazard analysis shall be...
Rock Slide Risk Assessment: A Semi-Quantitative Approach
NASA Astrophysics Data System (ADS)
Duzgun, H. S. B.
2009-04-01
Rock slides can be better managed by systematic risk assessments. Any risk assessment methodology for rock slides involves identification of the rock slide risk components, which are hazard, elements at risk, and vulnerability. For a quantitative/semi-quantitative risk assessment of rock slides, a mathematical value of the risk has to be computed and evaluated. The quantitative evaluation of risk for rock slides enables comparison of the computed risk with the risk of other natural and/or human-made hazards, providing better decision support and easier communication for decision makers. A quantitative/semi-quantitative risk assessment procedure involves: danger identification, hazard assessment, elements at risk identification, vulnerability assessment, risk computation, and risk evaluation. On the other hand, the steps of this procedure require adaptation of existing implementation methods, or development of new ones, depending on the type of landslide, data availability, investigation scale, and the nature of the consequences. In this study, a generic semi-quantitative risk assessment (SQRA) procedure for rock slides is proposed. The procedure has five consecutive stages: data collection and analyses, hazard assessment, analyses of elements at risk, vulnerability assessment, and risk assessment. The implementation of the procedure is illustrated for a single rock slide case, a rock slope in Norway. Rock slides from mountain Ramnefjell to lake Loen are considered to be one of the major geohazards in Norway. Lake Loen is located in the inner part of Nordfjord in Western Norway. Ramnefjell Mountain is heavily jointed, leading to the formation of vertical rock slices with heights between 400-450 m and widths between 7-10 m. These slices threaten the settlements around the Loen Valley and tourists visiting the fjord during the summer season, as released slides have the potential of creating a tsunami. In the past, several rock slides were recorded from Mountain Ramnefjell between 1905 and 1950. Among them, four of the slides caused the formation of tsunami waves which washed up to 74 m above the lake level. Two of the slides resulted in many fatalities in the inner part of the Loen Valley as well as great damage. There are three predominant joint structures in Ramnefjell Mountain which control failure and the geometry of the slides. The first joint set is a foliation plane striking northeast-southwest and dipping 35°-40° to the east-southeast. The second and the third joint sets are almost perpendicular and parallel to the mountain side and scarp, respectively. These three joint sets form slices of rock columns with widths ranging between 7-10 m and heights of 400-450 m. It is stated that the joints in set II are open between 1 and 2 m, which may bring about collection of water during heavy rainfall or snowmelt, causing the slices to be pressed out. It is estimated that water in the vertical joints both reduces the shear strength of the sliding plane and causes a reduction of normal stress on the sliding plane due to the formation of an uplift force. Hence rock slides on Ramnefjell Mountain occur in plane failure mode. The quantitative evaluation of rock slide risk requires probabilistic analysis of rock slope stability and identification of the consequences if the rock slide occurs. In this study, the failure probability of a rock slice is evaluated by the first-order reliability method (FORM).
Then, in order to use the calculated probability of failure value (Pf) in risk analyses, it is required to associate this Pf with a frequency-based probability (i.e., Pf per year), since the computed failure probability is a measure of hazard and not a measure of risk unless it is associated with the consequences of the failure. This can be done either by considering the time-dependent behavior of the basic variables in the probabilistic models or by associating the computed Pf with the frequency of failures in the region. In this study, the frequency of rock slides in the previous century at Ramnefjell is used for the evaluation of the frequency-based probability to be used in the risk assessment. The major consequence of a rock slide is the generation of a tsunami in lake Loen, causing inundation of residential areas around the lake. Risk is assessed by adapting the damage probability matrix approach, which was originally developed for risk assessment for buildings in case of earthquakes.
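A simplified numerical sketch of this chain, with plain Monte Carlo standing in for FORM and invented parameter distributions rather than the Ramnefjell data, illustrates how a plane-failure Pf is computed and then annualized against a regional slide frequency:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 200_000

# Plane-failure limit state for one rock slice, FS = resisting / driving.
# All parameter values and distributions are invented for illustration.
alpha = np.deg2rad(37.0)                           # sliding-plane dip
W = 9.0e8                                          # slice weight, N
A = 3.5e3                                          # sliding-plane area, m^2
c = rng.lognormal(np.log(5.0e4), 0.4, n)           # cohesion, Pa
phi = np.deg2rad(rng.normal(35.0, 3.0, n))         # friction angle
U = rng.uniform(0.0, 0.3, n) * W * np.cos(alpha)   # uplift from joint water

FS = (c * A + (W * np.cos(alpha) - U) * np.tan(phi)) / (W * np.sin(alpha))
Pf = float(np.mean(FS < 1.0))

# One simple way to associate Pf with a frequency-based probability: scale
# by the regional slide rate, e.g. ~6 recorded slides per century (assumed).
annual_rate = 6 / 100
print(f"Pf = {Pf:.3f}, annualized ~ {Pf * annual_rate:.5f} per year")
```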
White, W R; Crom, R L; Walker, K D
1996-07-23
It should be emphasized that the proposed ferry hazard categorizations do not represent absolute risks for introducing FMD into Panama, but instead provide a systematic method for comparing and estimating risks in the absence of quantitative data. A hazard rating of high may not necessarily represent a high quantitative risk for the introduction of FMD, but is high when compared to other scenarios. A low hazard rating may estimate a low quantitative risk of importing FMD, but economic consequences of a potential outbreak should also be considered. When further data become available, a more complete assessment of the risks of the Crucero Express compared to airplanes, cargo boats, and small boats can be performed. At present, the risk of the Crucero Express is at least as low as the other transport modes described above. Since vehicles are not presently allowed transport from Colombia to Panama, they present no risk to Panama, but with proper cleaning and disinfection procedures, vehicles can be permitted with low risk. However, the Crucero Express can carry 125 vehicles, and thorough cleaning and disinfection of this many cars will require modern and efficient facilities not yet present at either port.
NASA Astrophysics Data System (ADS)
Wang, Yunzhi; Qiu, Yuchen; Thai, Theresa; More, Kathleen; Ding, Kai; Liu, Hong; Zheng, Bin
2016-03-01
How to rationally identify epithelial ovarian cancer (EOC) patients who will benefit from bevacizumab or other antiangiogenic therapies is a critical issue in EOC treatment. The motivation of this study is to quantitatively measure adiposity features from CT images and investigate the feasibility of predicting the potential benefit for EOC patients with or without receiving bevacizumab-based chemotherapy using multivariate statistical models built on quantitative adiposity image features. A dataset involving CT images from 59 advanced EOC patients was included. Among them, 32 patients received maintenance bevacizumab after primary chemotherapy and the remaining 27 patients did not. We developed a computer-aided detection (CAD) scheme to automatically segment subcutaneous fat areas (SFA) and visceral fat areas (VFA) and then extracted 7 adiposity-related quantitative features. Three multivariate data analysis models (linear regression, logistic regression and Cox proportional hazards regression) were applied to investigate the potential association between the model-generated prediction results and the patients' progression-free survival (PFS) and overall survival (OS). The results show that for all 3 statistical models, a statistically significant association was detected between the model-generated results and both clinical outcomes in the group of patients receiving maintenance bevacizumab (p<0.01), while there was no significant association with either PFS or OS in the group of patients not receiving maintenance bevacizumab. Therefore, this study demonstrated the feasibility of using statistical prediction models based on quantitative adiposity-related CT image features to generate a new clinical marker and predict the clinical outcome of EOC patients receiving maintenance bevacizumab-based chemotherapy.
Air Monitoring for Hazardous Gas Detection
NASA Technical Reports Server (NTRS)
Arkin, C. Richard; Griffin, Timothy P.; Adams, Frederick W.; Naylor, Guy; Haskell, William; Floyd, David; Curley, Charles; Follistein, Duke W.
2004-01-01
The Hazardous Gas Detection Lab (HGDL) at Kennedy Space Center is involved in the design and development of instrumentation that can detect and quantify various hazardous gases. Traditionally these systems are designed for leak detection of the cryogenic gases used for the propulsion of the Shuttle and other vehicles. Mass spectrometers are the basis of these systems, which provide excellent quantitation, sensitivity, selectivity, response times and detection limits. Common gases monitored for aerospace applications include hydrogen, helium, nitrogen, oxygen, and argon; these five gases have historically been the focus of the HGDL.
Network Science Based Quantification of Resilience Demonstrated on the Indian Railways Network
Bhatia, Udit; Kumar, Devashish; Kodra, Evan; Ganguly, Auroop R.
2015-01-01
The structure, interdependence, and fragility of systems ranging from power-grids and transportation to ecology, climate, biology and even human communities and the Internet have been examined through network science. While response to perturbations has been quantified, recovery strategies for perturbed networks have usually been either discussed conceptually or through anecdotal case studies. Here we develop a network science based quantitative framework for measuring, comparing and interpreting hazard responses as well as recovery strategies. The framework, motivated by the recently proposed temporal resilience paradigm, is demonstrated with the Indian Railways Network. Simulations inspired by the 2004 Indian Ocean Tsunami and the 2012 North Indian blackout as well as a cyber-physical attack scenario illustrate hazard responses and effectiveness of proposed recovery strategies. Multiple metrics are used to generate various recovery strategies, which are simply sequences in which system components should be recovered after a disruption. Quantitative evaluation of these strategies suggests that faster and more efficient recovery is possible through network centrality measures. Optimal recovery strategies may be different per hazard, per community within a network, and for different measures of partial recovery. In addition, topological characterization provides a means for interpreting the comparative performance of proposed recovery strategies. The methods can be directly extended to other Large-Scale Critical Lifeline Infrastructure Networks including transportation, water, energy and communications systems that are threatened by natural or human-induced hazards, including cascading failures. Furthermore, the quantitative framework developed here can generalize across natural, engineered and human systems, offering an actionable and generalizable approach for emergency management in particular as well as for network resilience in general. PMID:26536227
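The recovery-strategy comparison at the core of the framework can be sketched with networkx on a synthetic scale-free graph standing in for the railway network; the failure scenario, graph, and metrics below are illustrative only:

```python
import networkx as nx

# Synthetic scale-free graph standing in for an infrastructure network.
G = nx.barabasi_albert_graph(300, 2, seed=5)

# Hazard response: knock out the 30 highest-degree nodes (hubs).
failed = sorted(G.nodes, key=G.degree, reverse=True)[:30]

def giant_fraction(active_nodes):
    # Fraction of all nodes in the largest connected component.
    H = G.subgraph(active_nodes)
    return max(len(c) for c in nx.connected_components(H)) / G.number_of_nodes()

# A recovery strategy is just the order in which failed nodes are restored.
betweenness = nx.betweenness_centrality(G)
strategies = {
    "by_betweenness": sorted(failed, key=betweenness.get, reverse=True),
    "arbitrary": list(failed),
}

for name, order in strategies.items():
    active = set(G.nodes) - set(failed)
    trace = []
    for node in order:
        active.add(node)
        trace.append(giant_fraction(active))
    # A faster-rising trace indicates a more efficient recovery strategy.
    print(name, [round(t, 2) for t in trace[::10]])
```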
Clinical value of protein expression of kallikrein-related peptidase 7 (KLK7) in ovarian cancer.
Dorn, Julia; Gkazepis, Apostolos; Kotzsch, Matthias; Kremer, Marcus; Propping, Corinna; Mayer, Katharina; Mengele, Karin; Diamandis, Eleftherios P; Kiechle, Marion; Magdolen, Viktor; Schmitt, Manfred
2014-01-01
Expression of the kallikrein-related peptidase 7 (KLK7) is dysregulated in ovarian cancer. We assessed KLK7 expression by ELISA and quantitative immunohistochemistry and analyzed its association with clinicopathological parameters and patients' outcome. KLK7 antigen concentrations were determined in tumor tissue extracts of 98 ovarian cancer patients by ELISA. For analysis of KLK7 immunoexpression in ovarian cancer tissue microarrays, a manual quantitative scoring system as well as a software tool for quantitative high-throughput automated image analysis was used. In immunohistochemical analyses, expression levels of KLK7 were not associated with patients' outcome. However, in multivariate analyses, KLK7 antigen levels in tumor tissue extracts were significantly associated with both overall and progression-free survival: ovarian cancer patients with high KLK7 levels had a significantly, 2-fold lower risk of death [hazard ratio (HR)=0.51, 95% confidence interval (CI)=0.29-0.90, p=0.019] or relapse [HR=0.47, 95% CI=0.25-0.91, p=0.024], as compared with patients who displayed low KLK7 levels. Our results indicate that - in contrast to earlier findings - high KLK7 antigen levels in tumor tissue extracts may be associated with a better prognosis of ovarian cancer patients.
NASA Astrophysics Data System (ADS)
Dimou, Kaotar; Emond, Claude
2017-06-01
In recent decades, the control banding (CB) approach has been recognised as a hazard assessment methodology of increasing importance in the occupational safety, health and hygiene (OSHH) field. According to the American Industrial Hygiene Association, this approach originates from the pharmaceutical industry in the United Kingdom. The aim of the CB approach is to protect more than 90% (or approximately 2.7 billion) of the world's workers who do not have access to OSHH professionals and traditional quantitative risk assessment methods. In other words, CB is a qualitative or semi-quantitative tool designed to prevent occupational accidents by controlling worker exposures to potentially hazardous chemicals in the absence of comprehensive toxicological and exposure data. These criteria apply very precisely to the development and production of engineered nanomaterials (ENMs). Considering the significant lack of scientific knowledge about the work-related health risks posed by ENMs, CB is, in general, appropriate for these issues. Currently, CB can be adapted to the specificities of ENMs; hundreds of nanotechnology products containing ENMs are already on the market. In this context, this qualitative or semi-quantitative approach appears to be relevant for characterising and quantifying the degree of physico-chemical and biological reactivity of ENMs, leading towards better control of human health effects and the safe handling of ENMs in workplaces. The need to better understand the CB approach is important for further managing the risks related to handling hazardous substances, such as ENMs, without established occupational exposure limits. In recent years, this topic has garnered much interest, including discussions in many technical papers. Several CB models have been developed, and many countries have created their own nano-specific CB instruments. The aims of this research were to perform a literature review about CB, to classify the main approaches developed worldwide, and then to suggest an original methodology based on characterisation of the hazard. For this research, our team conducted a systematic literature review covering the past 20 years. This approach is important in understanding the conceptual basis for CB and the model's overall effectiveness. These considerations lead to the proposal of an original hazard assessment method based on physico-chemical and biological characteristics. Such a method should help the entire industry better understand the ability of the CB approach to limit workers' exposure, while identifying the strengths and weaknesses of the approach. Developing this practical method will help to provide relevant recommendations to workers who handle hazardous chemicals such as ENMs and to the general population.
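A generic CB matrix, hazard band crossed with exposure band to yield a control level, can be sketched as below; the bands and aggregation rule are a simplified illustration, not any specific published nano-CB tool:

```python
# Generic control-banding lookup: hazard band x exposure band -> control
# level. Bands and the aggregation rule are a simplified illustration only.
CONTROLS = {
    1: "general ventilation",
    2: "local exhaust ventilation",
    3: "containment / enclosure",
    4: "seek specialist advice",
}

def control_band(hazard_band: str, exposure_band: str) -> str:
    h = "ABCDE".index(hazard_band)           # A (low) .. E (high / unknown ENM)
    e = ["low", "medium", "high"].index(exposure_band)
    level = min(4, 1 + (h + e) // 2)         # coarse aggregation rule (assumed)
    return CONTROLS[level]

# A poorly characterised nanomaterial handled routinely:
print(control_band("D", "medium"))           # -> "containment / enclosure"
```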
2016-01-01
The literature of environmental microbiology broadly discusses issues associated with microbial hazards in archives, but these publications are mainly devoted to paper documents. There are few articles on historical parchment documents, which used to be very important for the development of literature and the art of writing. The present study applies a broad spectrum of methods for the assessment of biodeterioration hazards to the parchment document in question, based on both conventional microbiological methods and advanced techniques of molecular biology. Here, a qualitative analysis was conducted, based on genetic identification of bacteria and fungi present on the document, as well as denaturing gradient gel electrophoresis profiling and examination of the destructive potential of isolated microbes. Moreover, the study involved a quantitative and qualitative microbiological assessment of the indoor air in the room where the parchment was kept. The microbes with the highest destructive potential isolated from the investigated item were the bacteria Bacillus cereus and Acinetobacter lwoffii and the fungi Penicillium chrysogenum, Chaetomium globosum, and Trichoderma longibrachiatum. The presence of the B. cereus strain was particularly interesting since, under appropriate conditions, it leads to complete parchment degradation within several days. PMID:26896133
NASA Technical Reports Server (NTRS)
Smart, Christian
1998-01-01
During 1997, a team from Hernandez Engineering, MSFC, Rocketdyne, Thiokol, Pratt & Whitney, and USBI completed the first phase of a two-year Quantitative Risk Assessment (QRA) of the Space Shuttle. The models for the Shuttle systems were entered and analyzed by a new QRA software package. This system, termed the Quantitative Risk Assessment System (QRAS), was designed by NASA and programmed by the University of Maryland. The software is a groundbreaking PC-based risk assessment package that allows the user to model complex systems in a hierarchical fashion. Features of the software include the ability to easily select quantifications of failure modes, draw Event Sequence Diagrams (ESDs) interactively, perform uncertainty and sensitivity analysis, and document the modeling. This paper illustrates both the approach used in modeling and the particular features of the software package. The software is general and can be used in a QRA of any complex engineered system. The author is the project lead for the modeling of the Space Shuttle Main Engines (SSMEs), and this paper focuses on the modeling completed for the SSMEs during 1997. In particular, the ground rules for the study, the databases used, the way in which ESDs were used to model catastrophic failure of the SSMEs, the methods used to quantify the failure rates, and how QRAS was used in the modeling effort are discussed. Ground rules were necessary to limit the scope of such a complex study, especially with regard to a liquid rocket engine such as the SSME, which can be shut down after ignition either on the pad or in flight. The SSME was divided into its constituent components and subsystems. These were ranked on the basis of the possibility of being upgraded and the risk of catastrophic failure. Once this was done, the Shuttle program Hazard Analysis and Failure Modes and Effects Analysis (FMEA) were used to create a list of potential failure modes to be modeled. The ground rules and other criteria were used to screen out the many failure modes that did not contribute significantly to the catastrophic risk. The Hazard Analysis and FMEA for the SSME were also used to build ESDs that show the chain of events leading from the failure mode occurrence to one of the following end states: catastrophic failure, engine shutdown, or successful operation (successful with respect to the failure mode under consideration).
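At its core, quantifying an event sequence diagram means multiplying an initiating-event frequency by the branch probabilities along each path to an end state. The toy sketch below illustrates that arithmetic for the three end states named above; all numbers are invented, and this is not the QRAS software.

```python
# Toy event-sequence-diagram quantification (not the QRAS tool itself):
# each path from an initiating failure mode to an end state has a
# probability equal to the initiator frequency times the branch
# probabilities along the path. All numbers below are made up.

initiator_freq = 1e-4  # initiating failure mode, per mission

# Branches: probability that shutdown succeeds, given the failure mode.
p_detect = 0.95        # failure detected in time
p_shutdown_ok = 0.90   # engine shuts down benignly once detected

p_benign_shutdown = initiator_freq * p_detect * p_shutdown_ok
p_catastrophic = initiator_freq * (
    (1 - p_detect) + p_detect * (1 - p_shutdown_ok)
)

print(f"benign shutdown: {p_benign_shutdown:.2e} per mission")
print(f"catastrophic:    {p_catastrophic:.2e} per mission")
```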
INTERNAL HAZARDS ANALYSIS FOR LICENSE APPLICATION
DOE Office of Scientific and Technical Information (OSTI.GOV)
R.J. Garrett
2005-02-17
The purpose of this internal hazards analysis is to identify and document the internal hazards and potential initiating events associated with preclosure operations of the repository at Yucca Mountain. Internal hazards are those hazards presented by the operation of the facility and by its associated processes that can potentially lead to a radioactive release or cause a radiological hazard. In contrast to external hazards, internal hazards do not involve natural phenomena and external man-made hazards. This internal hazards analysis was performed in support of the preclosure safety analysis and the License Application for the Yucca Mountain Project. The methodology for this analysis provides a systematic means to identify internal hazards and potential initiating events that may result in a radiological hazard or radiological release during the repository preclosure period. These hazards are documented in tables of potential internal hazards and potential initiating events (Section 6.6) for input to the repository event sequence categorization process. The results of this analysis will undergo further screening and analysis based on the criteria that apply to the performance of event sequence analyses for the repository preclosure period. The evolving design of the repository will be re-evaluated periodically to ensure that internal hazards that have not been previously evaluated are identified.
Hazard Response Modeling Uncertainty (A Quantitative Method)
1988-10-01
The study was conducted by the National Maritime Institute under contract to the United Kingdom Health and Safety Executive, with the sponsorship of numerous international... Instantaneous releases of 2000...
NASA Astrophysics Data System (ADS)
Carlier, Benoit; Puissant, Anne; Dujarric, Constance
2017-04-01
Vulnerability assessment and hazard exposure are generally accepted as the two main steps of risk analysis. While quantitative methods to estimate hazard exposure are now well defined, this is not the case for vulnerability assessment. Vulnerability is a complex concept involving a variety of disciplines from the physical and socio-economic sciences (i.e. engineering, economics, social and health sciences, etc.). Currently, two opposite trends exist: the 'physical' approach, in which vulnerability is analysed as potential impacts (i.e. structural and functional) on the elements at risk (building, network, land cover); and the 'social' approach, in which vulnerability is a combination of socio-economic variables determining people's ability to anticipate before a catastrophic event, to react during it, and to recover after it. For a complete analysis of vulnerability it is essential to combine these two approaches, but in reality few such works exist. The objective of this research is to improve the Potential Damage Index (PDI), detailed in Puissant et al. (2013) and originally developed to assess the physical injury, structural and functional consequences of landslide hazard, by including socio-economic characteristics of the population. Data from the French census (INSEE, 2012) and a survey on risk perception (100 questionnaires obtained between 2014 and 2015/16) were used to propose an overall index taking into account the three main phases of risk management: preparedness, crisis management and recovery. This new index, called the Global Potential Damage Index (GPDI), is applied to the Upper Guil Catchment to assess potential torrential flood hazard in the context of the French-funded project SAMCO (Society Adaptation for coping with Mountain risks in a global change Context). Results of the PDI are compared with the GPDI and show significant differences. Mapped GPDI scores are lower than PDI scores, indicating that a resilient population may qualify the results obtained for physical consequences. In the GPDI the social and institutional component is expressed by a single value applied to all the stakes of a given community. Consequently, socio-economic differences between the Upper Guil catchment communities are highlighted, making the results easily understandable for local managers.
A quantitative risk analysis approach to port hydrocarbon logistics.
Ronza, A; Carol, S; Espejo, V; Vílchez, J A; Arnaldos, J
2006-01-16
A method is presented that allows quantitative risk analysis to be performed on marine hydrocarbon terminals sited in ports. A significant gap was identified in the technical literature published prior to this work on QRA for the handling of hazardous materials in harbours. The analysis is extended to tanker navigation through port waters and loading and unloading facilities. The steps of the method are discussed, beginning with data collection. As to accident scenario identification, an approach is proposed that takes into account minor and massive spills due to loading arm failures and tank rupture. Frequency estimation is thoroughly reviewed and a shortcut approach is proposed for frequency calculation. This allows for the two-fold possibility of a tanker colliding/grounding at/near the berth or while navigating to/from the berth. A number of probability data defining the possibility of a cargo spill after an external impact on a tanker are discussed. As to consequence and vulnerability estimates, a scheme is proposed for the use of ratios between the numbers of fatal victims, injured and evacuated people. Finally, an example application is given, based on a pilot study conducted in the Port of Barcelona, where the method was tested.
Hazard Analysis Guidelines for Transit Projects
DOT National Transportation Integrated Search
2000-01-01
These hazard analysis guidelines discuss safety critical systems and subsystems, types of hazard analyses, when hazard analyses should be performed, and the hazard analysis philosophy. These guidelines are published by FTA to assist the transit indus...
Miranda, Nahieh Toscano; Sequinel, Rodrigo; Hatanaka, Rafael Rodrigues; de Oliveira, José Eduardo; Flumignan, Danilo Luiz
2017-04-01
Benzene, toluene, ethylbenzene, and xylenes are some of the most hazardous constituents found in commercial gasoline samples; therefore, these components must be monitored to avoid toxicological problems. We propose a new routine method of ultrafast gas chromatography coupled to flame ionization detection for the direct determination of benzene, toluene, ethylbenzene, and xylenes in commercial gasoline. This method is based on external standard calibration to quantify each compound, and includes a validation study of linearity, detection and quantification limits, precision, and accuracy. The analysis time was less than 3.2 min, with quantitative separation and determination of all target compounds in commercial gasoline samples. Ultrafast gas chromatography is a promising alternative method to official analytical techniques. Government laboratories could consider using this method for quality control. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Protection of agriculture against drought in Slovenia based on vulnerability and risk assessment
NASA Astrophysics Data System (ADS)
Dovžak, M.; Stanič, S.; Bergant, K.; Gregorič, G.
2012-04-01
Past and recent extreme events, like earthquakes, extreme droughts, heat waves, flash floods and volcanic eruptions, continuously remind us that natural hazards are an integral component of the global environment. Despite rapid improvement of detection techniques, many of these events evade long-term or even mid-term prediction and can thus have disastrous impacts on affected communities and the environment. Effective mitigation and preparedness strategies can be developed only after gaining an understanding of how and where such hazards may occur, what causes them, what circumstances increase their severity, and what their impacts may be; their study has in recent years emerged under the common title of natural hazard management. The first step in natural risk management is risk identification, which includes hazard analysis and monitoring, vulnerability analysis and determination of the risk level. The presented research focuses on drought, which is at present the most widespread, and still largely unpredictable, natural hazard. Its primary aims were to assess the frequency and the consequences of droughts in Slovenia based on drought events in the past, to develop a methodology for drought vulnerability and risk assessment that can be applied in Slovenia and more widely in South-Eastern Europe, to prepare maps of drought risk and crop vulnerability, and to prepare guidelines to reduce the vulnerability of the crops. Using the amounts of plant-available water in the soil, slope inclination, solar radiation, land use and irrigation infrastructure data sets as inputs, we obtained vulnerability maps for Slovenia using GIS-based multi-criteria decision analysis with a weighted linear combination of the input parameters. The weight configuration was optimized by comparing the modelled crop damage to the assessed actual damage, which was available for the extensive drought of 2006. Drought risk was obtained quantitatively as a function of hazard and vulnerability and presented, like the vulnerability, as a GIS-based map. Risk maps show geographic regions in Slovenia where droughts pose a major threat to agriculture and, together with the vulnerability maps, provide the basis for drought management, in particular for appropriate mitigation and response actions in specific regions. The developed methodology is expected to be applied to the entire region of South-Eastern Europe within the initiative of the Drought Management Centre for Southeastern Europe.
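The GIS-based weighted linear combination described above can be sketched in a few lines of array arithmetic; the layer names, weights and raster contents below are placeholders, not the study's calibrated inputs.

```python
# Sketch of a GIS-style weighted linear combination used to build a
# vulnerability map. Layers are normalized rasters (0 = least, 1 = most
# vulnerable); weights are hypothetical, not the study's optimized ones.
import numpy as np

shape = (100, 100)  # toy raster grid
rng = np.random.default_rng(0)
layers = {
    "plant_available_water": rng.random(shape),
    "slope_inclination": rng.random(shape),
    "solar_radiation": rng.random(shape),
    "land_use": rng.random(shape),
    "irrigation": rng.random(shape),
}
weights = {
    "plant_available_water": 0.35,
    "slope_inclination": 0.15,
    "solar_radiation": 0.15,
    "land_use": 0.20,
    "irrigation": 0.15,
}
assert abs(sum(weights.values()) - 1.0) < 1e-9

vulnerability = sum(weights[k] * layers[k] for k in layers)
# Risk as a function of hazard and vulnerability, e.g. their product:
hazard = rng.random(shape)  # placeholder drought-hazard raster
risk = hazard * vulnerability
```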
The SAMCO Web-platform for resilience assessment in mountainous valleys impacted by landslide risks.
NASA Astrophysics Data System (ADS)
Grandjean, Gilles; Thomas, Loic; Bernardie, Severine
2016-04-01
The ANR-SAMCO project aims to develop a proactive resilience framework enhancing the overall resilience of societies to the impacts of mountain risks. The project elaborates methodological tools to characterize and measure ecosystem and societal resilience from an operative perspective on three representative mountain case studies. To achieve this objective, the methodology is split into several steps: (1) the definition of the potential impacts of global environmental changes (climate system, ecosystem e.g. land use, socio-economic system) on landslide hazards, (2) the analysis of these consequences in terms of vulnerability (e.g. changes in the location and characteristics of the impacted areas and the level of their perturbation) and (3) the implementation of a methodology for quantitatively investigating and mapping indicators of mountain slope vulnerability exposed to several hazard types, and the development of a GIS-based demonstration platform available on the web. The strength and originality of the SAMCO project lie in the combination of different techniques, methodologies and models (multi-hazard assessment, risk evolution in time, vulnerability functional analysis, and governance strategies) implemented in a user-oriented web platform, currently in development. We present the first results of this development task, the architecture and functions of the web tools, and the case-study database showing the multi-hazard maps and the stakes at risk. Risk assessments over several areas of interest in Alpine and Pyrenean valleys are still in progress, but the first analyses are presented for current and future periods, for which climate change and land-use (economic, geographical and social) scenarios are taken into account. This tool, dedicated to stakeholders, should ultimately be used to evaluate the resilience of mountainous regions, since multiple scenarios can be tested and compared.
Smith, Philip A; Lockhart, Bonnie; Besser, Brett W; Michalski, Michael A R
2014-01-01
Hazardous atmospheres in confined spaces may be obvious when a source of air contamination or oxygen (O2) deficiency is recognized. Such is often the case in general industry settings, especially with work processes which create hazardous atmospheres that may be anticipated. Hazards present in active sewers are also well recognized, but the possibility that O2 deficiency or high airborne contaminant concentrations may exist in new construction sewers or storm drains has been repeatedly ignored with deadly results. Low O2 and high carbon dioxide (CO2) concentrations may exist in new construction manholes that have not yet been connected to an active sewer or drain system, and these concentrations have been shown to vary over time. A recent incident is described in which workers repeatedly entered such a confined space without harm, but a subsequent entry resulted in a fatality and a near-miss for a co-worker rescuer. Additional cases are discussed, with an emphasis placed on elevated CO2 concentrations as a causative factor. A description is provided for the adsorptive gas chromatography whole-air analysis methodology used to quantitatively determine atmospheric conditions present at this type of fatality site or others after an incident, and for the gas chromatography-mass spectrometry method used to provide confirmation of analyte identity with high certainty. Many types of confined spaces may be encountered in addition to the underground varieties discussed, and many atmospheric hazards are possible. The definitive whole-air analysis approach described here may be of use and should be considered to investigate many confined space fatality and near-miss cases, and to better understand the causes of dangerous atmosphere conditions that may arise in confined spaces.
Probabilistic versus deterministic hazard assessment in liquefaction susceptible zones
NASA Astrophysics Data System (ADS)
Daminelli, Rosastella; Gerosa, Daniele; Marcellini, Alberto; Tento, Alberto
2015-04-01
Probabilistic seismic hazard assessment (PSHA), usually adopted in the framework of seismic code drafting, is based on a Poissonian description of temporal occurrence, a negative exponential distribution of magnitude, and an attenuation relationship with a log-normal distribution of PGA or response spectrum. The main positive aspect of this approach stems from the fact that it is presently a standard for the majority of countries, but there are weak points, in particular regarding the physical description of the earthquake phenomenon. Factors like site effects and source characteristics, such as the duration of the strong motion and directivity, that could significantly influence the expected motion at the site are not taken into account by PSHA. Deterministic models can better evaluate the ground motion at a site from a physical point of view, but their prediction reliability depends on the degree of knowledge of the source, wave propagation and soil parameters. We compare these two approaches in selected sites affected by the May 2012 Emilia-Romagna and Lombardia earthquake, which caused widespread liquefaction phenomena, unusual for magnitudes less than 6. We focus on sites liquefiable because of their soil mechanical parameters and water table level. Our analysis shows that the choice between deterministic and probabilistic hazard analysis is strongly dependent on site conditions. The looser the soil and the higher the liquefaction potential, the more suitable is the deterministic approach. Source characteristics, in particular the duration of strong ground motion, have long been recognized as relevant to inducing liquefaction; unfortunately, a quantitative prediction of these parameters appears very unlikely, dramatically reducing the possibility of their adoption in hazard assessment. Last but not least, economic factors are relevant in the choice of approach. The case history of the 2012 Emilia-Romagna and Lombardia earthquake, with an officially estimated cost of 6 billion Euros, shows that the geological and geophysical investigations necessary for a reliable deterministic hazard evaluation are largely justified.
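Under the Poissonian occurrence model that underlies PSHA, the probability of at least one exceedance of a ground-motion level with annual rate λ during an exposure time t is P = 1 - exp(-λt). A minimal sketch of this standard relationship (an illustration, not part of the abstract itself):

```python
# Poisson exceedance probability, the temporal backbone of PSHA:
# P(at least one exceedance in t years) = 1 - exp(-lam * t).
import math

def poisson_exceedance(annual_rate: float, years: float) -> float:
    """Probability of at least one exceedance under a Poisson model."""
    return 1.0 - math.exp(-annual_rate * years)

# The common design convention "10% in 50 years" corresponds to a
# return period of about 475 years:
p, t = 0.10, 50.0
annual_rate = -math.log(1.0 - p) / t
print(f"return period ~ {1.0 / annual_rate:.0f} years")  # ~475
print(poisson_exceedance(annual_rate, 50.0))             # 0.10
```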
Defining and Measuring Coastal Vulnerability and Resilience to Natural Hazards
NASA Astrophysics Data System (ADS)
Becker, M. K.; Hoagland, P.
2014-12-01
Accounting for an estimated 23 percent of the world's population, coastal communities face many types of natural hazards. In particular, they may be vulnerable to the effects of tropical cyclones, flooding due to tsunamis or storm surges, erosion, saltwater intrusion, and subsidence. These coastal hazards are further exacerbated by population growth and climate change. There is a lack of consensus in the literature about what constitutes vulnerability (negative impacts) and resilience (recovery from negative impacts) and how to measure these phenomena. While some important work has focused on the long-term effects of coastal hazards on economic growth, little has been done to understand, in quantitative terms, the extent to which coastal communities may be vulnerable to such hazards and, if so, whether they can be resilient. We surveyed nine indicators of human well-being in order to determine their potential suitability as measures of coastal vulnerability or resilience. Some measures, such as the Gross Domestic Product, the Human Development Index, and the Gini coefficient, comprise economic or distributional indicators of human welfare; others, such as the Social Vulnerability Index, are more complex and difficult to interpret. We selected per capita personal income as the most viable indicator, due largely to its simplicity and its availability over several decades. We used it to examine human community vulnerability and resilience to a specific coastal hazard—significant storm surges from major coastal hurricanes—in several US coastal metropolitan areas. We compiled data on per capita personal income from the US Bureau of Economic Analysis for 15 to 20 years prior and subsequent to four major hurricanes: Hugo, which hit the Charleston, South Carolina, metropolitan area in 1989; Bob, Cape Cod, Massachusetts, in 1991; Andrew, Miami, Florida, in 1992; and Opal, Pensacola, Florida, in 1995. Intervention analysis using linear regression suggests that these coastal areas exhibited the full range of possible combinations of vulnerability and resilience.
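An intervention analysis of the kind described can be sketched as ordinary least squares with a time trend plus a post-hurricane step term; the income series below is synthetic, not the BEA data used in the study, and the event year is borrowed from the Hurricane Andrew example only for illustration.

```python
# Sketch of an intervention analysis: regress per capita income on a
# time trend and a post-event step. A negative step term suggests
# vulnerability; its later decay suggests resilience. Synthetic data.
import numpy as np
import statsmodels.api as sm

years = np.arange(1975, 2011)
event_year = 1992  # e.g. Hurricane Andrew
step = (years >= event_year).astype(float)

rng = np.random.default_rng(1)
income = (10_000 + 300 * (years - years[0]) - 1_500 * step
          + rng.normal(0, 400, years.size))  # invented drop after the event

X = sm.add_constant(np.column_stack([years - years[0], step]))
fit = sm.OLS(income, X).fit()
print(fit.params)  # [intercept, trend, step]
```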
14 CFR 437.29 - Hazard analysis.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Hazard analysis. 437.29 Section 437.29... Documentation § 437.29 Hazard analysis. (a) An applicant must perform a hazard analysis that complies with § 437.55(a). (b) An applicant must provide to the FAA all the results of each step of the hazard analysis...
14 CFR 437.29 - Hazard analysis.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 14 Aeronautics and Space 4 2011-01-01 2011-01-01 false Hazard analysis. 437.29 Section 437.29... Documentation § 437.29 Hazard analysis. (a) An applicant must perform a hazard analysis that complies with § 437.55(a). (b) An applicant must provide to the FAA all the results of each step of the hazard analysis...
Lunar mission safety and rescue: Hazards analysis and safety requirements
NASA Technical Reports Server (NTRS)
1971-01-01
The results of the hazards analysis are presented; the analysis was concerned only with hazards to personnel, not with loss of equipment or property. Hazards characterization includes the definition of a hazard, the hazard levels, and the hazard groups. The analysis methodology is described in detail. The methodology was used to prepare the top-level functional flow diagrams, to perform the first-level hazards assessment, and to develop a list of conditions and situations requiring individual hazard studies. All 39 individual hazard study results are presented.
Bartlett, John M S; Christiansen, Jason; Gustavson, Mark; Rimm, David L; Piper, Tammy; van de Velde, Cornelis J H; Hasenburg, Annette; Kieback, Dirk G; Putter, Hein; Markopoulos, Christos J; Dirix, Luc Y; Seynaeve, Caroline; Rea, Daniel W
2016-01-01
Hormone receptors, HER2/neu, and Ki-67 are markers of residual risk in early breast cancer. An algorithm (IHC4) combining these markers may provide additional information on residual risk of recurrence in patients treated with hormone therapy. The objective was to independently validate, in the multinational Tamoxifen Versus Exemestane Adjuvant Multicenter Trial (TEAM) cohort, the IHC4 algorithm originally developed on the trans-ATAC (Arimidex, Tamoxifen, Alone or in Combination Trial) cohort, by comparing 2 methodologies. The IHC4 biomarker expression was quantified on TEAM cohort samples (n = 2919) by using 2 independent methodologies (conventional 3,3'-diaminobenzidine [DAB] immunohistochemistry with image analysis and standardized quantitative immunofluorescence [QIF] by AQUA technology). The IHC4 scores were calculated by using the same previously established coefficients and then compared with recurrence-free and distant recurrence-free survival, using multivariate Cox proportional hazards modeling. The QIF model was highly significant for prediction of residual risk (P < .001), with continuous model scores showing a hazard ratio (HR) of 1.012 (95% confidence interval [95% CI]: 1.010-1.014), which was significantly higher than that for the DAB model (HR: 1.008, 95% CI: 1.006-1.009; P < .001). Each model added significant prognostic value in addition to recognized clinical prognostic factors, including nodal status, in multivariate analyses. Quantitative immunofluorescence, however, showed more accuracy with respect to overall residual risk assessment than the DAB model. The use of the IHC4 algorithm was validated on the TEAM trial for predicting residual risk in patients with breast cancer. These data support the use of the IHC4 algorithm clinically, but quantitative and standardized approaches need to be used.
Inayat-Hussain, Salmaan H; Fukumura, Masao; Muiz Aziz, A; Jin, Chai Meng; Jin, Low Wei; Garcia-Milian, Rolando; Vasiliou, Vasilis; Deziel, Nicole C
2018-08-01
Recent trends have witnessed the global growth of unconventional oil and gas (UOG) production. Epidemiologic studies have suggested associations between proximity to UOG operations with increased adverse birth outcomes and cancer, though specific potential etiologic agents have not yet been identified. To perform effective risk assessment of chemicals used in UOG production, the first step of hazard identification followed by prioritization specifically for reproductive toxicity, carcinogenicity and mutagenicity is crucial in an evidence-based risk assessment approach. To date, there is no single hazard classification list based on the United Nations Globally Harmonized System (GHS), with countries applying the GHS standards to generate their own chemical hazard classification lists. A current challenge for chemical prioritization, particularly for a multi-national industry, is inconsistent hazard classification which may result in misjudgment of the potential public health risks. We present a novel approach for hazard identification followed by prioritization of reproductive toxicants found in UOG operations using publicly available regulatory databases. GHS classification for reproductive toxicity of 157 UOG-related chemicals identified as potential reproductive or developmental toxicants in a previous publication was assessed using eleven governmental regulatory agency databases. If there was discordance in classifications across agencies, the most stringent classification was assigned. Chemicals in the category of known or presumed human reproductive toxicants were further evaluated for carcinogenicity and germ cell mutagenicity based on government classifications. A scoring system was utilized to assign numerical values for reproductive health, cancer and germ cell mutation hazard endpoints. Using a Cytoscape analysis, both qualitative and quantitative results were presented visually to readily identify high priority UOG chemicals with evidence of multiple adverse effects. We observed substantial inconsistencies in classification among the 11 databases. By adopting the most stringent classification within and across countries, 43 chemicals were classified as known or presumed human reproductive toxicants (GHS Category 1), while 31 chemicals were classified as suspected human reproductive toxicants (GHS Category 2). The 43 reproductive toxicants were further subjected to analysis for carcinogenic and mutagenic properties. Calculated hazard scores and Cytoscape visualization yielded several high priority chemicals including potassium dichromate, cadmium, benzene and ethylene oxide. Our findings reveal diverging GHS classification outcomes for UOG chemicals across regulatory agencies. Adoption of the most stringent classification with application of hazard scores provides a useful approach to prioritize reproductive toxicants in UOG and other industries for exposure assessments and selection of safer alternatives. Copyright © 2018 Elsevier Ltd. All rights reserved.
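The most-stringent-classification rule and the additive endpoint scoring described above might look roughly like the sketch below; the category ordering follows GHS conventions, but the numeric scores and function names are illustrative assumptions rather than the study's exact scheme.

```python
# Sketch of the prioritization logic: take the most stringent GHS
# classification across agencies, then sum endpoint scores. The numeric
# scores here are illustrative, not the study's actual scoring system.

def most_stringent(categories):
    """E.g. ['2', '1B'] -> '1B'. Lower GHS category = more stringent;
    1A/1B are treated as subdivisions of Category 1."""
    order = ["1A", "1B", "1", "2"]
    return min(categories, key=order.index)

SCORE = {"1A": 3, "1B": 3, "1": 3, "2": 2}

def hazard_score(repro, carc, muta):
    """Additive score over reproductive, cancer and mutagenicity calls;
    None means no classification for that endpoint."""
    return sum(SCORE.get(c, 0) for c in (repro, carc, muta) if c)

# Example: a chemical classified Repro 1B, Carc 1A, Muta 2 after taking
# the most stringent agency call for each endpoint:
print(most_stringent(["2", "1B"]))    # -> '1B'
print(hazard_score("1B", "1A", "2"))  # -> 8
```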
21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 21 Food and Drugs 2 2012-04-01 2012-04-01 false Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan. 123.6 Section 123.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF... processor shall have and implement a written HACCP plan whenever a hazard analysis reveals one or more food...
21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 21 Food and Drugs 2 2010-04-01 2010-04-01 false Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan. 123.6 Section 123.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF... processor shall have and implement a written HACCP plan whenever a hazard analysis reveals one or more food...
Open space suitability analysis for emergency shelter after an earthquake
NASA Astrophysics Data System (ADS)
Anhorn, J.; Khazai, B.
2015-04-01
In an emergency situation, shelter space is crucial for people affected by natural hazards. Emergency planners in disaster relief and mass care can greatly benefit from a sound methodology that identifies suitable shelter areas and sites where shelter services need to be improved. A methodology to rank the suitability of open spaces for contingency planning and placement of shelter in the immediate aftermath of a disaster is introduced. The Open Space Suitability Index combines two different measures: a qualitative evaluation criterion for the suitability and manageability of open spaces to be used as shelter sites, and a quantitative criterion using a capacitated accessibility analysis based on network analysis. For the qualitative assessment, implementation issues, environmental considerations and basic utility supply are the main categories used to rank candidate shelter sites. A geographic information system is used to reveal spatial patterns of shelter demand. Advantages and limitations of this method are discussed on the basis of an earthquake hazard case study in the Kathmandu Metropolitan City. According to the results, out of 410 open spaces under investigation, 12.2% have to be considered not suitable (Categories D and E), while 10.7% are Category A and 17.6% are Category B. Almost two-thirds (59.55%) are fairly suitable (Category C).
Mohamed, Heba M; Lamie, Nesrine T
2016-09-01
In the past few decades, the analytical community has focused on eliminating or reducing the use, in different analytical methodologies, of hazardous chemicals and solvents that have been ascertained to be extremely dangerous to human health and the environment. In this context, environmentally friendly, green, or clean practices have been implemented in different research areas. This study presents a greener alternative to conventional RP-HPLC methods for the simultaneous determination and quantitative analysis of a pharmaceutical ternary mixture composed of telmisartan, hydrochlorothiazide, and amlodipine besylate, using an ecofriendly mobile phase and a short run time with the least amount of waste production. This solvent-replacement approach was feasible without compromising method performance criteria, such as separation efficiency, peak symmetry, and chromatographic retention. The greenness profile of the proposed method was assessed and compared with reported conventional methods using the analytical Eco-Scale as an assessment tool. The proposed method was found to be greener in terms of usage of hazardous chemicals and solvents, energy consumption, and production of waste. The proposed method can be safely used for the routine analysis of the studied pharmaceutical ternary mixture with a minimal detrimental impact on human health and the environment.
Development of risk-based nanomaterial groups for occupational exposure control
NASA Astrophysics Data System (ADS)
Kuempel, E. D.; Castranova, V.; Geraci, C. L.; Schulte, P. A.
2012-09-01
Given the almost limitless variety of nanomaterials, it will be virtually impossible to assess the possible occupational health hazard of each nanomaterial individually. The development of science-based hazard and risk categories for nanomaterials is needed for decision-making about exposure control practices in the workplace. A possible strategy would be to select representative (benchmark) materials from various mode of action (MOA) classes, evaluate the hazard and develop risk estimates, and then apply a systematic comparison of new nanomaterials with the benchmark materials in the same MOA class. Poorly soluble particles are used here as an example to illustrate quantitative risk assessment methods for possible benchmark particles and occupational exposure control groups, given mode of action and relative toxicity. Linking such benchmark particles to specific exposure control bands would facilitate the translation of health hazard and quantitative risk information to the development of effective exposure control practices in the workplace. A key challenge is obtaining sufficient dose-response data, based on standard testing, to systematically evaluate the nanomaterials' physical-chemical factors influencing their biological activity. Categorization processes involve both science-based analyses and default assumptions in the absence of substance-specific information. Utilizing data and information from related materials may facilitate initial determinations of exposure control systems for nanomaterials.
Schaafsma, G; Hertsenberg, A J; Marquart, J
2011-08-01
Within the framework of REACH, an assessment regarding local dermal effects and skin sensitisation should be performed for substances. Quantitative hazard information for these effects is often not available. Furthermore, it is difficult to relate the way in which animals are exposed in dermal toxicity studies directly to dermal exposure in practice. In the absence of quantitative information, a qualitative assessment for dermal effects is the most reasonable option. The qualitative approach as proposed in the REACH guidance recommends only general risk management measures (RMM) for three categories with a low, moderate and high identified hazard, without specifying which RMM are needed for a specific exposure scenario. We propose to differentiate frequency of exposure based on differences in activities and to compare measured and estimated local skin exposure levels with rules of thumb for evaluation of control of risks per hazard category. For workers, specific RMM regimes are assigned to each combination of hazard category and process category (PROC). For consumers, a strategy in which RMM are arranged from product-integrated measures to the use of personal protective equipment (PPE) is presented. Our approach may be transferred into automated assessment tools like Chesar and CEFIC GES. Copyright © 2011 Elsevier Inc. All rights reserved.
Injury risks of EMS responders: evidence from the National Fire Fighter Near-Miss Reporting System
Taylor, Jennifer A; Davis, Andrea L; Barnes, Brittany; Lacovara, Alicia V; Patel, Reema
2015-01-01
Objectives We analysed near-miss and injury events reported to the National Fire Fighter Near-Miss Reporting System (NFFNMRS) to investigate the workplace hazards and safety concerns of Emergency Medical Services (EMS) responders in the USA. Methods We reviewed 769 ‘non-fire emergency event’ reports from the NFFNMRS using a mixed methods approach. We identified 185 emergency medical calls and analysed their narrative text fields. We assigned Mechanism of Near-Miss/Injury and Nature of Injury codes and then tabulated frequencies (quantitative). We coded major themes regarding work hazards and safety concerns reported by the EMS responders (qualitative). Results Of the 185 emergency medical calls, the most commonly identified Mechanisms of Near-Miss/Injury to EMS responders was Assaults, followed by Struck-by Motor Vehicle, and Motor Vehicle Collision. The most commonly identified weapon used in an assault was a firearm. We identified 5 major domains of workplace hazards and safety concerns: Assaults by Patients, Risks from Motor Vehicles, Personal Protective Equipment, Relationships between Emergency Responders, and Policies, Procedures and Practices. Conclusions Narrative text from the NFFNMRS is a rich source of data that can be analysed quantitatively and qualitatively to provide insight into near-misses and injuries sustained by EMS responders. Near-miss reporting systems are critical components for occupational hazard surveillance. PMID:26068510
NASA Astrophysics Data System (ADS)
Baruffini, Mirko
2010-05-01
Due to the topographical conditions in Switzerland, the highways and the railway lines are frequently exposed to natural hazards such as rockfalls, debris flows, landslides, avalanches and others. With the rising incidence of these natural hazards, protection measures become an important political issue. However, they are costly, and maximal protection is most probably not economically feasible. Furthermore, risks are distributed in space and time. Consequently, important decision problems arise for public sector decision makers. This calls for a high level of surveillance and preservation along the transalpine lines. Efficient protection alternatives can consequently be obtained by considering the concept of integral risk management. Risk analysis, as the central part of risk management, has gradually become a generally accepted approach for the assessment of current and future scenarios (Loat & Zimmermann 2004). The procedure aims at risk reduction, which can be reached by conventional mitigation on the one hand and the implementation of land-use planning on the other: a combination of active and passive mitigation measures is applied to prevent damage to buildings, people and infrastructures. With a Geographical Information System coupled to a tool developed for managing risk analysis, it is possible to survey the data in time and space, yielding an important system for managing natural risks. As a framework, we adopt the Swiss system for risk analysis of gravitational natural hazards (BUWAL 1999). It offers a complete framework for the analysis and assessment of risks due to natural hazards, ranging from hazard assessment for gravitational natural hazards, such as landslides, collapses, rockfalls, floods, debris flows and avalanches, to vulnerability assessment and risk analysis, and the integration into land-use planning at the cantonal and municipality level. The scheme is limited to the direct consequences of natural hazards. Thus, we develop a system which integrates the procedures for a complete risk analysis in a Geographic Information System (GIS) toolbox, to be applied to our testbed, the Alps-crossing corridor of St. Gotthard. The simulation environment is developed within ArcObjects, the development platform for ArcGIS. The topic of ArcObjects usually emerges when users realize that programming ArcObjects can actually reduce the amount of repetitive work, streamline the workflow, and even produce functionalities that are not easily available in ArcGIS. We have adopted Visual Basic for Applications (VBA) for programming ArcObjects. Because VBA is already embedded within ArcMap and ArcCatalog, it is convenient for ArcGIS users to program ArcObjects in VBA. Our tool visualises data obtained from the analysis of historical records (aerial photo imagery, field surveys, documentation of past events) or from environmental modelling (estimates of the area affected by a given event), together with event attributes such as route number and route position, and thematic maps. As a result of this step the record appears in the WebGIS. The user can select a specific area for an overview of previous hazards in the region. After performing the analysis, a double click on the visualised infrastructures opens the corresponding results. The constantly updated risk maps show all sites that require more protection against natural hazards. The final goal of our work is to offer a versatile tool for risk analysis which can be applied to different situations.
Today our GIS application mainly centralises the documentation of natural hazards. Additionally, the system offers information about natural hazards along the Gotthard line. It is very flexible and can be used as a simple program to model the expansion of natural hazards, as a program to quantitatively estimate risks, or as a detailed analysis tool at the municipality level. The tool is extensible and can be expanded with additional modules. The initial results of the experimental case study show how useful a GIS-based system can be for effective and efficient disaster response management. In the coming years our GIS application will become a database containing all information needed for the evaluation of risk sites along the Gotthard line. Our GIS application can help the technical management decide about protection measures because, in addition to the visualisation, tools for spatial data analysis will be available. REFERENCES Bründl, M. (Ed.) 2009: Risikokonzept für Naturgefahren - Leitfaden. Nationale Plattform für Naturgefahren PLANAT, Bern, 416 pp. BUWAL 1999: Risikoanalyse bei gravitativen Naturgefahren - Methode, Fallbeispiele und Daten (Risk analyses for gravitational natural hazards). Bundesamt für Umwelt, Wald und Landschaft (BUWAL), Umwelt-Materialien Nr. 107, 1-244. Loat, R. & Zimmermann, M. 2004: La gestion des risques en Suisse (Risk management in Switzerland). In: Veyret, Y., Garry, G. & Meschinet de Richemont, N. (eds): Risques naturels et aménagement en Europe, Colloque Arche de la Défense, 22-24 octobre 2002. Armand Colin, 108-120. Maggi, R. et al. 2009: Evaluation of the optimal resilience for vulnerable infrastructure networks. An interdisciplinary pilot study on the transalpine transportation corridors. NRP 54 "Sustainable Development of the Built Environment", Projekt Nr. 405 440, Final Scientific Report, Lugano.
Billy, T J; Wachsmuth, I K
1997-08-01
Recent outbreaks of foodborne illness and studies by expert groups have established the need for fundamental change in the United States meat and poultry inspection programme to reduce the risk of foodborne illness. The Food Safety and Inspection Service (FSIS) of the United States Department of Agriculture (USDA) has embarked on a broad effort to bring about such change, with particular emphasis on the reduction of pathogenic micro-organisms in raw meat and poultry products. The publication on 25 July 1996 of the Final Rule on pathogen reduction and hazard analysis and critical control point (HACCP) systems was a major milestone in the FSIS strategy for change. The Final Rule provides a framework for change and clarifies the respective roles of industry and government in ensuring the safety of meat and poultry products. With the implementation of this Final Rule underway, the FSIS has been exploring ways in which slaughter inspection carried out under an HACCP-based system can be changed so that food safety risks are addressed more adequately and the allocation of inspection resources is improved further. In addition, the FSIS is broadening the focus of food safety activities to extend beyond slaughter and processing plants by working with industry, academia and other government agencies. Such co-operation should lead to the development of measures to improve food safety before animals reach the slaughter plant and after products leave the inspected establishment for distribution to the retail level. For the future, the FSIS believes that quantitative risk assessments will be at the core of food safety activities. Risk assessments provide the most effective means of identifying how specific pathogens and other hazards may be encountered throughout the farm-to-table chain and of measuring the potential impact of various interventions. In addition, these assessments will be used in the development and evaluation of HACCP systems. The FSIS is currently conducting a quantitative risk assessment for eggs, and several surveys and studies are being performed to supply data needed to conduct other risk assessments. The FSIS has established a food safety research agenda which will fill data gaps.
Flood hazard studies in Central Texas using orbital and suborbital remote sensing imagery
NASA Technical Reports Server (NTRS)
Baker, V. R.; Holz, R. K.; Patton, P. C.
1975-01-01
Central Texas is subject to infrequent, unusually intense rainstorms which cause extremely rapid runoff from drainage basins developed on the deeply dissected limestone and marl bedrock of the Edwards Plateau. One approach to flood hazard evaluation in this area is a parametric model relating flood hydrograph characteristics to quantitative geomorphic properties of the drainage basins. The preliminary model uses multiple regression techniques to predict potential peak flood discharge from basin magnitude, drainage density, and ruggedness number. After mapping small catchment networks from remote sensing imagery, input data for the model are generated by network digitization and analysis by a computer assisted routine of watershed analysis. The study evaluated the network resolution capabilities of the following data formats: (1) large-scale (1:24,000) topographic maps, employing Strahler's "method of v's," (2) standard low altitude black and white aerial photography (1:13,000 and 1:20,000 scales), (3) NASA - generated aerial infrared photography at scales ranging from 1:48,000 to 1:123,000, and (4) Skylab Earth Resources Experiment Package S-190A and S-190B sensors (1:750,000 and 1:500,000 respectively).
The protective role of coastal marshes: a systematic review and meta-analysis.
Shepard, Christine C; Crain, Caitlin M; Beck, Michael W
2011-01-01
Salt marshes lie between many human communities and the coast and have been presumed to protect these communities from coastal hazards by providing important ecosystem services. However, previous characterizations of these ecosystem services have typically been based on a small number of historical studies, and the consistency and extent to which marshes provide these services has not been investigated. Here, we review the current evidence for the specific processes of wave attenuation, shoreline stabilization and floodwater attenuation to determine if and under what conditions salt marshes offer these coastal protection services. We conducted a thorough search and synthesis of the literature with reference to these processes. Seventy-five publications met our selection criteria, and we conducted meta-analyses for publications with sufficient data available for quantitative analysis. We found that combined across all studies (n = 7), salt marsh vegetation had a significant positive effect on wave attenuation as measured by reductions in wave height per unit distance across marsh vegetation. Salt marsh vegetation also had a significant positive effect on shoreline stabilization as measured by accretion, lateral erosion reduction, and marsh surface elevation change (n = 30). Salt marsh characteristics that were positively correlated to both wave attenuation and shoreline stabilization were vegetation density, biomass production, and marsh size. Although we could not find studies quantitatively evaluating floodwater attenuation within salt marshes, there are several studies noting the negative effects of wetland alteration on water quantity regulation within coastal areas. Our results show that salt marshes have value for coastal hazard mitigation and climate change adaptation. Because we do not yet fully understand the magnitude of this value, we propose that decision makers employ natural systems to maximize the benefits and ecosystem services provided by salt marshes and exercise caution when making decisions that erode these services.
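A random-effects meta-analysis of per-study effects, of the kind reported here, is commonly computed with the DerSimonian-Laird estimator; the sketch below shows the mechanics on synthetic effect sizes and variances (seven studies, echoing the wave-attenuation analysis, but not its actual data).

```python
# DerSimonian-Laird random-effects meta-analysis sketch on synthetic
# per-study effect sizes (e.g. wave-height reduction) and variances.
import numpy as np

effects = np.array([0.60, 0.45, 0.80, 0.55, 0.70, 0.50, 0.65])
variances = np.array([0.04, 0.05, 0.06, 0.03, 0.05, 0.04, 0.06])

w = 1.0 / variances                     # fixed-effect weights
fixed = np.sum(w * effects) / np.sum(w)
Q = np.sum(w * (effects - fixed) ** 2)  # heterogeneity statistic
df = len(effects) - 1
c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (Q - df) / c)           # between-study variance

w_re = 1.0 / (variances + tau2)         # random-effects weights
summary = np.sum(w_re * effects) / np.sum(w_re)
se = np.sqrt(1.0 / np.sum(w_re))
print(f"summary effect {summary:.2f} "
      f"(95% CI {summary - 1.96 * se:.2f} to {summary + 1.96 * se:.2f})")
```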
Martinez-Fiestas, Myriam; Rodríguez-Garzón, Ignacio; Delgado-Padial, Antonio; Lucas-Ruiz, Valeriano
2017-09-01
This article presents a cross-cultural study on perceived risk in the construction industry. Worker samples from three different countries were studied: Spain, Peru and Nicaragua. The main goal was to explain how construction workers perceive their occupational hazards and to analyze how this perception is related to their national culture. The model used to measure perceived risk was the psychometric paradigm. The results show three very similar profiles, indicating that risk perception is independent of nationality. A cultural analysis was conducted using the Hofstede model. The results of this analysis and its relation to perceived risk showed that risk perception in construction is independent of national culture. Finally, a multiple linear regression analysis was conducted to determine which qualitative attributes could predict the overall quantitative magnitude of risk perception. All of the findings have important implications regarding the management of safety in the workplace.
Quantitative morphometric analysis for the tectonic characterisation of northern Tunisia.
NASA Astrophysics Data System (ADS)
Camafort, Miquel; Pérez-Peña, José Vicente; Booth-Rea, Guillermo; Ranero, César R.; Gràcia, Eulàlia; Azañón, José Miguel; Melki, Fetheddine; Ouadday, Mohamed
2016-04-01
Northern Tunisia is characterized by low deformation rates and low to moderate seismicity. Although instrumental seismicity reaches maximum magnitudes of Mw 5.5, some historical earthquakes have occurred with catastrophic consequences in this region. Aiming to improve our knowledge of active tectonics in Tunisia, we carried out both a quantitative morphometric analysis and a field study in the north-western region. We applied different morphometric tools, like river profiles, knickpoint analysis, hypsometric curves and integrals, and drainage pattern anomalies, in order to differentiate between zones with high or low recent tectonic activity. This analysis helps identify uplift and subsidence zones, which we relate to fault activity. Several active faults in a sparse distribution were identified. A selected sector was studied with a field campaign to test the results obtained with the quantitative analysis. During the fieldwork we identified geological evidence of recent activity and a considerable seismogenic potential along the El Alia-Teboursouk (ETF) and Dkhila (DF) faults. The ETF fault could be responsible for one of the most devastating historical earthquakes in northern Tunisia, which destroyed Utique in 412 A.D. Geological evidence includes fluvial terraces folded by faults, striated and cracked pebbles, clastic dikes, sand volcanoes, coseismic cracks, etc. Although not reflected in the instrumental seismicity, our results support an important seismic hazard, evidenced by the several active tectonic structures identified and the two seismogenic faults described. Having established the current active tectonic framework of Tunisia, we discuss our results within the western Mediterranean setting, trying to contribute to the understanding of its tectonic context. With our results, we suggest that the main reason for the sparse and scarce seismicity of the area, in contrast with the adjacent parts of the Nubia-Eurasia boundary, is its extended continental platform and its lack of proto-oceanic crust northward.
NASA Astrophysics Data System (ADS)
Manimaran, P.; Narayana, A. C.
2018-07-01
In this paper, we study the multifractal characteristics and cross-correlation behaviour of Air Pollution Index (API) time series data through the multifractal detrended cross-correlation analysis (MF-DCCA) method. We analyse the daily API records of nine air pollutants of the University of Hyderabad campus for a period of three years (2013-2016). The cross-correlation behaviour has been quantified through the Hurst scaling exponents and the singularity spectrum. From the results, we find that the cross-correlation analysis shows anti-correlated behaviour for all 36 possible bivariate time series. We also observe the existence of multifractal nature in all the bivariate time series, many of which show strong multifractal behaviour. In particular, the hazardous particulate matter PM2.5 and the inhalable particulate matter PM10 show anti-correlated behaviour with all air pollutants.
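The q = 2 backbone of MF-DCCA can be sketched as detrended cross-correlation analysis: integrate the series, detrend them window by window, and compare the cross-fluctuation with the individual fluctuations. The code below is a simplified single-scale illustration on random stand-in series, not the paper's full multifractal analysis.

```python
# Simplified detrended cross-correlation sketch (the q = 2 backbone of
# MF-DCCA): profile the series, detrend in windows, and compare the
# cross-fluctuation to the individual fluctuations (rho_DCCA).
import numpy as np

def dcca_fluctuations(x, y, scale):
    X, Y = np.cumsum(x - x.mean()), np.cumsum(y - y.mean())
    n_win = len(X) // scale
    f2_xy, f2_xx, f2_yy = [], [], []
    t = np.arange(scale)
    for i in range(n_win):
        xs, ys = X[i*scale:(i+1)*scale], Y[i*scale:(i+1)*scale]
        # Local linear detrending in each window
        rx = xs - np.polyval(np.polyfit(t, xs, 1), t)
        ry = ys - np.polyval(np.polyfit(t, ys, 1), t)
        f2_xy.append(np.mean(rx * ry))
        f2_xx.append(np.mean(rx ** 2))
        f2_yy.append(np.mean(ry ** 2))
    return np.mean(f2_xy), np.mean(f2_xx), np.mean(f2_yy)

rng = np.random.default_rng(3)
x = rng.normal(size=1000)  # stand-in for one pollutant series
y = rng.normal(size=1000)  # stand-in for another pollutant series
fxy, fxx, fyy = dcca_fluctuations(x, y, scale=32)
rho = fxy / np.sqrt(fxx * fyy)  # in [-1, 1]; negative => anti-correlated
print(rho)
```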
Collection and Extraction of Occupational Air Samples for Analysis of Fungal DNA.
Lemons, Angela R; Lindsley, William G; Green, Brett J
2018-05-02
Traditional methods of identifying fungal exposures in occupational environments, such as culture and microscopy-based approaches, have several limitations that have resulted in the exclusion of many species. Advances in the field over the last two decades have led occupational health researchers to turn to molecular-based approaches for identifying fungal hazards. These methods have resulted in the detection of many species within indoor and occupational environments that have not been detected using traditional methods. This protocol details an approach for determining fungal diversity within air samples through genomic DNA extraction, amplification, sequencing, and taxonomic identification of fungal internal transcribed spacer (ITS) regions. ITS sequencing results in the detection of many fungal species that are either not detected or difficult to identify to species level using culture or microscopy. While these methods do not provide quantitative measures of fungal burden, they offer a new approach to hazard identification and can be used to determine overall species richness and diversity within an occupational environment.
Sumner, John; Ross, Tom; Jenson, Ian; Pointon, Andrew
2005-11-25
A risk profile of microbial hazards across the supply continuum for the beef, sheep and goat meat industries was developed using both a qualitative tool and a semi-quantitative, spreadsheet tool, Risk Ranger. The latter is useful for highlighting factors contributing to food safety risk and for ranking the risk of various product/pathogen combinations. In the present profile the qualitative tool was used as a preliminary screen for a wide range of hazard-product pairings while Risk Ranger was used to rank in order of population health risk pairings for which quantitative data were available and for assessing the effect of hypothetical scenarios. 'High' risk hazard-product pairings identified were meals contaminated with Clostridium perfringens provided by caterers which have not implemented HACCP; kebabs cross-contaminated by Salmonella present in drip trays or served undercooked; meals served in the home cross-contaminated with Salmonella. 'Medium' risk hazard-product pairings identified were ready-to-eat meats contaminated with Listeria monocytogenes and which have extended shelf life; Uncooked Comminuted Fermented Meat (UCFM)/Salami contaminated with Enterohaemorrhagic E. coli (EHEC) and Salmonella; undercooked hamburgers contaminated with EHEC; kebabs contaminated by Salmonella under normal production or following final "flash" heating. Identified 'low' risk hazard-product pairings included cooked, ready-to-eat sausages contaminated with Salmonella; UCFM/Salami contaminated with L. monocytogenes; well-cooked hamburgers contaminated with EHEC. The risk profile provides information of value to Australia's risk managers in the regulatory, processing and R&D sectors of the meat and meat processing industry for the purposes of identifying food safety risks in the industry and for prioritising risk management actions.
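Risk Ranger itself is a spreadsheet that combines a series of weighted questions into a logarithmic 0-100 risk ranking; the sketch below shows a drastically simplified scoring in the same spirit, with an invented factor set and scaling that should not be mistaken for the tool's actual weightings.

```python
# Simplified semi-quantitative ranking in the spirit of Risk Ranger.
# The factors and the log mapping below are illustrative assumptions.
import math

def risk_rank(severity, exposures_per_year, p_illness_per_exposure,
              population):
    """Expected cases/year, mapped onto a logarithmic 0-100 scale."""
    cases = (severity * exposures_per_year * p_illness_per_exposure
             * population)
    # Map ~1e-8 cases/yr -> 0 and ~1e6 cases/yr -> 100 (log scale).
    score = 100 * (math.log10(max(cases, 1e-8)) + 8) / 14
    return min(max(score, 0.0), 100.0)

# Hypothetical pairing in the style of the profile, e.g. undercooked
# hamburgers contaminated with EHEC (all numbers invented):
print(risk_rank(severity=1.0, exposures_per_year=12,
                p_illness_per_exposure=1e-6, population=2.5e7))
```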
Lancaster, Kathryn E; MacLean, Sarah A; Lungu, Thandie; Mmodzi, Pearson; Hosseinipour, Mina C; Hershow, Rebecca B; Powers, Kimberly A; Pence, Brian W; Hoffman, Irving F; Miller, William C; Go, Vivian F
2018-04-16
Alcohol use is pervasive among female sex workers (FSW), placing them at increased risk of violence and sexual risk behaviors. FSW often live and work where alcohol use is highly normative. This study examines the socioecological influences on hazardous alcohol use among FSW in Malawi. In 2014, 200 FSW identified through venue-based sampling in Lilongwe, Malawi, completed a quantitative behavioral survey, with a sub-sample participating in qualitative interviews. Multivariable log-binomial regression was used to identify associations between hazardous alcohol use (AUDIT score ≥ 7) and time in sex work, clients per week, unprotected sex, alcohol use with clients, and living environment. Qualitative interviews enhanced findings from the quantitative data and identified emergent themes around socioecological influences on alcohol use. Over 50% reported hazardous alcohol use and lived in an alcohol-serving venue. Hazardous alcohol use was associated with sex work duration of ≥2 years (aPR: 1.30; 95% CI: 1.02, 1.65) and alcohol use at last sex with a client (aPR: 1.29; 95% CI: 1.06, 1.57). FSW perceived alcohol as a facilitator of sex work, reducing inhibitions and attracting clients, but acknowledged that alcohol leads to violence and/or unprotected sex. Despite these risks and a motivation to reduce use, FSW feared that refusing to drink would be tantamount to turning away clients. Although FSW recognized alcohol-related risks, the norms and power dynamics of sex work perpetuated hazardous alcohol use. Multilevel interventions are needed to collectively change norms around drinking and sex work so that FSW can reduce alcohol consumption when engaging in their work.
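Prevalence ratios such as the aPRs above come from a GLM with a log link on a binary outcome. Because log-binomial models often fail to converge, a modified Poisson model with robust standard errors is a common stand-in; the sketch below uses statsmodels on synthetic data, with variable names and coding that are assumptions for illustration.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "hazardous": rng.integers(0, 2, n),   # AUDIT >= 7 (hypothetical coding)
    "years_ge2": rng.integers(0, 2, n),   # >= 2 years in sex work
    "alc_client": rng.integers(0, 2, n),  # alcohol at last sex with a client
})
X = sm.add_constant(df[["years_ge2", "alc_client"]])

# modified Poisson with robust (HC0) errors, a common stand-in for log-binomial
res = sm.GLM(df["hazardous"], X, family=sm.families.Poisson()).fit(cov_type="HC0")
print(np.exp(res.params))                 # prevalence ratios (cf. aPR 1.30, 1.29)
```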
Volcano collapse promoted by hydrothermal alteration and edifice shape, Mount Rainier, Washington
Reid, M.E.; Sisson, T.W.; Brien, D.L.
2001-01-01
Catastrophic collapses of steep volcano flanks threaten many populated regions, and understanding factors that promote collapse could save lives and property. Large collapses of hydrothermally altered parts of Mount Rainier have generated far-traveled debris flows; future flows would threaten densely populated parts of the Puget Sound region. We evaluate edifice collapse hazards at Mount Rainier using a new three-dimensional slope stability method incorporating detailed geologic mapping and subsurface geophysical imaging to determine distributions of strong (fresh) and weak (altered) rock. Quantitative three-dimensional slope stability calculations reveal that sizeable flank collapse (>0.1 km3) is promoted by voluminous, weak, hydrothermally altered rock situated high on steep slopes. These conditions exist only on Mount Rainier's upper west slope, consistent with the Holocene debris-flow history. Widespread alteration on lower flanks or concealed in regions of gentle slope high on the edifice does not greatly facilitate collapse. Our quantitative stability assessment method can also provide useful hazard predictions using reconnaissance geologic information and is a potentially rapid and inexpensive new tool for aiding volcano hazard assessments.
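Although the study uses a full three-dimensional limit-equilibrium method, the role of weak altered rock can be illustrated with the classic one-dimensional infinite-slope factor of safety, where lowering cohesion and friction angle for altered material drives FS below 1. All parameter values below are hypothetical.

```python
import numpy as np

def factor_of_safety(c, phi_deg, gamma, z, theta_deg, u=0.0):
    """Infinite-slope FS = resisting shear stress / driving shear stress.
    c: cohesion (kPa), phi: friction angle (deg), gamma: unit weight (kN/m3),
    z: failure depth (m), theta: slope angle (deg), u: pore pressure (kPa)."""
    th, ph = np.radians(theta_deg), np.radians(phi_deg)
    resisting = c + (gamma * z * np.cos(th) ** 2 - u) * np.tan(ph)
    driving = gamma * z * np.sin(th) * np.cos(th)
    return resisting / driving

print(factor_of_safety(c=150, phi_deg=40, gamma=26, z=50, theta_deg=35))  # fresh rock, FS > 1
print(factor_of_safety(c=30, phi_deg=25, gamma=24, z=50, theta_deg=35))   # altered rock, FS < 1
```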
Characterizing the nature and variability of avalanche hazard in western Canada
NASA Astrophysics Data System (ADS)
Shandro, Bret; Haegeli, Pascal
2018-04-01
The snow and avalanche climate types maritime, continental and transitional are well established and have been used extensively to characterize the general nature of avalanche hazard at a location, to study inter-seasonal and large-scale spatial variability, and to provide context for the design of avalanche safety operations. While researchers and practitioners have an experience-based understanding of the avalanche hazard associated with the three climate types, no studies have described the hazard character of an avalanche climate in detail. Since the 2009/2010 winter, the consistent use of the Statham et al. (2017) conceptual model of avalanche hazard in public avalanche bulletins in Canada has created a new quantitative record of avalanche hazard that offers novel opportunities for addressing this knowledge gap. We identified typical daily avalanche hazard situations using self-organizing maps (SOMs) and then calculated seasonal prevalence values of these situations. This approach produces a concise characterization that is conducive to statistical analysis, yet still provides a comprehensive picture that is informative for avalanche risk management due to its link to avalanche problem types. Hazard situation prevalence values for individual seasons, elevation bands and forecast regions provide unprecedented insight into the inter-seasonal and spatial variability of avalanche hazard in western Canada.
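A minimal sketch of the SOM step using the third-party minisom package: each bulletin day becomes a feature vector, each trained map node becomes one "typical hazard situation", and prevalence is the fraction of days mapped to each node. The 8-feature encoding and the 4x3 map size are assumptions, not the paper's configuration.

```python
import numpy as np
from minisom import MiniSom  # third-party: pip install minisom

rng = np.random.default_rng(1)
days = rng.random((500, 8))   # hypothetical numeric encoding of daily hazard bulletins

som = MiniSom(4, 3, 8, sigma=1.0, learning_rate=0.5, random_seed=1)
som.train_random(days, 5000)

nodes = [som.winner(d) for d in days]  # typical situation assigned to each day
prevalence = {n: nodes.count(n) / len(nodes) for n in set(nodes)}
print(prevalence)                      # seasonal prevalence of each hazard situation
```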
NASA Astrophysics Data System (ADS)
Luna, B. Quan; Blahut, J.; van Westen, C. J.; Sterlacchini, S.; van Asch, T. W. J.; Akbas, S. O.
2011-07-01
For a quantitative assessment of debris flow risk, it is essential to consider not only the hazardous process itself but also to perform an analysis of its consequences. This should include the estimation of the expected monetary losses as the product of the hazard with a given magnitude and the vulnerability of the elements exposed. A quantifiable integrated approach to both hazard and vulnerability is becoming a required practice in risk reduction management. This study aims at developing physical vulnerability curves for debris flows through the use of a dynamic run-out model. Dynamic run-out models for debris flows are able to calculate physical outputs (extension, depths, velocities, impact pressures) and to determine the zones where the elements at risk could suffer an impact. These results can then be applied to consequence analyses and risk calculations. On 13 July 2008, after more than two days of intense rainfall, several debris and mud flows were released in the central part of the Valtellina Valley (Lombardy Region, Northern Italy). One of the largest debris flow events occurred in a village called Selvetta. The debris flow event was reconstructed after extensive field work and interviews with local inhabitants and civil protection teams. The Selvetta event was modelled with the FLO-2D program, an Eulerian formulation with a finite-difference numerical scheme that requires the specification of an input hydrograph. The internal stresses are isotropic and the basal shear stresses are calculated using a quadratic model. The behaviour and run-out of the flow were reconstructed. The significance of the calculated values of flow depth, velocity, and pressure was investigated in terms of the resulting damage to the affected buildings. The physical damage was quantified for each affected structure within the context of physical vulnerability, which was calculated as the ratio between the monetary loss and the reconstruction value. Three different empirical vulnerability curves were obtained, which are functions of debris flow depth, impact pressure, and kinematic viscosity, respectively. A quantitative approach to estimate the vulnerability of an exposed element to a debris flow, independent of the temporal occurrence of the hazard event, is presented.
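Fitting an empirical vulnerability curve from matched intensity-damage pairs is a short exercise with SciPy. The sketch below fits one plausible functional form, V(h) = 1 - exp(-a h^b), to hypothetical depth/MDR points; the functional form and data are assumptions, not the paper's fitted curves.

```python
import numpy as np
from scipy.optimize import curve_fit

depth = np.array([0.4, 0.8, 1.2, 1.8, 2.5])      # flow depth (m), hypothetical
mdr = np.array([0.05, 0.18, 0.35, 0.60, 0.85])   # monetary loss / reconstruction value

def vuln(h, a, b):
    """One plausible bounded vulnerability form: 0 at h=0, saturating at 1."""
    return 1.0 - np.exp(-a * h ** b)

(a, b), _ = curve_fit(vuln, depth, mdr, p0=[0.5, 1.5])
print(vuln(1.0, a, b))   # predicted damage ratio at 1 m flow depth
```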
Establish an Agent-Simulant Technology Relationship (ASTR)
2017-04-14
for quantitative measures that characterize simulant performance in testing, such as the ability to be removed from surfaces. Component-level ASTRs...Overall Test and Agent-Simulant Technology Relationship (ASTR) process. 1.2 Background. a. Historically, many tests did not develop quantitative ...methodology report14. Report provides a VX-TPP ASTR for post-decon contact hazard and off-gassing. In the Stryker production verification test (PVT
BYMUR software: a free and open source tool for quantifying and visualizing multi-risk analyses
NASA Astrophysics Data System (ADS)
Tonini, Roberto; Selva, Jacopo
2013-04-01
The BYMUR software aims to provide an easy-to-use open source tool for computing multi-risk and for managing, visualizing and comparing all the inputs (e.g., hazard, fragilities and exposure) as well as the corresponding results (e.g., risk curves, risk indexes). For all inputs, a complete management of inter-model epistemic uncertainty is considered. The BYMUR software will be one of the final products of the homonymous ByMuR project (http://bymur.bo.ingv.it/), funded by the Italian Ministry of Education, Universities and Research (MIUR), which aims to (i) provide a quantitative and objective general method for a comprehensive long-term multi-risk analysis in a given area, accounting for inter-model epistemic uncertainty through Bayesian methodologies, and (ii) apply the methodology to seismic, volcanic and tsunami risks in Naples (Italy). More specifically, the BYMUR software will separately account for the probabilistic hazard assessment of different kinds of hazardous phenomena, the relative (time-dependent/independent) vulnerabilities and exposure data, and their possible (predefined) interactions: the software will analyze these inputs and use them to estimate both the single- and multi-risk associated with a specific target area. In addition, it will be possible to connect the software to further tools (e.g., a full hazard analysis), allowing dynamic I/O of results. The use of the Python programming language guarantees that the final software will be open source and platform independent. Moreover, thanks to the integration of some of the most popular and feature-rich Python scientific modules (NumPy, Matplotlib, SciPy) with the wxPython graphical user toolkit, the final tool will be equipped with a comprehensive Graphical User Interface (GUI) able to control and visualize (in the form of tables, maps and/or plots) any stage of the multi-risk analysis. The additional features of importing/exporting data in MySQL databases and/or standard XML formats (for instance, the global standards defined in the frame of the GEM project for seismic hazard and risk) will ensure interoperability with other FOSS software and tools and, at the same time, keep the tool readily available to the geo-scientific community. An already available example of connection is represented by the BET_VH(**) tool, whose probabilistic volcanic hazard outputs will be used as input for BYMUR. Finally, the prototype version of BYMUR will be applied to the case study of the municipality of Naples, considering three different natural hazards (volcanic eruptions, earthquakes and tsunamis) and assessing the consequent long-term risk. (**)BET_VH (Bayesian Event Tree for Volcanic Hazard) is a probabilistic tool for long-term volcanic hazard assessment, recently re-designed and adjusted to run on the Vhub cyber-infrastructure, a free web-based collaborative tool for volcanology research (see http://vhub.org/resources/betvh).
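At its core, a single-hazard risk estimate of the kind such a tool combines is a sum over intensity bins of occurrence probability x vulnerability x exposure. Below is a minimal NumPy sketch under assumed hazard, vulnerability, and exposure values; it illustrates the arithmetic only and is not the project's data or API.

```python
import numpy as np

intensity = np.array([0.1, 0.2, 0.3, 0.4, 0.5])        # e.g. PGA (g)
p_annual = np.array([0.05, 0.02, 0.008, 0.003, 0.001]) # annual occurrence per bin
mdr = 1.0 - np.exp(-3.0 * intensity)                   # assumed vulnerability model
exposure = 1.0e8                                       # value at risk (euros)

expected_annual_loss = float(np.sum(p_annual * mdr) * exposure)
print(expected_annual_loss)
# repeating this per hazard (seismic, volcanic, tsunami) and aggregating gives a
# first-order multi-risk index for the target area
```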
Rocky Mountain Research Station USDA Forest Service
2005-01-01
The Guide to Fuel Treatments analyzes a range of potential silvicultural thinnings and surface fuel treatments for 25 representative dry-forest stands in the Western United States. The guide provides quantitative guidelines and visualization for treatment based on scientific principles identified for reducing potential crown fires. This fact sheet identifies the...
Yan, Fang; Xu, Kaili; Li, Deshun; Cui, Zhikai
2017-01-01
Biomass gasification stations face many hazard factors; it is therefore necessary to perform hazard assessments for them. In this study, a novel hazard assessment method called extended set pair analysis (ESPA) is proposed based on set pair analysis (SPA). In conventional SPA, calculating the connection degree (CD) requires hazard grades and their corresponding thresholds to be classified in advance. For hazard assessment with ESPA, a novel algorithm for calculating the CD is worked out for cases where hazard grades and their corresponding thresholds are unknown. The CD can then be converted into a Euclidean distance (ED) by a simple and concise calculation, and the hazard of each sample is ranked based on the value of the ED. In this paper, six biomass gasification stations are assessed using ESPA and general set pair analysis (GSPA), respectively. Comparison of the hazard assessment results obtained from ESPA and GSPA demonstrates the availability and validity of ESPA for the hazard assessment of biomass gasification stations. The reasonableness of ESPA is also justified by a sensitivity analysis of the hazard assessment results obtained by ESPA and GSPA. PMID:28938011
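The paper's ESPA algorithm is not reproduced here, but the SPA building blocks are simple: express each normalized indicator as identity/discrepancy/contrary degrees mu = (a, b, c) with a + b + c = 1, then rank samples by Euclidean distance from the ideal (1, 0, 0). The mapping convention in this sketch is one illustrative choice, not the authors' formula.

```python
import numpy as np

def connection_degree(x, best, worst):
    """Map a normalized indicator onto (identity, discrepancy, contrary) degrees.
    One simple illustrative convention; ESPA's algorithm differs in detail."""
    t = np.clip((x - worst) / (best - worst), 0.0, 1.0)
    a, c = t, 1.0 - t
    b = 1.0 - np.abs(2.0 * t - 1.0)       # discrepancy largest mid-range
    mu = np.array([a, b, c])
    return mu / mu.sum()

def euclid_rank(mu):
    """Distance from the ideal pair (1, 0, 0); smaller means lower hazard."""
    return np.linalg.norm(mu - np.array([1.0, 0.0, 0.0]))

mu_a = connection_degree(0.8, best=1.0, worst=0.0)  # one sample indicator
print(mu_a, euclid_rank(mu_a))
```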
Schreiber, Stefanie; Landau, Susan M; Fero, Allison; Schreiber, Frank; Jagust, William J
2015-10-01
The applicability of β-amyloid peptide (Aβ) positron emission tomography (PET) as a biomarker in clinical settings to aid in selection of individuals at preclinical and prodromal Alzheimer disease (AD) will depend on the practicality of PET image analysis. In this context, visual-based Aβ PET assessment seems to be the most feasible approach. To determine the agreement between visual and quantitative Aβ PET analysis and to assess the ability of both techniques to predict conversion from mild cognitive impairment (MCI) to AD. A longitudinal study was conducted among the Alzheimer's Disease Neuroimaging Initiative (ADNI) sites in the United States and Canada during a 1.6-year mean follow-up period. The study was performed from September 21, 2010, to August 11, 2014; data analysis was conducted from September 21, 2014, to May 26, 2015. Participants included 401 individuals with MCI receiving care at a specialty clinic (219 [54.6%] men; mean [SD] age, 71.6 [7.5] years; 16.2 [2.7] years of education). All participants were studied with florbetapir F 18 [18F] PET. The standardized uptake value ratio (SUVR) positivity threshold was 1.11, and one reader rated all images, with a subset of 125 scans rated by a second reader. Sensitivity and specificity of positive and negative [18F] florbetapir PET categorization, which was estimated with cerebrospinal fluid Aβ1-42 as the reference standard. Risk for conversion to AD was assessed using Cox proportional hazards regression models. The frequency of Aβ positivity was 48.9% (196 patients; visual analysis), 55.1% (221 patients; SUVR), and 64.8% (166 patients; cerebrospinal fluid), yielding substantial agreement between visual and SUVR data (κ = 0.74) and between all methods (Fleiss κ = 0.71). For approximately 10% of the 401 participants in whom visual and SUVR data disagreed, interrater reliability was moderate (κ = 0.44), but it was very high if visual and quantitative results agreed (κ = 0.92). Visual analysis had a lower sensitivity (79% vs 85%) but higher specificity (96% vs 90%), respectively, compared with SUVR. The conversion rate was 15.2% within a mean of 1.6 years, and a positive [18F] florbetapir baseline scan was associated with a 6.91-fold (SUVR) or 11.38-fold (visual) greater hazard for AD conversion, which changed only modestly after covariate adjustment for apolipoprotein ε4, concurrent fludeoxyglucose F 18 PET scan, and baseline cognitive status. Visual and SUVR Aβ PET analysis may be equivalently used to determine Aβ status for individuals with MCI participating in clinical trials, and both approaches add significant value for clinical course prognostication.
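The conversion-hazard estimates above correspond to a standard Cox proportional hazards fit; here is a minimal sketch with the lifelines package on synthetic follow-up data. The coding, event rates, and censoring time are assumptions, not the ADNI data.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter  # third-party: pip install lifelines

rng = np.random.default_rng(7)
n = 120
pos = rng.integers(0, 2, n)                      # amyloid-positive at baseline
t = rng.exponential(1.0 / (0.08 + 0.30 * pos))   # faster conversion if positive
df = pd.DataFrame({"years": np.minimum(t, 1.6),  # censor at ~1.6 y follow-up
                   "converted": (t < 1.6).astype(int),
                   "amyloid_pos": pos})

cph = CoxPHFitter().fit(df, duration_col="years", event_col="converted")
print(cph.hazard_ratios_)   # cf. the reported 6.91x (SUVR) and 11.38x (visual)
```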
Neo-Deterministic and Probabilistic Seismic Hazard Assessments: a Comparative Analysis
NASA Astrophysics Data System (ADS)
Peresan, Antonella; Magrin, Andrea; Nekrasova, Anastasia; Kossobokov, Vladimir; Panza, Giuliano F.
2016-04-01
Objective testing is the key issue for any reliable seismic hazard assessment (SHA). Different earthquake hazard maps must demonstrate their capability in anticipating ground shaking from future strong earthquakes before they can appropriately be used for purposes such as engineering design, insurance, and emergency management. Quantitative assessment of map performance is also an essential step in the scientific process of their revision and possible improvement. Cross-checking of probabilistic models against available observations and independent physics-based models is recognized as a major validation procedure. The existing maps from classical probabilistic seismic hazard analysis (PSHA), as well as those from neo-deterministic analysis (NDSHA), which have already been developed for several regions worldwide (including Italy, India and North Africa), are considered to exemplify the possibilities of cross-comparative analysis in spotting the limits and advantages of the different methods. Where the data permit, a comparative analysis against documented seismic activity is carried out, showing how available observations of past earthquakes can contribute to assessing the performance of the different methods. Neo-deterministic refers to a scenario-based approach, which allows for consideration of a wide range of possible earthquake sources as the starting point for scenarios constructed via full-waveform modeling. The method does not make use of empirical attenuation models (i.e., Ground Motion Prediction Equations, GMPE) and naturally supplies realistic time series of ground shaking (i.e., complete synthetic seismograms), readily applicable to complete engineering analysis and other mitigation actions. The standard NDSHA maps provide reliable envelope estimates of maximum seismic ground motion from a wide set of possible scenario earthquakes, including the largest deterministically or historically defined credible earthquake. In addition, the flexibility of NDSHA allows for the generation of ground shaking maps at specified long-term return times, which permits a straightforward comparison between NDSHA and PSHA maps in terms of average rates of exceedance for specified time windows. The comparison of NDSHA and PSHA maps, particularly for very long recurrence times, may indicate to what extent probabilistic ground shaking estimates are consistent with those from physical models of seismic wave propagation. A systematic comparison over the territory of Italy is carried out exploiting the uniqueness of the Italian earthquake catalogue, a data set covering more than a millennium (a time interval about ten times longer than that available in most regions worldwide) with a satisfactory completeness level for M>5, which lends confidence to the results of the analysis. By analysing in some detail the seismicity of the Vrancea region, we show that well-constrained macroseismic field information for individual earthquakes can provide useful information about the reliability of ground shaking estimates. Finally, in order to generalise the observations, the comparative analysis is extended to further regions where both standard NDSHA and PSHA maps are available (e.g., the State of Gujarat, India). The final Global Seismic Hazard Assessment Program (GSHAP) results and the most recent version of the Seismic Hazard Harmonization in Europe (SHARE) project maps, along with other national-scale probabilistic maps, all obtained by PSHA, are considered for this comparative analysis.
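The "average rates of exceedance for specified time windows" used to align NDSHA and PSHA maps follow the standard Poisson convention linking return period and exceedance probability. A short worked example of that convention:

```python
import numpy as np

# Poisson-process link behind PSHA maps: P(exceedance in T years) = 1 - exp(-T/tau)
T, p = 50.0, 0.10                    # the common "10% in 50 years" design level
tau = -T / np.log(1.0 - p)           # return period, ~474.6 years
rate = 1.0 / tau                     # mean annual rate of exceedance
p_1yr = 1.0 - np.exp(-rate)          # annual exceedance probability
print(tau, rate, p_1yr)
```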
Kundzewicz, Z. W.; Krysanova, V.; Dankers, R.; Hirabayashi, Y.; Kanae, S.; Hattermann, F. F.; Huang, S.; Milly, Paul C.D.; Stoffel, M.; Driessen, P.P.J.; Matczak, P.; Quevauviller, P.; Schellnhuber, H.-J.
2017-01-01
This paper interprets differences in flood hazard projections over Europe and identifies likely sources of the discrepancies. Further, it discusses the potential implications of these differences for flood risk reduction and adaptation to climate change. The discrepancy among flood hazard projections calls for caution, especially among decision makers in charge of water resources management, flood risk reduction, and climate change adaptation at regional to local scales. Because it is naïve to expect trustworthy quantitative projections of future flood hazard, flood risk reduction should focus on mapping current and future risks and vulnerability hotspots and on improving the situation there. Although an intercomparison of flood hazard projections is carried out in this paper and differences are identified and interpreted, it does not seem possible to recommend which large-scale studies should be considered most credible in particular areas of Europe.
NASA Astrophysics Data System (ADS)
Blauhut, Veit; Stölzle, Michael; Stahl, Kerstin
2017-04-01
Drought-induced low-flow extremes can, despite a variety of management strategies, cause direct and indirect impacts on the socio-economic and ecological functions of rivers. These negative effects determine local risk as a function of the regional drought hazard and the river system's vulnerability. Whereas drought risk analysis is known to be essential for drought management, risk analysis for low flow is less common. Where no distributed hydrological models exist, only the local hazard at gauging stations is available to represent the entire catchment, and vulnerability information is only sparsely available. Hence, a comprehensive understanding of the drivers of low-flow risk along the longitudinal river profile is often lacking. For two different rivers in southwestern Germany, this study analysed the major low-flow events of the past five decades. Applying a transdisciplinary approach, the hazard component was assessed by hydro-climatic analysis, hydrological modelling and forward-looking stress-test scenarios; the vulnerability component was estimated by a combination of impact assessment and vulnerability estimation, based on stakeholder workshops, questionnaires and regional characteristics. The results show distinct differences in low-flow risk between the catchments and along each river. These differences are due to hydrogeological characteristics that govern groundwater-surface water interaction and to catchment-specific anthropogenic stimuli, such as low-flow decrease from near-stream groundwater pumping for public water supply or low-flow augmentation by treatment plant discharge. Thus, low-flow risk is anthropogenically influenced in both directions, positively and negatively. Furthermore, the measured longitudinal profiles highlight the impracticability of using single gauges to represent the quantitative and qualitative conditions of entire rivers. Hence, this work calls for a comprehensive, spatially variable consideration of flow characteristics and human influences when analysing low-flow risk as the basis for adequate low-flow management.
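Typical gauge-based low-flow hazard indicators of the kind used in such hydro-climatic analyses, e.g. Q95 and annual 7-day minimum flows, are easily computed with pandas. The synthetic series below is an assumption standing in for observed discharge records.

```python
import numpy as np
import pandas as pd

idx = pd.date_range("1970-01-01", "2019-12-31", freq="D")
rng = np.random.default_rng(3)
q = pd.Series(rng.lognormal(1.0, 0.6, len(idx)), index=idx)  # discharge (m3/s)

q95 = q.quantile(0.05)                                   # flow exceeded on 95% of days
mam7 = q.rolling(7).mean().groupby(q.index.year).min()   # annual 7-day minimum flows
drought_years = mam7[mam7 < q95].index                   # years with severe low flow
print(q95, list(drought_years))
```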
14 CFR 437.55 - Hazard analysis.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 14 Aeronautics and Space 4 2013-01-01 2013-01-01 false Hazard analysis. 437.55 Section 437.55... TRANSPORTATION LICENSING EXPERIMENTAL PERMITS Safety Requirements § 437.55 Hazard analysis. (a) A permittee must... safety of property resulting from each permitted flight. This hazard analysis must— (1) Identify and...
14 CFR 437.55 - Hazard analysis.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 14 Aeronautics and Space 4 2012-01-01 2012-01-01 false Hazard analysis. 437.55 Section 437.55... TRANSPORTATION LICENSING EXPERIMENTAL PERMITS Safety Requirements § 437.55 Hazard analysis. (a) A permittee must... safety of property resulting from each permitted flight. This hazard analysis must— (1) Identify and...
Hydrotreater/Distillation Column Hazard Analysis Report Rev. 2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lowry, Peter P.; Wagner, Katie A.
This project Hazard and Risk Analysis Report contains the results of several hazard analyses and risk assessments. An initial assessment was conducted in 2012, which included a multi-step approach ranging from design reviews to a formal What-If hazard analysis. A second What-If hazard analysis was completed during February 2013 to evaluate the operation of the hydrotreater/distillation column processes to be installed in a process enclosure within the Process Development Laboratory West (PDL-West) facility located on the PNNL campus. The qualitative analysis included participation of project and operations personnel and applicable subject matter experts. The analysis identified potential hazardous scenarios, each based on an initiating event coupled with a postulated upset condition. The unmitigated consequences of each hazardous scenario were generally characterized as a process upset; the exposure of personnel to steam, vapors or hazardous material; a spray or spill of hazardous material; the creation of a flammable atmosphere; or an energetic release from a pressure boundary.
Databases applicable to quantitative hazard/risk assessment-Towards a predictive systems toxicology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Waters, Michael; Jackson, Marcus
2008-11-15
The Workshop on The Power of Aggregated Toxicity Data addressed the requirement for distributed databases to support quantitative hazard and risk assessment. The authors have conceived and constructed with federal support several databases that have been used in hazard identification and risk assessment. The first of these databases, the EPA Gene-Tox Database was developed for the EPA Office of Toxic Substances by the Oak Ridge National Laboratory, and is currently hosted by the National Library of Medicine. This public resource is based on the collaborative evaluation, by government, academia, and industry, of short-term tests for the detection of mutagens and presumptive carcinogens. The two-phased evaluation process resulted in more than 50 peer-reviewed publications on test system performance and a qualitative database on thousands of chemicals. Subsequently, the graphic and quantitative EPA/IARC Genetic Activity Profile (GAP) Database was developed in collaboration with the International Agency for Research on Cancer (IARC). A chemical database driven by consideration of the lowest effective dose, GAP has served IARC for many years in support of hazard classification of potential human carcinogens. The Toxicological Activity Profile (TAP) prototype database was patterned after GAP and utilized acute, subchronic, and chronic data from the Office of Air Quality Planning and Standards. TAP demonstrated the flexibility of the GAP format for air toxics, water pollutants and other environmental agents. The GAP format was also applied to developmental toxicants and was modified to represent quantitative results from the rodent carcinogen bioassay. More recently, the authors have constructed: 1) the NIEHS Genetic Alterations in Cancer (GAC) Database which quantifies specific mutations found in cancers induced by environmental agents, and 2) the NIEHS Chemical Effects in Biological Systems (CEBS) Knowledgebase that integrates genomic and other biological data including dose-response studies in toxicology and pathology. Each of the public databases has been discussed in prior publications. They will be briefly described in the present report from the perspective of aggregating datasets to augment the data and information contained within them.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Riddle, F. J.
2003-06-26
The Automated Hazard Analysis (AHA) application is a software tool used to conduct job hazard screening and analysis of tasks to be performed in Savannah River Site facilities. The AHA application provides a systematic approach to the assessment of safety and environmental hazards associated with specific tasks, and the identification of controls, regulations, and other requirements needed to perform those tasks safely. AHA is to be integrated into existing Savannah River Site work control and job hazard analysis processes. Utilization of AHA will improve the consistency and completeness of hazard screening and analysis, and increase the effectiveness of the work planning process.
Hazard Analysis for Building 34 Vacuum Glove Box Assembly
NASA Technical Reports Server (NTRS)
Meginnis, Ian
2014-01-01
One of the characteristics of an effective safety program is the recognition and control of hazards before mishaps or failures occur. Conducting potentially hazardous tests necessitates a thorough hazard analysis in order to prevent injury to personnel, and to prevent damage to facilities and equipment. The primary purpose of this hazard analysis is to define and address the potential hazards and controls associated with the Building 34 Vacuum Glove Box Assembly, and to provide the applicable team of personnel with the documented results. It is imperative that each member of the team be familiar with the hazards and controls associated with his/her particular tasks, assignments and activities while interfacing with facility test systems, equipment and hardware. In fulfillment of the stated purposes, the goal of this hazard analysis is to identify all hazards that have the potential to harm personnel, damage the facility or its test systems or equipment, test articles, Government or personal property, or the environment. This analysis may also assess the significance and risk, when applicable, of lost test objectives when substantial monetary value is involved. The hazards, causes, controls, verifications, and risk assessment codes have been documented on the hazard analysis work sheets in Appendix A of this document. The preparation and development of this report is in accordance with JPR 1700.1, "JSC Safety and Health Handbook" and JSC 17773 Rev D "Instructions for Preparation of Hazard Analysis for JSC Ground Operations".
Earth reencounter probabilities for aborted space disposal of hazardous nuclear waste
NASA Technical Reports Server (NTRS)
Friedlander, A. L.; Feingold, H.
1977-01-01
A quantitative assessment is made of the long-term risk of Earth reencounter and reentry associated with aborted disposal of hazardous material in the space environment. Numerical results are presented for 10 candidate disposal options covering a broad spectrum of disposal destinations and deployment propulsion systems. Based on representative models of system failure, the probability that a single payload will return and collide with Earth within a period of 250,000 years is found to lie in the range 0.0002-0.006. Proportionately smaller risk attaches to shorter time intervals. Risk-critical factors related to trajectory geometry and system reliability are identified as possible mechanisms of hazard reduction.
Systems Measures of Water Distribution System Resilience
DOE Office of Scientific and Technical Information (OSTI.GOV)
Klise, Katherine A.; Murray, Regan; Walker, La Tonya Nicole
2015-01-01
Resilience is a concept that is being used increasingly to refer to the capacity of infrastructure systems to be prepared for and able to respond effectively and rapidly to hazardous events. In Section 2 of this report, drinking water hazards, resilience literature, and available resilience tools are presented. Broader definitions, attributes and methods for measuring resilience are presented in Section 3. In Section 4, quantitative systems performance measures for water distribution systems are presented. Finally, in Section 5, the performance measures and their relevance to measuring the resilience of water systems to hazards is discussed along with needed improvements to water distribution system modeling tools.
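One widely used quantitative performance measure of the kind surveyed in such reports is the Todini resilience index: the ratio of surplus hydraulic power delivered at demand nodes to the maximum surplus the sources could supply. The sketch below is a generic implementation with hypothetical network values, not code taken from the report.

```python
import numpy as np

def todini_index(demand, head, head_req, source_flow, source_head):
    """Todini (2000) resilience index for a water distribution network.
    demand/head/head_req: per-node arrays; source_flow/source_head: per-source."""
    surplus = np.sum(demand * (head - head_req))
    available = np.sum(source_flow * source_head) - np.sum(demand * head_req)
    return surplus / available

# hypothetical 3-node, 1-reservoir network
print(todini_index(np.array([10.0, 8.0, 5.0]),    # nodal demands (L/s)
                   np.array([32.0, 30.0, 28.0]),  # delivered heads (m)
                   np.array([25.0, 25.0, 25.0]),  # required heads (m)
                   np.array([23.0]),              # source flow (L/s)
                   np.array([40.0])))             # source head (m)
```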
Extrapolation, uncertainty factors, and the precautionary principle.
Steel, Daniel
2011-09-01
This essay examines the relationship between the precautionary principle and uncertainty factors used by toxicologists to estimate acceptable exposure levels for toxic chemicals from animal experiments. It shows that the adoption of uncertainty factors in the United States in the 1950s can be understood by reference to the precautionary principle, but not by cost-benefit analysis because of a lack of relevant quantitative data at that time. In addition, it argues that uncertainty factors continue to be relevant to efforts to implement the precautionary principle and that the precautionary principle should not be restricted to cases involving unquantifiable hazards. Copyright © 2011 Elsevier Ltd. All rights reserved.
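The arithmetic behind uncertainty factors is simple division of the experimental no-observed-adverse-effect level by stacked safety factors; a worked example with the two standard 10x factors (all values hypothetical):

```python
# illustrative reference-dose calculation with standard 10x uncertainty factors
noael = 50.0            # mg/kg-day, from an animal study (hypothetical value)
uf_interspecies = 10.0  # animal-to-human extrapolation
uf_intraspecies = 10.0  # variability within the human population
rfd = noael / (uf_interspecies * uf_intraspecies)
print(rfd)              # 0.5 mg/kg-day acceptable exposure estimate
```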
Development of vulnerability curves to typhoon hazards based on insurance policy and claim dataset
NASA Astrophysics Data System (ADS)
Mo, Wanmei; Fang, Weihua; Li, Xinze; Wu, Peng; Tong, Xingwei
2016-04-01
Vulnerability refers to the characteristics and circumstances of an exposure that make it susceptible to the effects of certain hazards. It can be divided into physical, social, economic and environmental vulnerability. Physical vulnerability indicates the potential physical damage to exposure caused by natural hazards. Vulnerability curves, which quantify the loss ratio against hazard intensity (horizontal axis: intensity; vertical axis: Mean Damage Ratio, MDR), are essential to vulnerability assessment and the quantitative evaluation of disasters. Fragility refers to the probability of diverse damage states under different hazard intensities, revealing a characteristic of the exposure. Fragility curves are often used to quantify the probability that a given set of exposures is at or exceeds a certain damage state. The development of quantitative fragility and vulnerability curves is the basis of catastrophe modeling. Generally, methods for quantitative fragility and vulnerability assessment can be categorized as empirical, analytical, or expert opinion/judgment-based. The empirical method is among the most popular and relies heavily on the availability and quality of historical hazard and loss datasets, which has always been a great challenge. The analytical method is usually based on engineering experiments; it is time-consuming and lacks built-in validation, so its credibility is also sometimes widely criticized. Expert opinion or judgment-based methods are quite effective in the absence of data, but the results can be too subjective, so the uncertainty is likely to be underestimated. In this study, we present fragility and vulnerability curves developed with the empirical method, based on simulated historical typhoon wind, rainfall and induced flood, and on insurance policy and claim datasets of more than 100 historical typhoon events. Firstly, an insurance exposure classification system is built according to structure type, occupation type and insurance coverage. Then an MDR estimation method that accounts for insurance policy structure and claim information is proposed and validated. Following that, fragility and vulnerability curves of the major exposure types for construction, homeowner insurance and enterprise property insurance are fitted with empirical functions based on the historical dataset. The results of this study can not only help in understanding catastrophe risk and managing insured disaster risks, but can also be applied in other disaster risk reduction efforts.
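Fragility curves of this kind are commonly parameterized as a lognormal CDF of hazard intensity and fitted to empirical exceedance fractions. The sketch below does so with SciPy on hypothetical wind-speed data; the lognormal form is a common convention, not necessarily the authors' choice.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import curve_fit

wind = np.array([25, 30, 35, 40, 45, 50], dtype=float)   # gust speed (m/s), hypothetical
p_ds = np.array([0.02, 0.08, 0.22, 0.45, 0.70, 0.88])    # fraction of claims at/above a damage state

def fragility(x, theta, beta):
    """Lognormal-CDF fragility: theta = median capacity, beta = log-dispersion."""
    return norm.cdf(np.log(x / theta) / beta)

(theta, beta), _ = curve_fit(fragility, wind, p_ds, p0=[40.0, 0.4])
print(theta, beta, fragility(42.0, theta, beta))  # exceedance probability at 42 m/s
```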
Lech, Tomasz
2016-05-01
The literature of environmental microbiology broadly discusses issues associated with microbial hazards in archives, but these publications are mainly devoted to paper documents. There are few articles on historical parchment documents, which were very important for the development of literature and the art of writing. This study presents a broad spectrum of methods for assessing the biodeterioration hazards of the parchment document in question, based on both conventional microbiological methods and advanced techniques of molecular biology. A qualitative analysis was conducted, based on genetic identification of the bacteria and fungi present on the document, denaturing gradient gel electrophoresis profiling, and examination of the destructive potential of the isolated microbes. Moreover, the study involved a quantitative and qualitative microbiological assessment of the indoor air in the room where the parchment was kept. The microbes with the highest destructive potential isolated from the investigated item were the bacteria Bacillus cereus and Acinetobacter lwoffii and the fungi Penicillium chrysogenum, Chaetomium globosum, and Trichoderma longibrachiatum. The presence of the B. cereus strain was particularly interesting since, under appropriate conditions, it leads to complete parchment degradation within several days. Copyright © 2016, American Society for Microbiology. All Rights Reserved.
Selected considerations of implementation of the GNSS
NASA Astrophysics Data System (ADS)
Cwiklak, Janusz; Fellner, Andrzej; Fellner, Radoslaw; Jafernik, Henryk; Sledzinski, Janusz
2014-05-01
The article describes an analysis of safety and risk for the implementation of precise approach procedures (Localizer Performance with Vertical Guidance, LPV) with a GNSS sensor at the Warsaw and Katowice airports. Techniques for hazard identification (controlled flight into terrain, landing accident, mid-air collision) were used together with evaluation methods based on Fault Tree Analysis, risk probability, a safety risk evaluation matrix, and Functional Hazard Assessment. Safety goals were also determined. The research established the probabilities of the identified threats occurring and allowed them to be compared with those for ILS. As a result of conducting the Preliminary System Safety Assessment (PSSA), the requirements essential to reach the required level of safety were defined. It is worth underlining that the quantitative requirements were defined using FTA.
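The quantitative core of a Fault Tree Analysis is the propagation of basic-event probabilities through AND/OR gates up to the top event. A minimal sketch assuming independent events and purely hypothetical probabilities:

```python
def p_or(*ps):
    """Top event occurs if ANY input event occurs (independent events)."""
    out = 1.0
    for p in ps:
        out *= (1.0 - p)
    return 1.0 - out

def p_and(*ps):
    """Top event occurs only if ALL input events occur (independent events)."""
    out = 1.0
    for p in ps:
        out *= p
    return out

# hypothetical tree: CFIT = (guidance fault AND crew fails to detect) OR warning-system outage
p_top = p_or(p_and(1e-5, 1e-2), 1e-7)
print(p_top)   # ~2e-7 per approach, compared against the safety target
```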
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-20
... Manufacturing Practice and Hazard Analysis and Risk- Based Preventive Controls for Human Food; Extension of... Hazard Analysis and Risk- Based Preventive Controls for Human Food'' and its information collection... Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food.'' IV. How To...
NASA Astrophysics Data System (ADS)
Castellanos Abella, Enrique A.; Van Westen, Cees J.
Geomorphological information can be combined with decision-support tools to assess landslide hazard and risk. A heuristic model was applied to a rural municipality in eastern Cuba. The study is based on a terrain mapping units (TMU) map, generated at 1:50,000 scale by interpretation of aerial photos, satellite images and field data. Information describing 603 terrain units was collected in a database. Landslide areas were mapped in detail to classify the different failure types and parts. Three major landslide regions are recognized in the study area: coastal hills with rockfalls, shallow debris flows and old rotational rockslides; denudational slopes in limestone, with very large deep-seated rockslides related to tectonic activity; and the Sierra de Caujerí scarp, with large rockslides. The Caujerí scarp presents the highest hazard, with recent landslides and various signs of active processes. The different landforms and the causative factors for landslides were analyzed and used to develop the heuristic model. The model is based on weights assigned by expert judgment and organized in a number of components such as slope angle, internal relief, slope shape, geological formation, active faults, distance to drainage, distance to springs, geomorphological subunits and existing landslide zones. From these variables a hierarchical heuristic model was applied in which three levels of weights were designed for classes, variables, and criteria. The model combines all weights into a single hazard value for each pixel of the landslide hazard map. The hazard map was then classified at two scales: one with three classes for disaster managers and one with 10 detailed hazard classes for technical staff. The range of weight values and the number of existing landslides were recorded for each class. The increase in landslide density with higher hazard classes indicates that the output map is reliable. The landslide hazard map was used in combination with existing information on buildings and infrastructure to prepare a qualitative risk map. The complete lack of historical landslide information and geotechnical data precludes the development of quantitative deterministic or probabilistic models.
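Per pixel, a hierarchical weighting scheme of this kind reduces to a weighted sum of normalized factor maps followed by classification. A minimal NumPy sketch with hypothetical factors and weights (the actual model uses more variables and three weight levels):

```python
import numpy as np

rng = np.random.default_rng(4)
# hypothetical factor rasters, already normalized to 0-1
slope, relief, geology = (rng.random((100, 100)) for _ in range(3))
weights = {"slope": 0.40, "relief": 0.25, "geology": 0.35}  # expert weights, sum to 1

hazard = (weights["slope"] * slope +
          weights["relief"] * relief +
          weights["geology"] * geology)          # single hazard value per pixel

# classify into 10 classes by deciles for the technical-staff map
edges = np.quantile(hazard, np.linspace(0.1, 0.9, 9))
classes = np.digitize(hazard, edges)             # values 0..9
```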
Pérez-Milena, Alejandro; Redondo-Olmedilla, Manuel de Dios; Martínez-Fernández, María Luz; Jiménez-Pulido, Idoia; Mesa-Gallardo, Inmaculada; Leal-Helmling, Francisco Javier
2017-11-01
To determine the changes in hazardous drinking among adolescents in the last decade, as well as their motivations and experiences. Firstly, a descriptive design using a self-report questionnaire, and secondly an explanatory qualitative design, with video recordings of discussion groups and content analysis (coding, triangulation of categories and verification of results). Pupils from an urban high school completed the questionnaire every 3 years from 2004 to 2013. Purposive sampling was used to select the discussion groups in the qualitative design. Homogeneity criterion: education level; heterogeneity criteria: age, gender, and drug use. Questionnaire: age, gender, drug use, and the CAGE test. Interviews: semi-structured on a previous script, evaluating experiences and expectations. Descriptive design: a total of 1,558 questionnaires, age 14.2 ± 0.3 years, 50% female. The prevalence of alcohol drinking decreased (13%), but hazardous use increased (11%; P<.001, χ2). Hazardous use was associated with being female (P<.01, χ2), higher alcohol consumption (>6 standard drink units weekly; P<.001, ANOVA), weekend drinking (56%; P<.01, χ2) and multiple drug use (P<.01, χ2). CAGE questionnaire: 37% gave ≥1 positive response (related to hazardous drinking, P<.05, χ2) and 18% gave ≥2. A total of 48 adolescents were interviewed; responses were classified into 4 categories: personal factors (age, gender), social influences (family, friends), consumption standards (accessibility, nightlife), and addiction (risk, multiple drug use). Despite the decrease in the prevalence of alcohol drinking, the increase in the percentage of hazardous drinking is a public health problem. It is related to being female, binge-drinking, and multiple drug use. Nightlife and social standards are the main reasons given by adolescents, who have no perception of risk. Copyright © 2017 Elsevier España, S.L.U. All rights reserved.
NASA Astrophysics Data System (ADS)
Williams, G. T.; Kennedy, B. M.; Wilson, T. M.; Fitzgerald, R. H.; Tsunematsu, K.; Teissier, A.
2017-09-01
Recent casualties in volcanic eruptions due to trauma from blocks and bombs necessitate more rigorous, ballistic-specific risk assessment. Quantitative assessments are limited by a lack of experimental and field data on the vulnerability of buildings to ballistic hazards. An improved, quantitative understanding of building vulnerability to ballistic impacts is required for informing appropriate life-safety actions and other risk reduction strategies. We assessed ballistic impacts to buildings from eruptions at Usu Volcano and Mt. Ontake in Japan and compiled available impact data from eruptions elsewhere to identify common damage patterns from ballistic impacts to buildings. We additionally completed a series of cannon experiments that simulate ballistic block impacts to building claddings to investigate their performance over a range of ballistic projectile velocities, masses and energies. Our experiments provide new insights by quantifying (1) the hazard associated with post-impact shrapnel from building and rock fragments; (2) the effect of impact obliquity on damage; and (3) the additional impact resistance buildings possess when claddings are struck in areas directly supported by framing components. This was not well identified in previous work, which may therefore have underestimated building vulnerability to ballistic hazards. To improve assessment of building vulnerability to ballistics, we use our experimental and field data to develop quantitative vulnerability models known as fragility functions. Our fragility functions and field studies show that although unreinforced buildings are highly vulnerable to large ballistics (>20 cm diameter), they can still provide shelter, preventing death during eruptions.
NASA Technical Reports Server (NTRS)
2012-01-01
One of the characteristics of an effective safety program is the recognition and control of hazards before mishaps or failures occur. Conducting potentially hazardous tests necessitates a thorough hazard analysis in order to protect our personnel from injury and our equipment from damage. The purpose of this hazard analysis is to define and address the potential hazards and controls associated with the Z1 Suit Port Test in Chamber B located in building 32, and to provide the applicable team of personnel with the documented results. It is imperative that each member of the team be familiar with the hazards and controls associated with his/her particular tasks, assignments, and activities while interfacing with facility test systems, equipment, and hardware. The goal of this hazard analysis is to identify all hazards that have the potential to harm personnel and/or damage facility equipment, flight hardware, property, or harm the environment. This analysis may also assess the significance and risk, when applicable, of lost test objectives when substantial monetary value is involved. The hazards, causes, controls, verifications, and risk assessment codes have been documented on the hazard analysis work sheets in appendix A of this document. The preparation and development of this report is in accordance with JPR 1700.1, JSC Safety and Health Handbook.
Marano, K.D.; Wald, D.J.; Allen, T.I.
2010-01-01
This study presents a quantitative and geospatial description of global losses due to earthquake-induced secondary effects, including landslide, liquefaction, tsunami, and fire for events during the past 40 years. These processes are of great importance to the US Geological Survey's (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) system, which is currently being developed to deliver rapid earthquake impact and loss assessments following large/significant global earthquakes. An important question is how dominant are losses due to secondary effects (and under what conditions, and in which regions)? Thus, which of these effects should receive higher priority research efforts in order to enhance PAGER's overall assessment of earthquake losses and alerting for the likelihood of secondary impacts? We find that while 21.5% of fatal earthquakes have deaths due to secondary (non-shaking) causes, only rarely are secondary effects the main cause of fatalities. The recent 2004 Great Sumatra-Andaman Islands earthquake is a notable exception, with extraordinary losses due to tsunami. The potential for secondary hazards varies greatly, and systematically, due to regional geologic and geomorphic conditions. Based on our findings, we have built country-specific disclaimers for PAGER that address potential for each hazard (Earle et al., Proceedings of the 14th World Conference of the Earthquake Engineering, Beijing, China, 2008). We will now focus on ways to model casualties from secondary effects based on their relative importance as well as their general predictability. © Springer Science+Business Media B.V. 2009.
Local to global: a collaborative approach to volcanic risk assessment
NASA Astrophysics Data System (ADS)
Calder, Eliza; Loughlin, Sue; Barsotti, Sara; Bonadonna, Costanza; Jenkins, Susanna
2017-04-01
Volcanic risk assessments at all scales present challenges related to the multitude of volcanic hazards, data gaps (hazards and vulnerability in particular), model representation and resources. Volcanic hazards include lahars, pyroclastic density currents, lava flows, tephra fall, ballistics, gas dispersal and also earthquakes, debris avalanches, tsunamis and more ... they can occur in different combinations and interact in different ways throughout the unrest, eruption and post-eruption period. Volcanoes and volcanic hazards also interact with other natural hazards (e.g. intense rainfall). Currently many hazards assessments consider the hazards from a single volcano but at national to regional scales the potential impacts of multiple volcanoes over time become important. The hazards that have the greatest tendency to affect large areas up to global scale are those transported in the atmosphere: volcanic particles and gases. Volcanic ash dispersal has the greatest potential to directly or indirectly affect the largest number of people worldwide, it is currently the only volcanic hazard for which a global assessment exists. The quantitative framework used (primarily at a regional scale) considers the hazard at a given location from any volcano. Flow hazards such as lahars and floods can have devastating impacts tens of kilometres from a source volcano and lahars can be devastating decades after an eruption has ended. Quantitative assessment of impacts is increasingly undertaken after eruptions to identify thresholds for damage and reduced functionality. Some hazards such as lava flows could be considered binary (totally destructive) but others (e.g. ash fall) have varying degrees of impact. Such assessments are needed to enhance available impact and vulnerability data. Currently, most studies focus on physical vulnerability but there is a growing emphasis on social vulnerability showing that it is highly variable and dynamic with pre-eruption socio-economic conditions tending to influence longer term well-being and recovery. The volcanological community includes almost 100 Volcano Observatories worldwide, the official institutions responsible for monitoring volcanoes. They may be dedicated institutions, or operate from national institutions (geological surveys, universities, met agencies). They have a key role in early warning, forecasting and long term hazard assessment (often in the form of volcanic hazards maps). The complexity of volcanic systems means that once unrest begins there are multiple potential eruptive outcomes and short term forecasts can change rapidly. This local knowledge of individual volcanoes underpins hazard and risk assessments developed at national, regional and global scales. Combining this local expertise with the knowledge of the international research community (including interdisciplinary perspectives) creates a powerful partnership. A collaborative approach is therefore needed to develop effective volcanic risk assessments at regional to global scale. The World Organisation of Volcano Observatories is a Commission of IAVCEI, alongside other Commissions such as 'Hazard and Risk' (with an active working group on volcanic hazards maps) and the 'Cities and Volcanoes' Commission. The Global Volcano Model network is a collaborative initiative developing hazards and risk information at national to global scales, underpinned by local expertise. 
Partners include IAVCEI, Smithsonian Institution, International Volcanic Health Hazard Network, VHub and other initiatives and institutions.
[Hazard function and life table: an introduction to the failure time analysis].
Matsushita, K; Inaba, H
1987-04-01
Failure time analysis has become popular in demographic studies. It can be viewed as a part of regression analysis with limited dependent variables, as well as a special case of event history analysis and multistate demography. The idea of the hazard function and failure time analysis, however, has not been properly introduced to, nor commonly discussed by, demographers in Japan. The concept of the hazard function is briefly described in comparison with life tables, where the force of mortality is interchangeable with the hazard rate. The basic idea of failure time analysis is summarized for the cases of the exponential distribution, the normal distribution, and proportional hazards models. The multiple decrement life table is also introduced as an example of lifetime data analysis with cause-specific hazard rates.
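For readers new to the notation, the relationships sketched in this abstract can be written compactly (a standard summary, not taken from the paper itself):

```latex
h(t) = \frac{f(t)}{S(t)} = -\frac{d}{dt}\ln S(t), \qquad
S(t) = \exp\!\left(-\int_0^t h(u)\,du\right)
```

The exponential case corresponds to a constant hazard h(t) = λ, which is exactly a constant force of mortality in life-table terms, and the proportional hazards model scales a baseline hazard as h(t | x) = h_0(t) exp(β'x).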
NASA Astrophysics Data System (ADS)
Zumpano, Veronica; Balteanu, Dan; Mazzorana, Bruno; Micu, Mihai
2014-05-01
It is increasingly important to provide stakeholders with tools that enable them to better understand the state of the environment in which they live and which they manage, and that help them make decisions that minimize the consequences of hydro-meteorological hazards. Very often, however, quantitative studies, especially for large areas, are difficult to perform, because the large amount of data they require, even if it exists, is often not available. In addition, it has been shown that deterministic approaches in scenario analysis often fail to detect features of the system that reveal unexpected behaviors, resulting in the underestimation or omission of some impact factors. Here we present preliminary results obtained by applying Formative Scenario Analysis, which can be considered a possible solution for landslide risk analysis in cases where the necessary data, even if existent, are not available. This method is an expert-based approach that integrates intuitions and qualitative evaluations of impact factors with a quantitative analysis of the relations between these factors: a group of experts with different but pertinent expertise determine, by a rating procedure, quantitative relations between the factors; scenarios describing a certain state of the system are then obtained through mathematical operations. The approach is applied to Buzau County (Romania), an area belonging to the Curvature Romanian Carpathians and Subcarpathians, a region strongly affected by environmental hazards. The region has previously experienced numerous episodes of severe hydro-meteorological events that caused considerable damage (1975, 2005, 2006). In this application we refer only to one type of landslide, which can be described as shallow to medium-seated with a (mainly) translational movement that can range from slide to flow. The material involved can be soil, debris or a mixture of both; in the Romanian literature these movements have been described as alunecare curgatoare. The Formative Scenario Analysis approach is applied to each component of risk (H, V, and A), and the resulting states are then combined to obtain a series of alternative risk scenarios. The approach is structured in two main sections corresponding to a level of influence of conditioning factors and a response; the latter yields the results of the formative scenario approach trained with the conditioning factors of the first level. These factors are divided into two subsets representing two levels of influence: k=1 comprises the global factors, while k=2 contains the local factors. In order to include uncertainty estimation within the analysis, type-1 fuzzy sets are introduced as the knowledge-representation method, and decisions made by experts on certain events are expressed in terms of triangular fuzzy numbers.
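Type-1 triangular fuzzy numbers make expert ratings and their uncertainty easy to propagate. A minimal sketch of the arithmetic: the (a, b, c) representation and centroid defuzzification are standard conventions, while the example weights are hypothetical.

```python
# a triangular fuzzy number (a, b, c): support [a, c], membership peaks at b
def tfn_add(x, y):
    """Component-wise sum of two triangular fuzzy numbers."""
    return tuple(xi + yi for xi, yi in zip(x, y))

def tfn_scale(k, x):
    """Multiply a triangular fuzzy number by a non-negative scalar."""
    return tuple(k * xi for xi in x)

def defuzzify(x):
    """Centroid of a triangular membership function."""
    a, b, c = x
    return (a + b + c) / 3.0

w1, w2 = (0.2, 0.3, 0.5), (0.4, 0.5, 0.6)   # hypothetical expert ratings
combined = tfn_add(tfn_scale(0.5, w1), tfn_scale(0.5, w2))  # equal-weight pooling
print(combined, defuzzify(combined))
```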
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-26
... Manufacturing Practice and Hazard Analysis and Risk- Based Preventive Controls for Human Food; Extension of... Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food'' that appeared in... Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food'' with a 120-day...
Polynuclear aromatic hydrocarbon analysis using the synchronous scanning luminoscope
NASA Astrophysics Data System (ADS)
Hyfantis, George J., Jr.; Teglas, Matthew S.; Wilbourn, Robert G.
2001-02-01
The Synchronous Scanning Luminoscope (SSL) is a field-portable, synchronous luminescence spectrofluorometer that was developed for on-site analysis of contaminated soil and ground water. The SSL is capable of quantitative analysis of total polynuclear aromatic hydrocarbons (PAHs) using phosphorescence and fluorescence techniques with a high correlation to laboratory data, as illustrated by this study. The SSL is also capable of generating benzo(a)pyrene equivalency results, based on seven carcinogenic PAHs and Navy risk numbers, likewise with a high correlation to laboratory data. These techniques allow rapid field assessments of total PAHs and benzo(a)pyrene equivalent concentrations. The Luminoscope is capable of detecting total PAHs to the parts-per-billion range. This paper describes standard field methods for using the SSL and describes the results of field/laboratory testing of PAHs. SSL results from two different hazardous waste sites are discussed.
Sample and data processing considerations for the NIST quantitative infrared database
NASA Astrophysics Data System (ADS)
Chu, Pamela M.; Guenther, Franklin R.; Rhoderick, George C.; Lafferty, Walter J.; Phillips, William
1999-02-01
Fourier-transform infrared (FT-IR) spectrometry has become a useful real-time in situ analytical technique for quantitative gas-phase measurements. In fact, the U.S. Environmental Protection Agency (EPA) has recently approved open-path FT-IR monitoring for the determination of hazardous air pollutants (HAPs) identified in the Clean Air Act Amendments of 1990. To support infrared-based sensing technologies, the National Institute of Standards and Technology (NIST) is currently developing a standard quantitative spectral database of the HAPs based on gravimetrically prepared standard samples. The procedures developed to ensure the quantitative accuracy of the reference data are discussed, including sample preparation, residual sample contaminants, data processing considerations, and estimates of error.
NASA Astrophysics Data System (ADS)
Luna, Byron Quan; Blahut, Jan; Camera, Corrado; van Westen, Cees; Sterlacchini, Simone; Apuani, Tiziana; Akbas, Sami
2010-05-01
For a quantitative risk assessment framework it is essential to assess not only the hazardous process itself but also its consequences. This quantitative assessment should include the expected monetary losses as the product of the probability of occurrence of a hazard of a given magnitude and the vulnerability of the elements it affects. A quantifiable, integrated treatment of both hazard and risk is becoming required practice in risk reduction management. Dynamic run-out models for debris flows are able to calculate physical outputs (extension, depths, velocities, impact pressures) and to determine the zones where the elements at risk could suffer an impact; these results are then applied to vulnerability and risk calculations. The risk assessment was conducted in the Valtellina Valley, a typical alpine valley in northern Italy (Lombardy Region). On 13 July 2008, after more than two days of intense rainfall, several debris and mud flows were released in the central part of the valley between Morbegno and Berbenno; one of the largest debris flows occurred in Selvetta. The debris flow event was reconstructed after extensive field work and interviews with local inhabitants and civil protection teams. Also within the Valtellina valley, between 22 and 23 May 1983, two debris flows occurred in Tresenda (Teglio municipality), causing casualties and considerable economic damage, and at the same location, on 26 November 2002, another debris flow caused significant damage. For the quantification of a new scenario, the results obtained from the Selvetta event were applied in Tresenda. The Selvetta and Tresenda events were modelled with the FLO-2D program, an Eulerian formulation with a finite-difference numerical scheme that requires the specification of an input hydrograph; the internal stresses are isotropic and the basal shear stresses are calculated using a quadratic model. The significance of the calculated values of pressure and velocity was investigated in terms of the resulting damage to the affected buildings. The physical damage was quantified for each affected structure within the context of physical vulnerability, defined as the ratio between the monetary loss and the reconstruction value. Two different empirical vulnerability curves were obtained, as functions of debris flow velocity and pressure, respectively. Prospective direct economic losses were then estimated.
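To illustrate how an empirical vulnerability curve converts a run-out model output into an expected loss, here is a small Python sketch. The sigmoidal functional form and its parameters are hypothetical placeholders, not the curves fitted in this study.

```python
import numpy as np

def vulnerability_from_pressure(p_kpa, p0=35.0, k=2.0):
    """Illustrative sigmoidal vulnerability curve: the ratio of monetary loss
    to reconstruction value as a function of impact pressure (kPa).
    p0 and k are hypothetical shape parameters, not the study's fit."""
    v = 1.0 - np.exp(-(p_kpa / p0) ** k)
    return np.clip(v, 0.0, 1.0)

reconstruction_value = 250_000  # EUR, hypothetical building
for p in (5, 20, 50, 100):      # impact pressures from a run-out model
    v = vulnerability_from_pressure(p)
    print(f"p = {p:3d} kPa -> V = {v:.2f}, "
          f"expected loss = {v * reconstruction_value:,.0f} EUR")
```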
Hristozov, Danail; Zabeo, Alex; Alstrup Jensen, Keld; Gottardo, Stefania; Isigonis, Panagiotis; Maccalman, Laura; Critto, Andrea; Marcomini, Antonio
2016-11-01
Several tools to facilitate the risk assessment and management of manufactured nanomaterials (MN) have been developed. Most of them require input data on physicochemical properties, toxicity and scenario-specific exposure information. However, such data are not yet readily available, and tools that can handle data gaps in a structured way to ensure transparent risk analysis for industrial and regulatory decision making are needed. This paper proposes such a quantitative risk prioritisation tool, based on a multi-criteria decision analysis algorithm, which combines advanced exposure and dose-response modelling to calculate margins of exposure (MoE) for a number of MN in order to rank their occupational risks. We demonstrated the tool in a number of workplace exposure scenarios (ES) involving the production and handling of nanoscale titanium dioxide, zinc oxide (ZnO), silver and multi-walled carbon nanotubes. The results of this application demonstrated that bag/bin filling, manual un/loading and dumping of large amounts of dry powders led to high emissions, which resulted in high risk associated with these ES. The ZnO MN revealed considerable hazard potential in vivo, which significantly influenced the risk prioritisation results. In order to study how variations in the input data affect our results, we performed probabilistic Monte Carlo sensitivity/uncertainty analysis, which demonstrated that the performance of the proposed model is stable against changes in the exposure and hazard input variables.
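The ranking logic can be sketched compactly: the margin of exposure is the ratio of a hazard benchmark to the estimated exposure, and the smallest margins receive the highest priority. All scenario names and values below are hypothetical, not results from the tool.

```python
# Margin of exposure: hazard benchmark dose divided by estimated exposure.
# All values below are hypothetical placeholders, not data from the paper.
scenarios = {
    # (benchmark dose, estimated exposure), arbitrary but consistent units
    "TiO2 bag filling":   (10.0, 0.50),
    "ZnO manual dumping": ( 2.0, 0.40),
    "Ag sonication":      ( 5.0, 0.02),
    "MWCNT un/loading":   ( 1.0, 0.10),
}

moe = {name: bmd / exp for name, (bmd, exp) in scenarios.items()}
# Smaller MoE = smaller safety margin = higher occupational risk priority.
for name in sorted(moe, key=moe.get):
    print(f"{name:20s} MoE = {moe[name]:6.1f}")
```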
Noise and contrast comparison of visual and infrared images of hazards as seen inside an automobile
NASA Astrophysics Data System (ADS)
Meitzler, Thomas J.; Bryk, Darryl; Sohn, Eui J.; Lane, Kimberly; Bednarz, David; Jusela, Daniel; Ebenstein, Samuel; Smith, Gregory H.; Rodin, Yelena; Rankin, James S., II; Samman, Amer M.
2000-06-01
The purpose of this experiment was to quantitatively measure driver performance in detecting potential road hazards in visual and infrared (IR) imagery of road scenes containing varying combinations of contrast and noise. This pilot test is a first step toward comparing various IR and visual sensors and displays for an enhanced vision system inside the driver compartment. The visible and IR road imagery obtained was displayed on a large screen and on a PC monitor, and subject response times were recorded. Detection probabilities were then computed from the response times relative to the known time of occurrence of each driving hazard. The goal was to determine which combinations of sensor, contrast, and noise enable subjects to achieve a higher detection probability for potential driving hazards.
NASA Technical Reports Server (NTRS)
Lo, Yunnhon; Johnson, Stephen B.; Breckenridge, Jonathan T.
2014-01-01
This paper describes the quantitative application of the theory of System Health Management and its operational subset, Fault Management, to the selection of abort triggers for a human-rated launch vehicle, the United States' National Aeronautics and Space Administration's (NASA) Space Launch System (SLS). The results demonstrate the efficacy of the theory to assess the effectiveness of candidate failure detection and response mechanisms to protect humans from time-critical and severe hazards. The quantitative method was successfully used on the SLS to aid selection of its suite of abort triggers.
Rubin, Oliver; Rossing, Tine
2012-01-01
The Latin American region is particularly prone to climate-related natural hazards. However, this article argues that natural hazards are only partly to blame for the region's vulnerability to natural disasters, with quantitative evidence suggesting instead that income per capita and inequality are the main determinants of natural disaster mortality in Latin America. Locally, the region's poor are particularly susceptible to climate-related natural hazards. As a result of their limited access to capital, adaptation based on social assets constitutes an effective coping strategy. Evidence from Bolivia and Belize illustrates the importance of social assets in protecting the most vulnerable against natural disasters.
Flood hazards studies in the Mississippi River basin using remote sensing
NASA Technical Reports Server (NTRS)
Rango, A.; Anderson, A. T.
1974-01-01
The Spring 1973 Mississippi River flood was investigated using remotely sensed data from ERTS-1. Both manual and automatic analyses of the data indicated that ERTS-1 is extremely useful as a regional tool for flood management. Quantitative estimates of the flooded area were made in St. Charles County, Missouri, and in Arkansas. Flood hazard mapping was conducted in three study areas along the Mississippi River using pre-flood ERTS-1 imagery enlarged to 1:250,000 and 1:100,000 scale. Initial results indicate that ERTS-1 digital mapping of flood-prone areas can be performed at 1:62,500, which is comparable to some conventional flood hazard map scales.
Self-selection and moral hazard in Chilean health insurance.
Sapelli, Claudio; Vial, Bernardita
2003-05-01
We study the existence of self-selection and moral hazard in the Chilean health insurance industry. Dependent workers must purchase health insurance either from one public or several private insurance providers. For them, we analyze the relationship between health care services utilization and the choice of either private or public insurance. In the case of independent workers, where there is no mandate, we analyze the relationship between utilization and the decision to voluntarily purchase health insurance. The results show self-selection against insurance companies for independent workers, and against public insurance for dependent workers. Moral hazard is negligible in the case of hospitalization, but for medical visits, it is quantitatively important.
NASA Astrophysics Data System (ADS)
van Westen, Cees; Bakker, Wim; Zhang, Kaixi; Jäger, Stefan; Assmann, Andre; Kass, Steve; Andrejchenko, Vera; Olyazadeh, Roya; Berlin, Julian; Cristal, Irina
2014-05-01
Within the framework of the EU FP7 Marie Curie Project CHANGES (www.changes-itn.eu) and the EU FP7 Copernicus project INCREO (http://www.increo-fp7.eu), a spatial decision support system (SDSS) is under development with the aim of analysing the effect of risk reduction planning alternatives on reducing risk now and in the future, and of supporting decision makers in selecting the best alternatives. The SDSS will be composed of a number of integrated components. The risk assessment component allows users to carry out spatial risk analysis with different degrees of complexity, ranging from simple exposure (overlay of hazard and asset maps) to quantitative analysis (using different hazard types, temporal scenarios and vulnerability curves), resulting in risk curves. The platform does not include a component to calculate hazard maps; existing hazard maps are used as input data for the risk component. The second component, risk reduction planning, forms the core of the platform. It includes the definition of risk reduction alternatives (related to disaster response planning, risk reduction measures and spatial planning), links back to the risk assessment module to calculate the new level of risk if a measure is implemented, and provides a cost-benefit (or cost-effectiveness / spatial multi-criteria evaluation) component to compare the alternatives and decide on the optimal one. The third component is a temporal scenario component, which allows users to define future scenarios in terms of climate change, land use change and population change, and the time periods for which these scenarios are made; it does not generate the scenarios itself but uses input maps describing their effect on the hazard and asset maps. The last component is a communication and visualization component, which can compare scenarios and alternatives not only as maps but also in other forms (risk curves, tables, graphs). The envisaged users of the platform are organizations involved in planning risk reduction measures that have staff capable of visualizing and analysing spatial data at a municipal scale.
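A minimal sketch of the kind of output the risk assessment component produces: given hazard scenarios with annual probabilities and modelled losses, it derives a loss-exceedance (risk) curve and an annual average loss that can be recomputed for each planning alternative. Scenario values are invented for illustration, and independence between scenarios is assumed.

```python
import numpy as np

# Hypothetical hazard scenarios: (annual probability, estimated loss, M EUR).
scenarios = [(0.10, 2.0), (0.02, 15.0), (0.005, 60.0), (0.001, 200.0)]

# Loss-exceedance (risk) curve: P(loss >= L) per year, assuming independent
# scenarios with at most one occurrence each per year.
for L in sorted({l for _, l in scenarios}):
    p_exceed = 1.0 - np.prod([1.0 - p for p, l in scenarios if l >= L])
    print(f"P(loss >= {L:6.1f} M EUR) = {p_exceed:.4f} / year")

# Annual average loss, a scalar often used to compare alternatives:
aal = sum(p * l for p, l in scenarios)
print(f"annual average loss = {aal:.2f} M EUR")
```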
Social vulnerability assessment: a growing practice in Europe?
NASA Astrophysics Data System (ADS)
Tapsell, S.; McCarthy, S.
2012-04-01
This paper builds upon work on social vulnerability from the CapHaz-Net consortium, an ongoing research project funded by the European Commission in its 7th Framework Programme. The project focuses on the social dimensions of natural hazards, as well as on regional practices of risk prevention and management, and aims at improving the resilience of European societies to natural hazards, paying particular attention to social capacity building. The topic of social vulnerability is one of seven themes addressed in the project. There are various rationales for examining the relevance of social vulnerability to natural hazards: vulnerability assessment is now accepted as a requirement for the effective development of emergency management capability, and the assessment of social vulnerability has been recognised as integral to understanding the risk posed by natural hazards. The aim of our research was to examine social vulnerability, how it might be understood in the context of natural hazards in Europe, and how it can be addressed to increase social capacity. The work comprised a review of research on social vulnerability to different natural hazards within Europe, covering concepts and definitions of social vulnerability (and related concepts), the purpose of vulnerability assessment and who decides who is vulnerable, different approaches to assessing or measuring social vulnerability (such as 'classical' quantitative vulnerability indicators and qualitative community-based approaches, along with the advantages and disadvantages of both), conceptual frameworks for assessing social vulnerability, and three case studies within Europe: flash floods in the Italian Alps, fluvial flooding in Germany and heat waves in Spain. The review reveals variable application of social vulnerability analysis across Europe, and there are indications why this might be the case: reasons range from the scale of country policy and the particular risk management focus, to perceptions that the analysis techniques employed are too resource-expensive, difficult to interpret or hard to operationalise. This paper provides a context, with some empirical examples, that may explain the growing popularity of concepts such as resilience and capacity building, which sit more comfortably with policy makers and risk managers as concepts focused on the solution rather than on identifying a problem by assessing social vulnerability.
77 FR 55371 - System Safety Program
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-07
...-based rule and FRA seeks comments on all aspects of the proposed rule. An SSP would be implemented by a... SSP would be the risk-based hazard management program and risk-based hazard analysis. A properly implemented risk-based hazard management program and risk-based hazard analysis would identify the hazards and...
9 CFR 417.2 - Hazard Analysis and HACCP Plan.
Code of Federal Regulations, 2012 CFR
2012-01-01
... more food safety hazards that are reasonably likely to occur, based on the hazard analysis conducted in... 9 Animals and Animal Products 2 2012-01-01 2012-01-01 false Hazard Analysis and HACCP Plan. 417.2 Section 417.2 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE...
9 CFR 417.2 - Hazard Analysis and HACCP Plan.
Code of Federal Regulations, 2014 CFR
2014-01-01
... more food safety hazards that are reasonably likely to occur, based on the hazard analysis conducted in... 9 Animals and Animal Products 2 2014-01-01 2014-01-01 false Hazard Analysis and HACCP Plan. 417.2 Section 417.2 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE...
9 CFR 417.2 - Hazard Analysis and HACCP Plan.
Code of Federal Regulations, 2011 CFR
2011-01-01
... more food safety hazards that are reasonably likely to occur, based on the hazard analysis conducted in... 9 Animals and Animal Products 2 2011-01-01 2011-01-01 false Hazard Analysis and HACCP Plan. 417.2 Section 417.2 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE...
9 CFR 417.2 - Hazard Analysis and HACCP Plan.
Code of Federal Regulations, 2013 CFR
2013-01-01
... more food safety hazards that are reasonably likely to occur, based on the hazard analysis conducted in... 9 Animals and Animal Products 2 2013-01-01 2013-01-01 false Hazard Analysis and HACCP Plan. 417.2 Section 417.2 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE...
9 CFR 417.2 - Hazard Analysis and HACCP Plan.
Code of Federal Regulations, 2010 CFR
2010-01-01
... more food safety hazards that are reasonably likely to occur, based on the hazard analysis conducted in... 9 Animals and Animal Products 2 2010-01-01 2010-01-01 false Hazard Analysis and HACCP Plan. 417.2 Section 417.2 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE...
Long-term multi-hazard assessment for El Misti volcano (Peru)
NASA Astrophysics Data System (ADS)
Sandri, Laura; Thouret, Jean-Claude; Constantinescu, Robert; Biass, Sébastien; Tonini, Roberto
2014-02-01
We propose a long-term probabilistic multi-hazard assessment for El Misti volcano, a composite cone located less than 20 km from Arequipa. The second largest Peruvian city is a rapidly expanding economic centre and is classified by UNESCO as a World Heritage Site. We apply the Bayesian Event Tree code for Volcanic Hazard (BET_VH) to produce probabilistic hazard maps for the predominant volcanic phenomena that may affect the roughly 900,000 people living around the volcano. The methodology accounts for the natural variability displayed by volcanoes in their eruptive behaviour, such as different types and sizes of eruptions and possible vent locations. For this purpose, we treat probabilistically several model runs for some of the main hazardous phenomena (lahars, pyroclastic density currents (PDCs), tephra fall and ballistic ejecta) and data from past eruptions at El Misti (tephra fall, PDCs and lahars) and at other volcanoes (PDCs). The hazard maps, although neglecting possible interactions among phenomena or cascade effects, have been produced with a homogeneous method and refer to a common time window of 1 year. The probability maps reveal that only the north and east suburbs of Arequipa are exposed to all volcanic threats except ballistic ejecta, which are limited to the uninhabited but touristic summit cone. The probability of pyroclastic density currents reaching the recently expanding urban areas and the city along ravines is around 0.05%/year, similar to the probability obtained for roof-critical tephra loading during the rainy season. Lahars represent by far the most probable threat (around 10%/year) because, in heavy rain episodes and even without an eruption, at least four radial drainage channels can convey them approximately 20 km from the volcano across the entire city area. The Río Chili Valley represents the major concern for city safety owing to the probable cascading effect of combined threats: PDCs and rockslides, dammed-lake break-outs and subsequent lahars or floods. Although this study does not intend to replace the current El Misti hazard map, the quantitative results of this probabilistic multi-hazard assessment can be incorporated into a multi-risk analysis, to support decision makers in any future improvement of the current hazard evaluation, such as further land-use planning and possible emergency management.
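The event-tree logic behind BET_VH can be sketched as a chain of conditional probabilities evaluated node by node. The Python fragment below is a simplified, deterministic illustration with invented node probabilities; BET_VH itself treats each node probabilistically, which this sketch does not attempt.

```python
# Bayesian-event-tree style chaining of conditional probabilities.
# All node values are illustrative, not the BET_VH estimates for El Misti.
p_unrest     = 0.05   # P(unrest in the 1-year time window)
p_magmatic   = 0.50   # P(magmatic origin | unrest)
p_eruption   = 0.30   # P(eruption | magmatic unrest)
p_vent_north = 0.40   # P(vent opens in the northern sector | eruption)
p_pdc        = 0.25   # P(PDC generated | eruption of a given size)
p_reach_area = 0.10   # P(PDC reaches the target area | PDC)

p_area_hit = (p_unrest * p_magmatic * p_eruption *
              p_vent_north * p_pdc * p_reach_area)
print(f"P(PDC impact on area) ~ {p_area_hit:.2e} per year")
```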
NASA Astrophysics Data System (ADS)
Guillard-Gonçalves, C.; Zêzere, J. L.; Pereira, S.; Garcia, R. A. C.
2015-09-01
This study offers a semi-quantitative assessment of the physical vulnerability of buildings to landslides in the Loures municipality, as well as an analysis of the landslide risk computed as the product of the vulnerability, the economic value of the buildings, and the landslide hazard. The physical vulnerability assessment, which was based on a questionnaire sent to a pool of Portuguese and European researchers, and the assessment of the subjectivity of their answers are innovative contributions of this work. The generalization of the vulnerability to the smallest statistical subsection was validated by changing the map unit and applying the vulnerability to all the buildings of a test site (approximately 800 buildings), which were inventoried during fieldwork. The economic value of the buildings of the Loures municipality was calculated using an adaptation of the Portuguese Tax Services formula. The hazard was assessed by combining the susceptibility of the slopes, the spatio-temporal probability, and the frequency-magnitude relationship of the landslides. Finally, the risk was mapped for different landslide magnitudes and different spatio-temporal probabilities. The highest landslide risk was found for landslides with a depth of 3 m in the landslide body and a height of 1 m in the landslide foot.
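A worked example of the risk formula used here, risk = hazard x vulnerability x economic value, evaluated per map unit; the unit names and numbers are hypothetical, not values from the Loures analysis.

```python
# Risk as the product hazard x vulnerability x exposed value, per map unit.
# All numbers are hypothetical placeholders.
units = [
    # (name, annual landslide probability, vulnerability 0-1, value in EUR)
    ("subsection A", 1e-3, 0.45, 1_200_000),
    ("subsection B", 5e-4, 0.80,   300_000),
    ("subsection C", 2e-3, 0.15,   900_000),
]
for name, h, v, e in units:
    print(f"{name}: risk = {h * v * e:,.0f} EUR/year")
```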
Some current advances in biophysical applications of ionizing radiation for health preservation
NASA Astrophysics Data System (ADS)
Watt, D. E.
1987-03-01
Radiation physics is a subject of major importance in application to health preservation through investigative, diagnostic, analytical and therapeutic procedures for clinical purposes. Its benefits are enormous and well established. However, there are also hazards, and so it is important for health preservation purposes to establish quantitatively the degree of risk undergone by persons exposed to radiation in the natural environment, in their occupations and in medical treatment. In this paper a brief indication is given of the extensive utilisation of the unique properties of radiation in biomedical applications. This is followed by a fuller discussion of new developments in our understanding of radiation damage mechanisms in radiotherapy and radiological protection. An example is given from biomedical research into the role of trace elements in gallstone formation, using neutron activation analysis, proton-induced X-ray emission and X-ray fluorescence analysis as complementary techniques for maximising sensitivity in multielemental analysis by induced radiation. Procedures are described for measuring radiation effect, at bone/tissue and lung/air interfaces, due to the uptake of radioactive material from the natural environment. Finally, a topical subject in nuclear medicine, viz. the possible advantages and hazards of Auger electron cascades resulting from inner-shell vacancies in electron capture nuclides, is examined in the light of new evidence.
Chua, M L; Setyawati, M I; Li, H; Fang, C H Y; Gurusamy, S; Teoh, F T L; Leong, D T; George, S
2017-05-01
We investigated the physicochemical properties (size, shape, elemental composition, and endotoxin) of size resolved particulate matter (PM) collected from the indoor and corridor environments of classrooms. A comparative hazard profiling of these PM was conducted using human microvascular endothelial cells (HMVEC). Oxidative stress-dependent cytotoxicity responses were assessed using quantitative reverse transcriptase polymerase chain reaction (RT-PCR) and high content screening (HCS), and disruption of monolayer cell integrity was assessed using fluorescence microscopy and transwell assay. Scanning electron microscopy (SEM) coupled with energy-dispersive X-ray spectroscopy (EDX) analysis showed differences in the morphology and elemental composition of PM of different sizes and origins. While the total mass of PM collected from the indoor environment was lower in comparison with those collected from the corridor, the endotoxin content was substantially higher in indoor PM (e.g., ninefold higher endotoxin level in indoor PM8.1-20). The ability to induce oxidative stress-mediated cytotoxicity and leakiness in cell monolayer were higher for indoor PM compared to those collected from the corridor. In conclusion, this comparative analysis suggested that indoor PM is relatively more hazardous to the endothelial system possibly because of higher endotoxin content. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Chen, Guangchao; Peijnenburg, Willie; Xiao, Yinlong; Vijver, Martina G
2017-07-12
As listed by the European Chemicals Agency, the three elements in evaluating the hazards of engineered nanomaterials (ENMs) are the integration and evaluation of toxicity data, the categorization and labeling of ENMs, and the derivation of hazard threshold levels for human health and the environment. Assessing the hazards of ENMs solely on the basis of laboratory tests is time-consuming, resource intensive, and constrained by ethical considerations. The adoption of computational toxicology into this task has recently become a priority. Alternative approaches such as (quantitative) structure-activity relationships ((Q)SAR) and read-across are of significant help in predicting nanotoxicity and filling data gaps, and in classifying the hazards of ENMs to individual species. The species sensitivity distribution (SSD) approach can then serve to establish ENM hazard thresholds that sufficiently protect the ecosystem. This article critically reviews the current knowledge on the development of in silico models for predicting and classifying the hazard of metallic ENMs, and on the development of SSDs for metallic ENMs. Further discussion includes the significance of well-curated experimental datasets and the interpretation of the toxicity mechanisms of metallic ENMs based on reported models. An outlook is also given on future directions of research in this frontier.
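A minimal sketch of the SSD step, assuming a log-normal distribution fitted to per-species toxicity values with SciPy, from which the HC5 (the concentration hazardous to 5% of species) is read off; the endpoint values and the assessment factor are illustrative.

```python
import numpy as np
from scipy.stats import lognorm

# Hypothetical chronic toxicity endpoints for one metallic ENM (mg/L),
# one value per species; purely illustrative.
tox = np.array([0.08, 0.15, 0.30, 0.55, 1.2, 2.4, 4.0, 9.5])

# Fit a log-normal SSD (location fixed at zero, as is conventional).
shape, loc, scale = lognorm.fit(tox, floc=0)

# HC5: the concentration hazardous to only 5% of species.
hc5 = lognorm.ppf(0.05, shape, loc=loc, scale=scale)
print(f"HC5 = {hc5:.3f} mg/L")
# A hazard threshold is often the HC5 divided by an assessment factor.
print(f"threshold (AF = 5): {hc5 / 5:.4f} mg/L")
```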
Comber, Mike H I; Walker, John D; Watts, Chris; Hermens, Joop
2003-08-01
The use of quantitative structure-activity relationships (QSARs) for deriving the predicted no-effect concentration of discrete organic chemicals for the purposes of conducting a regulatory risk assessment in Europe and the United States is described. In the United States, under the Toxic Substances Control Act (TSCA), the TSCA Interagency Testing Committee and the U.S. Environmental Protection Agency (U.S. EPA) use SARs to estimate the hazards of existing and new chemicals. Within the Existing Substances Regulation in Europe, QSARs may be used for data evaluation, test strategy indications, and the identification and filling of data gaps. To illustrate where and when QSARs may be useful and when their use is more problematic, an example, methyl tertiary-butyl ether (MTBE), is given and the predicted and experimental data are compared. Improvements needed for new QSARs and tools for developing and using QSARs are discussed.
Starvin, A M; Rao, T Prasada
2004-09-10
As a part of the removal of toxic heavy metals from hazardous wastes, solid phase extraction (SPE) of mercury(II) at trace and ultra-trace levels was studied using 1-(2-thiazolylazo)-2-naphthol (TAN)-functionalized activated carbon (AC). The SPE material removes traces of mercury(II) quantitatively in the pH range 6.0 +/- 0.2. Other parameters that influence the quantitative recovery of mercury(II), viz. the percent concentration of TAN in AC, the amount of TAN-AC, the preconcentration time and the volume of the aqueous phase, were varied and optimized. The possible means of separating Hg(II) from other metal ions that are likely to be present in the wastes of the chlor-alkali industry is discussed. The potential of the TAN-functionalized AC SPE material for decontaminating mercury from the brine sludge and cell-house effluent of a chlor-alkali plant has been evaluated.
NASA Technical Reports Server (NTRS)
Stewart, R. B.; Grose, W. L.
1975-01-01
Parametric studies were made with a multilayer atmospheric diffusion model to place quantitative limits on the uncertainty of predicting ground-level toxic rocket-fuel concentrations. Exhaust distributions in the ground cloud, stabilized cloud geometry, atmospheric coefficients, the effects of exhaust plume afterburning of carbon monoxide (CO), the assumed surface mixing-layer division in the model, and model sensitivity to different meteorological regimes were studied. Large-scale differences in ground-level predictions are quantitatively described. Alongwind cloud growth for several meteorological conditions is shown to be in error because of incorrect application of previous diffusion theory. In addition, rocket-plume calculations indicate that almost all of the rocket-motor carbon monoxide is afterburned to carbon dioxide (CO2), thus reducing the toxic hazard due to CO. The afterburning is also shown to have a significant effect on cloud stabilization height and on ground-level concentrations of exhaust products.
Goodman-Meza, David; Pitpitan, Eileen V.; Semple, Shirley J.; Wagner, Karla D.; Chavarin, Claudia V.; Strathdee, Steffanie A.; Patterson, Thomas L.
2015-01-01
Background and Objectives: Male clients of female sex workers (FSWs) are at high risk for HIV. Whereas the HIV risks of alcohol use are well understood, less is known about hazardous alcohol use among male clients of FSWs, particularly in Mexico. We sought to identify risk factors for hazardous alcohol use and test associations between hazardous alcohol use and HIV risk behaviour among male clients in Tijuana. Method: Male clients of FSWs in Tijuana (n = 400) completed a quantitative interview in 2008. The AUDIT was used to characterize hazardous alcohol use. Multivariate logistic regression was used to determine independent associations of demographic and HIV risk variables with hazardous alcohol use (vs. non-hazardous). Results: Forty percent of our sample met criteria for hazardous alcohol use. Variables independently associated with hazardous drinking were reporting any sexually transmitted infection (STI), having sex with a FSW while under the influence of alcohol, being younger than 36 years of age, living in Tijuana, and ever having been jailed. Hazardous drinkers were less likely ever to have been deported or to have shared injection drugs. Discussion and Conclusions: Hazardous alcohol use is associated with HIV risk, including engaging in sex with FSWs while intoxicated and having an STI among male clients of FSWs in Tijuana. Scientific Significance: We systematically described patterns and correlates of hazardous alcohol use among male clients of FSWs in Tijuana, Mexico. The results suggest that HIV/STI risk reduction interventions must target hazardous alcohol users, and be tailored to address alcohol use. PMID:25066863
Assessment of social vulnerability to natural hazards in Nepal
NASA Astrophysics Data System (ADS)
Gautam, Dipendra
2017-12-01
This paper investigates district-wide social vulnerability to natural hazards in Nepal. Disasters such as earthquakes, floods, landslides, epidemics, and droughts are common in Nepal. Every year thousands of people are killed, and huge economic and environmental losses occur, due to various natural hazards. Although natural hazards are well recognized, quantitative and qualitative social vulnerability mapping has not previously existed for Nepal. This study aims to quantify social vulnerability on a local scale, considering all 75 districts, using the available census data. To perform district-level vulnerability mapping, 13 variables were selected, and aggregated indexes were plotted in an ArcGIS environment. The results show that only 4 districts in Nepal have a very low social vulnerability index, whereas 46 districts (61%) are at moderate to high social vulnerability levels. Vulnerability mapping highlights the immediate need for decentralized frameworks to tackle natural hazards at the district level; additionally, the results of this study can contribute to preparedness, planning and resource management, inter-district coordination, contingency planning, and public awareness efforts.
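One common way to build such an aggregated index is to min-max normalize each variable and average across variables per district, then bin the index into classes. The sketch below assumes this simple scheme with invented data; the study's actual variable set and weighting may differ.

```python
import numpy as np

# Hypothetical district-level indicator matrix: rows = districts,
# columns = vulnerability variables (e.g., illiteracy, dependency ratio...).
X = np.array([
    [0.30, 0.55, 0.10],
    [0.60, 0.40, 0.25],
    [0.15, 0.20, 0.05],
])

# Min-max normalize each variable to [0, 1], then average into an index.
Xn = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))
sovi = Xn.mean(axis=1)

labels = np.select([sovi < 0.25, sovi < 0.5, sovi < 0.75],
                   ["low", "moderate", "high"], default="very high")
for i, (s, lab) in enumerate(zip(sovi, labels), 1):
    print(f"district {i}: index = {s:.2f} ({lab})")
```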
Ueda, Kazuhiro; Kaneda, Yoshikazu; Sudo, Manabu; Mitsutaka, Jinbo; Li, Tao-Sheng; Suga, Kazuyoshi; Tanaka, Nobuyuki; Hamano, Kimikazu
2005-11-01
Emphysema is a well-known risk factor for developing air leak or persistent air leak after pulmonary resection. Although quantitative computed tomography (CT) and spirometry are used to diagnose emphysema, it remains controversial whether these tests are predictive of the duration of postoperative air leak. Sixty-two consecutive patients who were scheduled to undergo major lung resection for cancer were enrolled in this prospective study to define the best predictor of postoperative air leak duration. Preoperative factors analyzed included spirometric variables and area of emphysema (proportion of the low-attenuation area) that was quantified in a three-dimensional CT lung model. Chest tubes were removed the day after disappearance of the air leak, regardless of pleural drainage. Univariate and multivariate proportional hazards analyses were used to determine the influence of preoperative factors on chest tube time (air leak duration). By univariate analysis, site of resection (upper, lower), forced expiratory volume in 1 second, predicted postoperative forced expiratory volume in 1 second, and area of emphysema (< 1%, 1% to 10%, > 10%) were significant predictors of air leak duration. By multivariate analysis, site of resection and area of emphysema were the best independent determinants of air leak duration. The results were similar for patients with a smoking history (n = 40), but neither forced expiratory volume in 1 second nor predicted postoperative forced expiratory volume in 1 second were predictive of air leak duration. Quantitative CT is superior to spirometry in predicting air leak duration after major lung resection for cancer. Quantitative CT may aid in the identification of patients, particularly among those with a smoking history, requiring additional preventive procedures against air leak.
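A sketch of the multivariate proportional hazards step using the lifelines library, with a hypothetical data frame mirroring the study's two retained predictors (site of resection and percent emphysema area); the data are invented, the event is resolution of the air leak, and the penalizer is an added stabilizing assumption.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical dataset echoing the study's structure: chest-tube time in
# days ("duration"), air-leak resolution observed ("event"), and the two
# predictors retained by the multivariate model.
df = pd.DataFrame({
    "duration":       [2, 5, 3, 9, 4, 12, 2, 7],
    "event":          [1, 1, 1, 1, 1,  1, 1, 1],
    "lower_lobe":     [0, 1, 0, 1, 0,  1, 0, 1],   # site of resection
    "emphysema_pct":  [0.5, 8.0, 0.2, 15.0, 1.0, 22.0, 0.1, 6.0],
})

cph = CoxPHFitter(penalizer=0.1)
cph.fit(df, duration_col="duration", event_col="event")
cph.print_summary()   # HR < 1 here means slower air-leak resolution
```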
Ishii, Kazuei; Furuichi, Toru; Nagao, Yukari
2013-02-01
Land use at contaminated sites, following remediation, is often needed for regional redevelopment. However, few methods exist for developing economically and socially feasible land-use plans based on regional needs, because of the wide variety of land-use requirements. This study proposes a new needs analysis method for the conceptual land-use planning of contaminated sites and illustrates this method with a case study of an illegal dumping site for hazardous waste. In this method, planning factors consisting of land-use attributes and related facilities are extracted from the potential needs of the residents through a preliminary questionnaire. Using the extracted attributes of land use and the related facilities, land-use cases are designed for selection-based conjoint analysis. A second questionnaire, administered to respondents of the first who indicated an interest in participating further, is then conducted for the conjoint analysis to determine the utility function and marginal cost of each attribute, in order to prioritize the planning factors and develop a quantitative, economically and socially feasible land-use plan. Based on the results, site-specific land-use alternatives are developed and evaluated with the utility function obtained from the conjoint analysis. In this case study of an illegal dumping site for hazardous waste, the uses preferred as part of a conceptual land-use plan following remediation of the site were (1) agricultural land and a biogas plant designed to recover energy from biomass or (2) a park with a welfare facility and an athletic field. Our needs analysis method with conjoint analysis is applicable to the development of conceptual land-use plans for similar sites following remediation, particularly when added value is considered. Copyright © 2012 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cui, Y; Shirato, H; Song, J
2015-06-15
Purpose: This study aims to identify novel prognostic imaging biomarkers in locally advanced pancreatic cancer (LAPC) using quantitative, high-throughput image analysis. Methods: 86 patients with LAPC receiving chemotherapy followed by SBRT were retrospectively studied. All patients had a baseline FDG-PET scan prior to SBRT. For each patient, we extracted 435 PET imaging features of five types: statistical, morphological, textural, histogram, and wavelet. These features went through redundancy checks, robustness analysis, as well as a prescreening process based on their concordance indices with respect to the relevant outcomes. We then performed principal component analysis on the remaining features (number ranged from 10 to 16), and fitted a Cox proportional hazards regression model using the first 3 principal components. Kaplan-Meier analysis was used to assess the ability to distinguish high- versus low-risk patients separated by median predicted survival. To avoid overfitting, all evaluations were based on leave-one-out cross validation (LOOCV), in which each holdout patient was assigned to a risk group according to the model obtained from a separate training set. Results: For predicting overall survival (OS), the most dominant imaging features were wavelet coefficients. There was a statistically significant difference in OS between patients with predicted high and low risk based on LOOCV (hazard ratio: 2.26, p<0.001). Similar imaging features were also strongly associated with local progression-free survival (LPFS) (hazard ratio: 1.53, p=0.026) on LOOCV. In comparison, neither SUVmax nor TLG was associated with LPFS (p=0.103, p=0.433) (Table 1). Results for progression-free survival and distant progression-free survival showed similar trends. Conclusion: Radiomic analysis identified novel imaging features that showed improved prognostic value over conventional methods. These features characterize the degree of intra-tumor heterogeneity reflected on FDG-PET images, and their biological underpinnings warrant further investigation. If validated in large, prospective cohorts, this method could be used to stratify patients based on individualized risk.
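The pipeline described in the Methods (standardize, project onto 3 principal components, fit a Cox model, assign each held-out patient a risk in leave-one-out fashion) can be sketched as follows with scikit-learn and lifelines; the data are simulated stand-ins for the PET features, and the penalizer is an added stabilizing assumption.

```python
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n, p = 40, 12                       # 40 patients, 12 pre-screened features
X = rng.normal(size=(n, p))         # simulated stand-ins for PET features
time = rng.exponential(12, n)       # survival months (synthetic)
event = rng.integers(0, 2, n)       # 1 = death observed

cols = ["pc1", "pc2", "pc3"]
risk = np.empty(n)
for i in range(n):                  # leave-one-out cross validation
    tr = np.delete(np.arange(n), i)
    scaler = StandardScaler().fit(X[tr])
    pca = PCA(n_components=3).fit(scaler.transform(X[tr]))
    df = pd.DataFrame(pca.transform(scaler.transform(X[tr])), columns=cols)
    df["T"], df["E"] = time[tr], event[tr]
    cph = CoxPHFitter(penalizer=0.1).fit(df, duration_col="T", event_col="E")
    z_i = pd.DataFrame(pca.transform(scaler.transform(X[i:i+1])), columns=cols)
    risk[i] = cph.predict_partial_hazard(z_i).iloc[0]

# Split at the median predicted risk, as in the Kaplan-Meier comparison.
groups = np.where(risk > np.median(risk), "high", "low")
print(pd.Series(groups).value_counts())
```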
NASA Astrophysics Data System (ADS)
Martinotti, Maria Elena; Pisano, Luca; Marchesini, Ivan; Rossi, Mauro; Peruccacci, Silvia; Brunetti, Maria Teresa; Melillo, Massimo; Amoruso, Giuseppe; Loiacono, Pierluigi; Vennari, Carmela; Vessia, Giovanna; Trabace, Maria; Parise, Mario; Guzzetti, Fausto
2017-03-01
In karst environments, heavy rainfall is known to cause multiple geohydrological hazards, including inundations, flash floods, landslides and sinkholes. We studied a period of intense rainfall from 1 to 6 September 2014 in the Gargano Promontory, a karst area in Puglia, southern Italy. In the period, a sequence of torrential rainfall events caused severe damage and claimed two fatalities. The amount and accuracy of the geographical and temporal information varied for the different hazards. The temporal information was most accurate for the inundation caused by a major river, less accurate for flash floods caused by minor torrents and even less accurate for landslides. For sinkholes, only generic information on the period of occurrence of the failures was available. Our analysis revealed that in the promontory, rainfall-driven hazards occurred in response to extreme meteorological conditions and that the karst landscape responded to the torrential rainfall with a threshold behaviour. We exploited the rainfall and the landslide information to design the new ensemble-non-exceedance probability (E-NEP) algorithm for the quantitative evaluation of the possible occurrence of rainfall-induced landslides and of related geohydrological hazards. The ensemble of the metrics produced by the E-NEP algorithm provided better diagnostics than the single metrics often used for landslide forecasting, including rainfall duration, cumulated rainfall and rainfall intensity. We expect that the E-NEP algorithm will be useful for landslide early warning in karst areas and in other similar environments. We acknowledge that further tests are needed to evaluate the algorithm in different meteorological, geological and physiographical settings.
14 CFR 417.227 - Toxic release hazard analysis.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Toxic release hazard analysis. 417.227..., DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.227 Toxic release hazard analysis. A flight safety analysis must establish flight commit criteria that protect the public from any...
Lee, Eun Gyung; Kim, Seung Won; Feigley, Charles E.; Harper, Martin
2015-01-01
This study introduces two semi-quantitative methods, Structured Subjective Assessment (SSA) and Control of Substances Hazardous to Health (COSHH) Essentials, in conjunction with two-dimensional Monte Carlo simulations for determining prior probabilities. Prior distribution using expert judgment was included for comparison. Practical applications of the proposed methods were demonstrated using personal exposure measurements of isoamyl acetate in an electronics manufacturing facility and of isopropanol in a printing shop. Applicability of these methods in real workplaces was discussed based on the advantages and disadvantages of each method. Although these methods could not be completely independent of expert judgments, this study demonstrated a methodological improvement in the estimation of the prior distribution for the Bayesian decision analysis tool. The proposed methods provide a logical basis for the decision process by considering determinants of worker exposure. PMID:23252451
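A minimal two-dimensional Monte Carlo sketch of the kind used to build such priors: an outer loop samples parameter uncertainty (here, the geometric mean and geometric standard deviation of a lognormal exposure distribution), an inner loop samples exposure variability, and the spread of a chosen exceedance metric across outer iterations forms the prior. All parameter ranges are illustrative.

```python
import numpy as np

rng = np.random.default_rng(42)

# Outer loop: uncertainty about the exposure model's parameters.
# Inner loop: worker-to-worker and day-to-day variability in exposure.
n_outer, n_inner = 1_000, 5_000
p95 = np.empty(n_outer)
for j in range(n_outer):
    gm = rng.uniform(1.0, 5.0)       # ppm, uncertain geometric mean
    gsd = rng.uniform(1.5, 3.0)      # uncertain geometric SD
    exposures = rng.lognormal(np.log(gm), np.log(gsd), n_inner)
    p95[j] = np.percentile(exposures, 95)

# The spread of the 95th percentile across outer iterations serves as a
# prior for the exceedance metric in the Bayesian decision analysis.
lo, med, hi = np.percentile(p95, [2.5, 50, 97.5])
print(f"95th-percentile exposure: median {med:.1f} ppm "
      f"(95% interval {lo:.1f}-{hi:.1f})")
```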
Crouch, Edmund A; Labarre, David; Golden, Neal J; Kause, Janell R; Dearfield, Kerry L
2009-10-01
The U.S. Department of Agriculture, Food Safety and Inspection Service is exploring quantitative risk assessment methodologies to incorporate the use of the Codex Alimentarius' newly adopted risk management metrics (e.g., food safety objectives and performance objectives). It is suggested that use of these metrics would more closely tie the results of quantitative microbial risk assessments (QMRAs) to public health outcomes. By estimating the food safety objective (the maximum frequency and/or concentration of a hazard in a food at the time of consumption) and the performance objective (the maximum frequency and/or concentration of a hazard in a food at a specified step in the food chain before the time of consumption), risk managers will have a better understanding of the appropriate level of protection (ALOP) from microbial hazards for public health protection. Here we demonstrate a general methodology that allows identification of an ALOP and evaluation of corresponding metrics at appropriate points in the food chain. It requires a two-dimensional probabilistic risk assessment, the example used being the Monte Carlo QMRA for Clostridium perfringens in ready-to-eat and partially cooked meat and poultry products, with minor modifications to evaluate and abstract the required measures. For demonstration purposes, the QMRA model was applied specifically to hot dogs produced and consumed in the United States. Evaluation of the cumulative uncertainty distribution for the illness rate allows specification of an ALOP that, with defined confidence, corresponds to current industry practices.
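The food-chain bookkeeping behind an FSO can be sketched with the ICMSF-style relation H0 - sum(R) + sum(I) <= FSO, in log10 cfu/g, evaluated probabilistically; the distributions below are invented placeholders, not the FSIS model's inputs.

```python
import numpy as np

# ICMSF-style compliance check, in log10 cfu/g:
#   H0 - sum(reductions) + sum(increases) <= FSO
# All distributions are illustrative, not FSIS estimates for hot dogs.
rng = np.random.default_rng(7)
n = 100_000
H0 = rng.normal(2.0, 0.5, n)          # initial contamination
R = rng.normal(5.0, 0.7, n)           # cooking lethality
I = rng.normal(1.5, 0.6, n)           # growth during cooling/storage
FSO = 1.0                             # maximum at consumption

at_consumption = H0 - R + I
p_fail = np.mean(at_consumption > FSO)
print(f"fraction of servings exceeding the FSO: {p_fail:.4%}")
```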
Functional-analytical capabilities of GIS technology in the study of water use risks
NASA Astrophysics Data System (ADS)
Nevidimova, O. G.; Yankovich, E. P.; Yankovich, K. S.
2015-02-01
Regional security aspects of economic activities are of great importance for legal regulation in environmental management. This has become a critical issue due to climate change, especially in regions where severe climate conditions have a great impact on almost all types of natural resource use. A detailed analysis of the climatic and hydrological situation in Tomsk Oblast was carried out with respect to water use risks. Based on techniques developed by the authors, an informational and analytical database was created on the ArcGIS software platform, combining statistical (quantitative) and spatial characteristics of natural hazards and socio-economic factors. This system was employed to perform areal zoning according to the degree of water use risk involved.
Dose-mortality assessment upon reuse and recycling of industrial sludge.
Lin, Kae-Long; Chen, Bor-Yann
2007-09-05
This study provides a novel attempt to put forward, in general toxicological terms, a quantitative ranking of the toxicity of various sources of sludge for possible reuse in further applications. The high leaching concentrations of copper in printed circuit board (PCB) sludge and of chromium in leather sludge clearly exceeded current Taiwan EPA regulatory thresholds, so these sludges should be classified as hazardous wastes. Dose-mortality analysis indicated that the toxicity ranking of the different sources of sludge was PCB sludge > CaF(2) sludge > leather sludge. PCB sludge was also confirmed as a hazardous waste, since its toxicity potency was nearly identical to that of CdCl(2). However, leather sludge seemed to be much less toxic than anticipated, perhaps due to a significant decrease in the toxic species bioavailable in the aqueous phase to the reporter bacterium Escherichia coli DH5alpha. For possible reuse of the sludge, the maximum concentrations allowable to be considered "safe" (ca. EC(100)/100) were 9.68, 42.1 and 176 mgL(-1) for CaF(2) sludge, PCB sludge and leather sludge, respectively.
Analysis of Infiltration-Suction Response in Unsaturated Residual Soil Slope in Gelugor, Penang
NASA Astrophysics Data System (ADS)
Ashraf Mohamad Ismail, Mohd; Hasliza Hamzah, Nur; Min, Ng Soon; Hazreek Zainal Abidin, Mohd; Tajudin, Saiful Azhar Ahmad; Madun, Aziman
2018-04-01
Rainfall infiltration on a residual soil slope may impair slope stability by altering the pore-water pressure in the soil. A study was carried out on an unsaturated residual soil slope in Gelugor, Penang to determine the changes in matric suction of residual soils at different depths due to rainwater infiltration. The study comprised site investigation, field instrumentation, laboratory experiments and numerical modeling. The void ratio and porosity of the soil were found to decrease with depth, while the bulk density and dry density increased owing to the lower porosity at greater depth. The soil infiltration rate and the matric suction at all depths decreased with increasing volumetric water content and degree of saturation. Numerical modeling was used to verify and predict the relationship between the infiltration-suction response and the degree of saturation. Numerical models can be used to integrate rainfall scenarios into quantitative landslide hazard assessments; development plans and mitigation measures can then be designed for the impacts estimated from hazard assessments based on the collected data.
Spatial pattern recognition of seismic events in South West Colombia
NASA Astrophysics Data System (ADS)
Benítez, Hernán D.; Flórez, Juan F.; Duque, Diana P.; Benavides, Alberto; Lucía Baquero, Olga; Quintero, Jiber
2013-09-01
Recognition of seismogenic zones in geographical regions supports seismic hazard studies. This recognition is usually based on visual, qualitative and subjective analysis of data. Spatial pattern recognition provides a well-founded means to obtain relevant information from large amounts of data. The purpose of this work is to identify and classify spatial patterns in instrumental data from the South West Colombian seismic database. In this research, a clustering tendency analysis validates whether the seismic database possesses a clustering structure, and a non-supervised fuzzy clustering algorithm creates groups of seismic events. Given the sensitivity of fuzzy clustering algorithms to the initial positions of the centroids, we propose a centroid initialization methodology that generates partitions stable with respect to the initialization. As a result of this work, a public software tool provides the routines developed for the clustering methodology. The analysis of the seismogenic zones obtained reveals meaningful spatial patterns in South West Colombia, and the clustering analysis provides a quantitative location and dispersion of seismogenic zones that facilitates seismological interpretation of the seismic activity in the region.
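A minimal hand-rolled fuzzy c-means loop (the standard algorithm, not the paper's software tool) applied to synthetic epicentre coordinates; a real analysis would use catalogue data and the stable centroid-initialization procedure the paper proposes.

```python
import numpy as np

def fuzzy_cmeans(X, c=3, m=2.0, iters=100, seed=0):
    """Minimal fuzzy c-means: returns centroids and the membership matrix U."""
    rng = np.random.default_rng(seed)
    U = rng.dirichlet(np.ones(c), size=len(X))     # random fuzzy memberships
    for _ in range(iters):
        W = U ** m
        centroids = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        d = np.fmax(d, 1e-12)                      # avoid division by zero
        inv = d ** (-2.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True)   # standard FCM update
    return centroids, U

# Synthetic epicentres (lon, lat) around three invented cluster centres.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(loc, 0.2, size=(50, 2))
               for loc in ([-76.5, 3.5], [-77.2, 1.8], [-75.9, 4.4])])
centroids, U = fuzzy_cmeans(X, c=3)
print(np.round(centroids, 2))
```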
IN SILICO MODELLING OF HAZARDOUS ENDPOINTS: CURRENT PROBLEMS AND PROSPECTIVES
The primary hurdles for Quantitative Structure-Activity Relationships (QSARs) to overcome their acceptance for regulatory purposes will be discussed. They include (a) the development of more mechanistic representations of chemical structure, (b) the classification of toxicity pa...
New methodologies for multi-scale time-variant reliability analysis of complex lifeline networks
NASA Astrophysics Data System (ADS)
Kurtz, Nolan Scot
The cost of maintaining existing civil infrastructure is enormous. Since the livelihood of the public depends on such infrastructure, its state must be managed appropriately using quantitative approaches. Practitioners must consider not only which components are most fragile to hazard, e.g. seismicity, storm surge, hurricane winds, etc., but also how they participate at the network level, using network analysis. Focusing on particularly damaged components does not necessarily increase network functionality, which is what matters most to the people who depend on such infrastructure. Several network analyses, e.g. S-RDA, LP-bounds, and crude MCS, and performance metrics, e.g. disconnection bounds and component importance, are available for such purposes. Because these networks already exist, their state in time is also important: if networks are close to chloride sources, deterioration may be a major issue, and information from field inspections may have large impacts on quantitative models. To address such issues, hazard risk analysis methodologies were created analytically for deteriorating networks subjected to seismicity, i.e. earthquakes. A bridge component model was constructed for these methodologies; the bridge fragilities, which were constructed from data, required a deeper level of analysis, as they were relevant to specific structures. Furthermore, the network effects of chloride-induced deterioration were investigated. Depending on how mathematical models incorporate new information, many approaches are available, such as Bayesian model updating. To make such procedures more flexible, an adaptive importance sampling scheme was created for structural reliability problems; it handles many kinds of system and component problems with singular or multiple important regions of the limit state function. These and previously developed analysis methodologies were found to be strongly sensitive to network size: special network topologies may be more or less computationally difficult, and the resolution of the network also has large effects. To take advantage of some types of topologies, network hierarchical structures with super-link representation have been used in the literature to increase computational efficiency by analyzing smaller, densely connected subnetworks; however, such structures were based on user input and were at times subjective. To address this, the algorithms must be automated and reliable; the hierarchical structures may themselves reveal the structure of the network. The risk analysis methodology was therefore extended to larger networks using automated hierarchical structures. Component importance is the most important output of such network analysis; however, it may only indicate which bridges to inspect or repair earliest, and little else, and high correlations affect component importance measures negatively. Moreover, a regional perspective is not appropriately captured by component measures alone. To provide a more regional view, group importance measures based on hierarchical structures were created; such structures may also be used to design regional inspection and repair approaches. Using these analytical, quantitative risk approaches, the next generation of decision makers may make both component-based and regional optimal decisions using information from network function and the further effects of infrastructure deterioration.
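The crude-MCS network analysis mentioned above can be sketched with networkx: sample each bridge's survival independently, rebuild the network, and count source-sink disconnections. The topology and failure probabilities are toy values.

```python
import networkx as nx
import numpy as np

rng = np.random.default_rng(0)

# Toy lifeline network: edges are bridges with hypothetical seismic failure
# probabilities (deterioration would be modelled by raising these over time).
G = nx.Graph()
G.add_edges_from([("src", "a", {"pf": 0.05}), ("a", "b", {"pf": 0.10}),
                  ("src", "b", {"pf": 0.08}), ("b", "sink", {"pf": 0.03}),
                  ("a", "sink", {"pf": 0.12})])
edges = list(G.edges(data="pf"))

# Crude Monte Carlo estimate of the source-sink disconnection probability.
n, fails = 20_000, 0
for _ in range(n):
    Gs = nx.Graph()
    Gs.add_nodes_from(G)
    Gs.add_edges_from((u, v) for u, v, pf in edges if rng.random() > pf)
    if not nx.has_path(Gs, "src", "sink"):
        fails += 1
print(f"P(disconnection) ~ {fails / n:.4f}")
```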
Damman, Peter; Holmvang, Lene; Tijssen, Jan G P; Lagerqvist, Bo; Clayton, Tim C; Pocock, Stuart J; Windhausen, Fons; Hirsch, Alexander; Fox, Keith A A; Wallentin, Lars; de Winter, Robbert J
2012-01-01
The aim of this study was to evaluate the independent prognostic value of qualitative and quantitative admission electrocardiographic (ECG) analysis regarding long-term outcomes after non-ST-segment elevation acute coronary syndromes (NSTE-ACS). From the Fragmin and Fast Revascularization During Instability in Coronary Artery Disease (FRISC II), Invasive Versus Conservative Treatment in Unstable Coronary Syndromes (ICTUS), and Randomized Intervention Trial of Unstable Angina 3 (RITA-3) patient-pooled database, 5,420 patients with NSTE-ACS with qualitative ECG data, of whom 2,901 had quantitative data, were included in this analysis. The main outcome was 5-year cardiovascular death or myocardial infarction. Hazard ratios (HRs) were calculated with Cox regression models, and adjustments were made for established outcome predictors. The additional discriminative value was assessed with the category-less net reclassification improvement (NRI) and integrated discrimination improvement (IDI) indexes. In the 5,420 patients, the presence of ST-segment depression (≥1 mm; adjusted HR 1.43, 95% confidence interval [CI] 1.25 to 1.63) and left bundle branch block (adjusted HR 1.64, 95% CI 1.18 to 2.28) was independently associated with long-term cardiovascular death or myocardial infarction; the risk increases were evident both short and long term. On quantitative ECG analysis, cumulative ST-segment depression (≥5 mm; adjusted HR 1.34, 95% CI 1.05 to 1.70), the presence of left bundle branch block (adjusted HR 2.15, 95% CI 1.36 to 3.40), or the presence of ≥6 leads with inverse T waves (adjusted HR 1.22, 95% CI 0.97 to 1.55) was independently associated with long-term outcomes. No interaction was observed with treatment strategy. No improvements in NRI or IDI were observed after the addition of quantitative characteristics to a model including qualitative characteristics. In conclusion, in the FRISC II, ICTUS, and RITA-3 NSTE-ACS patient-pooled data set, admission ECG characteristics provided long-term prognostic value for cardiovascular death or myocardial infarction; quantitative ECG characteristics provided no incremental discrimination compared with qualitative characteristics. Copyright © 2012 Elsevier Inc. All rights reserved.
Meallem, Ilana; Garb, Yaakov; Cwikel, Julie
2010-01-01
The Bedouin of the Negev region of Israel are a formerly nomadic, indigenous ethnic minority, of which 40% currently live in unrecognized villages without organized solid waste disposal. This study, using both quantitative and qualitative methods, explored the transition from traditional rubbish production and disposal to current practices, the current composition of rubbish, methods of waste disposal, and the extent of exposure to waste-related environmental hazards in the village of Um Batim. The modern consumer lifestyle produced both residential and construction waste that was dumped very close to households. Waste was tended mostly by women, who predominantly used backyard burning for disposal, exposing villagers to corrosive, poisonous, and dangerously flammable items at these burn sites. Village residents expressed a high level of concern over environmental hazards, yet no organized waste disposal or environmental hazard reduction was implemented.
Application of Catastrophe Risk Modelling to Evacuation Public Policy
NASA Astrophysics Data System (ADS)
Woo, G.
2009-04-01
The decision by civic authorities to evacuate an area threatened by a natural hazard is especially fraught when the population in harm's way is extremely large, and when there is considerable uncertainty in the spatial footprint, scale, and strike time of a hazard event. Because evacuation has traditionally been viewed as a hazard forecasting issue, civic authorities turn to scientists for advice on a potentially imminent dangerous event. However, the level of scientific confidence varies enormously from one peril and crisis situation to another. With superior observational data, meteorological and hydrological hazards are generally better forecast than geological hazards. But even with Atlantic hurricanes, the track and intensity of a hurricane can change significantly within a few hours. This complicated and delayed the decision to call an evacuation of New Orleans when it was threatened by Hurricane Katrina, and would present a severe dilemma if a major hurricane appeared to be heading for New York. Evacuation needs to be perceived as a risk issue, requiring the expertise of catastrophe risk modellers as well as geoscientists. Faced with evidence of a great earthquake in the Indian Ocean in December 2004, seismologists were reluctant to give a tsunami warning without more direct sea observations. Yet, from a risk perspective, the danger to coastal populations would have warranted an attempt at a tsunami warning, even though there was significant uncertainty in the hazard forecast and a chance of a false alarm. A systematic, coherent risk-based framework for evacuation decision-making exists, which weighs the advantages of an evacuation call against the disadvantages. Implicitly and qualitatively, such a cost-benefit analysis is undertaken by civic authorities whenever an evacuation is considered. With the progress in catastrophe risk modelling, such an analysis can be made explicit and quantitative, providing a transparent audit trail for the decision process. A stochastic event set, the core of a catastrophe risk model, is required to explore the casualty implications of different possible hazard scenarios, to assess the proportion of an evacuated population who would owe their lives to an evacuation, and to estimate the economic loss associated with an unnecessary evacuation. This paper reviews the developing methodology for applying catastrophe risk modelling to support public policy in evacuation decision-making, and provides illustrations from across the range of natural hazards. Evacuation during volcanic crises is a prime example, recognizing the improving forecasting skill of volcanologists, who are now able to account probabilistically for precursory seismological, geodetic, and geochemical monitoring data. This methodology is shown to help civic authorities make sounder risk-informed decisions on the timing and population segmentation of evacuation from volcanoes and calderas, such as Vesuvius and Campi Flegrei, which lie in densely populated urban regions.
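The implicit cost-benefit weighing described above can be made explicit with a toy expected-value comparison. All numbers below are invented for illustration; a real catastrophe-risk model would derive them from a stochastic event set.

```python
# Toy risk-based evacuation decision: compare the expected benefit of
# calling an evacuation against its cost.
p_event = 0.15                # forecast probability the hazard strikes the area
deaths_if_no_evac = 2000      # expected deaths if the event occurs, no evacuation
evac_effectiveness = 0.9      # fraction of those deaths an evacuation would prevent
value_per_life = 10e6         # value of a statistical life, in dollars
evac_cost = 500e6             # direct plus economic cost of an evacuation

expected_benefit = p_event * deaths_if_no_evac * evac_effectiveness * value_per_life
print(f"Expected benefit ${expected_benefit/1e6:.0f}M vs cost ${evac_cost/1e6:.0f}M")
print("Call evacuation" if expected_benefit > evac_cost else "Hold off for now")
```

Even with large forecast uncertainty (here p_event is only 0.15), the expected benefit can dominate the cost of a possible false alarm, which is the risk-perspective point made above for the 2004 Indian Ocean tsunami.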
The hazard of exposure to impulse noise as a function of frequency, volume 2
NASA Astrophysics Data System (ADS)
Patterson, James H., Jr.; Carrier, Melvin, Jr.; Bordwell, Kevin; Lomba, Ilia M.; Gautier, Roger P.
1991-06-01
The energy spectrum of a noise is known to be an important variable in determining the effects of a traumatic exposure. However, existing criteria for exposure to impulse noise do not consider the frequency spectrum of an impulse as a variable in the evaluation of hazards to the auditory system. This report presents the results of a study designed to determine the relative potential of impulsive energy concentrated at different frequencies to cause auditory system trauma. One hundred and eighteen (118) chinchillas, divided into 20 groups of 5 to 7 animals, were used in these experiments. Pre- and post-exposure hearing thresholds were measured at 10 test frequencies between 0.125 and 8 kHz on each animal using avoidance conditioning procedures. Quantitative histology (cochleograms) was used to determine the extent and pattern of sensory cell damage. The noise exposure stimuli consisted of six different computer-generated narrow-band tone bursts with center frequencies at 0.260, 0.775, 1.025, 1.350, 2.450, and 3.550 kHz. Each narrow-band exposure stimulus was presented at two to four different intensities. An analysis of the audiometric and histological data allowed a frequency weighting function to be derived. The weighting function clearly demonstrates that equivalent amounts of impulsive energy concentrated at different frequencies are not equally hazardous to auditory function.
A critical analysis of hazard resilience measures within sustainability assessment frameworks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Matthews, Elizabeth C., E-mail: echiso1@lsu.edu; Sattler, Meredith, E-mail: msattler@lsu.edu; Friedland, Carol J., E-mail: friedland@lsu.edu
Today, numerous sustainability assessment frameworks (SAFs) exist to guide designers in achieving sustainable performance in the design of structures and communities. SAFs are beneficial in educating users and are useful tools for incorporating sustainability strategies into planning, design, and construction; however, there is currently a substantial gap in the ability of existing SAFs to incorporate hazard resistance and hazard mitigation in the broader context of sustainable design. This paper analyzes the incorporation of hazard resistant design and hazard mitigation strategies within SAFs via a multi-level analysis of eleven SAFs. The SAFs analyzed range in scale of application (i.e. building, site, community). Three levels of analysis are presented: (1) macro-level analysis comparing the number of measures strictly addressing resilience versus sustainability, (2) meso-level analysis of the coverage of types of hazards within SAFs (e.g. flood, fire), and (3) micro-level analysis of SAF measures connected to flood-related hazard resilience. The results demonstrate that hazard resistance and hazard mitigation do not figure prominently in the intent of SAFs and that weaknesses in resilience coverage exist that have the potential to lead to the design of structures and communities that are still highly vulnerable to the impacts of extreme events. - Highlights: • Sustainability assessment frameworks (SAFs) were analyzed for resilience coverage • Hazard resistance and mitigation do not figure prominently in the intent of SAFs • Approximately 75% of SAFs analyzed address three or fewer hazards • Lack of economic measures within SAFs could impact resilience and sustainability • Resilience measures for flood hazards are not consistently included in SAFs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
PECH, S.H.
This report describes the methodology used in conducting the K Basins Hazard Analysis, which provides the foundation for the K Basins Final Safety Analysis Report. This hazard analysis was performed in accordance with guidance provided by DOE-STD-3009-94, Preparation Guide for U. S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report.
Predicting MCI outcome with clinically available MRI and CSF biomarkers
Heister, D.; Brewer, J.B.; Magda, S.; Blennow, K.
2011-01-01
Objective: To determine the ability of clinically available volumetric MRI (vMRI) and CSF biomarkers, alone or in combination with a quantitative learning measure, to predict conversion to Alzheimer disease (AD) in patients with mild cognitive impairment (MCI). Methods: We stratified 192 MCI participants into positive and negative risk groups on the basis of 1) degree of learning impairment on the Rey Auditory Verbal Learning Test; 2) medial temporal atrophy, quantified from Food and Drug Administration–approved software for automated vMRI analysis; and 3) CSF biomarker levels. We also stratified participants based on combinations of risk factors. We computed Cox proportional hazards models, controlling for age, to assess 3-year risk of converting to AD as a function of risk group and used Kaplan-Meier analyses to determine median survival times. Results: When risk factors were examined separately, individuals testing positive showed significantly higher risk of converting to AD than individuals testing negative (hazard ratios [HR] 1.8–4.1). The joint presence of any 2 risk factors substantially increased risk, with the combination of greater learning impairment and increased atrophy associated with highest risk (HR 29.0): 85% of patients with both risk factors converted to AD within 3 years, vs 5% of those with neither. The presence of medial temporal atrophy was associated with shortest median dementia-free survival (15 months). Conclusions: Incorporating quantitative assessment of learning ability along with vMRI or CSF biomarkers in the clinical workup of MCI can provide critical information on risk of imminent conversion to AD. PMID:21998317
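The Kaplan-Meier comparison described above can be sketched with the lifelines library on synthetic conversion times. Group sizes, rates, and the 36-month window below are illustrative stand-ins, not the study data.

```python
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(1)
# Synthetic months-to-conversion, censored at 36 months of follow-up.
pos = rng.exponential(20.0, 100).clip(max=36.0)   # risk-positive: converts sooner
neg = rng.exponential(120.0, 100).clip(max=36.0)  # risk-negative

kmf = KaplanMeierFitter()
kmf.fit(pos, event_observed=pos < 36.0, label="risk-positive")
print("Median dementia-free survival (months):", kmf.median_survival_time_)
kmf.fit(neg, event_observed=neg < 36.0, label="risk-negative")
print("Median dementia-free survival (months):", kmf.median_survival_time_)
```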
Arenal-type pyroclastic flows: A probabilistic event tree risk analysis
NASA Astrophysics Data System (ADS)
Meloy, Anthony F.
2006-09-01
A quantitative hazard-specific scenario-modelling risk analysis is performed at Arenal volcano, Costa Rica for the newly recognised Arenal-type pyroclastic flow (ATPF) phenomenon using an event tree framework. These flows are generated by the sudden depressurisation and fragmentation of an active basaltic andesite lava pool as a result of a partial collapse of the crater wall. The deposits of this type of flow include angular blocks and juvenile clasts, which are rarely found in other types of pyroclastic flow. An event tree analysis (ETA) is a useful tool and framework in which to analyse and graphically present the probabilities of the occurrence of many possible events in a complex system. Four event trees are created in the analysis, three of which are extended to investigate the varying individual risk faced by three generic representatives of the surrounding community: a resident, a worker, and a tourist. The raw numerical risk estimates determined by the ETA are converted into a set of linguistic expressions (i.e. VERY HIGH, HIGH, MODERATE etc.) using an established risk classification scale. Three individually tailored semi-quantitative risk maps are then created from a set of risk conversion tables to show how the risk varies for each individual in different areas around the volcano. In some cases, by relocating from the north to the south, the level of risk can be reduced by up to three classes. While the individual risk maps may be broadly applicable, and therefore of interest to the general community, the risk maps and associated probability values generated in the ETA are intended to be used by trained professionals and government agencies to evaluate the risk and effectively manage the long-term development of infrastructure and habitation. With the addition of fresh monitoring data, the combination of both long- and short-term event trees would provide a comprehensive and consistent method of risk analysis (both during and pre-crisis), and as such, an ETA is considered to be a valuable quantitative decision support tool.
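In an event tree, the probability of an outcome is the product of the conditional probabilities along its branch, and the raw number is then mapped to a linguistic risk class. The branch probabilities and class thresholds below are invented for illustration and are not the Arenal estimates.

```python
# One branch of a toy ATPF event tree: annual individual risk for a tourist.
branches = {
    "crater-wall collapse": 0.10,        # P(initiating event) per year
    "lava-pool fragmentation": 0.60,     # P(ATPF generated | collapse)
    "flow reaches tourist zone": 0.25,   # P(reach | ATPF)
    "individual present": 0.05,          # P(exposure | reach)
    "exposure is fatal": 0.50,           # P(death | exposure)
}
p = 1.0
for prob in branches.values():
    p *= prob
print(f"Annual individual risk along this branch: {p:.1e}")  # 3.8e-4

# Map the raw estimate to a linguistic class, as an established risk scale does.
scale = [(1e-3, "VERY HIGH"), (1e-4, "HIGH"), (1e-5, "MODERATE"), (0.0, "LOW")]
print("Risk class:", next(label for cut, label in scale if p >= cut))
```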
NASA Astrophysics Data System (ADS)
Gallina, Valentina; Torressan, Silvia; Zabeo, Alex; Critto, Andrea; Glade, Thomas; Marcomini, Antonio
2015-04-01
Climate change is expected to pose a wide range of impacts on natural and human systems worldwide, increasing risks from long-term climate trends and disasters triggered by weather extremes. Accordingly, in the future, one region could be affected by interactions, synergies, and trade-offs among multiple hazards and impacts. A multi-risk approach is needed to effectively address the multiple threats posed by climate change across regions and targets, supporting decision-makers toward a new paradigm of multi-hazard and risk management. Relevant initiatives have already been developed for the assessment of multiple hazards and risks affecting the same area in a defined timeframe by means of quantitative and semi-quantitative approaches. Most of them address the relations among different natural hazards; however, the effect of future climate change is usually not considered. In order to fill this gap, an advanced multi-risk methodology was developed at the Euro-Mediterranean Centre on Climate Change (CMCC) for estimating cumulative impacts related to climate change at the regional (i.e. sub-national) scale. This methodology was implemented in an assessment tool which allows natural systems and human assets at risk resulting from different interacting hazards to be scanned and classified quickly. A multi-hazard index is proposed to evaluate the relationships of different climate-related hazards (e.g. sea-level rise, coastal erosion, storm surge) occurring in the same spatial and temporal area, by means of an influence matrix and the disjoint probability function. Future hazard scenarios provided by regional climate models are used as input for this step in order to consider the possible effects of future climate change scenarios. Then, the multi-vulnerability of different exposed receptors (e.g. natural systems, beaches, agricultural and urban areas) is estimated through a variety of vulnerability indicators (e.g. vegetation cover, sediment budget, % of urbanization), tailored case by case to different sets of natural hazards and elements at risk. Finally, the multi-risk assessment integrates the multi-hazard with the multi-vulnerability index of exposed receptors, providing a relative ranking of areas and targets potentially affected by multiple risks in the considered region. The methodology was applied to the North Adriatic coast (Italy), producing a range of GIS-based multi-hazard, exposure, multi-vulnerability, and multi-risk maps that can be used by policy-makers to define risk management and adaptation strategies. Results show that areas affected by higher multi-hazard scores are located close to the coastline, where all the investigated hazards are present. Multi-vulnerability assumes relatively high scores across the whole case study, showing that beaches, wetlands, protected areas, and river mouths are the most sensitive targets. The final estimate of multi-risk for coastal municipalities provides useful information for local public authorities to set future priorities for adaptation and to define future plans for shoreline and coastal management in view of climate change.
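The disjoint-probability step mentioned above can be sketched as follows: assuming (for illustration) independent hazards, the probability that a coastal cell is affected by at least one of them is one minus the product of the non-occurrence probabilities. The hazard probabilities are invented, and the influence-matrix weighting used in the actual methodology is omitted here.

```python
# Probability that at least one of several hazards affects the same cell.
hazards = {
    "sea-level-rise inundation": 0.30,
    "coastal erosion": 0.20,
    "storm surge": 0.25,
}
p_none = 1.0
for p in hazards.values():
    p_none *= 1.0 - p
print(f"P(at least one hazard): {1.0 - p_none:.2f}")  # 0.58
```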
Zhang, Yimei; Li, Shuai; Wang, Fei; Chen, Zhuang; Chen, Jie; Wang, Liqun
2018-09-01
The toxicity of heavy metals from industrialization poses critical concerns, and analysis of sources associated with potential human health risks is of particular significance. Assessing the human health risk of pollution sources (factored health risk) concurrently in the whole region and in sub-regions can provide more instructive information to protect specific potential victims. In this research, we establish a new expression model of human health risk based on quantitative analysis of source contributions at different spatial scales. The larger-scale grids and their spatial codes are used to initially identify the level of pollution risk, the type of pollution source, and the sensitive population at high risk. The smaller-scale grids and their spatial codes are used to identify the contribution of the various pollution sources to each sub-region (larger grid) and to assess the health risks posed by each source for each sub-region. The results of the case study show that, for children (a sensitive population whose activity is centered on schools and residential areas), the major pollution sources are the abandoned lead-acid battery plant (ALP), traffic emissions, and agricultural activity. The new models and results of this research provide effective spatial information and a useful model for quantifying the hazards that source categories pose to human health at complex industrial systems in the future. Copyright © 2018 Elsevier Ltd. All rights reserved.
[The concept of risk and its estimation].
Zocchetti, C; Della Foglia, M; Colombi, A
1996-01-01
The concept of risk, in relation to human health, is a topic of primary interest for occupational health professionals. New legislation recently established in Italy (626/94) according to European Community directives in the field of preventive medicine has called attention to this topic, and in particular to risk assessment and evaluation. Motivated by this context and by the impression that the concept of risk is frequently misunderstood, the present paper has two aims: to identify the different meanings of the term "risk" in the new Italian legislation and critically discuss some commonly used definitions; and to propose a general definition, with a mathematical expression for quantitative risk estimation. The term risk (and risk estimation, assessment, or evaluation) has mainly referred to three different contexts: hazard identification, exposure assessment, and the occurrence of adverse health effects. Unfortunately, in some passages of the legislation it is difficult to identify the intended meaning of the term. This can lead to equivocal interpretations and erroneous applications of the law, because hazard evaluation, exposure assessment, and adverse health effects identification are completely different topics that require integrated but distinct approaches to risk management. As for a quantitative definition of risk, we suggest an algorithm that connects the three basic risk elements (hazard, exposure, adverse health effects) by means of their probabilities of occurrence: the probability of being exposed (to a definite dose) given that a specific hazard is present, Pr(e|p), and the probability of occurrence of an adverse health effect as a consequence of that exposure, Pr(d|e). Using these quantitative components, risk can be defined as a sequence of measurable events that starts with hazard identification and ends with disease occurrence; the following formal definition of risk is therefore proposed: the probability of occurrence, in a given period of time, of an adverse health effect as a consequence of the existence of a hazard. In formula: R(d|p) = Pr(e|p) x Pr(d|e). While Pr(e|p) (exposure given hazard) must be evaluated in the situation under study, two alternatives exist for estimating the occurrence of adverse health effects, Pr(d|e): a "direct" estimation of the damage through formal epidemiologic studies conducted in the situation under observation; and an "indirect" estimation of Pr(d|e) using information taken from the scientific literature (epidemiologic evaluations, dose-response relationships, extrapolations, ...). Both approaches are presented along with their respective advantages, disadvantages, and uncertainties. The usefulness of the proposed algorithm is discussed with respect to common applications of risk assessment in occupational medicine; the relevance of time for risk estimation (duration of observation, duration of exposure, and latency of effect) is briefly explained; and the paper highlights how the proposed algorithm accounts (in terms of prevention and public health) for both the etiologic relevance of the exposure and the consequences of exposure removal.
As a final comment, it is noted that the widespread application of good work practices (technical, behavioral, organizational, ...), or the exhaustive use of checklists, can be relevant for improving the efficacy of prevention, but does not constitute a quantitative procedure of risk assessment, which must in any circumstance be considered the preferred approach to preventing adverse health effects.
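The proposed algorithm lends itself to a direct worked instance. The numbers below are invented for a single workplace hazard over one year; in practice Pr(e|p) is evaluated on site and Pr(d|e) comes from direct or indirect estimation as described above.

```python
# Worked instance of R(d|p) = Pr(e|p) * Pr(d|e).
pr_exposure_given_hazard = 0.40   # P(exposed to a definite dose | hazard present)
pr_disease_given_exposure = 0.02  # P(adverse health effect | that exposure),
                                  # e.g. taken from the epidemiologic literature

risk = pr_exposure_given_hazard * pr_disease_given_exposure
print(f"R(d|p) = {risk:.3f}")     # 0.008 over the period considered
```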
Lee, Geewon; Kim, Ki Uk; Lee, Ji Won; Suh, Young Ju; Jeong, Yeon Joo
2017-05-01
Background Although fibrotic idiopathic interstitial pneumonias (IIPs) alone and those combined with pulmonary emphysema are naturally progressive diseases, the process of deterioration and the outcomes are variable. Purpose To evaluate and compare serial changes of computed tomography (CT) abnormalities and prognostic predictive factors in fibrotic IIPs alone and those combined with pulmonary emphysema. Material and Methods A total of 148 patients with fibrotic IIPs alone (82 patients) or combined with pulmonary emphysema (66 patients) were enrolled. Semi-quantitative CT analysis was used to assess the extents of CT characteristics, which were evaluated on initial and follow-up CT images. Univariate and multivariate analyses were performed to assess the effects of clinical and CT variables on survival. Results Significant differences were noted between fibrotic scores, as determined using initial CT scans, in the fibrotic IIPs alone group (21.22 ± 9.83) and the combined with pulmonary emphysema group (14.70 ± 7.28) (P < 0.001). At follow-up CT, changes in the extent of ground glass opacities (GGO) were greater (P = 0.031) and lung cancer was more prevalent (P = 0.001) in the fibrotic IIPs combined with pulmonary emphysema group. Multivariate Cox proportional hazards analysis showed that changes in the extent of GGO (hazard ratio, 1.056) and the presence of lung cancer (hazard ratio, 4.631) were predictive factors of poor survival. Conclusion Although patients with fibrotic IIPs alone and those combined with pulmonary emphysema have similar mortality, lung cancer was more prevalent in patients with fibrotic IIPs combined with pulmonary emphysema. Furthermore, changes in the extent of GGO and the presence of lung cancer were independent prognostic factors of poor survival.
NASA Astrophysics Data System (ADS)
Xia, Wei; Chen, Ying; Zhang, Rui; Yan, Zhuangzhi; Zhou, Xiaobo; Zhang, Bo; Gao, Xin
2018-02-01
Our objective was to identify prognostic imaging biomarkers for hepatocellular carcinoma in contrast-enhanced computed tomography (CECT) with biological interpretations by associating imaging features and gene modules. We retrospectively analyzed 371 patients who had gene expression profiles. For the 38 patients with CECT imaging data, automatic intra-tumor partitioning was performed, resulting in three spatially distinct subregions. We extracted a total of 37 quantitative imaging features describing intensity, geometry, and texture from each subregion. Imaging features were selected after robustness and redundancy analysis. Gene modules acquired from clustering were chosen for their prognostic significance. By constructing an association map between imaging features and gene modules with Spearman rank correlations, the imaging features that significantly correlated with gene modules were obtained. These features were evaluated with Cox’s proportional hazard models and Kaplan-Meier estimates to determine their prognostic capabilities for overall survival (OS). Eight imaging features were significantly correlated with prognostic gene modules, and two of them were associated with OS. Among these, the geometry feature volume fraction of the subregion, which was significantly correlated with all prognostic gene modules representing cancer-related interpretation, was predictive of OS (Cox p = 0.022, hazard ratio = 0.24). The texture feature cluster prominence in the subregion, which was correlated with the prognostic gene module representing lipid metabolism and complement activation, also had the ability to predict OS (Cox p = 0.021, hazard ratio = 0.17). Imaging features depicting the volume fraction and textural heterogeneity in subregions have the potential to be predictors of OS with interpretable biological meaning.
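The association-map step described above pairs each imaging feature with each gene-module score through a Spearman rank correlation. The sketch below reproduces that step on synthetic arrays; the feature, module, and cohort size are illustrative.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(2)
n_patients = 38
volume_fraction = rng.uniform(0.1, 0.9, n_patients)  # subregion imaging feature
module_score = 0.5 * volume_fraction + rng.normal(0, 0.1, n_patients)  # gene module

rho, p = spearmanr(volume_fraction, module_score)
print(f"Spearman rho = {rho:.2f}, p = {p:.1e}")  # features with significant rho are kept
```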
Manned Mars mission radiation environment and radiobiology
NASA Technical Reports Server (NTRS)
Nachtwey, D. S.
1986-01-01
Potential radiation hazards to crew members on manned Mars missions are discussed. The report deals briefly with the radiation sources and environments likely to be encountered during various phases of such missions, providing quantitative estimates of these environments. Quantitative data and discussion on the implications of such radiation for the human body are also provided, and various protective measures are suggested. A recent re-evaluation of allowable dose limits by the National Council on Radiation Protection is discussed, and the potential implications of this activity are assessed.
Rethinking vulnerability analysis and governance with emphasis on a participatory approach.
Rossignol, Nicolas; Delvenne, Pierre; Turcanu, Catrinel
2015-01-01
This article draws on vulnerability analysis as it emerged as a complement to classical risk analysis, and it aims at exploring its ability for nurturing risk and vulnerability governance actions. An analysis of the literature on vulnerability analysis allows us to formulate a three-fold critique: first, vulnerability analysis has been treated separately in the natural and the technological hazards fields. This separation prevents vulnerability from unleashing the full range of its potential, as it constrains appraisals into artificial categories and thus already closes down the outcomes of the analysis. Second, vulnerability analysis focused on assessment tools that are mainly quantitative, whereas qualitative appraisal is a key to assessing vulnerability in a comprehensive way and to informing policy making. Third, a systematic literature review of case studies reporting on participatory approaches to vulnerability analysis allows us to argue that participation has been important to address the above, but it remains too closed down in its approach and would benefit from embracing a more open, encompassing perspective. Therefore, we suggest rethinking vulnerability analysis as one part of a dynamic process between opening-up and closing-down strategies, in order to support a vulnerability governance framework. © 2014 Society for Risk Analysis.
Visualization and Hierarchical Analysis of Flow in Discrete Fracture Network Models
NASA Astrophysics Data System (ADS)
Aldrich, G. A.; Gable, C. W.; Painter, S. L.; Makedonska, N.; Hamann, B.; Woodring, J.
2013-12-01
Flow and transport in low-permeability fractured rock occur primarily in interconnected fracture networks. Prediction and characterization of flow and transport in fractured rock have important implications for underground repositories for hazardous materials (e.g., nuclear and chemical waste), contaminant migration and remediation, groundwater resource management, and hydrocarbon extraction. We have developed methods to explicitly model flow in discrete fracture networks and track flow paths using passive particle tracking algorithms. Visualization and analysis of particle trajectories through the fracture network are important to understanding fracture connectivity, flow patterns, potential contaminant pathways, and fast paths through the network. However, occlusion due to the large number of highly tessellated and intersecting fracture polygons precludes the effective use of traditional visualization methods. We would also like quantitative analysis methods to characterize the trajectories of a large number of particle paths. We have addressed these problems by defining a hierarchical flow network representing the topology of particle flow through the fracture network. This approach allows us to analyze the flow and the dynamics of the system as a whole. We are able to easily query the flow network and use a paint-and-link style framework to filter the fracture geometry and particle traces based on the flow analytics. This allows us to greatly reduce occlusion while emphasizing salient features such as the principal transport pathways. Examples are shown that demonstrate the methodology and highlight how this new method allows quantitative analysis and characterization of flow and transport in a number of representative fracture networks.
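One way to realize the hierarchical flow network described above is to treat fractures as nodes and particle transitions between fractures as weighted edges, so principal pathways become simple graph queries. The sketch below uses networkx with invented fracture IDs and traces.

```python
import networkx as nx

# Hypothetical particle traces: each is the sequence of fracture IDs visited.
traces = [
    ["f1", "f3", "f7"],
    ["f1", "f3", "f9"],
    ["f2", "f3", "f7"],
    ["f1", "f3", "f7"],
]

G = nx.DiGraph()
for trace in traces:
    for a, b in zip(trace, trace[1:]):
        w = G.get_edge_data(a, b, {"weight": 0})["weight"]
        G.add_edge(a, b, weight=w + 1)

# Principal transport pathway: the transition carrying the most particles.
a, b, data = max(G.edges(data=True), key=lambda e: e[2]["weight"])
print(f"Dominant transition: {a} -> {b} ({data['weight']} particles)")
```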
Use of Bayesian event trees in semi-quantitative volcano eruption forecasting and hazard analysis
NASA Astrophysics Data System (ADS)
Wright, Heather; Pallister, John; Newhall, Chris
2015-04-01
Use of Bayesian event trees to forecast eruptive activity during volcano crises is an increasingly common practice for the USGS-USAID Volcano Disaster Assistance Program (VDAP) in collaboration with foreign counterparts. This semi-quantitative approach combines conceptual models of volcanic processes with current monitoring data and patterns of occurrence to reach consensus probabilities. This approach allows a response team to draw upon global datasets, local observations, and expert judgment, where the relative influence of these data depends upon the availability and quality of monitoring data and the degree to which the volcanic history is known. The construction of such event trees additionally relies upon existence and use of relevant global databases and documented past periods of unrest. Because relevant global databases may be underpopulated or nonexistent, uncertainty in probability estimations may be large. Our 'hybrid' approach of combining local and global monitoring data and expert judgment facilitates discussion and constructive debate between disciplines: including seismology, gas geochemistry, geodesy, petrology, physical volcanology and technology/engineering, where difference in opinion between response team members contributes to definition of the uncertainty in the probability estimations. In collaboration with foreign colleagues, we have created event trees for numerous areas experiencing volcanic unrest. Event trees are created for a specified time frame and are updated, revised, or replaced as the crisis proceeds. Creation of an initial tree is often prompted by a change in monitoring data, such that rapid assessment of probability is needed. These trees are intended as a vehicle for discussion and a way to document relevant data and models, where the target audience is the scientists themselves. However, the probabilities derived through the event-tree analysis can also be used to help inform communications with emergency managers and the public. VDAP trees evaluate probabilities of: magmatic intrusion, likelihood of eruption, magnitude of eruption, and types of associated hazardous events and their extents. In a few cases, trees have been extended to also assess and communicate vulnerability and relative risk.
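Each node of such an event tree is, in effect, a probability that can be revised as monitoring data arrive. The toy Bayesian update below illustrates the flavor of that revision; all probabilities are invented and far simpler than a real VDAP tree.

```python
# Update an eruption probability after observing a seismic swarm.
p_eruption = 0.10                  # prior from eruptive history and global data
p_swarm_given_eruption = 0.80      # P(swarm | eruption coming)
p_swarm_given_no_eruption = 0.20   # P(swarm | no eruption)

num = p_swarm_given_eruption * p_eruption
den = num + p_swarm_given_no_eruption * (1.0 - p_eruption)
print(f"P(eruption | swarm) = {num / den:.2f}")  # 0.31, up from the 0.10 prior
```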
The Protective Role of Coastal Marshes: A Systematic Review and Meta-analysis
Shepard, Christine C.; Crain, Caitlin M.; Beck, Michael W.
2011-01-01
Background Salt marshes lie between many human communities and the coast and have been presumed to protect these communities from coastal hazards by providing important ecosystem services. However, previous characterizations of these ecosystem services have typically been based on a small number of historical studies, and the consistency and extent to which marshes provide these services has not been investigated. Here, we review the current evidence for the specific processes of wave attenuation, shoreline stabilization and floodwater attenuation to determine if and under what conditions salt marshes offer these coastal protection services. Methodology/Principal Findings We conducted a thorough search and synthesis of the literature with reference to these processes. Seventy-five publications met our selection criteria, and we conducted meta-analyses for publications with sufficient data available for quantitative analysis. We found that combined across all studies (n = 7), salt marsh vegetation had a significant positive effect on wave attenuation as measured by reductions in wave height per unit distance across marsh vegetation. Salt marsh vegetation also had a significant positive effect on shoreline stabilization as measured by accretion, lateral erosion reduction, and marsh surface elevation change (n = 30). Salt marsh characteristics that were positively correlated to both wave attenuation and shoreline stabilization were vegetation density, biomass production, and marsh size. Although we could not find studies quantitatively evaluating floodwater attenuation within salt marshes, there are several studies noting the negative effects of wetland alteration on water quantity regulation within coastal areas. Conclusions/Significance Our results show that salt marshes have value for coastal hazard mitigation and climate change adaptation. Because we do not yet fully understand the magnitude of this value, we propose that decision makers employ natural systems to maximize the benefits and ecosystem services provided by salt marshes and exercise caution when making decisions that erode these services. PMID:22132099
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-19
... Manufacturing Practice and Hazard Analysis and Risk- Based Preventive Controls for Human Food; Extension of... Analysis and Risk-Based Preventive Controls for Human Food.'' FOR FURTHER INFORMATION CONTACT: Domini Bean... Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food'' with a 120-day comment...
Pieterse, B; Rijk, I J C; Simon, E; van Vugt-Lussenburg, B M A; Fokke, B F H; van der Wijk, M; Besselink, H; Weber, R; van der Burg, B
2015-10-01
A combined chemical and biological analysis of samples from a major obsolete pesticide and persistent organic pollutant (POP) dumpsite in Northern Tajikistan was carried out. The chemical analytical screening focused on a range of prioritized compounds and compounds known to be present locally. Since chemical analysis alone cannot measure the hazards of complex mixtures, we tested a novel effect-based approach using a panel of quantitative high-throughput CALUX reporter assays measuring distinct biological effects relevant to hazard assessment. Assays were included for assessing effects related to estrogen, androgen, and progestin signaling, aryl hydrocarbon receptor-mediated signaling, AP1 signaling, genotoxicity, oxidative stress, chemical hypoxia, and ER stress. With this panel of assays, we first quantified the biological activities of the individual chemicals identified by chemical analysis. Next, we calculated the expected summed activity of these chemicals in the samples from the pesticide dumpsite and compared the results with the measured CALUX bioactivity of the total extracts of these samples. The results showed that endocrine disruption-related effects in particular were common among the samples. This was consistent with the toxicological profiles of the individual chemicals that dominated these samples. However, large discrepancies between chemical and biological analysis were found in a sample from a burn place present at this site, with biological activities that could not be explained by the chemical analysis. This is likely to be caused by toxic combustion products or by spills of compounds that were not targeted in the chemical analysis.
Risk Analysis of a Fuel Storage Terminal Using HAZOP and FTA.
Fuentes-Bargues, José Luis; González-Cruz, Mª Carmen; González-Gaya, Cristina; Baixauli-Pérez, Mª Piedad
2017-06-30
The size and complexity of industrial chemical plants, together with the nature of the products handled, means that an analysis and control of the risks involved is required. This paper presents a methodology for risk analysis in chemical and allied industries that is based on a combination of HAZard and OPerability analysis (HAZOP) and a quantitative analysis of the most relevant risks through the development of fault trees, fault tree analysis (FTA). Results from FTA allow prioritizing the preventive and corrective measures to minimize the probability of failure. An analysis of a case study is performed; it consists in the terminal for unloading chemical and petroleum products, and the fuel storage facilities of two companies, in the port of Valencia (Spain). HAZOP analysis shows that loading and unloading areas are the most sensitive areas of the plant and where the most significant danger is a fuel spill. FTA analysis indicates that the most likely event is a fuel spill in tank truck loading area. A sensitivity analysis from the FTA results show the importance of the human factor in all sequences of the possible accidents, so it should be mandatory to improve the training of the staff of the plants.
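A fault tree like the one described above reduces, for independent basic events, to products (AND gates) and complements of products (OR gates). The minimal sketch below evaluates an invented tree loosely modeled on the "fuel spill during tank-truck loading" top event; the structure and probabilities are not those of the paper.

```python
def gate_or(*ps):   # P(A or B or ...) for independent basic events
    q = 1.0
    for p in ps:
        q *= 1.0 - p
    return 1.0 - q

def gate_and(*ps):  # P(A and B and ...) for independent basic events
    q = 1.0
    for p in ps:
        q *= p
    return q

hose_rupture = 0.002          # basic-event probabilities (invented)
overfill_sensor_fail = 0.005
human_error = 0.01            # operator fails to catch the overfill

# Spill if the hose ruptures OR if the sensor fails AND the operator misses it.
p_spill = gate_or(hose_rupture, gate_and(overfill_sensor_fail, human_error))
print(f"P(fuel spill during loading) = {p_spill:.2e}")
```

A sensitivity analysis on such a tree ranks which basic events most affect the top event, which is how the paper identifies the human factor as critical.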
NASA Astrophysics Data System (ADS)
Isaure, Marie-Pierre; Laboudigue, Agnès; Manceau, Alain; Sarret, Géraldine; Tiffreau, Christophe; Trocellier, Patrick; Lamble, Géraldine; Hazemann, Jean-Louis; Chateigner, Daniel
2002-05-01
Dredging and disposal of sediments onto agricultural soils is a common practice in industrial and urban areas that can be hazardous to the environment when the sediments contain heavy metals. This chemical hazard can be assessed by evaluating the mobility and speciation of metals after sediment deposition. In this study, the speciation of Zn in the coarse (500 to 2000 μm) and fine (<2 μm) fractions of a contaminated sediment dredged from a ship canal in northern France and deposited on an agricultural soil was determined by physical analytical techniques on raw and chemically treated samples. Zn partitioning between coexisting mineral phases and its chemical associations were first determined by micro-particle-induced X-ray emission and micro-synchrotron-based X-ray fluorescence. Zn-containing mineral species were then identified by X-ray diffraction and powder and polarized extended X-ray absorption fine structure spectroscopy (EXAFS). The number, nature, and proportion of Zn species were obtained by a coupled principal component analysis (PCA) and least squares fitting (LSF) procedure, applied here for the first time to speciate a metal in a natural system both qualitatively (number and nature of species) and quantitatively (relative proportion of species). The coarse fraction consists of slag grains originating from nearby Zn smelters. In this fraction, Zn is primarily present as sphalerite (ZnS) and, to a lesser extent, as willemite (Zn2SiO4), Zn-containing ferric (oxyhydr)oxides, and zincite (ZnO). In the fine fraction, ZnS and Zn-containing Fe (oxyhydr)oxides are the major forms, and Zn-containing phyllosilicate is the minor species. Weathering of ZnS, Zn2SiO4, and ZnO under oxidizing conditions after sediment disposal accounts for the uptake of Zn by Fe (oxyhydr)oxides and phyllosilicates. Two geochemical processes can explain the retention of Zn by secondary minerals: uptake on preexisting minerals and precipitation with dissolved Fe and Si. The second process likely occurs because dissolved Zn and Si are supersaturated with respect to Zn phyllosilicate. EXAFS spectroscopy, in combination with PCA and LSF, is shown to be a meaningful approach to quantitatively determining the speciation of trace elements in sediments and soils.
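The least-squares-fitting half of the PCA/LSF procedure can be sketched as a non-negative fit of reference spectra to a measured spectrum. The spectra below are synthetic sine stand-ins, and the PCA step (choosing how many components the sample set supports) is omitted.

```python
import numpy as np
from scipy.optimize import nnls

k = np.linspace(2, 12, 200)  # photoelectron wavenumber grid (arbitrary units)
# Stand-ins for reference EXAFS spectra of three Zn species.
refs = np.stack([np.sin(2.5 * k), np.sin(3.1 * k), np.sin(4.0 * k)])
true_fracs = np.array([0.55, 0.35, 0.10])
measured = true_fracs @ refs + np.random.default_rng(3).normal(0, 0.02, k.size)

fracs, _ = nnls(refs.T, measured)  # non-negative least-squares fit
print("Fitted species proportions:", np.round(fracs / fracs.sum(), 2))
```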
Modeling and Hazard Analysis Using STPA
NASA Astrophysics Data System (ADS)
Ishimatsu, Takuto; Leveson, Nancy; Thomas, John; Katahira, Masa; Miyamoto, Yuko; Nakao, Haruka
2010-09-01
A joint research project between MIT and JAXA/JAMSS is investigating the application of a new hazard analysis to the system and software in the HTV. Traditional hazard analysis focuses on component failures, but software does not fail in this way. Software most often contributes to accidents by commanding the spacecraft into an unsafe state (e.g., turning off the descent engines prematurely) or by not issuing required commands. That makes the standard hazard analysis techniques of limited usefulness on software-intensive systems, a description that fits most spacecraft built today. STPA is a new hazard analysis technique based on systems theory rather than reliability theory. It treats safety as a control problem rather than a failure problem. The goal of STPA, which is to create a set of scenarios that can lead to a hazard, is the same as that of FTA, but STPA includes a broader set of potential scenarios, including those in which no failures occur but problems arise due to unsafe and unintended interactions among the system components. STPA also provides more guidance to the analysts than traditional fault tree analysis. Functional control diagrams are used to guide the analysis. In addition, JAXA uses a model-based systems engineering development environment (created originally by Leveson and called SpecTRM), which also assists in the hazard analysis. One of the advantages of STPA is that it can be applied early in the system engineering and development process, in a safety-driven design process where hazard analysis drives the design decisions rather than waiting until reviews identify problems that are then costly or difficult to fix. It can also be applied in an after-the-fact analysis and hazard assessment, which is what we did in this case study. This paper describes the experimental application of STPA to the JAXA HTV in order to determine the feasibility and usefulness of the new hazard analysis technique. Because the HTV was originally developed using fault tree analysis and following the NASA standards for safety-critical systems, the results of our experimental application of STPA can be compared with these more traditional safety engineering approaches in terms of the problems identified and the resources required.
NASA Astrophysics Data System (ADS)
McBride, S.; Herbulock, D.
2015-12-01
Providing natural hazards scientists the opportunity to question and engage directly with journalists in a workshop setting proved effective at shifting scientists' attitudes toward their role in media and public communication during natural disasters. Scientists surveyed after the encounter expressed a more responsive attitude toward communicating during crises, increased willingness to support scientific peers' communication efforts, and more realistic perspectives on journalists' needs and objectives. Geoscientists experienced unprecedented and intensive media and public scrutiny during the Canterbury, New Zealand earthquakes of 2010-2012. Following major quakes and aftershocks, there was a sustained high level of public demand for information and expert analysis of the underlying geological events and ongoing hazards and risks. Once the crisis ended, a period of reflection gave rise to an understanding of the need for further media and communication training among natural hazards scientists. A workshop designed to explore scientists' attitudes to public communication during disasters and challenge their views on media, press offices, and the expectations of the public was developed and implemented by the Science Media Centre, New Zealand and Massey University. This research was developed as an evaluation of that workshop, using quantitative analysis supplemented by qualitative analysis. Findings include: a shift in how journalists were perceived by scientists after the workshop, largely influenced by perspectives shared during a panel in which invited journalists reflected on their own experiences and answered questions from scientists; a change in perception, from a preference for one central spokesperson to increased support for a variety of perspectives from multiple scientists at different science institutions, influenced by the journalists' insight into the practical demands of media and the need for multiple voices in media coverage; and changes in how scientists would manage media queries, along with a shift from passive to responsive attitudes toward communicating in crises.
Dervišević, Irma; Minić, Duško; Kamberović, Željko; Ćosović, Vladan; Ristić, Mirjana
2013-06-01
In this paper, we analyzed parts of the printed circuit boards (PCBs) and liquid crystal display (LCD) screens of mobile phones and computers, and determined the quantitative and qualitative chemical compositions of individual components and of complete PCBs. Differential thermal analysis (DTA) and differential scanning calorimetry (DSC) were used to determine the temperatures of phase transformations, whereas the qualitative and quantitative compositions of the samples were determined by X-ray fluorescence spectrometry (XRF), inductively coupled plasma optical emission spectrometry (ICP-OES), and scanning electron microscopy (SEM)-energy dispersive X-ray spectrometry (EDS) analyses. The microstructure of the samples was studied by optical microscopy. Based on the results of the analysis, a procedure for recycling PCBs is proposed. The emphasis was on the effects that can be achieved in the recycling process by extracting some parts before the melting process. In addition, newly developed materials that can adequately substitute for some dangerous and harmful materials, such as lead and arsenic, are proposed, in accordance with the European Union (EU) Restriction of the use of certain hazardous substances (RoHS) directive, along with some alternative materials for use in the electronics industry in place of gold and gold alloys.
Smadi, Hanan; Sargeant, Jan M
2013-02-01
The current quantitative risk assessment model followed the framework proposed by the Codex Alimentarius to provide an estimate of the risk of human salmonellosis due to consumption of chicken breasts that were bought from Canadian retail stores and prepared in Canadian domestic kitchens. The model simulated the level of Salmonella contamination on chicken breasts throughout the retail-to-table pathway. The model used Canadian input parameter values, where available, to represent the risk of salmonellosis. From retail until consumption, changes in the concentration of Salmonella on each chicken breast were modeled using equations for growth and inactivation. The model predicted an average of 318 cases of salmonellosis per 100,000 consumers per year, and potential reasons for this overestimation are discussed. A sensitivity analysis showed that the concentration of Salmonella on chicken breasts at retail, together with food hygiene practices in private kitchens, such as cross-contamination from not washing cutting boards (or utensils) and hands after handling raw meat, along with inadequate cooking, contributed most significantly to the risk of human salmonellosis. The outcome from this model emphasizes that protection from the Salmonella hazard on chicken breasts is a shared responsibility. Data needs for a comprehensive Canadian Salmonella risk assessment were identified for future research. © 2012 Society for Risk Analysis.
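A single growth/inactivation step along the retail-to-table pathway can be sketched as below. The functional form and parameters are invented placeholders, not the model's Canadian inputs; real predictive-microbiology models use validated growth curves.

```python
def salmonella_level(log10_cfu, hours, temp_c):
    """Return log10 CFU on a chicken breast after a storage/handling step."""
    if temp_c >= 60.0:                  # cooking: crude thermal inactivation
        return max(log10_cfu - 2.0 * hours, 0.0)
    if temp_c <= 7.0:                   # refrigeration: assume no growth
        return log10_cfu
    rate = 0.02 * (temp_c - 7.0)        # log10 per hour, rising with temperature
    return log10_cfu + rate * hours

level = salmonella_level(2.0, hours=3.0, temp_c=20.0)  # left out on the counter
print(f"log10 CFU after 3 h at 20 C: {level:.2f}")     # 2.78
```

Chaining such steps from retail through storage, handling, and cooking, with cross-contamination branches, yields the per-serving dose that drives the predicted illness rate.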
Multi Hazard Assessment: The Azores Archipelagos (PT) case
NASA Astrophysics Data System (ADS)
Aifantopoulou, Dorothea; Boni, Giorgio; Cenci, Luca; Kaskara, Maria; Kontoes, Haris; Papoutsis, Ioannis; Paralikidis, Sideris; Psichogyiou, Christina; Solomos, Stavros; Squicciarino, Giuseppe; Tsouni, Alexia; Xerekakis, Themos
2016-04-01
The COPERNICUS EMS Risk & Recovery Mapping (RRM) activity offers services to support the efficient design and implementation of mitigation measures and recovery planning based on EO data exploitation. The Azores Archipelago case was realized in the context of the FWC 259811 Copernicus EMS RRM and provides potential impact information for a number of natural disasters. The analysis identified population and assets at risk (infrastructure and environment). The risk assessment was based on hazard and on the vulnerability of structural elements, road network characteristics, etc. The integration of the different hazards and risks was accounted for in establishing the necessary first-response/first-aid infrastructure. EO data (Pleiades and WV-2) were used to establish detailed background information common to the assessment of all the risks. A qualitative flood hazard level was established through a "Flood Susceptibility Index" that accounts for upstream drainage area and local slope along the drainage network (Manfreda et al. 2014). Indicators representing different vulnerability typologies were accounted for. The risk was established by intersecting hazard and vulnerability (risk-specific lookup table). Probabilistic seismic hazard maps (PGA) were obtained by applying the Cornell (1968) methodology as implemented in CRISIS2007 (Ordaz et al. 2007). The approach relied on the identification of potential sources, the assessment of earthquake recurrence and magnitude distribution, the selection of a ground motion model, and a mathematical model to calculate seismic hazard. Lava eruption areas and a volcanic-activity-related coefficient were established from available historical data. Lava flow paths and their convergence were estimated by applying a cellular-automata-based Lava Flow Hazard numerical model (Gestur Leó Gislason, 2013). The Landslide Hazard Index of NGI (Norwegian Geotechnical Institute) for heavy rainfall (100-year extreme monthly rainfall) and earthquake (475-year return period) was used; topography, lithology, soil moisture, and LU/LC were also accounted for. Soil erosion risk was assessed with the empirical model RUSLE (Renard et al. 1991b), using rainfall erosivity, topography, and vegetation cover as the main parameters for predicting proneness to soil loss. Expected maximum tsunami wave heights were estimated for a specific earthquake scenario at designated forecast points along the coasts. Deformation at the source was calculated using the Okada code (Okada, 1985). Tsunami wave generation and propagation are based on the SWAN model (JRC/IPSC modification). To estimate the wave height at the forecast points, the Green's Law function was used (JRC Tsunami Analysis Tool). Storm track historical data indicate a return period of 17/41 years for H1/H2 hurricane categories, respectively. NOAA WAVEWATCH III model hindcast reanalysis was used to estimate the maximum significant wave height (wind and swell) along the coastline during two major storms. The associated storm-surge risk assessment also accounted for the coastline morphology. Seven empirical (independent) indicators were used to express the erosion susceptibility of the coasts, each evaluated according to a semi-quantitative score representing a low, medium, or high level of erosion risk or impact. The coastal erosion hazard was derived by aggregating the indicators on a grid scale.
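The Green's Law step mentioned above is a one-line shoaling estimate: as depth decreases, wave height grows as the fourth root of the depth ratio. The numbers below are illustrative.

```python
# Green's Law: H2 = H1 * (d1 / d2) ** 0.25
h_offshore = 0.5        # tsunami wave height (m) at the offshore depth
d1, d2 = 2000.0, 10.0   # offshore and forecast-point depths (m)

h_nearshore = h_offshore * (d1 / d2) ** 0.25
print(f"Estimated wave height at the forecast point: {h_nearshore:.2f} m")  # ~1.88 m
```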
Risk analysis based on hazards interactions
NASA Astrophysics Data System (ADS)
Rossi, Lauro; Rudari, Roberto; Trasforini, Eva; De Angeli, Silvia; Becker, Joost
2017-04-01
Despite an increasing need for open, transparent, and credible multi-hazard risk assessment methods, models, and tools, the availability of the comprehensive risk information needed to inform disaster risk reduction is limited, and the level of interaction across hazards is not systematically analysed. Risk assessment methodologies for different hazards often produce risk metrics that are not comparable. Hazard interactions (the consecutive occurrence of two or more different events) are generally neglected, resulting in strongly underestimated risk in the most exposed areas. This study presents cases of interaction between different hazards, showing how subsidence can affect coastal and river flood risk (Jakarta and Bandung, Indonesia) or how flood risk is modified after a seismic event (Italy). The analysis of well-documented real study cases, based on a combination of Earth Observation and in-situ data, serves as the basis for the formalisation of a multi-hazard methodology, identifying gaps and research frontiers. Multi-hazard risk analysis is performed through the RASOR platform (Rapid Analysis and Spatialisation Of Risk). A scenario-driven query system allows users to simulate future scenarios based on existing and assumed conditions, to compare them with historical scenarios, and to model multi-hazard risk both before and during an event (www.rasor.eu).
Hillslope hydrology and stability
Lu, Ning; Godt, Jonathan
2012-01-01
Landslides are caused by a failure of the mechanical balance within hillslopes. This balance is governed by two coupled physical processes: hydrological or subsurface flow and stress. The stabilizing strength of hillslope materials depends on effective stress, which is diminished by rainfall. This book presents a cutting-edge quantitative approach to understanding hydro-mechanical processes across variably saturated hillslope environments and to the study and prediction of rainfall-induced landslides. Topics covered include historic synthesis of hillslope geomorphology and hydrology, total and effective stress distributions, critical reviews of shear strength of hillslope materials and different bases for stability analysis. Exercises and homework problems are provided for students to engage with the theory in practice. This is an invaluable resource for graduate students and researchers in hydrology, geomorphology, engineering geology, geotechnical engineering and geomechanics and for professionals in the fields of civil and environmental engineering and natural hazard analysis.
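A standard first pass at the hydro-mechanical coupling this book treats is the infinite-slope factor of safety, where rainfall-driven pore pressure reduces effective stress and hence resistance. The parameter values below are illustrative.

```python
import math

def factor_of_safety(c, phi_deg, gamma, z, theta_deg, u):
    """c: cohesion (Pa); phi: friction angle (deg); gamma: unit weight (N/m^3);
    z: slip-surface depth (m); theta: slope angle (deg); u: pore pressure (Pa)."""
    theta, phi = math.radians(theta_deg), math.radians(phi_deg)
    resisting = c + (gamma * z * math.cos(theta) ** 2 - u) * math.tan(phi)
    driving = gamma * z * math.sin(theta) * math.cos(theta)
    return resisting / driving  # failure is predicted when this ratio drops below 1

# Rainfall raises pore pressure u, diminishing effective stress and FS:
for u in (0.0, 5e3, 10e3):
    fs = factor_of_safety(c=2e3, phi_deg=30, gamma=18e3, z=2.0, theta_deg=30, u=u)
    print(f"u = {u/1e3:4.1f} kPa -> FS = {fs:.2f}")
```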
Double standards and the international trade of pesticides: the Brazilian case.
Porto, Marcelo Firpo; Milanez, Bruno; Soares, Wagner Lopes; Meyer, Armando
2010-01-01
Despite bans on certain pesticides and their replacement by others considered less hazardous, the widespread use of these substances in agriculture continues to threaten the environment and the health of millions of people. This article discusses the current double standard in the international trade of pesticides and focuses on Brazil, one of the main users of pesticides in the world, analyzing trends in the foreign trade (imports and exports) of selected pesticides as a function of changes in legislation in the United States, the European Union, and Brazil from 1989 to 2006. We applied timeline analysis to eight organochlorines already banned in Brazil and conducted a case-by-case qualitative and quantitative analysis of nine other pesticides. The results indicate the existence of double standards, as demonstrated by the continued export to Brazil of some pesticides banned in the United States and Europe.
DOT National Transportation Integrated Search
1985-10-01
This report summarizes the findings from the second phase of a two-part analysis of hazardous materials truck routes in the Dallas-Fort Worth area. Phase II of this study analyzes the risk of transporting hazardous materials on freeways and arterial ...
21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.
Code of Federal Regulations, 2013 CFR
2013-04-01
... identified food safety hazards, including as appropriate: (i) Critical control points designed to control... control points designed to control food safety hazards introduced outside the processing plant environment... Control Point (HACCP) plan. 123.6 Section 123.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF...
21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.
Code of Federal Regulations, 2011 CFR
2011-04-01
... identified food safety hazards, including as appropriate: (i) Critical control points designed to control... control points designed to control food safety hazards introduced outside the processing plant environment... Control Point (HACCP) plan. 123.6 Section 123.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF...
21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.
Code of Federal Regulations, 2014 CFR
2014-04-01
... identified food safety hazards, including as appropriate: (i) Critical control points designed to control... control points designed to control food safety hazards introduced outside the processing plant environment... Control Point (HACCP) plan. 123.6 Section 123.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF...
Developing Daily Quantitative Damage Estimates From Geospatial Layers To Support Post Event Recovery
NASA Astrophysics Data System (ADS)
Woods, B. K.; Wei, L. H.; Connor, T. C.
2014-12-01
With the growth of natural hazard data available in near real-time it is increasingly feasible to deliver damage estimates caused by natural disasters. These estimates can be used in a disaster management setting or by commercial entities to optimize the deployment of resources and/or routing of goods and materials. This work outlines an end-to-end, modular process to generate estimates of damage caused by severe weather. The processing stream consists of five generic components: 1) Hazard modules that provide quantitative data layers for each peril. 2) Standardized methods to map the hazard data to an exposure layer based on atomic geospatial blocks. 3) Peril-specific damage functions that compute damage metrics at the atomic geospatial block level. 4) Standardized data aggregators, which map damage to user-specific geometries. 5) Data dissemination modules, which provide resulting damage estimates in a variety of output forms. This presentation provides a description of this generic tool set, and an illustrated example using HWRF-based hazard data for Hurricane Arthur (2014). In this example, the Python-based real-time processing ingests GRIB2 output from the HWRF numerical model and dynamically downscales it in conjunction with a land cover database, using a multiprocessing pool and a just-in-time (JIT) compiler. The resulting wind fields are contoured and ingested into a PostGIS database using OGR. Finally, the damage estimates are calculated at the atomic block level and aggregated to user-defined regions using PostgreSQL queries to construct application-specific tabular and graphics output.
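A minimal Python sketch of the five-component processing stream, with a toy wind grid and a hypothetical logistic damage curve standing in for the peril-specific functions (all names, curves, and values are illustrative, not the presented system):

```python
import numpy as np

# 1) Hazard module: a (hypothetical) gridded peak-wind field, m/s.
wind = np.random.default_rng(0).uniform(10, 60, size=(4, 4))

# 2) Map hazard to atomic geospatial blocks: here each grid cell is one
#    block with an assumed exposed value in dollars.
exposure = np.full(wind.shape, 1.0e6)

# 3) Peril-specific damage function: a smooth fractional-loss curve is
#    assumed here; operational curves are peril-specific and calibrated.
def wind_damage_fraction(v, v0=25.0, k=0.15):
    return 1.0 / (1.0 + np.exp(-k * (v - v0)))

loss = wind_damage_fraction(wind) * exposure

# 4) Aggregate atomic-block losses to user geometries (toy: two halves).
regions = {"north": loss[:2].sum(), "south": loss[2:].sum()}

# 5) Disseminate: tabular output.
for name, val in regions.items():
    print(f"{name}: ${val:,.0f}")
```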
NASA Astrophysics Data System (ADS)
Mastrolorenzo, G.; Pappalardo, L.; Troise, C.; Panizza, A.; de Natale, G.
2008-07-01
Tephra fall is a relevant hazard of the Campi Flegrei caldera (Southern Italy), due to the high vulnerability of the Naples metropolitan area to such an event. Here, tephra derives from magmatic as well as phreatomagmatic activity. On the basis of both new and literature data on known past eruptions (Volcanic Explosivity Index (VEI), grain-size parameters, velocity at the vent, column heights and erupted mass), and factors controlling tephra dispersion (wind velocity and direction), 2D numerical simulations of fallout dispersion and deposition have been performed for a large number of case events. A Bayesian inversion has been applied to retrieve the best values of critical parameters (e.g., vertical mass distribution, diffusion coefficients, velocity at the vent) not directly inferable by volcanological study. Simulations are run in parallel on multiple processors to allow a fully probabilistic analysis on a very large catalogue preserving the statistical properties of the past eruptive history. Using the simulation results, hazard maps have been computed for different scenarios: an upper-limit scenario (worst-expected scenario), an eruption-range scenario, and a whole-eruption scenario. Results indicate that although high hazard characterizes the Campi Flegrei caldera itself, the territory to the east of the caldera center, including the whole district of Naples, is also exposed to high hazard values due to the dominant westerly winds. Consistent with the stratigraphic evidence on the nature of past eruptions, our numerical simulations reveal that even in the case of a subplinian eruption (VEI = 3), Naples is exposed to tephra-fall thicknesses of some decimeters, thereby exceeding the critical limit for roof collapse. Because of the total number of people living in Campi Flegrei and the city of Naples (ca. two million inhabitants), the tephra fallout risk related to a plinian eruption of Campi Flegrei largely matches or exceeds the risk related to a similar eruption at Vesuvius.
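A minimal sketch of the kind of fallout calculation behind such simulations, using a standard thin-plume Gaussian deposition solution for a single grain-size class; the functional form and all input values are illustrative assumptions, not the paper's model:

```python
import numpy as np

def tephra_load(x, y, mass, wind_u, settling_v, column_h, diff_k):
    """Ground mass load (kg/m^2) from a point release at height column_h (m),
    advected downwind (x) and diffused laterally, for one grain-size class.
    mass: erupted mass in the class (kg); wind_u: wind speed (m/s);
    settling_v: particle settling velocity (m/s); diff_k: diffusivity (m^2/s)."""
    t_fall = column_h / settling_v               # time to settle to the ground
    x_c = wind_u * t_fall                        # downwind deposition centre
    sigma2 = 2.0 * diff_k * t_fall               # diffusion spread (m^2)
    r2 = (x - x_c) ** 2 + y ** 2
    return mass / (2.0 * np.pi * sigma2) * np.exp(-r2 / (2.0 * sigma2))

x, y = np.meshgrid(np.linspace(0, 40e3, 5), np.linspace(-10e3, 10e3, 5))
load = tephra_load(x, y, mass=1e10, wind_u=10.0, settling_v=5.0,
                   column_h=10e3, diff_k=3000.0)
print(np.round(load, 2))   # peak ~130 kg/m^2, i.e. roughly a decimeter of deposit
```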
Spatially resolved hazard and exposure assessments: an example of lead in soil at Lavrion, Greece.
Tristán, E; Demetriades, A; Ramsey, M H; Rosenbaum, M S; Stavrakis, P; Thornton, I; Vassiliades, E; Vergou, K
2000-01-01
Spatially resolved hazard assessment (SRHA) and spatially resolved exposure assessment (SREA) are methodologies that have been devised for assessing child exposure to soil containing environmental pollutants. These are based on either a quantitative or a semiquantitative approach. The feasibility of the methodologies has been demonstrated in a study assessing child exposure to Pb accessible in soil at the town of Lavrion in Greece. Using a quantitative approach, both measured and kriged concentrations of Pb in soil are compared with an "established" statutory threshold value. The probabilistic approach gives a refined classification of the contaminated land, since it takes into consideration the uncertainty in both the actual measurement and estimated kriged values. Two exposure assessment models (i.e., IEUBK and HESP) are used as the basis of the quantitative SREA methodologies. The significant correlation between the blood-Pb predictions, using the IEUBK model, and measured concentrations provides a partial validation of the method, because it allows for the uncertainty in the measurements and the lack of some site-specific measurements. The semiquantitative applications of SRHA and SREA incorporate both qualitative information (e.g., land use and dustiness of waste) and quantitative information (e.g., distance from wastes and distance from industry). The significant correlation between the results of these assessments and the measured blood-Pb levels confirms the robust nature of this approach. Successful application of these methodologies could reduce the cost of the assessment and allow areas to be prioritized for further investigation, remediation, or risk management.
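The probabilistic classification described above can be sketched as the probability that the true concentration exceeds the statutory threshold, given a kriged estimate and its kriging standard deviation, assuming a normal error model (the threshold and values below are hypothetical):

```python
from math import erf, sqrt

def prob_exceed(kriged_mean, kriging_sd, threshold):
    """P(true concentration > threshold) under a normal error model."""
    z = (threshold - kriged_mean) / kriging_sd
    return 0.5 * (1.0 - erf(z / sqrt(2.0)))   # 1 - Phi(z)

# Hypothetical values: kriged Pb of 450 mg/kg, sd 120, statutory 500 mg/kg.
p = prob_exceed(450.0, 120.0, 500.0)
print(f"P(exceed) = {p:.2f}")   # ~0.34: flagged 'possibly contaminated'
```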
Real-time detection of hazardous materials in air
NASA Astrophysics Data System (ADS)
Schechter, Israel; Schroeder, Hartmut; Kompa, Karl L.
1994-03-01
A new detection system has been developed for real-time analysis of organic compounds in ambient air. It is based on multiphoton ionization by an unfocused laser beam in a single parallel-plate device; thus, the ionization volume can be relatively large. The amount of laser-created ions is determined quantitatively from the induced total voltage drop between the biased plates (Q = ΔV·C). Mass information is obtained from computer analysis of the time-dependent signal. When a KrF laser (5 eV) is used, most organic compounds can be ionized in a two-photon process, but none of the standard components of atmospheric air are ionized by this process. Therefore, this instrument may be developed as a 'sniffer' for organic materials. The method has been applied to benzene analysis in air. The detection limit is about 10 ppb. With a simple preconcentration technique the detection limit can be decreased to the sub-ppb range. Simple binary mixtures are also resolved.
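As a worked illustration of the charge relation Q = ΔV·C, a small sketch converting a measured voltage drop into an ion count (the capacitance and voltage values are hypothetical):

```python
E_CHARGE = 1.602e-19  # elementary charge, coulombs

def ions_created(delta_v, capacitance):
    """Number of singly charged ions inferred from the voltage drop: Q = ΔV·C."""
    return delta_v * capacitance / E_CHARGE

# Hypothetical plate capacitance of 10 pF and a 1 mV drop:
print(f"{ions_created(1e-3, 10e-12):.2e} ions")  # ~6e4 ions
```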
Organic liquids that are essentially immiscible with water migrate through the subsurface under the influence of capillary, viscous and buoyancy forces. Four experimental methods were employed. First, quantitative displacement experiments using short soil columns; second, additio...
QUANTITATION OF MOLECULAR ENDPOINTS FOR THE DOSE-RESPONSE COMPONENT OF CANCER RISK ASSESSMENT
Cancer risk assessment involves the steps of hazard identification, dose-response assessment, exposure assessment and risk characterization. The rapid advances in the use of molecular biology approaches have had an impact on all four components, but the greatest overall current...
Organic liquids that are essentially immiscible with water migrate through the subsurface under the influence of capillary, viscous and buoyancy forces. Four experimental methods were employed. First, quantitative displacement experiments using short soil columns; second, add...
SENSITIVE PARAMETER EVALUATION FOR A VADOSE ZONE FATE AND TRANSPORT MODEL
This report presents information pertaining to quantitative evaluation of the potential impact of selected parameters on output of vadose zone transport and fate models used to describe the behavior of hazardous chemicals in soil. The Vadose Zone Interactive Processes (VIP) model...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Frost, J.K.; Ball, W.C. Jr.; Levin, M.L.
Sputum cytopathologic monitoring detects squamous cell lung cancers at an extremely early stage (x-ray negative). It holds further potential for preventing disease by detecting epithelial alterations which reflect environmental hazards. The addition of sputum cytology screening to screening by chest x-ray film does not significantly reduce mortality from all types of lung cancer, but preliminary analysis of Johns Hopkins Lung Project data suggests that mortality from squamous cell carcinoma is reduced. Quantitative automated cytopathology systems and biochemical/immunological cell markers enhance understanding of these precursors and offer great promise for increasing capacity, accuracy, and usefulness in cytopathology screening of workers. Cytological specimens collected over years of screening workers considered at risk may be important to eventually understanding the development and prevention of major occupational diseases.
Food Security, Decision Making and the Use of Remote Sensing in Famine Early Warning Systems
NASA Technical Reports Server (NTRS)
Brown, Molly E.
2008-01-01
Famine early warning systems use remote sensing in combination with socio-economic and household food economy analysis to provide timely and rigorous information on emerging food security crises. The Famine Early Warning Systems Network (FEWS NET) is the US Agency for International Development's decision support system in 20 African countries, as well as in Guatemala, Haiti and Afghanistan. FEWS NET provides early and actionable policy guidance for the US Government and its humanitarian aid partners. As we move into an era of climate change where weather hazards will become more frequent and severe, understanding how to provide quantitative and actionable scientific information for policy makers using biophysical data is critical for an appropriate and effective response.
Environmental Quality and the U.S. Power Sector: Air Quality, Land Use and Environmental Justice
DOE Office of Scientific and Technical Information (OSTI.GOV)
Massetti, Emanuele; Brown, Marilyn Ann; Lapsa, Melissa Voss
This baseline report summarizes key environmental quality issues associated with electricity generation, transmission, distribution, and end use in the United States. Its scope includes non-greenhouse gas air pollution (i.e., sulfur dioxide, nitrogen oxides, particulate matter and hazardous air pollutants), land use, water pollution, ecological impacts, human health, and environmental justice. The discussion characterizes both current impacts and recent trends, as well as assessments of key drivers of change. For example, the air emissions section includes a quantitative decomposition analysis of the drivers of change in sulfur dioxide emissions reductions from coal-fired power plants. The report is divided into four topical sections: air emissions, land use and ecology, water quality, and environmental justice.
Resampling to Address the Winner's Curse in Genetic Association Analysis of Time to Event
Poirier, Julia G.; Faye, Laura L.; Dimitromanolakis, Apostolos; Paterson, Andrew D.; Sun, Lei
2015-01-01
The "winner's curse" is a subtle and difficult problem in interpretation of genetic association, in which association estimates from large-scale gene detection studies are larger in magnitude than those from subsequent replication studies. This is practically important because use of a biased estimate from the original study will yield an underestimate of sample size requirements for replication, leaving the investigators with an underpowered study. Motivated by investigation of the genetics of type 1 diabetes complications in a longitudinal cohort of participants in the Diabetes Control and Complications Trial/Epidemiology of Diabetes Interventions and Complications (DCCT/EDIC) Genetics Study, we apply a bootstrap resampling method in analysis of time to nephropathy under a Cox proportional hazards model, examining 1,213 single-nucleotide polymorphisms (SNPs) in 201 candidate genes custom genotyped in 1,361 white probands. Among 15 top-ranked SNPs, bias reduction in log hazard ratio estimates ranges from 43.1% to 80.5%. In simulation studies based on the observed DCCT/EDIC genotype data, genome-wide bootstrap estimates for false-positive SNPs and for true-positive SNPs with low-to-moderate power are closer to the true values than uncorrected naïve estimates, but tend to overcorrect SNPs with high power. This bias-reduction technique is generally applicable for complex trait studies including quantitative, binary, and time-to-event traits. PMID:26411674
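A parametric toy analogue of the rank-based resampling correction may help fix ideas; the actual method refits the Cox model on bootstrap samples of subjects, whereas this sketch perturbs summary estimates directly (all numbers are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy genome scan: m SNP log-hazard-ratio estimates with standard error se;
# a small fraction of SNPs carry a true effect of 0.25. Values illustrative.
m, se, k = 2000, 0.08, 15
beta_true = np.where(rng.random(m) < 0.01, 0.25, 0.0)
beta_hat = beta_true + rng.normal(0.0, se, m)
winners = np.argsort(-np.abs(beta_hat))[:k]          # top-ranked "hits"

# Rank-based bias estimate: in each replicate the observed estimates play
# the role of truth, ranking is redone on the perturbed estimates, and the
# average overshoot of the re-ranked winners approximates selection bias.
B, bias = 2000, np.zeros(k)
for _ in range(B):
    beta_boot = beta_hat + rng.normal(0.0, se, m)
    boot_top = np.argsort(-np.abs(beta_boot))[:k]
    bias += np.abs(beta_boot[boot_top]) - np.abs(beta_hat[boot_top])
bias /= B

beta_corr = np.sign(beta_hat[winners]) * (np.abs(beta_hat[winners]) - bias)
print("naive:    ", np.round(beta_hat[winners][:5], 3))
print("corrected:", np.round(beta_corr[:5], 3))
```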
Mocellin, Simone; Hoon, Dave; Ambrosi, Alessandro; Nitti, Donato; Rossi, Carlo Riccardo
2006-08-01
The detection of circulating tumor cells (CTC) in patients with melanoma represents an appealing prognostic tool, but no consensus exists on this topic. We aimed to comprehensively and quantitatively summarize the evidence for the use of CTC to predict patients' clinical outcome. Fifty-three studies enrolling 5,433 patients were reviewed. Correlation of CTC status with tumor-node-metastasis disease stage and patients' overall (OS) and progression-free (PFS) survival was assessed by means of association statistics and meta-analysis, respectively. CTC status correlated with both tumor-node-metastasis stage (stage I, 32%; stage II, 41.7%; stage III, 41.1%; stage IV, 47.4%; P(trend) < 0.0001) and survival (OS: hazard ratio, 2.42; 95% confidence interval, 1.7-3.45, P < 0.0001; PFS: hazard ratio, 2.45; 95% confidence interval, 1.78-3.38; P < 0.0001). However, statistical heterogeneity was significant for both OS and PFS, likely underscoring the wide variability in study design. Furthermore, CTC positivity rates in early stages were higher and in the metastatic setting were lower than expected, which indicates an unsatisfactory accuracy of currently available CTC detection assays. Our findings suggest that CTC might have a clinically valuable prognostic power in patients with melanoma. However, the heterogeneity of the studies thus far published warrants caution not to overestimate the favorable results of pooled data.
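The pooling step in such a meta-analysis is commonly the DerSimonian-Laird random-effects estimator; a minimal sketch with hypothetical per-study log hazard ratios (not the paper's data):

```python
import numpy as np

def dersimonian_laird(log_hr, se):
    """Random-effects pooling of study log hazard ratios (DerSimonian-Laird)."""
    log_hr, se = np.asarray(log_hr), np.asarray(se)
    w = 1.0 / se**2                               # fixed-effect weights
    mu_fe = np.sum(w * log_hr) / np.sum(w)
    q = np.sum(w * (log_hr - mu_fe) ** 2)         # Cochran's Q
    df = len(log_hr) - 1
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                 # between-study variance
    w_re = 1.0 / (se**2 + tau2)                   # random-effects weights
    mu = np.sum(w_re * log_hr) / np.sum(w_re)
    se_mu = np.sqrt(1.0 / np.sum(w_re))
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0   # heterogeneity
    return np.exp(mu), np.exp(mu - 1.96 * se_mu), np.exp(mu + 1.96 * se_mu), i2

# Hypothetical per-study estimates (log HR and standard errors):
hr, lo, hi, i2 = dersimonian_laird([0.9, 0.7, 1.2, 0.5], [0.3, 0.25, 0.4, 0.2])
print(f"pooled HR {hr:.2f} (95% CI {lo:.2f}-{hi:.2f}), I^2 = {i2:.0f}%")
```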
Zhang, H-L; Li, L; Cheng, C-J; Sun, X-C
2018-02-01
The study aims to detect the association of miR-146a-5p with intracranial aneurysms (IAs). The expression of miR-146a-5p was compared in plasma samples between 72 patients with IAs and 40 healthy volunteers by quantitative real-time polymerase chain reaction (qRT-PCR). Statistical analysis was performed to analyze the relationship between miR-146a-5p expression and clinical data and overall survival (OS) time of IA patients. Univariate and multivariate Cox proportional hazards analyses were also performed. Notably, higher miR-146a-5p expression was found in plasma samples from the 72 IA patients compared with the 40 healthy controls. Higher miR-146a-5p expression was significantly associated with rupture and Hunt-Hess level in IA patients. Kaplan-Meier survival analysis verified that higher miR-146a-5p expression predicted a shorter OS compared with lower miR-146a-5p expression in IA patients. Univariate and multivariate Cox proportional hazards analyses demonstrated that higher miR-146a-5p expression, rupture, and Hunt-Hess level were independent risk factors for OS in patients with IAs. MiR-146a-5p expression may serve as a biomarker for predicting prognosis in patients with IAs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hennebert, Pierre, E-mail: pierre.hennebert@ineris.fr; Papin, Arnaud; Padox, Jean-Marie
Highlights: • Knowledge of the substances in wastes will be necessary to assess HP1-HP15 hazard properties. • A new analytical protocol is proposed for this and tested by two service laboratories on 32 samples. • Sixty-three percent of the samples have a satisfactory analytical balance between 90% and 110%. • Eighty-four percent of the samples were classified identically (Seveso Directive) for their hazardousness by the two laboratories. • The method, in progress, is being normalized in France and is being proposed to CEN. - Abstract: The classification of waste as hazardous could soon be assessed in Europe using largely the hazard properties of its constituents, according to the Classification, Labelling and Packaging (CLP) regulation. Comprehensive knowledge of the component constituents of a given waste will therefore be necessary. An analytical protocol for determining waste composition is proposed, which includes using inductively coupled plasma (ICP) screening methods to identify major elements and gas chromatography/mass spectrometry (GC-MS) screening techniques to measure organic compounds. The method includes a gross or indicator measure of 'pools' of higher-molecular-weight organic substances that are taken to be less bioactive and less hazardous, and of unresolved 'mass' during the chromatography of volatile and semi-volatile compounds. The concentrations of some elements and specific compounds that are linked to specific hazard properties and are subject to specific regulation (examples include: heavy metals, chromium(VI), cyanides, organo-halogens, and PCBs) are determined by classical quantitative analysis. To check the consistency of the analysis, the sum of the concentrations (including unresolved 'pools') should give a mass balance between 90% and 110%. Thirty-two laboratory samples comprising different industrial wastes (liquids and solids) were tested by two routine service laboratories, giving circa 7000 parameter results. Despite discrepancies in some parameters, a satisfactory sum of estimated or measured concentrations (analytical balance) of 90% was reached for 20 samples (63% of the overall total) during this first test exercise, with identified reasons for most of the unsatisfactory results. Regular use of this protocol (which is now included in French legislation) has enabled service laboratories to reach a 90% mass balance for nearly all the solid samples tested and most of the liquid samples (difficulties were caused in some samples by polymers in solution and vegetable oil). The protocol has been submitted to the French and European normalization bodies (AFNOR and CEN) and further improvements are awaited.
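The 90-110% analytical-balance criterion described above reduces to a simple consistency check; a minimal sketch with a hypothetical waste composition:

```python
def mass_balance(concentrations_mg_per_kg):
    """Sum measured/estimated constituent concentrations (mg/kg) and check
    the 90-110% analytical-balance criterion."""
    total_pct = sum(concentrations_mg_per_kg) / 1.0e6 * 100.0  # mg/kg -> %
    return total_pct, 90.0 <= total_pct <= 110.0

# Hypothetical waste: water, mineral fraction, hydrocarbon 'pool', metals,
# and the unresolved GC-MS 'mass' counted as a gross indicator.
conc = {"water": 350_000, "silicates": 420_000, "HC pool C10-C40": 180_000,
        "zinc": 12_000, "unresolved GC-MS mass": 30_000}
total, ok = mass_balance(conc.values())
print(f"sum = {total:.1f}% -> {'satisfactory' if ok else 'investigate'}")
```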
Accident analysis and control options in support of the sludge water system safety analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
HEY, B.E.
A hazards analysis was initiated for the SWS in July 2001 (SNF-8626, K Basin Sludge and Water System Preliminary Hazard Analysis) and updated in December 2001 (SNF-10020 Rev. 0, Hazard Evaluation for KE Sludge and Water System - Project A16) based on conceptual design information for the Sludge Retrieval System (SRS) and 60% design information for the cask and container. SNF-10020 was again revised in September 2002 to incorporate new hazards identified from final design information and from a What-if/Checklist evaluation of operational steps. The process hazards, controls, and qualitative consequence and frequency estimates taken from these efforts have been incorporated into Revision 5 of HNF-3960, K Basins Hazards Analysis. The hazards identification process documented in the above-referenced reports utilized standard industrial safety techniques (AIChE 1992, Guidelines for Hazard Evaluation Procedures) to systematically guide several interdisciplinary teams through the system using a pre-established set of process parameters (e.g., flow, temperature, pressure) and guide words (e.g., high, low, more, less). The teams generally included representation from the U.S. Department of Energy (DOE), K Basins Nuclear Safety, T Plant Nuclear Safety, K Basin Industrial Safety, fire protection, project engineering, operations, and facility engineering.
Distance education course on spatial multi-hazard risk assessment, using Open Source software
NASA Astrophysics Data System (ADS)
van Westen, C. J.; Frigerio, S.
2009-04-01
As part of the capacity building activities of the United Nations University - ITC School on Disaster Geo-Information Management (UNU-ITC DGIM), the International Institute for Geoinformation Science and Earth Observation (ITC) has developed a distance education course on the application of Geographic Information Systems for multi-hazard risk assessment. This course is designed for academic staff, as well as for professionals working in (non-)governmental organizations where knowledge of disaster risk management is essential. The course guides the participants through the entire process of risk assessment, on the basis of a case study of a city exposed to multiple hazards in a developing country. The course consists of eight modules, each with a guide book explaining the theoretical background and guiding the participants through spatial data requirements for risk assessment, hazard assessment procedures, generation of elements-at-risk databases, vulnerability assessment, qualitative and quantitative risk assessment methods, risk evaluation and risk reduction. Linked to the theory is a large set of exercises, with exercise descriptions, answer sheets, demos and GIS data. The exercises deal with four different types of hazards: earthquakes, flooding, technological hazards, and landslides. One important consideration in designing the course was that people from developing countries should not be restricted in using it by the financial burden of software acquisition. Therefore the aim was to use Open Source software as a basis. The GIS exercises are written for the ILWIS software. All exercises have also been integrated into a WebGIS, using the Open Source software CartoWeb (GNU-licensed). It is modular and customizable thanks to its object-oriented architecture, and is based on a hierarchical structure (to manage and organize every package of information for every step required in risk assessment). Different switches for every component of the risk assessment course have been defined, and through various menus the user can define the options for every exercise. For every layer of information, tools for querying, printing, searching and surface analysis are implemented, allowing maps to be compared at different scales and interpreted on-line.
Multi-scale landslide hazard assessment: Advances in global and regional methodologies
NASA Astrophysics Data System (ADS)
Kirschbaum, Dalia; Peters-Lidard, Christa; Adler, Robert; Hong, Yang
2010-05-01
The increasing availability of remotely sensed surface data and precipitation provides a unique opportunity to explore how smaller-scale landslide susceptibility and hazard assessment methodologies may be applicable at larger spatial scales. This research first considers an emerging satellite-based global algorithm framework, which evaluates how landslide susceptibility and satellite-derived rainfall estimates can forecast potential landslide conditions. An analysis of this algorithm using a newly developed global landslide inventory catalog suggests that forecasting errors are geographically variable due to improper weighting of surface observables, the resolution of the current susceptibility map, and limitations in the availability of landslide inventory data. These methodological and data limitation issues can be more thoroughly assessed at the regional level, where available higher-resolution landslide inventories can be applied to empirically derive relationships between surface variables and landslide occurrence. The regional empirical model shows improvement over the global framework in advancing near real-time landslide forecasting efforts; however, there are many uncertainties and assumptions surrounding such a methodology that decrease the functionality and utility of this system. This research seeks to improve upon this initial concept by exploring the potential opportunities and methodological structure needed to advance larger-scale landslide hazard forecasting and make it more of an operational reality. Sensitivity analysis of the surface and rainfall parameters in the preliminary algorithm indicates that surface data resolution and the interdependency of variables must be more appropriately quantified at local and regional scales. Additionally, integrating available surface parameters must be approached in a more theoretical, physically based manner to better represent the physical processes underlying slope instability and landslide initiation. Several rainfall infiltration and hydrological flow models have been developed to model slope instability at small spatial scales. This research investigates the potential of applying a more quantitative hydrological model at larger spatial scales, utilizing satellite and surface data inputs that are obtainable over different geographic regions. Because data and methodological uncertainties play a significant role in the effectiveness of landslide hazard assessment outputs, the methodology and data inputs are considered within an ensemble uncertainty framework in order to better resolve the contribution and limitations of model inputs and to more effectively communicate the model skill for improved landslide hazard assessment.
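A minimal decision-rule sketch of the global framework's core idea, combining a static susceptibility map with a satellite rainfall intensity-duration threshold; the threshold constants and grids are illustrative assumptions, not the study's calibration:

```python
import numpy as np

def landslide_nowcast(susceptibility, rain_intensity, duration_h,
                      alpha=10.0, beta=-0.6):
    """Flag cells where rainfall exceeds an assumed intensity-duration
    threshold I = alpha * D**beta (mm/h) AND the static susceptibility
    class is moderate or higher (>= 3 on a 1-5 scale)."""
    i_thresh = alpha * duration_h ** beta
    return (rain_intensity > i_thresh) & (susceptibility >= 3)

rng = np.random.default_rng(3)
susc = rng.integers(1, 6, size=(5, 5))          # susceptibility classes
rain = rng.uniform(0, 12, size=(5, 5))          # mm/h over the past 24 h
print(landslide_nowcast(susc, rain, duration_h=24))
```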
Tracking Hazard Analysis Data in a Jungle of Changing Design
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sullivan, Robin S.; Young, Jonathan
2006-05-16
Tracking hazard analysis data during the 'life cycle' of a project can be an extremely complicated task. However, a few simple rules, used consistently, can give you the edge that will save countless headaches and provide the information that will help integrate the hazard analysis and design activities even if performed in parallel.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-24
... 0584-AD65 School Food Safety Program Based on Hazard Analysis and Critical Control Point Principles... Safety Program Based on Hazard Analysis and Critical Control Point Principles (HACCP) was published on... of Management and Budget (OMB) cleared the associated information collection requirements (ICR) on...
Hazard Analysis for the Mark III Space Suit Assembly (SSA) Used in One-g Operations
NASA Technical Reports Server (NTRS)
Mitchell, Kate; Ross, Amy; Blanco, Raul; Wood, Art
2012-01-01
This Hazard Analysis document encompasses the Mark III Space Suit Assembly (SSA) and associated ancillary equipment. It has been prepared using JSC17773, "Preparing Hazard Analyses for JSC Ground Operation", as a guide. The purpose of this document is to present the potential hazards involved in ground (23 % maximum O2, One-g) operations of the Mark III and associated ancillary support equipment system. The hazards listed in this document are specific to suit operations only; each supporting facility (Bldg. 9, etc.) is responsible for test specific Hazard Analyses. A "hazard" is defined as any condition that has the potential for harming personnel or equipment. This analysis was performed to document the safety aspects associated with manned use of the Mark III for pressurized and unpressurized ambient, ground-based, One-g human testing. The hazards identified herein represent generic hazards inherent to all standard JSC test venues for nominal ground test configurations. Non-standard test venues or test specific configurations may warrant consideration of additional hazards analysis prior to test. The cognizant suit engineer is responsible for the safety of the astronaut/test subject, space suit, and suit support personnel. The test requester, for the test supported by the suit test engineer and suited subject, is responsible for overall safety and any necessary Test Readiness Reviews (TRR).
Vulnerabilities, Influences and Interaction Paths: Failure Data for Integrated System Risk Analysis
NASA Technical Reports Server (NTRS)
Malin, Jane T.; Fleming, Land
2006-01-01
We describe graph-based analysis methods for identifying and analyzing cross-subsystem interaction risks from subsystem connectivity information. By discovering external and remote influences that would otherwise be unexpected, these methods can support better communication among subsystem designers at points of potential conflict and support the design of more dependable and diagnosable systems. These methods identify hazard causes that can impact vulnerable functions or entities if propagated across interaction paths from the hazard source to the vulnerable target. The analysis can also assess the combined impacts of And-Or trees of disabling influences. The analysis can use ratings of hazards and vulnerabilities to calculate cumulative measures of severity and importance. Identification of cross-subsystem hazard-vulnerability pairs and propagation paths across subsystems will increase the coverage of hazard and risk analysis and can indicate risk control and protection strategies.
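A minimal sketch of this kind of connectivity analysis using networkx: enumerate propagation paths from hazard sources to vulnerable targets and score each pair by its ratings (the graph, node names, and ratings are hypothetical):

```python
import networkx as nx

# Directed graph of subsystem connectivity: edges are interaction paths
# (data, power, fluid, ...). Node names are hypothetical.
g = nx.DiGraph()
g.add_edges_from([
    ("power_bus", "avionics"), ("power_bus", "pump"),
    ("pump", "coolant_loop"), ("coolant_loop", "avionics"),
    ("avionics", "guidance"),
])

hazard_sources = {"power_bus": 3}        # severity rating of the cause
vulnerable = {"guidance": 4}             # criticality of the target

# Enumerate hazard-vulnerability pairs connected by a propagation path and
# score them by the product of cause severity and target criticality.
for src, sev in hazard_sources.items():
    for tgt, crit in vulnerable.items():
        if nx.has_path(g, src, tgt):
            paths = list(nx.all_simple_paths(g, src, tgt))
            print(f"{src} -> {tgt}: score {sev * crit}, {len(paths)} path(s)")
            for p in paths:
                print("   ", " -> ".join(p))
```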
QSAR modeling of cumulative environmental end-points for the prioritization of hazardous chemicals.
Gramatica, Paola; Papa, Ester; Sangion, Alessandro
2018-01-24
The hazard of chemicals in the environment is inherently related to the molecular structure and derives simultaneously from various chemical properties/activities/reactivities. Models based on Quantitative Structure Activity Relationships (QSARs) are useful to screen, rank and prioritize chemicals that may have an adverse impact on humans and the environment. This paper reviews a selection of QSAR models (based on theoretical molecular descriptors) developed for cumulative multivariate endpoints, which were derived by mathematical combination of multiple effects and properties. The cumulative end-points provide an integrated holistic point of view to address environmentally relevant properties of chemicals.
Pathogen and indicator concentrations normally vary by several orders of magnitude in raw waters, and to an even greater extent during hazardous event periods. This variation in concentration typically dominates the estimate of infection generated in a quantitative microbial risk ...
High Throughput Heuristics for Prioritizing Human Exposure to Environmental Chemicals
The risk posed to human health by any of the thousands of untested anthropogenic chemicals in our environment is a function of both the potential hazard presented by the chemical, and the possibility of being exposed. Without the capacity to make quantitative, albeit uncertain, f...
Integrated Safety Analysis Tiers
NASA Technical Reports Server (NTRS)
Shackelford, Carla; McNairy, Lisa; Wetherholt, Jon
2009-01-01
Commercial partnerships and organizational constraints, combined with complex systems, may lead to division of hazard analysis across organizations. This division could cause important hazards to be overlooked, causes to be missed, controls for a hazard to be incomplete, or verifications to be inefficient. Each organization's team must understand at least one level beyond the interface well enough to comprehend integrated hazards. This paper will discuss various ways to properly divide analysis among organizations. The Ares I launch vehicle integrated safety analysis effort will be utilized to illustrate an approach that addresses the key issues and concerns arising from multiple analysis responsibilities.
Seismic hazard assessment: Issues and alternatives
Wang, Z.
2011-01-01
Seismic hazard and risk are two very important concepts in engineering design and other policy considerations. Although seismic hazard and risk have often been used interchangeably, they are fundamentally different; furthermore, seismic risk is the more important concept in engineering design and other policy considerations. Seismic hazard assessment is an effort by earth scientists to quantify seismic hazard and its associated uncertainty in time and space and to provide seismic hazard estimates for seismic risk assessment and other applications. Although seismic hazard assessment is more of a scientific issue, it deserves special attention because of its significant implications for society. Two approaches, probabilistic seismic hazard analysis (PSHA) and deterministic seismic hazard analysis (DSHA), are commonly used for seismic hazard assessment. Although PSHA has been proclaimed as the best approach for seismic hazard assessment, it is scientifically flawed (i.e., the physics and mathematics that PSHA is based on are not valid). Use of PSHA could lead to either unsafe or overly conservative engineering design or public policy, each of which has dire consequences for society. On the other hand, DSHA is a viable approach for seismic hazard assessment even though it has been labeled as unreliable. The biggest drawback of DSHA is that the temporal characteristics (i.e., earthquake frequency of occurrence and the associated uncertainty) are often neglected. An alternative, seismic hazard analysis (SHA), utilizes earthquake science and statistics directly and provides a seismic hazard estimate that can be readily used for seismic risk assessment and other applications. © 2010 Springer Basel AG.
A Bimodal Hybrid Model for Time-Dependent Probabilistic Seismic Hazard Analysis
NASA Astrophysics Data System (ADS)
Yaghmaei-Sabegh, Saman; Shoaeifar, Nasser; Shoaeifar, Parva
2018-03-01
The evaluation of evidence provided by geological studies and historical catalogs indicates that in some seismic regions and faults, multiple large earthquakes occur in clusters. Such clusters are followed by periods of quiescence in which only small-to-moderate earthquakes take place. Clustering of large earthquakes is the most distinguishable departure from the assumption of constant hazard of random earthquake occurrence in conventional seismic hazard analysis. In the present study, a time-dependent recurrence model is proposed to consider a series of large earthquakes that occur in clusters. The model is flexible enough to better reflect the quasi-periodic behavior of large earthquakes with long-term clustering, and can be used in time-dependent probabilistic seismic hazard analysis for engineering purposes. In this model, the time-dependent hazard results are estimated by a hazard function comprising three parts: a decreasing hazard associated with the last large-earthquake cluster, an increasing hazard associated with the next large-earthquake cluster, and a constant hazard representing the random occurrence of small-to-moderate earthquakes. In the final part of the paper, the time-dependent seismic hazard of the New Madrid Seismic Zone at different time intervals is calculated for illustrative purposes.
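A minimal sketch of a three-part hazard function of the kind described, with assumed (illustrative) functional forms for the decaying, growing, and constant terms:

```python
import numpy as np

def hazard(t, h0=2e-4, a=5e-3, lam=0.02, b=5e-3, k=0.01):
    """Composite hazard rate (events/yr) at time t (years since the last
    large-earthquake cluster): a decaying post-cluster term, a term growing
    toward the next cluster, and a constant background for small-to-moderate
    events. Functional forms and constants are illustrative assumptions."""
    h_dec = a * np.exp(-lam * t)               # decreasing hazard
    h_inc = b * (1.0 - np.exp(-k * t))         # increasing hazard
    return h_dec + h_inc + h0                  # plus constant background

t = np.linspace(0.0, 300.0, 7)
print(np.round(hazard(t), 5))
```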
Willeit, Peter; Raschenberger, Julia; Heydon, Emma E; Tsimikas, Sotirios; Haun, Margot; Mayr, Agnes; Weger, Siegfried; Witztum, Joseph L; Butterworth, Adam S; Willeit, Johann; Kronenberg, Florian; Kiechl, Stefan
2014-01-01
Short telomeres have been linked to various age-related diseases. We aimed to assess the association of telomere length with incident type 2 diabetes mellitus (T2DM) in prospective cohort studies. Leucocyte relative telomere length (RTL) was measured using quantitative polymerase chain reaction in 684 participants of the prospective population-based Bruneck Study (1995 baseline), with repeat RTL measurements performed in 2005 (n = 558) and 2010 (n = 479). Hazard ratios for T2DM were calculated across quartiles of baseline RTL using Cox regression models adjusted for age, sex, body-mass index, smoking, socio-economic status, physical activity, alcohol consumption, high-density lipoprotein cholesterol, log high-sensitivity C-reactive protein, and waist-hip ratio. Separate analyses corrected hazard ratios for within-person variability using multivariate regression calibration of repeated measurements. To contextualise the findings, we systematically searched PubMed, Web of Science and EMBASE for relevant articles and pooled results using random-effects meta-analysis. Over 15 years of follow-up, 44 out of 606 participants free of diabetes at baseline developed incident T2DM. The adjusted hazard ratio for T2DM comparing the bottom vs. the top quartile of baseline RTL (i.e. shortest vs. longest) was 2.00 (95% confidence interval: 0.90 to 4.49; P = 0.091), and 2.31 comparing the bottom quartile vs. the remainder (1.21 to 4.41; P = 0.011). The corresponding hazard ratios corrected for within-person RTL variability were 3.22 (1.27 to 8.14; P = 0.014) and 2.86 (1.45 to 5.65; P = 0.003). In a random-effects meta-analysis of three prospective cohort studies involving 6,991 participants and 2,011 incident T2DM events, the pooled relative risk was 1.31 (1.07 to 1.60; P = 0.010; I2 = 69%). Low RTL is independently associated with the risk of incident T2DM. To avoid regression dilution biases in observed associations of RTL with disease risk, future studies should implement methods correcting for within-person variability in RTL. The causal role of short telomeres in T2DM development remains to be determined.
Interdisciplinary approach for disaster risk reduction in Valtellina Valley, northern Italy
NASA Astrophysics Data System (ADS)
Garcia, Carolina; Blahut, Jan; Luna, Byron Quan; Poretti, Ilaria; Camera, Corrado; de Amicis, Mattia; Sterlacchini, Simone
2010-05-01
Within the framework of the European research network Mountain Risks, an interdisciplinary research group has been working in the Consortium of Mountain Municipalities of Valtellina di Tirano (northern Italy). This area has been repeatedly affected by mountain hazards such as landslides, debris flows and floods that directly affect the population and, in some cases, have caused several deaths and millions of euros in losses. An aim of the interdisciplinary work in this study area is to integrate the different scientific products of the research group, in the areas of risk assessment, management and governance, in order to generate, among others, risk reduction tools addressed to the general public and stakeholders. Two types of phenomena have been particularly investigated: debris flows and floods. The scientific products range from modeling and mapping of hazard and risk to emergency planning based on real-time decision support systems and surveying for the evaluation of risk perception and preparedness, among others. Outputs from medium-scale hazard and risk modeling can be used by decision makers and spatial planners as well as civil protection authorities to obtain a general overview of the area and identify hot spots for further detailed analysis. Subsequently, local-scale analysis is necessary to define possible events and risk scenarios for emergency planning. As for the modeling of past events and new scenarios of debris flows, physical outputs were used as inputs to physical vulnerability assessment and quantitative risk analysis within dynamic runout models. In a pilot zone, the physical damage was quantified for each affected structure within the context of physical vulnerability, and different empirical vulnerability curves were obtained. Prospective direct economic losses were estimated. For flood hazard assessment, different approaches and models are being tested in order to produce flood maps for various return periods related to registered rainfalls. Regarding civil protection, the main aim is to set up and manage contingency plans in advance; that is, to identify and prepare the people in charge of taking action, to define the activities to be performed, to be aware of available resources, and to optimize the communication system among the people involved, in order to efficiently face a prospective crisis phase. For this purpose, a real-time emergency plan has been developed based on GIS (Geographical Information Systems), DSS (Decision Support Systems), and ICT (Information & Communication Technology).
Probabilistic analysis of tsunami hazards
Geist, E.L.; Parsons, T.
2006-01-01
Determining the likelihood of a disaster is a key component of any comprehensive hazard assessment. This is particularly true for tsunamis, even though most tsunami hazard assessments have in the past relied on scenario or deterministic type models. We discuss probabilistic tsunami hazard analysis (PTHA) from the standpoint of integrating computational methods with empirical analysis of past tsunami runup. PTHA is derived from probabilistic seismic hazard analysis (PSHA), with the main difference being that PTHA must account for far-field sources. The computational methods rely on numerical tsunami propagation models rather than empirical attenuation relationships as in PSHA in determining ground motions. Because a number of source parameters affect local tsunami runup height, PTHA can become complex and computationally intensive. Empirical analysis can function in one of two ways, depending on the length and completeness of the tsunami catalog. For site-specific studies where there is sufficient tsunami runup data available, hazard curves can primarily be derived from empirical analysis, with computational methods used to highlight deficiencies in the tsunami catalog. For region-wide analyses and sites where there are little to no tsunami data, a computationally based method such as Monte Carlo simulation is the primary method to establish tsunami hazards. Two case studies that describe how computational and empirical methods can be integrated are presented for Acapulco, Mexico (site-specific) and the U.S. Pacific Northwest coastline (region-wide analysis).
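A minimal Monte Carlo sketch in the spirit of the computational approach described above: sample magnitudes from a truncated Gutenberg-Richter law, map them to site runup with an assumed log-linear scaling plus lognormal scatter, and accumulate an exceedance curve (all constants are illustrative):

```python
import numpy as np

rng = np.random.default_rng(42)

# Truncated Gutenberg-Richter magnitude sampling by inverse CDF.
n, b, m_min, m_max, rate = 100_000, 1.0, 7.0, 9.0, 0.05   # rate: events/yr
u = rng.random(n)
mags = m_min - np.log10(1 - u * (1 - 10 ** (-b * (m_max - m_min)))) / b

# Assumed runup scaling (illustrative), with lognormal aleatory scatter.
runup = 10 ** (0.7 * (mags - 7.0) - 0.3) * rng.lognormal(0.0, 0.5, n)   # m

levels = np.array([0.5, 1, 2, 4, 8])
p_exceed_event = (runup[None, :] > levels[:, None]).mean(axis=1)
annual_rate = rate * p_exceed_event        # event rate x P(exceed | event)
for h, r in zip(levels, annual_rate):
    print(f"runup > {h:>3} m: {r:.5f} /yr (return ~{1/r:,.0f} yr)")
```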
NASA Astrophysics Data System (ADS)
Klügel, J.
2006-12-01
Deterministic scenario-based seismic hazard analysis has a long tradition in earthquake engineering for developing the design basis of critical infrastructures like dams, transport infrastructures, chemical plants and nuclear power plants. For many applications besides the design of infrastructures it is of interest to assess the efficiency of the design measures taken. These applications require a method allowing a meaningful quantitative risk analysis. A new method for probabilistic scenario-based seismic risk analysis has been developed based on a probabilistic extension of proven deterministic methods like the MCE methodology. The input data required for the method are entirely based on the information which is necessary to perform any meaningful seismic hazard analysis. The method is based on the probabilistic risk analysis approach common for applications in nuclear technology, developed originally by Kaplan & Garrick (1981). It is based on (1) a classification of earthquake events into different size classes (by magnitude), (2) the evaluation of the frequency of occurrence of events assigned to the different classes (frequency of initiating events), (3) the development of bounding critical scenarios assigned to each class based on the solution of an optimization problem, and (4) the evaluation of the conditional probability of exceedance of critical design parameters (vulnerability analysis). The advantage of the method in comparison with traditional PSHA consists in (1) its flexibility, allowing the use of different probabilistic models for earthquake occurrence as well as the incorporation of advanced physical models into the analysis, (2) the mathematically consistent treatment of uncertainties, and (3) the explicit consideration of the lifetime of the critical structure as a criterion in formulating different risk goals. The method was applied for the evaluation of the risk of production interruption losses of a nuclear power plant during its residual lifetime.
Analysis of On-board Hazard Detection and Avoidance for Safe Lunar Landing
NASA Technical Reports Server (NTRS)
Johnson, Andrew E.; Huertas, Andres; Werner, Robert A.; Montgomery, James F.
2008-01-01
Landing hazard detection and avoidance technology is being pursued within NASA to improve landing safety and increase access to sites of interest on the lunar surface. The performance of a hazard detection and avoidance system depends on properties of the terrain, sensor performance, algorithm design, vehicle characteristics and the overall guidance, navigation and control architecture. This paper analyzes the size of the region that must be imaged, sensor performance parameters and the impact of trajectory angle on hazard detection performance. The analysis shows that vehicle hazard tolerance is the driving parameter for hazard detection system design.
ACCELERATED FAILURE TIME MODELS PROVIDE A USEFUL STATISTICAL FRAMEWORK FOR AGING RESEARCH
Swindell, William R.
2009-01-01
Survivorship experiments play a central role in aging research and are performed to evaluate whether interventions alter the rate of aging and increase lifespan. The accelerated failure time (AFT) model is seldom used to analyze survivorship data, but offers a potentially useful statistical approach that is based upon the survival curve rather than the hazard function. In this study, AFT models were used to analyze data from 16 survivorship experiments that evaluated the effects of one or more genetic manipulations on mouse lifespan. Most genetic manipulations were found to have a multiplicative effect on survivorship that is independent of age and well-characterized by the AFT model “deceleration factor”. AFT model deceleration factors also provided a more intuitive measure of treatment effect than the hazard ratio, and were robust to departures from modeling assumptions. Age-dependent treatment effects, when present, were investigated using quantile regression modeling. These results provide an informative and quantitative summary of survivorship data associated with currently known long-lived mouse models. In addition, from the standpoint of aging research, these statistical approaches have appealing properties and provide valuable tools for the analysis of survivorship data. PMID:19007875
Accelerated failure time models provide a useful statistical framework for aging research.
Swindell, William R
2009-03-01
Survivorship experiments play a central role in aging research and are performed to evaluate whether interventions alter the rate of aging and increase lifespan. The accelerated failure time (AFT) model is seldom used to analyze survivorship data, but offers a potentially useful statistical approach that is based upon the survival curve rather than the hazard function. In this study, AFT models were used to analyze data from 16 survivorship experiments that evaluated the effects of one or more genetic manipulations on mouse lifespan. Most genetic manipulations were found to have a multiplicative effect on survivorship that is independent of age and well-characterized by the AFT model "deceleration factor". AFT model deceleration factors also provided a more intuitive measure of treatment effect than the hazard ratio, and were robust to departures from modeling assumptions. Age-dependent treatment effects, when present, were investigated using quantile regression modeling. These results provide an informative and quantitative summary of survivorship data associated with currently known long-lived mouse models. In addition, from the standpoint of aging research, these statistical approaches have appealing properties and provide valuable tools for the analysis of survivorship data.
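A minimal sketch of the AFT idea described in these two abstracts: with no censoring, the log-linear AFT model reduces to ordinary least squares on log survival times, and the exponentiated treatment coefficient recovers the deceleration factor (data simulated, all values illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy survivorship data, no censoring: a treatment that multiplies each
# lifespan by a 'deceleration factor' of 1.3 (values illustrative).
n = 200
treated = np.repeat([0, 1], n)
base = rng.weibull(3.0, 2 * n) * 800.0          # control lifespans, days
t = base * np.where(treated == 1, 1.3, 1.0)

# Log-linear AFT fit: log T = b0 + b1 * treated + error. With no censoring
# this is ordinary least squares; exp(b1) recovers the deceleration factor.
x = np.c_[np.ones_like(treated), treated]
b, *_ = np.linalg.lstsq(x, np.log(t), rcond=None)
print(f"estimated deceleration factor: {np.exp(b[1]):.2f}")   # ~1.3
```

Censored data require a likelihood-based AFT fitter rather than least squares; the sketch above only illustrates the multiplicative time-scaling interpretation.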
Gianoli, Enrica; Brusoni, Daniela; Cornaggia, Nicoletta; Saretto, Gianni
2012-01-01
In the present work the chemical compositions of the products used in shoe manufacturing are reported. The data were collected over the period 2004-2007 in 156 shoe factories in the Vigevano area during a study aimed at evaluating the safety conditions and occupational exposure to hazardous chemicals of the employees. The study was part of a regional project for "Occupational cancer prevention in the footwear industry". In the first phase of the study, an information form on the production cycle, the products used and their composition was filled in during a preliminary audit. In the second phase, an in-depth qualitative/quantitative evaluation of occupational exposure was conducted in 13 selected shoe factories. Data analysis showed an increase in the use of water-based adhesives at the expense of solvent-based adhesives; a reduction of the n-hexane concentration in solvent mixtures to less than 3.5% by weight, and in some cases to 1% by weight; and an increase in the use of products containing less hazardous ketones, esters, cyclohexane and heptane. Only in very few cases were products containing 4 to 12% by weight of toluene used. These data attest to a positive trend in the prevention of workers' risks in the shoe industry.
Ronza, A; Vílchez, J A; Casal, J
2007-07-19
Risk assessment of hazardous material spill scenarios, and quantitative risk assessment in particular, makes use of event trees to account for the possible outcomes of hazardous releases. Using event trees entails the definition of probabilities of occurrence for events such as spill ignition and blast formation. This study comprises an extensive analysis of ignition and explosion probability data proposed in previous work. Subsequently, the results of a survey of two vast US federal spill databases (HMIRS, by the Department of Transportation, and MINMOD, by the US Coast Guard) are reported and commented on. Some tens of thousands of records of hydrocarbon spills were analysed. The general pattern of statistical ignition and explosion probabilities as a function of the amount and the substance spilled is discussed. Equations based on the statistical data are proposed that predict the ignition probability of hydrocarbon spills as a function of the amount and the substance spilled. Explosion probabilities are put forth as well. Two sets of probability data are proposed: it is suggested that figures deduced from HMIRS be used in land transportation risk assessment, and MINMOD results in maritime scenario assessment. Results are discussed and compared with the previous technical literature.
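A minimal event-tree sketch of how such ignition and explosion probabilities enter a quantitative risk assessment (the probability figures below are hypothetical placeholders, not the paper's fitted equations):

```python
def spill_event_tree(spill_rate_per_yr, p_ignition, p_explosion_given_ign):
    """Outcome frequencies (per year) for a release event tree:
    ignition -> explosion vs. fire; no ignition -> dispersion only."""
    f_ign = spill_rate_per_yr * p_ignition
    return {
        "explosion": f_ign * p_explosion_given_ign,
        "fire (no blast)": f_ign * (1.0 - p_explosion_given_ign),
        "unignited dispersion": spill_rate_per_yr * (1.0 - p_ignition),
    }

# Hypothetical figures for a large gasoline spill on a road route; the
# paper's fitted probabilities depend on substance and amount spilled.
for outcome, f in spill_event_tree(1e-3, 0.09, 0.2).items():
    print(f"{outcome:>22}: {f:.2e} /yr")
```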
NASA Astrophysics Data System (ADS)
Mastrolorenzo, G.; Pappalardo, L.; de Natale, G.; Troise, C.; Rossano, S.; Panizza, A.
2009-04-01
Probabilistic approaches based on available volcanological data from real eruptions of Campi Flegrei and Somma-Vesuvius are assembled into a comprehensive assessment of volcanic hazards in the Neapolitan area. This allows the volcanic hazards related to the different types of events to be compared, which can be used to evaluate the conditional probability of flow and fall hazards in case of a volcanic crisis. Hazard maps are presented, based on a rather complete set of numerical simulations produced using field and laboratory data as input parameters relative to a large range (VEI 1 to 5) of fallout and pyroclastic-flow events and their relative occurrence. The results allow us to quantitatively evaluate and compare the hazard related to pyroclastic fallout and pyroclastic density currents (PDCs) at the Neapolitan volcanoes and their surroundings, including the city of Naples. Due to its position between the two volcanic areas, the city of Naples is particularly exposed to volcanic risk from VEI>2 eruptions, as recorded in the local volcanic succession. Because of the dominant wind directions, the area of Naples is particularly prone to fallout hazard from Campi Flegrei caldera eruptions in the VEI range 2-5. The hazard from PDCs decreases roughly radially with distance from the eruptive vents and is strongly controlled by topographic heights. Campi Flegrei eruptions are particularly hazardous for Naples, although the Camaldoli and Posillipo hills form an effective barrier against propagation into the very central part of Naples. PDCs from Vesuvius eruptions with VEI>4 can cover the city of Naples, whereas even VEI>3 eruptions pose a moderate fallout hazard there.
NASA Astrophysics Data System (ADS)
Tarolli, Paolo; Prosdocimi, Massimo; Sofia, Giulia; Dalla Fontana, Giancarlo
2015-04-01
A real opportunity and challenge for hazard mapping is offered by the use of smartphones and the low-cost and flexible photogrammetric technique known as 'Structure-from-Motion' (SfM). Differently from other traditional photogrammetric methods, SfM allows three-dimensional geometries (Digital Surface Models, DSMs) to be reconstructed from randomly acquired images. The images can be acquired by standalone digital cameras (compact or reflex), or even by smartphones' built-in cameras. This represents a "revolutionary" advance compared with more expensive technologies and applications (e.g. Terrestrial Laser Scanner (TLS), airborne lidar) (Tarolli, 2014). Through fast, simple and consecutive field surveys, anyone with a smartphone can take a large set of pictures of the same study area. This way, high-resolution and multi-temporal DSMs may be obtained and used to better monitor and understand erosion and deposition processes. Furthermore, these topographic data can also help to quantify the volumes of material eroded by landslides and to recognize the major critical issues that usually occur during a natural hazard event (e.g. river bank erosion and/or collapse due to floods). In this work we considered different case studies located in different environmental contexts of Italy, where extensive photosets were obtained using smartphones. TLS data were also considered in the analysis as a benchmark for comparison with the SfM data. DSMs derived from SfM at centimeter grid-cell resolution proved effective for automatically recognizing areas subject to surface instabilities and for quantitatively estimating erosion and deposition volumes, for example. Morphometric indices such as landform curvature and surface roughness, and statistical thresholds (e.g. the standard deviation) of these indices, served as the basis for the proposed analyses. The results indicate that the SfM technique applied through smartphones offers a fast, simple and affordable alternative to lidar technology. Anyone (including farmers, technicians, or Civil Protection staff) who has a good smartphone can take photographs and easily obtain high-resolution DSMs from them. Therefore, the SfM technique combined with smartphones can be a very strategic tool for post-event field surveys, increasing the existing knowledge of such events and providing fast technical solutions for risk mitigation (e.g. landslide and flood risk management). The future challenge consists of using only a smartphone for local-scale post-event analyses. This can be further enhanced by the development of specific apps able to quickly build a 3D view of the case study and arrange a preliminary quantitative analysis of the process involved, ready to be sent to Civil Protection for further elaboration. Tarolli, P. (2014). High-resolution topography for understanding Earth surface processes: opportunities and challenges. Geomorphology, 216, 295-312, doi:10.1016/j.geomorph.2014.03.008.
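A minimal sketch of one of the morphometric indices mentioned above, surface roughness as a moving-window standard deviation of the DSM with a simple statistical threshold (synthetic terrain; the window size and threshold are illustrative):

```python
import numpy as np
from scipy.ndimage import uniform_filter

def roughness(dsm, window=5):
    """Surface roughness as the moving-window standard deviation of the
    DSM, computed from local first and second moments."""
    mean = uniform_filter(dsm, size=window)
    mean_sq = uniform_filter(dsm * dsm, size=window)
    return np.sqrt(np.maximum(mean_sq - mean * mean, 0.0))

rng = np.random.default_rng(5)
dsm = np.cumsum(rng.normal(0, 0.02, (50, 50)), axis=0)  # synthetic terrain
r = roughness(dsm)
flagged = r > (r.mean() + 2 * r.std())      # simple statistical threshold
print(f"{flagged.mean():.1%} of cells flagged as potentially unstable")
```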
Sinha, Tara; Desai, Sapna; Mahal, Ajay
2014-07-01
Health microinsurance is a financial tool that increases utilization of health care services among low-income persons. There is limited understanding of the illnesses for which insured persons are hospitalized. Analysis of health claims at VimoSEWA, an Indian microinsurance scheme, shows that a significant proportion of hospitalization among insured adult women is for common illnesses—fever, diarrhoea and malaria—that are amenable to outpatient treatment. This study aims to understand the factors that result in hospitalization for common illnesses. The article uses a mixed methods approach. Quantitative data were collected from a household survey of 816 urban low-income households in Gujarat, India. The qualitative data are based on 10 in-depth case studies of insured women hospitalized for common illnesses and interviews with five providers. Quantitative and qualitative data were supplemented with data from the insurance scheme’s administrative records. Socioeconomic characteristics and morbidity patterns among insured and uninsured women were similar with fever the most commonly reported illness. While fever was the leading cause for hospitalization among insured women, no uninsured women were hospitalized for fever. Qualitative investigation indicates that 9 of 10 hospitalized women first sought outpatient treatment. Precipitating factors for hospitalization were either the persistence or worsening of symptoms. Factors that facilitated hospitalization included having insurance and the perceptions of doctors regarding the need for hospitalization. In the absence of quality primary care, health insurance can lead to hospitalization for non-serious illnesses. Deterrents to hospitalization point away from member moral hazard; provider moral hazard cannot be ruled out. This study underscores the need for quality primary health care and its better integration with health microinsurance schemes.
Sheu, Yahtyng; Zmuda, Joseph M; Boudreau, Robert M; Petit, Moira A; Ensrud, Kristine E; Bauer, Douglas C; Gordon, Christopher L; Orwoll, Eric S; Cauley, Jane A
2011-01-01
Many fractures occur in individuals without osteoporosis as defined by areal bone mineral density (aBMD). Inclusion of other aspects of skeletal strength may be useful in identifying at-risk subjects. We used surrogate measures of bone strength at the radius and tibia measured by peripheral quantitative computed tomography (pQCT) to evaluate their relationships with nonvertebral fracture risk. Femoral neck (FN) aBMD, measured by dual-energy X-ray absorptiometry (DXA), also was included. The study population consisted of 1143 white men aged 69+ years with pQCT measures at the radius and tibia from the Minneapolis and Pittsburgh centers of the Osteoporotic Fractures in Men (MrOS) study. Principal-components analysis and Cox proportional-hazards modeling were used to identify 21 of 58 pQCT variables with a major contribution to nonvertebral incident fractures. After a mean 2.9 years of follow-up, 39 fractures occurred. Men without incident fractures had significantly greater bone mineral content, cross-sectional area, and indices of bone strength by pQCT than those with fractures. Every SD decrease in 18 of the 21 pQCT parameters was significantly associated with increased fracture risk (hazard ratios ranged from 1.4 to 2.2) independent of age, study site, body mass index (BMI), and FN aBMD. Using the area under the receiver operating characteristic curve (AUC), combining FN aBMD with each of three radius strength parameters individually increased fracture prediction over FN aBMD alone (AUC increased from 0.73 to 0.80). Peripheral bone strength measures are associated with fracture risk and may improve our ability to identify older men at high risk of fracture.
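For readers less familiar with this modeling setup, the following is a schematic Cox proportional-hazards fit for a single standardized strength parameter using the lifelines package; the file and column names are hypothetical, and this is not the MrOS analysis itself.

    import pandas as pd
    from lifelines import CoxPHFitter

    # Hypothetical columns: follow-up time (years), fracture indicator (0/1),
    # age, BMI, femoral-neck aBMD, and one pQCT strength parameter.
    df = pd.read_csv("pqct_cohort.csv")
    df["strength_z"] = (df["strength"] - df["strength"].mean()) / df["strength"].std()

    cph = CoxPHFitter()
    cph.fit(df[["time", "fracture", "age", "bmi", "fn_abmd", "strength_z"]],
            duration_col="time", event_col="fracture")
    print(cph.summary)
    # exp(-coef) for strength_z is the hazard ratio per SD *decrease*.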
Sparano, Joseph A; Goldstein, Lori J; Childs, Barrett H; Shak, Steven; Brassard, Diana; Badve, Sunil; Baehner, Frederick L; Bugarini, Roberto; Rowley, Steve; Perez, Edith A; Shulman, Lawrence N; Martino, Silvana; Davidson, Nancy E; Kenny, Paraic A; Sledge, George W; Gray, Robert
2011-11-15
To conduct an exploratory analysis of the relationship between gene expression and recurrence in patients with operable triple-negative breast cancer (TNBC) treated with adjuvant doxorubicin-containing chemotherapy. RNA was extracted from archived tumor samples derived from 246 patients with stage I-III TNBC treated with adjuvant doxorubicin-containing chemotherapy, and was analyzed by quantitative reverse transcriptase PCR for a panel of 374 genes. The relationship between gene expression and recurrence was evaluated using weighted Cox proportional hazards model score tests. Growth factor receptor bound protein 7 (GRB7) was the only gene for which higher expression was significantly associated with increased recurrence in TNBC (Korn's adjusted P value = 0.04). In a Cox proportional hazards model adjusted for clinicopathologic features, higher GRB7 expression was associated with an increased recurrence risk (HR = 2.31; P = 0.04 using the median as the split). The 5-year recurrence rates were 10.5% [95% confidence intervals (CI), 7.8-14.1] in the low and 20.4% (95% CI, 16.5-25.0) in the high GRB7 groups. External validation in other datasets indicated that GRB7 expression was not prognostic in two adjuvant trials including variable systemic therapy, but two other trials showed that high GRB7 expression was associated with resistance to neoadjuvant doxorubicin and taxane therapy. GRB7 was associated with an increased risk of recurrence in TNBC, suggesting that GRB7 or GRB7-dependent pathways may serve as potential biomarkers and therapeutic targets. Therapeutic targeting of one or more of the identified factors that function as interaction nodes or effectors should also be considered.
Sparano, Joseph A.; Goldstein, Lori J.; Childs, Barrett H.; Shak, Steven; Brassard, Diana; Badve, Sunil; Baehner, Frederick L.; Bugarini, Roberto; Rowley, Steve; Perez, Edith; Shulman, Lawrence N.; Martino, Silvana; Davidson, Nancy E.; Kenny, Paraic A.; Sledge, George W.; Gray, Robert
2012-01-01
Purpose To perform an exploratory analysis of the relationship between gene expression and recurrence in patients with operable triple negative breast cancer (TNBC) treated with adjuvant doxorubicin-containing chemotherapy. Experimental design RNA was extracted from archived tumor samples derived from 246 patients with stage I-III TNBC treated with adjuvant doxorubicin-containing chemotherapy, and was analyzed by quantitative RT-PCR for a panel of 374 genes. The relationship between gene expression and recurrence was evaluated using weighted Cox proportional hazards model score tests. Results GRB7 was the only gene for which higher expression was significantly associated with increased recurrence in TNBC (Korn's adjusted p value = 0.04). In a Cox proportional hazards model adjusted for clinicopathologic features, higher GRB7 expression was associated with an increased recurrence risk (HR 2.31, p = 0.04 using the median as the split). The 5-year recurrence rates were 10.5% (95% confidence intervals [CI] 7.8%, 14.1%) in the low and 20.4% (95% CI 16.5%, 25.0%) in the high GRB7 groups. External validation in other datasets indicated that GRB7 expression was not prognostic in two adjuvant trials including variable systemic therapy, but two other trials showed that high GRB7 expression was associated with resistance to neoadjuvant doxorubicin and taxane therapy. Conclusions GRB7 was associated with an increased risk of recurrence in TNBC, suggesting that GRB7 or GRB7-dependent pathways may serve as potential biomarkers and therapeutic targets. Therapeutic targeting of one or more of the identified factors that function as interaction nodes or effectors should also be considered. PMID:21933890
Guide for Hydrogen Hazards Analysis on Components and Systems
NASA Technical Reports Server (NTRS)
Beeson, Harold; Woods, Stephen
2003-01-01
The physical and combustion properties of hydrogen give rise to hazards that must be considered when designing and operating a hydrogen system. One of the major concerns in the use of hydrogen is that of fire or detonation because of hydrogen's wide flammability range, low ignition energy, and high flame speed. Other concerns include the contact and interaction of hydrogen with materials, such as the hydrogen embrittlement of materials and the formation of hydrogen hydrides. The low temperatures of liquid and slush hydrogen bring other concerns related to material compatibility and pressure control; this is especially important when dissimilar, adjoining materials are involved. The potential hazards arising from these properties and design features necessitate a proper hydrogen hazards analysis before introducing a material, component, or system into hydrogen service. The objective of this guide is to describe the NASA Johnson Space Center White Sands Test Facility hydrogen hazards analysis method that should be performed before hydrogen is used in components and/or systems. The method is consistent with standard practices for analyzing hazards. It is recommended that this analysis be made before implementing a hydrogen component qualification procedure. A hydrogen hazards analysis is a useful tool for hydrogen-system designers, system and safety engineers, and facility managers. A hydrogen hazards analysis can identify problem areas before hydrogen is introduced into a system, preventing damage to hardware, delay or loss of mission or objective, and possible injury or loss of life.
2010-01-01
Introduction Accurate assessment of estrogen receptor (ER), progesterone receptor (PR), and Ki-67 is essential in the histopathologic diagnostics of breast cancer. Commercially available image analysis systems are usually bundled with dedicated analysis hardware and, to our knowledge, no easily installable, free software for immunostained slide scoring has been described. In this study, we describe a free, Internet-based web application for quantitative image analysis of ER, PR, and Ki-67 immunohistochemistry in breast cancer tissue sections. Methods The application, named ImmunoRatio, calculates the percentage of positively stained nuclear area (labeling index) by using a color deconvolution algorithm for separating the staining components (diaminobenzidine and hematoxylin) and adaptive thresholding for nuclear area segmentation. ImmunoRatio was calibrated using cell counts defined visually as the gold standard (training set, n = 50). Validation was done using a separate set of 50 ER, PR, and Ki-67 stained slides (test set, n = 50). In addition, Ki-67 labeling indexes determined by ImmunoRatio were studied for their prognostic value in a retrospective cohort of 123 breast cancer patients. Results The labeling indexes by calibrated ImmunoRatio analyses correlated well with those defined visually in the test set (correlation coefficient r = 0.98). Using the median Ki-67 labeling index (20%) as a cutoff, a hazard ratio of 2.2 was obtained in the survival analysis (n = 123, P = 0.01). ImmunoRatio was shown to adapt to various staining protocols, microscope setups, digital camera models, and image acquisition settings. The application can be used directly with web browsers running on modern operating systems (e.g., Microsoft Windows, Linux distributions, and Mac OS). No software downloads or installations are required. ImmunoRatio is open source software, and the web application is publicly accessible on our website. Conclusions We anticipate that free web applications, such as ImmunoRatio, will make the quantitative image analysis of ER, PR, and Ki-67 easy and straightforward in the diagnostic assessment of breast cancer specimens. PMID:20663194
Tuominen, Vilppu J; Ruotoistenmäki, Sanna; Viitanen, Arttu; Jumppanen, Mervi; Isola, Jorma
2010-01-01
Accurate assessment of estrogen receptor (ER), progesterone receptor (PR), and Ki-67 is essential in the histopathologic diagnostics of breast cancer. Commercially available image analysis systems are usually bundled with dedicated analysis hardware and, to our knowledge, no easily installable, free software for immunostained slide scoring has been described. In this study, we describe a free, Internet-based web application for quantitative image analysis of ER, PR, and Ki-67 immunohistochemistry in breast cancer tissue sections. The application, named ImmunoRatio, calculates the percentage of positively stained nuclear area (labeling index) by using a color deconvolution algorithm for separating the staining components (diaminobenzidine and hematoxylin) and adaptive thresholding for nuclear area segmentation. ImmunoRatio was calibrated using cell counts defined visually as the gold standard (training set, n = 50). Validation was done using a separate set of 50 ER, PR, and Ki-67 stained slides (test set, n = 50). In addition, Ki-67 labeling indexes determined by ImmunoRatio were studied for their prognostic value in a retrospective cohort of 123 breast cancer patients. The labeling indexes by calibrated ImmunoRatio analyses correlated well with those defined visually in the test set (correlation coefficient r = 0.98). Using the median Ki-67 labeling index (20%) as a cutoff, a hazard ratio of 2.2 was obtained in the survival analysis (n = 123, P = 0.01). ImmunoRatio was shown to adapt to various staining protocols, microscope setups, digital camera models, and image acquisition settings. The application can be used directly with web browsers running on modern operating systems (e.g., Microsoft Windows, Linux distributions, and Mac OS). No software downloads or installations are required. ImmunoRatio is open source software, and the web application is publicly accessible on our website. We anticipate that free web applications, such as ImmunoRatio, will make the quantitative image analysis of ER, PR, and Ki-67 easy and straightforward in the diagnostic assessment of breast cancer specimens.
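As a rough illustration of the two image-analysis steps named above (color deconvolution, then thresholding to a labeling index), here is a short scikit-image sketch; Otsu thresholding stands in for ImmunoRatio's adaptive thresholding, and the image file is hypothetical.

    import numpy as np
    from skimage import io, color, filters

    rgb = io.imread("ki67_field.png")[..., :3]   # hypothetical stained field
    hed = color.rgb2hed(rgb)                     # hematoxylin / eosin / DAB
    hema, dab = hed[..., 0], hed[..., 2]

    hema_mask = hema > filters.threshold_otsu(hema)
    dab_mask = dab > filters.threshold_otsu(dab)
    nuclear_area = np.count_nonzero(hema_mask | dab_mask)
    labeling_index = 100.0 * np.count_nonzero(dab_mask) / max(nuclear_area, 1)
    print(f"labeling index: {labeling_index:.1f}%")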
Tools for Material Design and Selection
NASA Astrophysics Data System (ADS)
Wehage, Kristopher
The present thesis focuses on applications of numerical methods to create tools for material characterization, design and selection. The tools generated in this work incorporate a variety of programming concepts, from digital image analysis, geometry, optimization, and parallel programming to data-mining, databases and web design. The first portion of the thesis focuses on methods for characterizing clustering in bimodal 5083 Aluminum alloys created by cryomilling and powder metallurgy. The bimodal samples analyzed in the present work contain a mixture of a coarse grain phase, with a grain size on the order of several microns, and an ultra-fine grain phase, with a grain size on the order of 200 nm. The mixing of the two phases is not homogeneous and clustering is observed. To investigate clustering in these bimodal materials, various microstructures were created experimentally by conventional cryomilling, Hot Isostatic Pressing (HIP), Extrusion, Dual-Mode Dynamic Forging (DMDF) and a new 'Gradient' cryomilling process. Two techniques for quantitative clustering analysis are presented, formulated, and implemented. The first technique, the Area Disorder function, provides a metric of the quality of coarse grain dispersion in an ultra-fine grain matrix, and the second technique, the Two-Point Correlation function, provides a metric of long- and short-range spatial arrangements of the two phases, as well as an indication of the mean feature size in any direction. The two techniques are applied to digital images created by Scanning Electron Microscopy (SEM) and Electron Backscatter Diffraction (EBSD) of the microstructures. To investigate structure-property relationships through modeling and simulation, strategies for generating synthetic microstructures are discussed, and a computer program that generates randomized microstructures with desired configurations of clustering, as described by the Area Disorder function, is formulated and presented. In the computer program, two-dimensional microstructures are generated by Random Sequential Adsorption (RSA) of voxelized ellipses representing the coarse grain phase. A simulated annealing algorithm is used to geometrically optimize the placement of the ellipses in the model to achieve varying user-defined configurations of spatial arrangement of the coarse grains. During the simulated annealing process, the ellipses are allowed to overlap up to a specified threshold, allowing triple junctions to form in the model. Once the simulated annealing process is complete, the remaining space is populated by smaller ellipses representing the ultra-fine grain phase. Uniform random orientations are assigned to the grains. The program generates text files that can be imported into Crystal Plasticity Finite Element Analysis software for stress analysis. Finally, numerical methods and programming are applied to current issues in green engineering and hazard assessment. To understand the hazards associated with materials and select safer alternatives, engineers and designers need access to up-to-date hazard information. However, hazard information comes from many disparate sources, and aggregating, interpreting and taking action on the wealth of data is not trivial. In light of these challenges, a Framework for Automated Hazard Assessment based on the GreenScreen list translator is presented.
The framework consists of a computer program that automatically extracts data from the GHS-Japan hazard database, loads the data into a machine-readable JSON format, transforms the JSON document into a GreenScreen JSON document using the GreenScreen List Translator v1.2, and performs GreenScreen Benchmark scoring on the material. The GreenScreen JSON documents are then uploaded to a document storage system to allow human operators to search for, modify, or add additional hazard information via a web interface.
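Returning to the clustering metrics of the thesis, the two-point correlation function lends itself to a compact FFT implementation; the sketch below assumes a binary micrograph with periodic boundaries and uses random placeholder data, and is only a schematic version of the technique described.

    import numpy as np

    def two_point_correlation(phase):
        # Probability that two points separated by a given lag both lie in the
        # indicated phase, via the FFT autocorrelation of the indicator image.
        f = np.fft.fft2(phase)
        s2 = np.fft.ifft2(f * np.conj(f)).real / phase.size
        return np.fft.fftshift(s2)

    # Hypothetical binary image: 1 = coarse-grain phase, 0 = ultra-fine matrix.
    rng = np.random.default_rng(1)
    img = (rng.random((256, 256)) < 0.3).astype(float)
    s2 = two_point_correlation(img)
    # The central (zero-lag) value equals the coarse-grain area fraction; the
    # decay length of s2 indicates the mean feature size in each direction.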
NASA Astrophysics Data System (ADS)
Tonini, Roberto; Sandri, Laura; Anne Thompson, Mary
2015-06-01
PyBetVH is a completely new, free, open-source and cross-platform software implementation of the Bayesian Event Tree for Volcanic Hazard (BET_VH), a tool for estimating the probability of any magmatic hazardous phenomenon occurring in a selected time frame, accounting for all the uncertainties. New capabilities of this implementation include the ability to calculate hazard curves which describe the distribution of the exceedance probability as a function of intensity (e.g., tephra load) on a grid of points covering the target area. The computed hazard curves are (i) absolute (accounting for the probability of eruption in a given time frame, and for all the possible vent locations and eruptive sizes) and (ii) Bayesian (computed at different percentiles, in order to quantify the epistemic uncertainty). Such curves allow representation of the full information contained in the probabilistic volcanic hazard assessment (PVHA) and are well suited to become a main input to quantitative risk analyses. PyBetVH allows for interactive visualization of both the computed hazard curves, and the corresponding Bayesian hazard/probability maps. PyBetVH is designed to minimize the efforts of end users, making PVHA results accessible to people who may be less experienced in probabilistic methodologies, e.g. decision makers. The broad compatibility of Python language has also allowed PyBetVH to be installed on the VHub cyber-infrastructure, where it can be run online or downloaded at no cost. PyBetVH can be used to assess any type of magmatic hazard from any volcano. Here we illustrate how to perform a PVHA through PyBetVH using the example of analyzing tephra fallout from the Okataina Volcanic Centre (OVC), New Zealand, and highlight the range of outputs that the tool can generate.
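To make the notion of a Bayesian hazard curve concrete, the sketch below computes exceedance probabilities and epistemic percentiles at a single grid point; the lognormal placeholder samples and sample sizes are illustrative and are not PyBetVH's internals.

    import numpy as np

    # Hypothetical tephra-load samples at one grid point: each row is one draw
    # of the epistemic parameters, each column one simulated event.
    rng = np.random.default_rng(2)
    loads = rng.lognormal(mean=1.0, sigma=1.2, size=(500, 1000))  # kg/m2

    thresholds = np.logspace(-1, 3, 50)                # intensity axis (kg/m2)
    exceed = (loads[:, :, None] >= thresholds).mean(axis=1)
    curve_median = np.percentile(exceed, 50, axis=0)   # median hazard curve
    curve_10, curve_90 = np.percentile(exceed, [10, 90], axis=0)  # epistemic band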
An evaluation of treatment strategies for head and neck cancer in an African American population.
Ignacio, D N; Griffin, J J; Daniel, M G; Serlemitsos-Day, M T; Lombardo, F A; Alleyne, T A
2013-07-01
This study evaluated treatment strategies for head and neck cancers in a predominantly African American population. Data were collected utilizing medical records and the tumour registry at the Howard University Hospital. The Kaplan-Meier method was used for survival analysis, and Cox proportional hazards regression analysis predicted the hazard of death. Analysis revealed that the main treatment strategy was radiation combined with platinum for all stages except stage I. Cetuximab was employed in only 1% of cases. Kaplan-Meier analysis suggested that stage II patients had poorer outcomes than stage IV patients, although this difference was not significant in Cox proportional hazards regression (p = 0.4662); stage I patients had a significantly lower hazard of death than stage IV patients (HR = 0.314; p = 0.0272). Contributory factors included tobacco and alcohol, but body mass index (BMI) was inversely related to the hazard of death. There was no difference in survival using any treatment modality for African Americans.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Owens, T.; Ungers, L.; Briggs, T.
1980-08-01
The purpose of this study is to estimate, both quantitatively and qualitatively, the worker and societal risks attributable to four photovoltaic cell (solar cell) production processes. Quantitative risk values were determined by use of statistics from the California semiconductor industry. The qualitative risk assessment was performed using a variety of both governmental and private sources of data. The occupational health statistics derived from the semiconductor industry were used to predict injury and fatality levels associated with photovoltaic cell manufacturing. The use of these statistics to characterize the two silicon processes described herein is defensible from the standpoint that many of the same process steps and materials are used in both the semiconductor and photovoltaic industries. These health statistics are less applicable to the gallium arsenide and cadmium sulfide manufacturing processes, primarily because of differences in the materials utilized. Although such differences tend to discourage any absolute comparisons among the four photovoltaic cell production processes, certain relative comparisons are warranted. To facilitate a risk comparison of the four processes, the number and severity of process-related chemical hazards were assessed. This qualitative hazard assessment addresses both the relative toxicity and the exposure potential of substances in the workplace. In addition to the worker-related hazards, estimates of process-related emissions and wastes are also provided.
Fish acute toxicity syndromes and their use in the QSAR approach to hazard assessment.
McKim, J M; Bradbury, S P; Niemi, G J
1987-01-01
Implementation of the Toxic Substances Control Act of 1977 creates the need to reliably establish testing priorities, because laboratory resources are limited and the number of industrial chemicals requiring evaluation is overwhelming. The use of quantitative structure activity relationship (QSAR) models as rapid and predictive screening tools to select the more potentially hazardous chemicals for in-depth laboratory evaluation has been proposed. Further implementation and refinement of quantitative structure-toxicity relationships in aquatic toxicology and hazard assessment requires the development of a "mode-of-action" database. With such a database, a qualitative structure-activity relationship can be formulated to assign the proper mode of action, and the respective QSAR, to a given chemical structure. In this review, the development of fish acute toxicity syndromes (FATS), which are toxic-response sets based on various behavioral and physiological-biochemical measurements, and their projected use in the mode-of-action database are outlined. Using behavioral parameters monitored in the fathead minnow during acute toxicity testing, FATS associated with acetylcholinesterase (AChE) inhibitors and narcotics could be reliably predicted. However, compounds classified as oxidative phosphorylation uncouplers or stimulants could not be resolved. Refinement of this approach, using respiratory-cardiovascular responses in the rainbow trout, enabled FATS associated with AChE inhibitors, convulsants, narcotics, respiratory blockers, respiratory membrane irritants, and uncouplers to be correctly predicted. PMID:3297660
Ignat'eva, L P; Pogorelova, I G; Potapova, M O
2006-01-01
Based on the quantitative and qualitative determination of drinking water pollution by a large number of substances of varying toxicity and hazard, the authors calculated the possible risks of adverse impacts on the body through the development of carcinogenic and noncarcinogenic effects.
Lack of complete and appropriate human data requires prediction of the hazards for exposed human populations by extrapolation from available animal and in vitro data. Predictive models for the toxicity of chemicals can be constructed by linking kinetic and mode of action data uti...
A vision and strategy for exposure modelling at the U.S. EPA Office of Research and Development
Traditional, hazard-driven, single-chemical risk assessment practices cannot keep pace with the vast and growing numbers of chemicals in commerce. A well-defined, quantitative, and defensible means of identifying those with the greatest risk potential is needed, with exposure con...
ABSTRACT:Only a fraction of chemicals in commerce have been fully assessed for their potential hazards to human health due to difficulties involved in conventional regulatory tests. It has recently been proposed that quantitative transcriptomic data can be used to determine bench...
The cost of testing chemicals as reproductive toxicants precludes the possibility of evaluating large chemical inventories without a robust strategic approach for setting priorities. The use of quantitative structure-activity relationships (QSARs) in early hazard identification m...
Quantitative structure activity relationship and risk analysis of some pesticides in the goat milk.
Muhammad, Faqir; Awais, Mian Muhammad; Akhtar, Masood; Anwar, Muhammad Irfan
2013-01-04
The detection and quantification of different pesticides in goat milk samples collected from different localities of Faisalabad, Pakistan was performed by HPLC using solid-phase microextraction. The analysis showed that about 50% of the milk samples were contaminated with pesticides. The mean±SEM levels (ppm) of cyhalothrin, endosulfan, chlorpyrifos and cypermethrin were 0.34±0.007, 0.063±0.002, 0.034±0.002 and 0.092±0.002, respectively; methyl parathion was not detected in any of the analyzed samples. Quantitative structure activity relationship (QSAR) models were proposed to predict the residues of unknown pesticides in goat milk from their known physicochemical characteristics, including molecular weight (MW), melting point (MP), and log octanol-water partition coefficient (Ko/w), in relation to milk characteristics such as pH, % fat, specific gravity and refractive index. The analysis revealed a good correlation coefficient (R2 = 0.985) for the goat QSAR model. The coefficients for Ko/w and refractive index were higher for the studied pesticides in goat milk, suggesting that these are the better determinants for predicting pesticide residues in the milk of these animals. Based upon the determined pesticide residues and their provisional tolerable daily intakes, a risk analysis was also conducted, which showed that the daily intakes of cyhalothrin, chlorpyrifos and cypermethrin through goat milk in the present study are, respectively, 2.68, 5.19 and 2.71 times higher than the provisional tolerable daily intakes. This intake of pesticide-contaminated milk might pose health hazards to humans in this locality.
Quantitative structure activity relationship and risk analysis of some pesticides in the goat milk
2013-01-01
The detection and quantification of different pesticides in goat milk samples collected from different localities of Faisalabad, Pakistan was performed by HPLC using solid-phase microextraction. The analysis showed that about 50% of the milk samples were contaminated with pesticides. The mean±SEM levels (ppm) of cyhalothrin, endosulfan, chlorpyrifos and cypermethrin were 0.34±0.007, 0.063±0.002, 0.034±0.002 and 0.092±0.002, respectively; methyl parathion was not detected in any of the analyzed samples. Quantitative structure activity relationship (QSAR) models were proposed to predict the residues of unknown pesticides in goat milk from their known physicochemical characteristics, including molecular weight (MW), melting point (MP), and log octanol-water partition coefficient (Ko/w), in relation to milk characteristics such as pH, % fat, specific gravity and refractive index. The analysis revealed a good correlation coefficient (R2 = 0.985) for the goat QSAR model. The coefficients for Ko/w and refractive index were higher for the studied pesticides in goat milk, suggesting that these are the better determinants for predicting pesticide residues in the milk of these animals. Based upon the determined pesticide residues and their provisional tolerable daily intakes, a risk analysis was also conducted, which showed that the daily intakes of cyhalothrin, chlorpyrifos and cypermethrin through goat milk in the present study are, respectively, 2.68, 5.19 and 2.71 times higher than the provisional tolerable daily intakes. This intake of pesticide-contaminated milk might pose health hazards to humans in this locality. PMID:23369514
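A toy version of such a QSAR fit is sketched below: an ordinary least-squares model relating milk residue to pesticide descriptors. The descriptor values are approximate literature figures used purely for illustration, and with only four compounds the fit is underdetermined; the published model is not reproduced here.

    import numpy as np

    # Columns: MW, MP (deg C), log Ko/w, milk refractive index (approximate,
    # illustrative values for cyhalothrin, endosulfan, chlorpyrifos, cypermethrin).
    X = np.array([[449.9, 49.0, 6.9, 1.35],
                  [406.9, 106.0, 3.8, 1.35],
                  [350.6, 42.0, 4.7, 1.35],
                  [416.3, 61.0, 6.6, 1.35]])
    y = np.array([0.34, 0.063, 0.034, 0.092])       # observed residues (ppm)

    A = np.hstack([X, np.ones((X.shape[0], 1))])    # add an intercept column
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)

    def predict_residue(descriptors):
        # Predict the milk residue (ppm) of an unseen pesticide.
        return float(np.append(descriptors, 1.0) @ coef)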
Mayega, R W; Wafula, M R; Musenero, M; Omale, A; Kiguli, J; Orach, G C; Kabagambe, G; Bazeyo, W
2013-06-01
Most countries in sub-Saharan Africa have not conducted a disaster risk analysis. Hazard and vulnerability analyses provide vital information that can be used for the development of risk reduction and disaster response plans. The purpose of this study was to rank disaster hazards for Uganda, as a basis for identifying the priority hazards to guide disaster management planning. The study was conducted in Uganda, as part of a multi-country assessment. A hazard, vulnerability and capacity analysis was conducted in a focus group discussion of 7 experts representing key stakeholder agencies in disaster management in Uganda. A simple ranking method was used to rank the probability of occurrence of the 11 top hazards, their potential impact and the level of vulnerability of people and infrastructure. In terms of likelihood of occurrence and potential impact, the top-ranked disaster hazards in Uganda are: 1) Epidemics of infectious diseases, 2) Drought/famine, 3) Conflict and environmental degradation, in that order. In terms of vulnerability, the top priority hazards to which people and infrastructure were vulnerable were: 1) Conflicts, 2) Epidemics, 3) Drought/famine and 4) Environmental degradation, in that order. Poverty, gender, lack of information, and lack of resilience measures were some of the factors promoting vulnerability to disasters. As Uganda develops a disaster risk reduction and response plan, it ought to prioritize epidemics of infectious diseases, drought/famine, conflicts and environmental degradation as the priority disaster hazards.
Preliminary hazards analysis -- vitrification process
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coordes, D.; Ruggieri, M.; Russell, J.
1994-06-01
This paper presents a Preliminary Hazards Analysis (PHA) for mixed waste vitrification by joule heating. The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during project conceptual design and provides input to it. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title 1 and 2 design. The PSAR then leads to the Final Safety Analysis Report, performed during the facility's construction and testing; it should be completed before routine operation of the facility commences. This PHA addresses the first four chapters of the safety analysis process, in accordance with the requirements of the DOE Safety Guidelines in SG 830.110. The hazards associated with vitrification processes are evaluated using standard safety analysis methods, which include: identification of credible potential hazardous energy sources; identification of preventative features of the facility or system; identification of mitigative features; and analyses of credible hazards. Maximal facility inventories of radioactive and hazardous materials are postulated to evaluate worst-case accident consequences. These inventories were based on DOE-STD-1027-92 guidance and the surrogate waste streams defined by Mayberry, et al. Radiological assessments indicate that a facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous materials assessment indicates that a Mixed Waste Vitrification facility will be a Low Hazard facility having minimal impacts to offsite personnel and the environment.
NASA Astrophysics Data System (ADS)
Biass, S.; Todde, A.; Cioni, R.; Pistolesi, M.; Geshi, N.; Bonadonna, C.
2017-10-01
We present an analysis of the exposure of infrastructure and lifelines to tephra fallout for a future large-scale explosive eruption of Sakurajima volcano. An eruption scenario is identified based on the field characterization of the last subplinian eruption at Sakurajima and a review of reports of the eruptions that occurred in the past six centuries. A scenario-based probabilistic hazard assessment is performed using the Tephra2 model, considering various eruption durations to reflect the complex eruptive sequences of all considered reference eruptions. A quantitative exposure analysis of infrastructure and lifelines is presented, primarily using open-access data. The post-event impact assessment of Magill et al. (Earth Planets Space 65:677-698, 2013) after the 2011 VEI 2 eruption of Shinmoedake is used to discuss the vulnerability and resilience of infrastructure during a future large eruption of Sakurajima. Results indicate a mainly eastward dispersal, with longer eruption durations increasing the probability of tephra accumulation in proximal areas and reducing it in distal areas. The exposure analysis reveals that 2300 km of road network, 18 km2 of urban area, and 306 km2 of agricultural land have a 50% probability of being affected by a tephra accumulation of 1 kg/m2. A simple qualitative exposure analysis suggests that the municipalities of Kagoshima, Kanoya, and Tarumizu are the most likely to suffer impacts. Finally, the 2011 VEI 2 eruption of Shinmoedake demonstrated that the mitigation strategies already implemented have increased resilience and improved recovery of affected infrastructure. Nevertheless, the extent to which these mitigation actions will perform during the VEI 4 eruption presented here is unclear, and our hazard assessment points to possible damage on the Sakurajima peninsula and in the neighboring municipality of Tarumizu.
Ben Bouallègue, Fayçal; Vauchot, Fabien; Mariano-Goulart, Denis; Payoux, Pierre
2018-02-09
We evaluated the performance of amyloid PET textural and shape features in discriminating normal and Alzheimer's disease (AD) subjects, and in predicting conversion to AD in subjects with mild cognitive impairment (MCI) or significant memory concern (SMC). Subjects from the Alzheimer's Disease Neuroimaging Initiative with available baseline 18F-florbetapir and T1-MRI scans were included. The cross-sectional cohort consisted of 181 controls and 148 AD subjects. The longitudinal cohort consisted of 431 SMC/MCI subjects, 85 of whom converted to AD during follow-up. PET images were normalized to MNI space and post-processed using in-house software. Relative retention indices (SUVr) were computed with respect to pontine, cerebellar, and composite reference regions. Several textural and shape features were extracted and then combined using a support vector machine (SVM) to build a predictive model of AD conversion. Diagnostic and prognostic performance was evaluated using ROC analysis and survival analysis with the Cox proportional hazards model. The three SUVr and all the tested features effectively discriminated AD subjects in cross-sectional analysis (all p < 0.001). In longitudinal analysis, the variables with the highest prognostic value were composite SUVr (AUC 0.86; accuracy 81%), skewness (0.87; 83%), local minima (0.85; 79%), Geary's index (0.86; 81%), gradient norm maximal argument (0.83; 82%), and the SVM model (0.91; 86%). The adjusted hazard ratio for AD conversion was 5.5 for the SVM model, compared with 4.0, 2.6, and 3.8 for cerebellar, pontine and composite SUVr (all p < 0.001), indicating that appropriate amyloid textural and shape features predict conversion to AD with at least as good accuracy as classical SUVr.
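A schematic of the SVM-plus-ROC evaluation step, using scikit-learn with random placeholder features in place of the study's PET-derived variables; the cohort size and feature count echo the abstract, but the data are synthetic.

    import numpy as np
    from sklearn.model_selection import cross_val_predict
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(3)
    X = rng.normal(size=(431, 6))        # placeholder SUVr/textural/shape features
    y = rng.random(431) < 0.2            # placeholder converter labels

    clf = make_pipeline(StandardScaler(), SVC(probability=True))
    scores = cross_val_predict(clf, X, y, cv=5, method="predict_proba")[:, 1]
    print("cross-validated AUC:", roc_auc_score(y, scores))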
Kavlock, Robert; Dix, David
2010-02-01
Computational toxicology is the application of mathematical and computer models to help assess chemical hazards and risks to human health and the environment. Supported by advances in informatics, high-throughput screening (HTS) technologies, and systems biology, the U.S. Environmental Protection Agency (EPA) is developing robust and flexible computational tools that can be applied to the thousands of chemicals in commerce, and to contaminant mixtures found in air, water, and hazardous-waste sites. The Office of Research and Development (ORD) Computational Toxicology Research Program (CTRP) is composed of three main elements. The largest component is the National Center for Computational Toxicology (NCCT), which was established in 2005 to coordinate research on chemical screening and prioritization, informatics, and systems modeling. The second element consists of related activities in the National Health and Environmental Effects Research Laboratory (NHEERL) and the National Exposure Research Laboratory (NERL). The third and final component consists of academic centers working on various aspects of computational toxicology and funded by the U.S. EPA Science to Achieve Results (STAR) program. Together these elements form the key components in the implementation of both the initial strategy, A Framework for a Computational Toxicology Research Program (U.S. EPA, 2003), and the newly released The U.S. Environmental Protection Agency's Strategic Plan for Evaluating the Toxicity of Chemicals (U.S. EPA, 2009a). Key intramural projects of the CTRP include digitizing legacy toxicity testing information in the toxicity reference database (ToxRefDB), predicting toxicity (ToxCast) and exposure (ExpoCast), and creating virtual liver (v-Liver) and virtual embryo (v-Embryo) systems models. U.S. EPA-funded STAR centers are also providing bioinformatics, computational toxicology data and models, and developmental toxicity data and models. The models and underlying data are being made publicly available through the Aggregated Computational Toxicology Resource (ACToR), the Distributed Structure-Searchable Toxicity (DSSTox) Database Network, and other U.S. EPA websites. While initially focused on improving the hazard identification process, the CTRP is placing increasing emphasis on using high-throughput bioactivity profiling data in systems modeling to support quantitative risk assessments, and on developing complementary higher-throughput exposure models. This integrated approach will enable analysis of life-stage susceptibility, and understanding of the exposures, pathways, and key events by which chemicals exert their toxicity in developing systems (e.g., endocrine-related pathways). The CTRP will be a critical component in next-generation risk assessments utilizing quantitative high-throughput data and providing a much higher capacity for assessing chemical toxicity than is currently available.
Analysis and design of randomised clinical trials involving competing risks endpoints.
Tai, Bee-Choo; Wee, Joseph; Machin, David
2011-05-19
In randomised clinical trials involving time-to-event outcomes, the failures concerned may be events of an entirely different nature and as such define a classical competing risks framework. In designing and analysing clinical trials involving such endpoints, it is important to account for the competing events, and evaluate how each contributes to the overall failure. An appropriate choice of statistical model is important for adequate determination of sample size. We describe how competing events may be summarised in such trials using cumulative incidence functions and Gray's test. The statistical modelling of competing events using proportional cause-specific and subdistribution hazard functions, and the corresponding procedures for sample size estimation are outlined. These are illustrated using data from a randomised clinical trial (SQNP01) of patients with advanced (non-metastatic) nasopharyngeal cancer. In this trial, treatment has no effect on the competing event of loco-regional recurrence. Thus the effects of treatment on the hazard of distant metastasis were similar via both the cause-specific (unadjusted csHR = 0.43, 95% CI 0.25 - 0.72) and subdistribution (unadjusted subHR 0.43; 95% CI 0.25 - 0.76) hazard analyses, in favour of concurrent chemo-radiotherapy followed by adjuvant chemotherapy. Adjusting for nodal status and tumour size did not alter the results. The results of the logrank test (p = 0.002) comparing the cause-specific hazards and the Gray's test (p = 0.003) comparing the cumulative incidences also led to the same conclusion. However, the subdistribution hazard analysis requires many more subjects than the cause-specific hazard analysis to detect the same magnitude of effect. The cause-specific hazard analysis is appropriate for analysing competing risks outcomes when treatment has no effect on the cause-specific hazard of the competing event. It requires fewer subjects than the subdistribution hazard analysis for a similar effect size. However, if the main and competing events are influenced in opposing directions by an intervention, a subdistribution hazard analysis may be warranted.
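A minimal sketch of the cause-specific approach described above: a Cox model is fitted for the main event while competing events are treated as censored. Column names and the data file are hypothetical, and the lifelines package stands in for whatever software the trial actually used.

    import pandas as pd
    from lifelines import CoxPHFitter

    # Hypothetical columns: time to first event, event_type (0 = censored,
    # 1 = distant metastasis, 2 = loco-regional recurrence), treatment arm.
    df = pd.read_csv("competing_risks_trial.csv")

    # Cause-specific analysis for distant metastasis: competing loco-regional
    # recurrences are censored at their event times.
    df["dm_event"] = (df["event_type"] == 1).astype(int)
    cph = CoxPHFitter()
    cph.fit(df[["time", "dm_event", "treatment"]],
            duration_col="time", event_col="dm_event")
    print(cph.summary)   # exp(coef) for treatment is the cause-specific HR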
Zhang, Sheng; Huang, Jinsheng; Yang, Baigbing; Lin, Binjie; Xu, Xinyun; Chen, Jinru; Zhao, Zhuandi; Tu, Xiaozhi; Bin, Haihua
2014-04-01
To improve the occupational health management levels in electroplating enterprises with quantitative classification measures and to provide a scientific basis for the prevention and control of occupational hazards in electroplating enterprises and the protection of workers' health. A quantitative classification table was created for the occupational health management in electroplating enterprises. The evaluation indicators included 6 items and 27 sub-items, with a total score of 100 points. Forty electroplating enterprises were selected and scored according to the quantitative classification table. These electroplating enterprises were classified into grades A, B, and C based on the scores. Among 40 electroplating enterprises, 11 (27.5%) had scores of >85 points (grade A), 23 (57.5%) had scores of 60∼85 points (grade B), and 6 (15.0%) had scores of <60 points (grade C). Quantitative classification management for electroplating enterprises is a valuable attempt, which is helpful for the supervision and management by the health department and provides an effective method for the self-management of enterprises.
14 CFR 417.223 - Flight hazard area analysis.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 14 Aeronautics and Space 4 2012-01-01 2012-01-01 false Flight hazard area analysis. 417.223 Section 417.223 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION... to control the risk to the public from debris impact hazards. The risk management requirements of...
14 CFR 417.223 - Flight hazard area analysis.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 14 Aeronautics and Space 4 2014-01-01 2014-01-01 false Flight hazard area analysis. 417.223 Section 417.223 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION... to control the risk to the public from debris impact hazards. The risk management requirements of...
14 CFR 417.223 - Flight hazard area analysis.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 14 Aeronautics and Space 4 2011-01-01 2011-01-01 false Flight hazard area analysis. 417.223 Section 417.223 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION... to control the risk to the public from debris impact hazards. The risk management requirements of...
14 CFR 417.223 - Flight hazard area analysis.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 14 Aeronautics and Space 4 2013-01-01 2013-01-01 false Flight hazard area analysis. 417.223 Section 417.223 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION... to control the risk to the public from debris impact hazards. The risk management requirements of...
DOT National Transportation Integrated Search
1988-05-01
The report is devoted to the review and discussion of generic hazards associated with the ground, launch, orbital and re-entry phases of space operations. Since the DOT Office of Commercial Space Transportation (OCST) has been charged with protecting...
9 CFR 417.4 - Validation, Verification, Reassessment.
Code of Federal Regulations, 2012 CFR
2012-01-01
... not have a HACCP plan because a hazard analysis has revealed no food safety hazards that are.... 417.4 Section 417.4 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF... ACT HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.4 Validation, Verification...
9 CFR 417.4 - Validation, Verification, Reassessment.
Code of Federal Regulations, 2010 CFR
2010-01-01
... not have a HACCP plan because a hazard analysis has revealed no food safety hazards that are.... 417.4 Section 417.4 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF... ACT HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.4 Validation, Verification...
9 CFR 417.4 - Validation, Verification, Reassessment.
Code of Federal Regulations, 2011 CFR
2011-01-01
... not have a HACCP plan because a hazard analysis has revealed no food safety hazards that are.... 417.4 Section 417.4 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF... ACT HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.4 Validation, Verification...
14 CFR 417.223 - Flight hazard area analysis.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Flight hazard area analysis. 417.223 Section 417.223 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION... to control the risk to the public from debris impact hazards. The risk management requirements of...
The application of quantitative risk assessment to microbial food safety risks.
Jaykus, L A
1996-01-01
Regulatory programs and guidelines for the control of foodborne microbial agents have existed in the U.S. for nearly 100 years. However, increased awareness of the scope and magnitude of foodborne disease, as well as the emergence of previously unrecognized human pathogens transmitted via the foodborne route, have prompted regulatory officials to consider new and improved strategies to reduce the health risks associated with pathogenic microorganisms in foods. Implementation of these proposed strategies will involve definitive costs for a finite level of risk reduction. While regulatory decisions regarding the management of foodborne disease risk have traditionally been made with the aid of the scientific community, a formal conceptual framework for the evaluation of health risks from pathogenic microorganisms in foods is warranted. Quantitative risk assessment (QRA), which is formally defined as the technical assessment of the nature and magnitude of a risk caused by a hazard, provides such a framework. Reproducing microorganisms in foods present a particular challenge to QRA because both their introduction and their numbers may be affected by numerous factors within the food chain, with all of these factors representing significant stages in food production, handling, and consumption, in a farm-to-table type of approach. The process of QRA entails four designated phases: (1) hazard identification, (2) exposure assessment, (3) dose-response assessment, and (4) risk characterization. Specific analytical tools are available to accomplish the analyses required for each phase of the QRA. The purpose of this paper is to provide a description of the conceptual framework for quantitative microbial risk assessment within the standard description provided by the National Academy of Sciences (NAS) paradigm. Each of the sequential steps in QRA is discussed in detail, providing information on current applications, tools for conducting the analyses, and methodological and/or data limitations to date. Conclusions include a brief discussion of subsequent uncertainty and risk analysis methodologies, and a commentary on present and future applications of QRA in the management of the public health risks associated with the presence of pathogenic microorganisms in the food supply.
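The exposure-assessment and risk-characterization phases are often chained in a Monte Carlo simulation; the sketch below uses an exponential dose-response model, P(ill) = 1 - exp(-r * dose), with entirely illustrative parameter values.

    import numpy as np

    rng = np.random.default_rng(4)
    n = 100_000
    conc = rng.lognormal(mean=0.0, sigma=1.0, size=n)    # CFU/g at consumption
    serving = rng.normal(50.0, 10.0, size=n).clip(1.0)   # grams per serving
    r = 1e-4                                             # dose-response parameter

    dose = conc * serving
    p_ill = 1.0 - np.exp(-r * dose)
    print("mean risk per serving:", p_ill.mean())
    print("95th percentile risk:", np.percentile(p_ill, 95))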
A fluvial and pluvial probabilistic flood hazard analysis for Can Tho city, Vietnam
NASA Astrophysics Data System (ADS)
Apel, Heiko; Martinez, Oriol; Thi Chinh, Do; Viet Dung, Nguyen
2014-05-01
Can Tho city is the largest city and the economic heart of the Mekong Delta, Vietnam. Due to its economic importance and envisaged development goals, the city grew rapidly in population and extent over the last two decades. Large parts of the city are located in flood-prone areas, and the central parts of the city have recently experienced an increasing number of flood events, of both fluvial and pluvial nature. As economic power and asset values are constantly increasing, this poses a considerable risk for the city. The aim of this study is to perform a flood hazard analysis considering both fluvial and pluvial floods and to derive probabilistic flood hazard maps. This requires, as a first step, an understanding of the typical flood mechanisms. Fluvial floods are triggered by a coincidence of high water levels during the annual flood period in the Mekong Delta with high tidal levels, which in combination cause short-term inundations in Can Tho. Pluvial floods are triggered by typical tropical convective rain storms during the monsoon season. These two flood pathways are essentially independent in their sources and can thus be treated accordingly in the hazard analysis. For the fluvial hazard analysis we propose a bivariate frequency analysis of the Mekong flood characteristics: the annual maximum flood discharge Q and the annual flood volume V at the upper boundary of the Mekong Delta, the gauging station Kratie. This defines probabilities of exceedance of different Q-V pairs, which are transferred into synthetic flood hydrographs. The synthetic hydrographs are routed through a quasi-2D hydrodynamic model of the entire Mekong Delta in order to provide boundary conditions for a detailed hazard mapping of Can Tho. This downscaling step is necessary because the huge complexity of the river and channel network does not allow for a proper definition of boundary conditions for Can Tho city from gauge data alone. In addition, the available gauge records around Can Tho are too short for a meaningful frequency analysis. The detailed hazard mapping is performed by a 2D hydrodynamic model of Can Tho city. As the scenarios are derived in a Monte-Carlo framework, the final flood hazard maps are probabilistic, i.e. they show the median flood hazard along with uncertainty estimates for each defined level of probability of exceedance. For the pluvial flood hazard, a frequency analysis of the hourly rain gauge data of Can Tho is performed implementing a peak-over-threshold procedure. Based on this frequency analysis, synthetic rain storms are generated in a Monte-Carlo framework for the same probabilities of exceedance as in the fluvial flood hazard analysis. Probabilistic flood hazard maps were then generated with the same 2D hydrodynamic model of the city. In a last step, the fluvial and pluvial scenarios are combined assuming independence of the events. These combined scenarios were also transferred into hazard maps by the 2D hydrodynamic model, finally yielding combined fluvial-pluvial probabilistic flood hazard maps for Can Tho. The derived set of maps may be used for improved city planning or a flood risk analysis.
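A bare-bones version of the peak-over-threshold step is sketched below, fitting a generalized Pareto distribution to rainfall exceedances; the file name and threshold choice are illustrative, and declustering of dependent peaks is omitted for brevity.

    import numpy as np
    from scipy.stats import genpareto

    rain = np.loadtxt("hourly_rain_mm.txt")          # hypothetical gauge record
    threshold = np.quantile(rain[rain > 0], 0.99)    # high threshold (mm)
    excess = rain[rain > threshold] - threshold

    shape, loc, scale = genpareto.fit(excess, floc=0.0)

    # Rainfall depth exceeded with probability p per peak, usable as the
    # intensity of a synthetic design storm.
    p = 0.01
    design_depth = threshold + genpareto.ppf(1 - p, shape, loc=0.0, scale=scale)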
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-25
... establishment's process control plans, that is, its Hazard Analysis and Critical Control Point plans. DATES... control plans, i.e., its Hazard Analysis and Critical Control Point (HACCP) plans; and (3) make the recall... systematic prevention of biological, chemical, and physical hazards. HACCP plans are establishment-developed...
21 CFR 120.7 - Hazard analysis.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 21 Food and Drugs 2 2010-04-01 2010-04-01 false Hazard analysis. 120.7 Section 120.7 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN... determine whether there are food hazards that are reasonably likely to occur for each type of juice...
1993-03-01
Naval Weapons Center (NWC) at China Lake, California. Sponsored by the U.S. Dept. of Energy and the Gas Research Institute, the trials included dispersion experiments at NWC and 13 dispersion trials conducted at the Atomic Energy Commission's Hanford site using Kr-85 releases. [The remainder of this record is a table of trial data that could not be recovered from the source.]
Meier, D C; Benkstein, K D; Hurst, W S; Chu, P M
2017-05-01
Performance standard specifications for point chemical vapor detectors are established in ASTM E 2885-13 and ASTM E 2933-13. The performance evaluation of the detectors requires the accurate delivery of known concentrations of the chemical target to the system under test. Referee methods enable the analyte test concentration and associated uncertainties in the analyte test concentration to be validated by independent analysis, which is especially important for reactive analytes. This work extends the capability of a previously demonstrated method for using Fourier transform infrared (FT-IR) absorption spectroscopy for quantitatively evaluating the composition of vapor streams containing hazardous materials at Acute Exposure Guideline Levels (AEGL) to include test conditions colder than laboratory ambient temperatures. The described method covers the use of primary reference spectra to establish analyte concentrations, the generation of secondary reference spectra suitable for measuring analyte concentrations under specified testing environments, and the use of additional reference spectra and spectral profile strategies to mitigate the uncertainties due to impurities and water condensation within the low-temperature (7 °C, -5 °C) test cell. Important benefits of this approach include verification of the test analyte concentration with characterized uncertainties by in situ measurements co-located with the detector under test, near-real-time feedback, and broad applicability to toxic industrial chemicals.
Determination of volatile organic compounds for a systematic evaluation of third-hand smoking.
Ueta, Ikuo; Saito, Yoshihiro; Teraoka, Kenta; Miura, Tomoya; Jinno, Kiyokatsu
2010-01-01
Third-hand smoking was quantitatively evaluated with a polymer-packed sample preparation needle and subsequent gas chromatography-mass spectrometry analysis. The extraction needle was prepared with polymeric particles as the extraction medium, and successful extraction of typical gaseous volatile organic compounds (VOCs) was accomplished with it. For an evaluation of this new cigarette hazard, several types of clothing fabrics were exposed to sidestream smoke, and the smoking-related VOCs evaporating from the fabrics into the ambient air were preconcentrated with the extraction needle. Smoking-related VOCs in smokers' breath were also measured using the extraction needle, and the contribution of breath VOCs to third-hand smoking pollution was evaluated. The results demonstrated that trace amounts of smoking-related VOCs were successfully determined by the proposed method. The adsorption and desorption behaviors of smoking-related VOCs clearly differed for each fabric material, and the time variations of these VOC concentrations were quantitatively evaluated. The VOC levels in smokers' breath were clearly higher than those of nonsmokers; however, the results suggested that smokers' breath had no significant effect on the potential pollution in a typical living space. The method was further applied to the determination of actual third-hand smoking pollution in an automobile, and the future applicability of the proposed method to the analysis of trace amounts of VOCs in environmental air samples was suggested.
Evaluating the tuberculosis hazard posed to cattle from wildlife across Europe.
Hardstaff, Joanne L; Marion, Glenn; Hutchings, Michael R; White, Piran C L
2014-10-01
Tuberculosis (TB) caused by infection with Mycobacterium bovis (M. bovis) and other closely related members of the M. tuberculosis complex (MTC) infects many domestic and wildlife species across Europe. Transmission from wildlife species to cattle complicates the control of disease in cattle. By determining the level of TB hazard for which a given wildlife species is responsible, the potential for transmission to the cattle population can be evaluated. We undertook a quantitative review of TB hazard across Europe on a country-by-country basis for cattle and five widely distributed wildlife species. Cattle posed the greatest current and potential TB hazard to other cattle for the majority of countries in Europe. Wild boar posed the greatest hazard of all the wildlife species, indicating that wild boar have the greatest ability to transmit the disease to cattle. The most common host systems for TB hazards in Europe are cattle-deer-wild boar systems: the cattle-roe deer-wild boar system is found in 10 countries, and the cattle-red deer-wild boar system in five countries. The dominance of cattle with respect to the hazards in many regions confirms that intensive surveillance of cattle for TB should play an important role in any TB control programme. The significant contribution that wildlife can make to the TB hazard to cattle is also of concern, given current population and distribution increases of some susceptible wildlife species, especially wild boar and deer, and the paucity of wildlife TB surveillance programmes. Copyright © 2013 Elsevier Ltd. All rights reserved.
Newsome, R; Tran, N; Paoli, G M; Jaykus, L A; Tompkin, B; Miliotis, M; Ruthman, T; Hartnett, E; Busta, F F; Petersen, B; Shank, F; McEntire, J; Hotchkiss, J; Wagner, M; Schaffner, D W
2009-03-01
Through a cooperative agreement with the U.S. Food and Drug Administration, the Institute of Food Technologists developed a risk-ranking framework prototype to enable comparison of microbiological and chemical hazards in foods and to assist policy makers, risk managers, risk analysts, and others in determining the relative public health impact of specific hazard-food combinations. The prototype is a bottom-up system based on assumptions that incorporate expert opinion/insight with a number of exposure and hazard-related risk criteria variables, which are propagated forward with food intake data to produce risk-ranking determinations. The prototype produces a semi-quantitative comparative assessment of food safety hazards and the impacts of hazard control measures. For a specific hazard-food combination, the prototype can produce a single metric: a final risk value expressed as annual pseudo-disability-adjusted life years (pDALY). The pDALY is a harmonization of the very different dose-response relationships observed for chemicals and microbes. The prototype was developed on two platforms: a web-based user interface and an Analytica® model (Lumina Decision Systems, Los Gatos, Calif., U.S.A.). Written in Visual Basic, the web-based platform facilitates data input and allows concurrent use from multiple locations. The Analytica model facilitates visualization of the logic flow, the interrelationship of input and output variables, and the calculations/algorithms comprising the prototype. A variety of sortable risk-ranking reports and summary information can be generated for hazard-food pairs, showing hazard and dose-response assumptions and data, per capita consumption by population group, and the annual pDALY.
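The pDALY metric is described only at a high level above; as a rough illustration of the kind of bottom-up ranking such a framework produces, the sketch below combines invented case counts, severity weights, and durations into a DALY-like burden score. All names, numbers, and the aggregation formula are assumptions for illustration, not the prototype's actual algorithm:

```python
# All names and numbers below are invented; the real prototype propagates
# expert-elicited criteria and intake data, not this one-line formula.
pairs = [
    # (hazard-food pair, annual cases, severity weight 0..1, duration in years)
    ("Pathogen A / product X", 1.0e6, 0.10, 0.02),
    ("Toxin B / product Y",    5.0e2, 0.60, 5.00),
    ("Pathogen C / product Z", 1.6e3, 0.50, 1.00),
]

def pseudo_daly(cases, severity, duration):
    """Crude DALY-like burden: cases x severity weight x duration (years)."""
    return cases * severity * duration

# Rank hazard-food pairs by descending pseudo-burden.
for name, n, sev, dur in sorted(pairs, key=lambda p: -pseudo_daly(*p[1:])):
    print(f"{name:24s} pDALY ~ {pseudo_daly(n, sev, dur):10.1f}")
```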
Forensic and homeland security applications of modern portable Raman spectroscopy.
Izake, Emad L
2010-10-10
Modern detection and identification of chemical and biological hazards within the forensic and homeland security contexts may well require conducting the analysis in the field while adopting a non-contact approach to the hazard. Technological advances in both surface-enhanced and resonance-enhanced Raman scattering have redeveloped Raman spectroscopy into the most adaptable spectroscopic technique for stand-off and non-contact analysis of hazards. In addition, spatially offset Raman spectroscopy has proved very valuable for non-invasive chemical analysis of hazards concealed within non-transparent containers and packaging. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
Quantification of tsunami hazard on Canada's Pacific Coast; implications for risk assessment
NASA Astrophysics Data System (ADS)
Evans, Stephen G.; Delaney, Keith B.
2015-04-01
Our assessment of tsunami hazard on Canada's Pacific Coast (i.e., the coast of British Columbia) begins with a review of the 1964 tsunami generated by the Great Alaska Earthquake (M9.2), which resulted in significant damage to coastal communities and infrastructure. In particular, the tsunami waves swept up inlets on the west coast of Vancouver Island and damaged several communities; Port Alberni suffered upwards of $5M worth of damage. At Port Alberni, the maximum tsunami wave height was estimated at 8.2 m above mean sea level and was recorded on the stream gauge on the Somass River, located at about 7 m a.s.l., 6 km upstream from its mouth. The highest wave (9.75 m above tidal datum) was reported from Shields Bay, Graham Island, Queen Charlotte Islands (Haida Gwaii). In addition, the 1964 tsunami was recorded on tide gauges at a number of locations on the BC coast. The 1964 signal and the magnitude and frequency of traces of other historical Pacific tsunamis (both far-field and local) are analysed in the Tofino tide gauge records and compared to tsunami traces in other tide gauges in the Pacific Basin (e.g., Miyako, Japan). Together with a review of the geological evidence for tsunami occurrence along Vancouver Island's west coast, we use these tide gauge data to develop a quantitative framework for tsunami hazard on Canada's Pacific coast. On longer time scales, tsunamis are a major component of the hazard from Cascadia megathrust events. From sedimentological evidence and seismological considerations, the recurrence interval of megathrust events on the Cascadia Subduction Zone has been estimated by others at roughly 500 years. We assume that the hazard associated with a high-magnitude destructive tsunami thus has an annual frequency of roughly 1/500. Compared to other major natural hazards in western Canada, this represents a very high annual probability of a potentially destructive hazard that, in some coastal communities, translates into high levels of local risk, including life-loss risk. Our analysis further indicates that, in terms of life-loss risk, communities on Canada's Pacific Coast that are exposed to high tsunami hazard experience the highest natural risk in Canada. Although sparsely populated, the (outer) coast of British Columbia has important critical infrastructure that includes port developments, shoreline facilities related to forest resource exploitation, a large number of First Nations Reserves, small municipal centres, towns, and villages (some of which are ecotourism and sport fishing centres), and a limited number of industrial facilities. For selected areas on the west coast of Vancouver Island, inundation maps have been prepared for a range of tsunami scenarios. We find that key facilities and critical infrastructure are exposed to the hazards associated with tsunami inundation.
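The quoted ~1/500 annual frequency translates directly into design-life encounter probabilities under the usual simplifying assumption of independent years (our assumption for illustration, not a claim from the paper):

```python
# Encounter probability implied by a ~1/500 annual event frequency,
# assuming independent years (Bernoulli trials).
p_annual = 1 / 500
for horizon in (1, 50, 100):
    p = 1 - (1 - p_annual) ** horizon
    print(f"P(at least one event in {horizon:3d} yr) = {p:.3f}")
# -> roughly 0.002, 0.095, 0.181
```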
Background: Quantitative high-throughput screening (qHTS) assays are increasingly being employed to inform chemical hazard identification. Hundreds of chemicals have been tested in dozens of cell lines across extensive concentration ranges by the National Toxicology Program in co...
75 FR 51388 - 2-methyl-1,3-propanediol; Exemption from the Requirement of a Tolerance
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-20
... mutagenic in an in vitro chromosome aberration test, bacterial gene mutation test, and mammalian cell gene... Research on Cancer (IARC). Based on available studies, there is no evidence of genotoxic activity. There is... hazard endpoint, the Agency has determined that a quantitative risk assessment using safety factors...
Toxic Hazards Research Unit Annual Report: 1987
1988-03-01
Annual report covering October 1986 through September 1987. Topics include low-density lipoproteins in model membranes, neurotoxicity of cigarette smoking, physiologically based pharmacokinetic modeling, quantitative structure-activity relationships, respiratory epithelium studies in cultured respiratory epithelial cells, Salmonella sensitization, and pharmacokinetic and pharmacodynamic modeling.
Toxic Substances List. 1972 Edition.
ERIC Educational Resources Information Center
Christensen, Herbert E., Ed.; And Others
The second edition of the Toxic Substances List, containing some 13,000 entries, is prepared annually by the National Institute for Occupational Safety and Health (NIOSH) in compliance with the Occupational Safety and Health Act of 1970. The purpose of the List is to identify all known toxic substances but not to quantitate the hazard. The List…
A Case Study of Measuring Process Risk for Early Insights into Software Safety
NASA Technical Reports Server (NTRS)
Layman, Lucas; Basili, Victor; Zelkowitz, Marvin V.; Fisher, Karen L.
2011-01-01
In this case study, we examine software safety risk in three flight hardware systems in NASA's Constellation spaceflight program. We applied our Technical and Process Risk Measurement (TPRM) methodology to the Constellation hazard analysis process to quantify the technical and process risks involving software safety in the early design phase of these projects. We analyzed 154 hazard reports and collected metrics to measure the prevalence of software in hazards and the specificity of descriptions of software causes of hazardous conditions. We found that for 49-70% of the 154 hazardous conditions, software could cause the condition or was involved in its prevention. We also found that 12-17% of the 2,013 hazard causes involved software, and that 23-29% of all causes had a software control. The application of the TPRM methodology identified process risks in the application of the hazard analysis process itself that may lead to software safety risk.
Hazard index calculation for 31 May 1984 microburst at Erie, Colorado
NASA Technical Reports Server (NTRS)
Kropfli, R. A.
1988-01-01
Two X-band Doppler radars, operated by NOAA, were used to collect high-resolution data within a small, benign-looking microburst during the PHOENIX II boundary layer experiment. The lowest 2.5 km of the microburst was observed throughout its development and dissipation over a 15-minute period. These observations presented an excellent opportunity to compute a quantitative threat to a hypothetical aircraft whose flight track would carry it through the microburst. The hazard index is based on the kinetic energy loss to the aircraft that would be produced by the microburst; it is a function of the vertical air motion, the horizontal spatial derivatives of the wind field, and the assumed aircraft airspeed and direction. Indices were computed and plotted for all 8 volume scans, and peak values were observed to be sufficiently high to present a significant hazard to an aircraft.
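The paper defines its own kinetic-energy-based index; a closely related published measure of microburst threat is the Bowles F-factor, which combines the along-track wind gradient, the downdraft, and the airspeed. The sketch below computes an F-factor-style index over a synthetic wind profile; it is an assumption-laden stand-in, not the exact index used in the paper:

```python
import numpy as np

# F-factor-style wind-shear hazard index: F = (dU/dt)/g - w/V, with
# dU/dt ~= V * dU/dx along the flight track. Sustained F above roughly 0.1
# is commonly treated as hazardous. The wind profile below is synthetic.
g = 9.81            # m/s^2
V = 70.0            # assumed approach airspeed, m/s

x = np.linspace(0, 4000, 401)                  # distance along track, m
U = 15.0 * np.tanh((x - 2000) / 500)           # headwind-to-tailwind change, m/s
w = -8.0 * np.exp(-((x - 2000) / 400) ** 2)    # downdraft core, m/s (negative = down)

dUdx = np.gradient(U, x)                       # horizontal shear, 1/s
F = (V * dUdx) / g - w / V                     # index along the track

print(f"peak F = {F.max():.3f} at x = {x[F.argmax()]:.0f} m")
```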
Hazards and occupational risk in hard coal mines - a critical analysis of legal requirements
NASA Astrophysics Data System (ADS)
Krause, Marcin
2017-11-01
This publication concerns occupational safety and health in hard coal mines, the basic elements of which are mining hazards and occupational risk. The work includes a comparative analysis of selected provisions of general and industry-specific law regarding hazard analysis and occupational risk assessment. Based on a critical analysis of these legal requirements, basic assumptions for practical guidelines on occupational risk assessment in underground coal mines are proposed.
Liu, Yuewei; Steenland, Kyle; Rong, Yi; Hnizdo, Eva; Huang, Xiji; Zhang, Hai; Shi, Tingming; Sun, Yi; Wu, Tangchun; Chen, Weihong
2013-11-01
Crystalline silica has been classified as a human carcinogen by the International Agency for Research on Cancer (Lyon, France); however, few previous studies have provided quantitative data on silica exposure, silicosis, and/or smoking. We investigated a cohort of 34,018 workers in China (1960-2003) without exposure to carcinogenic confounders. Cumulative silica exposure was estimated by linking a job-exposure matrix to work history. A Cox proportional hazards model was used to conduct the exposure-response analysis and risk assessment. During a mean 34.5-year follow-up, 546 lung cancer deaths were identified. Categorical analyses by quartiles of cumulative silica exposure (using a 25-year lag) yielded hazard ratios of 1.26, 1.54, 1.68, and 1.70, respectively, compared with the unexposed group. Monotonic exposure-response trends were observed among nonsilicotics (P for trend < 0.001). Analyses using splines showed similar trends. The joint effect of silica and smoking was more than additive and close to multiplicative. For workers exposed from ages 20 to 65 years at 0.1 mg/m³ of silica exposure, the estimated excess lifetime risk (through age 75 years) was 0.51%. These findings confirm silica as a human carcinogen and suggest that current exposure limits in many countries might be insufficient to protect workers from lung cancer. They also indicate that smoking cessation could help reduce lung cancer risk for silica-exposed individuals.
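An exposure-response analysis of this type can be reproduced in outline with the lifelines package (assumed installed); the toy data below are invented and merely mirror the study's variable structure:

```python
import pandas as pd
from lifelines import CoxPHFitter  # assumes the lifelines package is installed

# Invented miniature dataset echoing the study design: follow-up time,
# lung cancer death indicator, lagged cumulative silica exposure, smoking.
df = pd.DataFrame({
    "years":      [30, 34, 28, 40, 22, 35, 31, 26, 38, 29],
    "lung_ca":    [0, 1, 0, 1, 1, 0, 1, 0, 1, 0],
    "cum_silica": [0.0, 4.2, 1.1, 6.8, 3.5, 0.0, 5.9, 0.7, 7.4, 1.9],
    "smoker":     [1, 1, 0, 1, 0, 0, 1, 1, 1, 0],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="years", event_col="lung_ca")
cph.print_summary()   # hazard ratios per unit of cumulative exposure
```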
Moshiri, Ata S; Doumani, Ryan; Yelistratova, Lola; Blom, Astrid; Lachance, Kristina; Shinohara, Michi M; Delaney, Martha; Chang, Oliver; McArdle, Susan; Thomas, Hannah; Asgari, Maryam M; Huang, Meei-Li; Schwartz, Stephen M; Nghiem, Paul
2017-04-01
Previous studies have reached conflicting conclusions regarding the proportion of Merkel cell carcinomas (MCCs) that contain the Merkel cell polyomavirus (MCPyV) and the clinical significance of tumor viral status. To address these controversies, we detected MCPyV large T antigen using immunohistochemistry with two distinct antibodies and MCPyV DNA using quantitative PCR. Tumors were called MCPyV-positive if two or more of these three assays indicated presence of this virus. A total of 53 of 282 (19%) MCC tumors in this cohort were virus-negative using this multimodal system. Immunohistochemistry with the CM2B4 antibody had the best overall performance (sensitivity = 0.882, specificity = 0.943) compared with the multimodal classification. Multivariate analysis including age, sex, and immunosuppression showed that, relative to MCC patients with virus-positive tumors, virus-negative MCC patients had significantly increased risk of disease progression (hazard ratio = 1.77, 95% confidence interval = 1.20-2.62) and death from MCC (hazard ratio = 1.85, 95% confidence interval = 1.19-2.89). We confirm that approximately 20% of MCCs are not driven by MCPyV and that such virus-negative MCCs, which can be quite reliably identified by immunohistochemistry using the CM2B4 antibody alone, represent a more aggressive subtype that warrants closer clinical follow-up. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
Hazard function theory for nonstationary natural hazards
NASA Astrophysics Data System (ADS)
Read, L.; Vogel, R. M.
2015-12-01
Studies from the natural hazards literature indicate that many natural processes, including wind speeds, landslides, wildfires, precipitation, streamflow and earthquakes, show evidence of nonstationary behavior such as trends in magnitudes through time. Traditional probabilistic analysis of natural hazards based on partial duration series (PDS) generally assumes stationarity in the magnitudes and arrivals of events, i.e. that the probability of exceedance is constant through time. Given evidence of trends and the consequent expected growth in devastating impacts from natural hazards across the world, new methods are needed to characterize their probabilistic behavior. The field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (x) with its failure time series (t), enabling computation of corresponding average return periods and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose PDS magnitudes are assumed to follow the widely applied Poisson-GP model. We derive a 2-parameter Generalized Pareto hazard model and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard event series x, with corresponding failure time series t, should have application to a wide class of natural hazards.
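For the stationary baseline of the Poisson-GP model referenced above, the average return period of a level z follows from the Poisson exceedance rate and the Generalized Pareto survival function; all parameter values below are illustrative, not from the paper:

```python
from scipy import stats

# Stationary Poisson-GP sketch (the paper's focus is the nonstationary
# extension; these parameter values are invented).
lam   = 2.5     # mean threshold exceedances per year (Poisson rate)
u     = 50.0    # PDS threshold (e.g., discharge units)
xi    = 0.15    # GP shape parameter
sigma = 12.0    # GP scale parameter

def return_period(z):
    """Average return period (years) of level z under the Poisson-GP model."""
    p_exc = stats.genpareto.sf(z - u, c=xi, scale=sigma)  # P(X > z | X > u)
    return 1.0 / (lam * p_exc)

for z in (75, 100, 125):
    print(f"level {z}: T ~ {return_period(z):8.1f} yr")
```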
NASA Astrophysics Data System (ADS)
Sadegh, M.; Moftakhari, H.; AghaKouchak, A.
2017-12-01
Many natural hazards are driven by multiple forcing variables, and concurrent or consecutive extreme events significantly increase the risk of infrastructure/system failure. It is common practice to use univariate analysis based upon a perceived ruling driver to estimate design quantiles and/or return periods of extreme events. A multivariate analysis, however, permits modeling the simultaneous occurrence of multiple forcing variables. In this presentation, we introduce the Multi-hazard Assessment and Scenario Toolbox (MhAST), which comprehensively analyzes marginal and joint probability distributions of natural hazards. MhAST also offers a wide range of scenarios of return periods and design levels and their likelihoods. The contribution of this study is four-fold: 1. comprehensive analysis of the marginal and joint probability of multiple drivers through 17 continuous distributions and 26 copulas; 2. multiple scenario analysis of concurrent extremes based upon the most likely joint occurrence, one ruling variable, and weighted random sampling of joint occurrences with similar exceedance probabilities; 3. weighted average scenario analysis based on an expected event; and 4. uncertainty analysis of the most likely joint occurrence scenario using a Bayesian framework.
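MhAST itself is a published toolbox; as an independent illustration of the joint-probability reasoning it automates, the sketch below evaluates AND/OR joint return periods for two correlated drivers with a hand-coded Gumbel copula. The copula family, dependence parameter, and marginal levels are all assumptions:

```python
import numpy as np

# Gumbel copula: C(u, v) = exp(-(((-ln u)^th + (-ln v)^th))^(1/th)), th >= 1.
def gumbel_copula(u, v, theta):
    return np.exp(-(((-np.log(u)) ** theta + (-np.log(v)) ** theta) ** (1 / theta)))

theta = 2.0          # dependence strength (theta = 1 -> independence)
u = v = 0.99         # marginal non-exceedance probabilities (100-yr levels)

C = gumbel_copula(u, v, theta)
p_and = 1 - u - v + C        # both drivers exceed their 100-yr level
p_or  = 1 - C                # at least one driver exceeds its level
print(f"AND return period ~ {1/p_and:6.0f} yr, OR return period ~ {1/p_or:5.1f} yr")
# Under independence the AND case would be 10,000 yr; dependence shortens it.
```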
21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.
Code of Federal Regulations, 2010 CFR
2010-04-01
...: (i) Critical control points designed to control food hazards that are reasonably likely to occur and could be introduced inside the processing plant environment; and (ii) Critical control points designed... 21 Food and Drugs 2 2010-04-01 2010-04-01 false Hazard Analysis and Critical Control Point (HACCP...
21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.
Code of Federal Regulations, 2012 CFR
2012-04-01
...: (i) Critical control points designed to control food hazards that are reasonably likely to occur and could be introduced inside the processing plant environment; and (ii) Critical control points designed... 21 Food and Drugs 2 2012-04-01 2012-04-01 false Hazard Analysis and Critical Control Point (HACCP...
21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.
Code of Federal Regulations, 2013 CFR
2013-04-01
...: (i) Critical control points designed to control food hazards that are reasonably likely to occur and could be introduced inside the processing plant environment; and (ii) Critical control points designed... 21 Food and Drugs 2 2013-04-01 2013-04-01 false Hazard Analysis and Critical Control Point (HACCP...
21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.
Code of Federal Regulations, 2014 CFR
2014-04-01
...: (i) Critical control points designed to control food hazards that are reasonably likely to occur and could be introduced inside the processing plant environment; and (ii) Critical control points designed... 21 Food and Drugs 2 2014-04-01 2014-04-01 false Hazard Analysis and Critical Control Point (HACCP...
21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.
Code of Federal Regulations, 2011 CFR
2011-04-01
...: (i) Critical control points designed to control food hazards that are reasonably likely to occur and could be introduced inside the processing plant environment; and (ii) Critical control points designed... 21 Food and Drugs 2 2011-04-01 2011-04-01 false Hazard Analysis and Critical Control Point (HACCP...
Saat, Mohd Rapik; Barkan, Christopher P L
2011-05-15
North American railways offer safe and generally the most economical means of long-distance transport of hazardous materials. Nevertheless, in the event of a train accident, releases of these materials can pose substantial risk to human health, property, or the environment. The majority of railway shipments of hazardous materials are in tank cars. Improving the safety design of these cars to make them more robust in accidents generally increases their weight, thereby reducing their capacity and consequent transportation efficiency. This paper presents a generalized tank car safety design optimization model that addresses this tradeoff. The optimization model enables evaluation of each element of tank car safety design, independently and in combination with one another. We present the optimization model by identifying a set of Pareto-optimal solutions for a baseline tank car design in a bicriteria decision problem. This model provides a quantitative framework for a rational decision-making process involving tank car safety design enhancements to reduce the risk of transporting hazardous materials. Copyright © 2011 Elsevier B.V. All rights reserved.
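The core of such a bicriteria optimization is identifying the non-dominated (Pareto-optimal) designs; the sketch below filters an invented set of candidate designs on two criteria, standing in for (and much simpler than) the paper's model:

```python
import numpy as np

# Invented candidate tank car designs scored on two criteria (lower is
# better): added weight (tons, an efficiency penalty) and release risk
# (arbitrary units). The filter keeps the non-dominated set.
designs = np.array([
    [0.0, 9.0],
    [2.0, 6.5],
    [3.5, 6.8],
    [5.0, 4.0],
    [8.0, 3.9],
    [9.0, 2.5],
])

def pareto_mask(points):
    """True where no other point is at least as good on both criteria
    and strictly better on at least one."""
    keep = np.ones(len(points), dtype=bool)
    for i, p in enumerate(points):
        for j, q in enumerate(points):
            if i != j and np.all(q <= p) and np.any(q < p):
                keep[i] = False
                break
    return keep

print("Pareto-optimal (weight penalty, risk):")
print(designs[pareto_mask(designs)])
```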
NASA Astrophysics Data System (ADS)
Carlier, Benoit; Dujarric, Constance; Puissant, Anne; Lissak, Candide; Viel, Vincent; Bétard, François; Madelin, Malika; Fort, Monique; Arnaud-Fassetta, Gilles
2015-04-01
The Guil catchment is particularly prone to torrential and gravitational hazards such as floods, debris flows, landslides, and avalanches, due to several predisposing factors (bedrock supplying abundant debris, strong hillslope-channel connectivity) in a context where summer Mediterranean rainstorms act as triggers. These hazards severely impact the local population (fatalities, destruction of buildings and infrastructure, loss of agricultural land, road closures). Since the second half of the 20th century, the progressive decline of agro-pastoralism and the development of tourism have led to a concentration of human stakes on alluvial cones and valley bottoms, and therefore to an increase in the vulnerability of mountain communities. Following the 1957 and 2000 catastrophic floods and the 1948 and 2008 avalanche episodes, some measures were taken to reduce exposure to risk (engineering works, standards of construction, rescue training…). Nevertheless, given urban expansion (land pressures and political pressures) and the obsolescence of existing protective measures, it is essential to reassess the vulnerability of the stakes exposed to these hazards. Vulnerability analysis is, together with hazard evaluation, one of the major steps of risk assessment. Within the framework of the SAMCO project, designed for mountain risk assessment, our goal is to estimate specific forms of vulnerability for communities living in the upper Guil catchment, in order to provide useful documentation for better management of the valley bottom and the implementation of adequate mitigation measures. Here we present preliminary results for three municipalities of the upper Guil catchment: Aiguilles, Abriès, and Ristolas. We propose an empirical, GIS-based, semi-quantitative indicator of the potential consequences of hazards for elements at risk, applicable at both local and regional scales. This indicator, called the Potential Damage Index, enables us to describe, quantify, and visualize direct (physical injury, structural and functional damage to buildings, lifelines, and land cover) and indirect (socio-economic impacts) consequences. The method estimates the possible damage caused by torrential and gravitational hazards by combining weighted indicators (age, state, land use, number of occupied floors, etc.) reflecting the exposure of elements at risk (land cover, buildings, and lifelines) to obtain maps of total consequences. The method also allows temporality to be introduced by modifying the weights and the combination of variables; for example, we can distinguish between day and night, or between the off-season and the tourist season. Another benefit of this method is that it permits visualization of the evolution of vulnerability through diachronic maps. All these maps, combined with adequate hazard maps, will contribute to a better assessment of vulnerability in the Queyras and should support the development of better land-use and evacuation plans, and thus are important tools for local authorities. This study will be completed by an analysis of social and institutional vulnerability based on interviews with local councillors and risk-perception surveys of the local populations and tourists.
NASA Astrophysics Data System (ADS)
Schmutz, Daria; Zimmermann, Markus; Keiler, Margreth
2017-04-01
Sediment connectivity is defined as the degree of coupling between sediment sources and sinks in a system and describes the effectiveness of the transfer of sediment from hillslopes into channels and within channels (Bracken et al. 2015). Borselli et al. (2008) developed a connectivity index (IC) based on digital terrain models (DTMs), and Cavalli et al. (2013) adapted this index for mountainous catchments. These measures provide overall information about connectivity patterns in a catchment, so an understanding of sediment connectivity can help to improve hazard analysis in these areas. Considering the location of settlements in alpine regions, high sediment transfer can pose a threat to villages located near torrents or on debris cones. However, there is still a lack of studies on the linkage between the IC and hazardous events with high sediment yield in alpine catchments. In this study, the expressiveness and applicability of the IC are tested in relation to hazardous events in several catchments of the Bernese and Pennine Alps (Switzerland). The IC is modelled on the basis of DTMs (resolution 2 m or, where available, 0.5 m) representing the surface before and after a documented hazardous event, and analysed with respect to changes in connectivity caused by the event. The spatial pattern of connectivity is compared with the sediment dynamics observed during the event, using event documentation. In order to validate the IC, a semi-quantitative field connectivity index (FIC) is developed, addressing characteristics of the channel, banks, and slopes, and applied in a selection of the case studies. Initial analysis shows that the IC is highly sensitive to the resolution and quality of the DTM. Connectivity calculated by the IC is highest along the channel. The general pattern of connectivity is comparable when applying the IC to the DTMs from before and after the event. The range of connectivity values obtained from IC modelling is highly specific to each study area, as are the changes caused by the events. Whereas some slopes show increased connectivity, others are less connected or unaffected according to the IC. Further results of the comparison between the FIC and the IC, and an evaluation of both indices in the context of hazardous events, will be presented. REFERENCES Borselli, L., Cassi, P. & Torri, D. 2008: Prolegomena to sediment and flow connectivity in the landscape. A GIS and field numerical assessment. CATENA 75 (3), 268-277. Bracken, L. J., Turnbull, L., Wainwright, J. & Bogaart, P. 2015: Sediment connectivity. A framework for understanding sediment transfer at multiple scales. Earth Surface Processes and Landforms 40 (2), 177-188. Cavalli, M., Trevisani, S., Comiti, F. & Marchi, L. 2013: Geomorphometric assessment of spatial sediment connectivity in small Alpine catchments. Geomorphology 188, 31-41.
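For reference, the Borselli et al. (2008) index cited above is IC = log10(D_up / D_dn), with an upslope component D_up = W̄ S̄ √A (mean weighting factor, mean slope, contributing area) and a downslope component D_dn = Σ d_i / (W_i S_i) summed along the flow path. The point-wise sketch below uses invented terrain values:

```python
import numpy as np

# Point-wise Borselli/Cavalli connectivity index for a single grid cell.
# All terrain values below are invented for illustration.
W_up_mean = 0.6          # mean weighting factor of the upslope area (0..1)
S_up_mean = 0.35         # mean slope gradient of the upslope area (m/m)
A         = 5.0e4        # upslope contributing area (m^2)

d = np.array([10.0, 10.0, 14.1, 10.0])   # downslope flow-path segment lengths (m)
W = np.array([0.6, 0.5, 0.4, 0.7])       # segment weighting factors
S = np.array([0.30, 0.25, 0.20, 0.35])   # segment slopes (m/m)

D_up = W_up_mean * S_up_mean * np.sqrt(A)
D_dn = np.sum(d / (W * S))
IC = np.log10(D_up / D_dn)               # typically negative; higher = better connected
print(f"D_up = {D_up:.1f}, D_dn = {D_dn:.1f}, IC = {IC:.2f}")
```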
Little, C L; Lock, D; Barnes, J; Mitchell, R T
2003-09-01
A meta-analysis of eight UK food studies was carried out to determine the microbiological quality of food and its relationship with the presence in food businesses of hazard analysis systems and food hygiene training. Of the 19,022 premises visited to collect food samples in these studies between 1997 and 2002, two thirds (66%) were catering premises and one third (34%) were retail premises. Comparison with PHLS Microbiological Guidelines revealed that significantly more ready-to-eat food samples from catering premises (20%; 2,511/12,703) were of unsatisfactory or unacceptable microbiological quality compared to samples from retail premises (12%; 1,039/8,462) (p < 0.00001). Three quarters (76%) of retail premises had hazard analysis systems in place compared with 59% of catering premises (p < 0.00001). In 87% of retail premises the manager had received some form of food hygiene training compared with 80% of catering premises (p < 0.00001). From premises where the manager had received no food hygiene training a greater proportion of samples were of unsatisfactory and unacceptable microbiological quality (20% retail, 27% catering) compared with premises where the manager had received food hygiene training (11% retail, 19% catering) (p < 0.00001). Where the manager of the premises had received food hygiene training, documented hazard analysis systems were more likely to be in place (p < 0.00001). Higher proportions of samples of unsatisfactory and unacceptable microbiological quality (17% retail, 22% catering) were from premises where there was no hazard analysis system in place compared to premises that had a documented hazard analysis system in place (10% retail, 18% catering) (p < 0.00001). Our meta-analysis suggests that the lower microbiological quality of ready-to-eat foods from catering premises compared with those collected from retail premises may reflect differences in management food hygiene training and the presence of a hazard analysis system. The importance of adequate training for food handlers and their managers as a pre-requisite for effective hazard analysis and critical control point (HACCP) based controls is therefore emphasised.
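The premises-level comparisons reported above can be spot-checked directly from the published counts; for example, the catering-versus-retail difference in sample quality:

```python
from scipy.stats import chi2_contingency

# Counts as reported above: unsatisfactory/unacceptable ready-to-eat samples
# in catering (2,511 of 12,703) vs retail (1,039 of 8,462) premises.
table = [
    [2511, 12703 - 2511],   # catering: poor, acceptable
    [1039,  8462 - 1039],   # retail:   poor, acceptable
]
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.2e}")  # p << 0.00001, as reported
```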
NASA Astrophysics Data System (ADS)
Gu, Yingxin
This thesis is concerned with atmospheric particles produced by sandstorms and volcanic eruptions. Three studies were conducted in order to examine particle retrieval methodology and apply it towards an improved understanding of large-scale sandstorms. A thermal infrared remote sensing retrieval method developed by Wen and Rose [1994], which retrieves particle sizes, optical depth, and total masses of silicate particles in a volcanic cloud, was applied to an April 7, 2001 sandstorm over northern China using MODIS data. Results indicate that the area of the dust cloud observed was 1.34 million km², the mean particle radius of the dust was 1.44 μm, and the mean optical depth at 11 μm was 0.79. The mean burden of dust was approximately 4.8 tons/km², and the main portion of the dust storm on April 7, 2001 contained 6.5 million tons of dust. The results are supported by both independent remote sensing data (TOMS) and in-situ data for a similar event in 1998, suggesting that the technique is appropriate for quantitative analysis of silicate dust clouds. This is also the first quantitative, satellite-based evaluation of the annual and seasonal dust loading produced by Saharan dust storms in 2003. The retrieved mean particle effective radii for the 2003 dust events are between 1.7 and 2.6 μm, small enough to be inhaled and hazardous to human health. The retrieved yearly dust mass load is 658-690 Tg, which is ~45% of the annual global mineral dust production. Winter was the heaviest dust-loading season of 2003, with a load more than 5 times larger than that of the summer season. The mean optical depths at 11 μm in the winter season (around 0.7) are higher than those in the summer season (around 0.5). The results could help both meteorologists and environmental scientists to evaluate and predict the degree of hazard caused by Saharan dust storms. (Abstract shortened by UMI.)
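The total-mass figures quoted above are internally consistent with the stated burdens, areas, and shares, as a quick check shows:

```python
# Consistency checks on the figures quoted in the abstract above.
burden = 4.8            # tons/km^2, mean burden of the 2001-04-07 storm
area   = 1.34e6         # km^2, observed dust cloud area
print(f"storm mass ~ {burden * area / 1e6:.2f} million tons (reported: 6.5)")

for m in (658, 690):    # Tg, 2003 Saharan yearly load, stated as ~45% of global
    print(f"implied global mineral dust production ~ {m / 0.45:.0f} Tg/yr")
```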
NASA System Safety Handbook. Volume 1; System Safety Framework and Concepts for Implementation
NASA Technical Reports Server (NTRS)
Dezfuli, Homayoon; Benjamin, Allan; Everett, Christopher; Smith, Curtis; Stamatelatos, Michael; Youngblood, Robert
2011-01-01
System safety assessment is defined in NPR 8715.3C, NASA General Safety Program Requirements as a disciplined, systematic approach to the analysis of risks resulting from hazards that can affect humans, the environment, and mission assets. Achievement of the highest practicable degree of system safety is one of NASA's highest priorities. Traditionally, system safety assessment at NASA and elsewhere has focused on the application of a set of safety analysis tools to identify safety risks and formulate effective controls. Familiar tools used for this purpose include various forms of hazard analyses, failure modes and effects analyses, and probabilistic safety assessment (commonly also referred to as probabilistic risk assessment (PRA)). In the past, it has been assumed that to show that a system is safe, it is sufficient to provide assurance that the process for identifying the hazards has been as comprehensive as possible and that each identified hazard has one or more associated controls. The NASA Aerospace Safety Advisory Panel (ASAP) has made several statements in its annual reports supporting a more holistic approach. In 2006, it recommended that "... a comprehensive risk assessment, communication and acceptance process be implemented to ensure that overall launch risk is considered in an integrated and consistent manner." In 2009, it advocated for "... a process for using a risk-informed design approach to produce a design that is optimally and sufficiently safe." As a rationale for the latter advocacy, it stated that "... the ASAP applauds switching to a performance-based approach because it emphasizes early risk identification to guide designs, thus enabling creative design approaches that might be more efficient, safer, or both." For purposes of this preface, it is worth mentioning three areas where the handbook emphasizes a more holistic type of thinking. First, the handbook takes the position that it is important not just to focus on risk on an individual basis but to consider measures of aggregate safety risk and to ensure wherever possible that there are quantitative measures for evaluating how effective the controls are in reducing these aggregate risks. The term aggregate risk, when used in this handbook, refers to the accumulation of risks from individual scenarios that lead to a shortfall in safety performance at a high level: e.g., an excessively high probability of loss of crew, loss of mission, planetary contamination, etc. Without aggregated quantitative measures such as these, it is not reasonable to expect that safety has been optimized with respect to other technical and programmatic objectives. At the same time, it is fully recognized that not all sources of risk are amenable to precise quantitative analysis and that the use of qualitative approaches and bounding estimates may be appropriate for those risk sources. Second, the handbook stresses the necessity of developing confidence that the controls derived for the purpose of achieving system safety not only handle risks that have been identified and properly characterized but also provide a general, more holistic means for protecting against unidentified or uncharacterized risks. For example, while it is not possible to be assured that all credible causes of risk have been identified, there are defenses that can provide protection against broad categories of risks and thereby increase the chances that individual causes are contained.
Third, the handbook strives at all times to treat uncertainties as an integral aspect of risk and as a part of making decisions. The term "uncertainty" here does not refer to an actuarial type of data analysis, but rather to a characterization of our state of knowledge regarding results from logical and physical models that approximate reality. Uncertainty analysis finds how the output parameters of the models are related to plausible variations in the input parameters and in the modeling assumptions. The evaluation of uncertainties represents a method of probabilistic thinking wherein the analyst and decision makers recognize possible outcomes other than the outcome perceived to be "most likely." Without this type of analysis, it is not possible to determine the worth of an analysis product as a basis for making decisions related to safety and mission success. In line with these considerations, the handbook does not take a hazard-analysis-centric approach to system safety. Hazard analysis remains a useful tool to facilitate brainstorming but does not substitute for a more holistic approach geared to a comprehensive identification and understanding of individual risk issues and their contributions to aggregate safety risks. The handbook strives to emphasize the importance of identifying the most critical scenarios that contribute to the risk of not meeting the agreed-upon safety objectives and requirements using all appropriate tools (including but not limited to hazard analysis). Thereafter, emphasis shifts to identifying the risk drivers that cause these scenarios to be critical and ensuring that there are controls directed toward preventing or mitigating the risk drivers. To address these and other areas, the handbook advocates a proactive, analytic-deliberative, risk-informed approach to system safety, enabling the integration of system safety activities with systems engineering and risk management processes. It emphasizes how one can systematically provide the necessary evidence to substantiate the claim that a system is safe to within an acceptable risk tolerance, and that safety has been achieved in a cost-effective manner. The methodology discussed in this handbook is part of a systems engineering process and is intended to be integral to the system safety practices being conducted by the NASA safety and mission assurance and systems engineering organizations. The handbook posits that to conclude that a system is adequately safe, it is necessary to consider a set of safety claims that derive from the safety objectives of the organization. The safety claims are developed from a hierarchy of safety objectives and are therefore hierarchical themselves. Assurance that all the claims are true within acceptable risk tolerance limits implies that all of the safety objectives have been satisfied, and therefore that the system is safe. The acceptable risk tolerance limits are provided by the authority who must make the decision whether or not to proceed to the next step in the life cycle. These tolerances are therefore referred to as the decision maker's risk tolerances. In general, the safety claims address two fundamental facets of safety: 1) whether required safety thresholds or goals have been achieved, and 2) whether the safety risk is as low as possible within reasonable impacts on cost, schedule, and performance.
The latter facet includes consideration of controls that are collective in nature (i.e., apply generically to broad categories of risks) and thereby provide protection against unidentified or uncharacterized risks.
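The handbook's notion of aggregate risk can be illustrated with the elementary combination rule for scenario probabilities; the independence assumption and the numbers below are ours, for illustration only:

```python
import numpy as np

# Aggregate risk from individual scenario contributions: if scenario i leads
# to loss of crew with probability p_i, then under an independence assumption
# the aggregate P(LOC) is 1 - prod(1 - p_i).
p = np.array([1e-3, 5e-4, 2e-4, 2e-4, 1e-4])    # illustrative scenario risks
p_agg = 1 - np.prod(1 - p)
print(f"sum of p_i       = {p.sum():.2e}")       # rare-event approximation
print(f"aggregate P(LOC) = {p_agg:.2e}")         # nearly identical when p_i are small
```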
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bloom, R.R.
1996-04-01
The objective of this report is to demonstrate that a thorough assessment of the risks associated with the operation of the Rust Geotech patented VAC*TRAX mobile treatment unit (MTU) has been performed and documented. The MTU was developed to treat mixed wastes at the US Department of Energy (DOE) Albuquerque Operations Office sites. The MTU uses an indirectly heated, batch vacuum dryer to thermally desorb organic compounds from mixed wastes. This process hazards analysis evaluated 102 potential hazards. The three significant hazards identified involved the inclusion of oxygen in a process that also included an ignition source and fuel. Changes to the design of the MTU were made concurrent with the hazard identification and analysis; all hazards with initial risk rankings of 1 or 2 were reduced to acceptable risk rankings of 3 or 4. The overall risk to any population group from operation of the MTU was determined to be very low; the MTU is classified as a Radiological Facility with low hazards.
Meris, Ronald G; Barbera, Joseph A
2014-01-01
In a large-scale outdoor airborne hazardous materials (HAZMAT) incident, such as ruptured chlorine rail cars in a train derailment, local Incident Commanders and HAZMAT emergency responders must obtain accurate information quickly to assess the situation and act promptly and appropriately. HAZMAT responders must have a clear understanding of key information and how to integrate it into timely and effective decisions for action planning. This study examined the use of HAZMAT plume modeling as a decision support tool during incident action planning in this type of extreme HAZMAT incident. The concept of situation awareness, as presented in Endsley's dynamic situation awareness model, contains three levels: perception, comprehension, and projection. It was used to examine the actions of incident managers related to adequate data acquisition, current situational understanding, and accurate situation projection. Scientists and engineers have created software to simulate and predict HAZMAT plume behavior, the projected hazard impact areas, and the associated health effects. Incorporating HAZMAT plume projection modeling into an incident action plan can be a complex process. The present analysis used a mixed qualitative and quantitative methodological approach and examined the use and limitations of a "HAZMAT Plume Modeling Cycle" process that can be integrated into the incident action planning cycle. HAZMAT response experts were interviewed using a computer-based simulation. One of the research conclusions was that the "HAZMAT Plume Modeling Cycle" is a critical function: an individual or team should be tasked with continually updating the hazard plume model with evolving data, promoting more accurate situation awareness.
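Plume-projection software of the kind discussed builds on dispersion models whose simplest form is the steady-state Gaussian plume; the sketch below is a textbook version with invented release parameters, not the model used by any specific HAZMAT tool:

```python
import numpy as np

# Minimal steady-state Gaussian plume with ground reflection; real HAZMAT
# tools use far richer physics, and every parameter here is illustrative.
def plume_conc(y, z, Q, u, H, sigma_y, sigma_z):
    """Concentration (kg/m^3) at crosswind offset y and height z (m)."""
    lateral = np.exp(-y**2 / (2.0 * sigma_y**2))
    vertical = (np.exp(-(z - H)**2 / (2.0 * sigma_z**2))
                + np.exp(-(z + H)**2 / (2.0 * sigma_z**2)))  # image-source term
    return Q / (2.0 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

# 10 kg/s release, 3 m/s wind; sigma_y/sigma_z are plausible values a few
# hundred metres downwind (in practice they grow with downwind distance and
# atmospheric stability class).
c = plume_conc(y=0.0, z=1.5, Q=10.0, u=3.0, H=2.0, sigma_y=35.0, sigma_z=18.0)
print(f"near-centerline concentration: {c * 1e6:.0f} mg/m^3")
```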
U.S. EPA health assessment for diesel engine exhaust: a review.
Ris, Charles
2007-01-01
In 2002 the U.S. Environmental Protection Agency (EPA) released a Health Assessment Document for Diesel Engine Exhaust. The objective of this assessment was to examine the possible health hazards associated with exposure to diesel engine exhaust (DE). The assessment concludes that long-term inhalation exposure is likely to pose a lung cancer hazard to humans, as inferred from epidemiologic and certain animal studies. Estimation of cancer potency from available epidemiology studies was not attempted because of the absence of a confident cancer dose-response, and animal studies were not judged appropriate for cancer potency estimation. A noncancer chronic human health hazard is inferred from rodent studies, which show dose-dependent inflammation and histopathology in the rat lung. For these noncancer effects, a safe exposure concentration for humans was estimated. Short-term exposures were noted to cause irritation and inflammatory symptoms of a transient nature, these being highly variable across an exposed population. The assessment also indicates that there is emerging evidence for the exacerbation of existing allergies and asthma symptoms; however, as of 2002 the data were inadequate for quantitative dose-response analysis. The assessment conclusions are based on studies that used exposures from engines built prior to the mid 1990s. More recent engines without high-efficiency particle traps would be expected to have exhaust emissions with similar characteristics. With additional cancer epidemiology studies expected in 2007-2008, and a growing body of evidence for allergenicity and cardiovascular effects, future health assessments will have an expanded health effects database to evaluate.
MicroRNA-34c-5p is related to recurrence in laryngeal squamous cell carcinoma.
Re, Massimo; Çeka, Artan; Rubini, Corrado; Ferrante, Luigi; Zizzi, Antonio; Gioacchini, Federico M; Tulli, Michele; Spazzafumo, Liana; Sellari-Franceschini, Stefano; Procopio, Antonio D; Olivieri, Fabiola
2015-09-01
Altered microRNA expression has been found in many cancer types, including laryngeal squamous cell carcinoma (LSCC). We investigated the association of LSCC-related miR-34c-5p with disease-free survival and overall survival in a retrospective cohort study. Expression levels of miR-34c-5p were detected in 90 LSCC formalin-fixed, paraffin-embedded tissues by reverse-transcription quantitative polymerase chain reaction. Overall survival and disease-free survival were evaluated using the Kaplan-Meier method, and multivariate analysis was performed using Cox proportional hazards analysis. Downregulation of miR-34c-5p expression significantly correlated with worse disease-free and overall survival. In the multivariate analysis, low miR-34c-5p expression was associated with an increased risk of recurrence. Downregulation of miR-34c-5p in LSCC is independently associated with unfavorable disease-free survival, suggesting that miR-34c-5p might be a promising marker for evaluating the risk of recurrence. © 2015 The American Laryngological, Rhinological and Otological Society, Inc.
GOSAT/TANSO-FTS Measurement of Volcanic and Geothermal CO2 Emissions
NASA Astrophysics Data System (ADS)
Schwandner, Florian M.; Carn, Simon A.; Newhall, Christopher G.
2010-05-01
Approximately one tenth of the Earth's human population lives in direct reach of volcanic hazards. Being able to provide sufficiently early and scientifically sound warning is a key to volcanic hazard mitigation. Quantitative time-series monitoring of volcanic CO2 emissions will likely play a key role in such early warning activities in the future. Impending volcanic eruptions or any potentially disastrous activity that involves movement of magma in the subsurface, is often preceded by an early increase of CO2 emissions. Conventionally, volcanic CO2 monitoring is done either in campaigns of soil emission measurements (grid of one-time measuring points) that are labor intensive and slow, or by ground-based remote FTIR measurements in emission plumes. These methods are not easily available at all sites of potential activity and prohibitively costly to employ on a large number of volcanoes. In addition, both of these ground-based approaches pose a significant risk to the workers conducting these measurements. Some aircraft-based measurements have been conducted as well in the past, however these are limited by the usually meager funding situation of individual observatories, the hazard such flights pose to equipment and crew, and by the inaccessibility of parts of the plume due to ash hazards. The core motivation for this study is therefore to develop a method for volcanic CO2 monitoring from space that will provide sufficient coverage, resolution, and data quality for an application to quantitative time series monitoring and correlation with other available datasets, from a safe distance and with potentially global reach. In summary, the purpose of the proposed research is to quantify volcanic CO2 emissions using satellite-borne observations. Quantitative estimates will be useful for warning of impending volcanic eruptions, and assessing the contribution of volcanic CO2 to global GHG. Our approach encompasses method development and testing for the detection of volcanic CO2 anomalies using GOSAT and correlation with Aura/OMI, AIRS, and ASTER determined SO2 fluxes and ground based monitoring of CO2 and other geophysical and geochemical parameters. This will provide the ground work for future higher spatial resolution satellite missions. This is a joint effort from two GOSAT-IBUKI data application projects: "Satellite-Borne Quantification of Carbon Dioxide Emissions from Volcanoes and Geothermal Areas" (PI Schwandner), and "Application of GOSAT/TANSO-FTS to the Measurement of Volcanic CO2 Emissions" (PI Carn).
NASA Astrophysics Data System (ADS)
Al-Halbouni, Djamil; Holohan, Eoghan P.; Saberi, Leila; Alrshdan, Hussam; Sawarieh, Ali; Closson, Damien; Walter, Thomas R.; Dahm, Torsten
2017-05-01
Ground subsidence and sinkhole collapse are phenomena affecting regions of karst geology worldwide. The rapid development of such phenomena around the Dead Sea in the last four decades poses a major geological hazard to the local population, agriculture, and industry. Nonetheless, many aspects of this hazard are still incompletely described and understood, especially on the eastern Dead Sea shore. In this work, we present a first low-altitude (< 150 m above ground) aerial photogrammetric survey with a Helikite Balloon at the sinkhole area of Ghor Al-Haditha, Jordan. We provide a detailed qualitative and quantitative analysis of a new, high-resolution digital surface model (5 cm px⁻¹) and orthophoto of this area (2.1 km²). We also outline the factors affecting the quality and accuracy of this approach. Our analysis reveals a kilometer-scale sinuous depression bound partly by flexure and partly by non-tectonic faults. The estimated minimum volume loss of this subsided zone is 1.83 × 10⁶ m³, with an average subsidence rate of 0.21 m yr⁻¹ over the last 25 years. Sinkholes in the surveyed area are localized mainly within this depression. The sinkholes are commonly elliptical in shape (mean eccentricity 1.31) and clustered (nearest neighbor ratio 0.69). Their morphologies and orientations depend on the type of sediment they form in: in mud, sinkholes have a low depth-to-diameter ratio (0.14) and a long-axis azimuth of NNE-NE; in alluvium, sinkholes have a higher ratio (0.4) and are oriented NNW-N. From field work, we identify actively evolving artesian springs and channelized, sediment-laden groundwater flows that appear locally in the main depression. Consequently, subrosion, i.e. subsurface mechanical erosion, is identified as a key physical process, in addition to dissolution, behind the subsidence and sinkhole hazard. Furthermore, satellite image analysis links the development of the sinuous depression and sinkhole formation at Ghor Al-Haditha to preferential groundwater flow paths along ancient and current wadi riverbeds.
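The clustering statistic quoted above (nearest-neighbor ratio 0.69) is the Clark-Evans ratio, which compares the mean observed nearest-neighbor distance with the expectation for a random (Poisson) pattern of the same density; the sketch below recomputes it for synthetic sinkhole coordinates:

```python
import numpy as np
from scipy.spatial import cKDTree

# Clark-Evans nearest-neighbour ratio: R = mean observed NN distance divided
# by the expected NN distance 0.5/sqrt(density) for a random pattern.
# R < 1 indicates clustering. Coordinates below are synthetic.
rng = np.random.default_rng(1)
centers = rng.uniform(100, 900, size=(5, 2))            # cluster centres (m)
pts = np.vstack([c + rng.normal(0, 8, size=(30, 2)) for c in centers])

area = 1000.0 * 1000.0                                  # survey domain (m^2)
d, _ = cKDTree(pts).query(pts, k=2)                     # column 1 = true NN distance
r_obs = d[:, 1].mean()
r_exp = 0.5 / np.sqrt(len(pts) / area)
print(f"Clark-Evans R = {r_obs / r_exp:.2f} (R < 1 indicates clustering)")
```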
Technical Guidance for Hazards Analysis: Emergency Planning for Extremely Hazardous Substances
This guide supplements NRT-1 by providing technical assistance to LEPCs to assess the lethal hazards related to potential airborne releases of extremely hazardous substances (EHSs) as designated under Section 302 of Title III of SARA.
Rasmussen's legacy: A paradigm change in engineering for safety.
Leveson, Nancy G
2017-03-01
This paper describes three applications of Rasmussen's ideas to systems engineering practice. The first is the application of the abstraction hierarchy to engineering specifications, particularly requirements specification. The second is the use of Rasmussen's ideas in safety modeling and analysis to create a new, more powerful type of accident causation model that extends traditional models to better handle human-operated, software-intensive, sociotechnical systems. Because this new model has a formal, mathematical foundation built on systems theory (as was Rasmussen's original model), new modeling and analysis tools become possible. The third application is to engineering hazard analysis. Engineers have traditionally either omitted humans from consideration in system hazard analysis or treated them rather superficially, for example, by assuming that they behave randomly. Applying Rasmussen's model of human error to a powerful new hazard analysis technique allows human behavior to be included in engineering hazard analysis. Copyright © 2016 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Nave, Rosella; Isaia, Roberto; Sandri, Laura; Cristiani, Chiara
2016-04-01
In the communication chain between scientists and decision makers (end users), scientific outputs such as maps are a fundamental source of information on hazard zoning and the definition of the related at-risk areas. However, the relationship between volcanic phenomena, their probability, and their potential impact can be complex, and the geospatial information is not easily decoded or understood by non-experts, even when they are decision makers. Focusing on volcanic hazard, the goal of MED-SUV WP6 Task 3 is to improve the communication efficacy of scientific outputs and to contribute to filling the gap between scientists and decision makers. The Campi Flegrei caldera, in the Neapolitan area, has been chosen as the pilot research area in which to apply an evaluation/validation procedure providing a robust evaluation of volcanic maps based on end-user responses. The sample involved decision makers and officials from the Campania Region Civil Protection and the municipalities included in the Campi Flegrei RED ZONE, the area exposed to hazard from pyroclastic currents. Semi-structured interviews with a sample of decision makers and civil protection officials were conducted to acquire both quantitative and qualitative data. The maps tested were the official Campi Flegrei caldera RED ZONE map; three maps produced by overlaying the Red Zone limit on an orthophoto, a DTM, and a contour map; and other maps, including a probabilistic one, showing the volcanological data used to delimit the Red Zone. Analysis of the outcomes assessed the respondents' level of understanding of the content as displayed and their needs in representing the complex information embedded in volcanic hazard maps. The final output has been the development of a leaflet of "guidelines" that can support decision makers and officials in understanding volcanic hazard and risk maps, and in using them as a communication tool in information programmes for the population at risk. The same evaluation/validation process has also been applied to a scientific output of MED-SUV WP6: a tool for short-term probabilistic volcanic hazard assessment. For the Campi Flegrei volcanic system, this tool has been implemented to compute hazard curves, hazard maps, and probability maps for tephra fallout on a target grid covering the Campania region, allowing the end user to visualize the hazard from tephra fallout and its uncertainty. The response of end users to such products will help to determine to what extent end users understand them, find them useful, and find that they match their requirements. In order to also involve the Etna area in WP6 Task 3 activities, a questionnaire developed in the VUELCO project (Volcanic Unrest in Europe and Latin America) has been proposed to Sicily Civil Protection officials with decision-making responsibility in case of volcanic unrest at Etna and Stromboli, to survey their opinions and requirements in case of volcanic unrest
NASA Astrophysics Data System (ADS)
Nicolae Lerma, Alexandre; Bulteau, Thomas; Elineau, Sylvain; Paris, François; Durand, Paul; Anselme, Brice; Pedreros, Rodrigo
2018-01-01
A modelling chain was implemented in order to propose a realistic appraisal of the risk in coastal areas affected by overflowing as well as overtopping processes. Simulations are performed through a nested downscaling strategy from regional to local scale at high spatial resolution, with explicit buildings, urban structures such as sea-front walls, and hydraulic structures liable to affect the propagation of water in urban areas. Validation of the model performance is based on analysis of the available hard and soft data and on conversion of qualitative to quantitative information to reconstruct the area affected by flooding and the succession of events during two recent storms. Two joint probability approaches (joint exceedance contour and environmental contour) are used to define 100-year offshore conditions scenarios and to investigate the flood response to each scenario in terms of (1) maximum spatial extent of flooded areas, (2) volumes of water propagating inland and (3) water level in flooded areas. Scenarios of sea level rise are also considered in order to evaluate the potential hazard evolution. Our simulations show that for a maximising 100-year hazard scenario, for the municipality as a whole, 38 % of the affected zones are prone to overflow flooding and 62 % to flooding by propagation of overtopping water volume along the seafront. Results also reveal that for the two kinds of statistical scenarios a difference of about 5 % in the forcing conditions (water level, wave height and period) can produce significant differences in terms of flooding, such as +13.5 % in water volumes propagating inland or +11.3 % in affected surfaces. In some areas, the flood response appears to be very sensitive to the chosen scenario, with differences of 0.3 to 0.5 m in water level. The developed approach enables one to frame the 100-year hazard and to characterize spatially the robustness or the uncertainty of the results. Considering a 100-year scenario with mean sea level rise (0.6 m), hazard characteristics are dramatically changed, with an evolution of the overtopping/overflowing process ratio and an increase by a factor of 4.84 in the volume of water propagating inland and of 3.47 in flooded surface area.
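The joint exceedance contour idea lends itself to a compact numerical sketch: given fitted marginal distributions for two offshore variables and a dependence model, trace the combinations whose joint annual exceedance probability equals the 100-year target. The Gumbel marginals, the Gaussian-copula correlation, and all parameter values below are invented placeholders for illustration, not the fitted values from this study.

```python
"""Sketch: joint exceedance contour (JEC) for 100-year offshore conditions.

Illustrative only: the Gumbel marginal parameters and the Gaussian-copula
correlation below are assumptions, not values from the study.
"""
import numpy as np
from scipy.stats import gumbel_r, norm, multivariate_normal
from scipy.optimize import brentq

# Hypothetical fitted marginals for offshore water level (m) and Hs (m).
level = gumbel_r(loc=1.8, scale=0.25)
waves = gumbel_r(loc=4.0, scale=0.80)
rho = 0.6                      # assumed Gaussian-copula dependence
p_target = 1.0 / 100.0         # 100-year joint annual exceedance probability

biv = multivariate_normal(mean=[0.0, 0.0], cov=[[1.0, rho], [rho, 1.0]])

def joint_exceedance(x, y):
    """P(X > x, Y > y) under a Gaussian copula with the given marginals."""
    u, v = level.cdf(x), waves.cdf(y)
    c_uv = biv.cdf([norm.ppf(u), norm.ppf(v)])   # copula C(u, v)
    return 1.0 - u - v + c_uv

# Trace the contour: for each water level, solve for the wave height that
# keeps the joint exceedance probability at the 100-year target.
for x in np.linspace(level.ppf(0.95), level.ppf(1 - 2 * p_target), 6):
    y = brentq(lambda y: joint_exceedance(x, y) - p_target,
               waves.ppf(0.01), waves.ppf(0.999999))
    print(f"water level {x:5.2f} m  <->  Hs {y:5.2f} m")
```

Each (water level, Hs) pair on the contour can then be run through the nested flood model to find the maximising 100-year scenario, which is the spirit of the comparison reported in the abstract.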
Space Propulsion Hazards Analysis Manual (SPHAM). Volume 2. Appendices
1988-10-01
[Report documentation page; distribution statement: approved for public release.] Partial contents: Volume I, Chapter 2, "Requirements and the Hazards Analysis Process"; Volume I, Chapter 3, "Accident Scenarios". The appendices also cover a list of the hazardous materials discussed, descriptions of the failure scenarios, and the type of post-accident environment discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Klein, F.W.
1994-03-28
This bibliography is divided into the following four sections: Seismicity of Hawaii and Kilauea Volcano; Occurrence, locations and accelerations from large historical Hawaiian earthquakes; Seismic hazards of Hawaii; and Methods of seismic hazard analysis. It contains 62 references, most of which are accompanied by short abstracts.
NASA Astrophysics Data System (ADS)
Devoli, Graziella; Tiranti, Davide; Cremonini, Roberto; Sund, Monica; Boje, Søren
2018-05-01
Only a few countries systematically operate national and regional forecasting services for rainfall-induced landslides (i.e., debris flows, debris avalanches and shallow slides); among them are Norway and Italy. In Norway, the Norwegian Water Resources and Energy Directorate (NVE) operates a landslide forecasting service at the national level. In Italy, the Regional Agency for Environmental Protection, ARPA Piemonte, is responsible for issuing landslide warnings for the Piedmont region, located in northwestern Italy. A daily hazard assessment is performed, describing both the expected awareness level and the type of landslide hazard for a selected warning region. Both services provide regular landslide hazard assessments based on a combination of quantitative thresholds and daily rainfall forecasts, together with qualitative expert analysis. Daily warning reports are published at http://www.arpa.piemonte.gov.it/rischinaturali and http://www.varsom.no (last access: 7 May 2018). In spring 2013, ARPA Piemonte and the NVE issued warnings for hydro-meteorological hazards due to the arrival of a deep and large low-pressure system, called herein the Vb cyclone. This kind of weather system is known to produce the largest floods in Europe. Less known is that this weather pattern can trigger landslides as well. In this study, we present the experiences of NVE and ARPA Piemonte in the late spring of 2013. The Vb cyclone influenced weather throughout Europe over a long period, from the end of April until the beginning of June 2013. However, the major effects were observed in the first half of this period in Piedmont, while in Norway major damage was reported from 15 May to 2 June 2013. Floods and landslides significantly damaged roads, railways, buildings and other infrastructure in both countries. This case study shows that a large synoptic pattern can produce different natural hazards in different parts of Europe, from sandstorms at low latitudes to floods and landslides when the system moves across the mountain regions. These secondary effects were effectively forecast by the two landslide warning services operating in different parts of Europe, and the landslide risks were properly communicated to the public some days in advance. This analysis has allowed the establishment of a fruitful international collaboration between ARPA Piemonte and NVE and the future exchange of experiences, procedures and methods relating to similar events.
Intensified coastal development behind nourished beaches
NASA Astrophysics Data System (ADS)
Armstrong, Scott; Lazarus, Eli; Limber, Patrick; Goldstein, Evan; Thorpe, Curtis; Ballinger, Rhoda
2016-04-01
Population density, housing development, and property values in coastal counties along the U.S. Atlantic and Gulf Coasts continue to rise despite increasing hazard from storm impacts. Since the 1970s, beach nourishment, which involves importing sand to deliberately widen an eroding beach, has been the main strategy in the U.S. for protecting coastal properties from erosion and flooding hazards. Paradoxically, investment in hazard protection may intensify development. Here, we examine the housing stock of all existing shorefront single-family homes in Florida - a microcosm of U.S. coastal hazards and development - to quantitatively compare development in nourishing and non-nourishing towns. We find that nourishing towns now account for more than half of Florida's coastline, and that houses in nourishing towns are larger and more numerous. Even as the mean size of single-family homes nationwide has grown steadily since 1970, Florida's shorefront stock has exceeded the national average by 34%, and in nourishing towns by 45%. This emergent disparity between nourishing and non-nourishing towns in Florida demonstrates a pattern of intensifying coastal risk, and is likely representative of a dominant trend in coastal development more generally. These data lend empirical support to the hypothesis that US coastal development and hazard mitigation through beach nourishment have become dynamically coupled.
NASA Astrophysics Data System (ADS)
Weatherill, Graeme; Burton, Paul W.
2010-09-01
The Aegean is the most seismically active and tectonically complex region in Europe. Damaging earthquakes have occurred here throughout recorded history, often resulting in considerable loss of life. The Monte Carlo method of probabilistic seismic hazard analysis (PSHA) is used to determine the level of ground motion likely to be exceeded in a given time period. Multiple random simulations of seismicity are generated to calculate, directly, the ground motion for a given site. Within the seismic hazard analysis we explore the impact of different seismic source models, incorporating both uniform zones and distributed seismicity. A new, simplified, seismic source model, derived from seismotectonic interpretation, is presented for the Aegean region. This is combined into the epistemic uncertainty analysis alongside existing source models for the region, and models derived by a K-means cluster analysis approach. Seismic source models derived using the K-means approach offer a degree of objectivity and reproducibility into the otherwise subjective approach of delineating seismic sources using expert judgment. A similar review and analysis is undertaken for the selection of peak ground acceleration (PGA) attenuation models, incorporating into the epistemic analysis Greek-specific models, European models and a Next Generation Attenuation model. Hazard maps for PGA on a "rock" site with a 10% probability of being exceeded in 50 years are produced and different source and attenuation models are compared. These indicate that Greek-specific attenuation models, with their smaller aleatory variability terms, produce lower PGA hazard, whilst recent European models and the Next Generation Attenuation (NGA) model produce similar results. The Monte Carlo method is extended further to assimilate epistemic uncertainty into the hazard calculation, thus integrating across several appropriate source and PGA attenuation models. Site condition and fault-type are also integrated into the hazard mapping calculations. These hazard maps are in general agreement with previous maps for the Aegean, recognising the highest hazard in the Ionian Islands, Gulf of Corinth and Hellenic Arc. Peak Ground Accelerations for some sites in these regions reach as high as 500-600 cm s⁻² using European/NGA attenuation models, and 400-500 cm s⁻² using Greek attenuation models.
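The core of the Monte Carlo PSHA approach can be sketched in a few lines: simulate synthetic catalogues from a seismicity model, map each event to site ground motion with an attenuation relation that includes aleatory scatter, and read the design value off the distribution of annual maxima. The activity rate, Gutenberg-Richter parameters, distance range, and ground-motion coefficients below are toy stand-ins, not the Aegean source and attenuation models of the paper.

```python
"""Minimal Monte Carlo PSHA sketch for a single site (illustrative numbers)."""
import numpy as np

rng = np.random.default_rng(42)
n_years = 100_000          # total simulated catalogue length
rate = 5.0                 # events/yr with M >= m_min in the source zone
m_min, m_max, b = 4.5, 7.5, 1.0

def sample_magnitudes(n):
    """Doubly truncated Gutenberg-Richter (bounded exponential) sampler."""
    u = rng.random(n)
    beta = b * np.log(10.0)
    f = 1.0 - np.exp(-beta * (m_max - m_min))
    return m_min - np.log(1.0 - u * f) / beta

def gmpe_pga(m, r_km):
    """Toy ground-motion model: lognormal PGA (g) with aleatory sigma 0.6."""
    ln_mean = -4.0 + 1.0 * m - 1.3 * np.log(r_km + 10.0)
    return np.exp(ln_mean + rng.normal(0.0, 0.6, size=m.shape))

n_events = rng.poisson(rate * n_years)
mags = sample_magnitudes(n_events)
dists = rng.uniform(5.0, 120.0, n_events)       # site-to-epicentre distance
years = rng.integers(0, n_years, n_events)

annual_max = np.zeros(n_years)
np.maximum.at(annual_max, years, gmpe_pga(mags, dists))

# 10% in 50 years corresponds to a ~475-year return period, i.e. an annual
# exceedance probability of ~1/475; take that quantile of the annual maxima.
pga_475 = np.quantile(annual_max, 1.0 - 1.0 / 475.0)
print(f"PGA with 10% probability of exceedance in 50 yr: {pga_475:.3f} g")
```

Epistemic uncertainty is handled, as in the paper, by repeating this simulation across alternative source and attenuation models and combining the resulting hazard curves.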
Xiao, Linda; Alder, Rhiannon; Mehta, Megha; Krayem, Nadine; Cavasinni, Bianca; Laracy, Sean; Cameron, Shane; Fu, Shanlin
2018-04-01
Cocaine trafficking in the form of textile impregnation is routinely encountered as a concealment method. Raman spectroscopy has been a popular and successful testing method used for in situ screening of cocaine in textiles and other matrices. Quantitative analysis of cocaine in these matrices using Raman spectroscopy has not been reported to date. This study aimed to develop a simple Raman method for quantifying cocaine using atropine as the model analogue in various types of textiles. Textiles were impregnated with solutions of atropine in methanol. The impregnated atropine was extracted using less hazardous acidified water with the addition of potassium thiocyanate (KSCN) as an internal standard for Raman analysis. Despite the presence of background matrix signals arising from the textiles, the cocaine analogue could easily be identified by its characteristic Raman bands. The successful use of KSCN normalised the analyte signal response across different textile matrix background interferences and thus removed the need for a matrix-matched calibration. The method was linear over a concentration range of 6.25-37.5 mg/cm² with a coefficient of determination (R²) of 0.975 and acceptable precision and accuracy. A simple and accurate Raman spectroscopy method for the analysis and quantification of a cocaine analogue impregnated in textiles has been developed and validated for the first time. This proof-of-concept study has demonstrated that atropine can act as an ideal model compound to study the problem of cocaine impregnation in textiles. The method has the potential to be further developed and implemented in real world forensic cases. Copyright © 2017 John Wiley & Sons, Ltd.
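The internal-standard calibration at the heart of the method is simple to reproduce numerically: regress the analyte-to-KSCN intensity ratio against surface loading, then invert the line for unknowns. The peak-area values below are synthetic placeholders; in practice they would be the measured areas of the atropine band and the thiocyanate band.

```python
"""Sketch of internal-standard calibration for quantitative Raman (toy data)."""
import numpy as np

# Surface loading of the cocaine analogue (mg/cm^2) and synthetic peak areas.
conc = np.array([6.25, 12.5, 18.75, 25.0, 31.25, 37.5])
analyte = np.array([210.0, 450.0, 640.0, 900.0, 1080.0, 1330.0])
internal_std = np.array([500.0, 510.0, 495.0, 505.0, 498.0, 502.0])

# Normalising by the KSCN band removes the textile-matrix dependence of the
# absolute signal, so one calibration can serve different fabrics.
ratio = analyte / internal_std

slope, intercept = np.polyfit(conc, ratio, 1)
pred = slope * conc + intercept
ss_res = np.sum((ratio - pred) ** 2)
ss_tot = np.sum((ratio - ratio.mean()) ** 2)
r_squared = 1.0 - ss_res / ss_tot
print(f"ratio = {slope:.4f} * conc + {intercept:.4f},  R^2 = {r_squared:.3f}")

# Quantify an unknown from its measured intensity ratio:
unknown_ratio = 1.5
print(f"estimated loading: {(unknown_ratio - intercept) / slope:.1f} mg/cm^2")
```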
Li, Jie-Li; Fan, Guang-Yu; Liu, Yu-Jie; Zeng, Zi-Hang; Huang, Jing-Juan; Yang, Zong-Ming; Meng, Xiang-Yu
2018-01-01
We aimed to quantitatively synthesize data from randomized controlled trials (RCTs) concerning maintenance therapy for multiple myeloma (MM). We searched electronic literature databases and conference proceedings to identify relevant RCTs. We selected eligible RCTs using predefined selection criteria. We conducted a meta-analysis comparing maintenance regimens containing new agents with conventional maintenance, and subgroup analyses by transplantation status and mainstay agent as well. We performed trial sequential analysis (TSA) to determine the adequacy of sample size for the overall and subgroup meta-analyses. We performed network meta-analysis (NMA) to compare and rank the included regimens. A total of 22 RCTs involving 9,968 MM patients and 15 regimens were included, the overall quality of which was adequate. Significant heterogeneity was detected for progression-free survival (PFS) but not overall survival (OS). Meta-analyses showed that maintenance containing new agents significantly improved PFS but not OS [PFS: Hazard Ratio (HR) = 0.59, 95% Confidence Interval (CI) = 0.54 to 0.64; OS: HR = 0.93, 95% CI = 0.87 to 1.00], compared with controls. Subgroup analyses revealed that lenalidomide (Len)-based therapies performed better than thalidomide-based ones (HR = 0.50 and 0.66, respectively; P = 0.001). NMA revealed that most of the maintenance regimens containing new agents were significantly better than simple observation in terms of PFS but not OS. Single-agent Len was the most effective when considering both PFS and OS. We concluded that conventional maintenance has very limited effect. Maintenance containing new agents is highly effective in improving PFS, but has very limited effect on OS. Maintenance with Len may have the largest survival benefits. Emerging strategies may further change the landscape of maintenance of MM.
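The pooling step behind headline numbers like HR = 0.59 (95% CI 0.54-0.64) is typically a random-effects synthesis of per-trial log hazard ratios. The sketch below implements DerSimonian-Laird pooling on made-up trial results; it illustrates the method, not the 22 RCTs analysed in the paper.

```python
"""DerSimonian-Laird random-effects pooling of hazard ratios (toy trials)."""
import numpy as np

hr = np.array([0.55, 0.62, 0.48, 0.70, 0.58])
ci_low = np.array([0.42, 0.50, 0.33, 0.52, 0.44])
ci_high = np.array([0.72, 0.77, 0.70, 0.94, 0.76])

y = np.log(hr)                                  # effect sizes on log scale
se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)
w = 1.0 / se**2                                 # fixed-effect weights

# DerSimonian-Laird estimate of between-trial variance tau^2.
y_fixed = np.sum(w * y) / np.sum(w)
q = np.sum(w * (y - y_fixed) ** 2)
tau2 = max(0.0, (q - (len(y) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

w_re = 1.0 / (se**2 + tau2)                     # random-effects weights
y_re = np.sum(w_re * y) / np.sum(w_re)
se_re = np.sqrt(1.0 / np.sum(w_re))

lo, hi = y_re - 1.96 * se_re, y_re + 1.96 * se_re
print(f"pooled HR = {np.exp(y_re):.2f} "
      f"(95% CI {np.exp(lo):.2f}-{np.exp(hi):.2f}), tau^2 = {tau2:.4f}")
```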
Hazard function theory for nonstationary natural hazards
NASA Astrophysics Data System (ADS)
Read, L. K.; Vogel, R. M.
2015-11-01
Impact from natural hazards is a shared global problem that causes tremendous loss of life and property, economic cost, and damage to the environment. Increasingly, many natural processes show evidence of nonstationary behavior including wind speeds, landslides, wildfires, precipitation, streamflow, sea levels, and earthquakes. Traditional probabilistic analysis of natural hazards based on peaks over threshold (POT) generally assumes stationarity in the magnitudes and arrivals of events, i.e. that the probability of exceedance of some critical event is constant through time. Given increasing evidence of trends in natural hazards, new methods are needed to characterize their probabilistic behavior. The well-developed field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (X) with its failure time series (T), enabling computation of corresponding average return periods, risk and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose POT magnitudes are assumed to follow the widely applied Generalized Pareto (GP) model. We derive the hazard function for this case and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard event series X, with corresponding failure time series T, should have application to a wide class of natural hazards with rich opportunities for future extensions.
Hazard function theory for nonstationary natural hazards
NASA Astrophysics Data System (ADS)
Read, Laura K.; Vogel, Richard M.
2016-04-01
Impact from natural hazards is a shared global problem that causes tremendous loss of life and property, economic cost, and damage to the environment. Increasingly, many natural processes show evidence of nonstationary behavior including wind speeds, landslides, wildfires, precipitation, streamflow, sea levels, and earthquakes. Traditional probabilistic analysis of natural hazards based on peaks over threshold (POT) generally assumes stationarity in the magnitudes and arrivals of events, i.e., that the probability of exceedance of some critical event is constant through time. Given increasing evidence of trends in natural hazards, new methods are needed to characterize their probabilistic behavior. The well-developed field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (X) with its failure time series (T), enabling computation of corresponding average return periods, risk, and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose POT magnitudes are assumed to follow the widely applied generalized Pareto model. We derive the hazard function for this case and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard random variable X with corresponding failure time series T should have application to a wide class of natural hazards with opportunities for future extensions.
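The derivation can be sketched compactly. Assuming, purely for illustration, a linear drift in the GP scale parameter, σ(t) = σ₀ + βt (the authors' covariate structure may differ), the discrete-time hazard of the failure time T follows directly from the year-by-year exceedance probabilities:

```latex
% Sketch, not the authors' exact expressions: hazard function of the
% failure time T when POT magnitudes follow a generalized Pareto law
% whose scale drifts in time, \sigma(t) = \sigma_0 + \beta t.
\[
  \Pr[X_t > x] \;=\; p_t \;=\;
  \left(1 + \xi\,\frac{x - u}{\sigma(t)}\right)^{-1/\xi},
  \qquad x > u,
\]
\[
  h(t) \;=\; \Pr[T = t \mid T \ge t] \;=\; p_t,
  \qquad
  \Pr[T = t] \;=\; p_t \prod_{s=1}^{t-1} (1 - p_s),
\]
\[
  \mathbb{E}[T] \;=\; 1 + \sum_{t=1}^{\infty} \prod_{s=1}^{t} (1 - p_s),
\]
```

In the stationary case (β = 0, so pₜ = p), the hazard is constant, T is geometric, and the average return period reduces to the familiar E[T] = 1/p; nonstationarity makes the hazard, and hence reliability and return period, time-dependent.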
Statistical analysis of the uncertainty related to flood hazard appraisal
NASA Astrophysics Data System (ADS)
Notaro, Vincenza; Freni, Gabriele
2015-12-01
The estimation of flood hazard frequency statistics for an urban catchment is of great interest in practice. It provides the evaluation of potential flood risk and related damage and supports decision making for flood risk management. Flood risk is usually defined as a function of the probability that a system deficiency causes flooding (hazard) and the expected damage due to the flooding magnitude (damage), taking into account both the exposure and the vulnerability of the goods at risk. The expected flood damage can be evaluated by an a priori estimation of potential damage caused by flooding or by interpolating real damage data. With regard to flood hazard appraisal, several procedures propose to identify some hazard indicator (HI), such as flood depth or the combination of flood depth and velocity, and to assess the flood hazard of the analyzed area by comparing the HI variables with user-defined threshold values or curves (penalty curves or matrixes). However, flooding data are usually unavailable or too piecemeal to allow a reliable flood hazard analysis; therefore, hazard analysis is often performed by means of mathematical simulations aimed at evaluating water levels and flow velocities over the catchment surface. As a result, a great part of the uncertainty intrinsic to flood risk appraisal can be related to the hazard evaluation, due to the uncertainty inherent in modeling results and to the subjectivity of the user-defined hazard thresholds applied to link flood depth to a hazard level. In the present work, a statistical methodology is proposed for evaluating and reducing the uncertainties connected with hazard level estimation. The methodology has been applied to a real urban watershed as a case study.
NASA Astrophysics Data System (ADS)
Moreira, Francisco; Silva, Nuno
2016-08-01
Safety systems require accident avoidance. This is covered by application standards, processes, techniques and tools that support the identification, analysis, and elimination or reduction to an acceptable level of system risks and hazards. Ideally, a safety system should be free of hazards. However, both industry and academia have been struggling to ensure appropriate risk and hazard analysis, especially as concerns completeness of the hazards, formalization, and analysis timely enough to influence the specifications and the implementation. Such analysis is also important when considering a change to an existing system. The Common Safety Method for Risk Evaluation and Assessment (CSM-RA) is a mandatory procedure whenever any significant change is proposed to the railway system in a European Member State. This paper provides insights into the fundamentals of CSM-RA, complemented with hazard analysis. It highlights when and how to apply them, and the relations and similarities of these processes to industry standards and system life cycles. Finally, the paper shows how CSM-RA can be the basis of a change management process, guiding the identification and management of hazards and helping to ensure a safety level similar to that of the initial system. The paper also shows how the CSM-RA principles can be used in other domains, particularly for space system evolution.
Going beyond the flood insurance rate map: insights from flood hazard map co-production
NASA Astrophysics Data System (ADS)
Luke, Adam; Sanders, Brett F.; Goodrich, Kristen A.; Feldman, David L.; Boudreau, Danielle; Eguiarte, Ana; Serrano, Kimberly; Reyes, Abigail; Schubert, Jochen E.; AghaKouchak, Amir; Basolo, Victoria; Matthew, Richard A.
2018-04-01
Flood hazard mapping in the United States (US) is deeply tied to the National Flood Insurance Program (NFIP). Consequently, publicly available flood maps provide essential information for insurance purposes, but they do not necessarily provide relevant information for non-insurance aspects of flood risk management (FRM) such as public education and emergency planning. Recent calls for flood hazard maps that support a wider variety of FRM tasks highlight the need to deepen our understanding about the factors that make flood maps useful and understandable for local end users. In this study, social scientists and engineers explore opportunities for improving the utility and relevance of flood hazard maps through the co-production of maps responsive to end users' FRM needs. Specifically, two-dimensional flood modeling produced a set of baseline hazard maps for stakeholders of the Tijuana River valley, US, and Los Laureles Canyon in Tijuana, Mexico. Focus groups with natural resource managers, city planners, emergency managers, and academic, non-profit, and community leaders refined the baseline hazard maps by triggering additional modeling scenarios and map revisions. Several important end user preferences emerged, such as (1) legends that frame flood intensity both qualitatively and quantitatively, and (2) flood scenario descriptions that report flood magnitude in terms of rainfall, streamflow, and its relation to an historic event. Regarding desired hazard map content, end users' requests revealed general consistency with mapping needs reported in European studies and guidelines published in Australia. However, requested map content that is not commonly produced included (1) standing water depths following the flood, (2) the erosive potential of flowing water, and (3) pluvial flood hazards, or flooding caused directly by rainfall. We conclude that the relevance and utility of commonly produced flood hazard maps can be most improved by illustrating pluvial flood hazards and by using concrete reference points to describe flooding scenarios rather than exceedance probabilities or frequencies.
Jarvis, J; Seed, M; Elton, R; Sawyer, L; Agius, R
2005-01-01
Aims: To investigate quantitatively the relationships between chemical structure and reported occupational asthma hazard for low molecular weight (LMW) organic compounds; to develop and validate a model linking asthma hazard with chemical substructure; and to generate mechanistic hypotheses that might explain the relationships. Methods: A learning dataset used 78 LMW chemical asthmagens reported in the literature before 1995, and 301 control compounds with recognised occupational exposures and hazards other than respiratory sensitisation. The chemical structures of the asthmagens and control compounds were characterised by the presence of chemical substructure fragments. Odds ratios were calculated for these fragments to determine which were associated with a likelihood of being reported as an occupational asthmagen. Logistic regression modelling was used to identify the independent contribution of these substructures. A post-1995 set of 21 asthmagens and 77 controls was selected to externally validate the model. Results: Nitrogen or oxygen containing functional groups such as isocyanate, amine, acid anhydride, and carbonyl were associated with an occupational asthma hazard, particularly when the functional group was present twice or more in the same molecule. A logistic regression model using only statistically significant independent variables for occupational asthma hazard correctly assigned 90% of the model development set. The external validation showed a sensitivity of 86% and specificity of 99%. Conclusions: Although a wide variety of chemical structures are associated with occupational asthma, bifunctional reactivity is strongly associated with occupational asthma hazard across a range of chemical substructures. This suggests that chemical cross-linking is an important molecular mechanism leading to the development of occupational asthma. The logistic regression model is freely available on the internet and may offer a useful but inexpensive adjunct to the prediction of occupational asthma hazard. PMID:15778257
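The modelling step described here (binary substructure flags in, hazard probability out) is easy to illustrate. The sketch below fits a logistic model on synthetic data; the fragment names, effect sizes, and prevalences are invented for illustration and are not the published coefficients.

```python
"""Sketch of a fragment-based logistic hazard model (hypothetical data)."""
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 379                                   # 78 asthmagens + 301 controls
fragments = ["isocyanate", "amine", "acid_anhydride", "carbonyl_x2"]

# Synthetic design matrix: 1 if the substructure occurs in the molecule.
X = rng.integers(0, 2, size=(n, len(fragments)))
logit = -2.0 + X @ np.array([2.5, 1.2, 2.0, 1.8])   # assumed true effects
y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))    # reported asthmagen?

model = LogisticRegression().fit(X, y)

# Fitted odds ratios per fragment (exponentiated coefficients).
for name, coef in zip(fragments, model.coef_[0]):
    print(f"{name:15s} OR = {np.exp(coef):5.2f}")

# Hazard probability for a new molecule with an isocyanate and two carbonyls:
new = np.array([[1, 0, 0, 1]])
print(f"P(asthma hazard) = {model.predict_proba(new)[0, 1]:.2f}")
```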
Modeling of marginal burning state of fire spread in live chaparral shrub fuel bed
X. Zhou; S. Mahalingam; D. Weise
2005-01-01
Prescribed burning in chaparral, currently used to manage wildland fuels and reduce wildfire hazard, is often conducted under marginal burning conditions. The relative importance of the fuel and environmental variables that determine fire spread success in chaparral fuels is not quantitatively understood. Based on extensive experimental study, a two-dimensional...
Safety characteristics of the lithium SO2 system
NASA Technical Reports Server (NTRS)
Watson, T.
1978-01-01
Extensive tests were conducted to quantitatively define the safety characteristics of high-rate SO2 multicell batteries under various discharge and temperature profiles, which closely simulated actual field-use conditions. The resulting behavior patterns of the multicell batteries and the corrective action which can be implemented to minimize or prevent hazardous battery performance are briefly summarized.
Why It Won't Happen to Me: How Older Adolescents Make Personal Risk Assessments.
ERIC Educational Resources Information Center
Chapin, John; Chirico, JoAnn
This study sought to document optimistic bias among older adolescents in the context of numerous hazards. It was among the first studies to triangulate quantitative and qualitative measures to investigate how individuals make personal risk assessments within the optimistic bias literature. Results from a small-scale survey and follow-up interviews…
Methylmercury Poisoning—An Assessment of the Sportfish Hazard in California
Dales, Loring; Kahn, Ephraim; Wei, Eddie
1971-01-01
A quantitative assessment of the methylmercury risk in California entails measurement of the contamination distribution, the probability of methylmercury intake and knowledge of the toxicological properties of methylmercury. This article reviews the scientific basis for the California State Task Force's decision to warn the public against excessive consumption of sport fish contaminated by methylmercury. PMID:5544687
Rini, Brian I; Escudier, Bernard; Martini, Jean-Francois; Magheli, Ahmed; Svedman, Christer; Lopatin, Margarita; Knezevic, Dejan; Goddard, Audrey D; Febbo, Phillip G; Li, Rachel; Lin, Xun; Valota, Olga; Staehler, Michael; Motzer, Robert J; Ravaud, Alain
2018-05-17
Adjuvant sunitinib prolonged disease-free survival (DFS) (hazard ratio [HR] 0.76) in patients with locoregional high-risk renal cell carcinoma (RCC) in the S-TRAC trial (ClinicalTrials.gov NCT00375674). The 16-gene Recurrence Score (RS) assay was previously developed and validated to estimate the risk of disease recurrence in patients with RCC post-nephrectomy. This analysis further validated the prognostic value of the RS assay in patients from S-TRAC and explored the association of RS results with prediction of sunitinib benefit. The analysis was prospectively designed with prespecified genes, algorithm, endpoints, and analytical methods. Primary RCC tissue was available from 212 patients with informed consent; the primary analysis focused on patients with T3 RCC. Gene expression was quantitated by RT-PCR. Time to recurrence (TTR), DFS, and renal cancer-specific survival (RCSS) were analyzed using Cox proportional hazards regression. Results: Baseline characteristics were similar between patients with and without RS results, and between the sunitinib and placebo arms among patients with RS results. RS results predicted TTR, DFS, and RCSS in both arms, with the strongest results observed in the placebo arm. When high versus low RS groups were compared, the HR for recurrence was 9.18 (95% CI, 2.15-39.24; P < 0.001) in the placebo arm; the interaction of RS results with treatment was not significant. Conclusions: The strong prognostic performance of the 16-gene RS assay was confirmed in S-TRAC, and the RS assay is now supported by level IB evidence. RS results may help identify patients at high risk for recurrence who may derive higher absolute benefit from adjuvant therapy. Copyright ©2018, American Association for Cancer Research.
Rish, William R; Pfau, Edward J
2018-04-01
A bounding risk assessment is presented that evaluates possible human health risk from a hypothetical scenario involving a 10,000-gallon release of flowback water from horizontal fracturing of Marcellus Shale. The water is assumed to be spilled on the ground, infiltrates into groundwater that is a source of drinking water, and an adult and child located downgradient drink the groundwater. Key uncertainties in estimating risk are given explicit quantitative treatment using Monte Carlo analysis. Chemicals that contribute significantly to estimated health risks are identified, as are key uncertainties and variables to which risk estimates are sensitive. The results show that hypothetical exposure via drinking water impacted by chemicals in Marcellus Shale flowback water, assumed to be spilled onto the ground surface, results in predicted bounds between 10^-10 and 10^-6 (for both adult and child receptors) for excess lifetime cancer risk. Cumulative hazard indices (HI_cumulative) resulting from these hypothetical exposures have predicted bounds (5th to 95th percentile) between 0.02 and 35 for assumed adult receptors and 0.1 and 146 for assumed child receptors. Predicted health risks are dominated by noncancer endpoints related to ingestion of barium and lithium in impacted groundwater. Hazard indices above unity are largely related to exposure to lithium. Salinity taste thresholds are likely to be exceeded before drinking water exposures result in adverse health effects. The findings provide focus for policy discussions concerning flowback water risk management. They also indicate ways to improve the ability to estimate health risks from drinking water impacted by a flowback water spill (i.e., reducing uncertainty). © 2017 Society for Risk Analysis.
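The cumulative hazard index calculation underlying bounds like these is a Monte Carlo sum of chemical-specific dose-to-reference-dose ratios. The sketch below shows the mechanics; the concentration distributions, intake parameters, and reference doses are illustrative stand-ins, not the study's flowback-water source terms.

```python
"""Bounding-risk sketch: Monte Carlo hazard index for two analytes."""
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

# Hypothetical groundwater concentrations at the well (mg/L), lognormal.
conc = {
    "barium":  rng.lognormal(mean=np.log(0.5), sigma=1.0, size=n),
    "lithium": rng.lognormal(mean=np.log(0.08), sigma=1.2, size=n),
}
rfd = {"barium": 0.2, "lithium": 0.002}     # oral reference doses, mg/kg-day

intake = rng.uniform(1.0, 3.0, n)           # drinking water, L/day
weight = rng.normal(70.0, 10.0, n).clip(40) # adult body weight, kg

# Cumulative hazard index: sum over chemicals of dose / reference dose.
hi = sum(conc[c] * intake / weight / rfd[c] for c in conc)

lo, hi95 = np.percentile(hi, [5, 95])
print(f"HI 5th-95th percentile: {lo:.2f} - {hi95:.2f}")
print(f"P(HI > 1) = {np.mean(hi > 1):.2%}")
```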
Cano-Sancho, German; Salmon, Andrew G.
2017-01-01
Background: The prevalence of obesity is increasing in all countries, becoming a substantial public health concern worldwide. Increasing evidence has associated obesity with persistent pollutants such as the pesticide DDT and its metabolite p,p′-DDE. Objectives: Our objective was to systematically review the literature on the association between exposure to the pesticide DDT and its metabolites and obesity to develop hazard identification conclusions. Methods: We applied a systematic review-based strategy to identify and integrate evidence from epidemiological, in vivo, and in vitro studies. The evidence from prospective epidemiological studies was quantitatively synthesized by meta-analysis. We rated the body of evidence and integrated the streams of evidence to systematically develop hazard identification conclusions. Results: We identified seven epidemiological studies reporting prospective associations between exposure to p,p′-DDE and adiposity assessed by body mass index (BMI) z-score. The results from the meta-analysis revealed positive associations between exposure to p,p′-DDE and BMI z-score (β=0.13 BMI z-score (95% CI: 0.01, 0.25) per log increase of p,p′-DDE). Two studies constituted the primary in vivo evidence. Both studies reported positive associations between exposure to p,p′-DDT and increased adiposity in rodents. We identified 19 in vivo studies and 7 in vitro studies that supported the biological plausibility of the obesogenic effects of p,p′-DDT and p,p′-DDE. Conclusions: We classified p,p′-DDT and p,p′-DDE as “presumed” to be obesogenic for humans, based on a moderate level of primary human evidence, a moderate level of primary in vivo evidence, and a moderate level of supporting evidence from in vivo and in vitro studies. https://doi.org/10.1289/EHP527 PMID:28934091
Rossi, Marta; Turati, Federica; Lagiou, Pagona; Trichopoulos, Dimitrios; La Vecchia, Carlo; Trichopoulou, Antonia
2015-03-01
High glycemic load (GL) has been associated with excess stroke risk. Data suggest a different role of diet in the etiology of ischemic and hemorrhagic stroke. We analyzed data from 19,824 participants of the Greek cohort of the population-based European Prospective Investigation into Cancer and Nutrition (EPIC), who were free of cardiovascular diseases, cancer, and diabetes at baseline and had not developed diabetes. Diet was assessed at enrollment through a validated, interviewer-administered semi-quantitative food frequency questionnaire. The average daily GL was derived using standard tables. We also conducted a meta-analysis on GL and stroke (overall, ischemic and hemorrhagic), using random-effects models. In the Greek EPIC cohort, 304 incident stroke cases were identified (67 ischemic, 49 hemorrhagic). Using Cox proportional hazards regression models adjusted for potential confounders, the hazard ratios for the highest versus the lowest GL tertiles were 1.07 [95 % confidence interval (CI) 0.74-1.54] for overall stroke, 1.55 (95 % CI 0.72-3.36) for ischemic and 0.48 (95 % CI 0.18-1.25) for hemorrhagic stroke (p-heterogeneity <0.01). The meta-analysis, including a total of 3,088 incident cases and 247 deaths from stroke (1,469 cases and 126 deaths ischemic; 576 cases and 94 deaths hemorrhagic), estimated pooled relative risks for the highest versus the lowest GL levels of 1.23 (95 % CI 1.07-1.41) for overall, 1.35 (95 % CI 1.06-1.72) for ischemic, and 1.09 (95 % CI 0.81-1.47) for hemorrhagic stroke (p-heterogeneity = 0.275). This study indicates that GL is an important determinant of the more common ischemic stroke, though not of hemorrhagic stroke.
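The tertile comparison reported here is a standard adjusted Cox proportional-hazards fit. The sketch below reproduces the model form on synthetic data using the lifelines library; the cohort size, covariates, and effect sizes are invented to show the setup, not to reproduce the EPIC estimates.

```python
"""Sketch of a Cox PH comparison of GL tertiles (synthetic cohort)."""
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 5000

gl_tertile = rng.integers(0, 3, n)              # 0 = lowest, 2 = highest
age = rng.normal(55, 8, n)                      # one example confounder

# Exponential event times with hazard increasing in tertile and age.
lam = 0.01 * np.exp(0.2 * (gl_tertile == 2) + 0.03 * (age - 55))
time = rng.exponential(1.0 / lam)
censor = rng.uniform(0, 15, n)                  # administrative censoring

df = pd.DataFrame({
    "T": np.minimum(time, censor),
    "E": (time <= censor).astype(int),          # 1 = stroke observed
    "gl_mid": (gl_tertile == 1).astype(int),    # dummies vs lowest tertile
    "gl_high": (gl_tertile == 2).astype(int),
    "age": age,
})

cph = CoxPHFitter()
cph.fit(df, duration_col="T", event_col="E")
cph.print_summary()   # hazard ratios = exp(coef) for each covariate
```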
Cano-Sancho, German; Salmon, Andrew G; La Merrill, Michele A
2017-09-18
The prevalence of obesity is increasing in all countries, becoming a substantial public health concern worldwide. Increasing evidence has associated obesity with persistent pollutants such as the pesticide DDT and its metabolite p,p'-DDE. Our objective was to systematically review the literature on the association between exposure to the pesticide DDT and its metabolites and obesity to develop hazard identification conclusions. We applied a systematic review-based strategy to identify and integrate evidence from epidemiological, in vivo, and in vitro studies. The evidence from prospective epidemiological studies was quantitatively synthesized by meta-analysis. We rated the body of evidence and integrated the streams of evidence to systematically develop hazard identification conclusions. We identified seven epidemiological studies reporting prospective associations between exposure to p,p'-DDE and adiposity assessed by body mass index (BMI) z-score. The results from the meta-analysis revealed positive associations between exposure to p,p'-DDE and BMI z-score (β=0.13 BMI z-score (95% CI: 0.01, 0.25) per log increase of p,p'-DDE). Two studies constituted the primary in vivo evidence. Both studies reported positive associations between exposure to p,p'-DDT and increased adiposity in rodents. We identified 19 in vivo studies and 7 in vitro studies that supported the biological plausibility of the obesogenic effects of p,p'-DDT and p,p'-DDE. We classified p,p'-DDT and p,p'-DDE as "presumed" to be obesogenic for humans, based on a moderate level of primary human evidence, a moderate level of primary in vivo evidence, and a moderate level of supporting evidence from in vivo and in vitro studies. https://doi.org/10.1289/EHP527.
A tool for assessment of heart failure prescribing quality: A systematic review and meta-analysis.
El Hadidi, Seif; Darweesh, Ebtissam; Byrne, Stephen; Bermingham, Margaret
2018-04-16
Heart failure (HF) guidelines aim to standardise patient care. Internationally, prescribing practice in HF may deviate from guidelines and so a standardised tool is required to assess prescribing quality. A systematic review and meta-analysis were performed to identify a quantitative tool for measuring adherence to HF guidelines and its clinical implications. Eleven electronic databases were searched to include studies reporting a comprehensive tool for measuring adherence to prescribing guidelines in HF patients aged ≥18 years. Qualitative studies or studies measuring prescription rates alone were excluded. Study quality was assessed using the Good ReseArch for Comparative Effectiveness Checklist. In total, 2455 studies were identified. Sixteen eligible full-text articles were included (n = 14 354 patients, mean age 69 ± 8 y). The Guideline Adherence Index (GAI), and its modified versions, was the most frequently cited tool (n = 13). Other tools identified were the Individualised Reconciled Evidence Recommendations, the Composite Heart Failure Performance, and the Heart Failure Scale. The meta-analysis included the GAI studies of good to high quality. The average GAI-3 was 62%. Compared to low GAI, high GAI patients had lower mortality rate (7.6% vs 33.9%) and lower rehospitalisation rates (23.5% vs 24.5%); both P ≤ .05. High GAI was associated with reduced risk of mortality (hazard ratio = 0.29, 95% confidence interval, 0.06-0.51) and rehospitalisation (hazard ratio = 0.64, 95% confidence interval, 0.41-1.00). No tool was used to improve prescribing quality. The GAI is the most frequently used tool to assess guideline adherence in HF. High GAI is associated with improved HF outcomes. Copyright © 2018 John Wiley & Sons, Ltd.
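A guideline adherence index of the GAI-3 type reduces to a simple ratio: of the drug classes indicated for a given patient, how many were actually prescribed. The sketch below shows that calculation; the drug-class names and the indication logic are simplified assumptions, not the exact published scoring rules.

```python
"""Minimal sketch of a GAI-3-style guideline adherence index."""

def gai(indicated, prescribed):
    """Percent of guideline-indicated drug classes actually prescribed."""
    eligible = [d for d, needed in indicated.items() if needed]
    if not eligible:
        return None                    # GAI undefined if nothing is indicated
    taken = sum(1 for d in eligible if prescribed.get(d, False))
    return 100.0 * taken / len(eligible)

# Example over three cornerstone HF drug classes for one patient:
indicated = {"ACEi_or_ARB": True, "beta_blocker": True, "MRA": True}
prescribed = {"ACEi_or_ARB": True, "beta_blocker": True, "MRA": False}
print(f"GAI-3 = {gai(indicated, prescribed):.0f}%")   # -> 67%
```

Averaged over a cohort, a figure like the 62% mean GAI-3 reported above emerges, and patients can then be split into high- and low-GAI groups for outcome comparisons.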
Thiel, A; Etheve, S; Fabian, E; Leeman, W R; Plautz, J R
2015-10-01
Consumer health risk assessment for feed additives is based on the estimated human exposure to the additive that may occur in livestock edible tissues compared to its hazard. We present an approach using alternative methods for consumer health risk assessment. The aim was to use the smallest possible number of animals to estimate the hazard and human exposure without jeopardizing safety upon use. As an example we selected the feed flavoring substance piperine and applied in silico modeling for residue estimation, results from literature surveys, and Read-Across to assess metabolism in different species. Results were compared to experimental in vitro metabolism data in rat and chicken, and to quantitative analysis of residue levels from the in vivo situation in livestock. In silico residue modeling proved to be a worst case: the modeled residual levels were considerably higher than the measured residual levels. The in vitro evaluation of livestock versus rodent metabolism revealed no major differences in metabolism between the species. We successfully performed a consumer health risk assessment without performing additional animal experiments. As shown, the use and combination of different alternative methods supports animal welfare considerations and provides a future perspective on reducing the number of animals. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
Consumer Product Chemicals in Indoor Dust: A Quantitative Meta-analysis of U.S. Studies
2016-01-01
Indoor dust is a reservoir for commercial consumer product chemicals, including many compounds with known or suspected health effects. However, most dust exposure studies measure few chemicals in small samples. We systematically searched the U.S. indoor dust literature on phthalates, replacement flame retardants (RFRs), perfluoroalkyl substances (PFASs), synthetic fragrances, and environmental phenols and estimated pooled geometric means (GMs) and 95% confidence intervals for 45 chemicals measured in ≥3 data sets. In order to rank and contextualize these results, we used the pooled GMs to calculate residential intake from dust ingestion, inhalation, and dermal uptake from air, and then identified hazard traits from the Safer Consumer Products Candidate Chemical List. Our results indicate that U.S. indoor dust consistently contains chemicals from multiple classes. Phthalates occurred in the highest concentrations, followed by phenols, RFRs, fragrance, and PFASs. Several phthalates and RFRs had the highest residential intakes. We also found that many chemicals in dust share hazard traits such as reproductive and endocrine toxicity. We offer recommendations to maximize comparability of studies and advance indoor exposure science. This information is critical in shaping future exposure and health studies, especially related to cumulative exposures, and in providing evidence for intervention development and public policy. PMID:27623734
The hazard of exposure to impulse noise as a function of frequency, volume 1
NASA Astrophysics Data System (ADS)
Patterson, James H., Jr.; Carrier, Melvin, Jr.; Bordwell, Kevin; Gautier, Ilia M.; Hamernik, Roger P.
1991-06-01
The energy spectrum of a noise is known to be an important variable in determining the effects of a traumatic exposure. However, existing criteria for exposure to impulse noise do not consider the frequency spectrum of an impulse as a variable in the evaluation of the hazards to the auditory system. This report presents the results of a study that was designed to determine the relative potential that impulsive energy concentrated at different frequencies has in causing auditory system trauma. One hundred and eighteen (118) chinchillas, divided into 20 groups with 5 to 7 animals per group, were used in these experiments. Pre- and post-exposure hearing thresholds were measured at 10 test frequencies between 0.125 and 8 kHz on each animal using avoidance conditioning procedures. Quantitative histology (cochleograms) was used to determine the extent and pattern of the sensory cell damage. The noise exposure stimuli consisted of six different computer-generated narrow band tone bursts having center frequencies located at 0.260, 0.775, 1.350, 2.450, and 3.550 kHz. Each narrow band exposure stimulus was presented at two to four different intensities. An analysis of the audiometric and histological data allowed frequency weighting functions to be derived.
From preparedness to risk: from the singular risk of nuclear war to the plurality of all hazards.
Deville, Joe; Guggenheim, Michael
2017-08-17
Debates on risk have largely assumed risk to be the outcome of calculative practices. There is a related assumption that risk objects come only in one form, and that the reason not everything can be transformed into a risk is because of the difficulties in calculating and creating universal quantitative comparisons. In this article, building on recent studies of preparedness that have broadened understandings of risk, we provide an analysis of how preparedness measures might themselves produce risk, in particular through risk's durable instantiation, or what we call 'concretization'. Our empirical focus is on how government agencies in two countries shifted their attention from the risk of nuclear attack during the Cold War to an all hazards approach to preparedness. Comparing the mid- to late-twentieth century histories of the UK and Switzerland, we show that both countries shifted from focusing from a single risk to plural risks. This shift cannot be explained by a change in prevailing calculative practices, or by the fact that the risks changed historically. Instead, it is driven by historically specific changes in how risks are produced and reproduced in relation to how materializations of risk operate over time. © London School of Economics and Political Science 2017.
Mestre, Nélia C; Rocha, Thiago L; Canals, Miquel; Cardoso, Cátia; Danovaro, Roberto; Dell'Anno, Antonio; Gambi, Cristina; Regoli, Francesco; Sanchez-Vidal, Anna; Bebianno, Maria João
2017-09-01
Portmán Bay is a heavily contaminated area resulting from decades of metal mine tailings disposal, and is considered a suitable shallow-water analogue to investigate the potential ecotoxicological impact of deep-sea mining. Resuspension plumes were artificially created by removing the top layer of the mine tailings deposit by bottom trawling. Mussels were deployed at three sites: i) off the mine tailings deposit area; ii) on the mine tailings deposit beyond the influence of the resuspension plumes; iii) under the influence of the artificially generated resuspension plumes. Surface sediment samples were collected at the same sites for metal analysis and ecotoxicity assessment. Metal concentrations and a battery of biomarkers (oxidative stress, metal exposure, biotransformation and oxidative damage) were measured in different mussel tissues. The environmental hazard posed by the resuspension plumes was investigated by a quantitative weight of evidence (WOE) model that integrated all the data. The resuspension of sediments loaded with metal mine tailings demonstrated that chemical contaminants were released by trawling, subsequently inducing ecotoxicological impacts on mussel health. Considering as sediment quality guidelines (SQGs) those indicated in Spanish action level B for the disposal of dredged material at sea, the WOE model indicates that the hazard is slight off the mine tailings deposit, moderate on the mine tailings deposit without the influence of the resuspension plumes, and major under the influence of the resuspension plumes. The Portmán Bay mine tailings deposit is a by-product of sulphide mining, and despite differences in environmental setting, it can reflect the potential ecotoxic effects on marine fauna of the resuspension plumes created by deep-sea mining of polymetallic sulphides. A similar approach to that of this study could be applied in other areas affected by sediment resuspension and for testing future deep-sea mining sites in order to assess the associated environmental hazards. Copyright © 2017 Elsevier Ltd. All rights reserved.
Furlan, Elisa; Torresan, Silvia; Critto, Andrea; Marcomini, Antonio
2018-03-15
In the last few decades, the health of marine ecosystems has been progressively endangered by the anthropogenic presence. Natural and human-made pressures, as well as climate change effects, are posing increasing threats to marine areas, triggering alteration of biological, chemical and physical processes. Planning of marine areas has become a challenge for decision makers involved in the design of sustainable management options. In order to address threats posed by climate drivers in combination with local to regional anthropogenic pressures affecting marine ecosystems and activities, a multi-hazard assessment methodology was developed and applied to the Adriatic Sea for the reference scenario 2000-2015. Through a four-stage process based on the consecutive analysis of hazard, exposure, vulnerability and risk, the methodology allows a semi-quantitative evaluation of the relative risk from anthropogenic and natural sources to multiple endpoints, thus supporting the identification and ranking of areas and targets more likely to be at risk. The resulting output showed that the higher relative hazard scores are linked to exogenic pressures (e.g. sea surface temperature variation) while the lower ones result from endogenic and more localized stressors (e.g. abrasion, nutrient input). Relatively very high vulnerability scores were observed over the whole case study area for almost all the considered pressures, showing seagrass meadows, maërl and coral beds as the most susceptible targets. The approach outlined in this study provides planners and decision makers with a quick-screening tool to evaluate progress towards attaining a good environmental status and to identify marine areas where management actions and adaptation strategies would be best targeted. Moreover, by focusing on risks induced by land-based drivers, the resulting output can support the design of infrastructures for reducing pressures on the sea, contributing to improved management of the land-sea interface. Copyright © 2017 Elsevier B.V. All rights reserved.
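The four-stage logic (hazard, exposure, vulnerability, then risk) can be illustrated with a toy relative-risk ranking over grid cells. The scores and the multiplicative combination rule below are invented placeholders; the paper's actual scoring and aggregation scheme may differ.

```python
"""Sketch of semi-quantitative relative-risk ranking over sea-grid cells."""
import numpy as np

rng = np.random.default_rng(2)
n_cells = 8

# Normalised [0, 1] scores per cell for one pressure (e.g. SST variation).
hazard = rng.random(n_cells)
exposure = rng.random(n_cells)          # presence of targets (e.g. seagrass)
vulnerability = rng.random(n_cells)     # susceptibility of those targets

# Relative risk as the product of the three factors, ranked highest first.
risk = hazard * exposure * vulnerability
for cell in np.argsort(risk)[::-1]:
    print(f"cell {cell}: risk score {risk[cell]:.2f}")
```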
Seismic Risk Assessment for the Kyrgyz Republic
NASA Astrophysics Data System (ADS)
Pittore, Massimiliano; Sousa, Luis; Grant, Damian; Fleming, Kevin; Parolai, Stefano; Fourniadis, Yannis; Free, Matthew; Moldobekov, Bolot; Takeuchi, Ko
2017-04-01
The Kyrgyz Republic is one of the most socially and economically dynamic countries in Central Asia, and one of the most endangered by earthquake hazard in the region. In order to support the government of the Kyrgyz Republic in the development of a country-level Disaster Risk Reduction strategy, a comprehensive seismic risk study has been developed with the support of the World Bank. As part of this project, state-of-the-art hazard, exposure and vulnerability models have been developed and combined into the assessment of direct physical and economic risk on residential, educational and transportation infrastructure. The seismic hazard has been modelled with three different approaches, in order to provide a comprehensive overview of the possible consequences. A probabilistic seismic hazard assessment (PSHA) approach has been used to quantitatively evaluate the distribution of expected ground shaking intensity, as constrained by the compiled earthquake catalogue and associated seismic source model. A set of specific seismic scenarios based on events generated from known fault systems have been also considered, in order to provide insight on the expected consequences in case of strong events in proximity of densely inhabited areas. Furthermore, long-span catalogues of events have been generated stochastically and employed in the probabilistic analysis of expected losses over the territory of the Kyrgyz Republic. Damage and risk estimates have been computed by using an exposure model recently developed for the country, combined with the assignment of suitable fragility/vulnerability models. The risk estimation has been carried out with spatial aggregation at the district (rayon) level. The obtained results confirm the high level of seismic risk throughout the country, also pinpointing the location of several risk hotspots, particularly in the southern districts, in correspondence with the Ferghana valley. The outcome of this project will further support the local decision makers in implementing specific prevention and mitigation measures that are consistent with a broad risk reduction strategy.
Monte Carlo simulation for slip rate sensitivity analysis in Cimandiri fault area
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pratama, Cecep, E-mail: great.pratama@gmail.com; Meilano, Irwan; Nugraha, Andri Dian
Slip rate is used to estimate the earthquake recurrence relationship, which has the most influence on the hazard level. We examine the slip rate contribution to Peak Ground Acceleration (PGA) in probabilistic seismic hazard maps (10% probability of exceedance in 50 years, or a 500-year return period). Hazard curves of PGA have been investigated for Sukabumi using a PSHA (Probabilistic Seismic Hazard Analysis). We observe that the largest influence on the hazard estimate is the crustal fault. A Monte Carlo approach has been developed to assess the sensitivity, and the properties of the Monte Carlo simulations have been assessed. The uncertainty and coefficient of variation of the slip rate for the Cimandiri Fault area have been calculated. We observe that the seismic hazard estimate is sensitive to the fault slip rate, with a seismic hazard uncertainty of about 0.25 g. For a specific site, we found the seismic hazard estimate for Sukabumi to be between 0.4904 – 0.8465 g, with uncertainty between 0.0847 – 0.2389 g and COV between 17.7% – 29.8%.
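The sensitivity analysis amounts to sampling the slip rate, propagating each sample through the recurrence and ground-motion chain, and summarising the spread of the resulting PGA. The sketch below shows the mechanics; the slip-rate distribution, the rate scaling, and the toy PGA relation are illustrative stand-ins, not the Cimandiri fault model.

```python
"""Sketch: propagating fault slip-rate uncertainty into a PGA estimate."""
import numpy as np

rng = np.random.default_rng(3)
n = 20_000

# Hypothetical slip rate for the fault (mm/yr), normally distributed.
slip = rng.normal(5.0, 1.5, n).clip(0.5)

# Toy chain: activity rate proportional to slip rate, then a fixed
# magnitude-distance scenario mapped to PGA with aleatory scatter.
rate = 0.02 * slip                               # events/yr of M >= 6.5
p50yr = 1.0 - np.exp(-rate * 50.0)               # Poisson, 50-yr window
pga = 0.3 * (p50yr / 0.1) ** 0.5 * np.exp(rng.normal(0, 0.2, n))

mean, std = pga.mean(), pga.std()
print(f"mean PGA = {mean:.3f} g, sigma = {std:.3f} g, COV = {std/mean:.1%}")
```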
Hydrogen Hazards Assessment Protocol (HHAP): Approach and Methodology
NASA Technical Reports Server (NTRS)
Woods, Stephen
2009-01-01
This viewgraph presentation reviews the approach and methodology used to develop an assessment protocol for hydrogen hazards. Included in the presentation are the reasons to perform hazard assessment, the types of hazard assessment that exist, an analysis of hydrogen hazards, and specific information about the Hydrogen Hazards Assessment Protocol (HHAP). The assessment is specifically tailored to hydrogen behavior. The end product of the assessment is a compilation of hazards, mitigations, and associated factors to facilitate decision making and achieve best practice.
Yan, Fang; Xu, Kaili
2017-01-01
Because a biomass gasification station includes various hazard factors, hazard assessment is necessary and significant. In this article, the cloud model (CM) is employed to improve set pair analysis (SPA), and a novel hazard assessment method for a biomass gasification station is proposed based on cloud model-set pair analysis (CM-SPA). In this method, cloud weight is proposed as the weight of an index. In contrast to the index weights of other methods, cloud weight is expressed by cloud descriptors; hence, the randomness and fuzziness of cloud weight make it effective in reflecting the linguistic variables of experts. Then, the cloud connection degree (CCD) is proposed to replace the connection degree (CD), and the calculation algorithm of the CCD is worked out. By utilizing the CCD, the hazard assessment results are expressed by normal clouds, which are in turn reflected by cloud descriptors; meanwhile, the hazard grade is confirmed by analyzing the cloud descriptors. After that, two biomass gasification stations undergo hazard assessment via CM-SPA and AHP-based SPA, respectively. The comparison of assessment results illustrates that CM-SPA is suitable and effective for the hazard assessment of a biomass gasification station and that CM-SPA makes the assessment results more reasonable and scientific. PMID:28076440
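The normal cloud underlying descriptors like these is generated by a standard two-stage sampler: draw a perturbed entropy, then draw a drop around the expectation and score its membership. The sketch below implements that generator; the (Ex, En, He) values are invented, not taken from the paper's expert elicitation.

```python
"""Sketch of the normal-cloud generator behind CM-SPA cloud weights.

A cloud is described by (Ex, En, He): expectation, entropy, hyper-entropy.
"""
import numpy as np

rng = np.random.default_rng(11)

def cloud_drops(ex, en, he, n=2000):
    """Generate n cloud drops (x, membership) for a normal cloud."""
    en_prime = rng.normal(en, he, n)             # second-order randomness
    x = rng.normal(ex, np.abs(en_prime))         # drop positions
    mu = np.exp(-(x - ex) ** 2 / (2.0 * en_prime ** 2))   # memberships
    return x, mu

# A hypothetical cloud weight for one hazard index: Ex=0.4, En=0.05, He=0.01.
x, mu = cloud_drops(0.4, 0.05, 0.01)
print(f"drops around Ex: mean = {x.mean():.3f}, "
      f"mean membership = {mu.mean():.3f}")
```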
Agent-based simulation for human-induced hazard analysis.
Bulleit, William M; Drewek, Matthew W
2011-02-01
Terrorism could be treated as a hazard for design purposes. For instance, the terrorist hazard could be analyzed in a manner similar to the way that seismic hazard is handled. No matter how terrorism is dealt with in the design of systems, predictions of the frequency and magnitude of the hazard will be required. And, if the human-induced hazard is to be designed for in a manner analogous to natural hazards, then the predictions should be probabilistic in nature. The model described in this article is a prototype model that used agent-based modeling (ABM) to analyze terrorist attacks. The basic approach in this article of using ABM to model human-induced hazards has been preliminarily validated in the sense that the attack magnitudes seem to be power-law distributed and attacks occur mostly in regions where high levels of wealth pass through, such as transit routes and markets. The model developed in this study indicates that ABM is a viable approach to modeling socioeconomic-based infrastructure systems for engineering design to deal with human-induced hazards. © 2010 Society for Risk Analysis.
An ecotoxicological approach for hazard identification of energy ash.
Stiernström, S; Hemström, K; Wik, O; Carlsson, G; Bengtsson, B-E; Breitholtz, M
2011-02-01
Within the EU, ash should be classified by its inherent hazardous properties under criterion H-14 (ecotoxic) of the Directive on waste (2008/98/EC). Today, however, there are no harmonized quantitative criteria for such a classification, although it is stated that biological test systems can be used. In this study, seven ash materials were leached and characterized both biologically and chemically. The objectives were to evaluate whether (a) clear concentration-response relationships could be achieved for the selected toxicity tests (bacteria, algae, crustacean, and fish), (b) some tests are generally more sensitive than others, and (c) the toxic responses were consistent with the chemical analyses. Interestingly, our results indicate that high concentrations of non-hazardous components (Ca, K) influenced the toxicity of almost all ash eluates, whereas hazardous components (e.g., Zn, Pb) only influenced the toxicity of the eluates ranked as most hazardous. When both hazardous and non-hazardous substances are considered, the observed toxic responses were relatively consistent with the chemical analyses. Our results further showed that the (sub)chronic tests were much more sensitive than the acute tests; the use of extrapolation factors to compensate for using the less sensitive acute tests will therefore likely lead to over- or underestimations of toxicity. Our recommendation is that classification of waste according to H-14 be based on (sub)chronic test data. Finally, given that treatment of the eluates prior to toxicity testing strongly affects the concentration and speciation of released substances, further studies are needed before a relevant testing scheme can be proposed. Copyright © 2010 Elsevier Ltd. All rights reserved.
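Where the study asks whether clear concentration-response relationships can be achieved, the usual quantitative form is a log-logistic fit yielding an EC50. The sketch below fits such a curve to invented dilution data; the data points, model form, and starting values are assumptions for illustration only, not results from the paper.

```python
# Hedged sketch: log-logistic concentration-response fit on invented data.
import numpy as np
from scipy.optimize import curve_fit

def log_logistic(conc, ec50, slope):
    """Fraction of test organisms responding at eluate concentration `conc` (% v/v)."""
    return 1.0 / (1.0 + (ec50 / conc) ** slope)

conc = np.array([1.0, 3.2, 10.0, 32.0, 100.0])       # eluate concentration, % v/v
response = np.array([0.02, 0.10, 0.45, 0.85, 0.98])  # observed effect fraction (invented)

(ec50, slope), _ = curve_fit(log_logistic, conc, response, p0=[10.0, 1.0])
print(f"EC50 ~ {ec50:.1f}% eluate, slope ~ {slope:.2f}")
```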
Chen, Yuhuan; Dennis, Sherri B; Hartnett, Emma; Paoli, Greg; Pouillot, Régis; Ruthman, Todd; Wilson, Margaret
2013-03-01
Stakeholders in the system of food safety, in particular federal agencies, need evidence-based, transparent, and rigorous approaches to estimate and compare the risk of foodborne illness from microbial and chemical hazards and the public health impact of interventions. FDA-iRISK (referred to here as iRISK), a Web-based quantitative risk assessment system, was developed to meet this need. The modeling tool enables users to assess, compare, and rank the risks posed by multiple food-hazard pairs at all stages of the food supply system, from primary production, through manufacturing and processing, to retail distribution and, ultimately, to the consumer. Using standard data entry templates, built-in mathematical functions, and Monte Carlo simulation techniques, iRISK integrates data and assumptions from seven components: the food, the hazard, the population of consumers, process models describing the introduction and fate of the hazard up to the point of consumption, consumption patterns, dose-response curves, and health effects. Beyond risk ranking, iRISK enables users to estimate and compare the impact of interventions and control measures on public health risk. iRISK provides estimates of the impact of proposed interventions in various ways, including changes in the mean risk of illness and burden of disease metrics, such as losses in disability-adjusted life years. Case studies for Listeria monocytogenes and Salmonella were developed to demonstrate the application of iRISK for the estimation of risks and the impact of interventions for microbial hazards. iRISK was made available to the public at http://irisk.foodrisk.org in October 2012.
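A minimal sketch of the kind of Monte Carlo chain iRISK assembles (contamination, consumption, dose-response) is shown below. The distributions and the exponential dose-response parameter r are placeholder assumptions, not iRISK defaults or FDA data.

```python
# Hedged sketch: a stripped-down food-hazard Monte Carlo chain. All
# distributions and parameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

log_conc = rng.normal(loc=0.0, scale=1.0, size=n)   # log10 CFU/g at consumption
serving_g = rng.triangular(50, 100, 200, size=n)    # serving size, grams
dose = 10.0 ** log_conc * serving_g                 # ingested dose, CFU

r = 1e-6                                            # assumed exponential dose-response parameter
p_ill = 1.0 - np.exp(-r * dose)                     # per-serving probability of illness
print(f"mean risk per serving: {p_ill.mean():.2e}")

# An intervention cutting contamination tenfold shows up directly in the mean risk:
p_ill_mitigated = 1.0 - np.exp(-r * dose / 10.0)
print(f"with 1-log reduction:  {p_ill_mitigated.mean():.2e}")
```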
Cho, Herman M.; Washton, Nancy M.; Mueller, Karl T.; Sears, Jr., Jesse A.; Townsend, Mark R.; Ewing, James R.
2016-06-14
A magic-angle-spinning (MAS) nuclear magnetic resonance (NMR) probe is described that includes double containment enclosures configured to seal and contain hazardous samples for analysis. The probe is of a modular design that ensures containment of hazardous samples during sample analysis while preserving spin speeds for superior NMR performance and convenience of operation.
Connor, Thomas H; Smith, Jerome P
2016-09-01
At present, the method of choice for determining surface contamination of the workplace with antineoplastic and other hazardous drugs is surface wipe sampling followed by sample analysis with a variety of analytical techniques. The purpose of this article is to review current methodology for determining the level of surface contamination with hazardous drugs in healthcare settings, to discuss recent advances in this area, and to provide some guidance for conducting surface wipe sampling and sample analysis for these drugs in healthcare settings. Published studies on the use of wipe sampling to measure hazardous drugs on surfaces in healthcare settings were reviewed. These studies include the use of well-documented chromatographic techniques for sample analysis, in addition to newly evolving technology that provides rapid analysis of specific antineoplastic drugs. Methodology for the analysis of surface wipe samples for hazardous drugs is reviewed, including the purposes, technical factors, sampling strategy, materials required, and limitations. The use of lateral flow immunoassay (LFIA) and fluorescence covalent microbead immunosorbent assay (FCMIA) for surface wipe sample evaluation is also discussed. Current recommendations are that all healthcare settings where antineoplastic and other hazardous drugs are handled include surface wipe sampling as part of a comprehensive hazardous drug-safe handling program. Surface wipe sampling may be used to characterize potential occupational dermal exposure risk and to evaluate the effectiveness of implemented controls and the overall safety program. New technology, although currently limited in scope, may make wipe sampling for hazardous drugs more routine and less costly, with a shorter response time than the classical analytical techniques now in use.
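The core arithmetic behind reporting a wipe result is simple enough to state as code. The sketch below converts recovered drug mass to a surface loading corrected for recovery efficiency; the drug, numbers, and recovery value are hypothetical, and the article itself endorses no single action threshold.

```python
# Hedged sketch: surface-loading arithmetic for a wipe sample. The example
# values (drug, mass, area, recovery) are invented for illustration.
def surface_loading(mass_ng: float, area_cm2: float, recovery: float) -> float:
    """Surface contamination in ng/cm2, corrected for wipe recovery efficiency."""
    return mass_ng / (area_cm2 * recovery)

# e.g. 120 ng of a drug recovered from a 600 cm2 counter at 75% recovery:
loading = surface_loading(mass_ng=120.0, area_cm2=600.0, recovery=0.75)
print(f"{loading:.3f} ng/cm2")  # ~0.267 ng/cm2
```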
Integration of expert knowledge and uncertainty in natural risk assessment
NASA Astrophysics Data System (ADS)
Baruffini, Mirko; Jaboyedoff, Michel
2010-05-01
Natural hazards occurring in alpine regions during the last decades, such as interruptions of the Swiss railway power supply and closures of the Gotthard highway, have increased awareness of infrastructure vulnerability in Switzerland and illustrate the potential impacts of failures on the performance of infrastructure systems. This calls for a high level of surveillance and preservation along the transalpine lines. Traditional simulation models are only partially capable of predicting the behaviour of complex systems, and the protection strategies designed and implemented on their basis cannot mitigate the full spectrum of risk consequences; they are costly, and maximal protection is most probably not economically feasible. In addition, quantitative risk assessment approaches such as fault tree analysis, event tree analysis, and equivalent annual fatality analysis rely heavily on statistical information. Collecting enough data on which to base a statistical probability of risk is costly, and in many situations such data do not exist; expert knowledge, experience, and engineering judgment can instead be exploited to estimate risk qualitatively. To overcome this lack of statistics, we used models based on expert knowledge that make qualitative predictions from linguistic appraisals, which are more expressive and natural in risk assessment. Fuzzy reasoning (FR) provides a mechanism of computing with words (Zadeh, 1965) for modelling qualitative human thought processes in the analysis of complex systems and decisions; uncertainty in predicting risk levels arises precisely because no fully formalized knowledge is available in such situations. An alternative is a probabilistic representation based on the triangular probability density function (T-PDF), which can follow the same flow chart as FR. We implemented the Swiss natural hazard recommendations with both FR and T-PDF-based probability in order to obtain hazard zoning and its uncertainties, following the same approach for each term of the risk equation: hazard, vulnerability, elements at risk, and exposure. This risk approach can draw on several artificial intelligence (AI) technologies, for example (1) GIS techniques, (2) FR or T-PDF for qualitative risk prediction, and (3) multi-criteria evaluation for analyzing weak points. The main advantages of FR and T-PDF are the ability to express not-fully-formalized knowledge, easy knowledge representation and acquisition, and self-updatability. The results show that such an approach reveals a rather wide zone of uncertainty. REFERENCE: Zadeh, L. A. (1965). Fuzzy sets. Information and Control, 8, 338-353.
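As a hedged illustration of the T-PDF branch of this approach, the sketch below propagates triangular expert judgments through a generic hazard x vulnerability x exposure product by Monte Carlo sampling. The (min, mode, max) triples and the risk decomposition are assumptions for illustration, not the Swiss-recommendation implementation.

```python
# Hedged sketch: propagating expert judgment encoded as triangular PDFs
# (T-PDFs) through a generic risk product. All triples are invented.
import numpy as np

rng = np.random.default_rng(7)
n = 50_000

hazard        = rng.triangular(0.1, 0.3, 0.6, size=n)  # annual occurrence judgment
vulnerability = rng.triangular(0.2, 0.5, 0.9, size=n)  # expected damage fraction
exposure      = rng.triangular(1.0, 2.0, 5.0, size=n)  # value at risk (arbitrary unit)

risk = hazard * vulnerability * exposure               # expected annual loss
lo, med, hi = np.percentile(risk, [5, 50, 95])
print(f"risk: median {med:.2f}, 90% band [{lo:.2f}, {hi:.2f}]")
```

The width of the resulting band is exactly the "wide zone of uncertainty" the abstract points to.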
NASA Astrophysics Data System (ADS)
Reveillere, A. R.; Bertil, D. B.; Douglas, J. D.; Grisanti, L. G.; Lecacheux, S. L.; Monfort, D. M.; Modaressi, H. M.; Müller, H. M.; Rohmer, J. R.; Sedan, O. S.
2012-04-01
In France, risk assessments for natural hazards are usually carried out separately, and decision makers lack comprehensive information. Moreover, since the cause of the hazard (e.g., meteorological, geological) and the physical phenomenon that causes damage (e.g., inundation, ground shaking) may be fundamentally different, the quantitative comparison of single-risk assessments that were not conducted in a compatible framework is not straightforward. Comprehensive comparative risk assessments exist in a few other countries; for instance, the Risk Map Germany project developed and applied a methodology for quantitatively comparing the risks of relevant natural hazards at various scales (city, state) in Germany. The present ongoing work applies a similar methodology to the Pointe-à-Pitre urban area, which holds more than half of the population of Guadeloupe, an overseas region in the French West Indies. Relevant hazards as well as hazard intensity levels differ from continental Europe, which leads to different conclusions. The French West Indies are prone to a large number of hazards, among which hurricanes, volcanic eruptions, and earthquakes dominate. Hurricanes cause damage through three phenomena: wind, heavy rainfall, and storm surge, the latter having had a preeminent role during the largest historical event in 1928. Seismic risk is characterized by many induced phenomena, among which ground shaking dominates. This study proposes a comparison of earthquake and cyclonic storm surge risks. Losses corresponding to hazard intensities with the same probability of occurrence are calculated and quantified in a common loss unit, chosen to be direct economic losses; intangible and indirect losses are not considered. The methodology therefore relies on (i) a probabilistic hazard assessment, (ii) a loss-ratio estimation for the exposed elements, and (iii) an economic estimation of these assets. Storm surge hazard assessment is based on the selection of relevant historical cyclones and on simulation of the associated wave and cyclonic surge. The combined local sea elevations, called "set-up", are then fitted with a statistical distribution in order to obtain their return-time characteristics. Several run-ups are then extracted, the inundation areas are calculated, and the relative losses of the affected assets are deduced. The Probabilistic Seismic Hazard Assessment and the location and seismic vulnerability of the exposed elements come from past public risk assessment studies. The loss estimations are computed for several return periods, expressed as the percentage of buildings in a given EMS-98 damage state per grid block, which is then converted into a loss ratio. In parallel, an asset estimation is conducted, focused mainly on private housing but also considering some major public infrastructure. The final outcome of this work is a direct economic loss-frequency plot for earthquake and storm surge; the Probable Maximum Loss and the Average Annual Loss are derived from this risk curve. In addition, different sources of uncertainty are identified through the loss estimation process. The full propagation of these uncertainties can provide a confidence interval to be assigned to the risk curve, and we show how such additional information can be useful for risk comparison.
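The final step, turning a loss-frequency curve into an Average Annual Loss, can be sketched directly. The return periods and loss values below are invented placeholders, and the largest modeled loss is used only as a crude stand-in for the Probable Maximum Loss.

```python
# Hedged sketch: Average Annual Loss (AAL) from a loss-frequency curve.
# The return periods and losses are invented placeholder numbers.
import numpy as np

return_periods = np.array([10.0, 50.0, 100.0, 500.0, 1000.0])  # years
losses = np.array([5.0, 40.0, 120.0, 450.0, 700.0])            # direct losses, M EUR

freq = 1.0 / return_periods            # annual exceedance frequency of each loss level
order = np.argsort(freq)               # integrate over increasing frequency
f, L = freq[order], losses[order]
aal = float(np.sum(0.5 * (L[1:] + L[:-1]) * np.diff(f)))       # trapezoid rule
print(f"AAL ~ {aal:.2f} M EUR/yr; largest modeled loss (PML proxy): {losses.max():.0f} M EUR")
```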
Dollberg, D D; Bolyard, M L; Smith, D L
1986-03-01
This investigation has shown that crystalline silica is present in the Mount St. Helens volcanic ash at levels of 3 to 7 per cent by weight. This identification was established using X-ray powder diffraction (XRD), infrared spectrophotometry (IR), visible spectrophotometry, electron microscopy, and laser Raman spectrophotometry. Quantitative analysis by IR, XRD, and visible spectrophotometry requires a preliminary phosphoric acid digestion of the ash sample to remove the plagioclase silicate material, which interferes with the determination by these methods. Electron microscopic analysis, as well as laser Raman spectrophotometric analysis, of the untreated ash confirms the presence of silica at the levels found by the XRD and IR analysis of the treated samples. An interlaboratory study of volcanic ash samples by 15 laboratories confirms the presence and levels of crystalline silica. Although several problems with applying the digestion procedure were observed in this hastily organized study, all laboratories employing the digestion procedure reported the presence of crystalline silica. These results unequivocally put to rest the question of the presence of silica in the volcanic ash from the 1980 eruptions of Mount St. Helens.
Daepp, Madeleine I. G.; Hamilton, Marcus J.; West, Geoffrey B.; Bettencourt, Luís M. A.
2015-01-01
The firm is a fundamental economic unit of contemporary human societies. Studies on the general quantitative and statistical character of firms have produced mixed results regarding their lifespans and mortality. We examine a comprehensive database of more than 25 000 publicly traded North American companies, from 1950 to 2009, to derive the statistics of firm lifespans. Based on detailed survival analysis, we show that the mortality of publicly traded companies manifests an approximately constant hazard rate over long periods of observation. This regularity indicates that mortality rates are independent of a company's age. We show that the typical half-life of a publicly traded company is about a decade, regardless of business sector. Our results shed new light on the dynamics of births and deaths of publicly traded companies and identify some of the necessary ingredients of a general theory of firms. PMID:25833247
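A constant hazard rate implies exponential survival, so the reported ten-year half-life corresponds to lambda = ln(2)/10 per year. The sketch below, on synthetic lifespans with right censoring at a 60-year observation window, shows the standard estimator; the sample size and window are assumptions echoing, not reproducing, the study's 1950-2009 design.

```python
# Hedged sketch: constant-hazard (exponential) survival and its censored MLE.
# Synthetic data only; not the paper's firm database.
import numpy as np

rng = np.random.default_rng(0)
true_half_life = 10.0                         # years, as reported for firms
lam = np.log(2.0) / true_half_life            # constant hazard rate

lifespans = rng.exponential(1.0 / lam, size=5000)
window = 60.0                                 # observation window, years
observed = np.minimum(lifespans, window)      # right-censor firms still alive
event = lifespans <= window                   # True if a "death" was observed

# MLE for a constant hazard with right censoring: deaths / total time at risk.
lam_hat = event.sum() / observed.sum()
print(f"estimated half-life: {np.log(2.0) / lam_hat:.2f} years")
```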
[Psychophysiological markers of accelerated aging among those working with occupational hazards].
Bashkireva, A S; Kachan, Ye Yu; Kulapina, M E
2015-01-01
Using a comparative analysis of two occupational groups, we assessed the significance of psychophysiological markers of accelerated aging of short-term memory in order to reveal how age-related changes and the working process affect mental work capacity. We revealed peculiarities of the systemic structure of the functions that determine mental work capacity depending on age and length of service in lorry drivers. It was shown that age and long driving experience affect mnestic functions, which exhibit quantitative and qualitative changes such as a reduced volume of memorized information, a longer time needed to memorize it, and a tendency toward diminished accuracy of memorization. We also showed that premature age-related changes of psychophysiological indices in drivers are "risk indicators", while long driving experience is a real risk factor contributing to the acceleration of aging.
Chemistry of mutagens and carcinogens in broiled food.
Nishimura, S
1986-01-01
From a chemical point of view, the following subjects are important areas in studies on mutagens and carcinogens in broiled foods. In addition to heterocyclic amines, which need microsomal activation, the structural elucidation of more labile direct-acting mutagens is necessary. It is known that there are still various unidentified minor mutagens in broiled foods. Although the structural characterization of such compounds is more difficult, it is important, since they might be hazardous in spite of their low mutagenicity. A simpler and more practical method for the quantitative analysis of mutagens, in addition to the HPLC and GC/MS methods presently employed, must be developed. The mechanism of formation of mutagens during broiling of food should be studied, and an effective chemical method to prevent the formation of mutagens, or to destroy them once formed, should be developed. PMID:3757944
40 CFR 68.67 - Process hazard analysis.
Code of Federal Regulations, 2014 CFR
2014-07-01
...) CHEMICAL ACCIDENT PREVENTION PROVISIONS Program 3 Prevention Program § 68.67 Process hazard analysis. (a... instrumentation with alarms, and detection hardware such as hydrocarbon sensors.); (4) Consequences of failure of...
40 CFR 68.67 - Process hazard analysis.
Code of Federal Regulations, 2013 CFR
2013-07-01
...) CHEMICAL ACCIDENT PREVENTION PROVISIONS Program 3 Prevention Program § 68.67 Process hazard analysis. (a... instrumentation with alarms, and detection hardware such as hydrocarbon sensors.); (4) Consequences of failure of...
Probabilistic wind/tornado/missile analyses for hazard and fragility evaluations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Park, Y.J.; Reich, M.
Detailed analysis procedures and examples are presented for the probabilistic evaluation of hazard and fragility against high wind, tornado, and tornado-generated missiles. In the tornado hazard analysis, existing risk models are modified to incorporate various uncertainties, including modeling errors. A significant feature of this paper is the detailed description of the Monte Carlo simulation of tornado-generated missiles. A simulation procedure, which includes wind field modeling, missile injection, solution of the flight equations, and missile impact analysis, is described with application examples.
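A hedged skeleton of such a simulation chain (wind sampling, missile injection, flight-equation integration, impact bookkeeping) is sketched below. Every parameter, from the missile's mass and drag area to the Weibull wind model, is an illustrative assumption rather than a value from the paper.

```python
# Hedged sketch: Monte Carlo tornado-missile flight with a simple drag model.
# All parameters are invented for illustration.
import numpy as np

rng = np.random.default_rng(3)

def fly(wind_mps, m=50.0, cd_a=0.2, rho=1.2, z0=30.0, dt=0.01):
    """Euler-integrate a point-mass missile in a uniform horizontal wind.
    Returns horizontal range and impact speed."""
    g = np.array([0.0, -9.81])
    pos = np.array([0.0, z0])            # injected at height z0
    vel = np.array([0.0, 0.0])
    wind = np.array([wind_mps, 0.0])
    while pos[1] > 0.0:
        rel = wind - vel                                   # air velocity relative to missile
        drag = 0.5 * rho * cd_a * np.linalg.norm(rel) * rel / m
        vel = vel + (g + drag) * dt
        pos = pos + vel * dt
    return pos[0], np.linalg.norm(vel)

winds = rng.weibull(2.0, size=500) * 60.0                  # sampled wind speeds, m/s
impacts = np.array([fly(w) for w in winds])
print(f"median impact speed: {np.median(impacts[:, 1]):.1f} m/s, "
      f"95th pct range: {np.percentile(impacts[:, 0], 95):.0f} m")
```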
An Independent Evaluation of the FMEA/CIL Hazard Analysis Alternative Study
NASA Technical Reports Server (NTRS)
Ray, Paul S.
1996-01-01
The present instruments of safety and reliability risk control for a majority of National Aeronautics and Space Administration (NASA) programs/projects consist of Failure Mode and Effects Analysis (FMEA), Hazard Analysis (HA), the Critical Items List (CIL), and the Hazard Report (HR). This extensive analytical approach was introduced in the early 1970s and was implemented for the Space Shuttle Program by NHB 5300.4 (1D-2). Since the Challenger accident in 1986, the process has been expanded considerably, resulting in the introduction of similar and/or duplicated activities in safety/reliability risk analysis. A study initiated in 1995 to search for an alternative to the current FMEA/CIL hazard analysis methodology generated a proposed method on April 30, 1996. The objective of this Summer Faculty Study was to participate in and conduct an independent evaluation of the proposed alternative to simplify the present safety and reliability risk control procedure.
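For context, the classical FMEA ranking arithmetic that such alternatives aim to simplify can be stated in a few lines; the scales and failure modes below are generic illustrations, not items from the Shuttle FMEA/CIL.

```python
# Hedged sketch: conventional FMEA Risk Priority Number (RPN) ranking.
# Failure modes and ratings are invented examples.
from dataclasses import dataclass

@dataclass
class FailureMode:
    name: str
    severity: int    # 1 (negligible) .. 10 (catastrophic)
    occurrence: int  # 1 (remote) .. 10 (frequent)
    detection: int   # 1 (certain detection) .. 10 (undetectable)

    @property
    def rpn(self) -> int:
        """Risk Priority Number: the conventional FMEA ranking product."""
        return self.severity * self.occurrence * self.detection

modes = [
    FailureMode("valve fails open", 9, 3, 4),
    FailureMode("sensor drift",     5, 6, 7),
]
for fm in sorted(modes, key=lambda f: f.rpn, reverse=True):
    print(f"{fm.name}: RPN = {fm.rpn}")
```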
NASA Astrophysics Data System (ADS)
Wang, Lei
Natural and human-induced environmental changes have been altering the earth's surface and hydrological processes, and thus directly contribute to the severity of flood hazards. To understand these changes and their impacts, this research developed a GIS-based hydrological and hydraulic modeling system that incorporates state-of-the-art remote sensing data to simulate flooding under various scenarios. The conceptual framework and the technical issues of incorporating multi-scale remote sensing data are addressed. The research develops an object-oriented hydrological modeling framework. Compared with traditional lumped or cell-based distributed hydrological modeling frameworks, the object-oriented framework allows basic spatial hydrologic units to have various sizes and irregular shapes. The framework is capable of assimilating GIS and remotely sensed data with different spatial resolutions, ensuring computational efficiency while preserving sufficient spatial detail in input data and model outputs. Sensitivity analysis and comparison of a high-resolution LiDAR DEM with a traditional USGS 30-m DEM suggest that LiDAR DEMs can greatly reduce uncertainty in the calibration of flow parameters in the hydrologic model and hence increase the reliability of modeling results. In addition, subtle topographic features and hydrologic objects such as surface depressions and detention basins can be extracted from high-resolution LiDAR DEMs, and an innovative algorithm has been developed to efficiently delineate surface depressions and detention basins from them. Using a time series of Landsat images, a retrospective analysis of surface imperviousness was conducted to assess the hydrologic impact of urbanization. The analysis reveals that, with rapid urbanization, the impervious surface of the case-study area increased from 10.1% to 38.4% during 1974-2002. As a result, the peak flow for a 100-year flood event increased by 20% and the floodplain extent expanded by about 21.6%. The quantitative analysis suggests that the large regional detention basins have effectively offset the adverse effect of the increased impervious surface during urbanization. Based on the simulation and scenario analyses of land subsidence and potential climate changes, planning measures and policy implications are derived to guide smart urban growth and sustainable resource development and management so as to minimize flood hazards.
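The depression-delineation step lends itself to a short sketch. The code below implements a generic priority-flood pass that raises interior pits to their spill elevation, a well-known class of algorithm for this task; it is not the dissertation's own algorithm, and the toy DEM is invented.

```python
# Hedged sketch: generic priority-flood depression filling on a small DEM.
import heapq
import numpy as np

def fill_depressions(dem: np.ndarray) -> np.ndarray:
    """Return a DEM with internal depressions raised to their spill elevation."""
    rows, cols = dem.shape
    filled = dem.astype(float).copy()
    visited = np.zeros_like(dem, dtype=bool)
    heap = []
    # Seed the queue with all edge cells, which drain off the grid by definition.
    for r in range(rows):
        for c in range(cols):
            if r in (0, rows - 1) or c in (0, cols - 1):
                heapq.heappush(heap, (filled[r, c], r, c))
                visited[r, c] = True
    while heap:
        z, r, c = heapq.heappop(heap)
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and not visited[nr, nc]:
                visited[nr, nc] = True
                filled[nr, nc] = max(filled[nr, nc], z)   # raise pits to spill level
                heapq.heappush(heap, (filled[nr, nc], nr, nc))
    return filled

dem = np.array([[5, 5, 5, 5], [5, 1, 2, 5], [5, 2, 1, 5], [5, 5, 5, 5]], float)
print(fill_depressions(dem))   # interior pit raised to the 5.0 rim
# Depressions themselves can then be delineated as (filled - dem) > 0.
```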
Application of systems and control theory-based hazard analysis to radiation oncology.
Pawlicki, Todd; Samost, Aubrey; Brown, Derek W; Manger, Ryan P; Kim, Gwe-Ya; Leveson, Nancy G
2016-03-01
Both humans and software are notoriously challenging to account for in traditional hazard analysis models. The purpose of this work is to investigate and demonstrate the application of a new, extended accident causality model, called the systems-theoretic accident model and processes (STAMP), to radiation oncology. Specifically, a hazard analysis technique based on STAMP, system-theoretic process analysis (STPA), is used to perform a hazard analysis. The STPA procedure starts with the definition of high-level accidents for radiation oncology at the medical center and the hazards leading to those accidents. From there, the hierarchical safety control structure of the radiation oncology clinic is modeled, i.e., the controls that are used to prevent accidents and provide effective treatment. Using STPA, unsafe control actions (behaviors) that can lead to the hazards are identified, along with the causal scenarios that can lead to those unsafe control actions. This information can be used to eliminate or mitigate potential hazards. The STPA procedure is demonstrated on a new online adaptive cranial radiosurgery procedure that omits the CT simulation step and uses CBCT for localization and planning and a surface imaging system during treatment. The procedure generated a comprehensive set of causal scenarios that are traced back to system hazards and accidents. Ten control loops were created for the new SRS procedure, covering hospital and department management, treatment design and delivery, and vendor service. Eighty-three unsafe control actions were identified, as well as 472 causal scenarios that could lead to those unsafe control actions. STPA provides a method for understanding the role of management decisions and hospital operations in system safety and for generating process design requirements to prevent hazards and accidents. The interaction of people, hardware, and software is highlighted. STPA produces results that can be used to improve safety and prevent accidents, and it warrants further investigation.
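The enumeration at the heart of STPA applies four guide words to each control action in the safety control structure. The sketch below mechanizes that step for a single, invented control action; it is a schematic illustration, not an item from the paper's 83 identified unsafe control actions.

```python
# Hedged sketch: generating candidate unsafe control actions (UCAs) from the
# four STPA guide words. The control action is an invented example.
from itertools import product

GUIDE_WORDS = (
    "not provided when needed",
    "provided when unsafe",
    "provided too early / too late",
    "stopped too soon / applied too long",
)

control_actions = ["therapist initiates beam delivery"]

for action, guide in product(control_actions, GUIDE_WORDS):
    print(f"UCA candidate: '{action}' {guide}")
```

Each printed candidate would then be assessed against the defined hazards and, if unsafe, expanded into causal scenarios.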
Discussion of methodological issues in conducting benefit-cost analysis, with guidance for selecting and applying the most appropriate and useful mechanisms in benefit-cost analysis of toxic substances, hazardous materials, and solid waste control