Karin Riley; Matthew Thompson; Peter Webley; Kevin D. Hyde
2017-01-01
Modeling has been used to characterize and map natural hazards and hazard susceptibility for decades. Uncertainties are pervasive in natural hazards analysis, including a limited ability to predict where and when extreme events will occur, with what consequences, and driven by what contributing factors. Modeling efforts are challenged by the intrinsic...
NASA Technical Reports Server (NTRS)
Kaplan, Michael L.; Lin, Yuh-Lang
2005-01-01
The purpose of the research was to develop and test improved hazard algorithms that could result in the development of sensors that are better able to anticipate potentially severe atmospheric turbulence, which affects aircraft safety. The research focused on employing numerical simulation models to develop improved algorithms for the prediction of aviation turbulence. This involved producing both research simulations and real-time simulations of environments predisposed to moderate and severe aviation turbulence. The research resulted in the following fundamental advancements toward the aforementioned goal: 1) very high resolution simulations of turbulent environments indicated how predictive hazard indices could be improved, resulting in a candidate hazard index that showed potential for improvement over existing operational indices; 2) a real-time turbulence hazard numerical modeling system was improved by correcting deficiencies in its simulation of moist convection; and 3) the same real-time predictive system was tested by running the code twice daily, and the hazard prediction indices were updated and improved. Additionally, a simple validation study was undertaken to determine how well a real-time hazard predictive index performed when compared with commercial pilot observations of aviation turbulence. Simple statistical analyses performed in this validation study indicated potential skill in employing the hazard prediction index to predict regions of varying intensities of aviation turbulence. Data sets from a research numerical model were provided to NASA for use in a large eddy simulation numerical model. A NASA contractor report and several refereed journal articles were prepared and submitted for publication during the course of this research.
1990-12-31
...health hazards from weapons combustion products, including rockets and missiles, became evident. Research to elucidate significant health effects of... CO/CO2 ratios was low for all but one of the formulations. In general, if the model were to be used in its present state for health risk assessments... Part 2: Modeling for Health Hazard Prediction (Introduction; Results and Discussion).
Risk management and precaution: insights on the cautious use of evidence.
Hrudey, Steve E; Leiss, William
2003-01-01
Risk management, done well, should be inherently precautionary. Adopting an appropriate degree of precaution with respect to feared health and environmental hazards is fundamental to risk management. The real problem is in deciding how precautionary to be in the face of inevitable uncertainties, demanding that we understand the equally inevitable false positives and false negatives from screening evidence. We consider a framework for detection and judgment of evidence of well-characterized hazards, using the concepts of sensitivity, specificity, positive predictive value, and negative predictive value that are well established for medical diagnosis. Our confidence in predicting the likelihood of a true danger inevitably will be poor for rare hazards because of the predominance of false positives; failing to detect a true danger is less likely because false negatives must be rarer than the danger itself. Because most controversial environmental hazards arise infrequently, this truth poses a dilemma for risk management. PMID:14527835
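The diagnostic framework described above reduces to a few lines of arithmetic. The sketch below (Python; the sensitivity, specificity, and prevalence values are illustrative, not taken from the paper) shows why positive predictive value collapses for rare hazards while negative predictive value stays high, which is exactly the dilemma the authors describe.

```python
# Sketch: how prevalence drives predictive values for a screening test.

def ppv(sens: float, spec: float, prev: float) -> float:
    """Positive predictive value: P(true hazard | positive screen)."""
    tp = sens * prev
    fp = (1.0 - spec) * (1.0 - prev)
    return tp / (tp + fp)

def npv(sens: float, spec: float, prev: float) -> float:
    """Negative predictive value: P(no hazard | negative screen)."""
    tn = spec * (1.0 - prev)
    fn = (1.0 - sens) * prev
    return tn / (tn + fn)

# A good screen (95% sensitive, 95% specific) applied to hazards of
# decreasing frequency:
for prev in (0.5, 0.1, 0.01, 0.001):
    print(f"prevalence={prev:<6} PPV={ppv(0.95, 0.95, prev):.3f} "
          f"NPV={npv(0.95, 0.95, prev):.3f}")
# PPV collapses as the hazard becomes rare; NPV remains near 1.
```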
Processing LiDAR Data to Predict Natural Hazards
NASA Technical Reports Server (NTRS)
Fairweather, Ian; Crabtree, Robert; Hager, Stacey
2008-01-01
ELF-Base and ELF-Hazards (wherein 'ELF' signifies 'Extract LiDAR Features' and 'LiDAR' signifies 'light detection and ranging') are developmental software modules for processing remote-sensing LiDAR data to identify past natural hazards (principally, landslides) and predict future ones. ELF-Base processes raw LiDAR data, including LiDAR intensity data that are often ignored in other software, to create digital terrain models (DTMs) and digital feature models (DFMs) with sub-meter accuracy. ELF-Hazards fuses raw LiDAR data, data from multispectral and hyperspectral optical images, and DTMs and DFMs generated by ELF-Base to generate hazard risk maps. Advanced algorithms in these software modules include line-enhancement and edge-detection algorithms, surface-characterization algorithms, and algorithms that implement innovative data-fusion techniques. The line-extraction and edge-detection algorithms enable users to locate such features as faults and landslide headwall scarps. Also implemented in this software are improved methodologies for identification and mapping of past landslide events by use of (1) accurate, ELF-derived surface characterizations and (2) three LiDAR/optical-data-fusion techniques: post-classification data fusion, maximum-likelihood estimation modeling, and hierarchical within-class discrimination. This software is expected to enable faster, more accurate forecasting of natural hazards than has previously been possible.
Characterizing crown fuel distribution for conifers in the interior western United States
Seth Ex; Frederick W. Smith; Tara Keyser
2015-01-01
Canopy fire hazard evaluation is essential for prioritizing fuel treatments and for assessing potential risk to firefighters during suppression activities. Fire hazard is usually expressed as predicted potential fire behavior, which is sensitive to the methodology used to quantitatively describe fuel profiles: methodologies that assume that fuel is distributed...
Evaluating fuel complexes for fire hazard mitigation planning in the southeastern United States
Anne G. Andreu; Dan Shea; Bernard R. Parresol; Roger D. Ottmar
2012-01-01
Fire hazard mitigation planning requires an accurate accounting of fuel complexes to predict potential fire behavior and effects of treatment alternatives. In the southeastern United States, rapid vegetation growth coupled with complex land use history and forest management options requires a dynamic approach to fuel characterization. In this study we assessed...
NASA Technical Reports Server (NTRS)
Kaplan, Michael L.; Lin, Yuh-Lang
2004-01-01
During the grant period, several tasks were performed in support of the NASA Turbulence Prediction and Warning Systems (TPAWS) program. The primary focus of the research was on characterizing the preturbulence environment by developing predictive tools and simulating atmospheric conditions that preceded severe turbulence. The goal of the research was to provide both dynamical understanding of conditions that preceded turbulence and predictive tools in support of operational NASA B-757 turbulence research flights. The advancements in characterizing the preturbulence environment will be applied by NASA to sensor development for predicting turbulence onboard commercial aircraft. Numerical simulations with atmospheric models as well as multi-scale observational analyses provided insights into the environment organizing turbulence in a total of forty-eight specific case studies of severe accident-producing turbulence on commercial aircraft. A paradigm was developed which diagnosed specific atmospheric circulation systems, from the synoptic scale down to the meso-γ scale, that preceded turbulence in both clear air and in proximity to convection. The emphasis was primarily on convective turbulence, as that is the focus of the TPAWS program in terms of developing improved sensors for turbulence warning and avoidance. However, the dynamical paradigm also has applicability to clear-air and mountain turbulence. This dynamical sequence of events was then employed to formulate and test new hazard prediction indices that were first tested in research simulation studies and then further tested in support of the NASA B-757 turbulence research flights. The new hazard characterization algorithms were utilized in a Real Time Turbulence Model (RTTM) that was operationally employed to support the NASA B-757 turbulence research flights. Improvements in the RTTM were implemented in an effort to increase the accuracy of the operational characterization of the preturbulence environment. Additionally, the initial research necessary to create a statistical evaluation scheme for the characterization indices utilized in the RTTM was undertaken. Results of all components of this research were then published in NASA contractor reports and scientific journal papers.
NASA Astrophysics Data System (ADS)
Muhammad, Ario; Goda, Katsuichiro
2018-03-01
This study investigates the impact of model complexity in source characterization and digital elevation model (DEM) resolution on the accuracy of tsunami hazard assessment and fatality estimation through a case study in Padang, Indonesia. Two types of earthquake source models, i.e. complex and uniform slip models, are adopted by considering three resolutions of DEMs, i.e. 150 m, 50 m, and 10 m. For each of the three grid resolutions, 300 complex source models are generated using new statistical prediction models of earthquake source parameters developed from extensive finite-fault models of past subduction earthquakes, whilst 100 uniform slip models are constructed with variable fault geometry without slip heterogeneity. The results highlight that significant changes to tsunami hazard and fatality estimates are observed with regard to earthquake source complexity and grid resolution. Coarse resolution (i.e. 150 m) leads to inaccurate tsunami hazard prediction and fatality estimation, whilst 50-m and 10-m resolutions produce similar results. However, velocity and momentum flux are sensitive to the grid resolution; hence, at least 10-m grid resolution needs to be implemented when considering flow-based parameters for tsunami hazard and risk assessments. In addition, the results indicate that the tsunami hazard parameters and fatality number are more sensitive to the complexity of earthquake source characterization than to the grid resolution. Thus, the uniform models are not recommended for probabilistic tsunami hazard and risk assessments. Finally, the findings confirm that uncertainties of tsunami hazard level and fatality in terms of depth, velocity and momentum flux can be captured and visualized through the complex source modeling approach. From a tsunami risk management perspective, this approach generates large ensembles of simulation data that are useful for making effective and robust decisions.
Liaw, Horng-Jang; Wang, Tzu-Ai
2007-03-06
Flash point is one of the major quantities used to characterize the fire and explosion hazard of liquids. Herein, liquids with dissolved salts are considered in the context of salt-distillation processes for separating close-boiling or azeotropic systems. The addition of salts to a liquid may reduce its fire and explosion hazard. In this study, we have modified a previously proposed model for predicting the flash point of miscible mixtures to extend its application to solvent/salt mixtures. This modified model was verified by comparison with the experimental data for organic solvent/salt and aqueous-organic solvent/salt mixtures to confirm its efficacy in terms of prediction of the flash points of these mixtures. The experimental results confirm markedly greater increases in liquid flash point upon addition of inorganic salts than upon supplementation with equivalent quantities of water. Based on this evidence, it appears reasonable to suggest potential application for the model in assessment of the fire and explosion hazard of solvent/salt mixtures and, further, that addition of inorganic salts may prove useful for hazard reduction in flammable liquids.
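Flash-point models in this line of work are built on a Le Chatelier-type mixing rule: the mixture flash point is the temperature at which the activity-weighted sum of partial-pressure ratios reaches one. Below is a minimal sketch of that idea, assuming a toy Clausius-Clapeyron vapor-pressure form with invented parameters and ideal-solution activity coefficients; the published model instead uses non-ideal activity models and accounts for salt effects.

```python
# Minimal Le Chatelier-type flash-point solver:
#   sum_i x_i * gamma_i * Psat_i(T) / Psat_i(Tfp_i) = 1, solved for T.
# All thermophysical parameters are illustrative placeholders, not vetted data.
import math
from scipy.optimize import brentq

def psat(T, A, B):
    """Toy Clausius-Clapeyron vapor pressure, ln P = A - B/T (T in K)."""
    return math.exp(A - B / T)

# (A, B, pure-component flash point in K) -- hypothetical values
components = [
    (10.0, 3800.0, 285.0),
    (9.5, 4200.0, 330.0),
]
x = [0.4, 0.6]  # liquid mole fractions

def le_chatelier(T):
    total = 0.0
    for xi, (A, B, Tfp) in zip(x, components):
        gamma = 1.0  # ideal-solution assumption; the paper uses activity models
        total += xi * gamma * psat(T, A, B) / psat(Tfp, A, B)
    return total - 1.0  # root at the mixture flash point

T_flash = brentq(le_chatelier, 250.0, 400.0)
print(f"predicted mixture flash point: {T_flash:.1f} K")
```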
Oberdörster, Günter; Graham, Uschi
2018-05-08
Inhalation exposure to elongated cleavage fragments occurring at mineral and rock mining and crushing operations raises important questions regarding potential health effects, given their resemblance to fibers with known adverse health effects such as amphibole asbestos. Thus, a major goal for establishing a toxicity profile for elongate mineral particles (EMPs) is to identify and characterize a suspected hazard and then to characterize risk by jointly examining the results of hazard and exposure assessment. This will require knowledge not only about the biokinetics of inhaled EMPs but also about the underlying mechanisms of effects induced by retained EMPs. In vitro toxicity assays with predictive power for in vivo effects have been established as useful screening tools for toxicological characterization of particulate materials, including EMPs. Important determinants of physiological/toxicological mechanisms are the physico-chemical and functional properties of inhaled particulate materials. Of the physico-chemical (intrinsic) properties, size, shape and surface characteristics are well known to affect toxicological responses; functional properties include (i) solubility/dissolution rate in physiological fluid simulants in vitro and following inhalation in vivo; (ii) ROS-inducing capacity in vitro and in vivo, determined as specific particle surface reactivity; and (iii) bioprocessing in vivo. A key parameter for all of these is the dose and duration of exposure, which requires establishing exposure-dose-response relationships. Examples of studies with fibrous and non-fibrous particles are discussed to illustrate the relevance of evaluating extrinsic and intrinsic particle properties for predicting in vivo responses of new particulate materials. This will allow hazard and risk ranking/grouping based on comparison to toxicologically well-characterized positive and negative benchmarks. Future efforts should be directed at developing and validating new approaches using in vitro (non-animal) studies for establishing a complete risk assessment for EMPs. Further comparative in-depth analyses with analytical and ultra-high-resolution technology examining bioprocessing events at target organ sites have proven highly successful in identifying biotransformations in target cells at the near-atomic level. In the case of EMPs, such analyses can be essential for separating benign particles from harmful ones. Copyright © 2018. Published by Elsevier Inc.
Flash-point prediction for binary partially miscible mixtures of flammable solvents.
Liaw, Horng-Jang; Lu, Wen-Hung; Gerbaud, Vincent; Chen, Chan-Cheng
2008-05-30
Flash point is the most important variable used to characterize fire and explosion hazard of liquids. Herein, partially miscible mixtures are presented within the context of liquid-liquid extraction processes. This paper describes development of a model for predicting the flash point of binary partially miscible mixtures of flammable solvents. To confirm the predictive efficacy of the derived flash points, the model was verified by comparing the predicted values with the experimental data for the studied mixtures: methanol+octane; methanol+decane; acetone+decane; methanol+2,2,4-trimethylpentane; and, ethanol+tetradecane. Our results reveal that immiscibility in the two liquid phases should not be ignored in the prediction of flash point. Overall, the predictive results of this proposed model describe the experimental data well. Based on this evidence, therefore, it appears reasonable to suggest potential application for our model in assessment of fire and explosion hazards, and development of inherently safer designs for chemical processes containing binary partially miscible mixtures of flammable solvents.
Application of high-density data for hazard prediction, safety assessment, and risk characterization
There are long lists of chemicals that require some level of evaluation for safety determination. These include the European Union’s Registration, Evaluation, Authorization and Restriction of Chemical substances (REACH) program, Environment Canada’s existing substances evaluatio...
Integrating asthma hazard characterization methods for consumer products.
Maier, A; Vincent, M J; Gadagbui, B; Patterson, J; Beckett, W; Dalton, P; Kimber, I; Selgrade, M J K
2014-10-01
Despite extensive study, definitive conclusions regarding the relationship between asthma and consumer products remain elusive. Uncertainties reflect the multi-faceted nature of asthma (i.e., contributions of immunologic and non-immunologic mechanisms). Many substances used in consumer products are associated with occupational asthma or asthma-like syndromes. However, risk assessment methods do not adequately predict the potential for consumer product exposures to trigger asthma and related syndromes under lower-level end-user conditions. A decision tree system is required to characterize asthma and respiratory-related hazards associated with consumer products. A system can be built to incorporate the best features of existing guidance, frameworks, and models using a weight-of-evidence (WoE) approach. With this goal in mind, we have evaluated chemical hazard characterization methods for asthma and asthma-like responses. Despite the wealth of information available, current hazard characterization methods do not definitively identify whether a particular ingredient will cause or exacerbate asthma, asthma-like responses, or sensitization of the respiratory tract at lower levels associated with consumer product use. Effective use of hierarchical lines of evidence relies on consideration of the relevance and potency of assays, organization of assays by mode of action, and better assay validation. It is anticipated that the analysis of existing methods will support the development of a refined WoE approach. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
Landslide modeling and forecasting—recent progress by the u.s. geological survey
Baum, Rex L.; Kean, Jason W.
2015-01-01
Landslide studies by the U.S. Geological Survey (USGS) are focused on two main objectives: scientific understanding and forecasting. The first objective is to gain better understanding of the physical processes involved in landslide initiation and movement. This objective is largely in support of the second objective, to develop predictive capabilities to answer the main hazard questions. Answers to the following six questions are needed to characterize the hazard from landslides: (1) Where will landslides occur? (2) What kind(s) of landslides will occur? (3) When will landslides occur? (4) How big will the landslides be? (5) How fast will the landslides travel? (6) How far will the landslides go? Although these questions are sometimes recast in different terms, such as frequency or recurrence rather than timing (when), the questions or their variants address the spatial, physical, and temporal aspects of landslide hazards. Efforts to develop modeling and forecasting capabilities by the USGS are primarily focused on specific landslide types that pose a high degree of hazard and show relatively high potential for predictability.
Monitoring and characterizing natural hazards with satellite InSAR imagery
Lu, Zhong; Zhang, Jixian; Zhang, Yonghong; Dzurisin, Daniel
2010-01-01
Interferometric synthetic aperture radar (InSAR) provides an all-weather imaging capability for measuring ground-surface deformation and inferring changes in land surface characteristics. InSAR enables scientists to monitor and characterize hazards posed by volcanic, seismic, and hydrogeologic processes, by landslides and wildfires, and by human activities such as mining and fluid extraction or injection. Measuring how a volcano’s surface deforms before, during, and after eruptions provides essential information about magma dynamics and a basis for mitigating volcanic hazards. Measuring spatial and temporal patterns of surface deformation in seismically active regions is extraordinarily useful for understanding rupture dynamics and estimating seismic risks. Measuring how landslides develop and activate is a prerequisite to minimizing associated hazards. Mapping surface subsidence or uplift related to extraction or injection of fluids during exploitation of groundwater aquifers or petroleum reservoirs provides fundamental data on aquifer or reservoir properties and improves our ability to mitigate undesired consequences. Monitoring dynamic water-level changes in wetlands improves hydrological modeling predictions and the assessment of future flood impacts. In addition, InSAR imagery can provide near-real-time estimates of fire scar extents and fire severity for wildfire management and control. All-weather satellite radar imagery is critical for studying various natural processes and is playing an increasingly important role in understanding and forecasting natural hazards.
A re-evaluation of PETROTOX for predicting acute and chronic toxicity of petroleum substances.
Redman, Aaron D; Parkerton, Thomas F; Leon Paumen, Miriam; Butler, Josh D; Letinski, Daniel J; den Haan, Klass
2017-08-01
The PETROTOX model was developed to perform aquatic hazard assessment of petroleum substances based on substance composition. The model relies on the hydrocarbon block method, which is widely used for conducting petroleum substance risk assessments, providing further justification for evaluating model performance. Previous work described this model and provided a preliminary calibration and validation using acute toxicity data for a limited number of petroleum substances. The objective of the present study was to re-evaluate PETROTOX using expanded data covering both acute and chronic toxicity endpoints on invertebrates, algae, and fish for a wider range of petroleum substances. The results indicated that recalibration of 2 model parameters was required, namely, the algal critical target lipid body burden and the log octanol-water partition coefficient (KOW) limit, used to account for reduced bioavailability of hydrophobic constituents. Acute predictions from the updated model were compared with observed toxicity data and found to generally be within a factor of 3 for algae and invertebrates but overestimated fish toxicity. Chronic predictions were generally within a factor of 5 of empirical data. Furthermore, PETROTOX predicted acute and chronic hazard classifications that were consistent or conservative in 93% and 84% of comparisons, respectively. The PETROTOX model is considered suitable for the purpose of characterizing petroleum substance hazard in substance classification and risk assessments. Environ Toxicol Chem 2017;36:2245-2252. © 2017 SETAC.
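A toy illustration of the hydrocarbon block idea that PETROTOX builds on: each block's aquatic concentration is converted to toxic units via a target lipid model, then the units are summed (concentration addition). The CTLBB, KOW cutoff, and block data below are placeholders rather than the recalibrated values from the study; the -0.936 slope is the commonly cited target lipid model slope.

```python
# Sketch of toxic-unit summation over hydrocarbon blocks.
import math

CTLBB = 50.0         # critical target lipid body burden (umol/g lipid), assumed
LOG_KOW_LIMIT = 6.0  # constituents above this limit treated as non-bioavailable

# (block water concentration in umol/L, log Kow) -- hypothetical blocks
blocks = [(0.02, 3.2), (0.001, 4.5), (0.05, 6.8)]

toxic_units = 0.0
for conc, log_kow in blocks:
    if log_kow > LOG_KOW_LIMIT:
        continue  # reduced-bioavailability cutoff, the kind of limit recalibrated here
    log_lc50 = math.log10(CTLBB) - 0.936 * log_kow  # target lipid model, umol/L
    toxic_units += conc / 10 ** log_lc50

print(f"sum of toxic units: {toxic_units:.2f} (>1 would suggest acute hazard)")
```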
Mantovani, Alberto; Maranghi, Francesca; La Rocca, Cinzia; Tiboni, Gian Mario; Clementi, Maurizio
2008-09-01
The paper discusses current knowledge and possible research priorities on biomarkers of exposure, effect and susceptibility for potential endocrine activities of agrochemicals (dicarboximides, ethylene bisdithiocarbammates, triazoles, etc.). Possible widespread, multiple-pathway exposure to agrochemicals highlights the need to assess internal exposure of animals or humans, which is the most relevant exposure measure for hazard and risk estimation; however, exposure data should be integrated with early indicators predictive of possible health effects, particularly for vulnerable groups such as mother-child pairs. Research needs include: non-invasive biomarkers for children biomonitoring; novel biomarkers of total exposure to measure the whole endocrine disrupter-related burden; characterization of biomarkers of susceptibility, including the role of markers of nutritional status; anchoring early molecular markers to established toxicological endpoints to support their predictivity; and integrating "omics"-based approaches in a system-toxicology framework. As biomonitoring becomes increasingly important in the environment-and-health scenario, toxicologists can substantially contribute both to the characterization of new biomarkers and to the predictivity assessment and improvement of the existing ones.
NASA Astrophysics Data System (ADS)
Berge-Thierry, C.; Hollender, F.; Guyonnet-Benaize, C.; Baumont, D.; Ameri, G.; Bollinger, L.
2017-09-01
Seismic analysis in the context of nuclear safety in France is currently guided by a purely deterministic approach based on Basic Safety Rule (Règle Fondamentale de Sûreté) RFS 2001-01 for seismic hazard assessment, and on the ASN/2/01 Guide that provides design rules for nuclear civil engineering structures. After the 2011 Tohoku earthquake, nuclear operators worldwide were asked to estimate the ability of their facilities to sustain extreme seismic loads. The French licensees then defined the 'hard core' seismic levels, which are higher than those considered for design or re-assessment of the safety of a facility. These were initially established on a deterministic basis and have finally been justified through state-of-the-art probabilistic seismic hazard assessments. The appreciation and propagation of uncertainties when assessing seismic hazard in France have changed considerably over the past 15 years. This evolution provided the motivation for the present article, the objectives of which are threefold: (1) to provide a description of the current practices in France to assess seismic hazard in terms of nuclear safety; (2) to discuss and highlight the sources of uncertainties and their treatment; and (3) to use a specific case study to illustrate how extended source modeling can help to constrain the key assumptions or parameters that impact upon seismic hazard assessment. This article discusses in particular seismic source characterization, strong ground motion prediction, and maximal magnitude constraints, according to the practice of the French Atomic Energy Commission. Due to growth in strong-motion databases in terms of the number and quality of records, their metadata, and uncertainty characterization, several recently published empirical ground motion prediction models are eligible for seismic hazard assessment in France. We show that propagation of epistemic and aleatory uncertainties is feasible in a deterministic approach as well as in a probabilistic one. Assessment of seismic hazard in France in the framework of the safety of nuclear facilities should consider these recent advances. In this sense, the opening of discussions with all of the stakeholders in France to update the reference documents (i.e., RFS 2001-01; ASN/2/01 Guide) appears appropriate in the short term.
Hashida, Masahiro; Kamezaki, Ryousuke; Goto, Makoto; Shiraishi, Junji
2017-03-01
The ability to predict hazards in possible situations in a general X-ray examination room created for Kiken-Yochi training (KYT) was quantified by use of free-response receiver-operating characteristic (FROC) analysis to determine whether the total number of years of clinical experience, involvement in general X-ray examinations, occupation, and training each have an impact on hazard prediction ability. Twenty-three radiological technologists (RTs) (years of experience: 2-28), four nurses (years of experience: 15-19), and six RT students observed 53 scenes of KYT: 26 scenes with hazardous points (points that might cause injury to patients) and 27 scenes without such points. Based on the results of these observations, we calculated the alternative free-response receiver-operating characteristic (AFROC) curve and the figure of merit (FOM) to quantify the hazard prediction ability. The results showed that the total number of years of clinical experience did not have any impact on hazard prediction ability, whereas recent experience with general X-ray examinations greatly influenced this ability. In addition, the hazard prediction ability varied depending on the occupations of the observers while they were observing the same scenes in KYT. The hazard prediction ability of the radiologic technology students was improved after they had undergone patient safety training. This proposed method with an FROC observer study enabled the quantification and evaluation of the hazard prediction capability, and the application of this approach to clinical practice may help to ensure the safety of examinations and treatment in the radiology department.
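A simplified sketch of the figure-of-merit computation behind an AFROC-type analysis: the FOM can be read as the probability that a rating assigned to a true hazardous point exceeds the highest false-positive rating on a hazard-free scene. The ratings below are invented, and a real JAFROC analysis handles multiple marks per scene with dedicated software; this only shows the core Wilcoxon-style statistic.

```python
import numpy as np

# hypothetical observer ratings (0-100 confidence that a hazard is present)
hazard_ratings = np.array([80, 65, 90, 55, 70])   # marks on true hazardous points
fp_max_per_normal_scene = np.array([30, 60, 45])  # highest rating per hazard-free scene

def jafroc_like_fom(lesion, fp_max):
    """Fraction of (hazard, scene) pairs where the hazard rating wins; ties count half."""
    wins = (lesion[:, None] > fp_max[None, :]).sum()
    ties = (lesion[:, None] == fp_max[None, :]).sum()
    return (wins + 0.5 * ties) / (lesion.size * fp_max.size)

print(f"figure of merit: {jafroc_like_fom(hazard_ratings, fp_max_per_normal_scene):.3f}")
```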
Computational Approaches to Chemical Hazard Assessment
Luechtefeld, Thomas; Hartung, Thomas
2018-01-01
Computational prediction of toxicity has reached new heights as a result of decades of growth in the magnitude and diversity of biological data. Public packages for statistics and machine learning make model creation faster. New theory in machine learning and cheminformatics enables integration of chemical structure, toxicogenomics, simulated and physical data in the prediction of chemical health hazards, and other toxicological information. Our earlier publications have characterized a toxicological dataset of unprecedented scale resulting from the European REACH legislation (Registration Evaluation Authorisation and Restriction of Chemicals). These publications dove into potential use cases for regulatory data and some models for exploiting this data. This article analyzes the options for the identification and categorization of chemicals, moves on to the derivation of descriptive features for chemicals, discusses different kinds of targets modeled in computational toxicology, and ends with a high-level perspective of the algorithms used to create computational toxicology models. PMID:29101769
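The workflow the article describes (identify chemicals, derive descriptive features, train a model against a hazard target) can be sketched in a few lines. The example below uses RDKit descriptors and a random forest; the SMILES strings and 0/1 hazard labels are synthetic placeholders, not REACH data, and a real model would use far more compounds and descriptors.

```python
# Sketch: structure -> descriptors -> hazard model.
from rdkit import Chem
from rdkit.Chem import Descriptors
from sklearn.ensemble import RandomForestClassifier

smiles = ["CCO", "c1ccccc1", "CC(=O)O", "CCCCCCCC", "CCN(CC)CC", "O=C(O)c1ccccc1"]
labels = [0, 1, 0, 1, 0, 1]  # placeholder hazard calls, for illustration only

def featurize(smi):
    """Derive a small descriptor vector from a SMILES string."""
    mol = Chem.MolFromSmiles(smi)
    return [Descriptors.MolWt(mol), Descriptors.MolLogP(mol),
            Descriptors.TPSA(mol), Descriptors.NumRotatableBonds(mol)]

X = [featurize(s) for s in smiles]
model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, labels)
print(model.predict_proba([featurize("CCCO")]))  # hazard probability for a new chemical
```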
NASA Astrophysics Data System (ADS)
Greco, Roberto; Pagano, Luca
2017-12-01
To manage natural risks, an increasing effort is being put into the development of early warning systems (EWS), namely, approaches that confront catastrophic phenomena through timely forecasting and the spreading of alarms throughout the exposed population. Research efforts aimed at the development and implementation of effective EWS should especially concern the definition and calibration of the interpretative model. This paper analyses the main features characterizing predictive models working in EWS by discussing their aims and their features in terms of model accuracy, the evolutionary stage of the phenomenon at which the prediction is carried out, and model architecture. Original classification criteria based on these features are developed throughout the paper and shown in their practical implementation through examples of flow-like landslides and earth flows, both of which are characterized by rapid evolution and are quite representative of many applications of EWS.
The Impact Hazard in the Context of Other Natural Hazards and Predictive Science
NASA Astrophysics Data System (ADS)
Chapman, C. R.
1998-09-01
The hazard due to impact of asteroids and comets has been recognized as analogous, in some ways, to other infrequent but consequential natural hazards (e.g. floods and earthquakes). Yet, until recently, astronomers and space agencies have felt no need to do what their colleagues and analogous agencies must do in order to assess, quantify, and communicate predictions to those with a practical interest in them (e.g. public officials who must assess the threats, prepare for mitigation, etc.). Recent heightened public interest in the impact hazard, combined with increasing numbers of "near misses" (certain to increase as Spaceguard is implemented), requires that astronomers accept the responsibility to place their predictions and assessments in terms that may be appropriately considered. I will report on preliminary results of a multi-year GSA/NCAR study of "Prediction in the Earth Sciences: Use and Misuse in Policy Making" in which I have represented the impact hazard, while others have treated earthquakes, floods, weather, global climate change, nuclear waste disposal, acid rain, etc. The impact hazard presents an end-member example of a natural hazard, helping those dealing with more prosaic issues to learn from an extreme. On the other hand, I bring to the astronomical community some lessons long adopted in other cases: the need to understand the policy purposes of impact predictions, the need to assess potential societal impacts, the requirement to very carefully assess prediction uncertainties, considerations of potential public uses of the predictions, awareness of ethical considerations (e.g. conflicts of interest) that affect predictions and acceptance of predictions, awareness of appropriate means for publicly communicating predictions, and considerations of the international context (especially for a hazard that knows no national boundaries).
Airborne Systems Technology Application to the Windshear Threat
NASA Technical Reports Server (NTRS)
Arbuckle, P. Douglas; Lewis, Michael S.; Hinton, David A.
1996-01-01
The general approach and products of the NASA/FAA Airborne Windshear Program conducted by NASA Langley Research Center are summarized, with references provided for the major technical contributions. During this period, NASA conducted 2 years of flight testing to characterize forward-looking sensor performance. The NASA/FAA Airborne Windshear Program was divided into three main elements: Hazard Characterization, Sensor Technology, and Flight Management Systems. Simulation models developed under the Hazard Characterization element are correlated with flight test data. Flight test results comparing the performance and characteristics of the various Sensor Technologies (microwave radar, lidar, and infrared) are presented. Most of the activities in the Flight Management Systems element were conducted in simulation. Simulation results from a study evaluating windshear crew procedures and displays for forward-looking sensor-equipped airplanes are discussed. NASA Langley researchers participated heavily in the FAA process of generating certification guidelines for predictive windshear detection systems. NASA participants felt that more valuable technology products were generated by the program because of this interaction. NASA involvement in the process and the resulting impact on products and technology transfer are discussed in this paper.
High-Throughput Models for Exposure-Based Chemical ...
The United States Environmental Protection Agency (U.S. EPA) must characterize potential risks to human health and the environment associated with manufacture and use of thousands of chemicals. High-throughput screening (HTS) for biological activity allows the ToxCast research program to prioritize chemical inventories for potential hazard. Similar capabilities for estimating exposure potential would support rapid risk-based prioritization for chemicals with limited information; here, we propose a framework for high-throughput exposure assessment. To demonstrate application, an analysis was conducted that predicts human exposure potential for chemicals and estimates uncertainty in these predictions by comparison to biomonitoring data. We evaluated 1936 chemicals using far-field mass balance human exposure models (USEtox and RAIDAR) and an indicator for indoor and/or consumer use. These predictions were compared to exposures inferred by Bayesian analysis from urine concentrations for 82 chemicals reported in the National Health and Nutrition Examination Survey (NHANES). Joint regression on all factors provided a calibrated consensus prediction, the variance of which serves as an empirical determination of uncertainty for prioritization on absolute exposure potential. Information on use was found to be most predictive; generally, chemicals above the limit of detection in NHANES had consumer/indoor use. Coupled with hazard HTS, exposure HTS can place risk earlie...
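A minimal sketch of the consensus-calibration step described above: regress biomonitoring-inferred exposures on the far-field model predictions and a use indicator, and treat the residual variance as the empirical uncertainty used for prioritization. All data here are synthetic stand-ins, not NHANES values or the paper's model outputs.

```python
# Sketch: calibrated consensus exposure prediction via joint regression.
import numpy as np

rng = np.random.default_rng(0)
n = 82
log_usetox = rng.normal(-6, 1.5, n)             # far-field model prediction (log units)
log_raidar = log_usetox + rng.normal(0, 0.5, n)  # second model, correlated with the first
indoor_use = rng.integers(0, 2, n).astype(float)  # indoor/consumer-use indicator
log_inferred = (0.6 * log_usetox + 0.1 * log_raidar + 1.2 * indoor_use
                + rng.normal(0, 1.0, n))        # exposures inferred from biomonitoring

X = np.column_stack([np.ones(n), log_usetox, log_raidar, indoor_use])
beta, *_ = np.linalg.lstsq(X, log_inferred, rcond=None)
resid = log_inferred - X @ beta
print("calibration coefficients:", np.round(beta, 2))
print("empirical prediction SD (log units):", round(resid.std(ddof=X.shape[1]), 2))
```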
Spatial prediction of landslide hazard using discriminant analysis and GIS
Peter V. Gorsevski; Paul Gessler; Randy B. Foltz
2000-01-01
Environmental attributes relevant for spatial prediction of landslides triggered by rain and snowmelt events were derived from a digital elevation model (DEM). Those data, in conjunction with statistical methods and a geographic information system (GIS), provided a detailed basis for spatial prediction of landslide hazard. The spatial prediction of landslide hazard in this paper is...
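A compact sketch of this kind of analysis: discriminant analysis on DEM-derived terrain attributes, with the posterior class probability serving as the hazard-map value for each cell. The attribute distributions below are synthetic placeholders standing in for real GIS layers.

```python
# Sketch: discriminant analysis for landslide hazard mapping.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(1)
n = 200
slope = np.r_[rng.normal(30, 8, n), rng.normal(15, 8, n)]         # degrees
wetness = np.r_[rng.normal(8, 2, n), rng.normal(5, 2, n)]         # topographic wetness index
curvature = np.r_[rng.normal(-0.2, 0.5, n), rng.normal(0.1, 0.5, n)]
y = np.r_[np.ones(n), np.zeros(n)]                                # 1 = mapped landslide cell

X = np.column_stack([slope, wetness, curvature])
lda = LinearDiscriminantAnalysis().fit(X, y)
# posterior landslide probability for every cell becomes the hazard map
hazard = lda.predict_proba(X)[:, 1]
print(f"mean predicted probability in landslide cells: {hazard[y == 1].mean():.2f}")
```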
Initial source and site characterization studies for the U.C. Santa Barbara campus
DOE Office of Scientific and Technical Information (OSTI.GOV)
Archuleta, R.; Nicholson, C.; Steidl, J.
1997-12-01
The University of California Campus-Laboratory Collaboration (CLC) project is an integrated 3-year effort involving Lawrence Livermore National Laboratory (LLNL) and four UC campuses - Los Angeles (UCLA), Riverside (UCR), Santa Barbara (UCSB), and San Diego (UCSD) - plus additional collaborators at San Diego State University (SDSU), at Los Alamos National Laboratory and in industry. The primary purpose of the project is to estimate potential ground motions from large earthquakes and to predict site-specific ground motions for one critical structure on each campus. This project thus combines the disciplines of geology, seismology, geodesy, soil dynamics, and earthquake engineering into a more fully integrated approach. Once completed, the CLC project will provide a template to evaluate other buildings at each of the four UC campuses, as well as provide a methodology for evaluating seismic hazards at other critical sites in California, including other UC locations at risk from large earthquakes. Another important objective of the CLC project is the education of students and other professionals in the application of this integrated, multidisciplinary, state-of-the-art approach to the assessment of earthquake hazard. For each campus targeted by the CLC project, the seismic hazard study will consist of four phases: Phase I - Initial source and site characterization; Phase II - Drilling, logging, seismic monitoring, and laboratory dynamic soil testing; Phase III - Modeling of predicted site-specific earthquake ground motions; and Phase IV - Calculations of 3D building response. This report covers Phase I for the UCSB campus and includes results up through March 1997.
de la Cruz, Elba; Fournier, María Luisa; García, Fernando; Molina, Andrea; Chavarría, Guadalupe; Alfaro, Margarita; Ramírez, Fernando; Rodríguez, César
2014-01-01
Antibiotics alter the homeostasis of microbial communities and select for antibiotic-resistant bacteria in the wild. Thus, the accumulation of unnaturally high concentrations of these substances in the environment due to their use in human activities can be regarded as a neglected form of pollution, especially in countries with agricultural-based economies. Qualitative and quantitative information on antibiotic usage in Costa Rica is scarce, hence the design and enforcement of prevention strategies and corrective measures are difficult. To address this issue, and aiming in the long run to contribute to a more rational use of pharmaceuticals in the tropics, we characterized the hazard associated with the antibiotics used during 2008 in agriculture, aquaculture, pig farming, veterinary medicine and human medicine in the major irrigation district of Costa Rica. Hazard indicators were calculated based on antibiotic use and a weighted algorithm that also considered antibiotic fate, toxicity, and resistance. Moreover, hazard quotients were computed using maximum environmental concentrations reported for Costa Rican surface waters and predicted no-effect concentrations for aquatic organisms. The antibiotics used in the ATID during the study numbered 38, from 15 families. Antibiotic consumption was estimated at 1169-109908 g ha(-1) year(-1) and, distinctively, almost half of this figure was traced back to phenicols. Tetracyclines, with a particular contribution of oxytetracycline, were the most widely used antibiotics in agriculture and veterinary medicine. Oxytetracycline, florfenicol, chlortetracycline, sulfamethoxazole, erythromycin, ciprofloxacin, enrofloxacin, sulfamethazine, trimethoprim and tylosin, in that order, showed the highest hazard indicators. Moreover, hazard quotients greater than 1 were calculated for oxacillin, doxycycline, oxytetracycline, sulfamethazine, and ciprofloxacin. Studies dealing with the ecotoxicology of tetracyclines, sulfonamides and quinolones, as well as surveys of phenicol resistance among environmental bacteria, should be prioritized in Costa Rica.
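The hazard-quotient screen used in the study reduces to a simple ratio, HQ = maximum environmental concentration / predicted no-effect concentration, with HQ > 1 flagging a potential risk. A sketch with placeholder concentrations (not the paper's measurements):

```python
# Sketch of a hazard-quotient screen: HQ = MEC / PNEC.
antibiotics = {
    #                   MEC (ug/L), PNEC (ug/L) -- illustrative values only
    "oxytetracycline": (1.3, 0.3),
    "sulfamethazine":  (0.6, 0.2),
    "erythromycin":    (0.05, 0.2),
}

for name, (mec, pnec) in antibiotics.items():
    hq = mec / pnec
    flag = "POTENTIAL RISK" if hq > 1 else "ok"
    print(f"{name:16s} HQ = {hq:4.1f}  {flag}")
```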
NEL, ANDRE; XIA, TIAN; MENG, HUAN; WANG, XIANG; LIN, SIJIE; JI, ZHAOXIA; ZHANG, HAIYUAN
2014-01-01
Conspectus The production of engineered nanomaterials (ENMs) is a scientific breakthrough in material design and the development of new consumer products. While the successful implementation of nanotechnology is important for the growth of the global economy, we also need to consider the possible environmental health and safety (EHS) impact as a result of the novel physicochemical properties that could generate hazardous biological outcomes. In order to assess ENM hazard, reliable and reproducible screening approaches are needed to test the basic materials as well as nano-enabled products. A platform is required to investigate the potentially endless number of bio-physicochemical interactions at the nano/bio interface, in response to which we have developed a predictive toxicological approach. We define a predictive toxicological approach as the use of mechanisms-based high throughput screening in vitro to make predictions about the physicochemical properties of ENMs that may lead to the generation of pathology or disease outcomes in vivo. The in vivo results are used to validate and improve the in vitro high throughput screening (HTS) and to establish structure-activity relationships (SARs) that allow hazard ranking and modeling by an appropriate combination of in vitro and in vivo testing. This notion is in agreement with the landmark 2007 report from the US National Academy of Sciences, “Toxicity Testing in the 21st Century: A Vision and a Strategy” (http://www.nap.edu/catalog.php?record_id=11970), which advocates increased efficiency of toxicity testing by transitioning from qualitative, descriptive animal testing to quantitative, mechanistic and pathway-based toxicity testing in human cells or cell lines using high throughput approaches. Accordingly, we have implemented HTS approaches to screen compositional and combinatorial ENM libraries to develop hazard ranking and structure-activity relationships that can be used for predicting in vivo injury outcomes. This predictive approach allows the bulk of the screening analysis and high volume data generation to be carried out in vitro, following which limited, but critical, validation studies are carried out in animals or whole organisms. Risk reduction in the exposed human or environmental populations can then focus on limiting or avoiding exposures that trigger these toxicological responses as well as implementing safer design of potentially hazardous ENMs. In this communication, we review the tools required for establishing predictive toxicology paradigms to assess inhalation and environmental toxicological scenarios through the use of compositional and combinatorial ENM libraries, mechanism-based HTS assays, hazard ranking and development of nano-SARs. We will discuss the major injury paradigms that have emerged based on specific ENM properties, as well as describing the safer design of ZnO nanoparticles based on characterization of dissolution chemistry as a major predictor of toxicity. PMID:22676423
Savonitto, Stefano; Morici, Nuccia; Nozza, Anna; Cosentino, Francesco; Perrone Filardi, Pasquale; Murena, Ernesto; Morocutti, Giorgio; Ferri, Marco; Cavallini, Claudio; Eijkemans, Marinus Jc; Stähli, Barbara E; Schrieks, Ilse C; Toyama, Tadashi; Lambers Heerspink, H J; Malmberg, Klas; Schwartz, Gregory G; Lincoff, A Michael; Ryden, Lars; Tardif, Jean Claude; Grobbee, Diederick E
2018-01-01
The aim of this study was to define the predictors of long-term mortality in patients with type 2 diabetes mellitus and recent acute coronary syndrome. A total of 7226 patients from a randomized trial, testing the effect on cardiovascular outcomes of the dual peroxisome proliferator-activated receptor agonist aleglitazar in patients with type 2 diabetes mellitus and recent acute coronary syndrome (AleCardio trial), were analysed. Median follow-up was 2 years. The independent mortality predictors were defined using Cox regression analysis. The predictive information provided by each variable was calculated as a percentage of the total chi-square of the model. All-cause mortality was 4.0%, with cardiovascular death contributing 73% of mortality. The mortality prediction model included N-terminal proB-type natriuretic peptide (adjusted hazard ratio = 1.68; 95% confidence interval = 1.51-1.88; 27% of prediction), lack of coronary revascularization (hazard ratio = 2.28; 95% confidence interval = 1.77-2.93; 18% of prediction), age (hazard ratio = 1.04; 95% confidence interval = 1.02-1.05; 15% of prediction), heart rate (hazard ratio = 1.02; 95% confidence interval = 1.01-1.03; 10% of prediction), glycated haemoglobin (hazard ratio = 1.11; 95% confidence interval = 1.03-1.19; 8% of prediction), haemoglobin (hazard ratio = 1.01; 95% confidence interval = 1.00-1.02; 8% of prediction), prior coronary artery bypass (hazard ratio = 1.61; 95% confidence interval = 1.11-2.32; 7% of prediction) and prior myocardial infarction (hazard ratio = 1.40; 95% confidence interval = 1.05-1.87; 6% of prediction). In patients with type 2 diabetes mellitus and recent acute coronary syndrome, mortality prediction is largely dominated by markers of cardiac, rather than metabolic, dysfunction.
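The modeling step described here, Cox proportional hazards regression with covariate effects reported as adjusted hazard ratios, can be sketched with the lifelines package. Since the AleCardio data are not public, the example uses lifelines' bundled Rossi recidivism dataset as a stand-in cohort.

```python
# Sketch: fit a Cox model and report adjusted hazard ratios per covariate.
import numpy as np
from lifelines import CoxPHFitter
from lifelines.datasets import load_rossi

df = load_rossi()  # columns: week (duration), arrest (event), covariates
cph = CoxPHFitter()
cph.fit(df, duration_col="week", event_col="arrest")

hazard_ratios = np.exp(cph.params_)  # exponentiated coefficients = adjusted HRs
print(hazard_ratios.round(2))
# cph.summary also reports 95% confidence intervals and p-values, analogous
# to the HR (95% CI) figures quoted in the abstract.
```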
Flooding Fragility Experiments and Prediction
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Curtis L.; Tahhan, Antonio; Muchmore, Cody
2016-09-01
This report describes the work that has been performed on flooding fragility, both the experimental tests being carried out and the probabilistic fragility predictive models being produced in order to use the test results. Flooding experiments involving full-scale doors have commenced in the Portal Evaluation Tank. The goal of these experiments is to develop a full-scale component flooding experiment protocol and to acquire data that can be used to create Bayesian regression models representing the fragility of these components. This work is in support of the Risk-Informed Safety Margin Characterization (RISMC) Pathway external hazards evaluation research and development.
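A minimal sketch of the kind of fragility model these experiments are meant to inform: failure probability as a lognormal function of flood depth, fit by maximum likelihood. The depth/failure observations are synthetic, and a full RISMC-style treatment would use Bayesian regression (posterior distributions over the parameters) rather than the point estimate shown here.

```python
# Sketch: lognormal fragility curve P(failure | depth) fit by MLE.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(7)
depth = rng.uniform(0.1, 2.0, 60)                     # flood depth at the door (m)
true_p = norm.cdf((np.log(depth) - np.log(0.9)) / 0.4)
failed = rng.random(60) < true_p                      # True = door leaked/failed

def neg_log_lik(theta):
    mu, log_sigma = theta
    p = norm.cdf((np.log(depth) - mu) / np.exp(log_sigma))
    p = np.clip(p, 1e-9, 1 - 1e-9)
    return -np.sum(failed * np.log(p) + (~failed) * np.log(1 - p))

fit = minimize(neg_log_lik, x0=[0.0, 0.0])
median_capacity = np.exp(fit.x[0])  # depth at 50% failure probability
print(f"fitted median failure depth: {median_capacity:.2f} m")
```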
NASA Technical Reports Server (NTRS)
Tai, H.; Wilson, J. W.; Maiden, D. L.
2003-01-01
The atmospheric ionizing radiation (AIR) ER-2 preflight analysis, one of the first attempts to obtain a relatively complete measurement set of the high-altitude radiation level environment, is described in this paper. The primary thrust is to characterize the atmospheric radiation and to define dose levels at high-altitude flight. A secondary thrust is to develop and validate dosimetric techniques and monitoring devices for protecting aircrews. With a few chosen routes, we can measure the experimental results and validate the AIR model predictions. Eventually, as more measurements are made, we gain more understanding about the hazardous radiation environment and acquire more confidence in the prediction models.
Crundall, David; Kroll, Victoria
2018-05-18
Can hazard perception testing be useful for the emergency services? Previous research has found emergency response drivers (ERDs) to perform better than controls; however, these studies used clips of normal driving. In contrast, the current study filmed footage from a fire appliance on blue-light training runs through Nottinghamshire, and endeavoured to discriminate between different groups of ERDs based on experience and collision risk. Thirty clips were selected to create two variants of the hazard perception test: a traditional push-button test requiring speeded responses to hazards, and a prediction test that occludes at hazard onset and provides four possible outcomes for participants to choose between. Three groups of fire-appliance drivers (novices, low-risk experienced and high-risk experienced) and age-matched controls undertook both tests. The hazard perception test only discriminated between controls and all fire-appliance drivers, whereas the hazard prediction test was more sensitive, discriminating between high- and low-risk experienced fire-appliance drivers. Eye movement analyses suggest that the low-risk drivers were better at prioritising the hazardous precursors, leading to better predictive accuracy. These results pave the way for future assessment and training tools to supplement emergency response driver training, while supporting the growing literature that identifies hazard prediction as a more robust measure of driver safety than traditional hazard perception tests. Copyright © 2018 Elsevier Ltd. All rights reserved.
The objectives of the Household Hazardous Waste Characterization Study (the HHW Study) were to: 1) Quantify the annual household hazardous waste (HHW) tonnages disposed in Palm Beach County Florida’s (the County) residential solid waste (characterized in this study as municipal s...
NASA Astrophysics Data System (ADS)
Li, Deying; Yin, Kunlong; Gao, Huaxi; Liu, Changchun
2009-10-01
Although the Three Gorges Dam across the Yangtze River in China harnesses a huge potential source of hydroelectric power and reduces the loss of life and damage caused by floods, it also creates environmental problems, such as geo-hazards, due to the large rise and fluctuation of the water level. To prevent and predict geo-hazards, the establishment of a geo-hazard prediction system is necessary. To implement the functions of regional and urban geo-hazard prediction, single geo-hazard prediction, prediction of landslide surge, and risk evaluation, the logical layers of the system consist of a data-capturing layer, a data manipulation and processing layer, an analysis and application layer, and an information publication layer. Because multi-source spatial data are involved, research on the transformation and fusion of multi-source data was carried out. The applicability of the system was tested on the spatial prediction of landslide hazard through GIS spatial analysis, in which the information value method was applied to identify areas susceptible to future landslides on the basis of the historical record of past landslides, terrain parameters, geology, rainfall and anthropogenic activity. Detailed discussion was carried out on the spatial distribution characteristics of landslide hazard in the new town of Badong. These results can be used for risk evaluation. The system can be implemented as an early-warning and emergency management tool by the relevant authorities of the Three Gorges Reservoir in the future.
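The information value method mentioned above has a simple closed form: for each class of each predictive layer, IV = ln of the ratio between the class's share of landslide cells and its share of all cells, and a cell's susceptibility score is the sum of IVs across layers. A sketch with invented counts:

```python
# Sketch of the information value method for one predictive layer.
import math

total_cells, landslide_cells = 100_000, 800

# per-class counts for one layer (e.g. slope classes): (cells, landslide cells)
slope_classes = {"0-15 deg": (50_000, 100), "15-30 deg": (35_000, 350),
                 ">30 deg": (15_000, 350)}

for name, (cells, slides) in slope_classes.items():
    iv = math.log((slides / landslide_cells) / (cells / total_cells))
    print(f"{name:>9s}: IV = {iv:+.2f}")
# positive IV marks classes over-represented among past landslides;
# summing IVs over all layers ranks cells by susceptibility.
```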
Wieczorek, Gerald F.; Snyder, James B.; Borchers, James W.; Reichenbach, Paola
2007-01-01
Since 1857, several hundred rockfalls, rockslides, and debris flows have been observed in Yosemite National Park. At 12:45 a.m. on December 26, 2003, a severe winter storm triggered a rockfall west of Glacier Point in Yosemite Valley. Rock debris moved quickly eastward down Staircase Falls toward Curry Village. As the rapidly moving rock mass reached talus at the bottom of Staircase Falls, smaller pieces of flying rock penetrated occupied cabins. Physical characterization of the rockfall site included rockfall volume, joint patterns affecting initial release of rock and the travel path of rockfall, factors affecting weathering and weakening of bedrock, and hydrology affecting slope stability within joints. Although time return intervals are not predictable, a three-dimensional rockfall model was used to assess future rockfall potential and risk. Predictive rockfall and debris-flow methods suggest that landslide hazards beneath these steep cliffs extend farther than impact ranges defined from surface talus in Yosemite Valley, leaving some park facilities vulnerable.
Tracking Temporal Hazard in the Human Electroencephalogram Using a Forward Encoding Model
2018-01-01
Human observers automatically extract temporal contingencies from the environment and predict the onset of future events. Temporal predictions are modeled by the hazard function, which describes the instantaneous probability for an event to occur given it has not occurred yet. Here, we tackle the question of whether and how the human brain tracks continuous temporal hazard on a moment-to-moment basis, and how flexibly it adjusts to strictly implicit variations in the hazard function. We applied an encoding-model approach to human electroencephalographic data recorded during a pitch-discrimination task, in which we implicitly manipulated temporal predictability of the target tones by varying the interval between cue and target tone (i.e. the foreperiod). Critically, temporal predictability either was driven solely by the passage of time (resulting in a monotonic hazard function) or was modulated to increase at intermediate foreperiods (resulting in a modulated hazard function with a peak at the intermediate foreperiod). Forward-encoding models trained to predict the recorded EEG signal from different temporal hazard functions were able to distinguish between experimental conditions, showing that implicit variations of temporal hazard bear tractable signatures in the human electroencephalogram. Notably, this tracking signal was reconstructed best from the supplementary motor area, underlining this area’s link to cognitive processing of time. Our results underline the relevance of temporal hazard to cognitive processing and show that the predictive accuracy of the encoding-model approach can be utilized to track abstract time-resolved stimuli. PMID:29740594
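The quantity being tracked has a simple discrete form: h(t) = P(event at t | it has not yet occurred). The sketch below contrasts a uniform foreperiod distribution (monotonically rising hazard) with one weighted toward the intermediate foreperiod (modulated hazard), mirroring the two conditions described; the probability vectors are illustrative.

```python
# Sketch: discrete hazard functions for two foreperiod distributions.
import numpy as np

def hazard(p):
    """Discrete hazard: h_i = p_i / (1 - sum_{j<i} p_j)."""
    survival = 1.0 - np.concatenate([[0.0], np.cumsum(p)[:-1]])
    return p / survival

uniform = np.full(5, 0.2)                       # equally likely foreperiods
peaked = np.array([0.1, 0.15, 0.5, 0.15, 0.1])  # mass at the middle foreperiod

print("monotonic hazard:", hazard(uniform).round(2))  # rises toward 1
print("modulated hazard:", hazard(peaked).round(2))   # peaks at the middle foreperiod
```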
A statistical framework for applying RNA profiling to chemical hazard detection.
Kostich, Mitchell S
2017-12-01
Use of 'omics technologies in environmental science is expanding. However, application is mostly restricted to characterizing molecular steps leading from toxicant interaction with molecular receptors to apical endpoints in laboratory species. Use in environmental decision-making is limited, due to difficulty in elucidating mechanisms in sufficient detail to make quantitative outcome predictions in any single species or in extending predictions to aquatic communities. Here we introduce a mechanism-agnostic statistical approach, supplementing mechanistic investigation by allowing probabilistic outcome prediction even when understanding of molecular pathways is limited, and facilitating extrapolation from results in laboratory test species to predictions about aquatic communities. We use concepts familiar to environmental managers, supplemented with techniques employed for clinical interpretation of 'omics-based biomedical tests. We describe the framework in step-wise fashion, beginning with single test replicates of a single RNA variant, then extending to multi-gene RNA profiling, collections of test replicates, and integration of complementary data. In order to simplify the presentation, we focus on using RNA profiling for distinguishing presence versus absence of chemical hazards, but the principles discussed can be extended to other types of 'omics measurements, multi-class problems, and regression. We include a supplemental file demonstrating many of the concepts using the open source R statistical package. Published by Elsevier Ltd.
Up-to-date Probabilistic Earthquake Hazard Maps for Egypt
NASA Astrophysics Data System (ADS)
Gaber, Hanan; El-Hadidy, Mahmoud; Badawy, Ahmed
2018-04-01
An up-to-date earthquake hazard analysis has been performed in Egypt using a probabilistic seismic hazard approach. Through the current study, we use a complete and homogeneous earthquake catalog covering the time period between 2200 BC and 2015 AD. Three seismotectonic models representing the seismic activity in and around Egypt are used. A logic-tree framework is applied to allow for the epistemic uncertainty in the declustering parameters, minimum magnitude, seismotectonic setting and ground-motion prediction equations. The hazard analysis is performed on a 0.5° × 0.5° grid for rock-site conditions, for peak ground acceleration (PGA) and spectral acceleration at 0.2-, 0.5-, 1.0- and 2.0-s periods. The hazard is estimated for three return periods (72, 475 and 2475 years), corresponding to 50, 10 and 2% probability of exceedance in 50 years. The uniform hazard spectra for the cities of Cairo, Alexandria, Aswan and Nuwbia are constructed. The hazard maps show that the highest ground acceleration values are expected in the northeastern part of Egypt around the Gulf of Aqaba (PGA up to 0.4 g for a 475-year return period) and in south Egypt around the city of Aswan (PGA up to 0.2 g for a 475-year return period). The Western Desert of Egypt is characterized by the lowest level of hazard (PGA lower than 0.1 g for a 475-year return period).
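The correspondence between return periods and exceedance probabilities quoted above follows from a Poisson occurrence assumption: P(exceedance in t years) = 1 - exp(-t/T). A quick check in Python:

```python
# Sketch of the return-period arithmetic behind the quoted hazard levels.
import math

def prob_exceedance(t_years, return_period):
    return 1.0 - math.exp(-t_years / return_period)

def return_period(t_years, prob):
    return -t_years / math.log(1.0 - prob)

for T in (72, 475, 2475):
    print(f"T = {T:4d} yr -> {prob_exceedance(50, T):5.1%} in 50 yr")
# e.g. 10% in 50 years corresponds to a ~475-year return period:
print(round(return_period(50, 0.10)))  # ~475
```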
NASA Astrophysics Data System (ADS)
Staley, Dennis; Negri, Jacquelyn; Kean, Jason
2016-04-01
Population expansion into fire-prone steeplands has resulted in an increase in post-fire debris-flow risk in the western United States. Logistic regression methods for determining debris-flow likelihood and the calculation of empirical rainfall intensity-duration thresholds for debris-flow initiation represent two common approaches for characterizing hazard and reducing risk. Logistic regression models are currently being used to rapidly assess debris-flow hazard in response to design storms of known intensities (e.g. a 10-year recurrence interval rainstorm). Empirical rainfall intensity-duration thresholds comprise a major component of the United States Geological Survey (USGS) and National Weather Service (NWS) debris-flow early warning system at a regional scale in southern California. However, these two modeling approaches remain independent, with each approach having limitations that do not allow for synergistic local-scale (e.g. drainage-basin scale) characterization of debris-flow hazard during intense rainfall. The current logistic regression equations treat rainfall as a separate independent variable, which prevents the direct calculation of the relation between rainfall intensity and debris-flow likelihood. Regional (e.g. mountain range or physiographic province scale) rainfall intensity-duration thresholds fail to provide insight into the basin-scale variability of post-fire debris-flow hazard and require an extensive database of historical debris-flow occurrence and rainfall characteristics. Here, we present a new approach that combines traditional logistic regression and intensity-duration threshold methodologies. This method allows for the local characterization of the likelihood that a debris flow will occur at a given rainfall intensity, the direct calculation of the rainfall rates that will result in a given likelihood, and the calculation of spatially explicit rainfall intensity-duration thresholds for debris-flow generation in recently burned areas. Our approach synthesizes the two methods by incorporating measured rainfall intensity into each model variable (based on measures of topographic steepness, burn severity and surface properties) within the logistic regression equation. This approach provides a more realistic representation of the relation between rainfall intensity and debris-flow likelihood, as likelihood values asymptotically approach zero as rainfall intensity approaches 0 mm/h, and increase with more intense rainfall. Model performance was evaluated by comparing predictions to several existing regional thresholds. The model, based upon training data collected in southern California, USA, has proven to accurately predict rainfall intensity-duration thresholds for other areas in the western United States not included in the original training dataset. In addition, the improved logistic regression model shows promise for emergency planning purposes and real-time, site-specific early warning. With further validation, this model may permit the prediction of spatially explicit intensity-duration thresholds for debris-flow generation in areas where empirically derived regional thresholds do not exist. This improvement would permit the expansion of the early-warning system into other regions susceptible to post-fire debris flow.
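A minimal sketch of the combined form described above, with purely illustrative coefficients: rainfall intensity multiplies each terrain/burn predictor inside the logistic equation, so the model can be inverted for the intensity threshold at any target likelihood.

```python
import numpy as np

def likelihood(intensity, beta0, betas, x):
    """Debris-flow likelihood when every terrain/burn predictor is scaled by
    rainfall intensity I (mm/h): p = sigmoid(beta0 + I * sum(beta_i * x_i))."""
    z = beta0 + intensity * np.dot(betas, x)
    return 1.0 / (1.0 + np.exp(-z))

def intensity_threshold(p_target, beta0, betas, x):
    """Invert the model: rainfall intensity that yields a target likelihood."""
    return (np.log(p_target / (1.0 - p_target)) - beta0) / np.dot(betas, x)

beta0 = -3.6                          # illustrative intercept (keeps p near 0 as I -> 0)
betas = np.array([0.41, 0.67, 0.70])  # illustrative weights: steepness, burn severity, soils
x = np.array([0.3, 0.5, 0.25])        # basin-specific predictor values

print(likelihood(0.0, beta0, betas, x))           # ~0.03: likelihood vanishes without rain
print(intensity_threshold(0.5, beta0, betas, x))  # ~5.7 mm/h gives 50% likelihood here
```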
King, Michael; Marston, Louise; Švab, Igor; Maaroos, Heidi-Ingrid; Geerlings, Mirjam I.; Xavier, Miguel; Benjamin, Vicente; Torres-Gonzalez, Francisco; Bellon-Saameno, Juan Angel; Rotar, Danica; Aluoja, Anu; Saldivia, Sandra; Correa, Bernardo; Nazareth, Irwin
2011-01-01
Background Little is known about the risk of progression to hazardous alcohol use in people currently drinking at safe limits. We aimed to develop a prediction model (predictAL) for the development of hazardous drinking in safe drinkers. Methods A prospective cohort study of adult general practice attendees in six European countries and Chile followed up over 6 months. We recruited 10,045 attendees from April 2003 to February 2005. 6193 European and 2462 Chilean attendees recorded AUDIT scores below 8 in men and 5 in women at recruitment and were used in modelling risk. 38 risk factors were measured to construct a risk model for the development of hazardous drinking using stepwise logistic regression. The model was corrected for overfitting and tested in an external population. The main outcome was hazardous drinking defined by an AUDIT score ≥8 in men and ≥5 in women. Results 69.0% of attendees were recruited, of whom 89.5% participated again after six months. The risk factors in the final predictAL model were sex, age, country, baseline AUDIT score, panic syndrome and lifetime alcohol problem. The predictAL model's average c-index across all six European countries was 0.839 (95% CI 0.805, 0.873). The Hedges' g effect size for the difference in log odds of predicted probability between safe drinkers in Europe who subsequently developed hazardous alcohol use and those who did not was 1.38 (95% CI 1.25, 1.51). External validation of the algorithm in Chilean safe drinkers resulted in a c-index of 0.781 (95% CI 0.717, 0.846) and Hedges' g of 0.68 (95% CI 0.57, 0.78). Conclusions The predictAL risk model for development of hazardous consumption in safe drinkers compares favourably with risk algorithms for disorders in other medical settings and can be a useful first step in prevention of alcohol misuse. PMID:21853028
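For readers unfamiliar with the two reported metrics, the following sketch (synthetic data, not the study's) computes a c-index and a Hedges' g on the log-odds scale:

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
# Hypothetical predicted probabilities of progressing to hazardous drinking.
p_cases    = rng.beta(4, 6, size=200)   # attendees who progressed
p_controls = rng.beta(2, 8, size=800)   # attendees who did not

y_true = np.r_[np.ones(200), np.zeros(800)]
y_prob = np.r_[p_cases, p_controls]

# For a binary outcome, the c-index equals the area under the ROC curve.
print("c-index:", roc_auc_score(y_true, y_prob))

# Hedges' g on the log-odds scale, as reported in the study.
log_odds = np.log(y_prob / (1 - y_prob))
a, b = log_odds[:200], log_odds[200:]
n1, n2 = len(a), len(b)
sp = np.sqrt(((n1 - 1) * a.var(ddof=1) + (n2 - 1) * b.var(ddof=1)) / (n1 + n2 - 2))
g = (a.mean() - b.mean()) / sp * (1 - 3 / (4 * (n1 + n2) - 9))  # small-sample correction
print("Hedges' g:", g)
```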
Williams, Denita; Castleman, Jennifer; Lee, Chi-Ching; Mote, Beth; Smith, Mary Alice
2009-11-01
One-third of the annual cases of listeriosis in the United States occur during pregnancy and can lead to miscarriage or stillbirth, premature delivery, or infection of the newborn. Previous risk assessments completed by the Food and Drug Administration/the Food Safety Inspection Service of the U.S. Department of Agriculture/the Centers for Disease Control and Prevention (FDA/USDA/CDC) and the Food and Agricultural Organization/the World Health Organization (FAO/WHO) were based on dose-response data from mice. Recent animal studies using nonhuman primates and guinea pigs have both estimated LD50s of approximately 10^7 Listeria monocytogenes colony forming units (cfu). The FAO/WHO estimated a human LD50 of 1.9 × 10^6 cfu based on data from a pregnant woman consuming contaminated soft cheese. We reevaluated risk based on dose-response curves from pregnant rhesus monkeys and guinea pigs. Using standard risk assessment methodology including hazard identification, exposure assessment, hazard characterization, and risk characterization, risk was calculated based on the new dose-response information. To compare models, we looked at mortality rate per serving at predicted doses ranging from 10^-4 to 10^12 L. monocytogenes cfu. Based on a serving of 10^6 L. monocytogenes cfu, the primate model predicts a death rate of 5.9 × 10^-1, compared to the FDA/USDA/CDC (fig. IV-12) predicted rate of 1.3 × 10^-7. Based on the guinea pig and primate models, the mortality rate calculated by the FDA/USDA/CDC is underestimated for this susceptible population.
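As an illustration of how such per-serving mortality estimates are typically produced, here is a sketch of the standard exponential dose-response form, with r calibrated to the ~10^7 cfu LD50 quoted above (this is not the study's fitted primate or guinea pig model):

```python
import math

def mortality_exponential(dose_cfu: float, r: float) -> float:
    """Exponential dose-response model: probability of death per serving."""
    return 1.0 - math.exp(-r * dose_cfu)

# Calibrate r so the LD50 matches the ~1e7 cfu estimate from the animal studies.
r = math.log(2) / 1e7

for dose in (1e4, 1e6, 1e7, 1e9):
    print(f"dose {dose:.0e} cfu -> mortality {mortality_exponential(dose, r):.2e}")
```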
Empirical calibration of a roadside hazardousness index for Spanish two-lane rural roads.
Pardillo-Mayora, José M; Domínguez-Lira, Carlos A; Jurado-Piña, Rafael
2010-11-01
Crash records and roadside data from Spanish two-lane rural roads were analyzed to study the effect of roadside configuration on safety. Four indicators were used to characterize the main roadside features that influence the consequences of roadway departures: roadside slope, distance of non-traversable obstacles from the roadway edge, safety barrier installation, and alignment. Based on the analysis of the effect of roadside configuration on the frequency and severity of run-off-road injury crashes, a categorical roadside hazardousness scale was defined. Cluster analysis was applied to group the combinations of the four indicators into categories with homogeneous effects on run-off-road injury crash frequency and severity. As a result, a 5-level Roadside Hazardousness Index (RHI) was defined. RHI can be used as a reference to normalize the collection of roadside safety-related information. The index can also be used as a variable for including roadside condition information in multivariate crash prediction models. © 2010 Elsevier Ltd. All rights reserved.
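A sketch of the general idea, assuming hypothetical indicator encodings and using scikit-learn's k-means as a stand-in for the clustering method actually used:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
# Hypothetical encodings of the four roadside indicators for 300 road segments.
segments = np.column_stack([
    rng.integers(1, 5, 300),   # roadside slope category
    rng.integers(1, 4, 300),   # obstacle distance from roadway edge (class)
    rng.integers(0, 2, 300),   # safety barrier installed (0/1)
    rng.integers(1, 4, 300),   # alignment category
]).astype(float)

# Group indicator combinations into five homogeneous hazardousness levels.
kmeans = KMeans(n_clusters=5, n_init=10, random_state=0).fit(segments)
rhi = kmeans.labels_ + 1       # 5-level index, 1..5
print(np.bincount(rhi)[1:])    # number of segments per RHI level
```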
Enhanced Weather Radar (EWxR) System
NASA Technical Reports Server (NTRS)
Kronfeld, Kevin M. (Technical Monitor)
2003-01-01
An airborne weather radar system, the Enhanced Weather Radar (EWxR), with enhanced on-board weather radar data processing was developed and tested. The system features additional weather data that is uplinked from ground-based sources, specialized data processing, and limited automatic radar control to search for hazardous weather. National Weather Service (NWS) ground-based Next Generation Radar (NEXRAD) information is used by the EWxR system to augment the on-board weather radar information. The system will simultaneously display NEXRAD and on-board weather radar information in a split-view format. The on-board weather radar includes an automated or hands-free storm-finding feature that optimizes the radar returns by automatically adjusting the tilt and range settings for the current altitude above the terrain and searches for storm cells near the atmospheric 0-degree isotherm. A rule-based decision aid was developed to automatically characterize cells as hazardous, possibly-hazardous, or non-hazardous based upon attributes of that cell. Cell attributes are determined based on data from the on-board radar and from ground-based radars. A flight path impact prediction algorithm was developed to help pilots to avoid hazardous weather along their flight plan and their mission. During development the system was tested on the NASA B757 aircraft and final tests were conducted on the Rockwell Collins Sabreliner.
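A toy version of such a rule-based decision aid is sketched below; the cell attributes and thresholds are illustrative placeholders, not the EWxR rules:

```python
from dataclasses import dataclass

@dataclass
class StormCell:
    reflectivity_dbz: float       # peak on-board radar return
    nexrad_vil: float             # uplinked vertically integrated liquid (kg/m^2)
    top_above_isotherm_km: float  # cell-top height above the 0-degree isotherm

def classify(cell: StormCell) -> str:
    """Toy rule-based characterization of a cell; thresholds are invented."""
    if cell.reflectivity_dbz >= 50 and cell.nexrad_vil >= 30:
        return "hazardous"
    if cell.reflectivity_dbz >= 40 or cell.top_above_isotherm_km > 2.0:
        return "possibly-hazardous"
    return "non-hazardous"

print(classify(StormCell(55.0, 42.0, 3.1)))   # hazardous
print(classify(StormCell(43.0, 12.0, 0.5)))   # possibly-hazardous
```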
Whitney, John W.; O'Leary, Dennis W.
1993-01-01
Tectonic characterization of a potential high-level nuclear waste repository at Yucca Mountain, Nevada, is needed to assess seismic and possible volcanic hazards that could affect the site during the preclosure period (the next 100 years) and the behavior of the hydrologic system during the postclosure period (the following 10,000 years). Tectonic characterization is based on assembling mapped geological structures in their chronological order of development and activity, and interpreting their dynamic interrelationships. The addition of mechanistic models and kinematic explanations for the identified tectonic processes provides one or more tectonic models having predictive power. Proper evaluation and application of tectonic models can aid in seismic design and help anticipate the probable occurrence of future geologic events of significance to the repository and its design.
The paper discusses measurement issues relating to the characterization of organic emissions from hazardous waste incineration processes under EPA's new risk burn guidance. The recently published draft guidance recommends that hazardous waste combustion facilities complete a mass...
Borcherdt, Roger D.
2012-01-01
VS30, defined as the average seismic shear-wave velocity from the surface to a depth of 30 meters, has found widespread use as a parameter to characterize site response for simplified earthquake-resistant design as implemented in building codes worldwide. VS30, as initially introduced by the author for the US 1994 NEHRP Building Code, provides unambiguous definitions of site classes and site coefficients for site-dependent response spectra based on correlations derived from extensive borehole logging and comparative ground-motion measurement programs in California. Subsequent use of VS30 for the development of strong ground motion prediction equations (GMPEs) and measurement of extensive sets of VS borehole data have confirmed the previous empirical correlations and established correlations of VS30 with VSZ at other depths. These correlations provide closed-form expressions to predict VS30 at a large number of additional sites and further justify VS30 as a parameter to characterize site response for simplified building codes, GMPEs, ShakeMap, and seismic hazard mapping.
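For reference, VS30 is the travel-time average over the top 30 m, VS30 = 30 / Σ(d_i / v_i); a small sketch with a hypothetical borehole log:

```python
def vs30(layers):
    """Time-averaged shear-wave velocity over the top 30 m.
    `layers` is a list of (thickness_m, vs_m_per_s) from the surface down."""
    remaining, travel_time = 30.0, 0.0
    for thickness, vs in layers:
        d = min(thickness, remaining)
        travel_time += d / vs
        remaining -= d
        if remaining <= 0:
            break
    if remaining > 0:
        raise ValueError("borehole log shallower than 30 m")
    return 30.0 / travel_time

# Hypothetical borehole log: 5 m of soft soil over stiffer sediments and rock.
print(vs30([(5.0, 180.0), (12.0, 320.0), (20.0, 620.0)]))  # ~348 m/s, near the NEHRP C/D boundary
```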
Automatic Hazard Detection for Landers
NASA Technical Reports Server (NTRS)
Huertas, Andres; Cheng, Yang; Matthies, Larry H.
2008-01-01
Unmanned planetary landers to date have landed 'blind'; that is, without the benefit of onboard landing hazard detection and avoidance systems. This constrains landing site selection to very benign terrain, which in turn constrains the scientific agenda of missions. State-of-the-art Entry, Descent, and Landing (EDL) technology can land a spacecraft on Mars somewhere within a 20-100 km landing ellipse. Landing ellipses are very likely to contain hazards such as craters, discontinuities, steep slopes, and large rocks that can cause mission-fatal damage. We briefly review sensor options for landing hazard detection and identify a perception approach based on stereo vision and shadow analysis that addresses the broadest set of missions. Our approach fuses stereo vision and monocular shadow-based rock detection to maximize spacecraft safety. We summarize performance models for slope estimation and rock detection within this approach and validate those models experimentally. Instantiating our model of rock detection reliability for Mars predicts that this approach can reduce the probability of failed landing by at least a factor of 4 in any given terrain. We also describe a rock detector/mapper applied to large, high-resolution images from the Mars Reconnaissance Orbiter (MRO) for landing site characterization and selection for Mars missions.
Rail-highway crossing hazard prediction : research results
DOT National Transportation Integrated Search
1979-12-01
This document presents techniques for constructing and evaluating railroad grade crossing hazard indexes. Hazard indexes are objective formulas for comparing or ranking crossings according to relative hazard or for calculating absolute hazard (co...
Artificialized land characteristics and sediment connectivity explain muddy flood hazard in Wallonia
NASA Astrophysics Data System (ADS)
de Walque, Baptiste; Bielders, Charles; Degré, Aurore; Maugnard, Alexandre
2017-04-01
Muddy flood occurrence is an off-site erosion problem of growing interest in Europe, in particular in the loess belt and Condroz regions of Wallonia (Belgium). In order to assess the probability of occurrence of muddy floods in specific places, a muddy flood hazard prediction model has been built. It was used to test 11 different explanatory variables in simple and multiple logistic regression approaches. A database of 442 muddy flood-affected sites and an equal number of homologous non-flooded sites was used. For each site, relief, land use, sediment production and sediment connectivity of the contributing area were extracted. To assess the prediction quality of the model, we proceeded to a validation using 48 new pairs of homologous sites. Based on the Akaike Information Criterion (AIC), we determined that the best muddy flood hazard assessment model requires a total of 6 explanatory variables as inputs: the spatial aggregation of the artificialized land, the sediment connectivity, the artificialized land proximity to the outlet, the proportion of artificialized land, the mean slope, and the Gravelius index of compactness of the contributing area. The artificialized land properties listed above substantially improved the model quality (p-values from 10^-10 to 10^-4). All three properties were negatively correlated with the muddy flood hazard. These results highlight the importance of considering artificialized land characteristics in sediment transport assessment models. Indeed, artificialized land such as roads may dramatically deviate flows and influence connectivity in the landscape. Besides the artificialized land properties, the sediment connectivity showed significant explanatory power (p-value of 10^-11), with a positive correlation between sediment connectivity and muddy flood hazard ranging from 0.3 to 0.45 depending on the sediment connectivity index. Several studies have already highlighted the importance of this parameter in characterizing sediment transport in the landscape. Using the best muddy flood probability-of-occurrence threshold value of 0.49, the validation of the best multiple logistic regression resulted in a prediction quality of 75.6% (original dataset) and 81.2% (secondary dataset). The developed statistical model could be used as a reliable tool to target muddy flood mitigation measures at the sites with the highest muddy flood hazard.
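A compact sketch of AIC-driven selection for a multiple logistic regression, using synthetic stand-in data and statsmodels (the variable names mirror the six predictors above, but the data and effects are invented):

```python
import numpy as np
import statsmodels.api as sm
from itertools import combinations

rng = np.random.default_rng(2)
n = 442 * 2
# Hypothetical standardized predictors for flooded / non-flooded contributing areas.
names = ["aggregation", "connectivity", "outlet_proximity", "artif_fraction",
         "mean_slope", "gravelius"]
X = rng.normal(size=(n, len(names)))
logits = 0.9 * X[:, 1] - 0.6 * X[:, 2] + 0.5 * X[:, 4]   # toy data-generating model
y = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(float)

# Exhaustively score every variable subset by AIC (feasible for 6 candidates).
best = None
for k in range(1, len(names) + 1):
    for subset in combinations(range(len(names)), k):
        design = sm.add_constant(X[:, subset])
        aic = sm.Logit(y, design).fit(disp=0).aic
        if best is None or aic < best[0]:
            best = (aic, subset)

print("best AIC:", round(best[0], 1), "variables:", [names[i] for i in best[1]])
```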
Characterizing the nature and variability of avalanche hazard in western Canada
NASA Astrophysics Data System (ADS)
Shandro, Bret; Haegeli, Pascal
2018-04-01
The snow and avalanche climate types maritime, continental and transitional are well established and have been used extensively to characterize the general nature of avalanche hazard at a location, study inter-seasonal and large-scale spatial variabilities and provide context for the design of avalanche safety operations. While researchers and practitioners have an experience-based understanding of the avalanche hazard associated with the three climate types, no studies have described the hazard character of an avalanche climate in detail. Since the 2009/2010 winter, the consistent use of the Statham et al. (2017) conceptual model of avalanche hazard in public avalanche bulletins in Canada has created a new quantitative record of avalanche hazard that offers novel opportunities for addressing this knowledge gap. We identified typical daily avalanche hazard situations using self-organizing maps (SOMs) and then calculated seasonal prevalence values of these situations. This approach produces a concise characterization that is conducive to statistical analyses, but still provides a comprehensive picture that is informative for avalanche risk management due to its link to avalanche problem types. Hazard situation prevalence values for individual seasons, elevation bands and forecast regions provide unprecedented insight into the inter-seasonal and spatial variability of avalanche hazard in western Canada.
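A sketch of the SOM-plus-prevalence idea using the third-party MiniSom package and synthetic daily hazard encodings (the feature layout and grid size are our assumptions, not the study's):

```python
import numpy as np
from minisom import MiniSom   # third-party package: pip install minisom

rng = np.random.default_rng(3)
# Hypothetical daily hazard encodings: one row per forecast day, with columns such as
# avalanche-problem presence, likelihood, and expected size per elevation band.
days = rng.random((1500, 8))

som = MiniSom(4, 4, input_len=8, sigma=1.0, learning_rate=0.5, random_seed=0)
som.train_random(days, num_iteration=5000)

# Each SOM node is a "typical hazard situation"; prevalence = fraction of days mapped to it.
node_ids = [som.winner(d) for d in days]
prevalence = {node: node_ids.count(node) / len(days) for node in set(node_ids)}
print(sorted(prevalence.items(), key=lambda kv: -kv[1])[:5])
```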
NASA Astrophysics Data System (ADS)
Mohammad, R.; Ramsey, M.; Scheidt, S. P.
2010-12-01
Prior to mineral dust deposition affecting albedo, aerosols can have direct and indirect effects on local to regional scale climate by changing both the shortwave and longwave radiative forcing. In addition, mineral dust causes health hazards (such as respiratory-related illnesses and deaths), loss of agricultural soil, and safety hazards to aviation and motorists due to reduced visibility. Previous work utilized satellite and ground-based thermal infrared (TIR) data to describe the direct longwave radiative effect of the Saharan Air Layer (SAL) over the Atlantic Ocean originating from dust storms in the Western Sahara. TIR emission spectroscopy was used to identify the spectral absorption features of that dust. The current research focuses on Kuwait and utilizes a comprehensive set of spatial, analytical and geological tools to characterize dust emissions and their radiative effects. Surface mineral composition maps for the Kuwait region were created using ASTER images and GIS datasets in order to identify the possible sources of wind-blown dust. Backward trajectory analysis using the Hybrid Single-Particle Lagrangian Integrated Trajectory (HYSPLIT) model suggests the dust source areas were located in Iraq, Syria, Jordan and Saudi Arabia. Samples collected from two dust storms (May and July 2010) were analyzed for their mineral composition and to validate the dust source areas identified by the modeling and remote sensing analysis. These air-fall dust samples were collected in glass containers on a 13-meter-high rooftop in the suburb of Rumaithiya in Kuwait. Additional samples will be collected to expand the analysis, and their chemical compositions will be characterized by a combination of laboratory X-ray fluorescence (XRF), scanning electron microscopy (SEM) and TIR emission spectroscopy. The overarching objective of this ongoing research is both to characterize the effects of mineral dust on climate and to establish a predictive tool that can identify dust storm sources and potentially aid in establishing a more accurate prediction and warning system in the Middle East region.
Risk is the combination of hazard and exposure. Risk characterization at UST release sites has traditionally emphasized hazard (presence of residual fuel) with little attention to exposure. Exposure characterization is often limited to a one-dimensional model such as the RBCA equa...
Inanloo, Bahareh; Tansel, Berrin
2015-06-01
The aim of this research was to investigate accidental releases of ammonia following an en-route incident, in an attempt to better predict the consequences of hazardous cargo accidents. The air dispersion model Areal Locations of Hazardous Atmospheres (ALOHA) was employed to track the probable outcomes of a hazardous material release from a tanker truck under different explosion scenarios. The identification of the flammable zones was taken into consideration, in case the flammable vapor causes an explosion. The impacted areas and the severity of the probable destruction were evaluated for an explosion by considering the overpressure waves. ALOHA in conjunction with ArcGIS was used to delineate the flammable and overpressure impact zones for different scenarios. Based on the results, flammable fumes formed in oval shapes with a major axis along the wind direction at the time of release. The downwind extents of the impact areas exceeding the overpressure value that can lead to property damage were estimated at around 1708 and 1206 feet for a 2-ton release, and 3742 and 3527 feet for a 20-ton release, under very stable and unstable atmospheric conditions, respectively. A sensitivity analysis was done to assess the significance of wind speed on the impact zones. The insight provided by this study can be utilized by decision makers in the transportation of hazardous materials as a guide for possible rerouting, rescheduling, or limiting the quantity of hazardous cargo to reduce the possible impacts of hazardous cargo accidents during transport. Copyright © 2015 Elsevier Ltd. All rights reserved.
Rodríguez-Cano, Rubén; López-Durán, Ana; Martínez-Vispo, Carmela; Martínez, Úrsula; Fernández Del Río, Elena; Becoña, Elisardo
2016-12-01
Diverse studies have found a relation between alcohol consumption and smoking relapse. Few studies have analyzed the relation of smoking relapse with pretreatment alcohol consumption and gender differences. The main purpose of this study is to analyze the influence of alcohol consumption on smoking relapse over 12 months (3-, 6-, and 12-month follow-ups) and to determine possible gender differences. The sample included 374 smokers who quit smoking by participating in a psychological smoking cessation treatment. We assessed hazardous pretreatment alcohol drinking (AUDIT), cigarette consumption (FTND; number of cigarettes) and sociodemographic variables. Higher scores on hazardous pretreatment alcohol drinking predict smoking relapse at 3, 6, and 12 months after smoking cessation. In males, higher scores on hazardous pretreatment alcohol drinking predict relapse at 6 and at 12 months. In females, higher scores on hazardous pretreatment alcohol drinking predict tobacco relapse at 3 months. Hazardous pretreatment alcohol drinking predicts relapse at all intervals after smoking cessation (3-, 6-, and 12-month follow-ups). However, the influence of hazardous pretreatment alcohol drinking on smoking relapse differs as a function of gender, as it is a short-term predictor in women (3 months) and a long-term predictor in men (6 and 12 months). Copyright © 2016 Elsevier Inc. All rights reserved.
Redman, A D; Butler, J D; Letinski, D J; Di Toro, D M; Leon Paumen, M; Parkerton, T F
2018-05-01
Solid-phase microextraction fibers coated with polydimethylsiloxane (PDMS) provide a convenient passive sampling format to characterize bioavailability of petroleum substances. Hydrocarbons absorb onto PDMS in proportion to both freely dissolved concentrations and partitioning properties of the individual constituents, which parallels the mechanistic basis used to predict aquatic toxicity in the PETROTOX model. When deployed in a non-depletive manner, combining SPME with thermal desorption and quantification using gas chromatography-flame ionization creates a biomimetic extraction (BE) procedure that has the potential to simplify aquatic hazard assessments of petroleum substances since the total moles of all hydrocarbons sorbed to the fiber can be related to toxic thresholds in target lipid of aquatic organisms. The objective of this work is to describe the technical basis for applying BE measurements to predict toxicity of petroleum substances. Critical BE-based PDMS concentrations corresponding to adverse effects were empirically derived from toxicity tests on different petroleum substances with multiple test species. The resulting species sensitivity distribution (SSD) of PDMS effect concentrations was then compared and found consistent with the previously reported target lipid-based SSD. Further, BE data collected on samples of aqueous media dosed with a wide range of petroleum substances were highly correlated to predicted toxic units derived using the PETROTOX model. These findings provide justification for applying BE in environmental hazard and risk evaluations of petroleum substances and related mixtures. Copyright © 2018 Elsevier Ltd. All rights reserved.
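As an illustration of the SSD step, the following sketch (invented data) fits a log-normal species sensitivity distribution to hypothetical critical fiber concentrations and derives an HC5 benchmark:

```python
import numpy as np
from scipy import stats

# Hypothetical critical PDMS-fiber concentrations at which different
# species show effects, pooled across petroleum substances.
effect_conc = np.array([18., 25., 31., 40., 55., 62., 80., 95., 120., 150.])

# Fit a log-normal species sensitivity distribution (SSD).
mu, sigma = np.log(effect_conc).mean(), np.log(effect_conc).std(ddof=1)

# HC5: concentration hazardous to 5% of species, a common regulatory benchmark.
hc5 = np.exp(stats.norm.ppf(0.05, loc=mu, scale=sigma))
print(f"HC5 ~ {hc5:.1f} (same units as the input concentrations)")
```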
Assessment and prediction of debris-flow hazards
Wieczorek, Gerald F.; ,
1993-01-01
Study of debris-flow geomorphology and initiation mechanism has led to better understanding of debris-flow processes. This paper reviews how this understanding is used in current techniques for assessment and prediction of debris-flow hazards.
NASA Technical Reports Server (NTRS)
Estes, Sue M.
2009-01-01
The Public Health application area focuses on Earth science applications to public health and safety, particularly regarding infectious disease, emergency preparedness and response, and environmental health issues. The application explores issues of toxic and pathogenic exposure, as well as natural and man-made hazards and their effects, for risk characterization/mitigation and improvements to health and safety. The program elements of the NASA Applied Sciences Program are: Agricultural Efficiency, Air Quality, Climate, Disaster Management, Ecological Forecasting, Water Resources, Weather, and Public Health.
Gustavsson, Mikael B; Hellohf, Andreas; Backhaus, Thomas
2017-05-15
Registration dossiers for 11,678 industrial chemicals were retrieved from the database of the European Chemicals Agency, of which 3566 provided a numerical entry for the corresponding predicted no-effect concentration for the freshwater environment (PNEC). A distribution-based examination of 2244 of these entries reveals that the average PNEC of an industrial chemical in Europe is 238 nmol/L, covering a span of 9 orders of magnitude. A comparison with biocides, pesticides, pharmaceuticals and WFD-priority pollutants reveals that, on average, industrial chemicals are the least hazardous (hazard ranking: industrial chemicals ≪ pharmaceuticals
Bellón, Juan Ángel; de Dios Luna, Juan; King, Michael; Nazareth, Irwin; Motrico, Emma; GildeGómez-Barragán, María Josefa; Torres-González, Francisco; Montón-Franco, Carmen; Sánchez-Celaya, Marta; Díaz-Barreiros, Miguel Ángel; Vicens, Catalina; Moreno-Peral, Patricia
2017-01-01
Background Little is known about the risk of progressing to hazardous alcohol use in abstinent or low-risk drinkers. Aim To develop and validate a simple brief risk algorithm for the onset of hazardous alcohol drinking (HAD) over 12 months for use in primary care. Design and setting Prospective cohort study in 32 health centres from six Spanish provinces, with evaluations at baseline, 6 months, and 12 months. Method Forty-one risk factors were measured and multilevel logistic regression and inverse probability weighting were used to build the risk algorithm. The outcome was new occurrence of HAD during the study, as measured by the AUDIT. Results From the lists of 174 GPs, 3954 adult abstinent or low-risk drinkers were recruited. The ‘predictAL-10’ risk algorithm included just nine variables (10 questions): province, sex, age, cigarette consumption, perception of financial strain, having ever received treatment for an alcohol problem, childhood sexual abuse, AUDIT-C, and interaction AUDIT-C*Age. The c-index was 0.886 (95% CI = 0.854 to 0.918). The optimal cutoff had a sensitivity of 0.83 and specificity of 0.80. Excluding childhood sexual abuse from the model (the ‘predictAL-9’), the c-index was 0.880 (95% CI = 0.847 to 0.913), sensitivity 0.79, and specificity 0.81. There was no statistically significant difference between the c-indexes of predictAL-10 and predictAL-9. Conclusion The predictAL-10/9 is a simple and internally valid risk algorithm to predict the onset of hazardous alcohol drinking over 12 months in primary care attendees; it is a brief tool that is potentially useful for primary prevention of hazardous alcohol drinking. PMID:28360074
Hazard assessment through hybrid in vitro / in silico approach: The case of zearalenone.
Ehrlich, Veronika A; Dellafiora, Luca; Mollergues, Julie; Dall'Asta, Chiara; Serrant, Patrick; Marin-Kuan, Maricel; Lo Piparo, Elena; Schilter, Benoit; Cozzini, Pietro
2015-01-01
Within the framework of reduction, refinement and replacement of animal experiments, new approaches for the identification and characterization of chemical hazards have been developed. Grouping and read-across has been promoted as one of the most promising alternative approaches. It uses existing toxicological information on a group of chemicals to make predictions about the toxicity of uncharacterized ones. In the present work, the feasibility of applying in vitro and in silico techniques to group chemicals for read-across was studied using the food mycotoxin zearalenone (ZEN) and its metabolites as a case study. ZEN and its reduced metabolites are known to act through activation of the estrogen receptor α (ERα). The ranking of their estrogenic potencies appeared highly conserved across test systems including binding, in vitro and in vivo assays. These data suggest that activation of ERα may play a role in the molecular initiating event (MIE) and be predictive of adverse effects, providing the rationale to model receptor binding for hazard identification. The investigation of receptor-ligand interactions through docking simulation proved to accurately rank the estrogenic potencies of ZEN and its reduced metabolites, showing the suitability of the model for addressing estrogenic potency in this group of compounds. The model was therefore further applied to biologically uncharacterized, commercially unavailable, oxidized ZEN metabolites (6α-, 6β-, 8α-, 8β-, 13- and 15-OH-ZEN). Except for 15-OH-ZEN, the data indicate that the oxidized metabolites would in general be considered of lower estrogenic concern than ZEN and its reduced metabolites.
USGS: Building on leadership in mapping oceans and coasts
Myers, M.D.
2008-01-01
The US Geological Survey (USGS) offers continuously improving technologies for mapping oceans and coasts, providing a unique opportunity to characterize the marine environment and to expand understanding of coastal and ocean processes, resources, and hazards. USGS, which has been designated as a leader in mapping the Exclusive Economic Zone, has developed a strategic plan, Facing Tomorrow's Challenges: US Geological Survey Science in the Decade 2007 to 2017. This plan focuses on innovative and transformational themes that serve key clients and customers, expand partnerships, and have long-term national impact. The plan includes several key science directions, including Understanding Ecosystems and Predicting Ecosystem Change, Energy and Minerals for America's Future, and A National Hazards, Risk, and Resilience Assessment Program. USGS has also collaborated with diverse partners to incorporate mapping and monitoring within interdisciplinary research programs, addressing the system-scale response of coastal and marine ecosystems.
NASA Astrophysics Data System (ADS)
Hirata, K.; Fujiwara, H.; Nakamura, H.; Osada, M.; Morikawa, N.; Kawai, S.; Ohsumi, T.; Aoi, S.; Yamamoto, N.; Matsuyama, H.; Toyama, N.; Kito, T.; Murashima, Y.; Murata, Y.; Inoue, T.; Saito, R.; Takayama, J.; Akiyama, S.; Korenaga, M.; Abe, Y.; Hashimoto, N.
2015-12-01
The Earthquake Research Committee (ERC)/HERP, Government of Japan (2013) revised its long-term evaluation of the forthcoming large earthquake along the Nankai Trough: the next earthquake is estimated at M8 to 9 class, and the probability (P30) that the next earthquake will occur within the next 30 years (from Jan. 1, 2013) is 60% to 70%. In this study, we assess tsunami hazards (maximum coastal tsunami heights) in the near future from the next earthquake along the Nankai Trough using a probabilistic approach, on the basis of the ERC (2013) report. The probabilistic tsunami hazard assessment that we applied is as follows. (1) Characterized earthquake fault models (CEFMs) are constructed on each of the 15 hypothetical source areas (HSAs) identified by ERC (2013). The characterization rule follows Toyama et al. (2015, JpGU). As a result, we obtained a total of 1441 CEFMs. (2) We calculate tsunamis due to the CEFMs by solving nonlinear, finite-amplitude, long-wave equations with advection and bottom friction terms by a finite-difference method, including run-up computation on land. (3) A time-predictable model predicts that the recurrence interval of the present seismic cycle is T = 88.2 years (ERC, 2013). We fix P30 = 67% by applying a renewal process based on the BPT distribution with T and alpha = 0.24 as its aperiodicity. (4) We divide the probability P30 into P30(i) for the i-th subgroup, consisting of the earthquakes occurring in each of the 15 HSAs, by following a probability redistribution concept (ERC, 2014). Each earthquake (CEFM) in the i-th subgroup is then assigned a probability P30(i)/N, where N is the number of CEFMs in the subgroup. Note that this redistribution of probability is tentative, because present seismology cannot constrain it in sufficient depth; an epistemic logic-tree approach may be required in the future. (5) We synthesize tsunami hazard curves at evaluation points along the coasts by integrating the 30-year occurrence probabilities P30(i) for all earthquakes (CEFMs) and the calculated maximum coastal tsunami heights. In the synthesis, aleatory uncertainties relating to the incompleteness of the governing equations, CEFM modeling, bathymetry and topography data, etc., are modeled assuming a log-normal probability distribution. Examples of tsunami hazard curves will be presented.
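The renewal-process step (3) can be reproduced in outline with the BPT density and the parameters quoted above; the elapsed times below are illustrative choices, not the committee's values:

```python
import numpy as np
from scipy.integrate import quad

MU, ALPHA = 88.2, 0.24   # mean recurrence (years) and aperiodicity from the abstract

def bpt_pdf(t):
    """Brownian Passage Time (inverse Gaussian) density."""
    return np.sqrt(MU / (2 * np.pi * ALPHA**2 * t**3)) * \
           np.exp(-(t - MU)**2 / (2 * MU * ALPHA**2 * t))

def p_next_30yr(elapsed):
    """P(event within the next 30 years | no event in the first `elapsed` years)."""
    num, _ = quad(bpt_pdf, elapsed, elapsed + 30.0)
    den, _ = quad(bpt_pdf, elapsed, np.inf)
    return num / den

for te in (60.0, 66.5, 70.0):   # illustrative elapsed times since the last event
    print(f"elapsed {te:4.1f} yr -> P30 = {p_next_30yr(te):.2f}")
```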
A Unified Flash Flood Database across the United States
Gourley, Jonathan J.; Hong, Yang; Flamig, Zachary L.; Arthur, Ami; Clark, Robert; Calianno, Martin; Ruin, Isabelle; Ortel, Terry W.; Wieczorek, Michael; Kirstetter, Pierre-Emmanuel; Clark, Edward; Krajewski, Witold F.
2013-01-01
Despite flash flooding being one of the most deadly and costly weather-related natural hazards worldwide, individual datasets to characterize them in the United States are hampered by limited documentation and can be difficult to access. This study is the first of its kind to assemble, reprocess, describe, and disseminate a georeferenced U.S. database providing a long-term, detailed characterization of flash flooding in terms of spatiotemporal behavior and specificity of impacts. The database is composed of three primary sources: 1) the entire archive of automated discharge observations from the U.S. Geological Survey that has been reprocessed to describe individual flooding events, 2) flash-flooding reports collected by the National Weather Service from 2006 to the present, and 3) witness reports obtained directly from the public in the Severe Hazards Analysis and Verification Experiment during the summers of 2008–10. Each observational data source has limitations; a major asset of the unified flash flood database is its collation of relevant information from a variety of sources that is now readily available to the community in common formats. It is anticipated that this database will be used for many diverse purposes, such as evaluating tools to predict flash flooding, characterizing seasonal and regional trends, and improving understanding of dominant flood-producing processes. We envision that the initiation of this community database effort will attract and encompass future datasets.
This fact sheet provides an overview of the 10 on-line characterization and remediation databases available on the Hazardous Waste Clean-Up Information (CLU-IN) website sponsored by the U.S. Environmental Protection Agency.
Modelling multi-hazard hurricane damages on an urbanized coast with a Bayesian Network approach
van Verseveld, H.C.W.; Van Dongeren, A. R.; Plant, Nathaniel G.; Jäger, W.S.; den Heijer, C.
2015-01-01
Hurricane flood impacts to residential buildings in coastal zones are caused by a number of hazards, such as inundation, overflow currents, erosion, and wave attack. However, traditional hurricane damage models typically make use of stage-damage functions, where the stage is related to flooding depth only. Moreover, these models are deterministic and do not consider the large amount of uncertainty associated with both the processes themselves and with the predictions. This uncertainty becomes increasingly important when multiple hazards (flooding, wave attack, erosion, etc.) are considered simultaneously. This paper focuses on establishing relationships between observed damage and multiple hazard indicators in order to make better probabilistic predictions. The concept consists of (1) determining Local Hazard Indicators (LHIs) from a hindcasted storm with use of a nearshore morphodynamic model, XBeach, and (2) coupling these LHIs and building characteristics to the observed damages. We chose a Bayesian Network approach in order to make this coupling and used the LHIs ‘Inundation depth’, ‘Flow velocity’, ‘Wave attack’, and ‘Scour depth’ to represent flooding-, current-, wave-, and erosion-related hazards. The coupled hazard model was tested against four thousand damage observations from a case site at the Rockaway Peninsula, NY, that was impacted by Hurricane Sandy in late October 2012. The model was able to accurately distinguish ‘Minor damage’ from all other outcomes 95% of the time and could distinguish areas that were affected by the storm, but not severely damaged, 68% of the time. For the most heavily damaged buildings (‘Major Damage’ and ‘Destroyed’), projections of the expected damage underestimated the observed damage. The model demonstrated that including multiple hazards doubled the prediction skill, with Log-Likelihood Ratio test (a measure of improved accuracy and reduction in uncertainty) scores between 0.02 and 0.17 when only one hazard is considered and a score of 0.37 when multiple hazards are considered simultaneously. The LHIs with the most predictive skill were ‘Inundation depth’ and ‘Wave attack’. The Bayesian Network approach has several advantages over the market-standard stage-damage functions: the predictive capacity of multiple indicators can be combined; probabilistic predictions can be obtained, which include uncertainty; and quantitative as well as descriptive information can be used simultaneously.
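A minimal Bayesian Network of this flavor, sketched with the third-party pgmpy package and invented conditional probabilities (binary nodes for brevity; the study's network is far richer):

```python
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

# Two local hazard indicators driving one damage node (all binary for brevity).
model = BayesianNetwork([("Inundation", "Damage"), ("WaveAttack", "Damage")])

cpd_inund = TabularCPD("Inundation", 2, [[0.7], [0.3]])   # P(low / high inundation)
cpd_wave  = TabularCPD("WaveAttack", 2, [[0.8], [0.2]])   # P(low / high wave attack)
cpd_dmg   = TabularCPD(
    "Damage", 2,
    # P(Damage = minor / major) for each (Inundation, WaveAttack) combination.
    [[0.95, 0.60, 0.50, 0.10],
     [0.05, 0.40, 0.50, 0.90]],
    evidence=["Inundation", "WaveAttack"], evidence_card=[2, 2],
)
model.add_cpds(cpd_inund, cpd_wave, cpd_dmg)

infer = VariableElimination(model)
# Probabilistic damage prediction given observed hazard indicators.
print(infer.query(["Damage"], evidence={"Inundation": 1, "WaveAttack": 1}))
```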
Landscape Hazards in Yukon Communities: Geological Mapping for Climate Change Adaptation Planning
NASA Astrophysics Data System (ADS)
Kennedy, K.; Kinnear, L.
2010-12-01
Climate change is considered to be a significant challenge for northern communities where the effects of increased temperature and climate variability are beginning to affect infrastructure and livelihoods (Arctic Climate Impact Assessment, 2004). Planning for and adapting to ongoing and future changes in climate will require the identification and characterization of social, economic, cultural, political and biophysical vulnerabilities. This pilot project addresses physical landscape vulnerabilities in two communities in the Yukon Territory through community-scale landscape hazard mapping and focused investigations of community permafrost conditions. Landscape hazards are identified by combining pre-existing data from public utilities and private-sector consultants with new geophysical techniques (ground penetrating radar and electrical resistivity), shallow drilling, surficial geological mapping, and permafrost characterization. Existing landscape vulnerabilities are evaluated based on their potential for hazard (low, medium or high) under current climate conditions, as well as under future climate scenarios. Detailed hazard maps and landscape characterizations for both communities will contribute to overall adaptation plans and allow for informed development, planning and mitigation of potentially threatening hazards in and around the communities.
Predictive validity of the AUDIT for hazardous alcohol consumption in recently released prisoners.
Thomas, Emma; Degenhardt, Louisa; Alati, Rosa; Kinner, Stuart
2014-01-01
This study aimed to assess the predictive validity of the Alcohol Use Disorders Identification Test (AUDIT) among adult prisoners with respect to hazardous drinking following release, and to identify predictors of post-release hazardous drinking among prisoners screening positive for risk of alcohol-related harm on the AUDIT. Data came from a survey-based longitudinal study of 1325 sentenced adult prisoners in Queensland, Australia. Baseline interviews were conducted pre-release with follow-up at 3 and 6 months post-release. We calculated sensitivity, specificity and the area under the receiver operating characteristic curve (AUROC) to quantify the predictive validity of the AUDIT administered at baseline with respect to post-release hazardous drinking. Other potential predictors of hazardous drinking were measured by self-report and their association with the outcome was examined using logistic regression. At a cut-point of 8 or above, the sensitivity of the AUDIT with respect to hazardous drinking at 3-month follow-up was 81.0% (95%CI: 77.9-84.6%) and specificity was 65.6% (95%CI: 60.6-70.3%). The AUROC was 0.78 (95%CI: 0.75-0.81), indicating moderate accuracy. Among those scoring 8 or above, high expectations of drinking post-release (AOR: 2.49; 95%CI: 1.57-3.94) and past amphetamine-type stimulant (ATS) use (AOR: 1.64; 95%CI: 1.06-2.56) were significantly associated with hazardous drinking at 3 months post-release. Results were similar at 6 months. Among adult prisoners in our sample, pre-release AUDIT scores predicted hazardous drinking six months after release with acceptable accuracy, sensitivity and specificity. Among prisoners screening positive on the AUDIT, expectations of post-release drinking and ATS use are potential targets for intervention to reduce future hazardous drinking. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
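The screening metrics reported here are straightforward to compute; a sketch with synthetic AUDIT scores and outcomes (not the study's data):

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(4)
# Hypothetical baseline AUDIT scores and post-release hazardous-drinking outcomes.
audit = np.clip(rng.normal(9, 6, 1000), 0, 40).round()
p = 1 / (1 + np.exp(-(audit - 9) / 3))            # toy association
outcome = (rng.random(1000) < p).astype(int)

screen_positive = audit >= 8                      # the AUDIT cut-point from the study
tp = np.sum(screen_positive & (outcome == 1))
fn = np.sum(~screen_positive & (outcome == 1))
tn = np.sum(~screen_positive & (outcome == 0))
fp = np.sum(screen_positive & (outcome == 0))

print("sensitivity:", tp / (tp + fn))
print("specificity:", tn / (tn + fp))
print("AUROC:", roc_auc_score(outcome, audit))
```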
Early identification systems for emerging foodborne hazards.
Marvin, H J P; Kleter, G A; Prandini, A; Dekkers, S; Bolton, D J
2009-05-01
This paper provides a non-exhaustive overview of early warning systems for emerging foodborne hazards operating in various places in the world. Special attention is given to endpoint-focussed early warning systems (i.e. ECDC, ISIS and GPHIN) and hazard-focussed early warning systems (i.e. FVO, RASFF and OIE), and their merit in identifying a food safety problem at an early stage is discussed. Besides these early warning systems, which are based on monitoring either disease symptoms or hazards, early warning systems and activities that aim to predict the occurrence of a food safety hazard at the very beginning of its development, or before, are also described. Examples are trend analysis, horizon scanning, early warning systems for mycotoxins in maize and/or wheat, and information exchange networks (e.g. OIE and GIEWS). Furthermore, recent initiatives that aim to develop predictive early warning systems based on the holistic principle are discussed. The assumption of the researchers applying this principle is that developments outside the food production chain that are either directly or indirectly related to the development of a particular food safety hazard may also provide valuable information to predict the development of this hazard.
EPA is developing methods for utilizing computational chemistry, high-throughput screening (HTS) and various toxicogenomic technologies to predict potential for toxicity and prioritize limited testing resources towards chemicals that likely represent the greatest hazard to human ...
Autonomous Soil Assessment System: A Data-Driven Approach to Planetary Mobility Hazard Detection
NASA Astrophysics Data System (ADS)
Raimalwala, K.; Faragalli, M.; Reid, E.
2018-04-01
The Autonomous Soil Assessment System predicts mobility hazards for rovers. Its development and performance are presented, with focus on its data-driven models, machine learning algorithms, and real-time sensor data fusion for predictive analytics.
Asano, Junichi; Hirakawa, Akihiro
2017-01-01
The Cox proportional hazards cure model is a survival model incorporating a cure rate with the assumption that the population contains both uncured and cured individuals. It contains a logistic regression for the cure rate, and a Cox regression to estimate the hazard for uncured patients. A single predictive model for both the cure and hazard can be developed by using a cure model that simultaneously predicts the cure rate and hazards for uncured patients; however, model selection is a challenge because of the lack of a measure for quantifying the predictive accuracy of a cure model. Recently, we developed an area under the receiver operating characteristic curve (AUC) for determining the cure rate in a cure model (Asano et al., 2014), but the hazards measure for uncured patients was not resolved. In this article, we propose novel C-statistics that are weighted by the patients' cure status (i.e., cured, uncured, or censored cases) for the cure model. The operating characteristics of the proposed C-statistics and their confidence interval were examined by simulation analyses. We also illustrate methods for predictive model selection and for further interpretation of variables using the proposed AUCs and C-statistics via application to breast cancer data.
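For the Cox-regression half of such a model, a concordance index can be read directly from a fitted model; a sketch with synthetic data using the third-party lifelines package (a plain Cox model here, not the cure model itself):

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter   # third-party survival-analysis package

rng = np.random.default_rng(5)
n = 300
df = pd.DataFrame({
    "age":   rng.normal(55, 10, n),
    "stage": rng.integers(1, 4, n),
})
# Toy survival times with a covariate effect, plus random censoring.
hazard = np.exp(0.03 * (df["age"] - 55) + 0.5 * (df["stage"] - 2))
time = rng.exponential(24 / hazard)
censor = rng.exponential(36, n)
df["duration"] = np.minimum(time, censor)
df["event"] = (time <= censor).astype(int)

cph = CoxPHFitter().fit(df, duration_col="duration", event_col="event")
# Harrell's C-statistic: how well the fitted hazards rank the observed event times.
print("concordance index:", cph.concordance_index_)
```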
Reviewing and visualizing the interactions of natural hazards
NASA Astrophysics Data System (ADS)
Gill, Joel C.; Malamud, Bruce D.
2014-12-01
This paper presents a broad overview, characterization, and visualization of the interaction relationships between 21 natural hazards, drawn from six hazard groups (geophysical, hydrological, shallow Earth, atmospheric, biophysical, and space hazards). A synthesis is presented of the identified interaction relationships between these hazards, using an accessible visual format particularly suited to end users. Interactions considered are primarily those where a primary hazard triggers or increases the probability of secondary hazards occurring. In this paper we do the following: (i) identify, through a wide-ranging review of grey and peer-reviewed literature, 90 interactions; (ii) subdivide the interactions into three levels, based on how well we can characterize secondary hazards, given information about the primary hazard; (iii) determine the spatial overlap and temporal likelihood of the triggering relationships occurring; and (iv) examine the relationship between primary and secondary hazard intensities for each identified hazard interaction and group these into five possible categories. In this study we have synthesized, using accessible visualization techniques, large amounts of information drawn from many scientific disciplines. We outline the importance of constraining hazard interactions and reinforce the importance of a holistic (or multihazard) approach to natural hazard assessment. This approach allows those undertaking research into single hazards to place their work within the context of other hazards. It also communicates important aspects of hazard interactions, facilitating an effective analysis by those working on reducing and managing disaster risk within both the policy and practitioner communities.
Lunar mission safety and rescue: Hazards analysis and safety requirements
NASA Technical Reports Server (NTRS)
1971-01-01
The results of the hazards analysis are presented; the analysis was concerned only with hazards to personnel and not with loss of equipment or property. Hazards characterization includes the definition of a hazard, the hazard levels, and the hazard groups. The analysis methodology is described in detail. The methodology was used to prepare the top-level functional flow diagrams, to perform the first-level hazards assessment, and to develop a list of conditions and situations requiring individual hazard studies. The 39 individual hazard study results are presented in total.
NASA Astrophysics Data System (ADS)
Bieringer, Paul E.; Rodriguez, Luna M.; Vandenberghe, Francois; Hurst, Jonathan G.; Bieberbach, George; Sykes, Ian; Hannan, John R.; Zaragoza, Jake; Fry, Richard N.
2015-12-01
Accurate simulations of the atmospheric transport and dispersion (AT&D) of hazardous airborne materials rely heavily on the source term parameters necessary to characterize the initial release and on the meteorological conditions that drive the downwind dispersion. In many cases the source parameters are not known and are consequently based on rudimentary assumptions. This is particularly true of accidental releases and the intentional releases associated with terrorist incidents. When available, meteorological observations are often not representative of the conditions at the location of the release, and the use of these non-representative meteorological conditions can result in significant errors in the hazard assessments downwind of the sensors, even when the other source parameters are accurately characterized. Here, we describe a computationally efficient methodology to characterize both the release source parameters and the low-level winds (e.g. winds near the surface) required to produce a refined downwind hazard estimate. This methodology, known as the Variational Iterative Refinement Source Term Estimation (STE) Algorithm (VIRSA), consists of a combination of modeling systems. These systems include a back-trajectory based source inversion method, a forward Gaussian puff dispersion model, and a variational refinement algorithm that uses both a simple forward AT&D model that is a surrogate for the more complex Gaussian puff model and a formal adjoint of this surrogate model. The back-trajectory based method is used to calculate a "first guess" source estimate based on the available observations of the airborne contaminant plume and atmospheric conditions. The variational refinement algorithm is then used to iteratively refine the first-guess STE parameters and meteorological variables. The algorithm has been evaluated across a wide range of scenarios of varying complexity. It has been shown to improve the source location estimate by several hundred percent (normalized by the distance from the source to the closest sampler) and to improve mass estimates by several orders of magnitude. Furthermore, it can operate in scenarios with inconsistencies between the wind and airborne contaminant sensor observations, adjusting the wind to provide a better match between the hazard prediction and the observations.
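The refinement idea can be sketched with a toy surrogate forward model and a generic optimizer standing in for VIRSA's adjoint-based scheme (everything below, including the plume model and parameter values, is illustrative):

```python
import numpy as np
from scipy.optimize import minimize

def forward(params, sensors):
    """Toy surrogate forward model: steady 2-D Gaussian plume concentration
    at sensor locations for source (x0, y0), emission rate q, wind angle theta."""
    x0, y0, q, theta = params
    dx, dy = sensors[:, 0] - x0, sensors[:, 1] - y0
    along = dx * np.cos(theta) + dy * np.sin(theta)    # downwind distance
    cross = -dx * np.sin(theta) + dy * np.cos(theta)   # crosswind distance
    along = np.maximum(along, 1.0)
    sigma = 0.1 * along                                # crude plume spread
    return q / (2 * np.pi * sigma**2) * np.exp(-cross**2 / (2 * sigma**2))

rng = np.random.default_rng(6)
sensors = rng.uniform(0, 1000, size=(25, 2))
truth = np.array([200.0, 450.0, 50.0, 0.1])            # hidden source + wind direction
obs = forward(truth, sensors) * (1 + 0.05 * rng.normal(size=25))

# "First guess" (as from a back-trajectory method), then iterative refinement of
# both the source parameters and the wind direction against the observations.
first_guess = np.array([400.0, 300.0, 10.0, 0.4])
res = minimize(lambda p: np.sum((forward(p, sensors) - obs) ** 2), first_guess,
               method="Nelder-Mead", options={"maxiter": 5000, "xatol": 1e-3})
print("refined estimate:", res.x.round(2))
```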
Food-safety hazards in the pork chain in Nagaland, North East India: implications for human health.
Fahrion, Anna Sophie; Jamir, Lanu; Richa, Kenivole; Begum, Sonuwara; Rutsa, Vilatuo; Ao, Simon; Padmakumar, Varijaksha P; Deka, Ram Pratim; Grace, Delia
2013-12-24
Pork occupies an important place in the diet of the population of Nagaland, one of the North East Indian states. We carried out a pilot study along the pork meat production chain, from live animal to end consumer. The goal was to obtain information about the presence of selected food-borne hazards in pork in order to assess the risk deriving from these hazards to the health of the local consumers and make recommendations for improving food safety. A secondary objective was to evaluate the utility of risk-based approaches to food safety in an informal food system. We investigated samples from pigs and pork sourced at slaughter in urban and rural environments, and at retail, to assess a selection of food-borne hazards. In addition, consumer exposure was characterized using information about hygiene and practices related to handling and preparing pork. A qualitative hazard identification, exposure assessment and hazard characterization for three representative hazards or hazard proxies, namely Enterobacteriaceae, T. solium cysticercosis and antibiotic residues, is presented. Several important potential food-borne pathogens are reported for the first time, including Listeria spp. and Brucella suis. This descriptive pilot study is the first risk-based assessment of food safety in Nagaland. We also characterise possible interventions to be addressed by policy makers, and supply data to inform future risk assessments.
Seibert, Tyler M; Fan, Chun Chieh; Wang, Yunpeng; Zuber, Verena; Karunamuni, Roshan; Parsons, J Kellogg; Eeles, Rosalind A; Easton, Douglas F; Kote-Jarai, ZSofia; Al Olama, Ali Amin; Garcia, Sara Benlloch; Muir, Kenneth; Grönberg, Henrik; Wiklund, Fredrik; Aly, Markus; Schleutker, Johanna; Sipeky, Csilla; Tammela, Teuvo Lj; Nordestgaard, Børge G; Nielsen, Sune F; Weischer, Maren; Bisbjerg, Rasmus; Røder, M Andreas; Iversen, Peter; Key, Tim J; Travis, Ruth C; Neal, David E; Donovan, Jenny L; Hamdy, Freddie C; Pharoah, Paul; Pashayan, Nora; Khaw, Kay-Tee; Maier, Christiane; Vogel, Walther; Luedeke, Manuel; Herkommer, Kathleen; Kibel, Adam S; Cybulski, Cezary; Wokolorczyk, Dominika; Kluzniak, Wojciech; Cannon-Albright, Lisa; Brenner, Hermann; Cuk, Katarina; Saum, Kai-Uwe; Park, Jong Y; Sellers, Thomas A; Slavov, Chavdar; Kaneva, Radka; Mitev, Vanio; Batra, Jyotsna; Clements, Judith A; Spurdle, Amanda; Teixeira, Manuel R; Paulo, Paula; Maia, Sofia; Pandha, Hardev; Michael, Agnieszka; Kierzek, Andrzej; Karow, David S; Mills, Ian G; Andreassen, Ole A; Dale, Anders M
2018-01-10
To develop and validate a genetic tool to predict age of onset of aggressive prostate cancer (PCa) and to guide decisions about whom to screen and at what age. Analysis of genotype, PCa status, and age to select single nucleotide polymorphisms (SNPs) associated with diagnosis. These polymorphisms were incorporated into a survival analysis to estimate their effects on age at diagnosis of aggressive PCa (that is, not eligible for surveillance according to National Comprehensive Cancer Network guidelines; any of Gleason score ≥7, stage T3-T4, PSA (prostate specific antigen) concentration ≥10 ng/mL, nodal metastasis, distant metastasis). The resulting polygenic hazard score is an assessment of individual genetic risk. The final model was applied to an independent dataset containing genotype and PSA screening data. The hazard score was calculated for these men to test prediction of survival free from PCa. Multiple institutions that were members of the international PRACTICAL consortium. All consortium participants of European ancestry with known age, PCa status, and quality-assured custom (iCOGS) array genotype data. The development dataset comprised 31,747 men; the validation dataset comprised 6,411 men. Prediction with hazard score of age of onset of aggressive cancer in the validation set. In the independent validation set, the hazard score calculated from 54 single nucleotide polymorphisms was a highly significant predictor of age at diagnosis of aggressive cancer (z=11.2, P<10⁻¹⁶). When men in the validation set with high scores (>98th centile) were compared with those with average scores (30th-70th centile), the hazard ratio for aggressive cancer was 2.9 (95% confidence interval 2.4 to 3.4). Inclusion of family history in a combined model did not improve prediction of onset of aggressive PCa (P=0.59), and polygenic hazard score performance remained high when family history was accounted for. Additionally, the positive predictive value of PSA screening for aggressive PCa increased with increasing polygenic hazard score. Polygenic hazard scores can be used for personalised genetic risk estimates that can predict age at onset of aggressive PCa. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
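Operationally, a polygenic hazard score of this kind is the linear predictor of a Cox model: a weighted sum of risk-allele counts. A minimal sketch with simulated genotypes and hypothetical per-SNP coefficients (the real model's 54 SNP weights are not reproduced here); under proportional hazards, the hazard ratio between score strata is the exponential of the difference in mean scores.

```python
import numpy as np

# Simulated stand-ins: allele counts (0/1/2) for 54 SNPs and per-SNP
# log-hazard coefficients from a Cox survival model (both hypothetical).
rng = np.random.default_rng(0)
genotypes = rng.integers(0, 3, size=(5000, 54)).astype(float)
betas = rng.normal(0.0, 0.05, size=54)

phs = genotypes @ betas                     # polygenic hazard score per man

# Hazard-ratio contrast used above: >98th centile vs 30th-70th centile.
top = phs > np.percentile(phs, 98)
mid = (phs >= np.percentile(phs, 30)) & (phs <= np.percentile(phs, 70))
hr = np.exp(phs[top].mean() - phs[mid].mean())
print(f"HR (top 2% vs middle 40%): {hr:.2f}")
```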
78 FR 69745 - Safety and Security Plans for Class 3 Hazardous Materials Transported by Rail
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-20
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No... Security Plans for Class 3 Hazardous Materials Transported by Rail AGENCY: Pipeline and Hazardous Materials... characterization, classification, and selection of a packing group for Class 3 materials, and the corresponding...
NASA Astrophysics Data System (ADS)
Chapman, Martin Colby
1998-12-01
The design earthquake selection problem is fundamentally probabilistic. Disaggregation of a probabilistic model of the seismic hazard offers a rational and objective approach that can identify the most likely earthquake scenario(s) contributing to hazard. An ensemble of time series can be selected on the basis of the modal earthquakes derived from the disaggregation. This gives a useful time-domain realization of the seismic hazard, to the extent that a single motion parameter captures the important time-domain characteristics. A possible limitation to this approach arises because most currently available motion prediction models for peak ground motion or oscillator response are essentially independent of duration, and modal events derived using the peak motions may not represent the optimal characterization of the hazard. The elastic input energy spectrum is an alternative to the elastic response spectrum for these types of analyses. The input energy combines the elements of amplitude and duration into a single-parameter description of the ground motion that can be readily incorporated into standard probabilistic seismic hazard analysis methodology. This use of the elastic input energy spectrum is examined. Regression analysis is performed using strong motion data from Western North America and consistent data processing procedures for both the absolute input energy equivalent velocity (V_ea) and the elastic pseudo-relative velocity response (PSV) in the frequency range 0.5 to 10 Hz. The results show that the two parameters can be successfully fit with identical functional forms. The dependence of V_ea and PSV upon NEHRP site classification is virtually identical. The variance of V_ea is uniformly less than that of PSV, indicating that V_ea can be predicted with slightly less uncertainty as a function of magnitude, distance and site classification. The effects of site class are important at frequencies less than a few Hertz. The regression modeling does not resolve significant effects due to site class at frequencies greater than approximately 5 Hz. Disaggregation of general seismic hazard models using V_ea indicates that the modal magnitudes for the higher-frequency oscillators tend to be larger, and vary less with oscillator frequency, than those derived using PSV. Insofar as the elastic input energy may be a better parameter for quantifying the damage potential of ground motion, its use in probabilistic seismic hazard analysis could provide an improved means of selecting earthquake scenarios and establishing design earthquakes for many types of engineering analyses.
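As a compact illustration of the disaggregation step, the sketch below identifies the modal magnitude-distance scenario as the bin contributing most to the hazard; the rates and exceedance probabilities are hypothetical, and in the study the exceedance probability would come from the V_ea or PSV prediction model.

```python
import numpy as np

mags = np.array([5.0, 6.0, 7.0])
dists = np.array([10.0, 30.0, 50.0])          # km
rates = np.array([[3e-2, 2e-2, 1e-2],         # annual rate per (M, R) bin
                  [8e-3, 6e-3, 4e-3],
                  [1e-3, 9e-4, 7e-4]])
p_exceed = np.array([[0.10, 0.02, 0.005],     # P(IM > target | M, R)
                     [0.40, 0.15, 0.05],
                     [0.80, 0.50, 0.25]])

contrib = rates * p_exceed                    # rate of exceedances per bin
contrib /= contrib.sum()                      # fraction of total hazard
m_idx, r_idx = np.unravel_index(contrib.argmax(), contrib.shape)
print(f"Modal scenario: M{mags[m_idx]:.1f} at {dists[r_idx]:.0f} km "
      f"({100 * contrib[m_idx, r_idx]:.0f}% of hazard)")
```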
Agent-based simulation for human-induced hazard analysis.
Bulleit, William M; Drewek, Matthew W
2011-02-01
Terrorism could be treated as a hazard for design purposes. For instance, the terrorist hazard could be analyzed in a manner similar to the way that seismic hazard is handled. No matter how terrorism is dealt with in the design of systems, predictions of the frequency and magnitude of the hazard will be required. And, if the human-induced hazard is to be designed for in a manner analogous to natural hazards, then the predictions should be probabilistic in nature. The model described in this article is a prototype that uses agent-based modeling (ABM) to analyze terrorist attacks. The basic approach of using ABM to model human-induced hazards has been preliminarily validated in the sense that the attack magnitudes appear to be power-law distributed and attacks occur mostly in regions through which high levels of wealth pass, such as transit routes and markets. The model developed in this study indicates that ABM is a viable approach to modeling socioeconomic-based infrastructure systems for engineering design to deal with human-induced hazards. © 2010 Society for Risk Analysis.
Gazica, Michele W; Spector, Paul E
2016-01-01
Safety climate, violence prevention climate, and civility climate were independently developed and linked to domain-specific workplace hazards, although all three were designed to promote the physical and psychological safety of workers. To test domain specificity between conceptually related workplace climates and relevant workplace hazards. Data were collected from 368 persons employed in various industries and descriptive statistics were calculated for all study variables. Correlational and relative weights analyses were used to test for domain specificity. The three climate domains were similarly predictive of most workplace hazards, regardless of domain specificity. This study suggests that the three climate domains share a common higher order construct that may predict relevant workplace hazards better than any of the scales alone.
Asteroid Impact & Deflection Assessment mission: Kinetic impactor
NASA Astrophysics Data System (ADS)
Cheng, A. F.; Michel, P.; Jutzi, M.; Rivkin, A. S.; Stickle, A.; Barnouin, O.; Ernst, C.; Atchison, J.; Pravec, P.; Richardson, D. C.; AIDA Team
2016-02-01
The Asteroid Impact & Deflection Assessment (AIDA) mission will be the first space experiment to demonstrate asteroid impact hazard mitigation by using a kinetic impactor to deflect an asteroid. AIDA is an international cooperation consisting of two mission elements: the NASA Double Asteroid Redirection Test (DART) mission and the ESA Asteroid Impact Mission (AIM) rendezvous mission. The primary goals of AIDA are (i) to test our ability to perform a spacecraft impact on a potentially hazardous near-Earth asteroid and (ii) to measure and characterize the deflection caused by the impact. The AIDA target will be the binary near-Earth asteroid (65803) Didymos, with the deflection experiment to occur in late September 2022. The DART impact on the secondary member of the binary at 7 km/s is expected to alter the binary orbit period by about 4 minutes, assuming a simple transfer of momentum to the target, and this period change will be measured by Earth-based observatories. The AIM spacecraft will characterize the asteroid target and monitor results of the impact in situ at Didymos. The DART mission is a full-scale kinetic impact to deflect a 150 m diameter asteroid, with known impactor conditions and with target physical properties characterized by the AIM mission. Predictions for the momentum transfer efficiency of kinetic impacts are given for several possible target types of different porosities, using the Housen and Holsapple (2011) crater scaling model for impact ejecta mass and velocity distributions. Results are compared to numerical simulation results using the Smoothed Particle Hydrodynamics code of Jutzi and Michel (2014), with good agreement. The model also predicts that the ejecta from the DART impact may make Didymos into an active asteroid, forming an ejecta coma that may be observable from Earth-based telescopes. The measurements from AIDA of the momentum transfer from the DART impact, the crater size and morphology, and the evolution of an ejecta coma will substantially advance understanding of impact processes on asteroids.
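As a rough check on the quoted period change, the sketch below pushes a simple momentum-transfer estimate through a first-order circular-orbit relation; the impactor mass, target mass, and mutual orbit speed are hypothetical round numbers, so the answer is only order-of-magnitude.

```python
m_imp = 500.0      # impactor mass (kg), hypothetical
v_imp = 7000.0     # impact speed (m/s)
beta = 1.0         # momentum enhancement factor (simple momentum transfer)
M_target = 5.0e9   # secondary's mass (kg), hypothetical

dv = beta * m_imp * v_imp / M_target   # along-track velocity change (m/s)

v_orb = 0.17                 # mutual orbit speed (m/s), hypothetical
P = 11.92 * 3600.0           # binary orbit period (s)

# For a tangential impulse on a circular orbit, dP/P = 3 dv/v to first order.
dP = 3.0 * dv / v_orb * P
print(f"dv = {dv:.1e} m/s; period change ~ {dP / 60:.0f} minutes")
```

With these placeholder numbers the change comes out at several minutes, the same order as the roughly 4 minute prediction quoted above.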
Statistical Maps of Ground Magnetic Disturbance Derived from Global Geospace Models
NASA Astrophysics Data System (ADS)
Rigler, E. J.; Wiltberger, M. J.; Love, J. J.
2017-12-01
Electric currents in space are the principal driver of magnetic variations measured at Earth's surface. These in turn induce geoelectric fields that present a natural hazard for technological systems like high-voltage power distribution networks. Modern global geospace models can reasonably simulate large-scale geomagnetic response to solar wind variations, but they are less successful at deterministic predictions of intense localized geomagnetic activity that most impacts technological systems on the ground. Still, recent studies have shown that these models can accurately reproduce the spatial statistical distributions of geomagnetic activity, suggesting that their physics are largely correct. Since the magnetosphere is a largely externally driven system, most model-measurement discrepancies probably arise from uncertain boundary conditions. So, with realistic distributions of solar wind parameters to establish its boundary conditions, we use the Lyon-Fedder-Mobarry (LFM) geospace model to build a synthetic multivariate statistical model of gridded ground magnetic disturbance. From this, we analyze the spatial modes of geomagnetic response, regress on available measurements to fill in unsampled locations on the grid, and estimate the global probability distribution of extreme magnetic disturbance. The latter offers a prototype geomagnetic "hazard map", similar to those used to characterize better-known geophysical hazards like earthquakes and floods.
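The statistical machinery described here can be sketched as follows: extract spatial modes (EOFs) from an ensemble of simulated disturbance maps, then regress the mode amplitudes on a few observed grid points to fill in unsampled locations. The grid size, mode count, and observation indices below are hypothetical, and random fields stand in for LFM output.

```python
import numpy as np

rng = np.random.default_rng(4)
n_runs, n_grid = 200, 400
mixing = rng.normal(0.0, 1.0, (n_grid, n_grid)) * 0.05
ensemble = rng.normal(0.0, 1.0, (n_runs, n_grid)) @ mixing
# ^ spatially correlated synthetic fields standing in for model output

mean = ensemble.mean(axis=0)
_, _, Vt = np.linalg.svd(ensemble - mean, full_matrices=False)
modes = Vt[:10]                                  # leading spatial modes (EOFs)

obs_idx = np.array([5, 57, 130, 222, 310])       # cells with magnetometers
truth = ensemble[0]                              # pretend this run is "reality"
A = modes[:, obs_idx].T                          # modes sampled at observed cells
coef, *_ = np.linalg.lstsq(A, truth[obs_idx] - mean[obs_idx], rcond=None)
reconstruction = mean + coef @ modes             # disturbance on the full grid
print(np.corrcoef(reconstruction, truth)[0, 1])  # agreement with "reality"
```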
Flows of Selected Hazardous Materials by Rail
DOT National Transportation Integrated Search
1990-03-01
This report reviews the hazardous materials rail traffic of 33 selected hazardous materials commodities or commodity groups in 1986, a relatively typical recent year. The flows of the selected commodities by rail are characterized and their geographi...
Haas, Jessica R.; Thompson, Matthew P.; Tillery, Anne C.; Scott, Joe H.
2017-01-01
Wildfires can increase the frequency and magnitude of catastrophic debris flows. Integrated, proactive natural hazard assessment would therefore characterize landscapes based on the potential for the occurrence and interactions of wildfires and postwildfire debris flows. This chapter presents a new modeling effort that can quantify the variability surrounding a key input to postwildfire debris-flow modeling, the amount of watershed burned at moderate to high severity, in a prewildfire context. The use of stochastic wildfire simulation captures variability surrounding the timing and location of ignitions, fire weather patterns, and ultimately the spatial patterns of watershed area burned. Model results provide for enhanced estimates of postwildfire debris-flow hazard in a prewildfire context, and multiple hazard metrics are generated to characterize and contrast hazards across watersheds. Results can guide mitigation efforts by allowing planners to identify which factors may be contributing the most to the hazard rankings of watersheds.
NASA Astrophysics Data System (ADS)
Bellalem, Fouzi; Talbi, Abdelhak; Djellit, Hamou; Ymmel, Hayet; Mobarki, Mourad
2018-03-01
The region of Blida is characterized by relatively high seismic activity, documented especially during the past two centuries. Indeed, it experienced a significant number of destructive earthquakes, such as those of March 2, 1825 and January 2, 1867, with intensities of X and IX, respectively. This study aims to investigate the potential seismic hazard in Blida city and its surrounding regions. For this purpose, a typical seismic catalog was compiled using historical macroseismic events that occurred over a period of a few hundred years, together with the recent instrumental seismicity dating back to 1900. The parametric-historic procedure introduced by Kijko and Graham (1998, 1999) was applied to assess seismic hazard in the study region. It is adapted to deal with incomplete catalogs and does not use any subjective delineation of active seismic zones. Because of the lack of recorded strong-motion data, three ground motion prediction models were considered, as they seem the most adapted to the seismicity of the study region. Results are presented as peak ground acceleration (PGA) seismic hazard maps, showing expected peak accelerations with 10% probability of exceedance in a 50-year period. As the most significant result, hot-spot regions with high PGA values are mapped. For example, a PGA of 0.44 g has been found in a small geographical area centered on Blida city.
Building a risk-targeted regional seismic hazard model for South-East Asia
NASA Astrophysics Data System (ADS)
Woessner, J.; Nyst, M.; Seyhan, E.
2015-12-01
The last decade has tragically shown the social and economic vulnerability of countries in South-East Asia to earthquake hazard and risk. While many disaster mitigation programs and initiatives to improve societal earthquake resilience are under way, with a focus on saving lives and livelihoods, the risk management sector is challenged to develop appropriate models to cope with the economic consequences and the impact on the insurance business. We present the source model and ground motion model components suitable for a South-East Asia earthquake risk model covering Indonesia, Malaysia, the Philippines and the Indochina countries. The source model builds upon refined modelling approaches to characterize 1) seismic activity on crustal faults from geologic and geodetic data, 2) seismicity along the interface of subduction zones and within the slabs, and 3) earthquakes not occurring on mapped fault structures. We elaborate on building a self-consistent rate model for the hazardous crustal fault systems (e.g. the Sumatra fault zone and the Philippine fault zone) as well as the subduction zones, and showcase characteristics and sensitivities due to existing uncertainties in the rate and hazard space using a well-selected suite of ground motion prediction equations. Finally, we analyze the source model by quantifying the contribution by source type (e.g., subduction zone, crustal fault) to typical risk metrics (e.g., return period losses, average annual loss) and reviewing their relative impact on various lines of business.
NASA Astrophysics Data System (ADS)
Dalguer, Luis A.; Fukushima, Yoshimitsu; Irikura, Kojiro; Wu, Changjiang
2017-09-01
Inspired by the first workshop on Best Practices in Physics-Based Fault Rupture Models for Seismic Hazard Assessment of Nuclear Installations (BestPSHANI), conducted by the International Atomic Energy Agency (IAEA) on 18-20 November 2015 in Vienna (http://www-pub.iaea.org/iaeameetings/50896/BestPSHANI), this PAGEOPH topical volume collects several extended articles from the workshop as well as several new contributions. A total of 17 papers have been selected, on topics ranging from the seismological aspects of earthquake cycle simulations for source-scaling evaluation, seismic source characterization, source inversion and ground motion modeling (based on finite fault rupture using dynamic, kinematic, stochastic and empirical Green's function approaches) to the engineering application of simulated ground motion for the analysis of the seismic response of structures. These contributions include applications to real earthquakes and descriptions of current practice for assessing seismic hazard in terms of nuclear safety in low-seismicity areas, as well as proposals for physics-based hazard assessment for critical structures near large earthquakes. Collectively, the papers of this volume highlight the usefulness of physics-based models for evaluating and understanding the physical causes of observed and empirical data, as well as for predicting ground motion beyond the range of recorded data. Particular importance is given to the validation and verification of the models by comparing synthetic results with observed data and empirical models.
Allen, M B; Billig, E; Reese, P P; Shults, J; Hasz, R; West, S; Abt, P L
2016-01-01
Donation after cardiac death is an important source of transplantable organs, but evidence suggests donor warm ischemia contributes to inferior outcomes. Attempts to predict recipient outcome using donor hemodynamic measurements have not yielded statistically significant results. We evaluated novel measures of donor hemodynamics as predictors of delayed graft function and graft failure in a cohort of 1050 kidneys from 566 donors. Hemodynamics were described using regression line slopes, areas under the curve, and time beyond thresholds for systolic blood pressure, oxygen saturation, and shock index (heart rate divided by systolic blood pressure). A logistic generalized estimating equation model showed that area under the curve for systolic blood pressure was predictive of delayed graft function (above median: odds ratio 1.42, 95% confidence interval [CI] 1.06-1.90). Multivariable Cox regression demonstrated that the slope of oxygen saturation during the first 10 minutes after extubation was associated with graft failure (below median: hazard ratio 1.30, 95% CI 1.03-1.64), with 5-year graft survival of 70.0% (95% CI 64.5%-74.8%) for donors above the median versus 61.4% (95% CI 55.5%-66.7%) for those below the median. Among older donors, increased shock index slope was associated with an increased hazard of graft failure. Validation of these findings is necessary to determine the utility of characterizing donor warm ischemia to predict recipient outcome. © Copyright 2015 The American Society of Transplantation and the American Society of Transplant Surgeons.
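The hemodynamic summaries named here (regression-line slope, area under the curve, time beyond a threshold, shock index) are straightforward to compute from a vital-sign time series; the sketch below uses hypothetical numbers.

```python
import numpy as np

def hemodynamic_features(t, x, threshold):
    """Slope, AUC, and time spent above a threshold for a vital-sign series."""
    slope = np.polyfit(t, x, 1)[0]        # regression-line slope
    auc = np.trapz(x, t)                  # area under the curve
    dt = np.gradient(t)                   # time represented by each sample
    time_beyond = float(dt[x > threshold].sum())
    return slope, auc, time_beyond

t = np.arange(0.0, 10.0, 1.0)             # minutes after extubation
sbp = np.array([120, 115, 108, 100, 92, 85, 80, 74, 70, 66], float)
hr = np.array([88, 90, 95, 100, 104, 110, 112, 115, 117, 120], float)
shock_index = hr / sbp                    # heart rate / systolic BP
print(hemodynamic_features(t, sbp, threshold=90.0))
print(hemodynamic_features(t, shock_index, threshold=1.0))
```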
NASA Astrophysics Data System (ADS)
Sparks, S. R.
2008-12-01
Volcanic eruptions in arcs are complex natural phenomena, involving the movement of magma to the Earth's surface and interactions with the surrounding crust during ascent and with the surface environment during eruption, resulting in secondary hazards. Magma changes its properties profoundly during ascent and eruption, and many of the underlying processes of heat and mass transfer and physical property changes that govern volcanic flows and magmatic interactions with the environment are highly non-linear. Major direct hazards include tephra fall, pyroclastic flows from explosions and dome collapse, volcanic blasts, lahars, debris avalanches and tsunamis. There are also health hazards related to emissions of gases and very fine volcanic ash. These hazards and progress in their assessment are illustrated mainly from the ongoing eruption of the Soufriere Hills volcano, Montserrat. There are both epistemic and aleatory uncertainties in the assessment of volcanic hazards, which can be large, making precise prediction a formidable objective. Indeed, in certain respects volcanic systems and hazardous phenomena may be intrinsically unpredictable. As with other natural phenomena, predictions and hazard assessments inevitably have to be expressed in probabilistic terms that take account of these uncertainties. Despite these limitations, significant progress is being made in the ability to anticipate volcanic activity in volcanic arcs and, in favourable circumstances, make robust hazard assessments and predictions. Improvements in monitoring ground deformation, gas emissions and seismicity are being combined with more advanced models of volcanic flows and their interactions with the environment. In addition, more structured and systematic methods for assessing hazards and risk are emerging that allow impartial advice to be given to authorities during volcanic crises. There remain significant issues of how scientific advice and associated uncertainties are communicated to provide effective mitigation during volcanic crises.
Topography and geology site effects from the intensity prediction model (ShakeMap) for Austria
NASA Astrophysics Data System (ADS)
del Puy Papí Isaba, María; Jia, Yan; Weginger, Stefan
2017-04-01
The seismicity in Austria can be categorized as moderate. Although the hazard seems to be rather low, earthquakes can cause great damage and losses, especially in densely populated and industrialized areas. It is well known that equations which predict intensity as a function of magnitude and distance, among other parameters, are useful tools for hazard and risk assessment. Therefore, this study aims to determine an empirical model of the ground shaking intensities (ShakeMap) of a series of earthquakes that occurred in Austria between 1000 and 2014. Furthermore, the obtained empirical model will support further interpretation of both contemporary and historical earthquakes. A total of 285 events whose epicenters were located in Austria, and a total of 22,739 reported macroseismic data points from Austria and adjoining countries, were used. These events span the period 1000-2014 and are characterized by local magnitudes greater than 3. In the first stage of model development, the data were carefully selected; e.g., only intensities of III or greater were used. In a second stage, the data were fitted to the selected empirical model. Finally, geology and topography corrections were obtained by means of the model residuals in order to derive intensity-based site amplification effects.
McKee, Richard H; Tibaldi, Rosalie; Adenuga, Moyinoluwa D; Carrillo, Juan-Carlos; Margary, Alison
2018-02-01
The European chemical control regulation (REACH) requires that data on physical/chemical, toxicological and environmental hazards be compiled. Additionally, REACH requires formal assessments to ensure that substances can be safely used for their intended purposes. For health hazard assessments, reference values (Derived No Effect levels, DNELs) are calculated from toxicology data and compared to estimated exposure levels. If the ratio of the predicted exposure level to the DNEL, i.e. the Risk Characterization Ratio (RCR), is less than 1, the risk is considered controlled; otherwise, additional Risk Management Measures (RMM) must be applied. These requirements pose particular challenges for complex substances. Herein, "white spirit", a complex hydrocarbon solvent, is used as an example to illustrate how these procedures were applied. Hydrocarbon solvents were divided into categories of similar substances. Representative substances were identified for DNEL determinations. Adjustment factors were applied to the no effect levels to calculate the DNELs. Exposure assessments utilized a standardized set of generic exposure scenarios (GES) which incorporated exposure predictions for solvent handling activities. Computer-based tools were developed to automate RCR calculations and identify appropriate RMMs, allowing consistent communications to users via safety data sheets. Copyright © 2017 ExxonMobil Biomedical Sciences Inc. Published by Elsevier Inc. All rights reserved.
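The risk characterization step reduces to a ratio test, as in this minimal sketch; the exposure and DNEL numbers are hypothetical, whereas in practice the exposures come from the generic exposure scenarios (GES) described above.

```python
# RCR = predicted exposure / DNEL; RCR < 1 means risk is controlled,
# otherwise additional risk management measures (RMM) are required.
# All values below are hypothetical illustrations.

def risk_characterization_ratio(exposure, dnel):
    return exposure / dnel

scenarios = {
    "industrial spray coating": (35.0, 100.0),     # (exposure, DNEL), mg/m3
    "consumer brush application": (140.0, 100.0),
}
for name, (exposure, dnel) in scenarios.items():
    rcr = risk_characterization_ratio(exposure, dnel)
    verdict = "risk controlled" if rcr < 1 else "apply additional RMMs"
    print(f"{name}: RCR = {rcr:.2f} -> {verdict}")
```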
The GOES-R/JPSS Approach for Identifying Hazardous Low Clouds: Overview and Operational Impacts
NASA Astrophysics Data System (ADS)
Calvert, Corey; Pavolonis, Michael; Lindstrom, Scott; Gravelle, Chad; Terborg, Amanda
2017-04-01
Low ceiling and visibility is a weather hazard that nearly every forecaster, in nearly every National Weather Service (NWS) Weather Forecast Office (WFO), must regularly address. In addition, national forecast centers such as the Aviation Weather Center (AWC), Alaska Aviation Weather Unit (AAWU) and the Ocean Prediction Center (OPC) are responsible for issuing low ceiling and visibility related products. As such, reliable methods for detecting and characterizing hazardous low clouds are needed. Traditionally, hazardous areas of Fog/Low Stratus (FLS) are identified using a simple stand-alone satellite product that is constructed by subtracting the 3.9 and 11 μm brightness temperatures. However, the 3.9-11 μm brightness temperature difference (BTD) has several major limitations. In an effort to address the limitations of the BTD product, the GOES-R Algorithm Working Group (AWG) developed an approach that fuses satellite, Numerical Weather Prediction (NWP) model, Sea Surface Temperature (SST) analyses, and other data sets (e.g. digital surface elevation maps, surface emissivity maps, and surface type maps) to determine the probability that hazardous low clouds are present using a naïve Bayesian classifier. In addition, recent research has focused on blending geostationary (e.g. GOES-R) and low earth orbit (e.g. JPSS) satellite data to further improve the products. The FLS algorithm has adopted an enterprise approach in that it can utilize satellite data from a variety of current and future operational sensors and NWP data from a variety of models. The FLS products are available in AWIPS/N-AWIPS/AWIPS-II and have been evaluated within NWS operations over the last four years as part of the Satellite Proving Ground. Forecaster feedback has been predominantly positive and references to these products within Area Forecast Discussions (AFD's) indicate that the products are influencing operational forecasts. At the request of the NWS, the FLS products are currently being transitioned to NOAA/NESDIS operations, which will ensure that users have long-term access to these products. This paper will provide an overview of the FLS products and illustrate how they are being used to improve transportation safety and efficiency.
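A naive Bayesian classifier of the kind described combines evidence from each input under an independence assumption; the sketch below shows the odds-form update, with a hypothetical prior and likelihood ratios standing in for the satellite, NWP, and SST predictors.

```python
import numpy as np

def naive_bayes_prob(prior, likelihood_ratios):
    """P(FLS | predictors) given per-predictor likelihood ratios
    P(x_i | FLS) / P(x_i | no FLS), assumed conditionally independent."""
    odds = prior / (1.0 - prior) * np.prod(likelihood_ratios)
    return odds / (1.0 + odds)

p = naive_bayes_prob(prior=0.05, likelihood_ratios=np.array([4.0, 2.5, 1.8]))
print(f"P(hazardous low cloud) = {p:.2f}")
```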
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cui, Yi; Global Institution for Collaborative Research and Education, Hokkaido University, Sapporo; Song, Jie
Purpose: To identify prognostic biomarkers in pancreatic cancer using high-throughput quantitative image analysis. Methods and Materials: In this institutional review board-approved study, we retrospectively analyzed images and outcomes for 139 locally advanced pancreatic cancer patients treated with stereotactic body radiation therapy (SBRT). The overall population was split into a training cohort (n=90) and a validation cohort (n=49) according to the time of treatment. We extracted quantitative imaging characteristics from pre-SBRT ¹⁸F-fluorodeoxyglucose positron emission tomography, including statistical, morphologic, and texture features. A Cox proportional hazard regression model was built to predict overall survival (OS) in the training cohort using 162 robust image features. To avoid over-fitting, we applied the elastic net to obtain a sparse set of image features, whose linear combination constitutes a prognostic imaging signature. Univariate and multivariate Cox regression analyses were used to evaluate the association with OS, and concordance index (CI) was used to evaluate the survival prediction accuracy. Results: The prognostic imaging signature included 7 features characterizing different tumor phenotypes, including shape, intensity, and texture. On the validation cohort, univariate analysis showed that this prognostic signature was significantly associated with OS (P=.002, hazard ratio 2.74), improving upon conventional imaging predictors including tumor volume, maximum standardized uptake value, and total lesion glycolysis (P=.018-.028, hazard ratio 1.51-1.57). On multivariate analysis, the proposed signature was the only significant prognostic index (P=.037, hazard ratio 3.72) when adjusted for conventional imaging and clinical factors (P=.123-.870, hazard ratio 0.53-1.30). In terms of CI, the proposed signature scored 0.66 and was significantly better than competing prognostic indices (CI 0.48-0.64, Wilcoxon rank sum test P<1e-6). Conclusion: Quantitative analysis identified novel ¹⁸F-fluorodeoxyglucose positron emission tomography image features that showed improved prognostic value over conventional imaging metrics. If validated in large, prospective cohorts, the new prognostic signature might be used to identify patients for individualized risk-adaptive therapy.
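The statistical pipeline (Cox regression with an elastic-net penalty for sparse feature selection) can be sketched with the lifelines package; the handful of feature columns and the penalty settings below are illustrative stand-ins, not the study's 162 features or tuned hyperparameters.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Toy survival table: follow-up time, event indicator, and a few image
# features standing in for the extracted PET characteristics.
df = pd.DataFrame({
    "os_months": [14.0, 8.5, 22.1, 5.3, 17.8, 11.2, 26.4, 9.9],
    "dead": [1, 1, 0, 1, 0, 1, 0, 1],
    "tumor_volume": [35.2, 60.1, 20.4, 88.0, 25.7, 47.3, 18.9, 52.6],
    "suv_max": [6.1, 9.4, 4.2, 11.8, 5.0, 7.7, 3.9, 8.6],
    "texture_entropy": [4.8, 5.9, 3.7, 6.3, 4.1, 5.2, 3.5, 5.6],
})

# penalizer scales the total penalty; l1_ratio mixes L1 (sparsity-inducing,
# as in the elastic net above) with L2 shrinkage.
cph = CoxPHFitter(penalizer=0.1, l1_ratio=0.5)
cph.fit(df, duration_col="os_months", event_col="dead")
print(cph.summary[["coef", "exp(coef)"]])
```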
NASA Astrophysics Data System (ADS)
Lane, E. M.; Gillibrand, P. A.; Wang, X.; Power, W.
2013-09-01
Regional source tsunamis pose a potentially devastating hazard to communities and infrastructure on the New Zealand coast, but major events are very uncommon. This dichotomy of infrequent but potentially devastating hazards makes realistic assessment of the risk challenging. Here, we describe a method to determine a probabilistic assessment of the hazard posed by regional source tsunamis with an "Average Recurrence Interval" of 2,500 years. The method is applied to the east Auckland region of New Zealand. From an assessment of potential regional tsunamigenic events over 100,000 years, the inundation of the Auckland region from the worst 100 events was modelled using a hydrodynamic model, and probabilistic inundation depths on a 2,500-year time scale were determined. Tidal effects on the potential inundation were included by coupling the predicted wave heights with the probability density function of tidal heights at the inundation site. Results show that the more exposed northern section of the east coast and the outer islands in the Hauraki Gulf face the greatest hazard from regional tsunamis in the Auckland region. Incorporating tidal effects into predictions of inundation reduced the predicted hazard compared to modelling all tsunamis as arriving at high tide, giving a more accurate hazard assessment on the specified time scale. This study presents the first probabilistic analysis of dynamic modelling of tsunami inundation for the New Zealand coast and as such provides the most comprehensive assessment available to date of tsunami inundation of the Auckland region from regional source tsunamis.
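The tidal coupling amounts to weighting modelled inundation by the probability of each tidal stage rather than assuming high tide, as in this minimal sketch with a discretized tide distribution and hypothetical depths.

```python
import numpy as np

tide_levels = np.array([-1.0, -0.5, 0.0, 0.5, 1.0])     # m relative to MSL
tide_prob = np.array([0.10, 0.20, 0.40, 0.20, 0.10])    # P(tide stage)
depth_at_tide = np.array([0.0, 0.2, 0.6, 1.1, 1.6])     # modelled depth (m)

expected_depth = np.sum(tide_prob * depth_at_tide)
p_depth_over_1m = np.sum(tide_prob[depth_at_tide > 1.0])
high_tide_depth = depth_at_tide[-1]                     # all-at-high-tide case
print(expected_depth, p_depth_over_1m, high_tide_depth)
```

Weighting by the tide distribution pulls the hazard estimate below the all-at-high-tide value, which is the reduction the study reports.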
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harper, Bryan; Thomas, Dennis G.; Chikkagoudar, Satish
The integration of rapid assays, large data sets, informatics and modeling can overcome current barriers in understanding nanomaterial structure-toxicity relationships by providing a weight-of-the-evidence mechanism to generate hazard rankings for nanomaterials. Here we present the use of a rapid, low-cost assay to perform screening-level toxicity evaluations of nanomaterials in vivo. Calculated EZ Metric scores, a combined measure of morbidity and mortality, were established at realistic exposure levels and used to develop a predictive model of nanomaterial toxicity. Hazard ranking and clustering analysis of 68 diverse nanomaterials revealed distinct patterns of toxicity related to both core composition and outermost surface chemistry of nanomaterials. The resulting clusters guided the development of a predictive model of gold nanoparticle toxicity to embryonic zebrafish. In addition, our findings suggest that risk assessments based on the size and core composition of nanomaterials alone may be wholly inappropriate, especially when considering complex engineered nanomaterials. These findings reveal the need to expeditiously increase the availability of quantitative measures of nanomaterial hazard and broaden the sharing of that data and knowledge to support predictive modeling. In addition, research should continue to focus on methodologies for developing predictive models of nanomaterial hazard based on sub-lethal responses to low dose exposures.
Multisensor of Remotely Sensed Data for Characterizing Seismotectonic Activities in Malaysia
NASA Astrophysics Data System (ADS)
Abu Bakar, Rabieahtul; Azahari Razak, Khamarrul; Anuar Jamaludin, Tajul; Tongkul, Felix; Mohamad, Zakaria; Ramli, Zamri; Abd Manap, Mohamad; Rahman, Muhammad Zulkarnain Abdul
2015-04-01
Seismically induced events pose serious hazards yet are difficult to predict. Despite remarkable efforts in mapping, monitoring and modelling such great events at regional or local scales, the understanding of the processes in the Earth's dynamic system remains elusive. Although Malaysia is in a relatively low seismic hazard zone, the current trend and pattern of seismotectonic activities triggered a series of fundamental studies to better understand the relationship between the earthquakes, recent tectonics and seismically active fault zones. Several conventional mapping techniques have been used intensively but have shown limitations. Remote sensing is the preferred means to quantify seismic activity accurately over a large area within a short period, yet only a few such studies have been carried out in this subduction region. Characterization of seismotectonic activities from space in a tropical environment is very challenging given the complexity of its physiographic, climatic and geologic conditions and of anthropogenic activities. Many factors control the success rate of the implementation, chiefly the scarcity of historical earthquakes and geomorphological evidence and the difficulty of properly identifying regional tectonic patterns. In this study, we aim to provide better insight into extracting and characterizing seismotectonic activities by integrating passive and active remotely sensed data, geodetic data, historical records, GIS-based data analysis and in-situ measurements, and to quantify them based on field investigation and expert knowledge. It is crucial to perform spatiotemporal analysis of these activities in the most seismically induced region in North-Western Sabah. A comprehensive geodatabase of seismotectonic events was developed and allowed us to analyse the spatiotemporal activities. A novel object-based image method for extracting tropical seismically active faults and related seismotectonic features is introduced and evaluated. We aim to develop an exchangeable and transferable rule-set with optimal parameterization for the aforementioned tasks. A geomorphometric remotely sensed approach is used to understand the tectonic geomorphology and the processes affecting the environment at different spatial scales. As a result of this study, questions related to cascading natural disasters, e.g., landslides, can be quantitatively answered. Development and applications of seismically induced landslide hazard and risk zonation at different scales are conceptually presented and critically discussed. So far, quantitative evaluation of the uncertainties associated with spatial seismic hazard and risk prediction remains very challenging and is the subject of on-going research. In the near future, it is crucial to address changes in climate and land use/land cover in relation to the temporal and spatial pattern of seismically induced landslides. It is also important to assess, model and incorporate the changes due to natural disasters into sustainable risk management. In conclusion, the characteristics, development and function of tectonic movement, as one component of the geomorphological process-response system, are crucial for a regional seismic study. Newly emerging multi-sensor remotely sensed data, coupled with satellite positioning systems, promise a better mapping and monitoring tool for seismotectonic activities, one that can be used to map, monitor, and model related seismically induced processes for a comprehensive hazard and associated risk assessment.
NASA Astrophysics Data System (ADS)
Moschetti, M. P.; Rennolet, S.; Thompson, E.; Yeck, W.; McNamara, D. E.; Herrmann, R. B.; Powers, P.; Hoover, S. M.
2016-12-01
Recent efforts to characterize the seismic hazard resulting from increased seismicity rates in Oklahoma and Kansas highlight the need for a regionalized ground motion characterization. To support these efforts, we measure and compile strong ground motions and compare the average ground motion intensity measures (IMs) with existing ground motion prediction equations (GMPEs). IMs are computed for available broadband and strong-motion records from M≥3 earthquakes occurring January 2009-April 2016, using standard strong motion processing guidelines. We verified our methods by comparing results for specific earthquakes with other standard procedures such as the USGS ShakeMap system. The large number of records required an automated processing scheme, which was complicated by the extremely high rate of small-magnitude earthquakes during 2014-2016. Orientation-independent IMs include peak ground motions (acceleration and velocity) and pseudo-spectral accelerations (5 percent damping, 0.1-10 s period). Metadata for the records included relocated event hypocenters. The database includes more than 160,000 records from about 3,200 earthquakes. Estimates of the mean and standard deviation of the IMs are computed by distance binning at intervals of 2 km. Mean IMs exhibit a clear break in geometrical attenuation at epicentral distances of about 50-70 km, which is consistent with previous studies in the CEUS. Comparisons of these ground motions with modern GMPEs provide some insight into the IMs of induced earthquakes in Oklahoma and Kansas relative to the western U.S. and the central and eastern U.S. The site response for these stations is uncertain because very little is known about shallow seismic velocity in the region, and we make no attempt to correct observed IMs to a reference site condition. At close distances, the observed IMs are lower than the predictions of the seed GMPEs of the NGA-East project (and roughly consistent with NGA-West-2 ground motions). This ground motion database may be used to inform future seismic hazard forecast models and the development of regionally appropriate GMPEs.
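The binned statistics described here reduce to grouping log-IMs by epicentral distance, as in this sketch on synthetic data (2-km bins, a minimum record count per bin, and a made-up attenuation relation).

```python
import numpy as np

rng = np.random.default_rng(1)
dist = rng.uniform(2.0, 200.0, 5000)            # epicentral distance (km)
ln_pga = -1.5 * np.log(dist) + rng.normal(0.0, 0.6, dist.size)  # synthetic IMs

edges = np.arange(0.0, 202.0, 2.0)              # 2-km distance bins
bin_idx = np.digitize(dist, edges)
for b in range(1, len(edges)):
    in_bin = bin_idx == b
    if in_bin.sum() >= 10 and edges[b] % 20 == 0:   # print a subsample
        mu, sd = ln_pga[in_bin].mean(), ln_pga[in_bin].std()
        print(f"{edges[b - 1]:5.1f}-{edges[b]:5.1f} km: "
              f"mean lnPGA = {mu:6.2f}, sd = {sd:4.2f}")
```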
Ionospheric manifestations of earthquakes and tsunamis in a dynamic atmosphere
NASA Astrophysics Data System (ADS)
Godin, Oleg A.; Zabotin, Nikolay A.; Zabotina, Liudmila
2015-04-01
Observations of the ionosphere provide a new, promising modality for characterizing large-scale physical processes that occur on land and in the ocean. There is a large and rapidly growing body of evidence that a number of natural hazards, including large earthquakes, strong tsunamis, and powerful tornadoes, have pronounced ionospheric manifestations, which are reliably detected by ground-based and satellite-borne instruments. As the focus shifts from detecting the ionospheric features associated with the natural hazards to characterizing the hazards for the purposes of improving early warning systems and contributing to disaster recovery, it becomes imperative to relate quantitatively characteristics of the observed ionospheric disturbances and the underlying natural hazard. The relation between perturbations at the ground level and their ionospheric manifestations is strongly affected by parameters of the intervening atmosphere. In this paper, we employ the ray theory to model propagation of acoustic-gravity waves in three-dimensionally inhomogeneous atmosphere. Huygens' wavefront-tracing and Hamiltonian ray-tracing algorithms are used to simulate propagation of body waves from an earthquake hypocenter through the earth's crust and ocean to the upper atmosphere. We quantify the influence of temperature stratification and winds, including their seasonal variability, and air viscosity and thermal conductivity on the geometry and amplitude of ionospheric disturbances that are generated by seismic surface waves and tsunamis. Modeling results are verified by comparing observations of the velocity fluctuations at altitudes of 150-160 km by a coastal Dynasonde HF radar system with theoretical predictions of ionospheric manifestations of background infragravity waves in the ocean. Dynasonde radar systems are shown to be a promising means for monitoring acoustic-gravity wave activity and observing ionospheric perturbations due to earthquakes and tsunamis. We will discuss the effects of the background ionospheric disturbances and uncertainty in atmospheric parameters on the feasibility and accuracy of retrieval of the open-ocean tsunami heights from observations of the ionosphere.
Mild cognitive impairment as a risk factor for Parkinson's disease dementia.
Hoogland, Jeroen; Boel, Judith A; de Bie, Rob M A; Geskus, Ronald B; Schmand, Ben A; Dalrymple-Alford, John C; Marras, Connie; Adler, Charles H; Goldman, Jennifer G; Tröster, Alexander I; Burn, David J; Litvan, Irene; Geurtsen, Gert J
2017-07-01
The International Parkinson and Movement Disorder Society criteria for mild cognitive impairment in Parkinson's disease (PD) were recently formulated. The aim of this international study was to evaluate the predictive validity of the comprehensive (level II) version of these criteria by assessing their contribution to the hazard of PD dementia. Individual patient data were selected from four separate studies on cognition in PD that provided information on demographics, motor examination, depression, neuropsychological examination suitable for application of the level II criteria, and longitudinal follow-up for conversion to dementia. Survival analysis evaluated the predictive value of the level II criteria for cognitive decline toward dementia, expressed as the relative hazard of dementia. A total of 467 patients were included. The analyses showed a clear contribution of impairment according to level II mild cognitive impairment criteria, age, and severity of PD motor symptoms to the hazard of dementia. There was a trend of increasing hazard of dementia with declining neuropsychological performance. This is the first large international study evaluating the predictive validity of the level II mild cognitive impairment criteria for PD. The results showed a clear and unique contribution of classification according to level II criteria to the hazard of PD dementia. This finding supports their predictive validity and shows that they contribute important new information on the hazard of dementia, beyond known demographic and PD-specific factors of influence. © 2017 International Parkinson and Movement Disorder Society.
NASA Astrophysics Data System (ADS)
Blauhut, V.; Stahl, K.; Stagge, J. H.; Tallaksen, L. M.; De Stefano, L.; Vogt, J.
2015-12-01
Drought is one of the most costly natural hazards in Europe. Due to its complexity, drought risk, the combination of the natural hazard and societal vulnerability, is difficult to define and challenging to detect and predict, as the impacts of drought are very diverse, covering the breadth of socioeconomic and environmental systems. Pan-European maps of drought risk could inform the elaboration of guidelines and policies to address its documented severity and impact across borders. This work (1) tests the capability of commonly applied hazard indicators and vulnerability factors to predict annual drought impact occurrence for different sectors and macro regions in Europe and (2) combines information on past drought impacts, drought hazard indicators, and vulnerability factors into estimates of drought risk at the pan-European scale. This "hybrid approach" bridges the gap between traditional vulnerability assessment and probabilistic impact forecasting in a statistical modelling framework. Multivariable logistic regression was applied to predict the likelihood of impact occurrence on an annual basis for particular impact categories and European macro regions. The results indicate sector- and macro-region-specific sensitivities of hazard indicators, with the Standardised Precipitation Evapotranspiration Index for a twelve-month aggregation period (SPEI-12) as the overall best hazard predictor. Vulnerability factors have only limited ability to predict drought impacts as single predictors, with information about land use and water resources as the best vulnerability-based predictors. (3) The application of the "hybrid approach" revealed strong regional (NUTS combo level) and sector-specific differences in drought risk across Europe. The majority of the best predictor combinations rely on a combination of SPEI for shorter and longer aggregation periods, and a combination of information on land use and water resources. The added value of integrating regional vulnerability information into drought risk prediction was demonstrated. Thus, the study contributes to the overall understanding of the drivers of drought impacts, to current practice in selecting drought indicators for specific applications, and to drought risk assessment.
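A minimal sketch of the hybrid step on synthetic data: multivariable logistic regression predicting annual impact occurrence, with SPEI-12 as the hazard indicator and two made-up vulnerability factors standing in for the land-use and water-resources information.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n = 300                                        # region-years
spei12 = rng.normal(0.0, 1.0, n)               # hazard indicator (dry < 0)
land_use = rng.uniform(0.0, 1.0, n)            # vulnerability factors
water_resources = rng.uniform(0.0, 1.0, n)

logit = -0.5 - 1.8 * spei12 - 1.0 * water_resources + 0.5 * land_use
impact = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))  # synthetic impact years

X = np.column_stack([spei12, land_use, water_resources])
model = LogisticRegression().fit(X, impact)
print(model.coef_)                             # predictor sensitivities
# Likelihood of impact occurrence in a dry year (SPEI-12 = -2):
print(model.predict_proba([[-2.0, 0.5, 0.3]])[0, 1])
```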
Multivariate Drought Characterization in India for Monitoring and Prediction
NASA Astrophysics Data System (ADS)
Sreekumaran Unnithan, P.; Mondal, A.
2016-12-01
Droughts are one of the most important natural hazards, affecting society significantly in terms of mortality and productivity. The metric most widely used by the India Meteorological Department (IMD) to monitor and predict the occurrence, spread, intensification and termination of drought is based on the univariate Standardized Precipitation Index (SPI). However, droughts may be caused by the influence and interaction of many variables (such as precipitation, soil moisture, runoff, etc.), emphasizing the need for a multivariate approach to drought characterization. This study advocates and illustrates use of the recently proposed multivariate standardized drought index (MSDI) in monitoring and predicting drought and assessing the associated risk in the Indian region. MSDI combines information from multiple sources, precipitation and soil moisture, and has been deemed a more reliable drought index. All-India monthly rainfall and soil moisture data sets are analysed for the period 1980 to 2014 to characterize historical droughts using both the univariate indices, the precipitation-based SPI and the standardized soil moisture index (SSI), as well as the multivariate MSDI using parametric and non-parametric approaches. We confirm that MSDI can capture the droughts of 1986 and 1990 that are not detected using SPI alone. Moreover, in 1987, MSDI indicated a higher severity of drought when a deficiency in both soil moisture and precipitation was encountered. Further, this study also explores the use of MSDI for drought forecasts and assesses its performance vis-à-vis existing predictions from the IMD. Future research efforts will be directed towards formulating a more robust standardized drought indicator that can take into account socio-economic aspects, which also play a key role for water-stressed regions such as India.
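One common nonparametric construction of the MSDI transforms the empirical joint cumulative probability of precipitation and soil moisture through the inverse standard normal; the sketch below assumes that construction, uses the Gringorten plotting position, and runs on synthetic monthly data.

```python
import numpy as np
from scipy.stats import norm

def msdi(precip, soil):
    """Nonparametric MSDI: inverse-normal of the empirical joint CDF."""
    n = len(precip)
    joint = np.array([np.sum((precip <= precip[i]) & (soil <= soil[i]))
                      for i in range(n)])
    p_emp = (joint - 0.44) / (n + 0.12)    # Gringorten plotting position
    return norm.ppf(p_emp)

rng = np.random.default_rng(3)
precip = rng.gamma(2.0, 50.0, 420)               # 35 years of monthly rainfall
soil = 0.6 * precip + rng.normal(0.0, 20.0, 420) # correlated soil moisture
index = msdi(precip, soil)
print(index.min(), index.max())                  # strongly negative = drought
```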
Wang, Yan; Deng, Lei; Caballero-Guzman, Alejandro; Nowack, Bernd
2016-12-01
Nano iron oxide particles are beneficial to our daily lives through their use in paints, construction materials, biomedical imaging and other industrial fields. However, little is known about the possible risks associated with the current exposure level of engineered nano iron oxides (nano-FeOX) to organisms in the environment. The goal of this study was to predict the release of nano-FeOX to the environment and assess their risks for surface waters in the EU and Switzerland. The material flows of nano-FeOX to technical compartments (waste incineration and waste water treatment plants) and to the environment were calculated with a probabilistic modeling approach. The mean value of the predicted environmental concentrations (PECs) of nano-FeOX in surface waters in the EU for a worst-case scenario (no particle sedimentation) was estimated to be 28 ng/L. Using a probabilistic species sensitivity distribution, the predicted no-effect concentration (PNEC) was determined from ecotoxicological data. The risk characterization ratio, calculated by dividing the PEC by PNEC values, was used to characterize the risks. The mean risk characterization ratio was predicted to be several orders of magnitude smaller than 1 (1.4 × 10⁻⁴). Therefore, this modeling effort indicates that only a very limited risk is posed by the current release level of nano-FeOX to organisms in surface waters. However, a better understanding of the hazards of nano-FeOX to organisms in other ecosystems (such as sediment) needs to be developed to determine the overall risk of these particles to the environment.
Progress in Development of an Airborne Turbulence Detection System
NASA Technical Reports Server (NTRS)
Hamilton, David W.; Proctor, Fred H.
2006-01-01
Aircraft encounters with turbulence are the leading cause of in-flight injuries (Tvaryanas 2003) and have occasionally resulted in passenger and crew fatalities. Most of these injuries are caused by sudden and unexpected encounters with severe turbulence in and around convective activity (Kaplan et al 2005). To alleviate this problem, the Turbulence Prediction and Warning Systems (TPAWS) element of NASA's Aviation Safety program has investigated technologies to detect and warn of hazardous in-flight turbulence. This effort has required the numerical modeling of atmospheric convection: 1) for characterizing convectively induced turbulence (CIT) environments, 2) for defining turbulence hazard metrics, and 3) as a means of providing realistic three-dimensional data sets that can be used to test and evaluate turbulence detection sensors. The data sets are being made available to industry and the FAA for certification of future airborne turbulence-detection systems (ATDS) with warning capability. Early in the TPAWS project, a radar-based ATDS was installed and flight tested on NASA's research aircraft, a B-757. This ATDS utilized new algorithms and hazard metrics that were developed for use with existing airborne predictive windshear radars, thus avoiding the installation of new hardware. The system was designed to detect and warn of hazardous CIT even in regions with weak radar reflectivity (i.e. 5-15 dBz). Results from an initial flight test of the ATDS were discussed in Hamilton and Proctor (2002a; 2002b). In companion papers (Proctor et al 2002a; 2002b), a numerical simulation of the most significant encounter from that flight test was presented. Since the presentation of these papers a second flight test has been conducted, providing additional cases for examination. In this paper, we present results from NASA's flight test and a numerical model simulation of a turbulence environment encountered on 30 April 2002. Progress toward FAA certification of industry-built ATDS will also be discussed.
NASA Astrophysics Data System (ADS)
Weatherill, Graeme; Garcia, Julio; Poggi, Valerio; Chen, Yen-Shin; Pagani, Marco
2016-04-01
The Global Earthquake Model (GEM) has, since its inception in 2009, made many contributions to the practice of seismic hazard modeling in different regions of the globe. The OpenQuake-engine (hereafter referred to simply as OpenQuake), GEM's open-source software for calculation of earthquake hazard and risk, has found application in many countries, spanning a diversity of tectonic environments. GEM itself has produced a database of national and regional seismic hazard models, harmonizing into OpenQuake's own definition the varied seismogenic sources found therein. The characterization of active faults in probabilistic seismic hazard analysis (PSHA) is at the centre of this process, motivating many of the developments in OpenQuake and presenting hazard modellers with the challenge of reconciling seismological, geological and geodetic information for the different regions of the world. Faced with these challenges, and from the experience gained in the process of harmonizing existing models of seismic hazard, four critical issues are addressed. The challenge GEM has faced in the development of software is how to define a representation of an active fault (both in terms of geometry and earthquake behaviour) that is sufficiently flexible to adapt to different tectonic conditions and levels of data completeness. By exploring the different fault typologies supported by OpenQuake we illustrate how seismic hazard calculations can, and do, take into account complexities such as geometrical irregularity of faults in the prediction of ground motion, highlighting some of the potential pitfalls and inconsistencies that can arise. This exploration leads to the second main challenge in active fault modeling, what elements of the fault source model impact most upon the hazard at a site, and when does this matter? Through a series of sensitivity studies we show how different configurations of fault geometry, and the corresponding characterisation of near-fault phenomena (including hanging wall and directivity effects) within modern ground motion prediction equations, can have an influence on the seismic hazard at a site. Yet we also illustrate the conditions under which these effects may be partially tempered when considering the full uncertainty in rupture behaviour within the fault system. The third challenge is the development of efficient means for representing both aleatory and epistemic uncertainties from active fault models in PSHA. In implementing state-of-the-art seismic hazard models into OpenQuake, such as those recently undertaken in California and Japan, new modeling techniques are needed that redefine how we treat interdependence of ruptures within the model (such as mutual exclusivity), and the propagation of uncertainties emerging from geology. Finally, we illustrate how OpenQuake, and GEM's additional toolkits for model preparation, can be applied to address long-standing issues in active fault modeling in PSHA. These include constraining the seismogenic coupling of a fault and the partitioning of seismic moment between the active fault surfaces and the surrounding seismogenic crust. We illustrate some of the possible roles that geodesy can play in the process, but highlight where this may introduce new uncertainties and potential biases into the seismic hazard process, and how these can be addressed.
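For orientation, the core hazard integral such an engine evaluates for a single source can be sketched in a few lines: combine a magnitude-recurrence model with a GMPE's exceedance probability. Everything below (toy GMPE coefficients, Gutenberg-Richter a and b values, the site distance) is hypothetical, and a real engine such as OpenQuake adds 3-D rupture geometry, hanging-wall and directivity terms, and logic-tree handling on top.

```python
import numpy as np
from scipy.stats import norm

def gr_incremental_rates(mmin, mmax, a, b, dm=0.1):
    """Annual earthquake rates in magnitude bins of width dm."""
    mags = np.arange(mmin + dm / 2.0, mmax, dm)
    cumulative = lambda m: 10.0 ** (a - b * m)
    return mags, cumulative(mags - dm / 2.0) - cumulative(mags + dm / 2.0)

def p_exceed(im_g, mag, r_km, sigma=0.6):
    """Toy GMPE: P(PGA > im_g | magnitude, distance)."""
    ln_mean = -4.0 + 1.0 * mag - 1.5 * np.log(r_km + 10.0)
    return norm.sf(np.log(im_g), loc=ln_mean, scale=sigma)

mags, rates = gr_incremental_rates(5.0, 7.5, a=3.5, b=1.0)
annual_rate = np.sum(rates * p_exceed(0.2, mags, r_km=15.0))  # PGA > 0.2 g
p_50yr = 1.0 - np.exp(-annual_rate * 50.0)                    # Poisson model
print(f"annual exceedance rate {annual_rate:.2e}; 50-yr probability {p_50yr:.2f}")
```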
NASA Technical Reports Server (NTRS)
Evans, Diane
2012-01-01
Objective 2.1.1: Improve understanding of and improve the predictive capability for changes in the ozone layer, climate forcing, and air quality associated with changes in atmospheric composition. Objective 2.1.2: Enable improved predictive capability for weather and extreme weather events. Objective 2.1.3: Quantify, understand, and predict changes in Earth s ecosystems and biogeochemical cycles, including the global carbon cycle, land cover, and biodiversity. Objective 2.1.4: Quantify the key reservoirs and fluxes in the global water cycle and assess water cycle change and water quality. Objective 2.1.5: Improve understanding of the roles of the ocean, atmosphere, land and ice in the climate system and improve predictive capability for its future evolution. Objective 2.1.6: Characterize the dynamics of Earth s surface and interior and form the scientific basis for the assessment and mitigation of natural hazards and response to rare and extreme events. Objective 2.1.7: Enable the broad use of Earth system science observations and results in decision-making activities for societal benefits.
Nanomaterial characterization: considerations and needs for hazard assessment and safety evaluation.
Boverhof, Darrell R; David, Raymond M
2010-02-01
Nanotechnology is a rapidly emerging field of great interest and promise. As new materials are developed and commercialized, hazard information also needs to be generated to reassure regulators, workers, and consumers that these materials can be used safely. The biological properties of nanomaterials are closely tied to the physical characteristics, including size, shape, dissolution rate, agglomeration state, and surface chemistry, to name a few. Furthermore, these properties can be altered by the medium used to suspend or disperse these water-insoluble particles. However, the current toxicology literature lacks much of the characterization information that allows toxicologists and regulators to develop "rules of thumb" that could be used to assess potential hazards. To effectively develop these rules, toxicologists need to know the characteristics of the particle that interacts with the biological system. This void leaves the scientific community with no options other than to evaluate all materials for all potential hazards. Lack of characterization could also lead to different laboratories reporting discordant results on seemingly the same test material because of subtle differences in the particle or differences in the dispersion medium used that resulted in altered properties and toxicity of the particle. For these reasons, good characterization using a minimal characterization data set should accompany and be required of all scientific publications on nanomaterials.
Next-Level ShakeZoning for Earthquake Hazard Definition in Nevada
NASA Astrophysics Data System (ADS)
Louie, J. N.; Savran, W. H.; Flinchum, B. A.; Dudley, C.; Prina, N.; Pullammanappallil, S.; Pancha, A.
2011-12-01
We are developing "Next-Level ShakeZoning" procedures tailored for defining earthquake hazards in Nevada. The current Federally sponsored tools- the USGS hazard maps and ShakeMap, and FEMA HAZUS- were developed as statistical summaries to match earthquake data from California, Japan, and Taiwan. The 2008 Wells and Mogul events in Nevada showed in particular that the generalized statistical approach taken by ShakeMap cannot match actual data on shaking from earthquakes in the Intermountain West, even to first order. Next-Level ShakeZoning relies on physics and geology to define earthquake shaking hazards, rather than statistics. It follows theoretical and computational developments made over the past 20 years, to capitalize on detailed and specific local data sets to more accurately model the propagation and amplification of earthquake waves through the multiple geologic basins of the Intermountain West. Excellent new data sets are now available for Las Vegas Valley. Clark County, Nevada has completed the nation's very first effort to map earthquake hazard class systematically through an entire urban area using Optim's SeisOpt° ReMi technique, which was adapted for large-scale data collection. Using the new Parcel Map in computing shaking in the Valley for scenario earthquakes is crucial for obtaining realistic predictions of ground motions. In an educational element of the project, a dozen undergraduate students have been computing 50 separate earthquake scenarios affecting Las Vegas Valley, using the Next-Level ShakeZoning process. Despite affecting only the upper 30 meters, the Vs30 geotechnical shear-velocity from the Parcel Map shows clear effects on 3-d shaking predictions computed so far at frequencies from 0.1 Hz up to 1.0 Hz. The effect of the Parcel Map on even the 0.1-Hz waves is prominent even with the large mismatch of wavelength to geotechnical depths. Amplifications and de-amplifications affected by the Parcel Map exceed a factor of two, and are highly dependent on the particular scenario. As well, Parcel Map amplification effects extend into areas not characterized in the Parcel Map. The fully 3-d Next-Level ShakeZoning scenarios show many areas of shaking amplification and de-amplification that USGS ShakeMap scenarios cannot predict. For example, the Frenchman Mountain scenario shows PGV of the two approaches within 15% of each other near the source, but upwards of 200% relative amplification or de-amplification, depending on location, throughout Las Vegas Valley.
Risk Management for Wilderness Programs.
ERIC Educational Resources Information Center
Schimelpfenig, Tod
This paper discusses subjective hazards in wilderness activities and suggests means of assessing and managing related risks. Wilderness educators conveniently group hazards into objective and subjective ones. Objective hazards such as rockfall, moving water, and weather, while not necessarily predictable, are visible and understandable. Subjective…
Wang, Yuanjia; Chen, Tianle; Zeng, Donglin
2016-01-01
Learning risk scores to predict dichotomous or continuous outcomes using machine learning approaches has been studied extensively. However, how to learn risk scores for time-to-event outcomes subject to right censoring has received little attention until recently. Existing approaches rely on inverse probability weighting or rank-based regression, which may be inefficient. In this paper, we develop a new support vector hazards machine (SVHM) approach to predict censored outcomes. Our method is based on predicting the counting process associated with the time-to-event outcomes among subjects at risk via a series of support vector machines. Introducing counting processes to represent time-to-event data leads to a connection between support vector machines in supervised learning and hazards regression in standard survival analysis. To account for different at risk populations at observed event times, a time-varying offset is used in estimating risk scores. The resulting optimization is a convex quadratic programming problem that can easily incorporate non-linearity using kernel trick. We demonstrate an interesting link from the profiled empirical risk function of SVHM to the Cox partial likelihood. We then formally show that SVHM is optimal in discriminating covariate-specific hazard function from population average hazard function, and establish the consistency and learning rate of the predicted risk using the estimated risk scores. Simulation studies show improved prediction accuracy of the event times using SVHM compared to existing machine learning methods and standard conventional approaches. Finally, we analyze two real world biomedical study data where we use clinical markers and neuroimaging biomarkers to predict age-at-onset of a disease, and demonstrate superiority of SVHM in distinguishing high risk versus low risk subjects.
In silico prediction of drug-induced myelotoxicity by using Naïve Bayes method.
Zhang, Hui; Yu, Peng; Zhang, Teng-Guo; Kang, Yan-Li; Zhao, Xiao; Li, Yuan-Yuan; He, Jia-Hui; Zhang, Ji
2015-11-01
Drug-induced myelotoxicity usually leads to decrease the production of platelets, red cells, and white cells. Thus, early identification and characterization of myelotoxicity hazard in drug development is very necessary. The purpose of this investigation was to develop a prediction model of drug-induced myelotoxicity by using a Naïve Bayes classifier. For comparison, other prediction models based on support vector machine and single-hidden-layer feed-forward neural network methods were also established. Among all the prediction models, the Naïve Bayes classification model showed the best prediction performance, which offered an average overall prediction accuracy of [Formula: see text] for the training set and [Formula: see text] for the external test set. The significant contributions of this study are that we first developed a Naïve Bayes classification model of drug-induced myelotoxicity adverse effect using a larger scale dataset, which could be employed for the prediction of drug-induced myelotoxicity. In addition, several important molecular descriptors and substructures of myelotoxic compounds have been identified, which should be taken into consideration in the design of new candidate compounds to produce safer and more effective drugs, ultimately reducing the attrition rate in later stages of drug development.
Sensing Hazards with Operational Unmanned Technology
NASA Astrophysics Data System (ADS)
Hood, R. E.
2016-12-01
The Unmanned Aircraft Systems (UAS) Program of the National Oceanic and Atmospheric Administration (NOAA) is working with the National Weather Service, the National Ocean Service, other Federal agencies, private industry, and academia to evaluate the feasibility of UAS observations to provide time critical information needed for situational awareness, prediction, warning, and damage assessment of hazards. This activity is managed within a portfolio of projects entitled "Sensing Hazards with Operational Unmanned Technology (SHOUT)." The diversity of this portfolio includes evaluations of high altitude UAS observations for high impact oceanic storms prediction to low altitude UAS observations of rivers, severe storms, and coastal areas for pre-hazard situational awareness and post-hazard damage assessments. Each SHOUT evaluation project begins with a proof-of-concept field demonstration of a UAS observing strategy for a given hazard and then matures to joint studies of both scientific data impact along with cost and operational feasibility of the observing strategy for routine applications. The technology readiness and preliminary evaulation results will be presented for several UAS observing strategies designed for improved observations of oceanic storms, floods, severe storms, and coastal ecosystem hazards.
44 CFR 201.4 - Standard State Mitigation Plans.
Code of Federal Regulations, 2011 CFR
2011-10-01
... reduce risks from natural hazards and serves as a guide for State decision makers as they commit resources to reducing the effects of natural hazards. (b) Planning process. An effective planning process is... risk assessments must characterize and analyze natural hazards and risks to provide a statewide...
HOW to Recognize and Reduce Tree Hazards in Recreation Sites
Kathyn Robbins
1986-01-01
An understanding of the many factors affecting tree hazards in recreation sites will help predict which trees are most likely to fail. Hazard tree management deals with probabilities of failure. This guide, written for anyone involved in management or maintenance of public use areas that contain trees, is intended to help minimize the risk associated with hazard trees...
Hisamatsu, Tadakazu; Ono, Nobukazu; Imaizumi, Akira; Mori, Maiko; Suzuki, Hiroaki; Uo, Michihide; Hashimoto, Masaki; Naganuma, Makoto; Matsuoka, Katsuyoshi; Mizuno, Shinta; Kitazume, Mina T.; Yajima, Tomoharu; Ogata, Haruhiko; Iwao, Yasushi; Hibi, Toshifumi; Kanai, Takanori
2015-01-01
Ulcerative colitis (UC) is characterized by chronic intestinal inflammation. Patients with UC have repeated remission and relapse. Clinical biomarkers that can predict relapse in UC patients in remission have not been identified. To facilitate the prediction of relapse of UC, we investigated the potential of novel multivariate indexes using statistical modeling of plasma free amino acid (PFAA) concentrations. We measured fasting PFAA concentrations in 369 UC patients in clinical remission, and 355 were observed prospectively for up to 1 year. Relapse rate within 1 year was 23% (82 of 355 patients). The age- and gender-adjusted hazard ratio for the lowest quartile compared with the highest quartile of plasma histidine concentration was 2.55 (95% confidence interval: 1.41–4.62; p = 0.0020 (log-rank), p for trend = 0.0005). We demonstrated that plasma amino acid profiles in UC patients in clinical remission can predict the risk of relapse within 1 year. Decreased histidine level in PFAAs was associated with increased risk of relapse. Metabolomics could be promising for the establishment of a non-invasive predictive marker in inflammatory bowel disease. PMID:26474176
NASA Astrophysics Data System (ADS)
Herrero, Andre; Spagnuolo, Elena; Akinci, Aybige; Pucci, Stefano
2016-04-01
In the present study we attempted to improve the seismic hazard assessment taking into account possible sources of epistemic uncertainty and the azimuthal variability of the ground motions which, at a particular site, is significantly influenced by the rupture mechanism and the rupture direction relative to the site. As a study area we selected Marmara Region (Turkey), especially the city of Istanbul which is characterized by one of the highest levels of seismic risk in Europe and the Mediterranean region. The seismic hazard in the city is mainly associated with two active fault segments which are located at about 20-30 km south of Istanbul. In this perspective first we proposed a methodology to incorporate this new information such as nucleation point in a probabilistic seismic hazard analysis (PSHA) framework. Secondly we introduced information about those fault segments by focusing on the fault rupture characteristics which affect the azimuthal variations of the ground motion spatial distribution i.e. source directivity effect and its influence on the probabilistic seismic hazard analyses (PSHA). An analytical model developed by Spudich and Chiou (2008) is used as a corrective factor that modifies the Next Generation Attenuation (NGA, Power et al. 2008) ground motion predictive equations (GMPEs) introducing rupture related parameters that generally lump together into the term directivity effect. We used the GMPEs as derived by the Abrahamson and Silva (2008) and the Boore and Atkinson (2008); our results are given in terms of 10% probability of exceedance of PSHA (at several periods from 0.5 s to 10 s) in 50 years on rock site condition; the correction for directivity introduces a significant contribution to the percentage ratio between the seismic hazards computed using the directivity model respect to the seismic hazard standard practice. In particular, we benefited the dynamic simulation from a previous study (Aochi & Utrich, 2015) aimed at evaluating the seismic potential of the Marmara region to derive a statistical distribution for nucleation position. Our results suggest that accounting for rupture related parameters in a PSHA using deterministic information from dynamic models is feasible and in particular, the use of a non-uniform statistical distribution for nucleation position has serious consequences on the hazard assessment. Since the directivity effect is conditional on the nucleation position the hazard map changes with the assumptions made. A worst case scenario (both the faults are rupturing towards the city of Istanbul) predicts up to 25% change than the standard formulation at 2 sec and increases with longer periods. The former result is heavily different if a deterministically based nucleation position is assumed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Serrato, M.; Jungho, I.; Jensen, J.
2012-01-17
Remote sensing technology can provide a cost-effective tool for monitoring hazardous waste sites. This study investigated the usability of HyMap airborne hyperspectral remote sensing data (126 bands at 2.3 x 2.3 m spatial resolution) to characterize the vegetation at U.S. Department of Energy uranium processing sites near Monticello, Utah and Monument Valley, Arizona. Grass and shrub species were mixed on an engineered disposal cell cover at the Monticello site while shrub species were dominant in the phytoremediation plantings at the Monument Valley site. The specific objectives of this study were to: (1) estimate leaf-area-index (LAI) of the vegetation using threemore » different methods (i.e., vegetation indices, red-edge positioning (REP), and machine learning regression trees), and (2) map the vegetation cover using machine learning decision trees based on either the scaled reflectance data or mixture tuned matched filtering (MTMF)-derived metrics and vegetation indices. Regression trees resulted in the best calibration performance of LAI estimation (R{sup 2} > 0.80). The use of REPs failed to accurately predict LAI (R{sup 2} < 0.2). The use of the MTMF-derived metrics (matched filter scores and infeasibility) and a range of vegetation indices in decision trees improved the vegetation mapping when compared to the decision tree classification using just the scaled reflectance. Results suggest that hyperspectral imagery are useful for characterizing biophysical characteristics (LAI) and vegetation cover on capped hazardous waste sites. However, it is believed that the vegetation mapping would benefit from the use of 1 higher spatial resolution hyperspectral data due to the small size of many of the vegetation patches (< 1m) found on the sites.« less
NASA's Planetary Defense Coordination Office at NASA HQ
NASA Astrophysics Data System (ADS)
Daou, D.; Johnson, L.; Fast, K. E.; Landis, R.; Friedensen, V. P.; Kelley, M.
2017-09-01
NASA and its partners maintain a watch for near-Earth objects (NEOs), asteroids and comets that pass close to the Earth, as part of an ongoing effort to discover, catalog, and characterize these bodies. The PDCO is responsible for: • Ensuring the early detection of potentially hazardous objects (PHOs) - asteroids and comets whose orbit are predicted to bring them within 0.05 Astronomical Units of Earth; and of a size large enough to reach Earth's surface - that is, greater than perhaps 30 to 50 meters; • Tracking and characterizing PHOs and issuing warnings about potential impacts; • Providing timely and accurate communications about PHOs; and • Performing as a lead coordination node in U.S. Government planning for response to an actual impact threat. The PDCO collaborates with other U.S. Government agencies, other national and international agencies, and professional and amateur astronomers around the world. The PDCO also is responsible for facilitating communications between the science community and the public should any potentially hazardous NEO be discovered. In addition, the PDCO works closely with the United Nations Office of Outer Space Affairs, its Committee on the Peaceful Uses of Outer Space, and its Action Team on Near Earth Objects (also known as Action Team 14). The PDCO is a leading member of the International Asteroid Warning Network (IAWN) and the Space Missions Planning Advisory Group (SMPAG), multinational endeavors recommended by the United Nations for an international response to the NEO impact hazard and established and operated by the spacecapable nations. The PDCO also communicates with the scientific community through channels such as NASA's Small Bodies Assessment Group (SBAG). In this talk, we will provide an update to the office's various efforts and new opportunities for partnerships in the continuous international effort for Planetary Defense.
Could the collapse of a massive speleothem be the record of a large paleoearthquake?
NASA Astrophysics Data System (ADS)
Valentini, Alessandro; Pace, Bruno; Vasta, Marcello; Ferranti, Luigi; Colella, Abner; Vassallo, Maurizio
2016-04-01
Earthquake forecast and seismic hazard models are generally based on historical and instrumental seismicity. However, in regions characterized by moderate strain rates and by strong earthquakes with recurrence longer than the time span covered by historical catalogues, different approaches are desirable to provide an independent test of seismologically-based models. We used non-conventional methods, such as the so-called "Fragile Geological Features", and in particular cave speleothems, for assessing and improving existing paleoseismological databases and seismic hazard models. In this work we present a detailed study of a massive speleothem found collapsed in the Cola Cave (Abruzzo region, Central Apennines, Italy) that could be considered the record of a large paleoearthquake. Radiometric dating and geotechnical measurements are carried out to characterize the collapse time and the mechanical properties of speleothem. We performed theoretical and numerical modelling in order to estimate the values of the horizontal ground acceleration required to failure the speleothems. In particular we used a finite element method (FEM), with the SAP200 software, starting from the detailed geometry of the speleothem and its mechanical properties. We used several individual seismogenic source geometries and four different ground motion prediction equations to calculate the possible response spectra. We carried out also a seismic noise survey to understand and quantify any ground motion amplification phenomenon. The results suggest two faults located in the Fucino area as the most probable causative sources of the cave speleothem collapses, recorded ~4-5 ka ago, with a Mw=6.8 ± 0.2. Our approach contributes to assess the existence of past earthquakes integrating the classical paleoseismological trenches techniques, and to attribute the retrieved event to geometrically-defined individual seismogenic sources, which represents a key contribution to improve fault-based seismic hazard models.
NASA's Planetary Defense Coordination Office at NASA HQ
NASA Astrophysics Data System (ADS)
Daou, D.; Johnson, L.; Fast, K. E.; Landis, R.; Friedensen, V. P.; Kelley, M.
2017-12-01
NASA and its partners maintain a watch for near-Earth objects (NEOs), asteroids and comets that pass close to the Earth, as part of an ongoing effort to discover, catalog, and characterize these bodies. The PDCO is responsible for: Ensuring the early detection of potentially hazardous objects (PHOs) - asteroids and comets whose orbit are predicted to bring them within 0.05 Astronomical Units of Earth; and of a size large enough to reach Earth's surface - that is, greater than perhaps 30 to 50 meters; Tracking and characterizing PHOs and issuing warnings about potential impacts; Providing timely and accurate communications about PHOs; and Performing as a lead coordination node in U.S. Government planning for response to an actual impact threat. The PDCO collaborates with other U.S. Government agencies, other national and international agencies, and professional and amateur astronomers around the world. The PDCO also is responsible for facilitating communications between the science community and the public should any potentially hazardous NEO be discovered. In addition, the PDCO works closely with the United Nations Office of Outer Space Affairs, its Committee on the Peaceful Uses of Outer Space, and its Action Team on Near Earth Objects (also known as Action Team 14). The PDCO is a leading member of the International Asteroid Warning Network (IAWN) and the Space Missions Planning Advisory Group (SMPAG), multinational endeavors recommended by the United Nations for an international response to the NEO impact hazard and established and operated by the space-capable nations. The PDCO also communicates with the scientific community through channels such as NASA's Small Bodies Assessment Group (SBAG). In this talk, we will provide an update to the office's various efforts and new opportunities for partnerships in the continuous international effort for Planetary Defense.
Wong, Ivan G.; Stokoe, Kenneth; Cox, Brady R.; Yuan, Jiabei; Knudsen, Keith L.; Terra, Fabia; Okubo, Paul G.; Lin, Yin-Cheng
2011-01-01
To assess the level and nature of ground shaking in Hawaii for the purposes of earthquake hazard mitigation and seismic design, empirical ground-motion prediction models are desired. To develop such empirical relationships, knowledge of the subsurface site conditions beneath strong-motion stations is critical. Thus, as a first step to develop ground-motion prediction models for Hawaii, spectral-analysis-of-surface-waves (SASW) profiling was performed at the 22 free-field U.S. Geological Survey (USGS) strong-motion sites on the Big Island to obtain shear-wave velocity (VS) data. Nineteen of these stations recorded the 2006 Kiholo Bay moment magnitude (M) 6.7 earthquake, and 17 stations recorded the triggered M 6.0 Mahukona earthquake. VS profiling was performed to reach depths of more than 100 ft. Most of the USGS stations are situated on sites underlain by basalt, based on surficial geologic maps. However, the sites have varying degrees of weathering and soil development. The remaining strong-motion stations are located on alluvium or volcanic ash. VS30 (average VS in the top 30 m) values for the stations on basalt ranged from 906 to 1908 ft/s [National Earthquake Hazards Reduction Program (NEHRP) site classes C and D], because most sites were covered with soil of variable thickness. Based on these data, an NEHRP site-class map was developed for the Big Island. These new VS data will be a significant input into an update of the USGS statewide hazard maps and to the operation of ShakeMap on the island of Hawaii.
Time prediction of failure a type of lamps by using general composite hazard rate model
NASA Astrophysics Data System (ADS)
Riaman; Lesmana, E.; Subartini, B.; Supian, S.
2018-03-01
This paper discusses the basic survival model estimates to obtain the average predictive value of lamp failure time. This estimate is for the parametric model, General Composite Hazard Level Model. The random time variable model used is the exponential distribution model, as the basis, which has a constant hazard function. In this case, we discuss an example of survival model estimation for a composite hazard function, using an exponential model as its basis. To estimate this model is done by estimating model parameters, through the construction of survival function and empirical cumulative function. The model obtained, will then be used to predict the average failure time of the model, for the type of lamp. By grouping the data into several intervals and the average value of failure at each interval, then calculate the average failure time of a model based on each interval, the p value obtained from the tes result is 0.3296.
Building resilience to weather-related hazards through better preparedness
NASA Astrophysics Data System (ADS)
Keller, Julia; Golding, Brian; Johnston, David; Ruti, Paolo
2017-04-01
Recent developments in weather forecasting have transformed our ability to predict weather-related hazards, while mobile communication is radically changing the way that people receive information. At the same time, vulnerability to weather-related hazards is growing through urban expansion, population growth and climate change. This talk will address issues facing the science community in responding to the Sendai Framework objective to "substantially increase the availability of and access to multi-hazard early warning systems" in the context of weather-related hazards. It will also provide an overview of activities and approaches developed in the World Meteorological Organisation's High Impact Weather (HIWeather) project. HIWeather has identified and is promoting research in key multi-disciplinary gaps in our knowledge, including in basic meteorology, risk prediction, communication and decision making, that affect our ability to provide effective warnings. The results will be pulled together in demonstration projects that will both showcase leading edge capability and build developing country capacity.
Natural hazard modeling and uncertainty analysis [Chapter 2
Matthew Thompson; Jord J. Warmink
2017-01-01
Modeling can play a critical role in assessing and mitigating risks posed by natural hazards. These modeling efforts generally aim to characterize the occurrence, intensity, and potential consequences of natural hazards. Uncertainties surrounding the modeling process can have important implications for the development, application, evaluation, and interpretation of...
Multivariate Models for Prediction of Human Skin Sensitization Hazard
Strickland, Judy; Zang, Qingda; Paris, Michael; Lehmann, David M.; Allen, David; Choksi, Neepa; Matheson, Joanna; Jacobs, Abigail; Casey, Warren; Kleinstreuer, Nicole
2016-01-01
One of ICCVAM’s top priorities is the development and evaluation of non-animal approaches to identify potential skin sensitizers. The complexity of biological events necessary to produce skin sensitization suggests that no single alternative method will replace the currently accepted animal tests. ICCVAM is evaluating an integrated approach to testing and assessment based on the adverse outcome pathway for skin sensitization that uses machine learning approaches to predict human skin sensitization hazard. We combined data from three in chemico or in vitro assays—the direct peptide reactivity assay (DPRA), human cell line activation test (h-CLAT), and KeratinoSens™ assay—six physicochemical properties, and an in silico read-across prediction of skin sensitization hazard into 12 variable groups. The variable groups were evaluated using two machine learning approaches, logistic regression (LR) and support vector machine (SVM), to predict human skin sensitization hazard. Models were trained on 72 substances and tested on an external set of 24 substances. The six models (three LR and three SVM) with the highest accuracy (92%) used: (1) DPRA, h-CLAT, and read-across; (2) DPRA, h-CLAT, read-across, and KeratinoSens; or (3) DPRA, h-CLAT, read-across, KeratinoSens, and log P. The models performed better at predicting human skin sensitization hazard than the murine local lymph node assay (accuracy = 88%), any of the alternative methods alone (accuracy = 63–79%), or test batteries combining data from the individual methods (accuracy = 75%). These results suggest that computational methods are promising tools to effectively identify potential human skin sensitizers without animal testing. PMID:27480324
Hazard function theory for nonstationary natural hazards
NASA Astrophysics Data System (ADS)
Read, L.; Vogel, R. M.
2015-12-01
Studies from the natural hazards literature indicate that many natural processes, including wind speeds, landslides, wildfires, precipitation, streamflow and earthquakes, show evidence of nonstationary behavior such as trends in magnitudes through time. Traditional probabilistic analysis of natural hazards based on partial duration series (PDS) generally assumes stationarity in the magnitudes and arrivals of events, i.e. that the probability of exceedance is constant through time. Given evidence of trends and the consequent expected growth in devastating impacts from natural hazards across the world, new methods are needed to characterize their probabilistic behavior. The field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (x) with its failure time series (t), enabling computation of corresponding average return periods and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose PDS magnitudes are assumed to follow the widely applied Poisson-GP model. We derive a 2-parameter Generalized Pareto hazard model and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard event series x, with corresponding failure time series t, should have application to a wide class of natural hazards.
Dynamic wake prediction and visualization with uncertainty analysis
NASA Technical Reports Server (NTRS)
Holforty, Wendy L. (Inventor); Powell, J. David (Inventor)
2005-01-01
A dynamic wake avoidance system utilizes aircraft and atmospheric parameters readily available in flight to model and predict airborne wake vortices in real time. A novel combination of algorithms allows for a relatively simple yet robust wake model to be constructed based on information extracted from a broadcast. The system predicts the location and movement of the wake based on the nominal wake model and correspondingly performs an uncertainty analysis on the wake model to determine a wake hazard zone (no fly zone), which comprises a plurality of wake planes, each moving independently from another. The system selectively adjusts dimensions of each wake plane to minimize spatial and temporal uncertainty, thereby ensuring that the actual wake is within the wake hazard zone. The predicted wake hazard zone is communicated in real time directly to a user via a realistic visual representation. In an example, the wake hazard zone is visualized on a 3-D flight deck display to enable a pilot to visualize or see a neighboring aircraft as well as its wake. The system substantially enhances the pilot's situational awareness and allows for a further safe decrease in spacing, which could alleviate airport and airspace congestion.
Correlates of household seismic hazard adjustment adoption.
Lindell, M K; Whitney, D J
2000-02-01
This study examined the relationships of self-reported adoption of 12 seismic hazard adjustments (pre-impact actions to reduce danger to persons and property) with respondents' demographic characteristics, perceived risk, perceived hazard knowledge, perceived protection responsibility, and perceived attributes of the hazard adjustments. Consistent with theoretical predictions, perceived attributes of the hazard adjustments differentiated among the adjustments and had stronger correlations with adoption than any of the other predictors. These results identify the adjustments and attributes that emergency managers should address to have the greatest impact on improving household adjustment to earthquake hazard.
Ferrari, Benoît; Paxéus, Nicklas; Lo Giudice, Roberto; Pollio, Antonino; Garric, Jeanne
2003-07-01
In four countries (France, Greece, Italy, and Sweden) occurrence in sewage treatment plant (STP) effluents and ecotoxicity of the pharmaceuticals carbamazepine, clofibric acid, and diclofenac were investigated. Bioassays were performed on bacteria, algae, microcrustaceans, and fishes in order to calculate their predicted no-effect concentrations (PNEC) and to perform a first approach of risk characterization. For this aim, risk has been estimated by the predicted environmental concentration/PNEC ratio and the measured environmental concentration/PNEC ratio. First, regarding the PNEC, carbamazepine appears to be the more hazardous compound. Second, even though it is demonstrated that carbamazepine, clofibric acid, and diclofenac have been detected in effluents, only carbamazepine have been detected in all sewage treatment plants with the greatest concentrations. Third, risk quotients greater than unity were calculated only for carbamazepine, suggesting that risk for the water compartment is expected.
Radiation protection using Martian surface materials in human exploration of Mars
NASA Technical Reports Server (NTRS)
Kim, M. H.; Thibeault, S. A.; Wilson, J. W.; Heilbronn, L.; Kiefer, R. L.; Weakley, J. A.; Dueber, J. L.; Fogarty, T.; Wilkins, R.
2001-01-01
To develop materials for shielding astronauts from the hazards of GCR, natural Martian surface materials are considered for their potential as radiation shielding for manned Mars missions. The modified radiation fluences behind various kinds of Martian rocks and regolith are determined by solving the Boltzmann equation using NASA Langley's HZETRN code along with the 1977 Solar Minimum galactic cosmic ray environmental model. To develop structural shielding composite materials for Martian surface habitats, theoretical predictions of the shielding properties of Martian regolith/polyimide composites has been computed to assess their shielding effectiveness. Adding high-performance polymer binders to Martian regolith to enhance structural properties also enhances the shielding properties of these composites because of the added hydrogenous constituents. Heavy ion beam testing of regolith simulant/polyimide composites is planned to validate this prediction. Characterization and proton beam tests are performed to measure structural properties and to compare the shielding effects on microelectronic devices, respectively.
Ground-motion prediction from tremor
Baltay, Annemarie S.; Beroza, Gregory C.
2013-01-01
The widespread occurrence of tremor, coupled with its frequency content and location, provides an exceptional opportunity to test and improve strong ground-motion attenuation relations for subduction zones. We characterize the amplitude of thousands of individual 5 min tremor events in Cascadia during three episodic tremor and slip events to constrain the distance decay of peak ground acceleration (PGA) and peak ground velocity (PGV). We determine the anelastic attenuation parameter for ground-motion prediction equations (GMPEs) to a distance of 150 km, which is sufficient to place important constraints on ground-motion decay. Tremor PGA and PGV show a distance decay that is similar to subduction-zone-specific GMPEs developed from both data and simulations; however, the massive amount of data present in the tremor observations should allow us to refine distance-amplitude attenuation relationships for use in hazard maps, and to search for regional variations and intrasubduction zone differences in ground-motion attenuation.
Solar cosmic ray hazard to interplanetary and earth-orbital space travel
NASA Technical Reports Server (NTRS)
Yucker, W. R.
1972-01-01
A statistical treatment of the radiation hazards to astronauts due to solar cosmic ray protons is reported to determine shielding requirements for solar proton events. More recent data are incorporated into the present analysis in order to improve the accuracy of the predicted mission fluence and dose. The effects of the finite data sample are discussed. Mission fluence and dose versus shield thickness data are presented for mission lengths up to 3 years during periods of maximum and minimum solar activity; these correspond to various levels of confidence that the predicted hazard will not be exceeded.
Polk, William W; Sharma, Monita; Sayes, Christie M; Hotchkiss, Jon A; Clippinger, Amy J
2016-04-23
Aerosol generation and characterization are critical components in the assessment of the inhalation hazards of engineered nanomaterials (NMs). An extensive review was conducted on aerosol generation and exposure apparatus as part of an international expert workshop convened to discuss the design of an in vitro testing strategy to assess pulmonary toxicity following exposure to aerosolized particles. More specifically, this workshop focused on the design of an in vitro method to predict the development of pulmonary fibrosis in humans following exposure to multi-walled carbon nanotubes (MWCNTs). Aerosol generators, for dry or liquid particle suspension aerosolization, and exposure chambers, including both commercially available systems and those developed by independent researchers, were evaluated. Additionally, characterization methods that can be used and the time points at which characterization can be conducted in order to interpret in vitro exposure results were assessed. Summarized below is the information presented and discussed regarding the relevance of various aerosol generation and characterization techniques specific to aerosolized MWCNTs exposed to cells cultured at the air-liquid interface (ALI). The generation of MWCNT aerosols relevant to human exposures and their characterization throughout exposure in an ALI system is critical for extrapolation of in vitro results to toxicological outcomes in humans.
Diffraction and Dissipation of Atmospheric Waves in the Vicinity of Caustics
NASA Astrophysics Data System (ADS)
Godin, O. A.
2015-12-01
A large and increasing number of ground-based and satellite-borne instruments has been demonstrated to reliably reveal ionospheric manifestations of natural hazards such as large earthquakes, strong tsunamis, and powerful tornadoes. To transition from detection of ionospheric manifestations of natural hazards to characterization of the hazards for the purposes of improving early warning systems and contributing to disaster recovery, it is necessary to relate quantitatively characteristics of the observed ionospheric disturbances and the underlying natural hazard and, in particular, accurately model propagation of atmospheric waves from the ground or ocean surface to the ionosphere. The ray theory has been used extensively to model propagation of atmospheric waves and proved to be very efficient in elucidating the effects of atmospheric variability on ionospheric signatures of natural hazards. However, the ray theory predicts unphysical, divergent values of the wave amplitude and needs to be modified in the vicinity of caustics. This paper presents an asymptotic theory that describes diffraction, focusing and increased dissipation of acoustic-gravity waves in the vicinity of caustics and turning points. Air temperature, viscosity, thermal conductivity, and wind velocity are assumed to vary gradually with height and horizontal coordinates, and slowness of these variations determines the large parameter of the problem. Uniform asymptotics of the wave field are expressed in terms of Airy functions and their derivatives. The geometrical, or Berry, phase, which arises in the consistent WKB approximation for acoustic-gravity waves, plays an important role in the caustic asymptotics. In addition to the wave field in the vicinity of the caustic, these asymptotics describe wave reflection from the caustic and the evanescent wave field beyond the caustic. The evanescent wave field is found to play an important role in ionospheric manifestations of tsunamis.
The NHERI RAPID Facility: Enabling the Next-Generation of Natural Hazards Reconnaissance
NASA Astrophysics Data System (ADS)
Wartman, J.; Berman, J.; Olsen, M. J.; Irish, J. L.; Miles, S.; Gurley, K.; Lowes, L.; Bostrom, A.
2017-12-01
The NHERI post-disaster, rapid response research (or "RAPID") facility, headquartered at the University of Washington (UW), is a collaboration between UW, Oregon State University, Virginia Tech, and the University of Florida. The RAPID facility will enable natural hazard researchers to conduct next-generation quick response research through reliable acquisition and community sharing of high-quality, post-disaster data sets that will enable characterization of civil infrastructure performance under natural hazard loads, evaluation of the effectiveness of current and previous design methodologies, understanding of socio-economic dynamics, calibration of computational models used to predict civil infrastructure component and system response, and development of solutions for resilient communities. The facility will provide investigators with the hardware, software and support services needed to collect, process and assess perishable interdisciplinary data following extreme natural hazard events. Support to the natural hazards research community will be provided through training and educational activities, field deployment services, and by promoting public engagement with science and engineering. Specifically, the RAPID facility is undertaking the following strategic activities: (1) acquiring, maintaining, and operating state-of-the-art data collection equipment; (2) developing and supporting mobile applications to support interdisciplinary field reconnaissance; (3) providing advisory services and basic logistics support for research missions; (4) facilitating the systematic archiving, processing and visualization of acquired data in DesignSafe-CI; (5) training a broad user base through workshops and other activities; and (6) engaging the public through citizen science, as well as through community outreach and education. The facility commenced operations in September 2016 and will begin field deployments beginning in September 2018. This poster will provide an overview of the vision for the RAPID facility, the equipment that will be available for use, the facility's operations, and opportunities for user training and facility use.
NASA Astrophysics Data System (ADS)
Fan, Linfeng; Lehmann, Peter; McArdell, Brian; Or, Dani
2017-03-01
Debris flows and landslides induced by heavy rainfall represent an ubiquitous and destructive natural hazard in steep mountainous regions. For debris flows initiated by shallow landslides, the prediction of the resulting pathways and associated hazard is often hindered by uncertainty in determining initiation locations, volumes and mechanical state of the mobilized debris (and by model parameterization). We propose a framework for linking a simplified physically-based debris flow runout model with a novel Landslide Hydro-mechanical Triggering (LHT) model to obtain a coupled landslide-debris flow susceptibility and hazard assessment. We first compared the simplified debris flow model of Perla (1980) with a state-of-the art continuum-based model (RAMMS) and with an empirical model of Rickenmann (1999) at the catchment scale. The results indicate that predicted runout distances by the Perla model are in reasonable agreement with inventory measurements and with the other models. Predictions of localized shallow landslides by LHT model provides information on water content of released mass. To incorporate effects of water content and flow viscosity as provided by LHT on debris flow runout, we adapted the Perla model. The proposed integral link between landslide triggering susceptibility quantified by LHT and subsequent debris flow runout hazard calculation using the adapted Perla model provides a spatially and temporally resolved framework for real-time hazard assessment at the catchment scale or along critical infrastructure (roads, railroad lines).
Help in making fuel management decisions.
Peter J. Roussopoulos; Von J. Johnson
1975-01-01
Describes how to compare predictions of fuel hazard for Northeastern logging slash with a number of fuel hazard "standards." This system provides objective criteria for making fuel management decisions.
Evaluating MoE and its Uncertainty and Variability for Food Contaminants (EuroTox presentation)
Margin of Exposure (MoE), is a metric for quantifying the relationship between exposure and hazard. Ideally, it is the ratio of the dose associated with hazard and an estimate of exposure. For example, hazard may be characterized by a benchmark dose (BMD), and, for food contami...
Multivariate Models for Prediction of Human Skin Sensitization ...
One of the lnteragency Coordinating Committee on the Validation of Alternative Method's (ICCVAM) top priorities is the development and evaluation of non-animal approaches to identify potential skin sensitizers. The complexity of biological events necessary to produce skin sensitization suggests that no single alternative method will replace the currently accepted animal tests. ICCVAM is evaluating an integrated approach to testing and assessment based on the adverse outcome pathway for skin sensitization that uses machine learning approaches to predict human skin sensitization hazard. We combined data from three in chemico or in vitro assays - the direct peptide reactivity assay (DPRA), human cell line activation test (h-CLAT) and KeratinoSens TM assay - six physicochemical properties and an in silico read-across prediction of skin sensitization hazard into 12 variable groups. The variable groups were evaluated using two machine learning approaches , logistic regression and support vector machine, to predict human skin sensitization hazard. Models were trained on 72 substances and tested on an external set of 24 substances. The six models (three logistic regression and three support vector machine) with the highest accuracy (92%) used: (1) DPRA, h-CLAT and read-across; (2) DPRA, h-CLAT, read-across and KeratinoSens; or (3) DPRA, h-CLAT, read-across, KeratinoSens and log P. The models performed better at predicting human skin sensitization hazard than the murine
How well should probabilistic seismic hazard maps work?
NASA Astrophysics Data System (ADS)
Vanneste, K.; Stein, S.; Camelbeeck, T.; Vleminckx, B.
2016-12-01
Recent large earthquakes that gave rise to shaking much stronger than shown in earthquake hazard maps have stimulated discussion about how well these maps forecast future shaking. These discussions have brought home the fact that although the maps are designed to achieve certain goals, we know little about how well they actually perform. As for any other forecast, this question involves verification and validation. Verification involves assessing how well the algorithm used to produce hazard maps implements the conceptual PSHA model ("have we built the model right?"). Validation asks how well the model forecasts the shaking that actually occurs ("have we built the right model?"). We explore the verification issue by simulating the shaking history of an area with assumed distribution of earthquakes, frequency-magnitude relation, temporal occurrence model, and ground-motion prediction equation. We compare the "observed" shaking at many sites over time to that predicted by a hazard map generated for the same set of parameters. PSHA predicts that the fraction of sites at which shaking will exceed that mapped is p = 1 - exp(t/T), where t is the duration of observations and T is the map's return period. This implies that shaking in large earthquakes is typically greater than shown on hazard maps, as has occurred in a number of cases. A large number of simulated earthquake histories yield distributions of shaking consistent with this forecast, with a scatter about this value that decreases as t/T increases. The median results are somewhat lower than predicted for small values of t/T and approach the predicted value for larger values of t/T. Hence, the algorithm appears to be internally consistent and can be regarded as verified for this set of simulations. Validation is more complicated because a real observed earthquake history can yield a fractional exceedance significantly higher or lower than that predicted while still being consistent with the hazard map in question. As a result, given that in the real world we have only a single sample, it is hard to assess whether a misfit between a map and observations arises by chance or reflects a biased map.
Fujii, Hideki; Nishimoto, Naoki; Yamaguchi, Seiko; Kurai, Osamu; Miyano, Masato; Ueda, Wataru; Oba, Hiroko; Aoki, Tetsuya; Kawada, Norifumi; Okawa, Kiyotaka
2016-05-10
It is important to screen for alcohol consumption and drinking customs in a standardized manner. The aim of this study was 1) to investigate whether the AUDIT score is useful for predicting hazardous drinking using optimal cutoff scores and 2) to use multivariate analysis to evaluate whether the AUDIT score was more useful than pre-existing laboratory tests for predicting hazardous drinking. A cross-sectional study using the Alcohol Use Disorders Identification Test (AUDIT) was conducted in 334 outpatients who consulted our internal medicine department. The patients completed self-reported questionnaires and underwent a diagnostic interview, physical examination, and laboratory testing. Forty (23 %) male patients reported daily alcohol consumption ≥ 40 g, and 16 (10 %) female patients reported consumption ≥ 20 g. The optimal cutoff values of hazardous drinking were calculated using a 10-fold cross validation, resulting in an optimal AUDIT score cutoff of 8.2, with a sensitivity of 95.5 %, specificity of 87.0 %, false positive rate of 13.0 %, false negative rate of 4.5 %, and area under the receiver operating characteristic curve of 0.97. Multivariate analysis revealed that the most popular short version of the AUDIT consisting solely of its three consumption items (AUDIT-C) and patient sex were significantly associated with hazardous drinking. The aspartate transaminase (AST)/alanine transaminase (ALT) ratio and mean corpuscular volume (MCV) were weakly significant. This study showed that the AUDIT score and particularly the AUDIT-C score were more useful than the AST/ALT ratio and MCV for predicting hazardous drinking.
Landslide hazard assessment: recent trends and techniques.
Pardeshi, Sudhakar D; Autade, Sumant E; Pardeshi, Suchitra S
2013-01-01
Landslide hazard assessment is an important step towards landslide hazard and risk management. There are several methods of Landslide Hazard Zonation (LHZ) viz. heuristic, semi quantitative, quantitative, probabilistic and multi-criteria decision making process. However, no one method is accepted universally for effective assessment of landslide hazards. In recent years, several attempts have been made to apply different methods of LHZ and to compare results in order to find the best suited model. This paper presents the review of researches on landslide hazard mapping published in recent years. The advanced multivariate techniques are proved to be effective in spatial prediction of landslides with high degree of accuracy. Physical process based models also perform well in LHZ mapping even in the areas with poor database. Multi-criteria decision making approach also play significant role in determining relative importance of landslide causative factors in slope instability process. Remote Sensing and Geographical Information System (GIS) are powerful tools to assess landslide hazards and are being used extensively in landslide researches since last decade. Aerial photographs and high resolution satellite data are useful in detection, mapping and monitoring landslide processes. GIS based LHZ models helps not only to map and monitor landslides but also to predict future slope failures. The advancements in Geo-spatial technologies have opened the doors for detailed and accurate assessment of landslide hazards.
Chen, Guangchao; Peijnenburg, Willie; Xiao, Yinlong; Vijver, Martina G
2017-07-12
As listed by the European Chemicals Agency, the three elements in evaluating the hazards of engineered nanomaterials (ENMs) include the integration and evaluation of toxicity data, categorization and labeling of ENMs, and derivation of hazard threshold levels for human health and the environment. Assessing the hazards of ENMs solely based on laboratory tests is time-consuming, resource intensive, and constrained by ethical considerations. The adoption of computational toxicology into this task has recently become a priority. Alternative approaches such as (quantitative) structure-activity relationships ((Q)SAR) and read-across are of significant help in predicting nanotoxicity and filling data gaps, and in classifying the hazards of ENMs to individual species. Thereupon, the species sensitivity distribution (SSD) approach is able to serve the establishment of ENM hazard thresholds sufficiently protecting the ecosystem. This article critically reviews the current knowledge on the development of in silico models in predicting and classifying the hazard of metallic ENMs, and the development of SSDs for metallic ENMs. Further discussion includes the significance of well-curated experimental datasets and the interpretation of toxicity mechanisms of metallic ENMs based on reported models. An outlook is also given on future directions of research in this frontier.
Debris flow hazards mitigation--Mechanics, prediction, and assessment
Chen, C.-L.; Major, J.J.
2007-01-01
These proceedings contain papers presented at the Fourth International Conference on Debris-Flow Hazards Mitigation: Mechanics, Prediction, and Assessment held in Chengdu, China, September 10-13, 2007. The papers cover a wide range of topics on debris-flow science and engineering, including the factors triggering debris flows, geomorphic effects, mechanics of debris flows (e.g., rheology, fluvial mechanisms, erosion and deposition processes), numerical modeling, various debris-flow experiments, landslide-induced debris flows, assessment of debris-flow hazards and risk, field observations and measurements, monitoring and alert systems, structural and non-structural countermeasures against debris-flow hazards and case studies. The papers reflect the latest devel-opments and advances in debris-flow research. Several studies discuss the development and appli-cation of Geographic Information System (GIS) and Remote Sensing (RS) technologies in debris-flow hazard/risk assessment. Timely topics presented in a few papers also include the development of new or innovative techniques for debris-flow monitoring and alert systems, especially an infra-sound acoustic sensor for detecting debris flows. Many case studies illustrate a wide variety of debris-flow hazards and related phenomena as well as their hazardous effects on human activities and settlements.
NASA Technical Reports Server (NTRS)
Atwater, Terrill
1993-01-01
Predicting the capacity remaining in used high-rate, high-energy batteries provides important information to the user. Knowledge of the capacity remaining in used batteries results in better utilization, which translates into improved readiness and cost savings through complete, efficient use. High-rate batteries, due to their chemical nature, are highly sensitive to misuse (i.e., over-discharge or very high rate discharge). Battery failure due to misuse or manufacturing defects could be disastrous. Since high-rate, high-energy batteries are expensive and energetic, a reliable method of predicting both failures and remaining energy has been actively sought. Due to concerns over safety, the behavior of lithium/sulphur dioxide cells at different temperatures and current drains was examined. The main thrust of this effort was to determine failure conditions for incorporation in hazard anticipation circuitry. In addition, capacity prediction formulas have been developed from test data. A process that performs continuous, real-time hazard anticipation and capacity prediction was developed. The introduction of this process into microchip technology will enable the production of reliable, safe, and efficient high-energy batteries.
Hazard rating forest stands for gypsy moth
Hicks, Ray R., Jr.
1991-01-01
A gypsy moth hazard exists when forest conditions prevail that are conducive to extensive damage from gypsy moth. Combining forest hazard rating with information on insect population trends provides the basis for predicting the probability (risk) of an event occurring. The likelihood of defoliation is termed susceptibility and the probability of damage (mortality,...
Choudhary, Gaurav; Jankowich, Matthew; Wu, Wen-Chih
2014-07-01
Although elevated pulmonary artery systolic pressure (PASP) is associated with heart failure (HF), whether PASP measurement can help predict future HF admissions is not known, especially in African Americans, who are at increased risk for HF. We hypothesized that elevated PASP is associated with increased risk of HF admission and improves HF prediction in an African American population. We conducted a longitudinal analysis using the Jackson Heart Study cohort (n=3125; 32.2% men) with baseline echocardiography-derived PASP and follow-up for HF admissions. The hazard ratio for HF admission was estimated using a Cox proportional hazards model adjusted for variables in the Atherosclerosis Risk in Communities (ARIC) HF prediction model. During a median follow-up of 3.46 years, 3.42% of the cohort was admitted for HF. Subjects with HF had a higher PASP (35.6±11.4 versus 27.6±6.9 mm Hg; P<0.001). The hazard of HF admission increased with higher baseline PASP (adjusted hazard ratio per 10 mm Hg increase in PASP: 2.03; 95% confidence interval, 1.67-2.48; adjusted hazard ratio for highest [≥33 mm Hg] versus lowest quartile [<24 mm Hg] of PASP: 2.69; 95% confidence interval, 1.43-5.06) and remained significant irrespective of history of HF or preserved/reduced ejection fraction. Addition of PASP to the ARIC model resulted in a significant improvement in model discrimination (area under the curve=0.82 before versus 0.84 after; P=0.03) and improved net reclassification index (11-15%) using PASP as a continuous or dichotomous (cutoff=33 mm Hg) variable. Elevated PASP predicts HF admissions in African Americans and may aid in early identification of at-risk subjects for aggressive risk factor modification. © 2014 American Heart Association, Inc.
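A hedged sketch (not the study's code) of the survival-analysis step described above, fitting a Cox proportional hazards model with the lifelines library to synthetic data in which the hazard of HF admission rises with PASP; all names and parameter values are assumptions:

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 200
pasp10 = rng.normal(3.0, 0.7, n)                 # PASP in units of 10 mm Hg
hazard = 0.05 * np.exp(0.7 * (pasp10 - 3.0))     # higher PASP -> higher hazard (assumed)
time_to_hf = rng.exponential(1 / hazard)         # latent time to HF admission, years
follow_up = np.minimum(time_to_hf, 3.46)         # censor at the median follow-up
event = (time_to_hf <= 3.46).astype(int)

df = pd.DataFrame({"T": follow_up, "E": event, "pasp_per10": pasp10})
cph = CoxPHFitter()
cph.fit(df, duration_col="T", event_col="E")
cph.print_summary()     # exp(coef) estimates the hazard ratio per 10 mm Hg of PASP
```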
Hazard function theory for nonstationary natural hazards
NASA Astrophysics Data System (ADS)
Read, L. K.; Vogel, R. M.
2015-11-01
Impact from natural hazards is a shared global problem that causes tremendous loss of life and property, economic cost, and damage to the environment. Increasingly, many natural processes show evidence of nonstationary behavior including wind speeds, landslides, wildfires, precipitation, streamflow, sea levels, and earthquakes. Traditional probabilistic analysis of natural hazards based on peaks over threshold (POT) generally assumes stationarity in the magnitudes and arrivals of events, i.e. that the probability of exceedance of some critical event is constant through time. Given increasing evidence of trends in natural hazards, new methods are needed to characterize their probabilistic behavior. The well-developed field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (X) with its failure time series (T), enabling computation of corresponding average return periods, risk and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose POT magnitudes are assumed to follow the widely applied Generalized Pareto (GP) model. We derive the hazard function for this case and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard event series X, with corresponding failure time series T, should have application to a wide class of natural hazards with rich opportunities for future extensions.
Hazard function theory for nonstationary natural hazards
NASA Astrophysics Data System (ADS)
Read, Laura K.; Vogel, Richard M.
2016-04-01
Impact from natural hazards is a shared global problem that causes tremendous loss of life and property, economic cost, and damage to the environment. Increasingly, many natural processes show evidence of nonstationary behavior including wind speeds, landslides, wildfires, precipitation, streamflow, sea levels, and earthquakes. Traditional probabilistic analysis of natural hazards based on peaks over threshold (POT) generally assumes stationarity in the magnitudes and arrivals of events, i.e., that the probability of exceedance of some critical event is constant through time. Given increasing evidence of trends in natural hazards, new methods are needed to characterize their probabilistic behavior. The well-developed field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (X) with its failure time series (T), enabling computation of corresponding average return periods, risk, and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose POT magnitudes are assumed to follow the widely applied generalized Pareto model. We derive the hazard function for this case and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard random variable X with corresponding failure time series T should have application to a wide class of natural hazards with opportunities for future extensions.
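A hedged numerical sketch of the quantities discussed above, assuming a generalized Pareto exceedance model whose scale parameter drifts upward in time (all values invented): the annual exceedance probability of a fixed design magnitude, the resulting reliability, and a truncated estimate of the average return period:

```python
import numpy as np
from scipy.stats import genpareto

design_mag = 50.0                                  # fixed design event magnitude
xi = 0.1                                           # assumed GP shape parameter
scale_t = 10.0 * (1 + 0.01 * np.arange(100))       # scale drifting up, years 1..100

p_t = genpareto.sf(design_mag, c=xi, scale=scale_t)   # annual exceedance probability
reliability = np.cumprod(1 - p_t)                  # P(no exceedance through year t)
survival = np.concatenate(([1.0], reliability[:-1]))  # P(T >= t), t = 1..100
avg_return_period = survival.sum()                 # truncated estimate of E[T]
print(f"100-yr reliability: {reliability[-1]:.3f}, E[T] ~ {avg_return_period:.1f} yr")
```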
A statistical approach to evaluate flood risk at the regional level: an application to Italy
NASA Astrophysics Data System (ADS)
Rossi, Mauro; Marchesini, Ivan; Salvati, Paola; Donnini, Marco; Guzzetti, Fausto; Sterlacchini, Simone; Zazzeri, Marco; Bonazzi, Alessandro; Carlesi, Andrea
2016-04-01
Floods are frequent and widespread in Italy, causing multiple fatalities and extensive damage to public and private structures every year. A pre-requisite for the development of mitigation schemes, including financial instruments such as insurance, is the ability to quantify their costs starting from the estimation of the underlying flood hazard. However, comprehensive and coherent information on flood-prone areas, and estimates of the frequency and intensity of flood events, are often not available at scales appropriate for risk pooling and diversification. In Italy, River Basin Hydrogeological Plans (PAI), prepared by basin administrations, are the basic descriptive, regulatory, technical and operational tools for environmental planning in flood-prone areas. Nevertheless, such plans do not cover the entire Italian territory, having significant gaps along the minor hydrographic network and in ungauged basins. Several process-based modelling approaches have been used by different basin administrations for flood hazard assessment, resulting in an inhomogeneous hazard zonation of the territory. As a result, flood hazard assessments and expected damage estimations across the different Italian basin administrations are not always coherent. To overcome these limitations, we propose a simplified multivariate statistical approach for regional flood hazard zonation coupled with a flood impact model. This modelling approach has been applied in different Italian basin administrations, allowing a preliminary but coherent and comparable estimation of the flood hazard and the relative impact. Model performance is evaluated by comparing the predicted flood-prone areas with the corresponding PAI zonation. The proposed approach will provide standardized information (following the EU Floods Directive specifications) on flood risk at a regional level, which can in turn be more readily applied to assess flood economic impacts. Furthermore, under the assumption of an appropriate statistical characterization of flood risk, the proposed procedure could be applied straightforwardly outside the national borders, particularly in areas with similar geo-environmental settings.
NASA Astrophysics Data System (ADS)
iMOST Team; Harrington, A. D.; Carrier, B. L.; Fernandez-Remolar, D. C.; Fogarty, J.; McCoy, J. T.; Rucker, M. A.; Spry, J. A.; Altieri, F.; Amelin, Y.; Ammannito, E.; Anand, M.; Beaty, D. W.; Benning, L. G.; Bishop, J. L.; Borg, L. E.; Boucher, D.; Brucato, J. R.; Busemann, H.; Campbell, K. A.; Czaja, A. D.; Debaille, V.; Des Marais, D. J.; Dixon, M.; Ehlmann, B. L.; Farmer, J. D.; Glavin, D. P.; Goreva, Y. S.; Grady, M. M.; Hallis, L. J.; Hausrath, E. M.; Herd, C. D. K.; Horgan, B.; Humayun, M.; Kleine, T.; Kleinhenz, J.; Mangold, N.; Mackelprang, R.; Mayhew, L. E.; McCubbin, F. M.; McLennan, S. M.; McSween, H. Y.; Moser, D. E.; Moynier, F.; Mustard, J. F.; Niles, P. B.; Ori, G. G.; Raulin, F.; Rettberg, P.; Schmitz, N.; Sefton-Nash, E.; Sephton, M. A.; Shaheen, R.; Shuster, D. L.; Siljestrom, S.; Smith, C. L.; Steele, A.; Swindle, T. D.; ten Kate, I. L.; Tosca, N. J.; Usui, T.; Van Kranendonk, M. J.; Wadhwa, M.; Weiss, B. P.; Werner, S. C.; Westall, F.; Wheeler, R. M.; Zipfel, J.; Zorzano, M. P.
2018-04-01
Thorough characterization and evaluation of returned martian regolith and airfall samples are critical to understanding the potential health and engineering system hazards during future human exploration.
NASA Astrophysics Data System (ADS)
Scotti, Oona; Peruzza, Laura
2016-04-01
The key questions we ask are: What is the best strategy to fill the gap in knowledge and know-how in Europe when considering faults in seismic hazard assessments? Are field geologists providing the relevant information for seismic hazard assessment? Are seismic hazard analysts interpreting field data appropriately? Is the full range of uncertainties associated with the characterization of faults correctly understood and propagated in the computations? How can fault modellers contribute to a better representation of the long-term behaviour of fault networks in seismic hazard studies? Providing answers to these questions is fundamental in order to reduce the consequences of future earthquakes and improve the reliability of seismic hazard assessments. An informal working group was thus created at a meeting in Paris in November 2014, partly financed by the Institute of Radioprotection and Nuclear Safety, with the aim of motivating exchanges between field geologists, fault modellers, and seismic hazard practitioners. A variety of approaches were presented at the meeting, and a clear gap emerged between some field geologists, who are not necessarily familiar with the methods and needs of probabilistic seismic hazard assessment, and practitioners, who do not necessarily propagate the full uncertainty associated with the characterization of faults. The group thus decided to meet again a year later in Chieti (Italy) to share concepts and ideas through a specific exercise on a test case study. Some solutions emerged, but many problems of seismic source characterization remained, both for people working in the field and for people tackling models of interacting faults. Now, in Vienna, we want to open the group and launch a call for the European community at large to contribute to the discussion. The 2016 EGU session Fault2SHA is motivated by this urgency to increase the number of round tables on this topic and to debate the peculiarities of using faults in seismic hazard assessment in Europe. Europe is dominated by slowly deforming regions where long histories of seismicity are the main source of information for inferring fault behaviour. Geodetic, geomorphological, and paleoseismological studies are welcome complementary data that are slowly filling in the database but are at present insufficient, by themselves, to characterize faults. Moreover, Europe is characterized by complex fault systems (Upper Rhine Graben, Central and Southern Apennines, Corinth, etc.), and the degree of uncertainty in the characterization of faults can be very different from one country to another. This requires developing approaches and concepts adapted to the European context. It is thus the specificity of the European situation that motivates the creation of a predominantly European group where field geologists, fault modellers, and fault-PSHA practitioners may exchange and learn from each other's experience.
LAV@HAZARD: a Web-GIS Framework for Real-Time Forecasting of Lava Flow Hazards
NASA Astrophysics Data System (ADS)
Del Negro, C.; Bilotta, G.; Cappello, A.; Ganci, G.; Herault, A.
2014-12-01
Crucial to lava flow hazard assessment is the development of tools for real-time prediction of flow paths, flow advance rates, and final flow lengths. Accurate prediction of flow paths and advance rates requires not only rapid assessment of eruption conditions (especially effusion rate) but also improved models of lava flow emplacement. Here we present the LAV@HAZARD web-GIS framework, which combines spaceborne remote sensing techniques and numerical simulations for real-time forecasting of lava flow hazards. By using satellite-derived discharge rates to drive a lava flow emplacement model, LAV@HAZARD allows timely definition of parameters and maps essential for hazard assessment, including the propagation time of lava flows and the maximum run-out distance. We take advantage of the flexibility of the HOTSAT thermal monitoring system to process satellite images coming from sensors with different spatial, temporal and spectral resolutions. HOTSAT was designed to ingest infrared satellite data acquired by the MODIS and SEVIRI sensors to output hot spot location, lava thermal flux and discharge rate. We use LAV@HAZARD to merge this output with the MAGFLOW physics-based model to simulate lava flow paths and to update, in a timely manner, flow simulations. Thus, any significant changes in lava discharge rate are included in the predictions. A significant benefit in terms of computational speed was obtained thanks to the parallel implementation of MAGFLOW on graphic processing units (GPUs). All this useful information has been gathered into the LAV@HAZARD platform which, due to the high degree of interactivity, allows generation of easily readable maps and a fast way to explore alternative scenarios. We will describe and demonstrate the operation of this framework using a variety of case studies pertaining to Mt Etna, Sicily. Although this study was conducted on Mt Etna, the approach used is designed to be applicable to other volcanic areas around the world.
Risk Management and Physical Modelling for Mountainous Natural Hazards
NASA Astrophysics Data System (ADS)
Lehning, Michael; Wilhelm, Christian
Population growth and climate change cause rapid changes in mountainous regions, resulting in increased risks of floods, avalanches, debris flows, and other natural hazards. Extreme events (X-events) are of particular concern, since attempts to protect against them result in exponentially growing costs. In this contribution, we suggest an integral risk management approach to dealing with natural hazards that occur in mountainous areas. Using the example of a mountain pass road, which can be protected from the danger of an avalanche by engineering (galleries) and/or organisational (road closure) measures, we show the advantage of an optimal combination of both over the traditional approach, which is to rely solely on engineering structures. Organisational measures become especially important for X-events because engineering structures cannot be designed for those events. However, organisational measures need a reliable and objective forecast of the hazard. Therefore, we further suggest that such forecasts should be developed using physical numerical modelling. We present the status of current approaches to using physical modelling to predict snow cover stability for avalanche warnings and peak runoff from mountain catchments for flood warnings. While detailed physical models can already predict peak runoff reliably, they are only used to support avalanche warnings. With increased process knowledge and computer power, current developments should lead to an enhanced role for detailed physical models in natural mountain hazard prediction.
NASA Astrophysics Data System (ADS)
Koga-Vicente, A.; Friedel, M. J.
2010-12-01
Every year thousands of people are affected by flood and landslide hazards caused by rainstorms. The problem is more serious in tropical developing countries because of the susceptibility that results from the high amount of energy available to form storms, and the high vulnerability due to poor economic and social conditions. Predictive models of hazards are important tools to manage this kind of risk. In this study, a comparison of two different modeling approaches was made for predicting hydrometeorological hazards in 12 cities on the coast of São Paulo, Brazil, from 1994 to 2003. In the first approach, an empirical multiple linear regression (MLR) model was developed and used; the second approach used a type of unsupervised nonlinear artificial neural network called a self-organizing map (SOM). By using twenty-three independent variables of susceptibility (precipitation, soil type, slope, elevation, and regional atmospheric system scale) and vulnerability (distribution and total population, income and educational characteristics, poverty intensity, human development index), binary hazard responses were obtained. Model performance by cross-validation indicated that the respective MLR and SOM model accuracies were about 67% and 80%. Prediction accuracy can be improved by the addition of information, but the SOM approach is preferred because of sparse data and highly nonlinear relations among the independent variables.
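A hedged sketch of the MLR side of the comparison above, thresholding regression output at 0.5 to produce binary hazard responses and scoring by cross-validation; the data are synthetic placeholders, and the SOM side would use a library such as minisom:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import KFold

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 23))        # 23 susceptibility/vulnerability variables
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=120) > 0).astype(int)  # hazard: yes/no

acc = []
for train, test in KFold(n_splits=5, shuffle=True, random_state=0).split(X):
    model = LinearRegression().fit(X[train], y[train])
    pred = (model.predict(X[test]) >= 0.5).astype(int)   # binary hazard response
    acc.append((pred == y[test]).mean())
print(f"MLR cross-validated accuracy: {np.mean(acc):.2f}")
```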
Current Range Safety Capabilities
1994-02-01
weights of up to 10 pounds. (4) Tactical Aircraft Overpressure Signature Prediction. This interactive computer program accurately predicts the...Here the effect might be the loss of an aircraft and/or lives. "MINIMIZING PROCEDURES" are the things you plan to do to prevent the hazard from...occurrence is highly subjective and will dominate the discussion. The guidelines below may be of some help. HAZARD CATEGORY CATASTROPHIC: Death. Loss of
Zhou, Qianqian; Leng, Guoyong; Feng, Leyang
2017-07-13
Understanding historical changes in flood damage and the underlying mechanisms is critical for predicting future changes for better adaptations. In this study, a detailed assessment of flood damage for 1950–1999 is conducted at the state level in the conterminous United States (CONUS). Geospatial datasets on possible influencing factors are then developed by synthesizing natural hazards, population, wealth, cropland and urban area to explore the relations with flood damage. A considerable increase in flood damage in CONUS is recorded for the study period, which is well correlated with hazards. Comparably, runoff-indexed hazards simulated by the Variable Infiltration Capacity (VIC) model can explain a larger portion of flood damage variations than precipitation in 84% of the states. Cropland is identified as an important factor contributing to increased flood damage in the central US, while urban land exhibits positive and negative relations with total flood damage and damage per unit wealth in 20 and 16 states, respectively. Altogether, flood damage in 34 out of 48 investigated states can be predicted at the 90% confidence level. In extreme cases, ~76% of flood damage variations can be explained in some states, highlighting the potential of future flood damage prediction based on climate change and socioeconomic scenarios.
Caraviello, D Z; Weigel, K A; Gianola, D
2004-05-01
Predicted transmitting abilities (PTA) of US Jersey sires for daughter longevity were calculated using a Weibull proportional hazards sire model and compared with predictions from a conventional linear animal model. Culling data from 268,008 Jersey cows with first calving from 1981 to 2000 were used. The proportional hazards model included time-dependent effects of herd-year-season contemporary group and parity by stage of lactation interaction, as well as time-independent effects of sire and age at first calving. Sire variances and parameters of the Weibull distribution were estimated, providing heritability estimates of 4.7% on the log scale and 18.0% on the original scale. The PTA of each sire was expressed as the expected risk of culling relative to daughters of an average sire. Risk ratios (RR) ranged from 0.7 to 1.3, indicating that the risk of culling for daughters of the best sires was 30% lower than for daughters of average sires and nearly 50% lower than for daughters of the poorest sires. Sire PTA from the proportional hazards model were compared with PTA from a linear model similar to that used for routine national genetic evaluation of length of productive life (PL), using cross-validation in independent samples of herds. Models were compared using logistic regression of daughters' stayability to second, third, fourth, or fifth lactation on their sires' PTA values, with alternative approaches for weighting the contribution of each sire. Models were also compared using logistic regression of daughters' stayability to 36, 48, 60, 72, and 84 mo of life. The proportional hazards model generally yielded more accurate predictions according to these criteria, but differences in predictive ability between methods were smaller when using a Kullback-Leibler distance than with other approaches. Results of this study suggest that survival analysis methodology may provide more accurate predictions of genetic merit for longevity than conventional linear models.
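A small numeric illustration of what the risk ratios above mean under a Weibull proportional hazards model; the shape, rate, and sire-effect values below are invented, not estimated from the paper's data:

```python
import numpy as np

rho, lam = 1.3, 0.15            # assumed Weibull shape and rate parameters
t = 2.0                         # evaluate hazards two years after first calving

baseline_hazard = lam * rho * (lam * t) ** (rho - 1)   # average-sire culling hazard
sire_effect = -0.36             # assumed sire deviation on the log-hazard scale
risk_ratio = np.exp(sire_effect)                       # ~0.70, like the best sires above

print(f"risk ratio            = {risk_ratio:.2f}")
print(f"baseline culling rate = {baseline_hazard:.3f} /yr")
print(f"daughters' rate       = {risk_ratio * baseline_hazard:.3f} /yr")
```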
Are seismic hazard assessment errors and earthquake surprises unavoidable?
NASA Astrophysics Data System (ADS)
Kossobokov, Vladimir
2013-04-01
Why do earthquake occurrences bring us so many surprises? The answer seems evident if we review the relationships that are commonly used to assess seismic hazard. The time-span of physically reliable seismic history is yet a small portion of a rupture recurrence cycle at an earthquake-prone site, which makes premature any kind of reliable probabilistic statement about narrowly localized seismic hazard. Moreover, the seismic evidence accumulated to date demonstrates clearly that most of the empirical relations commonly accepted in the early history of instrumental seismology can be proved erroneous when tests of statistical significance are applied. Seismic events, including mega-earthquakes, cluster, displaying behaviors that are far from independent or periodic. Their distribution in space is possibly fractal and definitely far from uniform, even in a single segment of a fault zone. Such a situation contradicts the generally accepted assumptions used for analytically tractable models or computer simulations, and it complicates the design of reliable methodologies for realistic earthquake hazard assessment, as well as the search for and definition of precursory behaviors to be used for forecast/prediction purposes. As a result, the conclusions drawn from such simulations and analyses can mislead to scientifically groundless applications, which is unwise and extremely dangerous in assessing expected societal risks and losses. For example, a systematic comparison of the GSHAP peak ground acceleration estimates with those related to actual strong earthquakes unfortunately discloses the gross inadequacy of this "probabilistic" product, which appears unacceptable for any kind of responsible seismic risk evaluation and knowledgeable disaster prevention. The self-evident shortcomings and failures of GSHAP appeal to all earthquake scientists and engineers for an urgent revision of the global seismic hazard maps from first principles, including the background methodologies involved, such that there becomes: (a) a demonstrated and sufficient justification of hazard assessment protocols; (b) a more complete learning of the actual range of earthquake hazards to local communities and populations; and (c) a more ethically responsible control over how seismic hazard and seismic risk are implemented to protect public safety. It follows that the international project GEM is on the wrong track if it continues to base seismic risk estimates on the standard method of assessing seismic hazard. The situation is not hopeless and could be improved dramatically given the available geological, geomorphologic, seismic, and tectonic evidence and data combined with deterministic pattern recognition methodologies, specifically when intending to predict the predictable, but not the exact size, site, date, and probability of a target event. Understanding the complexity of the non-linear dynamics of hierarchically organized systems of blocks-and-faults has already led to methodologies of neo-deterministic seismic hazard analysis and intermediate-term, middle- to narrow-range earthquake prediction algorithms tested in real-time applications over the last decades. This proves that contemporary science can do a better job in disclosing natural hazards, assessing risks, and delivering such information in advance of extreme catastrophes, which are low-probability events that happen with certainty. Geoscientists must initiate a shift in the community's mindset from pessimistic disbelief to the optimistic challenge of neo-deterministic hazard predictability.
The Framework of a Coastal Hazards Model - A Tool for Predicting the Impact of Severe Storms
Barnard, Patrick L.; O'Reilly, Bill; van Ormondt, Maarten; Elias, Edwin; Ruggiero, Peter; Erikson, Li H.; Hapke, Cheryl; Collins, Brian D.; Guza, Robert T.; Adams, Peter N.; Thomas, Julie
2009-01-01
The U.S. Geological Survey (USGS) Multi-Hazards Demonstration Project in Southern California (Jones and others, 2007) is a five-year project (FY2007-FY2011) integrating multiple USGS research activities with the needs of external partners, such as emergency managers and land-use planners, to produce products and information that can be used to create more disaster-resilient communities. The hazards being evaluated include earthquakes, landslides, floods, tsunamis, wildfires, and coastal hazards. For the Coastal Hazards Task of the Multi-Hazards Demonstration Project in Southern California, the USGS is leading the development of a modeling system for forecasting the impact of winter storms threatening the entire Southern California shoreline from Pt. Conception to the Mexican border. The modeling system, run in real-time or with prescribed scenarios, will incorporate atmospheric information (that is, wind and pressure fields) with a suite of state-of-the-art physical process models (that is, tide, surge, and wave) to enable detailed prediction of currents, wave height, wave runup, and total water levels. Additional research-grade predictions of coastal flooding, inundation, erosion, and cliff failure will also be performed. Initial model testing, performance evaluation, and product development will be focused on a severe winter-storm scenario developed in collaboration with the Winter Storm Working Group of the USGS Multi-Hazards Demonstration Project in Southern California. Additional offline model runs and products will include coastal-hazard hindcasts of selected historical winter storms, as well as additional severe winter-storm simulations based on statistical analyses of historical wave and water-level data. The coastal-hazards model design will also be appropriate for simulating the impact of storms under various sea level rise and climate-change scenarios. The operational capabilities of this modeling system are designed to provide emergency planners with the critical information they need to respond quickly and efficiently and to increase public safety and mitigate damage associated with powerful coastal storms. For instance, high resolution local models will predict detailed wave heights, breaking patterns, and current strengths for use in warning systems for harbor-mouth navigation and densely populated coastal regions where beach safety is threatened. The offline applications are intended to equip coastal managers with the information needed to manage and allocate their resources effectively to protect sections of coast that may be most vulnerable to future severe storms.
Effect and clinical prediction of worsening renal function in acute decompensated heart failure.
Breidthardt, Tobias; Socrates, Thenral; Noveanu, Markus; Klima, Theresia; Heinisch, Corinna; Reichlin, Tobias; Potocki, Mihael; Nowak, Albina; Tschung, Christopher; Arenja, Nisha; Bingisser, Roland; Mueller, Christian
2011-03-01
We aimed to establish the prevalence and effect of worsening renal function (WRF) on survival among patients with acute decompensated heart failure. Furthermore, we sought to establish a risk score for the prediction of WRF and to externally validate the previously established Forman risk score. A total of 657 consecutive patients with acute decompensated heart failure presenting to the emergency department and undergoing serial creatinine measurements were enrolled. The potential of the clinical parameters at admission to predict WRF was assessed as the primary end point. The secondary end point was all-cause mortality at 360 days. Of the 657 patients, 136 (21%) developed WRF, and 220 patients died during the first year. WRF was more common in the nonsurvivors (30% vs 41%, p = 0.03). Multivariate regression analysis found WRF to independently predict mortality (hazard ratio 1.92, p <0.01). In a single-parameter model, previously diagnosed chronic kidney disease was the only independent predictor of WRF and achieved an area under the receiver operating characteristic curve of 0.60. After the inclusion of the blood gas analysis parameters into the model, history of chronic kidney disease (hazard ratio 2.13, p = 0.03), outpatient diuretics (hazard ratio 5.75, p <0.01), and bicarbonate (hazard ratio 0.91, p <0.01) were all predictive of WRF. A risk score was developed using these predictors. On receiver operating characteristic curve analysis, the Forman and Basel prediction rules achieved areas under the curve of 0.65 and 0.71, respectively. In conclusion, WRF was common in patients with acute decompensated heart failure and was linked to significantly worse outcomes. However, the clinical parameters failed to adequately predict its occurrence, making a tailored therapy approach impossible. Copyright © 2011 Elsevier Inc. All rights reserved.
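A minimal sketch of the model-comparison step reported above, computing areas under the receiver operating characteristic curve for two competing risk scores; scores and outcomes are synthetic stand-ins, not study data:

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
wrf = rng.integers(0, 2, size=200)                 # 1 = patient developed WRF
forman_score = 1.0 * wrf + rng.normal(size=200)    # weaker synthetic separation
basel_score = 1.6 * wrf + rng.normal(size=200)     # stronger synthetic separation

print(f"Forman AUC: {roc_auc_score(wrf, forman_score):.2f}")
print(f"Basel AUC:  {roc_auc_score(wrf, basel_score):.2f}")
```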
Templates of Change: Storms and Shoreline Hazards.
ERIC Educational Resources Information Center
Dolan, Robert; Hayden, Bruce
1980-01-01
Presents results of research designed to assess and predict the storm-related hazards of living on the coast. Findings suggest that certain sections of coastline are more vulnerable than others to storm damage. (WB)
Memory Binding Test Predicts Incident Amnestic Mild Cognitive Impairment.
Mowrey, Wenzhu B; Lipton, Richard B; Katz, Mindy J; Ramratan, Wendy S; Loewenstein, David A; Zimmerman, Molly E; Buschke, Herman
2016-07-14
The Memory Binding Test (MBT), previously known as Memory Capacity Test, has demonstrated discriminative validity for distinguishing persons with amnestic mild cognitive impairment (aMCI) and dementia from cognitively normal elderly. We aimed to assess the predictive validity of the MBT for incident aMCI. In a longitudinal, community-based study of adults aged 70+, we administered the MBT to 246 cognitively normal elderly adults at baseline and followed them annually. Based on previous work, a subtle reduction in memory binding at baseline was defined by a Total Items in the Paired (TIP) condition score of ≤22 on the MBT. Cox proportional hazards models were used to assess the predictive validity of the MBT for incident aMCI accounting for the effects of covariates. The hazard ratio of incident aMCI was also assessed for different prediction time windows ranging from 4 to 7 years of follow-up, separately. Among 246 controls who were cognitively normal at baseline, 48 developed incident aMCI during follow-up. A baseline MBT reduction was associated with an increased risk for developing incident aMCI (hazard ratio (HR) = 2.44, 95% confidence interval: 1.30-4.56, p = 0.005). When varying the prediction window from 4-7 years, the MBT reduction remained significant for predicting incident aMCI (HR range: 2.33-3.12, p: 0.0007-0.04). Persons with poor performance on the MBT are at significantly greater risk for developing incident aMCI. High hazard ratios up to seven years of follow-up suggest that the MBT is sensitive to early disease.
Evaluating Developmental Neurotoxicity Hazard: Better than Before
EPA researchers grew neural networks in their laboratory that showed the promise of helping to screen thousands of chemicals in the environment that are yet to be characterized for developmental neurotoxicity hazard through traditional methods.
Advance of Hazardous Operation Robot and its Application in Special Equipment Accident Rescue
NASA Astrophysics Data System (ADS)
Zeng, Qin-Da; Zhou, Wei; Zheng, Geng-Feng
A survey of hazardous operation robots is given in this article. Firstly, the latest research on nuclear industry robots, fire-fighting robots, and explosive-handling robots is presented. Secondly, existing key technologies and their shortcomings are summarized, including moving mechanisms, control systems, perceptive technology, and power technology. Thirdly, trends in hazardous operation robots are predicted from the current situation. Finally, the characteristics and hazards of special equipment accidents, as well as the feasibility of using hazardous operation robots in special equipment accident rescue, are analyzed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gary Mecham
2010-08-01
This report is a companion to the Facilities Condition and Hazard Assessment for Materials and Fuel Complex Sodium Processing Facilities MFC-799/799A and Nuclear Calibration Laboratory MFC-770C (referred to as the Facilities Condition and Hazards Assessment). This report specifically responds to the requirement of Section 9.2, Item 6, of the Facilities Condition and Hazards Assessment to provide an updated assessment and verification of the residual hazardous materials remaining in the Sodium Processing Facilities processing system. The hazardous materials of concern are sodium and sodium hydroxide (caustic). The information supplied in this report supports the end-point objectives identified in the Transition Plan for Multiple Facilities at the Materials and Fuels Complex, Advanced Test Reactor, Central Facilities Area, and Power Burst Facility, as well as the deactivation and decommissioning critical decision milestone 1, as specified in U.S. Department of Energy Guide 413.3-8, “Environmental Management Cleanup Projects.” Using a tailored approach and based on information obtained through a combination of process knowledge, emergency management hazardous assessment documentation, and visual inspection, this report provides sufficient detail regarding the quantity of hazardous materials for the purposes of facility transfer; it also provides that further characterization/verification of these materials is unnecessary.
NASA Astrophysics Data System (ADS)
Mei, Xiong; Gong, Guangcai
2018-07-01
As potential carriers of hazardous pollutants, airborne particles may deposit onto surfaces due to gravitational settling. A modified Markov chain model to predict gravity-induced particle dispersion and deposition is proposed in this paper. The gravity force is treated as a dominant weighting factor that adjusts the State Transfer Matrix, which represents the probabilities of change in particle spatial distributions between consecutive time steps within an enclosure. The model's performance is validated against published measurements of particle deposition in a ventilation chamber and in a horizontal turbulent duct flow. Both the proportion of deposited particles and the dimensionless deposition velocity are adopted to characterize the validation results. Comparisons between our simulated results and the experimental data from the literature show reasonable accuracy. Moreover, it is also found that the dimensionless deposition velocity can be remarkably influenced by particle size and stream-wise velocity in a typical horizontal flow. This study indicates that the proposed model can predict gravity-dominated airborne particle deposition with reasonable accuracy and acceptable computing time.
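A minimal sketch of the Markov chain idea described above, assuming a one-dimensional column of air cells plus an absorbing "deposited" state, with gravity folded in as an extra downward transition probability; all probabilities are invented for illustration:

```python
import numpy as np

n = 5                            # vertical stack of air cells, 0 = top
p_up, p_down = 0.15, 0.25        # assumed mixing prob.; extra downward bias = gravity
M = np.zeros((n + 1, n + 1))     # state n = "deposited", an absorbing state

for i in range(n):
    if i > 0:
        M[i, i - 1] = p_up       # turbulent transport to the cell above
    M[i, i + 1] = p_down         # transport downward (bottom cell -> deposited)
    M[i, i] = 1.0 - (p_up if i > 0 else 0.0) - p_down
M[n, n] = 1.0                    # deposited particles never re-enter the air

dist = np.zeros(n + 1)
dist[0] = 1.0                    # release all particles in the top cell
for _ in range(200):             # march the chain forward in time
    dist = dist @ M
print(f"fraction deposited after 200 steps: {dist[-1]:.3f}")
```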
Multi scenario seismic hazard assessment for Egypt
NASA Astrophysics Data System (ADS)
Mostafa, Shaimaa Ismail; Abd el-aal, Abd el-aziz Khairy; El-Eraki, Mohamed Ahmed
2018-01-01
Egypt is located in the northeastern corner of Africa, within a sensitive seismotectonic location. Earthquakes are concentrated along the active tectonic boundaries of the African, Eurasian, and Arabian plates. The study area is characterized by northward-increasing sediment thickness, leading to more damage to structures in the north due to multiple reflections of seismic waves. Unfortunately, man-made constructions in Egypt were not designed to resist earthquake ground motions, so it is important to evaluate the seismic hazard to reduce social and economic losses and preserve lives. Probabilistic seismic hazard assessment is used to evaluate the hazard using alternative seismotectonic models within a logic tree framework. Alternative seismotectonic models, magnitude-frequency relations, and various indigenous attenuation relationships were combined within a logic tree formulation to compute the regional exposure on a set of hazard maps. Hazard contour maps are constructed for peak ground acceleration as well as 0.1-, 0.2-, 0.5-, 1-, and 2-s spectral periods for 100- and 475-year return periods for ground motion on rock. The results illustrate that Egypt is characterized by very low to high seismic activity, grading from the west to the eastern part of the country. Uniform hazard spectra are estimated at some important cities distributed all over Egypt. The deaggregation of seismic hazard is estimated at some cities to identify the scenario events that contribute to a selected seismic hazard level. The results of this study can be used for seismic microzonation, risk mitigation, and earthquake engineering purposes.
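A hedged sketch of the logic-tree combination step described above: each branch (an alternative seismotectonic model or attenuation relationship) contributes a hazard curve, and the branch weights yield the mean curve from which a return-period ground motion can be read off. Curves and weights here are invented for illustration:

```python
import numpy as np

pga = np.linspace(0.05, 1.0, 96)                   # peak ground acceleration, g
branches = np.array([
    np.exp(-8 * pga),                              # branch 1: one source model
    np.exp(-10 * pga),                             # branch 2: alternative model
    np.exp(-12 * pga),                             # branch 3: alternative attenuation
])
weights = np.array([0.5, 0.3, 0.2])                # logic-tree branch weights (sum to 1)

mean_curve = weights @ branches                    # weighted annual exceedance rate
pga_475 = pga[np.argmin(np.abs(mean_curve - 1 / 475))]
print(f"PGA at the 475-year return period: ~{pga_475:.2f} g")
```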
NASA Technical Reports Server (NTRS)
Coulbert, C. D.
1978-01-01
A method for predicting the probable course of fire development in an enclosure is presented. This fire modeling approach uses a graphic plot of five fire development constraints, the relative energy release criteria (RERC), to bound the heat release rates in an enclosure as a function of time. The five RERC are flame spread rate, fuel surface area, ventilation, enclosure volume, and total fuel load. They may be calculated versus time based on the specified or empirical conditions describing the specific enclosure, the fuel type and load, and the ventilation. The calculation of these five criteria, using the common basis of energy release rates versus time, provides a unifying framework for the utilization of available experimental data from all phases of fire development. The plot of these criteria reveals the probable fire development envelope and indicates which fire constraint will be controlling during a criteria time period. Examples of RERC application to fire characterization and control and to hazard analysis are presented along with recommendations for the further development of the concept.
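A minimal numerical sketch of the RERC concept, assuming five schematic constraint curves (placeholders, not measured data) expressed as heat-release-rate bounds versus time; the governing envelope is their pointwise minimum, and the controlling criterion can change as the fire develops:

```python
import numpy as np

t = np.linspace(0, 600, 121)                          # time from ignition, s
constraints = {                                       # heat-release-rate bounds
    "flame spread":     5e2 * np.exp(t / 120),        # growth-limited early phase
    "fuel surface":     4e4 * np.ones_like(t),        # exposed-area plateau
    "ventilation":      6e4 * np.ones_like(t),        # airflow-limited plateau
    "enclosure volume": 8e4 * np.ones_like(t),
    "total fuel load":  np.maximum(9e4 - 200 * t, 0), # burnout decay
}

envelope = np.min(np.stack(list(constraints.values())), axis=0)
controlling = [min(constraints, key=lambda k: constraints[k][i]) for i in range(len(t))]
print("controls at ignition:", controlling[0], "| controls at the end:", controlling[-1])
```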
Global precipitation measurement (GPM) preliminary design
NASA Astrophysics Data System (ADS)
Neeck, Steven P.; Kakar, Ramesh K.; Azarbarzin, Ardeshir A.; Hou, Arthur Y.
2008-10-01
The overarching Earth science mission objective of the Global Precipitation Measurement (GPM) mission is to develop a scientific understanding of the Earth system and its response to natural and human-induced changes. This will enable improved prediction of climate, weather, and natural hazards for present and future generations. The specific scientific objectives of GPM are advancing: Precipitation Measurement through combined use of active and passive remote-sensing techniques, Water/Energy Cycle Variability through improved knowledge of the global water/energy cycle and fresh water availability, Climate Prediction through better understanding of surface water fluxes, soil moisture storage, cloud/precipitation microphysics and latent heat release, Weather Prediction through improved numerical weather prediction (NWP) skills from more accurate and frequent measurements of instantaneous rain rates with better error characterizations and improved assimilation methods, Hydrometeorological Prediction through better temporal sampling and spatial coverage of highresolution precipitation measurements and innovative hydro-meteorological modeling. GPM is a joint initiative with the Japan Aerospace Exploration Agency (JAXA) and other international partners and is the backbone of the Committee on Earth Observation Satellites (CEOS) Precipitation Constellation. It will unify and improve global precipitation measurements from a constellation of dedicated and operational active/passive microwave sensors. GPM is completing the Preliminary Design Phase and is advancing towards launch in 2013 and 2014.
NASA Astrophysics Data System (ADS)
Contreras Vargas, M. T.; Escauriaza, C. R.; Westerink, J. J.
2017-12-01
In recent years, the occurrence of flash floods and landslides produced by hydrometeorological events in Andean watersheds has had devastating consequences in urban and rural areas near the mountains. Two factors have hindered hazard forecasting in the region: 1) the spatial and temporal variability of climate conditions, which reduces the time range over which storm features can be predicted; and 2) the complexity of the basin morphology that characterizes the Andean region, which increases the velocity and sediment transport capacity of flows that reach urbanized areas. Hydrodynamic models have become key tools to assess potential flood risks. Two-dimensional (2D) models based on the shallow-water equations are widely used to determine, with high accuracy and resolution, the evolution of flow depths and velocities during floods. However, high computational requirements and long computation times have encouraged research into more efficient methodologies for predicting flood propagation in real time. Our objective is to develop new surrogate models (i.e., metamodeling) to quasi-instantaneously evaluate flood propagation in the Andes foothills. By means of a small set of parameters, we define storms for a wide range of meteorological conditions. Using a 2D hydrodynamic model coupled in mass and momentum with the sediment concentration, we compute high-fidelity simulations of a set of floods. The results are used as a database for interpolation/regression, efficiently approximating the flow depths and velocities at critical points during real storms. This is the first application of surrogate models to evaluate flood propagation in the Andes foothills, improving the efficiency of flood hazard prediction. The model also opens new opportunities to improve early warning systems, helping decision makers to inform citizens and enhancing the resilience of cities near mountain regions. This work has been supported by CONICYT/FONDAP grant 15110017, and by the Vice Chancellor of Research of the Pontificia Universidad Catolica de Chile, through the Research Internationalization Grant, PUC1566, funded by MINEDUC.
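A hedged sketch of the surrogate step described above, using SciPy's RBFInterpolator as one possible interpolation/regression choice; the storm descriptors and training values are synthetic stand-ins for high-fidelity 2D runs:

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# storm descriptors: (total rainfall mm, peak intensity mm/h) -- assumed parameters
params = np.array([[30.0, 8], [60, 15], [90, 25], [120, 40], [150, 60]])
peak_depth_m = np.array([0.12, 0.45, 0.95, 1.60, 2.40])   # placeholder 2D-model output

surrogate = RBFInterpolator(params, peak_depth_m)          # train once, offline
new_storm = np.array([[100.0, 30.0]])                      # query quasi-instantly
print(f"predicted peak flow depth: {surrogate(new_storm)[0]:.2f} m")
```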
Wang, Yan; Nowack, Bernd
2018-05-01
Many research studies have endeavored to investigate the ecotoxicological hazards of engineered nanomaterials (ENMs). However, little is known regarding the actual environmental risks of ENMs, combining both hazard and exposure data. The aim of the present study was to quantify the environmental risks for nano-Al2O3, nano-SiO2, nano iron oxides, nano-CeO2, and quantum dots by comparing the predicted environmental concentrations (PECs) with the predicted-no-effect concentrations (PNECs). The PEC values of these 5 ENMs in freshwaters in 2020 for northern Europe and southeastern Europe were taken from a published dynamic probabilistic material flow analysis model. The PNEC values were calculated using probabilistic species sensitivity distributions (SSDs). The order of the PNEC values was quantum dots < nano-CeO2 < nano iron oxides < nano-Al2O3 < nano-SiO2. The risks posed by these 5 ENMs were demonstrated to be in the reverse order: nano-Al2O3 > nano-SiO2 > nano iron oxides > nano-CeO2 > quantum dots. However, all risk characterization values are 4 to 8 orders of magnitude lower than 1, and no risk was therefore predicted for any of the investigated ENMs at the estimated release level in 2020. Compared to static models, the dynamic material flow model allowed us to use PEC values based on a more complex parameterization, considering a dynamic input over time and time-dependent release of ENMs. The probabilistic SSD approach makes it possible to include all available data to estimate hazards of ENMs by considering the whole range of variability between studies and material types. The risk-assessment approach is therefore able to handle the uncertainty and variability associated with the collected data. The results of the present study provide a scientific foundation for risk-based regulatory decisions on the investigated ENMs. Environ Toxicol Chem 2018;37:1387-1395. © 2018 SETAC.
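A minimal sketch of the risk characterization step, assuming log-normal stand-ins for the probabilistic PEC and PNEC samples (the study's values come from a material flow model and probabilistic SSDs); the risk characterization ratio is RCR = PEC/PNEC:

```python
import numpy as np

rng = np.random.default_rng(42)
pec = rng.lognormal(mean=np.log(1e-6), sigma=1.0, size=100_000)   # mg/L, assumed
pnec = rng.lognormal(mean=np.log(1e-1), sigma=0.8, size=100_000)  # mg/L, assumed

rcr = pec / pnec                                   # risk characterization ratio
print(f"median RCR: {np.median(rcr):.1e}")         # orders of magnitude below 1
print(f"P(RCR > 1): {np.mean(rcr > 1):.1e}")       # effectively zero here
```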
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cui, Y; Shirato, H; Song, J
2015-06-15
Purpose: This study aims to identify novel prognostic imaging biomarkers in locally advanced pancreatic cancer (LAPC) using quantitative, high-throughput image analysis. Methods: 86 patients with LAPC receiving chemotherapy followed by SBRT were retrospectively studied. All patients had a baseline FDG-PET scan prior to SBRT. For each patient, we extracted 435 PET imaging features of five types: statistical, morphological, textural, histogram, and wavelet. These features went through redundancy checks, robustness analysis, as well as a prescreening process based on their concordance indices with respect to the relevant outcomes. We then performed principal component analysis on the remaining features (the number ranged from 10 to 16), and fitted a Cox proportional hazards regression model using the first 3 principal components. Kaplan-Meier analysis was used to assess the ability to distinguish high- versus low-risk patients separated by median predicted survival. To avoid overfitting, all evaluations were based on leave-one-out cross validation (LOOCV), in which each holdout patient was assigned to a risk group according to the model obtained from a separate training set. Results: For predicting overall survival (OS), the most dominant imaging features were wavelet coefficients. There was a statistically significant difference in OS between patients with predicted high and low risk based on LOOCV (hazard ratio: 2.26, p<0.001). Similar imaging features were also strongly associated with local progression-free survival (LPFS) (hazard ratio: 1.53, p=0.026) on LOOCV. In comparison, neither SUVmax nor TLG was associated with LPFS (p=0.103, p=0.433) (Table 1). Results for progression-free survival and distant progression-free survival showed similar trends. Conclusion: Radiomic analysis identified novel imaging features that showed improved prognostic value over conventional methods. These features characterize the degree of intra-tumor heterogeneity reflected on FDG-PET images, and their biological underpinnings warrant further investigation. If validated in large, prospective cohorts, this method could be used to stratify patients based on individualized risk.
Prediction and Prevention of Chemical Reaction Hazards: Learning by Simulation.
ERIC Educational Resources Information Center
Shacham, Mordechai; Brauner, Neima; Cutlip, Michael B.
2001-01-01
Points out that chemical hazards are the major cause of accidents in the chemical industry and describes a safety teaching approach using simulation. Explains a problem statement on exothermic liquid-phase reactions. (YDS)
Hazard function theory for nonstationary natural hazards
Read, Laura K.; Vogel, Richard M.
2016-04-11
Impact from natural hazards is a shared global problem that causes tremendous loss of life and property, economic cost, and damage to the environment. Increasingly, many natural processes show evidence of nonstationary behavior including wind speeds, landslides, wildfires, precipitation, streamflow, sea levels, and earthquakes. Traditional probabilistic analysis of natural hazards based on peaks over threshold (POT) generally assumes stationarity in the magnitudes and arrivals of events, i.e., that the probability of exceedance of some critical event is constant through time. Given increasing evidence of trends in natural hazards, new methods are needed to characterize their probabilistic behavior. The well-developed field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (X) with its failure time series (T), enabling computation of corresponding average return periods, risk, and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose POT magnitudes are assumed to follow the widely applied generalized Pareto model. We derive the hazard function for this case and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. As a result, our theoretical analysis linking hazard random variable X with corresponding failure time series T should have application to a wide class of natural hazards with opportunities for future extensions.
Quantitative Microbial Risk Assessment for Clostridium perfringens in Natural and Processed Cheeses
Lee, Heeyoung; Lee, Soomin; Kim, Sejeong; Lee, Jeeyeon; Ha, Jimyeong; Yoon, Yohan
2016-01-01
This study evaluated the risk of Clostridium perfringens (C. perfringens) foodborne illness from natural and processed cheeses. Microbial risk assessment in this study was conducted according to four steps: hazard identification, hazard characterization, exposure assessment, and risk characterization. Hazard identification for C. perfringens on cheese was carried out through a literature review, and dose-response models were utilized for hazard characterization of the pathogen. For exposure assessment, the prevalence of C. perfringens, storage temperatures, storage times, and annual amounts of cheese consumption were surveyed. A simulation model was then developed using the collected data, and the simulation result was used to estimate the probability of C. perfringens foodborne illness from cheese consumption with @RISK. C. perfringens was determined to be low risk on cheese based on hazard identification, and the exponential model (r = 1.82×10−11) was deemed appropriate for hazard characterization. Annual amounts of natural and processed cheese consumption were 12.40±19.43 g and 19.46±14.39 g, respectively. Since the contamination levels of C. perfringens on natural (0.30 Log CFU/g) and processed cheeses (0.45 Log CFU/g) were below the detection limit, the initial contamination levels of natural and processed cheeses were estimated by beta distribution (α1 = 1, α2 = 91; α1 = 1, α2 = 309)×uniform distribution (a = 0, b = 2; a = 0, b = 2.8) to be −2.35 and −2.73 Log CFU/g, respectively. Moreover, no growth of C. perfringens was observed under the distribution and storage conditions simulated for exposure assessment. These data were used for risk characterization in a simulation model, and the mean probabilities of C. perfringens foodborne illness from cheese consumption per person per day were 9.57×10−14 for natural and 3.58×10−14 for processed cheeses. These results indicate that the probability of C. perfringens foodborne illness from cheese consumption is low, and they can be used to establish microbial criteria for C. perfringens on natural and processed cheeses. PMID:26954204
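The Monte Carlo pattern described here, a beta×uniform initial contamination level, a consumption distribution, and an exponential dose-response, can be sketched in a few lines. The following is our illustration of that pattern, not the authors' @RISK model; the distribution parameters come from the abstract but the sampling details are assumptions:

```python
# Hedged sketch of the abstract's Monte Carlo pattern for natural cheese.
# P_ill = 1 - exp(-r * dose), with r from the abstract's exponential model.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
r = 1.82e-11                                        # dose-response parameter (abstract)

conc = rng.beta(1, 91, n) * rng.uniform(0, 2, n)    # CFU/g: beta x uniform (assumption)
serving = rng.normal(12.40, 19.43, n).clip(min=0)   # g/day, abstract's mean +/- sd
dose = conc * serving                               # ingested CFU (no growth, per abstract)
p_ill = 1 - np.exp(-r * dose)
print(p_ill.mean())                                 # mean daily illness probability
```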
Political Orientation Predicts Credulity Regarding Putative Hazards.
Fessler, Daniel M T; Pisor, Anne C; Holbrook, Colin
2017-05-01
To benefit from information provided by other people, people must be somewhat credulous. However, credulity entails risks. The optimal level of credulity depends on the relative costs of believing misinformation and failing to attend to accurate information. When information concerns hazards, erroneous incredulity is often more costly than erroneous credulity, given that disregarding accurate warnings is more harmful than adopting unnecessary precautions. Because no equivalent asymmetry exists for information concerning benefits, people should generally be more credulous of hazard information than of benefit information. This adaptive negatively biased credulity is linked to negativity bias in general and is more prominent among people who believe the world to be more dangerous. Because both threat sensitivity and beliefs about the dangerousness of the world differ between conservatives and liberals, we predicted that conservatism would positively correlate with negatively biased credulity. Two online studies of Americans supported this prediction, potentially illuminating how politicians' alarmist claims affect different portions of the electorate.
CyberShake: A Physics-Based Seismic Hazard Model for Southern California
NASA Astrophysics Data System (ADS)
Graves, Robert; Jordan, Thomas H.; Callaghan, Scott; Deelman, Ewa; Field, Edward; Juve, Gideon; Kesselman, Carl; Maechling, Philip; Mehta, Gaurang; Milner, Kevin; Okaya, David; Small, Patrick; Vahi, Karan
2011-03-01
CyberShake, as part of the Southern California Earthquake Center's (SCEC) Community Modeling Environment, is developing a methodology that explicitly incorporates deterministic source and wave propagation effects within seismic hazard calculations through the use of physics-based 3D ground motion simulations. To calculate a waveform-based seismic hazard estimate for a site of interest, we begin with Uniform California Earthquake Rupture Forecast, Version 2.0 (UCERF2.0) and identify all ruptures within 200 km of the site of interest. We convert the UCERF2.0 rupture definition into multiple rupture variations with differing hypocenter locations and slip distributions, resulting in about 415,000 rupture variations per site. Strain Green Tensors are calculated for the site of interest using the SCEC Community Velocity Model, Version 4 (CVM4), and then, using reciprocity, we calculate synthetic seismograms for each rupture variation. Peak intensity measures are then extracted from these synthetics and combined with the original rupture probabilities to produce probabilistic seismic hazard curves for the site. Being explicitly site-based, CyberShake directly samples the ground motion variability at that site over many earthquake cycles (i.e., rupture scenarios) and alleviates the need for the ergodic assumption that is implicitly included in traditional empirically based calculations. Thus far, we have simulated ruptures at over 200 sites in the Los Angeles region for ground shaking periods of 2 s and longer, providing the basis for the first generation CyberShake hazard maps. Our results indicate that the combination of rupture directivity and basin response effects can lead to an increase in the hazard level for some sites, relative to that given by a conventional Ground Motion Prediction Equation (GMPE). Additionally, and perhaps more importantly, we find that the physics-based hazard results are much more sensitive to the assumed magnitude-area relations and magnitude uncertainty estimates used in the definition of the ruptures than is found in the traditional GMPE approach. This reinforces the need for continued development of a better understanding of earthquake source characterization and the constitutive relations that govern the earthquake rupture process.
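The final step described in the abstract, combining rupture probabilities with the peak intensity measures extracted from synthetic seismograms, amounts to building a site hazard curve. A minimal sketch of that combination step, assuming equally weighted rupture variations (the production CyberShake codes are far more elaborate):

```python
# Site hazard curve from simulated intensities:
# P(IM > im) = sum over ruptures of P(rupture) * fraction of variations exceeding im.
import numpy as np

def hazard_curve(im_levels, rupture_probs, rupture_ims):
    """rupture_probs: annual probability of each rupture; rupture_ims: one array of
    peak intensity measures per rupture (one value per rupture variation)."""
    curve = np.zeros_like(im_levels, dtype=float)
    for p, ims in zip(rupture_probs, rupture_ims):
        exceed_frac = np.array([(ims > im).mean() for im in im_levels])
        curve += p * exceed_frac
    return curve

im = np.array([0.1, 0.2, 0.4])                      # intensity levels (g), illustrative
probs = [1e-3, 5e-4]                                # annual rupture probabilities
ims = [np.array([0.15, 0.30, 0.25]), np.array([0.50, 0.45, 0.60])]
print(hazard_curve(im, probs, ims))
```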
NASA Astrophysics Data System (ADS)
Hao, Zengchao; Hao, Fanghua; Singh, Vijay P.
2016-08-01
Drought is among the costliest natural hazards worldwide and extreme drought events in recent years have caused huge losses to various sectors. Drought prediction is therefore critically important for providing early warning information to aid decision making to cope with drought. Due to the complicated nature of drought, it has been recognized that the univariate drought indicator may not be sufficient for drought characterization and hence multivariate drought indices have been developed for drought monitoring. Alongside the substantial effort in drought monitoring with multivariate drought indices, it is of equal importance to develop a drought prediction method with multivariate drought indices to integrate drought information from various sources. This study proposes a general framework for multivariate multi-index drought prediction that is capable of integrating complementary prediction skills from multiple drought indices. The Multivariate Ensemble Streamflow Prediction (MESP) is employed to sample from historical records for obtaining statistical prediction of multiple variables, which is then used as inputs to achieve multivariate prediction. The framework is illustrated with a linearly combined drought index (LDI), which is a commonly used multivariate drought index, based on climate division data in California and New York in the United States with different seasonality of precipitation. The predictive skill of LDI (represented with persistence) is assessed by comparison with the univariate drought index and results show that the LDI prediction skill is less affected by seasonality than the meteorological drought prediction based on SPI. Prediction results from the case study show that the proposed multivariate drought prediction outperforms the persistence prediction, implying a satisfactory performance of multivariate drought prediction. The proposed method would be useful for drought prediction to integrate drought information from various sources for early drought warning.
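One common way to build a linearly combined drought index is to take the leading principal component of several standardized drought indicators. The sketch below illustrates that construction under our own assumptions; the paper's exact LDI formulation may differ:

```python
# Leading principal component of standardized indicators as a combined drought index.
import numpy as np

def linear_drought_index(X):
    """X: (n_months, n_indicators), e.g. precipitation, soil moisture and
    runoff anomalies; returns one combined index value per month."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)    # standardize each indicator
    eigvals, eigvecs = np.linalg.eigh(np.cov(Z, rowvar=False))
    w = eigvecs[:, -1]                          # eigenvector of the largest eigenvalue
    return Z @ w

rng = np.random.default_rng(0)
print(linear_drought_index(rng.normal(size=(120, 3)))[:5])   # mock 10-year record
```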
Aftereffect Calculation and Prediction of Methanol Tank Leak’s Environmental Risk Accident
NASA Astrophysics Data System (ADS)
Lang, Yueting; Zheng, Lina; Chen, Henan; Wang, Qiushi; Jiang, Hui; Pan, Yiwen
2018-01-01
With the increasing frequency of environmental risk accidents, more emphasis has been placed on environmental risk assessment. This article calculates and predicts the aftermath of a methanol tank leakage accident in the cryogenic unit area of an oilfield processing plant. Major hazards were identified through major-hazard identification for dangerous chemicals, which in turn enabled analysis of the maximum credible accident and confirmation of the source term and source strength. Finally, the consequences of the accident were calculated so that its impact on the surrounding environment could be predicted.
TC4 Observing Campaign: An Operational Test of NASA Planetary Defense Network
NASA Astrophysics Data System (ADS)
Reddy, V.; Kelley, M. S.; Landis, R. R.
Impacts due to near-Earth objects (~90% near-Earth asteroids, or NEAs, and ~10% comets) are one of the natural hazards that can pose a great risk to life on Earth, but one that can potentially be mitigated if the threat is detected with sufficient lead-time. While the probability of such an event is low, the outcome is so catastrophic that we are well justified in investing a modest effort to minimize this threat. Historically, asteroid impacts have altered the course of evolution on the Earth. In 2013 the Chelyabinsk meteor over Russia, which injured over 1600 people and caused $30M in damages, reinforced the importance of detecting and characterizing small NEAs that pose a greater threat than most large NEAs discovered so far. The NASA Planetary Defense Coordination Office (PDCO) was established to ensure the early detection, tracking and characterization of potentially hazardous objects (PHOs) and is the lead office for providing timely and accurate communications and coordination of U.S. Government planning for response to an actual impact threat. In an effort to test the operational readiness of all entities critical to planetary defense, the NASA PDCO is supporting a community-led exercise. The target of this exercise is 2012 TC4, a 20-meter diameter asteroid that is currently expected to pass by the Earth over Antarctica on Oct. 12, 2017 at a distance of only 2.3 Earth radii. The goal of the TC4 Observing Campaign is to recover, track, and characterize 2012 TC4 as a potential impactor in order to exercise the entire Planetary Defense system from observations, modeling, prediction, and communication. The paper will present an overview of the campaign and summarize early results from the exercise.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Owens, T.; Ungers, L.; Briggs, T.
1980-08-01
The purpose of this study is to estimate, both quantitatively and qualitatively, the worker and societal risks attributable to four photovoltaic cell (solar cell) production processes. Quantitative risk values were determined by use of statistics from the California semiconductor industry. The qualitative risk assessment was performed using a variety of both governmental and private sources of data. The occupational health statistics derived from the semiconductor industry were used to predict injury and fatality levels associated with photovoltaic cell manufacturing. The use of these statistics to characterize the two silicon processes described herein is defensible from the standpoint that many of the same process steps and materials are used in both the semiconductor and photovoltaic industries. These health statistics are less applicable to the gallium arsenide and cadmium sulfide manufacturing processes, primarily because of differences in the materials utilized. Although such differences tend to discourage any absolute comparisons among the four photovoltaic cell production processes, certain relative comparisons are warranted. To facilitate a risk comparison of the four processes, the number and severity of process-related chemical hazards were assessed. This qualitative hazard assessment addresses both the relative toxicity and the exposure potential of substances in the workplace. In addition to the worker-related hazards, estimates of process-related emissions and wastes are also provided.
Kim, Su Hwan; Kim, Byeong Gwan; Kim, Won; Oh, Sohee; Kim, Hwi Young; Jung, Yong Jin; Jeong, Ji Bong; Kim, Ji Won; Lee, Kook Lae
2016-04-01
Gastrointestinal bleeding (GIB) often accompanies alcoholic hepatitis (AH). The study aimed to investigate clinical characteristics of GIB in AH patients and to identify risk factors for mortality in AH patients with GIB. Data from 329 patients hospitalized with AH in a single center during 1999-2014 were retrospectively analyzed. Patients with AH were dichotomized into GIB and non-GIB groups. The GIB group was further divided into portal hypertensive bleeding (PHB) and non-PHB groups. Clinical characteristics and survival outcomes were compared between the groups. Risk factors for mortality were analyzed using Cox regression. Among the 329 AH patients, 132 experienced GIB at admission or during hospitalization. The most common cause of GIB was an esophageal varix. The GIB group had worse survival outcomes than the non-GIB group (log-rank test, P = 0.034). The PHB group had worse survival outcomes than the non-PHB group (log-rank test, P = 0.001). On multivariate analysis, alcohol consumption, ascites, encephalopathy, infection, Maddrey's discriminant function, and the model for end-stage liver disease (MELD) score independently predicted mortality in the entire AH cohort. The MELD score (hazard ratio, 1.085; 95% confidence interval, 1.052-1.120; P < 0.001) and PHB (hazard ratio, 2.162; 95% confidence interval, 1.021-4.577; P = 0.044) were significant prognosticators for patients with AH and GIB. The presence of PHB and a higher MELD score adversely affected survival in AH patients with GIB. Accordingly, prompt endoscopic examination for exploring the etiologies of GIB may alert physicians to predict the risk of death in AH patients with GIB. © 2015 Journal of Gastroenterology and Hepatology Foundation and John Wiley & Sons Australia, Ltd.
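For readers unfamiliar with the analysis behind these hazard ratios, a Cox proportional hazards fit of this general shape can be reproduced with the lifelines package. The data and column names below are invented for illustration and are not the study's variables:

```python
# Toy Cox proportional hazards regression in the spirit of the abstract's analysis.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "months": [5, 12, 30, 7, 45, 22, 18, 60],    # follow-up time (invented)
    "death":  [1, 1, 0, 1, 0, 1, 1, 0],          # event indicator
    "meld":   [31, 24, 15, 28, 12, 27, 22, 10],  # MELD score at admission
    "phb":    [1, 1, 0, 1, 0, 0, 1, 0],          # portal hypertensive bleeding
})

# A small ridge penalty keeps the toy fit stable on such a tiny dataset.
cph = CoxPHFitter(penalizer=0.1)
cph.fit(df, duration_col="months", event_col="death")
cph.print_summary()   # hazard ratios (exp(coef)) with 95% confidence intervals
```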
Evaluation of fog predictions and detection : [summary].
DOT National Transportation Integrated Search
2015-03-01
Fog can make driving conditions extremely hazardous. These hazards are further increased : at night and/or when combined with smoke. Nationally, about 38,000 fog-related highway : incidents occur each year, with over 600 fatalities. Florida ranks thi...
NASA Astrophysics Data System (ADS)
Braun, Andreas; Jaque Castillo, Edilia
2017-04-01
The Andes of central Chile are a natural environment characterized by multiple natural hazards (mass movements, volcanic hazards, seismic hazards, and snow avalanches, to name a few). The totality of these hazards, following the notion of Müller-Mahn et al. and in relation to vulnerable entities, spans a riskscape. Spatial planning should take this riskscape into account in order to ensure safe and resilient regional development. However, as frequently observed in developing or newly industrialized countries, such precautionary measures are rarely realized. Spatial planning tends to be reactive to private investment, opportunistic, and frequently clientelistic. This results in spatial structures whose future development is vulnerable to natural disasters. The contribution analyses these circumstances within a riskscape in central Chile. Within the VIII Region, close to the volcanic complex Nevados de Chillan, a touristic development around a winter-sports hotel has been established. However, the place is affected by a multitude of natural hazards. On the basis of primary and secondary data, the contribution first provides hazard maps for several natural hazards. Secondly, the individual hazard maps are merged into an overall hazard map. This overall hazard map is related to the vulnerable entities to span a riskscape. The vulnerable entities are settlements, but also tourist infrastructures. The contribution then examines how precautionary spatial planning could have avoided putting vulnerable entities at risk, which spatial structures (especially regarding tourism) are actually found, and which challenges for spatial development exist. It reveals that the most important tourist infrastructures are located precisely at places characterized by a high overall hazard. Furthermore, it shows that alternatives at economically equally attractive sites, but with a much smaller overall hazard, would have existed. It concludes by discussing possible reasons for this with reference to the Chilean planning system.
Solar-terrestrial Predictions Proceedings. Volume 1: Prediction Group Reports
NASA Technical Reports Server (NTRS)
Donnelly, R. F. (Editor)
1979-01-01
The current practice in solar-terrestrial predictions is reviewed with emphasis on prediction, warning, and monitoring services. Topics covered include: ionosphere-reflected HF radio propagation; radiation hazards for manned space flights and high altitude and high latitude aircraft flights; and geomagnetic activity.
Lin Receives 2010 Natural Hazards Focus Group Award for Graduate Research
NASA Astrophysics Data System (ADS)
2010-11-01
Ning Lin has been awarded the Natural Hazards Focus Group Award for Graduate Research, given annually to a recent Ph.D. recipient for outstanding contributions to natural hazards research. Lin's thesis is entitled “Multi-hazard risk analysis related to hurricanes.” She is scheduled to present an invited talk in the Extreme Natural Events: Modeling, Prediction, and Mitigation session (NH20) during the 2010 AGU Fall Meeting, held 13-17 December in San Francisco, Calif. Lin will be formally presented with the award at the Natural Hazards focus group reception on 14 December 2010.
Characterization and prediction of chemical functions and weight fractions in consumer products.
Isaacs, Kristin K; Goldsmith, Michael-Rock; Egeghy, Peter; Phillips, Katherine; Brooks, Raina; Hong, Tao; Wambaugh, John F
2016-01-01
Assessing exposures from the thousands of chemicals in commerce requires quantitative information on the chemical constituents of consumer products. Unfortunately, gaps in available composition data prevent assessment of exposure to chemicals in many products. Here we propose filling these gaps via consideration of chemical functional role. We obtained function information for thousands of chemicals from public sources and used a clustering algorithm to assign chemicals into 35 harmonized function categories (e.g., plasticizers, antimicrobials, solvents). We combined these functions with weight fraction data for 4115 personal care products (PCPs) to characterize the composition of 66 different product categories (e.g., shampoos). We analyzed the combined weight fraction/function dataset using machine learning techniques to develop quantitative structure property relationship (QSPR) classifier models for 22 functions and for weight fraction, based on chemical-specific descriptors (including chemical properties). We applied these classifier models to a library of 10196 data-poor chemicals. Our predictions of chemical function and composition will inform exposure-based screening of chemicals in PCPs for combination with hazard data in risk-based evaluation frameworks. As new information becomes available, this approach can be applied to other classes of products and the chemicals they contain in order to provide essential consumer product data for use in exposure-based chemical prioritization.
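The harmonization step, assigning thousands of chemicals to 35 function categories with a clustering algorithm, can be miniaturized as follows. The feature encoding and the choice of k-means are our assumptions for illustration; the paper does not necessarily use this algorithm:

```python
# Hedged sketch: cluster chemicals into 35 harmonized function categories
# from reported-function indicator vectors (data invented; k=35 from the abstract).
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(5)
X = rng.integers(0, 2, size=(3000, 60)).astype(float)  # chemical x reported-function terms
labels = KMeans(n_clusters=35, n_init=10, random_state=0).fit_predict(X)
print(np.bincount(labels)[:5])    # sizes of the first few harmonized categories
```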
Esserman, Laura J.; Berry, Donald A.; DeMichele, Angela; Carey, Lisa; Davis, Sarah E.; Buxton, Meredith; Hudis, Cliff; Gray, Joe W.; Perou, Charles; Yau, Christina; Livasy, Chad; Krontiras, Helen; Montgomery, Leslie; Tripathy, Debasish; Lehman, Constance; Liu, Minetta C.; Olopade, Olufunmilayo I.; Rugo, Hope S.; Carpenter, John T.; Dressler, Lynn; Chhieng, David; Singh, Baljit; Mies, Carolyn; Rabban, Joseph; Chen, Yunn-Yi; Giri, Dilip; van 't Veer, Laura; Hylton, Nola
2012-01-01
Purpose Neoadjuvant chemotherapy for breast cancer provides critical information about tumor response; how best to leverage this for predicting recurrence-free survival (RFS) is not established. The I-SPY 1 TRIAL (Investigation of Serial Studies to Predict Your Therapeutic Response With Imaging and Molecular Analysis) was a multicenter breast cancer study integrating clinical, imaging, and genomic data to evaluate pathologic response, RFS, and their relationship and predictability based on tumor biomarkers. Patients and Methods Eligible patients had tumors ≥ 3 cm and received neoadjuvant chemotherapy. We determined associations between pathologic complete response (pCR; defined as the absence of invasive cancer in breast and nodes) and RFS, overall and within receptor subsets. Results In 221 evaluable patients (median tumor size, 6.0 cm; median age, 49 years; 91% classified as poor risk on the basis of the 70-gene prognosis profile), 41% were hormone receptor (HR) negative, and 31% were human epidermal growth factor receptor 2 (HER2) positive. For 190 patients treated without neoadjuvant trastuzumab, pCR was highest for HR-negative/HER2-positive patients (45%) and lowest for HR-positive/HER2-negative patients (9%). Achieving pCR predicted favorable RFS. For 172 patients treated without trastuzumab, the hazard ratio for RFS of pCR versus no pCR was 0.29 (95% CI, 0.07 to 0.82). pCR was more predictive of RFS by multivariate analysis when subtype was taken into account, and point estimates of hazard ratios within the HR-positive/HER2-negative (hazard ratio, 0.00; 95% CI, 0.00 to 0.93), HR-negative/HER2-negative (hazard ratio, 0.25; 95% CI, 0.04 to 0.97), and HER2-positive (hazard ratio, 0.14; 95% CI, 0.01 to 1.0) subtypes are lower. Ki67 further improved the prediction of pCR within subsets. Conclusion In this biologically high-risk group, pCR differs by receptor subset. pCR is more highly predictive of RFS within every established receptor subset than overall, demonstrating that the extent of outcome advantage conferred by pCR is specific to tumor biology. PMID:22649152
Empirical Data Fusion for Convective Weather Hazard Nowcasting
NASA Astrophysics Data System (ADS)
Williams, J.; Ahijevych, D.; Steiner, M.; Dettling, S.
2009-09-01
This paper describes a statistical analysis approach to developing an automated convective weather hazard nowcast system suitable for use by aviation users in strategic route planning and air traffic management. The analysis makes use of numerical weather prediction model fields and radar, satellite, and lightning observations and derived features along with observed thunderstorm evolution data, which are aligned using radar-derived motion vectors. Using a dataset collected during the summers of 2007 and 2008 over the eastern U.S., the predictive contributions of the various potential predictor fields are analyzed for various spatial scales, lead-times and scenarios using a technique called random forests (RFs). A minimal, skillful set of predictors is selected for each scenario requiring distinct forecast logic, and RFs are used to construct an empirical probabilistic model for each. The resulting data fusion system, which ran in real-time at the National Center for Atmospheric Research during the summer of 2009, produces probabilistic and deterministic nowcasts of the convective weather hazard and assessments of the prediction uncertainty. The nowcasts' performance and results for several case studies are presented to demonstrate the value of this approach. This research has been funded by the U.S. Federal Aviation Administration to support the development of the Consolidated Storm Prediction for Aviation (CoSPA) system, which is intended to provide convective hazard nowcasts and forecasts for the U.S. Next Generation Air Transportation System (NextGen).
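A random-forest data-fusion step of this kind can be prototyped with scikit-learn. Everything below (features, labels, settings) is invented for illustration; only the overall pattern, training on candidate predictors and emitting exceedance probabilities and variable importances, follows the abstract:

```python
# Hedged sketch of a random-forest convective-hazard nowcast model.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 4))   # e.g. radar echo top, satellite IR, lightning rate, CAPE
y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(size=1000) > 1).astype(int)  # mock hazard label

rf = RandomForestClassifier(n_estimators=200, oob_score=True, random_state=0)
rf.fit(X, y)
print(rf.oob_score_, rf.feature_importances_)   # importances guide predictor selection
print(rf.predict_proba(X[:5])[:, 1])            # probabilistic nowcast for 5 cases
```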
A method for mapping flood hazard along roads.
Kalantari, Zahra; Nickman, Alireza; Lyon, Steve W; Olofsson, Bo; Folkeson, Lennart
2014-01-15
A method was developed for estimating and mapping flood hazard probability along roads using road and catchment characteristics as physical catchment descriptors (PCDs). The method uses a Geographic Information System (GIS) to derive candidate PCDs and then identifies those PCDs that significantly predict road flooding using a statistical modelling approach. The method thus allows flood hazards to be estimated and also provides insights into the relative roles of landscape characteristics in determining road-related flood hazards. The method was applied to an area in western Sweden where severe road flooding had occurred during an intense rain event as a case study to demonstrate its utility. The results suggest that for this case study area three categories of PCDs are useful for prediction of critical spots prone to flooding along roads: i) topography, ii) soil type, and iii) land use. The main drivers among the PCDs considered were a topographical wetness index, road density in the catchment, soil properties in the catchment (mainly the amount of gravel substrate) and local channel slope at the site of a road-stream intersection. These can be proposed as strong indicators for predicting the flood probability in ungauged river basins in this region, but some care is needed in generalising the case study results, as other potential factors are also likely to influence the flood hazard probability. Overall, the method proposed represents a straightforward and consistent way to estimate flooding hazards to inform both the planning of future roadways and the maintenance of existing roadways. Copyright © 2013 Elsevier Ltd. All rights reserved.
Insights into earthquake hazard map performance from shaking history simulations
NASA Astrophysics Data System (ADS)
Stein, S.; Vanneste, K.; Camelbeeck, T.; Vleminckx, B.
2017-12-01
Why recent large earthquakes caused shaking stronger than predicted by earthquake hazard maps is under debate. This issue has two parts. Verification involves how well maps implement probabilistic seismic hazard analysis (PSHA) ("have we built the map right?"). Validation asks how well maps forecast shaking ("have we built the right map?"). We explore how well a map can ideally perform by simulating an area's shaking history and comparing "observed" shaking to that predicted by a map generated for the same parameters. The simulations yield shaking distributions whose mean is consistent with the map, but individual shaking histories show large scatter. Infrequent large earthquakes cause shaking much stronger than mapped, as observed. Hence, PSHA seems internally consistent and can be regarded as verified. Validation is harder because an earthquake history can yield shaking higher or lower than that predicted while being consistent with the hazard map. The scatter decreases for longer observation times because the largest earthquakes and resulting shaking are increasingly likely to have occurred. For the same reason, scatter is much less for the more active plate boundary than for a continental interior. For a continental interior, where the mapped hazard is low, even an M4 event produces exceedances at some sites. Larger earthquakes produce exceedances at more sites. Thus many exceedances result from small earthquakes, but infrequent large ones may cause very large exceedances. However, for a plate boundary, an M6 event produces exceedance at only a few sites, and an M7 produces them in a larger, but still relatively small, portion of the study area. As reality gives only one history, and a real map involves assumptions about more complicated source geometries and occurrence rates, which are unlikely to be exactly correct and thus will contribute additional scatter, it is hard to assess whether misfit between actual shaking and a map — notably higher-than-mapped shaking — arises by chance or reflects biases in the map. Due to this problem, there are limits to how well we can expect hazard maps to predict future shaking, as well as to our ability to test the performance of a hazard map based on available observations.
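The core simulation idea is easy to reproduce: draw many synthetic observation windows from the same occurrence model a map encodes and examine the scatter of "observed" exceedances around the map's prediction. A toy sketch assuming a Poisson occurrence model (parameters illustrative):

```python
# Simulate shaking histories consistent with a hazard map's rate model, then
# compare the fraction of histories with at least one exceedance to theory.
import numpy as np

rng = np.random.default_rng(42)
rate = 0.02            # events/yr exceeding the mapped shaking at a site (assumed)
t_obs = 50             # years of observation
n_hist = 10_000        # simulated histories

counts = rng.poisson(rate * t_obs, n_hist)
frac_exceeding = (counts > 0).mean()
print(frac_exceeding, 1 - np.exp(-rate * t_obs))  # empirical vs theoretical probability
```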
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bishop, M.S.
1991-12-01
At the request of the USAF Regional Hospital Elmendorf/SGPB (PACAF), the Armstrong Laboratory, Occupational and Environmental Health Directorate, conducted a hazardous waste characterization survey of unknown drums at Elmendorf AFB from 2 Aug - 13 Aug 91. The scope of the survey was to sample and characterize drums of unknown material stored at Elmendorf AFB, Shemya AFB, and Galena and King Salmon Airports. Several waste streams were sampled at Elmendorf AFB to revalidate sample results from a previous survey.
Improvements on mapping soil liquefaction at a regional scale
NASA Astrophysics Data System (ADS)
Zhu, Jing
Earthquake induced soil liquefaction is an important secondary hazard during earthquakes and can lead to significant damage to infrastructure. Mapping liquefaction hazard is important in both planning for earthquake events and guiding relief efforts by positioning resources once the events have occurred. This dissertation addresses two aspects of liquefaction hazard mapping at a regional scale including 1) predictive liquefaction hazard mapping and 2) post-liquefaction cataloging. First, current predictive hazard liquefaction mapping relies on detailed geologic maps and geotechnical data, which are not always available in at-risk regions. This dissertation improves the predictive liquefaction hazard mapping by the development and validation of geospatial liquefaction models (Chapter 2 and 3) that predict liquefaction extent and are appropriate for global application. The geospatial liquefaction models are developed using logistic regression from a liquefaction database consisting of the data from 27 earthquake events from six countries. The model that performs best over the entire dataset includes peak ground velocity (PGV), VS30, distance to river, distance to coast, and precipitation. The model that performs best over the noncoastal dataset includes PGV, VS30, water table depth, distance to water body, and precipitation. Second, post-earthquake liquefaction cataloging historically relies on field investigation that is often limited by time and expense, and therefore results in limited and incomplete liquefaction inventories. This dissertation improves the post-earthquake cataloging by the development and validation of a remote sensing-based method that can be quickly applied over a broad region after an earthquake and provide a detailed map of liquefaction surface effects (Chapter 4). Our method uses the optical satellite images before and after an earthquake event from the WorldView-2 satellite with 2 m spatial resolution and eight spectral bands. Our method uses the changes of spectral variables that are sensitive to surface moisture and soil characteristics paired with a supervised classification.
Berman, Jesse D; Peters, Thomas M; Koehler, Kirsten A
2018-05-28
To design a method that uses preliminary hazard mapping data to optimize the number and location of sensors within a network for a long-term assessment of occupational concentrations, while preserving temporal variability, accuracy, and precision of predicted hazards. Particle number concentrations (PNCs) and respirable mass concentrations (RMCs) were measured with direct-reading instruments in a large heavy-vehicle manufacturing facility at 80-82 locations during 7 mapping events, stratified by day and season. Using kriged hazard mapping, a statistical approach identified optimal orders for removing locations to capture temporal variability and high prediction precision of PNC and RMC concentrations. We compared optimal-removal, random-removal, and least-optimal-removal orders to bound prediction performance. The temporal variability of PNC was found to be higher than RMC with low correlation between the two particulate metrics (ρ = 0.30). Optimal-removal orders resulted in more accurate PNC kriged estimates (root mean square error [RMSE] = 49.2) at sample locations compared with random-removal order (RMSE = 55.7). For estimates at locations having concentrations in the upper 10th percentile, the optimal-removal order preserved average estimated concentrations better than random- or least-optimal-removal orders (P < 0.01). However, estimated average concentrations using an optimal-removal were not statistically different than random-removal when averaged over the entire facility. No statistical difference was observed for optimal- and random-removal methods for RMCs that were less variable in time and space than PNCs. Optimized removal performed better than random-removal in preserving high temporal variability and accuracy of hazard map for PNC, but not for the more spatially homogeneous RMC. These results can be used to reduce the number of locations used in a network of static sensors for long-term monitoring of hazards in the workplace, without sacrificing prediction performance.
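A kriging-style workflow for scoring a reduced sensor set can be sketched with a Gaussian process regressor standing in for the kriged hazard map. This is our framing of the evaluation loop, not the authors' exact algorithm; locations and concentrations are simulated:

```python
# Fit a spatial model on a candidate reduced network, then score prediction
# error (RMSE) at the held-out sensor locations.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)
xy = rng.uniform(0, 100, size=(80, 2))                 # 80 sensor locations (m)
pnc = np.sin(xy[:, 0] / 15) + rng.normal(0, 0.1, 80)   # mock particle number conc.

keep = rng.choice(80, size=40, replace=False)          # candidate reduced network
gp = GaussianProcessRegressor(kernel=RBF(20.0), normalize_y=True)
gp.fit(xy[keep], pnc[keep])

held_out = np.setdiff1d(np.arange(80), keep)
rmse = np.sqrt(np.mean((gp.predict(xy[held_out]) - pnc[held_out]) ** 2))
print(rmse)   # lower RMSE = the reduced network preserves the hazard map better
```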
Sliwinski-Korell, A; Lutz, F
1998-04-01
In recent years, the standards for the professional handling of hazardous materials, as well as for health and safety in the veterinary practice, have become considerably more stringent. This is expressed in various safety regulations, particularly the decree on hazardous materials and the legislative directives concerning health and safety at work. In part 1, a definition based on the law on hazardous materials is given and the potential risks are mentioned. The correct documentation regarding the purchase, storage, working conditions and disposal of hazardous materials, and the protection of personnel, is explained. General rules for the handling of hazardous materials are described. In part 2, particular emphasis is put on the handling of flammable liquids, disinfectants, cytostatics, pressurized gas, liquid nitrogen and narcotics, the mailing of potentially infectious material, and the safe disposal of hazardous waste. Advice about possibly unrecognized hazards, and references, are also given.
NASA Astrophysics Data System (ADS)
Valchev, Nikolay; Eftimova, Petya; Andreeva, Nataliya; Prodanov, Bogdan
2017-04-01
The coastal zone is among the fastest-evolving areas worldwide. The ever-increasing population inhabiting coastal settlements develops often conflicting economic and societal activities. The existing imbalance between the expansion of these activities, on the one hand, and the potential to accommodate them in a sustainable manner, on the other, becomes a critical problem. Concurrently, coasts are affected by various hydro-meteorological phenomena such as storm surges, heavy seas, strong winds and flash floods, whose intensities and occurrence frequencies are likely to increase due to climate change. This calls for tools capable of quickly predicting the impact of those phenomena on the coast and providing solutions in terms of disaster risk reduction measures. One such tool is the Bayesian network. This paper describes the set-up of such a network for Varna Bay (Bulgaria, western Black Sea). It relates near-shore storm conditions to their onshore flood potential and ultimately to the relevant impact, expressed as relative damage to the coastal and man-made environment. The methodology for the set-up and training of the Bayesian network was developed within the RISC-KIT project (Resilience-Increasing Strategies for Coasts - toolKIT). The proposed BN reflects the interaction between boundary conditions, receptors, hazard, and consequences. Storm boundary conditions (maximum significant wave height and peak surge level) were determined on the basis of their historical and projected occurrence. The only hazard considered in this study is flooding, characterized by maximum inundation depth. The BN was trained with synthetic events created by combining the estimated boundary conditions. Flood impact was modelled with the process-based morphodynamic model XBeach. Restaurants, sport and leisure facilities, administrative buildings, and car parks were introduced into the network as receptors. Consequences (impact) are estimated in terms of the relative damage caused by a given inundation depth. National depth-damage (susceptibility) curves were used to define the percentage of damage, ranked as low, moderate, high and very high. Besides the previously described components, the BN also includes two hazard-influencing disaster risk reduction (DRR) measures: a re-enforced embankment of the Varna Port wall and beach nourishment. As a result of the training process, the network is able to evaluate spatially varying hazards and damages for specific storm conditions. Moreover, it is able to predict where on the site the highest impact would occur and to quantify the mitigation capacity of the proposed DRR measures. For example, it is estimated that storm impact would be considerably reduced under present conditions, but that vulnerability would still be high in a climate change perspective.
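The chain in this network, boundary conditions to inundation depth to relative damage, can be miniaturized as discrete conditional probability tables. The numbers below are invented; a real application would train these tables on the XBeach event set:

```python
# Toy discrete Bayesian chain: storm class -> inundation depth class -> damage class.
import numpy as np

p_storm = np.array([0.7, 0.2, 0.1])                  # calm / moderate / severe
p_depth_given_storm = np.array([[0.9, 0.1, 0.0],     # rows: storm class
                                [0.4, 0.5, 0.1],     # cols: none / shallow / deep
                                [0.1, 0.4, 0.5]])
p_damage_given_depth = np.array([[0.95, 0.05, 0.0],  # rows: depth class
                                 [0.30, 0.60, 0.1],  # cols: low / moderate / high
                                 [0.05, 0.35, 0.6]])

p_depth = p_storm @ p_depth_given_storm              # marginal depth distribution
p_damage = p_depth @ p_damage_given_depth            # marginal damage distribution
print(p_depth, p_damage)
```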
Prentice, J C; Pizer, S D; Conlin, P R
2016-12-01
To characterize the relationship between HbA1c variability and adverse health outcomes among US military veterans with Type 2 diabetes. This retrospective cohort study used Veterans Affairs and Medicare claims for veterans with Type 2 diabetes taking metformin who initiated a second diabetes medication (n = 50,861). The main exposure of interest was HbA1c variability during a 3-year baseline period. HbA1c variability, categorized into quartiles, was defined as the standard deviation, the coefficient of variation, and an adjusted standard deviation that accounted for the number of HbA1c tests and the mean number of days between them. Cox proportional hazards models predicted mortality, hospitalization for ambulatory care-sensitive conditions, and myocardial infarction or stroke, and were controlled for mean HbA1c levels and the direction of change in HbA1c levels during the baseline period. Over a mean 3.3 years of follow-up, all HbA1c variability measures significantly predicted each outcome. Using the adjusted standard deviation measure of HbA1c variability, the hazard ratios for the third and fourth quartiles predicting mortality were 1.14 (95% CI 1.04, 1.25) and 1.42 (95% CI 1.28, 1.58); for myocardial infarction and stroke they were 1.25 (95% CI 1.10, 1.41) and 1.23 (95% CI 1.07, 1.42); and for ambulatory care-sensitive condition hospitalization they were 1.10 (95% CI 1.03, 1.18) and 1.11 (95% CI 1.03, 1.20). Higher baseline HbA1c levels independently predicted the likelihood of each outcome. In veterans with Type 2 diabetes, greater HbA1c variability was associated with an increased risk of adverse long-term outcomes, independently of HbA1c levels and direction of change. Limiting HbA1c fluctuations over time may reduce complications. © 2016 Diabetes UK.
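The exposure metrics are straightforward to compute from serial laboratory values. In the sketch below, the SD and CV follow their standard definitions, while the interval-weighted "adjusted" SD is our own stand-in for the paper's adjustment for test count and spacing, not the study's formula:

```python
# Variability metrics for one patient's serial HbA1c values (data invented).
import numpy as np

hba1c = np.array([7.1, 7.9, 6.8, 8.4, 7.5])   # % units, successive tests
days = np.array([0, 120, 260, 390, 540])      # day of each test

sd = hba1c.std(ddof=1)                        # standard deviation
cv = sd / hba1c.mean()                        # coefficient of variation

# Interval-weighted SD of interval midpoints (an assumption, for illustration only).
w = np.diff(days) / np.diff(days).sum()
mid = (hba1c[1:] + hba1c[:-1]) / 2
adj_sd = np.sqrt(np.sum(w * (mid - np.sum(w * mid)) ** 2))
print(sd, cv, adj_sd)
```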
Predictive teratology: teratogenic risk-hazard identification partnered in the discovery process.
Augustine-Rauch, K A
2008-11-01
Unexpected teratogenicity is ranked as one of the most prevalent causes of toxicity-related attrition of drug candidates. Without proactive assessment, the liability tends to be identified relatively late in drug development, following significant investment in the compound and engagement in preclinical and clinical studies. When unexpected teratogenicity occurs in preclinical development, three principal questions arise: Can clinical trials that include women of childbearing potential be initiated? Will all compounds in this pharmacological class produce the same liability? Could this effect be related to the chemical structure, resulting in undesirable off-target adverse effects? The first question is typically addressed at the time of the unexpected finding and involves considering the nature of the teratogenicity, whether or not maternal toxicity could have had a role in its onset, human exposure margins, and the therapeutic indication. The latter two questions can be addressed proactively, earlier in the discovery process, as drug target profiling and lead compound optimization take place. Such proactive approaches include thorough assessment of the literature to identify potential liabilities, together with follow-up work on the level of target expression and functional characterization using molecular biology and developmental model systems. Developmental model systems can also be applied in the form of in vitro teratogenicity screens, and show potential for effective hazard identification and for issue resolution at the level of characterizing teratogenic mechanism. This review discusses approaches that can be applied for proactive assessment of compounds for teratogenic liability.
This document explains how to generate data which characterizes the performance of hazardous waste treatment systems in terms of the composition of treated hazardous waste streams plus treatment system operation and design.
IMMUNE SYSTEM ONTOGENY AND DEVELOPMENTAL IMMUNOTOXICOLOGY
Animal testing for the identification and characterization of hazard(s), associated with exposure to toxic chemicals, is an accepted approach for identifying the potential risk to humans. The rodent, in particular the rat, has been the most commonly used species for routine toxi...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cochran, J.R.; McDonald, J.R.; Russell, R.J.
1995-10-01
This report documents US Department of Energy (DOE)-funded activities that have adapted the US Navy's Surface Towed Ordnance Locator System (STOLS) to meet DOE needs for a "... better, faster, safer and cheaper ..." system for characterizing inactive hazardous waste sites. These activities were undertaken by Sandia National Laboratories (Sandia), the Naval Research Laboratory, Geo-Centers Inc., New Mexico State University and others under the title of the Magnetometer Towed Array (MTA).
Wang, Ying; Na, Guangshui; Zong, Humin; Ma, Xindong; Yang, Xianhai; Mu, Jingli; Wang, Lijun; Lin, Zhongsheng; Zhang, Zhifeng; Wang, Juying; Zhao, Jinsong
2018-02-01
Adverse outcome pathways (AOPs) are a novel concept that effectively considers the toxic modes of action and guides the ecological risk assessment of chemicals. To better use toxicity data including biochemical or molecular responses and mechanistic data, we further developed a species sensitivity-weighted distribution (SSWD) method for bisphenol A and 4-nonylphenol. Their aquatic predicted-no-effect concentrations (PNECs) were derived using the log-normal statistical extrapolation method. We calculated aquatic PNECs of bisphenol A and 4-nonylphenol with values of 4.01 and 0.721 µg/L, respectively. The ecological risk of each chemical in different aquatic environments near Tianjin, China, a coastal municipality along the Bohai Sea, was characterized by hazard quotient and probabilistic risk quotient assessment techniques. Hazard quotients of 7.02 and 5.99 at 2 municipal sewage sites using all of the endpoints were observed for 4-nonylphenol, which indicated high ecological risks posed by 4-nonylphenol to aquatic organisms, especially endocrine-disrupting effects. Moreover, a high ecological risk of 4-nonylphenol was indicated based on the probabilistic risk quotient method. The present results show that combining the SSWD method and the AOP concept could better protect aquatic organisms from adverse effects such as endocrine disruption and could decrease uncertainty in ecological risk assessment. Environ Toxicol Chem 2018;37:551-562. © 2017 SETAC.
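The log-normal statistical extrapolation step that produces a PNEC, and the hazard quotient built from it, can be sketched as follows. The species toxicity values, assessment factor, and measured concentration are invented for illustration:

```python
# Fit a log-normal species sensitivity distribution, take its 5th percentile
# (HC5) as the basis for a PNEC, then form a hazard quotient HQ = MEC / PNEC.
import numpy as np
from scipy import stats

tox = np.array([12.0, 45.0, 3.2, 80.0, 9.5, 22.0])   # ug/L endpoints across species
mu, sigma = np.log(tox).mean(), np.log(tox).std(ddof=1)
hc5 = np.exp(stats.norm.ppf(0.05, mu, sigma))        # 5th percentile of the SSD

pnec = hc5 / 1.0                                     # assessment factor of 1 (assumption)
mec = 5.0                                            # measured environmental conc., ug/L
print(hc5, mec / pnec)                               # HQ > 1 flags potential risk
```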
Rochman, Chelsea M; Lewison, Rebecca L; Eriksen, Marcus; Allen, Harry; Cook, Anna-Marie; Teh, Swee J
2014-04-01
The accumulation of plastic debris in pelagic habitats of the subtropical gyres is a global phenomenon of growing concern, particularly with regard to wildlife. When animals ingest plastic debris that is associated with chemical contaminants, they are at risk of bioaccumulating hazardous pollutants. We examined the relationship between the bioaccumulation of hazardous chemicals in myctophid fish associated with plastic debris and plastic contamination in remote and previously unmonitored pelagic habitats in the South Atlantic Ocean. Using a published model, we defined three sampling zones where accumulated densities of plastic debris were predicted to differ. Contrary to model predictions, we found variable levels of plastic debris density across all stations within the sampling zones. Mesopelagic lanternfishes, sampled from each station and analyzed for bisphenol A (BPA), alkylphenols, alkylphenol ethoxylates, polychlorinated biphenyls (PCBs) and polybrominated diphenyl ethers (PBDEs), exhibited variability in contaminant levels, but this variability was not related to plastic debris density for most of the targeted compounds, with the exception of PBDEs. We found that myctophids sampled at stations with greater plastic densities did have significantly larger concentrations of BDEs #183-209 in their tissues, suggesting that higher-brominated congeners of PBDEs, added to plastics as flame retardants, are indicative of plastic contamination in the marine environment. Our results provide data on a previously unsampled pelagic gyre and highlight the challenges associated with characterizing plastic debris accumulation and associated risks to wildlife. Copyright © 2014 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
Brockman, Philip; Barker, Ben C., Jr.; Koch, Grady J.; Nguyen, Dung Phu Chi; Britt, Charles L., Jr.; Petros, Mulugeta
1999-01-01
NASA Langley Research Center (LaRC) has field tested a 2.0 μm, 100 Hertz, pulsed coherent lidar to detect and characterize wake vortices and to measure atmospheric winds and turbulence. The quantification of aircraft wake-vortex hazards is being addressed by the Wake Vortex Lidar (WVL) Project as part of Aircraft Vortex Spacing System (AVOSS), which is under the Reduced Spacing Operations Element of the Terminal Area Productivity (TAP) Program. These hazards currently set the minimum, fixed separation distance between two aircraft and affect the number of takeoff and landing operations on a single runway under Instrument Meteorological Conditions (IMC). The AVOSS concept seeks to safely reduce aircraft separation distances, when weather conditions permit, to increase the operational capacity of major airports. The current NASA wake-vortex research efforts focus on developing and validating wake vortex encounter models, wake decay and advection models, and wake sensing technologies. These technologies will be incorporated into an automated AVOSS that can properly select safe separation distances for different weather conditions, based on the aircraft pair and predicted/measured vortex behavior. The sensor subsystem efforts focus on developing and validating wake sensing technologies. The lidar system has been field-tested to provide real-time wake vortex trajectory and strength data to AVOSS for wake prediction verification. Wake vortices, atmospheric winds, and turbulence products have been generated from processing the lidar data collected during deployments to Norfolk (ORF), John F. Kennedy (JFK), and Dallas/Fort Worth (DFW) International Airports.
Ground motion models used in the 2014 U.S. National Seismic Hazard Maps
Rezaeian, Sanaz; Petersen, Mark D.; Moschetti, Morgan P.
2015-01-01
The National Seismic Hazard Maps (NSHMs) are an important component of seismic design regulations in the United States. This paper compares hazard using the new suite of ground motion models (GMMs) relative to hazard using the suite of GMMs applied in the previous version of the maps. The new source characterization models are used for both cases. A previous paper (Rezaeian et al. 2014) discussed the five NGA-West2 GMMs used for shallow crustal earthquakes in the Western United States (WUS), which are also summarized here. Our focus in this paper is on GMMs for earthquakes in stable continental regions in the Central and Eastern United States (CEUS), as well as subduction interface and deep intraslab earthquakes. We consider building code hazard levels for peak ground acceleration (PGA), 0.2-s, and 1.0-s spectral accelerations (SAs) on uniform firm-rock site conditions. The GMM modifications in the updated version of the maps created changes in hazard within 5% to 20% in WUS; decreases within 5% to 20% in CEUS; changes within 5% to 15% for subduction interface earthquakes; and changes involving decreases of up to 50% and increases of up to 30% for deep intraslab earthquakes for most U.S. sites. These modifications were combined with changes resulting from modifications in the source characterization models to obtain the new hazard maps.
Landslide Hazard Mapping in Rwanda Using Logistic Regression
NASA Astrophysics Data System (ADS)
Piller, A.; Anderson, E.; Ballard, H.
2015-12-01
Landslides in the United States cause more than $1 billion in damages and 50 deaths per year (USGS 2014). Globally, figures are much more grave, yet monitoring, mapping and forecasting of these hazards are less than adequate. Seventy-five percent of the population of Rwanda earns a living from farming, mostly subsistence. Loss of farmland, housing, or life, to landslides is a very real hazard. Landslides in Rwanda have an impact at the economic, social, and environmental level. In a developing nation that faces challenges in tracking, cataloging, and predicting the numerous landslides that occur each year, satellite imagery and spatial analysis allow for remote study. We have focused on the development of a landslide inventory and a statistical methodology for assessing landslide hazards. Using logistic regression on approximately 30 test variables (i.e. slope, soil type, land cover, etc.) and a sample of over 200 landslides, we determine which variables are statistically most relevant to landslide occurrence in Rwanda. A preliminary predictive hazard map for Rwanda has been produced, using the variables selected from the logistic regression analysis.
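The variable-screening step, logistic regression over candidate predictors with the statistically relevant ones retained, can be sketched with statsmodels. The predictors and coefficients below are invented; only the screening pattern follows the abstract:

```python
# Logistic regression screening of candidate landslide predictors.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 400
X = rng.normal(size=(n, 3))                 # e.g. slope, wetness index, road density
logit_p = -1.0 + 1.5 * X[:, 0] + 0.8 * X[:, 1]   # only two predictors truly matter
y = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

model = sm.Logit(y, sm.add_constant(X)).fit(disp=0)
print(model.summary())   # p-values identify the statistically relevant variables
```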
In the heat of the moment: Effectively engaging scientists and diverging science in hazard events.
NASA Astrophysics Data System (ADS)
Brosnan, D. M.
2015-12-01
Scientists are increasingly called upon to use their expertise to help minimize disasters stemming from natural and human-induced hazards ranging from volcanoes, earthquakes and tsunamis to oil spills. Decision-makers want scientists who collect and analyze data to be able to predict the likelihood and severity of a hazard occurrence. When there is an event, they look to scientists to find ways to ameliorate the consequences. Science cannot predict with the accuracy sought, and scientists themselves are rarely aware of the cascading consequences that they are being asked to minimize. Importantly too, scientists differ in their interpretations of data and uncertainties. While these differences are the spark of science, they are often the bane of disaster decisions. This presentation addresses the application of science in the midst of hazard crises. Using examples from several global disasters, it explores how different techniques for dealing with scientific uncertainties and diverging conclusions among scientists have been more or less successful. The presentation also addresses the methods and opportunities that exist for effectively applying science during hazard events.
Framework for computationally-predicted AOPs
Given that there are a vast number of existing and new chemicals in the commercial pipeline, emphasis is placed on developing high throughput screening (HTS) methods for hazard prediction. Adverse Outcome Pathways (AOPs) represent a...
Assessing the accuracy of software predictions of mammalian and microbial metabolites
New chemical development and hazard assessments benefit from accurate predictions of mammalian and microbial metabolites. Fourteen biotransformation libraries encoded in eight software packages that predict metabolite structures were assessed for their sensitivity (proportion of ...
Uncertainties in predicting debris flow hazards following wildfire [Chapter 19
Kevin D. Hyde; Karin Riley; Cathelijne Stoof
2017-01-01
Wildfire increases the probability of debris flows, posing hazardous conditions where values-at-risk exist downstream of burned areas. Conditions and processes leading to postfire debris flows usually follow a general sequence defined here as the postfire debris flow hazard cascade: biophysical setting, fire processes, fire effects, rainfall, debris flow, and values-at-...
Multivariate models for prediction of human skin sensitization hazard.
Strickland, Judy; Zang, Qingda; Paris, Michael; Lehmann, David M; Allen, David; Choksi, Neepa; Matheson, Joanna; Jacobs, Abigail; Casey, Warren; Kleinstreuer, Nicole
2017-03-01
One of the Interagency Coordinating Committee on the Validation of Alternative Methods' (ICCVAM) top priorities is the development and evaluation of non-animal approaches to identify potential skin sensitizers. The complexity of biological events necessary to produce skin sensitization suggests that no single alternative method will replace the currently accepted animal tests. ICCVAM is evaluating an integrated approach to testing and assessment based on the adverse outcome pathway for skin sensitization that uses machine learning approaches to predict human skin sensitization hazard. We combined data from three in chemico or in vitro assays - the direct peptide reactivity assay (DPRA), human cell line activation test (h-CLAT) and KeratinoSens™ assay - six physicochemical properties and an in silico read-across prediction of skin sensitization hazard into 12 variable groups. The variable groups were evaluated using two machine learning approaches, logistic regression and support vector machine, to predict human skin sensitization hazard. Models were trained on 72 substances and tested on an external set of 24 substances. The six models (three logistic regression and three support vector machine) with the highest accuracy (92%) used: (1) DPRA, h-CLAT and read-across; (2) DPRA, h-CLAT, read-across and KeratinoSens; or (3) DPRA, h-CLAT, read-across, KeratinoSens and log P. The models performed better at predicting human skin sensitization hazard than the murine local lymph node assay (accuracy 88%), any of the alternative methods alone (accuracy 63-79%) or test batteries combining data from the individual methods (accuracy 75%). These results suggest that computational methods are promising tools to effectively identify potential human skin sensitizers without animal testing. Published 2016. This article has been contributed to by US Government employees and their work is in the public domain in the USA.
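The following sketch illustrates the two machine-learning approaches named in the abstract on a hypothetical feature table; the 72/24 split follows the abstract, but the features, labels, and scikit-learn settings are assumptions for illustration.

```python
# Train logistic regression and an RBF support vector machine on assay-style
# features and score them on an external test set. Data are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(1)
X = rng.normal(size=(96, 4))            # 96 substances x 4 assay features
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=96)) > 0  # sensitizer?

# 72 training / 24 external test substances, as in the abstract.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, train_size=72, random_state=1)

for name, clf in [("logistic", LogisticRegression(max_iter=1000)),
                  ("svm", make_pipeline(StandardScaler(), SVC(kernel="rbf")))]:
    clf.fit(X_tr, y_tr)
    print(name, "test accuracy:", clf.score(X_te, y_te))
```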
NASA Astrophysics Data System (ADS)
Caccavale, Mauro; Matano, Fabio; Sacchi, Marco; Mazzola, Salvatore; Somma, Renato; Troise, Claudia; De Natale, Giuseppe
2014-05-01
The island of Ischia is a large, complex, partly submerged, active volcanic field located about 20 km west of Campi Flegrei, a major active volcano-tectonic area near Naples. The island is morphologically characterized in its central part by the resurgent block of Mt. Epomeo, controlled by NW-SE and NE-SW trending fault systems, by mountain stream basins with high relief energy, and by a heterogeneous coastline alternating beaches with tuff/lava cliffs that are continuously reshaped by weathering and sea erosion. Volcano-tectonic processes are a main factor for slope stability, as they produce seismic activity and have generated steep slopes in volcanic deposits (lava, tuff, pumice, and ash layers) of variable strength. In the Campi Flegrei and surrounding areas, the possible occurrence of a moderate-to-large seismic event represents a serious threat to the inhabitants, to infrastructure, and to the environment. The most relevant seismic sources for Ischia are the Campi Flegrei caldera and a 5-km-long fault located below the island's north coast. Both sources are difficult to constrain: the first because its onshore and offshore extent is not yet completely defined, and the second, characterized by only a few large historical events, because it is difficult to parameterize in the framework of a probabilistic hazard approach. The high population density, the many infrastructures, and the highly relevant archaeological sites, together with the area's natural and artistic value, make this a strategic natural laboratory for developing new methodologies. Moreover, Ischia is the only sector of the Campi Flegrei area with documented historical landslides triggered by earthquakes, allowing the adequacy and stability of the method to be tested. In the framework of the Italian project MON.I.C.A (infrastructural coastlines monitoring), an innovative and dedicated probabilistic methodology has been applied to identify the areas with higher susceptibility to landslide occurrence due to seismic effects. Probabilistic seismic landslide hazard analysis (PSLHA) combines probability-of-exceedance maps for different ground motion (GM) parameters with geological and geomorphological information, in terms of critical acceleration and a dynamic stability factor. The maps are generally evaluated for peak ground acceleration, velocity, or intensity, which correlate well with damage to anthropic infrastructure (e.g., streets, buildings). Each ground motion parameter represents a different aspect of the hazard and has a different correlation with the generation of possible damage. Many studies have pointed out that other GM parameters, such as Arias and Housner intensities and absolute displacement, could be better choices for analyzing, for example, cliff stability. The selection of the GM parameter is of crucial importance for obtaining the most useful hazard maps, and in recent decades ground motion prediction equations for a new set of GM parameters have been published. Based on this information, a series of landslide hazard maps can be produced, leading to the identification of the areas with the highest probability of earthquake-induced landslides. In a strategic site like Ischia, this new methodology represents an innovative and advanced tool for landslide hazard mitigation.
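One common way such methods link ground motion exceedance to slope stability is through a Newmark-style critical acceleration; whether this specific relation matches the paper's dynamic stability factor is an assumption, and all values below are illustrative.

```python
# Newmark's infinite-slope relation: the pseudo-static acceleration needed to
# bring a slope to failure is a_c = (FS - 1) * g * sin(alpha). Illustrative only.
import math

g = 9.81  # m/s^2

def critical_acceleration(factor_of_safety: float, slope_deg: float) -> float:
    """Acceleration (m/s^2) needed to bring the slope to incipient failure."""
    return (factor_of_safety - 1.0) * g * math.sin(math.radians(slope_deg))

# A marginally stable tuff slope: FS = 1.3 at 35 degrees.
a_c = critical_acceleration(1.3, 35.0)
print(f"critical acceleration: {a_c:.2f} m/s^2 ({a_c / g:.2f} g)")
# Cells where the mapped PGA at a given exceedance probability is above a_c
# would be flagged as having higher seismically induced landslide susceptibility.
```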
NASA Astrophysics Data System (ADS)
Moya, J. L.; Skocypec, R. D.; Thomas, R. K.
1993-09-01
Over the past 40 years, Sandia National Laboratories (SNL) has been actively engaged in research to improve the ability to accurately predict the response of engineered systems to abnormal thermal and structural environments. These engineered systems contain very hazardous materials, so assessing the degree of safety/risk they afford the public and the environment is of utmost importance. The ability to accurately predict the response of these systems to accidents (to abnormal environments) is required to assess the degree of safety. Before the effect of the abnormal environment on these systems can be determined, it is necessary to ascertain the nature of the environment, which in turn requires the ability to physically characterize and numerically simulate the abnormal environment. Historically, SNL has demonstrated the level of safety provided by these engineered systems by either of two approaches: a purely regulatory approach, or a probabilistic risk assessment (PRA). This paper addresses the latter of the two approaches.
Chemical and ecotoxicological properties of three bio-oils from pyrolysis of biomasses.
Campisi, Tiziana; Samorì, Chiara; Torri, Cristian; Barbera, Giuseppe; Foschini, Anna; Kiwan, Alisar; Galletti, Paola; Tagliavini, Emilio; Pasteris, Andrea
2016-10-01
In view of the potential use of pyrolysis-based technologies, it is crucial to understand the environmental hazards of pyrolysis-derived products, in particular bio-oils. Here, three bio-oils were produced from fast pyrolysis of pine wood and intermediate pyrolysis of corn stalk and poultry litter. They were fully characterized by chemical analysis and tested for their biodegradability and their ecotoxicity to the crustacean Daphnia magna and the green alga Raphidocelis subcapitata, tests required by the European REACH regulation. The three bio-oils were biodegradable, with 40-60% biodegradation after 28 days, and had EC50 values above 100 mg/L for the crustacean and above 10 mg/L for the alga, showing low toxicity to aquatic life. The toxic unit approach was applied to verify whether the observed toxicity could be predicted from the data available for the substances detected in the bio-oils; the predicted values largely underestimated the experimental ones. Copyright © 2016 Elsevier Inc. All rights reserved.
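The toxic unit calculation mentioned above can be sketched as follows; compound names, fractions, and EC50 values are invented for illustration.

```python
# Toxic unit (TU) approach under concentration addition: TU_i = c_i / EC50_i,
# summed over the identified constituents of the mixture. Values are made up.
fractions = {          # mass fraction of each identified compound in the bio-oil
    "phenol": 0.020,
    "acetic acid": 0.080,
    "furfural": 0.015,
}
ec50 = {               # single-substance EC50 for the test organism, mg/L
    "phenol": 8.0,
    "acetic acid": 65.0,
    "furfural": 30.0,
}

bio_oil_conc = 100.0   # mg/L of whole bio-oil in the test medium
tu_sum = sum(bio_oil_conc * f / ec50[name] for name, f in fractions.items())
print(f"sum of toxic units at {bio_oil_conc} mg/L: {tu_sum:.2f}")
# A TU sum < 1 predicts no acute effect at this concentration; if the organism
# nevertheless responds, the identified constituents underpredict the toxicity,
# as the abstract reports.
print("predicted mixture EC50:", bio_oil_conc / tu_sum, "mg/L")
```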
Using the Triad Approach to Improve the Cost-effectiveness of Hazardous Waste Site Cleanups
U.S. EPA's Office of Solid Waste and Emergency Response is promoting more effective strategies for characterizing, monitoring, and cleaning up hazardous waste sites. In particular, a paradigm based on using an integrated triad of systematic planning...
Developmental Neurotoxicity Testing: A Path Forward
Great progress has been made over the past 40 years in understanding the hazards of exposure to a small number of developmental neurotoxicants. Lead, PCBs, and methylmercury are all good examples of science-based approaches to characterizing the hazard to the developing nervous s...
Vandermoere, Frédéric
2008-04-01
This case study examines hazard and risk perception and the need for decontamination as seen by people exposed to soil pollution. Using an ecological-symbolic approach (ESA), a multidisciplinary model is developed that draws upon psychological and sociological perspectives on risk perception and includes ecological variables by using data from experts' risk assessments. The results show that hazard perception is best predicted by objective knowledge, subjective knowledge, estimated knowledge of experts, and the assessed risks. However, experts' risk assessments induce an increase in hazard perception only when residents know the urgency of decontamination. Risk perception is best predicted by trust in the risk management. Additionally, the need for decontamination relates to hazard perception, risk perception, estimated knowledge of experts, and thoughts about sustainability. In contrast to the knowledge deficit model, objective and subjective knowledge did not significantly relate to risk perception or the need for decontamination. The results suggest that residents can distinguish between hazards, in terms of the seriousness of contamination, and human health risks. Moreover, beyond the importance of social determinants of environmental risk perception, this study shows that the output of experts' risk assessments (the objective risks) can create hazard awareness rather than an alarming risk consciousness, despite residents' distrust of scientific knowledge.
Driver Vigilance in Automated Vehicles: Hazard Detection Failures Are a Matter of Time.
Greenlee, Eric T; DeLucia, Patricia R; Newton, David C
2018-06-01
The primary aim of the current study was to determine whether monitoring the roadway for hazards during automated driving results in a vigilance decrement. Although automated vehicles are relatively novel, the nature of human-automation interaction within them has the classic hallmarks of a vigilance task: drivers must maintain attention for prolonged periods of time to detect and respond to rare and unpredictable events, for example, roadway hazards that automation may be ill equipped to detect. Given the similarity with traditional vigilance tasks, we predicted that drivers of a simulated automated vehicle would demonstrate a vigilance decrement in hazard detection performance. Participants "drove" a simulated automated vehicle for 40 minutes, during which their task was to monitor the roadway for hazards. As predicted, hazard detection rate declined precipitously, and reaction times slowed as the drive progressed. Further, subjective ratings of workload and task-related stress indicated that sustained monitoring is demanding and distressing and that it is a challenge to maintain task engagement. Monitoring the roadway for potential hazards during automated driving thus results in workload, stress, and performance decrements similar to those observed in traditional vigilance tasks. To the degree that vigilance is required of automated vehicle drivers, performance errors and associated safety risks are likely to occur as a function of time on task. Vigilance should be a focal safety concern in the development of vehicle automation.
Framework Analysis for Determining Mode of Action & Human Relevance
The overall aim of a cancer risk assessment is to characterize the risk to humans from environmental exposures. This risk characterization includes a qualitative and quantitative risk characterization that relies on the development of separate hazard, dose- response and exposure...
SITE CHARACTERIZATION LIBRARY: VOLUMN 1 (RELEASE 2.5)
This CD-ROM, Volume 1, Release 2.5, of EPA's National Exposure Research Laboratory (NERL - Las Vegas) Site Characterization Library, contains additional electronic documents and computer programs related to the characterization of hazardous waste sites. EPA has produced this libr...
Mapping debris-flow hazard in Honolulu using a DEM
Ellen, Stephen D.; Mark, Robert K.; ,
1993-01-01
A method for mapping hazard posed by debris flows has been developed and applied to an area near Honolulu, Hawaii. The method uses studies of past debris flows to characterize sites of initiation, volume at initiation, and volume-change behavior during flow. Digital simulations of debris flows based on these characteristics are then routed through a digital elevation model (DEM) to estimate degree of hazard over the area.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hedgecock, N.S.
1990-01-01
At the request of 67 Combat Support Group/DEEV, the Air Force Occupational and Environmental Health Laboratory conducted a waste-water characterization and hazardous-waste technical assistance survey at Bergstrom AFB (BAFB) from 6-15 Mar 89. The scope of the waste-water survey was to characterize the effluent exiting the base and the effluent from 23 industrial facilities and 10 food-serving facilities. The scope of the hazardous-waste survey was to address hazardous-waste-management practices and explore opportunities for hazardous waste minimization. Specific recommendations from the survey include: (1) Accompany City of Austin personnel during waste-water sampling procedures; (2) Sample at the manhole exiting the main lift station rather than at the lift station wet well; (3) Split waste-water samples with the City of Austin for comparison of results; (4) Ensure that oil/water separators and grease traps are functioning properly and are cleaned out regularly; (5) Limit the quantity of soaps and solvents discharged down the drain to the sanitary sewer; (6) Establish a waste disposal contract for the removal of wastes in the Petroleum Oils and Lubricants underground storage tanks; (7) Remove, analyze, and properly dispose of oil-contaminated soil from accumulation sites; (8) Move indoors or secure, cover, and berm the aluminum sign reconditioning tank at 67 Civil Engineering Squadron Protective Coating; (9) Connect 67 Combat Repair Squadron Test Cell floor drains to the sanitary sewer.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scott, S.P.; Hedgecock, N.S.
1989-10-01
Personnel from the AFOEHL conducted a waste-water characterization and hazardous-waste technical assistance survey at MAFB from 28 Nov to 9 Dec 1988. The scope of this survey was to characterize the waste-water, address hazardous-waste-management practices, and explore opportunities for hazardous waste minimization. The waste-water survey team analyzed the base's industrial effluent, effluent from oil/water separators, and storm water. The team performed a shop-by-shop evaluation of chemical-waste-management practices. Survey results showed that MAFB needs to improve its hazardous-waste-management program. Recommendations for improvement include: (1) Collect two additional grab samples on separate days from the hospital discharge and analyze them by EPA Method 601 to determine whether the grab sample from the survey gives a true indication of what is being discharged; (2) Locate the source of mercury from the hospital and prevent it from discharging into the sanitary sewer; (3) Dilute the soaps used for cleaning at the Fuels Lab, Building 7060; (4) Investigate the source of chromium from the Photo Lab; (5) Clean out the sewer system manhole directly downgradient from the Photo Lab; (6) Locate the source of contamination in the West Ditch Outfall; (7) Reconnect the two oil/water separators that discharge into the storm sewerage system; (8) Investigate the source of methylene chloride coming onto the base; (9) Investigate the source of mercury at Fuel Cell Repair, Building 7005.
Screening tests for hazard classification of complex waste materials - Selection of methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weltens, R., E-mail: reinhilde.weltens@vito.be; Vanermen, G.; Tirez, K.
In this study we describe the development of an alternative methodology for hazard characterization of waste materials. Such an alternative methodology for hazard assessment of complex waste materials is urgently needed, because the lack of a validated instrument leads to arbitrary hazard classification of such materials. False classification can lead to human and environmental health risks and also has important financial consequences for the waste owner. The Hazardous Waste Directive (HWD) describes the methodology for hazard classification of waste materials. For mirror entries, the HWD classification is based upon the hazardous properties (H1-15) of the waste, which can be assessed from the hazardous properties of individual identified waste compounds or - if not all compounds are identified - from the results of hazard assessment tests performed on the waste material itself. For the latter, the HWD recommends toxicity tests that were initially designed for risk assessment of chemicals in consumer products (pharmaceuticals, cosmetics, biocides, food, etc.). These tests (often using mammals) are neither designed nor suitable for the hazard characterization of waste materials. With the present study we want to contribute to the development of an alternative and transparent test strategy for hazard assessment of complex wastes that is in line with the HWD principles for waste classification. It is necessary to address this important shortcoming in hazardous waste classification and to demonstrate that alternative methods are available for hazard assessment of waste materials. Next, by describing the pros and cons of the available methods, and by identifying the needs for additional or further development of test methods, we hope to stimulate research efforts and development in this direction. In this paper we describe promising techniques and the rationale for the test selection in the pilot study that we performed on different types of waste materials; test results are presented in a second paper. As the application of many of the proposed test methods is new in the field of waste management, the principles of the tests are described. The selected tests tackle important hazardous properties, but refinement of the test battery is needed to fulfil the a priori conditions.
NASA Astrophysics Data System (ADS)
Ji, Zhonghui; Li, Ning; Wu, Xianhua
2017-08-01
Starting from impact factors of precipitation anomaly identified in previous research, eight atmospheric circulation indicators in pre-winter and spring, selected by correlation analysis, were used as independent variables, with the hazard levels of a drought/flood sudden alternation index (DFSAI) as dependent variables, to construct nonlinear, nonparametric classification and regression trees (CART) for threshold determination and hazard evaluation on bimonthly and monthly scales in the Huaihe River basin. Results show that the spring Arctic Oscillation index (AOI_S), the spring Asian polar vortex area index (APVAI_S), and the spring Asian meridional circulation index (AMCI_S) were the three main impact factors, and proved suitable for assessing hazard levels of drought/flood sudden alternation (DFSA) disasters on the bimonthly scale. On the monthly scale, AOI_S, the pre-winter northern hemisphere polar vortex intensity index (NHPVII_PW), and AMCI_S are the three primary variables for hazard level prediction of DFSA in May and June; NHPVII_PW, AMCI_PW, and AMCI_S for June and July; and NHPVII_PW and EASMI for July and August. The type of disaster (flood to drought, drought to flood, or no DFSA) and the hazard level under different conditions can also be obtained from each model. Hazard level and type are expressed as integers from -3 to 3, ranging from a high-level flood-to-drought disaster (level -3) to a high-level disaster of the reverse type (level 3); 0 represents no DFSA, and the levels on either side decrease progressively toward this neutral value. When AOI_S is less than -0.355, a quick turn from drought to flood is more apt to happen (level 1) on the bimonthly scale; when AOI_S is less than -1.32, the same type of disaster may occur (level 2) in May and June on the monthly scale. When NHPVII_PW is less than 341.5, a quick turn from flood to drought will occur (level -1) in June and July on the monthly scale. By analogy, the other hazard types and levels can all be read from the optimal models. Data from 2011 to 2015 were used to verify the final models by comparing predicted and actual levels, and models M1 (bimonthly scale), M2, and M3 (monthly scale) proved acceptable, with a prediction accuracy of 73% (11/15). The proposed CART method is a new attempt at short-term climate prediction.
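A minimal sketch of the CART step with scikit-learn; the index names follow the abstract, but the data and the embedded toy rule are synthetic.

```python
# Grow a small classification tree from circulation indices to DFSA hazard
# level (-3..3) and print the learned thresholds. Data are synthetic.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(2)
n = 55                                          # ~55 years of records
X = np.column_stack([
    rng.normal(0, 1, n),                        # AOI_S
    rng.normal(340, 15, n),                     # NHPVII_PW
    rng.normal(0, 1, n),                        # AMCI_S
])
# Toy rule in the spirit of the reported thresholds (e.g., AOI_S < -0.355 -> level 1).
y = np.where(X[:, 0] < -0.355, 1, np.where(X[:, 1] < 341.5, -1, 0))

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(export_text(tree, feature_names=["AOI_S", "NHPVII_PW", "AMCI_S"]))
```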
Urban Seismic Hazard Mapping for Memphis, Shelby County, Tennessee
Gomberg, Joan
2006-01-01
Earthquakes cannot be predicted, but scientists can forecast how strongly the ground is likely to shake as a result of an earthquake. Seismic hazard maps provide one way of conveying such forecasts. The U.S. Geological Survey (USGS), which produces seismic hazard maps for the Nation, is now engaged in developing more detailed maps for vulnerable urban areas. The first set of these maps is now available for Memphis, Tennessee.
Total lightning characteristics of recent hazardous weather events in Japan
NASA Astrophysics Data System (ADS)
Hobara, Y.; Kono, S.; Ogawa, T.; Heckman, S.; Stock, M.; Liu, C.
2017-12-01
In recent years, total lightning (IC + CG) activity has attracted much attention as a way to improve the quality of prediction of hazardous weather phenomena (hail, wind gusts, tornadoes, heavy precipitation). A sudden increase of the total lightning flash rate, the so-called lightning jump (LJ), preceding hazardous weather has been reported in several studies and is one of the promising precursors. Although increases in the frequency and intensity of these extreme weather events have been reported in Japan, their relationship with total lightning has not yet been studied intensively. In this paper, we present recent results from the Japanese total lightning detection network (JTLN) in relation to hazardous weather events that occurred in Japan during 2014-2016. Automatic thunderstorm cell tracking was carried out based on very high spatial and temporal resolution X-band MP radar echo data (1 min and 250 m) to correlate with total lightning activity. The results are promising: the total lightning flash rate tends to increase about 10-40 minutes before the onset of extreme weather events. We also present differences in the lightning characteristics of thunderstorm cells between hazardous and non-hazardous weather events, which is vital information for improving prediction efficiency.
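A lightning-jump style test can be sketched as below, flagging times when the change in flash rate exceeds two standard deviations of its recent history (in the spirit of published 2-sigma algorithms; the data and window length are synthetic assumptions).

```python
# Flag a "lightning jump": the rate of change of a cell's total flash rate
# exceeds mean + 2*sigma of its trailing history. Synthetic flash-rate series.
import numpy as np

rng = np.random.default_rng(3)
flash_rate = np.concatenate(
    [rng.poisson(5, 30), rng.poisson(30, 10)]
).astype(float)                          # flashes/min; jump at t = 30

dfrdt = np.diff(flash_rate)              # rate of change per time step
for t in range(10, len(dfrdt)):
    hist = dfrdt[t - 10:t]               # trailing 10-step history
    if dfrdt[t] > hist.mean() + 2 * hist.std():
        print(f"lightning jump flagged at t = {t + 1} min")
        break
```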
CHARACTERIZATION OF RISKS POSED BY COMBUSTOR EMISSIONS
Risk characterization is the final step of the risk assessment process as practiced in the U.S. EPA. In risk characterization, the major scientific evidence and "bottom-line" results from the other components of the risk assessment process, hazard identification, dose-response as...
Nurses' short-term prediction of violence in acute psychiatric intensive care.
Björkdahl, A; Olsson, D; Palmstierna, T
2006-03-01
To evaluate the short-term predictive capacity of the Brøset Violence Checklist (BVC) when used by nurses in a psychiatric intensive care unit. Seventy-three patients were assessed according to the BVC three times daily. Violent incidents were recorded with the Staff Observation Aggression Scale, revised version. An extended Cox proportional hazards model with multiple events and time-dependent covariates was estimated to evaluate how the highest BVC sum of the last 24 h and its separate items affect the risk for severe violence within the next 24 h. With a BVC sum of one or more, hazard for severe violence was six times higher than if the sum was zero. Four of the six separate items significantly increased the risk for severe violence with hazard ratios between 3.0 and 6.3. Risk for in-patient violence in a short-term perspective can to a high degree be predicted by nurses using the BVC.
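A simplified sketch of the modelling step, assuming the lifelines package and a plain Cox fit; the real study used an extended model with multiple events and time-dependent covariates, and the synthetic data below merely mimic the reported hazard ratio of about six.

```python
# Cox proportional hazards fit relating a positive BVC sum (>= 1) in the last
# 24 h to time until a severe incident, censored at 24 h. Data are synthetic.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(4)
n = 73
bvc = rng.integers(0, 7, n)                       # highest BVC sum, 0-6
baseline = rng.exponential(96, n)                 # hours to incident if BVC = 0
time = baseline / np.where(bvc >= 1, 6.0, 1.0)    # HR ~ 6 when BVC >= 1
observed = time < 24                              # incident within next 24 h?

df = pd.DataFrame({"bvc_positive": (bvc >= 1).astype(int),
                   "time": np.minimum(time, 24.0),
                   "event": observed.astype(int)})
cph = CoxPHFitter().fit(df, duration_col="time", event_col="event")
cph.print_summary()                               # hazard ratio for bvc_positive
```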
Rosin-Rammler Distributions in ANSYS Fluent
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dunham, Ryan Q.
In Health Physics monitoring, particles need to be collected and tracked. One method is to predict the motion of potential health hazards with computer models. Particles released from various sources within a glove box can become a respirable health hazard if released into the area surrounding the glove box. The goal of modeling the aerosols in a glove box is to reduce the hazards associated with a leak in the glove box system. ANSYS Fluent provides a number of tools for modeling this type of environment. Particles can be released using injections into the flow path with turbulent properties. The models of particle tracks can then be used to predict paths and concentrations of particles within the flow. An attempt to understand and predict the handling of data by Fluent was made, and results were iteratively tracked. Trends in the data were studied to understand the final results. The purpose of the study was to allow a better understanding of the operation of Fluent for aerosol modeling, for future application in many fields.
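The Rosin-Rammler distribution named in the title gives the mass fraction of particles with diameter greater than d as Y(d) = exp(-(d/d_mean)^n), which is how Fluent parameterizes droplet-size injections; the parameter values below are illustrative.

```python
# Evaluate the Rosin-Rammler size distribution used for particle injections:
# Y(d) = mass fraction of particles with diameter greater than d.
import numpy as np

d_mean = 50e-6   # Rosin-Rammler mean diameter (m)
n = 3.5          # spread parameter

def mass_fraction_above(d):
    return np.exp(-(d / d_mean) ** n)

for d in (10e-6, 50e-6, 100e-6):
    print(f"d = {d * 1e6:5.1f} um -> Y = {mass_fraction_above(d):.3f}")
# Y(d_mean) = exp(-1) ~ 0.368 by construction, a quick sanity check on inputs.
```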
Modelling of a spread of hazardous substances in a Floreon+ system
NASA Astrophysics Data System (ADS)
Ronovsky, Ales; Brzobohaty, Tomas; Kuchar, Stepan; Vojtek, David
2017-07-01
This paper is focused on a module for automated numerical modelling of the spread of hazardous substances, developed for the Floreon+ system at the request of the Fire Brigade of the Moravian-Silesian Region. The main purpose of the module is to provide more accurate predictions for smog situations, a frequent problem in the region. It can be operated by non-scientific users through the Floreon+ client and can be used as a short-term prediction model of the evolution of concentrations of dangerous substances (SO2, PMx) from stationary sources, such as heavy-industry factories, local furnaces, or highways, or as a fast prediction of the spread of hazardous substances after a crash involving a mobile contamination source (transport of dangerous substances) or a leak at a local chemical factory. The process of automatically gathering atmospheric data, the connection of the Floreon+ system to the HPC infrastructure necessary for running such a model, and the model itself are described below.
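The module's dispersion model is not specified in the abstract; as a stand-in, the sketch below evaluates a steady Gaussian plume, a common short-term screening model, with assumed Briggs-style dispersion coefficients and source parameters.

```python
# Steady Gaussian plume for a single elevated source, with ground reflection.
# Dispersion coefficients approximate rural class-D Briggs fits; all source
# parameters are assumed for illustration.
import numpy as np

def gaussian_plume(q, u, x, y, z, h):
    """Concentration (g/m^3) at (x, y, z) downwind of a stack of height h."""
    sigma_y = 0.08 * x * (1 + 0.0001 * x) ** -0.5
    sigma_z = 0.06 * x * (1 + 0.0015 * x) ** -0.5
    lateral = np.exp(-y**2 / (2 * sigma_y**2))
    vertical = (np.exp(-(z - h) ** 2 / (2 * sigma_z**2))
                + np.exp(-(z + h) ** 2 / (2 * sigma_z**2)))  # ground reflection
    return q / (2 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

# 100 g/s of SO2 from a 30 m stack in a 3 m/s wind: concentration 1 km downwind.
print(gaussian_plume(q=100.0, u=3.0, x=1000.0, y=0.0, z=1.5, h=30.0), "g/m^3")
```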
NASA Astrophysics Data System (ADS)
Lehmann, Peter; von Ruette, Jonas; Fan, Linfeng; Or, Dani
2014-05-01
Rapid debris flows initiated by rainfall-induced shallow landslides present a highly destructive natural hazard in steep terrain. The impact and run-out paths of debris flows depend on the volume, composition, and initiation zone of the released material, which must be known to make accurate debris flow predictions and hazard maps. For that purpose we couple the mechanistic 'Catchment-scale Hydro-mechanical Landslide Triggering (CHLT)' model, which computes the timing, location, and volume of landslides, with simple approaches to estimate debris flow runout distances. The runout models were tested using two landslide inventories obtained in the Swiss Alps following prolonged rainfall events. The predicted runout distances were in good agreement with observations, confirming the utility of such simple models for landscape-scale estimates. In a next step, debris flow paths were computed for landslides predicted with the CHLT model over a range of soil properties to explore their effect on runout distances. This combined approach offers a more complete spatial picture of shallow landslide and subsequent debris flow hazards. The additional information provided by the CHLT model concerning the location, shape, soil type, and water content of the released mass may also be incorporated into more advanced runout models to improve prediction of the reach and impact of such abruptly released masses.
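One plausible form for the "simple approaches" to runout is the empirical reach-angle (Fahrboeschung) relation, where the flow stops at runout length L = H / tan(beta); the volume dependence of beta below is an assumption for illustration, not the paper's calibration.

```python
# Empirical reach-angle runout estimate: larger released volumes travel at
# lower reach angles, hence farther. The fit below is purely illustrative.
import math

def runout_length(drop_height_m: float, volume_m3: float) -> float:
    # Illustrative decrease of reach angle with the log of released volume.
    beta_deg = max(10.0, 35.0 - 3.0 * math.log10(volume_m3))
    return drop_height_m / math.tan(math.radians(beta_deg))

for volume in (1e2, 1e3, 1e4):
    print(f"V = {volume:8.0f} m^3 -> L = {runout_length(200.0, volume):6.0f} m")
```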
Petersen, M.D.; Dewey, J.; Hartzell, S.; Mueller, C.; Harmsen, S.; Frankel, A.D.; Rukstales, K.
2004-01-01
The ground motion hazard for Sumatra and the Malaysian peninsula is calculated in a probabilistic framework, using procedures developed for the US National Seismic Hazard Maps. We constructed regional earthquake source models and used standard published and modified attenuation equations to calculate peak ground acceleration at 2% and 10% probability of exceedance in 50 years for rock site conditions. We developed or modified earthquake catalogs and declustered these catalogs to include only independent earthquakes. The resulting catalogs were used to define four source zones that characterize earthquakes in four tectonic environments: subduction zone interface earthquakes, subduction zone deep intraslab earthquakes, strike-slip transform earthquakes, and intraplate earthquakes. The recurrence rates and sizes of historical earthquakes on known faults and across zones were also determined from this modified catalog. In addition to the source zones, our seismic source model considers two major faults that are known historically to generate large earthquakes: the Sumatran subduction zone and the Sumatran transform fault. Several published studies were used to describe earthquakes along these faults during historical and pre-historical time, as well as to identify segmentation models of faults. Peak horizontal ground accelerations were calculated using ground motion prediction relations that were developed from seismic data obtained from the crustal interplate environment, crustal intraplate environment, along the subduction zone interface, and from deep intraslab earthquakes. Most of these relations, however, have not been developed for large distances that are needed for calculating the hazard across the Malaysian peninsula, and none were developed for earthquake ground motions generated in an interplate tectonic environment that are propagated into an intraplate tectonic environment. For the interplate and intraplate crustal earthquakes, we have applied ground-motion prediction relations that are consistent with California (interplate) and India (intraplate) strong motion data that we collected for distances beyond 200 km. For the subduction zone equations, we recognized that the published relationships at large distances were not consistent with global earthquake data that we collected and modified the relations to be compatible with the global subduction zone ground motions. In this analysis, we have used alternative source and attenuation models and weighted them to account for our uncertainty in which model is most appropriate for Sumatra or for the Malaysian peninsula. The resulting peak horizontal ground accelerations for 2% probability of exceedance in 50 years range from over 100% g to about 10% g across Sumatra and generally less than 20% g across most of the Malaysian peninsula. The ground motions at 10% probability of exceedance in 50 years are typically about 60% of the ground motions derived for a hazard level at 2% probability of exceedance in 50 years. The largest contributors to hazard are from the Sumatran faults.
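The 2% and 10% in 50-year hazard levels used above map to return periods through the Poisson assumption standard in PSHA, P = 1 - exp(-t/T); a small helper makes the correspondence explicit.

```python
# Convert "P% probability of exceedance in t years" to a mean return period
# under the Poisson occurrence assumption: P = 1 - exp(-t / T).
import math

def return_period(p_exceed: float, t_years: float = 50.0) -> float:
    return -t_years / math.log(1.0 - p_exceed)

print(f"10% in 50 yr -> {return_period(0.10):.0f} yr return period")   # ~475 yr
print(f" 2% in 50 yr -> {return_period(0.02):.0f} yr return period")   # ~2475 yr
```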
Is Directivity Still Effective in a PSHA Framework?
NASA Astrophysics Data System (ADS)
Spagnuolo, E.; Herrero, A.; Cultrera, G.
2008-12-01
Source rupture parameters such as directivity modulate the energy release, causing variations in the radiated signal amplitude. They therefore affect empirical predictive equations and, as a consequence, seismic hazard assessment. Classical probabilistic hazard evaluations, e.g., Cornell (1968), use very simple predictive equations based only on magnitude and distance, which do not account for variables describing the rupture process. Nowadays, however, a few predictive equations (e.g., Somerville 1997, Spudich and Chiou 2008) do account for rupture directivity, and a few implementations have been made in a PSHA framework (e.g., Convertito et al. 2006, Rowshandel 2006). In practice, these new empirical predictive models incorporate rupture propagation effects quantitatively through the introduction of variables such as rake, azimuth, rupture velocity, and laterality. The contribution of all these variables is summarized in corrective factors derived from measuring the differences between real data and predictions. It is therefore possible to keep the older computation, making use of a simple predictive model, and to incorporate the directivity effect through the corrective factors. Each supplementary variable implies a new integral over the parameter space, and the difficulty lies in constraining the parameter distribution functions. We present preliminary results for ad hoc distributions (Gaussian, uniform) to test the impact of incorporating directivity into PSHA models. We demonstrate that incorporating directivity in PSHA by means of the new predictive equations may lead to strong percentage variations in the hazard assessment.
Gasoline toxicology: overview of regulatory and product stewardship programs.
Swick, Derek; Jaques, Andrew; Walker, J C; Estreicher, Herb
2014-11-01
Significant efforts have been made to characterize the toxicological properties of gasoline. There have been both mandatory and voluntary toxicology testing programs to generate hazard characterization data for gasoline, the refinery process streams used to blend gasoline, and individual chemical constituents found in gasoline. The Clean Air Act (CAA) (Clean Air Act, 2012: § 7401, et seq.) is the primary tool for the U.S. Environmental Protection Agency (EPA) to regulate gasoline and this supplement presents the results of the Section 211(b) Alternative Tier 2 studies required for CAA Fuel and Fuel Additive registration. Gasoline blending streams have also been evaluated by EPA under the voluntary High Production Volume (HPV) Challenge Program through which the petroleum industry provide data on over 80 refinery streams used in gasoline. Product stewardship efforts by companies and associations such as the American Petroleum Institute (API), Conservation of Clean Air and Water Europe (CONCAWE), and the Petroleum Product Stewardship Council (PPSC) have contributed a significant amount of hazard characterization data on gasoline and related substances. The hazard of gasoline and anticipated exposure to gasoline vapor has been well characterized for risk assessment purposes. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Kritikos, Theodosios; Robinson, Tom R.; Davies, Tim R. H.
2015-04-01
Currently, regional coseismic landslide hazard analyses require comprehensive historical landslide inventories as well as detailed geotechnical data. Consequently, such analyses have not been possible where these data are not available. A new approach is proposed herein to assess coseismic landslide hazard at regional scale for specific earthquake scenarios in areas without historical landslide inventories. The proposed model employs fuzzy logic and geographic information systems to establish relationships between causative factors and coseismic slope failures in regions with well-documented and substantially complete coseismic landslide inventories. These relationships are then utilized to estimate the relative probability of landslide occurrence in regions with neither historical landslide inventories nor detailed geotechnical data. Statistical analyses of inventories from the 1994 Northridge and 2008 Wenchuan earthquakes reveal that shaking intensity, topography, and distance from active faults and streams are the main controls on the spatial distribution of coseismic landslides. Average fuzzy memberships for each factor are developed and aggregated to model the relative coseismic landslide hazard for both earthquakes. The predictive capabilities of the models are assessed and show good-to-excellent model performance for both events. These memberships are then applied to the 1999 Chi-Chi earthquake, using only a digital elevation model, active fault map, and isoseismal data, replicating prediction of a future event in a region lacking historic inventories and/or geotechnical data. This similarly results in excellent model performance, demonstrating the model's predictive potential and confirming it can be meaningfully applied in regions where previous methods could not. For such regions, this method may enable a greater ability to analyze coseismic landslide hazard from specific earthquake scenarios, allowing for mitigation measures and emergency response plans to be better informed of earthquake-related hazards.
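A sketch of a fuzzy-overlay aggregation of causative factors, using a generic gamma operator; the membership shapes and gamma value are placeholders, not the paper's calibrated memberships.

```python
# Aggregate fuzzy memberships of causative factors into a relative coseismic
# landslide hazard via the fuzzy gamma operator. Shapes and gamma are generic.
import numpy as np

def membership_slope(slope_deg):            # steeper -> more hazardous
    return np.clip(slope_deg / 45.0, 0, 1)

def membership_intensity(mmi):              # stronger shaking -> more hazardous
    return np.clip((mmi - 4.0) / 5.0, 0, 1)

def fuzzy_gamma(memberships, gamma=0.9):
    m = np.asarray(memberships)
    alg_sum = 1 - np.prod(1 - m, axis=0)    # fuzzy algebraic sum
    product = np.prod(m, axis=0)            # fuzzy algebraic product
    return alg_sum**gamma * product**(1 - gamma)

cells_slope = np.array([10.0, 30.0, 42.0])  # three example grid cells
cells_mmi = np.array([5.0, 7.5, 9.0])
hazard = fuzzy_gamma([membership_slope(cells_slope),
                      membership_intensity(cells_mmi)])
print("relative coseismic landslide hazard:", hazard.round(2))
```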
NASA Astrophysics Data System (ADS)
Bindi, D.; Cotton, F.; Kotha, S. R.; Bosse, C.; Stromeyer, D.; Grünthal, G.
2017-09-01
We present a ground motion prediction equation (GMPE) for probabilistic seismic hazard assessments (PSHA) in low-to-moderate seismicity areas, such as Germany. Starting from the NGA-West2 flat-file (Ancheta et al. in Earthquake Spectra 30:989-1005, 2014), we develop a model tailored to the hazard application in terms of data selection and implemented functional form. In light of this application, the GMPE is derived for hypocentral distance (along with the Joyner-Boore one), selecting recordings at sites with vs30 ≥ 360 m/s, distances within 300 km, and magnitudes in the range 3 to 8 (7.4 being the maximum magnitude for the PSHA in the target area). Moreover, the complexity of the considered functional form reflects the availability of information in the target area. The median predictions are compared with those from the NGA-West2 models and with one recent European model, using Sammon's maps constructed for different scenarios. Despite the simplification in the functional form, the assessed epistemic uncertainty in the GMPE median is of the order of that affecting the NGA-West2 models for the magnitude range of interest. On the other hand, the simplification of the functional form led to an increase in the apparent aleatory variability. In conclusion, the GMPE developed in this study is tailored to the needs of applications in low-to-moderate seismicity areas and for short return periods (e.g., 475 years); its application in studies where the hazard involves magnitudes above 7.4 and long return periods is not advised.
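For illustration, a GMPE with a deliberately simple functional form can be evaluated as below; the coefficients are placeholders, not the fitted values of this model.

```python
# Evaluate a toy GMPE of the common form
# ln(PGA) = c0 + c1*M + c2*(M - Mref)**2 + c3*ln(sqrt(Rhypo**2 + h**2)).
# All coefficients are illustrative placeholders.
import math

C = dict(c0=-4.0, c1=1.2, c2=-0.1, c3=-1.4, h=6.0, Mref=5.5)

def ln_pga(mag: float, r_hypo_km: float) -> float:
    r_eff = math.sqrt(r_hypo_km**2 + C["h"]**2)      # saturation near the source
    return (C["c0"] + C["c1"] * mag
            + C["c2"] * (mag - C["Mref"])**2
            + C["c3"] * math.log(r_eff))

for m, r in [(4.0, 20.0), (6.0, 20.0), (6.0, 100.0)]:
    print(f"M{m}, R = {r:5.1f} km -> median PGA ~ {math.exp(ln_pga(m, r)):.4f} g")
```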
Final Report: Seismic Hazard Assessment at the PGDP
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Zhinmeng
2007-06-01
Selecting a level of seismic hazard at the Paducah Gaseous Diffusion Plant for policy considerations and engineering design is not an easy task because it depends not only on seismic hazard, but also on seismic risk and other related environmental, social, and economic issues. Seismic hazard is the main focus. There is no question that there are seismic hazards at the Paducah Gaseous Diffusion Plant because of its proximity to several known seismic zones, particularly the New Madrid Seismic Zone. The issues in estimating seismic hazard are (1) the methods being used and (2) the difficulty in characterizing the uncertainties of seismic sources, earthquake occurrence frequencies, and ground-motion attenuation relationships. This report summarizes how the input data were derived, which methodologies were used, and what the hazard estimates at the Paducah Gaseous Diffusion Plant are.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Nathanael J. K.; Gearhart, Jared Lee; Jones, Dean A.
Currently, much of protection planning is conducted separately for each infrastructure and hazard. Limited funding requires a balance of expenditures between terrorism and natural hazards based on potential impacts. This report documents the results of a Laboratory Directed Research & Development (LDRD) project that created a modeling framework for investment planning in interdependent infrastructures facing multiple hazards, including terrorism. To develop this framework, three modeling elements were integrated: natural hazards, terrorism, and interdependent infrastructures. For natural hazards, a methodology was created for specifying events consistent with regional hazards. For terrorism, we modeled the terrorists' actions based on assumptions regarding their knowledge, goals, and target identification strategy. For infrastructures, we focused on predicting post-event performance due to specific terrorist attacks and natural hazard events, tempered by appropriate infrastructure investments. We demonstrate the utility of this framework with various examples, including protection of electric power, roadway, and hospital networks.
Assessment and Prediction of Natural Hazards from Satellite Imagery
Gillespie, Thomas W.; Chu, Jasmine; Frankenberg, Elizabeth; Thomas, Duncan
2013-01-01
Since 2000, there have been a number of spaceborne satellites that have changed the way we assess and predict natural hazards. These satellites are able to quantify physical geographic phenomena associated with the movements of the earth’s surface (earthquakes, mass movements), water (floods, tsunamis, storms), and fire (wildfires). Most of these satellites contain active or passive sensors that can be utilized by the scientific community for the remote sensing of natural hazards over a number of spatial and temporal scales. The most useful satellite imagery for the assessment of earthquake damage comes from high-resolution (0.6 m to 1 m pixel size) passive sensors and moderate resolution active sensors that can quantify the vertical and horizontal movement of the earth’s surface. High-resolution passive sensors have been used to successfully assess flood damage while predictive maps of flood vulnerability areas are possible based on physical variables collected from passive and active sensors. Recent moderate resolution sensors are able to provide near real time data on fires and provide quantitative data used in fire behavior models. Limitations currently exist due to atmospheric interference, pixel resolution, and revisit times. However, a number of new microsatellites and constellations of satellites will be launched in the next five years that contain increased resolution (0.5 m to 1 m pixel resolution for active sensors) and revisit times (daily ≤ 2.5 m resolution images from passive sensors) that will significantly improve our ability to assess and predict natural hazards from space. PMID:25170186
Volcanic hazards and their mitigation: progress and problems
Tilling, R.I.
1989-01-01
A review of hazards mitigation approaches and techniques indicates that significant advances have been made in hazards assessment, volcano monitoring, and eruption forecasting. For example, the remarkable accuracy of the predictions of dome-building events at Mount St. Helens since June 1980 is unprecedented. Yet a predictive capability for more voluminous and explosive eruptions still has not been achieved. Studies of magma-induced seismicity and ground deformation continue to provide the most systematic and reliable data for early detection of precursors to eruptions and shallow intrusions. In addition, some other geophysical monitoring techniques and geochemical methods have been refined and are being more widely applied and tested. Comparison of the four major volcanic disasters of the 1980s (Mount St. Helens, U.S.A. (1980); El Chichon, Mexico (1982); Galunggung, Indonesia (1982); and Nevado del Ruiz, Colombia (1985)) illustrates the importance of predisaster geoscience studies, volcanic hazards assessments, volcano monitoring, contingency planning, and effective communications between scientists and authorities. -from Author
Kim, Cheol-Hee; Park, Jin-Ho; Park, Cheol-Jin; Na, Jin-Gyun
2004-03-01
The Chemical Accidents Response Information System (CARIS) was developed at the Center for Chemical Safety Management in South Korea in order to track and predict the dispersion of hazardous chemicals in the case of an accident or terrorist attack involving chemical companies. The main objective of CARIS is to facilitate an efficient emergency response to hazardous chemical accidents by rapidly providing key information in the decision-making process. In particular, the atmospheric modeling system implemented in CARIS, which is composed of a real-time numerical weather forecasting model and an air pollution dispersion model, can be used as a tool to forecast concentrations and to provide a wide range of assessments associated with various hazardous chemicals in real time. This article introduces the components of CARIS and describes its operational modeling system. Some examples of the operational modeling system and its use for emergency preparedness are presented and discussed. Finally, this article evaluates the current numerical weather prediction model for Korea.
Environmental monitoring and surveillance strategies are essential for identifying potential hazards of contaminant exposure to aquatic organisms. Chemical monitoring is effective for chemicals with well characterized hazards and for which sensitive analytical methods are availa...
CHARACTERIZATION OF A LOSS OF HETEROZYGOSITY CANCER HAZARD IDENTIFICATION ASSAY.
Tumor development generally requires the loss of heterozygosity (LOH) at one or more loci. Thus, the ability to determine whether a chemical is capable of causing LOH is an important part of cancer hazard identification. The mouse lymphoma assay detects a broad spectrum of geneti...
Hydrotreater/Distillation Column Hazard Analysis Report Rev. 2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lowry, Peter P.; Wagner, Katie A.
This project Hazard and Risk Analysis Report contains the results of several hazard analyses and risk assessments. An initial assessment was conducted in 2012, which included a multi-step approach ranging from design reviews to a formal What-If hazard analysis. A second What-If hazard analysis was completed during February 2013 to evaluate the operation of the hydrotreater/distillation column processes to be installed in a process enclosure within the Process Development Laboratory West (PDL-West) facility located on the PNNL campus. The qualitative analysis included participation of project and operations personnel and applicable subject matter experts. The analysis identified potential hazardous scenarios, each based on an initiating event coupled with a postulated upset condition. The unmitigated consequences of each hazardous scenario were generally characterized as a process upset; the exposure of personnel to steam, vapors or hazardous material; a spray or spill of hazardous material; the creation of a flammable atmosphere; or an energetic release from a pressure boundary.
Predictive Models of target organ and Systemic toxicities (BOSC)
The objective of this work is to predict the hazard classification and point of departure (PoD) of untested chemicals in repeat-dose animal testing studies. We used supervised machine learning to objectively evaluate the predictive accuracy of different classification and regress...
Cheng, Feixiong; Shen, Jie; Yu, Yue; Li, Weihua; Liu, Guixia; Lee, Philip W; Tang, Yun
2011-03-01
There is an increasing need for the rapid safety assessment of chemicals by both industries and regulatory agencies throughout the world, and in silico techniques are practical alternatives in environmental hazard assessment, especially for addressing the persistence, bioaccumulation, and toxicity potentials of organic chemicals. Tetrahymena pyriformis toxicity is often used as a toxicity endpoint. In this study, 1571 diverse unique chemicals were collected from the literature, composing the largest diverse data set for T. pyriformis toxicity. Classification models of T. pyriformis toxicity were developed by substructure pattern recognition and different machine learning methods, including support vector machine (SVM), C4.5 decision tree, k-nearest neighbors, and random forest. The results of a 5-fold cross-validation showed that the SVM method performed better than the other algorithms. The overall predictive accuracy of the SVM classification model with a radial basis function kernel was 92.2% for the 5-fold cross-validation and 92.6% for the external validation set. Furthermore, several representative substructure patterns characterizing T. pyriformis toxicity were identified via information gain analysis. Copyright © 2010 Elsevier Ltd. All rights reserved.
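The information-gain screen for substructure patterns can be sketched as follows: it measures how much knowing that a molecule contains a pattern reduces entropy about the toxicity label. Patterns and labels here are synthetic stand-ins.

```python
# Information gain of a binary substructure pattern with respect to a binary
# toxicity label: IG = H(label) - H(label | pattern). Data are synthetic.
import numpy as np

def entropy(y):
    p = np.bincount(y, minlength=2) / len(y)
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

def information_gain(has_pattern, toxic):
    h_cond = 0.0
    for v in (0, 1):
        mask = has_pattern == v
        if mask.any():
            h_cond += mask.mean() * entropy(toxic[mask])
    return entropy(toxic) - h_cond

rng = np.random.default_rng(5)
toxic = rng.integers(0, 2, 200)
noise = (rng.random(200) < 0.2).astype(int)
informative = toxic ^ noise                    # pattern correlated with toxicity
random_pat = rng.integers(0, 2, 200)           # uninformative pattern
print("IG informative pattern:", information_gain(informative, toxic))
print("IG random pattern:     ", information_gain(random_pat, toxic))
```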
NASA Astrophysics Data System (ADS)
Sajid, Muhammad
This tutorial/survey paper assesses the level of hazard to emerging microelectronic devices in the Low Earth Orbit (LEO) space radiation environment, with perigee at 300 km and apogee at 600 km altitude and different orbital inclinations, to predict the reliability of an onboard Bulk Built-In Current Sensor (BBICS) fabricated in the 350 nm technology node at the OptMA Lab, UFMG, Brazil. In this context, various parameters of the space radiation environment have been analyzed to characterize ionizing radiation effects on the proposed BBICS. The space radiation environment has been modeled in the form of particles trapped in the Van Allen radiation belts (RBs), energetic solar particle events (ESPE), and galactic cosmic rays (GCR), and its potential effects on the device under test (DUT) have been predicted in terms of total ionizing dose (TID), single-event effects (SEE), and displacement damage dose (DDD). Finally, the mitigation techniques required to avoid undesirable radiation effects at the device level, including the necessary shielding, have been estimated assuming a standard thickness of aluminum shielding. To evaluate the space radiation environment and analyze energetic particle effects on the BBICS, the OMERE toolkit developed by TRAD was utilized.
NASA Technical Reports Server (NTRS)
Kaplan, Michael L.; Lux, Kevin M.; Cetola, Jeffrey D.; Huffman, Allan W.; Riordan, Allen J.; Slusser, Sarah W.; Lin, Yuh-Lang; Charney, Joseph J.; Waight, Kenneth T.
2004-01-01
Real-time prediction of environments predisposed to producing moderate-to-severe aviation turbulence is studied. We describe the numerical model and its postprocessing system designed for this prediction, and present numerous examples of its utility. The numerical model is MASS version 5.13, which is integrated over three different grid matrices in real time on a university workstation in support of NASA Langley Research Center's B-757 turbulence research flight missions. The postprocessing system includes several turbulence-related products, including four turbulence forecasting indices, winds, streamlines, turbulence kinetic energy, and Richardson numbers. Additionally, there are convective products including precipitation, cloud height, cloud mass fluxes, lifted index, and K-index. Furthermore, soundings, sounding parameters, and Froude number plots are also provided. The horizontal cross-section products are provided from 16,000 to 46,000 ft at 2,000-ft intervals. Products are available every 3 hours on the 60- and 30-km grids and every 1.5 hours on the 15-km grid. The model is initialized from the NWS ETA analyses and integrated twice a day.
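One of the listed diagnostics, the gradient Richardson number Ri = N^2 / S^2, can be computed from a model sounding as below; values under roughly 0.25 flag layers prone to shear-generated turbulence. The profile values are illustrative.

```python
# Layer-wise gradient Richardson number from a coarse sounding:
# Ri = N^2 / S^2, with N^2 = (g / theta) * dtheta/dz and S = du/dz.
import numpy as np

g = 9.81
z = np.array([10000., 10500., 11000.])          # height (m)
theta = np.array([330.0, 331.0, 331.5])         # potential temperature (K)
u = np.array([35.0, 45.0, 52.0])                # wind speed (m/s)

n2 = g / theta[:-1] * np.diff(theta) / np.diff(z)   # static stability N^2
s2 = (np.diff(u) / np.diff(z)) ** 2                 # shear squared
ri = n2 / s2
print("layer Richardson numbers:", ri.round(2))
print("turbulence-prone layers (Ri < 0.25):", (ri < 0.25).nonzero()[0])
```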
Acosta, Juan; Fernández-Armenta, Juan; Borràs, Roger; Anguera, Ignasi; Bisbal, Felipe; Martí-Almor, Julio; Tolosana, Jose M; Penela, Diego; Andreu, David; Soto-Iglesias, David; Evertz, Reinder; Matiello, María; Alonso, Concepción; Villuendas, Roger; de Caralt, Teresa M; Perea, Rosario J; Ortiz, Jose T; Bosch, Xavier; Serra, Luis; Planes, Xavier; Greiser, Andreas; Ekinci, Okan; Lasalvia, Luis; Mont, Lluis; Berruezo, Antonio
2018-04-01
The aim of this study was to analyze whether scar characterization could improve the risk stratification for life-threatening ventricular arrhythmias and sudden cardiac death (SCD). Among patients with a cardiac resynchronization therapy (CRT) indication, appropriate defibrillator (CRT-D) therapy rates are low. Primary prevention patients with a class I indication for CRT were prospectively enrolled and assigned to CRT-D or CRT pacemaker according to physician's criteria. Pre-procedure contrast-enhanced cardiac magnetic resonance was obtained and analyzed to identify scar presence or absence, quantify the amount of core and border zone (BZ), and depict BZ distribution. The presence, mass, and characteristics of BZ channels in the scar were recorded. The primary endpoint was appropriate defibrillator therapy or SCD. 217 patients (39.6% ischemic) were included. During a median follow-up of 35.5 months (12 to 62 months), the primary endpoint occurred in 25 patients (11.5%) and did not occur in patients without myocardial scar. Among patients with scar (n = 125, 57.6%), those with implantable cardioverter-defibrillator (ICD) therapies or SCD exhibited greater scar mass (38.7 ± 34.2 g vs. 17.9 ± 17.2 g; p < 0.001), scar heterogeneity (BZ mass/scar mass ratio) (49.5 ± 13.0 vs. 40.1 ± 21.7; p = 0.044), and BZ channel mass (3.6 ± 3.0 g vs. 1.8 ± 3.4 g; p = 0.018). BZ mass (hazard ratio: 1.06 [95% confidence interval: 1.04 to 1.08]; p < 0.001) and BZ channel mass (hazard ratio: 1.21 [95% confidence interval: 1.10 to 1.32]; p < 0.001) were the strongest predictors of the primary endpoint. An algorithm based on scar mass and the absence of BZ channels identified 148 patients (68.2%) without ICD therapy/SCD during follow-up with a 100% negative predictive value. The presence, extension, heterogeneity, and qualitative distribution of BZ tissue of myocardial scar independently predict appropriate ICD therapies and SCD in CRT patients. Copyright © 2018 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.
Earthquake prediction: the interaction of public policy and science.
Jones, L M
1996-01-01
Earthquake prediction research has searched for both informational phenomena, those that provide information about earthquake hazards useful to the public, and causal phenomena, causally related to the physical processes governing failure on a fault, to improve our understanding of those processes. Neither informational nor causal phenomena are a subset of the other. I propose a classification of potential earthquake predictors of informational, causal, and predictive phenomena, where predictors are causal phenomena that provide more accurate assessments of the earthquake hazard than can be gotten from assuming a random distribution. Achieving higher, more accurate probabilities than a random distribution requires much more information about the precursor than just that it is causally related to the earthquake. PMID:11607656
NASA Astrophysics Data System (ADS)
Armstrong, Michael James
Increases in power demands and changes in the design practices of overall equipment manufacturers have led to a new paradigm in vehicle systems definition. The development of unique power systems architectures is of increasing importance to overall platform feasibility and must be pursued early in the aircraft design process. Many vehicle systems architecture trades must be conducted concurrently with platform definition. With the increased complexity introduced during conceptual design, accurate predictions of unit-level sizing requirements must be made, and architecture-specific emergent requirements must be identified which arise from the complex integrated effect of unit behaviors. Off-nominal operating scenarios pose sizing-critical requirements on aircraft vehicle systems; these requirements are architecture-specific and emergent. Standard, heuristically defined failure mitigation is sufficient for sizing traditional and evolutionary architectures. However, architecture concepts which vary significantly in structure and composition require that unique failure mitigation strategies be defined for accurate estimation of unit-level requirements. Identifying these off-nominal emergent operational requirements requires extensions to traditional safety and reliability tools and the systematic identification of optimal performance degradation strategies. The discrete operational constraints posed by traditional Functional Hazard Assessment (FHA) are replaced by continuous relationships between function loss and operational hazard, and these relationships pose the objective function for hazard minimization. Load shedding optimization is performed for all statistically significant failures by varying the allocation of functional capability throughout the vehicle systems architecture. Expressing hazards, and thereby reliability requirements, as continuous relationships with the magnitude and duration of functional failure requires augmentations to the traditional means of system safety assessment (SSA): the traditional two-state, discrete system reliability assessment proves insufficient. Reliability is therefore handled in an analog fashion, as a function of magnitude and duration of failure. A series of metrics is introduced which characterize system performance in terms of analog hazard probabilities, including analog and cumulative system and functional risk, hazard correlation, and extensions to the traditional component importance metrics. Continuous FHA, load shedding optimization, and analog SSA constitute the SONOMA process (Systematic Off-Nominal Requirements Analysis). Analog system safety metrics inform both architecture optimization (changes in unit-level capability and reliability) and architecture augmentation (changes in architecture structure and composition). This process was applied to two vehicle systems concepts (conventional and 'more-electric') with loss/hazard relationships of varying fidelity. Application of this process shows that the traditional assumptions regarding the structure of the function loss vs. hazard relationship apply undue design bias to functions and components during exploratory design; this bias is illustrated in terms of inaccurate estimations of system- and function-level risk and unit-level importance. It was also shown that off-nominal emergent requirements must be defined specifically for each architecture concept.
Quantitative comparisons of architecture-specific off-nominal performance were obtained which provide evidence of the need for accurate definition of load shedding strategies during architecture exploratory design. Formally expressing performance degradation strategies in terms of the minimization of a continuous hazard space enhances the system architect's ability to accurately predict sizing-critical emergent requirements concurrent to architecture definition. Furthermore, the methods and frameworks generated here provide a structured and flexible means for eliciting these architecture-specific requirements during the performance of architecture trades.
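To make the load-shedding step concrete, the following is a minimal sketch of minimizing a continuous function-loss/hazard relationship under a post-failure power budget. The function names, weights, hazard shapes, and power figures are hypothetical placeholders, not values from the SONOMA process itself.

    # Minimal sketch of load-shedding optimization against a continuous
    # function-loss/hazard relationship (all names and numbers illustrative).
    import numpy as np
    from scipy.optimize import minimize

    functions = ["flight_controls", "avionics", "cabin_systems"]
    power_demand = np.array([40.0, 25.0, 35.0])   # kW at full capability
    power_available = 60.0                         # kW remaining after a failure

    def hazard(loss, weight, exponent):
        """Continuous hazard as a function of magnitude of function loss."""
        return weight * loss ** exponent

    weights = np.array([100.0, 10.0, 1.0])   # flight controls dominate the hazard
    exponents = np.array([3.0, 2.0, 1.0])

    def total_hazard(capability):
        return float(np.sum(hazard(1.0 - capability, weights, exponents)))

    # Allocate retained capability subject to the post-failure power budget.
    res = minimize(
        total_hazard,
        x0=np.full(3, 0.5),
        bounds=[(0.0, 1.0)] * 3,
        constraints=[{"type": "ineq",
                      "fun": lambda c: power_available - power_demand @ c}],
    )
    for name, c in zip(functions, res.x):
        print(f"{name}: retain {c:.2f} of capability")

The optimizer naturally sheds the low-weight cabin function first, which is the qualitative behavior the continuous formulation is meant to expose.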
Controlling Hazardous Releases while Protecting Passengers in Civil Infrastructure Systems
NASA Astrophysics Data System (ADS)
Rimer, Sara P.; Katopodes, Nikolaos D.
2015-11-01
The threat of accidental or deliberate toxic chemicals released into public spaces is a significant concern to public safety, and the real-time detection and mitigation of such hazardous contaminants has the potential to minimize harm and save lives. Furthermore, the safe evacuation of occupants during such a catastrophe is of utmost importance. This research develops a comprehensive means to address such scenarios, through both the sensing and control of contaminants and the modeling of, and potential communication to, occupants as they evacuate. A computational fluid dynamics model is developed for a simplified public space characterized by a long conduit (e.g., an airport terminal) with unidirectional ambient flow; the model is capable of detecting and mitigating the hazardous contaminant (via boundary ports) over several time horizons using model predictive control optimization. Additionally, a physical prototype is built to test the real-time feasibility of this computational flow control model. The prototype is a blower wind tunnel with an elongated test section, capable of sensing (via digital camera) an injected 'contaminant' (propylene glycol smoke) and then mitigating that contaminant using actuators (compressed-air-operated vacuum nozzles) operated by a set of pressure regulators and a programmable controller. Finally, an agent-based model is developed to simulate "agents" (i.e., building occupants) as they evacuate a public space, and is coupled with the computational flow control model such that agents must interact with a dynamic, threatening environment. NSF-CMMI #0856438.
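As a rough illustration of the receding-horizon idea, here is a hedged sketch on a 1-D advection-diffusion surrogate for the conduit with a single mitigation port. The geometry, parameters, actuator model, and constant-control horizon are invented simplifications, far cruder than the CFD model described.

    # Hedged sketch of model predictive control for contaminant mitigation in a
    # 1-D conduit with unidirectional flow (all parameters illustrative).
    import numpy as np
    from scipy.optimize import minimize_scalar

    nx, dx, dt = 100, 1.0, 0.1
    u, D = 1.0, 0.5                      # ambient velocity, diffusivity (stable CFL)
    port = 60                            # grid index of the mitigation port

    def step(c, sink):
        """One explicit upwind advection-diffusion step with a sink at the port."""
        cn = c.copy()
        cn[1:-1] = (c[1:-1]
                    - u * dt / dx * (c[1:-1] - c[:-2])
                    + D * dt / dx**2 * (c[2:] - 2*c[1:-1] + c[:-2]))
        cn[port] = max(cn[port] - sink * dt, 0.0)
        return cn

    def horizon_cost(sink, c, steps=20):
        """Contaminant remaining at/past the port after the prediction horizon."""
        for _ in range(steps):
            c = step(c, sink)
        return c[port:].sum() + 0.01 * sink**2   # small actuator-effort penalty

    c = np.zeros(nx)
    c[20:25] = 1.0                        # initial contaminant slug
    for t in range(100):                  # receding-horizon loop
        best = minimize_scalar(lambda s: horizon_cost(s, c.copy()),
                               bounds=(0.0, 5.0), method="bounded")
        c = step(c, best.x)
    print("residual mass downstream of port:", c[port:].sum())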
Genome-wide Association Study Implicates PARD3B-based AIDS Restriction
Nelson, George W.; Lautenberger, James A.; Chinn, Leslie; McIntosh, Carl; Johnson, Randall C.; Sezgin, Efe; Kessing, Bailey; Malasky, Michael; Hendrickson, Sher L.; Pontius, Joan; Tang, Minzhong; An, Ping; Winkler, Cheryl A.; Limou, Sophie; Le Clerc, Sigrid; Delaneau, Olivier; Zagury, Jean-François; Schuitemaker, Hanneke; van Manen, Daniëlle; Bream, Jay H.; Gomperts, Edward D.; Buchbinder, Susan; Goedert, James J.; Kirk, Gregory D.; O'Brien, Stephen J.
2011-01-01
Background. Host genetic variation influences human immunodeficiency virus (HIV) infection and progression to AIDS. Here we used clinically well-characterized subjects from 5 pretreatment HIV/AIDS cohorts for a genome-wide association study to identify gene associations with rate of AIDS progression. Methods. European American HIV seroconverters (n = 755) were interrogated for single-nucleotide polymorphisms (SNPs) (n = 700,022) associated with progression to AIDS 1987 (Cox proportional hazards regression analysis, co-dominant model). Results. Association with slower progression was observed for SNPs in the gene PARD3B. One of these, rs11884476, reached genome-wide significance (relative hazard = 0.3; P = 3.370 × 10−9) after statistical correction for 700,022 SNPs and contributes 4.52% of the overall variance in AIDS progression in this study. Nine of the top-ranked SNPs define a PARD3B haplotype that also displays significant association with progression to AIDS (hazard ratio, 0.3; P = 3.220 × 10−8). One of these SNPs, rs10185378, is a predicted exonic splicing enhancer; significant alteration in the expression profile of PARD3B splicing transcripts was observed in B cell lines with alternate rs10185378 genotypes. This SNP was typed in European cohorts of rapid progressors and was found to be protective for AIDS 1993 definition (odds ratio, 0.43; P = .025). Conclusions. These observations suggest a potential unsuspected pathway of host genetic influence on the dynamics of AIDS progression. PMID:21502085
Mascarella, Marco A; Mannard, Erin; Silva, Sabrina Daniela; Zeitouni, Anthony
2018-05-01
Hematologic markers, such as the neutrophil-to-lymphocyte ratio (NLR), characterize the inflammatory response to cancer and are associated with poorer survival in various malignancies. We evaluate the effect of pretreatment NLR on overall survival (OS) in patients with head and neck squamous cell carcinoma (HNSCC). Using multiple databases, a systematic search for articles evaluating the effect of NLR on OS in patients with HNSCC was performed. An inverse-variance, random-effects model was used to analyze the data. A total of 24 of 241 articles, including 6479 patients, were analyzed. The combined hazard ratio for OS in patients with an elevated NLR (range 2.04-5) was 1.78 (confidence interval [CI] 1.53-2.07; P < .0001). The hazard ratios for site-specific cancers were: oral cavity, 1.56 (CI 1.23-1.98; P < .001); nasopharynx, 1.66 (CI 1.35-2.04; P < .001); larynx, 1.55 (CI 1.26-1.92; P < .001); and hypopharynx, 2.36 (CI 1.54-3.61; P < .001). An elevated NLR is predictive of poorer OS in patients with HNSCC. © 2018 Wiley Periodicals, Inc.
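For readers unfamiliar with the pooling step, the following is a small sketch of inverse-variance, random-effects (DerSimonian-Laird) pooling of hazard ratios; the per-study values below are placeholders, not the data extracted from the 24 analyzed articles.

    # Sketch of inverse-variance, random-effects pooling of log hazard ratios.
    import numpy as np

    hr = np.array([1.6, 2.1, 1.4, 1.9])          # per-study hazard ratios (invented)
    ci_hi = np.array([2.2, 3.0, 1.9, 2.8])       # upper 95% CI bounds (invented)

    y = np.log(hr)                                # work on the log-HR scale
    se = (np.log(ci_hi) - y) / 1.96               # back out standard errors
    w = 1.0 / se**2                               # fixed-effect weights

    # DerSimonian-Laird estimate of between-study variance tau^2.
    q = np.sum(w * (y - np.sum(w * y) / w.sum())**2)
    tau2 = max(0.0, (q - (len(y) - 1)) / (w.sum() - np.sum(w**2) / w.sum()))

    w_re = 1.0 / (se**2 + tau2)                   # random-effects weights
    pooled = np.sum(w_re * y) / w_re.sum()
    se_pooled = np.sqrt(1.0 / w_re.sum())
    print(f"pooled HR = {np.exp(pooled):.2f} "
          f"(95% CI {np.exp(pooled - 1.96*se_pooled):.2f}-"
          f"{np.exp(pooled + 1.96*se_pooled):.2f})")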
Ates, Gamze; Favyts, Dorien; Hendriks, Giel; Derr, Remco; Mertens, Birgit; Verschaeve, Luc; Rogiers, Vera; Y Doktorova, Tatyana
2016-11-01
To ensure safety for humans, it is essential to characterize the genotoxic potential of new chemical entities, such as pharmaceutical and cosmetic substances. In a first tier, a battery of in vitro tests is recommended by international regulatory agencies. However, these tests suffer from inadequate specificity: compounds may be wrongly categorized as genotoxic, resulting in unnecessary, time-consuming, and expensive in vivo follow-up testing. In the last decade, novel assays (notably, reporter-based assays) have been developed in an attempt to overcome these drawbacks. Here, we have investigated the performance of two in vitro reporter-based assays, Vitotox and ToxTracker. A set of reference compounds was selected to span a variety of mechanisms of genotoxic action and applicability domains (e.g., pharmaceutical and cosmetic ingredients). Combining the performance of the two assays, we achieved 93% sensitivity and 79% specificity for prediction of genotoxicity for this set of compounds. Both assays permit quick high-throughput analysis of drug candidates, while requiring only small quantities of the test substances. Our study shows that these two assays, when combined, can be a reliable method for assessment of genotoxicity hazard. Copyright © 2016 Elsevier B.V. All rights reserved.
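As a toy illustration of how combined-assay performance figures of this kind arise, the sketch below computes sensitivity and specificity under an "either assay positive" rule; the reference classifications and assay calls are invented, not the study's data.

    # Toy sensitivity/specificity computation for two combined binary assays.
    import numpy as np

    truth = np.array([1, 1, 1, 1, 0, 0, 0, 0, 0, 0])   # 1 = genotoxic reference
    assay_a = np.array([1, 1, 0, 1, 0, 0, 1, 0, 0, 0])
    assay_b = np.array([1, 0, 1, 1, 0, 0, 0, 1, 0, 0])

    combined = assay_a | assay_b                         # positive if either flags
    tp = np.sum((combined == 1) & (truth == 1))
    tn = np.sum((combined == 0) & (truth == 0))
    sensitivity = tp / truth.sum()
    specificity = tn / (truth == 0).sum()
    print(f"sensitivity {sensitivity:.0%}, specificity {specificity:.0%}")

The "either positive" combination raises sensitivity at some cost in specificity, which is the usual trade-off when assays are battery-combined.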
Analysis of impact/impulse noise for predicting noise induced hearing loss
NASA Astrophysics Data System (ADS)
Vipperman, Jeffrey S.; Prince, Mary M.; Flamm, Angela M.
2003-04-01
Studies indicate that the statistical properties and temporal structure of the sound signal are important in determining the extent of hearing hazard. As part of a pilot study to examine hearing conservation program effectiveness, NIOSH collected noise samples of impact noise sources in an automobile stamping plant, focusing on jobs with peak sound levels (Lpk) greater than 120 dB. Digital tape recordings of sounds were collected using a Type I Precision Sound Level Meter and microphone connected to a DAT tape recorder. The events were archived and processed as .wav files to extract single events of interest on CD-R media and CD audio media. A preliminary analysis of sample waveform files was conducted to characterize each event using metrics such as the number of impulses per unit time, the repetition rate or temporal pattern of these impulses, index of peakedness, crest factor, kurtosis, coefficient of kurtosis, rise time, fall time, and peak time. The spectrum, duration, and inverse of duration for each waveform were also computed. Finally, the data were evaluated with the Auditory Hazard Assessment Algorithm (AHAAH). Improvements to data collection for a future study examining different strategies for evaluating industrial noise exposure will be discussed.
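To make the single-event metrics concrete, here is a brief sketch computing crest factor, kurtosis, and rise time on a synthetic impulse; a recorded waveform read with scipy.io.wavfile.read would be processed the same way. The signal, threshold, and parameters are illustrative, not the NIOSH analysis settings.

    # Sketch of impulse-noise event metrics on a synthetic decaying impulse.
    import numpy as np
    from scipy.stats import kurtosis

    fs = 48000                                    # sample rate, Hz
    t = np.arange(0, 0.2, 1/fs)
    x = np.exp(-t/0.01) * np.sin(2*np.pi*1000*t)  # decaying 1 kHz impulse
    x += 0.001 * np.random.randn(t.size)          # low-level background noise

    peak = np.max(np.abs(x))
    rms = np.sqrt(np.mean(x**2))
    crest_factor_db = 20 * np.log10(peak / rms)
    kurt = kurtosis(x, fisher=False)              # >3 indicates impulsiveness

    i_peak = np.argmax(np.abs(x))
    above = np.abs(x) >= 0.1 * peak               # assumed 10%-of-peak threshold
    rise_time = (i_peak - np.argmax(above)) / fs  # first crossing to peak

    print(f"crest factor {crest_factor_db:.1f} dB, kurtosis {kurt:.1f}, "
          f"rise time {rise_time*1e3:.2f} ms")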
Near field ice detection using infrared based optical imaging technology
NASA Astrophysics Data System (ADS)
Abdel-Moati, Hazem; Morris, Jonathan; Zeng, Yousheng; Corie, Martin Wesley; Yanni, Victor Garas
2018-02-01
If not detected and characterized, icebergs can potentially pose a hazard to oil and gas exploration, development and production operations in arctic environments as well as commercial shipping channels. In general, very large bergs are tracked and predicted using models or satellite imagery. Small and medium bergs are detectable using conventional marine radar. As icebergs decay they shed bergy bits and growlers, which are much smaller and more difficult to detect. Their low profile above the water surface, in addition to occasional relatively high seas, makes them invisible to conventional marine radar. Visual inspection is the most common method used to detect bergy bits and growlers, but the effectiveness of visual inspections is reduced by operator fatigue and low light conditions. The potential hazard from bergy bits and growlers is further increased by short detection range (<1 km). As such, there is a need for robust and autonomous near-field detection of such smaller icebergs. This paper presents a review of iceberg detection technology and explores applications for infrared imagers in the field. Preliminary experiments are performed and recommendations are made for future work, including a proposed imager design which would be suited for near field ice detection.
Seismic Hazard and Ground Motion Characterization at the Itoiz Dam (Northern Spain)
NASA Astrophysics Data System (ADS)
Rivas-Medina, A.; Santoyo, M. A.; Luzón, F.; Benito, B.; Gaspar-Escribano, J. M.; García-Jerez, A.
2012-08-01
This paper presents a new hazard-consistent ground motion characterization of the Itoiz dam site, located in Northern Spain. Firstly, we propose a methodology with different levels of approximation to the expected ground motion at the dam site. Secondly, we apply this methodology taking into account the particular characteristics of the site and of the dam. Hazard calculations were performed following the Probabilistic Seismic Hazard Assessment method using a logic tree, which accounts for different seismic source zonings and different ground-motion attenuation relationships. The study was done in terms of peak ground acceleration and several spectral accelerations at periods coinciding with the fundamental vibration periods of the dam. In order to estimate these ground motions we consider two different dam conditions: when the dam is empty (T = 0.1 s) and when it is filled with water to its maximum capacity (T = 0.22 s). Additionally, the seismic hazard analysis is done for two return periods: 975 years, related to the project earthquake, and 4,975 years, identified with an extreme event. Soil conditions at the site of the dam were also taken into account. Through the proposed methodology we deal with different ways of characterizing ground motion at the study site. In a first step, we obtain the uniform hazard response spectra for the two return periods. In a second step, a disaggregation analysis is done in order to obtain the controlling earthquakes that can affect the dam. Subsequently, we characterize the ground motion at the dam site in terms of specific response spectra for target motions defined by the expected values SA(T) at T = 0.1 and 0.22 s for the return periods of 975 and 4,975 years, respectively. Finally, synthetic acceleration time histories for earthquake events matching the controlling parameters are generated using the discrete wave-number method and subsequently analyzed. Because of the short relative distances between the controlling earthquakes and the dam site, we considered finite sources in these computations. We conclude that directivity effects should be taken into account as an important variable in this kind of study of ground motion characteristics.
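For orientation, a minimal PSHA-style hazard-curve sketch follows: a single source with truncated Gutenberg-Richter recurrence and a toy lognormal attenuation relation, integrated to annual exceedance rates. Every coefficient is invented and the calculation is far simpler than the logic-tree study described above.

    # Minimal single-source probabilistic seismic hazard sketch (toy values).
    import numpy as np
    from scipy.stats import norm

    rate_m5 = 0.05                       # annual rate of M >= 5.0 on the source
    b = 1.0                              # Gutenberg-Richter b-value
    m_min, m_max, dist = 5.0, 7.5, 20.0  # magnitude bounds, source distance (km)

    mags = np.linspace(m_min, m_max, 200)
    beta = b * np.log(10.0)
    # Truncated exponential (G-R) magnitude density, integrating to 1.
    pdf = beta * np.exp(-beta * (mags - m_min)) / (1.0 - np.exp(-beta * (m_max - m_min)))

    def ln_pga(m, r):
        """Toy attenuation relation: mean ln PGA (g) and its sigma."""
        return -3.5 + 0.9 * m - 1.1 * np.log(r + 10.0), 0.6

    mu, sigma = ln_pga(mags, dist)
    dm = mags[1] - mags[0]
    for a in (0.05, 0.1, 0.2, 0.4):
        p_exceed = norm.sf((np.log(a) - mu) / sigma)      # P(PGA > a | m)
        lam = rate_m5 * np.sum(p_exceed * pdf) * dm       # annual exceedance rate
        print(f"PGA {a:.2f} g: rate {lam:.2e}/yr, return period {1/lam:,.0f} yr")

Reading the curve at the 975-yr and 4,975-yr return periods mirrors how the project and extreme-event ground motions are extracted in the study.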
Multi-hazard risk analysis using the FP7 RASOR Platform
NASA Astrophysics Data System (ADS)
Koudogbo, Fifamè N.; Duro, Javier; Rossi, Lauro; Rudari, Roberto; Eddy, Andrew
2014-10-01
Climate change challenges our understanding of risk by modifying hazards and their interactions. Sudden increases in population and rapid urbanization are changing exposure to risk around the globe, making impacts harder to predict. Despite the availability of operational mapping products, there is no single tool to integrate diverse data and products across hazards, update exposure data quickly and make scenario-based predictions to support both short and long-term risk-related decisions. RASOR (Rapid Analysis and Spatialization Of Risk) will develop a platform to perform multi-hazard risk analysis for the full cycle of disaster management, including targeted support to critical infrastructure monitoring and climate change impact assessment. A scenario-driven query system simulates future scenarios based on existing or assumed conditions and compares them with historical scenarios. RASOR will thus offer a single work environment that generates new risk information across hazards, across data types (satellite EO, in-situ), across user communities (global, local, climate, civil protection, insurance, etc.) and across the world. Five case study areas are considered within the project, located in Haiti, Indonesia, Netherlands, Italy and Greece. Initially available over those demonstration areas, RASOR will ultimately offer global services to support in-depth risk assessment and full-cycle risk management.
NASA Astrophysics Data System (ADS)
Gochis, E. E.; Lechner, H. N.; Brill, K. A.; Lerner, G.; Ramos, E.
2014-12-01
Graduate students at Michigan Technological University developed the "Landslides!" activity to engage middle and high school students participating in summer engineering programs in a hands-on exploration of geologic engineering and STEM (Science, Technology, Engineering and Math) principles. The inquiry-based lesson plan is aligned to the Next Generation Science Standards and is appropriate for 6th-12th grade classrooms. During the activity, students focus on the factors contributing to landslide development and the engineering practices used to mitigate slope-stability hazards. Students begin by comparing different soil types and by developing predictions of how sediment type may contribute to differences in slope stability. Working in groups, students then build tabletop hill-slope models from the various materials in order to engage in evidence-based reasoning and test their predictions by adding groundwater until each group's modeled slope fails. Lastly, students elaborate on their understanding of landslides by designing 'engineering solutions' to mitigate the hazards observed in each model. Post-evaluations from students demonstrate that they enjoyed the hands-on nature of the activity and the application of engineering principles to mitigate a modeled natural hazard.
NASA Astrophysics Data System (ADS)
Eble, M. C.; uslu, B. U.; Wright, L.
2013-12-01
Synthetic tsunamis generated from source regions around the Pacific Basin are analyzed in terms of their relative impact on United States coastal locations. The region of tsunami origin is as important as the expected magnitude and the predicted inundation for understanding tsunami hazard. The NOAA Center for Tsunami Research has developed high-resolution tsunami models capable of predicting tsunami arrival time and wave amplitude at each location. These models have been used to conduct tsunami hazard assessments of maximum impact and inundation for use by local communities in education and evacuation-map development. Hazard assessment studies conducted for Los Angeles, San Francisco, Crescent City, Hilo, and Apra Harbor are combined with results of tsunami forecast model development at each of seventy-five locations. A complete hazard assessment identifies every possible tsunami variation from a pre-computed propagation database. Study results indicate that the Eastern Aleutian Islands and Alaska are the most likely regions to produce the largest impact on the West Coast of the United States, while the East Philippines and Mariana trench regions most impact Apra Harbor, Guam. Hawaii appears to be impacted equally by tsunamis from South America, Alaska, and the Kuril Islands.
A Model for Generating Multi-hazard Scenarios
NASA Astrophysics Data System (ADS)
Lo Jacomo, A.; Han, D.; Champneys, A.
2017-12-01
Communities in mountain areas are often subject to risk from multiple hazards, such as earthquakes, landslides, and floods. Each hazard has its own rate of onset, duration, and return period. Multiple hazards tend to complicate the combined risk due to their interactions. Prioritising interventions for minimising risk in this context is challenging. We developed a probabilistic multi-hazard model to help inform decision making in multi-hazard areas. The model is applied to a case study region in the Sichuan province in China, using information from satellite imagery and in-situ data. The model is not intended as a predictive model, but rather as a tool which takes stakeholder input and can be used to explore plausible hazard scenarios over time. By using a Monte Carlo framework and varying uncertain parameters for each of the hazards, the model can be used to explore the effect of different mitigation interventions aimed at reducing the disaster risk within an uncertain hazard context.
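A hedged sketch of the Monte Carlo idea follows: hazard occurrences are sampled year by year, with one illustrative interaction (an earthquake elevating that year's landslide probability). The rates, losses, and interaction factor are invented stand-ins, not the case-study values.

    # Monte Carlo multi-hazard scenario sketch with a simple interaction.
    import numpy as np

    rng = np.random.default_rng(42)
    years, n_runs = 50, 2000
    rate_eq, rate_flood, p_slide = 0.02, 0.10, 0.01   # illustrative annual rates

    losses = np.zeros(n_runs)
    for i in range(n_runs):
        for _ in range(years):
            eq = rng.poisson(rate_eq) > 0
            flood = rng.poisson(rate_flood) > 0
            # Interaction: seismic shaking destabilizes slopes for that year.
            slide = rng.random() < (p_slide * 10 if eq else p_slide)
            losses[i] += 5.0 * eq + 1.0 * flood + 2.0 * slide   # loss units

    print(f"mean {years}-yr loss {losses.mean():.1f}, "
          f"95th percentile {np.percentile(losses, 95):.1f}")

Re-running with, say, p_slide halved mimics how a mitigation intervention would be explored against the unmitigated baseline.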
Suspected endocrine disrupting substances (EDS) are now being evaluated by several regulatory authorities. A debate is in progress about whether or not EDS can be adequately assessed by following the standard approach involving identification of intrinsic hazards, prediction of e...
An international workshop was held in 2006 to evaluate experimental techniques for hazard identification and hazard characterization of sensitizing agents in terms of their ability to produce data, including dose–response information, to inform risk assessment. Human testing to i...
Thousands of compounds in the environment have not been characterized for developmental neurotoxicity (DNT) hazard. To address this issue, methods to screen compounds rapidly for DNT hazard evaluation are necessary and are being developed for key neurodevelopmental processes. In...
EPA is charged with assessing the risks of both acute and chronic exposures to hazardous air pollutants (HAPs). The emissions from sources of HAPs are often characterized as temporally-averaged values; however, patterns of exposure not captured in such measures may infl...
29 CFR 1926.407 - Hazardous (classified) locations.
Code of Federal Regulations, 2012 CFR
2012-07-01
... Electrical Code, lists or defines hazardous gases, vapors, and dusts by “Groups” characterized by their... the class, group, and operating temperature or temperature range, based on operation in a 40-degree C... be marked to indicate the group. (C) Fixed general-purpose equipment in Class I locations, other than...
Investigations at hazardous waste sites and sites of chemical spills often require on-site measurements and sampling activities to assess the type and extent of contamination. This document is a compilation of sampling methods and materials suitable to address most needs that ari...
National Surveys of Multiple Environmental Hazards to Young Children in Homes and Child Care Centers
The Department of Housing and Urban Development (HUD) has teamed with other federal agencies to characterize the exposure of young children to multiple environmental hazards in two main indoor environments, homes and daycare centers. Under the co-sponsorship of HUD and the Nationa...
NASA Astrophysics Data System (ADS)
De Agostini, A.; Floris, M.; Pasquali, P.; Barbieri, M.; Cantone, A.; Riccardi, P.; Stevan, G.; Genevois, R.
2012-04-01
In the last twenty years, Differential Synthetic Aperture Radar Interferometry (DInSAR) techniques have been widely used to investigate geological processes, such as subsidence, earthquakes and landslides, through the evaluation of earth surface displacements caused by these processes. In the study of mass movements, the contribution of interferometry can be limited by the acquisition geometry of RADAR images and the rough morphology of mountain and hilly regions, which represent typical landslide-prone areas. In this study, advanced DInSAR techniques (i.e., the Small Baseline Subset and Persistent Scatterers techniques), available in the SARscape software, are used. These methods involve the use of multiple acquisition stacks (large SAR temporal series), allowing improvements and refinements in landslide identification, characterization and hazard evaluation at the basin scale. The potential and limits of the above-mentioned techniques are outlined and discussed. The study area is the Agno Valley, located in the north-eastern sector of the Italian Alps and included in the Vicenza Province (Veneto Region, Italy). This area and the entire Vicenza Province were hit by an exceptional rainfall event in November 2010 that triggered more than 500 slope instabilities. The main aim of the work is to verify whether spatial information available before the rainfall event, including ERS and ENVISAT RADAR data from 1992 to 2010, could have predicted the landslides that occurred in the study area, in order to implement an effective forecasting model. In the first step of the work, a susceptibility analysis is carried out using the landslide dataset from the IFFI project (Inventario Fenomeni Franosi in Italia, the Italian Landslide Inventory) and related predisposing factors, which consist of morphometric (elevation, slope, aspect and curvature) and non-morphometric (land use, distance from roads and distance from rivers) factors available from the Veneto Region spatial database. Then, to test the prediction, the results of the susceptibility analysis are compared with the locations of landslides that occurred in the study area during the November 2010 rainfall event. In the second step, results of the DInSAR analysis (displacement maps over time) are added to the prediction analysis to build up a map containing both spatial and temporal information on landslides and, as in the previous case, the prediction is tested using the November 2010 instabilities dataset. Comparison of the two tests allows evaluation of the contribution of interferometric techniques. Finally, morphometric factors and interferometric RADAR data are combined to design a preliminary analysis scheme that provides information on the possible use of DInSAR techniques in landslide hazard evaluation of a given area.
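As a pointer to how such a susceptibility analysis can be set up, the sketch below fits a logistic regression to synthetic morphometric factors; the predictors, coefficients, and data are invented stand-ins for the IFFI inventory and the Veneto Region spatial database, and the study's actual statistical method may differ.

    # Sketch of a landslide susceptibility model on synthetic predisposing factors.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n = 2000
    slope = rng.uniform(0, 45, n)                 # degrees
    elevation = rng.uniform(100, 1500, n)         # metres
    road_dist = rng.uniform(0, 2000, n)           # metres
    # Synthetic truth: steeper, road-proximal cells fail more often.
    p = 1 / (1 + np.exp(-(0.12 * slope - 0.002 * road_dist - 2.0)))
    landslide = rng.random(n) < p

    X = np.column_stack([slope, elevation, road_dist])
    X_tr, X_te, y_tr, y_te = train_test_split(X, landslide, random_state=0)
    model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
    susceptibility = model.predict_proba(X_te)[:, 1]   # per-cell susceptibility
    print("held-out accuracy:", model.score(X_te, y_te))

Adding DInSAR displacement rates as an extra column is the analogue of the study's second step, and comparing held-out performance with and without that column mirrors its test of the interferometric contribution.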
AGU:Comments Requested on Natural Hazards Position Statement
NASA Astrophysics Data System (ADS)
2004-11-01
Natural hazards (earthquakes, floods, hurricanes, landslides, meteors, space weather, tornadoes, volcanoes, and other geophysical phenomena) are an integral component of our dynamic planet. These can have disastrous effects on vulnerable communities and ecosystems. By understanding how and where hazards occur, what causes them, and what circumstances increase their severity, we can develop effective strategies to reduce their impact. In practice, mitigating hazards requires addressing issues such as real-time monitoring and prediction, emergency preparedness, public education and awareness, post-disaster recovery, engineering, construction practices, land use, and building codes. Coordinated approaches involving scientists, engineers, policy makers, builders, lenders, insurers, news media, educators, relief organizations, and the public are therefore essential to reducing the adverse effects of natural hazards.
Validating the Proton Prediction System (PPS)
2006-12-01
... hazards for astronauts on the missions to the Moon and Mars ... events limited the useful PPS test cases to 78 of the 101 solar flares. Although they can be serious radiation hazards (Reames, 1999), PPS does not predict the E>10MeV peaks often seen during the ... The proton fluence model (Feynman et al., 2002) fits observed SEP event fluences of E>10MeV: J(E>10MeV) = 347 x (Fx)^0.941 (Eq. 3), where Fx is the GOES 1-8 A X-ray flare half-power fluence in J cm^-2.
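Assuming the de-interleaved reconstruction of Eq. (3) above is correct, it transcribes directly into code; the input value is purely illustrative.

    def proton_fluence_e10(fx):
        """E>10 MeV proton fluence from the GOES 1-8 A X-ray half-power
        fluence Fx (J/cm^2), per the reconstructed Eq. (3)."""
        return 347.0 * fx ** 0.941

    print(proton_fluence_e10(0.01))   # example X-ray fluence of 0.01 J/cm^2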
NASA Technical Reports Server (NTRS)
Stone, Henry W.; Edmonds, Gary O.
1995-01-01
Remotely controlled mobile robot used to locate, characterize, identify, and eventually mitigate incidents involving hazardous-materials spills/releases. Possesses number of innovative features, allowing it to perform mission-critical functions such as opening and unlocking doors and sensing for hazardous materials. Provides safe means for locating and identifying spills and eliminates risks of injury associated with use of manned entry teams. Current version of vehicle, called HAZBOT III, also features unique mechanical and electrical design enabling vehicle to operate safely within combustible atmosphere.
Wastewater Characterization and Hazardous Waste Survey, Reese Air Force Base, Texas
1988-04-01
drain. The pH of caustic soda generally classifies this waste as hazardous. h. Personnel from the CE Power Production shop are neutralizing spent ... been changed out. A disposal practice for this waste will be formulated upon determination of whether or not the spent solvent is hazardous. Shop ... sampling has been performed to determine the characteristics of this waste. Spent rags are also thrown in the trash. Personnel also perform cadmium
Probabilistic Surface Characterization for Safe Landing Hazard Detection and Avoidance (HDA)
NASA Technical Reports Server (NTRS)
Johnson, Andrew E. (Inventor); Ivanov, Tonislav I. (Inventor); Huertas, Andres (Inventor)
2015-01-01
Apparatuses, systems, computer programs and methods for performing hazard detection and avoidance for landing vehicles are provided. Hazard assessment takes into consideration the geometry of the lander. Safety probabilities are computed for a plurality of pixels in a digital elevation map. The safety probabilities are combined for pixels associated with one or more aim points and orientations. A worst case probability value is assigned to each of the one or more aim points and orientations.
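A toy numpy rendering of the aggregation step described in the abstract follows: per-pixel safety probabilities are combined over a square lander footprint and the worst-case value is assigned per aim point. The map, footprint size, and combination rule are invented for illustration and orientation is ignored.

    # Toy hazard detection and avoidance aggregation over a safety-probability map.
    import numpy as np

    rng = np.random.default_rng(1)
    safety = rng.uniform(0.9, 1.0, size=(50, 50))    # per-pixel P(safe) from a DEM
    half = 2                                          # footprint half-width, pixels

    best_point, best_p = None, -1.0
    for i in range(half, 50 - half):
        for j in range(half, 50 - half):
            footprint = safety[i-half:i+half+1, j-half:j+half+1]
            worst = footprint.min()                   # worst-case pixel under lander
            if worst > best_p:
                best_point, best_p = (i, j), worst
    print(f"best aim point {best_point} with worst-case P(safe) {best_p:.3f}")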
EPAs ToxCast Research Program: Developing Predictive Bioactivity Signatures for Chemicals
The international community needs better predictive tools for assessing the hazards and risks of chemicals. It is technically feasible to collect bioactivity data on virtually all chemicals of potential concern. ToxCast is providing a proof of concept for obtaining predictive, b...
Fire and explosion hazards related to the industrial use of potassium and sodium methoxides.
Kwok, Q; Acheson, B; Turcotte, R; Janès, A; Marlair, G
2013-04-15
Sodium and potassium methoxides are used as intermediates for a variety of products in several industrial applications. For example, current production of so-called "1G-biodiesel" relies on a catalytic reaction called transesterification. This reaction transforms lipid resources from biomass materials into fatty acid methyl and ethyl esters. 1G-biodiesel processes involve the use of methanol, caustic potash (KOH), and caustic soda (NaOH), for which the hazards are well characterized. The more recent introduction of the direct catalysts CH3OK and CH3ONa may potentially introduce new process hazards. From an examination of existing MSDSs concerning these products, it appears that no consensus currently exists on their intrinsic hazardous properties. Recently, l'Institut National de l'Environnement Industriel et des Risques (France) and the Canadian Explosives Research Laboratory (Canada) embarked upon a joint effort to better characterize the thermal hazards associated with these catalysts. This work employs conventional tests for water reactivity, self-heating as an ignition source, and fire and dust explosion hazards, using isothermal nano-calorimetry, isothermal basket tests, the Fire Propagation Apparatus, and a standard 20 L sphere, respectively. It was found that these chemicals can become self-reactive close to room temperature under specific conditions and can generate explosible dusts. Copyright © 2013 Crown. Published by Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Qiu, Sihang; Chen, Bin; Wang, Rongxiao; Zhu, Zhengqiu; Wang, Yuan; Qiu, Xiaogang
2018-04-01
Hazardous gas leak accidents pose a potential threat to human beings. Predicting atmospheric dispersion and estimating its source are becoming increasingly important in emergency management. Current dispersion prediction and source estimation models cannot satisfy the requirements of emergency management because they do not offer both high efficiency and high accuracy at the same time. In this paper, we develop a fast and accurate dispersion prediction and source estimation method based on an artificial neural network (ANN), particle swarm optimization (PSO) and expectation maximization (EM). The novel method uses a large number of pre-determined scenarios to train the ANN for dispersion prediction, so that the ANN can predict concentration distributions accurately and efficiently. PSO and EM are applied to estimate the source parameters, which effectively accelerates the process of convergence. The method is verified against the Indianapolis field study with an SF6 release source. The results demonstrate the effectiveness of the method.
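To convey the two-stage structure, here is a hedged sketch: an MLP surrogate trained on toy Gaussian-plume scenarios, then a plain particle swarm searching for source parameters that reproduce "observed" sensor readings. The plume model, sensor layout, and every parameter are invented, and the EM step of the paper is omitted entirely.

    # Sketch: ANN dispersion surrogate + PSO source estimation (toy plume model).
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)
    sensors = np.array([[50.0, 0.0], [100.0, 10.0], [150.0, -10.0]])  # (x, y) m

    def plume(q, x0, y0):
        """Toy steady Gaussian plume: concentrations at the sensor locations."""
        dx = np.maximum(sensors[:, 0] - x0, 1.0)
        sy = 0.2 * dx
        return q / (2 * np.pi * sy**2) * np.exp(-(sensors[:, 1] - y0)**2 / (2 * sy**2))

    # Train the surrogate on random (rate, location) scenarios.
    params = np.column_stack([rng.uniform(1, 10, 5000),     # release rate q
                              rng.uniform(0, 40, 5000),     # source x0
                              rng.uniform(-20, 20, 5000)])  # source y0
    conc = np.array([plume(*p) for p in params])
    ann = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500,
                       random_state=0).fit(params, conc)

    observed = plume(5.0, 20.0, 5.0)                        # synthetic "truth"

    # Plain PSO over the source parameters, using the fast ANN surrogate.
    n, w, c1, c2 = 40, 0.7, 1.5, 1.5
    lo, hi = np.array([1, 0, -20]), np.array([10, 40, 20])
    pos = rng.uniform(lo, hi, (n, 3))
    vel = np.zeros_like(pos)
    pbest, pcost = pos.copy(), np.full(n, np.inf)
    gbest, gcost = None, np.inf
    for _ in range(60):
        cost = np.sum((ann.predict(pos) - observed)**2, axis=1)
        better = cost < pcost
        pbest[better], pcost[better] = pos[better], cost[better]
        if pcost.min() < gcost:
            gcost, gbest = pcost.min(), pbest[np.argmin(pcost)].copy()
        vel = (w * vel + c1 * rng.random((n, 3)) * (pbest - pos)
               + c2 * rng.random((n, 3)) * (gbest - pos))
        pos = np.clip(pos + vel, lo, hi)
    print("estimated (q, x0, y0):", np.round(gbest, 2))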
Determining the Financial Impact of Flood Hazards in Ungaged Basins
NASA Astrophysics Data System (ADS)
Cotterman, K. A.; Gutenson, J. L.; Pradhan, N. R.; Byrd, A.
2017-12-01
Many portions of the Earth lack adequate authoritative or in situ data that are of great value in determining natural hazard vulnerability from both anthropogenic and physical perspectives. Such locations include the majority of developing nations, which do not possess adequate warning systems and protective infrastructure. The lack of warning and protection from natural hazards makes these nations vulnerable to the destructive power of events such as floods. The goal of this research is to demonstrate an initial workflow with which to characterize flood financial hazards using global datasets and crowd-sourced, non-authoritative data in ungaged river basins. This workflow includes the hydrologic and hydraulic response of the watershed to precipitation, characterized by the physics-based Gridded Surface-Subsurface Hydrologic Analysis (GSSHA) model. In addition, data infrastructure and resources are available to approximate the human impact of flooding. Open-source, volunteered geographic information (VGI) data can provide global coverage of elements at risk of flooding. Additional valuation mechanisms can then translate flood exposure into percentage and financial damage to each building. The combination of these tools allows the authors to remotely assess flood hazards with minimal computational, temporal, and financial overhead. This combination of deterministic and stochastic modeling provides the means to quickly characterize watershed flood vulnerability and will allow emergency responders and planners to better understand the implications of flooding, both spatially and financially. In a planning, real-time, or forecasting scenario, the system will assist the user in understanding basin flood vulnerability and increasing community resiliency and preparedness.
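One simple way to picture the valuation mechanism is a depth-damage curve applied to modeled depths at building footprints; the curve points and building values below are generic placeholders, not calibrated figures from the study.

    # Sketch of depth-damage valuation: modeled flood depth -> dollar loss.
    import numpy as np

    depth_pts = np.array([0.0, 0.5, 1.0, 2.0, 4.0])   # flood depth, m
    damage_pts = np.array([0.0, 0.1, 0.3, 0.6, 0.9])  # fraction of building value

    def damage_fraction(depth_m):
        return np.interp(depth_m, depth_pts, damage_pts)

    building_depths = np.array([0.2, 1.5, 3.0])        # from the hydraulic model
    building_values = np.array([120e3, 80e3, 250e3])   # replacement values, USD
    loss = damage_fraction(building_depths) * building_values
    print(f"total estimated loss: ${loss.sum():,.0f}")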
Developing seismogenic source models based on geologic fault data
Haller, Kathleen M.; Basili, Roberto
2011-01-01
Calculating seismic hazard usually requires input that includes seismicity associated with known faults, historical earthquake catalogs, geodesy, and models of ground shaking. This paper will address the input generally derived from geologic studies that augment the short historical catalog to predict ground shaking at time scales of tens, hundreds, or thousands of years (e.g., SSHAC 1997). A seismogenic source model, terminology we adopt here for a fault source model, includes explicit three-dimensional faults deemed capable of generating ground motions of engineering significance within a specified time frame of interest. In tectonically active regions of the world, such as near plate boundaries, multiple seismic cycles span a few hundred to a few thousand years. In contrast, in less active regions hundreds of kilometers from the nearest plate boundary, seismic cycles generally are thousands to tens of thousands of years long. Therefore, one should include sources having both longer recurrence intervals and possibly older times of most recent rupture in less active regions of the world rather than restricting the model to include only Holocene faults (i.e., those with evidence of large-magnitude earthquakes in the past 11,500 years) as is the practice in tectonically active regions with high deformation rates. During the past 15 years, our institutions independently developed databases to characterize seismogenic sources based on geologic data at a national scale. Our goal here is to compare the content of these two publicly available seismogenic source models compiled for the primary purpose of supporting seismic hazard calculations by the Istituto Nazionale di Geofisica e Vulcanologia (INGV) and the U.S. Geological Survey (USGS); hereinafter we refer to the two seismogenic source models as INGV and USGS, respectively. This comparison is timely because new initiatives are emerging to characterize seismogenic sources at the continental scale (e.g., SHARE in the Euro-Mediterranean, http://www.share-eu.org/; EMME in the Middle East, http://www.emme-gem.org/) and global scale (e.g., GEM, http://www.globalquakemodel.org/; Anonymous 2008). To some extent, each of these efforts is still trying to resolve the level of optimal detail required for this type of compilation. The comparison we provide defines a common standard for consideration by the international community for future regional and global seismogenic source models by identifying the necessary parameters that capture the essence of geological fault data in order to characterize seismogenic sources. In addition, we inform potential users of differences in our usage of common geological/seismological terms to avoid inappropriate use of the data in our models and provide guidance to convert the data from one model to the other (for detailed instructions, see the electronic supplement to this article). Applying our recommendations will permit probabilistic seismic hazard assessment codes to run seamlessly using either seismogenic source input. The USGS and INGV database schema compare well at a first-level inspection. Both databases contain a set of fields representing generalized fault three-dimensional geometry and additional fields that capture the essence of past earthquake occurrences. Nevertheless, there are important differences. When we further analyze supposedly comparable fields, many are defined differently. These differences would cause anomalous results in hazard prediction if one assumes the values are similarly defined. 
The data, however, can be made fully compatible using simple transformations.
Crabbe, Helen; Fletcher, Tony; Close, Rebecca; Watts, Michael J; Ander, E Louise; Smedley, Pauline L; Verlander, Neville Q; Gregory, Martin; Middleton, Daniel R S; Polya, David A; Studden, Mike; Leonardi, Giovanni S
2017-12-01
Approximately one million people in the UK are served by private water supplies (PWS) where connection to the main municipal water supply system is not practical or where a PWS is the preferred option. Chronic exposure to contaminants in PWS may have adverse effects on health. South West England is an area with elevated arsenic concentrations in groundwater, and over 9000 domestic dwellings here are supplied by PWS. There remains uncertainty as to the extent of the population exposed to arsenic (As) and the factors predicting such exposure. We describe a hazard assessment model based on simplified geology with the potential to predict exposure to As in PWS. Households with a recorded PWS in Cornwall were recruited to take part in a water sampling programme from 2011 to 2013. Bedrock geologies were aggregated and classified into nine Simplified Bedrock Geological Categories (SBGC), plus a cross-cutting "mineralized" area. PWS were sampled by random selection within SBGCs, and some 508 households volunteered for the study. Transformations of the data were explored to estimate the distribution of As concentrations for PWS by SBGC. Using the distribution per SBGC, we predict the proportion of dwellings that would be affected by high concentrations and rank the geologies according to hazard. Within most SBGCs, As concentrations were found to have log-normal distributions. Across these areas, the proportion of dwellings predicted to have drinking water over the prescribed concentration value (PCV) for As ranged from 0% to 20%. From these results, a pilot predictive model was developed to calculate the proportion of PWS above the PCV for As; the resulting hazard ranking supports local decision making and prioritization. With further development and testing, this can help local authorities predict the number of dwellings that might fail the PCV for As, based on bedrock geology. The model presented here for Cornwall could be applied in areas with similar geologies. Application of the method requires independent validation and further groundwater-derived PWS sampling on other geological formations.
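The distributional step lends itself to a short sketch: fit a log-normal to As concentrations within one SBGC and predict the proportion exceeding the PCV. The sample below is synthetic, and the 10 ug/L threshold is quoted as the usual UK drinking-water value; treat both as assumptions.

    # Sketch: log-normal fit per geological category, fraction above the PCV.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    as_ugl = rng.lognormal(mean=0.5, sigma=1.2, size=120)  # stand-in As sample

    mu = np.log(as_ugl).mean()
    sigma = np.log(as_ugl).std(ddof=1)
    pcv = 10.0                                             # ug/L (assumed PCV)
    frac_over = stats.norm.sf((np.log(pcv) - mu) / sigma)
    print(f"predicted {frac_over:.1%} of PWS above the {pcv:g} ug/L PCV")

Repeating this per SBGC and sorting the predicted exceedance fractions reproduces the hazard-ranking idea described above.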
29 CFR 1926.450 - Scope, application and definitions applicable to this subpart.
Code of Federal Regulations, 2010 CFR
2010-07-01
... one who is capable of identifying existing and predictable hazards in the surroundings or working... locking together the tubes of a tube and coupler scaffold. Crawling board (chicken ladder) means a... alternative designs, materials or methods to protect against a hazard which the employer can demonstrate will...
Development of sinkholes resulting from man's activities in the Eastern United States
Newton, John G.
1987-01-01
Alternatives that allow avoiding or minimizing sinkhole hazards are most numerous when a problem or potential problem is recognized during site evaluation. The number of alternatives declines after the beginning of site development. Where sinkhole development is predictable, zoning of land use can minimize hazards.
The occurrence of arsenic in groundwater is a recognized environmental hazard with worldwide importance and much effort has been focused on surveying and predicting where arsenic occurs. Temporal variability is one aspect of this environmental hazard that has until recently recei...
Prior nonhip limb fracture predicts subsequent hip fracture in institutionalized elderly people.
Nakamura, K; Takahashi, S; Oyama, M; Oshiki, R; Kobayashi, R; Saito, T; Yoshizawa, Y; Tsuchiya, Y
2010-08-01
This 1-year cohort study of nursing home residents revealed that historical fractures of upper limbs or nonhip lower limbs were associated with hip fracture (hazard ratio = 2.14), independent of activities of daily living (ADL), mobility, dementia, weight, and type of nursing home. Prior nonhip fractures are useful for predicting hip fracture in institutional settings. The aim of this study was to evaluate the utility of fracture history for the prediction of hip fracture in nursing home residents. This was a cohort study with a 1-year follow-up. Subjects were 8,905 residents of nursing homes in Niigata, Japan (mean age, 84.3 years). Fracture histories were obtained from nursing home medical records. ADL levels were assessed by caregivers. Hip fracture diagnosis was based on hospital medical records. Subjects had fracture histories of upper limbs (5.0%), hip (14.0%), and nonhip lower limbs (4.6%). Among historical single fractures, only prior fracture of the nonhip lower limbs significantly predicted subsequent fracture (adjusted hazard ratio, 2.43; 95% confidence interval (CI), 1.30-4.57). The stepwise method selected the best model, in which a combined historical fracture at upper limbs or nonhip lower limbs (adjusted hazard ratio, 2.14; 95% CI, 1.30-3.52), dependence, ADL levels, mobility, dementia, weight, and type of nursing home independently predicted subsequent hip fracture. A fracture history at upper or nonhip lower limbs, in combination with other known risk factors, is useful for the prediction of future hip fracture in institutional settings.
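For readers who want the modeling pattern, a hedged sketch follows using the lifelines package, which is simply one common choice for Cox proportional hazards fitting, not necessarily what the authors used; the data frame is fabricated.

    # Sketch of a Cox proportional hazards fit on a fabricated 1-year cohort.
    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(5)
    n = 500
    df = pd.DataFrame({
        "prior_limb_fracture": rng.integers(0, 2, n),
        "adl_dependent": rng.integers(0, 2, n),
        "weight_kg": rng.normal(50, 8, n),
    })
    # Synthetic follow-up: higher hazard with a prior fracture.
    hazard = 0.1 * np.exp(0.76 * df["prior_limb_fracture"])
    df["time"] = rng.exponential(1.0 / hazard).clip(max=1.0)  # censor at 1 year
    df["hip_fracture"] = (df["time"] < 1.0).astype(int)

    cph = CoxPHFitter().fit(df, duration_col="time", event_col="hip_fracture")
    cph.print_summary()   # the exp(coef) column gives adjusted hazard ratios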
40 CFR 280.63 - Initial site characterization.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 26 2010-07-01 2010-07-01 false Initial site characterization. 280.63... Hazardous Substances § 280.63 Initial site characterization. (a) Unless directed to do otherwise by the implementing agency, owners and operators must assemble information about the site and the nature of the...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Durda, J.L.; Suit-Kowalski, L.; Preziosi, D.
1997-12-31
An ecological risk assessment was conducted to evaluate the potential for adverse environmental impacts associated with chemicals released to air as a result of a proposed expansion of a hazardous waste landfill in Ontario. The purpose of the risk assessment was to characterize ecological risks associated with the proposed expansion relative to those associated with the existing landfill and those that would exist if the current landfill was completely closed and background conditions prevailed. The ecological risk assessment was one part of a comprehensive environmental impact assessment of the proposed landfill continuation that was being performed under the requirements of Ontario's Environmental Assessment Act. Air monitoring data from the facility were used to identify a list of 141 chemicals potentially released during landfill continuation, as well as to characterize current emissions and background chemical levels. An ecological risk-based chemical screening process that considered background concentration, source strength, environmental partitioning, bioaccumulation potential, and toxicity was used to select a group of 23 chemicals for detailed evaluation in the ecological risk assessment. Dispersion, deposition, partitioning and bioaccumulation modeling were used to predict potential exposures in ecological receptors. Receptors were selected for evaluation based on regional habitat characteristics, exposure potential, toxicant sensitivity, ecological significance, population status, and societal value. Livestock and agricultural crop and pasture species were key receptors for the assessment, given the highly agricultural nature of the study area. In addition, native wildlife species, including the endangered Henslow's sparrow and the regionally vulnerable pugnose minnow, also were considered.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boissonnade, A; Hossain, Q; Kimball, J
Since the mid-1980s, assessment of the wind and tornado risks at Department of Energy (DOE) high and moderate hazard facilities has been based on the straight wind/tornado hazard curves given in UCRL-53526 (Coats, 1985). These curves were developed using a methodology that utilized a model, developed by McDonald, for severe winds at sub-tornado wind speeds and a separate model, developed by Fujita, for tornado wind speeds. For DOE sites not covered in UCRL-53526, wind and tornado hazard assessments are based on the criteria outlined in DOE-STD-1023-95 (DOE, 1996), utilizing the methodology in UCRL-53526. Subsequent to the publication of UCRL-53526, in a study sponsored by the Nuclear Regulatory Commission (NRC), the Pacific Northwest Laboratory developed tornado wind hazard curves for the contiguous United States, NUREG/CR-4461 (Ramsdell, 1986). Because of the different modeling assumptions and underlying data used to develop the tornado wind information, the wind speeds at specified exceedance levels, at a given location, based on the methodology in UCRL-53526, are different than those based on the methodology in NUREG/CR-4461. In 1997, Lawrence Livermore National Laboratory (LLNL) was funded by the DOE to review the current methodologies for characterizing tornado wind hazards and to develop a state-of-the-art wind/tornado characterization methodology based on probabilistic hazard assessment techniques and current historical wind data. This report describes the process of developing the methodology and the database of relevant tornado information needed to implement the methodology. It also presents the tornado wind hazard curves obtained from the application of the method to DOE sites throughout the contiguous United States.
Hazardous Traffic Event Detection Using Markov Blanket and Sequential Minimal Optimization (MB-SMO)
Yan, Lixin; Zhang, Yishi; He, Yi; Gao, Song; Zhu, Dunyao; Ran, Bin; Wu, Qing
2016-01-01
The ability to identify hazardous traffic events is already considered one of the most effective solutions for reducing the occurrence of crashes. Only certain particular hazardous traffic events have been studied in previous work, mainly based on dedicated video stream data and GPS data. The objective of this study is twofold: (1) the Markov blanket (MB) algorithm is employed to extract the main factors associated with hazardous traffic events; (2) a model is developed to identify hazardous traffic events using driving characteristics, vehicle trajectory, and vehicle position data. Twenty-two licensed drivers were recruited to carry out a natural driving experiment in Wuhan, China, and multi-sensor information data were collected for different types of traffic events. The results indicated that a vehicle's speed, the standard deviation of speed, the standard deviation of skin conductance, the standard deviation of brake pressure, turn signal, the acceleration of steering, the standard deviation of acceleration, and the acceleration in Z (G) have significant influences on hazardous traffic events. The sequential minimal optimization (SMO) algorithm was adopted to build the identification model, and the accuracy of prediction was higher than 86%. Moreover, compared with other detection algorithms, the MB-SMO algorithm ranked best in terms of prediction accuracy. The conclusions can provide reference evidence for the development of dangerous-situation warning products and the design of intelligent vehicles. PMID:27420073
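As a loose analogue of the MB-SMO pipeline, the sketch below ranks features by mutual information (standing in for the Markov-blanket selection, a plainly named substitution) and classifies with scikit-learn's SVC, whose underlying libsvm solver is SMO-type; the driving data are synthetic.

    # Illustrative stand-in for the MB-SMO pipeline on synthetic driving features.
    import numpy as np
    from sklearn.feature_selection import SelectKBest, mutual_info_classif
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(7)
    n = 600
    X = rng.normal(size=(n, 8))          # e.g., speed s.d., brake-pressure s.d.
    y = (X[:, 0] + 0.8 * X[:, 3] + rng.normal(scale=0.8, size=n)) > 1.0

    clf = make_pipeline(StandardScaler(),
                        SelectKBest(mutual_info_classif, k=4),
                        SVC(kernel="rbf"))
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"cross-validated accuracy: {acc:.2%}")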
Zhang, Taolin; Zhou, Xiaodong; Yang, Lizhong
2016-03-05
This work investigated experimentally and theoretically the fire hazards of thermal-insulation materials used in diesel locomotives under different radiation heat fluxes. Based on the experimental results, the critical heat flux for ignition was determined to be 6.15 kW/m² for pure polyurethane and 16.39 kW/m² for aluminum-polyurethane. A theoretical model was established for both to predict fire behavior under different circumstances. The fire behavior of the materials was evaluated based on flashover and the total heat release rate (HRR). Fire hazard levels were classified based on the different experimental results. It was found that the fire resistance of aluminum-polyurethane is much better than that of pure polyurethane under various external heat fluxes. The concentration of toxic pyrolysis volatiles generated from aluminum-polyurethane materials is much higher than that from pure polyurethane materials, especially when the heat flux is below 50 kW/m². A hazard index HI over the peak-width time was proposed based on the combined impact of time and concentrations. The HI predicted by this model agrees with the existing N-gas and FED models generally used in previous research to evaluate fire gas hazard. An integrated model, named HNF, was also proposed to estimate the fire hazards of materials by interpolation and weighted-average calculation.
Liquefaction potential index: Field assessment
Toprak, S.; Holzer, T.L.
2003-01-01
Cone penetration test (CPT) soundings at historic liquefaction sites in California were used to evaluate the predictive capability of the liquefaction potential index (LPI), which was defined by Iwasaki et al. in 1978. LPI combines the depth, thickness, and factor of safety of liquefiable material inferred from a CPT sounding into a single parameter. LPI data from the Monterey Bay region indicate that the probability of surface manifestations of liquefaction is 58% and 93% when LPI equals or exceeds 5 and 15, respectively. LPI values also generally correlate with surface effects of liquefaction, decreasing from a median of 12 for soundings in lateral spreads to 0 for soundings where no surface effects were reported. The index is particularly promising for probabilistic liquefaction hazard mapping, where it may be a useful parameter for characterizing the liquefaction potential of geologic units.
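The index itself is compact enough to show directly. Under the usual Iwasaki formulation, LPI integrates a severity term F = 1 - FS (where FS < 1, else 0) against a linear depth weight w(z) = 10 - 0.5z over the top 20 m; the factor-of-safety profile below is invented for illustration.

    # Sketch of the liquefaction potential index on a hypothetical FS profile.
    import numpy as np

    z = np.arange(0.25, 20.0, 0.5)            # depth, m (0.5 m CPT intervals)
    fs = 0.7 + 0.04 * z                       # hypothetical factor of safety
    f = np.where(fs < 1.0, 1.0 - fs, 0.0)     # liquefaction severity F(z)
    w = 10.0 - 0.5 * z                        # Iwasaki depth weighting w(z)
    lpi = np.sum(f * w) * 0.5                 # integrate F*w over the profile
    print(f"LPI = {lpi:.1f}")  # per the paper, 5 and 15 mark rising probability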
Global Precipitation Measurement (GPM) Ground Validation (GV) Science Implementation Plan
NASA Technical Reports Server (NTRS)
Petersen, Walter A.; Hou, Arthur Y.
2008-01-01
For pre-launch algorithm development and post-launch product evaluation, Global Precipitation Measurement (GPM) Ground Validation (GV) goes beyond direct comparisons of surface rain rates between ground and satellite measurements to provide the means for improving retrieval algorithms and model applications. Three approaches to GPM GV include direct statistical validation (at the surface), precipitation physics validation (in a vertical column), and integrated science validation (4-dimensional). These three approaches support five themes: core satellite error characterization; constellation satellite validation; development of physical models of snow, cloud water, and mixed phase; development of cloud-resolving models (CRM) and land-surface models to bridge observations and algorithms; and development of coupled CRM-land surface modeling for basin-scale water budget studies and natural hazard prediction. This presentation describes the implementation of these approaches.
Ballistic Performance of Porous-Ceramic, Thermal-Protection-Systems
NASA Technical Reports Server (NTRS)
Christiansen, E. L.; Davis, B. A.; Miller, J. E.; Bohl, W. E.; Foreman, C. D.
2009-01-01
Porous-ceramic thermal protection systems are used heavily in current reentry vehicles like the Space Shuttle and are currently being proposed for the next generation of manned spacecraft, Orion. These materials insulate the structural components of a spacecraft against the intense thermal environments of atmospheric reentry. Furthermore, these materials are also highly exposed to space environmental hazards like meteoroid and orbital debris impacts. This paper discusses recent impact testing at up to 9 km/s and findings on the influence of the material equation-of-state on simulations of the impact event, used to characterize the ballistic performance of these materials. These results will be compared with heritage models [1] for these materials developed from testing at lower velocities. Assessments of predicted spacecraft risk based upon these tests and simulations will also be discussed.
Radar research on thunderstorms and lightning
NASA Technical Reports Server (NTRS)
Rust, W. D.; Doviak, R. J.
1982-01-01
Applications of Doppler radar to detection of storm hazards are reviewed. Normal radar sweeps reveal data on reflectivity fields of rain drops, ionized lightning paths, and irregularities in humidity and temperature. Doppler radar permits identification of the targets' speed toward or away from the transmitter through interpretation of the shifts in the microwave frequency. Wind velocity fields can be characterized in three dimensions by the use of two radar units, with a Nyquist limit on the highest wind speeds that may be recorded. Comparisons with models numerically derived from Doppler radar data show substantial agreement in storm formation predictions based on information gathered before the storm. Examples are provided of tornado observations with expanded Nyquist limits, gust fronts, turbulence, lightning and storm structures. Obtaining vertical velocities from reflectivity spectra is discussed.
Physics-Based Hazard Assessment for Critical Structures Near Large Earthquake Sources
NASA Astrophysics Data System (ADS)
Hutchings, L.; Mert, A.; Fahjan, Y.; Novikova, T.; Golara, A.; Miah, M.; Fergany, E.; Foxall, W.
2017-09-01
We argue that for critical structures near large earthquake sources: (1) the ergodic assumption, recent history, and simplified descriptions of the hazard are not appropriate to rely on for earthquake ground motion prediction and can lead to a mis-estimation of the hazard and risk to structures; (2) a physics-based approach can address these issues; (3) a physics-based source model must be provided to generate realistic phasing effects from finite rupture and model near-source ground motion correctly; (4) wave propagation and site response should be site specific; (5) a much wider search of possible sources of ground motion can be achieved computationally with a physics-based approach; (6) unless one utilizes a physics-based approach, the hazard and risk to structures have unknown uncertainties; (7) uncertainties can be reduced with a physics-based approach, but not with an ergodic approach; (8) computational power and computer codes have advanced to the point that risk to structures can be calculated directly from source- and site-specific ground motions. Spanning the variability of potential ground motion in a predictive situation is especially difficult for near-source areas, but that is the distance at which the hazard is the greatest. The basis of a physics-based approach is ground-motion synthesis derived from physics and an understanding of the earthquake process. This is an overview paper, and results from previous studies are used to make the case for these conclusions. Our premise is that 50 years of strong motion records is insufficient to capture all possible ranges of site and propagation path conditions, rupture processes, and spatial geometric relationships between source and site. Predicting future earthquake scenarios is necessary; models that have little or no physical basis but have been tested and adjusted to fit available observations can only "predict" what happened in the past, which should be considered description as opposed to prediction. We have developed a methodology for synthesizing physics-based broadband ground motion that incorporates the effects of realistic earthquake rupture along specific faults and the actual geology between the source and site.
Pelletier, J.D.; Mayer, L.; Pearthree, P.A.; House, P.K.; Demsey, K.A.; Klawon, J.K.; Vincent, K.R.
2005-01-01
Millions of people in the western United States live near the dynamic, distributary channel networks of alluvial fans where flood behavior is complex and poorly constrained. Here we test a new comprehensive approach to alluvial-fan flood hazard assessment that uses four complementary methods: two-dimensional raster-based hydraulic modeling, satellite-image change detection, field-based mapping of recent flood inundation, and surficial geologic mapping. Each of these methods provides spatial detail lacking in the standard method and each provides critical information for a comprehensive assessment. Our numerical model simultaneously solves the continuity equation and Manning's equation (Chow, 1959) using an implicit numerical method. It provides a robust numerical tool for predicting flood flows using the large, high-resolution Digital Elevation Models (DEMs) necessary to resolve the numerous small channels on the typical alluvial fan. Inundation extents and flow depths of historic floods can be reconstructed with the numerical model and validated against field- and satellite-based flood maps. A probabilistic flood hazard map can also be constructed by modeling multiple flood events with a range of specified discharges. This map can be used in conjunction with a surficial geologic map to further refine floodplain delineation on fans. To test the accuracy of the numerical model, we compared model predictions of flood inundation and flow depths against field- and satellite-based flood maps for two recent extreme events on the southern Tortolita and Harquahala piedmonts in Arizona. Model predictions match the field- and satellite-based maps closely. Probabilistic flood hazard maps based on the 10 yr, 100 yr, and maximum floods were also constructed for the study areas using stream gage records and paleoflood deposits. The resulting maps predict spatially complex flood hazards that strongly reflect small-scale topography and are consistent with surficial geology. In contrast, FEMA Flood Insurance Rate Maps (FIRMs) based on the FAN model predict uniformly high flood risk across the study areas without regard for small-scale topography and surficial geology. © 2005 Geological Society of America.
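Since the numerical model above couples the continuity equation with Manning's equation, a minimal sketch of Manning's relation in SI units may help; the channel geometry and roughness values below are hypothetical and do not come from the study.

```python
# Minimal sketch of Manning's equation in SI units (values are hypothetical):
#   V = (1/n) * R**(2/3) * S**(1/2)
# where n is Manning's roughness coefficient, R the hydraulic radius (m),
# and S the friction/energy slope (dimensionless).
def manning_velocity(n: float, radius: float, slope: float) -> float:
    return (1.0 / n) * radius ** (2.0 / 3.0) * slope ** 0.5

# Shallow distributary channel on an alluvial fan (assumed values)
v = manning_velocity(n=0.035, radius=0.3, slope=0.01)
q = v * 0.3 * 2.0  # discharge for an assumed 2 m wide, 0.3 m deep channel
print(f"velocity ~ {v:.2f} m/s, discharge ~ {q:.2f} m^3/s")
```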
Salisbury, Margaret L; Xia, Meng; Zhou, Yueren; Murray, Susan; Tayob, Nabihah; Brown, Kevin K; Wells, Athol U; Schmidt, Shelley L; Martinez, Fernando J; Flaherty, Kevin R
2016-02-01
Idiopathic pulmonary fibrosis is a progressive lung disease with variable course. The Gender-Age-Physiology (GAP) Index and staging system uses clinical variables to stage mortality risk. It is unknown whether clinical staging predicts future decline in pulmonary function. We assessed whether the GAP stage predicts future pulmonary function decline and whether interval pulmonary function change predicts mortality after accounting for stage. Patients with idiopathic pulmonary fibrosis (N = 657) were identified retrospectively at three tertiary referral centers, and baseline GAP stages were assessed. Mixed models were used to describe average trajectories of FVC and diffusing capacity of the lung for carbon monoxide (Dlco). Multivariable Cox proportional hazards models were used to assess whether declines in pulmonary function ≥ 10% in 6 months predict mortality after accounting for GAP stage. Over a 2-year period, GAP stage was not associated with differences in yearly lung function decline. After accounting for stage, a 10% decrease in FVC or Dlco over 6 months independently predicted death or transplantation (FVC hazard ratio, 1.37; Dlco hazard ratio, 1.30; both, P ≤ .03). Patients with GAP stage 2 with declining pulmonary function experienced a survival profile similar to patients with GAP stage 3, with 1-year event-free survival of 59.3% (95% CI, 49.4-67.8) vs 56.9% (95% CI, 42.2-69.1). Baseline GAP stage predicted death or lung transplantation but not the rate of future pulmonary function decline. After accounting for GAP stage, a decline of ≥ 10% over 6 months independently predicted death or lung transplantation. Copyright © 2016 American College of Chest Physicians. Published by Elsevier Inc. All rights reserved.
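A Cox proportional hazards fit of the kind reported above can be sketched with the `lifelines` Python package; the data frame and column names below are hypothetical and only illustrate the model structure, not the study's actual analysis.

```python
# Sketch of a Cox proportional hazards fit with the `lifelines` package.
# Column names and data are hypothetical; this is not the authors' code.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "time_months":   [12, 30, 45, 6, 24, 60],  # follow-up time
    "event":         [1, 0, 1, 1, 0, 1],       # death/transplant = 1
    "gap_stage":     [1, 2, 3, 2, 1, 3],
    "fvc_decline10": [0, 1, 1, 1, 0, 0],       # >=10% FVC drop in 6 months
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time_months", event_col="event")
cph.print_summary()  # hazard ratios with confidence intervals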
NASA Astrophysics Data System (ADS)
Lindquist, Eric
2016-04-01
The characterization of near-Earth objects (NEOs) with regard to physical attributes and potential risk and impact factors presents a complex and complicated scientific and engineering challenge. The societal and policy risks and impacts are no less complex, yet are rarely considered in the same context as material properties or related factors. Further, NEO impacts are typically considered as discrete events, not as initial events in a dynamic cascading system. The objective of this contribution is to position the characterization of NEOs within the public policy process domain as a means to reflect on the science-policy nexus with regard to the risks and multi-hazard impacts associated with these hazards. This will be accomplished through, first, a brief overview of the science-policy nexus, followed by a discussion of policy process frameworks, such as agenda setting and the multiple streams model, focusing events, and punctuated equilibrium, and their application and appropriateness to the problem of NEOs. How, for example, does the NEO hazard and risk compare with other low-probability, high-risk hazards with regard to public policy? Finally, we reflect on the implications of alternative NEO "solutions" and the characterization of the NEO "problem," and the political and public acceptance of policy alternatives, as a way to link NEO science and policy in the context of the overall NH9.12 panel.
NASA Astrophysics Data System (ADS)
Abed Gatea, Mezher; Ahmed, Anwar A.; jundee kadhum, Saad; Ali, Hasan Mohammed; Hussein Muheisn, Abbas
2018-05-01
The Safety Assessment Framework (SAFRAN) software was applied here for radiological safety analysis, to verify that the dose acceptance criteria and safety goals are met with a high degree of confidence for the dismantling of the Tammuz-2 reactor core at the Al-Tuwaitha nuclear site. Characterizing, dismantling, and packaging activities were carried out to manage the generated radioactive waste. Dose to the worker was considered as the endpoint scenario, while dose to the public was neglected because the Tammuz-2 facility is located in a restricted zone and a 30 m berm surrounds the Al-Tuwaitha site. The safety assessment for the dismantling-worker endpoint scenario was based on the maximum external dose at the component position level in the reactor pool and the internal dose via airborne activity, while the characterizing- and packaging-worker endpoint scenarios were assessed via external dose only, because there was no evidence of airborne radioactivity hazards outside the reactor pool. In-situ measurements confirmed that the reactor core components are radiologically activated by the Co-60 radioisotope. SAFRAN results showed that the maximum received doses for workers are 1.85, 0.64, and 1.3 mSv/y for the dismantling, characterizing, and packaging of reactor core components, respectively. Hence, the radiological hazards remain below the low-level hazard category and within the acceptable annual dose for workers in radiation fields.
NASA Technical Reports Server (NTRS)
Proctor, Fred H.; Hinton, David A.; Bowles, Roland L.
2000-01-01
An aircraft exposed to hazardous low-level windshear may suffer a critical loss of airspeed and altitude, thus endangering its ability to remain airborne. In order to characterize this hazard, a nondimensional index was developed based on aerodynamic principles and an understanding of windshear phenomena. This paper reviews the development and application of the Bowles F-factor, which is now used by onboard sensors for the detection of hazardous windshear. It was developed and tested during NASA/FAA's airborne windshear program and is now required for FAA certification of onboard radar windshear detection systems. Reviewed in this paper are: 1) the definition of windshear and a description of atmospheric phenomena that may cause hazardous windshear, 2) the derivation and discussion of the F-factor, 3) the development of the F-factor hazard threshold, 4) its testing during field deployments, and 5) its use in accident reconstructions.
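As a hedged sketch of the index described above, the F-factor is commonly written in the literature as F = (dWx/dt)/g - w/V; sustained values above roughly 0.1 are often cited as hazardous. The numbers below are invented for illustration.

```python
# Sketch of the windshear F-factor as commonly defined in the literature:
#   F = (dWx/dt) / g  -  w / V
# where dWx/dt is the rate of change of horizontal wind along the flight
# path, w is the vertical wind (positive up), and V is the true airspeed.
# All values below are hypothetical.
G = 9.81  # m/s^2

def f_factor(dwx_dt: float, w: float, airspeed: float) -> float:
    return dwx_dt / G - w / airspeed

# Microburst-like encounter: increasing headwind loss plus a downdraft
print(f_factor(dwx_dt=1.5, w=-5.0, airspeed=75.0))  # ~0.22, above the ~0.1 level
```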
Preliminary Considerations for Classifying Hazards of Unmanned Aircraft Systems
NASA Technical Reports Server (NTRS)
Hayhurst, Kelly J.; Maddalon, Jeffrey M.; Miner, Paul S.; Szatkowski, George N.; Ulrey, Michael L.; DeWalt, Michael P.; Spitzer, Cary R.
2007-01-01
The use of unmanned aircraft in national airspace has been characterized as the next great step forward in the evolution of civil aviation. To make routine and safe operation of these aircraft a reality, a number of technological and regulatory challenges must be overcome. This report discusses some of the regulatory challenges with respect to deriving safety and reliability requirements for unmanned aircraft. In particular, definitions of hazards and their classification are discussed and applied to a preliminary functional hazard assessment of a generic unmanned system.
Toward uniform probabilistic seismic hazard assessments for Southeast Asia
NASA Astrophysics Data System (ADS)
Chan, C. H.; Wang, Y.; Shi, X.; Ornthammarath, T.; Warnitchai, P.; Kosuwan, S.; Thant, M.; Nguyen, P. H.; Nguyen, L. M.; Solidum, R., Jr.; Irsyam, M.; Hidayati, S.; Sieh, K.
2017-12-01
Although most Southeast Asian countries have seismic hazard maps, various methodologies and quality result in appreciable mismatches at national boundaries. We aim to conduct a uniform assessment across the region through standardized earthquake and fault databases, ground-shaking scenarios, and regional hazard maps. Our earthquake database contains earthquake parameters obtained from global and national seismic networks, harmonized by removal of duplicate events and the use of moment magnitude. Our active-fault database includes fault parameters from previous studies and from the databases implemented for national seismic hazard maps. Another crucial input for seismic hazard assessment is proper evaluation of ground-shaking attenuation. Since few ground-motion prediction equations (GMPEs) have used local observations from this region, we evaluated attenuation by comparison of instrumental observations and felt intensities for recent earthquakes with predicted ground shaking from published GMPEs. We then incorporate the best-fitting GMPEs and site conditions into our seismic hazard assessments. Based on the database and proper GMPEs, we have constructed regional probabilistic seismic hazard maps. The assessment shows the highest seismic hazard levels near those faults with high slip rates, including the Sagaing Fault in central Myanmar, the Sumatran Fault in Sumatra, the Palu-Koro, Matano and Lawanopo Faults in Sulawesi, and the Philippine Fault across several islands of the Philippines. In addition, our assessment demonstrates the important fact that regions with low earthquake probability may well have a higher aggregate probability of future earthquakes, since they encompass much larger areas than the areas of high probability. The significant irony then is that in areas of low to moderate probability, where building codes usually provide less seismic resilience, seismic risk is likely to be greater. Infrastructural damage in East Malaysia during the 2015 Sabah earthquake offers a case in point.
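The GMPE-evaluation step described above can be illustrated with a small log-residual comparison; the observations and model predictions below are synthetic, and the ranking criterion (smallest bias and scatter) is a common convention rather than the authors' exact procedure.

```python
# Hedged sketch of GMPE evaluation by log-residuals (synthetic numbers,
# not the study's data): the best-fitting model has residuals with mean
# near zero and the smallest standard deviation.
import numpy as np

obs_pga = np.array([0.08, 0.15, 0.04, 0.22])          # g, observed
pred_pga = {"GMPE_A": np.array([0.10, 0.12, 0.05, 0.18]),
            "GMPE_B": np.array([0.05, 0.30, 0.02, 0.40])}

for name, pred in pred_pga.items():
    resid = np.log(obs_pga) - np.log(pred)            # ln-residuals
    print(name, "bias:", resid.mean().round(2), "sigma:", resid.std().round(2))
```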
Moser, Heidrun; Roembke, Joerg; Donnevert, Gerhild; Becker, Roland
2011-02-01
The ecotoxicological characterization of waste is part of its assessment as hazardous or non-hazardous according to the European Waste List. For this classification, 15 hazard criteria are derived from the Council Directive 91/689/EEC on hazardous waste. Some of the hazard criteria are based on the content of dangerous substances. The criterion H14 'ecotoxic' lacks an assessment and testing strategy, and no specific threshold values have been defined so far. Based on the recommendations of CEN guideline 14735 (2005), an international round robin test (ring test) was organized by the German Federal Environment Agency in order to define suitable test methods for the biological assessment of waste and waste eluates. A basic test battery, consisting of three aquatic and three terrestrial tests, was compiled. In addition, data were submitted for ten additional tests (five aquatic (including a genotoxicity test) and five terrestrial ones). The tests were performed with three representative waste types: an ash from an incineration plant, a soil containing high concentrations of organic contaminants (polycyclic aromatic hydrocarbons) and a preserved wood waste. The results of this ring test confirm that a combination of a battery of biological tests and chemical residual analysis is needed for an ecotoxicological characterization of wastes. With small modifications, the basic test battery is considered to be well suited for the hazard and risk assessment of wastes and waste eluates. All results and documents are accessible via a web-based data bank application.
Examining the Association between Hazardous Waste Facilities and Rural "Brain Drain"
ERIC Educational Resources Information Center
Hunter, Lori M.; Sutton, Jeannette
2004-01-01
Rural communities are increasingly being faced with the prospect of accepting facilities characterized as "opportunity-threat," such as facilities that generate, treat, store, or otherwise dispose of hazardous wastes. Such facilities may offer economic gains through jobs and tax revenue, although they may also act as environmental "disamenities."…
Thousands of chemicals have not been characterized for their DNT potential. Due to the need for DNT hazard identification, efforts to develop screening assays for DNT potential is a high priority. Multi-well microelectrode arrays (MEA) measure the spontaneous activity of electr...
NASA Astrophysics Data System (ADS)
Chartier, Thomas; Scotti, Oona; Boiselet, Aurelien; Lyon-Caen, Hélène
2016-04-01
Including faults in probabilistic seismic hazard assessment tends to increase the degree of uncertainty in the results due to the intrinsically uncertain nature of the fault data. This is especially the case in the low-to-moderate seismicity regions of Europe, where slow-slipping faults are difficult to characterize. In order to better understand the key parameters that control the uncertainty in fault-related hazard computations, we propose to build an analytic tool that provides a clear link between the different components of the fault-related hazard computations and their impact on the results. This will allow identifying the important parameters that need to be better constrained in order to reduce the resulting uncertainty in hazard, and will also provide a more hazard-oriented strategy for collecting relevant fault parameters in the field. The tool is illustrated through the example of the West Corinth rift fault models. Recent work performed in the gulf has shown the complexity of the normal faulting system that is accommodating the extensional deformation of the rift. A logic-tree approach is proposed to account for this complexity and the multiplicity of scientifically defendable interpretations. At the nodes of the logic tree, different options that could be considered at each step of the fault-related seismic hazard computation are explored. The first nodes represent the uncertainty in the geometries of the faults and their slip rates, which can derive from different data and methodologies. The subsequent node explores, for a given geometry/slip rate of faults, different earthquake rupture scenarios that may occur in the complex network of faults. The idea is to allow the possibility of several fault segments breaking together in a single rupture scenario. To build these multiple-fault-segment scenarios, two approaches are considered: one based on simple rules (i.e., minimum distance between faults) and a second that relies on physically-based simulations. The following nodes represent, for each rupture scenario, different rupture forecast models (i.e., characteristic or Gutenberg-Richter) and, for a given rupture forecast, two probability models commonly used in seismic hazard assessment: Poissonian or time-dependent. The final node represents an exhaustive set of ground motion prediction equations chosen to be compatible with the region. Finally, the expected probability of exceeding a given ground motion level is computed at each site. Results will be discussed for a few specific localities of the West Corinth Gulf.
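For the Poissonian branch at the probability-model node, the occurrence probability over an exposure window follows P = 1 - exp(-lambda*t); the sketch below uses hypothetical rupture-scenario rates for illustration only.

```python
# Minimal sketch of the Poissonian branch of such a logic tree: for a
# rupture scenario with annual rate `rate`, the probability of at least
# one occurrence in an exposure time `t_years` is 1 - exp(-rate * t).
# Scenario names and rates below are hypothetical.
import math

def poisson_prob(rate: float, t_years: float) -> float:
    return 1.0 - math.exp(-rate * t_years)

scenarios = {"single-segment M6.3": 1 / 500.0,
             "two-segment M6.8":    1 / 2000.0}
for name, rate in scenarios.items():
    print(name, f"P(50 yr) = {poisson_prob(rate, 50):.3f}")
```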
Marvin, Hans J P; Bouzembrak, Yamine; Janssen, Esmée M; van der Zande, Meike; Murphy, Finbarr; Sheehan, Barry; Mullins, Martin; Bouwmeester, Hans
2017-02-01
In this study, a Bayesian Network (BN) was developed for the prediction of hazard potential and biological effects, with a focus on metal and metal-oxide nanomaterials, to support human health risk assessment. The developed BN captures the (inter)relationships between the exposure route, the nanomaterial's physicochemical properties, and the ultimate biological effects in a holistic manner, and was based on international expert consultation and the scientific literature (e.g., in vitro/in vivo data). The BN was validated with independent data extracted from published studies; the accuracy of the prediction was 72% for the nanomaterial's hazard potential and 71% for the biological effect. The application of the BN is shown with scenario studies for TiO₂, SiO₂, Ag, CeO₂, and ZnO nanomaterials. It is demonstrated that the BN may be used by different stakeholders at several stages in the risk assessment to predict certain properties of a nanomaterial for which little information is available, or to prioritize nanomaterials for further screening.
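A toy discrete Bayesian network in the spirit of this study can be sketched with the `pgmpy` Python package; the structure, node names, and probabilities below are invented for illustration and do not reproduce the published network.

```python
# Toy Bayesian network in the spirit of the paper, built with `pgmpy`
# (structure and probabilities are invented for illustration only).
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

model = BayesianNetwork([("Dissolution", "Hazard"), ("Size", "Hazard")])

cpd_diss = TabularCPD("Dissolution", 2, [[0.6], [0.4]])  # low / high
cpd_size = TabularCPD("Size", 2, [[0.5], [0.5]])         # <50 nm / >=50 nm
cpd_hazard = TabularCPD(
    "Hazard", 2,
    # P(Hazard | Dissolution, Size); columns: (low,<50) (low,>=50) (high,<50) (high,>=50)
    [[0.8, 0.9, 0.4, 0.6],    # Hazard = low
     [0.2, 0.1, 0.6, 0.4]],   # Hazard = high
    evidence=["Dissolution", "Size"], evidence_card=[2, 2])

model.add_cpds(cpd_diss, cpd_size, cpd_hazard)
posterior = VariableElimination(model).query(["Hazard"], evidence={"Dissolution": 1})
print(posterior)  # hazard distribution for a high-dissolution nanomaterial
```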
An alternative is to perform a set of relatively inexpensive and rapid high throughput screening (HTS) assays, derive signatures predictive of effects or modes of chemical toxicity from the HTS data, then use these predictions to prioritize chemicals for more detailed analysis. T...
Lack of complete and appropriate human data requires prediction of the hazards for exposed human populations by extrapolation from available animal and in vitro data. Predictive models for the toxicity of chemicals can be constructed by linking kinetic and mode of action data uti...
Adverse Housing Conditions and Early-Onset Delinquency.
Jackson, Dylan B; Newsome, Jamie; Lynch, Kellie R
2017-09-01
Housing constitutes an important health resource for children. Research has revealed that, when housing conditions are unfavorable, they can interfere with child health, academic performance, and cognition. Little to no research, however, has considered whether adverse housing conditions and early-onset delinquency are significantly associated with one another. This study explores the associations between structural and non-structural housing conditions and delinquent involvement during childhood. Data from the Fragile Families and Child Wellbeing Study (FFCWS) were employed in this study. Each adverse housing condition was significantly associated with early-onset delinquency. Even so, disarray and deterioration were only significantly linked to early delinquent involvement in the presence of health/safety hazards. The predicted probability of early-onset delinquency among children exposed to housing risks in the presence of health/safety hazards was nearly three times as large as the predicted probability of early-onset delinquency among children exposed only to disarray and/or deterioration, and nearly four times as large as the predicted probability of early-onset delinquency among children exposed to none of the adverse housing conditions. The findings suggest that minimizing housing-related health/safety hazards among at-risk subsets of the population may help to alleviate other important public health concerns, particularly early-onset delinquency. Addressing household health/safety hazards may represent a fruitful avenue for public health programs aimed at the prevention of early-onset delinquency. © Society for Community Research and Action 2017.
NASA Technical Reports Server (NTRS)
2002-01-01
ENSCO, Inc., developed the Meteorological and Atmospheric Real-time Safety Support (MARSS) system for real-time assessment of meteorological data displays and toxic material spills. MARSS also provides mock scenarios to guide preparations for emergencies involving meteorological hazards and toxic substances. Developed under a Small Business Innovation Research (SBIR) contract with Kennedy Space Center, MARSS was designed to measure how safe NASA and Air Force range safety personnel are while performing weather-sensitive operations around launch pads. The system augments a ground operations safety plan that limits certain work operations to very specific weather conditions. It also provides toxic hazard prediction models to assist safety managers in planning for and reacting to releases of hazardous materials. MARSS can be used in agricultural, industrial, and scientific applications that require weather forecasts and predictions of toxic smoke movement. MARSS is also designed to protect urban areas, seaports, rail facilities, and airports from airborne releases of hazardous chemical substances. The system can integrate with local facility protection units and provide instant threat detection and assessment data that is reportable for local and national distribution.
Global Natural Disaster Risk Hotspots: Transition to a Regional Approach
NASA Astrophysics Data System (ADS)
Lerner-Lam, A.; Chen, R.; Dilley, M.
2005-12-01
The "Hotspots Project" is a collaborative study of the global distribution and occurrence of multiple natural hazards and the associated exposures of populations and their economic output. In this study we assess the global risks of two disaster-related outcomes: mortality and economic losses. We estimate risk levels by combining hazard exposure with historical vulnerability for two indicators of elements at risk-gridded population and Gross Domestic Product (GDP) per unit area - for six major natural hazards: earthquakes, volcanoes, landslides, floods, drought, and cyclones. By calculating relative risks for each grid cell rather than for countries as a whole, we are able to estimate risk levels at sub-national scales. These can then be used to estimate aggregate relative multiple hazard risk at regional and national scales. Mortality-related risks are assessed on a 2.5' x 2.5' latitude-longitude grid of global population (GPW Version 3). Economic risks are assessed at the same resolution for gridded GDP per unit area, using World Bank estimates of GDP based on purchasing power parity. Global hazard data were compiled from multiple sources. The project collaborated directly with UNDP and UNEP, the International Research Institute for Climate Prediction (IRI) at Columbia, and the Norwegian Geotechnical Institute (NGI) in the creation of data sets for several hazards for which global data sets did not previously exist. Drought, flood and volcano hazards are characterized in terms of event frequency, storms by frequency and severity, earthquakes by frequency and ground acceleration exceedance probability, and landslides by an index derived from probability of occurrence. The global analysis undertaken in this project is clearly limited by issues of scale as well as by the availability and quality of data. For some hazards, there exist only 15- to 25-year global records with relatively crude spatial information. Data on historical disaster losses, and particularly on economic losses, are also limited. On one hand the data are adequate for general identification of areas of the globe that are at relatively higher single- or multiple-hazard risk than other areas. On the other hand they are inadequate for understanding the absolute levels of risk posed by any specific hazard or combination of hazards. Nevertheless it is possible to assess in general terms the exposure and potential magnitude of losses to people and their assets in these areas. Such information, although not ideal, can still be very useful for informing a range of disaster prevention and preparedness measures, including prioritization of resources, targeting of more localized and detailed risk assessments, implementation of risk-based disaster management and emergency response strategies, and development of long-term plans for poverty reduction and economic development. In addition to summarizing the results of the Hotspots Project, we discuss data collection issues and suggest methodological approaches for making the transition to more detailed regional and national studies. Preliminary results for several regional case studies will be presented.
[Relations of landslide and debris flow hazards to environmental factors].
Zhang, Guo-ping; Xu, Jing; Bi, Bao-gui
2009-03-01
To clarify the relations of landslide and debris flow hazards to environmental factors is of significance to the prediction and evaluation of landslide and debris flow hazards. Based on the latitudinal and longitudinal information of 18,431 landslide and debris flow hazards in China, and the 1 km x 1 km grid data of elevation, elevation difference, slope, slope aspect, vegetation type, and vegetation coverage, this paper analyzed the relations of landslide and debris flow hazards in this country to the above-mentioned environmental factors using the frequency-ratio analysis method. The results showed that the landslide and debris flow hazards in China occurred more often in the lower-elevation areas of the first and second transitional zones. When the elevation difference within a 1 km x 1 km grid cell was about 300 m and the slope was around 30 degrees, there was the greatest possibility of the occurrence of landslide and debris flow hazards. Mountain forest land and slope cropland were the two land types on which the hazards most easily occurred. The occurrence frequency of the hazards was highest when the vegetation coverage was about 80%-90%.
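The frequency-ratio statistic used in such analyses compares the share of hazard events in an environmental-factor class to the share of the study area in that class; the counts in the sketch below are invented for illustration.

```python
# Sketch of the frequency-ratio statistic used in such analyses:
#   FR(class) = (% of hazard events in class) / (% of study area in class)
# FR > 1 means the class is over-represented among hazard locations.
# All counts below are invented for illustration.
def frequency_ratio(events_in_class, total_events, cells_in_class, total_cells):
    return (events_in_class / total_events) / (cells_in_class / total_cells)

# e.g. a hypothetical slope class of 25-35 degrees
print(frequency_ratio(events_in_class=6200, total_events=18431,
                      cells_in_class=90000, total_cells=960000))  # ~3.6
```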
Wahesh, Edward; Lewis, Todd F
2015-01-01
The current study identified psychosocial variables associated with AUDIT-C hazardous drinking risk status for male and female college students. Logistic regression analysis revealed that AUDIT-C risk status was associated with alcohol-related negative consequences, injunctive norms, and descriptive norms for both male and female participants. Sociability and self-perception outcome expectancies predicted risk status for females. Cognitive and behavioral impairment expectancies predicted risk status for men in the sample. Implications for screening and brief intervention programming efforts are discussed. © The Author(s) 2015.
Markkanen, Pia; Quinn, Margaret; Galligan, Catherine; Sama, Susan; Brouillette, Natalie; Okyere, Daniel
2014-04-01
Home care (HC) aide is the fastest growing occupation, yet job hazards are under-studied. This study documents the context of HC aide work, characterizes occupational safety and health (OSH) hazards, and identifies preventive interventions using qualitative methods. We conducted 12 focus groups among aides and 26 in-depth interviews comprising 15 HC agency, union, and insurance company representatives as well as 11 HC recipients in Massachusetts. All focus groups and interviews were audio-recorded, transcribed, and coded with NVIVO software. Major OSH concerns were musculoskeletal disorders from client care tasks and verbal abuse. Performing tasks beyond specified job duties may be an OSH risk factor. HC aides' safety and clients' safety are closely linked. Client handling devices, client evaluation, care plan development, and training are key interventions for both aides' and clients' safety. Promoting OSH in HC is essential for maintaining a viable workforce. © 2013 Wiley Periodicals, Inc.
Time-dependent seismic hazard analysis for the Greater Tehran and surrounding areas
NASA Astrophysics Data System (ADS)
Jalalalhosseini, Seyed Mostafa; Zafarani, Hamid; Zare, Mehdi
2018-01-01
This study presents a time-dependent approach to seismic hazard in Tehran and surrounding areas. Hazard is evaluated by combining background seismic activity with larger earthquakes that may emanate from fault segments. Using available historical and paleoseismological data or empirical relations, the recurrence time and maximum magnitude of characteristic earthquakes for the major faults have been explored. The Brownian passage time (BPT) distribution has been used to calculate an equivalent fictitious seismicity rate for major faults in the region. To include ground motion uncertainty, a logic tree and five ground motion prediction equations have been selected based on their applicability in the region. Finally, hazard maps have been presented.
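The BPT distribution is the inverse-Gaussian distribution, so a conditional rupture probability of the kind used in such time-dependent models can be sketched with scipy; the mapping to scipy's parameterization and the fault values below are stated assumptions, not the study's inputs.

```python
# Hedged sketch of a BPT (inverse-Gaussian) conditional rupture probability.
# With mean recurrence `mu` and aperiodicity `alpha`, the BPT distribution
# matches scipy's invgauss with shape mu=alpha**2 and scale=mu/alpha**2
# (mean = mu, coefficient of variation = alpha). Values are hypothetical.
from scipy.stats import invgauss

def bpt_conditional_prob(mu, alpha, t_elapsed, t_window):
    dist = invgauss(mu=alpha**2, scale=mu / alpha**2)
    # P(rupture in next t_window years | quiet for t_elapsed years)
    return (dist.cdf(t_elapsed + t_window) - dist.cdf(t_elapsed)) / dist.sf(t_elapsed)

# Hypothetical fault: ~1000-yr mean recurrence, 180 yr since last rupture
print(bpt_conditional_prob(mu=1000.0, alpha=0.5, t_elapsed=180.0, t_window=50.0))
```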
New Tools for Investigating Chemical and Product Use
- The timely characterization of the human and ecological risk posed by thousands of existing and emerging commercial chemicals is a critical challenge
- High throughput (HT) risk prioritization relies on hazard and exposure characterization
- While advances have been made ...
Use of Severity Grades to Characterize Histopathologic Changes
The severity grade is an important component of a histopathologic diagnosis in a nonclinical toxicity study that helps distinguish treatment-related effects from background findings and aids in determining adverse dose levels during hazard characterization. Severity grades should...
Different regulatory schemes worldwide, and in particular the preparation for the new REACH legislation in Europe, increase the reliance on estimation methods for predicting potential chemical hazard.
Prediction of extreme floods in the eastern Central Andes based on a complex networks approach.
Boers, N; Bookhagen, B; Barbosa, H M J; Marwan, N; Kurths, J; Marengo, J A
2014-10-14
Changing climatic conditions have led to a significant increase in the magnitude and frequency of extreme rainfall events in the Central Andes of South America. These events are spatially extensive and often result in substantial natural hazards for population, economy and ecology. Here we develop a general framework to predict extreme events by introducing the concept of network divergence on directed networks derived from a non-linear synchronization measure. We apply our method to real-time satellite-derived rainfall data and predict more than 60% (90% during El Niño conditions) of rainfall events above the 99th percentile in the Central Andes. In addition to the societal benefits of predicting natural hazards, our study reveals a linkage between polar and tropical regimes as the responsible mechanism: the interplay of northward migrating frontal systems and a low-level wind channel from the western Amazon to the subtropics.
Lidar-Based Rock-Fall Hazard Characterization of Cliffs
Collins, Brian D.; Stock, Greg M.
2017-01-01
Rock falls from cliffs and other steep slopes present numerous challenges for detailed geological characterization. In steep terrain, rock-fall source areas are both dangerous and difficult to access, severely limiting the ability to make detailed structural and volumetric measurements necessary for hazard assessment. Airborne and terrestrial lidar survey methods can provide high-resolution data needed for volumetric, structural, and deformation analyses of rock falls, potentially making these analyses straightforward and routine. However, specific methods to collect, process, and analyze lidar data of steep cliffs are needed to maximize analytical accuracy and efficiency. This paper presents observations showing how lidar data sets should be collected, filtered, registered, and georeferenced to tailor their use in rock fall characterization. Additional observations concerning surface model construction, volumetric calculations, and deformation analysis are also provided.
NASA Astrophysics Data System (ADS)
Rey, Julien; Beauval, Céline; Douglas, John
2018-05-01
Probabilistic seismic hazard assessments are the basis of modern seismic design codes. To test fully a seismic hazard curve at the return periods of interest for engineering would require many thousands of years' worth of ground-motion recordings. Because strong-motion networks are often only a few decades old (e.g. in mainland France the first accelerometric network dates from the mid-1990s), data from such sensors can be used to test hazard estimates only at very short return periods. In this article, several hundreds of years of macroseismic intensity observations for mainland France are interpolated using a robust kriging-with-a-trend technique to establish the earthquake history of every French mainland municipality. At 24 selected cities representative of the French seismic context, the number of exceedances of intensities IV, V and VI is determined over time windows considered complete. After converting these intensities to peak ground accelerations using the global conversion equation of Caprio et al. (Ground motion to intensity conversion equations (GMICEs): a global relationship and evaluation of regional dependency, Bulletin of the Seismological Society of America 105:1476-1490, 2015), these exceedances are compared with those predicted by the European Seismic Hazard Model 2013 (ESHM13). In half of the cities, the number of observed exceedances for low intensities (IV and V) is within the range of predictions of ESHM13. In the other half of the cities, the number of observed exceedances is higher than the predictions of ESHM13. For intensity VI, the match is closer, but the comparison is less meaningful due to a scarcity of data. According to this study, the ESHM13 underestimates hazard in roughly half of France, even when taking into account the uncertainty in the conversion from intensity to acceleration. However, these results are valid only for the acceleration range tested in this study (0.01 to 0.09 g).
NASA Astrophysics Data System (ADS)
Koay, Swee Peng; Fukuoka, Hiroshi; Tien Tay, Lea; Murakami, Satoshi; Koyama, Tomofumi; Chan, Huah Yong; Sakai, Naoki; Hazarika, Hemanta; Jamaludin, Suhaimi; Lateh, Habibah
2016-04-01
Every year, hundreds of landslides occur in Malaysia and other tropical monsoon South East Asian countries. Preventing casualties and economic losses from rain-induced slope failures is therefore high on these governments' agendas. In Malaysia, millions of Malaysian Ringgit are allocated to slope monitoring and mitigation in every annual budget. Besides monitoring the slopes, we propose here an IT system that provides hazard map information, landslide historical information, slope failure prediction, knowledge on natural hazards, and information on evacuation centres via the internet, so that users can understand the risk of landslides as well as floods. Moreover, users can obtain information on rainfall intensity at the monitoring sites to predict the occurrence of slope failure. Furthermore, we are working with PWD, Malaysia to set the threshold value for the landslide prediction system, which will alert officers if there is a risk of slope failure at the monitoring sites based on calculated rainfall intensity. Although IT plays a significant role in information dissemination, education is also important in disaster prevention: educating school students to be more alert to natural hazards creates a bottom-up approach that alerts parents to what a natural hazard is through conversation among family members, as most parents are busy and may not have time to attend natural hazard workshops. Many ethnic groups live in Malaysia, as in most South East Asian countries, and it is not easy to educate them with a single education method, as levels of living and education differ. We started landslide education workshops in primary schools in rural and urban areas of Malaysia. We found that we have to use the students' mother tongue while conducting natural hazard education for better understanding. We collected questionnaires from the students before and after the education workshops. The questionnaire results show that the students are more alert to natural disasters after attending the workshop than before.
Hayek, Salim S; Divers, Jasmin; Raad, Mohamad; Xu, Jianzhao; Bowden, Donald W; Tracy, Melissa; Reiser, Jochen; Freedman, Barry I
2018-05-01
Type 2 diabetes mellitus is a major risk factor for cardiovascular disease; however, outcomes in individual patients vary. Soluble urokinase plasminogen activator receptor (suPAR) is a bone marrow-derived signaling molecule associated with adverse cardiovascular and renal outcomes in many populations. We characterized the determinants of suPAR in African Americans with type 2 diabetes mellitus and assessed whether levels were useful for predicting mortality beyond clinical characteristics, coronary artery calcium (CAC), and high-sensitivity C-reactive protein (hs-CRP). We measured plasma suPAR levels in 500 African Americans with type 2 diabetes mellitus enrolled in the African American-Diabetes Heart Study. We used Kaplan-Meier curves and Cox proportional hazards models adjusting for clinical characteristics, CAC, and hs-CRP to examine the association between suPAR and all-cause mortality. Last, we report the change in C-statistics comparing the additive values of suPAR, hs-CRP, and CAC to clinical models for prediction of mortality. The suPAR levels were independently associated with female sex, smoking, insulin use, decreased kidney function, albuminuria, and CAC. After a median 6.8-year follow-up, a total of 68 deaths (13.6%) were recorded. In a model incorporating suPAR, CAC, and hs-CRP, only suPAR was significantly associated with mortality (hazard ratio 2.66, 95% confidence interval 1.63-4.34). Addition of suPAR to a baseline clinical model significantly improved the C-statistic for all-cause death (Δ0.05, 95% confidence interval 0.01-0.10), whereas addition of CAC or hs-CRP did not. In African Americans with type 2 diabetes mellitus, suPAR was strongly associated with mortality and improved risk discrimination metrics beyond traditional risk factors, CAC and hs-CRP. Studies addressing the clinical usefulness of measuring suPAR concentrations are warranted. © 2018 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley.
In silico toxicology protocols.
Myatt, Glenn J; Ahlberg, Ernst; Akahori, Yumi; Allen, David; Amberg, Alexander; Anger, Lennart T; Aptula, Aynur; Auerbach, Scott; Beilke, Lisa; Bellion, Phillip; Benigni, Romualdo; Bercu, Joel; Booth, Ewan D; Bower, Dave; Brigo, Alessandro; Burden, Natalie; Cammerer, Zoryana; Cronin, Mark T D; Cross, Kevin P; Custer, Laura; Dettwiler, Magdalena; Dobo, Krista; Ford, Kevin A; Fortin, Marie C; Gad-McDonald, Samantha E; Gellatly, Nichola; Gervais, Véronique; Glover, Kyle P; Glowienke, Susanne; Van Gompel, Jacky; Gutsell, Steve; Hardy, Barry; Harvey, James S; Hillegass, Jedd; Honma, Masamitsu; Hsieh, Jui-Hua; Hsu, Chia-Wen; Hughes, Kathy; Johnson, Candice; Jolly, Robert; Jones, David; Kemper, Ray; Kenyon, Michelle O; Kim, Marlene T; Kruhlak, Naomi L; Kulkarni, Sunil A; Kümmerer, Klaus; Leavitt, Penny; Majer, Bernhard; Masten, Scott; Miller, Scott; Moser, Janet; Mumtaz, Moiz; Muster, Wolfgang; Neilson, Louise; Oprea, Tudor I; Patlewicz, Grace; Paulino, Alexandre; Lo Piparo, Elena; Powley, Mark; Quigley, Donald P; Reddy, M Vijayaraj; Richarz, Andrea-Nicole; Ruiz, Patricia; Schilter, Benoit; Serafimova, Rositsa; Simpson, Wendy; Stavitskaya, Lidiya; Stidl, Reinhard; Suarez-Rodriguez, Diana; Szabo, David T; Teasdale, Andrew; Trejo-Martin, Alejandra; Valentin, Jean-Pierre; Vuorinen, Anna; Wall, Brian A; Watts, Pete; White, Angela T; Wichard, Joerg; Witt, Kristine L; Woolley, Adam; Woolley, David; Zwickl, Craig; Hasselgren, Catrin
2018-07-01
The present publication surveys several applications of in silico (i.e., computational) toxicology approaches across different industries and institutions. It highlights the need to develop standardized protocols when conducting toxicity-related predictions. This contribution articulates the information needed for protocols to support in silico predictions for major toxicological endpoints of concern (e.g., genetic toxicity, carcinogenicity, acute toxicity, reproductive toxicity, developmental toxicity) across several industries and regulatory bodies. Such novel in silico toxicology (IST) protocols, when fully developed and implemented, will ensure in silico toxicological assessments are performed and evaluated in a consistent, reproducible, and well-documented manner across industries and regulatory bodies to support wider uptake and acceptance of the approaches. The development of IST protocols is an initiative developed through a collaboration among an international consortium to reflect the state-of-the-art in in silico toxicology for hazard identification and characterization. A general outline for describing the development of such protocols is included and it is based on in silico predictions and/or available experimental data for a defined series of relevant toxicological effects or mechanisms. The publication presents a novel approach for determining the reliability of in silico predictions alongside experimental data. In addition, we discuss how to determine the level of confidence in the assessment based on the relevance and reliability of the information. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.
Murnyak, George R; Spencer, Clark O; Chaney, Ann E; Roberts, Welford C
2002-04-01
During the 1970s, the Army health hazard assessment (HHA) process developed as a medical program to minimize hazards in military materiel during the development process. The HHA Program characterizes health hazards that soldiers and civilians may encounter as they interact with military weapons and equipment. Thus, it is a resource for medical planners and advisors to use that can identify and estimate potential hazards that soldiers may encounter as they train and conduct missions. The U.S. Army Center for Health Promotion and Preventive Medicine administers the program, which is integrated with the Army's Manpower and Personnel Integration program. As the HHA Program has matured, an electronic database has been developed to record and monitor the health hazards associated with military equipment and systems. The current database tracks the results of HHAs and provides reporting designed to assist the HHA Program manager in daily activities.
Seismic hazard in the Intermountain West
Haller, Kathleen; Moschetti, Morgan P.; Mueller, Charles; Rezaeian, Sanaz; Petersen, Mark D.; Zeng, Yuehua
2015-01-01
The 2014 national seismic-hazard model for the conterminous United States incorporates new scientific results and important model adjustments. The current model includes updates to the historical catalog, which is spatially smoothed using both fixed-length and adaptive-length smoothing kernels. Fault-source characterization improved by adding faults, revising rates of activity, and incorporating new results from combined inversions of geologic and geodetic data. The update also includes a new suite of published ground motion models. Changes in probabilistic ground motion are generally less than 10% in most of the Intermountain West compared to the prior assessment, and ground-motion hazard in four Intermountain West cities illustrates the range and magnitude of change in the region. Seismic hazard at reference sites in Boise and Reno increased as much as 10%, whereas hazard in Salt Lake City decreased 5–6%. The largest change was in Las Vegas, where hazard increased 32–35%.
NASA Astrophysics Data System (ADS)
Ignac-Nowicka, Jolanta
2018-03-01
The paper analyzes the conditions for the safe use of industrial gas systems and the factors influencing gas hazards. A typical gas installation and its basic features are characterized. The results of a gas threat analysis in an industrial enterprise using the fault tree analysis (FTA) method and the event tree analysis (ETA) method are presented, and selected methods of identifying gas-industry hazards are compared with respect to their scope of use. The paper presents an analysis of two exemplary hazards: an industrial gas catastrophe (FTA) and a gas explosion (ETA). In both cases, technical risks and human errors (the human factor) were taken into account. The cause-effect relationships of hazards and their causes are presented in the form of diagrams in the figures.
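Assuming independent basic events, a fault tree like the FTA case above can be evaluated with simple gate algebra (OR: 1 - prod(1 - p); AND: prod(p)); the event probabilities and tree structure below are invented for illustration.

```python
# Toy fault-tree evaluation assuming independent basic events
# (tree structure and probabilities are invented for illustration):
# an OR gate is 1 - prod(1 - p); an AND gate is prod(p).
from math import prod

def or_gate(*p):  return 1.0 - prod(1.0 - x for x in p)
def and_gate(*p): return prod(p)

p_leak   = or_gate(0.01, 0.005)   # corrosion OR joint failure
p_source = 0.02                   # ignition source present
p_human  = 0.03                   # operator error
# Top event: (leak AND ignition source) OR human error
p_top = or_gate(and_gate(p_leak, p_source), p_human)
print(f"P(top event) ~ {p_top:.4f}")
```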
NASA Technical Reports Server (NTRS)
Welch, Richard V.; Edmonds, Gary O.
1994-01-01
The use of robotics in situations involving hazardous materials can significantly reduce the risk of human injuries. The Emergency Response Robotics Project, which began in October 1990 at the Jet Propulsion Laboratory, is developing a teleoperated mobile robot allowing HAZMAT (hazardous materials) teams to remotely respond to incidents involving hazardous materials. The current robot, called HAZBOT III, can assist in locating, characterizing, identifying, and mitigating hazardous material incidents without risking entry team personnel. The active involvement of the JPL Fire Department HAZMAT team has been vital in developing a robotic system which enables them to perform remote reconnaissance of a HAZMAT incident site. This paper provides a brief review of the history of the project, discusses the current system in detail, and presents other areas in which robotics can be applied, removing people from hazardous environments and operations.
How well can we test probabilistic seismic hazard maps?
NASA Astrophysics Data System (ADS)
Vanneste, Kris; Stein, Seth; Camelbeeck, Thierry; Vleminckx, Bart
2017-04-01
Recent large earthquakes that gave rise to shaking much stronger than shown in probabilistic seismic hazard (PSH) maps have stimulated discussion about how well these maps forecast future shaking. These discussions have brought home the fact that although the maps are designed to achieve certain goals, we know little about how well they actually perform. As for any other forecast, this question involves verification and validation. Verification involves assessing how well the algorithm used to produce hazard maps implements the conceptual PSH model ("have we built the model right?"). Validation asks how well the model forecasts the shaking that actually occurs ("have we built the right model?"). We explore the verification issue by simulating shaking histories for an area with assumed uniform distribution of earthquakes, Gutenberg-Richter magnitude-frequency relation, Poisson temporal occurrence model, and ground-motion prediction equation (GMPE). We compare the maximum simulated shaking at many sites over time with that predicted by a hazard map generated for the same set of parameters. The Poisson model predicts that the fraction of sites at which shaking will exceed that of the hazard map is p = 1 - exp(-t/T), where t is the duration of observations and T is the map's return period. Exceedance is typically associated with infrequent large earthquakes, as observed in real cases. The ensemble of simulated earthquake histories yields distributions of fractional exceedance with mean equal to the predicted value. Hence, the PSH algorithm appears to be internally consistent and can be regarded as verified for this set of simulations. However, simulated fractional exceedances show a large scatter about the mean value that decreases with increasing t/T, increasing observation time and increasing Gutenberg-Richter a-value (combining intrinsic activity rate and surface area), but is independent of GMPE uncertainty. This scatter is due to the variability of earthquake recurrence, and so decreases as the largest earthquakes occur in more simulations. Our results are important for evaluating the performance of a hazard map based on misfits in fractional exceedance, and for assessing whether such misfit arises by chance or reflects a bias in the map. More specifically, we determined for a broad range of Gutenberg-Richter a-values theoretical confidence intervals on allowed misfits in fractional exceedance and on the percentage of hazard-map bias that can thus be detected by comparison with observed shaking histories. Given that in the real world we only have one shaking history for an area, these results indicate that even if a hazard map does not fit the observations, it is very difficult to assess its veracity, especially for low-to-moderate-seismicity regions. Because our model is a simplified version of reality, any additional uncertainty or complexity will tend to widen these confidence intervals.
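The verification experiment described above can be reproduced in miniature: under the Poisson model the expected fraction of sites exceeding a map with return period T after t years is 1 - exp(-t/T), with scatter across simulated histories. The parameters in the sketch below are assumed, not the study's.

```python
# Monte Carlo sketch of the verification experiment described above
# (all parameters assumed): with Poisson occurrence, the expected
# fraction of sites exceeding a map with return period T after t years
# is 1 - exp(-t/T), but individual histories scatter around that value.
import numpy as np

rng = np.random.default_rng(42)
T, t, n_sites, n_histories = 475.0, 50.0, 1000, 200

# Number of map-level exceedances per site follows Poisson(t/T)
counts = rng.poisson(t / T, size=(n_histories, n_sites))
frac_exceed = (counts > 0).mean(axis=1)        # one fraction per history

print("theory:", 1 - np.exp(-t / T))           # ~0.100
print("simulated mean:", frac_exceed.mean().round(3),
      "scatter (std):", frac_exceed.std().round(3))
```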
Letang, Emilio; Lewis, James J; Bower, Mark; Mosam, Anisa; Borok, Margareth; Campbell, Thomas B; Naniche, Denise; Newsom-Davis, Tom; Shaik, Fahmida; Fiorillo, Suzanne; Miro, Jose M; Schellenberg, David; Easterbrook, Philippa J
2013-06-19
To assess the incidence, predictors, and outcomes of Kaposi sarcoma-associated paradoxical immune reconstitution inflammatory syndrome (KS-IRIS) in antiretroviral therapy (ART)-naive HIV-infected patients with Kaposi sarcoma initiating ART in both well resourced and limited-resourced settings. Pooled analysis of three prospective cohorts of ART-naive HIV-infected patients with Kaposi sarcoma from sub-Saharan Africa (SSA) and one from the UK. KS-IRIS case definition was standardized across sites. Cox regression and Kaplan-Meier survival analysis were used to identify the incidence and predictors of KS-IRIS and Kaposi sarcoma-associated mortality. Fifty-eight of 417 (13.9%) eligible individuals experienced KS-IRIS with an incidence 2.5 times higher in the African vs. European cohorts (P=0.001). ART alone as initial Kaposi sarcoma treatment (hazard ratio 2.97, 95% confidence interval (CI) 1.02-8.69); T1 Kaposi sarcoma stage (hazard ratio 2.96, 95% CI 1.26-6.94); and plasma HIV-1 RNA more than 5 log₁₀ copies/ml (hazard ratio 2.14, 95% CI 1.25-3.67) independently predicted KS-IRIS at baseline. Detectable plasma Kaposi sarcoma-associated herpes virus (KSHV) DNA additionally predicted KS-IRIS among the 259 patients with KSHV DNA assessed (hazard ratio 2.98, 95% CI 1.23-7.19). Nineteen KS-IRIS patients died, all in SSA. Kaposi sarcoma mortality was 3.3-fold higher in Africa, and was predicted by KS-IRIS (hazard ratio 19.24, CI 7.62-48.58), lack of chemotherapy (hazard ratio 2.35, 95% CI 1.09-5.05), pre-ART CD4 cell count less than 200 cells/μl (hazard ratio 2.04, 95% CI 0.99-4.2), and detectable baseline KSHV DNA (hazard ratio 2.12, 95% CI 0.94-4.77). KS-IRIS incidence and mortality are higher in SSA than in the UK. This is largely explained by the more advanced Kaposi sarcoma disease and lower chemotherapy availability. KS-IRIS is a major contributor to Kaposi sarcoma-associated mortality in Africa. Our results support the need to increase awareness on KS-IRIS, encourage earlier presentation, referral and diagnosis of Kaposi sarcoma, and advocate on access to systemic chemotherapy in Africa. © 2013 Wolters Kluwer Health | Lippincott Williams & Wilkins
Hazard assessment of long-period ground motions for the Nankai Trough earthquakes
NASA Astrophysics Data System (ADS)
Maeda, T.; Morikawa, N.; Aoi, S.; Fujiwara, H.
2013-12-01
We evaluate the seismic hazard from long-period ground motions associated with the Nankai Trough earthquakes (M8~9) in southwest Japan. Large interplate earthquakes occurring around the Nankai Trough have caused serious damage due to strong ground motions and tsunami; the most recent events were in 1944 and 1946. Such large interplate earthquakes can also damage high-rise and large-scale structures through long-period ground motions (e.g., the 1985 Michoacan earthquake in Mexico and the 2003 Tokachi-oki earthquake in Japan). Long-period ground motions are amplified particularly on basins. Because the major cities along the Nankai Trough have developed on alluvial plains, it is therefore important to evaluate long-period ground motions as well as strong motions and tsunami for the anticipated Nankai Trough earthquakes. The long-period ground motions are evaluated by the finite difference method (FDM) using 'characterized source models' and a 3-D underground structure model. A 'characterized source model' refers to a source model that includes the source parameters necessary for reproducing strong ground motions. The parameters are determined based on a 'recipe' for predicting strong ground motion (Earthquake Research Committee (ERC), 2009). We construct various source models (~100 scenarios) covering various cases of source parameters such as source region, asperity configuration, and hypocenter location. Each source region is determined by 'the long-term evaluation of earthquakes in the Nankai Trough' published by the ERC. The asperity configuration and hypocenter location control the rupture directivity effects. These parameters are important because our preliminary simulations are strongly affected by rupture directivity. We apply the system called GMS (Ground Motion Simulator), which simulates seismic wave propagation with a 3-D FDM scheme using discontinuous grids (Aoi and Fujiwara, 1999). The grid spacing for the shallow region is 200 m horizontally and 100 m vertically; the grid spacing for the deep region is three times coarser. The total number of grid points is about three billion. The 3-D underground structure model used in the FD simulation is the Japan integrated velocity structure model (ERC, 2012). Our simulation is valid for periods longer than two seconds, given the lowest S-wave velocity and the grid spacing. However, because the characterized source model may not sufficiently represent short-period components, the reliable period range of the simulation should be interpreted with caution; we therefore consider periods longer than five seconds instead of two seconds for further analysis. We evaluate the long-period ground motions using velocity response spectra for the period range between five and 20 seconds. The preliminary simulation shows a large variation of response spectra at a site. This large variation implies that the ground motion is very sensitive to the choice of scenario, and this variation must be studied to understand the seismic hazard. Our further study will obtain hazard curves for the Nankai Trough earthquakes (M8~9) by applying probabilistic seismic hazard analysis to the simulation results.
A Model-Based Prioritisation Exercise for the European Water Framework Directive
Daginnus, Klaus; Gottardo, Stefania; Payá-Pérez, Ana; Whitehouse, Paul; Wilkinson, Helen; Zaldívar, José-Manuel
2011-01-01
A model-based prioritisation exercise has been carried out for the Water Framework Directive (WFD) implementation. The approach considers two aspects, the hazard of a given chemical and its exposure levels, and focuses on aquatic ecosystems, but also takes into account hazards due to secondary poisoning, bioaccumulation through the food chain, and potential human health effects. A list provided by EU Member States, stakeholders, and non-governmental organizations comprising 2,034 substances was evaluated according to hazard and exposure criteria. Then 78 substances classified as “of high concern” were analysed and ranked in terms of risk ratio (Predicted Environmental Concentration/Predicted No-Effect Concentration). This exercise has been complemented by a monitoring-based prioritisation exercise using data provided by Member States. The proposed approach constitutes the first step in setting the basis for an open modular screening tool that could be used for the next prioritisation exercises foreseen by the WFD. PMID:21556195
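The ranking step described above reduces, at its core, to sorting substances by the PEC/PNEC risk ratio. A minimal Python sketch with invented substance names and concentrations:

```python
# Sketch: rank candidate substances by risk ratio (PEC/PNEC); values invented.
import pandas as pd

substances = pd.DataFrame({
    "substance": ["A", "B", "C", "D"],
    "pec_ug_per_l":  [0.8, 0.05, 2.1, 0.4],   # Predicted Environmental Concentration
    "pnec_ug_per_l": [0.1, 0.5, 3.0, 0.02],   # Predicted No-Effect Concentration
})
substances["risk_ratio"] = substances["pec_ug_per_l"] / substances["pnec_ug_per_l"]
# A risk ratio > 1 flags potential risk; sort descending for prioritisation.
print(substances.sort_values("risk_ratio", ascending=False))
```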
Models of volcanic eruption hazards
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wohletz, K.H.
1992-01-01
Volcanic eruptions pose an ever present but poorly constrained hazard to life and property for geothermal installations in volcanic areas. Because eruptions occur sporadically and may limit field access, quantitative and systematic field studies of eruptions are difficult to complete. Circumventing this difficulty, laboratory models and numerical simulations are pivotal in building our understanding of eruptions. For example, the results of fuel-coolant interaction experiments show that magma-water interaction controls many eruption styles. Applying these results, increasing numbers of field studies now document and interpret the role of external water in eruptions. Similarly, numerical simulations solve the fundamental physics of high-speed fluid flow and give quantitative predictions that elucidate the complexities of pyroclastic flows and surges. A primary goal of these models is to guide geologists in searching for critical field relationships and making their interpretations. Coupled with field work, modeling is beginning to allow more quantitative and predictive volcanic hazard assessments.
NASA Astrophysics Data System (ADS)
Edwards, John L.; Beekman, Randy M.; Buchanan, David B.; Farner, Scott; Gershzohn, Gary R.; Khuzadi, Mbuyi; Mikula, D. F.; Nissen, Gerry; Peck, James; Taylor, Shaun
2007-04-01
Human space travel is inherently dangerous. Hazardous conditions will exist. Real-time health monitoring of critical subsystems is essential for providing a safe abort timeline in the event of a catastrophic subsystem failure. In this paper, we discuss a practical and cost-effective process for developing critical subsystem failure detection, diagnosis and response (FDDR). We also present the results of a real-time health monitoring simulation of a propellant ullage pressurization subsystem failure. The health monitoring development process identifies hazards, isolates hazard causes, defines software partitioning requirements, and quantifies software algorithm development. The process provides a means to establish the number and placement of sensors necessary for real-time health monitoring. We discuss how health monitoring software tracks subsystem control commands, interprets off-nominal operational sensor data, predicts failure propagation timelines, corroborates failure predictions, and formats failure protocols.
Complete hazard ranking to analyze right-censored data: An ALS survival study.
Huang, Zhengnan; Zhang, Hongjiu; Boss, Jonathan; Goutman, Stephen A; Mukherjee, Bhramar; Dinov, Ivo D; Guan, Yuanfang
2017-12-01
Survival analysis represents an important outcome measure in clinical research and clinical trials; further, survival ranking may offer additional advantages in clinical trials. In this study, we developed GuanRank, a non-parametric ranking-based technique to transform patients' survival data into a linear space of hazard ranks. The transformation enables the utilization of machine learning base-learners, including Gaussian process regression, Lasso, and random forest, on survival data. The method was submitted to the DREAM Amyotrophic Lateral Sclerosis (ALS) Stratification Challenge. Ranked first place, the model gave more accurate ranking predictions on the PRO-ACT ALS dataset than the Cox proportional hazards model. By utilizing right-censored data in its training process, the method demonstrated state-of-the-art predictive power in ALS survival ranking. Its feature selection identified multiple important factors, some of which conflict with previous studies.
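The general idea, recasting right-censored survival data as rank targets so that ordinary regressors can be trained on them, can be sketched as below. This is a deliberately crude stand-in, not the published GuanRank algorithm, and all data are simulated.

```python
# Sketch: convert censored survival data to rank-style targets, then train a
# standard regressor on them. GuanRank itself derives ranks more carefully
# from pairwise comparison probabilities; this is only the flavor of the idea.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))            # clinical features (noise here)
time = rng.exponential(12.0, 100)        # observed follow-up time
event = rng.integers(0, 2, 100)          # 1 = death observed, 0 = censored

order = np.argsort(time)
rank = np.empty(100)
rank[order] = np.linspace(1.0, 0.0, 100) # early failure -> high hazard rank
y = np.where(event == 1, rank, rank * 0.5)  # crudely downweight censored cases

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
print(model.predict(X[:5]))              # predicted hazard-rank scores
```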
Volcanic ash melting under conditions relevant to ash turbine interactions.
Song, Wenjia; Lavallée, Yan; Hess, Kai-Uwe; Kueppers, Ulrich; Cimarelli, Corrado; Dingwell, Donald B
2016-03-02
The ingestion of volcanic ash by jet engines is widely recognized as a potentially fatal hazard for aircraft operation. The high temperatures (1,200-2,000 °C) typical of jet engines exacerbate the impact of ash by provoking its melting and sticking to turbine parts. Estimation of this potential hazard is complicated by the fact that chemical composition, which affects the temperature at which volcanic ash becomes liquid, can vary widely amongst volcanoes. Here, based on experiments, we parameterize ash behaviour and develop a model to predict melting and sticking conditions for its global compositional range. The results of our experiments confirm that the common use of sand or dust as a proxy is wholly inadequate for the prediction of the behaviour of volcanic ash, leading to overestimates of sticking temperature and thus severe underestimates of the thermal hazard. Our model can be used to assess the deposition probability of volcanic ash in jet engines.
Comparing the Performance of Japan's Earthquake Hazard Maps to Uniform and Randomized Maps
NASA Astrophysics Data System (ADS)
Brooks, E. M.; Stein, S. A.; Spencer, B. D.
2015-12-01
The devastating 2011 magnitude 9.1 Tohoku earthquake and the resulting shaking and tsunami were much larger than anticipated in earthquake hazard maps. Because this and all other earthquakes that caused ten or more fatalities in Japan since 1979 occurred in places assigned a relatively low hazard, Geller (2011) argued that "all of Japan is at risk from earthquakes, and the present state of seismological science does not allow us to reliably differentiate the risk level in particular geographic areas," so a map showing uniform hazard would be preferable to the existing map. Defenders of the maps countered that these earthquakes are low-probability events allowed by the maps, which predict the levels of shaking that should be expected with a certain probability over a given time. Although such maps are used worldwide in making costly policy decisions for earthquake-resistant construction, how well they actually perform is unknown. We explore this hotly contested issue by comparing how well a 510-year-long record of earthquake shaking in Japan is described by the Japanese national hazard (JNH) maps, uniform maps, and randomized maps. Surprisingly, as measured by the metric implicit in the JNH maps, i.e., that during the chosen time interval the predicted ground motion should be exceeded only at a specific fraction of the sites, both uniform and randomized maps do better than the actual maps. However, using as a metric the squared misfit between maximum observed shaking and that predicted, the JNH maps do better than uniform or randomized maps. These results indicate that the JNH maps are not performing as well as expected, that the factors controlling map performance are complicated, and that learning more about how maps perform, and why, would be valuable in making more effective policy.
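Both performance metrics described above are easy to state in code. The sketch below assumes, for illustration only, an array of map-predicted shaking levels at a target exceedance probability p and the maximum shaking actually observed at each site; all data are synthetic.

```python
# Sketch of the two map-performance metrics discussed above.
import numpy as np

def fractional_exceedance_metric(observed_max, predicted, p):
    """|f - p|: the fraction of sites f where observed shaking exceeded the
    map value should match the map's target exceedance probability p."""
    f = np.mean(observed_max > predicted)
    return abs(f - p)

def squared_misfit_metric(observed_max, predicted):
    """Mean squared difference between observed maxima and map predictions."""
    return np.mean((observed_max - predicted) ** 2)

rng = np.random.default_rng(1)
obs = rng.lognormal(mean=0.0, sigma=0.5, size=500)  # observed maxima (toy units)
hazard_map = np.full(500, np.quantile(obs, 0.9))    # a "uniform" map for comparison
print(fractional_exceedance_metric(obs, hazard_map, p=0.10))
print(squared_misfit_metric(obs, hazard_map))
```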
NASA Astrophysics Data System (ADS)
Rahman, M. Moklesur; Bai, Ling; Khan, Nangyal Ghani; Li, Guohui
2018-02-01
The Himalayan-Tibetan region has a long history of devastating earthquakes with widespread casualties and socio-economic damage. Here, we conduct a probabilistic seismic hazard analysis by incorporating incomplete historical earthquake records along with instrumental earthquake catalogs for the Himalayan-Tibetan region. Historical earthquake records dating back more than 1000 years and an updated, homogenized, and declustered instrumental earthquake catalog since 1906 are utilized. The essential seismicity parameters, namely the mean seismicity rate γ, the Gutenberg-Richter b value, and the maximum expected magnitude Mmax, are estimated using a maximum likelihood algorithm that allows for the incompleteness of the catalog. To compute the hazard, three seismogenic source models (smoothed gridded, linear, and areal sources) and two sets of ground motion prediction equations are combined by means of a logic tree to account for epistemic uncertainties. The peak ground acceleration (PGA) and spectral acceleration (SA) at 0.2 and 1.0 s are predicted for 2 and 10% probabilities of exceedance over 50 years, assuming bedrock conditions. The resulting PGA and SA maps show significant spatio-temporal variation in hazard values. In general, hazard values are found to be much higher than in previous studies for regions where great earthquakes have actually occurred. The use of historical and instrumental earthquake catalogs in combination with multiple seismogenic source models provides better seismic hazard constraints for the Himalayan-Tibetan region.
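One of the seismicity parameters listed above, the Gutenberg-Richter b value, is commonly estimated with the Aki maximum-likelihood formula (with Utsu's correction for magnitude binning). A sketch on a synthetic catalog, assuming a completeness magnitude of 4.0 and 0.1-unit magnitude bins:

```python
# Sketch: Aki/Utsu maximum-likelihood b value on a synthetic catalog.
import numpy as np

def b_value_ml(mags, m_c, dm=0.1):
    """b = log10(e) / (mean(M) - (m_c - dm/2)) for magnitudes >= m_c."""
    m = mags[mags >= m_c]
    return np.log10(np.e) / (m.mean() - (m_c - dm / 2.0))

rng = np.random.default_rng(42)
raw = 3.95 + rng.exponential(scale=1.0 / np.log(10), size=2000)  # b = 1 process
catalog = np.round(raw / 0.1) * 0.1                              # bin to 0.1 units
print(round(b_value_ml(catalog, m_c=4.0), 2))  # close to the b = 1 used to simulate
```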
About Using Predictive Models and Tools To Assess Chemicals under TSCA
As part of EPA's effort to promote chemical safety, OPPT provides public access to predictive models and tools that can help inform the public about the hazards and risks of substances and improve chemical management decisions.
EPA's National Center for Computational Toxicology is developing methods that apply computational chemistry, high-throughput screening (HTS), and genomic technologies to predict potential toxicity and prioritize the use of limited testing resources.
Negatively-Biased Credulity and the Cultural Evolution of Beliefs
Fessler, Daniel M. T.; Pisor, Anne C.; Navarrete, Carlos David
2014-01-01
The functions of cultural beliefs are often opaque to those who hold them. Accordingly, to benefit from cultural evolution’s ability to solve complex adaptive problems, learners must be credulous. However, credulity entails costs, including susceptibility to exploitation, and effort wasted due to false beliefs. One determinant of the optimal level of credulity is the ratio between the costs of two types of errors: erroneous incredulity (failing to believe information that is true) and erroneous credulity (believing information that is false). This ratio can be expected to be asymmetric when information concerns hazards, as the costs of erroneous incredulity will, on average, exceed the costs of erroneous credulity; no equivalent asymmetry characterizes information concerning benefits. Natural selection can therefore be expected to have crafted learners’ minds so as to be more credulous toward information concerning hazards. This negatively-biased credulity extends general negativity bias, the adaptive tendency for negative events to be more salient than positive events. Together, these biases constitute attractors that should shape cultural evolution via the aggregated effects of learners’ differential retention and transmission of information. In two studies in the U.S., we demonstrate the existence of negatively-biased credulity, and show that it is most pronounced in those who believe the world to be dangerous, individuals who may constitute important nodes in cultural transmission networks. We then document the predicted imbalance in cultural content using a sample of urban legends collected from the Internet and a sample of supernatural beliefs obtained from ethnographies of a representative collection of the world’s cultures, showing that beliefs about hazards predominate in both. PMID:24736596
Kim, Sung Han; Park, Boram; Joo, Jungnam; Joung, Jae Young; Seo, Ho Kyung; Chung, Jinsoo; Lee, Kang Hyun
2017-01-01
Objective: To evaluate predictive factors for retrograde ureteral stent failure in patients with non-urological malignant ureteral obstruction. Materials and methods: Between 2005 and 2014, the medical records of 284 malignant ureteral obstruction patients with 712 retrograde ureteral stent trials, including 63 (22.2%) with bilateral malignant ureteral obstruction, were retrospectively reviewed. Retrograde ureteral stent failure was defined as the inability to place ureteral stents by cystoscopy, recurrent stent obstruction within one month, or non-relief of azotemia within one week of the prior retrograde ureteral stent. The clinicopathological parameters and first retrograde pyelography (RGP) findings were analyzed to investigate the predictive factors for retrograde ureteral stent failure and conversion to percutaneous nephrostomy in multivariate analysis, with statistical significance set at p < 0.05. Results: Retrograde ureteral stent failure was detected in 14.1% of patients. The mean number of retrograde ureteral stent placements and the mean indwelling duration of the ureteral stents were 2.5 ± 2.6 times and 8.6 ± 4.0 months, respectively. Multivariate analyses identified several specific RGP findings as significant predictive factors for retrograde ureteral stent failure (p < 0.05): grade 4 hydronephrosis (hazard ratio 4.10, 95% confidence interval 1.39–12.09), irreversible ureteral kinking (hazard ratio 2.72, confidence interval 1.03–7.18), presence of bladder invasion (hazard ratio 4.78, confidence interval 1.81–12.63), and multiple lesions of ureteral stricture (hazard ratio 3.46, confidence interval 1.35–8.83). Conclusion: Retrograde pyelography might prevent unnecessary and ineffective retrograde ureteral stent trials in patients with advanced non-urological malignant ureteral obstruction. PMID:28931043
Yeh, Hsin-Chih; Jan, Hau-Chern; Wu, Wen-Jeng; Li, Ching-Chia; Li, Wei-Ming; Ke, Hung-Lung; Huang, Shu-Pin; Liu, Chia-Chu; Lee, Yung-Chin; Yang, Sheau-Fang; Liang, Peir-In; Huang, Chun-Nung
2015-01-01
To investigate the impact of preoperative hydronephrosis and flank pain on the prognosis of patients with upper tract urothelial carcinoma. In total, 472 patients with upper tract urothelial carcinoma managed by radical nephroureterectomy were included from the Kaohsiung Medical University Hospital Healthcare System. Clinicopathological data were collected retrospectively for analysis. The significance of hydronephrosis, especially when combined with flank pain, and other relevant factors for overall and cancer-specific survival was evaluated. Of the 472 patients, 292 (62%) had preoperative hydronephrosis and 121 (26%) presented with flank pain. Preoperative hydronephrosis was significantly associated with age, hematuria, flank pain, tumor location, and pathological tumor stage. Concurrent presence of hydronephrosis and flank pain was a significant predictor of non-organ-confined disease (multivariate-adjusted hazard ratio = 2.10, P = 0.025). Kaplan-Meier analysis showed significantly poorer overall and cancer-specific survival in patients with preoperative hydronephrosis (P = 0.005 and P = 0.026, respectively) and in patients with flank pain (P < 0.001 and P = 0.001, respectively) than in those without. However, only simultaneous hydronephrosis and flank pain independently predicted adverse outcome in multivariate Cox proportional hazards models (hazard ratio = 1.98, P = 0.016 for overall survival and hazard ratio = 1.87, P = 0.036 for cancer-specific survival). In addition, the concurrent presence of hydronephrosis and flank pain was also significantly predictive of worse survival in patients with high-grade or muscle-invasive disease. Notably, there was no difference in survival between patients with hydronephrosis but without flank pain and those without hydronephrosis. Concurrent preoperative presence of hydronephrosis and flank pain predicted non-organ-confined status of upper tract urothelial carcinoma. When accompanied by flank pain, hydronephrosis represented an independent predictor of worse outcome in patients with upper tract urothelial carcinoma.
Shikany, James M; Safford, Monika M; Newby, P K; Durant, Raegan W; Brown, Todd M; Judd, Suzanne E
2015-09-01
The association of overall diet, as characterized by dietary patterns, with risk of incident acute coronary heart disease (CHD) has not been studied extensively in samples including sociodemographic and regional diversity. We used data from 17 418 participants in Reasons for Geographic and Racial Differences in Stroke (REGARDS), a national, population-based, longitudinal study of white and black adults aged ≥45 years, enrolled from 2003 to 2007. We derived dietary patterns with factor analysis and used Cox proportional hazards regression to examine hazard of incident acute CHD events - nonfatal myocardial infarction and acute CHD death - associated with quartiles of consumption of each pattern, adjusted for various levels of covariates. Five primary dietary patterns emerged: Convenience, Plant-based, Sweets, Southern, and Alcohol and Salad. A total of 536 acute CHD events occurred over a median (interquartile range) 5.8 (2.1) years of follow-up. After adjustment for sociodemographics, lifestyle factors, and energy intake, highest consumers of the Southern pattern (characterized by added fats, fried food, eggs, organ and processed meats, and sugar-sweetened beverages) experienced a 56% higher hazard of acute CHD (comparing quartile 4 with quartile 1: hazard ratio, 1.56; 95% confidence interval, 1.17-2.08; P for trend across quartiles=0.003). Adding anthropometric and medical history variables to the model attenuated the association somewhat (hazard ratio, 1.37; 95% confidence interval, 1.01-1.85; P=0.036). A dietary pattern characteristic of the southern United States was associated with greater hazard of CHD in this sample of white and black adults in diverse regions of the United States. © 2015 American Heart Association, Inc.
Kajantie, Eero; Räikkönen, Katri; Henriksson, Markus; Leskinen, Jukka T; Forsén, Tom; Heinonen, Kati; Pesonen, Anu-Katriina; Osmond, Clive; Barker, David J P; Eriksson, Johan G
2012-01-01
Low intellectual ability is associated with an increased risk of coronary heart disease and stroke. Most studies have used a general intelligence score. We studied whether three different subscores of intellectual ability predict these disorders. We studied 2,786 men, born between 1934 and 1944 in Helsinki, Finland, who as conscripts at age 20 underwent an intellectual ability test comprising verbal, visuospatial (analogous to Raven's progressive matrices) and arithmetic reasoning subtests. We ascertained the later occurrence of coronary heart disease and stroke from validated national hospital discharge and death registers. 281 men (10.1%) had experienced a coronary heart disease event and 131 (4.7%) a stroke event. Coronary heart disease was predicted by low scores in all subtests, hazard ratios for each standard deviation (SD) lower score ranging from 1.21 to 1.30 (confidence intervals 1.08 to 1.46). Stroke was predicted by a low visuospatial reasoning score, the corresponding hazard ratio being 1.23 (95% confidence interval 1.04 to 1.46), adjusted for year and age at testing. Adjusted in addition for the two other scores, the hazard ratio was 1.40 (1.10 to 1.79). This hazard ratio was little affected by adjustment for socioeconomic status in childhood and adult life, whereas the same adjustments attenuated the associations between intellectual ability and coronary heart disease. The associations with stroke were also unchanged when adjusted for systolic blood pressure at 20 years and reimbursement for adult antihypertensive medication. Stroke is predicted by low visuospatial reasoning scores in relation to scores in the two other subtests. This association may be mediated by common underlying causes such as impaired brain development, rather than by mechanisms associated with risk factors shared by stroke and coronary heart disease, such as socio-economic status, hypertension and atherosclerosis.
The US EPA’s ToxCast program has generated a wealth of data in >600 in vitro assays on a library of 1060 environmentally relevant chemicals and failed pharmaceuticals to facilitate hazard identification. An inherent criticism of many in vitro-based strategies is the inability of a...
Momentary changes in craving predict smoking lapse behavior: a laboratory study.
Motschman, Courtney A; Germeroth, Lisa J; Tiffany, Stephen T
2018-04-27
Current research on factors that predict smoking lapse behavior is limited in its ability to fully characterize the critical moments leading up to decisions to smoke. We used a validated and widely used experimental analogue for smoking lapse to assess how moment-to-moment dynamics of craving relate to decisions to smoke. Heavy smokers (N = 128, mean age = 35.9) participated in a 50-min laboratory delay-to-smoking task on 2 consecutive days, earning money for each 5 min they remained abstinent or ending the task by choosing to smoke. Participants rated craving and negative affect levels immediately prior to each choice. Participants were randomized to smoking as usual (n = 50) or overnight abstinence (n = 50 successfully abstained, n = 22 failed to abstain) prior to session 2. Discrete-time hazard models were used to examine craving and negative affect as time-varying predictors of smoking. Higher craving levels prior to smoking opportunities predicted increased risk of smoking. When controlling for craving levels, incremental increases in craving predicted increased smoking risk. Increases in negative affect incrementally predicted increased smoking risk at session 2 only. Smokers who failed to abstain were at a higher risk of smoking than those who successfully abstained, whereas abstinent smokers did not differ in smoking risk from those who smoked as usual. Findings demonstrate an extension of the smoking lapse paradigm that can be utilized to capture momentary changes in craving that predict smoking behavior. Evaluations of nuanced craving experiences may inform clinical and pharmacological research on preventing smoking lapse and relapse.
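Discrete-time hazard models of this kind are typically fit as logistic regressions on person-period data, with one row per choice point and survival to that point implied by the row's presence. The sketch below simulates such data; the variable names and effect sizes are invented, not the study's.

```python
# Sketch: discrete-time hazard model fit as a logistic regression on
# person-period data, with craving as a time-varying predictor.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
rows = []
for pid in range(60):
    craving = rng.uniform(0, 10)
    for period in range(1, 11):              # up to ten 5-min choice points
        craving += rng.normal(0.3, 0.5)      # craving drifts upward over time
        p_smoke = 1.0 / (1.0 + np.exp(-(-4.0 + 0.35 * craving)))
        smoked = rng.random() < p_smoke
        rows.append({"pid": pid, "period": period,
                     "craving": craving, "smoked": int(smoked)})
        if smoked:                            # no further periods after smoking
            break

pp = pd.DataFrame(rows)
fit = smf.logit("smoked ~ period + craving", data=pp).fit(disp=False)
print(fit.params)   # positive craving coefficient expected by construction
```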
Spacecraft Charging: Hazard Causes, Hazard Effects, Hazard Controls
NASA Technical Reports Server (NTRS)
Koontz, Steve.
2018-01-01
Spacecraft flight environments are characterized both by a wide range of space plasma conditions and by ionizing radiation (IR), solar ultraviolet and X-rays, magnetic fields, micrometeoroids, orbital debris, and other environmental factors, all of which can affect spacecraft performance. Dr. Steven Koontz's lecture will provide a solid foundation in the basic engineering physics of spacecraft charging and charging effects that can be applied to solving practical spacecraft and spacesuit engineering design, verification, and operations problems, with an emphasis on spacecraft operations in low-Earth orbit, Earth's magnetosphere, and cis-Lunar space.
Use of Citizen Science and Social Media to Improve Wind Hazard and Damage Characterization
NASA Astrophysics Data System (ADS)
Lombardo, F.; Meidani, H.
2017-12-01
Windstorm losses are significant in the U.S. annually and cause damage worldwide. A large percentage of losses are caused by localized events (e.g., tornadoes). In order to better mitigate these losses, improvement is needed in understanding the hazard characteristics and physical damage. However, due to the small-scale nature of these events, the resolution of the dedicated measuring network does not capture most occurrences. As a result, damage-based assessments are sometimes used to gauge intensity. These damage assessments often suffer from a lack of available manpower, inability to arrive at the scene rapidly, and difficulty accessing a damaged site. The use and rapid dissemination of social media, the power of crowds engaged in scientific endeavors, and the public's awareness of their vulnerabilities point to a paradigm shift in how hazards can be sensed in a rapid manner. In this way, 'human-sensor' data has the potential to radically improve fundamental understanding of hazards and disasters and resolve some of the existing challenges in wind hazard and damage characterization. Data from social media outlets such as Twitter have been used to aid in damage assessments from hazards such as floods and earthquakes; however, the reliability and uncertainty of participatory sensing has been questioned and has been called the 'biggest challenge' for its sustained use. This research investigates the efficacy of both citizen science applications and social media data to represent wind hazards and associated damage. Research has focused on a two-phase approach: 1) having citizen scientists perform their own 'damage survey' (i.e., questionnaire) with known damage to assess uncertainty in estimation, and 2) downloading and analyzing social media text and imagery streams to ascertain the possibility of performing 'unstructured damage surveys'. Early results have shown that the untrained public can estimate tornado damage levels in residential structures with some accuracy. In addition, valuable windstorm hazard and damage information in both text and imagery can be extracted and archived from Twitter in an automated fashion. Information extracted from these sources will feed into advances in hazard and disaster modeling, social-cognitive theories of human behavior, and decision-making for hazard mitigation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Attebery, C.W.; Zimmer, A.T.; Hedgecock, N.S.
1989-01-01
A wastewater-characterization and hazardous-waste survey was conducted at Beale AFB by USAFOEHL/ECQ personnel to provide the base with sufficient information to address a State of California Notice of Violation concerning excessive discharges of boron and cyanide from the base sewage-treatment plant (STP). The results of the survey showed that the 9th RTS Precision Photo Lab, along with other film-processing organizations, was a major contributor to the boron and cyanide discharge problems being experienced by the base STP. Maintenance organizations that utilize soaps and detergents containing boron and cyanide also contributed to the problem.
Science should warn people of looming disaster
NASA Astrophysics Data System (ADS)
Kossobokov, Vladimir
2014-05-01
Contemporary science has not coped with the challenging changes in exposure and vulnerability inflicted by a growing and increasingly concentrated population, which result in a steady increase in losses from natural hazards. Scientists also owe Society for shortfalls in specialized knowledge, education, and communication. In fact, it appears that few seismic hazard assessment programs and/or methodologies were tested appropriately against real observations before being endorsed for the estimation of earthquake-related risks. The fatal evidence and aftermath of the past decades prove that many of the existing internationally accepted methodologies are grossly misleading and evidently unacceptable for any kind of responsible risk evaluation and knowledgeable disaster prevention. In contrast, the confirmed reliability of pattern recognition aimed at earthquake-prone areas and times of increased probability, along with realistic earthquake scaling and scenario modeling, allows us to conclude that contemporary science can do a better job of disclosing natural hazards, assessing risks, and delivering this state-of-the-art knowledge of looming disaster in advance of catastrophic events. In lieu of seismic observations long enough for a reliable probabilistic assessment, or of a comprehensive physical theory of earthquake recurrence, pattern recognition applied to available geophysical and/or geological data sets remains a broad avenue to follow in seismic hazard forecast/prediction. Moreover, better understanding of the seismic process, in terms of the non-linear dynamics of a hierarchical system of blocks-and-faults and deterministic chaos, supports progress toward new approaches for assessing time-dependent seismic hazard based on multiscale analysis of seismic activity and reproducible intermediate-term earthquake prediction techniques. The algorithms, which make use of the multidisciplinary data available and account for the fractal nature of earthquake distributions in space and time, have confirmed their reliability through durable statistical testing in ongoing regular real-time application lasting more than 20 years. Geoscientists must initiate a shift in the community's mind from pessimistic disbelief in forecast/prediction products to optimistic, challenging views on hazard predictability in space and time, so as not to repeat missed opportunities for disaster preparedness, as happened in advance of the 2009 M6.3 L'Aquila earthquake in Italy and the 2011 M9.0 mega-thrust off the Pacific coast of the Tōhoku region of Japan.
Desrosiers, Christian; Hassan, Lama; Tanougast, Camel
2016-01-01
Objective: Predicting the survival outcome of patients with glioblastoma multiforme (GBM) is of key importance to clinicians for selecting the optimal course of treatment. The goal of this study was to evaluate the usefulness of geometric shape features, extracted from MR images, as a potential non-invasive way to characterize GBM tumours and predict the overall survival times of patients with GBM. Methods: The data of 40 patients with GBM were obtained from the Cancer Genome Atlas and Cancer Imaging Archive. The T1 weighted post-contrast and fluid-attenuated inversion-recovery volumes of patients were co-registered and segmented to delineate regions corresponding to three GBM phenotypes: necrosis, active tumour and oedema/invasion. A set of two-dimensional shape features was then extracted slicewise from each phenotype region and combined over slices to describe the three-dimensional shape of these phenotypes. Thereafter, a Kruskal–Wallis test was employed to identify shape features with significantly different distributions across phenotypes. Moreover, a Kaplan–Meier analysis was performed to find features strongly associated with GBM survival. Finally, a multivariate analysis based on the random forest model was used for predicting the survival group of patients with GBM. Results: Our analysis using the Kruskal–Wallis test showed that all but one shape feature had statistically significant differences across phenotypes, with p-value < 0.05 following Holm–Bonferroni correction, justifying the analysis of GBM tumour shapes on a per-phenotype basis. Furthermore, the survival analysis based on the Kaplan–Meier estimator identified three features derived from necrotic regions (i.e. Eccentricity, Extent and Solidity) that were significantly correlated with overall survival (corrected p-value < 0.05; hazard ratios between 1.68 and 1.87). In the multivariate analysis, features from necrotic regions gave the highest accuracy in predicting the survival group of patients, with a mean area under the receiver-operating characteristic curve (AUC) of 63.85%. Combining the features of all three phenotypes increased the mean AUC to 66.99%, suggesting that shape features from different phenotypes can be used in a synergic manner to predict GBM survival. Conclusion: Results show that shape features, in particular those extracted from necrotic regions, can be used effectively to characterize GBM tumours and predict the overall survival of patients with GBM. Advances in knowledge: Simple volumetric features have been largely used to characterize the different phenotypes of a GBM tumour (i.e. active tumour, oedema and necrosis). This study extends previous work by considering a wide range of shape features, extracted from different phenotypes, for the prediction of survival in patients with GBM. PMID:27781499
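The multivariate step, predicting a binary survival group from shape features and scoring by AUC, can be sketched as follows. The feature values and class labels are simulated stand-ins for the study's eccentricity/extent/solidity measurements, not its data.

```python
# Sketch: random-forest classification of survival group scored by AUC.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(5)
n = 40                                              # cohort size in the study
X = rng.uniform(0, 1, size=(n, 3))                  # eccentricity, extent, solidity
y = (X[:, 0] + rng.normal(0, 0.3, n) > 0.5).astype(int)  # short vs long survival

clf = RandomForestClassifier(n_estimators=300, random_state=0)
auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc").mean()
print(f"mean cross-validated AUC: {auc:.2f}")
```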
Quadrant III RFI draft report: Appendix B-I, Volume 3
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1992-12-01
In order to determine the nature and extent of contamination at a RCRA site, it is often necessary to investigate and characterize the chemical composition of the medium in question that represents background conditions. Background is defined as current conditions present at a site which are unaffected by past treatment, storage, or disposal of hazardous waste (OEPA, 1991). The background composition of soils at the Portsmouth Gaseous Diffusion Plant (PORTS) site was characterized for the purpose of comparing investigative soil data to a background standard for each metal on the Target Compound List/Target Analyte List and each radiological parameter of concern in this RFI. Characterization of background compositions with respect to organic parameters was not performed because the organic parameters in the TCL/TAL are not naturally occurring at the site and because the site is not located in a highly industrialized area nor downgradient from another unrelated hazardous waste site. Characterization of the background soil composition with respect to metals and radiological parameters was performed by collecting and analyzing soil boring and hand-auger samples in areas deemed unaffected by past treatment, storage, or disposal of hazardous waste. Criteria used in determining whether a soil sample location would be representative of the true background condition included: environmental history of the location, relation to Solid Waste Management Units (SWMUs), prevailing wind direction, surface runoff direction, and ground-water flow direction.
Drought impact functions as intermediate step towards drought damage assessment
NASA Astrophysics Data System (ADS)
Bachmair, Sophie; Svensson, Cecilia; Prosdocimi, Ilaria; Hannaford, Jamie; Helm Smith, Kelly; Svoboda, Mark; Stahl, Kerstin
2016-04-01
While damage or vulnerability functions for floods and seismic hazards have gained considerable attention, there is comparatively little knowledge on drought damage or loss. On the one hand, this is due to the complexity of the drought hazard, which affects different domains of the hydrological cycle and different sectors of human activity; a single hazard indicator is therefore unlikely to fully capture this multifaceted hazard. On the other hand, drought impacts are often non-structural and hard to quantify or monetize. Examples are impaired navigability of streams, restrictions on domestic water use, reduced hydropower production, reduced tree growth, and irreversible deterioration or loss of wetlands. Apart from reduced crop yield, data about drought damage or loss with adequate spatial and temporal resolution are scarce, making the development of drought damage functions difficult. As an intermediate step towards drought damage functions, we exploit text-based reports on drought impacts from the European Drought Impact report Inventory and the US Drought Impact Reporter to derive surrogate information for drought damage or loss. First, text-based information on drought impacts is converted into time series of absence versus presence of impacts, or of the number of impact occurrences. Second, meaningful hydro-meteorological indicators characterizing drought intensity are identified. Third, different statistical models are tested as link functions relating drought hazard indicators to drought impacts: 1) logistic regression for drought impacts coded as a binary response variable; and 2) mixture/hurdle models (zero-inflated/zero-altered negative binomial regression) and an ensemble regression tree approach for modeling the number of drought impact occurrences. Testing the predictability of (the number of) drought impact occurrences based on cross-validation revealed good agreement between observed and modeled impacts for regions at the scale of federal states or provinces with good data availability. Impact functions representing localized drought impacts are more challenging to construct given that fewer data are available, yet they may provide information that more directly addresses stakeholders' needs. Overall, our study contributes insights into how drought intensity translates into ecological and socioeconomic impacts, and how such information may be used to enhance drought monitoring and early warning.
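Link function 1) above, a logistic regression of impact presence/absence on a drought indicator, can be sketched in a few lines. The SPI-like index and impact series below are simulated; by construction, the fitted coefficient on the indicator comes out negative, since drier (more negative) months produce more impact reports.

```python
# Sketch: logistic-regression impact function on a synthetic monthly series.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
spi = rng.normal(0.0, 1.0, 240)                      # 20 years of monthly SPI values
p_impact = 1.0 / (1.0 + np.exp(-(-2.0 - 1.5 * spi))) # drier -> more impacts
impact = rng.binomial(1, p_impact)                   # 1 = impact reported that month

fit = sm.Logit(impact, sm.add_constant(spi)).fit(disp=False)
print(fit.params)                   # intercept and (negative) SPI coefficient
print(fit.predict([[1.0, -2.0]]))   # impact probability at SPI = -2 (severe drought)
```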
The New Italian Seismic Hazard Model
NASA Astrophysics Data System (ADS)
Marzocchi, W.; Meletti, C.; Albarello, D.; D'Amico, V.; Luzi, L.; Martinelli, F.; Pace, B.; Pignone, M.; Rovida, A.; Visini, F.
2017-12-01
In 2015 the Seismic Hazard Center (Centro Pericolosità Sismica, CPS) of the National Institute of Geophysics and Volcanology was commissioned to coordinate the national scientific community in elaborating a new reference seismic hazard model, mainly aimed at updating the seismic building code. The CPS designed a roadmap for releasing within three years a significantly renewed PSHA model, with regard both to the updated input elements and to the strategies to be followed. The main requirements of the model were discussed in meetings with the earthquake engineering experts who will then participate in the revision of the building code. The activities were organized in six tasks: program coordination, input data, seismicity models, ground motion predictive equations (GMPEs), computation and rendering, and testing. The input data task selected the most up-to-date information about seismicity (historical and instrumental), seismogenic faults, and deformation (from both seismicity and geodetic data). The seismicity models were elaborated in terms of classic source areas, fault sources, and gridded seismicity, based on different approaches. The GMPE task selected the most recent models, accounting for their tectonic suitability and forecasting performance. The testing phase was planned to design statistical procedures to test, with the available data, the whole seismic hazard model as well as single components such as the seismicity models and the GMPEs. In this talk we show some preliminary results, summarize the overall strategy for building the new Italian PSHA model, and discuss in detail important novelties that we put forward. Specifically, we adopt a new formal probabilistic framework to interpret the outcomes of the model and to test it meaningfully; this requires a proper definition and characterization of both aleatory variability and epistemic uncertainty, which we accomplish through an ensemble modeling strategy. We use a weighting scheme for the different components of the PSHA model built through three independent steps: a formal experts' elicitation, the outcomes of the testing phase, and the correlation between the outcomes. Finally, we explore through different techniques the influence of the declustering procedure on seismic hazard.
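The logic-tree weighting described above amounts, for a given site, to a weighted combination of the hazard curves produced by the alternative branches. A toy sketch, with all rates and weights invented:

```python
# Sketch: combine branch hazard curves (annual rate of exceeding each PGA level)
# with logic-tree weights into a mean hazard curve.
import numpy as np

pga = np.array([0.05, 0.1, 0.2, 0.4, 0.8])           # g
branch_curves = np.array([
    [2e-2, 8e-3, 2e-3, 4e-4, 5e-5],                  # source model A + GMPE 1
    [3e-2, 1e-2, 3e-3, 6e-4, 8e-5],                  # source model A + GMPE 2
    [1e-2, 5e-3, 1e-3, 2e-4, 2e-5],                  # source model B + GMPE 1
])
weights = np.array([0.4, 0.35, 0.25])                # e.g. elicitation + testing
assert np.isclose(weights.sum(), 1.0)

mean_curve = weights @ branch_curves                 # weighted-mean hazard curve
print(dict(zip(pga, mean_curve)))
```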
EPA perspective - exposure and effects prediction and monitoring
Risk-based decisions for environmental chemicals often rely on estimates of human exposure and biological response. Biomarkers have proven a useful empirical tool for evaluating exposure and hazard predictions. In the United States, the Centers for Disease Control and Preventio...
NASA's aviation safety research and technology program
NASA Technical Reports Server (NTRS)
Fichtl, G. H.
1977-01-01
Aviation safety is challenged by the practical necessity of compromising inherent factors of design, environment, and operation. If accidents are to be avoided, these factors must be controlled to a degree not often required by other transport modes. The operational problems which challenge safety seem to occur most often in the interfaces within and between the design, the environment, and operations, where mismatches occur due to ignorance or lack of sufficient understanding of these interactions. This report summarizes the following topics: (1) the nature of operating problems, (2) NASA aviation safety research, (3) clear air turbulence (CAT) characterization and prediction, (4) CAT detection, (5) the Measurement of Atmospheric Turbulence (MAT) Program, (6) lightning, (7) thunderstorm gust fronts, (8) aircraft ground operating problems, (9) aircraft fire technology, (10) crashworthiness research, (11) aircraft wake vortex hazard research, and (12) the aviation safety reporting system.
NASA Astrophysics Data System (ADS)
Necmioglu, O.; Meral Ozel, N.
2014-12-01
Accurate earthquake source parameters are essential for any tsunami hazard assessment and mitigation effort, including early warning systems. The complex tectonic setting makes a priori accurate assumptions of earthquake source parameters difficult, and characterization of the faulting type is a challenge. Information on tsunamigenic sources is of crucial importance in the Eastern Mediterranean and its Connected Seas, especially considering the short arrival times and the lack of offshore sea-level measurements. In addition, the scientific community has had to abandon the paradigm of a "maximum earthquake" predictable from simple tectonic parameters (Ruff and Kanamori, 1980) in the wake of the 2004 Sumatra event (Okal, 2010), and one of the lessons learnt from the 2011 Tohoku event was that tsunami hazard maps may need to be prepared for infrequent gigantic earthquakes as well as more frequent smaller-sized earthquakes (Satake, 2011). We have initiated an extensive modeling study to perform a deterministic tsunami hazard analysis for the Eastern Mediterranean and its Connected Seas. Characteristic earthquake source parameters (strike, dip, rake, depth, Mwmax) at each 0.5° x 0.5° bin for 0-40 km depth (310 bins in total) and for 40-100 km depth (92 bins in total) in the Eastern Mediterranean, Aegean and Black Sea region (30°N-48°N and 22°E-44°E) have been assigned by harmonizing the available databases and previous studies. These parameters have been used as input for the deterministic tsunami hazard modeling. Nested tsunami simulations of 6 h duration with coarse (2 arc-min) and medium (1 arc-min) grid resolutions have been run at EC-JRC premises for the Black Sea and the Eastern and Central Mediterranean (30°N-41.5°N and 8°E-37°E) for each source, using the shallow water finite-difference SWAN code (Mader, 2004), for the magnitude range from 6.5 to the Mwmax defined for that bin, with a Mw increment of 0.1. Results show that not only earthquakes resembling well-known historical events such as the AD 365 or AD 1303 Hellenic Arc earthquakes, but also earthquakes of lower magnitude, contribute to the tsunami hazard in the study area.
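The scenario set described above can be thought of as a nested enumeration: every source bin carries its characteristic parameters and is run for each magnitude from 6.5 up to its Mwmax in 0.1 steps. A schematic Python sketch with invented bin values (one tsunami simulation would be launched per generated scenario):

```python
# Sketch: enumerate deterministic tsunami source scenarios per 0.5-degree bin.
import numpy as np

bins = [
    {"lat": 35.0, "lon": 27.5, "strike": 250, "dip": 35, "rake": 90,  "mw_max": 8.3},
    {"lat": 40.5, "lon": 28.5, "strike": 90,  "dip": 70, "rake": -90, "mw_max": 7.4},
]

scenarios = []
for b in bins:
    for mw in np.arange(6.5, b["mw_max"] + 1e-9, 0.1):  # 6.5 to Mwmax, step 0.1
        scenarios.append({**b, "mw": round(mw, 1)})

print(len(scenarios), "simulations to run")
```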
Enhancement of Environmental Hazard Degradation in the Presence of Lignin: a Proteomics Study.
Sun, Su; Xie, Shangxian; Cheng, Yanbing; Yu, Hongbo; Zhao, Honglu; Li, Muzi; Li, Xiaotong; Zhang, Xiaoyu; Yuan, Joshua S; Dai, Susie Y
2017-09-12
Proteomics studies of fungal systems have progressed dramatically thanks to the availability of more fungal genome sequences in recent years. Different proteomics strategies have been applied toward characterization of fungal proteomes and have revealed important gene functions and proteome dynamics. Presented here is the application of shotgun proteomic technology to study the bioremediation of environmental hazards by white-rot fungus. Lignin, a naturally abundant component of plant biomass, is discovered to promote the degradation of Azo dye by the white-rot fungus Irpex lacteus CD2 in the lignin/dye/fungus system. A shotgun proteomics technique was used to understand the degradation mechanism at the protein level for the lignin/dye/fungus system. Our proteomics study can identify about two thousand proteins (one third of the predicted white-rot fungal proteome) in a single experiment, making it one of the most powerful proteomics platforms for studying fungal systems to date. The study shows a significant enrichment of the oxidoreduction functional category under the combined dye/lignin treatment. An in vitro validation was performed and supports our hypothesis that the synergy of the Fenton reaction and manganese peroxidase might play an important role in DR5B dye degradation. The results could guide the development of effective bioremediation strategies and efficient lignocellulosic biomass conversion.
NASA Astrophysics Data System (ADS)
Park, Shinju; Berenguer, Marc; Sempere-Torres, Daniel; Baugh, Calum; Smith, Paul
2017-04-01
Flash floods induced by heavy rain are among the hazardous natural events that significantly affect human lives. Because flash floods are characterized by their rapid onset, forecasting them well enough to support an effective response requires accurate rainfall predictions at high spatial and temporal resolution and an adequate representation of the hydrologic and hydraulic processes within a catchment that determine rainfall-runoff accumulations. We present extreme flash flood cases which occurred throughout Europe in 2015-2016 and were identified and forecasted by two real-time approaches: 1) the European Rainfall-Induced Hazard Assessment System (ERICHA) and 2) the European Runoff Index based on Climatology (ERIC). ERICHA is based on nowcasts of accumulated precipitation generated from the pan-European radar composites produced by the EUMETNET project OPERA. It has the advantage of high-resolution precipitation inputs and rapidly updated forecasts (every 15 minutes), but limited forecast lead time (up to 8 hours). ERIC, on the other hand, provides 5-day forecasts based on COSMO-LEPS NWP simulations updated twice a day, but is only produced at 7 km resolution. We compare the products from both systems and focus on the advantages, limitations, and complementarities of ERICHA and ERIC for seamless high-resolution flash flood forecasting.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Curtis L.; Prescott, Steven; Coleman, Justin
This report describes the current progress and status of Industry Application #2, focusing on external hazards. For this industry application within the Light Water Reactor Sustainability (LWRS) Program Risk-Informed Safety Margin Characterization (RISMC) R&D Pathway, we will create the Risk-Informed Margin Management (RIMM) approach to represent meaningful (i.e., realistic facility representation) event scenarios and consequences by using an advanced 3D facility representation that will evaluate external hazards such as flooding and earthquakes in order to identify, model, and analyze the appropriate physics that needs to be included to determine plant vulnerabilities related to external events; manage the communication and interactions between different physics modeling and analysis technologies; and develop the computational infrastructure through tools related to plant representation, scenario depiction, and physics prediction. One of the unique aspects of the RISMC approach is how it couples probabilistic approaches (the scenario) with mechanistic phenomena representation (the physics) through simulation. This simulation-based modeling allows decision makers to focus on a variety of safety, performance, or economic metrics. In this report, we describe the evaluation of various physics toolkits related to flooding representation. Ultimately, we will couple the flooding representation with other events such as earthquakes in order to provide coupled physics analysis for scenarios where interactions exist.
Volunteers in the earthquake hazard reduction program
Ward, P.L.
1978-01-01
With this in mind, I organized a small workshop for approximately 30 people on February 2 and 3, 1978, in Menlo Park, Calif. The purpose of the meeting was to discuss methods of involving volunteers in a meaningful way in earthquake research and in educating the public about earthquake hazards. The emphasis was on earthquake prediction research, but the discussions covered the whole earthquake hazard reduction program. Representatives attended from the earthquake research community, from groups doing socioeconomic research on earthquake matters, and from a wide variety of organizations that might sponsor volunteers.
Probabilistic tsunami hazard assessment at Seaside, Oregon, for near- and far-field seismic sources
Gonzalez, F.I.; Geist, E.L.; Jaffe, B.; Kanoglu, U.; Mofjeld, H.; Synolakis, C.E.; Titov, V.V.; Areas, D.; Bellomo, D.; Carlton, D.; Horning, T.; Johnson, J.; Newman, J.; Parsons, T.; Peters, R.; Peterson, C.; Priest, G.; Venturato, A.; Weber, J.; Wong, F.; Yalciner, A.
2009-01-01
The first probabilistic tsunami flooding maps have been developed. The methodology, called probabilistic tsunami hazard assessment (PTHA), integrates tsunami inundation modeling with methods of probabilistic seismic hazard assessment (PSHA). Application of the methodology to Seaside, Oregon, has yielded estimates of the spatial distribution of 100- and 500-year maximum tsunami amplitudes, i.e., amplitudes with 1% and 0.2% annual probability of exceedance. The 100-year tsunami is generated most frequently by far-field sources in the Alaska-Aleutian Subduction Zone and is characterized by maximum amplitudes that do not exceed 4 m, with an inland extent of less than 500 m. In contrast, the 500-year tsunami is dominated by local sources in the Cascadia Subduction Zone and is characterized by maximum amplitudes in excess of 10 m and an inland extent of more than 1 km. The primary sources of uncertainty in these results include those associated with interevent time estimates, modeling of background sea level, and accounting for temporal changes in bathymetry and topography. Nonetheless, PTHA represents an important contribution to tsunami hazard assessment techniques; viewed in the broader context of risk analysis, PTHA provides a method for quantifying estimates of the likelihood and severity of the tsunami hazard, which can then be combined with vulnerability and exposure to yield estimates of tsunami risk. Copyright 2009 by the American Geophysical Union.
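The return-period figures above translate directly into exceedance probabilities. Below is a minimal sketch of that conversion, assuming Poissonian event recurrence; this is an assumption of the illustration, not a statement of the PTHA method itself.

```python
# Sketch: converting return periods to exceedance probabilities under a
# Poisson recurrence assumption (illustrative only, not the authors' code).
import math

def annual_exceedance_probability(return_period_years: float) -> float:
    """Annual probability that the tsunami amplitude is exceeded."""
    return 1.0 / return_period_years

def exceedance_probability(return_period_years: float, exposure_years: float) -> float:
    """Probability of at least one exceedance during an exposure window,
    assuming Poissonian event occurrence."""
    return 1.0 - math.exp(-exposure_years / return_period_years)

# The 100- and 500-year tsunamis correspond to 1% and 0.2% annual
# exceedance probabilities, as stated in the abstract.
print(annual_exceedance_probability(100))   # 0.01
print(annual_exceedance_probability(500))   # 0.002
# Chance of seeing the 500-year tsunami at least once in a 50-year design life:
print(exceedance_probability(500, 50))      # ~0.095
```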
GO/NO-GO - When is medical hazard mitigation acceptable for launch?
NASA Technical Reports Server (NTRS)
Hamilton, Douglas R.; Polk, James D.
2005-01-01
Medical support of spaceflight missions is composed of complex tasks and decisions dedicated to maintaining the health and performance of the crew and the completion of mission objectives. Spacecraft are among the most complex vehicles built by humans, and are built to very rigorous design specifications. In the course of a Flight Readiness Review (FRR) or a mission itself, the flight surgeon must be able to understand the impact of hazards and risks that may not be completely mitigated by design alone. Some hazards are not mitigated because they are never actually identified. When a hazard is identified, it must be reduced or waived. Hazards that cannot be designed out of the vehicle or mission are usually mitigated through other means to bring the residual risk to an acceptable level. This is possible in most engineered systems because failure modes are usually predictable and analysis can include taking these systems to failure. Medical support of space missions is complicated by the inability of flight surgeons to provide "exact" hazard and risk numbers to the NASA engineering community. Taking humans to failure is not an option. Furthermore, medical dogma is mostly comprised of "medical prevention" strategies that mitigate risk by examining the behavior of a cohort of humans similar to astronauts. Unfortunately, this approach does not lend itself well to predicting the effect of a hazard in the unique environment of space. This presentation will discuss how Medical Operations uses an evidence-based approach to decide whether hazard mitigation strategies are adequate to reduce mission risk to acceptable levels. Case studies to be discussed will include: 1) risk of electrocution during EVA, 2) risk of a cardiac event during long- and short-duration missions, and 3) degraded cabin environmental monitoring on the ISS. Learning objectives: 1) The audience will understand the challenges of mitigating medical risk caused by nominal and off-nominal mission events. 2) The audience will understand the process by which medical hazards are identified and mitigated before launch. 3) The audience will understand the roles and responsibilities of the other flight control positions in the process of reducing hazards and reducing medical risk to an acceptable level.
Close, Rebecca; Watts, Michael J.; Ander, E. Louise; Smedley, Pauline L.; Verlander, Neville Q.; Gregory, Martin; Middleton, Daniel R. S.; Polya, David A.; Studden, Mike; Leonardi, Giovanni S.
2017-01-01
Approximately one million people in the UK are served by private water supplies (PWS), where connection to the mains water supply is not practical or where a PWS is the preferred option. Chronic exposure to contaminants in PWS may have adverse effects on health. South West England is an area with elevated arsenic concentrations in groundwater, and over 9000 domestic dwellings here are supplied by PWS. There remains uncertainty as to the extent of the population exposed to arsenic (As) and the factors predicting such exposure. We describe a hazard assessment model based on simplified geology with the potential to predict exposure to As in PWS. Households with a recorded PWS in Cornwall were recruited to take part in a water sampling programme from 2011 to 2013. Bedrock geologies were aggregated and classified into nine Simplified Bedrock Geological Categories (SBGC), plus a cross-cutting “mineralized” area. PWS were sampled by random selection within SBGCs, and some 508 households volunteered for the study. Transformations of the data were explored to estimate the distribution of As concentrations for PWS by SBGC. Using the distribution per SBGC, we predict the proportion of dwellings that would be affected by high concentrations and rank the geologies according to hazard. Within most SBGCs, As concentrations were found to have log-normal distributions. Across these areas, the proportion of dwellings predicted to have drinking water over the prescribed concentration value (PCV) for As ranged from 0% to 20%. From these results, a pilot predictive model was developed that calculates the proportion of PWS above the PCV for As; the resulting hazard ranking supports local decision making and prioritization. With further development and testing, this can help local authorities predict the number of dwellings that might fail the PCV for As, based on bedrock geology. The model presented here for Cornwall could be applied in areas with similar geologies. Application of the method requires independent validation and further groundwater-derived PWS sampling on other geological formations. PMID:29194429
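The log-normal exceedance step can be illustrated in a few lines. A minimal sketch follows, in which the 10 µg/L PCV and the sample concentrations are illustrative assumptions, not the study's data.

```python
# Sketch of the log-normal exceedance idea: fit a log-normal distribution to
# arsenic concentrations within one geological category and estimate the
# fraction of supplies above a prescribed concentration value (PCV).
import numpy as np
from scipy import stats

pcv_ug_per_l = 10.0  # assumed PCV for arsenic; check current regulations

# Hypothetical As concentrations (ug/L) for private supplies on one SBGC
concentrations = np.array([0.2, 0.5, 1.1, 2.3, 0.8, 14.0, 3.5, 0.3, 6.2, 1.9])

log_c = np.log(concentrations)
mu, sigma = log_c.mean(), log_c.std(ddof=1)

# P(C > PCV) under the fitted log-normal distribution
p_exceed = 1.0 - stats.norm.cdf(np.log(pcv_ug_per_l), loc=mu, scale=sigma)
print(f"Predicted fraction of supplies above the PCV: {p_exceed:.1%}")
```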
MicroRNA Expression-Based Model Indicates Event-Free Survival in Pediatric Acute Myeloid Leukemia
Lim, Emilia L.; Trinh, Diane L.; Ries, Rhonda E.; Wang, Jim; Gerbing, Robert B.; Ma, Yussanne; Topham, James; Hughes, Maya; Pleasance, Erin; Mungall, Andrew J.; Moore, Richard; Zhao, Yongjun; Aplenc, Richard; Sung, Lillian; Kolb, E. Anders; Gamis, Alan; Smith, Malcolm; Gerhard, Daniela S.; Alonzo, Todd A.; Meshinchi, Soheil; Marra, Marco A.
2017-01-01
Purpose Children with acute myeloid leukemia (AML) whose disease is refractory to standard induction chemotherapy therapy or who experience relapse after initial response have dismal outcomes. We sought to comprehensively profile pediatric AML microRNA (miRNA) samples to identify dysregulated genes and assess the utility of miRNAs for improved outcome prediction. Patients and Methods To identify miRNA biomarkers that are associated with treatment failure, we performed a comprehensive sequence-based characterization of the pediatric AML miRNA landscape. miRNA sequencing was performed on 1,362 samples—1,303 primary, 22 refractory, and 37 relapse samples. One hundred sixty-four matched samples—127 primary and 37 relapse samples—were analyzed by using RNA sequencing. Results By using penalized lasso Cox proportional hazards regression, we identified 36 miRNAs the expression levels at diagnosis of which were highly associated with event-free survival. Combined expression of the 36 miRNAs was used to create a novel miRNA-based risk classification scheme (AMLmiR36). This new miRNA-based risk classifier identifies those patients who are at high risk (hazard ratio, 2.830; P ≤ .001) or low risk (hazard ratio, 0.323; P ≤ .001) of experiencing treatment failure, independent of conventional karyotype or mutation status. The performance of AMLmiR36 was independently assessed by using 878 patients from two different clinical trials (AAML0531 and AAML1031). Our analysis also revealed that miR-106a-363 was abundantly expressed in relapse and refractory samples, and several candidate targets of miR-106a-5p were involved in oxidative phosphorylation, a process that is suppressed in treatment-resistant leukemic cells. Conclusion To assess the utility of miRNAs for outcome prediction in patients with pediatric AML, we designed and validated a miRNA-based risk classification scheme. We also hypothesized that the abundant expression of miR-106a could increase treatment resistance via modulation of genes that are involved in oxidative phosphorylation. PMID:29068783
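A penalized (lasso) Cox regression of the kind named above can be sketched with the lifelines package; the feature names, toy sizes, and synthetic data below are assumptions for illustration, and the penalty strength would in practice be tuned by cross-validation.

```python
# Minimal sketch of a lasso-penalized Cox regression on expression features,
# in the spirit of the abstract; lifelines (recent versions) and synthetic data.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n, p = 200, 20  # patients, miRNA features (toy sizes, not the study's)
X = pd.DataFrame(rng.normal(size=(n, p)), columns=[f"mir_{i}" for i in range(p)])
X["efs_months"] = rng.exponential(scale=36, size=n)   # event-free survival time
X["event"] = rng.integers(0, 2, size=n)               # 1 = event observed

# l1_ratio=1.0 gives a pure lasso penalty, shrinking most coefficients to zero
cph = CoxPHFitter(penalizer=0.1, l1_ratio=1.0)
cph.fit(X, duration_col="efs_months", event_col="event")
selected = cph.params_[cph.params_.abs() > 1e-6]
print(selected)  # surviving features form the risk score, analogous to AMLmiR36
```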
Real-time Forensic Disaster Analysis
NASA Astrophysics Data System (ADS)
Wenzel, F.; Daniell, J.; Khazai, B.; Mühr, B.; Kunz-Plapp, T.; Markus, M.; Vervaeck, A.
2012-04-01
The Center for Disaster Management and Risk Reduction Technology (CEDIM, www.cedim.de) - an interdisciplinary research center founded by the German Research Centre for Geoscience (GFZ) and Karlsruhe Institute of Technology (KIT) - has embarked on a new style of disaster research known as Forensic Disaster Analysis. The notion was coined by the Integrated Research on Disaster Risk initiative (IRDR, www.irdrinternational.org) launched by ICSU in 2010. It has been defined as an approach to studying natural disasters that aims at uncovering the root causes of disasters through in-depth investigations that go beyond the reconnaissance reports and case studies typically conducted after disasters. In adopting this comprehensive understanding of disasters, CEDIM adds a real-time component to the assessment and evaluation process. By comprehensive we mean that most if not all relevant aspects of disasters are considered and jointly analysed. This includes the impact (human, economic, and infrastructure), comparisons with recent historic events, social vulnerability, reconstruction, and long-term impacts on livelihoods. The forensic disaster analysis research mode is thus best characterized as "event-based research" through systematic investigation of critical issues arising after a disaster across various inter-related areas. The forensic approach requires (a) availability of global databases of previous earthquake losses, socio-economic parameters, building stock information, etc.; (b) leveraging platforms such as the EERI clearinghouse, ReliefWeb, and the many local and international sources where information is organized; and (c) rapid access to critical information (e.g., crowd-sourcing techniques) to improve our understanding of the complex dynamics of disasters. The main scientific questions being addressed are: What are the critical factors that control loss of life, of infrastructure, and for the economy? What are the critical interactions between hazard, socio-economic systems, and technological systems? What were the protective measures, and to what extent did they work? Can we predict patterns of losses and socio-economic implications for future extreme events from simple parameters: hazard parameters, historic evidence, socio-economic conditions? Can we predict implications for reconstruction from the same parameters? The M7.2 Van earthquake (eastern Turkey) of 23 Oct 2011 serves as an example of the forensic approach.
Shea, Cristina A; Ward, Rachel E; Welch, Sarah A; Kiely, Dan K; Goldstein, Richard; Bean, Jonathan F
2018-06-01
The aim of the study was to examine whether the chair stand component of the Short Physical Performance Battery predicts fall-related injury among older adult primary care patients. A 2-yr longitudinal cohort study of 430 Boston-area primary care patients aged ≥65 yrs screened to be at risk for mobility decline was conducted. The three components of the Short Physical Performance Battery (balance time, gait speed, and chair stand time) were measured at baseline. Participants reported incidence of fall-related injuries quarterly for 2 yrs. Complementary log-log discrete time hazard models were constructed to examine the hazard of fall-related injury across Short Physical Performance Battery scores, adjusting for age, sex, race, Digit Symbol Substitution Test score, and fall history. Participants were 68% female and 83% white, with a mean (SD) age of 76.6 (7.0). A total of 137 (32%) reported a fall-related injury during the follow-up period. Overall, inability to perform the chair stand task was a significant predictor of fall-related injury (hazard ratio = 2.11, 95% confidence interval = 1.23-3.62, P = 0.01). Total Short Physical Performance Battery score, gait component score, and balance component score were not predictive of fall-related injury. Inability to perform the repeated chair stand task was associated with increased hazard of an injurious fall for 2 yrs among a cohort of older adult primary care patients.
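The complementary log-log discrete-time hazard model named above can be sketched as a binomial GLM on person-period data. The sketch below uses synthetic data and only two covariates; the link class name is `CLogLog` in recent statsmodels releases (older releases use a lowercase alias), which is an assumption worth checking against the installed version.

```python
# Sketch of a complementary log-log discrete-time hazard model for quarterly
# fall-injury indicators; synthetic person-period data, statsmodels GLM.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500  # person-period records (one row per participant-quarter)
df = pd.DataFrame({
    "chair_stand_unable": rng.integers(0, 2, n),  # 1 = could not perform task
    "age": rng.normal(76.6, 7.0, n),
    "injury": rng.integers(0, 2, n),              # 1 = fall-related injury this quarter
})

X = sm.add_constant(df[["chair_stand_unable", "age"]])
model = sm.GLM(df["injury"], X,
               family=sm.families.Binomial(link=sm.families.links.CLogLog()))
res = model.fit()
# Under the cloglog link, exp(coefficient) is interpretable as a hazard ratio
print(np.exp(res.params["chair_stand_unable"]))
```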
Earthquake Hazard Assessment: an Independent Review
NASA Astrophysics Data System (ADS)
Kossobokov, Vladimir
2016-04-01
Seismic hazard assessment (SHA), from term-less (probabilistic PSHA or deterministic DSHA) to time-dependent (t-DASH), including short-term earthquake forecast/prediction (StEF), is not an easy task: it implies a delicate application of statistics to data of limited size and varying accuracy. Regrettably, in many cases of SHA, t-DASH, and StEF, the claims of a high potential and efficiency of the methodology are based on a flawed application of statistics and are hardly suitable for communication to decision makers. The necessity and possibility of applying the modified tools of Earthquake Prediction Strategies (in particular, the Error Diagram, introduced by G.M. Molchan in the early 1990s for evaluation of SHA, and the Seismic Roulette null hypothesis as a measure of the alerted space) is evident, and such testing must be done in advance of claiming hazardous areas and/or times. The set of errors, i.e., the rates of failure and of the alerted space-time volume, compared with those obtained in the same number of random-guess trials, permits evaluating the effectiveness of an SHA method and determining the optimal choice of parameters with respect to specified cost-benefit functions. This and other information obtained in such testing may supply us with a realistic estimate of confidence in SHA results and related recommendations on the level of risk for decision making in regard to engineering design, insurance, and emergency management. These basics of SHA evaluation are exemplified with a few cases of misleading "seismic hazard maps", "precursors", and "forecast/prediction methods".
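A single point on the error diagram described above is easy to compute: the rate of failures-to-predict versus the fraction of space-time kept on alert. A minimal sketch with synthetic cells and a random-guess alert strategy:

```python
# Sketch of a Molchan-style error-diagram point: miss rate versus alerted
# fraction of space-time, compared with the random-guess diagonal.
import numpy as np

def molchan_point(alarms: np.ndarray, events: np.ndarray, weights: np.ndarray):
    """alarms: boolean alert flag per cell; events: target-event counts per cell;
    weights: space-time measure of each cell (e.g., area x duration)."""
    miss_rate = events[~alarms].sum() / events.sum()
    alerted_fraction = weights[alarms].sum() / weights.sum()
    return miss_rate, alerted_fraction

rng = np.random.default_rng(2)
weights = np.ones(1000)
events = rng.poisson(0.05, 1000)
alarms = rng.random(1000) < 0.2  # a random-guess strategy for comparison

nu, tau = molchan_point(alarms, events, weights)
# For random guessing, nu + tau ~ 1; a skillful method falls well below that diagonal
print(nu, tau, nu + tau)
```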
Contributions to the Characterization and Mitigation of Rotorcraft Brownout
NASA Astrophysics Data System (ADS)
Tritschler, John Kirwin
Rotorcraft brownout, the condition in which the flow field of a rotorcraft mobilizes sediment from the ground to generate a cloud that obscures the pilot's field of view, continues to be a significant hazard to civil and military rotorcraft operations. This dissertation presents methodologies for: (i) the systematic mitigation of rotorcraft brownout through operational and design strategies and (ii) the quantitative characterization of the visual degradation caused by a brownout cloud. In Part I of the dissertation, brownout mitigation strategies are developed through simulation-based brownout studies that are mathematically formulated within a numerical optimization framework. Two optimization studies are presented. The first study involves the determination of approach-to-landing maneuvers that result in reduced brownout severity. The second study presents a potential methodology for the design of helicopter rotors with improved brownout characteristics. The results of both studies indicate that the fundamental mechanisms underlying brownout mitigation are aerodynamic in nature, and the evolution of a ground vortex ahead of the rotor disk is seen to be a key element in the development of a brownout cloud. In Part II of the dissertation, brownout cloud characterizations are based upon the Modulation Transfer Function (MTF), a metric commonly used in the optics community for the characterization of imaging systems. The use of the MTF in experimentation is examined first, and the application of MTF calculation and interpretation methods to actual flight test data is described. The potential for predicting the MTF from numerical simulations is examined second, and an initial methodology is presented for the prediction of the MTF of a brownout cloud. Results from the experimental and analytical studies rigorously quantify the intuitively-known facts that the visual degradation caused by brownout is a space and time-dependent phenomenon, and that high spatial frequency features, i.e., fine-grained detail, are obscured before low spatial frequency features, i.e., large objects. As such, the MTF is a metric that is amenable to Handling Qualities (HQ) analyses.
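The MTF used above can be illustrated as the normalized Fourier magnitude of a line spread function (LSF). The Gaussian LSF in this sketch is an assumption for illustration, not the dissertation's measured data.

```python
# Sketch: the modulation transfer function as the normalized magnitude of the
# Fourier transform of a line spread function; a Gaussian LSF is assumed.
import numpy as np

dx = 0.01                            # spatial sampling interval (arbitrary units)
x = np.arange(-5, 5, dx)
lsf = np.exp(-x**2 / (2 * 0.5**2))   # broader LSF -> stronger blur from the cloud

mtf = np.abs(np.fft.rfft(lsf))
mtf /= mtf[0]                        # normalize so MTF(0) = 1
freqs = np.fft.rfftfreq(x.size, d=dx)

# High spatial frequencies (fine detail) are attenuated first, consistent with
# the observation that fine-grained detail is obscured before large objects.
print(freqs[:5], mtf[:5])
```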
Levich, R.A.; Linden, R.M.; Patterson, R.L.; Stuckless, J.S.
2000-01-01
Yucca Mountain, located ~100 mi northwest of Las Vegas, Nevada, has been designated by Congress as a site to be characterized for a potential mined geologic repository for high-level radioactive waste. This field trip will examine the regional geologic and hydrologic setting of Yucca Mountain, as well as specific results of the site characterization program. The first day focuses on the regional setting, with emphasis on current and paleo hydrology, both of critical concern for predicting future performance of a potential repository. Morning stops will be in southern Nevada and afternoon stops will be in Death Valley. The second day will be spent at Yucca Mountain. The field trip will visit the underground testing sites in the "Exploratory Studies Facility" and the "Busted Butte Unsaturated Zone Transport Field Test" plus several surface-based testing sites. Much of the work at the site has concentrated on studies of the unsaturated zone, an element of the hydrologic system that historically has received little attention. Discussions during the second day will comprise selected topics in Yucca Mountain geology, hydrology, and geochemistry, and will include the probabilistic volcanic hazard analysis and the seismicity and seismic hazard in the Yucca Mountain area. Evening discussions will address modeling of regional groundwater flow, the results of recent hydrologic studies by the Nye County Nuclear Waste Program Office, and the relationship of the geology and hydrology of Yucca Mountain to the performance of a potential repository. Day 3 will examine the geologic framework and hydrology of the Pahute Mesa-Oasis Valley Groundwater Basin and then will continue to Reno via Hawthorne, Nevada, and the Walker Lake area.
Abecassis, Isaac Josh; Sen, Rajeev D; Barber, Jason; Shetty, Rakshith; Kelly, Cory M; Ghodke, Basavaraj V; Hallam, Danial K; Levitt, Michael R; Kim, Louis J; Sekhar, Laligam N
2018-06-14
Endovascular treatment of intracranial aneurysms is associated with higher rates of recurrence and retreatment, though contemporary rates and risk factors for basilar tip aneurysms (BTAs) are less well described. We sought to characterize progression, retreatment, and retreated progression of BTAs treated with microsurgical or endovascular interventions. We retrospectively reviewed records for 141 consecutive BTA patients. We included 158 anterior communicating artery (ACoA) and 118 middle cerebral artery (MCA) aneurysms as controls. Univariate and multivariate analyses were used to calculate rates of progression (recurrence of previously obliterated aneurysms and progression of known residual aneurysm dome or neck), retreatment, and retreated progression. Kaplan-Meier analysis was used to characterize 24-mo event rates for primary outcome prediction. Of 141 BTA patients, 62.4% were ruptured and 37.6% were unruptured. Average radiographic follow-up was 33 mo. Among ruptured aneurysms treated with clipping, there were 2 rehemorrhages due to recurrence (6.1%), and none in any other cohort. Overall rates of progression (28.9%), retreatment (28.9%), and retreated progression (24.7%) were not significantly different between surgical and endovascular subgroups, though ruptured aneurysms had higher event rates. Multivariate modeling confirmed rupture status (P = .003, hazard ratio = 0.14) and aneurysm dome width (P = .005, hazard ratio = 1.23) as independent predictors of progression requiring retreatment. In a separate multivariate analysis with ACoA and MCA aneurysms, basilar tip location was an independent predictor of progression, retreatment, and retreated progression. BTAs have higher rates of progression and retreated progression than aneurysms at other locations, independent of treatment modality. Rupture status and dome width are risk factors for progression requiring retreatment.
NASA Astrophysics Data System (ADS)
Goulet, C. A.; Abrahamson, N. A.; Al Atik, L.; Atkinson, G. M.; Bozorgnia, Y.; Graves, R. W.; Kuehn, N. M.; Youngs, R. R.
2017-12-01
The Next Generation Attenuation project for Central and Eastern North America (CENA), NGA-East, is a major multi-disciplinary project coordinated by the Pacific Earthquake Engineering Research Center (PEER). The project was co-sponsored by the U.S. Nuclear Regulatory Commission (NRC), the U.S. Department of Energy (DOE), the Electric Power Research Institute (EPRI), and the U.S. Geological Survey (USGS). NGA-East involved a large number of participating researchers from organizations in academia, industry, and government and was carried out as a combination of 1) a scientific research project and 2) a model-building component following the NRC Senior Seismic Hazard Analysis Committee (SSHAC) Level 3 process. The science part of the project led to several data products and technical reports, while the SSHAC component aggregated the various results into a ground motion characterization (GMC) model. The GMC model consists of a set of ground motion models (GMMs) for the median and standard deviation of ground motions and their associated weights, combined into logic trees for use in probabilistic seismic hazard analyses (PSHA). NGA-East addressed many technical challenges, most of them related to the relatively small number of earthquake recordings available for CENA. To resolve this shortcoming, the project relied on ground motion simulations to supplement the available data. Other important scientific issues were addressed through research projects on topics such as the regionalization of seismic source, path, and attenuation of motions, the treatment of variability and uncertainties, and the evaluation of site effects. Seven working groups were formed to cover the complexity and breadth of topics in the NGA-East project, each focused on a specific technical area. This presentation provides an overview of the NGA-East research project and its key products.
Seismic dynamics in advance and after the recent strong earthquakes in Italy and New Zealand
NASA Astrophysics Data System (ADS)
Nekrasova, A.; Kossobokov, V. G.
2017-12-01
We consider seismic events as a sequence of avalanches in the self-organized system of blocks-and-faults of the Earth's lithosphere and characterize earthquake series with the distribution of the control parameter η = τ × 10^(B(5−M)) × L^C of the Unified Scaling Law for Earthquakes, USLE (where τ is the inter-event time, B is analogous to the Gutenberg-Richter b-value, M is magnitude, L is the linear size of the seismic locus, and C is its fractal dimension). A systematic analysis of earthquake series in Central Italy and New Zealand, 1993-2017, suggests the existence, in the long term, of different rather steady levels of seismic activity characterized by near-constant values of η, which, in the mid-term, intermittently switch at times of transitions associated with strong catastrophic events. At such a transition, seismic activity may, in the short term, follow different scenarios with inter-event time scaling of different kinds, including constant, logarithmic, power-law, exponential rise/decay, or a mixture of those. The results do not support the presence of universality in seismic energy release. The observed variability of seismic activity before and after strong (M6.0+) earthquakes in Italy and significant (M7.0+) earthquakes in New Zealand provides important constraints on the modelling of realistic earthquake sequences by geophysicists and can be used to improve local seismic hazard assessments, including earthquake forecast/prediction methodologies. The transitions of the seismic regime in Central Italy and New Zealand that started in 2016 are still in progress and require special attention and geotechnical monitoring. It would be premature to draw definitive conclusions on the level of seismic hazard, which is evidently high at this particular moment in both regions. The study was supported by the Russian Science Foundation, Grant No. 16-17-00093.
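The control parameter itself is a one-line computation once the series is catalogued. A minimal sketch, with all parameter values chosen for illustration rather than fitted to either region:

```python
# Sketch computing the USLE control parameter eta = tau * 10^(B*(5 - M)) * L^C
# for a toy earthquake series; all parameter values are illustrative.
import numpy as np

def usle_eta(tau_days, B, M, L_km, C):
    """tau: inter-event time; B: Gutenberg-Richter-like slope; M: magnitude;
    L: linear size of the seismic locus; C: its fractal dimension."""
    return tau_days * 10.0 ** (B * (5.0 - M)) * L_km ** C

# Inter-event times (days) and magnitudes for a toy earthquake series
tau = np.array([3.2, 11.5, 0.8, 40.1])
mag = np.array([4.1, 3.6, 5.0, 3.9])
eta = usle_eta(tau, B=1.0, M=mag, L_km=50.0, C=1.2)
# Near-constant eta would indicate a steady level of seismic activity; jumps
# flag transitions of the kind discussed for Central Italy and New Zealand.
print(eta)
```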
Risk Factors and Outcomes in Transfusion-associated Circulatory Overload
Murphy, Edward L.; Kwaan, Nicholas; Looney, Mark R.; Gajic, Ognjen; Hubmayr, Rolf D.; Gropper, Michael A.; Koenigsberg, Monique; Wilson, Greg; Matthay, Michael; Bacchetti, Peter; Toy, Pearl
2013-01-01
BACKGROUND: Transfusion-associated circulatory overload is characterized by new respiratory distress and hydrostatic pulmonary edema within 6 hours after blood transfusion, but its risk factors and outcomes are poorly characterized. METHODS: Using a case-control design, we enrolled 83 patients with severe transfusion-associated circulatory overload identified by active surveillance for hypoxemia and 163 transfused controls at the University of California, San Francisco (UCSF) and Mayo Clinic (Rochester, Minn) hospitals. Odds ratios (OR) and 95% confidence intervals (CI) were calculated using multivariable logistic regression, and survival and length of stay were analyzed using proportional hazards models. RESULTS: Transfusion-associated circulatory overload was associated with chronic renal failure (OR 27.0; 95% CI, 5.2–143), a past history of heart failure (OR 6.6; 95% CI, 2.1–21), hemorrhagic shock (OR 113; 95% CI, 14.1–903), number of blood products transfused (OR 1.11 per unit; 95% CI, 1.01–1.22), and fluid balance per hour (OR 9.4 per liter; 95% CI, 3.1–28). Patients with transfusion-associated circulatory overload had significantly increased in-hospital mortality (hazard ratio 3.20; 95% CI, 1.23–8.10) after controlling for Acute Physiology and Chronic Health Evaluation-II (APACHE-II) score, and longer hospital and intensive care unit lengths of stay. CONCLUSIONS: The risk of transfusion-associated circulatory overload increases with the number of blood products administered and a positive fluid balance, and in patients with pre-existing heart failure and chronic renal failure. These data, if replicated, could be used to construct predictive algorithms for transfusion-associated circulatory overload, and subsequent modifications of transfusion practice might prevent morbidity and mortality associated with this complication. PMID:23357450
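The odds ratios above come from a multivariable logistic regression, which can be sketched as follows; the covariate names and synthetic case-control rows are assumptions for illustration, not the study's data.

```python
# Sketch of a multivariable logistic regression producing odds ratios with
# confidence intervals; synthetic case-control data, statsmodels Logit.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 246  # 83 cases + 163 controls in the study; rows here are synthetic
df = pd.DataFrame({
    "case": rng.integers(0, 2, n),            # 1 = circulatory overload case
    "chronic_renal_failure": rng.integers(0, 2, n),
    "units_transfused": rng.poisson(4, n),
    "fluid_balance_l_per_hr": rng.gamma(2.0, 0.1, n),
})

X = sm.add_constant(df[["chronic_renal_failure", "units_transfused",
                        "fluid_balance_l_per_hr"]])
res = sm.Logit(df["case"], X).fit(disp=False)
odds_ratios = np.exp(res.params)     # e.g., OR per additional unit transfused
conf_int = np.exp(res.conf_int())    # 95% confidence intervals
print(pd.concat([odds_ratios, conf_int], axis=1))
```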
San Mateo County Geographic Information Systems (GIS) project
Brabb, E.E.
1986-01-01
Earthquakes and ground failures in the United States cause billions of dollars of damage each year, but techniques for predicting and reducing these hazardous geologic processes remain elusive. Geologists, geophysicists, hydrologists, engineers, cartographers, and computer specialists from the U.S. Geological Survey in Menlo Park, California, are working together on a project involving GIS techniques to determine how to predict the consequences of earthquakes and landslides, using San Mateo County as a subject area. Together with members of the Planning and Emergency Services Departments of San Mateo County and the Association of Bay Area Governments, they are also determining how to reduce the losses caused by these hazards.
Long aftershock sequences within continents and implications for earthquake hazard assessment.
Stein, Seth; Liu, Mian
2009-11-05
One of the most powerful features of plate tectonics is that the known plate motions give insight into both the locations and average recurrence interval of future large earthquakes on plate boundaries. Plate tectonics gives no insight, however, into where and when earthquakes will occur within plates, because the interiors of ideal plates should not deform. As a result, within plate interiors, assessments of earthquake hazards rely heavily on the assumption that the locations of small earthquakes shown by the short historical record reflect continuing deformation that will cause future large earthquakes. Here, however, we show that many of these recent earthquakes are probably aftershocks of large earthquakes that occurred hundreds of years ago. We present a simple model predicting that the length of aftershock sequences varies inversely with the rate at which faults are loaded. Aftershock sequences within the slowly deforming continents are predicted to be significantly longer than the decade typically observed at rapidly loaded plate boundaries. These predictions are in accord with observations. So the common practice of treating continental earthquakes as steady-state seismicity overestimates the hazard in presently active areas and underestimates it elsewhere.
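The paper's inverse scaling between sequence length and loading rate can be stated in two lines of arithmetic. A minimal sketch, in which the reference values (a roughly decade-long sequence at a roughly 30 mm/yr plate boundary) are assumptions chosen to match the orders of magnitude in the abstract:

```python
# Sketch of the scaling idea: aftershock-sequence duration varies inversely
# with fault loading rate. Reference constants are illustrative assumptions.
def aftershock_duration_years(loading_rate_mm_per_yr: float,
                              reference_rate: float = 30.0,
                              reference_duration: float = 10.0) -> float:
    """Duration ~ reference_duration * (reference_rate / loading_rate)."""
    return reference_duration * reference_rate / loading_rate_mm_per_yr

# Rapidly loaded plate boundary vs. slowly deforming continental interior:
print(aftershock_duration_years(30.0))   # ~10 years
print(aftershock_duration_years(0.3))    # ~1000 years: centuries-long sequences
```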
Beck, Matthias
2016-03-10
This paper revisits work on the socio-political amplification of risk, which predicts that those living in developing countries are exposed to greater risk than residents of developed nations. This prediction contrasts with the neoliberal expectation that market-driven improvements in working conditions within industrialising/developing nations will lead to global convergence of hazard exposure levels. It also contradicts the assumption of risk-society theorists that there will be a ubiquitous increase in risk exposure across the globe, which will primarily affect technically more advanced countries. Reviewing qualitative evidence on the impact of structural adjustment reforms in industrialising countries, the export of waste and hazardous waste recycling to these countries, and new patterns of domestic industrialisation, the paper suggests that workers in industrialising countries continue to face far greater levels of hazard exposure than those of developed countries. This view is confirmed when a data set of 105 major multi-fatality industrial disasters from 1971 to 2000 is examined. The paper concludes that there is empirical support for the predictions of socio-political amplification of risk theory, which finds clear expression in the data in a consistent pattern of significantly greater fatality rates per industrial incident in industrialising/developing countries.
Prototype operational earthquake prediction system
Spall, Henry
1986-01-01
An objective of the U.S. Earthquake Hazards Reduction Act of 1977 is to introduce, into all regions of the country that are subject to large and moderate earthquakes, systems for predicting earthquakes and assessing earthquake risk. In 1985, the USGS developed for the Secretary of the Interior a program for implementation of a prototype operational earthquake prediction system in southern California.
Effects of global change on hydro-geomorphological hazards in Mediterranean rivers
NASA Astrophysics Data System (ADS)
Andres Lopez-Tarazon, Jose
2015-04-01
Mediterranean river basins are characterized by high (often extreme) temporal variability in precipitation, and hence discharge. Mediterranean countries are considered sensitive to so-called global change, understood as the combination of climate and land use changes. All panels on climate evolution predict future scenarios of increasing frequency and magnitude of floods and extended droughts in the Mediterranean region; both floods and droughts are likely to lead to large geomorphic adjustments of river channels, so a major metamorphosis of fluvial systems is expected as a result of global change. Water resources in the Mediterranean region are subject to rising pressures and have become a key issue for all governments (i.e., there is a clear imbalance between the available water resources and the increasing water demand related to a growing human population). Such pressures are likely to give rise to major ecological and economic changes and challenges that governments need to address as a matter of priority. Changes in river flow regimes associated with global change are therefore ushering in a new era, in which there is a critical need to evaluate hydro-geomorphological hazards from headwaters to lowland areas (flooding is not just a problem of being under water). A key question is how our understanding of these hazards associated with global change can be improved; improvement has to come from integrated research that includes all physical conditions influencing the conveyance of water and sediments, and the river's capacity (i.e., amount of sediment) and competence (i.e., channel deformation) that, in turn, will influence the physical conditions at a given point in the river network. This is the framework of the present work; it aims to develop an integrated approach that both improves our understanding of how rivers are likely to evolve as a result of global change and addresses the associated hazards of fluvial environmental change.
Mohammed, Selma F; Hussain, Imad; AbouEzzeddine, Omar F; Abou Ezzeddine, Omar F; Takahama, Hiroyuki; Kwon, Susan H; Forfia, Paul; Roger, Véronique L; Redfield, Margaret M
2014-12-23
The prevalence and clinical significance of right ventricular (RV) systolic dysfunction (RVD) in patients with heart failure and preserved ejection fraction (HFpEF) are not well characterized. Consecutive, prospectively identified HFpEF (Framingham HF criteria, ejection fraction ≥50%) patients (n=562) from Olmsted County, Minnesota, underwent echocardiography at HF diagnosis and follow-up for cause-specific mortality and HF hospitalization. RV function was categorized by tertiles of tricuspid annular plane systolic excursion and by semiquantitative (normal, mild RVD, or moderate to severe RVD) 2-dimensional assessment. Whether RVD was defined by semiquantitative assessment or tricuspid annular plane systolic excursion ≤15 mm, HFpEF patients with RVD were more likely to have atrial fibrillation, pacemakers, and chronic diuretic therapy. At echocardiography, patients with RVD had slightly lower left ventricular ejection fraction, worse diastolic dysfunction, lower blood pressure and cardiac output, higher pulmonary artery systolic pressure, and more severe RV enlargement and tricuspid valve regurgitation. After adjustment for age, sex, pulmonary artery systolic pressure, and comorbidities, the presence of any RVD by semiquantitative assessment was associated with higher all-cause (hazard ratio=1.35; 95% confidence interval, 1.03-1.77; P=0.03) and cardiovascular (hazard ratio=1.85; 95% confidence interval, 1.20-2.80; P=0.006) mortality and higher first (hazard ratio=1.99; 95% confidence interval, 1.35-2.90; P=0.0006) and multiple (hazard ratio=1.81; 95% confidence interval, 1.18-2.78; P=0.007) HF hospitalization rates. RVD defined by tricuspid annular plane systolic excursion values showed similar but weaker associations with mortality and HF hospitalizations. In the community, RVD is common in HFpEF patients, is associated with clinical and echocardiographic evidence of more advanced HF, and is predictive of poorer outcomes. © 2014 American Heart Association, Inc.
Space-Time Earthquake Rate Models for One-Year Hazard Forecasts in Oklahoma
NASA Astrophysics Data System (ADS)
Llenos, A. L.; Michael, A. J.
2017-12-01
The recent one-year seismic hazard assessments for natural and induced seismicity in the central and eastern US (CEUS) (Petersen et al., 2016, 2017) rely on earthquake rate models based on declustered catalogs (i.e., catalogs with foreshocks and aftershocks removed), as is common practice in probabilistic seismic hazard analysis. However, standard declustering can remove over 90% of some induced sequences in the CEUS. Some of these earthquakes may still be capable of causing damage or concern (Petersen et al., 2015, 2016). The choices of whether and how to decluster can lead to seismicity rate estimates that vary by up to factors of 10-20 (Llenos and Michael, AGU, 2016). Therefore, in order to improve the accuracy of hazard assessments, we are exploring ways to make forecasts based on full, rather than declustered, catalogs. We focus on Oklahoma, where earthquake rates began increasing in late 2009 mainly in central Oklahoma and ramped up substantially in 2013 with the expansion of seismicity into northern Oklahoma and southern Kansas. We develop earthquake rate models using the space-time Epidemic-Type Aftershock Sequence (ETAS) model (Ogata, JASA, 1988; Ogata, AISM, 1998; Zhuang et al., JASA, 2002), which characterizes both the background seismicity rate as well as aftershock triggering. We examine changes in the model parameters over time, focusing particularly on background rate, which reflects earthquakes that are triggered by external driving forces such as fluid injection rather than other earthquakes. After the model parameters are fit to the seismicity data from a given year, forecasts of the full catalog for the following year can then be made using a suite of 100,000 ETAS model simulations based on those parameters. To evaluate this approach, we develop pseudo-prospective yearly forecasts for Oklahoma from 2013-2016 and compare them with the observations using standard Collaboratory for the Study of Earthquake Predictability tests for consistency.
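The ETAS conditional intensity that underlies this forecasting approach can be sketched compactly. The sketch below shows only the space-independent (temporal) form, and the parameter values are illustrative, not the fitted Oklahoma values:

```python
# Sketch of the temporal ETAS conditional intensity:
# lambda(t) = mu + sum_i K * exp(alpha*(M_i - Mc)) / (t - t_i + c)^p
import numpy as np

def etas_intensity(t, event_times, event_mags,
                   mu=0.5, K=0.02, alpha=1.5, c=0.01, p=1.2, Mc=2.5):
    """Background rate mu plus Omori-Utsu triggering from all past events;
    mu captures externally driven (e.g., injection-related) seismicity."""
    past = event_times < t
    dt = t - event_times[past]
    triggered = K * np.exp(alpha * (event_mags[past] - Mc)) / (dt + c) ** p
    return mu + triggered.sum()

times = np.array([1.0, 3.5, 3.6, 10.2])   # days
mags = np.array([4.0, 3.1, 2.8, 3.5])
print(etas_intensity(4.0, times, mags))   # events/day shortly after the cluster
```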
Cuthbertson, Carmen C; Kucharska-Newton, Anna; Faurot, Keturah R; Stürmer, Til; Jonsson Funk, Michele; Palta, Priya; Windham, B Gwen; Thai, Sydney; Lund, Jennifer L
2018-07-01
Frailty is a geriatric syndrome characterized by weakness and weight loss and is associated with adverse health outcomes. It is often an unmeasured confounder in pharmacoepidemiologic and comparative effectiveness studies using administrative claims data. Among the Atherosclerosis Risk in Communities (ARIC) Study Visit 5 participants (2011-2013; n = 3,146), we conducted a validation study to compare a Medicare claims-based algorithm of dependency in activities of daily living (or dependency) developed as a proxy for frailty with a reference standard measure of phenotypic frailty. We applied the algorithm to the ARIC participants' claims data to generate a predicted probability of dependency. Using the claims-based algorithm, we estimated the C-statistic for predicting phenotypic frailty. We further categorized participants by their predicted probability of dependency (<5%, 5% to <20%, and ≥20%) and estimated associations with difficulties in physical abilities, falls, and mortality. The claims-based algorithm showed good discrimination of phenotypic frailty (C-statistic = 0.71; 95% confidence interval [CI] = 0.67, 0.74). Participants classified with a high predicted probability of dependency (≥20%) had higher prevalence of falls and difficulty in physical ability, and a greater risk of 1-year all-cause mortality (hazard ratio = 5.7 [95% CI = 2.5, 13]) than participants classified with a low predicted probability (<5%). Sensitivity and specificity varied across predicted probability of dependency thresholds. The Medicare claims-based algorithm showed good discrimination of phenotypic frailty and high predictive ability with adverse health outcomes. This algorithm can be used in future Medicare claims analyses to reduce confounding by frailty and improve study validity.
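The validation step reported above (a C-statistic plus categorization by predicted probability) can be sketched as follows; the synthetic frailty labels, probabilities, and their correlation are assumptions for illustration.

```python
# Sketch of the validation step: the C-statistic for a claims-based predicted
# probability of dependency against measured phenotypic frailty is the area
# under the ROC curve; synthetic data, scikit-learn.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(4)
frail = rng.integers(0, 2, 300)                       # reference-standard frailty
# Predicted probability of dependency, loosely correlated with frailty
p_dependency = np.clip(0.10 + 0.15 * frail + rng.normal(0, 0.08, 300), 0, 1)

print(f"C-statistic: {roc_auc_score(frail, p_dependency):.2f}")

# The study's categories: <5%, 5% to <20%, and >=20% predicted probability
bins = np.digitize(p_dependency, [0.05, 0.20])
for label, b in zip(["<5%", "5% to <20%", ">=20%"], range(3)):
    print(label, "n =", (bins == b).sum())
```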
Kammerer, A.M.; ten Brink, Uri S.; Titov, V.V.
2017-01-01
In response to the 2004 Indian Ocean tsunami, the United States Nuclear Regulatory Commission (US NRC) initiated a long-term research program to improve understanding of tsunami hazard levels for nuclear facilities in the United States. For this effort, the US NRC organized a collaborative research program with the United States Geological Survey (USGS) and the National Oceanic and Atmospheric Administration (NOAA) with the goal of assessing tsunami hazard on the Atlantic and Gulf Coasts of the United States. Necessarily, the US NRC research program includes both seismic and landslide tsunamigenic sources in both the near and far fields. The inclusion of tsunamigenic landslides, an important category of sources that affect tsunami hazard levels for the Atlantic and Gulf Coasts, is a key difference between this program and most other tsunami hazard assessment programs. The initial phase of this work consisted of collection, interpretation, and analysis of available offshore data, with significant effort focused on characterizing offshore near-field landslides and analyzing their tsunamigenic potential and properties. In the next phase of research, additional field investigations will be conducted in key locations of interest and additional analysis will be undertaken. Simultaneously, the MOST tsunami generation and propagation model used by NOAA will first be enhanced to include landslide-based initiation mechanisms and then will be used to investigate the impact of the tsunamigenic sources identified and characterized by the USGS. The potential for probabilistic tsunami hazard assessment will also be explored in the final phases of the program.
LiDAR Applications in Resource Geology and Benefits for Land Management
NASA Astrophysics Data System (ADS)
Mikulovsky, R. P.; De La Fuente, J. A.
2013-12-01
The US Forest Service (US Department of Agriculture) manages a broad range of geologic resources and hazards on National Forests and Grasslands throughout the United States. Resources include rock and earth materials, groundwater, caves and paleontological resources, minerals, energy resources, and unique geologic areas. Hazards include landslides, floods, earthquakes, volcanic eruptions, and naturally occurring hazardous materials (e.g., asbestos, radon). Forest Service geologists who address these issues are Resource Geologists. They have been exploring LiDAR as a revolutionary tool to efficiently manage all of these hazards and resources. However, most LiDAR applications for management have focused on timber and fuels management rather than landforms. This study shows the applications and preliminary results of using LiDAR for managing geologic resources and hazards on public lands. Applications shown include calculating sediment budgets, mapping and monitoring landslides, mapping and characterizing borrow pits or mines, determining landslide potential, mapping faults, and characterizing groundwater-dependent ecosystems. LiDAR can be used to model potential locations of groundwater-dependent ecosystems with threatened or endangered plant species such as Howellia aquatilis. This difficult-to-locate species typically exists on the Mendocino National Forest within sag ponds on landslide benches. LiDAR metrics of known sites are used to model potential habitat. Thus LiDAR can link the disciplines of geology, hydrology, botany, archaeology, and others for enhanced land management. As LiDAR acquisition costs decrease and it becomes more accessible, land management organizations will find a wealth of applications with potential far-reaching benefits for managing geologic resources and hazards.
Site Characterization and Monitoring Technical Support Center FY16 Report
SCMTSC’s primary goal is to provide technical assistance to regional programs on complex hazardous waste site characterization issues. This annual report illustrates the range and extent of projects that SCMTSC supported in FY 2016. Our principal audiences are site project manage...
Predicting organ toxicity using in vitro bioactivity data and chemical structure
Animal testing alone cannot practically evaluate the health hazard posed by tens of thousands of environmental chemicals. Computational approaches together with high-throughput experimental data may provide more efficient means to predict chemical toxicity. Here, we use a superv...
ToxCast: EPAs Contribution to the Tox21 Consortium
The international community needs better predictive tools for assessing the hazards and risks of chemicals. It is technically feasible to collect bioactivity data on virtually all chemicals of potential concern. ToxCast is providing a proof of concept for obtaining predictive, b...
NASA Technical Reports Server (NTRS)
Baker, W. E.; Kulesz, J. J.; Ricker, R. E.; Bessey, R. L.; Westine, P. S.; Parr, V. B.; Oldham, G. A.
1975-01-01
Technology needed to predict damage and hazards from explosions of propellant tanks and bursts of pressure vessels, both near and far from these explosions, is introduced. Data are summarized in graphs, tables, and nomographs.
Multiscale modeling and simulation of embryogenesis for in silico predictive toxicology (WC9)
Translating big data from alternative and HTS platforms into hazard identification and risk assessment is an important need for predictive toxicology and for elucidating adverse outcome pathways (AOPs) in developmental toxicity. Understanding how chemical disruption of molecular ...
NASA Technical Reports Server (NTRS)
Bowles, Roland L.; Buck, Bill K.
2009-01-01
The objective of the research developed and presented in this document was to statistically assess turbulence hazard detection performance employing airborne pulse Doppler radar systems. The FAA certification methodology for forward-looking airborne turbulence radars will require estimating the probabilities of missed and false hazard indications under operational conditions. Analytical approaches must be used because it is nearly impossible to obtain sufficient statistics experimentally. This report describes an end-to-end analytical technique for estimating these probabilities for Enhanced Turbulence (E-Turb) radar systems under noise-limited conditions, for a variety of aircraft types, as defined in FAA TSO-C134. This technique provides one means, but not the only means, by which an applicant can demonstrate compliance with the FAA-directed ATDS Working Group performance requirements. Turbulence hazard algorithms were developed that derived predictive estimates of aircraft hazards from basic radar observables. These algorithms were designed to prevent false turbulence indications while accurately predicting areas of elevated turbulence risk to aircraft, passengers, and crew; they were successfully flight tested on a NASA B757-200 and a Delta Air Lines B737-800. Application of this methodology for calculating the probability of missed and false hazard indications, taking into account the effect of the various algorithms used, is demonstrated for representative transport aircraft and radar performance characteristics.
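The missed- and false-indication probabilities at the heart of this analysis follow from the noise statistics of the hazard estimate. A minimal sketch, assuming Gaussian measurement noise on the radar-derived hazard metric; the threshold and noise values are illustrative, not the report's:

```python
# Sketch of the noise-limited detection idea: with Gaussian error on the
# radar-derived hazard metric, missed and false hazard indication
# probabilities follow from the normal CDF. Values are illustrative.
from scipy.stats import norm

threshold = 0.3      # hazard-metric alerting threshold (assumed units)
sigma = 0.05         # measurement noise on the radar estimate

def p_missed(true_value):
    """P(estimate below threshold although the true hazard exceeds it)."""
    return norm.cdf(threshold, loc=true_value, scale=sigma)

def p_false(true_value):
    """P(estimate above threshold although the true hazard is below it)."""
    return 1 - norm.cdf(threshold, loc=true_value, scale=sigma)

print(p_missed(0.35))  # true hazard just above threshold: misses are common
print(p_false(0.20))   # true hazard well below threshold: false alerts are rare
```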
Lessons of L'Aquila for Operational Earthquake Forecasting
NASA Astrophysics Data System (ADS)
Jordan, T. H.
2012-12-01
The L'Aquila earthquake of 6 Apr 2009 (magnitude 6.3) killed 309 people and left tens of thousands homeless. The mainshock was preceded by a vigorous seismic sequence that prompted informal earthquake predictions and evacuations. In an attempt to calm the population, the Italian Department of Civil Protection (DPC) convened its Commission on the Forecasting and Prevention of Major Risk (MRC) in L'Aquila on 31 March 2009 and issued statements about the hazard that were widely received as an "anti-alarm"; i.e., a deterministic prediction that there would not be a major earthquake. On October 23, 2012, a court in L'Aquila convicted the vice-director of DPC and six scientists and engineers who attended the MRC meeting on charges of criminal manslaughter, and it sentenced each to six years in prison. A few weeks after the L'Aquila disaster, the Italian government convened an International Commission on Earthquake Forecasting for Civil Protection (ICEF) with the mandate to assess the status of short-term forecasting methods and to recommend how they should be used in civil protection. The ICEF, which I chaired, issued its findings and recommendations on 2 Oct 2009 and published its final report, "Operational Earthquake Forecasting: Status of Knowledge and Guidelines for Implementation," in Aug 2011 (www.annalsofgeophysics.eu/index.php/annals/article/view/5350). As defined by the Commission, operational earthquake forecasting (OEF) involves two key activities: the continual updating of authoritative information about the future occurrence of potentially damaging earthquakes, and the officially sanctioned dissemination of this information to enhance earthquake preparedness in threatened communities. Among the main lessons of L'Aquila is the need to separate the role of science advisors, whose job is to provide objective information about natural hazards, from that of civil decision-makers who must weigh the benefits of protective actions against the costs of false alarms and failures-to-predict. The best way to achieve this separation is to use probabilistic rather than deterministic statements in characterizing short-term changes in seismic hazards. The ICEF recommended establishing OEF systems that can provide the public with open, authoritative, and timely information about the short-term probabilities of future earthquakes. Because the public needs to be educated into the scientific conversation through repeated communication of probabilistic forecasts, this information should be made available at regular intervals, during periods of normal seismicity as well as during seismic crises. In an age of nearly instant information and high-bandwidth communication, public expectations regarding the availability of authoritative short-term forecasts are rapidly evolving, and there is a greater danger that information vacuums will spawn informal predictions and misinformation. L'Aquila demonstrates why the development of OEF capabilities is a requirement, not an option.
A Graph-Centric Approach for Metagenome-Guided Peptide and Protein Identification in Metaproteomics
Tang, Haixu; Li, Sujun; Ye, Yuzhen
2016-01-01
Metaproteomic studies adopt the common bottom-up proteomics approach to investigate the protein composition and the dynamics of protein expression in microbial communities. When matched metagenomic and/or metatranscriptomic data of the microbial communities are available, metaproteomic data analyses often employ a metagenome-guided approach, in which complete or fragmental protein-coding genes are first directly predicted from metagenomic (and/or metatranscriptomic) sequences or from their assemblies, and the resulting protein sequences are then used as the reference database for peptide/protein identification from MS/MS spectra. This approach is often limited because protein coding genes predicted from metagenomes are incomplete and fragmental. In this paper, we present a graph-centric approach to improving metagenome-guided peptide and protein identification in metaproteomics. Our method exploits the de Bruijn graph structure reported by metagenome assembly algorithms to generate a comprehensive database of protein sequences encoded in the community. We tested our method using several public metaproteomic datasets with matched metagenomic and metatranscriptomic sequencing data acquired from complex microbial communities in a biological wastewater treatment plant. The results showed that many more peptides and proteins can be identified when assembly graphs were utilized, improving the characterization of the proteins expressed in the microbial communities. The additional proteins we identified contribute to the characterization of important pathways such as those involved in degradation of chemical hazards. Our tools are released as open-source software on github at https://github.com/COL-IU/Graph2Pro. PMID:27918579
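The de Bruijn graph structure the method exploits can be illustrated in miniature. The sketch below builds the textbook k-mer graph from toy reads; real assembly graphs (and the protein-level traversal done by Graph2Pro) are far richer.

```python
# Minimal sketch of the de Bruijn graph idea: (k-1)-mer nodes with edges
# between overlapping k-mers; toy nucleotide reads, illustrative only.
from collections import defaultdict

def de_bruijn_graph(reads, k=4):
    """Map each (k-1)-mer prefix to the (k-1)-mer suffixes that follow it."""
    graph = defaultdict(set)
    for read in reads:
        for i in range(len(read) - k + 1):
            kmer = read[i:i + k]
            graph[kmer[:-1]].add(kmer[1:])
    return graph

reads = ["ATGGCGTGCA", "GGCGTGCAAT"]  # toy fragments
for prefix, suffixes in sorted(de_bruijn_graph(reads).items()):
    print(prefix, "->", sorted(suffixes))
# Traversing paths through such a graph is what lets fragmental gene
# predictions be extended into longer protein sequences for the search database.
```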
Simulations of defense strategies for Bennu: Material characterization and impulse delivery
Herbold, E. B.; Owen, J. M.; Swift, D. C.; ...
2015-05-19
Assessments of asteroid deflection strategies depend on material characterization to reduce the uncertainty in predictions of the deflection velocity resulting from impulsive loading. In addition to strength and equation of state, the initial state of the material, including its competency (i.e., fractured or monolithic) and the amount of micro- or macroscopic porosity, is an important consideration in predicting the thermomechanical response. There is recent interest in observing near-Earth asteroid (101955) Bennu due to its classification as potentially hazardous, with close approaches occurring every 6 years. Bennu is relatively large, with a nominal diameter of 492 m and density estimates ranging from 0.9-1.26 g/cm³, and is composed mainly of carbonaceous chondrite. There is a lack of data for highly porous carbonaceous chondrite at very large pressures and temperatures. In the absence of the specific material composition and state (e.g., layering, porosity as a function of depth) on Bennu, we introduce a continuum constitutive model based on the response of granular materials and provide impact and standoff explosion simulations to investigate the response of highly porous materials to these types of impulsive loading scenarios. Simulations with impact speeds of 5 km/s show that the shock wave emanating from the impact site is highly dispersive and that a 10% porous material has a larger compacted volume compared with a 40% porous material with the same bulk density, due to differences in compaction response.
Statistical modeling of landslide hazard using GIS
Peter V. Gorsevski; Randy B. Foltz; Paul E. Gessler; Terrance W. Cundy
2001-01-01
A model for spatial prediction of landslide hazard was applied to a watershed affected by landslide events that occurred during the winter of 1995-96, following heavy rains, and snowmelt. Digital elevation data with 22.86 m x 22.86 m resolution was used for deriving topographic attributes used for modeling. The model is based on the combination of logistic regression...
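The logistic-regression core of such a susceptibility model is straightforward to sketch. Below, the topographic attributes, their coefficients, and the synthetic grid are assumptions for illustration; in practice the attributes would be derived from the DEM described above.

```python
# Sketch of logistic-regression landslide susceptibility from gridded
# topographic attributes; synthetic slope/wetness data, scikit-learn.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(5)
n = 1000  # grid cells
slope = rng.uniform(0, 45, n)   # degrees, as would come from the 22.86 m DEM
wetness = rng.normal(8, 2, n)   # topographic wetness index
# Synthetic landslide occurrence, more likely on steep, wet cells
p_true = 1 / (1 + np.exp(-(0.1 * slope + 0.3 * wetness - 8)))
landslide = (rng.random(n) < p_true).astype(int)

X = np.column_stack([slope, wetness])
clf = LogisticRegression().fit(X, landslide)
# Per-cell probability of landslide initiation = a spatial hazard map
hazard = clf.predict_proba(X)[:, 1]
print(hazard[:5])
```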
Forest fuels and landscape-level fire risk assessment of the ozark highlands, Missouri
Michael C. Stambaugh; Richard P. Guyette; Daniel C. Dey
2007-01-01
In this paper we describe a fire risk assessment of the Ozark Highlands. Fire risk is rated using information on ignition potential and fuel hazard. Fuel loading, a component of the fire hazard module, is weakly predicted (r2 = 0.19) by site- and landscape-level attributes. Fuel loading does not significantly differ between Ozark ecological...
Prospective Changes in Alcohol Use Among Hazardous Drinkers in the Absence of Treatment
Dearing, Ronda L.; Witkiewitz, Katie; Connors, Gerard J.; Walitzer, Kimberly S.
2012-01-01
Gaining a better understanding of the natural course of hazardous alcohol consumption could inform the development of brief interventions to encourage self-change. In the current study, hazardous drinkers (based on Alcohol Use Disorders Identification Test score) were recruited using advertisements to participate in a 2-year multi-wave prospective study. Participants (N = 206) provided self-reports every six months during the study, including reports of daily alcohol consumption. The current investigation focuses on self-initiated change in participants’ frequency of heavy drinking days (i.e., ≥ 5/4 drinks per day for men/women), as predicted by a number of demographic (e.g., age) and psychosocial (e.g., guilt-proneness) variables. Latent growth curve models of the change in percent heavy drinking days over the 2-year period provided an excellent fit to the observed data and indicated a significant decline in percent heavy drinking days over time. Reductions in heavy drinking frequency were predicted by younger age and higher guilt-proneness. The identification of these predictors of reductions in heavy drinking frequency provides information to guide future work investigating self-change among hazardous drinkers. PMID:22612252
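A growth-curve analysis of this kind can be approximated with a linear mixed model (random intercepts and slopes), one common way to fit the trajectories a latent growth curve model describes; the technique swap and the synthetic data below are both assumptions of the illustration.

```python
# Sketch of a growth-curve analysis of percent heavy drinking days using a
# random-intercept, random-slope mixed model; synthetic data, statsmodels.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n, waves = 206, 5  # participants and semiannual assessments
pid = np.repeat(np.arange(n), waves)
wave = np.tile(np.arange(waves), n)
guilt = np.repeat(rng.normal(0, 1, n), waves)   # guilt-proneness score
age = np.repeat(rng.normal(45, 12, n), waves)
# Percent heavy drinking days declining over time, faster with higher guilt
pct_heavy = 40 - 3 * wave - 1.5 * guilt * wave + rng.normal(0, 8, n * waves)

df = pd.DataFrame(dict(pid=pid, wave=wave, guilt=guilt, age=age,
                       pct_heavy=pct_heavy))
model = smf.mixedlm("pct_heavy ~ wave * guilt + age", df,
                    groups=df["pid"], re_formula="~wave")
res = model.fit()
print(res.params[["wave", "wave:guilt"]])  # decline and its moderation by guilt
```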
Meeting report: Workshop on reduction and predictability of natural disasters
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rundle, J.; Klein, W.; Turcotte, D.
1997-04-21
Natural hazards such as earthquakes and severe floods are a continual menace to large segments of the population worldwide. Recently the United Nations has focused attention on this global problem by declaring the 1990s the Decade of Natural Hazard Reduction. In addition to the obvious threat to human life, natural hazards can cause severe economic hardship locally and, in an ever more complex and interactive world economy, dislocations that are felt in areas far beyond the region of a specific event. To address these concerns a workshop on Reduction and Predictability of Natural Disasters was held at the Santa Fe Institute on January 5-9, 1994. The Santa Fe Institute was originally founded in 1985 to study the emergent properties of complex nonlinear systems seen in a diversity of fields, from physical science to economics to biology. During the workshop, which brought together 25 geologists, geophysicists, hydrologists, physicists, and mathematicians, a wide variety of natural disasters and hazards were considered, including earthquakes, landslides, floods, tsunamis, hurricanes, and tornadoes. The general theme of the meeting was the application of the techniques of statistical mechanics to problems in the earth sciences.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alves, Vinicius M.; Laboratory for Molecular Modeling, Division of Chemical Biology and Medicinal Chemistry, Eshelman School of Pharmacy, University of North Carolina, Chapel Hill, NC 27599; Muratov, Eugene
Repetitive exposure to a chemical agent can induce an immune reaction in inherently susceptible individuals that leads to skin sensitization. Although many chemicals have been reported as skin sensitizers, there have been very few rigorously validated QSAR models with defined applicability domains (AD) that were developed using a large group of chemically diverse compounds. In this study, we have aimed to compile, curate, and integrate the largest publicly available dataset related to chemically-induced skin sensitization, use these data to generate rigorously validated QSAR models for skin sensitization, and employ these models as a virtual screening tool for identifying putative sensitizers among environmental chemicals. We followed best practices for model building and validation implemented with our predictive QSAR workflow, using the Random Forest modeling technique in combination with SiRMS and Dragon descriptors. The Correct Classification Rate (CCR) for QSAR models discriminating sensitizers from non-sensitizers was 71-88% when evaluated on several external validation sets, within a broad AD, with positive (for sensitizers) and negative (for non-sensitizers) predicted rates of 85% and 79%, respectively. When compared to the skin sensitization module included in the OECD QSAR Toolbox as well as to the skin sensitization model in the publicly available VEGA software, our models showed a significantly higher prediction accuracy for the same sets of external compounds as evaluated by Positive Predicted Rate, Negative Predicted Rate, and CCR. These models were applied to identify putative chemical hazards in the Scorecard database of possible skin or sense organ toxicants as primary candidates for experimental validation. Highlights: • The largest publicly available skin sensitization dataset was compiled. • Predictive QSAR models were developed for skin sensitization. • The developed models have higher prediction accuracy than the OECD QSAR Toolbox. • Putative chemical hazards in the Scorecard database were identified using our models.
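As a rough illustration of the evaluation metric used above, the Correct Classification Rate (CCR) is the mean of sensitivity and specificity. A minimal sketch with scikit-learn on a synthetic descriptor matrix; the real workflow uses curated sensitization data and SiRMS/Dragon descriptors, neither of which is reproduced here:

```python
# Sketch: Random Forest QSAR-style classifier evaluated by the Correct
# Classification Rate (CCR), the mean of sensitivity and specificity.
# Descriptor matrix and labels are synthetic, for illustration only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 50))          # 500 chemicals x 50 descriptors (synthetic)
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=500) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_tr, y_tr)

tn, fp, fn, tp = confusion_matrix(y_te, clf.predict(X_te)).ravel()
sensitivity = tp / (tp + fn)            # positive predicted rate (sensitizers)
specificity = tn / (tn + fp)            # negative predicted rate (non-sensitizers)
ccr = 0.5 * (sensitivity + specificity)
print(f"CCR = {ccr:.2f}")
```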
OPTIC NERVE INFILTRATION BY RETINOBLASTOMA: Predictive Clinical Features and Outcome.
Kaliki, Swathi; Tahiliani, Prerana; Mishra, Dilip K; Srinivasan, Visweswaran; Ali, Mohammed Hasnat; Reddy, Vijay Anand P
2016-06-01
To identify the clinical features predictive of any optic nerve infiltration and postlaminar optic nerve infiltration by retinoblastoma on histopathology and to report the outcome (metastasis and death) in these patients. Retrospective study. Of the 403 patients who underwent primary enucleation for retinoblastoma, 196 patients had optic nerve tumor infiltration (Group 1) and 207 patients had no evidence of optic nerve tumor infiltration (Group 2). Group 1 included patients with prelaminar (n = 47; 24%), laminar (n = 74; 38%), and postlaminar tumor infiltration with or without involving optic nerve transection (n = 74; 38%). Comparing Group 1 and Group 2, the patients in Group 1 had prolonged duration of symptoms (>6 months) (16% vs. 8%; P = 0.02) and were associated with no vision at presentation (23% vs. 10%; P = 0.01), higher rates of secondary glaucoma (42% vs. 12%; P < 0.0001), iris neovascularization (39% vs. 23%; P < 0.001), and larger tumors (mean tumor thickness, 12.8 mm vs. 12 mm; P = 0.0001). There was a higher prevalence of metastasis in Group 1 than in Group 2 (4% vs. 0%; P = 0.006). On multivariate analysis, clinical features predictive of any optic nerve tumor infiltration included secondary glaucoma (hazard ratio = 5.38; P < 0.001), and those predictive of postlaminar optic nerve tumor infiltration included iris neovascularization (hazard ratio = 2.66; P = 0.001) and secondary glaucoma (hazard ratio = 3.13; P < 0.001). In this study, clinical features predictive of any optic nerve tumor infiltration included secondary glaucoma, and those predictive of postlaminar optic nerve tumor infiltration included iris neovascularization and secondary glaucoma. Despite adjuvant treatment in those with postlaminar optic nerve tumor infiltration, metastasis occurred in 8% of patients.
Forecasting extreme temperature health hazards in Europe
NASA Astrophysics Data System (ADS)
Di Napoli, Claudia; Pappenberger, Florian; Cloke, Hannah L.
2017-04-01
Extreme hot temperatures, such as those experienced during a heat wave, represent a dangerous meteorological hazard to human health. Heat disorders such as sunstroke are harmful to people of all ages and responsible for excess mortality in the affected areas. In 2003 more than 50,000 people died in western and southern Europe because of a severe and sustained episode of summer heat [1]. Furthermore, according to the Intergovernmental Panel on Climate Change heat waves are expected to get more frequent in the future thus posing an increasing threat to human lives. Developing appropriate tools for extreme hot temperatures prediction is therefore mandatory to increase public preparedness and mitigate heat-induced impacts. A recent study has shown that forecasts of the Universal Thermal Climate Index (UTCI) provide a valid overview of extreme temperature health hazards on a global scale [2]. UTCI is a parameter related to the temperature of the human body and its regulatory responses to the surrounding atmospheric environment. UTCI is calculated using an advanced thermo-physiological model that includes the human heat budget, physiology and clothing. To forecast UTCI the model uses meteorological inputs, such as 2m air temperature, 2m water vapour pressure and wind velocity at body height derived from 10m wind speed, from NWP models. Here we examine the potential of UTCI as an extreme hot temperature prediction tool for the European area. UTCI forecasts calculated using above-mentioned parameters from ECMWF models are presented. The skill in predicting UTCI for medium lead times is also analysed and discussed for implementation to international health-hazard warning systems. This research is supported by the ANYWHERE project (EnhANcing emergencY management and response to extreme WeatHER and climate Events) which is funded by the European Commission's HORIZON2020 programme. [1] Koppe C. et al., Heat waves: risks and responses. World Health Organization. Health and Global Environmental Change, Series No. 2, Copenhagen, Denmark, 2004. [2] Pappenberger F. et al., Global forecasting of thermal health hazards: the skill of probabilistic predictions of the Universal Thermal Climate Index (UTCI), International Journal of Biometeorology 59(3): 311-323, 2015.
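The operational UTCI is computed from a high-order polynomial approximation of the thermo-physiological model and is too long to reproduce here. A minimal sketch of the input pipeline only, with a crude placeholder offset function; wind_at_body_height and utci_offset are illustrative assumptions, not the published formulation:

```python
# Minimal sketch of a UTCI-style pipeline: NWP fields in, a thermal index out.
# The operational UTCI is a polynomial fit to a thermo-physiological model;
# `utci_offset` below is a crude illustrative placeholder, NOT the published
# approximation.
import numpy as np

def wind_at_body_height(v10, z=1.1, z10=10.0, z0=0.01):
    # Log-profile reduction of 10 m wind to body height (assumed roughness z0).
    return v10 * np.log(z / z0) / np.log(z10 / z0)

def utci_offset(vapour_pressure_hpa, v_body):
    # Hypothetical offset: more humid -> higher index, more wind -> lower.
    return 0.3 * (vapour_pressure_hpa - 10.0) - 1.5 * (v_body - 0.5)

def utci_like(t2m_c, vapour_pressure_hpa, v10):
    return t2m_c + utci_offset(vapour_pressure_hpa, wind_at_body_height(v10))

print(utci_like(t2m_c=35.0, vapour_pressure_hpa=30.0, v10=2.0))
```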
Simulation-Based Probabilistic Tsunami Hazard Analysis: Empirical and Robust Hazard Predictions
NASA Astrophysics Data System (ADS)
De Risi, Raffaele; Goda, Katsuichiro
2017-08-01
Probabilistic tsunami hazard analysis (PTHA) is the prerequisite for rigorous risk assessment and thus for decision-making regarding risk mitigation strategies. This paper proposes a new simulation-based methodology for tsunami hazard assessment for a specific site of an engineering project along the coast, or, more broadly, for a wider tsunami-prone region. The methodology incorporates numerous uncertain parameters that are related to geophysical processes by adopting new scaling relationships for tsunamigenic seismic regions. Through the proposed methodology it is possible to obtain either a tsunami hazard curve for a single location, that is the representation of a tsunami intensity measure (such as inundation depth) versus its mean annual rate of occurrence, or tsunami hazard maps, representing the expected tsunami intensity measures within a geographical area, for a specific probability of occurrence in a given time window. In addition to the conventional tsunami hazard curve that is based on an empirical statistical representation of the simulation-based PTHA results, this study presents a robust tsunami hazard curve, which is based on a Bayesian fitting methodology. The robust approach allows a significant reduction of the number of simulations and, therefore, a reduction of the computational effort. Both methods produce a central estimate of the hazard as well as a confidence interval, facilitating the rigorous quantification of the hazard uncertainties.
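A hazard curve of the empirical kind described above can be sketched from a stochastic event set: each simulated event carries an annual occurrence rate and a computed inundation depth at the site, and the curve is the total rate of exceeding each depth. A minimal sketch on synthetic numbers; the rates and depths are illustrative only:

```python
# Sketch: empirical tsunami hazard curve from a simulated event set.
import numpy as np

rng = np.random.default_rng(1)
n_events = 2000
annual_rate = np.full(n_events, 1.0 / n_events)            # per-event rates (synthetic)
depth = rng.lognormal(mean=0.0, sigma=1.0, size=n_events)  # inundation depth [m]

depth_grid = np.linspace(0.1, 10.0, 100)
exceed_rate = np.array([annual_rate[depth > d].sum() for d in depth_grid])

# Read off a return-period depth from the (decreasing) curve, e.g. 50 years:
depth_50 = np.interp(1.0 / 50.0, exceed_rate[::-1], depth_grid[::-1])
print(f"~50-yr inundation depth: {depth_50:.2f} m")
```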
Playing against nature: improving earthquake hazard mitigation
NASA Astrophysics Data System (ADS)
Stein, S. A.; Stein, J.
2012-12-01
The great 2011 Tohoku earthquake dramatically demonstrated the need to improve earthquake and tsunami hazard assessment and mitigation policies. The earthquake was much larger than predicted by hazard models, and the resulting tsunami overtopped coastal defenses, causing more than 15,000 deaths and $210 billion damage. Hence if and how such defenses should be rebuilt is a challenging question, because the defenses fared poorly and building ones able to withstand tsunamis as large as March's would be too expensive. A similar issue arises along the Nankai Trough to the south, where new estimates warning of tsunamis 2-5 times higher than in previous models raise the question of what to do, given that the timescale on which such events may occur is unknown. Thus in the words of economist H. Hori, "What should we do in face of uncertainty? Some say we should spend our resources on present problems instead of wasting them on things whose results are uncertain. Others say we should prepare for future unknown disasters precisely because they are uncertain". Thus society needs strategies to mitigate earthquake and tsunami hazards that make economic and societal sense, given that our ability to assess these hazards is poor, as illustrated by highly destructive earthquakes that often occur in areas predicted by hazard maps to be relatively safe. Conceptually, we are playing a game against nature "of which we still don't know all the rules" (Lomnitz, 1989). Nature chooses tsunami heights or ground shaking, and society selects the strategy to minimize the total costs of damage plus mitigation costs. As in any game of chance, we maximize our expectation value by selecting the best strategy, given our limited ability to estimate the occurrence and effects of future events. We thus outline a framework to find the optimal level of mitigation by balancing its cost against the expected damages, recognizing the uncertainties in the hazard estimates. This framework illustrates the role of the uncertainties and the need to candidly assess them. It can be applied to exploring policies under various hazard scenarios and mitigating other natural hazards. [Figure caption: Variation in total cost, the sum of expected loss and mitigation cost, as a function of mitigation level. The optimal level of mitigation, n*, minimizes the total cost. The expected loss depends on the hazard model, so the better the hazard model, the better the mitigation policy (Stein and Stein, 2012).]
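The cost-minimization idea lends itself to a worked sketch: choose the mitigation level n* that minimizes total cost Q(n) = C(n) + E[L(n)]. The functional forms and numbers below are illustrative assumptions, not the authors' calibrated model:

```python
# Sketch of the optimal-mitigation framework described above: pick the
# mitigation level n* minimizing total cost = mitigation cost + expected loss.
import numpy as np
from scipy.optimize import minimize_scalar

def mitigation_cost(n):
    # Assumed linear cost of building defenses to level n (illustrative).
    return 100.0 * n

def expected_loss(n, hazard_rate=0.02, loss_if_overtopped=50000.0):
    # Higher mitigation -> smaller chance an event exceeds the defenses.
    p_exceed = np.exp(-0.8 * n)
    return hazard_rate * p_exceed * loss_if_overtopped

total = lambda n: mitigation_cost(n) + expected_loss(n)
res = minimize_scalar(total, bounds=(0.0, 20.0), method="bounded")
print(f"optimal mitigation level n* = {res.x:.2f}, total cost = {res.fun:.1f}")
```

With these assumed forms the optimum falls where marginal mitigation cost equals the marginal reduction in expected loss, here around n* ≈ 2.6; a better hazard model changes expected_loss and hence shifts n*.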
Characterization of the potential adverse effects is lacking for tens of thousands of chemicals that are present in the environment, and characterization of developmental neurotoxicity (DNT) hazard lags behind that of other adverse outcomes (e.g. hepatotoxicity). This is due in p...
Predictive Models and Tools for Assessing Chemicals under the Toxic Substances Control Act (TSCA)
EPA has developed databases and predictive models to help evaluate the hazard, exposure, and risk of chemicals released to the environment and how workers, the general public, and the environment may be exposed to and affected by them.
Seismic source characterization for the 2014 update of the U.S. National Seismic Hazard Model
Moschetti, Morgan P.; Powers, Peter; Petersen, Mark D.; Boyd, Oliver; Chen, Rui; Field, Edward H.; Frankel, Arthur; Haller, Kathleen; Harmsen, Stephen; Mueller, Charles S.; Wheeler, Russell; Zeng, Yuehua
2015-01-01
We present the updated seismic source characterization (SSC) for the 2014 update of the National Seismic Hazard Model (NSHM) for the conterminous United States. Construction of the seismic source models employs the methodology that was developed for the 1996 NSHM but includes new and updated data, data types, source models, and source parameters that reflect the current state of knowledge of earthquake occurrence and state of practice for seismic hazard analyses. We review the SSC parameterization and describe the methods used to estimate earthquake rates, magnitudes, locations, and geometries for all seismic source models, with an emphasis on new source model components. We highlight the effects that two new model components—incorporation of slip rates from combined geodetic-geologic inversions and the incorporation of adaptively smoothed seismicity models—have on probabilistic ground motions, because these sources span multiple regions of the conterminous United States and provide important additional epistemic uncertainty for the 2014 NSHM.
NASA Astrophysics Data System (ADS)
Takarada, S.
2012-12-01
The first Workshop of Asia-Pacific Region Global Earthquake and Volcanic Eruption Risk Management (G-EVER1) was held in Tsukuba, Ibaraki Prefecture, Japan from February 23 to 24, 2012. The workshop focused on the formulation of strategies to reduce the risks of disasters worldwide caused by the occurrence of earthquakes, tsunamis, and volcanic eruptions. More than 150 participants attended the workshop. During the workshop, the G-EVER1 accord was approved by the participants. The Accord consists of 10 recommendations, such as enhancing collaboration, sharing of resources, and making information about the risks of earthquakes and volcanic eruptions freely available and understandable. The G-EVER Hub website (http://g-ever.org) was established to promote the exchange of information and knowledge among the Asia-Pacific countries. Several G-EVER Working Groups and Task Forces were proposed. One of the working groups was tasked with developing the next-generation real-time volcano hazard assessment system. The next-generation volcano hazard assessment system is useful for volcanic eruption prediction, risk assessment, and evacuation at various eruption stages. The assessment system is planned to be developed based on volcanic eruption scenario datasets, a volcanic eruption database, and numerical simulations. Defining volcanic eruption scenarios based on precursor phenomena leading up to major eruptions of active volcanoes is quite important for the future prediction of volcanic eruptions. Compiling volcanic eruption scenarios after a major eruption is also important. A high-quality volcanic eruption database, which contains compilations of eruption dates, volumes, and styles, is important for the next-generation volcano hazard assessment system. The volcanic eruption database is developed based on past eruption results, which represent only a subset of possible future scenarios. Hence, distributions different from those of previous deposits are commonly observed, owing to differences in vent position, volume, eruption rate, wind direction, and topography. Therefore, numerical simulations with controlled parameters are needed for more precise volcanic eruption predictions. The next-generation system should enable the visualization of past volcanic eruption datasets, such as distributions, eruption volumes, and eruption rates, on maps and diagrams using timeline and GIS technology. Similar volcanic eruption scenarios should be easily searchable in the eruption database. Using the volcano hazard assessment system, prediction of the time and area that would be affected by volcanic eruptions at any location near the volcano should be possible using numerical simulations. The system should estimate volcanic hazard risks by overlaying the distributions of volcanic deposits on major roads, houses, and evacuation areas using a GIS-enabled system. Probabilistic volcanic hazard maps of active volcano sites should be made based on numerous numerical simulations. The next-generation real-time hazard assessment system would be implemented with a user-friendly interface, making the risk assessment system easily usable and accessible online.
1991-11-01
CES/DEEV). The Defense Reutilization and Marketing Office (DRMO) is responsible for contractual removal of hazardous waste. BES supports the program...
A multidimensional stability model for predicting shallow landslide size and shape across landscapes
David G. Milledge; Dino Bellugi; Jim A. McKean; Alexander L. Densmore; William E. Dietrich
2014-01-01
The size of a shallow landslide is a fundamental control on both its hazard and geomorphic importance. Existing models are either unable to predict landslide size or are computationally intensive such that they cannot practically be applied across landscapes. We derive a model appropriate for natural slopes that is capable of predicting shallow landslide size but...
Prediction of survival with multi-scale radiomic analysis in glioblastoma patients.
Chaddad, Ahmad; Sabri, Siham; Niazi, Tamim; Abdulkarim, Bassam
2018-06-19
We propose multiscale texture features based on a Laplacian-of-Gaussian (LoG) filter to predict progression-free survival (PFS) and overall survival (OS) in patients newly diagnosed with glioblastoma (GBM). Experiments use the extracted features derived from 40 patients with GBM with T1-weighted imaging (T1-WI) and fluid-attenuated inversion recovery (FLAIR) images that were segmented manually into areas of active tumor, necrosis, and edema. Multiscale texture features were extracted locally from each of these areas of interest using a LoG filter, and the relation of the features to OS and PFS was investigated using univariate (i.e., Spearman's rank correlation coefficient, log-rank test and Kaplan-Meier estimator) and multivariate analyses (i.e., Random Forest classifier). Three and seven features were statistically correlated with PFS and OS, respectively, with absolute correlation values between 0.32 and 0.36 and p < 0.05. Three features derived from active tumor regions only were associated with OS (p < 0.05) with hazard ratios (HR) of 2.9, 3, and 3.24, respectively. Combined features showed AUC values of 85.37 and 85.54% for predicting the PFS and OS of GBM patients, respectively, using the random forest (RF) classifier. We presented multiscale texture features to characterize the GBM regions and predict the PFS and OS. The efficiency achievable suggests that this technique can be developed into a GBM MR analysis system suitable for clinical use after a thorough validation involving more patients. Graphical abstract: Scheme of the proposed model for characterizing the heterogeneity of GBM regions and predicting the overall survival and progression-free survival of GBM patients. (1) Acquisition of pretreatment MRI images; (2) affine registration of the T1-WI image with its corresponding FLAIR images, and GBM subtype (phenotype) labelling; (3) extraction of nine texture features from the three texture scales (fine, medium, and coarse) derived from each of the GBM regions; (4) comparison of heterogeneity between GBM regions by ANOVA test; survival analysis using univariate methods (Spearman rank correlation between features and survival (i.e., PFS and OS) based on each of the GBM regions; Kaplan-Meier estimator and log-rank test to predict the PFS and OS of patient groups formed based on the median of each feature) and a multivariate method (random forest model) for predicting the PFS and OS of patient groups formed based on the medians of PFS and OS.
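Multiscale LoG texture extraction of this general kind can be sketched with scipy's gaussian_laplace filter at fine, medium, and coarse scales; the region array and feature names below are synthetic stand-ins for segmented MRI regions:

```python
# Sketch: multiscale Laplacian-of-Gaussian (LoG) texture features from an
# image region, in the spirit of the fine/medium/coarse scales described
# above. The "MRI region" is simulated noise; feature names are illustrative.
import numpy as np
from scipy.ndimage import gaussian_laplace

rng = np.random.default_rng(2)
region = rng.normal(size=(64, 64))            # stand-in for a segmented region

features = {}
for name, sigma in [("fine", 1.0), ("medium", 2.5), ("coarse", 5.0)]:
    response = gaussian_laplace(region, sigma=sigma)
    features[f"log_{name}_mean"] = response.mean()
    features[f"log_{name}_std"] = response.std()
    features[f"log_{name}_energy"] = np.mean(response ** 2)

print(features)
```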
Joeng, Hee-Koung; Chen, Ming-Hui; Kang, Sangwook
2015-01-01
Discrete survival data are routinely encountered in many fields of study including behavior science, economics, epidemiology, medicine, and social science. In this paper, we develop a class of proportional exponentiated link transformed hazards (ELTH) models. We carry out a detailed examination of the role of links in fitting discrete survival data and estimating regression coefficients. Several interesting results are established regarding the choice of links and baseline hazards. We also characterize the conditions for improper survival functions and the conditions for existence of the maximum likelihood estimates under the proposed ELTH models. An extensive simulation study is conducted to examine the empirical performance of the parameter estimates under the Cox proportional hazards model by treating discrete survival times as continuous survival times, and the model comparison criteria, AIC and BIC, in determining links and baseline hazards. A SEER breast cancer dataset is analyzed in detail to further demonstrate the proposed methodology. PMID:25772374
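One common way to fit discrete-time hazards is a binary regression on person-period data; with a complementary log-log link this corresponds to a grouped proportional hazards model, one member of the family of link choices examined in work like the above. A minimal sketch on synthetic data (assumes statsmodels >= 0.13 for links.CLogLog; older versions use links.cloglog):

```python
# Sketch: discrete-time hazard regression on person-period data with a
# complementary log-log link (grouped proportional hazards). Synthetic data.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(3)
rows = []
for i in range(300):
    x = rng.normal()
    for t in range(1, 11):                       # up to 10 discrete periods
        h = 1 - np.exp(-np.exp(-3.0 + 0.7 * x))  # true cloglog hazard
        event = rng.random() < h
        rows.append({"period": t, "x": x, "event": int(event)})
        if event:
            break                                # one row per period until event
df = pd.DataFrame(rows)

X = sm.add_constant(df[["x"]])
model = sm.GLM(df["event"], X,
               family=sm.families.Binomial(link=sm.families.links.CLogLog()))
print(model.fit().summary().tables[1])
```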
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xu, George; Zhang, Xi-Cheng
Concrete and asbestos-containing materials were widely used in U.S. Department of Energy (DOE) building construction in the 1940s and 1950s. Over the years, many of these porous building materials have been contaminated with radioactive sources, on and below the surface. This intractable radioactive-and-hazardous-asbestos mixed-waste stream has created a tremendous challenge to DOE decontamination and decommissioning (D&D) project managers. The current practice to identify asbestos and to characterize radioactive contamination depth profiles involves bore sampling, and is inefficient, costly, and unsafe. A three-year research project was started on 10/1/98 at Rensselaer with the following ultimate goals: (1) development of novel non-destructive methods for identifying the hazardous asbestos in real-time and in-situ, and (2) development of new algorithms and apparatus for characterizing the radioactive contamination depth profile in real-time and in-situ.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xu, George; Zhang, Xi-Cheng
Concrete and asbestos-containing materials were widely used in U.S. Department of Energy (DOE) building construction in the 1940s and 1950s. Over the years, many of these porous building materials have been contaminated with radioactive sources, on and below the surface. This intractable radioactive-and-hazardous-asbestos mixed-waste stream has created a tremendous challenge to DOE decontamination and decommissioning (D&D) project managers. The current practice to identify asbestos and to characterize radioactive contamination depth profiles is based solely on bore sampling, which is inefficient, costly, and unsafe. A three-year research project was started in 1998 at Rensselaer with the following ultimate goals: (1) development of novel non-destructive methods for identifying the hazardous asbestos in real-time and in-situ, and (2) development of new algorithms and apparatus for characterizing the radioactive contamination depth profile in real-time and in-situ.
Geological hazard monitoring system in Georgia
NASA Astrophysics Data System (ADS)
Gaprindashvili, George
2017-04-01
Georgia belongs to one of the world's most complex mountainous regions in terms of the scale and frequency of geological processes and the damage caused to population, farmlands, and infrastructure facilities. Geological hazards (landslides, debris flows/mudflows, rockfalls, erosion, etc.) affect many populated areas, agricultural fields, roads, oil and gas pipes, high-voltage electric power transmission towers, hydraulic structures, and tourist complexes. Landslides occur in almost all geomorphological zones, resulting in wide differentiation in failure types and mechanisms and in the size-frequency distribution. In Georgia, geological hazards are triggered by: 1. activation of highly intense earthquakes; 2. meteorological events provoking disaster processes against the background of global climatic change; 3. large-scale human impact on the environment. The prediction and monitoring of geological hazards is a very broad theme, which involves researchers from different fields. Geological hazard monitoring is essential to prevent and mitigate these hazards. In past years, several monitoring systems, such as ground-based geodetic techniques and a debris-flow Early Warning System (EWS), were installed in Georgia on highly sensitive landslide and debris-flow areas. This work presents a description of the geological hazard monitoring system in Georgia.
Beer drinking accounts for most of the hazardous alcohol consumption reported in the United States.
Rogers, J D; Greenfield, T K
1999-11-01
Patterns and correlates of hazardous drinking, defined as occasions in which five or more drinks were consumed in a day, were compared for wine, beer and distilled spirits. From a probability sample of the U.S. adult household population, 2,817 respondents who had consumed at least one drink in the previous year were selected for analysis. The results show that, in the U.S., beer accounts for the bulk of alcohol consumed by the heaviest drinkers. Beer also accounts for a disproportionate share of hazardous drinking. Logistic regression analyses revealed that drinkers who consume beer in a hazardous fashion at least monthly are more likely to be young, male and unmarried, and less likely to be black than are other drinkers. Hazardous beer consumption is more predictive of alcohol-related problems than hazardous consumption of wine or spirits. Three potential explanations for the results are considered: advertising, beer-drinking subcultures and risk compensation. Additional research is urged in order to better specify the causal role of these and other factors in hazardous beer drinking.
Mapping mountain torrent hazards in the Hexi Corridor using an evidential reasoning approach
NASA Astrophysics Data System (ADS)
Ran, Youhua; Liu, Jinpeng; Tian, Feng; Wang, Dekai
2017-02-01
The Hexi Corridor is an important part of the Silk Road Economic Belt and a crucial channel for westward development in China. Many important national engineering projects pass through the corridor, such as highways, railways, and the West-to-East Gas Pipeline. The frequent torrent disasters greatly impact the security of infrastructure and human safety. In this study, an evidential reasoning approach based on Dempster-Shafer theory is proposed for mapping mountain torrent hazards in the Hexi Corridor. A torrent hazard map for the Hexi Corridor was generated by integrating the driving factors of mountain torrent disasters including precipitation, terrain, flow concentration processes, and the vegetation fraction. The results show that the capability of the proposed method is satisfactory. The torrent hazard map shows that there is high potential torrent hazard in the central and southeastern Hexi Corridor. The results are useful for engineering planning support and resource protection in the Hexi Corridor. Further efforts are discussed for improving torrent hazard mapping and prediction.
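Dempster's rule of combination, the core of the evidential reasoning approach named above, can be sketched in a few lines. The evidence layers and mass assignments below are illustrative, not the paper's calibrated values:

```python
# Sketch: Dempster's rule of combination for two evidence layers (e.g. a
# precipitation layer and a terrain layer), each assigning mass to the
# hypotheses {hazard}, {safe}, and the uncertain set {hazard, safe}.
def combine(m1, m2):
    # Hypotheses are frozensets; mass on the full set represents ignorance.
    combined, conflict = {}, 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb          # mass assigned to the empty set
    return {h: v / (1.0 - conflict) for h, v in combined.items()}

H, S = frozenset({"hazard"}), frozenset({"safe"})
U = H | S
precip = {H: 0.6, S: 0.1, U: 0.3}            # illustrative mass assignments
terrain = {H: 0.5, S: 0.2, U: 0.3}
print(combine(precip, terrain))
```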
Volcanic hazards at Mount Rainier, Washington
Crandell, Dwight Raymond; Mullineaux, Donal Ray
1967-01-01
Mount Rainier is a large stratovolcano of andesitic rock in the Cascade Range of western Washington. Although the volcano as it now stands was almost completely formed before the last major glaciation, geologic formations record a variety of events that have occurred at the volcano in postglacial time. Repetition of some of these events today without warning would result in property damage and loss of life on a catastrophic scale. It is appropriate, therefore, to examine the extent, frequency, and apparent origin of these phenomena and to attempt to predict the effects on man of similar events in the future. The present report was prompted by a contrast that we noted during a study of surficial geologic deposits in Mount Rainier National Park, between the present tranquil landscape adjacent to the volcano and the violent events that shaped parts of that same landscape in the recent past. Natural catastrophes that have geologic causes - such as eruptions, landslides, earthquakes, and floods - all too often are disastrous primarily because man has not understood and made allowance for the geologic environment he occupies. Assessment of the potential hazards of a volcanic environment is especially difficult, for prediction of the time and kind of volcanic activity is still an imperfect art, even at active volcanoes whose behavior has been closely observed for many years. Qualified predictions, however, can be used to plan ways in which hazards to life and property can be minimized. The prediction of eruptions is handicapped because volcanism results from conditions far beneath the surface of the earth, where the causative factors cannot be seen and, for the most part, cannot be measured. Consequently, long-range predictions at Mount Rainier can be based only on the past behavior of the volcano, as revealed by study of the deposits that resulted from previous eruptions. Predictions of this sort, of course, cannot be specific as to time and locale of future events, and clearly are valid only if the past behavior is, as we believe, a reliable guide. The purpose of this report is to infer the events recorded by certain postglacial deposits at Mount Rainier and to suggest what bearing similar events in the future might have on land use within and near the park. In addition, table 2 (page 22) gives possible warning signs of an impending eruption. We want to increase man's understanding of a possibly hazardous geologic environment around Mount Rainier volcano, yet we do not wish to imply for certain that the hazards described are either immediate or inevitable. However, we do believe that hazards exist, that some caution is warranted, and that some major hazards can be avoided by judicious planning. Most of the events with which we are concerned are sporadic phenomena that have resulted directly or indirectly from volcanic eruptions. Although no eruptions (other than steam emission) of the volcano in historic time are unequivocally known (Hopson and others, 1962), pyroclastic (air-laid) deposits of pumice and rock debris attest to repeated, widely spaced eruptions during the 10,000 years or so of postglacial time. In addition, the constituents of some debris flows indicate an origin during eruptions of molten rock; other debris flows, because of their large size and constituents, are believed to have been caused by steam explosions. Some debris flows, however, are not related to volcanism at all.
Initiation process of earthquakes and its implications for seismic hazard reduction strategy.
Kanamori, H
1996-04-30
For the average citizen and the public, "earthquake prediction" means "short-term prediction," a prediction of a specific earthquake on a relatively short time scale. Such prediction must specify the time, place, and magnitude of the earthquake in question with sufficiently high reliability. For this type of prediction, one must rely on some short-term precursors. Examinations of strain changes just before large earthquakes suggest that consistent detection of such precursory strain changes cannot be expected. Other precursory phenomena such as foreshocks and nonseismological anomalies do not occur consistently either. Thus, reliable short-term prediction would be very difficult. Although short-term predictions with large uncertainties could be useful for some areas if their social and economic environments can tolerate false alarms, such predictions would be impractical for most modern industrialized cities. A strategy for effective seismic hazard reduction is to take full advantage of the recent technical advancements in seismology, computers, and communication. In highly industrialized communities, rapid earthquake information is critically important for emergency services agencies, utilities, communications, financial companies, and media to make quick reports and damage estimates and to determine where emergency response is most needed. Long-term forecast, or prognosis, of earthquakes is important for development of realistic building codes, retrofitting existing structures, and land-use planning, but the distinction between short-term and long-term predictions needs to be clearly communicated to the public to avoid misunderstanding.
NASA Technical Reports Server (NTRS)
West, Leanne; Gimmestad, Gary; Smith, William; Kireev, Stanislav; Cornman, Larry B.; Schaffner, Philip R.; Tsoucalas, George
2008-01-01
The Forward-Looking Interferometer (FLI) is a new instrument concept for obtaining measurements of potential weather hazards to alert flight crews. The FLI concept is based on high-resolution Infrared (IR) Fourier Transform Spectrometry (FTS) technologies that have been developed for satellite remote sensing, and which have also been applied to the detection of aerosols and gases for other purposes. It is being evaluated for multiple hazards including clear air turbulence (CAT), volcanic ash, wake vortices, low slant range visibility, dry wind shear, and icing, during all phases of flight. Previous sensitivity and characterization studies addressed the phenomenology that supports detection and mitigation by the FLI. Techniques for determining the range, and hence warning time, were demonstrated for several of the hazards, and a table of research instrument parameters was developed for investigating all of the hazards discussed above. This work supports the feasibility of detecting multiple hazards with an FLI multi-hazard airborne sensor, and for producing enhanced IR images in reduced visibility conditions; however, further research must be performed to develop a means to estimate the intensities of the hazards posed to an aircraft and to develop robust algorithms to relate sensor measurables to hazard levels. In addition, validation tests need to be performed with a prototype system.
Clark, Charles R; McKee, Richard H; Freeman, James J; Swick, Derek; Mahagaokar, Suneeta; Pigram, Glenda; Roberts, Linda G; Smulders, Chantal J; Beatty, Patrick W
2013-12-01
The process streams refined from petroleum crude oil for use in petroleum products are among those designated by USEPA as UVCB substances (unknown or variable composition, complex reaction products and biological materials). They are identified on global chemical inventories with unique Chemical Abstract Services (CAS) numbers and names. The chemical complexity of most petroleum substances presents challenges when evaluating their hazards and can result in differing evaluations due to the varying level of hazardous constituents and differences in national chemical control regulations. Global efforts to harmonize the identification of chemical hazards are aimed at promoting the use of consistent hazard evaluation criteria. This paper discusses a systematic approach for the health hazard evaluation of petroleum substances using chemical categories and the United Nations (UN) Globally Harmonized System (GHS) of classification and labeling. Also described are historical efforts to characterize the hazard of these substances and how they led to the development of categories, the identification of potentially hazardous constituents which should be considered, and a summary of the toxicology of the major petroleum product groups. The use of these categories can increase the utility of existing data, provide better informed hazard evaluations, and reduce the amount of animal testing required. Copyright © 2013 The Authors. Published by Elsevier Inc. All rights reserved.
Polybrominated Diphenyl Ethers (PBDEs)
EPA's existing chemicals programs address pollution prevention, risk assessment, hazard and exposure assessment and/or characterization, and risk management for chemicals substances in commercial use.
Directions of the US Geological Survey Landslide Hazards Reduction Program
Wieczorek, G.F.
1993-01-01
The US Geological Survey (USGS) Landslide Hazards Reduction Program includes studies of landslide process and prediction, landslide susceptibility and risk mapping, landslide recurrence and slope evolution, and research application and technology transfer. Studies of landslide processes have been recently conducted in Virginia, Utah, California, Alaska, and Hawaii. Landslide susceptibility maps provide a very important tool for landslide hazard reduction. The effects of engineering-geologic characteristics of rocks, seismic activity, and short- and long-term climatic change on landslide recurrence are under study. Detailed measurement of movement and deformation has begun on some active landslides. -from Author
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fu, L.J.; Johnson, E.M.; Newman, L.M.
A series of seven randomly selected potential halogenated water disinfection by-products were evaluated in vitro by the hydra assay to determine their developmental toxicity hazard potential. For six of the chemicals tested by this assay (dibromoacetonitrile; trichloroacetonitrile; 2-chlorophenol; 2,4,6-trichlorophenol; trichloroacetic acid; dichloroacetone) it was predicted that they would be generally equally toxic to both adult and embryonic mammals when studied by means of standard developmental toxicity teratology tests. However, the potential water disinfection by-product chloroacetic acid (CA) was determined to be over eight times more toxic to the embryonic developmental portion of the assay than it was to the adults. Because of this potential selectivity, CA is a high-priority item for developmental toxicity tests in pregnant mammals to confirm or refute its apparent unique developmental hazard potential and/or to establish a NOAEL by the route of most likely human exposure.
Scattering of trajectories of hazardous asteroids
NASA Astrophysics Data System (ADS)
Sokolov, Leonid; Petrov, Nikita; Kuteeva, Galina; Vasilyev, Andrey
2018-05-01
Early detection of possible collisions of asteroids with the Earth is necessary to avert the asteroid-comet hazard. Many collisions are associated with resonant returns after preceding approaches. The difficulty of predicting collisions associated with resonant returns after encounters with the Earth stems from the loss of precision in these predictions. On the other hand, we can use the fly-by effect to steer a hazardous asteroid away from collision. The main research object is the asteroid Apophis (99942), for which we found about 100 orbits of possible impacts with the Earth and more than 10 with the Moon. It is shown that an early (before 2029) change of the Apophis orbit allows all main impacts with the Earth in the 21st century associated with resonant returns to be avoided, and such a change of the orbit is, in principle, feasible. The scattering of possible trajectories of Apophis after 2029 and after 2051, as well as of 2015 RN35 and other dangerous objects, is discussed.
Predictions of asteroid hazard to the Earth for the 21st century
NASA Astrophysics Data System (ADS)
Petrov, Nikita; Sokolov, Leonid; Polyakhova, Elena; Oskina, Kristina
2018-05-01
Early detection and investigation of possible collisions and close approaches of asteroids with the Earth are necessary to avert the asteroid-comet hazard. The difficulty of predicting close approaches and collisions associated with resonant returns after encounters with the Earth stems from the loss of precision in these encounters. The main research object is asteroid Apophis (99942), for which we found many possible impact orbits associated with resonant returns. It is shown that an early orbit change of Apophis allows the main impacts associated with resonant returns to be avoided. Such a change of the orbit is, in principle, feasible. We also study possible impacts with the Earth of asteroid 2015 RN35. We present 21 possible collisions in this century, including 7 collisions with large gaps presented on the NASA website. The results of observations by the telescope ZA-320M at Pulkovo Observatory of three near-Earth asteroids, namely 7822, 20826, and 68216, two of which (7822 and 68216) are potentially hazardous, are presented.
Volcanic ash melting under conditions relevant to ash turbine interactions
Song, Wenjia; Lavallée, Yan; Hess, Kai-Uwe; Kueppers, Ulrich; Cimarelli, Corrado; Dingwell, Donald B.
2016-01-01
The ingestion of volcanic ash by jet engines is widely recognized as a potentially fatal hazard for aircraft operation. The high temperatures (1,200–2,000 °C) typical of jet engines exacerbate the impact of ash by provoking its melting and sticking to turbine parts. Estimation of this potential hazard is complicated by the fact that chemical composition, which affects the temperature at which volcanic ash becomes liquid, can vary widely amongst volcanoes. Here, based on experiments, we parameterize ash behaviour and develop a model to predict melting and sticking conditions for its global compositional range. The results of our experiments confirm that the common use of sand or dust proxy is wholly inadequate for the prediction of the behaviour of volcanic ash, leading to overestimates of sticking temperature and thus severe underestimates of the thermal hazard. Our model can be used to assess the deposition probability of volcanic ash in jet engines. PMID:26931824
Evaluating the influence of gully erosion on landslide hazard analysis triggered by heavy rainfall
NASA Astrophysics Data System (ADS)
Ruljigaljig, Tjuku; Tsai, Ching-Jun; Peng, Wen-Fei; Yu, Teng-To
2017-04-01
During rainstorm periods such as typhoons or heavy rain, the development of gullies can induce large-scale landslides. The purpose of this study is to assess and quantify the existence and development of gullies for the purpose of landslide hazard analysis. Firstly, based on multi-scale DEM data, this study uses the wavelet transform to construct an automatic algorithm. The 1-meter DEM is used to evaluate the location and type of gully, and to establish an evaluation model for predicting erosion development. In this study, routes in Chia-Yi were studied to clarify the damage potential of roadways from local gullies. The location of a gully is treated as a parameter that reduces the strength parameter. The distribution of the factor of safety (F.S.) is compared with the landslide inventory map. The result of this research could be used to increase the prediction accuracy of landslide hazard analysis for heavy rainfall.
Suh, Young Joo; Han, Kyunghwa; Chang, Suyon; Kim, Jin Young; Im, Dong Jin; Hong, Yoo Jin; Lee, Hye-Jeong; Hur, Jin; Kim, Young Jin; Choi, Byoung Wook
2017-09-01
The SYNergy between percutaneous coronary intervention with TAXus and cardiac surgery (SYNTAX) score is an invasive coronary angiography (ICA)-based score for quantifying the complexity of coronary artery disease (CAD). Although the SYNTAX score was originally developed based on ICA, recent publications have reported that coronary computed tomography angiography (CCTA) is a feasible modality for the estimation of the SYNTAX score. The aim of our study was to investigate the prognostic value of the SYNTAX score, based on CCTA, for the prediction of major adverse cardiac and cerebrovascular events (MACCEs) in patients with complex CAD. The current study was approved by the institutional review board of our institution, and informed consent was waived for this retrospective cohort study. We included 251 patients (173 men, mean age 66.0 ± 9.29 years) who had complex CAD [3-vessel disease or left main (LM) disease] on CCTA. The SYNTAX score was obtained on the basis of CCTA. Follow-up clinical outcome data regarding composite MACCEs were also obtained. Cox proportional hazards models were developed to predict the risk of MACCEs based on clinical variables, treatment, and computed tomography (CT)-SYNTAX scores. During the median follow-up period of 1517 days, there were 48 MACCEs. Univariate Cox hazards models demonstrated that MACCEs were associated with advanced age, low body mass index (BMI), and dyslipidemia (P < .2). In patients with LM disease, MACCEs were associated with a higher SYNTAX score. In patients with CT-SYNTAX score ≥23, patients who underwent coronary artery bypass graft surgery (CABG) and percutaneous coronary intervention had significantly lower hazard ratios than patients who were treated with medication alone. In the multivariate Cox hazards model, advanced age, low BMI, and higher SYNTAX score showed an increased hazard ratio for MACCE, while treatment with CABG showed a lower hazard ratio (P < .2). On the basis of our results, the CT-SYNTAX score can be a useful method for noninvasively predicting MACCEs in patients with complex CAD, especially in patients with LM disease.
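A Cox proportional hazards fit of this general shape can be sketched with the lifelines package; the columns and data below are synthetic stand-ins for the study variables, not the cohort itself:

```python
# Sketch: Cox proportional hazards model for MACCE-style outcomes using
# lifelines, with synthetic age / BMI / SYNTAX-score covariates.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(4)
n = 251
df = pd.DataFrame({
    "age": rng.normal(66, 9, n),
    "bmi": rng.normal(24, 3, n),
    "syntax_score": rng.uniform(10, 45, n),
})
# Synthetic hazard: older age and higher score shorten time to event.
risk = 0.03 * (df["age"] - 66) + 0.04 * (df["syntax_score"] - 25)
df["duration"] = rng.exponential(1500 * np.exp(-risk))   # follow-up days
df["event"] = (rng.random(n) < 0.25).astype(int)         # MACCE indicator

cph = CoxPHFitter().fit(df, duration_col="duration", event_col="event")
cph.print_summary()
```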
Ford, Michael T; Wiggins, Bryan K
2012-07-01
Interactions between occupational-level physical hazards and cognitive ability and skill requirements were examined as predictors of injury incidence rates as reported by the U.S. Bureau of Labor Statistics. Based on ratings provided in the Occupational Information Network (O*NET) database, results across 563 occupations indicate that physical hazards at the occupational level were strongly related to injury incidence rates. Also, as expected, the physical hazard-injury rate relationship was stronger among occupations with high cognitive ability and skill requirements. In addition, there was an unexpected main effect such that occupations with high cognitive ability and skill requirements had lower injury rates even after controlling for physical hazards. The main effect of cognitive ability and skill requirements, combined with the interaction with physical hazards, resulted in unexpectedly high injury rates for low-ability and low-skill occupations with low physical hazard levels. Substantive and methodological explanations for these interactions and their theoretical and practical implications are offered. Results suggest that organizations and occupational health and safety researchers and practitioners should consider the occupational level of analysis and interactions between physical hazards and cognitive requirements in future research and practice when attempting to understand and prevent injuries.
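The moderation analysis described here amounts, computationally, to a regression with an interaction term. A minimal sketch with statsmodels on synthetic data; the variable names are illustrative, not O*NET fields:

```python
# Sketch: testing a physical-hazard x cognitive-requirements interaction on
# occupational injury rates with OLS, analogous in spirit to the moderation
# analysis above. Data and names are synthetic.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(6)
n = 563
df = pd.DataFrame({
    "physical_hazards": rng.normal(0, 1, n),
    "cognitive_req": rng.normal(0, 1, n),
})
# Synthetic outcome with a positive interaction (hazard effect stronger
# where cognitive requirements are high, as the abstract reports).
df["injury_rate"] = (2.0 + 1.0 * df["physical_hazards"]
                     - 0.3 * df["cognitive_req"]
                     + 0.4 * df["physical_hazards"] * df["cognitive_req"]
                     + rng.normal(0, 1, n))

fit = smf.ols("injury_rate ~ physical_hazards * cognitive_req", data=df).fit()
print(fit.params)
```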
ToxCast: One Step in the NRC Vision of 21st Century Toxicology
The international community needs better predictive tools for assessing the hazards and risks of chemicals. It is technically feasible to collect bioactivity data on virtually all chemicals of potential concern. ToxCast is providing a proof of concept for obtaining predictive, b...
The influence of hazard models on GIS-based regional risk assessments and mitigation policies
Bernknopf, R.L.; Rabinovici, S.J.M.; Wood, N.J.; Dinitz, L.B.
2006-01-01
Geographic information systems (GIS) are important tools for understanding and communicating the spatial distribution of risks associated with natural hazards in regional economies. We present a GIS-based decision support system (DSS) for assessing community vulnerability to natural hazards and evaluating potential mitigation policy outcomes. The Land Use Portfolio Modeler (LUPM) integrates earth science and socioeconomic information to predict the economic impacts of loss-reduction strategies. However, the potential use of such systems in decision making may be limited when multiple but conflicting interpretations of the hazard are available. To explore this problem, we conduct a policy comparison using the LUPM to test the sensitivity of three available assessments of earthquake-induced lateral-spread ground failure susceptibility in a coastal California community. We find that the uncertainty regarding the interpretation of the science inputs can influence the development and implementation of natural hazard management policies. Copyright © 2006 Inderscience Enterprises Ltd.
An evaluation of treatment strategies for head and neck cancer in an African American population.
Ignacio, D N; Griffin, J J; Daniel, M G; Serlemitsos-Day, M T; Lombardo, F A; Alleyne, T A
2013-07-01
This study evaluated treatment strategies for head and neck cancers in a predominantly African American population. Data were collected utilizing medical records and the tumour registry at the Howard University Hospital. Kaplan-Meier method was used for survival analysis and Cox proportional hazards regression analysis predicted the hazard of death. Analysis revealed that the main treatment strategy was radiation combined with platinum for all stages except stage I. Cetuximab was employed in only 1% of cases. Kaplan-Meier analysis revealed stage II patients had poorer outcome than stage IV while Cox proportional hazard regression analysis (p = 0.4662) showed that stage I had a significantly lower hazard of death than stage IV (HR = 0.314; p = 0.0272). Contributory factors included tobacco and alcohol but body mass index (BMI) was inversely related to hazard of death. There was no difference in survival using any treatment modality for African Americans.
A Dynamic Hydrology-Critical Zone Framework for Rainfall-triggered Landslide Hazard Prediction
NASA Astrophysics Data System (ADS)
Dialynas, Y. G.; Foufoula-Georgiou, E.; Dietrich, W. E.; Bras, R. L.
2017-12-01
Watershed-scale coupled hydrologic-stability models are still in their early stages, and are characterized by important limitations: (a) either they assume steady-state or quasi-dynamic watershed hydrology, or (b) they simulate landslide occurrence based on a simple one-dimensional stability criterion. Here we develop a three-dimensional landslide prediction framework, based on a coupled hydrologic-slope stability model and incorporation of the influence of deep critical zone processes (i.e., flow through weathered bedrock and exfiltration to the colluvium) for more accurate prediction of the timing, location, and extent of landslides. Specifically, a watershed-scale slope stability model that systematically accounts for the contribution of driving and resisting forces in three-dimensional hillslope segments was coupled with a spatially-explicit and physically-based hydrologic model. The landslide prediction framework considers critical zone processes and structure, and explicitly accounts for the spatial heterogeneity of surface and subsurface properties that control slope stability, including soil and weathered bedrock hydrological and mechanical characteristics, vegetation, and slope morphology. To test performance, the model was applied in landslide-prone sites in the US, the hydrology of which has been extensively studied. Results showed that both rainfall infiltration in the soil and groundwater exfiltration exert a strong control on the timing and magnitude of landslide occurrence. We demonstrate the extent to which three-dimensional slope destabilizing factors, which are modulated by dynamic hydrologic conditions in the soil-bedrock column, control landslide initiation at the watershed scale.
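While the framework above is three-dimensional, the basic hydrology-stability coupling can be illustrated with a one-dimensional infinite-slope factor of safety driven by a prescribed water table; all parameter values below are illustrative assumptions, not the model's calibrated inputs:

```python
# Minimal sketch: infinite-slope factor of safety as a function of water-table
# depth, a 1-D stand-in for the coupled 3-D hydrology-stability model above.
import numpy as np

def factor_of_safety(slope_deg, soil_depth, water_depth,
                     cohesion=5e3, phi_deg=33.0,
                     gamma_soil=18e3, gamma_w=9.81e3):
    theta = np.radians(slope_deg)
    phi = np.radians(phi_deg)
    # Saturated fraction of the soil column above the failure plane.
    m = np.clip((soil_depth - water_depth) / soil_depth, 0.0, 1.0)
    normal = gamma_soil * soil_depth * np.cos(theta) ** 2
    pore = gamma_w * m * soil_depth * np.cos(theta) ** 2
    resisting = cohesion + (normal - pore) * np.tan(phi)
    driving = gamma_soil * soil_depth * np.sin(theta) * np.cos(theta)
    return resisting / driving

# A rising water table (shallower water_depth) drives FS toward failure (< 1):
for wd in [2.0, 1.0, 0.2]:
    print(wd, round(float(factor_of_safety(35.0, 2.0, wd)), 2))
```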
Johnson, Adam G.; Ruiz, Jimmy; Isom, Scott; Lucas, John T.; Hinson, William H.; Watabe, Kounosuke; Laxton, Adrian W.; Tatter, Stephen B.; Chan, Michael D.
2017-01-01
Background. In this study we attempted to discern the factors predictive of neurologic death in patients with brain metastasis treated with upfront stereotactic radiosurgery (SRS) without whole brain radiation therapy (WBRT) while accounting for the competing risk of nonneurologic death. Methods. We performed a retrospective single-institution analysis of patients with brain metastasis treated with upfront SRS without WBRT. Competing risks analysis was performed to estimate the subdistribution hazard ratios (HRs) for neurologic and nonneurologic death for predictor variables of interest. Results. Of 738 patients treated with upfront SRS alone, neurologic death occurred in 226 (30.6%), while nonneurologic death occurred in 309 (41.9%). Multivariate competing risks analysis identified an increased hazard of neurologic death associated with diagnosis-specific graded prognostic assessment (DS-GPA) ≤ 2 (P = .005), melanoma histology (P = .009), and increased number of brain metastases (P < .001), while there was a decreased hazard associated with higher SRS dose (P = .004). Targeted agents were associated with a decreased HR of neurologic death in the first 1.5 years (P = .04) but not afterwards. An increased hazard of nonneurologic death was seen with increasing age (P = .03), nonmelanoma histology (P < .001), presence of extracranial disease (P < .001), and progressive systemic disease (P = .004). Conclusions. Melanoma, DS-GPA, number of brain metastases, and SRS dose are predictive of neurologic death, while age, nonmelanoma histology, and more advanced systemic disease are predictive of nonneurologic death. Targeted agents appear to delay neurologic death. PMID:27571883
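Competing-risks quantities like these can be estimated nonparametrically with an Aalen-Johansen-style cumulative incidence function. A minimal sketch on synthetic data; the event coding and numbers are illustrative, not the study's:

```python
# Sketch: nonparametric cumulative incidence of neurologic death in the
# presence of the competing risk of nonneurologic death (Aalen-Johansen-style
# estimate). Event codes: 0 = censored, 1 = neurologic, 2 = nonneurologic.
import numpy as np

rng = np.random.default_rng(5)
n = 738
time = rng.exponential(24.0, n)                  # follow-up, months (synthetic)
event = rng.choice([0, 1, 2], size=n, p=[0.3, 0.3, 0.4])

order = np.argsort(time)
time, event = time[order], event[order]

at_risk = np.arange(n, 0, -1)                    # risk set before each time
surv = np.cumprod(1.0 - (event > 0) / at_risk)   # overall KM survival
surv_prev = np.concatenate(([1.0], surv[:-1]))   # S(t-) just before each event
cif_neuro = np.cumsum(surv_prev * (event == 1) / at_risk)
print(f"CIF of neurologic death at ~36 months: {cif_neuro[time <= 36].max():.2f}")
```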
Chung, Su Jin; Lee, Yoonju; Oh, Jungsu S; Kim, Jae Seung; Lee, Phil Hyu; Sohn, Young H
2018-05-10
The present study aimed to investigate whether the level of presynaptic dopamine neuronal loss predicts future development of wearing-off in de novo Parkinson's disease. This retrospective cohort study included a total of 342 non-demented patients with de novo Parkinson's disease who underwent dopamine transporter positron emission tomography scans at their initial evaluation and received dopaminergic medications for 24 months or longer. Onset of wearing-off was determined based on patients' medical records at their outpatient clinic visits every 3-6 months. Predictive power of dopamine transporter activity in striatal subregions and other clinical factors for the development of wearing-off was evaluated by Cox proportional hazard models. During a median follow-up period of 50.2 ± 18.9 months, 69 patients (20.2%) developed wearing-off. Patients with wearing-off exhibited less dopamine transporter activity in the putamen, particularly the anterior and posterior putamens, compared to those without wearing-off. Multivariate Cox proportional hazard models revealed that dopamine transporter activities of the anterior (hazard ratio 0.556; p = 0.008) and whole putamens (hazard ratio 0.504; p = 0.025) were significant predictors of development of wearing-off. In addition, younger age at onset of Parkinson's disease, lower body weight, and a motor phenotype of postural instability/gait disturbance were also significant predictors for development of wearing-off. The present results provide in vivo evidence to support the hypothesis that presynaptic dopamine neuronal loss, particularly in the anterior putamen, leads to development of wearing-off in Parkinson's disease. Copyright © 2018. Published by Elsevier Ltd.
RiskScape Volcano: Development of a risk assessment tool for volcanic hazards
NASA Astrophysics Data System (ADS)
Deligne, Natalia; King, Andrew; Jolly, Gill; Wilson, Grant; Wilson, Tom; Lindsay, Jan
2013-04-01
RiskScape is a multi-hazard risk assessment tool developed by GNS Science and the National Institute of Water and Atmospheric Research Ltd. (NIWA) in New Zealand that models the risk and impact of various natural hazards on a given built environment. RiskScape has a modular structure: the hazard module models hazard exposure (e.g., ash thickness at a given location), the asset module catalogues assets (built environment, infrastructure, and people) and their attributes exposed to the hazard, and the vulnerability module models the consequences of asset exposure to the hazard. Hazards presently included in RiskScape are earthquakes, river floods, tsunamis, windstorms, and ash from volcanic eruptions (specifically from Ruapehu). Here we present our framework for incorporating other volcanic hazards (e.g., pyroclastic density currents, lava flows, lahars, ground deformation) into RiskScape along with our approach for assessing asset vulnerability. We also will discuss the challenges of evaluating risk for 'point source' (e.g., stratovolcanoes) vs 'diffuse' (e.g., volcanic fields) volcanism using Ruapehu and the Auckland volcanic field as examples. Once operational, RiskScape Volcano will be a valuable resource both in New Zealand and internationally as a practical tool for evaluating risk and also as an example for how to predict the consequences of volcanic eruptions on both rural and urban environments.
NASA Astrophysics Data System (ADS)
Panzera, Francesco; Lombardo, Giuseppe; Rigano, Rosaria
2010-05-01
The seismic hazard assessment (SHA) can be performed using either deterministic or probabilistic approaches. In the present study a probabilistic analysis was carried out for the towns of Catania and Siracusa using two different procedures: the 'site' (Albarello and Mucciarelli, 2002) and the 'seismotectonic' (Cornell, 1968; Esteva, 1967) methodologies. The SASHA code (D'Amico and Albarello, 2007) was used to calculate seismic hazard through the 'site' approach, whereas the CRISIS2007 code (Ordaz et al., 2007) was adopted in the Esteva-Cornell procedure. According to current international conventions for PSHA (SSHAC, 1997), a logic tree approach was followed to consider and reduce the epistemic uncertainties for both the seismotectonic and site methods. The SASHA code handles intensity data taking into account the macroseismic information of past earthquakes. The CRISIS2007 code needs, as input elements, a seismic catalogue tested for completeness, a seismogenic zonation, and ground motion prediction equations. Data concerning the characterization of regional seismic sources and ground motion attenuation properties were taken from the literature. Special care was devoted to defining source zone models, taking into account the most recent studies on regional seismotectonic features and, in particular, the possibility of considering the Malta escarpment as a potential source. The combined use of the above-mentioned approaches allowed us to obtain useful elements to define the site seismic hazard in Catania and Siracusa. The results point out that the choice of the probabilistic model plays a fundamental role. It is indeed observed that when the site intensity data are used, the town of Catania shows hazard values higher than the ones found for Siracusa, for each considered return period. On the contrary, when the Esteva-Cornell method is used, the Siracusa urban area shows higher hazard than Catania for return periods greater than one hundred years. The higher hazard observed through the site approach for the Catania area can be interpreted in terms of the greater damage historically observed at this town and its smaller distance from the seismogenic structures. On the other hand, the higher level of hazard found for Siracusa through the Esteva-Cornell approach could be a consequence of the features of such a method, which spreads the intensities over a wide area. However, in SHA the use of a combined approach is recommended for mutual validation of the obtained results, and any choice between the two approaches is strictly linked to the knowledge of the local seismotectonic features. References: Albarello D. and Mucciarelli M.; 2002: Seismic hazard estimates using ill-defined macroseismic data at site. Pure Appl. Geophys., 159, 1289-1304. Cornell C.A.; 1968: Engineering seismic risk analysis. Bull. Seism. Soc. Am., 58(5), 1583-1606. D'Amico V. and Albarello D.; 2007: Codice per il calcolo della pericolosità sismica da dati di sito (freeware). Progetto DPC-INGV S1, http://esse1.mi.ingv.it/d12.html Esteva L.; 1967: Criterios para la construcción de espectros para diseño sísmico. Proceedings of XII Jornadas Sudamericanas de Ingeniería Estructural y III Simposio Panamericano de Estructuras, Caracas, 1967. Published later in Boletín del Instituto de Materiales y Modelos Estructurales, Universidad Central de Venezuela, No. 19. Ordaz M., Aguilar A. and Arboleda J.; 2007: CRISIS2007, Program for computing seismic hazard. Version 5.4, Mexico City: UNAM.
SSHAC (Senior Seismic Hazard Analysis Committee); 1997: Recommendations for probabilistic seismic hazard analysis: guidance on uncertainty and use of experts. NUREG/CR-6372.
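In Cornell-type PSHA, hazard-curve results are commonly expressed as the probability of at least one exceedance in t years under a Poisson assumption, P = 1 − exp(−λt) for annual exceedance rate λ. A small worked sketch in Python follows; the numbers are illustrative, not results for Catania or Siracusa.

```python
# Sketch: Poisson probability of exceedance, as used in Cornell-type PSHA.
import math

def prob_exceedance(annual_rate: float, years: float) -> float:
    """P(at least one exceedance in `years`) = 1 - exp(-rate * years)."""
    return 1.0 - math.exp(-annual_rate * years)

# A ~475-year return period corresponds to ~10% in 50 years:
rate_475 = 1.0 / 475.0
print(f"{prob_exceedance(rate_475, 50):.3f}")   # ~0.100

def return_period(prob: float, years: float) -> float:
    """Return period implied by a target exceedance probability."""
    return -years / math.log(1.0 - prob)

print(f"{return_period(0.10, 50):.0f} years")   # ~475
```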
Martinez-Aguilar, Esther; Orbe, Josune; Fernández-Montero, Alejandro; Fernández-Alonso, Sebastián; Rodríguez, Jose A; Fernández-Alonso, Leopoldo; Páramo, Jose A; Roncal, Carmen
2017-11-01
The prognosis of patients with peripheral arterial disease (PAD) is characterized by an exceptionally high risk for myocardial infarction, ischemic stroke, and death; however, studies in search of new prognostic biomarkers in PAD are scarce. Even though low levels of high-density lipoprotein cholesterol (HDL-C) have been associated with higher risk of cardiovascular (CV) complications and death in different atherosclerotic diseases, recent epidemiologic studies have challenged its prognostic utility. The aim of this study was to test the predictive value of HDL-C as a risk factor for ischemic events or death in symptomatic PAD patients. Clinical and demographic parameters of 254 symptomatic PAD patients were recorded. Amputation, ischemic coronary disease, cerebrovascular disease, and all-cause mortality were recorded during a mean follow-up of 2.7 years. Multivariate analyses showed that disease severity (critical limb ischemia) was significantly reduced in patients with normal HDL-C levels compared with the group with low HDL-C levels (multivariate analysis odds ratio, 0.09; 95% confidence interval [CI], 0.03-0.24). A decreased risk for mortality (hazard ratio, 0.46; 95% CI, 0.21-0.99) and major adverse CV events (hazard ratio, 0.38; 95% CI, 0.16-0.86) was also found in patients with normal vs reduced levels of HDL-C in both Cox proportional hazards models and Kaplan-Meier estimates, after adjustment for confounding factors. Reduced HDL-C levels were significantly associated with higher risk for development of CV complications as well as with mortality in PAD patients. These findings highlight the usefulness of this simple test for early identification of PAD patients at high risk for development of major CV events. Copyright © 2017 Society for Vascular Surgery. Published by Elsevier Inc. All rights reserved.
Convectively Induced Turbulence Encountered During NASA's Fall-2000 Flight Experiments
NASA Technical Reports Server (NTRS)
Hamilton, David W.; Proctor, Fred H.
2002-01-01
Aircraft encounters with atmospheric turbulence are a leading cause of in-flight injuries aboard commercial airliners and cost the airlines millions of dollars each year. Most of these injuries are due to encounters with turbulence in and around convection. In a recent study of 44 turbulence accident reports between 1990 and 1996, 82% of the cases were found to be near or within convective activity (Kaplan et al. 1999). According to NTSB accident reports, pilots' descriptions of these turbulence encounters include 'abrupt', 'in Instrument Meteorological Conditions (IMC)', 'saw nothing on the weather radar', and 'the encounter occurred while deviating around' convective activity. Though the FAA has provided guidelines for aircraft operating in convective environments, turbulence detection capability could decrease the number of injuries by alerting pilots to a potential encounter. The National Aeronautics and Space Administration, through its Aviation Safety Program, is addressing turbulence hazards through research, flight experiments, and data analysis. The primary focus of this program element is the characterization of turbulence and its environment, as well as the development and testing of hazard estimation algorithms for both radar and in situ detection. The ultimate goal is to operationally test sensors that will provide ample warning prior to hazardous turbulence encounters. In order to collect data in support of these activities, NASA Langley's B-757 research aircraft was directed into regions favorable for convectively induced turbulence (CIT). On these flights, the airborne predictive wind shear (PWS) radar, augmented with algorithms designed for turbulence detection, was operated in real time to test this capability. In this paper, we present the results of two research flights during which turbulence was encountered. Described is an overview of the flights, the general radar performance, and details of four encounters with severe turbulence.
NASA Astrophysics Data System (ADS)
Špitalar, Maruša
2013-04-01
Natural disasters carry a negative connotation per se: they destroy material elements in a space and nature itself, and they threaten people's lives and health. Floods, and especially flash floods, cause extensive damage because of their power and sudden onset; they are hard to predict, their movement is violent, and many lives are lost. Among natural hazards, floods cause the highest number of fatalities. Important aspects are therefore human vulnerability, risk perception, and behavior when confronted with hazardous situations, as well as issues related to adequate warning signs and channels of communication. It is important to take these segments into consideration rather than focusing mainly on structural measures. The aim of this paper is to emphasize the social aspects of floods. It consists of two main parts. The first addresses human vulnerability and risk perception in the face of danger from rising waters, and how culture influences people's response and reaction to flood casualties. The second part draws on detailed information about the circumstances of death, collected from several different sources in several EU countries, including the age and gender of the people who lost their lives in flood events. Males dominated among the deceased, since they tend to take more risks in dangerous situations, and a vulnerable age group among flood fatalities was also identified. Analysis of the circumstances of death enabled us to define at-risk groups, which is very important for flood managers. This is also beneficial for risk prevention, for early warning systems, and for creating the best channels so that information about upcoming danger successfully reaches people in hazardous areas and others can avoid those areas.
NASA Technical Reports Server (NTRS)
Rutishauser, David K.; Epp, Chirold; Robertson, Ed
2012-01-01
The Autonomous Landing Hazard Avoidance Technology (ALHAT) Project is chartered to develop and mature to a Technology Readiness Level (TRL) of six an autonomous system combining guidance, navigation and control with terrain sensing and recognition functions for crewed, cargo, and robotic planetary landing vehicles. The ALHAT System must be capable of identifying and avoiding surface hazards to enable a safe and accurate landing to within tens of meters of designated and certified landing sites anywhere on a planetary surface under any lighting conditions. Since its inception in 2006, the ALHAT Project has executed four field test campaigns to characterize and mature sensors and algorithms that support real-time hazard detection and global/local precision navigation for planetary landings. The driving objective for Government Fiscal Year 2012 (GFY2012) is to successfully demonstrate autonomous, real-time, closed loop operation of the ALHAT system in a realistic free flight scenario on Earth using the Morpheus lander developed at the Johnson Space Center (JSC). This goal represents an aggressive target consistent with a lean engineering culture of rapid prototyping and development. This culture is characterized by prioritizing early implementation to gain practical lessons learned and then building on this knowledge with subsequent prototyping design cycles of increasing complexity culminating in the implementation of the baseline design. This paper provides an overview of the ALHAT/Morpheus flight demonstration activities in GFY2012, including accomplishments, current status, results, and lessons learned. The ALHAT/Morpheus effort is also described in the context of a technology path in support of future crewed and robotic planetary exploration missions based upon the core sensing functions of the ALHAT system: Terrain Relative Navigation (TRN), Hazard Detection and Avoidance (HDA), and Hazard Relative Navigation (HRN).
NASA Astrophysics Data System (ADS)
Satta, Alessio; Snoussi, Maria; Puddu, Manuela; Flayou, Latifa; Hout, Radouane
2016-06-01
The regional risk assessment carried out within the ClimVar & ICZM Project identified the coastal zone of Tetouan as a hotspot of the Mediterranean Moroccan coast, and so it was chosen for the application of the Multi-Scale Coastal Risk Index for Local Scale (CRI-LS). The local scale approach provides a useful tool for local coastal planning and management by exploring the effects and extents of the hazards and combining hazard, vulnerability, and exposure variables in order to identify areas where the risk is relatively high. The coast of Tetouan is one of the coastal areas that have been most rapidly and densely urbanized in Morocco, and it is characterized by an erosive shoreline. Local authorities are facing the complex task of balancing development against managing coastal risks, especially coastal erosion and flooding, while preparing for the unavoidable impacts of climate change. The first phase of the application of the CRI-LS methodology to Tetouan consisted of defining the coastal hazard zone, which results from overlaying the erosion hazard zone and the flooding hazard zone. Nineteen variables were chosen to describe the hazard, vulnerability, and exposure factors. The scores corresponding to each variable were calculated and the weights assigned through expert judgement elicitation. The resulting values are hosted in a geographic information system (GIS) platform that enables the individual variables and aggregated risk scores to be color-coded and mapped across the coastal hazard zone. The results indicated that 10% and 27% of the investigated littoral fall under very high and high vulnerability, respectively, because of the combination of high erosion rates with high-capital land use. The risk map showed that some areas, especially the flood plains of Restinga, Smir and Martil-Alila, extending more than 5 km from the coast, are characterized by high levels of risk due to the low topography of the flood plains and the high values of exposure. The CRI-LS provides a set of maps that allow the identification of areas within the coastal hazard zone at relatively higher risk from climate-related hazards. The method can be used to support the coastal planning and management process in selecting the most suitable adaptation measures.
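The scoring-and-weighting step described for CRI-LS amounts to weighted aggregation of normalized variables into hazard, vulnerability, and exposure factors, and then into one index. Below is a minimal Python sketch of that arithmetic; the variable names, scores, weights, and the geometric-mean aggregation are illustrative assumptions, not the published CRI-LS coefficients or formula.

```python
# Sketch: weighted aggregation of scored risk variables (illustrative
# values; not the published CRI-LS coefficients).
import numpy as np

def factor_score(scores: dict, weights: dict) -> float:
    """Weighted mean of 1-5 variable scores for one factor."""
    keys = scores.keys()
    w = np.array([weights[k] for k in keys], dtype=float)
    s = np.array([scores[k] for k in keys], dtype=float)
    return float((w * s).sum() / w.sum())

hazard = factor_score({"wave_height": 4, "erosion_rate": 5},
                      {"wave_height": 0.4, "erosion_rate": 0.6})
vulnerability = factor_score({"elevation": 5, "land_cover": 3},
                             {"elevation": 0.7, "land_cover": 0.3})
exposure = factor_score({"population": 4, "land_value": 5},
                        {"population": 0.5, "land_value": 0.5})

# One common aggregation choice: geometric mean of the three factors.
risk_index = (hazard * vulnerability * exposure) ** (1.0 / 3.0)
print(round(risk_index, 2))
```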
How EPA Assesses Chemical Safety
EPA's existing chemicals programs address pollution prevention, risk assessment, hazard and exposure assessment and/or characterization, and risk management for chemical substances in commercial use.
Fact Sheet: N-Methylpyrrolidone (NMP)
EPA's existing chemicals programs address pollution prevention, risk assessment, hazard and exposure assessment and/or characterization, and risk management for chemical substances in commercial use.
Current Chemical Risk Management Activities
EPA's existing chemicals programs address pollution prevention, risk assessment, hazard and exposure assessment and/or characterization, and risk management for chemical substances in commercial use.
Goodman-Meza, David; Pitpitan, Eileen V.; Semple, Shirley J.; Wagner, Karla D.; Chavarin, Claudia V.; Strathdee, Steffanie A.; Patterson, Thomas L.
2015-01-01
Background and Objectives Male clients of female sex workers (FSWs) are at high risk for HIV. Whereas the HIV risks of alcohol use are well understood, less is known about hazardous alcohol use among male clients of FSWs, particularly in Mexico. We sought to identify risk factors for hazardous alcohol use and test associations between hazardous alcohol use and HIV risk behaviour among male clients in Tijuana. Method Male clients of FSWs in Tijuana (n = 400) completed a quantitative interview in 2008. The AUDIT was used to characterize hazardous alcohol use. Multivariate logistic regression was used to determine independent associations of demographic and HIV risk variables with hazardous alcohol use (vs. non-hazardous). Results Forty percent of our sample met criteria for hazardous alcohol use. Variables independently associated with hazardous drinking were reporting any sexually transmitted infection (STI), having sex with a FSW while under the influence of alcohol, being younger than 36 years of age, living in Tijuana, and ever having been jailed. Hazardous drinkers were less likely ever to have been deported or to have shared injection drugs. Discussion and Conclusions Hazardous alcohol use is associated with HIV risk, including engaging in sex with FSWs while intoxicated and having an STI among male clients of FSWs in Tijuana. Scientific Significance We systematically described patterns and correlates of hazardous alcohol use among male clients of FSWs in Tijuana, Mexico. The results suggest that HIV/STI risk reduction interventions must target hazardous alcohol users, and be tailored to address alcohol use. PMID:25066863
Comber, Mike H I; Walker, John D; Watts, Chris; Hermens, Joop
2003-08-01
The use of quantitative structure-activity relationships (QSARs) for deriving the predicted no-effect concentration of discrete organic chemicals for the purposes of conducting a regulatory risk assessment in Europe and the United States is described. In the United States, under the Toxic Substances Control Act (TSCA), the TSCA Interagency Testing Committee and the U.S. Environmental Protection Agency (U.S. EPA) use SARs to estimate the hazards of existing and new chemicals. Within the Existing Substances Regulation in Europe, QSARs may be used for data evaluation, test strategy indications, and the identification and filling of data gaps. To illustrate where and when QSARs may be useful and when their use is more problematic, an example, methyl tertiary-butyl ether (MTBE), is given and the predicted and experimental data are compared. Improvements needed for new QSARs and tools for developing and using QSARs are discussed.
Correlates of AUDIT risk status for male and female college students.
Demartini, Kelly S; Carey, Kate B
2009-01-01
The current study identified gender-specific correlates of hazardous drinker status as defined by the AUDIT. A total of 462 college student volunteers completed the study in 2006. The sample was predominantly Caucasian (75%) and female (55%). Participants completed a survey assessing demographics, alcohol use patterns, and health indices. Scores of 8 or more on the AUDIT defined the at-risk subsample. Logistic regression models determined which variables predicted AUDIT risk status for men and women. The at-risk participants reported higher alcohol use and related problems, elevated sleep problems and lower health ratings. High typical blood alcohol concentration (BAC), lifetime drug use, and psychosocial problems predicted risk status for males. Binge frequency and psychosocial problems predicted risk status for females. Different behavioral profiles emerged for men and women identified as hazardous drinkers on the AUDIT. The efficacy of brief alcohol interventions could be enhanced by addressing these behavioral correlates.
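The workflow in both AUDIT studies above, dichotomizing scores at ≥ 8 and regressing risk status on candidate correlates, is standard logistic regression. A hedged sketch on synthetic data follows, assuming the statsmodels package; the column names and data are invented for illustration.

```python
# Sketch: logistic regression of AUDIT risk status (score >= 8) on
# candidate predictors. Data and column names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 462
df = pd.DataFrame({
    "audit_score": rng.integers(0, 25, n),
    "typical_bac": rng.uniform(0.0, 0.20, n),   # typical estimated BAC
    "binge_freq": rng.integers(0, 10, n),       # binge episodes per month
})
df["at_risk"] = (df["audit_score"] >= 8).astype(int)   # AUDIT cutoff

X = sm.add_constant(df[["typical_bac", "binge_freq"]])
fit = sm.Logit(df["at_risk"], X).fit(disp=0)
print(np.exp(fit.params))   # odds ratios per unit increase
```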
Earthquake Prediction in a Big Data World
NASA Astrophysics Data System (ADS)
Kossobokov, V. G.
2016-12-01
The digital revolution, which started just about 15 years ago, has already surpassed a global information storage capacity of more than 5000 exabytes (in optimally compressed bytes) per year. Open data in a Big Data World provides unprecedented opportunities for enhancing studies of the Earth System. However, it also opens wide avenues for deceptive associations in inter- and transdisciplinary data and for misleading predictions based on so-called "precursors". Earthquake prediction is not an easy task, and it implies a delicate application of statistics. So far, none of the proposed short-term precursory signals has shown sufficient evidence to be used as a reliable precursor of catastrophic earthquakes. Regretfully, in many cases of seismic hazard assessment (SHA), from term-less to time-dependent (probabilistic PSHA or deterministic DSHA), and short-term earthquake forecasting (StEF), the claims of a high potential of the method are based on a flawed application of statistics and, therefore, are hardly suitable for communication to decision makers. Self-testing must be done in advance of claiming prediction of hazardous areas and/or times. The necessity and possibility of applying simple tools of earthquake prediction strategies is evident; in particular, the Error Diagram, introduced by G.M. Molchan in the early 1990s, and the Seismic Roulette null hypothesis as a metric of the alerted space. The set of errors, i.e., the rates of failure and of the alerted space-time volume, can be easily compared to random guessing, a comparison that permits evaluating the effectiveness of an SHA method and determining the optimal choice of parameters with regard to a given cost-benefit function. This and other information obtained in such simple testing may supply us with realistic estimates of the confidence and accuracy of SHA predictions and, if reliable but not necessarily perfect, with related recommendations on the level of risks for decision making with regard to engineering design, insurance, and emergency management. Examples of independent expertise of "seismic hazard maps", "precursors", and "forecast/prediction methods" are provided.
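The error-diagram test mentioned here plots a strategy's miss rate ν against the fraction τ of space-time kept on alert; random guessing lies on the diagonal ν = 1 − τ, and skill shows up as points below it. A minimal Python sketch of that bookkeeping on synthetic alarms and events:

```python
# Sketch: one Molchan error-diagram point for an alarm strategy.
# nu = fraction of target events missed; tau = fraction of space-time
# on alert. Random guessing satisfies nu = 1 - tau.
import numpy as np

def error_diagram_point(alerted: np.ndarray, events: np.ndarray):
    """alerted: boolean alarm mask over space-time cells;
    events: boolean mask of cells containing target earthquakes."""
    tau = alerted.mean()
    hits = (alerted & events).sum()
    nu = 1.0 - hits / events.sum()
    return tau, nu

rng = np.random.default_rng(3)
alerted = rng.random(10_000) < 0.2     # 20% of cells on alert
events = rng.random(10_000) < 0.01     # synthetic target events

tau, nu = error_diagram_point(alerted, events)
print(f"tau={tau:.2f}, nu={nu:.2f}, random-guess nu would be {1 - tau:.2f}")
```

Because these synthetic alarms are independent of the events, the printed point falls near the diagonal; a method with real skill would fall below it.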
Mine Waste at the Kherzet Youcef Mine: Environmental Characterization
NASA Astrophysics Data System (ADS)
Issaad, Mouloud; Boutaleb, Abdelhak; Kolli, Omar
2017-04-01
Mining activity in Algeria has existed since antiquity, but it became very important in the 20th century. This activity has virtually ceased since the beginning of the 1990s, leaving many mine sites abandoned (so-called orphan mines). The abandonment of mining today poses many environmental problems (soil pollution, contamination of surface water, mining collapses...). Mining wastes often occupy large volumes that can be hazardous to the environment and human health and were often neglected in the past: faulty geotechnical implementation, acid mine drainage (AMD), alkalinity, and the presence of pollutants and toxic substances (heavy metals, cyanide...). The study started six years ago and covers all mines located in NE Algeria, almost all of which have been closed for more than thirty years, so the most important task is to obtain an overview of the whole study area. After the inventory of the abandoned mines, rock drainage prediction will help us classify sites according to their acid-generating potential.
NASA Astrophysics Data System (ADS)
Lengline, O.; Marsan, D.; Got, J.; Pinel, V.
2007-12-01
The evolution of the seismicity at three basaltic volcanoes (Kilauea, Mauna Loa and Piton de la Fournaise) is analysed during phases of magma accumulation. We show that the VT seismicity during these time periods is characterized by an exponential increase at long time scales (years). Such an exponential acceleration can be explained by a model of seismicity forced by the replenishment of a magmatic reservoir. The increase in stress in the edifice caused by this replenishment is modeled. This stress history leads to a cumulative amount of damage, i.e., VT earthquakes, following the same exponential increase as found for the seismicity. A long-term seismicity precursor is thus detected at basaltic volcanoes. Although this precursory signal is not able to predict the onset times of future eruptions (as no diverging point is present in the model), it may help mitigate volcanic hazards.
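The long-term exponential acceleration described can be quantified by fitting N(t) = A·exp(t/τ) to cumulative VT counts. A minimal sketch with scipy on synthetic counts; A, τ, and the noise model are stand-ins, not values from the study.

```python
# Sketch: fitting an exponential increase to cumulative VT event counts.
# Synthetic data; A and the e-folding time tau are the fit targets.
import numpy as np
from scipy.optimize import curve_fit

def model(t, A, tau):
    return A * np.exp(t / tau)

t_years = np.linspace(0.0, 6.0, 50)
rng = np.random.default_rng(5)
counts = model(t_years, 20.0, 2.5) * rng.lognormal(0.0, 0.05, t_years.size)

(A, tau), _ = curve_fit(model, t_years, counts, p0=(10.0, 1.0))
print(f"A={A:.1f} events, e-folding time tau={tau:.2f} yr")
```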
Srinonprasert, V; Chalermsri, C; Aekplakorn, W
2018-05-04
Frailty is a clinical state of increased vulnerability resulting from aging-associated decline. We aimed to determine whether a Thai Frailty Index predicted all-cause mortality in community-dwelling older Thais when accounting for age, gender and socioeconomic status. Data from 8195 subjects aged 60 years and over from the Fourth Thai National Health Examination Survey were used to create the Thai Frailty Index by calculating the ratio of accumulated deficits, with a cut-off point of 0.25 used to define frailty. The associations were explored using Cox proportional hazard models. The mean age of participants was 69.2 years (SD 6.8). The prevalence of frailty was 22.1%. The Thai Frailty Index significantly predicted mortality (hazard ratio = 2.34, 95% CI 2.10-2.61, p < 0.001). The association between frailty and mortality was stronger in males (hazard ratio = 2.71, 95% CI 2.33-3.16). Higher wealth status had a protective effect among non-frail older adults but not among frail ones. In community-dwelling older Thai adults, the Thai Frailty Index demonstrated a high prevalence of frailty and predicted mortality. Frail older Thai adults did not gain the mortality-reducing protective effect of higher socioeconomic status. Maintaining health rather than accumulating wealth may be better for a longer, healthier life for older people in middle-income countries. Copyright © 2018. Published by Elsevier B.V.
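A deficit-accumulation frailty index of this kind is the ratio of deficits present to deficits measured, with frailty defined here by a 0.25 cut-off. A minimal Python sketch follows; the deficit items are invented placeholders, not the actual Thai Frailty Index items.

```python
# Sketch: deficit-accumulation frailty index with a 0.25 frailty cutoff.
# Deficit items are illustrative, not the actual Thai Frailty Index items.
def frailty_index(deficits: dict) -> float:
    """deficits: item -> value in [0, 1], with 1 = deficit fully present.
    Returns the ratio of accumulated deficits to deficits measured."""
    present = [v for v in deficits.values() if v is not None]  # skip missing
    return sum(present) / len(present)

person = {"low_grip_strength": 1, "slow_gait": 1, "adl_dependency": 0,
          "polypharmacy": 0, "weight_loss": 1, "poor_self_rated_health": 0}
fi = frailty_index(person)
print(f"FI={fi:.2f}, frail={fi > 0.25}")
```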
Shallow translational slides hazard evaluation in Santa Marta de Penaguião (Douro valley - Portugal)
NASA Astrophysics Data System (ADS)
Pereira, Susana; Luís Zêzere, José; Bateira, Carlos
2010-05-01
The present study is developed for the municipality of Santa Marta de Penaguião (70 square kilometers), located in the Douro Valley region (Northern Portugal). In the past, several destructive landslides occurred in this area and were responsible for deaths and destruction of houses and roads. Despite these losses, mitigation and landslide zonation programs are missing, and land use planning at the municipal level has not yet solved the problem. The study area is mainly composed of metamorphic rocks (e.g., schist and quartzite). These rocks are strongly fractured, and weathered materials are abundant in clayey schist, mainly in those areas where agricultural terraces were constructed centuries ago for the vineyard monoculture. From the geomorphologic point of view, the study area is characterized by deeply incised valleys, tectonic depressions and slopes controlled by the geological structure. Elevation ranges from 49 m to 1416 m. The main landslide triggering factor is rainfall, and the mean annual precipitation ranges from 700 mm (in the bottom of fluvial valleys) to 2500 mm (on the mountain tops). A landslide inventory was compiled in 2005-2009 using aerial photo-interpretation (1:5,000 scale) and field work. The inventory includes 848 landslides, most of the shallow translational slide type (85% of total slope movements). The landslide density is 10.5 events per square kilometer, and the average landslide area is 535 square meters. The susceptibility to shallow translational slide occurrence was assessed at the 1:10,000 scale in a GIS environment. Two different bivariate statistical methods were used to evaluate landslide susceptibility: the Information Value and the Fuzzy Logic Gamma operator. Eight conditioning factors were weighted and integrated to model susceptibility: slope angle, slope aspect, slope curvature, lithology, geomorphologic units, fault density, land use, and terrace structures built on slopes. The susceptibility results were validated using a random partition of the total set of shallow translational slides into two groups (a training group and a validation group, randomly defined, each corresponding to 50% of the complete landslide population). This strategy allows the independent validation of landslide susceptibility models and the construction of prediction rate curves. The best prediction results were obtained using the Information Value method (Area Under Curve - AUC = 0.78). The landslide susceptibility map was classified into 5 susceptibility classes using the slope breaks within the best prediction curve. The empirical probability for each class was also estimated. Landslide hazard was assessed based on empirical probabilities, using an instability scenario similar to the event that occurred in January 2001, which generated 603 shallow translational slides with a total unstable area of 93,029 square meters. This landslide event was triggered by 1064 mm of cumulative rainfall in 90 days, with an 18-year return period. Therefore, we assume that a future occurrence of such a rainfall amount will generate the same consequences regarding slope instability in the study area (i.e., the same number of landslides and an equivalent total unstable area). The landslide hazard was also calculated per year to allow hazard comparison with other areas. The obtained results have short temporal validity and must be carefully analyzed due to rapid changes in land use aimed at gaining more space for vineyard plantations.
In recent years, the slope structures that sustained soil erosion have been systematically replaced by terraces without soil support structures. In this context, the conditioning factors and the susceptibility and hazard maps need to be regularly reassessed.
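The Information Value method that performed best here weights each class of a conditioning factor by the log ratio of its landslide density to the regional average density; summing the weights across factors per cell gives the susceptibility score. A compact Python sketch with synthetic pixel counts (the class breakdown is invented):

```python
# Sketch: Information Value weights for one conditioning factor
# (synthetic pixel counts per slope-angle class).
import math

total_pixels = 1_000_000
total_landslide_pixels = 93_029   # unstable area expressed in pixels, say

classes = {                        # class -> (pixels, landslide pixels)
    "slope_0_10":  (500_000, 10_000),
    "slope_10_25": (350_000, 43_000),
    "slope_25+":   (150_000, 40_029),
}

prior = total_landslide_pixels / total_pixels
for name, (npix, nls) in classes.items():
    iv = math.log((nls / npix) / prior)   # IV > 0 favors instability
    print(f"{name}: IV={iv:+.2f}")
```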
Development of Algal Interspecies Correlation Estimation Models for Chemical Hazard Assessment
Web-based Interspecies Correlation Estimation (ICE) is an application developed to predict the acute toxicity of a chemical from 1 species to another taxon. Web-ICE models use the acute toxicity value for a surrogate species to predict effect values for other species, thus potent...
Prediction of Composition and Emission Characteristics of Articles in Support of Exposure Assessment
The risk to humans from chemicals in consumer products is dependent on both hazard and exposure. The prediction and quantification of near-field (i.e., indoor) chemical exposure from household articles such as furniture and building materials is an ongoing effort. As opposed to (...
ToxCast: One Step in the NRC Vision of 21st Century Toxicology (T)
The international community needs better predictive tools for assessing the hazards and risks of chemicals. It is technically feasible to collect bioactivity data on virtually all chemicals of potential concern ToxCast is providing a proof of concept for obtaining predictive, b...
ToxCast: One Step in the NRC Vision of 21st Century Toxicology (S)
The international community needs better predictive tools for assessing the hazards and risks of chemicals. It is technically feasible to collect bioactivity data on virtually all chemicals of potential concern ToxCast is providing a proof of concept for obtaining predictive, b...
Predicting tree mortality following gypsy moth defoliation
D.E. Fosbroke; R.R. Hicks; K.W. Gottschalk
1991-01-01
Appropriate application of gypsy moth control strategies requires an accurate prediction of the distribution and intensity of tree mortality prior to defoliation. This prior information is necessary to better target investments in control activities where they are needed. This poster lays the groundwork for developing hazard-rating systems for forests of the...
Vinson, Daniel C.; Turner, Barbara J.; Manning, Brian K.; Galliher, James M.
2013-01-01
PURPOSE In clinical practice, detection of alcohol problems often relies on clinician suspicion instead of using a screening instrument. We assessed the sensitivity, specificity, and predictive values of clinician suspicion compared with screening-detected alcohol problems in patients. METHODS We undertook a cross-sectional study of 94 primary care clinicians’ office visits. Brief questionnaires were completed separately after a visit by both clinicians and eligible patients. The patient’s anonymous exit questionnaire screened for hazardous drinking based on the Alcohol Use Disorders Identification Test-Consumption (AUDIT-C) and for harmful drinking (alcohol abuse or dependence) based on 2 questions from the Diagnostic and Statistical Manual of Mental Disorders. After the visit, clinicians responded to the question, “Does this patient have problems with alcohol?” with answer options including “yes, hazardous drinking” and “yes, alcohol abuse or dependence.” Analyses assessed the associations between patients’ responses to screening questions and clinician’s suspicions. RESULTS Of 2,518 patients with an office visit, 2,173 were eligible, and 1,699 (78%) completed the exit questionnaire. One hundred seventy-one (10.1%) patients had a positive screening test for hazardous drinking (an AUDIT-C score of 5 or greater) and 64 (3.8%) for harmful drinking. Clinicians suspected alcohol problems in 81 patients (hazardous drinking in 37, harmful drinking in 40, and both in 4). The sensitivity of clinician suspicion of either hazardous or harmful drinking was 27% and the specificity was 98%. Positive and negative predictive values were 62% and 92%, respectively. CONCLUSION Clinician suspicion of alcohol problems had poor sensitivity but high specificity for identifying patients who had a positive screening test for alcohol problems. These data support the routine use of a screening tool to supplement clinicians’ suspicions, which already provide reasonable positive predictive value. PMID:23319506
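The four screening statistics reported follow directly from the 2×2 table of clinician suspicion against the screening result. A small sketch of that arithmetic; the cell counts are hypothetical values chosen only to roughly reproduce the reported rates.

```python
# Sketch: sensitivity, specificity, PPV, NPV from a 2x2 screening table.
# Counts are illustrative, chosen to roughly echo the reported rates.
def screening_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    return {
        "sensitivity": tp / (tp + fn),   # suspected among screen-positive
        "specificity": tn / (tn + fp),   # not suspected among screen-negative
        "ppv": tp / (tp + fp),           # screen-positive among suspected
        "npv": tn / (tn + fn),
    }

# Hypothetical counts summing to 1,699 completed questionnaires:
m = screening_metrics(tp=50, fp=31, fn=135, tn=1483)
print({k: f"{v:.2f}" for k, v in m.items()})
```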
Integrative Chemical-Biological Read-Across Approach for Chemical Hazard Classification
Low, Yen; Sedykh, Alexander; Fourches, Denis; Golbraikh, Alexander; Whelan, Maurice; Rusyn, Ivan; Tropsha, Alexander
2013-01-01
Traditional read-across approaches typically rely on the chemical similarity principle to predict chemical toxicity; however, the accuracy of such predictions is often inadequate due to the underlying complex mechanisms of toxicity. Here we report on the development of a hazard classification and visualization method that draws upon both chemical structural similarity and comparisons of biological responses to chemicals measured in multiple short-term assays ("biological" similarity). The Chemical-Biological Read-Across (CBRA) approach infers each compound's toxicity from those of both chemical and biological analogs whose similarities are determined by the Tanimoto coefficient. Classification accuracy of CBRA was compared to that of classical RA and other methods using chemical descriptors alone, or in combination with biological data. Different types of adverse effects (hepatotoxicity, hepatocarcinogenicity, mutagenicity, and acute lethality) were classified using several biological data types (gene expression profiling and cytotoxicity screening). CBRA-based hazard classification exhibited consistently high external classification accuracy and applicability to diverse chemicals. Transparency of the CBRA approach is aided by the use of radial plots that show the relative contribution of analogous chemical and biological neighbors. Identification of both chemical and biological features that give rise to the high accuracy of CBRA-based toxicity prediction facilitates mechanistic interpretation of the models. PMID:23848138
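CBRA's neighbor selection rests on the Tanimoto coefficient, |A∩B|/|A∪B| over binary fingerprint bits, applied to both chemical and biological descriptor vectors. A minimal sketch with invented fingerprints:

```python
# Sketch: Tanimoto similarity between binary fingerprints, the
# neighbor-selection metric named in the CBRA abstract.
import numpy as np

def tanimoto(a: np.ndarray, b: np.ndarray) -> float:
    """a, b: binary (0/1) fingerprint vectors of equal length."""
    a, b = a.astype(bool), b.astype(bool)
    union = np.logical_or(a, b).sum()
    return np.logical_and(a, b).sum() / union if union else 1.0

chem_fp_query = np.array([1, 0, 1, 1, 0, 0, 1, 0])       # hypothetical bits
chem_fp_candidate = np.array([1, 0, 1, 0, 0, 1, 1, 0])
print(f"{tanimoto(chem_fp_query, chem_fp_candidate):.2f}")  # 3/5 = 0.60
```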
Beck, Matthias
2016-01-01
This paper revisits work on the socio-political amplification of risk, which predicts that those living in developing countries are exposed to greater risk than residents of developed nations. This prediction contrasts with the neoliberal expectation that market-driven improvements in working conditions within industrialising/developing nations will lead to a global convergence of hazard exposure levels. It also contradicts the assumption of risk society theorists that there will be a ubiquitous increase in risk exposure across the globe, which will primarily affect technically more advanced countries. Reviewing qualitative evidence on the impact of structural adjustment reforms in industrialising countries, the export of waste and hazardous waste recycling to these countries, and new patterns of domestic industrialisation, the paper suggests that workers in industrialising countries continue to face far greater levels of hazard exposure than those of developed countries. This view is confirmed when a data set including 105 major multi-fatality industrial disasters from 1971 to 2000 is examined. The paper concludes that there is empirical support for the predictions of socio-political amplification of risk theory, which finds clear expression in the data in a consistent pattern of significantly greater fatality rates per industrial incident in industrialising/developing countries. PMID:26978378
Abdominal Circumference Versus Body Mass Index as Predictors of Lower Extremity Overuse Injury Risk.
Nye, Nathaniel S; Kafer, Drew S; Olsen, Cara; Carnahan, David H; Crawford, Paul F
2018-02-01
Abdominal circumference (AC) is superior to body mass index (BMI) as a measure of risk for various health outcomes. Our objective was to compare AC and BMI as predictors of lower extremity overuse injury (LEOI) risk. We conducted a retrospective review of the electronic medical records of 79,868 US Air Force personnel over a 7-year period (2005-2011) for the incidence of new LEOI. Subjects were stratified by BMI and AC. Injury risk for BMI/AC subgroups was calculated using Kaplan-Meier curves and Cox proportional-hazards regression. Receiver operating characteristic curves with area under the curve were used to compare each model's predictive value. Cox proportional-hazards regression showed significant risk associations between elevated BMI, AC, and all injury types, with hazard ratios ranging from 1.230 to 3.415 for obese versus normal BMI and from 1.665 to 3.893 for high-risk versus low-risk AC (P < .05 for all measures). Receiver operating characteristic curves with area under the curve showed equivalent performance between BMI and AC for predicting all injury types. However, the combined model (AC and BMI) showed improved predictive ability over either model alone for joint injury, overall LEOI, and most strongly for osteoarthritis. Although AC and BMI alone performed similarly well, a combined approach using BMI and AC together improved risk estimation for LEOI.
Staley, Dennis M.
2014-01-01
Wildfire can significantly alter the hydrologic response of a watershed to the extent that even modest rainstorms can produce dangerous flash floods and debris flows. In this report, empirical models are used to predict the probability and magnitude of debris-flow occurrence in response to a 10-year rainstorm for the 2013 Springs fire in Ventura County, California. Overall, the models predict a relatively high probability (60–80 percent) of debris flow for 9 of the 99 drainage basins in the burn area in response to a 10-year recurrence interval design storm. Predictions of debris-flow volume suggest that debris flows may entrain a significant volume of material, with 28 of the 99 basins identified as having potential debris-flow volumes greater than 10,000 cubic meters. These results of the relative combined hazard analysis suggest there is a moderate likelihood of significant debris-flow hazard within and downstream of the burn area for nearby populations, infrastructure, wildlife, and water resources. Given these findings, we recommend that residents, emergency managers, and public works departments pay close attention to weather forecasts and National Weather Service-issued Debris Flow and Flash Flood Outlooks, Watches, and Warnings, and that residents adhere to any evacuation orders.
Li, Jiejie; Wang, Yilong; Lin, Jinxi; Wang, David; Wang, Anxin; Zhao, Xingquan; Liu, Liping; Wang, Chunxue; Wang, Yongjun
2015-07-01
Elevated soluble CD40 ligand (sCD40L) has been shown to be related to cardiovascular events, but the role of sCD40L in predicting recurrent stroke remains unclear. Baseline sCD40L levels were measured in 3044 consecutive patients with acute minor stroke and transient ischemic attack, who had previously been enrolled in the Clopidogrel in High-Risk Patients With Acute Nondisabling Cerebrovascular Events (CHANCE) trial. A Cox proportional-hazards model was used to assess the association of sCD40L with recurrent stroke. Patients in the top tertile of sCD40L levels had an increased risk of recurrent stroke compared with those in the bottom tertile, after adjustment for conventional confounding factors (hazard ratio, 1.49; 95% confidence interval, 1.11-2.00; P=0.008). Patients with elevated levels of both sCD40L and high-sensitivity C-reactive protein also had an increased risk of recurrent stroke (hazard ratio, 1.81; 95% confidence interval, 1.23-2.68; P=0.003). Elevated sCD40L levels independently predict recurrent stroke in patients with minor stroke and transient ischemic attack. URL: http://www.clinicaltrials.gov. Unique identifier: NCT00979589. © 2015 American Heart Association, Inc.
Fact Sheet: Benzidine-Based Chemical Substances
EPA's existing chemicals programs address pollution prevention, risk assessment, hazard and exposure assessment and/or characterization, and risk management for chemical substances in commercial use.
Fact Sheet: 1-Bromopropane (1-BP)
EPA's existing chemicals programs address pollution prevention, risk assessment, hazard and exposure assessment and/or characterization, and risk management for chemical substances in commercial use.
Fact Sheet: Nonylphenols and Nonylphenol Ethoxylates
EPA's existing chemicals programs address pollution prevention, risk assessment, hazard and exposure assessment and/or characterization, and risk management for chemical substances in commercial use.
Petersen, M.D.; Pankow, K.L.; Biasi, G.P.; Meremonte, M.
2008-01-01
The February 21, 2008 Wells, NV earthquake (M 6) was felt throughout eastern Nevada, southern Idaho, and western Utah. The town of Wells sustained significant damage to unreinforced masonry buildings. The earthquake occurred in a region of low seismic hazard with little seismicity, low geodetic strain rates, and few mapped faults. The peak horizontal ground acceleration predicted by the USGS National Seismic Hazard Maps is about 0.2 g at 2% probability of exceedance in 50 years, with the contributions coming mostly from the Ruby Mountain fault and background seismicity (M 5-7.0). The hazard model predicts that the probability of occurrence of an M>6 event within 50 km of Wells is about 15% in 100 years. Although the earthquake was inside the USArray Transportable Array network, the nearest on-scale recordings of ground motions from the mainshock were too distant to estimate accelerations in town. The University of Nevada Reno, the University of Utah, and the U.S. Geological Survey deployed portable instruments to capture the ground motions from aftershocks of this rare normal-faulting event. Shaking from a M 4.7 aftershock recorded on portable instruments at distances less than 10 km exceeded 0.3 g, and sustained accelerations above 0.1 g lasted for about 5 seconds. For a magnitude 5 earthquake at 10 km distance, the NGA equations predict median peak ground accelerations of about 0.1 g. Ground motions from normal-faulting earthquakes are poorly represented in the ground motion prediction equations. We compare portable and Transportable Array ground-motion recordings with prediction equations. Advanced National Seismic System stations in Utah recorded ground motions of about 2% g at 250 km from the mainshock. The maximum ground motion recorded in Salt Lake City was in the center of the basin. We analyze the spatial variability of ground motions (rock vs. soil) and the influence of the Salt Lake Basin in modifying the ground motions. We then compare these data with the September 28, 2004 Parkfield aftershocks to contrast the differences between strike-slip and normal-faulting ground motions.
Fairlie, Anne M.; Maggs, Jennifer L.; Lanza, Stephanie T.
2016-01-01
Objective: Types of college drinkers have been identified using traditional measures (e.g., 12-month drinking frequency). We used an alternative multidimensional approach based on daily reports of alcohol behaviors to identify college drinker statuses, each with a unique behavioral profile. The current study aimed to (a) identify drinker statuses at the week level across four semesters, (b) examine the predictive utility of drinker status by testing associations with senior-year hazardous drinking and dependence symptoms, and (c) identify concurrent predictors (gender, drinking motivations, hazardous drinking, any dependence symptoms) of senior-year drinker status. We also compared the week-level drinker statuses with drinker statuses identified using traditional measures. Method: A multi-ethnic sample of U.S. college students completed 14-day bursts of daily web surveys across college (91%–96% completed ≥6 daily reports of the sampled week). Analyses focus on nine alcohol-related behaviors (including estimated blood alcohol concentration, pregaming, and drinking games) assessed daily in spring/sophomore year to fall/senior year and drinking motivations, hazardous drinking, and dependence symptoms assessed fall/senior year (n = 569; 56% women). Results: Four week-level drinker statuses were replicated across semesters: Nondrinker, Light Weekend, Heavy Weekend, and Heavy Frequent. Across semesters, drinker status was associated with senior-year hazardous drinking and any dependence symptoms. Senior-year fun/social motivations were also associated with senior-year drinker status. Differences in behavioral profiles between week-level drinker statuses and those identified using traditional measures were found. Conclusions: Replicable week-level drinker statuses were identified, suggesting consistency in possible types of drinking weeks. Drinker statuses were predictive of senior-year hazardous drinking and dependence symptoms. PMID:26751353
Safety focused modeling of lithium-ion batteries: A review
NASA Astrophysics Data System (ADS)
Abada, S.; Marlair, G.; Lecocq, A.; Petit, M.; Sauvant-Moynot, V.; Huet, F.
2016-02-01
Safety issues pertaining to Li-ion batteries justify intensive testing all along their value chain. However, progress in scientific knowledge regarding the failure modes of lithium-based batteries, as well as remarkable technological breakthroughs in computing science, now allow for the development and use of prediction tools to assist designers in developing safer batteries. Accordingly, this paper offers a review of significant modeling works performed in the area, with a focus on the characterization of the thermal runaway hazard and the related triggering events. Progress made in models aiming at integrating battery ageing effects and the related physics is also discussed, as well as the strong interaction with modeling-focused use of testing, and the main achievements obtained towards marketing safer systems. Current limitations and new challenges or opportunities that are expected to shape future modeling activity are also put in perspective. According to market trends, it is anticipated that safety may still act as a restraint in the search for an acceptable compromise with the overall performance and cost of lithium-ion based and post-lithium-ion rechargeable batteries of the future. In that context, high-throughput prediction tools capable of screening the properties of adequate new components, giving access to both functional and safety-related aspects, are highly desirable.
Abdel Raheem, Ali; Shin, Tae Young; Chang, Ki Don; Santok, Glen Denmer R; Alenzi, Mohamed Jayed; Yoon, Young Eun; Ham, Won Sik; Han, Woong Kyu; Choi, Young Deuk; Rha, Koon Ho
2018-06-19
To develop a predictive nomogram for chronic kidney disease-free survival probability in the long term after partial nephrectomy. A retrospective analysis was carried out of 698 patients with T1 renal tumors undergoing partial nephrectomy at a tertiary academic institution. A multivariable Cox regression analysis was carried out based on parameters proven to have an impact on postoperative renal function. Patients with incomplete data, <12 months follow up and preoperative chronic kidney disease stage III or greater were excluded. The study end-points were to identify independent risk factors for new-onset chronic kidney disease development, as well as to construct a predictive model for chronic kidney disease-free survival probability after partial nephrectomy. The median age was 52 years, the median tumor size was 2.5 cm and the mean warm ischemia time was 28 min. A total of 91 patients (13.1%) developed new-onset chronic kidney disease at a median follow up of 60 months. The chronic kidney disease-free survival rates at 1, 3, 5 and 10 years were 97.1%, 94.4%, 85.3% and 70.6%, respectively. On multivariable Cox regression analysis, age (hazard ratio 1.041, P = 0.001), male sex (hazard ratio 1.653, P < 0.001), diabetes mellitus (hazard ratio 1.921, P = 0.046), tumor size (hazard ratio 1.331, P < 0.001) and preoperative estimated glomerular filtration rate (hazard ratio 0.937, P < 0.001) were independent predictors of new-onset chronic kidney disease. The C-index for chronic kidney disease-free survival was 0.853 (95% confidence interval 0.815-0.895). We developed a novel nomogram for predicting the 5-year chronic kidney disease-free survival probability after on-clamp partial nephrectomy. This model might have an important role in partial nephrectomy decision-making and follow-up planning after surgery. External validation of our nomogram in a larger cohort of patients should be considered. © 2018 The Japanese Urological Association.
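The discrimination statistic reported (C-index 0.853) is Harrell's concordance between predicted risk and the observed survival ordering. A hedged sketch on synthetic data, assuming the lifelines utility function:

```python
# Sketch: Harrell's C-index for a Cox model's risk scores
# (synthetic data; lifelines utility function).
import numpy as np
from lifelines.utils import concordance_index

rng = np.random.default_rng(11)
n = 698
risk_score = rng.normal(0.0, 1.0, n)    # e.g., a Cox model's linear predictor
times = rng.exponential(60.0 * np.exp(-0.8 * risk_score))  # higher risk -> earlier
events = (times < 120).astype(int)      # administrative censoring at 120 months
times = np.clip(times, None, 120)

# concordance_index expects predictions ordered like survival times,
# so pass the *negative* risk score.
print(f"C-index: {concordance_index(times, -risk_score, events):.3f}")
```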
Yeh, Hsin-Chih; Jan, Hau-Chern; Wu, Wen-Jeng; Li, Ching-Chia; Li, Wei-Ming; Ke, Hung-Lung; Huang, Shu-Pin; Liu, Chia-Chu; Lee, Yung-Chin; Yang, Sheau-Fang; Liang, Peir-In; Huang, Chun-Nung
2015-01-01
Objectives To investigate the impact of preoperative hydronephrosis and flank pain on the prognosis of patients with upper tract urothelial carcinoma. Methods In total, 472 patients with upper tract urothelial carcinoma managed by radical nephroureterectomy were included from the Kaohsiung Medical University Hospital Healthcare System. Clinicopathological data were collected retrospectively for analysis. The significance of hydronephrosis, especially when combined with flank pain, and other relevant factors for overall and cancer-specific survival was evaluated. Results Of the 472 patients, 292 (62%) had preoperative hydronephrosis and 121 (26%) presented with flank pain. Preoperative hydronephrosis was significantly associated with age, hematuria, flank pain, tumor location, and pathological tumor stage. Concurrent presence of hydronephrosis and flank pain was a significant predictor of non-organ-confined disease (multivariate-adjusted hazard ratio = 2.10, P = 0.025). Kaplan-Meier analysis showed significantly poorer overall and cancer-specific survival in patients with preoperative hydronephrosis (P = 0.005 and P = 0.026, respectively) and in patients with flank pain (P < 0.001 and P = 0.001, respectively) than in those without. However, only simultaneous hydronephrosis and flank pain independently predicted adverse outcomes (hazard ratio = 1.98, P = 0.016 for overall survival and hazard ratio = 1.87, P = 0.036 for cancer-specific survival) in multivariate Cox proportional hazards models. In addition, the concurrent presence of hydronephrosis and flank pain was also significantly predictive of worse survival in patients with high-grade or muscle-invasive disease. Notably, there was no difference in survival between patients with hydronephrosis but without flank pain and those without hydronephrosis. Conclusion Concurrent preoperative presence of hydronephrosis and flank pain predicted non-organ-confined status of upper tract urothelial carcinoma. When accompanied by flank pain, hydronephrosis represented an independent predictor of worse outcome in patients with upper tract urothelial carcinoma. PMID:26469704
Maxwell, Aaron W P; Baird, Grayson L; Iannuccilli, Jason D; Mayo-Smith, William W; Dupuy, Damian E
2017-05-01
Purpose To evaluate the performance of the radius, exophytic or endophytic, nearness to collecting system or sinus, anterior or posterior, and location relative to polar lines (RENAL) nephrometry and preoperative aspects and dimensions used for anatomic classification (PADUA) scoring systems and other tumor biometrics for prediction of local tumor recurrence in patients with renal cell carcinoma after thermal ablation. Materials and Methods This HIPAA-compliant study was performed with a waiver of informed consent after institutional review board approval was obtained. A retrospective evaluation of 207 consecutive patients (131 men, 76 women; mean age, 71.9 years ± 10.9) with 217 biopsy-proven renal cell carcinoma tumors treated with thermal ablation was conducted. Serial postablation computed tomography (CT) or magnetic resonance (MR) imaging was used to evaluate for local tumor recurrence. For each tumor, RENAL nephrometry and PADUA scores were calculated by using imaging-derived tumor morphologic data. Several additional tumor biometrics and combinations thereof were also measured, including maximum tumor diameter. The Harrell C index and hazard regression techniques were used to quantify associations with local tumor recurrence. Results The RENAL (hazard ratio, 1.43; P = .003) and PADUA (hazard ratio, 1.80; P < .0001) scores were found to be significantly associated with recurrence when regression techniques were used but demonstrated only poor to fair discrimination according to Harrell C index results (C, 0.68 and 0.75, respectively). Maximum tumor diameter showed the highest discriminatory strength of any individual variable evaluated (C, 0.81) and was also significantly predictive when regression techniques were used (hazard ratio, 2.98; P < .0001). For every 1-cm increase in diameter, the estimated rate of recurrence risk increased by 198%. Conclusion Maximum tumor diameter demonstrates superior performance relative to existing tumor scoring systems and other evaluated biometrics for prediction of local tumor recurrence after renal cell carcinoma ablation. © RSNA, 2016.
Saur, Randi; Hansen, Marianne Bang; Jansen, Anne; Heir, Trond
2017-04-01
To explore the types of risks and hazards that visually impaired individuals face, how they manage potential threats, and how reactions to traumatic events are manifested and coped with. Participants were 17 visually impaired individuals who had experienced some kind of potentially traumatic event. Two focus groups and 13 individual interviews were conducted. The participants experienced a variety of hazards and potential threats in their daily life. Fear of daily accidents was more pronounced than fear of disasters. Some participants reported avoiding help-seeking in unsafe situations due to shame at not being able to cope. The ability to be independent was highlighted. Traumatic events were re-experienced through a variety of sense modalities. Fear of labelling and avoidance of potential risks were recurring topics, and the risks of social withdrawal and isolation were addressed. Visual impairment creates a need for predictability and adequate information in order to prepare for coping and increase self-efficacy. The results from this study call for greater emphasis on universal design in order to ensure safety and predictability. Fear of being labelled may inhibit people from using assistive devices and adequate coping strategies and from seeking professional help in the aftermath of a trauma. Implications for Rehabilitation Visual impairment entails a greater susceptibility to a variety of hazards and potential threats in daily life. This calls for a greater emphasis on universal design in public spaces to ensure confidence and safety. Visual impairment implies a need for predictability and adequate information to prepare for coping and self-efficacy. Rehabilitation professionals should be aware of the need for independence and self-reliance, the possible fear of labelling, and avoidance of help-seeking or reluctance to use assistive devices. In rehabilitation after accidents or potentially traumatizing events, professionals' knowledge about the needs for information, training and predictability is crucial. The possibility of social withdrawal or isolation should be considered.