Sample records for simulated hazardous low-level

  1. Airflow Hazard Visualization for Helicopter Pilots: Flight Simulation Study Results

    NASA Technical Reports Server (NTRS)

    Aragon, Cecilia R.; Long, Kurtis R.

    2005-01-01

    Airflow hazards such as vortices or low level wind shear have been identified as a primary contributing factor in many helicopter accidents. US Navy ships generate airwakes over their decks, creating potentially hazardous conditions for shipboard rotorcraft launch and recovery. Recent sensor developments may enable the delivery of airwake data to the cockpit, where visualizing the hazard data may improve safety and possibly extend ship/helicopter operational envelopes. A prototype flight-deck airflow hazard visualization system was implemented on a high-fidelity rotorcraft flight dynamics simulator. Experienced helicopter pilots, including pilots from all five branches of the military, participated in a usability study of the system. Data was collected both objectively from the simulator and subjectively from post-test questionnaires. Results of the data analysis are presented, demonstrating a reduction in crash rate and other trends that illustrate the potential of airflow hazard visualization to improve flight safety.

  2. Alert generation and cockpit presentation for an integrated microburst alerting system

    NASA Technical Reports Server (NTRS)

    Wanke, Craig; Hansman, R. John, Jr.

    1991-01-01

    Alert generation and cockpit presentation issues for low-level wind shear (microburst) alerts are investigated. Alert generation issues center on the development of a hazard criterion which allows integration of both ground-based and airborne wind shear detection systems to form an accurate picture of the aviation hazard posed by a particular wind shear situation. A methodology for testing a hazard criterion through flight simulation has been developed and used to examine the effectiveness and feasibility of several possible criteria. Also, an experiment to evaluate candidate graphical cockpit displays for microburst alerts using a piloted simulator has been designed.
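
    For context, one widely used hazard criterion from the wind-shear literature of this period is the aircraft-referenced "F-factor," which folds horizontal shear and downdraft into a single measure of performance loss. The sketch below is illustrative only; it does not reproduce the paper's own candidate criteria, and the alert threshold shown is an assumption.

    ```python
    import numpy as np

    def f_factor(dWx_dt, w_h, airspeed, g=9.81):
        """Wind-shear hazard index F (dimensionless).

        F = (dWx/dt)/g - w_h/V, where dWx/dt is the rate of change of the
        horizontal wind along the flight path (increasing tailwind > 0),
        w_h is the vertical wind (updraft > 0), and V is true airspeed (m/s).
        Positive F means the wind field is draining energy from the aircraft.
        """
        return dWx_dt / g - w_h / airspeed

    # Illustrative microburst encounter: growing tailwind shear plus a downdraft.
    F = f_factor(dWx_dt=2.5, w_h=-5.0, airspeed=75.0)
    print(F, "hazardous" if F > 0.1 else "manageable")  # 0.1 threshold is an assumption
    ```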

  3. Risks for the global freshwater system at 1.5 °C and 2 °C global warming

    NASA Astrophysics Data System (ADS)

    Döll, Petra; Trautmann, Tim; Gerten, Dieter; Müller Schmied, Hannes; Ostberg, Sebastian; Saaed, Fahad; Schleussner, Carl-Friedrich

    2018-04-01

    To support implementation of the Paris Agreement, the new HAPPI ensemble of 20 bias-corrected simulations of four climate models was used to drive two global hydrological models, WaterGAP and LPJmL, for assessing freshwater-related hazards and risks in worlds approximately 1.5 °C and 2 °C warmer than pre-industrial. Quasi-stationary HAPPI simulations are better suited than transient CMIP-like simulations for assessing hazards at the two targeted long-term global warming (GW) levels. We analyzed seven hydrological hazard indicators that characterize freshwater-related hazards for humans, freshwater biota and vegetation. Using a strict definition for significant differences, we identified for all but one indicator that areas with either significantly wetter or drier conditions (calculated as percent changes from 2006–2015) are smaller in the 1.5 °C world. For example, 7 day high flow is projected to increase significantly on 11% and 21% of the global land area at 1.5 °C and 2 °C, respectively. However, differences between hydrological hazards at the two GW levels are significant on less than 12% of the area. GW affects a larger area and more people by increases—rather than by decreases—of mean annual and 1-in-10 dry year streamflow, 7 day high flow, and groundwater recharge. The opposite is true for 7 day low flow, maximum snow storage, and soil moisture in the driest month of the growing period. Mean annual streamflow shows the lowest projected percent changes of all indicators. Among country groups, low income countries and lower middle income countries are most affected by decreased low flows and increased high flows, respectively, while high income countries are least affected by such changes. The incremental impact between 1.5 °C and 2 °C on high flows would be felt most by low income and lower middle income countries, the effect on soil moisture and low flows most by high income countries.

  4. Simulation investigation of the effect of the NASA Ames 80- by 120-Foot Wind Tunnel exhaust flow on light aircraft operating in the Moffett Field traffic pattern

    NASA Technical Reports Server (NTRS)

    Streeter, Barry G.

    1986-01-01

    A preliminary study of the exhaust flow from the Ames Research Center 80 by 120 Foot Wind Tunnel indicated that the flow might pose a hazard to low-flying light aircraft operating in the Moffett Field traffic pattern. A more extensive evaluation of the potential hazard was undertaken using a fixed-base, piloted simulation of a light, twin-engine, general-aviation aircraft. The simulated aircraft was flown through a model of the wind tunnel exhaust by pilots of varying experience levels to develop a data base of aircraft and pilot reactions. It is shown that a light aircraft would be subjected to a severe disturbance which, depending upon entry condition and pilot reaction, could result in a low-altitude stall or cause damage to the aircraft tail structure.

  5. Hazardous Convective Weather in the Central United States: Present and Future

    NASA Astrophysics Data System (ADS)

    Liu, C.; Ikeda, K.; Rasmussen, R.

    2017-12-01

    Two sets of 13-year continental-scale convection-permitting simulations were performed using the 4-km-resolution WRF model. They consist of a retrospective simulation, which downscales the ERA-Interim reanalysis during the period October 2000 - September 2013, and a future climate sensitivity simulation for the same period based on the perturbed reanalysis-derived boundary conditions with the CMIP5 ensemble-mean high-end emission scenario climate change. The evaluation of the retrospective simulation indicates that the model is able to realistically reproduce the main characteristics of deep precipitating convection observed in the current climate, such as the spectra of the convective population and propagating mesoscale convective systems (MCSs). It is also shown that severe convection and associated MCSs will increase in frequency and intensity, implying a potential increase in high-impact convective weather in a future warmer climate. In this study, the warm-season hazardous convective weather (i.e., tornadoes, hail, and damaging gusty winds) in the central United States is examined using these 4-km downscaling simulations. First, a model-based proxy for hazardous convective weather is derived on the basis of a set of characteristic meteorological variables such as the model composite radar reflectivity, updraft helicity, vertical wind shear, and low-level wind. Second, the developed proxy is applied to the retrospective simulation to estimate the model hazardous weather events during the historical period. Third, the simulated hazardous weather statistics are evaluated against the NOAA severe weather reports. Lastly, the proxy is applied to the future climate simulation to project the change of hazardous convective weather in response to global warming. Preliminary results will be reported at the 2017 AGU session "High Resolution Climate Modeling".

  6. Quantitative Microbial Risk Assessment for Clostridium perfringens in Natural and Processed Cheeses

    PubMed Central

    Lee, Heeyoung; Lee, Soomin; Kim, Sejeong; Lee, Jeeyeon; Ha, Jimyeong; Yoon, Yohan

    2016-01-01

    This study evaluated the risk of Clostridium perfringens (C. perfringens) foodborne illness from natural and processed cheeses. Microbial risk assessment in this study was conducted according to four steps: hazard identification, hazard characterization, exposure assessment, and risk characterization. The hazard identification of C. perfringens on cheese was carried out through the literature, and dose-response models were utilized for hazard characterization of the pathogen. For exposure assessment, the prevalence of C. perfringens, storage temperatures, storage time, and annual amounts of cheese consumption were surveyed. Finally, a simulation model was developed using the collected data, and the simulation was used to estimate the probability of C. perfringens foodborne illness from cheese consumption with @RISK. C. perfringens was determined to be low risk on cheese based on hazard identification, and the exponential model (r = 1.82×10−11) was deemed appropriate for hazard characterization. Annual amounts of natural and processed cheese consumption were 12.40±19.43 g and 19.46±14.39 g, respectively. Since the contamination levels of C. perfringens on natural (0.30 Log CFU/g) and processed cheeses (0.45 Log CFU/g) were below the detection limit, the initial contamination levels of natural and processed cheeses were estimated by beta distribution (α1 = 1, α2 = 91; α1 = 1, α2 = 309)×uniform distribution (a = 0, b = 2; a = 0, b = 2.8) to be −2.35 and −2.73 Log CFU/g, respectively. Moreover, no growth of C. perfringens was observed under simulated conditions of distribution and storage. These data were used for risk characterization by a simulation model, and the mean probabilities of C. perfringens foodborne illness by cheese consumption per person per day for natural and processed cheeses were 9.57×10−14 and 3.58×10−14, respectively. These results indicate that the probability of C. perfringens foodborne illness from cheese consumption is low, and they can be used to establish microbial criteria for C. perfringens on natural and processed cheeses. PMID:26954204
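
    The exposure and risk-characterization chain in this abstract (initial contamination from a beta × uniform distribution, consumption amounts, and an exponential dose-response model) lends itself to a Monte Carlo implementation. Below is a minimal sketch using the natural-cheese parameters quoted above; it is a simplified stand-in for the authors' @RISK model, and the truncated-normal intake shape is an assumption.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    N = 100_000  # Monte Carlo iterations

    # Initial contamination (CFU/g): beta(1, 91) x uniform(0, 2), the natural-
    # cheese parameterization reported above for levels below the detection limit.
    conc = rng.beta(1, 91, N) * rng.uniform(0, 2, N)

    # Daily consumption (g/person/day): mean and SD from the abstract; the
    # normal shape truncated at zero is an assumption.
    intake = np.clip(rng.normal(12.40, 19.43, N), 0, None)

    dose = conc * intake           # ingested CFU per person per day
    r = 1.82e-11                   # exponential dose-response parameter (abstract)
    p_ill = 1 - np.exp(-r * dose)  # probability of illness per exposure

    print(f"mean P(illness) per person per day: {p_ill.mean():.2e}")
    ```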

  7. An operational-oriented approach to the assessment of low probability seismic ground motions for critical infrastructures

    NASA Astrophysics Data System (ADS)

    Garcia-Fernandez, Mariano; Assatourians, Karen; Jimenez, Maria-Jose

    2018-01-01

    Extreme natural hazard events have the potential to cause significant disruption to critical infrastructure (CI) networks. Among them, earthquakes represent a major threat as sudden-onset events with limited, if any, capability of forecast, and high damage potential. In recent years, the increased exposure of interdependent systems has heightened concern, motivating the need for a framework for the management of these increased hazards. The seismic performance level and resilience of existing non-nuclear CIs can be analyzed by identifying the ground motion input values leading to failure of selected key elements. The main interest focuses on ground motions exceeding the original design values, which should correspond to low-probability occurrences. A seismic hazard methodology has been specifically developed to consider low-probability ground motions affecting elongated CI networks. The approach is based on Monte Carlo simulation, which allows for building long-duration synthetic earthquake catalogs to derive low-probability amplitudes. This approach does not affect the mean hazard values and allows obtaining a representation of maximum amplitudes that follow a general extreme-value distribution. This facilitates the analysis of the occurrence of extremes, i.e., very low probabilities of exceedance from unlikely combinations, for the development of, e.g., stress tests, among other applications. Following this methodology, extreme ground-motion scenarios have been developed for selected combinations of modeling inputs, including seismic activity models (source model and magnitude-recurrence relationship), ground motion prediction equations (GMPE), hazard levels, and fractiles of extreme ground motion. The different results provide an overview of the effects of different hazard modeling inputs on the generated extreme motion hazard scenarios. This approach to seismic hazard is at the core of the risk analysis procedure developed and applied to European CI transport networks within the framework of the European-funded INFRARISK project. Such an operational seismic hazard framework can be used to provide timely insight for informed risk-management or regulatory decisions on the required level of detail or on the adoption of measures, the cost of which can be balanced against the benefits of the measures in question.

  8. The exposure of Sydney (Australia) to earthquake-generated tsunamis, storms and sea level rise: a probabilistic multi-hazard approach

    PubMed Central

    Dall'Osso, F.; Dominey-Howes, D.; Moore, C.; Summerhayes, S.; Withycombe, G.

    2014-01-01

    Approximately 85% of Australia's population live along the coastal fringe, an area with high exposure to extreme inundations such as tsunamis. However, to date, no Probabilistic Tsunami Hazard Assessments (PTHA) that include inundation have been published for Australia. This limits the development of appropriate risk reduction measures by decision and policy makers. We describe our PTHA undertaken for the Sydney metropolitan area. Using the NOAA NCTR model MOST (Method for Splitting Tsunamis), we simulate 36 earthquake-generated tsunamis with annual probabilities of 1:100, 1:1,000 and 1:10,000, occurring under present and future predicted sea level conditions. For each tsunami scenario we generate a high-resolution inundation map of the maximum water level and flow velocity, and we calculate the exposure of buildings and critical infrastructure. Results indicate that exposure to earthquake-generated tsunamis is relatively low for present events, but increases significantly with higher sea level conditions. The probabilistic approach allowed us to undertake a comparison with an existing storm surge hazard assessment. Interestingly, the exposure to all the simulated tsunamis is significantly lower than that for the 1:100 storm surge scenarios, under the same initial sea level conditions. The results have significant implications for multi-risk and emergency management in Sydney. PMID:25492514

  9. The exposure of Sydney (Australia) to earthquake-generated tsunamis, storms and sea level rise: a probabilistic multi-hazard approach.

    PubMed

    Dall'Osso, F; Dominey-Howes, D; Moore, C; Summerhayes, S; Withycombe, G

    2014-12-10

    Approximately 85% of Australia's population live along the coastal fringe, an area with high exposure to extreme inundations such as tsunamis. However, to date, no Probabilistic Tsunami Hazard Assessments (PTHA) that include inundation have been published for Australia. This limits the development of appropriate risk reduction measures by decision and policy makers. We describe our PTHA undertaken for the Sydney metropolitan area. Using the NOAA NCTR model MOST (Method for Splitting Tsunamis), we simulate 36 earthquake-generated tsunamis with annual probabilities of 1:100, 1:1,000 and 1:10,000, occurring under present and future predicted sea level conditions. For each tsunami scenario we generate a high-resolution inundation map of the maximum water level and flow velocity, and we calculate the exposure of buildings and critical infrastructure. Results indicate that exposure to earthquake-generated tsunamis is relatively low for present events, but increases significantly with higher sea level conditions. The probabilistic approach allowed us to undertake a comparison with an existing storm surge hazard assessment. Interestingly, the exposure to all the simulated tsunamis is significantly lower than that for the 1:100 storm surge scenarios, under the same initial sea level conditions. The results have significant implications for multi-risk and emergency management in Sydney.

  10. Risk assessment of debris flow in Yushu seismic area in China: a perspective for the reconstruction

    NASA Astrophysics Data System (ADS)

    Lan, H. X.; Li, L. P.; Zhang, Y. S.; Gao, X.; Liu, H. J.

    2013-11-01

    The 14 April 2010 Ms = 7.1 Yushu Earthquake (YE) caused severe damage in the Jiegu township, the residential centre of Yushu Tibetan Autonomous Prefecture, Qinghai Province, China. In view of the fragile geological conditions after YE, risk assessment of secondary geohazards becomes an important concern for the reconstruction. A quantitative methodology was developed to assess the risk of debris flow by taking into account important intensity information. Debris flow scenarios were simulated with respect to rainfall events with 10, 50 and 100 yr return periods. The possible economic loss and fatalities caused by damage to buildings were assessed both in the settlement area and in the low hazard settlement area for the simulated debris flow events. Three modelled building types were adopted, i.e. hollow brick wood (HBW), hollow brick concrete (HBC) and reinforced concrete (RC) buildings. The results suggest that the HBC structure achieves a good cost-benefit balance compared with the HBW and RC structures and thus could be an optimal choice for most of the new residential buildings in the Jiegu township. The low hazard boundary presents significant risk reduction efficiency in the 100 yr return-period debris flow event. In addition, the societal risk for the settlement area is unacceptable when the 100 yr return-period event occurs but reduces to the ALARP (as low as reasonably practicable) level when the low hazard area is considered. Therefore, taking the low hazard area into account in the reconstruction is strongly recommended. Yet, the societal risk might indeed approach an unacceptable level if one considers that YE has inevitably increased the occurrence frequency of debris flow. The quantitative results should be treated as a perspective for the reconstruction rather than precise numbers of future losses, owing to the complexity of the problem and the deficiency of data.

  11. Cockpit display of hazardous weather information

    NASA Technical Reports Server (NTRS)

    Hansman, R. John, Jr.; Wanke, Craig

    1991-01-01

    Information transfer and display issues associated with the dissemination of hazardous weather warnings are studied in the context of wind shear alerts. Operational and developmental wind shear detection systems are briefly reviewed. The July 11, 1988 microburst events observed as part of the Denver Terminal Doppler Weather Radar (TDWR) operational evaluation are analyzed in terms of information transfer and the effectiveness of the microburst alerts. Information transfer, message content and display issues associated with microburst alerts generated from ground-based sources (Doppler Radar, Low Level Wind Shear Alert System, and Pilot Reports) are evaluated by means of pilot opinion surveys and part-task simulator studies.

  12. Simulating visibility under reduced acuity and contrast sensitivity.

    PubMed

    Thompson, William B; Legge, Gordon E; Kersten, Daniel J; Shakespeare, Robert A; Lei, Quan

    2017-04-01

    Architects and lighting designers have difficulty designing spaces that are accessible to those with low vision, since the complex nature of most architectural spaces requires a site-specific analysis of the visibility of mobility hazards and key landmarks needed for navigation. We describe a method that can be utilized in the architectural design process for simulating the effects of reduced acuity and contrast on visibility. The key contribution is the development of a way to parameterize the simulation using standard clinical measures of acuity and contrast sensitivity. While these measures are known to be imperfect predictors of visual function, they provide a way of characterizing general levels of visual performance that is familiar to both those working in low vision and our target end-users in the architectural and lighting-design communities. We validate the simulation using a letter-recognition task.

  13. Simulating Visibility Under Reduced Acuity and Contrast Sensitivity

    PubMed Central

    Thompson, William B.; Legge, Gordon E.; Kersten, Daniel J.; Shakespeare, Robert A.; Lei, Quan

    2017-01-01

    Architects and lighting designers have difficulty designing spaces that are accessible to those with low vision, since the complex nature of most architectural spaces requires a site-specific analysis of the visibility of mobility hazards and key landmarks needed for navigation. We describe a method that can be utilized in the architectural design process for simulating the effects of reduced acuity and contrast on visibility. The key contribution is the development of a way to parameterize the simulation using standard clinical measures of acuity and contrast sensitivity. While these measures are known to be imperfect predictors of visual function, they provide a way of characterizing general levels of visual performance that is familiar to both those working in low vision and our target end-users in the architectural and lighting design communities. We validate the simulation using a letter recognition task. PMID:28375328

  14. Flooding Hazard Maps of Different Land Uses in Subsidence Area

    NASA Astrophysics Data System (ADS)

    Lin, Yongjun; Chang, Hsiangkuan; Tan, Yihchi

    2017-04-01

    This study develops flooding hazard maps for different land uses in the subsidence area of southern Taiwan. Those areas are low-lying due to subsidence caused by over-pumping of groundwater for aquaculture. As a result, flooding due to storm surges and extreme rainfall is already frequent in this area and is expected to become more frequent in the future. The main land uses there are residence, fruit trees, and aquaculture, and hazard maps are produced for each. The factors affecting the hazard of each land use are as follows. For residence, flooding depth, duration of flooding, and the rising rate of the water surface level determine the degree of hazard: high flooding depth, long duration of flooding, and a fast rising rate of the water surface make it harder for residents to evacuate. For fruit trees, flooding depth and duration of flooding matter most because of root hypoxia. For aquaculture, flooding depth matters most because high flooding depths can flush fish out of the fish ponds. An overland flow model is used to simulate the hydraulic parameters behind these factors, i.e., flooding depth, rising rate of the water surface level, and duration of flooding. From these factors, hazard maps for the different land uses can be produced and highly hazardous areas delineated within the subsidence area.

  15. A comparative study of two hazard handling training methods for novice drivers.

    PubMed

    Wang, Y B; Zhang, W; Salvendy, G

    2010-10-01

    The effectiveness of two hazard perception training methods, simulation-based error training (SET) and video-based guided error training (VGET), for novice drivers' hazard handling performance was tested, compared, and analyzed. Thirty-two novice drivers participated in the hazard perception training. Half of the participants were trained using SET by making errors and/or experiencing accidents while driving with a desktop simulator. The other half were trained using VGET by watching prerecorded video clips of errors and accidents that were made by other people. The two groups had exposure to equal numbers of errors for each training scenario. All the participants were tested and evaluated for hazard handling on a full-cockpit driving simulator one week after training. Hazard handling performance and hazard response were measured in this transfer test. Both hazard handling performance scores and hazard response distances were significantly better for the SET group than the VGET group. Furthermore, the SET group had more metacognitive activities and intrinsic motivation. SET also seemed more effective in changing participants' confidence, but the result did not reach statistical significance. SET exhibited higher training effectiveness for hazard response and handling than VGET in the simulated transfer test. The superiority of SET may stem from the higher levels of metacognition and intrinsic motivation during training that were observed in the experiment. Future research should assess whether the advantages of error training hold under real road conditions.

  16. Analysis of Compound Water Hazard in Coastal Urbanized Areas under the Future Climate

    NASA Astrophysics Data System (ADS)

    Shibuo, Y.; Taniguchi, K.; Sanuki, H.; Yoshimura, K.; Lee, S.; Tajima, Y.; Koike, T.; Furumai, H.; Sato, S.

    2017-12-01

    Several studies indicate an increased frequency and magnitude of heavy rainfall as well as sea level rise under the future climate, implying that coastal low-lying urbanized areas may face increased flood risk. In such areas, where river discharge, tidal fluctuation, and city drainage networks together influence urban inundation, it is necessary to consider their potential interference to understand the effect of compound water hazards. For instance, pump stations cannot pump out storm water when the river water level is high, while the river water level in turn rises when the river receives pumped water from the city. Further downstream, the tidal fluctuation regulates the water levels in the river, so it also affects the functionality of pump stations and possible inundation from rivers. In this study, we estimate the compound water hazard in the coastal low-lying urbanized areas of the Tsurumi river basin under the future climate. We developed a seamlessly integrated river, sewerage, and coastal hydraulic model that can simulate river water levels, water flow in the sewerage network, and inundation from the rivers and/or the coast, addressing the potential interference issue. As forcing, the pseudo global warming method, which applies GCM-derived anomaly changes to re-analysis data, is employed to produce ensemble typhoons to drive the seamlessly integrated model. The results show that heavy rainfall from the observed typhoon generally becomes stronger under the pseudo global warming condition. They also suggest that the coastal low-lying areas become extensively inundated if the onset of river flooding and storm surge coincide.

  17. Airflow resistance and CO2 rebreathing properties of anti-asphyxia pillows designed for epilepsy.

    PubMed

    Catcheside, Peter G; Mohtar, Aaron A; Reynolds, Karen J

    2014-06-01

    Seizure-related unconscious face-down positioning could contribute to sudden unexpected death in epilepsy via asphyxia. Low airflow resistance lattice foam pillows have been advocated for this group. However, data to support this approach remain lacking, and low airflow resistance per se may not negate asphyxia risk from expired gas rebreathing. This study was designed to compare the airflow resistance and CO2 rebreathing properties of lattice vs conventional pillows. Airflow resistance and inspired CO2 levels during replicate 10 min periods of simulated adult ventilation and CO2 rebreathing were compared between cotton, latex and two lattice pillows designed for use in epilepsy (one commercially available, one prototype). Kaplan-Meier and Cox regression analyses were used to examine the hazard of exceeding 10% inspired CO2 within 10 min of rebreathing. Inspiratory resistance was significantly lower in the commercially available and prototype lattice pillows compared to the cotton and latex pillows (mean±SD: 3.2±0.8, 2.6±0.4, 26.1±3.5, and 4.6±0.4 cm H2O·l⁻¹·s, respectively, at 0.2 l·s⁻¹). During simulated rebreathing, inspired CO2 exceeded 10% within 2 min with cotton and latex pillows, compared to an upper asymptote around 8-9% at 10 min with lattice pillows. The hazard of exceeding 10% inspired CO2 was therefore markedly reduced with lattice compared to cotton and latex pillows (hazard ratio vs cotton pillow: commercial 0.04 [0.01-0.18], prototype 0.08 [0.02-0.26], latex 0.79 [0.33-1.87]). Conventional pillows can rapidly accumulate potentially life-threatening CO2 levels during simulated rebreathing. Lattice pillows appear to reduce asphyxia risk, but accumulated CO2 may still reach levels threatening to health and survival.
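
    The "hazard of exceeding 10% inspired CO2" is a time-to-event quantity, so the comparison above maps naturally onto Kaplan-Meier/Cox machinery. A minimal sketch using the lifelines library follows; the data frame is hypothetical and only illustrates the structure of the analysis, not the study's measurements.

    ```python
    import pandas as pd
    from lifelines import CoxPHFitter

    # Hypothetical data, one row per 10-min simulated rebreathing run: `time` is
    # minutes until inspired CO2 first exceeded 10% (censored at 10 if never).
    df = pd.DataFrame({
        "time":    [1.5, 1.8, 2.0, 10.0, 3.0, 9.0, 10.0, 10.0],
        "event":   [1,   1,   1,   0,    1,   1,   0,    0],   # 1 = exceeded 10% CO2
        "lattice": [0,   0,   0,   0,    1,   1,   1,    1],   # 1 = lattice pillow
    })

    cph = CoxPHFitter()
    cph.fit(df, duration_col="time", event_col="event")
    print(cph.hazard_ratios_)  # HR < 1 for `lattice` mirrors the reduced hazard above
    ```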

  18. Analyzing recurrent events when the history of previous episodes is unknown or not taken into account: proceed with caution.

    PubMed

    Navarro, Albert; Casanovas, Georgina; Alvarado, Sergio; Moriña, David

    Researchers in public health are often interested in examining the effect of several exposures on the incidence of a recurrent event. The aim of the present study is to assess how well common-baseline hazard models estimate the effect of multiple exposures on the hazard of presenting an episode of a recurrent event, in the presence of event dependence and when the history of prior episodes is unknown or not taken into account. Through a comprehensive simulation study, using specific-baseline hazard models as the reference, we evaluate the performance of common-baseline hazard models by means of several criteria: bias, mean squared error, coverage, mean length of confidence intervals, and compliance with the proportional hazards assumption. Results indicate that the bias worsens as event dependence increases, leading to considerable overestimation of the exposure effect; coverage levels and compliance with the proportional hazards assumption are low or extremely low, worsening with increasing event dependence, magnitude of the effects to be estimated, and sample size. Common-baseline hazard models cannot be recommended for analysing recurrent events in the presence of event dependence. It is important to have access to the history of prior episodes per subject, as this permits better estimation of the effects of the exposures.
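
    The core finding, that ignoring event dependence biases the exposure effect upward, can be reproduced in a small simulation. Below is a sketch using lifelines: events are generated with a hazard that rises with the number of prior episodes, then a common-baseline model that omits that history is fitted, and the estimated exposure coefficient typically exceeds the true value of 0.5. All parameter values are illustrative, not the paper's.

    ```python
    import numpy as np
    import pandas as pd
    from lifelines import CoxTimeVaryingFitter

    rng = np.random.default_rng(1)
    TRUE_BETA, DEP, FOLLOW_UP = 0.5, 0.4, 10.0  # illustrative values
    rows = []
    for i in range(500):
        x = rng.binomial(1, 0.5)  # binary exposure
        t, k = 0.0, 0             # current time, number of prior episodes
        while t < FOLLOW_UP:
            rate = 0.1 * np.exp(TRUE_BETA * x + DEP * k)  # event dependence via k
            gap = rng.exponential(1.0 / rate)
            rows.append(dict(id=i, start=t, stop=min(t + gap, FOLLOW_UP),
                             event=int(t + gap < FOLLOW_UP), x=x))
            t, k = t + gap, k + 1

    df = pd.DataFrame(rows)

    # Common-baseline model ignoring the number of prior episodes: the estimated
    # coefficient for x is typically biased above the true 0.5 when DEP > 0.
    ctv = CoxTimeVaryingFitter()
    ctv.fit(df, id_col="id", event_col="event", start_col="start", stop_col="stop")
    print(ctv.summary.loc["x", ["coef", "exp(coef)"]])
    ```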

  19. Development and validation of risk profiles of West African rural communities facing multiple natural hazards

    PubMed Central

    Renaud, Fabrice G.; Kloos, Julia; Walz, Yvonne; Rhyner, Jakob

    2017-01-01

    West Africa has been described as a hotspot of climate change. The reliance on rain-fed agriculture by over 65% of the population means that vulnerability to climatic hazards such as droughts, rainstorms and floods will continue. Yet, the vulnerability and risk levels faced by different rural social-ecological systems (SES) affected by multiple hazards are poorly understood. To fill this gap, this study quantifies risk and vulnerability of rural communities to drought and floods. Risk is assessed using an indicator-based approach. A stepwise methodology is followed that combines participatory approaches with statistical, remote sensing and Geographic Information System techniques to develop community level vulnerability indices in three watersheds (Dano, Burkina Faso; Dassari, Benin; Vea, Ghana). The results show varying levels of risk profiles across the three watersheds. Statistically significant high levels of mean risk in the Dano area of Burkina Faso are found whilst communities in the Dassari area of Benin show low mean risk. The high risk in the Dano area results from, among other factors, underlying high exposure to droughts and rainstorms, longer dry season duration, low caloric intake per capita, and poor local institutions. The study introduces the concept of community impact score (CIS) to validate the indicator-based risk and vulnerability modelling. The CIS measures the cumulative impact of the occurrence of multiple hazards over five years. 65.3% of the variance in observed impact of hazards/CIS was explained by the risk models and communities with high simulated disaster risk generally follow areas with high observed disaster impacts. Results from this study will help disaster managers to better understand disaster risk and develop appropriate, inclusive and well integrated mitigation and adaptation plans at the local level. It fulfills the increasing need to balance global/regional assessments with community level assessments where major decisions against risk are actually taken and implemented. PMID:28248969

  20. An Earth-Based Equivalent Low Stretch Apparatus to Assess Material Flammability for Microgravity & Extraterrestrial Fire-Safety Applications

    NASA Technical Reports Server (NTRS)

    Olson, S. L.; Beeson, H.; Haas, J.

    2001-01-01

    One of the performance goals for NASA's enterprise of Human Exploration and Development of Space (HEDS) is to develop methods, databases, and validating tests for material flammability characterization, hazard reduction, and fire detection/suppression strategies for spacecraft and extraterrestrial habitats. This work addresses these needs by applying the fundamental knowledge gained from low stretch experiments to the development of a normal gravity low stretch material flammability test method. The concept of the apparatus being developed uses the low stretch geometry to simulate the conditions of the extraterrestrial environment through proper scaling of the sample dimensions to reduce the buoyant stretch in normal gravity. The apparatus uses controlled forced-air flow to augment the low stretch to levels which simulate Lunar or Martian gravity levels. In addition, the effect of imposed radiant heat flux on material flammability can be studied with the cone heater. After breadboard testing, the apparatus will be integrated into NASA's White Sands Test Facility's Atmosphere-Controlled Cone Calorimeter for evaluation as a new materials screening test method.

  1. 40 CFR 266.220 - What does a storage and treatment conditional exemption do?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... (CONTINUED) SOLID WASTES (CONTINUED) STANDARDS FOR THE MANAGEMENT OF SPECIFIC HAZARDOUS WASTES AND SPECIFIC TYPES OF HAZARDOUS WASTE MANAGEMENT FACILITIES Conditional Exemption for Low-Level Mixed Waste Storage... exemption exempts your low-level mixed waste from the regulatory definition of hazardous waste in 40 CFR 261...

  2. Flight in low-level wind shear

    NASA Technical Reports Server (NTRS)

    Frost, W.

    1983-01-01

    Results of studies of wind shear hazard to aircraft operation are summarized. Existing wind shear profiles currently used in computer and flight simulator studies are reviewed. The governing equations of motion for an aircraft are derived incorporating the variable wind effects. Quantitative discussions of the effects of wind shear on aircraft performance are presented. These are followed by a review of mathematical solutions to both the linear and nonlinear forms of the governing equations. Solutions with and without control laws are presented. The application of detailed analysis to develop warning and detection systems based on Doppler radar measuring wind speed along the flight path is given. A number of flight path deterioration parameters are defined and evaluated. Comparison of computer-predicted flight paths with those measured in a manned flight simulator is made. Some proposed airborne and ground-based wind shear hazard warning and detection systems are reviewed. The advantages and disadvantages of both types of systems are discussed.

  3. Occupational-level interactions between physical hazards and cognitive ability and skill requirements in predicting injury incidence rates.

    PubMed

    Ford, Michael T; Wiggins, Bryan K

    2012-07-01

    Interactions between occupational-level physical hazards and cognitive ability and skill requirements were examined as predictors of injury incidence rates as reported by the U. S. Bureau of Labor Statistics. Based on ratings provided in the Occupational Information Network (O*NET) database, results across 563 occupations indicate that physical hazards at the occupational level were strongly related to injury incidence rates. Also, as expected, the physical hazard-injury rate relationship was stronger among occupations with high cognitive ability and skill requirements. In addition, there was an unexpected main effect such that occupations with high cognitive ability and skill requirements had lower injury rates even after controlling for physical hazards. The main effect of cognitive ability and skill requirements, combined with the interaction with physical hazards, resulted in unexpectedly high injury rates for low-ability and low-skill occupations with low physical hazard levels. Substantive and methodological explanations for these interactions and their theoretical and practical implications are offered. Results suggest that organizations and occupational health and safety researchers and practitioners should consider the occupational level of analysis and interactions between physical hazards and cognitive requirements in future research and practice when attempting to understand and prevent injuries.
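
    The occupational-level moderation reported here (injury rates regressed on physical hazards, cognitive requirements, and their product) corresponds to a standard interaction model. A minimal sketch with simulated data is below, using statsmodels; the coefficients and data are illustrative placeholders, not O*NET or BLS values.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 563  # number of occupations, as in the study

    df = pd.DataFrame({
        "hazard":    rng.normal(0, 1, n),   # standardized physical-hazard rating
        "cognitive": rng.normal(0, 1, n),   # standardized ability/skill requirement
    })
    # Simulated injury rates with a positive hazard x cognitive interaction and a
    # negative cognitive main effect, the pattern the abstract describes.
    df["injury_rate"] = (2.0 + 1.0 * df.hazard - 0.4 * df.cognitive
                         + 0.3 * df.hazard * df.cognitive + rng.normal(0, 0.5, n))

    model = smf.ols("injury_rate ~ hazard * cognitive", data=df).fit()
    print(model.params)  # the `hazard:cognitive` term captures the moderation
    ```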

  4. Assessment of Debris Flow Potential Hazardous Zones Using Numerical Models in the Mountain Foothills of Santiago, Chile

    NASA Astrophysics Data System (ADS)

    Celis, C.; Sepulveda, S. A.; Castruccio, A.; Lara, M.

    2017-12-01

    Debris and mudflows are some of the main geological hazards in the mountain foothills of Central Chile. The risk of flows triggered in the basins of ravines that drain the Andean frontal range into the capital city, Santiago, increases with time due to accelerated urban expansion. Susceptibility assessments were made by several authors to detect the main active ravines in the area. The Macul and San Ramon ravines have a high to medium debris flow susceptibility, whereas the Lo Cañas, Apoquindo and Las Vizcachas ravines have a medium to low debris flow susceptibility. This study emphasizes delimiting the potential hazardous zones using the numerical simulation program RAMMS-Debris Flows with the Voellmy model approach, and the debris-flow model LAHARZ. For the RAMMS approach, this is carried out by back-calculating the frictional parameters in the depositional zone from a known event, the debris and mudflows in the Macul and San Ramon ravines on May 3rd, 1993. For the LAHARZ model, the coefficients are calibrated on the same scenario to match the conditions of the mountain foothills of Santiago. The information obtained is applied to every main ravine in the study area, given the similarity in slopes and transported material. Simulations were made for the worst-case scenario, caused by the combination of intense rainfall storms, a high 0°C isotherm level and material availability in the basins where the flows are triggered. The results show that the runout distances are well simulated; therefore, a debris-flow hazard map could be developed with these models. Correlation issues concerning the run-up, deposit thickness and transversal areas are reported. Hence, the models do not entirely represent the complexity of the phenomenon, but they are a reliable approximation for preliminary hazard maps.

  5. Turbulent transport model of wind shear in thunderstorm gust fronts and warm fronts

    NASA Technical Reports Server (NTRS)

    Lewellen, W. S.; Teske, M. E.; Segur, H. C. O.

    1978-01-01

    A model of turbulent flow in the atmospheric boundary layer was used to simulate the low-level wind and turbulence profiles associated with both local thunderstorm gust fronts and synoptic-scale warm fronts. Dimensional analyses of both types of fronts provided the physical scaling necessary to permit normalized simulations to represent fronts for any temperature jump. The sensitivity of the thunderstorm gust front to five different dimensionless parameters, as well as to a change from axisymmetric to planar geometry, was examined. The sensitivity of the warm front to variations in the Rossby number was examined. Results of the simulations are discussed in terms of the conditions which lead to wind shears that are likely to be most hazardous for aircraft operations.

  6. Environmental Risk Assessment: Spatial Analysis of Chemical Hazards and Risks in South Korea

    NASA Astrophysics Data System (ADS)

    Yu, H.; Heo, S.; Kim, M.; Lee, W. K.; Jong-Ryeul, S.

    2017-12-01

    This study identified chemical hazard and risk levels in Korea by analyzing the spatial distribution of chemical factories and accidents. The number of chemical factories and accidents in 5-km² grid cells was used as the attribute value for spatial analysis. First, semi-variograms were computed to examine spatial distribution patterns and to identify spatial autocorrelation of chemical factories and accidents; they showed that both distributions were spatially autocorrelated. Second, the semi-variogram results were used in Ordinary Kriging to estimate chemical hazard and risk levels. The level values were extracted from the Ordinary Kriging results, and their spatial similarity was examined by juxtaposing the two values by location. Six peaks were identified in both the hazard and risk estimates, and the peaks correlated with major cities in Korea. Third, the estimated hazard and risk levels were classified with geometrical intervals into four quadrants: Low Hazard and Low Risk (LHLR), Low Hazard and High Risk (LHHR), High Hazard and Low Risk (HHLR), and High Hazard and High Risk (HHHR). The four quadrants identified different chemical safety management issues in Korea: the LHLR group was relatively safe; many chemical reseller factories were found in the HHLR group; chemical transportation accidents dominated the LHHR group; and an abundance of both factories and accidents characterized the HHHR group. Each quadrant represents different safety management obstacles in Korea, and studying these spatial differences can support the establishment of an efficient risk management plan.
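
    Ordinary Kriging of grid-cell counts, as used here for the hazard and risk surfaces, can be sketched with the pykrige package. The coordinates, counts, and variogram model below are invented placeholders; the paper's 5-km² Korean grid data are not reproduced.

    ```python
    import numpy as np
    from pykrige.ok import OrdinaryKriging

    rng = np.random.default_rng(0)
    # Hypothetical stand-in for the paper's grid attribute: factory counts at
    # sampled grid centres (x, y in km), clustered around one "city" at (50, 50).
    x = rng.uniform(0, 100, 80)
    y = rng.uniform(0, 100, 80)
    counts = rng.poisson(3 + 10 * np.exp(-((x - 50)**2 + (y - 50)**2) / 500))

    # Fit a variogram and krige onto a regular grid, as in the hazard-level map.
    ok = OrdinaryKriging(x, y, counts.astype(float), variogram_model="spherical")
    gridx = np.arange(0.0, 100.0, 5.0)
    gridy = np.arange(0.0, 100.0, 5.0)
    z_hat, var = ok.execute("grid", gridx, gridy)  # kriged surface + kriging variance
    print(z_hat.shape)  # (20, 20)
    ```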

  7. Risk-Based Prioritization Method for the Classification of Groundwater Pollution from Hazardous Waste Landfills.

    PubMed

    Yang, Yu; Jiang, Yong-Hai; Lian, Xin-Ying; Xi, Bei-Dou; Ma, Zhi-Fei; Xu, Xiang-Jian; An, Da

    2016-12-01

    Hazardous waste landfill sites are a significant source of groundwater pollution. To ensure that these landfills with a significantly high risk of groundwater contamination are properly managed, a risk-based ranking method related to groundwater contamination is needed. In this research, a risk-based prioritization method for the classification of groundwater pollution from hazardous waste landfills was established. The method encompasses five phases, including risk pre-screening, indicator selection, characterization, classification and, lastly, validation. In the risk ranking index system employed here, 14 indicators involving hazardous waste landfills and migration in the vadose zone as well as aquifer were selected. The boundary of each indicator was determined by K-means cluster analysis and the weight of each indicator was calculated by principal component analysis. These methods were applied to 37 hazardous waste landfills in China. The result showed that the risk for groundwater contamination from hazardous waste landfills could be ranked into three classes from low to high risk. In all, 62.2 % of the hazardous waste landfill sites were classified in the low and medium risk classes. The process simulation method and standardized anomalies were used to validate the result of risk ranking; the results were consistent with the simulated results related to the characteristics of contamination. The risk ranking method was feasible, valid and can provide reference data related to risk management for groundwater contamination at hazardous waste landfill sites.
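
    The ranking pipeline described (indicator class boundaries from K-means cluster analysis and indicator weights from principal component analysis) can be outlined in a few lines of scikit-learn. The 37 × 14 indicator matrix below is random placeholder data, and taking first-component loading magnitudes as weights is one simple reading of the method, not the authors' exact procedure.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    X = rng.random((37, 14))   # 37 landfills x 14 indicators (hypothetical values)

    Xs = StandardScaler().fit_transform(X)

    # PCA-derived indicator weights, approximated here by the absolute loadings
    # of the first principal component, normalized to sum to one.
    pca = PCA().fit(Xs)
    w = np.abs(pca.components_[0])
    w /= w.sum()

    score = Xs @ w  # composite risk score per landfill

    # K-means on the scores sets the low/medium/high class boundaries.
    km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(score.reshape(-1, 1))
    print(np.bincount(km.labels_))  # landfills per risk class
    ```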

  8. Wind shear over the Nice Côte d'Azur airport: case studies

    NASA Astrophysics Data System (ADS)

    Boilley, A.; Mahfouf, J.-F.

    2013-09-01

    The Nice Côte d'Azur international airport is subject to horizontal low-level wind shears. Detecting and predicting these hazards is a major concern for aircraft security. A measurement campaign took place over the Nice airport in 2009 including 4 anemometers, 1 wind lidar and 1 wind profiler. Two wind shear events were observed during this measurement campaign. Numerical simulations were carried out with Meso-NH in a configuration compatible with near-real time applications to determine the ability of the numerical model to predict these events and to study the meteorological situations generating a horizontal wind shear. A comparison between numerical simulation and the observation dataset is conducted in this paper.

  9. Wind shear over the Nice Côte d'Azur airport: case studies

    NASA Astrophysics Data System (ADS)

    Boilley, A.; Mahfouf, J.-F.

    2013-04-01

    The Nice Côte d'Azur international airport is subject to horizontal low-level wind shears. Detecting and predicting these hazards is a major concern for aircraft security. A measurement campaign took place over the Nice airport in 2009 including 4 anemometers, 1 wind lidar and 1 wind profiler. Two wind shear events were observed during this measurement campaign. Numerical simulations were carried out with Meso-NH in a configuration compatible with near-real time applications to determine the ability of the numerical model to predict these events and to study the meteorological situations generating a horizontal wind shear. A comparison between numerical simulation and the observation dataset is conducted in this paper.

  10. A Sensor-Independent Gust Hazard Metric

    NASA Technical Reports Server (NTRS)

    Stewart, Eric C.

    2001-01-01

    A procedure for calculating an intuitive hazard metric for gust effects on airplanes is described. The hazard metric is for use by pilots and is intended to replace subjective pilot reports (PIREPs) of the turbulence level. The hazard metric is composed of three numbers: the first describes the average airplane response to the turbulence, the second describes the positive peak airplane response to the gusts, and the third describes the negative peak airplane response to the gusts. The hazard metric is derived from any time history of vertical gust measurements and is thus independent of the sensor making the gust measurements. The metric is demonstrated for one simulated airplane encountering different types of gusts including those derived from flight data recorder measurements of actual accidents. The simulated airplane responses to the gusts compare favorably with the hazard metric.
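
    Since the metric is defined on any time history of vertical gust measurements, it can be computed sensor-independently, as in the sketch below. The first-order lag standing in for the simulated airplane response and the RMS choice for the "average response" are assumptions for illustration, not the paper's model.

    ```python
    import numpy as np

    def gust_hazard_metric(w_gust, dt=0.05, tau=0.5):
        """Three-number hazard metric from a vertical-gust time history (m/s):
        (average response, positive peak response, negative peak response).

        The airplane response here is a placeholder first-order lag with time
        constant `tau`; a real application would substitute an aircraft model.
        """
        resp = np.zeros_like(w_gust)
        for i in range(1, len(w_gust)):
            resp[i] = resp[i - 1] + dt / tau * (w_gust[i - 1] - resp[i - 1])
        return np.sqrt(np.mean(resp**2)), resp.max(), resp.min()

    # Illustrative gust record: a slow oscillation plus random turbulence.
    t = np.arange(0, 60, 0.05)
    gusts = 2.0 * np.sin(0.5 * t) + np.random.default_rng(3).normal(0, 1, t.size)
    print(gust_hazard_metric(gusts))
    ```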

  11. Monte Carlo simulation for slip rate sensitivity analysis in Cimandiri fault area

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pratama, Cecep, E-mail: great.pratama@gmail.com; Meilano, Irwan; Nugraha, Andri Dian

    Slip rate is used to estimate the earthquake recurrence relationship, which has the greatest influence on the hazard level. We examine the contribution of slip rate to Peak Ground Acceleration (PGA) in probabilistic seismic hazard maps (10% probability of exceedance in 50 years, or a 500-year return period). The hazard curve of PGA has been investigated for Sukabumi using a PSHA (Probabilistic Seismic Hazard Analysis). We observe that the largest influence on the hazard estimate is the crustal fault. A Monte Carlo approach has been developed to assess the sensitivity, and the properties of the Monte Carlo simulations have been assessed. The uncertainty and coefficient of variation of the slip rate for the Cimandiri Fault area have been calculated. We observe that the seismic hazard estimates are sensitive to the fault slip rate, with a seismic hazard uncertainty of about 0.25 g. For a specific site, we find that the seismic hazard estimate for Sukabumi is between 0.4904 and 0.8465 g, with an uncertainty between 0.0847 and 0.2389 g and a COV between 17.7% and 29.8%.
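
    A toy version of the Monte Carlo sensitivity analysis (sample the slip rate, propagate it through a recurrence relationship to a hazard curve, and report the spread of the resulting PGA) is sketched below. Every number, and the power-law hazard curve itself, is a placeholder; only the workflow mirrors the abstract.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    N = 10_000

    # Hypothetical slip-rate distribution for the Cimandiri Fault (mm/yr);
    # the mean, spread, and everything downstream are illustrative placeholders.
    slip = rng.normal(4.0, 1.0, N).clip(0.5)

    # Toy chain: slip rate -> annual earthquake rate (via a recurrence
    # relationship) -> PGA at 10% probability of exceedance in 50 years,
    # using a placeholder power-law hazard curve.
    rate = 0.02 * slip
    pga = 0.3 * (rate / 0.08) ** 0.4

    mean, sd = pga.mean(), pga.std()
    print(f"PGA: mean={mean:.3f} g, sd={sd:.3f} g, COV={sd/mean:.1%}")
    ```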

  12. The effects of presence and influence in nature images in a simulated hospital patient room.

    PubMed

    Vincent, Ellen; Battisto, Dina; Grimes, Larry

    2010-01-01

    Nature images are frequently used for therapeutic purposes in hospital settings. Nature images may distract people from pain and promote psychological and physiological well-being, yet limited research is available to guide the selection process of nature images. The hypothesis is that higher degrees of presence and/or influence in the still photograph make it more effective at holding the viewer's attention, which may distract the viewer from pain and can therefore be considered therapeutic. Research questions include: (1) Is there a significant difference in the level of perceived presence among the selected images? (2) Is there a significant difference in the level of perceived influence among the selected images? (3) Is there a correlation between levels of presence and levels of influence? 109 college students were randomly assigned to one of four different image categories defined by Appleton's prospect refuge theory of landscape preference. Categories included prospect, refuge, hazard, and mixed prospect and refuge. A control group was also included. Each investigation was divided into five periods: prereporting, rest, a pain stressor (hand in ice water for up to 120 seconds), recovery, and postreporting. Physiological readings (vital signs) were measured repeatedly using a Dinamap automatic vital sign tracking machine. Psychological responses (mood) to the image were collected using a reliable instrument, the Profile of Mood States. No statistically significant difference in levels of presence was found among the four image categories. However, levels of influence differed: the hazard nature image category had significantly higher influence ratings and lower diastolic blood pressure readings during the pain treatment. A correlation (r = .62) between presence and influence was identified; as one rose, so did the other. Mood state was significantly lower for the hazard nature image after the pain stressor experience. Though the hazard image caused distraction from pain, it is nontherapeutic because of the low mood ratings it received. These preliminary findings contribute methodology to the research field and stimulate interest for additional research into the visual effects of nature images on pain.

  13. Risk assessment of low-level chemical exposures from consumer products under the U.S. Consumer Product Safety Commission chronic hazard guidelines.

    PubMed Central

    Babich, M A

    1998-01-01

    The U.S. Consumer Product Safety Commission (CPSC) is an independent regulatory agency that was created in 1973. The CPSC has jurisdiction over more than 15,000 types of consumer products used in and around the home or by children, except items such as food, drugs, cosmetics, medical devices, pesticides, certain radioactive materials, products that emit radiation (e.g., microwave ovens), and automobiles. The CPSC has investigated many low-level exposures from consumer products, including formaldehyde emissions from urea-formaldehyde foam insulation and pressed wood products, CO and NO2 emissions from combustion appliances, and dioxin in paper products. Many chemical hazards are addressed under the Federal Hazardous Substances Act (FHSA), which applies to acute and chronic health effects resulting from high- or low-level exposures. In 1992 the Commission issued guidelines for assessing chronic hazards under the FHSA, including carcinogenicity, neurotoxicity, reproductive/developmental toxicity, exposure, bioavailability, risk assessment, and acceptable risk. The chronic hazard guidelines describe a series of default assumptions, which are used in the absence of evidence to the contrary. However, the guidelines are intended to be sufficiently flexible to incorporate the latest scientific information. The use of alternative procedures is permissible, on a case-by-case basis, provided that the procedures used are scientifically defensible and supported by appropriate data. The application of the chronic hazard guidelines in assessing the risks from low-level exposures is discussed. PMID:9539035

  14. Probabilistic Approaches for Multi-Hazard Risk Assessment of Structures and Systems

    NASA Astrophysics Data System (ADS)

    Kwag, Shinyoung

    Performance assessment of structures, systems, and components for multi-hazard scenarios has received significant attention in recent years. However, the concept of multi-hazard analysis is quite broad in nature, and the focus of the existing literature varies across a wide range of problems. In some cases, such studies focus on hazards that either occur simultaneously or are closely correlated with each other, for example, seismically induced flooding or fires. In other cases, multi-hazard studies relate to hazards that are not dependent or correlated but have a strong likelihood of occurrence at different times during the lifetime of a structure. Current approaches for risk assessment need enhancement to account for multi-hazard risks: such a framework must be able to account for uncertainty propagation in a systems-level analysis, consider correlation among events or failure modes, and allow integration of newly available information from continually evolving simulation models, experimental observations, and field measurements. This dissertation presents a detailed study that proposes enhancements by incorporating Bayesian networks and Bayesian updating within a performance-based probabilistic framework. The performance-based framework allows propagation of risk, as well as of uncertainties in the risk estimates, within a systems analysis. Unlike conventional risk assessment techniques such as fault-tree analysis, a Bayesian network can account for statistical dependencies and correlations among events/hazards. The proposed approach is extended to develop a risk-informed framework for quantitative validation and verification of high-fidelity system-level simulation tools. Validation of such simulations can be quite formidable within the context of a multi-hazard risk assessment in nuclear power plants. The efficiency of this approach lies in the identification of critical events, components, and systems that contribute to the overall risk. Validation of any event or component on the critical path is relatively more important in a risk-informed environment. The significance of multi-hazard risk is also illustrated for the uncorrelated hazards of earthquakes and high winds, which may result in competing design objectives. It is also illustrated that the number of computationally intensive nonlinear simulations needed in performance-based risk assessment for external hazards can be significantly reduced by using the power of Bayesian updating in conjunction with the concept of equivalent limit-state.
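
    The advantage claimed over fault trees, explicit statistical dependence among hazards and failure events, is easy to illustrate with a discrete Bayesian network, for example in pgmpy. The toy network below encodes two independent hazards (earthquake, high wind) feeding one component failure node; all structure and probabilities are illustrative, not the dissertation's models.

    ```python
    from pgmpy.models import BayesianNetwork
    from pgmpy.factors.discrete import TabularCPD
    from pgmpy.inference import VariableElimination

    # Toy multi-hazard system: earthquake (EQ) and high wind (W) are independent
    # hazards; either can fail a component (F). Probabilities are illustrative.
    bn = BayesianNetwork([("EQ", "F"), ("W", "F")])
    bn.add_cpds(
        TabularCPD("EQ", 2, [[0.99], [0.01]]),
        TabularCPD("W", 2, [[0.95], [0.05]]),
        TabularCPD("F", 2,
                   # columns: (EQ=0,W=0), (EQ=0,W=1), (EQ=1,W=0), (EQ=1,W=1)
                   [[0.999, 0.90, 0.80, 0.60],    # F = 0 (survives)
                    [0.001, 0.10, 0.20, 0.40]],   # F = 1 (fails)
                   evidence=["EQ", "W"], evidence_card=[2, 2]),
    )
    print(VariableElimination(bn).query(["F"]))  # marginal failure probability
    ```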

  15. Do Crashes and Near Crashes in Simulator-Based Training Enhance Novice Drivers’ Visual Search for Latent Hazards?

    PubMed Central

    Vlakveld, Willem; Romoser, Matthew R. E.; Mehranian, Hasmik; Diete, Frank; Pollatsek, Alexander; Fisher, Donald L.

    2012-01-01

    Young drivers (younger than 25 years of age) are overrepresented in crashes. Research suggests that a relevant cause is inadequate visual search for possible hazards that are hidden from view. The objective of this study was to develop and evaluate a low-cost, fixed-base simulator training program that would address this failure. It was hypothesized that elicited crashes in the simulator training would result in better scanning for latent hazards in scenarios that were similar to the training scenarios but situated in a different environment (near transfer), and, to a lesser degree, would result in better scanning in scenarios that had altogether different latent hazards than those contained in the training scenarios (far transfer). To test the hypotheses, 18 trained and 18 untrained young novice drivers were evaluated on an advanced driving simulator (different from the training simulator). The eye movements of both groups were measured. In near transfer scenarios, trained drivers fixated the hazardous region 84% of the time, compared with only 57% of untrained drivers. In far transfer scenarios, trained drivers fixated the hazardous region 71% of the time, compared with only 53% of untrained drivers. The differences between trained and untrained drivers in both the near transfer scenarios and the far transfer scenarios were significant, with a large effect size in the near transfer scenarios and a medium effect size in the far transfer scenarios [respectively: U = 63.00, p(2-tailed) < .01, r = −.53, and U = 88.00, p(2-tailed) < .05, r = −.39]. PMID:23082041

  16. SUBGRADE MONOLITHIC ENCASEMENT STABILIZATION OF CATEGORY 3 LOW LEVEL WASTE (LLW)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    PHILLIPS, S.J.

    2004-02-03

    A highly efficient and effective technology has been developed and is being used for stabilization of Hazard Category 3 low-level waste at the U.S. Department of Energy's Hanford Site. Using large, structurally interconnected monoliths, which together form one large monolith that fills a waste disposal trench, the patented technology can be used for final interment of almost any hazardous, radioactive, or toxic waste, or combinations of these waste materials, packaged in a variety of sizes, shapes, and volumes within governmental regulatory limits. The technology increases waste volumetric loading by 100 percent, area use efficiency by 200 percent, and volumetric configuration efficiency by more than 500 percent over past practices. To date, in excess of 2,010 m³ of contact-handled and remote-handled low-level radioactive waste has been interred using this patented technology. Additionally, in excess of 120 m³ of low-level radioactive waste requiring stabilization in a low-diffusion-coefficient waste encasement matrix has been disposed of using this technology. A reduction in radiation exposure of greater than five orders of magnitude has been noted using this method of encasement of Hazard Category 3 waste. Additionally, exposure monitored at all monolith locations produced by the slip-form technology is less than 1.29 × 10⁻⁷ C·kg⁻¹. Monolithic encasement of Hazard Category 3 low-level waste and other waste category materials may be successfully accomplished using this technology at nominally any governmental or private-sector waste disposal facility. Additionally, other waste materials consisting of hazardous, radioactive, toxic, or mixed waste can be disposed of using the monolithic slip-form encasement technology.

  17. Method for the Preparation of Hazard Map in Urban Area Using Soil Depth and Groundwater Level

    NASA Astrophysics Data System (ADS)

    Kim, Sung-Wook; Choi, Eun-Kyeong; Cho, Jin Woo; Lee, Ju-Hyoung

    2017-04-01

    Hazard maps for predicting collapse on natural slopes consist of a combination of topographic, hydrological, and geological factors. Topographic factors are extracted from a DEM and include aspect, slope, curvature, and topographic index. Hydrological factors, such as distance to drainage, drainage density, stream-power index, and wetness index, are among the most important factors for slope instability. However, most urban areas are located on plains, where it is difficult to apply hazard maps based on topographic and hydrological factors alone. In order to evaluate the risk of collapse in flat and low-slope areas, soil depth and groundwater level data were collected and used as additional factors for interpretation. In addition, the reliability of the hazard map was checked against the disaster history of the study area (Gangnam-gu and Yeouido districts). In the disaster map of the disaster prevention agency, the urban area was mostly classified as stable and did not reflect the collapse history. Soil depth, drainage conditions, and groundwater level obtained from boreholes were added as input data to the hazard map, and the computed disaster vulnerability increased at the locations where collapses actually occurred. In the study area where damage occurred, the previous hazard map assigned moderate and low vulnerability grades to 12% and 88% of the area, respectively, while the improved map showed 2% high grade, 29% moderate grade, 66% low grade, and 2% very low grade. These results were similar to the actual damage. Keywords: hazard map, urban area, soil depth, groundwater level. Acknowledgement: This research was supported by a Grant from a Strategic Research Project (Horizontal Drilling and Stabilization Technologies for Urban Search and Rescue (US&R) Operation) funded by the Korea Institute of Civil Engineering and Building Technology.
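
    The improvement described above amounts to adding borehole-derived factors to a conventional weighted overlay. The sketch below illustrates the idea with invented weights and breakpoints; the study's actual factor weighting is not reproduced here.

        # Illustrative weighted-overlay sketch: augmenting a slope-based hazard
        # map with borehole soil depth, drainage condition, and groundwater level
        # for flat urban ground. Weights and breakpoints are invented.

        def vulnerability_grade(slope_deg, soil_depth_m, gw_depth_m, poor_drainage):
            score = 0.0
            score += 0.3 * min(slope_deg / 30.0, 1.0)        # topographic factor
            score += 0.3 * min(soil_depth_m / 10.0, 1.0)     # thick soft soil
            score += 0.2 * max(0.0, 1.0 - gw_depth_m / 5.0)  # shallow water table
            score += 0.2 * (1.0 if poor_drainage else 0.0)   # drainage condition
            for grade, threshold in (("high", 0.7), ("moderate", 0.45), ("low", 0.2)):
                if score >= threshold:
                    return grade
            return "very low"

        # Flat urban cell: slope alone would rate it stable, but deep soft soil and
        # a shallow water table push it to a higher grade, as in the improved map.
        print(vulnerability_grade(slope_deg=2, soil_depth_m=12,
                                  gw_depth_m=1.5, poor_drainage=True))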

  18. Fruit pomace as a fuel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mason, N.; Davis, D.C.; Hyde, G.M.

    1983-12-01

    In this study the solid waste (pomace) from grape and apple juice processing was chemically analyzed to determine its higher heating value. Grape pomace combustion was simulated at several excess-air levels and the combustion products were analyzed. Grape pomace was then actually burned in a concentric vortex furnace at several levels of excess air to determine combustion efficiency and to confirm flue-gas pollutant characteristics. The results show that apple and grape pomace are chemically similar to wood from the combustion standpoint and that furnace slagging is not a problem, because the ash fusion temperatures are considerably higher than the combustion temperatures. The grape pomace burned at efficiencies of 44 to 61 percent with only a low pollution hazard.

  19. A collaborative fire hazard reduction/ecosystem restoration stewardship project in a Montana mixed ponderosa pine/Douglas-fir/western larch wildland urban interface

    Treesearch

    Steve Slaughter; Laura Ward; Michael Hillis; Jim Chew; Rebecca McFarlan

    2004-01-01

    Forest Service managers and researchers designed and evaluated alternative disturbance-based fire hazard reduction/ecosystem restoration treatments in a greatly altered low-elevation ponderosa pine/Douglas-fir/western larch wildland urban interface. Collaboratively planned improvement cutting and prescribed fire treatment alternatives were evaluated in simulations of...

  20. Verbal collision avoidance messages during simulated driving: perceived urgency, alerting effectiveness and annoyance.

    PubMed

    Baldwin, Carryl L

    2011-04-01

    Matching the perceived urgency of an alert with the relative hazard level of the situation is critical for effective alarm response. Two experiments describe the impact of acoustic and semantic parameters on ratings of perceived urgency, annoyance and alerting effectiveness and on alarm response speed. Within a simulated driving context, participants rated and responded to collision avoidance system (CAS) messages spoken by a female or male voice (experiments 1 and 2, respectively). Results indicated greater perceived urgency and faster alarm response times as intensity increased from -2 dB signal-to-noise (S/N) ratio to +10 dB S/N, although annoyance ratings increased as well. CAS semantic content interacted with alarm intensity, indicating that at lower intensity levels participants paid more attention to the semantic content. Results indicate that both acoustic and semantic parameters independently and interactively impact CAS alert perceptions in divided attention conditions, and this work can inform auditory alarm design for effective hazard matching. STATEMENT OF RELEVANCE: Results indicate that both acoustic parameters and semantic content can be used to design collision warnings with a range of urgency levels. Further, these results indicate that verbal warnings tailored to a specific hazard situation may improve hazard-matching capabilities without substantial trade-offs in perceived annoyance.

  1. A spatial DB model to simulate the road network efficiency in hydrogeological emergency

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mangiameli, Michele, E-mail: michele.mangiameli@dica.unict.it; Mussumeci, Giuseppe

    We deal with the theme of the simulation of risk analysis using a technological approach based on the integration of exclusively free and open-source tools: PostgreSQL as the Database Management System (DBMS) and Quantum GIS-GRASS as the Geographic Information System (GIS) platform. The case study is a seismic area in Sicily characterized by steep slopes and frequent instability phenomena. This area includes a city of about 30,000 inhabitants (Enna) that lies on the top of a mountain at about 990 m a.s.l. Access to the city is assured by few, very winding roads that are also highly vulnerable to seismic and hydrogeological hazards. When exceptional rainfall events occur, the loss of efficiency of these roads could compromise the timeliness and effectiveness of rescue operations. The data for the sample area have been structured in the adopted DBMS, and the connection to the GIS functionalities allows the exceptional events to be simulated. We analyzed the hazard, vulnerability and exposure related to these events and calculated the final risk, defining three classes for each scenario: low (L), medium (M) and high (H). This study can be a valuable tool to prioritize risk levels and set priorities for intervention on the main road networks.

  2. A spatial DB model to simulate the road network efficiency in hydrogeological emergency

    NASA Astrophysics Data System (ADS)

    Mangiameli, Michele; Mussumeci, Giuseppe

    2015-12-01

    We deal with the theme of the simulation of risk analysis using a technological approach based on the integration of exclusively free and open-source tools: PostgreSQL as the Database Management System (DBMS) and Quantum GIS-GRASS as the Geographic Information System (GIS) platform. The case study is a seismic area in Sicily characterized by steep slopes and frequent instability phenomena. This area includes a city of about 30,000 inhabitants (Enna) that lies on the top of a mountain at about 990 m a.s.l. Access to the city is assured by few, very winding roads that are also highly vulnerable to seismic and hydrogeological hazards. When exceptional rainfall events occur, the loss of efficiency of these roads could compromise the timeliness and effectiveness of rescue operations. The data for the sample area have been structured in the adopted DBMS, and the connection to the GIS functionalities allows the exceptional events to be simulated. We analyzed the hazard, vulnerability and exposure related to these events and calculated the final risk, defining three classes for each scenario: low (L), medium (M) and high (H). This study can be a valuable tool to prioritize risk levels and set priorities for intervention on the main road networks.
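
    The final classification step described in both records above reduces to mapping the product of hazard, vulnerability and exposure onto three classes. A toy sketch follows; the breakpoints are invented, and the actual study performs this inside PostgreSQL/QGIS-GRASS rather than in Python.

        # Sketch of the final risk-classification step: risk from hazard,
        # vulnerability and exposure, banded into the paper's L / M / H classes.
        # All breakpoints are invented for illustration.

        def risk_class(hazard, vulnerability, exposure):
            """All inputs normalized to [0, 1]; returns L / M / H."""
            risk = hazard * vulnerability * exposure
            if risk >= 0.5:
                return "H"
            if risk >= 0.2:
                return "M"
            return "L"

        # A winding access road on a steep unstable slope during exceptional rainfall:
        print(risk_class(hazard=0.9, vulnerability=0.8, exposure=0.9))  # -> "H"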

  3. Blue light hazard optimization for white light-emitting diode sources with high luminous efficacy of radiation and high color rendering index

    NASA Astrophysics Data System (ADS)

    Zhang, Jingjing; Guo, Weihong; Xie, Bin; Yu, Xingjian; Luo, Xiaobing; Zhang, Tao; Yu, Zhihua; Wang, Hong; Jin, Xing

    2017-09-01

    Blue light hazard from white light-emitting diodes (LEDs) is a hidden risk for human photobiological safety. Recent spectral optimization methods focus on maximizing luminous efficacy and improving the color performance of LEDs, but few of them take the blue light hazard into account. Therefore, for healthy lighting, it is urgent to propose a spectral optimization method for white LED sources that exhibits low blue light hazard, high luminous efficacy of radiation (LER) and high color performance. In this study, a genetic algorithm with penalty functions was proposed for realizing white spectra with low blue hazard, maximal LER and high color rendering index (CRI) values. In simulations, white LED spectra with low blue hazard, high LER (≥297 lm/W) and high CRI (≥90) were achieved at correlated color temperatures (CCTs) from 2013 K to 7845 K. The spectral optimization method can thus be used to guide the fabrication of LED sources in line with photobiological safety. It is also found that the maximum permissible exposure duration of the optimized spectra is 14.9% longer than that of bichromatic phosphor-converted LEDs with equal CCT.
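
    A heavily simplified sketch of penalty-function spectral optimization is given below. Real LER, CRI and CCT computation requires full colorimetry; here luminous efficacy is proxied by overlap with a V(lambda)-like curve, the blue light hazard by the 400-500 nm energy fraction, and the genetic algorithm by a random-mutation hill climb, so every number is illustrative only.

        import math
        import random

        WL = list(range(380, 781, 5))                       # wavelength grid, nm

        def gaussian(peak, width):
            return [math.exp(-0.5 * ((w - peak) / width) ** 2) for w in WL]

        # Four Gaussian LED primaries (peak nm, width nm) and a crude V(lambda) stand-in.
        PRIMARIES = [gaussian(p, s) for p, s in [(450, 12), (520, 18), (590, 20), (630, 15)]]
        V = gaussian(555, 50)

        def objective(weights):
            spd = [sum(w * p[i] for w, p in zip(weights, PRIMARIES)) for i in range(len(WL))]
            total = sum(spd) or 1e-9
            ler_proxy = sum(s * v for s, v in zip(spd, V)) / total
            blue = sum(s for s, w in zip(spd, WL) if 400 <= w <= 500) / total
            penalty = 10.0 * max(0.0, blue - 0.15)          # penalize blue fraction > 15%
            return ler_proxy - penalty

        # Random-mutation hill climb as a stand-in for the paper's genetic algorithm.
        best = [random.random() for _ in PRIMARIES]
        for _ in range(2000):
            cand = [max(0.0, w + random.gauss(0, 0.05)) for w in best]
            if objective(cand) > objective(best):
                best = cand
        print([round(w, 3) for w in best], round(objective(best), 4))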

  4. Remote vacuum compaction of compressible hazardous waste

    DOEpatents

    Coyne, M.J.; Fiscus, G.M.; Sammel, A.G.

    1998-10-06

    A system is described for remote vacuum compaction and containment of low-level radioactive or hazardous waste comprising a vacuum source, a sealable first flexible container, and a sealable outer flexible container for receiving one or more first flexible containers. A method for compacting low level radioactive or hazardous waste materials at the point of generation comprising the steps of sealing the waste in a first flexible container, sealing one or more first containers within an outer flexible container, breaching the integrity of the first containers, evacuating the air from the inner and outer containers, and sealing the outer container shut. 8 figs.

  5. Remote vacuum compaction of compressible hazardous waste

    DOEpatents

    Coyne, Martin J.; Fiscus, Gregory M.; Sammel, Alfred G.

    1998-01-01

    A system for remote vacuum compaction and containment of low-level radioactive or hazardous waste comprising a vacuum source, a sealable first flexible container, and a sealable outer flexible container for receiving one or more first flexible containers. A method for compacting low level radioactive or hazardous waste materials at the point of generation comprising the steps of sealing the waste in a first flexible container, sealing one or more first containers within an outer flexible container, breaching the integrity of the first containers, evacuating the air from the inner and outer containers, and sealing the outer container shut.

  6. Remote vacuum compaction of compressible hazardous waste

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coyne, M.J.; Fiscus, G.M.; Sammel, A.G.

    1996-12-31

    A system is described for remote vacuum compaction and containment of low-level radioactive or hazardous waste comprising a vacuum source, a sealable first flexible container, and a sealable outer flexible container for receiving one or more first flexible containers. A method for compacting low level radioactive or hazardous waste materials at the point of generation comprising the steps of sealing the waste in a first flexible container, sealing one or more first containers within an outer flexible container, breaching the integrity of the first containers, evacuating the air from the inner and outer containers, and sealing the outer container shut.

  7. Tsunami hazard assessment in the Hudson River Estuary based on dynamic tsunami-tide simulations

    NASA Astrophysics Data System (ADS)

    Shelby, Michael; Grilli, Stéphan T.; Grilli, Annette R.

    2016-12-01

    This work is part of a tsunami inundation mapping activity carried out along the US East Coast since 2010, under the auspices of the National Tsunami Hazard Mitigation Program (NTHMP). The US East Coast features two main estuaries with significant tidal forcing, which are bordered by numerous critical facilities (power plants, major harbors, ...) as well as densely built low-lying areas: Chesapeake Bay and the Hudson River Estuary (HRE). The HRE is the object of this work, with specific focus on assessing tsunami hazard in Manhattan and the Hudson and East River areas. In the NTHMP work, inundation maps are computed as envelopes of maximum surface elevation along the coast and inland, by simulating the impact of selected probable maximum tsunamis (PMTs) in the Atlantic ocean margin and basin. At present, such simulations assume a static reference level near shore equal to the local mean high water (MHW) level. Here, instead, we simulate maximum inundation in the HRE resulting from dynamic interactions between the incident PMTs and a tide, which is calibrated to achieve MHW at its maximum level. To identify conditions leading to maximum tsunami inundation, each PMT is simulated for four different phases of the tide and results are compared to those obtained for a static reference level. We first separately simulate the tide and the three PMTs that were found to be most significant for the HRE. These are caused by: (1) a flank collapse of the Cumbre Vieja Volcano (CVV) in the Canary Islands (with an 80 km³ volume representing the most likely extreme scenario); (2) an M9 coseismic source in the Puerto Rico Trench (PRT); and (3) a large submarine mass failure (SMF) in the Hudson River canyon with parameters similar to the 165 km³ historical Currituck slide, which is used as a local proxy for the maximum possible SMF. Simulations are performed with the nonlinear and dispersive long wave model FUNWAVE-TVD, in a series of nested grids of increasing resolution towards the coast, by one-way coupling. Four levels of nested grids are used, from a 1 arc-min spherical-coordinate grid in the deep ocean down to a 39-m Cartesian grid in the HRE. Bottom friction coefficients in the finer grids are calibrated for the tide to achieve the local spatially averaged MHW level at high tide in the HRE. Combined tsunami-tide simulations are then performed for four phases of the tide corresponding to each tsunami arriving at Sandy Hook (NJ): 1.5 h ahead of, concurrent with, 1.5 h after, and 3 h after the local high tide. These simulations are forced along the offshore boundary of the third-level grid by linearly superposing time series of surface elevation and horizontal currents of the calibrated tide and each tsunami wave train; this is done in deep enough water for a linear superposition to be accurate. Combined tsunami-tide simulations are then performed with FUNWAVE-TVD in this and the finest nested grids. Results show that, for the three PMTs, depending on the tide phase, the dynamic simulations lead to no or to slightly increased inundation in the HRE (by up to 0.15 m depending on location), and to larger currents than the simulations over a static level; the Currituck SMF proxy tsunami is the PMT leading to maximum inundation in the HRE. For all tide phases, nonlinear interactions between tide and tsunami currents modify the elevation, current, and celerity of tsunami wave trains, mostly in the shallower water areas of the HRE where bottom friction dominates, as compared to a linear superposition of wave elevations and currents. We note that, while dynamic simulations predict a slight increase in inundation, this increase may be on the same order as, or even less than, sources of uncertainty in the modeling of tsunami sources, such as their initial water elevation, and in the bottom friction and bathymetry used in tsunami grids. Nevertheless, results in this paper provide insight into the magnitude and spatial variability of tsunami propagation and impact in the complex inland waterways surrounding New York City, and of their modification by dynamic tidal effects. We conclude that changes in inundation resulting from the inclusion of a dynamic tide in the specific case of the HRE, although of scientific interest, are not significant for tsunami hazard assessment, and that the standard approach of specifying a static reference level equal to MHW is conservative. However, in other estuaries with similarly complex bathymetry/topography and stronger tidal currents, a simplified static approach might not be appropriate.
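
    The boundary forcing described above (linear superposition of tide and tsunami in deep water, with the tsunami shifted relative to high tide) can be sketched as follows; the series, amplitudes and the 1.5 h shift are synthetic stand-ins, not the study's calibrated inputs.

        import numpy as np

        # Sketch of the offshore boundary forcing: in deep water the calibrated
        # tide and the tsunami wave train are linearly superposed before being
        # handed to the nested nonlinear model. All time series are synthetic.

        t = np.arange(0.0, 12 * 3600.0, 60.0)                     # 12 h at 1-min steps
        tide_eta = 0.7 * np.sin(2 * np.pi * t / (12.42 * 3600))   # M2-like tide, m
        tsunami_eta = 1.5 * np.exp(-((t - 6 * 3600) / 900) ** 2)  # transient wave, m

        phase_shift_s = 1.5 * 3600               # tsunami arriving 1.5 h after high tide
        shifted = np.interp(t - phase_shift_s, t, tsunami_eta, left=0.0, right=0.0)

        boundary_eta = tide_eta + shifted        # superposition valid in deep water only
        print(f"peak boundary elevation: {boundary_eta.max():.2f} m")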

  8. Comparative hazard evaluation of near-infrared diode lasers.

    PubMed

    Marshall, W J

    1994-05-01

    Hazard evaluation methods from various laser protection standards differ when applied to extended-source, near-infrared lasers. By way of example, various hazard analyses are applied to laser training systems, which incorporate diode lasers, specifically those that assist in training military or law enforcement personnel in the proper use of weapons by simulating actual firing by the substitution of a beam of near-infrared energy for bullets. A correct hazard evaluation of these lasers is necessary since simulators are designed to be directed toward personnel during normal use. The differences among laser standards are most apparent when determining the hazard class of a laser. Hazard classification is based on a comparison of the potential exposures with the maximum permissible exposures in the 1986 and 1993 versions of the American National Standard for the Safe Use of Lasers, Z136.1, and the accessible emission limits of the federal laser product performance standard. Necessary safety design features of a particular system depend on the hazard class. The ANSI Z136.1-1993 standard provides a simpler and more accurate hazard assessment of low-power, near-infrared, diode laser systems than the 1986 ANSI standard. Although a specific system is evaluated, the techniques described can be readily applied to other near-infrared lasers or laser training systems.

  9. Large-eddy simulation of plume dispersion within regular arrays of cubic buildings

    NASA Astrophysics Data System (ADS)

    Nakayama, H.; Jurcakova, K.; Nagai, H.

    2011-04-01

    Hazardous and flammable materials may be accidentally or intentionally released within populated urban areas. For the assessment of the human health hazard from toxic substances, the existence of high concentration peaks in a plume should be considered, and for the safety analysis of flammable gas, critical threshold levels should be evaluated. In such situations, therefore, not only average levels but also instantaneous magnitudes of concentration should be accurately predicted. In this study, we perform a Large-Eddy Simulation (LES) of plume dispersion within regular arrays of cubic buildings with large obstacle densities and investigate the influence of the building arrangement on the characteristics of mean and fluctuating concentrations.

  10. SimSup's Loop: A Control Theory Approach to Spacecraft Operator Training

    NASA Technical Reports Server (NTRS)

    Owens, Brandon Dewain; Crocker, Alan R.

    2015-01-01

    Immersive simulation is a staple of training for many complex system operators, including astronauts and ground operators of spacecraft. However, while much has been written about simulators, simulation facilities, and operator certification programs, the topic of how one develops simulation scenarios to train a spacecraft operator is relatively understated in the literature. In this paper, an approach is presented for using control theory as the basis for developing the immersive simulation scenarios for a spacecraft operator training program. The operator is effectively modeled as a high level controller of lower level hardware and software control loops that affect a select set of system state variables. Simulation scenarios are derived from a STAMP-based hazard analysis of the operator's high and low level control loops. The immersive simulation aspect of the overall training program is characterized by selecting a set of scenarios that expose the operator to the various inadequate control actions that stem from control flaws and inadequate control executions in the different sections of the typical control loop. Results from the application of this approach to the Lunar Atmosphere and Dust Environment Explorer (LADEE) mission are provided through an analysis of the simulation scenarios used for operator training and the actual anomalies that occurred during the mission. The simulation scenarios and inflight anomalies are mapped to specific control flaws and inadequate control executions in the different sections of the typical control loop to illustrate the characteristics of anomalies arising from the different sections of the typical control loop (and why it is important for operators to have exposure to these characteristics). Additionally, similarities between the simulation scenarios and inflight anomalies are highlighted to make the case that the simulation scenarios prepared the operators for the mission.

  11. Environmental Planning in Jonah's Basin: A Simulation Game and Experimental Analysis.

    ERIC Educational Resources Information Center

    Horsley, Doyne

    1982-01-01

    Described is a successfully field tested simulation which will help high school or college level students become familiar with flood hazards. Students assume the roles of members of the Jonah's Basin planning commission and plan solutions to the area's flood problems. (RM)

  12. Passive versus active hazard detection and avoidance systems

    NASA Astrophysics Data System (ADS)

    Neveu, D.; Mercier, G.; Hamel, J.-F.; Simard Bilodeau, V.; Woicke, S.; Alger, M.; Beaudette, D.

    2015-06-01

    Upcoming planetary exploration missions will require advanced guidance, navigation and control technologies to reach landing sites with high precision and safety. Various technologies are currently in development to meet that goal. Some technologies rely on passive sensors and benefit from the low mass and power of such solutions while others rely on active sensors and benefit from an improved robustness and accuracy. This paper presents two different hazard detection and avoidance (HDA) system design approaches. The first architecture relies only on a camera as the passive HDA sensor while the second relies, in addition, on a Lidar as the active HDA sensor. Both options use in common an innovative hazard map fusion algorithm aiming at identifying the safest landing locations. This paper presents the simulation tools and reports the closed-loop software simulation results obtained using each design option. The paper also reports the Monte Carlo simulation campaign that was used to assess the robustness of each design option. The performance of each design option is compared against each other in terms of performance criteria such as percentage of success, mean distance to nearest hazard, etc. The applicability of each design option to planetary exploration missions is also discussed.

  13. Predicting the impact of tsunami in California under rising sea level

    NASA Astrophysics Data System (ADS)

    Dura, T.; Garner, A. J.; Weiss, R.; Kopp, R. E.; Horton, B.

    2017-12-01

    The flood hazard for the California coast depends not only on the magnitude, location, and rupture length of Alaska-Aleutian subduction zone earthquakes and their resultant tsunamis, but also on rising sea levels, which combine with tsunamis to produce overall flood levels. The magnitude of future sea-level rise remains uncertain even on the decadal scale, and future sea-level projections become even more uncertain at timeframes of a century or more. Earthquake statistics indicate that timeframes of ten thousand to one hundred thousand years are needed to capture rare, very large earthquakes. Because of the different timescales between reliable sea-level projections and earthquake distributions, simply combining the different probabilities in the context of a tsunami hazard assessment may be flawed. Here, we considered 15 earthquakes with magnitudes from Mw 8 to Mw 9.4, bounded by 171°W and 140°W along the Alaska-Aleutian subduction zone. We employed 24 realizations at each magnitude with random epicenter locations and different fault length-to-width ratios, and simulated the tsunami evolution from these 360 earthquakes at each decade from the years 2000 to 2200. These simulations were then carried out for different sea-level-rise projections to analyze the future flood hazard for California. Looking at the flood levels at tide gauges, we found that the flood level simulated at, for example, the year 2100 (including the respective sea-level change) differs from the flood level calculated by adding the flood for the year 2000 to the sea-level change prediction for the year 2100. This is consistent across all sea-level-rise scenarios, and the difference in flood levels ranges between 5% and 12% for the larger half of the given magnitude interval. Focusing on flood levels at the tide gauge in the Port of Los Angeles, the most probable flood level (including all earthquake magnitudes) in the year 2000 was 5 cm. Depending on the sea-level predictions, by the year 2050 the most probable flood level could rise to 20 to 30 cm, and it increases significantly from 2100 to 2200, to between 0.5 m and 2.5 m. Aside from the significant increase in flood level, it should be noted that the range over which the potential most probable flood levels can vary is significant and poses a tremendous challenge for the long-term planning of hazard-mitigating measures.

  14. Modeling a Glacial Lake Outburst Flood Process Chain: The Case of Lake Palcacocha and Huaraz, Peru

    NASA Astrophysics Data System (ADS)

    Chisolm, Rachel; Somos-Valenzuela, Marcelo; Rivas Gomez, Denny; McKinney, Daene C.; Portocarrero Rodriguez, Cesar

    2016-04-01

    One of the consequences of recent glacier recession in the Cordillera Blanca, Peru, is the risk of Glacial Lake Outburst Floods (GLOFs) from lakes that have formed at the base of retreating glaciers. GLOFs are often triggered by avalanches falling into glacial lakes, initiating a chain of processes that may culminate in significant inundation and destruction downstream. This paper presents simulations of all of the processes involved in a potential GLOF originating from Lake Palcacocha, the source of a catastrophic GLOF on December 13, 1941 that killed about 1800 people in the city of Huaraz, Peru. The chain of processes simulated here includes: (1) avalanches above the lake; (2) lake dynamics resulting from the avalanche impact, including wave generation, propagation, and run-up across the lake; (3) terminal moraine overtopping and dynamic moraine erosion simulations to determine the possibility of breaching; (4) flood propagation along downstream valleys; and (5) inundation of populated areas. The results of each process feed into simulations of subsequent processes in the chain, finally resulting in estimates of inundation in the city of Huaraz. The results of the inundation simulations were converted into flood intensity and hazard maps (based on an intensity-likelihood matrix) that may be useful for city planning and regulation. Three avalanche events with volumes ranging from 0.5 to 3 × 10⁶ m³ were simulated, and two scenarios of 15 m and 30 m of lake lowering were simulated to assess the potential for mitigating the hazard level in Huaraz. For all three avalanche events, three-dimensional hydrodynamic models show large waves generated in the lake from the impact, resulting in overtopping of the damming moraine. Despite very high discharge rates (up to 63.4 × 10³ m³/s), the erosion from the overtopping wave did not result in failure of the damming moraine when simulated with a hydro-morphodynamic model using excessively conservative soil characteristics that provide very little erosion resistance. With the current lake level, all three avalanche events result in inundation in Huaraz, and the resulting hazard map shows a total affected area of 2.01 km², most of which is in the high-hazard category. Lowering the lake has the potential to reduce the affected area by up to 35%, leaving a smaller portion of the inundated area in the high-hazard category.

  15. Assessment on the leakage hazard of landfill leachate using three-dimensional excitation-emission fluorescence and parallel factor analysis method.

    PubMed

    Pan, Hongwei; Lei, Hongjun; Liu, Xin; Wei, Huaibin; Liu, Shufang

    2017-09-01

    A large number of simple and informal landfills exist in developing countries, and they pose tremendous soil and groundwater pollution threats. Early warning and monitoring of the pollution status of landfill leachate are therefore of great importance, yet there is a shortage of affordable and effective tools and methods. In this study, a soil column experiment was performed to simulate the pollution status of leachate using three-dimensional excitation-emission fluorescence (3D-EEMF) and parallel factor analysis (PARAFAC) models. The sum of squared residuals (SSR) and principal component analysis (PCA) were used to determine the optimal number of components for PARAFAC. A one-way analysis of variance showed that the component scores of the soil column leachate were significantly influenced by landfill leachate (p < 0.05). Therefore, the ratio of the component scores of the soil under the landfill to those of natural soil can be used to evaluate the leakage status of landfill leachate. Furthermore, a hazard index (HI) and a hazard evaluation standard were established. A case study of the Kaifeng landfill indicated a low hazard (level 5) by use of the HI. In summary, the HI is presented as a tool to evaluate landfill pollution status and to guide municipal solid waste management.
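
    The hazard index described above is a ratio of PARAFAC component scores, mapped onto discrete hazard levels. A minimal sketch follows; the level breakpoints are invented, since the paper defines its own evaluation standard.

        # Sketch of the hazard-index idea: the PARAFAC component score of soil
        # beneath the landfill is divided by that of natural background soil, and
        # the ratio is mapped onto discrete hazard levels. Breakpoints invented.

        def hazard_index(score_landfill, score_background):
            return score_landfill / score_background

        def hazard_level(hi):
            for level, threshold in ((1, 100), (2, 50), (3, 10), (4, 2)):
                if hi >= threshold:
                    return level
            return 5                     # level 5 = low hazard, as in the case study

        hi = hazard_index(score_landfill=1.8, score_background=1.2)
        print(f"HI = {hi:.2f} -> hazard level {hazard_level(hi)}")   # low hazard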

  16. Computer simulation of the processes of inactivation of bacterial cells by dynamic low-coherent speckles

    NASA Astrophysics Data System (ADS)

    Ulianova, Onega V.; Ulyanov, Sergey S.; Sazanova, Elena V.; Zhihong, Zhang; Sibo, Zhou; Luo, Qingming; Zudina, Irina; Bednov, Andrey

    2006-05-01

    The biochemical, biophysical and optical aspects of the interaction of low-coherent light with bacterial cells are discussed. The influence of low-coherent speckles on colony growth is investigated. It is demonstrated that the inhibitory effects of light on cells (Francisella tularensis) are connected with speckle dynamics. Regimes of illumination of cell suspensions are identified for the devitalization of hazardous bacteria that cause very dangerous infections such as tularemia. A mathematical model of the interaction of low-coherent laser radiation with bacterial suspensions is proposed, and computer simulations of the processes of laser-cell interaction have been carried out.

  17. Mesh versus bathtub - effects of flood models on exposure analysis in Switzerland

    NASA Astrophysics Data System (ADS)

    Röthlisberger, Veronika; Zischg, Andreas; Keiler, Margreth

    2016-04-01

    In Switzerland, mainly two types of maps that indicate potential flood zones are available for flood exposure analyses: 1) Aquaprotect, a nationwide overview provided by the Federal Office for the Environment and 2) communal flood hazard maps available from the 26 cantons. The model used to produce Aquaprotect can be described as a bathtub approach or linear superposition method with three main parameters, namely the horizontal and vertical distance of a point to water features and the size of the river sub-basin. Whereas the determination of flood zones in Aquaprotect is based on a uniform, nationwide model, the communal flood hazard maps are less homogenous, as they have been elaborated either at communal or cantonal levels. Yet their basic content (i.e. indication of potential flood zones for three recurrence periods, with differentiation of at least three inundation depths) is described in national directives and the vast majority of communal flood hazard maps are based on 2D inundation simulations using meshes. Apart from the methodical differences between Aquaprotect and the communal flood hazard maps (and among different communal flood hazard maps), all of these maps include a layer with a similar recurrence period (i.e. Aquaprotect 250 years, flood hazard maps 300 years) beyond the intended protection level of installed structural systems. In our study, we compare the resulting exposure by overlaying the two types of flood maps with a complete, harmonized, and nationwide dataset of building polygons. We assess the different exposure at the national level, and also consider differences among the 26 cantons and the six biogeographically unique regions, respectively. It was observed that while the nationwide exposure rates for both types of flood maps are similar, the differences within certain cantons and biogeographical regions are remarkable. We conclude that flood maps based on bathtub models are appropriate for assessments at national levels, while maps based on 2D simulations are preferable at sub-national levels.
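
    For concreteness, a bathtub-type flood flag over the three Aquaprotect parameters named above might look like the sketch below; the stage and attenuation functions are invented, as Aquaprotect's actual calibration is not reproduced here.

        # Minimal bathtub-model sketch using the three parameters named above:
        # horizontal and vertical distance of a building to the nearest water
        # feature and the size of the river sub-basin. Thresholds are invented.

        def flood_exposed(horiz_dist_m, vert_dist_m, subbasin_km2):
            # Larger sub-basins can produce higher flood stages, so allow a larger
            # vertical reach; attenuate that reach with horizontal distance.
            stage = 2.0 + 0.002 * subbasin_km2            # crude stage estimate, m
            reach = stage * max(0.0, 1.0 - horiz_dist_m / 500.0)
            return vert_dist_m <= reach

        buildings = [
            ("riverside house", 40, 1.2, 900),
            ("hillside barn",  300, 6.0, 900),
        ]
        for name, h, v, a in buildings:
            print(name, "exposed" if flood_exposed(h, v, a) else "not exposed")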

  18. Some relevant parameters for assessing fire hazards of combustible mine materials using laboratory scale experiments

    PubMed Central

    Litton, Charles D.; Perera, Inoka E.; Harteis, Samuel P.; Teacoach, Kara A.; DeRosa, Maria I.; Thomas, Richard A.; Smith, Alex C.

    2018-01-01

    When combustible materials ignite and burn, the potential for fire growth and flame spread represents an obvious hazard, but during these processes of ignition and flaming, other life hazards present themselves and should be included to ensure an effective overall analysis of the relevant fire hazards. In particular, the gases and smoke produced both during the smoldering stages of fires leading to ignition and during the advanced flaming stages of a developing fire serve to contaminate the surrounding atmosphere, potentially producing elevated levels of toxicity and high levels of smoke obscuration that render the environment untenable. In underground mines, these hazards may be exacerbated by the existing forced ventilation that can carry the gases and smoke to locations far-removed from the fire location. Clearly, materials that require high temperatures (above 1400 K) and that exhibit low mass loss during thermal decomposition, or that require high heat fluxes or heat transfer rates to ignite represent less of a hazard than materials that decompose at low temperatures or ignite at low levels of heat flux. In order to define and quantify some possible parameters that can be used to assess these hazards, small-scale laboratory experiments were conducted in a number of configurations to measure: 1) the toxic gases and smoke produced both during non-flaming and flaming combustion; 2) mass loss rates as a function of temperature to determine ease of thermal decomposition; and 3) mass loss rates and times to ignition as a function of incident heat flux. This paper describes the experiments that were conducted, their results, and the development of a set of parameters that could possibly be used to assess the overall fire hazard of combustible materials using small scale laboratory experiments. PMID:29599565

  19. Some relevant parameters for assessing fire hazards of combustible mine materials using laboratory scale experiments.

    PubMed

    Litton, Charles D; Perera, Inoka E; Harteis, Samuel P; Teacoach, Kara A; DeRosa, Maria I; Thomas, Richard A; Smith, Alex C

    2018-04-15

    When combustible materials ignite and burn, the potential for fire growth and flame spread represents an obvious hazard, but during these processes of ignition and flaming, other life hazards present themselves and should be included to ensure an effective overall analysis of the relevant fire hazards. In particular, the gases and smoke produced both during the smoldering stages of fires leading to ignition and during the advanced flaming stages of a developing fire serve to contaminate the surrounding atmosphere, potentially producing elevated levels of toxicity and high levels of smoke obscuration that render the environment untenable. In underground mines, these hazards may be exacerbated by the existing forced ventilation that can carry the gases and smoke to locations far-removed from the fire location. Clearly, materials that require high temperatures (above 1400 K) and that exhibit low mass loss during thermal decomposition, or that require high heat fluxes or heat transfer rates to ignite represent less of a hazard than materials that decompose at low temperatures or ignite at low levels of heat flux. In order to define and quantify some possible parameters that can be used to assess these hazards, small-scale laboratory experiments were conducted in a number of configurations to measure: 1) the toxic gases and smoke produced both during non-flaming and flaming combustion; 2) mass loss rates as a function of temperature to determine ease of thermal decomposition; and 3) mass loss rates and times to ignition as a function of incident heat flux. This paper describes the experiments that were conducted, their results, and the development of a set of parameters that could possibly be used to assess the overall fire hazard of combustible materials using small scale laboratory experiments.

  20. Enhancing hazard avoidance in teen-novice riders.

    PubMed

    Vidotto, Giulio; Bastianelli, Alessia; Spoto, Andrea; Sergeys, Filip

    2011-01-01

    Research suggests that novice drivers' safety performance is inferior to that of experienced drivers in different ways. One of the most critical skills related to accident avoidance by a novice driver is the detection, recognition and reaction to traffic hazards; it is called hazard perception and is defined as the ability to identify potentially dangerous traffic situations. The focus of this research is to assess how far a motorcycle simulator could improve hazard avoidance skills in teenagers. Four hundred and ten participants (207 in the experimental group and 203 in the control group) took part in this research. Results demonstrated that the mean proportion of avoided hazards increases as a function of the number of tracks performed in the virtual training. Participants of the experimental group after the training had a better proportion of avoided hazards than participants of the control group with a passive training based on a road safety lesson. Results provide good evidence that training with the simulator increases the number of avoided accidents in the virtual environment. It would be reasonable to explain this improvement by a higher level of hazard perception skills. Copyright © 2010 Elsevier Ltd. All rights reserved.

  1. 40 CFR 266.305 - What does the transportation and disposal conditional exemption do?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... PROTECTION AGENCY (CONTINUED) SOLID WASTES (CONTINUED) STANDARDS FOR THE MANAGEMENT OF SPECIFIC HAZARDOUS WASTES AND SPECIFIC TYPES OF HAZARDOUS WASTE MANAGEMENT FACILITIES Conditional Exemption for Low-Level... exemption exempts your waste from the regulatory definition of hazardous waste in 40 CFR 261.3 if your waste...

  2. Large Eddy Simulations of Severe Convection Induced Turbulence

    NASA Technical Reports Server (NTRS)

    Ahmad, Nash'at; Proctor, Fred

    2011-01-01

    Convective storms can pose a serious risk to aviation operations since they are often accompanied by turbulence, heavy rain, hail, icing, lightning, strong winds, and poor visibility. They can cause major delays in air traffic due to the re-routing of flights, and by disrupting operations at the airports in the vicinity of the storm system. In this study, the Terminal Area Simulation System is used to simulate five different convective events ranging from a mesoscale convective complex to isolated storms. The occurrence of convection induced turbulence is analyzed from these simulations. The validation of model results with the radar data and other observations is reported and an aircraft-centric turbulence hazard metric calculated for each case is discussed. The turbulence analysis showed that large pockets of significant turbulence hazard can be found in regions of low radar reflectivity. Moderate and severe turbulence was often found in building cumulus turrets and overshooting tops.

  3. Advanced Mechanistic 3D Spatial Modeling and Analysis Methods to Accurately Represent Nuclear Facility External Event Scenarios

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sezen, Halil; Aldemir, Tunc; Denning, R.

    Probabilistic risk assessment of nuclear power plants initially focused on events initiated by internal faults at the plant rather than on external hazards, including earthquakes and flooding. Although the importance of external-hazards risk analysis is now well recognized, the methods for analyzing low-probability external hazards rely heavily on the subjective judgment of specialists, often resulting in substantial conservatism. This research developed a framework to integrate the risk of seismic and flooding events using realistic structural models and simulation of the response of nuclear structures. The results of four application case studies are presented.

  4. Fuels planning: science synthesis and integration; forest structure and fire hazard fact sheet 05: fuel treatment principles for complex landscapes

    Treesearch

    Rocky Mountain Research Station USDA Forest Service

    2004-01-01

    Appropriate types of thinning and surface fuel treatments are clearly useful in reducing surface and crown fire hazards under a wide range of fuels and topographic situations. This paper provides well-established scientific principles and simulation tools that can be used to adjust fuel treatments to attain specific risk levels.

  5. Coastal hazards and groundwater salinization on low coral islands.

    NASA Astrophysics Data System (ADS)

    Terry, James P.; Chui, T. F. May

    2016-04-01

    Remote oceanic communities living on low-lying coral islands (atolls) without surface water rely for their survival on the continuing viability of fragile groundwater resources. These exist in the form of fresh groundwater lenses (FGLs) that develop naturally within the porous coral sand and gravel substrate. Coastal hazards such as inundation by high-energy waves driven by storms, and continuing sea-level rise (SLR), are among the many possible threats to viable FGL size and quality on atolls. Yet not much is known about the combined effects of wave washover during powerful storms and SLR on different sizes of coral island, nor, conversely, about how island size influences lens resilience against damage. This study uses a modelling approach to investigate FGL salinization damage caused by such coastal hazards, and lens resilience to it. Numerical modelling is carried out to generate steady-state FGL configurations for three chosen island sizes (400, 600 and 800 m widths). The steady-state solutions reveal that FGL dimensions are related in a non-linear manner to coral island size, such that smaller islands develop much more restricted lenses than larger islands. A 40 cm SLR scenario is then imposed, followed by transient simulations that examine storm-induced wave washover and the subsequent FGL responses to saline damage over a one-year period. Smaller FGLs display greater potential for disturbance by SLR, while larger and more robust FGLs tend to show more resilience. Further results produce a somewhat counterintuitive finding: in the post-SLR condition, FGL vulnerability to washover salinization may actually be reduced, owing to the thinner layer of unsaturated substrate lying above the water table into which saline water can infiltrate during a storm event. Nonetheless, the combined washover and SLR impacts imply that advancing groundwater salinization may render some coral islands uninhabitable long before they are completely submerged by sea-level rise, calling into question the sustainability of atoll communities that face recurrent coastal hazards.
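
    The study relies on numerical density-dependent modelling, but the first-order intuition for why narrower islands carry disproportionately thinner lenses comes from the classical Ghyben-Herzberg relation, sketched below with invented water-table heads.

        # First-order sketch only: with freshwater density 1000 kg/m3 and seawater
        # 1025 kg/m3, the Ghyben-Herzberg relation says the freshwater lens extends
        # about 40 m below sea level for every 1 m of water table above it.

        RHO_F, RHO_S = 1000.0, 1025.0
        ALPHA = RHO_F / (RHO_S - RHO_F)                   # = 40

        def lens_depth_below_msl(water_table_height_m):
            return ALPHA * water_table_height_m

        for island_width_m, head_m in [(400, 0.15), (600, 0.30), (800, 0.50)]:
            # Illustrative (invented) heads growing non-linearly with island width.
            print(f"{island_width_m} m island: lens ~{lens_depth_below_msl(head_m):.0f} m deep")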

  6. Regionalization and Evaluation of Impacts of Climate Change on Mexican Coasts

    NASA Astrophysics Data System (ADS)

    Nava-Sanchez, E. H.; Murillo-Jimenez, J. M.; Godinez-Orta, L.; Morales-Perez, R. A.

    2009-04-01

    Mexican coasts exhibit a high variety of geoforms and processes and, consequently, are exposed to a variety of types and impact levels of geological hazards. Tropical cyclones are the most devastating hazards for the Mexican coast, although impact levels are higher on the southern coasts of both the Atlantic and Pacific oceans. The second most dangerous geo-hazards are earthquakes and tsunamis, which affect the entire Pacific coast, with earthquakes generated in the Cocos Trench causing the most damage. For seismic hazards there is a regionalization of the Mexican territory; however, despite the high levels of damage caused by other natural hazards, there is a lack of initiatives to produce atlases of natural hazards or coastal management plans. Exceptions are the local-scale atlases of natural hazards by the Mexican Geological Survey, or other local-scale atlases made, with several errors, by inexperienced private consulting companies. Our work shows results of analyses of coastal geological hazards associated with global warming, such as sea level rise and the increase in strength of some coastal processes. Given the high diversity of coastal environments along the Mexican coast, a regional characterization of the coastal zone was first carried out, together with the gathering of environmental data for determining levels of impact of the various coastal hazards, as an evaluation of coastal vulnerability. The basic criteria for defining Coastal Regions, in order of importance, were the following: geomorphology, climate, geology, tectonics, and oceanography. Some anthropogenic factors were also taken into account in the coastal regionalization, such as civil construction along the coastline, land use, and modification of the fluvial system. The analysis of these criteria allows us to classify the Mexican coasts into 10 Coastal Regions. On the Pacific coast the regions are: (I) Pacific Coast of Baja California, (II) Gulf Coast of Baja California, (III) Coastal Plain of the Gulf of California, (IV) Pacific Southwest Coast, and (V) Chiapaneca Coastal Plain. On the Atlantic coast, the regions are: (VI) Tamaulipeca Coastal Plain, (VII) Veracruzana Volcanic Coast, (VIII) Tabasqueña Coastal Plain, (IX) Yucatan Platform, and (X) Caribbean Coast. Secondly, the coastal hazards associated with rising sea level and the increasing strength of coastal processes due to climate change were analyzed, allowing us to identify, in order of importance, the following hazards: (a) marine flooding, by sea level rise per se and the effect of storm surges; (b) beach erosion by waves, causing loss of beach width or retreat of the whole beach system, and overwash of sand barriers; (c) fluvial flooding of coastal plains and deltaic areas; and (d) salinization of estuaries and aquifers by saltwater intrusion. Finally, after overlaying the characteristics of each Coastal Region with its exposure to the identified coastal hazards, we concluded that the Coastal Regions most vulnerable to sea level rise are numbers V and VIII, since they contain wide lowlands (up to 7 m above MSL) and have highly populated areas affected by heavy rain, tropical cyclones, and storm surges; regions with moderate vulnerability are numbers VI, IX and X, which contain lowlands (up to 7 m above MSL) and populated areas, exhibit watersheds with low sediment production, and are located on tropical cyclone tracks; regions with moderately low vulnerability are numbers III and VII, which contain relatively narrow lowlands and important lagoon and deltaic systems, have several rivers affected by anthropogenic activities, and are moderately affected by storms and tropical cyclones; regions with low vulnerability and short coastlines exposed to sea level rise hazards are numbers I and IV, which contain narrow lagoon and deltaic systems; and finally, region II has very low vulnerability, with narrow and scarce areas exposed to sea level rise hazards. This project was part of a Research Program on Climate Change Impacts supported by the Mexican Institute for Water Technology and was carried out as a collaborative subprogram between that institute and the Interdisciplinary Center for Marine Sciences.

  7. Low Thrust Orbital Maneuvers Using Ion Propulsion

    NASA Astrophysics Data System (ADS)

    Ramesh, Eric

    2011-10-01

    Low-thrust maneuver options, such as electric propulsion, pose specific challenges within mission-level Modeling, Simulation, and Analysis (MS&A) tools. This project seeks to transition techniques for simulating low-thrust maneuvers from detailed engineering-level simulations, such as AGI's Satellite ToolKit (STK) Astrogator, to mission-level simulations such as the System Effectiveness Analysis Simulation (SEAS). Our project goals are as follows: A) Assess different low-thrust options to achieve various orbital changes; B) Compare such approaches to more conventional, high-thrust profiles; C) Compare the computational cost and accuracy of various approaches to calculate and simulate low-thrust maneuvers; D) Recommend methods for implementing low-thrust maneuvers in high-level mission simulations; E) Prototype recommended solutions.

  8. Risk assessment in infrastructure in educational institution: A study in Malaysia

    NASA Astrophysics Data System (ADS)

    Rasdan Ismail, Ahmad; Adilah Hamzah, Noor; Kamilah Makhtar, Nor; Azhar Mat Daud, Khairul; Zulkarnaen Khidzir, Nik; Husna Che Hassan, Nurul; Arifpin Mansor, Muhamad

    2017-10-01

    This study was conducted to assess hazard exposure in educational institutions and to highlight the possible risk levels present. The assessment utilised the Hazard Identification, Risk Assessment and Risk Control (HIRARC) method, using the 2008 HIRARC form to determine the risk level of each hazard. Over 111 educational institutions around Malaysia were selected for the assessment. The areas chosen at each institution were the office, playing field, canteen, classroom, toilet, and drainage. Following the HIRARC Guideline 2008, the risk rank is determined by multiplying likelihood by severity and referring the result to the standard risk matrix. Several hazards were found, spanning high, medium, and low risk levels. The higher-level risks, namely hazards found in the playing field and in the office, are discussed in the study. Several hazards need to be controlled by education management to avoid an increase in accident cases in the Malaysian education sector. In conclusion, the hazard exposure of staff and educators is high, and further action and controls are needed. Further study should explore the best recommendations for control measures for the hazards found in educational institutions.
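
    The HIRARC ranking step described above multiplies likelihood by severity on a 5 x 5 matrix and bands the product into low, medium and high. The sketch below uses commonly cited band boundaries, which should be checked against the HIRARC Guideline 2008 itself.

        # Sketch of the HIRARC ranking step: risk = likelihood x severity on a
        # 5 x 5 matrix, banded into low / medium / high. Band boundaries follow a
        # common convention and are not quoted from the guideline.

        def hirarc_risk(likelihood, severity):
            assert 1 <= likelihood <= 5 and 1 <= severity <= 5
            score = likelihood * severity
            if score >= 15:
                return score, "high"
            if score >= 5:
                return score, "medium"
            return score, "low"

        # e.g. an uneven playing-field surface: likely (4) x serious injury (4)
        print(hirarc_risk(4, 4))   # (16, 'high')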

  9. Data fusion strategies for hazard detection and safe site selection for planetary and small body landings

    NASA Astrophysics Data System (ADS)

    Câmara, F.; Oliveira, J.; Hormigo, T.; Araújo, J.; Ribeiro, R.; Falcão, A.; Gomes, M.; Dubois-Matra, O.; Vijendran, S.

    2015-06-01

    This paper discusses the design and evaluation of data fusion strategies that perform tiered fusion of several heterogeneous sensors and a priori data. The aim is to increase the robustness and performance of hazard detection and avoidance systems, while enabling safe planetary and small-body landings anytime, anywhere. The focus is on Mars and asteroid landing mission scenarios, and three distinct data fusion algorithms are introduced and compared. The first algorithm is a hybrid camera-LIDAR hazard detection and avoidance system, H2DAS, in which data fusion is performed at the sensor level (reconstruction of the point cloud obtained with a scanning LIDAR using the navigation motion states, and correction of the image for motion using IMU data), at the feature level (concatenation of multiple digital elevation maps, obtained from consecutive LIDAR images, to achieve higher-accuracy and higher-resolution maps while enabling relative positioning), and at the decision level (fusing hazard maps from multiple sensors onto a single image space, with a single grid orientation and spacing). The second method presented is a hybrid reasoning fusion, HRF, in which innovative algorithms replace the decision-level functions of the previous method by combining three different reasoning engines (a fuzzy reasoning engine, a probabilistic reasoning engine and an evidential reasoning engine) to produce safety maps. Finally, the third method presented is called Intelligent Planetary Site Selection, IPSIS, an innovative multi-criteria, dynamic decision-level data fusion algorithm that takes into account historical information for the selection of landing sites, together with a piloting function with a non-exhaustive landing-site search capability, i.e., one capable of finding local optima by searching a reduced set of global maps. All the discussed data fusion strategies and algorithms have been integrated, verified and validated in a closed-loop simulation environment. Monte Carlo simulation campaigns were performed for performance assessment and benchmarking of the algorithms. The simulation results cover the landing phases of Mars and Phobos landing mission scenarios.
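
    The decision-level step common to these designs, fusing per-sensor hazard maps on a shared grid and selecting the safest cell, can be reduced to the toy sketch below; the weighted-average rule and all numbers are invented stand-ins for the paper's fuzzy, probabilistic and evidential engines.

        import numpy as np

        # Toy decision-level fusion sketch: two sensor hazard maps (0 = safe,
        # 1 = hazardous) are combined by confidence-weighted averaging on a common
        # grid, and the safest cell is selected. This only shows the shape of the
        # decision-level step, not the paper's reasoning engines.

        rng = np.random.default_rng(0)
        camera_map = rng.random((8, 8))          # stand-ins for per-cell hazard scores
        lidar_map = rng.random((8, 8))

        w_cam, w_lidar = 0.4, 0.6                # LIDAR trusted more at low altitude
        fused = w_cam * camera_map + w_lidar * lidar_map

        safest = np.unravel_index(np.argmin(fused), fused.shape)
        print("safest landing cell:", safest,
              "hazard score:", round(float(fused[safest]), 3))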

  10. Association of neighbourhood unemployment rate with incident Type 2 diabetes mellitus in five German regions.

    PubMed

    Müller, G; Wellmann, J; Hartwig, S; Greiser, K H; Moebus, S; Jöckel, K-H; Schipf, S; Völzke, H; Maier, W; Meisinger, C; Tamayo, T; Rathmann, W; Berger, K

    2015-08-01

    To analyse the association of neighbourhood unemployment with incident self-reported physician-diagnosed Type 2 diabetes in a population aged 45-74 years from five German regions. Study participants were linked via their addresses at baseline to particular neighbourhoods. Individual-level data from five population-based studies were pooled and combined with contextual data on neighbourhood unemployment. Type 2 diabetes was assessed according to a self-reported physician diagnosis of diabetes. We estimated proportional hazard models (Weibull distribution) in order to obtain hazard ratios and 95% CIs of Type 2 diabetes mellitus, taking into account interval-censoring and clustering. We included 7250 participants residing in 228 inner city neighbourhoods in five German regions in our analysis. The incidence rate was 12.6 per 1000 person-years (95% CI 11.4-13.8). The risk of Type 2 diabetes mellitus was higher in men [hazard ratio 1.79 (95% CI 1.47-2.18)] than in women and higher in people with a low education level [hazard ratio 1.55 (95% CI 1.18-2.02)] than in those with a high education level. Independently of individual-level characteristics, we found a higher risk of Type 2 diabetes mellitus in neighbourhoods with high levels of unemployment [quintile 5; hazard ratio 1.72 (95% CI 1.23-2.42)] than in neighbourhoods with low unemployment (quintile 1). Low education level and high neighbourhood unemployment were independently associated with an elevated risk of Type 2 diabetes mellitus. Studies examining the impact of the residential environment on Type 2 diabetes mellitus will provide knowledge that is essential for the identification of high-risk populations.
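
    For readers unfamiliar with the model family, the Weibull proportional-hazards assumption means each covariate multiplies the baseline hazard by a constant factor exp(beta). The sketch below only reproduces the reported hazard-ratio interpretation (HR = 1.72 for the highest unemployment quintile); it is not the fitted model, and the paper's interval-censoring and clustering corrections are omitted.

        import math

        # Weibull proportional-hazards sketch: a covariate multiplies the baseline
        # hazard by exp(beta). Shape/scale values below are invented; only the
        # hazard ratio of 1.72 comes from the abstract.

        def weibull_hazard(t, shape, scale, beta_x=0.0):
            """Baseline Weibull hazard h0(t) scaled by exp(beta'x)."""
            h0 = (shape / scale) * (t / scale) ** (shape - 1)
            return h0 * math.exp(beta_x)

        beta_q5 = math.log(1.72)                 # reported HR, quintile 5 vs quintile 1
        t = 5.0                                  # years of follow-up (illustrative)
        h_low = weibull_hazard(t, shape=1.1, scale=80.0)
        h_high = weibull_hazard(t, shape=1.1, scale=80.0, beta_x=beta_q5)
        print(f"hazard ratio = {h_high / h_low:.2f}")   # 1.72, independent of t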

  11. Ethnic Drinking Culture, Acculturation, and Enculturation in Relation to Alcohol Drinking Behavior Among Marriage-Based Male Immigrants in Taiwan.

    PubMed

    Chen, Hung-Hui; Chien, Li-Yin

    2018-04-01

    Drinking behavior among immigrants could be influenced by drinking-related cultural norms in their country of origin and host country. This study examined the association of ethnic drinking culture, acculturation, and enculturation with alcohol drinking among male immigrants in Taiwan. This cross-sectional survey recruited 188 male immigrants. Ethnic drinking culture was classified as dry or wet according to per capita alcohol consumption and the abstinence rate in the country of origin relative to Taiwan. A new scale, the Bidimensional Acculturation Scale for Marriage-Based Immigrants, was developed to measure acculturation (adaptation to the host culture) and enculturation (maintenance of the original culture). Drinking patterns (abstinent, low-risk drinking, and hazardous drinking) were determined by scores on the Alcohol Use Disorder Identification Test. There was a significant interaction between ethnic drinking culture and enculturation/acculturation on drinking patterns. Multinomial logistic regression models showed that, for those from dry ethnic drinking cultures, a high level of acculturation was associated with increased low-risk drinking, while a high level of enculturation was associated with decreased low-risk drinking. For those from wet ethnic drinking cultures, a low level of acculturation and a high level of enculturation were associated with increased hazardous drinking. High family socioeconomic status was associated with increased drinking, while perceived insufficient family income was positively associated with hazardous use. To prevent hazardous use of alcohol, health education should be targeted at immigrant men who drink, especially those who have economic problems, come from wet ethnic drinking cultures, and show low adaptation to the host culture.
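
    A schematic of the analysis just described, assuming scikit-learn: a multinomial logistic regression over the three drinking patterns with culture-by-(ac/en)culturation interaction terms. The features and the randomly drawn outcome labels are placeholders, not the study's data, so the fitted coefficients carry no substantive meaning.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 188
wet_culture = rng.integers(0, 2, n)       # wet (1) vs dry (0) ethnic drinking culture
acculturation = rng.uniform(1.0, 5.0, n)  # adaptation to the host culture
enculturation = rng.uniform(1.0, 5.0, n)  # maintenance of the original culture

# Interaction terms let the (en)culturation effects differ by drinking
# culture, mirroring the interaction reported in the study.
X = np.column_stack([wet_culture, acculturation, enculturation,
                     wet_culture * acculturation, wet_culture * enculturation])
y = rng.choice([0, 1, 2], size=n)  # 0 abstinent, 1 low-risk, 2 hazardous (placeholder)

# With the default lbfgs solver this fits one multinomial model across
# the three outcome categories.
model = LogisticRegression(max_iter=1000).fit(X, y)
print(model.coef_)  # one row of log-odds coefficients per drinking pattern
```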

  12. A Prototype Flight-Deck Airflow Hazard Visualization System

    NASA Technical Reports Server (NTRS)

    Aragon, Cecilia R.

    2004-01-01

    Airflow hazards such as turbulence, vortices, or low-level wind shear can pose a threat to landing aircraft and are especially dangerous to helicopters. Because pilots usually cannot see airflow, they may be unaware of the extent of the hazard. We have developed a prototype airflow hazard visual display for use in helicopter cockpits to alleviate this problem. We report on the results of a preliminary usability study of our airflow hazard visualization system in helicopter-shipboard operations.

  13. On Modeling the Radiation Hazards Along the Trajectories of Space Vehicles for Various Purposes

    NASA Astrophysics Data System (ADS)

    Grichshenko, Valentina

    2016-07-01

    The paper discusses results of simulating the radiation hazard along the trajectories of low-orbit spacecraft for various purposes, as well as geostationary and navigation satellites. Criteria for the reliability of memory cells in space are developed, taking into account the influence of cosmic rays (CR) and differences in the geophysical and geomagnetic situation along the space vehicle (SV) orbit. Numerical values of the vertical geomagnetic cutoff rigidity and the CR flux are presented, together with an assessment of correlated memory-cell failures along low-orbit spacecraft trajectories. The results are used to forecast the radiation situation along an SV orbit, to assess the reliability of memory cells in space, and to optimize the nominal equipment kit and payload of Kazakhstan's SVs.

  14. Establish an Agent-Simulant Technology Relationship (ASTR)

    DTIC Science & Technology

    2017-04-14

    for quantitative measures that characterize simulant performance in testing, such as the ability to be removed from surfaces. Component-level ASTRs... Overall Test and Agent-Simulant Technology Relationship (ASTR) process. 1.2 Background. a. Historically, many tests did not develop quantitative... methodology report. Report provides a VX-TPP ASTR for post-decon contact hazard and off-gassing. In the Stryker production verification test (PVT

  15. Statistical power to detect violation of the proportional hazards assumption when using the Cox regression model.

    PubMed

    Austin, Peter C

    2018-01-01

    The use of the Cox proportional hazards regression model is widespread. A key assumption of the model is that of proportional hazards. Analysts frequently test the validity of this assumption using statistical significance testing. However, the statistical power of such assessments is frequently unknown. We used Monte Carlo simulations to estimate the statistical power of two different methods for detecting violations of this assumption. When the covariate was binary, we found that a model-based method had greater power than a method based on cumulative sums of martingale residuals. Furthermore, the parametric nature of the distribution of event times had an impact on power when the covariate was binary. Statistical power to detect a strong violation of the proportional hazards assumption was low to moderate even when the number of observed events was high. In many data sets, power to detect a violation of this assumption is likely to be low to modest.
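
    A minimal sketch of such a power simulation, assuming the lifelines library: event times are generated under a piecewise hazard ratio that violates proportionality (HR = 2 before t = 1, HR = 1 afterwards), a Cox model is fitted, the Schoenfeld-residual-based test is applied, and power is the rejection rate across replicates. The rates, change point, sample size, and replicate count are illustrative.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.statistics import proportional_hazard_test

rng = np.random.default_rng(42)

def one_trial(n=500):
    """Return True if the PH test rejects at the 5% level for one data set."""
    x = rng.integers(0, 2, n)
    # Exponential times with HR = 2 for x = 1 ...
    t = rng.exponential(1.0 / (0.5 * np.exp(np.log(2.0) * x)))
    # ... but survivors past t = 1 get a common residual hazard, so the
    # covariate effect vanishes after t = 1 (a strong PH violation).
    late = t > 1.0
    t[late] = 1.0 + rng.exponential(2.0, late.sum())
    df = pd.DataFrame({"T": t, "E": 1, "x": x})
    cph = CoxPHFitter().fit(df, "T", "E")
    res = proportional_hazard_test(cph, df, time_transform="rank")
    return np.atleast_1d(res.p_value)[0] < 0.05

power = np.mean([one_trial() for _ in range(200)])
print(f"estimated power: {power:.2f}")
```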

  16. Statistical power to detect violation of the proportional hazards assumption when using the Cox regression model

    PubMed Central

    Austin, Peter C.

    2017-01-01

    The use of the Cox proportional hazards regression model is widespread. A key assumption of the model is that of proportional hazards. Analysts frequently test the validity of this assumption using statistical significance testing. However, the statistical power of such assessments is frequently unknown. We used Monte Carlo simulations to estimate the statistical power of two different methods for detecting violations of this assumption. When the covariate was binary, we found that a model-based method had greater power than a method based on cumulative sums of martingale residuals. Furthermore, the parametric nature of the distribution of event times had an impact on power when the covariate was binary. Statistical power to detect a strong violation of the proportional hazards assumption was low to moderate even when the number of observed events was high. In many data sets, power to detect a violation of this assumption is likely to be low to modest. PMID:29321694

  17. Implications of Sea Level Rise on Coastal Flood Hazards

    NASA Astrophysics Data System (ADS)

    Roeber, V.; Li, N.; Cheung, K.; Lane, P.; Evans, R. L.; Donnelly, J. P.; Ashton, A. D.

    2012-12-01

    Recent global and local projections suggest that sea level will be on the order of 1 m or more above the current level by the end of the century. Coastal communities and ecosystems in low-lying areas are vulnerable to impacts from hurricane or large swell events in combination with sea-level rise. This study presents the implementation and results of an integrated numerical modeling package to delineate coastal inundation due to storm landfalls at future sea levels. The modeling package utilizes a suite of numerical models to capture both large-scale phenomena in the open ocean and small-scale processes in coastal areas. It contains four components to simulate (1) meteorological conditions, (2) astronomical tides and surge, (3) wave generation, propagation, and nearshore transformation, and (4) surf-zone processes and inundation onto dry land associated with a storm event. Important aspects of this package are the two-way coupling of a spectral wave model and a storm surge model as well as a detailed representation of surf and swash zone dynamics by a higher-order Boussinesq-type wave model. The package was validated with field data from Hurricane Ivan of 2004 on the US Gulf coast and applied to tropical and extratropical storm scenarios at Eglin, Florida, and Camp Lejeune, North Carolina, respectively. The results show a nonlinear increase of storm surge level and nearshore wave energy with rising sea level. The exacerbated flood hazard can have major consequences for coastal communities with respect to erosion and damage to infrastructure.

  18. Factors Associated With Drinking Behavior Among Immigrant Women in Taiwan.

    PubMed

    Liu, Yi-Chun; Chen, Hung-Hui; Lee, Jia-Fu; Chu, Kuei-Hui; Chien, Li-Yin

    2017-04-16

    Transnational marriage-based immigrant women in Taiwan have moved to a country where alcohol use is prevalent, and they face the challenge of adapting to a new society, which could influence their drinking behavior. To describe the prevalence of alcohol drinking and examine factors associated with drinking patterns among immigrant women in Taiwan. This study was a cross-sectional questionnaire survey; data were collected from June through November 2013. A convenience sample of 757 immigrant women was recruited across Taiwan. Alcohol use patterns during the past year were divided into abstinent, low-risk drinking, and hazardous drinking based on the Alcohol Use Disorder Identification Test. Measures included subject characteristics, exposure to cigarettes and alcohol, acculturation level, and perceived stress. The prevalence of drinking during the past year among immigrant women was 29.9% (low-risk drinking: 27.6%; hazardous drinking: 2.3%). Multinomial logistic regression showed that women who were employed, who smoked, whose husbands drank, and who interacted with Taiwanese friends frequently were significantly more likely to be in the low-risk drinking group compared with the abstinent group. Women who were divorced/widowed, who had low education levels, who smoked, and whose husbands drank were significantly more likely to be in the hazardous drinking group compared with the abstinent group. More acculturation in immigrant women, as indicated by working and frequently interacting with friends in mainstream society, was related to low-risk drinking behavior; adversities, as indicated by loss of marriage and low education level, were related to hazardous drinking behavior.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Belsher, Jeremy D.; Pierson, Kayla L.; Gimpel, Rod F.

    The Hanford site in southeast Washington contains approximately 207 million liters of radioactive and hazardous waste stored in 177 underground tanks. The U.S. Department of Energy's Office of River Protection is currently managing the Hanford waste treatment mission, which includes the storage, retrieval, treatment and disposal of the tank waste. Two recent studies, employing the modeling tools managed by the One System organization, have highlighted waste cleanup mission sensitivities. The Hanford Tank Waste Operations Simulator Sensitivity Study evaluated the impact that varying 21 different parameters had on the Hanford Tank Waste Operations Simulator model. It concluded that inaccuracies in the predicted phase partitioning of a few key components can result in significant changes in the waste treatment duration and in the amount of immobilized high-level waste that is produced. In addition, reducing the efficiency with which tank waste is retrieved and staged can increase mission duration. The 2012 WTP Tank Utilization Assessment concluded that flowsheet models need to include the latest low-activity waste glass algorithms or the waste treatment mission duration and the amount of low-activity waste that is produced could be significantly underestimated. (authors)

  20. Modeling a glacial lake outburst flood process chain: the case of Lake Palcacocha and Huaraz, Peru

    NASA Astrophysics Data System (ADS)

    Somos-Valenzuela, Marcelo A.; Chisolm, Rachel E.; Rivas, Denny S.; Portocarrero, Cesar; McKinney, Daene C.

    2016-07-01

    One of the consequences of recent glacier recession in the Cordillera Blanca, Peru, is the risk of glacial lake outburst floods (GLOFs) from lakes that have formed at the base of retreating glaciers. GLOFs are often triggered by avalanches falling into glacial lakes, initiating a chain of processes that may culminate in significant inundation and destruction downstream. This paper presents simulations of all of the processes involved in a potential GLOF originating from Lake Palcacocha, the source of a previously catastrophic GLOF on 13 December 1941 that killed about 1800 people in the city of Huaraz, Peru. The chain of processes simulated here includes (1) avalanches above the lake; (2) lake dynamics resulting from the avalanche impact, including wave generation, propagation, and run-up across lakes; (3) terminal moraine overtopping and dynamic moraine erosion simulations to determine the possibility of breaching; (4) flood propagation along downstream valleys; and (5) inundation of populated areas. The results of each process feed into simulations of subsequent processes in the chain, finally resulting in estimates of inundation in the city of Huaraz. The results of the inundation simulations were converted into flood intensity and preliminary hazard maps (based on an intensity-likelihood matrix) that may be useful for city planning and regulation. Three avalanche events with volumes ranging from 0.5 to 3 × 10⁶ m³ were simulated, and two scenarios of 15 and 30 m lake lowering were simulated to assess the potential of mitigating the hazard level in Huaraz. For all three avalanche events, three-dimensional hydrodynamic models show large waves generated in the lake from the impact, resulting in overtopping of the damming moraine. Despite very high discharge rates (up to 63.4 × 10³ m³ s⁻¹), the erosion from the overtopping wave did not result in failure of the damming moraine when simulated with a hydro-morphodynamic model using excessively conservative soil characteristics that provide very little erosion resistance. With the current lake level, all three avalanche events result in inundation in Huaraz due to wave overtopping, and the resulting preliminary hazard map shows a total affected area of 2.01 km², most of which is in the high hazard category. Lowering the lake has the potential to reduce the affected area by up to 35%, resulting in a smaller portion of the inundated area in the high hazard category.

  1. Controls of multi-modal wave conditions in a complex coastal setting

    USGS Publications Warehouse

    Hegermiller, Christie; Rueda, Ana C.; Erikson, Li H.; Barnard, Patrick L.; Antolinez, J.A.A.; Mendez, Fernando J.

    2017-01-01

    Coastal hazards emerge from the combined effect of wave conditions and sea level anomalies associated with storms or low-frequency atmosphere-ocean oscillations. Rigorous characterization of wave climate is limited by the availability of spectral wave observations, the computational cost of dynamical simulations, and the ability to link wave-generating atmospheric patterns with coastal conditions. We present a hybrid statistical-dynamical approach to simulating nearshore wave climate in complex coastal settings, demonstrated in the Southern California Bight, where waves arriving from distant, disparate locations are refracted over complex bathymetry and shadowed by offshore islands. Contributions of wave families and large-scale atmospheric drivers to nearshore wave energy flux are analyzed. Results highlight the variability of influences controlling wave conditions along neighboring coastlines. The universal method demonstrated here can be applied to complex coastal settings worldwide, facilitating analysis of the effects of climate change on nearshore wave climate.

  2. Controls of Multimodal Wave Conditions in a Complex Coastal Setting

    NASA Astrophysics Data System (ADS)

    Hegermiller, C. A.; Rueda, A.; Erikson, L. H.; Barnard, P. L.; Antolinez, J. A. A.; Mendez, F. J.

    2017-12-01

    Coastal hazards emerge from the combined effect of wave conditions and sea level anomalies associated with storms or low-frequency atmosphere-ocean oscillations. Rigorous characterization of wave climate is limited by the availability of spectral wave observations, the computational cost of dynamical simulations, and the ability to link wave-generating atmospheric patterns with coastal conditions. We present a hybrid statistical-dynamical approach to simulating nearshore wave climate in complex coastal settings, demonstrated in the Southern California Bight, where waves arriving from distant, disparate locations are refracted over complex bathymetry and shadowed by offshore islands. Contributions of wave families and large-scale atmospheric drivers to nearshore wave energy flux are analyzed. Results highlight the variability of influences controlling wave conditions along neighboring coastlines. The universal method demonstrated here can be applied to complex coastal settings worldwide, facilitating analysis of the effects of climate change on nearshore wave climate.

  3. Skier triggering of backcountry avalanches with skilled route selection

    NASA Astrophysics Data System (ADS)

    Sinickas, Alexandra; Haegeli, Pascal; Jamieson, Bruce

    2015-04-01

    Jamieson (2009) provided numerical estimates for the baseline probabilities of triggering an avalanche by a backcountry skier making fresh tracks without skilled route selection as a function of the North American avalanche danger scale (i.e., hazard levels Low, Moderate, Considerable, High and Extreme). Using the results of an expert survey, he showed that triggering probabilities while skiing directly up, down or across a trigger zone without skilled route selection increase roughly by a factor of 10 with each step of the North American avalanche danger scale (i.e., hazard level). The objective of the present study is to examine the effect of skilled route selection on the relationship between triggering probability and hazard level. To assess this effect, we analysed avalanche hazard assessments as well as reports of skiing activity and triggering of avalanches from 11 Canadian helicopter and snowcat operations during two winters (2012-13 and 2013-14). These reports were submitted to the daily information exchange among Canadian avalanche safety operations, and reflect professional decision-making and route selection practices of guides leading groups of skiers. We selected all skier-controlled or accidentally triggered avalanches with a destructive size greater than size 1 according to the Canadian avalanche size classification, triggered by any member of a guided group (guide or guest). These operations forecast the avalanche hazard daily for each of three elevation bands: alpine, treeline and below treeline. In contrast to the 2009 study, an exposure was defined as a group skiing within any one of the three elevation bands, and consequently within a hazard rating, for the day (~4,300 ratings over two winters). For example, a group that skied below treeline (rated Moderate) and at treeline (rated Considerable) in one day would receive one count for exposure to Moderate hazard and one count for exposure to Considerable hazard. While the absolute values for triggering probability cannot be compared to the 2009 study because of different definitions of exposure, our preliminary results suggest that with skilled route selection the triggering probability is similar at all hazard levels, except for Extreme, for which there are few exposures. This means that the guiding teams of backcountry skiing operations effectively control the triggering hazard through skilled route selection. Groups were exposed relatively evenly to Low hazard (1275 times, or 29% of total exposure), Moderate hazard (1450 times, or 33%) and Considerable hazard (1215 times, or 28%). At higher levels, exposure dropped to roughly 380 times (9% of total exposure) for High hazard and only 13 times (0.3%) for Extreme hazard. We assess the sensitivity of the results to some of our key assumptions.
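
    The factor-of-10 scaling from the 2009 baseline study is easy to tabulate; the Low-level baseline probability used below is an assumed placeholder, not Jamieson's published estimate.

```python
# Triggering probability rising roughly 10x per hazard level for a skier
# crossing a trigger zone without skilled route selection.
base = 1e-5  # assumed probability per exposure at hazard level Low
levels = ["Low", "Moderate", "Considerable", "High", "Extreme"]
for step, level in enumerate(levels):
    print(f"{level:>12}: ~{base * 10**step:.0e} per exposure")
```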

  4. The dread factor: how hazards and safety training influence learning and performance.

    PubMed

    Burke, Michael J; Salvador, Rommel O; Smith-Crowe, Kristin; Chan-Serafin, Suzanne; Smith, Alexis; Sonesh, Shirley

    2011-01-01

    On the basis of hypotheses derived from social and experiential learning theories, we meta-analytically investigated how safety training and workplace hazards impact the development of safety knowledge and safety performance. The results were consistent with an expected interaction between the level of engagement of safety training and hazardous event/exposure severity in the promotion of safety knowledge and performance. For safety knowledge and safety performance, highly engaging training was considerably more effective than less engaging training when hazardous event/exposure severity was high, whereas highly and less engaging training had comparable levels of effectiveness when hazardous event/exposure severity was low. Implications of these findings for theory testing and incorporating information on objective risk into workplace safety research and practice are discussed.

  5. Evaluating the efficacy of a thermal exposure chamber designed for assessing workers' thermal hazard.

    PubMed

    Tsai, Perng-Jy; Lo, Chuh-Lun; Sun, Yih-Min; Juang, Yow-Jer; Liu, Hung-Hsin; Chen, Wang-Yi; Yeh, Wen-Yu

    2003-05-01

    This study was conducted on a thermal exposure chamber designed for assessing workers' thermal hazard. In order to assess the efficacy of the studied chamber, three environmental conditions were selected to simulate high, middle and low thermal impact situations, with air temperatures (Ta) of 43.12, 36.23 and 25.77 °C, globe temperatures (Tg) of 44.41, 41.07 and 29.24 °C, relative humidity (RH) of 77, 59 and 39%, and air flow velocities (Va) of 1.70, 0.91 and 0.25 m/s, respectively. For the three specified thermal impact conditions, results show that the coefficients of variation (CVs) for Ta, Tg, RH and Va measured in the chamber were consistently less than 10%, except for Va under the low thermal impact condition (≈50%). For each specified thermal impact condition, we generated 1,000 environmental combinations using the Monte Carlo simulation approach, according to the variations obtained for the four environmental factors. We directly adopted the ISO 7933 approach to estimate the allowable exposure time (AET) for each simulated environmental condition. The ranges of the 95% confidence intervals (95% CIs) of the estimated AETs for the three specified thermal impact conditions were consistently less than 5 min. We further conducted a sensitivity analysis to examine the effect of the four environmental factors on the estimated AETs, and found Va to be the least important factor for any specified thermal impact condition. In conclusion, although Va showed great variation in the low thermal impact condition, the exposure chamber studied can still be regarded as feasible for assessing workers' thermal hazard.
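
    A sketch of the Monte Carlo step, with an assumed placeholder standing in for the full ISO 7933 calculation (the real standard derives AET from required versus achievable sweat rates): environmental combinations are sampled around the high-impact means with coefficients of variation of roughly 5%, and the spread of the resulting AETs is summarized as a 95% CI.

```python
import numpy as np

rng = np.random.default_rng(1)

def estimate_aet(ta, tg, rh, va):
    """Placeholder for ISO 7933; only mimics 'hotter and more humid
    means a shorter allowable exposure time', in minutes."""
    return 480.0 / (1.0 + 0.15 * (ta - 25.0) + 0.05 * (tg - 25.0)
                    + 2.0 * (rh / 100.0) - 0.5 * va)

# High thermal impact condition: chamber means with assumed 5% CVs.
n = 1000
ta = rng.normal(43.12, 0.05 * 43.12, n)
tg = rng.normal(44.41, 0.05 * 44.41, n)
rh = rng.normal(77.0, 0.05 * 77.0, n)
va = rng.normal(1.70, 0.05 * 1.70, n)

aet = estimate_aet(ta, tg, rh, va)
lo, hi = np.percentile(aet, [2.5, 97.5])
print(f"95% CI of estimated AET: {lo:.1f}-{hi:.1f} min (width {hi - lo:.1f} min)")
```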

  6. Passenger rail security, planning, and resilience: application of network, plume, and economic simulation models as decision support tools.

    PubMed

    Greenberg, Michael; Lioy, Paul; Ozbas, Birnur; Mantell, Nancy; Isukapalli, Sastry; Lahr, Michael; Altiok, Tayfur; Bober, Joseph; Lacy, Clifton; Lowrie, Karen; Mayer, Henry; Rovito, Jennifer

    2013-11-01

    We built three simulation models that can assist rail transit planners and operators to evaluate high and low probability rail-centered hazard events that could lead to serious consequences for rail-centered networks and their surrounding regions. Our key objective is to provide these models to users who, through planning with these models, can prevent events or more effectively react to them. The first of the three models is an industrial systems simulation tool that closely replicates rail passenger traffic flows between New York Penn Station and Trenton, New Jersey. Second, we built and used a line source plume model to trace chemical plumes released by a slow-moving freight train that could impact rail passengers, as well as people in surrounding areas. Third, we crafted an economic simulation model that estimates the regional economic consequences of a variety of rail-related hazard events through the year 2020. Each model can work independently of the others. However, used together they help provide a coherent story about what could happen and set the stage for planning that should make rail-centered transport systems more resistant and resilient to hazard events. We highlight the limitations and opportunities presented by using these models individually or in sequence. © 2013 Society for Risk Analysis.

  7. Passenger Rail Security, Planning, and Resilience: Application of Network, Plume, and Economic Simulation Models as Decision Support Tools

    PubMed Central

    Greenberg, Michael; Lioy, Paul; Ozbas, Birnur; Mantell, Nancy; Isukapalli, Sastry; Lahr, Michael; Altiok, Tayfur; Bober, Joseph; Lacy, Clifton; Lowrie, Karen; Mayer, Henry; Rovito, Jennifer

    2014-01-01

    We built three simulation models that can assist rail transit planners and operators to evaluate high and low probability rail-centered hazard events that could lead to serious consequences for rail-centered networks and their surrounding regions. Our key objective is to provide these models to users who, through planning with these models, can prevent events or more effectively react to them. The first of the three models is an industrial systems simulation tool that closely replicates rail passenger traffic flows between New York Penn Station and Trenton, New Jersey. Second, we built and used a line source plume model to trace chemical plumes released by a slow-moving freight train that could impact rail passengers, as well as people in surrounding areas. Third, we crafted an economic simulation model that estimates the regional economic consequences of a variety of rail-related hazard events through the year 2020. Each model can work independently of the others. However, used together they help provide a coherent story about what could happen and set the stage for planning that should make rail-centered transport systems more resistant and resilient to hazard events. We highlight the limitations and opportunities presented by using these models individually or in sequence. PMID:23718133

  8. Expanding CyberShake Physics-Based Seismic Hazard Calculations to Central California

    NASA Astrophysics Data System (ADS)

    Silva, F.; Callaghan, S.; Maechling, P. J.; Goulet, C. A.; Milner, K. R.; Graves, R. W.; Olsen, K. B.; Jordan, T. H.

    2016-12-01

    As part of its program of earthquake system science, the Southern California Earthquake Center (SCEC) has developed a simulation platform, CyberShake, to perform physics-based probabilistic seismic hazard analysis (PSHA) using 3D deterministic wave propagation simulations. CyberShake performs PSHA by first simulating a tensor-valued wavefield of Strain Green Tensors. CyberShake then takes an earthquake rupture forecast and extends it by varying the hypocenter location and slip distribution, resulting in about 500,000 rupture variations. Seismic reciprocity is used to calculate synthetic seismograms for each rupture variation at each computation site. These seismograms are processed to obtain intensity measures, such as spectral acceleration, which are then combined with probabilities from the earthquake rupture forecast to produce a hazard curve. Hazard curves are calculated at seismic frequencies up to 1 Hz for hundreds of sites in a region and the results interpolated to obtain a hazard map. In developing and verifying CyberShake, we have focused our modeling in the greater Los Angeles region. We are now expanding the hazard calculations into Central California. Using workflow tools running jobs across two large-scale open-science supercomputers, NCSA Blue Waters and OLCF Titan, we calculated 1-Hz PSHA results for over 400 locations in Central California. For each location, we produced hazard curves using both a 3D central California velocity model created via tomographic inversion, and a regionally averaged 1D model. These new results provide low-frequency exceedance probabilities for the rapidly expanding metropolitan areas of Santa Barbara, Bakersfield, and San Luis Obispo, and lend new insights into the effects of directivity-basin coupling associated with basins juxtaposed to major faults such as the San Andreas. Particularly interesting are the basin effects associated with the deep sediments of the southern San Joaquin Valley. We will compare hazard estimates from the 1D and 3D models, summarize the challenges of expanding CyberShake to a new geographic region, and describe our future CyberShake plans.
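
    The core hazard-curve combination can be sketched compactly: for each rupture, the fraction of its variations whose simulated intensity measure exceeds a given level is weighted by the rupture's annual rate. The rates and spectral-acceleration values below are illustrative stand-ins, not CyberShake outputs.

```python
import numpy as np

def hazard_curve(rupture_rates, ims_per_rupture, im_levels):
    """Annual exceedance rate lambda(IM > x) = sum_r rate_r * P(IM > x | r),
    where P(IM > x | r) is the fraction of rupture r's variations whose
    simulated intensity measure exceeds x."""
    lam = np.zeros(len(im_levels))
    for rate, ims in zip(rupture_rates, ims_per_rupture):
        ims = np.asarray(ims)
        for i, x in enumerate(im_levels):
            lam[i] += rate * np.mean(ims > x)
    return lam

# Two illustrative ruptures: annual rates and simulated SA values (g)
# from their hypocenter/slip variations.
rates = [0.01, 0.002]
ims = [[0.05, 0.08, 0.12, 0.20], [0.15, 0.30, 0.45, 0.60]]
levels = np.array([0.1, 0.2, 0.4])

lam = hazard_curve(rates, ims, levels)
# Exceedance probability over 50 years under a Poisson assumption.
print(1.0 - np.exp(-lam * 50.0))
```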

  9. Comparative hazard analysis and toxicological modeling of diverse nanomaterials using the embryonic zebrafish (EZ) metric of toxicity

    NASA Astrophysics Data System (ADS)

    Harper, Bryan; Thomas, Dennis; Chikkagoudar, Satish; Baker, Nathan; Tang, Kaizhi; Heredia-Langner, Alejandro; Lins, Roberto; Harper, Stacey

    2015-06-01

    The integration of rapid assays, large datasets, informatics, and modeling can overcome current barriers in understanding nanomaterial structure-toxicity relationships by providing a weight-of-the-evidence mechanism to generate hazard rankings for nanomaterials. Here, we present the use of a rapid, low-cost assay to perform screening-level toxicity evaluations of nanomaterials in vivo. Calculated EZ Metric scores, a combined measure of morbidity and mortality in developing embryonic zebrafish, were established at realistic exposure levels and used to develop a hazard ranking of diverse nanomaterial toxicity. Hazard ranking and clustering analysis of 68 diverse nanomaterials revealed distinct patterns of toxicity related to both the core composition and outermost surface chemistry of nanomaterials. The resulting clusters guided the development of a surface chemistry-based model of gold nanoparticle toxicity. Our findings suggest that risk assessments based on the size and core composition of nanomaterials alone may be wholly inappropriate, especially when considering complex engineered nanomaterials. Research should continue to focus on methodologies for determining nanomaterial hazard based on multiple sub-lethal responses following realistic, low-dose exposures, thus increasing the availability of quantitative measures of nanomaterial hazard to support the development of nanoparticle structure-activity relationships.

  10. A probabilistic storm surge risk model for the German North Sea and Baltic Sea coast

    NASA Astrophysics Data System (ADS)

    Grabbert, Jan-Henrik; Reiner, Andreas; Deepen, Jan; Rodda, Harvey; Mai, Stephan; Pfeifer, Dietmar

    2010-05-01

    The German North Sea coast is highly exposed to storm surges. Due to its concave, bay-like shape, oriented mainly to the north-west, cyclones from westerly, north-westerly and northerly directions together with the astronomical tide cause storm surges that accumulate water in the German Bight. Because widespread low-lying areas (below 5 m above mean sea level) lie behind the defenses, large areas with large economic values are exposed to coastal flooding, including cities such as Hamburg and Bremen. The occurrence of extreme storm surges in the past, such as the 1962 event that took about 300 lives and caused widespread flooding, and the 1976 event, raised awareness and led to a redesign of the coastal defenses, which provide a good level of protection for today's conditions. Nevertheless, the risk of flooding exists, and an amplification of storm surge risk can be expected under the influence of climate change. The Baltic Sea coast is also exposed to storm surges, which are caused by other meteorological patterns. The influence of the astronomical tide is quite low; instead, high water levels are induced by strong winds alone. Since the exceptionally extreme event of 1872, the storm surge hazard has been more or less forgotten. Although such an event is very unlikely to happen, it is not impossible. Storm surge risk is currently (almost) non-insurable in Germany. The potential risk is difficult to quantify, as almost no historical losses are available, and premiums are therefore difficult to assess. A new storm surge risk model is therefore being developed to provide a basis for a probabilistic quantification of potential losses from coastal inundation. The model is funded by the GDV (German Insurance Association) and is planned to be used within the German insurance sector; results might also inform a discussion of insurance cover for storm surge. The model consists of a probabilistic, event-driven hazard module and a vulnerability module, plus an exposure interface and a financial module to account for specific (re-)insurance conditions. This contribution concentrates mainly on the hazard module. The hazard is covered by an event simulation engine enabling Monte Carlo simulations, with event generation done on the fly. A classification of historical storm surges is used, based on observed sea water levels at gauging stations and extensive literature research. To characterize the origin of storm events and the surges they cause, meteorological parameters such as wind speed and wind direction are also used. Where high water levels along the coast are mainly caused by strong wind from particular directions, as observed at the North Sea, there is a clear empirical relationship between wind and surge (where surge is defined as the wind-driven component of the sea water level), which can be described by the ATWS (Average Transformed Wind Speed). The parameters forming the load on the coastal defense elements are the water level and wave parameters such as significant wave height, wave period and wave direction. To assess the wave characteristics at the coast, the numerical model SWAN (Simulating WAves Nearshore) from TU Delft has been used. To account for different probabilities of failure and inundation, the coast is split into segments with similar defense characteristics, such as type of defense, height, width and orientation. The chosen approach covers the most relevant failure mechanisms for coastal dikes, induced by wave overtopping and overflow. Dune failure is also considered in the model.
    Inundation of the hinterland after defense failure is modeled using a simple dynamic 2D approach, resulting in distributed water depths and flood outlines for each segment. Losses can be estimated from the input exposure data, either coordinate-based for single buildings or aggregated at postal-code level, using a set of depth-damage functions.
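
    A rough sketch of the wind-surge step described above: an empirical power-law relation between a transformed wind speed (standing in for the ATWS) and surge is fitted, then applied inside a Monte Carlo event loop with an additive tide term. All coefficients and distributions are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic calibration pairs of transformed wind speed and surge (m);
# in the model this relation is built from gauge records, not random draws.
atws = rng.uniform(5.0, 30.0, 200)
surge = 0.02 * atws**1.5 + rng.normal(0.0, 0.15, 200)

# Fit surge = a * ATWS^b by linear regression in log space.
b, log_a = np.polyfit(np.log(atws), np.log(np.clip(surge, 0.05, None)), 1)
a = np.exp(log_a)
print(f"surge ~ {a:.3f} * ATWS^{b:.2f}")

# Monte Carlo events: sample ATWS, map to surge, add a crude tide term.
atws_events = rng.uniform(5.0, 35.0, 10_000)
water_level = a * atws_events**b + rng.normal(0.0, 0.4, 10_000)
print(f"1-in-100 event water level: {np.quantile(water_level, 0.99):.2f} m")
```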

  11. Statistical analysis of the uncertainty related to flood hazard appraisal

    NASA Astrophysics Data System (ADS)

    Notaro, Vincenza; Freni, Gabriele

    2015-12-01

    The estimation of flood hazard frequency statistics for an urban catchment is of great interest in practice. It provides an evaluation of potential flood risk and related damage and supports decision making for flood risk management. Flood risk is usually defined as a function of the probability that a system deficiency causes flooding (hazard) and the expected damage due to the flooding magnitude (damage), taking into account both the exposure and the vulnerability of the goods at risk. The expected flood damage can be evaluated by an a priori estimation of potential damage caused by flooding or by interpolating real damage data. With regard to flood hazard appraisal, several procedures propose identifying a hazard indicator (HI), such as flood depth or the combination of flood depth and velocity, and assessing the flood hazard of the analyzed area by comparing the HI variables with user-defined threshold values or curves (penalty curves or matrices). However, flooding data are usually too scarce or piecemeal to allow a reliable flood hazard analysis; hazard analysis is therefore often performed by means of mathematical simulations aimed at evaluating water levels and flow velocities over the catchment surface. As a result, a great part of the uncertainty intrinsic to flood risk appraisal can be traced to the hazard evaluation, owing to the uncertainty inherent in modeling results and to the subjectivity of the user-defined hazard thresholds that link flood depth to a hazard level. In the present work, a statistical methodology is proposed for evaluating and reducing the uncertainties connected with hazard level estimation. The methodology has been applied to a real urban watershed as a case study.
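
    A compact illustration of the thresholding step and of the subjectivity the paper targets: simulated flood depths are classified with user-defined cut-offs, and perturbing those cut-offs shows how often hazard classes flip. The thresholds and depths are assumed values.

```python
import numpy as np

# User-defined cut-offs linking flood depth (m) to a hazard level; the
# paper's point is that these thresholds are subjective.
thresholds = np.array([0.1, 0.5, 1.0])
labels = np.array(["negligible", "low", "medium", "high"])

depths = np.array([0.05, 0.3, 0.7, 1.8])  # depths from the hydraulic model
hazard = labels[np.digitize(depths, thresholds)]
print(dict(zip(depths, hazard)))

# Perturb the thresholds to see how often any cell's class changes,
# mirroring the paper's statistical treatment of this uncertainty.
rng = np.random.default_rng(0)
flips = 0
for _ in range(1000):
    perturbed = np.sort(thresholds * rng.normal(1.0, 0.2, 3))
    flips += np.any(labels[np.digitize(depths, perturbed)] != hazard)
print(f"classification changed in {flips / 10:.1f}% of trials")
```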

  12. A conceptual framework for economic optimization of single hazard surveillance in livestock production chains.

    PubMed

    Guo, Xuezhen; Claassen, G D H; Oude Lansink, A G J M; Saatkamp, H W

    2014-06-01

    Economic analysis of hazard surveillance in livestock production chains is essential for surveillance organizations (such as food safety authorities) when making scientifically based decisions on optimization of resource allocation. To enable this, quantitative decision support tools are required at two levels of analysis: (1) single-hazard surveillance system and (2) surveillance portfolio. This paper addresses the first level by presenting a conceptual approach for the economic analysis of single-hazard surveillance systems. The concept includes objective and subjective aspects of single-hazard surveillance system analysis: (1) a simulation part to derive an efficient set of surveillance setups based on the technical surveillance performance parameters (TSPPs) and the corresponding surveillance costs, i.e., objective analysis, and (2) a multi-criteria decision making model to evaluate the impacts of the hazard surveillance, i.e., subjective analysis. The conceptual approach was checked for (1) conceptual validity and (2) data validity. Issues regarding the practical use of the approach, particularly the data requirement, were discussed. We concluded that the conceptual approach is scientifically credible for economic analysis of single-hazard surveillance systems and that the practicability of the approach depends on data availability. Copyright © 2014 Elsevier B.V. All rights reserved.

  13. 41 CFR 109-42.1102-52 - Low level contaminated personal property.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    41 CFR § 109-42.1102-52, Low level contaminated personal property, under Subpart 109-42.11, Special Types of Hazardous Material and Certain Categories of Property. Title 41, Public Contracts and Property Management, Volume 3 (2012).

  14. Flood hazard mapping of Palembang City by using 2D model

    NASA Astrophysics Data System (ADS)

    Farid, Mohammad; Marlina, Ayu; Kusuma, Muhammad Syahril Badri

    2017-11-01

    Palembang, the capital city of South Sumatera Province, is one of the metropolitan cities in Indonesia that flood almost every year. Flooding in the city is closely related to the Musi River Basin. According to the Indonesian National Agency for Disaster Management (BNPB), the level of flood hazard is high. Many natural factors cause flooding in the city, such as high-intensity rainfall, inadequate drainage capacity, and backwater flow due to spring tide. Furthermore, anthropogenic factors such as population increase, land cover/use change, and garbage problems make the flooding worse. The objective of this study is to develop a flood hazard map of Palembang City using a two-dimensional model. HEC-RAS 5.0 is used as the modelling tool and is verified with field observation data. The flood simulation covers 21 sub-catchments of the Musi River Basin. The flood hazard levels follow BNPB Head Regulation No. 2 of 2012 on general guidelines for disaster risk assessment. The result for the 25-year return period flood shows an inundation area of 112.47 km², with 14 sub-catchments categorized at a high hazard level. It is expected that the hazard map can be used for risk assessment.

  15. Fleeing to Fault Zones: Incorporating Syrian Refugees into Earthquake Risk Analysis along the East Anatolian and Dead Sea Rift Fault Zones

    NASA Astrophysics Data System (ADS)

    Wilson, B.; Paradise, T. R.

    2016-12-01

    The influx of millions of Syrian refugees into Turkey has rapidly changed the population distribution along the Dead Sea Rift and East Anatolian Fault zones. In contrast to other countries in the Middle East where refugees are accommodated in camp environments, the majority of displaced individuals in Turkey are integrated into cities, towns, and villages—placing stress on urban settings and increasing potential exposure to strong shaking. Yet, displaced populations are not traditionally captured in data sources used in earthquake risk analysis or loss estimations. Accordingly, we present a district-level analysis assessing the spatial overlap of earthquake hazards and refugee locations in southeastern Turkey to determine how migration patterns are altering seismic risk in the region. Using migration estimates from the U.S. Humanitarian Information Unit, we create three district-level population scenarios that combine official population statistics, refugee camp populations, and low, median, and high bounds for integrated refugee populations. We perform probabilistic seismic hazard analysis alongside these population scenarios to map spatial variations in seismic risk between 2011 and late 2015. Our results show a significant relative southward increase of seismic risk for this period due to refugee migration. Additionally, we calculate earthquake fatalities for simulated earthquakes using a semi-empirical loss estimation technique to determine degree of under-estimation resulting from forgoing migration data in loss modeling. We find that including refugee populations increased casualties by 11-12% using median population estimates, and upwards of 20% using high population estimates. These results communicate the ongoing importance of placing environmental hazards in their appropriate regional and temporal context which unites physical, political, cultural, and socio-economic landscapes. Keywords: Earthquakes, Hazards, Loss-Estimation, Syrian Crisis, Migration, Refugees
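
    The under-estimation arithmetic can be reproduced schematically; the district population, refugee counts, and per-capita fatality rate below are illustrative assumptions chosen only to mirror the reported 11-20% range.

```python
# Fatality estimates with and without refugee populations for one
# scenario earthquake (all numbers assumed for illustration).
fatality_rate = 0.002  # assumed deaths per capita at the scenario shaking level
official = 1_500_000   # illustrative official district population
refugee_scenarios = {"low": 120_000, "median": 165_000, "high": 300_000}

base = fatality_rate * official
for name, refugees in refugee_scenarios.items():
    with_refugees = fatality_rate * (official + refugees)
    increase = 100.0 * (with_refugees - base) / base
    print(f"{name:>6} refugee estimate: +{increase:.0f}% fatalities")
```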

  16. A Review of Hazard Anticipation Training Programs for Young Drivers

    PubMed Central

    McDonald, Catherine C.; Goodwin, Arthur H.; Pradhan, Anuj K.; Romoser, Matthew R.E.; Williams, Allan F.

    2015-01-01

    Purpose Poor hazard anticipation skills are a risk factor associated with high motor vehicle crash rates of young drivers. A number of programs have been developed to improve these skills. The purpose of this review was to assess the empirical literature on hazard anticipation training for young drivers. Methods Studies were included if they: 1) included an assessment of hazard anticipation training outcomes; 2) were published between January 1, 1980 and December 31, 2013 in an English language peer-reviewed journal or conference proceeding; and 3) included at least one group that uniquely comprised a cohort of participants <21 years. Nineteen studies met inclusion criteria. Results Studies used a variety of training methods including interactive computer programs, videos, simulation, commentary driving, or a combination of approaches. Training effects were predominantly measured through computer-based testing and driving simulation with eye tracking. Four studies included an on-road evaluation. Most studies evaluated short-term outcomes (immediate or few days). In all studies, young drivers showed improvement in selected hazard anticipation outcomes, but none investigated crash effects. Conclusions Although there is promise in existing programs, future research should include long-term follow up, evaluate crash outcomes, and assess the optimal timing of hazard anticipation training taking into account the age and experience level of young drivers. PMID:26112734

  17. The Eyes Have It.

    ERIC Educational Resources Information Center

    Walsh, Janet

    1982-01-01

    Discusses issues related to possible health hazards associated with viewing video display terminals. Includes some findings of the 1979 NIOSH report on Potential Hazards of Video Display Terminals indicating level of radiation emitted is low and providing recommendations related to glare and back pain/muscular fatigue problems. (JN)

  18. In situ simulated cardiac arrest exercises to detect system vulnerabilities.

    PubMed

    Barbeito, Atilio; Bonifacio, Alberto; Holtschneider, Mary; Segall, Noa; Schroeder, Rebecca; Mark, Jonathan

    2015-06-01

    Sudden cardiac arrest is the leading cause of death in the United States. Despite new therapies, progress in this area has been slow, and outcomes remain poor even in the hospital setting, where providers, drugs, and devices are readily available. This is partly attributed to the quality of resuscitation, which is an important determinant of survival for patients who experience cardiac arrest. Systems problems, such as deficiencies in the physical space or equipment design, hospital-level policies, work culture, and poor leadership and teamwork, are now known to contribute significantly to the quality of resuscitation provided. We describe an in situ simulation-based quality improvement program that was designed to continuously monitor the cardiac arrest response process for hazards and defects and to detect opportunities for system optimization. A total of 72 simulated unannounced cardiac arrest exercises were conducted between October 2010 and September 2013 at various locations throughout our medical center and at different times of the day. We detected several environmental, human-machine interface, culture, and policy hazards and defects. We used the Systems Engineering Initiative for Patient Safety (SEIPS) model to understand the structure, processes, and outcomes related to the hospital's emergency response system. Multidisciplinary solutions were crafted for each of the hazards detected, and the simulation program was used to iteratively test the redesigned processes before implementation in real clinical settings. We describe an ongoing program that uses in situ simulation to identify and mitigate latent hazards and defects in the hospital emergency response system. The SEIPS model provides a framework for describing and analyzing the structure, processes, and outcomes related to these events.

  19. Biochemical process of low level radioactive liquid simulation waste containing detergent

    NASA Astrophysics Data System (ADS)

    Kundari, Noor Anis; Putra, Sugili; Mukaromah, Umi

    2015-12-01

    Research on the biochemical processing of low level radioactive liquid waste containing detergent has been carried out. These organic liquid wastes are generated in nuclear facilities, for example by laundry operations. The wastes, which are categorized as hazardous and toxic materials, are also radioactive. They must be treated properly, by detoxifying the hazardous constituents and decontaminating the radionuclides, to ensure that disposal of the waste meets the standard quality requirements for water. This research was intended to determine the decontamination factor and separation efficiencies and their kinetics, and to produce a supernatant that meets the environmental quality standard. The radioactive element in the waste was thorium with an activity of 5×10⁻⁵ Ci/m³. The simulated radioactive liquid waste containing detergent was processed by an aerobic biochemical process using SGB 103 bacteria in a batch reactor equipped with aerators. Two samples of different concentration were processed and analyzed for 212 hours and 183 hours, respectively, at room temperature. The products of this process are a liquid phase called supernatant and a solid phase called sludge. The chemical oxygen demand (COD), biological oxygen demand (BOD), suspended solids (SS), and alpha activity were analyzed. The results show that the decontamination factor and the separation efficiency of the lower concentration sample are higher than those of the sample with high concentration. The decontamination factor after 212 hours of processing the waste with a detergent concentration of 1.496 g/L was 3.496, whereas at a detergent concentration of 0.748 g/L it was 15.305 after 183 hours of processing. The separation efficiencies for the two samples were 71.396% and 93.465%, respectively. The bacterial growth kinetics follow Monod's model, and the decrease of COD and BOD was first order with a rate constant of 0.01 h⁻¹.
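
    The reported first-order kinetics imply a simple exponential decay, C(t) = C0 exp(-kt); the sketch below evaluates it at the study's rate constant, with a normalized initial concentration assumed for illustration.

```python
import numpy as np

k = 0.01                                        # first-order rate constant, 1/h
t = np.array([0.0, 50.0, 100.0, 183.0, 212.0])  # treatment times, h
c0 = 1.0                                        # initial COD (or BOD), normalized

c = c0 * np.exp(-k * t)                         # C(t) = C0 * exp(-k t)
for ti, ri in zip(t, 100.0 * (1.0 - c / c0)):
    print(f"t = {ti:5.0f} h -> removal {ri:5.1f}%")
```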

  20. Biochemical process of low level radioactive liquid simulation waste containing detergent

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kundari, Noor Anis, E-mail: nooranis@batan.go.id; Putra, Sugili; Mukaromah, Umi

    Research on the biochemical processing of low level radioactive liquid waste containing detergent has been carried out. These organic liquid wastes are generated in nuclear facilities, for example by laundry operations. The wastes, which are categorized as hazardous and toxic materials, are also radioactive. They must be treated properly, by detoxifying the hazardous constituents and decontaminating the radionuclides, to ensure that disposal of the waste meets the standard quality requirements for water. This research was intended to determine the decontamination factor and separation efficiencies and their kinetics, and to produce a supernatant that meets the environmental quality standard. The radioactive element in the waste was thorium with an activity of 5×10⁻⁵ Ci/m³. The simulated radioactive liquid waste containing detergent was processed by an aerobic biochemical process using SGB 103 bacteria in a batch reactor equipped with aerators. Two samples of different concentration were processed and analyzed for 212 hours and 183 hours, respectively, at room temperature. The products of this process are a liquid phase called supernatant and a solid phase called sludge. The chemical oxygen demand (COD), biological oxygen demand (BOD), suspended solids (SS), and alpha activity were analyzed. The results show that the decontamination factor and the separation efficiency of the lower concentration sample are higher than those of the sample with high concentration. The decontamination factor after 212 hours of processing the waste with a detergent concentration of 1.496 g/L was 3.496, whereas at a detergent concentration of 0.748 g/L it was 15.305 after 183 hours of processing. The separation efficiencies for the two samples were 71.396% and 93.465%, respectively. The bacterial growth kinetics follow Monod's model, and the decrease of COD and BOD was first order with a rate constant of 0.01 h⁻¹.

  1. An updated numerical simulation of the ground-water flow system for the Castle Lake debris dam, Mount St. Helens, Washington, and implications for dam stability against heave

    USGS Publications Warehouse

    Roeloffs, Evelyn A.

    1994-01-01

    A numerical simulation of the ground-water flow system in the Castle Lake debris dam, calibrated to data from the 1991 and 1992 water years, was used to estimate factors of safety against heave and internal erosion. The Castle Lake debris dam, 5 miles northwest of the summit of Mount St. Helens, impounds 19,000 acre-ft of water that could pose a flood hazard in the event of a lake breakout. A new topographic map of the Castle Lake area prior to the 1980 eruption of Mount St. Helens was prepared and used to calculate the thickness of the debris avalanche deposits that compose the dam. Water levels in 22 piezometers and discharges from seeps on the dam face, measured several times per year beginning in 1990, supplemented measurements in 11 piezometers and less frequent seep discharge measurements made since 1983. Observations in one group of piezometers reveal heads above the land surface and head gradients favoring upward flow that correspond to factors of safety only slightly greater than 2. The steady-state ground-water flow system in the debris dam was simulated using a three-dimensional finite-difference computer program. A uniform, isotropic model having the same shape as the dam and a hydraulic conductivity of 1.55 ft/day simulates the correct water level at half the observation points, but is in error by 10 ft or more at other points. Spatial variations of hydraulic conductivity were required to calibrate the model. The model analysis suggests that ground water flows in both directions between the debris dam and Castle Lake. Factors of safety against heave and internal erosion were calculated where the model simulated upward flow of ground water. A critical-gradient analysis yields factors of safety as low as 2 near the piezometers where water level observations indicate low factors of safety. Low safety factors are also computed near Castle Creek, where slumping was caused by a storm in January 1990. If hydraulic property contrasts are present in areas of the debris dam unsampled by piezometers, then low safety factors may exist that are not evident in the numerical model analysis. Numerical model simulations showed that lowering Castle Lake by 40 feet increases many factors of safety by 0.1, but increases greater than 1 are limited to the area of 1990 slumping.
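
    The heave check rests on the critical-gradient comparison i_c = (gamma_sat - gamma_w) / gamma_w against the simulated upward exit gradient, FS = i_c / i_exit. The unit weight and head-gradient values below are assumed, chosen only to reproduce a factor of safety near the reported 2.

```python
# Critical-gradient factor of safety against heave (illustrative values).
gamma_sat = 19.0  # saturated unit weight of the debris, kN/m^3 (assumed)
gamma_w = 9.81    # unit weight of water, kN/m^3

i_critical = (gamma_sat - gamma_w) / gamma_w  # ~0.94

# Upward exit gradient from simulated heads: head loss over flow-path length.
dh, path = 1.9, 4.0  # m of head loss over m of upward flow path (assumed)
i_exit = dh / path

print(f"factor of safety against heave: {i_critical / i_exit:.2f}")
```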

  2. Comparing headphone and speaker effects on simulated driving.

    PubMed

    Nelson, T M; Nilsson, T H

    1990-12-01

    Twelve persons drove for three hours in an automobile simulator while listening to music at a sound level of 63 dB, over stereo headphones during one session and from a dashboard speaker during another session. They were required to steer a mountain highway, maintain a certain indicated speed, shift gears, and respond to occasional hazards. Steering and speed control depended on visual cues; the need to shift and the hazards were indicated by sound and vibration effects. With the headphones, the driver's average reaction time for the most complex task presented (shifting gears) was about one-third of a second longer than with the speaker. The use of headphones did not delay the development of subjective fatigue.

  3. Meteorological aspects associated with dust storms in the Sistan region, southeastern Iran

    NASA Astrophysics Data System (ADS)

    Kaskaoutis, D. G.; Rashki, A.; Houssos, E. E.; Mofidi, A.; Goto, D.; Bartzokas, A.; Francois, P.; Legrand, M.

    2015-07-01

    Dust storms are natural hazards that seriously affect atmospheric conditions, ecosystems and human health. A key requirement for investigating the dust life cycle is the analysis of the meteorological (synoptic and dynamic) processes that control dust emission, uplift and transport. The present work focuses on examining the synoptic and dynamic meteorological conditions associated with dust storms in the Sistan region, southeastern Iran, during the summer season (June-September) of the years 2001-2012. The dust-storm days (356 in total) are identified from visibility records below 1 km at the Zabol meteorological station, located near the dust source. RegCM4 model simulations indicate that the intense northerly Levar wind, the strong surface heating and the valley-like characteristics of the region strongly affect the meteorological dynamics and the formation of a low-level jet, which are strongly linked with dust outbreaks. The intra-annual evolution of the dust storms does not seem to be significantly associated with the El Niño-Southern Oscillation, despite the fact that most of the dust storms are related to positive values of the Oceanic Niño Index. National Centers for Environmental Prediction/National Center for Atmospheric Research reanalysis suggests that the dust storms are associated with low sea-level pressure conditions over the whole of south Asia, while at the 700 hPa level a trough of low geopotential heights over India, along with a ridge over Arabia and central Iran, is the common scenario. A significant finding is that the dust storms over Sistan are associated with a pronounced intensification of the anticyclone over the Caspian Sea, enhancing the west-to-east pressure gradient and, therefore, the blowing of the Levar. Infrared Difference Dust Index values highlight the intensity of the Sistan dust storms, while the SPRINTARS model simulates the dust loading and concentration reasonably well, since the dust storms are usually associated with peaks in the model simulations.

  4. Geophysical Tools, Challenges and Perspectives Related to Natural Hazards, Climate Change and Food Security

    NASA Astrophysics Data System (ADS)

    Fucugauchi, J. U.

    2013-05-01

    In the coming decades a changing climate and natural hazards will likely increase the vulnerability of agricultural and other food production infrastructures, posing increasing threats to industrialized and developing economies. While food security concerns affect us globally, the huge differences among countries in stocks, population size, poverty levels, economy, technologic development, transportation, health care systems and basic infrastructure will pose a much larger burden on populations in the developing and less developed world. In these economies, increases in the magnitude, duration and frequency of droughts, floods, hurricanes, rising sea levels, heat waves, thunderstorms, freezing events and other phenomena will impose severe costs on the population. For this presentation, we concentrate on a geophysical perspective of the problems, the tools available, the challenges, and short- and long-term perspectives. In many instances, a range of natural hazards are treated as unforeseen catastrophes that strike suddenly and without warning, resulting in major losses. Although the forecasting capacity in the different situations arising from climate change and natural hazards is still limited, there is a range of tools available to assess scenarios and forecast models for developing and implementing better mitigation strategies and prevention programs. Earth observation systems, geophysical instrumental networks, satellite observatories, improved understanding of phenomena, expanded global and regional databases, geographic information systems, higher capacity for computer modeling, numerical simulations, etc., provide a scientific-technical framework for developing strategies. Hazard prevention and mitigation programs will entail high costs globally; however, the major costs and challenges concentrate on the less developed economies already affected by poverty, famines, health problems, social inequalities, poor infrastructure, low life expectancy, high population growth, inadequate education systems, immigration, economic crises, conflicts and other issues. Case history analyses and proposals for collaboration programs, know-how transfer and better use of geophysical tools, data, observatories and monitoring networks will be discussed.

  5. A high-resolution physically-based global flood hazard map

    NASA Astrophysics Data System (ADS)

    Kaheil, Y.; Begnudelli, L.; McCollum, J.

    2016-12-01

    We present the results from a physically-based global flood hazard model. The model uses a physically-based hydrologic model to simulate river discharges, and a 2D hydrodynamic model to simulate inundation. The model is set up such that it allows large-scale flood hazard assessment through efficient use of parallel computing. For hydrology, we use the Hillslope River Routing (HRR) model. HRR accounts for surface hydrology using Green-Ampt parameterization. The model is calibrated against observed discharge data from the Global Runoff Data Centre (GRDC) network, among other publicly-available datasets. The parallel-computing framework takes advantage of the river network structure to minimize cross-processor messages, and thus significantly increases computational efficiency. For inundation, we implemented a computationally-efficient 2D finite-volume model with wetting/drying. The approach consists of simulating floods along the river network by forcing the hydraulic model with the streamflow hydrographs simulated by HRR, scaled up to certain return levels, e.g. 100 years. The model is distributed such that each available processor takes the next simulation. Given an approximate cost criterion, the simulations are ordered from most-demanding to least-demanding to ensure that all processors finish almost simultaneously. Upon completing all simulations, the maximum envelope of flood depth is taken to generate the final map. The model is applied globally, with selected results shown from different continents and regions. The maps shown depict flood depth and extent at different return periods. These maps, which are currently available at 3 arc-sec resolution (~90 m), can be made available at higher resolutions where high-resolution DEMs are available. The maps can be utilized by flood risk managers at the national, regional, and even local levels to further understand their flood risk exposure, exercise certain measures of mitigation, and/or transfer the residual risk financially through flood insurance programs.
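
    The processor assignment strategy described above is, in essence, a greedy longest-processing-time schedule. A minimal sketch follows, assuming hypothetical per-simulation cost estimates (the authors describe their cost criterion only as "approximate"):

```python
import heapq

def schedule_lpt(costs, n_procs):
    """Greedy longest-processing-time schedule: jobs sorted from most to
    least demanding; each job goes to the processor that frees up first."""
    # (finish_time, processor_id) min-heap; all processors start idle
    procs = [(0.0, p) for p in range(n_procs)]
    heapq.heapify(procs)
    assignment = {}
    for job, cost in sorted(enumerate(costs), key=lambda jc: -jc[1]):
        finish, p = heapq.heappop(procs)      # processor that finishes first
        assignment[job] = p
        heapq.heappush(procs, (finish + cost, p))
    makespan = max(t for t, _ in procs)       # time when the last processor ends
    return assignment, makespan

# Hypothetical per-reach cost estimates (e.g., wet cells x time steps):
costs = [12.0, 3.5, 9.1, 1.2, 7.7, 15.3, 2.4]
assignment, makespan = schedule_lpt(costs, n_procs=3)
print(assignment, makespan)
```

    Sorting from most- to least-demanding before greedy assignment is what makes all processors finish at nearly the same time: the small jobs fill in the remaining gaps at the end.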

  6. Pyroclastic flow hazard at Volcán Citlaltépetl

    USGS Publications Warehouse

    Sheridan, Michael F.; Hubbard, Bernard E.; Carrasco-Nunez, Gerardo; Siebe, Claus

    2004-01-01

    Volcán Citlaltépetl (Pico de Orizaba), with an elevation of 5,675 m, is the highest volcano in North America. Its most recent catastrophic events involved the production of pyroclastic flows that erupted approximately 4,000, 8,500, and 13,000 years ago. The distribution of mapped deposits from these eruptions gives an approximate guide to the extent of products from potential future eruptions. Because the topography of this volcano is constantly changing, computer simulations were made on the present topography using three computer algorithms: energy cone, FLOW2D, and FLOW3D. The Heim Coefficient (μ), used as a code parameter for frictional sliding in all our algorithms, is the ratio of the assumed drop in elevation (H) to the lateral extent of the mapped deposits (L). The viscosity parameter for the FLOW2D and FLOW3D codes was adjusted so that the paths of the flows mimicked those inferred from the mapped deposits. We modeled two categories of pyroclastic flows, corresponding to level I and level II events. Level I pyroclastic flows correspond to small but more frequent block-and-ash flows that remain on the main cone. Level II flows correspond to more widespread flows from catastrophic eruptions with an approximate 4,000-year repose period. We developed hazard maps from simulations based on a National Imagery and Mapping Agency (NIMA) DTED-1 DEM with a 90 m grid and a vertical accuracy of ±30 m. Because realistic visualization is an important aid to understanding the risks related to volcanic hazards, we present the DEM as modeled by FLOW3D. The model shows that the pyroclastic flows extend for much greater distances to the east of the volcano summit, where the topographic relief is nearly 4,300 m. This study was used to plot hazard zones for pyroclastic flows in the official hazard map that was published recently.
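
    The energy-cone criterion can be made concrete with a one-dimensional sketch: a flow launched from the summit can reach any point where the energy line, dropping from the source elevation with slope μ = H/L, is still at or above the terrain. The profile and μ value below are hypothetical illustrations, not the calibrated values from the study:

```python
import numpy as np

def energy_cone_runout(x, z, source_idx, mu):
    """Flag profile points reachable under the energy-cone model: the energy
    line drops from the source elevation with slope mu = H/L, and a point is
    inundated while the line stays at or above the terrain."""
    z_src = z[source_idx]
    reached = np.zeros_like(z, dtype=bool)
    for i in range(source_idx, len(x)):
        energy_line = z_src - mu * (x[i] - x[source_idx])
        if energy_line < z[i]:
            break                   # flow stops where the cone meets terrain
        reached[i] = True
    return reached

# Hypothetical downslope profile and Heim coefficient
x = np.linspace(0.0, 20000.0, 201)        # horizontal distance [m]
z = 5675.0 * np.exp(-x / 6000.0)          # idealized volcano flank [m]
print(energy_cone_runout(x, z, source_idx=0, mu=0.35).sum(), "points reached")
```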

  7. The Framework of a Coastal Hazards Model - A Tool for Predicting the Impact of Severe Storms

    USGS Publications Warehouse

    Barnard, Patrick L.; O'Reilly, Bill; van Ormondt, Maarten; Elias, Edwin; Ruggiero, Peter; Erikson, Li H.; Hapke, Cheryl; Collins, Brian D.; Guza, Robert T.; Adams, Peter N.; Thomas, Julie

    2009-01-01

    The U.S. Geological Survey (USGS) Multi-Hazards Demonstration Project in Southern California (Jones and others, 2007) is a five-year project (FY2007-FY2011) integrating multiple USGS research activities with the needs of external partners, such as emergency managers and land-use planners, to produce products and information that can be used to create more disaster-resilient communities. The hazards being evaluated include earthquakes, landslides, floods, tsunamis, wildfires, and coastal hazards. For the Coastal Hazards Task of the Multi-Hazards Demonstration Project in Southern California, the USGS is leading the development of a modeling system for forecasting the impact of winter storms threatening the entire Southern California shoreline from Pt. Conception to the Mexican border. The modeling system, run in real time or with prescribed scenarios, will incorporate atmospheric information (that is, wind and pressure fields) with a suite of state-of-the-art physical process models (that is, tide, surge, and wave) to enable detailed prediction of currents, wave height, wave runup, and total water levels. Additional research-grade predictions of coastal flooding, inundation, erosion, and cliff failure will also be performed. Initial model testing, performance evaluation, and product development will be focused on a severe winter-storm scenario developed in collaboration with the Winter Storm Working Group of the USGS Multi-Hazards Demonstration Project in Southern California. Additional offline model runs and products will include coastal-hazard hindcasts of selected historical winter storms, as well as additional severe winter-storm simulations based on statistical analyses of historical wave and water-level data. The coastal-hazards model design will also be appropriate for simulating the impact of storms under various sea level rise and climate-change scenarios. The operational capabilities of this modeling system are designed to provide emergency planners with the critical information they need to respond quickly and efficiently and to increase public safety and mitigate damage associated with powerful coastal storms. For instance, high-resolution local models will predict detailed wave heights, breaking patterns, and current strengths for use in warning systems for harbor-mouth navigation and densely populated coastal regions where beach safety is threatened. The offline applications are intended to equip coastal managers with the information needed to manage and allocate their resources effectively to protect sections of coast that may be most vulnerable to future severe storms.

  8. Modeling of Flood Risk for the Continental United States

    NASA Astrophysics Data System (ADS)

    Lohmann, D.; Li, S.; Katz, B.; Goteti, G.; Kaheil, Y. H.; Vojjala, R.

    2011-12-01

    The science of catastrophic risk modeling helps people to understand the physical and financial implications of natural catastrophes (hurricanes, floods, earthquakes, etc.), terrorism, and the risks associated with changes in life expectancy. As such it depends on simulation techniques that integrate multiple disciplines such as meteorology, hydrology, structural engineering, statistics, computer science, financial engineering, and actuarial science, drawing on virtually every field of technology. In this talk we will explain the techniques and underlying assumptions of building the RMS US flood risk model. We pay special attention to correlation (spatial and temporal), simulation and uncertainty in each of the various components in the development process. Recent extreme floods (e.g. the 2008 US Midwest flood, the 2010 US Northeast flood) have heightened concern about flood risk. Consequently, there is a growing need to adequately assess flood risk. The RMS flood hazard model comprises three major components. (1) A stochastic precipitation simulation module based on a Monte-Carlo analogue technique, which is capable of producing correlated rainfall events for the continental US. (2) A rainfall-runoff and routing module. A semi-distributed rainfall-runoff model was developed to properly assess the antecedent conditions and determine the saturation area and runoff. The runoff is further routed downstream along the rivers by a routing model. Combined with the precipitation model, it allows us to correlate the streamflow and hence flooding from different rivers, across both low and high return periods, over the continental US. (3) A flood inundation module. It transforms the discharge (output from the flow routing) into water level, which is further combined with a two-dimensional off-floodplain inundation model to produce a comprehensive flood hazard map. The performance of the model is demonstrated by comparison with observations and published data. Output from the flood hazard model is used to drive a flood loss model that is coupled to a financial model.

  9. Hazard assessment of inorganics to three endangered fish in the Green River, Utah

    USGS Publications Warehouse

    Hamilton, S.J.

    1995-01-01

    Acute toxicity tests were conducted with three life stages of Colorado squawfish (Ptychocheilus lucius), razorback sucker (Xyrauchen texanus), and bonytail (Gila elegans) in reconstituted water simulating the quality of the middle part of the Green River of Utah. Tests were conducted with boron, lithium, selenate, selenite, uranium, vanadium, and zinc. The overall rank order of toxicity to all species and life stages combined, from most to least toxic, was vanadium = zinc > selenite > lithium = uranium > selenate > boron. There was no difference between the three species in their sensitivity to the seven inorganics based on a rank-order evaluation at the species level. Colorado squawfish were 2-5 times more sensitive to selenate and selenite at the swimup life stage than at older stages, whereas razorback suckers displayed equal sensitivity among life stages. Bonytail exhibited equal sensitivity to selenite, but were five times more sensitive to selenate at the swimup life stage than at the older stages. Comparison of 96-hr LC50 values with a limited number of environmental water concentrations in Ashley Creek, Utah, which receives irrigation drainwater, revealed moderate hazard ratios for boron, selenate, selenite, and zinc, low hazard ratios for uranium and vanadium, but unknown ratios for lithium. These inorganic contaminants in drainwaters may adversely affect endangered fish in the Green River.

  10. How well can we test probabilistic seismic hazard maps?

    NASA Astrophysics Data System (ADS)

    Vanneste, Kris; Stein, Seth; Camelbeeck, Thierry; Vleminckx, Bart

    2017-04-01

    Recent large earthquakes that gave rise to shaking much stronger than shown in probabilistic seismic hazard (PSH) maps have stimulated discussion about how well these maps forecast future shaking. These discussions have brought home the fact that although the maps are designed to achieve certain goals, we know little about how well they actually perform. As for any other forecast, this question involves verification and validation. Verification involves assessing how well the algorithm used to produce hazard maps implements the conceptual PSH model ("have we built the model right?"). Validation asks how well the model forecasts the shaking that actually occurs ("have we built the right model?"). We explore the verification issue by simulating shaking histories for an area with assumed uniform distribution of earthquakes, Gutenberg-Richter magnitude-frequency relation, Poisson temporal occurrence model, and ground-motion prediction equation (GMPE). We compare the maximum simulated shaking at many sites over time with that predicted by a hazard map generated for the same set of parameters. The Poisson model predicts that the fraction of sites at which shaking will exceed that of the hazard map is p = 1 - exp(-t/T), where t is the duration of observations and T is the map's return period. Exceedance is typically associated with infrequent large earthquakes, as observed in real cases. The ensemble of simulated earthquake histories yields distributions of fractional exceedance with mean equal to the predicted value. Hence, the PSH algorithm appears to be internally consistent and can be regarded as verified for this set of simulations. However, simulated fractional exceedances show a large scatter about the mean value that decreases with increasing t/T, increasing observation time and increasing Gutenberg-Richter a-value (combining intrinsic activity rate and surface area), but is independent of GMPE uncertainty. This scatter is due to the variability of earthquake recurrence, and so decreases as the largest earthquakes occur in more simulations. Our results are important for evaluating the performance of a hazard map based on misfits in fractional exceedance, and for assessing whether such misfit arises by chance or reflects a bias in the map. More specifically, we determined for a broad range of Gutenberg-Richter a-values theoretical confidence intervals on allowed misfits in fractional exceedance and on the percentage of hazard-map bias that can thus be detected by comparison with observed shaking histories. Given that in the real world we only have one shaking history for an area, these results indicate that even if a hazard map does not fit the observations, it is very difficult to assess its veracity, especially for low-to-moderate-seismicity regions. Because our model is a simplified version of reality, any additional uncertainty or complexity will tend to widen these confidence intervals.
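
    The Poisson prediction quoted above is easy to check with a toy Monte Carlo in the spirit of the study. The sketch below treats sites as independent, which is a simplification: in the actual simulations one large earthquake raises shaking at many sites at once, and that correlation is exactly what inflates the scatter the authors describe. All parameter values here are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(42)
T = 475.0        # map return period [yr]
t = 50.0         # observation window [yr]
n_sites, n_histories = 1000, 500

# The number of exceedances per site is Poisson with mean t/T; a site
# "exceeds the map" when at least one exceedance occurs in the window.
counts = rng.poisson(lam=t / T, size=(n_histories, n_sites))
frac_exceed = (counts > 0).mean(axis=1)   # fractional exceedance per history

print("predicted:", 1.0 - np.exp(-t / T))       # ~0.0999
print("simulated mean:", frac_exceed.mean())    # should match the prediction
print("scatter (std):", frac_exceed.std())      # shrinks as t/T grows
```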

  11. National-scale analysis of simulated hydrological droughts (1891-2015)

    NASA Astrophysics Data System (ADS)

    Rudd, Alison C.; Bell, Victoria A.; Kay, Alison L.

    2017-07-01

    Droughts are phenomena that affect people and ecosystems in a variety of ways. One way to help with resilience to future droughts is to understand the characteristics of historic droughts and how these have changed over the recent past. Although, on average, Great Britain experiences a relatively wet climate, it is also prone to periods of low rainfall which can lead to droughts. Until recently, research into droughts in Great Britain has been neglected compared with other natural hazards such as storms and floods. This study is the first to use a national-scale gridded hydrological model to characterise droughts across Great Britain over the last century. Firstly, the model performance at low flows is assessed, and it is found that the model can simulate low flows well in many catchments across Great Britain. Next, the threshold level method is applied to time series of monthly mean river flow and soil moisture to identify historic droughts (1891-2015). It is shown that the national-scale gridded output can be used to identify historic drought periods. A quantitative assessment of drought characteristics shows that groundwater-dependent areas typically experience more severe droughts, which have longer durations rather than higher intensities. There is substantial spatial and temporal variability in the drought characteristics, but there are no consistent changes through time.
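
    A minimal sketch of the threshold level method referred to above, assuming a fixed-quantile threshold (the study's exact threshold choice and any event-pooling rules are not given in the abstract):

```python
import numpy as np

def threshold_droughts(flow, q=0.2):
    """Threshold level method: months with flow below the q-quantile (e.g.,
    Q80 when q=0.2) form drought events; contiguous runs are pooled and
    characterized by duration and deficit (severity)."""
    thresh = np.quantile(flow, q)
    below = flow < thresh
    events, start = [], None
    for i, b in enumerate(np.append(below, False)):   # sentinel closes last run
        if b and start is None:
            start = i
        elif not b and start is not None:
            deficit = float(np.sum(thresh - flow[start:i]))
            events.append({"start": start, "duration": i - start,
                           "severity": deficit,
                           "intensity": deficit / (i - start)})
            start = None
    return thresh, events

# Hypothetical 20-year monthly mean flow series [m^3/s]
rng = np.random.default_rng(0)
flow = 10.0 + 3.0 * np.sin(np.arange(240) * 2 * np.pi / 12) + rng.normal(0, 1.5, 240)
thresh, events = threshold_droughts(flow)
print(f"threshold={thresh:.2f}, events={len(events)}")
```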

  12. 33 CFR 154.2108 - Vapor-moving devices.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...) POLLUTION FACILITIES TRANSFERRING OIL OR HAZARDOUS MATERIAL IN BULK Marine Vapor Control Systems Transfer... vibration; (4) Low lube oil level; (5) Low lube oil pressure; and (6) Excessive shaft bearing temperature...

  13. Biological Applications and Effects of Optical Masers

    DTIC Science & Technology

    1988-02-19

    LANDOLT RING SYSTEM 8-10 8. EARLY STUDIES ON SOLAR RADIATION AS A RETINAL HAZARD 10-15 9. RETINAL LIGHT TOXICITY AS A FUNCTION OF WAVELENGTH 15-16 10...providing a simulated solar spectrum and 10 nm bandwidths throughout the near ultraviolet, visible and near infrared spectrum. This early ocular...do not present an ocular hazard at the levels used by the MILES prototype system or in fiber optic communication systems. By 1966 enough burn

  14. Quantifying the role of climate variability on extreme total water level impacts: An application of a full simulation model to Ocean Beach, California

    NASA Astrophysics Data System (ADS)

    Serafin, K.; Ruggiero, P.; Stockdon, H. F.; Barnard, P.; Long, J.

    2014-12-01

    Many coastal communities worldwide are vulnerable to flooding and erosion driven by extreme total water levels (TWL), potentially dangerous events produced by the combination of large waves, high tides, and high non-tidal residuals. The West coast of the United States provides an especially challenging environment to model these processes due to its complex geological setting combined with uncertain forecasts for sea level rise (SLR), changes in storminess, and possible changes in the frequency of major El Niños. Our research therefore aims to develop an appropriate methodology to assess present-day and future storm-induced coastal hazards along the entire U.S. West coast, filling this information gap. We present the application of this framework in a pilot study at Ocean Beach, California, a National Park site within the Golden Gate National Recreation Area where existing event-scale coastal change data can be used for model calibration and verification. We use a probabilistic, full simulation TWL model (TWL-FSM; Serafin and Ruggiero, in press) that captures the seasonal and interannual climatic variability in extremes using functions of regional climate indices, such as the Multivariate ENSO index (MEI), to represent atmospheric patterns related to the El Niño-Southern Oscillation (ENSO). In order to characterize the effect of climate variability on TWL components, we refine the TWL-FSM by splitting non-tidal residuals into low (monthly mean sea level anomalies) and high frequency (storm surge) components. We also develop synthetic climate indices using Markov sequences to reproduce the autocorrelated nature of ENSO behavior. With the refined TWL-FSM, we simulate each TWL component, resulting in synthetic TWL records providing robust estimates of extreme return level events (e.g., the 100-yr event) and the ability to examine the relative contribution of each TWL component to these extreme events. Extreme return levels are then used to drive storm impact models to examine the probability of coastal change (Stockdon et al., 2013) and thus, the vulnerability to storm-induced coastal hazards that Ocean Beach faces. Future climate variability is easily incorporated into this framework, allowing us to quantify how an evolving climate will alter future extreme TWLs and their related coastal impacts.
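
    The synthetic climate indices mentioned above can be illustrated with a first-order Markov (AR(1)) process, which reproduces lag-1 autocorrelation; the persistence parameter below is a hypothetical value, not one fitted to the MEI:

```python
import numpy as np

def synthetic_index(n_months, rho=0.9, sigma=1.0, seed=1):
    """Generate an AR(1) (first-order Markov) synthetic climate index with
    lag-1 autocorrelation rho, mimicking the persistence of ENSO indices."""
    rng = np.random.default_rng(seed)
    x = np.empty(n_months)
    x[0] = rng.normal(0.0, sigma)
    # innovations scaled so the marginal variance stays sigma^2
    eps = rng.normal(0.0, sigma * np.sqrt(1.0 - rho**2), n_months)
    for t in range(1, n_months):
        x[t] = rho * x[t - 1] + eps[t]
    return x

mei_like = synthetic_index(12 * 500)      # 500 years of monthly values
lag1 = np.corrcoef(mei_like[:-1], mei_like[1:])[0, 1]
print(f"simulated lag-1 autocorrelation: {lag1:.2f}")   # ~0.90
```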

  15. Probabilistic tsunami inundation map based on stochastic earthquake source model: A demonstration case in Macau, the South China Sea

    NASA Astrophysics Data System (ADS)

    Li, Linlin; Switzer, Adam D.; Wang, Yu; Chan, Chung-Han; Qiu, Qiang; Weiss, Robert

    2017-04-01

    Current tsunami inundation maps are commonly generated using deterministic scenarios, either for real-time forecasting or based on hypothetical "worst-case" events. Such maps are mainly used for emergency response and evacuation planning and do not include return period information. However, in practice, probabilistic tsunami inundation maps are required in a wide variety of applications, such as land-use planning, engineering design and insurance purposes. In this study, we present a method to develop a probabilistic tsunami inundation map using a stochastic earthquake source model. To demonstrate the methodology, we take Macau, a coastal city on the South China Sea, as an example. Two major advances of this method are: it incorporates the most updated information on seismic tsunamigenic sources along the Manila megathrust; and it integrates a stochastic source model into a Monte Carlo-type simulation in which a broad range of slip distribution patterns are generated for large numbers of synthetic earthquake events. Aggregating the large number of inundation simulation results, we analyze the uncertainties associated with the variability of earthquake rupture location and slip distribution. We also explore how tsunami hazard evolves in Macau in the context of sea level rise. Our results suggest Macau faces moderate tsunami risk due to its low-lying elevation, extensive land reclamation, high coastal population and major infrastructure density. Macau consists of four districts: Macau Peninsula, Taipa Island, Coloane Island and the Cotai strip. Of these, Macau Peninsula is the most vulnerable to tsunamis due to its low elevation and its exposure to direct waves, refracted waves from the offshore region and reflected waves from the mainland. Earthquakes with magnitude larger than Mw 8.0 in the northern Manila trench would likely cause hazardous inundation in Macau. Using a stochastic source model, we are able to derive a spread of potential tsunami impacts for earthquakes with the same magnitude. The diversity is caused by both random rupture locations and heterogeneous slip distribution. Adding the sea level rise component, the inundation depth caused by 1 m of sea level rise is equivalent to that caused by the 90th percentile of an ensemble of Mw 8.4 earthquakes.
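
    Once a synthetic event set exists, with an annual occurrence rate and a simulated inundation depth for each event, a probabilistic map reduces to computing, cell by cell, the depth exceeded at a chosen return period. A minimal single-site sketch with hypothetical numbers follows (a real implementation would interpolate between events rather than return the discrete value):

```python
import numpy as np

def return_period_depth(depths, rates, T):
    """Depth exceeded on average once every T years at a site, given the
    simulated maximum inundation depth and annual occurrence rate of each
    synthetic event: the exceedance rate of a level is the summed rate of
    all events whose depth exceeds it."""
    order = np.argsort(depths)[::-1]       # deepest events first
    cum_rate = np.cumsum(rates[order])     # exceedance rate vs depth level
    idx = np.searchsorted(cum_rate, 1.0 / T)
    if idx >= len(depths):
        return 0.0                         # site not flooded at this T
    return depths[order][idx]

# Hypothetical event set: depths [m] at one site and annual rates [1/yr]
depths = np.array([3.2, 1.1, 0.0, 2.4, 0.6, 4.0])
rates = np.array([2e-4, 1e-3, 5e-3, 4e-4, 2e-3, 1e-4])
print(return_period_depth(depths, rates, T=1000))   # 1.1 m for these numbers
```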

  16. Climatic Changes and Consequences on the French West Indies (C3AF), Hurricane and Tsunami Hazards Assessment

    NASA Astrophysics Data System (ADS)

    Arnaud, G.; Krien, Y.; Zahibo, N.; Dudon, B.

    2017-12-01

    Coastal hazards are among the most worrying threats of our time. In a context of climate change coupled with a large population increase, tropical areas could be the most exposed zones of the globe. In such circumstances, understanding the underlying processes can help to better predict storm surges and the associated global risks. Here we present the partial preliminary results of a multidisciplinary project focused on the effects of climate change on coastal threats in the French West Indies, funded by the European Regional Development Fund. The study aims to provide a coastal hazard assessment based on hurricane surge and tsunami modeling, including several aspects of climate change that can affect hazards, such as sea level rise, crustal subsidence/uplift, and coastline changes. Several tsunami scenarios have been simulated, including tele-tsunamis, to cover a large range of tsunami hazards. Hurricane surge levels have been calculated using a large number of synthetic hurricanes covering the current and forecast climate over the tropical Atlantic. This hazard assessment will later be coupled with the stakes assessed over the territory to provide risk maps.

  17. 77 FR 34229 - Idaho: Final Authorization of State Hazardous Waste Management Program; Revision

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-11

    ... capability for the disposal of remote-handled low-level radioactive waste (LLW) generated at the Idaho... (FONSI), for the Remote-Handled Low-Level Radioactive Waste Onsite Disposal (RHLLWOD) on an Environmental... regulating phosphate (mineral processing) plants within the state. In response to this commenter's concerns...

  18. Solid Waste Management Plan. Revision 4

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1995-04-26

    The waste types discussed in this Solid Waste Management Plan are Municipal Solid Waste, Hazardous Waste, Low-Level Mixed Waste, Low-Level Radioactive Waste, and Transuranic Waste. The plan describes, for each type of solid waste, the existing waste management facilities, the issues, and the assumptions used to develop the current management plan.

  19. 75 FR 62040 - Hazardous Waste Management System; Identification and Listing of Hazardous Waste; Proposed Exclusion

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-07

    ... on-site in the pickle acid and low level radioactive wastewater treatment systems. Support... water production waste treatment system. Once-through non-contact cooling water does not require... production (deionized and make-up non-contact cooling water) treatment system and once-through non-contact...

  20. Addressing Loss of Efficiency Due to Misclassification Error in Enriched Clinical Trials for the Evaluation of Targeted Therapies Based on the Cox Proportional Hazards Model.

    PubMed

    Tsai, Chen-An; Lee, Kuan-Ting; Liu, Jen-Pei

    2016-01-01

    A key feature of precision medicine is that it takes individual variability at the genetic or molecular level into account in determining the best treatment for patients diagnosed with diseases detected by recently developed novel biotechnologies. The enrichment design is an efficient design that enrolls only the patients testing positive for specific molecular targets and randomly assigns them to the targeted treatment or the concurrent control. However, no diagnostic device has perfect accuracy and precision for detecting molecular targets. In particular, the positive predictive value (PPV) can be quite low for rare diseases with low prevalence. Under the enrichment design, some patients testing positive for specific molecular targets may not actually have the molecular targets, so the efficacy of the targeted therapy may be underestimated in the patients who do have them. To address the loss of efficiency due to misclassification error, we apply the discrete mixture modeling for time-to-event data proposed by Eng and Hanlon [8] to develop an inferential procedure, based on the Cox proportional hazards model, for the treatment effect of the targeted therapy in the true-positive patients with the molecular targets. Our proposed procedure incorporates both the inaccuracy of diagnostic devices and the uncertainty of estimated accuracy measures. We employed the expectation-maximization algorithm in conjunction with the bootstrap technique for estimation of the hazard ratio and its estimated variance. We report the results of simulation studies which empirically investigated the performance of the proposed method. Our proposed method is illustrated by a numerical example.
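
    The attenuation problem the authors address can be illustrated with a simulation far simpler than their EM-plus-bootstrap procedure: when the PPV falls below 1, the treated arm is a mixture of true positives (who benefit) and false positives (who do not), and a naive estimate is pulled toward 1. The sketch below uses exponential event times and a crude rate-ratio estimator rather than a Cox model, and all parameters are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(7)

def naive_hazard_ratio(ppv, hr_true=0.5, n=20000, base_rate=0.1):
    """Simulate an enrichment trial with an imperfect assay: a fraction
    `ppv` of test-positive patients truly carry the target (treatment hazard
    ratio hr_true); the rest gain nothing (hazard ratio 1). Exponential
    event times; hazard ratio estimated as the ratio of event rates."""
    truly_pos = rng.random(n) < ppv
    arm = rng.random(n) < 0.5                  # True = targeted therapy
    rate = np.where(arm & truly_pos, base_rate * hr_true, base_rate)
    times = rng.exponential(1.0 / rate)
    # exponential MLE of each arm's rate: events / total follow-up time
    rate_trt = arm.sum() / times[arm].sum()
    rate_ctl = (~arm).sum() / times[~arm].sum()
    return rate_trt / rate_ctl

for ppv in (1.0, 0.8, 0.5):
    print(f"PPV={ppv:.1f}: naive HR ~ {naive_hazard_ratio(ppv):.2f}")
```

    With PPV = 1 the estimate recovers the true hazard ratio of 0.5; as PPV drops, the estimate drifts toward 1, which is the underestimation of efficacy the proposed procedure is designed to correct.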

  1. Rapid design and optimization of low-thrust rendezvous/interception trajectory for asteroid deflection missions

    NASA Astrophysics Data System (ADS)

    Li, Shuang; Zhu, Yongsheng; Wang, Yukai

    2014-02-01

    Asteroid deflection techniques are essential in order to protect the Earth from catastrophic impacts by hazardous asteroids. Rapid design and optimization of low-thrust rendezvous/interception trajectories is considered one of the key technologies for successfully deflecting potentially hazardous asteroids. In this paper, we present a general framework for the rapid design and optimization of low-thrust rendezvous/interception trajectories for future asteroid deflection missions. The design and optimization process includes three closely associated steps. Firstly, shape-based approaches and a genetic algorithm (GA) are adopted to perform the preliminary design, which provides a reasonable initial guess for the subsequent accurate optimization. Secondly, the Radau pseudospectral method is utilized to transcribe the low-thrust trajectory optimization problem into a discrete nonlinear programming (NLP) problem. Finally, sequential quadratic programming (SQP) is used to efficiently solve the nonlinear programming problem and obtain the optimal low-thrust rendezvous/interception trajectories. The rapid design and optimization algorithms developed in this paper are validated by three simulation cases with different performance indexes and boundary constraints.
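
    The second and third steps, transcription into an NLP and solution by SQP, can be illustrated on a toy problem. The sketch below uses trapezoidal collocation of a one-dimensional double-integrator "rendezvous" rather than the Radau pseudospectral scheme, a quadratic control-energy cost rather than fuel, and scipy's SLSQP as the SQP solver; it shows the structure of the transcription (decision vector of states and controls, defect constraints, boundary constraints), not the authors' implementation:

```python
import numpy as np
from scipy.optimize import minimize

# Direct transcription: fixed final time, minimize control energy for a
# rest-to-rest transfer of a double integrator (x' = v, v' = u).
N, tf = 30, 1.0
dt = tf / (N - 1)

def unpack(z):
    return z[:N], z[N:2*N], z[2*N:]        # position, velocity, control

def cost(z):
    _, _, u = unpack(z)
    return dt * np.sum(u**2)               # smooth surrogate for fuel use

def defects(z):
    # trapezoidal collocation: discretized dynamics must hold between nodes
    x, v, u = unpack(z)
    dx = x[1:] - x[:-1] - 0.5 * dt * (v[1:] + v[:-1])
    dv = v[1:] - v[:-1] - 0.5 * dt * (u[1:] + u[:-1])
    return np.concatenate([dx, dv])

def boundary(z):
    x, v, _ = unpack(z)
    return np.array([x[0], v[0], x[-1] - 1.0, v[-1]])

z0 = np.concatenate([np.linspace(0, 1, N), np.ones(N), np.zeros(N)])
res = minimize(cost, z0, method="SLSQP",
               constraints=[{"type": "eq", "fun": defects},
                            {"type": "eq", "fun": boundary}],
               options={"maxiter": 500})
print(res.success, "cost:", res.fun)       # analytic optimum is 12 for this setup
```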

  2. Assessment of social vulnerability to natural hazards in Nepal

    NASA Astrophysics Data System (ADS)

    Gautam, Dipendra

    2017-12-01

    This paper investigates district-wide social vulnerability to natural hazards in Nepal. Disasters such as earthquakes, floods, landslides, epidemics, and droughts are common in Nepal. Every year thousands of people are killed and huge economic and environmental losses occur in Nepal due to various natural hazards. Although natural hazards are well recognized, quantitative and qualitative social vulnerability mapping has not existed until now in Nepal. This study aims to quantify social vulnerability on a local scale, considering all 75 districts, using available census data. To perform district-level vulnerability mapping, 13 variables were selected and aggregated indexes were plotted in an ArcGIS environment. The results show that only 4 districts in Nepal have a very low social vulnerability index, whereas 46 districts (61 %) are at moderate to high social vulnerability levels. Vulnerability mapping highlights the immediate need for decentralized frameworks to tackle natural hazards at the district level; additionally, the results of this study can contribute to preparedness, planning and resource management, inter-district coordination, contingency planning, and public awareness efforts.
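
    A minimal sketch of the kind of index aggregation described above, assuming min-max normalization and equal weights (the paper's 13 variables and its weighting scheme are not specified in the abstract):

```python
import numpy as np

def social_vulnerability_index(X):
    """Composite index: min-max normalize each column (indicator) to [0, 1],
    then take the equal-weight mean per row (district). Indicators must be
    oriented so that higher values mean more vulnerable."""
    X = np.asarray(X, dtype=float)
    mins, maxs = X.min(axis=0), X.max(axis=0)
    norm = (X - mins) / np.where(maxs > mins, maxs - mins, 1.0)
    return norm.mean(axis=1)

# Hypothetical: 5 districts x 4 indicators (e.g., illiteracy %, poverty %, ...)
X = [[40, 12, 0.8, 55], [25, 30, 0.4, 70], [60, 22, 0.9, 35],
     [15,  8, 0.2, 20], [50, 25, 0.7, 65]]
svi = social_vulnerability_index(X)
levels = np.digitize(svi, bins=np.quantile(svi, [0.25, 0.5, 0.75]))
print(svi.round(2), levels)    # levels 0-3: very low .. high vulnerability
```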

  3. Agent-based simulation for human-induced hazard analysis.

    PubMed

    Bulleit, William M; Drewek, Matthew W

    2011-02-01

    Terrorism could be treated as a hazard for design purposes. For instance, the terrorist hazard could be analyzed in a manner similar to the way that seismic hazard is handled. No matter how terrorism is dealt with in the design of systems, predictions of the frequency and magnitude of the hazard will be required. And, if the human-induced hazard is to be designed for in a manner analogous to natural hazards, then the predictions should be probabilistic in nature. The model described in this article is a prototype model that used agent-based modeling (ABM) to analyze terrorist attacks. The basic approach in this article of using ABM to model human-induced hazards has been preliminarily validated in the sense that the attack magnitudes seem to be power-law distributed and attacks occur mostly in regions where high levels of wealth pass through, such as transit routes and markets. The model developed in this study indicates that ABM is a viable approach to modeling socioeconomic-based infrastructure systems for engineering design to deal with human-induced hazards. © 2010 Society for Risk Analysis.

  4. Hazardous gases and oxygen depletion in a wet paddy pile: an experimental study in a simulating underground rice mill pit, Thailand.

    PubMed

    Yenjai, Pornthip; Chaiear, Naesinee; Charerntanyarak, Lertchai; Boonmee, Mallika

    2012-01-01

    During the rice harvesting season in Thailand, large amounts of fresh paddy are sent to rice mills immediately after harvesting due to a lack of proper farm storage space. At certain levels of moisture content, rice grains may generate hazardous gases, which can displace oxygen (O2) in the confined spaces of underground rice mill pits. This phenomenon has been observed in a fatal accident in Thailand. Our study aimed to investigate the types of gases emitted from paddy piles, and their air concentrations, at different levels of moisture content and duration of piling time. Four levels of moisture content in the paddy piles were investigated: a dry paddy group (<14% wet basis (wb)) and wet paddy groups (22-24, 25-27, and 28-30% wb). Our measurements were conducted in 16 experimental concrete pits, 80 × 80 cm wide by 60 cm high. Gases emitted were measured with an infrared spectrophotometer and a multi-gas detector every 12 h for 5 days throughout the experiment. The results revealed high levels of carbon dioxide (CO2) (range 5,864-8,419 ppm) in all wet paddy groups, which gradually increased over time. The concentrations of carbon monoxide (CO), methane (CH4), nitromethane (CH3NO2) and nitrous oxide (N2O) in all wet paddy groups increased with piling time and with moisture content, with ranges of 11-289, 2-8, 36-374, and 4-26 ppm, respectively, the highest levels occurring at moisture contents in the range 28-30% wb. Nitrogen dioxide (NO2) concentrations were low in all paddy groups. The percentage of O2 in the wet paddy groups decreased with piling time and moisture content (from 18.7% to 4.1%). This study suggests that hazardous gases can be emitted in moist paddy piles, and that their concentrations can increase with increasing moisture content and piling time.

  5. Baseline Levels and Trimestral Variation of Triiodothyronine and Thyroxine and Their Association with Mortality in Maintenance Hemodialysis Patients

    PubMed Central

    Meuwese, Christiaan L.; Dekker, Friedo W.; Lindholm, Bengt; Qureshi, Abdul R.; Heimburger, Olof; Barany, Peter; Stenvinkel, Peter; Carrero, Juan J.

    2012-01-01

    Background and objectives: Conflicting evidence exists with regard to the association of thyroid hormones and mortality in dialysis patients. This study assesses the association between basal and trimestral variation of thyroid stimulating hormone, triiodothyronine, and thyroxine and mortality. Design, setting, participants, & measurements: In 210 prevalent hemodialysis patients, serum triiodothyronine, thyroxine, thyroid stimulating hormone, and interleukin-6 were measured 3 months apart. Cardiovascular and non-cardiovascular deaths were registered during follow-up. Based on fluctuations along tertiles of distribution, four trimestral patterns were defined for each thyroid hormone: persistently low, decrease, increase, and persistently high. The association of baseline levels and trimestral variation with mortality was investigated with Kaplan–Meier curves and Cox proportional hazard models. Results: During follow-up, 103 deaths occurred. Thyroid stimulating hormone levels did not associate with mortality. Patients with relatively low basal triiodothyronine concentrations had higher hazards of dying than patients with high levels. Longitudinally, patients with persistently low levels of triiodothyronine during the 3-month period had higher mortality hazards than those having persistently high levels. These associations were mainly attributable to cardiovascular-related mortality. The association between thyroxine and mortality was not altered after adjustment for triiodothyronine. Conclusions: Hemodialysis patients with reduced triiodothyronine or thyroxine levels bear an increased mortality risk, especially due to cardiovascular causes. This was true when considering both baseline measurements and trimestral variation patterns. Our longitudinal design adds observational evidence supporting the hypothesis that the link may underlie a causal effect. PMID:22246282

  6. Baseline levels and trimestral variation of triiodothyronine and thyroxine and their association with mortality in maintenance hemodialysis patients.

    PubMed

    Meuwese, Christiaan L; Dekker, Friedo W; Lindholm, Bengt; Qureshi, Abdul R; Heimburger, Olof; Barany, Peter; Stenvinkel, Peter; Carrero, Juan J

    2012-01-01

    Conflicting evidence exists with regard to the association of thyroid hormones and mortality in dialysis patients. This study assesses the association between basal and trimestral variation of thyroid stimulating hormone, triiodothyronine, and thyroxine and mortality. In 210 prevalent hemodialysis patients, serum triiodothyronine, thyroxine, thyroid stimulating hormone, and interleukin-6 were measured 3 months apart. Cardiovascular and non-cardiovascular deaths were registered during follow-up. Based on fluctuations along tertiles of distribution, four trimestral patterns were defined for each thyroid hormone: persistently low, decrease, increase, and persistently high. The association of baseline levels and trimestral variation with mortality was investigated with Kaplan-Meier curves and Cox proportional hazard models. During follow-up, 103 deaths occurred. Thyroid stimulating hormone levels did not associate with mortality. Patients with relatively low basal triiodothyronine concentrations had higher hazards of dying than patients with high levels. Longitudinally, patients with persistently low levels of triiodothyronine during the 3-month period had higher mortality hazards than those having persistently high levels. These associations were mainly attributable to cardiovascular-related mortality. The association between thyroxine and mortality was not altered after adjustment for triiodothyronine. Hemodialysis patients with reduced triiodothyronine or thyroxine levels bear an increased mortality risk, especially due to cardiovascular causes. This was true when considering both baseline measurements and trimestral variation patterns. Our longitudinal design adds observational evidence supporting the hypothesis that the link may underlie a causal effect.

  7. Integrated risk management and communication: case study of Canton Vaud (Switzerland)

    NASA Astrophysics Data System (ADS)

    Artigue, Veronica; Aye, Zar Chi; Gerber, Christian; Derron, Marc-Henri; Jaboyedoff, Michel

    2017-04-01

    The history of Canton Vaud is marked by events that remind us that any territory may have to cope with natural hazards, such as the devastating floods of the Baye and Veraye rivers in Montreux (1927), the overflowing of the Rhône caused by dam failure (1935), the Pissot mud flow (1995) and the avalanches in the Prealps (1999). All of these events caused significant damage, and sometimes fatalities, in the regions of Canton Vaud. In response, the Swiss Confederation and the cantonal authorities decided to implement an integrated management policy for natural risks. The production of natural hazard maps was the first step of the integrated management process. This work resulted in more than 10'000 maps and related documents for 94% of the municipalities of the Canton, covering 17% of its total surface. Given this significant amount of data, the main challenge is to communicate it effectively and to build an integrated risk management structure. To make the available information relevant for end users, the teams involved produced documents and tools to help all stakeholders understand these data. The first step of this process was a statistical and geographical analysis of the hazard maps to identify the areas most exposed to natural hazards, from which an atlas was created. Within this framework, several topics were then addressed for each identified risk. The results show that 88 of 318 municipalities in Canton Vaud have at least a high hazard level on their territory, 108 a moderate hazard level, 41 a low level and 8 a residual level. Only 73 of 318 municipalities remain with a minimal or zero hazard level. Concerning the type of hazard considered, 16% of the building zones are exposed to floods, 18% to mud flow, 16% to deep landslides, 14% to spontaneous surface landslides, 6% to rockfall, 55% to rock collapses and less than 5% to avalanches. As national policies require taking risk into account at the building scale, further analysis of the buildings has been carried out. 1'154 buildings are exposed to a high hazard level, while 8'409, 21'130 and 14'980 buildings are exposed to a moderate, low and residual hazard level, respectively. This paper addresses the complexity of producing the hazard map products of Canton Vaud, particularly through the statistical analysis and the difficulties encountered with data availability and quality at the building scale. The authors highlight the processes necessary to build robust communication among all stakeholders involved in risk management in a dynamic and changing area, through the example of Canton Vaud.

  8. Assessment Of ITS/CVO User Services ITS/CVO Qualitative Benefit/Cost Analysis

    DOT National Transportation Integrated Search

    1996-06-01

    DRIVING SIMULATOR DEDICATED SHORT RANGE COMMUNICATIONS OR DSRC, HAZARDOUS MATERIALS OR HAZMAT, COMMERCIAL VEHICLE OPERATIONS OR CVO : THE EXECUTIVE SUMMARY OF THIS REPORT DESCRIBES THE CURRENT LEVEL OF TECHNOLOGY USED BY U.S. MOTOR CARRIERS AND PROVIDES...

  9. A Review of Hazard Anticipation Training Programs for Young Drivers.

    PubMed

    McDonald, Catherine C; Goodwin, Arthur H; Pradhan, Anuj K; Romoser, Matthew R E; Williams, Allan F

    2015-07-01

    Poor hazard anticipation skills are a risk factor associated with the high motor vehicle crash rates of young drivers. A number of programs have been developed to improve these skills. The purpose of this review was to assess the empirical literature on hazard anticipation training for young drivers. Studies were included if they (1) included an assessment of hazard anticipation training outcomes; (2) were published between January 1, 1980 and December 31, 2013 in an English language peer-reviewed journal or conference proceeding; and (3) included at least one group that uniquely comprised a cohort of participants aged <21 years. Nineteen studies met the inclusion criteria. Studies used a variety of training methods including interactive computer programs, videos, simulation, commentary driving, or a combination of approaches. Training effects were predominantly measured through computer-based testing and driving simulation with eye tracking. Four studies included an on-road evaluation. Most studies evaluated short-term outcomes (immediate or a few days). In all studies, young drivers showed improvement in selected hazard anticipation outcomes, but none investigated crash effects. Although existing programs show promise, future research should include long-term follow-up, evaluate crash outcomes, and assess the optimal timing of hazard anticipation training, taking into account the age and experience level of young drivers. Copyright © 2015 Society for Adolescent Health and Medicine. All rights reserved.

  10. Mortality in former Olympic athletes: retrospective cohort analysis

    PubMed Central

    Zwiers, R; Zantvoord, F W A; van Bodegom, D; van der Ouderaa, F J G; Westendorp, R G J

    2012-01-01

    Objective To assess the mortality risk in subsequent years (adjusted for year of birth, nationality, and sex) of former Olympic athletes from disciplines with different levels of exercise intensity. Design Retrospective cohort study. Setting Former Olympic athletes. Participants 9889 athletes (with a known age at death) who participated in the Olympic Games between 1896 and 1936, representing 43 types of disciplines with different levels of cardiovascular, static, and dynamic intensity exercise; high or low risk of bodily collision; and different levels of physical contact. Main outcome measure All cause mortality. Results Hazard ratios for mortality among athletes from disciplines with moderate cardiovascular intensity (1.01, 95% confidence interval 0.96 to 1.07) or high cardiovascular intensity (0.98, 0.92 to 1.04) were similar to those in athletes from disciplines with low cardiovascular intensity. The underlying static and dynamic components in exercise intensity showed similar non-significant results. Increased mortality was seen among athletes from disciplines with a high risk of bodily collision (hazard ratio 1.11, 1.06 to 1.15) and with high levels of physical contact (1.16, 1.11 to 1.22). In a multivariate analysis, the effect of high cardiovascular intensity remained similar (hazard ratio 1.05, 0.89 to 1.25); the increased mortality associated with high physical contact persisted (hazard ratio 1.13, 1.06 to 1.21), but that for bodily collision became non-significant (1.03, 0.98 to 1.09) as a consequence of its close relation with physical contact. Conclusions Among former Olympic athletes, engagement in disciplines with high intensity exercise did not bring a survival benefit compared with disciplines with low intensity exercise. Those who engaged in disciplines with high levels of physical contact had higher mortality than other Olympians later in life. PMID:23241269

  11. Interactive hazards education program for youth in a low SES community: a quasi-experimental pilot study.

    PubMed

    Webb, Michelle; Ronan, Kevin R

    2014-10-01

    A pilot study of an interactive hazards education program was carried out in Canberra (Australia), with direct input from youth participants. Effects were evaluated in relation to youths' interest in disasters, motivation to prepare, risk awareness, knowledge indicators, perceived preparedness levels, planning and practice for emergencies, and fear and anxiety indicators. Parents also provided ratings, including of actual home-based preparedness activities. Using a single-group pretest-posttest design with benchmarking, a sample of 20 youths and their parents from a low SES community participated. Findings indicated beneficial changes on a number of indicators. Preparedness indicators increased significantly from pre- to posttest on both youth (p < 0.01) and parent ratings (p < 0.01). Parent ratings reflected an increase of just under six home-based preparedness activities. Youth knowledge about disaster mitigation also increased significantly (p < 0.001), rising 39% from pretest levels. While personalized risk perceptions significantly increased (p < 0.01), anxiety and worry levels either did not change (generalized anxiety, p > 0.05) or reduced between pre- and posttest (hazards-specific fears, worry, and distress, ps ranging from < 0.05 to < 0.001). In terms of predictors of preparedness, a number of variables were found to predict posttest preparedness levels, including information searching done by participants between education sessions. These pilot findings are the first to reflect quasi-experimental outcomes for a youth hazards education program carried out in a setting other than a school and focused on a sample of youth from a low SES community. © 2014 Society for Risk Analysis.

  12. Projected Flood Risks in China based on CMIP5

    NASA Astrophysics Data System (ADS)

    Xu, Ying

    2016-04-01

    Based on simulations from 22 CMIP5 models, in combination with data on population, GDP, arable land, and terrain elevation, the spatial distributions of flood risk levels are calculated and analyzed under RCP8.5 for the baseline period (1986-2005), the near-term future (2016-2035), the mid-term future (2046-2065), and the long-term future (2080-2099). (1) Areas with higher flood hazard risk levels in the future are concentrated in southeastern China, and the areas with risk level III continue to expand. The major changes in flood hazard risks will occur in the mid- and long-term future. (2) In the future, the areas of high vulnerability to flood hazards will be located in China's eastern region. In the middle and late 21st century, the extent of the high vulnerability area will expand eastward and its intensity will gradually increase. The highest vulnerability values are found in the provinces of Beijing, Tianjin, Hebei, Henan, Anhui, Shandong, Shanghai, Jiangsu, and in parts of the Pearl River Delta. Furthermore, the major cities in northeast China, as well as Wuhan, Changsha and Nanchang, are highly vulnerable. (3) The regions with high flood risk levels will be located in eastern China, in the middle and lower reaches of the Yangtze River and stretching northward to Beijing and Tianjin. High-risk flood areas also occur in major cities in northeast China, in some parts of Shaanxi and Shanxi, and in some coastal areas in southeast China. (4) Compared to the baseline period, high flood risks will increase on a regional level towards the end of the 21st century, although the areas of flood hazards show little variation. In this paper, projected future flood risks for different periods were analyzed under the RCP8.5 emission scenario. Comparison with simulations under the RCP2.6 and RCP4.5 scenarios shows no differences in the spatial distribution of flood hazard risks, but weaker intensities than under RCP8.5. When using simulations from climate model ensembles to project future flood risks, uncertainty arises from various factors, such as the coarse resolution of global climate models, different approaches to flood assessment, the selection of weighting coefficients, the greenhouse gas emission scheme used, and the estimates of future population, GDP, and arable land. Therefore, further analysis is needed to reduce the uncertainties of future flood risks.

  13. An exhaustive approach for identification of flood risk hotspots in data poor regions enforcing combined geomorphic and socio-economic indicators

    NASA Astrophysics Data System (ADS)

    Mohanty, M. P.; Karmakar, S.; Ghosh, S.

    2017-12-01

    Many countries across the globe are affected by floods. To monitor them, various sophisticated algorithms and flood models are used by the scientific community. However, gaps remain in efficiently mapping flood risk. The limitations are: (i) the scarcity of the extensive data inputs required for precise flood modeling, (ii) poor model performance in large and complex terrains, (iii) high computational cost and time, and (iv) limited expertise among civic bodies in handling model simulations. These factors motivate the use of uncomplicated and inexpensive, yet precise, approaches to identify areas at different levels of flood risk. The present study addresses this issue by utilizing various easily available, low-cost data in a GIS environment for a large, flood-prone, data-poor region. A set of geomorphic indicators derived from a Digital Elevation Model (DEM) is analysed through linear binary classification and used to identify the flood hazard. The performance of these indicators is then investigated using receiver operating characteristic (ROC) curves, whereas the calibration and validation of the derived flood maps are accomplished through comparison with the outputs of a dynamically coupled 1-D 2-D flood model. A high degree of similarity in flood inundation demonstrates the reliability of the proposed approach in identifying flood hazard. In parallel, an extensive list of socio-economic indicators is selected to represent flood vulnerability at a finer forward sortation level using multivariate Data Envelopment Analysis (DEA). A set of bivariate flood risk maps is derived by combining the flood hazard and socio-economic vulnerability maps. Given the acute problem of floods in developing countries, the proposed methodology, characterized by low computational cost, modest data requirements and limited flood modeling complexity, may help local authorities and planners derive effective flood management strategies.
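
    The hazard-identification step described above, a linear binary classification on a DEM-derived indicator evaluated with an ROC curve, can be sketched as follows. The indicator values are synthetic stand-ins rather than the study's data, and scikit-learn is assumed available:

```python
import numpy as np
from sklearn.metrics import roc_curve, auc

rng = np.random.default_rng(3)

# Hypothetical DEM-derived indicator, e.g., elevation above nearest drainage:
# flooded cells (label 1) tend to sit lower than dry cells (label 0).
flooded = rng.normal(2.0, 1.5, 500)
dry = rng.normal(6.0, 2.5, 1500)
indicator = np.concatenate([flooded, dry])
labels = np.concatenate([np.ones(500), np.zeros(1500)])

# Linear binary classification: "hazardous if indicator < tau"; sweeping tau
# is equivalent to scoring cells by the negated indicator.
fpr, tpr, thresholds = roc_curve(labels, -indicator)
print(f"AUC = {auc(fpr, tpr):.3f}")        # close to 1 = skilful indicator

# Pick the threshold maximizing Youden's J = TPR - FPR
best = np.argmax(tpr - fpr)
print(f"classify as hazardous where indicator < {-thresholds[best]:.2f}")
```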

  14. High-resolution marine flood modelling coupling overflow and overtopping processes: framing the hazard based on historical and statistical approaches

    NASA Astrophysics Data System (ADS)

    Nicolae Lerma, Alexandre; Bulteau, Thomas; Elineau, Sylvain; Paris, François; Durand, Paul; Anselme, Brice; Pedreros, Rodrigo

    2018-01-01

    A modelling chain was implemented in order to propose a realistic appraisal of the risk in coastal areas affected by overflowing as well as overtopping processes. Simulations are performed through a nested downscaling strategy from regional to local scale at high spatial resolution, with explicit buildings, urban structures such as sea front walls, and hydraulic structures liable to affect the propagation of water in urban areas. Validation of the model performance is based on analysis of the available hard and soft data and conversion of qualitative to quantitative information to reconstruct the area affected by flooding and the succession of events during two recent storms. Two joint probability approaches (joint exceedance contour and environmental contour) are used to define 100-year offshore condition scenarios and to investigate the flood response to each scenario in terms of (1) maximum spatial extent of flooded areas, (2) volumes of water propagating inland and (3) water level in flooded areas. Scenarios of sea level rise are also considered in order to evaluate the potential evolution of the hazard. Our simulations show that for a maximising 100-year hazard scenario, for the municipality as a whole, 38 % of the affected zones are prone to overflow flooding and 62 % to flooding by propagation of overtopping water volumes along the seafront. Results also reveal that for the two kinds of statistical scenarios a difference of about 5 % in the forcing conditions (water level, wave height and period) can produce significant differences in flooding, such as +13.5 % in water volumes propagating inland or +11.3 % in affected surfaces. In some areas, the flood response appears to be very sensitive to the chosen scenario, with differences of 0.3 to 0.5 m in water level. The developed approach enables one to frame the 100-year hazard and to characterize spatially the robustness or the uncertainty of the results. Considering a 100-year scenario with mean sea level rise (0.6 m), hazard characteristics change dramatically, with an evolution of the overtopping/overflowing process ratio and an increase in the volumes of water propagating inland by a factor of 4.84 and in flooded surfaces by a factor of 3.47.

  15. The Effect of Disturbances and Surrounding Air on the Droplet Impact Phenomena

    NASA Astrophysics Data System (ADS)

    Work, Andrew; Lian, Yongsheng; Sussman, Mark

    2013-11-01

    Supercooled Large Droplets (SLDs) represent an icing hazard in a number of areas, most obviously in aviation. SLDs pose a greater hazard than smaller supercooled droplets because they do not freeze completely on impact, and can spread or splash. Experiments have demonstrated that surrounding air plays an important role in droplet impact phenomena: low ambient pressure can suppress droplet splashing. However, the effect of surrounding air on droplet impact has not been adequately addressed. Numerical simulations are conducted to systematically investigate the interplay between the droplet and the surrounding air in the droplet splashing regime. Disturbances originating from the experimental droplet generator are also studied in the simulation, and we investigate whether these disturbances are responsible for the fingering observed in experiments. We compare the results of several perturbations on the droplet, as well as the effect of surface roughness. Simulations are conducted using the Moment of Fluid numerical method, and the grid features adaptive mesh refinement.

  16. Emissions model of waste treatment operations at the Idaho Chemical Processing Plant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schindler, R.E.

    1995-03-01

    An integrated model of the waste treatment systems at the Idaho Chemical Processing Plant (ICPP) was developed using commercially available process simulation software (ASPEN Plus) to calculate atmospheric emissions of hazardous chemicals for use in an application for an environmental permit to operate (PTO). The processes covered by the model are the Process Equipment Waste evaporator, High Level Liquid Waste evaporator, New Waste Calcining Facility and Liquid Effluent Treatment and Disposal facility. The processes are described along with the model and its assumptions. The model calculates emissions of NO{sub x}, CO, volatile acids, hazardous metals, and organic chemicals. Some calculated relative emissions are summarized and insights on building simulations are discussed.

  17. Biosafety Level 3 Recon Training

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dickens, Brian Scott; Chavez, Melanie Ann; Heimer, Donovan J.

    The Biosafety Level 3 Recon training is a 3D virtual tool developed for the Counter WMD Analysis Cell (CWAC) and the Asymmetric Warfare Group (AWG) by the Application Modeling and Development Team within the NEN-3 International Threat Reduction Group. The training simulates a situation where friendly forces have secured from hostile forces a suspected bioweapons development laboratory. The trainee is a squad member tasked to investigate the facility, locate laboratories within the facility, and identify hazards to entrants and the surrounding area. Before beginning the 3D simulation, the trainee must select the appropriate MOPP level for entering the facility. The items in the simulation, both inside and outside the bioweapon facility, are items that are commonly used by scientists in Biosafety Level (BSL) laboratories. Each item has clickable red tags that, when activated, give the trainee a brief description of the item and a controllable turn-around view. The descriptions also contain information about potential hazards the item can present. Trainees must find all tagged items in order to complete the simulation, but can also reference descriptions and turn-around views of the items in a glossary menu. The training is intended to familiarize individuals who have little or no biology or chemistry background with technical equipment used in BSL laboratories. The revised edition of this simulation (Biosafety Level 3 Virtual Lab) changes the trainee into an investigator instead of a military combatant. Many doors now require a virtual badge swipe to open. Airlock doors may come in sets such that the open door must be closed before the next door in the set can be opened. A user interface was added so that the instructor can edit the information about the items (the brief descriptions mentioned above) using the simulation software instead of the previous method of manually entering the material in XML settings files. Facility labels, such as "No Parking" and "Men's room", were changed from Korean into English. No other changes were made.

  18. Flood hydrology and dam-breach hydraulic analyses of four reservoirs in the Black Hills, South Dakota

    USGS Publications Warehouse

    Hoogestraat, Galen K.

    2011-01-01

    Extensive information about the construction of dams or potential downstream hazards in the event of a dam breach is not available for many small reservoirs within the Black Hills National Forest. In 2009, the U.S. Forest Service identified the need for reconnaissance-level dam-breach assessments for four of these reservoirs within the Black Hills National Forest (Iron Creek, Horsethief, Lakota, and Mitchell Lakes) with the potential to flood downstream structures. Flood hydrology and dam-breach hydraulic analyses for the four selected reservoirs were conducted by the U.S. Geological Survey in cooperation with the U.S. Forest Service to estimate the areal extent of downstream inundation. Three high-flow breach scenarios were considered for cases when the dam is in place (overtopped) and when a dam break (failure) occurs: the 100-year recurrence 24-hour precipitation, the 500-year recurrence peak flow, and the probable maximum precipitation. Inundation maps were developed that show the estimated extent of downstream floodwaters from simulated scenarios. Simulation results were used to determine the hazard classification of a dam break (high, significant, or low), based primarily on the potential for loss of life or property damage resulting from downstream inundation because of the flood surge. The inflow design floods resulting from the two simulated storm events (100-year 24-hour and probable maximum precipitation) were determined using the U.S. Army Corps of Engineers Hydrologic Engineering Center Hydrologic Modeling System (HEC-HMS). The inflow design flood for the 500-year recurrence peak flow was determined by using regional regression equations developed for streamflow-gaging stations with similar watershed characteristics. The step-backwater hydraulic analysis model, Hydrologic Engineering Center's River Analysis System (HEC-RAS), was used to determine water-surface profiles of in-place and dam-break scenarios for the three inflow design floods that were simulated. Inundation maps for in-place and dam-break scenarios were developed for the area downstream from each dam to the mouth of each stream. Dam-break scenarios for three of the four reservoirs assessed in this study were rated as low hazards owing to the absence of permanent structures downstream from the dams. Iron Creek Lake's downstream channel to its mouth does not include any permanent structures within the inundation flood plains. For the two reservoirs with the largest watershed areas, Lakota and Mitchell Lakes, the additional floodwater surge resulting from a dam break would be minor relative to the magnitude of the large flood streamflow into the reservoirs, based on the similar areal extent of inundation for the in-place and dam-break scenarios indicated by the developed maps. A dam-break scenario at Horsethief Lake is rated as a significant hazard because of potential lives in jeopardy in downstream dwellings and appreciable economic loss.

  19. [Uncertainty analysis of ecological risk assessment caused by heavy-metals deposition from MSWI emission].

    PubMed

    Liao, Zhi-Heng; Sun, Jia-Ren; Wu, Dui; Fan, Shao-Jia; Ren, Ming-Zhong; Lü, Jia-Yang

    2014-06-01

    The CALPUFF model was applied to simulate the ground-level atmospheric concentrations of Pb and Cd from municipal solid waste incineration (MSWI) plants, and a soil concentration model was used to estimate soil concentration increments after atmospheric deposition based on Monte Carlo simulation; ecological risk assessment was then conducted by the potential ecological risk index method. The results showed that the largest atmospheric concentrations of Pb and Cd were 5.59 x 10(-3) microg x m(-3) and 5.57 x 10(-4) microg x m(-3), respectively, while the maximum median soil concentration increments of Pb and Cd were 2.26 mg x kg(-1) and 0.21 mg x kg(-1), respectively. High risk areas were located next to the incinerators; Cd contributed the most to the ecological risk, and Pb was basically free of pollution risk. A higher ecological hazard level was predicted at the most polluted point in urban areas with a 55.30% probability, while in rural areas the most polluted point was assessed at a moderate ecological hazard level with a 72.92% probability. In addition, sensitivity analysis of the calculation parameters in the soil concentration model was conducted, which showed that the simulated results for urban and rural areas were most sensitive to soil mixing depth and dry deposition rate, respectively.
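
    As a hedged illustration of the assessment step described above, the sketch below draws Monte Carlo samples of a Hakanson-style potential ecological risk index. The toxic-response factors follow Hakanson's published values, but the background concentrations, lognormal spread and "moderate" threshold are illustrative assumptions, not the paper's inputs.

```python
# A minimal sketch of the potential ecological risk index with Monte Carlo
# sampling of soil-concentration increments; background values and the
# lognormal uncertainty model are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(0)
T_R = {"Pb": 5.0, "Cd": 30.0}            # Hakanson toxic-response factors
BACKGROUND = {"Pb": 35.0, "Cd": 0.10}    # assumed background soil levels, mg/kg
MEDIAN_INC = {"Pb": 2.26, "Cd": 0.21}    # median increments from the abstract, mg/kg

def risk_index_samples(n=100_000, gsd=1.8):
    """Sample RI = sum_i T_r[i] * C[i] / C_n[i] over lognormal increments."""
    ri = np.zeros(n)
    for metal, t_r in T_R.items():
        # geometric standard deviation (gsd) is an assumed uncertainty model
        c = MEDIAN_INC[metal] * rng.lognormal(0.0, np.log(gsd), n)
        ri += t_r * c / BACKGROUND[metal]
    return ri

ri = risk_index_samples()
print(f"P(RI >= 150, assumed 'moderate' threshold) = {(ri >= 150).mean():.2%}")
```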

  20. 40 CFR 266.240 - How could you lose the conditional exemption for your LLMW and what action must you take?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) SOLID WASTES (CONTINUED) STANDARDS FOR THE MANAGEMENT OF SPECIFIC HAZARDOUS WASTES AND SPECIFIC TYPES OF HAZARDOUS WASTE MANAGEMENT FACILITIES Conditional Exemption for Low-Level Mixed Waste Storage, Treatment, Transportation and Disposal. Loss of Conditional Exemption § 266...

  1. 40 CFR 266.240 - How could you lose the conditional exemption for your LLMW and what action must you take?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) SOLID WASTES (CONTINUED) STANDARDS FOR THE MANAGEMENT OF SPECIFIC HAZARDOUS WASTES AND SPECIFIC TYPES OF HAZARDOUS WASTE MANAGEMENT FACILITIES Conditional Exemption for Low-Level Mixed Waste Storage, Treatment, Transportation and Disposal Loss of Conditional Exemption § 266...

  2. 40 CFR 266.240 - How could you lose the conditional exemption for your LLMW and what action must you take?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) SOLID WASTES (CONTINUED) STANDARDS FOR THE MANAGEMENT OF SPECIFIC HAZARDOUS WASTES AND SPECIFIC TYPES OF HAZARDOUS WASTE MANAGEMENT FACILITIES Conditional Exemption for Low-Level Mixed Waste Storage, Treatment, Transportation and Disposal. Loss of Conditional Exemption § 266...

  3. 40 CFR 266.240 - How could you lose the conditional exemption for your LLMW and what action must you take?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) SOLID WASTES (CONTINUED) STANDARDS FOR THE MANAGEMENT OF SPECIFIC HAZARDOUS WASTES AND SPECIFIC TYPES OF HAZARDOUS WASTE MANAGEMENT FACILITIES Conditional Exemption for Low-Level Mixed Waste Storage, Treatment, Transportation and Disposal Loss of Conditional Exemption § 266...

  4. 40 CFR 266.350 - What records must you keep at your facility and for how long?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... after the exempted waste is sent for disposal. (e) If you are not already subject to NRC, or NRC... AGENCY (CONTINUED) SOLID WASTES (CONTINUED) STANDARDS FOR THE MANAGEMENT OF SPECIFIC HAZARDOUS WASTES AND SPECIFIC TYPES OF HAZARDOUS WASTE MANAGEMENT FACILITIES Conditional Exemption for Low-Level Mixed Waste...

  5. Preclinical evaluation of implantable cardioverter-defibrillator developed for magnetic resonance imaging use.

    PubMed

    Gold, Michael R; Kanal, Emanuel; Schwitter, Juerg; Sommer, Torsten; Yoon, Hyun; Ellingson, Michael; Landborg, Lynn; Bratten, Tara

    2015-03-01

    Many patients with an implantable cardioverter-defibrillator (ICD) have indications for magnetic resonance imaging (MRI). However, MRI is generally contraindicated in ICD patients because of potential risks from hazardous interactions between the MRI and ICD system. The purpose of this study was to use preclinical computer modeling, animal studies, and bench and scanner testing to demonstrate the safety of an ICD system developed for 1.5-T whole-body MRI. MRI hazards were assessed and mitigated using multiple approaches: design decisions to increase safety and reliability, modeling and simulation to quantify clinical MRI exposure levels, animal studies to quantify the physiologic effects of MRI exposure, and bench testing to evaluate safety margin. Modeling estimated the incidence of a chronic change in pacing capture threshold >0.5 V and 1.0 V to be less than 1 in 160,000 and less than 1 in 1,000,000 cases, respectively. Modeling also estimated the incidence of unintended cardiac stimulation to occur in less than 1 in 1,000,000 cases. Animal studies demonstrated no delay in ventricular fibrillation detection and no reduction in ventricular fibrillation amplitude at clinical MRI exposure levels, even with multiple exposures. Bench and scanner testing demonstrated performance and safety against all other MRI-induced hazards. A preclinical strategy that includes comprehensive computer modeling, animal studies, and bench and scanner testing predicts that an ICD system developed for the magnetic resonance environment is safe and poses very low risks when exposed to 1.5-T normal operating mode whole-body MRI. Copyright © 2015 Heart Rhythm Society. Published by Elsevier Inc. All rights reserved.

  6. Development of Candidate Chemical Simulant List: The Evaluation of Candidate Chemical Simulants Which May Be Used in Chemically Hazardous Operations

    DTIC Science & Technology

    1982-12-01

    generation FDA Food and Drug Administration (U.S.A.) FEMA Flavoring Extract Manufacturer's Association FID Flame ionization detector FPD Flame...medicinally in the form of local analgesic or anti-inflammatory ointments or liniments (Collins et al., 1971). It was given GRAS status by the Flavor ...methyl salicylate is considered safe for use as a flavoring agent in various foods when added in low concentrations, it has been found to be acutely

  7. Stabilization and disposal of Argonne-West low-level mixed wastes in ceramicrete waste forms.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barber, D. B.; Singh, D.; Strain, R. V.

    1998-02-17

    The technology of room-temperature-setting phosphate ceramics, or Ceramicrete{trademark} technology, developed at Argonne National Laboratory (ANL)-East, is being used to treat and dispose of low-level mixed wastes throughout the Department of Energy complex. During the past year, Ceramicrete{trademark} technology was implemented for field application at ANL-West. Two debris wastes were treated and stabilized: (a) Hg-contaminated low-level radioactive crushed light bulbs and (b) low-level radioactive Pb-lined gloves (part of the MWIR {number_sign} AW-W002 waste stream). In addition to hazardous metals, these wastes are contaminated with low-level fission products. Initially, bench-scale waste forms with simulated and actual waste streams were fabricated by acid-base reactions between mixtures of magnesium oxide powders and an acid phosphate solution, and the wastes. Size reduction of the Pb-lined plastic glove waste was accomplished by cryofractionation. The Ceramicrete{trademark} process produces dense, hard ceramic waste forms. Toxicity Characteristic Leaching Procedure (TCLP) results showed excellent stabilization of both Hg and Pb in the waste forms. The principal advantage of this technology is that immobilization of contaminants is the result of both chemical stabilization and subsequent microencapsulation of the reaction products. Based on bench-scale studies, Ceramicrete{trademark} technology has been implemented in the fabrication of 5-gal waste forms at ANL-West. Approximately 35 kg of real waste has been treated. The TCLP is being conducted on samples from the 5-gal waste forms. It is expected that because the waste forms pass the limits set by the EPA's Universal Treatment Standard, they will be sent to a radioactive-waste disposal facility.

  8. Managing the financial risk of low water levels in Great Lakes with index-based contracts

    NASA Astrophysics Data System (ADS)

    Meyer, E.; Characklis, G. W.; Brown, C. M.; Moody, P.

    2014-12-01

    Low water levels in the Great Lakes have recently had significant financial impacts on the region's commercial shipping, responsible for transporting millions of dollars' worth of bulk goods each year. Low lake levels can significantly affect shipping firms, as cargo capacity is a function of draft, or the distance between the water level and the ship's bottom. Draft increases with weight, and lower lake levels force ships to reduce cargo to prevent running aground in shallow harbors, directly impacting the finances of shipping companies. Risk transfer instruments may provide adaptable, yet unexplored, alternatives for managing these financial risks, at significantly less expense than more traditional solutions (e.g., dredging). Index-based financial instruments can be particularly attractive, as contract payouts are directly linked to well-defined, transparent metrics (e.g., lake levels), eliminating the need for subjective adjustors as well as concerns over moral hazard. In developing such instruments, a major challenge is identifying an index that is well correlated with financial losses, and thus a contract that reliably pays out when losses are experienced (low basis risk). In this work, a relationship between lake levels and shipping revenues is developed, and actuarial analyses of the frequency and magnitude of revenue losses are completed using this relationship and synthetic water level data. This analysis is used to develop several types of index-based contracts. A standardized suite of binary contracts is developed, with each indexed to lake levels and priced according to predefined thresholds. These are combined to form portfolios with different objectives (e.g., options, collars), with the optimal portfolio structure and length of coverage determined by limiting basis risk and contract cost, using simulations over the historic dataset. Results suggest that portfolios of these binary contracts can substantially reduce the risk of financial losses during periods of low lake level at a cost of only 1-3% of total revenues.
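
    The binary contracts described above lend themselves to a compact actuarial sketch: the fair premium is the trigger probability times the payout, to which an insurer loading is applied. The threshold, payout, loading and synthetic lake-level series below are all assumptions for illustration, not the study's calibrated values.

```python
# Illustrative pricing of a single binary lake-level contract from a
# synthetic water-level series; all numbers are assumed placeholders.
import numpy as np

rng = np.random.default_rng(1)
# stand-in for the synthetic lake-level data used in the actuarial analysis (m)
levels = rng.normal(loc=176.0, scale=0.35, size=50_000)

THRESHOLD = 175.6   # contract pays out when the index falls below this level
PAYOUT = 1_000_000  # fixed payout per contract, $
LOADING = 1.2       # loading on the actuarially fair price (assumed)

p_trigger = (levels < THRESHOLD).mean()   # empirical trigger probability
fair_premium = p_trigger * PAYOUT         # expected payout per year
quoted_premium = LOADING * fair_premium
print(f"trigger prob {p_trigger:.3f}, quoted premium ${quoted_premium:,.0f}")
```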

  9. Near and far field contamination modeling in a large scale enclosure: Fire Dynamics Simulator comparisons with measured observations.

    PubMed

    Ryder, Noah L; Schemel, Christopher F; Jankiewicz, Sean P

    2006-03-17

    The occurrence of a fire, no matter how small, often exposes objects to significant levels of contamination from the products of combustion. The production and dispersal of these contaminants has been an issue of relevance in the field of fire science for many years, though little work has been done to examine the contamination levels accumulated within an enclosure some time after an incident. This phenomenon is of great importance when considering the consequences associated with even low level contamination of sensitive materials, such as food, pharmaceuticals, clothing, electrical equipment, etc. Not only does such exposure present a localized hazard, but also the shipment of contaminated goods places distant recipients at risk. It is the intent of this paper to use a well-founded computational fluid dynamic (CFD) program, the Fire Dynamics Simulator (FDS), a large eddy simulation (LES) code developed by National Institute of Standards and Technology (NIST), to model smoke dispersion in order to assess the subject of air contamination and post fire surface contamination in a warehouse facility. Measured results are then compared with the results from the FDS model. Two components are examined: the production rate of contaminates and the trajectory of contaminates caused by the forced ventilation conditions. Each plays an important role in determining the extent to which the products of combustion are dispersed and the levels to which products are exposed to the contaminants throughout the enclosure. The model results indicate a good first-order approximation to the measured surface contamination levels. The proper application of the FDS model can provide a cost and time efficient means of evaluating contamination levels within a defined volume.

  10. Potential environmental impacts of light-emitting diodes (LEDs): metallic resources, toxicity, and hazardous waste classification.

    PubMed

    Lim, Seong-Rin; Kang, Daniel; Ogunseitan, Oladele A; Schoenung, Julie M

    2011-01-01

    Light-emitting diodes (LEDs) are advertised as environmentally friendly because they are energy efficient and mercury-free. This study aimed to determine if LEDs engender other forms of environmental and human health impacts, and to characterize variation across different LEDs based on color and intensity. The objectives are as follows: (i) to use standardized leachability tests to examine whether LEDs are to be categorized as hazardous waste under existing United States federal and California state regulations; and (ii) to use material life cycle impact and hazard assessment methods to evaluate resource depletion and toxicity potentials of LEDs based on their metallic constituents. According to federal standards, LEDs are not hazardous except for low-intensity red LEDs, which leached Pb at levels exceeding regulatory limits (186 mg/L; regulatory limit: 5). However, according to California regulations, excessive levels of copper (up to 3892 mg/kg; limit: 2500), Pb (up to 8103 mg/kg; limit: 1000), nickel (up to 4797 mg/kg; limit: 2000), or silver (up to 721 mg/kg; limit: 500) render all except low-intensity yellow LEDs hazardous. The environmental burden associated with resource depletion potentials derives primarily from gold and silver, whereas the burden from toxicity potentials is associated primarily with arsenic, copper, nickel, lead, iron, and silver. Establishing benchmark levels of these substances can help manufacturers implement design for environment through informed materials substitution, can motivate recyclers and waste management teams to recognize resource value and occupational hazards, and can inform policymakers who establish waste management policies for LEDs.

  11. Application of Gumbel I and Monte Carlo methods to assess seismic hazard in and around Pakistan

    NASA Astrophysics Data System (ADS)

    Rehman, Khaista; Burton, Paul W.; Weatherill, Graeme A.

    2018-05-01

    A proper assessment of seismic hazard is of considerable importance in order to achieve suitable building construction criteria. This paper presents probabilistic seismic hazard assessment in and around Pakistan (23° N-39° N; 59° E-80° E) in terms of peak ground acceleration (PGA). Ground motion is calculated in terms of PGA for a return period of 475 years using a seismogenic-free zone method of Gumbel's first asymptotic distribution of extreme values and Monte Carlo simulation. Appropriate attenuation relations of universal and local types have been used in this study. The results show that for many parts of Pakistan, the expected seismic hazard is relatively comparable with the level specified in the existing PGA maps.
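
    The Gumbel I step can be illustrated with a short sketch: fit the type I extreme-value distribution to annual maximum PGA and evaluate the quantile at annual non-exceedance probability 1 - 1/475. The synthetic annual maxima below stand in for a real catalogue and are not the study's data.

```python
# A sketch of the Gumbel-I return-level computation; the annual-maximum
# PGA sample is synthetic, generated under assumed parameters.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
annual_max_pga = rng.gumbel(loc=0.05, scale=0.02, size=60)  # g, synthetic

loc, scale = stats.gumbel_r.fit(annual_max_pga)  # fit Gumbel type I by MLE
T = 475.0                                        # target return period, years
pga_475 = stats.gumbel_r.ppf(1.0 - 1.0 / T, loc=loc, scale=scale)
print(f"475-year PGA ~ {pga_475:.3f} g")
```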

  12. Survivorship analysis when cure is a possibility: a Monte Carlo study.

    PubMed

    Goldman, A I

    1984-01-01

    Parametric survivorship analyses of clinical trials commonly involve the assumption of a hazard function constant with time. When the empirical curve obviously levels off, one can modify the hazard function model by use of a Gompertz or Weibull distribution with hazard decreasing over time. Some cancer treatments are thought to cure some patients within a short time of initiation. Then, instead of all patients having the same hazard, decreasing over time, a biologically more appropriate model assumes that an unknown proportion (1 - pi) has a constant high risk whereas the remaining proportion (pi) has essentially no risk. This paper discusses the maximum likelihood estimation of pi and the power curves of the likelihood ratio test. Monte Carlo studies provide results for a variety of simulated trials; empirical data illustrate the methods.
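
    A minimal sketch of this cure-mixture model is given below, assuming an exponential (constant-hazard) model for the at-risk fraction so that S(t) = pi + (1 - pi)exp(-lambda t). The simulated trial parameters and the fixed administrative censoring time are illustrative assumptions, not the paper's settings.

```python
# Maximum-likelihood estimation for the cure-mixture survival model
# S(t) = pi + (1 - pi) * exp(-lambda * t); a simulated trial with assumed
# parameters stands in for real data.
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

rng = np.random.default_rng(3)
n, pi_true, lam_true, follow_up = 200, 0.4, 0.5, 5.0
cured = rng.random(n) < pi_true
t_event = np.where(cured, np.inf, rng.exponential(1.0 / lam_true, n))
time = np.minimum(t_event, follow_up)          # administrative censoring
event = (t_event <= follow_up).astype(float)   # 1 = event observed

def neg_loglik(params):
    pi, lam = expit(params[0]), np.exp(params[1])  # keep pi in (0,1), lam > 0
    log_f = np.log1p(-pi) + np.log(lam) - lam * time       # density, events
    log_s = np.log(pi + (1.0 - pi) * np.exp(-lam * time))  # survival, censored
    return -np.sum(event * log_f + (1.0 - event) * log_s)

fit = minimize(neg_loglik, x0=[0.0, 0.0], method="Nelder-Mead")
print("pi_hat =", expit(fit.x[0]), "lambda_hat =", np.exp(fit.x[1]))
```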

  13. Sinkhole Susceptibility Hazard Zones Using GIS and Analytical Hierarchical Process (ahp): a Case Study of Kuala Lumpur and Ampang Jaya

    NASA Astrophysics Data System (ADS)

    Rosdi, M. A. H. M.; Othman, A. N.; Zubir, M. A. M.; Latif, Z. A.; Yusoff, Z. M.

    2017-10-01

    Sinkholes are not a new phenomenon in this country, especially around the Klang Valley. Since 1968, increasing numbers of sinkhole incidents have been reported in Kuala Lumpur and the vicinity. As a result, they pose a serious threat to human lives, assets and structures, especially in the capital city of Malaysia. Therefore, a Sinkhole Hazard Model (SHM) was generated within a GIS framework by applying the Analytical Hierarchical Process (AHP) technique in order to produce a sinkhole susceptibility hazard map for the particular area. Five main-criteria parameters, each categorized into five sub-classes, were selected for this research: Lithology (LT), Groundwater Level Decline (WLD), Soil Type (ST), Land Use (LU) and Proximity to Groundwater Wells (PG). A set of relative weights was assigned to each inducing factor and computed through a pairwise comparison matrix derived from expert judgment. Lithology and Groundwater Level Decline were identified as having the highest impact on sinkhole development. The sinkhole susceptibility hazard zones were classified into five classes, namely very low, low, moderate, high and very high hazard. The results obtained were validated with thirty-three (33) previous sinkhole inventory data points. This evaluation shows that the model places 64 % and 21 % of the sinkhole events within the high and very high hazard zones, respectively. Based on this outcome, it is clear that the AHP approach is useful for predicting natural disasters such as sinkhole hazards.
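
    The AHP weighting step admits a short sketch: priority weights are the normalized principal eigenvector of the pairwise comparison matrix, and internal consistency is checked via the consistency ratio. The matrix entries below are illustrative stand-ins for the study's expert judgments, and the factor ordering is assumed.

```python
# AHP priority weights and consistency check for a 5x5 pairwise comparison
# matrix; entries are illustrative, not the study's expert judgments.
import numpy as np

# factors (order assumed): LT, WLD, ST, LU, PG
A = np.array([
    [1,   2,   3,   4,   5],
    [1/2, 1,   2,   3,   4],
    [1/3, 1/2, 1,   2,   3],
    [1/4, 1/3, 1/2, 1,   2],
    [1/5, 1/4, 1/3, 1/2, 1],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                              # normalized priority weights

n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)      # consistency index
cr = ci / 1.12                            # Saaty random index RI = 1.12 for n = 5
print("weights:", np.round(w, 3), "CR =", round(cr, 3))  # CR < 0.1 is acceptable
```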

  14. Evaluation of iconic versus F-map microburst displays

    NASA Technical Reports Server (NTRS)

    Salzberger, Mark; Hansman, R. John; Wanke, Craig

    1994-01-01

    Previous studies have shown graphical presentation methods of hazardous wind shear to be superior to textual or audible warnings alone. Positional information and the strength of the hazard were cited by pilots as the most important factors in a display. In this experiment, the use of three different graphical presentations of hazardous wind shear is examined. Airborne predictive detectors of wind shear enable the dissemination of varying levels of information. The effectiveness of iconic and mapping display modes of different complexities is addressed through simulation and analysis. Different positional and time-varying situations are presented in a 'part-task' Boeing 767 simulator using data from actual microburst events. Experienced airline pilots fly approach profiles using both iconic and F-map wind shear alerting displays. The microburst activity accompanying each event is also shown to the pilot. Mapping display types are expected to be exceptionally efficient at conveying location comparison information, while iconic displays simplify the threat recognition process. Preliminary results from the simulator study are presented. Recommendations concerning the suitability of multilevel iconic and mapping displays are made. Situational problems with current display prototypes are also addressed.

  15. Normalization of Testosterone Levels After Testosterone Replacement Therapy Is Associated With Decreased Incidence of Atrial Fibrillation.

    PubMed

    Sharma, Rishi; Oni, Olurinde A; Gupta, Kamal; Sharma, Mukut; Sharma, Ram; Singh, Vikas; Parashara, Deepak; Kamalakar, Surineni; Dawn, Buddhadeb; Chen, Guoqing; Ambrose, John A; Barua, Rajat S

    2017-05-09

    Atrial fibrillation (AF) is the most common cardiac dysrhythmia associated with significant morbidity and mortality. Several small studies have reported that low serum total testosterone (TT) levels were associated with a higher incidence of AF. In contrast, it is also reported that anabolic steroid use is associated with an increase in the risk of AF. To date, no study has explored the effect of testosterone normalization on the new incidence of AF after testosterone replacement therapy (TRT) in patients with low testosterone. Using data from the Veterans Administration's Corporate Data Warehouse, we identified a national cohort of 76 639 veterans with low TT levels and divided them into 3 groups. Group 1 had TRT resulting in normalization of TT levels (normalized TRT), group 2 had TRT without normalization of TT levels (nonnormalized TRT), and group 3 did not receive TRT (no TRT). Propensity score-weighted, stabilized inverse probability of treatment weighting Cox proportional hazards methods were used for analysis of the data from these groups to determine the association between post-TRT levels of TT and the incidence of AF. Group 1 (40 856 patients, median age 66 years) had a significantly lower risk of AF than group 2 (23 939 patients, median age 65 years; hazard ratio 0.90, 95% CI 0.81-0.99, P=0.0255) and group 3 (11 853 patients, median age 67 years; hazard ratio 0.79, 95% CI 0.70-0.89, P=0.0001). There was no statistical difference between groups 2 and 3 (hazard ratio 0.89, 95% CI 0.78-1.0009, P=0.0675) in the incidence of AF. These novel results suggest that normalization of TT levels after TRT is associated with a significant decrease in the incidence of AF. © 2017 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley.
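
    A hedged sketch of the weighted survival step is shown below using the lifelines package; the toy data frame, column names and weights are assumptions and do not reproduce the VA cohort or its propensity model.

```python
# A minimal sketch of an IPTW-weighted Cox proportional hazards fit with
# lifelines; the data frame and its column names are illustrative assumptions.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(4)
n = 5_000
df = pd.DataFrame({
    "normalized_trt": rng.integers(0, 2, n),   # 1 = TT normalized after TRT (toy)
    "time_to_af": rng.exponential(8.0, n),     # years of follow-up (toy)
    "af_event": rng.integers(0, 2, n),         # 1 = incident AF observed (toy)
    "iptw": rng.uniform(0.5, 2.0, n),          # stabilized IPT weights (toy)
})

cph = CoxPHFitter()
# robust=True requests a sandwich variance estimate, appropriate with weights
cph.fit(df, duration_col="time_to_af", event_col="af_event",
        weights_col="iptw", robust=True)
print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])
```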

  16. Analysis and improved design considerations for airborne pulse Doppler radar signal processing in the detection of hazardous windshear

    NASA Technical Reports Server (NTRS)

    Lee, Jonggil

    1990-01-01

    High resolution windspeed profile measurements are needed to provide reliable detection of hazardous low altitude windshear with an airborne pulse Doppler radar. The system phase noise in a Doppler weather radar may degrade the spectrum moment estimation quality and the clutter cancellation capability which are important in windshear detection. Also the bias due to weather return Doppler spectrum skewness may cause large errors in pulse pair spectral parameter estimates. These effects are analyzed for the improvement of an airborne Doppler weather radar signal processing design. A method is presented for the direct measurement of windspeed gradient using low pulse repetition frequency (PRF) radar. This spatial gradient is essential in obtaining the windshear hazard index. As an alternative, the modified Prony method is suggested as a spectrum mode estimator for both the clutter and weather signal. Estimation of Doppler spectrum modes may provide the desired windshear hazard information without the need of any preliminary processing requirement such as clutter filtering. The results obtained by processing a NASA simulation model output support consideration of mode identification as one component of a windshear detection algorithm.

  17. Simulation-Based Probabilistic Tsunami Hazard Analysis: Empirical and Robust Hazard Predictions

    NASA Astrophysics Data System (ADS)

    De Risi, Raffaele; Goda, Katsuichiro

    2017-08-01

    Probabilistic tsunami hazard analysis (PTHA) is the prerequisite for rigorous risk assessment and thus for decision-making regarding risk mitigation strategies. This paper proposes a new simulation-based methodology for tsunami hazard assessment for a specific site of an engineering project along the coast or, more broadly, for a wider tsunami-prone region. The methodology incorporates numerous uncertain parameters that are related to geophysical processes by adopting new scaling relationships for tsunamigenic seismic regions. Through the proposed methodology it is possible to obtain either a tsunami hazard curve for a single location, that is, the representation of a tsunami intensity measure (such as inundation depth) versus its mean annual rate of occurrence, or tsunami hazard maps, representing the expected tsunami intensity measures within a geographical area for a specific probability of occurrence in a given time window. In addition to the conventional tsunami hazard curve that is based on an empirical statistical representation of the simulation-based PTHA results, this study presents a robust tsunami hazard curve, which is based on a Bayesian fitting methodology. The robust approach allows a significant reduction of the number of simulations and, therefore, of the computational effort. Both methods produce a central estimate of the hazard as well as a confidence interval, facilitating the rigorous quantification of the hazard uncertainties.
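
    The empirical hazard-curve construction can be sketched as follows: given a simulated event set in which each event carries a mean annual occurrence rate and a computed intensity at the site, the hazard curve is the summed rate of events exceeding each intensity level. The event set and per-event rates below are synthetic assumptions.

```python
# Empirical tsunami hazard curve from a synthetic stochastic event set:
# mean annual exceedance rate of inundation depth at one coastal site.
import numpy as np

rng = np.random.default_rng(5)
n_events = 2_000
depths = rng.lognormal(mean=0.0, sigma=0.8, size=n_events)  # m, synthetic
rates = np.full(n_events, 1e-4)                             # yr^-1, assumed

im_levels = np.linspace(0.1, 8.0, 80)
annual_rate = np.array([rates[depths > im].sum() for im in im_levels])

# probability of at least one exceedance in 50 years (Poisson assumption)
p_50yr = 1.0 - np.exp(-annual_rate * 50.0)
print(f"depth at ~10% in 50 yr: "
      f"{im_levels[np.argmin(np.abs(p_50yr - 0.10))]:.2f} m")
```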

  18. Three-Dimensional Simulation of Avalanche-Generated Impulse Waves and Evaluation of Lake-Lowering Scenarios at Lake Palcacocha, Peru

    NASA Astrophysics Data System (ADS)

    Chisolm, R. E.; McKinney, D. C.

    2014-12-01

    Accelerated retreat of Andean glaciers in recent decades due to a warming climate has caused the emergence and growth of glacial lakes. As these lakes continue to grow, they pose an increasing risk of glacial lake outburst floods (GLOFs). GLOFs can be triggered by moraine failures or by avalanches, rockslides, or ice calving into glacial lakes. For many decades, Lake Palcacocha in the Cordillera Blanca, Peru has threatened citizens living in the city of Huaraz, which was devastated by a GLOF in 1941. A safety system for Lake Palcacocha was put in place in the 1970s to control the lake level, but the lake has since grown to the point where it is once again dangerous. Overhanging ice from the glaciers above and a relatively low freeboard make the lake vulnerable to avalanches and landslides. Lake Palcacocha is used as a case study to investigate the impact of an avalanche event on the lake dynamics. Three-dimensional lake modeling in the context of glacial hazards is not common, but 3D simulations can enhance our understanding of avalanche-generated impulse waves and their downstream impacts. In this work, a 3D hydrodynamic model is used to simulate the generation of an impulse wave from an avalanche falling into the lake, wave propagation, and overtopping of the terminal moraine. These results are used as inputs to a downstream model to predict the impact from a GLOF. As lowering the level of the lake is the most likely mitigation alternative, several scenarios are considered to evaluate the impact from avalanche events with a reduction in the lake level. The results of this work can be used to evaluate the effectiveness of the current lake management system and potential lake-lowering alternatives. Use of a robust 3D lake model enables more accurate predictions of peak flows during GLOF events and of the time scales of these events, so that mitigation strategies can be developed that reduce the risk to communities living downstream of hazardous lakes.

  19. Tele-Supervised Adaptive Ocean Sensor Fleet

    NASA Technical Reports Server (NTRS)

    Lefes, Alberto; Podnar, Gregg W.; Dolan, John M.; Hosler, Jeffrey C.; Ames, Troy J.

    2009-01-01

    The Tele-supervised Adaptive Ocean Sensor Fleet (TAOSF) is a multi-robot science exploration architecture and system that uses a group of robotic boats (the Ocean-Atmosphere Sensor Integration System, or OASIS) to enable in-situ study of ocean surface and subsurface characteristics and the dynamics of such ocean phenomena as coastal pollutants, oil spills, hurricanes, or harmful algal blooms (HABs). The OASIS boats are extended-deployment, autonomous ocean surface vehicles. The TAOSF architecture provides an integrated approach to multi-vehicle coordination and sliding human-vehicle autonomy. One feature of TAOSF is the adaptive re-planning of the activities of the OASIS vessels based on sensor input ("smart sensing") and sensorial coordination among multiple assets. The architecture also incorporates Web-based communications that permit control of the assets over long distances and the sharing of data with remote experts. Autonomous hazard and assistance detection allows the automatic identification of hazards that require human intervention to ensure the safety and integrity of the robotic vehicles, or of science data that require human interpretation and response. Also, the architecture is designed for science analysis of acquired data in order to perform an initial onboard assessment of the presence of specific science signatures of immediate interest. TAOSF integrates and extends five subsystems developed by the participating institutions: Emergent Space Technologies, Wallops Flight Facility, NASA's Goddard Space Flight Center (GSFC), Carnegie Mellon University, and Jet Propulsion Laboratory (JPL). The OASIS Autonomous Surface Vehicle (ASV) system, which includes the vessels as well as the land-based control and communications infrastructure developed for them, controls the hardware of each platform (sensors, actuators, etc.), and also provides a low-level waypoint navigation capability. The Multi-Platform Simulation Environment from GSFC is a surrogate for the OASIS ASV system and allows for independent development and testing of higher-level software components. The Platform Communicator acts as a proxy for both actual and simulated platforms. It translates platform-independent messages from the higher control systems to the device-dependent communication protocols. This enables the higher-level control systems to interact identically with heterogeneous actual or simulated platforms.

  20. Testing the high turbulence level breakdown of low-frequency gyrokinetics against high-frequency cyclokinetic simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deng, Zhao, E-mail: zhao.deng@foxmail.com; Waltz, R. E.

    2015-05-15

    This paper presents numerical simulations of the nonlinear cyclokinetic equations in the cyclotron harmonic representation [R. E. Waltz and Zhao Deng, Phys. Plasmas 20, 012507 (2013)]. Simulations are done with a local flux-tube geometry and with the parallel motion and variation suppressed using a newly developed rCYCLO code. Cyclokinetic simulations dynamically follow the high-frequency ion gyro-phase motion which is nonlinearly coupled into the low-frequency drift-waves, possibly interrupting and suppressing gyro-averaging and increasing the transport over gyrokinetic levels. By comparing the more fundamental cyclokinetic simulations with the corresponding gyrokinetic simulations, the breakdown of gyrokinetics at high turbulence levels is quantitatively tested over a range of relative ion cyclotron frequency 10 < Ω* < 100, where Ω* = 1/ρ* and ρ* is the relative ion gyroradius. The gyrokinetic linear mode rates closely match the cyclokinetic low-frequency rates for Ω* > 5. Gyrokinetic transport recovers cyclokinetic transport at high relative ion cyclotron frequency (Ω* ≥ 50) and low turbulence level, as required. Cyclokinetic transport is found to be lower than gyrokinetic transport at high turbulence levels and low Ω* values with stable ion cyclotron (IC) modes. The gyrokinetic approximation is found to break down when the density perturbations exceed 20%. For cyclokinetic simulations with sufficiently unstable IC modes and sufficiently low Ω* ∼ 10, the high-frequency component of the cyclokinetic transport level can exceed the gyrokinetic transport level. However, the low-frequency component of the cyclokinetic transport and turbulence level does not exceed that of gyrokinetics. At higher and more physically relevant Ω* ≥ 50 values and physically realistic IC driving rates, the low-frequency component of the cyclokinetic transport and turbulence level is still smaller than that of gyrokinetics. Thus, the cyclokinetic simulations do not account for the so-called "L-mode near edge short fall" seen in some low-frequency gyrokinetic transport and turbulence simulations.

  1. Inter-model analysis of tsunami-induced coastal currents

    NASA Astrophysics Data System (ADS)

    Lynett, Patrick J.; Gately, Kara; Wilson, Rick; Montoya, Luis; Arcas, Diego; Aytore, Betul; Bai, Yefei; Bricker, Jeremy D.; Castro, Manuel J.; Cheung, Kwok Fai; David, C. Gabriel; Dogan, Gozde Guney; Escalante, Cipriano; González-Vida, José Manuel; Grilli, Stephan T.; Heitmann, Troy W.; Horrillo, Juan; Kânoğlu, Utku; Kian, Rozita; Kirby, James T.; Li, Wenwen; Macías, Jorge; Nicolsky, Dmitry J.; Ortega, Sergio; Pampell-Manis, Alyssa; Park, Yong Sung; Roeber, Volker; Sharghivand, Naeimeh; Shelby, Michael; Shi, Fengyan; Tehranirad, Babak; Tolkova, Elena; Thio, Hong Kie; Velioğlu, Deniz; Yalçıner, Ahmet Cevdet; Yamazaki, Yoshiki; Zaytsev, Andrey; Zhang, Y. J.

    2017-06-01

    To help produce accurate and consistent maritime hazard products, the National Tsunami Hazard Mitigation Program organized a benchmarking workshop to evaluate the numerical modeling of tsunami currents. Thirteen teams of international researchers, using a set of tsunami models currently utilized for hazard mitigation studies, presented results for a series of benchmarking problems; these results are summarized in this paper. Comparisons focus on physical situations where the currents are shear and separation driven, and are thus de-coupled from the incident tsunami waveform. In general, we find that models of increasing physical complexity provide better accuracy, and that low-order three-dimensional models are superior to high-order two-dimensional models. Inside separation zones and in areas strongly affected by eddies, the magnitude of both model-data errors and inter-model differences can be the same as the magnitude of the mean flow. Thus, we make arguments for the need of an ensemble modeling approach for areas affected by large-scale turbulent eddies, where deterministic simulation may be misleading. As a result of the analyses presented herein, we expect that tsunami modelers now have a better awareness of their ability to accurately capture the physics of tsunami currents, and therefore a better understanding of how to use these simulation tools for hazard assessment and mitigation efforts.

  2. A geostatistical extreme-value framework for fast simulation of natural hazard events

    PubMed Central

    Stephenson, David B.

    2016-01-01

    We develop a statistical framework for simulating natural hazard events that combines extreme value theory and geostatistics. Robust generalized additive model forms represent generalized Pareto marginal distribution parameters while a Student’s t-process captures spatial dependence and gives a continuous-space framework for natural hazard event simulations. Efficiency of the simulation method allows many years of data (typically over 10 000) to be obtained at relatively little computational cost. This makes the model viable for forming the hazard module of a catastrophe model. We illustrate the framework by simulating maximum wind gusts for European windstorms, which are found to have realistic marginal and spatial properties, and validate well against wind gust measurements. PMID:27279768
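
    A minimal sketch of the marginal structure is given below: a spatially dependent Student's t draw is pushed through its CDF to uniform margins and then into generalized Pareto quantiles, mimicking gust excesses over sites. The covariance function, degrees of freedom and GPD parameters are assumptions, not the fitted model.

```python
# Spatially dependent extremes via a Student-t field with GPD margins;
# all parameters below are illustrative assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
n_sites, nu = 50, 5                      # sites on a transect; t degrees of freedom
x = np.linspace(0.0, 10.0, n_sites)
cov = np.exp(-np.abs(x[:, None] - x[None, :]) / 2.0)  # assumed correlation decay

# multivariate t draw: Gaussian field scaled by an inverse-chi-square factor
z = rng.multivariate_normal(np.zeros(n_sites), cov)
t_field = z / np.sqrt(rng.chisquare(nu) / nu)

u = stats.t.cdf(t_field, df=nu)                             # uniform margins
gusts = stats.genpareto.ppf(u, c=0.1, loc=25.0, scale=5.0)  # m/s, assumed GPD
print("simulated max gust over sites: %.1f m/s" % gusts.max())
```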

  3. Significant events in low-level flow conditions hazardous to aircraft

    NASA Technical Reports Server (NTRS)

    Alexander, M. B.; Camp, D. W.

    1983-01-01

    Atmospheric parameters recorded during high surface winds are analyzed to determine the magnitude, frequency, duration, and simultaneity of occurrence of low-level flow conditions known to be hazardous to the ascent and descent of conventional aircraft and the space shuttle. Graphic and tabular presentations of mean and extreme values and simultaneous occurrences of turbulence (gustiness and a gust factor), wind shear (speed and direction), and vertical motion (updrafts and downdrafts), along with associated temperature inversions, are included as a function of tower height, layer and/or distance for six 5-sec intervals (one interval every 100 sec) of parameters sampled simultaneously at the rate of 10 speeds, directions and temperatures per second during an approximately 10-min period.

  4. Usability Evaluation of a Flight-Deck Airflow Hazard Visualization System

    NASA Technical Reports Server (NTRS)

    Aragon, Cecilia R.

    2004-01-01

    Many aircraft accidents each year are caused by encounters with unseen airflow hazards near the ground, such as vortices, downdrafts, low level wind shear, microbursts, or turbulence from surrounding vegetation or structures near the landing site. These hazards can be dangerous even to airliners; there have been hundreds of fatalities in the United States in the last two decades attributable to airliner encounters with microbursts and low level wind shear alone. However, helicopters are especially vulnerable to airflow hazards because they often have to operate in confined spaces and under operationally stressful conditions (such as emergency search and rescue, military or shipboard operations). Providing helicopter pilots with an augmented-reality display visualizing local airflow hazards may be of significant benefit. However, the form such a visualization might take, and whether it does indeed provide a benefit, had not been studied before our experiment. We recruited experienced military and civilian helicopter pilots for a preliminary usability study to evaluate a prototype augmented-reality visualization system. The study had two goals: first, to assess the efficacy of presenting airflow data in flight; and second, to obtain expert feedback on sample presentations of hazard indicators to refine our design choices. The study addressed the optimal way to provide critical safety information to the pilot, what level of detail to provide, whether to display specific aerodynamic causes or potential effects only, and how to safely and effectively shift the locus of attention during a high-workload task. Three-dimensional visual cues, with varying shape, color, transparency, texture, depth cueing, and use of motion, depicting regions of hazardous airflow, were developed and presented to the pilots. The study results indicated that such a visualization system could be of significant value in improving safety during critical takeoff and landing operations, and also gave clear indications of the best design choices in producing the hazard visual cues.

  5. Map Your Hazards! - an Interdisciplinary, Place-Based Educational Approach to Assessing Natural Hazards, Social Vulnerability, Risk and Risk Perception.

    NASA Astrophysics Data System (ADS)

    Brand, B. D.; McMullin-Messier, P. A.; Schlegel, M. E.

    2014-12-01

    'Map your Hazards' is an educational module developed within the NSF Interdisciplinary Teaching about Earth for a Sustainable Future (InTeGrate) program. The module engages students in place-based explorations of natural hazards, social vulnerability, and the perception of natural hazards and risk. Students integrate geoscience and social science methodologies to (1) identify and assess hazards, vulnerability and risk within their communities; (2) distribute, collect and evaluate survey data (designed by the authors) on the knowledge, risk perception and preparedness within their social networks; and (3) deliver a PPT presentation to local stakeholders detailing their findings and recommendations for the development of a prepared, resilient community. 'Map your Hazards' underwent four rigorous assessments by a team of geoscience educators and external review before being piloted in our classrooms. The module was piloted in a 300-level 'Volcanoes and Society' course at Boise State University, a 300-level 'Environmental Sociology' course at Central Washington University, and a 100-level 'Natural Disasters and Environmental Geology' course at the College of Western Idaho. In all courses, students reported a fascination with learning about the hazards around them and identifying the high-risk areas in their communities. They were also surprised at the low level of knowledge, inaccurate risk perception and lack of preparedness of their social networks. This successful approach to engaging students in an interdisciplinary, place-based learning environment also has the broader implication of raising awareness of natural hazards (survey participants are provided links to local hazard and preparedness information). The data and preparedness suggestions can be shared with local emergency managers, who are encouraged to attend the students' final presentations. All module materials are published at serc.carleton.edu/integrate/ and are appropriate for a wide range of classrooms.

  6. Comparative hazard analysis and toxicological modeling of diverse nanomaterials using the embryonic zebrafish (EZ) metric of toxicity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harper, Bryan; Thomas, Dennis G.; Chikkagoudar, Satish

    The integration of rapid assays, large data sets, informatics and modeling can overcome current barriers in understanding nanomaterial structure-toxicity relationships by providing a weight-of-the-evidence mechanism to generate hazard rankings for nanomaterials. Here we present the use of a rapid, low-cost assay to perform screening-level toxicity evaluations of nanomaterials in vivo. Calculated EZ Metric scores, a combined measure of morbidity and mortality, were established at realistic exposure levels and used to develop a predictive model of nanomaterial toxicity. Hazard ranking and clustering analysis of 68 diverse nanomaterials revealed distinct patterns of toxicity related to both the core composition and the outermost surface chemistry of nanomaterials. The resulting clusters guided the development of a predictive model of gold nanoparticle toxicity to embryonic zebrafish. In addition, our findings suggest that risk assessments based on the size and core composition of nanomaterials alone may be wholly inappropriate, especially when considering complex engineered nanomaterials. These findings reveal the need to expeditiously increase the availability of quantitative measures of nanomaterial hazard and to broaden the sharing of that data and knowledge to support predictive modeling. In addition, research should continue to focus on methodologies for developing predictive models of nanomaterial hazard based on sub-lethal responses to low dose exposures.
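
    As a hedged sketch of the ranking-and-clustering step, the code below forms an EZ-Metric-like weighted morbidity/mortality score per material and clusters the endpoint response profiles hierarchically. The endpoint weighting and the response data are illustrative assumptions, not the published assay results.

```python
# Hazard ranking by a weighted morbidity/mortality score plus hierarchical
# clustering of endpoint profiles; data and weights are synthetic.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(7)
n_materials, n_endpoints = 68, 22
# fraction of embryos affected per endpoint at a fixed exposure (synthetic)
responses = rng.beta(0.5, 5.0, size=(n_materials, n_endpoints))

weights = np.ones(n_endpoints)
weights[0] = 3.0                          # weight mortality more heavily (assumed)
ez_scores = responses @ weights / weights.sum()

Z = linkage(responses, method="ward")     # cluster by full endpoint profile
clusters = fcluster(Z, t=5, criterion="maxclust")
ranking = np.argsort(ez_scores)[::-1]
print("highest-scoring material ids:", ranking[:5],
      "| cluster sizes:", np.bincount(clusters)[1:])
```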

  8. Assessing the Impacts of Flooding Caused by Extreme Rainfall Events Through a Combined Geospatial and Numerical Modeling Approach

    NASA Astrophysics Data System (ADS)

    Santillan, J. R.; Amora, A. M.; Makinano-Santillan, M.; Marqueso, J. T.; Cutamora, L. C.; Serviano, J. L.; Makinano, R. M.

    2016-06-01

    In this paper, we present a combined geospatial and two-dimensional (2D) flood modeling approach to assess the impacts of flooding due to extreme rainfall events. We developed and implemented this approach for the Tago River Basin in the province of Surigao del Sur in Mindanao, Philippines, an area which suffered great damage due to flooding caused by Tropical Storms Lingling and Jangmi in 2014. The geospatial component of the approach involves extraction of several layers of information, such as detailed topography/terrain and man-made features (buildings, roads, bridges) from 1-m spatial resolution LiDAR Digital Surface and Terrain Models (DSMs/DTMs), and recent land cover from Landsat 7 ETM+ and Landsat 8 OLI images. We then used these layers as inputs in developing a Hydrologic Engineering Center Hydrologic Modeling System (HEC-HMS)-based hydrologic model, and a hydraulic model based on the 2D module of the latest version of HEC River Analysis System (HEC-RAS), to dynamically simulate and map the depth and extent of flooding due to extreme rainfall events. The extreme rainfall events used in the simulation represent six hypothetical rainfall events with return periods of 2, 5, 10, 25, 50, and 100 years. For each event, maximum flood depth maps were generated from the simulations, and these maps were further transformed into hazard maps by categorizing the flood depth into low, medium and high hazard levels. Using both the flood hazard maps and the layers of information extracted from remotely sensed datasets in spatial overlay analysis, we were then able to estimate and assess the impacts of these flooding events on buildings, roads, bridges and land cover. Results of the assessments revealed increases in the number of buildings, roads and bridges, and in the area of land cover, exposed to various flood hazards as rainfall events become more extreme. The wealth of information generated from the flood impact assessment using this approach can be very useful to the local government units and the concerned communities within the Tago River Basin as an aid in determining in advance all those infrastructures (buildings, roads and bridges) and land cover that can be affected by different extreme rainfall flood scenarios.
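
    The depth-to-hazard categorization and overlay step can be sketched in a few lines; the depth thresholds for the low/medium/high classes and the synthetic depth raster and building mask below are assumptions, not the study's calibrated values.

```python
# Bin a maximum flood-depth raster into hazard classes and overlay it on a
# building mask; thresholds and rasters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(8)
depth = rng.gamma(shape=1.5, scale=0.4, size=(500, 500))  # m, synthetic raster
buildings = rng.random((500, 500)) < 0.02                 # building-pixel mask

# class 0 = none, 1 = low (<0.5 m), 2 = medium (0.5-1.5 m), 3 = high (>1.5 m)
hazard = np.digitize(depth, bins=[0.01, 0.5, 1.5])

exposed = np.bincount(hazard[buildings], minlength=4)
for cls, name in enumerate(["none", "low", "medium", "high"]):
    print(f"{name:>6}: {exposed[cls]} building pixels")
```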

  9. Difficulties in applying numerical simulations to an evaluation of occupational hazards caused by electromagnetic fields

    PubMed Central

    Zradziński, Patryk

    2015-01-01

    Due to the various physical mechanisms of interaction between a worker's body and the electromagnetic field at various frequencies, the principles of numerical simulations have been discussed for three areas of worker exposure: to low frequency magnetic field, to low and intermediate frequency electric field and to radiofrequency electromagnetic field. This paper presents the identified difficulties in applying numerical simulations to evaluate physical estimators of direct and indirect effects of exposure to electromagnetic fields at various frequencies. The exposure of workers operating a plastic sealer has been taken as an example scenario of electromagnetic field exposure at the workplace for discussion of those difficulties in applying numerical simulations. The following difficulties in reliable numerical simulations of workers' exposure to the electromagnetic field have been considered: workers' body models (posture, dimensions, shape and grounding conditions), working environment models (objects most influencing electromagnetic field distribution) and an analysis of parameters for which exposure limitations are specified in international guidelines and standards. PMID:26323781

  10. Multi scale modelling of landslide hazard and risk assessment in data scarce area - a case study on Dhalai District, Tripura, India

    NASA Astrophysics Data System (ADS)

    Ghosh, Kapil; De, Sunil Kumar

    2017-04-01

    Successful landslide management plans and policies depend on in-depth knowledge about the hazard and associated risk. Thus, the present research is intended to present an integrated approach involving the use of geospatial technologies for landslide hazard and risk assessment at different scales (site-specific to regional level). The landslide hazard map at the regional scale (district level) is prepared by using a weight-rating based method. To analyze landslide manifestation in the Dhalai district of Tripura, different causative factor maps (lithology, road buffer, slope, relative relief, rainfall, fault buffer, landuse/landcover and drainage density) are derived. The analysis revealed that geological structure and human interference have more influence than the other considered factors on landslide occurrences. The landslide susceptibility zonation map shows that about 1.64 and 16.68% of the total study area falls under very high and high susceptibility zones, respectively. The landslide risk assessment at the district level is generated by integrating hazard scoring and resource damage potential scoring (fuzzy membership values) maps. The values of the landslide risk matrix vary within the range of 0.001 to 0.18, and the risk assessment map shows that only 0.45% (10.80 km2) of the district is under the very high risk zone, whereas about 50% of the pixels of the existing road section are under very high to high levels of landslide risk. The major part (94.06%) of the district is under very low to low risk zones. Landslide hazard and risk assessment at the site-specific level have been carried out through intensive field investigation, in which it is found that the Ambassa landslide is located within the 150 m buffer zone of a fault line. Variation of geo-electrical resistivity (2.2 Ωm to 31.4 Ωm) indicates the complex geological character of this area. Based on the obtained geo-technical results, which help to identify the degree of risk to the existing resources, it is appropriate to implement management plans such as construction of sub-surface drainage, extension of retaining walls, and cutting/filling of slopes in a scientific manner. Keywords: landslide, hazard, risk, fuzzy set theory
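
    A minimal sketch of the risk-combination step is shown below, in which normalized hazard scores are multiplied pixel by pixel by fuzzy damage-potential memberships to yield a risk surface in roughly the 0-0.18 range reported above. The class breaks and the synthetic rasters are illustrative assumptions.

```python
# Combine hazard scores with fuzzy damage-potential memberships into a
# risk matrix and classify it; rasters and class breaks are synthetic.
import numpy as np

rng = np.random.default_rng(9)
hazard = rng.random((400, 400)) * 0.45          # normalized hazard score
damage = rng.beta(2.0, 8.0, size=(400, 400))    # fuzzy membership of resources

risk = hazard * damage                          # combined risk surface
bins = [0.02, 0.05, 0.09, 0.14]                 # assumed class breaks
classes = np.digitize(risk, bins)               # 0 = very low ... 4 = very high
share = np.bincount(classes.ravel(), minlength=5) / classes.size
print("area share per class (very low..very high):", np.round(share, 4))
```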

  11. 40 CFR 266.350 - What records must you keep at your facility and for how long?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... three years after the exempted waste is sent for disposal. (e) If you are not already subject to NRC, or... AGENCY (CONTINUED) SOLID WASTES (CONTINUED) STANDARDS FOR THE MANAGEMENT OF SPECIFIC HAZARDOUS WASTES AND SPECIFIC TYPES OF HAZARDOUS WASTE MANAGEMENT FACILITIES Conditional Exemption for Low-Level Mixed Waste...

  12. 40 CFR 266.350 - What records must you keep at your facility and for how long?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... three years after the exempted waste is sent for disposal. (e) If you are not already subject to NRC, or... AGENCY (CONTINUED) SOLID WASTES (CONTINUED) STANDARDS FOR THE MANAGEMENT OF SPECIFIC HAZARDOUS WASTES AND SPECIFIC TYPES OF HAZARDOUS WASTE MANAGEMENT FACILITIES Conditional Exemption for Low-Level Mixed Waste...

  13. Risk analysis based on hazards interactions

    NASA Astrophysics Data System (ADS)

    Rossi, Lauro; Rudari, Roberto; Trasforini, Eva; De Angeli, Silvia; Becker, Joost

    2017-04-01

    Despite an increasing need for open, transparent, and credible multi-hazard risk assessment methods, models, and tools, the availability of the comprehensive risk information needed to inform disaster risk reduction is limited, and the level of interaction across hazards is not systematically analysed. Risk assessment methodologies for different hazards often produce risk metrics that are not comparable. Hazard interactions (the consecutive occurrence of two or more different events) are generally neglected, resulting in strongly underestimated risk in the most exposed areas. This study presents cases of interaction between different hazards, showing how subsidence can affect coastal and river flood risk (Jakarta and Bandung, Indonesia) or how flood risk is modified after a seismic event (Italy). The analysis of well-documented real study cases, based on a combination of Earth Observation and in-situ data, serves as a basis for the formalisation of a multi-hazard methodology, identifying gaps and research frontiers. Multi-hazard risk analysis is performed through the RASOR platform (Rapid Analysis and Spatialisation Of Risk). A scenario-driven query system allows users to simulate future scenarios based on existing and assumed conditions, to compare them with historical scenarios, and to model multi-hazard risk both before and during an event (www.rasor.eu).

  14. Identification of hazardous drinking with the Young Adult Alcohol Consequences Questionnaire: Relative operating characteristics as a function of gender.

    PubMed

    Read, Jennifer P; Haas, Amie L; Radomski, Sharon; Wickham, Robert E; Borish, Sarah E

    2016-10-01

    Heavy and problematic drinking is common on college campuses and is associated with myriad hazardous outcomes. The Young Adult Alcohol Consequences Questionnaire (YAACQ; Read et al., 2006) was developed to provide comprehensive and expedient assessment of the negative consequences of young adult drinking and has been used in a number of research and clinical settings. To date, no empirically derived cutoffs for the YAACQ have been available for use in the identification of those drinkers at greatest risk. This was the objective of the present study. In a large (N = 1,311) and demographically heterogeneous multisite sample, we identified cutoff scores for the YAACQ and contrasted the detection of hazardous drinking using these cutoffs with that achieved using the cutoffs recommended for the Alcohol Use Disorders Identification Test (AUDIT). We also examined whether cutoffs differed by gender. Results of receiver operating characteristic (ROC) analysis yielded cutoffs that delineate 3 levels (or zones) of hazardous drinking risk: low, moderate, and high. A cutoff of 8 differentiated those at low risk from those at moderate risk or greater, and a cutoff of 16 differentiated between moderate and high risk. These zones corresponded to other indices of risky drinking, including heavy episodic "binge" drinking, more frequent alcohol consumption, and engagement in alcohol risk behaviors. Scores differentiating low from moderate risk differed for men (8) and women (10), whereas the cutoff for high risk was the same (16) across the sexes. Findings suggest that the YAACQ can be used to reliably assess level of drinking risk among college students. Furthermore, these cut scores may be used to refer students to interventions varying in intensity, based on the level of indicated alcohol risk. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
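
    Cutoff derivation of this kind typically rests on ROC analysis against a reference standard. A minimal sketch, assuming hypothetical YAACQ scores and a binary hazardous-drinking criterion (illustrative data, not the study's), could use scikit-learn:

      import numpy as np
      from sklearn.metrics import roc_curve, roc_auc_score

      rng = np.random.default_rng(0)

      # Hypothetical data: YAACQ total scores and a binary criterion for
      # hazardous drinking taken from a reference measure.
      scores = np.concatenate([rng.poisson(6, 500), rng.poisson(15, 200)])
      hazardous = np.concatenate([np.zeros(500, int), np.ones(200, int)])

      fpr, tpr, thresholds = roc_curve(hazardous, scores)
      print("AUC:", roc_auc_score(hazardous, scores))

      # Youden's J picks the threshold maximizing sensitivity + specificity - 1.
      j = tpr - fpr
      print("Candidate cutoff:", thresholds[np.argmax(j)])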

  15. iFLOOD: A Real Time Flood Forecast System for Total Water Modeling in the National Capital Region

    NASA Astrophysics Data System (ADS)

    Sumi, S. J.; Ferreira, C.

    2017-12-01

    Extreme flood events are the costliest natural hazards impacting the US and frequently cause extensive damage to infrastructure, disruption to the economy and loss of lives. In 2016, Hurricane Matthew brought severe damage to South Carolina and demonstrated the importance of accurate flood hazard predictions, which require the integration of riverine and coastal model forecasts for total water prediction in coastal and tidal areas. The National Weather Service (NWS) and the National Ocean Service (NOS) provide flood forecasts for almost the entire US; still, there are service-gap areas in tidal regions where no official flood forecast is available. The national capital region is vulnerable to multi-flood hazards, including high flows from annual inland precipitation events and surge-driven coastal inundation along the tidal Potomac River. Predicting flood levels in such tidal, river-estuarine zones is extremely challenging. The main objective of this study is to develop the next generation of flood forecast systems, capable of providing accurate and timely information to support emergency management and response in areas impacted by multi-flood hazards. This forecast system is capable of simulating flood levels in the Potomac and Anacostia Rivers, incorporating the effects of riverine flooding from the upstream basins, urban storm water and tidal oscillations from the Chesapeake Bay. Flood forecast models developed so far have used riverine data alone to simulate water levels for the Potomac River. The idea, therefore, is to use forecasted storm surge data from a coastal model as the boundary condition of this system. The final output of this validated model will capture the water behavior in the river-estuary transition zone far better than a model driven by riverine data only. The challenge for the iFLOOD forecast system is to understand the complex dynamics of multi-flood hazards caused by storm surges, riverine flow, tidal oscillation and urban storm water. Automated system simulations will help to develop a seamless integration with the boundary systems in the service-gap area, with new insights into our scientific understanding of such complex systems. A visualization system is being developed to allow stakeholders and the community to access the flood forecasts for their region with sufficient lead time.

  16. Wake Vortex Prediction Models for Decay and Transport Within Stratified Environments

    NASA Astrophysics Data System (ADS)

    Switzer, George F.; Proctor, Fred H.

    2002-01-01

    This paper proposes two simple models to predict vortex transport and decay. The models are determined empirically from the results of three-dimensional large eddy simulations and are applicable to wake vortices out of ground effect and not subjected to environmental winds. The large eddy simulations assume a range of ambient turbulence and stratification levels. The models and the results from the large eddy simulations support the hypothesis that the decay of the vortex hazard is decoupled from its change in descent rate.
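
    Empirical transport-and-decay models of this family typically advance vortex circulation and altitude with simple rate laws driven by ambient turbulence and stratification. A minimal sketch under assumed rate laws and coefficients (illustrative only, not the paper's fitted models):

      import numpy as np

      def simulate_vortex(gamma0, b0, n_brunt, eps_star, dt=1.0, t_end=120.0):
          """March circulation and descent with simple empirical rate laws.

          gamma0   : initial circulation (m^2/s)
          b0       : initial vortex spacing (m)
          n_brunt  : Brunt-Vaisala frequency (1/s), stratification
          eps_star : normalized ambient turbulence (dimensionless)
          All decay constants below are illustrative assumptions.
          """
          t0 = 2.0 * np.pi * b0**2 / gamma0      # characteristic timescale
          gamma, z = gamma0, 0.0
          out = []
          for t in np.arange(0.0, t_end, dt):
              # Assumed decay: faster for stronger turbulence/stratification.
              decay_rate = (0.45 * eps_star + 0.8 * n_brunt * t0) / t0
              gamma = max(gamma - decay_rate * gamma0 * dt, 0.0)
              z -= gamma / (2.0 * np.pi * b0) * dt   # mutual-induction descent
              out.append((t, gamma, z))
          return out

      history = simulate_vortex(gamma0=400.0, b0=30.0, n_brunt=0.01, eps_star=0.1)
      print(history[-1])   # (time, remaining circulation, altitude change)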

  17. Insights into earthquake hazard map performance from shaking history simulations

    NASA Astrophysics Data System (ADS)

    Stein, S.; Vanneste, K.; Camelbeeck, T.; Vleminckx, B.

    2017-12-01

    Why recent large earthquakes caused shaking stronger than predicted by earthquake hazard maps is under debate. This issue has two parts. Verification involves how well maps implement probabilistic seismic hazard analysis (PSHA) ("have we built the map right?"). Validation asks how well maps forecast shaking ("have we built the right map?"). We explore how well a map can ideally perform by simulating an area's shaking history and comparing "observed" shaking to that predicted by a map generated for the same parameters. The simulations yield shaking distributions whose mean is consistent with the map, but individual shaking histories show large scatter. Infrequent large earthquakes cause shaking much stronger than mapped, as observed. Hence, PSHA seems internally consistent and can be regarded as verified. Validation is harder because an earthquake history can yield shaking higher or lower than that predicted while being consistent with the hazard map. The scatter decreases for longer observation times because the largest earthquakes and resulting shaking are increasingly likely to have occurred. For the same reason, scatter is much less for the more active plate boundary than for a continental interior. For a continental interior, where the mapped hazard is low, even an M4 event produces exceedances at some sites. Larger earthquakes produce exceedances at more sites. Thus many exceedances result from small earthquakes, but infrequent large ones may cause very large exceedances. However, for a plate boundary, an M6 event produces exceedance at only a few sites, and an M7 produces them in a larger, but still relatively small, portion of the study area. As reality gives only one history, and a real map involves assumptions about more complicated source geometries and occurrence rates, which are unlikely to be exactly correct and thus will contribute additional scatter, it is hard to assess whether misfit between actual shaking and a map — notably higher-than-mapped shaking — arises by chance or reflects biases in the map. Due to this problem, there are limits to how well we can expect hazard maps to predict future shaking, as well as to our ability to test the performance of a hazard map based on available observations.
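
    The simulation logic sketched in this abstract can be illustrated with a toy Monte Carlo: draw earthquake catalogs from the same rate model a map would use, convert each event to site shaking, and compare observed maxima with the mapped value at a given exceedance probability. A minimal sketch under an assumed magnitude-shaking scaling (not the authors' implementation):

      import numpy as np

      rng = np.random.default_rng(42)

      def simulate_history(rate_per_yr, years, b=1.0, m_min=4.0, m_max=7.5):
          """One synthetic shaking history at a single site."""
          n = rng.poisson(rate_per_yr * years)
          # Truncated Gutenberg-Richter magnitudes via inverse CDF sampling.
          u = rng.random(n)
          mags = m_min - np.log10(1 - u * (1 - 10**(-b * (m_max - m_min)))) / b
          # Toy ground-motion proxy: shaking grows exponentially with magnitude,
          # with lognormal scatter (coefficients are illustrative assumptions).
          shaking = 10 ** (0.5 * mags - 2.0 + rng.normal(0, 0.3, n))
          return shaking.max() if n else 0.0

      years, n_histories = 50, 10_000
      maxima = np.array([simulate_history(0.2, years) for _ in range(n_histories)])

      # "Mapped" value: shaking with a 10% chance of exceedance in the window,
      # computed from the same model the histories were drawn from.
      mapped = np.quantile(maxima, 0.9)
      print(f"mapped level: {mapped:.2f}")
      print(f"fraction of histories exceeding it: {(maxima > mapped).mean():.3f}")

    Individual runs scatter widely around the mapped value even though the ensemble is, by construction, consistent with it, which is the verification-versus-validation point the abstract makes.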

  18. Investigating Uncertainty and Sensitivity in Integrated, Multimedia Environmental Models: Tools for FRAMES-3MRA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Babendreier, Justin E.; Castleton, Karl J.

    2005-08-01

    Elucidating uncertainty and sensitivity structures in environmental models can be a difficult task, even for low-order, single-medium constructs driven by a unique set of site-specific data. Quantitative assessment of integrated, multimedia models that simulate hundreds of sites, spanning multiple geographical and ecological regions, will ultimately require a comparative approach using several techniques, coupled with sufficient computational power. The Framework for Risk Analysis in Multimedia Environmental Systems - Multimedia, Multipathway, and Multireceptor Risk Assessment (FRAMES-3MRA) is an important software model being developed by the United States Environmental Protection Agency for use in risk assessment of hazardous waste management facilities. The 3MRA modeling system includes a set of 17 science modules that collectively simulate release, fate and transport, exposure, and risk associated with hazardous contaminants disposed of in land-based waste management units (WMU).

  19. Tsunami hazard potential for the equatorial southwestern Pacific atolls of Tokelau from scenario-based simulations

    NASA Astrophysics Data System (ADS)

    Orpin, Alan R.; Rickard, Graham J.; Gerring, Peter K.; Lamarche, Geoffroy

    2016-05-01

    Devastating tsunami over the last decade have significantly heightened awareness of the potential consequences and vulnerability of low-lying Pacific islands and coastal regions. Our appraisal of the potential tsunami hazard for the atolls of the Tokelau Islands is based on a tsunami source-propagation-inundation model using Gerris Flow Solver, adapted from the companion study by Lamarche et al. (2015) for the islands of Wallis and Futuna. We assess whether there is potential for tsunami flooding on any of the village islets from a selection of 14 earthquake-source experiments. These earthquake sources are primarily based on the largest Pacific earthquakes of Mw ≥ 8.1 since 1950 and other large credible sources of tsunami that may impact Tokelau. Earthquake-source location and moment magnitude are related to tsunami-wave amplitudes and tsunami flood depths simulated for each of the three atolls of Tokelau. This approach yields instructive results for a community advisory but is not intended to be fully deterministic. Rather, the underlying aim is to identify credible sources that present the greatest potential to trigger an emergency response. Results from our modelling show that wave fields are channelled by the bathymetry of the Pacific basin in such a way that the swathes of the highest waves sweep immediately northeast of the Tokelau Islands. Our limited simulations suggest that trans-Pacific tsunami from distant earthquake sources to the north of Tokelau pose the most significant inundation threat. In particular, our assumed worst-case scenario for the Kuril Trench generated maximum modelled-wave amplitudes in excess of 1 m, which may last a few hours and include several wave trains. Other sources can impact specific sectors of the atolls, particularly distant earthquakes from Chile and Peru, and regional earthquake sources to the south. Flooding is dependent on the wave orientation and direct alignment to the incoming tsunami. Our "worst-case" tsunami simulations of the Tokelau Islands suggest that dry areas remain around the villages, which are typically built on a high islet. Consistent with the oral history of little or no perceived tsunami threat, simulations from the recent Tohoku and Chile earthquake sources suggest only limited flooding around low-lying islets of the atoll. Where potential tsunami flooding is inferred from the modelling, recommended minimum evacuation heights above local sea level are compiled, with particular attention paid to variations in tsunami flood depth around the atolls, subdivided into directional quadrants around each atoll. However, complex wave behaviours around the atolls, islets, tidal channels and within the lagoons are also observed in our simulations. Wave amplitudes within the lagoons may exceed 50 cm, increasing any inundation and potential hazards on the inner shoreline of the atolls, which in turn may influence evacuation strategies. Our study shows that indicative simulation studies can be achieved even with only basic field information. In part, this is due to the spatially and vertically limited topography of the atoll, short reef flat and steep seaward bathymetry, and the simple depth profile of the lagoon bathymetry.

  20. Review of the Literature on Determinants of Chemical Hazard Information Recall among Workers and Consumers

    PubMed Central

    Sathar, Farzana; Dalvie, Mohamed Aqiel; Rother, Hanna-Andrea

    2016-01-01

    In many low- and middle-income countries (LMIC), workers' and consumers' only access to risk and hazard information about the chemicals they use or work with is the chemical label and safety data sheet. Recall of chemical hazard information is vital if label warnings and precautionary information are to promote effective safety behaviors. A literature review, therefore, was conducted on the determinants of chemical hazard information recall among workers and consumers globally. Since comprehension and recall are closely linked, the determinants of both were reviewed. Literature was reviewed from both online and print peer-reviewed journals, for all study designs and countries. The review indicated that level of education, previous training and the inclusion of pictograms on hazard communication material are all factors that contribute to the recall of hazard information. The influence of gender and age on recall is inconsistent across studies and remains to be explored. More research is required on the demographic predictors of hazard information recall, the effect of design and non-design factors on recall, the effect of training on recall among low-literacy populations, and on different regions and contexts. PMID:27258291

  1. A lava flow simulation model for the development of volcanic hazard maps for Mount Etna (Italy)

    NASA Astrophysics Data System (ADS)

    Damiani, M. L.; Groppelli, G.; Norini, G.; Bertino, E.; Gigliuto, A.; Nucita, A.

    2006-05-01

    Volcanic hazard assessment is of paramount importance for safeguarding the resources exposed to volcanic hazards. In this paper we present ELFM, a lava flow simulation model for the evaluation of lava flow hazard on Mount Etna (Sicily, Italy), the most important active volcano in Europe. The major contributions of the paper are: (a) a detailed specification of the lava flow simulation model and of an algorithm implementing it; (b) the definition of a methodological framework for applying the model to the specific volcano. Regarding the former, we propose an extended version of an existing stochastic model that has so far been applied only to the assessment of volcanic hazard on Lanzarote and Tenerife (Canary Islands). Concerning the methodological framework, we argue that model validation is essential for assessing the effectiveness of the lava flow simulation model. To that end, a strategy has been devised for the generation of simulation experiments and the evaluation of their outcomes.
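
    Stochastic lava flow models of the kind extended here typically propagate many random walkers over a DEM, at each step moving to a downhill neighbor with probability weighted by the drop in elevation; cell visit counts then approximate relative inundation probability. A minimal sketch under those assumptions (a generic illustration, not the actual ELFM code):

      import numpy as np

      rng = np.random.default_rng(7)

      # Hypothetical DEM: a plane sloping away from row 0, with roughness.
      ny, nx = 200, 200
      dem = np.add.outer(np.linspace(1000, 0, ny), np.zeros(nx))
      dem += rng.normal(0, 2.0, dem.shape)

      NEIGHBORS = [(-1, 0), (1, 0), (0, -1), (0, 1),
                   (-1, -1), (-1, 1), (1, -1), (1, 1)]

      def run_flow(start, max_steps=5000):
          """One random walker: move downhill with drop-weighted probability."""
          i, j = start
          path = [(i, j)]
          for _ in range(max_steps):
              drops, cells = [], []
              for di, dj in NEIGHBORS:
                  ni, nj = i + di, j + dj
                  if 0 <= ni < ny and 0 <= nj < nx and dem[ni, nj] < dem[i, j]:
                      drops.append(dem[i, j] - dem[ni, nj])
                      cells.append((ni, nj))
              if not cells:            # local pit: flow stops
                  break
              p = np.asarray(drops) / np.sum(drops)
              i, j = cells[rng.choice(len(cells), p=p)]
              path.append((i, j))
          return path

      # Monte Carlo inundation probability from one hypothetical vent.
      counts = np.zeros_like(dem)
      for _ in range(500):
          for cell in run_flow((5, nx // 2)):
              counts[cell] += 1
      probability = counts / counts.max()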

  2. Continued Research into Characterizing the Preturbulence Environment for Sensor Development, New Hazard Algorithms and Experimental Flight Planning

    NASA Technical Reports Server (NTRS)

    Kaplan, Michael L.; Lin, Yuh-Lang

    2005-01-01

    The purpose of the research was to develop and test improved hazard algorithms that could result in the development of sensors better able to anticipate potentially severe atmospheric turbulence, which affects aircraft safety. The research focused on employing numerical simulation models to develop improved algorithms for the prediction of aviation turbulence. This involved producing both research simulations and real-time simulations of environments predisposed to moderate and severe aviation turbulence. The research resulted in the following fundamental advancements toward the aforementioned goal: 1) very high resolution simulations of turbulent environments indicated how predictive hazard indices could be improved, resulting in a candidate hazard index with the potential to improve on existing operational indices; 2) a real-time turbulence hazard numerical modeling system was improved by correcting deficiencies in its simulation of moist convection; and 3) the same real-time predictive system was tested by running the code twice daily, with the hazard prediction indices updated and improved. Additionally, a simple validation study was undertaken to determine how well a real-time hazard predictive index performed when compared to commercial pilot observations of aviation turbulence. Simple statistical analyses performed in this validation study indicated potential skill in employing the hazard prediction index to predict regions of varying intensities of aviation turbulence. Data sets from a research numerical model were provided to NASA for use in a large eddy simulation numerical model. A NASA contractor report and several refereed journal articles were prepared and submitted for publication during the course of this research.

  3. DESIGN REPORT: LOW-NOX BURNERS FOR PACKAGE BOILERS

    EPA Science Inventory

    The report describes a low-NOx burner design, presented for residual-oil-fired industrial boilers and boilers cofiring conventional fuels and nitrated hazardous wastes. The burner offers lower NOx emission levels for these applications than conventional commercial burners. The bu...

  4. Probabilistic short-term volcanic hazard in phases of unrest: A case study for tephra fallout

    NASA Astrophysics Data System (ADS)

    Selva, Jacopo; Costa, Antonio; Sandri, Laura; Macedonio, Giovanni; Marzocchi, Warner

    2014-12-01

    During volcanic crises, volcanologists usually estimate the impact of possible imminent eruptions through deterministic modeling of the effects of one or a few pre-established scenarios. Although such an approach may provide important information to decision makers, the sole use of deterministic scenarios does not allow scientists to take all uncertainties properly into consideration, and it cannot be used to assess risk quantitatively, because the latter unavoidably requires a probabilistic approach. We present a model based on the concept of the Bayesian event tree (hereinafter named BET_VH_ST, standing for Bayesian event tree for short-term volcanic hazard) for short-term, near-real-time probabilistic volcanic hazard analysis, formulated for any potentially hazardous phenomenon accompanying an eruption. The specific goal of BET_VH_ST is to produce a quantitative assessment of the probability of exceedance of any potential level of intensity for a given volcanic hazard due to eruptions within restricted time windows (hours to days) in any area surrounding the volcano, accounting for all natural and epistemic uncertainties. BET_VH_ST properly assesses the conditional probability at each level of the event tree, accounting for any relevant information derived from the monitoring system, theoretical models, and the past history of the volcano, and propagating any relevant epistemic uncertainty underlying these assessments. As an application example, we apply BET_VH_ST to assess the short-term volcanic hazard related to tephra loading during the Major Emergency Simulation Exercise, a major exercise at Mount Vesuvius that took place from 19 to 23 October 2006, consisting of a blind simulation of Vesuvius reactivation, from the early warning phase up to the final eruption, and including the evacuation of a sample of about 2000 people from the area at risk. The results show that BET_VH_ST is able to produce short-term forecasts of the impact of tephra fall during a rapidly evolving crisis, accurately accounting for and propagating all uncertainties and enabling rational decision making under uncertainty.
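
    An event tree of this kind multiplies conditional probabilities along each branch (unrest is magmatic, eruption occurs, vent location, hazard intensity), with epistemic uncertainty carried by treating each node probability as a distribution rather than a point value. A minimal sketch with assumed Beta-distributed node probabilities (illustrative numbers, not the BET_VH_ST calibration):

      import numpy as np

      rng = np.random.default_rng(1)

      # Each node: Beta(alpha, beta) describing epistemic uncertainty on the
      # conditional probability at that level of the event tree (assumed).
      nodes = {
          "unrest_is_magmatic": (8, 2),
          "eruption_given_magmatic": (3, 7),
          "vent_in_sector": (2, 8),
          "tephra_load_exceeds_threshold": (4, 6),
      }

      n = 100_000
      samples = np.ones(n)
      for alpha, beta in nodes.values():
          samples *= rng.beta(alpha, beta, n)  # multiply conditional probabilities

      lo, hi = np.percentile(samples, [10, 90])
      print(f"P(exceedance): mean {samples.mean():.4f}, "
            f"10-90% range [{lo:.4f}, {hi:.4f}]")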

  5. How Prepared Are Medical and Nursing Students to Identify Common Hazards in the Intensive Care Unit?

    PubMed

    Clay, Alison S; Chudgar, Saumil M; Turner, Kathleen M; Vaughn, Jacqueline; Knudsen, Nancy W; Farnan, Jeanne M; Arora, Vineet M; Molloy, Margory A

    2017-04-01

    Care in the hospital is hazardous. Harm in the hospital may prolong hospitalization, increase suffering, result in death, and increase costs of care. Although the interprofessional team is critical to eliminating hazards that may result in adverse events to patients, professional students' formal education may not prepare them adequately for this role. To determine whether medical and nursing students can identify hazards of hospitalization that could result in harm to patients, and to detect differences between professions in the types of hazards identified. Mixed-methods observational study of graduating nursing (n = 51) and medical (n = 93) students who completed two "Room of Horrors" simulations to identify patient safety hazards. Qualitative analysis was used to extract themes from students' written hazard descriptions. Fisher's exact test was used to determine differences in the frequency of hazards identified between groups. Identification of hazards by students was low: 66% did not identify missing personal protective equipment for a patient on contact isolation, and 58% did not identify a medication administration error (medication hanging for a patient with a similar name). Interprofessional differences existed in how hazards were identified: medical students noted that restraints were not indicated (73 vs. 2%, P < 0.001), whereas nursing students noted that there was no order for the restraints (58.5 vs. 0%, P < 0.0001). Nursing students discovered more issues with malfunctioning or incorrectly used equipment than medical students. Teams performed better than individuals, especially for hazards in the second simulation that were similar to those in the first: need to replace a central line with erythema (identified by 73% of teams) versus need to replace a peripheral intravenous line (10% of individuals, P < 0.0001). Nevertheless, teams of students missed many intensive care unit-specific hazards: 54% failed to identify the presence of pressure ulcers; 85% did not notice high tidal volumes on the ventilator; and 90% did not identify missing spontaneous awakening/breathing trials and absent stress ulcer prophylaxis. Graduating nursing and medical students missed several hazards of hospitalization, especially those related to the intensive care unit. Orientation for residents and new nurses should include education on hospitalization hazards. Ideally, this orientation should be interprofessional to allow appreciation for each other's roles and responsibilities.

  6. Serum Bicarbonate and Mortality in Stage 3 and Stage 4 Chronic Kidney Disease

    PubMed Central

    Schold, Jesse D.; Arrigain, Susana; Jolly, Stacey E.; Wehbe, Edgard; Raina, Rupesh; Simon, James F.; Srinivas, Titte R.; Jain, Anil; Schreiber, Martin J.; Nally, Joseph V.

    2011-01-01

    Summary Background and objectives The incidence and prevalence of metabolic acidosis increase with declining kidney function. We studied the associations of both low and high serum bicarbonate levels with all-cause mortality among stage 3 and 4 chronic kidney disease (CKD) patients. Design, setting, participants, & measurements We examined factors associated with low (<23 mmol/L) and high (>32 mmol/L) serum bicarbonate levels using logistic regression models, and associations between bicarbonate and all-cause mortality using Cox proportional hazards models, Kaplan–Meier survival curves, and time-dependent analyses. Results Out of 41,749 patients, 13.9% (n = 5796) had low and 1.6% (n = 652) had high serum bicarbonate levels. After adjusting for relevant covariates, there was a significant association between low serum bicarbonate and all-cause mortality (hazard ratio [HR] 1.23, 95% CI 1.16, 1.31). This association was not statistically significant among patients with stage 4 CKD and diabetes. The time-dependent analysis demonstrated a significant mortality risk associated with a decline from normal to low bicarbonate level (HR 1.59, 95% CI 1.49, 1.69). High serum bicarbonate levels were associated with death irrespective of the level of kidney function (HR 1.74, 95% CI 1.52, 2.00). When serum bicarbonate was examined as a continuous variable, a J-shaped relationship was noted between serum bicarbonate and mortality. Conclusions Low serum bicarbonate levels are associated with increased mortality among stage 3 CKD patients and patients without diabetes. High serum bicarbonate levels are associated with mortality in both stage 3 and stage 4 CKD patients. PMID:21885787
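
    Associations like these are estimated with Cox proportional hazards regression. A minimal sketch of that style of analysis, on synthetic data and using the lifelines library (assumed available; not the study's code or covariate set):

      import numpy as np
      import pandas as pd
      from lifelines import CoxPHFitter

      rng = np.random.default_rng(3)
      n = 5000

      # Synthetic cohort: bicarbonate category flags plus a confounder.
      df = pd.DataFrame({
          "low_bicarb": rng.binomial(1, 0.14, n),    # <23 mmol/L
          "high_bicarb": rng.binomial(1, 0.02, n),   # >32 mmol/L
          "age": rng.normal(70, 10, n),
      })
      # Simulated survival times with elevated hazard in both extreme groups.
      base = rng.exponential(10, n)
      df["years"] = base / np.exp(0.2 * df.low_bicarb + 0.5 * df.high_bicarb)
      df["death"] = rng.binomial(1, 0.6, n)

      cph = CoxPHFitter()
      cph.fit(df, duration_col="years", event_col="death")
      cph.print_summary()   # hazard ratios with 95% CIs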

  7. VOCs monitoring system simulation and design

    NASA Astrophysics Data System (ADS)

    Caldararu, Florin; Vasile, Alexandru; Vatra, Cosmin

    2010-11-01

    The designed and simulated system will be used in the tanning industry for Volatile Organic Compound (VOC) measurements. In this industry, about 90% of the solvent contained in the emulsions evaporates during application, giving rise to VOCs, which are both hazardous atmospheric pollutants and one of the sources of ground-level photochemical ozone formation. A monitoring system is therefore necessary in a leather finishing process, in order to detect hazardous VOC concentrations and to steer the process towards reducing them. The paper presents the design of a VOC monitoring system, which includes sensors for VOCs and temperature, the conditioning circuitry for these sensors, the suction system for the gas in the hood, the data acquisition and computing system, and the graphic interface. The sensor used in the detection system is a semiconductor sensor produced by Figaro Engineering Inc., characterized by a short response time and high sensitivity to almost all VOC substances. The conditioning circuitry and data acquisition are designed to compensate for the variation of the sensor response with temperature and to maintain the sensor's low response time. The temperature compensation is obtained using a thermistor circuit, with the compensation performed in software. A Mitsubishi PLC receives the output signals of the sensor and thermistor circuits, respectively. The acquisition and computing system is built around a Mitsubishi ALPHA 2 controller and a graphical terminal, GOT 1000.
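
    Software temperature compensation of a semiconductor gas sensor usually normalizes the raw reading by a temperature-dependent correction curve fitted offline. A minimal sketch with an assumed quadratic correction (the paper does not give its coefficients):

      # Hypothetical correction curve around a 20 degC calibration point;
      # A and B are assumed coefficients, per degC and per degC^2.
      A, B = 0.012, 0.0004

      def compensate(raw_ppm: float, temp_c: float, t_ref: float = 20.0) -> float:
          """Return a temperature-compensated VOC concentration estimate."""
          dt = temp_c - t_ref
          correction = 1.0 + A * dt + B * dt * dt
          return raw_ppm / correction

      print(compensate(raw_ppm=120.0, temp_c=35.0))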

  8. Infrared low-level wind shear work

    NASA Technical Reports Server (NTRS)

    Adamson, Pat

    1988-01-01

    Results of field experiments for the detection of clear air disturbance and low-level wind shear utilizing an infrared airborne system are given in vugraph form. The hits, misses and nuisance-alarm scores are given. Information is given on the infrared spatial resolution technique. The popular index of aircraft hazard, the F-factor (F = (dWx/dt)/g - w/Vas: the rate of change of the horizontal wind scaled by gravitational acceleration, minus the vertical wind scaled by airspeed), is developed for a remote temperature sensor.
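
    The F-factor above is straightforward to compute from a time series of wind estimates ahead of the aircraft. A minimal sketch, assuming synthetic wind data and a fixed airspeed (illustrative values only):

      import numpy as np

      g = 9.81          # m/s^2
      v_air = 80.0      # airspeed, m/s (assumed)
      dt = 0.5          # sample interval, s

      # Synthetic microburst-like encounter: headwind-to-tailwind shear
      # plus a downdraft core.
      t = np.arange(0, 60, dt)
      wind_x = 15.0 * np.tanh((t - 30) / 5.0)         # horizontal wind, m/s
      wind_z = -8.0 * np.exp(-((t - 30) / 8.0) ** 2)  # vertical wind, m/s

      # F-factor: horizontal shear term minus vertical wind term.
      f_factor = np.gradient(wind_x, dt) / g - wind_z / v_air

      print(f"peak F-factor: {f_factor.max():.3f}")
      # Sustained values above roughly 0.1 are commonly treated as hazardous.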

  9. Community Capacity in The Face Of Landslide Hazards in the Southern Of Semarang City

    NASA Astrophysics Data System (ADS)

    Tjahjono, Heri; Suripin; Kismartini

    2018-02-01

    The study was done in Semarang, Central Java. The aims of the study are: (a) to determine the variation in the level of community capacity in dealing with landslide hazards in the southern part of Semarang city; (b) to identify the factors that affect community capacity in facing landslide hazards. The research was conducted with a sample of 198 people, taken by purposive sampling. The samples were people living in areas that have experienced landslides or in areas that are expected to be vulnerable to landslides. The variables used in this research are (1) regulatory and institutional capacity for landslide disaster prevention, (2) early warning systems in the community, (3) education and disaster skills training, (4) mitigation to reduce basic risk factors, and (5) preparedness on all fronts. Data were collected with questionnaires and interviews. Data analysis was performed with percentage descriptions and map overlay analysis using ArcGIS release 10.3. The results show five levels of community capacity in facing the landslide hazard in southern Semarang city: very high (4.35% of the people surveyed), high (7.25%), medium (30.43%), low (36.23%) and very low (21.74%). Overlaying the landslide threat map of southern Semarang City with the map of community capacity levels indicates that communities with very high and high capacity occupy areas of high and medium landslide threat that have already experienced landslides, whereas communities with medium, low and very low capacity occupy areas of high, medium, low and very low threat that have never experienced a landslide. The experience of a landslide disaster is thus one of the factors that encourages a community to increase its capacity to face landslides.

  10. [Working environment measurement of radioactive substances].

    PubMed

    Kunugita, Naoki

    2007-12-01

    The control of the working environment is one of the most important duties in any workplace to prevent occupational disease. In Japan, in the case of controlled areas using unsealed radioisotopes, the measurement of the concentration of airborne radioactive substances should be carried out under the regulations of the "Industrial Safety and Health Law" and the "Ordinance on Prevention of Ionizing Radiation Hazards". Many reports showed that the results of regular working environment measurements of radioactive substances were at about background levels. Safe working environments are sufficiently guaranteed by suitable estimation and handling under the strict regulation of the "Laws Concerning the Prevention from Radiation Hazards Due to Radioisotopes and Others". The regulation by the "Ordinance on Prevention of Ionizing Radiation Hazards" would be relaxed in the fields of education and research, which use very low quantities of radioactive substances, through measures such as estimation by calculation in place of actual measurement, a decrease in the number of monthly measurements, and measurement exemptions for low levels of isotopes.

  11. 49 CFR 173.240 - Bulk packaging for certain low hazard solid materials.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... this subchapter and the special provisions specified in column 7 of the § 172.101 table. (a) Rail cars... the IBC packaging code specified for the specific hazardous material in Column (7) of the § 172.101... subchapter at the Packing Group performance level as specified in Column (5) of the § 172.101 Table of this...

  12. 49 CFR 173.240 - Bulk packaging for certain low hazard solid materials.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... this subchapter and the special provisions specified in column 7 of the § 172.101 table. (a) Rail cars... the IBC packaging code specified for the specific hazardous material in Column (7) of the § 172.101... subchapter at the Packing Group performance level as specified in Column (5) of the § 172.101 Table of this...

  13. 49 CFR 173.240 - Bulk packaging for certain low hazard solid materials.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... this subchapter and the special provisions specified in column 7 of the § 172.101 table. (a) Rail cars... the IBC packaging code specified for the specific hazardous material in Column (7) of the § 172.101... subchapter at the Packing Group performance level as specified in Column (5) of the § 172.101 Table of this...

  14. 49 CFR 173.240 - Bulk packaging for certain low hazard solid materials.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... this subchapter and the special provisions specified in column 7 of the § 172.101 table. (a) Rail cars... the IBC packaging code specified for the specific hazardous material in Column (7) of the § 172.101... subchapter at the Packing Group performance level as specified in Column (5) of the § 172.101 Table of this...

  15. 49 CFR 173.240 - Bulk packaging for certain low hazard solid materials.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... this subchapter and the special provisions specified in column 7 of the § 172.101 table. (a) Rail cars... the IBC packaging code specified for the specific hazardous material in Column (7) of the § 172.101... subchapter at the Packing Group performance level as specified in Column (5) of the § 172.101 Table of this...

  16. The research of distributed interactive simulation based on HLA in coal mine industry inherent safety

    NASA Astrophysics Data System (ADS)

    Dou, Zhi-Wu

    2010-08-01

    To address the inherent safety problem confronting the coal mining industry, and after analyzing the characteristics and applications of distributed interactive simulation based on the High Level Architecture (DIS/HLA), a new method is proposed for developing a coal-mining inherent-safety distributed interactive simulation using HLA technology. After studying the function and structure of the system, a simple coal-mining inherent-safety system is modeled with HLA, the FOM and SOM are developed, and the mathematical models are proposed. The results of the case study show that HLA plays an important role in developing distributed interactive simulations of complicated distributed systems and that the method is effective for this problem. For the coal mining industry, the conclusions show that an HLA-based simulation system can help to identify sources of hazard, to prepare measures against accidents, and to improve the level of management.

  17. Vegetative soil covers for hazardous waste landfills

    NASA Astrophysics Data System (ADS)

    Peace, Jerry L.

    Shallow land burial has been the preferred method for disposing of municipal and hazardous wastes in the United States because it is the simplest, cheapest, and most cost-effective method of disposal. Arid and semiarid regions of the western United States have received considerable attention over the past two decades in reference to hazardous, radioactive, and mixed waste disposal. Disposal is based upon the premise that low mean annual precipitation, high evapotranspiration, and low or negligible recharge favor waste isolation from the environment for long periods of time. The objective of this study is to demonstrate that containment of municipal and hazardous wastes in arid and semiarid environments can be accomplished effectively without traditional synthetic materials and complex multi-layer systems. This research demonstrates that closure covers utilizing natural soils and native vegetation, i.e., vegetative soil covers, will meet the technical equivalency criteria prescribed by the U.S. Environmental Protection Agency for hazardous waste landfills. Vegetative soil cover design combines layers of natural soil, native plant species, and climatic conditions to form a sustainable, functioning ecosystem that maintains the natural water balance. In this study, percolation through a natural analogue and an engineered cover is simulated using the one-dimensional numerical code UNSAT-H. UNSAT-H is a Richards' equation-based model that simulates soil water infiltration, unsaturated flow, redistribution, evaporation, plant transpiration, and deep percolation. This study incorporates conservative, site-specific soil hydraulic and vegetation parameters. Historical meteorological data from 1919 to 1996 are used to simulate percolation through the natural analogue and an engineered cover, with and without vegetation. This study indicates that a 1 m (3 ft) cover is the minimum design thickness necessary to meet the U.S. Environmental Protection Agency-prescribed technical equivalency criteria of 31.5 mm/year and 1 x 10-7 cm/second for net annual percolation and average flux, respectively. Increasing cover thickness to 1.2 m (4 ft) or 1.5 m (5 ft) results in limited additional improvement in cover performance. Under historical climatic conditions, net annual percolation and average flux through a 1 m (3 ft) cover are directed upward at 0.28 mm/year and 9.03 x 10-10 cm/second, respectively, for a soil cover with vegetation.
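
    UNSAT-H solves the full Richards equation, but the cover water balance it evaluates can be illustrated with a far simpler single-layer bucket model: precipitation fills soil storage, evapotranspiration empties it, and anything beyond storage capacity becomes percolation. A minimal sketch under those assumptions (illustrative parameters, not the study's calibration):

      # Single-bucket daily water balance: a drastic simplification of the
      # Richards-equation modeling done with UNSAT-H, for illustration only.
      def water_balance(precip_mm, pet_mm, capacity_mm=300.0, veg_factor=0.8):
          """Return total percolation (mm) below a soil cover over the record.

          precip_mm, pet_mm : daily precipitation and potential ET (mm)
          capacity_mm       : plant-available storage of the cover (assumed)
          veg_factor        : fraction of PET realized as actual ET (assumed)
          """
          storage, percolation = capacity_mm / 2.0, 0.0
          for p, pet in zip(precip_mm, pet_mm):
              storage += p
              # Actual ET limited by both demand and available water.
              storage -= min(veg_factor * pet, storage)
              if storage > capacity_mm:          # excess drains downward
                  percolation += storage - capacity_mm
                  storage = capacity_mm
          return percolation

      # Hypothetical dry-climate year: sparse rain, high evaporative demand.
      rain = [0.0] * 360 + [20.0] * 5
      pet = [4.0] * 365
      print(f"net annual percolation: {water_balance(rain, pet):.2f} mm")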

  18. Effects of HUD-supported lead hazard control interventions in housing on children's blood lead.

    PubMed

    Clark, Scott; Galke, Warren; Succop, Paul; Grote, Joann; McLaine, Pat; Wilson, Jonathan; Dixon, Sherry; Menrath, William; Roda, Sandy; Chen, Mei; Bornschein, Robert; Jacobs, David

    2011-02-01

    The Evaluation of the US Department of Housing and Urban Development Lead-Based Paint Hazard Control Grant Program studied the effectiveness of the housing intervention in reducing the blood lead of children at four post-intervention times (6 months, 1 year, 2 years, and 3 years). A repeated measures analysis showed that blood lead levels declined up to three years post-intervention. The results at each successive collection time were significantly lower than at the previous post-intervention time, except for the difference between the levels at two and three years. At two years post-intervention, geometric mean blood lead levels were approximately 37% lower than at pre-intervention. Children with pre-intervention blood lead levels as low as 10 μg/dL experienced substantial declines in blood lead levels. Previous studies have found substantial improvements only if a child's pre-intervention blood lead level was above 20 μg/dL. Individual interior lead hazard control treatments, as grouped by interior strategy, were not a significant predictor of post-intervention blood lead levels. However, children living in dwellings where exterior lead hazard control interventions were done had lower blood lead levels at one year post-intervention than those living in dwellings without the exterior interventions (all other factors being equal), but those differences were only significant when the mean exterior paint lead loading at pre-intervention was about the 90th percentile (7.0 mg/cm2). This observation suggests that exterior lead hazard control can be an important component of a lead hazard control plan. Children who were six to eleven months of age at pre-intervention had a significant increase in blood lead at one year post-intervention, probably due to other exposures. Copyright © 2010 Elsevier Inc. All rights reserved.

  19. Loss-of-function mutations in APOC3 and risk of ischemic vascular disease.

    PubMed

    Jørgensen, Anders Berg; Frikke-Schmidt, Ruth; Nordestgaard, Børge G; Tybjærg-Hansen, Anne

    2014-07-03

    High plasma levels of nonfasting triglycerides are associated with an increased risk of ischemic cardiovascular disease. Whether lifelong low levels of nonfasting triglycerides owing to mutations in the gene encoding apolipoprotein C3 (APOC3) are associated with a reduced risk of ischemic cardiovascular disease in the general population is unknown. Using data from 75,725 participants in two general-population studies, we first tested whether low levels of nonfasting triglycerides were associated with reduced risks of ischemic vascular disease and ischemic heart disease. Second, we tested whether loss-of-function mutations in APOC3, which were associated with reduced levels of nonfasting triglycerides, were also associated with reduced risks of ischemic vascular disease and ischemic heart disease. During follow-up, ischemic vascular disease developed in 10,797 participants, and ischemic heart disease developed in 7557 of these 10,797 participants. Participants with nonfasting triglyceride levels of less than 1.00 mmol per liter (90 mg per deciliter) had a significantly lower incidence of cardiovascular disease than those with levels of 4.00 mmol per liter (350 mg per deciliter) or more (hazard ratio for ischemic vascular disease, 0.43; 95% confidence interval [CI], 0.35 to 0.54; hazard ratio for ischemic heart disease, 0.40; 95% CI, 0.31 to 0.52). Heterozygosity for loss-of-function mutations in APOC3, as compared with no APOC3 mutations, was associated with a mean reduction in nonfasting triglyceride levels of 44% (P<0.001). The cumulative incidences of ischemic vascular disease and ischemic heart disease were reduced in heterozygotes as compared with noncarriers of APOC3 mutations (P=0.009 and P=0.05, respectively), with corresponding risk reductions of 41% (hazard ratio, 0.59; 95% CI, 0.41 to 0.86; P=0.007) and 36% (hazard ratio, 0.64; 95% CI, 0.41 to 0.99; P=0.04). Loss-of-function mutations in APOC3 were associated with low levels of triglycerides and a reduced risk of ischemic cardiovascular disease. (Funded by the European Union and others.).

  20. Potential Risk Assessment of Mountain Torrent Disasters on Sloping Fields in China

    NASA Astrophysics Data System (ADS)

    GAO, X.

    2017-12-01

    China's sloping fields suffer from low productivity and serious soil erosion, and mountain torrent disasters bring further soil and water loss to the traditional, extensive exploitation of sloping field resources. In this paper, China's sloping fields were classified into three grades: slightly steep, steep and very steep. In accordance with the geological hazards prevention and control regulations, historical data on China's mountain torrent disasters were spatially interpolated and divided into five classes: extremely low, low, middle, high and extremely high. The risk level map of mountain torrents was produced in ArcGIS. By overlaying the sloping fields on the risk level map, a potential risk regionalization map of sloping fields in the various slope grades was obtained. The results show that the very steep and steep sloping fields are mainly distributed on the first and second stage terraces in China. As the hazard risk level increases, the area of sloping fields decreases rapidly, and sloping fields in the extremely low and low risk levels account for 98.9% of the total. As the slope grade increases, the area of sloping fields at each risk level also declines sharply. Sloping fields take up approximately 60-65% and 26-30% of the slightly steep and steep grade areas, respectively, at the different risk levels. The risk regionalization map can provide effective information for returning farmland to forest or grassland and for reducing water and soil erosion from sloping fields in the future.
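
    The overlay step amounts to a cross-tabulation of two classified rasters: slope grade and torrent risk level. A minimal sketch of that zonal cross-tab with numpy (hypothetical arrays standing in for the ArcGIS layers):

      import numpy as np

      rng = np.random.default_rng(5)

      # Hypothetical classified rasters on a common grid.
      slope_grade = rng.integers(0, 3, (500, 500))  # 0 slightly steep .. 2 very steep
      risk_level = rng.integers(0, 5, (500, 500))   # 0 extremely low .. 4 extremely high
      is_sloping_field = rng.random((500, 500)) < 0.3

      # Cross-tabulate sloping-field cells by (grade, risk) pair.
      table = np.zeros((3, 5), dtype=int)
      g = slope_grade[is_sloping_field]
      r = risk_level[is_sloping_field]
      np.add.at(table, (g, r), 1)

      share = 100.0 * table / table.sum()
      print("rows: slope grade, cols: risk level (% of sloping-field cells)")
      print(np.round(share, 2))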

  1. The importance of vegetation change in the prediction of future tropical cyclone flood statistics

    NASA Astrophysics Data System (ADS)

    Irish, J. L.; Resio, D.; Bilskie, M. V.; Hagen, S. C.; Weiss, R.

    2015-12-01

    Global sea level rise is a near certainty over the next century (e.g., Stocker et al. 2013 [IPCC] and references therein). With sea level rise, coastal topography and land cover (hereafter "landscape") is expected to change and tropical cyclone flood hazard is expected to accelerate (e.g., Irish et al. 2010 [Ocean Eng], Woodruff et al. 2013 [Nature], Bilskie et al. 2014 [Geophys Res Lett], Ferreira et al. 2014 [Coast Eng], Passeri et al. 2015 [Nat Hazards]). Yet, the relative importance of sea-level rise induced landscape change on future tropical cyclone flood hazard assessment is not known. In this paper, idealized scenarios are used to evaluate the relative impact of one class of landscape change on future tropical cyclone extreme-value statistics in back-barrier regions: sea level rise induced vegetation migration and loss. The joint probability method with optimal sampling (JPM-OS) (Resio et al. 2009 [Nat Hazards]) with idealized surge response functions (e.g., Irish et al. 2009 [Nat Hazards]) is used to quantify the present-day and future flood hazard under various sea level rise scenarios. Results are evaluated in terms of their impact on the flood statistics (a) when projected flood elevations are included directly in the JPM analysis (Figure 1) and (b) when represented as additional uncertainty within the JPM integral (Resio et al. 2013 [Nat Hazards]), i.e., as random error. Findings are expected to aid in determining the level of effort required to reasonably account for future landscape change in hazard assessments, namely in determining when such processes are sufficiently captured by added uncertainty and when sea level rise induced vegetation changes must be considered dynamically, via detailed modeling initiatives. Acknowledgements: This material is based upon work supported by the National Science Foundation under Grant No. CMMI-1206271 and by the National Sea Grant College Program of the U.S. Department of Commerce's National Oceanic and Atmospheric Administration under Grant No. NA10OAR4170099. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of these organizations. The STOKES ARCC at the University of Central Florida provided computational resources for storm surge simulations.
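
    The joint probability method at the core of this analysis integrates a surge response function over the probability distribution of storm parameters to obtain flood exceedance rates. A minimal sketch with a made-up response function and parameter distributions (not the authors' JPM-OS implementation):

      import numpy as np

      rng = np.random.default_rng(11)

      # Hypothetical storm parameter distributions at landfall.
      n = 200_000
      dp = rng.gamma(shape=2.0, scale=15.0, size=n)      # pressure deficit (hPa)
      rmax = rng.lognormal(mean=3.3, sigma=0.4, size=n)  # radius to max winds (km)

      def surge_response(dp, rmax, slr=0.0):
          """Toy surge response function (m): grows with intensity and size,
          shifted by sea level rise; coefficients are illustrative assumptions."""
          return 0.05 * dp * (rmax / 30.0) ** 0.3 + slr

      annual_rate = 0.15   # assumed landfalling-storm rate for the site (per yr)

      for slr in (0.0, 0.5, 1.0):                        # sea level rise (m)
          surge = surge_response(dp, rmax, slr)
          # Annual exceedance rate of a 3 m flood level, JPM-style.
          rate = annual_rate * np.mean(surge > 3.0)
          rp = 1.0 / rate if rate > 0 else float("inf")
          print(f"SLR {slr:.1f} m: ~{rp:,.0f}-yr return period for 3 m flooding")

    Landscape change enters an analysis like this either by modifying the response function directly or by widening its error term, which is the trade-off the abstract evaluates.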

  2. Experimental Validation of a Forward Looking Interferometer for Detection of Clear Air Turbulence due to Mountain Waves

    NASA Technical Reports Server (NTRS)

    Schaffner, Philip R.; Daniels, Taumi S.; West, Leanne L.; Gimmestad, Gary G.; Lane, Sarah E.; Burdette, Edward M.; Smith, William L.; Kireev, Stanislav; Cornman, Larry; Sharman, Robert D.

    2012-01-01

    The Forward-Looking Interferometer (FLI) is an airborne sensor concept for detection and estimation of potential atmospheric hazards to aircraft. The FLI concept is based on high-resolution Infrared Fourier Transform Spectrometry technologies that have been developed for satellite remote sensing. The FLI is being evaluated for its potential to address multiple hazards, during all phases of flight, including clear air turbulence, volcanic ash, wake vortices, low slant range visibility, dry wind shear, and icing. In addition, the FLI is being evaluated for its potential to detect hazardous runway conditions during landing, such as wet or icy asphalt or concrete. The validation of model-based instrument and hazard simulation results is accomplished by comparing predicted performance against empirical data. The mountain lee wave data collected in the previous FLI project showed a damped, periodic mountain wave structure. The wave data themselves will be of use in forecast and nowcast turbulence products such as the Graphical Turbulence Guidance and Graphical Turbulence Guidance Nowcast products. Determining how turbulence hazard estimates can be derived from FLI measurements will require further investigation.

  3. Deterministic approach for multiple-source tsunami hazard assessment for Sines, Portugal

    NASA Astrophysics Data System (ADS)

    Wronna, M.; Omira, R.; Baptista, M. A.

    2015-11-01

    In this paper, we present a deterministic approach to tsunami hazard assessment for the city and harbour of Sines, Portugal, one of the test sites of project ASTARTE (Assessment, STrategy And Risk Reduction for Tsunamis in Europe). Sines has one of the most important deep-water ports, which has oil-bearing, petrochemical, liquid-bulk, coal, and container terminals. The port and its industrial infrastructures face the ocean southwest towards the main seismogenic sources. This work considers two different seismic zones: the Southwest Iberian Margin and the Gloria Fault. Within these two regions, we selected a total of six scenarios to assess the tsunami impact at the test site. The tsunami simulations are computed using NSWING, a Non-linear Shallow Water model wIth Nested Grids. In this study, the static effect of tides is analysed for three different tidal stages: MLLW (mean lower low water), MSL (mean sea level), and MHHW (mean higher high water). For each scenario, the tsunami hazard is described by maximum values of wave height, flow depth, drawback, maximum inundation area and run-up. Synthetic waveforms are computed at virtual tide gauges at specific locations outside and inside the harbour. The final results describe the impact at the Sines test site considering the single scenarios at mean sea level, the aggregate scenario, and the influence of the tide on the aggregate scenario. The results confirm the composite source of Horseshoe and Marques de Pombal faults as the worst-case scenario, with wave heights of over 10 m, which reach the coast approximately 22 min after the rupture. It dominates the aggregate scenario by about 60 % of the impact area at the test site, considering maximum wave height and maximum flow depth. The HSMPF scenario inundates a total area of 3.5 km2.

  4. A contribution to the hazards assessment at Copahue volcano (Argentina-Chile) by facies analysis of a recent pyroclastic density current deposit

    NASA Astrophysics Data System (ADS)

    Balbis, C.; Petrinovic, I. A.; Guzmán, S.

    2016-11-01

    We recognised and interpreted a recent pyroclastic density current (PDC) deposit at Copahue volcano (Southern Andes) through a field survey and a sedimentological study. The relationships between the behaviour of the PDCs, the morphology of the Río Agrio valley and the eruptive dynamics were interpreted. We identified two lithofacies in the deposit that indicate variations in the eruptive dynamics: i) the opening of the conduit and the formation of a highly explosive eruption that formed a diluted PDC through the immediate collapse of the eruptive column; ii) a continued eruption that followed immediately and records the widening of the conduit, producing a dense PDC. The eruption occurred in 2000 CE and was phreatomagmatic (VEI ≤ 2), with a vesiculation level above 4000 m depth and fragmentation driven by the interaction of magma with a hydrothermal system at ca. 1500 m depth. As deduced from the comparison between the accessory lithics of this deposit and those of the 2012 CE eruption, the onset of vesiculation and the fragmentation level in this volcano are constant in depth. In order to reproduce the distribution pattern of this PDC deposit and to simulate potential PDC-forming processes, we performed several computational simulations from "denser" to "more diluted" conditions. The latter fairly reproduce the distribution of the studied deposit and represent perhaps one of the most dangerous possible scenarios of Copahue volcanic activity. The occurrence of PDCs has been considered a low-probability process in the latest volcanic hazards map; the evidence found in this contribution suggests instead that PDCs should be included as more probable, and thus as very important for the hazards assessment of Copahue volcano.

  5. Linus Pauling and the scientific debate over fallout hazards.

    PubMed

    Jolly, J Christopher

    2002-12-01

    From 1954 to 1963, numerous scientists engaged in a public debate over the possible hazards from radioactive fallout from nuclear weapons testing. Nobel laureate Linus Pauling, a California Institute of Technology chemist, was one of the most prominent. His scientific papers relating to the fallout debate reveal many of the scientific, social and political issues involved in the controversy. Although the public controversy ended after the signing of the 1963 Limited Test Ban Treaty, many of the scientific questions about the possible hazards of low-level radiation remain under debate within the scientific community. Moreover, the fallout debate was a prototype of current controversies over environmental and public-health hazards.

  6. Acceptability of screening for early detection of liver disease in hazardous/harmful drinkers in primary care.

    PubMed

    Eyles, Caroline; Moore, Michael; Sheron, Nicholas; Roderick, Paul; O'Brien, Wendy; Leydon, Geraldine M

    2013-08-01

    It is estimated that one-quarter of adults in the UK drink at harmful/hazardous levels, leading to increased mortality and alcohol liver disease (ALD). The Alcohol Liver Disease Detection Study (ALDDeS) aimed to test, in primary care, the feasibility of alcohol misuse screening in adults using the AUDIT questionnaire, and to assess screening of harmful/hazardous alcohol users for ALD using newer non-invasive serum markers of fibrosis. To explore patients' experiences of taking part in ALDDeS and their understanding of the delivery and process of screening for ALD using self-report questionnaires and feedback of liver fibrosis risk based on levels of non-invasive serum markers. A nested qualitative study based in five primary care practices in the UK. From a sample of patients identified as drinking at harmful/hazardous levels, 30 participants were selected by maximum variation sampling for qualitative in-depth interviews. Using the principles of constant comparison, the transcribed interviews were thematically analysed. Receiving a postal AUDIT questionnaire was viewed as acceptable by participants. For some, completing the AUDIT increased awareness of their hazardous alcohol use, and a positive blood test indicating liver fibrosis was a catalyst for behaviour change. For others, a negative blood test result provided a licence to continue drinking at hazardous levels. A limited understanding of safe drinking and of ALD was common. The educational and training needs of primary care professionals must be taken into account, so that patients with marker levels indicating a low risk of fibrosis are correctly informed about the likely risks of continuing to drink at the same levels.

  7. Causal Mediation Analysis for the Cox Proportional Hazards Model with a Smooth Baseline Hazard Estimator.

    PubMed

    Wang, Wei; Albert, Jeffrey M

    2017-08-01

    An important problem within the social, behavioral, and health sciences is how to partition an exposure effect (e.g. treatment or risk factor) among specific pathway effects and to quantify the importance of each pathway. Mediation analysis based on the potential outcomes framework is an important tool to address this problem and we consider the estimation of mediation effects for the proportional hazards model in this paper. We give precise definitions of the total effect, natural indirect effect, and natural direct effect in terms of the survival probability, hazard function, and restricted mean survival time within the standard two-stage mediation framework. To estimate the mediation effects on different scales, we propose a mediation formula approach in which simple parametric models (fractional polynomials or restricted cubic splines) are utilized to approximate the baseline log cumulative hazard function. Simulation study results demonstrate low bias of the mediation effect estimators and close-to-nominal coverage probability of the confidence intervals for a wide range of complex hazard shapes. We apply this method to the Jackson Heart Study data and conduct sensitivity analysis to assess the impact on the mediation effects inference when the no unmeasured mediator-outcome confounding assumption is violated.
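
    The core ingredient described above, a smooth parametric approximation to the baseline log cumulative hazard of a fitted Cox model, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the synthetic data, column names, and the cubic-in-log-time polynomial (standing in for fractional polynomials or restricted cubic splines) are all assumptions.

```python
# Sketch: fit a Cox model, then smooth its Breslow baseline cumulative
# hazard with a low-order polynomial in log time. Synthetic data; the
# polynomial stands in for the paper's fractional polynomials / splines.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({"exposure": rng.binomial(1, 0.5, n)})
df["mediator"] = 0.8 * df["exposure"] + rng.normal(size=n)
lp = 0.5 * df["exposure"] + 0.4 * df["mediator"]
df["time"] = rng.exponential(scale=np.exp(-lp))
df["event"] = 1  # no censoring in this toy example

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")

bch = cph.baseline_cumulative_hazard_          # Breslow estimate of H0(t)
t = bch.index.values
H0 = bch.iloc[:, 0].values
keep = (t > 0) & (H0 > 0)
coef = np.polyfit(np.log(t[keep]), np.log(H0[keep]), deg=3)

def survival(tt, x):
    """Smoothed survival S(t|x) = exp(-H0(t) * exp(beta'x))."""
    log_H0 = np.polyval(coef, np.log(tt))
    return np.exp(-np.exp(log_H0 + cph.params_ @ x))

print(survival(1.0, np.array([1.0, 0.8])))     # exposed, mediator at 0.8
```

    Mediation effects on the survival scale would then be obtained by averaging such model-based survival predictions over the mediator distribution under different exposure settings, per the mediation formula.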

  8. Hazard analysis of Arid and semi-Arid (ASAL) regions of Kenya.

    PubMed

    Tabu, J S; Otwelo, J A; Koskei, P; Makokha, P

    2013-06-01

    This paper describes a situation analysis of hazards in the arid and semi-arid lands of Kenya. The leading hazards affecting these lands are mainly natural and include, among others, drought, floods and landslides. Other hazards of importance were found to be war and conflict, HIV/AIDS and fires. Over 80% of these are weather related. The overall objective of this study was to prioritize hazards in the ASAL region. Specifically, the study identified the top ten hazards in the ASAL districts of Kenya, determined their probability of occurrence, analyzed the potential impact of each hazard and, utilizing a multiplier effect, prioritized the hazards using a hypothetical model. This was a descriptive study conducted in over half of Kenya's ASAL districts in four regions: Lower Eastern, Upper Eastern, North Eastern and part of the Coast region. Six districts were purposively selected per region, with six officers from each district, totalling one hundred and forty-four. Respondents were sourced from the agriculture, health, local government, provincial administration, environment and NGO sectors. Working in groups, the members analyzed the hazards of their respective districts through a consensus process, using a tool that had been developed and on which respondents had been trained. One hundred and forty-four (144) officers from twenty-four districts in the four regions were recruited; one hundred and twenty-seven (81%) were male and only 27 (19%) were female. The representation of participants per sector was governance 25%, followed by civil society organizations 21%, health 16%, agriculture and arid lands 15%, and research and scientific institutions 13%. The top priority hazards identified using the mean score were drought and famine (5.4), epidemics and epizootics (3.8), HIV/AIDS (3.6), war and conflict (2.5) and floods (2.5). The exercise confirmed the priority hazards in the arid and semi-arid regions of Kenya and described vulnerability factors that included water scarcity, poverty and low educational levels. The region suffers from a variety of hazards, in particular drought and famine, epidemics including HIV/AIDS, and war and conflict. Environmental degradation, though given a low score, may be more a matter of perception. There is a need to undertake a comprehensive hazard and vulnerability analysis at regional and country level to inform interventions and other developmental activities. Women should be targeted at the community and leadership levels, and efforts to empower them should be stepped up.
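
    The prioritization step described above, combining probability of occurrence with potential impact via a multiplier effect and ranking by mean score, can be sketched as follows. The hazard names and scores below are invented for illustration and are not the study's data.

```python
# Illustrative multiplier-effect prioritisation: each respondent scores a
# hazard's probability and impact (1-5); priority = mean(probability * impact).
# All numbers here are made up, not the study's data.
hazards = {
    "Drought and famine": [(5, 5), (4, 5), (5, 4)],
    "Floods":             [(3, 3), (2, 4), (3, 2)],
    "War and conflict":   [(2, 4), (3, 3), (2, 3)],
}

def mean_priority(scores):
    return sum(p * i for p, i in scores) / len(scores)

for name in sorted(hazards, key=lambda h: mean_priority(hazards[h]), reverse=True):
    print(f"{name}: {mean_priority(hazards[name]):.1f}")
```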

  9. Risk assessment based on a combination of historical analysis, a detailed field study and numerical modeling on the alluvial fan Gadeinerbach as a basis for a risk management concept

    NASA Astrophysics Data System (ADS)

    Moser, M.

    2009-04-01

    The catchment of the Gadeinerbach in the district of Lungau/Salzburg/Austria is prone to debris flows; large debris flow events date back to the years 1934 and 1953. In the upper catchment, large mass movements represent debris sources. A field study shows the debris potential, and the catchment resembles a "sleeping torrential giant". To design mitigation measures, a detailed risk management concept was developed, based on a risk assessment combining historical analysis, a field study and numerical modeling on the alluvial fan. Human activities have partly altered the surface of the alluvial fan Gadeinerbach, but nevertheless some important hazard indicators could be found. With these hazard indicators and photo analysis from the large 1934 debris flow event, the character of the catchment could be established. With the help of these historical data sets (hazard indicators, sediment and debris amounts, etc.) it is possible to calibrate the numerical models and to gain useful knowledge of their pros and cons and their application. The results were used to simulate the design event and, furthermore, to derive mitigation measures. The most effective protection proved to be a structure that dissipates the flow's high energy level to a lower one, combined with a debris/bedload deposition area. Expert opinion, the study of historical data and field work are, in addition to numerical simulation techniques, essential for work in the field of natural hazard management.

  10. Off-Nominal Performance of the International Space Station Solar Array Wings Under Orbital Eclipse Lighting Scenarios

    NASA Technical Reports Server (NTRS)

    Kerslake, Thomas W.; Scheiman, David A.

    2005-01-01

    This paper documents testing and analyses to quantify International Space Station (ISS) Solar Array Wing (SAW) string electrical performance under highly off-nominal, low-temperature-low-intensity (LILT) operating conditions with nonsolar light sources. This work is relevant for assessing feasibility and risks associated with a Sequential Shunt Unit (SSU) remove and replace (R&R) Extravehicular Activity (EVA). During eclipse, SAW strings can be energized by moonlight, EVA suit helmet lights or video camera lights. To quantify SAW performance under these off-nominal conditions, solar cell performance testing was performed using full moon, solar simulator and Video Camera Luminaire (VCL) light sources. Test conditions included 25 to 110 C temperatures and 1- to 0.0001-Sun illumination intensities. Electrical performance data and calculated eclipse lighting intensities were combined to predict SAW current-voltage output for comparison with electrical hazard thresholds. Worst case predictions show there is no connector pin molten metal hazard but crew shock hazard limits are exceeded due to VCL illumination. Assessment uncertainties and limitations are discussed along with operational solutions to mitigate SAW electrical hazards from VCL illumination. Results from a preliminary assessment of SAW arcing are also discussed. The authors recommend further analyses once SSU, R&R, and EVA procedures are better defined.

  11. Role of beach morphology in wave overtopping hazard assessment

    NASA Astrophysics Data System (ADS)

    Phillips, Benjamin; Brown, Jennifer; Bidlot, Jean-Raymond; Plater, Andrew

    2017-04-01

    Understanding the role of beach morphology in controlling wave overtopping volume will further minimise uncertainties in flood risk assessments at coastal locations defended by engineered structures worldwide. XBeach is used to model wave overtopping volume for a 1:200 yr joint probability distribution of waves and water levels with measured, pre- and post-storm beach profiles. The simulation with measured bathymetry is repeated with and without morphological evolution enabled during the modelled storm event. This research assesses the role of morphology in controlling wave overtopping volumes for hazardous events that meet the typical design level of coastal defence structures. Results show disabling storm-driven morphology under-represents modelled wave overtopping volumes by up to 39% under high Hs conditions, and has a greater impact on the wave overtopping rate than the variability applied within the boundary conditions due to the range of wave-water level combinations that meet the 1:200 yr joint probability criterion. Accounting for morphology in flood modelling is therefore critical for accurately predicting wave overtopping volumes and the resulting flood hazard and to assess economic losses.

  12. Best Practices in Physics-Based Fault Rupture Models for Seismic Hazard Assessment of Nuclear Installations

    NASA Astrophysics Data System (ADS)

    Dalguer, Luis A.; Fukushima, Yoshimitsu; Irikura, Kojiro; Wu, Changjiang

    2017-09-01

    Inspired by the first workshop on Best Practices in Physics-Based Fault Rupture Models for Seismic Hazard Assessment of Nuclear Installations (BestPSHANI), conducted by the International Atomic Energy Agency (IAEA) on 18-20 November 2015 in Vienna (http://www-pub.iaea.org/iaeameetings/50896/BestPSHANI), this PAGEOPH topical volume collects several extended articles from this workshop as well as several new contributions. A total of 17 papers have been selected, on topics ranging from the seismological aspects of earthquake cycle simulations for source-scaling evaluation, seismic source characterization, source inversion and ground motion modeling (based on finite fault rupture using dynamic, kinematic, stochastic and empirical Green's function approaches) to the engineering application of simulated ground motion for the analysis of the seismic response of structures. These contributions include applications to real earthquakes and descriptions of current practice for assessing seismic hazard in terms of nuclear safety in low-seismicity areas, as well as proposals for physics-based hazard assessment for critical structures near large earthquakes. Collectively, the papers of this volume highlight the usefulness of physics-based models for evaluating and understanding the physical causes of observed and empirical data, as well as for predicting ground motion beyond the range of recorded data. Particular importance is given to the validation and verification of the models by comparing synthetic results with observed data and empirical models.

  13. Predictability of state-level flood damage in the conterminous United States: the role of hazard, exposure and vulnerability

    DOE PAGES

    Zhou, Qianqian; Leng, Guoyong; Feng, Leyang

    2017-07-13

    Understanding historical changes in flood damage and the underlying mechanisms is critical for predicting future changes for better adaptation. In this study, a detailed assessment of flood damage for 1950–1999 is conducted at the state level in the conterminous United States (CONUS). Geospatial datasets on possible influencing factors are then developed by synthesizing natural hazards, population, wealth, cropland and urban area to explore their relations with flood damage. A considerable increase in flood damage in CONUS is recorded for the study period, which is well correlated with hazards. Comparably, runoff-indexed hazards simulated by the Variable Infiltration Capacity (VIC) model can explain a larger portion of flood damage variations than precipitation in 84% of the states. Cropland is identified as an important factor contributing to increased flood damage in the central US, while urban land exhibits positive and negative relations with total flood damage and damage per unit wealth in 20 and 16 states, respectively. Altogether, flood damage in 34 out of 48 investigated states can be predicted at the 90% confidence level. In extreme cases, ~76% of flood damage variations can be explained in some states, highlighting the potential of future flood damage prediction based on climate change and socioeconomic scenarios.
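
    The attribution analysis described above amounts to regressing damage on hazard and exposure factors and asking how much variance is explained. A minimal sketch with synthetic data (not the study's datasets or actual VIC model output):

```python
# Sketch: ordinary least squares of log flood damage on a runoff-indexed
# hazard and an exposure covariate, reporting explained variance (R^2).
# All series below are synthetic stand-ins.
import numpy as np

rng = np.random.default_rng(1)
years = 50
runoff_hazard = rng.gamma(2.0, 1.0, years)        # stand-in for a VIC runoff index
cropland = np.linspace(0.8, 1.2, years)           # stand-in exposure trend
log_damage = 0.9 * runoff_hazard + 1.5 * cropland + rng.normal(0, 0.5, years)

X = np.column_stack([np.ones(years), runoff_hazard, cropland])
beta, *_ = np.linalg.lstsq(X, log_damage, rcond=None)
resid = log_damage - X @ beta
r2 = 1 - resid.var() / log_damage.var()
print(f"coefficients: {beta.round(2)}, R^2 = {r2:.2f}")
```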

  15. The prevalence of lead-based paint hazards in U.S. housing.

    PubMed Central

    Jacobs, David E; Clickner, Robert P; Zhou, Joey Y; Viet, Susan M; Marker, David A; Rogers, John W; Zeldin, Darryl C; Broene, Pamela; Friedman, Warren

    2002-01-01

    In this study we estimated the number of housing units in the United States with lead-based paint and lead-based paint hazards. We included measurements of lead in intact and deteriorated paint, interior dust, and bare soil. A nationally representative, random sample of 831 housing units was evaluated in a survey between 1998 and 2000; the units and their occupants did not differ significantly from nationwide characteristics. Results indicate that 38 million housing units had lead-based paint, down from the 1990 estimate of 64 million. Twenty-four million had significant lead-based paint hazards. Of those with hazards, 1.2 million units housed low-income families (< 30,000 US dollars/year) with children under 6 years of age. Although 17% of government-supported, low-income housing had hazards, 35% of all low-income housing had hazards. For households with incomes greater than or equal to 30,000 US dollars/year, 19% had hazards. Fourteen percent of all houses had significantly deteriorated lead-based paint, and 16% and 7%, respectively, had dust lead and soil lead levels above current standards of the U.S. Department of Housing and Urban Development and the U.S. Environmental Protection Agency. The prevalence of lead-based paint and hazards increases with age of housing, but most painted surfaces, even in older housing, do not have lead-based paint. Between 2% and 25% of painted building components were coated with lead-based paint. Housing in the Northeast and Midwest had about twice the prevalence of hazards compared with housing in the South and West. The greatest risk occurs in older units with lead-based paint hazards that either will be or are currently occupied by families with children under 6 years of age and are low-income and/or are undergoing renovation or maintenance that disturbs lead-based paint. This study also confirms projections made in 2000 by the President's Task Force on Environmental Health Risks and Safety Risks to Children of the number of houses with lead-based paint hazards. Public- and private-sector resources should be directed to units posing the greatest risk if future lead poisoning is to be prevented.

  16. Natural and Man-Made Hazards in the Cayman Islands

    NASA Astrophysics Data System (ADS)

    Novelo-Casanova, D. A.; Suarez, G.

    2010-12-01

    Located in the western Caribbean Sea to the northwest of Jamaica, the Cayman Islands are a British overseas territory comprising three islands: Grand Cayman, Cayman Brac, and Little Cayman, which together occupy around 250 km2 of land area. In this work, historical and recent data were collected and classified to identify and rank the natural and man-made hazards that may potentially affect the Cayman Islands and to determine the level of exposure of Grand Cayman to these events. For this purpose, we used the vulnerability assessment methodology developed by the North Carolina Department of Environment and Natural Resources. The different degrees of physical vulnerability for each hazard were graphically interpreted with the aid of maps using a relative scoring system, and spatial maps were generated showing the areas with different levels of exposure to multiple hazards. The most important natural hazard to which the Cayman Islands are exposed is clearly hurricanes; to a lesser degree, the islands may occasionally be exposed to earthquakes and tsunamis. Explosions or leaks at the Airport Texaco Fuel Depot and the fuel pipeline on Grand Cayman are the most significant man-made hazards. Our results indicate that there are four areas in Grand Cayman with different levels of exposure to natural and man-made hazards: the North Sound, Little Sound and eastern West Bay (Area 1) show a very high level of exposure; the Central Mangroves, central Bodden Town, central George Town and the West Bay (Area 2) have a high level of exposure; northwestern West Bay, western George Town-Bodden Town, and East End-North Side (Area 3) are under moderate levels of exposure; and the remainder of the island shows low exposure (Area 4). It is important to underline that this study presents a first evaluation of the main natural and man-made hazards that may affect the Cayman Islands. The maps generated will be useful tools for emergency managers and policy developers and will increase the overall awareness of decision makers for disaster prevention and mitigation plans. Our results constitute the basis of future risk mitigation projects in the islands.

  17. St. Louis area earthquake hazards mapping project; seismic and liquefaction hazard maps

    USGS Publications Warehouse

    Cramer, Chris H.; Bauer, Robert A.; Chung, Jae-won; Rogers, David; Pierce, Larry; Voigt, Vicki; Mitchell, Brad; Gaunt, David; Williams, Robert; Hoffman, David; Hempen, Gregory L.; Steckel, Phyllis; Boyd, Oliver; Watkins, Connor M.; Tucker, Kathleen; McCallister, Natasha

    2016-01-01

    We present probabilistic and deterministic seismic and liquefaction hazard maps for the densely populated St. Louis metropolitan area that account for the expected effects of surficial geology on earthquake ground shaking. Hazard calculations were based on a map grid of 0.005°, or about every 500 m, and are thus higher in resolution than any earlier studies. To estimate ground motions at the surface of the model (e.g., site amplification), we used a new detailed near-surface shear-wave velocity model in a 1D equivalent-linear response analysis. When compared with the 2014 U.S. Geological Survey (USGS) National Seismic Hazard Model, which uses a uniform firm-rock-site condition, the new probabilistic seismic-hazard estimates document much more variability. Hazard levels for upland sites (consisting of bedrock and weathered bedrock overlain by loess-covered till and drift deposits) show up to twice the ground-motion values for peak ground acceleration (PGA), and similar ground-motion values for 1.0 s spectral acceleration (SA). Probabilistic ground-motion levels for lowland alluvial floodplain sites (generally the 20–40-m-thick modern Mississippi and Missouri River floodplain deposits overlying bedrock) exhibit up to twice the ground-motion levels for PGA, and up to three times the ground-motion levels for 1.0 s SA. Liquefaction probability curves were developed from available standard penetration test data assuming typical lowland and upland water table levels. A simplified liquefaction hazard map was created from the 5%-in-50-year probabilistic ground-shaking model. The liquefaction hazard ranges from low in the uplands to high in the lowlands (over 60% of the area expected to liquefy). Because many transportation routes, power and gas transmission lines, and population centers exist in or on the highly susceptible lowland alluvium, these areas in the St. Louis region are at significant potential risk from seismically induced liquefaction and associated ground deformation.

  19. Molybdenum in the environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jarrell, W.M.; Page, A.L.; Elseewi, A.A.

    1980-01-01

    While molybdenum is an essential element for both plants and animals, it becomes toxic above certain critical levels. Reviewed are the natural supply of molybdenum in the environment, the molybdenum cycle, the importance of molybdenum in industry and agriculture, and potential hazards that may occur when excessive levels of molybdenum occur in the environment. Although the potential of molybdenum toxicity to humans and non-ruminant animals appears to be low, the enrichment of the environment with molybdenum from modern mining, agricultural, and industrial activities has potentially hazardous implications for ruminant animal health. (3 graphs, numerous references, 16 tables)

  20. Development of Adygine glacier complex (glacier and proglacial lakes) and its link to outburst hazard

    NASA Astrophysics Data System (ADS)

    Falatkova, Kristyna; Schöner, Wolfgang; Häusler, Hermann; Reisenhofer, Stefan; Neureiter, Anton; Sobr, Miroslav; Jansky, Bohumir

    2017-04-01

    Mountain glacier retreat has a well-known impact on the life of local populations: besides anxiety over water supply for agriculture, industry or households, it has a direct influence on the occurrence of glacier hazards. This paper focuses specifically on lake outburst hazard, and aims to describe the past and future development of the Adygine glacier complex and to identify its relationship to this hazard. The glacier is situated in the northern Tien Shan, covers an area of 4 km2 with northern exposure at an elevation range of 3,500-4,200 m a.s.l., and ranks among small-sized glaciers; we therefore expect it to respond faster to climate change than larger glaciers. Below the glacier there is a three-level cascade of proglacial lakes at different stages of development. The site has been observed sporadically since the 1960s; closer study has been carried out since 2007. Past development of the glacier-lake complex is analyzed by a combination of satellite imagery interpretation and on-site measurements (geodetic and bathymetric surveys). A glacier mass balance model is used to simulate the future development of the glacier under climate scenarios. We used the simulated future glacier extent and the glacier bed topography provided by a GPR survey to assess the potential for future lake formation. This enables us to assess the outburst hazard for the three selected lakes, with an outlook on possible hazard changes linked to the further evolution of the complex under climate change scenarios. Considering the proximity of the capital Bishkek, spreading settlements, and the increased demand for tourism-related infrastructure within the main valley, it is highly important to identify the present and possible future hazards that could affect this region.
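
    The abstract does not specify the mass balance model; a classic minimal choice for this kind of scenario study is a degree-day model, sketched below. The degree-day factor, snow threshold, and weather series are illustrative assumptions, not the authors' values.

```python
# Minimal degree-day mass balance sketch (illustrative, not the authors'
# model): melt scales with positive degree-days, accumulation is solid
# precipitation below a temperature threshold.
import random

DDF = 0.006     # degree-day factor, m w.e. per degC-day (assumed)
T_SNOW = 1.5    # rain/snow threshold, degC (assumed)

def annual_mass_balance(daily_temp_c, daily_precip_m):
    melt = sum(DDF * t for t in daily_temp_c if t > 0)
    accumulation = sum(p for t, p in zip(daily_temp_c, daily_precip_m) if t <= T_SNOW)
    return accumulation - melt          # metres water equivalent per year

random.seed(0)
temps = [random.gauss(-2.0, 6.0) for _ in range(365)]
precip = [max(0.0, random.gauss(0.002, 0.002)) for _ in range(365)]
print("present:", round(annual_mass_balance(temps, precip), 2), "m w.e.")
print("+2 degC:", round(annual_mass_balance([t + 2 for t in temps], precip), 2), "m w.e.")
```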

  1. Forest vegetation simulation tools and forest health assessment

    Treesearch

    Richard M. Teck; Melody Steele

    1995-01-01

    A Stand Hazard Rating System for central Idaho forests has been incorporated into the Central Idaho Prognosis variant of the Forest Vegetation Simulator to evaluate how insect, disease and fire hazards within the Deadwood River Drainage change over time. A custom interface, BOISE.COMPUTE.PR, has been developed so hazard ratings can be electronically downloaded...

  2. Directing driver attention with augmented reality cues

    PubMed Central

    Rusch, Michelle L.; Schall, Mark C.; Gavin, Patrick; Lee, John D.; Dawson, Jeffrey D.; Vecera, Shaun; Rizzo, Matthew

    2013-01-01

    This simulator study evaluated the effects of augmented reality (AR) cues designed to direct the attention of experienced drivers to roadside hazards. Twenty-seven healthy middle-aged licensed drivers with a range of attention capacity participated in a 54-mile (1.5-hour) drive in an interactive fixed-base driving simulator. Each participant received AR cues to potential roadside hazards in six simulated straight (9-mile-long) rural roadway segments. Drivers were evaluated on response time for detecting a potentially hazardous event, detection accuracy for target (hazard) and non-target objects, and headway with respect to the hazards. Results showed no negative outcomes associated with cue interference: AR cues did not impair perception of non-target objects, including for drivers with lower attentional capacity. Results showed near-significant response-time benefits for AR-cued hazards. AR cueing increased the response rate for detecting pedestrians and warning signs, but not vehicles. AR system false alarms and misses did not impair driver responses to potential hazards.

  3. DEVELOPMENT AND ANALYSIS OF AIR QUALITY MODELING SIMULATIONS FOR HAZARDOUS AIR POLLUTANTS

    EPA Science Inventory

    The concentrations of five hazardous air pollutants were simulated using the Community Multi Scale Air Quality (CMAQ) modeling system. Annual simulations were performed over the continental United States for the entire year of 2001 to support human exposure estimates. Results a...

  4. Ecological modelling and toxicity data coupled to assess population recovery of marine amphipod Gammarus locusta: Application to disturbance by chronic exposure to aniline.

    PubMed

    de los Santos, Carmen B; Neuparth, Teresa; Torres, Tiago; Martins, Irene; Cunha, Isabel; Sheahan, Dave; McGowan, Tom; Santos, Miguel M

    2015-06-01

    An agent-based population model of the marine amphipod Gammarus locusta was designed and implemented as a basis for the ecological risk assessment of chemical pollutants that impair life-history traits at the individual level. We further used the model to assess the toxic effects of aniline (a priority hazardous and noxious substance, HNS) on amphipod populations, using empirically built dose-response functions derived from a chronic bioassay that we previously performed with this species, in which we observed significant toxicant-induced mortality and adverse effects on reproductive performance (reduced newborn production) at the individual level. Coupling the population model with the toxicological data from the chronic bioassay allowed the projection of the ecological costs associated with exposure to aniline that might occur in wild populations. Model simulations with different scenarios indicated that even low-level prolonged exposure to aniline can have significant long-term impacts on G. locusta population abundance and on the time until the impacted population returns to undisturbed levels. This approach may be a useful complement in ecotoxicological studies of chemical pollution, transferring data collected at the individual level to ecologically relevant levels.
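
    The coupling described above, bioassay-derived dose-response functions feeding a population model, can be illustrated with a much simpler discrete-time projection than the authors' agent-based model. All rates and dose-response parameters below are hypothetical.

```python
# Sketch: a toxicant-modified population projection. A logistic
# dose-response scales baseline survival and fecundity; this is a
# simplified stand-in for the agent-based model, with invented rates.
def dose_response(conc, ec50=2.0, slope=1.5):
    """Fraction of the baseline vital rate retained at concentration conc."""
    return 1.0 / (1.0 + (conc / ec50) ** slope)

def project(n0, weeks, conc, base_survival=0.9, base_fecundity=0.5):
    n = n0
    for _ in range(weeks):
        s = base_survival * dose_response(conc)
        f = base_fecundity * dose_response(conc)
        n = n * s + n * s * f        # survivors plus their offspring
    return n

print(f"26-week abundance: control={project(100, 26, 0.0):.0f}, "
      f"exposed={project(100, 26, 1.0):.0f}")
```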

  5. Integrating Hydrologic and Water Quality Models as a Decision Support Tool for Implementation of Low Impact Development in a Coastal Urban Watershed under Climate Variability and Sea Level Rise

    NASA Astrophysics Data System (ADS)

    Chang, N. B.

    2016-12-01

    Many countries are concerned about development and redevelopment efforts in urban regions to reduce flood risk, considering hazards such as high-tide events, storm surge, flash floods, stormwater runoff, and the impacts of sea level rise. Combining these present and future hazards with the vulnerable characteristics found throughout coastal communities, such as predominantly low-lying areas and increasing urban development, creates scenarios of increasing exposure to flood hazard. As such, the most vulnerable areas require adaptation strategies and mitigation actions for flood hazard management. In addition, in the U.S., Numeric Nutrient Criteria (NNC) are a critical tool for protecting and restoring the designated uses of a waterbody with regard to nitrogen and phosphorus pollution. Strategies such as low impact development (LID) have been promoted in recent years as an alternative to traditional stormwater management and drainage to control both flooding and water quality impacts. LID utilizes decentralized, multifunctional site designs and incorporates on-site stormwater management practices rather than conventional approaches that divert flow toward centralized facilities. How to integrate hydrologic and water quality models to achieve such decision support becomes a challenge. The Cross Bayou Watershed of Pinellas County in Tampa Bay, a highly urbanized coastal watershed, is utilized as a case study due to its sensitivity to flood hazards and water quality management within the watershed. This study will aid the County, as a decision maker, in implementing its stormwater management policy and honoring recent state NNC policy via demonstration of an integrated hydrologic and water quality model, comprising the Interconnected Channel and Pond Routing Model v.4 (ICPR4) and the BMPTRAIN model, as a decision support tool. ICPR4 can be further coupled with the ADCIRC/SWAN model to reflect storm surge and sea level rise in coastal regions.

  6. 'I didn't see that coming': simulated visual fields and driving hazard perception test performance.

    PubMed

    Glen, Fiona C; Smith, Nicholas D; Jones, Lee; Crabb, David P

    2016-09-01

    Evidence is limited regarding the specific types of visual field loss associated with unsafe driving. We use novel gaze-contingent software to examine the effect of simulated visual field loss on computer-based driving hazard detection, with the specific aim of testing the impact of scotomata located to the right and left of fixation. The 'hazard perception test' is a component of the UK driving licence examination which measures the speed of detecting 15 different hazards in a series of real-life driving films. We developed a novel eye-tracking and computer setup capable of generating a realistic gaze-contingent scotoma simulation (GazeSS) overlaid on film content. Thirty drivers with healthy vision completed three versions of the hazard perception test in a repeated-measures experiment. In two versions, GazeSS simulated a scotoma in the binocular field of view to the left or right of fixation; a third version was unmodified, to establish baseline performance. Participants' mean baseline hazard perception test score was 51 ± 7 (out of 75). This reduced to 46 ± 9 and 46 ± 11 when completing the task with a binocular visual field defect located to the left and right of fixation, respectively. While the main effect of simulated visual field loss on performance was statistically significant (p = 0.007), there were no average differences between the experimental conditions where the scotoma was located to the right or left of fixation. Simulated visual field loss impairs driving hazard detection on a computer-based test. There was no statistically significant difference in average performance when the simulated scotoma was located to the right or left of fixation of the binocular visual field, but certain types of hazard caused more difficulties than others.

  7. 49 CFR 173.241 - Bulk packagings for certain low hazard liquid and solid materials.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... B of part 173 of this subchapter and the special provisions specified in column 7 of the § 172.101... for the specific hazardous material in Column (7) of the § 172.101 Table of this subchapter and the... performance level as specified in Column (5) of the § 172.101 Table for the material being transported. (1...

  8. 49 CFR 173.241 - Bulk packagings for certain low hazard liquid and solid materials.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... B of part 173 of this subchapter and the special provisions specified in column 7 of the § 172.101... for the specific hazardous material in Column (7) of the § 172.101 Table of this subchapter and the... performance level as specified in Column (5) of the § 172.101 Table for the material being transported. (1...

  9. 49 CFR 173.241 - Bulk packagings for certain low hazard liquid and solid materials.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... B of part 173 of this subchapter and the special provisions specified in column 7 of the § 172.101... for the specific hazardous material in Column (7) of the § 172.101 Table of this subchapter and the... performance level as specified in Column (5) of the § 172.101 Table for the material being transported. (1...

  10. 49 CFR 173.241 - Bulk packagings for certain low hazard liquid and solid materials.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... B of part 173 of this subchapter and the special provisions specified in column 7 of the § 172.101... for the specific hazardous material in Column (7) of the § 172.101 Table of this subchapter and the... performance level as specified in Column (5) of the § 172.101 Table for the material being transported. (1...

  11. 49 CFR 173.241 - Bulk packagings for certain low hazard liquid and solid materials.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... B of part 173 of this subchapter and the special provisions specified in column 7 of the § 172.101... for the specific hazardous material in Column (7) of the § 172.101 Table of this subchapter and the... performance level as specified in Column (5) of the § 172.101 Table for the material being transported. (1...

  12. Physical and environmental hazards in the prosthetics and orthotics workshop: a pilot study

    PubMed Central

    ANDERSON, Sarah; STUCKEY, Rwth; POOLE, Diana; OAKMAN, Jodi

    2017-01-01

    Prosthetists and orthotists (P&Os) are exposed to physical hazards within the workshop environment. Concern regarding these exposures has been expressed by P&Os; however, little research has been undertaken. Exposure to noise and volatile organic compounds in amounts larger than statutorily allowed can have adverse short- and long-term consequences for people's health. The aim was to identify and quantify hazardous noise and chemical exposures in a typical P&O workplace. Noise and volatile organic compound testing was undertaken in 2011 and 2013; modifications to the workshop occurred between these testing times and the impact of these changes was examined. The levels of volatile organic compounds were very low in all areas in 2011 and 2013. Noise levels were high, and staff require the use of PPE to prevent exposure beyond the levels prescribed in the Australian Standards. Occupational environmental exposures in P&O are of concern to the profession. A pilot study of one facility demonstrated that occupational noise exposures are high and may result in hearing loss and other adverse health outcomes, while occupational chemical exposures through volatile organic compounds are relatively low. Further systematic investigation is required to develop evidence-based control strategies.

  13. Direct estimates of low-level radiation risks of lung cancer at two NRC-compliant nuclear installations: why are the new risk estimates 20 to 200 times the old official estimates?

    PubMed

    Bross, I D; Driscoll, D L

    1981-01-01

    An official report on the health hazards to nuclear submarine workers at the Portsmouth Naval Shipyard (PNS), who were exposed to low-level ionizing radiation, was based on a casual inspection of the data and not on statistical analyses of the dosage-response relationships. When these analyses are done, serious hazards from lung cancer and other causes of death are shown. As a result of the recent studies on nuclear workers, the new risk estimates have been found to be much higher than the official estimates currently used in setting NRC permissible levels. The official BEIR estimates are about one lung cancer death per year per million persons per rem; the PNS data show 189 lung cancer deaths per year per million persons per rem.

  14. The Hawaiian Volcano Observatory's current approach to forecasting lava flow hazards (Invited)

    NASA Astrophysics Data System (ADS)

    Kauahikaua, J. P.

    2013-12-01

    Hawaiian volcanoes are best known for their frequent basaltic eruptions, which typically start with fast-moving channelized `a`a flows fed by high eruption rates. If the flows continue, they generally transition into pahoehoe flows, fed by lower eruption rates, after a few days to weeks. Kilauea Volcano's ongoing eruption illustrates this: since 1986, effusion at Kilauea has mostly produced pahoehoe. The current state of lava flow simulation is quite advanced, but the simplicity of the models means that they are most appropriately used during the first, most vigorous, days to weeks of an eruption, during the effusion of `a`a flows. Colleagues at INGV in Catania have shown decisively that MAGFLOW simulations utilizing satellite-derived eruption rates can be effective at estimating hazards during the initial periods of an eruption crisis. However, the algorithms do not simulate the complexity of pahoehoe flows. Forecasts of lava flow hazards are the most common form of volcanic hazard assessment made in Hawai`i. Communications with emergency managers over the last decade have relied on simple steepest-descent line maps, coupled with empirical lava flow advance rate information, to portray the imminence of the lava flow hazard to nearby communities. Lavasheds, calculated in the same way as watersheds, are used as a broader context for future flow paths and to advise on the utility of diversion efforts, should they be contemplated. The key is to communicate the uncertainty of any approach used to formulate a forecast; if the forecast uses simple tools, these communications can be fairly straightforward. The calculation of steepest-descent paths and lavasheds relies on the accuracy of the digital elevation model (DEM) used, so the choice of DEM is critical. In Hawai`i, the best choice is not the most recent but a 1980s-vintage 10-m DEM; more recent LIDAR and satellite radar DEMs are referenced to the ellipsoid and include vegetation effects. On low-slope terrain, steepest-descent lines calculated on a geoid-based DEM may differ significantly from those calculated on an ellipsoid-based DEM. Good estimates of lava flow advance rates can be obtained from empirical compilations of historical advance rates of Hawaiian lava flows; in this way, rates appropriate for the observed flow type (`a`a or pahoehoe, channelized or not) can be applied. Eruption rate is arguably the most important factor, while slope is also significant at low eruption rates. Eruption rate, however, remains the most difficult parameter to estimate during an active eruption. The simplicity of the HVO approach is its major benefit. How much better can lava-flow advance be forecast for all types of lava flows? Will the improvements outweigh the increased uncertainty propagated through the simulation calculations? HVO continues to improve and evaluate its lava flow forecasting tools to provide better hazard assessments to emergency personnel.
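
    A minimal version of the steepest-descent calculation mentioned above can be run directly on a DEM grid with a D8-style walk to the lowest neighbour. The cone-shaped DEM below is synthetic; real use would start from the geoid-referenced 10-m DEM the abstract recommends.

```python
# D8-style steepest-descent path on a DEM grid (synthetic example).
import numpy as np

def steepest_descent_path(dem, start, max_steps=10000):
    r, c = start
    path = [start]
    for _ in range(max_steps):
        neighbours = [((r + dr, c + dc), dem[r + dr, c + dc])
                      for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                      if (dr or dc)
                      and 0 <= r + dr < dem.shape[0]
                      and 0 <= c + dc < dem.shape[1]]
        (nr, nc), z = min(neighbours, key=lambda nb: nb[1])
        if z >= dem[r, c]:           # pit or flat area: stop
            break
        r, c = nr, nc
        path.append((r, c))
    return path

# Synthetic cone: elevation falls away from the summit at (50, 50).
y, x = np.mgrid[0:100, 0:100]
dem = 1000.0 - 5.0 * np.hypot(x - 50, y - 50)
print(steepest_descent_path(dem, (50, 55))[:5])   # heads straight downslope
```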

  15. A Windshear Hazard Index

    NASA Technical Reports Server (NTRS)

    Proctor, Fred H.; Hinton, David A.; Bowles, Roland L.

    2000-01-01

    An aircraft exposed to hazardous low-level windshear may suffer a critical loss of airspeed and altitude, thus endangering its ability to remain airborne. In order to characterize this hazard, a nondimensional index was developed based on aerodynamic principles and an understanding of windshear phenomena. This paper reviews the development and application of the Bowles F-factor, which is now used by onboard sensors for the detection of hazardous windshear. It was developed and tested during NASA/FAA's airborne windshear program and is now required for FAA certification of onboard radar windshear detection systems. Reviewed in this paper are: 1) the definition of windshear and a description of the atmospheric phenomena that may cause hazardous windshear; 2) the derivation and discussion of the F-factor; 3) the development of the F-factor hazard threshold; 4) its testing during field deployments; and 5) its use in accident reconstructions.
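
    In its commonly cited form, the F-factor combines a horizontal shear term and a downdraft term, F = (dWx/dt)/g - w/V, with values above roughly 0.1 (averaged along the flight path) treated as hazardous. The sketch below uses that form with invented encounter numbers; consult the paper itself for the exact definition and threshold derivation.

```python
# Sketch of the windshear hazard index in its commonly cited form:
#   F = (dWx/dt)/g - w/V
# Wx: horizontal wind along the flight path, w: vertical wind (positive up),
# V: true airspeed. Encounter values below are illustrative.
G = 9.81  # m/s^2

def f_factor(dWx_dt, w, airspeed):
    return dWx_dt / G - w / airspeed

dWx_dt = 0.08 * G   # increasing tailwind, shear equivalent to 0.08 g
w = -3.5            # m/s downdraft
V = 75.0            # m/s true airspeed
print(f"F = {f_factor(dWx_dt, w, V):.3f}  (> ~0.1 commonly treated as hazardous)")
```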

  16. Direction based Hazard Routing Protocol (DHRP) for disseminating road hazard information using road side infrastructures in VANETs.

    PubMed

    Berlin, M A; Anand, Sheila

    2014-01-01

    This paper presents the Direction based Hazard Routing Protocol (DHRP) for disseminating information about fixed road hazards, such as road blocks, fallen trees, boulders on the road, snow pile-up, landslides, road maintenance work and other obstacles, to vehicles approaching the hazardous location. The proposed work focuses on the dissemination of hazard messages on highways with sparse traffic. A vehicle coming across the hazard reports its presence, and roadside fixed infrastructure units are used for reliable and timely delivery of hazard messages to vehicles, which can then take appropriate safety action to avoid the hazardous location. The proposed protocol was implemented and tested using the SUMO simulator to generate road traffic and the NS 2.33 network simulator to analyze the performance of DHRP. The performance of the proposed protocol was also compared with a simple flooding protocol, and the results are presented.

  17. Site-specific probabilistic ecological risk assessment of a volatile chlorinated hydrocarbon-contaminated tidal estuary.

    PubMed

    Hunt, James; Birch, Gavin; Warne, Michael St J

    2010-05-01

    Groundwater contaminated with volatile chlorinated hydrocarbons (VCHs) was identified as discharging to Penrhyn Estuary, an intertidal embayment of Botany Bay, New South Wales, Australia. A screening-level hazard assessment of surface water in Penrhyn Estuary identified an unacceptable hazard to marine organisms posed by VCHs. Given the limitations of hazard assessments, the present study conducted a higher-tier, quantitative probabilistic risk assessment using the joint probability curve (JPC) method, which accounts for variability in exposure and toxicity profiles to quantify risk (delta). Risk was assessed for 24 scenarios, covering four areas of the estuary, three exposure scenarios (low tide, high tide, and both low and high tides) and two toxicity scenarios (chronic no-observed-effect concentrations [NOECs] and 50% effect concentrations [EC50s]). Risk (delta) was greater at low tide than at high tide and varied throughout the tidal cycle. Spatial distributions of risk in the estuary were similar using both NOEC and EC50 data. The exposure scenario combining data from both tides was considered the most accurate representation of the ecological risk in the estuary. When assessing risk using data across both tides, the greatest risk was identified in the Springvale tributary (delta = 25%), closest to the source area, followed by the inner estuary (delta = 4%) and the Floodvale tributary (delta = 2%), with the lowest risk in the outer estuary (delta = 0.1%), farthest from the source area. Going from the screening-level ecological risk assessment (ERA) to the probabilistic ERA changed the risk from unacceptable to acceptable in 50% of the exposure scenarios in two of the four areas within the estuary. The probabilistic ERA provided a more realistic assessment of risk than the screening-level hazard assessment.
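
    The JPC calculation amounts to integrating the fraction of species affected (from a species sensitivity distribution) over the exposure concentration distribution. A sketch with invented lognormal distributions, not the study's fitted parameters:

```python
# Sketch of a joint-probability-curve style risk estimate: delta is the
# expected fraction of species affected, integrating the species
# sensitivity distribution (SSD) over the exposure distribution.
# Distribution parameters below are invented.
import numpy as np
from scipy import stats
from scipy.integrate import trapezoid

exposure = stats.lognorm(s=1.0, scale=5.0)    # VCH concentration, ug/L
ssd = stats.lognorm(s=0.8, scale=50.0)        # species NOECs, ug/L

c = np.linspace(1e-3, 500.0, 20000)
delta = trapezoid(ssd.cdf(c) * exposure.pdf(c), c)
print(f"risk delta = {100 * delta:.1f}% of species expected to be affected")
```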

  18. Final Results from A Pilot Project to Investigate Wake Vortex Patterns and Weather Patterns at the Atlantic City Airport by the Richard Stockton College of NJ and the FAA

    NASA Astrophysics Data System (ADS)

    Trout, Joseph; Manson, J. Russell; King, David; Decicco, Nicolas; Prince, Alyssa; di Mercurio, Alexis; Rios, Manual

    2017-01-01

    Wake vortex turbulence is the turbulence generated by an aircraft in flight. It is created by vortices at the wing tips that may decay slowly and persist for several minutes after creation; these vortices and the associated turbulence are hazardous to other aircraft in the vicinity. The strength, formation and lifetime of the turbulence and vortices are affected by many factors, including the weather. Here we present the final results of a pilot project investigating low-level wind fields generated with the Weather Research and Forecasting Model, together with an analysis of historical data. The findings from the historical data and the simulations were used as inputs to the computational fluid dynamics model OpenFOAM, showing that the vortices can be simulated using OpenFOAM. Presented here are the updated results from a research grant, ``A Pilot Project to Investigate Wake Vortex Patterns and Weather Patterns at the Atlantic City Airport by the Stockton University and the FAA''.

  19. An empirical comparison of statistical tests for assessing the proportional hazards assumption of Cox's model.

    PubMed

    Ng'andu, N H

    1997-03-30

    In the analysis of survival data using the Cox proportional hazard (PH) model, it is important to verify that the explanatory variables analysed satisfy the proportional hazard assumption of the model. This paper presents results of a simulation study that compares five test statistics to check the proportional hazard assumption of Cox's model. The test statistics were evaluated under proportional hazards and the following types of departures from the proportional hazard assumption: increasing relative hazards; decreasing relative hazards; crossing hazards; diverging hazards, and non-monotonic hazards. The test statistics compared include those based on partitioning of failure time and those that do not require partitioning of failure time. The simulation results demonstrate that the time-dependent covariate test, the weighted residuals score test and the linear correlation test have equally good power for detection of non-proportionality in the varieties of non-proportional hazards studied. Using illustrative data from the literature, these test statistics performed similarly.
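
    For a flavour of such checks in practice, the sketch below runs a scaled-Schoenfeld-residual style test (one of the weighted-residuals family compared above) on synthetic data with lifelines; the data and column names are arbitrary, not the paper's simulation design.

```python
# Sketch: testing the proportional hazards assumption on a fitted Cox
# model via lifelines' scaled-Schoenfeld-residual based test.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.statistics import proportional_hazard_test

rng = np.random.default_rng(3)
n = 400
df = pd.DataFrame({"x": rng.binomial(1, 0.5, n)})
df["time"] = rng.exponential(scale=np.exp(-0.7 * df["x"]))
df["event"] = rng.binomial(1, 0.9, n)

cph = CoxPHFitter().fit(df, duration_col="time", event_col="event")
result = proportional_hazard_test(cph, df, time_transform="rank")
result.print_summary()   # small p-values suggest non-proportional hazards
```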

  20. Vertical wind shear characteristics that promote supercell-to-MCS transitions

    NASA Astrophysics Data System (ADS)

    Peters, J. M.

    2017-12-01

    What causes supercells to transition into MCSs in some situations, but not others? To explore this question, I first examined observed environmental characteristics of supercell events when MCSs formed, and compared them to the analogous environmental characteristics of supercell events when MCSs did not form. During events when MCS growth occurred, 0-1 km (low-level) vertical wind shear was stronger and 0-10 km (deep-layer) vertical wind shear was weaker than the wind shear during events when MCS growth did not occur. Next, I used idealized simulations of supercell thunderstorms to understand the connections between low-level and deep-layer shear and MCS growth. Compared to simulations with strong deep-layer shear, the simulations with weak deep-layer shear had rain in the storm's forward-flank downdraft (FFD) that fell closer to the updraft, fell through storm-moistened air and evaporated less, and produced a more intense FFD. Compared to simulations with weak low-level shear, the simulations with stronger low-level shear showed enhanced northward low-level hydrometeor transport into the FFD. Environments with strong low-level shear and weak deep-layer shear therefore conspired to produce a storm with a more intense FFD cold pool, when compared to environments with weak low-level shear and/or strong deep-layer shear. This strong FFD periodically disrupted the supercells' mesocyclones, and favorably interacted with westerly wind shear to produce widespread linear convection initiation, which drove MCS growth. These results suggest that increasing low-level wind shear after dark - while commonly assumed to enhance tornado potential - may in fact drive MCS growth and reduce tornado potential, unless it is combined with sufficiently strong deep layer shear.
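
    The two shear layers discussed above are simple bulk wind differences; a sketch of computing them from a made-up hodograph:

```python
# Sketch: 0-1 km and 0-10 km bulk wind difference ("bulk shear") from a
# sounding. Wind profile values are invented for illustration.
import numpy as np

heights_km = np.array([0.0, 0.5, 1.0, 3.0, 6.0, 10.0])
u = np.array([2.0, 6.0, 10.0, 14.0, 18.0, 22.0])   # west-east wind, m/s
v = np.array([1.0, 4.0, 6.0, 7.0, 8.0, 8.0])       # south-north wind, m/s

def bulk_shear(z_top_km):
    ut = np.interp(z_top_km, heights_km, u)
    vt = np.interp(z_top_km, heights_km, v)
    return float(np.hypot(ut - u[0], vt - v[0]))

print(f"0-1 km shear:  {bulk_shear(1.0):.1f} m/s")   # low-level shear
print(f"0-10 km shear: {bulk_shear(10.0):.1f} m/s")  # deep-layer shear
```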

  1. A pooled analysis of the association of isolated low levels of high-density lipoprotein cholesterol with cardiovascular mortality in Japan.

    PubMed

    Hirata, Takumi; Sugiyama, Daisuke; Nagasawa, Shin-Ya; Murakami, Yoshitaka; Saitoh, Shigeyuki; Okayama, Akira; Iso, Hiroyasu; Irie, Fujiko; Sairenchi, Toshimi; Miyamoto, Yoshihiro; Yamada, Michiko; Ishikawa, Shizukiyo; Miura, Katsuyuki; Ueshima, Hirotsugu; Okamura, Tomonori

    2017-07-01

    Low levels of serum high-density lipoprotein cholesterol (HDL-C) have been shown to be associated with increased risk of coronary heart disease (CHD). However, because this is usually observed in the context of other lipid abnormalities, it is not known whether isolated low serum HDL-C levels are an independent risk factor for CHD. We performed a large pooled analysis in Japan using data from nine cohorts with 41,206 participants aged 40-89 years who were free of cardiovascular disease at baseline. We divided participants into three groups: isolated low HDL-C, non-isolated low HDL-C, and normal HDL-C. Cohort-stratified Cox proportional hazards models were used to estimate multivariate-adjusted hazard ratios (HRs) for death due to CHD, ischemic stroke, and intracranial cerebral hemorrhage; during a 12.9-year follow-up, we observed 355, 286, and 138 deaths from these causes, respectively. Non-isolated low HDL-C was significantly associated with increased risk of CHD compared with normal HDL-C (HR 1.37, 95% confidence interval (CI) 1.04-1.80); however, isolated low HDL-C was not. Although isolated low HDL-C was significantly associated with decreased risk of CHD (HR 0.51, 95% CI 0.29-0.89) in women, it was significantly associated with increased risk of intracranial cerebral hemorrhage in all participants (HR 1.62, 95% CI 1.04-2.53) and in men (HR 2.00, 95% CI 1.04-3.83). In conclusion, isolated low HDL-C levels are not associated with increased risk of CHD in Japan. CHD risk may, therefore, be more strongly affected by serum total cholesterol levels in this population.
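
    The cohort-stratified Cox model described above gives each cohort its own baseline hazard while pooling the covariate effects. A minimal sketch with synthetic data and hypothetical column names:

```python
# Sketch: cohort-stratified Cox proportional hazards model via lifelines.
# Synthetic data; column names and effect sizes are hypothetical.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(4)
n = 1200
df = pd.DataFrame({
    "cohort": rng.integers(0, 9, n),              # nine pooled cohorts
    "isolated_low_hdl": rng.binomial(1, 0.15, n),
    "age": rng.uniform(40, 89, n),
})
lp = 0.02 * (df["age"] - 60) + 0.1 * df["isolated_low_hdl"]
df["time"] = rng.exponential(scale=np.exp(-lp))
df["chd_death"] = rng.binomial(1, 0.1, n)

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="chd_death", strata=["cohort"])
cph.print_summary()   # hazard ratios pooled across cohort-specific baselines
```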

  2. Proposal of global flood vulnerability scenarios for evaluating future potential flood losses

    NASA Astrophysics Data System (ADS)

    Kinoshita, Y.; Tanoue, M.; Watanabe, S.; Hirabayashi, Y.

    2015-12-01

    Flooding is one of the most hazardous and damaging natural disasters, causing serious economic loss and casualties across the world (Jongman et al., 2015). Previous studies showed that the global temperature increase affects regional weather patterns, and several general circulation model (GCM) simulations suggest an increase of flood events in both frequency and magnitude in many parts of the world (Hirabayashi et al., 2013). Effective adaptation to potential flood risks under a warming climate requires an in-depth understanding of both the physical and socioeconomic contributors to flood risk. Assessing realistic future potential flood risk requires sophisticated future vulnerability scenarios associated with the shared socioeconomic pathways (SSPs). In this study we propose new future vulnerability scenarios for mortality. Our vulnerability scenarios are constructed from modeled flood exposure (the population potentially affected by flooding) and past flood fatality records from 1980 to 2005. All the flood fatality data were classified according to four income levels (high, mid-high, mid-low and low). Our proposed scenarios have three pathways tied to the SSPs: a High Efficiency (HE) scenario (SSP1, SSP4 (rich countries) and SSP5), a Medium Efficiency (ME) scenario (SSP2), and a Low Efficiency (LE) scenario (SSP3 and SSP4 (poor countries)). The maximum mortality protection level in each category was detected by applying exponential curve fitting with an offset term. Slopes in the HE scenario are assumed to equal the slopes estimated by regression analysis in each category; the HE slope is defined by the mean slope value over all countries, approximately a -0.33 change in mortality per year. The EM-DAT mortality data show a decreasing trend in time in almost all countries. Although mortality in some countries shows an increasing trend, this is because these countries were affected by once-in-a-hundred-years floods after the 1990s. The slope in the ME scenario is half of that in the HE scenario, and a quarter in the LE scenario. In addition, we set three categories depending on mortality level. Our proposed vulnerability scenarios enable us to reasonably replicate self-sustained vulnerability change against flood hazard associated with the SSPs.
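
    The curve-fitting step described above, an exponential decline toward an offset that represents the maximum protection level, can be sketched with scipy; the data points below are invented, not EM-DAT records.

```python
# Sketch: exponential-with-offset fit of a mortality trend, where the
# offset c plays the role of the maximum protection level. Invented data.
import numpy as np
from scipy.optimize import curve_fit

def vuln(year, a, b, c):
    return a * np.exp(-b * (year - 1980)) + c   # decay toward floor c

years = np.arange(1980, 2006)
rng = np.random.default_rng(5)
mortality = vuln(years, 0.08, 0.06, 0.01) + rng.normal(0, 0.004, years.size)

(a, b, c), _ = curve_fit(vuln, years, mortality, p0=(0.05, 0.05, 0.005))
print(f"fitted protection floor c = {c:.3f}, decay rate b = {b:.3f} per year")
```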

  3. Preliminary risk assessment of the Mexican Spotted Owl under a spatially-weighted foraging regime at the Los Alamos National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gallegos, A.F.; Gonzales, G.J.; Bennett, K.D.

    The Record of Decision on the Dual Axis Radiographic Hydrodynamic Test Facility at the Los Alamos National Laboratory requires that the Department of Energy take special precautions to protect the Mexican Spotted Owl (Strix occidentalis lucida). In order to do so, the risk presented to the owl by radiological and nonradiological contaminants must be estimated. A preliminary risk assessment of the Mexican Spotted Owl in two Ecological Exposure Units (EEUs) was performed using a modified Environmental Protection Agency quotient method, the FORTRAN model ECORSK4, and a geographic information system. Estimated doses to the owl under a spatially-weighted foraging regime were compared against toxicological reference doses, generating hazard indices (HIs) and hazard quotients (HQs) for three risk source types. The average HI was 0.20 for EEU-21 and 0.0015 for EEU-40. Under the risk parameter assumptions made, hazard quotient results indicated no unacceptable risk to the owl, including a measure of cumulative effects from multiple contaminants that assumes a linear additive toxicity type. An HI of 1.0 was used as the evaluative criterion for determining the acceptability of risk. This value was exceeded (1.06) in only one of 200 simulated potential nest sites. Cesium-137, Ni, 239Pu, Al and 234U were among the constituents with the highest partial HQs. Improving model realism by weighting simulated owl foraging based on distance from potential nest sites decreased the estimated risk by 72% (0.5 HI units) for EEU-21 and by 97.6% (6.3E-02 HI units) for EEU-40. Information on risk by specific geographical location was generated, which can be used to manage contaminated areas, owl habitat, facility siting, and/or facility operations in order to maintain risk from contaminants at acceptably low levels.
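
    The quotient method referred to above reduces to simple ratios: a hazard quotient per constituent (estimated dose divided by its toxicological reference dose) and a hazard index as their sum under the linear additive toxicity assumption. The doses, reference values and units below are invented placeholders.

```python
# Sketch of the quotient method: HQ = dose / reference dose per
# constituent, HI = sum of HQs (additive toxicity). Values are invented.
DOSES = {"Cs-137": 0.002, "Ni": 0.010, "Pu-239": 0.001}      # dose units
REFERENCE = {"Cs-137": 0.020, "Ni": 0.040, "Pu-239": 0.015}  # same units

hq = {k: DOSES[k] / REFERENCE[k] for k in DOSES}
hi = sum(hq.values())
for k, q in hq.items():
    print(f"HQ[{k}] = {q:.2f}")
print(f"HI = {hi:.2f} ({'below' if hi < 1.0 else 'exceeds'} the HI = 1.0 criterion)")
```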

  4. Recovery of Serum Cholesterol Predicts Survival After Left Ventricular Assist Device Implantation

    PubMed Central

    Vest, Amanda R.; Kennel, Peter J.; Maldonado, Dawn; Young, James B.; Mountis, Maria M.; Naka, Yoshifumi; Colombo, Paolo C.; Mancini, Donna M.; Starling, Randall C.; Schulze, P. Christian

    2017-01-01

    Background: Advanced systolic heart failure is associated with myocardial and systemic metabolic abnormalities, including low levels of total cholesterol and low-density lipoprotein. Low cholesterol and low-density lipoprotein have been associated with greater mortality in heart failure. Implantation of a left ventricular assist device (LVAD) reverses some of the metabolic derangements of advanced heart failure. Methods and Results: A cohort was retrospectively assembled from 2 high-volume implantation centers, totaling 295 continuous-flow LVAD recipients with ≥2 cholesterol values available. The cohort was predominantly bridge-to-transplantation (67%), with a median age of 59 years and 49% ischemic heart failure cause. Total cholesterol, low-density lipoprotein, high-density lipoprotein, and triglyceride levels all significantly increased after LVAD implantation (median values from implantation to 3 months post implantation 125–150 mg/dL, 67–85 mg/dL, 32–42 mg/dL, and 97–126 mg/dL, respectively). On Cox proportional hazards modeling, patients achieving recovery of total cholesterol levels, defined as a median or greater change from pre implantation to 3 months post-LVAD implantation, had significantly better unadjusted survival (hazard ratio, 0.445; 95% confidence interval, 0.212–0.932) and adjusted survival (hazard ratio, 0.241; 95% confidence interval, 0.092–0.628) than those without cholesterol recovery after LVAD implantation. The continuous variable of total cholesterol at 3 months post implantation and the cholesterol increase from pre implantation to 3 months were also both significantly associated with survival during LVAD support. Conclusions: Initiation of continuous-flow LVAD support was associated with significant recovery of all 4 lipid variables. Patients with a greater increase in total cholesterol by 3 months post implantation had superior survival during LVAD support. PMID:27623768
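
    For readers unfamiliar with the modeling step, the following is a minimal sketch of a Cox proportional-hazards comparison of this kind, assuming the third-party lifelines package and a simulated cohort (follow-up months, death indicator, a binary "cholesterol recovered" flag, age); none of it is the study's actual data or code.

      import numpy as np
      import pandas as pd
      from lifelines import CoxPHFitter

      rng = np.random.default_rng(7)
      n = 295
      recovered = rng.integers(0, 2, n)      # median-or-greater cholesterol rise
      age = rng.normal(59, 10, n)
      # Exponential event times with a lower hazard in the "recovered" group.
      hazard = 0.02 * np.exp(-0.8 * recovered + 0.02 * (age - 59))
      time = rng.exponential(1.0 / hazard)
      death = (time < 36.0).astype(int)      # administrative censoring at 36 mo
      df = pd.DataFrame({"months": np.minimum(time, 36.0), "death": death,
                         "chol_recovered": recovered, "age": age})

      cph = CoxPHFitter()
      cph.fit(df, duration_col="months", event_col="death")
      cph.print_summary()   # hazard ratio for chol_recovered lands well below 1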

  5. Hazardous Post Anesthesia Care Unit (PACU): Reality or Myth? A Case Study

    DTIC Science & Technology

    2000-01-03

    speculation of trace anesthetic gases being associated with miscarriages and birth defects. Concerns led to a conference of several government agencies... nitrous oxide may cause depression of vitamin B12, and chronic low-level exposure inhibits methionine synthase, which impairs synthesis of deoxyribonucleic acid and bone marrow

  6. Natural Hazards and Vulnerability in Valle de Chalco Solidaridad Estado de Mexico, Mexico. Case studies: El Triunfo, Avandaro and San Isidro

    NASA Astrophysics Data System (ADS)

    Ponce-Pacheco, A. B.; Novelo-Casanova, D. A.; Espinosa-Campos, O.; Rodriguez, F.; Huerta-Parra, M.; Reyes-Pimentel, T.; Benitez-Olivares, I.

    2010-12-01

    On February 5, 2010, a fracture occurred in a wall of the artificial water channel called “La Compañía (CC)” in the section of the municipality of Valle de Chalco Solidaridad (VCS), Estado de Mexico, Mexico. The fracture was 70 m long, 20 m wide and 5 m deep, and caused severe wastewater flooding that affected surrounding communities. This area was also impacted by similar events in 2000 and 2005. In this study, we assess the social, economic, structural, and physical vulnerability to flood, earthquake, subsidence, and landslide hazards in the communities of El Triunfo, San Isidro and Avandaro of VCS. This area is located in the soil of the old Chalco Lake and has experienced large population growth in recent decades. Due to urban development and the overexploitation of aquifers, the zone is also exposed to subsidence of up to 40 cm per year. For these reasons, CC is at present well above ground level. In this research, we applied the methodology developed by the National Oceanic and Atmospheric Administration (NOAA) to assess vulnerability. As a first step, we established the level of exposure of the communities to the four main hazards. We also analyzed the economic and social vulnerability of the area using data collected from a field survey. From the total number of family houses in the studied communities, we estimated a statistically significant minimum sample, and the households in this sample were selected randomly. We defined five levels of vulnerability: very low, low, moderate, high, and very high. Our results indicate that San Isidro is the community with the highest level of structural vulnerability. As for physical vulnerability, the homes most affected by flooding are those located close to CC, but we did not find a direct relationship between physical and structural vulnerability. The main hazard to which the study zone is exposed is flooding, because its recurrence period is about five years. About 83% of families have a high level of economic vulnerability. Regarding structural vulnerability, approximately 25% of the structures have high, and 39% moderate, vulnerability. These results indicate that the community has a low standard of living and very low resilience. Considering an overall vulnerability estimated by summing the results of the four types of analyzed vulnerability, we found that 53% of the sampled population has moderate vulnerability, 34% low, about 2% very low, 10% high and less than 1% very high.

  7. Volcanic hazard management in dispersed volcanism areas

    NASA Astrophysics Data System (ADS)

    Marrero, Jose Manuel; Garcia, Alicia; Ortiz, Ramon

    2014-05-01

    Traditional volcanic hazard methodologies were developed mainly to deal with large stratovolcanoes. For such volcanoes, the hazard map is an important tool for decision-makers, not only during a volcanic crisis but also for territorial planning. Based on the past and recent eruptions of a volcano, all possible volcanic hazards are modelled and included in the hazard map. Combining the hazard map with an event tree, the impact area can be zoned and the likely eruptive scenarios to be used during a real volcanic crisis can be defined. In areas of dispersed volcanism, however, it is very complex to apply the same volcanic hazard methodologies. The event tree does not take into account unknown vents, because the spatial concepts included in it are related only to the distance reached by volcanic hazards. Volcanic hazard simulation is also difficult, because vent scatter modifies the results. Volcanic susceptibility analysis tries to solve this problem by calculating the areas most likely to host an eruption, but the differences between the low and high values obtained are often very small. Under these conditions the effectiveness of the traditional hazard map could be questioned, making a change in the concept of the hazard map necessary. Instead of delimiting the potential impact areas, the hazard map should show the expected behaviour of the volcanic activity and how differences in the landscape and internal geo-structures could condition that behaviour. This approach has been carried out in La Palma (Canary Islands), combining the concept of a long-term hazard map with a short-term volcanic scenario to show the expected behaviour of volcanic activity. The objective is for decision-makers to understand how a volcanic crisis could unfold and what kinds of mitigation measures and strategies could be used.

  8. Compounding effects of sea level rise and fluvial flooding.

    PubMed

    Moftakhari, Hamed R; Salvadori, Gianfausto; AghaKouchak, Amir; Sanders, Brett F; Matthew, Richard A

    2017-09-12

    Sea level rise (SLR), a well-documented and urgent aspect of anthropogenic global warming, threatens population and assets located in low-lying coastal regions all around the world. Common flood hazard assessment practices typically account for one driver at a time (e.g., either fluvial flooding only or ocean flooding only), whereas coastal cities vulnerable to SLR are at risk for flooding from multiple drivers (e.g., extreme coastal high tide, storm surge, and river flow). Here, we propose a bivariate flood hazard assessment approach that accounts for compound flooding from river flow and coastal water level, and we show that a univariate approach may not appropriately characterize the flood hazard if there are compounding effects. Using copulas and bivariate dependence analysis, we also quantify the increases in failure probabilities for 2030 and 2050 caused by SLR under representative concentration pathways 4.5 and 8.5. Additionally, the increase in failure probability is shown to be strongly affected by compounding effects. The proposed failure probability method offers an innovative tool for assessing compounding flood hazards in a warming climate.
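
    A toy version of the copula step can make the compounding effect concrete: transform each driver to uniform scores, fit a dependence parameter, and compare the joint exceedance probability with the independence assumption. The Gaussian copula, the synthetic flow/level data, and the 10-year thresholds below are illustrative assumptions, not the paper's formulation.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)
      # Hypothetical paired annual maxima: river flow (m3/s), coastal level (m).
      flow = rng.gamma(shape=4.0, scale=250.0, size=500)
      level = 0.0008 * flow + rng.normal(1.0, 0.15, size=500)

      # Empirical-CDF scores, then a Gaussian-copula correlation estimate.
      u = stats.rankdata(flow) / (len(flow) + 1)
      v = stats.rankdata(level) / (len(level) + 1)
      rho = np.corrcoef(stats.norm.ppf(u), stats.norm.ppf(v))[0, 1]

      def joint_exceedance(u_star, v_star, rho):
          # P(U > u*, V > v*) = 1 - u* - v* + C(u*, v*) by inclusion-exclusion
          z = stats.norm.ppf([u_star, v_star])
          c = stats.multivariate_normal(mean=[0, 0],
                                        cov=[[1, rho], [rho, 1]]).cdf(z)
          return 1.0 - u_star - v_star + c

      u_star = v_star = 0.9   # marginal 10-year events
      print(f"dependent: {joint_exceedance(u_star, v_star, rho):.4f}  "
            f"independent: {(1 - u_star) * (1 - v_star):.4f}")

    With positive dependence the joint exceedance probability exceeds the independence product, which is precisely why a univariate analysis can understate the compound-flood hazard.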

  9. Serum Albumin and High-Sensitivity C-reactive Protein are Independent Risk Factors of Chronic Kidney Disease in Middle-Aged Japanese Individuals: the Circulatory Risk in Communities Study

    PubMed Central

    Kubo, Sachimi; Kitamura, Akihiko; Imano, Hironori; Cui, Renzhe; Yamagishi, Kazumasa; Umesawa, Mitsumasa; Muraki, Isao; Kiyama, Masahiko; Okada, Takeo

    2016-01-01

    Aim: It is important to explore predictive markers other than conventional cardiovascular risk factors for early detection and treatment of chronic kidney disease (CKD), a major risk factor for end-stage renal failure. We hypothesized serum albumin and high-sensitivity C-reactive protein (hs-CRP) to be independent markers, and examined their associations with the risk of CKD. Methods: We examined the associations of serum albumin and hs-CRP levels with the risk of incident CKD in 2535 Japanese adults aged 40–69 years without CKD at baseline during a median 9.0-year follow-up, after adjustment for known cardiovascular risk factors. Results: During the follow-up period, 367 cases of CKD developed. In multivariable analyses adjusted for known risk factors, the CKD hazard ratios (95% confidence intervals) for the highest versus lowest quartiles of serum albumin levels were 0.69 (0.40–1.17) for men and 0.42 (0.28–0.64) for women. Corresponding values for hs-CRP were 0.95 (0.54–1.67) for men and 1.85 (1.25–2.75) for women. The association of combined serum albumin and hs-CRP with the risk of CKD was examined for women. The hazard ratio was 1.72 (1.17–2.54) for low versus higher albumin levels at lower hs-CRP levels, but such an association was not observed at high hs-CRP levels. The hazard ratio was 1.96 (1.44–2.66) for high versus lower hs-CRP levels at higher serum albumin levels, but such an association was not observed at low serum albumin levels. Conclusion: Both low serum albumin and high hs-CRP levels were predictive of CKD in women. PMID:26911856

  10. Procedures for the interpretation and use of elevation scanning laser/multi-sensor data for short range hazard detection and avoidance for an autonomous planetary rover

    NASA Technical Reports Server (NTRS)

    Troiani, N.; Yerazunis, S. W.

    1978-01-01

    An autonomous roving science vehicle that relies on terrain data acquired by a hierarchy of sensors for navigation was one method of carrying out such a mission. The hierarchy of sensors included a short range sensor with sufficient resolution to detect every possible obstacle and with the ability to make fast and reliable terrain characterizations. A multilaser, multidetector triangulation system was proposed as a short range sensor. The general system was studied to determine its perception capabilities and limitations. A specific rover and low resolution sensor system was then considered. After studying the data obtained, a hazard detection algorithm was developed that accounts for all possible terrains given the sensor resolution. Computer simulation of the rover on various terrains was used to test the entire hazard detection system.

  11. Computer modelling as a tool for the exposure assessment of operators using faulty agricultural pesticide spraying equipment.

    PubMed

    Bańkowski, Robert; Wiadrowska, Bozena; Beresińska, Martyna; Ludwicki, Jan K; Noworyta-Głowacka, Justyna; Godyń, Artur; Doruchowski, Grzegorz; Hołownicki, Ryszard

    2013-01-01

    Faulty but still operating agricultural pesticide sprayers may pose an unacceptable health risk for operators. The computerized models designed to calculate exposure and risk for pesticide sprayers, used as an aid in the evaluation and authorisation of plant protection products, may also be applied to assess the health risk for operators when faulty sprayers are used. The objective was to evaluate the impact of different exposure scenarios on the health risk for operators using faulty agricultural spraying equipment by means of computer modelling. The exposure modelling was performed for 15 pesticides (5 insecticides, 7 fungicides and 3 herbicides). The critical parameter, i.e. the toxicological end-point on which the risk assessment was based, was the no observable adverse effect level (NOAEL). This enabled risk to be estimated under various exposure conditions, such as the pesticide concentration in the plant protection product and the type of sprayed crop, as well as the number of treatments. Computer modelling was based on the UK POEM model, including determination of the acceptable operator exposure level (AOEL). Thus the degree of operator exposure could be determined during pesticide treatment whether or not personal protection equipment had been employed. Data used for computer modelling were obtained from simulated treatments with pesticide substitutes using variously damaged knapsack sprayers. These substitute preparations contained markers that allowed computer simulations to be made, analogous to real-life exposure situations, in a dose-dependent fashion. Exposures were estimated according to operator dosimetry under 'field' conditions for low-level, medium and high target field crops. The exposure modelling in the high target field crops demonstrated exceedance of the AOEL in all simulated treatment cases (100%) using damaged sprayers, irrespective of the type of damage or whether individual protective measures had been adopted. For low-level and medium field crops, exceedances ranged between 40-80% of cases. Computer modelling may be considered a practical tool for hazard assessment when faulty agricultural sprayers are used. It may also be applied to programming the quality checks and maintenance systems for this equipment.
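
    As a schematic of what such a screen computes, the sketch below combines dermal and inhalation contributions into a daily systemic dose and compares it with the AOEL. Every coefficient and value here is an illustrative assumption; the actual UK POEM surrogate-exposure tables are not reproduced.

      def operator_exposure_ratio(spray_l_per_h, conc_g_per_l, dermal_transfer,
                                  dermal_absorption, inhaled_ml_per_h,
                                  hours_per_day, body_weight_kg, aoel_mg_per_kg):
          # Dermal route: handled substance * transfer to skin * absorption.
          dermal_mg = (spray_l_per_h * conc_g_per_l * 1000
                       * dermal_transfer * dermal_absorption * hours_per_day)
          # Inhalation route: inhaled spray volume * tank-mix concentration.
          inhaled_mg = inhaled_ml_per_h / 1000 * conc_g_per_l * 1000 * hours_per_day
          daily_dose = (dermal_mg + inhaled_mg) / body_weight_kg
          return daily_dose / aoel_mg_per_kg   # > 1.0 means the AOEL is exceeded

      # Hypothetical knapsack scenario; damage is modeled as doubled dermal transfer.
      intact = operator_exposure_ratio(20, 0.5, 0.0001, 0.1, 0.02, 6, 60, 0.01)
      damaged = operator_exposure_ratio(20, 0.5, 0.0002, 0.1, 0.02, 6, 60, 0.01)
      print(f"intact: {intact:.2f} x AOEL, damaged: {damaged:.2f} x AOEL")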

  12. Informal processing of electronic waste at Agbogbloshie, Ghana: workers' knowledge about associated health hazards and alternative livelihoods.

    PubMed

    Yu, Emily A; Akormedi, Matthew; Asampong, Emmanuel; Meyer, Christian G; Fobil, Julius N

    2017-12-01

    This study was conducted to investigate electronic waste workers' knowledge of the potential health hazards associated with their work, as well as the livelihood alternatives they would prefer if given the opportunity. A qualitative cross-sectional study was conducted to gather empirical information on e-waste workers' knowledge about the potential hazards associated with their work and the livelihood alternatives to e-waste recycling, with a sample of twenty all-male electronic waste workers at the Agbogbloshie scrap metal yard in Accra, Ghana. Electronic waste workers at Agbogbloshie were found to be exposed to a variety of injuries and illnesses. The workers' knowledge of the association between their health status and their work was generally poor. Apart from the physical injuries, they did not believe their work played any negative role in their health conditions. They preferred occupations such as farming or professional driving located in the northern region of Ghana, to be closer to their families. The study concludes that the workers' low level of knowledge of the hazards associated with their work has implications for their acceptance of technologies to protect them and the natural environment from contamination. It is therefore imperative for any intervention to consider the current low level of knowledge and actively educate the workers to raise their awareness, taking into account the provision of opportunities for workers to acquire applicable skills for future employment in other fields.

  13. Prediction and Prevention of Chemical Reaction Hazards: Learning by Simulation.

    ERIC Educational Resources Information Center

    Shacham, Mordechai; Brauner, Neima; Cutlip, Michael B.

    2001-01-01

    Points out that chemical hazards are the major cause of accidents in the chemical industry and describes a safety teaching approach using simulation. Explains a problem statement on exothermic liquid-phase reactions. (YDS)

  14. First Steps towards an Interactive Real-Time Hazard Management Simulation

    ERIC Educational Resources Information Center

    Gemmell, Alastair M. D.; Finlayson, Ian G.; Marston, Philip G.

    2010-01-01

    This paper reports on the construction and initial testing of a computer-based interactive flood hazard management simulation, designed for undergraduates taking an applied geomorphology course. Details of the authoring interface utilized to create the simulation are presented. Students act as the managers of civil defence utilities in a fictional…

  15. Nationwide high-resolution mapping of hazards in the Philippines (Plinius Medal Lecture)

    NASA Astrophysics Data System (ADS)

    Lagmay, Alfredo Mahar Francisco A.

    2015-04-01

    The Philippines, a locus of typhoons, tsunamis, earthquakes, and volcanic eruptions, is a hotbed of disasters. Situated in a region where severe weather and geophysical unrest are common, the Philippines will inevitably suffer from calamities similar to those experienced recently. With continued development and population growth in hazard-prone areas, damage to infrastructure and human losses will persist and even rise unless appropriate measures are immediately implemented by government. Recently, the Philippines put in place a responsive program called the Nationwide Operational Assessment of Hazards (NOAH) for disaster prevention and mitigation. The efforts of Project NOAH are an offshoot of lessons learned from previous disasters that have inflicted massive loss of life and costly damage to property. Several components of the NOAH program focus on mapping landslide, riverine flood, and storm surge inundation hazards. By simulating hazard phenomena over IFSAR- and LiDAR-derived digital terrain models (DTMs) using high-performance computers, multi-hazard maps at 1:10,000 scale have been produced and disseminated to local government units through a variety of platforms. These detailed village-level (barangay-level) maps are useful for identifying safe evacuation sites, planning emergency access routes, and prepositioning search and rescue and relief supplies during times of crisis. They are also essential for long-term development planning of communities. In the past two years, NOAH was instrumental in providing timely, site-specific, and understandable hazard information to the public, considered best practice in disaster risk reduction management (DRR). The use of advanced science and technology in the country's disaster prevention efforts is imperative to successfully mitigate the adverse impacts of natural hazards, and should be a continuous quest to find the best products and put them forth at the forefront of the battle against disasters.

  16. Grout formulation for disposal of low-level and hazardous waste streams containing fluoride

    DOEpatents

    McDaniel, E.W.; Sams, T.L.; Tallent, O.K.

    1987-06-02

    A composition and related process for disposal of hazardous waste streams containing fluoride in cement-based materials is disclosed. The presence of fluoride in waste materials acts as a set retarder and, as a result, prevents cement-based grouts from setting. This problem is overcome by the present invention, wherein calcium hydroxide is incorporated into the dry-solid portion of the grout mix. The calcium hydroxide renders the fluoride insoluble, allowing the grout to set up and immobilize all hazardous constituents of concern. 4 tabs.

  17. Earth science: lasting earthquake legacy

    USGS Publications Warehouse

    Parsons, Thomas E.

    2009-01-01

    On 31 August 1886, a magnitude-7 shock struck Charleston, South Carolina; low-level activity continues there today. One view of seismic hazard is that large earthquakes will return to New Madrid and Charleston at intervals of about 500 years. With expected ground motions that would be stronger than average, that prospect produces estimates of earthquake hazard that rival those at the plate boundaries marked by the San Andreas fault and Cascadia subduction zone. The result is two large 'bull's-eyes' on the US National Seismic Hazard Maps — which, for example, influence regional building codes and perceptions of public safety.

  18. Socioeconomic disparities in outcomes after acute myocardial infarction.

    PubMed

    Bernheim, Susannah M; Spertus, John A; Reid, Kimberly J; Bradley, Elizabeth H; Desai, Rani A; Peterson, Eric D; Rathore, Saif S; Normand, Sharon-Lise T; Jones, Philip G; Rahimi, Ali; Krumholz, Harlan M

    2007-02-01

    Patients of low socioeconomic status (SES) have higher mortality after acute myocardial infarction (AMI). Little is known about the underlying mechanisms or the relationship between SES and rehospitalization after AMI. We analyzed data from the PREMIER observational study, which included 2142 patients hospitalized with AMI from 18 US hospitals. Socioeconomic status was measured by self-reported household income and education level. Sequential multivariable modeling assessed the relationship of socioeconomic factors with 1-year all-cause mortality and all-cause rehospitalization after adjustment for demographics, clinical factors, and quality-of-care measures. Lower household income and education level were both associated with higher risk of mortality (hazard ratio 2.80, 95% CI 1.37-5.72, lowest versus highest income group) and rehospitalization after AMI (hazard ratio 1.55, 95% CI 1.17-2.05). Patients with low SES had worse clinical status at admission and received poorer quality of care. In multivariable modeling, the relationship between household income and mortality was attenuated by adjustment for demographic and clinical factors (hazard ratio 1.19, 95% CI 0.54-2.62), with a further small decrement in the hazard ratio after adjustment for quality of care. The relationship between income and rehospitalization was only partly attenuated by demographic and clinical factors (hazard ratio 1.38, 95% CI 1.01-1.89) and was not influenced by adjustment for quality of care. Patients' baseline clinical status largely explained the relationship between SES and mortality, but not rehospitalization, among patients with AMI.

  19. Role of a plausible nuisance contributor in the declining obesity-mortality risks over time.

    PubMed

    Mehta, Tapan; Pajewski, Nicholas M; Keith, Scott W; Fontaine, Kevin; Allison, David B

    2016-12-15

    Recent analyses of epidemiological data, including the National Health and Nutrition Examination Survey (NHANES), have suggested that the harmful effects of obesity may have decreased over calendar time. The shifting BMI distribution over time, coupled with the application of fixed broad BMI categories in these analyses, could be a plausible "nuisance contributor" to this observed change in obesity-associated mortality over calendar time. The objective was to evaluate the extent to which observed temporal changes in the obesity-mortality association may be due to a shifting population distribution for body mass index (BMI), coupled with analyses based on static, broad BMI categories. Simulations were conducted using data from NHANES I and III linked with mortality data. Data from NHANES I were used to fit a "true" model treating BMI as a continuous variable. Coefficients estimated from this model were used to simulate mortality for participants in NHANES III. Hence, the population-level association between BMI and mortality in NHANES III was fixed to be identical to the association estimated in NHANES I. Hazard ratios (HRs) for obesity categories based on BMI for NHANES III with simulated mortality data were compared to the corresponding estimated HRs from NHANES I. The outcome was the change in hazard ratios for simulated data in NHANES III compared to observed estimates from NHANES I. On average, hazard ratios for NHANES III based on simulated mortality data were 29.3% lower than the estimates from NHANES I using observed mortality follow-up. This reduction accounted for roughly three-fourths of the apparent decrease in the obesity-mortality association observed in a previous analysis of these data. Some of the apparent diminution of the association between obesity and mortality may be an artifact of treating BMI as a categorical variable. Copyright © 2016. Published by Elsevier Inc.
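
    The categorization artifact is easy to demonstrate in simulation: hold a continuous BMI-mortality relation fixed, shift the BMI distribution between two synthetic "surveys", and watch the categorical risk estimate move. The risk curve and distributions below are invented for illustration, and the direction and size of the shift depend entirely on those assumptions.

      import numpy as np

      rng = np.random.default_rng(0)

      def category_risk_ratio(bmi):
          # Fixed "true" continuous model: J-shaped risk in BMI (assumed).
          p_death = 1 / (1 + np.exp(-(-5.0 + 0.008 * (bmi - 22.0) ** 2)))
          died = rng.random(bmi.size) < p_death
          normal = (bmi >= 18.5) & (bmi < 25)
          obese = bmi >= 30
          return died[obese].mean() / died[normal].mean()

      early = rng.normal(25.0, 4.0, 200_000)   # leaner, earlier population
      later = rng.normal(28.0, 5.0, 200_000)   # shifted BMI distribution
      print(f"obese-vs-normal risk ratio, earlier survey: {category_risk_ratio(early):.2f}")
      print(f"obese-vs-normal risk ratio, later survey:   {category_risk_ratio(later):.2f}")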

  20. Automatic systems and the low-level wind hazard

    NASA Technical Reports Server (NTRS)

    Schaeffer, Dwight R.

    1987-01-01

    Automatic flight control systems provide means for significantly enhancing survivability in severe wind hazards. The technology required to produce the necessary control algorithms is available and has been made technically feasible by the advent of digital flight control systems and accurate, low-noise sensors, especially strap-down inertial sensors. The application of this technology has not generally been enabled except for automatic landing systems, and even then the potential has not been fully exploited. To fully exploit the potential of automatic systems for enhancing safety in wind hazards requires providing incentives, creating demand, inspiring competition, education, and eliminating prejudicial disincentives to overcome the economic penalties associated with the extensive and risky development and certification of these systems. If these changes come about at all, it will likely be through changes in the regulations provided by the certifying agencies.

  1. Low-level exposure to radiofrequency electromagnetic fields: health effects and research needs.

    PubMed

    Repacholi, M H

    1998-01-01

    The World Health Organization (WHO), the International Commission on Non-Ionizing Radiation Protection (ICNIRP), and the German and Austrian Governments jointly sponsored an international seminar in November of 1996 on the biological effects of low-level radiofrequency (RF) electromagnetic fields. For purposes of this seminar, only RF fields having frequencies in the range of about 10 MHz to 300 GHz were considered. This is one of a series of scientific review seminars held under the International Electromagnetic Field (EMF) Project to identify any health hazards from EMF exposure. The scientific literature was reviewed during the seminar and expert working groups were formed to provide a status report on possible health effects from exposure to low-level RF fields and to identify gaps in knowledge requiring more research to improve health risk assessments. It was concluded that, although hazards from exposure to high-level (thermal) RF fields were established, no known health hazards were associated with exposure to RF sources emitting fields too low to cause a significant temperature rise in tissue. Biological effects from low-level RF exposure were identified that need replication and further study. These included in vitro studies of cell kinetics and proliferation effects, effects on genes, signal transduction effects and alterations in membrane structure and function, and biophysical and biochemical mechanisms for RF field effects. In vivo studies should focus on the potential for cancer promotion, co-promotion and progression, as well as possible synergistic, genotoxic, immunological, and carcinogenic effects associated with chronic low-level RF exposure. Research is needed to determine whether low-level RF exposure causes DNA damage or influences central nervous system function, melatonin synthesis, permeability of the blood brain barrier (BBB), or reaction to neurotropic drugs. Reported RF-induced changes to eye structure and function should also be investigated. Epidemiological studies should investigate: the use of mobile telephones with hand-held antennae and the incidence of various cancers; reports of headache, sleep disturbance, and other subjective effects that may arise from proximity to RF emitters, with laboratory studies conducted on people reporting these effects; cohorts with high occupational RF exposure, for changes in cancer incidence; adverse pregnancy outcomes in various highly RF-exposed occupational groups; and ocular pathologies in mobile telephone users and in highly RF-exposed occupational groups. Residential exposure from point sources such as broadcasting transmitters or mobile telephone base stations has caused widespread health concerns among the public, even though RF exposures are very low; recent studies that may indicate an increased incidence of cancer in such exposed populations should be investigated further.

  2. Spatial patterns of natural hazards mortality in the United States

    PubMed Central

    Borden, Kevin A; Cutter, Susan L

    2008-01-01

    Background: Studies on natural hazard mortality are most often hazard-specific (e.g. floods, earthquakes, heat), event-specific (e.g. Hurricane Katrina), or lack adequate temporal or geographic coverage. This makes it difficult to assess mortality from natural hazards in any systematic way. This paper examines the spatial patterns of natural hazard mortality at the county level for the U.S. from 1970–2004 using a combination of geographical and epidemiological methods. Results: Chronic everyday hazards such as severe weather (summer and winter) and heat account for the majority of natural hazard fatalities. The regions most prone to deaths from natural hazards are the South and intermountain West, but sub-regional county-level mortality patterns show more variability. There is a distinct urban/rural component to the county patterns as well as a coastal trend. Significant clusters of high mortality are in the lower Mississippi Valley, upper Great Plains, and Mountain West, with additional areas in west Texas and the panhandle of Florida. Significant clusters of low mortality are in the Midwest and urbanized Northeast. Conclusion: There is no consistent source of hazard mortality data, yet improvements in existing databases can produce quality data that can be incorporated into spatial epidemiological studies, as demonstrated in this paper. It is important to view natural hazard mortality through a geographic lens so as to better inform the public living in such hazard-prone areas, but more importantly to inform local emergency practitioners who must plan for and respond to disasters in their community. PMID:19091058

  3. Integrity of Disposable Nitrile Exam Gloves Exposed to Simulated Movement

    PubMed Central

    Phalen, Robert N.; Wong, Weng Kee

    2011-01-01

    Every year, millions of health care, first responder, and industry workers are exposed to chemical and biological hazards. Disposable nitrile gloves are a common choice as both a chemical and physical barrier to these hazards, especially as an alternative to natural latex gloves. However, glove selection is complicated by the availability of several types or formulations of nitrile gloves, such as low-modulus, medical-grade, low-filler, and cleanroom products. This study evaluated the influence of simulated movement on the physical integrity (i.e., holes) of different nitrile exam glove brands and types. Thirty glove products were evaluated out-of-box and after exposure to simulated whole-glove movement for 2 hr. In lieu of the traditional 1-L water-leak test, a modified water-leak test, standardized to detect a 0.15 ± 0.05 mm hole in different regions of the glove, was developed. A specialized air inflation method simulated bidirectional stretching and whole-glove movement. A worst-case scenario with maximum stretching was evaluated. On average, movement did not have a significant effect on glove integrity (chi-square; p=0.068). The average effect was less than 1% between no movement (1.5%) and movement (2.1%) exposures. However, there was significant variability in glove integrity between different glove types (p ≤ 0.05). Cleanroom gloves, on average, had the highest percentage of leaks, and 50% failed the water-leak test. Low-modulus and medical-grade gloves had the lowest percentages of leaks, and no products failed the water-leak test. Variability in polymer formulation was suspected to account for the observed discrepancies, as well as the inability of the traditional 1-L water-leak test to detect holes in finger/thumb regions. Unexpectedly, greater than 80% of the glove defects were observed in the finger and thumb regions. It is recommended that existing water-leak tests be re-evaluated and standardized to account for product variability. PMID:21476169
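
    The movement comparison reported above is a standard two-by-two proportion test; a sketch with back-calculated counts (illustrative, not the study's raw data) shows the form of the analysis.

      from scipy.stats import chi2_contingency

      # rows: leak / no leak; columns: out-of-box / after simulated movement
      table = [[45, 63],          # roughly 1.5% and 2.1% of 3000 gloves each
               [2955, 2937]]
      chi2, p, dof, expected = chi2_contingency(table)
      print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")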

  4. Scenario-Based Tsunami Hazard Assessment from Earthquake and Landslide Sources for Eastern Sicily, Italy

    NASA Astrophysics Data System (ADS)

    Tinti, S.; Armigliato, A.; Pagnoni, G.; Paparo, M. A.; Zaniboni, F.

    2016-12-01

    Eastern Sicily was the theatre of the most damaging tsunamis ever to strike Italy, such as the 11 January 1693 and 28 December 1908 tsunamis. Tectonic studies and paleotsunami investigations have extended historical records of tsunami occurrence back several thousand years. Tsunami sources relevant for eastern Sicily are both local and remote, the latter being located in Ionian Greece and in the Western Hellenic Arc, where in 365 A.D. a large earthquake generated a tsunami that was observed throughout the eastern and central Mediterranean, including the Sicilian coasts. The objective of this study is the evaluation of tsunami hazard along the coast of eastern Sicily, central Mediterranean, Italy via a scenario-based technique, which has been preferred to the PTHA approach because, when dealing with tsunamis induced by landslides, uncertainties are usually so large as to undermine the PTHA results. Tsunamis of earthquake and landslide origin are taken into account for the entire coast of Sicily, from the Messina to the Siracusa provinces. Landslides are essentially local sources and can occur underwater along the unstable flanks of the Messina Straits or along the steep slopes of the Hyblaean-Malta escarpment. The method is based on a two-step procedure: after a preliminary step in which a large number of earthquake and landslide sources are taken into account and tsunamis are computed on a low-resolution grid, the worst-case scenarios are selected and tsunamis are simulated on a finer-resolution grid, allowing a better calculation of coastal wave height and tsunami penetration. The final result of our study is given in the form of aggregate fields computed from the individual scenarios. Also of interest is the contribution of the various tsunami sources at different localities along the coast. It is found that the places with the highest level of hazard are the low-lying lands of La Playa south of Catania and of the Bay of Augusta, in agreement with historical observations. It is further found that remote seismic sources from the Hellenic Arc are the dominant factor of hazard in several places and that, though in general earthquakes contribute more to hazard than landslides, in some places the opposite is true.

  5. Ponderosa pine forest restoration treatment longevity: Implications of regeneration on fire hazard

    Treesearch

    Wade T. Tinkham; Chad M. Hoffman; Seth A. Ex; Michael A. Battaglia; Jarred D. Saralecos

    2016-01-01

    Restoration of pine forests has become a priority for managers who are beginning to embrace ideas of highly heterogeneous forest structures that potentially encourage high levels of regeneration. This study utilizes stem-mapped stands to assess how simulated regeneration timing and magnitude influence the longevity of reduced fire behavior by linking growth and...

  6. Post Fukushima tsunami simulations for Malaysian coasts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koh, Hock Lye, E-mail: kohhl@ucsiuniversity.edu.my; Teh, Su Yean, E-mail: syteh@usm.my; Abas, Mohd Rosaidi Che

    The recent recurrences of mega tsunamis in the Asian region have rekindled concern regarding potential tsunamis that could inflict severe damage on affected coastal facilities and communities. The 11 March 2011 Fukushima tsunami that crippled nuclear power plants in Northern Japan has further raised the level of caution. The recent discovery of petroleum reserves in the coastal waters surrounding Malaysia further heightens the concern regarding tsunami hazards to petroleum facilities located along affected coasts. Working as a group, federal government agencies seek to understand the dynamics of tsunamis and their impacts under the coordination of the Malaysian National Centre for Tsunami Research, Malaysian Meteorological Department. Knowledge regarding the generation, propagation and runup of tsunamis provides the scientific basis to address safety issues. An in-house tsunami simulation model known as TUNA has been developed by the authors to assess tsunami hazards along affected beaches so that mitigation measures can be put in place. Capacity building in tsunami simulation plays a critical role in the development of tsunami resilience. This paper aims first to provide a simple introduction to tsunami simulation towards the achievement of tsunami simulation capacity building. The paper also presents several scenarios of tsunami danger along affected Malaysian coastal regions via TUNA simulations to highlight tsunami threats. The choice of tsunami generation parameters reflects the concern following the Fukushima tsunami.

  7. Seismic Velocity and Its Temporal Variations of Hutubi Basin Revealed by Near Surface Trapped Waves

    NASA Astrophysics Data System (ADS)

    Ji, Z.; Wang, B.; Wang, H.; Wang, Q.; Su, J.

    2017-12-01

    Sedimentary basins amplify passing seismic waves, which may increase the seismic hazard in basin areas. The study of basin structure and its temporal variation is of key importance in the assessment and mitigation of seismic hazard in basins. Recent seismic exploration investigations have shown that basins may host a distinct wave train with strong energy, usually named the Trapped Wave or Whispering Gallery (WG) phase. In this study, we image the velocity structure and monitor its temporal changes in the Hutubi basin in Xinjiang, Northwestern China, with trapped waves generated from an airgun source. The Hutubi basin is located at the mid-segment of the North Tianshan Mountains. The Hutubi airgun signal transmitting station was constructed in May 2013 and is composed of six long-life airguns manufactured by BOLT. Prominent trapped waves with strong energy and low velocity are observed within 40 km of the source. The airgun source has radiated repeatable seismic signals for years. The trapped waves have relatively low frequencies (periods of 0.15 s to 4 s) and low apparent velocities of 200 m/s to 1000 m/s. In the temporal-frequency diagram, at least two groups of wave trains can be identified. Based on the group velocity dispersion curves, we invert for the S-wave velocity profile of the Hutubi basin. The velocity structure is further verified with synthetic seismograms. Velocity variations and Rayleigh wave polarization changes are useful barometers of underground stress status. We observed consistent seasonal variations in velocity and polarization. Based on the simulation results, we suggest that the variations may be related to changes in groundwater level and the formation and disappearance of frozen soil.

  8. Hazard Detection Software for Lunar Landing

    NASA Technical Reports Server (NTRS)

    Huertas, Andres; Johnson, Andrew E.; Werner, Robert A.; Montgomery, James F.

    2011-01-01

    The Autonomous Landing and Hazard Avoidance Technology (ALHAT) Project is developing a system for safe and precise manned lunar landing that involves novel sensors, but also specific algorithms. ALHAT has selected imaging LIDAR (light detection and ranging) as the sensing modality for onboard hazard detection because imaging LIDARs can rapidly generate direct measurements of the lunar surface elevation from high altitude. Then, starting with the LIDAR-based Hazard Detection and Avoidance (HDA) algorithm developed for Mars landing, JPL has developed a mature set of HDA software for the manned lunar landing problem. Landing hazards exist everywhere on the Moon, and many of the more desirable landing sites are near the most hazardous terrain, so HDA is needed to autonomously and safely land payloads over much of the lunar surface. The HDA requirements used in the ALHAT project are to detect hazards that are 0.3 m tall or higher and slopes that are 5° or greater. Steep slopes, rocks, cliffs, and gullies are all hazards for landing, and by computing the local slope and roughness in an elevation map, all of these hazards can be detected. The algorithm in this innovation is used to measure slope and roughness hazards. In addition to detecting these hazards, the HDA capability is also able to find a safe landing site free of these hazards for a lunar lander with a diameter of 15 m over most of the lunar surface. This software includes an implementation of the HDA algorithm, software for generating simulated lunar terrain maps for testing, hazard detection performance analysis tools, and associated documentation. The HDA software has been deployed to Langley Research Center and integrated into the POST II Monte Carlo simulation environment. The high-fidelity Monte Carlo simulations determine the required ground spacing between LIDAR samples (ground sample distances) and the noise on the LIDAR range measurement. This simulation has also been used to determine the effect of viewing geometry on hazard detection performance. The software has also been deployed to Johnson Space Center and integrated into the ALHAT real-time Hardware-in-the-Loop testbed.
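
    A toy version of slope/roughness screening on an elevation grid conveys the idea. The window size, grid spacing, and synthetic terrain below are assumptions; only the 0.3 m and 5° thresholds come from the text, and this is not the JPL algorithm itself.

      import numpy as np
      from scipy.signal import convolve2d

      def hazard_map(dem, spacing_m, max_slope_deg=5.0, max_rough_m=0.3):
          # Local slope from central-difference gradients of the elevation grid.
          gy, gx = np.gradient(dem, spacing_m)
          slope_deg = np.degrees(np.arctan(np.hypot(gx, gy)))
          # Roughness as residual relief after removing a local mean trend.
          k = 5   # smoothing window in cells (assumed)
          trend = convolve2d(dem, np.ones((k, k)) / k**2,
                             mode="same", boundary="symm")
          rough_m = np.abs(dem - trend)
          return (slope_deg > max_slope_deg) | (rough_m > max_rough_m)

      # Tiny synthetic terrain: a gentle regional tilt plus one 0.4 m boulder.
      _, x = np.mgrid[0:64, 0:64]
      dem = 0.02 * x                  # ~2.3-degree slope at 0.5 m grid spacing
      dem[30:33, 30:33] += 0.4
      print("hazardous cells:", int(hazard_map(dem, spacing_m=0.5).sum()))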

  9. Scenario based approach for multiple source Tsunami Hazard assessment for Sines, Portugal

    NASA Astrophysics Data System (ADS)

    Wronna, M.; Omira, R.; Baptista, M. A.

    2015-08-01

    In this paper, we present a scenario-based approach to tsunami hazard assessment for the city and harbour of Sines, Portugal, one of the test sites of project ASTARTE. Sines holds one of the most important deep-water ports, which contains oil-bearing, petrochemical, liquid-bulk, coal and container terminals. The port and its industrial infrastructure face the ocean southwest, towards the main seismogenic sources. This work considers two different seismic zones: the Southwest Iberian Margin and the Gloria Fault. Within these two regions, we selected a total of six scenarios to assess the tsunami impact at the test site. The tsunami simulations are computed using NSWING, a Non-linear Shallow Water model With Nested Grids. In this study, the static effect of tides is analysed for three different tidal stages: MLLW (mean lower low water), MSL (mean sea level) and MHHW (mean higher high water). For each scenario, inundation is described by maximum values of wave height, flow depth, drawback, runup and inundation distance. Synthetic waveforms are computed at virtual tide gauges at specific locations outside and inside the harbour. The final results describe the impact at the Sines test site considering the single scenarios at mean sea level, the aggregate scenario, and the influence of the tide on the aggregate scenario. The results confirm the composite of the Horseshoe and Marques Pombal faults as the worst-case scenario. It governs the aggregate scenario with a contribution of about 60% and inundates an area of 3.5 km2.

  10. Microzonation Mapping Of The Yanbu Industrial City, Western Saudi Arabia: A Multicriteria Decision Analysis Approach

    NASA Astrophysics Data System (ADS)

    Moustafa, Sayed, Sr.; Alarifi, Nassir S.; Lashin, Aref A.

    2016-04-01

    Urban areas along the western coast of Saudi Arabia are susceptible to natural disasters and environmental damage due to lack of planning. To produce a site-specific microzonation map of the rapidly growing Yanbu industrial city, the spatial distributions of different hazard entities are assessed using the Analytical Hierarchy Process (AHP) together with a Geographical Information System (GIS). For this purpose six hazard parameter layers are considered, namely: fundamental frequency, site amplification, soil strength in terms of effective shear-wave velocity, overburden sediment thickness, seismic vulnerability index, and peak ground acceleration. The weight and rank values determined during the AHP are assigned to each layer and its corresponding classes, respectively. An integrated seismic microzonation map was derived using the GIS platform. Based on the derived map, the study area is classified into five hazard categories: very low, low, moderate, high, and very high. The western and central parts of the study area, as indicated by the derived microzonation map, are categorized as high-hazard zones compared with other surrounding places. The produced microzonation map is envisaged as a first-level assessment of site-specific hazards in the Yanbu city area, which can be used as a platform by different stakeholders in any future land-use planning and environmental hazard management.
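
    The AHP weighting step is compact enough to sketch: layer weights come from the principal eigenvector of a pairwise-comparison matrix, checked with Saaty's consistency ratio. The 3x3 comparison matrix below is an illustrative assumption, not the weights used for Yanbu.

      import numpy as np

      def ahp_weights(pairwise):
          vals, vecs = np.linalg.eig(pairwise)
          i = np.argmax(vals.real)                  # principal eigenvalue
          w = np.abs(vecs[:, i].real)
          w /= w.sum()                              # normalized priority weights
          n = pairwise.shape[0]
          ci = (vals[i].real - n) / (n - 1)         # consistency index
          ri = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24}[n]  # Saaty's random index
          return w, ci / ri                         # weights, consistency ratio

      # Hypothetical comparisons among three layers, e.g. amplification,
      # shear-wave velocity, and peak ground acceleration.
      A = np.array([[1.0, 3.0, 2.0],
                    [1/3, 1.0, 1/2],
                    [1/2, 2.0, 1.0]])
      w, cr = ahp_weights(A)
      print("weights:", np.round(w, 3), " consistency ratio:", round(cr, 3))

    A consistency ratio below 0.1 is the usual acceptance rule before the weights are applied in the GIS overlay.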

  11. Determination of eye safety filter protection factors associated with retinal thermal hazard and blue light photochemical hazard for intense pulsed light sources.

    PubMed

    Clarkson, D McG

    2006-02-21

    An assessment is provided of protection factors afforded for retinal thermal hazard and blue light photochemical hazard for a range of filters used with intense pulsed light sources (IPLs). A characteristic IPL spectrum based on black body radiation at 5000 K with a low cut filter at 515 nm was identified as suitable for such estimations. Specific filters assessed included types with idealized transmission properties and also a range of types whose transmission characteristics were measured by means of a Bentham DMc150 spectroradiometer. Predicted behaviour based on these spectra is outlined which describes both the effectiveness of protection and the level of luminous transmittance afforded. The analysis showed it was possible to describe a figure of merit for a particular filter material relating the degree of protection provided and corresponding value of luminous transmittance. This consideration is important for providing users of IPL equipment with safety eyewear with adequate level of visual transmittance.

  12. NOTE: Determination of eye safety filter protection factors associated with retinal thermal hazard and blue light photochemical hazard for intense pulsed light sources

    NASA Astrophysics Data System (ADS)

    McG Clarkson, D.

    2006-02-01

    An assessment is provided of protection factors afforded for retinal thermal hazard and blue light photochemical hazard for a range of filters used with intense pulsed light sources (IPLs). A characteristic IPL spectrum based on black body radiation at 5000 K with a low cut filter at 515 nm was identified as suitable for such estimations. Specific filters assessed included types with idealized transmission properties and also a range of types whose transmission characteristics were measured by means of a Bentham DMc150 spectroradiometer. Predicted behaviour based on these spectra is outlined which describes both the effectiveness of protection and the level of luminous transmittance afforded. The analysis showed it was possible to describe a figure of merit for a particular filter material relating the degree of protection provided and corresponding value of luminous transmittance. This consideration is important for providing users of IPL equipment with safety eyewear with adequate level of visual transmittance.
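
    The figure-of-merit idea in these two records can be sketched numerically: weight an IPL-like spectrum (5000 K black body, 515 nm low-cut) by a hazard function and by a luminous-sensitivity function, and score a filter by protection times visual transmittance. The Gaussian weighting functions and filter curves below are crude stand-ins, not the ICNIRP B(λ) or CIE V(λ) tables.

      import numpy as np

      wl = np.arange(400, 1101, 1) * 1e-9                      # wavelength, m
      h, c, kB, T = 6.626e-34, 2.998e8, 1.381e-23, 5000.0
      planck = wl**-5 / (np.exp(h * c / (wl * kB * T)) - 1.0)  # 5000 K shape
      source = np.where(wl >= 515e-9, planck, 0.0)             # 515 nm low-cut

      blue_w = np.exp(-0.5 * ((wl - 440e-9) / 30e-9) ** 2)     # stand-in B(lambda)
      lum_w = np.exp(-0.5 * ((wl - 555e-9) / 45e-9) ** 2)      # stand-in V(lambda)

      def figure_of_merit(transmittance):
          protection = ((source * blue_w).sum()
                        / (source * blue_w * transmittance).sum())
          luminous = ((source * lum_w * transmittance).sum()
                      / (source * lum_w).sum())
          return protection * luminous   # protection retained at high visibility

      flat_10pct = np.full(wl.size, 0.10)              # neutral-density-like
      blue_block = np.where(wl < 560e-9, 0.02, 0.9)    # spectrally selective
      print(f"flat 10%: {figure_of_merit(flat_10pct):.1f}, "
            f"blue-blocking: {figure_of_merit(blue_block):.1f}")

    The spectrally selective filter scores far higher because it attenuates the hazard band while keeping luminous transmittance usable, which is the trade-off these records describe.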

  13. Metformin and low levels of thyroid-stimulating hormone in patients with type 2 diabetes mellitus

    PubMed Central

    Fournier, Jean-Pascal; Yin, Hui; Yu, Oriana Hoi Yun; Azoulay, Laurent

    2014-01-01

    Background: Small cross-sectional studies have suggested that metformin, a first-line oral hypoglycemic agent, may lower thyroid-stimulating hormone (TSH) levels. Our objective was to determine whether the use of metformin monotherapy, when compared with sulfonylurea monotherapy, is associated with an increased risk of low TSH levels (< 0.4 mIU/L) in patients with type 2 diabetes mellitus. Methods: Using the Clinical Practice Research Datalink, we identified patients who began receiving metformin or sulfonylurea monotherapy between Jan. 1, 1988, and Dec. 31, 2012. We assembled 2 subcohorts of patients with treated hypothyroidism or euthyroidism, and followed them until Mar. 31, 2013. We used Cox proportional hazards models to evaluate the association of low TSH levels with metformin monotherapy, compared with sulfonylurea monotherapy, in each subcohort. Results: A total of 5689 patients with treated hypothyroidism and 59 937 euthyroid patients were included in the subcohorts. Among patients with treated hypothyroidism, 495 events of low TSH levels were observed during follow-up (incidence rate 119.7/1000 person-years). In the euthyroid group, 322 events of low TSH levels were observed (incidence rate 4.5/1000 person-years). Compared with sulfonylurea monotherapy, metformin monotherapy was associated with a 55% increased risk of low TSH levels in patients with treated hypothyroidism (incidence rate 79.5/1000 person-years v. 125.2/1000 person-years, adjusted hazard ratio [HR] 1.55, 95% confidence interval [CI] 1.09–2.20), with the highest risk in the 90–180 days after initiation (adjusted HR 2.30, 95% CI 1.00–5.29). No association was observed in euthyroid patients (adjusted HR 0.97, 95% CI 0.69–1.36). Interpretation: In this longitudinal population-based study, metformin use was associated with an increased incidence of low TSH levels in patients with treated hypothyroidism, but not in euthyroid patients. The clinical consequences of this need further investigation. PMID:25246411

  14. Value of Low Triiodothyronine and Subclinical Myocardial Injury for Clinical Outcomes in Chest Pain.

    PubMed

    Lee, Young-Min; Ki, Young-Jae; Choi, Dong-Hyun; Kim, Bo-Bae; Shin, Byung Chul; Song, Heesang; Kim, Dong-Min

    2015-11-01

    Low triiodothyronine (T3) levels and subclinical myocardial injury may be associated with adverse cardiac and cerebrovascular (CCV) events in individuals without clinically apparent coronary heart disease (CHD). The aim of this study was to determine the associations of a low T3 level and subclinical myocardial injury with the development of adverse CCV events in individuals without clinically apparent CHD. T3 and high-sensitivity cardiac troponin T (hs-cTnT) levels were analyzed in 250 patients with chest pain free of CHD and heart failure. The primary end point was the composite of sudden cardiac death, ischemic stroke, newly developed atrial fibrillation, pericardial effusion and thrombosis. Over a mean follow-up of 15.6 months, the primary end point occurred in 17 patients (6.8%). Kaplan-Meier analysis disclosed a notably higher overall occurrence rate in patients with hs-cTnT levels ≥0.014 ng/mL and in patients with T3 <60 ng/dL. A markedly higher hazard was observed in patients with combined high hs-cTnT and low T3 levels. After adjustment, the hazard ratio for overall events in patients with high hs-cTnT/low T3 versus normal hs-cTnT/T3 was 11.72 (95% confidence interval, 2.83-48.57; P = 0.001). In patients with chest pain without clinically obvious CHD, high hs-cTnT combined with low T3 was associated with adverse cardiac/CCV events and was an independent predictor of overall events even after adjustment. These data suggest the importance of systemic factors, such as low T3 syndrome, in the development of adverse cardiac/CCV events beyond advancing clinical atherosclerotic coronary disease in patients with chest pain.

  15. The impact of low-level cloud over the eastern subtropical Pacific on the "Double ITCZ" in LASG FGCM-0

    NASA Astrophysics Data System (ADS)

    Dai, Fushan; Yu, Rucong; Zhang, Xuehong; Yu, Yongqiang; Li, Jianglong

    2003-05-01

    Like many other coupled models, the Flexible coupled General Circulation Model (FGCM-0) suffers from the spurious “Double ITCZ”. In order to understand the “Double ITCZ” in FGCM-0, this study first examines the low-level cloud cover and the bulk stability of the low troposphere over the eastern subtropical Pacific simulated by the National Center for Atmospheric Research (NCAR) Community Climate Model version 3 (CCM3), which is the atmosphere component model of FGCM-0. It is found that the bulk stability of the low troposphere simulated by CCM3 is very consistent with the one derived from the National Center for Environmental Prediction (NCEP) reanalysis, but the simulated low-level cloud cover is much less than that derived from the International Satellite Cloud Climatology Project (ISCCP) D2 data. Based on the regression equations between the low-level cloud cover from the ISCCP data and the bulk stability of the low troposphere derived from the NCEP reanalysis, the parameterization scheme of low-level cloud in CCM3 is modified and used in sensitivity experiments to examine the impact of low-level cloud over the eastern subtropical Pacific on the spurious “Double ITCZ” in FGCM-0. Results show that the modified scheme causes the simulated low-level cloud cover to be improved locally over the cold oceans. Increasing the low-level cloud cover off Peru not only significantly alleviates the SST warm biases in the southeastern tropical Pacific, but also causes the equatorial cold tongue to be strengthened and to extend further west. Increasing the low-level cloud fraction off California effectively reduces the SST warm biases in ITCZ north of the equator. In order to examine the feedback between the SST and low-level cloud cover off Peru, one additional sensitivity experiment is performed in which the SST over the cold ocean off Peru is restored. It shows that decreasing the SST results in similar impacts over the wide regions from the southeastern tropical Pacific northwestwards to the western/central equatorial Pacific as increasing the low-level cloud cover does.
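
    The regression step described above, tying low-cloud cover to a bulk lower-tropospheric stability, can be sketched as follows. The synthetic data and the roughly Klein-and-Hartmann-like slope are illustrative assumptions, not the ISCCP/NCEP regression actually used in the modified scheme.

      import numpy as np

      rng = np.random.default_rng(5)
      lts_k = rng.uniform(10, 22, 300)            # bulk stability proxy, K
      cloud = np.clip(0.057 * lts_k - 0.56        # assumed underlying relation
                      + rng.normal(0, 0.05, 300), 0, 1)

      b, a = np.polyfit(lts_k, cloud, 1)          # slope, intercept

      def low_cloud_scheme(stability_k):
          # Diagnosed low-cloud fraction, clipped to the physical range.
          return np.clip(a + b * np.asarray(stability_k), 0.0, 1.0)

      print(f"cloud = {a:.2f} + {b:.3f} * LTS;  LTS = 18 K -> "
            f"{low_cloud_scheme(18.0):.2f}")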

  16. 76 FR 43604 - Pipeline Safety: Applying Safety Regulations to All Rural Onshore Hazardous Liquid Low-Stress...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-21

    ... Regulations to All Rural Onshore Hazardous Liquid Low-Stress Lines, Correction AGENCY: Pipeline and Hazardous... the Federal Pipeline Safety Regulations to address rural low-stress hazardous liquid pipelines that... regarding the compliance date for identifying all segments of a Category 3 low-stress pipeline. DATES: This...

  17. Recirculating Air Filtration Significantly Reduces Exposure to Airborne Nanoparticles

    PubMed Central

    Pui, David Y.H.; Qi, Chaolong; Stanley, Nick; Oberdörster, Günter; Maynard, Andrew

    2008-01-01

    Background: Airborne nanoparticles from vehicle emissions have been associated with adverse effects in people with pulmonary and cardiovascular disease, and toxicologic studies have shown that nanoparticles can be more hazardous than their larger-scale counterparts. Recirculating air filtration in automobiles and houses may provide a low-cost solution to reducing exposures in many cases, thus reducing possible health risks. Objectives: We investigated the effectiveness of recirculating air filtration on reducing exposure to incidental and intentionally produced airborne nanoparticles under two scenarios: while driving in traffic, and while generating nanomaterials using gas-phase synthesis. Methods: We tested the recirculating air filtration in two commercial vehicles when driving in traffic, as well as in a nonventilated room with a nanoparticle generator, simulating a nanomaterial production facility. We also measured the time-resolved aerosol size distribution during in-car recirculation to investigate how recirculating air filtration affects particles of different sizes. We developed a recirculation model to describe the aerosol concentration change during recirculation. Results: The use of inexpensive, low-efficiency filters in recirculation systems is shown to reduce nanoparticle concentrations to below levels found in a typical office within 3 min while driving through heavy traffic, and within 20 min in a simulated nanomaterial production facility. Conclusions: Development and application of this technology could lead to significant reductions in airborne nanoparticle exposure, reducing possible risks to health and providing solutions for generating nanomaterials safely. PMID:18629306
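
    The recirculation model reduces to first-order decay in a well-mixed volume: concentration falls as exp(-ηQt/V), where η is the single-pass filter efficiency, Q the recirculating flow, and V the cabin volume. The sketch below uses invented cabin parameters, not the measured vehicle values.

      import numpy as np

      def recirculation_decay(c0, volume_m3, flow_m3_per_min, efficiency, t_min):
          # C(t) = C0 * exp(-eta * Q * t / V) for a well-mixed volume
          rate = efficiency * flow_m3_per_min / volume_m3    # per minute
          return c0 * np.exp(-rate * np.asarray(t_min, dtype=float))

      t = np.arange(0, 21)                                   # minutes
      cabin = recirculation_decay(1e5, volume_m3=3.0, flow_m3_per_min=2.0,
                                  efficiency=0.5, t_min=t)   # particles/cm3
      half_life = np.log(2) / (0.5 * 2.0 / 3.0)
      print(f"below 1e4/cm3 after ~{t[cabin < 1e4][0]} min "
            f"(half-life {half_life:.1f} min)")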

  18. Micrometeorological, evapotranspiration, and soil-moisture data at the Amargosa Desert Research site in Nye County near Beatty, Nevada, 2006-11

    USGS Publications Warehouse

    Arthur, Jonathan M.; Johnson, Michael J.; Mayers, C. Justin; Andraski, Brian J.

    2012-11-13

    This report describes micrometeorological, evapotranspiration, and soil-moisture data collected since 2006 at the Amargosa Desert Research Site adjacent to a low-level radioactive waste and hazardous chemical waste facility near Beatty, Nevada. Micrometeorological data include precipitation, solar radiation, net radiation, air temperature, relative humidity, saturated and ambient vapor pressure, wind speed and direction, barometric pressure, near-surface soil temperature, soil-heat flux, and soil-water content. Evapotranspiration (ET) data include latent-heat flux, sensible-heat flux, net radiation, soil-heat flux, soil temperature, air temperature, vapor pressure, and other principal energy-budget data. Soil-moisture data include periodic measurements of volumetric water content at experimental sites that represent vegetated native soil, devegetated native soil, and simulated waste disposal trenches; maximum measurement depths range from 5.25 to 29.25 meters. All data are compiled in electronic spreadsheets that are included with this report.
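
    For context, the listed energy-budget terms are commonly combined by residual closure to estimate latent-heat flux. A minimal sketch, with illustrative midday desert values:

```python
def latent_heat_flux(net_radiation, soil_heat_flux, sensible_heat_flux):
    """Residual closure of the surface energy budget: LE = Rn - G - H (W/m^2)."""
    return net_radiation - soil_heat_flux - sensible_heat_flux

# Illustrative midday values for a sparsely vegetated desert site
print(latent_heat_flux(net_radiation=450.0, soil_heat_flux=80.0,
                       sensible_heat_flux=300.0))  # 70 W/m^2
```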

  19. Perception of Natural Hazards and Risk among University of Washington Students

    NASA Astrophysics Data System (ADS)

    Herr, K.; Brand, B.; Hamlin, N.; Ou, J.; Thomas, B.; Tudor, E.

    2012-12-01

    Familiarity with a given population's perception of natural hazards and the threats they present is vital for the development of effective education prior to, and emergency management response after, a natural event. While much work has been done in other active tectonic regions, perception of natural hazards and risk among Pacific Northwest (PNW) residents is poorly constrained. The objective of this work is to assess the current perception of earthquake and volcanic hazards and risk in the PNW, and to better understand the factors that drive the public's behavior concerning preparedness and response. We developed a survey to assess participants' knowledge of natural hazards common to the region, their perception of the risk these hazards pose, and their level of preparedness should a natural hazard occur. The survey was distributed to University of Washington students and employees via an internet link as part of a class project in 'Living with Volcanoes' (ESS 106) in March of 2012, and returned more than 900 responses. The UW student population was chosen as our first "population" to assess because of its uniqueness as a large, semi-transient population (typical residence of less than 5 years). Only 50% of participants correctly reported their proximity to an active volcano, indicating either a lack of knowledge of active volcanoes in the region or poor spatial awareness. When asked which areas were most at risk from lahars, respondents indicated that all areas close to the hazard source, including topographically elevated regions, were at higher risk than more distal, low-lying localities that are in fact also at high risk, indicating a lack of knowledge concerning the topographic dependency of this hazard. Participants perceived themselves to be better able to cope with an earthquake than with a volcanic event. This perception may be due to a lack of knowledge of volcanic hazards and their extent, or to a false sense of security concerning earthquakes fostered by regular earthquake drills and long periods of quiescence between large earthquake events. Sixty percent of respondents had participated in earthquake drills; however, less than 45% provided the correct response when asked what they would do if an earthquake were to occur. In summary, knowledge of natural hazards and proximity to hazard sources was found to be low or inaccurate, which corresponds to a low perception of risk. Awareness of evacuation routes and of emergency response or coping protocols for natural hazards was also found to be low, suggesting that this large, semi-transient population lacks an understanding of proper preparation for and response to a natural hazard. These results indicate the need for better education concerning the risks of natural hazards in this region and the steps toward better preparedness.

  20. Spectrum Modal Analysis for the Detection of Low-Altitude Windshear with Airborne Doppler Radar

    NASA Technical Reports Server (NTRS)

    Kunkel, Matthew W.

    1992-01-01

    A major obstacle in the estimation of windspeed patterns associated with low-altitude windshear with an airborne pulsed Doppler radar system is the presence of strong ground clutter, which can severely bias a windspeed estimate. Typical solutions attempt to remove the clutter energy from the return through clutter-rejection filtering. Proposed here is a method whereby both the weather and clutter modes present in a return spectrum can be identified, yielding an unbiased estimate of the weather mode without the need for clutter-rejection filtering. An attempt is made to show that modeling through a second-order extended Prony approach is sufficient for identification of the weather mode. A pattern-recognition approach to windspeed estimation from the identified modes is derived and applied to both simulated and actual flight data. Comparisons between windspeed estimates derived from modal analysis and from the pulse-pair estimator are included, along with associated hazard factors. Also included is a computationally attractive method for estimating windspeeds directly from the coefficients of a second-order autoregressive model; a sketch of that idea follows. Extensions and recommendations for further study are included.
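
    A minimal sketch of the autoregressive idea, assuming a Yule-Walker fit and taking the dominant AR(2) pole angle as the mean Doppler frequency; the fitting method and radar parameters are illustrative assumptions, not the report's implementation:

```python
import numpy as np

def ar2_velocity(x, prt, wavelength):
    """Mean radial velocity from the dominant pole of an AR(2) model.

    x          : complex baseband samples from one range gate
    prt        : pulse repetition time (s)
    wavelength : radar wavelength (m)
    """
    # Biased sample autocorrelation at lags 0, 1, 2
    r = np.array([np.vdot(x[:len(x) - k], x[k:]) / len(x) for k in range(3)])
    # Yule-Walker equations for the complex AR(2) coefficients a1, a2
    R = np.array([[r[0], np.conj(r[1])], [r[1], r[0]]])
    a1, a2 = np.linalg.solve(R, r[1:])
    # Poles of 1 - a1*z^-1 - a2*z^-2; dominant pole angle -> mean frequency
    poles = np.roots([1.0, -a1, -a2])
    pole = poles[np.argmax(np.abs(poles))]
    f_mean = np.angle(pole) / (2.0 * np.pi * prt)
    return wavelength * f_mean / 2.0  # Doppler relation v = lambda * f / 2

# Synthetic check: one weather mode at +12 m/s in noise
rng = np.random.default_rng(0)
prt, lam = 1e-3, 0.05  # 1 kHz PRF, 5 cm wavelength (illustrative)
n = np.arange(256)
f_true = 2 * 12.0 / lam
x = np.exp(2j * np.pi * f_true * prt * n) + 0.1 * (rng.standard_normal(256)
                                                   + 1j * rng.standard_normal(256))
print(ar2_velocity(x, prt, lam))  # ~ 12 m/s
```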

  1. Rossitsa River Basin: Flood Hazard and Risk Identification

    NASA Astrophysics Data System (ADS)

    Mavrova-Guirguinova, Maria; Pencheva, Denislava

    2017-04-01

    The process of flood risk management planning, and the adoption of measures for flood risk reduction such as early warning, create a need for surveys addressing risk identification. This project presents risk identification combining two lines of analysis: (1) creation of a mathematical model of rainfall-runoff processes in a watershed based on a limited number of observed input and output variables; (2) procedures for determining critical thresholds, i.e., discharges/water levels corresponding to certain consequences. The pilot region is the Rossitsa river basin, Sevlievo, Bulgaria. The first line of analysis follows these steps: (a) creation and calibration of Unit Hydrograph Models (UHM) based on a limited number of observed discharge and precipitation data; the survey of the selected region comprises 22 observations of excess rainfall and discharge. (b) The relations of the UHM coefficients to the input parameters were determined statistically, except for the ANN model of the runoff coefficient as a function of three parameters (amount of precipitation two days before, soil condition, and rainfall intensity), for which a feedforward neural network is used. (c) Additional simulations with the UHM generated synthetic rainfall-runoff events that extend the range of the observed data. (d) A generalized regional ANN model for discharge forecasting with four input parameters was trained, validated, and tested; the training data set consists of synthetic data, while the validation and testing data sets consist of observations. In the second line of analysis, concerning the determination of critical hazard levels, a function between consequences and discharges was derived. Unsteady simulations with the hydraulic model, using three typical hydrographs, determine the available reaction time between one critical threshold and the next. The critical thresholds were corrected to provide the necessary reaction time between thresholds, and a probability analysis of the finally determined thresholds was performed. The result of the described method is a Catalogue for off-line flood hazard and risk identification, which can be used as an interactive computer system based on simulations of the ANN "Catalogue". Flood risk identification for a future rainfall event is made in a multi-dimensional space for each kind of soil condition (dry, average wet, and wet) and the observed amount of precipitation two days before. Rainfall-runoff scenarios for intensive rainfall and for sustained rainfall (more than 6 hours) are taken into account. Critical thresholds and hazard zones requiring specific operative activities (rescue and recovery), corresponding to each of the regulated flood protection levels (unit, municipality, regional, or national), are presented. The Catalogue allows extraction of flood hazard scenarios; it is therefore useful in the prevention stage of flood protection planning (planning emergency operations, measures, and resources for their implementation) and in creating scenarios for training under the emergency plans. For early warning applications, it gives an approximate forecast of flood hazard and supplies a necessary reaction time of about 24 hours, enabling early warning of the responsible authorities, all parts of the Unified Rescue System, and members of the relevant headquarters for disaster protection (at the municipality, regional, or national level).
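
    As a sketch of the kind of small feedforward network described for the runoff coefficient, the example below maps the three stated inputs (precipitation two days before, soil condition, rainfall intensity) to a runoff coefficient. The architecture, library, and synthetic data are placeholder assumptions, not the project's model:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
# Features: [2-day antecedent precipitation (mm), soil condition (0=dry..2=wet),
#            rainfall intensity (mm/h)] -- synthetic placeholders
X = rng.uniform([0.0, 0.0, 0.0], [60.0, 2.0, 30.0], size=(200, 3))
# Synthetic target: wetter antecedent conditions raise the runoff coefficient
y = np.clip(0.10 + 0.004 * X[:, 0] + 0.15 * X[:, 1] + 0.005 * X[:, 2]
            + rng.normal(0.0, 0.02, 200), 0.0, 1.0)

model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
model.fit(X, y)
print(model.predict([[40.0, 2.0, 12.0]]))  # runoff coefficient for a wet, intense event
```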

  2. Simulation of the Onset of the Southeast Asian Monsoon during 1997 and 1998: The Impact of Surface Processes

    NASA Technical Reports Server (NTRS)

    Wang, Yansen; Tao, W.-K.; Lau, K.-M.; Wetzel, Peter J.

    2004-01-01

    The onset of the southeast Asian monsoon during 1997 and 1998 was simulated by coupling a mesoscale atmospheric model (MM5) with a detailed land surface model, PLACE (the Parameterization for Land-Atmosphere-Cloud Exchange). The rainfall results from the simulations were compared with observed satellite data from the TRMM (Tropical Rainfall Measuring Mission) TMI (TRMM Microwave Imager) and GPCP (Global Precipitation Climatology Project). The control simulation with the PLACE land surface model and variable sea surface temperature captured the basic signatures of the monsoon onset processes and associated rainfall statistics. Sensitivity tests indicated that simulations were significantly improved by including the PLACE land surface model. The mechanisms by which the land surface processes affect the moisture transport and the convection during the onset of the southeast Asian monsoon were analyzed. The results indicated that land surface processes played an important role in modifying the low-level wind field over two major branches of the circulation: the southwest low-level flow over the Indochina Peninsula and the northern, cold frontal intrusion from southern China. The surface sensible and latent heat fluxes modified the low-level temperature distribution and gradient, and therefore the low-level wind through the thermal wind effect. The more realistic forcing of the sensible and latent heat fluxes from the detailed land surface model improved the low-level wind simulation and the associated moisture transport and convection.
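
    For reference, the thermal wind relation invoked here links the vertical shear of the geostrophic wind to the horizontal temperature gradient, which is how flux-driven changes in the low-level temperature field can alter the low-level wind. In pressure coordinates:

```latex
\frac{\partial \mathbf{v}_g}{\partial \ln p} \;=\; -\frac{R}{f}\,\hat{\mathbf{k}}\times\nabla_p T
```

    where $\mathbf{v}_g$ is the geostrophic wind, $R$ the gas constant for dry air, $f$ the Coriolis parameter, and $\nabla_p T$ the temperature gradient on a pressure surface.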

  4. An alternative approach to probabilistic seismic hazard analysis in the Aegean region using Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Weatherill, Graeme; Burton, Paul W.

    2010-09-01

    The Aegean is the most seismically active and tectonically complex region in Europe. Damaging earthquakes have occurred here throughout recorded history, often resulting in considerable loss of life. The Monte Carlo method of probabilistic seismic hazard analysis (PSHA) is used to determine the level of ground motion likely to be exceeded in a given time period. Multiple random simulations of seismicity are generated to calculate, directly, the ground motion for a given site. Within the seismic hazard analysis we explore the impact of different seismic source models, incorporating both uniform zones and distributed seismicity. A new, simplified seismic source model, derived from seismotectonic interpretation, is presented for the Aegean region. This is combined in the epistemic uncertainty analysis with existing source models for the region and with models derived by a K-means cluster analysis approach. Seismic source models derived using the K-means approach offer a degree of objectivity and reproducibility in the otherwise subjective task of delineating seismic sources by expert judgment. A similar review and analysis is undertaken for the selection of peak ground acceleration (PGA) attenuation models, incorporating into the epistemic analysis Greek-specific models, European models, and a Next Generation Attenuation (NGA) model. Hazard maps for PGA on a "rock" site with a 10% probability of being exceeded in 50 years are produced, and different source and attenuation models are compared. These indicate that Greek-specific attenuation models, with their smaller aleatory variability terms, produce lower PGA hazard, whilst recent European models and the NGA model produce similar results. The Monte Carlo method is extended further to assimilate epistemic uncertainty into the hazard calculation, thus integrating across several appropriate source and PGA attenuation models. Site condition and fault type are also integrated into the hazard mapping calculations. These hazard maps are in general agreement with previous maps for the Aegean, recognising the highest hazard in the Ionian Islands, Gulf of Corinth and Hellenic Arc. Peak ground accelerations for some sites in these regions reach as high as 500-600 cm s-2 using European/NGA attenuation models, and 400-500 cm s-2 using Greek attenuation models.
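
    The core Monte Carlo loop of such a PSHA can be sketched in a few lines: draw Poisson event counts, sample magnitudes from a truncated Gutenberg-Richter distribution, attenuate to the site, and read the 90th percentile of the 50-year maxima. The source parameters and placeholder GMPE below are illustrative assumptions, not the paper's calibrated models:

```python
import numpy as np

rng = np.random.default_rng(1)
years, n_sims = 50, 10_000
rate = 2.0                          # mean events/yr above m_min (placeholder)
b, m_min, m_max = 1.0, 4.5, 7.5     # Gutenberg-Richter parameters (placeholder)

def sample_magnitudes(n):
    """Truncated Gutenberg-Richter magnitudes via inverse-CDF sampling."""
    beta = b * np.log(10.0)
    u = rng.random(n)
    return m_min - np.log(1.0 - u * (1.0 - np.exp(-beta * (m_max - m_min)))) / beta

max_pga = np.empty(n_sims)
for i in range(n_sims):
    n = rng.poisson(rate * years)                    # catalog size for 50 years
    m = sample_magnitudes(n)
    dist = rng.uniform(10.0, 100.0, n)               # source-site distances (km)
    # Placeholder GMPE: ln PGA(g) with lognormal aleatory variability
    ln_pga = -4.0 + 1.0 * m - 1.3 * np.log(dist) + rng.normal(0.0, 0.6, n)
    max_pga[i] = np.exp(ln_pga).max() if n > 0 else 0.0

# PGA with a 10% probability of exceedance in 50 years
print(np.quantile(max_pga, 0.90))
```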

  5. Effect of Socioeconomic Status on Mortality after Bacteremia in Working-Age Patients. A Danish Population-Based Cohort Study

    PubMed Central

    Koch, Kristoffer; Nørgaard, Mette; Schønheyder, Henrik Carl; Thomsen, Reimar Wernich; Søgaard, Mette

    2013-01-01

    Objectives: To examine the effect of socioeconomic status (SES) on mortality in patients with bacteremia and the underlying factors that may mediate differences in mortality. Methods: We conducted a population-based cohort study in two Danish regions. All patients 30 to 65 years of age with first-time bacteremia from 2000 through 2008 were identified in a population-based microbiological bacteremia database (n = 8,653). Individual-level data on patients' SES (educational level and personal income) and comorbid conditions were obtained from public and medical registries. We used Cox regression to examine mortality within 30 days after bacteremia with and without cumulative adjustment for potential mediators. Results: Bacteremia patients of low SES were more likely to live alone and be unmarried than patients of high SES. They also had more pre-existing comorbidity, more substance abuse, more Staphylococcus aureus and nosocomial infections, and more admissions to small nonteaching hospitals. Overall, 1,374 patients (15.9%) died within 30 days of follow-up. Patients of low SES had consistently higher mortality after bacteremia than those of high SES (crude hazard ratio for low vs. high education, 1.38 [95% confidence interval (CI), 1.18–1.61]; crude hazard ratio for low-income vs. high-income tertile, 1.58 [CI, 1.39–1.80]). Adjustment for differences in social support, pre-existing comorbidity, substance abuse, place of acquisition of the infection, and microbial agent substantially attenuated the effect of SES on mortality (adjusted hazard ratio for low vs. high education, 1.15 [95% CI, 0.98–1.36]; adjusted hazard ratio for low-income vs. high-income tertile, 1.29 [CI, 1.12–1.49]). Further adjustment for characteristics of the admitting hospital had minimal effect on observed mortality differences. Conclusions: Low SES was strongly associated with increased 30-day mortality after bacteremia. Less social support, more pre-existing comorbidity, more substance abuse, and differences in place of acquisition and agent of infection appeared to mediate much of the observed disparities in mortality. PMID:23936145

  6. Effect of socioeconomic status on mortality after bacteremia in working-age patients. A Danish population-based cohort study.

    PubMed

    Koch, Kristoffer; Nørgaard, Mette; Schønheyder, Henrik Carl; Thomsen, Reimar Wernich; Søgaard, Mette

    2013-01-01

    To examine the effect of socioeconomic status (SES) on mortality in patients with bacteremia and the underlying factors that may mediate differences in mortality, we conducted a population-based cohort study in two Danish regions. All patients 30 to 65 years of age with first-time bacteremia from 2000 through 2008 were identified in a population-based microbiological bacteremia database (n = 8,653). Individual-level data on patients' SES (educational level and personal income) and comorbid conditions were obtained from public and medical registries. We used Cox regression to examine mortality within 30 days after bacteremia with and without cumulative adjustment for potential mediators. Bacteremia patients of low SES were more likely to live alone and be unmarried than patients of high SES. They also had more pre-existing comorbidity, more substance abuse, more Staphylococcus aureus and nosocomial infections, and more admissions to small nonteaching hospitals. Overall, 1,374 patients (15.9%) died within 30 days of follow-up. Patients of low SES had consistently higher mortality after bacteremia than those of high SES (crude hazard ratio for low vs. high education, 1.38 [95% confidence interval (CI), 1.18-1.61]; crude hazard ratio for low-income vs. high-income tertile, 1.58 [CI, 1.39-1.80]). Adjustment for differences in social support, pre-existing comorbidity, substance abuse, place of acquisition of the infection, and microbial agent substantially attenuated the effect of SES on mortality (adjusted hazard ratio for low vs. high education, 1.15 [95% CI, 0.98-1.36]; adjusted hazard ratio for low-income vs. high-income tertile, 1.29 [CI, 1.12-1.49]). Further adjustment for characteristics of the admitting hospital had minimal effect on observed mortality differences. Low SES was strongly associated with increased 30-day mortality after bacteremia. Less social support, more pre-existing comorbidity, more substance abuse, and differences in place of acquisition and agent of infection appeared to mediate much of the observed disparities in mortality.
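
    The Cox regression setup described in both versions of this study can be illustrated with the lifelines library on synthetic data; the variable names and data below are placeholders, not the Danish registry variables:

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 500
df = pd.DataFrame({
    "low_income": rng.integers(0, 2, n),              # placeholder exposure
    "comorbidity": rng.integers(0, 2, n),             # placeholder mediator
    "days": rng.exponential(60.0, n).clip(max=30.0),  # follow-up capped at 30 days
    "died": rng.integers(0, 2, n),                    # placeholder event indicator
})

cph = CoxPHFitter()
cph.fit(df, duration_col="days", event_col="died")
cph.print_summary()  # hazard ratios with 95% CIs; add or drop covariate columns
                     # to mimic crude vs. cumulatively adjusted models
```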

  7. Towards a robust framework for Probabilistic Tsunami Hazard Assessment (PTHA) for local and regional tsunami in New Zealand

    NASA Astrophysics Data System (ADS)

    Mueller, Christof; Power, William; Fraser, Stuart; Wang, Xiaoming

    2013-04-01

    Probabilistic Tsunami Hazard Assessment (PTHA) is conceptually closely related to Probabilistic Seismic Hazard Assessment (PSHA). The main difference is that PTHA needs to simulate propagation of tsunami waves through the ocean and cannot rely on attenuation relationships, which makes PTHA computationally more expensive. The wave propagation process can be assumed to be linear as long as water depth is much larger than the wave amplitude of the tsunami. Beyond this limit a non-linear scheme has to be employed with significantly higher algorithmic run times. PTHA considering far-field tsunami sources typically uses unit source simulations, and relies on the linearity of the process by later scaling and combining the wave fields of individual simulations to represent the intended earthquake magnitude and rupture area. Probabilistic assessments are typically made for locations offshore but close to the coast. Inundation is calculated only for significantly contributing events (de-aggregation). For local and regional tsunami it has been demonstrated that earthquake rupture complexity has a significant effect on the tsunami amplitude distribution offshore and also on inundation. In this case PTHA has to take variable slip distributions and non-linearity into account. A unit source approach cannot easily be applied. Rupture complexity is seen as an aleatory uncertainty and can be incorporated directly into the rate calculation. We have developed a framework that manages the large number of simulations required for local PTHA. As an initial case study the effect of rupture complexity on tsunami inundation and the statistics of the distribution of wave heights have been investigated for plate-interface earthquakes in the Hawke's Bay region in New Zealand. Assessing the probability that water levels will be in excess of a certain threshold requires the calculation of empirical cumulative distribution functions (ECDF). We compare our results with traditional estimates for tsunami inundation simulations that do not consider rupture complexity. De-aggregation based on moment magnitude alone might not be appropriate, because the hazard posed by any individual event can be underestimated locally if rupture complexity is ignored.
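
    The ECDF step is straightforward: given an ensemble of simulated wave heights at a site, the exceedance probability of a threshold is the fraction of ensemble members above it. A minimal sketch with synthetic heights:

```python
import numpy as np

def exceedance_probability(samples, threshold):
    """P(height > threshold) estimated from an ensemble of simulated heights."""
    return float(np.mean(np.asarray(samples) > threshold))

heights = np.random.default_rng(3).lognormal(mean=0.5, sigma=0.6, size=5000)
for h in (1.0, 2.0, 4.0):
    print(f"P(height > {h} m) = {exceedance_probability(heights, h):.3f}")
```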

  8. Characterization of Natural and Simulated Herbivory on Wild Soybean (Glycine soja Seib. et Zucc.) for Use in Ecological Risk Assessment of Insect Protected Soybean

    PubMed Central

    Goto, Hidetoshi; Shimada, Hiroshi; Horak, Michael J.; Ahmad, Aqeel; Baltazar, Baltazar M.; Perez, Tim; McPherson, Marc A.; Stojšin, Duška; Shimono, Ayako; Ohsawa, Ryo

    2016-01-01

    Insect-protected soybean (Glycine max (L.) Merr.) was developed to protect against foliage feeding by certain Lepidopteran insects. The assessment of potential consequences of transgene introgression from soybean to wild soybean (Glycine soja Seib. et Zucc.) is required as one aspect of the environmental risk assessment (ERA) in Japan. A potential hazard of insect-protected soybean may be hypothesized as transfer of the trait by gene flow to wild soybean and a subsequent reduction in foliage feeding by Lepidopteran insects that results in increased weediness of wild soybean in Japan. To assess this potential hazard, two studies were conducted. A three-year survey of wild soybean populations in Japan was conducted to establish basic information on foliage damage caused by different herbivores. When assessed across all populations and years within each prefecture, the total foliage damage from different herbivores was ≤ 30%, with the lowest levels of defoliation (< 2%) caused by Lepidopteran insects. A separate experiment using five levels of simulated defoliation (0%, 10%, 25%, 50% and 100%) was conducted to assess the impact on pod and seed production and time to maturity of wild soybean. The results indicated that there was no decrease in pod or seed number, nor any change in time to maturity, of wild soybean plants at defoliation rates up to 50%. The results from these experiments indicate that wild soybean is not limited by Lepidopteran feeding and has an ability to compensate for the defoliation levels observed in nature. Therefore, the potential hazard to wild soybean from the importation of insect-protected soybean for food and feed into Japan is negligible. PMID:26963815

  9. Prediction of the run out extents of the Slano Blato landslide for future debris flow events

    NASA Astrophysics Data System (ADS)

    Askarinejad, Amin; Leu, Pascal; Macek, Matej; Petkovsek, Ana; Springman, Sarah

    2013-04-01

    The Slano Blato landslide has a volume of about 1 million m3 and is located in the western part of Slovenia. It has been considered a potential natural hazard for the village of Lokavec for more than 200 years. Several mudflows, exhibiting a range of volumes and velocities, have originated from the landslide body since the year 2000, when the landslide was reactivated by an intense rainfall event. A series of obstacles, including safety dams and deposition ponds, has been constructed for the remediation of the landslide. These obstacles are designed to absorb and contain future debris flows. A prerequisite to any risk analysis is to establish the vulnerability to the hazard event. The aim of this work is to simulate possible future debris flow scenarios in order to predict the runout distances, flow heights, impact pressures and potential effects on the downstream village buildings and infrastructure. The simulations were carried out using the RAMMS program (RApid Mass MovementS, www.ramms.slf.ch). A three-dimensional terrain model of the landslide area and the downstream zones, with and without the obstacles, was built for the simulations, and different scenarios concerning the released volume, the internal friction and the viscosity of the sliding mass were studied. The results indicate that low-viscosity mudflows with a volume of 5,000 m3 endanger some parts of Lokavec village, while the simulations with volumes of 15,000 and 50,000 m3 predict catastrophic effects, in terms of either impact pressures or deposition heights, for the majority of houses. Moreover, the simulations confirmed that the choice of the material properties (internal friction and viscosity), the characteristics of the release hydrograph, the event location, and natural or man-made obstacles play major roles in the runout distances and impact pressures.
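
    RAMMS-type runout models are commonly described as using a Voellmy-Salm friction relation, which is one natural reading of the "internal friction and viscosity" parameters varied in these scenarios (an assumption, since the abstract does not name the rheology): the basal resistance combines a dry-Coulomb term with a velocity-dependent turbulent term,

```latex
S = \mu N + \frac{\rho\, g\, v^{2}}{\xi}, \qquad N = \rho\, h\, g \cos\varphi
```

    where $S$ is the basal flow resistance, $N$ the normal stress, $\rho$ the flow density, $h$ the flow height, $v$ the flow velocity, $\varphi$ the slope angle, $\mu$ the dry-Coulomb friction coefficient, and $\xi$ (m/s$^2$) the turbulent friction coefficient.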

  10. Safe disposal of radionuclides in low-level radioactive-waste repository sites; Low-level radioactive-waste disposal workshop, U.S. Geological Survey, July 11-16, 1987, Big Bear Lake, Calif., Proceedings

    USGS Publications Warehouse

    Bedinger, Marion S.; Stevens, Peter R.

    1990-01-01

    In the United States, low-level radioactive waste is disposed of by shallow-land burial. Low-level radioactive waste generated by non-Federal facilities has been buried at six commercially operated sites; low-level radioactive waste generated by Federal facilities has been buried at eight major and several minor Federally operated sites (fig. 1). Generally, low-level radioactive waste is somewhat imprecisely defined as waste that does not fit the definition of high-level radioactive waste and does not exceed 100 nCi/g in the concentration of transuranic elements. Most low-level radioactive waste generated by non-Federal facilities is generated at nuclear power plants; the remainder is generated primarily at research laboratories, hospitals, industrial facilities, and universities. On the basis of the half-lives and concentrations of radionuclides in low-level radioactive waste, the hazard associated with burial of such waste generally lasts for about 500 years. Studies made at several of the commercially and Federally operated low-level radioactive-waste repository sites indicate that some of these sites have provided neither containment of the waste nor the expected protection of the environment.
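
    The roughly 500-year horizon can be motivated with a back-of-envelope decay calculation: activity falls by a factor of two per half-life, so short-lived nuclides are reduced by many orders of magnitude over 500 years. A sketch using standard half-lives (the nuclide selection is illustrative):

```python
def remaining_fraction(t_years, half_life_years):
    """Fraction of initial activity left after t_years: 2^(-t/T_half)."""
    return 2.0 ** (-t_years / half_life_years)

# Illustrative short-lived nuclides common in low-level waste
for nuclide, t_half in [("Cs-137", 30.1), ("Sr-90", 28.8), ("Co-60", 5.27)]:
    frac = remaining_fraction(500.0, t_half)
    print(f"{nuclide}: {frac:.1e} of initial activity after 500 years")
```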

  11. Turbulence Hazard Metric Based on Peak Accelerations for Jetliner Passengers

    NASA Technical Reports Server (NTRS)

    Stewart, Eric C.

    2005-01-01

    Calculations are made of the approximate hazard due to peak normal accelerations of an airplane flying through a simulated vertical wind field associated with a convective frontal system. The calculations are based on a hazard metric developed from a systematic application of a generic math model to 1-cosine discrete gusts of various amplitudes and gust lengths. The math model simulates the three-degree-of-freedom longitudinal rigid-body response to vertical gusts and includes (1) fuselage flexibility, (2) the lag in the downwash from the wing to the tail, (3) gradual lift effects, (4) a simplified autopilot, and (5) the motion of an unrestrained passenger in the rear cabin. Airplane and passenger response contours are calculated for a matrix of gust amplitudes and gust lengths. The airplane response contours are used to develop an approximate hazard metric of peak normal accelerations as a function of gust amplitude and gust length. The hazard metric is then applied to a two-dimensional simulated vertical wind field of a convective frontal system. The variations of the hazard metric with gust length and airplane heading are demonstrated.
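
    The 1-cosine discrete gust family underlying the metric has a standard closed form. A minimal sketch, with illustrative amplitude and gust length:

```python
import numpy as np

def one_minus_cosine_gust(x, amplitude, gust_length):
    """Vertical gust velocity w(x) = (A/2)(1 - cos(2*pi*x/L)) on 0 <= x <= L."""
    x = np.asarray(x, dtype=float)
    w = 0.5 * amplitude * (1.0 - np.cos(2.0 * np.pi * x / gust_length))
    return np.where((x >= 0.0) & (x <= gust_length), w, 0.0)

# Illustrative gust: 15 m/s peak amplitude over a 300 m gust length
x = np.linspace(0.0, 300.0, 7)
print(one_minus_cosine_gust(x, amplitude=15.0, gust_length=300.0))
```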

  12. Food safety evaluation for R-proteins introduced by biotechnology: A case study of VNT1 in late blight protected potatoes.

    PubMed

    Habig, Jeffrey W; Rowland, Aaron; Pence, Matthew G; Zhong, Cathy X

    2018-06-01

    Resistance genes (R-genes) from wild potato species confer protection against disease and can be introduced into cultivated potato varieties by breeding or biotechnology. The R-gene Rpi-vnt1, which encodes the VNT1 protein, protects against late blight, caused by Phytophthora infestans. Heterologous expression and purification of active VNT1 in quantities sufficient for regulatory biosafety studies was problematic, making it impractical to generate hazard characterization data. As a case study for R-proteins, a weight-of-evidence, tiered approach was used to evaluate the safety of VNT1. The hazard potential of VNT1 was assessed from relevant safety information including history of safe use, bioinformatics, mode of action, expression levels, and dietary intake. From the assessment it was concluded that Tier II hazard characterization was not needed. R-proteins homologous to VNT1 and identified in edible crops have a history of safe consumption. VNT1 does not share sequence identity with known allergens. Expression levels of R-proteins are generally low, and VNT1 was not detected in potato varieties expressing the Rpi-vnt1 gene. With minimal hazard and negligible exposure, the risks associated with consumption of R-proteins in late blight protected potatoes are exceedingly low, and R-proteins introduced into potatoes to confer late blight protection are considered safe for consumption.

  13. Modeling laser velocimeter signals as triply stochastic Poisson processes

    NASA Technical Reports Server (NTRS)

    Mayo, W. T., Jr.

    1976-01-01

    Previous models of laser Doppler velocimeter (LDV) systems have not adequately described dual-scatter signals in a manner useful for analysis and simulation of low-level photon-limited signals. At low photon rates, an LDV signal at the output of a photomultiplier tube is a compound nonhomogeneous filtered Poisson process, whose intensity function is another (slower) Poisson process with the nonstationary rate and frequency parameters controlled by a random flow (slowest) process. In the present paper, generalized Poisson shot noise models are developed for low-level LDV signals. Theoretical results useful in detection error analysis and simulation are presented, along with measurements of burst amplitude statistics. Computer generated simulations illustrate the difference between Gaussian and Poisson models of low-level signals.
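
    The intensity-modulated ("doubly/triply stochastic") structure can be simulated with the standard Lewis-Shedler thinning algorithm for nonhomogeneous Poisson processes. The burst-shaped rate envelope below is an illustrative stand-in, not the paper's model:

```python
import numpy as np

def thinning(rate_fn, rate_max, t_end, rng):
    """Event times of a nonhomogeneous Poisson process on [0, t_end]."""
    t, events = 0.0, []
    while True:
        t += rng.exponential(1.0 / rate_max)   # candidate from the bounding process
        if t > t_end:
            return np.array(events)
        if rng.random() < rate_fn(t) / rate_max:
            events.append(t)                   # accept with probability rate(t)/rate_max

# Illustrative Doppler-burst-like photon rate envelope (events/s)
burst = lambda t: 1e6 * np.exp(-((t - 5e-6) / 2e-6) ** 2) * 0.5 * (1 + np.cos(2e7 * t))
times = thinning(burst, rate_max=1e6, t_end=1e-5, rng=np.random.default_rng(4))
print(f"{times.size} photon events in one burst")
```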

  14. Particulate Emissions Hazards Associated with Fueling Heat Engines

    NASA Technical Reports Server (NTRS)

    Hendricks, Robert C.; Bushnell, Dennis M.

    2010-01-01

    Exhaust (tailpipe) emissions from all hydrocarbon- (HC-) fueled heat engines, with particle sizes from less than 10 to 140 nm, constitute health hazards; these include emissions from transportation vehicles (e.g., aircraft) and other HC-fueled power systems. CO2 emissions are tracked and, when mapped, show the outlines of major transportation routes and cities. Particulate pollution affects living tissue and has been found to be detrimental to cardiovascular and respiratory systems, where ultrafine particulates translocate directly into the vascular system, promoting diseases that are potentially detectable as organic vapors. This paper discusses aviation emissions, fueling, and certification issues, including heat engine emissions hazards, detection and tracking of emissions at low levels, and alternate energy sources for general aviation.

  15. Interaction of Francisella tularensis bacterial cells with dynamic speckles

    NASA Astrophysics Data System (ADS)

    Ulianova, Onega V.; Ulyanov, Sergey S.; Sazanova, Elena V.; Zudina, Irina; Zhang, Zhihong; Sibo, Zhou; Luo, Qingming

    2006-08-01

    The influence of low-coherence dynamic speckles on colony growth is investigated. It has been demonstrated that the inhibiting effects of light on Francisella tularensis cells are caused by speckle dynamics. Regimes of illumination of the cell suspension were identified for devitalizing hazardous bacteria that cause very dangerous infections such as tularemia. A mathematical model of the interaction of low-coherence laser radiation with a bacterial suspension has been proposed. Computer simulations of the laser-cell interaction processes have been carried out, and the role of the coherence of light in these processes is analyzed.

  16. Changes in triglyceride levels and risk for coronary heart disease in young men.

    PubMed

    Tirosh, Amir; Rudich, Assaf; Shochat, Tzippora; Tekes-Manova, Dorit; Israeli, Eran; Henkin, Yaakov; Kochba, Ilan; Shai, Iris

    2007-09-18

    Current triglyceride levels might be only a weak predictor of risk for coronary heart disease (CHD). To assess the association between changes over time in fasting triglyceride levels and CHD risk in young adults. Follow-up study over 5.5 years after 2 measurements of fasting triglycerides 5 years apart. The Staff Periodic Examination Center of the Israel Defense Forces, Zrifin, Israel. 13,953 apparently healthy, untreated, young men (age 26 to 45 years) with triglyceride levels less than 3.39 mmol/L (<300 mg/dL). Two triglyceride measurements (at enrollment [time 1] and 5 years later [time 2]), lifestyle variables, and incident cases of angiography-proven CHD. Within 5.5 years, 158 new cases of CHD were identified. The multivariate model was adjusted for age; family history; fasting glucose; high-density lipoprotein cholesterol; blood pressure; body mass index; and changes between time 1 and time 2 in body mass index, physical activity, smoking status, and habit of eating breakfast. Investigators categorized triglyceride levels according to low, intermediate, and high tertiles (as measured at time 1 and time 2 [expressed as tertile at time 1/tertile at time 2]). The risk for CHD in men with high-tertile triglyceride levels at time 1 changed depending on the tertile at time 2 (hazard ratios, 8.23 [95% CI, 2.50 to 27.13] for high/high, 6.84 [CI, 1.95 to 23.98] for high/intermediate, and 4.90 [CI, 1.01 to 24.55] for high/low, compared with the stable low/low group). The risk for CHD in men with low-tertile levels at time 1 also changed depending on the tertile at time 2 (hazard ratios, 3.81 [CI, 0.96 to 15.31] for low/intermediate and 6.76 [CI, 1.34 to 33.92] for low/high, compared with the stable low/low group). Participants were healthy and had a low incidence rate of CHD. The study was observational. Two triglyceride measurements obtained 5 years apart may assist in assessing CHD risk in young men. A decrease in initially elevated triglyceride levels is associated with a decrease in CHD risk compared with stable high triglyceride levels. However, this risk remains higher than in those with persistently low triglyceride levels.

  17. Multi-Scale Simulations of Past and Future Projections of Hydrology in Lake Tahoe Basin, California-Nevada (Invited)

    NASA Astrophysics Data System (ADS)

    Niswonger, R. G.; Huntington, J. L.; Dettinger, M. D.; Rajagopal, S.; Gardner, M.; Morton, C. G.; Reeves, D. M.; Pohll, G. M.

    2013-12-01

    Water resources in the Tahoe basin are susceptible to long-term climate change and extreme events because it is a middle-altitude, snow-dominated basin that experiences large inter-annual climate variations. Lake Tahoe provides critical water supply for its basin and downstream populations, but changes in water supply are obscured by complex climatic and hydrologic gradients across the high-relief, geologically complex basin. An integrated surface-water and groundwater model of the Lake Tahoe basin has been developed using GSFLOW to assess the effects of climate change and extreme events on surface and groundwater resources. Key hydrologic mechanisms identified with this model explain recent changes in the water resources of the region. Critical vulnerabilities of regional water supplies, and associated hazards, were also explored. Maintaining a balance between (a) accurate representation of spatial features (e.g., geology, streams, and topography) and hydrologic response (i.e., groundwater, stream, lake, and wetland flows and storages) and (b) computational efficiency is a necessity for the desired model applications. Potential climatic influences on water resources are analyzed here in simulations of long-term water availability and flood responses to selected 100-year climate-model projections. GSFLOW is also used to simulate a scenario depicting an especially extreme storm event, constructed from a combination of two historical atmospheric-river storm events as part of the USGS MultiHazards Demonstration Project. Simulated groundwater levels, streamflow, wetlands, and lake levels compare well with measured values over a 30-year historical simulation period. Results are consistent for both small and large model grid cell sizes, owing to the model's ability to represent water table altitude, streams, and other hydrologic features at the sub-grid scale. Simulated hydrologic responses are affected by climate change, with less groundwater available during more frequent droughts. Simulated floods for the region point to drainage issues in the developed areas around Lake Tahoe and to necessary dam releases that create downstream flood risks.

  18. Numerical Study of a Convective Turbulence Encounter

    NASA Technical Reports Server (NTRS)

    Proctor, Fred H.; Hamilton, David W.; Bowles, Roland L.

    2002-01-01

    A numerical simulation of a convective turbulence event is investigated and compared with observational data. The specific case was encountered during one of NASA's flight tests and was characterized by severe turbulence. The event was associated with overshooting convective turrets that contained low to moderate radar reflectivity. Model comparisons with observations are quite favorable. Turbulence hazard metrics are proposed and applied to the numerical data set. Issues such as adequate grid size are examined.

  19. Simulation of a Doppler lidar system for autonomous navigation and hazard avoidance during planetary landing

    NASA Astrophysics Data System (ADS)

    Budge, Scott E.; Chester, David B.

    2016-05-01

    The latest mission proposals for exploration of solar system bodies require accurate position and velocity data during the descent phase in order to ensure safe, soft landings at the pre-designated sites. During landing maneuvers, the accuracy of the on-board inertial measurement unit (IMU) may not be reliable because of drift over the extended travel times to the destinations. NASA has proposed an advanced Doppler lidar system with multiple beams that can be used to accurately determine the attitude and position of the landing vehicle during descent, and to detect hazards that might exist in the landing area. In order to assess the effectiveness of such a Doppler lidar landing system, it is valuable to simulate the system with different beam numbers and configurations. In addition, the effectiveness of the system in detecting and mapping potential landing hazards must be understood. This paper reports the simulated system performance for a proposed multi-beam Doppler lidar using the LadarSIM system simulation software. Details of the simulation methods are given, as well as lidar performance parameters such as range and velocity accuracy and detection and false alarm rates, along with examples of the Doppler lidar's ability to detect and characterize simulated hazards in the landing site. The simulation includes modulated pulse generation and coherent detection methods, beam footprint simulation, beam scanning, and interaction with terrain.
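
    The geometric core of a multi-beam Doppler lidar velocity solution can be sketched as a least-squares inversion of line-of-sight speeds; the three-beam layout and noise level below are assumptions for illustration, not the proposed instrument's geometry:

```python
import numpy as np

rng = np.random.default_rng(5)
# Unit line-of-sight vectors for three beams canted 22.5 deg off nadir (assumed layout)
cant = np.radians(22.5)
az = np.radians([0.0, 120.0, 240.0])
beams = np.column_stack([np.sin(cant) * np.cos(az),
                         np.sin(cant) * np.sin(az),
                         np.full(3, -np.cos(cant))])

v_true = np.array([1.5, -0.8, -30.0])              # m/s, descending lander
v_los = beams @ v_true + rng.normal(0.0, 0.02, 3)  # measured LOS speeds + noise

# Least-squares inversion; with more than 3 beams this becomes an overdetermined fit
v_est, *_ = np.linalg.lstsq(beams, v_los, rcond=None)
print(v_est)  # close to v_true
```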

  20. 40 CFR 63.4920 - What reports must I submit?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... organic HAP in waste materials sent or designated for shipment to a hazardous waste treatment, storage... certification or audit. (ix) The date and time that each CPMS was inoperative, except for zero (low-level) and...

  1. 40 CFR 63.4920 - What reports must I submit?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... organic HAP in waste materials sent or designated for shipment to a hazardous waste treatment, storage... certification or audit. (ix) The date and time that each CPMS was inoperative, except for zero (low-level) and...

  2. Climate Ready Estuaries Rolling Easements Primer

    EPA Pesticide Factsheets

    Rolling easements enable wetlands and beaches to migrate inland and allow society to avoid the costs and hazards of protecting low lands from rising sea levels. This document provides a primer on more than a dozen rolling easement approaches.

  3. Prediction of earthquake ground motion at rock sites in Japan: evaluation of empirical and stochastic approaches for the PEGASOS Refinement Project

    NASA Astrophysics Data System (ADS)

    Edwards, Benjamin; Fäh, Donat

    2017-11-01

    Strong ground-motion databases used to develop ground-motion prediction equations (GMPEs) and calibrate stochastic simulation models generally include relatively few recordings on what can be considered as engineering rock or hard rock. Ground-motion predictions for such sites are therefore susceptible to uncertainty and bias, which can then propagate into site-specific hazard and risk estimates. In order to explore this issue we present a study investigating the prediction of ground motion at rock sites in Japan, where a wide range of recording-site types (from soil to very hard rock) are available for analysis. We employ two approaches: empirical GMPEs and stochastic simulations. The study is undertaken in the context of the PEGASOS Refinement Project (PRP), a Senior Seismic Hazard Analysis Committee (SSHAC) Level 4 probabilistic seismic hazard analysis of Swiss nuclear power plants, commissioned by swissnuclear and running from 2008 to 2013. In order to reduce the impact of site-to-site variability and expand the available data set for rock and hard-rock sites we adjusted Japanese ground-motion data (recorded at sites with 110 m s-1 < Vs30 < 2100 m s-1) to a common hard-rock reference. This was done through deconvolution of: (i) empirically derived amplification functions and (ii) the theoretical 1-D SH amplification between the bedrock and surface. Initial comparison of a Japanese GMPE's predictions with data recorded at rock and hard-rock sites showed systematic overestimation of ground motion. A further investigation of five global GMPEs' prediction residuals as a function of quarter-wavelength velocity showed that they all presented systematic misfit trends, leading to overestimation of median ground motions at rock and hard-rock sites in Japan. In an alternative approach, a stochastic simulation method was tested, allowing the direct incorporation of site-specific Fourier amplification information in forward simulations. We use an adjusted version of the model developed for Switzerland during the PRP. The median simulation prediction at true rock and hard-rock sites (Vs30 > 800 m s-1) was found to be comparable (within expected levels of epistemic uncertainty) to predictions using an empirical GMPE, with reduced residual misfit. As expected, due to including site-specific information in the simulations, the reduction in misfit could be isolated to a reduction in the site-related within-event uncertainty. The results of this study support the use of finite or pseudo-finite fault stochastic simulation methods in estimating strong ground motions in regions of weak and moderate seismicity, such as central and northern Europe. Furthermore, it indicates that weak-motion data has the potential to allow estimation of between- and within-site variability in ground motion, which is a critical issue in site-specific seismic hazard analysis, particularly for safety critical structures.
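
    The travel-time averaging behind Vs30 (and, generalized to other depths, behind quarter-wavelength velocities) is simple to state: 30 m divided by the vertical shear-wave travel time through the top 30 m. A sketch with an illustrative layered profile:

```python
def vs30(thicknesses_m, velocities_ms):
    """Time-averaged shear-wave velocity over the top 30 m of a layered profile."""
    travel_time, depth = 0.0, 0.0
    for h, v in zip(thicknesses_m, velocities_ms):
        used = min(h, 30.0 - depth)
        travel_time += used / v
        depth += used
        if depth >= 30.0:
            break
    return 30.0 / travel_time

# Illustrative profile: 5 m at 200 m/s, 10 m at 400 m/s, deeper layer at 900 m/s
print(vs30([5.0, 10.0, 40.0], [200.0, 400.0, 900.0]))  # 450 m/s
```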

  4. Debris flow early warning systems in Norway: organization and tools

    NASA Astrophysics Data System (ADS)

    Kleivane, I.; Colleuille, H.; Haugen, L. E.; Alve Glad, P.; Devoli, G.

    2012-04-01

    In Norway, shallow slides and debris flows occur through a combination of high-intensity precipitation, snowmelt, high groundwater levels and saturated soil. Many events have occurred in recent decades, often associated with flood events, especially in southern Norway, causing significant damage to roads, railway lines, buildings, and other infrastructure (e.g., November 2000; August 2003; September 2005; November 2005; May 2008; June and December 2011). Since 1989 the Norwegian Water Resources and Energy Directorate (NVE) has operated a 24-hour flood forecasting system for the entire country. Since 2009 NVE has also been responsible for assisting regions and municipalities in the prevention of disasters posed by landslides and snow avalanches. Besides assisting the municipalities through the implementation of digital landslide inventories, susceptibility and hazard mapping, areal planning, preparation of guidelines, realization of mitigation measures, and help during emergencies, NVE is developing a regional-scale debris flow warning system that uses the hydrological models already available in the flood warning system. It is well known that rainfall thresholds alone are not sufficient to evaluate the hazard of debris flows and shallow slides; soil moisture conditions play a crucial role in the triggering conditions. Information on simulated soil and groundwater conditions and on water supply (rain and snowmelt), based on weather forecasts, has proved to be a useful indicator of the potential occurrence of debris flows and shallow slides. Forecasts of runoff and freezing-thawing are also valuable. The early warning system uses real-time measurements (discharge; groundwater level; soil water content and soil temperature; snow water equivalent; meteorological data) and model simulations (a spatially distributed version of the HBV model and an adapted version of the 1-D soil water and energy balance model COUP). The data are presented in a web- and GIS-based system with daily nationwide maps showing the meteorological and hydrological conditions for the present and the near future based on quantitative weather prognoses. In addition, a division of the country into homogeneous debris flow-prone regions is in progress, based on geomorphological and topographic parameters and the distribution of loose Quaternary deposits. Threshold levels are being investigated using statistical analyses of historical debris flow events and measured hydro-meteorological parameters. The debris flow early warning system is currently being tested and is expected to be operational in 2013. The final products will be warning messages and a map showing the different hazard levels, from low to high, indicating the landslide probability and the type of expected damage in a given area. Many activities are carried out in close collaboration with the road and railway authorities, the geological survey, and private consulting companies.
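
    A heavily simplified sketch of the threshold logic such a system might apply is shown below: combine forecast rain and snowmelt into a water-supply index, weight it by simulated soil saturation, and compare it against fixed thresholds. The index form and threshold values are invented placeholders, not NVE's operational criteria:

```python
def hazard_level(rain_mm, snowmelt_mm, soil_saturation):
    """Qualitative debris-flow hazard for one region-day (toy thresholds)."""
    water_supply = rain_mm + snowmelt_mm          # forecast water input (mm/day)
    index = water_supply * soil_saturation        # wetter ground -> more runoff
    if index > 40.0:
        return "high"
    if index > 20.0:
        return "moderate"
    return "low"

print(hazard_level(rain_mm=35.0, snowmelt_mm=15.0, soil_saturation=0.9))  # high
```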

  5. A comparison of the Landsat image and LAHARZ-simulated lahar inundation hazard zone by the 2010 Merapi eruption

    NASA Astrophysics Data System (ADS)

    Lee, Seul-Ki; Lee, Chang-Wook; Lee, Saro

    2015-06-01

    Located above the Java subduction zone, Merapi Volcano is an active stratovolcano with a volcanic activity cycle of 1-5 years. Most Merapi eruptions are relatively small, with a volcanic explosivity index (VEI) of 1-3. However, the most recent eruption, which occurred in 2010, was quite violent, with a VEI of 4, and killed 386 people. In this study, lahar and pyroclastic flow zones were detected using optical Landsat images and simulated using the LAHARZ program. To detect the areal extents of lahar and pyroclastic flows from the Landsat images, supervised classification was performed after atmospheric correction using a cosine of the solar zenith correction (COST) model. The extracted dimensions of the pyroclastic flows are nearly identical to those in the Calatrava Volcanic Province (CVP) monthly reports. Areas of potential lahar and pyroclastic flow inundation, based on flow volume, were then simulated and mapped using the LAHARZ program. Finally, the detected lahar and pyroclastic flow zones were compared with the simulated potential zones and verified. The results showed satisfactory similarity (55.63%) between the detected and simulated zones. The zones simulated with the LAHARZ program can serve as an essential volcanic hazard map for preventing loss of life and property damage at Merapi Volcano, and the LAHARZ program can likewise be used to map volcano hazards in other hazardous volcanic areas.
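
    LAHARZ itself rests on the semi-empirical lahar scaling relations of Iverson et al. (1998), which tie the inundated cross-sectional area A and planimetric area B to flow volume V. A minimal sketch (coefficients as published for lahars; the volumes are illustrative):

```python
def laharz_areas(volume_m3):
    """Iverson et al. (1998) lahar scaling: A = 0.05 V^(2/3), B = 200 V^(2/3)."""
    a = 0.05 * volume_m3 ** (2.0 / 3.0)   # inundated valley cross-section (m^2)
    b = 200.0 * volume_m3 ** (2.0 / 3.0)  # planimetric inundation area (m^2)
    return a, b

for v in (1e5, 1e6, 1e7):                 # illustrative lahar volumes
    a, b = laharz_areas(v)
    print(f"V = {v:.0e} m^3 -> A = {a:.0f} m^2, B = {b:.2e} m^2")
```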

  6. Patient safety room of horrors: a novel method to assess medical students and entering residents' ability to identify hazards of hospitalisation.

    PubMed

    Farnan, Jeanne M; Gaffney, Sean; Poston, Jason T; Slawinski, Kris; Cappaert, Melissa; Kamin, Barry; Arora, Vineet M

    2016-03-01

    Patient safety curricula in undergraduate medical education (UME) are often in a didactic format with little focus on skills training. Despite the recent focus on safety, practical training in residency education is also lacking. Assessments of safety skills in UME and graduate medical education (GME) are generally knowledge-focused rather than application-focused. We aimed to develop and pilot a safety-focused simulation with medical students and interns to assess knowledge regarding the hazards of hospitalisation. A simulation demonstrating common hospital-based safety threats was designed. A case scenario was created including salient patient information and simulated safety threats such as the use of upper-extremity restraints and medication errors. After entering the room and reviewing the mock chart, learners were timed and asked to identify and document as many safety hazards as possible. Learner satisfaction was assessed using a constructed-response evaluation. Descriptive statistics, including per cent correct and mean correct hazards, were calculated. All 86 third-year medical students completed the encounter. Some hazards were identified by a majority of students (fall risk, 83% of students) while others were rarely identified (absence of deep venous thrombosis prophylaxis, 13% of students). Only 5% of students correctly identified pressure ulcer risk. 128 of 131 interns, representing 49 medical schools, participated in the GME implementation. Incoming interns identified a mean of 5.1 hazards out of the 9 displayed (SD 1.4), with 40% identifying restraints as a hazard and 20% identifying the inappropriate urinary catheter as a hazard. A simulation showcasing safety hazards was a feasible and effective way to introduce trainees to safety-focused content. Both students and interns had difficulty identifying common hazards of hospitalisation. Despite poor performance, learners appreciated the interactive experience and its clinical utility.

  7. Declining vulnerability to river floods and the global benefits of adaptation

    NASA Astrophysics Data System (ADS)

    Jongman, Brenden; Winsemius, Hessel; Aerts, Jeroen; Coughlan de Perez, Erin; Van Aalst, Maarten; Kron, Wolfgang; Ward, Philip

    2016-04-01

    The global impacts of river floods are substantial and rising. Effective adaptation to the increasing risks requires an in-depth understanding of the physical and socioeconomic drivers of risk. Whilst the modeling of flood hazard and exposure has improved greatly, compelling evidence on spatiotemporal patterns in vulnerability of societies around the world is lacking. Hence, the effects of vulnerability on global flood risk are not fully understood, and future projections of fatalities and losses available today are based on simplistic assumptions or do not include vulnerability. In this study, we show that trends and fluctuations in vulnerability to river floods around the world can be estimated by dynamic high-resolution modeling of flood hazard and exposure. We show that fatalities and losses as a share of exposed population and gross domestic product are decreasing with rising income. We also show that there is a tendency of convergence in vulnerability levels between low- and high-income countries. Based on these findings, we simulate future flood impacts per country using traditional assumptions of static vulnerability through time, but also using future assumptions on reduced vulnerability in the future. We show that future risk increases can be largely contained using effective disaster risk reduction strategies, including a reduction of vulnerability. The study was carried out using the global flood risk model, GLOFRIS, combined with high-resolution time-series maps of hazard and exposure at the global scale. Based on: Jongman et al., 2015. Proceedings of the National Academy of Sciences of the United States of America, doi:10.1073/pnas.1414439112.

  8. An industry perspective on commercial radioactive waste disposal conditions and trends.

    PubMed

    Romano, Stephen A

    2006-11-01

    The United States is presently served by Class-A, -B and -C low-level radioactive waste and naturally-occurring and accelerator-produced radioactive material disposal sites in Washington and South Carolina; a Class-A and mixed waste disposal site in Utah that also accepts naturally-occurring radioactive material; and hazardous and solid waste facilities and uranium mill tailings sites that accept certain radioactive materials on a site-specific basis. The Washington site only accepts low-level radioactive waste from 11 western states due to interstate Compact restrictions on waste importation. The South Carolina site will be subject to geographic service area restrictions beginning 1 July 2008, after which only three states will have continued access. The Utah site dominates the commercial Class-A and mixed waste disposal market due to generally lower state fees than apply in South Carolina. To expand existing commercial services, an existing hazardous waste site in western Texas is seeking a Class-A, -B and -C and mixed waste disposal license. With that exception, no new Compact facilities are proposed. This fluid, uncertain situation has inspired national level rulemaking initiatives and policy studies, as well as alternative disposal practices for certain low-activity materials.

  9. 75 FR 35366 - Pipeline Safety: Applying Safety Regulation to All Rural Onshore Hazardous Liquid Low-Stress Lines

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-22

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration 49 CFR Part... Onshore Hazardous Liquid Low-Stress Lines AGENCY: Pipeline and Hazardous Materials Safety Administration... pipelines to perform a complete "could affect" analysis to determine which rural low-stress pipeline...

  10. Evaluation of low impact development approach for mitigating flood inundation at a watershed scale in China.

    PubMed

    Hu, Maochuan; Sayama, Takahiro; Zhang, Xingqi; Tanaka, Kenji; Takara, Kaoru; Yang, Hong

    2017-05-15

    Low impact development (LID) has attracted growing attention as an important approach for urban flood mitigation. Most studies evaluating LID performance for mitigating floods focus on the changes of peak flow and runoff volume. This paper assessed the performance of LID practices for mitigating flood inundation hazards as retrofitting technologies in an urbanized watershed in Nanjing, China. The findings indicate that LID practices are effective for flood inundation mitigation at the watershed scale, and especially for reducing inundated areas with a high flood hazard risk. Various scenarios of LID implementation levels can reduce total inundated areas by 2%-17% and areas with a high flood hazard level by 6%-80%. Permeable pavement shows better performance than rainwater harvesting in mitigating urban waterlogging. The most efficient scenario is combined rainwater harvesting on rooftops with a cistern capacity of 78.5 mm and permeable pavement installed on 75% of non-busy roads and other impervious surfaces. Inundation modeling is an effective approach to obtaining the information necessary to guide decision-making for designing LID practices at watershed scales.

  11. Comparisons of observed seasonal climate features with a winter and summer numerical simulation produced with the GLAS general circulation model

    NASA Technical Reports Server (NTRS)

    Halem, M.; Shukla, J.; Mintz, Y.; Wu, M. L.; Godbole, R.; Herman, G.; Sud, Y.

    1979-01-01

    Results are presented from numerical simulations performed with the general circulation model (GCM) for winter and summer. The monthly mean simulated fields for each integration are compared with observed geographical distributions and zonal averages. In general, the simulated sea level pressure and upper level geopotential height field agree well with the observations. Well simulated features are the winter Aleutian and Icelandic lows, the summer southwestern U.S. low, the summer and winter oceanic subtropical highs in both hemispheres, and the summer upper level Tibetan high and Atlantic ridge. The surface and upper air wind fields in the low latitudes are in good agreement with the observations. The geographical distributions of the Earth-atmosphere radiation balance and of the precipitation rates over the oceans are well simulated, but not all of the intensities of these features are correct. Other comparisons are shown for precipitation along the ITCZ, radiation balance, zonally averaged temperatures and zonal winds, and poleward transports of momentum and sensible heat.

  12. Increased Mortality Risk in Older Adults with Persistently Low or Declining Feelings of Usefulness to Others

    PubMed Central

    Gruenewald, Tara L.; Karlamangla, Arun S.; Greendale, Gail A.; Singer, Burton H.; Seeman, Teresa E.

    2009-01-01

    Objectives To determine if persistently low or declining feelings of usefulness to others in later life predicts increased mortality hazard in older adults. Methods Data on change in perceptions of usefulness, health, behavioral and psychosocial covariate factors, and mortality originate from the MacArthur Study of Successful Aging, a prospective study of 1,189 older adults (age 70–79 years). Results Older adults with persistently low feelings of usefulness or who experienced a decline to low feelings of usefulness over the first 3-years of the study experienced a greater hazard of mortality (sociodemographic adjusted HR = 1.75 (95% CI = 1.22 to 2.51)) over a subsequent 9-year follow-up as compared to older adults with persistently high feelings of usefulness. Discussion Older adults with persistently low perceived usefulness or feelings of usefulness that decline to a low level may be a vulnerable group with increased risk for poor health outcomes in later life. PMID:19104034
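
    Under the proportional-hazards assumption behind these estimates, a hazard ratio translates directly into survival curves via S1(t) = S0(t)^HR. A small sketch of what HR = 1.75 implies (the baseline survival value is illustrative, not from the study):

      # If the reference group (persistently high usefulness) has 9-year survival S0,
      # the low/declining-usefulness group has S0**HR under proportional hazards.
      S0 = 0.70          # illustrative 9-year survival in the reference group
      for HR in (1.22, 1.75, 2.51):   # point estimate and 95% CI bounds from the abstract
          print(HR, round(S0 ** HR, 3))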

  13. Mental health outcomes in US and UK military personnel returning from Iraq.

    PubMed

    Sundin, Josefin; Herrell, Richard K; Hoge, Charles W; Fear, Nicola T; Adler, Amy B; Greenberg, Neil; Riviere, Lyndon A; Thomas, Jeffrey L; Wessely, Simon; Bliese, Paul D

    2014-03-01

    Research on military personnel who deployed to the conflicts in Iraq or Afghanistan has suggested that there are differences in mental health outcomes between UK and US military personnel. To compare the prevalence of post-traumatic stress disorder (PTSD), hazardous alcohol consumption, aggressive behaviour and multiple physical symptoms in US and UK military personnel deployed to Iraq. Data were from one US (n = 1560) and one UK (n = 313) study of post-deployment military health of army personnel who had deployed to Iraq during 2007-2008. Analyses were stratified by high- and low-combat exposure. Significant differences in combat exposure and sociodemographics were observed between US and UK personnel; controlling for these variables accounted for the difference in prevalence of PTSD, but not in the total symptom level scores. Levels of hazardous alcohol consumption (low-combat exposure: odds ratio (OR) = 0.13, 95% CI 0.07-0.21; high-combat exposure: OR = 0.23, 95% CI 0.14-0.39) and aggression (low-combat exposure: OR = 0.36, 95% CI 0.19-0.68) were significantly lower in US compared with UK personnel. There was no difference in multiple physical symptoms. Differences in self-reported combat exposures explain most of the differences in reported prevalence of PTSD. Adjusting for self-reported combat exposures and sociodemographics did not explain differences in hazardous alcohol consumption or aggression.
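
    Odds ratios with Wald confidence intervals of the kind reported here can be computed from a 2x2 table. A minimal sketch (counts invented for illustration):

      # OR = (a*d)/(b*c); 95% CI = exp(ln(OR) +/- 1.96*sqrt(1/a + 1/b + 1/c + 1/d))
      import math

      def odds_ratio(a, b, c, d):  # a,b = outcome yes/no in group 1; c,d = in group 2
          or_ = (a * d) / (b * c)
          se = math.sqrt(1/a + 1/b + 1/c + 1/d)
          lo, hi = (math.exp(math.log(or_) + s * 1.96 * se) for s in (-1, 1))
          return or_, lo, hi

      print(odds_ratio(120, 1440, 90, 223))  # e.g., hazardous drinking, group 1 vs group 2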

  14. An early warning system for marine storm hazard mitigation

    NASA Astrophysics Data System (ADS)

    Vousdoukas, M. I.; Almeida, L. P.; Pacheco, A.; Ferreira, O.

    2012-04-01

    This contribution describes efforts towards the development of an operational Early Warning System for storm hazard prediction and mitigation. The system is a nested model train of specially calibrated Wave Watch III, SWAN and XBeach models. The numerical simulations provide daily forecasts of the hydrodynamic conditions, morphological change and overtopping risk at the area of interest. The model predictions are processed by a 'translation' module which is based on site-specific Storm Impact Indicators (SIIs) (Ciavola et al., 2011, Storm impacts along European coastlines. Part 2: lessons learned from the MICORE project, Environmental Science & Policy, Vol 14), and warnings are issued when pre-defined threshold values are exceeded. For the present site the selected SIIs were (i) the maximum wave run-up height during the simulations; and (ii) the dune-foot horizontal retreat at the end of the simulations. Both SIIs and pre-defined thresholds were carefully selected on the grounds of existing experience and field data. Four risk levels were considered, each associated with an intervention approach recommended to the responsible coastal protection authority. Regular updating of the topography/bathymetry is critical for the performance of the storm impact forecasting, especially when there are significant morphological changes. The system can be extended to other critical problems, like implications of global warming and adaptive management strategies, while the approach presently followed, from model calibration to the early warning system for storm hazard mitigation, can be applied to other sites worldwide, with minor adaptations.
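
    The 'translation' module described above reduces to comparing forecast SIIs against pre-defined thresholds and issuing one of four risk levels. A minimal sketch of that logic (threshold values hypothetical):

      # Map forecast Storm Impact Indicators to one of four warning levels.
      RUNUP_THRESH = [2.0, 3.0, 4.0]    # m, hypothetical thresholds for wave run-up height
      RETREAT_THRESH = [1.0, 3.0, 6.0]  # m, hypothetical thresholds for dune-foot retreat

      def level(value, thresholds):
          return sum(value >= t for t in thresholds)  # 0..3

      def warning(runup, retreat):
          lvl = max(level(runup, RUNUP_THRESH), level(retreat, RETREAT_THRESH))
          return ["green", "yellow", "orange", "red"][lvl]

      print(warning(runup=3.4, retreat=0.5))  # 'orange': run-up exceeds the 2nd threshold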

  15. Augmented Reality Cues and Elderly Driver Hazard Perception

    PubMed Central

    Schall, Mark C.; Rusch, Michelle L.; Lee, John D.; Dawson, Jeffrey D.; Thomas, Geb; Aksan, Nazan; Rizzo, Matthew

    2013-01-01

    Objective Evaluate the effectiveness of augmented reality (AR) cues in improving driving safety in elderly drivers who are at increased crash risk due to cognitive impairments. Background Cognitively challenging driving environments pose a particular crash risk for elderly drivers. AR cueing is a promising technology to mitigate risk by directing driver attention to roadway hazards. This study investigates whether AR cues improve or interfere with hazard perception in elderly drivers with age-related cognitive decline. Methods Twenty elderly licensed drivers (mean age = 73 years, SD = 5 years) with a range of cognitive abilities measured by a speed of processing (SOP) composite participated in a one-hour drive in an interactive, fixed-base driving simulator. Each participant drove through six straight, six-mile-long rural roadway scenarios following a lead vehicle. AR cues directed attention to potential roadside hazards in three of the scenarios, and the other three were uncued (baseline) drives. Effects of AR cueing were evaluated with respect to: 1) detection of hazardous target objects, 2) interference with detecting nonhazardous secondary objects, and 3) impairment in maintaining safe distance behind a lead vehicle. Results AR cueing improved the detection of hazardous target objects of low visibility. AR cues did not interfere with detection of nonhazardous secondary objects and did not impair ability to maintain safe distance behind a lead vehicle. SOP capacity did not moderate those effects. Conclusion AR cues show promise for improving elderly driver safety by increasing hazard detection likelihood without interfering with other driving tasks such as maintaining safe headway. PMID:23829037

  16. Earthquake and tsunami hazard in West Sumatra: integrating science, outreach, and local stakeholder needs

    NASA Astrophysics Data System (ADS)

    McCaughey, J.; Lubis, A. M.; Huang, Z.; Yao, Y.; Hill, E. M.; Eriksson, S.; Sieh, K.

    2012-04-01

    The Earth Observatory of Singapore (EOS) is building partnerships with local to provincial government agencies, NGOs, and educators in West Sumatra to inform their policymaking, disaster-risk-reduction, and education efforts. Geodetic and paleoseismic studies show that an earthquake as large as M 8.8 is likely sometime in the coming decades on the Mentawai patch of the Sunda megathrust. This earthquake and its tsunami would be devastating for the Mentawai Islands and neighboring areas of the western Sumatra coast. The low-lying coastal Sumatran city of Padang (pop. ~800,000) has been the object of many research and outreach efforts, especially since 2004. Padang experienced deadly earthquakes in 2007 and 2009 that, though tragedies in their own right, served also as wake-up calls for a larger earthquake to come. However, there remain significant barriers to linking science to policy: extant hazard information is sometimes contradictory or confusing for non-scientists, while turnover of agency leadership and staff means that, in the words of one local advocate, "we keep having to start from zero." Both better hazard knowledge and major infrastructure changes are necessary for risk reduction in Padang. In contrast, the small, isolated villages on the outlying Mentawai Islands have received relatively fewer outreach efforts, yet many villages have the potential for timely evacuation with existing infrastructure. Therefore, knowledge alone can go far toward risk reduction. The tragic October 2010 Mentawai tsunami has inspired further disaster-risk reduction work by local stakeholders. In both locations, we are engaging policymakers and local NGOs, providing science to help inform their work. Through outreach contacts, the Mentawai government requested that we produce the first-ever tsunami hazard map for their islands; this aligns well with scientific interests at EOS. We will work with the Mentawai government on the presentation and explanation of the hazard map, as well as assessment of its impact at the district and village levels. We are also providing science and teaching examples for an NGO-led program to integrate disaster-risk reduction into the Mentawai primary-school curriculum. We are working with our partners to develop a participatory monitoring scheme. Indicators will include the degree to which policy is informed by science, whether communities develop and publicise evacuation routes based on hazard mapping, whether and how frequently communities practice evacuation simulations, and whether hazard information is incorporated into school curricula.

  17. Impact of climate change on New York City's coastal flood hazard: Increasing flood heights from the preindustrial to 2300 CE.

    PubMed

    Garner, Andra J; Mann, Michael E; Emanuel, Kerry A; Kopp, Robert E; Lin, Ning; Alley, Richard B; Horton, Benjamin P; DeConto, Robert M; Donnelly, Jeffrey P; Pollard, David

    2017-11-07

    The flood hazard in New York City depends on both storm surges and rising sea levels. We combine modeled storm surges with probabilistic sea-level rise projections to assess future coastal inundation in New York City from the preindustrial era through 2300 CE. The storm surges are derived from large sets of synthetic tropical cyclones, downscaled from RCP8.5 simulations from three CMIP5 models. The sea-level rise projections account for potential partial collapse of the Antarctic ice sheet in assessing future coastal inundation. CMIP5 models indicate that there will be minimal change in storm-surge heights from 2010 to 2100 or 2300, because the predicted strengthening of the strongest storms will be compensated by storm tracks moving offshore at the latitude of New York City. However, projected sea-level rise causes overall flood heights associated with tropical cyclones in New York City in coming centuries to increase greatly compared with preindustrial or modern flood heights. For the various sea-level rise scenarios we consider, the 1-in-500-y flood event increases from 3.4 m above mean tidal level during 1970-2005 to 4.0-5.1 m above mean tidal level by 2080-2100 and ranges from 5.0-15.4 m above mean tidal level by 2280-2300. Further, we find that the return period of a 2.25-m flood has decreased from ∼500 y before 1800 to ∼25 y during 1970-2005 and further decreases to ∼5 y by 2030-2045 in 95% of our simulations. The 2.25-m flood height is permanently exceeded by 2280-2300 for scenarios that include Antarctica's potential partial collapse.
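
    The shrinking return period of the 2.25-m flood reported above follows directly from shifting the flood-height distribution upward by the sea-level rise. A minimal sketch with a Gumbel distribution (parameters illustrative, not the paper's fitted values):

      # Return period T = 1 / annual exceedance probability; raising sea level shifts
      # the whole flood-height distribution upward and shortens T for a fixed height.
      import math

      def return_period(height, mu, beta, slr=0.0):
          # Gumbel CDF for the annual maximum flood height above mean tidal level
          F = math.exp(-math.exp(-(height - mu - slr) / beta))
          return 1.0 / (1.0 - F)

      for slr in (0.0, 0.5, 1.0):  # metres of sea-level rise
          print(slr, round(return_period(2.25, mu=1.0, beta=0.25, slr=slr)))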

  20. Synergistic interactions between an upper-level jet streak and diabatic processes that influence the development of a low-level jet and a secondary coastal cyclone

    NASA Technical Reports Server (NTRS)

    Uccellini, Louis W.; Petersen, Ralph A.; Kocin, Paul J.; Brill, Keith F.; Tuccillo, James J.

    1987-01-01

    A series of numerical simulations of the February 1979 Presidents Day cyclone is presented. The development of the low-level jet (LLJ) associated with the cyclone is described, and the mesoscale numerical model, initial analyses, and experimental design used in the study are discussed. Four numerical simulations are discussed and compared, including an adiabatic simulation that isolates the development of upper-level divergence along the axis of a subtropical jet streak and three other simulations that reveal the contributions of sensible and latent heat release in modifying lower-tropospheric wind fields and reducing the sea-level pressure. The formation of the LLJ is described through an evaluation of trajectories derived from the various model simulations. The effect of the LLJ on secondary cyclogenesis along the East Coast is described.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Turner, D; Parsons, D; Geerts, B

    The Plains Elevated Convection at Night (PECAN) experiment is a large field campaign that is being supported by the National Science Foundation (NSF) with contributions from the National Oceanic and Atmospheric Administration (NOAA), the National Aeronautics and Space Administration (NASA), and the U.S. Department of Energy (DOE). The overarching goal of the PECAN experiment is to improve the understanding and simulation of the processes that initiate and maintain convection and convective precipitation at night over the central portion of the Great Plains region of the United States (Parsons et al. 2013). These goals are important because (1) a large fraction of the yearly precipitation in the Great Plains comes from nocturnal convection, (2) nocturnal convection in the Great Plains is most often decoupled from the ground and, thus, is forced by other phenomena aloft (e.g., propagating bores, frontal boundaries, low-level jets [LLJ], etc.), (3) there is a relative lack of understanding of how these disturbances initiate and maintain nocturnal convection, and (4) this lack of understanding greatly hampers the ability of numerical weather and climate models to simulate nocturnal convection well. This leads to significant uncertainties in predicting the onset, location, frequency, and intensity of convective cloud systems and associated weather hazards over the Great Plains.

  2. Influences of specific ions in groundwater on concrete degradation in subsurface engineered barrier system.

    PubMed

    Lin, Wen-Sheng; Liu, Chen-Wuing; Li, Ming-Hsu

    2016-01-01

    Many disposal concepts currently show that concrete is an effective confinement material used in engineered barrier systems (EBS) at a number of low-level radioactive waste (LLW) disposal sites. Cement-based materials have properties suited to the encapsulation, isolation, or retardation of a variety of hazardous contaminants. The reactive chemical transport model HYDROGEOCHEM 5.0 was applied to simulate the effect of hydrogeochemical processes on concrete barrier degradation in an EBS which has been proposed for use at the LLW disposal site in Taiwan. The simulated results indicated that the main processes responsible for concrete degradation are those induced by hydrogen ions, sulfate, and chloride. The EBS with the side ditch drainage system effectively discharges the infiltrated water and lowers the solute concentrations that may induce concrete degradation. The redox processes markedly influence the formation of the degradation materials. The reductive environment in the EBS reduces the formation of ettringite in concrete degradation processes. Moreover, the chemical conditions in the concrete barriers maintain an alkaline condition after 300 years in the proposed LLW repository. This study provides a detailed picture of the long-term evolution of the hydrogeochemical environment at the proposed LLW disposal site in Taiwan.

  3. A Study of Aircraft Fire Hazards Related to Natural Electrical Phenomena

    NASA Technical Reports Server (NTRS)

    Kester, Frank L.; Gerstein, Melvin; Plumer, J. A.

    1960-01-01

    The problems of natural electrical phenomena as a fire hazard to aircraft are evaluated. Assessment of the hazard is made over the range of low level electrical discharges, such as static sparks, to high level discharges, such as lightning strikes to aircraft. In addition, some fundamental work is presented on the problem of flame propagation in aircraft fuel vent systems. This study consists of a laboratory investigation in five parts: (1) a study of the ignition energies and flame propagation rates of kerosene-air and JP-6-air foams, (2) a study of the rate of flame propagation of n-heptane, n-octane, n-nonane, and n-decane in aircraft vent ducts, (3) a study of the damage to aluminum, titanium, and stainless steel aircraft skin materials by lightning strikes, (4) a study of fuel ignition by lightning strikes to aircraft skins, and (5) a study of lightning induced flame propagation in an aircraft vent system.

  4. 3D dynamic rupture simulation and local tomography studies following the 2010 Haiti earthquake

    NASA Astrophysics Data System (ADS)

    Douilly, Roby

    The 2010 M7.0 Haiti earthquake was the first major earthquake in southern Haiti in 250 years. As this event could represent the beginning of a new period of active seismicity in the region, and in consideration of how vulnerable the population is to earthquake damage, it is important to understand the nature of this event and how it has influenced seismic hazards in the region. Most significantly, the 2010 earthquake occurred on the secondary Leogâne thrust fault (two fault segments), not the Enriquillo fault, the major strike-slip fault in the region, despite it being only a few kilometers away. We first use a finite element model to simulate rupture along the Leogâne fault. We varied friction and background stress to investigate the conditions that best explain observed surface deformations and why the rupture did not jump to the nearby Enriquillo fault. Our model successfully replicated rupture propagation along the two segments of the Leogâne fault, and indicated that a significant stress increase occurred on top of and to the west of the Enriquillo fault. We also investigated the potential ground shaking level in this region if a rupture similar to the Mw 7.0 2010 Haiti earthquake were to occur on the Enriquillo fault. We used a finite element method and assumptions on regional stress to simulate low-frequency dynamic rupture propagation for the segment of the Enriquillo fault closest to the capital. The high-frequency ground motion components were calculated using the specific barrier model, and the hybrid synthetics were obtained by combining the low frequencies (<1 Hz) from the dynamic rupture simulation with the high frequencies (>1 Hz) from the stochastic simulation using matched filtering at a crossover frequency of 1 Hz. The average horizontal peak ground acceleration, computed at several sites of interest throughout Port-au-Prince (the capital), has a value of 0.35 g. Finally, we investigated the 3D local tomography of this region. We considered 897 high-quality records from the earthquake catalog as recorded by temporary station deployments. We only considered events that had at least 6 P and 6 S arrivals and an azimuthal gap of less than 180 degrees, to simultaneously invert for hypocenters and 3D velocity structure in southern Haiti. We used the program VELEST to define a minimum 1D velocity model, which was then used as a starting model in the computer algorithm SIMULPS14 to produce the 3D tomography. Our results show a pronounced low-velocity zone across the Leogâne fault, which is consistent with the sedimentary basin location on the geologic map. We also observe a southeastern low-velocity zone, which is consistent with a predefined structure in the morphology. Low-velocity structure usually correlates with broad zones of deformation, such as the presence of cracks or faults, or with the presence of fluid in the crust. This work provides information that can be used in future studies focusing on how changes in material properties can affect rupture propagation, which is useful for assessing the seismic hazard that Haiti and other regions are facing.
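
    The hybrid broadband combination described here (low frequencies from the deterministic rupture simulation, high frequencies from the stochastic simulation, crossover at 1 Hz) is commonly implemented with complementary low-/high-pass filters. A minimal sketch using SciPy; the filter order, sampling rate and stand-in signals are assumptions:

      # Combine deterministic low-frequency and stochastic high-frequency seismograms
      # at a 1 Hz crossover using zero-phase Butterworth filters.
      import numpy as np
      from scipy.signal import butter, filtfilt

      fs, fc = 50.0, 1.0                      # sampling rate (Hz), crossover frequency (Hz)
      t = np.arange(0, 40, 1 / fs)
      low_sim = np.sin(2 * np.pi * 0.2 * t)   # stand-in for the deterministic synthetic
      high_sim = np.random.randn(t.size)      # stand-in for the stochastic synthetic

      b_lo, a_lo = butter(4, fc / (fs / 2), btype="low")
      b_hi, a_hi = butter(4, fc / (fs / 2), btype="high")
      hybrid = filtfilt(b_lo, a_lo, low_sim) + filtfilt(b_hi, a_hi, high_sim)
      print(hybrid.shape)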

  5. Design reuse experience of space and hazardous operations robots

    NASA Technical Reports Server (NTRS)

    Oneil, P. Graham

    1994-01-01

    A comparison of design drivers for space and hazardous nuclear waste operating robots details similarities and differences in operations, performance and environmental parameters for these critical environments. The similarities are exploited to provide low risk system components based on reuse principles and design knowledge. Risk reduction techniques are used for bridging areas of significant differences. As an example, risk reduction of a new sensor design for nuclear environment operations is employed to provide upgradeable replacement units in a reusable architecture for significantly higher levels of radiation.

  6. Values of Flood Hazard Mapping for Disaster Risk Assessment and Communication

    NASA Astrophysics Data System (ADS)

    Sayama, T.; Takara, K. T.

    2015-12-01

    Flood plains provide tremendous benefits for human settlements. Since olden days people have lived with floods and attempted to control them when necessary. Modern engineering works such as embankments have enabled people to live even in flood-prone areas, and over time population and economic assets have concentrated there. In developing countries, too, rapid land-use change alters exposure and vulnerability to floods and consequently increases disaster risk. Flood hazard mapping is an essential step for any countermeasures. It serves various objectives, including raising residents' awareness, finding effective evacuation routes and estimating potential damages through flood risk mapping. Depending on the objectives and data availability, there are also many possible approaches to hazard mapping, including simulation-based, community-based and remote sensing-based methods. In addition to traditional paper-based hazard maps, Information and Communication Technology (ICT) enables more interactive hazard mapping, such as movable hazard maps that demonstrate scenario simulations for risk communication and real-time hazard mapping for effective disaster response and safe evacuation. This presentation first summarizes recent advancements in flood hazard mapping, focusing on Japanese experiences and other examples from Asian countries. It then introduces a flood simulation tool suitable for hazard mapping at the river basin scale even in data-limited regions. In the past few years, the tool has been used in practice by local officers responsible for disaster management in Asian countries. Through these training activities in hazard mapping and risk assessment, we conduct comparative analyses to identify the similarities and uniqueness of estimated economic damages depending on topographic and land use conditions.

  7. Simulating Wake Vortex Detection with the Sensivu Doppler Wind Lidar Simulator

    NASA Technical Reports Server (NTRS)

    Ramsey, Dan; Nguyen, Chi

    2014-01-01

    In support of NASA's Atmospheric Environment Safety Technologies NRA research topic on Wake Vortex Hazard Investigation, Aerospace Innovations (AI) investigated a set of techniques for detecting wake vortex hazards from arbitrary viewing angles, including axial perspectives. This technical report describes an approach to this problem and presents results from its implementation in a virtual lidar simulator developed at AI. Three-dimensional data volumes from NASA's Terminal Area Simulation System (TASS) containing strong turbulent vortices were used as the atmospheric domain for these studies, in addition to an analytical vortex model in 3-D space. By incorporating a third-party radiative transfer code (BACKSCAT 4), user-defined aerosol layers can be incorporated into atmospheric models, simulating attenuation and backscatter in different environmental conditions and altitudes. A hazard detection algorithm is described that uses a two-component spectral model to identify vortex signatures observable from arbitrary angles.
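
    The abstract mentions an analytical vortex model in 3-D space; one standard choice in wake-vortex work is the Lamb-Oseen profile, sketched below. The specific model used in this simulator is not stated, so this is an assumption, with illustrative parameter values:

      # Lamb-Oseen tangential velocity: v(r) = Gamma/(2*pi*r) * (1 - exp(-r^2/rc^2))
      import math

      def lamb_oseen(r, gamma=400.0, rc=3.0):  # circulation (m^2/s), core radius (m)
          if r == 0.0:
              return 0.0
          return gamma / (2 * math.pi * r) * (1 - math.exp(-(r / rc) ** 2))

      for r in (1.0, 3.0, 10.0, 30.0):   # radius from vortex centre, m
          print(r, round(lamb_oseen(r), 2))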

  8. Dynamic modelling of a forward osmosis-nanofiltration integrated process for treating hazardous wastewater.

    PubMed

    Pal, Parimal; Das, Pallabi; Chakrabortty, Sankha; Thakura, Ritwik

    2016-11-01

    Dynamic modelling and simulation of a complete integrated nanofiltration-forward osmosis system was done along with economic evaluation to pave the way for scale-up of such a system for treating hazardous pharmaceutical wastes. The system, operated in a closed loop, not only protects surface water from the onslaught of hazardous industrial wastewater but also saves on the cost of fresh water by turning wastewater recyclable at an affordable price. The success of the dynamic modelling in capturing the relevant transport phenomena is well reflected in the high overall correlation coefficient (R² > 0.98), low relative error (<0.1) and high Willmott d-index (>0.95). The system could remove more than 97.5% chemical oxygen demand (COD) from real pharmaceutical wastewater having an initial COD value as high as 3500 mg/L while ensuring operation of the forward osmosis loop at a reasonably high flux of 56-58 litres per square metre per hour.
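
    The goodness-of-fit measures cited (R², relative error, Willmott's index of agreement d) are easily computed from observed and predicted series. A minimal sketch of the Willmott index, with illustrative data:

      # Willmott d = 1 - sum((O-P)^2) / sum((|P-Obar| + |O-Obar|)^2); d -> 1 is a perfect fit.
      import numpy as np

      def willmott_d(obs, pred):
          obs, pred = np.asarray(obs, float), np.asarray(pred, float)
          denom = np.sum((np.abs(pred - obs.mean()) + np.abs(obs - obs.mean())) ** 2)
          return 1.0 - np.sum((obs - pred) ** 2) / denom

      obs = [3500, 2100, 900, 400, 120]    # illustrative COD values, mg/L
      pred = [3420, 2180, 860, 430, 110]
      print(round(willmott_d(obs, pred), 3))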

  9. Annual waste reduction activities report. Issue 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1991-03-18

    This report discusses the waste minimization activities for the Pinellas Plant. The Pinellas Plant deals with low-level radioactive wastes, solvents, scrap metals and various other hazardous materials. This program has realized cost savings through recycling and reuse of materials.

  10. 30 CFR 77.1304 - Blasting agents; special provisions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... connection with pneumatic loading machines shall be of the semiconductive type, having a total resistance low... electric currents to a safe level. Wire-countered hose shall not be used because of the potential hazard from stray electric currents. ...

  11. 30 CFR 77.1304 - Blasting agents; special provisions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... connection with pneumatic loading machines shall be of the semiconductive type, having a total resistance low... electric currents to a safe level. Wire-countered hose shall not be used because of the potential hazard from stray electric currents. ...

  12. Deadly heat waves projected in the densely populated agricultural regions of South Asia.

    PubMed

    Im, Eun-Soon; Pal, Jeremy S; Eltahir, Elfatih A B

    2017-08-01

    The risk associated with any climate change impact reflects intensity of natural hazard and level of human vulnerability. Previous work has shown that a wet-bulb temperature of 35°C can be considered an upper limit on human survivability. On the basis of an ensemble of high-resolution climate change simulations, we project that extremes of wet-bulb temperature in South Asia are likely to approach and, in a few locations, exceed this critical threshold by the late 21st century under the business-as-usual scenario of future greenhouse gas emissions. The most intense hazard from extreme future heat waves is concentrated around densely populated agricultural regions of the Ganges and Indus river basins. Climate change, without mitigation, presents a serious and unique risk in South Asia, a region inhabited by about one-fifth of the global human population, due to an unprecedented combination of severe natural hazard and acute vulnerability.
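
    A convenient way to explore the 35°C wet-bulb survivability threshold discussed above is Stull's (2011) empirical formula for wet-bulb temperature from air temperature and relative humidity at standard sea-level pressure; the study itself uses climate model output, not this approximation:

      # Stull (2011) wet-bulb approximation, valid near sea-level pressure.
      import math

      def wet_bulb(T, RH):  # T in deg C, RH in percent
          return (T * math.atan(0.151977 * math.sqrt(RH + 8.313659))
                  + math.atan(T + RH) - math.atan(RH - 1.676331)
                  + 0.00391838 * RH ** 1.5 * math.atan(0.023101 * RH)
                  - 4.686035)

      print(round(wet_bulb(40.0, 60.0), 1))  # approaches the ~35 deg C survivability limit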

  13. Use of a modified GreenScreen tool to conduct a screening-level comparative hazard assessment of conventional silver and two forms of nanosilver.

    PubMed

    Sass, Jennifer; Heine, Lauren; Hwang, Nina

    2016-11-08

    Increased concern for potential health and environmental impacts of chemicals, including nanomaterials, in consumer products is driving demand for greater transparency regarding potential risks. Chemical hazard assessment is a powerful tool to inform product design, development and procurement and has been integrated into alternative assessment frameworks. The extent to which assessment methods originally designed for conventionally-sized materials can be used for nanomaterials, which have size-dependent physical and chemical properties, has not been well established. We contracted with a certified GreenScreen profiler to conduct three GreenScreen hazard assessments for conventional silver and two forms of nanosilver. The contractor summarized publicly available literature, and used defined GreenScreen hazard criteria and expert judgment to assign and report hazard classification levels, along with indications of confidence in those assignments. Where data were not available, a data gap (DG) was assigned. Using the individual endpoint scores, an aggregated benchmark score (BM) was applied. Conventional silver and low-soluble nanosilver were assigned the highest possible hazard score, and a silica-silver nanocomposite called AGS-20 could not be scored due to data gaps. AGS-20 is approved for use as an antimicrobial by the US Environmental Protection Agency. An existing method for chemical hazard assessment and communication can be used, with minor adaptations, to compare hazards across conventional and nano forms of a substance. The differences in data gaps and in hazard profiles support the argument that each silver form should be considered unique and subjected to hazard assessment to inform regulatory decisions and decisions about product design and development. A critical limitation of hazard assessments for nanomaterials is the lack of nano-specific hazard data; where data are available, we demonstrate that existing hazard assessment systems can work. The work is relevant for risk assessors and regulators. We recommend that regulatory agencies and others require more robust data sets on each novel nanomaterial before granting market approval.

  14. Combined fluvial and pluvial urban flood hazard analysis: concept development and application to Can Tho city, Mekong Delta, Vietnam

    NASA Astrophysics Data System (ADS)

    Apel, Heiko; Martínez Trepat, Oriol; Nghia Hung, Nguyen; Thi Chinh, Do; Merz, Bruno; Viet Dung, Nguyen

    2016-04-01

    Many urban areas experience both fluvial and pluvial floods, because locations next to rivers are preferred settlement areas and the predominantly sealed urban surface prevents infiltration and facilitates surface inundation. The latter problem is enhanced in cities with insufficient or non-existent sewer systems. While there are a number of approaches to analyse either a fluvial or pluvial flood hazard, studies of a combined fluvial and pluvial flood hazard are hardly available. Thus this study aims to analyse a fluvial and a pluvial flood hazard individually, but also to develop a method for the analysis of a combined pluvial and fluvial flood hazard. This combined fluvial-pluvial flood hazard analysis is performed taking Can Tho city, the largest city in the Vietnamese part of the Mekong Delta, as an example. In this tropical environment the annual monsoon triggered floods of the Mekong River, which can coincide with heavy local convective precipitation events, causing both fluvial and pluvial flooding at the same time. The fluvial flood hazard was estimated with a copula-based bivariate extreme value statistic for the gauge Kratie at the upper boundary of the Mekong Delta and a large-scale hydrodynamic model of the Mekong Delta. This provided the boundaries for 2-dimensional hydrodynamic inundation simulation for Can Tho city. The pluvial hazard was estimated by a peak-over-threshold frequency estimation based on local rain gauge data and a stochastic rainstorm generator. Inundation for all flood scenarios was simulated by a 2-dimensional hydrodynamic model implemented on a Graphics Processing Unit (GPU) for time-efficient flood propagation modelling. The combined fluvial-pluvial flood scenarios were derived by adding rainstorms to the fluvial flood events during the highest fluvial water levels. The probabilities of occurrence of the combined events were determined assuming independence of the two flood types and taking the seasonality and probability of coincidence into account. All hazards - fluvial, pluvial and combined - were accompanied by an uncertainty estimation taking into account the natural variability of the flood events. This resulted in probabilistic flood hazard maps showing the maximum inundation depths for a selected set of probabilities of occurrence, with maps showing the expectation (median) and the uncertainty by percentile maps. The results are critically discussed and their usage in flood risk management are outlined.
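
    Under the independence assumption described above, the probability of a combined event is the product of the individual occurrence probabilities, weighted by the probability that the two hazards coincide in time. A minimal sketch (all numbers illustrative):

      # Annual probability of a combined fluvial-pluvial event under independence,
      # scaled by the chance the rainstorm falls within the high-water window.
      def combined_probability(p_fluvial, p_pluvial, coincidence_window_days, season_days):
          p_coincide = coincidence_window_days / season_days
          return p_fluvial * p_pluvial * p_coincide

      # e.g. 10-year fluvial event, 10-year rainstorm, 30-day peak-water window in a 120-day season
      print(combined_probability(0.1, 0.1, 30, 120))  # 0.0025 -> ~400-year joint event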

  15. New Orleans After Hurricane Katrina: An Unnatural Disaster?

    NASA Astrophysics Data System (ADS)

    McNamara, D.; Werner, B.; Kelso, A.

    2005-12-01

    Motivated by destruction in New Orleans following hurricane Katrina, we use a numerical model to explore how natural processes, economic development, hazard mitigation measures and policy decisions intertwine to produce long periods of quiescence punctuated by disasters of increasing magnitude. Physical, economic and policy dynamics are modeled on a grid representing the subsiding Mississippi Delta region surrounding New Orleans. Water flow and resulting sediment erosion and deposition are simulated in response to prescribed river floods and storms. Economic development operates on a limited number of commodities and services such as agricultural products, oil and chemical industries and port services, with investment and employment responding to both local conditions and global constraints. Development permitting, artificial levee construction and pumping are implemented by policy agents who weigh predicted economic benefits (tax revenue), mitigation costs and potential hazards. Economic risk is reduced by a combination of private insurance, federal flood insurance and disaster relief. With this model, we simulate the initiation and growth of New Orleans coupled with an increasing level of protection from a series of flooding events. Hazard mitigation filters out small magnitude events, but terrain and hydrological modifications amplify the impact of large events. In our model, "natural disasters" are the inevitable outcome of the mismatch between policy based on short-time-scale economic calculations and stochastic forcing by infrequent, high-magnitude flooding events. A comparison of the hazard mitigation response to river- and hurricane-induced flooding will be discussed. Supported by NSF Geology and Paleontology and the Andrew W Mellon Foundation.

  16. A GIS-based methodology for the estimation of potential volcanic damage and its application to Tenerife Island, Spain

    NASA Astrophysics Data System (ADS)

    Scaini, C.; Felpeto, A.; Martí, J.; Carniel, R.

    2014-05-01

    This paper presents a GIS-based methodology to estimate damages produced by volcanic eruptions. The methodology comprises four parts: definition and simulation of eruptive scenarios, exposure analysis, vulnerability assessment and estimation of expected damages. Multi-hazard eruptive scenarios are defined for the Teide-Pico Viejo active volcanic complex, and simulated through the VORIS tool. The exposure analysis identifies the elements exposed to the hazard at stake and focuses on the relevant assets for the study area. The vulnerability analysis is based on previous studies on the built environment and complemented with the analysis of transportation and urban infrastructures. Damage assessment is performed by associating a qualitative damage rating with each combination of hazard and vulnerability. This operation consists of a GIS-based overlay, performed for each hazardous phenomenon considered and for each element. The methodology is then automated into a GIS-based tool using an ArcGIS® program. Given the eruptive scenarios and the characteristics of the exposed elements, the tool produces expected damage maps. The tool is applied to the Icod Valley (north of Tenerife Island), which is likely to be affected by volcanic phenomena in the event of an eruption from either the Teide-Pico Viejo volcanic complex or the North-West basaltic rift. Results are thematic maps of vulnerability and damage that can be displayed at different levels of detail, depending on the user preferences. The aim of the tool is to facilitate territorial planning and risk management in active volcanic areas.
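
    The damage-assessment step, a qualitative rating for each combination of hazard and vulnerability, amounts to a lookup-table overlay. A minimal sketch of that logic (ratings and levels hypothetical):

      # Qualitative damage rating from hazard intensity and element vulnerability.
      DAMAGE = {  # (hazard_level, vulnerability_level) -> damage rating
          ("low", "low"): "slight",      ("low", "high"): "moderate",
          ("high", "low"): "moderate",   ("high", "high"): "severe",
      }

      def damage_map(hazard_cells, vulnerability_cells):
          # one rating per raster cell / exposed element
          return [DAMAGE[(h, v)] for h, v in zip(hazard_cells, vulnerability_cells)]

      print(damage_map(["low", "high", "high"], ["high", "low", "high"]))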

  17. Simulating Scenario Floods for Hazard Assessment on the Lower Bicol Floodplain, the Philippines

    NASA Astrophysics Data System (ADS)

    Usamah, Muhibuddin Bin; Alkema, Dinand

    This paper describes the first results from a study of the behavior of floods in the lower Bicol area, the Philippines. A 1D2D dynamic hydraulic model was applied to simulate a set of scenario floods through the complex topography of the city of Naga and the surrounding area. The simulation results are integrated into a multi-parameter hazard zonation for the five scenario floods.

  18. Tsunami hazard in French Polynesia: contributions from numerical simulation (L'aléa tsunami en Polynésie française : apports de la simulation numérique)

    NASA Astrophysics Data System (ADS)

    Sladen, Anthony; Hébert, Hélène; Schindelé, François; Reymond, Dominique

    2007-04-01

    French Polynesia is frequently struck by transoceanic tsunamis originating from around the Pacific. Numerical modelling of five scenarios, defined among threatening source areas, has been performed for seven Polynesian sites. The results show that the Marquesan bays are consistently the most affected, while the sites in Tahiti and Rurutu are significantly, though less heavily, exposed. The tsunami hazard has then been mapped for the whole of Polynesia. Major tsunamis are expected to hit the Marquesas and, less frequently, Rurutu (Australes). An elevated hazard level is defined for the other Australes and for several Society Islands (especially Tahiti). The Tuamotu atolls and the other Society Islands are only moderately exposed.

  19. Tsunami risk zoning in south-central Chile

    NASA Astrophysics Data System (ADS)

    Lagos, M.

    2010-12-01

    The recent 2010 Chilean tsunami revealed the need to optimize methodologies for assessing the risk of disaster. In this context, modern techniques and criteria for evaluating the tsunami phenomenon were applied to the coastal zone of south-central Chile as a specific methodology for the zoning of tsunami risk. This methodology allows the identification and validation of a tsunami hazard scenario; the spatialization of factors that have an impact on the risk; and the zoning of the tsunami risk. For the hazard evaluation, different scenarios were modeled by means of numerical simulation techniques, selecting and validating the results that best fit the observed tsunami data. Hydrodynamic parameters of the inundation as well as physical and socioeconomic vulnerability aspects were considered for the spatialization of the factors that affect the tsunami risk. The tsunami risk zoning was integrated into a Geographic Information System (GIS) by means of multicriteria evaluation (MCE). The results of the tsunami risk zoning show that the local characteristics and their location, together with the concentration of poverty levels, establish spatially differentiated risk levels. This information builds the basis for future applied studies in land use planning that tend to minimize the risk levels associated with the tsunami hazard. This research is supported by Fondecyt 11090210.

  20. The theoretical simulation on electrostatic distribution of 1st proximity region in proximity focusing low-light-level image intensifier

    NASA Astrophysics Data System (ADS)

    Zhang, Liandong; Bai, Xiaofeng; Song, De; Fu, Shencheng; Li, Ye; Duanmu, Qingduo

    2015-03-01

    Low-light-level night vision technology amplifies a low-light signal, using photons and photoelectrons as the information carrier, until it is bright enough to be seen by the naked eye. The invention of the micro-channel plate made high-performance, miniaturized low-light-level night vision devices possible. One such device is the double-proximity-focusing low-light-level image intensifier, which places a micro-channel plate close to both the photocathode and the phosphor screen. The advantages of proximity focusing are small size, light weight, low power consumption, freedom from distortion, fast response and wide dynamic range. The micro-channel plate (with metal electrodes on both faces), the photocathode and the phosphor screen are mounted parallel to one another. When the image intensifier operates, a voltage is applied between the photocathode and the input face of the micro-channel plate. Photoelectrons emitted from the photocathode move toward the micro-channel plate under the electric field of the 1st proximity focusing region and are then multiplied inside the micro-channels. Once the distribution of the electrostatic field and its equipotential lines in the 1st proximity focusing region is known, the trajectories of the emitted electrons can be calculated and simulated, and the resolution of the image tube determined. The electrostatic field and equipotential-line distributions are complex, however, because of the many micro-channels in the plate. This paper simulates the electrostatic distribution of the 1st proximity region in a double-proximity-focusing low-light-level image intensifier with the finite-element simulation software Ansoft Maxwell 3D, and compares the field distributions of the 1st proximity region as the micro-channel plate's pore size, spacing and inclination angle are varied. We believe the electron-beam trajectories in the 1st proximity region will be better simulated once these electrostatic fields are known.
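
    Between nearly parallel plates, the 1st proximity region is, to first order, a uniform field, so an emitted electron's transit can be integrated in closed form; the finite-element fields above refine this picture near the channel openings. A minimal sketch of the underlying kinematics (voltage, gap and emission speed illustrative):

      # Electron transit across the 1st proximity gap, treated as a uniform field E = V/d.
      import math

      E_CHARGE, E_MASS = 1.602e-19, 9.109e-31   # C, kg

      def transit(V=200.0, d=0.2e-3, v_radial=3.0e5):
          # V: cathode-to-MCP voltage; d: gap width (m); v_radial: initial transverse speed (m/s)
          a = E_CHARGE * V / (E_MASS * d)       # axial acceleration
          t = math.sqrt(2 * d / a)              # transit time from rest (axially)
          return t, v_radial * t                # transit time, radial spread at the MCP

      t, spread = transit()
      print(f"{t:.2e} s, radial spread {spread*1e6:.1f} um")  # spread limits resolution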

  1. The effect of natural disturbances on the risk from hydrogeomorphic hazards under climate change

    NASA Astrophysics Data System (ADS)

    Scheidl, Christian; Thaler, Thomas; Seidl, Rupert; Rammer, Werner; Kohl, Bernhard; Markart, Gerhard

    2017-04-01

    Recent storm events in Austria show once more how floods, sediment transport processes and debris flows constitute a major threat in alpine regions with a high population density and increasing spatial development. As protection forests exert a major control on runoff and erosion, they directly affect the risk from such hydrogeomorphic processes. However, research on future climate conditions, with an expected increase of the global average surface temperature of 3-5°C by 2100 compared to the first decade of the 20th century, raises a number of open questions for sustainable and improved hazard management in mountain forests. For Europe, for instance, a climate-induced increase in forest disturbances such as wildfire, wind and insect outbreaks is highly likely in the coming decades. Especially in protection forests, future scenarios of such climate-induced natural disturbances and their impact on the protective effect remain an unresolved issue. Combining methods from forestry, hydrology and geotechnical engineering, our project uses an integral approach to simulate possible effects of natural disturbances on hydrogeomorphic hazards in the perspective of future protection forest development. With the individual-based forest landscape and disturbance model (iLand) we conduct an ensemble of forest landscape simulations, assessing the impact of future changes in natural disturbance regimes in four selected torrential catchments. These catchments are situated in two different forest growth areas. Drainage rate simulations are based on the conceptual hydrological model ZEMOKOST, whereas the effects of forest disturbances on hillslope erosion processes are simulated with the Distributed Hydrology Soil Vegetation Model (DHSVM). Besides process-based simulations, we also aim to identify the risk perception and adaptive capacity needed to mitigate a probable loss of the protective function of forests. For this reason, a postal survey among forestry actors will be performed to assess forest managers' concern about, and willingness to engage in, natural hazards management, in contrast to the roles of their social network and of political/administrative representatives. We will compare these perceived roles along the dimensions of efficacy, attribution of responsibility and trust. This theory-driven approach highlights the motivational structure underlying the willingness to participate in natural hazards initiatives, and allows policy implications to be tailored to the needs and capacities of distinct target groups. The outcomes of the investigations shall contribute to the development of adaptive management strategies for forestry administrations at all political levels, to mitigate negative effects of climate change on protection forests.

  2. Probabilistic wind/tornado/missile analyses for hazard and fragility evaluations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, Y.J.; Reich, M.

    Detailed analysis procedures and examples are presented for the probabilistic evaluation of hazard and fragility against high wind, tornado, and tornado-generated missiles. In the tornado hazard analysis, existing risk models are modified to incorporate various uncertainties including modeling errors. A significant feature of this paper is the detailed description of the Monte Carlo simulation analyses of tornado-generated missiles. A simulation procedure, which includes the wind field modeling, missile injection, solution of flight equations, and missile impact analysis, is described with application examples.
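
    A Monte Carlo missile analysis of the kind outlined chains random sampling through each stage. A heavily simplified sketch, with distributions and parameters invented for illustration and the flight-equation solve reduced to a single drag-driven pickup factor:

      # Toy Monte Carlo: sample tornado wind speed, sample a missile, estimate the
      # probability that its impact speed exceeds a damaging threshold.
      import random

      def impact_speed(wind, mass, area, cd=1.2, rho=1.2):
          # crude aerodynamic pickup: the missile reaches a fraction of wind speed
          # scaled by its ballistic coefficient (illustrative, not a flight-equation solve)
          beta = cd * rho * area / mass
          return wind * min(1.0, beta * 10.0)

      def exceedance_probability(threshold=30.0, n=100_000, seed=1):
          rng = random.Random(seed)
          hits = 0
          for _ in range(n):
              wind = rng.weibullvariate(45.0, 2.0)   # tornado wind speed, m/s
              mass = rng.uniform(2.0, 100.0)         # missile mass, kg
              area = rng.uniform(0.01, 0.5)          # frontal area, m^2
              hits += impact_speed(wind, mass, area) > threshold
          return hits / n

      print(exceedance_probability())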

  3. The effects of momentary visual disruption on hazard anticipation and awareness in driving.

    PubMed

    Borowsky, Avinoam; Horrey, William J; Liang, Yulan; Garabet, Angela; Simmons, Lucinda; Fisher, Donald L

    2015-01-01

    Driver distraction is known to increase crash risk, especially when a driver glances inside the vehicle for long periods of time. Though it is clear that such glances increase the risk for the driver when looking inside the vehicle, it is less clear how they disrupt the ongoing processing of information outside the vehicle once the driver's eyes return to the road. The present study was aimed at exploring the effect of in-vehicle glances on the top-down processes that guide the detection and monitoring of hazards on the forward roadway. Using a driving simulator, 12 participants were monitored with an eye-tracking system while they navigated various hazardous scenarios. Six participants were momentarily interrupted by a visual secondary task (simulating a glance inside the vehicle) prior to the occurrence of a potential hazard and 6 were not. Eye movement analyses showed that interrupted drivers often failed to continue scanning for a potential hazard when their forward view reappeared, especially when the potential threat could not easily be localized. Additionally, drivers' self-appraisal of workload and performance of the driving task indicated that, contrary to what one might expect, drivers in the interruption condition reported lower workload than, and performance equal to, drivers in the no-interruption condition. Drivers who are momentarily disrupted even for a brief duration are at risk of missing important information when they return their gaze to the forward roadway. In addition, because they are not aware of missing this information, they are likely to continue engaging in in-vehicle tasks even though this is demonstrably unsafe. The implications for safety, calibration, and targeted remediation are discussed.

  4. Developing a smartphone software package for predicting atmospheric pollutant concentrations at mobile locations.

    PubMed

    Larkin, Andrew; Williams, David E; Kile, Molly L; Baird, William M

    2015-06-01

    There is considerable evidence that exposure to air pollution is harmful to health. In the U.S., ambient air quality is monitored by Federal and State agencies for regulatory purposes. There are limited options, however, for people to access this data in real time, which hinders an individual's ability to manage their own risks. This paper describes a new software package that models environmental concentrations of fine particulate matter (PM2.5), coarse particulate matter (PM10), and ozone for the state of Oregon and calculates personal health risks at the smartphone's current location. Predicted air pollution risk levels can be displayed on mobile devices as interactive maps and graphs color-coded to coincide with EPA air quality index (AQI) categories. Users can set air quality warning levels via color-coded bars and are notified whenever warning levels are exceeded by predicted levels within 10 km. We validated the software using data from participants as well as from simulations, which showed that the application was capable of identifying spatial and temporal air quality trends. This unique application provides a potential low-cost technology for reducing personal exposure to air pollution, which can improve quality of life, particularly for people with health conditions, such as asthma, that make them more susceptible to these hazards.
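
    The EPA AQI used for the color coding above is a piecewise-linear mapping of concentration onto index breakpoints. A minimal sketch for PM2.5; breakpoints follow the 2012 EPA PM2.5 table and should be verified against current regulations before reuse:

      # AQI = (I_hi - I_lo) / (C_hi - C_lo) * (C - C_lo) + I_lo within the matching band.
      PM25_BANDS = [  # (C_lo, C_hi, I_lo, I_hi, category), C in ug/m^3
          (0.0, 12.0, 0, 50, "Good"),
          (12.1, 35.4, 51, 100, "Moderate"),
          (35.5, 55.4, 101, 150, "Unhealthy for Sensitive Groups"),
          (55.5, 150.4, 151, 200, "Unhealthy"),
          (150.5, 250.4, 201, 300, "Very Unhealthy"),
          (250.5, 500.4, 301, 500, "Hazardous"),
      ]

      def pm25_aqi(c):
          for c_lo, c_hi, i_lo, i_hi, cat in PM25_BANDS:
              if c_lo <= c <= c_hi:
                  return round((i_hi - i_lo) / (c_hi - c_lo) * (c - c_lo) + i_lo), cat
          raise ValueError("concentration out of range")

      print(pm25_aqi(40.0))  # (112, 'Unhealthy for Sensitive Groups')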

  5. Simulation of hydrologic influences on wetland ecosystem succession. Master's thesis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pompilio, R.A.

    1994-09-01

    This research focuses on the development of a simulation model to determine the effects of hydrological influences on a wetland ecosystem. The model allows perturbations to the inputs of various wetland data which, in turn, influence the successional development of the ecosystem. This research consisted of converting a grassland ecosystem model to one which simulates wetland conditions. The critical factor in determining the success of wetland creation is the hydrology of the system. Four areas of the original model are affected by the hydrology. The model measures the health or success of the ecosystem through measurement of the system's gross plant production, respiration, and net primary production of biomass. Altering the auxiliary variables of water level and rate of flow through the system explicitly details the effects of hydrologic influences on those production rates. Ten case tests depicting exogenous perturbations of the hydrology were run to identify these effects. Although the tests dealt with the fluctuation of water through the system, any one of the auxiliary variables in the model could be changed to reflect site-specific data.
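
    The production bookkeeping described above reduces to net primary production (NPP) = gross plant production (GPP) minus respiration, modulated by the hydrologic auxiliary variables. The sketch below illustrates only that accounting; the unimodal water-level response and all constants are hypothetical placeholders, not the thesis's actual equations.

        # Hypothetical sketch: NPP = GPP - respiration, with GPP scaled by a
        # placeholder water-level response (optimum and units assumed).
        def npp(gpp_max, respiration, water_level, optimal_level=0.5):
            stress = max(0.0, 1.0 - abs(water_level - optimal_level))
            gpp = gpp_max * stress  # production falls off away from the optimum
            return gpp - respiration

        # Perturb the water level, as in the thesis's exogenous test cases:
        for level in (0.2, 0.5, 0.9):
            print(level, npp(gpp_max=10.0, respiration=3.0, water_level=level))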

  6. Long-term mesalamine maintenance in ulcerative colitis: which is more important? Adherence or daily dose.

    PubMed

    Khan, Nabeel; Abbas, Ali M; Koleva, Yordanka N; Bazzano, Lydia A

    2013-05-01

    There are limited data on the long-term follow-up of patients with ulcerative colitis (UC) maintained on high versus low doses of mesalamine. We evaluated the best long-term average daily dose that would keep the disease in remission. Nationwide ulcerative colitis data were obtained from the Veterans Affairs health care system for the period 2001 to 2011. Those who started mesalamine maintenance during this period were included. Average daily dose and the level of adherence were assessed for the period between the first mesalamine dispensing and the date of the first flare, defined as the first filling of 40 mg/day or more of oral prednisone or any dose of intravenous steroids. Patients with ulcerative colitis maintained on an average daily dose of 2.4 to 2.8 g/day (low dose) were compared with those on 4.4 to 4.8 g/day (high dose). Adherence was assessed using the continuous single-interval medication availability indicator. We included 4452 patients with a median follow-up of 6 years. There was no significant reduction in the risk of flares when comparing high versus low average mesalamine dose among patients with high (hazard ratio = 0.96, P = 0.8) and medium (hazard ratio = 0.74, P = 0.17) adherence. However, there was a significant reduction in the risk of flares with a high dose of mesalamine among patients with low adherence (hazard ratio = 0.28, P = 0.003). Our data show that when starting a patient on mesalamine, there is no difference in the long-term flare risk between low and high average daily dose as long as the patient has a high to moderate level of adherence.

  7. Experimental outgassing of toxic chemicals to simulate the characteristics of hazards tainting globally shipped products.

    PubMed

    Budnik, Lygia Therese; Austel, Nadine; Gadau, Sabrina; Kloth, Stefan; Schubert, Jens; Jungnickel, Harald; Luch, Andreas

    2017-01-01

    Ambient monitoring analyses may identify potential new public health hazards such as residual levels of fumigants and industrial chemicals off-gassing from products and goods shipped globally. We analyzed container air with gas chromatography coupled to mass spectrometry (TD-2D-GC-MS/FPD) and assessed whether the concentrations of the volatiles benzene and 1,2-dichloroethane exceeded recommended exposure limits (REL). Products were taken from transport containers and analyzed for outgassing of volatiles. Furthermore, experimental outgassing was performed on packaging materials and textiles to simulate the hazards tainting globally shipped goods. The mean amounts of benzene in analyzed container air were 698-fold higher, and those of ethylene dichloride 4.5-fold higher, than the corresponding REL. More than 90% of all containers carried toluene residues higher than its REL. For 1,2-dichloroethane, 53% of containers transporting shoes exceeded the REL. In standardized experimental fumigation of various products, outgassing of 1,2-dichloroethane under controlled laboratory conditions took up to several months. Globally produced and transported products tainted with toxic industrial chemicals may contribute to the mixture of volatiles in indoor air, as they are likely to emit for a long period. These results need to be taken into account for further evaluation of safety standards applying to workers and consumers.

  8. Health Issues: Do Cell Phones Pose a Health Hazard?

    MedlinePlus

    ... confused with the effects from other types of electromagnetic energy. Very high levels of electromagnetic energy, such as is found in X-rays ... light, infrared radiation (heat) and other forms of electromagnetic radiation with relatively low frequencies. While RF energy ...

  9. A comparative study of ground motion hybrid simulations and the modified NGA ground motion predictive equations for directivity and its application to the Marmara Sea region (Turkey)

    NASA Astrophysics Data System (ADS)

    Pischiutta, M.; Akinci, A.; Spagnuolo, E.; Taroni, M.; Herrero, A.; Aochi, H.

    2016-12-01

    We have simulated strong ground motions for two Mw > 7.0 rupture scenarios on the North Anatolian Fault, in the Marmara Sea within 10-20 km of Istanbul. This city is characterized by one of the highest levels of seismic risk in Europe and the Mediterranean region. The increased risk in Istanbul is due to eight destructive earthquakes that ruptured the fault system and left a seismic gap in the western portion of the 1000 km-long North Anatolian Fault Zone. To estimate the ground motion characteristics and their variability in the region we have simulated physics-based rupture scenarios, producing hybrid broadband time histories. We have merged two simulation techniques: a full 3D wave propagation method to generate low-frequency seismograms (Aochi and Ulrich, 2015) and the stochastic finite-fault model approach based on a dynamic corner frequency (Motazedian and Atkinson, 2005) to simulate high-frequency seismograms (Akinci et al., submitted to BSSA, 2016). The two are merged to compute realistic broadband hybrid time histories. The comparison of ground motion intensity measures (PGA, PGV, SA) resulting from our simulations with those predicted by recent Ground Motion Prediction Equations (GMPEs) for the region (Boore & Atkinson, 2008; Chiou & Youngs, 2008; Akkar & Bommer, 2010; Akkar & Cagnan, 2010) seems to indicate that rupture directivity and super-shear rupture effects affect the ground motion in the Marmara Sea region. In order to account for the rupture directivity we improve the comparison using the directivity predictor proposed by Spudich & Chiou (2008). This study highlights the importance of rupture directivity for hazard estimation in the Marmara Sea region, especially for the city of Istanbul.
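
    The hybrid merging step described above is typically done by low-pass filtering the deterministic synthetic and high-pass filtering the stochastic synthetic at a common crossover frequency, then summing. The sketch below shows that step; the 1 Hz crossover, filter order, and input signals are illustrative assumptions, not values from the study.

        # Merge a low-frequency deterministic synthetic with a high-frequency
        # stochastic synthetic using complementary zero-phase filters.
        import numpy as np
        from scipy.signal import butter, filtfilt

        def hybrid_broadband(lf, hf, dt, f_cross=1.0, order=4):
            nyq = 0.5 / dt
            b_lo, a_lo = butter(order, f_cross / nyq, btype="low")
            b_hi, a_hi = butter(order, f_cross / nyq, btype="high")
            # filtfilt is zero-phase, preserving the relative timing of the parts.
            return filtfilt(b_lo, a_lo, lf) + filtfilt(b_hi, a_hi, hf)

        # Example with synthetic records sampled at 100 Hz:
        dt = 0.01
        t = np.arange(0.0, 40.0, dt)
        lf = np.sin(2 * np.pi * 0.3 * t)  # stands in for the 3D synthetic
        hf = 0.2 * np.random.default_rng(0).standard_normal(t.size)
        broadband = hybrid_broadband(lf, hf, dt)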

  10. Simulation program for estimating statistical power of Cox's proportional hazards model assuming no specific distribution for the survival time.

    PubMed

    Akazawa, K; Nakamura, T; Moriguchi, S; Shimada, M; Nose, Y

    1991-07-01

    Small sample properties of the maximum partial likelihood estimates for Cox's proportional hazards model depend on the sample size, the true values of regression coefficients, covariate structure, censoring pattern and possibly baseline hazard functions. Therefore, it would be difficult to construct a formula or table to calculate the exact power of a statistical test for the treatment effect in any specific clinical trial. The simulation program, written in SAS/IML, described in this paper uses Monte-Carlo methods to provide estimates of the exact power for Cox's proportional hazards model. For illustrative purposes, the program was applied to real data obtained from a clinical trial performed in Japan. Since the program does not assume any specific function for the baseline hazard, it is, in principle, applicable to any censored survival data as long as they follow Cox's proportional hazards model.
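
    A minimal modern analogue of such a Monte-Carlo power calculation is sketched below in Python (the original program is SAS/IML). For illustration the data generator uses an exponential baseline hazard; the original program's strength is precisely that it avoids such a distributional assumption. The lifelines package and all parameter values here are assumptions for the sketch.

        # Monte-Carlo power for a treatment effect in Cox's model: simulate,
        # fit, and count rejections at the chosen significance level.
        import numpy as np
        import pandas as pd
        from lifelines import CoxPHFitter

        def cox_power(n=100, log_hr=np.log(0.7), censor_time=3.0,
                      n_sim=500, alpha=0.05, seed=1):
            rng = np.random.default_rng(seed)
            rejections = 0
            for _ in range(n_sim):
                treat = rng.integers(0, 2, size=n)
                # Exponential times with hazard scaled by exp(log_hr * treat)
                # (illustration only; the original avoids this assumption).
                times = rng.exponential(1.0 / np.exp(log_hr * treat))
                df = pd.DataFrame({
                    "time": np.minimum(times, censor_time),
                    "event": (times <= censor_time).astype(int),
                    "treat": treat,
                })
                fit = CoxPHFitter().fit(df, duration_col="time", event_col="event")
                rejections += fit.summary.loc["treat", "p"] < alpha
            return rejections / n_sim

        print(cox_power(n_sim=200))  # estimated power for this design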

  11. Traffic Incident Management in the Presence of Hazards

    NASA Astrophysics Data System (ADS)

    Wang, Z.; Zlatanova, S.; Steenbruggen, J.

    2016-09-01

    Traffic incidents can result in different kinds of hazards (e.g., plumes) that influence the status of road networks, so there is a great need for incident management in the presence of these hazards. When incidents occur, the created hazards not only affect normal road users (forcing them to detour or blocking them), but also influence the movement of first responders. Traffic managers, who are responsible for maintaining road safety and traffic stability, should carry out quick and effective measures to manage the incidents. In this paper, we present four issues to help people better understand the situations that could occur in the management of incidents with hazards: (1) evacuation in the presence of hazards; (2) 3D incident management; (3) navigation support for first responders; (4) navigation support for road users. To address these issues, we propose a solution which combines agent technology, geo-databases, hazard simulation, and traffic simulation. Further research would be needed to investigate the potential of the proposed solution in real applications.

  12. Identification and assessment of hazardous compounds in drinking water.

    PubMed

    Fawell, J K; Fielding, M

    1985-12-01

    The identification of organic chemicals in drinking water and their assessment in terms of potentially hazardous effects are two very different but closely associated tasks. In relation to both continuous low-level background contamination and specific, often high-level, contamination due to pollution incidents, the identification of contaminants is a prerequisite to the evaluation of significant hazards. Even in the case of the rapidly developing short-term bioassays which are applied to water to indicate a potential genotoxic hazard (for example, Ames tests), identification of the active chemicals is becoming a major factor in the further assessment of the response. Techniques for the identification of low concentrations of organic chemicals in drinking water have developed remarkably since the early 1970s, and methods based upon gas chromatography-mass spectrometry (GC-MS) have revolutionised qualitative analysis of water. Such techniques are limited to "volatile" chemicals, which usually constitute a small fraction of the total organic material in water. However, in recent years there have been promising developments in techniques for "non-volatile" chemicals in water. Such techniques include combined high-performance liquid chromatography-mass spectrometry (HPLC-MS) and a variety of MS methods involving, for example, field desorption, fast atom bombardment and thermospray ionisation. In this paper, identification techniques are reviewed in general and likely future developments are outlined. The assessment of hazards associated with chemicals identified in drinking and related waters usually centres upon toxicology, an applied science which involves numerous disciplines. The paper examines the toxicological information needed and the quality and deployment of such information, and discusses future research needs. Application of short-term bioassays to drinking water is a developing area and one which is closely involved with, and to some extent dependent on, powerful methods of identification. Recent developments are discussed.

  13. Comparing the Performance of Japan's Earthquake Hazard Maps to Uniform and Randomized Maps

    NASA Astrophysics Data System (ADS)

    Brooks, E. M.; Stein, S. A.; Spencer, B. D.

    2015-12-01

    The devastating 2011 magnitude 9.1 Tohoku earthquake and the resulting shaking and tsunami were much larger than anticipated in earthquake hazard maps. Because this and all other earthquakes that caused ten or more fatalities in Japan since 1979 occurred in places assigned a relatively low hazard, Geller (2011) argued that "all of Japan is at risk from earthquakes, and the present state of seismological science does not allow us to reliably differentiate the risk level in particular geographic areas," so a map showing uniform hazard would be preferable to the existing map. Defenders of the maps countered by arguing that these earthquakes are low-probability events allowed by the maps, which predict the levels of shaking that should be expected with a certain probability over a given time. Although such maps are used worldwide in making costly policy decisions for earthquake-resistant construction, how well these maps actually perform is unknown. We explore this hotly contested issue by comparing how well a 510-year-long record of earthquake shaking in Japan is described by the Japanese national hazard (JNH) maps, uniform maps, and randomized maps. Surprisingly, as measured by the metric implicit in the JNH maps, i.e. that during the chosen time interval the predicted ground motion should be exceeded at only a specific fraction of the sites, both uniform and randomized maps do better than the actual maps. However, using as a metric the squared misfit between maximum observed shaking and that predicted, the JNH maps do better than uniform or randomized maps. These results indicate that the JNH maps are not performing as well as expected, that the factors controlling map performance are complicated, and that learning more about how maps perform, and why, would be valuable in making more effective policy.
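
    The two performance metrics the study contrasts can be stated concretely: the fractional-exceedance metric implicit in the maps (the share of sites where observed maximum shaking exceeded the mapped value, compared with the target fraction) and the site-wise squared misfit. A sketch with hypothetical site values:

        import numpy as np

        observed = np.array([0.31, 0.12, 0.55, 0.08, 0.23])   # max observed shaking (g)
        predicted = np.array([0.25, 0.20, 0.40, 0.15, 0.30])  # mapped value (g)

        f_exceed = np.mean(observed > predicted)  # compare to the maps' target fraction
        sq_misfit = np.sum((observed - predicted) ** 2)
        print(f"exceedance fraction: {f_exceed:.2f}, squared misfit: {sq_misfit:.4f}")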

  15. Simulation of the Onset of the Southeast Asian Monsoon during 1997 and 1998: The Impact of Surface Processes

    NASA Technical Reports Server (NTRS)

    Tao, W.-K.; Wang, Y.; Lau, W.; Baker, R. D.

    2004-01-01

    The onset of the southeast Asian monsoon during 1997 and 1998 was simulated with a coupled mesoscale atmospheric model (MM5) and a detailed land surface model. The rainfall results from the simulations were compared with observed satellite data from the TRMM (Tropical Rainfall Measuring Mission) TMI (TRMM Microwave Imager) and GPCP (Global Precipitation Climatology Project). The simulation with the land surface model captured basic signatures of the monsoon onset processes and associated rainfall statistics. The sensitivity tests indicated that land surface processes had a greater impact on the simulated rainfall results than a small sea surface temperature change during the onset period. In both the 1997 and 1998 cases, the simulations were significantly improved by including the land surface processes. The results indicated that land surface processes played an important role in modifying the low-level wind field over two major branches of the circulation: the southwest low-level flow over the Indo-China peninsula and the northern cold front intrusion from southern China. The surface sensible and latent heat exchange between the land and atmosphere modified the low-level temperature distribution and gradient, and therefore the low-level circulation. The more realistic forcing of the sensible and latent heat from the detailed land surface model improved the monsoon rainfall and associated wind simulation. The model results will be compared to the simulation of the 6-7 May 2000 Missouri flash flood event. In addition, the impact of model initialization and land surface treatment on the timing, intensity, and location of extreme precipitation will be examined.

  16. Impact of fault models on probabilistic seismic hazard assessment: the example of the West Corinth rift.

    NASA Astrophysics Data System (ADS)

    Chartier, Thomas; Scotti, Oona; Boiselet, Aurelien; Lyon-Caen, Hélène

    2016-04-01

    Including faults in probabilistic seismic hazard assessment tends to increase the degree of uncertainty in the results due to the intrinsically uncertain nature of the fault data. This is especially the case in the low to moderate seismicity regions of Europe, where slow-slipping faults are difficult to characterize. In order to better understand the key parameters that control the uncertainty in the fault-related hazard computations, we propose to build an analytic tool that provides a clear link between the different components of the fault-related hazard computations and their impact on the results. This will allow identifying the important parameters that need to be better constrained in order to reduce the resulting uncertainty in hazard, and also provide a more hazard-oriented strategy for collecting relevant fault parameters in the field. The tool is illustrated through the example of the West Corinth rift fault models. Recent work performed in the gulf has shown the complexity of the normal faulting system that is accommodating the extensional deformation of the rift. A logic-tree approach is proposed to account for this complexity and the multiplicity of scientifically defendable interpretations. The nodes of the logic tree capture the different options that could be considered at each step of the fault-related hazard computation. The first nodes represent the uncertainty in the geometries of the faults and their slip rates, which can derive from different data and methodologies. The subsequent node explores, for a given geometry/slip rate of faults, different earthquake rupture scenarios that may occur in the complex network of faults. The idea is to allow the possibility of several fault segments breaking together in a single rupture scenario. To build these multiple-fault-segment scenarios, two approaches are considered: one based on simple rules (i.e., minimum distance between faults) and a second that relies on physically-based simulations. The following nodes represent, for each rupture scenario, different rupture forecast models (i.e., characteristic or Gutenberg-Richter) and, for a given rupture forecast, two probability models commonly used in seismic hazard assessment: Poissonian or time-dependent. The final node represents an exhaustive set of ground motion prediction equations chosen to be compatible with the region. Finally, the expected probability of exceeding a given ground motion level is computed at each site. Results will be discussed for a few specific localities of the West Corinth Gulf.
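
    In code, the logic-tree bookkeeping described above amounts to enumerating weighted branch combinations and weight-averaging the per-branch exceedance probability. The sketch below shows only that bookkeeping; branch names, weights, and the placeholder hazard function are hypothetical.

        from itertools import product

        # Hypothetical logic-tree nodes: (option, weight) pairs per node.
        geometries  = [("geomA", 0.6), ("geomB", 0.4)]
        forecasts   = [("characteristic", 0.5), ("gutenberg_richter", 0.5)]
        prob_models = [("poisson", 0.7), ("time_dependent", 0.3)]

        def p_exceed(geometry, forecast, prob_model):
            # Placeholder for the per-branch hazard computation (rupture
            # scenarios + GMPEs) that the real tool would perform.
            return 0.01

        total = 0.0
        for (g, wg), (f, wf), (m, wm) in product(geometries, forecasts, prob_models):
            total += wg * wf * wm * p_exceed(g, f, m)
        print("weighted P(exceedance):", total)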

  17. Evaluation of hazard and integrity monitor functions for integrated alerting and notification using a sensor simulation framework

    NASA Astrophysics Data System (ADS)

    Bezawada, Rajesh; Uijt de Haag, Maarten

    2010-04-01

    This paper discusses the results of an initial evaluation study of hazard and integrity monitor functions for use with integrated alerting and notification. The Hazard and Integrity Monitor (HIM) (i) allocates information sources within the Integrated Intelligent Flight Deck (IIFD) to required functionality (such as conflict detection and avoidance) and determines the required performance of these information sources as part of that function; (ii) monitors or evaluates the required performance of the individual information sources and performs consistency checks among the various information sources; (iii) integrates the information to establish tracks of potential hazards that can be used for conflict probes or conflict prediction for various time horizons, including the 10, 5, 3, and <3 minutes used in our scenario; (iv) detects and assesses the class of the hazard and provides possible resolutions. The HIM monitors the operation-dependent performance parameters related to the potential hazards in a manner similar to Required Navigation Performance (RNP). Various HIM concepts have been implemented and evaluated using a previously developed sensor simulator/synthesizer. Within the simulation framework, various inputs to the IIFD and its subsystems are simulated, synthesized from actual collected data, or played back from actual flight test sensor data. The framework and HIM functions are implemented in Simulink®, a modeling language developed by The MathWorks™. This modeling language allows for test and evaluation of various sensor and communication link configurations as well as the inclusion of feedback from the pilot on the performance of the aircraft.
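
    As an illustration of the HIM's consistency-check role in item (ii), the sketch below flags disagreement between two independent position sources against an RNP-like containment limit. The flat-earth distance approximation, the threshold, and the inputs are all hypothetical, not drawn from the paper.

        import math

        def positions_consistent(pos_a, pos_b, limit_nm=0.3):
            """pos_* are (lat, lon) in degrees; crude flat-earth distance in nm."""
            dlat = (pos_a[0] - pos_b[0]) * 60.0  # 1 deg latitude ~ 60 nm
            dlon = (pos_a[1] - pos_b[1]) * 60.0 * math.cos(math.radians(pos_a[0]))
            return math.hypot(dlat, dlon) <= limit_nm

        # Two sensors reporting the same aircraft ~0.6 nm apart -> inconsistent.
        print(positions_consistent((39.10, -84.42), (39.11, -84.42)))  # False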

  18. Aviation Safety Program Atmospheric Environment Safety Technologies (AEST) Project

    NASA Technical Reports Server (NTRS)

    Colantonio, Ron

    2011-01-01

    Engine Icing Characterization and Simulation Capability: Develop knowledge bases, analysis methods, and simulation tools needed to address the problem of engine icing, in particular ice-crystal icing.
    Airframe Icing Simulation and Engineering Tool Capability: Develop and demonstrate 3-D capability to simulate and model airframe ice accretion and related aerodynamic performance degradation for current and future aircraft configurations in an expanded icing environment that includes freezing drizzle/rain.
    Atmospheric Hazard Sensing and Mitigation Technology Capability: Improve and expand remote sensing and mitigation of hazardous atmospheric environments and phenomena.

  19. Process-based modelling to evaluate simulated groundwater levels and frequencies in a Chalk catchment in south-western England

    NASA Astrophysics Data System (ADS)

    Brenner, Simon; Coxon, Gemma; Howden, Nicholas J. K.; Freer, Jim; Hartmann, Andreas

    2018-02-01

    Chalk aquifers are an important source of drinking water in the UK. Due to their properties, they are particularly vulnerable to groundwater-related hazards such as floods and droughts. Understanding and predicting groundwater levels is therefore important for effective and safe water management. Chalk is known for its high porosity and, due to its dissolvability, is exposed to karstification and strong subsurface heterogeneity. To cope with the karstic heterogeneity and limited data availability, specialised modelling approaches are required that balance model complexity and data availability. In this study, we present a novel approach to evaluate simulated groundwater level frequencies derived from a semi-distributed karst model that represents subsurface heterogeneity by distribution functions. Simulated groundwater storages are transferred into groundwater levels using evidence from different observation wells. Using a percentile approach, we can assess the number of days exceeding or falling below selected groundwater level percentiles. Firstly, we evaluate the performance of the model when simulating groundwater level time series using a split sample test and parameter identifiability analysis. Secondly, we apply a split sample test to the simulated groundwater level percentiles to explore the performance in predicting groundwater level exceedances. We show that the model provides robust simulations of discharge and groundwater levels at three observation wells at a test site in a chalk-dominated catchment in south-western England. The second split sample test also indicates that the percentile approach is able to reliably predict groundwater level exceedances across all considered timescales up to the 75th percentile. However, at the 90th percentile it only provides acceptable predictions for long time periods, and it fails when the 95th percentile of groundwater exceedance levels is considered. By modifying the historic forcings of our model according to expected future climate changes, we create simple climate scenarios, and we show that the projected climate changes may lead to generally lower groundwater levels and a reduction of exceedances of high groundwater level percentiles.
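
    The percentile approach reduces to counting days on which the simulated series exceeds level thresholds defined from percentiles, and comparing those counts with observations. A sketch with synthetic daily series (all data and thresholds here are placeholders):

        import numpy as np

        rng = np.random.default_rng(0)
        simulated = rng.normal(50.0, 5.0, size=3650)  # daily levels (m), synthetic
        observed = rng.normal(50.5, 5.0, size=3650)

        for q in (75, 90, 95):
            threshold = np.percentile(observed, q)  # level defined from observations
            days_sim = int(np.sum(simulated > threshold))
            days_obs = int(np.sum(observed > threshold))
            print(f"P{q}: {days_sim} simulated vs {days_obs} observed exceedance days")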

  20. T-lymphokine-activated killer cell-originated protein kinase (TOPK) as a prognostic factor and a potential therapeutic target in glioma

    PubMed Central

    Duan, Qiuhong; Yuan, Ping; Xue, Peipei; Lu, Hui; Yan, Meng; Guo, Dongsheng; Xu, Sanpeng; Zhang, Xiaohui; Lin, Xuan; Wang, Yong; Dogan, Soner; Zhang, Jianmin; Zhu, Feng; Ke, Changshu; Liu, Lin

    2018-01-01

    TOPK is overexpressed in various types of cancer and associated with poor outcomes. In this study, we first found that the expression of T-lymphokine-activated killer cell-originated protein kinase (TOPK) was significantly higher in Grade III or Grade IV gliomas than in Grade II gliomas (P = 0.007 and P < 0.001, respectively). Expression of TOPK was positively correlated with Ki67 (P < 0.001). Knockdown of TOPK significantly inhibited cell growth and colony formation and increased sensitivity to temozolomide (TMZ) in U-87 MG and U-251 cells, while TOPK overexpression promoted cell growth and colony formation in Hs 683 and A-172 cells. Glioma patients expressing high levels of TOPK had poor survival compared with those expressing low levels of TOPK in both high-grade and low-grade gliomas (hazard ratio = 0.2995; 95% CI, 0.1262 to 0.7108; P = 0.0063 and hazard ratio = 0.1509; 95% CI, 0.05928 to 0.3842; P < 0.0001, respectively). The level of TOPK was low in TMZ-sensitive patients compared with TMZ-resistant patients (P = 0.0056). In the TMZ-resistant population, patients expressing high TOPK had two months' shorter survival time than those expressing low TOPK. Our findings demonstrate that TOPK may represent a promising prognostic and predictive factor and a potential therapeutic target for glioma. PMID:29487691

  2. Assessing the Relationship Between Social Vulnerability and Community Resilience to Hazards.

    PubMed

    Bergstrand, Kelly; Mayer, Brian; Brumback, Babette; Zhang, Yi

    2015-06-01

    This article contributes to the disaster literature by measuring and connecting two concepts that are highly related but whose relationship is rarely empirically evaluated: social vulnerability and community resilience. To do so, we measure community resilience and social vulnerability in counties across the United States and find a correlation between high levels of vulnerability and low levels of resilience, indicating that the most vulnerable counties also tend to be the least resilient. We also find regional differences in the distribution of community resilience and social vulnerability, with the West being particularly vulnerable while the Southeast is prone to low levels of resilience. By looking at both social vulnerability and community resilience, we are able to map communities' social risks for harm from threats as well as their capacities for recovering and adapting in the aftermath of hazards. This provides a more complete portrait of the communities that might need the most assistance in emergency planning and response, as well as whether such interventions will need to be tailored toward reducing damage or finding the path to recovery.

  3. Simulating Roll Clouds associated with Low-Level Convergence.

    NASA Astrophysics Data System (ADS)

    Prasad, A. A.; Sherwood, S. C.

    2015-12-01

    Convective initiation often takes place when features such as fronts and/or rolls collide, merge or otherwise meet. Rolls indicate boundary layer convergence and may initiate thunderstorms. They are often seen in satellite and radar imagery prior to the onset of deep convection. However, links between convergence-driven rolls and convection are poorly represented in global models. The poor representation of convection is the source of many model biases, especially over the Maritime Continent in the Tropics. We simulate low-level convergence lines over north-eastern Australia using the Weather Research and Forecasting (WRF) Model (version 3.7). The simulated events, from September-October 2002, are driven by sea-breeze circulations. Cloud lines associated with bore waves that form along the low-level convergence lines are thoroughly investigated in this study, with comparisons against satellite and surface observations. Initial simulations for a series of cloud lines observed on 4 October 2002 over the Gulf of Carpentaria showed good agreement in the timing and propagation of the disturbance and the low-level convergence; however, the cloud lines, or streets of roll clouds, were not properly captured by the model. Results from a number of WRF simulations with different microphysics, cumulus and planetary boundary layer schemes, resolutions and boundary conditions will also be discussed.

  4. SCEC Earthquake System Science Using High Performance Computing

    NASA Astrophysics Data System (ADS)

    Maechling, P. J.; Jordan, T. H.; Archuleta, R.; Beroza, G.; Bielak, J.; Chen, P.; Cui, Y.; Day, S.; Deelman, E.; Graves, R. W.; Minster, J. B.; Olsen, K. B.

    2008-12-01

    The SCEC Community Modeling Environment (SCEC/CME) collaboration performs basic scientific research using high performance computing with the goal of developing a predictive understanding of earthquake processes and seismic hazards in California. SCEC/CME research areas include dynamic rupture modeling, wave propagation modeling, probabilistic seismic hazard analysis (PSHA), and full 3D tomography. SCEC/CME computational capabilities are organized around the development and application of robust, re-usable, well-validated simulation systems we call computational platforms. The SCEC earthquake system science research program includes a wide range of numerical modeling efforts, and we continue to extend our numerical modeling codes to include more realistic physics and to run at higher and higher resolution. During this year, the SCEC/USGS OpenSHA PSHA computational platform was used to calculate PSHA hazard curves and hazard maps using the new UCERF2.0 ERF and new 2008 attenuation relationships. Three SCEC/CME modeling groups ran 1 Hz ShakeOut simulations using different codes and computer systems and carefully compared the results. The DynaShake Platform was used to calculate several dynamic rupture-based source descriptions equivalent in magnitude and final surface slip to the ShakeOut 1.2 kinematic source description. A SCEC/CME modeler produced 10 Hz synthetic seismograms for the ShakeOut 1.2 scenario rupture by combining 1 Hz deterministic simulation results with 10 Hz stochastic seismograms. SCEC/CME modelers ran an ensemble of seven ShakeOut-D simulations to investigate the variability of ground motions produced by dynamic rupture-based source descriptions. The CyberShake Platform was used to calculate more than 15 new probabilistic seismic hazard analysis (PSHA) hazard curves using full 3D waveform modeling and the new UCERF2.0 ERF. The SCEC/CME group has also produced significant computer science results this year. Large-scale SCEC/CME high performance codes were run on NSF TeraGrid sites, including simulations that used the full PSC Big Ben supercomputer (4096 cores) and simulations that ran on more than 10K cores at TACC Ranger. The SCEC/CME group used scientific workflow tools and grid computing to run more than 1.5 million jobs at NCSA for the CyberShake project. Visualizations produced by a SCEC/CME researcher of the 10 Hz ShakeOut 1.2 scenario simulation data were used by USGS in ShakeOut publications and public outreach efforts. OpenSHA was ported onto an NSF supercomputer and was used to produce very high resolution PSHA hazard maps containing more than 1.6 million hazard curves.

  5. [Re-analysis of occupational hazards in foundry].

    PubMed

    Zhang, Min; Qi, Cheng; Chen, Wei-Hong; Lu, Yang; Du, Xie-Yi; Li, Wen-Jie; Meng, Chuan-San

    2010-04-01

    To analyze systematically the characteristics of occupational hazards in the foundry, and provide precise data for epidemiological studies and the control of occupational hazards in the foundry. Data on airborne dust, chemical occupational hazards and physical occupational agents in the foundry environment from 1978 to 2008 were dynamically collected. Mean concentrations and intensities (geometric means) of occupational hazards were calculated by job in different years. The main occupational hazards in the foundry were silica, metal fume, noise and heat stress. Silica existed in all of the main jobs. The mean concentration of silica before 1986 was an extremely high 8.6 mg/m³, and then dropped remarkably after 1986, to 2.4 mg/m³ from 1986 to 1989, 2.7 mg/m³ from 1990 to 2002 and 2.7 mg/m³ from 2003 to 2008. The trend of silica concentrations by job was consistent with the overall trend. Silica concentrations differed significantly among jobs, with the highest level in melting (4.4 mg/m³), followed by cast shakeout and finishing (3.4 mg/m³), pouring (3.4 mg/m³), sand preparation (2.4 mg/m³), moulding (2.1 mg/m³) and core-making (1.7 mg/m³). The concentration of respirable dust was highest in pouring (2.76 mg/m³), followed by cast shakeout and finishing (1.14 mg/m³). The mean concentration of asbestos dust in melting was a relatively high 2.0 mg/m³. In core-making and sand preparation, adhesive emission products existed, with mean concentrations as follows: ammonia (5.84 mg/m³), formaldehyde (0.60 mg/m³), phenol (1.73 mg/m³) and phenol formaldehyde resin (1.3 mg/m³). Benzene and its homologues existed in cast shakeout and finishing, with levels of benzene, toluene and xylene of 0.2 mg/m³, 0.1 mg/m³ and 1.3 mg/m³, respectively. In pouring and melting, there existed chemical occupational hazards including benzo(a)pyrene, metal fume (lead, cadmium, manganese, nickel, chromium) and gases (hydrogen sulfide, phosphine, sulfur dioxide, carbon monoxide). The mean concentration of benzo(a)pyrene was a low 1.80 × 10⁻⁴ µg/m³. Physical occupational agents in the foundry were noise, heat stress and vibration. Heat stress was high in melting, pouring, and cast shakeout and finishing, at 30 °C, 29 °C and 26 °C, respectively. Noise was high in cast shakeout and finishing and in core-making, at 93.1 dB(A) and 89.5 dB(A), respectively. Vibration existed in core-making and in cast shakeout and finishing. Compulsory postures included prolonged standing, sitting and bowing. Occupational hazards in the foundry environment are diversified and their concentrations exceed the permissible exposure limits stipulated by the national occupational hygiene standards. High concentrations of dust and metal fume, low concentrations of a variety of chemicals, high-intensity noise and vibration, heat stress, and harmful compulsory postures all co-exist in the foundry. Control and protective measures should be strengthened.

  6. Toward uniform probabilistic seismic hazard assessments for Southeast Asia

    NASA Astrophysics Data System (ADS)

    Chan, C. H.; Wang, Y.; Shi, X.; Ornthammarath, T.; Warnitchai, P.; Kosuwan, S.; Thant, M.; Nguyen, P. H.; Nguyen, L. M.; Solidum, R., Jr.; Irsyam, M.; Hidayati, S.; Sieh, K.

    2017-12-01

    Although most Southeast Asian countries have seismic hazard maps, the various methodologies and levels of quality result in appreciable mismatches at national boundaries. We aim to conduct a uniform assessment across the region through standardized earthquake and fault databases, ground-shaking scenarios, and regional hazard maps. Our earthquake database contains earthquake parameters obtained from global and national seismic networks, harmonized by the removal of duplicate events and the use of moment magnitude. Our active-fault database includes fault parameters from previous studies and from the databases implemented for national seismic hazard maps. Another crucial input for seismic hazard assessment is proper evaluation of ground-shaking attenuation. Since few ground-motion prediction equations (GMPEs) have used local observations from this region, we evaluated attenuation by comparing instrumental observations and felt intensities for recent earthquakes with the ground shaking predicted by published GMPEs. We then incorporate the best-fitting GMPEs and site conditions into our seismic hazard assessments. Based on the database and appropriate GMPEs, we have constructed regional probabilistic seismic hazard maps. The assessment shows the highest seismic hazard levels near those faults with high slip rates, including the Sagaing Fault in central Myanmar, the Sumatran Fault in Sumatra, the Palu-Koro, Matano and Lawanopo Faults in Sulawesi, and the Philippine Fault across several islands of the Philippines. In addition, our assessment demonstrates the important fact that regions with low earthquake probability may well have a higher aggregate probability of future earthquakes, since they encompass much larger areas than the areas of high probability. The significant irony, then, is that in areas of low to moderate probability, where building codes usually provide less seismic resilience, seismic risk is likely to be greater. Infrastructural damage in East Malaysia during the 2015 Sabah earthquake offers a case in point.
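
    The GMPE screening step described above is commonly done by examining log residuals between recorded ground motions and each candidate GMPE's predictions, preferring equations whose residuals centre on zero. A sketch with hypothetical recordings and predictions:

        import numpy as np

        obs_pga = np.array([0.08, 0.15, 0.04, 0.22])  # recorded PGA (g), hypothetical
        predictions = {
            "gmpe_A": np.array([0.06, 0.17, 0.05, 0.18]),
            "gmpe_B": np.array([0.12, 0.25, 0.09, 0.35]),
        }
        for name, pred in predictions.items():
            resid = np.log(obs_pga) - np.log(pred)  # log residuals
            print(name, "mean log-residual:", round(resid.mean(), 2))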

  7. Commercial Nuclear Steam-Electric Power Plants, Part II

    ERIC Educational Resources Information Center

    Shore, Ferdinand J.

    1974-01-01

    Presents the pros and cons of nuclear power systems. Includes a discussion of the institutional status of the AEC, AEC regulatory record, routine low-level radiation hazards, transport of radioactive materials, storage of wastes, and uranium resources and economics of supply. (GS)

  8. NASA/MSFC FY-83 Atmospheric Research Review

    NASA Technical Reports Server (NTRS)

    Turner, R. E. (Compiler); Camp, D. W. (Compiler)

    1983-01-01

    Atmospheric research conducted at the Marshall Space Flight Center in FY 1983 is discussed. Clear air turbulence, gusts, and fog dispersal near airports are discussed. The use of Doppler lidar signals is discussed, as are low-level flow conditions that are hazardous to aircraft.

  9. Vulnerability, safety and response of nuclear power plants to the hydroclimatic hazards

    NASA Astrophysics Data System (ADS)

    Katona, Tamás János; Vilimi, András

    2016-04-01

    The Great Tohoku Earthquake and Tsunami and the severe accident at the Fukushima Dai-ichi nuclear power plant in 2011 alerted the nuclear industry to the danger of extremely rare natural hazards. The subsequent "stress tests" performed by the nuclear industry in Europe and all over the world identified the nuclear power plant (NPP) vulnerabilities and defined the measures for increasing plant safety. According to the international practice of nuclear safety regulation, the cumulative core damage frequency for NPPs has to be 10⁻⁵/a, and the cumulative frequency of early large release has to be 10⁻⁶/a. In the case of operating plants these annual probabilities can be a little higher, but the licensees are obliged to implement all reasonably practicable measures for increasing plant safety. To achieve the required level of safety, the design basis of NPPs for natural hazards has to be defined at the 10⁻⁴/a to 10⁻⁵/a levels of annual exceedance probability. Tornado hazard is something of an exception; e.g., the design basis annual probability for tornadoes in the US is equal to 10⁻⁷/a. The design of NPPs shall provide an adequate margin to protect items ultimately necessary to prevent large or early radioactive releases in the event of levels of natural hazards exceeding those considered for design. Plant safety has to be reviewed to account for changes in environmental conditions and natural hazards when necessary, but as a minimum every ten years in the frame of periodic safety reviews. Long-term forecasts of environmental conditions and hazards have to be accounted for in the design basis of new plants. Changes in hydroclimatic variables, e.g., storms, tornadoes, river floods, flash floods, extreme temperatures and droughts, affect the operability and efficiency as well as the safety of NPPs. Low flow rates and high water temperatures in rivers may force a plant to operate at reduced power or to shut down (Cernavoda NPP, Romania, August 2009). Practice has demonstrated that NPPs can safely withstand meteorological extremes (Hurricane Katrina, 2005). However, floods at some sites have caused significant safety issues. The design of NPPs and their response to extreme hydroclimatic events depend on the features of the particular hazards, e.g., predictability, the possibility of and time available for protective actions, the potential for causing cliff-edge effects, and possible combinations of events. The uncertainty of the prediction of extreme values for design and safety assessment is a fundamental issue. In this paper the consequences of hydroclimatic extremes are analysed for nuclear power plants. The possibility of operational response to extremes is presented. The safety margins are assessed with respect to the effects caused by hydroclimatic extremes. Direct actions (e.g., wind) and indirect consequences (e.g., changing groundwater levels) are both considered. Methods for accounting for the uncertainties in the characterisation of low-probability hazards are also considered. The preparedness for severe hydroclimatic conditions/events and actions for mitigation and management are presented and discussed. The considerations in the paper are illustrated by the case of the Paks Nuclear Power Plant, Hungary.

  10. Applications of a Forward-Looking Interferometer for the On-board Detection of Aviation Weather Hazards

    NASA Technical Reports Server (NTRS)

    West, Leanne; Gimmestad, Gary; Smith, William; Kireev, Stanislav; Cornman, Larry B.; Schaffner, Philip R.; Tsoucalas, George

    2008-01-01

    The Forward-Looking Interferometer (FLI) is a new instrument concept for obtaining measurements of potential weather hazards to alert flight crews. The FLI concept is based on high-resolution infrared (IR) Fourier Transform Spectrometry (FTS) technologies that have been developed for satellite remote sensing, and which have also been applied to the detection of aerosols and gases for other purposes. It is being evaluated for multiple hazards, including clear air turbulence (CAT), volcanic ash, wake vortices, low slant range visibility, dry wind shear, and icing, during all phases of flight. Previous sensitivity and characterization studies addressed the phenomenology that supports detection and mitigation by the FLI. Techniques for determining the range, and hence warning time, were demonstrated for several of the hazards, and a table of research instrument parameters was developed for investigating all of the hazards discussed above. This work supports the feasibility of detecting multiple hazards with an FLI multi-hazard airborne sensor and of producing enhanced IR images in reduced visibility conditions; however, further research must be performed to develop a means of estimating the intensities of the hazards posed to an aircraft and to develop robust algorithms relating sensor measurables to hazard levels. In addition, validation tests need to be performed with a prototype system.

  11. Butyrylcholinesterase Levels on Admission Predict Severity and 12-Month Mortality in Hospitalized AIDS Patients

    PubMed Central

    Xu, Lijun; Huang, Ying; Yang, Zongxing; Sun, Jia; Xu, Yan; Zheng, Jinlei; Kinloch, Sabine; Yin, Michael T.; Weng, Honglei

    2018-01-01

    Background Butyrylcholinesterase (BChE) is synthesized mainly in the liver and is an important marker in many infectious/inflammatory diseases, but its role in acquired immunodeficiency syndrome (AIDS) patients is not clear. We wished to ascertain whether BChE level is associated with the progression/prognosis of AIDS patients. Methods BChE levels (in U/L) were measured in 505 patients; <4500 was defined as "low" and ≥4500 as "normal." Associations between BChE level and CD4 count, WHO stage, body mass index (BMI), C-reactive protein (CRP) level, and duration of hospitalization were assessed. Kaplan–Meier curves and a Cox proportional hazards model were used to assess associations between low BChE levels and mortality, after adjustment for age, CD4 count, WHO stage, and laboratory parameters. Results A total of 129 patients (25.5%) had a low BChE level. BChE was closely associated with CD4 count, WHO stage, CRP level, and BMI (all P < 0.001). Eighty-four patients (16.6%) died in the first year of follow-up. One-year survival was 64.5 ± 4.5% for patients with low BChE and 87.6 ± 1.8% for those with normal BChE (log-rank, P < 0.001). After adjustment for sex, age, BMI, WHO stage, and CD4 count, as well as serum levels of hemoglobin, sodium, and albumin, the hazard ratio was 1.8 (95% confidence interval, 1.0–3.2) for patients with a low BChE compared with those with a normal BChE (P = 0.035). Conclusion BChE level is associated with HIV/AIDS severity and is an independent risk factor for increased mortality in AIDS patients. PMID:29736152

  12. 3D Dynamic Rupture Simulations along Dipping Faults, with a focus on the Wasatch Fault Zone, Utah

    NASA Astrophysics Data System (ADS)

    Withers, K.; Moschetti, M. P.

    2017-12-01

    We study dynamic rupture and ground motion from dip-slip faults in regions that have high seismic hazard, such as the Wasatch fault zone, Utah. Previous numerical simulations have modeled deterministic ground motion along segments of this fault in the heavily populated regions near Salt Lake City but were restricted to low frequencies (≤1 Hz). We seek to better understand the rupture process and assess broadband ground motions and their variability from the Wasatch Fault Zone by extending deterministic ground motion prediction to higher frequencies (up to 5 Hz). We perform simulations along a dipping normal fault (40 × 20 km along strike and down-dip width, respectively) with characteristics derived from geologic observations to generate a suite of ruptures with Mw > 6.5. This approach utilizes dynamic simulations (fully physics-based models, where the initial stress drop and friction law are imposed) using a summation-by-parts (SBP) method. The simulations include rough-fault topography following a self-similar fractal distribution (over length scales from 100 m to the size of the fault) in addition to off-fault plasticity. Energy losses from heat and other mechanisms, modeled as anelastic attenuation, are also included, as is free-surface topography, which can significantly affect ground motion patterns. We compare the effects that material structure and both rate-and-state and slip-weakening friction laws have on rupture propagation. The simulations show reduced slip and moment release near the surface with the inclusion of plasticity, agreeing better with observations of shallow slip deficit. Long-wavelength fault geometry imparts a non-uniform stress distribution along both dip and strike, influencing the preferred rupture direction and hypocenter location, which is potentially important for seismic hazard estimation.

  13. Simulation of external and internal electrostatic discharges at the spacecraft system test level

    NASA Technical Reports Server (NTRS)

    Whittlesey, A.; Leung, P.

    1984-01-01

    Environmental test activities concerned with space plasma-caused charging and discharging phenomena are discussed. It is pointed out that the origin of such an electrostatic discharge (ESD) is the charging of spacecraft dielectrics by an energetic plasma in geosynchronous orbit, Jupiter's magnetosphere, or other similar space environments. In dealing with environmental testing problems, it is necessary to define the location and magnitude of any ESDs in preparation for a subsequent simulation of the given conditions. Questions of external and internal charging are discussed separately. The environmental hazard from an external discharge can be assessed by viewing the dielectric surface as one side of a parallel-plate capacitor. In the case of internal charging, the level of environmental concern depends on the higher energy spectrum of the ambient electrons.
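
    The parallel-plate view above implies the familiar capacitor relations C = ε₀εᵣA/d and discharge energy E = ½CV². The numbers in the sketch below (patch area, sheet thickness, permittivity, surface potential) are hypothetical, chosen only to show the scale of a worst-case discharge estimate:

        # Parallel-plate estimate of stored capacitance and discharge energy.
        EPS0 = 8.854e-12       # vacuum permittivity, F/m
        area = 0.1             # m^2, charged dielectric patch (assumed)
        thickness = 125e-6     # m, dielectric sheet thickness (assumed)
        eps_r = 3.5            # relative permittivity (assumed)
        voltage = 5000.0       # V, surface potential before breakdown (assumed)

        capacitance = EPS0 * eps_r * area / thickness
        energy = 0.5 * capacitance * voltage**2
        print(f"C = {capacitance * 1e9:.1f} nF, E = {energy * 1e3:.0f} mJ")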

  14. Applying mathematical modeling to create job rotation schedules for minimizing occupational noise exposure.

    PubMed

    Tharmmaphornphilas, Wipawee; Green, Benjamin; Carnahan, Brian J; Norman, Bryan A

    2003-01-01

    This research developed worker schedules by using administrative controls and a computer programming model to reduce the likelihood of worker hearing loss. By rotating the workers through different jobs during the day it was possible to reduce their exposure to hazardous noise levels. Computer simulations were made based on data collected in a real setting. Worker schedules currently used at the site are compared with proposed worker schedules from the computer simulations. For the worker assignment plans found by the computer model, the authors calculate a significant decrease in time-weighted average (TWA) sound level exposure. The maximum daily dose that any worker is exposed to is reduced by 58.8%, and the maximum TWA value for the workers is reduced by 3.8 dB from the current schedule.
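
    The exposure arithmetic that such schedules optimize can be sketched with the standard OSHA daily noise dose, D = 100·Σ(Cᵢ/Tᵢ), where Cᵢ is hours worked at level Lᵢ and Tᵢ = 8/2^((Lᵢ−90)/5) is the permitted duration at that level, and TWA = 16.61·log₁₀(D/100) + 90. The rotation below is a hypothetical example, not a schedule from the study:

        import math

        def allowed_hours(level_dba):
            """OSHA permitted exposure duration at a given sound level."""
            return 8.0 / 2 ** ((level_dba - 90.0) / 5.0)

        def daily_dose(schedule):
            """schedule: list of (hours, dBA) segments for one worker-day."""
            return 100.0 * sum(c / allowed_hours(l) for c, l in schedule)

        def twa(dose_percent):
            return 16.61 * math.log10(dose_percent / 100.0) + 90.0

        rotated = [(4, 85.0), (2, 92.0), (2, 88.0)]  # hypothetical rotation
        d = daily_dose(rotated)
        print(f"dose = {d:.0f}%, TWA = {twa(d):.1f} dBA")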

  15. Musculoskeletal symptoms and ergonomic hazards among material handlers in grocery retail industries

    NASA Astrophysics Data System (ADS)

    Nasrull Abdol Rahman, Mohd; Zuhaidi, Muhammad Fareez Ahmad

    2017-08-01

    Grocery retail work can be physically demanding, as material handlers' tasks involve manually lifting, lowering, carrying, pushing and pulling loads. The nature of this work puts them at risk for serious low back pain, shoulder pain and other musculoskeletal injuries. This study was conducted using two different tools: the Nordic Musculoskeletal Questionnaire (NMQ) as a survey and the Washington Industrial Safety and Health Act (WISHA) checklist as a direct observation method. A total of 46 male and 14 female material handlers were involved in this study. For the NMQ, the most frequently reported body-part trouble in the last 12 months was low back pain (88.3%), followed by upper back (68.3%), neck (55.3%) and shoulder (36.7%) trouble. For the WISHA checklist, most workers were exposed to hazard-level awkward postures and high hand force. The research indicates that musculoskeletal disorders (MSDs) and ergonomic risk factors (ERFs) are related: MSDs may arise if workers ignore ergonomic hazards.

  16. Combination therapy using antioxidants and low level laser therapy (LLLT) on noise induced hearing loss (NIHL)

    NASA Astrophysics Data System (ADS)

    Chang, So-Young; Lim, Sung Kyu; Lee, Min young; Chung, Phil-Sang; Jung, Jae-Yun; Rhee, Chung-Ku

    2016-02-01

    One of the most common causes of hearing disorders is noise trauma. Noise is an increasing and pervasive hazard, which makes it difficult to take precautions and prevent noise-induced hearing loss (NIHL). The prevalence of hearing loss among factory workers has been reported to be 42% [1]. Occupational noise-induced hearing loss (ONIHL) continues to be a significant occupational hazard. ONIHL is permanent, currently has no cure, and may cause significant disability, but it is largely preventable. More than 30 million Americans are potentially exposed to hazardous noise levels in occupations such as transportation, construction, and coal mining, as well as recreationally. In the mainstream setting, exposure avoidance strategies aimed at reducing the incidence of ONIHL remain the focus of public health and occupational medicine approaches [2]. In military conditions, hearing loss is most often caused by explosions, blasts, or loud noises from vehicles ranging from 100 to 140 dB [3] and military weapons generating approximately 140-185 dB peak sound pressure levels [4].

  17. Risk Perception and the Psychology of Natural Hazard Preparedness

    NASA Astrophysics Data System (ADS)

    Thompson, K. J.; Weber, E. U.

    2014-12-01

    In the preparedness phase of the disaster cycle, willingness to invest resources in prevention and mitigation doesn't depend only on quantitative judgments of the probability of a disaster. People also evaluate the risks of situations in qualitative ways. Psychological studies of risk perception have shown that risk attitudes toward everyday technologies and activities (e.g., electric power, air travel, smoking) can be mapped onto two orthogonal dimensions: how unknown the risks seem, and how dread or severe they feel. Previously, this psychometric approach to risk perception has focused mostly on man-made risks (e.g., Fischhoff et al. 1978, Slovic 1987). In this paper we examine how natural hazards fit into the established unknown/dread risk space. Hazards that are high on the unknown dimension of risk tend to be perceived as having effects that are unknown to science and to the exposed, uncontrollable, and new. Hazards that rank high on the dread/severity dimension are seen as immediate, catastrophic, highly dreaded on a gut level, new, and likely to be fatal. Perceived risk tends to be highest for hazards that are both high on the dread dimension and low on the unknown dimension. We find that weather-related hazards rank lowest on both dimensions: blizzards, heat waves, hailstorms, fog, and ice storms all feel very known and not particularly dread. The exception in this group is hurricanes and tornadoes, which are viewed as more similar to geophysical hazards and mass movements: high on dread, though not particularly unknown. Two notable outliers are climate change and sea-level rise, which are both considered very unknown (higher than any other natural hazard save sinkholes) and not at all dread (less dread even than fog and dust storms). When compared with perceptions of technological hazards, nearly every natural hazard ranks as more dread than any technology or activity, including nuclear power. Man-made hazards fall with technologies, rather than with natural hazards: climate change and sea-level rise are both only as dread as electric power and motor vehicles, yet feel as unknown as terrorism and GMO foods. We discuss the implications of these qualitative elements of hazard risk perception for the preparedness phase of the disaster lifecycle, and offer recommendations to practitioners and educators.

  18. Why is the simulated climatology of tropical cyclones so sensitive to the choice of cumulus parameterization scheme in the WRF model?

    NASA Astrophysics Data System (ADS)

    Zhang, Chunxi; Wang, Yuqing

    2018-01-01

    The sensitivity of simulated tropical cyclones (TCs) to the choice of cumulus parameterization (CP) scheme in the Advanced Weather Research and Forecasting Model (WRF-ARW) version 3.5 is analyzed based on ten seasonal simulations with 20-km horizontal grid spacing over the western North Pacific. Results show that the simulated frequency and intensity of TCs are very sensitive to the choice of the CP scheme. The sensitivity can be explained well by differences in the low-level circulation in a height and sorted-moisture space. By transporting moist static energy from dry to moist regions, the low-level circulation is important to convective self-aggregation, which is believed to be related to the genesis of TC-like vortices (TCLVs) and TCs in idealized settings. The radiative and evaporative cooling associated with low-level clouds and shallow convection in dry regions is found to play a crucial role in driving the moisture-sorted low-level circulation. With shallow convection turned off in a CP scheme, relatively strong precipitation occurs frequently in dry regions; in this case, the diabatic cooling can still drive the low-level circulation, but its strength is reduced and TCLV/TC genesis is suppressed. The inclusion of cumulus momentum transport (CMT) in a CP scheme can considerably suppress the genesis of TCLVs/TCs, while changes in the moisture-sorted low-level circulation and the horizontal distribution of precipitation are minor, indicating that the CMT modulates TCLV/TC activity in the model by mechanisms other than the horizontal transport of moist static energy.

  19. Thresholds of Toxicological Concern - Setting a threshold for testing below which there is little concern.

    PubMed

    Hartung, Thomas

    2017-01-01

    Low dose, low risk; very low dose, no real risk. Setting a pragmatic threshold below which concerns become negligible is the purpose of thresholds of toxicological concern (TTC). The idea is that such threshold values do not need to be established for each and every chemical based on experimental data; rather, by analyzing the distribution of lowest or no-effect doses of many chemicals, a TTC can be defined - typically using the 5th percentile of this distribution and lowering it by an uncertainty factor of, e.g., 100. In doing so, TTC aims to compare exposure information (dose) with a threshold below which any hazard manifestation is very unlikely to occur. The history and current developments of this concept are reviewed, and the application of TTC to different regulated products and their hazards is discussed. TTC lends itself as a pragmatic filter to deprioritize testing needs whenever real-life exposures are much lower than the levels at which hazard manifestation would be expected, a situation that is called "negligible exposure" in the REACH legislation, though the TTC concept has not been fully incorporated in its implementation (yet). Other areas and regulations - especially in the food sector and for pharmaceutical impurities - are more proactive. Large, curated databases on the toxic effects of chemicals provide the opportunity to set TTCs for many hazards and substance classes, and thus offer a precautionary second tier for risk assessments when hazard cannot be excluded. This allows testing efforts to be better focused on relevant exposures to chemicals.
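
    The percentile-plus-uncertainty-factor recipe described above is simple enough to show directly. A minimal sketch in Python, using a hypothetical array of no-effect doses rather than any real curated database:

      import numpy as np

      # Hypothetical no-observed-effect levels (mg/kg bw/day) for a substance
      # class; real TTC work draws these from large curated databases.
      noels = np.array([0.5, 1.2, 3.0, 7.5, 12.0, 25.0, 40.0, 80.0, 150.0, 300.0])

      # 5th percentile of the no-effect-dose distribution, divided by an
      # uncertainty factor (e.g., 100), as described above.
      ttc = np.percentile(noels, 5) / 100.0
      print(f"TTC = {ttc:.4f} mg/kg bw/day")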

  20. The Validity and Incremental Validity of Knowledge Tests, Low-Fidelity Simulations, and High-Fidelity Simulations for Predicting Job Performance in Advanced-Level High-Stakes Selection

    ERIC Educational Resources Information Center

    Lievens, Filip; Patterson, Fiona

    2011-01-01

    In high-stakes selection among candidates with considerable domain-specific knowledge and experience, investigations of whether high-fidelity simulations (assessment centers; ACs) have incremental validity over low-fidelity simulations (situational judgment tests; SJTs) are lacking. Therefore, this article integrates research on the validity of…

  1. Evaluation of a cardiopulmonary resuscitation curriculum in a low resource environment.

    PubMed

    Chang, Mary P; Lyon, Camila B; Janiszewski, David; Aksamit, Deborah; Kateh, Francis; Sampson, John

    2015-11-07

    To evaluate whether a 2-day International Liaison Committee on Resuscitation (ILCOR) Universal Algorithm-based curriculum taught in a tertiary care hospital in Liberia increases local health care provider knowledge and skill comfort level. A combined basic and advanced cardiopulmonary resuscitation (CPR) curriculum was developed for low-resource settings that included lectures and low-fidelity manikin-based simulations. In March 2014, the curriculum was taught to healthcare providers in a tertiary care hospital in Liberia. In a quality assurance review, participants were evaluated for knowledge and comfort levels with resuscitation before and after the workshop. They were also videotaped during simulation sessions and evaluated on standardized performance metrics. Fifty-two hospital staff completed both pre- and post-curriculum surveys. The median knowledge score was 45% pre-curriculum and 82% post-curriculum (p<0.00001). The median provider comfort level score was 4 of 5 pre-curriculum and 5 of 5 post-curriculum (p<0.00001). During simulations, 93.2% of participants performed the pulse check within 10 seconds, and 97.7% performed defibrillation within 180 seconds. Clinician knowledge of and comfort level with CPR increased significantly after participating in our curriculum. A CPR curriculum based on lectures and low-fidelity manikin simulations may be an effective way to teach resuscitation in this low-resource setting.

  2. Association between lifetime exposure to inorganic arsenic in drinking water and coronary heart disease in Colorado residents.

    PubMed

    James, Katherine A; Byers, Tim; Hokanson, John E; Meliker, Jaymie R; Zerbe, Gary O; Marshall, Julie A

    2015-02-01

    Chronic diseases, including coronary heart disease (CHD), have been associated with ingestion of drinking water with high levels of inorganic arsenic (> 1,000 μg/L). However, associations have been inconclusive in populations with lower levels (< 100 μg/L) of inorganic arsenic exposure. We conducted a case-cohort study based on individual estimates of lifetime arsenic exposure to examine the relationship between chronic low-level arsenic exposure and risk of CHD. This study included 555 participants with 96 CHD events diagnosed between 1984 and 1998, for whom individual lifetime arsenic exposure estimates were determined from structured interviews and secondary data sources on lifetime residence, linked to a geospatial model of arsenic concentrations in drinking water. These lifetime arsenic exposure estimates were correlated with historically collected urinary arsenic concentrations. A Cox proportional-hazards model with time-dependent CHD risk factors was used to assess the association between time-weighted average (TWA) lifetime exposure to low-level inorganic arsenic in drinking water and incident CHD. We estimated a positive association between low-level inorganic arsenic exposure and CHD risk [hazard ratio (HR) = 1.38; 95% CI: 1.09, 1.78] per 15 μg/L while adjusting for age, sex, first-degree family history of CHD, and serum low-density lipoprotein levels. The risk of CHD increased monotonically with increasing TWAs for inorganic arsenic exposure in water relative to < 20 μg/L (HR = 1.2; 95% CI: 0.6, 2.2 for 20-30 μg/L; HR = 2.2; 95% CI: 1.2, 4.0 for 30-45 μg/L; and HR = 3.0; 95% CI: 1.1, 9.1 for 45-88 μg/L). Lifetime exposure to low-level inorganic arsenic in drinking water was associated with increased risk for CHD in this population.
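
    As an illustration of the modeling approach described (not the study's data or code), a Cox proportional-hazards fit on synthetic data with a TWA-exposure covariate, using the Python lifelines library; every variable name and value below is invented:

      import numpy as np
      import pandas as pd
      from lifelines import CoxPHFitter

      rng = np.random.default_rng(0)
      n = 555  # echoes the cohort size only cosmetically; data are synthetic

      # Invented covariates: TWA arsenic exposure (per 15 ug/L increment)
      # plus the adjustment variables named in the abstract.
      df = pd.DataFrame({
          "twa_as_per15": rng.gamma(2.0, 1.0, n),
          "age": rng.normal(55.0, 10.0, n),
          "male": rng.integers(0, 2, n),
          "fam_history": rng.integers(0, 2, n),
          "ldl": rng.normal(130.0, 25.0, n),
      })

      # Synthetic follow-up times in which exposure truly raises the hazard.
      lin = 0.3 * df["twa_as_per15"]
      df["years"] = rng.exponential(scale=14.0 * np.exp(-(lin - lin.mean())))
      df["chd_event"] = (rng.random(n) < 0.17).astype(int)

      cph = CoxPHFitter()
      cph.fit(df, duration_col="years", event_col="chd_event")
      cph.print_summary()  # exp(coef) of twa_as_per15 approximates an HR per 15 ug/L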

  3. Driving Responses of Older and Younger Drivers in a Driving Simulator

    PubMed Central

    Fildes, Brian; Charlton, Judith; Muir, Carlyn; Koppel, Sjaanie

    2007-01-01

    This paper reports the findings of a study of younger and older drivers' behaviour in response to hazardous traffic manoeuvres in a driving simulator. Hazardous situations on a highway drive and a residential drive were studied, and drivers' vision and vehicle performance responses were collected. While all drivers were able to avoid crashes, the finding that older drivers were consistently slower to fixate on hazardous stimuli in the driving environment and slower to respond presents a potentially serious road safety concern. Further research is warranted, especially under conditions of increasing traffic complexity. PMID:18184513

  4. Hazards Caused by UV Rays of Xenon Light Based High Performance Solar Simulators.

    PubMed

    Dibowski, Gerd; Esser, Kai

    2017-09-01

    Solar furnaces are used worldwide to conduct experiments that demonstrate the feasibility of solar-chemical processes with the aid of concentrated sunlight, or to qualify high-temperature-resistant components. In recent years, high-flux solar simulators (HFSSs) based on short-arc xenon lamps have been used more frequently. The emitted spectrum is very similar to natural sunlight, but includes dangerous portions of ultraviolet light as well. Owing to the particular benefits of solar simulators, increasing HFSS construction activity can be observed worldwide. It is therefore important to protect employees against serious injuries caused by ultraviolet radiation (UVR) in the range of 100 nm to 400 nm. The UV measurements were made at the German Aerospace Center (DLR), Cologne, and the Paul Scherrer Institute (PSI), Switzerland, during normal operation of the HFSS, with a high-precision UV-A/B radiometer and different experimental setups at different power levels. The measurement results thus represent UV emissions typical of HFSS operation. The biological effects on people exposed to UVR were investigated systematically to identify the existing hazard potential. Notably, the permissible workplace exposure limits for UV emissions were significantly exceeded after a few seconds; one critical value was exceeded by a factor of 770. Prevention of emissions must first and foremost be achieved by structural measures. Furthermore, unambiguous protocols have to be defined and compliance must be monitored. For short-term activities in the hazard area, measures for the protection of eyes and skin must be taken.

  5. Lunar mission safety and rescue: Hazards analysis and safety requirements

    NASA Technical Reports Server (NTRS)

    1971-01-01

    The results of the hazards analysis are presented; the analysis was concerned only with hazards to personnel, not with loss of equipment or property. Hazards characterization includes the definition of a hazard, the hazard levels, and the hazard groups. The analysis methodology is described in detail. The methodology was used to prepare the top-level functional flow diagrams, to perform the first-level hazards assessment, and to develop a list of conditions and situations requiring individual hazard studies. The results of all 39 individual hazard studies are presented.

  6. Simulation of South-Asian Summer Monsoon in a GCM

    NASA Astrophysics Data System (ADS)

    Ajayamohan, R. S.

    2007-10-01

    Major characteristics of the Indian summer monsoon climate are analyzed using simulations from the upgraded version of the Florida State University Global Spectral Model (FSUGSM). The Indian monsoon is studied in terms of mean precipitation and low-level and upper-level circulation patterns and compared with observations. In addition, the model's fidelity in simulating observed monsoon intraseasonal variability, interannual variability and teleconnection patterns is examined. The model is successful in simulating the major rainbelts over the Indian monsoon region. However, the model exhibits a bias in simulating the precipitation bands over the South China Sea and the West Pacific region. Seasonal mean circulation patterns of low-level and upper-level winds are consistent with the model's precipitation pattern. Basic features such as the onset and peak phases of the monsoon are realistically simulated. However, the model simulation indicates an early withdrawal of the monsoon. Northward propagation of rainbelts over the Indian continent is simulated fairly well, but the propagation is weak over the ocean. The model realistically simulates the meridional dipole structure associated with monsoon intraseasonal variability. The model is unable to capture the observed interannual variability of the monsoon and its teleconnection patterns. An estimate of the model's potential predictability reveals the dominating influence of internal variability over the Indian monsoon region.

  7. Efficacy of mechanical fuel treatments for reducing wildfire hazard

    Treesearch

    Robert J. Jr. Huggett; Karen L. Abt; Wayne Shepperd

    2008-01-01

    Mechanical fuel treatments are increasingly being used for wildfire hazard reduction in the western U.S. However, the efficacy of these treatments for reducing wildfire hazard at a landscape scale is difficult to quantify, especially when including growth following treatment. A set of uneven- and even-aged treatments designed to reduce fire hazard were simulated on 0.8...

  8. Potential Hazard to Human Health from Exposure to Fragments of Lead Bullets and Shot in the Tissues of Game Animals

    PubMed Central

    Pain, Deborah J.; Cromie, Ruth L.; Newth, Julia; Brown, Martin J.; Crutcher, Eric; Hardman, Pippa; Hurst, Louise; Mateo, Rafael; Meharg, Andrew A.; Moran, Annette C.; Raab, Andrea; Taggart, Mark A.; Green, Rhys E.

    2010-01-01

    Background Lead is highly toxic to animals. Humans eating game killed using lead ammunition generally avoid swallowing shot or bullets and dietary lead exposure from this source has been considered low. Recent evidence illustrates that lead bullets fragment on impact, leaving small lead particles widely distributed in game tissues. Our paper asks whether lead gunshot pellets also fragment upon impact, and whether lead derived from spent gunshot and bullets in the tissues of game animals could pose a threat to human health. Methodology/Principal Findings Wild-shot gamebirds (6 species) obtained in the UK were X-rayed to determine the number of shot and shot fragments present, and cooked using typical methods. Shot were then removed to simulate realistic practice before consumption, and lead concentrations determined. Data from the Veterinary Medicines Directorate Statutory Surveillance Programme documenting lead levels in raw tissues of wild gamebirds and deer, without shot being removed, are also presented. Gamebirds containing ≥5 shot had high tissue lead concentrations, but some with fewer or no shot also had high lead concentrations, confirming X-ray results indicating that small lead fragments remain in the flesh of birds even when the shot exits the body. A high proportion of samples from both surveys had lead concentrations exceeding the European Union Maximum Level of 100 ppb w.w. (0.1 mg kg−1 w.w.) for meat from bovine animals, sheep, pigs and poultry (no level is set for game meat), some by several orders of magnitude. High, but feasible, levels of consumption of some species could result in the current FAO/WHO Provisional Weekly Tolerable Intake of lead being exceeded. Conclusions/Significance The potential health hazard from lead ingested in the meat of game animals may be larger than previous risk assessments indicated, especially for vulnerable groups, such as children, and those consuming large amounts of game. PMID:20436670

  9. Potential hazard to human health from exposure to fragments of lead bullets and shot in the tissues of game animals.

    PubMed

    Pain, Deborah J; Cromie, Ruth L; Newth, Julia; Brown, Martin J; Crutcher, Eric; Hardman, Pippa; Hurst, Louise; Mateo, Rafael; Meharg, Andrew A; Moran, Annette C; Raab, Andrea; Taggart, Mark A; Green, Rhys E

    2010-04-26

    Lead is highly toxic to animals. Humans eating game killed using lead ammunition generally avoid swallowing shot or bullets and dietary lead exposure from this source has been considered low. Recent evidence illustrates that lead bullets fragment on impact, leaving small lead particles widely distributed in game tissues. Our paper asks whether lead gunshot pellets also fragment upon impact, and whether lead derived from spent gunshot and bullets in the tissues of game animals could pose a threat to human health. Wild-shot gamebirds (6 species) obtained in the UK were X-rayed to determine the number of shot and shot fragments present, and cooked using typical methods. Shot were then removed to simulate realistic practice before consumption, and lead concentrations determined. Data from the Veterinary Medicines Directorate Statutory Surveillance Programme documenting lead levels in raw tissues of wild gamebirds and deer, without shot being removed, are also presented. Gamebirds containing ≥5 shot had high tissue lead concentrations, but some with fewer or no shot also had high lead concentrations, confirming X-ray results indicating that small lead fragments remain in the flesh of birds even when the shot exits the body. A high proportion of samples from both surveys had lead concentrations exceeding the European Union Maximum Level of 100 ppb w.w. (0.1 mg kg⁻¹ w.w.) for meat from bovine animals, sheep, pigs and poultry (no level is set for game meat), some by several orders of magnitude. High, but feasible, levels of consumption of some species could result in the current FAO/WHO Provisional Weekly Tolerable Intake of lead being exceeded. The potential health hazard from lead ingested in the meat of game animals may be larger than previous risk assessments indicated, especially for vulnerable groups, such as children, and those consuming large amounts of game.

  10. Reduced high-density lipoprotein cholesterol: A valuable, independent prognostic marker in peripheral arterial disease.

    PubMed

    Martinez-Aguilar, Esther; Orbe, Josune; Fernández-Montero, Alejandro; Fernández-Alonso, Sebastián; Rodríguez, Jose A; Fernández-Alonso, Leopoldo; Páramo, Jose A; Roncal, Carmen

    2017-11-01

    The prognosis of patients with peripheral arterial disease (PAD) is characterized by an exceptionally high risk for myocardial infarction, ischemic stroke, and death; however, studies in search of new prognostic biomarkers in PAD are scarce. Even though low levels of high-density lipoprotein cholesterol (HDL-C) have been associated with higher risk of cardiovascular (CV) complications and death in different atherosclerotic diseases, recent epidemiologic studies have challenged its prognostic utility. The aim of this study was to test the predictive value of HDL-C as a risk factor for ischemic events or death in symptomatic PAD patients. Clinical and demographic parameters of 254 symptomatic PAD patients were recorded. Amputation, ischemic coronary disease, cerebrovascular disease, and all-cause mortality were recorded during a mean follow-up of 2.7 years. Multivariate analyses showed that disease severity (critical limb ischemia) was significantly reduced in patients with normal HDL-C levels compared with the group with low HDL-C levels (multivariate analysis odds ratio, 0.09; 95% confidence interval [CI], 0.03-0.24). A decreased risk for mortality (hazard ratio, 0.46; 95% CI, 0.21-0.99) and major adverse CV events (hazard ratio, 0.38; 95% CI, 0.16-0.86) was also found in patients with normal vs reduced levels of HDL-C in both Cox proportional hazards models and Kaplan-Meier estimates, after adjustment for confounding factors. Reduced HDL-C levels were significantly associated with higher risk for development of CV complications as well as with mortality in PAD patients. These findings highlight the usefulness of this simple test for early identification of PAD patients at high risk for development of major CV events. Copyright © 2017 Society for Vascular Surgery. Published by Elsevier Inc. All rights reserved.

  11. Rational risk-based decision support for drinking water well managers by optimized monitoring designs

    NASA Astrophysics Data System (ADS)

    Enzenhöfer, R.; Geiges, A.; Nowak, W.

    2011-12-01

    Advection-based well-head protection zones are commonly used to manage the contamination risk of drinking water wells. Considering the insufficient knowledge about hazards and transport properties within the catchment, current Water Safety Plans recommend that catchment managers and stakeholders know, control and monitor all possible hazards within the catchments and perform rational risk-based decisions. Our goal is to supply catchment managers with the required probabilistic risk information, and to generate tools that allow for optimal and rational allocation of resources between improved monitoring versus extended safety margins and risk mitigation measures. To support risk managers with the indispensable information, we address the epistemic uncertainty of advective-dispersive solute transport and well vulnerability (Enzenhoefer et al., 2011) within a stochastic simulation framework. Our framework can separate the uncertainty in contaminant location from the actual dilution of peak concentrations by resolving heterogeneity with high-resolution Monte Carlo simulation. To keep computational costs low, we solve the reverse temporal moment transport equation. Only in post-processing do we recover the time-dependent solute breakthrough curves and the deduced well vulnerability criteria from temporal moments by non-linear optimization. Our first step towards optimal risk management is optimal positioning of sampling locations and optimal choice of data types to best reduce the epistemic prediction uncertainty for well-head delineation, using the cross-bred Likelihood Uncertainty Estimator (CLUE, Leube et al., 2011) for optimal sampling design. Better monitoring leads to more reliable and realistic protection zones and thus helps catchment managers to better justify smaller, yet conservative, safety margins. In order to allow an optimal choice of sampling strategy, we compare the trade-off between monitoring costs and delineation costs by accounting for ill-delineated fractions of protection zones. Within an illustrative, simplified 2D synthetic test case, we demonstrate our concept, involving synthetic transmissivity and head measurements for conditioning. We demonstrate the worth of optimally collected data in the context of protection zone delineation by assessing the reduction in delineated area at a user-specified risk acceptance level. Results indicate that, thanks to optimally collected data, risk-aware delineation can be performed at low to moderate additional cost compared to conventional delineation strategies.
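
    A minimal sketch of the temporal-moment post-processing idea: recovering a mean arrival time from a breakthrough curve and applying a simple travel-time vulnerability criterion. The curve and the 100-day threshold are hypothetical:

      import numpy as np

      # Synthetic breakthrough curve C(t) at a well: a lognormal-shaped pulse.
      t = np.linspace(0.0, 500.0, 1001)  # time [days]
      c = np.exp(-0.5 * ((np.log(t + 1e-9) - np.log(120.0)) / 0.4) ** 2)

      # Temporal moments: m0 ~ recovered mass, m1/m0 ~ mean arrival time.
      m0 = np.trapezoid(c, t)      # np.trapz on NumPy < 2.0
      m1 = np.trapezoid(t * c, t)
      mean_arrival = m1 / m0

      # Example vulnerability criterion: mean arrival inside a 100-day zone.
      print(f"mean arrival: {mean_arrival:.1f} d; "
            f"inside 100-day zone: {mean_arrival < 100.0}")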

  12. Combining slope stability and groundwater flow models to assess stratovolcano collapse hazard

    NASA Astrophysics Data System (ADS)

    Ball, J. L.; Taron, J.; Reid, M. E.; Hurwitz, S.; Finn, C.; Bedrosian, P.

    2016-12-01

    Flank collapses are a well-documented hazard at volcanoes. Elevated pore-fluid pressures and hydrothermal alteration are invoked as potential causes for the instability in many of these collapses. Because pore pressure is linked to water saturation and permeability of volcanic deposits, hydrothermal alteration is often suggested as a means of creating low-permeability zones in volcanoes. Here, we seek to address the question: What alteration geometries will produce elevated pore pressures in a stratovolcano, and what are the effects of these elevated pressures on slope stability? We initially use a finite element groundwater flow model (a modified version of OpenGeoSys) to simulate 'generic' stratovolcano geometries that produce elevated pore pressures. We then input these results into the USGS slope-stability code Scoops3D to investigate the effects of alteration and magmatic intrusion on potential flank failure. This approach integrates geophysical data about subsurface alteration, water saturation and rock mechanical properties with data about precipitation and heat influx at Cascade stratovolcanoes. Our simulations show that it is possible to maintain high-elevation water tables in stratovolcanoes given specific ranges of edifice permeability (ideally between 10⁻¹⁵ and 10⁻¹⁶ m²). Low-permeability layers (10⁻¹⁷ m², representing altered pyroclastic deposits or altered breccias) in the volcanoes can localize saturated regions close to the surface, but they may actually reduce saturation, pore pressures, and water table levels in the core of the volcano. These conditions produce universally lower factor-of-safety (F) values than at an equivalent dry edifice with the same material properties (lower values of F indicate a higher likelihood of collapse). When magmatic intrusions into the base of the cone are added, near-surface pore pressures increase and F decreases exponentially with time (approximately 7-8% in the first year). However, while near-surface impermeable layers create elevated water tables and pore pressures, they do not necessarily produce the largest or deepest collapses. This suggests that mechanical properties of both the edifice and layers still exert a significant control, and collapse volumes depend on a complex interplay of mechanical factors and layering.
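
    To see why elevated pore pressure lowers the factor of safety F, a one-dimensional infinite-slope calculation with a pore-pressure term is sufficient; this is only a stand-in for the 3D limit-equilibrium search performed by Scoops3D, with hypothetical material values:

      import numpy as np

      # Infinite-slope factor of safety with a pore-pressure term.
      def factor_of_safety(c_eff, phi_deg, gamma, z, beta_deg, u):
          """c_eff: cohesion [Pa]; phi_deg: friction angle [deg];
          gamma: unit weight [N/m^3]; z: slip depth [m];
          beta_deg: slope angle [deg]; u: pore pressure on the slip plane [Pa]."""
          beta, phi = np.radians(beta_deg), np.radians(phi_deg)
          normal = gamma * z * np.cos(beta) ** 2           # normal stress
          shear = gamma * z * np.sin(beta) * np.cos(beta)  # driving shear stress
          return (c_eff + (normal - u) * np.tan(phi)) / shear

      # Same hypothetical slope, dry versus elevated pore pressure:
      print(factor_of_safety(5e4, 35.0, 2.2e4, 50.0, 30.0, u=0.0))  # ~1.32
      print(factor_of_safety(5e4, 35.0, 2.2e4, 50.0, 30.0, u=3e5))  # ~0.88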

  13. Finite element modeling of ROPS in static testing and rear overturns.

    PubMed

    Harris, J R; Mucino, V H; Etherton, J R; Snyder, K A; Means, K H

    2000-08-01

    Even with the technological advances of the last several decades, agricultural production remains one of the most hazardous occupations in the United States. Death due to tractor rollover is a prime contributor to this hazard. Standards for rollover protective structures (ROPS) performance and certification have been developed by groups such as the Society of Automotive Engineers (SAE) and the American Society of Agricultural Engineers (ASAE) to combat these problems. The current ROPS certification standard, SAE J2194, requires either a dynamic or static testing sequence or both. Although some ROPS manufacturers perform both the dynamic and static phases of SAE J2194 testing, it is possible for a ROPS to be certified for field operation using static testing alone. This research compared ROPS deformation response from a simulated SAE J2194 static loading sequence to ROPS deformation response as a result of a simulated rearward tractor rollover. Finite element analysis techniques for plastic deformation were used to simulate both the static and dynamic rear rollover scenarios. Stress results from the rear rollover model were compared to results from simulated static testing per SAE J2194. Maximum stress values from simulated rear rollovers exceeded maximum stress values recorded during simulated static testing for half of the elements comprising the uprights. In the worst case, the static model underpredicts dynamic model results by approximately 7%. In the best case, the static model overpredicts dynamic model results by approximately 32%. These results suggest the need for additional experimental work to characterize ROPS stress levels during staged overturns and during testing according to the SAE standard.

  14. An Integrated Scenario Ensemble-Based Framework for Hurricane Evacuation Modeling: Part 2-Hazard Modeling.

    PubMed

    Blanton, Brian; Dresback, Kendra; Colle, Brian; Kolar, Randy; Vergara, Humberto; Hong, Yang; Leonardo, Nicholas; Davidson, Rachel; Nozick, Linda; Wachtendorf, Tricia

    2018-04-25

    Hurricane track and intensity can change rapidly in unexpected ways, thus making predictions of hurricanes and related hazards uncertain. This inherent uncertainty often translates into suboptimal decision-making outcomes, such as unnecessary evacuation. Representing this uncertainty is thus critical in evacuation planning and related activities. We describe a physics-based hazard modeling approach that (1) dynamically accounts for the physical interactions among hazard components and (2) captures hurricane evolution uncertainty using an ensemble method. This loosely coupled model system provides a framework for probabilistic water inundation and wind speed levels for a new, risk-based approach to evacuation modeling, described in a companion article in this issue. It combines the Weather Research and Forecasting (WRF) meteorological model, the Coupled Routing and Excess STorage (CREST) hydrologic model, and the ADvanced CIRCulation (ADCIRC) storm surge, tide, and wind-wave model to compute inundation levels and wind speeds for an ensemble of hurricane predictions. Perturbations to WRF's initial and boundary conditions and different model physics/parameterizations generate an ensemble of storm solutions, which are then used to drive the coupled hydrologic + hydrodynamic models. Hurricane Isabel (2003) is used as a case study to illustrate the ensemble-based approach. The inundation, river runoff, and wind hazard results are strongly dependent on the accuracy of the mesoscale meteorological simulations, which improves with decreasing lead time to hurricane landfall. The ensemble envelope brackets the observed behavior while providing "best-case" and "worst-case" scenarios for the subsequent risk-based evacuation model. © 2018 Society for Risk Analysis.
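
    A sketch of how an ensemble envelope and exceedance probabilities can be extracted from member predictions; the depth matrix below is a synthetic stand-in for ADCIRC output driven by the WRF ensemble:

      import numpy as np

      # Rows = ensemble members (perturbed WRF runs), columns = locations.
      rng = np.random.default_rng(1)
      depths = rng.gamma(shape=2.0, scale=0.8, size=(24, 5))  # peak depths [m]

      best_case = depths.min(axis=0)             # lower envelope
      worst_case = depths.max(axis=0)            # upper envelope
      p_exceed_1m = (depths > 1.0).mean(axis=0)  # probabilistic exceedance

      for j in range(depths.shape[1]):
          print(f"site {j}: {best_case[j]:.2f}-{worst_case[j]:.2f} m, "
                f"P(depth > 1 m) = {p_exceed_1m[j]:.2f}")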

  15. Low-level radioactive waste management handbook series: Low-level radioactive waste management in medical and biomedical research institutions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1987-03-01

    Development of this handbook began in 1982 at the request of the Radhealth Branch of the California Department of Health Services. California Assembly Bill 1513 directed the DHS to ''evaluate the technical and economic feasibility of (1) reducing the volume, reactivity, and chemical and radioactive hazard of (low-level radioactive) waste and (2) substituting nonradioactive or short-lived radioactive materials for those radionuclides which require long-term isolation from the environment.'' A contract awarded to the University of California at Irvine-UCI (California Std. Agreement 79902), to develop a document focusing on methods for decreasing low-level radioactive waste (LLW) generation in institutions, was a result of that directive. In early 1985, the US Department of Energy, through EG and G Idaho, Inc., contracted with UCI to expand, update, and revise the California text for national release.

  16. Implications of new data on lead toxicity for managing and preventing exposure.

    PubMed Central

    Silbergeld, E K

    1990-01-01

    Recent advances in research on low-level lead poisoning point to the need to increase efforts to prevent exposure. Current biomedical consensus accepts that blood lead levels as low as 5 to 15 mcg/dL are risky to fetuses, young children, and adults. Lead at low dose is associated with increased blood pressure in adults, and chronic exposure has been associated in cohort studies with kidney disease and cancer. Data on lead toxicokinetics also point to the hazards of low-level, chronic exposure, since the lead that is accumulated over time in bone can be released at a relatively rapid rate during pregnancy and menopause. Sources that contribute to current lead exposure of the general population include unabated lead-based paint and contaminated soils, as well as lower level but pervasive sources in drinking water, food, and consumer products. PMID:2088754

  17. Implications of new data on lead toxicity for managing and preventing exposure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Silbergeld, E.K.

    1990-11-01

    Recent advances in research on low-level lead poisoning point to the need to increase efforts to prevent exposure. Current biomedical consensus accepts that blood lead levels as low as 5 to 15 mcg/dL are risky to fetuses, young children, and adults. Lead at low dose is associated with increased blood pressure in adults, and chronic exposure has been associated in cohort studies with kidney disease and cancer. Data on lead toxicokinetics also point to the hazards of low-level, chronic exposure, since the lead that is accumulated over time in bone can be released at a relatively rapid rate during pregnancy and menopause. Sources that contribute to current lead exposure of the general population include unabated lead-based paint and contaminated soils, as well as lower level but pervasive sources in drinking water, food, and consumer products.

  18. A digital retina-like low-level vision processor.

    PubMed

    Mertoguno, S; Bourbakis, N G

    2003-01-01

    This correspondence presents the basic design and simulation of a low-level multilayer vision processor that emulates to some degree the functional behavior of a human retina. This retina-like multilayer processor is the lower part of an autonomous self-organized vision system, called Kydon, that could be used to assist visually impaired people with a damaged visual cerebral cortex. The Kydon vision system, however, is not presented in this paper. The retina-like processor consists of four major layers, each of which is an array processor based on hexagonal, autonomous processing elements that perform a certain set of low-level vision tasks, such as smoothing and light adaptation, edge detection, segmentation, line recognition, and region-graph generation. At each layer, the array processor is a 2D array of k×m identical, autonomous hexagonal cells that simultaneously execute certain low-level vision tasks. The hardware design, the transistor-level simulation of the processing elements (PEs) of the retina-like processor, and its simulated functionality are provided in this paper with illustrative examples.

  19. Apparatus for incinerating hazardous waste

    DOEpatents

    Chang, Robert C. W.

    1994-01-01

    An apparatus for incinerating wastes, including an incinerator having a combustion chamber, a fluidtight shell enclosing the combustion chamber, an afterburner, an off-gas particulate removal system and an emergency off-gas cooling system. The region between the inner surface of the shell and the outer surface of the combustion chamber forms a cavity. Air is supplied to the cavity and heated as it passes over the outer surface of the combustion chamber. Heated air is drawn from the cavity and mixed with fuel for input into the combustion chamber. The pressure in the cavity is maintained at least approximately 2.5 cm WC (about 1" WC) higher than the pressure in the combustion chamber. Gases cannot leak from the combustion chamber since the pressure outside the chamber (inside the cavity) is higher than the pressure inside the chamber. The apparatus can be used to treat any combustible wastes, including biological wastes, toxic materials, low level radioactive wastes, and mixed hazardous and low level transuranic wastes.
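
    The containment principle described (cavity pressure held above chamber pressure so that any leakage flows inward) reduces to a simple differential check. An illustrative Python sketch with hypothetical sensor readings, not the patented control logic:

      MIN_DIFFERENTIAL_CM_WC = 2.5  # cavity must exceed chamber by this margin

      def cavity_seal_ok(cavity_cm_wc: float, chamber_cm_wc: float) -> bool:
          """True while any leakage can only flow into the chamber."""
          return (cavity_cm_wc - chamber_cm_wc) >= MIN_DIFFERENTIAL_CM_WC

      # Hypothetical (cavity, chamber) pressure readings in cm WC:
      for cavity, chamber in [(10.0, 7.0), (9.0, 7.2), (8.5, 7.0)]:
          status = "OK" if cavity_seal_ok(cavity, chamber) else "ALARM"
          print(f"differential {cavity - chamber:.1f} cm WC: {status}")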

  20. The contribution of low tar cigarettes to environmental tobacco smoke

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chortyk, O.T.; Schlotzhauer, W.S.

    A series of low tar cigarettes (LTC) were smoked and the quantities of condensable mainstream (inhaled) and sidestream (between puffs) smoke compounds were determined and compared to those produced by a high tar, nonfilter cigarette. It was found that the LTC produced large quantities of sidestream smoke condensates, about equal to the high tar cigarette, and contained very high levels of toxic or cocarcinogenic phenols. On an equal weight basis, the LTC emitted more of these hazardous compounds into sidestream and environmental tobacco smoke. Higher smoke yields of a flavor additive and a sugar degradation product indicated addition of such compounds during the manufacture of LTC. It was concluded that, compared to a high tar cigarette, smoking LTC may be better for the smoker, but not for the nearby nonsmoker. Information should be developed to allow smokers to choose LTC that produce lower levels of hazardous compounds in their environmentally emitted sidestream smoke.

  1. Apparatus for incinerating hazardous waste

    DOEpatents

    Chang, R.C.W.

    1994-12-20

    An apparatus is described for incinerating wastes, including an incinerator having a combustion chamber, a fluid-tight shell enclosing the combustion chamber, an afterburner, an off-gas particulate removal system and an emergency off-gas cooling system. The region between the inner surface of the shell and the outer surface of the combustion chamber forms a cavity. Air is supplied to the cavity and heated as it passes over the outer surface of the combustion chamber. Heated air is drawn from the cavity and mixed with fuel for input into the combustion chamber. The pressure in the cavity is maintained at least approximately 2.5 cm WC higher than the pressure in the combustion chamber. Gases cannot leak from the combustion chamber since the pressure outside the chamber (inside the cavity) is higher than the pressure inside the chamber. The apparatus can be used to treat any combustible wastes, including biological wastes, toxic materials, low level radioactive wastes, and mixed hazardous and low level transuranic wastes. 1 figure.

  2. CyberShake: A Physics-Based Seismic Hazard Model for Southern California

    NASA Astrophysics Data System (ADS)

    Graves, Robert; Jordan, Thomas H.; Callaghan, Scott; Deelman, Ewa; Field, Edward; Juve, Gideon; Kesselman, Carl; Maechling, Philip; Mehta, Gaurang; Milner, Kevin; Okaya, David; Small, Patrick; Vahi, Karan

    2011-03-01

    CyberShake, as part of the Southern California Earthquake Center's (SCEC) Community Modeling Environment, is developing a methodology that explicitly incorporates deterministic source and wave propagation effects within seismic hazard calculations through the use of physics-based 3D ground motion simulations. To calculate a waveform-based seismic hazard estimate for a site of interest, we begin with Uniform California Earthquake Rupture Forecast, Version 2.0 (UCERF2.0) and identify all ruptures within 200 km of the site of interest. We convert the UCERF2.0 rupture definition into multiple rupture variations with differing hypocenter locations and slip distributions, resulting in about 415,000 rupture variations per site. Strain Green Tensors are calculated for the site of interest using the SCEC Community Velocity Model, Version 4 (CVM4), and then, using reciprocity, we calculate synthetic seismograms for each rupture variation. Peak intensity measures are then extracted from these synthetics and combined with the original rupture probabilities to produce probabilistic seismic hazard curves for the site. Being explicitly site-based, CyberShake directly samples the ground motion variability at that site over many earthquake cycles (i.e., rupture scenarios) and alleviates the need for the ergodic assumption that is implicitly included in traditional empirically based calculations. Thus far, we have simulated ruptures at over 200 sites in the Los Angeles region for ground shaking periods of 2 s and longer, providing the basis for the first generation CyberShake hazard maps. Our results indicate that the combination of rupture directivity and basin response effects can lead to an increase in the hazard level for some sites, relative to that given by a conventional Ground Motion Prediction Equation (GMPE). Additionally, and perhaps more importantly, we find that the physics-based hazard results are much more sensitive to the assumed magnitude-area relations and magnitude uncertainty estimates used in the definition of the ruptures than is found in the traditional GMPE approach. This reinforces the need for continued development of a better understanding of earthquake source characterization and the constitutive relations that govern the earthquake rupture process.
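
    A minimal sketch of the final assembly step in this kind of calculation: converting per-rupture annual rates and simulated peak intensities into a site hazard curve. All rates and intensities below are random placeholders, and the within-rupture ground-motion variability that CyberShake samples is omitted:

      import numpy as np

      rng = np.random.default_rng(2)
      n_rup = 1000
      annual_rate = rng.uniform(1e-6, 1e-3, n_rup)               # per rupture variation
      peak_sa = rng.lognormal(mean=-2.0, sigma=0.8, size=n_rup)  # simulated SA [g]

      im_levels = np.logspace(-2, 0, 5)  # intensity levels of interest [g]
      # Annual exceedance rate at each level: sum the rates of all rupture
      # variations whose simulated intensity exceeds that level.
      exceedance = np.array([annual_rate[peak_sa > x].sum() for x in im_levels])

      for x, lam in zip(im_levels, exceedance):
          print(f"rate of SA > {x:.3f} g: {lam:.2e} per year")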

  3. CyberShake: A Physics-Based Seismic Hazard Model for Southern California

    USGS Publications Warehouse

    Graves, R.; Jordan, T.H.; Callaghan, S.; Deelman, E.; Field, E.; Juve, G.; Kesselman, C.; Maechling, P.; Mehta, G.; Milner, K.; Okaya, D.; Small, P.; Vahi, K.

    2011-01-01

    CyberShake, as part of the Southern California Earthquake Center's (SCEC) Community Modeling Environment, is developing a methodology that explicitly incorporates deterministic source and wave propagation effects within seismic hazard calculations through the use of physics-based 3D ground motion simulations. To calculate a waveform-based seismic hazard estimate for a site of interest, we begin with Uniform California Earthquake Rupture Forecast, Version 2.0 (UCERF2.0) and identify all ruptures within 200 km of the site of interest. We convert the UCERF2.0 rupture definition into multiple rupture variations with differing hypocenter locations and slip distributions, resulting in about 415,000 rupture variations per site. Strain Green Tensors are calculated for the site of interest using the SCEC Community Velocity Model, Version 4 (CVM4), and then, using reciprocity, we calculate synthetic seismograms for each rupture variation. Peak intensity measures are then extracted from these synthetics and combined with the original rupture probabilities to produce probabilistic seismic hazard curves for the site. Being explicitly site-based, CyberShake directly samples the ground motion variability at that site over many earthquake cycles (i.e., rupture scenarios) and alleviates the need for the ergodic assumption that is implicitly included in traditional empirically based calculations. Thus far, we have simulated ruptures at over 200 sites in the Los Angeles region for ground shaking periods of 2 s and longer, providing the basis for the first generation CyberShake hazard maps. Our results indicate that the combination of rupture directivity and basin response effects can lead to an increase in the hazard level for some sites, relative to that given by a conventional Ground Motion Prediction Equation (GMPE). Additionally, and perhaps more importantly, we find that the physics-based hazard results are much more sensitive to the assumed magnitude-area relations and magnitude uncertainty estimates used in the definition of the ruptures than is found in the traditional GMPE approach. This reinforces the need for continued development of a better understanding of earthquake source characterization and the constitutive relations that govern the earthquake rupture process. © 2010 Springer Basel AG.

  4. [Dynamic monitoring and analysis of occupational hazards in working environment of foundry plant from 1987 to 2010].

    PubMed

    Lu, Yang; Zhang, Min; Chen, Wei-hong; Qi, Cheng

    2013-08-01

    To investigate the characteristics and changing trend of occupational hazards in the working environment of a foundry plant from 1987 to 2010. The foundry plant of a large-scale automobile company in Hubei Province, China was chosen as the study site. The data on occupational hazards in the working environment of the foundry plant in the past years were collected, and additional measurements were performed. The means and geometric means of the concentrations of occupational hazards were calculated. The characteristics and changing trend of occupational hazards from 1987 to 2010 were presented. There were dust, chemical, and physical occupational hazards in the working environment of the foundry plant, with silica dust, noise, and heat stress as the main ones. Dust, mainly silica dust, was found in all stages of the foundry process. The mean concentration of silica dust was high (3.2-8.2 mg/m³), exceeding the national occupational exposure limit (1 mg/m³). The mean concentrations of silica dust varied across different types of work, with higher levels in cast shakeout and finishing, overhead crane operation, and sand preparation. The mean concentration of respirable dust in the foundry plant was low (0.38 mg/m³), not exceeding the national occupational exposure limit (0.7 mg/m³). There were high concentrations of grinding wheel dust (10.6 mg/m³) and welding fume (5.7 mg/m³) in cast shakeout and finishing, exceeding the national occupational exposure limits (8 and 4 mg/m³). Coal dust was mainly found in melting as well as cast shakeout and finishing, with a higher concentration in the former (4.7 mg/m³). The main chemical occupational hazard in the environment of the foundry plant was formaldehyde (1.23 mg/m³), exceeding the national occupational exposure limit (0.5 mg/m³). The concentrations of ammonia, phenol, metal fume, sulfur dioxide, hydrogen sulfide, and phosphine in the foundry plant were low. The mean concentration of polycyclic aromatic hydrocarbons was 0.1405 µg/m³, with a higher level in pouring. The main physical occupational hazards in the working environment of the foundry plant were noise and heat stress. Noise, mainly steady noise, was distributed in all workshops of the foundry plant, with a mean intensity of 85.1 dB(A). Noise levels varied across different types of work, higher in cast shakeout and finishing (89.3 dB(A)) and moulding (85.4 dB(A)). Heat stress mainly existed in overhead crane operation (35.1°C), pouring (33.3°C), and melting (32.8°C). Dust, chemical, and physical occupational hazards co-existed in the working environment of the foundry plant. High concentrations of dust were widely distributed across many workshops and types of work, but the dust concentration showed a downward trend. Chemical occupational hazards included ammonia, phenol, hydrogen sulfide, and metal fume, most at low concentrations. High-intensity noise was widely distributed in all working positions of the foundry process, mainly from equipment operation, collision between parts, and gas injection. High-intensity heat stress mainly existed in overhead crane operation, pouring, and melting.
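
    A small sketch of the summary statistics such surveys rely on: the geometric mean of repeated dust measurements compared against an exposure limit. The measurement values are hypothetical:

      import numpy as np

      silica_mg_m3 = np.array([2.8, 4.1, 6.3, 3.5, 8.0, 5.2])  # hypothetical samples
      oel_mg_m3 = 1.0  # occupational exposure limit for silica dust cited above

      geo_mean = np.exp(np.log(silica_mg_m3).mean())
      print(f"geometric mean = {geo_mean:.1f} mg/m3 "
            f"= {geo_mean / oel_mg_m3:.1f}x the exposure limit")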

  5. Dimensions of socioeconomic status and clinical outcome after primary percutaneous coronary intervention.

    PubMed

    Jakobsen, Lars; Niemann, Troels; Thorsgaard, Niels; Thuesen, Leif; Lassen, Jens F; Jensen, Lisette O; Thayssen, Per; Ravkilde, Jan; Tilsted, Hans H; Mehnert, Frank; Johnsen, Søren P

    2012-10-01

    The association between low socioeconomic status (SES) and high mortality from coronary heart disease is well-known. However, the role of SES in relation to the clinical outcome after primary percutaneous coronary intervention remains poorly understood. We studied 7385 patients treated with primary percutaneous coronary intervention. Participants were divided into high-SES and low-SES groups according to income, education, and employment status. The primary outcome was major adverse cardiac events (cardiac death, recurrent myocardial infarction, and target vessel revascularization) at maximum follow-up (mean, 3.7 years). Low-SES patients had more adverse baseline risk profiles than high-SES patients. The cumulative risk of major adverse cardiac events after maximum follow-up was higher among low-income patients and unemployed patients compared with their counterparts (income: hazard ratio, 1.68; 95% CI, 1.47-1.92; employment status: hazard ratio, 1.75; 95% CI, 1.46-2.10). After adjustment for patient characteristics, these differences were substantially attenuated (income: hazard ratio, 1.12; 95% CI, 0.93-1.33; employment status: hazard ratio, 1.27; 95% CI, 1.03-1.56). Further adjustment for admission findings, procedure-related data, and medical treatment during follow-up did not significantly affect the associations. With education as the SES indicator, no between-group differences were observed in the risk of the composite end point. Even in a tax-financed healthcare system, low-SES patients treated with primary percutaneous coronary intervention face a worse prognosis than high-SES patients. The poor outcome seems to be largely explained by differences in baseline patient characteristics. Employment status and income (but not education level) were associated with clinical outcomes.

  6. Law Enforcement Officers' Involvement Level in Hurricane Katrina and Alcohol Use.

    PubMed

    Heavey, Sarah Cercone; Homish, Gregory G; Andrew, Michael E; McCanlies, Erin; Mnatsakanova, Anna; Violanti, John M; Burchfiel, Cecil M

    2015-03-01

    The purpose of this work is to examine the relationship between alcohol use and level of involvement during Hurricane Katrina among law enforcement officers, and to investigate whether marital status or previous military training offer resilience against negative outcomes. Officers in the immediate New Orleans geographic area completed surveys that assessed their involvement in Hurricane Katrina and alcohol use (Alcohol Use and Disorders Identification Test (AUDIT) score). Negative binomial regression models were used to analyze level of hazardous alcohol use; interactions were tested to examine protective influences of marriage and prior military training (controlling for age and gender). There was a significant association between heavy involvement in Hurricane Katrina and having a greater AUDIT score (exp(β) [EB] = 1.81; 95% CI: 1.03, 3.17; p<0.05), indicating higher levels of hazardous alcohol use. Contrary to original hypotheses, marital status and military training were not protective against alcohol use (p>0.05). These results illustrate an association between law enforcement officers' heavy involvement during Hurricane Katrina and greater levels of hazardous alcohol use when compared to officers with low or moderate involvement. This has important treatment implications for those with high involvement in disasters, as they may require targeted interventions to overcome the stress of such experiences.
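
    For illustration, a negative binomial regression of the kind described, fit on synthetic data with the Python statsmodels package; every variable name and coefficient below is invented:

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm

      rng = np.random.default_rng(3)
      n = 400
      df = pd.DataFrame({
          "heavy_involvement": rng.integers(0, 2, n),
          "age": rng.normal(38.0, 9.0, n),
          "male": rng.integers(0, 2, n),
      })
      # Synthetic AUDIT counts whose rate ratio for involvement is ~1.8.
      mu = np.exp(0.6 + 0.59 * df["heavy_involvement"])
      df["audit"] = rng.poisson(mu)

      X = sm.add_constant(df[["heavy_involvement", "age", "male"]])
      fit = sm.GLM(df["audit"], X, family=sm.families.NegativeBinomial()).fit()
      print(np.exp(fit.params["heavy_involvement"]))  # exp(beta), cf. EB = 1.81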

  7. The Cellular Automata for modelling of spreading of lava flow on the earth surface

    NASA Astrophysics Data System (ADS)

    Jarna, A.

    2012-12-01

    Volcanic risk assessment is a very important scientific, political and economic issue in densely populated areas close to active volcanoes. Development of effective tools for early prediction of potential volcanic hazards and for management of crises is paramount. To date, volcanic hazard maps represent the most appropriate way to illustrate the geographical area that can potentially be affected by a volcanic event. Volcanic hazard maps are usually produced by mapping out old volcanic deposits; however, dynamic lava flow simulation is gaining popularity and can give crucial information to corroborate other methodologies. The methodology used here for the generation of volcanic hazard maps is based on numerical simulation of eruptive processes by the principle of Cellular Automata (CA). The Python script is integrated into ArcToolbox in ArcMap (ESRI), and the user can select several input and output parameters which influence surface morphology, size and shape of the flow, flow thickness, flow velocity and length of lava flows. Once the input parameters are selected, the software computes and generates hazard maps on the fly. The results can be exported to Google Maps (.kml format) to visualize the results of the computation. Data from a real lava flow are used to validate the simulation code. A comparison of the simulation results with real lava flows mapped out from satellite images will be presented.
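
    A toy cellular-automaton lava-spread step in the spirit of the approach described (the tool itself is a Python script inside ArcToolbox; this square-grid, wrap-around version is only a simplified stand-in with invented parameters):

      import numpy as np

      rng = np.random.default_rng(4)
      # Sloping synthetic terrain (elevations in m); elevation drops with row index.
      ground = np.sort(rng.random((40, 40)) * 50.0, axis=0)[::-1]
      lava = np.zeros_like(ground)
      lava[5, 20] = 30.0  # vent cell: initial lava column [m]

      def step(ground, lava, mobility=0.25):
          """Move a fraction of each cell's lava toward lower neighbors."""
          surface = ground + lava
          new = lava.copy()
          for shift in ((1, 0), (-1, 0), (0, 1), (0, -1)):
              nbr = np.roll(surface, shift, axis=(0, 1))   # neighbor's surface
              drop = np.clip(surface - nbr, 0.0, None)     # >0 where neighbor is lower
              flow = np.minimum(mobility * drop, new)      # bounded by cell contents
              new -= flow
              new += np.roll(flow, (-shift[0], -shift[1]), axis=(0, 1))
          return new

      for _ in range(100):
          lava = step(ground, lava)
      print(f"inundated cells: {int((lava > 0.01).sum())}")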

  8. Predicting System Accidents with Model Analysis During Hybrid Simulation

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Fleming, Land D.; Throop, David R.

    2002-01-01

    Standard discrete event simulation is commonly used to identify system bottlenecks and starving and blocking conditions in resources and services. The CONFIG hybrid discrete/continuous simulation tool can simulate such conditions in combination with inputs external to the simulation. This provides a means for evaluating the vulnerability of a system's design, operating procedures, and control software to system accidents. System accidents are brought about by complex unexpected interactions among multiple system failures, faulty or misleading sensor data, and inappropriate responses of human operators or software. The flows of resource and product materials play a central role in the hazardous situations that may arise in fluid transport and processing systems. We describe the capabilities of CONFIG for simulation-time linear circuit analysis of fluid flows in the context of model-based hazard analysis. We focus on how CONFIG simulates the static stresses in flow systems. Unlike other flow-related properties, static stresses (or static potentials) cannot be represented by a set of state equations. The distribution of static stresses depends on the specific history of operations performed on a system. We discuss the use of this type of information in hazard analysis of system designs.

  9. 76 FR 25576 - Pipeline Safety: Applying Safety Regulations to All Rural Onshore Hazardous Liquid Low-Stress Lines

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-05

    ... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration 49 CFR Part... to All Rural Onshore Hazardous Liquid Low-Stress Lines AGENCY: Pipeline and Hazardous Materials... burdensome to require operators of these pipelines to perform a complete ``could affect'' analysis to...

  10. U.S. Geological Survey research in radioactive waste disposal - Fiscal years 1986-1990

    USGS Publications Warehouse

    Trask, N.J.; Stevens, P.R.

    1991-01-01

    The report summarizes progress on geologic and hydrologic research related to the disposal of radioactive wastes. The research efforts are categorized according to whether they are related most directly to: (1) high-level wastes, (2) transuranic wastes, (3) low-level and mixed low-level and hazardous wastes, or (4) uranium mill tailings. Included is research applicable to the identification and geohydrologic characterization of waste-disposal sites, to investigations of specific sites where wastes have been stored, to development of techniques and methods for characterizing disposal sites, and to studies of geologic and hydrologic processes related to the transport and/or retention of waste radionuclides.

  11. Risk assessment of debris flow hazards in natural slope

    NASA Astrophysics Data System (ADS)

    Choi, Junghae; Chae, Byung-gon; Liu, Kofei; Wu, Yinghsin

    2016-04-01

    The study area is located in the north-eastern part of South Korea. According to the landslide susceptibility map (KIGAM, 2009) from the Korea Institute of Geoscience and Mineral Resources (KIGAM), there are large areas with high landslide potential on the mountain slopes near the study area. In addition, several severe landslide-induced debris flow hazards have recently occurred in this area, so the site is considered prone to debris flow hazards. In order to mitigate the influence of such hazards, assessment of potential debris flow hazards is very important and essential. In this assessment, we use Debris-2D, a debris flow numerical program, to assess the potential debris flow hazards. A worst-case scenario is considered for simulation. The input mass sources are determined using the landslide susceptibility map. The water input is based on the daily accumulated rainfall during a past debris flow event in the study area. The only input material property, the yield stress, is obtained by a calibration test. The simulation results show that the study area has the potential to be impacted by debris flow. Therefore, based on the simulation results, we can propose countermeasures to mitigate debris flow hazards, including building check dams, constructing a protection wall in the study area, and installing instruments for active monitoring of debris flow hazards. Acknowledgements: This research was supported by the Public Welfare & Safety Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Science, ICT & Future Planning (NRF-2012M3A2A1050983)
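
    A yield-stress rheology of the kind Debris-2D uses implies a simple stopping criterion: flow ceases where the gravitational driving stress falls below the calibrated yield stress. A sketch with hypothetical numbers:

      import numpy as np

      def still_flowing(depth_m, slope_deg, yield_stress_pa,
                        density_kg_m3=2000.0, g=9.81):
          """Compare the gravitational driving stress with the yield stress."""
          driving = density_kg_m3 * g * depth_m * np.sin(np.radians(slope_deg))
          return driving > yield_stress_pa

      tau_y = 2000.0  # Pa; e.g., calibrated against a mapped past event
      print(still_flowing(1.5, 12.0, tau_y))  # True: deep flow on a steep slope
      print(still_flowing(0.3, 5.0, tau_y))   # False: thin flow on a gentle slope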

  12. Towards a Low-Cost Real-Time Photogrammetric Landslide Monitoring System Utilising Mobile and Cloud Computing Technology

    NASA Astrophysics Data System (ADS)

    Chidburee, P.; Mills, J. P.; Miller, P. E.; Fieber, K. D.

    2016-06-01

    Close-range photogrammetric techniques offer a potentially low-cost approach in terms of implementation and operation for initial assessment and monitoring of landslide processes over small areas. In particular, the Structure-from-Motion (SfM) pipeline is now extensively used to help overcome many constraints of traditional digital photogrammetry, offering increased user-friendliness to nonexperts, as well as lower costs. However, a landslide monitoring approach based on the SfM technique also presents some potential drawbacks due to the difficulty in managing and processing a large volume of data in real-time. This research addresses the aforementioned issues by attempting to combine a mobile device with cloud computing technology to develop a photogrammetric measurement solution as part of a monitoring system for landslide hazard analysis. The research presented here focusses on (i) the development of an Android mobile application; (ii) the implementation of SfM-based open-source software in the Amazon cloud computing web service, and (iii) performance assessment through a simulated environment using data collected at a recognized landslide test site in North Yorkshire, UK. Whilst the landslide monitoring mobile application is under development, this paper describes experiments carried out to ensure effective performance of the system in the future. Investigations presented here describe the initial assessment of a cloud-implemented approach, which is developed around the well-known VisualSFM algorithm. Results are compared to point clouds obtained from alternative SfM 3D reconstruction approaches considering a commercial software solution (Agisoft PhotoScan) and a web-based system (Autodesk 123D Catch). Investigations demonstrate that the cloud-based photogrammetric measurement system is capable of providing results of centimeter-level accuracy, evidencing its potential to provide an effective approach for quantifying and analyzing landslide hazard at a local-scale.

  13. Predictors of low back pain in physically active conscripts with special emphasis on muscular fitness.

    PubMed

    Taanila, Henri P; Suni, Jaana H; Pihlajamäki, Harri K; Mattila, Ville M; Ohrankämmen, Olli; Vuorinen, Petteri; Parkkari, Jari P

    2012-09-01

    Previous findings on the association between low physical fitness and low back pain (LBP) are contradictory. The objective of the present prospective cohort study was to investigate the predictive associations of various intrinsic risk factors for LBP in young conscripts, with special attention to physical fitness. A prospective cohort study of a representative sample of Finnish male conscripts. In Finland, military service is compulsory for male citizens and 90% of young men enter the service. Outcomes were the incidence of LBP and of recurrent LBP prompting a visit to the garrison health clinic during 6-month military training. Four successive cohorts of 18- to 28-year-old male conscripts (N=982) were followed for 6 months. Conscripts with incident LBP were identified and treated at the garrison clinic. Predictive associations between intrinsic risk factors and LBP were examined using multivariate Cox proportional hazard models. The cumulative incidence of LBP was 16%, the incidence rate being 1.2 (95% confidence interval [CI], 1.0-1.4) per 1,000 person-days. Conscripts with a low educational level had increased risk of incident LBP (hazard ratio [HR], 1.6; 95% CI, 1.1-2.3). Conscripts with both low dynamic trunk muscle endurance and low aerobic endurance (ie, coimpairment) at baseline also had an increased risk of incident LBP. The strongest risk factor was coimpairment of trunk muscular endurance in tests of back lift and push-up (HR, 2.8; 95% CI, 1.4-5.9). Increased risk for LBP was observed among young men who had a low educational level and poor fitness in both muscular and aerobic performance. Copyright © 2012 Elsevier Inc. All rights reserved.
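
    For readers unfamiliar with the method, the sketch below shows the general shape of a Cox proportional hazards fit in Python using the open-source lifelines package. The column names, data and effect sizes are invented placeholders, not the study's variables.

    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(1)
    n = 982
    df = pd.DataFrame({
        "low_education": rng.integers(0, 2, n),   # 1 = low educational level
        "coimpairment": rng.integers(0, 2, n),    # 1 = low trunk + aerobic endurance
        "followup_days": rng.uniform(1, 180, n),  # time to LBP or censoring
        "lbp_event": rng.integers(0, 2, n),       # 1 = LBP incident observed
    })

    cph = CoxPHFitter()
    cph.fit(df, duration_col="followup_days", event_col="lbp_event")
    cph.print_summary()  # hazard ratios (exp(coef)) with 95% CIs, as quoted in such abstracts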

  14. Sample Delivery and Computer Control Systems for Detecting Leaks in the Main Engines of the Space Shuttle

    NASA Technical Reports Server (NTRS)

    Griffin, Timothy P.; Naylor, Guy R.; Hritz, Richard J.; Barrett, Carolyn A.

    1997-01-01

    The main engines of the Space Shuttle use hydrogen and oxygen as the fuel and oxidant. The explosion and fire hazards associated with these two components pose a serious danger to personnel and equipment, so prior to use the main engines undergo extensive leak tests. Instead of using the hazardous gases themselves, these tests use helium as the tracer element. This creates a need to monitor helium at the ppm level continuously for hours. The major challenge in developing such a low-level gas monitor is the sample delivery system. This paper discusses a system developed to meet these requirements while also being mobile. Also presented are the calibration technique and the stability and accuracy results for the system.
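
    A minimal sketch of the kind of linear calibration such a monitor relies on: fit detector response against certified helium standards, then invert the fit to read unknown samples. The readings below are made-up illustrative values, not the paper's data.

    import numpy as np

    standard_ppm = np.array([0.0, 10.0, 50.0, 100.0, 500.0])  # certified He standards
    response_mv = np.array([0.2, 5.1, 24.8, 50.3, 249.6])     # detector output (mV)

    slope, intercept = np.polyfit(standard_ppm, response_mv, 1)

    def to_ppm(reading_mv):
        """Convert a raw detector reading to helium concentration (ppm)."""
        return (reading_mv - intercept) / slope

    print(f"{to_ppm(12.4):.1f} ppm")  # an unknown sample reading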

  15. Weapons of Mass Destruction Technology Evaluation and Training Range

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kevin Larry Young

    2009-05-01

    The Idaho National Laboratory (INL) has a long history of providing technology evaluation and training for military and other federal-level Weapons of Mass Destruction (WMD) response agencies. Currently there are many federal organizations and commercial companies developing technologies related to detecting, assessing, mitigating and protecting against hazards associated with a WMD event. Unfortunately, very few locations exist within the United States where WMD response technologies are realistically field tested and evaluated using real chemical, biological, radiological, nuclear and explosive materials. This is particularly true with biological and radiological hazards. Related to this lack of adequate WMD multi-hazard technology testing capability is the shortage of locations where WMD response teams can train using actual chemical, biological, and radiological material or highly realistic simulants. In response to these technology evaluation and training needs, the INL has assembled a consortium of subject matter experts from existing programs and identified dedicated resources for the purpose of establishing an all-hazards WMD technology evaluation and training range. The author describes the challenges associated with creating the all-hazards WMD technology evaluation and training range and lists the technical, logistical and financial benefits of such a range. Current resources and capabilities for conducting all-hazard technology evaluation and training at the INL are identified. Existing technology evaluation and training programs at the INL related to radiological, biological and chemical hazards are highlighted, including successes and lessons learned. Finally, remaining gaps in WMD technology evaluation and training capabilities are identified along with recommendations for closing those gaps.

  16. A Method to Assess Flux Hazards at CSP Plants to Reduce Avian Mortality

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ho, Clifford K.; Wendelin, Timothy; Horstman, Luke

    A method to evaluate avian flux hazards at concentrating solar power plants (CSP) has been developed. A heat-transfer model has been coupled to simulations of the irradiance in the airspace above a CSP plant to determine the feather temperature along prescribed bird flight paths. Probabilistic modeling results show that the irradiance and assumed feather properties (thickness, absorptance, heat capacity) have the most significant impact on the simulated feather temperature, which can increase rapidly (hundreds of degrees Celsius in seconds) depending on the parameter values. The avian flux hazard model is being combined with a plant performance model to identify alternative heliostat standby aiming strategies that minimize both avian flux hazards and negative impacts on plant performance.

  17. A method to assess flux hazards at CSP plants to reduce avian mortality

    NASA Astrophysics Data System (ADS)

    Ho, Clifford K.; Wendelin, Timothy; Horstman, Luke; Yellowhair, Julius

    2017-06-01

    A method to evaluate avian flux hazards at concentrating solar power plants (CSP) has been developed. A heat-transfer model has been coupled to simulations of the irradiance in the airspace above a CSP plant to determine the feather temperature along prescribed bird flight paths. Probabilistic modeling results show that the irradiance and assumed feather properties (thickness, absorptance, heat capacity) have the most significant impact on the simulated feather temperature, which can increase rapidly (hundreds of degrees Celsius in seconds) depending on the parameter values. The avian flux hazard model is being combined with a plant performance model to identify alternative heliostat standby aiming strategies that minimize both avian flux hazards and negative impacts on plant performance.
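
    A minimal lumped-capacitance sketch in the spirit of the heat-transfer model described above: a thin feather layer absorbs concentrated flux and loses heat by convection. The property values and convection coefficient are rough assumptions, not those of the cited model.

    ABSORPTANCE = 0.7   # feather solar absorptance (assumed)
    THICKNESS = 1e-4    # feather thickness (m), assumed
    RHO_CP = 1.2e6      # volumetric heat capacity (J/m^3/K), assumed
    H_CONV = 50.0       # convective coefficient in flight (W/m^2/K), assumed
    T_AIR = 25.0        # ambient air temperature (C)

    def feather_temperature(flux_w_m2, dt=0.01, duration=5.0):
        """Explicit Euler integration of the feather energy balance."""
        t_feather = T_AIR
        for _ in range(int(duration / dt)):
            q_in = ABSORPTANCE * flux_w_m2
            q_out = H_CONV * (t_feather - T_AIR)
            t_feather += dt * (q_in - q_out) / (RHO_CP * THICKNESS)
        return t_feather

    # A hypothetical standby-aim irradiance of 50 kW/m^2 held for 5 s heats the
    # feather by hundreds of degrees, consistent with the behaviour noted above.
    print(f"{feather_temperature(5e4):.0f} C")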

  18. Small dense low-density lipoprotein-cholesterol concentrations predict risk for coronary heart disease: the Atherosclerosis Risk In Communities (ARIC) study.

    PubMed

    Hoogeveen, Ron C; Gaubatz, John W; Sun, Wensheng; Dodge, Rhiannon C; Crosby, Jacy R; Jiang, Jennifer; Couper, David; Virani, Salim S; Kathiresan, Sekar; Boerwinkle, Eric; Ballantyne, Christie M

    2014-05-01

    To investigate the relationship between plasma levels of small dense low-density lipoprotein-cholesterol (sdLDL-C) and risk for incident coronary heart disease (CHD) in a prospective study among Atherosclerosis Risk in Communities (ARIC) study participants. Plasma sdLDL-C was measured in 11 419 men and women of the biracial ARIC study using a newly developed homogeneous assay. A proportional hazards model was used to examine the relationship among sdLDL-C, vascular risk factors, and risk for CHD events (n=1158) over a period of ≈11 years. Plasma sdLDL-C levels were strongly correlated with an atherogenic lipid profile and were higher in patients with diabetes mellitus than in those without (49.6 versus 42.3 mg/dL; P<0.0001). In a model that included established risk factors, sdLDL-C was associated with incident CHD, with a hazard ratio of 1.51 (95% confidence interval, 1.21-1.88) for the highest versus the lowest quartile. Even in individuals considered to be at low cardiovascular risk based on their LDL-C levels, sdLDL-C predicted risk for incident CHD (hazard ratio, 1.61; 95% confidence interval, 1.04-2.49). Genome-wide association analyses identified genetic variants in 8 loci associated with sdLDL-C levels. These loci were in or close to genes previously associated with risk for CHD. We discovered 1 novel locus, PCSK7, for which genetic variation was significantly associated with sdLDL-C and other lipid factors. sdLDL-C was associated with incident CHD in ARIC study participants. The novel association of genetic variants in PCSK7 with sdLDL-C and other lipid traits may provide new insights into the role of this gene in lipid metabolism.
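
    The hazard ratios and confidence intervals quoted in such abstracts follow directly from the fitted log-hazard coefficient and its standard error. The sketch below back-calculates a coefficient that reproduces the headline figure of 1.51 (1.21-1.88); the inputs are therefore derived, not reported, values.

    import math

    def hazard_ratio_ci(beta, se, z=1.96):
        """Return (HR, lower, upper 95% CI) for log-hazard coefficient beta."""
        return (math.exp(beta),
                math.exp(beta - z * se),
                math.exp(beta + z * se))

    hr, lo, hi = hazard_ratio_ci(beta=0.412, se=0.112)
    print(f"HR {hr:.2f} (95% CI {lo:.2f}-{hi:.2f})")  # HR 1.51 (95% CI 1.21-1.88)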

  19. Coastal Sea Level and Estuary Tide Modeling in Bangladesh Using SAR, Radar and GNSS-R Altimetry

    NASA Astrophysics Data System (ADS)

    Jia, Y.; Shum, C. K.; Sun, J.; Li, D.; Shang, K.; Yi, Y.; Calmant, S.; Ballu, V.; Chu, P.; Johnson, J.; Park, J.; Bao, L.; Kuo, C. Y.; Wickert, J.

    2017-12-01

    Bangladesh, located at the confluence of three large rivers - Ganges, Brahmaputra and Meghna, is a low-lying country. It is prone to monsoonal flooding, potentially aggravated by more frequent and intensified cyclones resulting from anthropogenic climate change. Its coastal estuaries, the Sundarbans wetlands, have the largest Mangrove forest in the world, and exhibit complex tidal dynamics. In order to study flood hazards, ecological or climate changes over floodplains, it is fundamentally important to know the water level and water storage capacity in wetlands. Inaccurate or inadequate information about wetland water storage will cause significant errors in hydrological simulation and modeling for understanding ecological and economic implications. However, in most areas exact knowledge of water level change and flow patterns is lacking because water level gauging stations on private and public lands within wetlands or floodplains are too sparse, owing to the difficulty of physical access to the sites and the logistics of data gathering. The use of all-weather satellite remote sensing products provides an alternative approach for monitoring water level variation over floodplains or wetlands. In this study, we used a combination of observations from satellite radar altimetry (Envisat/Jason-2/Altika/Sentinel-3), water levels inferred from L-band synthetic aperture radar (ALOS-1/-2) backscattering coefficients, and GNSS-R altimetry from two coastal/river GNSS sites, for measuring coastal and estuary sea level and conducting estuary ocean tide modeling in the Bangladesh delta including the Sundarbans wetlands.
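
    The core of estuary tide modelling from gauge or altimetry series is a harmonic fit. The sketch below performs a least-squares fit of the M2 and S2 constituents to a synthetic water-level record; real analyses use many more constituents and the study's own data.

    import numpy as np

    HOURS_M2 = 12.4206  # principal lunar semidiurnal period (h)
    HOURS_S2 = 12.0     # principal solar semidiurnal period (h)

    t = np.arange(0, 30 * 24, 1.0)  # hourly samples over 30 days
    rng = np.random.default_rng(2)
    eta = (0.8 * np.cos(2 * np.pi * t / HOURS_M2 - 0.5)    # synthetic M2 signal
           + 0.3 * np.cos(2 * np.pi * t / HOURS_S2 + 1.0)  # synthetic S2 signal
           + rng.normal(0, 0.05, t.size))                  # noise (m)

    # Design matrix: mean level plus a cos/sin pair per constituent.
    cols = [np.ones_like(t)]
    for period in (HOURS_M2, HOURS_S2):
        w = 2 * np.pi / period
        cols += [np.cos(w * t), np.sin(w * t)]
    A = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(A, eta, rcond=None)

    for name, (c, s) in zip(("M2", "S2"), zip(coef[1::2], coef[2::2])):
        print(f"{name}: amplitude {np.hypot(c, s):.2f} m")  # recovers ~0.80 and ~0.30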

  20. Associations of Insulin Resistance and Adiponectin With Mortality in Women With Breast Cancer

    PubMed Central

    Duggan, Catherine; Irwin, Melinda L.; Xiao, Liren; Henderson, Katherine D.; Smith, Ashley Wilder; Baumgartner, Richard N.; Baumgartner, Kathy B.; Bernstein, Leslie; Ballard-Barbash, Rachel; McTiernan, Anne

    2011-01-01

    Purpose Overweight or obese breast cancer patients have a worse prognosis compared with normal-weight patients. This may be attributed to hyperinsulinemia and dysregulation of adipokine levels associated with overweight and obesity. Here, we evaluate whether low levels of adiponectin and a greater level of insulin resistance are associated with breast cancer mortality and all-cause mortality. Patients and Methods We measured glucose, insulin, and adiponectin levels in fasting serum samples from 527 women enrolled in the Health, Eating, Activity, and Lifestyle (HEAL) Study, a multiethnic, prospective cohort study of women diagnosed with stage I-IIIA breast cancer. We evaluated the association between adiponectin and insulin and glucose levels (expressed as the Homeostatic Model Assessment [HOMA] score) represented as continuous measures and median split categories, along with breast cancer mortality and all-cause mortality, using Cox proportional hazards models. Results Increasing HOMA scores were associated with reduced breast cancer survival (hazard ratio [HR], 1.12; 95% CI, 1.05 to 1.20) and reduced all-cause survival (HR, 1.09; 95% CI, 1.02 to 1.15) after adjustment for possible confounders. Higher levels of adiponectin (above the median: 15.5 μg/mL) were associated with longer breast cancer survival (HR, 0.39; 95% CI, 0.15 to 0.95) after adjustment for covariates. A continuous measure of adiponectin was not associated with either breast cancer–specific or all-cause mortality. Conclusion Elevated HOMA scores and low levels of adiponectin, both associated with obesity, were associated with increased breast cancer mortality. To the best of our knowledge, this is the first demonstration of the association between low levels of adiponectin and increased breast cancer mortality in breast cancer survivors. PMID:21115858
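
    The HOMA score mentioned above is commonly computed from fasting glucose and insulin; a standard formulation (glucose in mg/dL) is sketched below, though the study's exact implementation may differ.

    def homa_ir(fasting_glucose_mg_dl, fasting_insulin_uU_ml):
        """HOMA-IR = glucose (mg/dL) x insulin (uU/mL) / 405."""
        return fasting_glucose_mg_dl * fasting_insulin_uU_ml / 405.0

    print(f"{homa_ir(100.0, 10.0):.2f}")  # ~2.47 for glucose 100 mg/dL, insulin 10 uU/mL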

  1. Cognitive abilities predict death during the next 15 years in older Japanese adults.

    PubMed

    Nishita, Yukiko; Tange, Chikako; Tomida, Makiko; Otsuka, Rei; Ando, Fujiko; Shimokata, Hiroshi

    2017-10-01

    The longitudinal relationship between cognitive abilities and subsequent death was investigated among community-dwelling older Japanese adults. Participants (n = 1060; age range 60-79 years) comprised the first-wave participants of the National Institute for Longevity Sciences-Longitudinal Study of Aging. Participants' cognitive abilities were measured at baseline using the Japanese Wechsler Adult Intelligence Scale-Revised Short Form, which includes the following tests: Information (general knowledge), Similarities (logical abstract thinking), Picture Completion (visual perception and long-term visual memory) and Digit Symbol (information processing speed). Based on each cognitive test score, participants were classified into three groups: a high-level group (≥ the mean + 1 SD), a low-level group (≤ the mean - 1 SD) and a middle-level group. Data on death and moving during the subsequent 15 years were collected and analyzed using multiple Cox proportional hazard models adjusted for physical and psychosocial covariates. During the follow-up period, 308 participants (29.06%) had died and 93 participants (8.77%) had moved. In the Similarities test, the adjusted hazard ratio (HR) of the low-level group relative to the high-level group was significant (HR 1.49, 95% CI 1.02-2.17, P = 0.038). Furthermore, in the Digit Symbol test, the adjusted HR of the low-level group relative to the high-level group was significant (HR 1.62, 95% CI 1.03-2.58, P = 0.038). Significant adjusted HRs were not observed for the Information or Picture Completion tests. These results suggest that a lower level of logical abstract thinking and slower information processing speed are associated with shorter survival among older Japanese adults. Geriatr Gerontol Int 2017; 17: 1654-1660. © 2016 Japan Geriatrics Society.
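
    The three-level classification used above is simple to reproduce: scores at or above the mean + 1 SD form the high group, scores at or below the mean - 1 SD the low group, and the rest the middle group. A sketch on synthetic scores:

    import numpy as np

    rng = np.random.default_rng(3)
    scores = rng.normal(10, 3, size=1060)  # e.g. synthetic Digit Symbol scores

    mean, sd = scores.mean(), scores.std()
    group = np.select(
        [scores >= mean + sd, scores <= mean - sd],
        ["high", "low"],
        default="middle",
    )
    print({g: int((group == g).sum()) for g in ("low", "middle", "high")})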

  2. Evaluation of Tsunami Hazards in Kuwait from Possible Earthquake and Landslide Sources considering Effect of Natural Tide

    NASA Astrophysics Data System (ADS)

    Latcharote, P.

    2016-12-01

    Kuwait is one of the world's most important oil producers, and most of its population and many vital facilities are located along the coast. Even with low or unknown tsunami risk, it is important to investigate tsunami hazards in this country to ensure safety of life and sustain the global economy. This study aimed to evaluate tsunami hazards along the coastal areas of Kuwait from both earthquake and landslide sources using numerical modeling. Tsunami generation and propagation were simulated using the two-layer model and the TUNAMI model. Four earthquake scenarios along the Makran Subduction Zone (MSZ), based on historical events and possible worst cases, were used to simulate tsunami propagation to the coastal areas of the Arabian Gulf. Case 1 (Mw 8.3) and Case 2 (Mw 8.3) replicate the 1945 Makran earthquake, whereas Case 3 (Mw 8.6) and Case 4 (Mw 9.0) are worst-case scenarios. Tsunami propagation was simulated on a 30 arc-second mesh using bathymetry and topography data from GEBCO. Preliminary results suggest that tsunamis generated by Case 1 and Case 2 would have very small effects on Kuwait (< 0.1 m), while Case 3 and Case 4 could generate maximum tsunami amplitudes of 0.3 m to 1.0 m arriving about 12 hours after the earthquake. In addition, this study considered tsunamis generated by landslides along the Iranian coast opposite Kuwait Bay. To preliminarily assess tsunami hazards, coastal landslides with volumes of 1.0-2.0 km³ were assumed at three locations judged possible from their topographic features. The preliminary results revealed that tsunamis generated by coastal landslides could have a significant impact on Kuwait, with maximum tsunami amplitudes of about 0.5-1.1 m at Failaka Island in front of Kuwait Bay and at the Az-Zour power and desalination plant, depending on landslide volume and energy dissipation. Future work will improve the accuracy of the tsunami simulations with higher-resolution bathymetry and topography data in order to investigate tsunami inundation. Furthermore, detailed analysis of possible landslide sources will be performed by means of 3D slope-stability analysis to constrain the locations and landslide volumes, taking into account geological conditions such as surface elevation and soil property data.
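
    A back-of-envelope check on the ~12 h arrival time reported above uses the long-wave (shallow-water) celerity c = sqrt(g * h). The distance and mean depth below are rough assumptions for illustration only.

    import math

    G = 9.81

    def travel_time_hours(distance_km, mean_depth_m):
        """Travel time of a long wave over a path of given mean depth."""
        celerity = math.sqrt(G * mean_depth_m)  # m/s
        return distance_km * 1000.0 / celerity / 3600.0

    # ~1000 km from the Strait of Hormuz region to Kuwait over a shallow gulf
    # (assumed mean depth ~50 m) gives on the order of half a day.
    print(f"{travel_time_hours(1000.0, 50.0):.1f} h")  # ~12.6 h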

  3. Probabilistic volcanic hazard assessments of Pyroclastic Density Currents: ongoing practices and future perspectives

    NASA Astrophysics Data System (ADS)

    Tierz, Pablo; Sandri, Laura; Ramona Stefanescu, Elena; Patra, Abani; Marzocchi, Warner; Costa, Antonio; Sulpizio, Roberto

    2014-05-01

    Explosive volcanoes and, especially, Pyroclastic Density Currents (PDCs) pose an enormous threat to populations living in the surroundings of volcanic areas. Difficulties in the modeling of PDCs are related to (i) very complex and stochastic physical processes, intrinsic to their occurrence, and (ii) to a lack of knowledge about how these processes actually form and evolve. This means that there are deep uncertainties (namely, of aleatory nature due to point (i) above, and of epistemic nature due to point (ii) above) associated to the study and forecast of PDCs. Consequently, the assessment of their hazard is better described in terms of probabilistic approaches rather than by deterministic ones. What is actually done to assess probabilistic hazard from PDCs is to couple deterministic simulators with statistical techniques that can, eventually, supply probabilities and inform about the uncertainties involved. In this work, some examples of both PDC numerical simulators (Energy Cone and TITAN2D) and uncertainty quantification techniques (Monte Carlo sampling -MC-, Polynomial Chaos Quadrature -PCQ- and Bayesian Linear Emulation -BLE-) are presented, and their advantages, limitations and future potential are underlined. The key point in choosing a specific method leans on the balance between its related computational cost, the physical reliability of the simulator and the pursued target of the hazard analysis (type of PDCs considered, time-scale selected for the analysis, particular guidelines received from decision-making agencies, etc.). Although current numerical and statistical techniques have brought important advances in probabilistic volcanic hazard assessment from PDCs, some of them may be further applicable to more sophisticated simulators. In addition, forthcoming improvements could be focused on three main multidisciplinary directions: 1) Validate the simulators frequently used (through comparison with PDC deposits and other simulators), 2) Decrease simulator runtimes (whether by increasing the knowledge about the physical processes or by doing more efficient programming, parallelization, ...) and 3) Improve uncertainty quantification techniques.
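
    As a minimal illustration of coupling a simulator with Monte Carlo sampling, the sketch below samples the mobility ratio H/L of an energy-cone model and converts each draw to a runout distance, yielding an exceedance probability at a site. The distribution, collapse height and site distance are hypothetical.

    import numpy as np

    rng = np.random.default_rng(4)
    N = 100_000

    collapse_height_m = 1500.0                 # assumed collapse height
    h_over_l = rng.uniform(0.1, 0.35, size=N)  # sampled energy-cone slope H/L

    runout_m = collapse_height_m / h_over_l    # L = H / (H/L), flat-topography proxy

    site_distance_m = 8000.0
    p_reach = (runout_m >= site_distance_m).mean()
    print(f"P(PDC reaches the {site_distance_m/1000:.0f} km site) = {p_reach:.2f}")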

  4. A simulator study investigating how motorcyclists approach side-road hazards.

    PubMed

    Crundall, Elizabeth; Stedmon, Alex W; Saikayasit, Rossukorn; Crundall, David

    2013-03-01

    The most common form of motorcycle collision in the UK occurs when another road user fails to give way and pulls out from a side road in front of an oncoming motorcyclist. While research has considered these collisions from the car driver's perspective, no research to date has addressed how motorcyclists approach these potential hazards. This study conducted a detailed analysis of motorcyclist speed and road position on approach to side-roads in a simulated suburban setting. Novice, Experienced and Advanced riders rode two laps of a simulated route, encountering five side-roads on each lap. On the second lap, a car emerged from the first side-road in a typical 'looked but failed to see' accident scenario. Three Experienced riders and one Novice rider collided with the hazard. The Advanced rider group adopted the safest strategy when approaching side-roads, with a lane position closer to the centre of the road and slower speeds. In contrast, Experienced riders chose faster speeds, often over the speed limit, especially when approaching junctions with good visibility. Rider behaviour at non-hazard junctions was compared between laps, to investigate if riders modified their behaviour after experiencing the hazard. Whilst all riders were generally more cautious after the hazard, the Advanced riders modified their behaviour more than the other groups after the hazard vehicle had pulled out. The results suggest that advanced training can lead to safer riding styles that are not acquired by experience alone. Copyright © 2012 Elsevier Ltd. All rights reserved.

  5. On the Storm Surge and Sea Level Rise Projections for Infrastructure Risk Analysis and Adaptation

    EPA Science Inventory

    Storm surge can cause coastal hydrology changes, flooding, water quality changes, and even inundation of low-lying terrain. Strong wave actions and disruptive winds can damage water infrastructure and other environmental assets (hazardous and solid waste management facilities, w...

  6. Long-term exposure to crystalline silica and risk of heart disease mortality.

    PubMed

    Liu, Yuewei; Rong, Yi; Steenland, Kyle; Christiani, David C; Huang, Xiji; Wu, Tangchun; Chen, Weihong

    2014-09-01

    The association between crystalline silica exposure and risk of heart disease mortality remains unclear. We investigated a cohort of 42,572 Chinese workers who were potentially exposed to crystalline silica and followed from 1960 to 2003. Cumulative silica exposure was estimated by linking a job-exposure matrix to each person's work history. Low-level silica exposure was defined as never having held a job with an exposure higher than 0.1 mg/m³. We estimated hazard ratios (HRs) in exposure-response analyses using Cox proportional hazards models. We identified 2846 deaths from heart disease during an average of 35 years of follow-up. Positive exposure-response trends were observed for cumulative silica exposure associated with mortality from total heart disease (HRs for increasing quartiles of cumulative silica exposure compared with the unexposed group = 0.89, 1.09, 1.32, 2.10; P for linear trend < 0.001) and pulmonary heart disease (0.92, 1.39, 2.47, 5.46; P for linear trend < 0.001). These positive trends remained among workers with both high- and low-level silica exposure. There was also a positive trend for ischemic heart disease among workers with low-level exposure, with quartile HRs of 1.04, 1.13, 1.52, and 1.60 (P for linear trend < 0.001). Low-level crystalline silica exposure was associated with increased mortality from heart disease, including pulmonary heart disease and ischemic heart disease, whereas high-level exposure mainly increased mortality from pulmonary heart disease. Current permissible exposure limits for crystalline silica in many countries may be insufficient to protect people from deaths due to heart disease.

  7. Mapping malaria risk using geographic information systems and remote sensing: The case of Bahir Dar City, Ethiopia.

    PubMed

    Minale, Amare Sewnet; Alemu, Kalkidan

    2018-05-07

    The main objective of this study was to develop a malaria risk map for Bahir Dar City, Amhara, which is situated south of Lake Tana on the Ethiopian plateau. Rainfall, temperature, altitude, slope and land use/land cover (LULC), as well as proximity measures to the lake, rivers and health facilities, were investigated using remote sensing and geographical information systems. The LULC variable was derived from a 2012 SPOT satellite image by supervised classification, while 30-m spatial resolution measurements of altitude and slope came from the Shuttle Radar Topography Mission. Meteorological data were collected from the National Meteorological Agency, Bahir Dar branch. These separate datasets, represented as layers in the computer, were combined using weighted multi-criteria evaluation. The outcome shows that rainfall, temperature, slope, elevation, distance from the lake and distance from the river influenced the malaria hazard in the study area by 35%, 15%, 10%, 7%, 5% and 3%, respectively, resulting in a map showing five areas with different levels of malaria hazard: very high (11.2%); high (14.5%); moderate (63.3%); low (6%); and none (5%). The malaria risk map, based on this hazard map plus additional information on proximity to health facilities and current LULC conditions, shows that Bahir Dar City has areas with very high (15%); high (65%); moderate (8%); and low (5%) levels of malaria risk, with only 2% of the land completely risk-free. Such risk maps are essential for planning, implementing, monitoring and evaluating disease control as well as for contemplating prevention and elimination of epidemiological hazards from endemic areas.
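
    The weighted multi-criteria evaluation described above amounts to a weighted sum of rescaled factor rasters. A sketch using the abstract's weights (the remaining weight would go to factors such as LULC) on toy 3x3 rasters:

    import numpy as np

    # Factor rasters already rescaled to 0-1 hazard scores (synthetic values).
    rainfall = np.array([[0.9, 0.8, 0.6], [0.7, 0.5, 0.4], [0.3, 0.2, 0.1]])
    temp = np.full((3, 3), 0.6)
    slope = np.full((3, 3), 0.4)
    elevation = np.full((3, 3), 0.5)
    dist_lake = np.full((3, 3), 0.7)
    dist_river = np.full((3, 3), 0.3)

    weights = {  # from the abstract: 35/15/10/7/5/3 percent
        "rainfall": 0.35, "temp": 0.15, "slope": 0.10,
        "elevation": 0.07, "dist_lake": 0.05, "dist_river": 0.03,
    }
    layers = {"rainfall": rainfall, "temp": temp, "slope": slope,
              "elevation": elevation, "dist_lake": dist_lake, "dist_river": dist_river}

    hazard = sum(weights[name] * layers[name] for name in weights)
    print(np.round(hazard, 2))  # per-cell composite hazard score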

  8. Airborne Doppler radar detection of low altitude windshear

    NASA Technical Reports Server (NTRS)

    Bracalente, Emedio M.; Jones, William R.; Britt, Charles L.

    1990-01-01

    As part of an integrated windshear program, the Federal Aviation Administration, jointly with NASA, is sponsoring a research effort to develop airborne sensor technology for the detection of low altitude windshear during aircraft take-off and landing. One sensor being considered is microwave Doppler radar operating at X-band or above. Using a Microburst/Clutter/Radar simulation program, a preliminary feasibility study was conducted to assess the performance of Doppler radars for this application, and preliminary results from this study are presented. Analyses show that, using bin-to-bin automatic gain control (AGC), clutter filtering, limited detection range, and suitable antenna tilt management, windshear from a wet microburst can be accurately detected 10 to 65 seconds (0.75 to 5 km) in front of the aircraft. Although a performance improvement can be obtained at higher frequency, the baseline X-band system that was simulated also detected the presence of a windshear hazard for the dry microburst. Although this study indicates the feasibility of using an airborne Doppler radar to detect low altitude microburst windshear, further detailed studies, including future flight experiments, will be required to completely characterize its capabilities and limitations.
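
    One widely used airborne wind-shear hazard index is the Bowles F-factor, F = (dWx/dt)/g - w/V: positive values indicate performance-robbing shear, and a value near 0.1 is often cited as hazardous. It is sketched below as general background; the cited simulation study may have used a different hazard metric, and the inputs are illustrative.

    G = 9.81

    def f_factor(horiz_wind_rate, vertical_wind, airspeed):
        """horiz_wind_rate: along-track rate of change of horizontal wind (m/s^2);
        vertical_wind: positive upward (m/s); airspeed: true airspeed (m/s)."""
        return horiz_wind_rate / G - vertical_wind / airspeed

    # Microburst core: increasing tailwind shear plus a 6 m/s downdraft at 75 m/s.
    print(f"{f_factor(1.0, -6.0, 75.0):.2f}")  # ~0.18, above the 0.1 guideline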

  9. Instrumentation and Methodology Development for Mars Mission

    NASA Technical Reports Server (NTRS)

    Chen, Yuan-Liang Albert

    2002-01-01

    The Mars environment comprises a dry, cold, low-pressure atmosphere with low gravity (0.38 g) and high-resistivity soil. Global dust storms covering large portions of Mars have often been observed from Earth. This environment provides ideal conditions for triboelectric charging. The extremely dry conditions on the Martian surface have raised concerns that electrostatic charge buildup will not be dissipated easily. If triboelectrically generated charge cannot be dissipated or avoided, dust will accumulate on charged surfaces and electrostatic discharge may cause hazards for future exploration missions. The low surface temperature on Mars helps to prolong the charge decay on dust particles and soil. A better understanding of the physics of charged Martian dust particles is essential to future Mars missions. We have researched and designed two sensors, a velocity/charge sensor and a PZT momentum sensor, to detect the velocity, charge and mass distributions of Martian charged dust particles. These sensors are fabricated at the NASA Kennedy Space Center Electromagnetic Physics Testbed, where they will be tested and calibrated under simulated Mars atmospheric conditions with JSC Mars-1 Martian regolith simulant.
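
    The charge-decay argument above can be made quantitative with the charge relaxation time, tau = eps0 * eps_r * rho: the higher the resistivity, the longer tribocharge persists. The soil values below are assumptions for illustration.

    EPS0 = 8.854e-12  # vacuum permittivity (F/m)

    def relaxation_time_s(rel_permittivity, resistivity_ohm_m):
        """Charge relaxation time constant tau = eps0 * eps_r * rho (s)."""
        return EPS0 * rel_permittivity * resistivity_ohm_m

    # A dry, high-resistivity regolith (hypothetical 1e12 ohm-m) holds charge for
    # tens of seconds; ordinary moist soil (~1e5 ohm-m) for microseconds.
    print(f"{relaxation_time_s(4.0, 1e12):.0f} s")  # ~35 s
    print(f"{relaxation_time_s(4.0, 1e5):.1e} s")   # ~3.5e-06 s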

  10. Aviation Trends Related to Atmospheric Environment Safety Technologies Project Technical Challenges

    NASA Technical Reports Server (NTRS)

    Reveley, Mary S.; Withrow, Colleen A.; Barr, Lawrence C.; Evans, Joni K.; Leone, Karen M.; Jones, Sharon M.

    2014-01-01

    Current and future aviation safety trends related to the National Aeronautics and Space Administration's Atmospheric Environment Safety Technologies Project's three technical challenges (engine icing characterization and simulation capability; airframe icing simulation and engineering tool capability; and atmospheric hazard sensing and mitigation technology capability) were assessed by examining the National Transportation Safety Board (NTSB) accident database (1989 to 2008), incidents from the Federal Aviation Administration (FAA) accident/incident database (1989 to 2006), and literature from various industry and government sources. The accident and incident data were examined for events involving fixed-wing airplanes operating under Federal Aviation Regulation (FAR) Parts 121, 135, and 91 for atmospheric conditions related to airframe icing, ice-crystal engine icing, turbulence, clear air turbulence, wake vortex, lightning, and low visibility (fog, low ceiling, clouds, precipitation, and low lighting). Five future aviation safety risk areas associated with the three AEST technical challenges were identified after an exhaustive survey of a variety of sources and include: approach and landing accident reduction, icing/ice detection, loss of control in flight, super density operations, and runway safety.

  11. Volcanic unrest and hazard communication in Long Valley Volcanic Region, California

    USGS Publications Warehouse

    Hill, David P.; Mangan, Margaret T.; McNutt, Stephen R.

    2017-01-01

    The onset of volcanic unrest in Long Valley Caldera, California, in 1980 and the subsequent fluctuations in unrest levels through May 2016 illustrate: (1) the evolving relations between scientists monitoring the unrest and studying the underlying tectonic/magmatic processes and their implications for geologic hazards, and (2) the challenges in communicating the significance of the hazards to the public and civil authorities in a mountain resort setting. Circumstances special to this case include (1) the sensitivity of an isolated resort area to media hype of potential high-impact volcanic and earthquake hazards and its impact on potential recreational visitors and the local economy, (2) a small permanent population (~8000), which facilitates face-to-face communication between scientists monitoring the hazard, civil authorities, and the public, and (3) the relatively frequent turnover of people in positions of civil authority, which requires a continuing education effort on the nature of caldera unrest and related hazards. Because of delays associated with communication protocols between the State and Federal governments during the onset of unrest, local civil authorities and the public first learned that the U.S. Geological Survey was about to release a notice of potential volcanic hazards associated with earthquake activity and 25-cm uplift of the resurgent dome in the center of the caldera through an article in the Los Angeles Times published in May 1982. The immediate reaction was outrage and denial. Gradual acceptance that the hazard was real required over a decade of frequent meetings between scientists and civil authorities together with public presentations underscored by frequently felt earthquakes and the onset of magmatic CO2 emissions in 1990 following an 11-month-long earthquake swarm beneath Mammoth Mountain on the southwest rim of the caldera. Four fatalities, one on 24 May 1998 and three on 6 April 2006, underscored the hazard posed by the CO2 emissions. Initial response plans developed by county and state agencies in response to the volcanic unrest began with “The Mono County Volcano Contingency Plan” and “Plan Caldera” by the California Office of Emergency Services in 1982–84. They subsequently became integrated into the regularly updated County Emergency Operation Plan. The alert level system employed by the USGS also evolved from the three-level “Notice-Watch-Warning” system of the early 1980s through a five-level color-code to the current “Normal-Advisory-Watch-Warning” ground-based system in conjunction with the international 4-level aviation color-code for volcanic ash hazards. Field trips led by the scientists proved to be a particularly effective means of acquainting local residents and officials with the geologically active environment in which they reside. Relative caldera quiescence from 2000 through 2011 required continued efforts to remind an evolving population that the hazards posed by the 1980–2000 unrest persisted. Renewed uplift of the resurgent dome from 2011 to 2014 was accompanied by an increase in low-level earthquake activity in the caldera and beneath Mammoth Mountain and continues through May 2016. As unrest levels continue to wax and wane, so will the communication challenges.

  12. Deadly heat waves projected in the densely populated agricultural regions of South Asia

    PubMed Central

    Im, Eun-Soon; Pal, Jeremy S.; Eltahir, Elfatih A. B.

    2017-01-01

    The risk associated with any climate change impact reflects intensity of natural hazard and level of human vulnerability. Previous work has shown that a wet-bulb temperature of 35°C can be considered an upper limit on human survivability. On the basis of an ensemble of high-resolution climate change simulations, we project that extremes of wet-bulb temperature in South Asia are likely to approach and, in a few locations, exceed this critical threshold by the late 21st century under the business-as-usual scenario of future greenhouse gas emissions. The most intense hazard from extreme future heat waves is concentrated around densely populated agricultural regions of the Ganges and Indus river basins. Climate change, without mitigation, presents a serious and unique risk in South Asia, a region inhabited by about one-fifth of the global human population, due to an unprecedented combination of severe natural hazard and acute vulnerability. PMID:28782036
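
    The survivability screening implied above reduces to counting, in the simulated fields, how often daily maximum wet-bulb temperature TW approaches the 35 °C limit. A sketch on synthetic data, with a lower 31 °C level included as an additional (assumed) warning threshold:

    import numpy as np

    rng = np.random.default_rng(5)
    tw_daily_max = rng.normal(29.0, 2.5, size=(30, 100))  # (days, grid cells), deg C

    for threshold in (31.0, 35.0):
        frac = (tw_daily_max >= threshold).mean()
        print(f"fraction of cell-days with TW >= {threshold} C: {frac:.4f}")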

  13. Ferritin levels and risk of heart failure-the Atherosclerosis Risk in Communities Study.

    PubMed

    Silvestre, Odilson M; Gonçalves, Alexandra; Nadruz, Wilson; Claggett, Brian; Couper, David; Eckfeldt, John H; Pankow, James S; Anker, Stefan D; Solomon, Scott D

    2017-03-01

    Severe iron overload is associated with cardiac damage, while iron deficiency has been related to worse outcomes in subjects with heart failure (HF). This study investigated the relationship between ferritin, a marker of iron status, and the incidence of HF in a community-based cohort. We examined 1063 participants who were free of heart failure from the Atherosclerosis Risk in Communities (ARIC) Study in whom ferritin serum levels were measured at baseline (1987-1989). The participants (mean age 52.7 ± 5.5 years, 62% women) were categorized into low (<30 ng/mL; n = 153), normal (30-200 ng/mL in women and 30-300 ng/mL in men; n = 663), and high (>200 ng/mL in women and >300 ng/mL in men; n = 247) ferritin levels. Multivariable Cox proportional hazards models were used to evaluate the relationship between ferritin and incident HF. After 21 ± 4.6 years of follow-up, HF occurred in 144 (13.5%) participants. When compared with participants with normal ferritin levels, participants with low ferritin levels had a higher risk of HF [hazard ratio (HR) = 2.24, 95% confidence interval (CI) 1.15-4.35; P = 0.02] as did those with high ferritin levels (HR = 1.81, 95% CI 1.01-3.25; P = 0.04), after adjusting for potential confounders. Notably, low ferritin levels remained associated with incident HF even after excluding subjects with anaemia (HR = 2.28, 95% CI 1.11-4.68; P = 0.03). Derangements in iron metabolism, either low or high ferritin serum levels, were associated with higher risk of incident HF in a general population, even without concurrent anaemia. These findings suggest that iron imbalance might play a role in the development of HF. © 2016 The Authors. European Journal of Heart Failure © 2016 European Society of Cardiology.

  14. Air pollution and non-respiratory health hazards for children

    PubMed Central

    Poursafa, Parinaz

    2010-01-01

    Air pollution is a global health issue with serious public health implications, particularly for children. Usually respiratory effects of air pollutants are considered, but this review highlights the importance of non-respiratory health hazards. In addition to short-term effects, exposure to criteria air pollutants from early life might be associated with low birth weight, increase in oxidative stress and endothelial dysfunction, which in turn might have long-term effects on chronic non-communicable diseases. In view of the emerging epidemic of chronic disease in low- and middle- income countries, the vicious cycle of rapid urbanization and increasing levels of air pollution, public health and regulatory policies for air quality protection should be integrated into the main priorities of the primary health care system and into the educational curriculum of health professionals. PMID:22371790

  15. Social determinants of workers' health in Central America.

    PubMed

    Aragón, Aurora; Partanen, Timo; Felknor, Sarah; Corriols, Marianela

    2011-01-01

    This communication summarizes the available data on work-related determinants of health in Central America. The Central American working population is young and moving from agriculture toward industry and services. Ethnicity, gender, migration, subemployment and precarious work, informality, rural conditions, low educational level, poverty, ubiquitous worksite health hazards, insufficient occupational health services, low labor inspection density, and weak unions define the constellation of social determinants of workers' health in Central America. Data are, however, scanty for both hazards and work-related illnesses and injuries. Governments and industries have the responsibility of opening decent work opportunities, especially for those facing multiple inequalities in social determinants of health. A first step would be the ratification and implementation of the ILO Convention (187) on occupational safety and health by the seven national governments of the region.

  16. High-sensitivity remote detection of atmospheric pollutants and greenhouse gases at low ppm levels using near-infrared tunable diode lasers

    NASA Astrophysics Data System (ADS)

    Roy, Anirban; Upadhyay, Abhishek; Chakraborty, Arup Lal

    2016-05-01

    The concentration of atmospheric pollutants and greenhouse gases needs to be precisely monitored for sustainable industrial development and to predict the climate shifts caused by global warming. Such measurements are made on a continuous basis in ecologically sensitive and urban areas in the advanced countries. Tunable diode laser spectroscopy (TDLS) is the most versatile non-destructive technology currently available for remote measurements of multiple gases with very high selectivity (low cross-sensitivity), very high sensitivity (on the order of ppm and ppb) and under hazardous conditions. We demonstrate absolute measurements of acetylene, methane and carbon dioxide using a field-deployable, fully automated TDLS system that uses calibration-free 2f wavelength modulation spectroscopy (2f WMS) techniques with low-ppm sensitivities. A 40 mW, 1531.52 nm distributed feedback (DFB) diode laser, a 10 mW, 1650 nm DFB laser and a 1 mW, 2004 nm vertical cavity surface emitting laser (VCSEL) are used in the experiments to probe the P9 transition of acetylene, the R4 transition of methane and the R16 transition of carbon dioxide, respectively. Data acquisition and on-board analysis are handled by a Raspberry Pi-based embedded system that is controllable over a wireless connection. Gas concentration and pressure are simultaneously extracted by fitting the experimental signals to 2f WMS signals simulated using spectroscopic parameters obtained from the HITRAN database. The lowest detected concentration is 11 ppm for acetylene, 275 ppm for methane and 285 ppm for carbon dioxide using a 28 cm long single-pass gas cell.
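
    The quantitative backbone of such measurements is the Beer-Lambert law, I = I0 * exp(-alpha * C * L), inverted here for concentration. The absorption coefficient below is a placeholder rather than a HITRAN line-strength value, and the full 2f WMS fit is considerably more involved.

    import math

    def concentration_ppm(i_transmitted, i_incident, alpha_per_ppm_m, path_m):
        """Invert Beer-Lambert for concentration, given an effective absorption
        coefficient (per ppm per metre) and the optical path length."""
        absorbance = -math.log(i_transmitted / i_incident)
        return absorbance / (alpha_per_ppm_m * path_m)

    # 0.28 m single-pass cell (as above), hypothetical alpha, 1% absorption dip:
    print(f"{concentration_ppm(0.99, 1.00, 3e-3, 0.28):.1f} ppm")  # ~12 ppm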

  17. Afghanistan Multi-Risk Assessment to Natural Hazards

    NASA Astrophysics Data System (ADS)

    Diermanse, Ferdinand; Daniell, James; Pollino, Maurizio; Glover, James; Bouwer, Laurens; de Bel, Mark; Schaefer, Andreas; Puglisi, Claudio; Winsemius, Hessel; Burzel, Andreas; Ammann, Walter; Aliparast, Mojtaba; Jongman, Brenden; Ranghieri, Federica; Fallesen, Ditte

    2017-04-01

    The geographical location of Afghanistan and years of environmental degradation in the country make Afghanistan highly prone to intense and recurring natural hazards such as flooding, earthquakes, snow avalanches, landslides, and droughts. These occur in addition to man-made disasters, resulting in frequent loss of life, livelihoods, and property. Since 1980, disasters caused by natural hazards have affected 9 million people and caused over 20,000 fatalities in Afghanistan. The creation, understanding and accessibility of hazard, exposure, vulnerability and risk information is key for effective management of disaster risk. This is especially true in Afghanistan, where reconstruction after recent natural disasters and military conflicts is on-going and will continue over the coming years. So far, limited disaster risk information has been produced in Afghanistan, and the information that does exist typically lacks a standard methodology and uniform geo-spatial coverage. There are currently no risk assessment studies that cover all major natural hazards in Afghanistan and that can be used to assess the costs and benefits of different resilient reconstruction and disaster risk reduction strategies. As a result, the Government of Afghanistan has limited information regarding current and future disaster risk and the effectiveness of policy options on which to base its reconstruction and risk reduction decisions. To better understand natural hazard and disaster risk, the World Bank and the Global Facility for Disaster Reduction and Recovery (GFDRR) are supporting the development of new fluvial flood, flash flood, drought, landslide, avalanche and seismic risk information in Afghanistan, as well as a first-order analysis, undertaken by the authors, of the costs and benefits of resilient reconstruction and risk reduction strategies. The hazard component is the combination of probability and magnitude of natural hazards. Hazard analyses were carried out separately for each peril, and several models were implemented to simulate the relevant processes involved. These models were fed with global and local climate data and geophysical data such as elevation, slope, land use and soil characteristics. Exposure is a measure of the assets and population at risk; an extensive data collection and processing effort was carried out to derive nation-wide exposure data. Vulnerability is a measure of potential losses if a hazardous event occurs. Vulnerability analyses were carried out separately for each peril because of differences in impact characteristics; damage functions were derived from asset characteristics and/or experience reported in the (international) literature. The main project output consists of tables and GIS maps of hazard, exposure and risk. Tables present results at the nation-wide level (admin0), province level (admin1) and district level (admin2). Hazard maps are provided for various return periods, including 10, 20, 50, 100, 250, 500 and 1000 years. All maps are stored in a web-based GIS platform containing four separate directories with [1] generic data (catchment boundaries, rivers, etc.), [2] hazard maps, [3] exposure maps and [4] risk maps for each of the considered perils.
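
    Risk results of this kind are often condensed into an expected annual loss (EAL) by integrating loss over annual exceedance probability. The sketch below uses the abstract's return periods with invented loss values.

    import numpy as np
    from scipy.integrate import trapezoid

    return_periods = np.array([10, 20, 50, 100, 250, 500, 1000])  # years
    losses = np.array([5, 12, 30, 55, 110, 170, 260])             # e.g. million USD, invented

    aep = 1.0 / return_periods  # annual exceedance probabilities
    order = np.argsort(aep)     # integrate over increasing probability
    eal = trapezoid(losses[order], aep[order])
    print(f"EAL ~ {eal:.1f} million USD/yr")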

  18. NOx Control for Utility Boiler OTR Compliance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hamid Farzan

    Under sponsorship of the Department of Energy's National Energy Technology Laboratory (NETL), the Babcock and Wilcox Company (B and W) and Fuel Tech teamed together to investigate an integrated solution for NOx control. The system comprises B and W's DRB-4Z™ ultra-low-NOx pulverized coal (PC) burner technology and Fuel Tech's NOxOUT®, a urea-based selective non-catalytic reduction (SNCR) technology. Development of the low-NOx burner technology has been a focus of B and W's combustion program. The DRB-4Z™ burner is B and W's newest low-NOx burner capable of achieving very low NOx; it is designed to reduce NOx by controlled mixing of the fuel and air. Based on data from several 500 to 600 MWe boilers firing PRB coal, NOx emission levels of 0.15 to 0.20 lb/10⁶ Btu have been achieved from the DRB-4Z™ burners in combination with overfire air ports. Although NOx emissions from the DRB-4Z™ burner are nearing the Ozone Transport Rule (OTR) level of 0.15 lb NOx/10⁶ Btu, utility boiler owners can still benefit from the addition of an SNCR and/or SCR system in order to comply with the stringent NOx emission levels facing them. Large-scale testing is planned in B and W's 100-million Btu/hr Clean Environment Development Facility (CEDF), which simulates the conditions of large coal-fired utility boilers. The objective of the project is to achieve a NOx level below 0.15 lb/10⁶ Btu (with ammonia slip of less than 5 ppm) in the CEDF using PRB coal and B and W's DRB-4Z™ low-NOx pulverized coal (PC) burner in combination with dual-zone overfire air ports and Fuel Tech's NOxOUT®. During this period B and W prepared and submitted the project management plan and hazardous substance plan to DOE. The negotiation of a subcontract for Fuel Tech has been started.

  19. Experimental outgassing of toxic chemicals to simulate the characteristics of hazards tainting globally shipped products

    PubMed Central

    Budnik, Lygia Therese; Austel, Nadine; Gadau, Sabrina; Kloth, Stefan; Schubert, Jens; Jungnickel, Harald; Luch, Andreas

    2017-01-01

    Ambient monitoring analyses may identify potential new public health hazards such as residual levels of fumigants and industrial chemicals off-gassing from products and goods shipped globally. We analyzed container air with gas chromatography coupled to mass spectrometry (TD-2D-GC-MS/FPD) and assessed whether the concentrations of the volatiles benzene and 1,2-dichloroethane exceeded recommended exposure limits (REL). Products were taken from transport containers and analyzed for outgassing of volatiles. Furthermore, experimental outgassing was performed on packaging materials and textiles to simulate the hazards tainting globally shipped goods. The mean amounts of benzene in analyzed container air were 698-fold higher, and those of ethylene dichloride 4.5-fold higher, than the corresponding REL. More than 90% of all containers showed toluene residues higher than its REL. For 1,2-dichloroethane, 53% of containers transporting shoes exceeded the REL. In standardized experimental fumigation of various products, outgassing of 1,2-dichloroethane under controlled laboratory conditions took up to several months. Globally produced and transported products tainted with toxic industrial chemicals may contribute to the mixture of volatiles in indoor air, as they are likely to emit for a long period. These results need to be taken into account for further evaluation of safety standards applying to workers and consumers. PMID:28520742

  20. Serum magnesium is associated with the risk of dementia.

    PubMed

    Kieboom, Brenda C T; Licher, Silvan; Wolters, Frank J; Ikram, M Kamran; Hoorn, Ewout J; Zietse, Robert; Stricker, Bruno H; Ikram, M Arfan

    2017-10-17

    To determine if serum magnesium levels are associated with the risk of all-cause dementia and Alzheimer disease. Within the prospective population-based Rotterdam Study, we measured serum magnesium levels in 9,569 participants, free from dementia at baseline (1997-2008). Participants were subsequently followed up for incident dementia, determined according to the DSM-III-R criteria, until January 1, 2015. We used Cox proportional hazard regression models to associate quintiles of serum magnesium with incident all-cause dementia. We used the third quintile as a reference group and adjusted for age, sex, Rotterdam Study cohort, educational level, cardiovascular risk factors, kidney function, comorbidities, other electrolytes, and diuretic use. Our study population had a mean age of 64.9 years and 56.6% were women. During a median follow-up of 7.8 years, 823 participants were diagnosed with all-cause dementia. Both low serum magnesium levels (≤0.79 mmol/L) and high serum magnesium levels (≥0.90 mmol/L) were associated with an increased risk of dementia (hazard ratio [HR] 1.32, 95% confidence interval [CI] 1.02-1.69, and HR 1.30, 95% CI 1.02-1.67, respectively). Both low and high serum magnesium levels are associated with an increased risk of all-cause dementia. Our results warrant replication in other population-based studies. © 2017 American Academy of Neurology.

  1. Ground-motion signature of dynamic ruptures on rough faults

    NASA Astrophysics Data System (ADS)

    Mai, P. Martin; Galis, Martin; Thingbaijam, Kiran K. S.; Vyas, Jagdish C.

    2016-04-01

    Natural earthquakes occur on faults characterized by large-scale segmentation and small-scale roughness. This multi-scale geometrical complexity controls the dynamic rupture process, and hence strongly affects the radiated seismic waves and near-field shaking. For a fault system with given segmentation, the question arises what the conditions are for producing large-magnitude multi-segment ruptures, as opposed to smaller single-segment events. Similarly, for variable degrees of roughness, ruptures may be arrested prematurely or may break the entire fault. In addition, fault roughness induces rupture incoherence that determines the level of high-frequency radiation. Using HPC-enabled dynamic-rupture simulations, we generate physically self-consistent rough-fault earthquake scenarios (M~6.8) and their associated near-source seismic radiation. Because these computations are too expensive to be conducted routinely for simulation-based seismic hazard assessment, we strive to develop an effective pseudo-dynamic source characterization that produces (almost) the same ground-motion characteristics. Therefore, we examine how variable degrees of fault roughness affect rupture properties and the seismic wavefield, and develop a planar-fault kinematic source representation that emulates the observed dynamic behaviour. We propose an effective workflow for improved pseudo-dynamic source modelling that incorporates rough-fault effects and the associated high-frequency radiation in broadband ground-motion computation for simulation-based seismic hazard assessment.
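
    A standard ingredient of such studies is a self-affine rough-fault geometry generated by spectral synthesis: random phases with a power-law amplitude spectrum controlled by a Hurst exponent. A 1-D sketch with illustrative parameters:

    import numpy as np

    def rough_profile(n=4096, dx=10.0, hurst=0.8, rms=20.0, seed=0):
        """Return along-strike positions x (m) and roughness h(x) (m)."""
        rng = np.random.default_rng(seed)
        k = np.fft.rfftfreq(n, d=dx)          # spatial wavenumbers
        amp = np.zeros_like(k)
        amp[1:] = k[1:] ** (-(0.5 + hurst))   # 1-D power-law amplitude spectrum
        phase = rng.uniform(0, 2 * np.pi, k.size)
        h = np.fft.irfft(amp * np.exp(1j * phase), n=n)
        h *= rms / h.std()                    # scale to a target RMS roughness
        return np.arange(n) * dx, h

    x, h = rough_profile()
    print(f"RMS roughness: {h.std():.1f} m over {x[-1] / 1000:.0f} km of fault")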

  2. Ignaz Semmelweis redux?

    PubMed

    Raemer, Daniel B

    2014-06-01

    The story of Ignaz Semmelweis suggests a lesson: beware of unintended consequences, especially with in situ simulation. In situ simulation offers many important advantages over center-based simulation, such as learning about the real setting, putting participants at ease, saving travel time, minimizing space requirements, and involving patients and families. Some substantial disadvantages include frequent distractions, lack of privacy, logistics of setup, availability of technology, and supply costs. Importantly, in situ simulation amplifies some of the safety hazards of simulation itself, including maintaining control of simulated medications and equipment, limiting the use of valuable hospital resources, preventing incorrect learning from simulation shortcuts, and profoundly upsetting patients and their families. Mitigating these hazards by labeling effectively, publishing policies and procedures, securing simulation supplies and equipment, educating simulation staff, and informing participants of the risks are all methods that may lessen the potential for an accident. Each requires a serious effort of analysis, design, and implementation.

  3. The perception of volcanic risk in Kona communities from Mauna Loa and Hualālai volcanoes, Hawai‵i

    NASA Astrophysics Data System (ADS)

    Gregg, C. E.; Houghton, B. F.; Johnston, D. M.; Paton, D.; Swanson, D. A.

    2004-02-01

    Volcanic hazards in Kona (i.e. the western side of the island of Hawai‵i) stem primarily from Mauna Loa and Hualālai volcanoes. The former has erupted 39 times since 1832. Lava flows were emplaced in Kona during seven of these eruptions and last impacted Kona in 1950. Hualālai last erupted in ca. 1800. Society's proximity to potential eruptive sources and the potential for relatively fast-moving lava flows, coupled with relatively long time intervals since the last eruptions in Kona, are the underlying stimuli for this study of risk perception. Target populations were high-school students and adults ( n=462). Using these data, we discuss threat knowledge as an influence on risk perception, and perception as a driving mechanism for preparedness. Threat knowledge and perception of risk were found to be low to moderate. On average, fewer than two-thirds of the residents were aware of the most recent eruptions that impacted Kona, and a minority felt that Mauna Loa and Hualālai could ever erupt again. Furthermore, only about one-third were aware that lava flows could reach the coast in Kona in less than 3 h. Lava flows and ash fall were perceived to be among the least likely hazards to affect the respondent's community within the next 10 years, whereas vog (volcanic smog) was ranked the most likely. Less than 18% identified volcanic hazards as amongst the most likely hazards to affect them at home, school, or work. Not surprisingly, individual preparedness measures were found on average to be limited to simple tasks of value in frequently occurring domestic emergencies, whereas measures specific to infrequent hazard events such as volcanic eruptions were seldom adopted. Furthermore, our data show that respondents exhibit an 'unrealistic optimism bias' and infer that responsibility for community preparedness for future eruptions primarily rests with officials. We infer that these respondents may be less likely to attend to hazard information, react to warnings as directed, and undertake preparedness measures than other populations who perceive responsibility to lie with themselves. There are significant differences in hazard awareness and risk perception between students and adults, between subpopulations representing local areas, and between varying ethnicities. We conclude that long time intervals since damaging lava flows have occurred in Kona have contributed to lower levels of awareness and risk perceptions of the threat from lava flows, and that the on-going eruption at Kīlauea has facilitated greater awareness and perception of risk of vog but not of other volcanic hazards. Low levels of preparedness may be explained by low perceptions of threat and risk and perhaps by the lack of a clear motivation or incentive to seek new modes of adjustment.

  4. The perception of volcanic risk in Kona communities from Mauna Loa and Hualālai volcanoes, Hawai'i

    USGS Publications Warehouse

    Gregg, Chris E.; Houghton, Bruce F.; Johnston, David M.; Paton, Douglas; Swanson, D.A.

    2004-01-01

    Volcanic hazards in Kona (i.e. the western side of the island of Hawai'i) stem primarily from Mauna Loa and Hualālai volcanoes. The former has erupted 39 times since 1832. Lava flows were emplaced in Kona during seven of these eruptions and last impacted Kona in 1950. Hualālai last erupted in ca. 1800. Society's proximity to potential eruptive sources and the potential for relatively fast-moving lava flows, coupled with relatively long time intervals since the last eruptions in Kona, are the underlying stimuli for this study of risk perception. Target populations were high-school students and adults (n = 462). Using these data, we discuss threat knowledge as an influence on risk perception, and perception as a driving mechanism for preparedness. Threat knowledge and perception of risk were found to be low to moderate. On average, fewer than two-thirds of the residents were aware of the most recent eruptions that impacted Kona, and a minority felt that Mauna Loa and Hualālai could ever erupt again. Furthermore, only about one-third were aware that lava flows could reach the coast in Kona in less than 3 h. Lava flows and ash fall were perceived to be among the least likely hazards to affect the respondent's community within the next 10 years, whereas vog (volcanic smog) was ranked the most likely. Less than 18% identified volcanic hazards as amongst the most likely hazards to affect them at home, school, or work. Not surprisingly, individual preparedness measures were found on average to be limited to simple tasks of value in frequently occurring domestic emergencies, whereas measures specific to infrequent hazard events such as volcanic eruptions were seldom adopted. Furthermore, our data show that respondents exhibit an 'unrealistic optimism bias' and infer that responsibility for community preparedness for future eruptions primarily rests with officials. We infer that these respondents may be less likely to attend to hazard information, react to warnings as directed, and undertake preparedness measures than other populations who perceive responsibility to lie with themselves. There are significant differences in hazard awareness and risk perception between students and adults, between subpopulations representing local areas, and between varying ethnicities. We conclude that long time intervals since damaging lava flows have occurred in Kona have contributed to lower levels of awareness and risk perceptions of the threat from lava flows, and that the on-going eruption at Kīlauea has facilitated greater awareness and perception of risk of vog but not of other volcanic hazards. Low levels of preparedness may be explained by low perceptions of threat and risk and perhaps by the lack of a clear motivation or incentive to seek new modes of adjustment. © 2003 Published by Elsevier B.V.

  5. Bluetooth Communication Interface for EEG Signal Recording in Hyperbaric Chambers.

    PubMed

    Pastena, Lucio; Formaggio, Emanuela; Faralli, Fabio; Melucci, Massimo; Rossi, Marco; Gagliardi, Riccardo; Ricciardi, Lucio; Storti, Silvia F

    2015-07-01

    Recording biological signals inside a hyperbaric chamber poses technical challenges (the steel walls enclosing it greatly attenuate or completely block signals, as in a Faraday cage), practical challenges (lengthy cables creating eddy currents), and safety challenges (spark hazards from the power supply to the electronic apparatus inside the chamber), all of which can be overcome with new wireless technologies. In this technical report we present the design and implementation of a Bluetooth system for electroencephalographic (EEG) recording inside a hyperbaric chamber and describe the feasibility of EEG signal transmission outside the chamber. Unlike older systems, this technology allows the online recording of amplified signals, without interference from eddy currents. In an application of this technology, we measured EEG activity in professional divers under three experimental conditions in a hyperbaric chamber to determine how oxygen, breathed at a constant hyperbaric pressure of 2.8 ATA, affects bioelectrical activity. The EEG spectral power estimated by fast Fourier transform and the cortical sources of the EEG rhythms estimated by low-resolution brain electromagnetic analysis were analyzed in three different EEG acquisitions: breathing air at sea level; breathing oxygen at a simulated depth of 18 msw; and breathing air at sea level after decompression.
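
    As a rough illustration of the FFT-based spectral analysis the report describes, the sketch below estimates alpha-band power from one synthetic EEG channel; the sampling rate, band limits, and signal are invented for the example and are not the study's settings.

    import numpy as np
    from scipy.signal import welch

    fs = 256.0                                  # sampling rate in Hz (assumed)
    t = np.arange(0, 10, 1 / fs)
    rng = np.random.default_rng(0)
    # synthetic channel: a 10 Hz "alpha" oscillation plus noise
    eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)

    f, pxx = welch(eeg, fs=fs, nperseg=512)     # FFT-based Welch periodogram
    band = (f >= 8) & (f <= 13)                 # alpha band (8-13 Hz)
    alpha_power = pxx[band].sum() * (f[1] - f[0])  # integrate PSD over band
    print(f"alpha-band power: {alpha_power:.3f} (arbitrary units^2)")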

  6. Inferring infection hazard in wildlife populations by linking data across individual and population scales.

    PubMed

    Pepin, Kim M; Kay, Shannon L; Golas, Ben D; Shriner, Susan S; Gilbert, Amy T; Miller, Ryan S; Graham, Andrea L; Riley, Steven; Cross, Paul C; Samuel, Michael D; Hooten, Mevin B; Hoeting, Jennifer A; Lloyd-Smith, James O; Webb, Colleen T; Buhnerkempe, Michael G

    2017-03-01

    Our ability to infer unobservable disease-dynamic processes such as the force of infection (FOI; the infection hazard for susceptible hosts) has transformed our understanding of disease transmission mechanisms and our capacity to predict disease dynamics. Conventional methods for inferring FOI estimate a time-averaged value and are based on population-level processes. Because many pathogens exhibit epidemic cycling and FOI is the result of processes acting across the scales of individuals and populations, a flexible framework that extends to epidemic dynamics and links within-host processes to FOI is needed. Specifically, within-host antibody kinetics in wildlife hosts can be short-lived and produce patterns that are repeatable across individuals, suggesting individual-level antibody concentrations could be used to infer time since infection and hence FOI. Using simulations and case studies (influenza A in lesser snow geese and Yersinia pestis in coyotes), we argue that with careful experimental and surveillance design, the population-level FOI signal can be recovered from individual-level antibody kinetics, despite substantial individual-level variation. In addition to improving inference, the cross-scale quantitative antibody approach we describe can reveal insights into the drivers of individual-based variation in disease response, and into the role of poorly understood processes, such as secondary infections, in population-level disease dynamics. © 2017 John Wiley & Sons Ltd/CNRS.
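
    As a toy version of the cross-scale idea, assuming a simple exponential decay model for antibody titers, an individual's titer can be inverted to a time-since-infection estimate; pooling those estimates across hosts is what recovers an FOI-style signal. Every parameter below is invented.

    import numpy as np

    rng = np.random.default_rng(3)

    # assumed kinetics: titer(t) = A0 * exp(-k * t) after infection
    A0, k = 8.0, 0.05                    # peak titer and per-day decay (invented)
    true_t = rng.uniform(10, 300, size=200)              # days since infection
    titers = A0 * np.exp(-k * true_t) * rng.lognormal(0.0, 0.1, size=200)

    # invert the kinetic model for each sampled host
    t_hat = np.log(A0 / titers) / k

    # pooling t_hat across hosts (e.g. as a histogram over calendar time)
    # is what would recover the population-level FOI signal
    print(f"mean absolute error: {np.mean(np.abs(t_hat - true_t)):.1f} days")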

  7. Inferring infection hazard in wildlife populations by linking data across individual and population scales

    USGS Publications Warehouse

    Pepin, Kim M.; Kay, Shannon L.; Golas, Ben D.; Shriner, Susan A.; Gilbert, Amy T.; Miller, Ryan S.; Graham, Andrea L.; Riley, Steven; Cross, Paul C.; Samuel, Michael D.; Hooten, Mevin B.; Hoeting, Jennifer A.; Lloyd-Smith, James O.; Webb, Colleen T.; Buhnerkempe, Michael G.

    2017-01-01

    Our ability to infer unobservable disease-dynamic processes such as the force of infection (FOI; the infection hazard for susceptible hosts) has transformed our understanding of disease transmission mechanisms and our capacity to predict disease dynamics. Conventional methods for inferring FOI estimate a time-averaged value and are based on population-level processes. Because many pathogens exhibit epidemic cycling and FOI is the result of processes acting across the scales of individuals and populations, a flexible framework that extends to epidemic dynamics and links within-host processes to FOI is needed. Specifically, within-host antibody kinetics in wildlife hosts can be short-lived and produce patterns that are repeatable across individuals, suggesting individual-level antibody concentrations could be used to infer time since infection and hence FOI. Using simulations and case studies (influenza A in lesser snow geese and Yersinia pestis in coyotes), we argue that with careful experimental and surveillance design, the population-level FOI signal can be recovered from individual-level antibody kinetics, despite substantial individual-level variation. In addition to improving inference, the cross-scale quantitative antibody approach we describe can reveal insights into the drivers of individual-based variation in disease response, and into the role of poorly understood processes, such as secondary infections, in population-level disease dynamics.

  8. Cataractogenic potential of ionizing radiations in animal models that simulate man

    NASA Technical Reports Server (NTRS)

    Lett, J. T.; Cox, A. B.; Lee, A. C.

    1986-01-01

    Aspects of experiments on radiation-induced lenticular opacification during the life spans of two animal models, the New Zealand white rabbit and the rhesus monkey, are compared and contrasted with published results from a life-span study of another animal model, the beagle dog, and the most recent data from the ongoing study of the survivors of the radiation exposures at Hiroshima and Nagasaki. An important connection among the three animal studies is that all the measurements of cataract indices were made by one of the authors (Lee), so variation from personal subjectivity was reduced to a minimum. The primary objective of the rabbit experiments (radiation sources: Fe-56, Ar-40, and Ne-20 ions and Co-60 gamma photons) is an evaluation of hazards to astronauts from galactic particulate radiation. An analogous evaluation of hazards from solar flares during space flight is being made with monkeys exposed to 32, 55, 138 and 400-MeV protons. Conclusions are drawn about the proper use of animal models to simulate radiation responses in man and the levels of radiation-induced lenticular opacification that pose risks to man in space.

  9. Changes in physical activity and all-cause mortality in COPD.

    PubMed

    Vaes, Anouk W; Garcia-Aymerich, Judith; Marott, Jacob L; Benet, Marta; Groenen, Miriam T J; Schnohr, Peter; Franssen, Frits M E; Vestbo, Jørgen; Wouters, Emiel F M; Lange, Peter; Spruit, Martijn A

    2014-11-01

    Little is known about changes in physical activity in subjects with chronic obstructive pulmonary disease (COPD) and its impact on mortality. Therefore, we aimed to study changes in physical activity in subjects with and without COPD and the impact of physical activity on mortality risk. Subjects from the Copenhagen City Heart Study with at least two consecutive examinations were selected. Each examination included a self-administered questionnaire and clinical examination. 1270 COPD subjects and 8734 subjects without COPD (forced expiratory volume in 1 s 67±18 and 91±15% predicted, respectively) were included. COPD subjects with moderate or high baseline physical activity who reported low physical activity level at follow-up had the highest hazard ratios of mortality (1.73 and 2.35, respectively; both p<0.001). In COPD subjects with low baseline physical activity, no differences were found in survival between unchanged or increased physical activity at follow-up. In addition, subjects without COPD with low physical activity at follow-up had the highest hazard ratio of mortality, irrespective of baseline physical activity level (p≤0.05). A decline to low physical activity at follow-up was associated with an increased mortality risk in subjects with and without COPD. These observational data suggest that it is important to assess and encourage physical activity in the earliest stages of COPD in order to maintain a physical activity level that is as high as possible, as this is associated with better prognosis. ©ERS 2014.
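
    For readers wanting to reproduce the flavor of such estimates, a minimal Cox proportional-hazards fit with the lifelines package might look like the sketch below; the eight-row dataset is fabricated and has no relation to the Copenhagen cohort.

    import pandas as pd
    from lifelines import CoxPHFitter

    # fabricated follow-up data: years observed, death indicator, activity flag
    df = pd.DataFrame({
        "years":        [5.0, 3.2, 8.1, 2.4, 7.7, 1.9, 6.3, 4.1],
        "died":         [0,   1,   0,   1,   0,   1,   0,   1],
        "low_activity": [0,   1,   0,   1,   1,   0,   1,   0],
    })

    cph = CoxPHFitter()
    cph.fit(df, duration_col="years", event_col="died")
    print(cph.hazard_ratios_)   # HR > 1 implies higher mortality hazard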

  10. Tornadogenesis in Simulated Supercells from VORTEX2 Environments

    NASA Astrophysics Data System (ADS)

    Coffer, Brice Evan

    Despite an increased understanding of the environments that favor tornado formation, a high false-alarm rate for tornado warnings still exists. The composite near-storm environments of nontornadic and tornadic supercells sampled during the second Verification of the Origins of Rotation in Tornadoes Experiment (VORTEX2) both appear to be generally favorable for supercells and tornadoes. It has not been clear whether small differences between the two environments (e.g., more streamwise horizontal vorticity in the lowest few hundred meters above the ground in the tornadic composite) are actually determinative of storms' tornadic potential. From the VORTEX2 composite environments, simulations of a nontornadic and a tornadic supercell are used to investigate storm-scale differences that ultimately favor tornadogenesis or tornadogenesis failure. Both environments produce strong supercells with robust mid-level mesocyclones and hook echoes, though the tornadic supercell has a more intense low-level updraft and develops a tornado-like vortex exceeding the EF3 wind speed threshold. In contrast, the nontornadic supercell only produces shallow vortices, which never reach the EF0 wind speed threshold. Even though the nontornadic supercell readily produces subtornadic surface vortices, these vortices fail to be stretched by the low-level updraft. This is due to a disorganized low-level mesocyclone caused by predominantly crosswise vorticity in the lowest few hundred meters above ground level within the nontornadic environment. In contrast, the tornadic supercell ingests predominantly streamwise horizontal vorticity, which promotes a strong low-level mesocyclone with enhanced dynamic lifting and stretching of surface vertical vorticity. These results support the idea that larger streamwise vorticity leads to a more intense low-level mesocyclone, whereas predominantly crosswise vorticity yields a less favorable configuration of the low-level mesocyclone for tornadogenesis. Since it is known that not every storm in seemingly favorable environments is tornadic, either our knowledge of environmental controls on tornadoes is incomplete, or there are factors beyond the environment that determine whether a supercell produces a tornado. In other words, tornado formation could be a volatile process that is largely internal to each storm. To assess this, an ensemble of thirty supercell simulations was constructed based on small variations to the nontornadic and tornadic environmental profiles composited from VORTEX2. All simulations produce distinct supercells despite occurring in similar environments. Both the tornadic and nontornadic ensemble members possess ample subtornadic surface vertical vorticity; the determinative factor is whether this vorticity can be converged and stretched by the low-level updraft. Each of the fifteen members in the tornadic VORTEX2 ensemble produces a long-track, intense tornado. Although there are notable differences in the precipitation and near-surface buoyancy fields, each storm features strong dynamic lifting of surface air with vertical vorticity. This lifting is due to a steady low-level mesocyclone, which is linked to the ingestion of predominantly streamwise environmental vorticity. In contrast, each nontornadic VORTEX2 simulation features a supercell with a disorganized low-level mesocyclone, due to crosswise vorticity in the lowest few hundred meters in the nontornadic environment. This generally leads to insufficient dynamic lifting and stretching to accomplish tornadogenesis. Even so, forty percent of the nontornadic VORTEX2 ensemble members become weakly tornadic. This implies that chaotic within-storm details can still play a role, and occasionally lead to marginally tornadic vortices in suboptimal storms. It is also unclear whether systematically varying the lower-tropospheric horizontal vorticity will yield a "tipping point" between nontornadic and tornadic supercells. Additional simulations have been conducted in which the environment is systematically varied between the nontornadic and tornadic VORTEX2 composite profiles. The low-level wind profiles are linearly interpolated between the two composites (20/40/60/80%). The interpolated VORTEX2 simulations show that increasing lower-tropospheric storm-relative helicity (SRH) leads to progressively more organized low-level mesocyclones and a higher probability of tornadic supercells, regardless of the upper-level winds or thermodynamic profile. The mean 0-500 m SRH value at which supercells are consistently tornadic across the interpolated VORTEX2 simulations is 110 m2 s-2. Supercells transitioned from nontornadic to tornadic when at least 40% of the tornadic low-level wind profile was introduced. This transition could not be attributed to warmer outflow temperatures or to the availability of subtornadic vertical vorticity within the hook echo. Instead, the low-level updraft was once again the discriminating factor, as a robust updraft is present directly over the hook echo in each of the tornadic supercells. The fundamental feature of the nontornadic supercells is that the low-level updrafts are generally disorganized, with pockets of descent present in the weak echo region. (Abstract shortened by ProQuest.)
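
    Since the interpolation experiments hinge on 0-500 m SRH, here is a minimal discrete SRH computation on a toy wind profile; the levels, winds, and storm motion are made up, and the layer sum is the standard hodograph-area form of storm-relative helicity.

    import numpy as np

    # toy sounding: winds (m/s) at levels 0, 100, 250, 500 m; storm motion invented
    u = np.array([5.0, 9.0, 12.0, 15.0])
    v = np.array([0.0, 4.0, 6.0, 7.0])
    cu, cv = 10.0, -5.0                 # assumed storm motion components (m/s)

    # layer-by-layer hodograph-area form of SRH
    layers = (u[1:] - cu) * (v[:-1] - cv) - (u[:-1] - cu) * (v[1:] - cv)
    srh = layers.sum()
    print(f"0-500 m SRH ~ {srh:.0f} m2 s-2")   # ~100 m2 s-2 for these numbers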

  11. Seismic Hazard Maps for Seattle, Washington, Incorporating 3D Sedimentary Basin Effects, Nonlinear Site Response, and Rupture Directivity

    USGS Publications Warehouse

    Frankel, Arthur D.; Stephenson, William J.; Carver, David L.; Williams, Robert A.; Odum, Jack K.; Rhea, Susan

    2007-01-01

    This report presents probabilistic seismic hazard maps for Seattle, Washington, based on over 500 3D simulations of ground motions from scenario earthquakes. These maps include 3D sedimentary basin effects and rupture directivity. Nonlinear site response for soft-soil sites of fill and alluvium was also applied in the maps. The report describes the methodology for incorporating source and site dependent amplification factors into a probabilistic seismic hazard calculation. 3D simulations were conducted for the various earthquake sources that can affect Seattle: Seattle fault zone, Cascadia subduction zone, South Whidbey Island fault, and background shallow and deep earthquakes. The maps presented in this document used essentially the same set of faults and distributed-earthquake sources as in the 2002 national seismic hazard maps. The 3D velocity model utilized in the simulations was validated by modeling the amplitudes and waveforms of observed seismograms from five earthquakes in the region, including the 2001 M6.8 Nisqually earthquake. The probabilistic seismic hazard maps presented here depict 1 Hz response spectral accelerations with 10%, 5%, and 2% probabilities of exceedance in 50 years. The maps are based on determinations of seismic hazard for 7236 sites with a spacing of 280 m. The maps show that the most hazardous locations for this frequency band (around 1 Hz) are soft-soil sites (fill and alluvium) within the Seattle basin and along the inferred trace of the frontal fault of the Seattle fault zone. The next highest hazard is typically found for soft-soil sites in the Duwamish Valley south of the Seattle basin. In general, stiff-soil sites in the Seattle basin exhibit higher hazard than stiff-soil sites outside the basin. Sites with shallow bedrock outside the Seattle basin have the lowest estimated hazard for this frequency band.
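
    The exceedance probabilities quoted for these maps translate to return periods under the usual Poisson assumption; the helper below shows the arithmetic (10% in 50 yr gives roughly a 475-yr return period, 2% in 50 yr roughly 2,475 yr).

    import math

    def return_period(p_exceed: float, window_years: float) -> float:
        """Return period implied by a probability of exceedance over a window,
        assuming Poisson (stationary) occurrence of exceedances."""
        annual_rate = -math.log(1.0 - p_exceed) / window_years
        return 1.0 / annual_rate

    for p in (0.10, 0.05, 0.02):
        print(f"{p:.0%} in 50 yr -> ~{return_period(p, 50):,.0f}-yr return period")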

  12. Decision Support for Environmental Management of Industrial Non-Hazardous Secondary Materials: New Analytical Methods Combined with Simulation and Optimization Modeling

    EPA Science Inventory

    Non-hazardous solid materials from industrial processes, once regarded as waste and disposed in landfills, offer numerous environmental and economic advantages when put to beneficial uses (BUs). Proper management of these industrial non-hazardous secondary materials (INSM) requir...

  13. Hazards and hazard combinations relevant for the safety of nuclear power plants

    NASA Astrophysics Data System (ADS)

    Decker, Kurt; Brinkman, Hans; Raimond, Emmanuel

    2017-04-01

    The potential for the contemporaneous impact of different, yet causally related, hazardous events and event cascades on nuclear power plants is a major contributor to the overall risk of nuclear installations. In the aftermath of the Fukushima accident, which was caused by a combination of severe ground shaking by an earthquake, an earthquake-triggered tsunami, and the disconnection of the plants from the electrical grid by a seismically induced landslide, hazard combinations and hazard cascades moved into the focus of nuclear safety research. We therefore developed an exhaustive list of external hazards and hazard combinations which pose potential threats to nuclear installations in the framework of the European project ASAMPSA_E (Advanced Safety Assessment: Extended PSA). The project gathers 31 partners from Europe, North America and Japan. The list comprises exhaustive lists of natural hazards and external man-made hazards, and a cross-correlation matrix of these hazards. The hazard list is regarded as comprehensive in that it includes all types of hazards previously cited in documents by the IAEA, the Western European Nuclear Regulators Association (WENRA), and others. 73 natural hazards and 24 man-made external hazards are included. Natural hazards are grouped into seismotectonic hazards, flooding and hydrological hazards, extreme values of meteorological phenomena, rare meteorological phenomena, biological hazards / infestation, geological hazards, and forest fire / wild fire. The list of external man-made hazards includes industry accidents, military accidents, transportation accidents, pipeline accidents and other man-made external events. The large number of different hazards results in the extremely large number of 5,151 theoretically possible hazard combinations (not considering hazard cascades). In principle, all of these combinations can occur by random coincidence, except for 82 hazard combinations that, depending on the time scale, are mutually exclusive (e.g., extremely high air temperature and surface ice). Our dataset further provides information on hazard combinations which are more likely to occur than by random coincidence alone. 577 correlations between individual hazards were identified by expert opinion and are shown in a cross-correlation chart. Combinations discriminate between: (1) causally connected hazards (cause-effect relation), where one hazard (e.g., coastal erosion) may be caused by another hazard (e.g., storm surge), or where one hazard (e.g., high wind) is a prerequisite for a correlated hazard (e.g., storm surge); the identified causal links are not commutative. (2) Associated hazards ("contemporary" events) which are likely to occur at the same time due to a common root cause (e.g., a cold front of a meteorological low-pressure area which leads to a drop in air pressure, high wind, thunderstorm, lightning, heavy rain and hail); the root cause may not necessarily be regarded as a hazard by itself. The hazard list and the hazard correlation chart may serve as a starting point for the hazard analysis process for nuclear installations in Level 1 PSA as outlined by IAEA (2010), for the definition of the design basis for nuclear reactors, and for the assessment of design extension conditions as required by WENRA-RHWG (2014). They may further be helpful for the identification of hazard combinations and hazard cascades which threaten other critical infrastructure. References: Decker, K. & Brinkman, H., 2017. List of external hazards to be considered in extended PSA. Report No. ASAMPSA_E/WP21/D21.2/2017-41 - IRSN/PSN-RES/SAG/2017-00011. IAEA, 2010. Development and Application of Level 1 Probabilistic Safety Assessment for Nuclear Power Plants. Safety Guide No. SSG-3, Vienna. http://www-pub.iaea.org/books/ WENRA-RHWG, 2014. WENRA Safety Reference Levels for Existing Reactors: Update in Relation to Lessons Learned from TEPCO Fukushima Dai-Ichi Accident. http://www.wenra.org/publications/
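
    Assuming (our reading) that the quoted figures count unordered hazard pairs, the bookkeeping is n(n-1)/2 pairs minus the exclusions; note that 5,151 equals C(102, 2), so the underlying list appears to count 102 individual hazards rather than only the 73 + 24 headline items.

    from math import comb

    def hazard_pairs(n_hazards: int, n_exclusive: int = 0) -> int:
        """Unordered hazard pairs, minus pairs deemed mutually exclusive."""
        return comb(n_hazards, 2) - n_exclusive

    print(hazard_pairs(102))        # 5151 theoretically possible combinations
    print(hazard_pairs(102, 82))    # 5069 pairs that can actually co-occur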

  14. Disseminating Childhood Home Injury Risk Reduction Information in Pakistan: Results from a Community-Based Pilot Study

    PubMed Central

    Chandran, Aruna; Khan, Uzma Rahim; Zia, Nukhba; Feroze, Asher; de Ramirez, Sarah Stewart; Huang, Cheng-Ming; Razzak, Junaid A.; Hyder, Adnan A.

    2013-01-01

    Background: Most childhood unintentional injuries occur in the home; however, very little home injury prevention information is tailored to developing countries. Utilizing our previously developed information dissemination tools and a hazard assessment checklist tailored to a low-income neighborhood in Pakistan, we pilot tested and compared the effectiveness of two dissemination tools. Methods: Two low-income neighborhoods were mapped, identifying families with a child aged between 12 and 59 months. In June and July 2010, all enrolled households underwent a home hazard assessment while hazard reduction education was delivered via an in-home tutorial or a pamphlet. A follow-up assessment was conducted 4–5 months later. Results: 503 households were enrolled; 256 received a tutorial and 247 a pamphlet. The two groups differed significantly (p < 0.01) in level of maternal education and relationship of the child to the primary caregiver. However, when controlling for these variables, those receiving an in-home tutorial had higher odds of hazard reduction than the pamphlet group for uncovered vats of water (OR 2.14, 95% CI: 1.28, 3.58), an open fire within reach of the child (OR 3.55, 95% CI: 1.80, 7.00), and inappropriately labeled cooking fuel containers (OR 1.86, 95% CI: 1.07, 3.25). Conclusions: This pilot project demonstrates the potential utility of using home-visit tutorials to decrease home hazards in a low-income neighborhood in Pakistan. A longer-term randomized study is needed to assess the actual effectiveness of using allied health workers for home-based injury education and whether this results in decreased home injuries. PMID:23502323
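
    The reported effect sizes are plain 2x2-table odds ratios with Wald confidence intervals; a generic helper is sketched below, with invented counts rather than the study's data.

    import math

    def odds_ratio_ci(a: int, b: int, c: int, d: int, z: float = 1.96):
        """OR and Wald CI for a 2x2 table:
        a/b = tutorial group with/without hazard reduction,
        c/d = pamphlet group with/without hazard reduction."""
        or_ = (a * d) / (b * c)
        se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
        lo = math.exp(math.log(or_) - z * se)
        hi = math.exp(math.log(or_) + z * se)
        return or_, lo, hi

    print(odds_ratio_ci(90, 160, 60, 185))   # illustrative counts only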

  15. Brace for the Next Threat to a Safe School Environment.

    ERIC Educational Resources Information Center

    Reecer, Marcia

    1988-01-01

    Outlines the environmental health hazard caused by radon gas percolating into buildings. Statistical data from school buildings tested for radon show that levels are generally low. Discusses how to test for the gas and effective ways to prevent its buildup. Includes a sidebar describing radon. (MD)

  16. Warning: Your School May Be Hazardous to Your Health.

    ERIC Educational Resources Information Center

    Gursky, Daniel

    1991-01-01

    Many teachers unknowingly breathe air and drink water with low levels of harmful material. Exposure over the years may present significant health risks. This article examines the problems of indoor air pollution, pesticides, asbestos, lead in drinking water, and radon. Each section includes sources for further information. (SM)

  17. MICROSCALE EXTRACTION OF PERCHLORATE IN DRINKING WATER WITH LOW LEVEL DETECTION BY ELECTROSPRAY-MASS SPECTROMETRY.

    EPA Science Inventory

    Improper treatment and disposal of perchlorate can be an environmental hazard in regions where solid rocket motors are used, tested, or stored. The solubility and mobility of perchlorate lends itself to ground water contamination, and some of these sources are used for drinking ...

  18. Application of a low level, uniform ultrasound field for the acceleration of enzymatic bio-processing of cotton

    USDA-ARS?s Scientific Manuscript database

    Enzymatic bio-processing of cotton generates significantly less hazardous wastewater effluents, which are readily biodegradable, but it also has several critical shortcomings that impede its acceptance by industries: expensive processing costs and slow reaction rates. Our research has found that th...

  19. Application of Low Level, Uniform Ultrasound Field for Acceleration of Enzymatic Bio-processing of Cotton

    USDA-ARS?s Scientific Manuscript database

    Enzymatic bio-processing of cotton generates significantly less hazardous wastewater effluents, which are readily biodegradable, but it also has several critical shortcomings that impede its acceptance by industries: expensive processing costs and slow reaction rates. Our research has found that th...

  20. Chapter 3: Simulating fire hazard across landscapes through time: integrating state-and-transition models with the Fuel Characteristic Classification System

    Treesearch

    Jessica E. Halofsky; Stephanie K. Hart; Miles A. Hemstrom; Joshua S. Halofsky; Morris C. Johnson

    2014-01-01

    Information on the effects of management activities such as fuel reduction treatments and of processes such as vegetation growth and disturbance on fire hazard can help land managers prioritize treatments across a landscape to best meet management goals. State-and-transition models (STMs) allow landscape-scale simulations that incorporate effects of succession,...

  1. Landslide activity as a threat to infrastructure in river valleys - An example from outer Western Carpathians (Poland)

    NASA Astrophysics Data System (ADS)

    Łuszczyńska, Katarzyna; Wistuba, Małgorzata; Malik, Ireneusz

    2017-11-01

    Intensive development of the Polish Carpathians is increasing the scale of landslide risk. Detecting landslide hazards and risks has thus become an important issue for spatial planning in the area. We applied dendrochronological methods and GIS analysis for a better understanding of landslide activity and related hazards in the test area (3.75 km2): the Salomonka valley and nearby slopes in the Beskid Żywiecki Mts., Outer Western Carpathians, southern Poland. We applied an eccentricity index of the radial growth of trees to date past landslide events. The dendrochronological results allowed us to determine the mean frequency of landsliding at each sampling point, which we then interpolated into a map of landslide hazard. In total we took samples at 46 points, sampling three coniferous trees at each point. The landslide hazard map shows a medium (23 sampling points) or low (20 sampling points) level of landslide activity for most of the area. The highest level of activity was recorded for the largest landslide. Results of the dendrochronological study suggest that all landslides reaching downslope to the Salomonka valley floor are active. LiDAR-based analysis of relief shows that there is active coupling between those landslides and the river channel. Thus channel damming and the formation of an episodic lake are probable. The hazard of flooding the valley floor upstream of active landslides should be included in the local spatial planning system and crisis management system.
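
    One plausible way to turn per-point landsliding frequencies into a continuous hazard surface, as the study does, is scattered-data interpolation; the scipy sketch below uses synthetic coordinates and frequencies, not the study's measurements.

    import numpy as np
    from scipy.interpolate import griddata

    rng = np.random.default_rng(0)

    # 46 sampling points, each with a mean landsliding frequency (synthetic)
    pts = rng.uniform(0, 2000, size=(46, 2))   # x, y coordinates in metres
    freq = rng.uniform(0.0, 0.5, size=46)      # e.g. dated events per decade

    # interpolate point values onto a regular grid to form a hazard surface
    gx, gy = np.mgrid[0:2000:100j, 0:2000:100j]
    hazard = griddata(pts, freq, (gx, gy), method="linear")  # NaN outside hull
    print(f"gridded hazard surface: {hazard.shape}, max {np.nanmax(hazard):.2f}")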

  2. Comparison of landslide hazard and risk assessment practices in Europe

    NASA Astrophysics Data System (ADS)

    Corominas, J.; Mavrouli, O.

    2012-04-01

    An overview is made of the landslide hazard and risk assessment practices that are officially promoted or applied in Europe by administration offices, geological surveys, and decision makers (recommendations, regulations and codes). The reported countries are: Andorra, Austria, France, Italy (selected river basins), Romania, Spain (Catalonia), Switzerland and the United Kingdom. The objective here was to compare the different practices for hazard and risk evaluation with respect to the official policies, the methodologies used (qualitative and quantitative), the outputs provided and their contents, and the terminology and map symbols used. The main observations are illustrated with examples, and the possibility of harmonizing the policies and applying common practices to bridge the existing gaps is discussed. Some of the conclusions reached include the following: zoning maps are legally binding for public administrators and land owners only in some cases, and generally when referring to site-specific or local scales rather than regional or national ones; so far, information is mainly provided on landslide susceptibility, and hazard and risk assessments are performed only in a few countries; there is variation in the use of scales between countries; the classification criteria for landslide types and mechanisms present large diversity even within the same country (in some cases no landslide mechanisms are specified while in others there is an exhaustive list); the techniques to obtain input data for the landslide inventory and susceptibility maps vary from basic to sophisticated, resulting in various levels of data quality and quantity; the procedures followed for hazard and risk assessment include analytical procedures supported by computer simulation, weighted indicators, expert judgment, field survey-based approaches, or a combination of these; and there is important variation between hazard and risk matrices with respect to the parameters used, the thresholds defining the different hazard or risk levels, and the number and physical interpretation of the latter. In this context, suggestions are made to bridge the gaps between the practices and to enhance homogenization of the hazard and risk assessment procedures and of their outputs. This work is presented within the framework of the SafeLand project funded by the European Commission's FP7 programme.

  3. Developing a smartphone software package for predicting atmospheric pollutant concentrations at mobile locations

    PubMed Central

    Larkin, Andrew; Williams, David E.; Kile, Molly L.; Baird, William M.

    2014-01-01

    Background There is considerable evidence that exposure to air pollution is harmful to health. In the U.S., ambient air quality is monitored by Federal and State agencies for regulatory purposes. There are limited options, however, for people to access this data in real time, which hinders an individual's ability to manage their own risks. This paper describes a new software package that models environmental concentrations of fine particulate matter (PM2.5), coarse particulate matter (PM10), and ozone for the state of Oregon and calculates personal health risks at the smartphone's current location. Predicted air pollution risk levels can be displayed on mobile devices as interactive maps and graphs color-coded to coincide with EPA air quality index (AQI) categories. Users can set air quality warning levels via color-coded bars and are notified whenever predicted levels within 10 km exceed those warning levels. We validated the software using data from participants as well as from simulations, which showed that the application was capable of identifying spatial and temporal air quality trends. This unique application provides a potential low-cost technology for reducing personal exposure to air pollution, which can improve quality of life, particularly for people with health conditions, such as asthma, that make them more susceptible to these hazards. PMID:26146409
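
    Color-coding predictions to AQI-style categories is essentially a threshold lookup; the sketch below illustrates the idea for PM2.5, with placeholder breakpoints that should not be mistaken for current official EPA values.

    # upper bound (ug/m3) -> label; placeholder thresholds, not official EPA ones
    BANDS = [
        (12.0, "Good"),
        (35.4, "Moderate"),
        (55.4, "Unhealthy for Sensitive Groups"),
        (150.4, "Unhealthy"),
        (250.4, "Very Unhealthy"),
    ]

    def pm25_category(conc_ug_m3: float) -> str:
        """Map a PM2.5 concentration to an AQI-style category band."""
        for upper, label in BANDS:
            if conc_ug_m3 <= upper:
                return label
        return "Hazardous"

    print(pm25_category(8.0), "/", pm25_category(42.0))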

  4. Risk assessment predictions of open dumping area after closure using Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Pauzi, Nur Irfah Mohd; Radhi, Mohd Shahril Mat; Omar, Husaini

    2017-10-01

    Currently, there are many abandoned open dumping areas that were left without proper mitigation measures. These open dumping areas could pose serious hazards to humans and pollute the environment. The objective of this paper is to determine the risk at an open dumping area after it has been closed, using the Monte Carlo simulation method. The risk assessment exercise was conducted at the Kuala Lumpur dumping area. The rapid urbanisation of Kuala Lumpur, coupled with the increase in population, leads to an increase in waste generation and, in turn, to more dumping/landfill area in Kuala Lumpur. The first stage of this study involved assessment of the dumping area and sample collection, followed by measurement of the settlement of the dumping area using an oedometer. The risk of settlement is predicted using the Monte Carlo simulation method, which calculates the risk and the long-term settlement. The model simulation results show that the risk level of the Kuala Lumpur open dumping area ranges between Level III and Level IV, i.e. between medium and high risk. The predicted settlement (ΔH) is between 3 and 7 meters. Since the risk is between medium and high, mitigation measures are required, such as replacing the top waste soil with new sandy gravel soil. This will increase the strength of the soil and reduce the settlement.
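
    A stripped-down version of such a Monte Carlo settlement analysis is sketched below, propagating invented input distributions through a one-layer consolidation formula; it shows the mechanics only, not the paper's calibrated model.

    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000

    # uncertain inputs (all distributions are illustrative assumptions)
    h0 = rng.normal(20.0, 2.0, n)              # waste thickness (m)
    cc = rng.lognormal(np.log(0.8), 0.25, n)   # compression index of waste
    stress_ratio = rng.uniform(2.0, 6.0, n)    # final / initial effective stress
    e0 = 1.5                                   # initial void ratio (assumed)

    # one-layer consolidation settlement: dH = H0 * Cc/(1+e0) * log10(ratio)
    dh = h0 * cc / (1.0 + e0) * np.log10(stress_ratio)

    print(f"mean settlement {dh.mean():.1f} m, 5-95% range "
          f"{np.percentile(dh, 5):.1f}-{np.percentile(dh, 95):.1f} m")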

  5. Agent-based Modeling with MATSim for Hazards Evacuation Planning

    NASA Astrophysics Data System (ADS)

    Jones, J. M.; Ng, P.; Henry, K.; Peters, J.; Wood, N. J.

    2015-12-01

    Hazard evacuation planning requires robust modeling tools and techniques, such as least cost distance or agent-based modeling, to gain an understanding of a community's potential to reach safety before event (e.g. tsunami) arrival. Least cost distance modeling provides a static view of the evacuation landscape with an estimate of travel times to safety from each location in the hazard space. With this information, practitioners can assess a community's overall ability for timely evacuation. More information may be needed if evacuee congestion creates bottlenecks in the flow patterns. Dynamic movement patterns are best explored with agent-based models that simulate movement of and interaction between individual agents as evacuees through the hazard space, reacting to potential congestion areas along the evacuation route. The multi-agent transport simulation model MATSim is an agent-based modeling framework that can be applied to hazard evacuation planning. Developed jointly by universities in Switzerland and Germany, MATSim is open-source software written in Java and freely available for modification or enhancement. We successfully used MATSim to illustrate tsunami evacuation challenges in two island communities in California, USA, that are impacted by limited escape routes. However, working with MATSim's data preparation, simulation, and visualization modules in an integrated development environment requires a significant investment of time to develop the software expertise to link the modules and run a simulation. To facilitate our evacuation research, we packaged the MATSim modules into a single application tailored to the needs of the hazards community. By exposing the modeling parameters of interest to researchers in an intuitive user interface and hiding the software complexities, we bring agent-based modeling closer to practitioners and provide access to the powerful visual and analytic information that this modeling can provide.
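
    The static least-cost view mentioned above reduces to a shortest-path computation on a weighted street graph; with networkx this takes a few lines (the toy network and walk times below are invented, and this is not MATSim's agent-based machinery).

    import networkx as nx

    # invented street network: nodes are places, weights are walk times (min)
    G = nx.Graph()
    G.add_weighted_edges_from([
        ("beach", "junction", 4.0),
        ("junction", "school", 6.0),
        ("junction", "hill_gate", 3.0),
        ("hill_gate", "safe_zone", 5.0),
        ("school", "safe_zone", 9.0),
    ])

    # least-cost travel time from every location to the safety point
    times = nx.single_source_dijkstra_path_length(G, "safe_zone", weight="weight")
    for node, t in sorted(times.items(), key=lambda kv: kv[1]):
        print(f"{node}: {t:.0f} min to safety")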

  6. Spatially explicit assessment of heat health risk by using multi-sensor remote sensing images and socioeconomic data in Yangtze River Delta, China.

    PubMed

    Chen, Qian; Ding, Mingjun; Yang, Xuchao; Hu, Kejia; Qi, Jiaguo

    2018-05-25

    The increase in the frequency and intensity of extreme heat events, which are potentially associated with climate change in the near future, highlights the importance of heat health risk assessment, a significant reference for heat-related death reduction and intervention. However, a spatiotemporal mismatch exists between gridded heat hazard and human exposure in risk assessment, which hinders the identification of high-risk areas at finer scales. A human settlement index integrated by nighttime light images, enhanced vegetation index, and digital elevation model data was utilized to assess the human exposure at high spatial resolution. Heat hazard and vulnerability index were generated by land surface temperature and demographic and socioeconomic census data, respectively. Spatially explicit assessment of heat health risk and its driving factors was conducted in the Yangtze River Delta (YRD), east China at 250 m pixel level. High-risk areas were mainly distributed in the urbanized areas of YRD, which were mostly driven by high human exposure and heat hazard index. In some less-urbanized cities and suburban and rural areas of mega-cities, the heat health risks are in second priority. The risks in some less-developed areas were high despite the low human exposure index because of high heat hazard and vulnerability index. This study illustrated a methodology for identifying high-risk areas by combining freely available multi-source data. Highly urbanized areas were considered hotspots of high heat health risks, which were largely driven by the increasing urban heat island effects and population density in urban areas. Repercussions of overheating were weakened due to the low social vulnerability in some central areas benefitting from the low proportion of sensitive population or the high level of socioeconomic development. By contrast, high social vulnerability intensifies heat health risks in some less-urbanized cities and suburban areas of mega-cities.
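
    A common convention for such pixel-level assessments, consistent with this study's description, is to combine normalized hazard, exposure, and vulnerability layers multiplicatively; the numpy sketch below uses random placeholder rasters in place of the study's satellite and census inputs.

    import numpy as np

    def normalize(x: np.ndarray) -> np.ndarray:
        return (x - x.min()) / (x.max() - x.min())

    rng = np.random.default_rng(1)
    shape = (100, 100)                           # stand-in for 250 m pixels

    hazard = normalize(rng.random(shape))        # e.g. land surface temperature
    exposure = normalize(rng.random(shape))      # e.g. human settlement index
    vulnerability = normalize(rng.random(shape)) # e.g. census-based index

    risk = hazard * exposure * vulnerability     # multiplicative composite
    print(f"99th-percentile risk score: {np.percentile(risk, 99):.3f}")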

  7. Vulnerability Assessment Using LIDAR Data in Silang-Sta Rosa Subwatershed, Philippines

    NASA Astrophysics Data System (ADS)

    Bragais, M. A.; Magcale-Macandog, D. B.; Arizapa, J. L.; Manalo, K. M.

    2016-10-01

    Silang-Sta. Rosa Subwatershed is experiencing rapid urbanization. Its downstream area is already urbanized, and development is moving fast upstream. With the rapid conversion of pervious to impervious areas and the increased frequency of intense rainfall events, the downstream portion of the watershed is at risk of flood hazard. The widely used freeware HEC-RAS (Hydrologic Engineering Center - River Analysis System) model was used to implement a 2D unsteady flow analysis and develop a flood hazard map. The LiDAR-derived digital elevation model (DEM) with 1 m resolution provided the detailed terrain that is vital for producing a reliable flood extent map that can be used for an early warning system. With the detailed information from the simulation, such as the areas to be flooded and the predicted depth and duration, specific flood forecasting and mitigation plans can now be provided even at the community level. The methodology of using 2D unsteady flow modelling and a high-resolution DEM in a watershed can be replicated in other neighbouring watersheds, especially areas that are not yet urbanized, so that their development can be guided to be flood-hazard resilient. LGUs (local government units) all over the country will benefit from having a high-resolution flood hazard map.

  8. Considerations of Environmentally Relevant Test Conditions for Improved Evaluation of Ecological Hazards of Engineered Nanomaterials.

    PubMed

    Holden, Patricia A; Gardea-Torresdey, Jorge L; Klaessig, Fred; Turco, Ronald F; Mortimer, Monika; Hund-Rinke, Kerstin; Cohen Hubal, Elaine A; Avery, David; Barceló, Damià; Behra, Renata; Cohen, Yoram; Deydier-Stephan, Laurence; Ferguson, P Lee; Fernandes, Teresa F; Herr Harthorn, Barbara; Henderson, W Matthew; Hoke, Robert A; Hristozov, Danail; Johnston, John M; Kane, Agnes B; Kapustka, Larry; Keller, Arturo A; Lenihan, Hunter S; Lovell, Wess; Murphy, Catherine J; Nisbet, Roger M; Petersen, Elijah J; Salinas, Edward R; Scheringer, Martin; Sharma, Monita; Speed, David E; Sultan, Yasir; Westerhoff, Paul; White, Jason C; Wiesner, Mark R; Wong, Eva M; Xing, Baoshan; Steele Horan, Meghan; Godwin, Hilary A; Nel, André E

    2016-06-21

    Engineered nanomaterials (ENMs) are increasingly entering the environment with uncertain consequences, including potential ecological effects. Various research communities view differently whether ecotoxicological testing of ENMs should be conducted using environmentally relevant concentrations (where observing outcomes is difficult) versus higher ENM doses, where responses are observable. What exposure conditions are typically used in assessing ENM hazards to populations? What conditions are used to test ecosystem-scale hazards? What is known regarding actual ENMs in the environment, via measurements or modeling simulations? How should exposure conditions, ENM transformation, dose, and body burden be used in interpreting biological and computational findings for assessing risks? These questions were addressed in the context of this critical review. As a result, three main recommendations emerged. First, researchers should improve the ecotoxicology of ENMs by choosing test end points, duration, and study conditions (including ENM test concentrations) that align with realistic exposure scenarios. Second, testing should proceed via tiers with iterative feedback that informs experiments at other levels of biological organization. Finally, environmental realism in ENM hazard assessments should involve greater coordination among ENM quantitative analysts, exposure modelers, and ecotoxicologists, across government, industry, and academia.

  9. A procedure to select ground-motion time histories for deterministic seismic hazard analysis from the Next Generation Attenuation (NGA) database

    NASA Astrophysics Data System (ADS)

    Huang, Duruo; Du, Wenqi; Zhu, Hong

    2017-10-01

    In performance-based seismic design, ground-motion time histories are needed for analyzing the dynamic responses of nonlinear structural systems. However, the number of ground-motion records at the design level is often limited. In order to analyze the seismic performance of structures, ground-motion time histories need to be either selected from recorded strong-motion databases or numerically simulated using stochastic approaches. In this paper, a detailed procedure to select proper acceleration time histories from the Next Generation Attenuation (NGA) database for several cities in Taiwan is presented. Target response spectra are initially determined based on a local ground-motion prediction equation under representative deterministic seismic hazard analyses. Then several suites of ground motions are selected for these cities using the Design Ground Motion Library (DGML), a recently proposed interactive ground-motion selection tool. The selected time histories are representative of the regional seismic hazard and should be beneficial to earthquake studies when comprehensive seismic hazard assessments and site investigations are unavailable. Note that this method is also applicable to site-specific motion selection, with target spectra near the ground surface that account for site effects.
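
    The core of spectrum-based record selection can be approximated by ranking candidate spectra against the target by log-spectral misfit, as sketched below; DGML itself is an interactive tool, and the periods, target ordinates, and record spectra here are all invented.

    import numpy as np

    periods = np.array([0.1, 0.2, 0.5, 1.0, 2.0])       # spectral periods (s)
    target = np.array([0.80, 0.95, 0.60, 0.35, 0.15])   # target Sa (g), assumed

    # candidate record spectra (one row per record), e.g. pre-computed elsewhere
    records = np.array([
        [0.75, 1.00, 0.55, 0.30, 0.12],
        [0.50, 0.70, 0.80, 0.50, 0.25],
        [0.85, 0.90, 0.65, 0.38, 0.16],
    ])

    # rank by mean squared error between log spectra (shape-sensitive misfit)
    mse = np.mean((np.log(records) - np.log(target)) ** 2, axis=1)
    print("record ranking (best first):", np.argsort(mse))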

  10. Loss Estimations due to Earthquakes and Secondary Technological Hazards

    NASA Astrophysics Data System (ADS)

    Frolova, N.; Larionov, V.; Bonnin, J.

    2009-04-01

    Expected loss and damage assessments due to natural and technological disasters are of primary importance for emergency management just after a disaster, as well as for the development and implementation of preventive measures. The paper addresses the procedures and simulation models for loss estimation due to strong earthquakes and secondary technological accidents. The mathematical models for shaking intensity distribution, damage to buildings and structures, debris volume, and the number of fatalities and injuries due to earthquakes and technological accidents at fire- and chemical-hazardous facilities are considered; these are used in geographical information systems designed for these purposes. The criteria for the occurrence of technological accidents are developed on the basis of engineering analysis of the consequences of past events. The paper provides the results of estimating the consequences of scenario earthquakes and of individual seismic risk assessment taking into account secondary technological hazards at regional and urban levels. Individual risk is understood as the probability of death (or injury) due to a possible hazardous event within one year in a given territory. It is determined through the mathematical expectation of social losses, taking into account the number of inhabitants in the considered settlement and the probability of natural and/or technological disaster.
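
    Under our reading of the definition above, individual risk is the annual expectation of fatalities divided by the population at risk; the sketch below shows that arithmetic with invented scenario probabilities and loss figures.

    # invented scenarios: annual occurrence probability and expected deaths
    events = [
        {"p_annual": 0.002, "expected_deaths": 120.0},   # scenario earthquake
        {"p_annual": 0.010, "expected_deaths": 4.0},     # chemical release
    ]
    population = 250_000   # inhabitants of the considered settlement (assumed)

    # expectation of social losses, normalized by population
    individual_risk = sum(e["p_annual"] * e["expected_deaths"] for e in events) / population
    print(f"individual risk ~ {individual_risk:.2e} per year")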

  11. Hazard, Vulnerability and Capacity Mapping for Landslides Risk Analysis using Geographic Information System (GIS)

    NASA Astrophysics Data System (ADS)

    Sari, D. A. P.; Innaqa, S.; Safrilah

    2017-06-01

    This research analyzed the levels of disaster risk in the Citeureup sub-District, Bogor Regency, West Java, based on its potential hazard, vulnerability and capacity, using maps to represent the results; Miles and Huberman analytical techniques were then used to analyze the qualitative interviews. The analysis conducted in this study is based on the concept of disaster risk by Wisner. The results show that the Citeureup sub-District has a medium-low risk of landslides. Of the 14 villages, three villages have a moderate risk level, namely Hambalang, Tajur, and Tangkil, or 49.58% of the total land area. Eleven villages have a low level of risk, namely Pasir Mukti, Sanja, Tarikolot, Gunung Sari, Puspasari, East Karang Asem, Citeureup, Leuwinutug, Sukahati, West Karang Asem and Puspanegara, or 48.68% of the total land area; high-risk areas cover only around 1.74%, which is part of Hambalang village. The analysis using a Geographic Information System (GIS) proves that areas with high risk potential do not necessarily have a high level of risk: the capacity of the community plays an important role in minimizing the risk of a region. The disaster risk reduction strategy is implemented by creating safe conditions, which has intensified the disaster risk reduction movement.
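
    Wisner-style risk indices are often written as Risk = Hazard x Vulnerability / Capacity; the sketch below applies that form to invented village scores (not the study's data) to show how higher community capacity pulls the index down.

    def risk_index(hazard: float, vulnerability: float, capacity: float) -> float:
        """Risk = Hazard x Vulnerability / Capacity (all on comparable scales)."""
        return hazard * vulnerability / capacity

    # illustrative village scores on a 1-5 scale (invented)
    villages = {"A": (4, 3, 2), "B": (2, 2, 3), "C": (3, 3, 2)}
    for name, (h, v, c) in villages.items():
        print(f"village {name}: risk index {risk_index(h, v, c):.2f}")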

  12. A randomized controlled study of manikin simulator fidelity on neonatal resuscitation program learning outcomes.

    PubMed

    Curran, Vernon; Fleet, Lisa; White, Susan; Bessell, Clare; Deshpandey, Akhil; Drover, Anne; Hayward, Mark; Valcour, James

    2015-03-01

    The neonatal resuscitation program (NRP) has been developed to educate physicians and other health care providers about newborn resuscitation and has been shown to improve neonatal resuscitation skills. Simulation-based training is recommended as an effective modality for teaching neonatal resuscitation, and both low- and high-fidelity manikin simulators are used. There is limited research comparing the effect of low- and high-fidelity manikin simulators on NRP learning outcomes, and more specifically on teamwork performance and confidence. The purpose of this study was to examine the effect of using low- versus high-fidelity manikin simulators in NRP instruction. A randomized posttest-only control group study design was conducted. Third-year undergraduate medical students participated in NRP instruction and were assigned to an experimental group (high-fidelity manikin simulator) or control group (low-fidelity manikin simulator). Integrated skills station (megacode) performance, participant satisfaction, confidence, and teamwork behaviour scores were compared between the study groups. Participants in the high-fidelity manikin simulator instructional group reported significantly higher total scores in overall satisfaction (p = 0.001) and confidence (p = 0.001). There were no significant differences in teamwork behaviour scores, as observed by two independent raters, nor differences on mandatory integrated skills station performance items at the p < 0.05 level. Medical students reported greater satisfaction and confidence with high-fidelity manikin simulators, but did not demonstrate significantly improved overall teamwork or integrated skills station performance. Low- and high-fidelity manikin simulators facilitate similar levels of objectively measured NRP outcomes for integrated skills station and teamwork performance.

  13. A modeling approach to account for toxicokinetic interactions in the calculation of biological hazard index for chemical mixtures.

    PubMed

    Haddad, S; Tardif, R; Viau, C; Krishnan, K

    1999-09-05

    The biological hazard index (BHI) is defined as the biological level tolerable for exposure to a mixture, and is calculated by an equation similar to the conventional hazard index. The BHI calculation, at the present time, is advocated for use in situations where toxicokinetic interactions do not occur among mixture constituents. The objective of this study was to develop an approach for calculating an interactions-based BHI for chemical mixtures. The approach consisted of simulating the concentration of the exposure indicator in the biological matrix of choice (e.g. venous blood) for each component of the mixture to which workers are exposed, and then comparing these to the established biological exposure index (BEI) values to calculate the BHI. The simulation of biomarker concentrations was performed using a physiologically based toxicokinetic (PBTK) model which accounted for the mechanism of interactions among all mixture components (e.g. competitive inhibition). The usefulness of the present approach is illustrated by calculating the BHI for varying ambient concentrations of a mixture of three chemicals (toluene (5-40 ppm), m-xylene (10-50 ppm), and ethylbenzene (10-50 ppm)). The results show that the interactions-based BHI can be greater or smaller than that calculated on the basis of the additivity principle, particularly at high exposure concentrations. At lower exposure concentrations (e.g. 20 ppm each of toluene, m-xylene and ethylbenzene), the BHI values obtained using the conventional methodology are similar to those from the interactions-based methodology, confirming that the consequences of competitive inhibition are negligible at lower concentrations. The advantage of the PBTK model-based methodology developed in this study is that the concentrations of individual chemicals in mixtures that will not result in a significant increase in the BHI (i.e. above 1) can be determined by iterative simulation.
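
    The additive BHI itself is just the sum of predicted biomarker levels over their BEIs; the sketch below shows that final step with placeholder values, leaving out the PBTK model that would actually generate the predictions.

    # placeholder BEIs and PBTK-predicted venous blood levels (mg/L); the PBTK
    # step that produces the predictions is not reproduced here
    bei = {"toluene": 0.02, "m_xylene": 1.5, "ethylbenzene": 0.7}
    predicted = {"toluene": 0.012, "m_xylene": 0.60, "ethylbenzene": 0.35}

    bhi = sum(predicted[k] / bei[k] for k in bei)   # additive hazard-index form
    print(f"BHI = {bhi:.2f} -> {'acceptable' if bhi <= 1 else 'exceeds unity'}")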

  14. Combining criteria for delineating lahar- and flash-flood-prone hazard and risk zones for the city of Arequipa, Peru

    NASA Astrophysics Data System (ADS)

    Thouret, J.-C.; Enjolras, G.; Martelli, K.; Santoni, O.; Luque, J. A.; Nagata, M.; Arguedas, A.; Macedo, L.

    2013-02-01

    Arequipa, the second largest city in Peru, is exposed to many natural hazards, most notably earthquakes, volcanic eruptions, landslides, lahars (volcanic debris flows), and flash floods. Of these, lahars and flash floods, triggered by occasional torrential rainfall, pose the most frequently occurring hazards that can affect the city and its environs, in particular the areas containing low-income neighbourhoods. This paper presents and discusses criteria for delineating areas prone to flash flood and lahar hazards, which are localized along the usually dry (except for the rainy season) ravines and channels of the Río Chili and its tributaries that dissect the city. Our risk-evaluation study is based mostly on field surveys and mapping, but we also took into account quality and structural integrity of buildings, available socio-economic data, and information gained from interviews with risk-managers officials. In our evaluation of the vulnerability of various parts of the city, in addition to geological and physical parameters, we also took into account selected socio-economic parameters, such as the educational and poverty level of the population, unemployment figures, and population density. In addition, we utilized a criterion of the "isolation factor", based on distances to access emergency resources (hospitals, shelters or safety areas, and water) in each city block. By combining the hazard, vulnerability and exposure criteria, we produced detailed risk-zone maps at the city-block scale, covering the whole city of Arequipa and adjacent suburbs. Not surprisingly, these maps show that the areas at high risk coincide with blocks or districts with populations at low socio-economic levels. Inhabitants at greatest risk are the poor recent immigrants from rural areas who live in unauthorized settlements in the outskirts of the city in the upper parts of the valleys. Such settlements are highly exposed to natural hazards and have little access to vital resources. Our study provides good rationale for the risk zoning of the city, which in turn may be used as an educational tool for better understanding the potential effects of natural hazards and the exposure of the population residing in and around Arequipa. We hope that our work and the risk-zonation maps will provide the impetus and basis for risk-management authorities of the Municipality and the regional government of Arequipa to enforce existing regulations in building in hazardous zones and to adopt an effective long-term strategy to reduce risks from lahar, flash flood, and other natural hazards.

  15. U-Shaped Association Between Serum Uric Acid Level and Risk of Mortality: A Cohort Study.

    PubMed

    Cho, Sung Kweon; Chang, Yoosoo; Kim, Inah; Ryu, Seungho

    2018-04-25

    In addition to the controversy regarding the association of hyperuricemia with cardiovascular disease (CVD) mortality, few studies have examined the impact of a low uric acid level on mortality. We undertook the present study to evaluate the relationship between both low and high uric acid levels and the risk of all-cause and cause-specific mortality in a large sample of Korean adults over a full range of uric acid levels. A cohort study was performed in 375,163 South Korean men and women who underwent health check-ups from 2002 to 2012. Vital status and cause of death were ascertained from the national death records. Hazard ratios (HRs) and 95% confidence intervals (95% CIs) for mortality outcomes were estimated using Cox proportional hazards regression analysis. During a total of 2,060,721.9 person-years of follow-up, 2,020 participants died, with 287 CVD deaths and 963 cancer deaths. Low and high uric acid levels were associated with increased all-cause, CVD, and cancer mortality. The multivariable-adjusted HRs for all-cause mortality in the lowest uric acid categories (<3.5 mg/dl for men and <2.5 mg/dl for women) compared with the sex-specific reference category were 1.58 (95% CI 1.18-2.10) and 1.80 (95% CI 1.10-2.93), respectively. Corresponding HRs in the highest uric acid categories (≥9.5 mg/dl for men and ≥8.5 mg/dl for women) were 2.39 (95% CI 1.57-3.66) and 3.77 (95% CI 1.17-12.17), respectively. In this large cohort study of men and women, both low and high uric acid levels were predictive of increased mortality, supporting a U-shaped association between serum uric acid levels and adverse health outcomes. © 2018, American College of Rheumatology.

  16. New version of 1 km global river flood hazard maps for the next generation of Aqueduct Global Flood Analyzer

    NASA Astrophysics Data System (ADS)

    Sutanudjaja, Edwin; van Beek, Rens; Winsemius, Hessel; Ward, Philip; Bierkens, Marc

    2017-04-01

    The Aqueduct Global Flood Analyzer, launched in 2015, is an open-access and free-of-charge web-based interactive platform which assesses and visualises current and future projections of river flood impacts across the globe. One of the key components of the Analyzer is a set of river flood inundation hazard maps derived from the global hydrological model simulation of PCR-GLOBWB. For the current version of the Analyzer, accessible at http://floods.wri.org/#/, the early-generation PCR-GLOBWB 1.0 was used, simulated at 30 arc-minute (~50 km at the equator) resolution. In this presentation, we will show the new version of these hazard maps. This new version is based on the latest version, PCR-GLOBWB 2.0 (https://github.com/UU-Hydro/PCR-GLOBWB_model, Sutanudjaja et al., 2016, doi:10.5281/zenodo.60764), simulated at 5 arc-minute (~10 km at the equator) resolution. The model simulates daily hydrological and water resource fluxes and storages, including the overbank volume that ends up on the floodplain (if flooding occurs). The simulation was performed for the present-day situation (from 1960) and future climate projections (until 2099) using the climate forcing created in the ISI-MIP project. From the simulated flood inundation volume time series, we then extract annual maxima for each cell and fit these maxima to a Gumbel extreme value distribution. This allows us to derive flood volume maps for any hazard magnitude (ranging from 2-year to 1000-year flood events) and for any time period (e.g. 1960-1999, 2010-2049, 2030-2069, and 2060-2099). The derived flood volumes (at 5 arc-minute resolution) are then spread over the high-resolution terrain model using an updated GLOFRIS downscaling module (Winsemius et al., 2013, doi:10.5194/hess-17-1871-2013). The updated version performs the volume spreading sequentially from more upstream basins to downstream basins, enabling a better inclusion of smaller streams, and takes into account the spreading of water over diverging deltaic regions. This results in a set of high-resolution hazard maps of flood inundation depth at 30 arc-second (~1 km at the equator) resolution. Together with many other updates and new features, the resulting flood hazard maps will be used in the next generation of the Aqueduct Global Flood Analyzer.
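
    The per-cell extreme-value step described above (annual maxima fitted to a Gumbel distribution, then evaluated at chosen return periods) can be sketched in a few lines of Python with scipy; the synthetic input series below is a stand-in for one grid cell's simulated flood volumes.

        # Sketch of the Gumbel return-level step: fit annual maxima, evaluate return periods.
        import numpy as np
        from scipy.stats import gumbel_r

        rng = np.random.default_rng(0)
        annual_maxima = rng.gumbel(loc=5.0, scale=2.0, size=40)  # stand-in for 40 years of maxima

        loc, scale = gumbel_r.fit(annual_maxima)  # maximum-likelihood fit, done per grid cell
        for T in (2, 10, 100, 1000):              # return periods in years
            level = gumbel_r.ppf(1.0 - 1.0 / T, loc=loc, scale=scale)
            print(f"{T:>5}-year flood volume: {level:.2f}")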

  17. Using Adaptive Mesh Refinement to Simulate Storm Surge

    NASA Astrophysics Data System (ADS)

    Mandli, K. T.; Dawson, C.

    2012-12-01

    Coastal hazards related to strong storms such as hurricanes and typhoons are among the most frequently recurring and widespread hazards to coastal communities. Storm surges are among the most devastating effects of these storms, and their prediction and mitigation through numerical simulation is of great interest to coastal communities that need to plan for the rise in sea level during these storms. Unfortunately, these simulations require high resolution in regions of interest to capture relevant effects, resulting in a computational cost that may be intractable. This problem is exacerbated in situations where a large number of similar runs is needed, such as in infrastructure design or forecasting with ensembles of probable storms. One solution to the problem of computational cost is to employ adaptive mesh refinement (AMR) algorithms. AMR works by decomposing the computational domain into regions whose resolution may vary as time proceeds. Decomposing the domain as the flow evolves makes this class of methods effective at ensuring that computational effort is spent only where it is needed, and it places resolution without requiring the user to anticipate the dynamics of the flow or to pre-select particular regions of interest such as harbors. Simulations of many different applications have been made possible only by AMR-type algorithms, which allow otherwise impractical computations to be performed at much lower expense. Our work involves studying how storm surge simulations can be improved with AMR algorithms. We have implemented relevant storm surge physics in the GeoClaw package and tested how Hurricane Ike's surge into Galveston Bay and up the Houston Ship Channel compares to available tide gauge data. We will also discuss issues dealing with refinement criteria, optimal resolution and refinement ratios, and inundation.
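
    As a rough illustration of the refinement-criteria question raised at the end of the abstract, the sketch below flags cells for refinement where the local sea-surface gradient exceeds a tolerance. This is a generic gradient-based criterion for illustration only, not GeoClaw's actual flagging logic.

        # Generic gradient-based AMR flagging criterion (illustrative, not GeoClaw's).
        import numpy as np

        def flag_cells(eta, dx, tol=0.05):
            """Return a boolean mask of coarse-grid cells that need refinement.

            eta : 2D array of sea-surface elevation on the coarse grid
            dx  : coarse-grid spacing
            tol : gradient magnitude per unit length above which we refine
            """
            gy, gx = np.gradient(eta, dx)
            return np.hypot(gx, gy) > tol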

  18. Considering the ranges of uncertainties in the New Probabilistic Seismic Hazard Assessment of Germany - Version 2016

    NASA Astrophysics Data System (ADS)

    Grünthal, Gottfried; Stromeyer, Dietrich; Bosse, Christian; Cotton, Fabrice; Bindi, Dino

    2017-04-01

    The seismic load parameters for the upcoming National Annex to Eurocode 8 result from the reassessment of seismic hazard supported by the German Institution for Civil Engineering. This 2016 version of the hazard assessment for Germany as the target area was based on a comprehensive incorporation of all accessible uncertainties in models and parameters into the approach, and on the provision of a rational framework for treating these uncertainties in a transparent way. The developed seismic hazard model represents significant improvements: it is based on updated and extended databases, comprehensive ranges of models, robust methods, and a selection of ground motion prediction equations of the latest generation. The output specifications were designed according to user-oriented needs, as suggested by the two review teams supervising the entire project. In particular, seismic load parameters were calculated for rock conditions with a vS30 of 800 m/s for three hazard levels (10%, 5% and 2% probability of occurrence or exceedance within 50 years) in the form of, e.g., uniform hazard spectra (UHS) based on 19 spectral periods in the range of 0.01-3 s, and seismic hazard maps for spectral response accelerations at different spectral periods or for macroseismic intensities. The developed hazard model consists of a logic tree with 4040 end branches and essential innovations employed to capture epistemic uncertainties and aleatory variabilities. The computation scheme enables the sound calculation of the mean and any quantile of the required seismic load parameters. Mean, median and 84th percentiles of the load parameters were provided together with the full calculation model, to clearly illustrate the uncertainties of such a probabilistic assessment for a region with a low-to-moderate level of seismicity. The regional variations of these uncertainties (e.g. ratios between the mean and median hazard estimations) were analyzed and discussed.
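
    Collapsing the 4040-end-branch logic tree into mean, median and 84th-percentile load parameters amounts to weighted statistics over the branches. The sketch below shows that computation for a single site and spectral period; the input file names are hypothetical placeholders.

        # Weighted mean and quantiles over logic-tree end branches (hypothetical inputs).
        import numpy as np

        def weighted_quantile(values, weights, q):
            order = np.argsort(values)
            v, w = values[order], weights[order]
            cdf = np.cumsum(w) / np.sum(w)
            return np.interp(q, cdf, v)

        branch_sa = np.loadtxt("branch_sa.txt")     # one hazard estimate per end branch
        weights = np.loadtxt("branch_weights.txt")  # logic-tree weights, summing to 1

        mean = np.average(branch_sa, weights=weights)
        median = weighted_quantile(branch_sa, weights, 0.50)
        p84 = weighted_quantile(branch_sa, weights, 0.84)
        print(mean, median, p84)  # e.g. the mean/median ratio indicates the assessment's uncertainty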

  19. Atomic oxygen effects on materials

    NASA Technical Reports Server (NTRS)

    Banks, Bruce A.; Rutledge, Sharon K.; Brady, Joyce A.; Merrow, James E.

    1989-01-01

    Understanding of the basic processes of atomic oxygen interaction is currently at a very elementary level. However, measurements of erosion yields, surface morphology, and optical properties at low fluences have brought about much progress in the past decade. The mechanisms, and the factors that are important for proper simulation of low Earth orbit, are understood at a much lower level. The ability to use laboratory simulations with confidence to quantifiably address the functional performance and durability of materials in low Earth orbit will be necessary to assure long-term survivability in the natural space environment.

  20. A risk assessment approach for fresh fruits.

    PubMed

    Bassett, J; McClure, P

    2008-04-01

    To describe the approach used in conducting a fit-for-purpose risk assessment of microbiological human pathogens associated with fresh fruit, and the risk management recommendations made. A qualitative risk assessment for microbiological hazards in fresh fruit was carried out based on the Codex Alimentarius (Codex) framework, modified to consider multiple hazards and all fresh (whole) fruits. The assessment identifies 14 significant bacterial, viral, protozoal and nematodal hazards associated with fresh produce, assesses the probable level of exposure from fresh fruit, draws conclusions on the risk from each hazard, and considers and recommends risk management actions. A review of potential risk management options allowed the comparison of their effectiveness with the potential exposure to each hazard. Washing to a recommended protocol is an appropriate risk management action for the vast majority of consumption events, particularly when good agricultural and hygienic practices are followed, with the addition of refrigerated storage for low-acid fruit. Additional safeguards are recommended for aggregate fruits with respect to the risk from protozoa. The potentially complex process of assessing the risks of multiple hazards in multiple but similar commodities can be simplified by a qualitative assessment approach that employs the Codex methodology.

  1. Risk of death from cardiovascular disease associated with low-level arsenic exposure among long-term smokers in a US population-based study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Farzan, Shohreh F.; Chen, Yu

    High levels of arsenic exposure have been associated with increases in cardiovascular disease risk. However, studies of arsenic's effects at lower exposure levels are limited, and few prospective studies exist in the United States using long-term arsenic exposure biomarkers. We conducted a prospective analysis of the association between toenail arsenic and cardiovascular disease mortality using longitudinal data collected on 3939 participants in the New Hampshire Skin Cancer Study. Using Cox proportional hazards models adjusted for potential confounders, we estimated hazard ratios and 95% confidence intervals associated with the risk of death from any cardiovascular disease, ischemic heart disease, and stroke, in relation to natural-log-transformed toenail arsenic concentrations. In this US population, although we observed no overall association, arsenic exposure measured from toenail clipping samples was related to an increased risk of ischemic heart disease mortality among long-term smokers (as reported at baseline), with increased hazard ratios among individuals with ≥ 31 total smoking years (HR: 1.52, 95% CI: 1.02, 2.27), ≥ 30 pack-years (HR: 1.66, 95% CI: 1.12, 2.45), and among current smokers (HR: 1.69, 95% CI: 1.04, 2.75). These results are consistent with evidence from more highly exposed populations suggesting a synergistic relationship between arsenic exposure and smoking on health outcomes, and support a role for lower-level arsenic exposure in ischemic heart disease mortality. - Highlights: • Arsenic (As) has been associated with increased cardiovascular disease (CVD) risk. • Little is known about CVD effects at lower levels of As exposure common in the US. • Few have investigated the joint effects of As and smoking on CVD in US adults. • We examine chronic low-level As exposure and smoking in relation to CVD mortality. • Arsenic exposure may increase ischemic heart disease mortality among smokers in US.

  2. Forecasting surface water flooding hazard and impact in real-time

    NASA Astrophysics Data System (ADS)

    Cole, Steven J.; Moore, Robert J.; Wells, Steven C.

    2016-04-01

    Across the world, there is increasing demand for more robust and timely forecast and alert information on Surface Water Flooding (SWF). Within a UK context, the government's Pitt Review into the Summer 2007 floods provided recommendations and impetus to improve the understanding of SWF risk, for both off-line design and real-time forecasting and warning. Ongoing development and trialling of an end-to-end real-time SWF system is progressing through the recently formed Natural Hazards Partnership (NHP), with delivery to the Flood Forecasting Centre (FFC) providing coverage over England & Wales. The NHP is a unique forum that aims to deliver coordinated assessments, research and advice on natural hazards for governments and resilience communities across the UK. Within the NHP, a real-time Hazard Impact Model (HIM) framework has been developed that includes SWF as one of three hazards chosen for initial trialling. The trial SWF HIM system uses dynamic gridded surface-runoff estimates from the Grid-to-Grid (G2G) hydrological model to estimate the SWF hazard. National datasets on population, infrastructure, property and transport are available to assess impact severity for a given rarity of SWF hazard. Whilst the SWF hazard footprint is calculated in real-time using 1, 3 and 6 hour accumulations of G2G surface runoff on a 1 km grid, it has been possible to associate these with the effective rainfall design profiles (at 250 m resolution) used as input to a detailed flood inundation model (JFlow+) run offline to produce hazard information resolved to 2 m resolution. This information is contained in the updated Flood Map for Surface Water (uFMfSW) held by the Environment Agency. The national impact datasets can then be used with the uFMfSW SWF hazard dataset to assess impacts at this scale, with severity levels of potential impact assigned at 1 km and for aggregated county areas in real-time. The impact component is being led by the Health and Safety Laboratory (HSL) within the NHP. Flood Guidance within the FFC employs the national Flood Risk Matrix, which categorises potential impacts into minimal, minor, significant and severe classes, and likelihood into very low, low, medium and high classes; the matrix entries then define the Overall Flood Risk as very low, low, medium or high. Likelihood is quantified by running G2G with Met Office ensemble rainfall inputs, which in turn allows a probability to be assigned to the SWF hazard and associated impact. This overall procedure is being trialled and refined off-line by CEH and HSL using case study data, and at the same time implemented as a pre-operational test system at the Met Office for evaluation by the FFC (a joint Environment Agency and Met Office centre for flood forecasting) in 2016.
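
    The Flood Risk Matrix lookup described above can be sketched as a simple table. The category names follow the abstract, but the cell-by-cell assignments below are an illustrative placeholder; the FFC's operational matrix entries are not given in the abstract.

        # Illustrative Flood Risk Matrix lookup (cell assignments are placeholders).
        IMPACTS = ["minimal", "minor", "significant", "severe"]
        LIKELIHOODS = ["very low", "low", "medium", "high"]

        MATRIX = [  # rows: impact, columns: likelihood
            ["very low", "very low", "very low", "low"],       # minimal
            ["very low", "low",      "low",      "medium"],    # minor
            ["low",      "medium",   "medium",   "high"],      # significant
            ["medium",   "medium",   "high",     "high"],      # severe
        ]

        def overall_flood_risk(impact: str, likelihood: str) -> str:
            return MATRIX[IMPACTS.index(impact)][LIKELIHOODS.index(likelihood)]

        print(overall_flood_risk("significant", "medium"))  # -> "medium"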

  3. Association between Recurrent Metastasis from Stage II and III Primary Colorectal Tumors and Moderate Microsatellite Instability

    PubMed Central

    Garcia, Melissa; Choi, Chan; Kim, Hyeong-Rok; Daoud, Yahya; Toiyama, Yuji; Takahashi, Masanobu; Goel, Ajay; Boland, C Richard; Koi, Minoru

    2012-01-01

    Colorectal cancer (CRC) cells frequently have low levels of microsatellite instability (MSI-L) and elevated microsatellite alterations at tetranucleotide repeats (EMAST), but little is known about the clinicopathological significance of these features. We observed that patients with stage II or III CRC with MSI-L and/or EMAST had shorter recurrence-free survival times than patients with high levels of MSI (MSI-H) (P=.0084) or with highly stable microsatellites (H-MSS) (P=.0415), based on Kaplan-Meier analysis. MSI-L and/or EMAST were independent predictors of recurrent distant metastasis from primary stage II or III colorectal tumors (Cox proportional hazards analysis: hazard ratio 1.83; 95% confidence interval 1.06–3.15; P=.0301). PMID:22465427

  4. Simulation of a G-tolerance curve using the pulsatile cardiovascular model

    NASA Technical Reports Server (NTRS)

    Solomon, M.; Srinivasan, R.

    1985-01-01

    A computer simulation study, performed to assess the ability of the cardiovascular model to reproduce the G-tolerance curve (G level versus tolerance time), is reported. A composite strength-duration curve derived from experimental data obtained in human centrifugation studies was used for comparison. The effects of abolishing autonomic control and of blood volume loss on G tolerance were also simulated. The results provide additional validation of the model. The need for the presence of autonomic reflexes even at low levels of G is pointed out. The low margin of safety with a loss of blood volume indicated by the simulation results underscores the necessity for protective measures during Shuttle reentry.

  5. Piloted Simulation Study of the Effects of High-Lift Aerodynamics on the Takeoff Noise of a Representative High-Speed Civil Transport

    NASA Technical Reports Server (NTRS)

    Glaab, Louis J.; Riley, Donald R.; Brandon, Jay M.; Person, Lee H., Jr.; Glaab, Patricia C.

    1999-01-01

    As part of an effort between NASA and private industry to reduce airport-community noise for high-speed civil transport (HSCT) concepts, a piloted simulation study was initiated for the purpose of predicting the noise reduction benefits that could result from improved low-speed high-lift aerodynamic performance for a typical HSCT configuration during takeoff and initial climb. Flight profile and engine information from the piloted simulation were coupled with the NASA Langley Aircraft Noise Prediction Program (ANOPP) to estimate jet engine noise and to propagate the resulting source noise to ground observer stations. A baseline aircraft configuration, which also incorporated different levels of projected improvements in low-speed high-lift aerodynamic performance, was simulated to investigate effects of increased lift and lift-to-drag ratio on takeoff noise levels. Simulated takeoff flights were performed with the pilots following a specified procedure in which either a single thrust cutback was performed at selected altitudes ranging from 400 to 2000 ft, or a multiple-cutback procedure was performed where thrust was reduced by a two-step process. Results show that improved low-speed high-lift aerodynamic performance provides at least a 4 to 6 dB reduction in effective perceived noise level at the FAA downrange flyover measurement station for either cutback procedure. However, improved low-speed high-lift aerodynamic performance reduced maximum sideline noise levels only when using the multiple-cutback procedures.

  6. Developing a global tsunami propagation database and its application for coastal hazard assessments in China

    NASA Astrophysics Data System (ADS)

    Wang, N.; Tang, L.; Titov, V.; Newman, J. C.; Dong, S.; Wei, Y.

    2013-12-01

    The tragedies of the 2004 Indian Ocean and 2011 Japan tsunamis have increased awareness of tsunami hazards for many nations, including China. The low land level and high population density of China's coastal areas place it at high risk for tsunami hazards. Recent research (Komatsubara and Fujiwara, 2007) highlighted concerns of a magnitude 9.0 earthquake on the Nankai trench, which may affect China's coasts not only in the South China Sea, but also in the East Sea and Yellow Sea. Here we present our work in progress towards developing a global tsunami propagation database that can be used for hazard assessments by many countries. The propagation scenarios are computed using NOAA's MOST numerical model. Each scenario represents a typical Mw 7.5 earthquake with predefined earthquake parameters (Gica et al., 2008). The model grid was interpolated from ETOPO1 at 4 arc-min resolution, covering -80° to 72°N and 0° to 360°E. We use this database for preliminary tsunami hazard assessment along China's coastlines.

  7. The Composite Strain Index (COSI) and Cumulative Strain Index (CUSI): methodologies for quantifying biomechanical stressors for complex tasks and job rotation using the Revised Strain Index.

    PubMed

    Garg, Arun; Moore, J Steven; Kapellusch, Jay M

    2017-08-01

    The Composite Strain Index (COSI) quantifies biomechanical stressors for complex tasks consisting of exertions at different force levels and/or with different exertion times. The Cumulative Strain Index (CUSI) further integrates biomechanical stressors from different tasks to quantify exposure for the entire work shift. The paper provides methodologies to compute COSI and CUSI, along with examples. Complex-task simulation produced 169,214 distinct tasks. Using average force, time-weighted average (TWA) force, peak force, and COSI classified 66.9%, 28.2%, 100% and 38.9% of tasks as hazardous, respectively. For job rotation, the simulation produced 10,920 distinct jobs. TWA COSI, peak task COSI and CUSI classified 36.5%, 78.1% and 66.6% of jobs as hazardous, respectively. The results suggest that the TWA approach systematically underestimates biomechanical stressors and the peak approach overestimates them, at both the task and job level. It is believed that COSI and CUSI partially address these underestimations and overestimations of biomechanical stressors. Practitioner Summary: COSI quantifies exposure when the applied hand force and/or the duration of that force changes during a task cycle. CUSI integrates physical exposures from job rotation. These should be valuable tools for designing and analysing tasks and job rotation to determine the risk of musculoskeletal injuries.
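
    The contrast the abstract draws between averaging approaches can be made concrete with a toy exertion cycle. The sketch below computes the three force summaries being compared; the COSI/CUSI multiplier tables themselves are defined in the paper and are not reproduced here. The exertion data are hypothetical.

        # Average vs. time-weighted average (TWA) vs. peak force for one task cycle.
        forces = [20.0, 45.0, 10.0]   # %MVC of each sub-exertion (hypothetical)
        durations = [2.0, 0.5, 4.0]   # duration of each sub-exertion, seconds

        avg_force = sum(forces) / len(forces)
        twa_force = sum(f * t for f, t in zip(forces, durations)) / sum(durations)
        peak_force = max(forces)

        print(avg_force, twa_force, peak_force)
        # TWA discounts the brief 45 %MVC exertion (tends to underestimate the stressor),
        # while peak treats the whole cycle as 45 %MVC (tends to overestimate it).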

  8. Neighborhood socioeconomic status at the age of 40 years and ischemic stroke before the age of 50 years: A nationwide cohort study from Sweden.

    PubMed

    Carlsson, Axel C; Li, Xinjun; Holzmann, Martin J; Ärnlöv, Johan; Wändell, Per; Gasevic, Danijela; Sundquist, Jan; Sundquist, Kristina

    2017-10-01

    Objective We aimed to study the association between neighborhood socioeconomic status at the age of 40 years and risk of ischemic stroke before the age of 50 years. Methods All individuals in Sweden were included if their 40th birthday occurred between 1998 and 2010. National registers were used to categorize neighborhood socioeconomic status into high, middle, and low and to retrieve information on incident ischemic strokes. Hazard ratios and their 95% confidence intervals were estimated. Results A total of 1,153,451 adults (women 48.9%) were followed for a mean of 5.5 years (SD 3.5 years), during which 1777 (0.30%) strokes among men and 1374 (0.24%) strokes among women were recorded. After adjustment for sex, marital status, education level, immigrant status, region of residence, and neighborhood services, there was a lower risk of stroke in residents from high-socioeconomic status neighborhoods (hazard ratio 0.87, 95% confidence interval 0.78-0.96), and an increased risk of stroke in adults from low-socioeconomic status neighborhoods (hazard ratio 1.16, 95% confidence interval 1.06-1.27), compared to their counterparts living in middle-socioeconomic status neighborhoods. After further adjustment for hospital diagnoses of hypertension, diabetes, heart failure, and atrial fibrillation prior to the age of 40, the higher risk in neighborhoods with low socioeconomic status was attenuated, but remained significant (hazard ratio 1.12, 95% confidence interval 1.02-1.23). Conclusions In a nationwide study of individuals between 40 and 50 years, we found that the risk of ischemic stroke differed depending on neighborhood socioeconomic status, which calls for increased efforts to prevent cardiovascular diseases in low socioeconomic status neighborhoods.

  9. Assessing storm surge hazard and impact of sea level rise in the Lesser Antilles case study of Martinique

    NASA Astrophysics Data System (ADS)

    Krien, Yann; Dudon, Bernard; Roger, Jean; Arnaud, Gael; Zahibo, Narcisse

    2017-09-01

    In the Lesser Antilles, coastal inundations from hurricane-induced storm surges pose a great threat to lives, properties and ecosystems. Assessing current and future storm surge hazards with sufficient spatial resolution is of primary interest to help coastal planners and decision makers develop mitigation and adaptation measures. Here, we use wave-current numerical models and statistical methods to investigate worst case scenarios and 100-year surge levels for the case study of Martinique under present climate or considering a potential sea level rise. Results confirm that the wave setup plays a major role in the Lesser Antilles, where the narrow island shelf impedes the piling-up of large amounts of wind-driven water on the shoreline during extreme events. The radiation stress gradients thus contribute significantly to the total surge - up to 100 % in some cases. The nonlinear interactions of sea level rise (SLR) with bathymetry and topography are generally found to be relatively small in Martinique but can reach several tens of centimeters in low-lying areas where the inundation extent is strongly enhanced compared to present conditions. These findings further emphasize the importance of waves for developing operational storm surge warning systems in the Lesser Antilles and encourage caution when using static methods to assess the impact of sea level rise on storm surge hazard.
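
    The nonlinear SLR interaction quantified in the abstract can be written out explicitly: it is the part of the surge simulated with raised sea level that is not explained by simply adding the SLR offset to the present-day result. A minimal sketch, assuming two model runs saved as hypothetical arrays:

        # Nonlinear interaction of sea level rise with bathymetry/topography.
        import numpy as np

        eta_present = np.load("surge_present.npy")  # max water level, present sea level (hypothetical file)
        eta_slr = np.load("surge_slr.npy")          # same storm, sea level raised by `slr`
        slr = 0.5                                   # metres of sea level rise

        nonlinearity = eta_slr - (eta_present + slr)  # zero where the response is purely linear
        print(float(np.nanmax(np.abs(nonlinearity))))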

  10. Simulation of dynamics of southern pine beetle hazard rating with respect to silvicultural treatment and stand development

    Treesearch

    D. J. Leduc; J. C. G. Goelz

    2010-01-01

    The hazard of southern pine beetle (SPB) infestations is affected by characteristics such as stand density, stand age, site quality, and tree size. COMPUTE P-LOB is a model that simulates the growth and development of loblolly pine plantations in the west gulf coastal plain. P-LOB was rewritten as COMPUTE SPB-Lob to update it for current operating systems and to...

  11. CyberShake: Running Seismic Hazard Workflows on Distributed HPC Resources

    NASA Astrophysics Data System (ADS)

    Callaghan, S.; Maechling, P. J.; Graves, R. W.; Gill, D.; Olsen, K. B.; Milner, K. R.; Yu, J.; Jordan, T. H.

    2013-12-01

    As part of its program of earthquake system science research, the Southern California Earthquake Center (SCEC) has developed a simulation platform, CyberShake, to perform physics-based probabilistic seismic hazard analysis (PSHA) using 3D deterministic wave propagation simulations. CyberShake performs PSHA by simulating a tensor-valued wavefield of Strain Green Tensors, and then using seismic reciprocity to calculate synthetic seismograms for about 415,000 events per site of interest. These seismograms are processed to compute ground motion intensity measures, which are then combined with probabilities from an earthquake rupture forecast to produce a site-specific hazard curve. Seismic hazard curves for hundreds of sites in a region can be used to calculate a seismic hazard map for the region. We present a recently completed PSHA study in which we calculated four CyberShake seismic hazard maps for the Southern California area to compare how CyberShake hazard results are affected by different SGT computational codes (AWP-ODC and AWP-RWG) and different community velocity models (Community Velocity Model - SCEC (CVM-S4) v11.11 and Community Velocity Model - Harvard (CVM-H) v11.9). We present our approach to running workflow applications on distributed HPC resources, including systems without support for remote job submission. We show how our approach extends the benefits of scientific workflows, such as job and data management, to large-scale applications on Track 1 and Leadership-class open-science HPC resources. We used our distributed workflow approach to perform CyberShake Study 13.4 on two new NSF open-science HPC computing resources, Blue Waters and Stampede, executing over 470 million tasks to calculate physics-based hazard curves for 286 locations in the Southern California region. For each location, we calculated seismic hazard curves with two different community velocity models and two different SGT codes, resulting in over 1100 hazard curves. We will report on the performance of this CyberShake study, four times larger than previous studies. Additionally, we will examine the challenges we face applying these workflow techniques to additional open-science HPC systems and discuss whether our workflow solutions continue to provide value to our large-scale PSHA calculations.
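
    The hazard-curve step described above combines each rupture's annual rate with the fraction of its simulated seismograms whose intensity measure exceeds a threshold. The sketch below is a minimal stand-in for that calculation; the data structures are hypothetical, not CyberShake's actual formats.

        # Hazard curve from per-rupture rates and simulated intensity measures (IMs).
        import numpy as np

        def hazard_curve(rupture_rates, rupture_ims, im_levels):
            """Annual exceedance rate at each IM level.

            rupture_rates : annual occurrence rate of each rupture
            rupture_ims   : list of arrays of simulated IM values, one array per rupture
            im_levels     : thresholds at which to evaluate the curve
            """
            rates = np.zeros(len(im_levels))
            for rate, ims in zip(rupture_rates, rupture_ims):
                for i, level in enumerate(im_levels):
                    rates[i] += rate * np.mean(ims > level)  # P(IM > level | rupture)
            return rates

    An annual exceedance rate lambda converts to the usual probability of exceedance in 50 years via 1 - exp(-50 * lambda).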

  12. Investigation of Possible Wellbore Cement Failures During Hydraulic Fracturing Operations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Jihoon; Moridis, George

    2014-11-01

    We model and assess the possibility of shear failure along the vertical well, using the Mohr-Coulomb model, by employing a rigorous coupled flow-geomechanics analysis. To this end, we vary the values of cohesion between the well casing and the surrounding cement to represent different quality levels of the cementing operation (low cohesion corresponds to low-quality cement and/or incomplete cementing). The simulation results show that there is very little fracturing when the cement is of high quality. Conversely, incomplete cementing and/or weak cement can cause significant shear failure and the evolution of long fractures/cracks along the vertical well. Specifically, low cohesion between the well and cemented areas can cause significant shear failure along the well, but the same cohesion as the cemented zone does not cause shear failure. When the hydraulic fracturing pressure is high, low cohesion of the cement can cause fast propagation of shear failure and of the resulting fracture/crack, but a high-quality cement with no weak zones exhibits limited shear failure that is concentrated near the bottom of the vertical part of the well. Thus, high-quality cement and complete cementing along the vertical well appear to be the strongest protection against shear failure of the wellbore cement and, consequently, against contamination hazards to drinking water aquifers during hydraulic fracturing operations.

  13. Towards a GLOF hazard map for the city of Huaraz, Cordillera Blanca, Peru

    NASA Astrophysics Data System (ADS)

    Frey, Holger; Huggel, Christian; E Chisolm, Rachel; Gonzales, César; Cochachin, Alejo; Portocarrero, César

    2017-04-01

    Huaraz, with 120,000 inhabitants, is the largest city at the foot of the Cordillera Blanca Mountain Range, Peru, and is located at the confluence of the Quillcay River with the main Santa River. Three moraine-dammed glacier lakes are located in the headwaters of the Quillcay catchment and pose a potential threat of glacier lake outburst floods (GLOFs) to Huaraz: Laguna Cuchillacocha (2.5 x 10^6 m3), Laguna Tullparaju (12 x 10^6 m3), and Laguna Palcacocha (17 x 10^6 m3). The latter burst out in 1941, causing one of the deadliest GLOFs known in history, with about 2000 casualties, and destroying a third of the city of Huaraz. Currently, the presence of these lakes within potential runout distances of possibly very large ice or rock/ice avalanches, combined with the large damage potential in the city of Huaraz, some 20 km downstream of the lakes, and further potentially endangered infrastructure such as the city of Trujillo, large-scale irrigation projects and hydropower plants along the Santa River, poses a high-risk situation, even though safety systems were constructed at all three lakes in the last century. At Laguna Palcacocha, temporary measures are in place, such as siphoning and permanent supervision by a team of observers. For the future, more permanent measures are planned, including non-structural measures such as a sensor-based early warning system for the entire catchment. In this framework, a preliminary GLOF hazard map for the entire Quillcay catchment has been developed, based on physically based numerical modeling. For each of the three lakes, three scenarios of different magnitudes and related probabilities were modeled. For each case, a series of models was used to simulate each part of the chain of interacting processes. The resulting GLOFs were simulated with FLO2D for Palcacocha and RAMMS for Tullparaju and Cuchillacocha. Small, medium and large scenarios were merged for all three lakes, in order to arrive at a single hazard map for the entire catchment. Inundation heights were first translated into intensities, and then intensities were converted into hazard levels according to the probability of the scenario, which resulted in the preliminary hazard map. This map is currently used for informing the population and for planning further mitigation actions. For the development of the final hazard map, more detailed simulations in the urban area of Huaraz are needed, combined with field mapping to adjust the map to local conditions and peculiarities. Related efforts are ongoing, in close collaboration with local institutions and authorities. Besides the scientific challenges in developing such a hazard map, the institutional process for its official approval and legal validation is a major challenge that needs to be tackled.
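
    The map-building step at the end of the abstract (inundation heights to intensities, intensities plus scenario probability to hazard levels) can be sketched as two small lookups. The depth thresholds and the combination table below are illustrative placeholders, not the values used in the study.

        # Illustrative depth-to-intensity classification and intensity/probability combination.
        def intensity(depth_m: float) -> str:
            if depth_m > 2.0:
                return "high"
            if depth_m > 0.5:
                return "medium"
            return "low"

        # (intensity, scenario probability) -> hazard level; per cell, the highest level is kept.
        HAZARD = {
            ("high", "high"): "high",     ("high", "medium"): "high",     ("high", "low"): "medium",
            ("medium", "high"): "high",   ("medium", "medium"): "medium", ("medium", "low"): "low",
            ("low", "high"): "medium",    ("low", "medium"): "low",       ("low", "low"): "low",
        }

        print(HAZARD[(intensity(1.2), "medium")])  # -> "medium"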

  14. Tsunami hazard map in eastern Bali

    NASA Astrophysics Data System (ADS)

    Afif, Haunan; Cipta, Athanasius

    2015-04-01

    Bali is a popular tourist destination for both Indonesian and foreign visitors. However, Bali is located close to the collision zone between the Indo-Australian Plate and the Eurasian Plate in the south and to the back-arc thrust off its northern coast, making Bali prone to earthquakes and tsunamis. A tsunami hazard map is needed for a better understanding of the hazard level in a particular area, and tsunami modeling is one of the most reliable techniques for producing such a map. Tsunami modeling was conducted using TUNAMI N2, set up for two tsunami source scenarios: the subduction zone south of Bali and the back thrust north of Bali. The tsunami hazard zone is divided into 3 zones: the first is a high hazard zone with inundation heights of more than 3 m; the second is a moderate hazard zone with inundation heights of 1 to 3 m; and the third is a low hazard zone with inundation heights of less than 1 m. The two scenarios showed that the southern region has a greater potential tsunami impact than the northern areas. This is clearly shown in the distribution of the inundated area: in the south of Bali, including the islands of Nusa Penida, Nusa Lembongan and Nusa Ceningan, it is wider than on the northern coast of Bali, although the northern region of Nusa Penida Island is more inundated due to the coastal topography.

  15. Tsunami hazard map in eastern Bali

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Afif, Haunan; Cipta, Athanasius

    Bali is a popular tourist destination for both Indonesian and foreign visitors. However, Bali is located close to the collision zone between the Indo-Australian Plate and the Eurasian Plate in the south and to the back-arc thrust off its northern coast, making Bali prone to earthquakes and tsunamis. A tsunami hazard map is needed for a better understanding of the hazard level in a particular area, and tsunami modeling is one of the most reliable techniques for producing such a map. Tsunami modeling was conducted using TUNAMI N2, set up for two tsunami source scenarios: the subduction zone south of Bali and the back thrust north of Bali. The tsunami hazard zone is divided into 3 zones: the first is a high hazard zone with inundation heights of more than 3 m; the second is a moderate hazard zone with inundation heights of 1 to 3 m; and the third is a low hazard zone with inundation heights of less than 1 m. The two scenarios showed that the southern region has a greater potential tsunami impact than the northern areas. This is clearly shown in the distribution of the inundated area: in the south of Bali, including the islands of Nusa Penida, Nusa Lembongan and Nusa Ceningan, it is wider than on the northern coast of Bali, although the northern region of Nusa Penida Island is more inundated due to the coastal topography.

  16. Simulation of wind-driven dispersion of fire pollutants in a street canyon using FDS.

    PubMed

    Pesic, Dusica J; Blagojevic, Milan Dj; Zivkovic, Nenad V

    2014-01-01

    Air quality in urban areas attracts great attention due to increasing pollutant emissions and their negative effects on human health and the environment. Numerous studies, such as those by Mouilleau and Champassith (J Loss Prevent Proc 22(3): 316-323, 2009), Xie et al. (J Hydrodyn 21(1): 108-117, 2009), and Yassin (Environ Sci Pollut Res 20(6): 3975-3988, 2013), focus on air pollutant dispersion with no or only weak buoyancy effects. A few studies, such as those by Hu et al. (J Hazard Mater 166(1): 394-406, 2009; J Hazard Mater 192(3): 940-948, 2011; J Civ Eng Manag (2013)), focus on the fire-induced dispersion of pollutants with buoyant heat release rates in the range of 0.5 to 20 MW. However, the air pollution source may often be concentrated and intense, as a consequence of a hazardous-materials fire: fuel is regularly transported through urban areas because alternative supply routes are often unavailable, and this transport is accompanied by the risk of fire accidents. Accident prevention strategies require analysis of worst-case scenarios in which fire products jeopardize the exposed population and environment. The aim of this article is to analyze the impact of wind flow on air pollution and human vulnerability to fire products in a street canyon. A gasoline tanker-truck fire resulting from a multivehicle accident was simulated using the computational fluid dynamics large eddy simulation method. Numerical results show that, in the absence of wind, the fire products flow vertically upward without touching the walls of the buildings. However, when the wind velocity reaches a critical value, the products touch the walls of the buildings on both sides of the street canyon. The concentrations of carbon monoxide and soot decrease, whereas the carbon dioxide concentration increases, with increasing height above the street canyon ground level. The longitudinal concentration of the pollutants inside the street increases with the wind velocity at the roof level of the street canyon.

  17. LAV@HAZARD: a Web-GIS Framework for Real-Time Forecasting of Lava Flow Hazards

    NASA Astrophysics Data System (ADS)

    Del Negro, C.; Bilotta, G.; Cappello, A.; Ganci, G.; Herault, A.

    2014-12-01

    Crucial to lava flow hazard assessment is the development of tools for real-time prediction of flow paths, flow advance rates, and final flow lengths. Accurate prediction of flow paths and advance rates requires not only rapid assessment of eruption conditions (especially effusion rate) but also improved models of lava flow emplacement. Here we present the LAV@HAZARD web-GIS framework, which combines spaceborne remote sensing techniques and numerical simulations for real-time forecasting of lava flow hazards. By using satellite-derived discharge rates to drive a lava flow emplacement model, LAV@HAZARD allows timely definition of parameters and maps essential for hazard assessment, including the propagation time of lava flows and the maximum run-out distance. We take advantage of the flexibility of the HOTSAT thermal monitoring system to process satellite images coming from sensors with different spatial, temporal and spectral resolutions. HOTSAT was designed to ingest infrared satellite data acquired by the MODIS and SEVIRI sensors to output hot spot location, lava thermal flux and discharge rate. We use LAV@HAZARD to merge this output with the MAGFLOW physics-based model to simulate lava flow paths and to update, in a timely manner, flow simulations. Thus, any significant changes in lava discharge rate are included in the predictions. A significant benefit in terms of computational speed was obtained thanks to the parallel implementation of MAGFLOW on graphic processing units (GPUs). All this useful information has been gathered into the LAV@HAZARD platform which, due to the high degree of interactivity, allows generation of easily readable maps and a fast way to explore alternative scenarios. We will describe and demonstrate the operation of this framework using a variety of case studies pertaining to Mt Etna, Sicily. Although this study was conducted on Mt Etna, the approach used is designed to be applicable to other volcanic areas around the world.

  18. Health risk assessment of cadmium pollution emergency for urban populations in Foshan City, China.

    PubMed

    Dou, Ming; Zhao, Peipei; Wang, Yanyan; Li, Guiqiu

    2017-03-01

    With rapid socioeconomic development, water pollution emergencies have become increasingly common and can potentially harm the environment and human health, especially heavy metal pollution. In this paper, we investigate the Cd pollution emergency that occurred in the Pearl River network, China, in 2005, and we build a migration and transformation model for heavy metals to simulate the spatiotemporal distribution of Cd concentrations under various scenarios of a Cd pollution emergency in Foshan City. Moreover, the human health hazard and carcinogenic risk for local residents of Foshan City were evaluated. The primary conclusions were as follows: (1) the number of carcinogen-affected people per year under scenario 1 reached 254.41 when the frequency was 0.1 year/time; specifically, the number of people with cancer per year in the area of the Datang, Lubao, and Nanbian waterworks was 189.36, accounting for 74% of the total number per year; (2) at a frequency of 5 years/time, the Lubao waterworks is the only one in the extremely high or high risk grade, whereas at a frequency of 0.1 year/time the Datang, Nanbian, Xinan, Shitang, and Jianlibao waterworks also fall into the extremely high or high grade; (3) when a Cd pollution accident of the same level occurs again, the Cd concentration in the water decreases to a low level only after a migration distance of at least 40-50 km. Based on the health risk assessment of Cd pollution, this study recommends that, in the tidal river network of the Pearl River Delta, a distance of more than 50 km be kept between factories with a potential for heavy metal pollution and drinking water sources. Only then can the public be protected from the hazardous effects of elevated heavy metal levels.
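
    Health risk numbers of the kind reported above are conventionally derived from a chronic-daily-intake calculation multiplied by a contaminant slope factor (US EPA-style). The sketch below shows that generic calculation; all parameter values, including the slope factor, are illustrative defaults rather than the study's calibrated inputs.

        # Generic ingestion-route carcinogenic risk calculation (illustrative parameters).
        def lifetime_cancer_risk(c_mg_per_l, slope_factor,
                                 intake_l_per_day=2.0, exposure_days_per_year=350,
                                 exposure_years=30, body_weight_kg=70,
                                 averaging_days=70 * 365):
            cdi = (c_mg_per_l * intake_l_per_day * exposure_days_per_year * exposure_years) / (
                body_weight_kg * averaging_days)  # chronic daily intake, mg/(kg*day)
            return cdi * slope_factor             # dimensionless lifetime excess risk

        risk = lifetime_cancer_risk(c_mg_per_l=0.005, slope_factor=6.1)  # slope factor is illustrative
        print(risk, risk * 1_000_000)  # expected excess cases in an exposed population of one million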

  19. Geological risk assessment for the rapid development area of the Erhai Basin

    NASA Astrophysics Data System (ADS)

    Yang, Liu; Wang, Zhanqi; Jin, Gui; Chen, Dongdong; Wang, Zhan

    Developing low-slope hilly land to gain new land space in a watershed makes it particularly important to coordinate the sharply increasing conflicts between mountainous and urban land utilization in the city. However, the development of low-slope hilly land can easily induce potential risks of geologic hazards such as landslides and slope failures. It may lead to further environmental losses in a watershed. Hence, it is necessary to study the potential risks of geo-hazards in low-slope hilly land development in urban areas. Using GIS spatial analysis techniques, we selected Dali City in the Erhai Basin, located in the watershed belt of the Jinsha, Lancang and Red Rivers in Yunnan Province, China, as the study area. By studying key indexes and parameters for monitoring potential geo-hazard risks, we established a composite index model for zoning areas with potential geo-hazard risks in low-slope hilly land development. Our findings indicate that the potential geo-hazard risk in eastern Dali City is relatively low, while that on the gentle hill slopes in the western area is relatively high. The generated zoning maps show geological information on potential geo-hazard risks on low-slope hilly land, providing important guidance for guarding against natural geo-hazards and potential environmental losses in the watershed.

  20. Disaster Risks Reduction for Extreme Natural Hazards

    NASA Astrophysics Data System (ADS)

    Plag, H.; Jules-Plag, S.

    2013-12-01

    Mega disasters associated with extreme natural hazards have the potential to escalate the global sustainability crisis and put us close to the boundaries of the safe operating space for humanity. Floods and droughts are major threats that could potentially reach planetary extent, particularly through secondary economic and social impacts. Earthquakes and tsunamis frequently cause disasters that eventually could exceed the immediate coping capacity of the global economy, particularly since we have built mega cities in hazardous areas that are now ready to be harvested by natural hazards. Unfortunately, the more we learn to cope with the relatively frequent hazards (50- to 100-year events), the less we worry about the low-probability, high-impact events (those recurring every few hundred years or more). As a consequence, threats from the 500-year flood, drought, or volcano eruption are not appropriately accounted for in disaster risk reduction (DRR) discussions. Extreme geohazards have occurred regularly throughout the past, but mostly did not cause major disasters because the exposure of human assets to hazards was much lower in the past. The most extreme events that occurred during the last 2,000 years would today cause unparalleled damage on a global scale and could worsen the sustainability crisis. Simulation of these extreme hazards under present conditions can help to assess the disaster risk. Recent extreme earthquakes have illustrated the destruction they can inflict, both directly and indirectly through tsunamis. Large volcano eruptions have the potential to impact climate, anthropogenic infrastructure and resource supplies on a global scale. During the last 2,000 years several large volcano eruptions occurred which, under today's conditions, would be associated with extreme disaster risk. The comparison of earthquakes and volcano eruptions indicates that large volcano eruptions are the low-probability geohazards with potentially the highest impact on our civilization. Integration of these low-probability, high-impact events into DRR requires an approach focused on resilience and antifragility, as well as the ability to cope with, and recover from, failure of infrastructure and social systems. Resilience does not primarily result from the robustness of infrastructure but is mainly a function of social capital. While it is important to understand the hazards (the contribution of the geosciences), it is equally important to understand the processes that let us cope with the hazards, or lead to failure (the contribution of the social sciences and engineering). For the latter, we need a joint effort of the social sciences and engineering and a revised science-policy relationship. Democratizing knowledge about extreme geohazards is very important in order to inform deliberations on DRR through increased resilience and reduced fragility. The current science-society dialog is not fully capable of supporting deliberative governance. Most scientific knowledge is created independently of those who could put it to use, and a transition to co-design and co-development of knowledge involving a broad stakeholder base is necessary for DRR, particularly for extreme events. This transition may have the consequence of more responsibility and even liability for science.
