Sample records for hazard mitigation studies

  1. Study proposes wholesale change in thinking about natural hazards mitigation

    Microsoft Academic Search

    Randy Showstack

    1999-01-01

    The "lollapaloozas," the major natural catastrophes, are getting bigger and bigger, and it is time to confront this growing problem by dramatically changing the way that society approaches natural hazard mitigation, conducts itself in relation to the natural environment, and accepts responsibility for activities that could lead to or increase disasters, according to Dennis Mileti, principal investigator of a new

  2. Mitigating lightning hazards

    SciTech Connect

    Hasbrouck, R.

    1996-05-01

    A new draft document provides guidance for assessing and mitigating the effects of lightning hazards on a Department of Energy (or any other) facility. Written by two Lawrence Livermore Engineers, the document combines lightning hazard identification and facility categorization with a new concept, the Lightning Safety System, to help dispel the confusion and mystery surrounding lightning and its effects. The guidance is of particular interest to DOE facilities storing and handling nuclear and high-explosive materials. The concepts presented in the document were used to evaluate the lightning protection systems of the Device Assembly Facility at the Nevada Test Site.

  3. Collective action for community-based hazard mitigation: a case study of Tulsa project impact 

    E-print Network

    Lee, Hee Min

    2005-11-01

    During the past two decades, community-based hazard mitigation (CBHM) has been newly proposed and implemented as an alternative conceptual model for emergency management to deal with disasters comprehensively in order to curtail skyrocketing...

  4. Contributions of Nimbus 7 TOMS Data to Volcanic Study and Hazard Mitigation

    NASA Technical Reports Server (NTRS)

    Krueger, Arlin J.; Bluth, G. J. S.; Schaefer, S. A.

    1998-01-01

    Nimbus TOMS data have led to advancements among many volcano-related scientific disciplines, from the initial ability to quantify SO2 clouds leading to derivations of eruptive S budgets and fluxes, to tracking of individual clouds, assessing global volcanism and atmospheric impacts. Some of the major aspects of TOMS-related research, listed below, will be reviewed and updated: (1) Measurement of volcanic SO2 clouds: Nimbus TOMS observed over 100 individual SO2 clouds during its mission lifetime; large explosive eruptions are now routinely and reliably measured by satellite. (2) Eruption processes: quantification of SO2 emissions have allowed assessments of eruption sulfur budgets, the evaluation of "excess" sulfur, and inferences of H2S emissions. (3) Detection of ash: TOMS data are now used to detect volcanic particulates in the atmosphere, providing complementary analyses to infrared methods of detection. Paired TOMS and AVHRR studies have provided invaluable information on volcanic cloud compositions and processes. (4) Cloud tracking and hazard mitigation: volcanic clouds can be considered gigantic tracers in the atmosphere, and studies of the fates of these clouds have led to new knowledge of their physical and chemical dispersion in the atmosphere for predictive models. (5) Global trends: the long term data set has provided researchers an unparalleled record of explosive volcanism, and forms a key component in assessing annual to decadal trends in global S emissions. (6) Atmospheric impacts: TOMS data have been linked to independent records of atmospheric change, in order to compare cause and effect processes following a massive injection of SO2 into the atmosphere. (7) Future TOMS instruments and applications: Nimbus TOMS has given way to new satellite platforms, with several wavelength and resolution modifications. New efforts to launch a geostationary TOMS could provide unprecedented observations of volcanic activity.

  5. Landslide hazard mitigation in North America

    USGS Publications Warehouse

    Wieczorek, G.F.; Leahy, P.P.

    2008-01-01

    Active landslides throughout the states and territories of the United States result in extensive property loss and 25-50 deaths per year. The U.S. Geological Survey (USGS) has a long history of detailed examination of landslides since the work of Howe (1909) in the San Juan Mountains of Colorado. In the last four decades, landslide inventory maps and landslide hazard maps have depicted landslides of different ages, identified fresh landslide scarps, and indicated the direction of landslide movement for different regions of the states of Colorado, California, and Pennsylvania. Probability-based methods improve landslide hazards assessments. Rainstorms, earthquakes, wildfires, and volcanic eruptions can trigger landslides. Improvements in remote sensing of rainfall make it possible to issue landslide advisories and warnings for vulnerable areas. From 1986 to 1995, the USGS issued hazard warnings based on rainfall in the San Francisco Bay area. USGS workers also identified rainfall thresholds triggering landslides in Puerto Rico, Hawaii, Washington, and the Blue Ridge Mountains of central Virginia. Detailed onsite monitoring of landslides near highways in California and Colorado aided transportation officials. The USGS developed a comprehensive, multi-sector, and multi-agency strategy to mitigate landslide hazards nationwide. This study formed the foundation of the National Landslide Hazards Mitigation Strategy. The USGS, in partnership with the U.S. National Weather Service and the State of California, began to develop a real-time warning system for landslides from wildfires in Southern California as a pilot study in 2005.
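
    The rainfall-based advisories and warnings described above rest on empirical intensity-duration thresholds. As a hedged illustration of the idea (using Caine's 1980 global threshold rather than any of the USGS's operational, region-specific thresholds, and with made-up storm values), the sketch below flags a storm whose mean intensity plots above the power-law curve.

    ```python
    # Illustrative sketch (not the USGS operational thresholds): checking rainfall
    # against an empirical intensity-duration threshold of the kind used for
    # landslide advisories. Caine's (1980) global curve I = 14.82 * D**-0.39
    # (I in mm/h, D in hours) is used here purely as an example.

    def exceeds_threshold(intensity_mm_per_h: float, duration_h: float,
                          a: float = 14.82, b: float = -0.39) -> bool:
        """Return True if a storm of given mean intensity and duration plots
        above the power-law threshold I = a * D**b."""
        return intensity_mm_per_h > a * duration_h ** b

    # Example: 24 h of rain at a mean intensity of 5 mm/h
    print(exceeds_threshold(5.0, 24.0))  # True -> advisory-level rainfall in this toy case
    ```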

  6. Florida households' expected responses to hurricane hazard mitigation incentives.

    PubMed

    Ge, Yue; Peacock, Walter Gillis; Lindell, Michael K

    2011-10-01

    This study tested a series of models predicting household expectations of participating in hurricane hazard mitigation incentive programs. Data from 599 households in Florida revealed that mitigation incentive adoption expectations were most strongly and consistently related to hazard intrusiveness and risk perception and, to a lesser extent, worry. Demographic and hazard exposure had indirect effects on mitigation incentive adoption expectations that were mediated by the psychological variables. The results also revealed differences in the factors affecting mitigation incentive adoption expectations for each of five specific incentive programs. Overall, the results suggest that hazard managers are more likely to increase participation in mitigation incentive programs if they provide messages that repeatedly (thus increasing hazard intrusiveness) remind people of the likelihood of severe negative consequences of hurricane impact (thus increasing risk perception). PMID:21449959

  7. Examining Local Jurisdictions' Capacity and Commitment For Hazard Mitigation Policies and Strategies along the Texas Coast 

    E-print Network

    Husein, Rahmawati

    2012-07-16

    There have been studies on the role of land use planning and development regulations on hazard mitigation and the importance of including these in effective mitigation planning initiatives. However, little empirical research has examined how...

  8. WHC natural phenomena hazards mitigation implementation plan

    SciTech Connect

    Conrads, T.J.

    1996-09-11

    Natural phenomena hazards (NPH) are unexpected acts of nature which pose a threat or danger to workers, the public or the environment. Earthquakes, extreme winds (hurricane and tornado), snow, flooding, volcanic ashfall, and lightning strikes are examples of NPH at Hanford. It is the policy of the U.S. Department of Energy (DOE) to design, construct and operate DOE facilities so that workers, the public and the environment are protected from NPH and other hazards. During 1993 the DOE Richland Operations Office (RL) transmitted DOE Order 5480.28, "Natural Phenomena Hazards Mitigation," to Westinghouse Hanford Company (WHC) for compliance. The Order includes rigorous new NPH criteria for the design of new DOE facilities as well as for the evaluation and upgrade of existing DOE facilities. In 1995 DOE issued Order 420.1, "Facility Safety," which contains the same NPH requirements and invokes the same applicable standards as Order 5480.28. It will supersede Order 5480.28 when an in-force date for Order 420.1 is established through contract revision. Activities will be planned and accomplished in four phases: Mobilization; Prioritization; Evaluation; and Upgrade. The basis for the graded approach is the designation of facilities/structures into one of five performance categories based upon safety function, mission and cost. This Implementation Plan develops the program for the Prioritization Phase, as well as an overall strategy for the implementation of DOE Order 5480.28.

  9. Volcanic hazards and their mitigation: Progress and problems

    NASA Astrophysics Data System (ADS)

    Tilling, Robert I.

    1989-05-01

    At the beginning of the twentieth century, volcanology began to emerge as a modern science as a result of increased interest in eruptive phenomena following some of the worst volcanic disasters in recorded history: Krakatau (Indonesia) in 1883 and Mont Pelée (Martinique), Soufrière (St. Vincent), and Santa María (Guatemala) in 1902. Volcanology is again experiencing a period of heightened public awareness and scientific growth in the 1980s, the worst period since 1902 in terms of volcanic disasters and crises. A review of hazards mitigation approaches and techniques indicates that significant advances have been made in hazards assessment, volcano monitoring, and eruption forecasting. For example, the remarkable accuracy of the predictions of dome-building events at Mount St. Helens since June 1980 is unprecedented. Yet a predictive capability for more voluminous and explosive eruptions still has not been achieved. Studies of magma-induced seismicity and ground deformation continue to provide the most systematic and reliable data for early detection of precursors to eruptions and shallow intrusions. In addition, some other geophysical monitoring techniques and geochemical methods have been refined and are being more widely applied and tested. Comparison of the four major volcanic disasters of the 1980s (Mount St. Helens, U.S.A. (1980); El Chichón, Mexico (1982); Galunggung, Indonesia (1982); and Nevado del Ruíz, Colombia (1985)) illustrates the importance of predisaster geoscience studies, volcanic hazards assessments, volcano monitoring, contingency planning, and effective communications between scientists and authorities. The death toll (>22,000) from the Ruíz catastrophe probably could have been greatly reduced; the reasons for the tragically ineffective implementation of evacuation measures are still unclear and puzzling in view of the fact that sufficient warnings were given. The most pressing problem in the mitigation of volcanic and associated hazards on a global scale is that most of the world's dangerous volcanoes are in densely populated countries that lack the economic and scientific resources or the political will to adequately study and monitor them. This problem afflicts both developed and developing countries, but it is especially acute for the latter. The greatest advances in volcanic hazards mitigation in the near future are most likely to be achieved by wider application of existing technology to poorly understood and studied volcanoes, rather than by refinements or new discoveries in technology alone.

  10. Rockslide susceptibility and hazard assessment for mitigation works design along vertical rocky cliffs: workflow proposal based on a real case-study conducted in Sacco (Campania), Italy

    NASA Astrophysics Data System (ADS)

    Pignalosa, Antonio; Di Crescenzo, Giuseppe; Marino, Ermanno; Terracciano, Rosario; Santo, Antonio

    2015-04-01

    The work presented here concerns a case study in which a complete multidisciplinary workflow has been applied for an extensive assessment of rockslide susceptibility and hazard in a common scenario such as vertical and fractured rocky cliffs. The studied area is located in a high-relief zone in Southern Italy (Sacco, Salerno, Campania), characterized by wide vertical rocky cliffs formed by tectonized thick successions of shallow-water limestones. The study comprised the following phases: a) topographic surveying integrating 3D laser scanning, photogrammetry and GNSS; b) geological surveying, characterization of single instabilities and geomechanical surveying, conducted by rock-climbing geologists; c) processing of 3D data and reconstruction of high-resolution geometrical models; d) structural and geomechanical analyses; e) data filing in a GIS-based spatial database; f) geo-statistical and spatial analyses and mapping of the whole set of data; g) 3D rockfall analysis. The main goals of the study have been a) to set up an investigation method to achieve a complete and thorough characterization of the slope stability conditions and b) to provide a detailed basis for an accurate definition of the reinforcement and mitigation systems. For these purposes the most up-to-date methods of field surveying, remote sensing, 3D modelling and geospatial data analysis have been integrated in a systematic workflow, accounting for the economic sustainability of the whole project. A novel integrated approach has been applied, fusing deterministic and statistical surveying methods. This approach made it possible to deal with the wide extent of the studied area (nearly 200,000 m2) without compromising the high accuracy of the results. The deterministic phase, based on a field characterization of single instabilities and their further analysis on 3D models, has been applied for delineating the peculiarities of each single feature. The statistical approach, based on geostructural field mapping and on point geomechanical data from scan-line surveying, allowed partitioning of the rock mass into homogeneous geomechanical sectors and data interpolation through bounded geostatistical analyses on 3D models. All data resulting from both approaches have been referenced and filed in a single spatial database and considered in global geo-statistical analyses for deriving a fully modelled and comprehensive evaluation of the rockslide susceptibility. The described workflow yielded the following innovative results: a) a detailed census of single potential instabilities, through a spatial database recording the geometrical, geological and mechanical features, along with the expected failure modes; b) a high-resolution characterization of the whole slope's rockslide susceptibility, based on the partitioning of the area according to the stability and mechanical conditions, which can be directly related to specific hazard mitigation systems; c) the exact extent of the area exposed to rockslide hazard, along with the dynamic parameters of the expected phenomena; d) an intervention design for hazard mitigation.
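
    The deterministic characterization of single instabilities described above involves screening each mapped discontinuity for kinematically feasible failure modes. The sketch below is a hedged, generic illustration of such a screen (a Markland-type planar-sliding test with assumed friction angle and angular tolerance), not the authors' actual workflow or software.

    ```python
    # Hedged illustration (not the authors' workflow): a Markland-type kinematic
    # screening for planar sliding of a single discontinuity on a rock face.
    # Conditions checked: the joint daylights on the face (dips less steeply than
    # the face), dips more steeply than the friction angle, and its dip direction
    # lies within a tolerance of the face dip direction.

    def planar_sliding_feasible(joint_dip, joint_dipdir,
                                face_dip, face_dipdir,
                                friction_angle=30.0, tolerance=20.0):
        daylighting = joint_dip < face_dip
        steep_enough = joint_dip > friction_angle
        # smallest angular difference between the two dip directions
        delta = abs((joint_dipdir - face_dipdir + 180.0) % 360.0 - 180.0)
        aligned = delta <= tolerance
        return daylighting and steep_enough and aligned

    print(planar_sliding_feasible(55, 130, 80, 125))  # True: potentially unstable
    print(planar_sliding_feasible(25, 130, 80, 125))  # False: dip below friction angle
    ```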

  11. Space options for tropical cyclone hazard mitigation

    NASA Astrophysics Data System (ADS)

    Dicaire, Isabelle; Nakamura, Ryoko; Arikawa, Yoshihisa; Okada, Kazuyuki; Itahashi, Takamasa; Summerer, Leopold

    2015-02-01

    This paper investigates potential space options for mitigating the impact of tropical cyclones on cities and civilians. Ground-based techniques combined with space-based remote sensing instrumentation are presented together with space-borne concepts employing space solar power technology. Two space-borne mitigation options are considered: atmospheric warming based on microwave irradiation and laser-induced cloud seeding based on laser power transfer. Finally technology roadmaps dedicated to the space-borne options are presented, including a detailed discussion on the technological viability and technology readiness level of our proposed systems. Based on these assessments, the space-borne cyclone mitigation options presented in this paper may be established in a quarter of a century.

  12. Mitigation of earthquake hazards using seismic base isolation systems

    SciTech Connect

    Wang, C.Y.

    1994-06-01

    This paper deals with mitigation of earthquake hazards using seismic base-isolation systems. A numerical algorithm is described for system response analysis of isolated structures with laminated elastomer bearings. The focus of this paper is on the adaptation of a nonlinear constitutive equation for the isolation bearing, and the treatment of foundation embedment for the soil-structure-interaction analysis. Sample problems are presented to illustrate the mitigating effect of using base-isolation systems.
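
    As a hedged sketch of the kind of analysis summarized above (not the paper's algorithm or constitutive model), the following integrates a single mass supported on a bilinear hysteretic isolation bearing under a toy ground motion; all parameter values are assumed for illustration only.

    ```python
    import numpy as np

    # Hedged sketch (not the paper's algorithm): response of a rigid mass on a
    # bilinear hysteretic isolation bearing under ground acceleration, integrated
    # with a simple semi-implicit Euler scheme. All values are illustrative.

    m = 2.0e5        # supported mass [kg]
    c = 2.0e4        # viscous damping [N*s/m]
    k1 = 2.0e7       # initial (elastic) bearing stiffness [N/m]
    k2 = 2.0e6       # post-yield stiffness [N/m]
    Fy = 1.0e5       # yield force [N]
    Q = Fy * (1.0 - k2 / k1)   # characteristic strength of the bilinear loop

    dt = 0.001
    t = np.arange(0.0, 20.0, dt)
    ag = 0.3 * 9.81 * np.sin(2 * np.pi * 1.0 * t)   # toy 1 Hz ground motion

    u = v = F = 0.0
    u_hist = np.zeros_like(t)
    for i, a_g in enumerate(ag):
        a = (-F - c * v - m * a_g) / m          # equation of motion of the isolated mass
        v += a * dt
        u_new = u + v * dt
        F_trial = F + k1 * (u_new - u)          # elastic predictor
        F = min(max(F_trial, k2 * u_new - Q), k2 * u_new + Q)  # bilinear corrector
        u = u_new
        u_hist[i] = u

    print(f"peak bearing displacement: {np.max(np.abs(u_hist)):.3f} m")
    ```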

  13. Estimating environmental benefits of natural hazard mitigation with data transfer: results from a benefit-cost analysis of Federal Emergency Management Agency hazard mitigation grants

    Microsoft Academic Search

    John C. Whitehead; Adam Z. Rose

    2009-01-01

    This paper summarizes methods, data and results associated with the first major attempt to evaluate the environmental benefits of U.S. Federal Emergency Management Agency natural hazards mitigation grants. The study relied heavily on the refinement of benefit transfer methods. Categories of benefits include water quality for recreational and commercial fishing, drinking water, outdoor recreation, hazardous waste, wetlands and aesthetic, health

  14. Seismic hazard status and mitigation actions

    E-print Network

    Paris-Sud XI, Université de

    Fragmentary slide text; recoverable points: a classification into 3 groups (G1: earthquake resistant, possibly with some minor work); France is divided into 5 zones for building codes; the highest French level is attributed ...; numerical simulation of the earthquake-induced damage in a specific territory; regional and local hazard.

  15. Application of weather prediction models for hazard mitigation planning: a case study of heavy off-season rains in Senegal

    Microsoft Academic Search

    Souleymane Fall; Dev Niyogi; U. C. Mohanty; Anil Kumar

    2007-01-01

    Heavy off-season rains in the tropics pose significant natural hazards largely because they are unexpected and the population and infrastructure are ill-prepared. One such event was observed from January 9 to 11, 2002 in Senegal (14.00°N, 14.00°W), West Africa. This tropical country is characterized by a long dry season from November to April or May. During this period, although the

  16. Mitigation options for accidental releases of hazardous gases

    SciTech Connect

    Fthenakis, V.M.

    1995-05-01

    The objective of this paper is to review and compare technologies available for mitigation of unconfined releases of toxic and flammable gases. These technologies include: secondary confinement, deinventory, vapor barriers, foam spraying, and water sprays/monitors. Guidelines for the design and/or operation of effective post-release mitigation systems and case studies involving actual industrial mitigation systems are also presented.

  17. Mitigation of the most hazardous tank at the Hanford Site

    SciTech Connect

    Reynolds, D.A.

    1994-09-01

    Various tanks at the Hanford Site have been declared to be unresolved safety problems. This means that the tank has the potential to be beyond the limits covered by the current safety documentation. Tank 241-SY-101 poses the greatest hazard. The waste stored in this tank has periodically released hydrogen gas which exceeds the lower flammable limits. A mixer pump was installed in this tank to stir the waste. Stirring the waste would allow the hydrogen to be released slowly in a controlled manner and mitigate the hazard associated with this tank. The testing of this mixer pump is reported in this document. The mixer pump has been successful in controlling the hydrogen concentration in the tank dome to below the flammable limit which has mitigated the hazardous gas releases.

  18. Tracking volcanic sulfur dioxide clouds for aviation hazard mitigation

    Microsoft Academic Search

    Simon A. Carn; Arlin J. Krueger; Nickolay A. Krotkov; Kai Yang; Keith Evans

    2009-01-01

    Satellite measurements of volcanic sulfur dioxide (SO2) emissions can provide critical information for aviation hazard mitigation, particularly when ash detection techniques fail. Recent developments in space-based SO2 monitoring are discussed, focusing on daily, global ultraviolet (UV) measurements by the Ozone Monitoring Instrument (OMI) on NASA’s Aura satellite. OMI’s high sensitivity to SO2 permits long-range tracking of volcanic clouds in the

  19. New Approaches to Tsunami Hazard Mitigation Demonstrated in Oregon

    NASA Astrophysics Data System (ADS)

    Priest, G. R.; Rizzo, A.; Madin, I.; Lyles Smith, R.; Stimely, L.

    2012-12-01

    Oregon Department of Geology and Mineral Industries and Oregon Emergency Management collaborated over the last four years to increase tsunami preparedness for residents and visitors to the Oregon coast. Utilizing support from the National Tsunami Hazards Mitigation Program (NTHMP), new approaches to outreach and tsunami hazard assessment were developed and then applied. Hazard assessment was approached by first doing two pilot studies aimed at calibrating theoretical models to direct observations of tsunami inundation gleaned from the historical and prehistoric (paleoseismic/paleotsunami) data. The results of these studies were then submitted to peer-reviewed journals and translated into 1:10,000-12,000-scale inundation maps. The inundation maps utilize a powerful new tsunami model, SELFE, developed by Joseph Zhang at the Oregon Health & Science University. SELFE uses unstructured computational grids and parallel processing technique to achieve fast accurate simulation of tsunami interactions with fine-scale coastal morphology. The inundation maps were simplified into tsunami evacuation zones accessed as map brochures and an interactive mapping portal at http://www.oregongeology.org/tsuclearinghouse/. Unique in the world are new evacuation maps that show separate evacuation zones for distant versus locally generated tsunamis. The brochure maps explain that evacuation time is four hours or more for distant tsunamis but 15-20 minutes for local tsunamis that are invariably accompanied by strong ground shaking. Since distant tsunamis occur much more frequently than local tsunamis, the two-zone maps avoid needless over evacuation (and expense) caused by one-zone maps. Inundation mapping for the entire Oregon coast will be complete by ~2014. Educational outreach was accomplished first by doing a pilot study to measure effectiveness of various approaches using before and after polling and then applying the most effective methods. In descending order, the most effective methods were: (1) door-to-door (person-to-person) education, (2) evacuation drills, (3) outreach to K-12 schools, (4) media events, and (5) workshops targeted to key audiences (lodging facilities, teachers, and local officials). Community organizers were hired to apply these five methods to clusters of small communities, measuring performance by before and after polling. Organizers were encouraged to approach the top priority, person-to-person education, by developing Community Emergency Response Teams (CERT) or CERT-like organizations in each community, thereby leaving behind a functioning volunteer-based group that will continue the outreach program and build long term resiliency. One of the most effective person-to-person educational tools was the Map Your Neighborhood program that brings people together so they can sketch the basic layout of their neighborhoods to depict key earthquake and tsunami hazards and mitigation solutions. The various person-to-person volunteer efforts and supporting outreach activities are knitting communities together and creating a permanent culture of tsunami and earthquake preparedness. All major Oregon coastal population centers will have been covered by this intensive outreach program by ~2014.

  20. Composite Materials for Hazard Mitigation of Reactive Metal Hydrides.

    SciTech Connect

    Pratt, Joseph William; Cordaro, Joseph Gabriel; Sartor, George B.; Dedrick, Daniel E.; Reeder, Craig L.

    2012-02-01

    In an attempt to mitigate the hazards associated with storing large quantities of reactive metal hydrides, polymer composite materials were synthesized and tested under simulated usage and accident conditions. The composites were made by polymerizing vinyl monomers using free-radical polymerization chemistry, in the presence of the metal hydride. Composites with vinyl-containing siloxane oligomers were also polymerized with and without added styrene and divinyl benzene. Hydrogen capacity measurements revealed that addition of the polymer to the metal hydride reduced the inherent hydrogen storage capacity of the material. The composites were found to be initially effective at reducing the amount of heat released during oxidation. However, upon cycling the composites, the mitigating behavior was lost. While the polymer composites we investigated have mitigating potential and are physically robust, they undergo a chemical change upon cycling that makes them subsequently ineffective at mitigating heat release upon oxidation of the metal hydride. Acknowledgements The authors would like to thank the following people who participated in this project: Ned Stetson (U.S. Department of Energy) for sponsorship and support of the project. Ken Stewart (Sandia) for building the flow-through calorimeter and cycling test stations. Isidro Ruvalcaba, Jr. (Sandia) for qualitative experiments on the interaction of sodium alanate with water. Terry Johnson (Sandia) for sharing his expertise and knowledge of metal hydrides, and sodium alanate in particular. Marcina Moreno (Sandia) for programmatic assistance. John Khalil (United Technologies Research Corp) for insight into the hazards of reactive metal hydrides and real-world accident scenario experiments. Summary In an attempt to mitigate and/or manage hazards associated with storing bulk quantities of reactive metal hydrides, polymer composite materials (a mixture of a mitigating polymer and a metal hydride) were synthesized and tested under simulated usage and accident conditions. Mitigating the hazards associated with reactive metal hydrides during an accident while finding a way to keep the original capability of the active material intact during normal use has been the focus of this work. These composites were made by polymerizing vinyl monomers using free-radical polymerization chemistry, in the presence of the metal hydride, in this case a prepared sodium alanate (chosen as a representative reactive metal hydride). It was found that the polymerization of styrene and divinyl benzene could be initiated using AIBN in toluene at 70 degC. The resulting composite materials can be either hard or brittle solids depending on the cross-linking density. Thermal decomposition of these styrene-based composite materials is lower than neat polystyrene indicating that the chemical nature of the polymer is affected by the formation of the composite. The char-forming nature of cross-linked polystyrene is low and therefore, not an ideal polymer for hazard mitigation. To obtain composite materials containing a polymer with higher char-forming potential, siloxane-based monomers were investigated. Four vinyl-containing siloxane oligomers were polymerized with and without added styrene and divinyl benzene. Like the styrene materials, these composite materials exhibited thermal decomposition behavior significantly different than the neat polymers. 
Specifically, the thermal decomposition temperature was shifted approximately 100 degC lower than the neat polymer signifying a major chemical change to the polymer network. Thermal analysis of the cycled samples was performed on the siloxane-based composite materials. It was found that after 30 cycles the siloxane-containing polymer composite material has similar TGA/DSC-MS traces as the virgin composite material indicating that the polymer is physically intact upon cycling. Hydrogen capacity measurements revealed that addition of the polymer to the metal hydride in the form of a composite material reduced the inherent hydrogen storage capacity of the material. This

  1. Meteorological Hazard Assessment and Risk Mitigation in Rwanda.

    NASA Astrophysics Data System (ADS)

    Nduwayezu, Emmanuel; Jaboyedoff, Michel; Bugnon, Pierre-Charles; Nsengiyumva, Jean-Baptiste; Horton, Pascal; Derron, Marc-Henri

    2015-04-01

    Between 10 and 13 April 2012, heavy rains hit sectors adjacent to the Volcanoes National Park (Musanze District in the Northern Province and Nyabihu and Rubavu Districts in the Western Province of Rwanda), causing floods that affected about 11,000 persons. Flooding caused deaths and injuries among the affected population, and extensive damage to houses and properties. 348 houses were destroyed and 446 were partially damaged or remained underwater for several days. Families were forced to leave their flooded homes and seek temporary accommodation with their neighbors, often in overcrowded places. Along the northwestern border of Rwanda, the Virunga mountain range consists of 6 major volcanoes. Mount Karisimbi is the highest volcano at 4507 m. The oldest mountain is Mount Sabyinyo, which rises to 3634 m. The hydraulic network in Musanze District is formed by temporary torrents and permanent watercourses. Torrents surge during strong storms, and are provoked by water coming downhill from the volcanoes, some 20 km away. This area is periodically affected by flooding and landslides because of heavy rain striking the Volcanoes National Park (Rwanda generally has 2 rainy seasons, from February to April and from September to November each year, and 2 dry seasons). Rainwater creates large channels (in already known torrents or new ones) that impact communities, agricultural soils and crop yields. This project aims at identifying hazardous and risky areas by producing susceptibility maps for floods, debris flows and landslides over this sector. Susceptibility maps are being drawn using field observations, during and after the 2012 events, and an empirical model of propagation for regional susceptibility assessments of debris flows (Flow-R). Input data are 10 m and 30 m resolution DEMs, satellite images, the hydrographic network, and some information on the geological substratum and soil occupation. Combining the susceptibility maps with infrastructure, housing and population density maps will be used to identify the areas most at risk. Finally, based on practical experience in this kind of field work and on the documents produced, some recommendations for low-cost mitigation measures will be proposed. Reference: MIDIMAR, Impacts of floods and landslides on socio-economic development profile. Case study: Musanze District. Kigali, June 2012.
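
    Flow-R itself combines spreading algorithms with runout limits; purely as an illustration of the energy-line idea behind such regional propagation models (not the Flow-R algorithm, and with a hypothetical elevation profile and travel angle), the sketch below stops propagation along a downslope path once the line-of-sight angle from the source falls below a chosen angle of reach.

    ```python
    import numpy as np

    # Minimal illustration (not the Flow-R algorithm itself): limiting debris-flow
    # runout along a pre-extracted downslope path by an angle-of-reach criterion,
    # i.e. propagation stops where the line-of-sight angle from the source cell
    # drops below a chosen travel angle (11 degrees is just an example value).

    def runout_extent(path_elev, cell_size=10.0, travel_angle_deg=11.0):
        """path_elev: elevations along a downslope path, source cell first.
        Returns the index of the last cell still reachable."""
        tan_limit = np.tan(np.radians(travel_angle_deg))
        z0 = path_elev[0]
        last = 0
        for i in range(1, len(path_elev)):
            distance = i * cell_size
            drop = z0 - path_elev[i]
            if drop / distance >= tan_limit:   # still above the energy line
                last = i
            else:
                break
        return last

    profile = np.array([2600.0, 2596.0, 2593.0, 2591.0, 2590.0, 2589.5, 2589.2, 2589.0])
    print(runout_extent(profile))  # index of the farthest reachable cell (5 here)
    ```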

  2. ANALYSIS AND MITIGATION OF X-RAY HAZARD GENERATED FROM HIGH INTENSITY LASER-TARGET INTERACTIONS

    SciTech Connect

    Qiu, R.; Liu, J.C.; Prinz, A.A.; Rokni, S.H.; Woods, M.; Xia, Z.

    2011-03-21

    Interaction of a high intensity laser with matter may generate an ionizing radiation hazard. Very limited studies have been made, however, on the laser-induced radiation protection issue. This work reviews available literature on the physics and characteristics of laser-induced X-ray hazards. Important aspects include the laser-to-electron energy conversion efficiency, electron angular distribution, electron energy spectrum and effective temperature, and bremsstrahlung production of X-rays in the target. The possible X-ray dose rates for several femtosecond Ti:sapphire laser systems used at SLAC, including the short pulse laser system for the Matter in Extreme Conditions Instrument (peak power 4 TW and peak intensity 2.4 × 10^18 W/cm^2), were analysed. A graded approach to mitigate the laser-induced X-ray hazard with a combination of engineered and administrative controls is also proposed.
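
    The dose analysis summarized above hinges on the effective hot-electron temperature driving bremsstrahlung production. As a hedged illustration (one commonly used scaling, not necessarily the one applied in this work), the ponderomotive (Wilks) scaling can be evaluated for the quoted MEC peak intensity, assuming a Ti:sapphire wavelength of about 0.8 µm.

    ```python
    import math

    # Hedged illustration (not necessarily the method used in the report): the
    # ponderomotive (Wilks) scaling is one common estimate of the effective
    # hot-electron temperature in high-intensity laser-target interactions:
    # T_hot ~ 0.511 MeV * (sqrt(1 + I*lambda_um^2 / 1.37e18 W/cm^2) - 1)

    def hot_electron_temperature_mev(intensity_w_cm2, wavelength_um=0.8):
        a0_term = intensity_w_cm2 * wavelength_um**2 / 1.37e18
        return 0.511 * (math.sqrt(1.0 + a0_term) - 1.0)

    # Peak intensity quoted above for the MEC short-pulse Ti:sapphire laser
    print(f"{hot_electron_temperature_mev(2.4e18):.2f} MeV")  # roughly 0.23 MeV
    ```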

  3. Standards and Guidelines for Numerical Models for Tsunami Hazard Mitigation

    NASA Astrophysics Data System (ADS)

    Titov, V.; Gonzalez, F.; Kanoglu, U.; Yalciner, A.; Synolakis, C. E.

    2006-12-01

    An increasing number of nations around the world need to develop tsunami mitigation plans which invariably involve inundation maps for warning guidance and evacuation planning. There is the risk that inundation maps may be produced with older or untested methodology, as there are currently no standards for modeling tools. In the aftermath of the 2004 megatsunami, some models were used to model inundation for Cascadia events with results much larger than sediment records and existing state-of-the-art studies suggest, leading to confusion among emergency managers. Incorrectly assessing tsunami impact is hazardous, as recent events in 2006 in Tonga, Kythira, Greece and Central Java have suggested (Synolakis and Bernard, 2006). To calculate tsunami currents, forces and runup on coastal structures, and inundation of coastlines, one must calculate the evolution of the tsunami wave from the deep ocean to its target site, numerically. No matter what the numerical model, validation (the process of ensuring that the model solves the parent equations of motion accurately) and verification (the process of ensuring that the model used represents geophysical reality appropriately) are both essential. Validation ensures that the model performs well in a wide range of circumstances and is accomplished through comparison with analytical solutions. Verification ensures that the computational code performs well over a range of geophysical problems. A few analytic solutions have been validated themselves with laboratory data. Even fewer existing numerical models have been both validated with the analytical solutions and verified with both laboratory measurements and field measurements, thus establishing a gold standard for numerical codes for inundation mapping. While there is in principle no absolute certainty that a numerical code that has performed well in all the benchmark tests will also produce correct inundation predictions with any given source motions, validated codes reduce the level of uncertainty in their results to the uncertainty in the geophysical initial conditions. Further, when coupled with real-time free-field tsunami measurements from tsunameters, validated codes are the only choice for realistic forecasting of inundation; the consequences of failure are too ghastly to take chances with numerical procedures that have not been validated. We discuss a ten-step process of benchmark tests for models used for inundation mapping. The associated methodology and algorithms have to first be validated with analytical solutions, then verified with laboratory measurements and field data. The models need to be published in the scientific literature in peer-reviewed journals indexed by ISI. While this process may appear onerous, it reflects our state of knowledge, and is the only defensible methodology when human lives are at stake. Synolakis, C.E., and Bernard, E.N., Tsunami science before and beyond Boxing Day 2004, Phil. Trans. R. Soc. A 364 (1845), 2231-2263, 2005.
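
    One widely used analytical benchmark of the kind referred to above is Synolakis' (1987) runup law for non-breaking solitary waves on a plane beach. The sketch below shows, with an assumed acceptance tolerance (the 5% figure is illustrative, not a NOAA standard), how a modelled runup value might be checked against that solution.

    ```python
    import math

    # Illustrative validation check of the kind described above (the tolerance is
    # an assumption, not a standard value): compare a model's computed runup of a
    # non-breaking solitary wave on a plane beach against Synolakis' (1987)
    # analytical runup law  R/d = 2.831 * sqrt(cot(beta)) * (H/d)**1.25.

    def analytical_runup(H_over_d, beach_slope_deg):
        cot_beta = 1.0 / math.tan(math.radians(beach_slope_deg))
        return 2.831 * math.sqrt(cot_beta) * H_over_d ** 1.25

    def passes_benchmark(modelled_runup_over_d, H_over_d, beach_slope_deg, tol=0.05):
        ref = analytical_runup(H_over_d, beach_slope_deg)
        return abs(modelled_runup_over_d - ref) / ref <= tol

    # canonical 1:19.85 beach (about 2.88 degrees), H/d = 0.019
    ref = analytical_runup(0.019, 2.88)
    print(round(ref, 4), passes_benchmark(0.085, 0.019, 2.88))
    ```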

  4. India's Initiative in Mitigating Tsunami and Storm Surge Hazard

    NASA Astrophysics Data System (ADS)

    Gupta, H.

    2008-12-01

    Soon after the occurrence of the most devastating tsunami caused by the 26 December 2004 Sumatra earthquake, India took the initiative to set up an end-to-end system to mitigate tsunami and storm surge hazard. The system includes all the necessary elements: networking of seismic stations; deployment of ocean bottom pressure recorders; real time sea level monitoring stations; establishment of radar based monitoring stations for real time measurement of surface currents and waves; modeling for tsunamis and storm surges; generation of coastal inundation and vulnerability maps; operation of a tsunami and storm surge warning centre on a 24×7 basis; capacity building and training of all the stakeholders and communication with the global community. This initiative was estimated to have a direct cost of US $30 million and was to be operative by August 2007. This has been achieved. The Indian National Centre for Ocean Information and Services (INCOIS), belonging to the Ministry of Earth Sciences (MoES), located at Hyderabad, is the nodal agency for this program. The system fared well during the occurrence of the September 12-13, 2007 tsunamigenic earthquakes. One of the problems is delay in estimating the size of large earthquakes. Empirical approaches are being developed to quickly estimate the size of earthquakes occurring in the Sumatra-Andaman zone of tsunamigenic earthquakes.

  5. Mitigation of unconfined releases of hazardous gases via liquid spraying

    SciTech Connect

    Fthenakis, V.M.

    1997-02-01

    The capability of water sprays in mitigating clouds of hydrofluoric acid (HF) has been demonstrated in the large-scale field experiments of Goldfish and Hawk, which took place at the DOE Nevada Test Site. The effectiveness of water sprays and fire water monitors to remove HF from vapor plume, has also been studied theoretically using the model HGSPRAY5 with the near-field and far-field dispersion described by the HGSYSTEM models. This paper presents options to select and evaluate liquid spraying systems, based on the industry experience and mathematical modeling.

  6. Planning for the Future: Climate Adaptation in Hazard Mitigation Plans and Comprehensive Water Resource Management Plans

    E-print Network

    Bailey, Gregory

    This project investigated whether hazard mitigation plans (HMPs) and comprehensive water resource management plans (CWRMPs) completed by cities and towns in Massachusetts account for the long term effects of climate change. ...

  7. Next-Generation GPS Station for Hazards Mitigation (Invited)

    NASA Astrophysics Data System (ADS)

    Bock, Y.

    2013-12-01

    Our objective is to better forecast, assess, and mitigate natural hazards, including earthquakes, tsunamis, and extreme storms and flooding through development and implementation of a modular technology for the next-generation in-situ geodetic station to support the flow of information from multiple stations to scientists, mission planners, decision makers, and first responders. The same technology developed under NASA funding can be applied to enhance monitoring of large engineering structures such as bridges, hospitals and other critical infrastructure. Meaningful warnings save lives when issued within 1-2 minutes for destructive earthquakes, several tens of minutes for tsunamis, and up to several hours for extreme storms and flooding, and can be provided by on-site fusion of multiple data types and generation of higher-order data products: GPS/GNSS and accelerometer measurements to estimate point displacements, and GPS/GNSS and meteorological measurements to estimate moisture variability in the free atmosphere. By operating semi-autonomously, each station can then provide low-latency, high-fidelity and compact data products within the constraints of narrow communications bandwidth that often accompanies natural disasters. We have developed a power-efficient, low-cost, plug-in Geodetic Module for fusion of data from in situ sensors including GPS, a strong-motion accelerometer module, and a meteorological sensor package, for deployment at existing continuous GPS stations in southern California; fifteen stations have already been upgraded. The low-cost modular design is scalable to the many existing continuous GPS stations worldwide. New on-the-fly data products are estimated with 1 mm precision and accuracy, including three-dimensional seismogeodetic displacements for earthquake, tsunami and structural monitoring and precipitable water for forecasting extreme weather events such as summer monsoons and atmospheric rivers experienced in California. Unlike more traditional approaches where data are collected and analyzed from a network of stations at a central processing facility, we are embedding these capabilities in the Geodetic Module's processor for in situ analysis and data delivery through TCP/IP to avoid single points of failure during emergencies. We are infusing our technology to several local and state groups, including the San Diego County Office of Emergency Services for earthquake and tsunami early warnings, UC San Diego Health Services for hospital monitoring and early warning, Caltrans for bridge monitoring, and NOAA's Weather Forecasting Offices in San Diego and Los Angeles Counties for forecasting extreme weather events. We describe our overall system and the ongoing efforts at technology infusion.
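
    A minimal sketch of the data fusion idea described above is given below: a Kalman filter that propagates displacement and velocity with high-rate accelerometer samples and corrects with lower-rate GPS displacements. The state model, noise levels and sampling rates are assumptions for illustration only, not the Geodetic Module's actual design.

    ```python
    import numpy as np

    # Minimal sketch of GPS/accelerometer fusion with a Kalman filter, in the
    # spirit of the seismogeodetic displacements described above; noise values,
    # sampling rates and the filter design are assumptions for illustration.

    dt = 0.01                                  # 100 Hz accelerometer
    F = np.array([[1, dt], [0, 1]])            # state: [displacement, velocity]
    B = np.array([[0.5 * dt**2], [dt]])        # acceleration drives the state
    H = np.array([[1.0, 0.0]])                 # GPS observes displacement
    Q = 1e-4 * np.eye(2)                       # process noise (assumed)
    R = np.array([[1e-4]])                     # GPS noise, ~1 cm std (assumed)

    x = np.zeros((2, 1))
    P = np.eye(2)

    def step(accel_sample, gps_disp=None):
        """One filter step: predict with an accelerometer sample, optionally
        update with a (lower-rate) GPS displacement measurement."""
        global x, P
        x = F @ x + B * accel_sample
        P = F @ P @ F.T + Q
        if gps_disp is not None:
            y = np.array([[gps_disp]]) - H @ x
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)
            x = x + K @ y
            P = (np.eye(2) - K @ H) @ P
        return float(x[0, 0])

    # 100 Hz acceleration with a 1 Hz GPS update every 100th sample
    for i, a in enumerate(np.random.normal(0.0, 0.1, 500)):
        d = step(a, gps_disp=0.0 if i % 100 == 0 else None)
    print(f"estimated displacement: {d:.4f} m")
    ```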

  8. Aligning Natural Resource Conservation and Flood Hazard Mitigation in California

    PubMed Central

    Calil, Juliano; Beck, Michael W.; Gleason, Mary; Merrifield, Matthew; Klausmeyer, Kirk; Newkirk, Sarah

    2015-01-01

    Flooding is the most common and damaging of all natural disasters in the United States, and was a factor in almost all declared disasters in U.S. history. Direct flood losses in the U.S. in 2011 totaled $8.41 billion and flood damage has also been on the rise globally over the past century. The National Flood Insurance Program paid out more than $38 billion in claims since its inception in 1968, more than a third of which has gone to the one percent of policies that experienced multiple losses and are classified as “repetitive loss.” During the same period, the loss of coastal wetlands and other natural habitat has continued, and funds for conservation and restoration of these habitats are very limited. This study demonstrates that flood losses could be mitigated through action that meets both flood risk reduction and conservation objectives. We found that there are at least 11,243km2 of land in coastal California, which is both flood-prone and has natural resource conservation value, and where a property/structure buyout and habitat restoration project could meet multiple objectives. For example, our results show that in Sonoma County, the extent of land that meets these criteria is 564km2. Further, we explore flood mitigation grant programs that can be a significant source of funds to such projects. We demonstrate that government funded buyouts followed by restoration of targeted lands can support social, environmental, and economic objectives: reduction of flood exposure, restoration of natural resources, and efficient use of limited governmental funds. PMID:26200353
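
    The core spatial analysis is an overlay of flood-prone land with land of conservation value. The sketch below illustrates that overlay with hypothetical boolean rasters and an assumed 30 m cell size; it is not the study's dataset or exact procedure.

    ```python
    import numpy as np

    # Hedged sketch of the overlay idea described above: count cells that are both
    # flood-prone and of conservation value and convert to area. The arrays and
    # the 30 m cell size are hypothetical stand-ins for the study's actual data.

    cell_size_m = 30.0
    flood_prone = np.random.rand(1000, 1000) > 0.8         # boolean flood-hazard raster
    conservation_value = np.random.rand(1000, 1000) > 0.7  # boolean conservation raster

    both = flood_prone & conservation_value
    area_km2 = both.sum() * (cell_size_m ** 2) / 1e6
    print(f"land meeting both criteria: {area_km2:.1f} km2")
    ```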

  9. Assessing the costs of hazard mitigation through landscape interventions in the urban structure

    NASA Astrophysics Data System (ADS)

    Bostenaru-Dan, Maria; Aldea Mendes, Diana; Panagopoulos, Thomas

    2014-05-01

    In this paper we look at an issue rarely approached, the economic efficiency of natural hazard risk mitigation. The urban scale at which a natural hazard can impact leads to the importance of urban planning strategy in risk management. However, it is usually the natural, engineering, and social sciences that deal with it, while the role of architecture and urban planning is neglected. Climate change can lead to risks related to increased floods, desertification, and sea level rise, among others. Reducing the sealed surfaces in cities through green spaces in the crowded centres can mitigate them, and can be foreseen in restructuration plans in the presence or absence of disasters. For this purpose we reviewed the role of green spaces and community centres such as churches in games, which can form the core of restructuration efforts, as field and archive studies also show. We look at the way ICT can contribute to organizing the information, from the building survey to economic computations, in direct modeling or through games. The roles of game theory, agent-based modeling, networks, and urban public policies in designing decision systems for risk management are discussed. The game rules are at the same time supported by our field and archive studies, as well as by research by design. We also take into consideration a rarely considered element, the role of landscape planning, through the inclusion of green elements in reconstruction after natural and man-made disasters, or in restructuration efforts to mitigate climate change. Apart from the existing old city tissue, landscape can also be endangered by speculation, and therefore it is vital to highlight its high economic value, also in this particular case. As ICOMOS highlights for the 2014 congress, heritage and landscape are two sides of the same coin. Landscape can become, or be connected to, a community centre, the first being necessary for building a settlement, the second raising its value, or can build connections between landmarks in urban routes. For this reason, location plays a role not only in mitigating the effects of hazards but also in increasing the value of land through its vicinities. Games are only another way to build a model of the complex system which is the urban organism in this regard, and a model is easier to analyse than the system itself while still displaying its basic rules. The role of landscape in building roads of memory between landmarks in reconstruction is yet to be investigated in a proposed future COST action.

  10. Assessment and mitigation of combustible dust hazards in the plastics industry

    NASA Astrophysics Data System (ADS)

    Stern, Michael C.; Ibarreta, Alfonso; Myers, Timothy J.

    2015-05-01

    A number of recent industrial combustible dust fires and explosions, some involving powders used in the plastics industry, have led to heightened awareness of combustible dust hazards, increased regulatory enforcement, and changes to the current standards and regulations. This paper provides a summary of the fundamentals of combustible dust explosion hazards, comparing and contrasting combustible dust to flammable gases and vapors. The types of tests used to quantify and evaluate the potential hazard posed by plastic dusts are explored. Recent changes in NFPA 654, a standard applicable to combustible dust in the plastics industry, are also discussed. Finally, guidance on the primary methods for prevention and mitigation of combustible dust hazards is provided.
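
    Among the test quantities referred to above, the dust deflagration index K_St is derived from closed-vessel explosion tests via the cube-root law. A worked example with assumed (illustrative) test values follows.

    ```python
    # Worked example (illustrative values, not data from the paper): the dust
    # deflagration index K_St is obtained from closed-vessel tests via the
    # cube-root law  K_St = (dP/dt)_max * V**(1/3).

    dPdt_max_bar_per_s = 580.0   # maximum rate of pressure rise in a 20 L sphere (assumed)
    V_m3 = 0.020                 # standard 20 L test vessel

    K_St = dPdt_max_bar_per_s * V_m3 ** (1.0 / 3.0)
    print(f"K_St = {K_St:.0f} bar*m/s")   # ~157 bar*m/s -> dust explosion class St 1
    ```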

  11. Spatio-temporal patterns of hazards and their use in risk assessment and mitigation. Case study of road accidents in Romania

    NASA Astrophysics Data System (ADS)

    Catalin Stanga, Iulian

    2013-04-01

    Road accidents are among the leading causes of death in many countries worldwide, partly as an inherent consequence of the increasing mobility of today's society. The World Health Organization estimates that 1.3 million people died in road accidents in 2011, which means 186 deaths per million. The tragic picture is completed by the millions of people experiencing physical injuries and by the enormous social and economic costs that these events imply. Romania has one of the most unsafe road networks within the European Union, with annual averages of 9400 accidents, 8300 injuries and almost 2680 fatalities (2007-2012). An average of 141 deaths per million is more than twice the average fatality rate in the European Union (about 60 deaths per million). Other specific indicators (accidents or fatalities relative to road length, vehicle fleet size, driving license owners or adult population, etc.) are even worse in the same European context. Road accidents are caused by a complex series of factors, some of them being a relatively constant premise, while others act as catalyzing factors or triggering agents: road features and quality, vehicle technical state, weather conditions, human-related factors, etc. All these lead to a complex equation with too many unknown variables, making a probabilistic approach almost impossible. However, the high concentration of accidents in a region or in some road sectors is caused by the existence of a specific context, created by factors with a permanent or repetitive character, and leads to the idea of a spatial autocorrelation between the locations of adjoining accidents. In the same way, the increasing frequency of road accidents, and the repeatability of their causes in different periods of the year, makes it possible to identify black timeframes with a higher incidence of road accidents. Identifying and analyzing road blackspots (hotspots) and black zones would help to improve road safety by acting against the common causes that create the spatial or temporal clustering of crashes. Since the 1990s, Geographical Information Systems (GIS) have become a very important tool for traffic and road safety management, allowing not only spatial and multifactorial analysis, but also graphical and non-graphical outputs. The current paper presents an accessible GIS methodology to study the spatio-temporal pattern of injury-related road accidents, to identify high-density accident zones, to perform cluster analysis, to create multicriterial typologies, and to identify and explain spatial and temporal similarities. For this purpose, a Geographical Information System was created, allowing a complex analysis that involves not only the events but also a large set of interrelated and spatially linked attributes. The GIS includes the accidents as georeferenced point elements with a spatially linked attribute database: identification information (date, location details); accident type; main, secondary and aggravating causes; data about the driver; vehicle information; and consequences (damages, injured people and fatalities). Each attribute has its own numeric code that allows both statistical analysis and spatial interrogation. The database includes those road accidents that led to physical injuries and loss of human lives between 2007 and 2012, and the spatial analysis was carried out using TNTmips 7.3 software.
    Data aggregation and processing made it possible to create the spatial pattern of injury-related road accidents through kernel density estimation at three different levels (national - Romania; county level - Iasi County; local level - Iasi town). Spider graphs were used to create the temporal pattern of road accidents at three levels (daily, weekly and monthly), directly related to their causes. Moreover, the spatial and temporal database relates the natural hazards (glazed frost, fog, and blizzard) to the human-made ones, giving the opportunity to evaluate the nature of uncertainties in risk assessment. At the end, this paper provides a clustering methodology based on several environmenta
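
    The kernel density step mentioned above can be illustrated with a short sketch (the study used TNTmips 7.3; this example simply shows the idea with synthetic accident coordinates and SciPy's Gaussian kernel density estimator).

    ```python
    import numpy as np
    from scipy.stats import gaussian_kde

    # Illustration of the kernel-density step described above, with synthetic
    # accident points; not the study's data or software.

    rng = np.random.default_rng(0)
    # synthetic projected coordinates (metres) of accident locations
    xy = np.vstack([rng.normal(635000, 2000, 400), rng.normal(5225000, 1500, 400)])

    kde = gaussian_kde(xy)          # Gaussian kernel, bandwidth by Scott's rule

    # evaluate the density surface on a regular grid
    gx, gy = np.mgrid[xy[0].min():xy[0].max():200j, xy[1].min():xy[1].max():200j]
    density = kde(np.vstack([gx.ravel(), gy.ravel()])).reshape(gx.shape)

    print("density grid:", density.shape, "peak cell value:", density.max())
    ```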

  12. Standards and Guidelines for Numerical Models for Tsunami Hazard Mitigation

    Microsoft Academic Search

    V. Titov; F. Gonzalez; U. Kanoglu; A. Yalciner; C. E. Synolakis

    2006-01-01

    An increasing number of nations around the world need to develop tsunami mitigation plans which invariably involve inundation maps for warning guidance and evacuation planning. There is the risk that inundation maps may be produced with older or untested methodology, as there are currently no standards for modeling tools. In the aftermath of the 2004 megatsunami, some models were used

  13. Development Of An Open System For Integration Of Heterogeneous Models For Flood Forecasting And Hazard Mitigation

    NASA Astrophysics Data System (ADS)

    Chang, W.; Tsai, W.; Lin, F.; Lin, S.; Lien, H.; Chung, T.; Huang, L.; Lee, K.; Chang, C.

    2008-12-01

    During a typhoon or a heavy storm event, various forecasting models can technically demonstrate their capability to predict rainfall intensity, water level variation in rivers, and the flood situation in urban areas. However, in practice, the following two causes tend to restrain the further application of these models as a decision support system (DSS) for hazard mitigation. The first is the difficulty of integrating heterogeneous models. One has to take into consideration the different usage formats of the models, such as input files, output files, computational requirements, and so on. The second is that, due to the heterogeneity of models and systems, the development of a DSS requires a friendly user interface or platform to hide the complexity of the various tools from users. It is expected that users may be governmental officials rather than professional experts; therefore, a complicated DSS interface is not acceptable. Based on the above considerations, in the present study, we develop an open system for integration of several simulation models for flood forecasting by adopting the FEWS (Flood Early Warning System) platform developed by WL | Delft Hydraulics. It allows us to link heterogeneous models effectively and provides suitable display modules. In addition, FEWS has been adopted by the Water Resources Agency (WRA), Taiwan as the standard operational system for river flood management, which means this work can be integrated much more easily with practical cases. In the present study, based on the FEWS platform, a basin rainfall-runoff model, the SOBEK channel-routing model, and an estuary tide forecasting model are linked and integrated through the physical connection of model initial and boundary definitions. The workflow of the integrated model processes is shown in Fig. 1. This differs from the typical single-model linking used in FEWS, which aims only at data exchange without much physical consideration, so the present approach enables tighter collaboration among these hydrological models. In addition, in order to make communication between system users and decision makers efficient and effective, a real-time, multi-user communication platform, designated Co-life, is incorporated in the present study. Through its application-sharing function, the flood forecasting results can be displayed for all attendees at different locations to support decision making for hazard mitigation. Fig. 2 shows a cyber-conference of WRA officials using the Co-life system for hazard mitigation during a typhoon event.

  14. A comparison of fire hazard mitigation alternatives in pinyon–juniper woodlands of Arizona

    Microsoft Academic Search

    David W. Huffman; Peter Z. Fulé; Joseph E. Crouse; Kristen M. Pearson

    2009-01-01

    Concern over uncontrollable wildfire in pinyon–juniper woodlands has led public land managers in the southwestern United States to seek approaches for mitigating wildfire hazard, yet little information is available concerning effectiveness and ecological responses of alternative treatments. We established a randomized block experiment at a pinyon–juniper site in northern Arizona and tested effects of no treatment (Control), thinning only (Thin),

  15. Fourth DOE Natural Phenomena Hazards Mitigation Conference: Proceedings. Volume 1

    SciTech Connect

    Not Available

    1993-12-31

    This conference allowed an interchange in the natural phenomena area among designers, safety professionals, and managers. The papers presented in Volume I of the proceedings are from sessions I - VIII which cover the general topics of: DOE standards, lessons learned and walkdowns, wind, waste tanks, ground motion, testing and materials, probabilistic seismic hazards, risk assessment, base isolation and energy dissipation, and lifelines and floods. Individual papers are indexed separately. (GH)

  16. Avalanche hazards and mitigation in Austria: a review

    Microsoft Academic Search

    Peter Höller

    2007-01-01

    At all times natural hazards like torrents or avalanches pose a threat to settlements and infrastructures in the Austrian Alps. Since 1950 more than 1,600 persons have been killed by avalanches in Austria, which is on average approximately 30 fatalities per year. In particular, the winter periods 1950/1951 and 1953/1954 stand out with more than 100 fatalities. Those events led

  17. Department of Energy Natural Phenomena Hazards Mitigation Program

    SciTech Connect

    Murray, R.C.

    1993-09-01

    This paper will present a summary of past and present accomplishments of the Natural Phenomena Hazards Program that has been ongoing at Lawrence Livermore National Laboratory since 1975. The natural phenomena covered include earthquakes; winds, hurricanes, and tornadoes; flooding and precipitation; lightning; and volcanic events. The work is organized into four major areas: (1) policy, requirements, standards, and guidance; (2) technical support and research and development; (3) technology transfer; and (4) oversight.

  18. Experimentally Benchmarked Numerical Approaches to Lightning Hazard Assessment and Mitigation

    NASA Astrophysics Data System (ADS)

    Jones, Malcolm; Newton, David

    2013-04-01

    A natural hazard that has been with us since the beginning of time is the lightning strike. Not only does it represent a direct hazard to humans but also to the facilities that they work within and the products they produce. The latter categories are of particular concern when they are related to potentially hazardous processes and products. For this reason, experimental and numerical modelling techniques are developed to understand the nature of the hazards, to develop appropriate protective approaches which can be put in place and, finally, to gain assurance that the overall risks fall within nationally and internationally accepted standards and those appropriate to the special nature of the work. The latter is of particular importance when the processes and the products within such facilities have a potential susceptibility to lightning strike and where failure is deemed unacceptable. This paper covers examples of the modelling approaches applied to such facilities within which high consequence operations take place, together with the protection that is required for high consequence products. In addition, examples are given of how the numerical techniques are benchmarked by supporting experimental programmes. Not only should such a safety rationale be laid down and agreed early for these facilities and products, but it must also be maintained during the inevitable changes that will occur during the design, development, production and maintenance phases. For example, an 'improvement', as seen by a civil engineer or a facility manager, may well turn out to be detrimental to lightning safety. Constant vigilance is key to ensuring the maintenance of safety.

  19. Mitigation of EMU Cut Glove Hazard from Micrometeoroid and Orbital Debris Impacts on ISS Handrails

    NASA Technical Reports Server (NTRS)

    Ryan, Shannon; Christiansen, Eric L.; Davis, Bruce A.; Ordonez, Erick

    2009-01-01

    Recent cut damages sustained on crewmember gloves during extravehicular activity (EVA) onboard the International Space Station (ISS) have been caused by contact with sharp edges or a pinch point, according to analysis of the damages. One potential source is protruding sharp-edged crater lips from micrometeoroid and orbital debris (MMOD) impacts on metallic handrails along EVA translation paths. A number of hypervelocity impact tests were performed on ISS handrails and found that mm-sized projectiles were capable of inducing crater lip heights two orders of magnitude above the minimum value for glove abrasion concerns. Two techniques were evaluated for mitigating the cut glove hazard of MMOD impacts on ISS handrails: flexible overwraps, which act to limit contact between crewmember gloves and impact sites; and alternate materials, which form less hazardous impact crater profiles. In parallel with redesign efforts to increase the cut resilience of EMU gloves, the modifications to ISS handrails evaluated in this study provide the means to significantly reduce cut glove risk from MMOD impact craters.

  20. The U.S. National Tsunami Hazard Mitigation Program: Successes in Tsunami Preparedness

    NASA Astrophysics Data System (ADS)

    Whitmore, P.; Wilson, R. I.

    2012-12-01

    Formed in 1995 by Congressional action, the National Tsunami Hazard Mitigation Program (NTHMP) provides the framework for tsunami preparedness activities in the United States. The Program consists of the 28 U.S. coastal states, territories, and commonwealths (STCs), as well as three Federal agencies: the National Oceanic and Atmospheric Administration (NOAA), the Federal Emergency Management Agency (FEMA), and the United States Geological Survey (USGS). Since its inception, the NTHMP has advanced tsunami preparedness in the United States through accomplishments in many areas: - Coordination and funding of tsunami hazard analysis and preparedness activities in STCs; - Development and execution of a coordinated plan to address education and outreach activities (materials, signage, and guides) within its membership; - Leading the effort to assist communities in meeting National Weather Service (NWS) TsunamiReady guidelines through development of evacuation maps and other planning activities; - Determination of tsunami hazard zones in the most highly threatened coastal communities throughout the country by detailed tsunami inundation studies; - Development of a benchmarking procedure for numerical tsunami models to ensure models used in the inundation studies meet consistent NOAA standards; - Creation of a national tsunami exercise framework to test tsunami warning system response; - Funding of community tsunami warning dissemination and reception systems such as sirens and NOAA Weather Radios; and - Providing guidance to NOAA's Tsunami Warning Centers regarding warning dissemination and content. NTHMP activities have advanced the state of preparedness of United States coastal communities and have helped save lives and property during recent tsunamis. Program successes as well as future plans, including maritime preparedness, are discussed.

  1. Defending Planet Earth: Near-Earth Object Surveys and Hazard Mitigation Strategies

    NASA Technical Reports Server (NTRS)

    2010-01-01

    The United States spends approximately four million dollars each year searching for near-Earth objects (NEOs). The objective is to detect those that may collide with Earth. The majority of this funding supports the operation of several observatories that scan the sky searching for NEOs. This, however, is insufficient to detect the majority of NEOs that may present a tangible threat to humanity. A significantly smaller amount of funding supports ways to protect the Earth from such a potential collision, or "mitigation." In 2005, a Congressional mandate called for NASA to detect 90 percent of NEOs with diameters of 140 meters or greater by 2020. Defending Planet Earth: Near-Earth Object Surveys and Hazard Mitigation Strategies identifies the need for detection of objects as small as 30 to 50 meters, as these can be highly destructive. The book explores four main types of mitigation: civil defense, "slow push" or "pull" methods, kinetic impactors, and nuclear explosions. It also asserts that responding effectively to hazards posed by NEOs requires national and international cooperation. Defending Planet Earth: Near-Earth Object Surveys and Hazard Mitigation Strategies is a useful guide for scientists, astronomers, policy makers, and engineers.

  2. TH101 & TH41 Flood Mitigation Study

    E-print Network

    Minnesota, University of

    Presentation by Mark Benson, PE, and Rachel Pichelmann, EIT, CFM. The study addresses impacts caused by seasonal flooding of the Minnesota River and identifies measures to ease congestion at Hwy 169 to mitigate the impacts of detoured traffic resulting from flood-related closures at Hwy 41.

  3. The NEOShield Project: Understanding the Mitigation-Relevant Physical Properties of Potentially Hazardous Asteroids

    NASA Astrophysics Data System (ADS)

    Harris, Alan W.; Drube, L.; Consortium, NEOShield

    2012-10-01

    NEOShield is a European-Union funded project to address impact hazard mitigation issues, coordinated by the German Aerospace Center, DLR. The NEOShield consortium consists of 13 research institutes, universities, and industrial partners from 6 countries and includes leading US and Russian space organizations. The primary aim of the 5.8 million euro, 3.5 year project, which commenced in January 2012, is to investigate in detail promising mitigation techniques, such as the kinetic impactor, blast deflection, and the gravity tractor, and devise feasible demonstration missions. Options for an international strategy for implementation when an actual impact threat arises will also be investigated. Our current scientific work is focused on examining the mitigation-relevant physical properties of the NEO population via observational data and laboratory experiments on asteroid surface analog materials. We are attempting to narrow the range of the expected properties of objects that are most likely to threaten the Earth and trigger space-borne mitigation attempts, and investigate how such objects would respond to different mitigation techniques. The results of our scientific work will flow into the technical phase of the project, during which detailed designs of feasible mitigation demonstration missions will be developed. We briefly describe the scope of the project and report on results obtained to date. Funded under EU FP7 program agreement no. 282703.

  4. Pulsed Electric Processing of the Seismic-Active Fault for Earthquake Hazard Mitigation

    Microsoft Academic Search

    V. A. Novikov; V. A. Zeigarnik; Yu. B. Konev; V. N. Klyuchkin

    2010-01-01

    Previous field and laboratory investigations performed in Russia (1999-2008) showed the possibility of applying high-power electric current pulses, generated by a pulsed MHD power system, to trigger weak seismicity and release tectonic stresses in the Earth's crust for earthquake hazard mitigation. The mechanism of the influence of man-made electromagnetic fields on regional seismicity is not yet clear.

  5. A perspective multidisciplinary geological approach for mitigation of effects due to the asbestos hazard

    NASA Astrophysics Data System (ADS)

    Vignaroli, Gianluca; Rossetti, Federico; Belardi, Girolamo; Billi, Andrea

    2010-05-01

    Asbestos-bearing rock sequences constitute a remarkable natural hazard that poses an important threat to human health and may be at the origin of diseases such as asbestosis, mesothelioma and lung cancer. Presently, asbestos is classified as a Category 1 carcinogen by world health authorities. Although regulatory agencies in many countries prohibit or restrict the use of asbestos, and regulate environmental asbestos exposure, the impact of asbestos on human life still constitutes a major problem. Naturally occurring asbestos includes serpentine and amphibole minerals characterised by fibrous morphology, and it is a constituent of mineralogical associations typical of mafic and ultramafic rocks within ophiolitic sequences. Release of fibres can occur both through natural processes (erosion) and through human activities requiring fragmentation of ophiolite rocks (quarrying, tunnelling, railway construction, etc.). As a consequence, vulnerability is increasing at sites where workers and residents are exposed to fibres dispersed during mining and milling of ophiolitic rocks. By analysing in the field different exposures of ophiolitic sequences from the Italian peninsula and after an extensive review of the existing literature, we emphasise the importance of the geological context (origin, tectonic and deformation history) of ophiolites as a first-order parameter in evaluating the asbestos hazard. Integrated structural, textural, mineralogical and petrological studies significantly improve our understanding of the mechanisms governing the nucleation/growth of fibrous minerals in deformation structures (both ductile and brittle) within the ophiolitic rocks. A primary role is recognised for the structural processes favouring the fibrous mineralization, with correlations existing between the fibre parameters (such as mineralogical composition, texture and mechanical characteristics) and the particles released into the air (such as shape, size, and amount liberated during rock fragmentation). Accordingly, we are confident that the definition of an analytical protocol based on the geological attributes of asbestos-bearing rocks may constitute a propaedeutical tool to evaluate the asbestos hazard in natural environments. This approach may have important implications for mitigating the effects of the asbestos hazard, from the medical field to engineering operations.

  6. Planning ahead for asteroid and comet hazard mitigation, phase 1: parameter space exploration and scenario modeling

    SciTech Connect

    Plesko, Catherine S [Los Alamos National Laboratory; Clement, R Ryan [Los Alamos National Laboratory; Weaver, Robert P [Los Alamos National Laboratory; Bradley, Paul A [Los Alamos National Laboratory; Huebner, Walter F [Los Alamos National Laboratory

    2009-01-01

    The mitigation of impact hazards resulting from Earth-approaching asteroids and comets has received much attention in the popular press. However, many questions remain about the near-term and long-term feasibility and appropriate application of all proposed methods. Recent and ongoing ground- and space-based observations of small solar-system body composition and dynamics have revolutionized our understanding of these bodies (e.g., Ryan (2000), Fujiwara et al. (2006), and Jedicke et al. (2006)). Ongoing increases in computing power and algorithm sophistication make it possible to calculate the response of these inhomogeneous objects to proposed mitigation techniques. Here we present the first phase of a comprehensive hazard mitigation planning effort undertaken by Southwest Research Institute and Los Alamos National Laboratory. We begin by reviewing the parameter space of the object's physical and chemical composition and trajectory. We then use the radiation hydrocode RAGE (Gittings et al. 2008), Monte Carlo N-Particle (MCNP) radiation transport (see Clement et al., this conference), and N-body dynamics codes to explore the effects these variations in object properties have on the coupling of energy into the object from a variety of mitigation techniques, including deflection and disruption by nuclear and conventional munitions, and a kinetic impactor.

  7. Post-disaster assessment of hazard mitigation for small and medium-magnitude debris flow disasters in Bali, Indonesia and Jimani, Dominican Republic

    Microsoft Academic Search

    Brent Doberstein

    2009-01-01

    This article explores whether past exposure to debris flow disasters with a human dimension (e.g. caused in part by deforestation) results in adaptive hazard mitigation and improved environmental and resource management practices in affected areas. When guiding hazard mitigation practice, the ‘adaptive hazard mitigation’ approach views mitigation as a multi-dimensional experiment, with the associated need for post-experiment monitoring, evaluation, learning

  8. A portfolio approach to evaluating natural hazard mitigation policies: An Application to lateral-spread ground failure in Coastal California

    USGS Publications Warehouse

    Bernknopf, R.L.; Dinitz, L.B.; Rabinovici, S.J.M.; Evans, A.M.

    2001-01-01

    In the past, efforts to prevent catastrophic losses from natural hazards have largely been undertaken by individual property owners based on site-specific evaluations of risks to particular buildings. Public efforts to assess community vulnerability and encourage mitigation have focused on either aggregating site-specific estimates or adopting standards based upon broad assumptions about regional risks. This paper develops an alternative, intermediate-scale approach to regional risk assessment and the evaluation of community mitigation policies. Properties are grouped into types with similar land uses and levels of hazard, and hypothetical community mitigation strategies for protecting these properties are modeled like investment portfolios. The portfolios consist of investments in mitigation against the risk to a community posed by a specific natural hazard, and are defined by a community's mitigation budget and the proportion of the budget invested in locations of each type. The usefulness of this approach is demonstrated through an integrated assessment of earthquake-induced lateral-spread ground failure risk in the Watsonville, California area. Data from the magnitude 6.9 Loma Prieta earthquake of 1989 are used to model lateral-spread ground failure susceptibility. Earth science and economic data are combined and analyzed in a Geographic Information System (GIS). The portfolio model is then used to evaluate the benefits of mitigating the risk in different locations. Two mitigation policies, one that prioritizes mitigation by land use type and the other by hazard zone, are compared with a status quo policy of doing no further mitigation beyond that which already exists. The portfolio representing the hazard zone rule yields a higher expected return than the land use portfolio does; however, the hazard zone portfolio experiences a higher standard deviation. Therefore, neither portfolio is clearly preferred. The two mitigation policies both reduce expected losses and increase overall expected community wealth compared to the status quo policy.
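
    To illustrate the portfolio framing used in this record, the short Python sketch below computes the expected return and standard deviation of two hypothetical mitigation allocations over a small set of equally likely ground-failure scenarios. The property types, scenario returns, and budget shares are invented for illustration and are not the Watsonville data or results.

    # Illustrative comparison of mitigation "portfolios" by expected return and risk.
    import statistics

    # Loss reduction per dollar of mitigation, by property type, under three
    # equally likely ground-failure scenarios (hypothetical values).
    scenario_returns = {
        "high_hazard_residential": [0.8, 1.6, 2.4],
        "low_hazard_commercial":   [0.4, 0.6, 0.9],
    }

    def portfolio_stats(weights):
        """Expected return and standard deviation of a mitigation portfolio,
        where weights give the share of the budget spent on each property type."""
        n_scenarios = len(next(iter(scenario_returns.values())))
        outcomes = [sum(w * scenario_returns[k][s] for k, w in weights.items())
                    for s in range(n_scenarios)]
        return statistics.mean(outcomes), statistics.pstdev(outcomes)

    hazard_zone_rule = {"high_hazard_residential": 0.8, "low_hazard_commercial": 0.2}
    land_use_rule    = {"high_hazard_residential": 0.3, "low_hazard_commercial": 0.7}

    for name, weights in [("hazard zone rule", hazard_zone_rule), ("land use rule", land_use_rule)]:
        mean, sd = portfolio_stats(weights)
        print(f"{name}: expected return {mean:.2f}, standard deviation {sd:.2f}")

    With these invented numbers the hazard zone allocation shows both the higher mean and the higher spread, which is the trade-off the abstract describes.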

  9. Developing a Long-term Hazard Mitigation Plan for Consequent Volcanic Sedimentation Hazards at Santiaguito Dome Complex, Guatemala

    NASA Astrophysics Data System (ADS)

    Bunzendahl, E.; Bluth, G. J.; Rose, W. I.; Reif, S. L.; Matias, O.

    2001-12-01

    Continuous volcanic activity at Santiaguito, accompanied by seasonal monsoons, results in sediment inputs downslope into the Río Ixpatz and Río Samalá river channels. This threatens the lives and economic stability in populated downstream areas of Guatemala's coastal slope, a region responsible for major contributions to Guatemala's foreign exchange earnings. The changing riverbeds are host to sediment and water quality problems; subsequent flooding threatens villages, nearby cropland, infrastructure, and transportation. Current research suggests that the volcanic activity results in costs equal to millions of US dollars per year. Mitigation efforts are needed to protect lives, fertile land, and valuable crops located along the river valleys and plains in the down-slope region of the volcano. The goal of this work is to build a GIS database for the areas affected by Santiaguito to facilitate the development of a long-range (several decades) plan for hazard mitigation and infrastructure development. The GIS will include multiple TM images which have been used to quantify activity and downslope aggradation patterns (Matias et al, this meeting), digital topography obtained from IGN and USGS/VDAP, land use maps and infrastructure overlays from IGN, Guatemala, and volcanic hazard zonation maps from INSIVUMEH, Guatemala. We expect to also use LAHARZ (Iverson, R.M., Schilling, S.S., Vallance, J.W., GSA Bulletin, 1998) to supplement GIS analysis. Additionally, we plan to work with local agencies within Guatemala to improve the current mitigation strategy which mainly involves extensive annual river and near-bridge dredging and is reactive on short time scales.

  10. Mitigating hazards to aircraft from drifting volcanic clouds by comparing and combining IR satellite data with forward transport models

    Microsoft Academic Search

    M. Alexandra Matiella Novak

    2008-01-01

    Volcanic ash clouds in the upper atmosphere (>10km) present a significant hazard to the aviation community and in some cases cause near-disastrous situations for aircraft that inadvertently encounter them. The two most commonly used techniques for mitigating hazards to aircraft from drifting volcanic clouds are (1) using data from satellite observations and (2) the forecasting of dispersion and trajectories with

  11. Lidar and Electro-Optics for Atmospheric Hazard Sensing and Mitigation

    NASA Technical Reports Server (NTRS)

    Clark, Ivan O.

    2012-01-01

    This paper provides an overview of the research and development efforts of the Lidar and Electro-Optics element of NASA's Aviation Safety Program. This element is seeking to improve the understanding of the atmospheric environments encountered by aviation and to provide enhanced situation awareness for atmospheric hazards. The improved understanding of atmospheric conditions is specifically to develop sensor signatures for atmospheric hazards. The current emphasis is on kinetic air hazards such as turbulence, aircraft wake vortices, mountain rotors, and windshear. Additional efforts are underway to identify and quantify the hazards arising from multi-phase atmospheric conditions including liquid and solid hydrometeors and volcanic ash. When the multi-phase conditions act as obscurants that result in reduced visual awareness, the element seeks to mitigate the hazards associated with these diminished visual environments. The overall purpose of these efforts is to enable safety improvements for air transport class and business jet class aircraft as the transition to the Next Generation Air Transportation System occurs.

  12. Physical Prototype Development for the Real-Time Detection and Mitigation of Hazardous Releases into a Flow System

    NASA Astrophysics Data System (ADS)

    Rimer, Sara; Katopodes, Nikolaos

    2013-11-01

    The threat of accidental or deliberate toxic chemicals released into public spaces is a significant concern to public safety. The real-time detection and mitigation of such hazardous contaminants has the potential to minimize harm and save lives. In this study, we demonstrate the feasibility of feedback control of a hazardous contaminant by means of a laboratory-scale physical prototype integrated with a previously-developed robust predictive control numerical model. The physical prototype is designed to imitate a public space characterized by a long conduit with an ambient flow (e.g. airport terminal). Unidirectional air flows through a 24-foot long duct. The ``contaminant'' plume of propylene glycol smoke is released into the duct. Camera sensors are used to visually measure concentration of the plume. A pneumatic system is utilized to localize the contaminant via air curtains, and draw it out via vacuum nozzles. The control prescribed to the pneumatic system is based on the numerical model. NSF-CMMI 0856438.
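
    A much simplified sketch of this sense-and-actuate loop is given below: a toy one-dimensional advection model stands in for the duct, a sampled concentration stands in for the camera measurement, and a proportional control law stands in for the authors' predictive controller. All parameters are invented for illustration.

    # Toy feedback loop: a measured concentration drives a vacuum-nozzle actuator
    # in a 1-D duct model. Geometry, flow, and gains are illustrative placeholders.
    import numpy as np

    n_cells, dt, velocity = 60, 0.1, 1.0           # duct discretization and ambient flow (dx = 1)
    conc = np.zeros(n_cells)
    conc[5] = 100.0                                 # initial "smoke" release near the inlet
    nozzle_cell, setpoint, gain = 40, 1.0, 0.5      # actuator location and controller settings

    for _ in range(400):
        # advect the contaminant downstream (first-order upwind scheme)
        conc[1:] -= velocity * dt * (conc[1:] - conc[:-1])
        conc[0] = 0.0
        # "camera sensor": concentration sampled just upstream of the nozzle
        measured = conc[nozzle_cell - 1]
        # proportional control: extraction effort grows with the measured excess
        extraction = gain * max(measured - setpoint, 0.0)
        conc[nozzle_cell] = max(conc[nozzle_cell] - extraction * dt, 0.0)

    print(f"peak concentration downstream of the nozzle: {conc[nozzle_cell + 1:].max():.3f}")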

  13. Linear Aerospike SR-71 Experiment (LASRE): Aerospace Propulsion Hazard Mitigation Systems

    NASA Technical Reports Server (NTRS)

    Mizukami, Masashi; Corpening, Griffin P.; Ray, Ronald J.; Hass, Neal; Ennix, Kimberly A.; Lazaroff, Scott M.

    1998-01-01

    A major hazard posed by the propulsion system of hypersonic and space vehicles is the possibility of fire or explosion in the vehicle environment. The hazard is mitigated by minimizing or detecting, in the vehicle environment, the three ingredients essential to producing fire: fuel, oxidizer, and an ignition source. The Linear Aerospike SR-71 Experiment (LASRE) consisted of a linear aerospike rocket engine integrated into one-half of an X-33-like lifting body shape, carried on top of an SR-71 aircraft. Gaseous hydrogen and liquid oxygen were used as propellants. Although LASRE is a one-of-a-kind experimental system, it must be rated for piloted flight, so this test presented a unique challenge. To help meet safety requirements, the following propulsion hazard mitigation systems were incorporated into the experiment: pod inert purge, oxygen sensors, a hydrogen leak detection algorithm, hydrogen sensors, fire detection and pod temperature thermocouples, water misting, and control room displays. These systems are described, and their development discussed. Analyses, ground test, and flight test results are presented, as are findings and lessons learned.

  14. Use of a Novel Visual Metaphor Measure (PRISM) to Evaluate School Children's Perceptions of Natural Hazards, Sources of Hazard Information, Hazard Mitigation Organizations, and the Effectiveness of Future Hazard Education Programs in Dominica, Eastern Caribbean

    NASA Astrophysics Data System (ADS)

    Parham, Martin; Day, Simon; Teeuw, Richard; Solana, Carmen; Sensky, Tom

    2015-04-01

    This project aims to study the development of understanding of natural hazards (and of hazard mitigation) from the age of 11 to the age of 15 in secondary school children from 5 geographically and socially different schools on Dominica, through repeated interviews with the students and their teachers. These interviews will be coupled with a structured course of hazard education in the Geography syllabus; the students not taking Geography will form a control group. To avoid distortion of our results arising from the developing verbalization and literacy skills of the students over the 5 years of the project, we have adapted the PRISM tool used in clinical practice to assess patient perceptions of illness and treatment (Buchi & Sensky, 1999). This novel measure is essentially non-verbal, and uses spatial positions of moveable markers ("object" markers) on a board, relative to a fixed marker that represents the subject's "self", as a visual metaphor for the importance of the object to the subject. The subjects also explain their reasons for placing the markers as they have, to provide additional qualitative information. The PRISM method thus produces data on the perceptions measured on the board that can be subjected to statistical analysis, and also succinct qualitative data about each subject. Our study will gather data on participants' perceptions of different natural hazards, different sources of information about these, and organizations or individuals to whom they would go for help in a disaster, and investigate how these vary with geographical and social factors. To illustrate the method, which is generalisable, we present results from our initial interviews of the cohort of 11 year olds whom we will follow through their secondary school education. Büchi, S., & Sensky, T. (1999). PRISM: Pictorial Representation of Illness and Self Measure: a brief nonverbal measure of illness impact and therapeutic aid in psychosomatic medicine. Psychosomatics, 40(4), 314-320.

  15. Use of a Novel Visual Metaphor Measure (PRISM) to Evaluate School Children's Perceptions of Natural Hazards, Sources of Hazard Information, Hazard Mitigation Organizations, and the Effectiveness of Future Hazard Education Programs in Dominica, Eastern Caribbean

    NASA Astrophysics Data System (ADS)

    Parham, M.; Day, S. J.; Teeuw, R. M.; Solana, C.; Sensky, T.

    2014-12-01

    This project aims to study the development of understanding of natural hazards (and of hazard mitigation) from the age of 11 to the age of 15 in secondary school children from 5 geographically and socially different schools on Dominica, through repeated interviews with the students and their teachers. These interviews will be coupled with a structured course of hazard education in the Geography syllabus; the students not taking Geography will form a control group. To avoid distortion of our results arising from the developing verbalization and literacy skills of the students over the 5 years of the project, we have adapted the PRISM tool used in clinical practice to assess patient perceptions of illness and treatment (Buchi & Sensky, 1999). This novel measure is essentially non-verbal, and uses spatial positions of moveable markers ("object" markers) on a board, relative to a fixed marker that represents the subject's "self", as a visual metaphor for the importance of the object to the subject. The subjects also explain their reasons for placing the markers as they have, to provide additional qualitative information. The PRISM method thus produces data on the perceptions measured on the board that can be subjected to statistical analysis, and also succinct qualitative data about each subject. Our study will gather data on participants' perceptions of different natural hazards, different sources of information about these, and organizations or individuals to whom they would go for help in a disaster, and investigate how these vary with geographical and social factors. To illustrate the method, which is generalisable, we present results from our initial interviews of the cohort of 11 year olds whom we will follow through their secondary school education. Büchi, S., & Sensky, T. (1999). PRISM: Pictorial Representation of Illness and Self Measure: a brief nonverbal measure of illness impact and therapeutic aid in psychosomatic medicine. Psychosomatics, 40(4), 314-320.
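
    Because the PRISM board yields marker coordinates, the perception data reduce naturally to self-to-object distances that can be analysed statistically. The sketch below shows that reduction with invented coordinates and hazard names; it is not the authors' scoring software.

    # Sketch: converting PRISM board placements into self-to-object distances.
    # Board coordinates, marker positions, and hazard names are invented.
    from math import hypot

    SELF = (0.0, 0.0)                      # fixed "self" marker position on the board (cm)

    # one student's placements of hazard "object" markers, as (x, y) in cm
    placements = {
        "hurricane": (5.0, 3.0),
        "earthquake": (18.0, 12.0),
        "landslide": (10.0, 2.0),
    }

    def prism_distance(marker, self_marker=SELF):
        """Smaller distance = greater perceived importance of that hazard."""
        return hypot(marker[0] - self_marker[0], marker[1] - self_marker[1])

    scores = {hazard: prism_distance(pos) for hazard, pos in placements.items()}
    for hazard, d in sorted(scores.items(), key=lambda kv: kv[1]):
        print(f"{hazard}: {d:.1f} cm from self")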

  16. The Puerto Rico Component of the National Tsunami Hazard and Mitigation Program Pr-Nthmp

    NASA Astrophysics Data System (ADS)

    Huerfano Moreno, V. A.; Hincapie-Cardenas, C. M.

    2014-12-01

    Tsunami hazard assessment, detection, warning, education and outreach efforts are intended to reduce losses to life and property. The Puerto Rico Seismic Network (PRSN) is participating in an effort with local and federal agencies to develop tsunami hazard risk reduction strategies under the National Tsunami Hazards and Mitigation Program (NTHMP). This grant supports the TsunamiReady program, which is the basis of tsunami preparedness and mitigation in PR. The Caribbean region has a documented history of damaging tsunamis that have affected coastal areas. The seismic water waves originating in the prominent fault systems around PR are considered to be a near-field hazard for Puerto Rico and the Virgin Islands (PR/VI) because they can reach coastal areas within a few minutes after the earthquake. Sources for local, regional and tele tsunamis have been identified and modeled, and tsunami evacuation maps were prepared for PR. These maps were generated in three phases: first, hypothetical tsunami scenarios were developed on the basis of the parameters of potential underwater earthquakes; secondly, each of these scenarios was simulated; the third step was to determine the worst case scenario (MOM). The run-ups were drawn on GIS-referenced maps and aerial photographs. These products are being used by emergency managers to educate the public and develop mitigation strategies. Online maps and related evacuation products are available to the public via the PR-TDST (PR Tsunami Decision Support Tool). Currently, all 44 coastal municipalities are recognized as TsunamiReady by the US NWS. The main goal of the program is to declare Puerto Rico as TsunamiReady, including two cities that are not coastal but could be affected by tsunamis. Based on these evacuation maps, tsunami signs were installed, vulnerability profiles were created, communication systems to receive and disseminate tsunami messages were installed in each TWFP, and tsunami response plans were approved. Also, the existing tsunami protocol and criteria in the PR/VI were updated. This paper describes the PR-NTHMP project, including the real-time earthquake and tsunami monitoring as well as the specific protocols used to broadcast tsunami messages. The paper highlights tsunami hazard assessment, detection, warning, education and outreach in Puerto Rico.

  17. Impact hazard mitigation: understanding the effects of nuclear explosive outputs on comets and asteroids

    SciTech Connect

    Clement, Ralph R C [Los Alamos National Laboratory; Plesko, Catherine S [Los Alamos National Laboratory; Bradley, Paul A [Los Alamos National Laboratory; Conlon, Leann M [Los Alamos National Laboratory

    2009-01-01

    The NASA 2007 white paper "Near-Earth Object Survey and Deflection Analysis of Alternatives" affirms deflection as the safest and most effective means of potentially hazardous object (PHO) impact prevention. It also calls for further studies of object deflection. In principle, deflection of a PHO may be accomplished by using kinetic impactors, chemical explosives, gravity tractors, solar sails, or nuclear munitions. Of the sudden impulse options, nuclear munitions are by far the most efficient in terms of yield-per-unit-mass launched and are technically mature. However, there are still significant questions about the response of a comet or asteroid to a nuclear burst. Recent and ongoing observational and experimental work is revolutionizing our understanding of the physical and chemical properties of these bodies (e.g., Ryan (2000), Fujiwara et al. (2006), and Jedicke et al. (2006)). This improved understanding of small solar-system bodies, combined with current state-of-the-art modeling and simulation capabilities, which have also improved dramatically in recent years, allows for a science-based, comprehensive study of PHO mitigation techniques. Here we present an examination of the effects of radiation from a nuclear explosion on potentially hazardous asteroids and comets through Monte Carlo N-Particle code (MCNP) simulation techniques. MCNP is a general-purpose particle transport code commonly used to model neutron, photon, and electron transport for medical physics, reactor design and safety, accelerator target and detector design, and a variety of other applications including modeling the propagation of epithermal neutrons through the Martian regolith (Prettyman 2002). It is a massively parallel code that can conduct simulations in 1-3 dimensions, with complicated geometries, and with extremely powerful variance reduction techniques. It uses current nuclear cross section data, where available, and fills in the gaps with analytical models where data are not available. MCNP has undergone extensive verification and validation and is considered the gold standard for particle transport (Forrest B. Brown, et al., "MCNP Version 5," Trans. Am. Nucl. Soc., 87, 273, November 2002). Additionally, a new simulation capability using MCNP has become available to this collaboration. The first results of this new capability will also be presented.
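
    To give a flavour of the Monte Carlo transport idea invoked here, the toy sketch below follows photons through a one-dimensional slab, sampling path lengths between interactions and tallying where energy is deposited. It is a conceptual illustration only; the mean free path, absorption probability, and geometry are invented, and it is in no way a stand-in for MCNP.

    # Toy Monte Carlo transport: photons enter a uniform slab and are tallied where
    # they are absorbed. All material properties are illustrative placeholders.
    import random

    mean_free_path_m = 0.15     # assumed average distance between interactions
    absorb_prob = 0.6           # assumed chance that an interaction absorbs the photon
    slab_depth_m = 1.0
    n_photons = 100_000

    deposition_depths = []
    for _ in range(n_photons):
        depth = 0.0
        while True:
            depth += random.expovariate(1.0 / mean_free_path_m)   # sample next interaction
            if depth > slab_depth_m:            # photon escapes the far side of the slab
                break
            if random.random() < absorb_prob:   # absorbed: energy deposited at this depth
                deposition_depths.append(depth)
                break
            # otherwise it scatters and, in this toy model, keeps moving forward

    absorbed = len(deposition_depths) / n_photons
    mean_depth = sum(deposition_depths) / len(deposition_depths)
    print(f"absorbed fraction: {absorbed:.3f}, mean deposition depth: {mean_depth:.3f} m")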

  18. Identification, prediction, and mitigation of sinkhole hazards in evaporite karst areas

    USGS Publications Warehouse

    Gutierrez, F.; Cooper, A.H.; Johnson, K.S.

    2008-01-01

    Sinkholes usually have a higher probability of occurrence and a greater genetic diversity in evaporite terrains than in carbonate karst areas. This is because evaporites have a higher solubility and, commonly, a lower mechanical strength. Subsidence damage resulting from evaporite dissolution generates substantial losses throughout the world, but the causes are only well understood in a few areas. To deal with these hazards, a phased approach is needed for sinkhole identification, investigation, prediction, and mitigation. Identification techniques include field surveys and geomorphological mapping combined with accounts from local people and historical sources. Detailed sinkhole maps can be constructed from sequential historical maps, recent topographical maps, and digital elevation models (DEMs) complemented with building-damage surveying, remote sensing, and high-resolution geodetic surveys. On a more detailed level, information from exposed paleosubsidence features (paleokarst), speleological explorations, geophysical investigations, trenching, dating techniques, and boreholes may help in investigating dissolution and subsidence features. Information on the hydrogeological pathways, including caves, springs, and swallow holes, is particularly important, especially when corroborated by tracer tests. These diverse data sources make a valuable database: the karst inventory. From this dataset, sinkhole susceptibility zonations (relative probability) may be produced based on the spatial distribution of the features and good knowledge of the local geology. Sinkhole distribution can be investigated by spatial distribution analysis techniques including studies of preferential elongation, alignment, and nearest neighbor analysis. More objective susceptibility models may be obtained by analyzing the statistical relationships between the known sinkholes and the conditioning factors. Chronological information on sinkhole formation is required to estimate the probability of occurrence of sinkholes (number of sinkholes/km² per year). Such spatial and temporal predictions, frequently derived from limited records and based on the assumption that past sinkhole activity may be extrapolated to the future, are non-corroborated hypotheses. Validation methods allow us to assess the predictive capability of the susceptibility maps and to transform them into probability maps. Avoiding the most hazardous areas by preventive planning is the safest strategy for development in sinkhole-prone areas. Corrective measures could be applied to reduce the dissolution activity and subsidence processes. A more practical solution for safe development is to reduce the vulnerability of the structures by using subsidence-proof designs. © 2007 Springer-Verlag.
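
    A common way to build the "more objective susceptibility models" mentioned above is to regress mapped sinkhole presence against conditioning factors. The sketch below does this with a logistic regression on synthetic data; the two factors and all values are hypothetical and only illustrate the workflow.

    # Illustrative sinkhole-susceptibility model: logistic regression of sinkhole
    # presence against two hypothetical conditioning factors (synthetic data).
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 500
    evaporite_thickness = rng.uniform(0, 50, n)    # m, hypothetical factor 1
    water_table_depth = rng.uniform(1, 30, n)      # m, hypothetical factor 2

    # synthetic "truth": thicker evaporites and shallower water tables favour sinkholes
    logit = 0.08 * evaporite_thickness - 0.15 * water_table_depth - 1.0
    sinkhole_present = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

    X = np.column_stack([evaporite_thickness, water_table_depth])
    model = LogisticRegression().fit(X, sinkhole_present)

    # relative susceptibility (probability of occurrence) for a new map cell
    cell = np.array([[35.0, 4.0]])                 # 35 m of evaporite, 4 m to water table
    print(f"modelled susceptibility: {model.predict_proba(cell)[0, 1]:.2f}")

    Converting such relative susceptibility into a probability of occurrence per unit area and time (sinkholes/km² per year) additionally requires the chronological record discussed in the abstract.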

  19. Probing Aircraft Flight Test Hazard Mitigation for the Alternative Fuel Effects on Contrails & Cruise Emissions (ACCESS) Research Team

    NASA Technical Reports Server (NTRS)

    Kelly, Michael J.

    2013-01-01

    The Alternative Fuel Effects on Contrails & Cruise Emissions (ACCESS) Project Integration Manager requested in July 2012 that the NASA Engineering and Safety Center (NESC) form a team to independently assess aircraft structural failure hazards associated with the ACCESS experiment and to identify potential flight test hazard mitigations to ensure flight safety. The ACCESS Project Integration Manager subsequently requested that the assessment scope be focused predominantly on structural failure risks to the aircraft empennage.

  20. Earth sciences, GIS and geomatics for natural hazards assessment and risks mitigation: a civil protection perspective

    NASA Astrophysics Data System (ADS)

    Perotti, Luigi; Conte, Riccardo; Lanfranco, Massimo; Perrone, Gianluigi; Giardino, Marco; Ratto, Sara

    2010-05-01

    Geo-information and remote sensing are appropriate tools to enhance functional strategies for increasing awareness of natural hazards and risks and for supporting research and operational activities devoted to disaster reduction. Improved Earth Sciences knowledge coupled with advanced Geomatics technologies has been developed by the joint research group and applied by the ITHACA (Information Technology for Humanitarian Assistance, Cooperation and Action) centre, within its partnership with the UN World Food Programme (WFP), with the goal of reducing human, social, economic and environmental losses due to natural hazards and related disasters. By cooperating with local and regional authorities (Municipalities, Centro Funzionale of the Aosta Valley, Civil Protection Agency of Regione Piemonte), data on natural hazards and risks have been collected, compared to national and global data, then interpreted for helping communities and civil protection agencies of sensitive mountain regions to make strategic choices and decisions for better mitigation and adaptation measures. To enhance the application of GIS and Remote-sensing technologies for geothematic mapping of geological and geomorphological risks of mountain territories of Europe and Developing Countries, research activities led to the collection and evaluation of data from scientific literature and historical technical archives, for the definition of predisposing/triggering factors and evolutionary processes of natural instability phenomena (landslides, floods, storms, …) and for the design and implementation of early-warning and early-impact systems. Geodatabases, Remote Sensing and Mobile-GIS applications were developed to perform analysis of: 1) a large climate-related disaster (Hurricane Mitch, Central America), by the application of remote sensing techniques, either for early warning or mitigation measures at the national and international scale; 2) the distribution of slope instabilities at the regional scale (Aosta Valley, NW-Italy), for prevention and recovery measures; 3) geological and geomorphological controlling factors of seismicity, to provide microzonation maps and scenarios for co-seismic response of unstable zones (Dronero, NW Italian Alps); 4) earthquake effects on ground and infrastructures, in order to register early assessments for situational awareness and to compile damage inventories (Asti-Alessandria seismic events, 2000, 2001, 2003). The research results have substantiated early warning models by structuring geodatabases on natural disasters, and have supported humanitarian relief and disaster management activities through the creation and testing of SRG2, a mobile-GIS application for field-data collection on natural hazards and risks.

  1. Topical issues on performance categorization of structures, systems and components for natural phenomena hazards mitigation

    SciTech Connect

    Hossain, Q.A.; Nelson, T.A.; Murray, R.C.

    1992-12-01

    Lawrence Livermore National Laboratory (LLNL), under contract to the United States Department of Energy (DOE), has been providing assistance in the development of criteria, standards, and guidelines for designing and evaluating DOE facilities subjected to natural phenomena hazards (NPHs). The NPH design/evaluation guideline document, UCRL-15910, is one of the documents that resulted from this assistance. Even though UCRL-15910 is referenced in the General Design Criteria DOE 6430.1A, it is not uniformly applied in the design/evaluation process of DOE facilities. To achieve uniform application of this document, and also to provide a comprehensive NPH mitigation program, an order, DOE 5480.NPH, has recently been developed that requires placing structures, systems, and components (SSCs) comprising DOE facilities into five performance categories (PC) based on importance, mission, cost, and safety considerations. DOE 5480.NPH refers to a standard, DOE-STD-1021-92 (under development), that will provide criteria and guidelines for the selection of SSC performance categories. An interim version of this standard has been recently proposed for trial use. The details of the topical issues that were considered in developing this proposed standard, as well as the issues that should be considered before the standard is finalized, are discussed and presented in this report. Facilities owned, operated or administered by DOE vary widely in mission, complexity, and hazard potential. NPH mitigation of these facilities involves an array of rules, policies, orders, and standards, which must be considered in the development of performance categorization criteria. The interrelationship among these documents, as these relate to SSC performance categorization, is discussed in Section 2.0 of this report.

  2. Evaluation Of Risk And Possible Mitigation Schemes For Previously Unidentified Hazards

    NASA Technical Reports Server (NTRS)

    Linzey, William; McCutchan, Micah; Traskos, Michael; Gilbrech, Richard; Cherney, Robert; Slenski, George; Thomas, Walter, III

    2006-01-01

    This report presents the results of arc track testing conducted to determine if such a transfer of power to un-energized wires is possible and/or likely during an arcing event, and to evaluate an array of protection schemes that may significantly reduce the possibility of such a transfer. The results of these experiments may be useful for determining the level of protection necessary to guard against spurious voltage and current being applied to safety critical circuits. It was not the purpose of these experiments to determine the probability of the initiation of an arc track event, only whether, if an initiation did occur, it could cause the undesired event: an inadvertent thruster firing. The primary wire insulation used in the Orbiter is aromatic polyimide, or Kapton, a construction known to arc track under certain conditions [3]. Previous Boeing testing has shown that arc tracks can initiate in aromatic polyimide insulated 28 volts direct current (VDC) power circuits using more realistic techniques such as chafing with an aluminum blade (simulating the corner of an avionics box or lip of a wire tray), or vibration of an aluminum plate against a wire bundle [4]. Therefore, an arc initiation technique was chosen that provided a reliable and consistent technique of starting the arc and not a realistic simulation of a scenario on the vehicle. Once an arc is initiated, the current, power and propagation characteristics of the arc depend on the power source, wire gauge and insulation type, circuit protection and series resistance rather than type of initiation. The initiation method employed for these tests was applying an oil and graphite mixture to the ends of a powered twisted pair wire. The flight configuration of the heater circuits, the fuel/oxidizer (or ox) wire, and the RCS jet solenoid were modeled in the test configuration so that the behavior of these components during an arcing event could be studied. To determine if coil activation would occur with various protection wire schemes, 145 tests were conducted using various fuel/ox wire alternatives (shielded and unshielded) and/or different combinations of polytetrafluoroethylene (PTFE), Mystik tape and convoluted wraps to prevent unwanted coil activation. Test results were evaluated along with other pertinent data and information to develop a mitigation strategy for an inadvertent RCS firing. The SSP evaluated civilian aircraft wiring failures to search for aging trends in assessing the wire-short hazard. Appendix 2 applies Weibull statistical methods to the same data with a similar purpose.

  3. The Identification of Filters and Interdependencies for Effective Resource Allocation: Coupling the Mitigation of Natural Hazards to Economic Development.

    NASA Astrophysics Data System (ADS)

    Agar, S. M.; Kunreuther, H.

    2005-12-01

    Policy formulation for the mitigation and management of risks posed by natural hazards requires that governments confront difficult decisions for resource allocation and be able to justify their spending. Governments also need to recognize when spending offers little improvement and the circumstances in which relatively small amounts of spending can make substantial differences. Because natural hazards can have detrimental impacts on local and regional economies, patterns of economic development can also be affected by spending decisions for disaster mitigation. This paper argues that by mapping interdependencies among physical, social and economic factors, governments can improve resource allocation to mitigate the risks of natural hazards while improving economic development on local and regional scales. Case studies of natural hazards in Turkey have been used to explore specific "filters" that act to modify short- and long-term outcomes. Pre-event filters can prevent an event from becoming a natural disaster or change a routine event into a disaster. Post-event filters affect both short and long-term recovery and development. Some filters cannot be easily modified by spending (e.g., rural-urban migration) but others (e.g., land-use practices) provide realistic spending targets. Net social benefits derived from spending, however, will also depend on the ways by which filters are linked, or so-called "interdependencies". A single weak link in an interdependent system, such as a power grid, can trigger a cascade of failures. Similarly, weak links in social and commercial networks can send waves of disruption through communities. Conversely, by understanding the positive impacts of interdependencies, spending can be targeted to maximize net social benefits while mitigating risks and improving economic development. Detailed information on public spending was not available for this study but case studies illustrate how networks of interdependent filters can modify social benefits and costs. For example, spending after the 1992 Erzincan earthquake targeted local businesses but limited alternative employment, labor losses and diminished local markets all contributed to economic stagnation. Spending after the 1995 Dinar earthquake provided rent subsidies, supporting a major exodus from the town. Consequently many local people were excluded from reconstruction decisions and benefits offered by reconstruction funds. After the 1999 Marmara earthquakes, a 3-year economic decline in Yalova illustrates the vulnerability of local economic stability to weak regulation enforcement by a few agents. A resource allocation framework indicates that government-community relations, lack of economic diversification, beliefs, and compensation are weak links for effective spending. Stronger positive benefits could be achieved through spending to target land-use regulation enforcement, labor losses, time-critical needs of small businesses, and infrastructure. While the impacts of the Marmara earthquakes were devastating, strong commercial networks and international interests helped to re-establish the regional economy. Interdependencies may have helped to drive a recovery. Smaller events in eastern Turkey, however, can wipe out entire communities and can have long-lasting impacts on economic development. These differences may accelerate rural to urban migration and perpetuate regional economic divergence in the country. 1: Research performed in the Wharton MBA Program, Univ. of Pennsylvania.

  4. Web-Based Geospatial Tools to Address Hazard Mitigation, Natural Resource Management, and Other Societal Issues

    USGS Publications Warehouse

    Hearn, Paul P.

    2009-01-01

    Federal, State, and local government agencies in the United States face a broad range of issues on a daily basis. Among these are natural hazard mitigation, homeland security, emergency response, economic and community development, water supply, and health and safety services. The U.S. Geological Survey (USGS) helps decision makers address these issues by providing natural hazard assessments, information on energy, mineral, water and biological resources, maps, and other geospatial information. Increasingly, decision makers at all levels are challenged not by the lack of information, but by the absence of effective tools to synthesize the large volume of data available, and to utilize the data to frame policy options in a straightforward and understandable manner. While geographic information system (GIS) technology has been widely applied to this end, systems with the necessary analytical power have been usable only by trained operators. The USGS is addressing the need for more accessible, manageable data tools by developing a suite of Web-based geospatial applications that will incorporate USGS and cooperating partner data into the decision making process for a variety of critical issues. Examples of Web-based geospatial tools being used to address societal issues follow.

  5. Challenges in understanding, modelling, and mitigating Lake Outburst Flood Hazard: experiences from Central Asia

    NASA Astrophysics Data System (ADS)

    Mergili, Martin; Schneider, Demian; Andres, Norina; Worni, Raphael; Gruber, Fabian; Schneider, Jean F.

    2010-05-01

    Lake Outburst Floods can evolve from complex process chains like avalanches of rock or ice that produce flood waves in a lake which may overtop and eventually breach glacial, morainic, landslide, or artificial dams. Rising lake levels can lead to progressive incision and destabilization of a dam, to enhanced ground water flow (piping), or even to hydrostatic failure of ice dams which can cause sudden outflow of accumulated water. These events often have a highly destructive potential because a large amount of water is released in a short time, with a high capacity to erode loose debris, leading to a powerful debris flow with a long travel distance. The best-known example of a lake outburst flood is the Vajont event (Northern Italy, 1963), where a landslide rushed into an artificial lake which spilled over and caused a flood leading to almost 2000 fatalities. Hazards from the failure of landslide dams are often (not always) fairly manageable: most breaches occur in the first few days or weeks after the landslide event and the rapid construction of a spillway - though problematic - has solved some hazardous situations (e.g. in the case of the Hattian landslide in 2005 in Pakistan). Older dams, like Usoi dam (Lake Sarez) in Tajikistan, are usually fairly stable, though landslides into the lakes may create flood waves that overtop and eventually weaken the dams. The analysis and the mitigation of glacial lake outburst flood (GLOF) hazard remains a challenge. A number of GLOFs resulting in fatalities and severe damage have occurred during the previous decades, particularly in the Himalayas and in the mountains of Central Asia (Pamir, Tien Shan). The source area is usually far away from the area of impact and events occur at very long intervals or as singularities, so that the population at risk is usually not prepared. Even though potentially hazardous lakes can be identified relatively easily with remote sensing and field work, modeling and prediction of GLOFs (and also the outburst of landslide-dammed lakes) remains a challenge: • The knowledge about the onset of the process is often limited (bathymetry of the lakes, subsurface water, properties of dam (content of ice), type of dam breach, understanding of process chains and interactions). • The size of glacial lakes may change rapidly but continuously, and many lakes break out within a short time after their development. Continuous monitoring is therefore required to keep updated on the existing hazards. • Also the outburst of small glacial lakes may lead to significant debris floods or even debris flows if there is plenty of erodible material available. • The available modeling software packages are of limited suitability for lake outburst floods: e.g. software developed by the hydrological community is specialized to simulate (debris) floods with input hydrographs on moderately steep flow channels and with lower sediment loads. In contrast to this, programs for rapid mass movements are better suited to steeper slopes and a sudden onset of movement. The typical characteristics of GLOFs are in between and vary for different channel sections. In summary, the major bottlenecks remain in deriving realistic or worst case scenarios and predicting their magnitude and area of impact. 
This mainly concerns uncertainties in the dam break process, involved volumes, erosion rates, changing rheologies, and the limited capabilities of available software packages to simulate process interactions and transformations such as the development of a hyperconcentrated flow into a debris flow. In addition, many areas prone to lake outburst floods are located in developing countries with a limited scope of the threatened population for decision-making and limited resources for mitigation.
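
    One concrete starting point for the worst-case magnitude estimates mentioned in this record is an empirical regression of peak outflow against the released volume and dam height. The sketch below uses a generic power-law form with placeholder coefficients; it is not a published regression and the numbers carry no calibrated meaning.

    # Generic peak-discharge sketch for a lake outburst scenario.
    # The coefficients a and b are placeholders, not calibrated regression values.
    def peak_discharge_m3s(volume_m3: float, dam_height_m: float,
                           a: float = 0.01, b: float = 0.6) -> float:
        """Peak outflow estimated from the potential-energy proxy V * H."""
        return a * (volume_m3 * dam_height_m) ** b

    # hypothetical moraine-dammed lake: 5 million m^3 stored behind a 30 m dam
    q_peak = peak_discharge_m3s(5e6, 30.0)
    print(f"indicative peak discharge: {q_peak:.0f} m^3/s")

    Such scoping estimates would then feed the downstream flood or debris-flow routing whose software limitations the abstract describes.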

  6. Environmental legislation as the legal framework for mitigating natural hazards in Spain

    NASA Astrophysics Data System (ADS)

    Garrido, Jesús; Arana, Estanislao; Jiménez Soto, Ignacio; Delgado, José

    2015-04-01

    In Spain, the socioeconomic losses due to natural hazards (floods, earthquakes or landslides) are considerable, and the indirect costs associated with them are rarely considered because they are very difficult to evaluate. The prevention of losses due to natural hazards is more economical and efficient through legislation and spatial planning than through structural measures, such as walls, anchorages or structural reinforcements. However, there is no Spanish natural hazards law, and national and regional sector legislation makes only sparse mention of them. After 1978, when the Spanish Constitution was enacted, the Autonomous Communities (Spanish regions) were able to legislate according to the different competences (urban planning, environment or civil protection), which were established in the Constitution. In the 1990s, the Civil Protection legislation (national law and regional civil protection tools) dealt specifically with natural hazards (floods, earthquakes and volcanoes), but this was before any soil, seismic or hydrological studies were recommended in the national sector legislation. On the other hand, some Autonomous Communities referred to natural hazards in the Environmental Impact Assessment legislation (EIA) and also in the spatial and urban planning legislation and tools. The National Land Act, enacted in 1998, established, for the first time, that those lands exposed to natural hazards should be classified as non-developable. The Spanish recast text of the Land Act, enacted by Royal Legislative Decree 2/2008, requires that a natural hazards map be included in the Environmental Sustainability Report (ESR), which is compulsory for all master plans, according to the provisions set out by Act 9/2006, known as Spanish Strategic Environmental Assessment (SEA). Consequently, the environmental legislation, after the aforementioned transposition of the SEA European Directive 2001/42/EC, is the legal framework to prevent losses due to natural hazards through land use planning. However, most of the Spanish master plans approved after 2008 do not include a natural hazards map and/or do not classify the areas exposed to natural hazards as non-developable. Restrictions or prohibitions for building in natural hazard-prone areas are not usually established in the master plans. According to the jurisprudence, the environmental legislation prevails over spatial and urban planning regulations. On the other hand, the precedence of the national competence in public security would allow reclassification of the land, independently of the political or economic motivations of the municipal government. Despite the technical building code and the seismic building code, in which some recommendations for avoiding "geotechnical" or seismic hazards are established, there are no compulsory guidelines for technical studies or hazard maps for floods or landslides. The current legislation should be improved from a technical point of view, and mechanisms for enforcing the law should also be considered.

  7. A review of accidents, prevention and mitigation options related to hazardous gases

    SciTech Connect

    Fthenakis, V.M.

    1993-05-01

    Statistics on industrial accidents are incomplete due to the lack of specific criteria on what constitutes a release or accident. In the United States, most major industrial accidents have been related to explosions and fires of flammable materials, not to releases of chemicals into the environment. An EPA study of 6,928 accidental releases of toxic chemicals revealed that accidents at stationary facilities accounted for 75% of the total number of releases, and transportation accidents for the other 25%. About 7% of all reported accidents (468 cases) resulted in 138 deaths and 4,717 injuries ranging from temporary respiratory problems to critical injuries. In-plant accidents accounted for 65% of the casualties. The most efficient strategy to reduce hazards is to choose technologies that do not require the use of large quantities of hazardous gases. For new technologies this approach can be implemented early in development, before large financial resources and efforts are committed to specific options. Once specific materials and options have been selected, strategies to prevent accident-initiating events need to be evaluated and implemented. The next step is to implement safety options that suppress a hazard when an accident-initiating event occurs. Releases can be prevented or reduced with fail-safe equipment and valves, adequate warning systems, and controls to reduce and interrupt gas leakage. If an accident occurs and safety systems fail to contain a hazardous gas release, engineering control systems are relied on to reduce or minimize environmental releases. If a hazardous gas is released in spite of the previous strategies, preventing human exposure and its consequences forms the final defensive barrier. Nearby medical facilities that can accommodate victims of the worst-case accident can reduce the consequences of personnel exposure to hazardous gases.

  8. Integrated Tsunami Data Supports Forecast, Warning, Research, Hazard Assessment, and Mitigation (Invited)

    NASA Astrophysics Data System (ADS)

    Dunbar, P. K.; Stroker, K. J.

    2009-12-01

    With nearly 230,000 fatalities, the 26 December 2004 Indian Ocean tsunami was the deadliest tsunami in history, illustrating the importance of developing basinwide warning systems. Key to creating these systems is easy access to quality-controlled, verified data on past tsunamis. It is essential that warning centers, emergency managers, and modelers can determine if and when similar events have occurred. Following the 2004 tsunami, the National Oceanic and Atmospheric Administration’s (NOAA) National Geophysical Data Center (NGDC) began examining all aspects of the tsunami data archive to help answer questions regarding the frequency and severity of past tsunamis. Historical databases span insufficient time to reveal a region’s full tsunami hazard, so a global database of citations to articles on tsunami deposits was added to the archive. NGDC further expanded the archive to include high-resolution tide gauge data, deep-ocean sensor data, and digital elevation models used for propagation and inundation modeling. NGDC continuously reviews the data for accuracy, making modifications as new information is obtained. These added databases allow NGDC to provide the tsunami data necessary for warning guidance, hazard assessments, and mitigation efforts. NGDC is also at the forefront of standards-based Web delivery of integrated science data through a variety of tools, from Web-form interfaces to interactive maps. The majority of the data in the tsunami archive are discoverable online. Scientists, journalists, educators, planners, and emergency managers are among the many users of these public domain data, which may be used without restriction provided that users cite data sources.

  9. Volcanic hazard in Mexico: a comprehensive on-line database for risk mitigation

    NASA Astrophysics Data System (ADS)

    Manea, Marina; Constantin Manea, Vlad; Capra, Lucia; Bonasia, Rosanna

    2013-04-01

    Researchers are currently working on several key aspects of the Mexican volcanoes, such as remote sensing, field data on old and recent volcaniclastic deposits, structural framework, monitoring (rainfall data and visual observation of lahars), and laboratory experiments and numerical simulations (analogue models, fall3D, titan2D). Each investigation is focused on specific processes, but it is fundamental to visualize the global status of a volcano in order to understand its behavior and to mitigate future hazards. The Mexican Volcanoes @nline represents a novel initiative aimed at collecting, on a systematic basis, the complete set of data obtained so far on the volcanoes, and at continuously updating the database with new data. All the information is compiled from published works and updated frequently. Maps, such as the geological maps of the Mexican volcanoes and the associated hazard zonations, as well as point data, such as stratigraphic sections, sedimentology and diagrams of rainfall intensities, are presented in Google Earth format in order to be easily accessed by the scientific community and the general public. An important section of this online database is the presentation of numerical simulation results for ash dispersion associated with the principal Mexican active volcanoes. Daily predictions of ash dispersion (based on real-time data from CENAPRED and the Mexican Meteorological Service), as well as large-scale high-resolution subduction simulations performed on HORUS (the Computational Geodynamics Laboratory's supercomputer), represent a central part of the Mexican Volcanoes @nline database. The Mexican Volcanoes @nline database is maintained by the Computational Geodynamics Laboratory and is based entirely on Open Source software. The website can be visited at: http://www.geociencias.unam.mx/mexican_volcanoes.

  10. Hazards

    NSDL National Science Digital Library

    USGS provides detailed fact sheets, research reports and case studies of coastal erosion caused by El Nino events in California. Other natural hazards are also covered, including earthquakes, tsunami, and landslides, illustrated with maps, photos, animated simulations, extensive links to resources by experts. Sort by topic to see all entries about coastal issues, environmental issues, mapping and data, natural hazards, and other subjects.

  11. Past, Present, and Future Challenges in Earthquake Hazard Mitigation of Indonesia: A Collaborative Work of Geological Agency Indonesia and Geoscience Australia

    NASA Astrophysics Data System (ADS)

    Hidayati, S.; Cummins, P. R.; Cipta, A.; Omang, A.; Griffin, J.; Horspool, N.; Robiana, R.; Sulaeman, C.

    2012-12-01

    In the last decade, Indonesia has suffered repeatedly from earthquake disasters: four of the twelve earthquakes worldwide that caused more than 1,000 casualties occurred in Indonesia. The great Sumatra earthquake of December 26, 2004, followed by a tsunami that cost 227,898 lives, brought Indonesia and its active tectonic setting to the world's attention. The government of Indonesia therefore encourages hazard mitigation efforts focused on the pre-disaster phase. In response to government policy on earthquake disaster mitigation, the Geological Agency of Indonesia aims to provide rigorous earthquake hazard maps at provincial scale for the whole country by 2014. A collaborative effort with Geoscience Australia, through short-term training missions, on-going training, mentoring, assistance, and study in Australia under the auspices of the Australia-Indonesia Facility for Disaster Reduction (AIFDR), has accelerated the production of these maps. Since the collaboration began in 2010, provincial earthquake hazard maps of Central Java (2010) and of West Sulawesi, Gorontalo, and North Maluku (2011) have been published using the probabilistic seismic hazard assessment (PSHA) method. In 2012, by the same method, maps for the remaining provinces of Sulawesi Island, Papua, North Sumatera, and Jambi will be published. By the end of 2014, hazard maps for all 33 Indonesian provinces will be delivered. The future challenges are to work together with stakeholders, to produce district-scale maps, and to establish a national standard for earthquake hazard maps. Moreover, the most important consideration is to build the capacity to update, maintain, and revise the maps as new information becomes available.
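    A minimal illustration of the probabilistic seismic hazard assessment (PSHA) method named in this record; it is not a reproduction of the Indonesian study's source model or ground-motion relations. The sketch below combines an invented truncated Gutenberg-Richter recurrence for a single source at a fixed distance with a toy lognormal ground-motion relation to produce annual exceedance rates for a few peak-ground-acceleration levels (a single-site hazard curve).

    # Minimal single-source PSHA sketch (illustrative values only).
    import numpy as np
    from scipy.stats import norm

    # Truncated Gutenberg-Richter recurrence: rate of M >= 5, b-value, magnitude bounds.
    rate_m_min, b_value, m_min, m_max = 0.2, 1.0, 5.0, 8.0
    mags = np.linspace(m_min, m_max, 200)
    beta = b_value * np.log(10.0)
    # Probability density of magnitude for the truncated exponential model.
    pdf_m = beta * np.exp(-beta * (mags - m_min)) / (1.0 - np.exp(-beta * (m_max - m_min)))

    distance_km = 50.0  # assumed site-to-source distance

    def toy_gmpe(m, r_km):
        """Toy ground-motion model: mean ln(PGA in g) and its lognormal sigma."""
        return -2.0 + 0.8 * m - 1.3 * np.log(r_km + 10.0), 0.6

    mean_ln, sigma = toy_gmpe(mags, distance_km)
    dm = mags[1] - mags[0]
    for pga in [0.05, 0.1, 0.2, 0.4]:
        # P(PGA > pga | M) from the lognormal GMPE, integrated over the magnitude pdf.
        p_exceed = norm.sf((np.log(pga) - mean_ln) / sigma)
        annual_rate = rate_m_min * np.sum(p_exceed * pdf_m) * dm
        print(f"PGA > {pga:4.2f} g : {annual_rate:.3e} per year")

    A provincial-scale map repeats this kind of integration over many sources, distances, and ground-motion models, and contours the resulting hazard values over a grid of sites.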

  12. Looking Before We Leap: Recent Results From An Ongoing Quantitative Investigation Of Asteroid And Comet Impact Hazard Mitigation.

    NASA Astrophysics Data System (ADS)

    Plesko, Catherine; Weaver, R. P.; Korycansky, D. G.; Huebner, W. F.

    2010-10-01

    The asteroid and comet impact hazard is now part of public consciousness, as demonstrated by movies, Super Bowl commercials, and popular news stories. However, there is a popular misconception that hazard mitigation is a solved problem. Many people think, "we'll just nuke it." There are, however, significant scientific questions remaining in the hazard mitigation problem. Before we can say with certainty that an explosive yield Y at height of burst h will produce a momentum change in or dispersion of a potentially hazardous object (PHO), we need to quantify how and where energy is deposited into the rubble pile or conglomerate that may make up the PHO. We then need to understand how shock waves propagate through the system, what causes them to disrupt, and how long gravitationally bound fragments take to recombine. Here we present numerical models of energy deposition from an energy source into various materials that are known PHO constituents, and rigid body dynamics models of the recombination of disrupted objects. In the energy deposition models, we explore the effects of porosity and standoff distance as well as that of composition. In the dynamical models, we explore the effects of fragment size and velocity distributions on the time it takes for gravitationally bound fragments to recombine. Initial models indicate that this recombination time is relatively short, as little as 24 hours for a 1 km sized PHO composed of 1000 meter-scale self-gravitating fragments with an initial velocity field of v/r = 0.001 1/s.
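    As an order-of-magnitude cross-check of the quoted 24 hour recombination time, the outermost fragment can be treated as being on a radial Keplerian orbit in the gravity field of the whole fragment cloud. This is only a back-of-envelope estimate, not the study's rigid-body dynamics model, and the bulk density used here is an assumption.

    # Back-of-envelope fall-back time for the outermost fragment of a disrupted PHO.
    import math

    G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
    radius = 500.0     # parent-body radius, m (1 km sized PHO)
    density = 2000.0   # assumed bulk density, kg/m^3
    omega = 1.0e-3     # initial velocity field v/r from the abstract, 1/s

    mass = density * (4.0 / 3.0) * math.pi * radius**3  # total cloud mass, kg
    mu = G * mass                                       # gravitational parameter, m^3/s^2

    v0 = omega * radius                   # launch speed of the outermost fragments, m/s
    energy = 0.5 * v0**2 - mu / radius    # specific orbital energy, J/kg (negative if bound)
    if energy >= 0.0:
        raise SystemExit("Outer fragments are unbound; no gravitational recombination.")

    a = -mu / (2.0 * energy)                       # semi-major axis of the radial orbit, m
    period = 2.0 * math.pi * math.sqrt(a**3 / mu)  # Kepler period ~ fall-back timescale, s
    print(f"estimated recombination timescale ~ {period / 3600.0:.1f} hours")

    With these assumed values the script prints roughly 24 hours, consistent with the recombination time quoted in the record.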

  13. India’s Initiative in Mitigating Tsunami Hazard & Tsunami Potential in Northern Bay of Bengal (Invited)

    NASA Astrophysics Data System (ADS)

    Gupta, H. K.

    2009-12-01

    Soon after the most devastating tsunami in history, caused by the 26 December 2004 Sumatra earthquake, India took the initiative to set up an end-to-end system to mitigate tsunami and storm surge hazards. The system includes all the necessary elements: networking of seismic stations; deployment of ocean bottom pressure recorders; real-time sea level monitoring stations; establishment of radar-based monitoring stations for real-time measurement of surface currents and waves; modeling of tsunamis and storm surges; generation of coastal inundation and vulnerability maps; operation of a tsunami and storm surge warning centre on a 24×7 basis; capacity building and training of all the stakeholders; and communication with the global community. This initiative was estimated to have a direct cost of US $30 million and was to be operational by August 2007; this has been achieved. The Indian National Centre for Ocean Information and Services (INCOIS), belonging to the Ministry of Earth Sciences (MoES), Government of India, and located at Hyderabad, is the nodal agency for this program. The system is functioning well. We also examine the tsunami potential in the northern Bay of Bengal, where a large coastal population (about 100 million) makes the region very vulnerable if a large tsunami were to occur. It is observed that: i) oblique plate motion characterizes the region, resulting in strike-slip dominated earthquakes with low tsunami-generating potential; ii) in the northern Bay of Bengal, the deformation front associated with the plate boundary between the India and Sunda plates is either landward or in shallow water in the Arakan region, and therefore a great earthquake will not displace large amounts of water and cause a major tsunami; and iii) there is no evidence of the region having been affected by a large tsunami in the past 2000 years. We therefore conclude that although a great earthquake could occur in the Arakan region, it would not generate a large tsunami in the northern Bay of Bengal.

  14. Natural Hazards and Effects on Local Populations: Applications of NSF MARGINS research to hazards mitigation in Central America

    E-print Network

    Marshall, Jeffrey S.

    Jeffrey S. Marshall, Geological Sciences Department, Cal Poly Pomona University. National Science Foundation MARGINS Program Workshop Report, Central America. The combined focus of two MARGINS research initiatives in Central America (Subduction Factory and SEIZE

  15. PREDICTION/MITIGATION OF SUBSIDENCE DAMAGE TO HAZARDOUS WASTE LANDFILL COVERS

    EPA Science Inventory

    Characteristics of Resource Conservation and Recovery Act hazardous waste landfills and of landfilled hazardous wastes have been described to permit development of models and other analytical techniques for predicting, reducing, and preventing landfill settlement and related cove...

  16. Assessment and mitigation of risk from low-probability, high-consequence hazards

    Microsoft Academic Search

    Bruce R. Ellingwood

    2007-01-01

    Modern probabilistic risk assessment of civil infrastructure supports risk mitigation policy by providing insight into the factors that govern performance of civil infrastructure subjected to severe events beyond the design basis and the relative efficiency of various options for risk mitigation. A fully-coupled risk assessment of a system provides estimates of the annual probability of exceeding pre-defined performance levels, defined
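    The record above is truncated, but the fully coupled risk assessment it describes can be sketched numerically: the annual rate of exceeding a performance level is obtained by convolving a fragility curve P(exceedance | intensity) with the annual hazard curve. All numbers below are invented for illustration and are not from the cited work.

    # Discrete convolution of a hazard curve with a fragility curve (illustrative values).
    import numpy as np

    intensities = np.array([0.1, 0.2, 0.4, 0.8, 1.6])             # e.g. spectral acceleration, g
    annual_exceedance = np.array([1e-1, 3e-2, 8e-3, 1e-3, 1e-4])  # mean annual rate of exceeding each intensity
    fragility = np.array([0.001, 0.01, 0.10, 0.45, 0.90])         # P(performance level exceeded | intensity)

    # Occurrence rate of each intensity bin = difference of the exceedance curve.
    occurrence_rate = -np.diff(np.append(annual_exceedance, 0.0))
    annual_rate = float(np.sum(fragility * occurrence_rate))
    print(f"annual rate of exceeding the performance level ~ {annual_rate:.2e} per year")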

  17. Seismicity and seismotectonics of southern Ghana: lessons for seismic hazard mitigation

    NASA Astrophysics Data System (ADS)

    Amponsah, Paulina

    2014-05-01

    Ghana is located on the West African craton, far from the major earthquake zones of the world, and is therefore largely considered a stable region. However, the southern part of the country is seismically active, and records of damaging earthquakes in Ghana date as far back as 1615. A study of the microseismic activity in southern Ghana shows that the seismicity is linked with active faulting between the east-west trending Coastal boundary fault and the northeast-southwest trending Akwapim fault zone. The epicentres of most of the earthquakes have been located close to the area where the two major faults intersect, which can be related to the level of activity of the faults. Some epicentres have been located offshore and can be associated with the level of activity of the Coastal boundary fault. A review of the geological and instrumental recordings of earthquakes in Ghana shows that earthquakes have occurred in the past and are still liable to occur within the vicinity of the intersection of the Akwapim fault zone and the Coastal boundary fault. Data from both historical and instrumental records indicate that the most seismically active area in Ghana is west of Accra, where the Akwapim fault zone and the Coastal boundary fault intersect. There are numerous minor faults in the intersection area between the Akwapim fault zone and the Coastal boundary fault, and this mosaic of faults has major implications for seismic activity in the area. Earthquake disaster mitigation measures have been put in place in recent times to reduce the impact of any major event that may occur in the country. The National Disaster Management Organization has produced a building guide to assist in the mitigation of earthquake and flood disasters in the country. The building guide clearly stipulates the kinds of materials to be used and their proportions, what should go into the foundations of one- or two-storey buildings, the electrical materials to be used, and other requirements.

  18. Evaluation and mitigation of lightning hazards to the space shuttle Solid Rocket Motors (SRM)

    NASA Technical Reports Server (NTRS)

    Rigden, Gregory J.; Papazian, Peter B.

    1988-01-01

    The objective was to quantify electric field strengths in the Solid Rocket Motor (SRM) propellant in the event of a worst case lightning strike. Using transfer impedance measurements for selected lightning protection materials and 3D finite difference modeling, a retrofit design approach for the existing dielectric grain cover and railcar covers was evaluated and recommended for SRM segment transport. A safe level of 300 kV/m was determined for the propellant. The study indicated that a significant potential hazard exists for unprotected segments during rail transport. However, modified railcar covers and grain covers are expected to prevent lightning attachment to the SRM and to reduce the levels to several orders of magnitude below 300 kV/m.

  19. Pulsed Electric Processing of the Seismic-Active Fault for Earthquake Hazard Mitigation

    NASA Astrophysics Data System (ADS)

    Novikov, V. A.; Zeigarnik, V. A.; Konev, Yu. B.; Klyuchkin, V. N.

    2010-03-01

    Previous field and laboratory investigations performed in Russia (1999-2008) showed the possibility of using high-power electric current pulses, generated by a pulsed MHD power system, to trigger weak seismicity and release tectonic stresses in the Earth's crust for earthquake hazard mitigation. The mechanism of the influence of man-made electromagnetic fields on regional seismicity is not yet clear. One possible cause of the phenomenon may be the formation of cracks in the rocks under increasing fluid pressure due to Joule heat generated by the electric current injected into the crust. A detailed 3D calculation of the electric current density in the crust of Northern Tien Shan, produced by a pulsed MHD power system connected to a grounded electric dipole, showed that at the depth of earthquake epicenters (>5 km) the current density is lower than 10⁻⁷ A/m², which is not sufficient to increase the pressure in the fluid-saturated porous geological medium through Joule heating and thus to drive crack formation, fault propagation, and release of tectonic stresses. Nevertheless, under certain conditions, when the current is injected into the fault through the casing pipes of deep wells after preliminary injection of a conductive fluid into the fault, the current density may be high enough to significantly increase the mechanical pressure in the porous two-phase geological medium. A numerical analysis of crack formation triggered by high-power electric pulses, based on the generation of mechanical pressure in the geological medium, was carried out. It was shown that the mechanical pressure impulse due to a high-power electric current in a porous two-phase medium may be calculated neglecting thermal conduction, by solving the non-stationary equation of piezo-conductivity with Joule heat generation. For the calculation of heat generation, the known solution for current spreading from a spherical or elliptical electrode submerged in an unbounded medium is used. The pressure increase due to the electric current is determined by the voltage of the current source and the medium parameters, and it does not depend on the electrode radius. The pressure increase is proportional to the parameter ησ/r², where η is the viscosity of the pore fluid, σ is its electric conductivity, and r is the average radius of the capillaries; this parameter may vary by many orders of magnitude for different media and pore fluids. The pressure increase for water is insignificant, but if a highly mineralized fluid (e.g. sludge) is injected into the medium, the pressure may be increased by several orders of magnitude. In that case the pressure may reach tens of kilobars, and growth and coalescence of cracks become possible in the medium exposed to the high-power electric current. An estimation of the parameters of a portable pulsed power system for electric processing of the fault was performed for the case in which the current is injected into the fault through the casing tubes of deep wells after preliminary injection of a conductive fluid into the fault between the wells. The work is supported by grant No. 09-05-12059 of the Russian Foundation for Basic Research.

  20. Studies Update Vinyl Chloride Hazards.

    ERIC Educational Resources Information Center

    Rawls, Rebecca

    1980-01-01

    Extensive study affirms that vinyl chloride is a potent animal carcinogen. Epidemiological studies show elevated rates of human cancers in association with extended contact with the compound. (Author/RE)

  1. Hawaiian cultural influences on support for lava flow hazard mitigation measures during the January 1960 eruption of Kīlauea volcano, Kapoho, Hawai‘i

    NASA Astrophysics Data System (ADS)

    Gregg, C. E.; Houghton, B. F.; Paton, D.; Swanson, D. A.; Lachman, R.; Bonk, W. J.

    2008-05-01

    In 1960, Kīlauea volcano in Hawaii erupted, destroying most of the village of Kapoho and forcing evacuation of its approximately 300 residents. A large and unprecedented social science survey was undertaken during the eruption to develop an understanding of human behavior, beliefs, and coping strategies among the adult evacuees (n = 160). Identical studies were also performed in three control towns located at varying distances from the eruption site (n = 478). During these studies, data were collected that characterized ethnic grouping and attitudes toward Hawaiian cultural issues, such as belief in Pele, and toward two lava flow mitigation measures (the use of barriers and of bombs to influence the flow of lava), but the data were never published. Using these forgotten data, we examined the relationship between Hawaiian cultural issues and attitudes toward the use of barriers and bombs as mitigation strategies to protect Kapoho. On average, 72% of respondents favored the construction of earthen barriers to hold back or divert lava and protect Kapoho, but far fewer (14%) agreed with the military's use of bombs to protect Kapoho. In contrast, about one-third of respondents conditionally agreed with the use of bombs. It is suggested that local participation in the bombing strategy may explain the increased conditional acceptance of bombs as a mitigation tool, although this cannot be conclusively demonstrated. Belief in Pele and being of Hawaiian ethnicity did not reduce support for the use of barriers, but did reduce support for bombs in both bombing scenarios. The disparity in levels of acceptance of barriers versus bombing, and of one bombing strategy versus another, suggests that public attitudes toward lava flow hazard mitigation strategies were historically complex. A modern comparative study is needed before the next damaging eruption to inform debates and decisions about whether or not to interfere with the flow of lava. Recent changes in the current eruption of Kīlauea make this a timely topic.

  2. Piloted Simulation to Evaluate the Utility of a Real Time Envelope Protection System for Mitigating In-Flight Icing Hazards

    NASA Technical Reports Server (NTRS)

    Ranaudo, Richard J.; Martos, Borja; Norton, Bill W.; Gingras, David R.; Barnhart, Billy P.; Ratvasky, Thomas P.; Morelli, Eugene

    2011-01-01

    The utility of the Icing Contamination Envelope Protection (ICEPro) system for mitigating a potentially hazardous icing condition was evaluated by 29 pilots using the NASA Ice Contamination Effects Flight Training Device (ICEFTD). ICEPro provides real time envelope protection cues and alerting messages on pilot displays. The pilots participating in this test were divided into two groups; a control group using baseline displays without ICEPro, and an experimental group using ICEPro driven display cueing. Each group flew identical precision approach and missed approach procedures with a simulated failure case icing condition. Pilot performance, workload, and survey questionnaires were collected for both groups of pilots. Results showed that real time assessment cues were effective in reducing the number of potentially hazardous upset events and in lessening exposure to loss of control following an incipient upset condition. Pilot workload with the added ICEPro displays was not measurably affected, but pilot opinion surveys showed that real time cueing greatly improved their situation awareness of a hazardous aircraft state.

  3. System integrated considerations for multi-hazard mitigation, preparedness and response in World Expo compound

    E-print Network

    Shinozuka, Masanobu

    of the systems over a large spatial dimension and because of its unique feature stemming from the terrorist at the time of design. However, this episode teaches us a valuable lesson that the recognition-event response and crisis management, if strategically coordinated with pre-event mitigation measures and planned

  4. The respiratory health hazards of volcanic ash: a review for volcanic risk mitigation

    NASA Astrophysics Data System (ADS)

    Horwell, Claire J.; Baxter, Peter J.

    2006-07-01

    Studies of the respiratory health effects of different types of volcanic ash have been undertaken only in the last 40 years, and mostly since the eruption of Mt. St. Helens in 1980. This review of all published clinical, epidemiological and toxicological studies, and other work known to the authors up to and including 2005, highlights the sparseness of studies on acute health effects after eruptions and the complexity of evaluating the long-term health risk (silicosis, non-specific pneumoconiosis and chronic obstructive pulmonary disease) in populations from prolonged exposure to ash due to persistent eruptive activity. The acute and chronic health effects of volcanic ash depend upon particle size (particularly the proportion of respirable-sized material), mineralogical composition (including the crystalline silica content) and the physico-chemical properties of the surfaces of the ash particles, all of which vary between volcanoes and even eruptions of the same volcano, but adequate information on these key characteristics is not reported for most eruptions. The incidence of acute respiratory symptoms (e.g. asthma, bronchitis) varies greatly after ashfalls, from very few, if any, reported cases to population outbreaks of asthma. The studies are inadequate for excluding increases in acute respiratory mortality after eruptions. Individuals with pre-existing lung disease, including asthma, can be at increased risk of their symptoms being exacerbated after falls of fine ash. A comprehensive risk assessment, including toxicological studies, to determine the long-term risk of silicosis from chronic exposure to volcanic ash, has been undertaken only in the eruptions of Mt. St. Helens (1980), USA, and Soufrière Hills, Montserrat (1995 onwards). In the Soufrière Hills eruption, a long-term silicosis hazard has been identified and sufficient exposure and toxicological information obtained to make a probabilistic risk assessment for the development of silicosis in outdoor workers and the general population. A more systematic approach to multi-disciplinary studies in future eruptions is recommended, including establishing an archive of ash samples and a website containing health advice for the public, together with scientific and medical study guidelines for volcanologists and health-care workers.

  5. Assessing NEO hazard mitigation in terms of astrodynamics and propulsion systems requirements.

    PubMed

    Remo, John L

    2004-05-01

    Uncertainties associated with assessing valid near-Earth object (NEO) threats and carrying out interception missions place unique and stringent burdens on designing mission architecture, astrodynamics, and spacecraft propulsion systems. A prime uncertainty is associated with the meaning of NEO orbit predictability regarding Earth impact. Analyses of past NEO orbits and impact probabilities indicate uncertainties in determining whether a projected NEO threat will actually materialize within a given time frame. Other uncertainties concern the estimated mass, composition, and structural integrity of the NEO body. At issue is whether one can reliably estimate a NEO threat and its magnitude. Parameters that determine NEO deflection requirements within various time frames, including the terminal orbital pass before impact, and the necessary energy payloads are quantitatively discussed. Propulsion system requirements for extending space capabilities to rapidly interact with NEOs at ranges of up to about 1 AU (astronomical unit) from Earth are outlined. Such missions, without gravitational boosts, are deemed critical for a practical and effective response to mitigation. If an impact threat is confirmed on an immediate orbital pass, the options of interactive reconnaissance, interception, and subsequent NEO orbit deflection must be promptly carried out. There must also be an option to abort the mitigation mission if the NEO is subsequently found not to be Earth threatening. These options require optimal decision latitude and operational possibilities for NEO threat removal while minimizing alarm. Acting too far in advance of the projected impact could induce perturbations that ultimately exacerbate the threat. Given the dilemmas, uncertainties, and limited options associated with timely NEO mitigation within a decision-making framework, the currently available propulsion technologies that appear most viable for carrying out a NEO interception/mitigation mission with the greatest margin of control and reliability are those based on a combined (bimodal) nuclear thermal/nuclear electric propulsion platform. Elements of required and currently available performance characteristics for nuclear and electric propulsion systems are also discussed. PMID:15220155

  6. Fluor Daniel Hanford implementation plan for DOE Order 5480.28, Natural phenomena hazards mitigation

    Microsoft Academic Search

    Conrads

    1997-01-01

    Natural phenomena hazards (NPH) are unexpected acts of nature that pose a threat or danger to workers, the public, or the environment. Earthquakes, extreme winds (hurricane and tornado), snow, flooding, volcanic ashfall, and lightning strikes are examples of NPH that could occur at the Hanford Site. U.S. Department of Energy (DOE) policy requires facilities to be designed, constructed, and operated

  7. Hierarchically Performed Hazard Origin and Propagation Studies

    Microsoft Academic Search

    Yiannis Papadopoulos; John A. Mcdermid

    1999-01-01

    This paper introduces a new method for safety analysis called HiP- HOPS (Hierarchically Performed Hazard Origin and Propagation Studies). HiP-HOPS originates from a number of classical techniques such as Functional Failure Analysis, Failure Mode and Effects Analysis and Fault Tree Analysis. However, it extends, automates and integrates these techniques in order to address some of the problems currently encountered in

  8. Societal transformation and adaptation necessary to manage dynamics in flood hazard and risk mitigation (TRANS-ADAPT)

    NASA Astrophysics Data System (ADS)

    Fuchs, Sven; Thaler, Thomas; Bonnefond, Mathieu; Clarke, Darren; Driessen, Peter; Hegger, Dries; Gatien-Tournat, Amandine; Gralepois, Mathilde; Fournier, Marie; Mees, Heleen; Murphy, Conor; Servain-Courant, Sylvie

    2015-04-01

    Facing the challenges of climate change, this project aims to analyse and evaluate the multiple uses of flood alleviation schemes with respect to social transformation in communities exposed to flood hazards in Europe. The overall goals are: (1) the identification of indicators and parameters necessary for strategies to increase societal resilience, (2) an analysis of the institutional settings needed for societal transformation, and (3) perspectives on the changing divisions of responsibilities between public and private actors necessary to arrive at more resilient societies. The proposal assesses societal transformations from the perspective of changing divisions of responsibilities between public and private actors. Each risk mitigation measure is built on a narrative of exchanges and relations between people and may therefore condition the outcomes; governance is done by people interacting, and risk mitigation measures and climate change adaptation are therefore simultaneously outcomes of, and productive of, public and private responsibilities. Building on current knowledge, the project focuses on different dimensions of adaptation and mitigation strategies based on social, economic and institutional incentives and settings, centring on the linkages between these dimensions and complementing existing flood risk governance arrangements. The policy dimension of adaptation, predominantly decisions on the societally admissible level of vulnerability and risk, will be evaluated through a human-environment interaction approach using multiple methods and the assessment of the social capacities of stakeholders across scales. In this way, the challenges of adaptation to flood risk will be tackled by converting scientific frameworks into practical assessment and policy advice. In addressing the relationship between these dimensions of adaptation on different temporal and spatial scales, the project is both scientifically innovative and policy relevant, thereby supporting climate policy needs in Europe towards a concept of risk governance. Key words: climate change adaptation; transformation; flood risk management; resilience; vulnerability; innovative bottom-up developments; multifunctional use

  9. Advances in Remote Sensing Approaches for Hazard Mitigation and Natural Resource Protection in Pacific Latin America: A Workshop for Advanced Graduate Students, Post Doctoral Researchers, and Junior Faculty

    Microsoft Academic Search

    J. S. Gierke; W. I. Rose; G. P. Waite; J. L. Palma; E. L. Gross

    2008-01-01

    Though much of the developing world has the potential to gain significantly from remote sensing techniques in terms of public health and safety, they often lack resources for advancing the development and practice of remote sensing. All countries share a mutual interest in furthering remote sensing capabilities for natural hazard mitigation and resource development. With National Science Foundation support from

  10. Towards the Establishment of the Hawaii Integrated Seismic Network for Tsunami, Seismic, and Volcanic Hazard Mitigation

    Microsoft Academic Search

    B. R. Shiro; S. K. Koyanagi; P. G. Okubo; C. J. Wolfe

    2006-01-01

    The NOAA Pacific Tsunami Warning Center (PTWC) located in `Ewa Beach, Hawai`i, provides warnings to the State of Hawai`i regarding locally generated tsunamis. The USGS Hawaiian Volcano Observatory (HVO) located in Hawai`i National Park monitors earthquakes on the island of Hawai`i in order to characterize volcanic and earthquake activity and hazards. In support of these missions, PTWC and HVO operate

  11. Mitigation of volcanic hazards to aviation: The need for real-time integration of multiple data sources (Invited)

    NASA Astrophysics Data System (ADS)

    Schneider, D. J.

    2009-12-01

    The successful mitigation of volcanic hazards to aviation requires rapid interpretation and coordination of data from multiple sources, and communication of information products to a variety of end users. This community of information providers and information users includes volcano observatories, volcanic ash advisory centers, meteorological watch offices, air traffic control centers, airline dispatch and military flight operations centers, and pilots. Each of these entities has capabilities and needs that are unique to their situations and that evolve over a range of time spans. Prior to an eruption, information about probable eruption scenarios is needed in order to allow for contingency planning. Once a hazardous eruption begins, the immediate questions are where, when, how high, and how long will the eruption last? Following the initial detection of an eruption, the need for information changes to forecasting the movement of the volcanic cloud, determining whether ground operations will be affected by ash fall, and estimating how long the drifting volcanic cloud will remain hazardous. A variety of tools have been developed and/or improved over the past several years that provide additional data sources about volcanic hazards pertinent to the aviation sector. These include seismic and pressure sensors, ground-based radar and lidar, web cameras, ash dispersion models, and more sensitive satellite sensors that are capable of better detecting volcanic ash, gases and aerosols. Along with these improved capabilities come increased challenges in rapidly assimilating the available data sources, which come from a variety of data providers. In this presentation, examples from the recent large eruptions of Okmok, Kasatochi, and Sarychev Peak volcanoes will be used to demonstrate the challenges faced by hazard response agencies. These eruptions produced volcanic clouds that were dispersed over large regions of the Northern Hemisphere and were observed by pilots and detected by various satellite sensors for several weeks. The disruption to aviation caused by these eruptions further emphasizes the need to improve the real-time characterization of volcanic clouds (altitude, composition, particle size, and concentration) and to better understand the impacts of volcanic ash, gases and aerosols on aircraft, flight crews, and passengers.

  12. A fast global tsunami modeling suite as a trans-oceanic tsunami hazard prediction and mitigation tool

    NASA Astrophysics Data System (ADS)

    Mohammed, F.; Li, S.; Jalali Farahani, R.; Williams, C. R.; Astill, S.; Wilson, P. S.; B, S.; Lee, R.

    2014-12-01

    The past decade has witnessed two mega-tsunami events, the 2004 Indian Ocean tsunami and the 2011 Japan tsunami, and multiple major tsunami events: 2006 Java and Kuril Islands, 2007 Solomon Islands, 2009 Samoa, and 2010 Chile, to name a few. These events generated both local and far-field tsunami inundations, with runup ranging from a few meters to around 40 m in the coastal impact regions. With a majority of the coastal population at risk, there is a need for a sophisticated outlook on catastrophe risk estimation and a quick mitigation response, together with tools and information to aid advanced tsunami hazard prediction. There is an increased need for insurers, reinsurers, and Federal hazard management agencies to quantify coastal inundations and the vulnerability of coastal habitat to tsunami inundation. A novel tool is developed to model local and far-field tsunami generation, propagation, and inundation to estimate tsunami hazards. The tool is a combination of the NOAA MOST propagation database and an efficient and fast GPU (Graphics Processing Unit) based non-linear shallow water wave model solver. The tsunamigenic seismic sources are mapped onto the NOAA unit source distribution along subduction zones in the ocean basin. Slip models are defined for tsunamigenic seismic sources through a slip distribution on the unit sources while maintaining the limits of fault areas. A GPU-based finite volume solver is used to simulate non-linear shallow water wave propagation, inundation, and runup. Deformation on the unit sources provides initial conditions for modeling local impacts, while the wave history from the propagation database provides boundary conditions for far-field impacts. The modeling suite shows good agreement for basin-wide tsunami propagation, validating both local and far-field tsunami inundations.
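    The core numerical engine described above can be illustrated with a minimal 1D non-linear shallow-water finite-volume step using a Lax-Friedrichs flux. This is a sketch only: the actual suite is a GPU solver with real bathymetry, wetting and drying, and coupling to the MOST propagation database, none of which is reproduced here, and the usage values are invented.

    # Minimal 1D non-linear shallow-water finite-volume step (Lax-Friedrichs flux).
    import numpy as np

    g = 9.81  # gravitational acceleration, m/s^2

    def flux(h, hu):
        """Physical flux of the shallow-water equations for state (h, hu)."""
        u = hu / h
        return np.array([hu, hu * u + 0.5 * g * h * h])

    def lax_friedrichs_step(h, hu, dx, dt):
        """Advance depth h and momentum hu by one step on a periodic 1D grid."""
        q = np.array([h, hu])
        f = flux(h, hu)
        q_plus, q_minus = np.roll(q, -1, axis=1), np.roll(q, 1, axis=1)
        f_plus, f_minus = np.roll(f, -1, axis=1), np.roll(f, 1, axis=1)
        # Lax-Friedrichs: average the neighbours, correct with the flux difference.
        q_new = 0.5 * (q_plus + q_minus) - dt / (2.0 * dx) * (f_plus - f_minus)
        return q_new[0], q_new[1]

    # Usage sketch: a 1 m Gaussian hump on 100 m of still water over a 10 km domain.
    x = np.linspace(0.0, 10000.0, 501)
    h = 100.0 + np.exp(-((x - 5000.0) / 500.0) ** 2)
    hu = np.zeros_like(h)
    dx = x[1] - x[0]
    dt = 0.5 * dx / np.sqrt(g * h.max())  # CFL-limited time step
    for _ in range(200):
        h, hu = lax_friedrichs_step(h, hu, dx, dt)

    The same conservative update, extended to two dimensions with bathymetry and friction source terms and parallelized across GPU threads, is the kind of kernel such a solver is built around.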

  13. Protecting new health facilities from natural hazards: guidelines for the promotion of disaster mitigation.

    PubMed

    2004-01-01

    The health sector is particularly vulnerable to naturally occurring events. The vulnerability of the health infrastructure (hospitals and clinics) is of particular concern. Not only are such facilities vulnerable structurally, but their ability to continue to provide essential functions may be severely compromised, thus leaving the stricken population without essential services. This paper summarizes a more detailed document, Guidelines for Vulnerability Reduction in the Design of New Health Facilities published by the Pan-American Health Organization (PAHO)/ World Health Organization (WHO). The current document summarizes these Guidelines emphasizing how they may be used, by whom, and for what purpose. Potential users of the Guidelines include, but are not limited to: (1) initiators of health facility construction projects; (2) executors and supervisors of health facility construction projects; and (3) financing bodies in charge of funding health facility construction projects. The Guidelines include: (1) implications of natural phenomena upon the health infrastructure; (2) guidelines for vulnerability reduction for incorporation into development project cycles; (3) definitive phases and stages within the phases for development projects including: (I) Projects Assessment (needs assessment; assessment of options, the preliminary project); (II) Investment (project design, construction); and (III) Operational Activities (operations and maintenance). In addition, investment in damage reduction measures, policies and regulations, training and education, and the role of international organizations in the promotion and funding of mitigation strategies are addressed. PMID:15645629

  14. Novel bio-inspired smart control for hazard mitigation of civil structures

    NASA Astrophysics Data System (ADS)

    Kim, Yeesock; Kim, Changwon; Langari, Reza

    2010-11-01

    In this paper, a new bio-inspired controller is proposed for vibration mitigation of smart structures subjected to ground disturbances (i.e. earthquakes). The control system is developed through the integration of a brain emotional learning (BEL) algorithm with a proportional-integral-derivative (PID) controller and a semiactive inversion (Inv) algorithm. The BEL algorithm is based on the neurologically inspired computational model of the amygdala and the orbitofrontal cortex. To demonstrate the effectiveness of the proposed hybrid BEL-PID-Inv control algorithm, a seismically excited building structure equipped with a magnetorheological (MR) damper is investigated. The performance of the proposed hybrid BEL-PID-Inv control algorithm is compared with that of passive, PID, linear quadratic Gaussian (LQG), and BEL control systems. In the simulation, the robustness of the hybrid BEL-PID-Inv control algorithm in the presence of modeling uncertainties as well as external disturbances is investigated. It is shown that the proposed hybrid BEL-PID-Inv control algorithm is effective in improving the dynamic responses of seismically excited building structure-MR damper systems.
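    The semiactive inversion stage mentioned above has to respect the fact that an MR damper can only dissipate energy. The sketch below pairs a plain PID loop with a clipped force-to-voltage inversion to show that constraint; the brain emotional learning (BEL) component of the hybrid controller is not reproduced, and every gain, scale factor, and limit is an illustrative assumption rather than a value from the paper.

    # PID controller plus clipped semiactive inversion for an MR damper command (illustrative).
    class PID:
        def __init__(self, kp, ki, kd, dt):
            self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
            self.integral = 0.0
            self.prev_error = 0.0

        def desired_force(self, error):
            """Desired control force from the displacement error (setpoint = 0)."""
            self.integral += error * self.dt
            derivative = (error - self.prev_error) / self.dt
            self.prev_error = error
            return self.kp * error + self.ki * self.integral + self.kd * derivative

    def semiactive_inversion(f_desired, velocity, force_to_volt=1e-3, v_max=10.0):
        """Map a desired force to a damper command voltage.

        The command is set to zero whenever the desired force would have to act
        in the direction of motion, because the damper cannot add energy.
        """
        if f_desired * velocity < 0.0:                        # dissipative request
            return min(abs(f_desired) * force_to_volt, v_max)
        return 0.0                                            # otherwise switch the damper off

    # Usage: one control step with made-up structural response values.
    pid = PID(kp=5e4, ki=1e3, kd=2e4, dt=0.005)
    f_cmd = pid.desired_force(error=-0.02)     # 2 cm drift from the setpoint
    voltage = semiactive_inversion(f_cmd, velocity=0.15)
    print(voltage)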

  15. Educational Approach to Seismic Risk Mitigation in Indian Himalayas -Hazard Map Making Workshops at High Schools-

    NASA Astrophysics Data System (ADS)

    Koketsu, K.; Oki, S.; Kimura, M.; Chadha, R. K.; Davuluri, S.

    2014-12-01

    How can we encourage people to take preventive measures against damage risks and empower them to take the right actions in emergencies to save their lives? The conventional approach taken by scientists had been disseminating intelligible information on up-to-date seismological knowledge. However, it has been proven that knowledge alone does not have enough impact to modify people's behaviors in emergencies (Oki and Nakayachi, 2012). On the other hand, the conventional approach taken by practitioners had been to conduct emergency drills at schools or workplaces. The loss of many lives from the 2011 Tohoku earthquake has proven that these emergency drills were not enough to save people's lives, unless they were empowered to assess the given situation on their own and react flexibly. Our challenge is to bridge the gap between knowledge and practice. With reference to best practices observed in Tohoku, such as The Miracles of Kamaishi, our endeavor is to design an effective Disaster Preparedness Education Program that is applicable to other disaster-prone regions in the world, even with different geological, socio-economical and cultural backgrounds. The key concepts for this new approach are 1) empowering individuals to take preventive actions to save their lives, 2) granting community-based understanding of disaster risks and 3) building a sense of reality and relevancy to disasters. With these in mind, we held workshops at some high schools in the Lesser Himalayan Region, combining lectures with an activity called "Hazard Map Making" where students proactively identify and assess the hazards around their living areas and learn practical strategies on how to manage risks. We observed the change of awareness of the students by conducting a preliminary questionnaire survey and interviews after each session. Results strongly implied that the significant change of students' attitudes towards disaster preparedness occurred not by the lectures of scientific knowledge, but after completing the whole program of activities. Students closed their presentation by spontaneously adding messages to others about importance of life and preparedness. In this presentation, we share good practices in terms of program design and facilitation that encouraged the transition of participants from a learner to an actor.

  16. Developing Oceanic Convective Products to Mitigate the Impact of Weather Hazards on Transoceanic Flights

    NASA Astrophysics Data System (ADS)

    Nierow, A.

    2003-12-01

    Transoceanic flights will increase significantly in the next decade. To manage this increased demand for capacity while maintaining safety, the Federal Aviation Administration (FAA) is exploring whether the separation minima normally used between aircraft crossing oceanic regions can be reduced both horizontally and vertically. However, before reducing separation standards, the increased hazard of encountering convective weather over oceanic routes must be considered. New evidence has shown that roughly half of the turbulence encounters over oceanic regions were likely associated with convective activity. This phenomenon, Convectively-Induced Turbulence (CIT), can occur several kilometers from convective cores. Operational decision-makers need to detect turbulence associated with oceanic convective activity to route or reroute aircraft safely. However, the only weather data consistently available are from satellite imagery, which can reveal potential areas of convection but cannot unambiguously isolate the hazardous regions from the benign regions. Being able to do this would improve routing and rerouting decisions. The FAA and other agencies are collaborating to develop oceanic convective products. The National Weather Service's Aviation Weather Center created a product that identifies thunderstorms by using the output from different satellite imagers. The technique exploits the difference between the 11-micron infrared (IR) channel and the 6.7-micron water vapor channel. The National Center for Atmospheric Research has developed a new product that maps cloud top temperatures drawn from IR satellite imagery and converts them to aircraft flight levels. In addition, the Naval Research Lab in Monterey, CA, is developing cloud classification algorithms that will distinguish between cirrus and convective clouds. We have compared these new convective diagnostic techniques to long-range ground-based lightning data and lightning data from the National Aeronautics and Space Administration (NASA) Tropical Rainfall Measuring Mission (TRMM) satellite. We will present the results of the comparison at the meeting. Developing oceanic convective and turbulence nowcasting and short-term forecasting products would have a significant positive impact on flight operations, since they would show possible locations of turbulence and wind shear associated with convection. These products would also increase airspace capacity by enabling the FAA to decrease oceanic aircraft separation standards while preserving safety. We wish to reduce the incidence of uncoordinated deviations (because of unexpected encounters with turbulence associated with convection) through greater situational awareness and more time for coordination. By helping pilots avoid areas of convective activity and associated turbulence over oceanic regions, these products have the potential to improve safety of flight and increase efficiency (e.g., facilitate routing and rerouting, resulting in smaller flight track deviations and reduced fuel costs).
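    The 11-micron minus 6.7-micron differencing mentioned above can be written down in a few lines. This is a generic sketch of the brightness-temperature-difference test for deep convective tops, not the Aviation Weather Center's operational algorithm; the -1 K threshold and the sample pixel values are assumptions.

    # Flag likely deep-convective pixels from 11-um IR and 6.7-um water-vapour brightness temperatures.
    import numpy as np

    def flag_deep_convection(bt_ir_11um, bt_wv_67um, threshold_k=-1.0):
        """Return a boolean mask where the WV-IR difference indicates an overshooting top.

        For clear sky or low cloud the 6.7-um channel is much colder than the 11-um
        window; over deep convection the two brightness temperatures converge, so the
        difference approaches zero or turns positive.
        """
        return (bt_wv_67um - bt_ir_11um) >= threshold_k

    # Usage with tiny synthetic fields (kelvin): only the cold, converged pixels are flagged.
    bt_ir = np.array([[290.0, 215.0], [260.0, 210.0]])
    bt_wv = np.array([[240.0, 214.5], [238.0, 211.0]])
    print(flag_deep_convection(bt_ir, bt_wv))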

  17. Role of the Egyptian National Seismological Network to mitigate the seismic hazard in Egypt

    NASA Astrophysics Data System (ADS)

    Mohamed, A.-A.

    2012-04-01

    Egypt is located close to one of the major continental fracture systems (the Hellenic arc), at the convergent boundary of two large lithospheric plates (Eurasia and Africa). Egypt is also affected by the opening of the Red Sea (a mid-oceanic system) and its two branches, the Gulf of Suez and the Gulf of Aqaba-Dead Sea transform system. The seismicity is thus due to the interaction between the Eurasian, African, and Arabian plates. It can therefore be concluded that although damaging earthquakes occur infrequently, their consequences cannot be ignored. Egypt has witnessed numerous damaging events; for instance, the 1992 Cairo earthquake, of magnitude 5.9 mb, caused 600 deaths, injured about 10,000 people, and left damage of more than US $40 million, and the 1995 Gulf of Aqaba earthquake reached Mw 7.2. As a result of this damage, the Egyptian Government supported the National Research Institute of Astronomy and Geophysics (NRIAG) in installing the Egyptian National Seismic Network (ENSN) and the strong motion network. The main objectives of the network are monitoring local and regional activity, including artificial events, assessing seismic hazard, estimating the expected effects of future earthquakes, and protecting strategic buildings, the High Dam, and archaeological sites.

  18. Towards the Establishment of the Hawaii Integrated Seismic Network for Tsunami, Seismic, and Volcanic Hazard Mitigation

    NASA Astrophysics Data System (ADS)

    Shiro, B. R.; Koyanagi, S. K.; Okubo, P. G.; Wolfe, C. J.

    2006-12-01

    The NOAA Pacific Tsunami Warning Center (PTWC) located in `Ewa Beach, Hawai`i, provides warnings to the State of Hawai`i regarding locally generated tsunamis. The USGS Hawaiian Volcano Observatory (HVO) located in Hawai`i National Park monitors earthquakes on the island of Hawai`i in order to characterize volcanic and earthquake activity and hazards. In support of these missions, PTWC and HVO operate seismic networks for rapidly detecting and evaluating earthquakes for their tsunamigenic potential and volcanic risk, respectively. These existing seismic networks consist mostly of short-period vertical seismometers with analog data collection and transmission based on decades-old technology. The USGS National Strong Motion Program (NSMP) operates 31 accelerometers throughout the state, but none currently transmit their data in real time. As a result of enhancements to the U.S. Tsunami Program in the wake of the December 2004 Indian Ocean tsunami disaster, PTWC is upgrading and expanding its seismic network using digital real-time telemetry from broadband and strong motion accelerometer stations. Through new cooperative agreements with partners including the USGS (HVO and NSMP), IRIS, University of Hawai`i, and Germany's GEOFON, the enhanced seismic network has been designed to ensure maximum benefit to all stakeholders. The Hawaii Integrated Seismic Network (HISN) will provide a statewide resource for tsunami, earthquake, and volcanic warnings. Furthermore, because all data will be archived by the IRIS Data Management Center (DMC), the HISN will become a research resource for the greater scientific community. The performance target for the enhanced HISN is for PTWC to provide initial local tsunami warnings within 90 seconds of the earthquake origin time. This will be accomplished using real-time digital data transmission over redundant paths and by implementing contemporary analysis algorithms in real-time and near-real-time. Earthquake location, depth, and magnitude determination will be improved due to the better station coverage. More advanced seismic analysis techniques such as rapid characterization of the earthquake source will also be possible with HISN broadband data. Anticipated products from upgraded strong motion monitoring include ShakeMaps and earthquake rupture models. The HISN will ultimately consist of the following three types of stations: 12 broadband stations built to ANSS (Advanced National Seismic System) standards using STS-2 broadband seismometers and strong motion accelerometers, 15 new strong motion accelerometer stations, and at least 12 NSMP stations upgraded to real-time digital communications. Combined with other existing broadband, short-period, and strong motion stations throughout Hawai`i, the HISN will greatly enhance seismic monitoring capabilities throughout the region. Although most seismicity in Hawai`i occurs under the Island of Hawai`i, large earthquakes do happen further up the island chain. Therefore, stations will be located on all major islands in order to optimize coverage. PTWC is currently finalizing the selection of new sites on the islands of Kaua`i, O`ahu, Moloka`i, Lana`i, Maui, and Hawai`i. PTWC has begun installation of new stations and expects to have the entire HISN completed by late 2007 or early 2008.

  19. IRIDIUM SATELLITE SIGNALS: A CASE STUDY IN INTERFERENCE CHARACTERIZATION AND MITIGATION FOR

    E-print Network

    Lewis, Brian Murray

    Astronomers are now studying a variety of diverse approaches to the identification and mitigation of RFI originating from the Iridium System.

  20. Hazardous and radioactive waste incineration studies

    SciTech Connect

    Vavruska, J.S.; Stretz, L.A.; Borduin, L.C.

    1981-01-01

    Development and demonstration of a transuranic (TRU) waste volume-reduction process is described. A production-scale controlled air incinerator using commercially available equipment and technology has been modified for solid radioactive waste service. This unit successfully demonstrated the volume reduction of TRU waste with an average TRU content of about 20 nCi/g. The same incinerator and offgas treatment system is being further modified to evaluate the destruction of hazardous liquid wastes, such as polychlorinated biphenyls (PCBs), and hazardous solid wastes, such as pentachlorophenol (PCP)-treated wood.

  1. Recent Developments in Earthquake Hazards Studies

    Microsoft Academic Search

    Walter D. Mooney; Susan M. White

    In recent years, there has been great progress in understanding the underlying causes of earthquakes, as well as in forecasting their occurrence and preparing communities for their damaging effects. Plate tectonic theory explains the occurrence of earthquakes at discrete plate boundaries, such as subduction zones and transform faults, but diffuse plate boundaries are also common. Seismic hazards are distributed over a broad

  2. Caribbean Tsunami and Earthquake Hazards Studies

    NSDL National Science Digital Library

    This portal provides information on the seismicity and plate tectonics of the active boundary between the North American plate and the northeast corner of the Caribbean plate, and the research program being conducted there by the United States Geological Survey (USGS). There are links to maps and remote imagery of the plate boundary and the Caribbean Trench, and to publications and news articles on seismic and tsunami hazards, seafloor mapping, plate interactions, and submarine slides. There is also a movie that describes the geologic background and USGS research efforts in the area.

  3. Gas jet disruption mitigation studies on Alcator C-Mod

    NASA Astrophysics Data System (ADS)

    Granetz, R.; Whyte, D. G.; Izzo, V. A.; Biewer, T.; Reinke, M. L.; Terry, J.; Bader, A.; Bakhtiari, M.; Jernigan, T.; Wurden, G.

    2006-12-01

    Damaging effects of disruptions are a major concern for Alcator C-Mod, ITER and future tokamak reactors. High-pressure noble gas jet injection is a mitigation technique which potentially satisfies the operational requirements of fast response time and reliability, while still being benign to subsequent discharges. Disruption mitigation experiments using an optimized gas jet injection system are being carried out on Alcator C-Mod to study the physics of gas jet penetration into high pressure plasmas, as well as the ability of the gas jet impurities to convert plasma energy into radiation on timescales consistent with C-Mod's fast quench times, and to reduce halo currents given C-Mod's high-current density. The dependence of impurity penetration and effectiveness on noble gas species (He, Ne, Ar, Kr) is also being studied. It is found that the high-pressure neutral gas jet does not penetrate deeply into the C-Mod plasma, and yet prompt core thermal quenches are observed on all gas jet shots. 3D MHD modelling of the disruption physics with NIMROD shows that edge cooling of the plasma triggers fast growing tearing modes which rapidly produce a stochastic region in the core of the plasma and loss of thermal energy. This may explain the apparent effectiveness of the gas jet in C-Mod despite its limited penetration. The higher-Z gases (Ne, Ar, Kr) also proved effective at reducing halo currents and decreasing thermal deposition to the divertor surfaces. In addition, noble gas jet injection proved to be benign for plasma operation with C-Mod's metal (Mo) wall, actually improving the reliability of the startup in the following discharge.

  4. Peru mitigation assessment of greenhouse gases: Sector -- Energy. Peru climate change country study; Final report

    SciTech Connect

    NONE

    1996-08-01

    The aim of this study is to determine the inventory and propose greenhouse gas mitigation alternatives so that the country can face its future development in a clean environmental setting without delaying the development process required to improve the Peruvian standard of living. This executive abstract concisely presents the results of the greenhouse gas mitigation analysis for Peru for the period 1990--2015; the mitigation studies for the Energy Sector are summarized here.

  5. Probabilistic Tsunami Hazard Assessment: the Seaside, Oregon Pilot Study

    Microsoft Academic Search

    F. I. Gonzalez; E. L. Geist; C. Synolakis; V. V. Titov

    2004-01-01

    A pilot study of Seaside, Oregon is underway, to develop methodologies for probabilistic tsunami hazard assessments that can be incorporated into Flood Insurance Rate Maps (FIRMs) developed by FEMA's National Flood Insurance Program (NFIP). Current NFIP guidelines for tsunami hazard assessment rely on the science, technology and methodologies developed in the 1970s; although generally regarded as groundbreaking and state-of-the-art for

  6. Snow-avalanche and debris-flow hazards in the fjords of north-western Iceland, mitigation and prevention

    Microsoft Academic Search

    Armelle Decaulne

    2007-01-01

    In the fjords of north-western Iceland, snow-avalanche and debris-flow hazards threaten 65% of the inhabitants. In this area, both historical and geomorphological evidence clearly demonstrates the recurrent danger from the steep slopes. Hazard vulnerability has increased during the last century, in connection with the population development of the Westfjords. Two snow-avalanche disasters during 1995 (in which 34 people were killed

  7. Flood Hazard Mapping using Aster Image data with GIS

    Microsoft Academic Search

    Eric Kwabena Forkuo

    2011-01-01

    Flooding is one of the most devastating natural hazards, leading to the loss of lives, property and resources. It has therefore become important to create easily read, rapidly accessible flood hazard maps, which help prioritize mitigation efforts. This study addresses the need for an efficient and cost-effective methodology for preparing flood hazard maps in Ghana, particularly those

  8. Hazard & Operability Study for Removal of Spent Nuclear Fuel from the 324 Building

    SciTech Connect

    VAN KEUREN, J.C.

    2002-05-07

    A hazard and operability (HAZOP) study was conducted to examine the hazards associated with the removal of the spent nuclear fuel from the 324 Building. Fifty-nine potentially hazardous conditions were identified.

  9. From structural investigation towards multi-parameter early warning systems: geophysical contributions to hazard mitigation at the landslide of Gschliefgraben (Gmunden, Upper Austria)

    NASA Astrophysics Data System (ADS)

    Supper, Robert; Baron, Ivo; Jochum, Birgit; Ita, Anna; Winkler, Edmund; Motschka, Klaus; Moser, Günter

    2010-05-01

    In December 2007 the large landslide system inside the Gschliefgraben valley (located at the east edge of the Traun lake, Upper Austria), known over centuries for its repeated activity, was reactivated. Although a hazard zone map was already set up in 1974, giving rise to a complete prohibition on building, some hundreds of people are living on the alluvial fan close to the lake. Consequently, in the frame of the first emergency measures, 55 buildings had to be evacuated. Within the first phase of mitigation, measures were focused on property and infrastructure protection. Around 220 wells and one deep channel were implemented to drain the sliding mass. Additionally a large quantity of sliding material was removed close to the inhabited areas. Differential GPS and water level measurements were performed to evaluate the effectiveness of the measures, which led to a significant slowdown of the movement. Soon after the suspension of the evacuation, several investigations, including drilling, borehole logging and complex geophysical measurements, were performed to investigate the structure of the landslide area in order to evaluate maximum hazard scenarios as a basis for planning further measures. Based on these results, measuring techniques for an adapted, future early warning system are currently being tested. This emergency system should enable local stakeholders to take appropriate and timely measures in case of a future event, thus lessening the impact of a future disaster significantly. Within this three-step plan the application of geophysical methodologies was an integral part of the research and contributed considerably to its success. Several innovative approaches were implemented which will be described in more detail within the talk. Airborne multi-sensor geophysical surveying is one of the new and progressive approaches that can remarkably contribute to effectively analysing the triggering processes of large landslides and to better predicting their hazard. It was tested in the Gschliefgraben earthflow and landslide complex in September 2009. Vegetation thickness, soil moisture, potassium and thorium content (gamma ray) and four-layer resistivity were the principal parameters studied. These parameters were compared with the landslide inventory map of Gschliefgraben developed from differential airborne laser scan terrain models. Mass wasting is usually triggered by rising pore-water pressure due to heavy rainfall or seismic tremors, often supported by changes in the shape, structure, and hydrology of a slope or its vegetation cover. As the electrical resistivity of the subsurface mainly depends on porosity, saturation, pore fluid conductivity and clay content, the geoelectric method is a reliable method to investigate the structure of the landslide and its surroundings and could be an emerging tool for observing those triggering factors. Therefore, first a multi-electrode geoelectrical survey was performed in a broader area of the active earthflow to verify the subsurface structure and to optimise the location for a monitoring system, followed by the installation of the geoelectric monitoring system Geomon4D in September 2009. The monitoring profiles were complemented by an automatic DMS inclinometer to correlate measured resistivity values with displacement rates. Since the installation, the system has worked continuously and data is processed on a daily basis at the monitoring centre in Vienna.
These works were supported by the 7th FP project "Safeland - Living with the landslide risk in Europe".

  10. Case studies on mitigating harmonics in ASD systems to meet IEEE 519-1992 standards

    Microsoft Academic Search

    Mahesh M. Swamy; Steven L. Rossiter; M. C. Spencer; M. Richardson

    1994-01-01

    Typical case studies are reported in this paper addressing the application of the IEEE 519-1992 guidelines to mitigate harmonics. The study discusses possible techniques to mitigate harmonics generated by nonlinear loads, especially variable frequency drives (VFDs), also known as adjustable speed drives (ASDs). The paper intends to show how one can apply IEEE 519-1992 within an industrial or commercial environment

  11. Experimental Studies of Mitigation Materials for Blast Induced TBI

    NASA Astrophysics Data System (ADS)

    Alley, Matthew; Son, Steven

    2009-06-01

    The objective of this experimental study is to compare the effects of various materials obstructing the flow of a blast wave and the ability of the given material to reduce the damage caused by the blast. Several methods of energy transfer in blast wave flows are known or expected including: material interfaces with impedance mismatches, density changes in a given material, internal shearing, and particle fracture. The theory applied to this research is that the greatest energy transfer within the obstructing material will yield the greatest mitigation effects to the blast. Sample configurations of foam were varied to introduce material interfaces and filler materials with varying densities and impedances (liquids and powders). The samples were loaded according to a small scale blast produced by an explosive driven shock tube housing gram-range charges. The transmitted blast profiles were analyzed for variations in impulse characteristics and frequency components as compared to standard free field profiles. The results showed a rounding effect of the transmitted blast profile for all samples with the effects of the low density fillers surpassing all others tested.

  12. Experimental Studies of Mitigation Materials for Blast Induced Tbi

    NASA Astrophysics Data System (ADS)

    Alley, M. D.; Son, S. F.; Christou, G.; Goel, R.; Young, L.

    2009-12-01

    The objective of this experimental study is to compare the effects of various materials obstructing the flow of a blast wave and the ability of the material to reduce the damage caused by the blast. Several methods of energy transfer in blast wave flows are expected including: material interfaces with impedance mismatches, density changes in a given material, internal shearing, and particle fracture. Our hypothesis is that the greatest energy transfer within the obstructing material will yield the greatest mitigation effects to the blast. Sample configurations of foam were varied to introduce material interfaces and filler materials with varying densities and impedances (liquids and powders). The samples were dynamically loaded using a small scale blast produced by an explosive driven shock tube housing gram-scale explosive charges. The transmitted blast profiles were analyzed for variations in impulse characteristics and frequency components as compared to standard free field profiles. The results showed a rounding effect of the transmitted blast profile for all samples with the effects of the high density fillers surpassing all others tested. These results lead to a conclusion that low porosity, high density materials offer superior attenuation by reducing air blast features and spatially distributing the transmitted wave.

  13. Hazards analysis and prediction from remote sensing and GIS using spatial data mining and knowledge discovery: a case study for landslide hazard zonation

    NASA Astrophysics Data System (ADS)

    Hsu, Pai-Hui; Su, Wen-Ray; Chang, Chy-Chang

    2011-11-01

    Due to its particular geographical location and geological conditions, Taiwan suffers from many natural hazards, which often cause serious property damage and loss of life. To reduce damage and casualties, an effective real-time system for hazard prediction and mitigation is necessary. In this study, a case study for Landslide Hazard Zonation (LHZ) is tested in accordance with Spatial Data Mining and Knowledge Discovery (SDMKD) from a database. Many different kinds of geospatial data, such as terrain elevation, land cover types, distance to roads and rivers, geology maps, NDVI, and monitoring rainfall data, are collected into the database for SDMKD. In order to guarantee data quality, spatial data cleaning is essential to remove the noise, errors, outliers, and inconsistencies hiding in the input spatial data sets. In this paper, Kriging interpolation is used to calibrate the QPESUMS rainfall data to the rainfall observations from rain gauge stations to remove the data inconsistency. After the data cleaning, artificial neural networks (ANNs) are applied to generate the LHZ map throughout the test area. The experimental results show that the accuracy of the LHZ is about 92.3% with the ANN analysis, and that landslides induced by heavy rainfall can be mapped efficiently from remotely sensed images and geospatial data using SDMKD technologies.
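
    A minimal sketch of the ANN classification step described above, using entirely synthetic stand-ins for the geospatial layers (the feature set, network size, and data are assumptions, not the authors' configuration):

        # Minimal sketch (not the authors' code): train a neural network on per-cell
        # hazard factors (elevation, slope, NDVI, distance to river, event rainfall)
        # to classify landslide / no-landslide cells. All data are random stand-ins.
        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.neural_network import MLPClassifier
        from sklearn.preprocessing import StandardScaler
        from sklearn.pipeline import make_pipeline

        rng = np.random.default_rng(0)
        n = 5000
        X = np.column_stack([
            rng.uniform(100, 2500, n),   # elevation (m)
            rng.uniform(0, 60, n),       # slope (deg)
            rng.uniform(0, 1, n),        # NDVI
            rng.uniform(0, 2000, n),     # distance to river (m)
            rng.uniform(0, 600, n),      # event rainfall (mm)
        ])
        # Synthetic "truth": steeper, wetter, poorly vegetated cells fail more often
        p = 1 / (1 + np.exp(-(0.08 * X[:, 1] + 0.01 * X[:, 4] - 4 * X[:, 2] - 2)))
        y = (rng.uniform(size=n) < p).astype(int)

        Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)
        model = make_pipeline(StandardScaler(),
                              MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=1000,
                                            random_state=0))
        model.fit(Xtr, ytr)
        print("overall accuracy:", round(model.score(Xte, yte), 3))
        # Applying model.predict_proba to every grid cell would yield the LHZ map values.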

  14. Integration of Tsunami Analysis Tools into a GIS Workspace – Research, Modeling, and Hazard Mitigation efforts Within NOAA’s Center for Tsunami Research

    Microsoft Academic Search

    Nazila Merati; Christopher Chamberlin; Christopher Moore; Vasily Titov; Tiffany C. Vance

    The National Oceanic and Atmospheric Administration's (NOAA) Center for Tsunami Research (NCTR) uses geospatial data and GIS analysis techniques in support of building an accurate tsunami forecasting system for the US Tsunami Warning Centers. The resulting forecast products can be integrated into applications and visualizations to assess hazard risk and provide mitigation for US coastal communities ranging from small towns

  15. Odor Mitigation with Tree Buffers: Swine Production Case Study

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Tree buffers are a potential low-cost, sustainable odor mitigation strategy, but there is little to no data on their effectiveness. Odor transport is thought to occur in one of two ways: either directly through vapor phase transport or indirectly through sorption onto particles. Consequently, monitoring...

  16. Mitigation of hazards from future lahars from Mount Merapi in the Krasak River channel near Yogyakarta, central Java

    USGS Publications Warehouse

    Ege, John R.; Sutikno

    1983-01-01

    Procedures for reducing hazards from future lahars and debris flows in the Krasak River channel near Yogyakarta, Central Java, Indonesia, include (1) determining the history of the location, size, and effects of previous lahars and debris flows, and (2) decreasing flow velocities. The first may be accomplished by geologic field mapping along with acquiring information by interviewing local residents, and the second by increasing the cross sectional area of the river channel and constructing barriers in the flow path.
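
    A back-of-the-envelope illustration of the second measure: for a given design discharge, continuity (Q = vA) means a larger channel cross-section gives a lower mean velocity, and added roughness from barriers lowers it further via Manning's equation. The numbers below are hypothetical, not from the report:

        # Hypothetical numbers only: why enlarging the channel and adding barriers
        # reduces mean flow velocity for a given design discharge Q.
        Q = 300.0          # assumed design debris-flow discharge, m^3/s

        for label, area in [("original channel", 60.0), ("enlarged channel", 120.0)]:
            print(f"{label}: mean velocity = {Q / area:.1f} m/s for A = {area:.0f} m^2")

        def manning_velocity(n, R, S):
            """Mean velocity (m/s) from Manning's equation v = (1/n) R^(2/3) S^(1/2)."""
            return (1.0 / n) * R ** (2.0 / 3.0) * S ** 0.5

        R, S = 2.0, 0.02   # assumed hydraulic radius (m) and channel slope
        print("smooth channel :", round(manning_velocity(0.030, R, S), 2), "m/s")
        print("with barriers  :", round(manning_velocity(0.060, R, S), 2), "m/s")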

  17. EVOLVE 4.0 orbital debris mitigation studies

    Microsoft Academic Search

    P. H. Krisko; N. L. Johnson; J. N. Opiela

    2001-01-01

    In a continuing effort to limit future space debris generation, the NASA Policy Directive 8710.3 was issued in May 1997. It requires all NASA-sponsored programs to conduct formal assessments in accordance with NASA Safety Standard 1740.14 to quantify the potential to generate debris and to consider debris mitigation options. Recent improvements to the NASA long-term debris environment model, EVOLVE 4.0,

  18. HOUSEHOLD HAZARDOUS WASTE CHARACTERIZATION STUDY FOR PALM BEACH COUNTY, FLORIDA: A MITE PROGRAM EVALUATION

    EPA Science Inventory

    The objectives of the Household hazardous Waste Characterization Study (the HHW Study) were to quantify the annual household hazardous waste (HHW) tonnages disposed in Palm Beach County, Florida's (the county) residential solid waste (characterized in this study as municipal soli...

  19. Mitigation of climate change impacts on maize productivity in northeast of Iran: a simulation study

    Microsoft Academic Search

    Azam Lashkari; Amin Alizadeh; Ehsan Eyshi Rezaei; Mohammad Bannayan

    2012-01-01

    Development and evaluation of mitigation strategies are fundamental to manage climate change risks. This study was built on (1) quantifying the response of maize (Zea mays L.) grain yield to potential impacts of climate change and (2) investigating the effectiveness of changing the sowing date of maize as a mitigation option for Khorasan Province, which is located in the northeast of Iran.

  20. Land classification used to select abandoned hazardous waste study sites

    NASA Astrophysics Data System (ADS)

    Shirazi, Mostafa A.

    1984-07-01

    The biological effects of hazardous substances in the environment are influenced by climate, physiography, and biota. These factors interact to determine the transport and fate of chemicals, but are difficult to model accurately except for small areas with a large data base. The requirement for a large data base may be reduced locally if the regional influences of these factors were predetermined from existing data. Knowledge of the regional factors would also relax the restriction to considering only small areas. This paper advocates consideration of regional characteristics of the environment in the early stages of waste management strategy development. It presents as an example a procedure for selecting study sites from candidate-abandoned hazardous waste dumpsites in the southeastern United States. It uses small-scale maps of low resolution from the National Atlas to delineate the boundaries and to determine the environmental characteristics that prevail over units of land within the region. A computer map-overlay and graphic approach is used to facilitate the grouping of land types. Abandoned hazardous waste dumpsites found within land types that best represent the region are surveyed for selecting a study site. It is expected that environmental impact data obtained from a representative site would be useful for predicting impact potentials in similar remotely located areas within the same general region.

  1. Detecting Slow Deformation Signals Preceding Dynamic Failure: A New Strategy For The Mitigation Of Natural Hazards (SAFER)

    NASA Astrophysics Data System (ADS)

    Vinciguerra, Sergio; Colombero, Chiara; Comina, Cesare; Ferrero, Anna Maria; Mandrone, Giuseppe; Umili, Gessica; Fiaschi, Andrea; Saccorotti, Gilberto

    2015-04-01

    Rock slope monitoring is a major aim in territorial risk assessment and mitigation. The high velocity that usually characterizes the failure phase of rock instabilities makes traditional instruments based on slope deformation measurements not applicable for early warning systems. The use of "site specific" microseismic monitoring systems, with particular reference to potential destabilizing factors such as rainfall and temperature changes, can allow detection of pre-failure signals in unstable sectors within the rock mass and prediction of the possible acceleration to failure. In October 2013 we deployed a microseismic monitoring system, developed by the University of Turin/Compagnia San Paolo and consisting of a network of 4 triaxial 4.5 Hz seismometers connected to a 12-channel data logger, on an unstable patch of the Madonna del Sasso, Italian Western Alps. The initial characterization, based on geomechanical and geophysical tests, allowed us to understand the instability mechanism and to design a 'large aperture' configuration which encompasses the entire unstable rock mass and can monitor subtle changes of the mechanical properties of the medium. Stability analysis showed that the stability of the slope is due to rock bridges. Continuous recording at 250 Hz sampling frequency (switched in March 2014 to 1 kHz to improve first-arrival time picking and obtain wider frequency content information) and trigger recording based on a STA/LTA (Short Time Average over Long Time Average) detection algorithm have been used. More than 2000 events with different waveforms, durations and frequency content were recorded between November 2013 and March 2014. By inspecting the acquired events we identified the key parameters for a reliable distinction among the natures of the signals, i.e. the signal shape in terms of amplitude, duration and kurtosis, and the frequency content in terms of the range of maximum frequency content and the frequency distribution in spectrograms. Four main classes of recorded signals can be recognised: microseismic events, regional earthquakes, electrical noise and calibration signals, and unclassified events (probably grouping rockfalls, quarry blasts, and other anthropic and natural sources of seismic noise). Because the seismic velocity inside the rock mass is highly heterogeneous, as shown by the geophysical investigations, and the signals are often noisy, an accurate location is not possible. To overcome this limitation, a three-dimensional P-wave velocity model has been built linking the DSM (Digital Surface Model) of the cliff obtained from a laser-scanner survey to the results of the cross-hole seismic tomography, the geological observations and the geomechanical measurements of the most pervasive fracture planes. As a next step we will proceed to the localization of event sources, to the improvement and automation of data analysis procedures, and to the search for correlations between event rates and meteorological data, for a better understanding of the processes driving the rock mass instability.
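
    A minimal sketch of the classic STA/LTA trigger mentioned above, applied to a synthetic trace; the window lengths and threshold are illustrative assumptions, not the settings used at Madonna del Sasso:

        # Sketch of a classic STA/LTA event detector on a synthetic trace: a trigger
        # is declared where the short-term mean energy exceeds the long-term mean
        # energy by a chosen ratio. Windows and threshold are assumed values.
        import numpy as np

        def sta_lta(trace, fs, sta_win=0.05, lta_win=1.0):
            """Return the STA/LTA ratio of the squared trace (fs in Hz, windows in s)."""
            nsta, nlta = int(sta_win * fs), int(lta_win * fs)
            energy = trace.astype(float) ** 2
            csum = np.concatenate(([0.0], np.cumsum(energy)))
            ratio = np.zeros(trace.size)
            for i in range(nlta, trace.size):
                sta = (csum[i + 1] - csum[i + 1 - nsta]) / nsta   # window ending at i
                lta = (csum[i + 1] - csum[i + 1 - nlta]) / nlta
                ratio[i] = sta / (lta + 1e-12)
            return ratio

        fs = 1000.0                                  # 1 kHz sampling, as in the abstract
        t = np.arange(0, 5, 1 / fs)
        trace = 0.1 * np.random.randn(t.size)        # background noise
        burst = (t > 2.0) & (t < 2.2)                # synthetic microseismic burst
        trace[burst] += np.sin(2 * np.pi * 80 * t[burst])

        ratio = sta_lta(trace, fs)
        triggers = np.flatnonzero(ratio > 5.0)       # assumed threshold ratio of 5
        if triggers.size:
            print("first trigger at t =", round(t[triggers[0]], 3), "s")
        else:
            print("no trigger")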

  2. Interdisciplinary approach to hydrological hazard mitigation and disaster response and effects of climate change on the occurrence of flood severity in central Alaska

    NASA Astrophysics Data System (ADS)

    Kontar, Y. Y.; Bhatt, U. S.; Lindsey, S. D.; Plumb, E. W.; Thoman, R. L.

    2015-06-01

    In May 2013, a massive ice jam on the Yukon River caused flooding that destroyed much of the infrastructure in the Interior Alaska village of Galena and forced the long-term evacuation of nearly 70% of its residents. This case study compares the communication efforts of the out-of-state emergency response agents with those of the Alaska River Watch program, a state-operated flood preparedness and community outreach initiative. For over 50 years, the River Watch program has been fostering long-lasting, open, and reciprocal communication with flood-prone communities, as well as local emergency management and tribal officials. By taking into account cultural, ethnic, and socioeconomic features of rural Alaskan communities, the River Watch program was able to establish and maintain a sense of partnership and reliable communication patterns with communities at risk. As a result, officials and residents in these communities are open to information and guidance from the River Watch during the time of a flood, and thus are poised to take prompt actions. By informing communities of existing ice conditions and flood threats on a regular basis, the River Watch provides effective mitigation efforts in terms of ice jam flood effects reduction. Although other ice jam mitigation attempts have been made throughout U.S. and Alaskan history, the majority proved to be futile and/or cost-ineffective. Galena, along with other rural riverine Alaskan communities, has to rely primarily on disaster response and recovery strategies to withstand the shock of disasters. Significant government funds are spent on these challenging efforts, and these expenses might be reduced through an improved understanding of both the physical and climatological principles behind river ice breakup and risk mitigation. This study finds that long-term dialogue is critical for effective disaster response and recovery during extreme hydrological events connected to changing climate, timing of river ice breakup, and flood occurrence in rural communities of the Far North.

  3. Decay extent evaluation of wood degraded by a fungal community using NIRS: application for ecological engineering structures used for natural hazard mitigation

    NASA Astrophysics Data System (ADS)

    Baptiste Barré, Jean; Bourrier, Franck; Bertrand, David; Rey, Freddy

    2015-04-01

    Ecological engineering corresponds to the design of efficient solutions for protection against natural hazards such as shallow landslides and soil erosion. In particular, bioengineering structures can be composed of a living part, made of plants, cuttings or seeds, and an inert part, a timber log structure. As the wood is not treated with preservatives, fungal degradation can occur from the start of the construction. It results in wood strength loss, which practitioners try to evaluate with non-destructive tools (NDT). Classical NDT are mainly based on density measurements. However, fungal activity reduces the mechanical properties (modulus of elasticity - MOE) well before a density change can be measured. In this context, it would be useful to provide a tool for assessing the residual mechanical strength at different decay stages due to a fungal community. Near-infrared spectroscopy (NIRS) can be used for that purpose, as it allows evaluation of wood mechanical properties as well as wood chemical changes due to brown and white rots. We monitored 160 silver fir samples (30x30x6000mm) from the green state to different levels of decay. The degradation process took place in a greenhouse and samples were inoculated with decayed silver fir debris in order to accelerate the process. For each sample, we calculated the normalized bending modulus of elasticity loss (Dw moe) and defined it as the decay extent. Near-infrared spectra collected from both green and decayed ground samples were corrected by the subtraction of the baseline offset. Spectra of green samples were averaged into one mean spectrum and decayed spectra were subtracted from the mean spectrum to calculate the absorption loss. Partial least squares regression (PLSR) has been performed between the normalized MOE loss Dw moe (0 < Dw moe < 1) and the absorption loss, with a coefficient of determination R² equal to 0.85. Finally, the prediction of the silver fir biodegradation rate by NIRS was significant (RMSEP = 0.13). This tool improves the evaluation accuracy of wood decay extent in the context of ecological engineering structures used for natural hazard mitigation.
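
    A minimal sketch of the PLSR step described above, with synthetic spectra standing in for the measured absorption-loss data (the number of components and the data are assumptions, not the study's):

        # Minimal sketch (synthetic data): partial least squares regression relating
        # NIR absorption-loss spectra to the normalized bending-MOE loss (decay
        # extent), with R^2 and RMSEP reported on a held-out test set.
        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import r2_score, mean_squared_error

        rng = np.random.default_rng(1)
        n_samples, n_wavelengths = 160, 300
        decay = rng.uniform(0, 1, n_samples)                  # Dw_moe in [0, 1]
        basis = rng.normal(size=n_wavelengths)                # spectral signature of decay
        spectra = np.outer(decay, basis) + 0.2 * rng.normal(size=(n_samples, n_wavelengths))

        Xtr, Xte, ytr, yte = train_test_split(spectra, decay, test_size=0.3, random_state=1)
        pls = PLSRegression(n_components=5)
        pls.fit(Xtr, ytr)
        pred = pls.predict(Xte).ravel()

        print("R^2  :", round(r2_score(yte, pred), 2))
        print("RMSEP:", round(mean_squared_error(yte, pred) ** 0.5, 3))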

  4. The development and mitigation of backdraft: a real-scale shipboard study

    Microsoft Academic Search

    Daniel T. Gottuk; Michelle J. Peatross; John P. Farley; Frederick W. Williams

    1999-01-01

    This paper presents the results of a real-scale experimental test series to study the development and mitigation of backdrafts. Experiments consisted of creating backdrafts onboard a US Navy test ship, ex-USS SHADWELL. This study has shown that the key parameter for backdraft development is the fuel mass fraction. The results show that the critical fuel mass fraction, Yf, required for

  5. Satellite Monitoring of Ash and Sulphur Dioxide for the mitigation of Aviation Hazards: Part II. Validation of satellite-derived Volcanic Sulphur Dioxide Levels.

    NASA Astrophysics Data System (ADS)

    Koukouli, MariLiza; Balis, Dimitris; Dimopoulos, Spiros; Clarisse, Lieven; Carboni, Elisa; Hedelt, Pascal; Spinetti, Claudia; Theys, Nicolas; Tampellini, Lucia; Zehner, Claus

    2014-05-01

    The eruption of the Icelandic volcano Eyjafjallajökull in the spring of 2010 turned the attention of both the public and the scientific community to the susceptibility of the European airspace to the outflows of large volcanic eruptions. The ash-rich plume from Eyjafjallajökull drifted towards Europe and caused major disruptions of European air traffic for several weeks, affecting the everyday life of millions of people and having a strong economic impact. This unparalleled situation revealed limitations in the decision-making process due to the lack of information on the tolerance to ash of commercial aircraft engines, as well as limitations in the ash monitoring and prediction capabilities. The European Space Agency project Satellite Monitoring of Ash and Sulphur Dioxide for the mitigation of Aviation Hazards was introduced to facilitate the development of an optimal End-to-End System for Volcanic Ash Plume Monitoring and Prediction. This system is based on comprehensive satellite-derived ash plume and sulphur dioxide [SO2] level estimates, as well as widespread validation using supplementary satellite, aircraft and ground-based measurements. The validation of volcanic SO2 levels extracted from the sensors GOME-2/MetopA and IASI/MetopA is presented here, with emphasis on the total column observed right before, during and after the Eyjafjallajökull 2010 eruptions. Co-located ground-based Brewer Spectrophotometer data extracted from the World Ozone and Ultraviolet Radiation Data Centre, WOUDC, were compared to the different satellite estimates. The findings are presented at length, alongside a comprehensive discussion of future scenarios.
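
    A small sketch of the kind of co-location comparison such a validation reports (bias, RMSE, correlation); the values below are synthetic, not GOME-2, IASI or Brewer measurements:

        # Illustrative validation sketch (synthetic values): compare co-located
        # satellite and ground-based SO2 total columns (Dobson Units).
        import numpy as np

        rng = np.random.default_rng(2)
        brewer = rng.uniform(0.5, 15.0, 60)                     # ground-based columns, DU
        satellite = 0.9 * brewer + rng.normal(0.0, 1.0, 60)     # co-located satellite columns

        diff = satellite - brewer
        print("mean bias (DU):", round(diff.mean(), 2))
        print("RMSE (DU)     :", round(np.sqrt((diff ** 2).mean()), 2))
        print("correlation r :", round(np.corrcoef(satellite, brewer)[0, 1], 2))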

  6. 44 CFR 78.6 - Flood Mitigation Plan approval process.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...Assistance 1 2010-10-01 2010-10-01 false Flood Mitigation Plan approval process. 78.6 Section...SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program FLOOD MITIGATION ASSISTANCE § 78.6 Flood...

  7. 44 CFR 78.6 - Flood Mitigation Plan approval process.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...Assistance 1 2011-10-01 2011-10-01 false Flood Mitigation Plan approval process. 78.6 Section...SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program FLOOD MITIGATION ASSISTANCE § 78.6 Flood...

  8. 44 CFR 78.6 - Flood Mitigation Plan approval process.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ...Assistance 1 2014-10-01 2014-10-01 false Flood Mitigation Plan approval process. 78.6 Section...SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program FLOOD MITIGATION ASSISTANCE § 78.6 Flood...

  9. 44 CFR 78.6 - Flood Mitigation Plan approval process.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ...Assistance 1 2013-10-01 2013-10-01 false Flood Mitigation Plan approval process. 78.6 Section...SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program FLOOD MITIGATION ASSISTANCE § 78.6 Flood...

  10. 44 CFR 78.6 - Flood Mitigation Plan approval process.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ...Assistance 1 2012-10-01 2011-10-01 true Flood Mitigation Plan approval process. 78.6 Section...SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program FLOOD MITIGATION ASSISTANCE § 78.6 Flood...

  11. Hazards and operability study for the surface moisture monitoring system

    SciTech Connect

    Board, B.D.

    1996-04-04

    The Hanford Nuclear Reservation Tank Farms' underground waste tanks have been used to store liquid radioactive waste from defense materials production since the 1940s. Waste in certain of the tanks may contain material in the form of ferrocyanide or various organic compounds which could potentially be susceptible to condensed phase chemical reactions. Because of the presence of oxidizing materials (nitrate compounds) and heat sources (radioactive decay and chemical reactions), the ferrocyanide or organic material could potentially fuel a propagating exothermic reaction with undesirable consequences. Analysis and experiments indicate that the reaction propagation and/or initiation may be prevented by the presence of sufficient moisture in the waste. Because the reaction would probably be initiated at the surface of the waste, confirmation of sufficient moisture concentration would help provide evidence that the tank waste can continue to be safely stored. The Surface Moisture Measurement System (SMMS) was developed to collect data on the surface moisture in the waste by inserting two types of probes (singly) into a waste tank: a neutron probe and an electromagnetic inductance (EMI) probe. The sensor probes will be placed on the surface of the waste utilizing a moveable deployment arm to lower them through an available riser. The movement of the SMMS within the tank will be monitored by a camera lowered through an adjacent riser. The SMMS equipment is the subject of this study. Hazards and Operability Analysis (HAZOP) is a systematic technique for assessing potential hazards and/or operability problems for a new activity. It utilizes a multidiscipline team of knowledgeable individuals in a systematic brainstorming effort. The results of this study will be used as input to an Unreviewed Safety Question determination.

  12. A study of shock mitigating materials in a split Hopkinson bar configuration. Phase 1

    SciTech Connect

    Bateman, V.I.; Brown, F.A.; Hansen, N.R.

    1998-06-01

    Sandia National Laboratories (SNL) designs mechanical systems with electronics that must survive high shock environments. These mechanical systems include penetrators that must survive soil, rock, and ice penetration, nuclear transportation casks that must survive transportation environments, and laydown weapons that must survive delivery impact of 125 fps. These mechanical systems contain electronics that may operate during and after the high shock environment and that must be protected from the high shock environments. A study has been started to improve the packaging techniques for the advanced electronics utilized in these mechanical systems because current packaging techniques are inadequate for these more sensitive electronics. In many cases, it has been found that the packaging techniques currently used not only do not mitigate the shock environment but actually amplify the shock environment. An ambitious goal for this packaging study is to avoid amplification and possibly attenuate the shock environment before it reaches the electronics contained in the various mechanical systems. As part of the investigation of packaging techniques, a two-phase study of shock mitigating materials is being conducted. The purpose of the first phase reported here is to examine the performance of a joint that consists of shock mitigating material sandwiched in between steel and to compare the performance of the shock mitigating materials. A split Hopkinson bar experimental configuration simulates this joint and has been used to study the shock mitigating characteristics of seventeen unconfined materials. The nominal input for these tests is an incident compressive wave with 50 fps peak (1,500 με peak) amplitude and a 100 μs duration (measured at 10% amplitude).

  13. Analyzing costs of space debris mitigation methods

    NASA Astrophysics Data System (ADS)

    Wiedemann, C.; Krag, H.; Bendisch, J.; Sdunnus, H.

    2004-01-01

    The steadily increasing number of space objects poses a considerable hazard to all kinds of spacecraft. To reduce the risks to future space missions, different debris mitigation measures and spacecraft protection techniques have been investigated in recent years. However, the economic efficiency has not yet been considered in this context. Current studies aim to evaluate the mission costs due to space debris in a business-as-usual (no mitigation) scenario compared to the mission costs considering debris mitigation. The aim is an estimation of the time until the investment in debris mitigation will lead to an effective reduction of mission costs. This paper presents the results of investigations on the key issues of cost estimation for spacecraft and the influence of debris mitigation and shielding on cost. Mitigation strategies like the reduction of orbital lifetime and de- or re-orbit of non-operational satellites are methods to control the space debris environment. These methods result in an increase of costs. In a first step, the overall costs of different types of unmanned satellites are analyzed. A selected cost model is simplified and generalized for an application on all operational satellites. In a next step, the influence of space debris on cost is treated when the implementation of mitigation strategies is considered.
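
    A toy break-even sketch of the time-to-payback idea described above; all cost figures are hypothetical assumptions, and the paper's cost model is far more detailed:

        # Toy break-even sketch (hypothetical numbers): compare cumulative mission
        # costs in a business-as-usual scenario with a mitigation scenario that adds
        # an upfront cost but reduces expected debris-related losses.
        annual_loss_bau = 4.0e6      # assumed yearly debris-related loss, no mitigation ($)
        annual_loss_mit = 2.5e6      # assumed yearly loss with mitigation ($)
        mitigation_capex = 12.0e6    # assumed upfront cost of mitigation measures ($)

        cum_bau, cum_mit = 0.0, mitigation_capex
        for year in range(1, 51):
            cum_bau += annual_loss_bau
            cum_mit += annual_loss_mit
            if cum_mit <= cum_bau:
                print(f"mitigation breaks even after {year} years")
                break
        else:
            print("no break-even within 50 years")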

  14. Methodological Issues In Forestry Mitigation Projects: A CaseStudy Of Kolar District

    SciTech Connect

    Ravindranath, N.H.; Murthy, I.K.; Sudha, P.; Ramprasad, V.; Nagendra, M.D.V.; Sahana, C.A.; Srivathsa, K.G.; Khan, H.

    2007-06-01

    There is a need to assess climate change mitigation opportunities in the forest sector in India in the context of methodological issues such as additionality, permanence, leakage, measurement and baseline development in formulating forestry mitigation projects. A case study of a forestry mitigation project in semi-arid community grazing lands and farmlands in Kolar district of Karnataka was undertaken with regard to baseline and project scenario development, estimation of carbon stock change in the project, leakage estimation and assessment of cost-effectiveness of mitigation projects. Further, the transaction costs to develop the project, and the environmental and socio-economic impact of the mitigation project, were assessed. The study shows the feasibility of establishing baselines and project C-stock changes. Since the area has low or insignificant biomass, leakage is not an issue. The overall mitigation potential in Kolar for a total area of 14,000 ha under various mitigation options is 278,380 tC at a rate of 20 tC/ha for the period 2005-2035, which is approximately 0.67 tC/ha/yr inclusive of harvest regimes under short rotation and long rotation mitigation options. The transaction cost for baseline establishment is less than a rupee/tC and for project scenario development is about Rs. 1.5-3.75/tC. The project enhances biodiversity and the socio-economic impact is also significant.

  15. A procedure for global flood hazard mapping - the Africa case study

    NASA Astrophysics Data System (ADS)

    Dottori, Francesco; Salamon, Peter; Feyen, Luc; Barbosa, Paulo

    2015-04-01

    River floods are recognized as one of the major causes of economic damages and loss of human lives worldwide, and their impact in the next decades could be dramatically increased by socio-economic and climatic changes. In Africa, the availability of tools and models for predicting, mapping and analysing flood hazard and risk is still limited. Consistent, high-resolution (1km or less), continental-scale hazard maps are extremely valuable for local authorities and water managers to mitigate flood risk and to reduce catastrophic impacts on population and assets. The present work describes the development of a procedure for global flood hazard mapping, which is tested and applied over Africa to derive continental flood hazard maps. We derive a long-term dataset of daily river discharges from global hydrological simulations to design flood hydrographs for different return periods for the major African river network. We then apply a hydrodynamic model to identify flood-prone areas in major river catchments, which are merged to create pan-African flood hazard maps at 900m resolution. The flood map designed for a return period of 20 years is compared with a mosaic of satellite images showing all flooded areas in the period 2000-2014. We discuss strengths and limitations emerging from the comparison and present potential future applications and developments of the methodology.
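
    A minimal sketch of the return-period step described above: fit an extreme-value distribution to annual maximum discharges and read off design flood peaks for chosen return periods. The discharges below are synthetic, not the global hydrological simulations used in the study:

        # Sketch of design-flood estimation (synthetic discharges): fit a GEV
        # distribution to annual maxima and derive peaks for given return periods.
        import numpy as np
        from scipy.stats import genextreme

        annual_max_q = genextreme.rvs(c=-0.1, loc=800, scale=250, size=40,
                                      random_state=3)          # m^3/s, synthetic record

        params = genextreme.fit(annual_max_q)   # (shape, loc, scale) estimated from data
        for T in (20, 100, 500):
            q_T = genextreme.ppf(1.0 - 1.0 / T, *params)
            print(f"{T:>3}-year flood peak: {q_T:,.0f} m^3/s")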

  16. The 5 key questions coping with risks due to natural hazards, answered by a case study

    NASA Astrophysics Data System (ADS)

    Hardegger, P.; Sausgruber, J. T.; Schiegg, H. O.

    2009-04-01

    Based on Maslow's hierarchy of needs, human endeavours concern primarily existential needs, consequently, to be safeguarded against both natural as well as man-made threats. The subsequent needs are to realize opportunities in a variety of fields, such as economics and many others. Independently of the field, the 5 crucial questions are the same as those for coping with risks due to natural hazards specifically. These 5 key questions are: I) What is the impact as a function of space and time? II) What protection measures comply with the general opinion and how much do they mitigate the threat? III) How can the loss be adequately quantified and monetized? IV) What budget for prevention and reserves for restoration and compensation are to be planned? V) Which mix of measures and allocation of resources is sustainable and thus optimal? The 5 answers, exemplified by a case study concerning the sustainable management of risk due to the debris flows of the Enterbach / Inzing / Tirol / Austria, are as follows: I) The impact, created by both the propagation of flooding and sedimentation, has been forecasted by modeling (numerical simulation) the 30, 50, 100, 150, 300 and 1000 year debris flow. The input was specified by detailed studies in meteorology, precipitation and runoff, in geology, hydrogeology, geomorphology and slope stability, in hydraulics, sediment transport and debris flow, in forestry, agriculture and development of communal settlement and infrastructure. All investigations were performed according to the method of ETAlp (Erosion and Transport in Alpine systems). ETAlp has been developed in order to achieve sustainable development in alpine areas and has been evaluated by the research project "nab", within the context of the EU-Interreg IIIb projects. II) The risk mitigation measures of concern are in hydraulics on the one hand and in forestry on the other hand. Such risk management is evaluated according to sustainability, which means economic, ecological and social, in short, "triple" compatibility. 100% protection against the 100 year event proves to be the optimal degree of protection. Consequently, impacts statistically less frequent than once in 100 years are accepted as the remaining risk. Such floods and debris flows respectively cause a fan of propagation which is substantially reduced due to the protection measures against the 100 year event. III) The "triple loss distribution" shows the monetized triple damage, dependent on its probability. The monetization is performed by the social process of participation of the impacted interests, if not, by official experts in representation. The triple loss distribution rises in time mainly due to the rise in density and value of precious goods. A comparison of the distributions of the triple loss and the triple risk, behaving in opposite directions, is shown and explained within the project. IV) The recommended yearly reserves to be stocked for restoration and compensation of losses caused by debris flows amount to € 70'000.- according to the approach of the "technical risk premium". The discrepancy in comparison with the much higher amounts according to the common approaches of natural hazards engineering is discussed. V) The sustainable mix of hydraulic and forestry measures with the highest return on investment at lowest risk is determined according to portfolio theory (Markowitz), based on the triple value curves generated by the method of TripelBudgetierung®. 
Accordingly, the optimum mix of measures to protect the community of Inzing against the natural hazard of debris flow, and thus the most efficient allocation of resources, is 2/3 for hydraulic and 1/3 for forestry measures. In detail, the results of the research pilot project "Nachhaltiges Risikomanagement - Enterbach / Inzing / Tirol / Austria" may be consulted under www.ibu.hsr.ch/inzing.
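
    A minimal sketch of the "technical risk premium" idea in answer IV: the yearly reserve is approximated by the expected annual loss, i.e. the integral of the loss-exceedance curve over annual probability. The loss figures below are assumed for illustration and are not the Enterbach study's values:

        # Sketch of the technical risk premium (hypothetical losses): expected annual
        # loss as the integral of monetized loss over annual exceedance probability.
        import numpy as np

        return_periods = np.array([30, 50, 100, 150, 300, 1000], dtype=float)  # years
        losses = np.array([0.2e6, 0.5e6, 1.5e6, 2.5e6, 6.0e6, 15.0e6])          # EUR, assumed

        annual_exceedance = 1.0 / return_periods        # probability of exceedance per year
        order = np.argsort(annual_exceedance)           # integrate over increasing probability
        eal = np.trapz(losses[order], annual_exceedance[order])
        print(f"expected annual loss (yearly reserve): EUR {eal:,.0f}")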

  17. NOAA Technical Memorandum ERL PMEL-37 FEASIBILITY STUDY ON MITIGATING TSUNAMI HAZARDS IN THE PACIFIC

    E-print Network

    Lean, Virginia

    Pacific Marine Environmental Laboratory, Seattle, Washington, December 1982, United States Department of Commerce. ...generated by large submarine earthquakes. Since 1850, more than 70,000 lives have been lost in the Pacific

  18. Feasibility study of tank leakage mitigation using subsurface barriers

    SciTech Connect

    Treat, R.L.; Peters, B.B.; Cameron, R.J.; McCormak, W.D.; Trenkler, T.; Walters, M.F. [Ensearch Environmental, Inc. (United States); Rouse, J.K.; McLaughlin, T.J. [Bovay Northwest, Inc., Richland, WA (United States); Cruse, J.M. [Westinghouse Hanford Co., Richland, WA (United States)

    1994-09-21

    The US Department of Energy (DOE) has established the Tank Waste Remediation System (TWRS) to safely manage and dispose of the waste currently stored in the underground storage tanks. The retrieval element of TWRS includes a work scope to develop subsurface impermeable barriers beneath SSTs. The barriers could serve as a means to contain leakage that may result from waste retrieval operations and could also support site closure activities by facilitating cleanup. Three types of subsurface barrier systems have emerged for further consideration: (1) chemical grout, (2) freeze walls, and (3) desiccant, represented in this feasibility study as a circulating air barrier. This report contains analyses of the costs and relative risks associated with combinations of retrieval technologies and barrier technologies that form 14 alternatives. Eight of the alternatives include the use of subsurface barriers; the remaining six nonbarrier alternatives are included in order to compare the costs, relative risks and other values of retrieval with subsurface barriers. Each alternative includes various combinations of technologies that can impact the risks associated with future contamination of the groundwater beneath the Hanford Site to varying degrees. Other potential risks associated with these alternatives, such as those related to accidents and airborne contamination resulting from retrieval and barrier emplacement operations, are not quantitatively evaluated in this report.

  19. Studies on Hazard Characterization for Performance-based Structural Design 

    E-print Network

    Wang, Yue

    2010-07-14

    Performance-based engineering (PBE) requires advances in hazard characterization, structural modeling, and nonlinear analysis techniques to fully and efficiently develop the fragility expressions and other tools forming the basis for risk...

  20. Feasibility Study of Radiometry for Airborne Detection of Aviation Hazards

    NASA Technical Reports Server (NTRS)

    Gimmestad, Gary G.; Papanicolopoulos, Chris D.; Richards, Mark A.; Sherman, Donald L.; West, Leanne L.; Johnson, James W. (Technical Monitor)

    2001-01-01

    Radiometric sensors for aviation hazards have the potential for widespread and inexpensive deployment on aircraft. This report contains discussions of three aviation hazards - icing, turbulence, and volcanic ash - as well as candidate radiometric detection techniques for each hazard. Dual-polarization microwave radiometry is the only viable radiometric technique for detection of icing conditions, but more research will be required to assess its usefulness to the aviation community. Passive infrared techniques are being developed for detection of turbulence and volcanic ash by researchers in this country and also in Australia. Further investigation of the infrared airborne radiometric hazard detection approaches will also be required in order to develop reliable detection/discrimination techniques. This report includes a description of a commercial hyperspectral imager for investigating the infrared detection techniques for turbulence and volcanic ash.

  1. Land use /Land Cover Approaches as Instruments of Natural Hazard Mitigation in the Manjira River Sub-Basin, Andhra Pradesh, India.

    NASA Astrophysics Data System (ADS)

    THATIPARTI, V. L.

    2001-05-01

    Rapid industrialization during the last three decades had a profound adverse effect on the land use / land cover practices in, and the water quality of, the Manjira River sub-basin, Medak district, Andhra Pradesh, India. As water interacts with all other components of the environment, such as geology, soils, weather and climate, flora and fauna, the pollution of water has affected both biophysical and socioeconomic and cultural environments. The area of study is the catchment of Nakkavagu (stream) in the Manjira river system, which lies between long. 78° 05' - 78° 25' E. and lat. 17° 25' - 17° 45' N., and covers an area of 734 sq. km. Remote Sensing and GIS techniques have been employed to identify and quantify measures for mitigating the adverse impacts of the industrialization and for being prepared for extreme weather events. The methodology employed in the present study involves the generation of various thematic layers like slope, hydrogeomorphology and land use / land cover maps using Landsat MSS, IRS 1A LISS II and IRS 1D LISS III and PAN merged data in the EASI/PACE 6.3 platform. By overlaying all the above thematic maps, action plan maps are generated to devise various ways and means of rolling back the degradation of the environment, and to develop low-cost, people-participatory strategies (such as agricultural practices, use of water bodies and land under urbanization, structural and non-structural, particularly vegetation methods, etc.) of reducing the vulnerability of the population to extreme weather events.

  2. The critical need for moderate to high resolution thermal infrared data for volcanic hazard mitigation and process monitoring from the micron to the kilometer scale

    NASA Astrophysics Data System (ADS)

    Ramsey, M. S.

    2006-12-01

    The use of satellite thermal infrared (TIR) data to rapidly detect and monitor transient thermal events such as volcanic eruptions commonly relies on datasets with coarse spatial resolution (1.0 - 8.0 km) and high temporal resolution (minutes to hours). However, the growing need to extract physical parameters at meter to sub-meter scales requires data with improved spectral and spatial resolution. Current orbital systems such as the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) and the Landsat Enhanced Thematic Mapper plus (ETM+) can provide TIR data ideal for this type of scientific analysis, assessment of hazard risks, and smaller-scale monitoring, but at the expense of rapid repeat observations. A potential solution to this apparent conflict is to combine the spatial and temporal scales of TIR data in order to provide the benefits of rapid detection together with the potential of detailed science return. Such a fusion is now in place using ASTER data collected in the north Pacific region to monitor the Aleutian and Kamchatka arcs. However, this approach of cross-instrument/cross-satellite monitoring is in jeopardy with the lack of planned moderate resolution TIR instruments following ETM+ and ASTER. This data collection program is also being expanded globally, and was used in 2006 to assist in the response and monitoring of the volcanic crisis at Merapi Volcano in Indonesia. Merapi Volcano is one of the most active volcanoes in the country and lies in central Java north of the densely-populated city of Yogyakarta. Pyroclastic flows and lahars are common following the growth and collapse of the summit lava dome. These flows can be fatal and were the major hazard concern during a period of renewed activity beginning in April 2006. Lava at the surface was confirmed on 25 April and ASTER was tasked with an urgent request observation, subsequently collecting data on 26 April (daytime) and 28 April (nighttime). The TIR revealed thermally-elevated pixels (max = 25.9 °C) clustered near the summit with a lesser anomaly (max = 15.5 °C) approximately 650 m to the southwest and downslope from the summit. Such small-scale and low-grade thermal features confirmed the increased activity state of the volcano, and their detection was only made possible with the moderate spatial, spectral, and radiometric resolution of ASTER. ASTER continued to collect data for the next 12 weeks, tracking the progress of large-scale pyroclastic flows, the growth of the lava dome, and the path of ash-rich plumes. Data from these observations were reported world-wide and used for evacuation and hazard planning purposes. With the pending demise of such TIR data from orbit, research is also focused on the use of handheld TIR instruments such as the forward-looking infrared radiometer (FLIR) camera. These instruments provide the highest spatial resolution in-situ TIR data and have been used to observe numerous volcanic phenomena and quantitatively model others (e.g., the rise of the magma body preceding the eruption of Mt. St. Helens Volcano; the changes on the lava dome at Bezymianny Volcano; the behavior of basalt crusts during pahoehoe flow inflation). Studies such as these confirm the utility and importance of future moderate to high resolution TIR data in order to understand volcanic processes and their accompanying hazards.
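
    A simple sketch of flagging thermally elevated pixels of the kind described above: pixels exceeding the scene background by a chosen threshold are reported. The grid and threshold are synthetic assumptions, not ASTER data or the operational detection criteria:

        # Synthetic sketch: flag thermally elevated pixels in a TIR brightness-
        # temperature image by comparing each pixel to the scene background (median).
        import numpy as np

        rng = np.random.default_rng(4)
        scene = rng.normal(5.0, 1.5, (60, 60))      # background temperatures, deg C
        scene[30:33, 30:33] += 20.0                  # hypothetical summit anomaly
        scene[45:46, 20:22] += 10.0                  # weaker downslope anomaly

        background = np.median(scene)
        threshold = 6.0                              # K above background (assumed)
        anomalous = np.argwhere(scene > background + threshold)

        print("background (deg C):", round(float(background), 1))
        print("number of thermally elevated pixels:", len(anomalous))
        print("maximum pixel temperature (deg C):", round(float(scene.max()), 1))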

  3. Rock cliffs hazard analysis based on remote geostructural surveys: The Campione del Garda case study (Lake Garda, Northern Italy)

    NASA Astrophysics Data System (ADS)

    Ferrero, A. M.; Migliazza, M.; Roncella, R.; Segalini, A.

    2011-02-01

    The town of Campione del Garda (located on the west coast of Lake Garda) and its access road have been historically subject to rockfall phenomena, with risk for public security in several areas of the coast. This paper presents a study devoted to the determination of risk for coastal cliffs and the design of mitigation measures. Our study was based on statistical rockfall analysis performed with a commercial code and on stability analysis of rock slopes based on the key block method. Hazards from block kinematics and rock-slope failure are coupled by applying the Rockfall Hazard Assessment Procedure (RHAP). Because of the huge dimensions of the slope, the survey of its morphology and the geostructural survey were particularly complicated and demanding. For these reasons, noncontact measurement methods, based on aerial photogrammetry by helicopter, were adopted. A special software program, developed by the authors, was applied for discontinuity identification and orientation measurement. The potential of aerial photogrammetric surveys in rock mechanics applications, and their contribution to improved knowledge of the rock mass, are analysed in the article.

  4. Landslide hazard analysis - a case study in WuShe reservoir catchment

    NASA Astrophysics Data System (ADS)

    Huang, C. M.

    2014-12-01

    A complete landslide inventory covering a long time span is the foundation for landslide hazard analysis. Previous studies estimated only landslide susceptibility because a sufficient landslide inventory was usually unavailable. This study collects SPOT images of ten events from 1994 to 2009 over the WuShe reservoir catchment in central Taiwan. All landslide inventories were manually interpreted from the SPOT images. The new and expanded landslides due to each event were identified by comparing the pre-event and post-event inventories, and the new landslide inventories were used to conduct the landslide hazard evaluation. This study follows the landslide hazard analysis framework of CNR-IRPI in Italy, which demonstrated how the spatial probability, temporal probability and size probability of each slope unit are calculated. Landslide hazard in the WuShe reservoir catchment was obtained following this procedure. The preliminary validation shows that the prediction of landslide hazard in high-hazard regions was better when compared with landslide susceptibility. However, the reliability of the temporal probability and size probability is related to the number of landslide inventories: when no landslide is recorded, or a suitable landslide inventory is lacking, the landslide hazard analysis will be incorrect. The event-based spatial probability was affected by rainfall distribution, but the rainfall distributions of different typhoons were usually highly variable. The next phase of this study will attempt to address these issues and develop a suitable framework for landslide hazard analysis in Taiwan.
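
    A minimal sketch of the per-slope-unit hazard combination used in frameworks of this kind: the joint probability of spatial, temporal (Poisson) and size probabilities. The numbers are illustrative assumptions, not values from the WuShe analysis:

        # Sketch of hazard = P(spatial) * P(temporal) * P(size) for one slope unit,
        # with the temporal term from a Poisson model, P_t = 1 - exp(-t / T_mean).
        import math

        def landslide_hazard(p_spatial, mean_recurrence_yr, horizon_yr, p_size):
            """Joint probability that a slope unit hosts a landslide of at least a
            given size within the forecast horizon."""
            p_temporal = 1.0 - math.exp(-horizon_yr / mean_recurrence_yr)
            return p_spatial * p_temporal * p_size

        # Example slope unit: 70% susceptibility, one triggering event every ~8 years
        # on average, 30% chance the failure exceeds the size of interest, 10-year horizon.
        h = landslide_hazard(p_spatial=0.7, mean_recurrence_yr=8.0, horizon_yr=10.0, p_size=0.3)
        print(f"10-year landslide hazard for this slope unit: {h:.2f}")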

  5. LAND CLASSIFICATION USED TO SELECT ABANDONED HAZARDOUS WASTE STUDY SITES

    EPA Science Inventory

    The biological effects of hazardous substances in the environment are influenced by climate, physiography, and biota. These factors interact to determine the transport and fate of chemicals, but are difficult to model accurately except for small areas with a large data base. The ...

  6. Factors in Perception of Tornado Hazard: An Exploratory Study.

    ERIC Educational Resources Information Center

    de Man, Anton; Simpson-Housley, Paul

    1987-01-01

    Administered questionnaire on tornado hazard to 142 adults. Results indicated that subject's gender and education level were best predictors of perceived probability of tornado recurrence; that ratings of severity of potential damage were related to education level; and that gender accounted for significant percentage of variance in anxiety…

  7. Land Use/Land Cover Approaches as Instruments of Natural Hazard Mitigation in the Manjira River Sub-Basin, Andhra Pradesh, India

    NASA Astrophysics Data System (ADS)

    Lakshmi, T. V.; Reddy, M. A.; Anjaneyulu, Y.

    2001-05-01

    Rapid industrialization during the last three decades has had a profound adverse effect on land use/land cover practices and on water quality in the Manjira River sub-basin, Medak District, Andhra Pradesh, India. Because water interacts with all other components of the environment, such as geology, soils, weather and climate, and flora and fauna, the pollution of water has affected both the biophysical and the socioeconomic and cultural environments. The study area is the catchment of the Nakkavagu (stream) in the Manjira river system, which lies between longitudes 78°05' and 78°25' E and latitudes 17°25' and 17°45' N, and covers an area of 734 sq. km. Remote sensing and GIS techniques have been employed to identify and quantify measures for mitigating the adverse impacts of industrialization and for preparedness for extreme weather events. The methodology involves the generation of thematic layers such as slope, hydrogeomorphology and land use/land cover maps using Landsat MSS, IRS 1A LISS II, IRS 1D LISS III and PAN merged data on the EASI/PACE ver. 6.3 platform. By overlaying these thematic maps, action plan maps are generated to devise ways of reversing the degradation of the environment and to develop low-cost, people-participatory strategies (such as agricultural practices, use of water bodies and land under urbanization, and structural and non-structural measures, particularly vegetation methods) for reducing the vulnerability of the population to extreme weather events.
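    The overlay of thematic layers into action plan maps is described only in general terms above; the short Python sketch below illustrates one common way such an overlay is done in raster form, as a weighted sum of reclassified layers. The layer names, scores and weights are illustrative assumptions, not those of the study.

      import numpy as np

      # Thematic layers reclassified to 0-1 suitability scores (assumed inputs).
      slope_score = np.random.rand(200, 200)
      hydrogeo_score = np.random.rand(200, 200)
      lulc_score = np.random.rand(200, 200)

      # Illustrative weights; a real study would derive these from expert judgement.
      weights = {"slope": 0.3, "hydrogeo": 0.3, "lulc": 0.4}

      composite = (weights["slope"] * slope_score
                   + weights["hydrogeo"] * hydrogeo_score
                   + weights["lulc"] * lulc_score)

      # Classify the composite score into action-plan priority classes.
      action_plan = np.digitize(composite, bins=[0.33, 0.66])  # 0=low, 1=moderate, 2=high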

  8. Development, Implementation, and Pilot Evaluation of a Model-Driven Envelope Protection System to Mitigate the Hazard of In-Flight Ice Contamination on a Twin-Engine Commuter Aircraft

    NASA Technical Reports Server (NTRS)

    Martos, Borja; Ranaudo, Richard; Norton, Billy; Gingras, David; Barnhart, Billy

    2014-01-01

    Fatal loss-of-control accidents have been directly related to in-flight airframe icing. The prototype system presented in this report directly addresses the need for real-time onboard envelope protection in icing conditions. The combination of prior information and real-time aerodynamic parameter estimation is shown to provide sufficient information for determining safe limits of the flight envelope during in-flight icing encounters. The Icing Contamination Envelope Protection (ICEPro) system was designed and implemented to identify degradations in airplane performance and flying qualities resulting from ice contamination and to provide safe flight-envelope cues to the pilot. The utility of the ICEPro system for mitigating a potentially hazardous icing condition was evaluated by 29 pilots using the NASA Ice Contamination Effects Flight Training Device. Results showed that real-time assessment cues were effective in reducing the number of potentially hazardous upset events and in lessening exposure to loss of control following an incipient upset condition. Pilot workload with the added ICEPro displays was not measurably affected, and pilot opinion surveys showed that real-time cueing greatly improved awareness of a hazardous aircraft state. The performance of the ICEPro system was further evaluated under various levels of sensor noise and atmospheric turbulence.
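    The abstract does not describe ICEPro's internal algorithms. Purely to illustrate the kind of envelope cue such a system can derive from an estimated aerodynamic degradation, the sketch below recomputes a 1-g stall speed when the estimated maximum lift coefficient is reduced by ice; all numbers and the function are assumptions, not ICEPro's actual method.

      import math

      def stall_speed(weight_n, wing_area_m2, cl_max, rho=1.225):
          # 1-g stall speed (m/s) from the standard lift equation.
          return math.sqrt(2.0 * weight_n / (rho * wing_area_m2 * cl_max))

      clean_cl_max = 1.6      # assumed clean-wing maximum lift coefficient
      iced_cl_max = 1.2       # assumed ice-degraded value estimated in flight
      weight = 80_000.0       # aircraft weight, N (illustrative)
      wing_area = 28.0        # wing reference area, m^2 (illustrative)

      v_s_clean = stall_speed(weight, wing_area, clean_cl_max)
      v_s_iced = stall_speed(weight, wing_area, iced_cl_max)
      print(f"Minimum-speed cue raised by {(v_s_iced - v_s_clean) * 1.9438:.1f} kt due to ice")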

  9. Seismic hazard studies for the High Flux Beam Reactor at Brookhaven National Laboratory

    SciTech Connect

    Costantino, C.J.; Heymsfield, E. (City Coll., New York, NY (United States). Dept. of Civil Engineering); Park, Y.J.; Hofmayer, C.H. (Brookhaven National Lab., Upton, NY (United States))

    1991-01-01

    This paper presents the results of a calculation to determine the site-specific seismic hazard appropriate for the deep soil site at Brookhaven National Laboratory (BNL), which is to be used in the risk assessment studies being conducted for the High Flux Beam Reactor (HFBR). The calculations use as input the seismic hazard defined for the bedrock outcrop by a study conducted at Lawrence Livermore National Laboratory (LLNL). Variability in site soil properties was included in the calculations to obtain the seismic hazard at the ground surface and to compare these results with those using the generic amplification factors from the LLNL study. 9 refs., 8 figs.
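    The LLNL-based calculation itself is not reproduced in this abstract. As a much simpler illustration of how a bedrock hazard curve can be translated to the ground surface, the sketch below applies a single deterministic amplification factor and interpolates the rock curve on log-log axes; the curve values and the factor are assumptions, and the actual study propagated variability in soil properties rather than using one fixed factor.

      import numpy as np

      # Illustrative bedrock hazard curve: annual probability of exceeding PGA (g).
      rock_pga = np.array([0.05, 0.1, 0.2, 0.4, 0.8])
      rock_annual_prob = np.array([1e-2, 4e-3, 1e-3, 2e-4, 2e-5])

      amplification = 1.8  # assumed deep-soil amplification factor

      def surface_hazard(pga_surface):
          # P[PGA_surface > a] = P[PGA_rock > a / AF] for a deterministic factor,
          # interpolated on log-log axes of the bedrock curve.
          equiv_rock = pga_surface / amplification
          logp = np.interp(np.log(equiv_rock), np.log(rock_pga), np.log(rock_annual_prob))
          return np.exp(logp)

      for a in (0.1, 0.2, 0.4):
          print(f"PGA {a:.2f} g: annual exceedance ~ {surface_hazard(a):.1e}")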

  10. The asteroid impact threat: instrumentation for mitigation precursor and demo missions, a study from the NEOShield project

    NASA Astrophysics Data System (ADS)

    Perna, D.; Barucci, M. A.; Fulchignoni, M.; Fornasier, S.

    2013-09-01

    The NEOShield project [1], started in January 2012, has been funded by the European Union for a period of 3.5 years. The primary aim of the project is to study in detail the three most promising techniques for mitigating the asteroid impact risk (the kinetic impactor, blast deflection, and the gravity tractor) and to devise feasible demonstration missions. NEOShield also aims to address the issue of the still-missing international agreement on how to deal with the impact threat and how to organize, prepare, and implement mitigation plans. Within the NEOShield consortium, LESIA is the lead institute for the physical characterization of near-Earth objects (NEOs). We are currently studying the appropriate instrumentation for both mitigation precursor missions and mitigation demonstration missions.

  11. A Study on Integrated Community Based Flood Mitigation with Remote Sensing Technique in Kota Bharu, Kelantan

    NASA Astrophysics Data System (ADS)

    'Ainullotfi, A. A.; Ibrahim, A. L.; Masron, T.

    2014-02-01

    This study is conducted to establish a community-based flood management system integrated with remote sensing techniques. To capture local knowledge, the demographics of the local society are obtained using a survey approach. The local authorities are approached first to obtain information regarding the society in the study areas, such as the population, gender and the distribution of settlements. Information about age, religion, ethnicity, occupation and years of experience with floods in the area is recorded to understand how local knowledge emerges. Geographic data such as rainfall, land use, land elevation and river discharge are then obtained and used to establish a hydrological model of flooding in the study area. Analyses of the survey data are used to understand the structure of the society and how it reacts to floods, while the analysis of the geographic data is used to assess the water extent and the damage done by the flood. The final result of this research is a flood mitigation method with a community-based framework for the state of Kelantan. With a flood mitigation approach that combines the community's understanding of floods with remote sensing techniques to forecast heavy rainfall and flood occurrence, it is hoped that casualties and damage to society and infrastructure in the study area can be reduced.

  12. Assessment of Natural Hazard Damage and Reconstruction: A Case Study from Banda Aceh, Indonesia

    E-print Network

    Gillespie, Thomas; Frankenberg, Elizabeth; Braughton, Matt; Cooke, Abigail M.; Armenta, Tiffany; Thomas, Duncan

    2009-01-01

    Population Research On-Line Working Paper Series.

  13. ASSESSMENT OF HAZARDOUS WASTE SURFACE IMPOUNDMENT TECHNOLOGY CASE STUDIES AND PERSPECTIVES OF EXPERTS

    EPA Science Inventory

    The available data were gathered for a large number of case studies of hazardous waste surface impoundments (SI). Actual and projected performances were compared. This collection, analysis and dissemination of the accumulated experience can contribute significantly to improving S...

  14. Prospective study of hepatic, renal, and haematological surveillance in hazardous materials firefighters

    PubMed Central

    Kales, S; Polyhronopoulos, G; Aldrich, J; Mendoza, P; Suh, J; Christiani, D

    2001-01-01

    OBJECTIVES: To evaluate possible health effects related to work with hazardous materials as measured by end organ effect markers in a large cohort over about 2 years, and in a subcohort over 5 years. METHODS: Hepatic, renal, and haematological variables were analysed from 1996-98 in hazardous materials firefighters, including 288 hazardous materials technicians (81%) and 68 support workers (19%). The same end organ effect markers in a subcohort of the technicians were also analysed (n=35) from 1993-98. Support workers were considered as controls because they are also firefighters, but had a low potential exposure to hazardous materials. RESULTS: During the study period, no serious injuries or exposures were reported. For the end organ effect markers studied, no significant differences were found between technicians and support workers at either year 1 or year 3. After adjustment for a change in laboratory, no significant longitudinal changes were found within groups for any of the markers except for creatinine, which decreased for both technicians (p<0.001) and controls (p<0.01). CONCLUSIONS: Health effects related to work are infrequent among hazardous materials technicians. Haematological, hepatic, and renal testing is not required on an annual basis and has limited use in detecting health effects in hazardous materials technicians. Keywords: hazardous materials; firefighters; medical surveillance. PMID:11160986

  15. Using fine-scale fuel measurements to assess wildland fuels, potential fire behavior and hazard mitigation treatments in the southeastern USA

    Microsoft Academic Search

    Roger D. Ottmar; John I. Blake; William T. Crolly

    2012-01-01

    The inherent spatial and temporal heterogeneity of fuelbeds in forests of the southeastern United States may require fine-scale fuel measurements to provide reliable fire hazard and fuel treatment effectiveness estimates. In a series of five papers, an intensive, fine-scale fuel inventory from the Savannah River Site in the southeastern United States is used for building fuelbeds and mapping

  16. From structural investigation towards multi-parameter early warning systems: geophysical contributions to hazard mitigation at the landslide of Gschliefgraben (Gmunden, Upper Austria)

    Microsoft Academic Search

    Robert Supper; Ivo Baron; Birgit Jochum; Anna Ita; Edmund Winkler; Klaus Motschka; Günter Moser

    2010-01-01

    In December 2007 the large landslide system inside the Gschliefgraben valley (located at the east edge of the Traun lake, Upper Austria), known over centuries for its repeated activity, was reactivated. Although a hazard zone map was already set up in 1974, giving rise to a complete prohibition on building, some hundreds of people are living on the alluvial fan

  17. 15 CFR 923.25 - Shoreline erosion/mitigation planning.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ...COMMERCE OCEAN AND COASTAL RESOURCE MANAGEMENT COASTAL ZONE MANAGEMENT PROGRAM...923.25 Shoreline erosion/mitigation planning...adversely affected by such erosion. This planning process...broader context of coastal hazard mitigation...

  18. Communicating Volcanic Hazards in the North Pacific

    NASA Astrophysics Data System (ADS)

    Dehn, J.; Webley, P.; Cunningham, K. W.

    2014-12-01

    For over 25 years, effective hazard communication has been key to the mitigation of volcanic hazards in the North Pacific. These hazards are omnipresent, with a large event happening in Alaska every few years to a decade, though in many cases events occur with little or no warning (e.g., Kasatochi and Okmok in 2008). Here, a useful hazard mitigation strategy has been built on (1) a large database of historic activity from many datasets, (2) an operational alert system with graduated levels of concern, (3) scenario planning, and (4) routine checks and communication with emergency managers and the public. These baseline efforts are then enhanced in times of crisis with coordinated talking points, targeted studies and public outreach. Scientists naturally tend to target other scientists as their audience, whereas for effective monitoring of hazards that may only occur on yearly to decadal timescales, details can distract from the essential information. Creating talking points and practicing public communication can help make hazard response a part of the culture. Promoting situational awareness and familiarity can relieve indecision and concern at the time of a crisis.

  19. Study on mitigation of pulsed heat load for ITER cryogenic system

    NASA Astrophysics Data System (ADS)

    Peng, N.; Xiong, L. Y.; Jiang, Y. C.; Tang, J. C.; Liu, L. Q.

    2015-03-01

    One of the key requirements for the ITER cryogenic system is the mitigation of the pulsed heat load deposited in the magnet system due to magnetic field variation and pulsed DT neutron production. As one of the control strategies, bypass valves of the Toroidal Field (TF) case helium loop are adjusted to mitigate the pulsed heat load to the LHe plant. A quasi-3D time-dependent thermal-hydraulic analysis of the TF winding packs and TF case has been performed to study the behavior of the TF magnets during the reference plasma scenario, with pulses of 400 s burn and a repetition time of 1800 s. The model is based on a 1D helium flow and a quasi-3D solid heat conduction model. The whole TF magnet is simulated, taking into account thermal conduction between the winding pack and the case, which are cooled separately. The heat loads are given as inputs and include AC losses in the conductor, eddy current losses in the structure, thermal radiation, thermal conduction and nuclear heating. The simulation results indicate that the temperature variation of the TF magnet stays within the allowable range when the smooth control strategy is active.
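    The quasi-3D model is not reproduced here; as a minimal illustration of the solid-conduction building block that such thermal-hydraulic codes couple to a 1D helium flow, the sketch below advances a 1D heat conduction equation by explicit finite differences. Material properties, grid and heating values are assumptions, not ITER parameters.

      import numpy as np

      def conduction_step(T, dt, dx, alpha):
          # One explicit finite-difference step of dT/dt = alpha * d2T/dx2
          # with fixed-temperature end boundaries.
          assert alpha * dt / dx**2 <= 0.5, "explicit stability limit violated"
          Tn = T.copy()
          Tn[1:-1] = T[1:-1] + alpha * dt / dx**2 * (T[2:] - 2.0 * T[1:-1] + T[:-2])
          return Tn

      T = np.full(101, 4.5)      # K, helium-cooled boundary temperature (assumed)
      T[45:56] += 0.3            # K, pulsed heat deposit (assumed)
      dx, alpha = 1e-3, 1e-5     # m, m^2/s (illustrative values)
      dt = 0.4 * dx**2 / alpha   # respects the explicit stability limit
      for _ in range(1000):
          T = conduction_step(T, dt, dx, alpha)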

  20. New debris flow mitigation measures in southern Gansu, China: a case study of the Zhouqu Region

    NASA Astrophysics Data System (ADS)

    Xiong, Muqi; Meng, Xingmin; Li, Yajun

    2014-05-01

    A devastating debris flow occurred in Zhouqu, Gansu Province, China, on 8 August 2010, resulting in a catastrophic disaster in which 1,463 people perished. The debris flow valleys, like numerous other debris flow valleys in this mountainous region, had preventive engineering structures, such as check dams, properly designed according to common engineering practice to safeguard the town located directly on the debris flow fan. However, failures of such preventive measures often cause even heavier disasters than events in areas with no human intervention, because the mitigation works give a false impression of safety. Given this paradoxical situation, and in order to explore a more effective disaster prevention strategy against debris flows in the mountainous region, this paper makes a comparative study of two cases in the area, one with preventive structures and one without. The results show that the inappropriate mitigation measures commonly applied in disaster reduction practice in the region are questionable. It is concluded that working with nature and following natural processes is the best strategy for disaster reduction in the region. Key words: debris flow disasters, disaster reduction strategy, preventive measures

  1. Natural Hazards Observer

    NSDL National Science Digital Library

    The Natural Hazards Center of the University of Colorado Boulder offers a free online professional hazards publication called the Natural Hazards Observer. Readers will find information on current disaster issues; new international, national, and local disaster management, mitigation, and education programs; hazards research; political and policy developments; new information sources; upcoming conferences; and recent publications. The January 2003 issue (the latest of the bimonthly publication, which dates back to 1996) includes reports with titles such as Congress Passes Inland Flood Warning Bill and Dam Safety Act Passed. Those interested can view the issues online, download and view them, and even search their content by various parameters.

  2. 44 CFR 65.16 - Standard Flood Hazard Determination Form and Instructions.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...2011-10-01 2011-10-01 false Standard Flood Hazard Determination Form and Instructions...INSURANCE AND HAZARD MITIGATION National Flood Insurance Program IDENTIFICATION AND...SPECIAL HAZARD AREAS § 65.16 Standard Flood Hazard Determination Form and...

  3. 44 CFR 65.16 - Standard Flood Hazard Determination Form and Instructions.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...2010-10-01 2010-10-01 false Standard Flood Hazard Determination Form and Instructions...INSURANCE AND HAZARD MITIGATION National Flood Insurance Program IDENTIFICATION AND...SPECIAL HAZARD AREAS § 65.16 Standard Flood Hazard Determination Form and...

  4. 44 CFR 65.16 - Standard Flood Hazard Determination Form and Instructions.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ...2013-10-01 2013-10-01 false Standard Flood Hazard Determination Form and Instructions...INSURANCE AND HAZARD MITIGATION National Flood Insurance Program IDENTIFICATION AND...SPECIAL HAZARD AREAS § 65.16 Standard Flood Hazard Determination Form and...

  5. 44 CFR 65.16 - Standard Flood Hazard Determination Form and Instructions.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ...2014-10-01 2014-10-01 false Standard Flood Hazard Determination Form and Instructions...INSURANCE AND HAZARD MITIGATION National Flood Insurance Program IDENTIFICATION AND...SPECIAL HAZARD AREAS § 65.16 Standard Flood Hazard Determination Form and...

  6. 44 CFR 65.16 - Standard Flood Hazard Determination Form and Instructions.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ...2012-10-01 2011-10-01 true Standard Flood Hazard Determination Form and Instructions...INSURANCE AND HAZARD MITIGATION National Flood Insurance Program IDENTIFICATION AND...SPECIAL HAZARD AREAS § 65.16 Standard Flood Hazard Determination Form and...

  7. Analyzing costs of space debris mitigation methods

    NASA Astrophysics Data System (ADS)

    Wiedemann, C.; Krag, H.; Bendisch, J.; Sdunnus, H.

    The steadily increasing number of space objects poses a considerable hazard to all kinds of spacecraft. To reduce the risks to future space missions, different debris mitigation measures and spacecraft protection techniques have been investigated over the last years. However, their economic efficiency has not yet been considered in this context, and the economic background is not always clear to satellite operators and the space industry. Current studies aim to evaluate mission costs due to space debris in a business-as-usual (no mitigation) scenario compared with mission costs when debris mitigation is considered. The aim is an estimate of the time until the investment in debris mitigation leads to an effective reduction of mission costs. This paper presents the results of investigations on the key problems of cost estimation for spacecraft and the influence of debris mitigation and shielding on cost. The shielding of a satellite can be an effective method of protecting the spacecraft against debris impact. Mitigation strategies such as the reduction of orbital lifetime and the de- or re-orbiting of non-operational satellites are methods for controlling the space debris environment; these methods result in an increase in costs. In a first step, the overall costs of different types of unmanned satellites are analyzed. The key problem is that it is not possible to provide a simple cost model that can be applied to all types of satellites: unmanned spacecraft differ greatly in mission, complexity of design, payload and operational lifetime. It is important to classify the relevant cost parameters and investigate their influence on the respective mission. The theory of empirical cost estimation and existing cost models are discussed. A selected cost model is simplified and generalized for application to all operational satellites. In a next step, the influence of space debris on cost is treated when the implementation of mitigation strategies is considered.
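    The specific cost model selected in the study is not given in the abstract. Purely to show what a simplified empirical (parametric) cost estimating relationship looks like, the sketch below uses a generic power-law form with placeholder coefficients; the coefficients, masses and the resulting percentage are illustrative assumptions.

      def parametric_cost(dry_mass_kg, a=1.0, b=0.55):
          # Generic power-law cost estimating relationship (CER):
          # cost (arbitrary units) = a * mass**b. Coefficients are placeholders.
          return a * dry_mass_kg ** b

      baseline = parametric_cost(1500.0)
      # Debris mitigation (e.g., de-orbit propellant, shielding) adds dry mass
      # and therefore raises the estimated cost:
      with_mitigation = parametric_cost(1500.0 + 120.0)
      print(f"relative cost increase: {(with_mitigation / baseline - 1) * 100:.1f}%")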

  8. BICAPA case study of natural hazards that trigger technological disasters

    NASA Astrophysics Data System (ADS)

    Boca, Gabriela; Ozunu, Alexandru; Nicolae Vlad, Serban

    2010-05-01

    Industrial facilities are vulnerable to natural disasters. Natural disasters and technological accidents are not always singular or isolated events. The example in this paper shows that they can occur in complex combinations and/or in rapid succession, known as NaTech disasters, thereby triggering multiple impacts. This analysis indicates that NaTech disasters have the potential to trigger hazmat releases and other types of technological accidents. Climate change plays an important role in the prevalence and triggering mechanisms of NaTech events. Projections under the IPCC IS92a scenario (similar to SRES A1B; IPCC, 1992) and two GCMs indicate that the risk of floods increases in central and eastern Europe. An increase in intense short-duration precipitation is likely to lead to an increased risk of flash floods (Lehner et al., 2006). It is urgent to develop tools for the assessment of risks due to NaTech events in industrial processes, in a framework starting with the characterization of the frequency and severity of natural disasters and continuing with complex analysis of industrial processes, risk assessment and residual functionality analysis. Ponds with dangerous technological residues are the most vulnerable targets of natural hazards. Technological accidents such as those in Baia Mare (January to March 2000) had an important international echo. Extreme weather phenomena, like those in the winter of 2000 in Baia Mare, and other natural disasters such as floods or earthquakes, could cause a similar disaster at Târnăveni in the Transylvania Depression. During 1972-1978 three decanting ponds were built on the Chemical Platform Târnăveni, now SC BICAPA SA, for the disposal of hazardous wastes resulting from the manufacture of sodium dichromate and inorganic salts, sludge from waste-water purification and filtration, and wet gas production from carbide. The ponds are located on the right bank of the river Târnava, about 35-50 m from the flood defense dam. The total amount of toxic waste stored in the three ponds is about 2500 tons, equivalent to 128 tons expressed as hexavalent chromium. The ponds' contour dikes are strongly damaged in many places; their safety is jeopardized by leakages, sliding slopes and ravines. The upstream dike has an increased failure risk: in that section the safety coefficients are below the allowable limit, both under static loading and under earthquake loading. The risk of failure is also very high due to the dike slopes, and it becomes higher in case of heavy rainfall, floods or an earthquake.

  9. Methodologies for the assessment of earthquake-triggered landslides hazard. A comparison of Logistic Regression and Artificial Neural Network models

    Microsoft Academic Search

    M. J. García-Rodríguez; J. A. Malpica; B. Benito

    2009-01-01

    In recent years, interest in landslide hazard assessment studies has increased substantially. They are appropriate for evaluation and mitigation plan development in landslide-prone areas. There are several techniques available for landslide hazard research at a regional scale. Generally, they can be classified into two groups: qualitative and quantitative methods. Most qualitative methods tend to be subjective, since they depend

  10. Echo-sounding method aids earthquake hazard studies

    USGS Publications Warehouse

    U.S. Geological Survey

    1995-01-01

    Dramatic examples of catastrophic damage from an earthquake occurred in 1989, when the M 7.1 Loma Prieta earthquake rocked the San Francisco Bay area, and in 1994, when the M 6.6 Northridge earthquake jolted southern California. The surprising amount and distribution of damage to private property and infrastructure emphasizes the importance of seismic-hazard research in urbanized areas, where the potential for damage and loss of life is greatest. During April 1995, a group of scientists from the U.S. Geological Survey and the University of Tennessee, using the echo-sounding method described below, is collecting data in San Antonio Park, California, to examine the Monte Vista fault, which runs through this park. The Monte Vista fault in this vicinity shows evidence of movement within the last 10,000 years or so. The data will give them a "picture" of the subsurface rock deformation near this fault. The data will also be used to help locate a trench that will be dug across the fault by scientists from William Lettis & Associates.

  11. Assessing the Relationship Between Hazard Mitigation Plan Quality and Rural Status in a Cohort of 57 Counties from 3 States in the Southeastern U.S.

    E-print Network

    Horney, Jennifer A.; Naimi, Ashley I.; Lyles, Ward; Simon, Matt; Salvesen, David; Berke, Philip

    2013-08-13

    Challenges 2012, 3, 183-193; doi:10.3390/challe3020183; ISSN 2078-1547 (open access, www.mdpi.com).

  12. Google Earth Views of Probabilistic Tsunami Hazard Analysis Pilot Study, Seaside, Oregon

    Microsoft Academic Search

    F. L. Wong; A. J. Venturato; E. L. Geist

    2006-01-01

    Virtual globes such as Google Earth provide immediate geographic context for research data for coastal hazard planning. We present Google Earth views of data from a Tsunami Pilot Study conducted within and near Seaside and Gearhart, Oregon, as part of FEMA's Flood Insurance Rate Map Modernization Program (Tsunami Pilot Study Working Group, 2006). Two goals of the pilot study were

  13. Studying and Improving Human Response to Natural Hazards: Lessons from the Virtual Hurricane Lab

    NASA Astrophysics Data System (ADS)

    Meyer, R.; Broad, K.; Orlove, B. S.

    2010-12-01

    One of the most critical challenges facing communities in areas prone to natural hazards is how best to encourage residents to invest in individual and collective actions that would reduce the damaging impact of low-probability, high-consequence environmental events. What makes this goal difficult to achieve is that the relative rarity of natural hazards means that many who face the risk have no previous experience to draw on when making preparation decisions, or have prior experience that provides misleading guidance on how best to prepare. For example, individuals who have experienced strings of minor earthquakes or near-misses from tropical cyclones may become overly complacent about the risks that extreme events actually pose. In this presentation we report the preliminary findings of a program of work that explores the use of realistic multimedia hazard simulations designed for two purposes: 1) to serve as a basic research tool for studying how individuals make decisions to prepare for rare natural hazards in laboratory settings; and 2) to serve as an educational tool for giving people in hazard-prone areas virtual experience in hazard preparation. We demonstrate a prototype simulation in which participants experience the approach of a virtual hurricane and have the opportunity to invest in different kinds of action to protect their home from damage. As the hurricane approaches, participants have access to an "information dashboard" in which they can gather information about the storm threat from a variety of sources, including mock television weather broadcasts, web sites, and conversations with neighbors. In response to this information they then have the opportunity to invest in different levels of protective action. Some versions of the simulation are designed as games, where participants are rewarded based on their ability to make the optimal trade-off between under- and over-preparing for the threat. From a basic research perspective, the data provide valuable insights into the dynamics of information gathering prior to hurricane impacts, as well as a laboratory in which we can study how both information gathering and responses vary in response to controlled variations in factors such as the complexity of forecast information. From an applied perspective, the simulations provide an opportunity for residents in hazard-prone areas to learn about different kinds of information and receive feedback on their potential biases prior to an actual encounter with a hazard. The presentation concludes with a summary of some of the basic research findings that have emerged from the hurricane lab to date, as well as a discussion of the prospects for extending the technology to a broad range of environmental hazards.

  14. Coastal dynamics studies for evaluation of hazard and vulnerability for coastal erosion. case study the town La Bocana, Buenaventura, colombian pacific

    NASA Astrophysics Data System (ADS)

    Coca-Domínguez, Oswaldo; Ricaurte-Villota, Constanza

    2015-04-01

    The analysis of hazard and vulnerability in coastal areas affected by erosion is based on studies of coastal dynamics, since these provide the level of detail needed for decision-making on prevention, mitigation, disaster reduction and integrated risk management. The town of La Bocana, located in Buenaventura (Colombian Pacific), was selected for a coastal erosion hazard assessment based on three components: i) magnitude, ii) occurrence and iii) susceptibility. Vulnerability, meanwhile, is also composed of three main components for its evaluation: i) exposure, ii) fragility and iii) resilience, which in turn are evaluated across six dimensions of vulnerability: physical, social, economic, ecological, institutional and cultural. The hazard analysis used a semi-quantitative approach and an index of variables such as type of geomorphological unit, type of beach, exposure of the coast to wave action, and occurrence, among others. Quantitative data on coastal retreat were obtained using DSAS (Digital Shoreline Analysis System), an ArcGIS application, together with digital elevation models of the beach and six beach profiles strategically located along the coast and surveyed with GNSS technology. Sediment samples collected from these beaches and mean wave height and direction were used as complementary data. The information was integrated along the coastline in segments of 250 x 250 meters. Four sectors make up the coastal area of La Bocana: Pianguita, Vistahermosa, Downtown and Shangay. The six vulnerability dimensions were evaluated for these populations, together with population density for exposure, which was analyzed through a multi-matrix method including variables such as land use, population, type of structure, education and basic services, among others, to measure fragility, with a corresponding indicator of resilience. The hazard analysis results indicate that Vistahermosa is under very high hazard, while Downtown and Pianguita are under medium hazard; these two sectors have the highest population density and the largest hotel and services infrastructure. Shangay was scored as low hazard because wave action has no direct impact on it. The vulnerability analysis suggests that the Shangay sector has very high vulnerability because it lacks basic services and has low levels of schooling, while Downtown, Vistahermosa and Pianguita show average vulnerability. Additionally, it was determined that in recent years erosion rates in the Vistahermosa sector have reached up to -xx m yr-1, while in other sectors the retreat of the coastline can be associated with local tidal peaks that occur during April and October, with the other months of the year typically showing recovery and stability processes.

  15. Field Study of Exhaust Fans for Mitigating Indoor Air Quality Problems & Indoor Air Quality - Exhaust Fan Mitigation.

    SciTech Connect

    United States. Bonneville Power Administration.

    1987-07-01

    Overall, the findings show that exhaust fans basically provide small amounts of ventilation compensation. By monitoring the common indoor air pollutants (radon, formaldehyde, carbon monoxide, nitrogen dioxide, and water vapor), it was found that the quality of the indoor air was not adversely affected by the use of exhaust fans. Nor did their use provide any measurable or significant benefits since no improvement in air quality was ascertained. While exhaust fans of this small size did not increase radon, which is the contaminant of most concern, the researchers caution that operation of a larger fan or installation in a very tight home could result in higher levels because depressurization is greater. The daily energy consumption for use of these appliances during the heating season was calculated to be 1.5 kilowatt hours or approximately 3% of the energy consumption in the study homes. The information collected in this collaborative field study indicates that the use of these particular ventilation systems has no significant effect on indoor air quality.

  16. Natural phenomena hazards site characterization criteria

    SciTech Connect

    Not Available

    1994-03-01

    The criteria and recommendations in this standard shall apply to site characterization for the purpose of mitigating Natural Phenomena Hazards (wind, floods, landslide, earthquake, volcano, etc.) in all DOE facilities covered by DOE Order 5480.28. Criteria for site characterization not related to NPH are not included unless necessary for clarification. General and detailed site characterization requirements are provided in areas of meteorology, hydrology, geology, seismology, and geotechnical studies.

  17. 24 CFR 51.205 - Mitigating measures.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ...Nature § 51.205 Mitigating measures. Application of the standards for determining an Acceptable Separation Distance (ASD) for a HUD-assisted project from a potential hazard of an explosion or fire prone nature is predicated on level...

  18. 24 CFR 51.205 - Mitigating measures.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ...Nature § 51.205 Mitigating measures. Application of the standards for determining an Acceptable Separation Distance (ASD) for a HUD-assisted project from a potential hazard of an explosion or fire prone nature is predicated on level...

  19. Mini-Sosie high-resolution seismic method aids hazards studies

    USGS Publications Warehouse

    Stephenson, W.J.; Odum, J.; Shedlock, K.M.; Pratt, T.L.; Williams, R.A.

    1992-01-01

    The Mini-Sosie high-resolution seismic method has been effective in imaging shallow structural and stratigraphic features that aid in seismic-hazard and neotectonic studies. The method is not an alternative to Vibroseis acquisition for large-scale studies. However, it has two major advantages over Vibroseis as it is being used by the USGS in its seismic-hazards program. First, the sources are extremely portable and can be used in both rural and urban environments. Second, the shifting-and-summation process during acquisition improves the signal-to-noise ratio and cancels out seismic noise sources such as cars and pedestrians. -from Authors

  20. Seaside, Oregon Tsunami Pilot Study--Modernization of FEMA Flood Hazard Maps

    E-print Network

    Seaside, Oregon Tsunami Pilot Study--Modernization of FEMA Flood Hazard Maps. By the Tsunami Pilot Study Working Group; National Oceanic and Atmospheric Administration, U.S. Department of Commerce. Figure caption: 500-year tsunami maximum wave height (m), i.e., the wave height with a 0.002 annual probability of exceedance.

  1. A Contingent Valuation Study of the Value of Reducing Fire Hazards to Old-Growth Forests

    E-print Network

    Standiford, Richard B.

    Loomis, John B.; González-Cabán, Armando; Gregory, Robin. 1996. A contingent valuation study of the value of reducing fire hazards to old-growth forests. The contingent valuation methodology was applied to old-growth forests and critical habitat units for the Northern Spotted Owl

  2. Software safety hazard analysis

    SciTech Connect

    Lawrence, J.D. [Lawrence Livermore National Lab., CA (United States)

    1996-02-01

    Techniques for analyzing the safety and reliability of analog-based electronic protection systems that serve to mitigate hazards in process control systems have been developed over many years, and are reasonably well understood. An example is the protection system in a nuclear power plant. The extension of these techniques to systems which include digital computers is not well developed, and there is little consensus among software engineering experts and safety experts on how to analyze such systems. One possible technique is to extend hazard analysis to include digital computer-based systems. Software is frequently overlooked during system hazard analyses, but this is unacceptable when the software is in control of a potentially hazardous operation. In such cases, hazard analysis should be extended to fully cover the software. A method for performing software hazard analysis is proposed in this paper.

  3. Seismic hazard analysis application of methodology, results, and sensitivity studies. Volume 4

    SciTech Connect

    Bernreuter, D. L

    1981-08-08

    As part of the Site Specific Spectra Project, this report seeks to identify the sources of, and minimize, uncertainty in estimates of seismic hazards in the Eastern United States. Findings are being used by the Nuclear Regulatory Commission to develop a synthesis among the various methods that can be used in evaluating seismic hazard at plants in the Eastern United States. In this volume, one of a five-volume series, we discuss the application of the probabilistic approach using expert opinion. The seismic hazard is developed at nine sites in the Central and Northeastern United States, and both individual experts' and synthesis results are obtained. We also discuss and evaluate the ground motion models used to develop the seismic hazard at the various sites, analyzing extensive sensitivity studies to determine the important parameters and the significance of uncertainty in them. Comparisons are made between probabilistic and real spectra for a number of Eastern earthquakes. The uncertainty in the real spectra is examined as a function of the key earthquake source parameters. In our opinion, the single most important conclusion of this study is that the use of expert opinion to supplement the sparse data available on Eastern United States earthquakes is a viable approach for estimating seismic hazard in this region of the country. 29 refs., 15 tabs.

  4. A computational study of explosive hazard potential for reuseable launch vehicles.

    SciTech Connect

    Langston, Leo J. (NASA Johnson Space Center, Houston, TX); Freitas, Christopher J. (Southwest Research Institute, San Antonio, TX); Langley, Patrick (Lockheed Martin Space Systems Company, Denver, CO); Palmer, Donald (Lockheed Martin Space Systems Company, Denver, CO); Saul, W. Venner; Chocron, Sidney (Southwest Research Institute, San Antonio, TX); Kipp, Marlin E.

    2004-09-01

    Catastrophic failure of a Reusable Launch Vehicle (RLV) during launch poses a significant engineering problem in the context of crew escape. The explosive hazard potential of the RLV changes during the various phases of the launch. The hazard potential in the on-pad environment is characterized by the release and formation of a gas-phase mixture in an oxidizer-rich environment, while the hazard during the in-flight phase is dominated by the boundary layer and wake flow formed around the vehicle and their interaction with the exhaust gas plume. In order to address crew escape in these explosive environments more effectively, a computational analysis program was undertaken by Lockheed Martin, funded by NASA JSC, with simulations and analyses completed by Southwest Research Institute and Sandia National Laboratories. This paper then presents the details of the methodology used in this analysis, the results of the study, and the important conclusions that came out of the study.

  5. Blast effect on the lower extremities and its mitigation: a computational study.

    PubMed

    Dong, Liqiang; Zhu, Feng; Jin, Xin; Suresh, Mahi; Jiang, Binhui; Sevagan, Gopinath; Cai, Yun; Li, Guangyao; Yang, King H

    2013-12-01

    A series of computational studies was performed to investigate the response of the lower extremities of mounted soldiers under landmine detonation. A numerical human body model newly developed at Wayne State University was used to simulate two types of experimental studies, and the model predictions were validated against test data in terms of the tibia axial force as well as the bone fracture pattern. Based on the validated model, the minimum axial force causing tibia fracture was found. Then a series of parametric studies was conducted to determine the critical velocity (peak velocity of the floor plate) causing tibia fracture at different upper/lower leg angles. In addition, to limit the load transmission through the vehicle floor, two types of energy absorbing materials, namely IMPAXX(®) foam and aluminum alloy honeycomb, were selected for floor matting. Their performance in terms of blast effect mitigation was compared using the validated numerical model, and it was found that honeycomb is a more efficient material for blast injury prevention under the loading conditions studied. PMID:23973770

  6. Characterizing pork producer demand for shelterbelts to mitigate odor: an Iowa case study

    Microsoft Academic Search

    John Tyndall

    2009-01-01

    Shelterbelts have been shown to mitigate livestock odors incrementally through complex physical and social dynamics. By surveying Iowa hog producers, we assessed the current degree of shelterbelt usage by pork producers, examined producers' beliefs and concerns regarding shelterbelt usage for odor mitigation and estimated both their willingness to pay and their overall demand for shelterbelts. Overall, Iowa hog producers display

  7. Mitigating urban heat island effects in high-density cities based on sky view factor and urban morphological understanding: a study of Hong Kong

    Microsoft Academic Search

    Chao Yuan; Liang Chen

    2011-01-01

    The urban heat island (UHI) effect is one of the most studied topics in many mega-cities because of rapid urbanization. The literature shows a significant correlation between the sky view factor (SVF) and the UHI. However, this climate knowledge has had little impact on the urban planning process as a means of mitigating the UHI. More studies should be devoted to ways of mitigating UHI based

  8. Landslide hazard mapping with selected dominant factors: A study case of Penang Island, Malaysia

    NASA Astrophysics Data System (ADS)

    Tay, Lea Tien; Alkhasawneh, Mutasem Sh.; Ngah, Umi Kalthum; Lateh, Habibah

    2015-05-01

    Landslide is one of the most destructive natural geohazards in Malaysia. In addition to rainfall as a triggering factor for landslides in Malaysia, topographical and geological factors play an important role in landslide susceptibility analysis. Conventional topographic factors such as elevation, slope angle, slope aspect, plan curvature and profile curvature have been considered as landslide causative factors in many research works. However, other topographic factors such as diagonal length, surface area, surface roughness and rugosity have not been considered, especially in landslide hazard analysis in Malaysia. This paper presents landslide hazard mapping using the Frequency Ratio (FR) method for the study area of Penang Island, Malaysia. The frequency ratio approach is a variant of the probabilistic method that is based on the observed relationships between the distribution of landslides and each landslide-causative factor. The landslide hazard map of Penang Island is produced by considering twenty-two (22) landslide causative factors. Among these twenty-two (22) factors, fourteen (14) are topographic factors: elevation, slope gradient, slope aspect, plan curvature, profile curvature, general curvature, tangential curvature, longitudinal curvature, cross-section curvature, total curvature, diagonal length, surface area, surface roughness and rugosity. These topographic factors are extracted from the digital elevation model of Penang Island. The other eight (8) non-topographic factors considered are land cover, vegetation cover, distance from road, distance from stream, distance from fault line, geology, soil texture and rainfall precipitation. After considering all twenty-two factors for landslide hazard mapping, the analysis is repeated with the fourteen dominant factors selected from the twenty-two. The landslide hazard map was divided into four categories of risk: highly hazardous area, hazardous area, moderately hazardous area and not hazardous area. The map was assessed using the ROC (receiver operating characteristic) curve, based on the area under the curve (AUC) method. The result indicates an increase in accuracy from 77.76% (with all 22 factors) to 79.00% (with the 14 dominant factors) in the prediction of landslide occurrence.
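    As a minimal sketch of how a frequency ratio is typically computed per factor class from a landslide inventory grid (the variable names and synthetic data are illustrative, not the study's inputs):

      import numpy as np

      def frequency_ratio(factor_classes, landslide_mask):
          # FR = (landslide pixels in class / all landslide pixels)
          #      / (pixels in class / all pixels).
          # FR > 1 marks a class more landslide-prone than average.
          fr = {}
          total = factor_classes.size
          total_ls = landslide_mask.sum()
          for c in np.unique(factor_classes):
              in_class = factor_classes == c
              ls_share = landslide_mask[in_class].sum() / total_ls
              area_share = in_class.sum() / total
              fr[int(c)] = ls_share / area_share
          return fr

      # Illustrative use: slope-angle classes (0-4) and a boolean landslide inventory.
      classes = np.random.randint(0, 5, size=(500, 500))
      landslides = np.random.rand(500, 500) < 0.01
      print(frequency_ratio(classes, landslides))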

  9. Flood hazards studies in the Mississippi River basin using remote sensing

    NASA Technical Reports Server (NTRS)

    Rango, A.; Anderson, A. T.

    1974-01-01

    The Spring 1973 Mississippi River flood was investigated using remotely sensed data from ERTS-1. Both manual and automatic analyses of the data indicated that ERTS-1 is extremely useful as a regional tool for flood management. Quantitative estimates of the area flooded were made in St. Charles County, Missouri, and in Arkansas. Flood hazard mapping was conducted in three study areas along the Mississippi River using pre-flood ERTS-1 imagery enlarged to 1:250,000 and 1:100,000 scale. Initial results indicate that ERTS-1 digital mapping of flood-prone areas can be performed at 1:62,500, which is comparable to some conventional flood hazard map scales.

  10. THE SOCIAL IMPLICATIONS OF FLAME RETARDANT CHEMICALS: A CASE STUDY IN RISK AND HAZARD PERCEPTION

    EPA Science Inventory

    This study is expected to fill an important gap in the literature by focusing on how individuals characterize exposure in terms of risk and hazard, and how this understanding can lead to concrete changes in their personal and professional lives. I expect that people differ gre...

  11. Integration of Airborne Laser Scanning Altimetry Data in Alpine Geomorphological and Hazard Studies

    Microsoft Academic Search

    A. C. Seijmonsbergen

    2007-01-01

    A digital terrain and surface model derived from an airborne laser scanning (ALS) altimetry dataset was used in the Austrian Alps for the preparation, improvement and the evaluation of a digital geomorphological hazard map. The geomorphology in the study area consists of a wide variety of landforms, which include glacial landforms such as cirques, hanging valleys, and moraine deposits, of

  12. Innovations in earthquake and natural hazards research: determining soil liquefaction potential. Case study No. 5

    Microsoft Academic Search

    G. B. Moore; R. K. Yin

    1984-01-01

    This case study analyzes how an innovation in earthquake and natural hazards research was used for practical and policy purposes, why utilization occurred, and what potential policy implications can be drawn. The innovation was the dynamic analysis method, used to identify those soils that are likely to liquefy during earthquakes. The research was designed and undertaken by H. Bolton Seed

  13. A Study of Real-Time Identification and Monitoring of Barge-Carried Hazardous Commodities

    E-print Network

    ...and anomaly detection modules of the system will analyze the collected real-time data and other information... on the U.S. inland waterway system, towing vessel operators and fleet area managers, at specified reporting...

  14. Strategies for casualty mitigation programs by using advanced tsunami computation

    NASA Astrophysics Data System (ADS)

    IMAI, K.; Imamura, F.

    2012-12-01

    1. Purpose of the study. In this study, based on scenarios of great earthquakes along the Nankai trough, we aim to estimate the run-up and the inundation process of tsunami in coastal areas, including rivers, with high accuracy. Using a practical analytical tsunami model, and taking into account detailed topography, land use and climate change in both a realistic present and an expected future environment, we examined the tsunami run-up and inundation process. Using these results we estimated the damage due to the tsunami and obtained information for the mitigation of human casualties. Considering the time series from the occurrence of the earthquake and the risk of tsunami damage, we provide disaster risk information for casualty mitigation, displayed in a tsunami hazard and risk map. 2. Creating a tsunami hazard and risk map. From the analytical and practical tsunami model (a long-wave approximation model) and high-resolution topography (5 m) including detailed data on the shoreline, rivers, buildings and houses, we present an advanced analysis of tsunami inundation that takes land use into account. Based on the results of the tsunami inundation analysis, it is possible to draw a tsunami hazard and risk map with information on human casualties, estimated building damage, drifting vehicles, etc. 3. Contents of disaster prevention information. To improve the distribution of hazard, risk and evacuation information, it is necessary to follow three steps. (1) Provide basic information such as tsunami attack information, areas and routes for evacuation, and locations of tsunami evacuation facilities. (2) Provide, as additional information, the time when inundation starts, the actual inundation results, locations of facilities with hazardous materials, and the presence or absence of public facilities and underground areas that require evacuation. (3) Provide information to support disaster response, such as predictions of infrastructure and traffic network damage. Finally, by compiling all this information on a tsunami hazard and risk map together with the tsunami inundation animation, it is possible to create and propose strategies for casualty mitigation programs.
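    The operational tsunami code is not reproduced in the abstract. As a minimal illustration of the long-wave (shallow-water) approximation it refers to, the sketch below integrates the 1-D linear long-wave equations on a staggered grid with reflective ends; the depth, grid and initial wave are assumptions.

      import numpy as np

      g = 9.81
      h = 50.0                        # uniform water depth, m (assumed)
      dx = 500.0                      # grid spacing, m (assumed)
      n = 400
      dt = 0.5 * dx / np.sqrt(g * h)  # CFL-limited time step

      # 1 m Gaussian initial surface displacement; velocities live on cell faces.
      eta = np.exp(-((np.arange(n) - 100) * dx / 10_000.0) ** 2)
      u = np.zeros(n + 1)

      for _ in range(2000):
          # momentum: du/dt = -g * d(eta)/dx  (interior faces; walls stay at u = 0)
          u[1:-1] -= g * dt / dx * (eta[1:] - eta[:-1])
          # continuity: d(eta)/dt = -h * du/dx
          eta -= h * dt / dx * (u[1:] - u[:-1])

      print("max surface elevation after propagation:", eta.max(), "m")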

  15. Economic valuation of flood mitigation services: A case study from the Otter Creek, VT.

    NASA Astrophysics Data System (ADS)

    Galford, G. L.; Ricketts, T.; Bryan, K. L.; ONeil-Dunne, J.; Polasky, S.

    2014-12-01

    The ecosystem services provided by wetlands are widely recognized but difficult to quantify. In particular, estimating the effect of land cover and land use on downstream flood outcomes remains challenging, but is increasingly important in light of climate change predictions of increased precipitation in many areas. Economic valuation can help incorporate ecosystem services into decisions and enable communities to plan for climate and flood resiliency. Here we estimate the economic value of the Otter Creek wetlands for Middlebury, VT in mitigating the flood that followed Tropical Storm Irene, as well as for ten historic floods. Observationally, hydrographs above and below the wetlands for each storm indicated that the wetlands functioned as a temporary reservoir, slowing the delivery of water to Middlebury. We compare observed floods, based on Middlebury's hydrograph, with simulated floods for scenarios without wetlands. To simulate these "without wetlands" scenarios, we assume the same volume of water was delivered to Middlebury, but in a shorter pulse similar to a hydrograph upstream of the wetlands. For scenarios with and without wetlands, we map the spatial extent of flooding using LiDAR digital elevation data. We then estimate flood depth at each affected building and calculate monetary losses as a function of flood depth and house value using established depth-damage relationships. For example, we expect damages equal to 20% of the house's value for a flood depth of two feet in a two-story home with a basement. We define the value of flood mitigation services as the difference in damages between the with- and without-wetlands scenarios, and find that the Otter Creek wetlands reduced flood damage in Middlebury by 88% following Tropical Storm Irene. Using the 10 additional historic floods, we estimate an ongoing mean value of $400,000 in avoided damages per year. Economic impacts of this magnitude stress the importance of wetland conservation and warrant the consideration of ecosystem services in land use decisions. Our study indicates that here and elsewhere, green infrastructure may have the potential to increase the resilience of communities to projected changes in climate.
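    The only depth-damage value quoted above is the 20% loss at a two-foot flood depth for a two-story house with a basement; the short sketch below shows how such a depth-damage relationship can be applied to compare the with- and without-wetlands scenarios. All other curve points, depths and the house value are illustrative assumptions.

      import numpy as np

      # Illustrative depth-damage curve; only the 2 ft -> 20% point comes from
      # the study description, the other points are placeholders.
      depths_ft = np.array([0.0, 1.0, 2.0, 4.0, 8.0])
      damage_fraction = np.array([0.00, 0.12, 0.20, 0.33, 0.50])

      def building_loss(flood_depth_ft, house_value):
          frac = np.interp(flood_depth_ft, depths_ft, damage_fraction)
          return frac * house_value

      value = 250_000.0
      loss_without = building_loss(3.5, value)  # simulated "no wetlands" depth (assumed)
      loss_with = building_loss(0.5, value)     # observed depth (assumed)
      print("avoided damage at this building: $", round(loss_without - loss_with))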

  16. Success in transmitting hazard science

    NASA Astrophysics Data System (ADS)

    Price, J. G.; Garside, T.

    2010-12-01

    Money motivates mitigation. An example of success in communicating scientific information about hazards, coupled with information about available money, is the follow-up action by local governments to actually mitigate. The Nevada Hazard Mitigation Planning Committee helps local governments prepare competitive proposals for federal funds to reduce risks from natural hazards. Composed of volunteers with expertise in emergency management, building standards, and earthquake, flood, and wildfire hazards, the committee advises the Nevada Division of Emergency Management on (1) the content of the State’s hazard mitigation plan and (2) projects that have been proposed by local governments and state agencies for funding from various post- and pre-disaster hazard mitigation programs of the Federal Emergency Management Agency. Local governments must have FEMA-approved hazard mitigation plans in place before they can receive this funding. The committee has been meeting quarterly with elected and appointed county officials, at their offices, to encourage them to update their mitigation plans and apply for this funding. We have settled on a format that includes the county’s giving the committee an overview of its infrastructure, hazards, and preparedness. The committee explains the process for applying for mitigation grants and presents the latest information that we have about earthquake hazards, including locations of nearby active faults, historical seismicity, geodetic strain, loss-estimation modeling, scenarios, and documents about what to do before, during, and after an earthquake. Much of the county-specific information is available on the web. The presentations have been well received, in part because the committee makes the effort to go to their communities, and in part because the committee is helping them attract federal funds for local mitigation of not only earthquake hazards but also floods (including canal breaches) and wildfires, the other major concerns in Nevada. Local citizens appreciate the efforts of the state officials to present the information in a public forum. The Committee’s earthquake presentations to the counties are supplemented by regular updates in the two most populous counties during quarterly meetings of the Nevada Earthquake Safety Council, generally alternating between Las Vegas and Reno. We have only 17 counties in Nevada, so we are making good progress at reaching each within a few years. The Committee is also learning from the county officials about their frustrations in dealing with the state and federal bureaucracies. Success is documented by the mitigation projects that FEMA has funded.

  17. Reviewing and visualising relationships between anthropic processes and natural hazards within a multi-hazard framework

    NASA Astrophysics Data System (ADS)

    Gill, Joel C.; Malamud, Bruce D.

    2014-05-01

    Here we present a broad overview of the interaction relationships between 17 anthropic processes and 21 different natural hazard types. Anthropic processes are grouped into seven categories (subsurface extraction, subsurface addition, land use change, explosions, hydrological change, surface construction processes, miscellaneous). Natural hazards are grouped into six categories (geophysical, hydrological, shallow earth processes, atmospheric, biophysical and space). A wide-ranging review based on grey- and peer-reviewed literature from many scientific disciplines identified 54 relationships where anthropic processes have been noted to trigger natural hazards. We record case studies for all but three of these relationships. Based on the results of this review, we find that the anthropic processes of deforestation, explosions (conventional and nuclear) and reservoir construction could trigger the widest range of different natural hazard types. We also note that within the natural hazards, landslides and earthquakes are those that could be triggered by the widest range of anthropic processes. This work also examines the possibility of anthropic processes (i) resulting in an increased occurrence of a particular hazard interaction (e.g., deforestation could result in an increased interaction between storms and landslides); and (ii) inadvertently reducing the likelihood of a natural hazard or natural hazard interaction (e.g., poor drainage or deforestation reducing the likelihood of wildfires triggered by lightning). This study synthesises, using accessible visualisation techniques, the large amounts of anthropic process and natural hazard information from our review. In it we have outlined the importance of considering anthropic processes within any analysis of hazard interactions, and we reinforce the importance of a holistic approach to natural hazard assessment, mitigation and management.
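    The interaction relationships summarised above are essentially a process-by-hazard matrix. The sketch below shows, for a small hypothetical subset of categories, how such a matrix can be tallied to find which processes trigger the widest range of hazards and which hazards are triggered by the widest range of processes; the entries are invented for illustration and are not the paper's 17 x 21 results.

```python
# Illustrative interaction matrix: rows are anthropic processes, columns are
# natural hazard types, and True means at least one documented case of the
# process triggering the hazard. Entries are a hypothetical subset only.
import numpy as np

processes = ["deforestation", "reservoir construction", "subsurface extraction"]
hazards = ["landslide", "earthquake", "flood", "wildfire"]

triggers = np.array([
    [True,  False, True,  True ],   # deforestation
    [True,  True,  True,  False],   # reservoir construction
    [True,  True,  False, False],   # subsurface extraction
])

# How many hazard types each process can trigger, and how many processes
# can trigger each hazard type.
per_process = dict(zip(processes, triggers.sum(axis=1).tolist()))
per_hazard = dict(zip(hazards, triggers.sum(axis=0).tolist()))
print(per_process)
print(per_hazard)
```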

  18. Remedial mitigation strategies, a case study: Route 1 and 9 -- Paterson Plank Road, Secaucus, New Jersey

    SciTech Connect

    Lenhardt, D.R. [URS Consultants, Inc., Buffalo, NY (United States); Hamm, A. [NJDOT/Bureau Environmental Analysis, Trenton, NJ (United States)

    1996-12-31

    Route 1 and 9--Paterson Plank Road is located near the Lincoln Tunnel approach in the Town of Secaucus, North Bergen Township, New Jersey. The New Jersey Department of Transportation (NJDOT) is finalizing plans to alleviate congestion through multiple intersections that converge on a critical flow constriction at a heavily trafficked Conrail crossing. The project area has a long history of commercial and industrial development centered around the present rail corridor. Land use in the area has included heavy manufacturing, coal storage and fuel distribution. Studies commissioned by the NJDOT/Bureau of Environmental Analysis have delineated extensive areas of organic and inorganic contamination including the existence of several shallow overlapping groundwater plumes. In response to concerns surrounding projected construction costs related to site remediation and potential health and safety impacts, innovative approaches were developed to mitigate contamination effects on the project. A significant reduction in the volume of soil and groundwater handled during construction was achieved by redesigning the alignment to include an aerial extension of the proposed structure over the Conrail corridor. By elevating sections of the roadway and by utilizing driven H-piles to provide foundation support, disturbance within environmental problem areas was avoided or minimized.

  19. Natural hazard risk perception of Italian population: case studies along national territory.

    NASA Astrophysics Data System (ADS)

    Gravina, Teresita; Tupputi Schinosa, Francesca De Luca; Zuddas, Isabella; Preto, Mattia; Marengo, Angelo; Esposito, Alessandro; Figliozzi, Emanuele; Rapinatore, Matteo

    2015-04-01

    Risk perception is the judgment that people make about the characteristics and severity of risks. In recent years, risk perception studies have focused on providing cognitive elements to communication experts so that appropriate information and awareness strategies can be designed for citizens. Several authors have used questionnaires to determine the perception of natural hazard risks (seismic, landslide, cyclone, flood, volcanic), since this tool provides reliable quantitative data and permits comparison of results with those of similar surveys. In Italy, survey-based risk perception studies have also been carried out for natural risks of national importance, in particular the volcanic risks of Somma-Vesuvio and the Phlegrean Fields, but risk perception studies of local situations distributed across the national territory have been lacking. Natural hazards of national importance are frequently reported by the national mass media, and there is public debate about emergency civil protection plans, whereas it can be difficult to obtain information on local and regional natural hazards spread across the national territory. The Italian peninsula is a geologically young area subject to endogenous phenomena (volcanoes, earthquakes) and exogenous phenomena that drive landscape evolution and create natural hazards for the population (landslides, coastal erosion, hydrogeological instability, sinkholes). For this reason, we investigated the perception of natural risks in different Italian places where natural hazards have occurred but were not reported by the mass media, being of only local or historical relevance. We carried out surveys in places affected by different types of natural hazard (landslides, coastal erosion, hydrogeological instability, sinkholes, volcanic phenomena and earthquakes) and compared the results in order to understand the population's level of perception, awareness and preparation through civil protection exercises. Our findings support the view that risk communication has to be based on citizens' knowledge and awareness of natural hazards. Informed citizens can participate actively in urban development planning decisions and respond positively to legislation and regulation introduced to reduce natural risks. The study goes some way towards enhancing understanding of citizens' awareness of natural risks, and allows us to say that communication on natural risks cannot be based only on transferring emergency behaviour to citizens; it must also allow people to improve their knowledge of landscape evolution so that they adopt aware environmental behaviour.

  20. Social and ethical perspectives of landslide risk mitigation measures

    NASA Astrophysics Data System (ADS)

    Kalsnes, Bjørn; Vangelsten, Bjørn V.

    2015-04-01

    Landslide risk may be mitigated by a wide range of measures. Mitigation and prevention options may include (1) structural measures to reduce the frequency, severity or exposure to the hazard, (2) non-structural measures, such as land-use planning and early warning systems, to reduce the hazard frequency and consequences, and (3) measures to pool and transfer the risks. In a given situation the appropriate system of mitigation measures may be a combination of various types of measures, both structural and non-structural. In the process of choosing mitigation measures for a given landslide risk situation, the role of the geoscientist is normally to propose possible mitigation measures on the basis of the risk level and technical feasibility. Social and ethical perspectives are often neglected in this process. However, awareness of the need to consider social as well as ethical issues in the design and management of landslide risk mitigation is rising. There is a growing understanding that technical experts acting alone cannot determine what will be considered the appropriate set of mitigation and prevention measures. Issues such as environment versus development, questions of acceptable risk, who bears the risks and benefits, and who makes the decisions also need to be addressed. Policymakers and stakeholders engaged in solving environmental risk problems are increasingly recognising that traditional expert-based decision-making processes are insufficient. This paper analyses the process of choosing appropriate measures to mitigate landslide risk from a social and ethical perspective, considering technical, cultural, economic, environmental and political elements. The paper focuses on stakeholder involvement in the decision-making process, and shows how developing strategies for risk communication is key to a successful process. The study is supported by case study examples from Norway and Italy. In the Italian case study, three different risk mitigation options were presented to the local community. The options were based on a thorough stakeholder involvement process, resulting in three different views on how to deal with the landslide risk situation: (i) protect lives and properties (hierarchical); (ii) careful stewardship of the mountains (egalitarian); and (iii) rational individual choice (individualist).

  1. Hazardous waste source-reduction study with treated groundwater recycling

    SciTech Connect

    Chang, L.Y.; McCoy, B.J. (Univ. of California, Davis, CA (United States))

    1993-08-01

    A feasibility study is presented for modifying electroplating processes for source reduction. Ion exchange and reverse osmosis units are suggested to allow reclaiming and recycling of metal solutions. A particular example of water conservation in an electroplating shop is presented for the treatment and utilization of groundwater contaminated by hydrocarbon chemicals, including volatile organic compounds (VOCs) and gasoline products. Granular carbon adsorption, UV oxidation, and demineralization steps and alkalinity control measures for the groundwater are discussed. Engineering and economic analyses provide a basis for comparing alternative designs. An integrated scheme, including groundwater remediation and source reduction, is feasible for the plating shop. The removal of VOCs and demineralization of the polluted groundwater are important steps. With the integrated plan, 90% removal or recovery of heavy metals can be achieved, and water usage and wastewater can be reduced by 90%. Thus, it is feasible to prevent water pollution at the source and to recycle treated groundwater and wastewater for the manufacturing process.

  2. Geologic studies for seismic hazard assessment, Las Positas Fault Zone

    SciTech Connect

    Carpenter, D.W.; Clark, R.J.

    1982-04-08

    Since its existence was reported, the northeast trending Las Positas Fault Zone has been the subject of controversy. Because of proximity of the Las Positas Fault to Lawrence Livermore National Laboratory (LLNL) facilities, geologists from the Laboratory have performed an extensive study of this structure. The Las Positas Fault Zone is best expressed in southeastern Livermore Valley between Arroyo Mocho and the Livermore Oil Field. Here, the informally named northern branch consists of en-echelon shears separating alluvium northwest of the fault from deformed Livermore Formation and uplifted late Pleistocene terrace deposits to the southeast. A ¹⁴C date of 17,400 ± 250 y B.P. was obtained from displaced alluvial deposits. The erosion surface on the lower member of the Livermore Formation has been displaced vertically about 60 m across the northern branch and preserved slickensides indicate oblique but mostly lateral motion. An 85 cm vertical offset of colluvium and possible tectonic creep along the southern branch suggest historic activity on this strand. Net motion is almost entirely lateral. Microseismicity during the period 1969 to 1981 correlates with the south branch of the Las Positas Fault but not the north branch.

  3. Respiratory hazards in hard metal workers: a cross sectional study.

    PubMed Central

    Meyer-Bisch, C; Pham, Q T; Mur, J M; Massin, N; Moulin, J J; Teculescu, D; Carton, B; Pierre, F; Baruthio, F

    1989-01-01

    A cross sectional study was conducted on 513 employees at three hard metal plants: 425 exposed workers (351 men, 74 women) and 88 controls (69 men, 19 women). Cough and sputum were more frequent in workers engaged in "soft powder" and presintering workshops compared with controls (12.5% and 16.5% v 3.5%). Spirometric abnormalities were more frequent among women in sintering and finishing workshops compared with control women (56.8% v 23.8%) and abnormalities of carbon monoxide test were more frequent in exposed groups than in controls; this difference was more pronounced in women (31.4% v 5.6%) than in men (18.5% v 13%). No significant correlation was observed between duration of exposure and age adjusted lung function tests. Slight abnormalities of chest radiographs (0/1, 1/1 according to ILO classification) were more frequent in exposed men than controls (12.8% v 1.9%) and mostly in soft powder workers. In subjects with abnormal chest radiographs FVC, FEV1 and carbon monoxide indices (fractional uptake of CO or CO transfer index or both) were lower compared with those with normal chest radiographs. Although relatively mild, the clinical, radiological, and functional abnormalities uncovered call for a regular supervision of workers exposed to hard metal dust. PMID:2787666

  4. A Study on Hazard Assessment of Employees in New Buildings

    PubMed Central

    2012-01-01

    In order to evaluate the physical and psychological health effects of air pollutants from new building materials, 100 employees who worked in new buildings were given a general health questionnaire, and the prevalence of their subjective complaints was measured. The collected data were classified according to age, gender, smoking status, profession, working time, sleep time, life style, and length of employment. The results obtained were summarized as follows: The THI lie scale scores were significantly higher among the older respondents. Compared to males, females showed a significantly higher level in the depression item as well as a tendency toward high ratios of physical and psychological complaints. The smoking group showed higher scores regarding health complaints related to most physical and psychological items. Smokers showed significantly increased respiratory organ complaints compared to nonsmokers. Those with a profession showed a significantly higher level of nervousness. The group working 7 to 10 hours showed higher rates of complaints in the multiple subjective symptoms and mouth/anus items than the group working less than 2 hours. Those living an irregular life showed a tendency toward higher rates of complaints for most physical and psychological subjective factors. Those who were satisfied with their environments showed significantly lower scores in the mouth/anus, impulsiveness, mental irritability, depression, and nervousness items. In summary, this study shows that the health complaint scores regarding physical and psychological symptoms tended to be higher among the unsatisfied group, the irregular life group, the group who worked long hours, the elderly, smokers, and females. These results can be used to improve the psychosomatic health status and working environments of employees working in new buildings. PMID:24278609

  5. Mask roughness induced LER control and mitigation: aberrations sensitivity study and alternate illumination scheme

    SciTech Connect

    McClinton, Brittany M.; Naulleau, Patrick P.

    2011-03-11

    Here we conduct a mask-roughness-induced line-edge-roughness (LER) aberrations sensitivity study both as a random distribution amongst the first 16 Fringe Zernikes (for overall aberration levels of 0.25, 0.50, and 0.75 nm rms) as well as an individual aberrations sensitivity matrix over the first 37 Fringe Zernikes. Full 2D aerial image modeling for an imaging system with NA = 0.32 was done for both the 22-nm and 16-nm half-pitch nodes on a rough mask with a replicated surface roughness (RSR) of 100 pm and a correlation length of 32 nm at the nominal extreme-ultraviolet lithography (EUVL) wavelength of 13.5 nm. As the ideal RSR value for commercialization of EUVL is 50 pm and under, and furthermore as has been shown elsewhere, a correlation length of 32 nm of roughness on the mask sits on the peak LER value for an NA = 0.32 imaging optic, these mask roughness values, and consequently the aberration sensitivity study presented here, represent a worst-case scenario. The illumination conditions were chosen based on the possible candidates for the 22-nm and 16-nm half-pitch nodes, respectively. In the 22-nm case, a disk illumination setting of σ = 0.50 was used, and for the 16-nm case, crosspole illumination with σ = 0.10 at an optimum offset of dx = 0 and dy = 0.67 in sigma space. In examining how to mitigate mask roughness induced LER, we considered an alternate illumination scheme whereby a traditional dipole's angular spectrum is extended in the direction parallel to the line-and-space mask absorber pattern to represent a 'strip'. While this illumination surprisingly provides minimal improvement to the LER as compared to several alternate illumination schemes, the overall imaging quality in terms of image-log-slope (ILS) and contrast is improved.
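    The abstract describes distributing a fixed overall wavefront error randomly among the first 16 Fringe Zernike terms. A minimal sketch of one way to do this, assuming rms-normalized Zernike terms so that the total rms is the root-sum-of-squares of the coefficients, is given below; the actual modeling code used in the study is not described in the abstract.

```python
# Hedged sketch: draw random coefficients for the first 16 Fringe Zernike terms
# and rescale them so the root-sum-of-squares equals the target wavefront rms.
import numpy as np

def random_zernike_coeffs(n_terms=16, target_rms_nm=0.25, seed=0):
    rng = np.random.default_rng(seed)
    coeffs = rng.standard_normal(n_terms)
    # Assuming rms-normalized Zernike terms, total wavefront rms is the
    # root-sum-of-squares of the coefficients.
    coeffs *= target_rms_nm / np.sqrt(np.sum(coeffs**2))
    return coeffs

for level in (0.25, 0.50, 0.75):
    c = random_zernike_coeffs(target_rms_nm=level)
    print(level, np.sqrt(np.sum(c**2)))   # recovers the requested rms level
```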

  6. Effects of Long-Term Land Subsidence on the Flood Hazard- A Case Study in the Southwest Coastal Area, Taiwan

    NASA Astrophysics Data System (ADS)

    Chang, Yin-Lung; Tsai, Tung-Lin; Yang, Jinn-Chuang; Chiu, Hsin-Yu

    2014-05-01

    Typically, flood hazard assessment is conducted for the current state of topography. However, the spatial-temporal variability of land subsidence should be considered when assessing the flood hazard of land subsidence prone areas. This study numerically investigated the effects of pumping-induced land subsidence on the freeboard and inundation depth in the southwest coastal area of Taiwan. First, the spatial distribution of cumulative land subsidence between 2012 and 2021 was predicted by a process-based land subsidence model. The digital elevation model (DEM) and channel geometry in 2021 were produced from the predicted land subsidence field along with the present topography data. The freeboard and inundation depth before and after ten years of land subsidence were simulated with the SOBEK Suite (1D channel flow coupled with 2D overland flow). The analysis of freeboard showed that, except in the extremely low-lying area where the elevation is lower than the spring high tide level, the change of freeboard after ten years of land subsidence is mostly influenced by the spatial variation of the land subsidence field. A higher rate of change of the land subsidence field along the channel direction induces a more significant change of freeboard. In addition, the freeboard at a cross section tends to decrease after ten years of land subsidence if the land subsidence decreases in the channel downstream direction, and vice versa. However, the decrease (or increase) in freeboard at a cross section is typically less than 0.3 times the magnitude of land subsidence at the same cross section. The spatial variation of the land subsidence field also significantly influences the change of inundation depth outside the extremely low-lying area. The inundation depth at a computation grid tends to increase after ten years of land subsidence if the land subsidence at that grid is greater than that at its neighbouring grids, and vice versa. However, the increase (or decrease) in inundation depth at a grid is significantly less than the magnitude of land subsidence at the same grid. Unlike the changes of freeboard and inundation depth outside the extremely low-lying area, the freeboard and inundation depth within the extremely low-lying area always decrease and increase, respectively, after ten years of land subsidence. Furthermore, the decrease in freeboard at a channel cross section or the increase in inundation depth at a grid is typically 0.8 to 1.0 times the magnitude of land subsidence at the same cross section or grid. To mitigate the effect of pumping-induced land subsidence on flood hazard, a groundwater quantity management model was developed. The management model determines the optimal pumping patterns that prevent the flood hazard from increasing due to long-term subsidence while satisfying the groundwater demand. This study showed that for a coastal area with a potential land subsidence problem, the spatial-temporal variability of future land subsidence should be quantified and incorporated into flood management.
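    The quoted ratios can be used as a rough screening rule for the expected freeboard change at a cross section, as in the sketch below; the study itself derived these changes from coupled 1D/2D SOBEK simulations, so this is only a back-of-the-envelope reading of the reported bounds.

```python
# Back-of-the-envelope use of the ratios reported above: screen the expected
# magnitude of freeboard change at a cross section from a predicted subsidence
# value. The factors 0.3 and 0.8-1.0 are simply the bounds quoted in the abstract.
def freeboard_change_bound(subsidence_m: float, extremely_low_lying: bool) -> tuple:
    """Return a (low, high) bound on the magnitude of freeboard change in metres."""
    if extremely_low_lying:
        return (0.8 * subsidence_m, 1.0 * subsidence_m)   # always a decrease
    return (0.0, 0.3 * subsidence_m)                       # sign depends on subsidence gradient

print(freeboard_change_bound(0.5, extremely_low_lying=False))  # (0.0, 0.15)
print(freeboard_change_bound(0.5, extremely_low_lying=True))   # (0.4, 0.5)
```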

  7. GIS-based flood hazard mapping at different administrative scales: A case study in Gangetic West Bengal, India

    Microsoft Academic Search

    Joy Sanyal; X. X. Lu

    2006-01-01

    This paper addresses the need for an efficient and cost-effective methodology for preparing flood hazard maps in data poor countries, particularly those under a monsoon regime where floods pose a recurrent danger. Taking Gangetic West Bengal, India, as an example and using available historical data from government agencies, the study compiled a regional map indicating hazard prone subregional areas

  8. Seaside, Oregon, Tsunami Pilot Study-Modernization of FEMA Flood Hazard Maps: GIS Data

    USGS Publications Warehouse

    Wong, Florence L.; Venturato, Angie J.; Geist, Eric L.

    2006-01-01

    Introduction: The Federal Emergency Management Agency (FEMA) Flood Insurance Rate Map (FIRM) guidelines do not currently exist for conducting and incorporating tsunami hazard assessments that reflect the substantial advances in tsunami research achieved in the last two decades; this conclusion is the result of two FEMA-sponsored workshops and the associated Tsunami Focused Study (Chowdhury and others, 2005). Therefore, as part of FEMA's Map Modernization Program, a Tsunami Pilot Study was carried out in the Seaside/Gearhart, Oregon, area to develop an improved Probabilistic Tsunami Hazard Analysis (PTHA) methodology and to provide recommendations for improved tsunami hazard assessment guidelines (Tsunami Pilot Study Working Group, 2006). The Seaside area was chosen because it is typical of many coastal communities in the section of the Pacific Coast from Cape Mendocino to the Strait of Juan de Fuca, and because State agencies and local stakeholders expressed considerable interest in mapping the tsunami threat to this area. The study was an interagency effort by FEMA, U.S. Geological Survey, and the National Oceanic and Atmospheric Administration (NOAA), in collaboration with the University of Southern California, Middle East Technical University, Portland State University, Horning Geoscience, Northwest Hydraulics Consultants, and the Oregon Department of Geological and Mineral Industries. We present the spatial (geographic information system, GIS) data from the pilot study in standard GIS formats and provide files for visualization in Google Earth, a global map viewer.

  9. When does highway construction to mitigate congestion reduce carbon emissions? A Case Study: The Caldecott Tunnel

    NASA Astrophysics Data System (ADS)

    Thurlow, M. E.; Maness, H.; Wiersema, D. J.; Mcdonald, B. C.; Harley, R.; Fung, I. Y.

    2014-12-01

    The construction of the fourth bore of the Caldecott Tunnel, which connects Oakland and Moraga, CA on State Route 24, was the second largest roadway construction project in California last year with a total cost of $417 million. The objective of the fourth bore was to reduce traffic congestion before the tunnel entrance in the off-peak direction of travel, but the project was a source of conflict between policy makers and environmental and community groups concerned about the air quality and traffic impacts. We analyze the impact of the opening of the fourth bore on CO2 emissions associated with traffic. We made surface observations of CO2 from a mobile platform along State Route 24 for several weeks in November 2013, incorporating the period prior to and after the opening of the fourth bore on November 16, 2013. We directly compare bottom-up and top-down approaches to estimate the change in traffic emissions associated with the fourth bore opening. A bottom-up emissions inventory was derived from the high-resolution Performance Measurement System (PeMS) dataset and the Multi-scale Motor Vehicle and Equipment Emissions System (MOVES). The emissions inventory was used to drive a box model as well as a high-resolution regional transport model (the Weather Research and Forecasting Model). The box model was also used to derive emissions from observations in a basic inversion. We also present an analysis of long-term traffic patterns and consider the potential for compensating changes in behavior that offset the observed emissions reductions on longer timescales. Finally, we examine how the results from the Caldecott study demonstrate the general benefit of using mobile measurements for quantifying environmental impacts of congestion mitigation projects.
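    A top-down, box-model style estimate of the kind mentioned above can be sketched as a simple mass balance: the roadway emission rate per unit length is the CO2 enhancement over background times wind speed times mixing height. The function and all numbers below are illustrative assumptions, not values from the study.

```python
def line_source_emission(c_obs_ppm, c_bg_ppm, wind_m_s, mixing_height_m):
    """CO2 emission rate per metre of roadway (g s^-1 m^-1) from a simple
    box/line-source mass balance: enhancement x wind speed x mixing height."""
    ppm_to_g_per_m3 = 44.01 / 24.45 * 1e-3   # CO2 at ~25 degC and 1 atm
    return (c_obs_ppm - c_bg_ppm) * ppm_to_g_per_m3 * wind_m_s * mixing_height_m

# Hypothetical numbers: a 5 ppm enhancement over background, 2 m/s wind and a
# 50 m mixing height give roughly 0.9 g CO2 per second per metre of road.
print(line_source_emission(c_obs_ppm=405.0, c_bg_ppm=400.0,
                           wind_m_s=2.0, mixing_height_m=50.0))
```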

  10. Mitigation of Ice Loading on Offshore Wind Turbines: Feasibility Study of a Semiactive Solution

    Microsoft Academic Search

    Arkadiusz Mróz; Jan Holnicki-Szulc; Tuomo Kärnä

    This paper focuses on mitigation of dynamic loading induced on off-shore towers by drifting ice. Conical structures attached to off-shore towers at water level are widely used to reduce static and dynamic ice actions. The level of remaining forces is enough to cause severe tower vibrations and, in some situations, also self-induced excitation. A need for an additional device

  11. Examination of Icing Induced Loss of Control and Its Mitigations

    NASA Technical Reports Server (NTRS)

    Reehorst, Andrew L.; Addy, Harold E., Jr.; Colantonio, Renato O.

    2010-01-01

    Factors external to the aircraft are often a significant causal factor in loss of control (LOC) accidents. In today's aviation world, very few accidents stem from a single cause; they typically have a number of causal factors that culminate in a LOC accident. Very often the "trigger" that initiates an accident sequence is an external environmental factor. In a recent NASA statistical analysis of LOC accidents, aircraft icing was shown to be the most common external environmental LOC causal factor for scheduled operations. When investigating LOC accidents or incidents, aircraft icing causal factors can be categorized into three groups: 1) in-flight encounter with super-cooled liquid water clouds, 2) take-off with ice contamination, or 3) in-flight encounter with high concentrations of ice crystals. As with other flight hazards, icing-induced LOC accidents can be prevented through avoidance, detection, and recovery mitigations. For icing hazards, avoidance can take the form of avoiding flight into icing conditions or of making the aircraft tolerant to icing conditions. Icing detection mitigations can take the form of detecting icing conditions or detecting early performance degradation caused by icing. Recovery from icing-induced LOC requires flight crew or automated systems capable of accounting for reduced aircraft performance and degraded control authority during the recovery maneuvers. In this report we review the icing-induced LOC accident mitigations defined in a recent LOC study and, for each mitigation, describe a research topic required to enable or strengthen the mitigation. Many of these research topics are already included in ongoing or planned NASA icing research activities or are being addressed by members of the icing research community. These research activities are described, and the status of the ongoing or planned research to address the technology needs is discussed.

  12. Assessment and indirect adjustment for confounding by smoking in cohort studies using relative hazards models.

    PubMed

    Richardson, David B; Laurier, Dominique; Schubauer-Berigan, Mary K; Tchetgen Tchetgen, Eric; Cole, Stephen R

    2014-11-01

    Workers' smoking histories are not measured in many occupational cohort studies. Here we discuss the use of negative control outcomes to detect and adjust for confounding in analyses that lack information on smoking. We clarify the assumptions necessary to detect confounding by smoking and the additional assumptions necessary to indirectly adjust for such bias. We illustrate these methods using data from 2 studies of radiation and lung cancer: the Colorado Plateau cohort study (1950-2005) of underground uranium miners (in which smoking was measured) and a French cohort study (1950-2004) of nuclear industry workers (in which smoking was unmeasured). A cause-specific relative hazards model is proposed for estimation of indirectly adjusted associations. Among the miners, the proposed method suggests no confounding by smoking of the association between radon and lung cancer--a conclusion supported by adjustment for measured smoking. Among the nuclear workers, the proposed method suggests substantial confounding by smoking of the association between radiation and lung cancer. Indirect adjustment for confounding by smoking resulted in an 18% decrease in the adjusted estimated hazard ratio, yet this cannot be verified because smoking was unmeasured. Assumptions underlying this method are described, and a cause-specific proportional hazards model that allows easy implementation using standard software is presented. PMID:25245043
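    The indirect-adjustment idea can be reduced to a one-line calculation: divide the crude hazard ratio by a confounding bias factor inferred from the negative control outcome. The sketch below illustrates this arithmetic with hypothetical numbers (only the 18% figure is taken from the abstract).

```python
# Hedged sketch of indirect adjustment for an unmeasured confounder: the bias
# factor estimated from a negative control outcome is divided out of the crude
# hazard ratio. The crude HR below is invented for illustration.
def indirectly_adjusted_hr(crude_hr: float, bias_factor: float) -> float:
    """Crude exposure-outcome hazard ratio divided by the estimated bias factor."""
    return crude_hr / bias_factor

crude = 1.50                     # hypothetical crude HR for radiation and lung cancer
bias = 1.0 / (1.0 - 0.18)        # a bias factor consistent with an 18% downward adjustment
print(round(indirectly_adjusted_hr(crude, bias), 2))   # ~1.23
```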

  13. Proportional hazards regression for the analysis of clustered survival data from case-cohort studies

    PubMed Central

    Schaubel, Douglas E.; Kalbfleisch, John D.

    2015-01-01

    Summary Case-cohort sampling is a commonly used and efficient method for studying large cohorts. Most existing methods of analysis for case-cohort data have concerned the analysis of univariate failure time data. However, clustered failure time data are commonly encountered in public health studies. For example, patients treated at the same center are unlikely to be independent. In this article, we consider methods based on estimating equations for case-cohort designs for clustered failure time data. We assume a marginal hazards model, with a common baseline hazard and common regression coefficient across clusters. The proposed estimators of the regression parameter and cumulative baseline hazard are shown to be consistent and asymptotically normal, and consistent estimators of the asymptotic covariance matrices are derived. The regression parameter estimator is easily computed using any standard Cox regression software that allows for offset terms. The proposed estimators are investigated in simulation studies, and demonstrated empirically to have increased efficiency relative to some existing methods. The proposed methods are applied to a study of mortality among Canadian dialysis patients. PMID:20560939
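    The computational point in the abstract, that the regression parameter can be obtained from standard Cox software accepting an offset term, can be illustrated with statsmodels' PHReg as below. The simulated data and the constant placeholder offset are assumptions for illustration; the paper's actual case-cohort weighting is more involved.

```python
# Sketch of fitting a Cox-type model with an offset term using statsmodels.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500
x = rng.standard_normal(n)                        # a single covariate
time = rng.exponential(scale=np.exp(-0.5 * x))    # survival times with true log-hazard ratio 0.5
event = (rng.uniform(size=n) < 0.8).astype(int)   # crude censoring indicator (illustrative only)
offset = np.log(np.full(n, 2.0))                  # placeholder log-weight offset term

# In the paper's setting the offset carries the case-cohort weighting terms.
model = sm.PHReg(time, x[:, None], status=event, offset=offset)
result = model.fit()
print(result.summary())
```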

  14. Incorporating induced seismicity in the 2014 United States National Seismic Hazard Model: results of the 2014 workshop and sensitivity studies

    USGS Publications Warehouse

    Petersen, Mark D.; Mueller, Charles S.; Moschetti, Morgan P.; Hoover, Susan M.; Rubinstein, Justin L.; Llenos, Andrea L.; Michael, Andrew J.; Ellsworth, William L.; McGarr, Arthur F.; Holland, Austin A.; Anderson, John G.

    2015-01-01

    The U.S. Geological Survey National Seismic Hazard Model for the conterminous United States was updated in 2014 to account for new methods, input models, and data necessary for assessing the seismic ground shaking hazard from natural (tectonic) earthquakes. The U.S. Geological Survey National Seismic Hazard Model project uses probabilistic seismic hazard analysis to quantify the rate of exceedance for earthquake ground shaking (ground motion). For the 2014 National Seismic Hazard Model assessment, the seismic hazard from potentially induced earthquakes was intentionally not considered because we had not determined how to properly treat these earthquakes for the seismic hazard analysis. The phrases “potentially induced” and “induced” are used interchangeably in this report, however it is acknowledged that this classification is based on circumstantial evidence and scientific judgment. For the 2014 National Seismic Hazard Model update, the potentially induced earthquakes were removed from the NSHM’s earthquake catalog, and the documentation states that we would consider alternative models for including induced seismicity in a future version of the National Seismic Hazard Model. As part of the process of incorporating induced seismicity into the seismic hazard model, we evaluate the sensitivity of the seismic hazard from induced seismicity to five parts of the hazard model: (1) the earthquake catalog, (2) earthquake rates, (3) earthquake locations, (4) earthquake Mmax (maximum magnitude), and (5) earthquake ground motions. We describe alternative input models for each of the five parts that represent differences in scientific opinions on induced seismicity characteristics. In this report, however, we do not weight these input models to come up with a preferred final model. Instead, we present a sensitivity study showing uniform seismic hazard maps obtained by applying the alternative input models for induced seismicity. The final model will be released after further consideration of the reliability and scientific acceptability of each alternative input model. Forecasting the seismic hazard from induced earthquakes is fundamentally different from forecasting the seismic hazard for natural, tectonic earthquakes. This is because the spatio-temporal patterns of induced earthquakes are reliant on economic forces and public policy decisions regarding extraction and injection of fluids. As such, the rates of induced earthquakes are inherently variable and nonstationary. Therefore, we only make maps based on an annual rate of exceedance rather than the 50-year rates calculated for previous U.S. Geological Survey hazard maps.
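    For readers unfamiliar with how an annual rate of exceedance map value is built up, the following toy sketch shows the basic probabilistic hazard sum over sources: the rate of each source times the probability that its ground motion exceeds a threshold. The source rates and lognormal ground-motion parameters are invented and bear no relation to the NSHM inputs.

```python
# Toy probabilistic hazard sum: annual rate of exceeding a ground-motion level.
import math

def annual_exceedance_rate(threshold_g, sources):
    """sources: list of (annual_rate, median_gm_g, sigma_ln) tuples."""
    total = 0.0
    for annual_rate, median_gm, sigma_ln in sources:
        z = math.log(threshold_g / median_gm) / sigma_ln
        p_exceed = 0.5 * math.erfc(z / math.sqrt(2.0))   # P(lognormal ground motion > threshold)
        total += annual_rate * p_exceed
    return total

toy_sources = [(0.05, 0.10, 0.6),    # a frequent source with modest median ground motion
               (0.005, 0.30, 0.6)]   # a rarer source with stronger median ground motion
print(annual_exceedance_rate(0.2, toy_sources))   # annual rate of exceeding 0.2 g
```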

  15. Analytic study to evaluate associations between hazardous waste sites and birth defects. Final report

    SciTech Connect

    Marshall, E.G.; Gensburg, L.J.; Geary, N.S.; Deres, D.A.; Cayo, M.R.

    1995-06-01

    A study was conducted to evaluate the risk of two types of birth defects (central nervous system and musculoskeletal defects) associated with mothers' exposure to solvents, metals, and pesticides through residence near hazardous waste sites. The only environmental factor showing a statistically significant elevation in risk was living within one mile of industrial or commercial facilities emitting solvents into the air. Residence near these facilities showed elevated risk for central nervous system defects but no elevated risks for musculoskeletal defects.

  16. Seaside, Oregon Tsunami Pilot Study - modernization of FEMA flood hazard maps

    USGS Publications Warehouse

    Tsunami Pilot Study Working Group

    2006-01-01

    FEMA Flood Insurance Rate Map (FIRM) guidelines do not currently exist for conducting and incorporating tsunami hazard assessments that reflect the substantial advances in tsunami research achieved in the last two decades; this conclusion is the result of two FEMA-sponsored workshops and the associated Tsunami Focused Study. Therefore, as part of FEMA's Map Modernization Program, a Tsunami Pilot Study was carried out in the Seaside/Gearhart, Oregon, area to develop an improved Probabilistic Tsunami Hazard Assessment (PTHA) methodology and to provide recommendations for improved tsunami hazard assessment guidelines. The Seaside area was chosen because it is typical of many coastal communities in the section of the Pacific Coast from Cape Mendocino to the Strait of Juan de Fuca, and because State Agencies and local stakeholders expressed considerable interest in mapping the tsunami threat to this area. The study was an interagency effort by FEMA, U.S. Geological Survey and the National Oceanic and Atmospheric Administration, in collaboration with the University of Southern California, Middle East Technical University, Portland State University, Horning Geosciences, Northwest Hydraulics Consultants, and the Oregon Department of Geological and Mineral Industries. Draft copies and a briefing on the contents, results and recommendations of this document were provided to FEMA officials before final publication.

  17. 44 CFR 65.11 - Evaluation of sand dunes in mapping coastal flood hazard areas.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...Evaluation of sand dunes in mapping coastal flood hazard areas. 65.11 Section 65...INSURANCE AND HAZARD MITIGATION National Flood Insurance Program IDENTIFICATION AND...Evaluation of sand dunes in mapping coastal flood hazard areas. (a) General...

  18. Preparation of a national Copernicus service for detection and monitoring of land subsidence and mass movements in the context of remote sensing assisted hazard mitigation

    NASA Astrophysics Data System (ADS)

    Kalia, Andre C.; Frei, Michaela; Lege, Thomas

    2014-10-01

    Land subsidence can cause severe damage to infrastructure and buildings, and mass movements can even lead to loss of life. Detection and monitoring of these processes by terrestrial measurement techniques remain a challenge due to limitations in spatial coverage and temporal resolution. Since the launch of ERS-1 in 1991, numerous scientific studies have demonstrated the capability of differential SAR interferometry (DInSAR) for the detection of surface deformation, proving the usability of this method. In order to assist the utilization of DInSAR for governmental tasks, a national service concept within the EU-ESA Program "Copernicus" is in preparation. This is done by (i) analyzing the user requirements, (ii) developing a concept and (iii) performing case studies as a "proof of concept". Due to the iterative nature of this procedure, governmental users as well as DInSAR experts are involved. This paper introduces the concept, shows the available SAR data archive from ERS-1/2, TerraSAR-X and TanDEM-X, and presents the proposed case study. The case study focuses on the application of advanced DInSAR methods for the detection of subsidence in a region with active gas extraction. The area of interest is located in the state of Lower Saxony in the northwest of Germany. The DInSAR analysis will be based on ERS-1/2 and on TerraSAR-X/TanDEM-X SAR data. The usability of the DInSAR products will be discussed with the responsible mining authority (LBEG) in order to adapt the DInSAR products to the user needs and to evaluate the proposed concept.
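    The core conversion that DInSAR deformation products rely on is not spelled out in the abstract, but it is a standard relation: a differential phase change maps to line-of-sight displacement via the radar wavelength. A small worked example is given below (the X-band wavelength value is approximate and the sign convention varies between processors).

```python
# Standard DInSAR relation: d = -lambda * delta_phi / (4 * pi).
import numpy as np

def los_displacement_m(delta_phase_rad, wavelength_m):
    """Line-of-sight displacement for a given differential interferometric phase change."""
    return -wavelength_m * delta_phase_rad / (4.0 * np.pi)

# TerraSAR-X operates at X-band (wavelength ~3.1 cm); one full fringe (2*pi)
# therefore corresponds to about half a wavelength, ~1.55 cm, of line-of-sight motion.
print(los_displacement_m(2 * np.pi, 0.031))   # ~ -0.0155 m (sign convention varies)
```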

  19. FDA-iRISK--a comparative risk assessment system for evaluating and ranking food-hazard pairs: case studies on microbial hazards.

    PubMed

    Chen, Yuhuan; Dennis, Sherri B; Hartnett, Emma; Paoli, Greg; Pouillot, Régis; Ruthman, Todd; Wilson, Margaret

    2013-03-01

    Stakeholders in the system of food safety, in particular federal agencies, need evidence-based, transparent, and rigorous approaches to estimate and compare the risk of foodborne illness from microbial and chemical hazards and the public health impact of interventions. FDA-iRISK (referred to here as iRISK), a Web-based quantitative risk assessment system, was developed to meet this need. The modeling tool enables users to assess, compare, and rank the risks posed by multiple food-hazard pairs at all stages of the food supply system, from primary production, through manufacturing and processing, to retail distribution and, ultimately, to the consumer. Using standard data entry templates, built-in mathematical functions, and Monte Carlo simulation techniques, iRISK integrates data and assumptions from seven components: the food, the hazard, the population of consumers, process models describing the introduction and fate of the hazard up to the point of consumption, consumption patterns, dose-response curves, and health effects. Beyond risk ranking, iRISK enables users to estimate and compare the impact of interventions and control measures on public health risk. iRISK provides estimates of the impact of proposed interventions in various ways, including changes in the mean risk of illness and burden of disease metrics, such as losses in disability-adjusted life years. Case studies for Listeria monocytogenes and Salmonella were developed to demonstrate the application of iRISK for the estimation of risks and the impact of interventions for microbial hazards. iRISK was made available to the public at http://irisk.foodrisk.org in October 2012. PMID:23462073
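    A minimal Monte Carlo sketch of the kind of food-hazard risk calculation such a tool automates is shown below: sample the dose at consumption, apply a dose-response model, and compare a baseline with an intervention scenario. The lognormal contamination model, the exponential dose-response form and all parameter values are illustrative assumptions, not iRISK defaults.

```python
# Hedged Monte Carlo sketch of a per-serving risk estimate for a food-hazard pair.
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

def mean_risk_per_serving(log10_reduction=0.0, r=1e-9):
    # Contamination (CFU per serving) at consumption, lognormal assumption.
    dose = 10 ** rng.normal(loc=3.0, scale=1.0, size=N)
    dose *= 10 ** (-log10_reduction)          # effect of a processing intervention
    p_ill = 1.0 - np.exp(-r * dose)           # exponential dose-response model
    return p_ill.mean()

baseline = mean_risk_per_serving()
with_intervention = mean_risk_per_serving(log10_reduction=2.0)  # e.g. a 2-log kill step
print(f"baseline risk/serving: {baseline:.2e}")
print(f"with intervention:     {with_intervention:.2e}")
```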

  20. Study on anaerobic digestion treatment of hazardous colistin sulphate contained pharmaceutical sludge.

    PubMed

    Yin, Fubin; Wang, Dongling; Li, Zifu; Ohlsen, Thomas; Hartwig, Peter; Czekalla, Sven

    2015-02-01

    Pharmaceutical sludge is considered a hazardous substance with high treatment and disposal fees. Anaerobic digestion can not only transform this hazardous substance into activated sludge, but also generate valuable biogas. This research had two objectives: first, to study the feasibility of anaerobic digestion and determine the biochemical methane potential (BMP) of pharmaceutical sludge at inoculum-to-substrate TS ratios (ISRs) of 0, 0.65, 2.58 and 10.32 under mesophilic conditions of 37±1°C; and second, to investigate the removal efficiency of colistin sulphate during anaerobic digestion. The results showed that anaerobic digestion is a feasible treatment for pharmaceutical sludge and that it can completely eliminate the colistin sulphate. The highest biogas production from pharmaceutical sludge was 499.46 mL/g TS at an ISR of 10.32. PMID:25490101

  1. Intrinsic vulnerability, hazard and risk mapping for karst aquifers: A case study

    NASA Astrophysics Data System (ADS)

    Mimi, Ziad A.; Assi, Amjad

    2009-01-01

    Groundwater from karst aquifers is among the most important resources for drinking water supply of populations worldwide. The European COST Action 620 proposed a comprehensive approach to karst groundwater protection, comprising methods of intrinsic and specific vulnerability mapping, hazard mapping and risk mapping. This paper presents the first application of all components of this European approach to the groundwater underlying the Ramallah district, a karst hydrogeological system in Palestine. The vulnerability maps which were developed can assist in the implementation of groundwater management strategies to prevent degradation of groundwater quality. Large areas of the case study area can be classified as low or very low risk with respect to pollution sources, due to the absence of hazards and to low vulnerabilities. These areas could consequently be interesting for future development, as they are preferable in view of groundwater protection.
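    Hazard and vulnerability maps of this kind are typically combined cell by cell into a risk map. The sketch below uses simple 1-4 integer classes and a product-then-reclassify rule as a stand-in for the COST Action 620 index scheme, which defines its own classes.

```python
# Simplified cell-by-cell combination of hazard and vulnerability classes into risk classes.
import numpy as np

hazard = np.array([[1, 2, 3],
                   [1, 1, 4],
                   [2, 3, 4]])        # 1 = very low ... 4 = high hazard
vulnerability = np.array([[1, 1, 2],
                          [3, 2, 2],
                          [4, 4, 1]]) # 1 = very low ... 4 = high vulnerability

risk_index = hazard * vulnerability   # cell-by-cell combination
# Reclassify the product into broad risk categories for mapping.
risk_class = np.digitize(risk_index, bins=[2, 6, 12])   # 0 = very low ... 3 = high
print(risk_class)
```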

  2. Experimental Study on the Mitigation of Backdraft in Compartment Fires with Water Mist

    Microsoft Academic Search

    W. G. Weng; W. C. Fan

    2002-01-01

    In this paper, the results of a reduced-scale experimental test series using water mist to mitigate backdraft in compartment fires are presented. This reduced-scale compartment (1.2 m × 0.6 m × 0.6 m) was fitted with a variety of end opening geometries: middle-slot, upside-slot, downside-slot, door, window and vertical middle-slot, and ceiling opening geometries: slot and window. Water mist was generated by a

  3. Climate engineering of vegetated land for hot extremes mitigation: an ESM sensitivity study

    NASA Astrophysics Data System (ADS)

    Wilhelm, Micah; Davin, Edouard; Seneviratne, Sonia

    2014-05-01

    Mitigation efforts to reduce anthropogenic climate forcing have thus far proven inadequate, as evident from accelerating greenhouse gas emissions. Many subtropical and mid-latitude regions are expected to experience longer and more frequent heat waves and droughts within the next century. This increased occurrence of weather extremes has important implications for human health, mortality and for socio-economic factors including forest fires, water availability and agricultural production. Various solar radiation management (SRM) schemes that attempt to homogeneously counter the anthropogenic forcing have been examined with different Earth System Models (ESM). Land climate engineering schemes, which reduce the amount of solar radiation absorbed at the surface, have also been investigated. However, most studies have examined the mean climate response rather than the effects on extremes. Here we present the results of a series of climate engineering sensitivity experiments performed with the Community Earth System Model (CESM) version 1.0.2 at 2°-resolution. This configuration entails 5 fully coupled model components responsible for simulating the Earth's atmosphere, land, land-ice, ocean and sea-ice that interact through a central coupler. Historical and RCP8.5 scenarios were performed with transient land-cover changes and prognostic terrestrial Carbon/Nitrogen cycles. Four sets of experiments are performed in which surface albedo over snow-free vegetated grid points is increased by 0.05, 0.10, 0.15 and 0.20. The simulations show a strong preferential cooling of hot extremes throughout the Northern mid-latitudes during boreal summer. A strong linear scaling between the cooling of extremes and additional surface albedo applied to the land model is observed. The strongest preferential cooling is found in southeastern Europe and the central United States, where increases of soil moisture and evaporative fraction are the largest relative to the control simulation. This preferential cooling is found to intensify in the future scenario. Cloud cover strongly limits the efficacy of a given surface albedo increase to reflect incoming solar radiation back into space. As anthropogenic forcing increases, cloud cover decreases over much of the northern mid-latitudes in CESM.

  4. Use of geotextiles for mitigation of the effects of man-made hazards such as greening of waste deposits in frame of the conversion of industrial areas

    NASA Astrophysics Data System (ADS)

    Bostenaru, Magdalena; Siminea, Ioana; Bostenaru, Maria

    2010-05-01

    The city of Karlsruhe lies in the Rhine valley; however, it is situated at some distance from the Rhine river, and the river front is not integrated into the urban development. Nevertheless, the Rhine port has developed into the second-largest inland port in Germany. With the process of deindustrialisation, industrial use is now shrinking. With the simultaneous ecological restoration of rivers, the conversion of the industrial area to green and residential areas is called for. In the 1990s, the third author of this contribution, together with Andrea Ciobanu, both then students at the University of Karlsruhe, developed a project for the conversion of the Rhine port area of Karlsruhe into such a nature and residential use. The area also included a waste deposit, which was proposed to be transformed into a "green hill". Such an integration of a waste deposit into a park during the conversion of an industrial area is not unique in Germany; several such projects were proposed and some of them realised at the IBA Emscher Park in the Ruhr area, some coupled with artistic projects. The technical details are also a subject of this contribution. The first two authors studied the conditions in which plants grow on former waste deposits when supported by intermediate layers of a geotextile. The characteristics of the geotextiles, together with the technological process for producing them, and the results of laboratory and field experiments for use on waste deposits under comparable conditions in Romania will be shown. The geotextile is also usable for ash deposits such as those in the Ruhr area.

  5. Mitigation: Interfaces between NASA, Risk Managers, and the Public Clark R. Chapman1,2

    E-print Network

    Chapman, Clark R.

    Mitigation," Vail CO, 26 June 2006. Summary: Threat mitigation, in the disaster management community's responsibility. I. NEO Threat Mitigation: All- Hazards Disaster Management Strategy The Congressional mandate1 Mitigation: Interfaces between NASA, Risk Managers, and the Public Clark R. Chapman1,2 , Daniel D

  6. Study on the Application of Probabilistic Tsunami Hazard Analysis for the Nuclear Power Plant Site in Korean Peninsula

    NASA Astrophysics Data System (ADS)

    Rhee, H. M.; Kim, M.; Sheen, D. H.; Choi, I. K.

    2014-12-01

    The need for tsunami hazard assessment at Nuclear Power Plant (NPP) sites has been recognized since the Fukushima event of 2011, and it is particularly emphasized in Korea because all of the NPPs on the Korean Peninsula are located in coastal regions. The tsunami hazard is expressed as the annual exceedance probability of wave heights. The methodology for tsunami hazard analysis is based on seismic hazard analysis, which has been performed using both deterministic and probabilistic methods. Recently, the probabilistic method has received more attention than the deterministic method because uncertainties in the hazard analysis can be treated using a logic tree approach. In this study, a probabilistic tsunami hazard analysis for the Uljin NPP site was performed using the fault source information published by the Atomic Energy Society of Japan (AESJ). The wave parameter is the parameter that differs most from seismic hazard analysis; it is estimated from the results of tsunami propagation analysis. TSUNAMI_ver1.0, developed by the Japan Nuclear Energy Safety Organization (JNES), was used for the tsunami simulations. Eighty tsunami simulation cases were run and the wave parameters were estimated. To reduce the sensitivity caused by the location of individual sampling points, the wave parameters were estimated from groups of sampling points. The probability density function of tsunami height was computed from the recurrence intervals and the wave parameters, and the exceedance probability distribution was then calculated from the probability density function. Tsunami hazards were calculated for each sampling group, and fractile curves expressing the uncertainties of the input parameters were estimated from the hazards using a round-robin algorithm. In general, tsunami hazard analysis focuses on maximum wave heights, but the minimum wave height should also be considered for NPP sites because it affects the water intake system. The results of the tsunami hazard analysis for the NPP site are presented as annual exceedance probabilities of wave heights. This study shows that the PTHA method can be applied to estimate tsunami wave heights at NPP sites.
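    The final product of such an analysis, an annual exceedance probability curve for wave height, can be illustrated with a toy calculation: sum the annual rates of all scenarios whose simulated wave height exceeds a given level. The scenario list below is invented; the real analysis propagates each fault source numerically and carries uncertainties through a logic tree.

```python
# Toy hazard curve: annual exceedance rate of tsunami wave height from scenario rates.
import numpy as np

# (annual rate of the source = 1 / recurrence interval, simulated wave height in m)
scenarios = [(1/500.0, 2.5), (1/1000.0, 4.0), (1/2000.0, 6.5), (1/200.0, 1.2)]

def annual_exceedance(height_m, scenarios):
    """Sum of the annual rates of all scenarios whose simulated wave height
    exceeds the given height (small Poisson rates assumed additive)."""
    return sum(rate for rate, h in scenarios if h > height_m)

for h in np.arange(0.0, 7.0, 0.5):
    print(f"{h:4.1f} m : {annual_exceedance(h, scenarios):.2e} /yr")
```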

  7. Effects of anthropogenic land-subsidence on river flood hazard: a case study in Ravenna, Italy

    NASA Astrophysics Data System (ADS)

    Carisi, Francesca; Domeneghetti, Alessio; Castellarin, Attilio

    2015-04-01

    Can differential land subsidence significantly alter river flooding dynamics, and thus flood risk, in flood-prone areas? Many studies show how the lowering of coastal areas is closely related to an increase in flood hazard due to more severe tidal flooding and sea level rise. In contrast, the literature on the relationship between differential land subsidence and possible alterations to the riverine flood hazard of inland areas is still sparse, even though several areas characterized by significant land subsidence rates during the second half of the 20th century experienced an intensification in both inundation magnitude and frequency. This study investigates the possible impact of significant differential ground lowering on flood hazard in the vicinity of Ravenna, one of the oldest Italian cities and former capital of the Western Roman Empire, located a few kilometers from the Adriatic coast and about 60 km south of the Po River delta. The rate of land subsidence in the area, naturally on the order of a few mm/year, increased dramatically, up to 110 mm/year, after World War II, primarily due to groundwater pumping and a number of deep onshore and offshore gas production platforms. Over the last century the subsidence caused a cumulative drop of more than 1.5 m in the historical center of the city. Starting from this evidence, and taking advantage of a recent digital elevation model of 10 m resolution, we reconstructed the ground elevation in 1897 for an area of about 65 km2 around the city of Ravenna. We used these two digital elevation models (i.e., current topography and topographic reconstruction) and a 2D finite-element numerical model to simulate the inundation dynamics associated with several levee failure scenarios along the embankment system of the river Montone. For each scenario and digital elevation model, the flood hazard is quantified in terms of water depth, speed and dynamics of the flooding front. The comparison enabled us to quantify alterations to the flood hazard due to large and rapid differential land subsidence, shedding some light on whether to consider anthropogenic land subsidence among the relevant human-induced drivers of flood-risk change.

  8. Economics of Tsunami Mitigation in the Pacific Northwest

    NASA Astrophysics Data System (ADS)

    Goettel, K. A.; Rizzo, A.; Sigrist, D.; Bernard, E. N.

    2011-12-01

    The death total in a major Cascadia Subduction Zone (CSZ) tsunami may be comparable to the Tohoku tsunami - tens of thousands. To date, tsunami risk reduction activities have been almost exclusively hazard mapping and evacuation planning. Reducing deaths in locations where evacuation to high ground is impossible in the short time between ground shaking and arrival of tsunamis requires measures such as vertical evacuation facilities or engineered pathways to safe ground. Yet, very few, if any, such tsunami mitigation projects have been done. In contrast, many tornado safe room and earthquake mitigation projects driven entirely or largely by life safety have been done with costs in the billions of dollars. The absence of tsunami mitigation measures results from the belief that tsunamis are too infrequent and the costs too high to justify life safety mitigation measures. A simple analysis based on return periods, death rates, and the geographic distribution of high risk areas for these hazards demonstrates that this belief is incorrect: well-engineered tsunami mitigation projects are more cost-effective with higher benefit-cost ratios than almost all tornado or earthquake mitigation projects. Goldfinger's paleoseismic studies of CSZ turbidites indicate return periods for major CSZ tsunamis of about 250-500 years (USGS Prof. Paper 1661-F in press). Tsunami return periods are comparable to those for major earthquakes at a given location in high seismic areas and are much shorter than those for tornados at any location, which range from >4,000 to >16,000 years for >EF2 and >EF4 tornadoes, respectively. The average earthquake death rate in the US over the past 100 years is about 1/year, or about 30/year including the 1906 San Francisco earthquake. The average death rate for tornadoes is about 90/year. For CSZ tsunamis, the estimated average death rate ranges from about 20/year (10,000 every 500 years) to 80/year (20,000 every 250 years). Thus, the long term death rates from tsunamis, earthquakes and tornadoes are comparable. High hazard areas for tornadoes and earthquakes cover ~40% and ~15% of the contiguous US, ~1,250,000 and ~500,000 square miles, respectively. In marked contrast, tsunami life safety risk is concentrated in communities with significant populations in areas where evacuation to high ground is impossible: probably <4,000 square miles or <0.1% of the US. The geographic distribution of life safety risk profoundly affects the economics of tsunami life safety mitigation projects. Consider a tsunami life safety project which saves an average of one life per year (500 lives per 500 years). Using FEMA's value of human life ($5.8 million), a 7% discount rate and a 50-year project useful lifetime, the net present value of avoided deaths is about $80 million. Thus, the benefit-cost ratio would be about 16 or about 80 for tsunami mitigation projects which cost $5 million or $1 million, respectively. These rough calculations indicate that tsunami mitigation projects in high risk locations are economically justified. More importantly, these results indicate that national and local priorities for natural hazard mitigation should be reconsidered, with tsunami mitigation given a very high priority.
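    The benefit-cost arithmetic quoted above can be reproduced directly; the short script below discounts one statistical life per year at 7% over 50 years and compares the result against the two project costs mentioned.

```python
# Reproduces the back-of-the-envelope benefit-cost calculation quoted above.
def present_value_annuity(annual_benefit, rate, years):
    """Net present value of a constant annual benefit stream."""
    return annual_benefit * (1 - (1 + rate) ** -years) / rate

value_of_life = 5.8e6          # value of a statistical life used in the abstract
npv_benefits = present_value_annuity(value_of_life, rate=0.07, years=50)
print(f"NPV of avoided deaths: ${npv_benefits / 1e6:.0f} million")   # ~ $80 million
for cost in (5e6, 1e6):
    print(f"cost ${cost / 1e6:.0f}M -> benefit-cost ratio ~ {npv_benefits / cost:.0f}")
```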

  9. Mitigation potential of horizontal ground coupled heat pumps for current and future climatic conditions: UK environmental modelling and monitoring studies

    NASA Astrophysics Data System (ADS)

    García González, Raquel; Verhoef, Anne; Vidale, Pier Luigi; Gan, Guohui; Wu, Yupeng; Hughes, Andrew; Mansour, Majdi; Blyth, Eleanor; Finch, Jon; Main, Bruce

    2010-05-01

    An increased uptake of alternative low or non-CO2 emitting energy sources is one of the key priorities for policy makers to mitigate the effects of environmental change. Relatively little work has been undertaken on the mitigation potential of Ground Coupled Heat Pumps (GCHPs) despite the fact that a GCHP could significantly reduce CO2 emissions from heating systems. It is predicted that under climate change the most probable scenario is for UK temperatures to increase and for winter rainfall to become more abundant; the latter is likely to cause a general rise in groundwater levels. Summer rainfall may reduce considerably, while vegetation type and density may change. Furthermore, recent studies underline the likelihood of an increase in the number of heat waves. Under such a scenario, GCHPs will increasingly be used for cooling as well as heating. These factors will affect long-term performance of horizontal GCHP systems and hence their economic viability and mitigation potential during their life span (~50 years). The seasonal temperature differences encountered in soil are harnessed by GCHPs to provide heating in the winter and cooling in the summer. The performance of a GCHP system will depend on technical factors (heat exchanger (HE) type, length, depth, and spacing of pipes), but it will also be determined to a large extent by interactions between the below-ground parts of the system and the environment (atmospheric conditions, vegetation and soil characteristics). Depending on the balance between extraction and rejection of heat from and to the ground, the soil temperature in the neighbourhood of the HE may fall or rise. The GROMIT project (GROund coupled heat pumps MITigation potential), funded by the Natural Environment Research Council (UK), is a multi-disciplinary research project, in collaboration with EarthEnergy Ltd., which aims to quantify the CO2 mitigation potential of horizontal GCHPs. It considers changing environmental conditions and combines model predictions of soil moisture content and soil temperature with measurements at different GCHP locations over the UK. The combined effect of environmental dynamics and horizontal GCHP technical properties on long-term GCHP performance will be assessed using a detailed land surface model (JULES: Joint UK Land Environment Simulator, Meteorological Office, UK) with additional equations embedded describing the interaction between GCHP heat exchangers and the surrounding soil. However, a number of key soil physical processes are currently not incorporated in JULES, such as groundwater flow, which, especially in lowland areas, can have an important effect on the heat flow between soil and HE. Furthermore, the interaction between HE and soil may also cause soil vapour and moisture fluxes. These will affect soil thermal conductivity and hence heat flow between the HE and the surrounding soil, which will in turn influence system performance. The project will address these issues. We propose to drive an improved version of JULES (with equations to simulate GCHP exchange embedded), with long-term gridded (1 km) atmospheric, soil and vegetation data (reflecting current and future environmental conditions) to reliably assess the mitigation potential of GCHPs over the entire domain of the UK, where uptake of GCHPs has traditionally been low. In this way we can identify areas that are most suitable for the installation of GCHPs. Only then can recommendations be made to local and regional governments, for example, on how to improve the mitigation potential in less suitable areas by adjusting GCHP configurations or design.
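
    As background to the soil/heat-exchanger interaction discussed above, the sketch below evaluates a classical damped-sinusoid estimate of undisturbed soil temperature at a typical horizontal-loop burial depth; all parameter values are assumed for illustration and this is not the JULES-based scheme used in GROMIT.

```python
import numpy as np

def soil_temperature(depth_m, day_of_year, t_mean=10.0, t_amp=8.0,
                     diffusivity=5e-7, coldest_day=35):
    """Damped-sinusoid estimate of undisturbed soil temperature (deg C).

    depth_m       : depth below surface (m)
    t_mean, t_amp : annual mean and amplitude of surface temperature (assumed)
    diffusivity   : soil thermal diffusivity (m^2/s), assumed
    """
    period = 365.0 * 24 * 3600.0                            # one year in seconds
    d = np.sqrt(2.0 * diffusivity * period / (2 * np.pi))   # damping depth (m)
    phase = 2 * np.pi * (day_of_year - coldest_day) / 365.0 - depth_m / d
    return t_mean - t_amp * np.exp(-depth_m / d) * np.cos(phase)

# Seasonal swing seen by a horizontal heat exchanger buried at ~1.2 m
for day in (35, 126, 217, 308):   # roughly Feb, May, Aug, Nov
    print(day, round(float(soil_temperature(1.2, day)), 1), "degC")
```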

  10. GIS data for the Seaside, Oregon, Tsunami Pilot Study to modernize FEMA flood hazard maps

    USGS Publications Warehouse

    Wong, Florence L.; Venturato, Angie J.; Geist, Eric L.

    2007-01-01

    A Tsunami Pilot Study was conducted for the area surrounding the coastal town of Seaside, Oregon, as part of the Federal Emergency Management Agency's (FEMA) Flood Insurance Rate Map Modernization Program (Tsunami Pilot Study Working Group, 2006). The Cascadia subduction zone extends from Cape Mendocino, California, to Vancouver Island, Canada. The Seaside area was chosen because it is typical of many coastal communities subject to tsunamis generated by far- and near-field (Cascadia) earthquakes. Two goals of the pilot study were to develop probabilistic 100-year and 500-year tsunami inundation maps using Probabilistic Tsunami Hazard Analysis (PTHA) and to provide recommendations for improving tsunami hazard assessment guidelines for FEMA and state and local agencies. The study was an interagency effort by the National Oceanic and Atmospheric Administration, U.S. Geological Survey, and FEMA, in collaboration with the University of Southern California, Middle East Technical University, Portland State University, Horning Geoscience, Northwest Hydraulics Consultants, and the Oregon Department of Geological and Mineral Industries. The pilot study model data and results are published separately as a geographic information systems (GIS) data report (Wong and others, 2006). The flood maps and GIS data are briefly described here.

  11. Awareness of occupational hazards and use of safety measures among welders: a cross-sectional study from eastern Nepal

    PubMed Central

    Budhathoki, Shyam Sundar; Singh, Suman Bahadur; Sagtani, Reshu Agrawal; Niraula, Surya Raj; Pokharel, Paras Kumar

    2014-01-01

    Objective The proper use of safety measures by welders is an important way of preventing and/or reducing a variety of health hazards that they are exposed to during welding. Knowledge of hazards and personal protective equipment (PPE), and the use of PPE, among welders in Nepal is limited. We designed a study to assess welders’ awareness of hazards and PPE, and the use of PPE among the welders of eastern Nepal, and to find a possible correlation between awareness and use of PPE among them. Materials and methods A cross-sectional study of 300 welders selected by simple random sampling from three districts of eastern Nepal was conducted using a semistructured questionnaire. Data regarding age, education level, duration of employment, awareness of hazards, safety measures and the actual use of safety measures were recorded. Results Overall, 272 (90.7%) welders were aware of at least one hazard of welding and a similar proportion of welders were aware of at least one PPE. However, only 47.7% used one or more types of PPE. Education and duration of employment were significantly associated with the awareness of hazards and of PPE and its use. The welders who reported using PPE during welding were two times more likely to have been aware of hazards (OR=2.52, 95% CI 1.09 to 5.81) and five times more likely to have been aware of PPE compared with the welders who did not report the use of PPE (OR=5.13, 95% CI 2.34 to 11.26). Conclusions The welders using PPE were those who were aware of hazards and PPE. There is a gap between being aware of hazards and PPE (90%) and use of PPE (47%) at work. Further research is needed to identify the underlying factors leading to low utilisation of PPE despite the welders of eastern Nepal being knowledgeable about it. PMID:24889850
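
    For readers unfamiliar with the odds-ratio statistics quoted above, the sketch below computes an odds ratio and its 95% confidence interval from a 2x2 table; the counts are hypothetical and do not reproduce the study's data.

```python
import math

# Hypothetical 2x2 table (not the study's raw data):
# rows = used PPE yes/no, columns = aware of hazards yes/no.
a, b = 135, 8     # PPE users: aware / not aware
c, d = 137, 20    # non-users: aware / not aware

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)          # SE of log(OR)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.2f}, 95% CI {lo:.2f} to {hi:.2f}")
```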

  12. Proceedings of the IUCAF RFI Mitigation Workshop ---Bonn RFI Mitigation with the Rapid Prototype Array

    E-print Network

    Bower, Geoffrey

    Proceedings of the IUCAF RFI Mitigation Workshop, Bonn. The Rapid Prototype Array (RPA) is a 7-element L-band interferometer designed for RFI mitigation studies. We report results obtained with the RPA in RFI measurement and mitigation.

  13. Towards the Seismic Hazard Reassessment of Paks NPP (Hungary) Site: Seismicity and Sensitivity Studies

    NASA Astrophysics Data System (ADS)

    Toth, Laszlo; Monus, Peter; Gyori, Erzsebet; Grenerczy, Gyula; Janos Katona, Tamas; Kiszely, Marta

    2015-04-01

    In the context of the extension of the Paks Nuclear Power Plant with new units, a comprehensive site seismic hazard evaluation program has been developed and has already been approved by the Hungarian authorities. This includes a 3D seismic survey, drilling of several deep boreholes, extensive geological mapping, and geophysical investigations at the site and its vicinity, as well as on near-regional and regional scales. Furthermore, all relevant techniques of modern space geodesy (GPS, PSInSAR) will also be utilized to construct a new seismotectonic model. The implementation of the project is still in progress. In the presentation, some important elements of the new seismic hazard assessment are highlighted, and some results obtained in the preliminary phase of the program are presented and discussed. The first and most important component of the program is the compilation of the seismological database, which is developed on different time scales, zooming in on different event recurrence rates such as paleo-earthquakes (10^-1/a). In 1995, Paks NPP installed and started to operate a sensitive microseismic monitoring network capable of locating earthquakes as small as magnitude 2.0 within about 100 km of the NPP site. During the two decades of operation, the microseismic monitoring network located some 2,000 earthquakes within the region of latitude 45.5-49 N and longitude 16-23 E. Out of the total number of events, 130 earthquakes were reported as 'felt events'. The largest earthquake was an event of ML 4.8, causing significant damage in the epicentral area. The results of microseismic monitoring provide valuable data for seismotectonic modelling and result in more accurate earthquake recurrence equations. The first modern PSHA of the Paks NPP site was carried out in 1995. A complex site characterization project was implemented and hazard curves were evaluated for annual frequencies of 10^-3 to 10^-5. As a follow-up, the PSHA results have been reviewed and updated in the frame of periodic safety reviews, and the hazard characterization of the site has been confirmed. The hazard curves have been extended to lower-probability events, as required by the probabilistic safety analysis. These earlier projects resulted in mean PGAs of 0.22-0.26 g and 0.43-0.54 g at 10^4- and 10^5-year return periods. The site effect and liquefaction probability have also been evaluated. As expected for a site with soft soil conditions, the amplification is greater at shorter periods for the lower-amplitude ground motion of the 10^4-year return period, compared with longer periods for the higher-amplitude 10^5-year ground motion. Further studies will be based on the improved regional seismotectonic model, state-of-the-art hazard evaluation software, and better knowledge of the local soil conditions. The presented preliminary results demonstrate the adequacy of the planned program and highlight the progress in the hazard assessment.
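
    Hazard curves of the kind described above relate annual frequency of exceedance to ground motion; the sketch below shows a log-log interpolation to read off PGA at the 10^4- and 10^5-year return periods, using placeholder curve values chosen only to fall near the ranges quoted in the record, not the actual Paks site results.

```python
import numpy as np

# Illustrative hazard curve: annual frequency of exceedance vs. PGA (g).
annual_freq = np.array([1e-2, 1e-3, 1e-4, 1e-5])
pga_g       = np.array([0.05, 0.12, 0.24, 0.50])

def pga_at_return_period(return_period_years):
    """Log-log interpolation of the hazard curve at 1/return_period."""
    target = 1.0 / return_period_years
    return float(np.exp(np.interp(np.log(target),
                                  np.log(annual_freq[::-1]),
                                  np.log(pga_g[::-1]))))

for rp in (10_000, 100_000):
    print(f"{rp:,}-year return period -> PGA ~ {pga_at_return_period(rp):.2f} g")
```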

  14. Study on Risk Assessment and Management of Landslide Hazard in New Badong County, Three Gorges Reservoir

    Microsoft Academic Search

    Wu Yiping; Yin Kunlong; Jiang Wei

    2009-01-01

    In this paper, considering the geological environment and landslide development in New Badong County, 11 representative factors are selected to assess landslide hazard in New Badong County. By using the Artificial Neural Network landslide hazard evaluation model and the advantages of GIS technology in image processing and spatial analysis, a landslide hazard zonation of New Badong County is compiled. The result of the

  15. An evaluation of soil erosion hazard: A case study in Southern Africa using geomatics technologies

    NASA Astrophysics Data System (ADS)

    Eiswerth, Barbara Alice

    Accelerated soil erosion in Malawi, Southern Africa, increasingly threatens agricultural productivity, given current and projected population growth trends. Previous attempts to document soil erosion potential have had limited success, lacking appropriate information and diagnostic tools. This study utilized geomatics technologies and the latest available information from topography, soils, climate, vegetation, and land use of a watershed in southern Malawi. The Soil Loss Estimation Model for Southern Africa (SLEMSA), developed for conditions in Zimbabwe, was evaluated and used to create a soil erosion hazard map for the watershed under Malawi conditions. The SLEMSA sub-models of cover, soil loss, and topography were computed from energy interception; rainfall energy and soil erodibility; and slope length and steepness, respectively. Geomatics technologies including remote sensing and Geographic Information Systems (GIS) provided the tools with which land cover/land use, a digital elevation model, and slope length and steepness were extracted and integrated with rainfall and soils spatial information. Geomatics technologies enable rapid update of the model as new and better data sets become available. Sensitivity analyses of the SLEMSA model revealed that rainfall energy and slope steepness have the greatest influence on soil erosion hazard estimates in this watershed. Energy interception was intermediate in sensitivity level, whereas slope length and soil erodibility ranked lowest. Energy interception and soil erodibility were shown by parameter behavior analysis to behave in a linear fashion with respect to soil erosion hazard, whereas rainfall energy, slope steepness, and slope length exhibit non-linear behavior. When SLEMSA input parameters and results were compared to alternative methods of soil erosion assessment, such as drainage density and drainage texture, the model provided more spatially explicit information using 30 meter grid cells. Results of this study indicate that more accurate soil erosion estimates can be made when: (1) higher resolution digital elevation models are used; (2) data from an improved precipitation station network are available; and (3) greater investment in rainfall energy research is made.
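
    The SLEMSA structure referred to above is multiplicative (Z = K x C x X); the sketch below follows one commonly reproduced form of the sub-model equations with illustrative inputs, and should be checked against the original formulation before any real use.

```python
import math

def slemsa_soil_loss(rain_energy_J_m2, erodibility_F, energy_intercepted_pct,
                     slope_pct, slope_length_m):
    """Rough sketch of the SLEMSA structure Z = K * C * X (t/ha/yr).

    K : soil loss from a bare standard plot, from seasonal rainfall energy (E)
        and a soil erodibility index (F)
    C : crop/cover ratio, from the % of rainfall energy intercepted (i)
    X : topographic ratio, from slope steepness (s, %) and slope length (L, m)
    Coefficients follow a commonly quoted form of the SLEMSA equations and
    are reproduced here for illustration only.
    """
    E, F, i = rain_energy_J_m2, erodibility_F, energy_intercepted_pct
    K = math.exp((0.4681 + 0.7663 * F) * math.log(E) + 2.884 - 8.2109 * F)
    C = math.exp(-0.06 * i) if i <= 50 else (2.3 - 0.01 * i) / 30.0
    X = math.sqrt(slope_length_m) * (0.76 + 0.53 * slope_pct
                                     + 0.076 * slope_pct ** 2) / 25.65
    return K * C * X

# Illustrative inputs: seasonal rainfall energy, erodibility index, 30 %
# energy interception, 8 % slope, 30 m slope length.
print(round(slemsa_soil_loss(9000, 5.0, 30, 8, 30), 1), "t/ha/yr")
```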

  16. Hazardous Waste

    MedlinePLUS

    ... you throw these substances away, they become hazardous waste. Some hazardous wastes come from products in our homes. Our garbage can include such hazardous wastes as old batteries, bug spray cans and paint ...

  17. Progress in NTHMP Hazard Assessment

    USGS Publications Warehouse

    Gonzalez, F.I.; Titov, V.V.; Mofjeld, H.O.; Venturato, A.J.; Simmons, R.S.; Hansen, R.; Combellick, R.; Eisner, R.K.; Hoirup, D.F.; Yanagi, B.S.; Yong, S.; Darienzo, M.; Priest, G.R.; Crawford, G.L.; Walsh, T.J.

    2005-01-01

    The Hazard Assessment component of the U.S. National Tsunami Hazard Mitigation Program has completed 22 modeling efforts covering 113 coastal communities with an estimated population of 1.2 million residents that are at risk. Twenty-three evacuation maps have also been completed. Important improvements in organizational structure have been made with the addition of two State geotechnical agency representatives to Steering Group membership, and progress has been made on other improvements suggested by program reviewers. © Springer 2005.

  18. Application of a Data Mining Model and Its Cross Application for Landslide Hazard Analysis: a Case Study in Malaysia

    NASA Astrophysics Data System (ADS)

    Pradhan, Biswajeet; Lee, Saro; Shattri, Mansor

    This paper deals with landslide hazard analysis and cross-application using Geographic Information System (GIS) and remote sensing data for Cameron Highland, Penang Island and Selangor in Malaysia. The aim of this study was to cross-apply and verify a spatial probabilistic model for landslide hazard analysis. Landslide locations were identified in the study area from interpretation of aerial photographs and field surveys. Topographical/geological data and satellite images were collected and processed using GIS and image processing tools. Ten landslide-inducing parameters are considered for the landslide hazard analysis. These parameters are topographic slope, aspect, curvature and distance from drainage, all derived from the topographic database; geology and distance from lineament, derived from the geologic database; landuse from Landsat satellite images; soil from the soil database; precipitation amount, derived from the rainfall database; and the vegetation index value from SPOT satellite images. These factors were analyzed using an artificial neural network model to generate the landslide hazard map. Each factor's weight was determined by the back-propagation training method. Then the landslide hazard indices were calculated using the trained back-propagation weights, and finally the landslide hazard map was generated using GIS tools. Landslide hazard maps were drawn for the three areas using the artificial neural network model trained not only on the data for that area but also using the weights for each parameter (a statistical model) calculated from each of the other two areas (nine maps in all), as a cross-check of the validity of the method. For verification, the results of the analyses were compared, in each study area, with actual landslide locations. The verification results showed sufficient agreement between the presumptive hazard map and the existing data on landslide areas.
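
    The workflow described above (causative-factor layers in, back-propagation-trained network out, predicted probability as hazard index) can be sketched as follows; the data are synthetic stand-ins and scikit-learn's MLPClassifier is used merely as a convenient substitute for the study's network.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler

# Toy stand-in for the study's grid: each row is a cell with 10 causative
# factors (slope, aspect, curvature, distance to drainage, ...); labels mark
# known landslide cells. Real inputs would come from the GIS layers.
rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 10))
y = (X[:, 0] * 0.8 + X[:, 3] * 0.5
     + rng.normal(scale=0.5, size=2000) > 1.0).astype(int)

scaler = StandardScaler().fit(X)
net = MLPClassifier(hidden_layer_sizes=(10,), activation="logistic",
                    solver="adam", max_iter=1000, random_state=0)
net.fit(scaler.transform(X), y)

# The predicted probability for each cell serves as the landslide hazard
# index; written back onto the grid it becomes the hazard map.
hazard_index = net.predict_proba(scaler.transform(X))[:, 1]
print("hazard index range:",
      round(float(hazard_index.min()), 2), "-",
      round(float(hazard_index.max()), 2))
```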

  19. Study on FPGA SEU Mitigation for the Readout Electronics of DAMPE BGO Calorimeter in Space

    NASA Astrophysics Data System (ADS)

    Shen, Zhongtao; Feng, Changqing; Gao, Shanshan; Zhang, Deliang; Jiang, Di; Liu, Shubin; An, Qi

    2015-06-01

    The BGO calorimeter, which provides a wide measurement range of the primary cosmic ray spectrum, is a key sub-detector of the Dark Matter Particle Explorer (DAMPE). The readout electronics of the calorimeter consist of 16 Actel ProASIC Plus flash-based FPGAs, whose design-level flip-flops and embedded block RAMs are sensitive to single event upsets (SEUs) in the harsh space environment. Therefore, to comply with radiation hardness assurance (RHA), SEU mitigation methods, including partial triple modular redundancy (TMR), CRC checksums, and multi-domain reset, were analyzed and verified in heavy-ion beam tests. Combining multiple levels of redundancy, an FPGA design with SEU tolerance and low resource consumption was implemented for the readout electronics.
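
    The core idea of the triple modular redundancy mentioned above is a bitwise majority vote across three copies of each register; the short simulation below illustrates how a single upset in one copy leaves the voted output intact (an editorial illustration, not the DAMPE firmware).

```python
import random

def majority(a, b, c):
    """Bitwise majority voter: each output bit is the value held by at least
    two of the three redundant register copies."""
    return (a & b) | (a & c) | (b & c)

def inject_seu(word, width=16):
    """Single event upset: flip one randomly chosen bit of a register copy."""
    return word ^ (1 << random.randrange(width))

random.seed(1)
value = 0b1010_1100_0011_0101
copies = [value, value, value]

# A single SEU corrupts one of the three flip-flop copies ...
hit = random.randrange(3)
copies[hit] = inject_seu(copies[hit])

# ... but the voted output is still correct, which is the point of TMR.
print("corrupted copy:", bin(copies[hit]))
print("voted output correct:", majority(*copies) == value)
```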

  20. Study on FPGA SEU Mitigation for Readout Electronics of DAMPE BGO Calorimeter

    E-print Network

    Zhongtao Shen; Changqing Feng; Shanshan Gao; Deliang Zhang; Di Jiang; Shubin Liu; Qi An

    2014-06-16

    The BGO calorimeter, which provides a wide measurement range of the primary cosmic ray spectrum, is a key sub-detector of the Dark Matter Particle Explorer (DAMPE). The readout electronics of the calorimeter consist of 16 Actel ProASIC Plus flash-based FPGAs, whose design-level flip-flops and embedded block RAMs are sensitive to single event upsets (SEUs) in the harsh space environment. Therefore, to comply with radiation hardness assurance (RHA), SEU mitigation methods, including partial triple modular redundancy (TMR), CRC checksums, and multi-domain reset, were analyzed and verified in heavy-ion beam tests. Combining multiple levels of redundancy, an FPGA design with SEU tolerance and low resource consumption was implemented for the readout electronics.

  1. Delayed geochemical hazard: a tool for risk assessment of heavy metal polluted sites and case study.

    PubMed

    Zheng, Mingxia; Feng, Liu; He, Juanni; Chen, Ming; Zhang, Jiawen; Zhang, Minying; Wang, Jing

    2015-04-28

    A concept of delayed geochemical hazard (DGH) was proposed, in place of 'chemical time bomb', to represent an ecological and environmental hazard caused by the sudden reactivation and release of long-term accumulated pollutants in a soil/sediment system due to a change of physicochemical conditions or a decrease of environmental capacity. A DGH model was also established to provide a quantitative tool to assess and predict the potential environmental risk caused by heavy metals and especially its dynamic evolution. A case study of DGH was carried out for a mercury-polluted area in southern China. Results of a soil column experiment showed that DGH resulted directly from the transformation and release of the pollutant from the releasable species to the active ones through a chain-reaction mechanism. The most probable chain reaction was summarized as Hg(E+C+F+O+R) → Hg(E+C+F+O) → Hg(E+C+F) → Hg(E+C) → Hg(E). Although in 8.3% of the studied area the total releasable content of mercury (TRCP_Hg) exceeded the DGH critical point value of 16.667 mg/kg, implying the possibility of a DGH burst, the area was classified as low-risk for DGH. This confirmed that the DGH model could contribute to the risk assessment and early warning of soil/sediment pollution. PMID:25661167

  2. Alternatives to Hazard Ratios for Comparing the Efficacy or Safety of Therapies in Noninferiority Studies.

    PubMed

    Uno, Hajime; Wittes, Janet; Fu, Haoda; Solomon, Scott D; Claggett, Brian; Tian, Lu; Cai, Tianxi; Pfeffer, Marc A; Evans, Scott R; Wei, Lee-Jen

    2015-07-21

    A noninferiority study is often used to investigate whether a treatment's efficacy or safety profile is acceptable compared with an alternative therapy regarding the time to a clinical event. The empirical quantification of the treatment difference for such a study is routinely based on the hazard ratio (HR) estimate. The HR, which is not a relative risk, may be difficult to interpret clinically, especially when the underlying proportional hazards assumption is violated. The precision of the HR estimate depends primarily on the number of observed events but not directly on exposure times or sample size of the study population. If the event rate is low, the study may require an impractically large number of events to ensure that the prespecified noninferiority criterion for the HR is attainable. This article discusses deficiencies in the current approach for the design and analysis of a noninferiority study. Alternative procedures are provided, which do not depend on any model assumption, to compare 2 treatments. For a noninferiority safety study, the patients' exposure times are more clinically important than the observed number of events. If the patients' exposure times are long enough to evaluate safety reliably, then these alternative procedures can effectively provide clinically interpretable evidence on safety, even with relatively few observed events. These procedures are illustrated with data from 2 studies. One explores the cardiovascular safety of a pain medicine; the second examines the cardiovascular safety of a new treatment for diabetes. These alternative strategies to evaluate safety or efficacy of an intervention lead to more meaningful interpretations of the analysis results than the conventional strategy that uses the HR estimate. PMID:26054047
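
    The point made above, that HR precision is driven by event counts rather than sample size, can be made concrete with the usual approximation SE(log HR) ≈ sqrt(1/d1 + 1/d2); the sketch below shows how the confidence interval tightens only as events accumulate (illustrative numbers, not from the cited trials).

```python
import math

def hr_ci(hr, events_treatment, events_control):
    """Approximate 95% CI for a hazard ratio.

    The standard error of log(HR) is roughly sqrt(1/d1 + 1/d2), where d1 and
    d2 are the observed event counts per arm; it does not depend directly on
    the number of randomized patients or their exposure times.
    """
    se = math.sqrt(1 / events_treatment + 1 / events_control)
    return (math.exp(math.log(hr) - 1.96 * se),
            math.exp(math.log(hr) + 1.96 * se))

# Same point estimate, very different precision depending on event counts:
for d in (30, 120, 600):
    lo, hi = hr_ci(1.10, d, d)
    print(f"{d} events/arm: HR 1.10 (95% CI {lo:.2f}-{hi:.2f})")
```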

  3. Social Uptake of Scientific Understanding of Seismic Hazard in Sumatra and Cascadia

    NASA Astrophysics Data System (ADS)

    Shannon, R.; McCloskey, J.; Guyer, C.; McDowell, S.; Steacy, S.

    2007-12-01

    The importance of science within hazard mitigation cannot be overstated. Robust mitigation policies rely strongly on a sound understanding of the science underlying potential natural disasters and the transfer of that knowledge from the scientific community to the general public via governments and policy makers. We aim to investigate how and why the public's knowledge, perceptions, responses, adjustments and values towards science have changed throughout two decades of research conducted in areas along and adjacent to the Sumatran and Cascadia subduction zones. We will focus on two countries subject to the same potential hazard, but which encompass starkly contrasting political, economic, social and environmental settings. The transfer of scientific knowledge into the public/social arena is a complex process, the success of which is reflected in a community's ability to withstand large-scale devastating events. Although no one could have foreseen the magnitude of the 2004 Boxing Day tsunami, the social devastation it generated underscored the stark absence of mitigation measures in the nations most heavily affected. It furthermore emphasized the need for the design and implementation of disaster preparedness measures. A survey of the existing literature has already established timelines for major events and public policy changes in the case study areas. Clear evidence exists of the link between scientific knowledge and its subsequent translation into public policy, particularly in the Cascadia context. The initiation of the National Tsunami Hazard Mitigation Program following the Cape Mendocino earthquake in 1992 embodies this link. Despite a series of environmental disasters with recorded widespread fatalities dating back to the mid 1900s, and a heightened impetus for scientific research into tsunami/earthquake hazard following the 2004 Boxing Day tsunami, the translation of science into the public realm is not widely evident in the Sumatran context. This research aims to further investigate how the enhanced understanding of earthquake and tsunami hazards is being used to direct hazard mitigation strategies, and enables a direct comparison with the scientific and public policy developments in Cascadia.

  4. 44 CFR 66.3 - Establishment of community case file and flood elevation study docket.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...Establishment of community case file and flood elevation study docket. 66.3 Section...INSURANCE AND HAZARD MITIGATION National Flood Insurance Program CONSULTATION WITH LOCAL...Establishment of community case file and flood elevation study docket. (a)...

  5. 44 CFR 66.3 - Establishment of community case file and flood elevation study docket.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...Establishment of community case file and flood elevation study docket. 66.3 Section...INSURANCE AND HAZARD MITIGATION National Flood Insurance Program CONSULTATION WITH LOCAL...Establishment of community case file and flood elevation study docket. (a)...

  6. 44 CFR 66.3 - Establishment of community case file and flood elevation study docket.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ...Establishment of community case file and flood elevation study docket. 66.3 Section...INSURANCE AND HAZARD MITIGATION National Flood Insurance Program CONSULTATION WITH LOCAL...Establishment of community case file and flood elevation study docket. (a)...

  7. 44 CFR 66.3 - Establishment of community case file and flood elevation study docket.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ...Establishment of community case file and flood elevation study docket. 66.3 Section...INSURANCE AND HAZARD MITIGATION National Flood Insurance Program CONSULTATION WITH LOCAL...Establishment of community case file and flood elevation study docket. (a)...

  8. Engaging Students in Active Learning: Case Studies in Geography, Environment

    E-print Network

    Al-Qahtani, Mohammad

    Hazard mitigation practical: predicting a volcanic eruption (Phil Gravestock). From: Engaging Students in Active Learning: Case Studies in Geography, Environment and Related Disciplines. Copies of this publication are available from this address.

  9. 44 CFR 65.5 - Revision to special hazard area boundaries with no change to base flood elevation determinations.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...hazard area boundaries with no change to base flood elevation determinations. 65.5 Section...INSURANCE AND HAZARD MITIGATION National Flood Insurance Program IDENTIFICATION AND...hazard area boundaries with no change to base flood elevation determinations. (a)...

  10. 44 CFR 65.5 - Revision to special hazard area boundaries with no change to base flood elevation determinations.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...hazard area boundaries with no change to base flood elevation determinations. 65.5 Section...INSURANCE AND HAZARD MITIGATION National Flood Insurance Program IDENTIFICATION AND...hazard area boundaries with no change to base flood elevation determinations. (a)...

  11. A GIS Model Application Supporting the Analysis of the Seismic Hazard for the Urban Area of Catania (Italy)

    Microsoft Academic Search

    S. Grasso; M. Maugeri

    2002-01-01

    After the Summit held in Washington on August 20-22 2001 to plan the first World Conference on the mitigation of Natural Hazards, a Group for the analysis of Natural Hazards within the Mediterranean area has been formed. The Group has so far determined the following hazards: (1) Seismic hazard (hazard for historical buildings included); (2) Hazard linked to the quantity

  12. Mitigating tourism seasonality

    Microsoft Academic Search

    2004-01-01

    Seasonality is one of the most problematic but least understood aspects of tourism. Many destinations are suffering from this phenomenon every year, yet limited efforts have been made to overcome the troublesome issue. This research proposed an approach to obtaining quantitative solutions which can ultimately assist marketers in mitigating seasonal effects. The study applied financial portfolio theory, widely used in

  13. GIS-based pollution hazard mapping and assessment framework of shallow lakes: southeastern Pampean lakes (Argentina) as a case study.

    PubMed

    Romanelli, A; Esquius, K S; Massone, H E; Escalante, A H

    2013-08-01

    The assessment of water vulnerability and pollution hazard traditionally places particular emphasis on the study of groundwater rather than surface waters. Consequently, a GIS-based Lake Pollution Hazard Index (LPHI) was proposed for assessing and mapping the potential pollution hazard for shallow lakes due to the interaction between the Potential Pollutant Load and the Lake Vulnerability. It includes easily measurable and commonly used parameters: land cover, terrain slope and direction, and soil media. Three shallow lake ecosystems of the southeastern Pampa Plain (Argentina) were chosen to test the usefulness and applicability of this index. Moreover, the influence of the anthropogenic and natural environment on biophysical parameters in these three ecosystems was examined. The evaluation of the LPHI map shows the highest pollution hazard for La Brava and Los Padres lakes (~30% in the high to very high category), while Nahuel Rucá Lake seems to be the least hazardous water body (just 9.33% with high LPHI). The increase in LPHI value is attributed to a different loading of pollutants governed by land cover category and/or the exposure to high slopes and the influence of slope direction. Dissolved oxygen and biochemical oxygen demand values indicate a moderately polluted and eutrophied condition of the shallow lake waters, mainly related to moderate agricultural activities and/or cattle production. The information obtained by means of the LPHI calculation is useful for performing a local diagnosis of the potential pollution hazard to a freshwater ecosystem, in order to implement basic guidelines to improve lake sustainability. PMID:23355019
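
    The index construction described above combines a pollutant-load rating with a vulnerability rating per grid cell; the sketch below shows one simple way such ratings could be combined and classified. The ratings are made up and the weighting may differ from the published LPHI.

```python
import numpy as np

# Illustrative 1-5 ratings per grid cell (higher = worse); the real study
# derives these from land cover, terrain slope, slope direction and soil media.
pollutant_load = np.array([[2, 4, 5], [1, 3, 4], [1, 2, 3]])  # e.g. from land cover
vulnerability  = np.array([[3, 3, 4], [2, 4, 5], [1, 2, 4]])  # e.g. slope + soil

# One simple way to combine them into a 0-1 hazard index; the published
# LPHI weighting scheme may differ.
lphi = (pollutant_load * vulnerability) / 25.0
categories = np.digitize(lphi, bins=[0.2, 0.4, 0.6, 0.8])  # 0=very low ... 4=very high

print(lphi.round(2))
print(categories)

share_high = (categories >= 3).mean() * 100
print(f"share of cells with high/very high hazard: {share_high:.1f}%")
```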

  14. Property-close source separation of hazardous waste and waste electrical and electronic equipment - A Swedish case study

    SciTech Connect

    Bernstad, Anna, E-mail: anna.bernstad@chemeng.lth.se [Dep. of Chem. Eng., Faculty of Eng., Lund University, Lund (Sweden); Cour Jansen, Jes la [Dep. of Chem. Eng., Faculty of Eng., Lund University, Lund (Sweden); Aspegren, Henrik [VA SYD, City of Malmoe (Sweden)

    2011-03-15

    Through an agreement with EEE producers, Swedish municipalities are responsible for the collection of hazardous waste and waste electrical and electronic equipment (WEEE). In most Swedish municipalities, collection of these waste fractions is concentrated at waste recycling centres where households can source-separate and deposit hazardous waste and WEEE free of charge. However, the centres are often located on the outskirts of city centres and cars are needed in order to use the facilities in most cases. A full-scale experiment was performed in a residential area in southern Sweden to evaluate the effects of a system for property-close source separation of hazardous waste and WEEE. After the system was introduced, results show a clear reduction in the amount of hazardous waste and WEEE disposed of incorrectly amongst residual waste or dry recyclables. The system resulted in a source separation ratio of 70 wt% for hazardous waste and 76 wt% in the case of WEEE. Results show that households in the study area were willing to increase source separation of hazardous waste and WEEE when accessibility was improved and that this and similar collection systems can play an important role in building up increasingly sustainable solid waste management systems.

  15. New approach to inventorying army hazardous materials. A study done for the Eighth U. S. Army, Korea. Volume 2. Hazardous-material data. Final report

    SciTech Connect

    Kim, B.J.; Gee, C.S.; Lee, Y.H.; Mikulich, L.R.; Grafmyer, D.E.

    1991-09-01

    The goal of the Army hazardous waste minimization program is to achieve a 50 percent reduction of the hazardous waste generated by calendar year 1992 (CY92), as compared to baseline CY85. A first step in achieving effective hazardous waste management is to conduct a thorough hazardous material inventory. Volume I describes a method created to inventory hazardous material by collecting supply data from the Logistics Control Activity (LCA) at the Presidio, San Francisco, CA, and comparing these data with the Material Safety Data Sheets (MSDS) in the Hazardous Material Information System (HMIS). Volume II lists hazardous material data collected for the Eighth U.S. Army (EUSA), Korea. Common elements between the two data bases were compiled, analyzed, and validated. It was found that the intersection of the two data bases created a composite list that substantially reduced the number of nonhazardous wastes included in the individual lists. This method may also be applied to supply data from other Army installations.

  16. Conforth Ranch Wildlife Mitigation Feasibility Study, McNary, Oregon : Annual Report.

    SciTech Connect

    Rasmussen, Larry; Wright, Patrick; Giger, Richard

    1991-03-01

    The 2,860-acre Conforth Ranch near Umatilla, Oregon is being considered for acquisition and management to partially mitigate wildlife losses associated with McNary Hydroelectric Project. The Habitat Evaluation Procedures (HEP) estimated that management for wildlife would result in habitat unit gains of 519 for meadowlark, 420 for quail, 431 for mallard, 466 for Canada goose, 405 for mink, 49 for downy woodpecker, 172 for yellow warbler, and 34 for spotted sandpiper. This amounts to a total combined gain of 2,495 habitat units -- a 110 percent increase over the existing values for these species combined of 2,274 habitat units. Current water delivery costs, estimated at $50,000 per year, are expected to increase to $125,000 per year. A survey of local interest indicated a majority of respondents favored the concept with a minority opposed. No contaminants that would preclude the Fish and Wildlife Service from agreeing to accept the property were identified. 21 refs., 3 figs., 5 tabs.

  17. Study of cover source mismatch in steganalysis and ways to mitigate its impact

    NASA Astrophysics Data System (ADS)

    Kodovský, Jan; Sedighi, Vahid; Fridrich, Jessica

    2014-02-01

    When a steganalysis detector trained on one cover source is applied to images from a different source, generally the detection error increases due to the mismatch between both sources. In steganography, this situation is recognized as the so-called cover source mismatch (CSM). The drop in detection accuracy depends on many factors, including the properties of both sources, the detector construction, the feature space used to represent the covers, and the steganographic algorithm. Although well recognized as the single most important factor negatively affecting the performance of steganalyzers in practice, the CSM received surprisingly little attention from researchers. One of the reasons for this is the diversity with which the CSM can manifest. On a series of experiments in the spatial and JPEG domains, we refute some of the common misconceptions that the severity of the CSM is tied to the feature dimensionality or their "fragility." The CSM impact on detection appears too difficult to predict due to the effect of complex dependencies among the features. We also investigate ways to mitigate the negative effect of the CSM using simple measures, such as by enlarging the diversity of the training set (training on a mixture of sources) and by employing a bank of detectors trained on multiple different sources and testing on a detector trained on the closest source.

  18. Viscoelastic Materials Study for the Mitigation of Blast-Related Brain Injury

    NASA Astrophysics Data System (ADS)

    Bartyczak, Susan; Mock, Willis, Jr.

    2011-06-01

    Recent preliminary research into the causes of blast-related brain injury indicates that exposure to blast pressures, such as from IED detonation or multiple firings of a weapon, causes damage to brain tissue resulting in Traumatic Brain Injury (TBI) and Post Traumatic Stress Disorder (PTSD). Current combat helmets are not sufficient to protect the warfighter from this danger and the effects are debilitating, costly, and long-lasting. Commercially available viscoelastic materials, designed to dampen vibration caused by shock waves, might be useful as helmet liners to dampen blast waves. The objective of this research is to develop an experimental technique to test these commercially available materials when subject to blast waves and evaluate their blast mitigating behavior. A 40-mm-bore gas gun is being used as a shock tube to generate blast waves (ranging from 1 to 500 psi) in a test fixture at the gun muzzle. A fast opening valve is used to release nitrogen gas from the breech to impact instrumented targets. The targets consist of aluminum/ viscoelastic polymer/ aluminum materials. Blast attenuation is determined through the measurement of pressure and accelerometer data in front of and behind the target. The experimental technique, calibration and checkout procedures, and results will be presented.

  19. Remedial Action Assessment System (RAAS): Evaluation of selected feasibility studies of CERCLA (Comprehensive Environmental Response, Compensation, and Liability Act) hazardous waste sites

    SciTech Connect

    Whelan, G. (Pacific Northwest Lab., Richland, WA (USA)); Hartz, K.E.; Hilliard, N.D. (Beck (R.W.) and Associates, Seattle, WA (USA))

    1990-04-01

    Congress and the public have mandated much closer scrutiny of the management of chemically hazardous and radioactive mixed wastes. Legislative language, regulatory intent, and prudent technical judgment call for using scientifically based studies to assess current conditions and to evaluate and select cost-effective strategies for mitigating unacceptable situations. The NCP requires that a Remedial Investigation (RI) and a Feasibility Study (FS) be conducted at each site targeted for remedial response action. The goal of the RI is to obtain the site data needed so that the potential impacts on public health or welfare or on the environment can be evaluated and so that the remedial alternatives can be identified and selected. The goal of the FS is to identify and evaluate alternative remedial actions (including a no-action alternative) in terms of their cost, effectiveness, and engineering feasibility. The NCP also requires the analysis of impacts on public health and welfare and on the environment; this analysis is the endangerment assessment (EA). In summary, the RI, EA, and FS processes require assessment of the contamination at a site, of the potential impacts on public health or the environment from that contamination, and of alternative RAs that could address potential impacts to the environment. 35 refs., 7 figs., 1 tab.

  20. Case study: Mapping tsunami hazards associated with debris flow into a reservoir

    USGS Publications Warehouse

    Walder, J.S.; Watts, P.; Waythomas, C.F.

    2006-01-01

    Debris-flow generated impulse waves (tsunamis) pose hazards in lakes, especially those used for hydropower or recreation. We describe a method for assessing tsunami-related hazards for the case in which inundation by coherent water waves, rather than chaotic splashing, is of primary concern. The method involves an experimentally based initial condition (tsunami source) and a Boussinesq model for tsunami propagation and inundation. Model results are used to create hazard maps that offer guidance for emergency planners and responders. An example application explores tsunami hazards associated with potential debris flows entering Baker Lake, a reservoir on the flanks of the Mount Baker volcano in the northwestern United States. © 2006 ASCE.

  1. Status of volcanic hazard studies for the Nevada Nuclear Waste Storage Investigations. Volume II

    SciTech Connect

    Crowe, B.M.; Wohletz, K.H.; Vaniman, D.T.; Gladney, E.; Bower, N.

    1986-01-01

    Volcanic hazard investigations during FY 1984 focused on five topics: the emplacement mechanism of shallow basalt intrusions, geochemical trends through time for volcanic fields of the Death Valley-Pancake Range volcanic zone, the possibility of bimodal basalt-rhyolite volcanism, the age and process of enrichment for incompatible elements in young basalts of the Nevada Test Site (NTS) region, and the possibility of hydrovolcanic activity. The stress regime of Yucca Mountain may favor formation of shallow basalt intrusions. However, combined field and drill-hole studies suggest shallow basalt intrusions are rare in the geologic record of the southern Great Basin. The geochemical patterns of basaltic volcanism through time in the NTS region provide no evidence for evolution toward a large-volume volcanic field or increases in future rates of volcanism. Existing data are consistent with a declining volcanic system comparable to the late stages of the southern Death Valley volcanic field. The hazards of bimodal volcanism in this area are judged to be low. The source of a 6-Myr pumice discovered in alluvial deposits of Crater Flat has not been found. Geochemical studies show that the enrichment of trace elements in the younger rift basalts must be related to an enrichment of their mantle source rocks. This geochemical enrichment event, which may have been metasomatic alteration, predates the basalts of the silicic episode and is, therefore, not a young event. Studies of crater dimensions of hydrovolcanic landforms indicate that the worst case scenario (exhumation of a repository at Yucca Mountain by hydrovolcanic explosions) is unlikely. Theoretical models of melt-water vapor explosions, particularly the thermal detonation model, suggest hydrovolcanic explosions are possible at Yucca Mountain. 80 refs., 21 figs., 5 tabs.

  2. Geomorphological surveys and software simulations for rock fall hazard assessment: a case study in the Italian Alps

    NASA Astrophysics Data System (ADS)

    Devoto, S.; Boccali, C.; Podda, F.

    2014-12-01

    In northern Italy, fast-moving landslides represent a significant threat to the population and human facilities. In the eastern portion of the Italian Alps, rock falls are recurrent and are often responsible for casualties or severe damage to roads and buildings. This type of landslide is frequent in mountain ranges characterised by strong relief energy and is triggered by earthquakes or copious rainfall, which often exceeds 2000 mm yr-1. These factors cause morphological dynamics with intense slope erosion and degradation processes. This work investigates the appraisal of the rock-fall hazard related to the presence of several large unstable blocks located at the top of a limestone peak, approximately 500 m NW of the village of Cimolais. Field surveys recognised a limestone block exceeding a volume of 400 m3 and identified this block as the most hazardous for Cimolais because of its proximity to the rocky cliff. A first assessment of the possible transit and stopping areas was carried out through in-depth traditional activities, such as geomorphological mapping and aerial photo analysis. The output of the field surveys was a detailed land use map, which provided a fundamental starting point for the rock fall software analysis. The geomorphological observations were correlated with DTMs derived from regional topography and Airborne Laser Scanning (ALS) surveys to recognise possible rock fall routes. To properly simulate rock fall trajectories with a hybrid computer program, particular attention was devoted to the correct quantification of input parameters, such as restitution coefficients and the horizontal acceleration associated with earthquakes, which historically occur in this part of Italy. The simulation outputs regarding the distribution of rock fall end points and kinetic energy along the falling paths highlight the hazardous situation for Cimolais. For this reason, mitigation works have been suggested to immediately reduce the landslide risk. This proposal accounts for the large volume of the blocks, which, in the case of a fall, would render the passive mitigation measures already in place behind Cimolais ineffective.
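
    Restitution coefficients of the kind calibrated above control how much velocity a block retains at each impact; the sketch below shows the resulting decay of rebound heights under a deliberately simplified one-dimensional model, with indicative literature values rather than site-calibrated ones.

```python
import math

def bounce_heights(drop_height_m, restitution_normal, n_impacts=5):
    """Very simplified rebound model: after each impact the normal velocity is
    scaled by the restitution coefficient, so the rebound height decays as
    h_k = h_0 * Rn^(2k). Real rock fall codes track full 2D/3D trajectories
    over a DEM."""
    g = 9.81
    v = math.sqrt(2 * g * drop_height_m)
    heights = []
    for _ in range(n_impacts):
        v *= restitution_normal
        heights.append(v ** 2 / (2 * g))
    return heights

# Rebound of a block falling 40 m onto scree (Rn ~ 0.35) vs bare rock (Rn ~ 0.55);
# the coefficients are indicative literature values, not site-calibrated ones.
for rn in (0.35, 0.55):
    hs = ", ".join(f"{h:.1f}" for h in bounce_heights(40, rn))
    print(f"Rn={rn}: rebound heights (m): {hs}")
```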

  3. The Relative Severity of Single Hazards within a Multi-Hazard Framework

    NASA Astrophysics Data System (ADS)

    Gill, Joel C.; Malamud, Bruce D.

    2013-04-01

    Here we present a description of the relative severity of single hazards within a multi-hazard framework, compiled by examining, quantifying and ranking the extent to which individual hazards trigger or increase the probability of other hazards. Hazards are broken up into six major groupings (geophysical, hydrological, shallow earth processes, atmospheric, biophysical and space), with the interactions for 21 different hazard types examined. These interactions include both one primary hazard triggering a secondary hazard, and one primary hazard increasing the probability of a secondary hazard occurring. We identify, through a wide-ranging review of grey and peer-reviewed literature, >90 interactions. The number of hazard-type linkages is then summed for each hazard in terms of its influence (the number of times one hazard type triggers another type of hazard, or itself) and its sensitivity (the number of times one hazard type is triggered by other hazard types, or itself). The 21 different hazards are then ranked based on (i) influence and (ii) sensitivity. We found, by quantification and ranking of these hazards, that: (i) The strongest influencers (those triggering the most secondary hazards) are volcanic eruptions, earthquakes and storms, which when taken together trigger almost a third of the possible hazard interactions identified; (ii) The most sensitive hazards (those being triggered by the most primary hazards) are identified to be landslides, volcanic eruptions and floods; (iii) When sensitivity rankings are adjusted to take into account the differential likelihoods of different secondary hazards being triggered, the most sensitive hazards are found to be landslides, floods, earthquakes and ground heave. We believe that by determining the strongest influencing and the most sensitive hazards for specific spatial areas, the allocation of resources for mitigation measures might be done more effectively.
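
    The influence and sensitivity rankings described above amount to row and column sums of a hazard-interaction matrix; the sketch below reproduces the bookkeeping on a reduced, illustrative five-hazard matrix rather than the paper's full 21-hazard version.

```python
import numpy as np

# Toy triggering matrix for 5 hazard types (the paper uses 21). Entry [i, j] = 1
# means hazard i can trigger, or raise the probability of, hazard j.
hazards = ["earthquake", "volcanic eruption", "storm", "landslide", "flood"]
M = np.array([
    [0, 0, 0, 1, 1],   # earthquake -> landslide, flood (e.g. dam failure)
    [1, 0, 0, 1, 1],   # volcanic eruption -> earthquake, landslide, flood
    [0, 0, 0, 1, 1],   # storm -> landslide, flood
    [0, 0, 0, 0, 1],   # landslide -> flood (landslide dam)
    [0, 0, 0, 1, 0],   # flood -> landslide (bank erosion)
])

influence = M.sum(axis=1)    # how many hazard types each hazard can trigger
sensitivity = M.sum(axis=0)  # by how many hazard types each hazard is triggered

for name, inf, sens in zip(hazards, influence, sensitivity):
    print(f"{name:18s} influence={inf}  sensitivity={sens}")

print("most influential:", hazards[int(influence.argmax())])
print("most sensitive:  ", hazards[int(sensitivity.argmax())])
```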

  4. UAV-based Natural Hazard Management in High-Alpine Terrain - Case Studies from Austria

    NASA Astrophysics Data System (ADS)

    Sotier, Bernadette; Adams, Marc; Lechner, Veronika

    2015-04-01

    Unmanned Aerial Vehicles (UAV) have become a standard tool for geodata collection, as they allow conducting on-demand mapping missions in a flexible, cost-effective manner at an unprecedented level of detail. Easy-to-use, high-performance image matching software makes it possible to process the collected aerial images to orthophotos and 3D-terrain models. Such up-to-date geodata have proven to be an important asset in natural hazard management: Processes like debris flows, avalanches, landslides, fluvial erosion and rock-fall can be detected and quantified; damages can be documented and evaluated. In the Alps, these processes mostly originate in remote areas, which are difficult and hazardous to access, thus presenting a challenging task for RPAS data collection. In particular, the problems include finding suitable landing and piloting places, dealing with bad or no GPS-signals and the installation of ground control points (GCP) for georeferencing. At the BFW, RPAS have been used since 2012 to aid natural hazard management of various processes, of which three case studies are presented below. The first case study deals with the results from an attempt to employ UAV-based multi-spectral remote sensing to monitor the state of natural hazard protection forests. Images in the visible and near-infrared (NIR) band were collected using modified low-cost cameras, combined with different optical filters. Several UAV-flights were performed in the 72 ha study site in 2014, which lies in the Wattental, Tyrol (Austria) between 1700 and 2050 m a.s.l., where the main tree species are stone pine and mountain pine. The matched aerial images were analysed using different UAV-specific vitality indices, evaluating both single- and dual-camera UAV-missions. To calculate the mass balance of a debris flow in the Tyrolean Halltal (Austria), an RPAS flight was conducted in autumn 2012. The extreme alpine environment was challenging for both the mission and the evaluation of the aerial images: In the upper part of the steep channel there was no GPS-signal available because of the high surrounding rock faces, and the landing area consisted of coarse gravel. Therefore, only a manual flight with a high risk of damage was possible. With the calculated RPAS-based digital surface model, created from the 600 aerial images, a chronologically resolved back-calculation of the last big debris-flow event could be performed. In a third case study, aerial images from RPAS were used for a similar investigation in Virgen, Eastern Tyrol (Austria). A debris flow in the Firschnitzbach catchment caused severe damage to the village of Virgen in August 2012. An RPAS-flight was performed in order to refine the estimated displaced debris mass for assessment purposes. The upper catchment of the Firschnitzbach is situated above the timberline and covers an area of 6.5 ha at a height difference of 1000 m. Therefore, three separate flights were necessary to achieve a sufficient image overlap. The central part of the Firschnitzbach consists of a steep and partly densely forested canyon/gorge, so no flight has been possible for this section so far. The evaluation of the surface model from the images showed that only half of the estimated debris mass came from the upper part of the catchment.

  5. Eco-efficiency for greenhouse gas emissions mitigation of municipal solid waste management: A case study of Tianjin, China

    SciTech Connect

    Zhao Wei, E-mail: zhaowei.tju@gmail.com [College of Civil Engineering and Architecture, Liaoning University of Technology, 121000 Jinzhou (China); Institute of Environmental Sciences (CML), Leiden University, P.O. Box 9518, 2300RA Leiden (Netherlands); Huppes, Gjalt, E-mail: huppes@cml.leidenuniv.nl [Institute of Environmental Sciences (CML), Leiden University, P.O. Box 9518, 2300RA Leiden (Netherlands); Voet, Ester van der, E-mail: Voet@cml.leidenuniv.nl [Institute of Environmental Sciences (CML), Leiden University, P.O. Box 9518, 2300RA Leiden (Netherlands)

    2011-06-15

    The issue of municipal solid waste (MSW) management has been highlighted in China due to the continually increasing MSW volumes being generated and the limited capacity of waste treatment facilities. This article presents a quantitative eco-efficiency (E/E) analysis of MSW management in terms of greenhouse gas (GHG) mitigation. A methodology for E/E analysis has been proposed, with an emphasis on the consistent integration of life cycle assessment (LCA) and life cycle costing (LCC). The environmental and economic impacts derived from LCA and LCC have been normalized and defined as a quantitative E/E indicator. The proposed method was applied in a case study of Tianjin, China. The study assessed the current MSW management system, as well as a set of alternative scenarios, to investigate trade-offs between economy and GHG emissions mitigation. Additionally, contribution analysis was conducted on both LCA and LCC to identify key issues driving environmental and economic impacts. The results show that the current Tianjin MSW management system emits the most GHG and costs the least, whereas the situation reverses in the integrated scenario. The key issues identified by the contribution analysis show no linear relationship between the global warming impact and the cost impact in the MSW management system. The landfill gas utilization scenario is indicated as a potential optimum scenario by the proposed E/E analysis, given the characteristics of MSW, technology levels, and chosen methodologies. The E/E analysis provides an attractive direction towards sustainable waste management, though some questions with respect to uncertainty need to be discussed further.

  6. Seismic hazard analysis application of methodology, results, and sensitivity studies. Volume 4

    Microsoft Academic Search

    Bernreuter

    1981-01-01

    As part of the Site Specific Spectra Project, this report seeks to identify the sources of and minimize uncertainty in estimates of seismic hazards in the Eastern United States. Findings are being used by the Nuclear Regulatory Commission to develop a synthesis among various methods that can be used in evaluating seismic hazard at the various plants in the Eastern

  7. Incorporating natural hazard assessments into municipal master-plans; case-studies from Israel

    NASA Astrophysics Data System (ADS)

    Katz, Oded

    2010-05-01

    The active Dead Sea Rift (DSR) runs along the length of Israel, making the entire state susceptible to earthquake-related hazards. Current building codes generally acknowledge seismic hazards and direct engineers towards earthquake-resistant structures. However, hazard mapping on a scale fit for municipal/governmental planning is subject to local initiative and is currently not mandatory, although it seems it should be. In the following, a few cases of seismic-hazard evaluation made by the Geological Survey of Israel are presented, emphasizing the reasons for their initiation and the way the results were incorporated (or not). The first case is a qualitative seismic-hazard micro-zonation commissioned by the municipality of Jerusalem as part of a new master plan. This work resulted in maps (1:50,000; GIS format) identifying areas prone to (1) amplification of seismic shaking due to site characteristics (outcrops of soft rocks or steep topography) and (2) sites with earthquake-induced landslide (EILS) hazard. Results were validated using reports from the 1927 M=6.2 earthquake that originated along the DSR about 30 km east of Jerusalem. Although the hazard maps were accepted by municipal authorities, practical use by geotechnical engineers working within the frame of the new master-plan was not significant. The main reason for that is apparently a difference of opinion between the city engineers responsible for implementing the new master-plan and the geologists responsible for the hazard evaluation. The second case involves evaluation of EILS hazard for two towns located further north along the DSR, Zefat and Tiberias. Both were heavily damaged more than once by strong earthquakes in past centuries. Work was carried out as part of a governmental seismic-hazard reduction program. The results include maps (1:10,000 scale) of sites with high EILS hazard identified within city limits. Maps (in GIS format) were sent to city engineers with reports explaining the methods and results. As far as we know, widespread implementation of the maps within municipal master plans never came about, and there was no open discussion between city engineers and the Geological Survey. The main reasons apparently are (1) a lack, until recently, of mandatory building codes requiring incorporation of EILS hazard; (2) budget priorities; (3) failure to involve municipality personnel in planning and executing the EILS hazard evaluation. These cases demonstrate that for seismic hazard data to be incorporated and implemented within municipal master-plans there needs to be (1) active involvement of municipal officials and engineers from the early planning stages of the evaluation campaign, and (2) a priori dedication of funds towards implementation of evaluation results.

  8. The Study on Ecological Treatment of Saline Lands to Mitigate the Effects of Climate Change

    NASA Astrophysics Data System (ADS)

    Xie, Jiancang; Zhu, Jiwei; Wang, Tao

    2010-05-01

    The movement of soil water and salt is strongly influenced by frequent droughts, floods and climate change. Additionally, with continued population growth, large-scale reclamation of arable land and long-term unreasonable irrigation, saline land is increasing at a rate of 1,000,000~15,000,000 mu each year worldwide. Traditional management, with "drainage as the main measure", has a series of problems: larger engineering works, greater occupation of land, poor water saving and downstream pollution. In response to global climate change, it has become a common understanding to promote energy saving and environmental protection, rethink the current model and explore ecological management models. In this paper, we take a severely saline area, Lubotan in Shaanxi Province, as an example. Through nearly 10 years of harnessing practice and observation of meteorological, hydrological and soil indicators, we analyze the influence of climate change on soil salinity movement in different seasons and years, and then put forward and apply a new model of saline land harnessing that mitigates the effects of climate change and allows the environment to self-rehabilitate. This model changes "drainage" into "storage": through engineering with "storage as the main measure" and comprehensive "project - biology - agriculture" measures, saline land is being converted into arable land. By adapting to natural changes in climate, rainfall, irrigation backwater and groundwater level, human intervention is reduced to achieve a dynamic equilibrium of the system. Over the ten years, the salt content of the plough horizon has decreased from 0.74% to 0.20%, organic matter has increased from 0.7% to 0.92%, and various soil indicators are beginning to improve. At the same time, irrigation water use, drainage pollution and investment costs have been reduced. Using this model, 18,900 mu of severely saline land was reclaimed and 16,500 mu of new cultivated land was added, with significant overall benefits, ensuring the coordinated development of "water - biology - environment" in the region. Application and promotion of the model can treat saline-alkali land and add cultivated land effectively, while easing the pressure on urban construction land and promoting energy saving, emission reduction and ecological restoration, so that a resource-saving and environment-friendly society can be built and sustainable development of population, resources and environment realized.

  9. A comparative study of frequency ratio, statistical index and poisson method for landslide hazard mapping along East-West highway

    NASA Astrophysics Data System (ADS)

    Azizat, Nazirah; Lateh, Habibah; Tay, Lea Tien; Yusoff, Izham Mohamad

    2015-05-01

    The purpose of this study is to evaluate the landslide hazard along the East-West Highway (Gerik-Jeli) using the frequency ratio, statistical index and Poisson methods. Historical landslide data in this area were identified by field work from 2007 to 2012. The factors chosen for producing the landslide hazard map were geology, rainfall, slope angle, soil texture, stream, profile curvature, plan curvature, NDVI, fault line, elevation, age of rock and aspect. The results of the analysis for these three methods were verified using actual landslide location data. The validation results showed satisfactory agreement between the landslide hazard map and the existing data on landslide locations. The influence of each factor was also estimated using sensitivity analysis.
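    The frequency ratio method compares the share of landslide occurrences in each class of a conditioning factor with the share of total area in that class. The sketch below shows the basic arithmetic for one factor (slope-angle classes); all counts are hypothetical and only meant to illustrate the technique named in the abstract.

```python
# Sketch of the frequency ratio (FR) calculation for one conditioning factor
# (e.g. slope-angle classes). FR > 1 marks a class where landslides are
# over-represented relative to its areal extent. Counts are hypothetical.

classes = {
    # class: (landslide cells in class, total cells in class)
    "0-15 deg":  (12, 40000),
    "15-30 deg": (55, 35000),
    "30-45 deg": (88, 20000),
    ">45 deg":   (25,  5000),
}

total_slides = sum(s for s, _ in classes.values())
total_cells = sum(c for _, c in classes.values())

for name, (slides, cells) in classes.items():
    fr = (slides / total_slides) / (cells / total_cells)
    print(f"{name:10s} FR = {fr:.2f}")

# A cell's landslide hazard index is then the sum of the FR values of the
# classes it falls into, one per factor (geology, rainfall, NDVI, ...).
```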

  10. Study of the effect of involuntary user movement on the potential light hazards from some ophthalmic instruments.

    PubMed

    Landry, Robert; Wolffe, Michael; Burrows, Clive; Rassow, Bernhard; Byrnes, Gordon

    2004-03-10

    A study was undertaken to determine whether involuntary user movement provides a basis for relaxing the measurement conditions for evaluating the potential optical radiation hazards to the eye from slit lamps and indirect ophthalmoscopes. This was accomplished by assessment of the extent to which light from these devices can be maintained in focus on a 1-mm-diameter fiber-optic cable for 45 s. The results suggest that, although involuntary user movements can be significant, they do not provide a basis for relaxing the measurement conditions for evaluating the potential optical radiation hazards to the cornea and lens from slit lamps and indirect ophthalmoscopes. PMID:15046166

  11. Safety in earth orbit study. Volume 2: Analysis of hazardous payloads, docking, on-board survivability

    NASA Technical Reports Server (NTRS)

    1972-01-01

    Detailed and supporting analyses are presented of the hazardous payloads, docking, and on-board survivability aspects connected with earth orbital operations of the space shuttle program. The hazards resulting from delivery, deployment, and retrieval of hazardous payloads, and from handling and transport of cargo between orbiter, sortie modules, and space station are identified and analyzed. The safety aspects of shuttle orbiter to modular space station docking includes docking for assembly of space station, normal resupply docking, and emergency docking. Personnel traffic patterns, escape routes, and on-board survivability are analyzed for orbiter with crew and passenger, sortie modules, and modular space station, under normal, emergency, and EVA and IVA operations.

  12. Exploratory study of burn time, duty factor, and fluence on ITER activation hazards

    SciTech Connect

    Piet, S.J.

    1992-08-01

    The safety analyses for the Conceptual Design Activity (CDA) of the International Thermonuclear Experimental Reactor (ITER) were based on the simplifying assumption that the activation of materials occurs continuously. Since the analyses showed a significant hazard, it is appropriate to examine how much hazard reduction might occur if this conservative assumption were relaxed. This report explores how much reduction might be gained by considering non-continuous operation, that is, by considering plasma burn time, duty factor, and integrated fluence. Other factors impacting activation hazards - material choice, flux, and size - are not considered here.

  13. Household hazardous wastes as a potential source of pollution: a generation study.

    PubMed

    Ojeda-Benítez, Sara; Aguilar-Virgen, Quetzalli; Taboada-González, Paul; Cruz-Sotelo, Samantha E

    2013-12-01

    Certain domestic wastes exhibit characteristics that render them dangerous, such as explosiveness, flammability, spontaneous combustion, reactivity, toxicity and corrosiveness. The lack of information about their generation and composition hinders the creation of special programs for their collection and treatment, making these wastes a potential threat to human health and the environment. We attempted to quantify the levels of household hazardous waste (HHW) generated in Mexicali, Mexico. The analysis considered three socioeconomic strata and eight categories. The sampling was undertaken on a house-by-house basis, and hypothesis testing was based on differences between two proportions for each of the eight categories. In this study, HHW comprised 3.49% of the total generated waste, which exceeded that reported in previous studies in Mexico. The greatest quantity of HHW was generated by the middle stratum; in the upper stratum, most packages were discarded with their contents remaining. Cleaning products represent 45.86% of the HHW generated. Statistically significant differences among the three social strata were absent in only two of the eight categories. The scarcity of studies on HHW generation limits direct comparisons. Any decrease in waste generation within the middle social stratum will have a large effect on the total amount of waste generated and decrease its impact on environmental and human health. PMID:24293231
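    The hypothesis testing described above is based on differences between two proportions per waste category. A minimal two-proportion z-test of that kind is sketched below; the counts are hypothetical and the pooled-variance formulation is the standard textbook version, not necessarily the exact test used in the study.

```python
# Two-proportion z-test of the kind used to compare an HHW category's share
# between two socioeconomic strata. Counts below are hypothetical.
from math import sqrt
from scipy.stats import norm

def two_proportion_z(x1, n1, x2, n2):
    p1, p2 = x1 / n1, x2 / n2
    p_pool = (x1 + x2) / (n1 + n2)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = 2 * (1 - norm.cdf(abs(z)))   # two-sided
    return z, p_value

# e.g. cleaning products: 140 of 300 HHW items in the middle stratum vs
# 95 of 280 in the upper stratum (invented numbers)
z, p = two_proportion_z(140, 300, 95, 280)
print(f"z = {z:.2f}, p = {p:.4f}")
```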

  14. Determining associations and assessing methodological issues in a case-control study of hazardous air pollutants and neural tube defects

    Microsoft Academic Search

    Philip Joseph Lupo

    2009-01-01

    Recent studies have reported positive associations between maternal exposures to air pollutants and several adverse birth outcomes. However, there have been no assessments of the association between environmental hazardous air pollutants (HAPs) such as benzene, toluene, ethylbenzene, and xylene (BTEX) and neural tube defects (NTDs) a common and serious group of congenital malformations. Before examining this association, two important methodological

  15. Laboratory Study of Polychlorinated Biphenyl (PCB) Contamination and Mitigation in Buildings -- Part 4. Evaluation of the Activated Metal Treatment System (AMTS) for On-site Destruction of PCBs

    EPA Science Inventory

    This is the fourth and last report in the series entitled “Laboratory Study of Polychlorinated Biphenyl (PCB) Contamination and Mitigation in Buildings.” This report evaluates the performance of an on-site PCB destruction method, known as the AMTS method, developed ...

  17. Exercise as an intervention for sedentary hazardous drinking college students: A pilot study

    PubMed Central

    Weinstock, Jeremiah; Capizzi, Jeffrey; Weber, Stefanie M.; Pescatello, Linda S.; Petry, Nancy M.

    2014-01-01

    Young adults 18–24 years have the highest rates of problems associated with alcohol use among all age groups, and substance use is inversely related to engagement in substance-free activities. This pilot study investigated the promotion of one specific substance-free activity, exercise, on alcohol use in college students. Thirty-one sedentary college students who engaged in hazardous drinking (Alcohol Use Disorders Identification Test scores ≥ 8) were randomized to one of two conditions: (a) one 50-minute session of motivational enhancement therapy (MET) focused on increasing exercise, or (b) one 50-minute session of MET focused on increasing exercise plus 8 weeks of contingency management (CM) for adhering to specific exercise activities. All participants completed evaluations at baseline and post-treatment (2 months later) assessing exercise participation and alcohol use. Results of the pilot study suggest the interventions were well received by participants, and the MET+CM condition showed an increased self-reported frequency of exercise in comparison to the MET alone condition, but other indices of exercise, physical fitness, and alcohol use did not differ between the interventions over time. These results suggest that a larger scale trial could better assess efficacy of this well-received combined intervention. Investigation in other clinically relevant populations is also warranted. PMID:24949085

  18. Disaster mitigation and recovery planning for historic buildings: Guam as a case study

    E-print Network

    Cepeda, Ricarda P

    2002-01-01

    successful approaches for future use in extending their useful life are goals of this study. The study includes background information on Guam's environment and the natural disasters that can critically alter or damage the structures. The end result...

  19. Application of the WFD cost proportionality principle to diffuse pollution mitigation: a case study for Scottish Lochs.

    PubMed

    Vinten, A J A; Martin-Ortega, J; Glenk, K; Booth, P; Balana, B B; MacLeod, M; Lago, M; Moran, D; Jones, M

    2012-04-30

    The Water Framework Directive (WFD) aims to deliver good ecological status (GES) for Europe's waters. It prescribes the use of economic principles, such as derogation from GES on grounds of disproportionate costs of mitigation. This paper proposes an application of the proportionality principle to mitigation of phosphorus (P) pollution of 544 Scottish lochs at national and local water body scales. P loading estimates were derived from a national diffuse pollution screening tool. For 293 of these lochs (31% of the loch area), GES already occurred. Mitigation cost-effectiveness was assessed using combined mitigation cost curves for managed grassland, rough grazing, arable land, sewage and septic tank sources. These provided sufficient mitigation (92% of national P load) for GES to be achieved on another 31% of loch area at annualised cost of £2.09 m/y. Mitigation of the residual P loading preventing other lochs achieving GES was considered by using a "mop-up" cost of £200/kg P (assumed cost effectiveness of removal of P directly from lochs), leading to a total cost of £189 m/y. Lochs were ranked by mitigation costs per loch area to give a national scale marginal mitigation cost curve. A published choice experiment valuation of WFD targets for Scottish lochs was used to estimate marginal benefits at national scale and combined with the marginal cost curve. This gave proportionate costs of £5.7 m/y leading to GES in 72% of loch area. Using national mean marginal benefits with a scheme to estimate changes in individual loch value with P loading gave proportionate costs of £25.6 m/y leading to GES in 77% of loch area (491 lochs). PMID:22325580
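    The paper ranks lochs by mitigation cost per unit loch area to build a national marginal cost curve, then compares it with marginal benefits to find the proportionate level of mitigation. The sketch below reproduces only that ranking-and-cut-off logic; the loch names, costs, areas and the willingness-to-pay value are invented placeholders, not the study's data.

```python
# Schematic ranking of water bodies by mitigation cost per unit area to form a
# marginal cost curve, then selecting those whose unit cost does not exceed an
# assumed marginal benefit. All values are illustrative.

lochs = [
    # (name, annualised mitigation cost [GBP/yr], loch area [ha])
    ("loch_A",  12000.0,  90.0),
    ("loch_B",  50000.0, 120.0),
    ("loch_C",   8000.0,  15.0),
    ("loch_D", 300000.0,  60.0),
]

marginal_benefit_per_ha = 700.0   # assumed willingness-to-pay, GBP/ha/yr

ranked = sorted(lochs, key=lambda x: x[1] / x[2])          # cost per hectare
proportionate = [l for l in ranked if l[1] / l[2] <= marginal_benefit_per_ha]

total_cost = sum(c for _, c, _ in proportionate)
total_area = sum(a for _, _, a in proportionate)
print(f"Proportionate mitigation: {len(proportionate)} lochs, "
      f"{total_area:.0f} ha, GBP {total_cost:,.0f}/yr")
```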

  20. Effect of anxiolytic and hypnotic drug prescriptions on mortality hazards: retrospective cohort study

    PubMed Central

    2014-01-01

    Objective To test the hypothesis that people taking anxiolytic and hypnotic drugs are at increased risk of premature mortality, using primary care prescription records and after adjusting for a wide range of potential confounders. Design Retrospective cohort study. Setting 273 UK primary care practices contributing data to the General Practice Research Database. Participants 34 727 patients aged 16 years and older first prescribed anxiolytic or hypnotic drugs, or both, between 1998 and 2001, and 69 418 patients with no prescriptions for such drugs (controls) matched by age, sex, and practice. Patients were followed up for a mean of 7.6 years (range 0.1-13.4 years). Main outcome All cause mortality ascertained from practice records. Results Physical and psychiatric comorbidities and prescribing of non-study drugs were significantly more prevalent among those prescribed study drugs than among controls. The age adjusted hazard ratio for mortality during the whole follow-up period for use of any study drug in the first year after recruitment was 3.46 (95% confidence interval 3.34 to 3.59) and 3.32 (3.19 to 3.45) after adjusting for other potential confounders. Dose-response associations were found for all three classes of study drugs (benzodiazepines, Z drugs (zaleplon, zolpidem, and zopiclone), and other drugs). After excluding deaths in the first year, there were approximately four excess deaths linked to drug use per 100 people followed for an average of 7.6 years after their first prescription. Conclusions In this large cohort of patients attending UK primary care, anxiolytic and hypnotic drugs were associated with significantly increased risk of mortality over a seven year period, after adjusting for a range of potential confounders. As with all observational findings, however, these results are prone to bias arising from unmeasured and residual confounding. PMID:24647164
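    Adjusted hazard ratios of the kind reported above are typically obtained from a proportional-hazards regression on time-to-event data. The sketch below shows such a fit in outline; the `lifelines` package, the column names and the toy data are assumptions chosen for illustration and are not the study's actual analysis pipeline.

```python
# Sketch of a Cox proportional-hazards fit used to estimate an adjusted
# mortality hazard ratio in a cohort. Toy data; lifelines is an assumed choice.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "followup_years": [7.5, 2.1, 9.3, 0.8, 6.4, 4.2, 5.0, 3.3],
    "died":           [0,   1,   0,   1,   0,   1,   0,   1],
    "study_drug":     [1,   1,   0,   1,   0,   0,   1,   0],  # any anxiolytic/hypnotic
    "age":            [54,  71,  49,  80,  62,  68,  58,  75],
    "sex_male":       [1,   0,   0,   1,   1,   0,   0,   1],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="followup_years", event_col="died")
cph.print_summary()   # exp(coef) for study_drug is the adjusted hazard ratio
```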

  1. Can hazardous waste become a raw material? The case study of an aluminium residue: a review.

    PubMed

    López-Delgado, Aurora; Tayibi, Hanan

    2012-05-01

    The large number of research studies carried out in recent decades, focused on finding effective solutions for waste treatment, has allowed some of these residues to become new raw materials for many industries. Achieving this ensures a reduction in energy and natural resource consumption, diminishes negative environmental impacts and creates secondary and tertiary industries. A good example is provided by the metallurgical industry in general, and the aluminium industry in this particular case. The aluminium recycling industry is a beneficial activity for the environment, since it recovers resources from primary industry, manufacturing and post-consumer waste. Slag and scrap, which were previously considered waste, are nowadays the raw material for some highly profitable secondary and tertiary industries. The most recent European Directive on waste establishes that if a waste is used as a common product and fulfils the existing legislation for that product, then it can be defined as 'end-of-waste'. The review presented here attempts to show several proposals for making added-value materials from an aluminium residue which is still considered a hazardous waste and, accordingly, is disposed of in secure storage. The present proposal includes the use of this waste to manufacture glass, glass-ceramic, boehmite and calcium aluminate. Thus the waste might effectively be recovered as a secondary source material for various industries. PMID:22071175

  2. Study of hazardous metals in iron slag waste using laser induced breakdown spectroscopy.

    PubMed

    Gondal, M A; Hussain, T; Yamani, Z H; Bakry, A H

    2007-05-01

    Laser Induced Breakdown Spectroscopy (LIBS) was applied for quantitative elemental analysis of slag samples collected from a local steel plant, using an Nd:YAG laser emitting radiation at 1064 nm wavelength. The concentrations of different elements of environmental significance such as cadmium, calcium, sulfur, magnesium, chromium, manganese, titanium, barium, phosphorus and silicon were 44, 2193, 1724, 78578, 217260, 22220, 5178, 568, 2805 and 77871 mg kg-1, respectively. Optimal experimental conditions for analysis were investigated. Calibration curves were drawn for the different elements. The concentrations determined with our laser-induced breakdown spectrometer were compared with the results obtained using Inductively Coupled Plasma (ICP) emission spectroscopy. Our study demonstrates that LIBS could be highly appropriate for rapid online analysis of iron slag waste. The relative accuracy of our LIBS system for various elements as compared with the ICP method is in the range of 0.001-0.049 at 2.5% error confidence. Limits of detection (LOD) of our LIBS system were also estimated for the elements noted here. The hazardous effects of some of the trace elements present in iron slag exceeding permissible safe limits are also discussed. PMID:17474003
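    The calibration curves and detection limits mentioned above follow a standard pattern: line intensity is regressed against known concentrations, and the LOD is commonly taken as three times the background standard deviation divided by the calibration slope. The sketch below illustrates that convention; the intensities, concentrations and background noise are hypothetical, not the paper's measurements.

```python
# Sketch of a LIBS calibration curve and a limit-of-detection (LOD) estimate
# for one element, using the common 3*sigma/slope convention. Data invented.
import numpy as np

conc = np.array([0.0, 50.0, 100.0, 250.0, 500.0])             # mg/kg in standards
intensity = np.array([120.0, 410.0, 700.0, 1580.0, 3050.0])   # line intensity (a.u.)

slope, intercept = np.polyfit(conc, intensity, 1)
sigma_bg = 15.0                      # std. dev. of background signal (assumed)
lod = 3.0 * sigma_bg / slope

sample_intensity = 980.0             # hypothetical unknown sample
sample_conc = (sample_intensity - intercept) / slope
print(f"slope = {slope:.2f} a.u. per mg/kg, LOD = {lod:.1f} mg/kg, "
      f"sample = {sample_conc:.0f} mg/kg")
```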

  3. Flood hazard studies in Central Texas using orbital and suborbital remote sensing imagery

    NASA Technical Reports Server (NTRS)

    Baker, V. R.; Holz, R. K.; Patton, P. C.

    1975-01-01

    Central Texas is subject to infrequent, unusually intense rainstorms which cause extremely rapid runoff from drainage basins developed on the deeply dissected limestone and marl bedrock of the Edwards Plateau. One approach to flood hazard evaluation in this area is a parametric model relating flood hydrograph characteristics to quantitative geomorphic properties of the drainage basins. The preliminary model uses multiple regression techniques to predict potential peak flood discharge from basin magnitude, drainage density, and ruggedness number. After mapping small catchment networks from remote sensing imagery, input data for the model are generated by network digitization and analysis by a computer assisted routine of watershed analysis. The study evaluated the network resolution capabilities of the following data formats: (1) large-scale (1:24,000) topographic maps, employing Strahler's "method of v's," (2) standard low altitude black and white aerial photography (1:13,000 and 1:20,000 scales), (3) NASA - generated aerial infrared photography at scales ranging from 1:48,000 to 1:123,000, and (4) Skylab Earth Resources Experiment Package S-190A and S-190B sensors (1:750,000 and 1:500,000 respectively).
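    The parametric model described above predicts potential peak flood discharge from basin magnitude, drainage density, and ruggedness number by multiple regression. The sketch below shows a least-squares fit of that general form; the log-linear functional form, the morphometric values and the discharges are assumptions for illustration only.

```python
# Minimal multiple-regression sketch relating peak discharge to drainage-basin
# morphometry (basin magnitude, drainage density, ruggedness number), in the
# spirit of the parametric model described above. Data are invented and a
# log-linear form is assumed.
import numpy as np

magnitude = np.array([12, 30, 55, 80, 140], dtype=float)    # number of first-order streams
drain_density = np.array([2.1, 3.0, 3.8, 4.5, 5.2])         # km of channel per km^2
ruggedness = np.array([0.15, 0.22, 0.35, 0.41, 0.60])       # relief x drainage density
peak_q = np.array([8.0, 25.0, 70.0, 130.0, 340.0])          # peak discharge, m^3/s

X = np.column_stack([np.ones_like(magnitude),
                     np.log(magnitude), np.log(drain_density), np.log(ruggedness)])
coeffs, *_ = np.linalg.lstsq(X, np.log(peak_q), rcond=None)
print("ln Qp = %.2f + %.2f ln(M) + %.2f ln(D) + %.2f ln(R)" % tuple(coeffs))
```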

  4. EXPERIMENTAL STUDY ON VIBRATION MITIGATION OF A STAY CABLE USING NONLINEAR HYSTERETIC DAMPERS

    Microsoft Academic Search

    J. M. Ko; Y. Chen; G. Zheng; Y. Q. Ni

    This paper describes an experimental study on cable vibration control using nonlinear hysteretic dampers. A 12 m-long stay cable specimen, which is a 1:12 scale model of a 143 m-long prototype cable in an actual cable structure, is established for the experimental study. Modal testing on the pure cable without a damper is first performed to identify the actual modal properties of

  5. Geoengineering, climate change scepticism and the ‘moral hazard’ argument: an experimental study of UK public perceptions

    PubMed Central

    Corner, Adam; Pidgeon, Nick

    2014-01-01

    Many commentators have expressed concerns that researching and/or developing geoengineering technologies may undermine support for existing climate policies—the so-called moral hazard argument. This argument plays a central role in policy debates about geoengineering. However, there has not yet been a systematic investigation of how members of the public view the moral hazard argument, or whether it impacts on people's beliefs about geoengineering and climate change. In this paper, we describe an online experiment with a representative sample of the UK public, in which participants read one of two arguments (either endorsing or rejecting the idea that geoengineering poses a moral hazard). The argument endorsing the idea of geoengineering as a moral hazard was perceived as more convincing overall. However, people with more sceptical views and those who endorsed ‘self-enhancing’ values were more likely to agree that the prospect of geoengineering would reduce their motivation to make changes in their own behaviour in response to climate change. The findings suggest that geoengineering is likely to pose a moral hazard for some people more than others, and the implications for engaging the public are discussed. PMID:25404680

  6. High-voltage interactions in plasma wakes: Simulation and flight measurements from the Charge Hazards and Wake Studies (CHAWS) experiment

    Microsoft Academic Search

    V. A. Davis; M. J. Mandell; D. L. Cooke; C. L. Enloe

    1999-01-01

    The Charge Hazards and Wake Studies (CHAWS) flight experiment flew on the Wake Shield Facility (WSF) aboard STS-60 and STS-69. The experiment studied high-voltage current collection within the spacecraft wake. The wake-side sensor was a 45-cm-long, biasable cylindrical probe mounted on the 3.66-m-diameter WSF. Operations were performed in free flight and at various attitudes while on the shuttle orbiter remote

  7. Houston’s Novel Strategy to Control Hazardous Air Pollutants: A Case Study in Policy Innovation and Political Stalemate

    PubMed Central

    Sexton, Ken; Linder, Stephen H

    2015-01-01

    Although ambient concentrations have declined steadily over the past 30 years, Houston has recorded some of the highest levels of hazardous air pollutants in the United States. Nevertheless, federal and state regulatory efforts historically have emphasized compliance with the National Ambient Air Quality Standard for ozone, treating “air toxics” in Houston as a residual problem to be solved through application of technology-based standards. Between 2004 and 2009, Mayor Bill White and his administration challenged the well-established hierarchy of air quality management spelled out in the Clean Air Act, whereby federal and state authorities are assigned primacy over local municipalities for the purpose of designing and implementing air pollution control strategies. The White Administration believed that existing regulations were not sufficient to protect the health of Houstonians and took a diversity of both collaborative and combative policy actions to mitigate air toxic emissions from stationary sources. Opposition was substantial from a local coalition of entrenched interests satisfied with the status quo, which hindered the city’s attempts to take unilateral policy actions. In the short term, the White Administration successfully raised the profile of the air toxics issue, pushed federal and state regulators to pay more attention, and induced a few polluting facilities to reduce emissions. But since White left office in 2010, air quality management in Houston has returned to the way it was before, and today there is scant evidence that his policies have had any lasting impact. PMID:25698880

  8. Modeling household adoption of earthquake hazard adjustments: a longitudinal panel study of Southern California and Western Washington residents 

    E-print Network

    Arlikatti, Sudha S

    2006-10-30

    This research, aimed at advancing the theory of environmental hazard adjustment processes by contrasting households from three cities in a high seismic hazard area with households from three other cities in a moderate seismic hazard area...

  9. Mitigating Challenges of Using Virtual Reality in Online Courses: A Case Study

    ERIC Educational Resources Information Center

    Stewart, Barbara; Hutchins, Holly M.; Ezell, Shirley; De Martino, Darrell; Bobba, Anil

    2010-01-01

    Case study methodology was used to describe the challenges experienced in the development of a virtual component for a freshman-level undergraduate course. The purpose of the project was to use a virtual environment component to provide an interactive and engaging learning environment. While some student and faculty feedback was positive, this…

  10. Parameter study for child injury mitigation in near side impacts through FE simulations

    Microsoft Academic Search

    Marianne Andersson; Bengt Pipkorn; Per Lövsund

    2012-01-01

    Objective: The objective of this study is to investigate the effects of crash-related car parameters on head and chest injury measures for 3- and 12-year-old children in near side impacts. Methods: The evaluation was made by using a model of a complete passenger car which was impacted laterally by a barrier. The car model was validated in two crash conditions:

  11. The California Hazards Institute

    NASA Astrophysics Data System (ADS)

    Rundle, J. B.; Kellogg, L. H.; Turcotte, D. L.

    2006-12-01

    California's abundant resources are linked with its natural hazards. Earthquakes, landslides, wildfires, floods, tsunamis, volcanic eruptions, severe storms, fires, and droughts afflict the state regularly. These events have the potential to become great disasters, like the San Francisco earthquake and fire of 1906, that overwhelm the capacity of society to respond. At such times, the fabric of civic life is frayed, political leadership is tested, economic losses can dwarf available resources, and full recovery can take decades. A patchwork of Federal, state and local programs are in place to address individual hazards, but California lacks effective coordination to forecast, prevent, prepare for, mitigate, respond to, and recover from, the harmful effects of natural disasters. Moreover, we do not know enough about the frequency, size, time, or locations where they may strike, nor about how the natural environment and man-made structures would respond. As California's population grows and becomes more interdependent, even moderate events have the potential to trigger catastrophes. Natural hazards need not become natural disasters if they are addressed proactively and effectively, rather than reactively. The University of California, with 10 campuses distributed across the state, has world-class faculty and students engaged in research and education in all fields of direct relevance to hazards. For that reason, the UC can become a world leader in anticipating and managing natural hazards in order to prevent loss of life and property and degradation of environmental quality. The University of California, Office of the President, has therefore established a new system-wide Multicampus Research Project, the California Hazards Institute (CHI), as a mechanism to research innovative, effective solutions for California. The CHI will build on the rich intellectual capital and expertise of the Golden State to provide the best available science, knowledge and tools for leaders, managers, stakeholders, policy makers, educators and the public to effectively and comprehensively combat the problems caused by the natural hazards that threaten California. During this first year of operation, UC faculty involved in the CHI will identify the science and technology research priorities of the Institute, followed by the solicitation of participation by other important stakeholders within California. The CHI is founded upon the idea that the hazards associated with events such as earthquakes and floods need not become great disasters such as the San Francisco earthquake of 1906 and 2005 Hurricane Katrina if these hazards can be anticipated proactively, before they must be dealt with reactively.

  12. Hazardous Materials

    NSDL National Science Digital Library

    Alston, Michele

    Hazardous Materials is a lesson plan which teaches students how to recognize and safely handle hazardous materials in the workplace. After completing this module, students should be able to interpret MSDS sheets and demonstrate their ability to identify hazardous materials during an experiential exercise. Note: This module is part of a modularized manufacturing technology curriculum created by the PSCME, found at www.pscme.org/educators.html.

  13. Numerical study of shock-wave mitigation through matrices of solid obstacles

    NASA Astrophysics Data System (ADS)

    Chaudhuri, A.; Hadjadj, A.; Sadot, O.; Ben-Dor, G.

    2013-02-01

    Shock-wave propagation through different arrays of solid obstacles and its attenuation are analyzed by means of numerical simulations. The two-dimensional compressible Navier-Stokes equations are solved using a fifth-order weighted essentially non-oscillatory scheme, in conjunction with an immersed-boundary method to treat the embedded solids within a Cartesian grid. The present study focuses on the geometrical aspects of the solid obstacles, particularly at lower effective flow area, where the frictional forces are expected to be important. The main objective is to analyze the controlling mechanism for shock propagation and attenuation in complex inhomogeneous and porous media. Different parameters are investigated such as the geometry of the obstacles, their orientation in space as well as the relaxation lengths between two consecutive columns. The study highlights a number of interesting phenomena such as compressible vortices and shock-vortex interactions that are produced in the post-shock region. This also includes shock interactions, hydrodynamic instabilities and non-linear growth of the mixing. Ultimately, the Kelvin-Helmholtz instability invokes transition to a turbulent mixing region across the matrix columns and eddies of different length scales are generated in the wake region downstream of the solid blocks. The power spectrum of instantaneous dynamic pressure shows the existence of a wide range of frequencies which scales nearly as f^(-5/3). In terms of shock attenuation, the results indicate that the staggered matrix of reversed triangular prisms (where the base of the triangular prism is facing the incoming shock) is the most efficient arrangement. In this case, both static and dynamic pressure impulses show significant reduction compared to the other studied configurations, which confirms the effectiveness of this type of barrier configuration. Furthermore, the use of a combination of reverse-reverse arrangements of the triangular prism obstacle maze is found more effective compared to the forward-reverse or forward-forward arrangements.

  14. Field study of exhaust fans for mitigating indoor air quality problems: Final report

    SciTech Connect

    Grimsrud, D.T.; Szydlowski, R.F.; Turk, B.H.

    1986-09-01

    Residential ventilation in the United States housing stock is provided primarily by infiltration, the natural leakage of outdoor air into a building through cracks and holes in the building shell. Since ventilation is the dominant mechanism for control of indoor pollutant concentrations, low infiltration rates caused by fluctuations in weather conditions may lead to high indoor pollutant concentrations. Supplemental mechanical ventilation can be used to eliminate these periods of low infiltration. This study examined the effects of small, continuously operating exhaust fans on pollutant concentrations and energy use in residences.

  15. Integrated study to define the hazard of the unstable flanks of Mt. Etna: the Italian DPC-INGV FLANK Project

    NASA Astrophysics Data System (ADS)

    Acocella, Valerio; Puglisi, Giuseppe

    2010-05-01

    Volcanoes are often characterized by unstable flanks. The eastern and south-eastern flanks of Mt. Etna (Italy) have shown repeated evidence of instability in the recent past. The extent and frequency of these processes vary widely, from nearly continuous creep-like movements of specific portions of the flank to the rarer slip of the entire eastern sector, involving also the off-shore portion. Estimated slip rates may vary enormously, from mm/yr to m/week. The most dramatic instability events are associated with major eruptions and shallow seismic activity, as during 2002-2003, posing a serious hazard to the inhabited flanks of the volcano. The Italian Department of Civil Defense (DPC), with the National Institute of Geophysics and Volcanology (INGV), as well as with the involvement of Italian universities and other research institutes, has launched a two-year project (May 2008-May 2010) devoted to minimizing the hazard deriving from the instability of the Etna flanks. This multidisciplinary project embraces geological, geophysical, volcanological, modeling and hazard studies, on both the on-shore and the off-shore portions of the E and SE flanks of the volcano. Indeed, the main aims are to define: (a) the 3D geometry of the collapsing sector(s); (b) the relationships between flank movement and volcanic and seismic activity; and (c) the hazard related to the flank instability. The collected data populate a GIS database implemented according to the WOVO rules. This project represents the first attempt, at least in Europe, to use an integrated approach to minimize the hazard deriving from flank instability at a volcano. Here we briefly summarize the state of the art of the project at an advanced stage, highlighting the path of the different Tasks, as well as the main results.

  16. Land use and management change under climate change adaptation and mitigation strategies: a U.S. case study

    USGS Publications Warehouse

    Mu, Jianhong E.; Wein, Anne; McCarl, Bruce

    2013-01-01

    We examine the effects of crop management adaptation and climate mitigation strategies on land use and land management, as well as on related environmental and economic outcomes. We find that crop management adaptation (e.g. crop mix, new species) increases greenhouse gas (GHG) emissions by 1.7% under a more severe climate projection, while a carbon price reduces total forest and agriculture GHG annual flux by 15% and 9%, respectively. This shows that trade-offs are likely between mitigation and adaptation. Climate change coupled with crop management adaptation has small and mostly negative effects on welfare; mitigation, which is implemented as a carbon price starting at $15 per metric ton carbon dioxide (CO2) equivalent with a 5% annual increase rate, bolsters welfare through carbon payments. When both crop management adaptation and a carbon price are implemented, the effects of the latter dominate.

  17. Migration, environmental hazards, and health outcomes in China.

    PubMed

    Chen, Juan; Chen, Shuo; Landry, Pierre F

    2013-03-01

    China's rapid economic growth has had a serious impact on the environment. Environmental hazards are major sources of health risk factors. The migration of over 200 million people to heavily polluted urban areas is likely to be significantly detrimental to health. Based on data from the 2009 national household survey "Chinese Attitudes toward Inequality and Distributive Injustice" (N = 2866) and various county-level and municipal indicators, we investigate the disparities in subjective exposure to environmental hazards and associated health outcomes in China. This study focuses particularly on migration-residency status and county-level socio-economic development. We employ multiple regressions that account for the complex multi-stage survey design to assess the associations between perceived environmental hazards and individual and county-level indicators and between perceived environmental hazards and health outcomes, controlling for physical and social environments at multiple levels. We find that perceived environmental hazards are associated with county-level industrialization and economic development: respondents living in more industrialized counties report greater exposure to environmental hazards. Rural-to-urban migrants are exposed to more water pollution and a higher measure of overall environmental hazard. Perceived environmental risk factors severely affect the physical and mental health of the respondents. The negative effects of perceived overall environmental hazard on physical health are more detrimental for rural-to-urban migrants than for urban residents. The research findings call for restructuring the household registration system in order to equalize access to public services and mitigate adverse environmental health effects, particularly among the migrant population. PMID:23273408

  18. A study of hazardous air pollutants at the Tidd PFBC Demonstration Plant

    SciTech Connect

    NONE

    1994-10-01

    The US Department of Energy (DOE) Clean Coal Technology (CCT) Program is a joint effort between government and industry to develop a new generation of coal utilization processes. In 1986, the Ohio Power Company, a subsidiary of American Electric Power (AEP), was awarded cofunding through the CCT program for the Tidd Pressurized Fluidized Bed Combustion (PFBC) Demonstration Plant located in Brilliant, Ohio. The Tidd PFBC unit began operation in 1990 and was later selected as a test site for an advanced particle filtration (APF) system designed for hot gas particulate removal. The APF system was sponsored by the DOE Morgantown Energy Technology Center (METC) through their Hot Gas Cleanup Research and Development Program. A complementary goal of the DOE CCT and METC R&D programs has always been to demonstrate the environmental acceptability of these emerging technologies. The Clean Air Act Amendments of 1990 (CAAA) have focused that commitment toward evaluating the fate of hazardous air pollutants (HAPs) associated with advanced coal-based and hot gas cleanup technologies. Radian Corporation was contracted by AEP to perform this assessment of HAPs at the Tidd PFBC demonstration plant. The objective of this study is to assess the major input, process, and emission streams at Plant Tidd for the HAPs identified in Title III of the CAAA. Four flue gas stream locations were tested: ESP inlet, ESP outlet, APF inlet, and APF outlet. Other process streams sampled were raw coal, coal paste, sorbent, bed ash, cyclone ash, individual ESP hopper ash, APF ash, and service water. Samples were analyzed for trace elements, minor and major elements, anions, volatile organic compounds, dioxin/furan compounds, ammonia, cyanide, formaldehyde, and semivolatile organic compounds. The particle size distribution in the ESP inlet and outlet gas streams and collected ash from individual ESP hoppers was also determined.

  19. Study of gas cluster ion beam surface treatments for mitigating RF breakdown

    NASA Astrophysics Data System (ADS)

    Swenson, D. R.; Degenkolb, E.; Insepov, Z.

    2006-07-01

    Surface processing with high-energy gas cluster ion beams (GCIB) is investigated for increasing the high-voltage breakdown strength of RF cavities and electrodes in general. Various GCIB treatments were studied for Nb, Cu, stainless steel and Ti electrode materials using beams of Ar, Ar + H2, O2, N2, Ar + CH4, or O2 + NF3 clusters with accelerating potentials up to 35 kV. Etching using chemically active clusters such as NF3 reduces the grain structure of Nb used for SRF cavities. Smoothing effects on stainless steel and Ti substrates were evaluated using SEM and AFM imaging and show that 200 nm wide polishing scratch marks are greatly attenuated. Using a combination of Ar and O2 processing for stainless steel electrode material, the oxide thickness and surface hardness are dramatically increased. The DC field emission of a 150-mm diameter sample of GCIB-processed stainless steel electrode material was a factor of 10^6 less than that of a similar untreated sample.

  20. Control strategy optimization for attainment and exposure mitigation: case study for ozone in Macon, Georgia.

    PubMed

    Cohan, Daniel S; Tian, Di; Hu, Yongtao; Russell, Armistead G

    2006-09-01

    Implementation of more stringent 8-hour ozone standards has led the U.S. Environmental Protection Agency to designate nonattainment status to 474 counties nationwide, many of which had never previously violated air quality standards. As states select emission control measures to achieve attainment in these regions, their choices pose significant implications to local economies and the health of their citizens. Considering a case study of one such nonattainment region, Macon, Georgia, we develop a menu of potential controls that could be implemented locally or in neighboring parts of the state. The control menu offers the potential to control about 20-35% of ozone precursor emissions in most Georgia regions, but marginal costs increase rapidly beyond 15-20%. We link high-order ozone sensitivities with the control menu to identify cost-optimized strategies for achieving attainment and for alternative goals such as reducing spatially averaged or population-weighted ozone concentrations. Strategies targeted toward attainment of Macon ozone would prioritize local reductions of nitrogen oxides, whereas controls in the more densely populated Atlanta region are shown to be more effective for reducing statewide potential population exposure to ozone. A U.S. EPA-sanctioned approach for demonstrating ozone attainment with photochemical models is shown to be highly dependent on the choice of a baseline period and may not foster optimal strategies for assuring attainment and protecting human health. PMID:16738816
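    The cost-optimized strategies described above link ozone sensitivities to a menu of candidate controls. At first order this can be posed as a linear program: minimize control cost subject to achieving a required ozone reduction. The sketch below shows that simplified, linearized framing; the study itself uses high-order sensitivities, and all costs, benefits and the target value here are invented.

```python
# Linearized sketch of choosing emission controls to meet an ozone-reduction
# target at least cost. Numbers are illustrative only; the study uses
# higher-order sensitivities rather than this purely linear approximation.
from scipy.optimize import linprog

# Candidate controls: annual cost (M$/yr) and ozone benefit at the monitor
# (ppb if fully implemented), treated as linear in the control fraction.
cost = [4.0, 7.5, 2.0, 12.0]           # objective: minimize total cost
ozone_benefit = [1.2, 2.5, 0.4, 3.0]   # ppb reduction per control, fully applied
required_reduction = 4.0               # ppb needed for attainment (assumed)

# linprog minimizes c@x subject to A_ub@x <= b_ub; flip signs to require
# that the summed benefit meets the target.
res = linprog(c=cost,
              A_ub=[[-b for b in ozone_benefit]],
              b_ub=[-required_reduction],
              bounds=[(0, 1)] * len(cost))   # each control applied 0-100%
print("control fractions:", res.x.round(2),
      " total cost (M$/yr):", round(res.fun, 2))
```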

  1. Mitigation of air pollution and carbon footprint by energy conservation through CFLs: a case study.

    PubMed

    Wath, Sushant B; Majumdar, Deepanjan

    2011-01-01

    Electricity consumption of compact fluorescent lamps (CFLs) is low, making them a useful tool for minimizing the rapidly increasing demand for electrical energy in India. The present study aims to project the likely electricity conservation in a scenario of complete replacement of existing fluorescent tubes (FTs) by CFLs at CSIR-NEERI (National Environmental Engineering Research Institute), vis-à-vis the financial repercussions and the indirect reduction in emissions of greenhouse gases, e.g. CO2, N2O, CH4, and other air pollutants, e.g. SO2, NO, suspended particulate matter (SPM), black carbon (BC) and mercury (Hg), from coal-fired thermal power plants. The calculations show that the Institute could save around 122850 kWh of electricity per annum, thereby saving approximately INR 859,950 (USD 18,453.86) towards electricity cost per annum, and would be able to avoid 44579.08 kg of CO2-C equivalent (over a 100 year time horizon), 909 kg SO2, 982.8 kg NO, 9.8 kg of BC, 368.5 kg SPM, 18.4 kg PM10 and 0.0024 kg Hg emissions per annum from a coal-fired thermal power plant by conserving electricity at the institute level. PMID:22324148
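    The reported savings follow from a simple wattage-difference times operating-hours calculation, with avoided emissions obtained by multiplying saved kWh by plant-level emission factors. The sketch below reproduces that arithmetic; the lamp count, wattages, burning hours and emission factor are assumptions chosen only so the total lands near the ~122,850 kWh/yr figure quoted in the abstract.

```python
# Back-of-envelope sketch of the replacement arithmetic: kWh saved by swapping
# fluorescent tubes (FTs) for CFLs, and avoided CO2 from an assumed coal-plant
# emission factor. Only the ~122,850 kWh/yr check value comes from the text.

n_lamps = 1500                       # assumed number of replaced tubes
ft_watts, cfl_watts = 55.0, 25.0     # assumed wall-plug power incl. ballast
hours_per_year = 2730                # assumed annual burning hours per lamp

kwh_saved = n_lamps * (ft_watts - cfl_watts) * hours_per_year / 1000.0
co2_factor = 0.95                    # assumed kg CO2 per kWh for a coal plant

print(f"electricity saved: {kwh_saved:,.0f} kWh/yr "
      f"(paper reports ~122,850 kWh/yr)")
print(f"avoided CO2: {kwh_saved * co2_factor / 1000:.1f} t/yr")
```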

  2. Feasibility study of tank leakage mitigation using subsurface barriers. Revision 1

    SciTech Connect

    Treat, R.L.; Peters, B.B.; Cameron, R.J. [Enserch Environmental, Inc., Richland, WA (United States)]; and others

    1995-01-01

    This document reflects the evaluations and analyses performed in response to Tri-Party Agreement Milestone M-45-07A - “Complete Evaluation of Subsurface Barrier Feasibility” (September 1994). In addition, this feasibility study was revised reflecting ongoing work supporting a pending decision by the DOE Richland Operations Office, the Washington State Department of Ecology, and the US Environmental Protection Agency regarding further development of subsurface barrier options for SSTs and whether to proceed with demonstration plans at the Hanford Site (Tri-Party Agreement Milestone M-45-07B). Analyses of 14 integrated SST tank farm remediation alternatives were conducted in response to the three stated objectives of Tri-Party Agreement Milestone M-45-07A. The alternatives include eight with subsurface barriers and six without. Technologies used in the alternatives include three types of tank waste retrieval, seven types of subsurface barriers, a method of stabilizing the void space of emptied tanks, two types of in situ soil flushing, one type of surface barrier, and a clean-closure method. A no-action alternative and a surface-barrier-only alternative were included as nonviable alternatives for comparison. All other alternatives were designed to result in closure of SST tank farms as landfills or in clean-closure. Revision 1 incorporates additional analyses of worker safety, large leak scenarios, and sensitivity to the leach rates of risk controlling constituents. The additional analyses were conducted to support TPA Milestone M-45-07B.

  3. Control Strategy Optimization for Attainment and Exposure Mitigation: Case Study for Ozone in Macon, Georgia

    NASA Astrophysics Data System (ADS)

    Cohan, Daniel S.; Tian, Di; Hu, Yongtao; Russell, Armistead G.

    2006-09-01

    Implementation of more stringent 8-hour ozone standards has led the U.S. Environmental Protection Agency to designate nonattainment status to 474 counties nationwide, many of which had never previously violated air quality standards. As states select emission control measures to achieve attainment in these regions, their choices pose significant implications to local economies and the health of their citizens. Considering a case study of one such nonattainment region, Macon, Georgia, we develop a menu of potential controls that could be implemented locally or in neighboring parts of the state. The control menu offers the potential to control about 20-35% of ozone precursor emissions in most Georgia regions, but marginal costs increase rapidly beyond 15-20%. We link high-order ozone sensitivities with the control menu to identify cost-optimized strategies for achieving attainment and for alternative goals such as reducing spatially averaged or population-weighted ozone concentrations. Strategies targeted toward attainment of Macon ozone would prioritize local reductions of nitrogen oxides, whereas controls in the more densely populated Atlanta region are shown to be more effective for reducing statewide potential population exposure to ozone. A U.S. EPA-sanctioned approach for demonstrating ozone attainment with photochemical models is shown to be highly dependent on the choice of a baseline period and may not foster optimal strategies for assuring attainment and protecting human health.

  4. Climate engineering of vegetated land for hot extremes mitigation: An Earth system model sensitivity study

    NASA Astrophysics Data System (ADS)

    Wilhelm, Micah; Davin, Edouard L.; Seneviratne, Sonia I.

    2015-04-01

    Various climate engineering schemes have been proposed as a way to curb anthropogenic climate change. Land climate engineering schemes aiming to reduce the amount of solar radiation absorbed at the surface by changes in land surface albedo have been considered in a limited number of investigations. However, global studies on this topic have generally focused on the impacts on mean climate rather than extremes. Here we present the results of a series of transient global climate engineering sensitivity experiments performed with the Community Earth System Model over the time period 1950-2100 under historical and Representative Concentration Pathway 8.5 scenarios. Four sets of experiments are performed in which the surface albedo over snow-free vegetated grid points is increased respectively by 0.05, 0.10, 0.15, and 0.20. The simulations show a preferential cooling of hot extremes relative to mean temperatures throughout the Northern midlatitudes during boreal summer under the late twentieth century conditions. Two main mechanisms drive this response: On the one hand, a stronger efficacy of the albedo-induced radiative forcing on days with high incoming shortwave radiation and, on the other hand, enhanced soil moisture-induced evaporative cooling during the warmest days relative to the control simulation due to accumulated soil moisture storage and reduced drying. The latter effect is dominant in summer in midlatitude regions and also implies a reduction of summer drought conditions. It thus constitutes another important benefit of surface albedo modifications in reducing climate change impacts. The simulated response for the end of the 21st century conditions is of the same sign as that for the end of the twentieth century conditions but indicates an increasing absolute impact of land surface albedo increases in reducing mean and extreme temperatures under enhanced greenhouse gas forcing.

  5. Multidimensional Separating Equilibria and Moral Hazard: An Empirical Study of National Football League Contract Negotiations

    Microsoft Academic Search

    Michael Conlin; Patrick M. Emerson

    2003-01-01

    This paper empirically tests for a multidimensional separating equilibrium in contract negotiations and tests for evidence of the moral hazard inherent in many contracts. Using contract and performance data on players drafted into the National Football League from 1986 through 1991, we find evidence that players use delay to agreement and incentive clauses to reveal their private information during contract

  6. A landslide dam breach induced debris flow – a case study on downstream hazard areas delineation

    Microsoft Academic Search

    Chien-Yuan Chen; Tien-Chien Chen; Fan-Chieh Yu; Fong-Yi Hung

    2004-01-01

    A catastrophic landslide dam breach induced debris flow initiated in Da-Cu-Keng stream, Ruifang town, when typhoon Xangsane hit Taiwan on November 1, 2000. Different available methodologies were used to model the natural dam breach induced debris flow and using field topography the hazard zones affected by debris mixtures were delineated. The numerical finite element or finite difference method is time

  7. CASE STUDIES ADDENDUM: 1-8. REMEDIAL RESPONSE AT HAZARDOUS WASTE SITES

    EPA Science Inventory

    In response to the threat to human health and the environment posed by numerous uncontrolled hazardous waste sites across the country, new remedial action technologies are evolving and known technologies are being retrofitted and adapted for use in cleaning up these sites. The re...

  8. CASE STUDIES 1-23: REMEDIAL RESPONSE AT HAZARDOUS WASTE SITES

    EPA Science Inventory

    In response to the threat to human health and the environment posed by numerous uncontrolled hazardous waste sites across the country, new remedial action technologies are evolving and known technologies are being retrofitted and adapted for use in cleaning up these sites. This r...

  9. Towards Practical, Real-Time Estimation of Spatial Aftershock Probabilities: A Feasibility Study in Earthquake Hazard

    Microsoft Academic Search

    P. Morrow; J. McCloskey; S. Steacy

    2001-01-01

    It is now widely accepted that the goal of deterministic earthquake prediction is unattainable in the short term and may even be forbidden by nonlinearity in the generating dynamics. This nonlinearity does not, however, preclude the estimation of earthquake probability and, in particular, how this probability might change in space and time; earthquake hazard estimation might be possible in the

  10. GIS-based spatial analysis and modeling for landslide hazard assessment: A case study in upper Minjiang River basin

    Microsoft Academic Search

    Feng Wenlan; Zhou Qigang; Zhang Baolei; Zhou Wancun; Li Ainong; Zhang Haizhen; Xian Wei

    2006-01-01

    By analyzing the topographic features of past landslides since the 1980s and the main land-cover types (including change information) in the landslide-prone area, the modeled spatial distribution of landslide hazard in the upper Minjiang River Basin was studied based on GIS spatial analysis in this paper. Results of the GIS analysis showed that landslide occurrence in this region is closely related to topographic features. Most

  11. Mitigating GHG emissions from agriculture under climate change constrains - a case study for the State of Saxony, Germany

    NASA Astrophysics Data System (ADS)

    Haas, E.; Kiese, R.; Klatt, S.; Butterbach-Bahl, K.

    2012-12-01

    Mitigating greenhouse gas (N2O, CO2, CH4) emissions from agricultural soils under conditions of projected climate change (IPCC SRES scenarios) is a prerequisite to limit global warming. In this study we used the recently developed regional biogeochemical ecosystem model LandscapeDNDC (Haas et al., 2012, Landscape Ecology) and two time slices for present day (1998-2018) and future climate (2078-2098) (regional downscaling of the IPCC SRES A1B climate simulation), and compared a business-as-usual agricultural management scenario (winter rape seed - winter barley - winter wheat rotation; fertilization: 170 / 150 / 110 kg-N mineral fertilizer; straw harvest barley/wheat: 90%) with scenarios in which one or all of the following options were realized: no-till, full (100%) return of crop residues to the field, or a 10% reduction in N fertilization. The spatial domain is the State of Saxony (1 073 523 hectares of arable land), a typical region for agricultural production in Central Europe. The simulations are based on a high-resolution polygonal dataset (5 517 agricultural grid cells) for which relevant information on soil properties is available. The regionalization of the N2O emissions was validated against the IPCC Tier I methodology, which gives N2O emissions of 1 824 / 1 610 / 1 180 [t N2O-N yr-1] for the baseline years, whereas the simulations give 6 955 / 6 039 / 2 207 [t N2O-N yr-1] for the first three years of the baseline scenario and range between 621 and 6 955 [t N2O-N yr-1] in the following years (mean of 2 923). The influence of climate change (elevated mean temperature of approx. 2°C and minor changes in precipitation) results in an increase of 259 [t N2O-N yr-1] (mean 3 182), or approx. 9 percent on average (with a minimum of 618 and a maximum of 6 553 [t N2O-N yr-1]). Focusing on mitigation, recarbonization resulted in an increase of soil carbon stocks of 2 585 [kg C/ha] over the simulation time span (161 868 [kg C/ha] at the initial stage and 164 453 [kg C/ha] at the end of the 21 years of simulation, mean of 163 444). The study shows that the carbon sequestration achieved by incorporating the residues is fully compensated by a steady increase of soil N2O emissions due to the additional nitrogen supplied by the mineralization of the organic material (residues). Toward a sustainable land use, the study presents an optimal scenario that keeps yields high while increasing soil C and reducing gaseous N losses and leaching.
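    The IPCC Tier 1 benchmark referred to above estimates direct N2O-N emissions as a fixed fraction of applied fertilizer N (the default emission factor EF1 is 0.01 kg N2O-N per kg N). The sketch below shows that arithmetic using the fertilizer rates quoted in the abstract; applying each rate uniformly to the full Saxony arable area is an illustrative assumption, but it does reproduce the 1 824 / 1 610 / 1 180 t N2O-N figures cited as the Tier I baseline.

```python
# IPCC Tier 1 style estimate of direct N2O-N from synthetic fertilizer, the
# benchmark the LandscapeDNDC regionalization is validated against.
# EF1 = 0.01 kg N2O-N per kg N applied is the IPCC default; uniform application
# of one rate to the whole 1,073,523 ha arable area is illustrative only.

arable_area_ha = 1_073_523
fert_rates = {"winter rape": 170.0, "winter barley": 150.0, "winter wheat": 110.0}
ef1 = 0.01   # kg N2O-N per kg N input (IPCC default, direct emissions)

for crop, rate in fert_rates.items():
    n2o_n = arable_area_ha * rate * ef1 / 1000.0   # tonnes N2O-N per year
    print(f"{crop:14s}: {n2o_n:8.0f} t N2O-N/yr if all area were in this crop")
```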

  12. Case studies in risk assessment for hazardous waste burning cement kilns

    SciTech Connect

    Fraiser, L.H.; Lund, L.G.; Tyndall, K.H. [Texas Natural Resource Conservation Commission (TNRCC), Austin, TX (United States)]

    1996-12-31

    In November of 1994, the Environmental Protection Agency (EPA) issued its final Strategy for Hazardous Waste Minimization and Combustion. In the Strategy, EPA outlined the role of risk assessment in assuring the safe operation of hazardous waste combustion facilities. In accordance with the goals of the Strategy, the Texas Natural Resource Conservation Commission performed screening analyses for two cement companies that supplement with hazardous waste-derived fuel. The methodology employed was that outlined in the Guidance for Performing Screening Level Risk Analyses at Combustion Facilities Burning Hazardous Wastes. A tiered screening approach, allowing progression from a generic worst-case risk assessment to an increasingly more detailed site-specific risk assessment, was developed for applying EPA's methodology. Interactive spreadsheets, consisting of approximately 50 fate and transport equations and an additional 30 algorithms for estimating human health risks by indirect and direct pathways, were developed for performing the screening analyses. Exposure scenarios included adult and child residents, subsistence farmers (dairy and beef), and a subsistence fisher. Residents were assumed to consume soil and vegetables and to inhale contaminated air. Farmers were assumed to consume soil, vegetables, beef and/or milk (as appropriate) and to inhale contaminated air. In addition to inhaling contaminated air, the fisher was assumed to consume soil, vegetables and fish. The subsistence fisher scenario dominated the estimated risks posed by the two cement companies evaluated. As expected, indirect pathways contributed the majority of the risk. In conclusion, the results indicate that the cumulative (indirect and direct) cancer risks and non-cancer hazard indices did not exceed target levels of 1E-05 and 0.5, respectively.
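    The screening spreadsheets described above chain fate-and-transport estimates into exposure and risk algorithms. The sketch below shows one generic direct-pathway calculation of that kind (chronic daily intake from soil ingestion and the resulting incremental cancer risk), using the standard EPA-style intake equation; every parameter value is a generic illustrative default, not a value from the Texas screening analyses.

```python
# Sketch of one direct-pathway calculation of the kind the screening
# spreadsheets chain together: chronic daily intake (CDI) from incidental soil
# ingestion and the resulting incremental cancer risk. Values are generic.

c_soil = 0.5              # contaminant concentration in soil, mg/kg (assumed)
ir_soil = 100.0           # soil ingestion rate, mg/day (adult resident, assumed)
ef = 350.0                # exposure frequency, days/yr
ed = 30.0                 # exposure duration, yr
bw = 70.0                 # body weight, kg
at_days = 70.0 * 365.0    # averaging time for carcinogens, days
sf_oral = 1.5             # oral cancer slope factor, (mg/kg-day)^-1 (hypothetical)

cdi = (c_soil * ir_soil * 1e-6 * ef * ed) / (bw * at_days)   # mg/kg-day
risk = cdi * sf_oral
print(f"CDI = {cdi:.2e} mg/kg-day, incremental cancer risk = {risk:.1e}")
```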

  13. Development and application of the EPIC model for carbon cycle, greenhouse-gas mitigation, and biofuel studies

    SciTech Connect

    Izaurralde, Roberto C.; Mcgill, William B.; Williams, J.R.

    2012-06-01

    This chapter provides a comprehensive review of the EPIC model in relation to carbon cycle, greenhouse-gas mitigation, and biofuel applications. From its original capabilities and purpose (i.e., to quantify the impacts of erosion on soil productivity), the EPIC model has evolved into a comprehensive terrestrial ecosystem model for simulating, with more or less process-level detail, many ecosystem processes such as weather, hydrology, plant growth and development, carbon cycle (including erosion), nutrient cycling, greenhouse-gas emissions, and the most complete set of manipulations that can be implemented on a parcel of land (e.g. tillage, harvest, fertilization, irrigation, drainage, liming, burning, pesticide application). The chapter also provides details and examples of the latest efforts in model development such as the coupled carbon-nitrogen model, a microbial denitrification model with feedback to the carbon decomposition model, updates on the calculation of ecosystem carbon balances, and carbon emissions from fossil fuels. The chapter includes examples of applications of the EPIC model in soil carbon sequestration, net ecosystem carbon balance, and biofuel studies. Finally, the chapter provides the reader with an update on upcoming improvements in EPIC such as the addition of modules for simulating biochar amendments, sorption of soluble C in subsoil horizons, nitrification including the release of N2O, and the formation and consumption of methane in soils. Completion of these model development activities will render an EPIC model with one of the most complete representations of biogeochemical processes, capable of simulating the dynamic feedback of soils to climate and management in terms not only of transient processes (e.g., soil water content, heterotrophic respiration, N2O emissions) but also of fundamental soil properties (e.g. soil depth, soil organic matter, soil bulk density, water limits).

  14. The Impact Hazard

    NASA Astrophysics Data System (ADS)

    Morrison, D.

    2009-12-01

    Throughout its existence, Earth has been pummelled by rocks from space. The cratered face of the Moon testifies to this continuing cosmic bombardment, and the 1908 Tunguska impact in Siberia should have been a wake-up call to the impact hazard. For most scientists, however, it was the discovery 30 years ago that the KT mass extinction was caused by an impact that opened our eyes to this important aspect of Earth history -- that some geological and biological changes have an external origin, and that the biosphere is much more sensitive to impact disturbance than was imagined. While life adapts beautifully to slow changes in the environment, a sudden event, like a large impact, can have catastrophic consequences. While we do not face any known hazard today for an extinction-level event, we are becoming aware that more than a million near-Earth asteroids (NEAs) exist with the capacity to take out a city if they hit in the wrong place. The NASA Spaceguard Survey has begun to discover and track the larger NEAs, but we do not yet have the capability to find more than a few percent of the objects as small as the Tunguska impactor (about 40 m diameter). This continuing impact hazard is at roughly the hazard level of volcanic eruptions, including the rare supervolcano eruptions. The difference is that an incoming cosmic projectile can be detected and tracked, and by application of modern space technology, most impactors could be deflected. Impacts are the only natural hazard that can be eliminated. This motivates our NEA search programs such as Spaceguard and argues for extending them to smaller sizes. At the same time we realize that the most likely warning time for the next impact remains a few seconds, and we may therefore need to fall back on the more conventional responses of disaster mitigation and relief.

  15. EU Advanced study course 1997: "Goals and Economic Instruments for the Achievement of Global Warming Mitigation in Europe"

    E-print Network

    Paris-Sud XI, Université de

    Necessities and problems of coupling climate and socioeconomic models: this very broad class of models could also be called applied interdisciplinary models, since they must bridge natural-science traditions established for more than three centuries and the traditions of the social sciences, which are only around 50 years old.

  16. NEOShield - A global approach to NEO Impact Threat Mitigation

    NASA Astrophysics Data System (ADS)

    Michel, Patrick

    2015-03-01

    NEOShield is a European-Union funded project coordinated by the German Aerospace Center, DLR, to address near-Earth object (NEO) impact hazard mitigation issues. The NEOShield consortium consists of 13 research institutes, universities, and industrial partners from 6 countries and includes leading US and Russian space organizations. The project is funded for a period of 3.5 years from January 2012 with a total of 5.8 million euros. The primary aim of the project is to investigate in detail promising mitigation techniques, such as the kinetic impactor, blast deflection, and the gravity tractor, and devise feasible demonstration missions. Options for an international strategy for implementation when an actual impact threat arises will also be investigated. The NEOShield work plan consists of scientific investigations into the nature of the impact hazard and the physical properties of NEOs, and technical and engineering studies of practical means of deflecting NEOs. There exist many ideas for asteroid deflection techniques, many of which would require considerable scientific and technological development. The emphasis of NEOShield is on techniques that are feasible with current technology, requiring a minimum of research and development work. NEOShield aims to provide detailed designs of feasible mitigation demonstration missions, targeting NEOs of the kind most likely to trigger the first space-based mitigation action. Most of the asteroid deflection techniques proposed to date require physical contact with the threatening object, an example being the kinetic impactor. NEOShield includes research into the mitigation-relevant physical properties of NEOs on the basis of remotely-sensed astronomical data and the results of rendezvous missions, the observational techniques required to efficiently gather mitigation-relevant data on the dynamical state and physical properties of a threatening NEO, and laboratory investigations using gas guns to fire projectiles into asteroid regolith analog materials. The gas-gun investigations enable state-of-the-art numerical models to be verified at small scales. Computer simulations at realistic NEO scales are used to investigate how NEOs with a range of properties would respond to a pulse of energy applied in a deflection attempt. The technical work includes the development of crucial technologies, such as the autonomous guidance of a kinetic impactor to a precise point on the surface of the target, and the detailed design of realistic missions for the purpose of demonstrating the applicability and feasibility of one or more of the techniques investigated. Theoretical work on the blast deflection method of mitigation is designed to probe the circumstances in which this last line of defense may be the only viable option and the issues relating to its deployment. A global response campaign roadmap will be developed based on realistic scenarios presented, for example, by the discovery of an object such as 99942 Apophis or 2011 AG5 on a threatening orbit. The work will include considerations of the timeline of orbit knowledge and impact probability development, reconnaissance observations and fly-by or rendezvous missions, the political decision to mount a mitigation attempt, and the design, development, and launch of the mitigation mission. Collaboration with colleagues outside the NEOShield Consortium involved in complementary activities (e.g. under the auspices of the UN, NASA, or ESA) is being sought in order to establish a broad international strategy.
We present a brief overview of the history and planned scope of the project, and progress made to date. The NEOShield project (http://www.neoshield.net) has received funding from the European Union Seventh Framework Program (FP7/2007-2013) under Grant Agreement no. 282703.
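    As a rough companion to the kinetic-impactor technique discussed above, the imparted velocity change is commonly written as dv = beta * m * U / M, where beta is the momentum enhancement factor contributed by ejecta, and the along-track deflection after a lead time t is often approximated as about 3 * dv * t. The sketch below is a hypothetical back-of-the-envelope illustration with invented spacecraft and asteroid parameters, not a NEOShield design figure.

    ```python
    import math

    # Rough kinetic-impactor deflection estimate (all parameters are assumptions).
    # dv = beta * m_sc * U / M_ast; along-track miss distance after lead time t
    # is approximated here as ~3 * dv * t for a velocity-aligned push.

    beta = 2.0                 # momentum enhancement factor (assumed)
    m_sc = 500.0               # spacecraft mass [kg] (assumed)
    U = 10_000.0               # relative impact speed [m/s] (assumed)
    rho, D = 2000.0, 150.0     # asteroid bulk density [kg/m3] and diameter [m] (assumed)

    M_ast = rho * math.pi / 6.0 * D ** 3        # mass of a spherical asteroid [kg]
    dv = beta * m_sc * U / M_ast                # velocity change [m/s]
    lead_time = 10.0 * 365.25 * 86400.0         # 10-year warning time [s]
    miss = 3.0 * dv * lead_time                 # approximate along-track deflection [m]
    print(f"dv ~ {dv * 1000:.2f} mm/s, deflection after 10 yr ~ {miss / 1000:.0f} km")
    ```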

  17. GUIDELINES FOR THE USE OF CHEMICALS IN REMOVING HAZARDOUS SUBSTANCE DISCHARGES

    EPA Science Inventory

    This report was undertaken to develop guidelines on the use of various chemical and biological agents to mitigate discharge of hazardous substances. Eight categories of mitigative agents and their potential uses in removing hazardous substances discharged on land and in waterways...

  18. Flood fatality hazard and flood damage hazard: combining multiple hazard characteristics into meaningful maps for spatial planning

    NASA Astrophysics Data System (ADS)

    de Bruijn, K. M.; Klijn, F.; van de Pas, B.; Slager, C. T. J.

    2015-06-01

    For comprehensive flood risk management, accurate information on flood hazards is crucial. While in the past an estimate of potential flood consequences in large areas was often sufficient to make decisions on flood protection, there is currently an increasing demand to have detailed hazard maps available to be able to consider other risk-reducing measures as well. Hazard maps are a prerequisite for spatial planning, but can also support emergency management, the design of flood mitigation measures, and the setting of insurance policies. The increase in flood risks due to population growth and economic development in hazardous areas in the past shows that sensible spatial planning is crucial to prevent risks increasing further. Assigning the least hazardous locations for development or adapting developments to the actual hazard requires comprehensive flood hazard maps. Since flood hazard is a multi-dimensional phenomenon, many different maps could be relevant. Having large numbers of maps to take into account does not, however, make planning easier. To support flood risk management planning we therefore introduce a new approach in which all relevant flood hazard parameters can be combined into two comprehensive maps of flood damage hazard and flood fatality hazard.
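    The combination step described above, collapsing several hazard characteristics into a single class per map cell, can be sketched as a simple classification over, for example, water depth and rise rate. The thresholds below are invented placeholders that only show the mechanics, not the classification rules actually adopted for the flood fatality hazard map.

    ```python
    import numpy as np

    # Toy combination of flood hazard characteristics into a per-cell fatality
    # hazard class. Thresholds are illustrative placeholders only.

    depth = np.array([[0.2, 1.2], [2.5, 0.8]])       # maximum water depth [m]
    rise_rate = np.array([[0.1, 0.6], [1.5, 0.2]])   # rate of water level rise [m/h]

    hazard = np.zeros_like(depth, dtype=int)          # 0 = low
    hazard[(depth > 0.5) | (rise_rate > 0.5)] = 1     # 1 = moderate
    hazard[(depth > 2.0) & (rise_rate > 1.0)] = 2     # 2 = high (deep and fast-rising)

    print(hazard)   # per-cell flood fatality hazard class
    ```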

  19. Study of regional monsoonal effects on landslide hazard zonation in Cameron Highlands, Malaysia

    Microsoft Academic Search

    Abd Nasir Matori; Abdul Basith; Indra Sati Hamonangan Harahap

    In general, landslides in Malaysia mostly occurred during the northeast and southwest monsoon periods, two monsoonal systems that bring heavy rain. As a consequence, most landslide occurrences were induced by rainfall. This paper reports the effect of monsoonal-related geospatial data in landslide hazard modeling in Cameron Highlands, Malaysia, using Geographic Information System (GIS). Land surface temperature (LST) data was selected as the

  20. Objective assessment of source models for seismic hazard studies: with a worked example from UK data

    Microsoft Academic Search

    R. M. W. Musson; P. W. Winter

    Up to now, the search for increased reliability in probabilistic seismic hazard analysis (PSHA) has concentrated on ways of assessing expert opinion and subjective judgement. Although in some areas of PSHA subjective opinion is unavoidable, there is a danger that assessment procedures and review methods contribute further subjective judgements on top of those already elicited. It is helpful to find

  1. Hazard and operability study of the multi-function Waste Tank Facility. Revision 1

    SciTech Connect

    Hughes, M.E.

    1995-05-15

    The Multi-Function Waste Tank Facility (MWTF) East site will be constructed on the west side of the 200E area and the MWTF West site will be constructed in the SW quadrant of the 200W site in the Hanford Area. This is a description of facility hazards that site personnel or the general public could potentially be exposed to during operation. A list of preliminary Design Basis Accidents was developed.

  2. Comparative hazard identification of nano- and micro-sized cerium oxide particles based on 28-day inhalation studies in rats.

    PubMed

    Gosens, Ilse; Mathijssen, Liesbeth E A M; Bokkers, Bas G H; Muijser, Hans; Cassee, Flemming R

    2014-09-01

    There are many uncertainties regarding the hazard of nanosized particles compared to the bulk material of the parent chemical. Here, the authors assess the comparative hazard of two nanoscale (NM-211 and NM-212) and one microscale (NM-213) cerium oxide materials in 28-day inhalation toxicity studies in rats (according to Organisation for Economic Co-operation and Development technical guidelines). All three materials gave rise to a dose-dependent pulmonary inflammation and lung cell damage but without gross pathological changes immediately after exposure. Following NM-211 and NM-212 exposure, epithelial cell injury was observed in the recovery groups. There was no evidence of systemic inflammation or other haematological changes following exposure to any of the three particle types. The comparative hazard was quantified by application of the benchmark concentration approach. The relative toxicity was explored in terms of three exposure metrics. When exposure levels were expressed as mass concentration, nanosized NM-211 was the most potent material, whereas when exposure levels were expressed as surface area concentration, the micro-sized NM-213 material induced the greatest extent of pulmonary inflammation/damage. Particles were equipotent based on particle number concentrations. In conclusion, similar pulmonary toxicity profiles including inflammation are observed for all three materials with little quantitative difference. Systemic effects were virtually absent. There is little evidence for a dominant predicting exposure metric for the observed effects. PMID:23768316

  3. Evaluating a multi-criteria model for hazard assessment in urban design. The Porto Marghera case study

    SciTech Connect

    Luria, Paolo; Aspinall, Peter A

    2003-08-01

    The aim of this paper is to describe a new approach to major industrial hazard assessment, which has recently been studied by the authors in conjunction with the Italian Environmental Protection Agency ('ARPAV'). The real opportunity for developing a different approach arose from the need of the Italian EPA to provide the Venice Port Authority with an appropriate estimation of major industrial hazards in Porto Marghera, an industrial estate near Venice (Italy). However, the standard model, the quantitative risk analysis (QRA), only provided a list of individual quantitative risk values, related to single locations. The experimental model is based on a multi-criteria approach--the Analytic Hierarchy Process--which introduces the use of expert opinions, complementary skills and expertise from different disciplines in conjunction with traditional quantitative analysis. This permitted the generation of quantitative data on risk assessment from a series of qualitative assessments, on the present situation and on three other future scenarios, and the use of this information as indirect quantitative measures, which could be aggregated to obtain a global risk rate. This approach is in line with the main concepts proposed by the latest European directive on Major Hazard Accidents, which recommends increasing the participation of operators, taking the other players into account and, moreover, paying more attention to the concepts of 'urban control', 'subjective risk' (risk perception) and intangible factors (factors not directly quantifiable).
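    The Analytic Hierarchy Process step mentioned above turns pairwise expert judgements into numerical weights and checks their internal consistency. The sketch below shows that mechanics for a hypothetical three-criteria comparison matrix; the judgements are invented, and this is not the model built with ARPAV.

    ```python
    import numpy as np

    # AHP mechanics: priority weights from a pairwise comparison matrix (principal
    # eigenvector) plus the consistency ratio. The judgements below are invented.

    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])       # pairwise comparisons of 3 criteria

    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                          # priority weights, summing to 1

    n = A.shape[0]
    CI = (eigvals.real[k] - n) / (n - 1)  # consistency index
    RI = 0.58                             # Saaty's random index for n = 3
    print("weights:", np.round(w, 3), " consistency ratio:", round(CI / RI, 3))
    ```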

  4. Analyzing electrical hazards in the workplace.

    PubMed

    Neitzel, Dennis K

    2013-10-01

    In resolving the issues in analyzing electrical hazards in an industry, we must follow a path that will lead to a comprehensive analysis of the problems that exist and provide a quantified value to ensure the selection of appropriate personal protective equipment and clothing. An analysis of all three hazards--shock, arc, and blast--must be completed and steps taken to prevent injuries. The following steps could be taken to ensure adequacy of the electrical safe work practices program and training of "qualified" electrical personnel: 1. Conduct a comprehensive Job Task Analysis. 2. Complete a Task Hazard Assessment including: a) shock hazard, b) arc flash hazard, c) arc blast hazard, d) other hazards (slip, fall, struck-by, environmental, etc.). 3. Analyze task for the personal protective equipment needed. 4. Conduct training needs assessment for qualified and non-qualified electrical workers. 5. Revise, update, or publish a complete electrical safe work practices program. Regulatory agencies and standards organizations have long recognized the need to analyze the hazards of electrical work and plan accordingly to mitigate the hazards. Unfortunately, many in the electrical industry have chosen to "take their chances," largely because nothing bad has yet happened. As more information becomes available on the economic and human costs of electrical accidents, it is hoped that more in the industry will recognize the need for systematic hazard analysis and an electrical safe work program that emphasizes hazard identification and abatement. PMID:24358642

  5. Coastal Hazards.

    ERIC Educational Resources Information Center

    Vandas, Steve

    1998-01-01

    Focuses on hurricanes and tsunamis and uses these topics to address other parts of the science curriculum. In addition to a discussion on beach erosion, a poster is provided that depicts these natural hazards that threaten coastlines. (DDR)

  6. Reproductive Hazards

    MedlinePLUS

    ... and female reproductive systems play a role in pregnancy. Problems with these systems can affect fertility and ... a reproductive hazard can cause different effects during pregnancy, depending on when she is exposed. During the ...

  7. Climate change induced landslide hazard mapping over Greece - A case study in Pelion Mountain (SE Thessaly, Central Greece)

    NASA Astrophysics Data System (ADS)

    Angelitsa, Varvara; Loupasakis, Constantinos; Anagnwstopoulou, Christina

    2015-04-01

    Landslides, as a major type of geological hazard, represent one of the natural events that occur most frequently worldwide after hydro-meteorological events. Landslides occur when the stability of a slope changes due to a number of factors, such as steep terrain and prolonged precipitation. Identification of landslides and compilation of landslide susceptibility, hazard and risk maps are very important issues for the public authorities, providing substantial information regarding strategic planning and management of land use. Although landslides cannot be predicted accurately, many attempts have been made to compile these maps. Important factors for the compilation of reliable maps are the quality and the amount of available data and the selection of the best method for the analysis. Numerous studies and publications providing landslide susceptibility, hazard and risk maps for different regions of Greece have been completed up to now. Their common characteristic is that they are static, taking into account parameters like geology, mean annual precipitation, slope, aspect, distance from roads, faults and drainage network, soil capability, land use etc., without introducing the dimension of time. The current study focuses on the Pelion Mountain, which is located at the southeastern part of Thessaly in Central Greece, aiming to compile "dynamic" susceptibility and hazard maps depending on climate change. For this purpose, past and future precipitation data from regional climate model (RCM) datasets are introduced as input parameters for the compilation of "dynamic" landslide hazard maps. Moreover, land motion mapping data produced by Persistent Scatterer Interferometry (PSI) are used for the validation of landslide occurrence during the period from June 1992 to December 2003 and, as a result, for the calibration of the mapping procedure. The PSI data can be applied at a regional scale as support for land motion mapping and at a local scale for the monitoring of single well-known ground motion events. The PSI data were produced within the framework of the Terrafirma project. Terrafirma is a pan-European ground motion information service focused on seismic risk, flood defense, coastal lowland subsidence, inactive mines and hydrogeological risks. The produced maps provide substantial information for land use planning and the civil protection of an area presenting excellent natural beauty and numerous preservable traditional villages. Keywords: landslide, PSI technique, regional climate models, landslide susceptibility maps, Greece

  8. Hazardous Waste

    NSDL National Science Digital Library

    Harris, Kathryn Louise.

    Given media attention to the US Navy's recent problems with the disposal of a large amount of napalm, an incendiary compound, this week's In the News examines the issue of hazardous waste and materials. The eight resources discussed provide information on various aspects of the topic. Due to the large number of companies specializing in the management and remediation of hazardous waste contamination, private firms will not be noted.

  9. Natural and Man-Made Hazards in the Cayman Islands

    NASA Astrophysics Data System (ADS)

    Novelo-Casanova, D. A.; Suarez, G.

    2010-12-01

    Located in the western Caribbean Sea to the northwest of Jamaica, the Cayman Islands are a British overseas territory comprised of three islands: Grand Cayman, Cayman Brac, and Little Cayman. These three islands occupy around 250 km2 of land area. In this work, historical and recent data were collected and classified to identify and rank the natural and man-made hazards that may potentially affect the Cayman Islands and determine the level of exposure of Grand Cayman to these events. With this purpose, we used the vulnerability assessment methodology developed by the North Carolina Department of Environment and Natural Resources. The different degrees of physical vulnerability for each hazard were graphically interpreted with the aid of maps using a relative scoring system. Spatial maps were generated showing the areas of different levels of exposure to multi-hazards. The most important natural hazard to which the Cayman Islands are exposed is clearly hurricanes. To a lesser degree, the islands may be occasionally exposed to earthquakes and tsunamis. Explosions or leaks of the Airport Texaco Fuel Depot and the fuel pipeline at Grand Cayman are the most significant man-made hazards. Our results indicate that there are four areas in Grand Cayman with various levels of exposure to natural and man-made hazards: the North Sound, Little Sound and Eastern West Bay (Area 1) show a very high level of exposure; the Central Mangroves, Central Bodden Town, Central George Town and the West Bay (Area 2) have a high level of exposure; the Northwestern West Bay, Western Georgetown-Bodden Town, and East End-North Side (Area 3) are under moderate levels of exposure. The remainder of the island shows low exposure (Area 4). It is important to underline that this study presents a first evaluation of the main natural and man-made hazards that may affect the Cayman Islands. The maps generated will be useful tools for emergency managers and policy developers and will increase the overall awareness of decision makers for disaster prevention and mitigation plans. Our results constitute the basis of future mitigation risk projects in the islands. Areas showing the level of exposure to natural and man-made hazards at Grand Cayman.

  10. PULSED FOCUSED ULTRASOUND TREATMENT OF MUSCLE MITIGATES PARALYSIS-INDUCED BONE LOSS IN THE ADJACENT BONE: A STUDY IN A MOUSE MODEL

    PubMed Central

    Poliachik, Sandra L.; Khokhlova, Tatiana D.; Wang, Yak-Nam; Simon, Julianna C.; Bailey, Michael R.

    2015-01-01

    Bone loss can result from bed rest, space flight, spinal cord injury or age-related hormonal changes. Current bone loss mitigation techniques include pharmaceutical interventions, exercise, pulsed ultrasound targeted to bone and whole body vibration. In this study, we attempted to mitigate paralysis-induced bone loss by applying focused ultrasound to the midbelly of a paralyzed muscle. We employed a mouse model of disuse that uses onabotulinumtoxinA-induced paralysis, which causes rapid bone loss in 5 d. A focused 2 MHz transducer applied pulsed exposures with pulse repetition frequency mimicking that of motor neuron firing during walking (80 Hz), standing (20 Hz), or the standard pulsed ultrasound frequency used in fracture healing (1 kHz). Exposures were applied daily to calf muscle for 4 consecutive d. Trabecular bone changes were characterized using micro-computed tomography. Our results indicated that application of certain focused pulsed ultrasound parameters was able to mitigate some of the paralysis-induced bone loss. PMID:24857416

  11. Protection of large alpine infrastructures against natural hazards

    NASA Astrophysics Data System (ADS)

    Robl, Jörg; Scheikl, Manfred; Hergarten, Stefan

    2013-04-01

    Large infrastructures in alpine domains are threatened by a variety of natural hazards like debris flows, rock falls and snow avalanches. Linear infrastructure in particular, including roads, railway lines, pipelines and power lines, passes through the entire mountain range, and the impact of natural hazards can be expected along distances of hundreds of kilometers. New infrastructure projects like storage power plants or ski resorts including access roads are often located in remote alpine domains without any historical record of hazardous events. Mitigation strategies against natural hazards require a detailed analysis of the exposure of the infrastructure to natural hazards. Following conventional concepts, extensive mapping and documentation of surface processes over hundreds to several thousand km² of steep alpine terrain would be essential but can hardly be performed. We present a case study from the Austrian Alps to demonstrate the ability of a multi-level concept to describe the impact of natural hazards on infrastructure by an iterative process. This includes new state-of-the-art numerical models, modern field work and GIS analysis with an increasing level of refinement at each stage. A set of new numerical models for rock falls, debris flows and snow avalanches was designed to operate with field information of different quality and spatial resolution. Our analysis starts with simple and fast cellular automata for rockfalls and debris flows to show the exposure of the infrastructure to natural hazards in large domains and to detect "high risk areas" that are investigated in more detail in the field at the next refinement level. Finally, sophisticated 2-D depth-averaged fluid dynamic models for all kinds of rapid mass movements are applied to support the development of protection structures.
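    To give a flavour of the fast screening level mentioned above, a minimal cellular-automaton style runout model can be written as a repeated steepest-descent step over a DEM with an energy-line (reach angle) stopping rule. The sketch below is a deliberately simplified, hypothetical illustration on a synthetic grid with a single assumed reach angle, not one of the models developed in the study.

    ```python
    import numpy as np

    # Toy steepest-descent rockfall runout on a DEM with an energy-line stop rule.
    # Synthetic 10 m grid: a concave slope with a shallow valley along column 25.

    rows = np.linspace(1.0, 0.0, 50) ** 2 * 100.0        # concave slope profile [m]
    cols = 0.5 * np.abs(np.arange(50) - 25)              # valley centred on column 25
    dem = rows[:, None] + cols[None, :]
    cell = 10.0                                          # grid spacing [m]
    tan_reach = np.tan(np.radians(17.0))                 # assumed reach angle

    r, c = 0, 25                                         # source cell
    z0, travelled, n_cells = dem[r, c], 0.0, 1
    while True:
        window = dem[max(r - 1, 0):r + 2, max(c - 1, 0):c + 2]
        dr, dc = np.unravel_index(np.argmin(window), window.shape)
        nr, nc = max(r - 1, 0) + dr, max(c - 1, 0) + dc
        if (nr, nc) == (r, c):                           # local minimum: deposit
            break
        travelled += cell * np.hypot(nr - r, nc - c)
        r, c = nr, nc
        n_cells += 1
        if (z0 - dem[r, c]) / travelled < tan_reach:     # energy line exceeded: stop
            break
    print(f"runout over {n_cells} cells, ~{travelled:.0f} m")
    ```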

  12. USGS National Seismic Hazard Maps

    NSDL National Science Digital Library

    This set of resources provides seismic hazard assessments and information on design values and mitigation for the U.S. and areas around the world. Map resources include the U.S. National and Regional probabilistic ground motion map collection, which covers the 50 states, Puerto Rico, and selected countries. These maps display peak ground acceleration (PGA) values, and are used as the basis for seismic provisions in building codes and for new construction. There is also a custom mapping and analysis tool, which enables users to re-plot these maps for area of interest, get hazard values using latitude/longitude or zip code, find predominant magnitudes and distances, and map the probability of given magnitude within a certain distance from a site. The ground motion calculator, a Java application, determines hazard curves, uniform hazard response spectra, and design parameters for sites in the 50 states and most territories. There is also a two-part earthquake hazards 'primer', which provides links to hazard maps and frequently-asked-questions, and more detailed information for building and safety planners.

  13. Marginal additive hazards model for case-cohort studies with multiple disease outcomes: an application to the Atherosclerosis Risk in Communities (ARIC) study

    PubMed Central

    Kang, Sangwook; Cai, Jianwen; Chambless, Lloyd

    2013-01-01

    In the case-cohort studies conducted within the Atherosclerosis Risk in Communities (ARIC) study, it is of interest to assess and compare the effect of high-sensitivity C-reactive protein (hs-CRP) on the increased risks of incident coronary heart disease and incident ischemic stroke. Empirical cumulative hazards functions for different levels of hs-CRP reveal an additive structure for the risks for each disease outcome. Additionally, we are interested in estimating the difference in the risk for the different hs-CRP groups. Motivated by this, we consider fitting marginal additive hazards regression models for case-cohort studies with multiple disease outcomes. We consider a weighted estimating equations approach for the estimation of model parameters. The asymptotic properties of the proposed estimators are derived and their finite-sample properties are assessed via simulation studies. The proposed method is applied to analyze the ARIC Study. PMID:22826550
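    For readers less familiar with the additive formulation referenced above, the marginal additive hazards model for disease outcome k of subject i can be written as below (generic Lin-Ying style notation, given here only for orientation and not copied from the paper); covariate effects are then differences on the hazard scale rather than hazard ratios, and in the case-cohort design they are estimated with weighted estimating equations using inverse sampling-probability weights.

    ```latex
    % Marginal additive hazards model (generic notation):
    \lambda_{ik}\bigl(t \mid Z_{ik}(t)\bigr) = \lambda_{0k}(t) + \beta_k^{\top} Z_{ik}(t)
    ```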

  14. Rockfall hazard assessment by using terrestrial laser scanning. A case study in Funchal (Madeira)

    NASA Astrophysics Data System (ADS)

    Nguyen, Hieu Trung; Fernandez-Steeger, Tomas; Domingos, Rodriguez; Wiatr, Thomas; Azzam, Rafig

    2010-05-01

    Rockfall hazard assessment in a high-relief volcanic environment is a difficult task, facing the challenge of missing standard rating systems and procedures. As in other mountainous areas, further handicaps are the restricted accessibility of the rock faces and the high effort in terms of time and labour needed to identify and rate these problems. To develop a procedure for rockfall hazard assessment, the island of Madeira is a good research area to investigate rockfalls in a volcanic environment under sub-tropical humid climate conditions. As the entire island is characterised by high mountain ridges and steep, deep valleys in lava flows and tuff layers, the occurrence of rockfalls is a frequent and serious problem. These hazards are among the most frequent causes of severe damage to infrastructure and of fatalities. In this research, slopes in Funchal city have been mapped and investigated regarding their rockfall hazard potential. The analysed slopes are built up of lava flows with column structures and intercalated breccias, pyroclastics or tuff layers. Many of the columns already lack basal support and show a wide joint spacing, threatening houses and streets in the city. TLS data acquisitions in May and December 2008 provide information for detailed structural analysis, detection of unstable areas within a slope and rockfall simulations. High-resolution scans have been recorded on uncovered rock surfaces with detectable joints, while in areas with dense vegetation a lower resolution has been chosen. Although it makes sense to scan an entire area with the best acquirable resolution, the resulting enormous data require powerful computing environments and slow down data processing. To speed up the data processing, a conventional local digital elevation model (DEM) forms the geometric base model. Its main disadvantage is that it cannot represent overhanging parts or notches within the steep slopes, which have an important influence on the accuracy of any rockfall simulation. By integrating the high-resolution TLS scans into the local DEM, an improvement close to a purely high-resolution digital elevation model (HRDEM) can be achieved. The rockfall hazard assessment starts with the comparison of time-shifted datasets and with additional automatic jointing analysis. Based on these data, 3-D displacements and the associated kinematical failure mechanisms can be identified. Using this information, it becomes possible to determine specific parameters for numerical rockfall simulations such as average block sizes, shapes or potential sources. Including additional data like surface roughness, the results of numerical rockfall simulations allow different areas of hazard to be classified based on run-out distances, frequency of impacts and related kinetic energy. The analysis shows that rockfalls preferentially occur in areas where notches and undercuts appear due to the lower erosion resistance of pyroclastics or tuff layers. In case of a rockfall, the typical blocks have a cylindrical shape, a volume of 1 m3 and are able to hit the entire area. The results can help to provide useful information for civil protection and engineering countermeasures. Repeated TLS scans of the same area will continue the observation of the progress of instability and mass movement occurrence.

  15. Studying geodesy and earthquake hazard in and around the New Madrid Seismic Zone

    USGS Publications Warehouse

    Boyd, Oliver Salz; Magistrale, Harold

    2011-01-01

    Workshop on New Madrid Geodesy and the Challenges of Understanding Intraplate Earthquakes; Norwood, Massachusetts, 4 March 2011 Twenty-six researchers gathered for a workshop sponsored by the U.S. Geological Survey (USGS) and FM Global to discuss geodesy in and around the New Madrid seismic zone (NMSZ) and its relation to earthquake hazards. The group addressed the challenge of reconciling current geodetic measurements, which show low present-day surface strain rates, with paleoseismic evidence of recent, relatively frequent, major earthquakes in the region. The workshop presentations and conclusions will be available in a forthcoming USGS open-file report (http://pubs.usgs.gov).

  16. Tests for Hazard Transformation

    Microsoft Academic Search

    Weang Kee Ho; Robin Henderson; Peter M. Philipson

    2010-01-01

    The semiparametric Cox proportional hazards model is routinely adopted to model time-to-event data. Proportionality is a strong assumption, especially when follow-up time, or study duration, is long. Zeng and Lin (J. R. Stat. Soc., Ser. B, 69:1–30, 2007) proposed a useful generalisation through a family of transformation models which allow hazard ratios to vary over time. In this paper we

  17. Combined Hurricane and Earthquake Hazard Component Vulnerability Analysis

    Microsoft Academic Search

    Terri R. Norton; Makola M. Abdullah

    Recent advancements in the area of structural dynamics have allowed civil engineers to better understand and control the response of structures to natural hazards, such as earthquakes and hurricanes. The social and economic impacts of natural disasters are continually increasing due to population growth and increasing number of older buildings with time. Through hazard mitigation engineers are able to reduce

  18. Mitigating odors from animal facilities using biofilters

    Microsoft Academic Search

    Lide Chen

    2008-01-01

    Mitigating odors from livestock sites using biofilters was addressed in this dissertation which is organized in paper format and comprises a literature review paper and two original research papers. In the first literature review paper, both the laboratory and field research from 1997 to 2008 was reviewed to give an up-to-date perspective of studies on the mitigation of odors and

  19. Gracefully Mitigating Breakdowns in Robotic Services

    E-print Network

    Kiesler, Sara

    Gracefully Mitigating Breakdowns in Robotic Services. Min Kyung Lee, Sara Kiesler, Jodi Forlizzi, Siddhartha Srinivasa, Paul Rybski (Human-Computer Interaction Institute; Robotics Institute). The work addresses the question of how such a robot should mitigate errors and aid in service recovery, drawing on a scenario study.

  20. SOME APPLICATIONS OF SEISMIC SOURCE MECHANISM STUDIES TO ASSESSING UNDERGROUND HAZARD.

    USGS Publications Warehouse

    McGarr, A.

    1984-01-01

    Various measures of the seismic source mechanism of mine tremors, such as magnitude, moment, stress drop, apparent stress, and seismic efficiency, can be related directly to several aspects of the problem of determining the underground hazard arising from strong ground motion of large seismic events. First, the relation between the sum of seismic moments of tremors and the volume of stope closure caused by mining during a given period can be used in conjunction with magnitude-frequency statistics and an empirical relation between moment and magnitude to estimate the maximum possible size of tremor for a given mining situation. Second, it is shown that the 'energy release rate,' a commonly-used parameter for predicting underground seismic hazard, may be misleading in that the importance of overburden stress, or depth, is overstated. Third, results involving the relation between peak velocity and magnitude, magnitude-frequency statistics, and the maximum possible magnitude are applied to the problem of estimating the frequency at which design limits of certain underground support equipment are likely to be exceeded.
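    The first relation summarized above, linking the summed seismic moment of tremors to the volume of mining-induced closure, is often written as sum(M0) ~ k * mu * dV; combined with a moment-magnitude relation it gives a crude upper bound on event size. The sketch below illustrates that arithmetic with invented numbers; the scaling constant, shear modulus and closure volume are placeholders, not values from this study.

    ```python
    import math

    # Illustrative use of sum(M0) ~ k * mu * dV (seismic moment vs. stope closure)
    # with the moment-magnitude relation M_w = (2/3) * (log10(M0) - 9.1), M0 in N m.
    # All inputs are placeholders.

    mu = 3.0e10      # shear modulus of the rock mass [Pa] (assumed)
    dV = 1.0e5       # volume of mining-induced closure in the period [m^3] (assumed)
    k = 1.0          # order-of-unity scaling constant (assumed)

    sum_M0 = k * mu * dV                        # total seismic moment budget [N m]
    M0_max = 0.5 * sum_M0                       # suppose half is released in one tremor
    Mw_max = (2.0 / 3.0) * (math.log10(M0_max) - 9.1)
    print(f"moment budget = {sum_M0:.2e} N m, largest-event magnitude ~ {Mw_max:.1f}")
    ```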

  1. Pilot studies of seismic hazard and risk in North Sulawesi Province, Indonesia

    USGS Publications Warehouse

    Thenhaus, P.C.; Hanson, S.L.; Effendi, I.; Kertapati, E.K.; Algermissen, S.T.

    1993-01-01

    Earthquake ground motions in North Sulawesi on soft soil that have a 90% probability of not being exceeded in 560 years are estimated to be 0.63 g (63% of the acceleration of gravity) at Palu, 0.31 g at Gorontalo, and 0.27 g at Manado. Estimated ground motions for rock conditions for the same probability level and exposure time are 56% of those for soft soil. The hazard estimates are obtained from seismic sources that model the earthquake potential to a depth of 100 km beneath northern and central Sulawesi and include the Palu fault zone of western Sulawesi, the North Sulawesi subduction zone, and the southernmost segment of the Sangihe subduction zone beneath the Molucca Sea. An attenuation relation based on Japanese strong-motion data and considered appropriate for subduction environments of the western Pacific was used in the determination of ground motions. Following the 18 April 1990 North Sulawesi earthquake (Ms 7.3), a seismic hazard and risk assessment was carried out. -from Authors
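    The probability statement used above (ground motion with a 90% chance of not being exceeded within a given exposure time) maps to an equivalent return period through the usual Poisson assumption, T = -t / ln(p). The snippet below simply evaluates that relation; the exposure times are examples for orientation, not a re-analysis of the study.

    ```python
    import math

    # Return period T of a ground motion with non-exceedance probability p over an
    # exposure time t, under a Poisson occurrence model: T = -t / ln(p).

    def return_period(p_nonexceed: float, t_years: float) -> float:
        return -t_years / math.log(p_nonexceed)

    for t in (50.0, 560.0):
        print(f"p = 0.90 over {t:5.0f} yr  ->  return period ~ {return_period(0.90, t):6.0f} yr")
    ```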

  2. Recording and cataloging hazards information, revision A

    NASA Technical Reports Server (NTRS)

    Stein, R. J.

    1974-01-01

    A data collection process is described for the purpose of discerning causation factors of accidents, and the establishment of boundaries or controls aimed at mitigating and eliminating accidents. A procedure is proposed that suggests a disciplined approach to hazard identification based on energy interrelationships, together with an integrated control technique which takes the form of checklists.

  3. Voluntary Climate Change Mitigation Actions of Young Adults: A Classification of Mitigators through Latent Class Analysis

    PubMed Central

    Korkala, Essi A. E.; Hugg, Timo T.; Jaakkola, Jouni J. K.

    2014-01-01

    Encouraging individuals to take action is important for the overall success of climate change mitigation. Campaigns promoting climate change mitigation could address particular groups of the population on the basis of what kind of mitigation actions the group is already taking. To increase the knowledge of such groups performing similar mitigation actions, we conducted a population-based cross-sectional study in Finland. The study population comprised 1623 young adults who returned a self-administered questionnaire (response rate 64%). Our aims were to identify groups of people engaged in similar climate change mitigation actions and to study the gender differences in the grouping. We also determined if socio-demographic characteristics can predict group membership. We performed latent class analysis using 14 mitigation actions as manifest variables. Three classes were identified among men: the Inactive (26%), the Semi-active (63%) and the Active (11%), and two classes among women: the Semi-active (72%) and the Active (28%). The Active among both genders were likely to have mitigated climate change through several actions, such as recycling, using environmentally friendly products, preferring public transport, and conserving energy. The Semi-Active had most probably recycled and preferred public transport because of climate change. The Inactive, a class identified among men only, had very probably done nothing to mitigate climate change. Among males, being single or divorced predicted little involvement in climate change mitigation. Among females, those without a tertiary degree and those with an annual income of €16,801 or less were less involved in climate change mitigation. Our results illustrate to what extent young adults are engaged in climate change mitigation, which factors predict little involvement in mitigation, and give insight into which segments of the public could be the audiences of targeted mitigation campaigns. PMID:25054549

  4. Automated Standard Hazard Tool

    NASA Technical Reports Server (NTRS)

    Stebler, Shane

    2014-01-01

    The current system used to generate standard hazard reports is considered cumbersome and iterative. This study defines a structure for this system's process in a clear, algorithmic way so that standard hazard reports and basic hazard analysis may be completed using a centralized, web-based computer application. To accomplish this task, a test server is used to host a prototype of the tool during development. The prototype is configured to easily integrate into NASA's current server systems with minimal alteration. Additionally, the tool is easily updated and provides NASA with a system that may grow to accommodate future requirements and, possibly, different applications. The success of this project is outlined in positive, subjective reviews completed by payload providers and NASA Safety and Mission Assurance personnel. Ideally, this prototype will increase interest in the concept of standard hazard automation and lead to the full-scale production of a user-ready application.

  5. Contributions to the Characterization and Mitigation of Rotorcraft Brownout

    NASA Astrophysics Data System (ADS)

    Tritschler, John Kirwin

    Rotorcraft brownout, the condition in which the flow field of a rotorcraft mobilizes sediment from the ground to generate a cloud that obscures the pilot's field of view, continues to be a significant hazard to civil and military rotorcraft operations. This dissertation presents methodologies for: (i) the systematic mitigation of rotorcraft brownout through operational and design strategies and (ii) the quantitative characterization of the visual degradation caused by a brownout cloud. In Part I of the dissertation, brownout mitigation strategies are developed through simulation-based brownout studies that are mathematically formulated within a numerical optimization framework. Two optimization studies are presented. The first study involves the determination of approach-to-landing maneuvers that result in reduced brownout severity. The second study presents a potential methodology for the design of helicopter rotors with improved brownout characteristics. The results of both studies indicate that the fundamental mechanisms underlying brownout mitigation are aerodynamic in nature, and the evolution of a ground vortex ahead of the rotor disk is seen to be a key element in the development of a brownout cloud. In Part II of the dissertation, brownout cloud characterizations are based upon the Modulation Transfer Function (MTF), a metric commonly used in the optics community for the characterization of imaging systems. The use of the MTF in experimentation is examined first, and the application of MTF calculation and interpretation methods to actual flight test data is described. The potential for predicting the MTF from numerical simulations is examined second, and an initial methodology is presented for the prediction of the MTF of a brownout cloud. Results from the experimental and analytical studies rigorously quantify the intuitively-known facts that the visual degradation caused by brownout is a space and time-dependent phenomenon, and that high spatial frequency features, i.e., fine-grained detail, are obscured before low spatial frequency features, i.e., large objects. As such, the MTF is a metric that is amenable to Handling Qualities (HQ) analyses.
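    Because the Modulation Transfer Function is central to the characterization described above, it may help to recall that, for a one-dimensional cut, it is the magnitude of the Fourier transform of the line spread function, normalized to one at zero spatial frequency. The snippet below computes this for a synthetic Gaussian blur standing in for a brownout cloud's smearing; it is a generic illustration and not the flight-test processing used in the dissertation.

    ```python
    import numpy as np

    # MTF as |FFT| of a line spread function (LSF), normalized so MTF(0) = 1.
    # The LSF here is a synthetic Gaussian blur (an assumption, not measured data).

    dx = 0.001                                   # sample spacing [m] (assumed)
    x = np.arange(-0.256, 0.256, dx)
    sigma = 0.01                                 # blur width [m] (assumed)
    lsf = np.exp(-0.5 * (x / sigma) ** 2)

    mtf = np.abs(np.fft.rfft(lsf))
    mtf /= mtf[0]                                # normalize: MTF(0) = 1
    freqs = np.fft.rfftfreq(x.size, d=dx)        # spatial frequency [cycles/m]

    mtf50 = freqs[np.argmax(mtf < 0.5)]          # frequency where contrast drops to 50%
    print(f"MTF50 ~ {mtf50:.0f} cycles/m")
    ```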

  6. RFI mitigation with the phase-only adaptive beamforming (RFI mitigation workshop, Penticton, July 2004)

    E-print Network

    Ellingson, Steven W.

    RFI mitigation with the phase-only adaptive beamforming. P. A. Fridman, ASTRON. Presentation slides from the RFI mitigation workshop, Penticton, July 2004; the remaining pages cover adaptive phase control.

  7. Preliminary hazards analysis -- vitrification process

    SciTech Connect

    Coordes, D.; Ruggieri, M.; Russell, J.; TenBrook, W.; Yimbo, P. [Science Applications International Corp., Pleasanton, CA (United States)]

    1994-06-01

    This paper presents a Preliminary Hazards Analysis (PHA) for mixed waste vitrification by joule heating. The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during and provides input to project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title 1 and 2 design. The PSAR then leads to performance of the Final Safety Analysis Report performed during the facility's construction and testing. It should be completed before routine operation of the facility commences. This PHA addresses the first four chapters of the safety analysis process, in accordance with the requirements of DOE Safety Guidelines in SG 830.110. The hazards associated with vitrification processes are evaluated using standard safety analysis methods which include: identification of credible potential hazardous energy sources; identification of preventative features of the facility or system; identification of mitigative features; and analyses of credible hazards. Maximal facility inventories of radioactive and hazardous materials are postulated to evaluate worst case accident consequences. These inventories were based on DOE-STD-1027-92 guidance and the surrogate waste streams defined by Mayberry, et al. Radiological assessments indicate that a facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous materials assessment indicates that a Mixed Waste Vitrification facility will be a Low Hazard facility having minimal impacts to offsite personnel and the environment.

  8. Volcanic Hazard Education through Virtual Field studies of Vesuvius and Laki Volcanoes

    NASA Astrophysics Data System (ADS)

    Carey, S.; Sigurdsson, H.

    2011-12-01

    Volcanic eruptions pose significant hazards to human populations and have the potential to cause significant economic impacts as shown by the recent ash-producing eruptions in Iceland. Demonstrating both the local and global impact of eruptions is important for developing an appreciation of the scale of hazards associated with volcanic activity. In order to address this need, Web-based virtual field exercises at Vesuvius volcano in Italy and Laki volcano in Iceland have been developed as curriculum enhancements for undergraduate geology classes. The exercises are built upon previous research by the authors dealing with the 79 AD explosive eruption of Vesuvius and the 1783 lava flow eruption of Laki. Quicktime virtual reality images (QTVR), video clips, user-controlled Flash animations and interactive measurement tools are used to allow students to explore archeological and geological sites, collect field data in an electronic field notebook, and construct hypotheses about the impacts of the eruptions on the local and global environment. The QTVR images provide 360° views of key sites where students can observe volcanic deposits and formations in the context of a defined field area. Video sequences from recent explosive and effusive eruptions of Caribbean and Hawaiian volcanoes are used to illustrate specific styles of eruptive activity, such as ash fallout, pyroclastic flows and surges, and lava flows and their effects on the surrounding environment. The exercises use an inquiry-based approach to build critical relationships between volcanic processes and the deposits that they produce in the geologic record. A primary objective of the exercises is to simulate the role of a field volcanologist who collects information from the field and reconstructs the sequence of eruptive processes based on specific features of the deposits. Testing of the Vesuvius and Laki exercises in undergraduate classes from a broad spectrum of educational institutions shows a preference for the web-based interactive tools compared with traditional paper-based laboratory exercises. The exercises are freely accessible for undergraduate classes such as introductory geology, geologic hazards, or volcanology. Accompanying materials, such as lecture-based Powerpoint presentations about Vesuvius and Laki, are also being developed for instructors to better integrate the web-based exercises into their existing curriculum.

  9. Beach hazard and susceptibility to inundation and erosion. Case studies in the west coast of Portugal.

    NASA Astrophysics Data System (ADS)

    Trindade, Jorge; Ramos-Pereira, Ana

    2010-05-01

    Hydrodynamic forces over the beach sediments are the main driving factors affecting the frequency and magnitude of morphological changes in beach systems. Most of the time, these driving factors act in a foreseeable way and do not represent any danger to coastal systems or to their populations. However, hydrodynamic forces are also capable of inducing highly morphodynamic behaviour of the beach profiles, very often in a short period of time, which endangers people and property and leads to system retreat. The most common consequences of the occurrence of this type of phenomena over coastal landforms are coastal inundation and erosion. Still, many coastal systems, and especially beach systems, have recovery mechanisms, and resilience levels play a very important role in assessments of beach morphodynamic state and exposure to potentially damaging events. The wave-dominated Portuguese west coast is a highly energetic environment during winter, with a 2.5 m mean offshore significant wave height. Waves with a 5-year recurrence period can reach 9.2 m and storms are frequent. Beach systems are frequently associated with rocky coasts. In these cases, the subsystems present are beach-dune, beach-cliff and beach-estuary subsystems exposed to the NW Atlantic wave climate. The aim of this research is to assess beach hazard and susceptibility to inundation and erosion. Three beach systems were selected and monitored applying a sequential profiling methodology over a three-year period (2004-2007). Sta. Rita, Azul and Foz do Lizandro beaches are representative systems of the coastal stretch between Peniche and Cascais, which is a cliff-dominated coast. Results from the monitoring campaigns are presented, including volume budgets, beach face slope changes, berm occurrence and heights, and planimetric coastline dynamics. A hazard and susceptibility assessment scheme and zonation are proposed, including the parameterization of local flood levels (i.e. mean sea, maximum spring tide, and storm surge and run-up levels) and erosion potentials (i.e. volume budget and beach planimetric dynamics).
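    One common way to parameterize the local flood levels listed above is to stack tide, storm surge and wave run-up into a total water level, with the 2% exceedance run-up estimated from an empirical formula such as that of Stockdon et al. (2006). The snippet below sketches that composition; the wave, tide and surge values and the beach slope are invented for illustration and are not measurements from Sta. Rita, Azul or Foz do Lizandro.

    ```python
    import math

    # Total water level = spring tide + storm surge + 2% exceedance run-up (R2%),
    # with R2% from the Stockdon et al. (2006) empirical formula. Inputs invented.

    g = 9.81
    H0, T = 3.0, 10.0            # offshore significant wave height [m], peak period [s]
    beta = 0.08                  # beach face slope (assumed)
    tide, surge = 1.9, 0.5       # maximum spring tide and storm surge levels [m]

    L0 = g * T ** 2 / (2.0 * math.pi)                 # deep-water wavelength [m]
    setup = 0.35 * beta * math.sqrt(H0 * L0)
    swash = math.sqrt(H0 * L0 * (0.563 * beta ** 2 + 0.004)) / 2.0
    R2 = 1.1 * (setup + swash)

    print(f"R2% ~ {R2:.2f} m, total water level ~ {tide + surge + R2:.2f} m above datum")
    ```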

  10. Applications of a Forward-Looking Interferometer for the On-board Detection of Aviation Weather Hazards

    NASA Technical Reports Server (NTRS)

    West, Leanne; Gimmestad, Gary; Smith, William; Kireev, Stanislav; Cornman, Larry B.; Schaffner, Philip R.; Tsoucalas, George

    2008-01-01

    The Forward-Looking Interferometer (FLI) is a new instrument concept for obtaining measurements of potential weather hazards to alert flight crews. The FLI concept is based on high-resolution Infrared (IR) Fourier Transform Spectrometry (FTS) technologies that have been developed for satellite remote sensing, and which have also been applied to the detection of aerosols and gases for other purposes. It is being evaluated for multiple hazards including clear air turbulence (CAT), volcanic ash, wake vortices, low slant range visibility, dry wind shear, and icing, during all phases of flight. Previous sensitivity and characterization studies addressed the phenomenology that supports detection and mitigation by the FLI. Techniques for determining the range, and hence warning time, were demonstrated for several of the hazards, and a table of research instrument parameters was developed for investigating all of the hazards discussed above. This work supports the feasibility of detecting multiple hazards with an FLI multi-hazard airborne sensor, and for producing enhanced IR images in reduced visibility conditions; however, further research must be performed to develop a means to estimate the intensities of the hazards posed to an aircraft and to develop robust algorithms to relate sensor measurables to hazard levels. In addition, validation tests need to be performed with a prototype system.

  11. Flood hazard and flood risk assessment using a time series of satellite images: a case study in Namibia.

    PubMed

    Skakun, Sergii; Kussul, Nataliia; Shelestov, Andrii; Kussul, Olga

    2014-08-01

    In this article, the use of a time series of satellite imagery for flood hazard mapping and flood risk assessment is presented. Flooded areas are extracted from satellite images for the flood-prone territory, and a maximum flood extent image for each flood event is produced. These maps are further fused to determine the relative frequency of inundation (RFI). The study shows that RFI values and relative water depth exhibit the same probabilistic distribution, which is confirmed by the Kolmogorov-Smirnov test. The produced RFI map can be used as a flood hazard map, especially in cases when flood modeling is complicated by lack of available data and high uncertainties. The derived RFI map is further used for flood risk assessment. Efficiency of the presented approach is demonstrated for the Katima Mulilo region (Namibia). A time series of Landsat-5/7 satellite images acquired from 1989 to 2012 is processed to derive the RFI map using the presented approach. The following direct damage categories are considered in the study for flood risk assessment: dwelling units, roads, health facilities, and schools. The produced flood risk map shows that the risk is distributed uniformly all over the region. The cities and villages with the highest risk are identified. The proposed approach has minimum data requirements, and RFI maps can be generated rapidly to assist rescuers and decision makers in case of emergencies. On the other hand, limitations include strong dependence on the available data sets and limitations in simulations with extrapolated water depth values. PMID:24372226
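    The relative frequency of inundation layer described above is essentially a per-pixel average of binary flood-extent masks over the archive of events. The fragment below sketches that aggregation for a synthetic stack of masks; it is not the processing chain applied to the Landsat archive in the paper.

    ```python
    import numpy as np

    # Relative frequency of inundation (RFI): per-pixel fraction of flood events in
    # which the pixel was mapped as inundated. The masks here are synthetic.

    rng = np.random.default_rng(0)
    n_events, ny, nx = 12, 4, 5
    flood_masks = rng.random((n_events, ny, nx)) < 0.3     # stack of binary flood extents

    rfi = flood_masks.mean(axis=0)                         # values in [0, 1]
    hazard_class = np.digitize(rfi, bins=[0.1, 0.3, 0.6])  # coarser hazard classes 0..3
    print(np.round(rfi, 2))
    print(hazard_class)
    ```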

  12. Main natural hazards and vulnerability studies for some historical constructions and urban sectors of Valparaiso City (Chile)

    NASA Astrophysics Data System (ADS)

    Romanelli, F.

    2009-04-01

    The Project "MAR VASTO" ("Risk Management in Valparaíso/Manejo de Riesgos en Valparaíso, Servicios Técnicos", 2007) started in March 2007, with the coordination of ENEA (Italian Agency for New Technologies, Energy and Environment), the participation of several partners (Italy: University of Ferrara, Faculties of Architecture and Engineering; University of Padua, Faculty of Engineering; Abdus Salam International Centre for Theoretical Physics of Trieste; Chile: Valparaíso Technical University Federico Santa Maria, Civil Works Department; Santiago University of Chile, Division Structures Constructions Geotechnics), and the support of local stakeholders. As Valparaíso has been included since 2003 in the UNESCO World Heritage List of protected sites, the project's main goals are the following: to collect, analyze and elaborate the existing information, with a satisfying evaluation of the main hazards; to develop a GIS digital archive, well organized, user-friendly and easy to implement in the future, providing maps and scenarios of specific and multiple risk; to provide a vulnerability analysis for three historical churches (San Francisco del Baron, Las Hermanitas de la Providencia, and La Matríz, built of various materials - masonry, concrete, wood and adobe - and located in different city sites) and for a building stock in the Cerro Cordillera (partially inside the UNESCO area), analyzing more than 200 constructions; and to suggest guidelines for future urban planning and strengthening interventions. In the framework of the MAR VASTO Project, the most important hazards have been investigated. With regard to seismic hazard, "state-of-the-art" information has been provided by Chilean partners and stakeholders, drawing on several studies and on material stored in original earthquake reports, local newspapers and magazines. The activities performed by the Italian team regarded the definition, for the city of Valparaiso, of earthquake scenarios and maps based on the neo-deterministic approach. With regard to tsunami, the information has been provided by SHOA (Servicio Hidrografico y Oceanografico de la Armada de Chile). Tsunami scenarios and inundation maps (for both the 1906 and 1985 earthquakes) have been evaluated by the Italian team, taking into account also worse scenarios (namely the 1730 seismic event). Landslide hazard (identifying main flow areas and pointing out the most affected zones, with a deeper investigation in the Cerro Cordillera, the pilot area for the MAR VASTO Project) and fire hazard have also been evaluated. Finally, a GIS database has been developed to store the hazard maps produced in the project. In addition, the GIS database has been verified using the data obtained by a DGPS survey performed during the in situ work in Valparaiso. In the framework of the MAR VASTO Project, a building stock located in the Cerro Cordillera, partially inside the UNESCO area, has been investigated. The work done by the Italian team in this Cerro Cordillera sector can be considered a pilot experience that could be extended to the whole Valparaiso city area within the framework of the town planning currently in progress.

  13. A Probabilistic Tsunami Hazard Assessment for Western Australia

    NASA Astrophysics Data System (ADS)

    Burbidge, David; Cummins, Phil R.; Mleczko, Richard; Thio, Hong Kie

    2008-12-01

    The occurrence of the Indian Ocean Tsunami on 26 December, 2004 has raised concern about the difficulty in determining appropriate tsunami mitigation measures in Australia, due to the lack of information on the tsunami threat. A first step in the development of such measures is a tsunami hazard assessment, which gives an indication of which areas of coastline are most likely to experience tsunamis, and how likely such events are. Here we present the results of a probabilistic tsunami hazard assessment for Western Australia (WA). Compared to other parts of Australia, the WA coastline experiences a relatively high frequency of tsunami occurrence. This hazard is due to earthquakes along the Sunda Arc, south of Indonesia. Our work shows that large earthquakes offshore of Java and Sumba are likely to be a greater threat to WA than those offshore of Sumatra or elsewhere in Indonesia. A magnitude 9 earthquake offshore of the Indonesian islands of Java or Sumba has the potential to significantly impact a large part of the West Australian coastline. The level of hazard varies along the coast, but is highest along the coast from Carnarvon to Dampier. Tsunamis generated by other sources (e.g., large intra-plate events, volcanoes, landslides and asteroids) were not considered in this study.

  14. Mapping of hazard from rainfall-triggered landslides in developing countries: Examples from Honduras and Micronesia

    Microsoft Academic Search

    Edwin L. Harp; Mark E. Reid; Jonathan P. McKenna; John A. Michael

    2009-01-01

    Loss of life and property caused by landslides triggered by extreme rainfall events demonstrates the need for landslide-hazard assessment in developing countries where recovery from such events often exceeds the country's resources. Mapping landslide hazards in developing countries where the need for landslide-hazard mitigation is great but the resources are few is a challenging, but not intractable problem. The minimum

  15. MITIGATING WETLANDS LOSSES

    EPA Science Inventory

    Under Section 404 of the Clean Water Act, compensatory mitigation of wetlands is required to offset any unavoidable adverse impacts to wetlands that cannot otherwise be minimized. Compensatory mitigation usually is in the form of restoration, enhancement, or creation of new wetl...

  16. Long term performance of radon mitigation systems

    SciTech Connect

    Prill, R.; Fisk, W.J.

    2002-03-01

    Researchers installed radon mitigation systems in 12 houses in Spokane, Washington and Coeur d'Alene, Idaho during the 1985-1986 heating season and continued to monitor indoor radon quarterly and annually for ten years. The mitigation systems included active sub-slab ventilation, basement over-pressurization, and crawlspace isolation and ventilation. The occupants reported various operational problems with these early mitigation systems. The long-term radon measurements were essential for tracking the effectiveness of the mitigation systems over time. All 12 homes were visited during the second year of the study, and a second set of 5 homes was visited during the fifth year to determine the cause(s) of increased radon in the homes. During these visits, the mitigation systems were inspected and measurements of system performance were made. Maintenance and modifications were performed to improve system performance in these homes.

  17. Augmented Reality Cues and Elderly Driver Hazard Perception

    PubMed Central

    Schall, Mark C.; Rusch, Michelle L.; Lee, John D.; Dawson, Jeffrey D.; Thomas, Geb; Aksan, Nazan; Rizzo, Matthew

    2013-01-01

    Objective Evaluate the effectiveness of augmented reality (AR) cues in improving driving safety in elderly drivers who are at increased crash risk due to cognitive impairments. Background Cognitively challenging driving environments pose a particular crash risk for elderly drivers. AR cueing is a promising technology to mitigate risk by directing driver attention to roadway hazards. This study investigates whether AR cues improve or interfere with hazard perception in elderly drivers with age-related cognitive decline. Methods Twenty elderly licensed drivers (mean = 73 years, SD = 5 years) with a range of cognitive abilities, measured by a speed-of-processing (SOP) composite, participated in a one-hour drive in an interactive, fixed-base driving simulator. Each participant drove through six straight, six-mile-long rural roadway scenarios following a lead vehicle. AR cues directed attention to potential roadside hazards in three of the scenarios, and the other three were uncued (baseline) drives. Effects of AR cueing were evaluated with respect to: 1) detection of hazardous target objects, 2) interference with detecting nonhazardous secondary objects, and 3) impairment in maintaining a safe distance behind a lead vehicle. Results AR cueing improved the detection of hazardous target objects of low visibility. AR cues did not interfere with detection of nonhazardous secondary objects and did not impair the ability to maintain a safe distance behind a lead vehicle. SOP capacity did not moderate those effects. Conclusion AR cues show promise for improving elderly driver safety by increasing hazard detection likelihood without interfering with other driving tasks such as maintaining safe headway. PMID:23829037

  18. Historical analysis of US pipeline accidents triggered by natural hazards

    NASA Astrophysics Data System (ADS)

    Girgin, Serkan; Krausmann, Elisabeth

    2015-04-01

    Natural hazards, such as earthquakes, floods, landslides, or lightning, can initiate accidents in oil and gas pipelines with potentially major consequences on the population or the environment due to toxic releases, fires and explosions. Accidents of this type are also referred to as Natech events. Many major accidents highlight the risk associated with natural-hazard impact on pipelines transporting dangerous substances. For instance, in the USA in 1994, flooding of the San Jacinto River caused the rupture of 8 and the undermining of 29 pipelines by the floodwaters. About 5.5 million litres of petroleum and related products were spilled into the river and ignited. As a result, 547 people were injured and significant environmental damage occurred. Post-incident analysis is a valuable tool for better understanding the causes, dynamics and impacts of pipeline Natech accidents in support of future accident prevention and mitigation. Therefore, data on onshore hazardous-liquid pipeline accidents collected by the US Pipeline and Hazardous Materials Safety Administration (PHMSA) were analysed. For this purpose, a database-driven incident data analysis system was developed to aid the rapid review and categorization of PHMSA incident reports. Using an automated data-mining process followed by a peer review of the incident records and supported by natural hazard databases and external information sources, the pipeline Natechs were identified. As a by-product of the data-collection process, the database now includes over 800,000 incidents from all causes in industrial and transportation activities, which are automatically classified in the same way as the PHMSA records. This presentation describes the data collection and reviewing steps conducted during the study, provides information on the developed database and data analysis tools, and reports the findings of a statistical analysis of the identified hazardous liquid pipeline incidents in terms of accident dynamics and consequences.

  19. Mitigation Action Plan

    SciTech Connect

    Not Available

    1994-02-01

    This Mitigation Action Plan (MAP) focuses on mitigation commitments stated in the Supplemental Environmental Impact Statement (SEIS) and the Record of Decision (ROD) for the Naval Petroleum Reserve No. 1 (NPR-1). Specific commitments and mitigation implementation actions are listed in Appendix A, Mitigation Actions, and form the central focus of this MAP. They will be updated as needed to allow for organizational, regulatory, or policy changes. It is the intent of DOE to comply with all applicable federal, state, and local environmental, safety, and health laws and regulations. Eighty-six specific commitments were identified in the SEIS and associated ROD which pertain to continued operation of NPR-1 with petroleum production at the Maximum Efficient Rate (MER). The mitigation measures proposed are expected to reduce impacts as much as feasible; however, as experience is gained in the actual implementation of these measures, some changes may be warranted.

  20. Study of the seismicity temporal variation for the current seismic hazard evaluation in Val d'Agri, Italy

    NASA Astrophysics Data System (ADS)

    Baskoutas, I.; D'Alessandro, A.

    2014-12-01

    This study examines the temporal variation of seismicity in Val d'Agri (southern Italy) and adjacent areas for the purpose of current seismic hazard evaluation. The temporal variation of seismicity is expressed as time series of the number of earthquakes, the b value of the Gutenberg-Richter (frequency-magnitude) relationship, and the seismic energy released, expressed as logE^(2/3). The analysis was performed with a new research tool that includes visualization techniques supporting the interactive exploration and interpretation of temporal variation changes. The obtained time series show a precursory seismicity pattern, characterized by low and high probability periods, preceding earthquakes of magnitude M >= 4.0. Seventy-five percent of the examined cases were successfully correlated with a change in the seismicity pattern. The average durations of the low and high probability periods are 10.6 and 13.8 months, respectively. These results indicate that monitoring the temporal variation of seismicity in a given area and recognizing the low and high probability periods can contribute to the evaluation, at regular monthly intervals, of the current seismic hazard status.
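
    The two seismicity indicators named above have standard closed-form estimators, sketched below. This is a minimal illustration rather than the authors' tool: it assumes the maximum-likelihood (Aki) estimator for the b value and the common Gutenberg-Richter energy relation log10 E = 1.5 M + 4.8 (E in joules) for the energy term; the magnitudes and completeness threshold are invented.

        import numpy as np

        def b_value(mags, mc, dm=0.1):
            """Maximum-likelihood b value (Aki, 1965) for magnitudes >= mc,
            with a correction of half the magnitude bin width dm."""
            m = np.asarray(mags, dtype=float)
            m = m[m >= mc]
            return np.log10(np.e) / (m.mean() - (mc - dm / 2.0))

        def log_energy_release(mags):
            """log10 of the summed E^(2/3) terms for one time window, using
            the Gutenberg-Richter energy relation log10 E = 1.5 M + 4.8 (J)."""
            energy = 10.0 ** (1.5 * np.asarray(mags, dtype=float) + 4.8)
            return np.log10(np.sum(energy ** (2.0 / 3.0)))

        # One monthly window: event count, b value and energy proxy.
        mags = np.array([2.1, 2.4, 3.0, 2.2, 2.8, 4.1, 2.5])
        print(len(mags), b_value(mags, mc=2.0), log_energy_release(mags))

    Repeating the calculation in a sliding monthly window yields the three time series whose joint changes define the low and high probability periods discussed in the abstract.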

  1. A performance improvement case study in aircraft maintenance and its implications for hazard identification.

    PubMed

    Ward, Marie; McDonald, Nick; Morrison, Rabea; Gaynor, Des; Nugent, Tony

    2010-02-01

    Aircraft maintenance is a highly regulated, safety critical, complex and competitive industry. There is a need to develop innovative solutions to address process efficiency without compromising safety and quality. This paper presents the case that in order to improve a highly complex system such as aircraft maintenance, it is necessary to develop a comprehensive and ecologically valid model of the operational system, which represents not just what is meant to happen, but what normally happens. This model then provides the backdrop against which to change or improve the system. A performance report, the Blocker Report, specific to aircraft maintenance and related to the model was developed gathering data on anything that 'blocks' task or check performance. A Blocker Resolution Process was designed to resolve blockers and improve the current check system. Significant results were obtained for the company in the first trial and implications for safety management systems and hazard identification are discussed. Statement of Relevance: Aircraft maintenance is a safety critical, complex, competitive industry with a need to develop innovative solutions to address process and safety efficiency. This research addresses this through the development of a comprehensive and ecologically valid model of the system linked with a performance reporting and resolution system. PMID:20099178

  2. A Preliminary Study of Some Health Hazards in the Plasma Jet Process

    PubMed Central

    Hickish, D. E.; Challen, P. J. R.

    1963-01-01

    A brief technical description is given of the plasma jet process, and reference is made to its likely practical applications in industry. An opportunity has been taken during experiments with a prototype plasma jet to assess some of the health hazards which might arise from these industrial applications and to indicate the type of precautions which should be observed in practice. Measurements and analysis of the noise emitted during the operation of a jet showed that the sound intensities ranged from 79·5 to 90·5 dB (re 0·0002 dynes/cm²) per octave band between 300 and 10,000 cycles/second. Three male volunteers exposed to the noise for a period of one hour were subsequently found to have a mean temporary threshold shift of 19 dB at 4,000 cycles/second. Air sampling and analysis for ozone and nitrogen dioxide in the near vicinity of the jet gave a negative result for the former substance but demonstrated that the latter contaminant was present in concentrations ranging from 0·1 to 9·6 p.p.m. PMID:13961129

  3. Pre-Disaster Mitigation Plan Montana State University -Bozeman

    E-print Network

    Maxwell, Bruce D.

    Resistant University (DRU) Program. At that time, FEMA officials recognized the major role universities) Earthquake 3) Structure Fire 4) Severe Summer Weather 5) Hazardous Material Incidents 6) Utility Interruption. Top priority mitigation projects are: Implement risk reduction measures into future buildings and

  4. Lunar Dust Mitigation Technology Development

    NASA Technical Reports Server (NTRS)

    Hyatt, Mark J.; Deluane, Paul B.

    2008-01-01

    NASA's plans for implementing the Vision for Space Exploration include returning to the moon as a stepping stone for further exploration of Mars and beyond. Dust on the lunar surface has a ubiquitous presence which must be explicitly addressed during upcoming human lunar exploration missions. While the operational challenges attributable to dust during the Apollo missions did not prove critical, the comparatively long duration of impending missions presents a different challenge. Near-term plans to revisit the moon place a primary emphasis on the characterization and mitigation of lunar dust. Comprised of regolith particles ranging in size from tens of nanometers to microns, lunar dust is a manifestation of the complex interaction of the lunar soil with multiple mechanical, electrical, and gravitational effects. The environmental and anthropogenic factors affecting the perturbation, transport, and deposition of lunar dust must be studied in order to mitigate its potentially harmful effects on exploration systems. This paper presents the current perspective and implementation of dust knowledge management and integration, and mitigation technology development activities within NASA's Exploration Technology Development Program (ETDP). This work is presented within the context of the Constellation Program's Integrated Lunar Dust Management Strategy. The Lunar Dust Mitigation Technology Development project has been implemented within the ETDP. Project scope and plans will be presented, along with a perspective on lessons learned from Apollo and forensic engineering studies of Apollo hardware. This paper further outlines the scientific basis for lunar dust behavior, its characteristics and potential effects, and surveys several potential strategies for its control and mitigation, both for lunar surface operations and within the working volumes of a lunar outpost.

  5. Towards a Proactive Risk Mitigation Strategy at La Fossa Volcano, Vulcano Island

    NASA Astrophysics Data System (ADS)

    Biass, S.; Gregg, C. E.; Frischknecht, C.; Falcone, J. L.; Lestuzzi, P.; di Traglia, F.; Rosi, M.; Bonadonna, C.

    2014-12-01

    A comprehensive risk assessment framework was built to develop proactive risk reduction measures for Vulcano Island, Italy. This framework includes the identification of eruption scenarios, probabilistic hazard assessment, quantification of hazard impacts on the built environment, an accessibility assessment for the island, and a risk perception study. Vulcano, a 21 km² island with two primary communities hosting 900 permanent residents and up to 10,000 visitors during summer, shows a strong dependency on the mainland for basic needs (water, energy) and relies on a ~2 month tourism season for its economy. The recent stratigraphy reveals a dominance of vulcanian and subplinian eruptions, producing a range of hazards acting at different time scales. We developed new methods to probabilistically quantify the hazard related to ballistics, lahars and tephra for all eruption styles. We also elaborated field- and GIS-based methods to assess the physical vulnerability of the built environment and created dynamic models of accessibility. Results outline the difference in hazard between short and long-lasting eruptions. A subplinian eruption has a 50% probability of impacting ~30% of the buildings within days after the eruption, but the year-long damage resulting from a long-lasting vulcanian eruption is similar if tephra is not removed from rooftops. Similarly, a subplinian eruption results in a volume of 7×10⁵ m³ of material potentially remobilized into lahars soon after the eruption. Similar volumes are expected for vulcanian activity over years, increasing the hazard of small lahars. Preferential lahar paths affect critical infrastructure lacking redundancy, such as the road network, communications systems, the island's only gas station, and access to the island's two evacuation ports. Such results on hazard and on physical and systemic vulnerability help establish proactive volcanic risk mitigation strategies and may be applicable in other island settings.

  6. Reducing hazardous waste incinerator emissions through blending: A study of 1,1,1-trichloroethane injection

    SciTech Connect

    Thomson, M.; Koshland, C.P.; Sawyer, R.F. [Univ. of California, Berkeley, CA (United States)] [and others]

    1996-12-31

    We investigate whether blending liquid hazardous wastes with hydrocarbons such as alkanes can improve the destruction efficiency and reduce combustion byproduct levels in the post-flame region of a laboratory-scale combustor. Outlet species concentrations are measured with an FTIR spectrometer for mixtures of 1,1,1-trichloroethane and 25% (by volume) dodecane or heptane injected as a spray of droplets. We also inject sprays of liquid pure 1,1,1-trichloroethane, gaseous pure 1,1,1-trichloroethane, and gaseous 1,1,1-trichloroethane with 25% (by volume) heptane. Once vaporized, the 1,1,1-trichloroethane decomposes to form CO₂ and HCl through the intermediates 1,1-dichloroethylene, phosgene, acetylene, and carbon monoxide. The 1,1,1-trichloroethane/alkane mixtures also form the intermediate ethylene. No significant differences are observed between injecting the compounds as a droplet spray or as a gaseous jet, not an unexpected result since the mixing time of the gas jet is longer than the vaporization time of the droplets. The addition of heptane or dodecane to 1,1,1-trichloroethane produces two principal effects: an increase in ethylene, acetylene and carbon monoxide levels for injection temperatures between 950 and 1040 K, and a decrease in 1,1-dichloroethylene, phosgene, acetylene, and carbon monoxide levels for injection temperatures greater than 1050 K. Reaction of the injected alkane causes the former effect, while the additional heat of combustion of the alkane additives causes the latter. 17 refs., 6 figs., 3 tabs.

  7. Flood Hazard Recurrence Frequencies for the Savannah River Site

    SciTech Connect

    Chen, K.F.

    2001-07-11

    Department of Energy (DOE) regulations outline the requirements for Natural Phenomena Hazard (NPH) mitigation for new and existing DOE facilities. The NPH considered in this report is flooding. The facility-specific probabilistic flood hazard curve defines, as a function of water elevation, the annual probability of occurrence or the return period in years. The facility-specific probabilistic flood hazard curves provide a basis for avoiding unnecessary facility upgrades, establishing appropriate design criteria for new facilities, and developing emergency preparedness plans to mitigate the consequences of floods. A method based on precipitation, basin runoff and open-channel hydraulics was developed to determine probabilistic flood hazard curves for the Savannah River Site (SRS). The calculated flood hazard curves show that the annual probabilities of flooding existing major SRS facilities are significantly less than 1.0E-05 per year.
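
    The relationship between annual exceedance probability and return period that defines such a hazard curve can be shown with a short empirical sketch. This is a generic plotting-position example on made-up annual maxima, not the precipitation-runoff-hydraulics method used in the report.

        import numpy as np

        def flood_hazard_curve(annual_max_elevations):
            """Empirical hazard curve from a record of annual maximum water
            elevations, using the Weibull plotting position P = rank / (n + 1).
            Returns (elevation, annual exceedance probability, return period)."""
            z = np.sort(np.asarray(annual_max_elevations, dtype=float))[::-1]
            rank = np.arange(1, z.size + 1)
            p_exceed = rank / (z.size + 1.0)    # annual probability of exceedance
            return z, p_exceed, 1.0 / p_exceed  # return period in years

        # Hypothetical annual maximum stages (m above a local datum).
        stages = [2.1, 3.4, 2.8, 4.9, 3.0, 2.5, 3.7, 5.6, 2.9, 3.2]
        for z, p, t in zip(*flood_hazard_curve(stages)):
            print(f"{z:4.1f} m   P = {p:4.2f}/yr   T = {t:5.1f} yr")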

  8. ETINDE. Improving the role of a methodological approach and ancillary ethnoarchaeological data application for place vulnerability and resilience to a multi-hazard environment: Mt. Cameroon volcano case study [MIA-VITA project - FP7-ENV-2007-1]

    NASA Astrophysics Data System (ADS)

    Ilaria Pannaccione Apa, Maria; Kouokam, Emmanuel; Mbe Akoko, Robert; Peppoloni, Silvia; Fabrizia Buongiorno, Maria; Thierry, Pierre

    2013-04-01

    The FP7 MIA-VITA [Mitigate and assess risk from volcanic impact on terrain and human activities] project has been designed to address multidisciplinary aspects of volcanic threat assessment and management, from prevention to crisis management and recovery. In the socio-economic analysis carried out on the Bakweri and Bakossi ethnic groups at Mt. Cameroon, ancillary ethnoarchaeological information has been included to point out the cultural interaction between the volcano and its residents. In 2009-2011, ethnoanthropological surveys and interviews for data collection were carried out in the Buea, Limbe, West Coast, Tiko and Muyuka sub-divisions adjacent to Mt. Cameroon. One outstanding result emerges from the study of Bakweri and Bakossi cultural tradition: natural hazards are managed and produced by supernatural forces, such as Epasa Moto, god of the mountain (Mt. Cameroon volcano), and Nyango Na Nwana, goddess of the sea (Gulf of Guinea). In the case of Mount Cameroon, people may seek the spirit or gods of the mountain before farming, hunting and, most recently, the undertaking of the annual Mount Cameroon race. The spirit of the mountain must also be sought to avert or stop a volcanic eruption, because an eruption is attributed to the anger of the spirit. Among the Northern Bakweri, the association of spirits with the mountain can also be explained in terms of the importance of the mountain to the people. Most of their farming and hunting is done on the mountain, and some forest products, for instance wood for building and furniture, are obtained from its forest; this implies that the people rely on the mountain for food, game, architecture and furniture. In addition, the eruption of the mountain is something which affects the people: it not only destroys property, it frustrates people and takes away human lives when it occurs. Because of this economic importance of the mountain and its unexpected and unwanted eruptions, the tendency is to believe that it hosts a supernatural force: the god Epasa Moto. Since the social group considers itself forever indebted to the gods because of its deceptive behaviour, it must make amends to calm their anger. Rites are managed by traditional chiefs in the name of the group, making offerings and sacrifices whose preciousness is directly proportional to the request. The perception of vulnerability to natural disasters is mitigated by ritual practices devoted to keeping under control the negative reactions of the Genius Loci (Epasa Moto), such as eruptions and tidal waves. In accordance with landscape evolution, the present work describes the anthropogenically remodelled space and the related Vulnerability Hazards-of-Place Model elaborated by S. Cutter in 1996. The results suggest a sound approach to local geo-hazard management through traditional methods. Principles of geoethics are important tools in managing natural hazards in different cultural contexts. A geoethical approach in risk management guarantees respect for beliefs and cultural traditions and the development of strategies respectful of the values and sensibilities of the involved populations.

  9. Respirometric studies on the impact of humic substances on the activated sludge treatment: mitigation of an inhibitory effect caused by diesel oil.

    PubMed

    Lipczynska-Kochany, E; Kochany, J

    2008-10-01

    This paper describes the results of aerobic respirometric studies on the application of humic substances (humate) to mitigate an inhibitory effect of petroleum hydrocarbons (diesel oil) on the returned activated sludge (RAS) in sewage from a municipal treatment plant. Initial results of the respirometric tests and non-linear regression analysis showed that diesel oil had an inhibitory effect on the activity of biomass and that the kinetic data complied with the Haldane model for inhibitory wastes. Humate addition significantly enhanced the oxygen uptake by RAS. Application of humate at a dose of 2000 mg l⁻¹ to sewage contaminated with 10 mg l⁻¹ of diesel oil resulted in almost complete recovery of the biomass oxygen uptake. Non-linear regression analysis of the respirometric data indicated that this system complied with the Monod model for non-inhibitory wastes. Thus, the application of humic substances to mitigate the inhibitory effects of oil spills in wastewater treatment plants seems to be an attractive alternative to treatments using activated carbon or specialized sorbents. PMID:18942578
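
    The Monod and Haldane expressions mentioned above are simple one-line rate laws, and fitting them to respirometric data is a small non-linear regression problem. The sketch below shows one way to do this with SciPy on invented observations; the data, parameter values and starting guesses are illustrative only and do not come from the paper.

        import numpy as np
        from scipy.optimize import curve_fit

        def monod(s, mu_max, ks):
            """Monod kinetics for non-inhibitory substrates."""
            return mu_max * s / (ks + s)

        def haldane(s, mu_max, ks, ki):
            """Haldane (Andrews) kinetics for inhibitory substrates."""
            return mu_max * s / (ks + s + s ** 2 / ki)

        # Hypothetical respirometric data: substrate concentration (mg/L) vs.
        # oxygen uptake rate; the rise-then-decline shape is typical of an
        # inhibitory substrate.
        s_obs = np.array([1.0, 2.0, 5.0, 10.0, 20.0, 40.0, 80.0, 150.0])
        r_obs = np.array([0.8, 1.4, 2.4, 3.1, 3.3, 2.9, 2.2, 1.5])

        params, _ = curve_fit(haldane, s_obs, r_obs, p0=[4.0, 5.0, 50.0], maxfev=10000)
        print("mu_max, Ks, Ki =", params)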

  10. Lunar mission safety and rescue: Hazards analysis and safety requirements

    NASA Technical Reports Server (NTRS)

    1971-01-01

    The results of the hazards analysis, which was concerned only with hazards to personnel and not with loss of equipment or property, are presented. Hazards characterization includes the definition of a hazard, the hazard levels, and the hazard groups. The analysis methodology is described in detail. The methodology was used to prepare the top-level functional flow diagrams, to perform the first-level hazards assessment, and to develop a list of conditions and situations requiring individual hazard studies. The results of the 39 individual hazard studies are presented in full.

  11. Designing Effective Natural Hazards Preparedness Communications: Factors that Influence Perceptions and Action

    NASA Astrophysics Data System (ADS)

    Wong-Parodi, G.; Fischhoff, B.

    2012-12-01

    Even though most people believe that natural hazards preparation is important for mitigating damage to their homes and basic survival in the aftermath of a disaster, few actually disaster-proof their homes, create plans, or obtain supplies recommended by agencies such as the Federal Emergency Management Agency. Several observational studies suggest that socio-demographic characteristics such as income and psychological characteristics such as self-efficacy affect whether or not an individual takes action to prepare for a natural hazard. These studies, however, only suggest that these characteristics may play a role. There has been little research that systematically investigates how these characteristics play a role in people's perceptions of recommended preparatory activities and decisions to perform them. Therefore, in Study 1, we explore people's perceptions of natural hazards preparedness measures on four dimensions: time, cost, helpfulness, and sense of preparedness. We further investigate if these responses vary by the socio-demographic and psychological characteristics of self-efficacy, knowledge, and income level. In Study 2, we experimentally test whether people's sense of self-efficacy, as it relates to natural hazards, can be manipulated through exposure to an "easy-and-effective" versus a "hard-and-effective" set of preparation measures. Our findings have implications for the design of natural hazards communication materials for the general public.

  12. PSI Wide Area Product (WAP) for measuring Ground Surface Displacements at regional level for multi-hazards studies

    NASA Astrophysics Data System (ADS)

    Duro, Javier; Iglesias, Rubén; Blanco, Pablo; Albiol, David; Koudogbo, Fifamè

    2015-04-01

    The Wide Area Product (WAP) is a new interferometric product developed to provide measurements over large regions. Persistent Scatterer Interferometry (PSI) has largely proved its robust and precise performance in measuring ground surface deformation in different application domains. In this context, however, accurate displacement estimation over large-scale areas (more than 10,000 km²) characterized by low-magnitude motion gradients (3-5 mm/year), such as those induced by inter-seismic or Earth tidal effects, still remains an open issue. The main reason for this is the inclusion of low-quality and more distant persistent scatterers in order to bridge low-quality areas, such as water bodies, crop areas and forested regions. This leads to spatially propagated errors in the PSI integration process, poor estimation and compensation of the Atmospheric Phase Screen (APS), and difficulty in handling the residual long-wavelength phase patterns originating from orbit state vector inaccuracies. Research work on generating a Wide Area Product of ground motion in preparation for the Sentinel-1 mission has been conducted in the last stages of Terrafirma as well as in other research programs. These developments propose technological updates for maintaining precision over large-scale PSI analyses. Some of the updates are based on the use of external information, such as meteorological models, and on GNSS data for an improved calibration of large-scale measurements. Covering wide regions usually implies processing areas whose land use is chiefly devoted to livestock, horticulture, urbanization and forest. This represents an important challenge for providing continuous InSAR measurements and requires advanced phase filtering strategies to enhance coherence. The advanced PSI processing has been carried out over several areas, allowing a large-scale analysis of tectonic patterns and of motion caused by multiple hazards such as volcanic activity, landslides and floods. Several examples of the application of the PSI WAP to wide regions for measuring ground displacements related to different types of hazards, natural and human-induced, will be presented. The InSAR processing approach to measuring accurate movements at local and large scales for multi-hazard interpretation studies will also be discussed. The test areas show deformation related to active fault systems, landslides on mountain slopes, ground compaction over underlying aquifers, and movements in volcanic areas.

  13. 77 FR 75441 - Healthy Home and Lead Hazard Control Grant Programs Data Collection; Progress Reporting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-20

    ...Program, Healthy Homes Technical Studies Program, Lead Base paint Hazard Control program, Lead Hazard Reduction Demonstration...Program, Healthy Homes Technical Studies Program, Lead Base paint Hazard Control program, Lead Hazard Reduction...

  14. Mercury, cadmium and lead contamination in seafood: A comparative study to evaluate the usefulness of Target Hazard Quotients

    Microsoft Academic Search

    A. Petroczi; D. P. Naughton

    2009-01-01

    The aim of this paper is to explore the applicability of Target Hazard Quotient (THQ) estimations to inform on seafood hazards through metal contamination. The food recall data set was collated by the Laboratory of the Government Chemist (LGC, UK) over the period from January to November 2007. Pearson chi-square goodness of fit test, nonparametric correlation (Kendall tau) and Kruskal–Wallis

  15. HACCP (Hazard Analysis and Critical Control Points) to guarantee safe water reuse and drinking water production - a case study

    Microsoft Academic Search

    T. Dewettinck; E. Van Houtte; D. Geenens; K. Van Hege; W. Verstraete

    2001-01-01

    To obtain a sustainable water catchment in the dune area of the Flemish west coast, the integration of treated domestic wastewater in the existing potable water production process is planned. The hygienic hazards associated with the introduction of treated domestic wastewater into the water cycle are well recognised. Therefore, the concept of HACCP (Hazard Analysis and Critical Control Points) was

  16. Mitigation win-win

    NASA Astrophysics Data System (ADS)

    Moran, Dominic; Lucas, Amanda; Barnes, Andrew

    2013-07-01

    Win-win messages regarding climate change mitigation policies in agriculture tend to oversimplify farmer motivation. Contributions from psychology, cultural evolution and behavioural economics should help to design more effective policy.

  17. Orbital Debris Mitigation

    NASA Technical Reports Server (NTRS)

    Kelley, R. L.; Jarkey, D. R.; Stansbery, G.

    2014-01-01

    Policies on limiting orbital debris are found throughout the US Government, many foreign space agencies, and as adopted guidelines in the United Nations. The underlying purpose of these policies is to ensure the environment remains safe for the operation of robotic and human spacecraft in near-Earth orbit. For this reason, it is important to consider orbital debris mitigation during the design of all space vehicles. For many CubeSats, documenting compliance with the debris mitigation guidelines occurs after the vehicle has already been designed and fabricated, whereas larger satellites are evaluated throughout the design process. This paper will provide a brief explanation of the US Government Orbital Debris Mitigation Standard Practices, a discussion of international guidelines, as well as NASA's process for compliance evaluation. In addition, it will discuss the educational value of considering orbital debris mitigation requirements as part of student-built satellite design.

  18. Effects of hazardous and harmful alcohol use on HIV incidence and sexual behaviour: a cohort study of Kenyan female sex workers

    PubMed Central

    2014-01-01

    Aims To investigate putative links between alcohol use, unsafe sex and incident HIV infection in sub-Saharan Africa. Methods A cohort of 400 HIV-negative female sex workers was established in Mombasa, Kenya. Associations between categories of the Alcohol Use Disorders Identification Test (AUDIT) and the incidence at one year of unsafe sex, HIV and pregnancy were assessed using Cox proportional hazards models. Violence or STIs other than HIV measured at one year were compared across AUDIT categories using multivariate logistic regression. Results Participants had high levels of hazardous (17.3%, 69/399) and harmful drinking (9.5%, 38/399), while 36.1% abstained from alcohol. Hazardous and harmful drinkers had more unprotected sex and higher partner numbers than abstainers. Sex while feeling drunk was frequent and associated with lower condom use. The occurrence of condom accidents rose step-wise with each increase in AUDIT category. Compared with non-drinkers, women with harmful drinking had 4.1-fold higher odds of sexual violence (adjusted odds ratio [AOR], 95% CI = 1.9-8.9) and 8.4-fold higher odds of physical violence (AOR 95% CI = 3.9-18.0), while hazardous drinkers had 3.1-fold higher odds of physical violence (AOR 95% CI = 1.7-5.6). No association was detected between AUDIT category and pregnancy, or infection with syphilis or Trichomonas vaginalis. The adjusted hazard ratio of HIV incidence was 9.6 comparing women with hazardous drinking to non-drinkers (95% CI = 1.1-87.9). Conclusions Unsafe sex, partner violence and HIV incidence were higher in women with alcohol use disorders. This prospective study, using validated alcohol measures, indicates that harmful or hazardous alcohol use can influence sexual behaviour. Possible mechanisms include increased unprotected sex, condom accidents and exposure to sexual violence. Experimental evidence is required demonstrating that interventions to reduce alcohol use can avert unsafe sex. PMID:24708844
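
    For readers unfamiliar with the modelling step, the sketch below shows the general shape of a Cox proportional hazards fit with drinking category as the exposure, using the lifelines library on synthetic data. The column names, category rates and censoring scheme are assumptions for illustration; they are not the study's data or code.

        import numpy as np
        import pandas as pd
        from lifelines import CoxPHFitter

        rng = np.random.default_rng(0)
        n = 400
        audit = rng.choice(["abstainer", "low_risk", "hazardous", "harmful"], size=n)
        # Illustrative annual event rates per AUDIT category.
        rate = pd.Series(audit).map({"abstainer": 0.02, "low_risk": 0.03,
                                     "hazardous": 0.08, "harmful": 0.12}).to_numpy()
        t_event = rng.exponential(1.0 / rate)   # latent time to seroconversion
        time = np.minimum(t_event, 1.0)         # administrative censoring at 1 year
        event = (t_event <= 1.0).astype(int)

        df = pd.DataFrame({"time": time, "event": event, "audit": audit})
        design = pd.get_dummies(df, columns=["audit"], drop_first=True).astype(float)

        cph = CoxPHFitter()
        cph.fit(design, duration_col="time", event_col="event")
        cph.print_summary()  # exp(coef) approximates the hazard ratio vs. abstainers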

  19. An observational urban heat island study: A primary step in heat event mitigation planning in Detroit, MI

    NASA Astrophysics Data System (ADS)

    Oswald, E.; Rood, R. B.; O'Neill, M.; Zhang, K.

    2010-12-01

    Knowledge of the structure and characteristics of urban heat islands (UHIs) is becoming evermore important to public health practitioners and city planners as they attempt to better identify parts of the city that are especially vulnerable and to plan strategies to mitigate heat-related health threats. The spatial structure of UHIs can be investigated in many different manners, but investigation of raw observations can be problematic. From a meteorological point of view, one goal is to map the structure of the urban heat island from routinely-made standard weather observations to a complex urban environment - in effect, a highly localized downscaling. In order to accomplish such a goal, we conducted analysis using a dense network of temporary observation stations, in concert with established observing networks, inside the city of Detroit. In this talk we correlate point-source temperature measurements with relevant spatial attributes (surface imperviousness, proximity to water, etc.) to model the observed temperature patterns. Future work towards mapping heat vulnerability includes co-analysis with spatial data of population adaptive capacity and sensitivity to heat stress.
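
    The mapping step described in this talk, relating point temperature observations to spatial attributes, amounts to a small multiple regression. Below is a minimal ordinary-least-squares sketch with invented station values; the attribute names and numbers are placeholders, not the Detroit data.

        import numpy as np

        # Hypothetical station data: air temperature (deg C), fraction of
        # impervious surface, and distance to water (km).
        temp       = np.array([24.1, 26.3, 27.0, 25.2, 23.5, 26.8])
        impervious = np.array([0.20, 0.65, 0.80, 0.40, 0.10, 0.75])
        dist_water = np.array([0.5,  3.0,  4.5,  2.0,  0.2,  5.0])

        # Ordinary least squares: temp ~ intercept + b1*impervious + b2*dist_water
        X = np.column_stack([np.ones_like(temp), impervious, dist_water])
        coef, *_ = np.linalg.lstsq(X, temp, rcond=None)
        print("intercept, impervious, distance-to-water coefficients:", coef)

    The fitted coefficients can then be applied to gridded attribute layers to extend the point observations across the whole city.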

  20. RFI Mitigation Workshop

    NASA Astrophysics Data System (ADS)

    2010-05-01

    The increased sensitivity of passive instrumentation in radio astronomy and remote sensing and the intensifying active use of the spectrum have led to an increasing level of radio frequency interference (RFI) of the active services on the passive use of the spectrum. Advances in technology and computing have opened up new possibilities for mitigating the effects of certain classes of interference in the observing data. Interference in allocated bands always leads to data loss for the passive users of the spectrum even if interference mitigation is applied. However, interference mitigation in non-allocated spectral bands may facilitate the partial use of this spectrum for passive (non-interfering) observations. There is no generic method to mitigate all types of interference, so a multi-layered system approach may be advisable to reduce detrimental effects for a congested interference environment. Specific mitigation methods implemented at different points in the data acquisition chain will thus result in a cumulative mitigation effect on the data. This third RFI Mitigation Workshop considered RFI mitigation in radio astronomy in all its facets with the aim of facilitating the implementation of instrumental and data processing techniques. This workshop aimed to take a forward look at applications for the next generation of radio instruments, such as the SKA and its pathfinders and LOFAR, as well as considering their application to existing instruments. This workshop has been organized by ASTRON and NAIC, with support from the Engineering Forum of FP7 RadioNet, the SKA Project Development Office, and in collaboration with CRAF and IUCAF.

  1. Volcanic hazards to airports

    USGS Publications Warehouse

    Guffanti, M.; Mayberry, G.C.; Casadevall, T.J.; Wunderman, R.

    2009-01-01

    Volcanic activity has caused significant hazards to numerous airports worldwide, with local to far-ranging effects on travelers and commerce. Analysis of a new compilation of incidents of airports impacted by volcanic activity from 1944 through 2006 reveals that, at a minimum, 101 airports in 28 countries were affected on 171 occasions by eruptions at 46 volcanoes. Since 1980, five airports per year on average have been affected by volcanic activity, which indicates that volcanic hazards to airports are not rare on a worldwide basis. The main hazard to airports is ashfall, with accumulations of only a few millimeters sufficient to force temporary closures of some airports. A substantial portion of incidents has been caused by ash in airspace in the vicinity of airports, without accumulation of ash on the ground. On a few occasions, airports have been impacted by hazards other than ash (pyroclastic flow, lava flow, gas emission, and phreatic explosion). Several airports have been affected repeatedly by volcanic hazards. Four airports have been affected the most often and likely will continue to be among the most vulnerable owing to continued nearby volcanic activity: Fontanarossa International Airport in Catania, Italy; Ted Stevens Anchorage International Airport in Alaska, USA; Mariscal Sucre International Airport in Quito, Ecuador; and Tokua Airport in Kokopo, Papua New Guinea. The USA has the most airports affected by volcanic activity (17) on the most occasions (33) and hosts the second highest number of volcanoes that have caused the disruptions (5, after Indonesia with 7). One-fifth of the affected airports are within 30 km of the source volcanoes, approximately half are located within 150 km of the source volcanoes, and about three-quarters are within 300 km; nearly one-fifth are located more than 500 km away from the source volcanoes. The volcanoes that have caused the most impacts are Soufriere Hills on the island of Montserrat in the British West Indies, Tungurahua in Ecuador, Mt. Etna in Italy, Rabaul caldera in Papua New Guinea, Mt. Spurr and Mt. St. Helens in the USA, Ruapehu in New Zealand, Mt. Pinatubo in the Philippines, and Anatahan in the Commonwealth of the Northern Mariana Islands (part of the USA). Ten countries - USA, Indonesia, Ecuador, Papua New Guinea, Italy, New Zealand, Philippines, Mexico, Japan, and United Kingdom - have the highest volcanic hazard and/or vulnerability measures for airports. The adverse impacts of volcanic eruptions on airports can be mitigated by preparedness and forewarning. Methods that have been used to forewarn airports of volcanic activity include real-time detection of explosive volcanic activity, forecasts of ash dispersion and deposition, and detection of approaching ash clouds using ground-based Doppler radar. Given the demonstrated vulnerability of airports to disruption from volcanic activity, at-risk airports should develop operational plans for ashfall events, and volcano-monitoring agencies should provide timely forewarning of imminent volcanic-ash hazards directly to airport operators. © Springer Science+Business Media B.V. 2008.

  2. Respiratory hospitalizations of children living near a hazardous industrial site adjusted for prevalent dust: a case-control study.

    PubMed

    Nirel, Ronit; Maimon, Nimrod; Fireman, Elizabeth; Agami, Sarit; Eyal, Arnona; Peretz, Alon

    2015-03-01

    The Neot Hovav Industrial Park (IP), located in southern Israel, hosts 23 chemical industry facilities and the national site for treatment of hazardous waste. Yet information about its impact on the health of the local population has been mostly ecological, focused on Bedouins, and did not control for the possible confounding effect of prevalent dust storms. This case-control study examined whether living near the IP could lead to increased risk of pediatric hospitalization for respiratory diseases. Cases (n=3608) were residents of the Be'er Sheva sub-district aged 0-14 years who were admitted for respiratory illnesses between 2004 and 2009. These were compared to children admitted for non-respiratory conditions (n=3058). Home addresses were geocoded and the distances from the IP to the child's residence were calculated. The association between hospitalization and residential distance from the IP was examined for three age groups (0-1, 2-6, 7-14) by logistic regressions adjusting for gender, socioeconomic status, urbanity and temperature. We found that infants in the first year of life who lived within 10 km of the IP had increased risk of respiratory hospitalization when compared with those living >20 km from the IP (adjusted odds ratio, OR=2.07, 95% confidence interval, CI: 1.19-3.59). In models with both distance from the IP and particulate matter with aerodynamic diameter <10 µm (PM10), the estimated risk was modestly attenuated (OR=1.96, 95% CI: 1.09-3.51). Elevated risk was also observed for children 2-5 years of age but without statistical significance (OR=1.16, 95% CI: 0.76-1.76). Our findings suggest that residential proximity to a hazardous industrial site may contribute to early-life respiratory admissions, beyond that of prevailing PM10. PMID:25547415

  3. How Hazardous Substances Affect People

    NSDL National Science Digital Library

    This activity helps students gain an appreciation for how scientists determine the human health effects of hazardous substances. Students also demonstrate how hazardous substances can affect the health of test animals. They will discover that toxicology is the study of the effects of poisons on living organisms and that scientists conduct a variety of studies to discover toxicological information about hazardous substances. Students will also learn about two of the most common types of studies, which are epidemiological studies, matching disease and other adverse health effects in humans with possible causes, and animal toxicological studies.

  4. National Geophysical Data Center: Natural Hazards Data

    NSDL National Science Digital Library

    Natural hazards such as earthquakes, tsunamis, and volcanoes affect both coastal and inland areas. Long-term data from these events, including photographs, can be used to establish the past record of natural hazard event occurrences. These data are also important for planning, response, and mitigation of future events. The National Geophysical Data Center plays a major role in post-event data collection. This site provides links to database archives which contain information on natural hazards including publications, historical data, imagery, and other materials such as interactive maps and software. It also features supporting educational materials such as a natural hazards quiz and posters. The site provides links to the National Environmental Satellite Data and Information Service (NESDIS); and a Resource Directory, which functions as a clearinghouse.

  5. Dust: A major environmental hazard on the earth's moon

    SciTech Connect

    Heiken, G.; Vaniman, D.; Lehnert, B.

    1990-01-01

    On the Earth's Moon, obvious hazards to humans and machines are created by extreme temperature fluctuations, low gravity, and the virtual absence of any atmosphere. The most important other environmental factor is ionizing radiation. Less obvious environmental hazards that must be considered before establishing a manned presence on the lunar surface are the hazards from micrometeoroid bombardment, the nuisance of electrostatically charged lunar dust, and an alien visual environment without familiar clues. Before man can establish lunar bases and lunar mining operations, and continue the exploration of that body, we must develop a means of mitigating these hazards. 4 refs.

  6. Moral Hazard in Teams

    Microsoft Academic Search

    Bengt Holmstrom

    1982-01-01

    This article studies moral hazard with many agents. The focus is on two features that are novel in a multiagent setting: free riding and competition. The free-rider problem implies a new role for the principal: administering incentive schemes that do not balance the budget. This new role is essential for controlling incentives and suggests that firms in which ownership and

  7. Study of the impact of cruise and passenger ships on a Mediterranean port city air quality - Study of future emission mitigation scenarios

    NASA Astrophysics Data System (ADS)

    Liora, Natalia; Poupkou, Anastasia; Kontos, Serafim; Giannaros, Christos; Melas, Dimitrios

    2015-04-01

    An increase in passenger ship traffic is expected in the Mediterranean Sea, as targeted by the EU Blue Growth initiative. This increase is expected to affect the air quality of Mediterranean port cities, considering not only conventional atmospheric pollutants but also the toxic ones emitted by ships (e.g. nickel). The aim of this study is the estimation of present and future pollutant emissions from cruise and passenger maritime transport in the port area of Thessaloniki (Greece), as well as the impact of those emissions on the city's air quality. Cruise and passenger ship emissions have been estimated for the year 2013 over a 100 m spatial resolution grid which covers the greater port area of Thessaloniki. Emissions have been estimated for the following macro-pollutants: NOx, SO2, NMVOC, CO, CO2 and particulate matter (PM). In addition, the most important micro-pollutants studied in this work are As, Cd, Pb, Ni and benzo(a)pyrene, for which air quality limits have been set by the EU. Emissions have been estimated for three operation modes: cruising, maneuvering and hotelling. For the calculation of the present-time maritime emissions, the activity data used were provided by the Thessaloniki Port Authority S.A. Future pollutant emissions are estimated using the future activity data provided by the Port Authority and the IMO legislation for future shipping. In addition, two emission mitigation scenarios are examined: the use of Liquefied Natural Gas (LNG) as a ship fuel, and the implementation of cold ironing, i.e. the electrification of ships during the hotelling mode, which eliminates the corresponding emissions. The impact of the present and future passenger ship emissions on the air quality of Thessaloniki is examined with the use of the CALPUFF model applied over the 100 m spatial resolution grid using meteorology from WRF. Simulations of the modeling system are performed for four emission scenarios: present time, future time, future time plus use of LNG, and future time plus use of cold ironing. The differences in pollutant levels between the scenarios examined are presented and discussed.
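
    Port emission inventories of this kind are usually built with an activity-based formula: energy per operation mode (engine power × load factor × time in mode) multiplied by a pollutant emission factor. The sketch below illustrates that bookkeeping for a single hypothetical ship call; the engine sizes, load factors, times and the NOx factor are invented placeholders, not the Thessaloniki inputs.

        # Hypothetical parameters for one passenger-ship call.
        MAIN_ENGINE_KW = 12_000
        AUX_ENGINE_KW = 2_000
        EF_NOX_G_PER_KWH = 13.0          # illustrative NOx emission factor

        MODES = {                        # (main-engine load factor, hours in mode)
            "cruising":    (0.80, 2.0),
            "maneuvering": (0.20, 0.5),
            "hotelling":   (0.00, 10.0), # main engine off at berth
        }
        AUX_LOAD = {"cruising": 0.30, "maneuvering": 0.50, "hotelling": 0.40}

        def nox_per_call_kg():
            """Activity-based NOx emission (kg) for one call, summed over modes."""
            grams = 0.0
            for mode, (main_lf, hours) in MODES.items():
                energy_kwh = (MAIN_ENGINE_KW * main_lf
                              + AUX_ENGINE_KW * AUX_LOAD[mode]) * hours
                grams += energy_kwh * EF_NOX_G_PER_KWH
            return grams / 1000.0

        print(f"NOx per call: {nox_per_call_kg():.1f} kg")

    A cold-ironing scenario would simply zero the hotelling auxiliary-engine term, which is why that measure removes the at-berth emissions in the scenario comparison.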

  8. NASA Hazard Analysis Process

    NASA Technical Reports Server (NTRS)

    Deckert, George

    2010-01-01

    This viewgraph presentation reviews The NASA Hazard Analysis process. The contents include: 1) Significant Incidents and Close Calls in Human Spaceflight; 2) Subsystem Safety Engineering Through the Project Life Cycle; 3) The Risk Informed Design Process; 4) Types of NASA Hazard Analysis; 5) Preliminary Hazard Analysis (PHA); 6) Hazard Analysis Process; 7) Identify Hazardous Conditions; 8) Consider All Interfaces; 9) Work a Preliminary Hazard List; 10) NASA Generic Hazards List; and 11) Final Thoughts

  9. Probabilistic Tsunami Hazard Analysis - Results for the Western United States

    NASA Astrophysics Data System (ADS)

    Thio, H.; Polet, J.; Somerville, P.

    2007-12-01

    We have developed a series of probabilistic tsunami hazard maps for the coasts of western North America based on fault source characterizations of the circum-Pacific subduction zones as well as local offshore faults. The maps show the probabilistic offshore exceedance waveheights at 72, 475, 975 and 2475 year return periods, which are the return periods typically used in Probabilistic Seismic Hazard Analysis (PSHA). Our method follows lines similar to those of PSHA, which has become standard practice in the evaluation and mitigation of seismic hazard, in particular with respect to structures, infrastructure and lifelines. Its ability to condense the complexities, variability and uncertainties of seismic activity into a manageable set of ground motion parameters greatly facilitates the planning and design of effective seismic-resistant buildings and infrastructure. Because of the strong dependence of tsunami wave heights on bathymetry, we use full tsunami waveform computations in lieu of the attenuation relations that are common in PSHA. By pre-computing and storing the tsunami waveforms at points along the coast generated for sets of subfaults that comprise larger earthquake faults, we can rapidly synthesize tsunami waveforms for any slip distribution on those faults by summing the individual weighted subfault tsunami waveforms. This Green's function summation provides accurate estimates of tsunami height for probabilistic calculations, where one typically integrates over thousands of earthquake scenarios. We have carried out tsunami hazard calculations for western North America and Hawaii based on a comprehensive source model around the Pacific Ocean, including both subduction zone sources and local offshore faults. We will present the tsunami hazard maps and discuss how these results are used for probabilistic inundation mapping, including a follow-up inundation study of the San Francisco Bay area that is based on disaggregation results of the probabilistic analysis. We will also show that even for tsunami sources with large uncertainties (e.g. submarine landslides) a probabilistic framework can yield meaningful and consistent results.
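
    The Green's function summation described above is linear superposition: a unit-slip waveform is precomputed for every subfault, and any scenario waveform is a slip-weighted sum of those. The sketch below mimics that logic, together with a simple rate-based exceedance calculation, using random placeholder arrays; the sizes, rates and thresholds are invented for illustration and are not the study's source model.

        import numpy as np

        rng = np.random.default_rng(1)
        n_sub, n_t = 20, 600
        # Placeholder unit-slip waveforms (one row per subfault) at one coastal point.
        green = rng.normal(scale=0.02, size=(n_sub, n_t))

        def scenario_waveform(slip):
            """Slip-weighted superposition of the precomputed unit-slip waveforms."""
            return np.asarray(slip) @ green

        def exceedance_rates(scenarios, rates, thresholds):
            """Annual rate at which peak waveheight exceeds each threshold,
            summing the annual rates of all scenarios that exceed it."""
            peaks = np.array([np.max(np.abs(scenario_waveform(s))) for s in scenarios])
            return np.array([rates[peaks > h].sum() for h in thresholds])

        scenarios = [rng.uniform(0.0, 5.0, size=n_sub) for _ in range(100)]
        rates = np.full(100, 1e-4)          # illustrative per-scenario annual rates
        print(exceedance_rates(scenarios, rates, thresholds=[0.5, 1.0, 2.0]))

    Inverting the resulting rate curve at a target rate (for example 1/475 per year) gives the waveheight mapped for that return period.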

  10. Methods for recruiting white, black, and hispanic working-class women and men to a study of physical and social hazards at work: the United for Health study.

    PubMed

    Barbeau, Elizabeth M; Hartman, Cathy; Quinn, Margaret M; Stoddard, Anne M; Krieger, Nancy

    2007-01-01

    Although research on work and health has a long-standing concern with unjust exposures and inequitable burdens of disease, there are few studies that document the joint distribution and health effects of physical and psychosocial hazards (e.g., noise, dusts, fumes, and job strain) and social hazards (e.g., racial discrimination and gender harassment) encountered at work. There is also a paucity of data on how these exposures, singly and combined, are distributed in relation to sociodemographic characteristics including race/ethnicity, gender, socioeconomic position, and nativity. This article presents a conceptual model for redressing these knowledge gaps and describes recruitment strategies and the characteristics of study participants in the United for Health study. Working with labor unions, the authors recruited 14 (67%) of 21 worksites from manufacturing, meat processing, retail, and transportation, and 1,282 workers (72% response rate), of whom 62 percent were men, 36 percent were women, 39 percent were black, 23 percent were Hispanic, 25 percent were white, 31 percent earned less than a living wage, 40 percent were below the poverty level, and 23 percent had less than a high school education. PMID:17436989

  11. Impact of a moral hazard problem in the Joint Forest Management Programme: a study from forest-dependent households in West Bengal

    Microsoft Academic Search

    Nimai Das; Debnarayan Sarker

    2009-01-01

    This study seeks to explore the impact of a moral hazard problem in the Joint Forest Management (JFM) programme between the government and forest fringe communities of a province in West Bengal, India. It suggests that if there is no incentive plan for the poor, it is hard for the government to monitor their actions. The poor tend to take

  12. Debris flow hazards due to land use change above source areas in torrent catchments. The case study of Les Arcs (Savoie,

    E-print Network

    Paris-Sud XI, Université de

    Debris flow hazards due to land use change above source areas in torrent catchments. The case study on processes such as runoff and erosion affecting torrent banks and beds, directly at the origin of debris flow is responsible for a change of annual water balance resulting in a significant increase of torrent water flow

  13. Natural Hazard Assessment and Communication in the Central United States

    NASA Astrophysics Data System (ADS)

    Wang, Z.; Lynch, M. J.

    2009-12-01

    In the central United States, natural hazards such as floods, tornados, ice storms, droughts, and earthquakes result in significant damage and loss of life every year. For example, the February 5-6, 2008 tornado outbreak touched down in nine states (Alabama, Arkansas, Illinois, Indiana, Kentucky, Mississippi, Missouri, and Tennessee), killing 57, injuring 350, and causing more than $1.0 billion in damages. The January 2009 ice storm struck Arkansas, Illinois, Indiana, Kentucky, Missouri, Ohio, Tennessee, and West Virginia, killing 36 and causing more than $1.0 billion in damages. It is a great challenge for society to develop an effective policy for mitigating these natural hazards in the central United States. However, the development of an effective policy starts with a good assessment of the natural hazards. Scientists play a key role in assessing the natural hazards and therefore in the development of an effective policy for natural hazard mitigation. It is critical for scientists to clearly define, quantify, and communicate hazard assessments, including the associated uncertainties, which are a key factor in policy decision making, to end-users. Otherwise, end-users will have difficulty understanding and using the information provided. For example, ground motion hazard maps with 2, 5, and 10 percent probabilities of exceedance (PE) in 50 years have been produced for the central United States for seismic hazard mitigation purposes. End-users have difficulty understanding and using the maps, however, which has led to either indecision or ineffective policy for seismic hazard mitigation in many communities in the central United States.

  14. Hazard identification of environmental pollutants by combining results from ecological and biomarker studies: an example

    EPA Science Inventory

    Objective: Linking exposures from environmental pollutants with adverse health effects is difficult because these exposures are usually low-dose and ill-defined. According to several investigators, a series of multidisciplinary, multilevel studies is needed to address this prob...

  15. A case study for natural cascading hazard: the Great Blizzard of 1888 in the Asturian Massif (Northern Spain)

    NASA Astrophysics Data System (ADS)

    Garcia-Hernandez, Cristina; Ruiz-Fernández, Jesús; Gallinar, David

    2015-04-01

    In this paper we study the events triggered by the Great Blizzard of 1888 in the Asturian Massif as a case study that shows how one hazard can be the main cause of another hazard occurring. The reconstruction of the chain of hazards triggered by the episode was done on the basis of nivo-meteorological conditions, the geographical location of events, and socio-economic impact. The episode was studied through the analysis of the issues published in six different newspapers between the 20th of January and the 30th of May 1888. We collected the data of the old meteorological station of the University of Oviedo, and those contained in parish documents. Field work consisted of visual inspection and interviews with present-day residents. The information was stored and cross-referenced for statistical analysis using a logical database structure designed for this purpose. The snowfall episode consisted of four consecutive snowstorms that occurred between the 14th of February and the 8th of April 1888, creating snow covers with an average depth ranging between 5 and 7 m. The snow accumulations were the main cause of material damage, affecting 27 high- and mid-elevation mountain municipalities. However, we have to consider that the newspapers only reported events affecting densely populated areas along with those affecting vital economic spaces (railway lines, roads in mountain passes, etc.). There were more than 200 interruptions of traffic flow and communication outages, hampering economic activities. Snow built up on roofs added extra weight to the structure of buildings, so more than 900 constructions collapsed, killing three persons and causing the loss of more than 19,000 head of cattle. Moreover, these snow accumulations were the basis of an episode of sixty-four snow avalanches that, undoubtedly, caused the main personal damage, with the number of dead and wounded reaching 29 and 23 respectively. During the snowfall breaks, snow-melting processes became important: river rises affected all the main hydrological basins, 29 news reports related to material damage were documented, and three people drowned. In addition, snow avalanches caused rapid damming followed by violent river rises in at least two cases, causing even worse damage because of the surprise effect. Finally, we have to consider the connection between the melting process and the thirty-six mass movements that were documented, which destroyed six buildings, caused the death of one person, and caused dozens of interruptions in communications: the increase in such events is clearly associated with rising temperature and, at the same time, their decline can be observed with falling temperature. These events took place mainly during the second snowfall break, so we must take into account the cumulative effect on the water saturation of the surface formations.

  16. Debris flow hazard mapping, Hobart, Tasmania, Australia

    NASA Astrophysics Data System (ADS)

    Mazengarb, Colin; Rigby, Ted; Stevenson, Michael

    2015-04-01

    Our mapping of the many dolerite-capped mountains in Tasmania indicates that debris flows are a significant geomorphic process operating there. Hobart, the largest city in the State, lies at the foot of one of these mountains, and our work is focussed on identifying areas that are susceptible to these events and estimating hazard in the valley systems where residential developments have been established. Geomorphic mapping with the benefit of recent LiDAR and GIS-enabled stereo-imagery has allowed us to add to and refine a landslide inventory in our study area. In addition, a dominant geomorphic model has been recognised, involving headward gully retreat in colluvial materials associated with rainstorms, which explains why many past events have occurred and where they may occur in future. In this paper we review the landslide inventory, including a large event (~200,000 m3) in 1872 that affected an area which was then lightly populated but has since been heavily urbanised. From this inventory we have derived volume-mobility relationships, magnitude-frequency curves and likelihood estimates. The estimation of volume has been challenging given that the area of depletion for each debris flow feature is typically difficult to distinguish from the total affected area. However, where LiDAR data exist, this uncertainty is substantially reduced, and we develop width-length relationships (area of depletion) and area-volume relationships to estimate volume for the whole dataset, which exceeds 300 features. The volume-mobility relationship determined is comparable to international studies and, in the absence of reliable eye-witness accounts, suggests that most of the features can be explained as single-event debris flows, without requiring more complex mechanisms (such as those that form temporary debris dams that subsequently fail) as proposed previously by others. Likelihood estimates have also been challenging to derive given that almost all of the events were not witnessed, some are constrained by aerial photographs to decade precision, and many predate regional photography (pre-1940s). We have performed runout modelling, using 2D hydraulic modelling software (RiverFlow2D with the Mud and Debris module), in order to calibrate our model against real events and gain confidence in the choice of parameters. Runout modelling was undertaken in valley systems with volumes calibrated to existing flood model likelihoods for each catchment. The hazard outputs from our models require a translation to the hazard models used in Australia. By linking to flood mapping we aim to demonstrate to emergency managers where existing mitigation measures may be inadequate and how they can be adapted to address multiple hazards.
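
    The abstract does not report the fitted coefficients of its area-volume relationship, so the following is only a minimal sketch of how such an empirical power-law relation is typically fitted; the (area, volume) pairs are hypothetical, not the Hobart inventory.

```python
import numpy as np

# Hypothetical (depletion area, volume) pairs for mapped debris-flow features.
area_m2   = np.array([400.0, 900.0, 2500.0, 6000.0, 15000.0])
volume_m3 = np.array([300.0, 800.0, 3000.0, 9000.0, 30000.0])

# Fit V = c * A**b by least squares in log-log space, the usual form of
# empirical area-volume scaling relations for landslides and debris flows.
b, log_c = np.polyfit(np.log10(area_m2), np.log10(volume_m3), 1)
c = 10.0 ** log_c
print(f"V ~= {c:.3g} * A^{b:.2f}")

# Apply the relation where only the depletion area has been mapped.
print(f"Estimated volume for A = 5000 m^2: {c * 5000.0 ** b:,.0f} m^3")
```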

  17. Preliminary hazards analysis for the National Ignition Facility

    SciTech Connect

    Brereton, S.J.

    1993-10-01

    This report documents the Preliminary Hazards Analysis (PHA) for the National Ignition Facility (NIF). In summary, it provides: a general description of the facility and its operation; identification of hazards at the facility; and details of the hazards analysis, including inventories, bounding releases, consequences, and conclusions. As part of the safety analysis procedure set forth by DOE, a PHA must be performed for the NIF. The PHA characterizes the level of intrinsic potential hazard associated with a facility, and provides the basis for hazard classification. The hazard classification determines the level of safety documentation required, and the DOE Order governing the safety analysis. The hazard classification also determines the level of review and approval required for the safety analysis report. The hazards of primary concern associated with NIF are radiological and toxicological in nature. The hazard classification is determined by comparing facility inventories of radionuclides and chemicals with threshold values for the various hazard classification levels and by examining postulated bounding accidents associated with the hazards of greatest significance. Such postulated bounding accidents cannot take into account active mitigative features; they must assume the unmitigated consequences of a release, taking into account only passive safety features. In this way, the intrinsic hazard level of the facility can be ascertained.

  18. College's hot topics? Wildfire and Hazards' risk perception among university's population

    NASA Astrophysics Data System (ADS)

    Wuerzer, T.

    2014-12-01

    This research presents a novel perspective on college students and their risk perception in a fire-prone US state, Idaho. Idaho ranked first in the nation in burned land by acreage in 2012, with approximately 15% of all US burned land, and ranked second in 2013. Past studies have surveyed residents of high wildfire-risk communities to learn what factors make homeowners more likely to engage in mitigation activities and therefore increase communities' resiliency. This research emphasizes a population that deals with the threat of fire but is likely less invested through property ownership and other stakes, yet is equally threatened in quality of life. Are college students the left-out population in planning for wildfires and its communication process? The main hypothesis is that a college population will have a different perception and awareness (and therefore different mitigation actions) than, for example, residents invested in the wildland-urban interface (WUI). The dominant research methodologies in past studies are surveys, focus groups, or interviews focusing on homeowners in fire-prone areas that have witnessed wildfire or are exposed to increasing fire risk. Research on a population that has no property ownership or investments at stake, and therefore no direct monetary values associated (but potentially non-monetary ones), is scarcely found in these studies. The university-population-based study and its findings offer a contrast to the current literature built on these traditional resident surveys and interviews. The study's survey and interactive spatial assessment of risk perception, with allocation of perceived hazard zones, promotes understanding of where risk is apparent for participants. Results are used to inform agencies such as campus emergency management and regional wildfire mitigation efforts, and to enhance public communication. Lessons learned include the challenges of a comprehensive inclusion process when mitigating for hazards and building resiliency in a region with development pressures.

  19. Overview of Seismic Hazard and Vulnerability of Ordinary Buildings in Belgium: Methodological Aspects and Study Cases

    SciTech Connect

    Barszez, Anne-Marie [Department Architecture Department, Polytechnic University of Mons Rue du Joncquois 53, B-7000 Mons (Belgium); Camelbeeck, Thierry [Section of Seismology, Royal Observatory of Belgium Avenue Circulaire 3, B-1180 Bruxelles (Belgium); Plumier, Andre [Earthquake Engineering, Liege University Chemin des Chevreuils 1, B-4000 Liege 1 (Belgium); Sabbe, Alain [Senior Lecturer, Head of Department Architecture Department, Polytechnic University of Mons Rue du Joncquois 53, B-7000 Mons (Belgium); Wilquin, Hugues [Head of Department, Architecture Department, Polytechnic University of Mons Rue du Joncquois 53, B-7000 Mons (Belgium)

    2008-07-08

    Northwest Europe is a region in which damaging earthquakes occur. Assessing the risk of damage is useful, but it is not easy work based on exact science. In this paper, we propose a general tool for a first-level assessment of seismic risk (rapid diagnosis). General methodological aspects are presented. For a given building, the risk is represented by a volume in a multi-dimensional space. This space is defined by axes representing the main parameters that influence the risk. We notably stress the importance of including a parameter to account for the specific value of cultural heritage. We then apply the proposed tool to analyze and compare methods of seismic risk assessment used in Belgium. They differ by the spatial scale of the studied area. Study cases for the whole Belgian territory and for parts of the cities of Liege and Mons (Belgium) also aim to give some sense of the overall risk in Belgium.

  20. Proposal for a longitudinal study to assess the hazards of radiation in space flight. Master's thesis

    SciTech Connect

    Reeves, G.I.

    1985-06-01

    This thesis involves the establishment of a registry of all United States astronauts, past and future, plus non-astronaut controls. The registry will record the incidences of malignant neoplastic disease and diabetes mellitus, and the space radiation exposure received. Data will be carefully analyzed to see if there is a dose-related increase in these diseases related to the exposure to ionizing radiation, with an eventual goal of establishing reliable risk estimates related to dose received. The history of cancer related to radiation exposure is summarized, and the space radiation environment briefly described. Physiological changes accompanying space flight and their potential effects on radiation tolerance and carcinogenesis are discussed. The reasons why data from animal experiments and human occupational, medical, and nuclear-weapon exposure cannot be extrapolated to the long-term health risks of astronauts are discussed at length, and the study instruments for establishing a long-term descriptive surveillance study are described.

  1. Assessing health risks of inhaled nanomaterials: development of pulmonary bioassay hazard studies

    Microsoft Academic Search

    David B. Warheit

    2010-01-01

    This brief discussion provides an overview of current concepts and perceptions regarding the pulmonary toxicity of ultrafine or nanoparticles. These aspects include, but are not limited to, comparisons of fine particle vs. ultrafine particle effects and the unique response of pulmonary effects in rats vs. other rodent species, particularly at particle overload concentrations. In the final section, two studies are

  2. Precarious Rock Methodology for Seismic Hazard: Physical Testing, Numerical Modeling and Coherence Studies

    SciTech Connect

    Anooshehpoor, Rasool; Purvance, Matthew D.; Brune, James N.; Preston, Leiph A.; Anderson, John G.; Smith, Kenneth D.

    2006-09-29

    This report covers the following projects: shake table tests of precarious rock methodology, field tests of precarious rocks at Yucca Mountain and comparison of the results with PSHA predictions, study of the coherence of the wave field in the ESF, and a limited survey of precarious rocks south of the proposed repository footprint. A series of shake table experiments has been carried out at the University of Nevada, Reno Large Scale Structures Laboratory. The bulk of the experiments involved scaling acceleration time histories (uniaxial forcing) from 0.1g to the point where the objects on the shake table overturned a specified number of times. The results of these experiments have been compared with numerical overturning predictions. Numerical predictions for toppling of large objects with simple contact conditions (e.g., I-beams with sharp basal edges) agree well with shake-table results. The numerical model slightly underpredicts the overturning of small rectangular blocks. It overpredicts the overturning PGA for asymmetric granite boulders with complex basal contact conditions. In general the results confirm the approximate predictions of previous studies. Field testing of several rocks at Yucca Mountain has approximately confirmed the preliminary results from previous studies, suggesting that the PSHA predictions are too high, possibly because of uncertainty in the mean of the attenuation relations. Study of the coherence of wavefields in the ESF has provided results which will be very important in the design of the canister distribution, in particular a preliminary estimate of the wavelengths at which the wavefields become incoherent. No evidence was found for extreme focusing by lens-like inhomogeneities. A limited survey for precarious rocks confirmed that they extend south of the repository, and one of these has been field tested.

  3. Case studies of hazardous-waste treatment to remove volatile organics. Volume 1

    Microsoft Academic Search

    C. Allen; M. Branscome; C. Northeim; K. Leese; S. Harkins

    1987-01-01

    Case studies are presented for treatment of refinery wastes in a pilot-scale thin-film evaporator, the removal of volatiles from industrial wastewater for two steam strippers, and the removal of semivolatiles from water by steam stripping followed by liquid-phase carbon adsorption. The report provides data on removal efficiency, air emissions, process residuals, treatment costs, and process limitations. Details on sampling and

  4. Case studies of hazardous-waste treatment to remove volatile organics. Volume 2

    Microsoft Academic Search

    C. Allen; M. Branscome; C. Northeim; K. Leese; S. Harkins

    1987-01-01

    Case studies are presented for treatment of refinery wastes in a pilot-scale thin-film evaporator, the removal of volatiles from industrial wastewater for two steam strippers, and the removal of semivolatiles from water by steam stripping followed by liquid-phase carbon adsorption. The report provides data on removal efficiency, air emissions, process residuals, treatment costs, and process limitations. Details on sampling and

  5. Feasibility study of detection of hazardous airborne pollutants using passive open-path FTIR

    NASA Astrophysics Data System (ADS)

    Segal-Rosenheimer, M.; Dubowski, Y.; Jahn, C.; Schäfer, K.; Gerl, G.; Linker, R.

    2010-04-01

    In recent years open-path FTIR systems (active and passive) have demonstrated great potential and success for monitoring air pollution, industrial stack emissions, and trace gas constituents in the atmosphere. However, most of the studies were focused mainly on monitoring gaseous species and very few studies have investigated the feasibility of detecting bio-aerosols and dust by passive open-path FTIR measurements. The goal of the present study was to test the feasibility of detecting a cloud of toxic aerosols by a passive mode open-path FTIR. More specifically, we are focusing on the detection of toxic organophosphorous nerve agents for which we use Tri-2-ethyl-hexyl-phosphate as a model compound. We have determined the compounds' optical properties, which were needed for the radiative calculations, using a procedure developed in our laboratory. In addition, measurements of the aerosol size distribution in an airborne cloud were performed, which provided the additional input required for the radiative transfer model. This allowed simulation of the radiance signal that would be measured by the FTIR instrument and hence estimation of the detection limit of such a cloud. Preliminary outdoor measurements have demonstrated the possibility of detecting such a cloud using two detection methods. However, even in a simple case consisting of the detection of a pure airborne cloud, detection is not straightforward and reliable identification of the compound would require more advanced methods than simple correlation with spectral library.

  6. Space traffic hazards from orbital debris mitigation strategies

    NASA Astrophysics Data System (ADS)

    Smirnov, N. N.; Kiselev, A. B.; Smirnova, M. N.; Nikitin, V. F.

    2015-04-01

    The paper gives coverage of recent advances in mathematical modeling of long-term orbital debris evolution within the framework of a continuum approach. Under this approach the evolution equations contain a number of source terms responsible for the variations of the quantities of different fractions of the orbital debris population due to fragmentations and collisions. Mechanisms of hypervelocity collisions of debris fragments with pressurized vessels are investigated. A honeycomb spacecraft-shield concept is suggested, based on principles of impact energy conversion, redistribution, and consumption by destroyable structures. The paper is devoted to the 100th anniversary of the founder of space debris research at Moscow State University, Prof. G.A. Tyulin.
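
    The paper's actual evolution equations are not reproduced in the abstract; the toy model below merely illustrates the general shape of a continuum-style balance equation with launch, collisional-fragmentation, and decay source terms. All coefficients are made up for illustration.

```python
# Toy sketch of a debris-population balance equation:
#   dN/dt = L + k_frag * k_col * N**2 - N / tau
# where L is the launch rate, the quadratic term represents fragments produced
# by mutual collisions, and tau is a characteristic orbital-decay time.

def simulate_debris(n0=10_000.0, launches_per_year=100.0, k_col=1e-9,
                    fragments_per_collision=100.0, tau_years=25.0,
                    years=200, dt=0.1):
    n, history = n0, []
    for _ in range(int(years / dt)):
        collisions = k_col * n * n                 # collision rate grows as N^2
        dn = (launches_per_year
              + fragments_per_collision * collisions
              - n / tau_years)
        n += dn * dt                               # explicit Euler step
        history.append(n)
    return history

traj = simulate_debris()
print(f"Objects after 200 years: {traj[-1]:,.0f}")
```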

  7. Space elevator radiation hazards and how to mitigate them.

    SciTech Connect

    Jorgensen, A. M. (Anders M.); Gassend, B.; Friedel, R. H. W. (Reiner H. W.); Cayton, T. E. (Thomas E.); Patamia, S. E. (Steven E.)

    2004-01-01

    The conclusions of this paper are: (1) the radiation field is severe; (2) shielding with aluminium is not economical; (3) shielding with a magnetic field may be feasible; (4) reducing dose by going faster is not very effective; (5) larger/heavier climbers are more efficient when shielding with a heavy material (a contrary requirement to the talk by Ben Shelef); (6) climber mass and cost to orbit are impacted; and (7) power requirements could be impacted.

  8. Preparedness and Mitigation Systems for Asian Tsunami-Type Hazards

    NASA Astrophysics Data System (ADS)

    Aswathanarayana, U.

    2005-03-01

    The devastating impact of the 26 December 2004 Indian Ocean tsunami (also known as the Asian tsunami) on coastal communities has been widely reported in the media. The tsunami has so traumatized the public that governments are under pressure to spend vast amounts of money for warning and protective measures against the tsunamis. It should not, however, be forgotten that tsunamis are comparatively rare events, and consequently the expenditure on preparedness should be commensurate with the probability of risk. It is safe to state that the probability of an equally powerful tsunami being triggered at around the same location in the next few decades is low (http://www.msnbc.msn.com/id/6759529/site/newsweek/), for two reasons: (1) The repeat time for Sumatran-type subduction zone earthquakes is typically 200-300 years; and (2) the tsunami of 26 December was rendered so powerful because of the sudden release of the stress energy that accumulated over a long period of time in the area. This does not, however, preclude less powerful tsunamis from being set off in the future at this or other locations in the Sumatran belt.

  9. El Nino - La Nina Implications on Flood Hazard Mitigation

    SciTech Connect

    R. French; J. Miller

    2006-03-31

    The effects of El Nino and La Nina periods on the maximum daily winter period depths of precipitation are examined using records from five precipitation gages on the Nevada Test Site. The potential implications of these effects are discussed.

  10. Space elevator radiation hazards and how to mitigate them

    Microsoft Academic Search

    A. M. Jorgensen; B. Gassend; R. H. W. Friedel; T. E. Cayton; S. E. Patamia

    2004-01-01

    The conclusions of this paper are: (1) the radiation field is severe; (2) shielding with aluminium is not economical; (3) shielding with a magnetic field may be feasible; (4) reducing dose by going faster is not very effective; (5) larger/heavier climbers are more efficient when shielding with a heavy material (a contrary requirement to the talk by Ben Shelef); (6) climber mass

  11. Creating Probabilistic Multi-Peril Hazard Maps

    NASA Astrophysics Data System (ADS)

    Holliday, J. R.; Page, N. A.; Rundle, J. B.

    2011-12-01

    An often overlooked component of natural hazards is the element of human involvement. Physical events--such as massive earthquakes--that do not affect society constitute natural phenomena, but are not necessarily natural hazards. Natural phenomena that occur in populated areas constitute hazardous events. Furthermore, hazardous events that cause damage--either in the form of structural damage or the loss or injury of lives--constitute natural disasters. Geographic areas that do not contain human interests, by definition, cannot suffer from hazardous events. Therefore, they do not contain a component of natural hazard. Note that this definition differs from the view of natural hazards as "unavoidable havoc wreaked by the unrestrained forces of nature". On the contrary, the burden of cause is shifted from purely natural processes to the concurrent presence of human society and natural events. Although individuals can do little to change the occurrences or intensities of most natural phenomena, they can mitigate their exposure to natural events and help ensure hazardous events do not become natural disasters. For example, choosing to build new settlements in known flood zones increases the exposure--and therefore risk--to natural flood events. Similarly, while volcanoes do erupt periodically, it is the conscious act of reappropriating the rich soils formed by ejecta as farmland that makes the volcanoes hazardous. Again, this empowers individuals and makes them responsible for their own exposure to natural hazards. Various local and governmental agencies--in particular, the United States Geological Survey (USGS)--do a good job of identifying and listing various local natural hazards. These listings, however, are often treated individually and independently. Thus, it is often difficult to construct a "big picture" image of total natural hazard exposure. In this presentation, we discuss methods of identifying and combining different natural hazards for a given location. We then further refine our calculation into a single value--a hazard exposure index--and show how this value can be used to create multi-peril hazard maps. As an example, we contour and present one such map for all of California.
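
    The abstract does not define how the hazard exposure index is actually computed; one simple, hypothetical way to collapse several per-peril annual probabilities into a single value for a map cell is to assume the perils are independent and combine them as sketched below.

```python
def combined_annual_probability(per_peril_probs):
    """Probability that at least one peril occurs in a year, assuming
    independent perils: p = 1 - prod(1 - p_i)."""
    p_none = 1.0
    for p in per_peril_probs:
        p_none *= (1.0 - p)
    return 1.0 - p_none

# Hypothetical annual probabilities for one map cell.
perils = {"earthquake": 0.002, "flood": 0.01, "wildfire": 0.005}
index = combined_annual_probability(perils.values())
print(f"Hazard exposure index (combined annual probability): {index:.4f}")
```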

  12. Tsunami hazard assessment in Greece - review of numerical modeling (numerical simulations) from ten different studies

    NASA Astrophysics Data System (ADS)

    Maris, Fotios; Karagiorgos, Konstantinos; Fuchs, Sven; Kitikidou, Kyriaki

    2015-04-01

    Efforts towards the quantification of tsunamis, even though they started about seventy-five years ago, remain a puzzling aspect, since the several scales proposed to measure tsunami size are confusing regarding intensity or magnitude, or are difficult to apply. In the present work, ten different studies that relate specifically to numerical modeling of tsunami prediction and handling in Greek areas are reviewed. Even though current knowledge cannot guarantee the appearance and severity of a tsunami, with the development of numerical modeling tsunami predictions can become more attuned and warnings more specific.

  13. Flood Hazard Assessment for the Savannah River Site

    SciTech Connect

    Chen, K.F.

    1999-07-08

    A method was developed to determine the probabilistic flood elevation curves for certain Savannah River Site (SRS) facilities. This paper presents the method used to determine the probabilistic flood elevation curve for F-Area due to runoff from the Upper Three Runs basin. Department of Energy (DOE) Order 420.1, Facility Safety, outlines the requirements for Natural Phenomena Hazard (NPH) mitigation for new and existing DOE facilities. The NPH considered in this paper is flooding. The facility-specific probabilistic flood hazard curve defines, as a function of water elevation, the annual probability of occurrence or the return period in years. Based on facility-specific probabilistic flood hazard curves and the nature of facility operations (e.g., involving hazardous or radioactive materials), facility managers can design permanent or temporary devices to prevent the propagation of floods on site, and develop emergency preparedness plans to mitigate the consequences of floods.
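
    As a minimal illustration (not the SRS methodology itself), a facility-specific hazard curve of this kind can be interrogated by interpolating elevation against the logarithm of annual exceedance probability; the curve values below are hypothetical.

```python
import numpy as np

# Hypothetical flood hazard curve: water-surface elevation (ft) versus
# annual exceedance probability (AEP); values are illustrative only.
elevation_ft = np.array([270.0, 275.0, 280.0, 285.0, 290.0])
annual_exceedance = np.array([1e-1, 1e-2, 1e-3, 1e-4, 1e-5])

def design_flood_elevation(target_aep):
    """Interpolate the hazard curve (linear in log-AEP) to find the elevation
    whose annual exceedance probability equals the target."""
    return float(np.interp(np.log10(target_aep),
                           np.log10(annual_exceedance[::-1]),
                           elevation_ft[::-1]))

# Elevation associated with a 1-in-10,000-year performance goal:
print(f"{design_flood_elevation(1e-4):.1f} ft")
```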

  14. Hazard Alert Cards

    MedlinePLUS

    Hazard Alerts are short, image-driven materials that deliver simple, ... by clicking on links below. To order Hazard Alert pocket cards: Email csinyai@cpwr.com or call ...

  15. Hazard Alert: Trenches

    MedlinePLUS

    HAZARD ALERT: Find out more about safe work in trenches and about construction hazards. Get more of these Hazard Alert cards – and cards on other topics. Call 301- ... (Construction Chart Book, p. 39. CPWR. 2008.)

  16. Action on Hazardous Wastes.

    ERIC Educational Resources Information Center

    EPA Journal, 1979

    1979-01-01

    U.S. EPA is gearing up to investigate about 300 hazardous waste dump sites per year that could pose an imminent health hazard. Prosecutions are expected to result from the priority effort at investigating illegal hazardous waste disposal. (RE)

  17. Hazard and risk assessment of a nanoparticulate cerium oxide-based diesel fuel additive - a case study.

    PubMed

    Park, Barry; Donaldson, Kenneth; Duffin, Rodger; Tran, Lang; Kelly, Frank; Mudway, Ian; Morin, Jean-Paul; Guest, Robert; Jenkinson, Peter; Samaras, Zissis; Giannouli, Myrsini; Kouridis, Haris; Martin, Patricia

    2008-04-01

    Envirox is a scientifically and commercially proven diesel fuel combustion catalyst based on nanoparticulate cerium oxide and has been demonstrated to reduce fuel consumption, greenhouse gas emissions (CO(2)), and particulate emissions when added to diesel at levels of 5 mg/L. Studies have confirmed the adverse effects of particulates on respiratory and cardiac health, and while the use of Envirox contributes to a reduction in the particulate content in the air, it is necessary to demonstrate that the addition of Envirox does not alter the intrinsic toxicity of particles emitted in the exhaust. The purpose of this study was to evaluate the safety in use of Envirox by addressing the classical risk paradigm. Hazard assessment has been addressed by examining a range of in vitro cell and cell-free endpoints to assess the toxicity of cerium oxide nanoparticles as well as particulates emitted from engines using Envirox. Exposure assessment has taken data from modeling studies and from airborne monitoring sites in London and Newcastle adjacent to routes where vehicles using Envirox passed. Data have demonstrated that for the exposure levels measured, the estimated internal dose for a referential human in a chronic exposure situation is much lower than the no-observed-effect level (NOEL) in the in vitro toxicity studies. Exposure to nano-size cerium oxide as a result of the addition of Envirox to diesel fuel at the current levels of exposure in ambient air is therefore unlikely to lead to pulmonary oxidative stress and inflammation, which are the precursors for respiratory and cardiac health problems. PMID:18444008

  18. Age, skill, and hazard perception in driving.

    PubMed

    Borowsky, Avinoam; Shinar, David; Oron-Gilad, Tal

    2010-07-01

    This study examined the effects of age and driving experience on the ability to detect hazards while driving, namely, hazard perception. Studies have shown that young-inexperienced drivers are more likely than experienced drivers to suffer from hazard perception deficiencies. However, it remains to be determined whether this skill deteriorates with advancing age. Twenty-one young-inexperienced, 19 experienced, and 16 elderly drivers viewed six hazard perception movies while connected to an eye tracking system and were requested to identify hazardous situations. Four movies embedded planned, highly hazardous situations and the rest were used as controls. Generally, experienced and older-experienced drivers were equally proficient at hazard detection and detected potentially hazardous events (e.g., approaching an intersection, pedestrians on a curb) continuously, whereas young-inexperienced drivers stopped reporting hazards that followed planned, highly hazardous situations. Moreover, while approaching T intersections, older and experienced drivers fixated more towards the merging road on the right, while young-inexperienced drivers fixated straight ahead, paying less attention to potential vehicles on the merging road. The study suggests that driving experience improves drivers' awareness of potential hazards and guides drivers' eye movements to locations that might embed potential risks. Furthermore, advanced age hardly affects older drivers' ability to perceive hazards, and older drivers are at least partially aware of their age-related limitations. PMID:20441838

  19. Man-Made Major Hazards Like Earthquake or Explosion; Case Study, Turkish Mine Explosion (13 May 2014)

    PubMed Central

    VASHEGHANI FARAHANI, Jamileh

    2014-01-01

    Abstract: All over the world, mining is considered a high-risk activity fraught with serious hazards, not only for the miners, engineers, and other people working in it, but also for people who live near the mines. In this article, our main purpose is to examine some major mine disasters and safety in mines; the case study is a coal mine in Turkey. Safety in mines is one of the most important issues that needs attention. Therefore, it is suggested that existing deficiencies in mines should be removed by continuous monitoring of all devices and equipment, control of methane, and safe separation of coal from the mine. Moreover, we recommend that early warning systems should be installed to alert fire departments, hospitals, the Red Crescent, and other major relief organizations to explosions, fires, and other dangerous events. Experience from previous events in mines can help managers and miners. With plans and projects that address disasters in mines and solutions for them, diseases such as black lung disease and other problems in mines, such as carbon monoxide poisoning, can be forestalled. Before mine owners begin their activity, they must research the environmental and social effects of their activities. Therefore, they should identify important hazards and determine essential tasks to remove them or control risks, in collaboration with other scientists. PMID:26060707

  20. Treatability study on the use of by-product sulfur in Kazakhstan for the stabilization of hazardous and radioactive wastes

    SciTech Connect

    Kalb, P.D.; Milian, L.W. [Brookhaven National Lab., Upton, NY (United States). Environmental and Waste Technology Center; Yim, S.P. [Korea Atomic Energy Research Inst. (Korea, Republic of); Dyer, R.S.; Michaud, W.R. [Environmental Protection Agency (United States)

    1997-12-01

    The Republic of Kazakhstan generates significant quantities of excess elemental sulfur from the production and refining of petroleum reserves. In addition, the country also produces hazardous and radioactive wastes which require treatment/stabilization. In an effort to find secondary uses for the elemental sulfur, and simultaneously produce a material which could be used to encapsulate and reduce the dispersion of harmful contaminants into the environment, BNL evaluated the use of the sulfur polymer cement (SPC) produced from by-product sulfur in Kazakhstan. This thermoplastic binder material forms a durable waste form with low leaching properties and is compatible with a wide range of waste types. Several hundred kilograms of Kazakhstan sulfur were shipped to the US and converted to SPC (by reaction with 5 wt% organic modifiers) for use in this study. A phosphogypsum sand waste generated in Kazakhstan during the purification of phosphate fertilizer was selected for treatment. Waste loadings of 40 wt% were easily achieved. Waste form performance testing included compressive strength, water immersion, and Accelerated Leach Testing.

  1. Treatability study on the use of by-product sulfur in Kazakhstan for the stabilization of hazardous and radioactive wastes

    SciTech Connect

    Yim, Sung Paal; Kalb, P.D.; Milian, L.W.

    1997-08-01

    The Republic of Kazakhstan generates significant quantities of excess sulfur from the production and refining of petroleum reserves. In addition, the country also produces hazardous and radioactive wastes which require treatment/stabilization. In an effort to find secondary uses for the elemental sulfur, and simultaneously produce a material which could be used to encapsulate and reduce the dispersion of harmful contaminants into the environment, BNL evaluated the use of the sulfur polymer cement (SPC) produced from by-product sulfur in Kazakhstan. This thermoplastic binder material forms a durable waste form with low leaching properties and is compatible with a wide range of waste types. Several hundred kilograms of Kazakhstan sulfur were shipped to the U.S. and converted to SPC (by reaction with 5 wt% organic modifiers) for use in this study. A phosphogypsum sand waste generated in Kazakhstan during the purification of phosphate fertilizer was selected for treatment. Waste loadings of 40 wt% were easily achieved. Waste form performance testing included compressive strength, water immersion, and Accelerated Leach Testing. 14 refs., 7 figs., 6 tabs.

  2. Basic Exploratory Research versus Guideline-Compliant Studies Used for Hazard Evaluation and Risk Assessment: Bisphenol A as a Case Study

    PubMed Central

    Tyl, Rochelle W.

    2009-01-01

    Background Myers et al. [Environ Health Perspect 117:309–315 (2009)] argued that Good Laboratory Practices (GLPs) cannot be used as a criterion for selecting data for risk assessment, using bisphenol A (BPA) as a case study. They did not discuss the role(s) of guideline-compliant studies versus basic/exploratory research studies, and they criticized both GLPs and guideline-compliant studies and their roles in formal hazard evaluation and risk assessment. They also specifically criticized our published guideline-compliant dietary studies on BPA in rats and mice and 17β-estradiol (E2) in mice. Objectives As the study director/first author of the criticized E2 and BPA studies, I discuss the uses of basic research versus guideline-compliant studies, how testing guidelines are developed and revised, how new end points are validated, and the role of GLPs. I also provide an overview of the BPA guideline-compliant and exploratory research animal studies and describe BPA pharmacokinetics in rats and humans. I present responses to specific criticisms by Myers et al. Discussion and conclusions Weight-of-evidence evaluations have consistently concluded that low-level BPA oral exposures do not adversely affect human developmental or reproductive health, and I encourage increased validation efforts for “new” end points for inclusion in guideline studies, as well as performance of robust long-term studies to follow early effects (observed in small exploratory studies) to any adverse consequences. PMID:20049112

  3. Evaluation of physical health effects due to volcanic hazards: human studies

    SciTech Connect

    Buist, A.S.; Bernstein, R.S.; Johnson, L.R.; Vollmer, W.M.

    1986-03-01

    Despite certain gaps in the knowledge that has been gained from the studies done in the aftermath of Mount St. Helens and following previous major volcanic eruptions, we can be confident that the health effects of both short- and long-term exposures to the relatively low levels of airborne volcanic ash that are typical following such a volcanic eruption are minor. These effects seem to relate more to the irritating effect of the ash on mucous membranes and airway epithelia than to the potential of the ash (due to its free crystalline silica content) to initiate a fibrotic response. Nevertheless, common sense dictates that exposures should be minimized whenever possible by use of appropriate preventive measures, such as wetting the sedimented ash before disturbing it, and using commercially available disposable paper masks meeting NIOSH code TC-23 for dusts for light exposures and industrial half- and full-faced respirator and goggles for more extensive and heavy exposure.

  4. Flood-hazard study: 100-year flood stage for Lucerne Lake, San Bernardino County, California

    USGS Publications Warehouse

    Busby, Mark William

    1977-01-01

    A study of the flood hydrology of Lucerne Valley, Calif., was made to develop the 100-year stage for Lucerne Lake. Synthetic-hydrologic techniques were used, and the 100-year flood stage was estimated to be at an elevation of 2,849.3 feet above mean sea level. Channel dimensions were measured at 59 sites in Lucerne Valley. Drainage area-discharge relations developed from channel-geometry data for nearby sites were used to estimate the discharge at 12 additional sites where channel geometry could not be measured. In order to compute the total volume discharged into the playa, the peak discharges were converted to volumes. From the Apple Valley report (Busby, 1975), the equation formulated from the relation between peak discharge and flood volume for the deserts of California was used to compute the flood volumes for routing into Lucerne Lake. (Woodard-USGS)

  5. Problems with mitigation translocation of herpetofauna.

    PubMed

    Sullivan, Brian K; Nowak, Erika M; Kwiatkowski, Matthew A

    2015-02-01

    Mitigation translocation of nuisance animals is a commonly used management practice aimed at resolution of human-animal conflict by removal and release of an individual animal. Long considered a reasonable undertaking, especially by the general public, it is now known that translocated subjects are negatively affected by the practice. Mitigation translocation is typically undertaken with individual adult organisms and has a much lower success rate than the more widely practiced conservation translocation of threatened and endangered species. Nonetheless, the public and many conservation practitioners believe that because population-level conservation translocations have been successful that mitigation translocation can be satisfactorily applied to a wide variety of human-wildlife conflict situations. We reviewed mitigation translocations of reptiles, including our own work with 3 long-lived species (Gila monsters [Heloderma suspectum], Sonoran desert tortoises [Gopherus morafkai], and western diamond-backed rattlesnakes [Crotalus atrox]). Overall, mitigation translocation had a low success rate when judged either by effects on individuals (in all studies reviewed they exhibited increased movement or increased mortality) or by the success of the resolution of the human-animal conflict (translocated individuals often returned to the capture site). Careful planning and identification of knowledge gaps are critical to increasing success rates in mitigation translocations in the face of increasing pressure to find solutions for species threatened by diverse anthropogenic factors, including climate change and exurban and energy development. PMID:25040040

  6. Relative Hazard and Risk Measure Calculation Methodology

    SciTech Connect

    Stenner, Robert D.; Strenge, Dennis L.; Elder, Matthew S.

    2004-03-20

    The relative hazard (RH) and risk measure (RM) methodology and computer code is a health risk-based tool designed to allow managers and environmental decision makers the opportunity to readily consider human health risks (i.e., public and worker risks) in their screening-level analysis of alternative cleanup strategies. Environmental management decisions involve consideration of costs, schedules, regulatory requirements, health hazards, and risks. The RH-RM tool is a risk-based environmental management decision tool that allows managers the ability to predict and track health hazards and risks over time as they change in relation to mitigation and cleanup actions. Analysis of the hazards and risks associated with planned mitigation and cleanup actions provides a baseline against which alternative strategies can be compared. This new tool allows managers to explore “what if scenarios,” to better understand the impact of alternative mitigation and cleanup actions (i.e., alternatives to the planned actions) on health hazards and risks. This new tool allows managers to screen alternatives on the basis of human health risk and compare the results with cost and other factors pertinent to the decision. Once an alternative or a narrow set of alternatives are selected, it will then be more cost-effective to perform the detailed risk analysis necessary for programmatic and regulatory acceptance of the selected alternative. The RH-RM code has been integrated into the PNNL developed Framework for Risk Analysis In Multimedia Environmental Systems (FRAMES) to allow the input and output data of the RH-RM code to be readily shared with the more comprehensive risk analysis models, such as the PNNL developed Multimedia Environmental Pollutant Assessment System (MEPAS) model.

  7. A study of professional radiation hazards in CT scan and nuclear medicine workers using GTG-banding and solid stain

    PubMed Central

    Changizi, Vahid; Alizadeh, Mohammad Hossein; Mousavi, Akbar

    2015-01-01

    Background: CT scan and nuclear medicine exams deliver a great part of medical exposures. This study examined professional radiation hazards in CT scan and nuclear medicine workers. Methods: In a cross-sectional study, 30 occupationally exposed workers and 7 controls (all from the personnel of a laboratory) were selected. Physical dosimetry was performed for the exposed workers. Blood samples were obtained from the experimental and control groups. Three culture media were prepared for each sample for routine chromosome analysis using G-banding and solid stain. Results: There was a significantly increased incidence of chromatid gaps (ctg) and chromatid breaks (ctb), with mean±SD frequencies of 3±0.84 and 3.1±1.40 per 100 cells respectively, in the nuclear medicine workers versus controls, with mean±SD frequencies of 1.9±0.69 and 1.3±0.84 for ctg and ctb, respectively. Chromosome gaps (chrg) were significantly higher in the nuclear medicine population (2.47±0.91) than in controls (1.4±0.9) (p<0.05). In the CT scan group, ctg and ctb were increased, with mean±SD frequencies of 2.7±0.79 and 2.6±0.91 per 100 cells respectively, compared with the control group. The mean±SD frequencies of chromosome breaks (chrb) were 2.0±0.75 and 0.86±0.690 per 100 cells for exposed workers and the control group, respectively. Conclusion: This study showed that chromosome aberrations in peripheral lymphocytes detected using the solid stain method are a reasonable biomarker reflecting personnel radiation damage.

  8. Reproductive Hazards in the Lab Reproductive Hazards

    E-print Network

    de Lijser, Peter

    The term reproductive hazard refers to agents (radiation, X-rays, chemicals, or biologicals) that affect the reproductive health of women or men. Organic solvents, such as xylene, and pesticides have been shown to have effects on human reproductive systems.

  9. What is Hazardous Hazardous waste is

    E-print Network

    de Lijser, Peter

    Hazardous waste is any product characterized or labeled as toxic, reactive, corrosive, flammable, or combustible that is unwanted, discarded, or no longer useful. This waste may be harmful to human health and/or the environment. Hazardous Waste Disposal: EH&S x7233

  10. RFI Mitigation Steve Ellingson

    E-print Network

    Ellingson, Steven W.

    Presentation by Steve Ellingson (Virginia Polytechnic Institute & State University) at the "Frontiers of Astronomy with the World's Largest Radio Telescope" meeting, September 13, 2007. Slide fragments cover RFI problems, time-series matched-filter output, rank and pulse detectors, and nulling.

  11. Seismic Hazard of Eritrea

    NASA Astrophysics Data System (ADS)

    Hagos, L.; Arvidsson, R.

    2003-04-01

    The method of spatially smoothed seismicity developed by Frankel (1995) and later extended by Lapajne et al. (1997) is applied to estimate the seismic hazard of Eritrea. The extended method, unlike the original one, involves the delineation of the whole region into subregions with statistically determined directions of seismogenic faults pertaining to the respective tectonic regions (Poljak, 2000). Fault-rupture-oriented elliptical Gaussian smoothing results in spatial models of expected seismicity. The seismic catalogue was compiled from ISC, NEIC, and Turyomurgyendo (1996) and homogenized to Ms. Three seismicity models suggested by Frankel (1995), which are based on different time and magnitude intervals, are used in this approach, and a fourth model suggested by Lapajne et al. (2000), which is based on seismic energy release, is also used to enhance the influence of historical events on the hazard computation. Activity rates and maximum likelihood estimates of b-values for the different models are computed using the OHAZ program. The western part of the region shows no seismic activity. The b-value for models 1-3 is estimated to be 0.91. Mmax has been estimated to be 7.0. Correlation distances are obtained objectively from the location error in the seismic catalogue. The attenuation relationship by Ambraseys et al. (1996) was found suitable for the region under study. PGA values for 10% probability of exceedance in 50 years (return period of 475 years) are computed for each model, and a combined seismic hazard map was produced by subjectively assigning weights to each of the models. A worst-case map is also obtained, showing the highest PGA values at each location from the four hazard maps. The map indicates higher hazard along the main tectonic features of the East African and Red Sea rift systems, with the highest PGA values within Eritrea, exceeding 25% of g, located north of the Red Sea port of Massawa. In areas around Asmara PGA values exceed 10% of g.
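
    The abstract reports maximum likelihood b-value estimates but not the estimator itself; a common choice is the Aki (1965) formula with Utsu's correction for binned magnitudes, sketched below on a purely hypothetical catalogue (not the Eritrean one).

```python
import math

def b_value_mle(magnitudes, m_min, bin_width=0.1):
    """Maximum-likelihood b-value for a Gutenberg-Richter relation:
    b = log10(e) / (mean(M) - (m_min - bin_width/2)), using only events
    at or above the completeness magnitude m_min."""
    above = [m for m in magnitudes if m >= m_min]
    mean_m = sum(above) / len(above)
    return math.log10(math.e) / (mean_m - (m_min - bin_width / 2.0))

# Hypothetical homogenized Ms values for illustration only.
catalogue = [4.0, 4.1, 4.3, 4.2, 4.6, 5.0, 4.4, 4.8, 5.3, 4.1, 4.5, 5.8, 4.2]
print(f"b ~= {b_value_mle(catalogue, m_min=4.0):.2f}")
```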

  12. Public control of environmental health hazards (clinical and experimental studies of distal axonopathy--a frequent form of brain and nerve damage produced by environmental chemical hazards)

    Microsoft Academic Search

    H. H. Schaumburg; P. S. Spencer

    1979-01-01

    Clinical and pathological studies of the peripheral and central nervous system degeneration (distal dying-back axonopathy) in humans and experimental animals produced by acrylamide monomer and certain hydrocarbon compounds are summarized. The human distal axonopathies include: many of the naturally occurring, genetically determined system disorders; certain nutritional disorders; uremic neuropathy; the neuropathies associated with some malignancies; and the toxic neuropathies induced

  13. In-Space Propulsion Engine Architecture Based on Sublimation of Planetary Resources: From Exploration Robots to NEO Mitigation

    NASA Technical Reports Server (NTRS)

    Sibille, Laurent; Mantovani, James; Dominquez, Jesus

    2011-01-01

    The purpose of this NIAC study is to identify those volatile and mineral resources that are available on asteroids, comets, moons and planets in the solar system, and investigate methods to transform these resources into forms of power that will expand the capabilities of future robotic and human exploration missions to explore planetary bodies beyond the Moon and will mitigate hazards from NEOs. The sources of power used for deep space probe missions are usually derived from either solar panels for electrical energy, radioisotope thermal generators for thermal energy, or fuel cells and chemical reactions for chemical energy and propulsion.

  14. Probabilistic flood hazard mapping: effects of uncertain boundary conditions

    NASA Astrophysics Data System (ADS)

    Domeneghetti, A.; Vorogushyn, S.; Castellarin, A.; Merz, B.; Brath, A.

    2013-08-01

    Comprehensive flood risk assessment studies should quantify the global uncertainty in flood hazard estimation, for instance by mapping inundation extents together with their confidence intervals. This appears of particular importance in the case of flood hazard assessments along dike-protected reaches, where the possibility of occurrence of dike failures may considerably enhance the uncertainty. We present a methodology to derive probabilistic flood maps in dike-protected flood prone areas, where several sources of uncertainty are taken into account. In particular, this paper focuses on a 50 km reach of River Po (Italy) and three major sources of uncertainty in hydraulic modelling and flood mapping: uncertainties in the (i) upstream and (ii) downstream boundary conditions, and (iii) uncertainties in dike failures. Uncertainties in the definition of upstream boundary conditions (i.e. design-hydrographs) are assessed through a copula-based bivariate analysis of flood peaks and volumes. Uncertainties in the definition of downstream boundary conditions are characterised by uncertainty in the rating curve with confidence intervals which reflect discharge measurement and interpolation errors. The effects of uncertainties in boundary conditions and randomness of dike failures are assessed by means of the Inundation Hazard Assessment Model (IHAM), a recently proposed hybrid probabilistic-deterministic model that considers three different dike failure mechanisms: overtopping, piping and micro-instability due to seepage. The results of the study show that the IHAM-based analysis enables probabilistic flood hazard mapping and provides decision-makers with a fundamental piece of information for devising and implementing flood risk mitigation strategies in the presence of various sources of uncertainty.
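
    The abstract states that upstream boundary-condition uncertainty is characterised by a copula-based bivariate analysis of flood peaks and volumes, but it does not specify the copula family or the marginals; the sketch below uses a Clayton copula and Gumbel marginals purely as an illustrative assumption, with hypothetical parameter values.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_clayton(n, theta):
    """Draw n dependent uniform pairs (u1, u2) from a Clayton copula via
    conditional inversion; theta > 0 controls the peak-volume dependence."""
    u1 = rng.uniform(size=n)
    w = rng.uniform(size=n)
    u2 = (u1 ** (-theta) * (w ** (-theta / (1.0 + theta)) - 1.0) + 1.0) ** (-1.0 / theta)
    return u1, u2

def gumbel_ppf(u, loc, scale):
    """Inverse CDF of a Gumbel (EV1) marginal, a common choice for flood variables."""
    return loc - scale * np.log(-np.log(u))

# Hypothetical marginal parameters; the fitted Po River marginals are not given here.
u_peak, u_vol = sample_clayton(1000, theta=2.0)
peaks_m3s = gumbel_ppf(u_peak, loc=8000.0, scale=1500.0)   # flood peak discharge
volumes_hm3 = gumbel_ppf(u_vol, loc=900.0, scale=250.0)    # flood event volume
print(f"Pearson correlation of sampled peak-volume pairs: "
      f"{np.corrcoef(peaks_m3s, volumes_hm3)[0, 1]:.2f}")
```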

  15. International Studies of Hazardous Groundwater/Surface Water Exchange in the Volcanic Eruption and Tsunami Affected Areas of Kamchatka

    NASA Astrophysics Data System (ADS)

    Kontar, Y. A.; Gusiakov, V. K.; Izbekov, P. E.; Gordeev, E.; Titov, V. V.; Verstraeten, I. M.; Pinegina, T. K.; Tsadikovsky, E. I.; Heilweil, V. M.; Gingerich, S. B.

    2012-12-01

    During the US-Russia Geohazards Workshop held July 17-19, 2012 in Moscow, Russia the international research effort was asked to identify cooperative actions for disaster risk reduction, focusing on extreme geophysical events. As a part of this recommendation the PIRE project was developed to understand, quantify, forecast and protect the coastal zone aquifers and inland water resources of Kamchatka (Russia) and its ecosystems affected by the November 4, 1952 Kamchatka tsunami (Khalatyrka Beach near Petropavlovsk-Kamchatskiy) and the January 2, 1996 Karymskiy volcano eruption and the lake tsunami. This project brings together teams from U.S. universities and research institutions located in Russia. The research consortium was briefed on recent technical developments and will utilize samples secured via major international volcanic and tsunami programs for the purpose of advancing the study of submarine groundwater discharge (SGD) in the volcanic eruption and tsunami affected coastal areas and inland lakes of Kamchatka. We plan to accomplish this project by developing and applying the next generation of field sampling, remote sensing, laboratory techniques and mathematical tools to study groundwater-surface water interaction processes and SGD. We will develop a field and modeling approach to define SGD environment, key controls, and influence of volcano eruption and tsunami, which will provide a framework for making recommendations to combat contamination. This is valuable for politicians, water resource managers and decision-makers and for the volcano eruption and tsunami affected region water supply and water quality of Kamchatka. Data mining and results of our field work will be compiled for spatial modeling by Geo-Information System (GIS) using 3-D Earth Systems Visualization Lab. The field and model results will be communicated to interested stakeholders via an interactive web site. This will allow computation of SGD spatial patterns. In addition, thanks to the conceptual integrated approach, the mathematical tool will be transportable to other regions affected by volcanic eruption and tsunami. We will involve students in the work, incorporate the results into our teaching portfolio and work closely with the IUGG GeoRisk Commission and AGU Natural Hazards Focus Group to communicate our findings to the broader public, specifically local communities that will be most impacted. Under the PIRE education component, a cohort of U.S. and Russian post-doctoral researchers and students will receive training and contribute to the overall natural hazards SGD science agenda in cooperation with senior U.S. researchers and leading investigators from the Russian institutions. Overall, the extensive team of researchers, students and institutions is poised to deliver an innovative and broad spectrum of science associated with the study of SGD in the volcanic eruption and tsunami affected areas, in a way not possible to achieve in isolation.

  16. Ab initio study of adsorption properties of hazardous organic molecules on graphene: Phenol, phenyl azide, and phenylnitrene

    NASA Astrophysics Data System (ADS)

    Lee, Junsu; Min, Kyung-Ah; Hong, Suklyun; Kim, Gunn

    2015-01-01

    Phenol, phenyl azide, and phenylnitrene are hazardous organic molecules; therefore, the fabrication of sensors or filters with high sorption capabilities for the chemicals is necessary. Considering van der Waals interaction, we perform first-principles density functional theory calculations to investigate the adsorption properties of the hazardous molecules on graphene. For parallel stacking configurations, AB stacking is slightly more favorable than AA stacking for all the adsorbates that we considered. We find that phenyl azide has a higher adsorption energy than phenol. Phenylnitrene forms covalent bonds with graphene in oblique stacking structures, resulting in a bandgap opening in graphene.

  17. Environmental impact assessment of structural flood mitigation measures by a rapid impact assessment matrix (RIAM) technique: a case study in Metro Manila, Philippines.

    PubMed

    Gilbuena, Romeo; Kawamura, Akira; Medina, Reynaldo; Amaguchi, Hideo; Nakagawa, Naoko; Bui, Duong Du

    2013-07-01

    In recent decades, the practice of environmental impact assessment (EIA) in the planning processes of infrastructure projects has created significant awareness of the benefits of environmentally sound and sustainable urban development around the world. In the highly urbanized megacities in the Philippines, like Metro Manila, high priority is given by the national government to structural flood mitigation measures (SFMM) due to the persistently high frequency of flood-related disasters, which are exacerbated by the on-going effects of climate change. EIA, thus, should be carefully and effectively executed to maximize the potential benefits of the SFMM. The common practice of EIA in the Philippines is generally qualitative and lacks a clear methodology for evaluating multi-criteria systems. Thus, this study proposes the use of the rapid impact assessment matrix (RIAM) technique to provide a method that would systematically and quantitatively evaluate the socio-economic and environmental impacts of planned SFMM in Metro Manila. The RIAM technique was slightly modified to fit the requirements of this study. The scale of impact was determined for each perceived impact, and based on the results, the planned SFMM for Metro Manila will likely bring significant benefits; however, significant negative impacts may also occur. The proposed modifications were found to be highly compatible with RIAM, and the results of the RIAM analysis provided a clear view of the impacts associated with the implementation of SFMM projects. This may prove to be valuable in the practice of EIA in the Philippines. PMID:23588136
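
    In the commonly cited RIAM formulation (Pastakia and Jensen), the two group-A "importance" criteria are multiplied, the group-B "value" criteria are summed, and the environmental score is the product of the two. The abstract notes the technique was slightly modified here, so the sketch below shows only the standard, assumed scoring, with illustrative criterion values rather than figures from the study.

```python
def riam_environmental_score(a1, a2, b1, b2, b3):
    """Standard RIAM scoring (assumed): ES = (A1 * A2) * (B1 + B2 + B3)."""
    return (a1 * a2) * (b1 + b2 + b3)

# Illustrative scoring of one hypothetical SFMM impact (not values from the study).
es = riam_environmental_score(a1=2, a2=-2, b1=2, b2=2, b3=3)
print("ES =", es)  # -28; negative scores indicate adverse impacts
```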

  18. RFI Mitigation Project S. Ellingson

    E-print Network

    Ellingson, Steven W.

    RFI Mitigation Project, S. Ellingson, Dec 18, 2013 (slide extract). Agenda: background; ultimate goals; partial implementation and experiment; consider next steps. Background: original idea was a real-time RFI mitigation instrument; instrument-related issues, but no particular need for explicit RFI mitigation; two ideas going forward.

  19. wind engineering & natural disaster mitigation

    E-print Network

    Denham, Graham

    Wind engineering & natural disaster mitigation (slide extract). Investments: WindEEE Dome at Advanced Manufacturing Park, $31 million; Insurance Research Lab for Better Homes, $8 million; Advanced Facility for Avian Research, $9 million.

  20. Alternative Evaluation Study: Methods to Mitigate/Accommodate Subsidence for the Radioactive Waste Management Sites at the Nevada Test Site, Nye County Nevada, with Special Focus on Disposal Cell U-3ax/bl

    SciTech Connect

    Barker, L.

    1997-09-01

    An Alternative Evaluation Study is a type of systematic approach to problem identification and solution. An Alternative Evaluation Study was convened August 12-15, 1997, for the purpose of making recommendations concerning closure of Disposal Cell U-3ax/bl and other disposal cells and mitigation/accommodation of waste subsidence at the Radioactive Waste Management Sites at the Nevada Test Site. This report includes results of the Alternative Evaluation Study and specific recommendations.

  1. Correlates of household seismic hazard adjustment adoption.

    PubMed

    Lindell, M K; Whitney, D J

    2000-02-01

    This study examined the relationships of self-reported adoption of 12 seismic hazard adjustments (pre-impact actions to reduce danger to persons and property) with respondents' demographic characteristics, perceived risk, perceived hazard knowledge, perceived protection responsibility, and perceived attributes of the hazard adjustments. Consistent with theoretical predictions, perceived attributes of the hazard adjustments differentiated among the adjustments and had stronger correlations with adoption than any of the other predictors. These results identify the adjustments and attributes that emergency managers should address to have the greatest impact on improving household adjustment to earthquake hazard. PMID:10795335

  2. Toxic hazards of underground excavation

    SciTech Connect

    Smith, R.; Chitnis, V.; Damasian, M.; Lemm, M.; Popplesdorf, N.; Ryan, T.; Saban, C.; Cohen, J.; Smith, C.; Ciminesi, F.

    1982-09-01

    Inadvertent intrusion into natural or man-made toxic or hazardous material deposits as a consequence of activities such as mining, excavation or tunnelling has resulted in numerous deaths and injuries in this country. This study is a preliminary investigation to identify and document instances of such fatal or injurious intrusion. An objective is to provide useful insights and information related to potential hazards due to future intrusion into underground radioactive-waste-disposal facilities. The methodology used in this study includes literature review and correspondence with appropriate government agencies and organizations. Key categories of intrusion hazards are asphyxiation, methane, hydrogen sulfide, silica and asbestos, naturally occurring radionuclides, and various mine or waste dump related hazards.

  3. Occupational Hazards of Farming

    PubMed Central

    White, Gill; Cessna, Allan

    1989-01-01

    A number of occupational hazards exist for the farmer and farm worker. They include the hazards of farm machinery, biologic and chemical hazards, and social and environmental stresses. Recognizing these hazards will help the family physician care for farmers and their families. PMID:21248929

  4. A probabilistic tsunami hazard assessment for Indonesia

    NASA Astrophysics Data System (ADS)

    Horspool, N.; Pranantyo, I.; Griffin, J.; Latief, H.; Natawidjaja, D. H.; Kongko, W.; Cipta, A.; Bustaman, B.; Anugrah, S. D.; Thio, H. K.

    2014-05-01

    Probabilistic hazard assessments are a fundamental tool for assessing the threats posed by hazards to communities and are important for underpinning evidence-based decision making on risk mitigation activities. Indonesia has been the focus of intense tsunami risk mitigation efforts following the 2004 Indian Ocean Tsunami, but this has been largely concentrated on the Sunda Arc, with little attention to other tsunami-prone areas of the country such as eastern Indonesia. We present the first nationally consistent Probabilistic Tsunami Hazard Assessment (PTHA) for Indonesia. This assessment produces time-independent forecasts of tsunami hazard at the coast from tsunamis generated by local, regional and distant earthquake sources. The methodology is based on the established Monte Carlo approach to probabilistic seismic hazard assessment (PSHA) and has been adapted to tsunami. We account for sources of epistemic and aleatory uncertainty in the analysis through the use of logic trees and through sampling probability density functions. For short return periods (100 years) the highest tsunami hazard is along the west coast of Sumatra, the south coast of Java and the north coast of Papua. For longer return periods (500-2500 years), the tsunami hazard is highest along the Sunda Arc, reflecting larger maximum magnitudes along the Sunda Arc. The annual probability of experiencing a tsunami with a height at the coast of > 0.5 m is greater than 10% for Sumatra, Java, the Sunda Islands (Bali, Lombok, Flores, Sumba) and north Papua. The annual probability of experiencing a tsunami with a height of > 3.0 m, which would cause significant inundation and fatalities, is 1-10% in Sumatra, Java, Bali, Lombok and north Papua, and 0.1-1% for north Sulawesi, Seram and Flores. The results of this national-scale hazard assessment provide evidence for disaster managers to prioritise regions for risk mitigation activities and/or more detailed hazard or risk assessment.
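
    The hazard curves described here come from aggregating scenario rates: earthquake sources are sampled, a coastal tsunami height is computed for each, and the rates of all scenarios exceeding a height threshold are summed and converted to an annual probability (commonly with a Poisson assumption). The toy sketch below illustrates only that aggregation step; the scenario rates and heights are invented, and a real PTHA replaces the random heights with numerical propagation modelling and logic-tree sampling.

```python
import math
import random

def annual_exceedance_probability(scenarios, threshold_m):
    """scenarios: iterable of (annual_rate, coastal_height_m) pairs.
    Sums the rates of scenarios exceeding the threshold and converts the total
    rate to an annual Poisson probability of at least one exceedance."""
    total_rate = sum(rate for rate, height in scenarios if height > threshold_m)
    return 1.0 - math.exp(-total_rate)

# Invented scenario set: 1000 sources, each with a 1/1000-per-year rate.
random.seed(0)
scenarios = [(1e-3, random.uniform(0.1, 6.0)) for _ in range(1000)]
print(annual_exceedance_probability(scenarios, threshold_m=0.5))
print(annual_exceedance_probability(scenarios, threshold_m=3.0))
```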

  5. Study of application of ERTS-A imagery to fracture related mine safety hazards in the coal mining industry

    NASA Technical Reports Server (NTRS)

    Wier, C. E.; Wobber, F. J. (principal investigators); Russell, O. R.; Amato, R. V.

    1973-01-01

    The author has identified the following significant results. The utility of ERTS-1/high altitude aircraft imagery to detect underground mine hazards is strongly suggested. A 1:250,000 scale mined lands map of the Vincennes Quadrangle, Indiana has been prepared. This map is a prototype for a national mined lands inventory and will be distributed to State and Federal offices.

  6. Geological-geophysical techniques applied to urban planning in karst hazardous areas. Case study of Zaragoza, NE Spain

    Microsoft Academic Search

    O. Pueyo Anchuela; A. Soriano; A. Casas Sainz; A. Pocoví Juan

    2009-01-01

    Industrial and urban growth must deal, in some settings, with geological hazards. In the last 50 years, the city of Zaragoza (NE Spain) has seen its urbanized area grow at a rate several orders higher than expected from its population increase. This fast growth has affected several areas around the city that were not usually used for construction.

  7. Coal mine flooding as a cause of methane hazard. The case study of Morcinek mine, Upper Silesian Coal Basin, Poland

    Microsoft Academic Search

    Grzybek Ireneusz

    In the hydrogeologically impermeable-covered region of the USCB, where the Morcinek coal mine is situated, the isolating overburden protects the surface from the emission of gases compressed by the water rebound in the flooded coal mines. A potential gas hazard can therefore arise only in regions where shafts break through the overburden. Based on mining and geological conditions, the volume of the

  8. Evaluation of subsidence hazard in mantled karst setting: a case study from Val d'Orléans (France)

    NASA Astrophysics Data System (ADS)

    Perrin, Jérôme; Cartannaz, Charles; Noury, Gildas; Vanoudheusden, Emilie

    2015-04-01

    Soil subsidence/collapse is a major geohazard occurring in karst regions. It occurs as suffosion or dropout sinkholes developing in the soft cover. Less frequently it corresponds to the breakdown of a karst void ceiling (i.e., a collapse sinkhole). This hazard can pose significant engineering challenges. Therefore decision-makers require the elaboration of methodologies for reliable prediction of such hazards (e.g., karst subsidence susceptibility and hazard maps, early-warning monitoring systems). A methodological framework was developed to evaluate relevant conditioning factors favouring subsidence (Perrin et al., submitted) and then to combine these factors to produce karst subsidence susceptibility maps. This approach was applied to a mantled karst area south of Paris (Val d'Orléans). Results show the significant roles of the overburden lithology (presence/absence of a low-permeability layer) and of the position of the karst aquifer piezometric surface within the overburden. In parallel, an experimental site has been set up to improve the understanding of the key processes leading to subsidence/collapse; it includes piezometers for measurements of water levels and physico-chemical parameters in both the alluvial and karst aquifers, as well as surface deformation monitoring. Results should help in designing monitoring systems to anticipate the occurrence of subsidence/collapse. Perrin J., Cartannaz C., Noury G., Vanoudheusden E. 2015. A multicriteria approach to karst subsidence hazard mapping supported by Weights-of-Evidence analysis. Submitted to Engineering Geology.

  9. Mitigation of the heat island effect in urban New Jersey

    Microsoft Academic Search

    William D. Solecki; Cynthia Rosenzweig; Lily Parshall; Greg Pope; Maria Clark; Jennifer Cox; Mary Wiencke

    2005-01-01

    Implementation of urban heat island (UHI) mitigation strategies such as increased vegetative cover and higher-albedo surface materials can reduce the impacts of biophysical hazards in cities, including heat stress related to elevated temperatures, air pollution and associated public health effects. Such strategies also can lower the demand for air-conditioning-related energy production. Since local impacts of global climate change may be

  10. Use of Space Technology in Flood Mitigation (Western Province, Zambia)

    NASA Astrophysics Data System (ADS)

    Mulando, A.

    2001-05-01

    Disasters, by definition, are events that appear suddenly and with little warning. They are usually short lived, with extreme events bringing death, injury and destruction of buildings and communications. Their aftermath can be as damaging as their physical effects through destruction of sanitation and water supplies, destruction of housing, and breakdown of transport for food, temporary shelter and emergency services. Since floods are one of the natural disasters which endanger both life and property, it is vital to know their extent and where the hazards exist. Flood disasters manifest natural processes on a large scale, and information provided by remote sensing is a most appropriate input to analysis of actual events and investigation of potential risks. Analytical and qualitative image processing and interpretation of remotely sensed data, together with other data such as rainfall, population and settlements, to mention but a few, should be used to derive good mitigation strategies. Since mitigation is the cornerstone of emergency management, it becomes a sustained action that will reduce or eliminate long-term risks to people and property from natural hazards such as floods and their effects. This will certainly involve keeping homes and other sensitive structures away from flood plains. Promotion of sound land-use planning based on this known hazard, floods, is one such form of mitigation that can be applied in flood-affected areas within the flood plain. Therefore, future mitigation technologies and procedures should increasingly be based on the use of flood extent information provided by remote sensing satellites such as NOAA AVHRR, as well as information on the designated flood hazard and risk areas.

  11. Tracking World Aerosol Hazards

    NSDL National Science Digital Library

    2013-02-13

    Worldwide patterns and sources of aerosols are analyzed and evaluated for potential hazards to aircraft safety. Using aerosol index maps created from data gathered by the TOMS instrument, student groups will analyze and compare aerosol data from either eight consecutive or eight random days. Each group will graph the data, rank the hazard level of their study area and analyze the patterns and probable causes of those aerosols. Directions and materials are included for classes with computer access and for those without computer access. The URL opens to the investigation directory, with links to teacher and student materials, lesson extensions, resources, teaching tips, and assessment strategies. Note that this is the last of three investigations found in the Grades 5-8 Module 1 of Mission Geography. The Mission Geography curriculum integrates data and images from NASA missions with the National Geography Standards. Each of the three investigations in Module 1, while related, can be done independently.

  12. USGS Geologic Hazards

    NSDL National Science Digital Library

    The Geologic Hazards section of the US Geological Survey (USGS) conducts research into the causes of geological phenomena such as landslides and earthquakes. The homepage connects visitors to the Geologic Hazards team's three main areas of endeavor. Geomagnetism provides links to the National Geomagnetic Information Center; Magnetic Observatories, Models, and Charts; and the Geomagnetic Information Node, which receives geomagnetic observatory data from around the world. The Landslide group studies the "causes and mechanisms of ground failure" to prevent "long-term losses and casualties." Their section provides links to the program and information center, publications, events, and current projects. The Earthquakes department hosts a wealth of information, including neotectonics, engineering seismology, and paleoseismology. Interactive maps are also provided.

  13. Considerations for an integrated wind turbine controls capability at the National Wind Technology Center: An aileron control case study for power regulation and load mitigation

    SciTech Connect

    Stuart, J.G.; Wright, A.D.; Butterfield, C.P.

    1996-06-01

    Several structural dynamics codes have been developed at, and under contract to, the National Wind Technology Center (NWTC). These design codes capture knowledge and expertise that has accumulated over the years through federally funded research and wind industry operational experience. The codes can generate vital information required to successfully implement wind turbine active control. However, system information derived from the design codes does not necessarily produce a system description that is consistent with the one assumed by standard control design and analysis tools (e.g., MATLAB® and Matrix-X®). This paper presents a system identification-based method for extracting and utilizing high-fidelity dynamics information, derived from an existing wind turbine structural dynamics code (FAST), for use in active control design. A simple proportional-integral (PI) aileron control case study is then used to successfully demonstrate the method, and to investigate controller performance for gust and turbulence wind input conditions. Aileron control results show success in both power regulation and load mitigation.
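
    For readers unfamiliar with the control side, the case study closes a simple proportional-integral loop on the aileron command. The sketch below is a generic discrete-time PI regulator, not the FAST-derived controller from the paper; the gains, setpoint, and the one-line "turbine response" are invented for illustration.

```python
class PIController:
    """Discrete PI controller: u = Kp * e + Ki * integral(e) dt."""
    def __init__(self, kp, ki, dt):
        self.kp, self.ki, self.dt = kp, ki, dt
        self.integral = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        return self.kp * error + self.ki * self.integral

# Illustrative loop: regulate power toward a rated value (numbers are arbitrary).
pi = PIController(kp=0.8, ki=0.2, dt=0.05)
power = 540.0
for _ in range(5):
    aileron_cmd = pi.update(setpoint=600.0, measurement=power)
    power += 0.1 * aileron_cmd  # crude stand-in for the turbine response
    print(round(aileron_cmd, 2), round(power, 2))
```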

  14. Hazardous factories: Nigerian evidence.

    PubMed

    Oloyede, Olajide

    2005-06-01

    The past 15 years have seen an increasing governmental and corporate concern for the environment worldwide. For governments, information about the environmental performance of the industrial sector is required to inform macro-level decisions about environmental targets such as those required to meet UN directives. However, in many African, Asian, and Latin American countries, researching and reporting company environmental performance is limited. This article serves as a contribution to filling the gap by presenting evidence of physical and chemical risk in Nigerian factories. One hundred and three factories with a total of 5,021 workers were studied. One hundred and twenty physical and chemical hazards were identified, and the results show a high number of workers exposed to such hazards. The study also reveals that workers' awareness of chemical hazards was high. Yet the danger was perceived in behavioral terms, especially by manufacturing firms, which tend to see environmental investment in an increasingly global economy as detrimental to profitability. PMID:16022703

  15. An artificial neural network approach for landslide hazard zonation in the Bhagirathi (Ganga) Valley, Himalayas

    Microsoft Academic Search

    A. S. Das Gupta; R. P. Gupta

    2004-01-01

    Landslides are natural hazards that cause havoc to both property and life every year, especially in the Himalayas. Landslide hazard zonation (LHZ) of areas affected by landslides therefore is essential for future developmental planning and organization of various disaster mitigation programmes. The conventional Geographical Information System (GIS)-based approaches for LHZ suffer from the subjective weight rating system where weights are

  16. Assessment of outdoor radiation hazard of natural radionuclides in tourism beach areas

    Microsoft Academic Search

    Ahmad Saat; Zaini Hamzah; Hamimah Jamaluddin; Husna Mardhiah Muda

    2011-01-01

    Tourism beaches are main attractions visited by members of the public for leisure and holidays. Knowledge of radiation hazard would enable the radiation risk estimation to be made, and suggest mitigation steps if needed. Surface radiation dose, activity concentration of ²³⁸U, ²³²Th and ⁴⁰K, and radiation hazard index in 18 beaches at eastern, south western and southern

  17. Evaluation of mitigation strategies in Facility Group 1 double-shell flammable-gas tanks at the Hanford Site

    SciTech Connect

    Unal, C.; Sadasivan, P.; Kubic, W.L.; White, J.R.

    1997-11-01

    Radioactive nuclear waste at the Hanford Site is stored in underground waste storage tanks at the site. The tanks fall into two main categories: single-shell tanks (SSTs) and double-shell tanks (DSTs). There are a total of 149 SSTs and 28 DSTs. The wastes stored in the tanks are chemically complex. They basically involve various sodium salts (mainly nitrite, nitrate, carbonates, aluminates, and hydroxides), organic compounds, heavy metals, and various radionuclides, including cesium, strontium, plutonium, and uranium. The waste is known to generate flammable gas (FG) [hydrogen, ammonia, nitrous oxide, hydrocarbons] by complex chemical reactions. The process of gas generation, retention, and release is transient. Some tanks reach a quasi-steady stage where gas generation is balanced by the release rate. Other tanks show continuous cycles of retention followed by episodic release. There currently are 25 tanks on the Flammable Gas Watch List (FGWL). The objective of this report is to evaluate possible mitigation strategies to eliminate the FG hazard. The evaluation is an engineering study of mitigation concepts for FG generation, retention, and release behavior in Tanks SY-101, AN-103, AN-104, AN-105, and AW-101. Where possible, limited quantification of the effects of mitigation strategies on the FG hazard is also considered. The results obtained from quantification efforts discussed in this report should be considered best-estimate values. Results and conclusions of this work are intended to help in establishing methodologies in the contractor's controls selection analysis to develop necessary safety controls for closing the FG unreviewed safety question. The general performance requirements of any mitigation scheme are discussed first.

  18. Climate Change & Mitigation Options

    NSDL National Science Digital Library

    Nummedal, Dag

    The Advanced Technology Environmental and Energy Center (ATEEC) provides this presentation from Dag Nummedal of the Colorado Energy Research Institute on climate change and mitigation options. This presentation is intended for users with a background knowledge on the topic and includes graphical representations of important data. The document may be downloaded in PDF file format. Users must download this resource for viewing, which requires a free log-in. There is no cost to download the item.

  19. Use of solid waste for chemical stabilization: Adsorption isotherms and ¹³C solid-state NMR study of hazardous organic compounds sorbed on coal fly ash

    SciTech Connect

    Netzel, D.A.; Lane, D.C.; Rovani, J.F.; Cox, J.D.; Clark, J.A.; Miknis, F.P.

    1993-09-01

    Adsorption of hazardous organic compounds on the Dave Johnston plant fly ash is described. Fly ash from the Dave Johnston and Laramie River power plants was characterized using elemental, x-ray, and ²⁹Si NMR analyses; the Dave Johnston (DJ) fly ash had higher quartz contents, while the Laramie River fly ash had more monomeric silicate anions. Adsorption data for hydroaromatics and chlorobenzenes indicate that the adsorption capacity of DJ coal fly ash is much less than that of activated carbon, by a factor of >3000, but confirmation is needed that solid-gas and solid-liquid equilibrium isotherms can indeed be compared. However, for pyridine, pentachlorophenol, naphthalene, and 1,1,2,2-tetrachloroethane, the DJ fly ash appears to adsorb these compounds nearly as well as activated carbon. ¹³C NMR was used to study the adsorption of hazardous organic compounds on coal fly ash; the nuclear spin relaxation times often were very long, resulting in long experimental times to obtain a spectrum. Using a jumbo probe, low concentrations of some hazardous organic compounds could be detected; for pentachlorophenol adsorbed onto fly ash, the chemical shift of the phenolic carbon was changed. The use of NMR to study the adsorption needs further study.
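
    Adsorption capacities of the kind compared here are usually summarized by fitting an isotherm model to equilibrium data. The sketch below fits a Langmuir isotherm with SciPy; the model choice and the concentration/loading arrays are illustrative assumptions, not data from the report.

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(c_eq, q_max, k):
    """Langmuir isotherm: q = q_max * K * C / (1 + K * C)."""
    return q_max * k * c_eq / (1.0 + k * c_eq)

# Invented equilibrium concentrations (mg/L) and sorbed amounts (mg/g).
c_eq = np.array([1.0, 5.0, 10.0, 25.0, 50.0, 100.0])
q_obs = np.array([0.8, 3.1, 5.0, 7.9, 9.4, 10.3])

(q_max, k), _ = curve_fit(langmuir, c_eq, q_obs, p0=[10.0, 0.1])
print(f"q_max = {q_max:.2f} mg/g, K = {k:.3f} L/mg")
```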

  20. The use of hazards analysis in the development of training

    SciTech Connect

    Houghton, F.K.

    1998-12-01

    A hazards analysis identifies the operational hazards and the positive measures that aid in the mitigation or prevention of a hazard. If the tasks are human-intensive, the hazard analysis often credits personnel training as contributing to the mitigation of an accident's consequences or the prevention of an accident sequence. To be able to credit worker training, it is important to understand the role of the training in the hazard analysis. Systematic training, known as systematic training design (STD), performance-based training (PBT), or instructional system design (ISD), uses a five-phase (analysis, design, development, implementation, and evaluation) model for the development and implementation of the training. Both a hazards analysis and a training program begin with a task analysis that documents the roles and actions of the workers. Though the task analyses are different in nature, there is common ground, and both the hazard analysis and the training program can benefit from a cooperative effort. However, the cooperation should not end with the task analysis phase of either program. The information gained from the hazards analysis should be used in all five phases of the training development. The training evaluation, both of the individual worker and of the institutional training program, can provide valuable information to the hazards analysis effort. This paper will discuss the integration of information from the hazards analysis into a training program. The paper will use the installation and removal of a piece of tooling that is used in a high-explosive operation. This example will be used to follow the systematic development of a training program and demonstrate the interaction and cooperation between the hazards analysis and the training program.

  1. Planning Tools For Seismic Risk Mitigation. Rules And Applications

    SciTech Connect

    De Paoli, Rosa Grazia [Department of Landscape Planning, Mediterranean University of Reggio Calabria (Italy)]

    2008-07-08

    Recently, Italian urban planning research in the field of seismic risk mitigation has been undergoing renewal. In particular, it promotes strategies that integrate urban rehabilitation with aseismic objectives, as well as policies directed at revitalizing urban systems by coupling physical renewal and socio-economic development. In Italy the first law concerning planning for seismic mitigation dates back to 1974, law n. 64, 'Regulation for buildings with particular rules for the seismic areas', whose rules for buildings in seismic areas also concerned local hazard. This law, in fact, forced municipalities to obtain, during the formation of their plans, a preventive opinion on the compatibility between planning conditions and the geomorphological conditions of the territory. Since then, the conviction that seismic risk must be considered within territorial planning, especially in terms of mitigation strategies, has been strengthened. Town planners began to take an interest in seismic risk in the 1980s, when the Irpinia earthquake took place. The research carried out after this earthquake established that a principal cause of building collapse was the poor siting of urban settlements (on slopes or crowns). After the Irpinia earthquake, the first studies on seismic risk mitigation were carried out, in particular on aspects related to hazards and urban vulnerability.

  2. ORIGINAL PAPER PGA distributions and seismic hazard evaluations

    E-print Network

    Wu, Yih-Min

    Fragmentary abstract (extraction residue); recoverable content: the paper concerns PGA distributions and seismic hazard evaluations, in the hope of developing local earthquake-resistant designs and mitigating potential earthquake hazards, including earthquake-resistant designs of critical structures (U.S. NRC 2007; Geller et al. 1997).

  3. Digging Our Own Holes: Institutional Perspectives on Seismic Hazards

    Microsoft Academic Search

    S. Stein; J. Tomasello

    2005-01-01

    It has been observed that there are no true students of the earth; instead, we each dig our own holes and sit in them. A similar situation arises in attempts to assess the hazards of earthquakes and other natural disasters and to develop strategies to mitigate them. Ideally, we would like to look at the interests of society as a

  4. 24 CFR 51.204 - HUD-assisted hazardous facilities.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ...facilities. 51.204 Section 51.204 Housing and Urban Development...Explosive or Flammable Nature § 51.204 HUD-assisted hazardous...and from any other facility or area where people may congregate or...mitigating measures listed in § 51.205 may be taken into...

  5. 24 CFR 51.204 - HUD-assisted hazardous facilities.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ...facilities. 51.204 Section 51.204 Housing and Urban Development...Explosive or Flammable Nature § 51.204 HUD-assisted hazardous...and from any other facility or area where people may congregate or...mitigating measures listed in § 51.205 may be taken into...

  6. 24 CFR 51.204 - HUD-assisted hazardous facilities.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ...facilities. 51.204 Section 51.204 Housing and Urban Development...Explosive or Flammable Nature § 51.204 HUD-assisted hazardous...and from any other facility or area where people may congregate or...mitigating measures listed in § 51.205 may be taken into...

  7. 24 CFR 51.204 - HUD-assisted hazardous facilities.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ...facilities. 51.204 Section 51.204 Housing and Urban Development...Explosive or Flammable Nature § 51.204 HUD-assisted hazardous...and from any other facility or area where people may congregate or...mitigating measures listed in § 51.205 may be taken into...

  8. 24 CFR 51.204 - HUD-assisted hazardous facilities.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ...facilities. 51.204 Section 51.204 Housing and Urban Development...Explosive or Flammable Nature § 51.204 HUD-assisted hazardous...and from any other facility or area where people may congregate or...mitigating measures listed in § 51.205 may be taken into...

  9. New Methods of Energy Efficient Radon Mitigation

    SciTech Connect

    Fisk, W.J.; Prill, R.J.; Wooley, J.; Bonnefous, Y.C.; Gadgil, A.J.; Riley, W.J.

    1994-05-01

    Two new radon mitigation techniques are introduced and their evaluation in a field study complemented by numerical model predictions is described. Based on numerical predictions, installation of a sub gravel membrane at the study site resulted in a factor of two reduction in indoor radon concentrations. Experimental data indicated that installation of 'short-circuit' pipes extending between the subslab gravel and outdoors, caused an additional factor of two decrease in the radon concentration. Consequently, the combination of these two passive radon mitigation features, called the membrane and short-circuit (MASC) technique, was associated with a factor of four reduction in indoor radon concentration. The energy-efficient active radon mitigation method, called efficient active subslab pressurization (EASP), required only 20% of the fan energy of conventional active subslab depressurization and reduced the indoor radon concentration by approximately a factor of 15, including the numerically-predicted impact of the sub-gravel membrane.

  10. The discriminatory cost of ICD-10-CM transition between clinical specialties: metrics, case study, and mitigating tools

    PubMed Central

    Boyd, Andrew D; Li, Jianrong ‘John’; Burton, Mike D; Jonen, Michael; Gardeux, Vincent; Achour, Ikbel; Luo, Roger Q; Zenku, Ilir; Bahroos, Neil; Brown, Stephen B; Vanden Hoek, Terry; Lussier, Yves A

    2013-01-01

    Objective Applying the science of networks to quantify the discriminatory impact of the ICD-9-CM to ICD-10-CM transition between clinical specialties. Materials and Methods Datasets were the Center for Medicaid and Medicare Services ICD-9-CM to ICD-10-CM mapping files, general equivalence mappings, and statewide Medicaid emergency department billing. Diagnoses were represented as nodes and their mappings as directional relationships. The complex network was synthesized as an aggregate of simpler motifs and tabulation per clinical specialty. Results We identified five mapping motif categories: identity, class-to-subclass, subclass-to-class, convoluted, and no mapping. Convoluted mappings indicate that multiple ICD-9-CM and ICD-10-CM codes share complex, entangled, and non-reciprocal mappings. The proportions of convoluted diagnoses mappings (36% overall) range from 5% (hematology) to 60% (obstetrics and injuries). In a case study of 24,008 patient visits in 217 emergency departments, 27% of the costs are associated with convoluted diagnoses, with 'abdominal pain' and 'gastroenteritis' accounting for approximately 3.5%. Discussion Previous qualitative studies report that administrators and clinicians are likely to be challenged in understanding and managing their practice because of the ICD-10-CM transition. We substantiate the complexity of this transition with a thorough quantitative summary per clinical specialty, a case study, and the tools to apply this methodology easily to any clinical practice in the form of a web portal and analytic tables. Conclusions Post-transition, successful management of frequent diseases with convoluted mapping network patterns is critical. The http://lussierlab.org/transition-to-ICD10CM web portal provides insight in linking onerous diseases to the ICD-10 transition. PMID:23645552
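
    The motif categories follow from the shape of the directed mapping graph around each code. The sketch below applies a simplified, local-degree rule to forward (ICD-9 to ICD-10) mappings; the paper's own classification operates on whole mapping components and also handles the "no mapping" case, and the sample codes are placeholders rather than real GEM entries.

```python
from collections import defaultdict

def classify_mappings(pairs):
    """Classify each ICD-9 code by the shape of its mapping neighborhood.
    pairs: iterable of (icd9, icd10) forward mappings. Simplified motif logic."""
    fwd, back = defaultdict(set), defaultdict(set)
    for icd9, icd10 in pairs:
        fwd[icd9].add(icd10)
        back[icd10].add(icd9)

    labels = {}
    for icd9, targets in fwd.items():
        sources = set().union(*(back[t] for t in targets))
        if len(targets) == 1 and sources == {icd9}:
            labels[icd9] = "identity"
        elif len(targets) > 1 and sources == {icd9}:
            labels[icd9] = "class-to-subclass"
        elif len(targets) == 1:
            labels[icd9] = "subclass-to-class"
        else:
            labels[icd9] = "convoluted"
    return labels

# Placeholder mappings, not real GEM entries.
pairs = [("A1", "X1"), ("A2", "X2"), ("A2", "X3"),
         ("B1", "Y1"), ("B2", "Y1"),
         ("C1", "Z1"), ("C1", "Z2"), ("C2", "Z1")]
print(classify_mappings(pairs))
```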

  11. Revealing the Vulnerability of People and Places: A Case Study of Georgetown County, South Carolina

    Microsoft Academic Search

    Susan L. Cutter; Jerry T. Mitchell; Michael S. Scott

    2000-01-01

    Losses from environmental hazards have escalated in the past decade, prompting a reorientation of emergency management systems away from simple postevent response. There is a noticeable change in policy, with more emphasis on loss reduction through mitigation, preparedness, and recovery programs. Effective mitigation of losses from hazards requires hazard identification, an assessment of all the hazards likely to affect a

  12. Surface wave mitigation using photonic crystals

    Microsoft Academic Search

    Philip Keith Kelly

    2000-01-01

    This dissertation studies the mitigation of radio frequency surface wave propagation supported by grounded dielectric substrates using photonic crystal substrates. The studies are done primarily using the FDTD computational electromagnetic technique. The photonic crystal substrate is formed by removing cylinders of dielectric material, without removing the ground plane, in a two-dimensional periodic arrangement. It is shown that by making the

  13. An Innovative Solution to NASA's NEO Impact Threat Mitigation Grand Challenge and Flight Validation Mission Architecture Development

    NASA Technical Reports Server (NTRS)

    Wie, Bong; Barbee, Brent W.

    2015-01-01

    This paper presents the results of a NASA Innovative Advanced Concept (NIAC) Phase 2 study entitled "An Innovative Solution to NASA's Near-Earth Object (NEO) Impact Threat Mitigation Grand Challenge and Flight Validation Mission Architecture Development." This NIAC Phase 2 study was conducted at the Asteroid Deflection Research Center (ADRC) of Iowa State University in 2012-2014. The study objective was to develop an innovative yet practically implementable mitigation strategy for the most probable impact threat of an asteroid or comet with short warning time (< 5 years). The mitigation strategy described in this paper is intended to optimally reduce the severity and catastrophic damage of the NEO impact event, especially when we don't have sufficient warning times for non-disruptive deflection of a hazardous NEO. This paper provides an executive summary of the NIAC Phase 2 study results. Detailed technical descriptions of the study results are provided in a separate final technical report, which can be downloaded from the ADRC website (www.adrc.iastate.edu).

  14. Hydrochemistry of Arsenic-Enriched Aquifer from Rural West Bengal, India: A Study of the Arsenic Exposure and Mitigation Option

    Microsoft Academic Search

    Bibhash Nath; Sudip J. Sahu; Joydev Jana; Aishwarya Mukherjee-Goswami; Sharmi Roy; Madhav J. Sarkar; Debashis Chatterjee

    2008-01-01

    The present study aims to understand the hydrochemistry vis-à-vis As-exposure from drinking groundwater in rural Bengal. The characteristic features of the groundwaters are low Eh (range, −151 to −37 mV; mean, −68 mV) and nitrate (range, 0.01–1.7 mg/l; mean, 0.14 mg/l) followed by high alkalinity (range, 100–630 mg/l; mean, 301 mg/l), Fe (range, 0.99–38 mg/l; mean, 8.1 mg/l), phosphate (range, 0.01–15 mg/l; mean, 0.54 mg/l), hardness (range, 46–600 mg/l; mean, 245 mg/l)

  15. Use of geophysical methods in man-made hazard management strategies. Case study from Ploiesti city, Romania

    NASA Astrophysics Data System (ADS)

    Chitea, F.; Anghelache, M. A.; Ioane, D.

    2010-05-01

    Identification of damage or changes affecting underground water quality due to anthropogenic activities is often done only after environmental problems have become evident and water potability has been strongly affected. In this paper we discuss the necessity of implementing non-invasive and non-destructive investigation tools in different parts of the management plan for urban areas affected, or at high risk of being affected, by man-made hazards. Geophysical investigations are nowadays a useful tool for environmental problems affecting soil and underground water in urban areas, as useful information can be obtained regarding the following aspects: detection of affected areas, especially when the effect or the hazard sources are not visible at the surface; zonation of the area (severely affected or less affected zones); investigation of the area (details on the affected surface and affected soil depth); location of "hidden" sources (illegal waste dump sites, petroleum transport or transfer pipes, etc.); estimation of soil and underground damage by monitoring petrophysical markers; risk evaluation (estimates of the direction and speed at which the environmental problems develop, and of amplifying negative effects); recovery of an area from a man-made hazard (monitoring can give information about the natural attenuation of the environmental problems or the efficacy of a resilience program); and preparedness for man-made hazards (prediction). The applicability of geophysical methods for identifying and characterizing the effects of anthropogenic hazards on soil and underground water quality has been tested in Ploiesti city, Romania. In this urban area, as well as in surrounding villages, water potability is severely affected by oil-product contamination caused by the refinery facilities developed in the area. Oil contamination is a major environmental problem, because the affected area is continuously expanding as a consequence of contaminant transport by the underground water. Hydrogeologically, the research area is located in the alluvium of one of the main hydrostructures of Romania, which holds important water resources. Preliminary investigations made in the Ploiesti city area have shown the high vulnerability of the aquifer to pollution, and a highly contaminated area was detected. Through detailed geophysical investigations in the test zone, it was possible to detect the presence of the particular types of pollutants, and a map with area zonation has been produced. The results of geophysical investigations in environmental strategies concerning underground water pollution should be added to those obtained by direct investigations for risk evaluation and remediation strategies in cases of man-made hazards. Acknowledgements: The research was performed with financial support from MENER (project nr. 725/2006) and CNCSIS-UEFISCU (project nr. 244/2007).

  16. Study Of The Risks Arising From Natural Disasters And Hazards On Urban And Intercity Motorways By Using Failure Mode Effect Analysis (FMEA) Methods

    NASA Astrophysics Data System (ADS)

    DEL?CE, Yavuz

    2015-04-01

    Highways located within cities and between cities are generally prone to many kinds of natural disaster risks. Natural hazards and disasters that may occur, first during highway design, construction and operation, and later during highway maintenance and repair, have to be taken into consideration. Assessing the risks posed by adverse situations is very important in terms of project design, construction, operation, maintenance and repair costs. Hazard and natural disaster risk analysis depends largely on defining the likelihood of the probable hazards on the highways. However, the assets at risk and the impacts of the events must also be examined and rated in their own right. Through these activities, improvements against natural hazards and disasters will be made using the Failure Mode Effects Analysis (FMEA) method, and their effects will be analyzed in further work. FMEA is a useful method to identify failure modes and effects, to prioritize them according to failure rates and effects, and to find the most economic and effective solution. Besides supporting measures against the identified risks, this analysis method may also provide information for public institutions about the nature of these risks when required. Thus, the necessary measures will have been taken in advance on urban and intercity highways. Many hazards and natural disasters are taken into account in the risk assessments. The most important of these dangers can be listed as follows: natural disasters, comprising (1) meteorologically based natural disasters (floods, severe storms, tropical storms, winter storms, avalanches, etc.) and (2) geologically based natural disasters (earthquakes, tsunamis, landslides, subsidence, sinkholes, etc.); and human-originated disasters, such as transport accidents (traffic accidents), hazards originating from road surface defects (icing, signaling-related malfunctions and risks), fire or explosion, etc. In this study, risk analysis of urban and intercity motorways against natural disasters and hazards has been performed with the FMEA method, and solutions to these risks are proposed. Keywords: Failure Modes Effects Analysis (FMEA), Pareto Analyses (PA), Highways, Risk Management.
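
    FMEA conventionally ranks each failure mode by a risk priority number, the product of severity, occurrence, and detection ratings, and directs mitigation at the highest-ranked modes. The sketch below shows that conventional ranking; the highway failure modes and ratings are invented for illustration, and the study may rate or weight criteria differently.

```python
def risk_priority_number(severity, occurrence, detection):
    """Conventional FMEA ranking: RPN = severity * occurrence * detection (each rated 1-10)."""
    return severity * occurrence * detection

# Invented highway failure modes and ratings (severity, occurrence, detection).
failure_modes = [
    ("Road icing on bridge deck",  7, 6, 4),
    ("Landslide onto carriageway", 9, 3, 5),
    ("Flooding of underpass",      8, 4, 3),
]
ranked = sorted(failure_modes, key=lambda m: risk_priority_number(*m[1:]), reverse=True)
for name, s, o, d in ranked:
    print(f"{name}: RPN = {risk_priority_number(s, o, d)}")
```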

  17. Probabilistic analysis of tsunami hazards

    USGS Publications Warehouse

    Geist, E.L.; Parsons, T.

    2006-01-01

    Determining the likelihood of a disaster is a key component of any comprehensive hazard assessment. This is particularly true for tsunamis, even though most tsunami hazard assessments have in the past relied on scenario or deterministic type models. We discuss probabilistic tsunami hazard analysis (PTHA) from the standpoint of integrating computational methods with empirical analysis of past tsunami runup. PTHA is derived from probabilistic seismic hazard analysis (PSHA), with the main difference being that PTHA must account for far-field sources. The computational methods rely on numerical tsunami propagation models rather than empirical attenuation relationships as in PSHA in determining ground motions. Because a number of source parameters affect local tsunami runup height, PTHA can become complex and computationally intensive. Empirical analysis can function in one of two ways, depending on the length and completeness of the tsunami catalog. For site-specific studies where there is sufficient tsunami runup data available, hazard curves can primarily be derived from empirical analysis, with computational methods used to highlight deficiencies in the tsunami catalog. For region-wide analyses and sites where there are little to no tsunami data, a computationally based method such as Monte Carlo simulation is the primary method to establish tsunami hazards. Two case studies that describe how computational and empirical methods can be integrated are presented for Acapulco, Mexico (site-specific) and the U.S. Pacific Northwest coastline (region-wide analysis).
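
    On the empirical side described here, a sufficiently long runup catalog lets the hazard curve be built directly from exceedance counts, with the rate converted to a probability over an exposure window (a Poisson assumption is common). The sketch below shows that calculation on an invented catalog; it is not the Acapulco or Pacific Northwest analysis.

```python
import math

def exceedance_probability(runups_m, catalog_years, threshold_m, exposure_years):
    """Empirical exceedance rate from a runup catalog, converted to a Poisson
    probability of at least one exceedance during the exposure window."""
    rate = sum(1 for r in runups_m if r >= threshold_m) / catalog_years
    return 1.0 - math.exp(-rate * exposure_years)

# Invented catalog: runup heights (m) observed over a 200-year record.
runups = [0.3, 0.4, 0.6, 0.9, 1.2, 2.5, 4.1]
print(exceedance_probability(runups, catalog_years=200, threshold_m=1.0, exposure_years=50))
```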

  18. Integrated Risk Assessment to Natural Hazards in Motozintla, Chiapas, Mexico

    NASA Astrophysics Data System (ADS)

    Novelo-Casanova, D. A.

    2012-12-01

    An integrated risk assessment includes the analysis of all components of individual constituents of risk, such as a baseline study, hazard identification and categorization, hazard exposure, and vulnerability. Vulnerability refers to the inability of people, organizations, and societies to withstand adverse impacts from the multiple stressors to which they are exposed. These impacts are due to characteristics inherent in social interactions, institutions, and systems of cultural values. Thus, social vulnerability is a pre-existing condition that affects a society's ability to prepare for and recover from a disruptive event. Risk is the probability of a loss, and this loss depends on three elements: hazard, exposure, and vulnerability. Thus, risk is the estimated impact that a hazard event would have on people, services, facilities, structures and assets in a community. In this work we assess the risk to natural hazards in the community of Motozintla, located in southern Mexico in the state of Chiapas (15.37N, 92.25W), with a population of about 20,000 inhabitants. Due to its geographical and geological location, this community is continuously exposed to many different natural hazards (earthquakes, landslides, volcanic eruptions, and floods). To determine the level of exposure of the community to natural hazards, we developed integrated studies and analyses of seismic microzonation, landslide and flood susceptibility, and volcanic impact using standard methodologies. Social vulnerability was quantified from data obtained from interviews with local families. Five variables were considered: household structure quality and design, availability of basic public services, family economic conditions, existing family plans for disaster preparedness, and risk perception. The number of families surveyed was determined so that the sample was statistically significant. The families interviewed were selected using the simple random sampling technique with replacement. With this procedure, each household was chosen randomly and entirely by chance, with the same probability of being chosen at any stage during the sampling process. To facilitate interpretation, all results were spatially analyzed using a Geographical Information System (GIS). Our results indicate that the community of Motozintla is highly exposed to floods, landslides and earthquakes and, to a lesser extent, to the impact of a volcanic eruption. The locality has a high level of structural vulnerability to the main identified hazards (floods and landslides). About 70% of the families have a daily income below 11 USD. Approximately 66% of the population does not know of any existing Civil Protection Plan. Another major observation is that community organization for disaster prevention is practically nonexistent. These natural and social conditions indicate that the community of Motozintla has a very high level of risk to natural hazards. This research will support decision makers in Mexico, and particularly in the state of Chiapas, in the development of an integrated, comprehensive natural hazards mitigation and prevention program in this region.
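
    The definition used here, risk as the combination of hazard, exposure, and vulnerability, lends itself to a simple composite index once each component is scored. The sketch below assumes a multiplicative form with normalized 0-1 scores; the formulation and the household records are illustrative, not the survey results from Motozintla.

```python
def risk_index(hazard, exposure, vulnerability):
    """Composite risk as the product of normalized hazard, exposure and
    vulnerability scores in [0, 1] (an assumed, commonly used formulation)."""
    return hazard * exposure * vulnerability

# Illustrative household records: (flood hazard, exposure, social vulnerability).
households = {"H-001": (0.8, 0.9, 0.7), "H-002": (0.4, 0.5, 0.6)}
for hid, scores in households.items():
    print(hid, round(risk_index(*scores), 2))
```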

  19. Hazardous materials in Fresh Kills landfill

    SciTech Connect

    Hirschhorn, J.S. [Hirschhorn and Associates, Wheaton, MD (United States)

    1997-12-31

    No environmental monitoring and corrective action program can pinpoint the multiple locations of hazardous materials, or the total amount of them, in a large landfill. Yet the consequences of hazardous materials in MSW landfills are considerable, in terms of public health concerns, environmental damage, and cleanup costs. In this paper a rough estimate is made of how much hazardous material may have been disposed of in Fresh Kills landfill in Staten Island, New York. The logic and methods could be used for other MSW landfills. Fresh Kills has frequently been described as the world's largest MSW landfill. While records of hazardous waste disposal at Fresh Kills over nearly 50 years of operation certainly do not exist, no reasonable person would argue with the conclusion that large quantities of hazardous waste surely have been disposed of at Fresh Kills, both legally and illegally. This study found that at least 2 million tons of hazardous wastes and substances have been disposed of at Fresh Kills since 1948. Major sources are household hazardous waste, commercial RCRA hazardous waste, incinerator ash, commercial non-RCRA hazardous waste, and governmental RCRA hazardous waste. Illegal disposal of hazardous waste surely has contributed even more. This is a sufficient amount to cause serious environmental contamination and releases, especially from a landfill without an engineered liner system. This figure is roughly 1% of the total amount of waste disposed of in Fresh Kills since 1948, probably at least 200 million tons.

  20. Geoethical issues in long-term assessment of geohazards and related mitigation policies

    NASA Astrophysics Data System (ADS)

    Tinti, Stefano; Armigliato, Alberto

    2015-04-01

    Long-term assessment of large-impact and relatively (or very) infrequent geohazards such as earthquakes, tsunamis and volcanic eruptions is nowadays a common practice for geoscientists, and many groups have been and are involved in producing global and regional hazard maps in response to an increasing demand from society. Though societal needs are the basic motivation for such studies, this aspect is often not pondered enough, and a lack of communication between geoscientists and society might be a serious limit to the effective exploitation of hazard assessment products and to the development of adequate mitigation policies. This paper is an analysis of the role of geoscientists in the production of long-term assessments of dangerous natural phenomena (such as mapping of seismic, tsunami and volcanic hazards), with special emphasis on their role as communicators and disseminators (with respect to the general public, to authorities, to restricted specialized audiences…), but also as providers of active support to the planners who should be given key elements for making decisions. Geoethics requires geoscientists to take clear and full responsibility for the products resulting from their assessments, but also to be aware that these products are valuable insofar as they are scientifically sound, known, understandable, and usable by a wide universe of users.

  1. The Combined Effect of Nursing Support and Adverse Event Mitigation on Adherence to Interferon Beta-1b Therapy in Early Multiple Sclerosis: The START Study.

    PubMed

    Dhib-Jalbut, Suhayl; Markowitz, Clyde; Patel, Payal; Boateng, Francis; Rametta, Mark

    2012-01-01

    There is limited clinical evidence on the impact of nurse support and adverse event (AE) mitigation techniques on adherence to interferon beta-1b (IFNβ-1b) therapy in multiple sclerosis (MS) in a real-world setting. The aim of the Success of Titration, analgesics, and BETA nurse support on Acceptance Rates in MS Treatment (START) trial was to assess the combined effect of titration, analgesics, and BETA (Betaseron Education, Training, Assistance) nurse support on adherence to IFNβ-1b therapy in patients with early-onset MS and to evaluate safety. Participants were instructed to titrate IFNβ-1b and use analgesics to minimize flu-like symptoms. All received BETA nurse follow-up at frequent intervals: live training, two telephone calls during the first month of therapy, and monthly calls thereafter. Participants were considered adherent if they took at least 75% of the total prescribed doses over 12 months (≥75% compliance). Safety was monitored via reported AEs and laboratory test results. Participants who took at least one IFNβ-1b dose over 12 months were analyzed (N = 104); 73.8% of participants completed the study. The mean age of participants was 37.2 years; 72.1% were women and 78.8% were white. Ninety participants had relapsing-remitting MS and 14 had clinically isolated syndrome. The mean compliance rate, reported for 96 participants with complete dose interruption records, was 84.4%. At 12 months, 78.1% of participants were considered adherent. The serious adverse event rate was 9.6%; most events were unrelated to therapy. Thus in the START study, in which participants received nursing support combined with dose titration and use of analgesics, the majority of participants were adherent to therapy. PMID:24453752

  2. Analyzing environmental and structural charactersitics of concrete for carbon mitigation and climate adaptation in urban areas: A case study in Rajkot, India

    NASA Astrophysics Data System (ADS)

    Solis, Andrea Valdez

    Increasing temperatures, varying rain events accompanied by flooding or droughts, increasing water demands, and decreasing air quality are just some examples of stresses that urban systems face with the onset of climate change and rapid urbanization. The literature suggests that greenhouse gases are a leading cause of climate change and result from anthropogenic activities such as infrastructure development. Infrastructure development is heavily dependent on the production of concrete. Yet concrete can contribute up to 7% of total CO2 emissions globally from cement manufacturing alone. The goal of this dissertation was to evaluate current concrete technologies that could contribute to carbon mitigation and climate adaptation in cities. The objectives used to reach the goal of the study included (1) applying a material flow and life cycle analysis (MFA-LCA) to determine the environmental impacts of pervious and high volume fly ash (HVFA) concrete compared to ordinary portland cement (OPC) concrete in a developing country; (2) performing a comparative assessment of pervious concrete mixture designs for structural and environmental benefits across the U.S. and India; and (3) determining structural and durability benefits from HVFA concrete mixtures when subjected to extreme hot weather conditions (a likely element of climate change). The study revealed that cities have a choice in reducing emissions, improving stormwater issues, and developing infrastructure that can sustain higher temperatures. Pervious and HVFA concrete mixtures reduce emissions by 21% and 47%, respectively, compared to OPC mixtures. A pervious concrete demonstration in Rajkot, India showed improvements in water quality (i.e. lower levels of nitrogen by as much as 68% from initial readings), and a reduction in material costs by 25%. HVFA and OPC concrete mixtures maintained compressive strengths above a design strength of 27.6 MPa (4000 psi), achieved low to moderate permeabilities (1000 to 4000 coulombs), and prevented changes in length that could be detrimental to the performance of the concrete in long-term temperatures above 37.8°C (100°F).

  3. Defining 3D seismogenic faults to improve the probabilistic seismic hazard model: a case study from central Apennines (Italy)

    NASA Astrophysics Data System (ADS)

    Boncio, P.; Pace, B.; Lavecchia, G.

    2003-04-01

    Italy has a long history of large and moderate earthquakes. Contrary to other areas in the world, the deformation velocity across the Apennines is relatively low, large earthquakes may have long recurrence intervals (>=1000 years) and historical records may not be representative of the whole seismic activity. Several case studies clearly showed that earthquakes occur on existing faults having structural, geomorphic and paleoseismic features which can be recognised and quantified by geologic investigations. This strongly suggests that faults may greatly improve Seismic Hazard Assessment (SHA). Geology-based, time-dependent SHA methods have been under development, in Italy as well. These new methods require information on the geometric, kinematic and energetic parameters of the major seismogenic faults. Using surface data alone (faults from geological mapping, i.e. 2D linear features) may create some problems in SHA. Linear faults are far from representing either the real radiating-energy sources or the likely epicentres of the expected earthquakes. It is well known that seismic moment is strictly dependent on the fault area. It follows that fault area should be the geological parameter used for SHA. Therefore, the third dimension is essential. In this work we define a model of seismogenic sources, suitable for SHA purposes, for the central Apennines of Italy. Our approach is mainly structural-seismotectonic. We integrate surface geology data with seismological and subsurface structural data (3D approach). A fundamental step is the definition of the seismogenic layer thickness. We used well-located background seismicity as well as the depth distribution of aftershock zones. We also compared the instrumental seismicity with the strength and behaviour (frictional vs. plastic) of the crust by rheological profiling. In most cases the hypocentral distributions and the rheological profiles are in good agreement, suggesting that rheological profiling may be a powerful tool for estimating the brittle layer thickness where detailed seismological data are lacking. Once the 3D fault features and a segmentation model have been defined, the next step is the computation of the maximum magnitude and the mean recurrence time of the expected earthquake (essential for SHA). We compared three estimates of the energetic parameters: historical (association of faults with historical earthquakes), geometrical (from fault geometry and kinematics) and geometrical revised (the geometrical approach corrected, when necessary, by earthquake scaling laws).
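
    The "geometrical" estimate of maximum magnitude mentioned here follows from the fault area, which in turn follows from mapped fault length, the seismogenic (brittle) layer thickness, and the fault dip. The sketch below uses the widely cited Wells and Coppersmith (1994) all-slip-type rupture-area regression; whether the authors used this particular relation is an assumption, and the fault parameters are invented.

```python
import math

def fault_area_km2(length_km, seismogenic_thickness_km, dip_deg):
    """Down-dip width from the brittle-layer thickness and fault dip, times length."""
    width_km = seismogenic_thickness_km / math.sin(math.radians(dip_deg))
    return length_km * width_km

def magnitude_from_area(area_km2):
    """Wells & Coppersmith (1994), all slip types: M = 4.07 + 0.98 * log10(A)."""
    return 4.07 + 0.98 * math.log10(area_km2)

# Illustrative normal fault: 20 km long, 12 km brittle layer, 50-degree dip.
area = fault_area_km2(20.0, 12.0, 50.0)
print(f"{area:.1f} km^2 -> Mmax ~ {magnitude_from_area(area):.2f}")
```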

  4. The Impact of Changes in Municipal Solid Waste Disposal Laws on Proximity to Environmental Hazards: A Case Study of Connecticut

    Microsoft Academic Search

    Ellen K. Cromley

    Environmental policies affect proximity to environmental hazards. In the late 1980s, the State of Connecticut implemented mandatory recycling laws to improve management of municipal solid waste. At that time, more than 80% of the State's 169 towns disposed of trash within their own borders. The regulatory change redirected flows of waste to transfer stations and trash-to-energy plants. To assess changes

  5. The Handling of Hazard Data on a National Scale: A Case Study from the British Geological Survey

    NASA Astrophysics Data System (ADS)

    Royse, Katherine R.

    2011-11-01

    This paper reviews how hazard data and geological map data have been combined by the British Geological Survey (BGS) to produce a set of GIS-based national-scale hazard susceptibility maps for the UK. This work has been carried out over the last 9 years and as such reflects the combined outputs of a large number of researchers at BGS. The paper details the inception of these datasets from the development of the seamless digital geological map in 2001 through to the deterministic 2D hazard models produced today. These datasets currently include landslides, shrink-swell, soluble rocks, compressible and collapsible deposits, groundwater flooding, geological indicators of flooding, radon potential and potentially harmful elements in soil. These models have been created using a combination of expert knowledge (from both within BGS and from outside bodies such as the Health Protection Agency), national databases (which contain data collected over the past 175 years), multi-criteria analysis within geographical information systems and a flexible rule-based approach for each individual geohazard. By using GIS in this way, it has been possible to model the distribution and degree of geohazards across the whole of Britain.
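
    The BGS workflow described here combines rule-based expert criteria with multi-criteria analysis inside a GIS. The fragment below is a toy sketch of how a rule-based susceptibility class might be assembled for one geohazard layer; the attribute names, classes and weights are hypothetical and are not taken from the BGS datasets.

    ```python
    # Toy sketch of a rule-based, multi-criteria susceptibility score for a single
    # geohazard (e.g. shrink-swell). Attribute names, classes and weights are
    # hypothetical illustrations, not the BGS rules.

    # Hypothetical scoring tables (higher = more susceptible).
    CLAY_CONTENT_SCORE = {"low": 1, "medium": 2, "high": 3}
    SLOPE_SCORE = {"flat": 1, "gentle": 2, "steep": 3}
    WEIGHTS = {"clay_content": 0.7, "slope": 0.3}   # assumed expert weights

    def susceptibility(polygon):
        """Weighted sum of criterion scores for one geological map polygon."""
        score = (WEIGHTS["clay_content"] * CLAY_CONTENT_SCORE[polygon["clay_content"]]
                 + WEIGHTS["slope"] * SLOPE_SCORE[polygon["slope"]])
        if score >= 2.5:
            return "high"
        return "moderate" if score >= 1.8 else "low"

    # Example polygons as they might come from a digital geological map attribute table.
    polygons = [
        {"id": 1, "clay_content": "high", "slope": "gentle"},
        {"id": 2, "clay_content": "low",  "slope": "steep"},
    ]
    for p in polygons:
        print(p["id"], susceptibility(p))
    ```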

  6. High frequency ultrasonic mitigation of microbial corrosion

    NASA Astrophysics Data System (ADS)

    Almahamedh, Hussain H.; Meegan, G. Douglas; Mishra, Brajendra; Olson, David L.; Spear, John R.

    2012-05-01

    Microbiologically Influenced Corrosion (MIC) is a major problem in oil industry facilities, and considerable effort has been spent to mitigate this costly issue. More environmentally benign methods are under consideration as alternatives to biocides, among which are ultrasonic techniques. In this study, a high frequency ultrasonic technique (HFUT) was used as a mitigation method for MIC. The HFUT achieved killing percentages higher than 99.8 percent, and the corrosivity of the remaining microorganisms on steel was reduced by more than 50 percent. The practice and results are discussed.
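
    The two figures quoted above are simple ratios of measured quantities. The snippet below is a hypothetical illustration of how such percentages could be computed from plate counts and corrosion rates; the numbers are placeholders, not data from the paper.

    ```python
    # Illustrative arithmetic for the two metrics quoted in the abstract.
    # All input values are made-up placeholders, not measurements from the study.

    def kill_percentage(cfu_before, cfu_after):
        """Percentage of viable cells removed (e.g. from CFU/mL plate counts)."""
        return 100.0 * (cfu_before - cfu_after) / cfu_before

    def corrosion_reduction(rate_untreated, rate_treated):
        """Percentage reduction in corrosion rate (e.g. mm/yr from weight loss)."""
        return 100.0 * (rate_untreated - rate_treated) / rate_untreated

    print(f"kill: {kill_percentage(1.0e7, 1.5e4):.2f} %")                   # > 99.8 % for these inputs
    print(f"corrosion reduction: {corrosion_reduction(0.20, 0.08):.0f} %")  # 60 % for these inputs
    ```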

  7. Mitigating PQ Problems in Legacy Data Centers

    SciTech Connect

    Ilinets, Boris; /SLAC

    2011-06-01

    The conclusions of this presentation are: (1) problems with PQ in legacy data centers still exist and need to be mitigated; (2) harmonics generated by non-linear IT loads can be lowered by passive, active and hybrid cancellation methods; (3) a harmonic study is necessary to find the best way to treat PQ problems; (4) AHFs and harmonic cancellation transformers proved to be very efficient in mitigating PQ problems; and (5) it is important that IT leaders partner with electrical engineering to develop appropriate ROI statements justifying many of these expenditures.
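
    Since the presentation centres on harmonic distortion caused by non-linear IT loads, one standard way to quantify the effect of cancellation is the total harmonic distortion (THD) of the load current before and after filtering. The sketch below uses illustrative harmonic spectra; they are assumed values, not measurements from any data center discussed in the talk.

    ```python
    import math

    # Current THD (%) = sqrt(sum of squares of harmonic currents) / fundamental * 100.
    def thd_percent(harmonics):
        """harmonics: dict of {harmonic_order: RMS current in A}, order 1 = fundamental."""
        fundamental = harmonics[1]
        distortion = math.sqrt(sum(i**2 for h, i in harmonics.items() if h != 1))
        return 100.0 * distortion / fundamental

    # Hypothetical spectra for a non-linear IT load, before and after an active
    # harmonic filter (values are illustrative, not measured).
    before = {1: 100.0, 3: 30.0, 5: 20.0, 7: 12.0, 9: 8.0}
    after  = {1: 100.0, 3: 4.0,  5: 3.0,  7: 2.0,  9: 1.5}

    print(f"THD before filtering: {thd_percent(before):.1f} %")
    print(f"THD after filtering:  {thd_percent(after):.1f} %")
    ```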

  8. Risk Assessment of Arsenic Mitigation Options in Bangladesh

    Microsoft Academic Search

    Guy Howard; M. Feroze Ahmed; Abu Jafar Shamsuddin; Shamsul Gafur Mahmud; Daniel Deere

    2006-01-01

    The provision of alternative water sources is the principal arsenic mitigation strategy in Bangladesh, but can lead to risk substitution. A study of arsenic mitigation options was undertaken to assess water quality and sanitary condition and to estimate the burden of disease associated with each technology in disability-adjusted life years (DALYs). Dugwells and pond-sand filters showed heavy microbial contamination in
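
    The burden-of-disease comparison described above rests on the standard DALY identity, DALYs = YLL + YLD. The snippet below is a minimal, hypothetical illustration of that calculation for one water-supply option; the counts, durations and disability weights are invented and do not come from the Bangladesh study.

    ```python
    # DALYs = YLL + YLD
    #   YLL = deaths * standard life expectancy lost per death
    #   YLD = cases  * disability weight * average duration (years)
    # All inputs below are illustrative placeholders, not values from the study.

    def dalys(deaths, yrs_lost_per_death, cases, disability_weight, duration_yrs):
        yll = deaths * yrs_lost_per_death
        yld = cases * disability_weight * duration_yrs
        return yll + yld

    # Hypothetical annual figures per 100,000 users of one mitigation option.
    diarrhoeal = dalys(deaths=2, yrs_lost_per_death=30,
                       cases=5000, disability_weight=0.10, duration_yrs=0.02)
    print(f"Illustrative burden: {diarrhoeal:.0f} DALYs per 100,000 per year")
    ```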

  9. Studies on potential emission of hazardous gases due to uncontrolled open-air burning of waste vehicle tyres and their possible impacts on the environment

    NASA Astrophysics Data System (ADS)

    Shakya, Pawan R.; Shrestha, Pratima; Tamrakar, Chirika S.; Bhattarai, Pradeep K.

    Uncontrolled open-air burning of waste vehicle tyres, a source of environmental pollution, has become a common practice in Nepal despite an official ban imposed in view of the environmental and public health hazards. In this study, an experimental model was set up at laboratory scale in an attempt to understand the potential emission of